
Volume 4, Issue 12, December 2014 ISSN: 2277 128X

International Journal of Advanced Research in

Computer Science and Software Engineering

Research Paper

Available online at: www.ijarcsse.com

Classification of Human Facial Expression based on Mouth Feature using SUSAN Edge Operator

Prasad M*, Ajit Danti

Dept. of Master of Computer Applications, Bapuji Institute of Engineering and Technology, Davanagere-4, India
Dept. of Computer Applications, JNN College of Engineering, Shimoga-4, India

Abstract: In this paper, human facial expressions are recognized based on the mouth feature using the SUSAN edge detector [3,4]. The face part is segmented from the face image, the mouth feature is separated from it, and potential geometrical features are used to determine facial expressions such as surprise, neutral, sad and happy.

Experimentation is done on standard JAFFE database [8] images of different people, and the efficacy of the results is discussed.

Keywords: Edge projection analysis, facial features, feature extraction, segmentation, SUSAN threshold edge detection operator.


Facial expression recognition involves three steps: face detection, feature extraction and expression classification. Facial expression classification plays a very important role in the study of human mood. Many researchers have worked in this area, yet an efficient and robust facial expression system still needs to be developed. Facial expression recognition has been performed using Local Binary Pattern features, with different classification techniques examined on several databases [1]. The basic principle of an AdaBoost multi-expression classification algorithm has been presented, along with a detailed demonstration of the training and testing process; because changes of facial expression exist mainly in the eyes and mouth, the eyes and mouth are treated as mutually independent elements, which improves the speed of training the threshold value [10].

A method has been proposed to identify the facial expressions of a user by processing images taken from a webcam, passing each image through three stages: face detection, feature extraction, and expression recognition [12]. The combination of the SUSAN edge detector, edge projection analysis and facial geometry distance measures is the best combination to locate and extract facial features from grayscale images in constrained environments, and a feed-forward back-propagation neural network is used to recognize the facial expression [8]. To attain successful recognition performance, most current expression recognition approaches require some control over the imaging conditions, even though many real-world applications require operational flexibility. In particular, research into automatic expression recognition systems capable of adapting their knowledge periodically or continuously has not received much attention [5]. A system that performs these operations accurately and in real time would be a big step towards achieving human-like interaction between man and machine [11].

The most expressive way humans display emotions is through facial expressions. A method for expression recognition based on global LDP features and local LDPv features with SVM decision-level fusion has been proposed; it retains the influence of the global face while highlighting the local regions that contribute most to expression changes [7].

Deriving an effective facial representation from original face images is a vital step for successful facial expression recognition. There are two common approaches to extracting facial features: geometric feature-based methods and appearance-based methods [13]. The accuracy of facial expression recognition depends mainly on the accurate extraction of facial feature components. A facial feature contains three types of information: texture, shape, and a combination of texture and shape [14]. The face can be represented using statistical local features, such as Local Binary Patterns (LBP), for person-independent expression recognition.

LBP is used for texture analysis along with a Support Vector Machine for low-resolution images and better performance [2]. One of the fundamental issues in facial expression analysis is the representation of the visual information that an examined face might reveal [15]. For successful facial expression recognition, deriving an effective facial representation from original face images is a crucial step, and there are two common approaches to extracting facial features: geometric feature-based methods and appearance-based methods [4,9].

Multiple face region features can be selected by the AdaBoost algorithm. The face is divided by AdaBoost into sub-regions such as the eyes, mouth and nose, based on multiple-region orthogonal-component principal component analysis features. The region combinations are used as input to an AdaBoost classifier, which at each stage chooses the best such combination before changing the weights for the next iteration [6]. The SUSAN operator is used to locate corners for different feature points to increase accuracy [3].



In this work, the JAFFE (Japanese Female Facial Expression) database, developed at Kyushu University, is used. The JAFFE database is made up of 213 individual images of ten persons, and each person shows anger, disgust, fear, happiness, sadness, surprise and a neutral expression. There are 2 to 4 images for every facial expression, and all images are 256 × 256 grayscale images. The photos were taken at the Psychology Department of Kyushu University. A few samples are shown in Fig. 1.


The proposed method detects the face part according to the given measurements. The algorithm then crops the detected facial part from the image, and this cropped image is divided horizontally into two parts about the located central point of the image.

We ignore the upper portion and concentrate on the lower portion. In this portion the SUSAN algorithm [3,4] selects the largest part, which is the mouth, compared to the eyes and nose. The SUSAN algorithm then generates a binary image of the mouth, which is complemented for conversion. The complemented image is then checked for noise; if present, it is removed to a large extent and small gaps are filled. Then the image is filled using RGB values as parameters.
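The cleanup steps just described (complement, noise removal, hole filling) can be sketched with standard morphological operations. This is an illustrative reconstruction using scipy.ndimage, not the authors' code; the `min_size` noise threshold is an assumption.

```python
import numpy as np
from scipy import ndimage

def clean_mouth_mask(binary, min_size=20):
    """Complement a binary mouth image, remove small noise blobs,
    and fill holes, mirroring the post-SUSAN cleanup described above."""
    # Complement: invert foreground and background
    mask = ~binary.astype(bool)
    # Remove connected components smaller than min_size pixels (noise)
    labeled, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labeled, range(1, n + 1))
    keep = np.isin(labeled, np.flatnonzero(sizes >= min_size) + 1)
    # Fill remaining holes inside the mouth region
    return ndimage.binary_fill_holes(keep)
```

On a typical complemented mouth mask, this leaves a single solid blob for the mouth and discards stray speckles.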

The steps involved in the proposed system are as follows.

Step 1: Preprocessing: The quality of the given input image is enhanced by different filters such as the median filter, average filter or Wiener filter, according to the noise present in the image, and its contrast is improved by histogram equalization, adaptive equalization, etc.
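Step 1 can be sketched as follows; this minimal illustration combines a median filter with plain histogram equalization. The specific 3×3 filter size and the 8-bit grayscale assumption are ours, not the paper's.

```python
import numpy as np
from scipy import ndimage

def preprocess(img):
    """Denoise an 8-bit grayscale image with a 3x3 median filter,
    then stretch contrast via histogram equalization."""
    den = ndimage.median_filter(img, size=3)
    # Histogram equalization: map each gray level through the scaled CDF
    hist = np.bincount(den.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf = (cdf - cdf.min()) * 255 / (cdf.max() - cdf.min())
    return cdf[den].astype(np.uint8)
```

In practice one would pick the denoising filter according to the noise actually present, as the step describes.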

Step 2: Mouth detection: Edge detectors such as Sobel, Canny or Prewitt are applied to the image to detect edges, and the face boundary is located using a suitable threshold value. Facial feature candidates are then located by a geometrical method. It is assumed that in most faces the vertical distance between the eyes and the mouth is proportional to the horizontal distance between the two eye centres. Of these candidates, we consider the mouth area only.
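The geometric rule in Step 2 can be illustrated with a small sketch: given the two eye centres, place a mouth search box a fixed multiple of the inter-eye distance below them. The constant `k` and the box proportions are illustrative assumptions, not values from the paper.

```python
def mouth_region(eye_left, eye_right, k=1.1):
    """Estimate a mouth bounding box from the two eye centres, assuming
    the eye-to-mouth vertical distance is proportional (factor k) to the
    inter-eye horizontal distance."""
    (xl, yl), (xr, yr) = eye_left, eye_right
    d = xr - xl                    # inter-eye horizontal distance
    cx = (xl + xr) / 2             # face midline
    cy = (yl + yr) / 2 + k * d     # mouth centre, k*d below the eyes
    # (left, top, right, bottom): box one eye-distance wide, half as tall
    return (cx - d / 2, cy - d / 4, cx + d / 2, cy + d / 4)
```

A detector would then restrict the edge/corner search to this box.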

© 2014, IJARCSSE All Rights Reserved

Step 3: SUSAN operator to detect corners for different features: Various edge detectors are available in digital image processing, such as Sobel, Canny and Prewitt, but they can only detect edges. The SUSAN operator has the additional advantage of locating corners of an image. So, to improve the accuracy of feature point extraction, the SUSAN operator is applied to the face area to detect the far and near corners of the two eyes and the two corners of the mouth area.
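A minimal SUSAN response can be sketched in a few lines of NumPy. This is a didactic reconstruction of the USAN principle only; the brightness threshold `t`, mask radius and geometric threshold `g` below are conventional illustrative choices, not the configuration used in the paper.

```python
import numpy as np

def susan_response(img, t=27, radius=3):
    """Minimal SUSAN sketch: for each pixel (the nucleus), count mask
    pixels whose brightness is within t of the nucleus (the USAN area).
    Edges/corners are where the USAN area falls below threshold g."""
    img = img.astype(np.int32)
    h, w = img.shape
    # Circular mask offsets (approximates the classic 37-pixel mask)
    offs = [(dy, dx) for dy in range(-radius, radius + 1)
            for dx in range(-radius, radius + 1)
            if dy * dy + dx * dx <= radius * radius]
    usan = np.zeros((h, w), dtype=np.int32)
    pad = np.pad(img, radius, mode='edge')
    for dy, dx in offs:
        shifted = pad[radius + dy:radius + dy + h, radius + dx:radius + dx + w]
        usan += (np.abs(shifted - img) <= t)
    g = 3 * len(offs) // 4   # geometric threshold: 3/4 of the mask area
    # Response is positive only where the USAN is small (edge/corner evidence)
    return np.maximum(g - usan, 0)
```

Flat regions give zero response, while pixels on an intensity boundary produce a positive response that can be thresholded to pick corners.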

Step 4: Geometrical features such as the area, height and width of the mouth are extracted for the purpose of expression recognition.
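Step 4's measurements reduce to simple array operations on the binary mouth mask; a sketch, assuming a NumPy boolean mask:

```python
import numpy as np

def mouth_features(mask):
    """Area (pixel count) and height/width (bounding-box extents)
    of a binary mouth mask, as used for expression recognition."""
    ys, xs = np.nonzero(mask)
    area = int(mask.sum())
    height = int(ys.max() - ys.min() + 1)
    width = int(xs.max() - xs.min() + 1)
    return area, height, width
```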

Step 5: Facial expressions such as surprise, neutral, sad and happy are recognized based on the range of statistical values given for each expression.
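Step 5 amounts to a rule-based mapping from the measured geometry to an expression label. The paper does not publish its trained ranges, so the height-to-width ratio and the numeric cut-offs below are hypothetical placeholders, shown only to make the decision structure concrete.

```python
def classify_expression(height, width):
    """Map mouth geometry to an expression via threshold ranges.
    The numeric cut-offs are hypothetical, not the paper's values."""
    ratio = height / width if width else 0.0
    if ratio > 0.60:      # tall, open mouth
        return "surprise"
    if ratio > 0.40:      # raised, rounded mouth
        return "happy"
    if ratio > 0.25:
        return "neutral"
    return "sad"          # thin, flat mouth
```

A trained system would learn one such range per expression from the labelled database images.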

The algorithm computes various geometrical features such as the area, height and width of the mouth portion, matches them against the facial expression types stored from the training images, and outputs the matched expression. Sample experimental results are shown in Fig. 3 and statistical results are tabulated in Table I.


Fig. 3 (a) Input Image (b) Detected Face (c) Cropped Face (d) Extracted Mouth Portion (e) Binary Converted (f) Complemented (g) Noise Removed (h) Holes Filled (i) Mouth Expression



In this work, 25 images of three persons are selected from the JAFFE database. Table II shows experimental results on the JAFFE database for the four different facial expressions (1: happiness, 2: neutral, 3: sadness, 4: surprise).



In this paper an attempt is made to recognize different expressions of people using the mouth as a parameter with the SUSAN operator. The proposed system is tested on the JAFFE database for facial expressions of different people in different moods, and satisfactory results are obtained.

In the future, other facial expressions will also be recognized.

REFERENCES
[1] Caifeng Shan, Shaogang Gong, Peter W. McOwan, "Facial expression recognition based on Local Binary Patterns: A comprehensive study", Image and Vision Computing, vol. 27, pp. 803-816, 2009.
[2] Caifeng Shan, Shaogang Gong, Peter W. McOwan, "Facial expression recognition based on Local Binary Patterns: A comprehensive study", Image and Vision Computing, vol. 27, pp. 803-816, 2009.
[3] B. J. Chilke, D. R. Dandekar, "Facial Feature Point Extraction Methods - Review", International Conference on Advanced Computing, Communication and Networks '11.
[4] T. Gritti, C. Shan, V. Jeanne, R. Braspenning, "Local Features based Facial Expression Recognition with Face Registration Errors", IEEE, 2008.
[5] G. Hemalatha, C. P. Sumathi, "A Study of Techniques for Facial Detection and Expression Classification", International Journal of Computer Science & Engineering Survey (IJCSES), vol. 5, no. 2, April 2014.
[6] Jiaming Li, Geoff Poulton, Ying Guo, Rong-Yu Qiao, "Face Recognition Based on Multiple Region Features", Proc. VIIth Digital Image Computing: Techniques and Applications, Sun C., Talbot H., Ourselin S. and Adriaansen T. (Eds), 10-12 Dec 2003, Sydney.
[7] Juxiang Zhou, Tianwei Xu, Jianhou Gan, "Feature Extraction based on Local Directional Pattern with SVM Decision-level Fusion for Facial Expression Recognition", International Journal of Bio-Science and Bio-Technology, vol. 5, no. 2, April 2013.
[8] S. P. Khandait, R. C. Thool, "Automatic Facial Feature Extraction and Expression Recognition based on Neural Network", International Journal of Advanced Computer Science and Applications, vol. 2, no. 1, January 2011.
[9] V. P. Lekshmi, M. Sasikumar, "Analysis of Facial Expression using Gabor and SVM", International Journal of Recent Trends in Engineering, vol. 1, no. 2, May 2009.
[10] Liying Lang, Zuntao Hu, "The Study of Multi-Expression Classification Algorithm Based on Adaboost and Mutual Independent Feature", Journal of Signal and Information Processing, pp. 270-273, 2011.
[11] Maja Pantic, Leon J. M. Rothkrantz, "Automatic Analysis of Facial Expressions: The State of the Art", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 12, December 2000.
[12] Srinivasa K G, Inchara Shivalingaiah, Lisa Gracias, Nischit Ranganath, "Facial Expression Recognition System Using Weight-Based Approach", MSRIT.
[13] Y. Tian, T. Kanade, J. Cohn, "Facial Expression Analysis", in Handbook of Face Recognition, Springer, 2005, Chapter 11.
[14] Xiaoyi Feng, Baohua Lv, Zhen Li, Jiling Zhang, "A Novel Feature Extraction Method for Facial Expression Recognition".
[15] H. Yamada, "Visual Information for Categorizing Facial Expressions of Emotions", Applied Cognitive Psychology, vol. 7, pp. 257-270, 1993.

