Methods to Detect Emotion, Part 2


Our research identified five methods of detecting emotion that are backed by published evidence: voice signals, facial expressions, eye movement, gait (walking patterns), and keyboard typing patterns. A detailed report on each method is presented below.


  • In 2015, Valery A. Petrushin patented an invention for detecting emotion in voice signals in a call center for Accenture Global Services Limited.
  • It described a system, method, and article of manufacture that detects emotion by extracting at least one feature from the voice signal and comparing it to voice parameters stored in a database; a statistical measure then maps the voice feature to an emotion.
  • Applying the approach to a test data set gave an average accuracy of about 55%. Accuracy by emotional state was as follows: normal state 40-50%, happiness 55-65%, anger 60-80%, sadness 60-70%, and fear 20-40%.
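The patent's core loop (extract voice features, compare against stored per-emotion parameters) can be sketched roughly as follows. The two features (energy and zero-crossing rate), the reference values, and the nearest-centroid matching are simplified stand-ins for illustration, not the patent's actual feature set or statistic:

```python
import numpy as np

def extract_features(signal):
    """Toy feature vector: signal energy and zero-crossing rate (a crude
    pitch/agitation proxy). Real systems use pitch statistics, formants,
    speaking rate, and similar prosodic features."""
    energy = float(np.mean(signal ** 2))
    zcr = float(np.mean(np.abs(np.diff(np.sign(signal)))) / 2)
    return np.array([energy, zcr])

def classify(features, reference):
    """Nearest-centroid match against per-emotion reference parameters."""
    return min(reference, key=lambda e: float(np.linalg.norm(features - reference[e])))

# Hypothetical reference parameters (not taken from the patent).
reference = {"anger": np.array([0.5, 0.4]), "sadness": np.array([0.05, 0.1])}

# A loud, rapidly oscillating test signal: high energy, high zero-crossing rate.
t = np.linspace(0, 1, 8000)
loud_fast = 0.9 * np.sin(2 * np.pi * 3000 * t)
label = classify(extract_features(loud_fast), reference)  # matches "anger"
```

The nearest-centroid step stands in for whatever statistic the patent selects; the point is only the two-stage structure of feature extraction followed by comparison with database parameters.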


  • Suja Palaniswamy & Shikha Tripathi conducted a study entitled Emotion Recognition from Facial Expressions Using Images with Pose, Illumination and Age Variation for Human-Computer/Robot Interaction.
  • The study proposed a technique for real-time emotion recognition from facial expressions in images with simultaneous pose, illumination, and age variation, covering the basic emotions of anger, disgust, happiness, surprise, and neutrality.
  • Subjects of varying age groups expressing emotions through different poses and illumination conditions were tested to validate the real-time performance of the proposed method. The method obtained 96% recognition accuracy at an average processing time of 120 ms.
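Facial-expression methods of this kind typically work on geometric features derived from facial landmarks. The sketch below is a deliberately crude rule on hypothetical landmark names, not the Palaniswamy & Tripathi pipeline, which handles pose, illumination, and age variation with far richer features:

```python
def mouth_openness(landmarks):
    """Toy geometric feature: vertical mouth opening normalized by face
    height. `landmarks` maps hypothetical point names to (x, y) pixels."""
    top, bottom = landmarks["mouth_top"], landmarks["mouth_bottom"]
    chin, brow = landmarks["chin"], landmarks["brow"]
    face_height = abs(chin[1] - brow[1])
    return abs(bottom[1] - top[1]) / face_height

def classify_expression(landmarks, threshold=0.15):
    """Crude illustrative rule: a wide-open mouth suggests surprise,
    otherwise neutral. The threshold is arbitrary."""
    return "surprise" if mouth_openness(landmarks) > threshold else "neutral"

open_mouth = {"mouth_top": (50, 60), "mouth_bottom": (50, 80),
              "chin": (50, 100), "brow": (50, 0)}
closed_mouth = dict(open_mouth, mouth_bottom=(50, 65))
```

Normalizing by face height is what makes such geometric ratios somewhat robust to scale, which hints at why pose- and age-invariant recognition needs carefully chosen relative features rather than raw pixel distances.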


  • Yang Wang, Zhao Lv, and Yongjun Chung authored the study Automatic Emotion Perception Using Eye Movement Information for E-Healthcare Systems.
  • The study proposed "an eye movement information-based emotion perception algorithm by collecting and analyzing electrooculography (EOG) signals and eye movement video synchronously."
  • Three emotional states were considered, namely positive, neutral, and negative.
  • The average accuracies using feature-level fusion (FLF) and decision-level fusion (DLF) were 88.64% and 88.35%, respectively.
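The difference between the two fusion strategies compared in the study can be sketched as follows: FLF concatenates the EOG and video feature vectors before a single classifier, while DLF classifies each modality separately and combines the resulting class probabilities. The toy linear scorer and weighted average below are illustrative stand-ins for the study's actual classifiers:

```python
import numpy as np

EMOTIONS = ["negative", "neutral", "positive"]

def feature_level_fusion(eog_feat, video_feat, weights):
    """FLF: concatenate per-modality features, then apply one classifier
    (here a toy linear scorer) to the joint vector."""
    fused = np.concatenate([eog_feat, video_feat])
    return EMOTIONS[int(np.argmax(weights @ fused))]

def decision_level_fusion(eog_probs, video_probs, alpha=0.5):
    """DLF: each modality is classified on its own; the two probability
    vectors are then combined (here by a weighted average)."""
    combined = alpha * np.asarray(eog_probs) + (1 - alpha) * np.asarray(video_probs)
    return EMOTIONS[int(np.argmax(combined))]

# Hypothetical inputs: 2 EOG features, 2 video features, 3 emotion classes.
eog_feat, video_feat = np.array([1.0, 0.0]), np.array([0.0, 2.0])
weights = np.array([[1.0, 0, 0, 0], [0, 1.0, 0, 0], [0, 0, 1.0, 1.0]])
flf_label = feature_level_fusion(eog_feat, video_feat, weights)

eog_probs, video_probs = [0.2, 0.5, 0.3], [0.1, 0.2, 0.7]
dlf_label = decision_level_fusion(eog_probs, video_probs)
```

The trade-off this illustrates: FLF lets one model exploit cross-modality correlations in the joint feature space, while DLF keeps the modalities independent until the final decision, which is simpler and degrades gracefully if one signal is noisy.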


  • Mangtik Chiu, Jiayu Shu, and Pan Hui studied emotion recognition through gait on mobile devices.
  • In their paper, the possibility of emotion recognition through gait, which is one of the most common human behaviors, was explored. The team collected data by recording individuals walking using a smartphone camera for about 10 seconds.
  • Subjects were explicitly told to express joy, anger, sadness, relaxation, or a neutral state while walking.
  • The team trained several models to classify the emotion labels, and the best accuracy obtained was 64%.
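Gait-based recognition of this kind generally reduces a short walking clip to motion features such as cadence and range of motion before classification. A minimal sketch, assuming a per-frame vertical ankle position extracted from the video (the feature choices here are illustrative, not the features used by Chiu, Shu, and Hui):

```python
import numpy as np

def gait_features(ankle_y, fps=30):
    """Summarize a walking clip from one hypothetical keypoint trajectory.
    ankle_y: vertical ankle position per video frame.
    Cadence (step cycles/sec) is estimated from upward zero crossings of
    the mean-centered trajectory; amplitude is the range of motion."""
    y = np.asarray(ankle_y, dtype=float)
    centered = y - y.mean()
    # Each upward zero crossing marks roughly one step cycle.
    crossings = int(np.sum((centered[:-1] < 0) & (centered[1:] >= 0)))
    duration = len(y) / fps
    return {"cadence": crossings / duration,
            "amplitude": float(y.max() - y.min())}

# Synthetic ~10 s clip at 30 fps with a 2 Hz step oscillation,
# mimicking the study's roughly 10-second smartphone recordings.
t = np.arange(0, 10, 1 / 30)
ankle = 5 * np.sin(2 * np.pi * 2 * t)
feats = gait_features(ankle)
```

Feature vectors like this (cadence, amplitude, and similar statistics per joint) would then feed the classification models the team trained; the modest 64% best accuracy suggests how much within-person variation such features must contend with.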


  • Clayton Epp, Michael Lippold, and Regan L. Mandryk studied identifying emotional states using keystroke dynamics in 2011.
  • In the study, the team determined user emotion by analyzing the rhythm of typing patterns on a standard keyboard. Participants' keystrokes and their emotional states (via self-reports) were collected; the team then extracted keystroke features and built classifiers for 15 emotional states.
  • The emotional states were anger, boredom, confidence, distraction, excitement, focused, frustration, happiness, hesitance, nervousness, overwhelmed, relaxation, sadness, stress, and tired.
  • Accuracy of 77-88% was observed for confidence, hesitance, nervousness, relaxation, sadness, and tiredness; anger and excitement were classified with 84% accuracy.
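Keystroke-dynamics features are typically built from timing rhythm: dwell time (how long a key is held) and flight time (the gap between releasing one key and pressing the next). A minimal sketch of that extraction step, assuming timestamped press/release events (the exact feature set in Epp, Lippold, and Mandryk's study is richer):

```python
def keystroke_features(events):
    """Compute basic timing features from a typing session.
    events: list of (key, press_time, release_time) tuples in seconds,
    ordered by press time.
    Dwell time  = release - press for each key.
    Flight time = next key's press - current key's release."""
    dwells = [release - press for _, press, release in events]
    flights = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
    mean = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return {"mean_dwell": mean(dwells), "mean_flight": mean(flights)}

# Hypothetical three-key session: "a", "b", "c".
events = [("a", 0.0, 0.1), ("b", 0.3, 0.45), ("c", 0.6, 0.7)]
features = keystroke_features(events)
```

Per-session statistics like these (means, variances, and digraph/trigraph timings) are what the classifiers for the 15 emotional states would consume; no special hardware is needed beyond a standard keyboard, which is the method's main practical appeal.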


To determine which methods of detecting emotion are supported by research, we consulted scholarly publications for scientific studies on emotion detection. We restricted our search to studies published in the last 24 months and found relevant studies, although some of them draw insight from older work in the same field. Ultimately, our attempt was successful: we found five methods of detecting emotions that are supported by scientific and academic research.