
Multimedia human-machine interface and interaction

Affect Recognition from Lip Articulations


Lips deliver visually active cues for speech articulation. Affective states shape how humans articulate speech; hence, they also alter the articulation of lip motion. In this paper, we investigate the effect of phonetic classes on affect recognition from lip articulations. The affect recognition problem is formalized over discrete activation, valence, and dominance attributes. We use the symmetric Kullback-Leibler divergence (KLD) to rate phonetic classes with larger discrimination across different affective states, and we perform experimental evaluations on the IEMOCAP database.
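The symmetric KLD used above can be sketched for discrete feature histograms. This is a minimal illustration, not the paper's pipeline: the histograms below are made-up stand-ins for lip-feature distributions of one phonetic class under two affective states.

```python
import numpy as np

def symmetric_kld(p, q, eps=1e-12):
    """Symmetric Kullback-Leibler divergence KL(p||q) + KL(q||p)
    between two discrete distributions; eps avoids log(0)."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

# Hypothetical lip-feature histograms of one phonetic class under two
# affective states; a larger divergence suggests the phonetic class is
# more discriminative for the affect attribute in question.
hist_state_a = [0.5, 0.3, 0.2]
hist_state_b = [0.2, 0.3, 0.5]
print(symmetric_kld(hist_state_a, hist_state_b))
```

Symmetrizing the divergence matters here because plain KLD is asymmetric, while a ranking of phonetic classes should not depend on which affective state is taken as the reference.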

Paper Details

Authors:
Rizwan Sadiq, Engin Erzin
Submitted On:
23 March 2017 - 1:44pm

Document Files

sadiq-erzin-icassp17.pdf


[1] Rizwan Sadiq, Engin Erzin, "Affect Recognition from Lip Articulations", IEEE SigPort, 2017. [Online]. Available: http://sigport.org/1609. Accessed: Jun. 23, 2017.

Use of Affect Based Interaction Classification for Continuous Emotion Tracking


Natural and affective handshakes of two participants define the course of a dyadic interaction, and the affective states of the participants are expected to be correlated with the nature of that interaction. In this paper, we extract two classes of dyadic interaction based on temporal clustering of affective states. We use k-means temporal clustering to define the interaction classes, and utilize a support vector machine (SVM) classifier to estimate the interaction class types from multimodal speech and motion features.
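The two-stage scheme above (cluster affective states to define class labels, then classify those labels from multimodal features) can be sketched as follows. This is only an illustration under assumptions: plain k-means stands in for the paper's temporal clustering, and the feature arrays are random placeholders, not the actual speech and motion features.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical per-segment affective-state features (e.g. activation,
# valence); shapes and values are illustrative only.
affect_features = rng.normal(size=(200, 2))

# Stage 1: define two interaction classes by clustering the affective
# states (the paper uses temporal clustering; plain k-means here).
interaction_class = KMeans(n_clusters=2, n_init=10,
                           random_state=0).fit_predict(affect_features)

# Stage 2: train an SVM to predict the interaction class from
# multimodal (speech + motion) features of the same segments.
multimodal_features = rng.normal(size=(200, 10))
clf = SVC(kernel="rbf").fit(multimodal_features, interaction_class)
pred = clf.predict(multimodal_features)
print((pred == interaction_class).mean())  # training accuracy of the sketch
```

The design point is that the cluster labels act as automatically derived ground truth, so no manual annotation of interaction type is needed before training the classifier.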

Paper Details

Authors:
Hossein Khaki, Engin Erzin
Submitted On:
3 March 2017 - 8:28am

Document Files

khaki-erzin-icassp17.pdf


[1] Hossein Khaki, Engin Erzin, "Use of Affect Based Interaction Classification for Continuous Emotion Tracking", IEEE SigPort, 2017. [Online]. Available: http://sigport.org/1608. Accessed: Jun. 23, 2017.

Agreement and Disagreement Classification of Dyadic Interactions Using Vocal and Gestural Cues

Paper Details

Authors:
Hossein Khaki, Elif Bozkurt, Engin Erzin
Submitted On:
20 April 2016 - 5:27am

Document Files

Agreement and Disagreement Classification of Dyadic Interactions Using Vocal and Gestural Cues-Presentation.pdf


[1] Hossein Khaki, Elif Bozkurt, Engin Erzin, "Agreement and Disagreement Classification of Dyadic Interactions Using Vocal and Gestural Cues", IEEE SigPort, 2016. [Online]. Available: http://sigport.org/977. Accessed: Jun. 23, 2017.

Source Localization on Solids Utilizing Logistic Modeling of Energy Transition in Vibration Signal


We propose a new algorithm for source localization on rigid surfaces, which allows one to turn everyday objects into human-computer touch interfaces using surface-mounted vibration sensors. This is achieved by estimating the time-differences-of-arrival (TDOA) of the signals across the sensors. In this work, we employ a smooth parametrized function to model the gradual noise-to-signal energy transition at each sensor; specifically, the transition is modeled by a four-parameter logistic function.
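The four-parameter logistic model of the noise-to-signal energy transition can be sketched as a curve fit. This is a toy illustration with synthetic data, not the paper's actual signal model: the parameter names, the onset time of 0.4, and the noise level are all assumptions for the example.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic4(t, a, b, c, d):
    """Four-parameter logistic: baseline a, amplitude b,
    transition midpoint c, transition rate d."""
    return a + b / (1.0 + np.exp(-(t - c) / d))

# Synthetic short-time energy profile at one sensor: a noise floor,
# then a gradual rise when the impact signal arrives (illustrative).
t = np.linspace(0.0, 1.0, 200)
rng = np.random.default_rng(1)
energy = logistic4(t, 0.1, 1.0, 0.4, 0.02) + 0.02 * rng.normal(size=t.size)

# Fit the logistic model; the fitted midpoint c estimates the signal
# onset at this sensor, and onset differences across sensors give TDOA.
params, _ = curve_fit(logistic4, t, energy, p0=[0.0, 1.0, 0.5, 0.05])
print(params[2])  # estimated onset time, near the true 0.4
```

Fitting a smooth parametric transition, rather than thresholding the energy, is what makes the onset estimate robust to the gradual ramp-up typical of vibrations in solids.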

Paper Details

Authors:
Quang Hanh Nguyen, V. G. Reju, Andy W. H. Khong
Submitted On:
20 March 2016 - 2:22am

Document Files

icassp2016poster_twocols.pdf


[1] Quang Hanh Nguyen, V. G. Reju, Andy W. H. Khong, "Source Localization on Solids Utilizing Logistic Modeling of Energy Transition in Vibration Signal", IEEE SigPort, 2016. [Online]. Available: http://sigport.org/857. Accessed: Jun. 23, 2017.

NON-VERBAL SPEECH ANALYSIS OF INTERVIEWS WITH SCHIZOPHRENIC PATIENTS


Negative symptoms in schizophrenia are associated with significant burden and functional impairment, especially in speech production. In clinical practice today there are no robust treatments for negative symptoms, and one obstacle to research on them is the lack of an objective measure. To this end, we explore non-verbal speech cues as objective measures. Specifically, we extract these cues while schizophrenic patients are interviewed by psychologists. Our results suggest a strong correlation between certain measures of the two rating sets.
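The correlation analysis between the objective cue measures and the clinical ratings can be sketched with a standard Pearson test. The numbers below are fabricated solely to illustrate the computation; they are not the paper's data or results.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-session values: an objective non-verbal speech cue
# (e.g. a speaking-rate statistic) vs. a clinical negative-symptom
# rating, for eight interview sessions (illustrative values only).
cue_scores = np.array([2.1, 3.4, 1.8, 4.0, 2.9, 3.7, 1.5, 4.3])
clinical_ratings = np.array([2.0, 3.1, 2.2, 4.2, 3.0, 3.5, 1.9, 4.1])

r, p_value = pearsonr(cue_scores, clinical_ratings)
print(r, p_value)
```

A high, significant r between an automatically extracted cue and the clinician's rating is what would support using that cue as an objective proxy measure.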

Paper Details

Authors:
Yasir Tahir, Debsubhra Chakraborty, Nadia Thalmann, Daniel Thalmann, Jimmy Lee
Submitted On:
14 March 2016 - 11:19am

Document Files

ICASSP2016Poster_yasir_landscape_v2.pptx


[1] Yasir Tahir, Debsubhra Chakraborty, Nadia Thalmann, Daniel Thalmann, Jimmy Lee, "NON-VERBAL SPEECH ANALYSIS OF INTERVIEWS WITH SCHIZOPHRENIC PATIENTS", IEEE SigPort, 2016. [Online]. Available: http://sigport.org/675. Accessed: Jun. 23, 2017.
