Recognising facial expression from spatially and temporally modified movements.
about
Action and emotion recognition from point light displays: an investigation of gender differences
Emotional Actions Are Coded via Two Mechanisms: With and without Identity Representation.
Validation of the Amsterdam Dynamic Facial Expression Set--Bath Intensity Variations (ADFES-BIV): A Set of Videos Expressing Low, Intermediate, and High Intensity Emotions.
Identity modulates short-term memory for facial emotion
Perceiving performer identity and intended expression intensity in point-light displays of dance.
Get The FACS Fast: Automated FACS face analysis benefits from the addition of velocity.
Singing emotionally: a study of pre-production, production, and post-production facial expressions.
The application of biological motion research: biometrics, sport, and the military.
Recognition of biological motion in children with autistic spectrum disorders.
Emotion perception from dynamic and static body expressions in point-light and full-light displays.
The perception of facial expressions from two-frame apparent motion.
Independence of face identity and expression processing: exploring the role of motion.
Perception of temporal asymmetries in dynamic facial expressions.
Expression of emotion in the kinematics of locomotion.
Classification of dynamic facial expressions of emotion presented briefly.
Can biological motion research provide insight on how to reduce friendly fire incidents?
Neural substrates for action understanding at different description levels in the human brain.
Role of biological-motion information in recognition of facial expressions by young children.
Person identification from biological motion: effects of structural and kinematic cues.
The Ryerson Audio-Visual Database of Emotional Speech and Song (RAVDESS): A dynamic, multimodal set of facial and vocal expressions in North American English.
Human Observers and Automated Assessment of Dynamic Emotional Facial Expressions: KDEF-dyn Database Validation
Use and Usefulness of Dynamic Face Stimuli for Face Perception Studies-a Review of Behavioral Findings and Methodology
P2860 (cites work)
Q21135342-68BE8ADF-A201-41F2-B61B-60D0D4C5D992
Q27304724-BD72172D-DC7A-4CC4-882C-D27CC1C5EB67
Q27329955-6700B8AA-34B8-4FE6-AFEF-551F45647B43
Q28748602-19F3BCA3-3653-445F-BB09-CB9E03E91C32
Q30395359-D608CCF3-43A1-45B0-ABD6-60C85875BDE3
Q30433265-3B0BD890-A3C3-4B26-AD23-B8910F0DCB97
Q30438784-0AB687E7-3779-46B4-950B-776712C6B864
Q38211928-FFD4C9D5-B893-4E3F-B2CC-6EB5832784AD
Q38390594-E268BB11-2D67-49E0-9B4D-1DC0DD50AD64
Q39692958-0368DC0C-5A1C-4EE3-9D7C-03512F70A14C
Q40020343-B544085E-25D0-4A77-B1F0-42EDE29BEC98
Q41980008-7DD313E8-EE0A-475F-B856-B0CFB01CA0C5
Q42413474-59814D88-B2CA-476B-AE95-A63819C7B21A
Q44802647-947E6EAA-992A-48F5-B139-AF36A66E4367
Q45056832-8DCEFAE3-2107-4E05-9AE9-EC0EC368622B
Q47587212-DD10F321-07B8-4766-BC7A-5E19E608A5E8
Q48360088-69B62759-1783-4F60-96A6-E3D031DDCA69
Q50770238-A134174F-AD58-469E-81BC-6648F6189F67
Q51989159-A74433C4-DA74-4A47-A389-0DC83226F0B3
Q55426186-BA94F658-CB99-4D95-BD88-A0DE5C0BCE92
Q58707569-CAEB5B38-851C-4671-8CA2-99DD37946189
Q58801637-E4EFA898-A8BE-4872-AB12-B15072E5C436
P2860 (cites work)
Recognising facial expression from spatially and temporally modified movements.
description
2003 paper
@nan
2003 paper
@ja
2003 academic article
@wuu
2003 academic article
@zh
2003 academic article
@zh-cn
2003 academic article
@zh-hans
2003 academic article
@zh-my
2003 academic article
@zh-sg
2003 academic article
@yue
2003 academic article
@zh-hant
name
Recognising facial expression from spatially and temporally modified movements.
@en
Recognising facial expression from spatially and temporally modified movements.
@nl
type
label
Recognising facial expression from spatially and temporally modified movements.
@en
Recognising facial expression from spatially and temporally modified movements.
@nl
prefLabel
Recognising facial expression from spatially and temporally modified movements.
@en
Recognising facial expression from spatially and temporally modified movements.
@nl
P2860 (cites work)
P356 (DOI)
P1433 (published in)
P1476 (title)
Recognising facial expression from spatially and temporally modified movements
@en
P2093 (author name string)
Andrew Calder
Harold Hill
P2860 (cites work)
P304 (page(s))
P356 (DOI)
10.1068/P3319
P577 (publication date)
2003-01-01T00:00:00Z