Facial dynamics in infants at risk for autism spectrum disorder (PhD position)
The study of infants at risk for autism spectrum disorder (ASD) is a research collaboration between multiple partners at the Universities of Ghent and Leuven, Belgium. The research is supported by a 4-year FWO-SBO research grant. Within this multidisciplinary research group, novel computer vision methods for the analysis of facial dynamics are being developed to assess the risk of ASD. The group is looking for a young scientist in computer vision or computational sciences who is willing to start a 4-year PhD based at KU Leuven, Department of Electrical Engineering (ESAT), Processing Speech and Images (PSI), under the supervision of dr. Peter Claes and prof. dr. Dirk Vandermeulen. For more information see https://icts.kuleuven.be/apps/jobsite/vacatures/54305213
The face has evolved to be the single most telling part of the human body. Developmentally, the face also evolves in concert with the brain, each influencing the development of the other. Therefore, in developmental disorders the phrase "the face predicts the brain" is commonly used, and recently facial characteristics have been correlated with clinical aspects in children with ASD. Recognition of a face, for example, elicits an emotional response, and humans are adept at detecting subtle differences in the mood or intent of others from their facial expressions. The dynamics of facial expressions contain an additional and possibly even bigger source of information to aid in the diagnosis of ASD.
Earlier facial expression analysis relied on a neutral expression and five universal emotions (happiness, sadness, anger, fear and disgust) in adults. Experts were able to rate the positive/negative valence of these prototypical categories over the timeframe of an observational session. However, it has been shown that neuropsychiatric conditions often result in 1) ambiguous facial expressions, which are combinations of emotions, and 2) subtle expressions, which have low intensity. To resolve this problem, the Facial Action Coding System (FACS) was proposed; it is based on changes in muscle movement (called action units (AUs)) and was developed so that facial expressions can be analyzed irrespective of emotion. As with the earlier analysis protocols, human raters encode the facial actions during observational sessions. However, FACS rating requires extensive training and is time-consuming, and thus prone to error. To address this investigational burden, current state-of-the-art algorithms in computer vision aim to produce FACS scores objectively and quickly.
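To illustrate how AU codes relate to the prototypical emotions mentioned above, the sketch below matches a set of detected AUs against commonly cited EMFACS-style emotion prototypes. The specific AU combinations and the Jaccard scoring are illustrative assumptions for this sketch, not part of the project specification:

```python
# Illustrative sketch: relating detected action units (AUs) to prototypical
# emotions. The AU combinations below follow commonly cited EMFACS-style
# prototypes; they are an assumption for illustration only.

# Prototype emotions expressed as sets of FACS action-unit numbers.
EMOTION_PROTOTYPES = {
    "happiness": {6, 12},              # cheek raiser + lip corner puller
    "sadness":   {1, 4, 15},           # inner brow raiser + brow lowerer + lip corner depressor
    "surprise":  {1, 2, 5, 26},
    "fear":      {1, 2, 4, 5, 7, 20, 26},
    "anger":     {4, 5, 7, 23},
    "disgust":   {9, 15, 16},
}

def emotion_scores(active_aus):
    """Score each prototype by Jaccard overlap with the detected AU set.

    Uniformly low scores across all prototypes would flag an 'ambiguous'
    expression, one of the difficulties noted for neuropsychiatric conditions.
    """
    active = set(active_aus)
    return {
        emotion: len(active & proto) / len(active | proto)
        for emotion, proto in EMOTION_PROTOTYPES.items()
    }

def best_match(active_aus):
    """Return the prototype with the highest overlap score."""
    scores = emotion_scores(active_aus)
    return max(scores, key=scores.get)
```

For a clean smile (AU6 + AU12) this yields a perfect overlap with the happiness prototype, while a blended AU set spreads its score across several prototypes, which is precisely the ambiguity that makes discrete emotion labels insufficient.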
The first aim of this PhD is to implement an analysis of 2D facial dynamics to objectively underpin related ASD measurements obtained in the overall project. The second aim is to create a system that helps in the assessment of ASD risk when combined with all other project outcomes. Technically, the aim is to create a facial action coding system (FACS) that can be applied to infants in the assessment of ASD risk. Any such system starts with the detection and tracking of faces. Subsequently, geometric and/or textural features are extracted from the images to detect action units (AUs). Finally, temporal profiles of AUs are analyzed and used to measure motion type, flatness (lack of response) or inappropriateness (abnormality) of facial expressions. The challenge lies in the discovery of AUs and motion types suitable for infants, for which different computer vision approaches can be explored, ranging from active appearance models to convolutional neural networks and deep learning.
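As a minimal sketch of the final step, analyzing temporal AU profiles, the snippet below estimates a flatness score for a per-frame AU intensity track. The standard-deviation measure and the threshold are illustrative assumptions, not the project's actual criteria:

```python
import math
from statistics import pstdev

def flatness_score(au_track):
    """Population standard deviation of a per-frame AU intensity track.

    Intensities are assumed to lie in [0, 1]. A near-zero score means the
    AU barely moves over the session, i.e. a 'flat' (lack-of-response)
    profile. Both this measure and the threshold below are illustrative
    assumptions for the sketch.
    """
    return pstdev(au_track)

def is_flat(au_track, threshold=0.05):
    """Flag a track whose variation falls below the (assumed) threshold."""
    return flatness_score(au_track) < threshold

# Example tracks over one observational window of 200 frames:
# a responsive profile (e.g. a smile onset/offset) vs. a near-constant one.
t = [2.0 * math.pi * i / 200 for i in range(200)]
responsive = [0.5 + 0.4 * math.sin(x) for x in t]    # clear modulation
flat = [0.5 + 0.005 * math.sin(x) for x in t]        # barely moving
```

Here `is_flat(responsive)` is false while `is_flat(flat)` is true; in practice one would replace this scalar summary with richer temporal descriptors (onset/offset speed, co-occurrence of AUs) learned for infant faces.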