AI Tracks Facial Expressions To Support PTSD Detection in Children


Diagnosing post-traumatic stress disorder (PTSD) in children presents distinct challenges, especially for those with limited verbal communication or emotional self-awareness. A research team at the University of South Florida (USF) has developed an artificial intelligence (AI) system that analyzes children’s facial expressions to help clinicians identify PTSD and track symptom changes over time.

To develop the system, the team used footage from 18 therapy sessions in which children described emotional experiences. Each child contributed over 100 minutes of video, amounting to roughly 185,000 frames per session, which the AI models processed to detect patterns of facial movement linked to PTSD.

Instead of using raw video, the AI analyzes de-identified data such as eye gaze, head movement and the positions of facial features. The system strips out personally identifiable information and evaluates the dynamics of facial expressions in different conversational contexts, including those with parents and clinicians.
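
At a high level, a pipeline like the one described above could reduce each video frame to a vector of facial-landmark coordinates and discard the pixels themselves. The sketch below is a minimal illustration of that idea, not the USF team's actual system: it assumes MediaPipe Face Mesh as a stand-in landmark extractor, and the function name, file path, and feature layout are hypothetical.

```python
# Illustrative sketch only; the published article does not describe the team's code.
# MediaPipe Face Mesh stands in for whatever facial-analysis toolkit was actually used.
import cv2
import mediapipe as mp
import numpy as np

def extract_deidentified_features(video_path: str) -> np.ndarray:
    """Return one feature vector per frame containing only normalized
    facial-landmark coordinates, never the raw pixels, so no identifiable
    image is retained."""
    face_mesh = mp.solutions.face_mesh.FaceMesh(
        static_image_mode=False, max_num_faces=1, refine_landmarks=True
    )
    cap = cv2.VideoCapture(video_path)
    features = []
    while True:
        ok, frame_bgr = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input.
        results = face_mesh.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
        if results.multi_face_landmarks:
            landmarks = results.multi_face_landmarks[0].landmark
            # Keep only normalized (x, y, z) landmark positions; the frame
            # itself is discarded, which is the de-identification step.
            features.append(
                np.array([[p.x, p.y, p.z] for p in landmarks]).flatten()
            )
    cap.release()
    face_mesh.close()
    return np.stack(features) if features else np.empty((0, 0))

# Hypothetical usage: downstream models would consume these per-frame vectors
# to study expression dynamics, rather than the video itself.
# session_features = extract_deidentified_features("session_01.mp4")
```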

The system found that children’s emotional expressions were more revealing during clinician-led interviews than during parent-child conversations. This observation aligns with psychological literature suggesting children may be more expressive with therapists and may suppress emotions in the presence of parents.
