Quantifying Emotion-Performance Dynamics in Formula 1: A Multimodal Approach

Category: Behavior, Performance

This study presents the Multimodal Emotion-Performance Analysis (MEPA) framework for quantifying the correlation between Formula 1 drivers' psychological states and their on-track performance. MEPA fuses three modalities: BERT-based natural language processing of driver communications, CNN-based computer vision for facial expression recognition from broadcast footage, and LSTM networks for modeling time-series race telemetry. Analysis of the textual and video sources yielded a statistically significant Emotion-Performance Correlation that was associated with a mean improvement in lap times.
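The sketch below illustrates one way such a multimodal fusion could be wired together: a BERT text encoder for driver communications, a small CNN over face crops, and an LSTM over telemetry sequences, combined by late fusion into a lap-time regression head. All module names, dimensions, and the fusion strategy are assumptions for illustration only; the abstract does not specify MEPA's actual implementation.

```python
# Minimal sketch of a multimodal late-fusion model in the spirit of MEPA.
# All architectural choices (encoder sizes, fusion by concatenation,
# lap-time-delta regression target) are assumptions, not the paper's method.
import torch
import torch.nn as nn
from transformers import BertModel

class MultimodalFusionSketch(nn.Module):
    def __init__(self, telemetry_features=8, hidden_dim=128):
        super().__init__()
        # Text branch: BERT encoder over driver communication transcripts.
        self.text_encoder = BertModel.from_pretrained("bert-base-uncased")
        # Vision branch: small CNN over cropped face frames (3x224x224 assumed).
        self.face_cnn = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, hidden_dim),
        )
        # Telemetry branch: LSTM over per-lap time-series channels
        # (e.g. speed, throttle, brake; channel set is assumed).
        self.telemetry_lstm = nn.LSTM(
            telemetry_features, hidden_dim, batch_first=True
        )
        # Fusion head: concatenate branch embeddings, regress a lap-time delta.
        fused_dim = self.text_encoder.config.hidden_size + 2 * hidden_dim
        self.head = nn.Sequential(
            nn.Linear(fused_dim, 64), nn.ReLU(), nn.Linear(64, 1)
        )

    def forward(self, input_ids, attention_mask, face_frames, telemetry):
        text_emb = self.text_encoder(
            input_ids=input_ids, attention_mask=attention_mask
        ).pooler_output                          # (batch, 768)
        face_emb = self.face_cnn(face_frames)    # (batch, hidden_dim)
        _, (h_n, _) = self.telemetry_lstm(telemetry)
        telemetry_emb = h_n[-1]                  # (batch, hidden_dim)
        fused = torch.cat([text_emb, face_emb, telemetry_emb], dim=-1)
        return self.head(fused)                  # predicted lap-time delta (s)
```

Late fusion by concatenation is only one plausible design; attention-based or gated fusion across the three streams would fit the same framing.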
