This study presents the Multimodal Emotion-Performance Analysis (MEPA) framework for quantifying the relationship between Formula 1 drivers' psychological states and on-track performance. MEPA fuses three modalities: BERT-based NLP for analyzing driver radio communications, CNN-powered computer vision for facial expression recognition from broadcast footage, and LSTM networks for modeling time-series race telemetry. Analysis of the textual and video sources yielded a statistically significant emotion-performance correlation, with positive emotional states associated with a mean improvement in lap times.
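The reported correlation between per-lap emotion scores and lap times can be illustrated with a minimal sketch. The data below is hypothetical (the study's actual scores and lap times are not given here): each lap has a fused emotion score in [-1, 1], aggregated from the NLP and facial-expression channels, and a lap-time delta versus the driver's session average.

```python
from statistics import mean
from math import sqrt

def pearson_r(xs, ys):
    # Pearson correlation coefficient between two equal-length series.
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-lap data, for illustration only:
# fused emotion score (higher = more positive state) and
# lap-time delta in seconds (negative = faster than session average).
emotion_scores = [0.8, 0.4, -0.2, -0.6, 0.1, 0.7]
lap_time_deltas = [-0.35, -0.10, 0.20, 0.45, 0.05, -0.25]

r = pearson_r(emotion_scores, lap_time_deltas)
print(f"emotion-performance correlation: {r:.3f}")
```

A negative coefficient here means more positive emotional states coincide with faster laps, which is the direction of the effect the abstract describes; statistical significance would additionally require a hypothesis test against the sample size.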