SINLAB: Continuous body emotion recognition during creative tasks

This project sits at the interface of technology and human emotion. We aim to understand human engagement and affect during creative work using non-invasive methods: cameras and other everyday equipment, no complicated rigs.

[Figure: example of body-based emotion recognition (Flintisis et al., 2019)]

Emotion recognition is rapidly emerging as a pivotal technology across various sectors. In healthcare, it can be employed to monitor patient well-being, detect mental health issues, or enhance therapeutic interventions. For the entertainment industry, understanding audience reactions in real-time offers insights into content optimization. In education, tracking student engagement and emotional states can lead to more effective learning interventions. The reason creative tasks are particularly instrumental in this research is twofold: First, they span a broad spectrum of emotional intensities and complexities, providing a rich dataset for analysis. Second, by decoding the emotional dynamics during creative processes, we can gain deeper insights into human cognition, potentially revolutionizing interfaces, tools, and environments tailored to enhance creative output and well-being.

Objective

As creative tasks progress, subtle physical cues often reflect shifts in human engagement and emotional states. Can we develop a system that harnesses commonly available tech tools, such as cameras, to continuously monitor and interpret these cues during creative undertakings?

Project Description

The goal of this thesis project is to explore non-invasive methods for tracking and assessing human emotions during activities like playing an instrument, coding, writing, or painting. The emphasis is on utilizing standard equipment, primarily cameras, to minimize intrusive monitoring.
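
As a concrete illustration of the camera-based, non-invasive setup, the sketch below shows a minimal continuous capture loop using OpenCV and an ordinary webcam. The camera index and the preview window are illustrative assumptions; the per-frame processing step is left as a placeholder for the actual emotion-recognition pipeline.

    # Minimal sketch: continuous capture from a standard webcam (assumed device index 0).
    # Requires: pip install opencv-python
    import cv2

    cap = cv2.VideoCapture(0)               # any off-the-shelf webcam
    if not cap.isOpened():
        raise RuntimeError("Could not open camera")
    try:
        while True:
            ok, frame = cap.read()           # one BGR frame per iteration
            if not ok:
                break
            # ... hand the frame to the emotion-recognition pipeline here ...
            cv2.imshow("preview", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):   # press 'q' to stop
                break
    finally:
        cap.release()
        cv2.destroyAllWindows()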

Key Tasks

  • Emotion Recognition: Investigate the application of machine vision techniques for recognizing emotions during creative tasks.
  • Deep Learning Deployment: Apply advanced deep learning algorithms to interpret emotion-linked physical cues (see the sketch after this list).
  • Equipment Utilization: Evaluate the feasibility and efficiency of non-specialized equipment for emotion tracking in real-world settings.
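
To make these tasks more tangible, the sketch below outlines one possible pipeline under stated assumptions: MediaPipe Pose extracts 33 body landmarks per frame from a webcam, and a small, untrained PyTorch classifier head maps them to coarse emotion categories. The label set, network size, and single-frame classification are illustrative assumptions only; the thesis would define its own labels and train on recorded, annotated sessions.

    # Illustrative sketch only: body landmarks -> coarse emotion classes.
    # Requires: pip install mediapipe opencv-python torch
    import cv2
    import mediapipe as mp
    import torch
    import torch.nn as nn

    EMOTIONS = ["engaged", "neutral", "frustrated"]   # placeholder label set (assumption)

    class PoseEmotionHead(nn.Module):
        """Tiny MLP over 33 landmarks x (x, y, z, visibility) = 132 features."""
        def __init__(self, n_classes=len(EMOTIONS)):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(33 * 4, 128), nn.ReLU(),
                nn.Linear(128, n_classes),
            )

        def forward(self, x):
            return self.net(x)

    model = PoseEmotionHead()   # untrained here; weights would come from labelled recordings
    model.eval()

    mp_pose = mp.solutions.pose
    cap = cv2.VideoCapture(0)
    with mp_pose.Pose(static_image_mode=False) as pose:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.pose_landmarks:
                feats = [v for lm in results.pose_landmarks.landmark
                         for v in (lm.x, lm.y, lm.z, lm.visibility)]
                with torch.no_grad():
                    logits = model(torch.tensor(feats).unsqueeze(0))
                print(EMOTIONS[int(logits.argmax())])
    cap.release()

In practice, a temporal model over landmark sequences (rather than a per-frame MLP) would better capture movement dynamics, but the sketch shows that commodity cameras and open-source tooling can cover the full path from capture to prediction.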

The students will join the Sustainable Immersive Networking Lab (SIN-LAB) and work in a team setting. SIN-LAB provides ample opportunities for experimental studies, as it is equipped with a wide range of multimedia devices (cameras, LIDARs, headsets, VR glasses) and haptic devices (haptic gloves, robot arms and hands, in-air touch feedback, and a haptic body suit), along with networking equipment to support immersive networking applications.

Learning outcome

  • Be at the frontier of a rapidly evolving interdisciplinary research domain.
  • Interact with state-of-the-art deep learning frameworks and novel sensing methodologies.
  • Contribute to the growing body of knowledge, impacting domains ranging from education to entertainment and mental health.
  • Combine your technical and creative skills.
  • Gain hands-on experience in interdisciplinary system design and user studies.
  • Possibility to publish the results and attend top international conferences.

Suggested courses & skills

  • IN5060
  • IN4260MCT
  • A good foundation in machine learning and computer vision
  • Interest in human-computer interaction
  • Programming skills: Python, Web development
Published Nov. 2, 2023 08:57 - Last modified Nov. 2, 2023 08:57

Supervisor(s)

Scope (credits)

60