Autism-Open Access

EMOTION BASED MUSIC PLAYER USING DEEP LEARNING

Abstract

Author(s): K Praveen Kumar*, Nikhila Telakuntla

Every person's face is different, yet facial expressions tell a common story: they are a major indicator of how someone is feeling. Music is widely regarded as one of the purest forms of expression and creativity, is known to have a strong emotional impact on listeners, and has a special ability to elevate one's mood. Emotion detection is the process of determining a person's feelings from facial cues and other visual information, and the field has flourished with the rise of deep learning, opening the door to a wide range of previously impractical applications. Music has a particularly strong emotional connection: a listener who is feeling a specific emotion will often look for music that conveys it. Using our emotion detection model, we link the detected emotion to a music player that selects songs intended to improve the user experience. The main goal of the proposed system is to build a music player that uses facial expression recognition to infer the user's emotion. Compared with selecting music manually, a system driven by extracted facial features saves time and effort. However, prior research on categorising music by emotion has produced only modest results. In this project, we present an affective, cross-platform Emotion-based Music Player (EMP) that recommends songs based on the user's current mood. EMP provides intelligent, mood-based music suggestions by integrating emotion context reasoning into our adaptive music recommendation engine.
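
The abstract does not specify the authors' implementation, but the pipeline it describes (camera frame, face detection, deep-learning emotion classification, mood-to-music mapping) can be illustrated with a minimal sketch. The sketch below assumes a pre-trained FER-2013-style CNN saved as "fer_cnn.h5" with 48x48 grayscale input, OpenCV's Haar cascade for face detection, and a purely hypothetical emotion-to-playlist mapping; none of these names or parameters come from the paper.

# Minimal sketch: webcam frame -> face crop -> CNN emotion label -> playlist name.
# Assumptions (not from the paper): a pre-trained classifier "fer_cnn.h5",
# 48x48 grayscale input, and an illustrative emotion-to-playlist mapping.
import cv2
import numpy as np
import tensorflow as tf

EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]
PLAYLISTS = {  # hypothetical mapping from detected emotion to a playlist
    "angry": "calming_instrumental", "disgust": "chill_lofi", "fear": "ambient_relax",
    "happy": "upbeat_pop", "sad": "soothing_acoustic", "surprise": "feel_good_hits",
    "neutral": "daily_mix",
}

def detect_emotion(frame, model, face_detector):
    """Return the predicted emotion label for the largest face in the frame, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])      # keep the largest face
    roi = cv2.resize(gray[y:y + h, x:x + w], (48, 48)).astype("float32") / 255.0
    probs = model.predict(roi.reshape(1, 48, 48, 1), verbose=0)[0]
    return EMOTIONS[int(np.argmax(probs))]

if __name__ == "__main__":
    model = tf.keras.models.load_model("fer_cnn.h5")        # assumed pre-trained CNN
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(0)                                # default webcam
    ok, frame = cap.read()
    cap.release()
    if ok:
        emotion = detect_emotion(frame, model, detector)
        if emotion:
            print(f"Detected '{emotion}' -> playing playlist '{PLAYLISTS[emotion]}'")
        else:
            print("No face detected.")

In a full player, the final print statement would instead be replaced by a call into the recommendation engine, which could re-run the detection loop periodically so that the playlist adapts as the user's mood changes.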