Emotion is a relatively young field of study, marked by untested assumptions and limited prior research. Researchers know that emotions involve physiological and behavioral responses, but even the basic concept of ‘emotion’ remains poorly defined. Moreover, few studies have examined the intersection of auditory stimuli and facial emotion recognition. This study presents an experiment in which participants received conflicting emotional cues through the facial and auditory channels. A pilot group of 55 participants, recruited through Amazon’s Mechanical Turk (M-Turk), was asked to identify a facial expression as “happy” or “sad” while listening to music. The results show that when different musical stimuli accompany a somewhat ambiguous facial expression, the tone of the music can have a significant effect on facial emotion recognition. Extending this research may provide insight into how the brain processes emotion.