Saturday, December 30, 2006

[Reading] XPod - A Human Activity and Emotion Aware Mobile Music Player

http://ebiquity.umbc.edu/paper/html/id/280/

XPod - a human activity and emotion aware mobile music player

Authors: Sandor Dornbush, Kevin Fisher, Kyle McKay, Alex Prikhodko, and Zary Segall

Book Title: Proceedings of the International Conference on Mobile Technology, Applications and Systems

Date: November 17, 2005

Abstract: In this paper, we consider the notion of collecting human emotion and activity information from the user, and explore how this information could be used to improve the user experience with mobile music players. This paper proposes a mobile MP3 player, XPod, which is able to automate the process of selecting the song best suited to the emotion and the current activity of the user. The XPod concept is based on the idea of automating much of the interaction between the music player and its user. The XPod project introduces a "smart" music player that learns its user's preferences, emotions and activity, and tailors its music selections accordingly. The device is able to monitor a number of external variables to determine its user's levels of activity, motion and physical states to make an accurate model of the task its user is undertaking at the moment and predict the genre of music that would be appropriate. The XPod relies on its user to train the player as to what music is preferred and under what conditions. After an initial training period, the XPod is able to use its internal algorithms to make an educated selection of the song that would best fit its user's emotion and situation. We use the data gathered from a streaming version of the BodyMedia SenseWear to detect different levels of user activity and emotion. After determining the state of the user, the neural network engine compares the user's current state, time, and activity levels to past user song preferences matching the existing set of conditions and makes a musical selection. The XPod system was trained to play different music based on the user’s activity level. A simple pattern was used so the state-dependent customization could be verified. XPod successfully learned the pattern of listening behavior exhibited by the test user. As the training proceeded, the XPod learned the desired behavior and chose music to match the preferences of the test user. XPod automates the process of choosing music best suited for a user’s current activity. The success of the initial implementation of XPod concepts provides the basis for further exploration of human- and emotion-aware mobile music players.
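
The abstract does not spell out how the selection step works, so here is a minimal sketch of one plausible reading: score every candidate song under the user's current state with the learned preference model and play the highest-scoring one. The state fields, the score_preference stub, and the tiny library are my own assumptions, not code from the paper.

# Hypothetical sketch of XPod's selection step (not from the paper).
# score_preference stands in for the trained neural network; here it
# is a hand-written stub so the example runs on its own.

def score_preference(state, song):
    # Placeholder for the learned model: prefer high-energy genres
    # when the activity level is high. Purely illustrative numbers.
    energy = {"ambient": 0.2, "rock": 0.8, "techno": 0.9}
    return 1.0 - abs(state["activity_level"] - energy[song["genre"]])

def pick_next_song(state, library):
    # Compare the current state against every candidate and play the
    # song the model scores highest for these conditions.
    return max(library, key=lambda song: score_preference(state, song))

state = {"activity_level": 0.9, "hour": 18}
library = [
    {"title": "A", "genre": "ambient"},
    {"title": "B", "genre": "rock"},
    {"title": "C", "genre": "techno"},
]
print(pick_next_song(state, library)["title"])  # -> C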

==== After Reading ====
It is a prototype application. They collect acceleration, galvanic skin response (GSR), skin temperature, heat flux, and near-body temperature as input data. Using a fully connected neural network, the system predicts whether the user will skip the current song. It is similar in spirit to the products recently released by Nike and Adidas.
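
For concreteness, here is a minimal sketch of such a skip predictor: a one-hidden-layer fully connected network over the five sensor channels, trained with plain gradient descent on toy data. The layer sizes, normalization, and training setup are my assumptions; the paper does not give them.

import numpy as np

# Hypothetical skip predictor (not the paper's code): map the five
# SenseWear-style signals (acceleration, GSR, skin temperature, heat
# flux, near-body temperature) to the probability of skipping a song.

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy training data: rows are normalized sensor readings; the label is
# 1 if the user skipped the song under those conditions. Here skips
# are made to correlate with the acceleration channel.
X = rng.random((200, 5))
y = (X[:, 0] > 0.5).astype(float)

W1 = rng.normal(0.0, 0.5, (5, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)

lr = 0.5
for _ in range(2000):
    h = np.tanh(X @ W1 + b1)          # hidden layer
    p = sigmoid(h @ W2 + b2).ravel()  # P(skip)
    dz2 = (p - y)[:, None] / len(y)   # gradient of the logistic loss
    dW2 = h.T @ dz2; db2 = dz2.sum(0)
    dh = dz2 @ W2.T * (1.0 - h**2)    # backprop through tanh
    dW1 = X.T @ dh; db1 = dh.sum(0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

reading = np.array([[0.9, 0.4, 0.5, 0.6, 0.5]])  # high acceleration
p_skip = sigmoid(np.tanh(reading @ W1 + b1) @ W2 + b2).item()
print(f"P(skip) = {p_skip:.2f}")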
However, I think the claim of selecting songs by detecting emotion is a little weak. Most of what they did was feed the bio-sensor signals into a neural network. They did take changes in the signals into account, but they never explain how emotion is modeled or how the network's output is to be interpreted as an emotional inference.
Moreover, I think some bio-signals can help with activity recognition, especially in judging how 'active' the user is.
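
For example, motion variance alone cannot tell a bumpy bus ride from a jog, while a raised heat flux or GSR shows the body is actually working. Below is a minimal sketch of such a combined activity-level feature; the blend weights and normalization constants are made up for illustration.

import statistics

# Hypothetical activity-level estimate (not from the paper): combine
# short-window accelerometer variance with a normalized heat-flux
# reading, so physiological effort backs up the motion signal.

def activity_level(accel_window, heat_flux, flux_rest=1.0, flux_max=5.0):
    motion = statistics.pvariance(accel_window)  # movement intensity
    motion = min(motion / 0.5, 1.0)              # crude normalization
    effort = (heat_flux - flux_rest) / (flux_max - flux_rest)
    effort = min(max(effort, 0.0), 1.0)          # clamp to [0, 1]
    return 0.5 * motion + 0.5 * effort           # blend both cues

resting = activity_level([1.0, 1.01, 0.99, 1.0], heat_flux=1.2)
jogging = activity_level([0.2, 1.8, 0.4, 1.6], heat_flux=4.0)
print(f"resting = {resting:.2f}, jogging = {jogging:.2f}")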
