Saturday, December 30, 2006

[Reading] XPod - A Human Activity and Emotion Aware Mobile Music Player

http://ebiquity.umbc.edu/paper/html/id/280/

XPod - a human activity and emotion aware mobile music player

Authors: Sandor Dornbush, Kevin Fisher, Kyle McKay, Alex Prikhodko, and Zary Segall

Book Title: Proceedings of the International Conference on Mobile Technology, Applications and Systems

Date: November 17, 2005

Abstract: In this paper, we consider the notion of collecting human emotion and activity information from the user, and explore how this information could be used to improve the user experience with mobile music players. This paper proposes a mobile MP3 player, XPod, which is able to automate the process of selecting the song best suited to the emotion and the current activity of the user. The XPod concept is based on the idea of automating much of the interaction between the music player and its user. The XPod project introduces a "smart" music player that learns its user's preferences, emotions and activity, and tailors its music selections accordingly. The device is able to monitor a number of external variables to determine its user's levels of activity, motion and physical states to make an accurate model of the task its user is undertaking at the moment and predict the genre of music that would be appropriate. The XPod relies on its user to train the player as to what music is preferred and under what conditions. After an initial training period, the XPod is able to use its internal algorithms to make an educated selection of the song that would best fit its user's emotion and situation. We use the data gathered from a streaming version of the BodyMedia SenseWear to detect different levels of user activity and emotion. After determining the state of the user the neural network engine compares the user's current state, time, and activity levels to past user song preferences matching the existing set of conditions and makes a musical selection. The XPod system was trained to play different music based on the user's activity level. A simple pattern was used so the state-dependent customization could be verified. XPod successfully learned the pattern of listening behavior exhibited by the test user. As the training proceeded the XPod learned the desired behavior and chose music to match the preferences of the test user.
XPod automates the process of choosing music best suited for a user’s current activity. The success of the initial implementation of XPod concepts provides the basis for further exploration of human- and emotion-aware mobile music players.

==== After Reading ====
It is a prototype application. They collect acceleration, galvanic skin response (GSR), skin temperature, heat flux, and near-body temperature as input data. Using a fully connected neural network, the system predicts whether the user will skip the current song. It looks like the products recently released by Nike and Adidas.
But I think the claim of song selection by emotion detection is a little weak. Mostly what they did was feed the bio-sensor signals into the NN. They considered the changes in the signals, but they did not explain how emotion is modeled or how they infer the result from it.
Moreover, I think some bio signals can help with the recognition of activity, especially the activity level.
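To make the "sensor signals into NN" idea concrete, here is a minimal sketch of my own (not the authors' code): a fully connected network with one hidden layer that maps the five SenseWear-style readings to a skip probability. The layer sizes and weights are made up for illustration.

```python
import math
import random

random.seed(0)

# Five inputs: acceleration, GSR, skin temp, heat flux, near-body temp
N_IN, N_HID = 5, 4

# Made-up weights; in XPod these would be learned from the user's skips.
w_hid = [[random.uniform(-1, 1) for _ in range(N_IN)] for _ in range(N_HID)]
w_out = [random.uniform(-1, 1) for _ in range(N_HID)]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def predict_skip(features):
    """Return the probability that the user skips the current song."""
    hidden = [sigmoid(sum(w * f for w, f in zip(row, features)))
              for row in w_hid]
    return sigmoid(sum(w * h for w, h in zip(w_out, hidden)))

# One made-up normalized sensor reading vector:
p = predict_skip([0.8, 0.3, 0.5, 0.6, 0.4])
print(round(p, 3))  # a probability in (0, 1)
```

In the real system the weights would be trained on the user's listening history; this sketch only shows the forward pass.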

Friday, December 22, 2006

[Reading] A Study in Users' Physiological Response to an Empathic Interface Agent

Helmut Prendinger, Christian Becker, and Mitsuru Ishizuka.
A study in users' physiological response to an empathic interface agent. [pdf]
International Journal of Humanoid Robotics, Vol. 3, No. 3, Sept. 2006, pp 371-391.
http://www.worldscinet.com/191/03/0303/S0219843606000801.html


Game: Skip-Bo
Emotion recognition: SC, EMG, & game state (time) as input; 2 axes (arousal & valence) as output. (ref 29)
Method: ANOVA
Subject: 32 (14m, 18f)
Hypothesis:
  • If the virtual game opponent behaves "naturally" in that it follows its own goals and expresses associated positively or negatively valenced affective behaviors, users will be less aroused or stressed than when the agent does not do so.
  • If the game opponent is oriented only toward its own goals and displays associated behaviors, users will be less aroused or stressed than when the agent does not express any emotion at all.
Agent's behavior:
  • non-emotional
  • self-centered emotional
  • negative empathic
  • positive empathic
Agent's abilities: (in PAD space)
  • auditory speech
  • facial
  • body gesture
Result:
  • The positive empathic condition was experienced as significantly more arousing or stressful than the negative empathic condition.
  • Users seemingly do not respond significantly differently when empathic agent behavior is absent.
Notes:
  • higher EMG -> more negative valence
  • global baseline -> to account for individual differences
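A small sketch of my own (not from the paper) of the baseline idea in the last note: subtract a per-subject resting baseline from the raw SC or EMG trace so that values are comparable across individuals. The numbers are made up.

```python
def resting_baseline(rest_trace):
    """Mean of the signal recorded during a rest period."""
    return sum(rest_trace) / len(rest_trace)

def baseline_correct(signal, baseline):
    """Subtract a resting-period baseline from a raw signal trace."""
    return [x - baseline for x in signal]

sc_rest = [2.1, 2.0, 2.2, 2.1]   # made-up resting SC (microsiemens)
sc_game = [2.5, 3.0, 2.8, 3.2]   # made-up SC during the game

corrected = baseline_correct(sc_game, resting_baseline(sc_rest))
print([round(x, 2) for x in corrected])  # -> [0.4, 0.9, 0.7, 1.1]
```

Each corrected value is now a deviation from the subject's own rest level rather than an absolute reading.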

a good word to use

  empathy
n : understanding and entering into another's feelings

This is what I want the computer to know and to do.

Tuesday, December 05, 2006

[listen] Brooky's proposal

  • Learnability - number of features vs. training set size (how much data do I need to train a trustworthy model?)
  • Technical details of methods - such as SVM, DBN
  • How to use CRF.
Go, go, go!
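On the first question above, a back-of-the-envelope note of my own (not from the proposal): a common heuristic is to want on the order of ten training samples per feature, which gives a quick lower bound to sanity-check a dataset.

```python
def min_training_samples(n_features, samples_per_feature=10):
    """Rule-of-thumb lower bound on training set size.

    The ~10 samples per feature factor is only a common heuristic,
    not a guarantee of a trustworthy model.
    """
    return n_features * samples_per_feature

# e.g. 5 bio-sensor features -> at least ~50 labeled examples
print(min_training_samples(5))  # 50
```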

Experiment Prework

The induction material has been recorded as many sound files. I have made some movie clips for the different mood inductions. I will finish the Flash application for connecting the database and the different playback conditions tomorrow or on Thursday.

One thing I still need to do is ask SY to borrow the bio-sensors. I need time....!!!
So many things to do.