Thursday, November 01, 2007

Hibernate Implementation Tips

Assuming you already have some basic idea of Hibernate, here is a small summary of the most fundamental things you are likely to need when implementing it.
  1. Suppose the external connection name of the database is dbname. Then dbname is also the database name used in hibernate.cfg.xml or in the hibernate.properties file.
  2. Every table in the database has a corresponding javaBean.hbm.xml, and each such file maps to a class whose source file is javaBean.java. In short, every table has one mapping file and one bean corresponding to it:
    table <--> javaBean.hbm.xml <--> javaBean.java
  3. You can use either hibernate.cfg.xml or hibernate.properties. When both exist, the former overrides the latter. Their job is to hold the information needed to open a connection, such as the database location and name, the account, the password, the SQL dialect to use, and so on.
  4. Add log4j.properties so that the relevant logs get written out.
  5. The SessionFactory is built from the configuration file (usually named hibernate.cfg.xml). If no name is given to the factory in that file, there is no need to bind it to JNDI.
  6. To sum up, the files you need are: javaBean.java, javaBean.hbm.xml, hibernate.cfg.xml or hibernate.properties, and log4j.properties. A minimal bootstrap sketch follows this list.
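
As a sketch of tips 5 and 6, the following Java snippet (Hibernate 3 style) builds the SessionFactory from hibernate.cfg.xml and loads one mapped bean. The JavaBean class and the id value are hypothetical placeholders for whatever your own mapping file defines.

import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.cfg.Configuration;

public class HibernateBootstrap {
    public static void main(String[] args) {
        // Reads hibernate.cfg.xml (or falls back to hibernate.properties) from the classpath
        SessionFactory factory = new Configuration().configure().buildSessionFactory();
        Session session = factory.openSession();
        try {
            // JavaBean is the class mapped by javaBean.hbm.xml; the id 1L is only an example
            JavaBean bean = (JavaBean) session.get(JavaBean.class, Long.valueOf(1L));
            System.out.println(bean);
        } finally {
            session.close();
            factory.close();
        }
    }
}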

Difference/Relation between Transaction and Session

The first time I saw these terms it felt like studying English vocabulary again: I had no idea how large their scope was. So here is a note on the difference between a transaction and a session.
  1. When a client connects to the DB and passes authentication, a session is established. One such session can contain many transactions.
  2. A transaction is by definition an indivisible unit of work, i.e. a series of consecutive operations that cannot be split apart. Hence the database updates inside one transaction either all succeed or are all rolled back. The default commit mode is usually auto-commit, so every SQL statement is its own transaction; if the commit mode is set to manual commit instead, the transaction boundaries are drawn by commit or rollback (a small code sketch follows).
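
A minimal sketch of those boundaries with the Hibernate API; the Account class and the transfer logic are hypothetical, and the same pattern holds for plain JDBC with setAutoCommit(false), commit() and rollback().

import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.Transaction;

public class TransferDemo {
    // One session can run many transactions; each transaction commits or rolls back as a whole.
    public static void transfer(SessionFactory factory) {
        Session session = factory.openSession();    // the session: one authenticated connection context
        Transaction tx = null;
        try {
            tx = session.beginTransaction();         // manual transaction boundary starts here
            Account from = (Account) session.get(Account.class, Long.valueOf(1L));  // hypothetical mapped class
            Account to = (Account) session.get(Account.class, Long.valueOf(2L));
            from.setBalance(from.getBalance() - 100);
            to.setBalance(to.getBalance() + 100);
            tx.commit();                             // both updates succeed together ...
        } catch (RuntimeException e) {
            if (tx != null) {
                tx.rollback();                       // ... or neither is applied
            }
            throw e;
        } finally {
            session.close();                         // the same session could go on to run further transactions
        }
    }
}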

Monday, October 29, 2007

Today I was deploying an EJB. Because it uses Spring, I tweaked its MANIFEST.MF, thinking I could pull the dependent jars into the EJB's jar that way. I expected no problem, but after fiddling with it for a whole day I finally found the key sentence that made all that time feel wasted:
Use the manifest file to specify that a JAR file can reference another JAR file. Standalone EJBs cannot use the Manifest Class-Path. It is only supported for components that are deployed within an EAR file. The clients should reference the client.jar in the classpath entry of the manifest file.

That really made me sad: so that was the reason. What is still truly strange is that WebLogic runs perfectly fine if I deploy the original file directly, which leaves me rather confused.
There are plenty of workarounds to be found, such as stuffing the jars straight into WebLogic's classpath, but I wanted to avoid touching the server settings, and that cost me an entire day!! Argh~~

Friday, October 26, 2007

The contents of web.xml

web.xml is a file required when deploying a web application, and it is placed under /WEB-INF/. Since it is an XML file, its contents are case-sensitive. In addition, the order in which the elements are placed is fixed.

1. Header: the file must begin with the XML declaration, usually:
<?xml version="1.0" encoding="UTF-8" ?>
2. DOCTYPE: tells the server which servlet version is used (2.2 or 2.3):
<!DOCTYPE web-app PUBLIC "-//Sun Microsystems, Inc.//DTD Web Application 2.3//EN" "http://java.sun.com/dtd/web-app_2_3.dtd">
3. Root: must be <web-app>, always in lower case. For example:
<web-app>
<!-- Other elements go here. All are optional. -->
</web-app>
4. Elements: they must appear in the following order:
  • icon
  • display-name
  • description
  • distributable
  • context-param
  • filter
  • filter-mapping
  • listener
  • servlet
  • servlet-mapping
  • session-config
  • mime-mapping
  • welcome-file-list
  • error-page
  • taglib
  • resource-env-ref
  • resource-ref
  • security-constraint
  • login-config
  • security-role
  • env-entry
  • ejb-ref
  • ejb-local-ref

For the full details refer to the DTD definition; below I only list the elements I use most often, plus a few things to watch out for.

1. servlet: the two most commonly used subelements are servlet-name and servlet-class, and servlet-name must appear before servlet-class. The registered name matters in two ways: a) initialization parameters, custom URL patterns, and other customizations refer to the servlet by the registered name, not by the class name; b) the name can be used in the URL instead of the class name. That is, I can use http://host/webAppPrefix/servlet/xxx instead of http://host/webAppPrefix/servlet/package.yyyServlet, where xxx is the configured servlet-name and package.yyyServlet is the servlet's class. Also, when the servlet is actually a JSP file, use the jsp-file tag instead of servlet-class, because we have no way to know the class name the JSP is compiled into. A container can hold many servlets, so web.xml can likewise define several servlet entries at the same time.

2. servlet-mapping: used to customize URLs. The general structure is:

<servlet-mapping>
<servlet-name>name</servlet-name>
<url-pattern>url</url-pattern>
</servlet-mapping>
Here name is the servlet-name defined in the servlet element, and url is a pattern that roughly takes one of the following forms:
a) Path mapping: starts with / and ends with /*
b) Extension mapping: starts with the prefix *.
c) Default servlet mapping: just /
d) Exact mapping: e.g. ab/tt/cc.action
This is why defining /*.action fails: it is both a path mapping and an extension mapping at the same time, so the container cannot decide which to use.

3. Using a servlet's initialization parameters. The general form is similar to the following:

<servlet>
<servlet-name>InitTest</servlet-name>
<servlet-class>myPackage.InitServlet</servlet-class>
<init-param>
<param-name>param1</param-name>
<param-value>Value 1</param-value>
</init-param>
</servlet>
Initialization parameters are only available when servlets are accessed by means of their registered names or through custom URL patterns associated with their registered names. Initialization parameters are not available in servlets that are accessed by their default URL. When writing your own servlet's init(), you can call getServletConfig().getInitParameter("...") to retrieve a parameter, whose type is String. A short sketch follows.
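
A minimal sketch of reading the param1 value declared above inside init(); the fallback value is only illustrative.

package myPackage;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;

// Corresponds to myPackage.InitServlet, registered above under the name InitTest
public class InitServlet extends HttpServlet {
    private String param1;

    public void init() throws ServletException {
        // Returns the <param-value> for param1, or null when the servlet is reached
        // through its default URL rather than the registered name
        param1 = getServletConfig().getInitParameter("param1");
        if (param1 == null) {
            param1 = "default value";   // illustrative fallback
        }
    }
}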

4. context-param provides settings to the whole application; the subelements it contains are param-name and param-value.

5. To have a servlet loaded when the system starts up, use the load-on-startup tag.

6. The welcome page is configured through the welcome-file-list element. The welcome-file entries placed inside it are ordered: if the first one cannot be found, the second is tried, and so on; if none is found, the server's own default is used.


When Spring meets EJB

Lately I have been learning a pile of Java frameworks. For a lot of people none of this is new, but I still run into problems here and there, so I am writing down the notes I use most often.

As I remember, the first thing I ran into was how to configure an EJB with Spring. There are many ways to do it; here is a short description of mine.

In Spring, three abstract classes are provided that serve as the EJB bean classes:

In Spring                        In EJB
AbstractStatelessSessionBean     Stateless Session Bean
AbstractStatefulSessionBean      Stateful Session Bean
AbstractMessageDrivenBean        Message-Driven Bean

All of the above derive from AbstractEnterpriseBean.

The steps needed are as follows:

1. BeanFactoryLocator: this is an interface; it must provide the method BeanFactoryReference useBeanFactory(String factoryKey) throws BeansException;

2. ServiceInterface and its implementation.

3. The EJBObject interface, which should extend both the ServiceInterface and EJBObject.

4. EJBHome: the home interface extends EJBHome; remember that its create() method must return the type from step 3.

5. Bean: extend the appropriate class from the table above; the most important thing is to provide onEjbCreate() throws CreateException (see the sketch after the env-entry example below).

6. Finally, add an <env-entry> with <env-entry-name>, <env-entry-type>, and <env-entry-value> to the deployment descriptor (the ejb-jar.xml file):


<env-entry>
<env-entry-name>ejb/BeanFactoryPath</env-entry-name>
<env-entry-type>java.lang.String</env-entry-type>
<env-entry-value>spring-config.xml</env-entry-value>
</env-entry>
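
A minimal sketch of step 5 for the stateless case (MyService, its doWork method, and the bean id "myService" are hypothetical names): the bean extends AbstractStatelessSessionBean, pulls the Spring-managed implementation out of the bean factory in onEjbCreate(), and delegates the business method to it.

import javax.ejb.CreateException;
import org.springframework.ejb.support.AbstractStatelessSessionBean;

public class MyServiceEjb extends AbstractStatelessSessionBean {

    private MyService service;   // MyService is the ServiceInterface from step 2

    protected void onEjbCreate() throws CreateException {
        // getBeanFactory() is backed by the locator configured through the
        // ejb/BeanFactoryPath env-entry shown above (spring-config.xml)
        service = (MyService) getBeanFactory().getBean("myService");
    }

    public String doWork(String input) {
        return service.doWork(input);   // delegate the business method to the Spring bean
    }
}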

Tuesday, October 09, 2007

How to fix ClassNotFoundException: org.hibernate.hql.ast.HqlToken

From http://www.blogjava.net/SkyWinder/articles/40306.html
After an application carrying hibernate3.jar was deployed to WebLogic 8.1, it threw the exception CharScanner; panic: ClassNotFoundException: org.hibernate.hql.ast.HqlToken. Fix: in hibernate.properties, or in Spring's context XML, add the property hibernate.query.factory_class with the value org.hibernate.hql.classic.ClassicQueryTranslatorFactory.
Cause: from what I found on the web, weblogic.jar already contains its own version of antlr.jar, so the antlr.jar that hibernate3.jar depends on cannot be found, which leads to the exception. (A programmatic sketch of the fix follows.)
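
If Hibernate is configured programmatically instead of through hibernate.properties, the same switch can be applied roughly like this; it is just a sketch using the property name and value quoted above.

import org.hibernate.SessionFactory;
import org.hibernate.cfg.Configuration;

public class ClassicQueryFactoryConfig {
    public static SessionFactory build() {
        Configuration cfg = new Configuration().configure();
        // Use the classic (non-ANTLR) HQL translator so the antlr version
        // bundled inside weblogic.jar no longer gets in the way
        cfg.setProperty("hibernate.query.factory_class",
                "org.hibernate.hql.classic.ClassicQueryTranslatorFactory");
        return cfg.buildSessionFactory();
    }
}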

Tuesday, May 01, 2007

Control LED

The board has 16 ports for controlling its environment, and LEDs can be connected to them.
clipped from www.parallax.com
The BASIC Stamp 2 is a 24-pin DIP (Dual inline package) module. Most commonly referred to as a microcontroller, on occasion you may see it being called a single board computer since it has its very own processor, memory, clock, and interface (via 16 I/O pins). The BASIC Stamp essentially serves as the brains inside of electronics projects and applications that require a programmable microcontroller. It is able to control and monitor switches, timers, motors, sensors, relays, valves, and more. Best yet, programming may be performed in the PBASIC language. Very similar to BASIC, this language has a quick learning curve and no compiler is required.
Processor Speed: 20 MHz
Program Execution Speed: ~4,000 instructions/sec.
RAM Size: 32 Bytes (6 I/O, 26 Variable)
EEPROM (Program) Size: 2K Bytes, ~500 instructions
I/O Pins: 16 + 2 Dedicated Serial
Voltage Requirements: 5 - 15 vdc
Current Draw at 5V: 3 mA Run / 50 µA Sleep
PBASIC Commands: 42
Size: 1.2" x 0.6" x 0.4"

Tuesday, April 24, 2007

SVM v2 result

F_.v1.v2.Test

Best c=32.0, g=0.125 CV rate=94.8725
Training...
Output model: F_v1.v2.Train.model
Scaling testing data...
Testing...
Accuracy = 46.9498% (1647/3508) (classification)
Output prediction: F_v1.v2.Test.predict


answer
H, A, S, F, P |Predict
48 69 103 27 264 |0
9 7 2 63 78 |1
0 24 46 3 7 |2
6 7 15 46 77 |3
287 187 184 449 1500 |4

=============
F0_.v1.v2.Test

Best c=32.0, g=0.125 CV rate=94.7875
Training...
Output model: F0_v1.v2.Train.model
Scaling testing data...
Testing...
Accuracy = 52.1095% (1828/3508) (classification)
Output prediction: F0_v1.v2.Test.predict


answer
H, A, S, F, P |Predict
47 44 50 1 159 |0
8 2 0 46 56 |1
0 23 72 24 12 |2
1 5 12 26 18 |3
294 220 216 491 1681 |4

J48 v1 result

F_v1.csv(10 CV)
=== Run information ===

Scheme: weka.classifiers.trees.J48 -C 0.25 -M 2
Relation: F_v1
Instances: 6274
Attributes: 103
[list of attributes omitted]
Test mode: 10-fold cross-validation

=== Classifier model (full training set) ===

=== Summary ===

Correctly Classified Instances 5823 92.8116 %
Incorrectly Classified Instances 451 7.1884 %
Kappa statistic 0.8832
K&B Relative Info Score 552448.6144 %
K&B Information Score 9940.3112 bits 1.5844 bits/instance
Class complexity | order 0 11285.6028 bits 1.7988 bits/instance
Class complexity | scheme 327256.4897 bits 52.1607 bits/instance
Complexity improvement (Sf) -315970.8869 bits -50.362 bits/instance
Mean absolute error 0.0316
Root mean squared error 0.1657
Relative absolute error 12.8354 %
Root relative squared error 47.2221 %
Total Number of Instances 6274

=== Detailed Accuracy By Class ===

TP Rate FP Rate Precision Recall F-Measure Class
0.923 0.006 0.942 0.923 0.933 _S
0.845 0.012 0.851 0.845 0.848 _A
0.941 0.013 0.925 0.941 0.933 _F
0.861 0.017 0.856 0.861 0.858 _H
0.948 0.07 0.949 0.948 0.949 _P

=== Confusion Matrix ===

a b c d e <-- classified as
553 4 14 1 27 | a = _S
8 382 7 8 47 | b = _A
1 15 859 5 33 | c = _F
1 6 8 570 77 | d = _H
24 42 41 82 3459 | e = _P

Number of Leaves : 278

Size of the tree : 555



================
F0_v1.csv (10 CV)

=== Run information ===

Scheme: weka.classifiers.trees.J48 -C 0.25 -M 2
Relation: F0_v1
Instances: 6274
Attributes: 103
[list of attributes omitted]
Test mode: 10-fold cross-validation

=== Classifier model (full training set) ===

Number of Leaves : 262

Size of the tree : 523


Time taken to build model: 34.48 seconds

=== Stratified cross-validation ===
=== Summary ===

Correctly Classified Instances 5742 91.5206 %
Incorrectly Classified Instances 532 8.4794 %
Kappa statistic 0.8615
Mean absolute error 0.0364
Root mean squared error 0.1782
Relative absolute error 14.7968 %
Root relative squared error 50.7918 %
Total Number of Instances 6274

=== Detailed Accuracy By Class ===

TP Rate FP Rate Precision Recall F-Measure Class
0.93 0.008 0.922 0.93 0.926 _S
0.812 0.012 0.838 0.812 0.825 _A
0.92 0.015 0.911 0.92 0.916 _F
0.802 0.016 0.854 0.802 0.827 _H
0.945 0.092 0.935 0.945 0.94 _P

=== Confusion Matrix ===

a b c d e <-- classified as
557 7 10 1 24 | a = _S
9 367 16 8 52 | b = _A
5 12 840 5 51 | c = _F
1 8 8 531 114 | d = _H
32 44 48 77 3447 | e = _P

SVM v1 result

F_v1.test.5

Best c=128.0, g=0.5 CV rate=93.2173
Training...
Output model: F_v1.train.5.model
Scaling testing data...
Testing...
Accuracy = 94.8791% (1334/1406) (classification)
Output prediction: F_v1.test.5.predict


answer
H, A, S, F, P |Predict
129 0 0 1 13 |0
0 88 1 0 2 |1
1 0 124 0 2 |2
0 1 2 188 3 |3
20 11 3 12 805 |4


=========
F0_v1.test.5

Best c=32.0, g=0.5 CV rate=93.0753
Training...
Output model: F0_v1.train.5.model
Scaling testing data...
Testing...
Accuracy = 94.8791% (1334/1406) (classification)
Output prediction: F0_v1.test.5.predict


answer
H, A, S, F, P |Predict
129 0 0 1 12 |0
0 86 1 0 3 |1
1 0 124 0 2 |2
0 0 2 189 2 |3
20 14 3 11 806 |4

Wednesday, April 11, 2007

BVP analysis


analysis of BVP

Tuesday, April 10, 2007

FFT

Fs is the sample rate.
y is the data to run the FFT on, with length L.
NFFT = 2^nextpow2(L); % Next power of 2 from length of y
Y is the FFT result.
f is the horizontal axis in the frequency domain, so the spacing between frequency points is (2/NFFT)*(Fs/2) = Fs/NFFT.

clipped from www.mathworks.com
Y = fft(y,NFFT)/L;
f = Fs/2*linspace(0,1,NFFT/2);

http://civil.njtc.edu.tw/weng/excel/lectureNote/2.8.htm
One more note: only half of the points are usable.

clipped from civil.njtc.edu.tw

Suppose there are N data points in the time domain. After the FFT the frequency domain still has N points, but they are now complex. The first of these N points is the sum of all the others, and the first N/2 points and the last N/2 points are complex conjugates of each other, symmetric about the midpoint; for example, with 256 points the symmetry point is at point 128. So if there are N points in the time domain, only N/2 points are usable after transforming to the frequency domain. As for the frequency spacing: if the time-domain samples are spaced Δt apart, the frequency-domain spacing is Δf = 1/(N·Δt). For example, for the A900 seismograph mentioned above, Δt = 0.005 s; taking N = 1024 points for the FFT leaves only half of them, 512 points, usable, with a frequency spacing of 1/(1024 × 0.005) = 0.1953125 cps.


Power Spectrum Density

x: the data
window: the size of the window considered over the data
noverlap: the number of samples that overlap between adjacent windows
nfft: the FFT length used for each window
clipped from www.mathworks.com

[Pxx,w] = pwelch(x,window,noverlap) divides x into segments according to window, and uses the integer noverlap to specify the number of signal samples (elements of x) that are common to two adjacent segments. noverlap must be less than the length of the window you specify. If you specify noverlap as the empty vector [], then pwelch determines the segments of x so that there is 50% overlap (default).

[Pxx,w] = pwelch(x,window,noverlap,nfft) uses Welch's method to estimate the PSD while specifying the length of the FFT with the integer nfft. If you set nfft to the empty vector [], it adopts the default value for N listed in the previous syntax.


Monday, April 09, 2007

I keep forgetting to add the average

When taking the average, how exactly should it be taken? I currently have two approaches: one averages over the entire induction process; the other averages over each of the segmented files.

The second one seems to make more sense. ! So?

smooth?! normalized?!

Smoothing and normalization are what I have been thinking about lately. How to implement them accurately, and how to do it well, both need consideration....

For smoothing I have written two methods so far: a Gaussian smoothing filter (convolving with a Gaussian window) and a standard low-pass filter (span = 5). There are plenty of other methods, but I cannot yet tell which are better, worse, or what their strengths are. It feels like I should process the signal according to its characteristics; for now I am inclined to just write them up and set them aside XD

Besides that, normalization is really just a shift and a scale. Is it really necessary before putting the data into the features?! I honestly don't know which is better.... I'll write it and set it aside as well.... XD (A rough sketch of both is below.)
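
A rough Java sketch of the two ideas above, assuming nothing about the real thesis code: Gaussian smoothing by convolving with a normalized Gaussian window, and normalization as a plain shift-and-scale into [0, 1]. The window half-width and sigma are arbitrary illustrative values.

public class Preprocess {

    // Smooth by convolving with a Gaussian window (edges are clamped)
    static double[] gaussianSmooth(double[] signal, int halfWidth, double sigma) {
        int width = 2 * halfWidth + 1;
        double[] kernel = new double[width];
        double sum = 0;
        for (int i = 0; i < width; i++) {
            double x = i - halfWidth;
            kernel[i] = Math.exp(-(x * x) / (2 * sigma * sigma));
            sum += kernel[i];
        }
        for (int i = 0; i < width; i++) {
            kernel[i] /= sum;   // normalize the kernel weights
        }
        double[] out = new double[signal.length];
        for (int i = 0; i < signal.length; i++) {
            double acc = 0;
            for (int k = 0; k < width; k++) {
                int j = Math.min(Math.max(i + k - halfWidth, 0), signal.length - 1);
                acc += kernel[k] * signal[j];
            }
            out[i] = acc;
        }
        return out;
    }

    // "Normalization is really just a shift and a scale": map the signal into [0, 1]
    static double[] minMaxNormalize(double[] signal) {
        double min = signal[0];
        double max = signal[0];
        for (double v : signal) {
            if (v < min) min = v;
            if (v > max) max = v;
        }
        double range = (max - min) == 0 ? 1 : (max - min);
        double[] out = new double[signal.length];
        for (int i = 0; i < signal.length; i++) {
            out[i] = (signal[i] - min) / range;
        }
        return out;
    }
}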

The writing is almost done; I need to hurry up and train.

Sunday, March 25, 2007

Progress!

Updating!!

Something else.

=== Thesis
Acknowledgments 50%
Abstract 10%

Chapter 1 Introduction
1.1 Motivation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20%
1.2 Background . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
1.3 Research Objectives . . . . . . . . . . . . . . . . . . . . . . . . . . . .

Chapter 2 Related Work
2.1 View from Psychology . . . . . . . . . . . . . . . . . . . . . . . . . .100%
2.1.1 Emotion Models . . . . . . . . . . . . . . . . . . . . . . . . . .
2.1.2 Characteristic and Roles of emotion . . . . . . . . . . . . . . .

2.2 Recognition in Artificial Intelligence . . . . . . . . . . . . . . . . . . . 100%
2.2.1 Emotion Recognition . . . . . . . . . . . . . . . . . . . . . . . 90%
2.2.2 Learning Method . . . . . . . . . . . . . . . . . . . . . . . . . 0%

2.3 Relevance of Emotion Research for Affective Computing . . . . . . .
2.3.1 Evaluation . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
2.3.2 Health Care . . . . . . . . . . . . . . . . . . . . . . . . . . . .
2.3.3 Tutor and Education System . . . . . . . . . . . . . . . . . . .

Chapter 3 Methodology
3.1 Emotion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
3.2 Signal Processing . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
3.3 Learning Method . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

Chapter 4 Experiment
4.1 Mood Induction and Data Collection . . . . . . . . . . . . . . . . . .
4.2 Data Processing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .
4.3 Result . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .

Chapter 5 Conclusion

Appendix A Questionnaire for Mood Induction 5%

Friday, March 23, 2007

How to use the Win32 API to access RS232

http://www.csie.ntu.edu.tw/~b88032/expr/AcceRS232.htm

This is a reference link on how to use the Win32 API to access RS232.
A clear and easy one.


Tuesday, March 06, 2007

[Writing] Tech writing

https://ceiba.ntu.edu.tw/modules/bulletin/bulletin.php?current_lang=english&csn=a96624#

Tech writing

[Progress Report] 2007 Feb

  • What I have done in February
    • In February, before the spring break, I collected data from 15 people in total; about 12 of them have usable data for all four emotions.
    • The parser that extracts the key-moment data is done. For example, for the happy state I only need the last three minutes of the whole collection session, and the parser now cuts that out according to the states in the log.
    • Together with 潘昭 I put together a rough first version of a video of our experiment procedure, including how the experiment is run and a simple scenario showing what our work is useful for.
  • What is the main research issue that I am working on right now
    • Converting the data into features
    • Organizing and writing the related work
  • What I plan to do in March broken down into weekly goals
    • now ~ Mar. 10: finish the related work and prepare the group meeting report
    • Mar. 11 ~ Mar. 17: feature parser; prepare the care robot demo (demo what??!!)
    • Mar. 18 ~ Mar. 24: learning and mining (start the decision tree and SVM); care robot demo
    • Mar. 25 ~ Mar. 31: collect data with sy's sensor? (needs further discussion); write the part of the thesis about the experiment

Keep going, graduate!!

Tuesday, February 13, 2007

Thesis work has started

Lately I have been busy with personal matters,
busy to the point of a lot of stress,
but the thesis has finally gotten started!
Keep it up...

The feeling of concentrating on one single thing
is much more comfortable,
and much less stressful.

Friday, January 26, 2007

Reminders

combine parser
First argument: the folder name.
It automatically merges the four input channels into one file, myFile.csv.

log parser
First argument: the folder name.
Based on log.log it cuts each segment into its own file (fear.csv, happy.csv, peace0.csv, etc.).

feature
First argument: the folder name.
It computes the features and puts them into each segment's file (same names as in the log parser).

merge to one for support vector machine
Puts the results of all segments in all folders into one big file for the SVM.

Thursday, January 18, 2007

How to wire an LED

An LED needs a resistor in series.
Suppose the LED is rated x V with a maximum operating current of y.
Then the series resistor's value is (supply voltage - x)/z, where z <= y is the chosen operating current.
Usually z = 0.01 A (10 mA).
So a red 1.8 V LED on a 5 V supply needs about (5 - 1.8)/0.01 = 320 ohms, i.e. roughly a 330 ohm resistor.

How to connect it:
the LED's long leg (anode) goes to the 5 V supply,
the short leg goes to the resistor,
and the other end of the resistor goes to ground.

I'll add a diagram later.

Tuesday, January 09, 2007

Recording Data

Today I am trying to parse the recorded data.
I compared the duration of my experiment with the file and found that they do not match!! There are 16 times as many samples as I expected to have collected, and I need to find out what the extra ones stand for.
Sigh....

Tuesday, January 02, 2007

[Reading] XPod: a Human Activity Aware Learning Mobile Music Player

http://ebiquity.umbc.edu/paper/html/id/335/XPod-A-Human-Activity-Aware-Learning-Mobile-Music-Player

Sandor Dornbush, Jesse English, Tim Oates, Zary Segall, and Anupam Joshi
Jan 08, 2007

I think this is an ongoing piece of work. The paper is dated Jan 08, 2007 (a date still to come), and I think the writers left out the emotion part, which is the one I am interested in. They try different methods, including decision trees, AdaBoost, SVM, KNN, and neural networks, to classify 5 different states. They collect GSR, acceleration (2D), skin temperature, BVP, time, song information, and beats per minute to predict how a user would rate a song in the future. They have 565 training instances. The results that take the states into account are a little better than the ones without states; more precisely, the states are built from the physiological information gathered from the sensors. The accuracy ranges from 31.87% to 46.72%, and the mean squared error from 0.17 to about 0.45.
I don't think the results are very good. I think they should collect more training data, and some more interesting points should be added to XPod.

Before the holiday

What should I do:
-- problem definition
-- complete survey of related work
-- solution (optional)
-- experiment result

Monday, January 01, 2007

[Reading] Using Human Physiology to Evaluate Subtle Expressivity of a Virtual Quizmaster in a Mathematical Game

http://www.sciencedirect.com/science?_ob=ArticleURL&_udi=B6WGR-4F4WYNR-1&_coverDate=02%2F01%2F2005&_alid=516189552&_rdoc=1&_fmt=&_orig=search&_qd=1&_cdi=6829&_sort=d&view=c&_acct=C000050221&_version=1&_urlVersion=0&_userid=10&md5=c9c7e4550c1e39c2d84e639d3e50adcd

Helmut Prendinger and Junichiro Mori and Mitsuru Ishizuka
year 2003.

Abstract: The aim of the experimental study described in this article is to investigate the effect of a life-like character with subtle expressivity on the affective state of users. The character acts as a quizmaster in the context of a mathematical game. This application was chosen as a simple, and for the sake of the experiment, highly controllable, instance of human–computer interfaces and software. Subtle expressivity refers to the character's affective response to the user's performance by emulating multimodal human–human communicative behavior such as different body gestures and varying linguistic style. The impact of empathic behavior, which is a special form of affective response, is examined by deliberately frustrating the user during the game progress. There are two novel aspects in this investigation. First, we employ an animated interface agent to address the affective state of users rather than a text-based interface, which has been used in related research. Second, while previous empirical studies rely on questionnaires to evaluate the effect of life-like characters, we utilize physiological information of users (in addition to questionnaire data) in order to precisely associate the occurrence of interface events with users’ autonomic nervous system activity. The results of our study indicate that empathic character response can significantly decrease user stress and that affective behavior may have a positive effect on users’ perception of the difficulty of a task.

Keyword: Life-like characters; Affective behavior; Empathy; Physiological user information; Evaluation

==== After Read ====
The writers use physiological signals to evaluate the user interface and the interaction between the human and a computer game. Their primary hypothesis is that if a life-like character provides affective feedback to the user, it can effectively reduce user frustration and stress. They use bio-sensors, namely GSR and BVP, and a short questionnaire is also administered. The feature they extract from the sensor signal is the mean. The simple game is summing up five given numbers. The results show that the hypothesis holds, except for the relation between game score and empathy.

The features they extract are very simple. They use only a few sensors and still show a good result. I think the writers want to tell us that just a few sensors are enough to justify the helpfulness of a well-designed user interface. Some more applications could be derived from this work.