Predicting Mobile User Emotion from Finger Strokes
- In this work, we propose a simple model to predict the affective state of a touch-screen user. The prediction is based on the user's touch input, namely finger strokes.
- The validation study demonstrates a high prediction accuracy of 90.47%.
- Researchers have tried to incorporate affect and emotion into HCI, resulting in interaction styles that are “affective”.
- The goal of affective interaction is to make systems more natural and responsive to the goals and expectations of the user, so as to improve usability and user experience.
- Categorisation of users into three affective states: positive, negative, and neutral.
- There are broadly two ways of representing emotions: the discrete model and the continuous model. The former posits that emotions are discrete, measurable, and physiologically distinct. The continuous model, on the other hand, represents emotion as a point in a two-dimensional space of valence and arousal.
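The two representations, and how the continuous one can be collapsed into the three states used here, can be sketched as follows. The discrete labels and their valence/arousal coordinates are illustrative assumptions, not values from the paper.

```python
from dataclasses import dataclass


@dataclass
class ContinuousEmotion:
    """A point in the two-dimensional valence-arousal space."""
    valence: float  # unpleasant (-1) to pleasant (+1)
    arousal: float  # calm (-1) to activated (+1)


# Assumed placement of a few discrete emotions in the continuous space:
DISCRETE_TO_CONTINUOUS = {
    "happy":   ContinuousEmotion(valence=0.8, arousal=0.5),
    "angry":   ContinuousEmotion(valence=-0.7, arousal=0.8),
    "sad":     ContinuousEmotion(valence=-0.6, arousal=-0.5),
    "relaxed": ContinuousEmotion(valence=0.6, arousal=-0.6),
}


def to_three_state(e: ContinuousEmotion) -> str:
    """Collapse valence into the positive/negative/neutral categorisation
    mentioned above (the 0.2 threshold is an arbitrary assumption)."""
    if e.valence > 0.2:
        return "positive"
    if e.valence < -0.2:
        return "negative"
    return "neutral"
```

Note that the three-state categorisation discards arousal entirely; both "angry" and "sad" map to "negative".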
- We needed to come up with techniques that do not require extra set-up or significant computation.
- We propose to use touch interaction characteristics to predict emotion. The touch interaction characteristics are captured in terms of finger strokes
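As a minimal sketch of what "touch interaction characteristics" could look like in code: the features below (duration, path length, mean speed) are assumptions for illustration, not the paper's actual feature set.

```python
import math


def stroke_features(points):
    """Compute assumed per-stroke features from raw touch samples.

    points: list of (x, y, t) tuples for one finger stroke, in
    chronological order (pixel coordinates, timestamps in seconds).
    """
    (_, _, t0), (_, _, t1) = points[0], points[-1]
    duration = t1 - t0
    # Path length: sum of Euclidean distances between consecutive samples.
    length = sum(
        math.hypot(bx - ax, by - ay)
        for (ax, ay, _), (bx, by, _) in zip(points, points[1:])
    )
    mean_speed = length / duration if duration > 0 else 0.0
    return {"duration": duration, "length": length, "mean_speed": mean_speed}
```

On Android, for example, such samples could be collected from touch-event callbacks; the feature vector would then feed the classifier.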
- Induced a particular emotional state in the participants and recorded the time taken for swipes.
- Defined features from the recorded stroke data.
- We compared the performance of four classification models and found the maximum prediction accuracy with the k-means-clustering-based classifier (72%). To improve the accuracy further and reduce the computational complexity, we experimented with a linear regression approach and derived a model with 90.47% accuracy.
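The linear-regression step can be sketched as follows: fit a least-squares model on a stroke feature and round the continuous output to a class index. The data (a single synthetic "swipe time" feature) and the class encoding are assumptions for illustration, not the study's dataset or exact method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D feature (assumed: swipe time in seconds) per class,
# encoded as 0 = positive, 1 = neutral, 2 = negative.
X = np.concatenate([rng.normal(mu, 0.1, 50) for mu in (0.3, 0.6, 0.9)])
y = np.repeat([0, 1, 2], 50).astype(float)

# Ordinary least squares: y ~ w0 + w1 * x.
A = np.column_stack([np.ones_like(X), X])
w, *_ = np.linalg.lstsq(A, y, rcond=None)


def predict(x):
    """Round the regression output to the nearest class index."""
    return int(np.clip(np.rint(w[0] + w[1] * x), 0, 2))


accuracy = np.mean([predict(x) == t for x, t in zip(X, y)])
```

This works here because the class encoding is ordered along the feature axis; a regression-then-round scheme is far cheaper at prediction time than distance computations against cluster centroids.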