Interactive Content Presentation Based on Expressed Emotion and Physiological Feedback*

Tien-Lin Wu, Hsuan-Kai Wang, Chien-Chang Ho, Yuan-Pin Lin, Ting-Ting Hu, Ming-Fang Weng, Li-Wei Chan, Chang-Hua Yang, Yi-Hsuan Yang, Yi-Ping Hung, Yung-Yu Chuang, Hsin-Hsi Chen, Homer H. Chen, Jyh-Horng Chen, Shyh-Kang Jeng
National Taiwan University, Taipei, Taiwan
f94921032@ntu.edu.tw

* This work is sponsored by the National Science Council of Taiwan, R.O.C., under contract NSC96-2752-E002-006-PAE.

ACM MM '08, October 27 - November 1, 2008, Vancouver, BC, Canada.

ABSTRACT

In this technical demonstration we showcase an interactive content presentation system that integrates media-expressed emotion composition, user-perceived preference feedback, and interactive digital art creation under a unified framework. A user can easily organize and browse different types of multimedia data (music, photos, and web blog articles) at the same time according to their expressed emotion. The playback content is adjusted in real time according to the user's preference feedback, measured from physiological signals. This prototype system integrates our ICP with advanced research results on cross-media composition, media-expressed emotion classification, and physiological signal processing.

Categories and Subject Descriptors

D.3.3 [Programming Languages]: Interfaces and Presentation - User Interfaces, Prototyping, Theory and methods

General Terms

Management, Measurement, Human Factors

Keywords

Media-expressed emotion, cross-media composition, interactive digital art creation, user-perceived preference feedback

1. INTRODUCTION

As users' collections of music, photos, and blog articles grow rapidly, their effective management has become a vital issue. However, the emotions these media express, which could be a suitable property for combining distinct media types, are rarely exploited by existing systems. In addition, surfing through such multimedia files often leaves users anxious and lost; automatic content recommendation based on the user's preference, rather than manual clicking on a remote device, would therefore be an excellent solution. Moreover, traditional multimedia browsing is usually tedious; a novel experience can instead be obtained by continuously interacting with the media files.

In this demonstration we show an interactive content presentation (ICP) system, an intelligent interactive entertainment medium that bridges media-expressed emotion composition and user-perceived preference feedback. This prototype system integrates our ICP with advanced research results on cross-media composition, media-expressed emotion classification, and physiological signal processing.

2. INTERACTIVE CONTENT PRESENTER

2.1 System Overview

Figure 1: The architecture of our ICP.

Figure 1 shows the architecture of the ICP, which is composed of three major components. First, media with different expressed emotions are classified into eight emotion categories [3]; the input data include photos, music, and web blog articles. After the expressed emotions are recognized, the different types of multimedia data are composed [1] in a slideshow fashion [2], resulting in a music video (MV). These MVs can be played on ordinary computers, or further integrated into a novel i-m-Top (interactive multi-resolution TableTop) platform. At the same time, the user's preference (like, or neutral/dislike) for the multimedia content is recognized from physiological signals. The detected preference is then used to recommend the next MV according to a set of recommendation rules, as sketched below.
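To make the classify-then-compose step concrete, here is a toy, self-contained sketch. The MediaItem type and compose_mv helper are illustrative assumptions, not the authors' API; the actual system classifies emotion with the methods of [3] and composes MVs with those of [1, 2].

```python
# Toy sketch of the classify-then-compose stage of the ICP (Figure 1).
# MediaItem and compose_mv are assumed names for illustration only.
from dataclasses import dataclass
import random

@dataclass
class MediaItem:
    kind: str      # "photo", "music", or "blog"
    emotion: int   # one of the 8 categories of the taxonomy (Figure 2)

def compose_mv(library, emotion):
    """Gather same-emotion items to be rendered as a slideshow MV [1, 2]."""
    return [m for m in library if m.emotion == emotion]

# A random library standing in for a real, classifier-labeled collection.
library = [MediaItem(random.choice(["photo", "music", "blog"]),
                     random.randrange(8)) for _ in range(30)]
mv = compose_mv(library, emotion=3)
print(f"Composed an MV from {len(mv)} items expressing emotion category 3")
```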
2.2 Media-Expressed Emotion Composition

Figure 2: Content Recommendation Rules.

Figure 2 shows the emotion taxonomy, the preference meter, and the recommendation rules. In this 8-category emotion taxonomy, the meanings of neighboring categories vary cumulatively until they reach a full contrast for categories in opposite positions. We not only classify media files (photos, music, text) into the emotion taxonomy based on their low-level features [3]; we also compose these bi-modal media files into a music video, pairing smooth photo sequences with music of similar expressed emotion [1].

2.3 User-Perceived Preference Feedback

As shown in Figure 2, the preference meter is based on the classification of the user's physiological signals, including galvanic skin response (GSR), electromyography (EMG), and pulse. Our recommendation rules then guide the user to stay at, or move toward, greater preference while watching the selected MVs.

For example, if the system knows that the user likes the current MV very much (preference > 0.65), another MV with the same emotion category is chosen as the next one. On the other hand, if the user dislikes the current MV (preference value close to 0), an MV with the opposite emotion category (arranged in the opposite direction of the emotion taxonomy) is selected. For other preference values, an MV with an emotion category in between on the chart is selected. These rules bridge user-perceived preference and media-expressed emotion by setting thresholds on the detected preference value and switching the MV according to the range in which that value lies, as in the sketch below.
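The following minimal sketch implements these threshold rules on the circular 8-category taxonomy. The 0.65 "like" threshold is stated above; the 0.2 "dislike" cutoff and the mapping from intermediate preference values to in-between categories are our assumptions, since the paper describes those cases only qualitatively.

```python
# Minimal sketch of the recommendation rules of Section 2.3.
# Assumed: a 0.2 cutoff for "close to 0" and a linear mapping from
# intermediate preferences to steps around the circular taxonomy.
N_CATEGORIES = 8  # the circular emotion taxonomy of Figure 2

def next_emotion(current: int, preference: float) -> int:
    """Choose the next MV's emotion category from the current category
    and the detected preference (0 = dislike ... 1 = like)."""
    if preference > 0.65:
        return current                       # liked: stay in the same category
    if preference < 0.2:                     # assumed cutoff for "close to 0"
        return (current + 4) % N_CATEGORIES  # disliked: jump to the opposite
    # In-between preference: step partway around the taxonomy, moving
    # farther from the current category as the preference drops.
    step = round((0.65 - preference) / 0.45 * 3) + 1   # 1..4 steps
    return (current + step) % N_CATEGORIES
```

For instance, starting from category 0, a strongly disliked MV (preference 0.1) jumps to category 4, the contrasting emotion on the opposite side of the taxonomy, while a mildly liked one (preference 0.6) moves only to the neighboring category 1.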
2.4 Interactive Digital Art Creation

Figure 3: i-m-Top integrated presentation.

As shown in Figure 3, in addition to providing preference feedback, the user can create interactive digital art with the i-m-Top system via breathing and finger touch, based on the result of media-expressed emotion classification. With a smart vest, the user's breathing signal can be detected and used to drive the particles, i.e., the moving elements rendered on the tabletop. In our design, the space displayed on the tabletop is a metaphor for the universe encompassing the user. When the user inhales, the particles rendered on the tabletop are attracted toward the user; when the user exhales, more particles are generated and pushed outward. In the meantime, the particles react to the beats of the music and can be maneuvered with the user's fingertips; a simplified version of this behavior is sketched at the end of this section.

To further create an immersive sensation, luminescent lamps are installed near the edges of the tabletop. The brightness of the lamps varies with the user's breathing, suggesting that inhaling brings in energy to light up the lamps. Moreover, we further extend the i-m-Top system to play back the user's inner image by projecting the image of the user onto the center part of the tabletop. A PTZ (pan, tilt, zoom) camera installed on the tabletop system acquires the user's facial expressions and plays them back, purposely with some delay, on the tabletop. The recorded expressions create a playful and magical experience for users.
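As a rough illustration of the inhale/exhale behavior (an assumed rendering loop, not the authors' implementation), the sketch below pulls particles toward the user while inhaling and spawns and pushes particles outward while exhaling; the beat reaction and fingertip interaction are omitted.

```python
# Assumed per-frame particle update for the breathing interaction of
# Section 2.4. Each particle is [x, y, vx, vy]; breath is the smart-vest
# signal, with breath > 0 for exhaling and breath < 0 for inhaling.
import math
import random

def update_particles(particles, user_pos, breath):
    ux, uy = user_pos
    if breath > 0:
        # Exhale: emit new particles at the user's position, drifting outward.
        for _ in range(int(breath * 5)):
            angle = random.uniform(0, 2 * math.pi)
            particles.append([ux, uy, math.cos(angle), math.sin(angle)])
    for p in particles:
        dx, dy = ux - p[0], uy - p[1]
        dist = math.hypot(dx, dy) or 1.0   # avoid division by zero
        # Inhale (breath < 0): accelerate toward the user;
        # exhale (breath > 0): push away from the user.
        pull = -breath * 0.05
        p[2] += pull * dx / dist
        p[3] += pull * dy / dist
        p[0] += p[2]
        p[1] += p[3]
    return particles
```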
3. DEMONSTRATION

We demonstrate the novel experience of appreciating one's own multimedia with the ICP system at home. Three selling points are dramatized: 1) the ICP can organize a mess of miscellaneous digital files according to their expressed-emotion similarity; 2) instead of manually clicking a remote controller to surf channels, the ICP reads our preference and guides us automatically; 3) breaking away from tedious browsing of familiar materials, we can interact with the ICP to create a unique and fresh experience.

4. REFERENCES

[1] C.-H. Chen, M.-F. Weng, S.-K. Jeng, and Y.-Y. Chuang. Emotion-based music visualization using photos. In MMM, pages 358–368, 2008.
[2] J.-C. Chen, W.-T. Chu, J.-H. Kuo, C.-Y. Weng, and J.-L. Wu. Tiling slideshows. In ACM Multimedia, 2006.
[3] T.-L. Wu and S.-K. Jeng. Probabilistic estimation of a novel music emotion model. In MMM, pages 487–497, 2008.