CFP: Special Issue on User States in Extended Reality Media Experiences for Entertainment Games

David Melhart

Dec 16, 2021, 5:32:53 AM12/16/21
to Computational Intelligence and Games
Call for Abstracts and/or Manuscripts
Frontiers in Virtual Reality Journal
Special Issue on User States in Extended Reality Media Experiences for Entertainment Games

We are looking for contributions at the crossroads of affective computing, user experience measurement and modeling, autonomous content adaptation, and procedural content generation (PCG).
This includes various stages of the adaptation pipeline, such as:
  • Extended media experiences designed to elicit different emotional responses (e.g., happiness, frustration, anxiety) or user experiences (e.g., challenge, control, expectation) in entertainment games;
  • Applications of affective computing or player-experience modeling in entertainment games;
  • Emotion and experience annotation during entertainment gameplay: how to rate or rank player experience, emotion, and human physiology during an interactive experience;
  • Evaluation of adaptive or affective games: do the adaptation methods evoke the intended emotion or experience?;
  • Dynamic virtual environments: how experience and emotion can influence virtual environments in entertainment games;
  • Adaptive haptic feedback for eliciting different emotional or experiential responses in games;
  • Tools and game design guidelines: how emotion and player-experience tools can aid in creating better games;
  • Multiplayer games: how context-influencing factors affect social acceptability and user emotions;
  • Measurement and evaluation of adaptation methodologies: validation and user experiments.
For more information on the topic, including the full scope description, please see here.

Submissions

Submit your abstract through the Frontiers portal: https://www.frontiersin.org/research-topics/22484/user-states-in-extended-reality-media-experiences-for-entertainment-games

All submissions should follow the Frontiers Author Guidelines and be submitted on the Special Issue Website. For the full list of article types, please see here. All submitted articles are peer-reviewed. Article-processing charges are applied to all published articles.
See if your institution has a payment plan with us: https://www.frontiersin.org/about/institutional-membership
Find out about applying for fee support: https://www.frontiersin.org/about/publishing-fees#feesupport

Deadlines for submission
Abstract deadline: April 21, 2022
Invitation to submit a full manuscript: August 18, 2022
Full manuscript deadline: September 8, 2022

Kind Regards,
Phil Lopes - Universidade Lusófona, Lisbon, Portugal
Jan-Niklas Voigt-Antons - Hamm-Lippstadt University of Applied Sciences, Hamm, Germany
Jaime Andres Garcia - University of Technology Sydney, Sydney, Australia
David Melhart - University of Malta, Msida, Malta
Topic Editors
Virtual Reality and Human Behaviour Section, Frontiers in Virtual Reality
--------------------------

About this Research Topic
Extended Reality Media are multifaceted experiences that combine a wide range of multimedia sources, from audio and visuals to storytelling, into a fully interactive experience. The term Extended Reality Media (ERM) is a catch-all for interactive media including digital games (VR and non-VR) and virtual reality/augmented reality experiences. Unlike other media, users engage directly with these environments, controlling the action and responding to the challenges presented. These virtual environments and challenges are often hand-crafted by designers, who tailor the experience for a general audience. The downside is that such an experience affects each player differently: players have different skill sets, engage with the game differently, and respond differently to particular events, which makes it nearly impossible to hand-craft a personalized experience for each individual player.
Given the recent rise of statistical modelling of playing behaviors [1], designers and researchers have been exploring autonomous game content adaptation, building on affective computing [2] and player experience theory [3]. In practice, this means constructing algorithms capable of dynamically altering the parameters of virtual assets and how they are presented to the player, i.e. orchestration [4], or building virtual environments entirely “from scratch” algorithmically, without a human designer, i.e. Procedural Content Generation (PCG) [5]. Player affect and/or experience models can then be used in conjunction with orchestration and PCG methods to dynamically construct personalized virtual scenarios and game-playing sequences (e.g., making a scenario more action-packed if it is predicted to be too “boring”).
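To make the loop concrete, here is a minimal, purely illustrative sketch of such an affect-driven adaptation cycle. All names and the boredom model are hypothetical stand-ins: a real system would use a trained player-experience model [1] and a full orchestration or PCG backend rather than a single pacing parameter.

```python
# Hypothetical closed-loop affective adaptation: a player-experience model
# scores the current state, and a content adjuster tunes a pacing parameter
# in response. The "model" below is a toy heuristic, not a trained model.

def predicted_boredom(events_per_minute: float) -> float:
    """Toy stand-in for a learned experience model: maps observed pacing
    to a boredom estimate in [0, 1] (slower pacing -> more boredom)."""
    return max(0.0, min(1.0, 1.0 - events_per_minute / 10.0))

def adapt_pacing(events_per_minute: float, boredom_threshold: float = 0.6) -> float:
    """If the model predicts boredom, make the scenario more action-packed
    by increasing event density; otherwise leave the content unchanged."""
    if predicted_boredom(events_per_minute) > boredom_threshold:
        return events_per_minute * 1.5  # e.g., spawn encounters more often
    return events_per_minute

pacing = 2.0  # events per minute observed in the current play session
for step in range(3):
    pacing = adapt_pacing(pacing)
    print(f"step {step}: pacing = {pacing:.2f}, "
          f"predicted boredom = {predicted_boredom(pacing):.2f}")
```

The design point is the feedback loop itself: the experience model closes the loop between player state and content parameters, which is exactly where the adaptation methods solicited above would plug in.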
How to Adapt Games Effectively? Although much work in the field has been devoted to creating models capable of detecting player emotion and experience [8, 9, 10, 11], the same cannot be said of autonomous game content adaptation, i.e. how applications can leverage such affective and player-experience models so that adaptation actually delivers a richer and more fulfilling emotional and user experience. Work addressing exactly this question is scarce: the field focuses largely on building emotion and experience recognition models while ignoring how their outputs can be used effectively. This Research Topic therefore invites solutions to the question: “How can we adapt virtual content for the purposes of entertainment?”
Why does this matter? The most obvious application of this research is the construction of tools and systems that increase a game's autonomy, allowing it to provide more dynamic and less predictable content based on an individual user's behavior and preferences. Such a system would increase a game's replay value and create a unique experience for each player.

References
[1] Yannakakis, G. N., & Togelius, J. (2018). Modeling players. In Artificial intelligence and games (pp. 203-255). Springer, Cham.
[2] Picard, R. W. (2015). The promise of affective computing. The Oxford handbook of affective computing, 11-20.
[3] Wiemeyer, J., Nacke, L., & Moser, C. (2016). Player experience. In Serious games (pp. 243-271). Springer, Cham.
[4] Liapis, A., Yannakakis, G. N., Nelson, M. J., Preuss, M., & Bidarra, R. (2018). Orchestrating game generation. IEEE Transactions on Games, 11(1), 48-68.
[5] Shaker, N., Togelius, J., & Nelson, M. J. (2016). Procedural content generation in games. Switzerland: Springer International Publishing.
[6] Conati, C., Marsella, S., & Paiva, A. (2005, January). Affective interactions: the computer in the affective loop. In Proceedings of the 10th international conference on Intelligent user interfaces (pp. 7-7).
[7] Hunicke, R., LeBlanc, M., & Zubek, R. (2004, July). MDA: A formal approach to game design and game research. In Proceedings of the AAAI Workshop on Challenges in Game AI (Vol. 4, No. 1, p. 1722).
[8] Makantasis, K., Liapis, A., & Yannakakis, G. N. (2019, September). From pixels to affect: a study on games and player experience. In 2019 8th International Conference on Affective Computing and Intelligent Interaction (ACII) (pp. 1-7). IEEE.
[9] Guthier, B., Dörner, R., & Martinez, H. P. (2016). Affective computing in games. In Entertainment computing and serious games (pp. 402-441). Springer, Cham.
[10] Lopes, P., Liapis, A., & Yannakakis, G. N. (2017). Modelling affect for horror soundscapes. IEEE Transactions on Affective Computing, 10(2), 209-222.
[11] Chanel, G., Rebetez, C., Bétrancourt, M., & Pun, T. (2011). Emotion assessment from physiological signals for adaptation of game difficulty. IEEE Transactions on Systems, Man, and Cybernetics-Part A: Systems and Humans, 41(6), 1052-1063.