CfP for Special Issue on Virtual Agents for Social Skills Training (VASST) in the Journal of Multimodal User Interfaces

Special Issue
Virtual Agents for Social Skills Training (VASST)

 

Guest Editors
Merijn Bruijnes, University of Twente
Jeroen Linssen, University of Twente
Dirk Heylen, University of Twente
 
Interactive technology for training social skills, such as virtual agents, can improve training curricula. For example, police officers can train for interviewing suspects or detecting deception with a virtual agent. Other application areas include (but are not limited to) social workers (training for dealing with broken homes), psychiatrists (training for interviewing people with various difficulties, vulnerabilities, or personalities), training of social skills such as job interviews, and social stress management.
We invite all researchers who investigate the design, implementation, and evaluation of such technology to submit their work to this special issue on Virtual Agents for Social Skills Training (VASST). By this technology we mean virtual agents for social skills training and any supporting technology. The aim of this special issue is to give an overview of recent developments in interactive virtual agent applications aimed at improving social skills. Research on VASST spans multiple research domains: intelligent virtual agents, (serious) game mechanics, human factors, (social) signal processing, user-specific feedback mechanisms, automated education, and artificial intelligence.

 

Scope
We welcome (literature) studies describing the state of the art in sensing user behaviour, reasoning about this behaviour, and generating virtual agent behaviour in training scenarios. Topics related to VASST include, but are not limited to:
  • Recognition and interpretation of (non)verbal social user behaviours;
  • Training and fusion of users’ signs detected in different modalities;
  • User/student profiling, such as level or training style preference;
  • Anonymous processing of user data;
  • Dialogue and turn-taking management;
  • Social-emotional and cognitive models;
  • Automatic improvement of knowledge representations;
  • Coordination of signs to be displayed by the virtual agents in several modalities;
  • Mechanics to support learning, for example:
    • Feedback or after action review;
    • Personalised scenarios and dialogues;
  • Big data approaches to enrich social interactions;
  • Other topics dealing with innovations for VASST.
Timeline
Paper submission deadline: 30th October 2017
Acceptance notifications: 15th January 2018
Final papers: 15th March 2018
Submission Instructions
Submissions should be around 8-16 pages and must not have been previously published.
Authors are requested to follow the instructions for manuscript submission to the Journal of Multimodal User Interfaces (http://www.springer.com/computer/hci/journal/12193) and to submit manuscripts at the following link: http://www.editorialmanager.com/jmui/. The article type to be selected is “Special Issue S.I. : VASST”.
 

Editor-in-Chief: Jean-Claude Martin, LIMSI-CNRS, Univ. Paris South
2015 Impact Factor = 1.017
