The book is based on interdisciplinary research on various aspects and dynamics of human multimodal signal exchanges. It discusses realistic application scenarios where human interaction is the focus, in order to
- identify new methods for data processing and data-flow coordination through synchronization, and for optimizing new encoding features that combine contextually enacted communicative signals, and
- develop shared digital data repositories and annotation standards for benchmarking the algorithmic feasibility and subsequent implementation of believable human–computer interaction (HCI) systems.
Currently, researchers from the psychological, computational, and engineering research fields have developed a human-centered behavioral informatics characterized by techniques for analyzing and coding human behaviors, conventional and unconventional social conduct, signals from audio and video recordings, auditory and visual pathways, neural waves, neurological and cognitive disorders, psychological and personality traits, emotional states, and mood disorders. This interweaving of expertise has produced extensive research progress and unexpected converging interests, laying the groundwork for a book dedicated to presenting current progress in the dynamics of signal exchanges and reporting the latest advances in the synthesis and automatic recognition of human interactional behaviors. Key features considered are the fusion and implementation of automatic processes and algorithms for interpreting, tracking, and synthesizing dynamic signals such as facial expressions, gaits, EEGs, and brain and speech waves. The acquisition, analysis, and modeling of such signals is crucial for computational studies devoted to human-centered behavioral informatics.