inf5458 - Applied AI - Multimodal-Multisensor Interfaces II: Signal Processing, Architectures, and Detection of Emotion and Cognition (Complete module description)

Module label Applied AI - Multimodal-Multisensor Interfaces II: Signal Processing, Architectures, and Detection of Emotion and Cognition
Module code inf5458
Credit points 3.0 CP
Workload 90 h
Institute directory Department of Computing Science
Applicability of the module
  • Master's Programme Computing Science (Master) > Angewandte Informatik
Responsible persons
  • Sonntag, Daniel (module responsibility)
  • Lecturers in the module (authorized examiners)
Prerequisites

Basic concepts of Artificial Intelligence and Human-Computer Interfaces

Skills to be acquired in this module

Students learn methods of multimodal interaction and concepts of Human-Computer Interaction.

Professional competences
The students

  • familiarize themselves with the topic of multimodality (competences: basic concepts of multimodality, an intuition for multimodal approaches, multimodal fusion techniques).

Methodological competences
The students

  • prepare a term paper on a special topic in the field of multimodality (competences: quick comprehension, structured literature review, precise expression).


Social competences

The students

  • choose a topic and interact with each other and the supervising person (competences: communication skills, enthusiasm, initiative).

Self competences
The students

  • work independently in a supervised setting (competences: personal responsibility, analytical thinking, organization, time management).
Module contents

We begin with multimodal signal processing, architectures, and machine learning, including recent deep-learning approaches for processing multisensory and multimodal user data and interaction, as well as context sensitivity. A further highlight is the processing of information about users' states and traits, an exciting emerging capability of next-generation user interfaces. We discuss real-time multimodal analysis of emotion and social signals from various modalities, and the perception of affective expression by users. We then turn to multimodal processing of cognitive state, using behavioral and physiological signals to detect cognitive load, domain expertise, deception, and depression. The accompanying collection of chapters provides walk-through examples of system design and processing, information on tools and practical resources for developing and evaluating new systems, and terminology and tutorial support for mastering this rapidly expanding field. Finally, we look at how experts exchange views on the timely and controversial challenge topic of multimodal deep learning. The discussion focuses on how multimodal-multisensor interfaces are most likely to advance human performance during the next decade.
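To give a concrete flavor of one of the fusion techniques covered, the following is a minimal sketch of late (decision-level) fusion, where each modality's classifier outputs class probabilities that are combined by a weighted average. All modality names, scores, and weights below are made-up illustrative values, not material from the course or the handbook.

```python
import numpy as np

def late_fusion(probs_per_modality, weights):
    """Weighted average of per-modality class-probability vectors."""
    probs = np.asarray(probs_per_modality, dtype=float)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                       # normalize the modality weights
    fused = (w[:, None] * probs).sum(axis=0)
    return fused / fused.sum()            # renormalize to a distribution

# Hypothetical emotion scores (neutral, happy, stressed) from three modalities
speech = [0.2, 0.5, 0.3]   # e.g. a speech-prosody classifier
face   = [0.1, 0.7, 0.2]   # e.g. a facial-expression classifier
physio = [0.3, 0.2, 0.5]   # e.g. a physiological-signal classifier

fused = late_fusion([speech, face, physio], weights=[0.4, 0.4, 0.2])
print(fused)           # -> [0.18 0.52 0.30]
print(fused.argmax())  # -> 1, i.e. "happy" wins in this toy example
```

Late fusion is only one point in the design space discussed in the module; early (feature-level) fusion instead concatenates per-modality features before a single classifier, trading robustness to a missing modality for the ability to model cross-modal correlations.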

Recommended literature

The Handbook of Multimodal-Multisensor Interfaces: Signal Processing, Architectures, and Detection of Emotion and Cognition - Volume 2 (https://dl.acm.org/doi/book/10.1145/3107990)

Links
Language of instruction English
Duration (semesters) 1 Semester
Module frequency every semester
Module capacity unrestricted
Teaching/Learning method S
Examination Final exam of module
Examination times at the end of the lecture period
Type of examination oral exam, portfolio, or presentation

Type of course Seminar
SWS 2
Frequency summer and winter semester
Workload attendance time 28 h