
Cross-modal information processing

    Perception in the human senses (visual, auditory, tactile, etc.) derives from different forms of physical stimulation (e.g., light waves, sound waves, mechanical pressure), different receptor types (e.g., cones, hair cells, and mechanoreceptors), and processing in different brain areas (e.g., the occipital lobe, temporal lobe, and anterior parietal lobe). Along this processing stream, physical stimulation is converted into electrical signals by the various sensory receptors, and the perceptual representations arising from different senses rely on distinct neural pathways in the brain. The integration of multiple senses thus appears to face the following dilemma: on the one hand, information derived from specific sensory inputs must maintain its modality-specific representation; on the other hand, information from different sensory systems needs to be communicated and integrated.

    The main purpose of this course is to provide an overview of cross-modal processing. Topics include: phenomena and mechanisms, spatial and temporal characteristics, orienting, emotions, language, food, development, plasticity, synesthesia, and clinical and other applications. Materials are drawn from the latest cognitive neuroscience research across multiple disciplines, including cognitive science, psychology, neuropsychology, computational simulation, consumer behavior, and applied science.