Program details

Schedule

The full online program is available here.

An A5 printout of this schedule will be given to you when you register at the conference. The other side of the printout contains information about getting to and around Yale-NUS College; it is also available here.

Schedule at a glance:

Keynote speakers

Symposia

Mechanisms of face perception

Organized by: Colin Palmer

The human face has special significance as a visual cue: it helps us to track the emotional reactions and attentional focus of others, shapes social trait impressions (e.g., attractiveness and trustworthiness), and allows us to identify familiar individuals. While face processing has received much attention in vision science, the mechanisms that shape our everyday experience of faces are still only partially understood. What are the core dimensions of facial information represented in the visual system? How is this information extracted from the visual signals relayed from the retina to the brain? How do implicit processes, such as physiological responses or evolutionary pressures, align with our perceptual experience of faces? This symposium showcases recent discoveries and novel approaches to understanding the visual processing of faces in the human brain. The talks span the use of intracranial neural recordings to uncover cortical and subcortical responses underlying face perception, data-driven approaches to defining the social dimensions that observers perceive in faces, characterisation of the links between facial features, perception, and physiology using psychophysics and computational modelling, and analysis of the biological and evolutionary factors that shape face impressions. Together, these talks provide a snapshot of exciting developments at a key interface between vision science and social behaviour.

Qian Wang, Yingying Wang, Guanpeng Chen, Ruolin Yang and Fang Fang

Unveiling subcortical and cortical mechanisms of face perception via intracranial recordings in the human brain

Jessica Taubert, Shruti Japee, Amanda K. Robinson, Houqiu Long, Tijl Grootswagers, Charles Zheng, Francisco Pereira and Chris Baker

Uncovering the multidimensional representation underlying human dissimilarity judgements of expressive faces

Anqi Mao, Runnan Cao, Sai Sun, Shuo Wang and Dongwon Oh

Implicit encoding of social trait perceptions: Modeling eye-gaze patterns, pupillary responses, and neuronal activity

Yong Zhi Foo

The evolutionary basis of preferences for male facial masculinity

Colin Palmer and Gwenisha Liaw

Eye glint as a perceptual cue in human vision


The impact of recent technologies on studies of multisensory integration

Organized by: Hiroaki Kiyokawa and Juno Kim

Multisensory integration is one of the key functions that enables stable visual and non-visual perception in daily life, yet comprehensively understanding how the brain integrates information from different sensory modalities remains a challenging problem. How does our visual system extract meaningful information from retinal images and integrate it with information from other sensory modalities? Recent technologies, such as virtual reality (VR) and augmented reality (AR), can provide scalable, interactive, and immersive environments for testing the effects of external stimulation on our subjective experiences. What do these technologies bring to our research? We invite world-leading scientists in human perception and performance to discuss the psychological, physiological, and computational foundations of multisensory integration, and methodologies that provide insight into how non-visual sensory information enhances our visual experiences of the world.

Hideki Tamura

Forward and backward steps in virtual reality affect facial expression recognition

Michiteru Kitazaki

Multimodal information for virtual walking

Stephen Palmisano

Can we measure sensory conflict during virtual reality? And if we can, then what can we do with this information?


Regularity and (un)certainty: extracting implicit sensory information in perception and action

Organized by: Shao-Min Hung and Hsin-I Iris Liao

How do we track the relations among sensory items in our surroundings? With our sensory systems bombarded by immeasurable external information, it is hard to envision a wilful, deliberate, moment-by-moment sensory tracking mechanism. Instead, this symposium illustrates how our behaviour is shaped by implicitly tracked regularity and the (un)certainty that accompanies it, drawing on evidence from a wide spectrum of studies spanning vision, audition, and the motor system. Shao-Min (Sean) Hung first establishes implicit regularity tracking in a cue-target paradigm; his findings suggest that regularity tracking between sensory items relies very little on explicit knowledge or visual awareness. Deriving meaningful results, however, requires careful work: Philip Tseng's research expands on this point by demonstrating how visual statistical learning can be influenced by task demands, underscoring the importance of experimental design when probing the implicit extraction of sensory information. Similar tracking of perceptual statistics extends to the auditory domain, as evidenced by Hsin-I (Iris) Liao's work showing how pupillary responses reflect perceptual alternations and unexpected uncertainty in response to auditory stimulation. Next, we ask how our behaviour responds to such regularities. Using a motor learning paradigm, Nobuhiro Hagura reveals that different levels of visual uncertainty can tag different motor memories, showing that uncertainty provides contextual information that guides our movements. Finally, David Alais uses continuous measurement of perception during walking to reveal a modulation locked to the step rate, with perceptual sensitivity peaking in the swing phase between steps. Together, these talks paint a multifaceted picture of perceptual regularity tracking and the (un)certainty it generates, revealing the ubiquitous nature of implicit sensory processing across multiple sensory domains, integrating perception and action.

Shao-Min Hung and Akira Sarodo

Tracking probability in the absence of awareness

Hsin-I Iris Liao

Auditory information extraction revealed by pupil-linked arousal

Philip Tseng

Importance of task demand in measuring implicit learning

Nobuhiro Hagura

Decision uncertainty as a context for motor memory

David Alais and Matthew Davidson

Seeing the world one step at a time: perceptual modulations linked to the gait cycle