Welcome to the website of the CAP-Lab, a group of researchers studying the cognitive (neuro)science of consciousness, attention, perception, and working memory.
The head of the CAP-Lab (pun intended) is Surya Gayet, assistant professor in Experimental Psychology at Utrecht University, The Netherlands.
From left to right: Sam, Luzi, Dan, Xiaohua, Femke, Surya, Kabir, Yichen, Liangyou, Andre.
News

March 17th, 2025
Luzi defended her PhD!
Last Friday, Luzi Xu successfully defended her PhD in the Academy building of Utrecht University (many thanks to the other supervisors: Stefan van der Stigchel and Chris Paffen).
In a little more than three years, she conducted and wrote up four multi-experiment empirical papers (including a beautiful publication in Psychological Science), and will now leave us to do a postdoc in Hong Kong. Good luck, Luzi!

March 8th, 2025
New paper by Kabir
The first paper of Kabir Arora's PhD has been accepted in iScience. Co-supervised by Sam Chota, Leon Kenemans, and Stefan van der Stigchel.
Using a novel set-up combining EEG and Rapid Invisible Frequency Tagging (RIFT), Kabir investigated the differences and similarities between external attention (to objects in the world) and internal attention (to objects in visual short-term memory).

February 17th, 2025
New pre-print by Yichen
Yichen Yuan has pre-printed the second project of his PhD, co-supervised by Nathan van der Stoep.
Using EEG, we show that - unlike visual working memory load - auditory working memory load is not represented in lateralized responses. Instead, auditory working memory load can be decoded from (non-lateralized) scalp patterns of alpha oscillations in (mostly) temporal electrodes.

February 3rd, 2025
2 pre-prints by Giacomo!
Giacomo Aldegheri pre-prints TWO projects from his PhD in one go! With Marius Peelen.
In two fMRI experiments and three behavioral experiments, we show that visual representations of objects are automatically updated (i.e., rotated and scaled) as the observer's viewpoint on the scene changes - when rotating around, and moving towards, an object.

December 19th, 2024
New paper by Dan
Dan Wang's first PhD paper has been published in Consciousness & Cognition.
In this series of behavioral studies, we show that human observers can flexibly up- and down-regulate the extent to which visual working memory content influences concurrent visual processing. By doing so, observers can choose which items guide top-down search and which do not, while only minimally affecting memory quality.

December 4th, 2024
New paper by Yichen
Yichen Yuan's paper "Using hearing and vision for motion prediction, motion perception, and localization" has been accepted for publication in Journal of Experimental Psychology: General.
Here, we show that observers combine auditory and visual input to localize static objects, but rely on a single modality when tracking moving objects or predicting the location of occluded objects.

September 3rd, 2024
New paper by Luzi
The paper "Statistical learning facilitates access to awareness", spearheaded by Luzi Xu, is published in Psychological Science!
Here, we show that probable visual objects gain preferential access to consciousness over improbable objects; you don't perceive what you don't expect.