We have recently shown that playing first-person point-of-view action video games affects several aspects of perception, attention, and cognition. The skills found to be enhanced by action video game training so far include low-level vision (enhanced contrast sensitivity function), various aspects of attention (the ability to monitor several objects at once, to search through a cluttered scene, and to detect an event of interest in fast-forwarded video), more cognitive tasks (multi-tasking, task-switching), and, finally, a general speeding of decision making. This work illustrates how skilled performance in a variety of processing domains can be enhanced by a single training regimen. Practical implications of this finding, such as vocational training (e.g., for laparoscopic surgeons) or clinical rehabilitation (e.g., amblyopia), are of high social relevance.
Importantly, not all video games have these effects; for the skills studied so far, action games lead to greater benefits than other entertainment games. By studying the impact of various types of video games on brain function, we aim to determine which aspects of performance can be altered by experience and to characterize the factors in a training regimen that favor the transfer of learning. Behavioral investigations are combined with brain imaging techniques to allow a more direct characterization of the brain systems that are modified by video game playing.
Loss of a sense dramatically alters the type of experience that individuals can rely on as they navigate in their world. We study the impact of early deafness on vision, attention, and cognition. Our work documents an array of rather specific changes following early, profound deafness. For example, lifelong deafness enhances only one aspect of vision: peripheral visual attention. The consequences of that one change range from enhanced visual search abilities to differences in reading patterns. We study how these changes proceed developmentally with an eye toward providing valuable information for deaf education.
We are also interested in characterizing the impact of language type, spoken or signed, on the brain organization for memory and language. We have shown that when tested on serial short-term memory, native users of American Sign Language have a smaller capacity limit of about 5 signs, rather than the classical 7+/-2 items documented in speakers. This result cannot be attributed to often-cited factors such as the greater complexity or articulation time of signs as compared to spoken words, or to lower cognitive abilities in the deaf. Rather, our work indicates that signers and speakers share the same underlying short-term memory structure, but rely on different codes to support memory. Speakers depend on the phonological loop to a much greater extent than signers, who in turn rely on distributed coding across phonological, semantic, and visuo-spatial processes. The consequences of these different processing biases for reading and its neural bases are currently under investigation.