Speaker Jeffrey Yau

Jeffrey Yau, Ph.D.

Department of Neuroscience
Baylor College of Medicine

Wednesday, June 19, 2024 | 12 p.m. | Reynolds School of Journalism Room 101


Bimanual touch in the primate somatosensory cortical system

Our remarkable manual dexterity is characterized in part by an ability to coordinate our use of two hands. While bimanual motor control has been well studied, far less is known about bimanual touch and the neural computations that mediate information processing over the hands. In this talk, I will first present evidence that tactile cues experienced on one hand automatically and systematically bias the perception of cues experienced concurrently on the other hand. These bimanual interactions are well explained by a cue integration model that incorporates canonical operations. I will then present functional neuroimaging results showing that human brain responses associated with bimanual cues are more consistent with cue integration than with conjunctive coding. Finally, I will present neurophysiology data recorded from human and non-human primates that indicate the presence of bimanual signals in primary somatosensory cortex. These collective results reveal the complex manner in which the nervous system combines sensory inputs over the hands and highlight candidate brain regions and neural mechanisms that mediate bimanual touch.

Speaker Kenneth Knoblauch

Kenneth Knoblauch, Ph.D.

Université Claude Bernard Lyon 1, Inserm U1208
Stem-cell and Brain Research Institute

Workshop: May 9 | 9:30 a.m. to 12:30 p.m. | Pennington Student Achievement Center, Room 113

Seminar: May 8 | 11 a.m. | Reynolds School of Journalism Room 101


Workshop: Statistics for Psychophysics, using R

This workshop will focus on statistical tools for analyzing and modeling psychophysical experiments within the framework of Signal Detection Theory. This includes choice experiments (detection, discrimination, identification, etc.), fitting psychometric functions, and rating-scale experiments with ROC analyses. In many cases, the decision rule underlying these paradigms is linear, which allows the analyses to be carried out with a Generalized Linear Model (GLM). Rating scales are similarly analyzed using ordinal regression models with cumulative link functions. With these approaches, we can define straightforward procedures to fit the data, test hypotheses about them, obtain confidence intervals, etc. Most off-the-shelf software packages now include tools for fitting GLMs, making these tests and procedures easy to implement. Examples will be shown using the R programming environment and language (https://www.r-project.org). Finally, an example will be shown of applying this approach to a paradigm for measuring appearance, i.e., scaling. Background reading includes the books "Modeling Psychophysical Data in R" (K. Knoblauch & L. T. Maloney, 2012, Springer; for R users) and "Psychophysics: A Practical Introduction, 2nd Edition" (F. A. A. Kingdom & N. Prins, 2016, Academic Press; for Matlab users).
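As a flavor of the GLM approach, the sketch below (not taken from the workshop materials; the data and parameter values are invented for illustration) fits a psychometric function to simulated yes/no detection data using a binomial GLM with a probit link in base R. The ordinal package listed below provides the analogous cumulative link models for rating-scale data.

  # Simulated yes/no detection data (hypothetical values, for illustration only)
  set.seed(1)
  contrast <- seq(0.05, 0.5, length.out = 8)            # stimulus levels
  n_trials <- 40                                        # trials per level
  p_true   <- pnorm(contrast, mean = 0.25, sd = 0.1)    # assumed underlying function
  n_yes    <- rbinom(length(contrast), n_trials, p_true)

  # Psychometric function as a binomial GLM with a probit link; the linear
  # predictor plays the role of the linear decision variable in SDT.
  fit <- glm(cbind(n_yes, n_trials - n_yes) ~ contrast,
             family = binomial(link = "probit"))
  summary(fit)    # coefficient estimates and tests
  confint(fit)    # profile-likelihood confidence intervals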

Data sets used during the workshop will be made available ahead of time, and the R packages used in the tutorial can be installed beforehand from CRAN:

  1. effects
  2. MLDS
  3. MPDiR
  4. ordinal
  5. psyphy
  6. shiny

Packages should be installed with their dependencies, e.g., from within an R session: install.packages("effects", dependencies = TRUE).
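For convenience, all of the packages listed above can be installed in a single call, for example:

  install.packages(c("effects", "MLDS", "MPDiR", "ordinal", "psyphy", "shiny"),
                   dependencies = TRUE)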

Seminar: Neural Circuits Implicated in Long-range Color Filling-in

Visual illusions, defined as deviations between what we expect an observer to perceive based on the stimulus and what is actually perceived, are frequently used to probe visual mechanisms. In a general sense, analyzing the stimulus features that induce illusions focuses only on bottom-up processing, while the violation-of-expectation aspect relates more to built-in assumptions implicit in visual perception. I will describe a series of psychophysical experiments in which we quantified the optimal stimulus features for inducing a long-range color filling-in phenomenon called the watercolor effect. This evidence suggests the phenomenon requires processing over multiple levels of the visual hierarchy. We also investigated the neural substrate of the phenomenon using functional imaging. Here, the evidence implicates an interaction between extrastriate cortical areas in both the dorsal and ventral streams and feedback to earlier visual areas. Since feedforward and feedback processes are established through layer-specific cortical projections, further insight into the mechanisms underlying this phenomenon would benefit from studies using laminar-resolution imaging.

Speaker Evan Feinberg

Evan Feinberg, Ph.D.

Associate Professor of Anatomy
Kavli Institute for Fundamental Neuroscience, Center for Integrative Neuroscience
University of California, San Francisco

May 4 | 12-1 p.m. | William J. Raggio, Room 2003


A modular logic of coordinated movements

Most behaviors require moving multiple body parts. For example, catching a ball involves wrist, elbow, and shoulder movements, and looking at a friend calling your name entails head and eye movements. How the brain coordinates the movement of multiple body parts to construct goal-directed behavior is not well understood. My lab investigates this question by applying neural ensemble recordings, perturbations, and modeling to a suite of mouse innate behaviors we have identified. In this talk, I will describe published and unpublished findings that reveal a surprising network architecture and computational logic for coordinating movements.

Speaker James Kenyon

James Kenyon, Ph.D.

Director of the Institute for Neuroscience

May 4 | 9-10 a.m. | William J. Raggio, Room 2003

Institute for Neuroscience Update from the Director

Dr. Kenyon, the Director of the Institute for Neuroscience, will provide an update on the Institute.

Speaker Nathaniel Dominy

Nathaniel Dominy, Ph.D.

Department of Anthropology, Dartmouth College
Hanover, New Hampshire

March 19 | 1:30 p.m. | Ansari Business, room 106


How did we become the storytelling ape?

Storytelling is a uniquely human trait, and it entails a diverse range of narrative forms, from formalised performance to informal anecdotes. It can be fictional or factual, but it always includes an ordered narrative, with characters, settings, events and resolution. The intent of this talk, this story, is to probe and unpack the origin and evolution of human storytelling, progressing from the forelimb anatomy of apes to the need for syntax and the control of fire for social purposes. The talk will include new data on the nest-building behaviors of chimpanzees in Budongo, Uganda, and from several large fire festivals, including Las Falles in Valencia, Spain, and Up Helly Aa in Lerwick, Scotland.

Speaker Karen Schloss

Karen Schloss, Ph.D.

Department of Psychology & Wisconsin Institute for Discovery

Friday, April 12, 2024 | 1:30 p.m. | Reynolds School of Journalism Room 101


Understanding color semantics for visual communication

Visual communication is fundamental to how humans share information, from weather patterns, to disease prevalence, to their latest scientific discoveries. When people attempt to interpret information visualizations, such as graphs, maps, diagrams, and signage, they are faced with the task of mapping perceptual features onto meanings. Sometimes, visualization designs include legends, labels, or accompanying verbal descriptions to help determine the meaning of colors (color semantics). However, people have expectations about how colors will map to concepts (called inferred mappings), and they find it more difficult to interpret visualizations that violate those expectations. Traditionally, studies on inferred mappings distinguished factors relevant for visualizations of categorical vs. continuous information.

In this talk, I will discuss recent work that unites these two domains within a single framework of assignment inference. Assignment inference is the process by which people infer mappings between perceptual features and concepts represented in encoding systems. I will begin by presenting evidence that observers infer globally optimal assignments by maximizing the "merit," or "goodness," of assignments between colors and concepts. I will then discuss factors that contribute to merit in assignment inference and explain how we can model the combination of multiple (sometimes competing) sources of merit to predict human judgments. This work has increased our understanding of people's expectations about color semantics, which can be used to make visual communication more effective and efficient.

Speaker Valerie Chalcraft

Valerie Chalcraft, Ph.D.

Animal Behavior Institute

Friday, April 19, 2024 | 2 p.m. | Reynolds School of Journalism Room 101


Applied Animal Behavior: Combining psychology and ethology to improve the lives of companion animals

Despite different evolutionary pathways and social structures, cats and dogs (and other species) develop similar behavior problems that can be addressed with similar treatment plans. Stress and fear underlie much of the behavior of non-human animals and manifest in similar ways along a continuum. In confinement, natural behaviors can become behavior problems. Several examples of common behavior problems in dogs and cats are discussed, as are the basic principles for resolving behavior problems in general.