Josh McDermott, Ph.D.
Professor, Brain & Cognitive Sciences
Associate Investigator, McGovern Institute for Brain Research
Faculty Appointment: Primary
Building: 46-4078
Email: jhm@mit.edu
Phone: (617) 253-7437
Lab website
Administrative Asst: canfield@mit.edu
    About

    Josh McDermott is a perceptual scientist studying sound and hearing in the Department of Brain and Cognitive Sciences at MIT, where he is a Professor and heads the Laboratory for Computational Audition. His research addresses human and machine audition using tools from experimental psychology, engineering, and neuroscience. McDermott received a BA in Brain and Cognitive Science from Harvard, an MPhil in Computational Neuroscience from University College London, and a PhD in Brain and Cognitive Science from MIT, followed by postdoctoral training in psychoacoustics at the University of Minnesota and in computational neuroscience at NYU. He is the recipient of a Marshall Scholarship, a James S. McDonnell Foundation Scholar Award, and an NSF CAREER Award.

    Research

    Computational Audition

    Our lab studies how people hear. Sound is produced by events in the world, travels through the air as pressure waves, and is measured by two sensors (the ears). The brain uses the signals from these sensors to infer a vast number of important things: what someone said, their emotional state when they said it, and the whereabouts and nature of events we cannot see, to name but a few. Humans make such auditory judgments hundreds of times a day, but their basis in our acoustic sensory input is often not obvious, and reflects many stages of sophisticated processing that remain poorly characterized.

    We seek to understand the computational basis of these impressive yet routine perceptual inferences. We hope to use our research to improve devices for assisting those whose hearing is impaired, and to design more effective machine systems for recognizing and interpreting sound, which at present perform dramatically worse in real-world conditions than do normal human listeners.

    Our work combines behavioral experiments with computational modeling and tools for analyzing, manipulating, and synthesizing sounds. We draw particular inspiration from machine hearing research: we aim to conduct experiments in humans that reveal how we succeed where machine algorithms fail, and to use approaches in machine hearing to motivate new experimental work. We also have strong ties to auditory neuroscience. Models of the auditory system provide the backbone of our perceptual theories, and we collaborate actively with neurophysiologists and cognitive neuroscientists. The lab thus functions at the intersection of psychology, neuroscience, and engineering.

    Current research in our lab explores how humans recognize real-world sound sources, segregate particular sounds from the mixture that enters the ear (the cocktail party problem), separate the acoustic contribution of the environment (e.g. room reverberation) from that of the sound source, and remember and/or attend to particular sounds of interest. We also study music perception and cognition, both for their intrinsic interest, and because music often provides revealing examples of basic hearing mechanisms at work.
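    The "cocktail party" problem described above can be illustrated with a minimal sketch: each ear receives a single waveform that is the sum of several sources, and the listener must infer the sources from the mixture. The sketch below uses numpy; all parameters (sample rate, tone frequencies) are arbitrary illustrative choices, not values from the lab's work.

```python
import numpy as np

# One second of audio at an illustrative sample rate (arbitrary choice).
sr = 16000
t = np.arange(sr) / sr

# Two independent "sources": pure tones at different frequencies.
source_a = np.sin(2 * np.pi * 440 * t)  # 440 Hz tone
source_b = np.sin(2 * np.pi * 554 * t)  # 554 Hz tone

# The signal reaching the ear is simply the superposition of the sources.
mixture = source_a + source_b

# The inverse problem is ill-posed: infinitely many pairs of sources sum
# to the same mixture, so recovering source_a and source_b from the
# mixture alone requires prior knowledge about natural sounds.
print(mixture.shape)
```

    Real auditory scenes add further complications this sketch omits: sources overlap in frequency, rooms add reverberation, and the listener has only two such mixtures (one per ear) from which to infer everything.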

    Teaching

    9.35 Perceptual Systems

    9.285 Neural Coding and Perception of Sound

    Publications

    Francl, A., McDermott, J.H. (2022) Deep neural network models of sound localization reveal how perception is adapted to real-world environments. Nature Human Behaviour, 6, 111-133.

    Agarwal, V., Cusimano, M., Traer, J., McDermott, J.H. (2021) Object-based synthesis of scraping and rolling sounds based on non-linear physical constraints. The 24th International Conference on Digital Audio Effects (DAFx-21).

    McPherson, M.J., McDermott, J.H. (2020) Time-dependent discrimination advantages for harmonic sounds suggest efficient coding for memory. Proceedings of the National Academy of Sciences, 117, 32169-32180.

    Jacoby, N., Undurraga, E.A., McPherson, M.J., Valdes, J., Ossandon, T., McDermott, J.H. (2019) Universal and non-universal features of musical pitch perception revealed by singing. Current Biology, 29, 3229-3243.

    Norman-Haignere, S.V., McDermott, J.H. (2018) Neural responses to natural and model-matched stimuli reveal distinct computations in primary and nonprimary auditory cortex. PLoS Biology, 16, e2005127.

    Kell, A.J.E., Yamins, D.L.K., Shook, E.N., Norman-Haignere, S.V., McDermott, J.H. (2018) A task-optimized neural network replicates human auditory behavior, predicts brain responses, and reveals a cortical processing hierarchy. Neuron, 98, 630-644.

    Traer, J., McDermott, J.H. (2016) Statistics of natural reverberation enable perceptual separation of sound and space. Proceedings of the National Academy of Sciences, 113, E7856-E7865.

    MIT Department of Brain and Cognitive Sciences
    Massachusetts Institute of Technology
    77 Massachusetts Avenue, Room 46-2005
    Cambridge, MA 02139-4307 | (617) 253-5748