Center for Brains, Minds and Machines (CBMM)
Seminar

CBMM Virtual Research Meeting: Hossein Mobahi

June 9, 2020
6:00 pm - 7:00 pm
Location: Zoom
Contact: Jean Lawrence
Description

TITLE: Improving Generalization Performance by Self-Training and Self-Distillation

ABSTRACT:
In supervised learning, we often seek a model that minimizes (to epsilon optimality) a loss function over a training set, possibly subject to some implicit or explicit regularization. Suppose you train a model this way and read out the predictions it makes on the training inputs, which may differ slightly from the training targets because of the epsilon optimality. Now suppose you treat these predictions as new target values and retrain another model from scratch using them instead of the original targets. Surprisingly, the second model can often outperform the original model in terms of accuracy on the test set. In fact, we may repeat this loop a few times and see an increase in generalization performance each time. This might sound strange, as such a supervised self-training process (also known as self-distillation) receives no new information about the task and evolves solely by retraining on its own predictions. In this talk, I argue that this self-training process induces additional regularization, which is amplified in each round of retraining. I will rigorously characterize these regularization effects when the function is learned in a Hilbert space, a setting that relates to neural networks of infinite width. I will conclude by discussing some open problems in the area of self-training and self-distillation.
     
Link to Talk: https://mit.zoom.us/j/95694089706?pwd=TnplQVRTbWMxZmJKdS84NXRoY3k2QT09
Password: brains
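The self-distillation loop described in the abstract can be sketched in a few lines of code. The sketch below is an illustrative assumption rather than the speaker's actual setup: kernel ridge regression stands in for learning a function in a Hilbert space, and the data set, hyperparameters, and number of rounds are arbitrary choices made for demonstration only.

```python
# Minimal sketch of the self-distillation loop from the abstract.
# Kernel ridge regression stands in for "learning the function in a
# Hilbert space"; the data set, hyperparameters, and round count are
# illustrative assumptions, not details from the talk.
from sklearn.datasets import make_friedman1
from sklearn.kernel_ridge import KernelRidge
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# Toy regression problem with noisy targets.
X, y = make_friedman1(n_samples=300, noise=1.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

targets = y_train  # round 0 trains on the original labels
for round_idx in range(4):
    # Retrain from scratch on the current targets; the ridge penalty
    # (alpha) supplies the explicit regularization.
    model = KernelRidge(kernel="rbf", alpha=1.0, gamma=0.1)
    model.fit(X_train, targets)

    test_mse = mean_squared_error(y_test, model.predict(X_test))
    print(f"round {round_idx}: test MSE = {test_mse:.3f}")

    # Self-distillation step: the model's own predictions on the
    # training inputs become the targets for the next round.
    targets = model.predict(X_train)
```

Whether test error actually improves here depends on the data, the kernel, and the amount of regularization; the sketch is only meant to show the mechanics of the retraining loop.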
