CogLunch: Matthew Cashman "Humans as next-token predictors: measuring the flow of memes through minds"
Description
Speaker: Matthew Cashman
Title: Humans as next-token predictors: measuring the flow of memes through minds
Abstract: Being able to measure what information ends up in which minds is a necessary first step toward explaining how it got there. Models of genetic evolution are tested by counting alleles, and a good model successfully predicts which genes will be found where. Cultural evolution is changing humanity much faster than genetic evolution, but at present we lack a way to empirically ground models of cultural evolution in a quantitative, content-agnostic way analogous to counting alleles. A quantitative view of what information ends up in which minds permits modeling of the many processes, at many levels, that govern its flow, from informational legacies left to descendants to sharing on social media. I will describe a new information-theoretic method for measuring the flow of memes through minds and share some empirical results. We take Shannon's classic cloze-completion game for estimating the entropy of written language and turn it on its head: instead of using minds to learn about written language, we use language to learn about minds. Shannon used human minds to encode and decode because he lacked large, digitized corpora and fast computers; if we instead use human minds precisely because we are interested in their properties as encoders and decoders, we gain a window into those minds. Entropy estimates generated from a test set drawn from Harry Potter will differ between a treatment group (Readers, people who have read Harry Potter) and a control group (Non-Readers). This difference is driven by the way Readers' minds have been changed by reading the book. It is an expression, in bits, of how much information from the book is actually stored in Readers' minds and capable of influencing behavior.
Location: 46-3037 (note different room)
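To make the comparison in the abstract concrete, here is a minimal sketch of one way such an entropy difference could be tallied, assuming the guess-rank bookkeeping of Shannon's original game (the per-token entropy estimate used here is the entropy of the guess-rank distribution, Shannon's 1951 upper bound). The function names and the data are illustrative only, not taken from the talk.

import math
from collections import Counter

def entropy_of_guess_ranks(guess_ranks):
    # Entropy (in bits) of the distribution of guess ranks, i.e. how many
    # guesses a participant needed before producing the correct next token.
    # This is Shannon's (1951) upper bound on per-token entropy for that
    # participant on this text.
    n = len(guess_ranks)
    return -sum((c / n) * math.log2(c / n) for c in Counter(guess_ranks).values())

# Hypothetical guess-rank records (1 = correct on the first guess, and so on).
readers = [1, 1, 1, 2, 1, 1, 3, 1, 2, 1]        # participants who have read the book
non_readers = [2, 4, 1, 3, 5, 2, 1, 6, 3, 2]    # participants who have not

h_readers = entropy_of_guess_ranks(readers)
h_non_readers = entropy_of_guess_ranks(non_readers)

# The gap, in bits per token, is one way to express how much information
# from the book the Readers bring to the prediction task.
print(f"Readers: {h_readers:.2f} bits/token, Non-Readers: {h_non_readers:.2f} bits/token")
print(f"Difference: {h_non_readers - h_readers:.2f} bits/token")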