Gauging language proficiency through eye movement
A study by MIT researchers has uncovered a new way of telling how well people are learning English: tracking their eyes.
That’s right. Using data generated by cameras trained on readers’ eyes, the research team has found that patterns of eye movement — particularly how long people’s eyes rest on certain words — correlate strongly with performance on standardized tests of English as a second language.
“To a large extent [eye movement] captures linguistic proficiency, as we can measure it against benchmarks of standardized tests,” says Yevgeni Berzak, a postdoc in MIT’s Department of Brain and Cognitive Sciences (BCS) and co-author of a new paper outlining the research. He adds: “The signal of eye movement during reading is very rich and very informative.”
Indeed, the researchers even suggest the new method has potential use as a testing tool. “It has real potential applications,” says Roger Levy, an associate professor in BCS and another of the study’s co-authors.
The paper, “Assessing Language Proficiency from Eye Movements in Reading,” is being published in the Proceedings of the 16th Annual Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies. The authors are Berzak, a postdoc in the Computational Psycholinguistics Group in BCS; Boris Katz, a principal research scientist and head of the InfoLab Group at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL); and Levy, who also directs the Computational Psycholinguistics Lab in BCS.
The illusion of continuity
The study delves into a phenomenon about reading that we may never notice, no matter how much we read: Our eyes do not move continuously along a string of text. Instead, they fixate on particular words, typically for about 200 to 250 milliseconds at a time, then leap from one word to the next in quick movements lasting roughly 1/20 of a second.
“Although you have a subjective experience of a continuous, smooth pass over text, that’s absolutely not what your eyes are doing,” says Levy. “Your eyes are jumping around, mostly forward, sometimes backward. Your mind stitches together a smooth experience. … It’s a testimony to the ability of the mind to create illusions.”
But if you are learning a new language, your eyes may dwell on particular words for longer stretches as you try to comprehend the text. For this reason, the particular pattern of eye movement can reveal a lot about comprehension, at least when it is analyzed in a clearly defined context.
To conduct the study, the researchers used a dataset of eye movement records from work conducted by Berzak. The dataset has 145 students of English as a second language, divided almost evenly among four native languages — Chinese, Japanese, Portuguese, and Spanish — as well as 37 native English speakers.
The readers were given 156 sentences to read, half of which were part of a “fixed test” in which everyone in the study read the same sentences. The recordings enabled the research team to focus intensively on fixation durations, the lengths of time readers’ eyes rested on particular words.
The research team called the set of metrics they used the “EyeScore.” After evaluating how it correlated with the Michigan English Test (MET) and the Test of English as a Foreign Language (TOEFL), they concluded in the paper that the EyeScore method produced “competitive results” with the standardized tests, “further strengthening the evidence for the ability of our approach to capture language proficiency.”
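To make the general idea concrete, here is a minimal, purely illustrative sketch in Python. It is not the authors’ actual EyeScore pipeline; it assumes hypothetical data (one synthetic eye-movement feature per reader, the mean fixation duration, plus an assumed external test score) and simply checks how strongly a single eye-based measure tracks that score, whereas the paper combines a much richer set of reading measures.

```python
# Illustrative sketch only -- not the authors' EyeScore method.
# Assumes hypothetical per-reader data: mean fixation duration (ms)
# and an external proficiency score on an assumed 0-100 scale.

import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)

# Synthetic data for 20 readers: mean fixation duration on words.
mean_fixation_ms = rng.normal(loc=230, scale=30, size=20)

# For illustration, assume longer fixations tend to go with lower scores.
test_scores = 100 - 0.2 * (mean_fixation_ms - 200) + rng.normal(0, 3, size=20)

# A crude eye-based score: shorter average fixations map to higher values,
# rescaled to the 0-1 range. This stands in for the paper's fuller feature set.
eye_score = 1 - (mean_fixation_ms - mean_fixation_ms.min()) / np.ptp(mean_fixation_ms)

# Correlate the eye-based score with the external test score.
r, p = pearsonr(eye_score, test_scores)
print(f"Pearson r = {r:.2f} (p = {p:.3f})")
```

In practice, the strength of such a correlation against benchmarks like the MET or TOEFL is what lets the researchers argue that eye movements capture linguistic proficiency.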
As a result, the authors write, the new method is “the first proof of concept for a system which utilizes eye tracking to measure linguistic ability.”
Sentence by sentence
Other scholars say the study is an interesting addition to the research literature on the subject.
“The method [used in the study] is very innovative and — in my opinion — holds much promise for using eye-tracking technology to its full potential,” says Erik Reichle, head of the Department of Psychology at Macquarie University in Sydney, Australia, who has conducted many experiments about tracking eye movement. Reichle adds that he suspects the paper “will have a big impact in a number of different fields, including those more directly related to second-language learning.”
As the researchers see it, the current study is just one step on a longer journey of exploration about the interactions of language and cognition.
As Katz says, “The bigger question is, how does language affect your brain?” Given that we only began processing written text within the last several thousand years, he notes, our reading ability is an example of the “amazing plasticity” of the brain. Before too long, he adds, “We could actually be in a position to start answering these questions.”
Levy, for his part, thinks that it may be possible to make these eye tests about reading more specific. Rather than evaluating reader comprehension over a corpus of 156 sentences, as the current study did, experts might be able to render more definitive judgments about even smaller strings of text.
“One thing that we would hope to do in the future that we haven’t done yet, for example, is ask, on a sentence-by-sentence basis, to what extent can we tell how well you understood a sentence by the eye movements you made when you read it,” Levy says. “That’s an open question nobody’s answered. We hope we might be able to do that in the future.”
The study was supported, in part, by MIT’s Center for Brains, Minds, and Machines, through a National Science Foundation grant.