Gauging language proficiency through eye movement

The study tracks eye movement to determine how well people understand English as a foreign language.

In a new study, MIT scientists have identified a new way to tell how well people are learning English: tracking their eyes. Using data produced by cameras trained on readers' eyes, the research team found that patterns of eye movement, particularly how long readers' eyes rest on specific words, correlated strongly with performance on a standardized test of English as a second language.

According to the scientists, the method could be used as a testing tool.

The authors are Yevgeni Berzak, a postdoc in MIT's Department of Brain and Cognitive Sciences (BCS); Boris Katz, a principal research scientist and head of the InfoLab Group at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL); and Roger Levy, an associate professor in BCS who also directs the Computational Psycholinguistics Lab there.

The study digs into a phenomenon of reading that we may never notice, no matter how much we read: our eyes do not move continuously along a string of text, but instead fixate on particular words for 200 to 250 milliseconds at a time. We also make jumps from one word to the next that last about 1/20 of a second.

But if you are learning a new language, your eyes may dwell on particular words for longer as you try to make sense of the text. For this reason, your particular pattern of eye movement can reveal a lot about your comprehension.

For the study, the scientists used a dataset of eye-movement records from earlier work conducted by Berzak. The dataset covers 145 students of English as a second language, divided evenly among four native languages (Chinese, Japanese, Portuguese, and Spanish), as well as 37 native English speakers.

The readers were given 156 sentences to read, half of which were part of a "fixed test" in which everyone in the study read the same sentences. The video footage enabled the research team to focus closely on a series of duration times: the length of time readers' eyes rested on particular words.
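To make these duration measures concrete, here is a minimal Python sketch, not taken from the study, that computes three commonly used word-level measures (first-fixation duration, gaze duration, and total fixation time) from a list of timestamped fixations. The record format and field names are assumptions for illustration only.

```python
from collections import defaultdict

def duration_measures(fixations):
    """fixations: list of (word_index, duration_ms) tuples in temporal order.

    Returns three dicts keyed by word index: first-fixation duration,
    gaze (first-pass) duration, and total fixation time, in milliseconds.
    """
    first_fix = {}
    gaze = defaultdict(int)
    total = defaultdict(int)
    first_pass_over = set()  # words whose first pass has already ended
    prev_word = None

    for word, dur in fixations:
        total[word] += dur
        if word not in first_fix:
            first_fix[word] = dur
        if prev_word is not None and prev_word != word:
            # moving away from prev_word ends its first reading pass
            first_pass_over.add(prev_word)
        if word not in first_pass_over:
            gaze[word] += dur
        prev_word = word

    return first_fix, dict(gaze), dict(total)


# Toy example: the reader fixates word 4 twice, moves on, then regresses to it.
fixes = [(3, 210), (4, 180), (4, 95), (6, 240), (4, 130)]
first, gaze, total = duration_measures(fixes)
print(first[4], gaze[4], total[4])  # 180 275 405
```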

The scientists call their set of metrics "EyeScore." After evaluating how it correlated with the Michigan English Test (MET) and the Test of English as a Foreign Language (TOEFL), they concluded in the paper that the EyeScore method produced "competitive results" compared with the standardized tests.
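For illustration, the kind of correlation check described above might look like the following sketch. This is not the authors' EyeScore pipeline; the single aggregate feature (mean fixation time per word) and the sample values are invented for demonstration.

```python
# Illustrative only: correlating a hypothetical aggregate eye-movement
# feature with standardized test scores for a handful of readers.
import numpy as np

# one row per reader: invented mean fixation time (ms/word) and MET score
mean_fixation_ms = np.array([310.0, 265.0, 240.0, 295.0, 220.0])
met_score        = np.array([48,    60,    67,    52,    71])

# Pearson correlation; longer dwell times would be expected to go with
# lower proficiency scores, so a strong negative correlation is plausible.
r = np.corrcoef(mean_fixation_ms, met_score)[0, 1]
print(f"Pearson r = {r:.2f}")
```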

Erik Reichle, head of the Department of Psychology at Macquarie University in Sydney, Australia said, “The method [used in the study] is very innovative and — in my opinion — holds much promise for using eye-tracking technology to its full potential. It will have a big impact in a number of different fields, including those more directly related to second-language learning.”

Katz said, "The bigger question is, how does language affect your brain?" Given that humans began processing written text only within the last several thousand years, our ability to read is an example of the "amazing plasticity" of the brain, he noted. "Before too long, we could actually be in a position to start answering these questions."

Levy said, "One thing that we would hope to do in the future that we haven't done yet, for example, is to ask, on a sentence-by-sentence basis, to what extent we can tell how well you understood a sentence by the eye movements you made when you read it. That's an open question nobody's answered. We hope we might be able to do that in the future."

The paper, “Assessing Language Proficiency from Eye Movements in Reading,” is being published in the Proceedings of the 16th Annual Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies.
