BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Incremental Accumulation of Linguistic Context in Artificial and B
 iological Neural Networks - Refael Tikochinski\, Hebrew University of Jeru
 salem
DTSTART:20240523T100000Z
DTEND:20240523T110000Z
UID:TALK215497@talks.cam.ac.uk
CONTACT:Panagiotis Fytas
DESCRIPTION:Accumulated evidence suggests that Large Language Models (LLM
 s) are effective at predicting neural signals related to narrative proce
 ssing. The way LLMs integrate context over long timescales\, however\, i
 s fundamentally different from the way the brain does it. In this study
 \, we show that\, unlike LLMs\, which process large contextual windows i
 n parallel\, the context available to the brain is limited to short wind
 ows of a few tens of words. We hypothesize that whereas lower-level brai
 n areas process short contextual windows\, higher-order areas in the def
 ault-mode network (DMN) engage in an online incremental mechanism in whi
 ch the incoming short context is summarized and integrated with informat
 ion accumulated across long timescales. Accordingly\, we introduce a nov
 el LLM that\, instead of processing the entire context at once\, increme
 ntally generates a concise summary of the preceding information. As pred
 icted\, we found that neural activity in the DMN was better predicted b
 y the incremental model\, whereas lower-level areas were better predict
 ed by a short-context-window LLM.\n\n
LOCATION:https://cam-ac-uk.zoom.us/j/97599459216?pwd=QTRsOWZCOXRTREVnbTJBd
 XVpOXFvdz09
END:VEVENT
END:VCALENDAR
