BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:Can Sparsity Lead to Efficient LLMs? - Shiwei Liu\, University of 
 Oxford
DTSTART:20240613T100000Z
DTEND:20240613T110000Z
UID:TALK215509@talks.cam.ac.uk
CONTACT:Panagiotis Fytas
DESCRIPTION:The rapid advancements in Large Language Models (LLMs) have re
 volutionized various natural language processing tasks. However\, the subs
 tantial size of LLMs presents significant challenges in training\, fine-tu
 ning\, and deployment. In this talk\, I will discuss how sparsity\, a fund
 amental characteristic in neural networks\, can be leveraged to enhance LL
 M efficiency. The presentation will cover recent advances in LLM pruning
  and parameter-efficient fine-tuning\, centered on the principle: Not Eve
 ry Layer in LLMs is Worth Equal Computing.\n
LOCATION:Faculty of English\, Room SR24
END:VEVENT
END:VCALENDAR
