BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Talks.cam//talks.cam.ac.uk//
X-WR-CALNAME:Talks.cam
BEGIN:VEVENT
SUMMARY:A Contrastive Framework for Neural Text Generation - Yixuan Su\, U
 niversity of Cambridge
DTSTART:20220602T100000Z
DTEND:20220602T110000Z
UID:TALK175274@talks.cam.ac.uk
CONTACT:Marinela Parovic
DESCRIPTION:Text generation is of great importance to many NLP application
 s. However\, maximization-based decoding methods (e.g.\, beam search) of neu
 ral language models often lead to degenerate solutions---the generated tex
 t is unnatural and contains undesirable repetitions. Existing approaches i
 ntroduce stochasticity via sampling or modify training objectives to decre
 ase probabilities of certain tokens (e.g.\, unlikelihood training). Howeve
 r\, they often lead to solutions that lack coherence. In this work\, we sh
 ow that an underlying reason for model degeneration is the anisotropic dis
 tribution of token representations. We present a contrastive solution: (i)
  SimCTG\, a contrastive training objective to calibrate the model's repres
 entation space\, and (ii) a decoding method---contrastive search---to enco
 urage diversity while maintaining coherence in the generated text. Extensi
 ve experiments and analyses on three benchmarks from two languages demonst
 rate that our proposed approach outperforms state-of-the-art text generati
 on methods as evaluated by both human and automatic metrics.
LOCATION:Board Room\, English Faculty\, 9 West Road (Sidgwick Site)
END:VEVENT
END:VCALENDAR
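A brief, hedged illustration of the two components named in the abstract above. The first function is a sketch of a SimCTG-style contrastive term, which pushes the pairwise cosine similarity between distinct token representations below a margin rho to counteract anisotropy; the exact loss in the talk's paper may differ in normalization and in how it combines with the MLE objective. The generation call uses the Hugging Face transformers implementation of contrastive search, where passing penalty_alpha > 0 together with top_k > 1 to generate() activates it; the model name ("gpt2") and the hyperparameter values (alpha = 0.6, k = 4) are illustrative assumptions, not values taken from the talk.

import torch
import torch.nn.functional as F
from transformers import AutoModelForCausalLM, AutoTokenizer

def simctg_contrastive_term(hidden_states: torch.Tensor,
                            rho: float = 0.5) -> torch.Tensor:
    # hidden_states: (seq_len, dim) token representations of one sequence.
    # Hinge loss penalizing cosine similarity between distinct tokens
    # above 1 - rho, which spreads out the model's representation space.
    h = F.normalize(hidden_states, dim=-1)
    sim = h @ h.t()                      # pairwise cosine similarities
    off_diag = ~torch.eye(sim.size(0), dtype=torch.bool, device=sim.device)
    return torch.clamp(rho - 1.0 + sim[off_diag], min=0.0).mean()

tok = AutoTokenizer.from_pretrained("gpt2")            # assumed model choice
model = AutoModelForCausalLM.from_pretrained("gpt2")
ids = tok("Text generation is", return_tensors="pt").input_ids

# penalty_alpha > 0 together with top_k > 1 selects contrastive search:
# each step picks, from the top-k candidates, the token that balances model
# confidence against similarity to the already-generated context.
out = model.generate(ids, penalty_alpha=0.6, top_k=4, max_new_tokens=64)
print(tok.decode(out[0], skip_special_tokens=True))

At decoding time, the degeneration penalty for a candidate token is its maximum cosine similarity to the representations of the preceding context, weighted by alpha against the model confidence weighted by (1 - alpha), which is how the method encourages diversity while maintaining coherence.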
