A Contrastive Framework for Neural Text Generation
- Speaker: Yixuan Su, University of Cambridge
- Date & Time: Thursday 02 June 2022, 11:00 - 12:00
- đ Venue: Board Room, English Faculty, 9 West Road (Sidgwick Site)
Abstract
Text generation is of great importance to many NLP applications. However, maximization-based decoding methods (e.g., beam search) of neural language models often lead to degenerate solutions: the generated text is unnatural and contains undesirable repetitions. Existing approaches introduce stochasticity via sampling or modify training objectives to decrease the probabilities of certain tokens (e.g., unlikelihood training). However, they often lead to solutions that lack coherence. In this work, we show that an underlying reason for model degeneration is the anisotropic distribution of token representations. We present a contrastive solution: (i) SimCTG, a contrastive training objective to calibrate the model's representation space, and (ii) a decoding method, contrastive search, to encourage diversity while maintaining coherence in the generated text. Extensive experiments and analyses on three benchmarks from two languages demonstrate that our proposed approach outperforms state-of-the-art text generation methods as evaluated by both human and automatic metrics.
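As a rough illustration of the decoding idea in the abstract, the sketch below scores each top-k candidate token by trading off the model's confidence against a degeneration penalty (its maximum similarity to previously generated token representations). This is a minimal, self-contained sketch with illustrative function and parameter names, not the authors' implementation; the exact scoring details are as presented in the talk and paper.

```python
import math

def _cosine(a, b):
    """Cosine similarity between two representation vectors (plain lists)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def contrastive_search_step(cand_ids, cand_probs, cand_reps, prev_reps, alpha=0.6):
    """Pick the next token from the top-k candidates.

    Each candidate is scored as
        (1 - alpha) * model confidence - alpha * degeneration penalty,
    where the penalty is the max cosine similarity between the candidate's
    representation and the representations of already-generated tokens.
    A larger alpha penalizes repetition more strongly.
    """
    best_id, best_score = None, -math.inf
    for tok, prob, rep in zip(cand_ids, cand_probs, cand_reps):
        penalty = max(_cosine(rep, h) for h in prev_reps)
        score = (1 - alpha) * prob - alpha * penalty
        if score > best_score:
            best_id, best_score = tok, score
    return best_id
```

For example, a candidate whose representation nearly duplicates an earlier token's is penalized even if the model assigns it higher probability, which is how the method discourages repetition while staying close to the model's distribution.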
Series: This talk is part of the Language Technology Lab Seminars series.
Included in Lists
- bld31
- Board Room, English Faculty, 9 West Road (Sidgwick Site)
- Cambridge Centre for Data-Driven Discovery (C2D3)
- Cambridge Forum of Science and Humanities
- Cambridge Language Sciences
- Cambridge talks
- Chris Davis' list
- Guy Emerson's list
- Interested Talks
- Language Sciences for Graduate Students
- Language Technology Lab Seminars
- ndk22's list
- ob366-ai4er
- rp587
- Simon Baker's List
- Trust & Technology Initiative - interesting events
- yk449