Content provided by NLP Highlights and Allen Institute for Artificial Intelligence. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by NLP Highlights and Allen Institute for Artificial Intelligence or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process described here: https://da.player.fm/legal.

126 - Optimizing Continuous Prompts for Generation, with Lisa Li

47:38
 
 

Episode 293241805, series 1452120
We invited Lisa Li to talk about her recent work, Prefix-Tuning: Optimizing Continuous Prompts for Generation. Prefix tuning is a lightweight alternative to finetuning: the idea is to tune only a fixed-length, task-specific continuous vector while keeping the pretrained transformer's parameters frozen. We discussed how prefix tuning compares with finetuning and other parameter-efficient alternatives on two generation tasks in various experimental settings, and in which scenarios prefix tuning is preferable. Lisa is a PhD student at Stanford University. Lisa's webpage: https://xiangli1999.github.io/ The hosts for this episode are Pradeep Dasigi and Ana Marasović.
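The core idea described above — train only a small continuous prefix and freeze the pretrained model — can be sketched in a few lines of PyTorch. Note this is a simplified illustration, not the paper's implementation: the actual method prepends prefix activations to the keys and values at every transformer layer (and reparameterizes the prefix through an MLP during training), whereas this toy sketch just prepends learned vectors at the input-embedding level, with a stand-in backbone.

```python
import torch
import torch.nn as nn

class PrefixTuning(nn.Module):
    """Minimal sketch: a fixed-length continuous prefix is the only
    trainable component; the pretrained backbone stays frozen."""

    def __init__(self, model: nn.Module, prefix_len: int, hidden_dim: int):
        super().__init__()
        self.model = model
        # Freeze every pretrained parameter.
        for p in self.model.parameters():
            p.requires_grad = False
        # The sole trainable parameters: a task-specific continuous prefix.
        self.prefix = nn.Parameter(torch.randn(prefix_len, hidden_dim) * 0.02)

    def forward(self, input_embeds: torch.Tensor) -> torch.Tensor:
        # Prepend the same learned prefix to every sequence in the batch.
        batch = input_embeds.size(0)
        prefix = self.prefix.unsqueeze(0).expand(batch, -1, -1)
        return self.model(torch.cat([prefix, input_embeds], dim=1))

# Toy stand-in for a pretrained transformer body (hypothetical, for shape-checking only).
backbone = nn.Sequential(nn.Linear(16, 16), nn.ReLU(), nn.Linear(16, 16))
pt = PrefixTuning(backbone, prefix_len=4, hidden_dim=16)

# Only the prefix shows up as trainable; all backbone weights are frozen.
trainable = [name for name, p in pt.named_parameters() if p.requires_grad]
```

An optimizer built over `(p for p in pt.parameters() if p.requires_grad)` would then update just the prefix, which is what makes the approach lightweight: the per-task storage cost is the prefix alone, not a full copy of the model.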

145 episodes

