
Content provided by TWIML and Sam Charrington. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by TWIML and Sam Charrington or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process described here: https://da.player.fm/legal.

Transformers On Large-Scale Graphs with Bayan Bruss - #641

38:36
 
Manage episode 373510957 series 2355587

Today we’re joined by Bayan Bruss, Vice President of Applied ML Research at Capital One. In our conversation with Bayan, we covered a pair of papers his team presented at this year’s ICML conference. We begin with the paper Interpretable Subspaces in Image Representations, where Bayan gives us a deep dive into the interpretability framework, embedding dimensions, contrastive approaches, and how their model can accelerate image representation in deep learning. We also explore GOAT: A Global Transformer on Large-scale Graphs, a scalable global graph transformer. We talk through the computational challenges, homophilic and heterophilic principles, model sparsity, and how their research proposes methodologies to get around the computational barrier when scaling to large-scale graph models.
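The episode doesn't spell out GOAT's algorithm, but the "computational barrier" it refers to is the quadratic cost of global self-attention: every node attending to every other node. A minimal NumPy sketch of one generic workaround (attending to a sampled subset of anchor nodes — a hypothetical illustration, not GOAT's actual method; the `k_anchors` parameter is assumed for this example):

```python
import numpy as np

def global_attention(x, k_anchors=None, rng=None):
    """Single-head self-attention over node features x of shape (n, d).

    With k_anchors=None every node attends to all n nodes, so the score
    matrix alone costs O(n^2) memory; with k_anchors=k, each node attends
    only to k sampled anchor nodes, reducing that to O(n*k).
    """
    n, d = x.shape
    if k_anchors is None:
        keys = vals = x                          # dense: scores will be (n, n)
    else:
        rng = rng or np.random.default_rng(0)
        idx = rng.choice(n, size=k_anchors, replace=False)
        keys = vals = x[idx]                     # sampled anchors: (k, d)
    scores = x @ keys.T / np.sqrt(d)             # (n, n) or (n, k)
    # Numerically stable softmax over the key dimension.
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)
    return w @ vals                              # (n, d)

x = np.random.default_rng(1).normal(size=(10_000, 16)).astype(np.float32)
# A dense score matrix would hold 10_000^2 floats (~400 MB at float32);
# with 64 anchors it holds 10_000 * 64 floats instead.
out = global_attention(x, k_anchors=64)
print(out.shape)  # (10000, 16)
```

Anchor sampling is only one of several ways to break the quadratic barrier; the paper's own methodology is discussed in the episode and show notes.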

The complete show notes for this episode can be found at twimlai.com/go/641.


779 episodes


