PyTorch Developer Podcast
Batching

13:37
 
Content provided by PyTorch, Edward Yang, and Team PyTorch. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by PyTorch, Edward Yang, and Team PyTorch or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process described here: https://da.player.fm/legal.

PyTorch operates on its input data in a batched manner, typically processing a whole batch of inputs at once (rather than one at a time, as would be the case in typical programming). In this podcast, we talk a little about the implications of batching operations in this way, and then also about how PyTorch's API is structured for batching (hint: poorly) and how NumPy introduced the concept of ufuncs/gufuncs to standardize broadcasting and batching behavior. There is some overlap between this podcast and previous podcasts about TensorIterator and vmap; you may also be interested in those episodes.
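As a minimal illustrative sketch (not taken from the episode), the Python snippet below shows the kind of batched behavior being described: a single torch.matmul call processes a whole batch of inputs at once, broadcasting over the leading batch dimension, much as a NumPy gufunc declares the core dimensions of an operation and broadcasts over the rest.

import torch

# A batch of 32 inputs, each a 128 x 64 matrix, and one shared 64 x 10 weight.
x = torch.randn(32, 128, 64)
w = torch.randn(64, 10)

# matmul treats the trailing two dimensions as the "core" matrix multiply and
# broadcasts over the leading batch dimension: one call computes 32 products.
y = torch.matmul(x, w)
print(y.shape)  # torch.Size([32, 128, 10])

# The unbatched equivalent loops over the batch one sample at a time.
y_loop = torch.stack([x[i] @ w for i in range(x.shape[0])])
print(torch.allclose(y, y_loop))  # True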

Further reading.


82 episodes

