
Decoding Animal Behavior to Train Robots with EgoPet with Amir Bar - #692

43:16
 
Content provided by TWIML and Sam Charrington. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by TWIML and Sam Charrington or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process described here: https://da.player.fm/legal.

Today, we're joined by Amir Bar, a PhD candidate at Tel Aviv University and UC Berkeley, to discuss his research on vision-based learning, including his recent paper, “EgoPet: Egomotion and Interaction Data from an Animal’s Perspective.” Amir shares his research projects focused on self-supervised object detection and analogy reasoning for general computer vision tasks. We also discuss the current limitations of caption-based datasets for model training, the ‘learning problem’ in robotics, and the gap between the capabilities of animals and those of AI systems. Amir introduces EgoPet, a dataset and set of benchmark tasks that allow motion and interaction data captured from an animal’s perspective to be incorporated into machine learning models for robotic planning and proprioception. We explore the dataset collection process, comparisons with existing datasets and benchmark tasks, findings on the performance of models trained on EgoPet, and the potential of directly training robot policies that mimic animal behavior.
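
To make the vision-to-proprioception idea concrete, below is a minimal illustrative sketch in PyTorch of the kind of probe such a benchmark might use: predicting an agent's motion (say, linear velocity and yaw rate) from precomputed egocentric video features. All names, dimensions, and targets here are hypothetical placeholders, not the paper's actual dataset API or benchmark code.

import torch
import torch.nn as nn

class VisionToProprioceptionProbe(nn.Module):
    """Hypothetical probe: map per-clip visual features to motion targets.

    Assumes a frozen video encoder has already produced one feature
    vector per egocentric clip; only this small head is trained.
    """
    def __init__(self, feat_dim: int = 768, target_dim: int = 2):
        super().__init__()
        # A linear probe keeps the evaluation focused on the quality of
        # the pretrained representation rather than on head capacity.
        self.head = nn.Linear(feat_dim, target_dim)

    def forward(self, clip_features: torch.Tensor) -> torch.Tensor:
        return self.head(clip_features)

# Dummy stand-ins for encoded clips and proprioceptive labels:
# 8 clips with 768-dim features; targets = (linear velocity, yaw rate).
features = torch.randn(8, 768)
targets = torch.randn(8, 2)

probe = VisionToProprioceptionProbe()
optimizer = torch.optim.AdamW(probe.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(200):
    optimizer.zero_grad()
    loss = loss_fn(probe(features), targets)
    loss.backward()
    optimizer.step()

print(f"final regression loss: {loss.item():.4f}")

In a setup like this, the interesting question the episode raises is whether features pretrained on animal egocentric video yield lower probe error than features pretrained on existing third-person or human-centric datasets.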

The complete show notes for this episode can be found at https://twimlai.com/go/692.
