
Deepfakes and Non-Consensual Pornography

37:47
 

The deepfake detection platform Sensity published a report in 2019 finding that 96% of deepfakes on the internet are pornographic, and that 90% of those depict women. Deepfakes are a modern form of synthetic media created by two competing AIs with the goal of producing hyper-realistic videos, images, and voices. Over the past five years this has led to major concerns about the technology being used to spread mis- and disinformation, carry out fraudulent cybercrimes, tamper with human rights evidence, and, most importantly for this episode, create non-consensual pornography. In this episode, the last of this season of the Declarations podcast, host Maryam Tanwir sat down with panellist Neema Jayasinghe and Henry Ajder, who is not only responsible for the 2019 Sensity report but is also a seasoned expert on deepfakes and synthetic media. He is currently head of policy and partnerships at Metaphysic.AI and co-authored the report ‘Deeptrace: The State of Deepfakes’ while at Sensity. This was the first major report published to map the landscape of deepfakes, and it found that the overwhelming majority are used in pornography.


96 episodes
