Player FM - Internet Radio Done Right
Added four years ago
Content provided by The Thesis Review and Sean Welleck. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by The Thesis Review and Sean Welleck or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process described here: https://da.player.fm/legal.
[25] Tomas Mikolov - Statistical Language Models Based on Neural Networks
Manage episode 302418420 series 2982803
Tomas Mikolov is a Senior Researcher at the Czech Institute of Informatics, Robotics, and Cybernetics. His research has covered topics in natural language understanding and representation learning, including Word2Vec, and complexity. Tomas's PhD thesis is titled "Statistical Language Models Based on Neural Networks", which he completed in 2012 at the Brno University of Technology. We discuss compression and recurrent language models, the backstory behind Word2Vec, and his recent work on complexity & automata. Episode notes: https://cs.nyu.edu/~welleck/episode25.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview
49 episodes
All episodes
The Thesis Review

[48] Tianqi Chen - Scalable and Intelligent Learning Systems 46:29
Tianqi Chen is an Assistant Professor in the Machine Learning Department and Computer Science Department at Carnegie Mellon University and the Chief Technologist of OctoML. His research focuses on the intersection of machine learning and systems. Tianqi's PhD thesis is titled "Scalable and Intelligent Learning Systems," which he completed in 2019 at the University of Washington. We discuss his influential work on machine learning systems, starting with the development of XGBoost, an optimized distributed gradient boosting library that has had an enormous impact in the field. We also cover his contributions to deep learning frameworks like MXNet and machine learning compilation with TVM, and connect these to modern generative AI. - Episode notes: www.wellecks.com/thesisreview/episode48.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Follow Tianqi Chen on Twitter (@tqchenml) - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview

[47] Niloofar Mireshghallah - Auditing and Mitigating Safety Risks in Large Language Models 1:17:06
Niloofar Mireshghallah is a postdoctoral scholar at the University of Washington. Her research focuses on privacy, natural language processing, and the societal implications of machine learning. Niloofar completed her PhD in 2023 at UC San Diego, where she was advised by Taylor Berg-Kirkpatrick. Her PhD thesis is titled "Auditing and Mitigating Safety Risks in Large Language Models." We discuss her journey into research and her work on privacy and LLMs, including how privacy is defined, common attacks and mitigations, differential privacy, and the balance between memorization and generalization. - Episode notes: www.wellecks.com/thesisreview/episode47.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview

[46] Yulia Tsvetkov - Linguistic Knowledge in Data-Driven NLP 59:53
Yulia Tsvetkov is a Professor in the Allen School of Computer Science & Engineering at the University of Washington. Her research focuses on multilingual NLP, NLP for social good, and language generation. Yulia's PhD thesis is titled "Linguistic Knowledge in Data-Driven Natural Language Processing", which she completed in 2016 at CMU. We discuss getting started in research, then move to Yulia's work in the thesis that combines ideas from linguistics and natural language processing. We discuss low-resource and multilingual NLP, large language models, and great advice about research and beyond. - Episode notes: www.wellecks.com/thesisreview/episode46.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at www.wellecks.com/thesisreview - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview

[45] Luke Zettlemoyer - Learning to Map Sentences to Logical Form 59:35
Luke Zettlemoyer is a Professor at the University of Washington and Research Scientist at Meta. His work spans machine learning and NLP, including foundational work in large-scale self-supervised pretraining of language models. Luke's PhD thesis is titled "Learning to Map Sentences to Logical Form", which he completed in 2009 at MIT. We talk about his PhD work, the path to the foundational ELMo paper, and various topics related to large language models. - Episode notes: www.wellecks.com/thesisreview/episode45.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at www.wellecks.com/thesisreview - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview

[44] Hady Elsahar - NLG from Structured Knowledge Bases (& Controlling LMs) 1:05:56
Hady Elsahar is a Research Scientist at Naver Labs Europe. His research focuses on neural language generation under constrained and controlled conditions. Hady's PhD was on interactions between natural language and structured knowledge bases for Data2Text generation and relation extraction & discovery, which he completed in 2019 at the Université de Lyon. We talk about his PhD work and how it led to interests in multilingual and low-resource NLP, as well as controlled generation. We dive deeper into controlling language models, including his interesting work on distributional control and energy-based models. - Episode notes: www.wellecks.com/thesisreview/episode44.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at www.wellecks.com/thesisreview - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview

[43] Swarat Chaudhuri - Logics and Algorithms for Software Model Checking 1:06:18
Swarat Chaudhuri is an Associate Professor at the University of Texas. His lab studies problems at the interface of programming languages, logic and formal methods, and machine learning. Swarat's PhD thesis is titled "Logics and Algorithms for Software Model Checking", which he completed in 2007 at the University of Pennsylvania. We discuss reasoning about programs, formal methods & safer machine learning systems, and the future of program synthesis & neurosymbolic programming. - Episode notes: www.wellecks.com/thesisreview/episode43.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at www.wellecks.com/thesisreview - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview

[42] Charles Sutton - Efficient Training Methods for Conditional Random Fields 1:18:01
Charles Sutton is a Research Scientist at Google Brain and an Associate Professor at the University of Edinburgh. His research focuses on deep learning for generating code and helping people write better programs. Charles' PhD thesis is titled "Efficient Training Methods for Conditional Random Fields", which he completed in 2008 at UMass Amherst. We start with his work in the thesis on structured models for text, and compare/contrast with today's large language models. From there, we discuss machine learning for code & the future of language models in program synthesis. - Episode notes: https://cs.nyu.edu/~welleck/episode42.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview

[41] Talia Ringer - Proof Repair 1:19:02
Talia Ringer is an Assistant Professor with the Programming Languages, Formal Methods, and Software Engineering group at the University of Illinois Urbana-Champaign. Her research focuses on formal verification and proof engineering technologies. Talia's PhD thesis is titled "Proof Repair", which she completed in 2021 at the University of Washington. We discuss software verification, her PhD work on proof repair for maintaining verified systems, and the intersection of machine learning with her work. - Episode notes: https://cs.nyu.edu/~welleck/episode41.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview

[40] Lisa Lee - Learning Embodied Agents with Scalably-Supervised RL 46:59
Lisa Lee is a Research Scientist at Google Brain. Her research focuses on building AI agents that can learn and adapt like humans and animals do. Lisa's PhD thesis is titled "Learning Embodied Agents with Scalably-Supervised Reinforcement Learning", which she completed in 2021 at Carnegie Mellon University. We talk about her work in the thesis on reinforcement learning, including exploration, learning with weak supervision, and embodied agents, and cover various topics related to trends in reinforcement learning. - Episode notes: https://cs.nyu.edu/~welleck/episode40.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview

[39] Burr Settles - Curious Machines: Active Learning with Structured Instances 1:06:33
Burr Settles leads the research group at Duolingo, a language-learning website and mobile app whose mission is to make language education free and accessible to everyone. Burr’s PhD thesis is titled "Curious Machines: Active Learning with Structured Instances", which he completed in 2008 at the University of Wisconsin-Madison. We talk about his work in the thesis on active learning, then chart the path to Burr’s role at Duolingo. We discuss machine learning for education and language learning, including content, assessment, and the exciting possibilities opened by recent advancements. - Episode notes: https://cs.nyu.edu/~welleck/episode39.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview

[38] Andrew Lampinen - A Computational Framework for Learning and Transforming Task Representations 1:04:47
Andrew Lampinen is a research scientist at DeepMind. His research focuses on cognitive flexibility and generalization. Andrew’s PhD thesis is titled "A Computational Framework for Learning and Transforming Task Representations", which he completed in 2020 at Stanford University. We talk about cognitive flexibility in brains and machines, centered around his work in the thesis on meta-mapping. We cover a lot of interesting ground, including complementary learning systems and memory, compositionality and systematicity, and the role of symbols in machine learning. - Episode notes: https://cs.nyu.edu/~welleck/episode38.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview

[37] Joonkoo Park - Neural Substrates of Visual Word and Number Processing 1:09:28
Joonkoo Park is an Associate Professor and Honors Faculty in the Department of Psychological and Brain Sciences at UMass Amherst. He leads the Cognitive and Developmental Neuroscience Lab, focusing on understanding the developmental mechanisms and neurocognitive underpinnings of our knowledge about number and mathematics. Joonkoo’s PhD thesis is titled "Experiential Effects on the Neural Substrates of Visual Word and Number Processing", which he completed in 2011 at the University of Michigan. We talk about numerical processing in the brain, starting with nature vs. nurture, including the learned versus built-in aspects of neural architectures. We talk about the difference between word and number processing, types of numerical thinking, and symbolic vs. non-symbolic numerical processing. - Episode notes: https://cs.nyu.edu/~welleck/episode37.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview

[36] Dieuwke Hupkes - Hierarchy and Interpretability in Neural Models of Language Processing 1:02:26
Dieuwke Hupkes is a Research Scientist at Facebook AI Research and the scientific manager of the Amsterdam unit of ELLIS. Dieuwke's PhD thesis is titled "Hierarchy and Interpretability in Neural Models of Language Processing", which she completed in 2020 at the University of Amsterdam. We discuss her work on which aspects of hierarchical compositionality and syntactic structure can be learned by recurrent neural networks, how these models can serve as explanatory models of human language processing, what compositionality actually means, and a lot more. - Episode notes: https://cs.nyu.edu/~welleck/episode36.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview

[35] Armando Solar-Lezama - Program Synthesis by Sketching 1:15:56
Armando Solar-Lezama is a Professor at MIT, and the Associate Director & COO of CSAIL. He leads the Computer Assisted Programming Group, focused on program synthesis. Armando’s PhD thesis is titled "Program Synthesis by Sketching", which he completed in 2008 at UC Berkeley. We talk about program synthesis & his work on Sketch, how machine learning's role in program synthesis has evolved over time, and more. - Episode notes: https://cs.nyu.edu/~welleck/episode35.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview

[34] Sasha Rush - Lagrangian Relaxation for Natural Language Decoding 1:08:12
Sasha Rush is an Associate Professor at Cornell Tech and researcher at Hugging Face. His research focuses on building NLP systems that are safe, fast, and controllable. Sasha's PhD thesis is titled "Lagrangian Relaxation for Natural Language Decoding", which he completed in 2014 at MIT. We talk about his work in the thesis on decoding in NLP, how it connects with today, and many interesting topics along the way such as the role of engineering in machine learning, breadth vs. depth, and more. - Episode notes: https://cs.nyu.edu/~welleck/episode34.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview

[33] Michael R. Douglas - G/H Conformal Field Theory 1:12:58
Michael R. Douglas is a theoretical physicist and Professor at Stony Brook University, and Visiting Scholar at Harvard University. His research focuses on string theory, theoretical physics and its relations to mathematics. Michael's PhD thesis is titled "G/H Conformal Field Theory", which he completed in 1988 at Caltech. We talk about working with Feynman, Sussman, and Hopfield during his PhD days, the superstring revolutions and string theory, and machine learning's role in the future of science and mathematics. - Episode notes: https://cs.nyu.edu/~welleck/episode33.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview

[32] Andre Martins - The Geometry of Constrained Structured Prediction 1:26:39
Andre Martins is an Associate Professor at IST and VP of AI Research at Unbabel in Lisbon, Portugal. His research focuses on natural language processing and machine learning. Andre’s PhD thesis is titled "The Geometry of Constrained Structured Prediction: Applications to Inference and Learning of Natural Language Syntax", which he completed in 2012 at Carnegie Mellon University and IST. We talk about his work in the thesis on structured prediction in NLP, and discuss connections between his thesis work and later work on sparsity, sparse communication, and more. - Episode notes: https://cs.nyu.edu/~welleck/episode32.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview

[31] Jay McClelland - Preliminary Letter Identification in the Perception of Words and Nonwords 1:33:51
Jay McClelland is a Professor in the Psychology Department and Director of the Center for Mind, Brain, Computation and Technology at Stanford. His research addresses a broad range of topics in cognitive science and cognitive neuroscience, including Parallel Distributed Processing (PDP). Jay's PhD thesis is titled "Preliminary Letter Identification in the Perception of Words and Nonwords", which he completed in 1975 at the University of Pennsylvania. We discuss his work in the thesis on the word superiority effect, how it led to the Interactive Activation model, the path to Parallel Distributed Processing and the connectionist revolution, and distributed vs rule-based and symbolic approaches to modeling human cognition and artificial intelligence. - Episode notes: https://cs.nyu.edu/~welleck/episode31.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview

[30] Dustin Tran - Probabilistic Programming for Deep Learning 1:02:50
Dustin Tran is a research scientist at Google Brain. His research focuses on advancing science and intelligence, including areas involving probability, programs, and neural networks. Dustin’s PhD thesis is titled "Probabilistic Programming for Deep Learning", which he completed in 2020 at Columbia University. We discuss the intersection of probabilistic modeling and deep learning, including the Edward library and the novel inference algorithms and models that he developed in the thesis. - Episode notes: https://cs.nyu.edu/~welleck/episode30.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview

[29] Tengyu Ma - Non-convex Optimization for Machine Learning 1:17:22
Tengyu Ma is an Assistant Professor at Stanford University. His research focuses on deep learning and its theory, as well as various topics in machine learning. Tengyu's PhD thesis is titled "Non-convex Optimization for Machine Learning: Design, Analysis, and Understanding", which he completed in 2017 at Princeton University. We discuss theory in machine learning and deep learning, including the 'all local minima are global minima' property, overparameterization, as well as perspectives that theory takes on understanding deep learning. - Episode notes: https://cs.nyu.edu/~welleck/episode29.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview

[28] Karen Ullrich - A Coding Perspective on Deep Latent Variable Models 1:06:20
Karen Ullrich is a Research Scientist at FAIR. Her research focuses on the intersection of information theory and probabilistic machine learning and deep learning. Karen's PhD thesis is titled "A coding perspective on deep latent variable models", which she completed in 2020 at the University of Amsterdam. We discuss information theory & the minimum description length principle, along with her work in the thesis on compression and communication. Episode notes: https://cs.nyu.edu/~welleck/episode28.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview

[27] Danqi Chen - Neural Reading Comprehension and Beyond 55:43
Danqi Chen is an assistant professor at Princeton University, co-leading the Princeton NLP Group. Her research focuses on fundamental methods for learning representations of language and knowledge, and practical systems including question answering, information extraction and conversational agents. Danqi’s PhD thesis is titled "Neural Reading Comprehension and Beyond", which she completed in 2018 at Stanford University. We discuss her work on parsing, reading comprehension and question answering. Throughout we discuss progress in NLP, fundamental challenges, and what the future holds. Episode notes: https://cs.nyu.edu/~welleck/episode27.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview

[26] Kevin Ellis - Algorithms for Learning to Induce Programs 1:17:49
Kevin Ellis is an assistant professor at Cornell and currently a research scientist at Common Sense Machines. His research focuses on artificial intelligence, program synthesis, and neurosymbolic models. Kevin's PhD thesis is titled "Algorithms for Learning to Induce Programs", which he completed in 2020 at MIT. We discuss Kevin’s work at the intersection of machine learning and program induction, including inferring graphics programs from images and drawings, DreamCoder, and more. Episode notes: https://cs.nyu.edu/~welleck/episode26.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview

[25] Tomas Mikolov - Statistical Language Models Based on Neural Networks 1:19:17
Tomas Mikolov is a Senior Researcher at the Czech Institute of Informatics, Robotics, and Cybernetics. His research has covered topics in natural language understanding and representation learning, including Word2Vec, and complexity. Tomas's PhD thesis is titled "Statistical Language Models Based on Neural Networks", which he completed in 2012 at the Brno University of Technology. We discuss compression and recurrent language models, the backstory behind Word2Vec, and his recent work on complexity & automata. Episode notes: https://cs.nyu.edu/~welleck/episode25.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview

[24] Martin Arjovsky - Out of Distribution Generalization in Machine Learning 1:02:48
Martin Arjovsky is a postdoctoral researcher at INRIA. His research focuses on generative modeling, generalization, and exploration in RL. Martin's PhD thesis is titled "Out of Distribution Generalization in Machine Learning", which he completed in 2019 at New York University. We discuss his work on the influential Wasserstein GAN early in his PhD, then discuss his thesis work on out-of-distribution generalization which focused on causal invariance and invariant risk minimization. Episode notes: https://cs.nyu.edu/~welleck/episode24.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview

[23] Simon Du - Gradient Descent for Non-convex Problems in Modern Machine Learning 1:06:30
Simon Shaolei Du is an Assistant Professor at the University of Washington. His research focuses on theoretical foundations of deep learning, representation learning, and reinforcement learning. Simon's PhD thesis is titled "Gradient Descent for Non-convex Problems in Modern Machine Learning", which he completed in 2019 at Carnegie Mellon University. We discuss his work related to the theory of gradient descent for challenging non-convex problems that we encounter in deep learning. We cover various topics including connections with the Neural Tangent Kernel, theory vs. practice, and future research directions. Episode notes: https://cs.nyu.edu/~welleck/episode23.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview

[22] Graham Neubig - Unsupervised Learning of Lexical Information 1:02:30
Graham Neubig is an Associate Professor at Carnegie Mellon University. His research focuses on language and its role in human communication, with the goal of breaking down barriers in human-human or human-machine communication through the development of NLP technologies. Graham’s PhD thesis is titled "Unsupervised Learning of Lexical Information for Language Processing Systems", which he completed in 2012 at Kyoto University. We discuss his PhD work related to the fundamental processing units that NLP systems use to process text, including non-parametric Bayesian models, segmentation, and alignment problems, and discuss how his perspective on machine translation has evolved over time. Episode notes: http://cs.nyu.edu/~welleck/episode22.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at http://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview

[21] Michela Paganini - Machine Learning Solutions for High Energy Physics 1:07:43
Michela Paganini is a Research Scientist at DeepMind. Her research focuses on investigating ways to compress and scale up neural networks. Michela's PhD thesis is titled "Machine Learning Solutions for High Energy Physics", which she completed in 2019 at Yale University. We discuss her PhD work on deep learning for high energy physics, including jet tagging and fast simulation for the ATLAS experiment at the Large Hadron Collider, and the intersection of machine learning and physics. Episode notes: https://cs.nyu.edu/~welleck/episode21.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview


[20] Josef Urban - Deductive and Inductive Reasoning in Large Libraries of Formalized Mathematics 1:25:18
Liked1:25:18
Josef Urban is a Principal Researcher at the Czech Institute of Informatics, Robotics, and Cybernetics. His research focuses on artificial intelligence for large-scale computer-assisted reasoning. Josef's PhD thesis is titled "Exploring and Combining Deductive and Inductive Reasoning in Large Libraries of Formalized Mathematics", which he completed in 2004 at Charles University in Prague. We discuss his PhD work on the Mizar Problems for Theorem Proving, machine learning for premise selection, and how it evolved into his recent research. Episode notes: https://cs.nyu.edu/~welleck/episode20.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[19] Dumitru Erhan - Understanding Deep Architectures and the Effect of Unsupervised Pretraining 1:20:03
Liked1:20:03
Dumitru Erhan is a Research Scientist at Google Brain. His research focuses on understanding the world with neural networks. Dumitru's PhD thesis is titled "Understanding Deep Architectures and the Effect of Unsupervised Pretraining", which he completed in 2010 at the University of Montreal. We discuss his work in the thesis on understanding deep networks and unsupervised pretraining, his perspective on deep learning's development, and the path of ideas to his recent research. Episode notes: https://cs.nyu.edu/~welleck/episode19.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview…

[18] Eero Simoncelli - Distributed Representation and Analysis of Visual Motion 1:25:37
Liked1:25:37
Eero Simoncelli is a Professor of Neural Science, Mathematics, Data Science, and Psychology at New York University. His research focuses on the representation and analysis of visual information. Eero's PhD thesis is titled "Distributed Representation & Analysis of Visual Motion", which he completed in 1993 at MIT. We discuss his PhD work on optical flow, the ideas and methods that have stayed with him throughout his career, making biological connections with machine learning models, and how Eero's perspective on vision has evolved. Episode notes: https://cs.nyu.edu/~welleck/episode18.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview

[17] Paul Middlebrooks - Neuronal Correlates of Meta-Cognition in Primate Frontal Cortex 1:36:10
Liked1:36:10
Paul Middlebrooks is a neuroscientist and host of the Brain Inspired podcast, which explores the intersection of neuroscience and artificial intelligence. Paul's PhD thesis is titled "Neuronal Correlates of Meta-Cognition in Primate Frontal Cortex", which he completed at the University of Pittsburgh in 2011. We discuss Paul's work on meta-cognition - informally, thinking about thinking - then discuss neuroscience for A.I. and A.I. for neuroscience. Episode notes: https://cs.nyu.edu/~welleck/episode17.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at https://www.patreon.com/thesisreview…

[16] Aaron Courville - A Latent Cause Theory of Classical Conditioning 1:19:21
Liked1:19:21
Aaron Courville is a Professor at the University of Montreal. His research focuses on the development of deep learning models and methods. Aaron's PhD thesis is titled "A Latent Cause Theory of Classical Conditioning", which he completed at Carnegie Mellon University in 2006. We discuss Aaron's work on the latent cause theory during his PhD, talk about how Aaron moved into machine learning and deep learning research, chart a path to today's deep learning methods, and discuss his recent work on systematic generalization in language. Episode notes: cs.nyu.edu/~welleck/episode16.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview…

[15] Christian Szegedy - Some Applications of the Weighted Combinatorial Laplacian 1:06:52
Liked1:06:52
Christian Szegedy is a Research Scientist at Google. His research has produced machine learning methods such as the Inception architecture, batch normalization, and adversarial examples, and he currently investigates machine learning for mathematical reasoning. Christian’s PhD thesis is titled "Some Applications of the Weighted Combinatorial Laplacian", which he completed in 2005 at the University of Bonn. We discuss Christian’s background in mathematics, his PhD work on areas of both pure and applied mathematics, and his path into machine learning research. Finally, we discuss his recent work on using deep learning for mathematical reasoning and automatically formalizing mathematics. Episode notes: https://cs.nyu.edu/~welleck/episode15.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview

[14] Been Kim - Interactive and Interpretable Machine Learning Models 1:04:22
Liked1:04:22
Been Kim is a Research Scientist at Google Brain. Her research focuses on designing high-performance machine learning methods that make sense to humans. Been's PhD thesis is titled "Interactive and Interpretable Machine Learning Models for Human Machine Collaboration", which she completed in 2015 at MIT. We discuss her work on interpretability, including her work in the thesis on the Bayesian Case Model and its interactive version, as well as connections with her subsequent work on black-box interpretability methods that are used in many real-world applications. Episode notes: https://cs.nyu.edu/~welleck/episode14.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview…

[13] Adji Bousso Dieng - Deep Probabilistic Graphical Modeling 1:07:50
Liked1:07:50
Adji Bousso Dieng is currently a Research Scientist at Google AI, and will be starting as an assistant professor at Princeton University in 2021. Her research focuses on combining probabilistic graphical modeling and deep learning to design models for structured high-dimensional data. Her PhD thesis is titled "Deep Probabilistic Graphical Modeling", which she completed in 2020 at Columbia University. We discuss her work on combining graphical models and deep learning, including models and algorithms, the value of interpretability and probabilistic models, as well as applications and making an impact through research. Episode notes: https://cs.nyu.edu/~welleck/episode13.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview…

[12] Martha White - Regularized Factor Models 1:08:35
Liked1:08:35
Martha White is an Associate Professor at the University of Alberta. Her research focuses on developing reinforcement learning and representation learning techniques for adaptive, autonomous agents learning on streams of data. Her PhD thesis is titled "Regularized Factor Models", which she completed in 2014 at the University of Alberta. We discuss the regularized factor model framework, which unifies many machine learning methods and led to new algorithms and applications. We talk about sparsity and how it also appears in her later work, as well as the common threads between her thesis work and her research in reinforcement learning. Episode notes: https://cs.nyu.edu/~welleck/episode12.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview…

[11] Jacob Andreas - Learning from Language 1:19:43
Liked1:19:43
Jacob Andreas is an Assistant Professor at MIT, where he leads the language and intelligence group, focusing on language as a communicative and computational tool. His PhD thesis is titled "Learning from Language", which he completed in 2018 at UC Berkeley. We discuss compositionality and neural module networks, the intersection of RL and language, translating a neural communication channel called 'neuralese', and how this can lead to more interpretable machine learning models. Episode notes: https://cs.nyu.edu/~welleck/episode11.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview

[10] Chelsea Finn - Learning to Learn with Gradients 51:44
Liked51:44
Chelsea Finn is an Assistant Professor at Stanford University, where she leads the IRIS lab that studies intelligence through robotic interaction at scale. Her PhD thesis is titled "Learning to Learn with Gradients", which she completed in 2018 at UC Berkeley. Chelsea received the prestigious ACM Doctoral Dissertation Award for her work in the thesis. We discuss machine learning for robotics, focusing on learning-to-learn - also known as meta-learning - and her work on the MAML algorithm during her PhD, as well as the future of robotics research. Episode notes: https://cs.nyu.edu/~welleck/episode10.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview…

[09] Kenneth Stanley - Efficient Evolution of Neural Networks through Complexification 1:21:26
Liked1:21:26
Kenneth Stanley is a researcher at OpenAI, where he leads the team on Open-endedness. Previously he was a Professor of Computer Science at the University of Central Florida, cofounder of Geometric Intelligence, and head of Core AI research at Uber AI Labs. His PhD thesis is titled "Efficient Evolution of Neural Networks through Complexification", which he completed in 2004 at the University of Texas. We talk about evolving increasingly complex structures and how this led to the NEAT algorithm that he developed during his PhD. We discuss his research directions related to open-endedness, how the field has changed over time, and how he currently views algorithms that were developed over a decade ago. Episode notes: https://cs.nyu.edu/~welleck/episode9.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview

[08] He He - Sequential Decisions and Predictions in NLP 1:00:39
Liked1:00:39
He He is an Assistant Professor at New York University. Her research focuses on enabling reliable communication in natural language between machines and humans, including topics in text generation, robust language understanding, and dialogue systems. Her PhD thesis is titled "Sequential Decisions and Predictions in NLP", which she completed in 2016 at the University of Maryland. We talk about the intersection of language with imitation learning and reinforcement learning, her work in the thesis on opponent modeling and simultaneous translation, and how it relates to recent work on generation and robustness. Episode notes: https://cs.nyu.edu/~welleck/episode8.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview

[07] John Schulman - Optimizing Expectations: From Deep RL to Stochastic Computation Graphs 1:04:28
Liked1:04:28
John Schulman is a Research Scientist and co-founder of OpenAI. John co-leads the reinforcement learning team, researching algorithms that safely and efficiently learn by trial and error and by imitating humans. His PhD thesis is titled "Optimizing Expectations: From Deep Reinforcement Learning to Stochastic Computation Graphs", which he completed in 2016 at Berkeley. We talk about his work on stochastic computation graphs and TRPO, how it evolved into PPO and how it's used in large-scale applications like OpenAI Five, as well as his recent work on generalization in RL. Episode notes: https://cs.nyu.edu/~welleck/episode7.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview

[06] Yoon Kim - Deep Latent Variable Models of Natural Language 1:05:50
Liked1:05:50
Yoon Kim is currently a Research Scientist at the MIT-IBM AI Watson Lab, and will be joining MIT as an assistant professor in 2021. Yoon’s research focuses on machine learning and natural language processing. His PhD thesis is titled "Deep Latent Variable Models of Natural Language", which he completed in 2020 at Harvard University. We discuss his work on uncovering latent structure in natural language, including continuous vector representations, tree structures, and grammars. We cover learning and variational inference methods that he developed during his PhD, and he offers a look at where latent variable models will be heading in the future. Episode notes: https://cs.nyu.edu/~welleck/episode6.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview…

[05] Julian Togelius - Computational Intelligence and Games 1:12:06
Liked1:12:06
Julian Togelius is an Associate Professor at New York University, where he co-directs the NYU Game Innovation Lab. His research is at the intersection of computational intelligence and computer games. His PhD thesis is titled "Optimization, Imitation, and Innovation: Computational Intelligence and Games", which he completed in 2007. We cover his work in the thesis on AI for games and games for AI, and how it connects to his recent work on procedural content generation. Episode notes: https://cs.nyu.edu/~welleck/episode5.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at https://www.buymeacoffee.com/thesisreview…

[04] Sebastian Nowozin - Learning with Structured Data: Applications to Computer Vision 1:44:32
Liked1:44:32
Sebastian Nowozin is currently a Researcher at Microsoft Research Cambridge. His research focuses on probabilistic deep learning, consequences of model misspecification, understanding agent complexity in order to improve learning efficiency, and designing models for reasoning and planning. His PhD thesis is titled "Learning with Structured Data: Applications to Computer Vision", which he completed in 2009. We discuss the work in his thesis on structured inputs and structured outputs, which involves beautiful ideas from polyhedral combinatorics and optimization. We talk about his recent work on Bayesian deep learning and the connections it has to ideas that he explored during his PhD. Episode notes: https://cs.nyu.edu/~welleck/episode4.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html…

[03] Sebastian Ruder - Neural Transfer Learning for Natural Language Processing 1:24:32
Liked1:24:32
Sebastian Ruder is currently a Research Scientist at DeepMind. His research focuses on transfer learning for natural language processing, and making machine learning and NLP more accessible. His PhD thesis is titled "Neural Transfer Learning for Natural Language Processing", which he completed in 2019. We cover transfer learning from philosophical and technical perspectives, and talk about its societal implications, focusing on his work on sequential transfer learning and cross-lingual learning. Episode notes: https://cs.nyu.edu/~welleck/episode3.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html

[02] Colin Raffel - Learning-Based Methods for Comparing Sequences 1:15:54
Liked1:15:54
Colin Raffel is currently a Senior Research Scientist at Google Brain, and soon to be an assistant professor at the University of North Carolina. His recent work focuses on transfer learning and learning from limited labels. His thesis is titled "Learning-Based Methods for Comparing Sequences, with Applications to Audio-to-MIDI Alignment and Matching", which we discuss along with the connections to his later work, and plans for the future. Episode notes: https://cs.nyu.edu/~welleck/episode2.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html…

[01] Gus Xia - Expressive Collaborative Music Performance via Machine Learning 57:59
Liked57:59
Gus Xia is an assistant professor at New York University Shanghai. His research explores machine learning for music, with a goal of building intelligent systems that understand and extend musical creativity and expression. His PhD thesis is titled "Expressive Collaborative Music Performance via Machine Learning", which we discuss in depth along with his ongoing research at the NYU Shanghai Music X Lab. - Gus Xia's homepage: https://www.cs.cmu.edu/~gxia/ - Thesis: http://reports-archive.adm.cs.cmu.edu/anon/ml2016/CMU-ML-16-103.pdf Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html

[00] The Thesis Review Podcast - Introduction 1:46
[00] The Thesis Review Podcast - Introduction by Sean Welleck