Cold Takes Audio

Technology Podcasts

Amateur read-throughs of blog posts on Cold-Takes.com, for those who prefer listening to reading. Available on Apple, Spotify, Google Podcasts, and anywhere else you listen to podcasts by searching Cold Takes Audio.

Location:

United States

Language:

English

Contact:

347-469-0256


Episodes

What AI companies can do today to help with the most important century

2/20/2023
Major AI companies can increase or reduce global catastrophic risks. https://www.cold-takes.com/what-ai-companies-can-do-today-to-help-with-the-most-important-century/

Duration: 00:18:27

Jobs that can help with the most important century

2/10/2023
People are far better at their jobs than at anything else. Here are the best ways to help the most important century go well. https://www.cold-takes.com/jobs-that-can-help-with-the-most-important-century/

Duration: 00:30:42

Spreading messages to help with the most important century

1/25/2023
For people who want to help improve our prospects for navigating transformative AI, and have an audience (even a small one). https://www.cold-takes.com/spreading-messages-to-help-with-the-most-important-century/

Duration: 00:20:09

How we could stumble into AI catastrophe

1/13/2023
Hypothetical stories where the world tries, but fails, to avert a global disaster. https://www.cold-takes.com/how-we-could-stumble-into-ai-catastrophe

Duration: 00:28:54

Transformative AI issues (not just misalignment): an overview

1/5/2023
An overview of key potential factors (not just alignment risk) for whether things go well or poorly with transformative AI. https://www.cold-takes.com/transformative-ai-issues-not-just-misalignment-an-overview/

Duration: 00:25:01

Racing Through a Minefield: the AI Deployment Problem

12/22/2022
Push AI forward too fast, and catastrophe could occur. Too slow, and someone else less cautious could do it. Is there a safe course? https://www.cold-takes.com/racing-through-a-minefield-the-ai-deployment-problem/

Duration: 00:21:04

High-level hopes for AI alignment

12/15/2022
A few ways we might get very powerful AI systems to be safe. https://www.cold-takes.com/high-level-hopes-for-ai-alignment/

Duration: 00:23:49

AI safety seems hard to measure

12/8/2022
Four analogies for why "We don't see any misbehavior by this AI" isn't enough. https://www.cold-takes.com/ai-safety-seems-hard-to-measure/

Duration: 00:22:22

Why Would AI "Aim" To Defeat Humanity?

11/29/2022
Today's AI development methods risk training AIs to be deceptive, manipulative and ambitious. This might not be easy to fix as it comes up. https://www.cold-takes.com/why-would-ai-aim-to-defeat-humanity/

Duration: 00:46:14

The Track Record of Futurists Seems ... Fine

6/30/2022
We scored mid-20th-century sci-fi writers on nonfiction predictions. They weren't great, but weren't terrible either. Maybe doing futurism works fine. https://www.cold-takes.com/the-track-record-of-futurists-seems-fine/

Duration: 00:21:21

Nonprofit Boards are Weird

6/23/2022
With great power comes, er, unclear responsibility and zero accountability. https://www.cold-takes.com/nonprofit-boards-are-weird-2/

Duration: 00:25:27

AI Could Defeat All Of Us Combined

6/7/2022
How big a deal could AI misalignment be? About as big as it gets. https://www.cold-takes.com/ai-could-defeat-all-of-us-combined/

Duration: 00:23:56

Useful Vices for Wicked Problems

4/12/2022
Investigating important topics with laziness, impatience, hubris and self-preservation. https://www.cold-takes.com/useful-vices-for-wicked-problems/

Duration: 00:25:19

Ideal governance (for companies, countries and more)

4/4/2022
What kind of governance system should you set up, if you're starting from scratch and can do it however you want? https://www.cold-takes.com/ideal-governance-for-companies-countries-and-more/

Duration: 00:17:33

Debating myself on whether “extra lives lived” are as good as “deaths prevented”

3/29/2022
Preventing extinction would be good - but "saving 8 billion lives" good or "saving a trillion trillion trillion lives" good? https://www.cold-takes.com/debating-myself-on-whether-extra-lives-lived-are-as-good-as-deaths-prevented/

Duration: 00:20:20

The Wicked Problem Experience

3/2/2022
A day in the life of trying to complete a self-assigned project with no clear spec or goal. https://www.cold-takes.com/the-wicked-problem-experience/

Duration: 00:14:50

Learning By Writing

2/22/2022

Duration: 00:16:24

Defending One-Dimensional Ethics

2/15/2022
First in a series of dialogues on utilitarianism and "future-proof ethics." https://www.cold-takes.com/defending-one-dimensional-ethics/

Duration: 00:26:00

Future-Proof Ethics

2/2/2022
Ethics based on common sense seems to have a horrible historical track record. Can we do better? https://www.cold-takes.com/future-proof-ethics/

Duration: 00:27:09

Stakeholder Management and Cost Disease

1/27/2022

Duration: 00:18:04