
Machines Like Us

Technology Podcasts

Machines Like Us is a technology show about people. We are living in an age of breakthroughs propelled by advances in artificial intelligence. Technologies that were once the realm of science fiction will become our reality: robot best friends, bespoke gene editing, brain implants that make us smarter. Every other Tuesday Taylor Owen sits down with the people shaping this rapidly approaching future. He’ll speak with entrepreneurs building world-changing technologies, lawmakers trying to ensure they’re safe, and journalists and scholars working to understand how they’re transforming our lives.

Location:

Canada

Twitter:

@cigionline

Language:

English

Contact:

1-519-885-2444, ext. 7269


Episodes

The Battle for Your Brain

5/21/2024
Earlier this year, Elon Musk’s company Neuralink successfully installed one of its brain implants in a 29-year-old quadriplegic man named Noland Arbaugh. The device changed Arbaugh’s life. He no longer needs a mouth stylus to control his computer or play video games. Instead, he can use his mind. The brain-computer interface that Arbaugh uses is part of an emerging field known as neurotechnology that promises to reshape the way we live. A wide range of AI-empowered neurotechnologies may allow disabled people like Arbaugh to regain independence, or give us the ability to erase traumatic memories in patients suffering from PTSD. But it doesn’t take great leaps to envision how these technologies could be abused as well. Law enforcement agencies in the United Arab Emirates have used neurotechnology to read the minds of criminal suspects and convict them based on what they’ve found. And corporations are developing ways to advertise to potential customers in their dreams. Remarkably, both of these things appear to be legal, as there are virtually no laws explicitly governing neurotechnology. All of which makes Nita Farahany’s work incredibly timely. Farahany is a professor of law and philosophy at Duke University and the author of The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology. Farahany isn’t fatalistic about neurotech – in fact, she uses some of it herself. But she is adamant that we need to start developing laws and guardrails as soon as possible, because it may not be long before governments, employers and corporations have access to our brains.

Mentioned:
“PRIME Study Progress Update – User Experience,” Neuralink
“Paralysed man walks using device that reconnects brain with muscles,” The Guardian
Cognitive Warfare – NATO’s ACT
The Ethics of Neurotechnology: UNESCO appoints international expert group to prepare a new global standard

Duration:00:39:36

Can AI Companions Cure Loneliness?

5/7/2024
When Eugenia Kuyda saw Her for the first time – the 2013 film about a man who falls in love with his virtual assistant – it didn’t read as science fiction. That’s because she was developing a remarkably similar technology: an AI chatbot that could function as a close friend, or even a romantic partner. That idea would eventually become the basis for Replika, Kuyda’s AI startup. Today, Replika has millions of active users – that’s millions of people who have AI friends, AI siblings and AI partners. When I first heard about the idea behind Replika, I thought it sounded kind of dystopian. I envisioned a world where we’d rather spend time with our AI friends than our real ones. But that’s not the world Kuyda is trying to build. In fact, she thinks chatbots will actually make people more social, not less, and that the cure for our technologically exacerbated loneliness might just be more technology.

Mentioned:
“ELIZA—A Computer Program For the Study of Natural Language Communication Between Man And Machine” by Joseph Weizenbaum
“elizabot.js,” implemented by Norbert Landsteiner
“Speak, Memory” by Casey Newton (The Verge)
“Creating a safe Replika experience” by Replika
“The Year of Magical Thinking” by Joan Didion

Additional Reading:
“They fell in love with the Replika AI chatbot. A policy update left them heartbroken” (The Globe and Mail)
“Loneliness and suicide mitigation for students using GPT3-enabled chatbots” by Maples, Cerit, Vishwanath & Pea
“Learning from intelligent social agents as social and intellectual mirrors” by Maples, Pea & Markowitz

Duration:00:34:39

Maria Ressa saw the dangers of social media. AI might be worse.

5/7/2024
In the last few years, artificial intelligence has gone from a novelty to perhaps the most influential technology we’ve ever seen. The people building AI are convinced that it will eradicate disease, turbocharge productivity, and solve climate change. It feels like we’re on the cusp of a profound societal transformation. And yet, I can’t shake the feeling we’ve been here before. Fifteen years ago, there was a similar wave of optimism around social media: it was going to connect the world, catalyze social movements and spur innovation. It may have done some of these things. But it also made us lonelier, angrier, and occasionally detached from reality. Few people understand this trajectory better than Maria Ressa. Ressa is a Filipino journalist and the CEO of the news organization Rappler. Like many people, she was once a fervent believer in the power of social media. Then she saw how it could be abused. In 2016, she reported on how Rodrigo Duterte, then president of the Philippines, had weaponized Facebook in the election he’d just won. After publishing those stories, Ressa became a target herself, and her inbox was flooded with death threats. In 2021, she won the Nobel Peace Prize. I wanted this to be our first episode because I think, as novel as AI is, it has undoubtedly been shaped by the technologies, the business models, and the CEOs that came before it. And Ressa thinks we’re about to repeat the mistakes we made with social media all over again.

Mentioned:
“How to Stand Up to a Dictator” by Maria Ressa
“A Shocking Amount of the Web is Machine Translated: Insights from Multi-Way Parallelism” by Thompson et al.
Rappler’s Matrix Protocol Chat App: Rappler Communities
“Democracy Report 2023: Defiance in the Face of Autocratization” by V-Dem
“The Foundation Model Transparency Index” by Stanford HAI (Human-Centered Artificial Intelligence)
“All the ways Trump’s campaign was aided by Facebook, ranked by importance” by Philip Bump (The Washington Post)
“Our Epidemic of Loneliness and Isolation” by U.S. Surgeon General Dr. Vivek H. Murthy

Duration:00:44:42

Introducing Machines Like Us

4/29/2024
We are living in an age of breakthroughs propelled by advances in artificial intelligence. Technologies that were once the realm of science fiction will become our reality: robot best friends, bespoke gene editing, brain implants that make us smarter. Every other Tuesday Taylor Owen sits down with someone shaping this rapidly approaching future. The first two episodes will be released on May 7th. Subscribe now so you don’t miss an episode.

Duration:00:02:37

How Much Should We Worry about the Future of Tech Governance?

4/21/2022
On the season finale of Big Tech, host Taylor Owen discusses the future of tech governance with Azeem Azhar, author of The Exponential Age: How Accelerating Technology is Transforming Business, Politics, and Society. In addition to his writing, Azeem hosts the Exponential View podcast, which, much like this podcast, looks at how technology is transforming business and society. Taylor and Azeem reflect on some of the broad themes that have concerned them this season, from platform governance, antitrust and competition, to polarization, deliberative democracy and Web3. As listeners have come to know, Taylor often views technology’s future through a cautionary lens, while Azeem has a more optimistic outlook. They begin with the recent news of Elon Musk’s attempt to purchase Twitter and what that might mean for the platform. As the episode unfolds, Taylor and Azeem touch on the varied approaches to tech regulation around the world, and how polarization and its amplification via social media are impacting democracy. They discuss Web3’s potential to foster more transparency and trust building on the internet, as well as the need for states to be involved in shaping our future online. Ultimately, there are opportunities to make positive changes at many levels of these complex, multilayered issues. As a concluding thought, Azeem points to the coal industry as an example of how, regardless of political winds, many factors in a system can bring about change.

Duration:00:46:29

All Eyes on Crypto

4/14/2022
In this episode of Big Tech, host Taylor Owen speaks with Ephrat Livni, a lawyer and journalist who reports from Washington on the intersection of business and policy for DealBook at The New York Times. One of Livni’s focuses has been how cryptocurrencies have moved from the periphery of the financial world into the mainstream. The cryptocurrency movement originated with a commitment to the decentralization of money and the removal of intermediaries and government to enable person-to-person financial transactions. Early on, governments viewed cryptocurrency as a tool for illicit criminal activity and a threat to institutional power. In the last two years, cryptocurrency has moved into the mainstream, with sporting arenas named after crypto companies, flashy celebrity endorsements and Super Bowl ads. Crypto markets are extremely volatile, yet they have attracted intense interest from retail investors and venture capitalists; there is a lot of enthusiasm about crypto, but not a lot of information. With crypto moving into the mainstream, companies that wish to build trust with their customers must become more transparent, accept regulations and act more like the institutions they initially sought to disrupt. As Livni and Owen discuss, this is not a sector that regulators can ignore: it is complicated, fast-changing and multinational, and it demands a great deal of thought about how best to proceed.

Duration:00:37:58

Web3 — Technology of Control or Freedom?

4/7/2022
The internet is an ever-evolving thing, with new features and services popping up daily. But these innovations are happening in the current internet space, known as Web 2.0. The potential next big leap is to what is being called Web3 or Web 3.0. You have likely heard some of the terms associated with this next age — the token economy, blockchain, NFTs. Our guest this week walks us through what all this “future stuff” means, and how it could impact our daily lives. In this episode of Big Tech, host Taylor Owen speaks with Shermin Voshmgir, founder of Token Kitchen and BlockchainHub Berlin and the author of Token Economy: How the Web3 reinvents the Internet. Her work focuses on making technology accessible to a non-tech audience to ensure everyone can be part of the decision-making process. Early adopters in the Web3 space see this new iteration of the Web as liberating, an innovation that will decentralize power, facilitate peer-to-peer transactions, enable individual data ownership and challenge the dominance of tech giants. There are many questions about the governance of Web3 and its impact on society that regulators, still stuck on platform content moderation, have not yet begun to examine. The conversation between Taylor and Shermin provides a foundational understanding of Web3 and a look ahead at areas where regulators should focus their attention.

Duration:00:31:30

What Happens If We Live Forever?

3/31/2022
Humanity has long imagined a future where humans could live for hundreds of years, if not forever. But those ideas have been the stuff of science fiction, up until now. There’s growing interest and investment in the realm of biohacking and de-aging, and leading scientists such as Harvard’s David A. Sinclair are bringing the idea of extended lifespans out of fantasy into a reality we may see within our generation. But a world where more people are living a lot longer than ever thought possible will have sweeping economic and social consequences. In this episode of Big Tech, host Taylor Owen speaks with journalist Matthew D. LaPlante, co-author of Lifespan: Why We Age — And Why We Don’t Have To with David A. Sinclair. LaPlante’s focus is on the impacts longer lifespans will have, rather than on the technology involved in achieving de-aging. For example: When people live longer, where do we set the retirement age? Can the planet support more humans? And how will we deal with our past choices when we live long enough to see their impacts on our great-great-grandchildren? In this wide-ranging conversation, Taylor and Matthew discuss more implications longer life would have on our society. In the justice system, appointing a 50-year-old to the Supreme Court looks very different when that person could live to 110 rather than 80. What about geopolitical stability, if autocrats and dictators can extend their lives to maintain power for much longer periods? And what are the implications for medical privacy when technology companies are using monitoring devices, such as the ubiquitous smart watch, in conjunction with artificial intelligence to predict when someone may develop an illness or have a heart attack?

Duration:00:51:21

Borders Matter – Even in Cyberspace

3/24/2022
A fundamental feature of the internet is its ability to transcend borders, connecting people to one another and all forms of information. The World Wide Web was heralded as a global village that would remove the traditional gatekeepers and allow anyone a platform to be heard. But the reality is that access to the internet and online services is very much bound to geography. A benign example is the geographic lockout of content on streaming platforms, which depends on the country you access them from. More extreme examples of how location is tied to internet access occur in authoritarian regimes that limit access during uprisings, filter and block content, and surveil online conversations and then make real-world arrests. In this episode of Big Tech, host Taylor Owen speaks with Nanjala Nyabola, a CIGI fellow, political analyst and author of Digital Democracy, Analogue Politics: How the Internet Era is Transforming Politics in Kenya and Travelling While Black: Essays Inspired by a Life on the Move. Governments have been working on platform governance and content moderation reforms for a few years now, and the need to find solutions and set rules is becoming increasingly important – just look at how misinformation and censorship have been playing out in Russia and other authoritarian states over the last few weeks during the war in Ukraine. In Nyabola’s work on internet governance, she proposes that rather than look for global consensus on regulation, we need to think of the internet as a public good. “Water isn’t administered the same way in Kenya as it is in Uganda, as it is in Ethiopia, as it is in the United States; different municipalities will have different codes. But there is a fundamental agreement that water is necessary for life and should, as far as possible, be administered as a public utility.” Nyabola explains that governing the internet requires first setting out the fundamental aspects that humanity wants to safeguard, and then protecting those common principles while allowing jurisdictions to deliver this public good in their own unique ways.

Duration:00:46:33

Inside the Russian Information War

3/17/2022
The speed at which the Russia-Ukraine war has played out across the internet has led to some interesting insights about how different groups have been experiencing and responding to information and misinformation about it. The West found unity across political divides, and the big tech platforms, breaking their long-held stance, have quickly acted to limit the spread of disinformation by making changes to their algorithms. However, across much of the non-English-language internet, the information ecosystem is very different. Many Russians aren’t even aware that there is a war going on. And technology companies that are discontinuing their operations in Russia as a well-meaning sign of solidarity with Ukraine may be making the problem worse. In this episode of Big Tech, host Taylor Owen speaks with Ben Scott and Frederike Kaltheuner about various aspects of communications technology and the social media platforms that are being used by all sides in the Russia-Ukraine war. We begin with a conversation between Taylor and Ben, the executive director of Reset, on the state of the information ecosystem both inside Russia and around the world. In the second half, Taylor speaks with Frederike, the director of the technology and rights division at Human Rights Watch, about the importance of access to information during wartime in the monitoring and documenting of human rights abuses, as well as the critical role that communications systems play in helping citizens inside conflict zones.

Duration:00:40:21

A History Lesson That Shatters the Mythology of Silicon Valley

3/10/2022
In this episode of Big Tech, host Taylor Owen speaks with Margaret O’Mara, a historian of modern America and author of The Code: Silicon Valley and the Remaking of America. Silicon Valley and the massive wealth it has generated have long symbolized the wonders of free market capitalism, viewed as proof of how innovation can thrive when it is not burdened by government oversight. Silicon Valley is infused with this libertarian ethos, centred on the idea that it was guys in their garages, setting out to create something new and make the world a better place, who built the Valley. But O’Mara looks back into history and says that’s all just a myth. During the Cold War, the United States was looking for ways to bolster its technological advantage over the Soviets. Knowing that state-led projects would appear “Communist” to the American people, the government funnelled federal funding for research and development through universities, research institutions and defence companies. This influx of funds enabled private companies to expand and innovate and universities to subsidize tuition. The Apollo space program offers one such example, where federal funds supported tech companies working in electronic miniaturization and semiconductors. The upshot is that the entire Silicon Valley tech sector was built on government intervention and support, and even the guys in their garages benefited from the access to affordable university education. “To pull yourself up by your bootstraps is an American myth that’s very corrosive — there are very, very few truly self-made people,” explains O’Mara. By demystifying Silicon Valley’s origins we can better approach regulation and oversight of the tech industry.

Duration:00:46:04

Johann Hari Knows You Won’t Be Able to Finish This Episode without Checking Your Phone

3/3/2022
Do you feel as if you can’t get through a single task without distractions? Perhaps you are watching a movie and stop it to check social media or respond to a message. You aren’t alone; studies show that collectively our attention spans have been shrinking for decades. Many factors contribute to our fractured focus, including the processed foods we eat, which cause energy highs and lows, but the greatest culprit of all is technology. In this episode of Big Tech, host Taylor Owen speaks with Johann Hari, the author of three New York Times bestsellers: Stolen Focus, Lost Connections and Chasing the Scream. Hari has been writing about depression, addiction and drugs for many years. Using that as background, he seeks to understand how social media has been changing our ability to focus deeply on important tasks. Hari argues that we must not think of this as a personal failing and charge the individual with finding a way out of this crisis, as we have done with obesity and drug addiction. Instead, society must change its relationship with technology so that we can regain our human ability to focus. Technology has increased the speed at which we work and live; as we try to consume so much information, we begin to focus less and less on the details. Hari compares it to speed reading: “It’s surprisingly effective, but it always comes with a cost, even for professional speed readers, which is the faster you read, the less you understand, the less you remember, and the more you’re drawn to shallow and simplistic documents.” Couple that with the way platforms prioritize certain types of content and you have a recipe for disaster. “Everyone has experienced it. Human beings will stare longer at something that makes them angry and upset than they will at something that makes them feel good,” says Hari. Hari worries that, rather than take collective action, society will put the onus on individuals, much as it has with obesity, where we ignore the wider food supply network and instead sell fad diets and supplements to individuals. “And if you come to the attention crisis the same way [we responded] to the obesity crisis, we’ll get the same outcome, which is an absolute disaster.”

Duration:00:53:37

Early Women Innovators Offer Tech a Way Forward

2/24/2022
In the history of computers and the internet, a few names likely come to mind: Alan Turing, Tim Berners-Lee, Bill Gates and Steve Jobs. Undoubtedly, these men’s contributions to computer sciences have shaped much of our modern life. In the case of Jobs and Gates, their financial success shifted the landscape of software development and the metrics of success in Silicon Valley. Some sectors of the industry, such as programming, hypertext and databases, had been dominated by women in the early days, but once those areas became economic drivers, men flooded in, pushing aside the women. In the process, many of their contributions have been overlooked. In this episode of Big Tech, host Taylor Owen speaks with Claire L. Evans, a musician, internet historian and author of Broad Band: The Untold Story of the Women Who Made the Internet. Evans’s book chronicles the work of women involved in creating the internet but left out of its history. Owen and Evans reflect on several important milestones of the early internet where women were innovating in community building and the moderation of message boards. Evans reveals a little-known history of the early web and the women involved. One aspect that stands out is how the projects that women led focused on building trust with users and the production of knowledge rather than the technical specifications of microprocessors or memory storage. Today, in the face of online harms, misinformation, failing institutional trust and content moderation challenges, there is a great deal we can learn from the work women were already doing decades ago in this space.

Duration:00:45:43

Nicholas Carr Is Silicon Valley’s Most Prescient Tech Critic

2/17/2022
Nicholas Carr has been a prolific blogger, author and critic of technology since the early days of the social web. Carr began his blog Rough Type in 2005, at a time when some of today’s biggest companies were still start-ups operating out of college dorms. In 2010, he wrote The Shallows, a finalist for the Pulitzer Prize in General Nonfiction, in which he discussed how technology was changing the human brain. At the time, many were skeptical about Carr’s argument, but in just over a decade many of his predictions have come true. In this episode of Big Tech, host Taylor Owen and guest Nicholas Carr reflect on how he was able to identify these societal shifts long before others. The social web, known as Web 2.0, was billed as a democratizing tool for breaking down barriers so that anyone could share information and have their voices heard. Carr had concerns; while others saw college kids making toys, he saw the potential for major shifts in society. “As someone who had studied the history of media, I knew that when you get these kinds of big systems, particularly big communication systems, the unexpected, unanticipated consequences are often bigger than what everybody thinks is going to happen,” Carr explains. We are again on the verge of the next online shift, called Web3, and as new online technologies like non-fungible tokens, cryptocurrencies and the metaverse are being built, we can learn from Web 2.0 in hopes of mitigating future unanticipated consequences. As Carr sees it, we missed the opportunity to become involved early on with social platforms, before they became entrenched in our lives. “Twitter was seen as a place where people, you know, describe what they had for breakfast, and so society didn’t get involved in thinking about what are the long-term consequences here and how it’s going to play out. So I think if we take a lesson from that, even if you’re skeptical about virtual reality and augmented reality, now is the time that society has to engage with these visions of the future.”

Duration:00:42:31

Your Facts Aren’t My Facts — Joe Rogan and Our Infodemic Age

2/10/2022
People are divided: you are either pro-vaccination or against it, and there seems to be no middle ground. Whether around the dinner table or on social media, people are entrenched in their positions. A deep-seated mistrust in science, despite its contributions to the flourishing of human life, is being fuelled by online misinformation. For the first time in history, humanity is in the midst of a pandemic with communication tools of almost unlimited reach and potential benefit, yet social media and the information economy appear structured to promote polarization. Take the case of The Joe Rogan Experience podcast on Spotify: Rogan, a comedian, is able to engage millions of listeners and spread, unchecked, misinformation about COVID-19 “cures” and “treatments” that have no basis in evidence. What responsibility does Spotify have as the platform enabling Rogan to spread this misinformation, and is it possible for the scientific community to break through to skeptics? In this episode of Big Tech, host Taylor Owen speaks with Timothy Caulfield, the author of bestselling books such as Is Gwyneth Paltrow Wrong About Everything? and The Vaccination Picture. He is also the Canada Research Chair in Health Law and Policy at the University of Alberta. Throughout the COVID-19 pandemic, Caulfield has been outspoken on Twitter about medical misinformation with the #ScienceUpFirst campaign. What we have learned through the pandemic is how critical clear public health communication is, and how remarkably difficult it is to share information with the public. As everyone rushed to provide medical advice, people were looking for absolutes. But in science, one needs to remain open to new discoveries, so, as the pandemic evolved, guidelines were updated. As Caulfield explains, “I think it’s also a recognition of how important it is to bring the public along on that sort of scientific ride, saying, Look, this is the best advice we can give right now based on the science available.” When health guidelines are presented in a dogmatic way, it becomes difficult to share new emerging research; misunderstood or outdated facts become weaponized by those trying to discredit the public health sector, who point to what was previously known in an attempt to muddy the discourse and sow doubt. And that doubt leads to mistrust in institutions, the rise of “alternative facts,” the sharing of untested therapeutics on popular podcasts — and a convoy of truckers camped out in the Canadian capital to protest COVID lockdowns and vaccine mandates.

Duration:00:48:32

The Entrenched Colonialism of Tech

2/3/2022
Time and time again, we see the billionaire tech founder or CEO take the stage to present the latest innovation meant to make people’s lives better, revolutionize industries and glorify the power of technology to save the world. While these promises are dressed up in fancy new clothes, in reality the tech sector is no different from other expansionist enterprises of the past. Its foundation of growth and expansion is deeply rooted in the European and American doctrines of colonization and Manifest Destiny. And just as in the past, the tech sector is engaging in extraction, exploitation and expansion. In this episode of Big Tech, host Taylor Owen speaks with Jeff Doctor, who is Cayuga from Six Nations of the Grand River Territory. He is an impact strategist for Animikii, an Indigenous-owned technology company. Doctor isn’t surprised that technology continues to evolve in the same colonial way he saw growing up, a mindset built into television shows, movies and video games, such as the popular Civilization franchise, which applies the same European expand-and-conquer strategy to winning regardless of the society a player represents in the game. “You see this manifested in the tech billionaire class, like all of them are literally trying to colonize space right now. It’s not even a joke any more. They grew up watching the same crap,” Doctor says. Colonialism and technology have always been entwined. European expansionism depended on modern technology to dominate, whether through deadlier weapons, faster ships or the laying of telegraph and railway lines across the west. Colonization continues through, for example, English-only development tools, and country-selection dropdowns limited to “Canada” or the “United States” that ignore Indigenous peoples’ communities and nations. And, as governments grapple with how to protect people’s personal data from the tech sector, little attention is paid to Indigenous data sovereignty, which would ensure that every nation and community has the ability to govern and benefit from its own data.

Duration:00:42:14

How Europe Is Trying to Rein in Big Tech

1/27/2022
Governments around the world are looking at their legal frameworks and how they apply to the digital technologies and platforms that have brought widespread disruptive change to their economies, societies and politics. Most governments are aware that their regulations are inadequate to address the challenges of an industry that crosses borders and pervades all aspects of daily life. Three regulatory approaches are emerging: the restrictive regime of the Chinese state; the lax, free-market approach of the United States; and the regulatory frameworks of the European Union, which are miles ahead of those of any other Western democratic country. In this episode of Big Tech, host Taylor Owen speaks with Mark Scott, the chief technology correspondent at Politico, about the state of digital technology and platform regulation in Europe. Following the success of the General Data Protection Regulation, which went into effect in 2018, the European Parliament currently has three big policy proposals in the works: the Digital Services Act, the Digital Markets Act and the Artificial Intelligence Act. Taylor and Mark discuss how each of these proposals will impact the tech sector, their potential for adoption across Europe, and how many other nations, including Canada, are modelling similar regulations within their own countries.

Duration:00:29:48

The Brain Is Not a Computer

1/20/2022
Many mysteries about the workings of the human brain remain unsolved. Neuroscientists are making discoveries that are helping us better understand the brain and correct preconceived notions about how it works. With the dawn of the information age, the brain’s processing was often compared to that of a computer. The problem with this analogy is that it suggested the human brain was hard-wired, able to work in only one particular way, much like a computer chip, and that, if damaged, it could not reroute itself or restore function to a damaged pathway. Taylor Owen’s guest this week on the Big Tech podcast is a leading scholar of neuroplasticity, the ability of the brain to change its neural networks through growth and reorganization. Dr. Norman Doidge is a psychiatrist and author of The Brain That Changes Itself and The Brain’s Way of Healing. His work points to just how malleable the brain can be. Dr. Doidge talks about the brain’s potential to heal but also warns of the darker side of neuroplasticity: our brains adapt to negative influences just as they do to positive ones. Today, the time we spend in front of screens and the ways we interact with technology are having significant impacts on our brains, and on those of our children, affecting attention span, memory and recall, and behaviour. All of these changes have societal implications.

Duration:00:57:17

What Does Real Democracy Look Like?

1/13/2022
Democracy is in decline globally. It’s one year since the Capitol Hill insurrection, and many worry that the United States’ democratic system is continuing to crumble. Freedom House, an American think tank, says that nearly three-quarters of the world’s population lives in a country that experienced democratic deterioration last year. The rise of illiberalism is one reason for this, but another may be that democratic governments simply haven’t been performing all that well in recent years. In this episode of Big Tech, host Taylor Owen speaks with Hélène Landemore, author of Open Democracy and Debating Democracy and professor of political science at Yale University. Landemore’s work explores the limitations of casting a vote every few years for a candidate or political party and how, in practice, that isn’t a very democratic process. “Electoral democracy is a closed democracy where power is restricted to people who can win elections,” she says. Positions on issues become entrenched within party lines; powerful lobbyists exert influence; and representatives, looking ahead to the next election, lack the political will to lead in the here and now. In an open democracy, citizens would be called on to debate issues and create policy solutions for problems. “If you include more people in the conversation, in the deliberation, you get the benefits of cognitive diversity, the difficulties of looking at problems and coming up with solutions, which benefits the group ultimately,” Landemore explains. In response to the yellow vest movement in France, the government asked 150 citizens to come up with climate policies. Over seven weekend meetings, that group came up with 149 proposals on how to reduce France’s greenhouse gas emissions. In Ireland, a group of citizens was tasked with deliberating on abortion, a sensitive issue that was deadlocked in the political arena. The group included pro-life and pro-choice individuals and, rather than descending into partisan mud-slinging, was able to come to the recommendation, after much civil deliberation, that abortion be decriminalized. Landemore sees the French and Irish examples as precedents for further exploration and experimentation: “it means potentially going through constitutional reforms to create a fourth or so chamber called the House of the People or something else, where it would be like a parliament but just made up of randomly selected citizens.”

Duration:00:44:32

From the Beginnings of Fake News to the Capitol Riots

1/6/2022
On the first anniversary of the January 6 insurrection at the United States Capitol, Big Tech host Taylor Owen sits down with Craig Silverman to discuss how the rise of false facts led us to that moment. Silverman is a journalist for ProPublica who previously worked at BuzzFeed News, and he is the editor of the Verification Handbook series. Before Donald Trump popularized “fake news” as a blanket term to attack mainstream news outlets, Silverman had been using it to mean something different and very specific. Fake news, also known as misinformation, disinformation or false facts, is online content that has been intentionally created to be shared on social media platforms. Before it was weaponized as a tool for election interference, fake news was simply a lucrative clickbait market that saw higher engagement than traditional media. And social media platforms’ algorithms amplified it because that higher engagement meant people spent more time on the platforms, which boosted their ad revenue. After establishing the origins of misinformation and how it was used to manipulate the 2016 US presidential election, Owen and Silverman discuss how Facebook, in particular, responded to the 2020 US presidential election. Starting in September 2020, the company established a civic integrity team focused on, among other issues, its role in elections globally, and removed posts, groups and users that were promoting misinformation. Silverman describes what happened next. “After the election, what does Facebook do? Well, it gets rid of the whole civic integrity team, including the group’s task force. And so, as things get worse and worse leading up to January 6, nobody is on the job in a very focused way.” Before long, Facebook groups had “become an absolute hotbed and cesspool of delegitimization, death threats, all this kind of stuff,” explains Silverman. The lie that the election had been rigged was spreading unchecked via organized efforts on Facebook. Within a few weeks of the civic integrity team’s dismantling, Trump’s supporters arrived on Capitol Hill to “stop the steal.” It was then, as Silverman puts it, that “the real world consequences came home to roost.”

Duration:00:44:12