Q&A with Nora Young, host of the CBC's Spark
Nora Young is the host and creator of Spark, a fun and informative CBC Radio show about technology and how it is changing our relationships, our work, and our culture.
Young will be moderating our Fall 2018 Wall Exchange with Rosalind Picard, who will describe how computers are now able to measure and respond to our emotions, and the implications of this new technology.
In preparation for the Wall Exchange, the Peter Wall Institute for Advanced Studies (PWIAS) caught up with Young for a Q&A about big data, science-fiction and the future of AI.
PWIAS: Spark has done many episodes on AI. Some recent ones that stand out include the one on AI for crime forecasting and the ethical concerns behind it, and the one on Blade Runner, which I particularly liked because it gets into the discussion of consciousness.
Could you tell me about one episode that stands out to you?
Nora Young: Well, this is not one episode, but one issue we return to. That is the problem of bias and lack of transparency. Machine learning relies on large datasets to ‘train’ the AI, but as several of our guests have pointed out, datasets are designed by humans and can contain unintentional bias. To cite a recent example, Amazon was using AI aimed at finding talented new hires by scanning the internet for candidates who matched certain patterns. The dataset it had been trained on was 10 years of historical data about successful Amazon candidates. But that historical data was overwhelmingly about male candidates, which skewed the patterns and qualifications the AI searched for, so it was (unintentionally) biased against women.
If you add to this the challenge of designing machine learning that can offer an ‘explanation’ for how it’s come up with its results, you have a recipe for disturbing bias in systems that are increasingly relied on for everything from credit scores to job interviews.
PWIAS: What is your favourite AI-themed movie and what about it appeals to you?
Nora Young: I thought Ex Machina, although very much in the realm of science fiction, was a terrific exploration of some of the ethical questions that may one day emerge about AI. This is what’s so interesting about the area of affective computing. Humans really do want to connect to AI, to robots. We tend to anthropomorphize and get attached to them. Even if robots never achieve anything like consciousness, even far in the future, the fact that we will likely feel like they have consciousness raises issues for us. How will we relate to them? How do we ensure they’re not designed to manipulate us emotionally, for example?
PWIAS: What are some of the biggest misconceptions around AI that you’ve come across?
Nora Young: There’s a lot of over-estimation of what AI can do, which I think is not helped by companies that throw around the language of “our proprietary algorithm”. A lot of what AI is used for now is really looking at probability, correlation, and risk assessment. Yes, it’s amazing that we can ask Siri or Google Assistant a question, but even at that level, they fairly often get things wrong.
PWIAS: I was recently reading about a Swedish company that is inserting microchips into people’s thumbs. This allows people to unlock their building doors, pay for things and even take trains just by pressing their thumbs into electronic readers. What do you think the future of wearable AI will look like?
Nora Young: We are doing a story on Spark about that Swedish project! In terms of wearables, there is so much miniaturization happening that I can see a future where we have little bits of networked technology embedded with us as we go, for example through smart clothing, and of course the Internet of Things.
As far as wearable/portable AI goes, I can imagine a sort of Siri 3.0 world, where we have an AI-powered bot that’s like a personal assistant. Rather than just being something you access on your phone or smart speaker, it becomes something like your interface with the entire digital world.
PWIAS: In your book The Virtual Self you talk about the challenges around virtual data-sharing and its potential for building more responsive communities and governments. Do you see something similar happening with the data collected through emotionally cognizant AI? What do you think that particular data could be used for?
Nora Young: You ask tough questions! Mental health is such an important area; I can imagine that data being used to further mental health research. We are already seeing the value of Big Data in (physical) health, for example, in our ability to draw insights from large genetic datasets.
PWIAS: Spark has also produced segments on AI and health, like the one on urban design and obesity. As you know, Rosalind Picard’s Fall 2018 Wall Exchange lecture will be about affective computing and its impact on wellbeing. What are you most looking forward to when it comes to her lecture?
Nora Young: Partly, I’m just looking forward to meeting her! I think this idea of how we use affective computing to support health and well-being is fascinating. My pessimistic side fears the potential for affective computing to manipulate people (because of our tendency to anthropomorphize the technology), so I’m interested in learning what she has to say about the ethics of affective computing.
Photo courtesy of CBC Media Centre
Fall 2018 Wall Exchange with Rosalind Picard
Rosalind Picard will deliver the Wall Exchange lecture on Monday, November 5, at the Vogue Theatre in Vancouver.
What: Rosalind Picard: Emotional Intelligence in a Brave New Robotic World | Fall 2018 Wall Exchange
When: Monday, November 5, 2018, 7 p.m.
Where: The Vogue Theatre, 918 Granville St., Vancouver, B.C.