Johanna Blakley

Media | Entertainment | Fashion

Archive for artificial intelligence

The Unintended Consequences of Technology


An article of mine on the “Technologies of Taste” has just come out in Technology & Society, a publication of the Institute of Electrical and Electronics Engineers (IEEE). It’s a fascinating special issue exploring the “Unintended Consequences of Technology.” As guest editor Ramona Pringle explained to me, the focus wasn’t on “the dark side” of tech, but rather the complicated nature of our increasingly connected lives.

The call for papers, however, emphasized the danger of not carefully examining our relationship to new technology:

With all great innovation comes responsibility; and with the exponential growth of technology, the window within which we can examine the ethics and consequences of our adoption of new technologies becomes increasingly narrow. Instead of fear mongering, how do we adjust our course, as a society, before it is too late?

My piece explores the role that recommendation systems play in our online pursuits of knowledge and pleasure. How is our personal taste affected by finely tuned commercial algorithms that are optimized to sell us products and monetize our attention? While Eli Pariser and others have argued that these systems place us in “filter bubbles” that insulate us from new ideas, I argue that companies like Google, Amazon and Netflix have strong commercial incentives to develop recommendation systems that broaden their customers’ horizons rather than limiting them, effectively bursting filter bubbles rather than reinforcing them.

This argument couldn’t be more timely, considering how dramatically concerns about filter bubbles grew during the last presidential election cycle. What complicates the debate about filter bubbles is that each site — whether it’s primarily an e-commerce, social media, search or content platform — has very different goals in mind and different proprietary algorithms in place to achieve them. I hope this article triggers a more thoughtful conversation when people claim that ideological insularity is the obvious outcome of filtering and recommendation technology.


Deflating the Filter Bubble


I was asked recently to speak at a symposium on Media Choices at Drexel University. The event drew a fascinating array of scholars who were studying things like Internet addiction, online dating, and political polarization in media consumption.

When someone mentions “media choice” to me, I automatically start thinking about the algorithms that have been developed to help shape that choice.

I have avidly followed the growing use of recommendation systems on sites like Amazon, Netflix, YouTube and Pandora. I saw these mechanisms as a significant move away from demographic marketing (which I find deeply flawed) toward marketing based on customer taste.

I did have my reservations, though. I was very moved by Eli Pariser’s TED talk about the danger of “filter bubbles,” which effectively insulate us from opinions and content that we don’t understand or like. His talk really resonated with me because of the deeply divided ideological and taste communities that I found in a major survey research project I conducted on the correlation between entertainment preferences and political ideology (spoiler: they are even more deeply connected than you might think).

But, when I conducted further research about collaborative filtering systems, I made some rather counter-intuitive discoveries. YouTube, for instance, found that “suggesting the videos most closely related to the one a person is already watching actually drives them away.”

Of course, YouTube’s goal is to get you to sit and watch YouTube like you watch TV: to lean back and watch a half hour to an hour of programming, rather than watching for two minutes, getting frustrated trying to find something else worth watching and then going elsewhere. So, in short, it’s in YouTube’s best interest to introduce some calculated serendipity into their recommendations.
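That idea of “calculated serendipity” can be sketched as a simple re-ranking step: instead of returning only the items most similar to what someone just watched, a recommender reserves a few slots for deliberately less-similar picks. This is a hypothetical illustration, not YouTube’s actual algorithm; the similarity scores, the catalog, and the `serendipity_slots` parameter are all my own assumptions.

```python
import random

def recommend(scored_items, k=5, serendipity_slots=2, seed=0):
    """Return k recommendations: mostly top-similarity picks, plus a few
    deliberately less-similar items drawn from the long tail.

    scored_items: list of (item, similarity) pairs, similarity in [0, 1].
    """
    rng = random.Random(seed)
    ranked = sorted(scored_items, key=lambda pair: pair[1], reverse=True)
    # Fill most slots with the closest matches...
    top = [item for item, _ in ranked[:k - serendipity_slots]]
    # ...then sample the rest from everything below that cutoff.
    tail = [item for item, _ in ranked[k - serendipity_slots:]]
    surprises = rng.sample(tail, min(serendipity_slots, len(tail)))
    return top + surprises

catalog = [("cat video", 0.95), ("kitten video", 0.93), ("dog video", 0.80),
           ("cooking show", 0.40), ("jazz concert", 0.30), ("documentary", 0.25)]
print(recommend(catalog, k=4, serendipity_slots=2, seed=1))
```

With `serendipity_slots=0` this degenerates into the pure “more of the same” recommender that, per the quote above, actually drives viewers away; the tunable slot count is the knob a platform could turn to trade relevance against discovery.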

Embracing Blur


Z Holly, the former vice provost for innovation at USC and host of the first TEDx ever, sure knows me well. A newcomer to the prestigious TTI/Vanguard Board, Z thought I would be a good fit for their next conference on Embracing Blur.

Um, she couldn’t have been more correct. I have long been fascinated by the interplay between representations and reality (my last TEDx talk dealt with this pretty directly). And I’d venture to say that the majority of my work at the Lear Center explores the cultural and commercial ramifications of this blur.

What Z didn’t know was that my dissertation was actually about “betweenness” – something I saw as a key formal and thematic characteristic of avant-garde modernism. Many of my friends and colleagues wondered how a high-theory English PhD ended up in a think tank studying the impact of media, but it all seems quite rational to me: isn’t the key formal and thematic characteristic of 21st century media the blur between representation and reality? What we considered avant-garde in literary Paris at the turn of the 20th century is the (often unacknowledged) cultural dominant of contemporary global pop culture.

And so the description of the TTI/Vanguard program couldn’t have been more appealing to me:

A flood of technologies is washing away traditional boundaries between work and play, companies and governments, war and peace, near and far, virtual and physical, society and the individual. In its wake, a global nervous system is emerging as we connect billions of people with each other and with billions of newly smart objects. This unbounded organism is developing an unsurpassable intelligence, resistant to human control. Where is it taking us? Can we hope to understand it, control it, contain it?

Z had to warn me, though: there’s one thing about this conference that is very atypical. Every attendee (and there are over 100 of them) has a mic and can interrupt you at any point during your presentation.

This wouldn’t be quite so nerve-wracking if you didn’t know that the crowd would be composed of carefully vetted C-level folks from Fortune 100 companies and an engaged board that includes Alan Kay, Eric Haseltine, Gordon Bell, Nicholas Negroponte, and John Perry Barlow (never a guy to sit back and listen to anything he thinks is bullshit).