Johanna Blakley

Media | Entertainment | Fashion

Archive for recommendation engines

The Unintended Consequences of Technology


An article of mine, “Technologies of Taste,” has just come out in Technology & Society, a publication of the Institute of Electrical and Electronics Engineers (IEEE). It’s a fascinating special issue exploring the “Unintended Consequences of Technology.” As guest editor Ramona Pringle explained to me, the focus wasn’t on “the dark side” of tech, but rather the complicated nature of our increasingly connected lives.

The call for papers, however, emphasized the danger of not carefully examining our relationship to new technology:

With all great innovation comes responsibility; and with the exponential growth of technology, the window within which we can examine the ethics and consequences of our adoption of new technologies becomes increasingly narrow. Instead of fear mongering, how do we adjust our course, as a society, before it is too late?

My piece explores the role that recommendation systems play in our online pursuits of knowledge and pleasure. How is our personal taste affected by finely tuned commercial algorithms that are optimized to sell us products and monetize our attention? While Eli Pariser and others have argued that these systems place us in “filter bubbles” that insulate us from new ideas, I argue that companies like Google, Amazon and Netflix have strong commercial incentives to develop recommendation systems that broaden their customers’ horizons rather than limit them, effectively bursting filter bubbles rather than reinforcing them.

This couldn’t be a more timely argument, considering how dramatically concerns about filter bubbles grew during the last presidential election cycle. What complicates the debate about filter bubbles is that each site — whether it’s primarily an e-commerce, social media, search or content platform — has very different goals in mind and different proprietary algorithms in place to achieve them. I hope this article prompts a more thoughtful conversation the next time someone claims that ideological insularity is the obvious outcome of filtering and recommendation technology.



Talking About the Culture of Technology with Katina Michael

I had a wonderful conversation with Katina Michael, professor of Information Systems and Technology at the University of Wollongong, and editor in chief of IEEE Technology and Society Magazine. She wanted to have a chat about my forthcoming article on “Technologies of Taste,” which explores the social impact of recommendation engines. But the conversation ranged far beyond that topic, touching on the behavioral biometrics of game play, the privacy implications of Samsung TVs that can listen to your conversations, and the attention economy as a “zero sum game.” Clearly, I may just have to fly to Australia to continue this conversation in person.


Deflating the Filter Bubble


I was asked recently to speak at a symposium on Media Choices at Drexel University. The event drew a fascinating array of scholars who were studying things like Internet addiction, online dating, and political polarization in media consumption.

When someone mentions “media choice” to me, I automatically start thinking about the algorithms that have been developed to help shape that choice.

I have followed avidly the growing use of recommendation systems that you see on sites like Amazon, Netflix, YouTube and Pandora. I saw these mechanisms as a significant move away from demographic marketing (which I find deeply flawed) to marketing based on customer taste.

I did have my reservations, though. I was very moved by Eli Pariser’s TED talk about the danger of “filter bubbles,” which effectively insulate us from opinions and content that we don’t understand or like. His talk really resonated with me because of the deeply divided ideological and taste communities that I found in a major survey research project I conducted on the correlation between entertainment preferences and political ideology (spoiler: they are even more deeply connected than you might think).

But, when I conducted further research about collaborative filtering systems, I made some rather counter-intuitive discoveries. YouTube, for instance, found that “suggesting the videos most closely related to the one a person is already watching actually drives them away.”

Of course, YouTube’s goal is to get you to sit and watch YouTube like you watch TV: to lean back and watch a half hour to an hour of programming, rather than watching for two minutes, getting frustrated trying to find something else worth watching, and then going elsewhere. So, in short, it’s in YouTube’s best interest to introduce some calculated serendipity into their recommendations.
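To make “calculated serendipity” concrete, here is a minimal sketch of the general idea: instead of filling every slot with the most similar items, a recommender can reserve a fraction of slots for picks drawn from further down the similarity ranking. This is a generic illustration I wrote for this post — the function name, parameters, and toy catalog are my own invention, not YouTube’s actual algorithm, whose internals are proprietary.

```python
import random

def recommend(candidates, k=10, serendipity=0.2, seed=None):
    """Pick k items: mostly the highest-similarity candidates, with a
    fraction of slots reserved for 'wildcard' picks from the long tail.

    candidates: list of (item, similarity_score) pairs.
    serendipity: fraction of the k slots given to less-similar items.
    """
    rng = random.Random(seed)
    ranked = sorted(candidates, key=lambda pair: pair[1], reverse=True)
    n_wild = max(1, int(k * serendipity))  # at least one serendipitous slot
    n_top = k - n_wild

    top = [item for item, _ in ranked[:n_top]]
    # Draw wildcards from items a pure similarity ranking would skip.
    tail = [item for item, _ in ranked[n_top:]]
    wild = rng.sample(tail, min(n_wild, len(tail)))
    return top + wild

# Toy catalog: 50 videos with descending similarity to the current one.
catalog = [("video_%d" % i, 1.0 - i * 0.01) for i in range(50)]
picks = recommend(catalog, k=10, serendipity=0.3, seed=42)
```

With `serendipity=0.3`, three of the ten slots go to videos outside the top seven by similarity — a crude stand-in for the balance a real system would tune against watch-time data.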

The Internet Knows What You Want

When Juliet Webster, a professor at the Open University of Catalunya, shared this cartoon with me, I couldn’t get over how well it captures the calculated, seductive power of the Internet. Like the most effective femme fatale, the Internet is increasingly optimized to give us what we want . . . even when we didn’t know we wanted it.

Oh what I’d give for an essay by Lacan on desire and recommendation engines . . .