Outputs vs. The Machine: We must change the conversation about the tech giants’ problems if we want solutions
When a Facebook video of doctors sharing bad coronavirus information reached 20 million people last week, you could almost predict the reaction. The company's critics seemed shocked, pointing out it had made little progress since a similar video — Plandemic — went viral in May. They put forth the standard demand for better content moderation. But by then it was too late.

The cycle where a tech company messes up, critics point it out, and then it happens again has repeated itself for years. To end this frustrating dance, we need a new way of discussing these companies' problems. We should focus on the systems causing them, not simply the fact that they happened. It's more difficult. But without a conversation about the fundamental structures underneath the surface, we'll keep seeing the same bad things repeat.

When considering the tech giants' problems, it's useful to split them into two components (I'm cribbing a bit from a post I wrote last year): Outputs and The Machine. Outputs are the bad things the tech platforms spit out: things like viral misinformation, violent images, and outrage. The Machine is the way the systems are designed, the fundamental structure producing the Outputs.

Tech companies want us talking about Outputs. Conversations about Outputs become conversations about policy and enforcement, which means the core product escapes scrutiny. A conversation about Facebook's enforcement actions against Plandemic, for instance, gives the system that helped spread it a pass. So Facebook leaves the fundamentals in place. Then it happens again.

Conversations about The Machine, though more uncomfortable for tech companies, bring us closer to solutions. The Plandemic discussion, though necessary, missed an opportunity in this regard. Research shows that making people pause before sharing something makes them less likely to pass along misinformation. Yet the Plandemic discussion ignored the Share button. It was all Outputs, no Machine. A repeat was inevitable.

"I'd be in favor of testing an intervention where anytime anyone clicks Share for anything related to Covid-19, they are then asked how accurate they think the headline is, and then re-asked if they still want to share it," David Rand, an MIT professor and leading researcher in this area, told me. Without tests of this nature, which get at the heart of The Machine, the issues will persist.

This isn't simply a theory. WhatsApp shows how effective a focus on The Machine can be. Because the platform is encrypted, we can't talk about its Outputs, so the conversation turns to the system's design rather than content moderation. After signs emerged that coronavirus misinformation was spreading on the service, that conversation focused on The Machine, and WhatsApp put severe limits on message forwards (its version of the Share button). The spread of "highly forwarded" messages then dropped 70%. Though we can't see exactly what happened inside, the change forced people to be more thoughtful about what they pass along. That's a good thing.

The advertisers behind the Stop Hate For Profit ad boycott of Facebook would've been perfect messengers here. They depend on The Machine, especially the Share button, to give their paid posts more reach. Yet they led with content moderation demands, a losing battle. "It just doesn't seem like anybody was interested in the actual solution," one digital advertising professional told me.
Editors, for their part, would do well to look past the appeal of writing about Outputs and consider the broader picture. There's an endless well of these stories, and they spark outrage and generate traffic, making them somewhat irresistible. Yet they accomplish little in the long run unless accompanied by changes to The Machine.

"We do a lot of training with journalists who say 'I hear everything you're saying, but unless you get my editor in this room and train them up, there's no point, because everything you're telling me, I try to say that in editorial meetings and I get overruled, because I've got an editor with a metrics target for this month and a good, juicy disinfo story will make me hit that target,'" Claire Wardle, the head of strategy and research for First Draft News, which fights misinformation, told me.

Beyond looking at the platforms' mechanisms, we should also examine the broader societal context that produces the Outputs. "The absence of the public in these conversations has been really troubling," Wardle said. "There's only so much we can do if society is not part of these conversations."

When Congress called in the tech giant CEOs last week, one of the most telling (yet overlooked) exchanges occurred when Rep. David Cicilline asked Mark Zuckerberg how Facebook could let the coronavirus misinformation video get 20 million views in five hours. Zuckerberg's answer said it all. "Well Congressman," he replied. "A lot of people shared that."

Microsoft, TikTok, and Satya Nadella

Microsoft's TikTok interest initially confused the company's rank and file and its alumni. "Why not go full force on all things enterprise?" one ex-employee told me. "That's their bread and butter." Others inside the company echoed the sentiment.

The move seems strange on the surface, but consider Satya Nadella's background and it makes sense. When Nadella took over as Microsoft CEO in 2014, he had worked inside the company for two decades. Over that span, he watched as Microsoft made money from its core asset — Windows — while letting the future pass it by. Though Microsoft owned the dominant desktop operating system, it somehow missed mobile. Steve Ballmer, its then-CEO, even laughed at the iPhone. The experience taught Nadella an important lesson: never overvalue the present, never undervalue the future. (For the inside story, check out Always Day One.)

Though Microsoft is flying high thanks to Nadella's investments in cloud and mobile, its success today does not ensure ongoing success. TikTok, with its machine learning-based recommendations and obsessive user base, seems like as good a bet on the future as any, one President Trump made possible under extraordinary circumstances. So Microsoft is going for it.

Divisions

I'm in Seattle. Not so much to report on Microsoft's dealings with TikTok, but because after coming up here in 2018 to report on Amazon and Microsoft for Always Day One, the trip has become a summer tradition. Concerned about flying, I drove up through Northern California and the Pacific Northwest (Crater Lake pictured above) and got a chance to see how these areas are responding to the coronavirus.

It was strange. As many who've recently traveled in this country will tell you, it's difficult to tell which regions are pro-mask or anti-mask. You can get a dirty look anywhere for whichever approach you choose. Usually our divisions sit underneath the surface, but now there's a physical manifestation.
Seeing this while the president works to delegitimize our upcoming election before it happens is not reassuring. It makes one wonder what we're heading toward in November.

See you next Thursday.