Hello,

TikTok's addictive algorithm keeps you watching, sometimes for hours on end. Yet very little is understood about how TikTok's personalised "For You" page works, and what risks and harms might emerge for its users as a result. When Mozilla studied how YouTube's recommendations work, we found they spread disinformation and dangerous videos to users.

Knowing what, how, and why TikTok recommends videos is critically important, but TikTok doesn't provide robust tools for Mozilla's researchers or others in civil society to study the platform. We want to build a new, open source mobile app that will let TikTok users volunteer their watch history so that researchers can study how TikTok's algorithm works. Our goal is to raise $100,000 in the next few weeks to kickstart development of this TikTok Reporter app.

Please contribute $25 to Mozilla – or whatever amount makes sense to you today – so that we can start to build an open source TikTok Reporter app that will let users volunteer their TikTok history for researchers around the world to study. This tool will help uncover what videos TikTok shows users, how it recommends them, and what effects those videos have on users.

Donate →

It took TikTok half as long as Facebook to reach 1 billion users. It's a huge platform with unprecedented growth, and its influence only keeps growing, yet Internet experts and researchers have been stymied in learning how it works.

Mozilla built a similar tool that allowed YouTube users to donate their watch history so researchers could understand how that platform recommended content. Nearly 40,000 people volunteered their YouTube accounts to be studied by our researchers, which led to important discoveries: YouTube's recommendations often contained dangerous, radicalising content. After we released this information, YouTube was forced to become more transparent about its recommendation algorithms.
YouTube started publishing information about the kinds of content its algorithm recommends, and committed to doing more to reduce harmful content in those recommendations. Now we want to do the same for TikTok, and give researchers around the world access to the data too.

We already know that recommendation algorithms serve content designed to maximise engagement with a platform, and we know that kids and other people at risk can be harmed by that content. Unfortunately, tech companies don’t make it easy to look into what makes their algorithms tick, even though the consequences are very real.

By building our TikTok Reporter app, we'll help the public learn more about how TikTok feeds videos to its users, and push TikTok to be more transparent about its recommendation algorithm. Can you add a contribution to build this tool?
Please contribute $25 today so that we can start to build our app that will uncover critical information about how TikTok's video algorithm works. Our goal is to raise $100,000 in the next few weeks to start building this app, and your contribution will go a long way toward reaching that goal.

Donate →

Thank you for all you do for an open Internet,

Your friends at the Mozilla Foundation