Toutiao is a Chinese news personalization app which uses artificial intelligence (AI) to select what news items to present to its 120 million daily active users. Media companies can set up their own Toutiao accounts, and over 350,000 do. A rough translation of Toutiao is “Today’s headlines.”
According to a research article in the Columbia Journalism Review (CJR), Toutiao is so popular in China that you can say “toutiaoization” to mean algorithmic news personalization. Even Chinese state-controlled media use toutiaoization. Add that 99% of news reading in China happens on mobile phones, and you see how Chinese media has dealt with the tension between media and tech platforms for longer, and more deeply, than we have in the West.
There is no Toutiao in the West, but the app was the first successful product from ByteDance, owner of TikTok, another app which uses AI to personalize what content to show you. If you’ve ever used TikTok, you know its recommendations are magic, keeping you glued to the screen for hours.
Weapons of Mass Personalization
When the US government gave ByteDance a deadline to sell TikTok, the Chinese government told the company to seek its approval before selling. The sale of TikTok did not come to pass in the end, and ByteDance is now offering its AI technology as a separate service to other companies. Some Chinese companies like carmaker Geely use the technology in their own products.
That could mean that news companies anywhere in the world will one day use the same technology that made TikTok famous, and its news counterpart Toutiao the leader in news personalization in China. Perhaps your favorite news outlet will also publish gems like the shark that underwent rectum surgery, or the nervous student who lost his ID card twice before a national exam.
Many media outlets do want that. As Verb’s long-suffering reporters can attest, it’s always the most ridiculous stories that get the most traffic. A reporter can work for six months on an investigation only to see the shark that ate too much get more traffic.
It’s not as simple as setting goals different from traffic. According to the CJR article, state-owned media in China doesn’t have traffic goals like its commercial counterparts, but it still wants personalization technology because it helps achieve its own engagement goals. No doubt algorithmic personalization is effective, but how do you prevent its downsides?
Lessons from Science-Fiction
“The Lifecycle of Software Objects” by Ted Chiang offers a sobering moral. It’s the story of how AI entities need as much nurturing as any human child to become truly useful for society. Once that happens, you might as well give them rights. In Chiang’s version of the AI race, trying to speed up the learning process results in obsessive entities which aren’t truly intelligent.
Chiang’s book is not easy to find right now, but Kim Stanley Robinson’s “Aurora” has a similar message. A character in that story spends years training “Ship,” a starship’s onboard AI, to keep a journal of their 150-year trip to Tau Ceti. At first the AI struggles to understand the concept of narrative, but you end up loving Ship.
What Did Father Say?
You get the idea: AI might be just as hard or perhaps even harder to educate than humans. Father of computing Alan Turing said, “if a machine is expected to be infallible, it cannot also be intelligent.” He also proved there is no solution to the “halting problem,” which (very roughly) means that no algorithm can decide, in general, whether another program will ever finish running. In other words, there are no judgement calls in computer city.
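For the curious, Turing’s halting-problem argument can be sketched in a few lines of code. This is a toy illustration, not his formal proof, and all the names here (`paradox_for`, `claimed_halts`) are my own: assume someone hands you an oracle that claims to predict whether any program halts, then build a program that does the exact opposite of whatever the oracle predicts.

```python
def paradox_for(claimed_halts):
    """Given a claimed halting oracle, build a program it must misjudge.

    claimed_halts(f) is supposed to return True if f() eventually halts.
    """
    def paradox():
        if claimed_halts(paradox):
            while True:      # oracle said "halts" -> loop forever instead
                pass
        return "done"        # oracle said "loops" -> halt immediately instead
    return paradox

# Try an oracle that predicts "this program loops forever":
always_says_loops = lambda f: False
p = paradox_for(always_says_loops)
print(p())  # prints "done" -- the program halts, so the oracle was wrong
```

Whatever the oracle answers, `paradox` contradicts it, so no such oracle can exist. That is the sense in which a computer cannot make this judgement call.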
If training a computer to make proper editorial decisions is impossible, then you are better off training humans in the proven method to judge what’s news: work in a newsroom. And once there, follow George Orwell’s tips, including “break any of these rules sooner than say anything outright barbarous.”
Others do believe that computers will eventually be able to perform all tasks. When asked which tasks humans will do better than computers, David Roberts of Singularity University said “none.” And he added: we need to figure out what things we don’t want computers to do, even if they are better at them than us. As you probably know, at Verb we think that journalism is something you should only trust humans with.