It took a while, but Southeast Asia's autocrats found a way to subjugate social media. Whereas Facebook's co-founder and CEO Mark Zuckerberg thought social networks would become "a force for peace in the world" and build a "common global community", his platform stands accused of being "complicit" in censorship on behalf of the Vietnamese Communist Party in Vietnam, a profitable market of 90 million people.
Cambodian Prime Minister Hun Sen has also jailed critics, but he's largely drowned out rivals by becoming Facebooker-in-chief, swamping critical messages with his own propaganda. At one point, in 2018, he was reported to be the fourth-most-followed leader in the world on the platform, although how much of that following was paid for is a matter of debate. He has since moved on to Telegram, a messaging app, where his personal channel bypasses traditional sources of information to spoon-feed agitprop to the masses.
It's conceivable that the region's autocrats will break even less of a sweat taming new large language model technologies like ChatGPT. Cambodia's Ministry of Post and Telecommunications has said it's been using the popular artificial intelligence technology for several months to speed up administrative work. The Ho Chi Minh City Department of Information and Communications has tasked experts with thinking of ways it can help governance.
There’s a temptation to think adoption of these technologies will be mundane: they'll be used to write up government policy papers and social media updates, and to help officials draft documents appealing for foreign donor support. They could even be democratizing; the average Thai or Cambodian could collate all of their government's ills with a single leading question.
Beijing, after all, has blocked local access to ChatGPT and warned Chinese internet users that it's a way for the United States to "spread false information". But it's difficult to see how ChatGPT increases the amount of information available to netizens when its purpose is to simplify and condense what's already online.
Autocrats co-opt technology
But the lesson of social media is that autocrats find a way to co-opt new digital technology. "This tool [ChatGPT] is going to be the most powerful tool for spreading misinformation that has ever been on the internet," Gordon Crovitz, a co-chief executive of NewsGuard, a company that tracks online misinformation, has opined.
There are a few paths forward. One is to impose a moratorium on the technology's development. That was the recent open-letter appeal signed by more than 15,000 tech luminaries, including Elon Musk and Steve Wozniak, the Apple co-founder.
But a moratorium would be the decision either of the companies themselves or of the U.S. government, since these large language model technologies are almost exclusively America-based. Beijing appears to be falling behind on this front, perhaps as a result of Washington’s campaign to deny China the most advanced semiconductors. Another option is to let the chips fall where they may and then reassess once the dangers of the technology are manifest – the likely way things will develop.
One result of all of this, it has been suggested, will be the revival of traditional "gatekeepers" of information, from newspapers to academic journals. Rather than swelling the sewer of disinformation, it could return us to the status quo ante of life before social media, when most people got their daily dose of information from newspapers and TV stations. In the future, we will not be able to trust any image or video – because of "deepfakes" – unless it's available on Getty Images. There may be greater doubt about the veracity of any news story that isn't published by one of the giants of media, such as Reuters or the New York Times.
Corruption, vice and brutality
That may be a decent outcome for the West. But what about the autocracies in Southeast Asia?
To take just one problem, consider the absence of “written truths”. So much of what happens in authoritarian countries is not publicly available – or, at the very least, is buried beneath an avalanche of counter-information – even though it is known to many people through rumor and gossip, now spread primarily on social media.
These are the sorts of tales of corruption, vice and brutality that go unreported by state-run or ruling party-aligned newspapers. It’s the sort of information that can land you decades in prison, or get you branded a traitor, for repeating it online. It’s the sort that autocratic regimes now pressure Facebook or Google to remove from their platforms.
"Rumors as unsubstantiated information exist in almost every society, but can be particularly prevalent in authoritarian countries due to their restrictions on independent news media," states a particularly interesting essay on rumors in Chinese politics, published in the British Journal of Political Science. It added: "rumors in a non-democracy are an alternative form of media that directly competes with official information and mainstream media, and therefore constitutes a counter-power against official power."
I asked ChatGPT (not the latest edition) the question, “Can ChatGPT detect rumors in authoritarian countries?”, and it provided an interesting response. “Detecting rumors in authoritarian countries can be a complex task that requires access to a wide range of sources of information and knowledge of the political and social dynamics of the country in question,” I was informed. One approach, it added, is to rely on trusted sources of information, like journalists and international organizations. Another, it said, is to “use computational methods to analyze” social media and other online platforms for patterns of misinformation and disinformation.
Indeed, machine learning techniques can “identify and track the spread of rumors and other false information." ChatGPT assumes rumors to be, by definition, “false information”. But the clincher was its final sentence. “Ultimately,” the program told me, “the most effective way to combat rumors and misinformation is through a commitment to open and transparent communication, and a dedication to upholding the principles of free speech and access to information.”
Endless reams of propaganda?
But that’s the circular problem. Technology like ChatGPT will lead to less open and transparent communication. Just imagine the opportunities for ministries and propagandists when they can produce endless reams of AI-written content.
These large language model technologies will allow autocratic regimes to inundate society with their propaganda, swelling the swamp of information that ChatGPT feeds on. After all, the information it processes is mainly that which is published, which in authoritarian states means mostly by state-aligned newspapers or through government social media channels.
The nuance will be lost. Will it know which social media feeds publish rumors that end up being proven true? Will it balance one article from an independent newspaper with one hundred published by a state-run newspaper? How much will a foreign-produced technology understand local sensitivities?
It will also further incentivize authoritarian regimes to close down independent news outlets, so that they simply cannot publish alternative information for ChatGPT to condense for readers.
Intentionally or not, the technologies will end up regurgitating propaganda.
David Hutt is a research fellow at the Central European Institute of Asian Studies (CEIAS) and the Southeast Asia Columnist at the Diplomat. As a journalist, he has covered Southeast Asian politics since 2014. The views expressed here are his own and do not reflect the position of RFA.