Friday, March 31, 2023

Algorithms, Disinformation, and the Deep State

At Tablet, Jacob Siegel has launched a 12,000-word attack on what he calls the "hoax of the century," which is the government's war on disinformation. I did not read the whole thing but I think I read enough to get the general idea. Working together with search engines and social media sites, they – whoever "they" are – are censoring what we can read and trying to control our thoughts in the name of fighting a bogus threat:

The crime is the information war itself, which was launched under false pretenses and by its nature destroys the essential boundaries between the public and private and between the foreign and domestic, on which peace and democracy depend. By conflating the anti-establishment politics of domestic populists with acts of war by foreign enemies, it justified turning weapons of war against Americans citizens. It turned the public arenas where social and political life take place into surveillance traps and targets for mass psychological operations. The crime is the routine violation of Americans’ rights by unelected officials who secretly control what individuals can think and say.

While Siegel's essay is over-the-top bordering on madness – among other things he is obsessed with Hunter Biden's laptop – the issue he raises is real and worth talking about. Siegel is absolutely right that some people in the government are obsessed with disinformation:

Something in the looming specter of Donald Trump and the populist movements of 2016 reawakened sleeping monsters in the West. Disinformation, a half-forgotten relic of the Cold War, was newly spoken of as an urgent, existential threat. Russia was said to have exploited the vulnerabilities of the open internet to bypass U.S. strategic defenses by infiltrating private citizens’ phones and laptops. The Kremlin’s endgame was to colonize the minds of its targets, a tactic cyber warfare specialists call “cognitive hacking.”

If you followed the run-up to Russia's invasion of Ukraine and the early days of the war, you saw statements from US and other western officials full of this fear. Russia, it was said over and over, was spreading propaganda so effectively that it might undermine Ukraine's will to fight and the west's willingness to support them. This turned out to be complete nonsense. Most Ukrainians are fighting like hell, and the ones who do support Russia mostly do so while fully understanding what Putinism is all about. They aren't "fooled by Russian propaganda"; they are authoritarians who miss the Soviet Union, or else opportunists who thought they could make a fast buck and maybe get good jobs out of supporting the eventual winner.

If you're paying attention, you may have noticed that I just elided the crucial issue: is there a difference between believing something in some sort of genuine way, and believing it because you have been fooled by propaganda and disinformation? But we'll get back to that later.

Anyway, the fear of Russian propaganda became a big thing in the west, among both security hawks afraid of Russian or Chinese aggression and liberals afraid of Donald Trump. Various Twitter accounts were identified as Russian bots, and Twitter kept mum about it even though they knew some of them were in fact real people living in the US. I still meet people who say that Russian trolling elected Trump, even though this has been refuted in every possible way. Claims that Trump was a Russian agent were all over liberal social media sites.

Now security hawks have shifted their attention to manufacturing artillery shells, but liberals are still obsessed with disinformation of two kinds. First, there is Trump's claim that he won the 2020 election. That false claim certainly did lead to a nasty riot and has spawned a multi-billion-dollar lawsuit over voting machines; it is one of the blind spots in Siegel's essay that he doesn't really deal with the whole election business. Second, Covid. 

It is on the subject of Covid that I think you can see the real damage being done by the fight over disinformation. Terrified of spreading pro-ivermectin or anti-vaxx propaganda, Google and the other search engines have tweaked their algorithms to direct all medical queries toward officially approved sites like the CDC, the Mayo Clinic, or WebMD. But those sites are, as Scott Siskind has shown several times, all but useless for many people. If you try to look up drug side effects on any of them, they simply repeat the warnings that come on the label. Siskind once put up, side by side, the warning labels for a drug that doctors consider very safe and one that causes 40,000 emergency room visits a year: to the untrained eye, they are identical. Sites that work to explain the differences to lay people used to get many more page views, but Google's shift shredded their readership. In the name of keeping us safe from disinformation, Google and Facebook and probably others have made it much harder to find useful medical knowledge.

This shifts our attention from the military-industrial complex to the other arcane thing about our world: the operations of search engines and other internet algorithms. Online retailers are constantly complaining that some minor tweak in Google's code caused their business to plummet overnight. People who sell on Amazon say the same thing about its algorithm. I believe this is true, because over the years it has happened half a dozen times to this site. Since this is only a hobby for me, I merely note it and move on, but if I made my living this way I would also be screaming. Google, Facebook et al. really do have enormous power over what we see and read. How are they using it?

Mostly to make money. I think that if the algorithms do any real political damage it is mostly by directing our attention to outrage porn, which gets the most engagement. But we know that Google and Facebook work with the government in several ways. For example, against child porn; they have an elaborate arrangement with the FBI that allows them to store and transmit images that would otherwise land them in prison. Testimony in Congress has confirmed that they also do this for anything that smacks of Islamic terrorism, including discourse that I think ought to be protected under freedom of religion. Surely there is also collaboration around the surveillance of neo-Nazis and other violent right-wing types.

Where does this end? I doubt anybody really knows, even in the highest reaches of the FBI or the NSA. Their supercomputers have compiled so much data, and so many different people might have access to it, that we simply don't know how it is being used.

I think, though, that we can say with confidence that it is not being used to censor conservative thought. Conservative ideas are all over the internet and the airwaves in every conceivable form. If Google's algorithm has a preference for liberal ideas, as has been claimed, that certainly doesn't keep conservative firebrands from becoming internet stars. So I think the basic premise of Siegel's argument is false; he is obviously not being censored, since his essay is spreading like crazy.

But I wonder two things: first, how are our interests being manipulated by search engines? Are there things I would really like to read that have been hidden from me? And does this, sometimes, relate to government pressure to hide certain sites or promote others?

And second, what is disinformation anyway, and how do we really form our beliefs? Is there a hard line to be drawn between things that are so firmly refuted that defending them in public is lying, and things that are probably not true but still worth thinking about? 

I think this is a hard problem. I am concerned about what CO2 emissions might do to the planet, but I regularly see statements about climate change that I think are not at all supported by the evidence; for example, that the western drought of the last 20 years was caused by global warming, or that hurricanes have gotten more powerful and numerous. When you are dealing with a vast, chaotic system like the weather, it is simply impossible to know, even in theory, what "caused" a particular drought or storm. I suppose some people would call that "disinformation." I prefer to call it a difference in interpretation that should be hashed out in public debate.

Many people think that their own beliefs are so obviously true that opposition to them is just lying or wrong. The algorithms used by search engines and social media sites are opaque, biased, and probably influenced in some ways by government pressure.

Put these things together and you get, on the one hand, demands that Twitter silence the Russian bots that were electing Trump, and on the other Jacob Siegel ranting about a conspiracy to silence conservative thought. Both fears are overblown. Millions of people on both political sides believe that other people get their politics from disinformation and Russian lies, to which those other people are uniquely susceptible. But maybe we just disagree, like we always have.

7 comments:

  1. You don't think a descent from the argument into madness is evidence of madness?

    "There is a real issue here..." That is a pre-supposed conclusion by you, and an article supporting mad conspiracy theories is not worth the bits it is written in.

    Yes, a government could get so concerned about disinformation that it goes overboard. But an insane piece is only evidence of mental illness, not a real concern.

  2. This comment has been removed by the author.

  3. I think crazy people can believe correct things. I also think that sometimes the hyper-sensitive can serve as canaries in the coalmine, identifying problems before they bother anyone else.

  4. It's all fine and dandy fighting the disinformation, until the opposite faction takes over the tools. Imagine Trump is reelected and forces a nation-wide war against disinformation: for example, banning people who claim there is too much police brutality in the USA.

  5. Errr, I searched the article for the phrase, "Hunter Biden." I found five references, in four of which the phrase serves as a modifier to the word "laptop," and the fifth is a hypothetical. So far as I could tell, all of these references had to do with what the author sees as a Deep State-type plot to quash discussion of HB's laptop before the 2020 election.

    I then partly read, partly skimmed the entire article, paragraph by paragraph. No, he doesn't think Hunter Biden is a key leader of the conspiracy.

    The author does seem to be a Trump nationalist-populist who doesn't like what he considers the ruling class, which seems to mean a combination including but not necessarily limited to the national security apparatus, establishment politicians (most prominently the Clintons and Obama, but probably including insufficiently rowdy Republicans as well), big philanthropy, and Silicon Valley.

    The key passage, to me, seems to be this: "a ruling class describes a social group whose members are bound together by something deeper than institutional position: their shared values and instincts. . . . Two criteria define membership in the ruling class. First, as Michael Lind has written, it is made up of people who belong to a 'homogeneous national oligarchy, with the same accent, manners, values, and educational backgrounds from Boston to Austin and San Francisco to New York and Atlanta.' . . . Second, to be a member of the ruling class is to believe that only other members of your class can be allowed to lead the country."

    I don't think this is particularly insane. In fact, I think it is a fairly correct analysis. I say this as someone who, given a choice between right-wing national populism and the current ruling class, will go with the ruling class every time. If the story of HB's laptop really was quashed to hinder Trump's re-election prospects, I'm pretty good with that.

    On the issues of search algorithms, information flows and restrictions, data collection, and whatnot--I think it's like AI. I think it's all huge and mysterious and that we really have very little idea what we're doing. I don't expect we're going to get utopia out of it and that we may well get quite the opposite (but I'm pessimistic by nature). And I don't think there's much that can be done to stop it or even slow it down.

    Hah! There's a multi-paragraph screed for you. Sweet dreams!

  6. But if crazy people believe correct things, we don't point to the crazy people, we point to the reasoned arguments of non-crazy people.

    We don't point to the writings of cranks to support medical knowledge.

    Of course, I am jumping to conclusions about mental illness. A lot of these writers are playing to their base rather than actually believing the nonsense they spew. But that doesn't make them any better as a source of informed argument. The whole right wing of our society has so inured itself to constant lying that it consistently rejects the truth.

    Pointing to an article about over-reacting to disinformation which is filled with disinformation is almost laughable.

  7. Two of the OSINT people I follow on Twitter were just digging through some of the Twitter source code that somebody posted online. There is a feature that massively downvotes anything categorized as "disinformation," but no clear indication of how that designation is assigned.
