AI Politics

It's an interesting, nonetheless contemptible fact that we refuse to acknowledge the politics of technology. It is especially curious after two centuries in which industrial technologies altered daily human existence to a far greater degree than any political dogma of the time. Across the 20th century, industrialization was, with the exceptions of Cambodia's Khmer Rouge and Peru's Sendero Luminoso, the defining goal of politics left, right, and center. When the Bolsheviks seized control of Russia, industrializing became job one. The Chinese Communists struggled for decades, then in the last forty years industrializing became their grail quest. In the last decade, India's Bharatiya Janata Party has turned full force toward industrialization, while questions concerning the future of the African continent overwhelmingly begin with industrializing.

Industrial development results in the centralization of political economy, readily apparent in the American experience of a distributed yeoman farm republic turning into a corporate oligarchy. For many reasons this change remains largely unexamined, though one of the most important is that industrial wealth and corporations funded schools of industrial economics as apolitical thought. Their first accomplishment was removing the political from the economy. From its inception in the 19th century, economics was known as “political economy.” Removing politics from the discipline placed the politics of economics firmly in the hands of the industrial and financial corporations.

The idea of economics divorced from politics, from government, was always completely ahistorical. It was in itself a political doctrine. In the monarchies of Europe, political power had been centralized in the courts and manors, including all things today considered economics. Initially, liberalism largely concerned pulling economic power away from the royal courts. With the establishment of modern republicanism, the idea of economic liberalism became more complicated, but by the early 20th century the dominant power of the industrial corporation resulted in its monopoly control over most things deemed economic. The exception came with the system's collapse in the 1930s, which produced the anti-republican growth of Washington, a counterweight to corporate power that proved both inadequate and problematic.

The idea of economic power exclusive of government was always a political myth. Karl Polanyi's seminal work, The Great Transformation, extensively documents the state's essential role in moving from an agrarian to an industrial economy. Today, government remains just as indispensable to mega-corporate control of the economy, especially regarding the large role government plays in the development of technologies, a role always underplayed by the corporations who most profit. This is especially true of the computer industry, where most initial machine designs and understandings emerged from World War II, specifically the development of cryptography, anti-aircraft weapons, and the atomic bomb, in addition to technologies, most notably the transistor, emerging from government-bestowed monopolies such as Bell Labs. From inception, the computer industry was and remains completely enmeshed with the military, a fact unacknowledged by industry PR misinformation promoting a juvenile libertarianism or a tech-liberalism. Today, the always insufficient politics of industrialism proves not simply lacking but detrimental to the evolution of compute technology, most especially with this latest generation marketed as artificial intelligence (AI).

The FT has an interview with Professor Li from Stanford, “one of a small number of academics and technologists responsible for laying the foundation for today’s revolution in artificial intelligence” – an illuminating phrase on a number of counts. First, it is an extremely small number of people developing this technology, yet it can only be implemented using an already entrenched and ubiquitous network controlled by a handful of leviathan corporations. Secondly, there's that word revolution, tech's omnipresent marketing term of the last thirty years, a revolution always determinedly devoid of politics. Finally, technologists have rarely been held responsible for anything except the operation of any given technology. As far as technology's greater impact on society goes, the record of technologists' responsibility is short. Such responsibility relies on having a politics of technology, and we don't. If we did, few would turn to academics for responsibility, where lies a long-established tradition of political ineptitude. The piece even includes a punchline, “She is now pushing to ensure that revolution is carried out responsibly from a new institute at Stanford.” The most recent example of Stanford tech responsibility is the Bankman-Frieds, mother and father both Stanford professors, and the convicted, imprisoned, Stanford-born-and-nurtured son.

Professor Li is a woman and an immigrant from China. In the meaningless politics of today, you can feel good she's a woman, but scared she's from China. As regards any sort of thought on the politics of technology, the piece is barren. What irks most is both the interviewer’s and the professor's naivete. Maybe this comes from too much personal experience, but there is no greater political sin than being naïve. Don't get me wrong, too often power simplistically brandishes the charge of naivete against those wanting change. It is not in the dreams or goals that political naivete is rightfully criticized, but in the means, too often a self-righteous cluelessness about the process of getting from here to there.

The professor lays naked her naivete: “Right now in AI, what worries me is we don’t have the resources to make sure that academic AI continues to be a centre of gravity. Because if we lose that centre of gravity, then the other centre of gravity is driven by capitalism.” So incredibly clueless, but no more so than so much out of Stanford and the Valley for decades. The university isn't a center of gravity, it's a Valley research and child-care division. As for the “other centre of gravity” being capitalism, if that’s defined as economic control by leviathan corporations, OK. However, the idea that academia could ever balance corporate power is like saying the serfs of medieval Europe balanced the power of the lords.

Life in the 21st Century is a reader-supported publication.

Like every piece written about tech for decades, it is silent about government and governance, though the topic inadvertently pops up in the professor's résumé and her work for Google: “In 2018, when Google was at the centre of a controversy regarding the use of its AI by the US Department of Defense. Li was not directly responsible for the partnership, but was nonetheless caught up in an internal crisis which saw a number of staffers quit the company.” What's great about this statement is that neither the professor nor the interviewer comprehends it makes the article's entire premise moot. Forgotten too is the Google Boys' not-so-long-ago charter babble of “doing no evil,” that is, I guess, until such time as it gets in the way of profit.

The FT ain't really concerned with the responsible use of AI, but with how its readers can profit. As with all popular press pieces, there's little digging into what exactly AI is. Once again, the piece accidentally hits on it via Li's résumé, where she worked on something called ImageNet:

“ImageNet is an image dataset organized according to the WordNet hierarchy. Each meaningful concept in WordNet, possibly described by multiple words or word phrases, is called a "synonym set" or "synset". There are more than 100,000 synsets in WordNet*; the majority of them are nouns (80,000+). In ImageNet, we aim to provide on average 1000 images to illustrate each synset. Images of each concept are quality-controlled and human-annotated. In its completion, we hope ImageNet will offer tens of millions of cleanly labeled and sorted images for most of the concepts in the WordNet hierarchy.”
*“WordNet® is a large lexical database of English. Nouns, verbs, adjectives and adverbs are grouped into sets of cognitive synonyms (synsets), each expressing a distinct concept.”

AI is brute-force compute reliant on oceans of data, made possible by another leap in compute power with the packing of tens of billions of transistors onto a single chip. The continuing growth of compute power has been essential to the industry's evolution since the development of the first integrated circuit sixty years ago. Data engineers using this compute power have configured ever more extensive feedback into the repeated processing of enormous amounts of data, allowing more sophisticated results. As the above definition of ImageNet states, this data design is “quality-controlled and human-annotated,” and as that old dead Greek long ago wrote, where there is human there is politics, and so it should be said of the machine.
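The “quality-controlled and human-annotated” design can be made concrete with a toy sketch: images grouped under WordNet-style synsets, each label passing through a human gate. The identifiers and record fields below are illustrative inventions, not the actual ImageNet schema; the point is only that every datum carries a trail of human choices.

```python
# A toy sketch of ImageNet-style organization: images grouped under a
# WordNet-style synset, every label human-annotated and verified.
# Synset ids, filenames, and fields here are illustrative, not ImageNet's.
dataset = {
    "n0000001": {  # made-up synset id for "dog"
        "words": ["dog", "domestic dog"],
        "images": [
            {"file": "dog_0001.jpg", "annotated_by": "human", "verified": True},
            {"file": "dog_0002.jpg", "annotated_by": "human", "verified": True},
        ],
    },
}

def quality_controlled(ds):
    """True only if every image was human-annotated and verified --
    the 'quality control' is itself a chain of human judgments."""
    return all(
        img["annotated_by"] == "human" and img["verified"]
        for synset in ds.values()
        for img in synset["images"]
    )
```

The subjectivity enters exactly here: someone decided which synsets exist, which images illustrate them, and what counts as “verified.”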

The great 20th century thinker Norbert Wiener, who in many ways is the father of this technology, wrote, “The feedback principle means that behavior is scanned for its result, and that the success or failure of this result modifies future behavior. It is known to serve the function of rendering the behavior of an individual or a machine relatively independent of the so-called 'load' (initial) conditions.” He adds, “Learning is a most complicated form of feedback, and influences not merely the individual action, but the pattern of action. It is also a mode of rendering behavior less at the mercy of the demands of the environment.”

AI continuously feeds results back through the system, where they are modified by other algorithmic parameters. This is the process behind AI's large language models, a process of feedback Wiener noted to be the key to all learning, whether by biological organism or machine. It is this process we largely define as intelligence.
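Wiener's principle can be shown in a few lines: behavior is scanned for its result, the error against a goal modifies future behavior, and the final state becomes independent of the initial “load” condition. This is a minimal illustrative sketch, not any production training loop; here a single parameter learns to halve its input purely through error feedback.

```python
# Minimal sketch of Wiener's feedback principle: the error between
# result and goal modifies future behavior.
def train(samples, target_fn, lr=0.1, steps=200):
    w = 0.0  # the initial ("load") condition
    for _ in range(steps):
        for x in samples:
            result = w * x                 # behavior
            error = target_fn(x) - result  # scan the result against the goal
            w += lr * error * x            # feedback modifies future behavior
    return w

# The parameter converges toward 0.5 regardless of its starting value.
w = train([1.0, 2.0, 3.0], lambda x: 0.5 * x)
```

Scaled up a few billion parameters, with the feedback signal chosen by engineers, this is the skeleton of what gets marketed as machine intelligence — and the choice of goal and data is where the politics lives.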

Implemented across all these systems is a designed structure, an architecture, including the collection and storage of data, the creation of algorithms for data manipulation, and the transistor boards themselves. Unrecognized, or more accurately simply dismissed, is the political element in every aspect of this technological architecture. Among other things, AI is being advertised as some sort of objective intelligence, and that it surely is not, nor will it ever be; like all intelligence, it is inherently subjective in design and data, and the subjective is inherently political.

Finally, Professor Li states, “If you’re chasing the fashionable algorithm you’re not doing the best science.” Algorithms are not science, they are technology. Technology is derived from science, technology is built on science, no technology can do anything science doesn't allow, but how technology is developed and implemented is not science, it is human design. This is an especially important point entering an age where labeling something as science is tantamount to saying it is beyond interpretation, dispute, or challenge. Science becomes the new temple.

There's a better piece in the Journal about the enormous quantities of energy AI requires, and there's never been a more political technology than the harnessing of energy in all its forms, nor a greater political issue in the 21st century. “The rapid adoption of AI could represent a sea change in how much electricity is required to run the internet—specifically, the data centers that comprise the cloud, and make possible all the digital services we rely on.”

Again, AI is only possible installed into an already entrenched energy and information infrastructure, an infrastructure beholden to little politics in its development, an infrastructure controlled by a handful of corporations using marketing terms like “the cloud” to deliberately obfuscate a centralized technological architecture, centrally controlled. AI allows even greater centralized control, with ever more sophisticated feedback facilitating greater automation, taking more and more power out of human hands while simultaneously concentrating ever greater wealth and power in fewer hands.

The Journal article continues, “Constellation Energy, which has already agreed to sell Microsoft nuclear power for its data centers, projects that AI’s demand for power in the U.S. could be five to six times the total amount needed in the future to charge America’s electric vehicles.” Constellation Energy has a current market cap of $38 billion, so you might assume their estimates are useful, but unexamined utility numbers are the province of fools. For example, ask them the cost of building a new nuke. The most recent, opened in Georgia this past summer, was seven years late and $17 billion over budget. In general, utility numbers should be taken as seriously as the statement by one of the richest men in the world, who made his fortune with the MS tech monopoly: “If we make smart investments now, AI can make the world a more equitable place.”

It's sort of amusing, if you have the stomach, how people in positions of power get away with continually saying whatever stupid fucking thing they want these days. Accountability on such matters would require a functioning and healthy politics, and that we have not.

The Industrial Era is most simply defined as Homo sapiens' mass harnessing of energy, completely reshaping the planet's social, political, and economic systems while radically transforming the environmental landscape. The feedback mechanisms for all this change were at best insufficient, mostly nonexistent. What we know of any such feedback is politics. In the US, this included the 19th-century Populists, small farmers consolidated out of existence by the railroads and the telegraph-wielding NY banks, and then the organized labor of the 20th-century factory floor. There's been no organized response to the last half-century's tech growth, no feedback, no politics, even though the technology has infiltrated almost every aspect of American society. But then, we have no politics of technology, or a healthy politics of any sort, in America at this point. This doesn't bode well for AI's future, not just for those poised on the hard receiving end of the effected change, but for the technology itself and its profiteers. Ironically enough, the tech industry has no feedback; it doesn't learn.