Saturday, December 27, 2025

Back to the Dark Ages

Our techbro overlords would have us believe that we're on the cusp of - or even in the midst of - a golden age, a new Enlightenment, an exponential expansion of knowledge and capacity. But Joseph de Weck, in this Guardian article, argues the exact opposite: that AI is taking us back to the Dark Ages.

The MIT study cited might be flawed, and its findings overstated or misrepresented, but the evidence that AI induces laziness and a dereliction of responsibility for independent, critical thinking and decision-making seems overwhelming and incontrovertible. On a daily basis, people are surrendering to, and putting absolute faith in, systems that come with disclaimers that they may deliver "inaccurate or incorrect information" - and frequently do.

This is of course not the only reason to be intensely wary of or outright opposed to AI - see also its cannibalisation of artistic endeavours and devaluation of the creative process, and the enormity of its impact on the environment. But the production of a more supine, manipulable population is a huge concern, especially politically.

This is not to criticise individuals. The temptation to take short-cuts and avoid hard, time-consuming graft or extensive reflection is understandable and - let's face it - probably human instinct. What de Weck's article doesn't really emphasise - and should - is that this is a structural problem: of governance, of opacity, of corporate and political influence.

We need to focus on one key question: whose interests does AI ultimately serve? The answer is: not mine, and not yours either, but those of an elite minority intent on enriching themselves at our expense.
