Saturday 12 July 2025
Last week I was lucky enough to attend an event at the Royal Institution, hosted by Braintree, entitled ‘AI: Digital Consciousness – Sustainable Intelligence’. As far as I understand it, Braintree has developed a different approach to AI based on a symbolic architecture, which promises far lower computing requirements and hence far lower power and water consumption in data centres – a more sustainable approach.
Like many people, I have played around with LLMs and have been equally amazed and terrified. The incidence of LLM hallucination, coupled with a general and probably growing tendency to accept anything that comes out of a computer as true, is deeply worrying, with massive implications for education, science, engineering, our brains, and global society.
The whole evening was so interesting that I have tried to write up my notes from the event; apologies to the speakers if I have misinterpreted their presentations.
The two external speakers were brilliant and thought-provoking. Kay Firth-Butterfield is a lawyer who has specialised in ethical AI for fourteen years, which in itself is remarkable given that fourteen years ago the profile of AI was much, much lower than it is today.
Kay’s presentation included some striking facts: 20% of men in the US have an intimate relationship with an AI; each query to an LLM consumes 0.25 litres of water; and by next month AI will be generating more data than humans. Kay spoke about the problems AI is creating in the legal system, where briefs now often include ‘hallucinations’ created by AI, either referring to non-existent cases or simply making up evidence. This means the legal system is grinding to a halt, and of course lawyers can be charged with contempt if they put forward arguments based on AI hallucinations.
There is also evidence from the US that AI is leaving recent university graduates illiterate in their own subjects. AI is literally creating stupidity. Trust in AI is falling, but at the same time it is increasingly being deployed, not just in ‘simple’ things like the apps and programs we use but also in autonomous lethal weapons. Organisations are facing cascading risks, and the very nature of management and leadership is changing.
The second speaker, Pippa Malmgren, has been an adviser to President George W. Bush and the UK Cabinet, as well as to many large companies. Having set the context of the explosion in knowledge, Pippa gave an essentially positive view of AI and talked about how it can do the ‘heavy lifting’ in many fields while humans are freed up for more human things, like art and creativity.
Pippa talked about how AI is forcing the abandonment of the Cartesian split, the split between left brain and right brain. She also touched on how most of our data input is visual, whereas other senses, like touch and smell, are also important. She used some good quotations, including:
Suffering comes from having an argument with reality
Evolution comes from adapting to reality
Imagination is more important than analytics
Pippa believes that AI will help us move from scarcity to abundance.
The vision of AI taking on the heavy lifting, and the dull and dirty tasks, while humans focus on creativity and more right-brained pursuits, is beguiling – and of course it has long been a staple of science fiction. The concern is that if all the ‘boring’ stuff – cars, the air traffic control system, nuclear power stations, and the ICBMs – is controlled by an AI that starts hallucinating, that is really very bad. This vision was notably explored in the 1970 film “Colossus: The Forbin Project”, in which an AI is given command of the ICBMs and starts communicating with its Russian equivalent, and in ‘2001: A Space Odyssey’, in which HAL tries to kill the crew of the spacecraft because he has decided it is in the best interests of the mission.
It seems that, through the work of Kay and others, we have started to address the issue of ethical AI, but we need to do a lot more, as the Silicon Valley ‘tech bros’ and the wave of investors backing AI seem hell-bent on pushing its use without much, if any, thought. The precautionary principle seems to have been forgotten entirely.
Dr Steven Fawkes