The future of AI is analogue

How neuromorphic computing will solve AI's energy usage problem

Much of the talk around AI is hyperbolic, speculative, and often misses the point. In our attempts to recreate our own intelligence using conventional computers, we have ended up building energy-inefficient, over-engineered systems that rely more on computational scale than on “intelligence”. If we want intelligent machines as efficient as our own brains, then we must mimic how the brain computes. Following recent developments in architecture and learning, Jonathan Peters argues that the future of AI will be analogue.


The rise to prominence of artificial intelligence (AI) is arguably the most important technological change since the turn of the 21st century. Progress has been so rapid that, in the space of a few years, public understanding of AI has shifted from ignorance to a mixture of intrigue, curiosity, and fear of its unknown future consequences.

When most people think of concerns about AI, the prospect of robot-like machines terrorising humans and taking over the planet often comes to mind. Whilst dramatic, this is highly unlikely in the near future; a concern people should be far better informed about is the technology's impact on the climate. This naivety is evident in the recent EU ‘artificial intelligence act’: dubbed the ‘first of its kind’ in the world, it does little to address climate threats. The bill asks companies to self-report climate-related statistics, such as energy consumption, but imposes no limits or penalties for extreme energy usage.
