Energetics of the brain and AI
📝 Abstract
Do the energy requirements of the human brain impose energy constraints that give reason to doubt the feasibility of artificial intelligence? This report reviews some relevant estimates of brain bioenergetics and analyzes some methods of estimating brain emulation energy requirements. Turning to AI, there are reasons to believe that the energy requirements of de novo AI have little correlation with brain (emulation) energy requirements, since cost could depend merely on the cost of processing higher-level representations rather than billions of neural firings. Unless one thinks the human way of thinking is the optimal or most easily implementable way of achieving software intelligence, we should expect de novo AI to make use of different, potentially very compressed and fast, processes.
📄 Content
Technical Report STR 2016-2 • February 2016
ENERGETICS OF THE BRAIN AND AI
Anders Sandberg
Sapience Project 2016
Synopsis
Do the energy requirements of the human brain impose energy constraints that give reason to doubt the feasibility of artificial intelligence? This report reviews some relevant estimates of brain bioenergetics and analyzes some methods of estimating brain emulation energy requirements. Turning to AI, there are reasons to believe that the energy requirements of de novo AI have little correlation with brain (emulation) energy requirements, since cost could depend merely on the cost of processing higher-level representations rather than billions of neural firings. Unless one thinks the human way of thinking is the optimal or most easily implementable way of achieving software intelligence, we should expect de novo AI to make use of different, potentially very compressed and fast, processes.
ACM Computing Classification System (CCS): Hardware → Cellular neural networks • Hardware → Emerging technologies → Biology-related information processing → Neural systems
London, United Kingdom
Sapience Project is a thinktank dedicated to the study of disruptive and intelligent computing. Its charter is to identify,
extrapolate, and anticipate disruptive, long-lasting and possibly unintended consequences of progressively intelligent
computation on economy and society; and to syndicate focus reports and mitigation strategies.
Board
Vic Callaghan, University of Essex
B. Jack Copeland, University of Canterbury
Amnon H. Eden, Sapience Project
Jim Moor, Dartmouth College
David Pearce, BLTC Research
Steve Phelps, Kings College London
Anders Sandberg, Oxford University
Tony Willson, Helmsman Services
Recently there has been both major enthusiasm for artificial intelligence (AI) and concerns that it might pose major risks to humanity (Bostrom 2014). While a number of high-profile researchers think AI safety should be a high priority (Future of Life Institute 2015), there is also significant disagreement about how much risk AI poses. This is especially true for questions about human-level and beyond AI.

Lawrence Krauss (Krauss 2015) is not worried about AI risk; while much of his complacency is based on a particular view of the trustworthiness and level of common sense exhibited by possible future AI that is pretty impossible to criticise, he makes a particular claim:

"First, let's make one thing clear. Even with the exponential growth in computer storage and processing power over the past 40 years, thinking computers will require a digital architecture that bears little resemblance to current computers, nor are they likely to become competitive with consciousness in the near term. A simple physics thought experiment supports this claim: Given current power consumption by electronic computers, a computer with the storage and processing capability of the human mind would require in excess of 10 Terawatts of power, within a factor of two of the current power consumption of all of humanity. However, the human brain uses about 10 watts of power. This means a mismatch of a factor of 10^12, or a million million. Over the past decade the doubling time for Megaflops/watt has been about 3 years. Even assuming Moore's Law continues unabated, this means it will take about 40 doubling times, or about 120 years, to reach a comparable power dissipation. Moreover, each doubling in efficiency requires a relatively radical change in technology, and it is extremely unlikely that 40 such doublings could be achieved without essentially changing the way computers compute."

This claim has several problems.
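Krauss's arithmetic itself is internally consistent and can be checked in a few lines. The sketch below simply reproduces his figures (10 TW vs. 10 W, a three-year doubling time for Megaflops/watt); it does not endorse the premises.

```python
import math

# Krauss's figures, as quoted in the text.
brain_power_w = 10.0       # ~10 W for the human brain
computer_power_w = 10e12   # >10 TW for a brain-equivalent computer, on his estimate
doubling_time_years = 3    # observed doubling time for Megaflops/watt

# The power mismatch, and how long closing it would take at that doubling rate.
mismatch = computer_power_w / brain_power_w
doublings = math.log2(mismatch)
years = doublings * doubling_time_years

print(f"mismatch: {mismatch:.0e}")        # a factor of 10^12, a million million
print(f"doublings: {round(doublings)}")   # about 40 doublings
print(f"years: {round(years)}")           # about 120 years
```

The point at issue in the report is not this arithmetic but the premises: whether a brain-equivalent computer really needs 10 TW under current architectures, and whether current architectures are the relevant baseline.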
First, few, if any, AI developers think that we must stay with current architectures. Second, and more importantly, the community concerned with superintelligence risk is generally agnostic about how soon smart AI could be developed: it does not have to happen soon for us to have a tough problem in need of a solution, given how hard the AI value alignment problem seems to be. Third, consciousness is likely irrelevant for instrumental intelligence; maybe the word is just used as a stand-in for some equally messy term like "mind", "common sense" or "human intelligence".

The interesting issue, however, is what energy requirements and computational power tell us about human and machine intelligence, and vice versa. If energy is a major constraint on cognition then we have a way of constraining predictions and claims about future artificial minds.

Computer and brain emulation energy use

I have earlier looked at the energy requirements of the Singularity (Sandberg 2015). To sum up, current computers are energy hogs requiring 2.5 TW of power globally, with an average cost around 25 nJ per operation. More efficient processors are certainly possible (many of the ones in current use are old and suboptimal
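The two figures just given (2.5 TW globally, ~25 nJ per operation) jointly imply a global operation rate. The derived number below is illustrative arithmetic, not a figure stated in the report.

```python
# Implied global operation rate from the report's figures.
global_power_w = 2.5e12   # 2.5 TW of global computing power draw
joules_per_op = 25e-9     # average cost of ~25 nJ per operation

ops_per_second = global_power_w / joules_per_op
print(f"{ops_per_second:.0e} ops/s")   # ~1e+20 operations per second worldwide
```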
This content is AI-processed based on ArXiv data.