Inside AI

9 Technological singularity is absurd

 

This chapter covers

  • The unlikelihood of the singularity
  • The lack of intelligence in machines
  • Thoughts about the human brain
Nothing in this world is to be feared … only understood.
—Marie Curie

According to some people, the end of human civilization won’t come from climate change, nuclear war, or the death of our sun. Instead, they believe that in the not-so-distant future, artificial intelligence could become so advanced that it develops a will of its own and takes control of the planet. This potential catastrophe is often called “the singularity,” a hypothetical point at which AI advances so rapidly that humans can no longer keep up with its progress. While this concept makes for exciting science fiction, it is essential to ground such speculation in reality. In this chapter, we aim to demystify the notion of technological singularity, arguing that it is fundamentally flawed.

9.1 The genesis of technological singularity

9.2 The truth about the evolution of robotics

9.3 Merging human with machine?

9.4 Science fiction vs. reality

Summary