Philosophy

Can machines behave like humans do?

Can machines have real consciousness like humans do?

Does thinking require "a brain" or just "brain-like" parts?

Weak AI hypothesis

Machines can only act as if they were intelligent.

The following arguments consider AI from a philosophical point of view, not as the problem of finding the best agent program:

Argument from disability

"a machine can never do X"

But computers can already do many things as well as or better than humans.

Gödel's incompleteness theorem: The mathematical objection

Certain mathematical questions are in principle unanswerable by particular formal systems.

The theorem itself is an example:

For any axiomatic formal system F powerful enough to do arithmetic, it is possible to construct a Gödel sentence G(F) with the properties:

  • G(F) is a sentence of F but cannot be proved within F
  • If F is consistent, then G(F) is true
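
A compact formal restatement (my notation, not from the notes: Prov_F is F's provability predicate and the corner brackets denote a Gödel numbering):

```latex
% Sketch of the first incompleteness theorem, assuming F is consistent,
% recursively axiomatizable, and strong enough for elementary arithmetic.
% The diagonal lemma yields a sentence asserting its own unprovability:
\[
  G(F) \;\leftrightarrow\; \neg\,\mathrm{Prov}_F\!\bigl(\ulcorner G(F) \urcorner\bigr)
\]
% Consequences, matching the two properties above:
\[
  F \text{ consistent} \;\Longrightarrow\; F \nvdash G(F)
  \qquad\text{and}\qquad \mathbb{N} \models G(F)
\]
```

Informally, G(F) says "I am not provable in F": if F proved G(F), F would prove a falsehood and be inconsistent; so a consistent F cannot prove it, which is exactly what G(F) asserts, making it true.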

Idea: machines are formal systems and thus they are limited by the incompleteness theorem.

Argument from informality of behaviour

Turing: human behaviour is too complex to be captured by a set of rules. Because computers can only follow sets of rules, they cannot generate behaviour as intelligent as that of humans.

☞ The inability to capture everything in a set of logical rules is known as the qualification problem in Good Old-Fashioned AI (GOFAI)

But intelligence does not necessarily have to emerge from a system that reasons logically from a set of facts and rules.

Embodied cognition

It makes no sense to consider the brain separately: cognition takes place within a body, which is embedded in an environment. We need to study the system as a whole.

Strong AI hypothesis

Machines do not merely simulate thinking; they actually think.

Argument from consciousness

The objection: machines would have to be aware of their own mental states and actions; a machine would need to actually feel emotions.

Turing's reply: in ordinary life, we never have any direct evidence about the internal mental states of other humans; the polite convention is that everyone thinks.

Mind-body problem

René Descartes considered the mind's activity of thinking and the physical processes of the body.

He concluded that the two must exist in separate realms = dualist theory

Monism / physicalism: the mind is not separate from the body; mental states are physical states.

Functionalism: a mental state is any intermediate causal condition between input and output
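
A toy sketch of the functionalist claim (all names and states here are invented for illustration): two physically different realizations that share the same causal organization count, for functionalism, as being in the same mental state.

```python
# Toy illustration of functionalism: what identifies a "mental state" is its
# causal role between input and output, not the substrate realizing it.

def neuron_style(state: int, stimulus: int) -> tuple[int, str]:
    """One realization: arithmetic on an integer state variable."""
    new_state = (state + stimulus) % 3
    return new_state, ["calm", "alert", "alarmed"][new_state]

# A second, physically different realization: a precomputed lookup table.
LOOKUP = {
    (s, x): ((s + x) % 3, ["calm", "alert", "alarmed"][(s + x) % 3])
    for s in range(3)
    for x in range(3)
}

def chip_style(state: int, stimulus: int) -> tuple[int, str]:
    """Same causal organization, different substrate."""
    return LOOKUP[(state, stimulus)]

# Identical causal roles mean identical behaviour: the two realizations
# cannot be told apart by any input/output test.
for s in range(3):
    for x in range(3):
        assert neuron_style(s, x) == chip_style(s, x)
```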

Brain replacement experiment:

Replace all the neurons in someone's head, one by one, with electronic devices. The subject's external behaviour must remain unchanged.

But what about the internal experience of the subject? Views diverge.

Biological Naturalism

Mental states are high-level emergent features that are caused by low-level physical processes in the neurons. As-yet-unspecified properties of the neurons matter.

Mental states cannot be duplicated just on the basis of some program having the same functional structure.

We would require the program to run on an architecture with the same causal powers as neurons.

A program can pass the Turing test without understanding anything of its inputs and outputs (Searle's "Chinese Room").
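
A minimal sketch of the Chinese Room intuition (the rule-book entries are invented placeholders, not a real conversational system): the program maps symbols to symbols by pure lookup, so fluent-looking output requires no understanding.

```python
# Toy "Chinese Room": replies are produced by pure symbol lookup.

RULE_BOOK = {
    "你好吗?": "我很好, 谢谢.",        # "How are you?" -> "I am fine, thanks."
    "今天天气怎么样?": "天气很好.",    # "How is the weather?" -> "The weather is nice."
}

def room(symbols: str) -> str:
    # The operator matches input squiggles to output squiggles without
    # attaching any meaning to either; understanding plays no role.
    return RULE_BOOK.get(symbols, "对不起, 我不明白.")  # "Sorry, I do not understand."

print(room("你好吗?"))  # prints: 我很好, 谢谢.
```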