Mind, Matter & Language: Mind; Behaviourism & Functionalism
Terms in this set (6)
Behaviourism:
- Psychology is the science of observable behaviour, not of inner mental states.
- We can explain behaviour in terms of external stimuli, responses, and learning, without reference to mental events.
- Mental states (beliefs, desires, thoughts, etc.) are just dispositions to act.
- A reductive view of the mind.
- Same behaviour, same mental state
Functionalism:
- Mental states are explicable in terms of their typical causes and effects.
- Different states in different organisms can realise the same causal role (multiple realisability).
- Beliefs, desires, sensations, etc. are identified either with their functional role (e.g. the causal chain: bodily injury → PAIN → crying) or with whatever physical state realises that role (e.g. C-fibres firing).
- Same functional role, same mental state
Wittgenstein's Beetle in the Box analogy:
Everyone has a private box which only they can see into, and all assert that a beetle (a sensation) is inside.
Since we cannot see inside another person's box, we cannot be sure we are referring to the same thing when we say 'beetle'.
The contents of the box (the sensation) play no part in the language game; their identity is irrelevant to public discourse.
(Though that's not to say there is nothing there.)
Any creature with a mind can be regarded as a Turing machine, reducing mentality to a form of computation.
cf. The Blue Brain project
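The claim that mentality is a form of computation can be made concrete: a Turing machine is nothing but a finite table of (state, symbol) → (next state, symbol to write, head move) rules acting on a tape. A minimal sketch (the particular machine and rule table below are illustrative, not from the source):

```python
# Minimal Turing machine: a finite transition table plus an unbounded tape.
# This example machine flips every bit on the tape, then halts at the blank.

def run_turing_machine(tape, rules, state="start", blank="_"):
    """Run a transition table until the machine enters the 'halt' state."""
    tape = dict(enumerate(tape))  # sparse tape: position -> symbol
    pos = 0
    while state != "halt":
        symbol = tape.get(pos, blank)
        state, tape[pos], move = rules[(state, symbol)]
        pos += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape)).strip(blank)

# (state, read symbol) -> (next state, write symbol, head move)
flip_rules = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}

print(run_turing_machine("0110", flip_rules))  # -> 1001
```

On the machine-functionalist picture, what makes a state mental is its place in such a transition table, whatever the hardware (or wetware) that runs it.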
Searle's Chinese Room thought experiment:
pr1) If Strong AI (the view that suitably programmed computers can understand natural language and have other mental capacities) is true, then there is a programme for Chinese such that, if any computer system runs that programme, it thereby comes to understand Chinese.
pr2) I could run a programme for Chinese without thereby coming to understand Chinese (e.g. by following the rulebook in the room).
C1) Therefore Strong AI is false.
- Syntax (the formal structure of a string of symbols) is not by itself sufficient for, nor constitutive of, semantics (the meaning of a string of symbols).
- Human minds have mental contents (semantics), whereas computer programmes are purely formal (syntactic).
- Searle doesn't claim that a computer could never think, just that understanding depends on more than implementing a programme (more than mere input-output relations).
- Systems reply: the person may not understand Chinese, but the room/system as a whole does.
- Searle responds: even if the person internalised the whole system (making it a part of him), he still wouldn't understand.
- However: what exactly is 'understanding'?
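The syntax/semantics point can be illustrated with a toy "rulebook": the code below matches input symbol strings against a lookup table and emits the paired output symbols, just as the person in the room does. Nothing in the matching process involves what the symbols mean (the table entries here are invented for illustration):

```python
# A toy "Chinese room": pure symbol manipulation via a lookup table.
# The program pairs input strings with output strings; at no point does
# the matching operation involve the meanings of the symbols.

RULEBOOK = {
    "你好吗": "我很好",          # illustrative entries; the pairing is arbitrary
    "你叫什么名字": "我叫房间",
}

def room(symbols: str) -> str:
    """Return whatever output string the rulebook pairs with the input."""
    return RULEBOOK.get(symbols, "对不起")  # fixed fallback symbol string

print(room("你好吗"))  # emits the paired symbols, "understanding" nothing
```

The programme's input-output behaviour could look fluent to an outside observer, which is exactly why Searle insists that such behaviour (syntax) cannot by itself constitute understanding (semantics).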
Block's Conceivability Argument against functionalism:
pr1) It is conceivable that part of your brain could be replaced by homunculi carrying out the same role (the Homunculi thought experiment).
pr2) Any machine whose inner states realise the same functional role as your brain states would be in the same mental states as you (functionalism).
pr3) These homunculi would not have any qualitative states (qualia).
C1) Therefore functionalism is false.
- Even if this and the Chinese Nation thought experiment aren't physically possible, they are both metaphysically possible.