
Searle's Chinese Room Argument

 

What's the setup?

    Locked in a room

    Given a first list of Chinese symbols (a database)

    Given a set of English rules for transforming Chinese symbols from the first two lists

    A second list of Chinese symbols is slid under the door

 

   In order to produce a third list of Chinese symbols,

       which is then slid under the door

 

This is meant to be a direct analogy to Turing machines (computers)
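As a minimal sketch of that analogy (the rule table and symbol strings below are invented for illustration, not Searle's examples), the room's behavior is a pure lookup over symbol shapes:

```python
# The Chinese Room as pure symbol manipulation: incoming squiggles are
# matched against a rule book and the dictated squiggles are emitted.
# Rules and symbols here are hypothetical, not from Searle's paper.

RULES = {
    "你好吗": "我很好",      # question -> canned reply (unknown to the room)
    "这是什么": "一个杯子",  # another rule in the English rule book
}

def chinese_room(symbols_under_door: str) -> str:
    """Apply the rule book to the input symbols. The function operates
    on symbol shapes only; no meaning is ever consulted."""
    return RULES.get(symbols_under_door, "不知道")  # default reply

print(chinese_room("你好吗"))  # emits 我很好 with no understanding involved
```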

 

Main Argument

 

The act of following the rules (instantiating a program) is not sufficient for understanding (intentionality)

 

Why?

    The person in the Chinese room does not understand Chinese, he just follows a program

 

What's missing is meaning.

 

 

 

What is meaning? 

    one thing represents another thing

       a flag represents a country

       the word/character "cup" represents the object cup

       symbolic representation: stands for, points to, indicates

 

Semantics

 

    Reference: one thing "refers" to another

        Nouns at least have this

        maybe we can refer to concepts (Platonic forms for verbs?)

    Sign vs. Symbol

          Natural Signs

             "smoke means fire"

          Difference: a symbol involves "intentionality", a natural sign does not

 

    Context

        wide and narrow content

 

    Activity/Imperative

 

 

    Associations

 

       semantic networks (see the sketch below)

       associationism (a version of connectionism)
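A semantic network can be sketched as a small graph of labeled associations; the particular concepts and link labels below are invented for illustration:

```python
# Toy semantic network: concepts are nodes, associations are labeled edges.
# On the associationist picture, a concept's meaning is fixed by the
# relations it stands in, not by anything intrinsic to the node itself.

network = {
    "cup":    [("is-a", "container"), ("holds", "coffee")],
    "coffee": [("is-a", "drink"), ("associated-with", "morning")],
}

def associations(concept):
    """Return the labeled links a concept participates in."""
    return network.get(concept, [])

print(associations("cup"))  # [('is-a', 'container'), ('holds', 'coffee')]
```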

 

 

 

Intentionality

    Semantics

    Mental Content

    Intentions

    Aboutness

    Propositional Attitudes

       relation to a concept or proposition

 

       proposition P = "X is Y"

       agent A

       relation b (belief) such that "A b P"

       relation d (doubt) such that "A d P"

       (see the sketch after this list)

 

 

    Language of Thought

       mental contents are sentences (propositions) in the head

       we can take different attitudes towards them
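Here is a minimal sketch of the "A b P" notation above, treating attitudes as relations between agents and propositions; the agents and the example proposition are hypothetical:

```python
# Propositional attitudes as relations between an agent and a proposition,
# following the "A b P" / "A d P" notation in the notes above.

P = ("snow", "is", "white")      # proposition P: "X is Y"

believes = set()                 # relation b (belief)
doubts = set()                   # relation d (doubt)

A = "Alice"                      # agent A
believes.add((A, P))             # A b P: Alice believes that snow is white
doubts.add(("Bob", P))           # Bob d P: Bob doubts the same proposition

# Different agents can take different attitudes toward one proposition:
print((A, P) in believes)        # True
print(("Bob", P) in believes)    # False
```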

 

 

How do mental representations achieve representation? 

How do thoughts get their contents?

 

Why can't programs (computers) alone have intentionality?

    No causal connection

    No beliefs (of course)

    No access to semantic content of a symbol

    Only syntactic form, no semantic content

 

 

 

 

Causal Structures

    What does that mean?     

 

The claim is not that the mind is not a program, but that it must be something more.

 

That something is a specific physical causal structure,

 

so that computer programs alone don't count.

    Why?

    multiple realizability (see the sketch below)

    the computer (hardware) does not matter, just the structure of the program
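Multiple realizability can be sketched as one abstract rule table realized by two structurally different implementations (both hypothetical), which behave identically from the outside:

```python
# Multiple realizability, sketched: one abstract program, two different
# realizations. Only the program's structure matters to its behavior,
# which is why a program by itself can't pin down a physical causal
# structure. The rules are the same invented examples used above.

def room_as_table(symbol: str) -> str:
    """Realization 1: the rules stored as a lookup table."""
    rules = {"你好吗": "我很好"}
    return rules.get(symbol, "不知道")

def room_as_branches(symbol: str) -> str:
    """Realization 2: the same rules hard-wired as branches."""
    if symbol == "你好吗":
        return "我很好"
    return "不知道"

# The two realizations are behaviorally indistinguishable:
assert room_as_table("你好吗") == room_as_branches("你好吗")
```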

 

 

Language can be a reflex

 

 

Tricks

 

Searle gets leverage from language

 

Homunculus theory: the little man in the head

 

 

 

Strong AI vs. Weak AI

    Strong AI: running the right program is sufficient for a mind (Searle's target)

    Weak AI: programs are merely useful tools for studying the mind

 
