• FlowVoid@midwest.social · 1 year ago

    The whole point of the Chinese room is that it doesn’t need anything “dedicated to creating the experience of consciousness”. It can pass the Turing test perfectly well without such a component. Therefore passing the Turing test - or any similar test based solely on algorithmic output - is not the same as possessing consciousness.
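
    As a rough illustration (a toy sketch, not Searle’s actual setup; the rules and phrases are invented), the room does nothing more than this kind of symbol lookup, and the lookup alone is enough to produce convincing output:

    # Toy "Chinese room": every reply is produced by matching the input
    # symbols against a fixed rule book. Nothing here is dedicated to
    # experience or understanding; it is pure symbol manipulation.
    RULE_BOOK = {
        "你好吗？": "我很好，谢谢。",          # "How are you?" -> "I'm fine, thanks."
        "天空是什么颜色？": "天空是蓝色的。",  # "What colour is the sky?" -> "The sky is blue."
    }

    def room_reply(symbols: str) -> str:
        # The operator just looks up the symbols and copies out the answer.
        return RULE_BOOK.get(symbols, "请再说一遍。")  # "Please say that again."

    print(room_reply("你好吗？"))  # 我很好，谢谢。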

    • lloram239@feddit.de · 1 year ago

      The problem with the Chinese room thought experiment is that it does not show that, at all, not even a little bit. The thought experiment is nothing more than a stupid magic trick that relies on humans assuming humans are the only things in the universe capable of understanding. Thus when the human in the room is revealed not to understand anything, the conclusion is supposed to be that there is no understanding anywhere near the room.

      But that’s a stupid argument. It does not answer the question of whether the room understands or not. Quite the opposite: since the room by definition passes every test we can throw at it, the only logical conclusion should be that it understands. Any other claim is not supported by the argument.

      For the argument to be meaningful, it would have to clearly define “understand”, “consciousness” and all the other aspects of human intelligence, and show how the room fails them. But the thought experiment does not do that. It just hopes that you buy into the premise because you already believe it.

      • FlowVoid@midwest.social · 1 year ago

        “The room understands” is a common counterargument, and it was addressed by Searle, who proposed that a person memorize the contents of the book.

        And while the room passes the Turing test, that does not mean it “passes all the tests we can throw at it”. Here is one test that it would fail: it contains various components that respond to the word “red”, but no single component dedicated to “red” that responds to every use of the word. This level of abstraction is part of what we mean by understanding. Internal representation matters.
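
        Roughly the distinction, as a toy sketch with made-up rules (hypothetical names, just to illustrate): the first table mentions “red” in several unrelated entries but has no single place standing for the concept, while the second routes every use of the word through one shared representation.

        # Scattered rules: "red" appears in several lookup entries, but there is
        # no single component you could point to as "the concept red".
        SCATTERED_RULES = {
            "what colour is a stop sign": "red",
            "is red a warm colour": "yes",
            "what colour is blood": "red",
        }

        # Dedicated representation: one structure stands for "red", and every
        # question involving the word is answered through that same component.
        RED_CONCEPT = {"name": "red", "warm": True, "examples": ["stop sign", "blood"]}

        def unified_reply(question: str) -> str:
            # Any use of the word "red" is routed through RED_CONCEPT.
            if "red" in question.lower():
                return f"answered via the shared representation: {RED_CONCEPT}"
            return "no concept found"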

        • lloram239@feddit.de · 1 year ago

          it was addressed by Searle, who proposed that a person memorize the contents of the book.

          It wasn’t addressed, he just added a layer of nonsense on top of a non-working thought experiment. A human remembering and executing rules is no different from reading those rules in a book. A human does not understand the rules just because he remembers them. Human intuitive understanding works at a completely different level from the manual execution of mechanical rules.

          it contains various components that respond to the word “red”, but no single component dedicated to “red” that responds to every use of the word.

          Not getting it.