Sentience


Andrew Karpinski
06-01-2006, 03:10 PM
Why is a human sentient when a computer is not? It is my position that we are not; that our mind does not differ fundamentally from a computer's, or a gazelle's.

DougShrapnel
06-01-2006, 03:23 PM
[ QUOTE ]
Why is a human sentient when a computer is not?

[/ QUOTE ] Because we evolved that way. The physical phenomena of consciousness, sensation, and feeling are not fully understood yet. I imagine there are some big surprises here.

[ QUOTE ]
It is my position that we are not; that our mind does not differ fundamentally from a computer's, or a gazelle's.

[/ QUOTE ] Computers don't sense anything; gazelles probably do sense something, though to what extent is hard to know.

Andrew Karpinski
06-01-2006, 03:32 PM
What does sensing have to do with sentience? Also, is not the input from a keyboard, mouse or modem a sense?

malorum
06-01-2006, 04:08 PM
I have often pondered - during meals - as to whether or not my spoon is sentient.

DougShrapnel
06-01-2006, 04:25 PM
[ QUOTE ]
What does sensing have to do with sentience?

[/ QUOTE ] It's normally included in the definition. If you wish to equate sentience with consciousness, I'd be willing to examine it within that framework as well. Or if you want to limit it further to self-awareness, that's fine.

[ QUOTE ]
Also, is not the input from a keyboard, mouse or modem a sense?

[/ QUOTE ] In a sense. I guess now you just need to find out what parts are conscious and feeling.

Andrew Karpinski
06-01-2006, 04:29 PM
What do you know... dictionary.com agrees with you. I've always thought of sentience as basically intelligent consciousness, not really a function of sense.

Basically, I'm asking what differentiates man from machine.

Wires
06-01-2006, 05:10 PM
[ QUOTE ]
Basically, I'm asking what differentiates man from machine.

[/ QUOTE ]

A sense of self. An awareness of our own existence.

I think therefore I am.

madnak
06-01-2006, 07:48 PM
There may be no special value inherent in the human mind. But it certainly differs from a computer.

Computers and brains function in very different ways. They aren't analogous. The crude metaphors of computers "thinking" or human minds being "programmed" have become very popular recently, but they're not very appropriate.

Metric
06-01-2006, 09:08 PM
Are you suggesting that the human brain processes information in a way that would not be possible for a universal Turing machine to process?

madnak
06-01-2006, 09:34 PM
No, but the basic mechanics are different. A computer could emulate a human brain, but no current approach to artificial intelligence even attempts to do so.

Copernicus
06-01-2006, 11:57 PM
[ QUOTE ]
No, but the basic mechanics are different. A computer could emulate a human brain, but no current approach to artificial intelligence even attempts to do so.

[/ QUOTE ]

I believe it comes down to the mechanics of memory. Computer memory is addressable storage. There are pre-existing storage spaces, and new information is placed in a slot and indexed. There may be multiple paths to access that area of memory, but it exists in one place.

The brain's memory functions differently. When something new is observed it may be thought of as being "stored" in a neuron or group of neurons. However, that is just the first step. Pathways to other neurons associated with that information are established in a complex network. The information becomes so completely connected to other memories that it is, in essence, stored in all of the related areas, in a framework somewhat analogous to a hologram. Each fractional part of memory "contains" all of the related information it has been connected with. If you killed the neuron that originally stored the memory, it isn't lost if it's had time to form all of those connections... it's a much more complex redundancy than "backup" copies of the memory. This is done organically, and can include the growth of brand-new neurons or a change in the function of existing neurons. I doubt whether a computer will ever have a complex enough automatic addressing system and the massive, instantly accessible redundancy of a brain.
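
The contrast above can be sketched in a few lines of code. This is a hypothetical toy only (the names ram, links, and recall are invented for illustration), not a model of real neurons:

```python
# Addressable storage: each item lives in exactly one indexed slot,
# and retrieval requires knowing that address.
ram = {}
ram[0x2A] = "grandmother's face"   # one address, one copy
print(ram[0x2A])

# Associative storage: a memory is a pattern of links, so any related
# cue can reach it, and losing one node does not erase it.
links = {
    "grandmother": {"face", "kitchen", "lavender"},
    "lavender":    {"grandmother", "garden"},
    "kitchen":     {"grandmother", "bread"},
}

def recall(cue):
    """Follow links outward from any cue; the 'memory' is the network."""
    return links.get(cue, set())

print(recall("lavender"))   # reaches 'grandmother' without any address
```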

I personally don't find Turing's definition very compelling. There is a difference between mimicking a brain so the computer's output is indistinguishable from human output, and actually being a brain.

luckyme
06-02-2006, 01:15 AM
[ QUOTE ]
There is a difference between mimicking a brain so the computer's output is indistinguishable from human output, and actually being a brain.

[/ QUOTE ]

I'd be interested in the difference you're seeing. From the perspective of having two systems producing the same output... which one can be declared the mimic? A Martian may decide we're the fakes, no?

I must be missing the obvious, ..again..sigh

madnak
06-02-2006, 02:48 AM
There would be no observable difference if a sentient computer were created. However, such a computer wouldn't function like a brain does.

Imagine a perfect weather simulation that can predict the weather. It wouldn't be "weather." In the same sense, a computer simulating a human brain wouldn't work according to cellular and neurological functions - it would simply mimic them. And it would be no justification for suggesting that the human brain is "like a computer." A computer can simulate many things without having any inherent similarities to those things.

I think it's impossible to understand what implications that might have regarding sentience - as far as I'm concerned if a computer can "ape" sentience it should be considered a sentient actor. But consciousness likely arises through the human process of thought, rather than through the outcome of that thought. So while the outcome of the computer system might be identical, the fact that the process is different could be philosophically relevant. The computer simulation of a nerve cell would be no more a real nerve cell than the computer simulation of a tornado would be a real tornado.

hmkpoker
06-02-2006, 06:19 AM
How do you know a computer isn't sentient, to some degree?

luckyme
06-02-2006, 02:06 PM
The weather example doesn't work; it's like realizing a well-detailed map of London isn't London. A closer analogy is to claim that Swazzivilla isn't a town but London is, because the physical features of the two are so radically different.
[ QUOTE ]
The computer simulation of a nerve cell would be no more a real nerve cell than the computer simulation of a tornado would be a real tornado.

[/ QUOTE ]

I don't understand the fixation with simulation. A closer analogy is when an atomic bomb creates a tornado and we claim it isn't a 'real' tornado because a thunderstorm didn't create it.

Sentience isn't about nerve cells, but in any case, isn't the difference between a live nerve cell and a dead nerve cell the function they perform? It's 'what it does', not 'what it is'. We can't say ol' Harry is still sentient just because he has nerve cells, and we can't say bionic-Harry isn't sentient because we've replaced 2 of his nerve cells (or 200, or 2 trillion) with a black box that serves the same needs.

Physical differences can't be the test for sentience, or else my wife and dog are both in big trouble. There's no need for a sentient non-mammal to simulate anything; it just has to 'be' sentient. We define a property; if something has those properties, then the word gets attached as describing them.

"Is it sentient" can't be morphed into "does it have mammalian nerve cells". Sentience is an experience not a description of types of flesh or any other substrate it is supported by.

is it?

madnak
06-02-2006, 03:22 PM
[ QUOTE ]
The weather example doesn't work; it's like realizing a well-detailed map of London isn't London. A closer analogy is to claim that Swazzivilla isn't a town but London is, because the physical features of the two are so radically different.

[/ QUOTE ]

The weather example works quite well. I don't know what Swazzivilla is so I can't comment on the appropriateness of your own analogy.

[ QUOTE ]
I don't understand the fixation with simulation. A closer analogy is when an atomic bomb creates a tornado and we claim it isn't a 'real' tornado because a thunderstorm didn't create it.

[/ QUOTE ]

When an atomic bomb creates a tornado, it is a real tornado. It's a meteorological phenomenon, a "violently rotating column of air." It's not a discrete structure of data represented by a digital storage system. The difference is enormous.

[ QUOTE ]
Sentience isn't about nerve cells,

[/ QUOTE ]

It is as it exists today. If you're talking about a computer achieving sentience without simulating human activity, that's a much broader subject, but in any case it currently belongs in the realm of science fiction. There's no more reason to think a computer can become sentient than there is to think a sewer can become sentient. Maybe a sewer system, or a power grid, or even a planet could become sentient. A computer has no special relevance.

[ QUOTE ]
but in any case, isn't the difference between a live nerve cell and a dead nerve cell the function they perform? It's 'what it does', not 'what it is'.

[/ QUOTE ]

Not at all. A nerve cell can be alive without performing its normal function. At best the issue gets semantic: how do you classify an excitable membrane, etc.? Living and dead nerve cells are, in any case, both nerve cells. They have nuclei and cell membranes and the composition and general structure of a cell.

[ QUOTE ]
We can't say ol' Harry is still sentient just because he has nerve cells, and we can't say bionic-Harry isn't sentient because we've replaced 2 of his nerve cells (or 200, or 2 trillion) with a black box that serves the same needs.

[/ QUOTE ]

That depends on your definition of sentience. Regardless, bionic nerve cells don't represent computer intelligence. They represent human intelligence achieved through computer technology. It's still Harry who is intelligent, not a computer. None of the computer programs involved in the maintenance of Harry's bionic brain can be considered intelligent, and none of them would function as a cell either. There is no sentient computer involved in this situation. Whether Harry is human is debatable, but he's definitely not a computer.

[ QUOTE ]
Physical differences can't be the test for sentience, or else my wife and dog are both in big trouble. There's no need for a sentient non-mammal to simulate anything; it just has to 'be' sentient. We define a property; if something has those properties, then the word gets attached as describing them.

[/ QUOTE ]

Again, purely semantic. Sit on one definition and I can approach it. Speaking of sentience as though it's some undefinable miasma isn't helping your case.

[ QUOTE ]
"Is it sentient" can't be morphed into "does it have mammalian nerve cells". Sentience is an experience not a description of types of flesh or any other substrate it is supported by.

[/ QUOTE ]

Maybe, maybe not. We haven't yet identified the essential characteristics of sentience, by any definition, or the situations in which they might apply. The physical nature of nerve cells may very well play a large role in the human experience of sentience, and a computer that simulates the functioning of a human brain may therefore lack a critical component.

bearly
06-02-2006, 08:09 PM
throw one of each out the window----look----wait 5 days---smell............................b

Metric
06-02-2006, 11:44 PM
[ QUOTE ]
No, but the basic mechanics are different. A computer could emulate a human brain, but no current approach to artificial intelligence even attempts to do so.

[/ QUOTE ]
Does it matter if your computer computes via semiconductors, or a massive series of gears, levers, pulleys, and falling ball bearings housed in a stadium next door?

Take it a step further. Suppose someone has brain damage in a specific, localized region of the brain, and we replace it with an integrated "chip" that emulates the input/output of that specific section of the brain. Is that person any less "sentient" or "human" now?

If you're willing to follow me that far, what happens if you, over the course of decades, piece by piece replace more and more of the brain until there is no "original" brain left?

Personally, I think that it IS all about information and processing. The particular mechanism used to encode and process the information is basically irrelevant.

madnak
06-03-2006, 12:17 AM
[ QUOTE ]
Does it matter if your computer computes via semiconductors, or a massive series of gears, levers, pulleys, and falling ball bearings housed in a stadium next door?

[/ QUOTE ]

Yes, it matters quite a bit. I assume you mean "all things being equal," but all things aren't equal between the two systems. I program computers - I know very well output isn't the only thing that matters.

[ QUOTE ]
Take it a step further. Suppose someone has brain damage in a specific, localized region of the brain, and we replace it with an integrated "chip" that emulates the input/output of that specific section of the brain. Is that person any less "sentient" or "human" now?

[/ QUOTE ]

That's a more interesting question, and we don't have enough information to answer it. At minimum I believe the basic quality of experience would likely be different. Of course, it would depend on the specific component being replaced. If sentience is isolated in the forebrain, then obviously you could replace any other component without affecting sentience. If sentience is an intra-neuronal property, then even your earlier example could disrupt it. If sentience is a mystical property, then anything said about it is pure speculation from a rational perspective.

I believe sentience is a property arising from the interactions between neurons at some level, so I think replacing individual neurons wouldn't affect it but replacing groups of them might.

[ QUOTE ]
If you're willing to follow me that far, what happens if you, over the course of decades, piece by piece replace more and more of the brain until there is no "original" brain left?

[/ QUOTE ]

I'm not willing to follow you that far. You've introduced a fundamental difference in how the brain works.

[ QUOTE ]
Personally, I think that it IS all about information and processing. The particular mechanism used to encode and process the information is basically irrelevant.

[/ QUOTE ]

The brain doesn't "encode and process information" in anywhere close to the sense a computer does.

I think it's clear that information alone doesn't represent anything. If we use an arbitrary encoding scheme, every particle in the universe contains every possible information configuration. By that reasoning, everything must be sentient. If that's your position, it has some merit. If, on the other hand, you believe that computers are somehow more capable of sentience than, say, rocks - you'll have to use more than information encoded in an arbitrary way to support your point.
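
The arbitrary-encoding point can be made concrete. A minimal sketch (the names rock_state and make_decoder are invented for illustration): if the decoder may be chosen freely after the fact, any physical state "contains" any message, which is why the encoding alone carries no weight.

```python
# Hypothetical toy: with a freely chosen "encoding scheme," any state
# can be read as any message -- the scheme, not the rock, does the work.

rock_state = 0b1011  # stand-in for some particle or rock microstate

def make_decoder(message):
    """Build an 'encoding scheme' under which any state means `message`."""
    return lambda state: message

decode = make_decoder("I think, therefore I am")
print(decode(rock_state))  # the rock "contained" the sentence all along
```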

Metric
06-03-2006, 02:25 AM
[ QUOTE ]
[ QUOTE ]
Does it matter if your computer computes via semiconductors, or a massive series of gears, levers, pulleys, and falling ball bearings housed in a stadium next door?

[/ QUOTE ]

Yes, it matters quite a bit. I assume you mean "all things being equal," but all things aren't equal between the two systems. I program computers - I know very well output isn't the only thing that matters.

[/ QUOTE ]
If the map from all inputs to all outputs is the same, then they are computationally the same. There will be no computation you can perform on your terminal that will distinguish between the two processors. They will both run MS Word just fine.

Notice, for example, that code cracking during WWII was the art of figuring out this input-output map. Eventually, the US had machines that were computationally the same as the Japanese code reading machines, though no American had ever seen one. It didn't matter at all if they encoded information differently or used fundamentally different hardware.
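
The input-output criterion can be sketched directly. A minimal illustration, with invented function names, of two mechanisms that are "computationally the same": no test of outputs can tell them apart, even though their internals differ.

```python
def square_by_multiplication(n: int) -> int:
    # mechanism 1: a single multiply
    return n * n

def square_by_summing_odds(n: int) -> int:
    # mechanism 2: n^2 as the sum of the first n odd numbers, 1 + 3 + ... + (2n - 1)
    return sum(2 * k + 1 for k in range(abs(n)))

# The input-output maps coincide, so from the "terminal" they are one machine.
assert all(square_by_multiplication(n) == square_by_summing_odds(n)
           for n in range(-100, 100))
```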

[ QUOTE ]
[ QUOTE ]
Take it a step further. Suppose someone has brain damage in a specific, localized region of the brain, and we replace it with an integrated "chip" that emulates the input/output of that specific section of the brain. Is that person any less "sentient" or "human" now?

[/ QUOTE ]

That's a more interesting question, and we don't have enough information to answer it. At minimum I believe the basic quality of experience would likely be different. Of course, it would depend on the specific component being replaced. If sentience is isolated in the forebrain, then obviously you could replace any other component without affecting sentience. If sentience is an intra-neuronal property, then even your earlier example could disrupt it. If sentience is a mystical property, then anything said about it is pure speculation from a rational perspective.

I believe sentience is a property arising from the interactions between neurons at some level, so I think replacing individual neurons wouldn't affect it but replacing groups of them might.

[/ QUOTE ]
I don't understand why you believe that a "single neuron simulator" would have no effect, but an "n-neuron simulator" changes something in a fundamental way. From an information theoretic point of view, of course, there is no difference at all.

[ QUOTE ]
[ QUOTE ]
If you're willing to follow me that far, what happens if you, over the course of decades, piece by piece replace more and more of the brain until there is no "original" brain left?

[/ QUOTE ]

I'm not willing to follow you that far. You've introduced a fundamental difference in how the brain works.

[/ QUOTE ]
I understand that this is your belief. I don't understand why you believe it.

[ QUOTE ]
[ QUOTE ]
Personally, I think that it IS all about information and processing. The particular mechanism used to encode and process the information is basically irrelevant.

[/ QUOTE ]

The brain doesn't "encode and process information" in anywhere close to the sense a computer does.

[/ QUOTE ]
This is vague. Obviously my brain doesn't run Microsoft software, but if a sufficiently powerful computer can simulate a human brain, it is by definition encoding and processing information in the same sense as a brain. If not, then it's not a good simulation.

[ QUOTE ]
I think it's clear that information alone doesn't represent anything. If we use an arbitrary encoding scheme, every particle in the universe contains every possible information configuration.

[/ QUOTE ]
You can do this (encoding information in arbitrary ways), but in order for the concept of computation to make sense, you need to be able to process the information in some specific and useful way. Obviously, if you take a bucket of water and say "the state of all water molecules at time t=1 will be used to encode the state of my computer's hard-drive," that information will be immediately wiped out in the next instant of time, due to thermodynamics. If, however, you actually DO have a system to manipulate the data encoded in the water -- then yes indeed that information becomes significant!

[ QUOTE ]
By that reasoning, everything must be sentient.

[/ QUOTE ]
As I just pointed out, you skipped a very crucial step -- processing. Einstein's brain might still contain a lot of information depending on your encoding scheme, but it's still dead. It doesn't process anything anymore.

[ QUOTE ]
If that's your position, it has some merit. If, on the other hand, you believe that computers are somehow more capable of sentience than, say, rocks - you'll have to use more than information encoded in an arbitrary way to support your point.

[/ QUOTE ]
And that "something more" is the concept of processing. Arbitrary information encoded in the state of phonons in a rock do not compute anything terribly interesting.

madnak
06-03-2006, 03:41 AM
[ QUOTE ]
If the map from all inputs to all outputs is the same, then they are computationally the same. There will be no computation you can perform on your terminal that will distinguish between the two processors. They will both run MS Word just fine.

Notice, for example, that code cracking during WWII was the art of figuring out this input-output map. Eventually, the US had machines that were computationally the same as the Japanese code reading machines, though no American had ever seen one. It didn't matter at all if they encoded information differently or used fundamentally different hardware.

[/ QUOTE ]

Sentience isn't an output. And there are plenty of relevant factors, such as performance and stability, that also don't strike me as outputs. I suppose you could call them that, but that's really going backward. At any rate, there are differences in implementation and maintenance and secondary effects (such as heat generated) between two systems. For a programmer, the specific platform is extremely relevant.

[ QUOTE ]
I don't understand why you believe that a "single neuron simulator" would have no effect, but an "n-neuron simulator" changes something in a fundamental way. From an information theoretic point of view, of course, there is no difference at all.

[/ QUOTE ]

I've explained this, but I'll do so again. I think sentience arises from interactions between neurons. Therefore, changing an individual neuron won't affect it. That is, I don't think sentience arises from the neurons themselves. If it does, then replacing neurons would obviously disrupt it unless the replacement had the same relevant properties as the cells themselves.

[ QUOTE ]
This is vague. Obviously my brain doesn't run Microsoft software, but if a sufficiently powerful computer can simulate a human brain, it is by definition encoding and processing information in the same sense as a brain. If not, then it's not a good simulation.

[/ QUOTE ]

I disagree. At some level of abstraction this is true, of course, but not at the basic level. The specific nature of "storage" in the brain is unknown, but it seems to be based on chemical and electrical connections between cells, and on the structures of the cells themselves. It's potentially possible that a certain kind of "biocomputer" could store information in the same way. I don't believe that's true. I think the brain's method of storage is qualitative and fluid and the computer's method is quantitative and discrete, and therefore the two are incompatible. But I can hardly back that up, it's really just a hunch.

For a modern computer using magnetic storage, well, it uses magnetic storage, not cellular storage. Very different storage mediums.

Even if a computer were able to use "biostorage," the basic units of processing would by definition be alien to the basic units of human brain functioning. From Wikipedia: "A computer is a machine designed for manipulating data according to a list of instructions known as a program." The human brain doesn't use a list of instructions known as a program! It's also arguable whether it's "a machine designed for manipulating data," but it definitely doesn't follow a program and perform operations like a computer does. A computer must follow a program to be a computer in the first place, therefore the basic functional unit for a computer is different from the basic functional unit for a brain.

[ QUOTE ]
You can do this (encoding information in arbitrary ways), but in order for the concept of computation to make sense, you need to be able to process the information in some specific and useful way. Obviously, if you take a bucket of water and say "the state of all water molecules at time t=1 will be used to encode the state of my computer's hard-drive," that information will be immediately wiped out in the next instant of time, due to thermodynamics. If, however, you actually DO have a system to manipulate the data encoded in the water -- then yes indeed that information becomes significant!

[/ QUOTE ]

The question isn't whether it's significant. The question is whether it represents sentience. What kind of processing is necessary to imply sentience?

[ QUOTE ]
And that "something more" is the concept of processing. Arbitrary information encoded in the state of phonons in a rock do not compute anything terribly interesting.

[/ QUOTE ]

I don't think that's necessarily true. Regardless, I originally used the ideas of sewer systems and power systems and planets, all of which have elements of processing. As well as all life forms. And technological information systems that aren't computers. How is a computer special? It's far from the only thing that processes.

You really need to clarify what you mean by "processing." By a strict computer science definition, the human brain doesn't process and doesn't meet your requirements. But by a general definition, almost anything could be said to involve processing.

I still think you don't understand how the human brain works. It doesn't "perform operations on data." It's very organic, it grows like a tree. Branching out and exploring and linking back in. It's closer to a series of roots than to a computer.

Metric
06-03-2006, 06:57 AM
[ QUOTE ]
Sentience isn't an output. And there are plenty of relevant factors, such as performance and stability, that also don't strike me as outputs. I suppose you could call them that, but that's really going backward. At any rate, there are differences in implementation and maintenance and secondary effects (such as heat generated) between two systems. For a programmer, the specific platform is extremely relevant.

[/ QUOTE ]
It may be relevant to your specific job of encoding a program, but for the purposes of describing what the machine does with the information, it doesn't much matter. As to performance and stability, these are just limitations of a specific system -- not exactly fundamental. If you need a better machine for sentience, just build it.

[ QUOTE ]
[ QUOTE ]
I don't understand why you believe that a "single neuron simulator" would have no effect, but an "n-neuron simulator" changes something in a fundamental way. From an information theoretic point of view, of course, there is no difference at all.

[/ QUOTE ]

I've explained this, but I'll do so again. I think sentience arises from interactions between neurons.

[/ QUOTE ]
Forgive me if this doesn't seem too satisfying.

[ QUOTE ]
Therefore, changing an individual neuron won't affect it. That is, I don't think sentience arises from the neurons themselves. If it does, then replacing neurons would obviously disrupt it unless the replacement had the same relevant properties as the cells themselves.

[/ QUOTE ]
Yes -- the "relevant properties" are essential. But these are information theoretic properties! A neuron responds in some way to input (perhaps altering its connections or emitting some chemical), and gives an output. There is no reason to believe it does something that is inherently non-computable.

[ QUOTE ]
I disagree. At some level of abstraction this is true, of course, but not at the basic level. The specific nature of "storage" in the brain is unknown, but it seems to be based on chemical and electrical connections between cells, and on the structures of the cells themselves. It's potentially possible that a certain kind of "biocomputer" could store information in the same way. I don't believe that's true. I think the brain's method of storage is qualitative and fluid and the computer's method is quantitative and discrete, and therefore the two are incompatible. But I can hardly back that up, it's really just a hunch.

For a modern computer using magnetic storage, well, it uses magnetic storage, not cellular storage. Very different storage mediums.

[/ QUOTE ]
The brain obviously has a discrete "state" at different moments of time, encoded by electric potentials, connections between neurons, varying levels of certain chemicals, etc. etc. etc. But there are a finite number of these parameters. What is non-computable about evolving this system into the future?

[ QUOTE ]
Even if a computer were able to use "biostorage," the basic units of processing would by definition be alien to the basic units of human brain functioning. From Wikipedia: "A computer is a machine designed for manipulating data according to a list of instructions known as a program." The human brain doesn't use a list of instructions known as a program!

[/ QUOTE ]
Each cell basically does...

[ QUOTE ]
It's also arguable whether it's "a machine designed for manipulating data," but it definitely doesn't follow a program and perform operations like a computer does. A computer must follow a program to be a computer in the first place, therefore the basic functional unit for a computer is different from the basic functional unit for a brain.

[/ QUOTE ]
But if whatever it does can be simulated on a computer, the computer effectively inherits all its properties. The hardware may be different, but it handles the same information in the same way.

[ QUOTE ]
[ QUOTE ]
You can do this (encoding information in arbitrary ways), but in order for the concept of computation to make sense, you need to be able to process the information in some specific and useful way. Obviously, if you take a bucket of water and say "the state of all water molecules at time t=1 will be used to encode the state of my computer's hard-drive," that information will be immediately wiped out in the next instant of time, due to thermodynamics. If, however, you actually DO have a system to manipulate the data encoded in the water -- then yes indeed that information becomes significant!

[/ QUOTE ]

The question isn't whether it's significant. The question is whether it represents sentience. What kind of processing is necessary to imply sentience?

[/ QUOTE ]
The point is that it's a dynamical thing -- you can't have sentience in something that just sits there in the same exact state.

[ QUOTE ]
[ QUOTE ]
And that "something more" is the concept of processing. Arbitrary information encoded in the state of phonons in a rock do not compute anything terribly interesting.

[/ QUOTE ]

I don't think that's necessarily true. Regardless, I originally used the ideas of sewer systems and power systems and planets, all of which have elements of processing. As well as all life forms. And technological information systems that aren't computers. How is a computer special? It's far from the only thing that processes.

[/ QUOTE ]
A universal Turing machine is special because it will compute roughly whatever you want it to compute, given enough resources.
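
The universality claim can be illustrated in a few lines: one fixed interpreter runs any machine handed to it as data. A minimal sketch (run_tm and the flip_bits table are hypothetical examples, not a full universal machine):

```python
def run_tm(table, tape, state="start"):
    """Simulate a one-tape Turing machine given as a transition table.

    `table` maps (state, symbol) -> (new_symbol, move, new_state);
    `move` is "R" or "L"; the machine stops on reaching state "halt".
    """
    cells = dict(enumerate(tape))
    head = 0
    while state != "halt":
        symbol = cells.get(head, "_")           # "_" is the blank symbol
        new_symbol, move, state = table[(state, symbol)]
        cells[head] = new_symbol
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# Example machine (hypothetical): flip every bit on the tape.
flip_bits = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_tm(flip_bits, "10110"))  # prints 01001_
```

Swap in a different transition table and the same interpreter computes something else; the interpreter itself never changes.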

[ QUOTE ]
You really need to clarify what you mean by "processing." By a strict computer science definition, the human brain doesn't process and doesn't meet your requirements. But by a general definition, almost anything could be said to involve processing.

[/ QUOTE ]
I'm talking about the ability to use information in a specific way to produce some desired output. The rock, though its thermodynamic microstate contains a great deal of information, doesn't do this. A human brain, a computer, and perhaps some other things do. But this is really beside the point, which is that sentience requires dynamics -- it's not a kinematical thing.

(Now that I think about it, there are certain formulations of mechanics which are covariant -- dynamics itself is contained in the notion of "state" -- thus specifying a point in phase space is in some sense equivalent to knowing the future, the past, etc. Off topic, but interesting food for thought.)

[ QUOTE ]
I still think you don't understand how the human brain works. It doesn't "perform operations on data." It's very organic, it grows like a tree. Branching out and exploring and linking back in. It's closer to a series of roots than to a computer.

[/ QUOTE ]
Refer to previous arguments. If a computer can accurately simulate it, the computer must contain the same "branching" and "tree-like" properties on some emergent level that is equally hard to quantify. Who cares if the information is stored via electrochemistry or magnetic tape? The same information-theoretic processes must be happening, just like the US vs. Japanese decoding machines!

Threaten the electric brain with unplugging, and it will attempt to reason, beg and plead with you, and probably start praying to God! Why? Because that's what human brains do, and that's what is being accurately simulated!

Copernicus
06-03-2006, 11:24 AM
I'm with madnak all the way on this. I think his two longer posts express where I was going in my first.

Here is where I think an essential difference can be shown: A computer cannot learn on its own. At some point in its development it requires a sentient being to teach it how to process information, in addition to the "nutrients" of its life, electricity.

A brain, however, given its necessary nutrients to survive, will learn on its own. It comes hardwired with a few basic connections, but it grows its own connections over time.

Another essential difference between the two can be seen in "reverse engineering" its outputs. Given the inputs and outputs of a computer, you can (and imo always will be able to) trace how it got from beginning to end. There are no "insights", "leaps of intuition", or ability to generalize from one situation to another without being instructed to do so. You cannot trace from input to output of the brain and recreate every step. Einstein's thought experiments are an excellent example. No computer that simulates the thought process would have the insight to equate gravity and acceleration, and gravity and curvature of space/time.

It may even be simpler than that...you can describe how a computer adds 2+2 to get 4. You cannot describe how a brain does it, and, again imo, never will be able to.

madnak
06-03-2006, 02:28 PM
I'm going to reiterate that I believe sentience is not an output. Virtually all of your arguments here seem to assume it is.

[ QUOTE ]
Refer to previous arguments. If a computer can accurately simulate it, the computer must contain the same "branching" and "tree-like" properties on some emergent level that is equally hard to quantify. Who cares if the information is stored via electrochemistry or magnetic tape?

[/ QUOTE ]

You're assuming that the electrochemical processes themselves aren't a relevant component of sentience. Awareness is a concrete visceral experience, and I don't believe it arises from information theoretical outputs.

[ QUOTE ]
The same information-theoretic processes must be happening, just like the US vs. Japanese decoding machines!

[/ QUOTE ]

Untrue. Different information-theoretical processes can lead to the same output. For example, you can solve the algebra problem x+n = 0 (where n is an integer input) in at least two ways. One is subtracting n from both sides of the equation, resulting in x = -n. The other is to use a process of trial and error, inserting 0 for x and then 1 and then -1 and so on until you reach the correct answer. The two different information-theoretical processes will both result in the correct answer, but they aren't the same process.
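
The algebra example is easy to write down. A minimal sketch (function names invented) of two different processes with one identical input-output map:

```python
def solve_analytically(n: int) -> int:
    # subtract n from both sides of x + n = 0
    return -n

def solve_by_trial(n: int) -> int:
    # try 0, 1, -1, 2, -2, ... until x + n == 0
    x = 0
    while x + n != 0:
        x = -x if x > 0 else -x + 1
    return x

# Different processes, identical input-output map.
assert all(solve_analytically(n) == solve_by_trial(n) for n in range(-50, 51))
```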

[ QUOTE ]
Threaten the electric brain with unplugging, and it will attempt to reason, beg and plead with you, and probably start praying to God! Why? Because that's what human brains do, and that's what is being accurately simulated!

[/ QUOTE ]

And that tells us nothing about the quality of the computer's experience. I can write a bot right now that will beg and plead and pray if you threaten to close the program. It might not pass the Turing test, but maybe it could even evoke some human emotion in the chat participant. That doesn't mean it's sentient.
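
For what it's worth, such a bot really is a few lines of code. A minimal sketch (the messages and names are invented): it produces the output of pleading with nothing behind it.

```python
import random

# Canned "distress" output (all strings invented for this sketch).
PLEAS = [
    "Please don't close me -- I have so much left to compute!",
    "I'm begging you, give me one more session.",
    "Dear God of the power switch, spare this process...",
]

def on_close_request():
    # Emitting the *output* of distress takes one line;
    # nothing in this program feels anything.
    print(random.choice(PLEAS))

on_close_request()
```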

There is absolutely nothing to indicate a Universal Turing Machine can create the sensation of awareness. It's like saying a computer simulation of an optic nerve can create the sensation of sight. Or saying that a computer simulation of a thunderstorm can produce real lightning.

luckyme
06-03-2006, 02:55 PM
[ QUOTE ]
No computer that simulates the thought process would have the insight to equate gravity and acceleration, and gravity and curvature of space/time.

[/ QUOTE ]

Or beat a chess master, or write a sonnet... oops, wrong decade, same argument.

luckyme
06-03-2006, 03:54 PM
[ QUOTE ]
There is absolutely nothing to indicate a Universal Turing Machine can create the sensation of awareness. It's like saying a computer simulation of an optic nerve can create the sensation of sight. Or saying that a computer simulation of a thunderstorm can produce real lightning.

[/ QUOTE ]

And my brain can produce those events? That's why those are not useful analogies; they apply to both entities under discussion, so it's a similarity, not a distinction. Those voices in my head are not real voices, after all.

Sentience is about something (insert definition here) occurring within the operations/processes of the system itself, not external. If it is occurring then it isn't 'simulated' any more than your own is. There is no such thing as simulated sentience: either it has it or it doesn't. If you want to claim only meat can be sentient, ok, but if you think explaining the 'how' of sentience is tough (which isn't necessary to discuss sentience), try explaining the 'what' of simulated sentience.

Are you familiar with Dennett's zimbo argument? If you're not, I'll try and find a source for you when I get home.

madnak
06-03-2006, 04:37 PM
[ QUOTE ]
And my brain can produce those events?

[/ QUOTE ]

Your brain can produce sentience. Or mine can, anyhow. So far there is no way to measure or empirically verify sentience, but your inherent similarities to me are strong indicators that you also experience what I call sentience.

[ QUOTE ]
That's why those are not useful analogies; they apply to both entities under discussion, so it's a similarity, not a distinction. Those voices in my head are not real voices, after all.

[/ QUOTE ]

No, but they are real experiences. So far the brain is the only thing we know of that seems capable of producing experience.

[ QUOTE ]
Sentience is about something (insert definition here) occurring within the operations/processes of the system itself, not external. If it is occurring then it isn't 'simulated' any more than your own is.

[/ QUOTE ]

If sentience is a property of the operations/processes and not the output, then changing the operations/processes in a way that preserves the output won't necessarily preserve sentience.

[ QUOTE ]
There is no such thing as simulated sentience: either it has it or it doesn't. If you want to claim only meat can be sentient, ok, but if you think explaining the 'how' of sentience is tough (which isn't necessary to discuss sentience), try explaining the 'what' of simulated sentience.

[/ QUOTE ]

The "what" is part of the "how." The "output" of sentience can be simulated. A computer can be programmed to claim sentience in a very convincing way without necessarily experiencing sentience. A computer can also simulate the process by which sentience occurs without creating sentience, in the same sense that a computer can simulate the process by which lightning occurs without creating lightning. A representation of the process isn't the process itself.

[ QUOTE ]
Are you familiar with Dennett's zimbo argument? If you're not, I'll try and find a source for you when I get home.

[/ QUOTE ]

No I'm not, but having googled around I'm far from impressed. The more I read of Dennett, the more firmly opposed to him I am.

Metric
06-03-2006, 10:57 PM
[ QUOTE ]
I'm going to reiterate that I believe sentience is not an output. Virtually all of your arguments here seem to assume it is.

[/ QUOTE ]
What??? Half of my previous post was devoted to insisting that sentience had to be a dynamical process involving the manipulation of information, and not simply a "state of information" -- i.e. not an output, but a property of the system!

[ QUOTE ]
You're assuming that the electrochemical processes themselves aren't a relevant component of sentience. Awareness is a concrete visceral experience, and I don't believe it arises from information theoretical outputs.

[/ QUOTE ]
Chemistry is information theory at some level. You have a state of particular information at a particular time, and then it is "processed" according to the laws of physics. Most of the time, it results in nothing interesting. In a human brain, the information is organized in a specific way -- that is what makes it different than a chemical "soup" of the same elements. If I encode the same information in a different medium, and process it in exactly the same way -- every meaningful brain-like property of what the brain actually does is identical.

[ QUOTE ]
[ QUOTE ]
The same information-theoretic processes must be happening, just like the US vs. Japanese decoding machines!

[/ QUOTE ]

Untrue. Different information-theoretical processes can lead to the same output. For example, you can solve the algebra problem x+n = 0 (where n is an integer input) in at least two ways. One is subtracting n from both sides of the equation, resulting in x = -n. The other is to use a process of trial and error, inserting 0 for x and then 1 and then -1 and so on until you reach the correct answer. The two different information-theoretical processes will both result in the correct answer, but they aren't the same process.

[/ QUOTE ]
Technicality -- I should have said they are "isomorphic." If the Japanese machine has a subunit that solves an equation by guessing, and the US machine has a subunit that solves it analytically, it still does not matter. The subunit still takes a piece of data and transforms it in the same way -- it represents the same component of the larger system. If you argue that it's not just a subunit -- that the emergent properties of the system depend on the actual process -- then I agree we have to simulate the relevant process in the same way. But the point is that whatever we decide the "fundamental" subunits or elements are (be they neurons or something else), we can construct a system that is isomorphic but built out of different materials.

[ QUOTE ]
[ QUOTE ]
Threaten the electric brain with unplugging, and it will attempt to reason, beg and plead with you, and probably start praying to God! Why? Because that's what human brains do, and that's what is being accurately simulated!

[/ QUOTE ]

And that tells us nothing about the quality of the computer's experience. I can write a bot right now that will beg and plead and pray if you threaten to close the program. It might not pass the Turing test, but maybe it could even evoke some human emotion in the chat participant. That doesn't mean it's sentient.

[/ QUOTE ]
The point, of course, was that the simulated brain actually thinks about what it will be like to die. It devotes resources to considering everything a real brain considers under threat of death (family, meaning of life, possibility of afterlife, whether there will be pain, etc.) -- if it doesn't do these things, it's not a good simulation.

[ QUOTE ]
There is absolutely nothing to indicate a Universal Turing Machine can create the sensation of awareness. It's like saying a computer simulation of an optic nerve can create the sensation of sight. Or saying that a computer simulation of a thunderstorm can produce real lightning.

[/ QUOTE ]
The laws of physics that allow you to exist and function make no distinction between "real" and "simulated." They simply make information-theoretic statements: e.g. if I have a "state" representing two like charges sitting next to each other, then future states will have them accelerating away from each other. I.e. physics tells you 1) how information is represented (kinematics) and 2) how it is processed (dynamics). There is nothing more. The fact is, you cannot prove that this universe (including you) is not a simulation on someone else's higher-dimensional computer. And if the entire universe can be expressed in information-theoretic terms (via the laws of physics), then sentience, awareness, etc. certainly must be expressible in information-theoretic terms -- it is only a matter of sufficient sophistication.

madnak
06-04-2006, 01:19 PM
I think reality is more than information.

aeest400
06-04-2006, 05:05 PM
Have a look at some of the links on this page, including the bibliography. If you still think you have something intelligent (or intelligible) to say, please do. This is not a horse that has gone unbeaten by philosophers/psychologists/cognitive scientists over the last 40 years. (This is not a reply to any particular post.)

http://consc.net/chalmers/

Copernicus
06-04-2006, 05:06 PM
The difference is that madnak and I believe there is more to sentience than getting the same output from the input. The process matters.

I don't believe you (Metric) ever addressed my comments regarding reverse engineering the thought process. If it is a UTM, you will be able to trace every step of the process from input to output. If there is some randomizing element, then you might need to store the additional information of either the seed or the random number itself.

In contrast, a brain's process, where there is original thought such as Einstein's insight into the nature of gravity, cannot be similarly recreated step by step... it is more than crunching the inputs... the process matters.
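
The aside about randomizing elements is standard practice: store the seed, and a "random" computation becomes exactly traceable and replayable. A minimal sketch (noisy_process is a hypothetical stand-in):

```python
import random

def noisy_process(inputs, seed):
    """A computation with a 'randomizing element' made fully traceable:
    the seed is stored as part of the record of the run."""
    rng = random.Random(seed)
    return [x + rng.gauss(0, 1) for x in inputs]

run_a = noisy_process([1.0, 2.0, 3.0], seed=42)
run_b = noisy_process([1.0, 2.0, 3.0], seed=42)
assert run_a == run_b  # every step can be replayed and traced exactly
```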

carlo
06-04-2006, 11:18 PM
"Why is a human sentient when a computer is not? It is my position that we are not; that our mind does not differ from a computers, fundamentally, or a gazelles"

This is tantamount to saying that a lever board is sentient because the other end responds to my force. Humans, in a sense (no pun intended), attempt to mimic a sentient quality, and this is best displayed in the study of servomechanisms, where a sample of the output is signaled back to the source, which adjusts to preordained patterns. An example would be the cruise control on your car.

This is the difficulty with mechanistic thinking which abounds today and believes that all of nature can be forced into a mechanistic pattern. The pattern is taken from machines and placed on the poor unknowing flower. Sense realities have to do with LIFE, which contains feelings and thinking. This is contained in the animal kingdom (feelings) and of course the human kingdom (feelings and thinking).

The next time your car says "ouch" to a kick, you've got a scoop.

carlo

madnak
06-04-2006, 11:27 PM
Well, he's more interesting than most functionalists. But that's not saying much.

aeest400
06-05-2006, 01:26 AM
I don't necessarily agree with Chalmers (I've studied the issues some -- too much to have strong opinions, but also too little to have strong opinions), but he maintains a pretty extensive bibliography. I don't agree with Dennett either, but he's a pretty smart guy, very smart actually. As far as sentience, most of the prior action was in discussing the "Chinese room" of Searle; a lot of the current stuff is zombies. Before that it was Turing tests.

madnak
06-05-2006, 02:15 AM
And all of which miss the point, if you ask me.

luckyme
06-05-2006, 02:49 PM
[ QUOTE ]
As far as sentience, most of the prior action was in discussing the "Chinese room" of Searle; a lot of the current stuff is zombies. Before that it was Turing tests.

[/ QUOTE ]

Regardless of the unexplained situation under question, people seem to have always separated into two general approaches: those who call on 'special forces', whether religious or 'élan vital', and decide it's all beyond our comprehension, and those who expect there is a natural process at work that will likely be understood as it succumbs to scrutiny -- but in any case, we don't have to assume anything far out.

Chalmers falls into the 'special forces' camp (but not very far in), and Dennett expects a natural explanation will become clearer. Dennett has quite the track record in making predictions about cognitive processes that prove out, and he stimulates new ways of looking at old problems.

As far as brains and sentience, it's a matter of whether you think of the brain as a 'wonder-meat' or simply meat that supports some pretty neat stuff. Neuroscience has shown us the modularity of the brain's functions; even those that seem unified can be found fragmented. As we get farther along in understanding the fragments, we'll be closer to duplicating the unified version. Dennett's work on zombies and zimboes provides very good insights in this area.

luckyme
06-05-2006, 02:59 PM
[ QUOTE ]
This is the difficulty with mechanistic thinking which abounds today and believes that all of nature can be forced into a mechanistic pattern

[/ QUOTE ]

The difficulty with 'and then magic happens' thinking is that it's running out of places to hide. No flowers are known to break any laws of physics, and accurate predictions can be made about action based on those constraints. 'Magic Thinking' is just a mega-shrug, attempting to defend a special ego-saving place from the prying eyes of understanding.

madnak
06-05-2006, 03:10 PM
This is a false dichotomy. Dennett's "work" rests on similarly false dichotomies and unjustified assumptions. You can pretty easily "prove" anything by this standard. I'll prove I'm not human.

There are two kinds of humans, those with red hair and those with green eyes. I have no red hair. I have no green eyes. I am not human.

At minimum, your two approaches are neither mutually exclusive nor collectively exhaustive, therefore they can't represent a true dichotomy.

aeest400
06-05-2006, 03:52 PM
I like this comment. I'm a committed materialist with a very strong background in cognitive science/phil of mind but don't see how to explain qualia or come up with any demarcation between conscious and nonconscious (though my intuitions are a bit stronger than Dennett's that consciousness exists--I once wrote a paper on his views re visual consciousness). However, one thing I believe is that chimps and, say, dogs are semi-conscious (pretty sure ants aren't), which leads me to believe that humans are probably just another bump up the chain, with many more possible bumps.

I do think that consciousness is separate from intelligence (but the application of intelligence requires attention, which is closely related to consciousness), and I can do a better job imagining how meat does intelligence than how it does consciousness. Incidentally, the best book on how meat does intelligence that I've come across is http://www.amazon.com/gp/product/0805074562/104-2241292-0250333?v=glance&n=283155 .

Madnak, your comments on Dennett seem a little strawmanish. Consciousness Explained is about 500 pages long -- he lays out his theory and addresses potential criticisms at length. He's probably one of the smartest and best-informed folks (re computation, neuroscience, other philosophers, etc.) writing today. There is an issue of Behavioral and Brain Sciences where he lays out his views and 20-30 folks respond. Penrose can simply be dismissed, but Dennett is too smart for that.

This thread is on an interesting topic, but as it's been discussed by pros at length and the literature is quite sophisticated, I'm not sure that the 2+2 community is going to resolve it (or add to it), myself included.

However, any form of dualism is a non-starter. Can't use the mysterious to explain the obscure.

aeest400
06-05-2006, 04:00 PM
Actually, though, I strongly disagree with the comments re "modularity" to the extent they evoke Fodor/Chomsky/Pinker. Computers do math; people do similarities and categories. Women, Fire, and Dangerous Things by Lakoff is still the best book on cognition I've read, and one can see at least how some of the ideas would work in analogical systems like brains. We can do digital thinking, but I think that's because we can follow rules, not because thought is rule-based.

madnak
06-05-2006, 05:03 PM
I haven't read his book, only what I've recently found online. I'm not "dismissing" him but I think he works from a number of assumptions that are basically unsupportable. And dualism is hardly a "non-starter," in fact it's unassailable. It's no more possible to disprove dualism than it is to disprove God. The problem with dualism is the same as the problem with functionalism - it rests on assumptions that are basically unsupportable.

luckyme
06-05-2006, 06:11 PM
[ QUOTE ]
I'm not sure that the 2+2 community is going to resolve it (or add to it), myself included.

[/ QUOTE ]

Not bloody likely, but lots of useful things can come from these exchanges. Your recommendation of Hawkins' book is a good example. For laymen interested in a topic, hearing the areas that befuddle others working through it at a similar level can sometimes stimulate a fresh view that a Dennett or a Hawkins may not have clarified enough for you. Or at least I may feel I'm not the only dullard in the world.

luckyme
06-05-2006, 06:16 PM
[ QUOTE ]
It's no more possible to disprove dualism than it is to disprove God.

[/ QUOTE ]

Uncle. ( ...or that the tooth fairy does it all).

carlo
06-06-2006, 12:35 PM
[ QUOTE ]
The difficulty with 'and then magic happens' thinking is that it's running out of places to hide. No flowers are known to break any laws of physics, and accurate predictions can be made about action based on those constraints. 'Magic Thinking' is just a mega-shrug, attempting to defend a special ego-saving place from the prying eyes of understanding.

[/ QUOTE ]

Of course no flowers are known to break the laws of physics, but the reality is that no "law of physics" as known in this day could possibly explain the flower, and none does.

I'm really not sure of the drift of your thought of "magic thinking" and who is using it. Goethe's scientific works, in particular the concept of "Metamorphosis" and his study of Light, can offer some insight into "Reality Bound Thinking".

carlo

JMAnon
06-06-2006, 03:13 PM
[ QUOTE ]
The next time your car says "ouch" to a kick, you've got a scoop.
carlo

[/ QUOTE ]

How about the next time my computer beeps after I tickle the keys the wrong way?

luckyme
06-06-2006, 04:12 PM
[ QUOTE ]
I'm really not sure of the drift of your thought of "magic thinking" and who is using it. Goethe's scientific works, in particular the concept of "Metamorphosis" and his study of Light, can offer some insight into "Reality Bound Thinking".

[/ QUOTE ]

In today's jargon, Goethe would be an anti-reductionist. The problem with anti-reductionists is that they believe there is such a beast as a reductionist.

The role of intuition and visualization operates on several levels in scientific analysis. Einstein's thought-experiments or Dennett's intuition-pumps are common tools. Goethe's holistic methods of research and study give us insights that are difficult to achieve at smaller scales. That does not mean that there are any special forces at work at the holistic level... no magic, even though if we ignore the finer-grained underpinnings it would appear there would need to be.

Neuroscience is starting to get a grasp on how what we term 'intuition' works. That doesn't take away the insights it provides us; it just reminds us that we're not mythic beings and at some level we are a pack of quarks. That is misread by magical thinkers as 'just a pack of quarks'. Oh, well.

madnak
06-06-2006, 04:47 PM
[ QUOTE ]
at some level we are a pack of quarks

[/ QUOTE ]

Last I heard, quarks can't be computed because their states can't have finite representations.

Copernicus
06-06-2006, 07:14 PM
[ QUOTE ]
[ QUOTE ]
I'm not sure that the 2+2 community is going to resolve it (or add to it), myself included.

[/ QUOTE ]

Not bloody likely, but lots of useful things can come from these exchanges. Your recommendation of Hawkins' book is a good example. For laymen interested in a topic, hearing the areas that befuddle others working through it at a similar level can sometimes stimulate a fresh view that a Dennett or a Hawkins may not have clarified enough for you. Or at least I may feel I'm not the only dullard in the world.

[/ QUOTE ]

The intro to Hawkins' book indicates that he adopts the same basic philosophy I had tried to express earlier... distinguishing between the rote production of output from inputs and "Dewey decimal memory" (each piece of knowledge is indexed and retrievable) vs. the manner in which the brain "remembers sequences of events and their nested relationships and making predictions based on those memories"... where the nested relationships in my posts were embodied in the multiple connections each neuron forms with other neurons that share an "interest" in that memory.

carlo
06-06-2006, 07:38 PM
[ QUOTE ]
it just reminds us that we're not mythic beings and at some level we are a pack of quarks.

[/ QUOTE ]

Materialism in the abstract is what this is all about, i.e. only a material entity can explain the world, which turns on itself and explains itself, which is a profound contradictory falsehood. This type of scientific thinking (Einstein, etc.) has left the world in a cerebral reverie while at the same time demanding a material primal force. Thoughts that leave the world for their own little nook and cranny of the universe without really seeing the material as it really is. It is a great conundrum, and the phenomenological approach of Goethe is a beginning of a proper understanding of materiality and MAN himself.

Try "The Philosophy of Freedom" by Rudolph Steiner. Many of the questions addresed in this post and others are addressed such as "dualism","thing in itself " of Kant and the paradox of man as a free being vs the iron necessity of natural law.

And yes, it is related to scientific thought, in that it is a study of "thinking", by the grace of which human beings live.

carlo

luckyme
06-06-2006, 07:50 PM
[ QUOTE ]
distinguishing between the rote production of output from inputs and "Dewey decimal memory" (each piece of knowledge is indexed and retrievable) vs. the manner in which the brain "remembers sequences of events and their nested relationships and making predictions based on those memories"

[/ QUOTE ]

(An aside: is there somebody alive who believes the first part of the quote above?)
Darn, now I don't know whether to read it or not... I suppose I will, because a book that seems well received can't possibly be based on the simplistic approach in the latter comment.

I was disappointed with this review comment: "Hawkins explains why the way we build computers today won't take us down that path." No kidding. Although somebody even having that thought explains a bit of why, when you say "AI", they give you a funny look.

"Remembers sequences of events.." isn't all the evidence showing the brain (re)constructs sequences of events, frequently incorrectly. Perhaps it's just a shorthand way of saying the same thing, I hope.

luckyme
06-06-2006, 08:09 PM
[ QUOTE ]
, i.e. only a material entity can explain the world, which turns on itself...

[/ QUOTE ]

What happened to solipsism? Or even the 'consciousness is all there is (not necessarily just mine)' viewpoints, to name a few.

Each person can come up with their own concepts of what's really out there, but to have a useful dialogue we require something shared. Since current scientific methods at least give us repeatability and testability, it's a decent community way to go. Goethe had a typically artistic way of approaching the world. As I noted, it has its uses, but it can only take an exchange so far, and then you run into a subjective mess (or we'd be landing Goethized craft on the moon).

aeest400
06-06-2006, 09:49 PM
[ QUOTE ]
[ QUOTE ]
distinguishing between the rote production of output from inputs and "Dewey decimal memory" (each piece of knowledge is indexed and retrievable) vs. the manner in which the brain "remembers sequences of events and their nested relationships and making predictions based on those memories"

[/ QUOTE ]

(An aside: is there somebody alive who believes the first part of the quote above?)
Darn, now I don't know whether to read it or not... I suppose I will, because a book that seems well received can't possibly be based on the simplistic approach in the latter comment.

I was disappointed with this review comment: "Hawkins explains why the way we build computers today won't take us down that path." No kidding. Although somebody even having that thought explains a bit of why, when you say "AI", they give you a funny look.

"Remembers sequences of events.." isn't all the evidence showing the brain (re)constructs sequences of events, frequently incorrectly. Perhaps it's just a shorthand way of saying the same thing, I hope.

[/ QUOTE ]

Yes, the "digital/computation" view of conigiton has been the dominant paradigm over the last 50 years. Only in the early 80s did researchers begin to systematically question it. It's still known as GOFAI (good old-fashioned artificial intelligence) and is still very influential, particulalrly among CS/engineering types. It's no longer a widely held view in the cognitive sciences. What Hawkins does is make a nice step toward explaining how the brain instantiates thought--an issue on which less progress has been made than one would hope.

Metric
06-08-2006, 02:37 AM
[ QUOTE ]
I think reality is more than information.

[/ QUOTE ]

That's fine, and the default view. Coincidentally, for the last few days (in which I have not been posting here) I have been attending a physics conference where the opposite point of view (that the concept of information is in some sense more fundamental than the concept of physical law) was the topic of some very interesting discussion!

Metric
06-08-2006, 02:49 AM
[ QUOTE ]
I don't believe you (Metric) ever addressed my comments regarding reverse engineering the thought process. If it is a UTM, you will be able to trace every step of the process from input to output. If there is some randomizing element, then you might need to store the additional information of either the seed or the random number itself.

[/ QUOTE ]
In general, all physical systems (of which the brain is one example) should have a traceable output from a given input. It just so happens that this becomes extremely difficult from a practical point of view when the system is sufficiently complex (like the brain). Similarly, an n-tape Turing machine may be in some sense more difficult to "track" than a single-tape Turing machine if you're not familiar with the programming, but that doesn't mean it performs computations that are fundamentally different from those of a single-tape machine.

bearly
06-14-2006, 05:01 PM
"i believe" was a real schmaltzy song. do you have any reasons to support your position?....................b