Unethical programming - the truth is out there.



sean
11-08-2001, 04:44 PM
Hey,
It's obvious that biotechnology and all that is controversial at times, but do you think there will ever be a time when we will be the ones being questioned? I mean with AI or something. Tell me what you think.

Sean Mackrory
sean_mackrory@hotmail.com

Betazep
11-08-2001, 05:09 PM
You're not the boss of me now and you're not so big.

greenRoom
11-08-2001, 05:19 PM
I hear in ~20 years, with the way our advancements are going in AI and Robotics, we should be able to create an artificial soccer team that could whip any world cup team's rear. With that in mind, yes, I believe some people will be ........ed (heck there are already champion chess players that are ........ed). Seriously though, I'm sure deeper ethical and moral issues will arise when true AI comes about (and it will, eventually). However, imo, I think bio-technology and genetic engineering will be a bigger issue (for the next half century or so anyway; but of course what the hell do I know?).

DavidP
11-08-2001, 05:51 PM
Would we humans ever become so stupid as to make a machine as smart and logical as us?

dirkduck
11-08-2001, 06:20 PM
probably ;), and Sean, I'm from Colorado too, where in the state are ya?

Yoshi
11-08-2001, 06:46 PM
Robots are good, but only to a certain extent.

Natase
11-08-2001, 07:06 PM
I don't think there will ever be 'true' AI... I know that in the future anything is possible but I think what most people class as AI is the ability to learn and apply knowledge...

I don't reckon that a computer will ever be able to decide for itself what to learn, and a computer will never ask itself why.

e.g. A human hears a piece of music and (as computers will be able to) decides who probably wrote it, what type of music it is, what response it was designed to provoke, etc. I don't think that anyone will be able to make a computer ask itself why that particular piece of music was written, what mindset the composer was in, and whether the musician could also paint!?

If it did become possible to cause a computer to think laterally, it would probably become boundless... imagine a computer designed by NASA to design and build more human-friendly space stations spending its entire existence pondering whether the inventor of Post-its knew an effective means of catching trout!

Cruxus
11-08-2001, 07:23 PM
Maybe, the "safe" answer:

It could pass as yes;
it could pass as no;
we don't know.

sean
11-08-2001, 07:32 PM
^that's where in Co I live by the way! I enjoyed your responses. They were really good!

oskilian
11-08-2001, 07:43 PM
I plan on making some kind of robot with all the characteristics a human has, then upload my brain to it and I'll become immortal!!!

Oskilian

sean
11-08-2001, 07:46 PM
Settle down dude!

Yoshi
11-08-2001, 07:57 PM
You will be the robot if you transplant your brain

Garfield
11-08-2001, 08:16 PM
I don't think robots can attain the genuine intelligence of a human. Many emotions (not available to robots) drive our intelligence and we don't even know it. I think robots are capable of a lot, but only to a certain extent.

--Garfield the Programmer

Yoshi
11-08-2001, 08:19 PM
Robots can't express emotions as well as we do.

The V.
11-08-2001, 08:51 PM
To believe that human minds could be duplicated in robots, you must believe that the human mind is completely deterministic -- that is, there is no free will.

A machine can make decisions, but it MUST do so only in a mathematical fashion. A transistor passes current or blocks it completely under the laws of physics. With the appropriate voltages, current flows or does not flow. There is no "will" of a machine. You might simulate it to the point where nobody can tell the difference, but any machine can be completely determined by its initial state and all past and present inputs. That is, given the initial state of the machine (factory settings) and given a complete history of the machine's inputs, you could duplicate every output, 100% of the time, barring malfunction. You could write an equation (granted, it would be enormously complex) to tell you what it "thinks" at any time, where all of its past inputs and its initial state are inputs into the equation.
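(As a purely illustrative sketch of that point, with a made-up transition rule and made-up values rather than any real machine: every output of a deterministic machine is fixed by its initial state plus its full input history, so replaying the same inputs from the same "factory settings" reproduces every output.)

```python
# Hypothetical toy example: a deterministic machine whose outputs depend only on
# its initial state and the complete history of its inputs.
def run_machine(initial_state, inputs):
    state = initial_state
    outputs = []
    for x in inputs:
        state = (state * 31 + x) % 1000   # arbitrary, but fully deterministic, transition
        outputs.append(state % 7)         # arbitrary, but fully deterministic, output rule
    return outputs

history = [3, 1, 4, 1, 5, 9, 2, 6]
assert run_machine(42, history) == run_machine(42, history)  # same state + same inputs => same outputs, every time
```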

Of course, you could argue that humans have no free will. If what we call "thought" is solely caused by neurons in the brain firing, then we have no free will -- the true decisions are simply made by chemicals exciting membranes and the electrical summation of these impulses either triggering or failing to trigger a neuron's firing. Then, you could write the same equations about the human, and they, too, would be horribly complicated, but an equation WOULD EXIST to predict perfectly the human's mind at any point in time.

If we have no free will, but are ourselves deterministic machines, determined by the initial state of our brain when we developed as a fetus, and determined by every single input that ever entered our brain, then machines could duplicate us in every way.

It is impossible to make a machine that truly has free will, because all machines are slaves to the laws of physics. AI can never create a machine with a will -- but it could end up proving that humans lack a true will, and that we never truly choose anything (in other words, that every 'choice' we make is completely determined, and all our thoughts and acts could be completely predicted, with enough information about the neurons of the brain).

Yoshi
11-08-2001, 09:33 PM
And robots get short-circuited sometimes in their 'life'.

The V.
11-08-2001, 11:13 PM
Well, random failures are just that -- random; however for any purposes of "consciousness", a machine cannot choose to short circuit any of its pathways, nor could this short circuiting be used to create some kind of "will" of the machine.

Add this to the fact that an AI could be put onto one single processor, which is difficult to cause faults in except by physical damage to the chip.

novacain
11-09-2001, 12:52 AM
In Australia we have just banned cloning of race horses etc. Imagine if all the horses in a race were the same!

Here a 'hacker' can get ten years for looking at your data let alone destroying it with a virus.

If someone on these boards asked you to help them write what was obviously a virus/trojan would you?

Garfield
11-09-2001, 05:35 AM
> that is, there is no free will.
Ah, my friend. You are quite mistaken. God gave us all free wills to do as we please...whether it is bad or good. We are not "robots".

Natase
11-09-2001, 07:50 AM
Originally posted by Garfield
> that is, there is no free will.
Ah, my friend. You are quite mistaken. God gave us all free wills to do as we please...whether it is bad or good. We are not "robots".

Exactly his meaning I believe...

Koshare
11-09-2001, 12:25 PM
I agree with garfield...


"Of course, you could argue that humans have no free will. If what we call "thought" is solely caused by neurons in the brain firing, then we have no free will -- the true decisions are simply made by chemicals exciting membranes and the electrical summation of these impulses either triggering or failing to trigger a neuron's firing. Then, you could write the same equations about the human, and it, too, would be horribly complicated, but an equationd WOULD EXIST to predict perfectly the human's mind at any point in time. "


this is wrong, the human brain doesn't work in a linear fashion, many parts of it are randomized, and messages are sent with chemicals too. These chemicals can be intercepted by other chemicals, and those chemicals can be intercepted by other chemicals, and so on.


there are 3 "different" parts to the human brain, all working together, yet separated, and these are further broken down into other parts. Everything is absorbing and sending information simultaneously. How could you predict what a human will think in a certain situation, when EVERYTHING in the brain will do different things?

Natase
11-09-2001, 02:54 PM
Unless I'm very much mistaken, you two are both arguing that The V's argument is wrong... the part you've both picked was used as an example of why AI will never be fully achieved... you're all on the same side on this one.

The V.
11-09-2001, 03:45 PM
I know that the brain isn't linear, but it IS deterministic. It may be incredibly, incredibly complicated to determine, but the laws of physics control the entire situation. Chemicals move and interact subject to physics, and there is no TRULY random motion; these chemicals are too large to be subject to quantum mechanical phenomena like tunneling -- they behave as Newtonian solids, and the movement of a single molecule is completely determined by the laws of momentum, electromagnetism, etc., etc.

And thanks, Natase -- you do in fact understand my argument. I was saying that no completely determined system can show free will -- I do not believe human minds ARE determined systems. I guess I'm a dualist at heart -- I believe that there is a spirit/soul which is NOT merely a product of the brain's activity -- this is outside of the physical body entirely. I believe we do have free will, and because we have free will, I argue that we are not completely physical machines, because every physical object behaves in a completely determined fashion.

Of course, the determination of any physical object is very, very complicated. But, if you were some Godlike being, who knew simultaneously every single variable about every single object (basically, you know the momentum, position, etc. of each particle, and you know all of the laws of physics) you could predict, with only that knowledge, the entire future state of the universe until the end of time.

It would be like predicting the weather in the distant future knowing only today's conditions. The weather is completely deterministic (a cloud has no will; it does not choose how to move, when to rain, etc.). But the number of factors which influence the weather is so high as to make them immeasurable. And the weather is chaotic -- a small change in one factor could cause large changes in the prediction. So, you could never PRACTICALLY predict the weather, say, 90 days in the future, because the number of variables is incredibly high, and you'd need to measure them with perfect precision. Again, if you were a godlike being who knew every variable to infinite precision, you could predict the weather perfectly.
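(A small, purely illustrative sketch of that sensitivity, using the logistic map rather than any real weather model: the rule is completely deterministic, yet two starting values that differ only in the ninth decimal place end up nowhere near each other after a few dozen steps.)

```python
# Toy illustration of deterministic chaos: tiny measurement error, big divergence.
def logistic(x, steps, r=3.9):
    for _ in range(steps):
        x = r * x * (1.0 - x)   # the same rule applied every step -- no randomness anywhere
    return x

print(logistic(0.200000000, 60))
print(logistic(0.200000001, 60))  # a difference in the ninth decimal place of the starting value
```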

AI could end up being like the weather -- very hard to predict, but still completely determined (no free will). Both operate under the laws of physics. Neither has a will, neither can truly make a choice.

I do not feel that humans are like this -- I think that we HAVE a will, and a spirit. Assuming humans have free will, no machine could ever equal us. That has always been my argument.

Procyon
11-09-2001, 06:09 PM
Originally posted by The V.
Of course, the determination of any physical object is very, very complicated. But, if you were some Godlike being, who knew simultaneously every single variable about every single object (basically, you know the momentum, position, etc. of each particle, and you know all of the laws of physics) you could predict, with only that knowledge, the entire future state of the universe until the end of time.

The central principle of quantum mechanics is that you cannot possibly know simultaneously the momentum and position of a particle. Measuring one prevents you from being able to determine the other. This is not an experimental limitation, this is a basic fact about the universe. In addition, there's the obvious fact that storing all the information about our universe would require another universe larger than our own to hold the data in. Even a man-made robot AI cannot be completely understood.

It's hard to determine what implications this would have for the idea of an omniscient deity, of course.

I do not feel that humans are like this -- I think that we HAVE a will, and a spirit. Assuming humans have free will, no machine could ever equal us. That has always been my argument.

What does it take to create a spirit? Fertilization? Female or male meiosis? Birth? Would a cloned human have free will, or would it just be a 'machine'? What about a molecular copy of a human blastula implanted in a surrogate? Do other animals have free will? What if you genetically engineered a creature with 99.5% human DNA but the other .5% artificially engineered, or replaced with animal DNA? (0.2%? 0.05%?) Would various levels of cyborgs have free will?

The answers that you get to these questions, using the idea of a spirit or free will, are frankly quite ridiculous - and I think that there's little doubt that within 200 years we'll have answered most if not all of them. The concept of free will or an individual spirit can to a very large extent be tested scientifically, and especially given religion's scientific prediction track record I think this means that the idea is on its way out - though I'd be happy to discuss the issue.

The V.
11-09-2001, 07:23 PM
Actually, I'm quite involved in AI and research into neural networks -- you can quite EASILY understand it; it's usually just matrix multiplication.
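(To make the "just matrix multiplication" remark concrete, here's a minimal sketch of a forward pass through a tiny two-layer network; the sizes and random weights are made up purely for illustration, not taken from any particular piece of research.)

```python
# Minimal feed-forward pass: each layer is a matrix multiply plus a nonlinearity.
import numpy as np

def forward(x, W1, b1, W2, b2):
    h = np.tanh(W1 @ x + b1)   # hidden layer
    return W2 @ h + b2         # output layer

rng = np.random.default_rng(0)
x = rng.normal(size=4)                        # an input vector
W1, b1 = rng.normal(size=(8, 4)), np.zeros(8)
W2, b2 = rng.normal(size=(2, 8)), np.zeros(2)
print(forward(x, W1, b1, W2, b2))             # output is fully determined by x and the weights
```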

The uncertainty principle holds that trying to measure a quantity will itself change it. But every particle HAS a single, definite position, and a single, definite momentum. We can never KNOW both to perfect precision, but both always exist and have a single definite value (at a given point in time). Again, this argument doesn't in any way invalidate the idea that physics is deterministic -- it just means that it's impossible to determine the future state of the universe (because we can never know the present state infinitely precisely). It does not mean that the present state does not exist, or that the future states are not determined by the present state.

If you argue that there is no spirit, and our consciousness is a construct of the brain, how is it that we are self-aware? Also, what, then, is this perception of "thought"? Certainly our brains process information, but a "choice", at a neuron level, is a summation of post-synaptic potentials bringing or failing to bring a neuron to threshold. How is it that we perceive that we make a choice? We certainly don't have an awareness of the action potentials of the brain.

Further, each neuron in the brain, although connected to other neurons, is not directly capable of perceiving the state of other neurons in the brain. Is there one, single "consciousness" neuron which generates "thought"?

Certain behaviors, such as instincts, can be explained by brain activity. Our brain does many things that we are completely unaware of. Baroreceptors, right now, are sending messages to your brain telling it what your mean arterial blood pressure is, and your brain is adjusting your heart rate and blood vessel diameter to smooth out variations. This "automatic" activity can be easily understood by looking at brain structure.

But, how can a collection of neurons generate "consciousness"? Essentially a neuron is a nonlinear, asynchronous adding machine. No single neuron could be solely responsible for consciousness -- so how can their collection be conscious? Each neuron only has 2 states (active or inactive) and it cannot differentiate between its inputs (different inputs will have different "weights", but their sum is the determining factor of the output.)
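(Here is a rough sketch of that adding-machine picture, with arbitrary weights and an arbitrary threshold chosen only for illustration: the weighted inputs are summed, and the unit is either active or inactive depending on whether the sum reaches the threshold.)

```python
# Threshold-unit caricature of a neuron: weighted sum in, one of two states out.
def neuron(inputs, weights, threshold):
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= threshold else 0   # only two possible output states

print(neuron(inputs=[1, 0, 1], weights=[0.6, -0.4, 0.5], threshold=1.0))  # -> 1 (active)
```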

It is not possible that a solely mechanical system of neurons could generate consciousness, or even the illusion thereof. It's certainly possible, at least in theory, to make an AI that 'seems' to be intelligent, but when it's essentially making choices by matrix multiplication, how can this generate "consciousness"?

Bottom line, I don't think there *is* a scientific explanation for consciousness. When you break the brain down into its component neurons, the system doesn't have the properties that we'd associate with consciousness.

oskilian
11-09-2001, 08:27 PM
>Robots can't express emotions as well as we do.

Wait some time; someone thought a long time ago that a machine could never do a human's work, let alone do it faster.

I'm not planning on transplanting my brain, because the brain is made out of cells (cells are living things and they die), and I'm sure it has a limited capacity (I'm planning to have a VERY big hard drive). I'm planning on uploading my brain's information to the robot's "brain"; that way I could be immortal. That's what I want. If there's another way of doing this instead of what I'm planning, ideas are welcome; I'm not very comfortable with the idea of being a "robot" either.

Oskilian

Procyon
11-09-2001, 08:59 PM
I'm cautious about trying to contradict you on this because you sound very knowledgeable on quantum physics (and I haven't read about it to a significant extent for over a year), but I believe that you are incorrect. According to quantum mechanics each individual particle exists only as a superposition of states - a probabilistic smear of likelihoods - until observed. For example, the double-slit experiment even with individually released photons will generate an interference pattern (as long as there is no way to determine which slit the photon passes through). There is no passage through either slit, and there is no travel path: the different possible paths interact with each other and the photon attains a specific position only at the time a measurement is made when it hits the photodetector (or perhaps more accurately, when the photodetector is seen by an observer). Of course, this creates a whole new mess about what constitutes an 'observer', which also ties into consciousness.



As for the rest of the message, I think you're making a serious error - you're abusing reductionism. When you break down the system into its component neurons you've lost sight of the fact that they're connected to each other and interacting. Of course an individual neuron does not exhibit the properties of consciousness. It doesn't exhibit the obviously unconscious properties of the brain either. Consciousness, like brain function or any other property of an organized system, is an emergent property of some sort that depends not only on the states of its individual elements but also on their organization - in this case, the fact that neurons are NOT isolated entities, but that they can communicate with each other.

You ask a lot of questions that aren't answerable. I don't think we "make choices" in the way you're suggesting. When there are different possibilities, we end up doing one or the other - but there's no specific decision-making instant. Even if we think we've decided something, we don't actually know we're truly going to do it until it's done.

It's also rather hard to explain how a system of neurons (or molecules, or atoms) generates consciousness if we don't have a clear definition of what consciousness is. How do you determine whether something or someone is conscious? We can't, by any way we can think of. How do you prove that you are conscious? There's not much way to do that either. I don't know if there's much meaning in asking us to explain something we can't even define properly.

There's no doubt something pretty complicated going on in our minds. Why does it need some sort of intangible spirit entity to explain it - what makes this any more unique than a process rooted in some sort of physical law (even if it's one we don't understand yet)? What property would this entity have that the physical world cannot provide?

In any event, I don't feel this kind of purely philosophical discussion is very productive. I'd prefer to focus on effects we can tangibly see in the real world. If there's some sort of magic entity behind the human mind, we ought to be able to detect its presence or absence, else we're just playing philosophical games with ourselves. From the very limited amount of current information we have, there's not much we can say. But I think there are things we're going to be able to do very soon which will start to be able to put some sort of limits on this thing. This is why I am curious as to what your responses would be to my thought experiments above, which I think should generally be able to clearly detect whether or not a soul exists, at least in a philosophically reasonable form.

The V.
11-09-2001, 10:43 PM
Well, quantum physics is a strange area, especially because it seems to directly contradict many other areas of physics.

We do know that quantum physical effects for objects larger than an electron aren't significant, and a Newtonian model fits better. Of course, Newtonian, quantum, or relativistic models are only those -- models. They are equations which accurately predict how an object behaves, but they don't necessarily tell us WHY. And each of those 3 models is useful for predicting object behavior in certain situations, yet fails to accurately predict behavior in others. As yet, there is no unified model which encompasses all the situations.

As to a soul being scientifically definable, I disagree that it is even possible. The reason is that science is concerned with matter and energy -- both of which are subject to deterministic systems (or purely probabilistic systems, in quantum models). In neither situation is there any kind of room for a will of any kind.

So any entity of free will, i.e. a spirit/soul, would by definition NOT be bound by the laws of physics -- thus it is unlikely that it exists in that portion of the universe which we are capable of perceiving, either directly or through instrumentation. It is entirely conceivable that there are portions of the universe which are beyond any possible detection simply because they do not strongly couple with the regions of the universe that we perceive with our bodies or instruments.

My argument with consciousness being a product of neuronal activity is multifaceted.

1) We have no conscious awareness of a vast majority of the operations that our brain carries out continuously.

2) There has yet to be any explanation of where the artifact called "consciousness" arises from. We know a great deal about how a neuron works -- and it is difficult if not impossible to see how a sense of consciousness can arise from a collection of neurons.

3) If consciousness IS an artifact of neuronal activity, how does it work? Clearly, it is not possible for any single neuron to be the only "conscious" neuron -- there are obviously more than 2 mental states anyone can be in. If many neurons are involved, how does the information that they hold integrate into a consciousness? Further, given how neurons function, how do we have a "stream of consciousness" by which we have a linear sequence of thoughts? It seems impossible to build such a network from neurons.

4) If neurons are the only means by which choices are made, and choices are made by neurons on a completely chemical level, then we are deterministic machines with no free will. How, then, do we constantly PERCEIVE that we have free will? How do we even perceive ANYTHING if all we are is a network of asynchronous nonlinear adding machines?

Now, I'm an engineer, so I'm by no means anti-science. But I recognize that there will always be things beyond the capacity of science to explain. Hell, I've experienced things in my life, on more than one occasion, that science declares impossible. Science can't explain everything.

And, in the realm of souls, science can't explain ANYTHING. Science cannot answer questions in which it is impossible to gather data.

And, your questions can't be answered, by science anyway. I have beliefs, and many others do, too, but it's simply not possible to verify who, if anyone, is correct.

Philosophy is about the ONLY way to answer this, and even then, it doesn't get far before it strays from the path of logic into supposition.

I am aware that I exist. I perceive that I have a free will, because every experience of my life has reinforced the fact that I can make choices. Because no object that existed solely in the domain of matter or energy could have a free will or make a choice, I conclude that at least part of what I consider "me" lies outside of these domains. This part I call "spirit".

Beyond that, it's impossible to KNOW anything. It's not even possible to prove that there is a physical universe. We assume that the physical universe exists because we perceive its existence, but we cannot prove its existence. In the same way, I assume that I have a spirit, because I perceive its existence, but again, it cannot be proven. I don't think it's any more of a leap to use this line of reasoning for the spirit than it is for the physical universe.

We have to make MANY inferences like this to even *have* science. Science is logical, but it is based on inherent assumptions about the universe which can never be proven.

Of course, all discussion is rather a moot point. Whether or not the soul or universe exists is fixed -- we may not know the answers to these questions, but there IS a correct answer. Further, we cannot help but believe that the universe and free will exists; it is impossible to not make choices, even if we say we don't believe in free will.

So, being as we have to live AS IF there was a universe and free will, I'll choose to believe that my perceptions are accurate, and that will and universe both exist.

Procyon
11-09-2001, 11:12 PM
I have to go in a few minutes, so I'll have to make this a quick response:

I claim that the idea of a soul is amenable to scientific test to some degree. Specifically, if a "soul" is actually composed of matter and energy, we should be able to create one somehow, perhaps by cloning or direct molecular assembly. If not, such an attempt should be unsuccessful at creating a self-aware being. The human/animal/computer hybrids should show a dividing line of some sort that we can measure at a certain point. If we were to do this and not see such a dividing line, I think it would consign the concept of a nonphysical soul to the absurd-though-undisprovable idea bin. Of course it can't be truly disproven, but neither can anything else.

I'll give a more general response tomorrow morning...

The V.
11-10-2001, 02:39 AM
Well, firstly "spirit" could be completely matter and energy -- this would not fulfill the "free will" criterion, which was one of the reasons for not going with the purely matter/energy brain as the source of consciousness. If the spirit was simply energy and matter, it too would be a completely determined system, devoid of free will.

If spirit exists, it exists at least partially outside the realm of energy and matter -- thus outside of the ability of our devices to detect.

Also, you would have a hard time with animals, because there's nothing which says animals don't have spirits, too (and some evidence which suggests that some, at least, exhibit self-awareness and other "intelligent" traits). With cybernetics, you'd probably have to hybridize the brain itself (make a partly organic, partly mechanical brain) which is FAR beyond our current capabilities.

But unless you can measure something, or at the very least test presence or absence of a thing, science won't get you ANYWHERE.

-KEN-
11-10-2001, 07:36 AM
Oh Lord, I can feel the religious argument about ready to explode itself onto the scene...

Can't we all go a month without a religious argument? It would make me sooo happy :).

Bobish
11-10-2001, 10:52 AM
All this "you have no will of your own" stuff makes me wonder why I bother playing video games when what happens is destined to happen; why don't I just turn off the system and turn on the VCR? Well, it's fun to think you're in control. About AI: there was a computer called HAL that could learn and talk and all that, and it would just be the next step to give it a personality. Of course this brings up the question: if it acts like a person and for all practical purposes it is a person, wouldn't it be unethical to turn it off? Wouldn't that be like killing someone (you're stopping their brain from functioning)? And sure, if you turn it back on, would there be the same consciousness or would it be like a clone? That would be a place where having a soul would come in handy; it would be like a restore disk when the computer crashes, or backup memory. You die, it floats off; you're resurrected, it floats back. Well, anyway, sorry for my interruption, back to quantum physics. This has been an interesting discussion.

Procyon
11-10-2001, 12:30 PM
I don't see any reason to think we do have any sort of a stream of consciousness. For example, imagine that we do have souls, but instead of being bound to a single individual they constantly migrate between them. However, memory has been scientifically demonstrated (to my satisfaction, at least) to be a property only of the brain: both short-term and long-term. So these entities would not be aware that they're constantly shifting around, because they have only the memories of their current host. The stream of consciousness is an illusion created by memory. I think it's easy to carry this further: now assume that only half the time does an individual have a spirit host: this does not affect the perception of any of the hosts. Reduce this figure from half indefinitely toward zero.

The answer to the rest of your questions is either "I don't know, we don't know enough about the brain," or "that question doesn't have any meaning because we're not self-aware in that sense." The fact that we don't understand the brain completely doesn't mean we have to pretend it's controlled by magic. I also maintain that quantum mechanics means that the universe and/or the organisms cannot be completely deterministic, especially if quantum mechanics has some sort of a role in self-awareness. Chaos theory clearly demonstrates that completely insignificant changes can, in some cases, have enormous consequences - and such changes can easily be provided by quantum randomness, it seems to me.


Assuming the universe exists and assuming souls exist are very different things, it seems to me. If we assume the universe exists, we can then make predictions that can be tested, and use the results to gain greater understanding. What do we gain if we assume souls exist? We haven't explained anything, we've just artificially cordoned off an area from our understanding. I think that an aim of an assumption can be just as satisfactory as an explanation of an assumption.

Now, that isn't completely true. I think that the presence of a spirit can be tested to some extent. You misinterpreted my last message somewhat: I was saying that IF the "soul" is something composed of matter and energy, then we should be able to create it and manipulate it. If it is a true soul like the one you speak of, not bound by universal physical laws, we should not be able to do this, and therefore there should be a distinction between entities that are susceptible to such manipulation and those that are not.

Specifically, we've already been discussing AI. Since we'd have to directly create such a thing, if the soul is truly nonphysical, it would not be self-aware. We can't yet create a human; therefore it's special. This create/can't create distinction is the fact of life that sustains this idea of dualism. However, I'm pointing out that it's not going to remain a fact of life forever. Cloning is going to be the first step in erasing the distinction; but it won't be the last. Without doubt it would be hard to create human/animal hybrids (or various animal/animal hybrids if you want to set the bar for self-awareness slightly lower), or cyborgs, or molecularly assembled creatures, but I think there's little doubt that it's possible and within 100, 200, 500 years will have been done. If we do this stuff and there's no evidence of a barrier between self-aware and not self-aware I think that would disprove the concept of a spirit. The concept has lost its usefulness if we have no idea what to assign it to, because the boundaries of species and natural/artificial that restrict it today will have fallen.

The V.
11-10-2001, 05:09 PM
I don't think assuming the existence of a spirit is an unreasonable assumption, and we do it for the same reason we assume the existence of a universe. We do it because we perceive ourselves to have a will. The properties we perceive of this will contradict the properties of matter/energy systems, so we conclude that part of the system lies outside of those domains.

It's not a conclusion made without thought, or without knowledge of facts. But we cannot conceive of any way that consciousness and free will could be built purely from matter and energy; they exhibit traits that contradict the laws of physics that we know. Thus, the most logical assumption is that there is another player in the game -- some entity or structure beyond our current understanding of the universe, which is not a slave to those laws of physics which we know.

Assuming a spirit is NOT irrational, nor illogical, nor is it a worthless assumption -- we infer its existence by observing phenomena which cannot be attributed to anything else we know of.

We *do* know how the fundamental units of the brain work. We know exactly how neurotransmitters cause ligand-gated channels to open, causing post-synaptic potentials which sum, and if they reach threshold, they generate an action potential by opening of voltage-gated channels in the membrane. The brain itself is hardly much of a mystery. Thought is a mystery, but not the electrochemical adding machines of the neurons.

Knowing how each individual unit works, it seems impossible that a brain built from such units could have the properties we see. It is like saying that a building made of pure gold could exhibit a tensile strength greater than that of structural steel. Based on what we know about the properties of the material used to construct the object, we know what properties the whole object could/could not have.

Knowing how the units which make up the brain work, the two issues still remain:

1) It seems impossible for self-awareness to arise from a collection of these adding machines, each of which acts independently of the rest.

2) We cannot help but believe that we have free will. We directly experience free will, and directly perceive ourselves making choices. A collection of neurons has no will.

Because of 1) and 2), we see that the properties of thought which we directly perceive do not mesh with the model of the brain-as-origin-of-thought.

The creation vs. noncreation distinction is not a valid argument. Perhaps it is a property of the growth and development of a living being which infuses them with a spirit. Clones could very easily have such a spirit -- after all, outside of the fact that they're genetically a copy of another person, they grow and develop exactly like any other fertilized egg. A clone isn't really created, it's grown, in the same way any human is grown. To "create" a human, you'd need to assemble one, fully grown, *without* allowing the human to use the natural developmental cycle in any way. And that's simply impossible, or will be for centuries. You'd have to take the component molecules and build every cell from scratch. If you grow a cell in the traditional sense, any properties that are a consequence of the manner of its creation wouldn't change from "normal" cells, because it would be made in the "normal" fashion.

Also, this fails to address the one key issue -- how do you determine whether something ELSE has free will? I know that I have free will because I perceive it. How can I ever prove that YOU have free will? How can you prove that I have free will? It's simply impossible.

Procyon
11-10-2001, 09:43 PM
I'm not sure that I perceive that I have a will. Do I really control what I'm doing? I don't think I really do. When I have to make a decision, it's shaped either by a complex evaluation of external factors that I don't actually perform, or by some sort of spur-of-the-moment arbitrary selection - I certainly don't sense any mystical stuff going on. Another important thing to consider is the enormous impact drugs can have on a person's mental state, including feelings of self-awareness. Why should drugs affect something that does not follow the physical laws of our universe?

And again, the reason why people assume the existence of a spirit is not the same as the reason they assume the existence of the universe. It's more than 'perception' - it's also that the assumption of the existence of reality is productive. Assuming the existence of a soul is NOT productive because it automatically claims we can't understand human behavior. Maybe we can't explain human behavior at every level, but until we have more evidence I feel it is foolhardy to make that claim. The fact that nothing else "we know of" appears to exhibit consciousness is not sufficient.

I also repeat that you are making way too many assumptions in extending the properties of the neuron to the properties of the brain. A neuron is NOT a mini-brain with all the functionality of the full unit. A transistor can't run Windows '95; that hardly suggests that a computer can't either. Your gold/steel analogy doesn't make sense either. Perhaps you would claim, looking only at a small sample of graphite, that pure carbon could never be made into a transparent material several times harder than any other naturally occurring substance? Or that hydrogen and oxygen gas could never be made liquid at standard temperature and pressure?

Most people who argue for the existence of a soul take the easy way out and say that it is generated at fertilization, which is why I referred to cloning. Obviously creating an individual at later stages becomes extremely difficult, but there is no reason to believe that it is impossible - and with sufficient technological advancement we will probably be capable of trying it.

If you don't think it's creation versus noncreation that defines self-awareness, why can't a sufficiently advanced computer be self-aware?

A self-aware being should act differently in some detectable way from a non-self-aware being, or the distinction between self-awareness and non-self-awareness is useless. I won't claim to know what that distinction would be, but if none were found after extremely intense scrutiny I think that would be sufficient to reject the idea of a nonphysical soul.

Scourfish
11-10-2001, 10:14 PM
NO! Down with AI! Don't let the robots enslave the human race!