
varying levels of consciousness


Cavernio
01-9-2009, 07:22 AM
My last thread on consciousness was closed because someone decided that consciousness either is or isn't, apparently. This is not necessarily true. However, even if it were, we still have no way to actually determine whether something is conscious or not, which is something that could also be discussed.

For one, just as individuals we actually experience varying levels of consciousness. The best example is dreaming versus waking consciousness. Drugs also change our consciousness.

Secondly, our consciousness is clearly linked to memory in a couple of ways. Whatever we remember at any given moment is our consciousness for that moment. So obviously whatever we've learned from the past plays a huge role in what our future consciousness may be. If I've only ever experienced life in a box, my consciousness is going to be hugely different from most people's. But, you say, I'm still aware, so I'd still be conscious and that's it. But I would not be aware of the same things as you, and I suggest it might even be possible for someone to not be self-aware in a seriously restricted environment.

Another thing about consciousness related to memory, the most important one in my argument, is that our consciousness is restricted by how much information we can hold in our working memory at once. There are clearly individual differences in how much this is, and yet there's still a standard for how much someone can hold: about seven pieces of information. If someone can think about a huge number of things at once, then they have a higher level of consciousness than someone who can't. If someone could hold all the information they've ever known in their head at once, they'd be approaching god levels of intelligence from their godly consciousness. I do not think fish are at the same level of consciousness as people are, yet I believe that they are self-aware in that they see themselves as something. I think that babies develop consciousness as they grow.
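(If it helps to picture what I mean by 'holding only so much at once', here's a toy sketch I threw together; the capacity of 7 and the items are completely made up for illustration, not a real model of memory.)

from collections import deque

# Toy "working memory" that can hold at most 7 items at once.
working_memory = deque(maxlen=7)

items = ["face", "name", "phone number", "smell of coffee",
         "song lyric", "to-do item", "a worry", "a new idea"]

for item in items:
    working_memory.append(item)   # attending to something new
    print(list(working_memory))   # once full, the oldest item gets pushed out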

So, like I said, maybe you don't agree with this. Maybe I'm just using the term 'consciousness' too loosely. However, I have a hard time imagining how consciousness suddenly evolved out of nowhere, and evolve it must have. Furthermore, I can't imagine how you can say that it either is or isn't when we don't actually know what has it and what doesn't.

~kitty~
01-12-2009, 07:26 PM
Usually questions that have no answers for the moment are the most annoying ones to ask and I usually advise myself not to ask them... but it's hard not to and I don't blame you for asking it.

You want people's thoughts on this matter?

Many things can interfere with whatever answer any one person has, and it will take real research (hard research that may not even be successful) to find one that best fits a problem like this...

For me, consciousness is something I kinda lost halfway through reading that; I was at minimal consciousness, just reading words, but somehow they stuck as visuals in my head, making it clearer on a second read...

Which means there's an unconscious consciousness, which we also have to take into account...

It's instinct, I believe, that is our unconscious consciousness... and it does seem like an oxymoron, but I know what I'm talking about and I kinda don't care if no one else does. The matter of consciousness is a problem that probably is making many people nowadays question our existence... like why are we conscious... what makes us special... etc.

Anyways, that's all I have on this matter for now. I won't have anything else to say until I see another response that may trigger a realization of a flaw in what I am thinking... but until then, I'm ending my rant here.

Reach
01-14-2009, 09:44 PM
Consciousness is a difficult word to define, and in order to make solid arguments for or against anything you need to define what you're talking about.

I like to think about and define consciousness as the summation of various mental processes into a single, coherent perception. That is, sensation, thought, mood, emotion, perception, attention, analysis, etc. combine to form our mental representation of the world around us, and therefore what we perceive as our conscious awareness of what is going on around us.

It's easy to sit there and argue that you're either 'aware' of your existence or you're not, but I think that's a terrible oversimplification. For one, I feel consciousness is related to all perception, not just perception of the self. But even if it were just of the self, to what 'degree' are you really aware of your existence? How we think about ourselves and others, especially in social situations, is a complex field of study (social cognition), and there are a vast number of complexities that you have to deal with when talking about how people view themselves and their existence, and its interrelationship with the world.


Moving on, it isn't hard to see that all of the various mental processes I listed off above can vary from state to state and from organism to organism based on the genes that they are endowed with. I don't see consciousness as something you have or you don't, but something that exists on a continuum.

I think you bring up a good point with working memory. Ultimately, many of the mental processes I listed are limited by mental capacity for complexity (i.e. intelligence). This is directly related to working memory capacity, as the amount of information you can manipulate in your mind simultaneously is a large determinant of what we consider to be intelligence and is highly genetic. This working memory is a restriction on our perception of the world in every way, which would ultimately limit your 'consciousness' under my definition of the word.


In conclusion, I think every organism with a functional nervous system and some basic mental processes such as ones I listed has some level of consciousness. I think our tendency to view it as something black and white is due to ignorance of what consciousness really is. Just because an organism lacks the mental power necessary to perceive the world in a way that allows it to understand itself as an autonomous entity as we do doesn't mean we should jump to the conclusion that every other organism lacks conscious awareness, or that it's a black and white concept.

Cavernio
01-15-2009, 07:42 AM
Yay Reach, someone sees consciousness like I do. So now onto stuff I actually wanted to talk about in my now closed post.

"In conclusion, I think every organism with a functional nervous system and some basic mental processes such as ones I listed has some level of consciousness."

So do you think it's possible to create an 'artificial' consciousness? I used to think we couldn't (I also used to be religious), but I disagree with that now. Where I'm still in agreement with my old self, however, is that it doesn't seem like our current technology is anywhere near approaching that, and that we're not ever going to get artificial consciousness with our current 'mindset', if you will. It seems very common to think that we just need to make things incredibly complex, and that through that complexity, consciousness will emerge. That seems unlikely to me though. My computer is incredibly complex, but I don't think that it's any more conscious than my computer desk. It also seems like we'd need something that can interact, that can just up and, on its own, 'decide' to do something. But again, there already exists artificial interaction that can be quite complex. Any computer game AI does that, and again, I don't think that they're conscious. Not only that, I'm hard-pressed to imagine anything close to consciousness emerging from such technologies. I can imagine something that could mimic consciousness and emotions, but not actually be it.

Seeing as I don't foresee any of the technology we make today making itself conscious, but I don't think that consciousness is only human, there has to be some way to develop consciousness. Here's where I say something crazy: I think that consciousness may have a physical reality, and that life contains it or channels it or something. In an attempt to make this sound less totally crazy, physics has found analogies in our world already. Take any wavelength in the electromagnetic spectrum besides those that we can see. We have radio, x-rays, ultraviolet, etc. We can't see them, but they've all become commonplace. I think that if life does contain something that causes consciousness, it is obviously uniquely attached to our brains in some way, shape, or form.

I do see that this is totally out there, and maybe I shouldn't dismiss current AI technologies so readily. But here lies the real crux: we currently have no way to tell whether something is conscious or not.

~kitty~
01-15-2009, 08:05 AM
There could always be obstacles in the process of making artificial consciousness/intelligence, in that no one knows nearly enough to know what outcomes will come of such experiments...

For example, the Dolly cloning experiment didn't end up right, even when we thought we finally had cloning down.

It's hard to talk about something we haven't much knowledge of, but maybe you have more information packed in your brain about it than I do...
So maybe you can think of ways and whatever. What I'm trying to say is that it's a waste of time to build it if we don't have anything worth putting it into, and when we don't know what damage it could cause to society.

Reach
01-15-2009, 09:44 PM
To Cavernio


Could we create an artificial consciousness? Sure. I think it would be very 'different' from what we're used to experiencing, in that it would be an entirely different level of consciousness. This is due to the fact that humans...engineers...people that would build this kind of artificial consciousness would do so in a very organized way, much unlike evolution, which has shaped our brains in a very convoluted and unorganized way once you get down to the level of the neuron.


A computer currently has memory (far beyond ours), but what a computer doesn't have is something analogous to a functional nervous system that allows the computer to act on its own without user input **see end for an additional note on this**. This is necessary in order to generate 'processes' that I consider mental, and that are thus required for consciousness under my current definition. Computers can read off millions of 1s and 0s that we input very quickly, but they cannot generate that input in a coherent manner themselves, which is obviously the step that would have to be taken in order for us to generate artificial consciousness.

Yes, it's not something that is going to come soon, because it requires an entirely different kind of system. It would no longer be a 'computer' since it wouldn't just 'compute'; rather, it would have to be both a generator and a computer. Current designs don't allow for this type of functionality yet, despite massive advances in self-generating algorithms, etc.

We probably won't see artificial consciousness until our knowledge of neuroscience and quantum computing advances much further.


I suppose we could tell if something was conscious or not; the real problem is the lack of a formal and consistent definition of consciousness that everyone uses. This would be required before anyone would be able to come up with a test for whether or not an organism demonstrates some level of consciousness. Under my definition it would be fairly easy to test (functional nervous system with more than just spine and brainstem, i.e. cortex and cerebellum).

I see consciousness as a direct product of the complexity of the cortical regions of the brain responsible for mental processes. I suppose that doesn't really disagree with what you're saying about a physical medium carrying consciousness...I just see that physical medium as the complex network of neurons and other cells interacting in chaotic ways.



** we have to be careful here. I know that I think I act without user input, but a possible reality is one where I am acting based on input from the universe and the interactions happening on the quantum level of space-time. Under this model, I would be analogous to a computer (i.e. the brain is a quantum computer and the universe is the user), and this would mean a computer wouldn't have to act on its own in order to generate some level of consciousness (given that I can). However, computers are still obviously not capable of this yet.

This starts to get into free will, which is another topic, though fundamentally related. 'Free will' is a lie in my eyes nowadays, and the more I think about the topic the more I lean towards a universe that functions in the way I just described.

devonin
01-15-2009, 10:54 PM
'Free will' is a lie in my eyes nowadays
interacting in chaotic ways

A lack of free will tends to require a systematic, measurable and consistent sequence of cause and effect that is predictable. This seems at odds with interactions being 'chaotic' and thus unpredictable.

Cavernio
01-20-2009, 09:27 AM
Reach: Yes, I do see what you're saying about computers not acting on their own. I'm pretty sure that there are well-known scientists who believe that this is what ultimately separates computers from people. Two things about this:
First, as you've already mentioned, people aren't necessarily creating their own consciousness. But what we do have, then, is constant input, and there's obviously feedback from ourselves into ourselves. I don't know how far AI has come in terms of self-feedback, but I'm sure it exists to some extent. So this first point supports what you're saying, I suppose.
My second point, however, goes against what was just said. As you also mentioned, having a self-generating machine would imply that it has developed free will (or is developing it). But I don't think that's necessary for some levels of consciousness. For instance, if I touch a hot burner, I retract my hand as soon as the 'hot' signal reaches my brain. It is an instinct, and I will do it unless I've exerted a lot of self-control over myself not to. However, even as I move my hand away, I *know* that I've touched something hot. I become aware of what just happened, even though my free will and self-generating consciousness had nothing to do with it. I think that regardless of self-feedback or free will, consciousness can exist. That means that our current AI should have the potential for consciousness, even though it may only develop that consciousness when it's interacted with.
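(Here's a rough toy sketch of what I mean, just something I made up and not anything from real AI: the 'reflex' acts on the input immediately, and the record of being 'aware' of it is only written afterwards, yet the system still ends up knowing what happened.)

import time

awareness_log = []   # what the system later "knows" happened

def reflex(stimulus):
    # Immediate, hard-wired response: no deliberation involved.
    return "withdraw hand" if stimulus == "hot" else "no action"

def register(stimulus, action):
    # Slower step: the event only becomes 'known' after the action is already done.
    awareness_log.append(f"noticed '{stimulus}', already did '{action}'")

action = reflex("hot")    # happens first
time.sleep(0.1)           # awareness lags behind
register("hot", action)   # registered after the fact
print(awareness_log)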

Perhaps we do, in fact, have the ability to determine whether something is conscious or not. I mean, we think animals have it to some extent, and lesser things don't. Or perhaps, in order for anything to act human, it must develop consciousness as a matter of course: maybe that's just how the world works.
Do you think that if we were to develop the ability to share our own consciousness (aka mind reading), we'd be able to tell? Just throwing that out there, because I think that's a more plausible task.

devonin
01-20-2009, 04:19 PM
The main issue I would need resolved about what you mean by "mind reading" would be whether you would only be able to perceive active thinking (Like, I'd have to think AT you, consciously and deliberately) or whether you would also be able to perceive passive thinking, and then if the latter, whether conscious or unconscious thought would be transmitted.

I mean, I'm thinking these words in my head as I type them, so would a mind reader who can't see the screen be able to hear the words as I think them? Could they -also- detect subconscious thoughts or multiple thoughts happening simultaneously?

I suppose if you could just be passively telepathic and simply intake all thought happening around you, it would sound, in a crowd, much like it would if the crowd were all talking at once.

You'd presumably be able to determine "There are thinking things here" if only because you believe you can "hear thoughts" and are hearing something. But that doesn't necessarily prove other minds, because their thoughts can be just as much a potential illusion as their speech is, so it wouldn't -prove- consciousness of other beings, just provide even stronger evidence for it.

Lueadar
03-17-2009, 06:44 PM
I tend to agree somewhat with Reach in that free will may well be a lie. If you think about the human brain as a computer with a user, then free will makes sense in that the user (presumably a soul of some sort) is able to use the brain to obtain data from the various senses of the body, recognize key features of that data, and calculate possible outcomes, and then ultimately the soul gets to make the decision, for better or worse.

However, if our brains are, in fact, controlled by something outside our physical selves (i.e. a soul), then there must be a method by which the soul (not of the physical world) can interact with the physical world. This method of interaction would probably have to be very small (i.e. affecting the trajectories/energies of single particles to create a depolarization (nerve impulse) in the brain). If it were not small, then 1: souls would have no need for a physical body, as they would be able to interact freely with the environment at large, and 2: we would have noticed this long ago (scientific proof of psychic phenomena, ghosts, and what have you, of which there currently is none). Thus, a very small and unnoticeable method of interacting with the environment is a very probable conclusion (the interactions would be so tiny that we could never observe them, a la the Heisenberg uncertainty principle), but it still begs the question: what is a soul? Is it simply that energy has a will of its own (to vary slightly on a whim)? Is all energy somehow connected and controlled by a supreme force (i.e. god)?

The other option is that our brains are NOT controlled by some outside force, and that they are simply a very complex computer that reads in data (via the 5 senses), performs calculations (the decision making process in our brain - and the illusion that we control it), and produces output (actions via our musculature).

The problem with the second option is that we, as humans, feel, and have ownership over, our own consciousness. I am me, not anyone else. If you were to make an exact copy of my body and brain, down to the last electron, it would not be me. I could look at the copy of myself, and it would be identical to me in every way, but still I would be looking at an entirely separate being with a separate consciousness (presumably). Do I then have some ownership over the specific particles, atoms, and molecules in my body? Absurd, since those particles and atoms and molecules are constantly being replaced by new ones.

So if we do indeed have ownership over our own consciousness, and that ownership is not by way of physical particles, how then do we possess our consciousness? Is it by way of a soul that is entirely apart from this physical world? Or is the feeling of our own consciousness a lie? Is it just an illusion, and thus we really are just a brain?

'Tis the million dollar question.

Reach
03-17-2009, 10:28 PM
A lack of free will tends to require a systematic, measurable and consistent sequence of cause and effect that is predictable. This seems at odds with interactions being 'chaotic' and thus unpredictable.

I never got around to reading this thread again...so I'll reply now:

A lack of free will *could* mean there is a cause and effect sequence, but that doesn't mean it's predictable. You haven't addressed who's doing the predictions, or the physical problem of observation in making predictions (i.e. changing the outcome).

A system could easily be unpredictable to the casual observer (like us) but causal in nature. The problem is that we're mere observers within a system where there are vast amounts of information we can't account for on the quantum level. Quantum mechanics, for example, is probabilistic to observers like us. That doesn't mean it's not inherently causal though, as I would assume it is (since I see no reason or evidence for why anything isn't causal).


Even then, assuming quantum mechanics is inherently probabilistic and has no causation associated with it, it doesn't give evidence for or against free will. We could still lack free will - in a universe where the future is indeterminate.


There is evidence from neuroscience to suggest we lack free will. To me, it's an obvious conclusion when you throw Cartesian dualism out the window and understand that mind = body, and structure = function. =/

It's not conclusive and still an open inquiry at this point, though.


However, if our brains are, in fact, controlled by something outside our physical selves (i.e. a soul), then there must be a method by which the soul (not of the physical world) can interact with the physical world.

As much as it's an idea that has been regarded highly in the past, modern neuroscience rejects it entirely.

That doesn't mean it isn't true, since you can't prove it isn't, but it's one hell of a leap. Definitely not the most parsimonious explanation.


Is it just an illusion, and thus we really are just a brain?

Yeah, we're just a brain.

That doesn't necessarily mean consciousness is an 'illusion' though. It just means the feeling of control we have over this consciousness is an illusion and a byproduct of the brain as well.

I like to think of our consciousness as the universe reflecting on itself.

Lueadar
03-18-2009, 12:31 PM
Yeah, we're just a brain.

That doesn't necessarily mean consciousness is an 'illusion' though. It just means the feeling of control we have over this consciousness is an illusion and a byproduct of the brain as well.

That's more or less what I was getting at. We "feel" our own consciousness, and we feel we're controlling it, but apart from some outside force (again, a "soul" of some sort existing outside the bounds of what is observable for us), the feeling of control would be an illusion.

That doesn't mean it isn't true, since you can't prove it isn't, but it's one hell of a leap. Definitely not the most parsimonious explanation.

It is definitely a huge leap, but I figure it has to be at least considered from a logical standpoint, considering the vast number of believers in some kind of "god" (within all intelligence ranges).

A lack of free will *could* mean there is a cause and effect sequence, but that doesn't mean it's predictable. You haven't addressed who's doing the predictions, or the physical problem of observation in making predictions (i.e. changing the outcome).

A system could easily be unpredictable to the casual observer (like us) but causal in nature. The problem is that we're mere observers within a system where there are vast amounts of information we can't account for on the quantum level. Quantum mechanics, for example, is probabilistic to observers like us. That doesn't mean it's not inherently causal though, as I would assume it is (since I see no reason or evidence for why anything isn't causal).


Even then, assuming quantum mechanics is inherently probabilistic and has no causation associated with it, it doesn't give evidence for or against free will. We could still lack free will - in a universe where the future is indeterminate.

There is evidence from neuroscience to suggest we lack free will. To me, it's an obvious conclusion when you throw Cartesian dualism out the window and understand that mind = body, and structure = function. =/

Again, this is the million dollar question. If the universe, and everything in it (which would include us), is causal in nature, then we are at the mercy of the universe, and our lives are meaningless to us (though they may have meaning to the universe, whatever that may mean). We are the product of what came before us: we can be no different--every action has an equal and opposite reaction. Where we are now is just the result of an incalculable number of previous reactions. Within this view, there is no room for change. Fate is inescapable (this is starting to sound like The Matrix). It's a bleak outlook (or not, depending on how you look at it). If we, humans, are robbed of choice, then who is making the choices (or made the initial choices, as it were)? Is anyone? Is the universe just a meaningless heap of matter and energy--changing in a predictable and inevitable manner timelessly? No beginning, no end, no time, just change--meaningless change.

It's hard for me to agree with the above statements. I'm an atheist by definition, but I still see meaning in the universe. I see not meaningless change, but instead progress. Since the beginning of recorded time, there has been nothing but progress. Evolution from single-celled organisms to upright-walking, thinking creatures; technology; all of it is progress. Progressing towards what? I don't know. If it was all meaningless however, I don't believe we would be here, I don't believe we COULD be here. If evolution's goal is to create more adaptable organisms, then it would seem the ultimate goal is immortality. For what purpose? So the universe can reflect on itself? Perhaps. =)

It seems to me almost as if the universe, through us, is trying to create god, rather than the other way around.

Reach
03-18-2009, 02:29 PM
It is definitely a huge leap, but I figure it has to be at least considered from a logical standpoint, considering the vast number of believers in some kind of "god" (within all intelligence ranges).


Sure, I've considered it. And rejected it. Most people might believe it, but that doesn't mean anything. Most people believed the Earth was flat.

It's also reasonable to develop experiments to test whether or not we have free will, and thus far, it has been rejected. Under experimental conditions, regardless of directed attention, when a subject reports having willed an action, the action had already been set into motion in the brain. To me, the interpretation of this is pretty straightforward - consciousness is just a retrospective interpretation of a preconceived reality.

The argument from brain damage, or from altering brain states, also blows a hole in the case for free will. Why do drugs work, for example? Because they alter brain chemistry, and sometimes they alter pathways associated with conscious perception that you have absolutely no control over.

We're all puppets. We just can't see the strings.


Also, God could exist and we could still have no free will. I mean, if you go to church they'll tell you God gave us free will, but that isn't evidence for anything.


Again, this is the million dollar question. If the universe, and everything in it (which would include us), is causal in nature, then we are at the mercy of the universe, and our lives are meaningless to us (though they may have meaning to the universe, whatever that may mean)

How so?

Meaning is entirely arbitrary. Whether or not your life is meaningful to you depends on your subjective perception of the universe. Whether or not this is causal is irrelevant.

A lot of people forget about this, but if there's no free will, and we rewind the clock and observe the summation of your life, nothing has changed. Everything that has happened, happened, without your say in the matter, the exact same way it did when you thought you were in control. So, why does it matter?


I see not meaningless change, but instead progress. Since the beginning of recorded time, there has been nothing but progress.

I don't think this is the case at all. Take evolution for example. Evolution has no purpose, plan or reasoning behind it. However, it produces progress in the sense of change over time.

I find meaning in my life, and I think the fact we're here is quite beautiful and elegant. I don't think there's meaning other than that which we create though.


I don't know. If it was all meaningless however, I don't believe we would be here

I would say this is hindsight bias. The evolution of the universe necessitates change over time, and regardless of 'what' happens, something happens. It's easy to look back at a chain of events and recognize all of the intricate things that had to fall into place for it to work out that way, but really...something had to work out either way, and if it was some other way, you'd probably think the same thing.

Lueadar
03-18-2009, 06:45 PM
I can definitely tell you've given all of this a lot of thought. Mostly what I've written in this thread is the product of my own meandering reasoning process, and it's nice to get some insightful feedback.

I've often thought that if we are simply autonomous computers that can reproduce, then the culmination of artificial intelligence technology will be artificial life (most probably more advanced and more capable of survival than we are).

Cavernio
03-24-2009, 11:40 AM
This discussion has definitely gone places I didn't think about. I never equated a lack of a soul with necessarily implying no free will. Possibly because I think free will can be constructed. Yes, our brains have control over our consciousness. However, the consciousness that is created, I believe, has control over our brains too. Just as Reach cited that drugs and physical changes to our brain affect our consciousness, so does a conversation with someone cause us to change our thoughts. This is irrespective of the existence of a soul, but not of consciousness.

The fact that I feel I have control over what I do is what gives me free will. That, given enough technology and time, all of my actions could be predicted and known does not change the fact that I think about what I do and make decisions based on my thoughts, or that I'm aware when my actions are based on habit or impulse. Also, whatever technology we would make, if the technology is as it is today, may predict what we do, but it would not feel the emotions involved in our actions, and as a consequence, would not be able to understand them.

botchi246
03-25-2009, 09:36 AM
Those are some long posts!!!!! OMG!!! My take on consciousness: (btw, I did not read any of the other posts. This is pure unaltered botchism.)

I define consciousness as having two parts. The observable and the sub.
Let's get an example going.
Example: Pie

When a person wants some pie, you can see in their face the effect the pie has on them. Such can be noted through facial expression and body language. Body language includes excessive grabbing motions and the ultimate suffocation of pie intake obstructing the trachea. This is the observable conscious. Clearly, this poor person has fallen victim to the nature of pie, which is to completely take over one's senses. The observable conscious is just that. A clearly defined 'want' is the essence of this conscious. Hence, the want of pie.

Now, the subconscious is completely different. This I like to call the 'need'. The same person used in example 1 sees this potentially amazing tasting pie. The observable conscious may take over. See result of pie intake. But, something happens in the brain before this want takes over. This is the subconscious trying to define the 'need'. Do I really need this pie? What are the long term effects? Will it taste good? These are the foremost questions that go through the mind without hesitation and without thought. Unfortunately in this case, the 'want' has exceeded the 'need' and the poor person has died by suffocation.

I know this may seem like an elementary and immature way to look at consciousness, and it most likely has nothing to do with the posts above. It may not seem very logical, but that is in the eye of the beholder. It is hard to define consciousness and its implications, and this was the only way that I could express myself, through a subject I know very well: pie.

devonin
03-25-2009, 11:20 AM
I give marks for the attempt to be amusing with the pie analogy, but for one, as soon as you say "I didn't read the other posts," if the next thought across your mind is not "So I'm going to go do that before posting," then you are doing something wrong in this forum.

By posting a reaction solely to the thread title, you can either miss the point of the thread entirely or post something that has already been said, etc. Either way, it is disrespectful of those who took the time to actually follow the discussion as it has developed before adding their own thoughts.

As a result, your post seems to do nothing more than simply define what the conscious and unconscious minds are, and, in my opinion, it does so incorrectly.

How is the -conscious- mind somehow that which results in you stuffing pie into your face? That sounds more like you're trying to describe an -unconscious- compulsion to eat, which has nothing to do with the discussion at hand. The conscious mind is the one that is actually taking time to deliberate and ponder before action happens, so at best you've inverted your definitions.

"Do I really need this pie" is -exactly- what conscious thought is.

dsliscoo
05-27-2009, 06:53 AM
Well, here is my first post in this forum.. There is a lot I want to respond to here, but I am going to try to keep this post simple and short, so we will see how that goes. I apologize that my thoughts aren't going to be as organized as your thoughts appear to be.. but here I go anyway :)

I will start with my view..

In regard to the purpose of this thread, I think you're trying to define consciousness? (looking back at that first post)
Consciousness comes from (and is affected by) two different factors. The first is external factors: things like your environment, physical restrictions, and possibly genetics (though not to the level some people think..).
The other is internal factors: what you perceive, how you choose to look at it, etc.

Looking at the environment you are born into: first, being born as a human, you have literally been born into the supreme species on the planet. You can do almost anything you want, though other humans impose consequences. Your consciousness really isn't with you in those first few days or weeks; it is developed.
I am guessing this will be my main point: consciousness is developed for a variety of reasons, and nowadays it may just be used to define yourself against the clusterduck of the world around you.
(This part has made me go down a wide variety of roads in my mind.)
Your parents teach you words: ways to define the things you feel, want, and crave. Later these are used in communication, another thing to let others know what YOU feel, want, and crave.
Now, taking just this simple part, let's pretend for a moment that you weren't taught words. You are still human, but this is before written, definable words.. you're the first. Hunger isn't hunger; it's a nervous impulse in your body. Your father knows you're hungry because he knows what the feeling is. He is the one who hunts and kills the turkey you eat. Your mother also knows the feeling in the late night; she knows you want the blanket, because it makes you comfortable.
Can you imagine this? Now, it is true that other ways of communicating can be used and have been developed, but words: this is the start of actual consciousness, the ability to define yourself to the world around you. (Look at Helen Keller: she had a very undeveloped one until her teacher figured out a way to show her.)
Going through school, you learn more words and ideas. You start being able to define very complex emotions inside you. This learning is the part most responsible for the development of consciousness.
One more example.. the first post, living life in a box. If for whatever reason you survived for thirty years inside a room by yourself, and then you were to come out into the world on a busy street, you would see all the people moving; they have direction, and you would probably be able to feel that. You would feel isolated in that you have no direction, but no one ever told you that isolation is even real.. You would just feel out of place and more than likely want to return to the box, where you belong. (And now you have a direction to go, haha.)

This brings up one more quick thing. Say you did sense that everybody on this busy street had direction. This is empathy. I believe this characteristic is in every person too. Words make empathy a little bit harder to listen to, though; empathy, instinct, and even intuition disappear because they are not needed anymore. To me, intuition may be the only true thing that a person cannot change.

Now, what I think this topic is really seeking out is the second part: the internal factors. How you change yourself is what I find most interesting.
Your willpower, what you perceive and how you choose to perceive it, and your will to survive: these are the other factors that have shaped consciousness today.

For my beliefs on this, all I can say is that I think every person is capable of anything. I really am not sure where to go with this post from here; I feel that I left something out, but I am pretty sure I defined consciousness in my mind. If anyone has any questions about my post, just let me know.

One more thing.. animal consciousness. It is like ours; theirs is just undeveloped.



As I said earlier, there is a lot I wanted to respond to in this topic.

Um, I'm not sure how to quote on here, but here it goes.


Do you think it's possible to create an 'artificial' consciousness?

I definitely do. If you can give the computer a goal, then give it an environment in which to complete the goal, then give the environment a way to communicate with the computer, or give the computer the ability to sense the environment. The goal to give the computer would probably be to keep surviving; it has to want to continue or it won't. With this you might be able to approach a bacterium's form of consciousness.
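(Something like this little sketch is what I have in mind; the names and numbers are completely made up, it's just to show the loop of goal, environment, and sensing:)

import random

class Environment:
    # A toy world the computer can sense: sometimes food is nearby, sometimes not.
    def sense(self):
        return {"food_nearby": random.random() < 0.5}

class Agent:
    # A 'computer' whose only goal is to keep surviving.
    def __init__(self):
        self.energy = 10

    def act(self, observation):
        if observation["food_nearby"]:
            self.energy += 2   # eat
            return "eat"
        self.energy -= 1       # searching costs energy
        return "search"

env, agent = Environment(), Agent()
for step in range(20):
    if agent.energy <= 0:
        print(f"step {step}: the agent did not survive")
        break
    action = agent.act(env.sense())
    print(f"step {step}: {action}, energy={agent.energy}")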


That doesn't necessarily mean consciousness is an 'illusion' though. It just means the feeling of control we have over this consciousness is an illusion and a byproduct of the brain as well.


Well.. I'm just going to say I think we have complete control over our own consciousness. I may just be a special case, but I can change my perceptions any time I want to. To elaborate, I can look at a red pen and honestly convince myself it's a blue pen. Does it make it blue to anyone else? Hell no. So I don't do it :) The only thing that controls my consciousness, aside from myself, is my desire to communicate with people.


Take evolution for example. Evolution has no purpose, plan or reasoning behind it

Evolution has only reasoning behind it: what can survive will survive. If this weren't the truth of evolution, then there would be no reasoning behind it.

And as for the whole pie thing.. well, I will just say that's freaking hilarious :)

Reach
05-27-2009, 09:50 AM
This discussion has definitely gone places I didn't think about. I never equated a lack of a soul with necessarily implying no free will. Possibly because I think free will can be constructed. Yes, our brains have control over our consciousness. However, the consciousness that is created, I believe, has control over our brains too. Just as Reach cited that drugs and physical changes to our brain affect our consciousness, so does a conversation with someone cause us to change our thoughts. This is irrespective of the existence of a soul, but not of consciousness.

The fact that I feel I have control over what I do is what gives me free will. That, given enough technology and time, all of my actions could be predicted and known does not change the fact that I think about what I do and make decisions based on my thoughts, or that I'm aware when my actions are based on habit or impulse. Also, whatever technology we would make, if the technology is as it is today, may predict what we do, but it would not feel the emotions involved in our actions, and as a consequence, would not be able to understand them.

I have a few questions.

Can you explain how created consciousness has control over our brain? I'm going to assume you're implying a form of Cartesian Dualism, or Mind-Body differentiation (Something neuroscience rejects entirely). If not, I'm curious.

My argument would be that, if consciousness stems from the information created by the summation of reactions in your brain based on the rules of the physical world, then by necessity, your conscious mind would have to be a *physical* entity separate from your brain in order to have any governing properties over your brain. How is this so?

In other words, I would argue perception is isomorphic to the physical world like language is isomorphic to perception; it's a representation of what is real based on what's happening in your brain. If those happenings are bound by physical laws of the universe, then how could consciousness possibly exert control over the brain, given it is...by definition here, your brain.

And 2) Can you explain what you mean by "The fact that I feel I have control over what I do is what gives me free will." I don't see how this holds up. It might give you free will in a mental sense, but physically whether or not you believe you have free will is going to have no effect on whether or not you *actually* do (Given feelings are just a product of the brain as well).


I would say Free Will and Absolute Determinism are actually indistinguishable through perception alone, which is why it is inherently human to assume we have free will, but our knowledge of the universe and neuroscience suggests otherwise.

I'm just going to say I think we have complete control over our own consciousness. I may just be a special case, but I can change my perceptions any time I want to. To elaborate, I can look at a red pen and honestly convince myself it's a blue pen. Does it make it blue to anyone else? Hell no. So I don't do it. The only thing that controls my consciousness, aside from myself, is my desire to communicate with people.

Have you considered the possibility that this is just your retrospective interpretation of something that was already set into motion irrespective of you? If you took a test to look at your brain waves when you report being able to change the red pen into a blue pen, the brain waves associated with this change would come before your so-called will to make that change.

Can you explain this phenomenon without giving up your free will?

Evolution has only reasoning behind it: what can survive will survive. If this weren't the truth of evolution, then there would be no reasoning behind it.


When I say no reasoning behind it, I mean the system itself (Evolution) makes no choices. Rather, the outcome of the system is determined by the actions of those within the system itself (e.g. whether or not the kangaroo is able to reproduce). As such, evolution is unbiased and without direction - it only takes what it is given.

dsliscoo
05-29-2009, 08:04 AM
Have you considered the possibility that this is just your retrospective interpretation of something that was already set into motion irrespective of you? If you took a test to look at your brain waves when you report being able to change the red pen into a blue pen, the brain waves associated with this change would come before your so-called will to make that change.

Can you explain this phenomenon without giving up your free will?


Well, I will use a little more complicated example..

2(x+4)=6

Higher thinking isn't done solely by one part of your brain. Where these parts meet and communicate is your consciousness, and conversely (or is it inversely?) the consciousness reflects back to the separate parts of the brain. Thinking about this, I am wondering if consciousness is even that amazing, or if it just comes around when you try to run while chewing gum and talking to your jogging partner..

My brain is what makes my consciousness, if that's what you mean, but your consciousness does have control over your brain, or over any single part of it, I should say. I would say there is limited communication between so many parts of the brain without conscious effort. An example might be: unless you're smart enough to avoid drinking, a lot of us have taken a little bit too much alcohol before. Blacking out would be what your brain is without consciousness (though not exactly, neurotransmission being inhibited and such..).

Since I know you're not trying to say that the brain is some outside shady character that I shouldn't trust and that is just committed to controlling me.. I think what you might be trying to get at is that I have programming that I must follow, like a coffee machine. There might or might not be a core programming in every person, but if there is, it is nonsensical and practically ineffective. Whatever that programming once was has been almost entirely annihilated by the experience of life on the brain itself.

The thing is that this same programming has worked up to this point. Humanity is a marvel and it all boils down to that base genetic code, but I am positive that is not what gives the variety of people we see today and in the past.

I have deviated; back to your question. I am sure that I have decided to do everything I have done, regardless of any outside entity controlling me.


Reach,
as a side question, you mention a lot of things about neuroscience. Do you have a degree in it, or just Google? Because just to let you know, I am a googler lol. :)

Cavernio
06-2-2009, 02:26 PM
Reach:
I started out trying to respond to each of your paragraphs, but I found that it all basically boils down to whether consciousness can affect the brain in a top-down manner. What I take your position to be is that, since consciousness can only develop from our brain, the brain is obviously the physical determinant of our thoughts. I've already agreed with this, when I said something like: even if we could determine everything that would happen, it doesn't mean, to me, that free will doesn't exist.

This is a rather low-grade analogy, but it's all I can think of. Say you create a computer program A that is designed to, itself, make another program B, which is then designed to make program C. You could say that program A made program C, but it's also right to say that program B made program C. Now, what if program C is designed to change what program A does? This then changes program B, which then changes program C. At the very start, it is still clearly A that started everything. Now let's pretend that A is not actually a program but hardware, and very basic hardware at that; that B is some sort of basic software; and that C is highly advanced, intricate software. After hardware A has started, you could say that the hardware is still controlling everything, but that would be ignoring the fact that software C is clearly doing much more of the calculation and control, and indeed that C has taken over the process. A is the body, B is the brain, C is the mind. Well, OK, with a mind/body thing, C actually affects B, which then affects A, and A and B are both hardware. And I guess B would be doing the calculation and C would be making sense of the calculations, but the gist is the same. If you can imagine how a computer program could exist that could change itself, and if you can see how C would actually be controlling things, then in my mind you should be able to imagine how our mind can control ourselves.
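(To make the analogy a bit more concrete, here's a tiny made-up sketch; the names and numbers mean nothing in themselves, it's only to show C reaching back and changing A, so that after a few cycles it's C's feedback, not A's original settings, doing most of the steering.)

# A: very basic "hardware" - just a fixed setting.
hardware_A = {"gain": 1}

def build_B(hw):
    # B: basic "software" whose behaviour is derived from A's settings.
    return {"threshold": hw["gain"] * 5}

def build_C(software_B):
    # C: the advanced layer built on top of B; it can reach back and adjust A.
    def adjust_A(hw):
        hw["gain"] += software_B["threshold"] // 5   # top-down feedback into the "hardware"
    return adjust_A

for cycle in range(3):
    software_B = build_B(hardware_A)   # A (re)shapes B
    mind_C = build_C(software_B)       # B (re)shapes C
    mind_C(hardware_A)                 # ...and C changes A for the next cycle
    print(f"cycle {cycle}: hardware_A = {hardware_A}")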

I don't know if this would fall under dualism or not, but I know that not to have a dualism of mind/body to some extent seems absolutely ridiculous. Our mind, no matter how connected to our brain and body, is a completely different creature from the rest of the physical world as we know it.

"Can you explain what you mean by "The fact that I feel I have control over what I do is what gives me free will." I don't see how this holds up. It might give you free will in a mental sense, but physically whether or not you believe you have free will is going to have no effect on whether or not you *actually* do (Given feelings are just a product of the brain as well)."

Maybe you think this is a wholly stupid statement, but if we have free will in a mental sense, that to me is actually free will.


"Have you considered the possibility that this is just your retrospective interpretation of something that was already set into motion irrespective of you? If you took a test to look at your brain waves when you report being able to change the red pen into a blue men, the brain waves associated with this change would come before your so called will to make that change."

I know this is not a comment directed at me, but to get into the nitty-gritty, there's absolutely no scientific certainty that any given thought happens only after the brain does something. There is no way to determine when a thought changes because 1) there are delays in relaying information to a person, 2) our instrumentation is very weak at determining both the time and place of any activity in the brain, especially in conjunction with the fact 3) that we cannot know a person's thoughts even knowing all brain activity. (Although perhaps in the future, given enough study, we may be able to do 3.) You are making an assumption.

Cavernio
06-3-2009, 06:21 AM
Whatever that programming once was has been almost entirely annihilated by the experience of life on the brain itself.


This is one thing you said that I agree with, dsl. I could even add it to my computer program analogy by saying that after enough of D has happened (D being the environment, which generally affects only C, but which can affect A and B too), A is even less of the cause of what a person's thoughts are.

This is addressed at Reach again, and is more of something to ponder than any true evidence I have about my view of consciousness. Evolutionarily speaking, why would we even evolve consciousness if it had no control over us? There's no purpose to it beyond that, that I can see.
You might say that consciousness is merely a by-product of the complexities of the brain, that the brain can't do what it does without having consciousness. Firstly, I just don't think that that's true. Computers are very complex and yet they're not conscious. Secondly, we DO have reflexes, movements which are independent of our consciousness. Parkinson's disease specifically interrupts conscious control of movement, but not reflexive movements.