Date: 2009-03-05 04:31 pm (UTC)
From: [identity profile] meaningrequired.livejournal.com
Would someone who voted "eventually reached such a level of complication that I threw them away and now do what feels right at the time" be very unpredictable?

If I thought that I would do what feels good at the time, then I'd be likely to be in a lot more messes than I am!

Date: 2009-03-05 04:32 pm (UTC)
From: [identity profile] meaningrequired.livejournal.com
And fatter.

And poorer.

Date: 2009-03-05 04:37 pm (UTC)
From: [identity profile] meaningrequired.livejournal.com
Ah yes - my brain substituted the word "right" for "good" :)

Date: 2009-03-05 04:38 pm (UTC)
From: [identity profile] meaningrequired.livejournal.com
I think people with a rigid code would be more likely to be aggressive with them (on others).

Date: 2009-03-05 04:35 pm (UTC)
From: [identity profile] laserboy.livejournal.com
Too restrictive.

Date: 2009-03-05 04:58 pm (UTC)
From: [identity profile] meihua.livejournal.com
Free will: I don't have free will, and wobble between feeling like I do and feeling like I don't.

Date: 2009-03-05 05:28 pm (UTC)
From: [identity profile] meihua.livejournal.com
Well, day to day, I think, "Would I like to eat this apple or not? Hmmm." rather than, "The current state of the universe only allows one progression from this state, and that progression is not one which can be controlled by this illusion of sentience I have." If I only thought the second way, it would be rather tiring. :)

More mundanely, I tend to feel like I have free will when things are going well, and remind myself that I don't when things are going less than well. ;)

Date: 2009-03-05 05:07 pm (UTC)
cdave: (Brains)
From: [personal profile] cdave
I think that pointing to those brain scans that show your neurons fire to move your finger before "you" are aware "you" decided to is not proof of no free will.

"You" are made up of lots of complicated interactions, mostly in the brain. Just because one of those happens before another does not mean you don't have free will.

Date: 2009-03-05 05:15 pm (UTC)
cdave: (Default)
From: [personal profile] cdave
I can't think of anything right now that would make me think we don't have free will.

Sure, we may have conditioned responses (ring a bell and I'll salivate), and unachievable actions (I'll raise my left foot now, and then fly now), but that doesn't mean that you don't have a choice outside of those restrictions.

In fact if you pile together enough conditioned responses, you may come up with something I'd be happy to say has free will. I don't believe in philosophical zombies.

Date: 2009-03-05 06:24 pm (UTC)
cdave: (Default)
From: [personal profile] cdave
How are you defining free then?

Date: 2009-03-05 06:34 pm (UTC)
cdave: (Default)
From: [personal profile] cdave
Okay then.

When people talk about "free will" they seem to be referring to "you" having at least some level of choice in your actions, and your responses not being completely defined by exterior activities, but some level of interior choice.

This definition is intrinsically bound up with consciousness, as that's what the "you" in this definition is.

So my definition of free will would be: any "you" that is sufficiently complicated that its every response cannot be predicted faster than it can be made.

You have to base your definitions in the real world.

Riddle me this:
If you have a long incompressible rod and jiggle your end of it backwards and forwards, could you communicate by morse faster than light?
Yes, but there's no such thing as an incompressible rod, so don't base any serious plans on it.

Date: 2009-03-06 11:34 am (UTC)
cdave: (Default)
From: [personal profile] cdave
Yeah, I'm afraid I'm something of a materialist.

And you're right, that's a rubbish definition. I retract it.

I don't believe your mind is made up of anything more than the totality of interactions within it. Therefore any definition of "free will" which involves a mind beyond matter won't cut it with me.

How's about this: anything you have made a conscious decision to do was a result of free will.

So pulling your hand away from an unexpectedly hot surface is not a result of free will. Hovering your hand over it afterwards to check the temperature is.

Date: 2009-03-06 11:36 am (UTC)
From: [identity profile] endless-psych.livejournal.com
+1

It depends how you define free will, and also "you". To my mind, citing the fact that the brain acts before "you" do (consciousness must necessarily be post hoc, if only at the speed of thought, given that immediate access to stimuli in the environment is impossible) is invoking mind-brain duality, a vestige of older philosophical traditions that are in turn examples of body-soul duality...

Date: 2009-03-05 05:24 pm (UTC)
From: [identity profile] anton-p-nym.livejournal.com
My perception of free will may or may not be an illusion, but it is useful and mildly comforting so I'll go with it.

I'd like to think that my morals are updating as the latest patches are circulated, but in certain areas (notably copyright, these days) they seem to be several iterations behind the state of the art. It's confusing.

-- Steve thinks sometimes that his morals and ethics have one foot in the 19th century and the other in the 21st, and his brain doesn't seem to be quite limber enough to pull off that sort of Godfather-of-Soul splits elegantly, alas.

Date: 2009-03-05 05:35 pm (UTC)
From: [identity profile] cartesiandaemon.livejournal.com
[X] No-one has any idea what free will is? :)

Date: 2009-03-10 07:15 pm (UTC)
From: [identity profile] cartesiandaemon.livejournal.com
Ah, ok! :) I'm contingently unable to leave polls unfilled :) (And want to distinguish between makes no sense to me, but other people understand it, like differentiable manifolds, and things that make no sense at all, like "flibble bebop overland" :))

Date: 2009-03-05 07:06 pm (UTC)
From: [identity profile] nmg.livejournal.com
I clearly have free will. The rest of you are philosophical zombies, and so don't.

This is particularly true of [livejournal.com profile] cdave.

Date: 2009-03-05 08:33 pm (UTC)
zz: (Default)
From: [personal profile] zz
i'm a prisoner of my own mind anyway, so whether that mind has free will is irrelevant, whatever free will is. although i suspect quantum stuff might break the "the universe is a big set of cogs" idea.

morality: every situation should be viewed on its own merits. i also have no problem with cognitive dissonance. some conflicting points of view can be equally logical. also i like to gnaw on things.

Date: 2009-03-05 09:01 pm (UTC)
From: [identity profile] drjon.livejournal.com
*shrug*

I have an illusion which it pleases me to call Free Will, in and of the illusion which it pleases me to call Me.

(cf reification)

Honestly, modern "philosophers" burble on such unmitigated crap as you wouldn't believe. "Philosophical zombies"? Horseshit! Adrift from any connection with sense.

Date: 2009-03-06 12:01 pm (UTC)
From: [identity profile] drjon.livejournal.com
The whole point is that the "zombie" is supposed to be indistinguishable from a "real person" except for the narratively-driven assertion of the lack of a spooky indwelling essence (consciousness, a soul, whatever). But there is, by the definition of the problem, no way to actually prove such a thing is missing beyond the very assertion that it's so. Horseshit, in other words. Snake Oil.

Of course, the idea of a simulacrum of consciousness is a good one, but if you have a system complex enough to generate that simulation, then there's also a very good argument that there's a consciousness either encoded, implicit, or enfolded within that system. Within the "philosophical zombie" gedankenexperiment, there is of course a demonstrable consciousness at work. (The location of that consciousness is left as an exercise for the Reader ;}P> )

But even putting aside that notion, there's the issue of what the hell the observed behaviour we call "consciousness" is, beyond "we know it when we see it". Ain't nothing better than the Turing Test, which isn't a scientific test. Because beyond "we know it when we see it" and "if it looks like a duck, and sounds like a duck", there really is only speculation. And all of it, without any actual way of measuring this abstraction, is... horseshit.

Diogenes would laugh his guts out. And throw a plucked duck.

Date: 2009-03-06 11:41 pm (UTC)
From: [identity profile] drjon.livejournal.com
No way to tell from the outside. You could presumably open them up and look at their tasty, tasty brains and see whether they had a consciousness centre. :->

There are so many unsupported assumptions there that to say you're begging the question would make a Merchant Banker blush.

But I'm not here to convince you that the concept's horseshit. Either you'll realise that yourself, or you won't.

Date: 2009-03-07 09:42 pm (UTC)
From: [identity profile] drjon.livejournal.com
"Of course, the idea of a simulacrum of consciousness is a good one".
It's probably your job to try and make sure you don't look silly by claiming I'm holding a position I'm demonstrably not holding.

But that's not what's being discussed. What's being discussed is the concept of the Philosophical Zombie. If you don't believe that the indeterminacy which is both implicit and explicit in the Philosophical Zombie gedankenexperiment is actually relevant, then you're already agreeing with me (because you've rejected the point of the exercise). And if you do believe it, then you're contradicting yourself (because you've already stated that you think you can measure consciousness within the Philosophical Zombie gedankenexperiment).

You asked me what my "quibble" was. I've answered your question.

Date: 2009-03-05 11:20 pm (UTC)
From: [identity profile] khbrown.livejournal.com
What is more interesting, in 'my' opinion, is to investigate who believes in free will and who doesn't, and the correlations thereof.

If we are successful, however defined, do we want to believe it is as a result of our own actions? And the converse for failure?

Date: 2009-03-06 12:04 am (UTC)
From: [identity profile] martling.livejournal.com
Option #4 from question #1 is conspicuously missing from question #2.

Date: 2009-03-06 12:33 am (UTC)
ext_267: Photo of DougS, who has a round face with thinning hair and a short beard (Default)
From: [identity profile] dougs.livejournal.com
q1: I've always regarded the first two answers as identical... synonymous... indistinguishable.

q2: I've always done what feels right at the time, so any answer involving the word "eventually" captures the truth rather less than completely.

Date: 2009-03-06 10:37 am (UTC)
From: [identity profile] rhythmaning.livejournal.com
Technically, I am not sure if we have free will or not, because of the way brains work and so on - I think our behaviour is probably predictable if one could measure all the chemicals and electrical connections and so on.

On the other hand, my brain is wired to make me think that I have free will!

Date: 2009-03-09 01:08 pm (UTC)
From: [identity profile] 0olong.livejournal.com
How far in advance would we have to be able to predict it, and with how much certainty, in order for you not to be 'free'?

Date: 2009-03-10 11:25 am (UTC)
From: [identity profile] rhythmaning.livejournal.com
...I have no idea!
From: [identity profile] 0olong.livejournal.com
The concept of me having free will is almost completely incoherent, in that 'free', 'will', and 'me' are all profoundly ill-defined. Free from (or to) what? Where does 'will' begin and end? What does this thing I am choosing to call 'me' encompass?

Does it make the slightest bit of difference, ethically speaking, if the entire course of the universe's history is uniquely constrained by the moments after the big bang (but in practice, absolutely impossible for anyone to predict), or if the decisions we make stem in part from the completely random 'choices' made by subatomic particles within us?

I would be tempted to tick all three boxes for your second question, if it weren't for this universe of exclusive and well-defined moral choices that we are forced to live in.
