Tuesday, October 10, 2006

Confidence is a Problem

Here's me thinking about confidence and trying to figure out some larger patterns I've been seeing around me in the last while. No need to read it. This is something I'll Write about sometime, but these are some early diagnoses.

This is a thought that occurred sharply when I was learning to ride a motorcycle: confidence is a problem, as is unconfidence (if that's a word). The more often I was reminded by people to be careful, and particularly the more I was told horror stories of bad falls and accidents, the more confidence became a problem.

Here's the rub. One must actually be confident enough in oneself and one's ability to start riding. Further, one must ride confidently to avoid the problems of falls and accidents. On the other side, having too much confidence can dull your edge, pacify your brain, and lead you into bad decisions.

So you have to have just enough confidence to do it, but not so much confidence that you don't try always to do it well. This is just the old Aristotelian maxim of the golden mean. This is also one version of something I'm starting to see all around me.

In terms of security, and here let's talk about national security in the States - you need people to be just afraid enough to accept certain arguably necessary changes (only 'arguably') in their political rights, etc., but you need them to be just confident enough to think that things are going well, and to go to work, and to have consumer confidence, and all the rest of it. A sufficiently large defect in confidence would paralyze; just enough of a defect seems to motivate.

I see this in the much smaller scope of personal security. We need just enough mistrust of our neighbours to prevent us from walking a dangerous route home in the dark where we very well might have a problem. Still, we need enough confidence to be able to go out the door and walk to work. We also need enough trust in our neighbours to have communities that are not armed camps. In this model, there's a vast degree of difference between the minimal amount of confidence and the maximal amount of confidence that is useful. I'm not sure if the distance is all that broad between the minimal and maximal in national affairs, but it might be.

We also need just enough fear of consequences to prevent us from succumbing to certain temptations, but not so much that we become unable to be in the same room with the object of our weakness.

Do these balances exist in good things, too? Tricky. I think what I'm seeing is the balance brought about by understanding, or at least perceiving, negative consequences. I don't know how it plays out in terms of hopes.

The reason for that, at least what I see as the reason in my own reflective moments, is that hopes and fears are remarkably similar things as states but hugely different in how we experience them. They're both dispositional reactions about the future - we have a picture of it, and we figure out how we feel about that picture. Sometimes fears trump hopes, if you're risk averse, and sometimes hopes trump fears, if you're a gambler - or an idealist, or a romantic.

For me, anyway, the vivacity of reactions I get to hopes and fears is different. Not just between hopes and fears, but between individual examples of each and - this is where it gets awfully murky - between the same example of a hope or fear at one time, and at another.

What this suggests to me is that there are probably only a handful of reliably consistent motivations that people can act on; the rest are highly fluid in terms of their products. Ever had a flippant moment while doing something dangerous? Most people probably have said to themselves after such a moment, "Wow, that was dumb, I know better than that."

What I'm starting to wonder is whether it comes down to confidence and unconfidence (which isn't the same, really, as worry). The rest of our dispositions might be coloured by our confidence, our sense of ego, and our basic 'oomphiness'.

On the side of sanity, we all have certain intellectual commitments that drive us to some decisions we would prefer to avoid, and some other commitments that keep us from doing things that we might locally enjoy. I didn't think I'd ever come to this suspicion, but perhaps our intellectual commitments are more reliable and sturdy than our emotional ones - in spite of the fact that the feeling of hope and fear is so much more vibrant.

Okay, again this is just me rambling my way towards an idea I'll really Write about sometime, but there's a point. The more we can shunt our decisions into principles the more stable we'll be. Also, the more we shift public responsibility away from quick decision makers motivated by interest and towards institutions that are impersonal, perhaps that's better. This horrifies me a bit, because I think that institutions are terrible decision makers. That's why we have rules to govern institutions, because they can't experience real guilt and learn from it. They can't be corrected in the same way as can, say, an agent.

Who comes to the conclusion that well run and regulated corporations might be better than individuals? It cuts across the grain of what I think of as my own moral commitments. Nevertheless, I'm thinking that decisions made a priori, with a minimal admixture of emotion, that provide guidance to institutional practices might really be preferable, for some things, to decisions made by, say, people. Cooler decision-taking is itself desirable, but this strikes me as being a nasty thought. It argues a closeness between institutions and intellectual commitments, but I'm wondering if intellectual commitments are what policies really are, and vice versa.

Regardless, I'm seeing stuff like this crop up in lots of different places. So it starts with wondering about balancing confidence to cope with risk, and winds up arguing that institutions might do less damage than people. Welcome to my mind. Hope you enjoy the view.

3L
