Sunday, 21 June 2009

An amusing diversion.

If I told you I didn't believe in rationality, what could you do to convince me? You could try to present an argument, but I, believing the argument to be necessarily irrational, would have no reason to accept it, or even listen to it. Your best bet would probably be to medicate me, though such techniques don't really fall under the banner of 'convincing'.

This is because rationality is so basic. Any proof that rationality exists must be in some sense circular, in that it must rely at some point on reason. This isn't a problematic kind of circularity, though, in that we simply rely on the proper function of reason in our brains to understand reasonable arguments. We don't, and can't, rely on our knowledge that we are rational in order to do this. But lack of proper care in thinking about this can lead to paradoxes.

It's much easier with the existence of things like tables. We have a sensible theory of the world into which tables snugly fit. More precisely, in this theory there is a role for tables as existing objects which fits pretty closely with our experience of them and of the world. To return to the theme of this post, our physical theories of the world give good reason to believe that continuing to use the word 'table', as we do, as if it were a noun, will not lead us into serious difficulty. So, according to that post, we shouldn't be at all surprised that we find ourselves, from time to time, making claims like 'tables exist'. There is no paradox here, despite the fact that tables have been widely used by those who put these theories together.

Let's pause at this point to notice the dizzy heights we haven't reached. The dalliance with the word 'exists' in the post referenced above gave no explanation at all of what existence is, or of what it is for a thing to exist. It was simply an observation about the linguistic use of a word, a piece of anthropology. Accordingly, we needn't worry about things relying for their existence on the ideas introduced there. This is good: it avoids vicious circularity - those ideas needn't rely on themselves for their own existence, for example. Bafflement easily ensues if this is not kept in mind.

In fact, there are quite a few things which exist, and whose existence may be explained in this very limited sense, but for which the explanation invokes the things themselves. The problem is only apparent, since what is explained is only the fact that, given the way we use language, we sensibly use the word 'exists' about those things. It is amusing to examine a few cases, and see how in each case familiar paradoxes are generated. I'll look at numbers, language, time, probability and deconstruction, but I'm sure you can think of your own examples.

Numbers

Although numbers aren't made of physical stuff in the same way as tables, our basic physical theories give us good reasons to accept that using the words for numbers nounishly will be unproblematic. For the world is generally divisible into discrete bits which we can count, and preservation of cardinality is a general rule which emerges from our understanding of how those bits of stuff behave. However, the more deeply we look into the behaviour of stuff and our understanding of it, the more we find that in order to explain what is going on we must make use of mathematics.

Because of the misunderstanding I outlined above, this phenomenon has often been taken to show that such explanations are circular and therefore inadequate. So the question of how our minds can interact with numbers has been singled out as a mystery. How can our minds, which are made of physical stuff, possibly interact with such abstract nonphysical entities? Even the brilliant thinker Roger Penrose has taken pains to express how baffling he finds this, in terms of a triangular structure of mental, material, and mathematical worlds, each enclosing the previous one:
In order to confront the profound issues that confront us, I shall phrase things in terms of three different worlds, and the three deep mysteries that relate each of these worlds to each of the others. The worlds are somewhat related to those of Popper (cf. Popper and Eccles 1977), but my emphasis will be very different.

The world that we know most directly is the world of our conscious perceptions, yet it is the world that we know least about in any kind of precise scientific terms. ... There are two other worlds that we are cognisant of ... One ... is the world we call the physical world ... There is also one other world, though many find difficulty in accepting its actual existence: it is the Platonic world of mathematical forms.

Language

Though we are far from completely understanding it, linguistics has given us a pretty good outline of how language emerged and how it is used in day to day life. Mundanely enough, in order to express this stuff even linguists must humbly rely on the very linguistic structures they are studying. Pulling a mystery from this banality, I've seen this fact used to reject the explanations of linguists as fundamentally unable to address the 'deep questions' of language, such as when a sentence is true. This misunderstanding is used to claim that there is a sacred territory here on which we must proceed philosophically barefoot.

Time

Great strides have been made in understanding what time is. Newton and others produced a sensible mathematical model into which time fits. Einstein showed that the model in question was slightly off (in a way that reflects our simple mental pictures of time), and proposed a better one. This better perspective has yet to be fully integrated into quantum mechanics, but there's no reason to suppose that this won't ever be achieved. In any case, we are close to having an understanding of how time exists (that is, of why using the word 'time' as a noun doesn't screw us up) which is built on cunning maths that makes no reference to, and doesn't rely on, a prior concept of time.

Being human, though, when we consider these mathematical models, our brains (being so tied up in a primitive model of time) situate them in a kind of timeless present. Besides the obvious confusion this leads to (nonsense like 'Relativity implies the future already exists'), it has encouraged the claim that this present, for example, is prior to, and in principle cannot be accounted for by, scientific models. Once more, there is a purported need to abandon all our usual tools when addressing our day-to-day experience of time. I'm told Heidegger followed a line of this kind, but I haven't checked this in his writings.

Probability

This is a slightly odd special case. The normal way we introduce probability into scientific models is at the ground floor, with a range of possible worlds, each assigned a prior probability. For this reason, like numbers, probability is difficult to explain except by exemplifying the proper use of probabilistic language (and remarkably it seems not to have been systematically understood until a few centuries ago).

To some extent this can be dealt with, though it normally isn't, by considering appropriate strategies for a limited reasoning being in a deterministic, patterned reality. Such a being could represent its ignorance about that world by assigning various probabilities to ways it reckons the world might be. So we can see how probabilistic language might become useful and how the probabilities of particular events could be said to exist without having to invoke probability in the explanation.
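The strategy just described can be sketched concretely. The following is a minimal, hypothetical toy of my own devising (the candidate rules and names are illustrative, not from the post): the world follows one fixed deterministic rule, but a limited reasoner doesn't know which, so it represents its ignorance by spreading prior probability over candidate rules and updating, Bayes-style, as observations arrive.

```python
# A limited reasoner in a deterministic world. The world obeys exactly one
# rule; the agent's probabilities describe only its own ignorance.

# Candidate deterministic rules mapping a time step to an observation.
hypotheses = {
    "always_zero": lambda t: 0,
    "always_one":  lambda t: 1,
    "alternating": lambda t: t % 2,
}

# Prior: ignorance represented as equal probability over the rules.
belief = {name: 1.0 / len(hypotheses) for name in hypotheses}

def update(belief, t, observed):
    """Bayesian update. A deterministic rule predicts with certainty, so
    its likelihood is 1 if it matches the observation and 0 otherwise."""
    posterior = {name: (p if hypotheses[name](t) == observed else 0.0)
                 for name, p in belief.items()}
    total = sum(posterior.values())
    return {name: p / total for name, p in posterior.items()}

# The world actually follows the 'alternating' rule; the agent doesn't
# know that, but its probabilities concentrate on it as evidence comes in.
for t in range(3):
    belief = update(belief, t, t % 2)

print(belief)
```

Nothing in the toy world is random; the probabilities live entirely in the agent's representation of the ways it reckons the world might be, which is the sense in which such a model needs no prior concept of probability built into the world itself.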

Unfortunately, though, as far as we can tell such 'probability-free' basic models aren't adequate to the behaviour of the world that we find ourselves in, where there appear to be genuinely random events if you look on a small enough scale. Quantum mechanics forces us to once more build in randomness at the bottom. Actually, the situation is a little more complicated still, in a way that is so outlandish that I'll have to leave an explanation to a later post.

Deconstruction

A lot of the language Derrida originally used when introducing deconstruction appears to fall into this category. I can't comment for myself on later postmodernists, since I haven't looked into them, but I'm told that to a large extent they were trying for something a little different. There is, as far as I can tell, no current systematic understanding of why 'differance' or 'trace', for example, are sensibly nounish words. I can see that they are, but not in terms I can explain without resorting to the vocabulary of Derrida, and even that isn't well enough developed to explain it thoroughly. This linguistic oddity may turn out to be explicable in other terms eventually, for example with an adequate development of neuroscience, or it may not. It almost certainly won't be done whilst I'm still alive.
