Saturday 9 January 2010

Conceptual concretions

Rocks exist, and so do numbers. But there's something rather more concrete about the existence of rocks. At a basic level, we might explain this by saying that we can reach out and touch rocks, or bang them together; we can interact with them in a physical way. We can't do that with numbers. But there's something a little more subtle going on here; after all, I reckon that rocks in Melbourne exist concretely, even though I've never bothered to go there and juggle with them. Maybe I could do that. But the universe is full of rocks, and there's a limit to how many light years I can run before I get out of breath. My inability to visit them doesn't diminish the concreteness of their existence.

Well, perhaps what's going on is that, in principle, if I were in the vicinity I could heft those rocks. This comes closer to the truth; it shows that our understanding of concrete existence depends somehow on our understanding of how, in principle, the universe might turn out to be. That is, it depends on the structure of our understanding of how the universe is patterned: Our rudimentary physics, if you will. To help us navigate the day-to-day world, we all have built-in models of how the stuff we encounter might be expected to behave. Of course, our modelling kit is rather more sophisticated than that: It deals with the raw feeds from the nerves leading to our brains, and it's only after some serious processing that this raw data is made to fit models involving stuff behaving in various ways; what we are consciously aware of is always already neatly chunked like this. The models we use fit nicely with the language we use; some bits (the stuff) correspond to nouns and other bits (the behaviour) to verbs.

It is a little clearer what is going on if we look at the models we are more aware of; the ones we consciously make for ourselves. Nowadays, especially in the sciences, these are often expressed in terms of mathematics. Usually, there is some mathematical structure, in which we can perform computations, and which corresponds closely with the world, so that we can speak of those computations as being somehow about the world. More precisely, the mathematics is set up in such a way that the statements and equations it produces correspond in a sensible way with statements we might make about the world, and the rules for rearranging those statements and equations do a pretty good job of preserving truth. Under this correspondence, some bits of the mathematics correspond to nouns, some to verbs, and so on. It is the bits which correspond to nouns which are taken to `concretely exist' with respect to a particular model. The fact that the model works then underwrites their existence, in the sense I outlined here.
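To make this correspondence a little more tangible, here is a minimal sketch in Python (a toy of my own devising, not drawn from any particular scientific theory). The noun-like part of the model is the state assigned to a falling rock; the verb-like part is the rule that updates that state.

    # A toy model of `the rock falls'.
    g = 9.8  # gravitational acceleration, metres per second squared

    def fall(height, speed, dt):
        # One small step of falling: the verb, expressed as a state-update rule.
        return height - speed * dt, speed + g * dt

    # The rock itself appears only as a noun-like chunk of state:
    # ten metres up, initially at rest.
    height, speed = 10.0, 0.0
    for _ in range(5):
        height, speed = fall(height, speed, dt=0.1)
    print(height, speed)

Computations in the model correspond to statements about the world (how far the rock has dropped after half a second, say), and the update rule does a reasonable job of preserving truth about how real rocks behave.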

Of course, things are not quite so simple. First of all, there are lots of things that we think of as concretely existing, but which are a bit fuzzy round the edges. Clouds, for example. We can clearly see that they exist, but the question of just where a cloud ends and the rest of the sky begins is hard to answer precisely. This isn't a problem for our modelling of clouds, however, since we know that clouds are made up of tiny droplets of water, which certainly exist concretely and whose behaviour we can model pretty accurately. So the concrete existence of fuzzy things, like clouds, is understood in terms of the concrete existence of less fuzzy things, like water droplets.

However, the lack of fuzziness of water droplets is just a matter of scale. If you had the ability to shrink yourself down to a microscopic scale, and to look closely at the surface of a water droplet, you would see that the surface is also quite fuzzy, with water molecules near the surface constantly jiggling around and some of them flying off into the air, whilst others from the air rejoin the droplet. So we don't escape fuzziness by looking at the water droplets; maybe if we really want to understand the concrete existence of water droplets (and so of clouds) we need to understand the behaviour of the real concretely existing constituents: the water molecules.

This is the situation with most things we think of as concretely existing, even rocks. Our theory of them involves them being somehow constituted of smaller concretely existing things. From the behaviour of the smaller things, we can derive the behaviour of the larger things; so we can explain the concrete existence (to a good approximation) of the larger in terms of the smaller. The smaller things may be explained in terms of even tinier constituents, and so on down through several levels. We proceed in this way from biology to chemistry to physics.

What lies at the bottom of this downward chain? Maybe, at some level, there are tiny basic constituent particles of the universe whose concrete existence is not just approximate but perfect; they precisely obey mathematical laws which fit well with our grammatical distinction between nouns and verbs, assigning these particles the roles designated by the nouns. This would be the ideal kind of backing our ideas of concrete existence could have. In fact, it is hard to imagine how the universe could be any other way. Perhaps our imaginations could stretch to some simple variations; an infinitely descending or even periodic series. But it seems completely clear that the concrete reality at any level, if it is only approximate, must be backed up by a more detailed explanation involving the interactions of smaller concretely existing things.

Imagine, however, the following alternative scenario: we wish to find a sensible theory of microwidgets, which are taken as concretely existing items at some point in this chain of explanation - they can't quite be at the bottom, because careful experimentation has shown that they are a little fuzzy. We find, as usual, that the best way to explain the fuzziness in the behaviour of microwidgets is by means of a mathematical theory. Also as usual, by taking some small simplifying approximations in this mathematical theory, we can reduce it to a theory which mentions objects called microwidgets, behaving as microwidgets should (but without the fuzziness); so we can use this theory to understand why it is sensible, for the most part, to take microwidgets as concretely existing. The difference is that the new, `more accurate' theory does not fit the grammatical conventions of our natural language; it cannot be translated into statements about even tinier nanowidgets and their quirkily nanowidgerific behaviour. No part of the mathematics corresponds to nouns, or to verbs. It just doesn't fit our way of thinking.

My reference to grammatical conventions in the last paragraph isn't to things like `In sentences of such-a-kind the noun should come before the verb' but rather to conventions like `There are some words which are called nouns. They refer to things.' These conventions apply, so far as we know, to all kinds of natural human language, and grammar is the word used for these conventions by the linguists who study them. So, to repeat, in the scenario of the last paragraph it is these deep universal conventions which are irreconcilable with the mathematical theory in question.

Is this even possible? It is certainly hard to conceptualise how it could ever work out that way, and it doesn't seem to have been seriously considered as a possibility until it actually happened. But it has happened. The theory which has wonderful predictive power but does not share the grammar of our thought has been discovered, and accepted as the best current model of small-scale phenomena (though, of course, the word `phenomena' is no longer really appropriate). It is quantum mechanics.

There have been many attempts to say just what the concretely existing entities postulated by quantum mechanics are (it being assumed that there must be such entities). Unfortunately, all of them either fall foul of the mathematics or contort the idea of existence so seriously that concretely existing things no longer satisfy the grammar of nouns. However, this is not the only, or even the main, reason for concluding that the grammar of quantum mechanics and normal grammar are incompatible. It is simply a clue that might lead us to suspect that that is what is going on. It is possible to formalise what it might mean for a mathematical theory to fit our natural language, and to demonstrate that quantum mechanics does not satisfy the necessary formal criteria. See, for example, this paper (though I'm not sure the authors of that paper would agree with all that I've said here; it is the mathematical, rather than the philosophical, content of the paper which is relevant).
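As a small illustration of the flavour of such results, here is a standard piece of quantum mechanics expressed in Python (only a hint at the shape of the problem, not the argument of the paper just mentioned). In a classical model, observables are ordinary functions on a space of states of things, so the order in which you multiply them can never matter; the observables of quantum mechanics refuse to behave like that.

    import numpy as np

    # Two spin observables for a single spin-1/2 system (Pauli matrices).
    sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)
    sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)

    # Classically, observables are plain functions on a state space, so the
    # order of multiplication never matters. Here the two products differ
    # (in fact sigma_x @ sigma_z = -(sigma_z @ sigma_x)):
    print(np.allclose(sigma_x @ sigma_z, sigma_z @ sigma_x))  # prints False

Non-commutativity on its own does not settle the matter, but it is the first sign that the mathematics resists being read off as a list of definite properties attached to noun-like things.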

The striking conclusion of all this is that the deep grammar of our languages, the grammar on which concepts like concrete existence rely for their very meaningfulness, is contingent. It is appropriate for talking about things which are about the same size as us, and which move at a sedate speed. We should expect this, since our language evolved to help us cope with sedately moving things of a similar size to us. We have no reason to think it must fit what happens on smaller scales (though it does fit down to the atomic scale, where it begins to jar). Our reason for assuming that it had to be that way was that the grammar is so ingrained in our ways of thinking that we cannot conceptualise any way the world could be that wouldn't fit.

This conclusion does not depend on the fact that quantum mechanics is odd in the ways I have outlined, though quantum mechanics provided a helpful pointer to it. Even if the mathematics of quantum mechanics is revised or, by some miracle, found under some sufficiently cunning contortion to be compatible with natural language, this will not resolve the issue. For there doesn't seem to be any good reason to reject the possibility that it could have turned out in the way I have presented above. Our only reason for supposing that the world must be neatly divisible into concretely existing entities even on tiny scales appears to be that we are blinkered by the grammatical form of our own thought.