Should we talk about Art?

(my thanks to Wayne Redhart, whose comments on But is it REAL? Is Art a Joke? Five Funny Things stimulated this response)

Let us suppose two people – for ease of storytelling, we’ll make them a man and a woman, though that is not significant. They have become acquainted in the virtual world of social media and found a considerable commonality of feeling and outlook. Now they are meeting for the first time in the flesh, in an art gallery on the man’s home turf.

As they go round, we can observe a growing apprehension in the man, which he does his best to conceal, though it is evident in the tensing of his fingers and the covert looks he casts at the woman as she looks at the pictures. As they enter a particular room, his apprehension peaks. There is a painting there, and while the woman looks at it, the man looks at her, anxiously. The woman takes time to study the painting, then all at once, her face lights up, and she turns to the man with an expression of delight. His anxiety vanishes. He smiles and nods in affirmation, his expression a reflection of hers. The two return their attention to the painting, rapt. No words are spoken.

The painting, of course, is an old favourite of the man’s, one that he sets great store by, and he is worried that the woman will not ‘get it’ – but she does.

The original theme of this piece was to have been whether we should judge a work of art on its own merits – on what is contained within the frame, so to speak – or with reference to something outside itself (and I should make clear that I use ‘art’ in an inclusive sense here – not just paintings and sculpture, but music, poetry, stories, dance and so on), but on reflection I realised that this was bound up with another matter, namely how we talk about art and what we can say about it.

The key thing about the man and the woman in the gallery is that no words are exchanged, yet they come to an understanding – each knows what the other is thinking; you could say that they are of one mind – each recognises that the other ‘gets it’.

But in that curious expression – which I think we perfectly understand, though might struggle to explain – what is ‘it’ and how is it ‘got’? If you asked a group of people to mime ‘getting it’ and ‘not getting it’ I imagine there would be a considerable consistency of response: faces lighting up, smiles and affirmative gestures – nodding, for instance – on one hand; on the other, puzzled looks, head shaking, throwing up of hands, shrugging. An interesting variant might be where one party gets it and the other doesn’t.

The image of the frame – whether we stay within it or stray outside – is a useful one: it marks a significant boundary we should think carefully about crossing. Within it, we can only look, and look again (if it is a painting or anything visual), or read and reread, or listen and listen again – the only thing we can vary is how often we go back, and what we do in between, which may be very important – an obvious example is something that we could make neither head nor tail of in our youth – we just didn’t get it then – but which we come back to in later years and find that we do, now.

If we cross the boundary, step outside the frame, our tongues are loosened. This is a natural enough reaction, and in some respects the question I have used as a title is a fatuous one – should we talk about art? Try stopping us! Try stopping yourself! If we see or hear or read something that impresses us profoundly, the natural response is to tell someone – your friends or indeed complete strangers – so great is the pressure you feel to express it.

And it is here that complications arise, and I trace them back to my pet theme of Language*, and how it has come to dominate our thinking and all other forms of expression. Those of us who have had children or remember what it is like to be one will recognise the behaviour that comes in the wake of some great experience – the urge to give an account of it in every detail, generally at high speed, the words tumbling over one another into incoherence; the struggle to find words that are adequate to the huge wonder and marvel of it all, so that there is a succession of attempts that break off as a new and possibly better one occurs, only to be discarded in its turn; sometimes, indeed, the right words just cannot be found, and the child is, or becomes, speechless, and just grins and runs around.

All that strikes me as the right and proper human response to anything that impresses us in this way – a sort of incoherent joy which nevertheless sends a very clear message, sometimes summed up in the parent’s laconic response, ‘well, that was good, wasn’t it?’
In other words, all that we are seeing here is an extension of the wordless expressions of delight in the art gallery described above. The words are attempts to convey the magnitude of that delight which succeed, paradoxically, by their failure to express it adequately (and of course that is a formula we use when we are deeply moved, whether to joy or grief or gratitude – ‘there are no words to express how I feel’).

The problem is that while we allow children to run around babbling incoherently, we are less indulgent to adults. When the concert hall audience debouches into the foyer and there is a great buzz of people all talking at once – ‘amazing passion!’ ‘superb orchestral technique!’ ‘I loved that passage with the horns’ ‘it’s such a vivid piece, you can see it like a picture in your head’ ‘I adore Sibelius!’ ‘it’s so strenuous – in a good way, I mean’ – it is important to see that they are all really saying the same thing: ‘well, that was good, wasn’t it?’ and that their babble of talk is just an extension of the applause they gave the orchestra, continued by other means; the actual words do not matter.

But we have been brought up in the strong belief that language should be articulate, that it should express meaning coherently and precisely, that it should be something better than an incoherent exclamation of delight (that is part of the problem – we rather look down on incoherent exclamations of delight and reserve them for watching football and the like). So we try to find ‘an adequate form of words’ – and some people become rather good at it, and end up as critics in newspapers and magazines. And these articulate accounts create a false relation with the works of art they describe: they come to be seen as a necessary adjunct to them, a learned explanation, to which ordinary people should have recourse if they wish to understand the work. To some extent, they become a substitute for the work itself, and the critic replaces the artist as an authority – he is the one who decides what is good and what is not, what is admissible (to the salon, the gallery, the concert hall, the theatre, the syllabus) and what should be excluded.

Being able to speak (and write) about Art in a particular way becomes the mark of authority that others seek to imitate and go to university (NB not Art College) to learn (I speak as a veteran of Aesthetics and General Philosophy 1 & 2 at Edinburgh University). Unfortunately, this way of speaking is often associated with ‘cleverness’ (a greatly overrated trait) and can easily become a means of making people who have not learned it feel stupid and inadequate, afraid to open their mouths for fear of saying the ‘wrong’ thing, or embarrassed when their initial splurge of joy expresses itself in naive terms which some ‘clever’ person makes mock of (and the classic victim here is the person who tries but fails to imitate what they think is the right sort of thing to say, rather than the one who says ‘ken whit? that wis pure fuckin brilliant!’ or simply gives an inarticulate roar of joy).

Now I do not mean to condemn criticism out of hand: it can be informative, entertaining and educational. It can be (though it is not always) a delight to be with someone who can place a work of art in a tradition and make connections with other works and help you see or hear or read it better, get more out of it; but there is a real danger here, and it is deep-rooted.

I would put it like this: Language* is by its nature antithetic, indeed inimical, to art. It is like a foreign conqueror who bans the native tongue and insists that his own be adopted for all official and public use; if the native tongue is used at all, it must invariably be accompanied by a translation into the state language.

To understand why, we need to go back to the fifth paragraph:
‘The key thing about the man and the woman in the gallery is that no words are exchanged, yet they come to an understanding – each knows what the other is thinking; you could say that they are of one mind – each recognises that the other ‘gets it’.’

This is the point where a lot of philosophers will walk away, shaking their heads; I fancy that I might have, in my youth. ‘How can he know what she is thinking?’ they will protest. ‘Well, by the way she reacts – the look on her face. It is the same way that he himself reacts.’ This will not satisfy them. ‘But how can he be so sure that her look has the same cause as his? She might be thinking something completely different.’

The temptation here can be to insist – with a hint of asperity – ‘well, he just does.’ ‘O, by intuition I suppose,’ sneers the other, ‘sort of like telepathy, you mean?’

At which point you either have recourse to violence, and ‘cause him to be knocked down with blows,’ as Rabelais would put it, or else retreat, as Myles na gCopaleen would say, in that lofty vehicle, High Dudgeon.

But there is a better answer, though you might be as well to pin the philosopher against the wall, to ensure that he hears you out. So, seize him by the shoulders of his ill-fitting jacket, hoist him off his feet, press him against the wall and say,

‘Because they are human.’ (At this point you should probably lower him to the floor again, otherwise your arms will tire).

‘He is human and so is she. In the presence of the picture he experiences a particular feeling of delight, an emotional uplift, similar in kind to others he has felt, in the presence of Nature, or listening to music. He recognises that in some way the picture is the external corollary of this inner sensation and through it he feels connected not only with the artist but with everyone else who has looked at the picture and recognised the same thing – which includes the woman beside him’.

Now that the threat of immediate danger has receded, the philosopher is emboldened.

‘Ah, I see – now you are talking about feelings, but to start with you said that he knew what she was thinking. But you still haven’t convinced me that he knows that she feels the same – it’s a guess at best. He can’t be certain till she verifies it.’

‘And how do you suggest that he does that?’

‘Why, he should ask her.’

‘And what should she do?’

‘Give him an account of her feelings, of course. Though perhaps he should write an account of his own first, without showing it to her, so that they can make a genuine comparison.’

At this point, you should probably let him go, though you might just want to ask him if, when someone kisses him passionately, he asks ‘what did you mean by that?’

Wittgenstein asks somewhere the interesting question of how we know when we are imitating someone, e.g. making our face wear the same expression; we don’t do it by looking in a mirror. I would say we can do it because, where our fellow humans are concerned, we can infer the inside from the outside, and vice versa.

I’m not sure how well I have made my point, but I do notice that I have had to resort latterly to telling a story – a sort of non-Platonic dialogue – and I think that is part of what I am trying to say about the terms in which it is possible to explain something. The woman in our art gallery story might respond to the man’s painting by sending him a particular poem, to which he might reply with a passage of music, then she with a short story – and this might be a deeply enjoyable and intimate conversation between them, without any words of explanation from either side.

Like General MacArthur, I will return – but for now, enough.

*Language here means the literate form that is the basis of our thought and discourse. It is characterised by having a written form which dominates its spoken form.


Stone-sucking, or what matters


If you read this page aloud it will strike you that there is nothing in your speech corresponding to the white spaces on the page that separate the words. Word separation is not a feature of every script – some Asiatic ones do not use it even now – and it has been accomplished in different ways at different times; the Romans used dots or points (puncta – the origin of ‘punctuation’), while the Greeks, I think, originally used none, and at one time even wrote boustrophedon (literally ‘ox-turning’, or in the manner of ploughing a field) – i.e. the lines run alternately from left to right and then from right to left, perfectly logical in terms of eye movement:

[photograph: an example of boustrophedon writing]
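To make the layout concrete – this is a toy sketch of my own, not part of the original post, and no substitute for a photograph of a real inscription – here are a few lines of Python that reverse every second line of a text. A genuine boustrophedon inscription would also mirror the letter-forms themselves, which this simple reversal ignores.

```python
def boustrophedon(text: str) -> str:
    """Lay out text 'as the ox ploughs': odd lines run left to right,
    even lines right to left (here simply reversed character by character)."""
    lines = text.splitlines()
    return "\n".join(line if i % 2 == 0 else line[::-1]
                     for i, line in enumerate(lines))

print(boustrophedon("THE QUICK BROWN FOX\nJUMPS OVER THE LAZY DOG"))
# THE QUICK BROWN FOX
# GOD YZAL EHT REVO SPMUJ
```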

 

Word division, then, is plainly an aid to reading, an adjunct of the written form, with no counterpart in speech; and this raises interesting questions about words themselves. We might incline to think that we need no word division in speech because we ‘already know’ how to distinguish words, because, well, we know the individual words, which are stored in our vocabulary (or word-hoard, as the Anglo-Saxons called it) like so many building blocks or components ready for use whenever we wish to construct a sentence.

There may now be an element of truth in that, because two and a half millennia of literacy (interrupted by the Dark Ages in Western Europe, but continuous further East) has schooled us – literally – in the ways of educated speech, which is heavily influenced and indeed dominated by the written form. We are used (or at least my generation was) to learning other languages in a way that brings out their rule-governed nature – we have verbs laid out in tables that show the variations from first to third persons, and from singular to plural; we analyse individual words into roots that remain the same and endings and beginnings – prefixes and suffixes, or inflections – which vary according to case and so on; we learn rules for the order of pronouns (me te se before le la les before lui leur before y before en before the verb, if I recall). And of course we accumulate lists of vocabulary, learning individual words and their particular meanings.

All of this encourages us to think of language as a system of building blocks or individual components – words – which can be assembled in a variety of ways according to certain rules – grammar. Yet a little reflection will tell us that this analysis only became possible – or indeed necessary – with the development of the written form.

When speech was – as I have suggested before [Plucked from the Chorus Line, The Disintegration of Expression] – only one mode of expression among many (and quite likely not the most important) – then we had neither the means nor the need to analyse it in the way we take for granted now. We did not have the means because there was no method of giving speech objective form so that it could be studied and analysed; that only comes as a by-product of the invention of writing [as discussed in The Muybridge Moment]. A by-product, because we must remember that writing was not primarily devised as a means of transcribing speech, a need which our ancestors would not have felt – after all, we had been transmitting our culture orally (and by other means of expression) since the dawn of time, for hundreds of thousands, if not millions, of years.

The accidental nature of the whole concept of a written language and all that it entails – literacy, books, systematic text-based education, the whole basis of our modern way of life – is worth emphasising, to remind us that we managed for a long time without these things and did not feel in the least deprived or impoverished: it is perhaps the most significant example of what I have called an ‘elective indispensable’ – something we have managed very well without, then reoriented our way of life to make living without it inconceivable.

Before we were able to analyse language by studying its written form, we may have followed rules, but we did so unconsciously, by instinct, much as (say) indigenous Amazonian tribes will appear to observers to engage in rule-governed speech but would not (I guess) be able to say much about the rules they were following, or offer a grammatical analysis of their own tongue in the way that the observers (trained to look at things that way) could.

‘Trained to look at things that way’ is a key expression there. Do the observers see something that the native speakers overlook? That is a complex question, worthy of close attention. To walk with a trained geologist through a landscape is to see it with fresh eyes, and to learn a new and different way of looking at it; and to walk with an indigenous Australian through the landscape where he is at home would be similar, though the two would see quite different things. One way of putting it would be that they would see themselves as in two different stories about how they related to and understood the landscape; what strikes one as significant might be quite different from what strikes the other, so who is overlooking what?

What that comparison brings out is the extent to which we bring things to our analyses, rather than finding them there. An analogy might be going out equipped with a box divided into compartments of different shapes and sizes – the things you find to put in the different compartments are ‘already there’, but you have brought your system of categorisation with you; your principles of selection are decided beforehand. If you came instead with a number of equal-sized boxes, each lined with a different colour sample which you sought to match, you would end up with a wholly different selection and arrangement of things ‘already there’.

The underlying question is whether your system of categorisation corresponds to something objective, something we might be inclined to call ‘reality’. This seems to me a – or possibly the – fundamental philosophical question, and it reminds me of something that might at first seem wholly unconnected. I wonder if you will follow my leap?

What my mind leaps to – or leaps to my mind – is a passage from Samuel Beckett, in Molloy. I must thank my friend Stephanie Peppard (her blog, The Woman on a Yellow Bicycle, is worth a visit) for drawing it to my attention. I strongly commend reading it in full – http://www.samuel-beckett.net/molloy1.html.

or indeed you can hear it here (in a slightly varied text): https://www.youtube.com/watch?v=TXoq_H9BrTE

The gist of it is that Molloy, on visiting the seaside, lays in a store of pebbles, which he calls ‘sucking stones’. He likes to suck each stone in turn and is considerably exercised by how best he should arrange them about his person in order to facilitate this. Having four pockets and sixteen stones, he first considers an equitable distribution of four in each, so that when he draws from his ‘supply’ pocket (which we can call the first) for a stone to suck, he transfers a stone from the next, second, pocket to make up the deficiency, and so on, with the sucked stone eventually taking its place to make up the depleted numbers in the fourth pocket.

However, he soon hits a snag:
‘But this solution did not satisfy me fully. For it did not escape me that, by an extraordinary hazard, the four stones circulating thus might always be the same four. In which case, far from sucking the sixteen stones turn and turn about, I was really only sucking four, always the same, turn and turn about.’

In order to guarantee his principle of sucking each stone in turn, he tries various permutations, only to find that he has to sacrifice another cherished principle, that of having the stones in balance across his pockets:

‘Here then were two incompatible bodily needs, at loggerheads. Such things happen. But deep down I didn’t give a tinker’s curse about being off my balance, dragged to the right hand and the left, backwards and forwards. And deep down it was all the same to me whether I sucked a different stone each time or always the same stone, until the end of time. For they all tasted exactly the same. And if I had collected sixteen, it was not in order to ballast myself in such and such a way, or to suck them turn about, but simply to have a little store, so as never to be without. But deep down I didn’t give a fiddler’s curse about being without, when they were all gone they would be all gone, I wouldn’t be any the worse off, or hardly any. And the solution to which I rallied in the end was to throw away all the stones but one, which I kept now in one pocket, now in another, and which of course I soon lost, or threw away, or gave away, or swallowed …’
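Out of idle curiosity – and this is a playful sketch of my own, not anything in Beckett – the circulation scheme can be simulated. If each pocket is treated as a pile from which the topmost stone is always taken, Molloy’s ‘extraordinary hazard’ becomes a certainty: only four of the sixteen stones ever get sucked.

```python
# Sixteen distinguishable stones, four to a pocket (pockets 0 to 3).
pockets = [list(range(i * 4, i * 4 + 4)) for i in range(4)]
sucked = []

for _ in range(16):                  # sixteen rounds of sucking
    stone = pockets[0].pop()         # take a stone from the 'supply' pocket
    sucked.append(stone)
    for i in range(3):               # make up each deficit from the next pocket along
        pockets[i].append(pockets[i + 1].pop())
    pockets[3].append(stone)         # the sucked stone ends up in the last pocket

print(sorted(set(sucked)))           # only four distinct stones: [3, 7, 11, 15]
```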

This passage strikes me as a profound – and profoundly funny – insight into human behaviour: it captures the absurd rigour with which we observe self-imposed conventions, while all the time being aware ‘deep down’ that none of it matters, or rather only matters because we choose to make it matter. That last distinction is important: to read this as a commentary on the pointlessness of human behaviour is, I think, too bleak; it is more that what we do is self-validating – it matters because we make it matter. The underlying message is not that nothing matters, but rather that something does – though what that is, exactly, we are not sure; which is why we go on searching – or just go on.


Heart-thought

When I was young and studying philosophy at Edinburgh University I remember becoming excited about the figurative use of prepositions; they seemed to crop up everywhere, openly and in disguise as Latin prefixes, in uses that clearly were not literal. Reasoning from the fact that the meaning of any preposition could be demonstrated using objects and space, I concluded that a world of objects and space was implied in all our thinking, and that this might act as a limit on what and how we thought.


What strikes me about this now is not so much the idea as the assumptions on which it is based: I have made Language in its full-blown form my starting point, which is a bit like starting a history of transport with the motor-car. As I have suggested before, what we think of as ‘Language’ is a relatively recent development, arising from the invention of writing and the influence it has exerted on speech, simultaneously elevating it above all other forms of expression and subjugating it to the written form. It is the written form that gives language an objective existence, independent of human activity, and relocates ‘meaning’ from human activity (what Wittgenstein terms ‘language games’ or ‘forms of life’) to words themselves; and alongside this, it makes possible the systematic analysis of speech [as discussed in The Muybridge Moment].

In that earlier theory of mine I took for granted a number of things which I now think were mistaken. The first, as I have said, is that the milieu which gives rise to the figurative use of words is the developed form of language described above; that is to confuse the identification and definition of something with its origin, rather as if I were to suppose that a new species of monkey I had discovered had not existed before I found and named it.

Bound up with this is the model of figurative language which I assumed, namely that figurative use was derived from literal use and dependent upon it, and that literal use was prior and original – in other words, that we go about the world applying names like labels to what we see about us (the process of ‘ostensive definition’ put forward by St Augustine, and quoted by Wittgenstein at the start of his Philosophical Investigations) and only afterwards develop the trick of ‘transferring’ these labels to apply to other things (the word ‘metaphor’ in Greek is the direct equivalent of ‘transfer’ in Latin – both suggest a ‘carrying over or across’).

Points to note about this model are that it is logically derived and that it presents metaphorical thinking as an intellectual exercise – it is, as Aristotle describes it, ‘the ability to see the similarity in dissimilar things.’

The logic appears unassailable: clearly, if metaphor consists in transferring a word from its literal application and applying it elsewhere, so that the sense of the original is now understood as applying to the new thing, then the literal use must necessarily precede the metaphorical and the metaphorical be wholly dependent on and derived from it: to say of a crowd that it surged forward is to liken its action to that of a wave, but we can only understand this if we have the original sense of ‘surge’ as a starting point.

However, there is a difficulty here. It is evident that there can be no concept of literal use and literal meaning till there are letters, since the literal meaning of ‘literal’ is ‘having to do with letters’. Only when words can be written down can we have an idea of a correspondence between the words in the sentence and the state of affairs that it describes (what Wittgenstein in the Tractatus calls the ‘picture theory’ of language). If what we term metaphors were in use before writing was invented – and I am quite certain that they were – then we must find some other explanation of them than the ‘transfer model’ outlined above, with its assumption that literal use necessarily precedes metaphorical and the whole is an intellectual process of reasoned comparison.

The root of the matter lies in the fact already mentioned, that only with the invention of a written form does the systematic analysis of speech become possible, or indeed necessary. Before then (as I suggest in ‘The Disintegration of Expression‘) speech was one facet or mode of expression, quite likely not the most important (I would suggest that various kinds of body language, gesture and facial expression were possibly more dominant in conveying meaning). It was something that we used by instinct and intuition rather than conscious reflection, and it would always have been bound up with some larger activity, for the simple reason that there was no means of separating it (the nearest approach would be a voice speaking in the dark, but that is still a voice, with all the aesthetic qualities that a voice brings, and also by implication a person; furthermore, it is still firmly located in time, at that moment, for those hearers, in that situation. Compare this with a written sentence, where language for the first time is able to stand on its own, independent of space and time and not associated with any speaker).

In other words, when metaphor was first defined, it was in terms of a literate language, and was seen primarily as a use we make of words. (Given the definition supplied by Belloc’s schoolboy, that ‘a metaphor is just a long Greek word for a lie’, there is an illuminating parallel to be drawn here with lying, which might be defined as ‘making a false statement, one that is not literally true’. This again puts the focus on words, and makes lying primarily a matter of how words are used and what they mean. The words or the statement are seen as what is false, but actually it is the person – hence the old expression ‘the truth is not in him’. Deceit consists in creating a false appearance, in conveying a false impression: words are merely instrumental, and though certainly useful – as a dagger is for murder – are by no means necessary. We can lie by a look or an action; we can betray with a kiss.)

There is a great liberation in freeing metaphor from the shackles that bind it to literal language (and to logic, with which it is at odds, since it breaks at least two of the so-called ‘laws of thought’ – it violates the law of identity, which insists that ‘A is A’, by asserting that A is B, and by the same token, the law of contradiction, which insists that you cannot have A and not-A, by asserting that A is not-A). It allows us to see it from a wholly new perspective, and does away with the need to see it either as an intellectual act (‘seeing the similarity in dissimilars’) or as something that necessarily has to do with words or even communication; I would suggest that metaphor is primarily a way of looking at the world, and so is first and foremost a mode of thought, but one that operates not through the intellect and reason but through intuition and feelings.

To illustrate this, I would like to take first an example I came up with when I was trying to envisage how metaphor might have evolved. Two brothers, out in the bush, come on a lion, at a safe distance, so that they can admire its noble mien and powerful grace without feeling threatened. One brother smiles and says ‘mother!’ The other, after an initial look of puzzlement, nods his head in affirmation and laughs.

The explanation I furnished to accompany this is that their mother is a formidable and beautiful woman and that the first brother, seeing the lion, is reminded of her, and by naming her, invites his brother to make the same comparison that has already occurred to him, which he does after a moment’s puzzlement, and the two take pleasure in this new and unexpected – yet apt – use of the word.


I think that the focus here is wrong: it is still concerned to make metaphor about words, and to see it primarily as a way of communicating ideas.

I would now like to alter the story slightly. A man on his own in the bush catches sight of the lion (from a safe distance, as before). On seeing it, he is moved: the sight of it stirs him, fills him with a mixture of awe and delight. And it is not what he sees, but rather what he feels, that calls his mother to mind: the feeling that the lion induces in him is the same as he has felt in the presence of his mother. That is where the identification takes place, in the feeling: the outer circumstances might differ (the lion in the bush, his mother in the village) but the inner feeling is the same. If we think of an experience as combining an external objective component with an internal subjective one (and I am carefully avoiding any notion of cause and effect here) then the origin of metaphor lies in experiences where the external objective component differs but the internal subjective component is the same.

Why am I wary of saying ‘the sight of the lion causes the same feelings that the sight of his mother does’? Because it strikes me as what I would call a ‘mixed mode’ of thinking: it imports the notion of causality, a modern and analytic way of thinking, into an account of an ancient and synthetic way of thinking, thus imposing an explanation rather than simply describing. (This is difficult territory, because causality is so fundamental to all our explanations, based as they are on thinking that makes use of literate language as its main instrument.)

What I want to say is this: causal explanations impose a sequence – one thing comes first – the cause – and elicits the other, the effect. So if we stick with the man and the lion we would analyse it like this: ‘sense data arrive in the man’s brain through his eyes by the medium of light, and this engenders a physical response (spine tingling, hair standing on end, a frisson passing over the body) which the man experiences as a feeling of awe and delight.’

We can demonstrate by reason that the lion, or the sight of it, is the cause and the emotion the effect, because if we take the lion away (for instance, before the man comes on it) the man does not experience the emotion (although he may experience ‘aftershocks’ once it has gone, as he recalls the sight of it).

But there is a fault here. If we leave the lion but substitute something else for the man – an antelope, say, or a vulture – does it still have the same effect? It is impossible to say for sure, though we may infer something from how each behaves – the antelope, at the sight (and quite probably the scent) of the lion might bound away in the opposite direction, while the vulture (sensing the possibility of carrion near by or in the offing) might well move closer.

My point is that the analysis of cause and effect is rather more complex than I have presented it here, which is much as David Hume makes it out to be, with his analogy of one billiard ball striking another; as Schopenhauer points out, what causes the window to shatter is not the stone alone, but the fact of its being thrown with a certain force and direction combined with the brittleness of the glass (and if the stone is thrown by a jealous husband through his love rival’s window, then we might need to include his wife’s conduct and the construction he puts upon it in the causal mix). Change any one of these and the result is different.

My being human is as much a precondition for the feelings I experience in the presence of a lion as the lion is, and I think that this is a case where, as Wordsworth puts it, ‘we murder to dissect’ – it is much more enlightening to consider the experience as a single simultaneous event with, as I have suggested, an inner and an outer aspect that are effectively counterparts. So the lion is the embodiment of the man’s feelings but so is his mother, and the lion and his mother are identified by way of the feelings that both embody; and the feelings are in some sense the inner nature or meaning of both the lion and the mother (think here of all the songs and poetry and music that have been written where the lover tries to give expression to his feelings for his beloved). This interchangeability and the identity of different things or situations through a common feeling aroused in each case is the foundation of metaphor and, I think, the key ‘mechanism’ of Art.

(This has an interesting parallel with the philosophy of Schopenhauer, as expressed in the title of his work Die Welt als Wille und Vorstellung, variously translated as ‘The World as Will and Representation’ or ‘The World as Will and Idea’. In this he borrows from Eastern philosophy to present the world as having a dual aspect – objectively, as it appears to others, and subjectively, as it is in itself. Its objective aspect, Representation, is made known to us via our senses, and is the same world of Objects and Space with which this discussion began; we cannot by definition see what it is like in itself, since it only ever appears as object, but once we realise that we ourselves are objects in the ‘World as Representation’, we can gain a special insight by ‘turning our eyes inward’, as it were, and contemplating our own inner nature, which we know not by seeing but by being it.

And what do we find? For Schopenhauer, it is the Will; and the revelation is that this is not an individual will – my will as opposed to yours – it is the same Will that is the inner nature of everything, the blind will to exist, to come into being and to remain in being. (This bears a striking resemblance to the position advanced by evolutionary biologists such as Richard Dawkins, for whom humankind is effectively a by-product of our genetic material’s urge to perpetuate itself).)

I would diverge from Schopenhauer – and the evolutionary biologists – in their pessimistic and derogatory account of the inner nature of things, on two grounds. The first is that it makes us anomalous. Schopenhauer asserts that ‘in us alone, the Will comes to consciousness’ but is unable to explain why this should be so, while his only solution to the revelation that all things are just the urges of a blind and senseless will is effectively self-annihilation (not a course he chose to pursue himself, as it happens – he lived to be 72). There is a lack of humility here that I find suspect, a desire still to assert our uniqueness and importance in a senseless world. If the Will is indeed the inner nature of all things (and that is questionable) why should we consider ourselves the highest manifestation of it?


The second ground is the nature of the feelings that I describe, which are the opposite of pessimistic: they are uplifting – feelings of awe, elation and delight. There is a fashion nowadays for explaining everything in terms of genetic inheritance or evolutionary advantage (‘stress is a manifestation of the fight-or-flight reaction’, for instance, or any number of explanations which couch our behaviour in terms of advertising our reproductive potential) but I have yet to come across any satisfactory explanation in the same terms of why we should feel elated in the presence of beauty, whether it is a person, an animal, a landscape, the sea or (as Kant puts it) ‘the starry heavens over us*’. The characteristic feature of such experiences is ‘being taken out of yourself’ (which is what ‘ecstasy’ means), a feeling of exaltation or rapture, of temporarily losing any sense of yourself and feeling absorbed in some greater whole.

I would venture that this disinterested delight is the single most important aspect of human experience and is (in Kantian phrase) ‘worthy of all attention.’

*The full quotation is not without interest: “Two things fill the mind with ever new and increasing admiration and awe, the more often and steadily we reflect upon them: the starry heavens above me and the moral law within me. I do not seek or conjecture either of them as if they were veiled obscurities or extravagances beyond the horizon of my vision; I see them before me and connect them immediately with the consciousness of my existence.” (Critique of Practical Reason)


‘These great concurrences of things’


One of the main ideas I pursue here is that the invention of writing has radically altered the way we think, not immediately, but eventually, through its impact on speech, which it transforms from one mode of expression among many into our main instrument of thought, which we call Language, in which the spoken form is dominated by the written and meaning is no longer seen as embedded in human activity but rather as a property of words, which appear to have an independent, objective existence. (This notion is examined in the form of a fable here)

This means in effect that the Modern world begins in Classical Greece, about two and a half thousand years ago, and is built on foundations laid by Socrates, Plato and Aristotle; though much that we think of as marking modernity is a lot more recent (some would choose the Industrial Revolution, some the Enlightenment, some the Renaissance) the precondition for all of these – the way of seeing ourselves in the world which they imply – is, I would argue, the change in our thinking outlined above.

This naturally gives rise to the question of how we thought before, which is not a matter of merely historical interest, since we are not talking here about one way of thinking replacing another, but rather a new mode displacing and dominating the existing one, which nevertheless continues alongside, albeit in a low estate, a situation closely analogous to an independent nation that is invaded and colonised by an imperial power.

What interests me particularly is that this ancient mode of thought, being ancient – indeed, primeval – is instinctive and ‘natural’ in the way that speech is (and Language, as defined above, is not). Unlike modern ‘intellectual’ thought, which marks us off from the rest of the animal kingdom (something on which we have always rather plumed ourselves, perhaps mistakenly, as I suggested recently) this instinctive mode occupies much the same ground, and reminds us that what we achieve by great ingenuity and contrivance (remarkable feats of construction, heroic feats of navigation over great distances, to name but two) is done naturally and instinctively by ants, bees, wasps, spiders, swifts, salmon, whales and many others, as a matter of course.

So how does this supposed ‘ancient mode’ of thought work? I am pretty sure that metaphor is at the heart of it. Metaphor consists in seeing one thing in terms of another, or, if you like, in seeing something in the world as expressing or embodying your thought; as such, it is the basic mechanism of most of what we term Art: poetry, storytelling, painting, sculpture, dance, music, all have this transformative quality in which different things are united and seen as aspects of one another, or one is seen as the expression of the other – they become effectively interchangeable.

(a key difference between metaphorical thinking and analytic thinking – our modern mode – is that it unites and identifies where the other separates and makes distinctions – which is why metaphor always appears illogical or paradoxical when described analytically: ‘seeing the similarity in dissimilars’ as Aristotle puts it, or ‘saying that one thing is another’)

This long preamble was prompted by an odd insight I gained the other day when, by a curious concatenation of circumstances, I found myself rereading, for the first time in many years, John Buchan’s The Island of Sheep.

Now Buchan is easy to mock – the values and attitudes of many of his characters are very much ‘of their time’ and may strike us as preposterous, if not worse – but he knows how to spin a yarn, and there are few writers better at evoking the feelings aroused by nature and landscape at various times and seasons. He was also widely and deeply read, a classical scholar, and his popular fiction (which never pretended to be more than entertainment and generally succeeded) has a depth and subtlety not found in his contemporaries.

What struck me in The Island of Sheep were two incidents, both involving the younger Haraldsen. Haraldsen is a Dane from the ‘Norlands’ – Buchan’s name for the Faeroes. He is a gentle, scholarly recluse who has been raised by his father – a world-bestriding colossus of a man, a great adventurer – to play some leading part in an envisaged great revival of the ‘Northern Race’, a role for which he is entirely unfitted. He inherits from his father an immense fortune, in which he is not interested, and a vendetta or blood-feud which brings him into conflict with some ruthless and unscrupulous men.

Early in the book, before we know who he is, he encounters Richard Hannay and his son Peter John (another pair of opposites). They are out wildfowling and Peter John flies his falcon at an incoming skein of geese; it separates a goose from the flight and pursues it in a thrilling high-speed chase, but the goose escapes by flying low and eventually gaining the safety of a wood. ‘Smith’ (as Haraldsen is then known) is moved to tears, and exclaims
‘It is safe because it was humble. It flew near the ground. It was humble and lowly, as I am. It is a message from Heaven.’
He sees this as an endorsement of the course he has chosen to evade his enemies, by lying low and disguising himself.

Later, however, he takes refuge on Lord Clanroyden’s estate, along with Richard Hannay and his friends, who in their youth in Africa, when they were in a tight spot, had sworn an oath to old Haraldsen to look after his son. They attend a shepherd’s wedding and after the festivities there is a great set-to among the various sheepdogs, with the young pretenders ganging up to overthrow the old top dog, Yarrow, who rather lords it over them. The old dog fights his corner manfully but is hopelessly outnumbered; then, just as all seems lost, he turns from defence to attack and sallies out against his opponents with great suddenness and ferocity, scattering them and winning the day.


Again, Haraldsen is deeply moved:

‘It is a message to me,’ he croaked. ‘That dog is like Samr, who died with Gunnar of Lithend. He reminds me of what I had forgotten.’

He abandons his scheme of running and hiding and resolves to return to his home, the eponymous Island of Sheep, and face down his enemies, thus setting up the climax of the book (it’s not giving too much away to reveal that good triumphs in the end, though of course it’s ‘a dam’ close-run thing’).

Both these incidents have for me an authentic ring: I can well believe that just such ‘seeing as’ played a key role in the way our ancestors thought about the world and their place in it.

It is, of course, just the kind of thing that modern thinking labels ‘mere superstition’ but I think it should not be dismissed so lightly.

The modern objection might be phrased like this: ‘the primitive mind posits a ruling intelligence, an invisible force that controls the world and communicates through signs – bolts of lightning, volcanic eruptions, comets and other lesser but in some way striking events. The coincidence of some unusual or striking occurrence in nature with a human crisis is seen as a comment on it, and may be viewed (if preceded by imploration) as the answer to prayer. We know better: these are natural events with no connection to human action beyond fortuitous coincidence.’

The way I have chosen to phrase this illustrates a classic problem that arises when modern thinking seeks to give an account of ancient or traditional thinking – ‘primitive’ thinking, if you like, since I see nothing pejorative in being first and original. The notion of cause and effect is key to any modern explanation, so we often find that ‘primitive’ thinking is characterised by erroneous notions of causality – basically, a causal connection is supposed where there is none.

For instance, in a talk I heard by the philosopher John Haldane, he cited a particular behaviour known as ‘tree binding’, in which trees were wounded and bound as a way of treating human wounds – a form of what is called ‘sympathetic magic’, where another object acts as a surrogate for the person or thing we wish to affect (or, to be more precise, ‘wish to be affected’). An account of such behaviour in causal terms will always show it to be mistaken and apparently foolish – typical ‘primitive superstition’: ‘They suppose a causal connection between binding the tree’s wound and binding the man’s, and that by healing the one, they will somehow heal the other (which we know cannot work).’

But I would suggest that the tree-binding is not a mistaken scientific process, based on inadequate knowledge – it is not a scientific process at all, and it is an error to describe it in those terms. It is, I would suggest, much more akin to both prayer and poetry. The ritual element – the play-acting – is of central importance.

The tree-binders, I would suggest, are well aware of their ignorance in matters of medicine: they do not know how to heal wounds, but they know that wounds do heal; and they consider that the same power (call it what you will) that heals the wound in a tree also heals the wound in a man’s body. They fear that the man may die but hope that he will live, and they know that only time will reveal the outcome.

Wounding then binding the tree seems to me a ritual akin to prayer rather than a misguided attempt at medicine. First and foremost, it is an expression of hope, like the words of reassurance we utter in such cases – ‘I’m sure he’ll get better’. The tree’s wound will heal (observation tells them this) – so, too, might the man’s.

But the real power of the ritual, for me, lies in its flexibility, its openness to interpretation. It is a very pragmatic approach, one that can be tailored to suit any outcome. If the man lives, well and good; that is what everyone hoped would happen. Should the man die, the tree (now identified with him in some sense) remains (with its scar, which does heal). The tree helps reconcile them to the man’s death by showing it in a new perspective: though all they have now is his corpse, the tree is a reminder that this man was more than he seems now: he had a life, spread over time. Also, the continued survival of the tree suggests that in some sense the man, too, or something of him that they cannot see (the life or soul which the tree embodies) may survive the death of his body. The tree can also be seen as saying something about the man’s family (we have the same image ourselves in ‘family tree’, though buried some layers deeper) and how it survives without him, scarred but continuing; and by extension, the same applies to the tribe, which will continue to flourish as the tree does, despite the loss of an individual member.

And the tree ‘says’ all these things because we give it tongue – we make it tell a story, or rather we weave it into one that is ongoing (there are some parallels here to the notion of ‘Elective Causality’ that I discuss elsewhere). As I have argued elsewhere [‘For us, there is only the trying’] we can only find a sign, or see something as a sign, if we are already looking for one and already think in those terms. Haraldsen, in The Island of Sheep, is troubled about whether he has chosen the right course, and finds justification for it in the stirring sight of the goose evading the falcon; later, still troubled about the rightness of his course, he opts to change it, stirred by the sight of the dog Yarrow turning the tables on his opponents.

His being stirred, I think, is actually the key here. It would be an error to suppose that he is stirred because he sees the goose’s flight and the dog’s bold sally as ‘messages from heaven’; the reverse is actually the case – he calls these ‘messages from heaven’ to express the way in which they stir him. There is a moment when he identifies, first with the fleeing goose, then with the bold dog. What unites him with them in each case is what he feels. But this is not cause and effect, which is always a sequence; rather, this is parallel or simultaneous – the inner feeling and the outward action are counterparts, aspects of the same thing. A much closer analogy is resonance, where a plucked string or a struck bell sets up sympathetic vibration in another.

This is why I prefer Vita Sackville-West’s definition of metaphor to Aristotle’s: for him, metaphor is the ability to see the similarity in dissimilar things; for her (the quote is from her book on Marvell):

‘The metaphysical poets were intoxicated—if one may apply so excitable a word to writers so severely and deliberately intellectual—by the potentialities of metaphor. They saw in it an opportunity for expressing their intimations of the unknown and the dimly suspected Absolute in terms of the known concrete, whether those intimations related to philosophic, mystical, or intellectual experience, to religion, or to love. They were ‘struck with these great concurrences of things’’

A subject to which I shall return.


It’s not what you think

What do gorillas think about? Or hens?

‘A hen stares at nothing with one eye, then picks it up.’


 

(in looking up MacCaig’s line (from ‘Summer Farm’) just now I came across two curious comments on it:

‘Could refer to a weathervane as an inanimate hen only has one eye. “Nothing” refers to the wind and the weathervane is picking it up.
The one eye can also refer to one perspective.’

Hmm. Or it could be a beautifully observed and exact description of a hen, in characteristic action. Sometimes the surface is what matters)

This thought came to me when I was reflecting on something that happened yesterday. I was walking up Earl’s Dykes, a curiously-named side street in Perth, pondering the possible meanings and implications of two utterances I meant to write an article about; and it struck me that probably no other species on earth engaged in such speculations.

What do gorillas think about, if anything?


I have loved gorillas a long time – since my brother and I were small boys playing with plastic Britain’s models of them in our old plum tree – and my kind sister gave us a book by George Schaller, The Year of the Gorilla, about his time spent in the Virunga volcanoes observing Mountain Gorillas. It was published in 1964, so I suppose it must have been fifty years or so since that happened. Schaller was one of the first to counter the popular fictional image of the gorilla as a savage and dangerous monster with actual observation that it was gentle, shy, vegetarian and family-oriented, so his book is of great importance in establishing what has now become the mainstream opinion of these beautiful but sadly threatened creatures.

So I do not mean to be churlish in recalling a passage that has stuck with me, and I hope I am not being unfair in recollecting it from memory, since I do not have the book to hand. The gist of it was that Schaller at one point found himself in close proximity to a large group of gorillas; he and they were sheltering from a downpour (I think this is in the chapter titled ‘am I satyr or man?’). He found himself wondering much the same as my opening line: what was going on behind those watchful, somewhat wary eyes? Not much, was his conclusion, and I think there was a line that likened his companions to ‘rather dim relatives in fur coats’ (if that is not so, or my recollection is awry, I apologise).

My point in recalling this is to wonder whether we do well to plume ourselves on what we consider our unique and superior intellect; maybe we should take our singularity in this respect as a warning rather than a mark of distinction. The Hitchhiker’s Guide to the Galaxy (a work I enjoy but do not revere to the extent that some do) proposes (if I recall correctly) that humans are only third in intellectual attainment on our planet, behind mice and dolphins. This is satire, of course, but for me it does not strike quite the right note; I increasingly wonder if our reverence of intellectual attainment is not itself the problem.

Schaller’s gorillas sitting somewhat dolefully in the rain (they are prone to colds and pulmonary ailments) or dappled with sunlight as they feed at leisure may well have no mental preoccupations whatever – but is that not something to be envied rather than despised? Do they not attain effortlessly that same absorption in the moment, that pure existence in the present, that is the aim of meditation, which we humans attain only* through rigorous discipline, quieting the mind with mantras and controlling the body through physical training?

Maybe it is the surface that matters. We have much to unlearn.

*I am in error here, of course: we can attain it by various means – drawing, painting, making music or listening to it.

 


In the beginning was the word… or was it?


Reflecting on the origin of words leads us into interesting territory. I do not mean the origin of particular words, though that can be interesting too; I mean the notion of words as units, as building blocks into which sentences can be divided.

How long have we had words? The temptation is to say ‘as long as we have had speech’ but when you dig a bit deeper, you strike an interesting vein of thought.

As I have remarked elsewhere [see ‘The Muybridge Moment‘] it seems unlikely that there was any systematic analysis of speech till we were able to write it down, and perhaps there was no need of such analysis. Certainly a great many of the things that we now associate with language only become necessary as a result of its having a written form: formal grammar, punctuation, spelling – the three things that probably generate the most unnecessary heat – are all by-products of the introduction of writing.

The same could be said of words. Till we have to write them down, we have no need to decide where one word ends and another begins: the spaces between words on a page do not reflect anything that is found in speech, where typically words flow together except where we hesitate or pause for effect. We are reminded of this in learning a foreign language, where we soon realise that listening out for individual words is a mistaken technique; the ear needs to attune itself to rhythms and patterns and characteristic constructions.

So were words there all along just waiting to be discovered? That is an interesting question. Though ‘discovery’ and ‘invention’ effectively mean the same, etymologically (both have the sense of ‘coming upon’ or ‘uncovering’), we customarily make a useful distinction between them – ‘discovery’ implies pre-existence – so we discover buried treasure, ancient ruins, lost cities – whereas ‘invention’ is reserved for things we have brought into being, that did not previously exist, like bicycles and steam engines. (An idea also explored in Three Misleading Oppositions, Three Useful Axioms.)

So are words a discovery or an invention?

People of my generation were taught that Columbus ‘discovered’ America, though even in my childhood the theory that the Vikings got there earlier had some currency; but of course in each case they found a land already occupied, by people who (probably) had arrived there via a land-bridge from Asia, or possibly by island-hopping, some time between 42,000 and 17,000 years ago. In the same way, Dutch navigators ‘discovered’ Australia in the early 17th century, though in British schools the credit is given to Captain Cook in the late 18th century, who actually only laid formal claim in the name of the British Crown to a territory that Europeans had known about for nearly two centuries – and its indigenous inhabitants had lived in for around five hundred centuries.

In terms of discovery, the land-masses involved predate all human existence, so they were there to be ‘discovered’ by whoever first set foot on them, but these later rediscoveries and colonisations throw a different light on the matter. The people of the Old World were well used to imperial conquest as a way of life, but that was a matter of the same territory changing hands under different rulers; the business of treating something as ‘virgin territory’ – though it quite plainly was not, since they found people well-established there – is unusual, and I think it is striking where it comes in human, and particularly European, history. It implies an unusual degree of arrogance and self-regard on the part of the colonists, and it is interesting to ask where that came from.

Since immigration has become such a hot topic, there have been various witty maps circulating on social media, such as this one showing ‘North America prior to illegal immigration’.

The divisions, of course, show the territories of the various peoples who lived there before the Europeans arrived, though there is an ironic tinge lent by the names by which they are designated, which for the most part are anglicised. Here we touch on something I have discussed before  [in Imaginary lines: bounded by consent]  – the fact that any political map is a work of the imagination, denoting all manner of territories and divisions that have no existence outside human convention.

Convention could be described as our ability to project or impose our imagination on reality; as I have said elsewhere [The Lords of Convention] it strikes me as a version of the game we play in childhood, ‘let’s pretend’ or ‘make-believe’ – which is not to trivialise it, but rather to indicate the profound importance of the things we do in childhood, by natural inclination, as it were.

Are words conventions, a form we have imposed on speech much as we impose a complex conventional structure on a land-mass by drawing it on a map? The problem is that the notion of words is so fundamental to our whole way of thinking – may, indeed, be what makes it possible – that it is difficult to set them aside.

That is what I meant by my comment about the arrogance and self-regard implied in treating America and Australia as ‘virgin territory’ – it seems to me to stem from a particular way of thinking, and that way of thinking, I suggest, is bound up with the emergence of words into our consciousness, which I think begins about two and a half thousand years ago, and (for Europeans at least) with the Greeks.

I would like to offer a model of it which is not intended to be historical (though I believe it expresses an underlying truth) but is more a convenient way of looking at it. The years from around 470 to 322 BC span the lives of three men: the first, Socrates, famously wrote nothing, but spoke in the market place to whoever would listen; we know of him largely through his pupil, Plato. It was on Plato’s pupil, Aristotle, that Dante bestowed the title ‘maestro di color che sanno’ – master of those that know.

This transition, from the talking philosopher to the one who laid the foundations of all European thought, is deeply symbolic: it represents the transition from the old way of thought and understanding, which was inseparable from human activity – conversation, or ‘language games’ and ‘forms of life’ as Wittgenstein would say – to the new, which is characteristically separate and objective, existing in its own right, on the written page.

The pivotal figure is the one in the middle, Plato, who very much has a foot in both camps, or perhaps more accurately, is standing on the boundary of one world looking over into another newly-discovered. The undoubted power of his writing is derived from the old ways – he uses poetic imagery and storytelling (the simile of the cave, the myth of Er) to express an entirely new way of looking at things, one that will eventually subjugate the old way entirely; and at the heart of his vision is the notion of the word.

Briefly, Plato’s Theory of Forms or Ideas can be expressed like this: the world has two aspects, Appearance and Reality; Appearance is what is made known to us by the senses, the world we see when we look out the window or go for a walk. It is characterised by change and impermanence – nothing holds fast, everything is always in the process of changing into something else, a notion of which the Greeks seemed to have a peculiar horror; in the words of the hymn, ‘change and decay in all around I see’.

Reality surely cannot be like that: Truth must be absolute, immutable (it is important to see the part played in this by desire and disgust: the true state of the world surely could not be this degrading chaos and disorder where nothing lasts). So Plato says this: Reality is not something we can apprehend by the senses, but only by the intellect. And what the intellect grasps is that beyond Appearance, transcending it, is a timeless and immutable world of Forms or Ideas. Our senses make us aware of many tables, cats, trees; but our intellect sees that these are but instances of a single Idea or Form, Table, Cat, Tree, which somehow imparts to them the quality that makes them what they are, imbues them with ‘tableness’ ‘catness’ and ‘treeness’.

This notion beguiled me when I first came across it, aged fourteen. It has taken me rather longer to appreciate the real nature of Plato’s ‘discovery’, which is perhaps more prosaic (literally) but no less potent. Briefly, I think that Plato has discovered the power of general terms, and he has glimpsed in them – as an epiphany, a sudden revelation – a whole new way of looking at the world; and it starts with being able to write a word on a page.

Writing makes possible the relocation of meaning: from being the property of a situation, something embedded in human activity (‘the meaning of a word is its use in the language’) meaning becomes the property of words, these new things that we can write down and look at. The icon of a cat or a tree resembles to some extent an actual cat or tree but the word ‘cat’ looks nothing like a cat, nor ‘tree’ like a tree; in order to understand it, you must learn what it means – an intellectual act. And what you learn is more than just the meaning of a particular word – it is the whole idea of how words work, that they stand for things and can, in many respects, be used in their stead, just as the beads on an abacus can be made to stand for various quantities. What you learn is a new way of seeing the world, one where its apparently chaotic mutability can be reduced to order.

Whole classes of things that seem immensely varied can now be subsumed under a single term: there is a multiplicity of trees and cats, but the one word ‘tree’ or ‘cat’ can be used to stand for all or any of them indifferently. Modelled on that, abstract ideas such as ‘Justice’ ‘Truth’ and ‘The Good’ can be seen standing for some immutable, transcendent form that imbues all just acts with justice and so on. Plato’s pupil Aristotle discarded the poetic clothing of his teacher’s thought, but developed the idea of generalisation to the full: it is to him that we owe the system of classification by genus and species and the invention of formal logic, which could be described as the system of general relations; and these are the very foundation of all our thinking.

In many respects, the foundations of the modern world are laid here, so naturally these developments are usually presented as one of mankind’s greatest advances. However, I would like to draw attention to some detrimental aspects. The first is that this new way of looking at the world, which apprehends it through the intellect, must be learned. Thus, at a stroke, we render natural man stupid (and ‘primitive’ man, to look ahead to those European colonisations, inferior, somewhat less than human). We also establish a self-perpetuating intellectual elite – those who have a vested interest in maintaining the power that arises from a command of the written word – and simultaneously exclude and devalue those who struggle to acquire that command.

The pernicious division into ‘Appearance’ and ‘Reality’ denigrates the senses and all natural instincts, subjugating them to and vaunting the intellect; and along with that goes the false dichotomy of Heart and Head, where the Head is seen as being the Seat of Reason, calm, objective, detached, which should properly rule the emotional, subjective, passionate and too-easily-engaged Heart.

This, in effect, is the marginalising of the old way of doing things that served us well till about two and a half thousand years ago, which gave a central place to those forms of expression and understanding which we now divide and rule as the various arts, each in its own well-designed box: poetry, art, music, etc. (a matter discussed in fable form in Plucked from the Chorus Line)

So what am I advocating? that we undo all this? No, rather that we take a step to one side and view it from a slightly different angle. Plato could only express his new vision of things in the old way, so he presents it as an alternative world somewhere out there beyond the one we see, a world of Ideas or Forms, which he sees as the things words stand for, what they point to – and in so doing, makes the fatal step of discarding the world we live in for an intellectual construct; but the truth of the matter is that words do not point to anything beyond themselves; they are the Platonic Forms or Ideas: the Platonic Idea of ‘Horse’ is the word ‘Horse’. What Plato has invented is an Operating System; his mistake is in thinking he has discovered the hidden nature of Reality.

What he glimpsed, and Aristotle developed, and we have been using ever since, is a way of thinking about the world that is useful for certain purposes, but one that has its limitations. We need to take it down a peg or two, and put it alongside those other, older operating systems that we are all born with, which we developed over millions of years. After all, the rest of the world – animal and vegetable – seems to have the knack of living harmoniously; we are the ones who have lost it, and now threaten everyone’s existence, including our own; perhaps it is time to take a fresh look.

Filed under language-related, philosophy

The Lords of Convention

‘The present king of France is bald’ seems to present a logical problem that ‘the cat is on the table’ does not – there is no present king of France, so how can we assert that he is bald? and is the sentence true or false?

But I am much more interested in the second sentence: ‘the cat is on the table’ – what does it mean?

(‘Cat on a Table’ by John Shelton, 1923-1993)

Can it mean, for instance, ‘it’s your cat, I hold you responsible for its behaviour’?

Consider:

Scene: a sunny flat. A man sprawls at ease on the sofa. To him, from the neighbouring room, a woman.

Woman: The cat is on the table.

(Man rolls his eyes, sighs, gets up reluctantly)

Should you want to grasp the difference between the philosophy of the early Wittgenstein, as expressed in the Tractatus Logico-Philosophicus, and his later philosophy, as expressed in the Philosophical Investigations (and I accept that not everyone does), then this example epitomises it. It also pins down – or at least, develops further – thoughts I have been having lately about meaning, objectivity and the impact of the invention of writing on thought.

The form of the question in the second paragraph above is curious: ‘what does it mean?’ – where ‘it’ refers to the sentence. The clear implication is that meaning is a property of the sentence, of words – an assertion that may not strike us as strange, till we set it alongside another that we might ask – ‘what do you mean?’

I would suggest that the first question only becomes possible once language has a written form: before that, no-one would think to ask it, because there would be no situation in which you could come across words that were not being spoken by someone in a particular situation – such as the scene imagined above. Suppose we alter it slightly:

Woman: The cat is on the table.
Man: What do you mean?
Woman: What do you mean, what do I mean? I mean the cat is on the table.
Man: What I mean is, the cat is under the sideboard, eating a mouse – look!

The words spoken here all have their meaning within the situation, as it were (what Wittgenstein would call the Language Game or the Form of Life) and the question of their having their own, separate meaning simply does not arise; if we seek clarification, we ask the person who spoke – the meaning of the words is held to be something they intend (though it is open to interpretation, since a rich vein of language lies in saying one thing and meaning another, or meaning more than we say – just as in our little scene, the line about the cat is far less a description of an event than an implied criticism of the owner through the behaviour of his pet – which in turn is probably just a token of some much deeper tension or quarrel between the two).

Only when you can have words written on a page, with no idea who wrote them or why, do we start to consider that the meaning might reside in the words themselves, that the sentence on the page might mean something of itself, without reference to anything (or anyone) else.

This relocation of meaning – from the situation where words are spoken, to the words themselves – is, at the very least, a necessary condition of Western philosophy, by which I mean the way of thinking about the world that effectively starts with Plato and stretches all the way to the early Wittgenstein, whose Tractatus can be viewed as a succinct summary of it, or all that matters in it;  and perhaps it is more than a necessary condition – it may be the actual cause of Western philosophy.

The crucial shift, it seems to me, lies in the objectification of language, and so of meaning, which becomes a matter of how words relate to the world, with ourselves simply interested bystanders; and this objectification only becomes possible, as I have said, when speech is given an objective form, in writing.

If you were inclined to be censorious, you might view this as an abnegation of responsibility: we are the ones responsible for meaning, but we pass that off on language – ‘not us, guv, it’s them words wot done it.’ However, I would be more inclined to think of it as an instance of that most peculiar and versatile human invention, the convention. Indeed, a convention could be defined as an agreement to invest some external thing with power, or rather to treat it as if it had power – a power that properly belongs to (and remains with) us.

(The roots of convention are worth thinking about. I trace them back to childhood, and the game of ‘make-believe’ or ‘let’s pretend’ which demonstrates a natural facility for treating things as if they existed (imaginary friends) or as if they have clearly defined roles and rules they must follow (the characters in a game a child plays with dolls and other objects it invests with life and character). Is it any wonder that a natural facility we demonstrate early in childhood (cp. speech) should play an important part in adult life? In fact, should we not expect it to?)

It is convenient to act as if meaning is a property of words, and is more or less fixed (and indeed is something we can work to clarify and fix, by study). It facilitates rapid and efficient thought, because if words mean the things they denote, then we can, in a sense, manipulate the world by manipulating words; and this is especially so once we have mastered the knack of thinking in words, i.e. as a purely mental act, without having to write or read them in physical form.

We can perhaps appreciate the power of this more fully if we consider how thinking must have been done before – and though this is speculation, I think it is soundly based. I would argue that before the advent of writing no real analysis of speech was possible: we simply lacked any means of holding it still in order to look at it. An analytic approach to language sees it as something built up from various components – words of different sorts – which can be combined in a variety of ways to express meaning. It also sees it as something capable of carrying the whole burden of expression, though this is a species of circular argument – once meaning is defined as a property of words, then whatever has meaning must be capable of being expressed in words, and whatever cannot be expressed in words must be meaningless.

Without the analytic approach that comes with writing, expression is something that a person does, by a variety of means – speech, certainly, but also gesture, facial expression, bodily movement, song, music, painting, sculpture. And what do they express? in a word, experience – that is to say, the fact of being in the world; expression, in all its forms, is a response to Life (which would serve, I think, as a definition of Art).

Such expression is necessarily subjective, and apart from the cases where it involves making a physical object – a sculpture or painting, say – it is inseparable from the person and the situation that gives rise to it. Viewed from another angle, it has a directness about it: what I express is the result of direct contact with the world, through the senses – nothing mediates it  (and consider here that Plato’s first step is to devalue and dismiss the senses, which he says give us only deceptive Appearance; to perceive true Reality, we must turn to the intellect).

Compare that with what becomes possible once we start thinking in words: a word is a marvel of generalisation – it can refer to something, yet has no need of any particular detail – not colour, size, shape or form: ‘cat’ and ‘tree’ can stand indifferently for any cat, any tree, and can be used in thought to represent them, without resembling them in any respect.

‘A cat sat on a table under a tree’

might be given as a brief to an art class to interpret, and might result in twenty different pictures; yet the sentence would serve as a description of any of them – it seems to capture, in a way, some common form that all the paintings share – a kind of underlying reality of which each of them is an expression; and that is not very far off what Plato means when he speaks of his ‘Forms’ or ‘Ideas’ (or Wittgenstein, when he says ‘a logical picture of facts is a thought’ (T L-P 3) ).

While this way of thinking – I mean using words as mental tokens, language as an instrument of thought – undoubtedly has its advantages (it is arguably the foundation on which the modern world is built), those advantages have been purchased at a price: the distancing and disengagement from reality, which is mediated through language, and the exclusion of all other forms of expression as modes of thought (effectively, the redefinition of thought as ‘what we do with language in our heads’); the promotion of ‘head’ over ‘heart’ by the suppression of the subject and the denigration of subjectivity (which reflects our actual experience of the world) in favour of objectivity, which is a mere convention, an adult game of make-believe –

all this points to the intriguing possibility, as our dissatisfaction grows with the way of life we have thus devised, that we might do it differently if we chose, and abandon the tired old game for a new one.

Filed under philosophy