
St. Anselm and the Blackbird

 


Blackbird

Its eye a dark pool
in which Sirius glitters
and never goes out.
Its melody husky
as though with suppressed tears.
Its bill is the gold
one quarries for amid
evening shadows. Do not despair
at the stars’ distance. Listening
to blackbird music is
to bridge in a moment chasms
of space-time, is to know
that beyond the silence
which terrified Pascal
there is a presence whose language
is not our language, but who has chosen
with peculiar clarity the feathered
creatures to convey the austerity
of his thought in song.

– R.S. Thomas

St Anselm was Archbishop of Canterbury and lived from 1033 to 1109 at the start of the intellectual renaissance that the High Middle Ages brought to Western Europe. It was a period of great intellectual ferment, an Age of both Faith and Reason, when the best minds of the day applied with passionate curiosity the learning they were rediscovering to the big topic of the day: God.

It takes some effort of the imagination in this secular age to realise that for the mediaeval mind, Theology was the Queen of Sciences, as exciting in its day as quantum physics is now. ‘What is God?’ is the question that the greatest of the mediaevals – one of the greatest intellects ever, Thomas Aquinas – asked at an early age and pursued the rest of his life.

The learning they were rediscovering had two principal strands, both of which had been kept alive elsewhere, since the Eastern Empire, centred on Constantinople, continued after the Western one, centred on Rome, had fallen, though it was latterly encroached upon more and more by a new intellectual and religious power to the east and south: Islam.

The most immediately accessible strand, because it was written in Latin, was the Neoplatonism of the late Roman period, whose most notable exponent was Augustine of Hippo. Platonism, with its notion of a transcendent Reality composed of eternal, immutable Forms and a vision of Truth as a brilliant sun that is the source of all wisdom, is a good fit for Christianity – so little is needed to reconcile them that Plato (with the Christ-like Socrates as his literary mouthpiece) can seem almost a pagan prophet of Christianity.

The second strand was more difficult, because it took a circuitous route from the Greek-speaking Eastern empire through the Arabic of Islamic scholars (Avicenna and Averroes, principally) before being translated into Latin where the two cultures met in Spain. This second strand centred chiefly on the writings of Plato’s pupil, one of the greatest minds of any age, Aristotle.

It was Aquinas who met the challenge of reconciling this new influx of pagan (and heretical) thought with Catholic teaching, and did so with such effect that he remains to this day the chief philosopher of the Catholic Church, with the Summa Theologica as his principal work*.

This period marks the second beginning of Western thought; its first beginning had been some thirteen centuries previously with the Classical Age of Greece, and the two giants, Plato and Aristotle. It is important to realise that what might seem at first glance a recovery of ancient wisdom was in reality nothing of the sort: it was the rediscovery of a new and startling way of looking at things, one that displaced and subjugated the traditionally accepted way of understanding our relation to the world that had held since time immemorial.

What made this new way of thought possible was the written word. For the first time, it was possible to separate one of the elements of human expression, speech, from the larger activity of which it was part, and give it what appeared to be an independent and objective form. This did not happen at once; indeed, it took about three thousand years from the invention of writing, around 5500 years ago, to the realisation of its potential in Classical Greece.

The word written on the page is the precondition of the relocation of meaning: from being a property of situations, inseparable from human activity and conveyed by a variety of methods, such as facial expression, gesture, bodily posture, with speech playing a minor role, meaning now becomes the property of words, and is deemed, by implication, to exist independently and objectively, and to be more or less fixed.

This one change is the foundation of modern thought: it is what allows Plato, with breathtaking audacity, to reverse the relation between the intellect and the senses and proclaim that what the senses tell us is mere Appearance, and that Reality is apprehended by the intellect – and consists of the world viewed from a general aspect: effectively, through the medium of language. It is the beginning of a world-view that casts us as detached spectators of an independent objective reality, a world-view that cannot be acquired naturally and instinctively, but only through a prolonged process of education, based on literacy.

When, some thirteen centuries later, Anselm devises his ‘ontological proof’ for the existence of God, it is squarely within this intellectual framework erected by Plato and Aristotle:
‘[Even a] fool, when he hears of … a being than which nothing greater can be conceived … understands what he hears, and what he understands is in his understanding.… And assuredly that, than which nothing greater can be conceived, cannot exist in the understanding alone. For suppose it exists in the understanding alone: then it can be conceived to exist in reality; which is greater.… Therefore, if that, than which nothing greater can be conceived, exists in the understanding alone, the very being, than which nothing greater can be conceived, is one, than which a greater can be conceived. But obviously this is impossible. Hence, there is no doubt that there exists a being, than which nothing greater can be conceived, and it exists both in the understanding and in reality.’

This is straightforward enough, if you take your time and attend to the punctuation: the expression ‘that than which nothing greater can be conceived’ is Anselm’s definition of God, and even a simpleton, he says, can understand it. But to exist in reality is better than to exist merely in the imagination, so a God that exists in reality is greater than one that exists only in the imagination; and if God is that than which nothing greater can be conceived, then God must exist in reality, because that leaves no room, as it were, to conceive of anything greater.
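The reductio at the heart of the passage can be set out in more modern dress. What follows is a loose formalization of my own (Anselm, of course, used no such notation), writing $g$ for ‘that than which nothing greater can be conceived’:

```latex
% A loose sketch of Anselm's reductio (my notation, not his).
% U(x): x exists in the understanding.  R(x): x exists in reality.
\begin{enumerate}
  \item $U(g)$ -- even the fool understands the phrase.
  \item Suppose, for reductio, $U(g) \land \neg R(g)$.
  \item Premise: for any $x$, if $U(x) \land \neg R(x)$, then something
        greater than $x$ can be conceived (namely, $x$ existing in reality).
  \item From 2 and 3: something greater than $g$ can be conceived.
  \item But, by the definition of $g$, nothing greater than $g$ can be conceived.
  \item 4 and 5 contradict; so the supposition in 2 fails, and $R(g)$.
\end{enumerate}
```

Most later criticism, Kant’s in particular, attacks step 3, denying that existence is a predicate that can make one conceivable thing ‘greater’ than another.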

Much has been said and written about this argument since it was first made over 900 years ago, but I want to concentrate on a single aspect of it, which is the continuity it implies between the human understanding and reality. To use an image, if we conceive the intellect as a skyscraper, then by taking the lift to its utmost height and climbing, so to speak, onto the roof, we arrive at Reality, the only thing that is higher than the height of our understanding.

This is what leads us to suppose – via the notion that we are created in the image and likeness of God – that God must be the perfection of all that is best in us; and if we esteem our intellectual faculties above all else (as, in the ‘West’, we seem to do) then God must be the supreme intellect.

This presents a problem, one that has considerable force in arguments against the existence of God: though a lesser intellect cannot fully comprehend a greater one, they share a great deal of common ground, and the greater intellect can certainly attune itself to the capacity of the lesser: this is a familiar case (though not always!) between adult and child, or teacher and pupil. Why, then, does God not deal directly with us at our intellectual level? Why doesn’t God speak our language? He surely would, if he could; yet he must be able to, since he is God – so the fact that he does not makes him appear either perverse (like a parent playing a cruel sort of game where he pretends not to be there, and does not answer when his child calls out to him, though he may do something that indirectly suggests his presence, like throwing a ball or making the bushes move) or absent, since he would if he could, but does not.

Thomas’s poem is an answer to this conundrum, though it is not a comfortable one. Perhaps our assumption that reality is at the top of the skyscraper is an error: maybe it is outside, at ground level. Maybe God speaks to us all the time, but we do not recognise the fact, because ‘God’ is quite other than we suppose, and cannot be contained in the intellectual framework that Plato and Aristotle have bequeathed to us.

This would explain on the one hand why religion – in its broadest sense – is bound up with immemorial ritual (which belongs to the world before Plato and Aristotle) and on the other why, in an age that puts its confidence in intellect and reason – the ‘new thinking’ that Plato and Aristotle invented, not so very long ago in terms of our earthly existence – God is proving increasingly difficult to find.

*in the context of this piece, it is worth recalling that Aquinas on his deathbed said that his work now seemed ‘all so much straw.’



Heart-thought

When I was young and studying philosophy at Edinburgh University I remember becoming excited about the figurative use of prepositions; they seemed to crop up everywhere, openly and in disguise as Latin prefixes, in uses that clearly were not literal. Reasoning from the fact that the meaning of any preposition could be demonstrated using objects and space, I concluded that a world of objects and space was implied in all our thinking, and that this might act as a limit on what and how we thought.


What strikes me about this now is not so much the idea as the assumptions on which it is based: I have made Language in its full-blown form my starting point, which is a bit like starting a history of transport with the motor-car. As I have suggested before, what we think of as ‘Language’ is a relatively recent development, arising from the invention of writing and the influence it has exerted on speech, simultaneously elevating it above all other forms of expression and subjugating it to the written form. It is the written form that gives language an objective existence, independent of human activity, and relocates ‘meaning’ from human activity (what Wittgenstein terms ‘language games’ or ‘forms of life’) to words themselves; and alongside this, it makes possible the systematic analysis of speech [as discussed in The Muybridge Moment].

In that earlier theory of mine I took for granted a number of things which I now think were mistaken. The first, as I have said, is that the milieu which gives rise to the figurative use of words is the developed form of language described above; that is to confuse the identification and definition of something with its origin, rather as if I were to suppose that a new species of monkey I had discovered had not existed before I found and named it.

Bound up with this is the model of figurative language which I assumed, namely that figurative use was derived from literal use and dependent upon it, and that literal use was prior and original – in other words, that we go about the world applying names like labels to what we see about us (the process of ‘ostensive definition’ put forward by St Augustine, and quoted by Wittgenstein at the start of his Philosophical Investigations) and only afterwards develop the trick of ‘transferring’ these labels to apply to other things (the word ‘metaphor’ in Greek is the direct equivalent of ‘transfer’ in Latin – both suggest a ‘carrying over or across’).

Points to note about this model are that it is logically derived and that it presents metaphorical thinking as an intellectual exercise – it is, as Aristotle describes it, ‘the ability to see the similarity in dissimilar things.’

The logic appears unassailable: clearly, if metaphor consists in transferring a word from its literal application and applying it elsewhere, so that the sense of the original is now understood as applying to the new thing, then the literal use must necessarily precede the metaphorical and the metaphorical be wholly dependent on and derived from it: to say of a crowd that it surged forward is to liken its action to that of a wave, but we can only understand this if we have the original sense of ‘surge’ as a starting point.

However, there is a difficulty here. It is evident that there can be no concept of literal use and literal meaning till there are letters, since the literal meaning of ‘literal’ is ‘having to do with letters’. Only when words can be written down can we have an idea of a correspondence between the words in the sentence and the state of affairs that it describes (what Wittgenstein in the Tractatus calls the ‘picture theory’ of language). If what we term metaphors were in use before writing was invented – and I am quite certain that they were – then we must find some other explanation of them than the ‘transfer model’ outlined above, with its assumption that literal use necessarily precedes metaphorical and the whole is an intellectual process of reasoned comparison.

The root of the matter lies in the fact already mentioned, that only with the invention of a written form does the systematic analysis of speech become possible, or indeed necessary. Before then (as I suggest in ‘The Disintegration of Expression’) speech was one facet or mode of expression, quite likely not the most important (I would suggest that various kinds of body language, gesture and facial expression were possibly more dominant in conveying meaning). It was something that we used by instinct and intuition rather than conscious reflection, and it would always have been bound up with some larger activity, for the simple reason that there was no means of separating it (the nearest approach would be a voice speaking in the dark, but that is still a voice, with all the aesthetic qualities that a voice brings, and also by implication a person; furthermore, it is still firmly located in time, at that moment, for those hearers, in that situation. Compare this with a written sentence, where language for the first time is able to stand on its own, independent of space and time and not associated with any speaker).

In other words, when metaphor was first defined, it was in terms of a literate language, and was seen primarily as a use we make of words. (Given the definition supplied by Belloc’s schoolboy, that ‘a metaphor is just a long Greek word for a lie’, there is an illuminating parallel to be drawn here with lying, which might be defined as ‘making a false statement, one that is not literally true’. This again puts the focus on words, and makes lying primarily a matter of how words are used and what they mean. The words or the statement are seen as what is false, but actually it is the person – hence the old expression ‘the truth is not in him’. Deceit consists in creating a false appearance, in conveying a false impression: words are merely instrumental, and though certainly useful – as a dagger is for murder – are by no means necessary. We can lie by a look or an action; we can betray with a kiss.)

There is a great liberation in freeing metaphor from the shackles that bind it to literal language (and to logic, with which it is at odds, since it breaks at least two of the so-called ‘laws of thought’ – it violates the law of identity, which insists that ‘A is A’, by asserting that A is B, and by the same token, the law of contradiction, which insists that you cannot have A and not-A, by asserting that A is not-A). It allows us to see it from a wholly new perspective, and does away with the need to see it either as an intellectual act (‘seeing the similarity in dissimilars’) or as something that necessarily has to do with words or even communication; I would suggest that metaphor is primarily a way of looking at the world, and so is first and foremost a mode of thought, but one that operates not through the intellect and reason but through intuition and feelings.

To illustrate this, I would like to take first an example I came up with when I was trying to envisage how metaphor might have evolved. Two brothers, out in the bush, come on a lion, at a safe distance, so that they can admire its noble mien and powerful grace without feeling threatened. One brother smiles and says ‘mother!’ The other, after an initial look of puzzlement, nods his head in affirmation and laughs.

The explanation I furnished to accompany this is that their mother is a formidable and beautiful woman and that the first brother, seeing the lion, is reminded of her, and by naming her, invites his brother to make the same comparison that has already occurred to him, which he does after a moment’s puzzlement, and the two take pleasure in this new and unexpected – yet apt – use of the word.


I think that the focus here is wrong: it is still concerned to make metaphor about words, and to see it primarily as a way of communicating ideas.

I would now like to alter the story slightly. A man on his own in the bush catches sight of the lion (from a safe distance, as before). On seeing it, he is moved: the sight of it stirs him, fills him with a mixture of awe and delight. And it is not what he sees, but rather what he feels, that calls his mother to mind: the feeling that the lion induces in him is the same as he has felt in the presence of his mother. That is where the identification takes place, in the feeling: the outer circumstances might differ (the lion in the bush, his mother in the village) but the inner feeling is the same. If we think of an experience as combining an external objective component with an internal subjective one (and I am carefully avoiding any notion of cause and effect here) then the origin of metaphor lies in experiences where the external objective component differs but the internal subjective component is the same.

Why am I wary of saying ‘the sight of the lion causes the same feelings that the sight of his mother does’? Because it strikes me as what I would call a ‘mixed mode’ of thinking: it imports the notion of causality, a modern and analytic way of thinking, into an account of an ancient and synthetic way of thinking, thus imposing an explanation rather than simply describing. (This is difficult territory, because causality is so fundamental to all our explanations, based as they are on thinking that makes use of literate language as its main instrument.)

What I want to say is this: causal explanations impose a sequence – one thing comes first – the cause – and elicits the other, the effect. So if we stick with the man and the lion we would analyse it like this: ‘sense data arrive in the man’s brain through his eyes by the medium of light, and this engenders a physical response (spine tingling, hair standing on end, a frisson passing over the body) which the man experiences as a feeling of awe and delight.’

We can demonstrate by reason that the lion, or the sight of it, is the cause and the emotion the effect, because if we take the lion away (for instance, before the man comes on it) the man does not experience the emotion (although he may experience ‘aftershocks’ once it has gone, as he recalls the sight of it).

But there is a fault here. If we leave the lion but substitute something else for the man – an antelope, say, or a vulture – does it still have the same effect? It is impossible to say for sure, though we may infer something from how each behaves – the antelope, at the sight (and quite probably the scent) of the lion might bound away in the opposite direction, while the vulture (sensing the possibility of carrion near by or in the offing) might well move closer.

My point is that the analysis of cause and effect is rather more complex than I have presented it here, which is much as David Hume makes it out to be, with his analogy of one billiard ball striking another; as Schopenhauer points out, what causes the window to shatter is not the stone alone, but the fact of its being thrown with a certain force and direction combined with the brittleness of the glass (and if the stone is thrown by a jealous husband through his love rival’s window, then we might need to include his wife’s conduct and the construction he puts upon it in the causal mix). Change any one of these and the result is different.

My being human is as much a precondition for the feelings I experience in the presence of a lion as the lion is, and I think that this is a case where, as Wordsworth puts it, ‘we murder to dissect’ – it is much more enlightening to consider the experience as a single simultaneous event with, as I have suggested, an inner and an outer aspect that are effectively counterparts. So the lion is the embodiment of the man’s feelings but so is his mother, and the lion and his mother are identified by way of the feelings that both embody; and the feelings are in some sense the inner nature or meaning of both the lion and the mother (think here of all the songs and poetry and music that have been written where the lover tries to give expression to his feelings for his beloved). This interchangeability and the identity of different things or situations through a common feeling aroused in each case is the foundation of metaphor and, I think, the key ‘mechanism’ of Art.

(This has an interesting parallel with the philosophy of Schopenhauer, as expressed in the title of his work Die Welt als Wille und Vorstellung, variously translated as ‘The World as Will and Representation’ or ‘The World as Will and Idea’. In this he borrows from Eastern philosophy to present the world as having a dual aspect – objectively, as it appears to others and subjectively, as it is in itself. Its objective aspect, Representation, is made known to us via our senses, and is the same world of Objects and Space with which this discussion began; we cannot by definition see what it is like in itself since it only ever appears as object, but once we realise that we ourselves are objects in the ‘World as Representation,’ we can gain a special insight by ‘turning our eyes inward’ as it were, and contemplating our own inner nature, which we know not by seeing but by being it.

And what do we find? For Schopenhauer, it is the Will; and the revelation is that this is not an individual will – my will as opposed to yours – it is the same Will that is the inner nature of everything, the blind will to exist, to come into being and to remain in being. (This bears a striking resemblance to the position advanced by evolutionary biologists such as Richard Dawkins, for whom humankind is effectively a by-product of our genetic material’s urge to perpetuate itself).)

I would diverge from Schopenhauer – and the evolutionary biologists – in their pessimistic and derogatory account of the inner nature of things, on two grounds. The first is that it makes us anomalous. Schopenhauer asserts that ‘in us alone, the Will comes to consciousness’ but is unable to explain why this should be so, while his only solution to the revelation that all things are just the urges of a blind and senseless will is effectively self-annihilation (not a course he chose to pursue himself, as it happens – he lived to be 72). There is a lack of humility here that I find suspect, a desire still to assert our uniqueness and importance in a senseless world. If the Will is indeed the inner nature of all things (and that is questionable) why should we consider ourselves the highest manifestation of it?


The second ground is the nature of the feelings that I describe, which are the opposite of pessimistic: they are uplifting, feelings of awe, elation and delight. There is a fashion nowadays for explaining everything in terms of genetic inheritance or evolutionary advantage (‘stress is a manifestation of the fight-or-flight reaction’ for instance, or any number of explanations which couch our behaviour in terms of advertising our reproductive potential) but I have yet to come across any satisfactory explanation in the same terms of why we should feel elated in the presence of beauty, whether it is a person, an animal, a landscape, the sea or (as Kant puts it) ‘the starry heavens over us*’. The characteristic feature of such experiences is ‘being taken out of yourself’ (which is what ‘ecstasy’ means) a feeling of exaltation or rapture, of temporarily losing any sense of yourself and feeling absorbed in some greater whole.

I would venture that this disinterested delight is the single most important aspect of human experience and is (in Kantian phrase) ‘worthy of all attention.’

*The full quotation is not without interest: “Two things fill the mind with ever new and increasing admiration and awe, the more often and steadily we reflect upon them: the starry heavens above me and the moral law within me. I do not seek or conjecture either of them as if they were veiled obscurities or extravagances beyond the horizon of my vision; I see them before me and connect them immediately with the consciousness of my existence.” (Critique of Practical Reason)



‘These great concurrences of things’


One of the main ideas I pursue here is that the invention of writing has radically altered the way we think, not immediately, but eventually, through its impact on speech, which it transforms from one mode of expression among many into our main instrument of thought, which we call Language, in which the spoken form is dominated by the written and meaning is no longer seen as embedded in human activity but rather as a property of words, which appear to have an independent, objective existence. (This notion is examined in the form of a fable here)

This means in effect that the Modern world begins in Classical Greece, about two and a half thousand years ago, and is built on foundations laid by Socrates, Plato and Aristotle; though much that we think of as marking modernity is a lot more recent (some would choose the Industrial Revolution, some the Enlightenment, some the Renaissance) the precondition for all of these – the way of seeing ourselves in the world which they imply – is, I would argue, the change in our thinking outlined above.

This naturally gives rise to the question of how we thought before, which is not a matter of merely historical interest, since we are not talking here about one way of thinking replacing another, but rather a new mode displacing and dominating the existing one, which nevertheless continues alongside, albeit in a low estate, a situation closely analogous to an independent nation that is invaded and colonised by an imperial power.

What interests me particularly is that this ancient mode of thought, being ancient – indeed, primeval – is instinctive and ‘natural’ in the way that speech is (and Language, as defined above, is not). Unlike modern ‘intellectual’ thought, which marks us off from the rest of the animal kingdom (something on which we have always rather plumed ourselves, perhaps mistakenly, as I suggested recently) this instinctive mode occupies much the same ground, and reminds us that what we achieve by great ingenuity and contrivance (remarkable feats of construction, heroic feats of navigation over great distances, to name but two) is done naturally and instinctively by ants, bees, wasps, spiders, swifts, salmon, whales and many others, as a matter of course.

So how does this supposed ‘ancient mode’ of thought work? I am pretty sure that metaphor is at the heart of it. Metaphor consists in seeing one thing in terms of another, or, if you like, in seeing something in the world as expressing or embodying your thought; as such, it is the basic mechanism of most of what we term Art: poetry, storytelling, painting, sculpture, dance, music, all have this transformative quality in which different things are united and seen as aspects of one another, or one is seen as the expression of the other – they become effectively interchangeable.

(a key difference between metaphorical thinking and analytic thinking – our modern mode – is that it unites and identifies where the other separates and makes distinctions – which is why metaphor always appears illogical or paradoxical when described analytically: ‘seeing the similarity in dissimilars’ as Aristotle puts it, or ‘saying that one thing is another’)

This long preamble was prompted by an odd insight I gained the other day when, by a curious concatenation of circumstances, I found myself rereading, for the first time in many years, John Buchan’s The Island of Sheep.

Now Buchan is easy to mock – the values and attitudes of many of his characters are very much ‘of their time’ and may strike us as preposterous, if not worse – but he knows how to spin a yarn, and there are few writers better at evoking the feelings aroused by nature and landscape at various times and seasons. He was also widely and deeply read, a classical scholar, and his popular fiction (which never pretended to be more than entertainment and generally succeeded) has a depth and subtlety not found in his contemporaries.

What struck me in The Island of Sheep were two incidents, both involving the younger Haraldsen. Haraldsen is a Dane from the ‘Norlands’ – Buchan’s name for the Faeroes. He is a gentle, scholarly recluse who has been raised by his father – a world-bestriding colossus of a man, a great adventurer – to play some leading part in an envisaged great revival of the ‘Northern Race’, a role for which he is entirely unfitted. He inherits from his father an immense fortune, in which he is not interested, and a vendetta or blood-feud which brings him into conflict with some ruthless and unscrupulous men.

Early in the book, before we know who he is, he encounters Richard Hannay and his son Peter John (another pair of opposites). They are out wildfowling and Peter John flies his falcon at an incoming skein of geese; it separates a goose from the flight and pursues it in a thrilling high-speed chase, but the goose escapes by flying low and eventually gaining the safety of a wood. ‘Smith’ (as Haraldsen is then known) is moved to tears, and exclaims:
‘It is safe because it was humble. It flew near the ground. It was humble and lowly, as I am. It is a message from Heaven.’
He sees this as an endorsement of the course he has chosen to evade his enemies, by lying low and disguising himself.

Later, however, he takes refuge on Lord Clanroyden’s estate, along with Richard Hannay and his friends, who in their youth in Africa had sworn an oath to old Haraldsen to look after his son, when they were in a tight spot. They attend a shepherd’s wedding and after the festivities there is a great set-to among the various sheepdogs, with the young pretenders ganging up to overthrow the old top-dog, Yarrow, who rather lords it over them. The old dog fights his corner manfully but is hopelessly outnumbered, then just as all seems lost, he turns from defence to attack and sallies out against his opponents with great suddenness and ferocity, scattering them and winning the day.


Again, Haraldsen is deeply moved:

‘It is a message to me,’ he croaked. ‘That dog is like Samr, who died with Gunnar of Lithend. He reminds me of what I had forgotten.’

He abandons his scheme of running and hiding and resolves to return to his home, the eponymous Island of Sheep, and face down his enemies, thus setting up the climax of the book (it’s not giving too much away to reveal that good triumphs in the end, though of course it’s ‘a dam’ close-run thing’).

Both these incidents have for me an authentic ring: I can well believe that just such ‘seeing as’ played a key role in the way our ancestors thought about the world and their place in it.

It is, of course, just the kind of thing that modern thinking labels ‘mere superstition’ but I think it should not be dismissed so lightly.

The modern objection might be phrased like this: ‘the primitive mind posits a ruling intelligence, an invisible force that controls the world and communicates through signs – bolts of lightning, volcanic eruptions, comets and other lesser but in some way striking events. The coincidence of some unusual or striking occurrence in nature with a human crisis is seen as a comment on it, and may be viewed (if preceded by imploration) as the answer to prayer. We know better: these are natural events with no connection to human action beyond fortuitous coincidence.’

The way I have chosen to phrase this illustrates a classic problem that arises when modern thinking seeks to give an account of ancient or traditional thinking – ‘primitive’ thinking, if you like, since I see nothing pejorative in being first and original. The notion of cause and effect is key to any modern explanation, so we often find that ‘primitive’ thinking is characterised by erroneous notions of causality – basically, a causal connection is supposed where there is none.

For instance, in a talk I heard by the philosopher John Haldane, he cited a particular behaviour known as ‘tree binding’, in which trees were wounded and bound as a way of treating human wounds – a form of what is called ‘sympathetic magic’, where another object acts as a surrogate for the person or thing we wish to affect (or, to be more precise, ‘wish to be affected’). An account of such behaviour in causal terms will always show it to be mistaken and apparently foolish – typical ‘primitive superstition’: ‘They suppose a causal connection between binding the tree’s wound and binding the man’s, and that by healing the one, they will somehow heal the other (which we know cannot work).’

But I would suggest that the tree-binding is not a mistaken scientific process, based on inadequate knowledge – it is not a scientific process at all, and it is an error to describe it in those terms. It is, I would suggest, much more akin to both prayer and poetry. The ritual element – the play-acting – is of central importance.

The tree-binders, I would suggest, are well aware of their ignorance in matters of medicine: they do not know how to heal wounds, but they know that wounds do heal; and they consider that the same power (call it what you will) that heals the wound in a tree also heals the wound in a man’s body. They fear that the man may die but hope that he will live, and they know that only time will reveal the outcome.

Wounding then binding the tree seems to me a ritual akin to prayer rather than a misguided attempt at medicine. First and foremost, it is an expression of hope, like the words of reassurance we utter in such cases – ‘I’m sure he’ll get better’. The tree’s wound will heal (observation tells them this) – so, too, might the man’s.

But the real power of the ritual, for me, lies in its flexibility, its openness to interpretation. It is a very pragmatic approach, one that can be tailored to suit any outcome. If the man lives, well and good; that is what everyone hoped would happen. Should the man die, the tree (now identified with him in some sense) remains, with its scar, which does heal. The tree helps reconcile them to the man’s death by showing it in a new perspective: though all they have now is his corpse, the tree is a reminder that this man was more than he now seems – he had a life, spread over time. Also, the continued survival of the tree suggests that in some sense the man, too, or something of him that they cannot see (the life or soul which the tree embodies), may survive the death of his body.

The tree can also be seen as saying something about the man’s family (we have the same image ourselves in ‘family tree’, though buried some layers deeper) and how it survives without him, scarred but continuing; and by extension, the same applies to the tribe, which will continue to flourish as the tree does, despite the loss of an individual member.

And the tree ‘says’ all these things because we give it tongue – we make it tell a story, or rather we weave it into one that is ongoing (there are some parallels here to the notion of ‘Elective Causality’ that I discuss elsewhere). As I have argued elsewhere [‘For us, there is only the trying‘] we can only find a sign, or see something as a sign, if we are already looking for one and already think in those terms. Haraldsen, in The Island of Sheep, is troubled about whether he has chosen the right course, and finds justification for it in the stirring sight of the goose evading the falcon; later, still troubled about the rightness of his course, he opts to change it, stirred by the sight of the dog Yarrow turning the tables on his opponents.

His being stirred, I think, is actually the key here. It would be an error to suppose that he is stirred because he sees the goose’s flight and the dog’s bold sally as ‘messages from heaven’; the reverse is actually the case – he calls these ‘messages from heaven’ to express the way in which they stir him. There is a moment when he identifies, first with the fleeing goose, then with the bold dog. What unites him with them in each case is what he feels. But this is not cause and effect, which is always a sequence; rather, this is parallel or simultaneous – the inner feeling and the outward action are counterparts, aspects of the same thing. A much closer analogy is resonance, where a plucked string or a struck bell sets up sympathetic vibration in another.

This is why I prefer Vita Sackville-West’s definition of metaphor to Aristotle’s: for him, metaphor is the ability to see the similarity in dissimilar things; for her (the quote is from her book on Marvell):

‘The metaphysical poets were intoxicated—if one may apply so excitable a word to writers so severely and deliberately intellectual—by the potentialities of metaphor. They saw in it an opportunity for expressing their intimations of the unknown and the dimly suspected Absolute in terms of the known concrete, whether those intimations related to philosophic, mystical, or intellectual experience, to religion, or to love. They were ‘struck with these great concurrences of things’’

A subject to which I shall return.


Filed under language-related, philosophy

In the beginning was the word… or was it?


Reflecting on the origin of words leads us into interesting territory. I do not mean the origin of particular words, though that can be interesting too; I mean the notion of words as units, as building blocks into which sentences can be divided.

How long have we had words? The temptation is to say ‘as long as we have had speech’ but when you dig a bit deeper, you strike an interesting vein of thought.

As I have remarked elsewhere [see ‘The Muybridge Moment‘] it seems unlikely that there was any systematic analysis of speech till we were able to write it down, and perhaps there was no need of such analysis. Certainly a great many of the things that we now associate with language only become necessary as a result of its having a written form: formal grammar, punctuation, spelling – the three things that probably generate the most unnecessary heat – are all by-products of the introduction of writing.

The same could be said of words. Till we have to write them down, we have no need to decide where one word ends and another begins: the spaces between words on a page do not reflect anything that is found in speech, where typically words flow together except where we hesitate or pause for effect. We are reminded of this in learning a foreign language, where we soon realise that listening out for individual words is a mistaken technique; the ear needs to attune itself to rhythms and patterns and characteristic constructions.

So were words there all along just waiting to be discovered? That is an interesting question. Though ‘discovery’ and ‘invention’ effectively mean the same thing etymologically (both have the sense of ‘coming upon’ or ‘uncovering’), we customarily make a useful distinction between them – ‘discovery’ implies pre-existence, so we discover buried treasure, ancient ruins, lost cities – whereas ‘invention’ is reserved for things we have brought into being, that did not previously exist, like bicycles and steam engines (an idea also explored in Three Misleading Oppositions, Three Useful Axioms).

So are words a discovery or an invention?

People of my generation were taught that Columbus ‘discovered’ America, though even in my childhood the theory that the Vikings got there earlier had some currency; but of course in each case they found a land already occupied, by people who (probably) had arrived there via a land-bridge from Asia, or possibly by island-hopping, some time between 42,000 and 17,000 years ago. In the same way, Dutch navigators ‘discovered’ Australia in the early 17th century, though in British schools the credit is given to Captain Cook in the late 18th century – who actually only laid formal claim, in the name of the British Crown, to a territory that Europeans had known about for nearly two centuries and its indigenous inhabitants had lived in for around five hundred centuries.

In terms of discovery, the land-masses involved predate all human existence, so they were there to be ‘discovered’ by whoever first set foot on them, but these later rediscoveries and colonisations throw a different light on the matter. The people of the Old World were well used to imperial conquest as a way of life, but that was a matter of the same territory changing hands under different rulers; the business of treating something as ‘virgin territory’ – though it quite plainly was not, since they found people well-established there – is unusual, and I think it is striking where it comes in human, and particularly European, history. It implies an unusual degree of arrogance and self-regard on the part of the colonists, and it is interesting to ask where that came from.

Since immigration has become such a hot topic, there have been various witty maps circulating on social media, such as this one showing ‘North America prior to illegal immigration’.

The divisions, of course, show the territories of the various peoples who lived there before the Europeans arrived, though there is an ironic tinge lent by the names by which they are designated, which for the most part are anglicised. Here we touch on something I have discussed before  [in Imaginary lines: bounded by consent]  – the fact that any political map is a work of the imagination, denoting all manner of territories and divisions that have no existence outside human convention.

Convention could be described as our ability to project or impose our imagination on reality; as I have said elsewhere [The Lords of Convention] it strikes me as a version of the game we play in childhood, ‘let’s pretend’ or ‘make-believe’ – which is not to trivialise it, but rather to indicate the profound importance of the things we do in childhood, by natural inclination, as it were.

Are words conventions, a form we have imposed on speech much as we impose a complex conventional structure on a land-mass by drawing it on a map? The problem is that the notion of words is so fundamental to our whole way of thinking – may, indeed, be what makes it possible – that it is difficult to set them aside.

That is what I meant by my comment about the arrogance and self-regard implied in treating America and Australia as ‘virgin territory’ – it seems to me to stem from a particular way of thinking, and that way of thinking, I suggest, is bound up with the emergence of words into our consciousness, which I think begins about two and a half thousand years ago, and (for Europeans at least) with the Greeks.

I would like to offer a model of it which is not intended to be historical (though I believe it expresses an underlying truth) but is more a convenient way of looking at it. The years from around 470 to 322 BC span the lives of three men: the first, Socrates, famously wrote nothing, but spoke in the market place to whoever would listen; we know of him largely through his pupil, Plato. It was on Plato’s pupil, Aristotle, that Dante bestowed the title ‘maestro di color che sanno’ – master of those that know.

This transition, from the talking philosopher to the one who laid the foundations of all European thought, is deeply symbolic: it represents the transition from the old way of thought and understanding, which was inseparable from human activity – conversation, or ‘language games’ and ‘forms of life’ as Wittgenstein would say – to the new, which is characteristically separate and objective, existing in its own right, on the written page.

The pivotal figure is the one in the middle, Plato, who very much has a foot in both camps, or perhaps more accurately, is standing on the boundary of one world looking over into another newly-discovered. The undoubted power of his writing is derived from the old ways – he uses poetic imagery and storytelling (the simile of the cave, the myth of Er) to express an entirely new way of looking at things, one that will eventually subjugate the old way entirely; and at the heart of his vision is the notion of the word.

Briefly, Plato’s Theory of Forms or Ideas can be expressed like this: the world has two aspects, Appearance and Reality; Appearance is what is made known to us by the senses, the world we see when we look out the window or go for a walk. It is characterised by change and impermanence – nothing holds fast, everything is always in the process of changing into something else, a notion for which the Greeks seemed to have a peculiar horror; in the words of the hymn, ‘change and decay in all around I see’.

Reality surely cannot be like that: Truth must be absolute, immutable (it is important to see the part played in this by desire and disgust: the true state of the world surely could not be this degrading chaos and disorder where nothing lasts). So Plato says this: Reality is not something we can apprehend by the senses, but only by the intellect. And what the intellect grasps is that beyond Appearance, transcending it, is a timeless and immutable world of Forms or Ideas. Our senses make us aware of many tables, cats, trees; but our intellect sees that these are but instances of a single Idea or Form, Table, Cat, Tree, which somehow imparts to them the quality that makes them what they are, imbues them with ‘tableness’ ‘catness’ and ‘treeness’.

This notion beguiled me when I first came across it, aged fourteen. It has taken me rather longer to appreciate the real nature of Plato’s ‘discovery’, which is perhaps more prosaic (literally) but no less potent. Briefly, I think that Plato has discovered the power of general terms, and he has glimpsed in them – as an epiphany, a sudden revelation – a whole new way of looking at the world; and it starts with being able to write a word on a page.

Writing makes possible the relocation of meaning: from being the property of a situation, something embedded in human activity (‘the meaning of a word is its use in the language’) meaning becomes the property of words, these new things that we can write down and look at. The icon of a cat or a tree resembles to some extent an actual cat or tree but the word ‘cat’ looks nothing like a cat, nor ‘tree’ like a tree; in order to understand it, you must learn what it means – an intellectual act. And what you learn is more than just the meaning of a particular word – it is the whole idea of how words work, that they stand for things and can, in many respects, be used in their stead, just as the beads on an abacus can be made to stand for various quantities. What you learn is a new way of seeing the world, one where its apparently chaotic mutability can be reduced to order.

Whole classes of things that seem immensely varied can now be subsumed under a single term: there is a multiplicity of trees and cats, but the one word ‘tree’ or ‘cat’ can be used to stand for all or any of them indifferently. Modelled on that, abstract ideas such as ‘Justice’ ‘Truth’ and ‘The Good’ can be seen standing for some immutable, transcendent form that imbues all just acts with justice and so on. Plato’s pupil Aristotle discarded the poetic clothing of his teacher’s thought, but developed the idea of generalisation to the full: it is to him that we owe the system of classification by genus and species and the invention of formal logic, which could be described as the system of general relations; and these are the very foundation of all our thinking.

In many respects, the foundations of the modern world are laid here, so naturally these developments are usually presented as one of mankind’s greatest advances. However, I would like to draw attention to some detrimental aspects. The first is that this new way of looking at the world, which apprehends it through the intellect, must be learned. Thus, at a stroke, we render natural man stupid (and ‘primitive’ man, to look ahead to those European colonisations, inferior, somewhat less than human). We also establish a self-perpetuating intellectual elite – those who have a vested interest in maintaining the power that arises from a command of the written word – and simultaneously exclude and devalue those who struggle to acquire that command.

The pernicious division into ‘Appearance’ and ‘Reality’ denigrates the senses and all natural instincts, subjugating them to and vaunting the intellect; and along with that goes the false dichotomy of Heart and Head, where the Head is seen as being the Seat of Reason, calm, objective, detached, which should properly rule the emotional, subjective, passionate and too-easily-engaged Heart.

This, in effect, is the marginalising of the old way of doing things that served us well till about two and a half thousand years ago, which gave a central place to those forms of expression and understanding which we now divide and rule as the various arts, each in its own well-designed box: poetry, art, music, etc. (a matter discussed in fable form in Plucked from the Chorus Line)

So what am I advocating? that we undo all this? No, rather that we take a step to one side and view it from a slightly different angle. Plato could only express his new vision of things in the old way, so he presents it as an alternative world somewhere out there beyond the one we see, a world of Ideas or Forms, which he sees as the things words stand for, what they point to – and in so doing, makes the fatal step of discarding the world we live in for an intellectual construct; but the truth of the matter is that words do not point to anything beyond themselves; they are the Platonic Forms or Ideas: the Platonic Idea of ‘Horse’ is the word ‘Horse’. What Plato has invented is an Operating System; his mistake is in thinking he has discovered the hidden nature of Reality.

What he glimpsed, and Aristotle developed, and we have been using ever since, is a way of thinking about the world that is useful for certain purposes, but one that has its limitations. We need to take it down a peg or two, and put it alongside those other, older operating systems that we are all born with, which we developed over millions of years. After all, the rest of the world – animal and vegetable – seems to have the knack of living harmoniously; we are the ones who have lost it, and now threaten everyone’s existence, including our own; perhaps it is time to take a fresh look.


Filed under language-related, philosophy

Where to Find Talking Bears, or The Needless Suspension of Disbelief


Something I have been struggling to pin down is a clear expression of my thoughts on the oft-quoted dictum of Coleridge, shown in its original context here:

‘it was agreed, that my endeavours should be directed to persons and characters supernatural, or at least romantic, yet so as to transfer from our inward nature a human interest and a semblance of truth sufficient to procure for these shadows of imagination that willing suspension of disbelief for the moment, which constitutes poetic faith.’

This strikes me as a curious instance of something that has become a commonplace – you can almost guarantee to come across it in critical discussion of certain things, chiefly film and theatre – despite the fact that it completely fails to stand up to any rigorous scrutiny. It is, in a word, nonsense.

But there is another strand here, which may be part of my difficulty. This dictum, and its popularity, strike me as a further instance of something I have grown increasingly aware of in my recent thinking, namely the subjugation of Art to Reason. By this I mean the insistence that Art is not only capable of, but requires rational explanation – that its meaning can and should be clarified by writing and talking about it in a certain way (and note the crucial assumption that involves, namely that art has meaning).

This seems to me much like insisting that everyone say what they have to say in English, rather than accepting that there are languages other than our own which are different but equally good.

But back to Coleridge. If the ‘willing suspension of disbelief for the moment’ is what ‘constitutes poetic faith,’ then all I can say is that it must be an odd sort of faith that consists not in believing something – or indeed anything – but rather in putting aside one’s incredulity on a temporary basis: ‘when I say I believe in poetry, what I mean is that I actually find it incredible, but I am willing to pretend I don’t in order to read it.’

That is the pernicious link – that this suspension of disbelief is a necessary prerequisite of engaging with poetry, fiction or indeed Art as a whole; we see it repeated (as gospel) in these quotations, culled at random from the internet:

‘Any creative endeavor, certainly any written creative endeavor, is only successful to the extent that the audience offers this willing suspension as they read, listen, or watch. It’s part of an unspoken contract: The writer provides the reader/viewer/player with a good story, and in return, they accept the reality of the story as presented, and accept that characters in the fictional universe act on their own accord.’

(‘Any creative endeavour’ ? ‘is only successful’ ? Come on!)

‘In the world of fiction you are often required to believe a premise which you would never accept in the real world. Especially in genres such as fantasy and science fiction, things happen in the story which you would not believe if they were presented in a newspaper as fact. Even in more real-world genres such as action movies, the action routinely goes beyond the boundaries of what you think could really happen.

In order to enjoy such stories, the audience engages in a phenomenon known as “suspension of disbelief”. This is a semi-conscious decision in which you put aside your disbelief and accept the premise as being real for the duration of the story.’
(‘required to believe’ ? ‘in order to enjoy’? Really?)

The implication is that we spend our waking lives in some sort of active scepticism, measuring everything we encounter against certain criteria before giving it our consideration; and when we come on any work of art – or at least one that deals with ‘persons and characters supernatural, or at least romantic’ – we immediately find it wanting, measured against reality, and so must give ourselves a temporary special dispensation to look at it at all.

This is rather as if, on entering a theatre, we said to ourselves ‘these fellows are trying to convince me that I’m in Denmark, but actually it’s just a stage set and they are actors in costumes pretending to be other people – Hamlet, Claudius, Horatio, Gertrude; of course it doesn’t help that instead of Danish they speak a strange sort of English that is quite unlike the way people really talk.’

The roots of this confusion go back what seems a long way, to classical Greece (about twenty-five centuries) though in saying that we should remember that artistic expression is a great deal older (four hundred centuries at least; probably much, much more). I have quoted the contest between Zeuxis and Parrhasius before:

…when they had produced their respective pieces, the birds came to pick with the greatest avidity the grapes which Zeuxis had painted. Immediately Parrhasius exhibited his piece, and Zeuxis said, ‘Remove your curtain that we may see the painting.’ The painting was the curtain, and Zeuxis acknowledged himself conquered, by exclaiming ‘Zeuxis has deceived birds, but Parrhasius has deceived Zeuxis himself.’

– Lempriere’s Classical Dictionary

This is the epitome of the pernicious notion that art is a lie, at its most successful where it is most deceptive: thus Plato banishes it from his ideal state, because in his world it is at two removes from Reality. Plato’s Reality (which he also identifies with Truth) is the World of Forms or Ideas, apprehended by the intellect; the world apprehended by the senses is Appearance, and consists of inferior copies of Ideas; so that Art, which imitates Appearance, is but a copy of a copy, and so doubly inferior and untrustworthy.

Aristotle takes a different line on Appearance and Reality (he is willing to accept the world of the senses as Reality) but continues the same error with his theory of Mimesis, that all art is imitation – which, to use Aristotle’s own terminology, is to mistake the accident for the substance, the contingent for the necessary.

To be sure, some art does offer a representation of reality, and often with great technical skill; and indeed there are works in the tradition of Parrhasius that are expressly intended to deceive – trompe l’oeil paintings, which in the modern era can achieve astonishing effects –

but far from being the pinnacle of art (though they are demonstrations of great technical skill) these are a specialist subset of it, and in truth a rather minor one, a sort of visual joke.

Insofar as any work of art resembles reality there will always be the temptation to measure it against reality and judge it accordingly, and this is particularly so of the visual arts, especially cinema, though people will apply the same criterion to fiction and poetry.

They are unlikely to do so in the case of music, however, and this exception is instructive. Even where music sets out to be specifically representative (technically what is termed ‘program(me) music’, I believe) and to depict some scene or action – for instance Britten’s ‘Sea Interludes’ – it still does not look like the thing it depicts (for the simple reason that it has no visual element). Music is so far removed in character from what it depicts that we do not know where to start in making a comparison – we see at once that it is a different language, if you like.

The Sea Interludes are extraordinarily evocative, yet we would not call them ‘realistic’, something we might be tempted to say of a photo-realistic depiction of a seascape compared to one by Turner, say:

(original source here)  Tom Nielsen – ‘First light surf’


(JMW Turner, ‘Seascape with storm coming on’ 1840)

Of all the different forms of Art, it is cinema that has gone furthest down this erroneous path – with the rise of CGI, almost anything can be ‘realised’ in the sense of presenting it in fully rounded, fully detailed form, and the revival of 3D imagery in its latest version and various other tricks are all geared to the same end of making it seem as if you were actually there in the action, as if that were the ultimate goal.

Yet even with the addition of scent and taste – the only senses yet to be catered for in film – the illusion is only temporary and never complete: we are always aware at some level that it is an illusion, and indeed the more it strives to be a perfect illusion the more aware we are of its illusory nature (we catch ourselves thinking ‘these special effects are amazing!’).

On the other hand, a black and white film from decades ago can so enrapture us that we are completely engaged with it to the exclusion of all else – we grip the arms of our seat and bite our lip when the hero is in peril, we shed tears at the denouement, we feel hugely uplifted at the joyous conclusion – but none of this is because we mistake what we are seeing for reality; it has to do with the engagement of our feelings.

In marked contrast to the cinema, the theatre now rarely aims at a realistic presentation; on the contrary, the wit with which a minimum of props can be used for a variety of purposes (as the excellent Blue Raincoat production of The Poor Mouth did with four chairs and some pig masks) can be part of the pleasure we experience, just as the different voices and facial expressions used by a storyteller can. It is not the main pleasure, of course, but it helps clarify the nature of the error that Coleridge makes.

How a story is told – the technique with which it is presented, whether it be on stage, screen or page – is a separate thing from the story itself. Take, for instance, these two fine books by Jackie Morris


‘East of the Sun, West of the Moon’ and ‘The Wild Swans’ are traditional tales; in retelling them, Jackie Morris puts her own stamp on them, not only with her own words and beautiful illustrations, but also with some changes of detail and action (for more about the writing of East of the Sun, see here).

The nature of these changes is interesting. It is like retuning a musical instrument: certain notes that jarred before now ring true; the tales are refreshed – their spirit is not altered but enhanced.

This ‘ringing true’ is an important concept in storytelling and in Art generally (I have discussed it before, in this fable). On the face of it, both these tales are prime candidates for Coleridge’s pusillanimous ‘suspension of disbelief’: in one, a talking bear makes a pact with a girl which she violates, thus failing to free him from the enchantment laid on him (he is actually a handsome prince); in consequence, the girl must find her way to the castle East of the Sun, West of the Moon, an enterprise in which she is aided by several wise women and the four winds; there she must outwit a troll-maiden. In the other, a sister finds her eleven brothers enchanted into swans by the malice of their stepmother, and can only free them by taking a vow of silence and knitting each of them shirts of stinging nettles.

After all, it will be said, you don’t meet with talking bears, any more than you do with boys enchanted into swans, in the Real World, do you?

Hm. I have to say that I view the expression ‘Real World’ and those who use it with deep suspicion: it is invariably employed to exclude from consideration something which the speaker does not like and fears to confront. As might be shown in a Venn diagram, what people mean by the ‘Real World’ is actually a subset of the World, one that is expressly defined to rule out the possibility of whatever its proponents wish to exclude.


In other words, all they are saying is ‘you will not find talking bears or enchanted swans if you look in a place where you don’t find such things.’

Cue howls of protest: ‘you don’t meet talking bears walking down the street, do you?’ Well, it depends where you look: if you look at the start of East of the Sun, you will meet a talking bear walking through the streets of a city. Further howls: ‘But that’s just a story!’


(Some people met this bear on the London underground, but I don’t think it spoke.)

Well, no – it isn’t just a story; it’s a story – and stories and what is in them are as much part of the world as Belisha beacons, horse-blankets and the Retail Price Index. The World, after all, must include the totality of human experience. The fact that we do not meet with talking bears in the greengrocer’s (and has anyone ever said we might?) does not preclude the possibility of meeting them in stories, which is just where you’d expect to find them (for a similar point, see Paxman and the Angels).


The Muybridge Moment


The memorable Eadweard Muybridge invented a number of things, including his own name – he was born Edward Muggeridge in London in 1830. He literally got away with murder in 1874, when he travelled some seventy-five miles to shoot dead his wife’s lover (prefacing the act with ‘here’s the answer to the letter you sent my wife’) but was acquitted by the jury (against the judge’s direction) on the grounds of ‘justifiable homicide’. He is best known for the sequence of pictures of a galloping horse, shot in 1878 at the behest of Leland Stanford, former Governor of California, to resolve the question of whether the horse ever has all four feet off the ground (it does, though not at the point people imagined). To capture the sequence, Muybridge used a battery of cameras triggered in succession, and devised a means of projecting the results which he called the zoopraxiscope – a forerunner of the cinema projector – thereby laying the foundations of the motion-picture industry.

The_Horse_in_Motion-anim

(“The Horse in Motion-anim” by Eadweard Muybridge, Animation: Nevit Dilmen – Library of Congress Prints and Photographs Division; http://hdl.loc.gov/loc.pnp/cph.3a45870. Licensed under Public Domain via Commons – https://commons.wikimedia.org/wiki/File:The_Horse_in_Motion-anim.gif#/media/File:The_Horse_in_Motion-anim.gif)

Muybridge’s achievement was to take a movement that was too fast for the human eye to comprehend and freeze it so that each phase of motion could be analysed. It was something that he set out to do – as deliberately and methodically as he set out to shoot Major Harry Larkyns, his wife’s lover.

It is interesting to consider that something similar to Muybridge’s achievement happened a few thousand years ago, entirely by accident and over a longer span of time, but with consequences so far-reaching that they could be said to have shaped the modern world.

We do not know what prompted the invention of writing between five and six thousand years ago, but it was not a desire to transcribe speech and give it a permanent form; most likely it began, alongside numbering, as a means of listing things, such as the contents of storehouses – making records for tax purposes, perhaps, or of the ruler’s wealth – and from there it might have developed as a means of recording notable achievements in battle and setting down laws.

We can be confident that transcribing speech was not the primary aim because that is not something anyone would have felt the need to do. For us, that may take some effort of the imagination to realise, not least because we live in an age obsessed with making permanent records of almost anything and everything, perhaps because it is so easy to do so – it is a commonplace to observe that people now seem to go on holiday not to enjoy seeing new places at first hand, but in order to make a record of them that they can look at once they return home.

And long before that, we had sayings like

vox audita perit, littera scripta manet
(the voice heard is lost; the written word remains)

to serve as propaganda for the written word and emphasise how vital it is to write things down. One of the tricks of propaganda is to take your major weakness and brazenly pass it off as a strength (‘we care what you think’, ‘your opinion matters to us’, ‘we’re listening!’, as banks and politicians say), and that is certainly the case with this particular Latin tag: it is simply not true that the spoken word is lost – people have been remembering speech from time immemorial (think of traditional stories and songs passed from one generation to the next); it is reasonable to suppose that retaining speech is as natural to us as speaking.

If anything, writing was devised to record what was not memorable. Its potential beyond that was only slowly realised: it took around a thousand years for anyone to use it for something we might call ‘literature’. It is not till the classical Greek period – a mere two and a half millennia ago (Homo sapiens is reckoned at 200,000 years old, the genus Homo at 2.8 million) – that the ‘Muybridge moment’ arrives, with the realisation that writing allows us to ‘freeze’ speech just as his pictures ‘froze’ movement, and so, crucially, to analyse it.

When you consider all that stems from this, a considerable degree of ‘unthinking’ is required to imagine how things must have been before writing came along. I think the most notable thing would have been that speech was not seen as a separate element, but rather as part of a spectrum of expression, nigh-inseparable from gesture and facial expression. A great many of the features of language which we think fundamental would have been unknown: spelling and punctuation – to which some people attach so much importance – belong exclusively to writing and would not have been thought of at all; even the idea of words as the basic units of language, the building blocks of sentences, is a notion that only arises once you can ‘freeze’ the flow of speech like Muybridge’s galloping horse and study each phase of its movement. Before then, the ‘building blocks’ would have been complete utterances – a string of sounds that belonged together, rather like a phrase in music – and these would invariably have been integrated, not only with gestures and facial expressions, but with some wider activity of which they formed part (and possibly not the most significant part).

As for grammar, the rules by which language operates and to which huge importance is attached by some, it is likely that no-one had the least idea of it; after all, speech is even now something we learn (and teach) by instinct, though that process is heavily influenced and overlaid by all the ideas that stem from the invention of writing; but then we have only been able to analyse language in that way for a couple of thousand years; we have been expressing ourselves in a range of ways, including speech, since the dawn of humanity.

When I learned grammar in primary school – some fifty years ago – we did it by parsing and analysis. Parsing was taking a sentence and identifying the ‘parts of speech’ of which it was composed – not just words, but types or categories of word, defined by their function: Noun, Verb, Adjective, Adverb, Pronoun, Preposition, Conjunction, Article.

Analysis established the grammatical relations within the sentence, in terms of the Subject and Predicate. The Subject, confusingly, was not what the sentence was about – which puzzled me at first – but rather ‘the person or thing that performs the action described by the verb’ (though we used the rough-and-ready method of asking ‘who or what before the verb?’). The Predicate was the remainder of the sentence, what was said about (‘predicated of’) the Subject, and could generally be divided into Verb and Object (‘who or what after the verb?’ was the rough-and-ready method for finding that).
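By way of illustration – this sketch is my own, not part of those primary-school lessons – the ‘rough-and-ready’ method can be put into a few lines of Python: find the verb, take ‘who or what before the verb’ as the Subject, and treat the rest as the Predicate:

```python
# A toy sketch of the 'rough-and-ready' school method described above:
# given a simple sentence and its (known) verb, split the sentence into
# Subject ('who or what before the verb') and Predicate (the remainder,
# i.e. what is said about the Subject).

def rough_parse(sentence: str, verb: str):
    """Split a simple sentence into (Subject, Predicate) around a known verb."""
    words = sentence.rstrip(".").split()
    i = words.index(verb)              # position of the verb in the sentence
    subject = " ".join(words[:i])      # 'who or what before the verb'
    predicate = " ".join(words[i:])    # verb plus object: the Predicate
    return subject, predicate

print(rough_parse("The cat chased the mouse.", "chased"))
# ('The cat', 'chased the mouse')
```

Crude as it is – it assumes a single known verb and word order doing all the work – it captures exactly the mechanical character of the classroom exercise.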

It was not till I went to university that I realised that these terms – in particular, Subject and Predicate – derived from mediaeval Logic, which in turn traced its origin back to Aristotle (whom Dante called maestro di color che sanno – master of those that know) in the days of Classical Greece.

(Portraits: Socrates, Plato, Aristotle, Alexander the Great)

Aristotle is the third of the trio of great teachers who were pupils of their predecessors: he was a student of Plato, who was a student of Socrates. It is fitting that Aristotle’s most notable pupil was not a philosopher but a King: Alexander the Great, who had conquered much of the known world and created an empire that stretched from Macedonia to India by the time he was 30.

That transition (in a period of no more than 150 years) from Socrates to the conquest of the world, neatly embodies the impact of Classical Greek thought, which I would argue stems from the ‘Muybridge Moment’ when people began to realise the full potential of the idea of writing down speech. Socrates, notably, wrote nothing: his method was to hang around the market place and engage in conversation with whoever would listen; we know him largely through the writings of Plato, who uses him as a mouthpiece for his own ideas. Aristotle wrote a great deal, and what he wrote conquered the subsequent world of thought to an extent and for a length of time that puts Alexander in eclipse.

In the Middle Ages – a millennium and a half after his death – he was known simply as ‘The Philosopher’ and quoting his opinion sufficed to close any argument. Although the Renaissance was to a large extent a rejection of Aristotelian teaching as it had developed (and ossified) in the teachings of the Schoolmen, the ideas of Aristotle remain profoundly influential, and not just in the way I was taught grammar as a boy – the whole notion of taxonomy, classification by similarity and difference, genus and species – we owe to Aristotle, to say nothing of Logic itself, from which not only my grammar lessons but rational thought were derived.

I would argue strongly that the foundations of modern thought – generalisation, taxonomy, logic, reason itself – are all products of that ‘Muybridge Moment’ and are only made possible by the ability to ‘freeze’ language, then analyse it, that writing makes possible.

It is only when you begin to think of language as composed of individual words (itself a process of abstraction) and how those words relate to the world and to each other, that these foundations are laid. Though Aristotle makes most use of it, the discovery of the power of generalisation should really be credited to his teacher, Plato: for what else are Plato’s Ideas or Forms but general ideas, and indeed (though Plato did not see this) those ideas as embodied in words? Thus, the Platonic idea or form of ‘table’ is the word ‘table’ – effectively indestructible and eternal, since it is immaterial, apprehended by the intellect rather than the senses, standing indifferently for any or all particular instances of a table – it fulfils all the criteria*.

Which brings us to Socrates: what was his contribution? He taught Plato, of course; but I think there is also a neat symbolism in his famous response to being told that the Oracle at Delphi had declared him ‘the wisest man in Greece’ – ‘my only wisdom is that while these others (the Sophists) claim to know something, I know that I know nothing.’ As the herald of Plato and Aristotle, Socrates establishes the baseline, clears the ground, as it were: at this point, no-one knows anything; but the construction of the great edifice of modern knowledge in which we still live today was just about to begin.

However, what interests me most of all is what constituted ‘thinking’ before the ‘Muybridge Moment’, before the advent of writing – not least because, whatever it was, we had been doing it for very much longer than the mere two and a half millennia that we have made use of generalisation, taxonomy, logic and reason as the basis of our thought.

How did we manage without them? and might we learn something useful from that?

I think so.

*seeing that ideas are actually words also solves the problem Hume had, concerning general ideas: if ideas are derived from impressions, then is the general idea ‘triangle’ isosceles, equilateral, or scalene or some impossible combination of them all? – no, it is just the word ‘triangle’. Hume’s mistake was in supposing that an Idea was a ‘(faint) copy’ of an impression; actually, it stands for it, but does not resemble it.


The Exploration of Inner Space II : by way of metaphor


In a recent piece, prompted by Eliot’s line
‘Humankind cannot bear very much reality’
I suggested that we have constructed a carapace that protects us from Reality much as a spacesuit protects an astronaut or a bathysphere a deep-sea explorer.

This in itself is an instance of how metaphor works as a tool of thought, and I think it is worth examining. There is, as I have discussed elsewhere, a certain hostility to metaphor, and this should not surprise us, since metaphor – ‘seeing the similarity in dissimilars’, as Aristotle defines it – effectively violates at least two of the three so-called ‘Laws of Thought’ that underpin rational argument:

Identity – ‘A is A’ (metaphor asserts that A is B)
Contradiction – ‘A is not not-A’ (again, metaphor asserts that ‘something is what it is not’)
(The third law, Excluded Middle, states that where there are only two choices, there is no third possibility – so ‘A or not-A’. That may also be violated, but let’s not go into that now.)

Yet despite that – in fact, I would assert, because of it – metaphor is a key tool for thinking about the world and how we are situated in it.

There is no mystery to its mechanism, as I think can be illustrated from the particular case we are discussing. The essence of metaphor is ‘seeing as’ – considering the thing we are trying to understand in terms of something we already understand. In most cases, what we are invited to see is a set of relations – ‘x stands to y much as a stands to b.’ So, in this case, I say that we should think of ourselves standing in relation to Reality as someone who is protected by a carapace or intervening layer that comes between them and their surroundings.

This, of course, is to do no more than unpack what is already implied in Eliot’s line and to reinforce it by concrete imagery: we understand the importance of the spacesuit and the bathysphere, so we are being invited to see our experience (by which I mean ‘what it is like to be alive and conscious’) in terms of being surrounded by an environment from which we must protect ourselves by interposing some mediating layer since we cannot cope with prolonged exposure to it.

There will be people who view this sort of talk with some degree of hostility and scepticism, and it was to forestall them that I modified my earlier expression ‘thinking about the world and how we are situated in it’ to ‘our experience’ as a signal to step back from conventional terms which could be misleading. This is because we are not looking down a microscope here, at something (e.g. plant cells) whose place in a particular scheme of things is already agreed; we are taking a step back to where the ‘schemes of things’ are dreamed up in the first place, namely ‘inside the head’ (or inner space, if you like): we are operating in the realm of the imagination, attempting to disentangle problems of thought.

This highlights a difficulty inherent in philosophy, which someone once described as ‘a kind of thinking about thinking’: how do you get back to the starting point and avoid being ensnared by preconceived ideas? How do you use an existing way of thought to think about a different way of thinking? It is a kind of paradox. Wittgenstein touches on it in the Tractatus (6.54):
My propositions serve as elucidations in the following way: anyone who understands me eventually recognizes them as nonsensical, when he has used them – as steps – to climb up beyond them. (He must, so to speak, throw away the ladder after he has climbed up it.)

Descartes was trying to do the same thing in his Discourse, where he aimed to get back to some bedrock of which he could be certain, to use as a foundation on which to build a system of thought, and came up with his ‘cogito ergo sum’ (some twelve hundred years after Augustine had said much the same thing). It is in that wanting to be certain that Descartes goes wrong – in the territory where we are operating, nothing is certain, everything is provisional; the question is not ‘what can I be sure of?’ but rather ‘how can I see this?’

Thus (to return to the matter in hand, our metaphorical carapace) we proceed obliquely, by suggesting ‘ways of seeing it’ that coincide or seem complementary. It should be no surprise that the first is yet again drawn from poetry, since that is where metaphorical thinking is at home:

(Hendrick Avercamp, Winter Landscape (detail))

Suddenly I saw the cold and rook-delighting heaven
That seemed as though ice burned and was but the more ice,
And thereupon imagination and heart were driven
So wild that every casual thought of that and this
Vanished, and left but memories, that should be out of season
With the hot blood of youth, of love crossed long ago;
And I took all the blame out of all sense and reason,
Until I cried and trembled and rocked to and fro,
Riddled with light. Ah! when the ghost begins to quicken,
Confusion of the death-bed over, is it sent
Out naked on the roads, as the books say, and stricken
By the injustice of the skies for punishment?

That is WB Yeats’s poem, The Cold Heaven. As Seamus Heaney observes (in his brilliant essay ‘Joy or Night’ in The Redress of Poetry)

‘This is an extraordinarily vivid rendering of a spasm of consciousness, a moment of exposure to the total dimensions of what Wallace Stevens once called our ‘spiritual height and depth.’ The turbulence of the lines dramatizes a sudden apprehension that there is no hiding place, that the individual human life cannot be sheltered from the galactic cold. The spirit’s vulnerability, the mind’s awe at the infinite spaces and its bewilderment at the implacable inquisition which they represent – all of this is simultaneously present.’

I was strongly reminded of Yeats’s poem, particularly the lines

I took all the blame out of all sense and reason,
Until I cried and trembled and rocked to and fro,
Riddled with light.

when I came across a deeply moving account by a mother of life with her daughter. This is an extract – I urge you to read the whole piece here – a terrific piece of writing.

‘I have had to learn to do these things quietly because my daughter needs me to.  She is seven; bright, super funny, articulate, thoughtful and loving.  She also has autism spectrum disorder.  If you saw her on a good day, you’d maybe think she was a little shy and kooky.  You’d maybe wonder why I am letting her wear flip-flops in the winter rain.  You’ll never see her on a bad day as she can’t leave the house*.

She has severe sensory processing difficulties. A normal day exhausts her and when she feels overwhelmed, even a gentle voice trying to soothe her with loving words can be too much to process, making her feel crazy. She describes walking into a room of people as “like staring at the sun”. She’s incredibly empathetic but you may not realise as she feels her own and others’ emotions so deeply she can’t bear it, and so sometimes she has to just shut down.’

(that asterisk, by the way, links to this footnote:
‘*3 months of non-stop bad days and counting, not left the house since December 3rd 2014’ – the blog was written on 3 March)

I apologise for appropriating another person’s anguish to use as an illustration but I hope I do not do so lightly. I have my own experience of the pain that results when someone you love cannot cope with the world and I am increasingly convinced that a great deal of what we term ‘mental illness’ – particularly in the young – has to do with their difficulty in reconciling Reality (or Life, if you like) as they experience it with the version that those around them seem to accept – it is a learning difficulty or impairment; they just cannot get the hang of how they are ‘supposed to’ see things.

In fact, ‘supposed to’ is just the right idiom here, for the subtle nuances it has in English:

‘that’s not supposed to happen’
‘you’re not supposed to do that’
‘it’s supposed to do this’
‘because that’s what you’re supposed to do!’

– it conveys not only a divergence between how things are and how they are meant to be – the infinite capacity of life to surprise us, the inherent tendency of all plans to miscarry (‘the best laid schemes o’ mice an’ men gang aft agley’) – but also the tension between social constraint and the individual will: ‘you’re not supposed to do that!’ is what the child who has bought into the conventions early on (that would be me, I fear) squeals when his bolder companion transgresses (and that squeal is followed by an expectant hush during which the sky is supposed to fall in, but doesn’t).

The world is not as we suppose – or perhaps it would be better to say that it is ‘not as we pretend,’ since that brings out the puzzlement that many – perhaps all – children experience at some point, that the adult world is an elaborate pretence, a denial of the reality that is in front of their noses.

Here is Eliot again, from Murder in the Cathedral:

Man’s Life is a cheat and a disappointment;
All things are unreal,
Unreal or disappointing:
The Catherine wheel, the pantomime cat,
The prizes given at the children’s party,
The prize awarded for the English essay,
The scholar’s degree, the statesman’s decoration,
All things become less real.


The hollowness of achievement and the emptiness of success is a commonplace of adult writing, and it complements a central theme of much children’s writing: that the world is a marvellous and enchanting place full of magic and wonder (and terror) – but adults, as a general rule, cannot see it (which has just this instant reminded me of a favourite and curious book of my childhood, The Hick-boo**, about a creature only children could see – the adult exception being an artist).

And that is a hopeful note to end on, for now: that there may be a better way to mediate Reality than the conventional carapace, namely Art (in its most inclusive sense – painting, sculpture, poetry, storytelling, music, dance). That is something I shall come back to.

**to be exact, ‘The Hick-boo, a tale of a tailless transparent goblin’ by MH Stephen Smith (Hutchinson 1948).
