
The Bonfire of Responsibility


The thing about systems is that they are designed to work as a whole, each component interacting to produce the desired effect. To interfere with one part is to throw the whole out of kilter.

If it is your job to make hard decisions it is wise to consider and indeed consult the opinion of those who will be affected by them; but making the decision still remains your job, not theirs.

That goes to the heart of the awful slow-motion train-wreck that we in Britain are presently witnessing, where a government, shamefully aided and abetted by the leader of the Opposition, is in the process of railroading through both Houses of Parliament a bill which, given a free vote, those Houses would certainly reject.

At this point, we might expect the comic figure of Mr Jacob Rees-Mogg to pop up and start trumpeting about ‘the will of the British people’ and how it ‘must not be thwarted’.

A compounding factor in this disaster* is the inability of people like Mr Rees-Mogg to tell the truth. Each time he or anyone, in discussing the European referendum, utters the phrase ‘the will of the British people’, he should be gently stopped, and told to say instead ‘the will of a large minority of the electorate at a time when the majority did not vote to leave Europe (and those who will be most lastingly affected – the 16 and 17 year olds – were excluded from the process).’

I grant it is neither as catchy nor as resounding as ‘the will of the British people’ but it does have the advantage of being an accurate statement of the truth, which ‘the will of the British people’, in this context, is not (something that Mr Rees-Mogg and his like know perfectly well – hence their unwillingness to discuss the point).

But Jacob Rees-Mogg, like Mr Punch, is not easily suppressed. Up he pops again and tells us that the government agreed that it would be bound by the result of the referendum, so it is a matter of honour, of keeping one’s word, of honouring a pledge made to the British people (and so on, and so on…).

But it is none of those things: it is, on the contrary, a complete abnegation of responsibility – shirking, in plain terms. To begin at the beginning: a thing is either binding or it is not; if it is not, no amount of saying that it is will make it so. ‘Binding’ in this case means ‘having the force of law’ – in other words, you would be breaking the law to go against it.

As was made plain in the House of Commons Briefing Paper (no. 7212) that set out the scope and powers of the European referendum, ‘The UK does not have constitutional provisions which would require the results of a referendum to be implemented, unlike, for example, the Republic of Ireland’. To have such a binding referendum would require new legislation: Parliament would have to pass a law to make it so; that is how the system works.

It does not work by the government saying (as it has done here) ‘this does not have the force of law, but we will treat it as if it does.’ You cannot treat something as a law: it either is or isn’t.

The reasoning that underpins this is worth examining. While the laws of physics – gravity, for example – have actual force and cannot be defied, the laws of the land are conventions – they only have such force as we agree to allow them (which is why they have to be backed by sanctions with a police force and courts to enforce them).

This act of endowing the law with compelling force is really a transfer of responsibility, largely for practical purposes: it saves us making up our minds in every individual case if we have a rule that we agree to apply in all such cases. Naturally, we want to think carefully before transferring power to an order of words in this way, which is why we have a system of parliamentary scrutiny before any legislation is passed.

And this means that, where something is not the law, the responsibility for deciding what happens in that case must lie elsewhere. In the matter of the European referendum, that responsibility lies with parliament, which has a duty to take full cognisance of the result and act accordingly, in the best interests of the whole country, now and in the foreseeable future (that’s their job, what we elect them to do). Yes, I know – tedious, boring, grown-up. But this is not a game show.

To put it in terms that even Jacob Rees-Mogg can understand –

A harassed mother of seven children, at the end of her tether because they are all squabbling as it is raining and they were going to have a bonfire, says ‘Right! We’ll have a vote – whatever the majority of you want to do, that’s what we’re going to do, all right? Only no more squabbling!’

Two of the children (twins) gaze round-eyed but say nothing. Two vote to watch telly and have the bonfire another day. The remaining three vote to have the bonfire now, indoors, on the living-room carpet.

Hands up all those who think mum is obliged to start gathering combustible material on the carpet?

*and with the continuing rise of Marine Le Pen towards the presidency of France and the hitherto-unthinkable possibility that one of the two main foundations of the European Union will be removed (with others surely following), I grow fearful that it will be disastrous, not only for us, but for Europe and ultimately the world. I hope I am wrong.


Who’s got the idea?


Suppose you catch me at my usual philosophical musing, mooning about and muttering to myself. I chance to say aloud, ‘I wonder when people first developed the idea of language?’ Being a practical sort, you say ‘Come with me. I happen to have brought a time machine and no end of wizard gadgetry, so we can go and have a look.’

In less time than it takes to get there (because we are going backwards, of course) we are hovering, cloaked in invisibility, over the grassy plains of Africa. The dial says it is about 200,000 years ago and here is a group of Homo sapiens, our direct ancestors, all talking away.

You observe them for a time with your gadgets, making notes and taking measurements, and then say, ‘Well, these are not just grunts associated with exertion, or cries of alarm or excitement. There is rhythm and pattern there, and a clear sense of exchange, of going to-and-fro. This certainly looks like conversation, and if I feed the results into my analyser, I’m sure we’ll be able to say a bit about the grammatical rules they’re following and probably have a crack at the syntax and maybe even define a few of the words in their vocabulary. So I think you can say with some confidence that these ancestors of ours have the idea of language.’

But I am not so sure. I think that what you have demonstrated is that you have the idea of language. You are the one who has turned up with, so to speak, an annotated diagram, and been able to look at this new thing to see the points of resemblance it has and conclude that it belongs to the same class as other things you call ‘language’. You are the one who has brought your box already divided into labelled compartments, into which you can put the bits you call ‘grammar’, ‘syntax’ and ‘vocabulary’. And till these chaps on the plain start doing the same thing, I do not think you can say they have the idea of language: they may talk, though I think if you take off your spectacles of preconception, you will see that they do a great deal else – facial expression, gesture, bodily posture, movement; only you haven’t come equipped with the box to put those in.

Having the idea of something consists precisely of being able to do this kind of thing – identification, classification, analysis – in short, fitting into a pre-existing scheme (and having that scheme to start with). It’s the sort of scheme we can carry about ‘in our heads’, but don’t be fooled into thinking that any special merit attaches to this as a mental activity: that division is not important. We can think aloud, give voice to our words, or even think with a pen and paper, drawing diagrams and writing words. It just so happens that we have also learned the trick of forming words without speaking them aloud, and that is what we do, mostly, because it is convenient.

You might want to say that you are finding something that is there though the speakers are unaware of it – that their language is the first instance of the idea of language, which is something that transcends time and space, of which all our specific languages are mere instances – and that would be rather Platonic of you.

Which is why I would suggest that if you want to find when and where people first had the idea of language – indeed, the idea of ideas – you should set your time machine forward from the plains of Africa and head for classical Greece about two and a half thousand years ago, there to eavesdrop on Plato and his pupil, Aristotle.


The trees they grow high, the leaves they do grow green: out on a limb with Schopenhauer

Well now. Suppose a leaf comes to consciousness. Does it say, ‘I am a leaf’?

Looking around, does it say ‘I am one leaf among many’?

Does it reflect on the fact that the lot of a leaf is to flourish briefly, wither and die, while the tree just keeps on growing, putting out more leaves, generation after generation?

Does it think, ‘what a cruel irony to be conscious of being a small part of an otherwise blind and unconscious process’?

That, in effect, is Schopenhauer’s position: looking outward, I see the world, the objective world, as it is presented to me by my senses; looking inward, I know my will, my subjective self, and recognise it not as an individual, separate will but as a single tendril, as it were, of the blind will of the world to exist; hence the title of his major work, The World as Will and Representation.

But why should the leaf consider itself unique in being conscious? (It does not matter if it is a solipsistic leaf which supposes itself the sole conscious leaf on the tree, or one that considers all leaves to be similarly conscious.)

Why should it not suppose that, rather than being so singularly endowed, the consciousness it has might be shared by the tree?

Indeed, might it not be wiser (and certainly humbler) to suppose, rather than that the tree shares its consciousness, that it has a share of the tree’s consciousness, in accordance with its capacity as a leaf – which is in all probability only a fraction of the tree’s?


The Real Enemies of the People

‘This Bill requires a referendum to be held on the question of the UK’s continued membership of the European Union (EU) before the end of 2017. It does not contain any requirement for the UK Government to implement the results of the referendum, nor set a time limit by which a vote to leave the EU should be implemented. Instead, this is a type of referendum known as pre-legislative or consultative, which enables the electorate to voice an opinion which then influences the Government in its policy decisions.’

Commons Briefing Paper 7212, giving background on the European Union Referendum Bill

Results of the EU Referendum:

Remain: 16,141,241 (48.1%)
Leave: 17,410,742 (51.9%)
Total Electorate: 46,500,001
Turnout: 72.2%
Rejected Ballots: 25,359

Given that this is a consultative exercise, ‘which enables the electorate to voice an opinion which then influences the Government in its policy decisions’, what inferences can be safely drawn from the result regarding the opinion voiced by the electorate, and how should policy be guided by them?

First, the bare facts are these (a short calculation following the list shows how they are derived):

a minority of the electorate – 37.4% – favour leaving the EU;
a smaller minority – 34.7% – favour remaining;
a considerable majority – 62.6% – did not vote to leave (i.e. those who voted to remain plus those who did not vote)
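
For anyone who wishes to check, these percentages follow directly from the official totals quoted above. Here is a minimal sketch of the arithmetic, in Python, assuming nothing beyond those published figures:

    # Official totals from the EU referendum, as quoted above
    electorate = 46_500_001
    leave = 17_410_742
    remain = 16_141_241
    rejected = 25_359

    votes_cast = leave + remain + rejected  # 33,577,342 ballots in total
    print(f"Turnout:                {votes_cast / electorate:.1%}")            # 72.2%
    print(f"Leave (of votes cast):  {leave / (leave + remain):.1%}")           # 51.9%
    print(f"Leave (of electorate):  {leave / electorate:.1%}")                 # 37.4%
    print(f"Remain (of electorate): {remain / electorate:.1%}")                # 34.7%
    print(f"Did not vote to leave:  {(electorate - leave) / electorate:.1%}")  # 62.6%
    print(f"Did not vote at all:    {electorate - votes_cast:,}")              # 12,922,659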

What inferences can be drawn from this, with regard to influencing policy?

  1. The clearest inference is that the electorate does not speak with a single voice on this matter; on the contrary, it is deeply divided – the 52/48% split among those who voted reflects this.
  2. There is not an overall majority of the electorate in favour of leaving.
  3. No inference can be safely drawn about the views of those who did not vote; however, in the context of a decision that will affect the entire country, the fact of their number – 12.9 million – cannot be ignored.

Beyond these immediate inferences, some wider conclusions can be drawn. From inference (1) above, it is clear that there is no warrant for talking in terms of the ‘express will of the British people’. It is not only the voices of the 17.4 million who voted to leave that must be heeded, but also the 16.1 million who voted to remain and the 12.9 million who did not vote, for whatever reason. This is not a game show where the winner takes all: it is an instrument for shaping policy for the entire country.

It is evident that some of those who voted may have been under the misapprehension that the result of the referendum would be legally binding. However, anyone who had sufficient interest or was obliged by their position to inform themselves and others about the issue could have been in no doubt that the type and purpose of the referendum was as clearly stated in the Commons Briefing paper quoted at the head of this article, which was published on 3 June 2015 and freely available.

A numerous group of people, including all MPs and parliamentarians, News Editors, political journalists and public commentators, had either a sworn duty or a serious responsibility to acquaint themselves with the content of Commons Briefing Paper 7212 and therefore to know that the referendum was consultative and not legally binding.

It follows that anyone in that group who implied otherwise, by action or inaction, acted reprehensibly, mischievously, dishonestly and irresponsibly.

Much blame must attach to the previous Prime Minister, Mr Cameron, whose conduct in this matter can only be described as reckless and irresponsible throughout, since he repeatedly used a matter of grave import to the whole country as a party-political tool.

His initial inclusion of the referendum as a manifesto promise appears to have been intended primarily to stem the haemorrhaging of Tory support to UKIP, and there are strong grounds for supposing that he did not expect to have to implement it: he did not think he would be elected outright, and assumed that the pledge would be jettisoned as part of any coalition deal.

His failure on taking office to make clear the status of the referendum is reprehensible and negligent. He then aggravated matters by embarking on a process of renegotiation with the EU prior to the referendum. This was completely wrong-headed and appears once more to have been motivated by his own political situation. It is evident that he hoped to use the threat of the UK’s possibly voting to leave as a means of pressing the EU for concessions which he hoped would sway the referendum outcome in his favour, i.e. a vote to remain.

However, since the express purpose of the referendum was ‘consultative, [to enable] the electorate to voice an opinion which then influences the Government in its policy decisions’ it is clear that its proper use should have been to form the basis of any renegotiation of membership – that is the very policy which it was intended to influence.

Had the actual result (i.e. a vote in favour of leaving) been put to its proper use, the Prime Minister would presently be engaged in renegotiation of our membership (and reform of the EU as a whole) in good faith but with a strong hand since the option of leaving would remain a possibility if the results were not to our satisfaction. It is hard to see that this would not be better, from everyone’s point of view – leavers and remainers alike – than the situation we now find ourselves in, having closed down our options and effectively resigned all influence by a premature (and unnecessary) commitment to leave.

For that, Mrs May is to blame. Mr Cameron’s abrupt departure (his final irresponsible act) may have pitched her into a situation that was more febrile than it need have been but she came in with a clean slate. The opportunity was there for her to show leadership but she has failed to take it.

She has never challenged what she knows to be the mistaken assumption that the referendum result commits the government to leave the EU. She could have done so and defied contradiction since every other MP, parliamentarian, News Editor and political journalist knows it as well as she. Instead she has confirmed the error by her frequent reiteration of the idiotic mantra ‘Brexit means Brexit’ and aggravated it by challenging the High Court’s decision that sovereignty of parliament cannot be circumvented in this matter – something which she and all these others know very well.

It should not have been left to the courage of a private citizen to have the courts reaffirm what every parliamentarian not only knows, but has a duty to uphold, namely the sovereignty of parliament. They should have been foremost in asserting it, not shying away and attempting to deny it.

The response of certain newspapers to the Court ruling was disgraceful. The fact that we expect little better from the British press does not exonerate the editors from blame. They know that they have been instrumental, from the outset, in encouraging their readers in the false belief that the referendum binds the government to a course of action, whereas – as they know perfectly well – ‘It does not contain any requirement for the UK Government to implement the results of the referendum, nor set a time limit by which a vote to leave the EU should be implemented.’

We are now in the ridiculous and entirely avoidable situation where a large minority of the populace believe erroneously that they have been given (or won by their vote) the right to compel the government to do their will and take the United Kingdom out of the European Union. This misapprehension has now been suffered to continue uncorrected so long, and indeed been reinforced by the ill-judged actions of such a number of people, that any attempt to remedy it will probably result in considerable civil strife and violence, since it will be seen as the ‘Establishment’ attempting to thwart the will of the people and deprive them of what is rightfully theirs.

Who is going to have the courage to stand up and state the facts, and defy anyone to contradict them?

Here they are, once more:

The referendum was consultative and did not bind the government to any course of action.
It was intended to ascertain the voice of the people, in order to influence policy.
It is evident from the result that the people have not spoken with a single voice and do not have a settled will in this matter. The nation is divided.
There is no majority in favour of leaving the EU. A large minority wish to do so; a similar but slightly smaller minority wish to remain. A further group, nearly 13 million people, did not express a view.
The majority of the electorate did not vote in favour of leaving.

The situation, though dire, is not irrecoverable. Probably the most honest course would be to admit that almost everyone concerned, across all parties and on both sides of the debate, has created an unnecessary and dangerous mess, call a general election, and let the people decide.

In the meantime, there is an onus on those who have contributed to the creation of this dangerous situation to do their best to defuse it, by speaking calmly and honestly and confining themselves to the facts. A collective act of contrition on their part would be a good beginning. They have deceived the people and endangered the country.


The Games We Play

Things we do early are of great interest. Where we show a natural propensity to do something – if we seem, so to speak, programmed to do it – the inference is that the behaviour is both ancient (to have become so ingrained) and important (to have persisted so long). The obvious example here is speech, though I think ‘expression’ would be a more accurate description of that instinct.

A less obvious one is playing games. Not all play amounts to a game, but there is a large area of overlap; the game, if you like, is the complete expression of play. We shall come to a definition of ‘game’ presently.

There is a considerable obstacle to seeing game-playing as an activity of fundamental importance: though tolerated, even encouraged to some degree in children, in adult life it is permitted only as ‘recreation or sport’. Notwithstanding the fact that it can be pursued professionally (where it becomes a form of entertainment for a mass audience), it is regarded as essentially pointless, frivolous, unserious. In adult form it tends to be codified and regulated and exhibits great variety, from field sports such as the various forms of football, cricket and golf to board games such as chess and draughts and nowadays that interesting development, computer games.

(Indeed, such is the variety of the things we call ‘games’ that Wittgenstein doubted if they were capable of a common definition in terms of characteristics which they all shared, and suggested instead that they might resemble one another as members of a family do, some alike in this respect, others in that – a gentle grenade lobbed into an apple-cart that philosophers had been trundling since Aristotle’s day, the notion that whatever had the same name shared the same essence or set of defining characteristics that made it what it was and distinguished it from what it was not; but that is by the way.)

I think that games are at their most interesting and revealing in their primitive form, when they are nearest to being a natural, instinctive behaviour, and there I think we can identify common characteristics, not so much in the content of the game as in the behaviour it involves.

Although games are regarded as frivolous, idle, fundamentally unserious, ‘just for fun’, the most striking thing about children is how seriously they play: what they do is done in earnest, and adults who join in but do not show the appropriate commitment will be reproved for ‘not playing properly.’

This earnestness shows chiefly in the degree of absorption in the activity: the child will be described as ‘in a world of his own’, ‘lost in the game’ and so on. The utter solemnity with which children conduct themselves in play can strike adults as amusing (though it can also wring the heart) and it is worth asking why that should be so. The answers are often given in terms of a contrast with ‘real life’ – the child can (briefly) enjoy the pleasures of play, but all too soon he will learn that ‘life isn’t like that’ and the time will come, as St Paul has it, ‘to put away childish things’.

We will come back to the relation between the world of play and ‘real life’ presently. In the meantime, I would like to consider what I think are three fundamental characteristics of playing games as a natural or instinctive activity. All three are related, and could indeed be seen as different aspects of the same thing, but I will separate them for ease of consideration.

One is that the game occupies its own space, not just physically, but as a plane of existence. There may well be actual boundaries – ‘a field of play’, if you like – but these are the embodiment of an idea, the idea that ‘in the game’ identifies a space or plane of existence where things happen differently than they do outside, or ‘not in the game’.

Within this space, objects and actions are invested with a significance which they do not have elsewhere. For instance, when children play indoors, they will often commandeer the furniture to play some part in their game: a line of kitchen chairs can be a train, for instance; the space under the table, a cave; a rug can be a raft on the carpet sea, an armchair a ship, and so on. The child is perfectly able to distinguish between what any of these objects is in itself and what it is ‘in the game’ – there is no confusion or delusion, a point to which we shall return.

The final feature, alongside having its own space and investing objects or actions in that space with significance, is the idea of giving oneself a rule to follow. The game consists in doing things in a particular way. The thing to grasp here is that this rule is self-imposed, which is a kind of paradox: you are free to do it any way you like, but you act as if you have to do it this particular way. There is no concept of cheating in this primitive stage, for the simple reason that the game is to follow the rule; ‘winning’ (insofar as the concept can be applied) is following the rule successfully; not following the rule is, literally, ‘not playing the game’. (Again, adults who join in and make a false move will be told ‘you can’t do that’).

Trying but failing to follow the rule may be attended by penalties, but again these are self-imposed, and operate ‘in the game’. So, bears will eat you if you step on the cracks in the pavement; if you fall in the carpet sea, or down the chasm that you were trying to leap across, you will die – but only in the game; and in the game you may have several lives, which permit you to start again. (If the rule proves either too difficult or too easy, it will be adjusted, which again shows an implicit understanding of the dual worlds of ‘in the game’ and outside it, and the dual role that implies – the child is both the game-maker and the game-player.)

The language that is used is interesting. In the case of a game that evolves spontaneously, a group of children may be milling around, each doing his own thing, but as their activities begin to converge, someone might say ‘let’s make it that you have to –’ and will add some activity that then becomes the game. ‘Let’s make it’ casts the players in the role of legislators, defining what has to be done; ‘that you have to’ brings out the sense of agreeing to be bound by the self-imposed rule.

‘Acting as if’ goes to the heart of playing games and it is a concept worth examining in detail because it sheds light on our curious reluctance to accept this ancient instinctive behaviour as serious and important, our insistence on classing it as frivolous.

The adult observer of a child at play may say things like ‘it’s as if she’s in another world’ or ‘it’s as if he really believes he’s driving a bus – he’ll talk to the passengers, take their fares, then drive off to the next stop.’ We may picture the child as being assisted in this by various props – the sofa may be the bus, with the driver’s seat a kitchen chair at one end; the passengers might be various toys.

‘As if’ carries an implication of pretence, that what is deemed to be happening is not actually happening, is not real. It is worth examining the viewpoints involved in this, though it can become difficult, because we may find language working against us.

The first thing to say is that the judgement about what is real matters much more to the adult than it does to the child. Some adults worry themselves quite seriously about the status of ‘imaginary friends’ because they seem so vivid to their child; they may engage in conversations to lead them to the view that Mr Wotsit ‘isn’t really there (like Mummy and Daddy are)’, that he is ‘just pretend’ and ‘just in the game’ – to which the child will probably assent quite happily, if only to reassure their anxious parent.

Play does not involve delusion – believing something to be other than it is – and I think the difference is easily demonstrated; but the root of the problem is that the adult’s conceptual framework lacks the flexibility to describe the child’s behaviour accurately.

If we consider the theatre – one area where ‘make-believe’ is allowed in adult life – at no time in the performance of Hamlet does the audience actually believe itself to be in the royal court of Denmark, nor that David Tennant (or better still Maxine Peake) has ceased to exist and has become, for the time, the eponymous melancholy Dane; nor for that matter does Peake or Tennant think this either. Notwithstanding, people will say things like ‘Maxine Peake is Hamlet’ and ‘for three hours, we were transported to Elsinore’ – but these are just attempts to express the power of the performance, not factual descriptions.

There is a type of confidence trick that employs a similar set-up to the theatre – there is a set to be dressed (perhaps a vacant country house), a cast of players (in character, perhaps in costume) and action (a party, perhaps, where the rich and famous discuss matters of high finance and good investments). The difference is that (for the con to work) the intended audience – the ‘mark’ – must take what he sees at face value, must believe it genuine; in other words, he must be deluded, in a way that the theatre audience is not.

It is worth examining the two different kinds of belief we encounter here. There is ‘believing something to be the case’, which we encounter in the con: the mark believes the party etc. to be genuine, while in reality it is a set-up. That defines ‘delusion’ – believing something to be other than it is.

But the child does not believe himself to be a bus-driver in this sense, nor his toys to be passengers, any more than David Tennant or Maxine Peake believe themselves to be Hamlet. People will talk of ‘belief’ and ‘believing’ here, but it is in a subtly different sense.

Wittgenstein somewhere observes, in discussing scepticism, that if you want to know what a man believes, you should observe what he does, rather than heed what he says – he may profess doubt as to the reality of the world of appearance, yet he will still sit in chairs without a qualm, cross floors without fear of plummeting into some abyss, go through doors in the expectation of finding himself in the next room and so on; in other words, he behaves as if the world exists, even if he claims to doubt it.

‘Behaving as if’ is at the heart of play whether it is playing Hamlet or playing at being a bus driver. It is the sincerity of Maxine Peake’s performance that brings Hamlet to life, just as it is the earnestness of the child’s play that makes the adult say ‘he really believes he’s a bus-driver.’ Peake’s performance is done with conscious skill, though I think it draws on the same natural instinct that the child demonstrates; the one is a studied and refined version of the other. It is worth considering them side by side.

As I have remarked elsewhere, there is a deep-rooted ambivalence in our attitude to Art in almost all its forms, illustrated by our use of the same language to describe telling stories and telling lies. Art seems indistinguishable from lying, since both involve representing something as other than it is. It is a problem that has troubled philosophers since Plato’s day; and indeed it surfaces in Hamlet, in respect of the counterfeiting of emotions that acting seems to involve:

Is it not monstrous that this player here,
But in a fiction, in a dream of passion,
Could force his soul so to his own conceit
That from her working all his visage wann’d,
Tears in his eyes, distraction in’s aspect,
A broken voice, and his whole function suiting
With forms to his conceit? and all for nothing!
For Hecuba!
What’s Hecuba to him, or he to Hecuba,
That he should weep for her?

No-one would accuse the child of insincerity or counterfeiting when he is playing at being a bus-driver, yet the same child may well counterfeit emotions if he thinks he can get his own way by it, so the distinction exists even at that level – and it shows our confusion that such behaviour – crying to get sweets, say – may well be described by the parent as ‘just play-acting,’ meaning that it is insincere.

How is this puzzle to be resolved?

I think we need to consider two things that we have already touched on: our notion of being ‘in the game’, that the game is on a separate plane of existence, and the nature of belief. In the theatre, as in the bus-driver game, the planes of existence are separate and distinct. What happens on the stage may occupy the same space as the set, with Tennant or Peake and the rest of the cast inside their costumes, but the action of Hamlet is understood to happen in Elsinore and to involve Hamlet, Horatio and the rest, not the actors who play them. Likewise the bus-game occupies the same space as the living room furniture, but in a different plane, where the sofa is a bus and the assortment of soft toys are passengers.

The con, on the other hand, whether it is counterfeit tears for sweets or a grand scam involving a dressed set and a cast of players, succeeds when it is thought to be taking place in the same plane as the observer: i.e. he thinks this is an actual country house party, that this is genuine grief and real tears.

There is an implication here that is not immediately obvious but is profoundly important: the planes of existence are equal. Whether they are equally real or equally fictitious is unimportant, no more than a manner of speaking. To put it another way: there is not one game being played here, but two; but one of the games – ‘the game of Real Life’ – is accorded special status. If you want to be the only game in town, you dissociate yourself from all the rest and either ban them or banish them to some lowly status. So it is part of the Game of Life that it is not regarded as a game, and that the concepts of ‘being real’ and ‘existing’ are restricted exclusively to it and are not allowed in any other game.

So the ‘bus’ is really a sofa, the ‘passengers’ are really soft toys and the ‘bus-driver’ is really a wee boy called Hamish who is nearly 3. It is at this point that Language becomes a serious obstacle, for the good reason that Language is instrumental in giving the Game of Life its special status. However, let us make the attempt.

The argument I am trying to construct can be illustrated again with reference to Hamlet, where there is at one point a play within the play. In regarding Hamish at his bus-driver game, we think ourselves like the audience in the theatre, and his game the action on stage; but I am saying that we are actually the players on stage, and his game the play within the play. The question then becomes what is the third thing, the reality in which these two games are played, the equivalent (in our illustration) of the theatre?

We can come at it by an oblique route. As I have discussed elsewhere, the distinction we customarily make between ‘imaginary’ and ‘real’ is that favourite term of the philosophy student, a false dichotomy. Most of what we consider ‘real’ are works of the imagination. If you look around you, how much do you see that does not owe its existence to having passed through the human imagination? I can see trees and a hill that might be exempted, though the trees are still where someone chose to put them (and may even have been bred by some human effort) while the hill has certainly been shaped by human thought. But the houses, indeed the whole city in between, with its infrastructure of roads and railways, water supply and drains, all that – real as it is – was first an idea in some human mind.

The succinct way of putting this is that you can have the plans without the house, but not the house without the plans. What we call ‘reality’ is in most cases a degree of embodiment: the architect’s vision, the detailed plans he draws and the completed building are different versions of the same thing.

We have imposed our imagination on this planet to an extraordinary degree: quite apart from the physical embodiment of our ideas in various forms, there is the map we have overlaid on the planet, dividing it into various territories, and within those territories a highly complex structure of custom, law, industry and so on. While the general inclination might be to suppose that the child playing with model figures on the landscape of the floor is imitating these larger entities – ‘the real world’ – I would suggest that the reverse is nearer the truth: the great world we have built around us is a development of the same imaginary powers that have their first expression in creating other worlds on the floor, in the living room or in the garden. The difference is in scale and degree of realisation, not in kind.

The ‘third thing’, our actual situation, the default position if you like, is immediate experience. Again, language is an obstacle here, because it mediates experience, interposing a picture of objective reality and ourselves as detached observers who exist in that world objectively, as individuals. But our immediate experience is subjective and involved: we find ourselves here (wherever ‘here’ is and whatever ‘ourselves’ are) and the rest is invention – a word which, neatly, can mean both ‘making what was not there before’ and ‘discovering’.

On the matter of belief, we need to distinguish between the common but rather narrow sense of ‘believing something to be the case’ and the wider sense of ‘having confidence or trust in’. The injunction ‘do it with belief’ (which we hear as advice to performers of various sorts – singers, actors, even footballers) relates to the second kind of belief, not the first. To do it ‘as if you really believe in it’ is not an injunction to counterfeiting and hypocrisy, which it seems to be if we understand ‘belief’ in the first sense, since ‘as if’ implies that we do not actually believe it to be the case; rather it is to do it seriously and earnestly, to recapture the commitment of the child to his game, to do it properly, as if it matters.

And that is the secret: by doing it as if it matters, we make it matter: we invest it with importance and meaning. Ritual has no intrinsic value or meaning; that is conferred on it by our performing it meaningfully, seriously, as if it matters.

This sheds an interesting sidelight on Existentialism, which was first explained to me in Sixth Year, when we were reading Camus’ L’Etranger: ‘Life has no meaning save what we bring to it,’ we were told. It seemed to me then a grim and depressing philosophy; now I find it a hopeful one. I still think L’Etranger a bit grim, but Beckett makes me laugh and gives me hope. I rather like the idea that our participation is necessary to give life meaning, that we invest life with meaning by living it earnestly, wholeheartedly, as if it matters – by becoming, indeed, like little children.


An Age without a Name, 2: Progress or Digression?

Myths are stories we tell to explain how we see ourselves and our place in the world. One of the dominant myths of the current age is that of progress, which sees the human story as one of continual improvement over time, with that tendency accelerating in recent centuries, particularly the last. (It is worth reminding ourselves that technically ‘progress’ is a neutral descriptive term: it simply means to go forward, or go on; and since that is something we have little choice but to do, you could argue that the positive charge we have given ‘progress’ is a case of making a virtue of necessity).

An illustration of human progress might look like this, presented in the style of a contour profile:

[Diagram: a contour-profile illustration of human progress, marking the points C, D and 1–3 discussed below]

C is the beginning of history, which starts with the possibility of written records, some 5,500 years ago – a date that is much the same as our invention of metallurgy. D is the start of the Classical Period in Greece, some 2,500 years ago. The dip about a thousand years later is the Fall of Rome, the beginning of the Dark Ages, though the dotted line reminds us that the Dark Ages were a local phenomenon – the level of civilisation attained in Classical Greece continued in the Eastern Roman Empire and was maintained by the Golden Age of Islam, while Western Europe was in the darkness of ignorance.

Point No. 1 at the right is the beginning of the agrarian revolution some three centuries ago, driven very much by notions of ‘improvement’ in agriculture, land management and animal husbandry as age-old practices were superseded by a modern, rational approach born of the Enlightenment.

Point No. 2, some two centuries ago, is the beginning of the Industrial Revolution, which transformed society, starting in Britain and Western Europe, and spreading worldwide.

Point No. 3 is simply the start of the twentieth century, which has eclipsed all others in terms of technical progress and has largely shaped what we consider the modern world. The gradient here should undoubtedly be much steeper: a century that started without heavier-than-air flight, with much sea-cargo still carried by sail, with few motor cars, with newspapers the medium of mass communication, cinema and the telephone in their infancy and gas lighting the norm, was transformed into the extraordinary world of space travel, nuclear power, the world-wide web, social media, mobile phones and all the rest.

Only the right-hand end of my diagram sounds an ominous note, one touched on in the previous article on the Anthropocene, namely the fear that we may be headed for disaster, a precipitous fall as human impact on the environment – particularly biodiversity and climate change – threatens not only our way of life, but all life on the planet.

This is where the limitations of diagrams like the one above become evident: what is the alternative to continued upward progress? The problem is that even to slacken the rate of ascent looks like abandoning the course that has taken us so far so rapidly; to flatten out looks like stagnation, and anything else is pessimistic decline.

Perhaps the time has come to try another map. I would suggest this one:

[Diagram: a plan view of human progress alongside the rest of life on Earth, spanning 60,000 years and marking points A–D]

The first thing to note is that the scale here is very different: the span from left to right is 60,000 years. The second thing is that this is not a contour profile, but an aerial plan, much like a conventional map. The blue line is human progress; the green line running parallel to it is the generality of life on earth.

Point B, some 10,000 years ago, is the beginning of a significant divergence between the two lines: it marks the point where we began to live in a new way, in fixed settlements supported by agriculture, instead of the nomadic hunter-gatherer way of life we had followed since the dawn of humanity. This is the beginning of civilisation, and has been suggested as a suitable start for the Anthropocene, the Epoch of Human influence, that was the subject of the previous article. Yet it is worth recalling that we are still in prehistoric times and indeed the Stone Age – we have to get to point C, which also appears on the first diagram, to arrive at the invention of writing – and so the beginning of history – and the discovery of metallurgy, between five and six thousand years ago.

Point D also appears on the first diagram: it marks the start of the classical period in Greece, the point in time when the invention of writing really began to have an impact on human expression. For its first few millennia writing was a useful method of storage, akin to the dehydration of food: it allowed unmemorable but useful information to be preserved. Its role in the transmission of culture – all the things people regard as sufficiently important to pass on to succeeding generations – was minimal. It took about a thousand years from its first invention for anyone to use writing for something that we might call literature.

This tardiness in realising its potential in this respect can best be understood with reference to the point marked A, some 40,000 years ago. This is the date of some cave-paintings, sculpture and musical instruments that we have discovered. It does not, of course, mark a beginning, but rather a continuity – we have every reason to suppose that the aesthetic impulse, the human urge to give external expression to our feelings, dates much further back than that – singing, dancing, storytelling leave no lasting mark on the environment, but we know that, even today, they are human activities strongly associated with gathering round a fire – and current estimates suggest that the controlled use of fire by humankind dates back at least 400,000 years.

The implication of this is that we had evolved distinctive means of transmitting our culture effectively, means that did not involve writing but did involve aesthetic expression, by 40,000 years ago and quite probably ten times longer ago than that. The continuance of our race in itself attests that humans were able to transmit their culture effectively for tens and indeed hundreds of thousands of years before the invention of writing; and the practice of cave-painting seems to have died out about 10,000 years ago, though it survived longer in some places. In other words, we were cave-painters for two or even three times as long as we consider ourselves to have been civilised. And of course all these means – art, music, poetry, storytelling, dance, theatre – still play a central role in transmitting culture even today – they have never died out, though the conditions under which they operate have altered drastically.

Where that alteration begins is shown on the second diagram at D, which marks the point where we began thinking and looking at the world in a different way. That is why I have shown it as a right-angle digression from the course which we had followed from time immemorial, a course that till the advent of civilisation some ten thousand years ago, ran in parallel and in harmony with the rest of life on Earth. That is a supposition, but an entirely reasonable one: we are one among many forms of life on earth, and for most of our time here, we have lived interdependently with nature, relying on its bounty for survival, but also conforming our way of life to its demands, just as every other form of life on Earth has had to do.

I would argue that, rather than adopting the idea of the Anthropocene discussed in the previous article – which finds evidence of human influence in the environment – we should look instead at the points where we ourselves changed our relationship, our attitude, to the environment. While the first of these is arguably our adoption of agriculture, of far greater significance is the change that began some two and a half thousand years ago in Classical Greece. I would say that is the beginning of the Age of Language, which I would contrast with the preceding Age of Expression, which stretches back to the beginning of humanity.

I maintain that Language as we know it, and the way of thinking it makes possible, is of relatively recent origin, the accidental result of the invention of writing and its impact on human expression generally and on speech in particular. That impact could be described as the disintegration of expression and the isolation and elevation of speech to an eminence it had not previously enjoyed.

Prior to the invention of writing, I would argue that human expression was broader in range and integral in character – speech was one mode among many, not the most important, and it was not regarded as distinct from facial expression, gesture, and bodily posture as immediate physical modes of expression, nor were these distinguished from more developed modes of expression such as song, dance, music, painting, sculpture, storytelling, poetry or ritual behaviour combining all or any of these. The Age of Language – our current age – is characterised by the intellectual apprehension of the world through the medium of language, specifically words, and could be described as rational, objective and detached; in the former Age of Expression, people responded through their feelings to the world made known by the senses, and those feelings found expression in the range of modes noted above; it could be described as intuitive, subjective and emotionally engaged.

One way to put this is to say that Writing pulls down the edifice of Expression that has stood since time immemorial, but drags Speech out of the wreckage, and the two set up in a new (but unequal) relation (a notion examined here in fable form: Plucked from the Chorus line).

I will lay out the detail of how this revolution was effected in a third article, and will also discuss the different principles or mechanisms by which thought operated in the Age of Expression and our current Age of Language. For the present, I would like to conclude by explaining how I can presume to make claims about how people thought in a different age of the world. My case rests wholly on what is demonstrated by point A on the diagram above, namely the great age of the aesthetic impulse, which is not merely ancient, but primal – and still very much survives.

A key aspect of my theory is that although Language and the characteristic way of thought that goes with it dominate our Age, they do so much as a conquering power rules a country it has colonised: though the old regime is overthrown, and the new one brings in new laws and customs, the old way of doing things does not disappear, but persists in new guises, often in the face of official disapproval, and subject to official control and authority. Everything that we now term Art, in its broadest sense – not simply painting and sculpture, but music, dance, theatre, poetry, storytelling – is a survival of the Age of Expression and works on the same intuitive, subjective and feelings-based principles. These two elements are in tension because one (Language, Reason) claims the whole territory of thought and judgement for itself, yet the other – Art – seems better able to express what people feel is most important to them.

(It might be thought that I am here rehashing CP Snow’s ‘Two Cultures’ argument, but although there are superficial similarities, the differences are profound and fundamental. Snow’s argument is essentially one about the content of English education – put crudely, that it gives too much weight to the Humanities, in particular the Classics, over the Sciences; he cites other systems (e.g. the German) that have a better balance. The argument that I am putting forward here is not about the content but the basis of Education (by which I mean all ‘Western’ Education) – namely that it is, fundamentally, Platonic – by which I mean that it disparages the senses, devalues feelings and vaunts the intellect and language as affording the only ‘correct’ perspective of the world – in effect, substituting an intellectual construct for the Reality that we all experience.)


An Age without a Name, 1: adopting the Anthropocene

‘Sound, sound the clarion, fill the fife!
throughout the sensual world proclaim
one crowded hour of glorious life
is worth an age without a name’

You may have your doubts about the sentiment – a bit juvenile for my taste, but then I am no longer young – but the curious fact is that we currently live in an Age without a name.


The previous Age, the Late Pleistocene, lasted some 120,000 years – give or take a few thousand – and its end, some 11,700 years ago, also marked the end of the Pleistocene Epoch, which lasted some 2.5 million years.

In Geochronology, an Age is the smallest unit; next comes the Epoch, then Period, Era, and ultimately Eon. Their length is not precisely defined: Ages can span millions of years, Epochs tens of millions, Periods up to a hundred million, Eras several hundred million, Eons half a billion years or more (I presume the short-scale billion, 10⁹, is meant, rather than the long-scale 10¹²). The present Epoch, the Holocene, at a mere 11,700 years, is thus barely under way, so perhaps it is no surprise that its first Age has not been named yet.

Stripped of their Greek, the impressive-sounding names are rather dull. (As a child, coloured depictions of layers of rock coupled with the name led me to confuse ‘Pleistocene’ and ‘Plasticine’). Holocene – ‘wholly new’ – effectively means ‘recent’; Pleistocene (a touch confusingly) is ‘most new’ or ‘newest’ and succeeded the Pl(e)iocene, the ‘newer’ – from which I gather that they started naming from the oldest first, then had to squeeze in various distinctions as they reached more recent geological times.

For all its short existence, there is a body of thought that suggests that the Holocene should be superseded by a new Epoch, the Anthropocene, defined as the period when human activities started to have a significant impact on the Earth’s geology and ecosystems.

(The etymology requires some explanation – the ‘-cene’ ending is common to all the Epochs that make up the Cenozoic Era and means ‘New’. Cenozoic means ‘New Life’ and marks the period, beginning some 66 million years ago, when mammals superseded reptiles as the dominant form of life on Earth. The ‘Anthropocene’, then, is the ‘New Human’ Epoch – suggesting dominance not by a class of animals, but by a single species – our own.)

Where the Anthropocene should start is a matter for illuminating discussion. Some, following the standard geological model of an impact left on the rocks of the Earth itself, would have it start as recently as seventy-odd years ago, when the first use of nuclear weapons left a signature that will remain legible for as long as the Earth lasts. Others, taking their cue from human impact on ecosystems, point to the Industrial Revolution, begun between two and three hundred years ago, or perhaps the Agrarian revolution that immediately preceded it and made it possible; while others trace a line all the way back to the beginnings of civilisation, around ten thousand years ago, when our species made the fundamental shift from being nomadic hunter-gatherers to living in settled communities supported by agriculture, which left its mark on the Earth not only in the form of our fields and settlements, but also in a rapid expansion of the human population.

While that last start point would effectively rub out the Holocene altogether – or reduce it to a mere 1,700 years, the span from the end of the last Ice Age to the beginning of ‘civilisation’ – even the most recent option, dating it from the first nuclear explosions, would still leave it as little more than a blip on the Geological time scale.

The argument for the Anthropocene is interesting and shows a significant shift in thought. Had the Victorians – who are largely responsible for the geochronology we use today – chosen to call the latest Epoch after our own species, it would be seen as an expression of Human triumphalism; this was, after all, the time when the advent of steam power and industry had seen a small nation on the fringes of Europe establish an Empire which by 1922 held sway over one-fifth of the world’s population and one quarter of its territory.

Now, however, the urge to characterise the latest Epoch as one shaped by the human race is a warning rather than a boast: it is driven by concerns over the negative impact of our activity on biodiversity and climate change. And this is a significant shift in attitude. The motto of the Victorian geologists was ‘the present is the key to the past’, which contradicted the prevailing catastrophist view that Earth’s geology had been shaped, in a relatively short span of time, by a series of violent, widespread events, such as floods. The uniformitarian or gradualist school argued that far slower-acting processes, still in operation today – such as erosion – were the main shaping influences, so that the age of the earth must be far greater than had been previously calculated.

(I think it important to add here that no-one ever believed that the world was other than very ancient; what they had not done was quantify what being very ancient amounted to. The 5,646 years proposed by Archbishop Ussher in 1642 as the span from the moment of Creation to the present would have seemed as unimaginably distant in his day as the 4.53 billion years currently estimated to be the age of our planet does to us; it is only comparison that makes one seem absurdly short. And it is probably true to say that we have very little sense of time, for all our skill in measuring it – what, for instance, does half an hour feel like? Does it always feel the same?)

The Victorians felt secure in their position as detached observers, reading the Great Book of Nature with rational objectivity, a tradition inherited from the Greeks and reflected in their choice of Greek nomenclature for naming the Ages, Epochs, Periods, Eras and Eons of the Earth; but what has now been brought home to us is the realisation that the observers are themselves the key agents of change in what they observe, and the question that now exercises our minds is not how far back the process began, but where and how it might end; and bound up with that is another, which is ‘how should we act?’

Adopting the Anthropocene as a label for our age is a signal that detached observation is no longer a tenable position: we cannot be content to stand by and watch. The time is ripe, I think, to consider a fresh way of looking at ourselves in relation to the world; I will consider what that might be in a separate article.
