You don’t need a weatherman to know which way the wind blows

On Wednesday, to Housman’s bookshop in King’s Cross, a place I am making a habit of, to hear Jeff Laster talk about the Weathermen, the radical 60s student underground movement in America of which he had been part as a teenager (though there’s underground and underground – “I wasn’t in the proper Underground, I wasn’t considered radical enough”). The session started, as many good things do, with a song to which I knew all the words, and turned into a propah seminar.

I couldn’t stay for the whole evening owing to personal flim-flammery, which was a great shame as he was a highly engaging speaker, reflecting usefully on the differences between British and American radicalism, then and now. Short version: Europe has a tradition of full-on Communist-stylee radicalism, America doesn’t and didn’t, and the Weathermen were (this is my take on the basis of what he said, not his) in their early days much closer to what I would consider a modern social liberal democratic tradition than to balls-out left-wing radicalism. Laster started the discussion by reflecting on participatory democracy, an idea which the Weathermen avowed in their early days and only jettisoned later on in favour of strict hierarchy as their movement grew, and as external events (the Kennedy assassination, Vietnam) sharpened minds and raised stakes. “I took orders,” Laster drawled, and it was at this point, when he was getting with marked enthusiasm into a description of the kind of free love-related orders he had taken, that I had to leave.

My sole contribution to the unfolding discussion was to answer his first question to the audience (yes, I was that child at school), which was along the lines of “Can any political movement have genuine participatory democracy?” I said that if any movement was successful and started to grow, decision-making processes would become unwieldy. He ran with the idea and talked about the different perspectives that started to emerge in the Weathermen movement, and the arguments they had about what kind of causes they should fight for – should it be the working class as a whole, civil rights, the draft?

In the event it was civil rights and the draft, and not the working class as a whole. It is difficult to talk about the working class as a homogeneous body in 1960s America because of the profound racial divides. But it is also, conveniently, true that civil rights and the draft were naturally close to the hearts of the mostly middle class college kids who formed the heart of the movement. Laster was refreshingly – or appallingly, depending on your viewpoint, and at least one member of the audience did seem fairly appalled – honest about his take on the working class. The relatives he knew who belonged in that category were to his ears racist, retrograde, right-wing in their opinions, and he decided he wanted nothing to do with them. It is the kind of thing you suspect many British Labour politicians over the years have thought but not (unless accidentally captured on a still-recording mike) said.

But what I had in mind when I mentioned the unwieldiness of burgeoning political movements was not so much a multiplication of views as a multiplication of people, and I don’t think they are necessarily the same thing. A classic anthropology article which made a great impression on me when I was studying archaeology is Gregory Johnson’s 1983 meta-study of the operation of consensual decision-making and hierarchy among pastoral nomadic groups. He concluded that “information processing overload” imposes natural constraints on the size of communities that can get by with genuinely consensual day-by-day decision-making, and scaling up beyond a certain size invariably entails reorganisation into some sort of hierarchy, even if this is broadly what we would call a democratic one. The classic upper limit on a human group operating on pure consensus is, apparently, six. Six people. That’s not very many, is it?

This number can scale, such that six groups of six can come to a decision that usefully furthers the interests of the group as a whole, and so can six tribes each composed of six groups of six people, and so on. But already we are a long way from direct participatory democracy, and into representative democracy, the system with which we all have to live, for better or worse, as “enlightened” modern nation states, and latterly international blocs. And in representative democracy, as we know, things fall down the back of the sofa; certain groups’ voices are not heard, because there is a degree of summarising, of neatening round the edges of the message that each sub-group takes to its superior group in the course of the decision-making process. It needn’t be the case that radically different views are involved. Johnson’s point is simply that not all views can be assimilated in any sort of complex society, even if they are on the whole quite similar, because the human brain simply cannot take it. And even this imperfect system presupposes a perfect Russian doll style set of nesting groups of six, which is a shockingly long way from what we have in the UK parliament. No wonder people feel disenfranchised, I suppose.

On the plus side, it strikes me that, excepting Antarctica, there are six continents in the world, politically speaking. So if that alien war Hollywood is waiting for ever does come along, as a race we are sitting pretty. Just don’t be surprised if the fall-out conducted notionally in your name is Not, in fact, In Your Name.

The economics of anxiety

Decision fatigue helps explain why ordinarily sensible people get angry at colleagues and families, splurge on clothes, buy junk food at the supermarket and can’t resist the dealer’s offer to rustproof their new car. No matter how rational and high-minded you try to be, you can’t make decision after decision without paying a biological price. It’s different from ordinary physical fatigue — you’re not consciously aware of being tired — but you’re low on mental energy. The more choices you make throughout the day, the harder each one becomes for your brain, and eventually it looks for shortcuts, usually in either of two very different ways. One shortcut is to become reckless: to act impulsively instead of expending the energy to first think through the consequences. (Sure, tweet that photo! What could go wrong?) The other shortcut is the ultimate energy saver: do nothing.

Yeah, I get that a lot.

I like the concept of decision fatigue. For one thing it’s a neat little piece of research to have to hand when arguing with people who believe that the country’s social problems are essentially a matter of Other People not getting up early enough, or failing to get three meals out of an organically reared chicken, or persistently watching their big flatscreen telly rather than, I dunno, buying an old cathode tube one off ebay or something, seeing as you can’t actually buy a non flatscreen telly in the shops any more. If willpower is a finite resource, all the small survival decisions that come with being poor and having to worry about money deplete your resources very early in the day. They make your life mentally harder. Apparently there is some scarily literal science behind all this – it is a matter of glucose depletion (pdf).

You will notice that much of the popular chat about this is couched in terms of temptation – resisting the cake makes it harder to resist the fag. Hence smoking, drinking, poor diets etc as a social problem. But people don’t seem to talk so much about the implications for general anxiety and related foibles, across all classes (reference to mental health is buried in the “Implications” section of the study linked to above).

And yet this is one of the key applications for the concept. In the case of both poverty and anxiety disorders, decision fatigue has the potential to make you poor at not just later decisions but big decisions. This is a killer. Both situations involve a chronic battle against a stream of small problem-solving, decision-making exercises, which means you will rarely have the mental energy for big picture stuff like quitting smoking, or looking for another job.

Being not quite in either camp, I have found that the concept galvanises me to actually sweat the small stuff less, rather than just nodding glumly when people tell me that is what I should do. As far as anxiety goes I only have low-level common-or-garden crazy. For me, knowing that the thinking resource is finite is a useful brake on looking for problems to expend it on. And in the name of my sporadic quest for eliminating residual crazy, it occurs to me that there is one area of enquiry that has developed models to deal with exactly this problem of scarce resources: economics. Has anyone ever applied relevant models from the field to the problem of dealing with one’s own anxiety? Behavioural economics is a lively sub-field interested in human behaviour in given situations, our calculations, our perception of variables and so on. But economists – or maybe just people – being what they are, behavioural economics tends to conceive of people as indivisible wholes interacting with each other. What if there was a branch of behavioural economics devoted to teasing out the most resource-efficient way to interact and negotiate with oneself?

With any luck someone will now tell me this exists.

I’m the urban spaceman, baby

There should be a collective noun for a mismatched set of opinions that are almost unfailingly congenial to one particular type of person, which are unmistakably redolent of That Sort of Person but are nonetheless stubbornly contradictory.

A “politics” perhaps. Hm. Anyway, one such set lurks in urban space. How we use it now, how we want to use it in the future, a problem particularly acute in That London, but I don’t think anyone can really ignore it. It’s a human problem. As of 2008, for the first time in history, as many of us live in cities as live outside them. So the ongoing dialogue about urban space is one of the key challenges for any far-sighted government. Ha, yes, those.

My attention was drawn to this today. I’ve never been to the Half Moon in Herne Hill (and maybe that’s part of the problem), whenever I’ve had occasion to murder a few liver cells up that way it has been at the Prince Regent, but I am nonetheless vicariously distressed to learn that it has been closed since July last year and proposals have been advanced by the owner, Dulwich Estate, to turn most of the building into flats with a pub remaining underneath. That post records a statement from Dulwich Estate, before any planning permission had actually been sought:

However, following pre-application planning advice received from Southwark’s planning department it was suggested that the Estate should look at alternative uses for the upper floors other than for residential accommodation.

Fans of planning permission and urban dystopia (I know you’re out there) might care to compare and contrast the case of The Greyhound, Sydenham, which did get planning permission from Lewisham council for a similarly peculiar combination of flats around and above it – a scheme which subsequently fell apart in spectacular fashion as the developers’ real intentions were revealed. They were fined by the council for deliberate demolition of the pub (this was just after the People’s Republic arrived in the area – lawks), and in the glare of publicity had new plans to rebuild it approved… and now the shell is just sitting there. It looks sad as anything. I’ll take a picture next time I’m past, but essentially it won’t be much different to this:


There’s a campaign, of course.

And I find myself rather spoilt for piss-boiling options, because while it unquestionably boils my piss that developers do what the fuck they like and get fined the housing market equivalent of pocket change by lackwit councils that should probably never have approved this shambles in the first place, it also boils my piss that there aren’t enough places for people to live.

We can’t have it both ways. Living in urban society with limited space is a constant business of negotiation between the nice-to-haves and the must-haves, and housing is a must-have. Pubs are a nice-to-have. Now, nice-to-haves always shade into must-haves at some point – a park for every street in London is clearly just a bonkers nice-to-have; no parks at all would be an unsustainable disaster. But in a housing crisis there has to be some sort of argument for maintaining alternative space uses other than “we like it and it’s always been there”. There’s a name for that kind of argument and it begins with a small-c.

(There’s a name for the mechanism that would sort this whole business of property values and amenities out overnight, by the way, and that begins with LVT. It would be a pretty bloody night, mind.)

And these are exactly the kinds of opinions you will find simultaneously held by people, well, frankly very similar to me. We support our local arts festivals and scribble on our local forum and shop in the local butcher (cheaper than the supermarket as eny hipster kno) and wring our hands over the housing crisis and want all the pubs to remain open forever and ever.

We’re nice, I suppose. I just sometimes wish we had more of A Plan. It would almost certainly be a better plan than whoever’s actually making the plans would make.

Yes/No demographics and the conservatism of the young

Ashcroft’s breakdown of Yes/No voting is interesting if you like baseless tossed-off morning-after speculation (which you do, you dawg).


Incidentally, Martin Kettle suggested at some bleary godless hour this morning that women had “saved the union”:

In the polls, men were decisively in favour of yes. The yes campaign was in some sense a guy thing. Men wanted to make a break with the Scotland they inhabit. Women didn’t.

I don’t know whether he was looking at a different poll, maybe one written in purple and orange on the inside of his eyelids, but I don’t think the figures above reflect that, so I suspect the usual gender narratives are at work here. Those women and their fearful conservatism eh? Tcoh.

Much has been made of the staggering 16/17 year old vote and the mirror opposite 65+ vote (I’m sure I’m not the only one who would like the latter broken down further by the way. Now that we no longer, all being well, routinely drop dead three years after retirement age as soon as all our paperwork is in order it seems silly to group 65 year olds with 80 year olds.)

More interesting, if you are a person who likes to whiffle on about cohorts and conservatism and the young and all that jazz, are the wild downward swing in the Yes vote among 18-24 year olds and the (lesser but still probably outwith the margin of error) upswing in Yes in the 25-34 year old group, before the march towards No resumes. I’ve read suggestions that the first of these patterns is about economic security – maybe the 18-24s, being on the sharp end of most economic indicators going, are inclined to hedge their bets. So by the same token maybe their older siblings, being a little more established, are more at ease with economic risk. But this doesn’t altogether satisfy me, partly because I have just never bought this idea that people construe their bank balances in terms of macro-economics in the way that they will often vaguely imply they do, and partly because it isn’t really established that a No vote was a vote for Steady Now economics anyway. In fact, the Yes campaign did their very best to paint it as a hair-tearingly disastrous risk for the future of the economically vulnerable.

Perhaps there is something more abstract going on here though, a conservatism of life stages rather than of economics in the raw. You could say that a characteristic of the average 18-24 year old life is uncertainty and the unknown. It’s not so much that they live on beans (which actually one does perfectly cheerfully at that age) as that they are looking at their blank page futures post-graduation, or have just been plunged into the maelstrom of work and don’t really understand how it’s all going to pan out. The Steady Now is not so much economic as social. They are trying out adulthood for size (certainly I was) and that default “nae bothered” is a bit of a pose that conceals a very real fear about what the world is going to end up doing to you. The 25-34 year olds, formally speaking, are just as economically fucked on the whole – they are also on the business end of the ageing population, the pensionable age change and the housing crisis. And they have had it harder in some ways – when I was 25 ten years ago there was already a housing crisis – it’s just that no-one gave a fairy-shaped shit. At least everyone knows and acknowledges that 18-24 year olds now are fucked.

But what the 25-34 year old group contains are people who have nonetheless pieced together a life (ha!) if only out of eggboxes and bits of string. They are probably at the stage of making some hefty life choices, insofar as those choices are economically available to them. The referendum may not be the scariest thing they have had to make a decision about this year. They have perhaps weathered a few personal, financial and professional crises of their own, and realised that the world doesn’t end. They just may be more at ease than the very young with the idea of the coins being thrown in the air, just to see whether they fall out any better.

Patrick Keiller at Housman’s Bookshop (1)

I love the smell of hobby horses in the evening so naturally I went to hear Patrick Keiller, architect-turned-filmmaker and psychogeographer extraordinaire give a talk at Housman’s bookshop a couple of weeks ago. Not that Keiller is a tubthumper at all, in fact in person he is even more considered and gradualist than his films, whose political and economic messages hit you only cumulatively. But left-wing gatherings of any kind redound with wooden neighs and this was no exception, or maybe mumble years of political blogging has just left me impatient with any kind of conversation that isn’t practically all allusive and ultimately deeply civilised. I’m not sure when this happened, maybe I got older, maybe I stopped doing party political blogging, or maybe I just made friends with most of the people who ought to be my enemies, but most of my internet conversations with other politicos these days go something like this:

“I’m sort of.”

“Yeah but.”


“But on the other hand.”

“Oh totally.”

“But I still.”

“Of course.”

“Wanna get a snowcone?”


So it’s always a shock to get out into this Real World I keep hearing so much about and find Others whose idea of a political conversation is asking questions that go on for seven minutes and make compulsory reference to the miners. And I say this with the greatest of affection. We’ve all been there. But I digress: the upshot was that there really wasn’t enough time to explore all the themes we could have done – psychogeography got short shrift, which, as an armchair archaeologist with a secret longing for woo, I found disappointing. What follows are just some randomly spewed-out thoughts – there was a lot to the evening and I might return to it.

From an archaeologist’s perspective London the film is essentially a phenomenological record of one man’s experience. Phenomenology has a high-falutin existence as a philosophical concept but has been purloined by archaeologists to mean, essentially, the exploration of how people used and experienced space. Attention is paid to things like access routes, lines of sight, the interplay of light, dark and sound, and the experience of space by different demographics, for example, men, women and children. London is a sort of non-linear journey round town in the company of the narrator Robinson (pretty much a proxy for Keiller, as he owned) who explores the personal and political implications of footage that ranges over ordinary high streets, abandoned industry and buildings at the various seats of economic and political power. At the outset the chair introducing Keiller reflected on how there were different ways of living in and experiencing cities, mentioning the psychogeographical/mystical approach and personal memories. Interestingly I think these elide. Psychogeography can be a deeply personal thing and this is essentially what Robinson’s narrative is about. When I walk around the City, on the face of it an unpromising and largely architecturally modern creation deserted every weekend, I can feel the medieval facades just behind the sheer glass walls, and I am grateful every time that London wasn’t rebuilt after the fire in the manner of Paris. You might say this was a bit woolike and psychogeographical; it is also personal, because I studied medieval buildings and it is my particular understanding of the built environment that prompts this response – I am making a whole raft of personal associations that you won’t. And yes, I am odd, but probably no odder than you are in how you experience the space around you.

Of course, individual memories and associations are not generally something the archaeologist is able to uncover. One of the most insightful questions reflected on the fact that there were very few individuals in Keiller’s films. This is probably surprising from an arts perspective but from the archaeological perspective perfectly natural – we don’t tend to identify individuals in the archaeological record, only classes of people. It left me wondering whether archaeologists might usefully produce similar fictionalised narratives of experiencing space.

I also wondered (and I really should have asked) how the film (which is twenty years old) would differ if it was made now. Obviously twenty more years of history exists to inform the narration, both in the political and economic life of the city and in Keiller’s own life. Would this amount to a completely different sort of phenomenological experience, if an archaeologist compared it to the 1994 film? Logically it must, because we presumably all agree that the experience of living in the first Sumerian cities must have been vastly different to the experience of living in a Roman, medieval or modern city. A city – whether we’re talking about a given individual city or some sort of Platonic ideal – does not stay still. There must be some kind of incremental change in experience as the architectural, political and technological layers accrete, and there’s no reason why this shouldn’t be very evident over twenty years. That might give prehistoric archaeologists used to dealing in “blocks” of centuries at a time pause for thought.

Unreviewed! Sapiens: a Brief History of Humankind

There are two kinds of popular history book: the cameo and the synthesis. Historians find it easy to fit cameos into their working lives. They produce exquisite little portraits of a family in Wars of the Roses England, or uncover the poignant mental history of an aristocrat at the end of the long nineteenth century. They focus on a crime or an incident or a vignette and use it to draw little lessons and inferences about the world in which it took place.

Syntheses are different. They only incidentally involve the writer’s own original research; they are commentary which should inform the lay reader while also making the expert see a familiar area in a new light. But what they should have at least is a solid angle. The Mediterranean’s history can be told as a giant narrative of interlocking narratives, for example. Or, the history of the world can only be told through money (reprise).

I’m not sure Yuval Harari’s Sapiens has a compelling angle, judging by the extract or whatever it was I read in the Guardian from last weekend. He starts by suggesting that there have been:

almost no scientific studies of the long-term history of happiness.

This might come across as less goady if he then made the slightest attempt to set out a suitable research strategy for this, but he doesn’t, or at least not in the Guardian piece. And scientific studies, really? What he means here is “rigorous”, which is all anybody in a qualitative discipline can aspire to; it’s exactly this kind of sloppy thinking that makes STEM people think HASS people are basically being funded to make shit up. The works to which he obliquely refers on measuring happiness in modern populations normally talk about markers like mental health and reported life satisfaction. It’s certainly an intriguing idea that we might be able to find some corresponding measurements from previous eras and stack them up against each other but that sounds like a lifetime’s work for several people complete with conferences, a dedicated journal, acolyte students, three schisms, seventeen famous blazing rows and eight trillion pints of beer.

And that’s not what’s going to happen, because what Harari is planning to do instead is gallop through the great revolutions in the history of modern humans and offer his take on whether or not they were Good Things. That really seems to be all there is to it, and that’s not an angle. Depending on your specialism, you will find some of his takes intuitively correct, some mildly revelatory and some shonky and over-simplified. The trouble is, once you’ve seen shonky over-simplification in one place, you suspect that it might be lurking in other passages whose background you don’t know so well. For me, the tell was the agricultural revolution. Harari – along with every palaeo-geek and primal dieter on the internet – thinks this is a Bad Thing. The subject is foregrounded in the Guardian piece and caught the attention of the subsequent reviewer:

It’s a neat thought that “we did not domesticate wheat. It domesticated us.”

Well, so it is, but it’s not Harari’s. Ian Hodder, Jacques Cauvin, Peter Wilson and pretty much every Neolithic specialist who has come after them have played with the idea that domestication is always a two-way process, and that changes in the head or in social arrangements may have led to changes in the environment and not the other way round. Above all, the field is implicitly familiar with the fact that agriculture brought tremendous problems to the burgeoning human populations it produced. This is not new. If it were, Harari wouldn’t be able to quote Jared Diamond, for god’s sake, suggesting agriculture was the greatest mistake in human history, which he duly does. Synthesizers quoting other synthesizers. Aieee.

Lack of novelty may not be a problem in historical synthesis, but lack of close examination is. That cutesy “agriculture was a mistake, we belong on the savannah” line is a commonplace internet chatroom trope of (usually) white American middle-aged men. It quickly falls to bits when unpicked – what savannah, in what time? Exactly when were we eating “the perfect diet”, how many human groups in a world of massive bio-diversity were really eating it, and above all, how many of the disadvantages of this Eden are you willing to take on board alongside your nuts and berries? Incidentally, there’s also an entire field of philosophy called population ethics dedicated to teasing out the implications of the position Harari seems to take as read in all this, that fewer, happier people are better than more, miserable people. Derek Parfit calls it “a repugnant conclusion” that this is not in fact the case – from the point of view of the potential individual it is better to exist, however miserably, than not. Did no-one mention this to Harari, really?

Maybe I just know the most about the part of history Harari is weakest on, and maybe this is why most people don’t write synthesis history, because everybody’s a well-informed critic about something, the bastards. I’ll read the full thing because I’m a sucker for this stuff, but it don’t look promising so far.

Why do people cry on Who Do You Think You Are?

Genealogy is, when I think about it, the thing I have been doing longer than I’ve been doing pretty much anything else – over twenty years. This fact is surprising to me when I remember it, because I don’t really think about genealogy that much. For a start, it’s a bit of a finished project – there are parish chests I could plunder and mysteries I could worry at, but the easy unfurling of names bit is mostly as done as it’ll ever be. It all floats around in the background, both the doing of it and the data collected, and when I walk down a street on which “somebody” lived or my finger passes over a significant place name on a map I pleasantly “remember” a particular story or set of dates in the same way that I sometimes become gratefully aware of breathing, and then I forget again. With the exception of a couple of years, everywhere I have chosen to live in London has a family connection of some kind, as if this might serve as some kind of grounding in a life of frequent moves I suppose.

I have been catching odd moments of the current Who Do You Think You Are series, which is mostly useless for getting at the actual experience of doing genealogy (nobody wears white cotton gloves and brings you pre-identified bound volumes containing everything you could possibly need to know unless you are Stephen Fry) but excellent at invoking the mythology around it. The pattern is that the fresh and curious disciple arrives at their kitchen table or their desk to join the gently steeple-fingered experts, expresses curiosity, relates some family legends, a few jokes are made. Then the hunt begins, the experts unfold terrible knowledge like calm old Jedis and the disciple is led down a path of admixed discovery and self-discovery that ends, ideally for TV’s purposes, in them bursting into tears.

I’m not being caustic here. It’s a pattern that reflects in a highly dramatised form what happens when you do it yourself. You may not actually weep in record offices – though I have seen people do that – but the ticklish magic of researching family history is that it is an unpicking, of temperaments and circumstances and decisions that might explain why you are what, and where, you are. It helps you to think about family, about the longue durée, about life and love and how all decisions have mixed consequences. I gravitate towards well written family histories because I can sense the writer doing that unpicking, not because I am interested in the data itself.

And yet there’s a sort of fiction at the heart of all this, because none of these events in which you are desperately seeking foreshadowings of your own temperamental make-up actually happened to you. You are not culpable in any of it. The decisions you have taken yourself – surely the more relevant data in the project of understanding yourself – are not the things under scrutiny, so you can blub in an uncomplicated way for someone else’s mistakes or misfortunes. The fantasy at work in WDYTYA is more or less that you arrive at the research as a completed and serene person who has resolved everything in their own life, and needs to look further afield for insight and catharsis.

What I suppose we are seeing is somebody displace their own knotted problems onto someone else. Someone dead who never – in modern parlance – agreed to have a public profile, unlike their soggy descendant, which if you think about it is rather spooky. And even if most people who do genealogy don’t get to cry on TV, they still experience the frisson of turning over new data, and they draw it into an account of their background that suits them. Maybe genealogy is a bit of a predatory business.

Can you teach through the medium of Stuff?

If I were unbearably cynical, I would suggest that the Teaching History with 100 Objects project was an attempt to co-opt a major national institution into supporting a particular cast of government education policy, but the People’s Republic is a place of wide-eyed, dewy-cheeked innocence, as you know, so we will confine ourselves to commenting on whether it’s a good idea or not. Pfft.

Maybe it’s an idea that ought to delight anyone trained as both a historian and an archaeologist – history told through Stuff and Things. The question is, should you do that? It’s not so much that the selections are – this is inevitable – screaming with omissions. It’s that I’m not sure I believe in the iconic power of objects to stand for history. Or even archaeology, come to that. The real fun in archaeology isn’t goggling at individual objects, it’s goggling at graphs of classes of objects and trying to figure out what their number, distribution and changing form tell us. It’s true that whenever I’m by the British Museum and have a moment I have a little commune with the Sutton Hoo helmet, but that’s because I spent a term learning about the burial, the society, the mysterious nature of the spearholder and the provenance of the spoons. And the gold enamelled shoulder clasps (seriously, good grief) probably look far more amazing to me than to you (yes, even you, howsobeit that you are clearly an intelligent, discerning and highly sophisticated reader, and that colour really suits you, by the way) because I looked at so many diagrams of post-holes and excavation reports that term and I have a crude idea of how much archaeology is Yet More Bloody Potsherds and then, suddenly, whoosh, garnet cloisonné work. Without all the background knowledge and context Sutton Hoo is basically one big glass case of insurance headache. No wonder so many children wander round museums looking underwhelmed. You gotta wanna see it.

Teaching history is surely about teaching ideas and patterns, as with any humanities subject. If that doesn’t float your longboat, then as with any subject at all, no hard feelings. But it seems a strange endeavour to try to inspire children to learn about ideas by giving them objects instead. Objects are not easier than ideas. They do not provide better access to knowledge, but different access. The notion that learning history is done by starting from a particular physical object as “inspiration” and working outwards to the patterns and ideas bit is actually pretty complex, and calls for some kind of materialist/semiological defence which I dearly hope the DfE SpAds are too busy to construct. The objects are doing the job of icons here, representing a whole bunch of epistemological categories in an intellectual enterprise that is not really “about” physical objects at all. It’s bordering on the mystical, though I suppose education policy is one of those areas where we seem to think a bit of mysticism is appropriate – inspiring minds, unlocking potential, all those things.

I also think it forms more of a departure than the schools minister Nick Gibb thinks from the strait-laced chronological curriculum laid down by Michael Gove which, gleefully piss-rip-worthy though it was, did enshrine a hazy version of the premise that in order to understand patterns you need to start with a lot of data. This project could be taken to advocate starting with one bit of data, a physical bit. I’m not sure whether that’s right or wrong, I just think it’s different. I guess a lot lies with the teaching, but that’s always the case, and incidentally the reason why (as ever) I wonder if there’s any earthly point in governments specifying curriculums to this level of detail in the first place.

Can computers replace historians?

This Bank Holiday weekend’s Question To Which The Answer Is No And Which Successfully Winds Up Alix Mortimer (#QTWTAINAWSWUAM – it’ll never catch on, though it clearly should) is this one from the Beeb’s Rory Cellan-Jones:

Can computers replace historians?

here is the biggest claim so far – crunching through the big data of history can help us spot patterns and work out where the world is heading next.

That is what Kalev Leetaru, a data scientist at Washington’s Georgetown University, believes may be possible. Using a tool called Google Big Query, designed for interrogating vast collections of data, he has been sifting through a database of events stretching back to 1979.

This is GDELT, which has collected media reports of events from innumerable sources in more than 100 languages for 35 years. “What we did here,” Leetaru explains, “was use this tool to shove in a quarter of a billion records and use this massive piece of software to just in a few minutes sift out the patterns in this data.”

What he says he found was complex patterns of events repeating themselves over the years. He has looked at recent events in Egypt, in Ukraine, in Lebanon and tried to draw common patterns.

The answer of course is No, and in fact nobody is seriously suggesting otherwise, not even the data scientist in the story.

Leetaru says historians should see this kind of computational tool as just another technique amongst many rather than a threat to their professional expertise. In any case, they may look at the patchy record of big data in areas like election forecasting and flu trends and decide their days sifting through dusty archives are not numbered after all.

For all the tail-end humility, it is worth rehearsing the reasons why this idea is being oversold, and they go beyond the fact that Google Big Query sounds like something you’d use to report graffiti in your neighbourhood or check the local bye-laws on squirrel-feeding.

The tritest first – is the purpose of “doing history” solely to work out where the world is going next? For policy-focussed think tanks maybe, for historians probably not. This is the bread and butter of undergraduate historiography seminars, and it’s not difficult to come up with reasons other than that to “do history”. Because it tells you something about the human condition which has nothing to do with mere events, because it widens your understanding of your own culture and biases and those of other people, because it challenges your preconceptions about tradition and heritage, or enhances them, or perhaps just because you have that certain bloody cast of mind that delights in intellectual problems which cannot be reduced easily to numerical values and positively require human intervention to make sense of them, and because you believe that the training and sharpening of such minds is of value to the future of the race. All these things.

Second objection, it indexes and detects patterns in media reports. Not in the Raw Stuff of Time Itself. A more perfectly designed tool to assess changes in patterns of media reportage over the last 35 years would be hard to conceive, but whether it can be said to be crunching up actual history in its neat teeth is something else. There’s a whole extra layer of analysis to slot in here about the nature of historical data and how we create it. There is no such thing as “just data”, a fact which most of the internet found itself having to explain to Chris Anderson in 2008 when he wrote a piece in Wired called “The End of Theory.” This is philosophy of science 101 (it’s archaeology 101 too). “Data” in anything other than pure numerical terms is conceived of through human intervention, through choices about what to foreground and what to omit, through the murky veil of language itself. Somebody has to put this stuff in to this difference engine, and however you do it is going to shape your outputs. GIGO &c.
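The GIGO point can be made with a toy sketch – nothing to do with the real GDELT schema or BigQuery, every name and number here invented for the purposes of illustration – in which the same four underlying events, passed through two different editorial coding choices, yield two different sets of “patterns” for the machine to detect:

```python
# Toy illustration of "there is no such thing as just data": the same
# underlying events, coded by two hypothetical outlets with different
# editorial choices, produce different patterns. Everything here is invented.

from collections import Counter

# The Raw Stuff of Time Itself: four events that actually happened.
events = [
    {"place": "Cairo",  "kind": "protest", "deaths": 0},
    {"place": "Cairo",  "kind": "strike",  "deaths": 0},
    {"place": "Kiev",   "kind": "protest", "deaths": 2},
    {"place": "Beirut", "kind": "strike",  "deaths": 0},
]

def outlet_a(event):
    # Outlet A reports everything, filed under the event's own kind.
    return event["kind"]

def outlet_b(event):
    # Outlet B only distinguishes violence; everything else is "unrest".
    return "violence" if event["deaths"] > 0 else "unrest"

def pattern(events, coder):
    """Count the 'patterns' in the reportage produced by one editorial coder."""
    return Counter(coder(e) for e in events)

# Same history, two different datasets to crunch:
print(pattern(events, outlet_a))  # Counter({'protest': 2, 'strike': 2})
print(pattern(events, outlet_b))  # Counter({'unrest': 3, 'violence': 1})
```

Whatever patterns the big machine then finds are patterns in the coding decisions as much as in the events themselves.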

There is at least a certain audacity in making the source of all your inputs The Meedja, an audacity that we can only hope the data scientist in question is aware of (although US print media is famously more staid than British print media – to fully grasp how eye-popping this exercise looks from a British perspective, USian readers should imagine the inputs were TV news segments). On the other hand, at least media coverage of current events can be said to be broadly afflicted with the same problems and biases down the last 35 years or so. At least it’s consistently wrong, right? I mean, obviously after you control for differences between individual journalists and their many biases and bugbears, between editorial approaches at different media outlets, the whole meta-history of the media scene and of reportage and its changing norms and standards, particularly over the period which sees the arrival of the internet, and, er… Well, there are some issues with your input source, in other words, and being able to identify problems with your sources is not the same as being able to control for those problems, as any historian can tell you.

The third objection is the killer from an archaeologist’s point of view and it is a corollary of the second – the data involved currently goes back (like the Head of the People’s Republic) to 1979, which while it is naturally a great deal in fag-and-wisdom years is a blink of an eye in human history itself – even recorded human history, which makes the pattern-detection thing a bit redundant. What are you going to do when your newspaper reports run out? What are you going to do when basic assumptions about, I dunno, states, war, international law, human political relations, are so morphed by the passage of time as to be unrecognisable? What are you going to do, in short, when modernity runs out? What are you going to do when writing itself runs out? What are you going to do – and any data scientist should perforce be interested in answering this question – about the big, the seriously fucking big, patterns in human history when your data inputs are so patchy and variable?

Archaeologists struggle with this all the time, and it’s one of the reasons why prehistory is the best kind. You cannot seek trite, proximate causes, because you simply don’t know that this set of people invaded that, or this set of people started speaking that language because of some particular set of political pressures, or this set of people moved there because of a series of famines. We do not have the data. All you can do is try and detect much more abstract patterns in the distribution of material and try and make it say something about human endeavour (or, as prehistorians of my acquaintance colloquially put it, make it up). Historians of recorded periods are less naturally stretched in theoretical terms, but even they are dealing with much, much patchier and more variable data inputs than newspaper reportage.

For all that, this does sound like an insanely useful tool for certain purposes, and I wonder if Leetaru simply needs to scale back his terms a bit. It doesn’t seem logical in any sense that computers could replace historians. What they could well replace, it seems to me, is think tanks.

Sharing and hoarding

Economists and their fellow travellers are great at churning out neat little books with titles using the format of [Noun]ification: how [Noun] is [Verb]ing the World, and Why It Matters (And How You Can Use it to Get More Twitter Followers). I hardly ever know enough about the subject matter to tell how much of the contents are weapons-grade bullshit, but they’re often interesting anyway.

According to Tim Harford writing a few months ago, somebody has got hold of a set of ideas from anthropology about hunter-gatherers and sharing, and farmers and hoarding, and is using them as a jumping-off point to comment on contemporary society:

Megan McArdle, in her fascinating forthcoming book The Up Side of Down, observes that modern societies can’t make up their minds whether to adopt the morality of farmers or of hunters. The idea that hard work needs to be rewarded is a farmer’s view of fairness. The claim that “we’re all in this together” is hunter-thinking.

We could, at this point, lay into McArdle for analysing modern society in terms of Just So stories based on fuzzy, under-examined perceptions about How Things Used To Be in the Old Days – a complaint David Graeber makes of economists generally in Debt (n.b. in my impressionistic taxonomy, the single-noun pop-econ titles sit a level above the Nounification ones).

But the trouble is, she has lifted the outline of this from anthropological theory. The authorised version I read about as an archaeology student holds that hunter-gatherer societies are incentivised to share food because the supply is uncertain, and norms of reciprocity will grow up to keep you in gazelle steak when your own sub-group is not doing so hot. In any case, the mobile nature of the lifestyle militates against storage – if it ain’t getting eaten now, it will go to waste. Farmers on the other hand are incentivised to hoard, by implication hoard from each other – the farming year is predictable and in return for a given input of time and effort generates (all being well) food at set intervals which can be eked out over however long the producer decides. At the same time sedentism, which is held to accompany farming as a social development, enables storage (and maintenance of the stored crop). Food is power. If you have enough to feed yourself, grand, but if you have enough to start giving it away at times of your own choosing in return for favours, labour, marriage partners and general prestige, so much the better. All this is tied up with the development of nuclear families who are held to be an appropriate unit for agricultural productivity, and ultimately the beginnings of social stratification.

Of course, archaeologists and anthropologists formulated this tool of analysis without ever intending it to be used by a City AM commentator to crowbar open the NHS, but that’s the trouble with wanting to Give Something Back to the other social sciences – and theoretical archaeology regularly beats itself up for its failure to do this* – you don’t get a choice about how your ideas are used. McArdle may have bolted an awkward morality tale onto what was intended as a piece of socio-economic analysis but she hasn’t been constrained by injunctions not to do it. There is a sort of Middle England whiff about a lot of the farming/sedentism stuff I’ve read, as if people are almost slightly relieved to be recognising the roots of nuclear families, social stratification, capitalism and other jolly things. Whatever their own politics, at least it is familiar, and feels like the beginnings of a Big Thing. It’s not really surprising if actual capitalists come along and make free with it.

In fact, I’m failing to come up with an instance of an archaeologist conducting a specific and critical examination of her own politics in the context of the sharing/hoarding dichotomy which might act as a kind of user’s guide to passing economic commentators. Are there any?

* I’m never sure why. The only social science I can think of that freely nicks ideas from other disciplines and then vomits a bunch of hurriedly regurgitated theory back down their beaks is, in fact, economics, and should that be any shy, naive young discipline’s idea of a suitable role model? Really?

