Swindon Studies: Social Science in Simpleton

One of the recurring features of academic life is the way in which particular intellectual traditions of thought are associated with particular places, as in multiple Chicago Schools, for example, but also in the way in which particular places come to stand as vectors for general theoretical claims – Paris and modernism, obviously, but more prosaically, certain places, like Baltimore or Vancouver or Columbus, Ohio, come to serve as the empirical reference points for the working through of theoretical ideas about capitalist urbanization, neoliberalism, governance and scale, and the like (this is not quite the same as, but not unrelated to, the ways in which towns and cities are presented as sites for experimentation).

When I was an undergraduate and postgraduate, the so-called ‘locality debates’ were the focus of much of the most interesting discussion of the relations between social theory and spatiality. The very question of how to think about the relation between places, on the one hand, and knowledge of general trends, on the other, was at the centre of these debates. A whole set of issues – the relations between the abstract and the concrete, the empirical and the theoretical, the nature of case analysis, the relations between different axes of social differentiation, questions of ‘scale’ – were worked through in these debates. In the early 1990s, they ended up being supplanted by debates about ‘postmodernism’, which had all the appearance of intellectual pluralism and philosophical weight, but were often rather simplistic by comparison.

Swindon has a small part to play in this lineage of spatial theory in the social sciences. Of course, since 1988 a lot of social science has been commissioned, managed, and audited in Swindon, under the auspices of the ESRC most obviously, and more recently the AHRC and EPSRC too – including a succession of urban-oriented research programmes (Ian Gordon has analysed four decades of urban research programmes in the UK from the 1960s onwards, and it would be interesting to update this in light of more recent initiatives around Urban Transformations, Connected Communities, Urban Living Partnerships, the GCRF and the like). But as an object of urban and/or place-based social science research, Swindon also has a minor claim to significance. I mentioned in my last post Mike Savage’s account of the way in which post-1945 British social science evolved through a distinctive form of effacement of place, typified by the affluent worker studies which were not-necessarily-famously undertaken in Luton but were emphatically not studies of Luton. Swindon doesn’t merit a mention in Savage’s reconstruction of a ‘landscaped’ conception of social inquiry. But Swindon’s status as an object of social science illustrates some of the different ways in which specific places come to play a synecdochical role of one form or other in shaping images of the social.

Michael Harloe’s Town in Transition, published in 1975, is the most important contribution of ‘Swindon Studies’ to urban theory more generally, I think it’s fair to say. Harloe had worked for the Borough during the town’s expansion in the late 1960s, and the book was one product of the Centre for Environmental Studies, the think tank that served as an important medium for spatial thinking in the 1960s and 1970s and whose alumni included Doreen Massey (somebody should really be writing a genealogy of the institutional worlds that generated spatial thought in this period). Harloe’s book is a fantastic account of the politics of post-war planning, where politics is understood as a matter of compromising, lobbying, building alliances, strategising across scales. Intellectually, the book stands at the cusp of the theoretical transformation of urban studies in the 1970s (not least through the International Journal of Urban and Regional Research, of which Harloe was a founding editor in 1977) – there is not much trace of Marxist political economy or state theory in it, but that’s OK; it has weathered well precisely because of its resolutely organisational and strategic sense of the political.

By the 1980s, Swindon had become one of the places used to make sense of the reconfiguration of cities and regions, centres and peripheries, that was a central focus of intellectual debate in the so-called ‘spatialization’ of social science that was inaugurated by the theoretical transformations that are not yet evident in Town in Transition (it is of course slap-bang in the middle of the then much-talked-about high-tech, ‘sunbelt’ ‘M4 Corridor’). Swindon was the site for one of the locality studies funded under the ESRC’s Changing Urban and Regional System initiative (which was originally conceived and proposed by Doreen Massey). In this guise, it was made into the test-case for assessing whether theories of “growth coalitions”, originally developed in the context of North American urban politics and policy, could be usefully applied in the UK (the answer was ‘sort of’, in so far as Swindon might once have had something like a stable, consensual civic coalition promoting expansion and diversification through to the 1980s, but then it didn’t). Then, in 1997, Swindon was presented as the very epitome of ‘the city for the twenty-first century’, in a book that gathers together and synthesises the findings of a succession of ESRC projects on the town and the region of which it is part (the 20ish-year gap between the Harloe book and the Boddy et al book in 1997 suggests that the next book-length academic study of Swindon is due to be written just about now….). More interestingly, perhaps, Phil Pinch used Swindon as one model of ‘ordinary places’ (the other one was Reading), places that presented challenges to the tendency of radical political theory to take rather special places as the models for general claims about political possibilities. More recently, Sophie Bowlby chose Swindon as the site for her research on the changing nature of women’s friendship networks across the lifecourse because of its typicality (she told me that when I bumped into her on a train from Paddington, as you do). And in the research of Linda McDowell and her colleagues on the intersections of class, ethnicity, masculinity and labour market dynamics in the UK, Swindon again functions as an interestingly ordinary place (compared to Luton, these days), one which they use, amongst other things, to complicate narratives of politics and anti-politics.

It should also be said that all of these examples of social science research on Swindon are pursued by academics based in other places – in places like Reading, Oxford, or Bristol, University towns all of them, of different sorts. Swindon still struggles to build any significant higher education presence of its own (it’s surpassed by Luton in that respect). But perhaps this has something to do with why Swindon gets to be the place where you can learn about the value of ordinary things.

In fact, when you take the trouble to look at the social science about Swindon, you begin to see that it might have a small claim to be the exemplary ordinary place, if such a thing makes sense. But you can also see Swindon as an example of the different ways in which places are figured in social science (of the different forms of ‘geographical reasoning’ to which life-in-places is subjected) – sometimes the town is seen as representative of wider trends and patterns (in this sense, Swindon gets to be what Luton was for social science in the 1960s), even “a starkly exaggerated example” of national trends; sometimes it is framed in comparison with, or even counterpoint to, other places (this is how Harloe presents the lessons of the ‘local’ and ‘national’ politics of Swindon’s growth); sometimes as the focus of forms of conjunctural analysis (as in the locality studies research). These don’t quite exhaust the ways in which place and/or the local get framed in social analysis, but they do cover three important versions – if you had the time and inclination, you could even imagine writing a piece in which ‘Swindon Studies’ gets to enact the different conceptual operations through which geographical specificity is translated into theoretical generality. Mind you, I’m not saying ‘It all comes together in Swindon’. It doesn’t (in fact, in more ways than one, a lot of ‘it’ just passes by).

Geography Books

It’s sad, I know, but one of my favourite places is the Bookbarn, in Somerset on the road from Bristol to Wells. It is, as the name suggests, a big barn full of old books (my partner refuses to ever come along with me, because the smell of second-hand books repulses her just a little). The books here seem to consist mainly of discontinued library stock, from everywhere from the Cleveland County Library and the former Bath College of Higher Education (precursor to Bath Spa) to the Seeley Historical Library in Cambridge. If you were so inclined, you could acquire pretty much any book written about the Royal Family in the last 60 years here, or, alternatively, construct your own personal archive of every single Open University social science course from The Dimensions of Society (1975) onwards.

The Bookbarn even has a whole Geography section, which is more than you can say about most academic bookshops these days. It’s about 12 square feet of shelves, containing books mainly from the 1960s and 1970s, with a sprinkling from the 1990s and more recently. I was there on Saturday, and I could have bought all of my old school textbooks for both O and A level, but thought better of it. You could, too, collect a number of ‘classics’ of modern academic Geography, including Harvey’s Explanation in Geography, Haggett’s Locational Analysis, pretty much anything you might want by Dudley Stamp, Wilbur Zelinsky’s A Prologue to Population Geography, different editions of Wooldridge and East’s The Spirit and Purpose of Geography, the original version of Sparks’ Geomorphology, or the first Progress in Geography edited collection from 1969.

These shelves offer a snapshot of how Geography was represented in public life in the UK somewhere between about 1970 and the mid-1980s, in so far as the books acquired by school and University libraries but also by local public libraries are an indication of that. Standing there, in front of them all, you get a strong sense of the 1970s having been a little bit of a golden age for Geography publishing in the UK, with a wide range of book-length research monographs and edited collections reviewing and promoting geography as a science, and in particular human geography as a social science (an age when publishers such as Heinemann, Croom Helm, Arnold, and Hutchinson all had important geography lists, it seems). Many of the books on these shelves are ones I can remember, at least from the covers if not necessarily from actually reading them, from when I was an undergraduate in the late 1980s. They seemed a little dated even then, which might have been a design issue in some cases, but also had to do with the way in which the intellectual substance of many of the books you can find in the Bookbarn had, already by then, been framed as standing on one side of a divide between ‘radical’ and not-so-radical geography, which was overlain onto the mutually hostile methodological chauvinisms on both sides. I liked the radical stuff (the only book on the shelves at the Bookbarn which really counts as an influential one for my own intellectual formation is 1984’s Geography and Gender: An Introduction to Feminist Geography, by the IBG’s Women and Geography Study Group). Amazingly in hindsight, I did an undergraduate degree in which you didn’t actually have to take any notice of ‘quantitative’ and statistical approaches at all if you didn’t want to (I don’t as a result share the antipathy towards those approaches often felt by people once forced to sit through what, way back when, were not very well taught classes promoting them; nor the sense of self-righteousness often attached to ‘qualitative’ approaches that is the flip-side of generation-shaping ‘Bad-Stats’ experiences). The books I have in mind (some of which I bought – they are dead cheap) are expressions of the “methodological ferment” that transformed Geography from the 1950s onwards, primarily through the adoption, development and refinement of statistical techniques and mathematical modelling applied to spatial patterns, processes and forms. You can trace the emergence of whole new sub-disciplines in the wake of this modernization in the books in the Bookbarn: of urban geography, for example, in Harold Carter’s The Study of Urban Geography, David Herbert’s Urban Geography: A Social Perspective, and Ron Johnston’s City and Society; or of development geography, in Akin Mabogunje’s The Development Process: A Spatial Perspective or David Grigg’s The Harsh Lands; as well as the traces of approaches that sound suddenly contemporary again (e.g. The Political Geography of the Oceans).
The books gathering dust on these shelves were, I guess, integral to the institutionalisation of geography-as-(social)science as higher education expanded during the 1970s, and are testament to what I can’t help thinking of as ‘IBG-Geography’, expressions of an assertive discipline framed in no small part by turning away from the associations of geography with merely descriptive accounts of far away places. In Identities and Social Change in Britain, his wonderful genealogy of modern social science in Britain (one which is very geographical without saying much about Geography), Mike Savage does identify human geography as exemplifying the adoption of social scientific expertise in what were traditionally conceived of and practised as humanities disciplines: “Foremost amongst these was human geography, which largely abandoned its focus on the culture and traditions of fixed regional spaces and forged close relationships with sociology and anthropology and self-identified as a social science.” It’s the books through which this process of self-identification was enacted that are all sitting in the Bookbarn. You can even find here evidence of that moment when it was possible to imagine human geography and physical geography having common intellectual grounds, and not only ones based in shared methodologies, but even in shared philosophical assumptions (I picked up a copy of Bob Bennett and Dick Chorley’s Environmental Systems: Philosophy, Analysis and Control, which is rather prescient in its presentation of the synthesizing promise of systems theory, now all the rage again in somewhat different, resilient, form).

Driving home (composing this blog in my head), it occurred to me that this ‘sample’ of books captures the becoming-relevant of geography in this period. You can pick up a copy of David Smith’s Human Geography: A Welfare Approach (with its great front cover) alongside his more technical Patterns in Human Geography, both of which explicitly question the sorts of problems geographers sought to address and the values they sought to advance in addressing them. You can find traces of the divisions between different images of the vocation of geography (stresses and strains captured in the very title of Michael Chisholm’s Human Geography: Evolution or Revolution?). The recurring focus is on issues of spatial analysis, where this involves the delimitation of distinctively spatial processes and spatial forms, but none of these books are aridly methodological – there is plenty of social theory embedded in these books, just not perhaps the sort of (post-)Marxist thought that had become so central to defining the meaning of social theory by the time I was an undergraduate. For example, the OU’s co-published Fundamentals of Human Geography reader, from 1978, includes a piece by Claus Offe on advanced capitalism and the welfare state, a fact which in no small part captures something of the taken-for-granted background of quite a lot of the substance held on these shelves. Assertions of the importance of a newly robust social scientific human geography – such as Studies in Human Geography, a 1973 collection edited by Chisholm and Brian Rodgers and sponsored by the Social Science Research Council as it was then, with the intention to “focus attention on the substantive contribution of geographers to several fields of study” and aimed as much at ‘non-geographers’ as at ‘practising geographers’ (I’m still practising) – were articulated in a context in which it was still assumed that a relatively stable institutional field of ‘planning’ and ‘regional policy’ existed into which geographers could speak with authority and influence. By the time I was an undergraduate, this stability no longer existed, and I was inducted into geography in a context in which it was the dissolution of that stable field which generated all the most exciting intellectual energies (you can pick up a copy of Martin and Rowthorn’s The Geography of De-Industrialisation at the Bookbarn too, from 1986, a book which pretty much captures the moment, as do the slightly later OU edited course books on The Economy in Question and Politics in Transition, which are also there). By the time I was a graduate student, in the early 1990s, as those stable fields of ‘relevance’ further dwindled, the sort of “critical human geography” that I settled into was rapidly reshaped around theoretically sophisticated forms of analysis which were really good at identifying the possibilities of political purchase for academic analysis in situations where it seemed, at first look, to have disappeared (a pattern of analysis which continues to frame an awful lot of work in human geography, probably including most of mine).

My excuse for spending my Saturday afternoon leafing through books I mainly didn’t read 30 years ago and mainly won’t be reading now (with some exceptions), if I need one, is that I do have a professional interest in the more or less recent profile of Geography. Amongst many other things, I’m meant to be editing a Companion on the history and philosophy of geography (a rather daunting task; I’m not doing it on my own), so I am telling myself that all this browsing really counted as research, of a sort at least. It’s interesting, for example, to notice just how many of the old books you can find at the Bookbarn were concerned not merely with applying quantitative methods to spatial problems, but rather were explicitly engaged with the challenge of theorising issues that are “peculiarly geographical”. Not thinking of the spatial as just a residual, or as an externality, or merely contextual, remains a compelling issue across social science, and it is one theme that might well connect what are often still presented as incompatible qualitative and quantitative ‘paradigms’ in geography (does anyone still use that word?). It’s not, for sure, an issue over which strands of quantitative geography and traditions of spatial analysis hold a monopoly, but my afternoon in the company of all these old books reminded me that it is this theoretical issue that was at the core of the process of making human geography from the 1950s onwards, and it’s this theoretical issue that might well remain central to a distinctively geographical imagination of the challenges of ‘spatializing the social sciences’ (and humanities, I suppose).

The Politics of the Global Challenges Research Fund

In UniversityLand in the UK, alongside various worries about the TEF, OfS, and UKRI (try to keep up) generated by the government’s Higher Education white paper, there is also a sudden flurry of notice being taken of the Global Challenges Research Fund (GCRF). This was formally announced before Christmas in George Osborne’s Spending Review. It is now officially launched as “a new Resource funding stream” (see the RCUK’s brief on the GCRF). That’s how it is being presented at University level, by research and funding councils, and in cross-University partnerships. The GCRF is part of the UK science and research budget, so it belongs to the Department for Business, Innovation and Skills (BIS), according to whom “It provides an additional £1.5bn of Resource spend over the next five years to ensure that UK research takes a leading role in addressing the problems faced by developing countries. This fund will harness the expertise of the UK’s research base to pioneer new ways of tackling global challenges such as in strengthening resilience and response to crises; promoting global prosperity; and tackling extreme poverty and helping the world’s most vulnerable.” That all sounds nice, doesn’t it.

Oh, by the way, the key thing to remember is this: “GCRF is protected science spend that is also part of the Government’s pledge to allocate 0.7% of Gross National Income to Official Development Assistance (ODA).”

Maybe I’m just a cynic, but it does seem to me that there are a number of issues around the GCRF that deserve a little more honest acknowledgement before everyone (individual researchers, research teams, departments, whole Universities) rushes off to re-brand themselves as international development specialists (I’m not one, so I’m not being defensive). The GCRF is a deeply political initiative, in the sense that it involves all sorts of pitfalls and risks and likely unintended consequences that need some thinking through by those being enrolled into the agenda of which it is a part. In fact, the GCRF is ‘political’ in at least three related senses:

  1. First, the GCRF is quite explicitly a re-direction of government spending on ‘international development’ away from the Department nominally charged with that area, DFID, to BIS, the department that is responsible for science, innovation and research (but which would also really like to not spend much money doing very much at all). It is one part of a dispersion of spending on development and aid across a larger number of departments, while allowing the government to remain committed to the principle of spending 0.7% of national income on official development assistance (ODA) (a commitment which is itself, of course, the target of ongoing right-wing campaigning, directed primarily against DFID; this is a rather important context for the re-configuration of aid policy by the current government). The headline story from the November spending review was that the science budget did much better than expected, with real-terms protection over the next five-year period. But this commitment depends on various things being ‘tucked under’, as they say, including the GCRF – it’s not new money for science at all, it is DFID’s money handed over to BIS. Depending on how you look at it, the GCRF is either a very clever and quite open accounting scam, or it is a rather wonderful example of having your cake and eating it – an austerity-shaped cake with ODA-shaped sprinkles on top.
  2. So, everyone knows all this, but the point is that the GCRF is part of a concerted reconfiguration of the way in which UK government development funding is organised. The reconfiguration is shaped by an approach now enshrined in the new UK Aid Strategy, which seeks to ‘tackle global challenges in the national interest’. This actually means a refocusing of aid policy around concerns with security, crisis, and emergency. Again, none of this is a secret, it’s all quite well-known. Somebody, somewhere is no doubt already writing the critique of this new policy. In terms of the GCRF specifically, a £1.5 billion pot of money dedicated to ‘ODA’-relevant research has the potential to fundamentally reorient the ethos, one might say, of UK scientific research. On the other hand, it also looks like a move to direct more ‘development’-related spending to the UK. GCRF is explicitly premised on the idea that “research directly and primarily relevant to the problems of developing countries may be counted as ODA. The costs may still be counted as ODA if the research is carried out in a developed country.” That’s why everyone is so much more excited about this than they have been by the Newton Fund, which is much more explicitly about the difficult work of building partnerships and capacity with international collaborators (and the GCRF is a lot more money than Newton). Whether and how the GCRF will help generate capacity-building elsewhere, rather than having its ODA criteria met through standard ‘impact’ models, is just one dimension of the future politics of the GCRF. On the one hand, then, GCRF redirects ‘development’ money to UK institutions; on the other hand, this money comes with very thick strings attached (apart from everything else, the GCRF is also just one example of a widespread and disturbing move to centralize strategic decision-making about what counts as science that is evident elsewhere in government higher education policy).
  3. As I say, all of this is publicly known, although it seems to me interesting how little of this context is being acknowledged as the GCRF is rolled out. There is some growing awareness of what it all might imply. In one interpretation, for example, the GCRF has been identified as ‘hoovering up extra science cash’ for ‘developing world problems’. That’s true in a sense, although as already indicated, the ‘extra cash’ was always already development-related money. No one is actually taking money away from non-ODA-able research funds for the GCRF – it’s that any extra money the science budget is getting, to make it appear as if it is ‘protected in real terms’, is actually coming from DFID’s coffers, without actually being administered by DFID (my point is not that DFID is a model of idealistic efficiency; there is already a rather contested institutional field assessing whether international aid strategy does any good (see the ICAI website), and that field is only likely to get a lot more complicated when it is not primarily focussed on the accountability of DFID). There is a bit of a Duck-Rabbit issue here: rather than thinking of the GCRF as ‘a new funding stream’, it might be better to acknowledge that it effectively obliges a significantly greater proportion of science and research to get engaged with the world of international development issues. This is where the more mundane, but very real politics of the GCRF is going to unfold: no doubt there will be an initial rush to re-badge current research as ODA-compliant (by Universities and funders and government departments), but over the more medium term this all implies either very significant transformations in how research agendas are shaped and delivered, or an ongoing finessing and revision of ODA criteria to justify, nationally and internationally, the redistribution of money away from traditional fields of development policy. That’s a politics already going on, and it is evidenced by the recurring theme of ‘uncertainty’ and ‘need for clarification’ in the commentaries around the GCRF ever since the November spending review.

It seems likely that an awful lot of people in British Universities are suddenly going to be learning about the SDGs, scurrying around to find people in their institution who have ever visited Mali or Cambodia, and, I suspect, engaging in more or less unreconstructed paternalistic and patronising ‘development-speak’. It’s best not to be too credulous about the public statements about tackling extreme poverty and helping the most vulnerable – if Universities are going to be drawn much more holistically into the world of international development policy, driven by nice-sounding funding streams, then they are, of course, going to be drawn into a world that is complex, and grubby, and deeply compromised (‘Aid as Imperialism’, anyone?). There is, of course, a very real politics of development assistance already, that lots of people in Universities might hopefully be about to learn a little bit more about.

Be careful what you wish for!

Are We There Yet? Or, is this what fieldwork feels like?

I have just returned from Johannesburg, a city I have not been to since 1997, when I first went to South Africa. I had a nice time, and as ever, I learnt a lot in a short space of time by being in a very different place. I have spent lots of time in South Africa in between that first trip and now, but apart from going in and out of the airport and a brief day-trip in the early 2000s, not any time in Jo’burg. So it was an occasion for reflecting on what it is I have been doing coming and going to South Africa in the meantime.

I remain unsure whether or not the time I have spent in South Africa counts as ‘fieldwork’, a rather precious idea in GeographyLand, the everyday world which I inhabit. Does visiting other people’s countries and finding things out about them count as ‘fieldwork’? I certainly think I have done ‘research’ in South Africa (actually, mainly, in Durban), but I’m still not sure why I am meant to think that the quality or significance of research is meant to depend on the implied sense of immersion or exposure associated with the idea of fieldwork.

I have been to South Africa 17 times in the last 19 years (it’s a long flight, you have time to count these things…). Adding up all those trips, which have been as long (or not?) as 3 months and as short as a week, I have spent almost a whole year of my life there since 1997. These trips have been funded by ‘seed’ money from the University of Reading, the OU, Exeter (and who knows what grew from that money), and by proper grown-up research funding from the British Academy, and especially from the Leverhulme Trust (an historically ambivalent source of funding for African research, it should be said). Some of these trips have been associated with formal research projects, some of them with conferences, and some of them just occasions to go and meet people and find things out. And it should be said that pretty much anything I have learnt while in this other place has been dependent on the generosity of South African academics, activists, lawyers, policy makers, journalists, and the like – generosity with their time, their insight, and their own analysis of the world they live in. ‘Being there’ turns out to be an opportunity to listen to the testimony of others.

Actually, the more I go to South Africa, the less and less I think of it as a place in which to pretend to do ‘research’ – I initially went to do research on media policy, on my own, in my own name; but then I ended up collaborating with other people, which seems the only reasonable way of proceeding – in my case, falling under the spell of Di Scott, and then being part of a multi-person project on democracy in Durban with all sorts of other nice and smart people, and more recently accidentally conjuring writing projects with Sue Parnell and a shared project with Sophie Oldfield. Along the way, I have passed through all sorts of spaces of research knowledge: hotels, apartments, different cities, taxis, bookshops, beaches, living rooms, offices, bookshops, coffee shops, libraries, bookshops, shopping malls, bookshops in shopping malls. I have gone from researching media policy to researching urban-based environmental politics, using ‘methods’ ranging from interviewing to watching TV and listening to the radio, to using more or less formal ‘archives’, on one occasion delivered in person as a pile of paper, on another accessed by being ushered into a cupboard at the SABC.

I’ve actually learnt a lot about Theory across all these visits, in a weird inversion of Paulin Hountondji’s account of Africa’s ‘theoretical extraversion’ – about the way that ideas of the public sphere, or governmentality, or class, or decolonisation, amongst others, resonate and settle in a place like South Africa. Most recently, this has been my main excuse for visiting, to learn more about how ‘urban theory’ circulates through and emerges from South African situations.

So, anyway, I wonder still why it is that time spent in South Africa should present itself (to me, but also to others faced with me) as a source of something like ‘field’ experience in a way that, for example, time spent in the USA seems not to. I have, I think (I know), actually spent more time in the States as an adult than I have in South Africa, including a whole year of immersive ethnographic observation of GeographyLand at Ohio State. I have an American sister. I’ve walked pretty much the entire length of Peachtree Street (although not all at once). But none of that is translatable into a claim of professional expertise about American life and culture and politics in the way that, I suspect, time in South Africa could be. And in saying that, I know it is the case because I have a distinct sense that I have not been very good at constructing an aura of either ‘developmental’ or ‘ethnographic’ or ‘(South) Africanist’ expertise on the basis of all that time in South Africa.

And now back to life in Swindon. A non-city much the same age as Durban, half a century older than Johannesburg, and about 300 years younger than Cape Town. But no less weird than any of them.

Cultural Geography is Dead! Long Live Cultural Geography?

I’ve been pondering a new paper in Progress in Human Geography by my former OU colleague, Gillian Rose, which addresses the conceptual and methodological challenges presented to cultural geography by the emergence of digital modes of cultural practice. The paper is entitled ‘Rethinking the geographies of cultural “objects” through digital technologies: interface, network and friction’. Here is the abstract:

This paper addresses how geographers conceptualize cultural artifacts. Many geographical studies of cultural objects continue to depend heavily on an approach developed as part of the ‘new cultural geography’ in the 1980s. That approach examined the cultural politics of representations of place, space and landscape by undertaking close readings of specific cultural objects. Over three decades on, the cultural field (certainly in the Global North) has changed fundamentally, as digital technologies for the creation and dissemination of meaning have become extraordinarily pervasive and diverse. Yet geographical studies of cultural objects have thus far neglected to consider the conceptual and methodological implications of this shift. This paper argues that such studies must begin to map the complexities of digitally-mediated cultural production, circulation and interpretation. It will argue that, to do this, it is necessary to move away from the attentive gaze on stable cultural objects as formulated by some of the new cultural geography, and instead focus on mapping the dynamics of the production, circulation and modification of meaning at digital interfaces and across frictional networks.

There is a lot going on in the paper, but two things strike me as important about it: first, it brings into view, that is, it explicitly names the distinctive object of analysis upon which a significant amount of so-called ‘new cultural geography’ depended; and then, secondly, it announces that this object of analysis and associated methodologies of ‘reading’ are more or less redundant. That’s not quite how Gillian puts it, admittedly, but it’s not far off. (It should probably be noted that not all forms of ‘reading’ necessarily presume the specific type of ‘object’ that Gillian defines in her paper – more on that below).

Now, I happen to think that to a large extent both ‘new cultural geography’ and ‘the cultural turn’ really refer to a series of missed opportunities. And it’s in light of this prejudice of mine that I have been provoked by Gillian’s paper.  Amongst other things, I have always wondered how this entire field has ever managed to be taken quite so seriously, indeed how it ever managed to take itself quite so seriously, while seeming to be constituted as if radio and television were never invented, or indeed as if The Beatles, Elvis, or The Supremes never happened (interesting work on these worlds had tended to be produced by economic geographers and others, not by cultural geographers). Cultural Geography has always seemed to me to be a bit un-Pop. This is partly, as my colleague Sam Kinsley has suggested, to do with an aversion to considering ‘vulgar’ cultural forms as worthy of attention; but as he further suggests, this has implications for how geographers think about what one might call the ‘ontology of media’.

My second frame for thinking about Gillian’s argument is a broader thought, another prejudice of mine if you will, about the ways in which human geography’s narratives of disciplinary ‘progress’ often tend to invest heavily in the idea that the best way of moving forward is by compounding a series of previously accumulated errors (see: ‘non-representational theory’).

So here, I want to pinpoint one or two aspects of Gillian’s argument about the challenge of digital technologies to cultural geography that might be framed slightly differently: partly these are matters to do with the constitutive elision of ‘the fact of television’, to borrow a phrase from Stanley Cavell, although I would be inclined to extend this into a more encompassing notion of ‘the fact of pop’; and beyond this, to questions of how to avoid mis-attributing to one specific media form a set of relational features around which a broader project of differentiating cultural mediums might be pursued.

1). The work of art before and after the age of digital reproduction

The focus of Gillian’s paper is on “the legacy of those new cultural geographers who were concerned to interpret cultural objects”. She is referring to what one might characterise as the self-consciously ‘arty’ end of the spectrum of approaches to cultural analysis in geography, not so much because of its focus on arty-artefacts per se, but because of a distinctively arty concept of the object of cultural analysis. As she puts it, the focus is on discerning the meanings of “stable cultural objects”, such as maps, buildings, films, novels, and photographs. The paper does not say so clearly, but this is a strand of work that has operated with a quite distinct set of understandings of “meaning” and “reading”, when compared, say, to the type of ethnographic work on ordinary food cultures developed by Peter Jackson (which elaborates a clear sense of the notion that ‘meaning is use’), or the work developed by Don Mitchell excavating the hidden injuries of landscape, or indeed Gillian’s own work on the practices of domestic photography. I’ll leave it to others to determine how far the particular strand of work targeted by Gillian in this piece is representative of the best of the whole field.

Gillian’s argument is that the assumptions about the stable objects of cultural geography have been unsettled by the rise of digital modes of cultural production and distribution. As she puts it, “since the creation of so many cultural objects – though certainly not all, and not everywhere – is digitally mediated now, the stable cultural object is currently the rare exception rather than the rule.” The related claim that “close reading of stable cultural objects is ill-equipped to engage with the defining characteristics of contemporary, digitally-mediated cultural activity” is true enough. But I do wonder why the kind of approach that Gillian focuses on in this discussion was ever considered adequate, 25 years ago and ever since? Or, to put it another way, why is it that it is the fact of digital technology that seems to be the occasion for presenting cultural geography (of one sort at least) with the challenge of grappling with the constitutive role of technologies of dispersal, iteration, recomposition and translation in cultural life? And further, what might be elided by making ‘the digital’ so central to this conceptual and methodological disruption?

In accounting for the predilection for analysing stable cultural objects, Gillian refers to Walter Benjamin’s account of ‘aura’. Her suggestion is that the canonical objects of cultural geography were ‘auratic’ objects: “the new cultural geography emerged at a historical moment when the vast majority of cultural objects could be traced back to an original: an original manuscript, a building, a reel of film, a map.” Gillian’s strong implication is that these forms are, indeed, auratic objects. Now, it seems more plausible that this may have been how cultural geography constructed its objects of analysis. Either way, in so far as it holds true, then it is actually quite extraordinary. Benjamin’s point, in so far as it is a simple one, was that the auratic understanding of cultural artefacts was lost to modernity, and that modern modes of cultural practice opened up wholly new forms of apprehending and experiencing meaning. The argument is an inherently spatial one, in so far as aura is a concept related to the here-and-now presence of a subject before an object as the scene for a certain sort of aesthetic experience. The loss of aura is, in turn, a kind of shattering or dislocation of aesthetic experience, but crucially, of course, it is a ‘loss’ that is found to be always already inscribed within the movement of cultural life (I am paraphrasing here, largely on the basis of half-forgotten readings of Samuel Weber’s rendition of Benjamin’s work on ‘mediauras’ and on the centrality of the suffix ‘-abilities’ to Benjamin’s style of conceptual analysis).

To be clear, Gillian’s presentation of how cultural geography addresses a stable cultural object certainly rings true to me. But in so far as it is accurate, we should be clear that this is the result of a motivated theoretical construction, it is not a result of the innate characteristics of cultural practice three decades ago. The significance of Benjamin’s accounts of aura, of ‘the work of art in the age of mechanical reproduction’, of translation, and other themes, all written in the 1920s and 1930s, has always been in providing prescient resources for understanding the spatially dispersed and temporally strung-out forms of culture that already defined his time (print, film, radio) as well as ones soon to come (television, video, digital). Which gives rise to the question of how in the world cultural geography ever got away with holding so strongly to what, from a strictly Benjaminian perspective, looks like a distinctively pre-modern concept of culture?

Gillian’s claim in the paper is that received methods of “close reading” of “cultural texts” need to be reconsidered, indeed supplanted, because of the changes wrought by the rise of digital technologies: “For in the three decades or so since the emergence of the new cultural geography, both cultural objects and the technologies and practices in which they are embedded have altered significantly. Over the past 30 years there have been profound changes in the processes and practices of cultural production, in the circulation and display of cultural objects, and in the processes of audiencing, participation and critique.” Taken in isolation, this reads as an uncontroversial claim. But remember, what Gillian is arguing here is that these new developments challenge a notion of ‘stable cultural objects’ understood as more or less ‘auratic’ forms, containing more or less determinate meanings. My point is that this notion of culture was already redundant way back in the 1980s, when we were all busy learning to love our video machines and wrecking the music industry by taping the Top 40 from the radio and listening to mix-tapes on our Walkmans. Cultural meaning did not become dispersed across multiple sites, spread across multiple media platforms, ‘massified’, or split up and recombined across fragile networks only recently, in the last couple of decades. Nor did this start in the 20s or 30s, when Benjamin was writing (his point is that it has always been happening, that it is a movement that lies at the source of any and all ‘originals’).

I am trying to make two related points. First, that digital technologies no doubt introduce all sorts of new dimensions into cultural life, but that whatever these might be, they are not best understood by reference to the idea of stable cultural objects that have held cultural geography in thrall. Secondly, the stability of cultural objects presumed by cultural geography, according to Gillian’s account, should not be mistaken for some sort of inherent ‘material’ feature of forms such as the novel, films, or photographs. If this is how cultural geography thought of its objects of analysis from the 1980s onwards, then this is something that needs to be accounted for on other grounds (as a specific response to a certain intuition of loss, perhaps?). Approaching paintings, or photos, or novels as stable cultural objects to be read for meaning is a particular achievement, one that depends on various procedures such as practices of exhibiting, or paratextual networks of reviewing and marketing. Take, for example, the way in which ‘Film’ has become a staple object of analysis not just in cultural geography, but in other fields such as Classics and Political Theory in the last three decades. Before that, the academic analysis of Film, and its most famous theoretical products such as Auteur Theory or Screen Theory, were largely the preserve of specialist film schools. Now, we can all do it. This generalisation of ‘Film’ as a potential stable object of academic analysis is dependent, of course, on the widespread dissemination of video technology from the 1980s onwards, that is, it is dependent on the widespread and cheap distribution of an archive of film history, and the possibility of recording films off the telly, and in turn the possibility for anyone to watch and re-watch, stop and pause and rewind, and to do so not only as ‘research’ but also as a teachable methodology.

This is just one example of how the stable cultural objects that cultural geography focussed on were made available by a series of distributed, networked, mobile technologies that stand as the conditions of possibility of that imputed autonomy and stability. (You could make a similar argument about the degree to which the emergence of a shareable canon of Theory, upon which ‘the cultural turn’ depended and which could be learnt and mastered even in an odd discipline like Geography, was dependent on the photocopier). And I invoke this example because it indicates how the attributes that Gillian defines as peculiarly new ones, associated with digital technologies, are not just discernible in other modes of cultural practice, but more precisely, that the erasure of these modes of mediation from ‘new cultural geography’ might well be the condition for the particular framing of the challenge of digital technology as it is now felt in geography and articulated so clearly in Gillian’s paper.

2). Acknowledging media

Gillian’s argument is that the artefacts of digital technologies are distinguished by three features, understood by reference to the magical signifier ‘materiality’: they are mutable, multimedial, and mass. I think the categories are really useful, but they clearly do not categorically distinguish digital artefacts from non-digital ones (they only appear to do so because of what we have established to be a bizarrely restrictive construction of the object of cultural-geographic analysis). I think there is a danger here of reserving for one particular mode of cultural practice, the digitally mediated, a set of features that actually should be better understood as relational terms of comparative analysis and judgement, as if they were attributes of a particular mode. The language of ‘materiality’ just reinforces this tendency, which is a kind of category error.

Lots of practices of meaning are mutable (you can forge other people’s handwriting, or fake photographs of Lee Harvey Oswald in the backyard (maybe)); lots of cultural forms are multi-medial (song is a theatrical form, an amplified form, a recorded form, only sometimes all at the same time; films have soundtracks); writing, as Raymond Williams memorably demonstrated, is a form inscribed in all sort of non-literary cultural forms, from public speaking to theatre to television and film (and digital technologies are significant not least, surely, for reviving and inventing a range of practices of literacy); and ‘mass’ culture, defined by what Gillian refers to as ‘the sheer amount of cultural production now’, but which really refers to a difficulty of containing and tracking where meaning flows that is not just about quantity, has been with us for quite some time, at least since the time of Caxton.

I suspect that the difference that digital technologies make to these practices of translation, movement, and projection is better theorised in terms of the reconfiguration of parameters of speed, expertise and, perhaps crucially, cost. Trying to pin down the distinctive features of digital technology by reference to the assumptions made about stable cultural objects, assumptions that we have seen depended on pretending that a whole history of modern media simply did not impinge on cultural-geographic analysis, threatens not so much to misapprehend what is new and different about digital technologies as to misconstrue how to go about conceptualising what is new and different about any media practice. Going back to Benjamin, one thing we might think about is the idea that historically novel forms actually throw into new relief the characteristics of ‘old’ ones – they enable us to acknowledge features of the old ones previously unavailable to perception or sense. Related to this, we might pause and consider the degree to which thinking seriously about culture and media and technology really requires us to engage in some reflection on the nature of a particular sort of reasoning, namely analogical reasoning. ‘New’ media and the cultural forms they make possible are routinely made sense of through a process of selecting and enforcing authoritative analogies: this is the case in legal decision-making about new technologies; it is also evident in the very names given to innovative forms of cultural expression associated with new digital practices – forms such as e-mail, webpages, YouTube. These are not mere lazy affectations, they are small indices of the ways in which ‘new’ media forms emerge through processes of learning that draw on formal and informal competencies to draw and act upon appropriate analogies.

Gillian’s analysis of these three features of digital technology culminates in a claim about the distinctive spatiality of digital culture, according to which the analytical challenge is to appreciate that “meaning is performed and materialized at specific sites; it is accessed, made to travel, searched for, modified, patched and laboured over in an uneven, variable and frictional network held together by diverse forms of work which do not always succeed in making meaning move.” This is a great description of how we might conceptualise the geography of meaning; only, it seems to me that it stands as a perfectly good description of modern print cultures, or of how broadcasting emerged as a cultural form in the 1920s, or indeed, a quite good paraphrase of what certain sorts of literary theorists once conceptualised as ‘textuality’. Again, my point is not to suggest that there is nothing new or distinctive or unsettling about digital media, just that the interesting question is to ask how these dimensions are configured by this mode of meaning-making, rather than supposing that they are emphatically characteristics of this mode alone.

There are important questions raised by Gillian’s paper about how one might approach the task of doing ‘media ontology’. I happen to think that thinking in terms of the ‘materialities’ of particular media or forms or technologies is likely to lead us astray, not least by encouraging the mis-attribution of relational modalities or emergent ‘-abilities’ to singular forms or technologies. I prefer thinking about what Albert Hirschman liked to call ‘structural characteristics’ of practices, by which he meant the different combinations of spatial and temporal wiggle-room or latitude that shaped the pathways of different projects. I also like Cavell’s style of thinking through the ontology of film, as well as television, one which gets at what is distinctive about different mediums by asking, for example, what it is about a new medium that attracts disapproval. But more profoundly, Cavell thinks of the ontology of different mediums as what it is that they allow to be revealed or acknowledged about the human condition (and yes, this requires a certain sort of ‘reading’ of more or less canonical objects, but not of the sort which would be much approved of by cultural geographers, I suspect (it would appear too naively characterological); besides, perhaps we should also allow that there is more than one way of ‘reading’ a ‘cultural text’, that Fredric Jameson’s style of addressing a film or novel as a totalising crystallisation of historical epochs is not quite the same as Robert Pippin’s reading of Westerns or Noir as allegories for certain recurrent political dilemmas, and further, that none of these examples looks much like anything undertaken in cultural geography).

Cavell’s discussion of ‘the fact of television’ revolves around the idea that there might be something about TV that seems to resist acknowledgement, that it seems to be a medium distinguished by its being so taken-for-grantedly there and available (the occasion for Cavell’s discussion was the early 1980s ‘video revolution’). So the absence of TV from cultural geography is not necessarily a failure, it might be part of a broader phenomenon (one related, while I think about it, to the degree to which a great deal of critical academic discourse is shaped by an understanding of pressing political imperatives that derive from the world routinely disclosed to us as ‘News’). One of Cavell’s recurring concerns is with thinking of the distinctive qualities of different mediums in terms of genre. The problem of genre is for him the entry point for acknowledging the ontological qualities of film, or television, or painting. One of the qualities of film that passes over into television, he suggests, is the series; television, in turn, he suggests, is characterised not so much by particular objects as by formats (like the sit-com). The point of recalling this sort of analysis is to indicate how the singular, stable objects of cultural analysis are made available to us from within these elusively structured modes of making meaning (and by the forms of forgetting that inhere within them too).

Anyway, all of this work about ‘genres’, ‘structural characteristics’, and ‘-abilities’ has one thing in common that might still meet with resistance from the paradigm of new cultural geography that Gillian’s paper addresses: none of it allows one to suppose that the best way of approaching cultural analysis is by supposing that cultural forms somehow shape or change subjectivities. The idea of subjectivity is the principle of totalization that continues to anchor cultural geography – from the presumption that culture is a medium for the construction or, worse, the production of subjectivity; to the ways in which this same idea remains the primary reference point for asserting the significance of stories about affectively imbued flows and encounters; and now, it seems, an interest in distinctive forms of digital or online subjectivity. It is this idea – that there is a thing called ‘subjectivity’ that it is the task of cultural analysis to comprehend in all its contingency and variety by attending to its modes of production – that is the most enduring feature of the paradigm of analysis that focusses on finding the meaning of stable cultural objects. And for as long as this anchor point remains in place, taken for granted even when disavowed, little progress will have been made in moving beyond the closures of the new cultural geography.

3). The pressure for meaning

I have been assuming throughout my discussion here that Gillian is essentially correct in saying that the new cultural geography rose to prominence through the elevation of a distinctive method of reading for the meanings of stable cultural objects. I have suggested that this should be recognised as a motivated construction, rather than a more or less natural response to the ‘materialities’ of pre-digital media cultures. And I have tried to raise some questions about what we are to make of this closure of questions about the mediums of media cultures, a closure that I think might well continue to frame discussions of the challenge presented by digital technologies to established paradigms of geographical analysis. I have also suggested there is one thing that remains constant across Gillian’s discussion of the new cultural geography and its stable cultural objects, and the new forms associated with the interfaces, frictions and networks of digital cultures: the assumption that the main thing at stake is understanding something called ‘subjectivity’. What remains constant, across more constructivist approaches, self-righteously ‘non-representational’ approaches, and new work on digital culture, is the strong idea that cultural technologies do things to people, and that understanding what they do to people is the key concern that justifies ‘critical’ analysis.

The persistence of this problematic of subjectivity is indicated by Gillian’s refrain about the need to attend to how the “forms of contemporary subjectivity” are being changed by digital technologies. Once upon a time, the idea that subjectivity was constructed through culture depended on the assumption of stable spatial and temporal relations between a singular cultural object and a fixed viewer/reader. These days, the image that recurs is one of mobile bodies immersed within environments saturated with affectively configured meanings, moving from one screen to the next. In both cases, ‘the subject’ is assumed to be totally encompassed within the milieu of its own subjection. It’s the recurring image that once underwrote important arguments about ‘cultural politics’ and assertions of ‘resistance’, and which now underwrites misanthropic arguments about the ability of states to manipulate people’s feelings or about the real subsumption of subjectivity to capital.

In Gillian’s argument, there is an analytical slippage involved in counterposing the idealised model of a viewer/reader in front of a photo, film, or book to a nuanced description of the conditions through which digital technologies enable cultural forms to be produced and circulated. This is not comparing like with like, it should be admitted. I’ve already suggested that the conditions through which those stable cultural objects are made available for analysis are not quite so different from the conditions defined as distinctively ‘digital’. But one might perhaps be a little more charitable towards the analytical constructs of the new cultural geography. One thing that this mode of analysis does at least begin to approximate is the ordinary ways in which cultural forms are apprehended – as novels, as films, and so on. The concepts of ‘reading’ invoked in such work were highly stylised versions of more ordinary modes of engaging with cultural forms, to be sure. But they do at least acknowledge that people engage with identifiable cultural forms, and not with technologies. Gillian’s characterisation of the distinctive features of digital culture seems to take for granted the adequacy of the previous formation of stable cultural objects in their own time, but in the wrong way. Reckoning only with the obvious limitations of that paradigm threatens to erase its virtues (an appreciation of the ordinary forms through which culture circulates): a complex, nuanced understanding of the modes of production and distribution of cultural forms is, after all, only ever interesting in relation to a concern with those ordinary formations – it is not a substitute for them, and it is certainly not the secret to understanding how power is exercised through mediated cultural artefacts.

It is best not to think of any type of understanding of the conditions of meaning as somehow throwing ‘critical’ light upon ordinary forms of engagement; as revealing constructions of subjectivity, the exercise of power, or the manipulations of affect. It is better to think of any such understanding as a resource for the better appreciation of what is at stake in those ordinary forms of apprehending cultural forms. Having outlined an account of the networks through which digital culture circulates, Gillian suggests in her conclusion that there is a need for “a richer analytical vocabulary for the power relations performed through this convergent network”. Perhaps what is really needed is a reassessment of the very idea that culture is a medium for exercising power at all; and a reassessment too of the idea which anchors this assumption about power, namely that it is at all respectable to think of people’s subjectivities as primarily formed in a subordinate relationship with their favoured cultural forms. In fundamental respects, the paradigm of cultural analysis that Gillian dissects in this paper might well get things the wrong way around, making a mistake that the diagnosis of digital cultures is only likely to compound for as long as it is not recognised: what William Kentridge calls the ‘pressure for meaning’ is not best thought of as an imperative imposed upon subjects by so many produced, circulated, distributed, dispersed cultural forms; it is, rather, something that we bring to those forms, more or less expertly, more or less successfully, and with more or less serious or hilarious consequences.

What are the humanities good for?

There is, apparently, a ‘war against the humanities‘ going on in British higher education, according to a piece in The Observer this weekend. The piece cites as its primary evidence for this ‘war’ the perspectives of scholars from the humanities, of course, lamenting the effects of changes to funding regimes, and of the culture of management in British Universities, on the proper pursuit of scholarship.

I always worry when ‘the humanities’ is used as a catch-all to encompass the social sciences as well as more ‘arts’-type fields. It is true, of course, that both arts and social sciences disciplines have suffered from the same funding changes since 2010, but I’m not quite sure that the standard ‘whither the humanities?’ style of criticism of higher education policy over this period necessarily sheds much light on what is really going on, or on how best to evaluate it. The piece in The Observer shares various features of a broader genre of criticism of higher education transformation in the name of ‘the humanities’:

First, as already noted, it conflates a range of different disciplines, but presents next to no insight from anyone who looks or sounds like a social scientist. No doubt we could argue about whether the social sciences count as ‘humanities’ or not, but in this sort of piece, it turns out that ‘the humanities’ really means literary and arts-based fields and forms of analysis. Therein lie the values most under threat from funding changes and top-down management styles and impact agendas. Amongst other things, one effect of this elision of social science is a tendency to present ‘the sciences’ as the more or less unwitting bad guys in the story. Two cultures, all over again, one of which is always a bit too uncultured.

Second, the lament about the squeezing of ‘humanities’ is often enough made in the name of the values of criticism and critique, but I do wonder whether we should really look to ‘the humanities’ for our models of these practices anymore. To be fair, there is a ‘social science’ version of the same lament. John Holmwood, for example, has written in much the same vein recently about the apparent marginalisation of the critical voice of social sciences in British public debate. Holmwood worries that social science is being shaped too pragmatically, in such a way as to displace attention to social structures. I dare say that an appeal to the value of social science as lying in access to knowledge of structures and possibilities of change bears some structural similarity to the form of discerning insight that ‘the humanities’ are meant to have. In both cases, ‘critique’ is the magical practice that is best able to articulate with public worlds by maintaining a certain sort of distance from them.

The genre is remarkably resilient, it seems, even resurgent. Unhappily, it turns on quite conventional oppositions between (bad) instrumental knowledge and (good) critical knowledge. Somewhere in between, the scope for thinking about different versions of instrumentality gets lost, and the critical voice gets snared in its own contradictions, being forced to disavow various public entanglements (the impact agenda, most obviously, or treating students as adults, rather more implicitly), in the name of a weakly expressed ideal of the worldly force of ‘really useless knowledge’.

There is much to lament about the state of British higher education. And there is, of course, a ‘campaign for social science‘, which has recently managed to produce a deeply embarrassing representation of the value of social science that might well confirm all one’s suspicions about the selling-out of social scientists to ‘neoliberal agendas’ (we are in ‘the business of people‘, apparently). Social science is, of course, a divided field, as Holmwood implies. So too, one might suspect, are ‘the humanities’. The resilience of the ‘two cultures’ genre has been evident since 2010, at least, when arguments in the defence of the ‘public university’ took off in response to Coalition policy changes. It was evident, for example, in the controversy around the AHRC’s alignment with ‘the big society’ agenda (remember that?). That episode illustrated the division within the humanities I just mentioned, rather than an impure imposition of pernicious instrumentalism from the outside. It turns out, of course, that the humanities are really good at being instrumentally useful, at knowing how to ‘sell-out’; not least, humanities fields have been at the forefront of legitimizing the impact agenda both in principle and in practice (as evidenced by evaluations of impact submissions and indicators in the 2014 REF exercise).

The ‘two cultures’ genre is always a trap, not least in the current conjuncture when the defence of ‘the value of the humanities’ is made alongside sweeping references to neoliberalization of higher education. Like it or not, the restructuring of higher education in Britain, and elsewhere, is explicitly made in the name of public values like accountability and social mobility; as a result, the defence of ‘the humanities’ always already suffers from a populist deficit when articulated from within the confines of the two cultures genre, however refined that has become in the hands of Stefan Collini or Martha Nussbaum. ‘Neoliberalism’ is, of course, a social science concept, but not a very good one, especially in this context, because in its most sophisticated varieties, it doesn’t allow you to recognise that contemporary political-economic processes involve the reconfiguration of the means and ends of public life, rather than just a straightforward diminution of public life (here represented by ‘the humanities’) in the face of privatisation, individualism, and competition.

Herein lies the real problem with the elision of social science into a precious view of ‘the humanities’ as the repository of irreducibly qualitative values: the defence of the humanities is generally made via a simplistic conceptual vocabulary of ‘the market’, ‘the state’, ‘bureaucracy’, and other hoary old figures of the forces of philistinism. There is a critique, certainly, to be made of trends in higher education in the UK, but it probably requires better social science, better social theory, than the prevalent defence of ‘the humanities’ seems able or willing to muster. It would require, amongst other things, giving up on the idea that critique is a special preserve of ‘the humanities’, or indeed that it requires discerning access to structural analysis.

On Stoddart

I was saddened to hear of the death of the geographer David Stoddart. The Guardian has an obituary, written by Peter Haggett, and The Independent has one by Tam Dalyell, with whom Stoddart campaigned to save Aldabra from being used as a military base; and there is an appreciation on the Berkeley Geography Department website.

Stoddart is the main influence on me becoming a Geographer, or at least on remaining in Geography long enough to become one. I am the last-but-one Geography undergraduate he admitted before leaving Cambridge, and no-one had applied to his College for a couple of years before me. Later on, it occurred to me that this might have been why I got in – I assume he wasn’t going to look too hard at the stray application that did turn up (I only applied to that College because they offered the best accommodation deal). I was taught by Stoddart, my Director of Studies, for a year, in his office in the Department of Geography (he had effectively ceased to actually visit the College some time before). He wasn’t actually around when my first term started; he arrived a couple of weeks later, having been away in California, securing the Chair to which he moved at the end of 1987.

For a year, I had one-on-one supervisions with Stoddart, because there weren’t any other Geographers for whom he was responsible (this wasn’t, in my experience, otherwise a normal situation at all; supervisions normally had two or three people in them). In these meetings, I learnt various things. I learnt how to nurse a large glass beaker full of sherry through an hour-long meeting in which someone else was doing a lot of the talking without ending up totally trashed (not a skill that has been called on much since then). Above all, I learnt that Geography was an intellectual vocation. Stoddart’s outward demeanour was, as I recall, rather hearty, but his teaching was focussed on ideas, ideas, and ideas. His supervisions were interrupted by phone calls from Joseph Needham, and full of discussions, by Stoddart, of Darwin. His model of teaching was to send you off to read something for next time, and then when next time came round, you would find yourself talking about something else entirely. As a matter of principle, he didn’t set essays; so I didn’t write any in my first year as an undergraduate, until exams in the summer. This was a model of Geography as reading, like a personalised version of Geography as ‘Greats’ (I tended not to invest so heavily in Stoddart’s predilection for romping around salt marshes in the cold of November).

Stoddart had me read Paul Wheatley’s Pivot of the Four Quarters in my first term, and Clarence Glacken’s Traces of the Rhodian Shore over the Christmas break (let’s not dwell on whether I understood anything going on in these kinds of books). Perhaps most importantly, he pointed me in the direction of David Harvey’s work. Getting me to read Harvey was his strategy to keep me from switching from Geography at the end of my first year. I went to University with the intention of studying Economics, and only started with Geography because if you already had an A-Level in Economics, you did not need to do the first-year Economics course. I thought doing Geography would be a good way of learning a few more facts about desertification and drought before focussing on proper, complex ideas about how the world really worked (which is what doing Economics at school had seemed to be about). When I first met him, at a meet-and-greet event in the Spring before going to University, I had told Stoddart that I liked Keynes (he had asked me who my intellectual hero was, and I didn’t think it wise to say ‘Charles M. Schulz’), and that I had an interest in knowing more about Marxism (probably because of reading too much of the NME). So when he did arrive back after my term had started, he told me to read Harvey, specifically ‘Population, Resources, and the Ideology of Science‘. This essay was almost designed to convert callow just-out-of-school Geographers into critical social scientists. It worked on me. When I later ordered Harvey’s Limits to Capital for the College library, the request was forwarded to the Economics fellow for approval, who declined it on the grounds that this book was already held at the University’s Economics library. Stoddart was furious at this, and insisted on it being ordered as core Geography reading.

By the end of my first year, actually much earlier, I had settled on staying with Geography (helped by the realisation that Economics was really just abstracted applied algebra). This was because I had discovered a whole world of social theory, a world full of Marxism and feminism and Giddens, a world in which it turned out that everyone was talking about politics and power. And I had discovered this world in no small part because Stoddart encouraged me in that direction, and also because he demonstrated to me through his own work and teaching style that Geography was the place to stay if you were really interested in pursuing ideas.

Emergent Publics?

I gave a talk a week or so ago at a conference on New Perspectives on the Problems of the Public, at the University of Westminster. I presented a version of a paper titled ‘Theorising Emergent Publics’, soon to be published I hope, and which is an attempt to say out loud some of the things I learnt through my involvement on the ESRC Emergent Publics project that Nick Mahony, Janet Newman and I ‘convened’ a few years back now. The paper tries to think through the problem of making use of concepts like the public sphere, or public space, public-whatever, which are inherently normative but which have an empirical reference, and to do so in a non-reductive, not-backward-looking way. The term ‘emergent’ is meant to flag this problem of thinking about how to use normative concepts as they are meant to be used – evaluatively – in relation to ‘new’ formations of public life which don’t conform to established models of what public life is and should be.

Last time I talked about this theme, at an event in Ottawa, I came away having realised that the issue of ‘attention’ really deserved, well, more attention in discussions of publicness (that’s one paper I still haven’t written up…). This time, someone asked me what the ‘emergent’ bit meant in the title of the paper. Good question! It’s taken 6 years for anyone to ask that one. This is a dimension of the Emergent Publics project that we never really developed, it’s true (I have collected an awful lot of things to read on this topic…. Another unwritten paper). The thing about ‘emergent’ or ‘emergence’ is that it’s not just a smart word for saying ‘new things’, although it is that too. That’s what the question was getting at, I think (obviously, at the time, I blagged my way around the difficult question). Without consulting that pile of paper I mentioned, here is my first-cut at the different strands of thought that one might invoke to think through what the relevance of ‘emergent’ might be in talking about ‘emergent publics’ (actually, the Understanding Society blog by Daniel Little has a set of discussions on this topic and its relevance to social theory which is probably the best place to start):

–       One obvious reference point is Raymond Williams’ account of dominant, residual and emergent cultural formations. This is most useful as a descriptive framework, as a kind of starting point for mapping out relationships and assessing the relative powers of different practices.

–       Next, depending on your age and inclination, perhaps we should mention critical realism, a field in which the idea of ’emergent properties’ is particularly important. In terms of public things, what this sense of the term, derived of course from a wider set of debates across science and the humanities, points towards is that ‘publics’ arise from conditions to which they are inextricably linked but to which they cannot be reduced. I have in the past discussed this sort of idea with reference to the motif of the parasite, drawn from deconstruction, suggesting that publicness is inherently parasitical, or supplementary if you prefer. I’m not sure that this idea has caught on.

–       The notion of ‘emergence’ in social theory, whatever usage you alight upon, is always referencing the ‘proper’ sense of this idea drawn from physics, biology, and strands of philosophy of mind, particularly around ‘the hard problem’ of consciousness. Whereas in social theory, emergence is a really cool thing to invoke, I think it’s fair to say that in these fields it’s a much more contested idea – important certainly, but far from having the stable, established authority that social science wants the idea of emergence to carry.

–       Never mind, let’s keep going, because then there is perhaps currently the most sexy version of emergence-talk, associated with William Connolly and other versions of Spinoza-inflected vitalist styles of political theory. Connolly’s account of affect, pluralism, neuropolitics and such things cashes out in a discussion of ‘emergent causality’, which sounds like a great idea – the idea that events have conditions, certainly, but that you can’t quite anticipate how any set of given conditions will generate new forms. Now, not only might this not be so distinctive as one might think if you’re old/clunky enough to remember the hey-day of critical realism, but worse, or is it better, yes, it’s better, Connolly seems not to have noticed that his own account of emergent causality is pretty much identical to what Louis Althusser and his friends once called ‘structural causality’. Of course, ‘structural’ causality sounds a little bit deterministic, but it’s actually all about how structures rub up against each other and generate entirely surprising events, like the Russian revolution happening in, oh, Russia – that wasn’t meant to happen, was it? (somewhere along the way, if you’re following, this chain of associations might remind you, or help you see for the first time, or notice what was obvious, that structuralism as a tradition invented the analysis of ‘contingency’ – post-structuralism might, then, be just a footnote to that tradition). Anyway, anyway, by the time one has spotted the ‘overdetermined’ and ‘contradictory’ family resemblances between the ideas of emergence in Althusser, Connolly, Deleuze and anyone else who thinks it’s really obvious what Spinoza was really on about, then you will have arrived at the realization that ‘emergence’ is perhaps not able to do all the work you might want it to do. Emergence is often invoked against the idea of ‘linear causality’ in this sort of work, an idea which is really just a useful straw figure.

–     And then there is Hayek. Oops. The idea that markets are best thought of as ‘spontaneous orders’, which Hayek didn’t invent but did refine and then popularise in a particular way, has been picked up and taken seriously by, for example, Andrew Sayer (remember the critical realist interest in ideas of emergence), and more recently by Warren Magnusson.

There might be other strands I haven’t thought of (I’m writing this off the top of my head). But ending with Hayek is fun, isn’t it? It underlines the degree to which thinking about the ‘emergent’ bit of emergent publics should really have two dimensions to it: the normative/evaluative puzzle, certainly, but also the sense in which publicness is not something best thought of by analogy to our received ideas about construction and/or contingency. One of the things I have noticed about discussions of publicness in my ongoing ethnography of academic understandings of public value over the last few years is a constant temptation to infer a particular lesson from the observation that publics, public spheres, public spaces are not natural, but variable, constructed, assembled: it is routinely assumed that this means that publics, if they are not naturally given, must be actively made, for good or ill; and, by extension, of course, that ‘we’ should be involved in making them better, in better ways.

So, dare I say that Hayek might be really important to theorising the politics of public formation? Maybe that just means that, at the very least, using the vocabulary of ‘emergence’ in relation to publicness should lead us to be more attentive to the hubris that easily attaches itself to discussions of this topic, in which we all too easily find that other people are not virtuous enough but then console ourselves in imagining that our role as academics is to help them be better versions of themselves.

Are there 15 ways to be unhappy? Surfing Bruno Latour’s ‘An Inquiry into Modes of Existence’

1). Samin’ and changin’

I have had Bruno Latour’s An Inquiry into Modes of Existence (AIME) kicking around my desk since last summer, thinking it’s the sort of book one should probably read in case it turns out to be mind-blowingly important. I finally got round to reading it, in a certain manner, recently, encouraged by the setting up of a reading group by the NAMBIO research group in the Geography department here at Exeter, which I have actually not been able to attend until this week. I might not be able to go to the next meeting either, so in the spirit of stretched-out, online thinking that this book is meant to exemplify, I thought I’d try to articulate some of the thoughts that it has provoked in me. (The book is just one element of a more ambitious ‘digital humanities’ project – a website, basically, with some further written material, a glossary, and some interactive activities, where you are invited to assist in the empirical fleshing out of Latour’s ambitious analytical framework).

AIME is a book that invites a certain sort of engagement, and not only because of this hyper-textual dimension to the print version. It has an interesting narrative structure, apart from anything else, involving a series of deferrals from a lead narrator (let’s agree to call him ‘Latour’), telling the story of what an ethnographer amongst ‘the Moderns’ might expect to find, and then ‘the Moderns’ themselves (‘the Moderns’ are a people who believe in sharp distinctions between words and things, apparently. Their voices are not heard at all, throughout. Which may or may not lead you to think they are a made-up people). If I remembered more literary theory, I think I might be able to name this sort of narrative device, which creates both an implied distance between the narrator and the world being described (that of ‘the Moderns’), and an implied first-person intimacy between the narrator and the reader as sharing in the same insights about those who are written about in the third person.

There is lots going on in the book, which is I guess part of the ongoing ‘coming out as a philosopher’ which Latour announced a while ago, but more precisely is a fleshing out of the awkward attempt to give some normative substance to the distinctive ontological drift of Latour’s work, evident in discussions of such things as ‘learning to be affected’ and ‘matters of concern’ (I particularly like the bits on habit, and the general theme of ‘prepositions’, which brought to my mind the work of Gerard Genette on ‘thresholds of interpretation’). What I found most entertaining, and the reason I felt the book might be worth reading, is the way in which it attempts to outline an analytic framework for discerning the internal normativity of different fields of practice (this is not how Latour puts it, I’m translating). I think Latour’s project has various resemblances with similar projects: everything from Foucault’s outlines for doing the ‘history of thought’ (well, actually, it’s everything and anything by Foucault); Boltanski and Thevenot’s account of the coordinating function of practices of justification in various ‘economies of worth’; the analysis of the rationalities of different forms of action by Habermas, of course, and of the different interests served by different forms of knowledge in particular; Goffman’s frame analysis; field theory, from Bourdieu through to Fligstein and McAdam; Rainer Forst’s consideration of normative orders…. You can add your own examples of the sort of thing I’m getting at, if you want. Michael Oakeshott’s Experience and its Modes, perhaps? Kenneth Burke on the ‘grammar of motives’? Needless to say, none of these resemblances is noticed in AIME. I guess they might not pass muster as being adequately attuned to the demands of “ontological realism” (on the other hand, all of them suppose to a greater or lesser degree that conflict is an irreducible dynamic of life in a way in which Latour’s account of controversies arising from mistakes does not).

What has always struck me as most interesting about Latour’s work and that of others associated with ANT and STS is not the grand ontological claims, but the demonstration of the ways in which responsibility, accountability, obligation and the like are dispersed across networks of motives and machines, intentions and insects. From key-fobs to speed-bumps, it’s not interesting to think of all this work as about ontology and materiality; hasn’t it always been about norms (not ontonorms; just norms – the conjunction makes no difference: the onto- is the easy bit; the norms are the difficult part)? If you take these stories as primarily about ontological issues, about symmetry between human and non-human actants, or, more interestingly, as being about distributed agency, then you still miss what seems to me most interesting about them: the key-fob story, from Latour, is about particular values, such as honesty; the speed-bump story is about a different combination of values, such as safety, legality, efficiency. On this reading, this style of onto-inflected work has always been about norms, and in interesting ways (although that only raises the question of why its own authors didn’t seem to notice until quite recently, and/or feel the need to explicate this now). The reason these strands of canonical ANT are interesting, it seems to me, is because they focus attention on some of the weird dimensions of ‘moral’ action: the ways in which the actions for which people might well be held responsible, in one sense or other, can be caused by all sorts of factors beyond their intention or control. These ideas can be found in other fields of social theory and philosophy, no doubt, but I like the idea of reading ANT/STS in this way, against the grain of its own publicity, for sure. Not least because I think it’s a way of drawing attention to an irreducible, shall we say, ‘humanist’ reference in this work, without which it might just not resonate – but a reference the full consequences of which, I also think, are systematically evaded by recourse to the easy trumping of ontological claims (what sort of being cares about ontology, after all?).

In this respect, it’s notable that Latour’s new book is actually all about speech, and more precisely, about the ethics of speech. It is anchored around a concern to elaborate on how different fields of practice are distinguished by their own forms of truth and falsity, in order to assist us all in avoiding making category mistakes. Latour wants to be able to clear up conflicts between the values that shape distinct fields (between science and, perhaps, social studies of science, for example?). These conflicts arise, he seems to suppose, because truth-and-falsity-talk in one realm (e.g. in science) is mistaken for truth-and-falsity-talk in another (e.g. in law). That’s why Gilbert Ryle’s notion of ‘category mistakes’ is so important to the analysis in AIME – Latour wants to help us to avoid making errors of this sort, so that we might all be able to get on a little better. Now, I really like the idea of category mistakes (although I always tend to say ‘category error’, I think because of sitting through lectures by Terry Eagleton long ago. Eagleton has always had a rather good way of mobilizing this idea. I’m not sure if getting the name of this notion actually wrong counts as an error, or a mistake. But it might matter, as we’ll see: you can correct mistakes, and learn from them: error is the stuff of life). If Latour wants to help us avoid category mistakes, he also wants to free speech from “the awkward constraints peculiar to Modernism”. These constraints seem to turn around that clear-cut distinction between words and things, which Latour just can’t help continuously ascribing to the ‘modern’ subjects of his account. The concern with avoiding mistakes is shaped by the imperative to develop the art of ‘speaking well to one’s interlocutors’, by learning to be sensitive to what it is that those from other fields of life are actually doing, what they are going through, what they are concerned about. It is this moral imperative that justifies Latour’s development of an elaborate typology of different ‘modes of existence’, each defined by its own proper forms of truth and falsity.

As I have already admitted, I tend to read the notion of ‘modes of existence’ through the lens of a whole family of related ideas in contemporary social theory. It helps, as a way of working out what might be distinctive about Latour’s approach (it also helps if you suspend your credulity towards the terms of interpretation Latour himself provides – the stuff about the moderns, the grand claims about ontology, the non-human, that sort of thing: all those terms that have become slogans). Roughly speaking, modes of existence are different orders, let’s say, of practice, or life, perhaps, depending on your inclination; as I say, they might look like ‘fields’. Each one (in the course of the book, Latour identifies 15, but that’s not meant to be exhaustive) is associated with ‘distinct forms of experience’; they lay down ‘experiential conditions’ that have their own truth and falsity. Whether this talk of variable forms of experience evokes memories of reading Foucault depends on your own intellectual heritage, I suppose; whether or not Latour’s idea that each mode of existence is characterized by its own proper forms of veridiction also brings Foucault to mind, for you, depends on which bits of Foucault you most like to read. Whether or not you would like to hear more about the personal qualities required in speaking the truth, as a first-person practice of ethical truth-telling, which this notion of veridiction perhaps brings to mind, depends on whether you think Foucault is a more profound thinker about the limits of the human than Latour.

Latour’s project is to identify, he says, the principles of judgment that each mode of existence appeals to in order to decide what is true and false. Modes of existence are presented as having forms of truth and falsity proper to them, a recurrent line in the book. What’s involved here, then, is a multiplication of truth and falsity across distinct realms of practice. This is not the only thing that distinguishes modes of existence – they are also distinguished by different forms of ‘hiatus’ (the problems or worries or interruptions they suffer from), ‘trajectories’, ‘beings to institute’, and ‘alterations’ (there is a really helpful table at the back of the book which helps you to get a sense of what all these mean across the different modes of existence; one thing that seemed to be agreed in the reading meeting which I attended is that across the 15 modes Latour identifies, there are different kinds of modes of existence: from specific fields of practice such as law and politics and religion, through to things which sound more like names for generic processes, like network, preposition, reproduction). It is, though, the variable forms of truth and falsity that are given most weight: the other dimensions are readily available for description, whereas it is these variable forms of ‘truthing’, if I can borrow a term from Nancy Sinatra, that need to be negotiated in order to better cultivate the virtue of ‘speaking well’.

I’m not sure if any of this will make sense unless you are in the middle of reading this book, and I’m probably not the best person to ask to provide a clear (and balanced) exposition of the key concepts in AIME. Although nor, it seems, is Latour. It does read like a book designed to be read in reading groups, where everyone sits around spotting the allusions to other thinkers, trying to piece together what it is that a new term is really referring to (the material on the website doesn’t help, it just has more of the same type of fleeting definition).

2). Doing things with Austin

What most interests me about AIME is Latour’s use of a specific strand of ordinary language philosophy (he refers to it as speech act theory, which I think is itself telling), and in particular, the reference to the work of J.L. Austin. Latour does not give much attention to the possibility that the reference to Ryle might give the impression of a certain sort of prescriptive intent behind his project. Ryle was interested in correcting other people’s mistakes, by showing that whole ways of thinking about problems were flawed. Austin engaged in some of this too, not least in Sense and Sensibilia (where, amongst other things, he shows how claims about ‘reality’ are easily deployed to shut other people up). But the appeal to Austin here, it seems to me, opens up some questions about the values implicit in Latour’s approach to identifying modes of existence. I guess this is not the most likely line of questioning that AIME will generate – but its honourable concern with helping to clarify and correct mistakes and enable more diplomatic negotiation of controversies suggests it is not beyond ‘critique’, if we are still allowed to use such a word.

Austin is, it should be said, just one amongst a series of names or concepts drawn from the canon of ‘modern’ philosophy of language and/or linguistics that Latour uses: we have actants, competence and performance, shifters, speech acts, prepositions. If I were engaged in a proper reading, the repeated borrowing or paraphrase of concepts from this resolutely ‘modern’ line of thinking about language would garner much more attention. What is one to make of the fact that Latour seems unable to reconstruct the real pluralism of values in an ontological register without recourse to this range of concepts (I’m not making the cheap point that he is writing it all down, using language; the point is that the conceptual architecture being used is certainly resolutely ‘modern’, historically speaking, although not quite in the sense that Latour uses this term)? If this book was the only source you had available to you with which to reconstruct the concerns of ‘modern’ thought, then in fact you would find quite a lot of evidence that ‘the moderns’ have all sorts of ways of talking about the world that do not suppose sharp distinctions between words and things.

Reference to Austin is one of the defining features of French Theory – everyone from Lacan to Ricoeur, de Certeau to Deleuze & Guattari, has recourse to some version of Austin’s thought. Latour’s use is distinctive, however, not least because he appeals to Austin in order to bolster what is an explicitly metaphysical, ontological project. What in particular Latour claims to be taking from Austin and from ‘speech act theory’ is the idea of ‘felicity and infelicity conditions’, “notions which make it possible to contrast very different types of veridiction without reducing them to a single model”. The idea that modes of existence can be identified by their distinctive felicity and infelicity conditions recurs throughout the book. Now, it seems to me that this reference to Austin, to speech act theory, and to felicity and infelicity conditions deserves to be treated seriously. Austin certainly gave a lot of attention to ‘infelicities’, most obviously in How to do things with words. To borrow a phrase from Foucault talking about Canguilhem, Austin was a philosopher of error, in the sense that he sought to understand action by analyzing the ways in which actions went wrong and how in turn this generated certain sorts of accounting and evaluation (which is not quite the same thing as Anscombe’s story about intentionality being a function of forms of description, although I’m not quite sure why, or can’t say why off the top of my head, although I also think it can’t just be because she didn’t like him). Being able to tell whether an action was an accident or a mistake, whether it needed to be excused or justified – these were the sorts of things that Austin worried away at. The degree to which this project was oriented by a concern to correct and clarify is open to interpretation: it depends, somewhat, on whose ‘Austin’ you most like – John Searle’s, Derrida’s, Stanley Cavell’s, Shoshana Felman’s, Mary Louise Pratt’s, Judith Butler’s? And depending on which ‘Austin’ you prefer, you may or may not still think that what Austin was doing was pluralizing forms of truth, rather than something altogether more interesting and disturbing, something to do with suggesting that there was more to things going well or going awry than truth and falsity.

I’ve already mentioned the idea that Latour’s work has always contained a set of lessons about responsibility, accountability, obligation and the like. The reason to draw attention to this is to flag up one possible link with Austin, perhaps, many of whose examples draw from questions about Tort law and related issues, and overlap with the legal philosophy developed by Herbert Hart and Tony Honoré. One reason to make the link is because it helps to see what Austin might have been concerned with in developing, first, and most famously, the distinction between performatives and constatives and, then, junking it and replacing it with a more complex conceptual framework of locutionary acts, perlocutionary acts, and illocutionary acts. So, yes, there is a lot of infelicity-talk in Austin, but using this sort of term isn’t really a smart way of saying that there is more than one version of truth and falsity. There is something else going on. Nor does Austin talk much about there being conditions of felicity and infelicity (felicity doesn’t have much of a role in Austin’s stories at all). This idea seems to resonate most strongly with John Searle’s formalization of Austin, in which he outlined the conditions that allowed one to properly categorize certain acts as being, well, more or less proper (the paradigm case is, of course, promising). Latour’s usage seems, to me at least, to echo quite strongly the concern with proper categorization that one finds in Searle (but without Searle’s concern, for example, with thinking through conditions such as sincerity). It’s the prescriptive side of Austin, if you like. What Latour does not acknowledge, shall we say, at least not in this analysis, is the degree to which Austin might not be concerned with pluralizing orders of truth and falsity at all, but with thinking of forms of accountability and evaluation (of judgement) that are not restricted to truth and falsity. Latour actually keeps alluding to this, to be fair, without properly following up: he tends to mark distinctions and then collapse them again, referring to ‘truth and falsity, satisfactory and unsatisfactory’, ‘true and false, good and bad’, ‘truthful or deceitful’. The second terms in these sorts of remarks aren’t just variations of truth or falsity: they indicate different orders of evaluation (truth can be quite unsatisfactory, after all). That, one might suppose, is precisely why Austin talked about infelicities – he was interested in various forms through which things went astray, or turned out well, or came off as intended, or ended unhappily. Another way of putting this is that Austin was interested in the faculty of judgment, and did not reduce this to a matter of assessing truth and falsity, however contextual one’s understanding of those terms. Knowing how to speak well to others might well involve being able to tell when there is more than truth or falsity at stake; so might knowing when not to feel obliged to do so at all.

Latour doesn’t seem that interested in getting at this aspect of modes of existence, and this disinterest seems to be wrapped up in a certain sort of ontological anxiety. When, in AIME, Latour first mentions Austin, he quickly asserts that to really make use of the ideas in speech act theory that he likes “we shall need to go beyond the linguistic or language-bound version of the inquiry to make these modes more substantial realities”. What an odd worry to have, to think that one needs to take a tradition of analysis beyond language. Why the default to the spatialization of ‘language’? What sort of prejudice is it that still requires you to present a concern with matters of language as requiring this sort of aggrandizing correction? Elsewhere, in an interview published last year trailing the publication of AIME, Latour talks of his ambition to develop “a sort of ontological form of speech act theory. If you could ontologize speech act theory, you would get the concept of modes of existence”. Well, maybe you would, although I’m not sure if Latour hasn’t really just succeeded in ontologizing Foucault’s notion of ‘episteme’ instead. This line makes me ask what it would mean to ontologize Austin, specifically. (Would that be an error, or a mistake? Would it be excusable? Justifiable? And does it matter that those questions might sound different in other natural languages?). ‘Ontologize’ here seems to mean, at a minimum, moving beyond language, not restricting the analysis of conditions of (in)felicity to speech acts. The project of articulating plural values, says Latour, has to be done “for real” (his inverted commas) and not ‘merely in words’. Ho hum. In trying to identify the (in)felicity conditions of modes of existence to do justice to the diversity of values, Latour announces that “it would do no good to settle for saying that it is simply a matter of different ‘language games’. Were we to do so, our generosity would actually be a cover for extreme stinginess, since it is to LANGUAGE, but still not to being, that we would be entrusting the task of accounting for diversity”.

Again, where does this sharp distinction between language and being come from? Who exactly believes in this? Who is fooling whom? Last time I looked, agreeing in ‘language games’ was all about agreeing in ‘forms of life’ (and this is not agreeing on the latter by means of the former – the difference is not of the kind that Latour insists on imposing on it; the former is an index, or a trace, or a synecdoche of the latter). Or, to put it another way, Latour seems to be making a category mistake, because he seems to think that Austin and speech act theory and ordinary language philosophy and ‘analytical philosophy’ are all about language and speech. What if we make the effort to see that it might be all about acts? So, for example, matters of truth and falsity are referred, by Austin, to the circumstances of the acts being performed (which is not quite the same as the conditions). One fundamental theme in the history of doing things with J.L. Austin lies here, in the question of the degree to which the contexts to which Austin refers matters of meaning (that is, matters of intention, motive) are thought of as a kind of frame that precedes and, finally, prescribes different acts; or whether acts are thought to have an open structure – what, after Derrida (being nice about Austin) or even Butler (pretending not to be), might be called a certain sort of iterability in the structure of the act; or, in Canguilhem’s terms, whether these contexts are normative for those acts….. The differences of interpretation at stake aren’t about ‘ontology’ at all, however you construe that term. They are about different understandings of the force of norms (which is, after all, what Searle and Derrida argued about way back when). More or less inadvertently, Latour seems to have allied himself with Searle, in the sense that he wants to find rules that can help him enforce codes of proper conduct for speech (the point is not necessarily that allying oneself with Searle is a problem, but that one way or another, we are not in a realm where what really matters is claims about ontology, but understandings of the normativity of norms). If you really want to admit “more diversity in the beings admitted to existence”, then perhaps the best way of doing so is not to develop more sophisticated ontologies at all. The problem isn’t one of ontological insufficiency after all. It’s not a problem of not knowing enough about the qualities of the real in all its varieties. It might be more like a problem of acknowledgement. There are forms of relating that exceed truth and falsity, however pluralized, without being rendered matters of subjective caprice: and they might well be more compelling for not being confined by that frame.

I realize that I’m engaging in my own form of allusive arm-waving now, to Cavell, most obviously, because it’s Cavell’s Austin that I find most compelling. Also to Sandra Laugier (if you haven’t yet found the Dictionary of Untranslatables, she has some great entries in there on these sorts of issues). I’m just not sure that expanding the scope of communities of concern really requires getting everybody to agree to new models of ontology. Having the wrong picture of the world isn’t the problem. The problem is one of understanding practices of assent, agreement and approval. That might be Austin’s lesson.

3). All too human

I’m rambling now, and not really explaining well what it is that I have in my head. Reading Latour’s book made me realize how much Austin’s work might well overlap with Foucault’s late work on ethical truth-telling – that’s one thing floating around up there. But that’s not, I suspect, one of the intended take-home points. Latour seems uninterested in the personal qualities associated with different modes of existence. But attention to such qualities seems to me precisely what one might expect from an account that seeks to elaborate on the task of speaking well to others. So I’m left to wonder what sort of truth-telling it is that is involved in contemporary forms of onto-talk, of the sort outlined here by Latour. As I said at the start, I think what is most interesting about this book is precisely the degree to which it is all about the ethics of speech. I’m interested to see how much attention will be given to this aspect of the project. It is here that the limits of the ontological imagination seem to become most evident: this is an imagination that seems to suppose that the best way to foster preferred virtues lies in correcting some people’s mistaken views of themselves and their relation to the world by outlining an all-encompassing pluralistic ontology. But ontology is just a smart word for metaphysics, which is in turn a smart word for the stuff you make up. Or, it’s just the word for the stuff you can’t help being committed to. One way or the other, outlining new ontological pictures of the world helps no-one. I happen to think that Latour might have chosen the wrong register in which to cultivate his preferred virtues, and that that might be because he has made a mistake in his diagnosis of what is lacking in the world.

Making Human Geography: New book by Kevin Cox


I have just read Kevin Cox’s new book, Making Human Geography. It tells the story, as he sees it, of how over the last 50 years or so, human geography has become a field of sophisticated theoretical and methodological inquiry. He starts by admitting this is a ‘personal understanding’, and it has a strong ‘interpretative’ line that reflects his own convictions, not least about the continuing saliency not just of Marxism, but of geography’s Marxism, of ‘historical-geographical materialism’ as an explanatory framework. I guess this won’t be to everyone’s tastes (there is plenty to disagree with about Kevin’s account of all sorts of things). But one of the things that I liked about the book was its tone. He worries about the ‘eclecticism’ associated with contemporary human geography, especially in its self-consciously ‘critical’ varieties; but he does not complain about fragmentation nor indulge in nostalgia for lost coherence. Above all, the book makes an assertive case for human geographers’ achievements in laying the groundwork for the on-going challenge of spatializing the social sciences. This is a book about the ‘strong ideas’ developed by geographers, not the geographical ideas you can find elsewhere – no Lefebvre here, no ‘methodological nationalism’. These sorts of absences might be something that not everyone will be comfortable with – after all, geography now inhabits a broad field in which various spatial and environmental vocabularies are shared, including political theory, media studies, science and technology studies, as well as ‘Continental Philosophy’. All sorts of theorists get to be classified as ‘spatial thinkers’. Geographers increasingly thrive in this interstitial field, finding it easier to ‘pass’ as just another social scientist or theorist (in turn, in the UK at least, the institutional form of Geography in higher education has been transformed by the capacity of what are now very seldom mere ‘Departments of Geography’ to act as hospitable homes for various fields of inter-disciplinary social science). Just how to ‘wear’ the distinctive disciplinary understandings of space, or scale, or networks developed since the 1950s and outlined in this book has become more and more of a challenge. Not least, the challenge is to avoid a certain sort of ‘take-my-ball-home’ chauvinism that is associated, for example, with arguments about using space ‘metaphorically’ compared to proper ‘material’ understandings. The story in this book revolves around the different concepts of space (the trusty triad of absolute, relative and relational space) that have shaped human geography. This is a much more helpful way of approaching inter-disciplinary conversations (though not without its own implicit chauvinisms, I suspect).

This book covers a lot of ground – everything from geographical deconstruction to the expansion method (which is much less interesting than it might sound) – even as it cleaves to its own distinctive narrative line. It’s accessibly written, reflecting its origins no doubt in many years of seminar teaching. In parts, it presumes quite a lot of familiarity with the discipline and its main players. Apart from anything else, it does a really good job of elaborating on how important the ‘quantitative-spatial revolution’ both was and still should be for human geography’s intellectual progress: one of the most interesting themes is the idea of quantification and spatialization as two distinct intellectual movements that converged in the 1950s and 1960s; it also makes the point that the development of quantitative spatial science since then has been more often than not focussed on issues of contextualisation, against the caricature of ‘generalisation’ and ‘law-finding’ often directed against this style of work. Again, I guess the call for some sort of rapprochement across quantitative and qualitative styles might not resonate that much in some ears – not only, but not least, because to a considerable extent the cross-generational formation of human geographers (like me) naturally attuned to the worlds of social theory, Continental Philosophy, or qualitative methodologies is dependent on an institutionalised blindness around quantitative social science (the reverse is true too, of course).

I don’t necessarily agree with how Kevin interprets human geography’s trajectory. For example, I don’t really recognise the presentation of change since the 1980s, in terms of various ‘Posts’ that displace the centrality of Marxism. It’s a standard presentation, no doubt. It easily underestimates just how central Marxism still is in human geography, compared to any other social science field I can think of. I’d tell that story differently (perhaps in terms of a succession of errors compounding themselves… perhaps as the triumph of certain ‘philosophical’ temptations over the modern dilemmas of social theory…; or perhaps, on reflection, more charitably, in the same tone of genuine curiosity that Kevin strikes in his version of the story). But I do think that his account focuses in on the fundamental points of tension around which any disciplinary field develops: issues of method, key concepts, and the question of how best to understand ‘what things happen and why’. Above all, I like the fact that this is an unashamed celebration of what human geographers do as geographers, and why this is important for the social sciences more generally.