The Dictionary of Human Geography

locational analysis
A term best associated with Peter Haggett's (1965) seminal book, Locational Analysis in Human Geography, referring to the logically and empirically rigorous investigation of the spatial arrangements of phenomena and related flow patterns. The heyday of locational analysis in geography was the 1960s and early 1970s, when vigorous attempts were made to recast the discipline as spatial science, but its roots can be traced back to the first half of the nineteenth century and the formation of a German school of location theory. Although locational analysis is now out of fashion in contemporary geography, it has recently been reactivated in economics and advertised as a component of a new economic geography.

Haggett's (1965, p. 1) scrupulously ordered book was fundamentally about 'the search for order', by which he meant locational order or 'spatial structure', as it might be studied in (and as) human geography. He (re)turned to the science par excellence of spatial order, geometry, which he described as a 'neglected tradition in geography' (p. 15). Accordingly, he organized his first five substantive chapters in geometric terms:

movement: the interaction between points;
networks: the lines of linkage among points;
nodes: the convergence of links;
hierarchies: the differential role played by different nodes;
surfaces: the spaces among nodes.

For each of the five geometries, he deployed rigorous theory and formal numerical methods to analyse and explain the associated location issues: this was locational analysis.

Haggett made it clear, though, that his approach was a continuation of the long-standing, albeit 'deviant', intellectual tradition of location theory that historically twinned geometrical reasoning with theoretical and empirical analysis. Its origins lie with the nineteenth-century Prussian landowner cum part-time geographer Johann Heinrich von Thünen (1783–1850), who combined meticulous empiricism, theoretical innovation and a concentric geometrical sensibility to describe and explain location patterns of agricultural land use (see von Thünen model). Later contributors to the project, and their geometries, included Alfred Weber (1869–1958) (industrial location and triangles), and August Lösch (1906–45) and Walter Christaller (1893–1969) (central place theory and hexagons). Walter Isard (1919– ), writing nine years before Haggett, synthesized these contributions within his own account of locational analysis, which he termed regional science (although it was longer on theory and shorter on geometry than Haggett's version).

Both Haggett's and Isard's accounts of locational analysis enjoyed considerable success within human geography for a period, but their star waned from the 1970s for reasons both internal and external to the discipline (see quantitative revolution). During the 1990s, however, the baton of locational analysis was passed to the economists in the form of a New Economic Geography (Fujita, Krugman and Venables, 1999). Unsurprisingly, it was more theoretical and mathematical, less empirical and geometrical, than in its previous version, but the basic mission was unaltered: to represent and analyse economic-geographical distributions, and interactions, at all scales using exact, formal techniques and vocabularies.
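The geometric style of reasoning that runs through this tradition can be illustrated with a small numerical sketch. The Python snippet below is not drawn from Haggett's or von Thünen's own calculations; it simply computes von Thünen-style bid-rent curves for a few hypothetical land uses (the names and parameter values are invented for illustration), showing how locational rent that declines linearly with distance from a central market sorts land uses into concentric rings.

# A minimal von Thunen-style bid-rent sketch (illustrative parameters only).
# Locational rent per unit of land: R = Y * (p - c) - Y * f * d
#   Y = yield per unit of land, p = market price, c = production cost,
#   f = freight rate per unit of produce per unit distance, d = distance to market.
from dataclasses import dataclass

@dataclass
class LandUse:
    name: str
    yield_per_ha: float   # Y
    price: float          # p
    cost: float           # c
    freight: float        # f

# Hypothetical land uses; the numbers come from no empirical study.
land_uses = [
    LandUse("market gardening", yield_per_ha=50, price=10, cost=6, freight=0.50),
    LandUse("forestry",         yield_per_ha=20, price=14, cost=8, freight=0.50),
    LandUse("grain farming",    yield_per_ha=10, price=13, cost=7, freight=0.25),
]

def bid_rent(u: LandUse, d: float) -> float:
    """Rent a land use can bid for land at distance d from the central market."""
    return u.yield_per_ha * (u.price - u.cost) - u.yield_per_ha * u.freight * d

# Which land use outbids the others at each distance? Concentric rings emerge.
for d in range(0, 28, 3):
    best = max(land_uses, key=lambda u: bid_rent(u, d))
    rent = bid_rent(best, d)
    winner = best.name if rent > 0 else "(beyond the margin of cultivation)"
    print(f"distance {d:2d}: {winner:35s} rent = {max(rent, 0.0):6.1f}")

The land use with the steepest rent gradient captures the innermost ring, and beyond the distance at which no use can bid a positive rent lies the margin of cultivation.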
The tie that binds locational analysis over its more than 150-year history is a conviction that the secrets of location are to be revealed by a ruthless pursuit of rational enquiry and methods, which takes its most perfect form in the logical structure and techniques of mathematics, including geometry. The specifics of the location question vary, from point-to-point interaction to the composition of regional networks and surfaces, but the universal rationalist method is constant, thereby ensuring certainty, progress and truth (so its protagonists claim). In contrast, Barnes (2003) argues that it is not as clear-cut: as with science more generally, or any other form of knowledge, the historical and geographical circumstances of the locational analysts intrude and inevitably corrupt their supposedly pure rationality (cf. local knowledge). This does not detract from the achievements of locational analysis, but it casts them in a different light. For it requires an understanding of the peculiar historical and geographical context of their production, an emphasis ironically quite different from the one offered by locational analysts themselves. tb

Suggested reading
Barnes (2003).
logical empiricism
Sometimes used as a synonym for logical positivism, but more often used to mark the post-Second World War Anglo-American movement in philosophy that built upon and extended it (drawing on earlier strands of positivism and empiricism). Committed to empirical verifiability and logic, the original logical positivists were sceptical of theory, because it consisted of neither facts nor operations of logic. Such scepticism could not be sustained in the face of the massive explosion (sometimes literal) of wartime and postwar science defined by theoretical construction and empirical testing. Something had to give. In this case, 'empiricism' was substituted for the troubling term 'positivism' (and the root cause of the scepticism). Those working under the banner of logical empiricism were some of the pre-war logical positivists who had immigrated to the United States and there revised their position, such as Rudolf Carnap (1891–1970) and Hans Reichenbach (1891–1953), but there were also American adherents, of whom perhaps the best known was Ernest Nagel (1901–85). David Harvey (1969) drew heavily on Nagel's logical empiricist justification of the scientific method in presenting his own argument for the importance of explanation in geography. tb

Suggested reading
Giere and Richardson (1996).
logical positivism
A particular version of the philosophy of positivism associated with the work of a group of primarily Austrian scientists and philosophers in the 1920s and 1930s known as the Vienna Circle. The convenor of the discussions at the University of Vienna in 1922 was the physicist Moritz Schlick, who invited figures such as the mathematician Kurt Gödel, the philosophers Rudolf Carnap and Herbert Feigl, the economist Otto Neurath and the physicist Philipp Frank. On the edge of the circle, but not strictly members, were the philosophers Karl Popper (see critical rationalism), A.J. Ayer (who later popularized the movement for an English-language audience) and Ludwig Wittgenstein.

Wittgenstein was behind the defining proposal of the group, 'the verifiability principle'. It asserted that scientifically meaningful propositions are those that (1) can be verified empirically by the five senses; or (2) are true tautologically, that is, their truth arises from the very meaning of the terms in which they are expressed, as in logic or mathematics. Propositions incapable of satisfying the principle, which included most of the issues that had made up the history of Western philosophy until that point, concerning aesthetics, morality and religion, were judged senseless if not nonsense. Carnap declared that the Circle 'reject[s] all philosophical questions, whether Metaphysics, Ethics or Epistemology'. Meaningful propositions would be found in philosophy only when it modelled itself on the natural sciences, and particularly physics.

An immediate criticism was that the verifiability principle could not itself be verified: it was neither an empirical proposition nor a logical or mathematical tautology. The Vienna Circle's criterion of meaning was thus, disturbingly, meaningless. Additionally, Popper argued early on that even those in the Circle's pantheon, the physicists, did not engage in empirical verification but only falsification: that is, they sought to disprove, not prove, their theoretical claims. As a result of these objections, as well as the physical dispersal of the group (many were Jewish and left-wing, and fled Vienna with the rise of Nazism), logical positivism quickly unravelled. By the 1950s, the movement was dead. Its extreme positions were discarded, and any useful ideas were absorbed within the larger movement of analytic and empiricist philosophy that came to define the post-Second World War Anglo-American field (Reisch, 2005).

In human geography, however, Guelke (1978, p. 46) suggests that 'from Hartshorne to Harvey geographical writing on methodology and philosophy has to a greater or lesser degree shown the influence of logical positivist ideas'. But Hartshorne's (1939) panegyric tome on regional description as the core of the geographical project makes no reference to logical positivism or, indeed, to a single logical positivist. It was in fact Hartshorne's methodological arch-enemy, Frederick Schaefer (1953), who came under the spell of logical positivism. He was influenced at the University of Iowa in the late 1940s and early 1950s by Gustav Bergmann, one of the Vienna Circle refugees, and became the first geographer to apply formal principles of logical positivism to geography through his work on morphological laws, which were to take the form: 'If spatial pattern A, then spatial pattern B.'
Watered-down versions of logical positivism were subsequently proffered by an intellectual disciple of Schaefer, William Bunge (1966), and in the form of a textbook by two former graduate students at the University of Iowa who took the mandatory Bergmann course (Amedeo and Golledge, 1975). All three contributions were part of the quantitative revolution, the movement towards modelling the study of geography on the natural sciences in general, and physics in particular, as part of the project of spatial science. That said, many quantitative revolutionaries were ignorant of logical positivism, and not interested anyway. Harvey's (1969) prospectus for scientific explanation in geography bears only traces of logical positivism. Its central preoccupation was the scientific method, not philosophy as such, and certainly not the philosophy of logical positivism: Harvey's understanding of the scientific method was indebted, rather, to logical empiricism. Furthermore, both Harvey and the discipline at large were subsequently to move away from this version of the scientific method, rendering discussions of the usefulness of logical positivism of only historical value. tb

Suggested reading
Guelke (1978).
log-linear modelling
Procedures for analysing data at the nominal level of measurement, that is, when the variables comprise unordered categories (such as a binary division, urban/rural, or a set of political parties: Conservative, Labour, Liberal Democrat, etc.). The goal, as in regression (hence the alternative term logistic regression), is to fit an equation that estimates the values in the cells of a contingency table, where the independent variables are also measured at either the nominal (categorical) or ordinal level only; models can also be fitted if one or more of the independent variables are measured at the interval or ratio level. The terms of the model, presented in logarithmic form, indicate the deviations of the individual cell values of the contingency table from the grand mean for the entire sample being analysed, and can be interpreted as the odds of getting a particular outcome (e.g. the likelihood that electors in urban areas are more likely to vote for one political party than those in rural areas). (See also categorical data analysis.) rj

Suggested reading
O'Brien (1992).
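As an illustrative sketch only (the cell counts, variable names and the availability of statsmodels are all assumptions, not drawn from the entry's sources), the Python code below fits a saturated Poisson log-linear model to a hypothetical 2x2 contingency table of residence (urban/rural) by party voted for. It uses treatment (dummy) coding rather than the deviation-from-the-grand-mean coding described above, but the interpretive pay-off is the same: the interaction term is the log of the odds ratio linking residence and vote.

# Minimal log-linear modelling sketch on a hypothetical 2x2 contingency table.
# Cell counts are invented for illustration only.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Contingency table: residence (urban/rural) by party voted for (A/B).
cells = pd.DataFrame({
    "residence": ["urban", "urban", "rural", "rural"],
    "party":     ["A",     "B",     "A",     "B"],
    "count":     [300,     200,     150,     250],
})

# Saturated log-linear model: log(expected count) = main effects + interaction.
model = smf.glm("count ~ residence * party", data=cells,
                family=sm.families.Poisson()).fit()
print(model.summary())

# The interaction coefficient is the log odds ratio of the table.
log_or = model.params["residence[T.urban]:party[T.B]"]
print("model log odds ratio:", log_or)

# Check against the odds ratio computed directly from the cell counts.
odds_urban_B = 200 / 300     # odds of voting B among urban electors
odds_rural_B = 250 / 150     # odds of voting B among rural electors
print("direct log odds ratio:", np.log(odds_urban_B / odds_rural_B))

Because the model is saturated it reproduces the observed counts exactly; in practice interest usually lies in whether simpler models (for instance, one dropping the interaction) still fit the table acceptably.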
logit regression models
These models belong to the family of techniques for categorical data analysis. They are used when the response variable is either binary (e.g. a person either moved or stayed) or a proportion (the proportion who have moved). The nature of this response variable, which is bounded by 0 and 1, means that the standard regression model should not be used. Instead, the logit (the logarithm of the odds of moving) is modelled, and the random part of the model takes on a binomial form, so that the stochastic variation (see stochastic process) around the underlying relationship is structured to be least as the values of 0 and 1 are approached. Although the modelling is undertaken on the logits, it is simple to transform these to both probabilities and odds for interpretation. The popularity of this form of the binary response model stems from the ability, when the predictor is categorical, to give a relative odds interpretation, generating, for example, the relative odds of a person aged over 35 moving in comparison to someone aged under 35.

The multinomial form of the logit model is used when there are more than two possible outcomes; the conditional form is used when analysing choice alternatives, where the predictor variables may include attributes of the choice alternatives (e.g. cost) as well as characteristics of the individuals making the choices (such as income); the multilevel logit (see multilevel modelling) assesses the effects of variables measured at different levels (such as individual, household and neighbourhood effects on an individual moving); and the nested logit model is used when there is a hierarchical structure to the outcomes (with moves broken down into short and long distance, for example). kj

Suggested reading
Hensher and Greene (2004); Hosmer and Lemeshow (2000).
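A minimal sketch of a binary logit model with simulated data (the variables moved and over35, the coefficient values and the use of statsmodels are all illustrative assumptions). It follows the sequence described above: the logit of moving is modelled, and the fitted coefficient is then transformed back into relative odds and predicted probabilities.

# Minimal binary logit sketch with an invented sample of movers and stayers.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1000
over35 = rng.integers(0, 2, size=n)                # 1 if aged over 35, else 0

# Assume (purely for illustration) that over-35s are less likely to move:
# true log odds of moving = 0.5 - 1.0 * over35.
p_move = 1.0 / (1.0 + np.exp(-(0.5 - 1.0 * over35)))
moved = rng.binomial(1, p_move)

df = pd.DataFrame({"moved": moved, "over35": over35})

# Model the logit (log odds) of moving as a function of age group.
result = smf.logit("moved ~ over35", data=df).fit()
print(result.summary())

# Transform from the logit scale back to odds and probabilities.
print("relative odds of moving, over 35 vs under 35:",
      np.exp(result.params["over35"]))

b0, b1 = result.params["Intercept"], result.params["over35"]
p_under35 = 1.0 / (1.0 + np.exp(-b0))
p_over35 = 1.0 / (1.0 + np.exp(-(b0 + b1)))
print("predicted probabilities of moving:", p_under35, p_over35)

With a single binary predictor, exponentiating its coefficient gives the relative odds of moving for the over-35s compared with the under-35s, which is the interpretation emphasized above.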
longitudinal data analysis (LDA)
A set of quantitative methods that involve measures over time. In contrast to time-series analysis, in which there tends to be one entity measured a large number of times (cf. sequence analysis), LDA is usually concerned with a large number of entities (e.g. people, places or firms) measured a relatively few times. Data for such analysis come from extensive designs (see extensive research) such as panel or cohort studies. LDA is increasing in importance because it allows the study of development and change, including: the transition from one state to another (e.g. from single to married to separated); the time spent in a particular state (e.g. the length of unemployment); and the determinants of such duration. LDA is fundamental to adopting a life-course perspective in which people are affected by cumulative stimuli over a long period.

The value of such an approach can be seen by contrasting the data analysis of cross-sections with that of a panel. In the former, because we are measuring different people at each time point, we can only assess net or aggregate change: we can only know, for example, that the percentage of the population below the poverty line has increased from 10 to 12. But with panel data we can assess the volatility of micro-social change as individuals move in and out of poverty, thereby tackling questions about the permanent nature of an underclass. Such questions are of particular importance in evidence-based policy, when we are concerned with either effecting change or removing barriers to change. Cross-sectional analysis can also be misleading about the direction of causality, for without repeated measures over time we cannot distinguish between, say, unemployment causing illness or illness causing unemployment. Measurements over time usually show strong state dependence, in that individuals do not move rapidly and continually between different states, so that current behaviour is influenced by past or previous behaviour. Only LDA is capable of taking account of prior information when examining current situations. Moreover, only longitudinal data can separate age and cohort effects: the life experiences of those aged over sixty years may be quite different between cohorts born before and after 1945. Unlike cross-sectional analysis, which analyses variation between cases, LDA also works within cases between occasions. As such, it is much better able to take account of 'unobserved heterogeneity', unexplained variation due to the omission of explanatory variables that are either unmeasured or even unmeasurable. LDA will improve control for such heterogeneity and help to provide a measure of the extent of its presence (Davies and Pickles, 1985).

Longitudinal data sets have a number of features that provide a challenge for analysis. Thus, if the outcome has not yet occurred, the sequence is censored, so that in an analysis of longevity there will be people who are still alive at the end of the observation period. Standard statistical models are based on the assumption of independence, but repeated measures over time are likely to be strongly autocorrelated. Missing data are also a particular problem, as the requirement of multiple follow-up often leads to attrition: this is a particularly challenging problem when the drop-out is informative, because it depends on what would have been observed if the person had not dropped out.
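Censoring is worth a concrete illustration. The short Python sketch below uses invented durations and event flags and is a hand-rolled product-limit (Kaplan-Meier) estimator rather than the output of any of the packages or studies cited here; it shows how right-censored cases remain in the risk set up to the time they leave observation instead of being discarded.

# Minimal Kaplan-Meier sketch showing how right-censored durations are handled.
# Durations (e.g. months of unemployment) and event flags are invented.
import numpy as np

durations = np.array([2, 3, 3, 5, 6, 6, 8, 9, 12, 12])
observed  = np.array([1, 1, 0, 1, 1, 0, 1, 0, 1, 0])  # 0 = censored (spell still ongoing)

def kaplan_meier(durations, observed):
    """Return the event times and the estimated survival function S(t)."""
    times = np.unique(durations[observed == 1])   # times at which events occur
    surv = []
    s = 1.0
    for t in times:
        at_risk = np.sum(durations >= t)          # censored cases count until they leave
        events = np.sum((durations == t) & (observed == 1))
        s *= 1.0 - events / at_risk               # product-limit step
        surv.append(s)
    return times, np.array(surv)

times, surv = kaplan_meier(durations, observed)
for t, s in zip(times, surv):
    print(f"S({t:2d}) = {s:.3f}")

Simply dropping the censored cases would typically understate survival, since those individuals are known to have lasted at least as long as their recorded durations.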
Technically, there are three broad sets of approaches to LDA:

(1) Repeated measures analysis, in which data on repeated occasions are seen as being nested within individuals, so that we can examine, for example, the growth of income over time and evaluate the changing gender gap; multilevel modelling is increasingly being used for such data, as it does not require the measurement of every individual on all occasions to allow between-individual, within-individual and between-occasion variation to be explicitly modelled.

(2) Event history analysis (duration analysis or hazard modelling), which is concerned with the timing of transitions from one state to another, so that it is possible to test how the transition rate of moving from one state (e.g. unemployed) to another (employed) is affected by other variables such as education level: these explanatory variables may be either time-invariant (such as gender) or time-varying. In such analyses, the dependent variable is the duration until event occurrence. Single non-repeatable event analysis (e.g. the transition to death) was developed first and is often called 'survival analysis', but it is now possible to analyse repeated events (e.g. multiple spells of unemployment and employment), competing risks (e.g. different reasons for leaving a job, such as voluntarily choosing another job, redundancy or retirement), multiple states (e.g. transitions between single, married and cohabiting states) and multiple processes (e.g. joint modelling of partnership and employment histories).

(3) Sequence analysis, which works holistically to identify characteristic time-based trajectories.

These methods have been extended to model spatial choice (Wrigley, 1990), the duration of point patterns (Pellegrini and Reader) and spatial processes more generally (Waldorf, 2003). kj

Suggested reading
Allison (1984); Blossfeld and Rohwer (2002); Dale and Davies (1994); Fitzmaurice, Laird and Ware (2004); Ruspini (2002); Singer and Willett (2003); Taris (2000); Wrigley (1986).