DISCOURSE ANALYSIS

Before moving into the last set of issues for this chapter, we need to take a look at another large rubric under which a lot of research falls: discourse analysis. Whilst at first sight this term overlaps with content/textual analysis, it actually emerges out of the interaction between the ‘literary turn’ of the late 1960s, in which written texts – archives, policy, speeches – predominate as objects of inquiry, and that era’s social movements, decolonization, and political protest. The point for now is to note that here we see the term discourse as a larger, more malleable rubric covering the above distinctions.

Here content/texts are treated as ways in which powerful elites, over time or in a particular institutional setting, convey and reproduce power: by setting terms of debate or agendas, or simply in the way the historical record, archives in particular, is selective rather than representative of all views. ‘History is written by the victors’, as the saying goes. For this reason, discourse analysis is an approach that consciously moves the selectivity net outwards to include ‘a collection of related texts, constituted as speech, written documents, and social practices, that produce meaning and organize social knowledge’ (Abdelal et al. 2009: 7; see Burn and Parker 2003: 77 passim).

Whilst references to content/textual analysis imply a relatively contained set of terms and techniques, references to discourse analysis are more expansive. And this is where the problems begin. The contentiousness of the very term discourse remains a matter of open debate: between successive generations of proponents of this mode of research, between more content-based and more textual-based schools of thought (sometimes, but not always, demarcated by the Atlantic Ocean or the English Channel), and their critics.

The main objection, not to put too fine a point on it, is a lack of precision about what parameters of the raw data (speech, documents, practices) are selected as well as how said data is deemed to be significant, beyond the aims of the research questions – given the open-endedness of this approach. A key figure in this body of work, particularly as it found its way into Anglo-American academe, is the French philosopher, provocateur, and godfather of the notion of discourse analysis as a critical alternative to textual analysis, Michel Foucault.

First, the intention of this move away from content, or text strictly conceived, is, in his own words:

to raise questions in an effective, genuine way, and to raise them with the greatest possible rigor, with the maximum complexity and difficulty so that a solution doesn’t spring from the head of some reformist intellectual or suddenly appear in the head of a party’s political bureau.

(Foucault, cited in Faubion 2000: 288)

Next, the notion of discourse itself is treated here not only as an object of research, even if a broadly inclusive one, but also as the means by which to conduct the analysis. What this boils down to, in Foucault’s words, is that in

every society the production of discourse is at once controlled, selected, organized and redistributed according to a certain number of procedures, whose role is to avert its powers and its dangers, to cope with chance events, to evade its ponderous, awesome materiality. In a society such as our own we all know the rules of exclusion. The most obvious and familiar of these concerns what is prohibited.

(Foucault 1972: 215–16)

This approach clearly parts company with architectural and content-based procedures for deciphering texts, however defined, in several ways:

  • by consciously combining literary techniques with historical research methods (living and archival documents as a focus being a favourite of Foucault’s own research);
  • by employing sociological and anthropological approaches to connect the above to their sociocultural and institutional contexts: people and practices (elicited through interviews as well as participant-observation).

The sum total is a range of research that investigates these sorts of ‘procedures’ or ‘exclusions’ in any given period or locale; not only texts in the literal sense but also the role of key figures, media coverage, parliamentary debates, and popular culture can become the focus.

Whilst there is a range of conceptualizations and applications of the term ‘discourse’ currently in operation, the bottom line is that these inquiries are premised on designing research that investigates how the way ‘we describe reality has an effect on the way we perceive and act upon our environment [and if so, how] new perspectives might lead us to consider alternative courses of action’ (Tickner 1995: 61); it is not unthinkable that some of these research designs incorporate equally diverging and converging notions of the status of quantitative and qualitative data as evidence, or of units of analysis.

Summing up

So how to make sense of these moving targets, claims, and counterclaims about rigour, selectivity, and replicability as the sine qua non of truth claims? How to keep our bearings in the strong ebb and flow of intellectual fashion where these approaches have strong traction for our inquiry?

  1. For coping purposes, a key distinction for projects undertaking some kind of discourse analysis is that these can focus purely on literary texts or can also ‘place texts and practices in their intersubjective contexts’, this being ‘the qualitative and interpretive recovery of meaning from the language that actors use to describe and understand social phenomena’ (Abdelal et al. 2009: 6).
  2. In other words, meaning-making is not locked into the ‘text’ as a static object, as traditional content analyses require. Rather, what actors, as individuals but also as institutions, do with these ‘texts’ also counts.
  3. So, discourse also refers to practices as well as the written word, visual signs, or audio cues: to intentions behind a message as well as the way audiences, or in some cases consumers, pick up on these intentions or indeed reinvent them in the way they then talk about, refer to, and reiterate these composite objects.
  4. Data gathered in discourse/textual analyses can be quantitative in the sense that a high or low frequency of an item (word, phrase) in a written data set may be significant if taken in context; conversely, the absence of keywords can also be significant, depending on the underlying relations governing how a text is conceived (for example, diplomatic or politically sensitive policy-making). A minimal sketch of such a frequency count follows after this list.
  5. However, the main emphasis is on the scholar interpreting these texts, placing this interpretation in a larger context and then convincing ‘his or her readers that a particular reconstruction of the intersubjective context of some social phenomenon . . . is useful for understanding an empirical outcome’ (Abdelal et al. 2009: 7).
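
To make point 4 concrete, the following is a minimal sketch in Python (the chapter itself prescribes no software) of how a simple frequency count might register both the presence and the telling absence of keywords across a small set of documents. The documents, keywords, and output shown are invented for illustration only, not drawn from the authors cited above.

    from collections import Counter
    import re

    # Illustrative mini-corpus: three invented policy statements, not real data.
    documents = {
        "ministry_speech": "Security and growth remain our priorities; security above all.",
        "press_release": "Growth, jobs and investment will drive the recovery.",
        "committee_minutes": "Members discussed investment, jobs and regional growth.",
    }

    # Keywords whose presence, or telling absence, the researcher has decided to track.
    keywords = ["security", "growth", "jobs", "rights"]

    def keyword_frequencies(text):
        """Count how often each keyword occurs in a lower-cased, tokenized text."""
        tokens = re.findall(r"[a-z']+", text.lower())
        counts = Counter(tokens)
        return {kw: counts[kw] for kw in keywords}

    for name, text in documents.items():
        freqs = keyword_frequencies(text)
        absent = [kw for kw in keywords if freqs[kw] == 0]
        print(name, freqs, "absent:", absent)

Whether such counts are significant still depends on context and interpretation, which is precisely point 5: the numbers only become evidence once the researcher places them in a larger setting and argues for their meaning.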

To sum up and move on, we can regard the different modes of analysing ‘content’ along

  • Quantitative lines: where manifest content (frequency and/or placement of keywords, phrases, framing, keyword mapping), coding schemes, and hypothesis-based research questions predominate, and varying levels of statistical operations are applied once a coding scheme has been devised (a sketch of one such scheme follows after this list).
  • Qualitative lines: whereby ‘content’ is treated as irreducible to frequency or placement, and latent content, the sub-text or sub-texts, is isolated and analysed by the researcher; this can be achieved via open-ended research questions or more structured analytical frameworks based on grammatical and syntactic structures – in linguistic terms, hermeneutics (interpretation), semiotics (signified/signifier), and so on.
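
As a companion to the quantitative bullet above, the following sketch shows what a rudimentary coding scheme might look like once devised: invented categories, keyword lists, and sentences stand in for a scheme that would, in practice, be derived from and tested against the research question.

    # A minimal sketch of manifest-content coding: an invented coding scheme is
    # applied to invented sentences; real schemes are devised and piloted first.
    coding_scheme = {
        "economy": {"growth", "jobs", "investment", "budget"},
        "security": {"security", "defence", "policing", "threat"},
    }

    sentences = [
        "The budget prioritises growth and jobs.",
        "New policing powers respond to the threat.",
        "Investment in defence will rise next year.",
    ]

    tally = {category: 0 for category in coding_scheme}
    for sentence in sentences:
        words = {word.strip(".,;").lower() for word in sentence.split()}
        for category, terms in coding_scheme.items():
            if words & terms:  # code the unit once for every category it touches
                tally[category] += 1

    print(tally)  # {'economy': 2, 'security': 2}

Only after such counts are produced do the ‘varying levels of statistical operations’ mentioned above come into play; the qualitative line, by contrast, would ask what the sub-texts of these sentences are rather than how often their keywords recur.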

Figure 7.1 Surrealist painter meets surrealist plumber
Source: Dan Piraro, http://www.bizarro.com

What is at stake? Wider methodological considerations for this line of research reside in the various theories of communication and language underpinning the above distinctions. For instance:

  • It is not unreasonable to proceed by getting to grips with how a text, or set of texts is made up before embarking on an interpretation of its larger meaning.
  • This can also provide a manageable approach to generating some original data in cost-effective ways.
  • However, just as focus groups don’t necessarily provide us with a substitute for the time and effort needed to undertake semi-structured interviews, content analysis in its conventional, more quantitatively understood sense has advantages and disadvantages. These, needless to say, have implications for the role this approach may play in your inquiry, or its relevance for your research question.

Further reading

McQuail (1994) for an overview of content analysis in media and communications; Gunter (2000) for more quantitative extensions; Bertrand and Hughes (2004) for a discussion of texts as an axial aspect of media and communications research; see also Skovmand and Schrøder (1992) for their take on mixed methods in this discipline.

For more detailed discussions within distinct approaches see Atkinson and Coffey (in Silverman 2011: 77–92); Prior (in Silverman 2011: 93–110); Emmison (in Silverman 2011: 233–49); and Heath (in Silverman 2011: 250–69). See also Markham (in Silverman 2011: 111–28) for issues in researching web-based content.

Overviews include Berg (2009), Burn and Parker (2003), and Sturken and Cartwright (2005), who look at the theory and methodological practicalities of textual/visual analysis as explicitly qualitative methodologies in the round.

Working example: As the theoretical and evidential terrain available for first-timers is huge, and intimately connected to the emergence of the moving image, consumer society, and advertising, the following cluster of authors concentrate on photography, a medium that has fascinated philosophers and researchers since it was invented in the nineteenth century, evolved into the moving image, and was then digitized in the late twentieth century. Images are now a staple of contemporary news and entertainment, PR and advertising, and the way people communicate online: see Benjamin (1973 [1931]), Barthes (1981), Sontag (1977, 2003), Williamson (1978), Freedberg (1989).

DEDUCTIVE AND INDUCTIVE PATHS TO KNOWLEDGE

We now need to spend a moment with the underwater part of the analysis iceberg: distinctions within and between disciplines about which ways of reasoning are the ‘key’ to knowledge, and which claim superiority over knowledge and understanding based on what some call common sense (life experience), what others see as residing in formal belief systems and scriptures, and what others follow as ‘folk wisdom’ handed down through the generations from father to son, mother to daughter, elders to younger generations.9

There is an implicit yet crucial distinction made between deductive and inductive modes of reasoning. Induction refers to the process of reasoning from specific observation to general principle or theory. Deduction indicates proceeding from general principle or theory to specific observations. Both are involved in the history and practices of scientific discovery, knowledge production, and the business of academic research. Both are embedded in major philosophical debates in western academe, and popular imaginaries, about what distinguishes scientific knowledge from other truth claims. Because working assumptions about which is the right path to follow, in principle, govern formal and informal expectations about what sorts of inquiry pass muster in respective settings, we need to make a pit stop.

First, think for a moment. Which approach do you tend towards when learning about things?

  • Do you begin with a theory – presupposition – about something and assess the evidence, the facts, as supporting or refuting this presupposition?
  • Do you prefer to wait until you have all the facts before you draw any conclusions?

In short: deductive modes of reasoning work from the top down. Inductive reasoning works the other way around: conclusions or general laws are drawn only after enough evidence – facts – has been assembled and analysed, from the ground up. Here a ‘general law’ can only be inferred from ‘particular instances’. For deductive reasoning, theoretical formulations effectively precede the data-gathering, with the latter functioning as proof or ‘falsification’ of the original supposition, hypothesis, or general theory. For inductive modes, theories or generalizations emerge out of the raw material much later in the day; certain methods, and methodological issues, follow.
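
As a rough illustration only, the toy sketch below (in Python, with invented ‘swan’ observations) contrasts the two modes: the deductive routine states a hypothesis first and lets the observations test, and potentially falsify, it; the inductive routine withholds any generalization until the observations have been collected.

    # Toy contrast between the two modes, using invented 'swan' observations.
    observations = ["white swan", "white swan", "black swan", "white swan"]

    # Deductive: the hypothesis comes first; the observations serve to test it,
    # and a single counter-example is enough to falsify it.
    hypothesis = "all swans are white"
    falsified = any(obs.startswith("black") for obs in observations)
    print(hypothesis, "-> falsified" if falsified else "-> not falsified so far")

    # Inductive: gather the observations first, then infer a tentative generalization.
    colours = {obs.split()[0] for obs in observations}
    if colours == {"white"}:
        generalization = "all observed swans are white"
    else:
        generalization = "observed swans come in these colours: " + ", ".join(sorted(colours))
    print("tentative generalization:", generalization)

The example also hints at why the distinction blurs in practice: deciding to record swans, and to record their colour, already presupposes a theory about what counts as a relevant observation.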

Sherlock Holmes, the fictional detective created by Conan Doyle whose method for solving crimes was ‘deduction’ (the first scenario above), would exclaim to his bemused sidekick Dr Watson whenever he cracked a case that it was ‘elementary, my dear Watson!’. Holmes was an ardent advocate, fictional as he may be, of what he called deduction as a failsafe mode of logical reasoning. His approach and manner of investigation are best summed up in the following way: ‘I’m Sherlock Holmes, the world’s only consulting detective. . . . This is what I do: 1. I observe everything. 2. From what I observe, I deduce everything. 3. When I’ve eliminated the impossible, whatever remains, no matter how mad it might seem, must be the truth.’10

Taken in the context of early forensic science, at the height of industrialization and the ‘Big Science’ narrative of the late nineteenth and twentieth centuries, the notion of applying a logical approach to solving problems has come to epitomize and distinguish the ‘scientific’ from the non-scientific, the methodical from the chaotic, the logical from the irrational.11 What Holmes was effectively doing, however, was drawing inferences from observational evidence, coming to a ‘logical conclusion’ about ‘particular instances’ of a ‘general law’.

Some of the ‘laws’ at work in this sort of detective work, driving many a twist and turn in any whodunit plot, are models of human behaviour (for example, that a murderer often returns to the scene of a crime, or that facial movements or physiological reactions correlate with lying or telling the truth), the properties of physical matter, and such like. More recent popular television shows like Silent Witness (UK) or CSI: Crime Scene Investigation (USA) write this way of reasoning into the script, making the accompanying computer-facilitated techniques of forensic proof the star of the show. Even when inspiration, intuition, or coincidence contribute (a detective’s ‘hunch’ or sense that they ‘have got a feeling about this one’), the approach that wins out in the end is this ‘Holmesian’ emphasis on the close relationship between logical reasoning and observation.

Much ink has been spilled in arguments about what comes first, the facts, their observation, or the frameworks by which such facts can be observed and thereby identified at all. Starting with a law – or an hypothesis – and then substantiating it by recourse to the facts is a basic tenet of the natural sciences, and their strictly quantitative cousins in the social sciences. Starting with the ‘facts’ and drawing what are often tentative conclusions at a later date is the hallmark of both qualitative and quantitative work. In both cases the strengths and weaknesses of either get used as sticks to beat the other over the head with. Philosophers have provided researchers here with a rich literature and armoury of arguments and counter-arguments. Some are the stuff of legend.

But are these two ways of thinking really that mutually exclusive (see Gray 2009: 14–16)? Most research practice involves a hybrid of inductive and deductive reasoning. So, think again: was Sherlock Holmes really employing a deductive mode of reasoning? Perhaps he was doing a bit of both, inferring from the general to the particular as well as the particular to the general. Some pundits would go further by noting how ‘Holmesian’ deduction is in fact induction by another name.12

So why does this distinction matter so much in theory and practice, and why does it lie at the heart of major schisms, strategic alliances, and occasional détentes between schools of thought?

The heart of the matter is actually a paradox, or as it is often called, the ‘problem of induction’ (Popper in Jarvie 2005: 821). In the history of western academe and particularly the rise of ‘scientific method’ as the arbiter of knowledge, truth, and validity, it is not a minor matter to argue about what comes first, ‘theoretical knowledge or experience’, ‘perceptions’ or ‘conceptions’ of the world (see Chalmers 2004, Radder 2006).

The paradox for everyday research is the realization that hypotheses cannot ‘come from observation alone because there is no observation without hypotheses’ (Jarvie op. cit.; see Chalmers 2004: 41 passim; Gray 2009: 14–16). If that is the case, then the optimal way to proceed in a scientific way, according to Karl Popper’s formative reworking of this everyday ‘problem of induction’, is to privilege a particular form of hypothesis-formation: one phrased and executed in such a way that it can be, indeed must be, open to being ‘falsified’. To put it another way, scientific research, and by association scholarship aiming to produce a particular sort of knowledge, must be able, and willing, to be found wrong (Chalmers 2004: 59 passim). In other words, researchers are not in the business of being ‘right’, despite protestations to the contrary (see Schulz 2010).13

As Popper put it, what scientists really do is engage in a set of decisions about how to go about doing science according to strict rules of scientific procedure, ‘designed in such a way that they do not protect any statement in science against falsification’ (Popper in Jarvie 2005: 821). It is this, not whether either camp can ever lay claim to the truth at the end of the day, that distinguishes scientific knowledge from ‘metaphysics’. The upshot is, to over-simplify some longstanding and complex debates, that deductive modes of reasoning as an ideal-type now predominate in popular and scholarly imaginaries, research cultures, career structures, and research questions. More than a rerun of the classic chicken-and-egg conundrum, these preferences amount to a ‘policy decision governing action and embodied in norms or “methodological rules”’ (ibid.).

Figure 7.2 Measuring climate change
Source: Josh, http://www.cartoonsbyjosh.com

The governing consensus, now institutionalized, is that researchers are ‘theorising all the time in order to navigate the world, and our encounters with negative evidence are the bumps that deliver information about the shape of reality’ (Popper in Jarvie 2005: 821). The constitution, conclusions drawn, and consequences these ‘bumps’ may or may not have in sociocultural, political, or economic terms are where researchers and commentators stake all manner of claims; the ongoing scholarly and public debates that line up climate change supporters against ‘sceptical environmentalists’ are a case in point (see Box 2.2).

Back to the ‘problem of induction’, for those who see this as a problem. Critics of Popper’s take on the matter, who also span the spectrum of worldviews and research cultures (see Chapter 3), can include those who nonetheless favour consciously inductive approaches to empirical research. So, here most would have trouble disagreeing with the first part of the above statement: that researchers are ‘theorising all the time in order to navigate the world’.

BOX 7.1 PHILOSOPHICAL RESEARCH

One approach alluded to in these discussions, philosophical research, bears mentioning given its place in philosophy (by all accounts the oldest scholarly discipline) and the way the term refers to research projects that are not primarily involved in gathering and analysing empirical data. For our purposes, let’s condense two millennia or more of discussion about the best way to do philosophy per se into two main principles: ‘slow reading’ and ‘slow writing’.

  • These are ‘slow’ because the aim is not to collect but to consider, and then to present the outcomes as an argument.
  • ‘Slow reading’ is another way of saying ‘close reading’, i.e. taking time to consider a written text one step at a time, sometimes word by word, paragraph by paragraph. Here the quality of thought is not judged by how much literature is consumed.
  • ‘Slow writing’ follows because the aim of the exercise is to get ‘under the skin’ of ideas, concepts, and assumptions in themselves rather than treat them as a means to empirical data-gathering ends.
  • Writing, rewriting, and building up a ‘plausible’ argument means that philosophical research is an intrinsically literary activity; genres here are characterized by the various branches of philosophy and their respective views of ‘being’ (ontology) and ‘knowing’ (epistemology). The latter term is also a school of philosophy.

So, to sum up, philosophical research distinguishes itself from those forms of research discussed here, yet also informs many other disciplines through its ‘general strategy’ as Hans Radder (2006: 179) puts it. Following his schematic, there are three criteria for ‘plausible’ philosophical accounts: (1) presenting the ‘basic philosophical claims’ you are considering in a coherent way; (2) showing how these claims stand up to and rebut ‘existing and potential criticisms’; (3) arguing how any alternatives to your claims are ‘inadequate, either for intrinsic reasons or by comparison’ with the claims you are advocating (ibid.).

These sorts of inquiries and their governing research questions may well occur entirely within a particular literature; see, for example, how Radder articulates his inquiry (2006: 1–3) and so proceeds without any references to ‘reality out there’. This is not exclusively the case, however. As another commentator puts it, the ‘primary task of philosophical research consists, to my mind, in the clarification of concepts or meanings. . . . [There] are cases where it is not possible to clarify the meaning of a word without empirical research. . .’ (Ernst Tugendhat, cited in Wren 1990: 3). In other words, philosophical research can be engaged in entirely by philosophers within the philosophical canon, or it can be applied to research projects looking both to ‘clarify concepts and meanings’ and to conduct ‘empirical research’ (ibid.).

Others would take issue with the one-way street implied in the second part; that is, they do not agree that the ‘shape of reality’ and the bumps encountered are inseparable from theorizing. There is more than one way to produce viable knowledge about what the bumps along the way mean; scientific knowledge can include modes of reasoning that are qualitatively different from the construction of falsifiable hypotheses and experimental research design.

The strict demarcation drawn by Popper and others between science and non-science is also disputed, on philosophical as well as cultural and political grounds. Worldviews are at stake in what is seen to constitute objective ‘reality’ at any one time; the formative role that reflection and thought play in all these views of the correct path, or paths, to knowledge nevertheless remains paramount across the board.

Meanwhile, in practical terms these intricacies are less about which comes first in retrospect. More to the point is how you understand and then manage this fraught relationship at crucial points in the larger process as you come to grips with your material: is the method appropriate for the phenomena your research question is looking to study?14 This is where ‘coping with ambiguity’ is part of the territory and part of the struggle, albeit more explicitly acknowledged in certain approaches than in others.15
