
Some quotes on how tools influence the humanities

A 40-minute read; 3 minutes without the quotes.

Now that I have identified what I want to salvage from the critique of big data visualization, I must forge a general argument. I do not have a clear overview of the literature on the subject; I only remember the multiple times when I had this something-is-not-quite-right feeling. Here I gather some quotes on the subject, from my recent readings.

Let me briefly state my argument, as it helps in understanding the perspective of this selection of quotes. Some authors criticize digital tools and/or visualization because they supposedly carry a regrettable external influence on the humanities. My discomfort lies in two points.

Firstly, users repurpose tools in unexpected ways, and (re-)interpret them in ways widely different from their original framing. So if there is an influence, it is not self-evident; the origin of a tool does not guarantee its influence.

Secondly, science is “disunified” (Galison’s word). A given discipline such as physics is not a single culture, but many (you may think of Knorr Cetina’s epistemic cultures). When humanists use Gephi, they do not borrow it from computational social scientists like Lada Adamic, or network scientists like Albert-László Barabási, but from engineers like Mathieu Bastian, my 2010 self, and fieldwork-engaged sociologists like Dana Diminescu. Our project was to explore and describe, not to model and predict. Furthermore, what some scholars perceive as their own fight against science-as-seeking-universal-laws already exists within “hard science” fields such as physics. Why see tools as Trojan horses when the landscape of knowledge production is so bafflingly fragmented and complex that an entire field of research is dedicated to studying it?

These points deserve more development, I know; but this post is not the right place, and I will stop for now.

I must specify, however, that I do not contest the existence of an external influence on the humanities. I am only complicating things to improve the criticism of scientific instruments, so that it can better guide their design and use in practice. Let’s only fix tools where, when, and if there is something broken – which is still, for me, a pretty open question.

I expect the following quotes to help me identify a few things:

  • Who, they say, is influencing?
  • What, they say, is being influenced?
  • How, they say, does this influencing operate?
  • Why, they say, is it a matter of tools, visualization or technology?

I want to know if there is a core argument, or a constellation of semi-related points. And I want to know if the point is serious or, in the worst case, a glorified misunderstanding.

Casual finding

These authors have a variety of perspectives on, and conceptualizations of, the tool question. Most papers are not directly, or at all, about tools and/or visualization. Yet most of them casually state, often when summarizing their point, that presuppositions/ontologies/methods are built into the tool and/or visualization. And none of these papers tells us who stuffed the tool/visualization with its epistemic filling, why, or how.

A first kind of author just does not ground the claim. A second kind cites Johanna Drucker, but she does not ground the claim (I am not saying she has to, but that is how it is). Finally, a third kind actually proposes an explanation; but the closer we get to the question, the more it seems to be about something else. Judge for yourself.

Some quotes

I present the quotes by paper, sorted in reverse chronological order, because it sometimes helps in following the network of references. This way, it feels as though one progresses upstream toward the source of the argument.

Of course, at some point the argument is grounded in cognition and semiotics; I stop before extracting quotes from those papers. That space is too large. I only track who says what, and on what grounds. I focus on science and technology studies and the humanities (roughly speaking). I do not even try to be exhaustive.

Ruppert & Scheel, 2019

Ruppert, E., & Scheel, S. (2019). The Politics of Method: Taming the New, Making Data Official. International Political Sociology, 13(3), 233-252.

Evelyn Ruppert and Stephan Scheel’s piece is not about the social sciences and humanities. However, it emphasizes the materiality of practices with big data, notably visualization (see the quotes below). This is why I include it here.

we depart from Savage’s conception of the politics of method to argue that these strategies … also feature material‐semiotic practices like demonstrations that seek to legitimise innovations in methods and data as official. … The politics of method rather require a symmetrical analysis that accounts for how different kinds of digital devices are mobilised

For the authors, visualizations perform realities in two ways: by rendering the phenomenal world as self-evident, and by making absent the work put into obtaining and refining the data (enacting a reality independent of the method).

we understand visualizations as crafted set‐ups that involve situated enactments of realities. … visualizations bring realities performatively into being.

In contrast to other statistical accounts of mobility, such as static charts and tables, MPD seems to speak for itself precisely because it moves. The moving red dots become not only a vehicle for the data, but first and foremost for its claimed self‐evidence. The red dots moving along Estonia’s main transport routes suggest that they correspond to the commuters they are meant to represent. Through this “realist trick” (Law 2012) mobility is enacted as a reality that exists independently of the methods that are used to describe it. There appears to be a seamless correspondence between the visualization (the moving dots) and the reality (commuting patterns in Estonia) it represents and renders “the phenomenal world (as if it) were self‐evident and the apprehension of it a mere mechanical task” (Drucker 2011, 2). In this way MPD is constituted as the perfect method for tracing the movements and locations of increasingly mobile populations, a method that offers an unrestricted vision from above, a vision that allows, in tradition of the “god trick” described by Donna Haraway (1988, 581), to see “everything from nowhere.”

the map’s capacity to build allies derives precisely from making absent all the work that goes into the map’s crafting as a coherent account of mobility.

the demonstration is a strategy to convince other statisticians and build allies … In other words, the heat map demonstrates a paradigm shift that statisticians sometimes refer to as a change from statistics to modelling.

References on that matter:

  • Law, 2012
  • Drucker, 2011
  • Haraway, 1988

Masson, 2017

Masson, E. (2017). Humanistic Data Research: An Encounter between Epistemic Traditions. In M. T. Schäfer & K. van Es (Eds.), The Datafied Society: Studying Culture Through Data. Amsterdam: Amsterdam University Press.

For Eef Masson, the humanities borrow digital tools from positivist traditions, and those tools incorporate the epistemic traditions they derive from. The humanities are thus inevitably indebted, because of presuppositions built into the tools. Masson does not specify how this works, but points to authors who have made the argument. She also mentions a transparency effect of computational methods and visualization.

with the introduction of digital research tools, and tools for data research specifically, humanistic scholarship seems to get increasingly indebted to positivist traditions

those tools, more often than not, are borrowed from disciplines centred on the analysis of empirical, usually quantitative data. Inevitably, then, they incorporate the epistemic traditions they derive from.

Johanna Drucker points out that tools for information visualization are inevitably indebted to the disciplines from which they derive. The same, one might add, applies to tools for data scraping, and for the cleaning, sorting or otherwise processing of collected data.

At the most basic level, the indebtedness Drucker speaks of can be understood as a set of built-in presuppositions about how knowledge is obtained. In this context, it is important to consider not only the assumptions of the practitioners for whom the tools were designed … but also those of the software engineers who conceived them.

Eef Masson draws her argument from Drucker, Kitchin, and Rieder & Röhle. Here is her summary.

In the absence of readily legible clues as to their epistemic foundations, computational research tools are often assigned such values as reliability and transparency (Kitchin 2014: 130). As Rieder and Röhle observe, the automated processing of empirical data that they enable seems to suggest a neutral perspective on reality, unaffected by human subjectivity (2012: 72). Drucker, a specialist in the history of graphics, makes a similar point, focusing more closely on practices of data visualization. She argues that the tools used for this purpose are often treated as if the representations they render provide direct access to ‘what is’. This way, the distinction between scientific observation (‘the act of creating a statistical, empirical, or subjective account or image’) and the phenomena observed is being collapsed (Drucker 2014: 125; see also Drucker 2012: 86).

References on that matter:

  • Drucker, 2012
  • Drucker, 2014
  • Kitchin, 2014
  • Rieder & Röhle, 2012

Rieder & Röhle, 2017

Rieder, B., & Röhle, T. (2017). Digital Methods: From Challenges to Bildung. In M. T. Schäfer & K. van Es (Eds.), The Datafied Society: Studying Culture Through Data (pp. 109–124). Amsterdam: Amsterdam University Press. 

For Bernhard Rieder and Theo Röhle, knowledge is embedded, stuffed within tools; tools express and perform methods. By this, they mean that (1) tools make methods possible, but also that (2) tools perform methods independently of their users (to a certain extent).

They frame the matter in the context of an encounter between the humanities (understood in a broad sense) and computing (seen as a method drawing from multiple fields). They are careful not to reduce the dichotomy to qualitative/quantitative or critical/administrative.

Largely drawing from their 2012 paper, they break down the influence of digital tools as follows:

  • People believe computers can be more objective than humans
  • Visualizations are spectacular, which plays a rhetorical role
  • Their opacity undermines a pillar of scholarship: open scrutiny
  • If one believes in the existence of underlying laws of nature, the computer appears as the quintessential tool to represent and calculate them
  • By commoditizing methods for lay publics, they render mediations invisible and thus promote essentialist interpretations.

For the authors, users play an active role in the influence of the tool, because they may or may not understand the concepts and methods embedded in the tools – hence the importance of Bildung.

The encounter between the humanities and computing plays out in different ways in different arenas, but needs to be addressed in principle as well as in relation to particular settings. … While terms like ‘digital humanities’, ‘Cultural Analytics’, ‘digital methods’ or ‘web science’ can play the role of buzzwords, their proliferation can be seen as indicator for a ‘computational turn’ that runs deeper than a simple rise of quantitative or ‘scientific’ modes of analysis.

Even if ‘the digital’ has become a dominant passage point, it works like a meat grinder: the shredded material does not come out as a single thread, but as many. To connect back to the Methodenstreit: computational methods can be both deductive and inductive (see e.g. Tukey’s (1962) concept of exploratory data analysis), both quantitative and qualitative in outlook, both critical and administrative.

why computational tools have sparked such a tremendous amount of interest when it comes to studying social or cultural matters [?] One explanation might be the notion that the computer is able to reach beyond human particularities and into the realm of objectivity

Since [visualizations of network topologies, timelines or enriched cartographies] possess spectacular aesthetic – and thus rhetorical – qualities, we asked how the argumentative power of images could (or should) be criticized. … The challenge is thus to maintain a productive self-reflexive inquiry into our own visual practices … without abandoning the promise of gaining insights via visual forms (Drucker 2014: 130-137).

Despite the fact that writing software forces us to make procedures explicit by laying them out in computer code, ‘readability’ is by no means guaranteed. However, an open process of scrutiny is one of the pillars of scholarship and, in the end, of scholarship’s claim to social legitimacy.

When reality is perceived to adhere to a specifiable system of rules, the computer appears to be the quintessential tool to represent this system and to calculate its dynamics.

A critique of digital tools is incomplete without a critique of their users and the wider settings they are embedded in.

If students and researchers are trained in using these tools without considerable attention being paid to the conceptual spaces they mobilize, the outcomes can be highly problematic. Digital Bildung thus requires attentiveness not just to the software form, but to the actual concepts and methods expressed and made operational through computational procedures.

It is again important to notice that the point and line form comes with its own epistemic commitments and implications, and graph analysis and visualization tools like Gephi further structure the research process.

The problem, again, comes from the fact that tools such as Gephi have made network analysis accessible to broad audiences that happily produce network diagrams without having acquired robust understanding of the concepts and techniques the software mobilizes. This more often than not leads to a lack of awareness of the layers of mediation network analysis implies and thus to limited or essentialist readings of the produced outputs that miss its artificial, analytical character. A network visualization is closer to a correlation coefficient than to a geographical map and needs to be treated accordingly.

While our three examples might be considered very specific, we think that similar arguments could be made for a wide variety of cases where software performs a method.

The problem of black boxing does not begin with the opacity of computer code, but with the desire to banish technology from the ‘world of signification’ (Simondon 1958: 10). Behind the laudable efforts to increase levels of technical capacity lies the dangerous phantasm that technology’s epistemologies are ultimately ‘thin’ and that once programming skill has been acquired, mastery and control return. We believe, on the contrary, that any nontrivial software tool implies thick layers of mediation that connect to computation as such, certainly, but in most cases also imply concepts, methods and styles of reasoning adapted from various other domains.

Digital methods are here to stay and to go beyond the simplistic reflexes of enthusiasm and rejection we need to engage in critical practice that is aware of the shocking amounts of knowledge we have stuffed into our tools.

References on that matter:

  • Rieder & Röhle, 2012
  • Drucker, 2014

Marres & Gerlitz, 2016

Marres, N., & Gerlitz, C. (2016). Interface methods: Renegotiating relations between digital social research, STS and sociology. The Sociological Review, 64(1), 21-46.

For Noortje Marres and Carolin Gerlitz, re-purposing a digital tool is ambivalent. As the tool was not made exactly for our needs, but for something similar, it comes with constraints but also freedoms. This is made possible because these tools are often plastic, if not unstable.

I must point out that the authors’ approach assumes born-digital data, so they do not address the problem of digitization here.

digital analytics invoke a methodological uncanny for social research. The tools mentioned above closely resemble the techniques and methods deployed in social inquiry, but we can certainly not call them ‘our own’. ‘Not our own’ because in second instance the methods built into popular tools often prove to have more alien disciplinary provenances, and to serve the objectives of digital platforms rather than those of research. … We will propose that there are decisive advantages to affirming the ambivalence of digital analytics – according to which data tools are both similar and different from sociological research techniques.

A key characteristic of the methodological uncanny is that it is not necessarily clear, which analytic purposes digital tools may serve, what research objectives they may align with or what disciplinary agendas they enact. One of us has previously characterised social research tools as ‘multifarious instruments’ which have the capacity to serve multiple purposes, which may not always be clearly distinguished, and which require some form of experimental test in order to be established (Marres, 2012).

much of the debate about digital methods in social media studies has focused on the possibility of the re-purposing of digital devices (Rogers, 2009). Sociologists have drawn attention to the instability and under-determinacy of digital research methods themselves, proposing notions such as plastic methods (Lury, 2012) and live methods (Back and Puwar, 2012).

If a tool can serve multiple purposes, it cannot be simply defined as a sociological tool or method, but can only become so through its deployment and in assembly with research questions, objectives and narrativation.

Taking into account the alignment of research objectives, data, tools, media and analytical purpose, we can conclude that digital research metrics may be called ‘thick’ provided we take the research context into account: they are propositions that suggest particular ways to equip, organize, and valuate practices and knowledges. While the measures built into online data tools are arguably rather ‘thin’ indeed, the socio-technical apparatus they enable – the detection of currency (for free!) – is much ‘thicker’: it integrates the analysis of live data into digital practices, and as such helps to realize informational societies orientated towards liveness. For this reason, we think of co-occurrence, or at least its implementation in data tools online, as a highly ‘interested method’ (Asdal, 2014).

References on that matter:

  • Marres, 2012 a
  • Ruppert, Law & Savage, 2013
  • Rogers, R. (2013). Digital Methods. Cambridge, MA: MIT Press.

Drucker, 2014

Drucker, J. (2014). Graphesis: Visual Forms of Knowledge Production. Cambridge, MA: Harvard University Press.

For Johanna Drucker, the graphical tools developed in empirical sciences and adopted by humanists are Trojan horses conveying assumptions about the nature of information:

  • Their familiarity conceals their epistemological biases, and collapses the critical distance between the phenomenal world and its interpretation.
  • Their simplicity and legibility hide the fact that the data were obtained (taken, not given).
  • Their stable reading conventions hide the fact that phenomena are not independent from their observer.

“We know this”, she writes; but I find none of these three points self-evident. One can find details in other texts, but not in this one. She does not explain why some (all?) visualizations feel familiar: does it stem from their semiotics, or did an external influence weaponize them? Similarly, it is unclear whether she just contests the reductionist approach, or whether visualization techniques have inherent reductionist qualities.

Also note that although her vocabulary suggests an intent to deceive, that intentionality is not argued.

Most information visualizations are acts of interpretation masquerading as presentation. In other words, they are images that act as if they are just showing us what is, but in actuality, they are arguments made in graphical form.

the primary effect of visual forms of knowledge production in any medium … is to mask the very fact of their visuality, to render invisible the very means through which they function as argument.

Expectations about images changed and even the concept of what constitutes a likeness alters over time. We come to believe that photographs are an unmediated image, what Roland Barthes called an “image without a code,” and continue this belief as digital methods of scanning, altering, and creating have developed. But of course, all images are encoded by their technologies of production and embody the qualities of the media in which they exist.

Most, if not all, of the visualizations adopted by humanists, such as GIS mapping, graphs, and charts, were developed in other disciplines. These graphical tools are a kind of intellectual Trojan horse, a vehicle through which assumptions about what constitutes information swarm with potent force. These assumptions are cloaked in a rhetoric taken wholesale from the techniques of the empirical sciences that conceals their epistemological biases under a guise of familiarity. So naturalized are the maps and bar charts generated from spread sheets that they pass as unquestioned representations of “what is.” This is the hallmark of realist models of knowledge and needs to be subjected to a radical critique to return the humanistic tenets of constructedness and interpretation to the fore. Realist approaches depend above all upon an idea that phenomena are observer-independent and can be characterized as data. Data pass themselves off as mere descriptions of a priori conditions. Rendering observation (the act of creating a statistical, empirical, or subjective account or image) as if it were the same as the phenomena observed collapses the critical distance between the phenomenal world and its interpretation, undoing the concept of interpretation on which humanistic knowledge production is based. We know this. But we seem ready and eager to suspend critical judgment in a rush to visualization. At the very least, humanists beginning to play at the intersection of statistics and graphics ought to take a detour through the substantial discussions of the sociology of knowledge and its critical discussion of realist models of data gathering. At best, we need to take on the challenge of developing graphical expressions rooted in and appropriate to interpretative activity.

Because realist approaches to visualization assume transparency and equivalence, as if the phenomenal world were self-evident and the apprehension of it a mere mechanical task, they are fundamentally at odds with approaches to humanities scholarship premised on constructivist principles. I would argue that even for realist models, those that presume an observer-independent reality available to description, the methods of presenting ambiguity and uncertainty in more nuanced terms would be useful.

the rendering of statistical information into graphical form gives it a simplicity and legibility that hides every aspect of the original interpretative framework on which the statistical data were constructed. The graphical force conceals what the statistician knows very well—that no “data” pre-exist their parameterization. Data are capta, taken not given, constructed as an interpretation of the phenomenal world, not inherent in it.

knowledge created with the acknowledgment of the constructed nature of its premises is not commensurate with principles of certainty guiding empirical or realist methods. Humanistic methods are counter to the idea of reliably repeatable experiments or standard metrics that assume observer-independent phenomena. By definition, a humanistic approach is centered in the experiential, subjective conditions of interpretation.

Kitchin, 2014

Kitchin, R. (2014). The Data Revolution: Big Data, Open Data, Data Infrastructures and Their Consequences. London: Sage.

I did not read it, mostly for practical reasons. It features here because Eef Masson refers to it on this point: “In the absence of readily legible clues as to their epistemic foundations, computational research tools are often assigned such values as reliability and transparency.”

Ruppert, Law & Savage, 2013

Ruppert, E., Law, J., & Savage, M. (2013). Reassembling social science methods: The challenge of digital devices. Theory, Culture & Society, 30(4), 22-46.

For Evelyn Ruppert, John Law and Mike Savage, the traits of the digital apparatus with the most impact are the kind of traces produced, the modalities of their circulation, the kind of arrangements they elicit… Most of their argument is not specifically about tools or visualization; but a small part is.

For the authors, visualization can carry political power, for instance when it maps populations. This power lies in what it represents, but also in how it was obtained, and in the kind of patterns it does or does not manifest. The way visualizations circulate also matters.

Note. I did not extract quotes that are not directly about tools. However, I can offer an idea of the aspects of the digital apparatus they elaborate on, since it might interest you: (1) transactional actors; (2) heterogeneity; (3) visualization; (4) continuous, rather than bundled time; (5) whole populations; (6) granularity; (7) expertise; (8) mobile and mobilizing; and (9) non-coherence.

On the one hand, we want to suggest, controversially, that we are seeing a partial return to an older, observational kind of knowledge economy, based on the political power of the visualization and mapping of administratively derived data about whole populations. On the other hand, as a genealogical approach demands, we need to attend to the differential problems, concerns and devices through which observation is being performed by the digital and its material and productive effects, including the reconfiguration of knowledge spaces and social science expertise.

in the move to the digital visualization now becomes a means of showing how ‘excessive’ information can be reduced to a form in which it can be meaningfully, if partially, rendered for interpretation. … visualization becomes a summarizing inscription device for stabilizing and representing patterns so that they can be interpreted.

since both the distribution of digital devices and inscriptions is widespread, and that cascading devices work in different ways to produce different effects in different locations and circumstances, it is more readily apparent that knowledges do not cohere to generate a single authoritative representation of the social.

Rather than competition between ideas, it is competition between material devices where those that assemble and summarize can become ‘centres of calculation’. But crucial to this is their mobility, transmission and circulation, and the similar movement of inscriptions.

Drucker, 2012

Drucker, J. (2012). Humanistic Theory and Digital Scholarship. In Debates in the Digital Humanities, ed. Matthew K. Gold, 85-95. Minneapolis: University of Minnesota Press.

Johanna Drucker writes that the adoption of certain visualization and processing techniques blocks the functioning of humanistic methods. Those techniques are the digital techniques developed for empirical, positivist research (including the social and natural sciences) and for other fields such as management, business, gaming… They block humanistic methods mainly because interpretation, in the humanities, depends on the observer and cannot be reified into the observer-independent artifacts required by digital processing and visualization.

Her only argument incriminating visualization itself is that it requires digital data, which are problematic because of their inherent reductionism. Johanna Drucker grants visualization the power to imply certainty, but she does not support the claim. On the contrary, she mentions that only a naive viewer would fall for it, and that most humanists and social and natural scientists accept it regardless.

Tools for humanities work have evolved considerably in the last decade, but during that same period a host of protocols for information visualization, data mining, geospatial representation, and other research instruments have been absorbed from disciplines whose epistemological foundations and fundamental values are at odds with, or even hostile to, the humanities. Positivistic, strictly quantitative, mechanistic, reductive and literal, these visualization and processing techniques preclude humanistic methods from their operations because of the very assumptions on which they are designed: that objects of knowledge can be understood as self-identical, self-evident, ahistorical, and autonomous.

these visualization techniques … come entirely from realms outside the humanities—management, social sciences, natural sciences, business, economics, military surveillance, entertainment, gaming, and other fields in which the relativistic and comparative methods of the humanities play, at best, a small and accessory role.

Getting the work done—putting texts into digital formats with markup that identified content—might be an interpretative exercise, but introducing ambiguity at the level of markup was untenable, not merely impractical. … to play in a digital sandbox one had to follow the rules of computation: disambiguation and making explicit what was so often implicit in humanities work was the price of entry.

Humanities approaches would proceed from a number of very specific principles. The first of these is that interpretation is performative, not mechanistic—in other words, no text is self-identical; each instance or reading constructs a text; discourses create their objects; texts (in the broad sense of linguistic, visual, acoustic, filmic works) are not static objects but encoded provocations for reading.

The graphical tools that are used for statistical display depend, in the first instance, on quantitative data, information that can be parameterized so that it lends itself to display. Virtually no humanistic data lends itself to such parameterization (e.g., what year should a publication be dated to in the long history of its production and reception?), and it is in fact precisely in the impossibility of creating metrics appropriate to humanistic artifacts that the qualitative character of capta, that which is taken as interpretation rather than data, comes sharply into relief.

if the premises on which quantitative information might be abstracted from texts or corpora raise one set of issues, the use of graphical techniques from social and natural sciences raise others. Graphs and charts reify statistical information. They give it a look of certainty. Only a naive viewer, unskilled and untrained in matters of statistics or critical thought, would accept an information visualization at face value. But most humanists share with their social and natural science colleagues a willingness to accept the use of standard metrics and conventions without question in the production of these graphs

Probability is not the same as ambiguity or multivalent possibility within the field of humanistic inquiry. The task of calculating norms, medians, means, and averages will never be the same as the task of engaging with anomalies and taking their details as the basis of an argument. Statistics and pataphysics will never meet on the playing fields of shared understanding. They play different games, not just the same game with different rules.

Law, 2012

Law, J. (2012). Collateral realities. In The Politics of Knowledge (p. 157). London: Routledge.

John Law looks at reality as something that is performed, as opposed to an already-made given. In that perspective, no representation can be considered transparent – because that would conceal what the representation performs.

He argues here that material-semiotic practices enact reality the most powerfully when they leverage whatever lies beyond the limits of contestability, in particular the assumptions that are not represented but nevertheless accepted by participants. This realist trick, as he casually calls it, is the technique at the heart of common sense realism.

Practices enact realities … This means that if we want to understand how realities are done or to explore their politics, then we have to attend carefully to practices and ask how they work. … For my purposes, practices are detectable and somewhat ordered sets of material-semiotic relations.

[My interest] is to ask how these talking and meeting practices work to assemble a putative reality. But if we are to do this then we have to teach ourselves to see the work being done by the PowerPoints and the abstracts. We need to find ways of making this work visible. We need to resist the propensity to treat these texts as transparent, self-evident, or uninteresting windows on a pre-given world.

if we stick with the methodologists, then we know that they worry about technical adequacy. The assumption is that good techniques produce satisfactory representations of reality. What follows? One implication that I’ve already touched on is that techniques themselves become essentially uninteresting. This is because when they are working properly they are transparent. In this way of thinking they don’t distort realities, but merely transmit them. In short, good methods are a like window on reality. This means that unless something has gone wrong they can be ignored. As is clear, I have been arguing against this. No representation, I’ve been saying, is actually transparent.

The words (appear to) open a small window onto reality. At the same time (this is a part of the realist trick) the methods for making that window have been more or less erased.

Here is the argument. First attend to practices. Look to see what is being done. In particular, attend empirically to how it is being done: how the relations are being assembled and ordered to produce objects, subjects and appropriate locations. Second, wash away the assumption that there is a reality out there beyond practice that is independent, definite, singular, coherent, and prior to that practice. Ask, instead, how it is that such a world is done in practice, and how it manages to hold steady. Third, ask how this process works to delete the way in which this sense of a definite exterior world is being done, to wash away the practices and turn representations into windows on the world. Four, remember that wherever you look whether this is a meeting hall, a talk, a laboratory, or a survey, there is no escape from practice. It is practices all the way down, contested or otherwise. Five, look for the gaps, the aporias and the tensions between the practices and their realities – for if you go looking for differences you will discover them.

Here is the proposition: whatever which is not contested and, more particularly, whatever lies beyond the limits of contestability is that which operates most powerfully to do the real. And it is this, to be sure, that is the technique that lies at the heart of common sense realism. It is the enactment of collateral realities that turns what is being done in practice into what necessarily has to be.

Marres, 2012 a

Marres, N. (2012). The redistribution of methods: On intervention in digital social research, broadly conceived. The Sociological Review, 60, 139-165.

For Noortje Marres, tools mediate methods, and as such have the power to re-distribute them. This ambivalent ability can reinstate old power relations, but it can also enable new forms of critique and experimentation with new methods.

Both [the Issue Crawler and the Co-Word Machine] re-mediate existing social methods, and both, I argue, involve the attempt to render specific methodology critiques effective in the online realm, namely critiques of the authority effects implicit in citation analysis. As such, these methods offer ways for social research to intervene critically in digital social research, and more specifically, to endorse and actively pursue the re-distribution of social methods online.

Taking up digital online tools, sociologists are likely to enter into working relations with platforms, tool developers and analytic and visual devices which are operating in contexts and developed for purposes that are not necessarily those of sociology

The new network science namely favours a new set of techniques for data collection and analysis, which entail an unusual division of labour between research subjects, data collection devices, and analysts in social research. To put it somewhat crudely, the approach seeks to maximize the role of mathematical techniques, at the expense of research subjects. … the new network science reinstates a classic opposition of social research, that between subjective and objective data.

Marres, 2012 b

Marres, N. (2012). On some uses and abuses of topology in the social analysis of technology (or the problem with smart meters). Theory, Culture & Society, 29(4-5), 288-310. 

Noortje Marres does not write here specifically about tools, but about technology and methods. Yet she sketches a conceptual frame along the way. Methods and ontologies can be built into material devices, and they can resurface regardless of the user, as artefacts.

In the social studies of technology, topology has been mostly understood as a theoretical construct, as a conceptual language that can help social theory to render explicit the structure of socio-technical phenomena. However, at the current juncture, topology must also be understood as a device, as a way of structuring phenomena in practice, which is enabled (and disabled) by particular technologies. We must, then, attend more closely to how a topological imagination is facilitated by specific material apparatuses deployed across social life.

Online applications for data analysis and visualization, that is, enable dynamic, and arguably ‘topological’, renderings of controversy (November and Latour, 2010; Scharnhorst and Wouters, 2006). However, what makes matters especially complicated here is that these applications have built into them particular methods of analysis and visualization, on which social studies of technology has also relied in the past to analyse controversies. To speak of the deployment of topology as a device, in this case, is then to do more than suggest an analogy between a sociological concept and digital technologies. It is to highlight that certain methods of ‘topological’ analysis have become built into digital technologies in recent times.

The topological unfolding of a space-time of controversy is revealed to be partly an artefact of the devices used to render controversy visible and analysable. The topological organization of controversy, that is, is here accomplished experimentally, through the deployment of digital devices. And this, in turn, has implications for how we imagine the relation between social and technological change.

mapping controversies may be said to offer a way of being critical that does not require a transcendentalizing move. To develop such empirical forms of critique requires more serious work and reflection on the tools and methods of topological analysis and, in particular, on the kinds of ontologies that get built into the software applications on which we rely.

Rieder & Röhle, 2012

Rieder, B., & Röhle, T. (2012). Digital Methods: Five Challenges. In Understanding Digital Humanities, ed. David M. Berry, 67-84. New York: Palgrave Macmillan.

In this text Bernhard Rieder and Theo Röhle focus on the tools that function as methods. For them, those tools promise nice things to the humanities, but associate certain truth-claims with their results. The mechanization they offer is attractive because it appears less subjective, and visualization is appealing because it efficiently reduces information; but both effects are rhetorical, produced by concealing the process and the mediation, which is methodologically dangerous, if not plain wrong. Yet this does not invalidate the benefits of the tools. Understanding the tool and its methodology is crucial, which also depends on the user.

These computational tools hold a lot of promise: they are able to process much more data than would ever be possible manually; they promise extended zoomability between micro and macro and the ability to reconcile breadth and depth of analysis; they can help to reveal patterns and structures which are impossible to discern with the naked eye; they are even probing into semantic relations and meaning.

What interests us here … are tools that explicitly function as methods: they process data systematically and associate certain “truth claims” with their results. Two types of software can be distinguished here.

The first consists of automated versions of existing manual methods … The central question here is how these methods are affected when they move into the digital realm and are implemented as software. … the change in technology may have important consequence for how these methods are used, how they evolve, and how they produce knowledge. …

The second type of tools, which can be broadly subsumed under the term “data exploration”, represents a more inductive tendency. While they do not pretend to verify or falsify hypotheses, they still try to generate knowledge about the data they analyse. By rendering certain aspects, properties, and relations visible, they offer us particular perspectives on the phenomena we are interested in. While their results may be visually impressive and intuitively convincing, the methodological and epistemological status of their output seems unclear at best. Nevertheless, it is these very tools that provoke the most enthusiastic reactions. What is rarely reflected by advocates of an “end of theory” (Chris Anderson) though, is that theory is already at work on the most basic level of methodology, i.e. when it comes to defining units of analysis, algorithms, and visualisation procedures.

Empirical methods that rely on strong formalisation epitomise [the scientific ideal of the natural sciences], especially if they are implemented in an automated way. Automatic collection and processing appears to remove the data one step further from the perils of human error and subjective judgement.

Visualisation has a long history as a rhetoric device in the sciences; it is one of the prime vehicles for reducing complexity and conveying a certain perspective on the material. The fact that a visualisation is not given, but always a specific projection of the data is often forgotten in the process, especially when visual competence is lacking.

visualisations are specific kinds of representations that involve specific kinds of reductions. But is it therefore feasible to treat them exclusively as a rhetoric device? The visualisations seem to carry some kind of valid proposition about the world, but how can their range be properly delineated, what are kosher ways to integrate them into a scientific argument?

without the full consciousness of what it means to mechanise methodology, we may find ourselves in a situation where large parts of knowledge production are delegated to software tools that we do no longer understand.

Drucker, 2011

Drucker, J. (2011). Humanities approaches to graphical display. Digital Humanities Quarterly, 5(1), 1-21.

This text is basically the same as a section of Johanna Drucker’s 2014 book Graphesis, quoted above. I have already summarized the point, but I will still quote the text for the record.

This is the first paragraph:

As digital visualization tools have become more ubiquitous, humanists have adopted many applications such as GIS mapping, graphs, and charts for statistical display that were developed in other disciplines. But, I will argue, such graphical tools are a kind of intellectual Trojan horse, a vehicle through which assumptions about what constitutes information swarm with potent force. These assumptions are cloaked in a rhetoric taken wholesale from the techniques of the empirical sciences that conceals their epistemological biases under a guise of familiarity. So naturalized are the google maps and bar charts generated from spread sheets that they pass as unquestioned representations of “what is.” This is the hallmark of realist models of knowledge and needs to be subjected to a radical critique to return the humanistic tenets of constructedness and interpretation to the fore. Realist approaches depend above all upon an idea that phenomena are observer-independent and can be characterized as data. Data pass themselves off as mere descriptions of a priori conditions. Rendering observation (the act of creating a statistical, empirical, or subjective account or image) as if it were the same as the phenomena observed collapses the critical distance between the phenomenal world and its interpretation, undoing the basis of interpretation on which humanistic knowledge production is based. We know this. But we seem ready and eager to suspend critical judgment in a rush to visualization. At the very least, humanists beginning to play at the intersection of statistics and graphics ought to take a detour through the substantial discussions of the sociology of knowledge and its developed critique of realist models of data gathering. At best, we need to take on the challenge of developing graphical expressions rooted in and appropriate to interpretative activity.

These quotes are from a more detailed discussion that engages with semiotics.

At stake, as I have said before and in many contexts, is the authority of humanistic knowledge in a culture increasingly beset by the claims of quantitative approaches that operate on claims of certainty. … The digital humanities can no longer afford to take its tools and methods from disciplines whose fundamental epistemological assumptions are at odds with humanistic method.

the rendering of statistical information into graphical form gives it a simplicity and legibility that hides every aspect of the original interpretative framework on which the statistical data were constructed. The graphical force conceals what the statistician knows very well—that no “data” pre-exist their parameterization. Data are capta, taken not given, constructed as an interpretation of the phenomenal world, not inherent in it.

To expose the constructedness of data as capta a number of systematic changes have to be applied to the creation of graphical displays. That is the foundation and purpose of a humanistic approach to the qualitative display of graphical information.

take these basic elements of graphical display and rethink them according to humanistic principles:

  • In conventional statistical graphics, the scale divisions are equal units. In humanistic, interpretative, graphics, they are not.
  • In statistical graphics the coordinate lines are always continuous and straight. In humanistic, interpretative, graphics, they might have breaks, repetitions, and curves or dips. Interpretation is stochastic and probabilistic, not mechanistic, and its uncertainties require the same mathematical and computational models as other complex systems.
  • The scale figures and labels in statistical graphics need to be clear and legible in all cases, and all the more so in humanistic, interpretative, graphics since they will need to do quite a bit of work.

References on that matter:

  • Latour, 1986
  • Knorr-Cetina, Karin, and Klaus Amann. “Image Dissection in Natural Scientific Inquiry”. Science, Technology, & Human Values 15 (1990), pp. 259-283.
  • Lynch, Michael, and Steve Woolgar. (1988) “Introduction: Sociological Orientations to Representational Practice in Science”. Human Studies 11 (1988), pp. 99-116.
  • Anderson, Margo. “Quantitative History”. In William Outwaite and Stephen Turner, eds., The Sage Handbook of Social Science Methodology. London: Sage Publications, 2007. pp. 246-263.
  • Anderson, Margo. “The Census, Audiences, and Publics”. Social Science History 32: 1 (2008), pp. 1-18.
  • Porter, Ted. Trust in Numbers: The Pursuit of Objectivity in Science and Public Life. Princeton: Princeton University Press, 1995.
  • Jain, S. Lochlann. “The Mortality Effect: Counting the Dead in the Cancer Trial”. Public Culture (2010), pp. 89-117.

Haraway, 1988

Haraway, D. (1988). Situated knowledges: The science question in feminism and the privilege of partial perspective. Feminist Studies, 14(3), 575-599.

Donna Haraway writes on visualization – if you can make it through the fermented dough of her language. The text is often cited for the idea of the god trick of seeing everything from nowhere. The trick is to conceal the mediation and make you believe that the dominant knowledge is not situated – that it is objective because it does not have a body, while you have a body and are thus biased.

The eyes have been used to signify a perverse capacity … to distance the knowing subject from everybody and everything in the interests of unfettered power. The instruments of visualization in multinationalist, postmodernist culture have compounded these meanings of disembodiment.

Vision in this technological feast becomes unregulated gluttony; all seems not just mythically about the god trick of seeing everything from nowhere, but to have put the myth into ordinary practice. And like the god trick, this eye fucks the world to make techno-monsters.

this ideology of direct, devouring, generative, and unrestricted vision, whose technological mediations are simultaneously celebrated and presented as utterly transparent

we need to reclaim that sense to find our way through all the visualizing tricks and powers of modern sciences and technologies that have transformed the objectivity debates.

There is no unmediated photograph or passive camera obscura in scientific accounts of bodies and machines; there are only highly specific visual possibilities, each with a wonderfully detailed, active, partial way of organizing worlds. All these pictures of the world should not be allegories of infinite mobility and interchangeability but of elaborate specificity and difference and the loving care people might take to learn how to see faithfully from another’s point of view, even when the other is our own machine.

I wish to translate the ideological dimensions of “facticity” and “the organic” into a cumbersome entity called a “material-semiotic actor.” This unwieldy term is intended to portray the object of knowledge as an active, meaning-generating part of apparatus of bodily production, without ever implying the immediate presence of such objects or, what is the same thing, their final or unique determination of what can count as objective knowledge at a particular historical juncture.

Latour, 1986

Latour, Bruno. (1986) "Visualisation and Cognition: Drawing Things Together." Knowledge and Society: Studies in the Sociology of Culture Past and Present, 6: 1-40.

For Bruno Latour, the materiality of visualization may matter more than its semiotics. The fact that it allows optical consistency, and that it can be moved and recombined, plays a crucial role in the power of science. He also argues that, despite the subjectivity of interpretation, disagreeing with the main interpretation becomes increasingly costly as visualizations accumulate and succeed in mobilizing many powerful actors.

The essential characteristics of inscriptions cannot be defined in terms of visualization, print, and writing. In other words, it is not perception which is at stake in this problem of visualization and cognition. New inscriptions, and new ways of perceiving them, are the results of something deeper. … In sum, you have to invent objects which have the properties of being mobile but also immutable, presentable, readable and combinable with one another.

The shift from the other senses to vision is a consequence of the agonistic situation. You present absent things. No one can smell or hear or touch Sakhalin island, but you can look at the map and determine at which bearing you will see the land when you send the next fleet. The speakers are talking to one another, feeling, hearing and touching each other, but they are now talking with many absent things presented all at once. This presence/absence is possible through the two-way connection established by these many contrivances —perspective, projection, map, log book, etc.— that allow translation without corruption.

The main quality of the new space [of the map] is not to be “objective” as a naïve definition of realism often claims, but rather to have optical consistency. This consistency entails the “art of describing” everything and the possibility of going from one type of visual trace to another.

There is no detectable difference between natural and social science, as far as the obsession for graphism is concerned. If scientists were looking at nature, at economies, at stars, at organs, they would not see anything. … In the debates around perception, what is always forgotten is this simple drift from watching confusing three-dimensional objects, to inspecting two-dimensional images which have been made less confusing.

it is not the inscription by itself that should carry the burden of explaining the power of science; it is the inscription as the fine edge and the final stage of a whole process of mobilization, that modifies the scale of the rhetoric. Without the displacement, the inscription is worthless; without the inscription the displacement is wasted. … So, the phenomenon we are tackling is not inscription per se, but the cascade of ever simplified inscriptions that allow harder facts to be produced at greater cost.

It is precisely because the dissenter can always escape and try out another interpretation, that so much energy and time is devoted by scientists to corner him and surround him with ever more dramatic visual effects. Although in principle any interpretation can be opposed to any text and image, in practice this is far from being the case; the cost of dissenting increases with each new collection, each new labeling, each new redrawing. … Thus, one more inscription, one more trick to enhance contrast, one simple device to decrease background, one coloring procedure, might be enough, all things being equal, to swing the balance of power and turn an incredible statement into a credible one which would then be passed along without further modification.




5 thoughts on “Some quotes on how tools influence the humanities”

  1. Note: I am going to leave here some other papers that could extend this list. Feel free to contribute.

    Paßmann, Johannes, and Asher Boersma. ‘Unknowing Algorithms: On Transparency of Unopenable Black Boxes’. The Datafied Society. Studying Culture through Data, edited by Mirko Tobias Schäfer and Karin van Es, Amsterdam University Press, 2017. doi:10.5117/9789462981362.

    Bruns, Axel. ‘Faster than the Speed of Print: Reconciling “Big Data” Social Media Analysis and Academic Scholarship’. First Monday, vol. 18, no. 10, Oct. 2013. http://firstmonday.org/ojs/index.php/fm/article/view/4879.

  2. Nice start of an analysis on what can be improved – looking forward to examples of how it can be done more productively. I think we really do need new methodological frameworks for critiquing digital tools in ways that acknowledge what they bring while also pointing out where they may fall short or distort the research results; agreed that we should do this with Knorr Cetina’s epistemic Cultures in mind (so clearly articulating the specific debates and the underlying assumptions we are engaging with). Small detail: my former colleague Eef Masson is a she :-)
