In the Fall of 2014, I visited the Friday Harbor Laboratories to “revisit” critical GIS with a group of about thirty scholars. This month, our commentary on the workshop is making its way to print in Environment and Planning A. If you have institutional access, you can download it on their website. If not, you can download the accepted manuscript version. Or, if you want the TL;DR version, skip to the last two lines: “We are continually revisiting critical GIS. Join us.”
Toward a Geographical Software Studies 1: Political economy and infrastructures
Wednesday, 3/30/2016, from 10:00 AM to 11:40 AM
Laura Beltz Imaoka (University of California, Irvine), The Immaterial Value of Proprietary Software: Platforming ArcGIS
Ashwin Jacob Mathew (University of California, Berkeley/Packet Clearing House), Protocol as a Fieldsite
Till Straube (Goethe University Frankfurt), Seeing Like a Stack
Will Payne (University of California, Berkeley), What’s in a (Neighborhood) Name? Location-Based Services and Contested Delineations of Place
Discussant: James Thatcher (University of Washington – Tacoma)
Toward a Geographical Software Studies 2: Language and tools
Wednesday, 3/30/2016, from 1:20 PM to 3:00 PM
Matthias Plennert (FAU Erlangen-Nürnberg), Analyzing the hidden backbone of an open-data-project: a genealogy of the OpenStreetMap data model
Warren Sack (University of California, Santa Cruz), Out of Bounds: Language Limits, Language Planning, and Linguistic Capitalism
Luke R. Bergmann (University of Washington), Speculative computing: toward Geographic Imagination Systems (GIS)
Pip Thornton (Royal Holloway, University of London), The Production of Context and the Digital Reconstruction of Language
Discussant: Cheryl Gilge (University of Washington)
Toward a Geographical Software Studies 3: The visual and control
Wednesday, 3/30/2016, from 3:20 PM to 5:00 PM
Craig M. Dalton (Hofstra University), Seeing with Software: Mobile device users’ geographic knowledges
Aaron Shapiro (University of Pennsylvania), The Surface of Things: Google Street View, Computer Vision, and Broken Windows
Louise Amoore (Durham University)
Teresa Scassa (University of Ottawa), Mapping Crime: Civic Technology in the Emerging Smart Cities Context
Discussant: Clare Melhuish (University College London)
Toward a Geographical Software Studies: Methods and theory
Wednesday, 3/30/2016, from 5:20 PM to 7:00 PM
Elvin K. Wyly (University of British Columbia)
Pip Thornton (Royal Holloway, University of London)
Daniel G. Cockayne (University of Kentucky)
Keith Woodward (University of Wisconsin-Madison)
Discussant: Matthew W. Wilson (Harvard University)
Session Description: A growing body of recent geographic scholarship has focused its attention on software and algorithms. Some of these studies analyze geographic technologies such as GIS and the geoweb, while others investigate the myriad digital technologies that have become ubiquitous within the spaces of everyday life. These software/code objects interact with and modulate the world in complex ways, enact processes that connect humans and nonhumans, and become entangled with social, cultural, political, and economic systems. Moreover, software created to visualize data is used to produce knowledge about urban environments and everyday life, but it obscures the processes and contexts that underlie its development. Engaging these topics, geographers have developed concepts like the “automatic production of space” (Thrift and French 2002), “software-sorted geographies” (Graham 2005), and “code/space” (Kitchin and Dodge 2011) to describe how software and space are co-constituted in the contemporary world. Productive research is building on these topics to explore new ways geographies are produced (Rose, Degen, and Melhuish 2014), governed (Amoore 2011), materialized, represented (Woodward et al. 2015), and lived through software (Kinsley 2014).
This session seeks to bring together a range of spatial thinkers who are producing new studies, theories, and methods for understanding and producing software. We welcome submissions that address all facets of software: the context of its production, its internal operational logics, the material work it does in the world, and its spatial distribution of social and political effects.
Sponsorships: Geographic Information Science and Systems Specialty Group
Cyberinfrastructure Specialty Group
Political Geography Specialty Group
In an attempt to trace a history of radical thought within the discipline of geography (and assemble a reading list for my qualifying exams), I’ve been experimenting with co-citation visualizations. These network graphs rely on citation data from Web of Science, connecting texts that are cited together a minimum number of times, as indicated by the citation threshold slider. I focused mostly on the journal Antipode, which has offered a “radical (Marxist/socialist/anarchist/anti-racist/feminist/queer/green) analysis of geographical issues” since its founding in 1969. Unfortunately, Web of Science only indexes Antipode articles from 1990 through 2015. But I was able to see how older Antipode articles have been taken up by three major English language geography journals (The Annals of the Association of American Geographers, Progress in Human Geography, and Transactions of the Institute of British Geographers) by mapping co-citations of Antipode articles within them. This allowed me to go back to 1969 for the Annals, 1982 for Progress, and 1970 for Transactions and produce this graph:
Zooming in reveals discussions within the discipline. For example, on the left, we find five scholars debating visuality within geography in a special issue of Antipode, while on the right, we find an ongoing discussion about the university, education, and pedagogy:
Below are all of the graphs I’ve done so far. Let me know if you find any interesting patterns!
This work relies on code written by a number of people. I used Neal Caren’s Python script, with slight modifications, to convert the Web of Science data into a D3 graph. Kieran Healy’s code helped me modify the visualization. And Jonathan Goodwin’s code and write-up were super helpful in adding the citation threshold slider and putting everything together.
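The core idea behind these graphs is simple enough to sketch. This is not Caren’s actual script, just a minimal, illustrative version of the underlying technique: count how many papers cite each pair of references together, keep pairs above the threshold, and emit the nodes/links JSON that D3’s force layout expects (the sample records below are invented for illustration).

```python
import itertools
import json
from collections import Counter

def cocitation_graph(records, threshold=3):
    """Build a co-citation network from a list of papers' reference lists.

    records: iterable of lists, each holding one paper's cited references.
    threshold: minimum number of papers in which a pair must appear
    together for an edge to be kept (the "citation threshold" slider).
    """
    pair_counts = Counter()
    for refs in records:
        # Every unordered pair of references cited by the same paper
        # counts as one co-citation.
        for a, b in itertools.combinations(sorted(set(refs)), 2):
            pair_counts[(a, b)] += 1

    edges = {pair: n for pair, n in pair_counts.items() if n >= threshold}
    nodes = sorted({ref for pair in edges for ref in pair})
    index = {ref: i for i, ref in enumerate(nodes)}

    # D3's force layout consumes {"nodes": [...], "links": [...]}.
    return {
        "nodes": [{"id": ref} for ref in nodes],
        "links": [
            {"source": index[a], "target": index[b], "value": n}
            for (a, b), n in sorted(edges.items())
        ],
    }

records = [
    ["Harvey 1973", "Peet 1977", "Smith 1984"],
    ["Harvey 1973", "Peet 1977"],
    ["Harvey 1973", "Peet 1977", "Rose 1993"],
]
print(json.dumps(cocitation_graph(records, threshold=3), indent=2))
```

Raising the threshold prunes weakly connected references first, which is why the denser clusters in the graphs above survive longer as you drag the slider.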
Later this week, I’ll be in Denver to participate in the Society for Social Studies of Science (4S) Annual Meeting. On Thursday (4:00 to 5:30pm, Denver Sheraton, Plaza Ballroom D), I will be presenting a paper co-authored with Jennifer Denbow that outlines some of our recent research on ultrasound. Our abstract reads:
Developments in ultrasound software continue to produce new representations of bodily interiors, profoundly influencing medical, popular, and political understandings of pregnant bodies. While numerous feminist STS studies have explicated the importance of the resultant images, none have examined how legal regulations and economic interests interact with software production to produce these images. In this paper, we examine the development of ultrasound software through interviews with computer programmers, technical documentation, academic articles, and federal regulations. We explore how three different academic ultrasound laboratories conceptualize their work in relation to regulatory frameworks. These three sites, and the researchers within, are part of complex assemblages that influence the design choices and assumptions that go into producing ultrasound software. Through an examination of the technical and regulatory frameworks, as well as the software that laboratories produce, we argue that the regulatory and economic structures affect what software is produced and thus what images are possible. Thus, we argue that the production and use of ultrasound software is co-constitutive with legal regulations, economic interests, and understandings of reproductive bodies.
I haven’t gone through the program closely yet, but someone pointed me to a panel titled “Make Kin Not Babies: Toward Feminist STS Pro-Kin and Non-Natalist Politics of Population and Environment,” which looks fantastic. It includes contributions from Donna Haraway, Adele E. Clarke, Michelle Murphy, Kim TallBear, Alondra Nelson, and Chia-Ling Wu (Thursday, November 12, 10:30am to 12:00pm, Denver Sheraton, Governor’s Square 15). The abstract reads:
Feminist STS scholarship has long and richly addressed biogenetic reproduction, focusing on race, region, sexuality, class, gender, and more. However, feminist STS has also largely been silent about reducing the human burden on earth while strengthening ecojustice for people and other critters as means and not ends. Can we develop anti-colonial, anti-imperialist, anti-racist, STS-informed feminist politics of peopling the earth in current times, when babies should be rare yet precious and real pro-family and community politics for young and old remain rare yet urgently needed? How can we develop collaborative politics recognizing that peoples subjected to (ongoing) genocides may need more children? How can we intervene in the relentless glut of attention devoted to problematic, costly “rights” and “needs” for (mainly richer) women to have babies as an individual “choice”?
Questions: How to nurture durable multi-generational non-biological kin-making, while humans everywhere transition to vastly less reproduction? What alternative ways of flourishing can be nurtured across generations and across cultures, religions, nations? How to deter on-going anti-feminist population control efforts while generating innovative discourses that legitimate non-natalist policies and choices? How to promote research on forms of contraception women and men want (and can use under diverse circumstances) and reproductive services that actually serve? How to build non-natalist kin-making technologies and sciences in housing, travel, urban design, food growing, environmental rehabilitation, etc.?
Where are the feminist utopian, collaborative, risky imaginings and actions for earthlings in a mortal, damaged, human-heavy world? Why hasn’t feminist STS taken the lead in such fundamental endeavors?
Rachael Squire has an interesting post on the Geopolitics and Security blog from Royal Holloway. She describes recent news stories about Russian submarines operating near deep sea communication cables and provides us with some historical context. I will be interested to see where this research goes. She writes:
Last week, reports emerged of a Russian submarine ‘aggressively operating’ near US undersea cable infrastructure. According to the New York Times and a subsequent report by CNN, the presence of Russian subs near such vital infrastructure has prompted fears that Russia might be planning to ‘attack’ the cables in ‘times of tension or conflict’. The ‘cable’ posing a security threat is not a new phenomenon. During the Cold War for example, cable tapping was a key intelligence gathering strategy by both the US and Soviet Union. As a case in point Operation Ivy Bells saw fast-attack submarines and combat divers deployed to ‘drop waterproof recording pods on the lines’. The divers would return every few weeks to gather the pods before delivering them to the NSA. The latest reports, however, hint at something different to Cold War cable hacks. According to the NYT the primary threat is that the cables would be cut or severed.
Artist and geographer Trevor Paglen has also been doing some interesting work researching, photographing, and mapping undersea cables. Hyperallergic has a nice review of his recent exhibition at Metro Pictures, which includes the following two images:
Over on Geographical Imaginations, Derek Gregory points out two new projects, both of which sound fascinating. One is Eyal Weizman’s new “Forensic Architecture” lecture, which extends Weizman’s earlier work on Rebel Architecture, nicely summarized in this video:
Gregory also mentions a new book titled A Prehistory of the Cloud by Tung-Hui Hu, which he predicts will “surely be one of the must-reads of the year.” Hu traces a history of cloud computing, finding its roots in older networks like railroads and in older forms of political power. Lisa Banks writes:
Hu’s riveting genealogy of the cloud takes us into its precursors and politics, and boldly demonstrates how fantasies of sovereignty, security, and participation are bound up in it. Much more than a data center, the cloud is a diffuse and invisible structure of power that has yielded a data-centric order. Imaginative and lucidly written, this book will be core to digital media studies.
Recently, I’ve been interested in accounts like these that take the “new” out of “new media” by historicizing digital technologies, showing how they emerged from particular imaginaries, discourses, and historical precedents. Armand Mattelart’s short book Networking the World, 1794–2000 does a nice job of this, tracing how numerous communication networks, from the telegraph to the internet, have been met with utopian hopes for a newly connected, democratic, and peaceful world. He writes:
Messianic discourse about the democratic virtues of technology, which mask what is at stake in the struggles for control of the structure and content of knowledge networks, are of use in geopolitics. The champion of information superhighways, Albert Gore, adopts the same tone as the prophets who have preceded him since the end of the eighteenth century, when he presents to the “great human family” his world project for a network of networks: the global information infrastructure (GII). (92)
Over on booktwo.org, James Bridle has a nice post that, like some of Gregory’s writing, connects military history to contemporary digital technologies and the politics of vision. He writes:
When radar signals were received aboard an aircraft carrier, they were displayed on a radar oscilloscope. But in order for this information to be used in the midst of battle, the positions needed to be transcribed to a large glass viewing pane, and as part of this process they needed to be inverted and reversed. To perform this operation quickly and accurately, the radar operators were trained and drilled extensively in “upside down and backwards town”, a classified location where everything from newspapers to street signs were printed upside down and backwards. This experience would not so much create a new ability for the radar operators, as break down their existing biases towards left-to-right text, allowing them to operate in multiple dimensions at once.
This process, in Kevin’s reading and in mine, is akin to much of our experience of new technology, when our existing frameworks of reference, both literary and otherwise, are broken down, and we must learn over once again how to operate in the world, how to transform and transliterate information, how to absorb it, think it, search for it and deploy it. We must relearn our relationship not only with information, but with knowledge itself.
And, finally, I recently came across two great collections of articles, books, and scholars writing critically about digital media. The first, from the Social Media Collective at Microsoft Research New England, is a Critical Algorithm Studies reading list, which “spans sociology, anthropology, science and technology studies, geography, communication, media studies, and legal studies, among others,” intentionally avoiding work from computer science. The second is the Remediating Political Theory / Repoliticizing Media Theory reading list compiled by Jason Adams from The New Centre for Research & Practice. It’s a list of writers with no descriptions, so it’s a little harder to navigate, but potentially useful for the meticulous reader.
In this paper, we examine the relationship between the digital and geography. Our analysis provides an overview of the rich scholarship that has examined: (1) geographies of the digital, (2) geographies produced by the digital, and (3) geographies produced through the digital. Using this material we reflect on two questions: has there been a digital turn in geography? and, would it be productive to delimit ‘digital geography’ as a field of study within the discipline, as has recently occurred with the attempt to establish ‘digital anthropology’ and ‘digital sociology’? We argue that while there has been a digital turn across geographical sub-disciplines, the digital is now so pervasive in mediating the production of space and in producing geographic knowledge that it makes little sense to delimit digital geography as a distinct field. Instead, we believe it is more productive to think about how the digital reshapes many geographies.
You can download a copy on the Social Science Research Network page.
Also of interest is the new issue of Surveillance & Society, a double issue on the theme of “Surveillance Asymmetries and Ambiguities.” While I haven’t read it through yet, many of the abstracts sound very promising as scholars attempt to complicate understandings of power relations, especially in relation to computational surveillance practices.
Hitachi Data Systems, in a recent announcement, has upped the ante in the predictive policing game with the introduction of Visualization Predictive Crime Analytics (PCA) into their Visualization Suite. They claim:
PCA is the first tool of its kind to use real-time social media and Internet data feeds together with unique, sophisticated analytics to gather intelligent insight and enhance public safety through the delivery of highly accurate crime predictions.
The press release explicitly connects the technology to “smart cities,” claiming the suite will “help public and private entities accelerate extraction of rich, actionable insights from all of their data sources.” The use of the word “actionable” here is interesting and reminds me of Louise Amoore’s great 2011 article, “Data Derivatives: On the Emergence of a Security Risk Calculus for Our Times.” The data derivative, in Amoore’s article, is the “flag, map or score” inferred from large data sets like those used in Hitachi’s PCA system. She argues:
The pre-emptive deployment of a data derivative does not seek to predict the future, as in systems of pattern recognition that track forward from past data, for example, because it is precisely indifferent to whether a particular event occurs or not. What matters instead is the capacity to act in the face of uncertainty, to render data actionable.
The whole project seems wildly ambitious, with unknown statistical models, machine learning algorithms, and natural language processing routines holding it all together. More from the press release:
Hitachi Visualization Suite (HVS) is a hybrid cloud-based platform that integrates disparate data and video assets from public safety systems—911 computer-aided dispatch, license plate readers, gunshot sensors, and so on—in real time and presents them geospatially. HVS provides law enforcement with critical insight to improve intelligence, enhance investigative capabilities and increase operational efficiencies. Along with capturing real-time event data from sensors, HVS now offers the ability to provide geospatial visualizations for historical crime data in several forms, including heat maps. This feature is available in the Hitachi Visualization Predictive Crime Analytics (PCA) add-on module of the new Hitachi Visualization Suite 4.5 software release.
Blending real-time event data captured from public safety systems and sensors with historical and contextual crime data from record management systems, social media and other sources, PCA’s powerful spatial and temporal prediction algorithms help law enforcement and first responder teams assign threat levels for every city block. The algorithms can also be used to create threat level predictions to accurately forecast where crimes are likely to occur or additional resources are likely to be needed. PCA is unique in that it provides users with a better understanding of the underlying risk factors that generate or mitigate crime. It is the first predictive policing tool that uses natural language processing for topic intensity modeling using social media networks together with other public and private data feeds in real time to deliver highly accurate crime predictions.
Remapping Spatial Sensibilities
In a number of recent articles, scholars have drawn connections between cartography and the visual arts. These connections are usually confined to questions of aesthetics and representation, eschewing larger conceptual and historical ties. In this paper, I deploy Jacques Rancière’s concept of the “distribution of the sensible,” which he uses to describe how art changes what we are able to perceive. Using a number of maps as examples, I trace a history of cartography concerned with changing understandings of space. This periodization, I argue, suggests a path forward for cartographic work concerned with developing new spatial cognizance, or, in Rancière’s terms, re-distributing what is spatially sensible. This path, informed by art theory, opens up exciting new possibilities for cartographic work to exist as an independent knowledge-producing practice, intersect with theories in human geography, respond to the current moment, and produce new representations of space.
The conference schedule looks excellent and people keep telling me good things about past meetings, so I’m excited to be participating.
Recently, I’ve been writing about software used in surveillance and policing practices. One such practice, called predictive policing, uses software packages to analyze spatial crime data and predict where and when future crimes are likely to happen. Police departments can then use this information to decide where to deploy officers. This practice has received some mainstream press in recent years, with many pointing out its similarities to the Pre-crime division in the 2002 film Minority Report. Coincidentally, the last three cities I’ve called home, Santa Cruz, Oakland, and Madison, have all used predictive policing software.
One popular package, PredPol, uses principles from earthquake aftershock modeling software. Others are sometimes compared to the analytics software used to create targeted ads from customer shopping data. Of course, these software packages are proprietary, making it difficult to look under the hood to see how they really work. But their reliance on existing crime data should be a cause for alarm, especially in places with disparities in policing.
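While the product itself is a black box, the published academic work behind this style of prediction describes self-exciting point processes borrowed from seismology: each crime temporarily raises the predicted rate of “near-repeat” crimes in the same area, the way an earthquake raises the short-term likelihood of aftershocks. As a rough, illustrative sketch of that idea only, with made-up parameter values and no claim about any vendor’s actual model:

```python
import math

def intensity(t, events, mu=0.2, alpha=0.6, omega=1.0):
    """Conditional intensity of a self-exciting (Hawkes) point process.

    mu is a constant background rate of events in one area; each past
    event at time t_i adds a contribution alpha * omega * exp(-omega * (t - t_i))
    that decays as time passes. All parameter values here are invented
    for illustration.
    """
    return mu + sum(
        alpha * omega * math.exp(-omega * (t - t_i))
        for t_i in events if t_i < t
    )

# Hypothetical past burglaries (in days) reported for one grid cell:
past = [1.0, 2.5, 3.0]

# The predicted rate spikes just after a cluster of events and
# decays back toward the background rate as time passes.
print(intensity(3.1, past))   # shortly after the cluster: elevated
print(intensity(10.0, past))  # a week later: near the background rate
```

The trouble flagged above is visible even in this toy: the model’s only input is past recorded events, so wherever the record reflects where police chose to look, the “prediction” sends them back to the same places.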
A news article in the Wisconsin State Journal last year indicates that crime analysts in Madison are using predictive policing software, although the details are vague and there isn’t much documentation that I could dig up. But in a city with well-documented and profound racial disparities in policing, we can only guess that this will reinforce those practices.
As Ingrid Burrington writes in a Nation article:
All of these applications assume the credibility of the underlying crime data—and the policing methods that generate that data in the first place. As countless scandals over quotas in police departments demonstrate, that is a huge assumption.
It’s easy to imagine how biased data could render the criminal-justice system even more of a black box for due process, replacing racist cops with racist algorithms.
The adoption of software solutions for policing, whether implicitly or explicitly, often carries the hope of bypassing the problem of structural racism in policing. But these software packages can only reinforce those racist assumptions if they rely on datasets constructed through policing practices. Madison Police Chief Koval rejects claims that his department is responsible for racial disparities in policing, deferring blame onto the larger community:
“On any given month, more than 98 percent of our calls for service are activated through the 9-1-1 Center,” he said in a statement. “Upon arrival, our officers are required by law to evaluate the behavior that is manifesting to see if it reaches legal thresholds required to ticket and/or arrest.”
But, as I argued in an earlier post, the practices of patrolling officers seem to reflect those same disparities (although the role of community members in perpetuating racism should also be taken seriously). This can only lead to a situation where the algorithms merely reflect and hone actually existing ideas and spatial imaginaries about who commits crimes and where. And I don’t doubt that these systems will produce arrest statistics to back up their claims, but whose interests are they serving?
As Ellen Huet observes in Forbes:
Police departments pay around $10,000 to $150,000 a year to gain access to these red boxes, having heard that other departments that do so have seen double-digit drops in crime. It’s impossible to know if PredPol prevents crime, since crime rates fluctuate, or to know the details of the software’s black-box algorithm, but budget-strapped police chiefs don’t care. Santa Cruz saw burglaries drop by 11% and robberies by 27% in the first year of using the software. “I’m not really concerned about the formulas,” said Atlanta Police Chief George Turner, who implemented the software in July 2013. “That’s not my business. My business is to fight crime in my city.”
I think it’s time to be concerned.