A consensual hallucination no more? The Internet as simulation machine
Fenwick McKelvey
Concordia University, Canada
Matthew Tiessen
Ryerson University, Canada
Luke Simcoe
Ryerson University, Canada; York University, Canada
European Journal of Cultural Studies, 2015, Vol. 18(4-5), 577–594. © The Author(s) 2015. Reprints and permissions: sagepub.co.uk/journalsPermissions.nav. DOI: 10.1177/1367549415584856. ecs.sagepub.com

Abstract
In this article, we investigate the macro-role being played – and played out – by digital, social and ‘new’ media today. We suggest that these media, facilitated by the Internet, can together be understood as a vast simulation machine that mediates and modulates everyday life to refashion what was once the ‘real world’ in its own image. Life in the ‘meatspace’ (the physical world) is most valuable, we suggest, not because it involves tweets, opinions or our desires but because these data produce useful and computable digital resources for finance, business and government. Today’s Big Data mining and predictive analytics allow for digital priorities to become non-digital realities, resulting – we suggest – in the algorithmically generated landscapes of today (and tomorrow). The imperatives driving today’s Internet and mobile technology have more to do with making the world computationally comprehensible than with the facilitation of free expression, open markets or open communication. We discuss the conditions created by these digital simulation machines as well as emerging opportunities for subversion and resistance.
Keywords
Big data, data mining, Internet, prediction, preemption, simulation, surveillance
Corresponding author:
Fenwick McKelvey, Department of Communication Studies, Concordia University, 7141 Sherbrooke Street West, Montreal, QC H4B 1R6, Canada.
Email: fenwick.mckelvey@concordia.ca

The technotronic era involves the gradual appearance of a more controlled society. Such a society would be dominated by an elite, unrestrained by traditional values. Soon it will be possible to assert almost continuous surveillance over every citizen and maintain up-to-date complete files containing even the most personal information about the citizen.
Brzezinski (1970: 97)
Introduction
Are we living in a simulated ‘reality’? The book Simulacron-3 by Daniel F. Galouye (1964), although it is a work of science fiction, asks a question that speaks to today’s digitally driven world. The titular city that serves as the book’s location unsettles its inhabitants. Over the course of the novel, it turns out that the city itself is a simulation running on a computer. Created by scientists, the simulated city functions as a market research experiment. The city/experiment is one giant public opinion poll, data-mining the minds of millions of people to help companies and governments make decisions. Although a cautionary tale written in a Cold War era defined by Command and Control military doctrines (Edwards, 1997) and a pursuit of bureaucratic efficiencies through Operations Research (Light, 2005), the book mirrors what has become of today’s Internet-centric moment, where algorithms and communication technologies are used to surveil individuals, preempt dissent and securitize society (Andrejevic, 2011; Bogard, 1996; Elmer, 2004; Massumi, 2007) while simultaneously informing market research and determining new modes of propaganda and promotional culture (Bernays, 1928; Wernick, 1991). In this article, we investigate the role digital, social and ‘new’ media play in contributing to a society wherein everyday life is increasingly becoming a simulation of networked technology’s tactics, logics and protocols. Our suggestion will be that analogue realities are increasingly being shaped according to the demands of the digital.
The Internet, according to this perspective, is no longer a space primarily for communication, but a machinic generator of simulation. We argue that the mediation of everyday life by the Internet, coupled with Big Data, machine learning, social media platforms, and predictive algorithms and analytics, has turned the Internet – once theorized as a virtual simulation of the ‘real world’ – into a simulation-generating machine. Yesterday’s cyberspace has become a computational simulation machine that creates models of reality through an ongoing process of data mining, all the while creating feedback loops coupled with the algorithms that ‘read’, process, modulate and determine our digital information (before doing it all over again). The real world, then, is becoming a ‘simulation’ not of some thing or force from the beyond, but a simulation (in the more conventional sense of mimicry) of the Internet. What seems certain is that the digital world and the computable realities it creates will increasingly eclipse the incalculability of analogue reality for no other reason than the seemingly commonsense assumption that the incalculable is less able to be controlled than that which can be quantified (we will deal more extensively with the incalculable later on). The simulation machine will drive the production of a non-digital life-world optimized and approved for today’s controlled, quantified and computationally comprehensible societies (Deibert, 2003; Deleuze, 1992; Fuchs et al., 2013; Genosko and Thompson, 2006; Haggerty and Ericson, 2000). Our
own collective contributions to social media, in such a world, are most valuable not because they express our opinions or our desires but because these quantifiable data produce a useful resource for the digital simulation machines of finance, business and government akin to the collective behaviour of Simulacron-3. Our Facebook profiles, our credit scores, our stock market results and our continuous partial attention have more to do with informing algorithms designed to manage plugged-in populations (in the service of capital, as the Critical Art Ensemble (2001) predicted in the 1990s) than with facilitating free expression or open communication. The primary purpose and product of digital media – and especially the algorithmic logics and platforms that organize and operationalize them – is the creation of an ongoing simulation in which communication and what appears to the majority of users as free expression are but feel-good side effects or externalities.
Today’s algorithms and computational logics tune and optimize digital communication through feedback loops designed to narrowly calibrate human desire and activity towards market- and finance-friendly equilibrium. Amazon.com’s preemptive product recommendations, for example, are only the beginning of tomorrow’s digitally mediated desiring, but are nonetheless contemporary expressions of both a logic (one that unquestioningly supports economic growth on a finite planet) and a trajectory (one that places utmost faith in the power of quantification, digitization, financialization and technology). In the face of these ongoing developments, we are compelled to ask: What is the end game? What are human desire and creativity being programmed to become? To paraphrase Deleuze, ‘What work is the digital simulation machine doing?’
The simulation machine, of course, still remains imperfect, not to mention overly dependent on a society that celebrates a self-profiling and participatory culture (Dean, 2005). Later we suggest, in light of the simulation machine’s state of deficiency, that idiocy and the non-sensical might be its most effective saboteurs. Indeed, when computers identify and compute everything in order to decipher and shape knowledge and information, what is to be done to subvert the data-mining machines other than to stop making sense?
We are what we tweet: the Internet and simulation
If digital technologies and the Internet have become generators of digitally inflected simulations, how can these simulations be understood? By simulation, we do not mean a reproduction or representation of reality ‘as it is out there’, nor do we mean simply a Baudrillardian ‘hyperreal’ wherein signs of the real substitute for or parody the real itself (Baudrillard, 2001). Rather, we refer to a digital reality that generates analogue ‘realities’ not by creatively constructing false narratives or representations but by manufacturing the realities of tomorrow by data-mining and diagnosing the desires of today. So while Baudrillard (1995) once suggested that we falsely ‘believe we exist in the original’ (p. 97), these days the simulation we are living in is in fact (an) original, that is, a digitally generated original that simulates and iterates the logics of the digital. This real digitally modulated reality gets created as our algorithmically driven machines churn through and (data-)mine the events, dispositions and desires of today in order to produce the seemingly inevitable events of tomorrow. So while Baudrillard (1995) observes that
our own reality ‘doesn’t exist anymore’ (p. 97), we want to point out that our own reality does still exist and that our machines recreate and reprocess it on our behalf 24 hours a day, 7 days a week, in real time. The digital simulation machine ‘overcodes’ (Deleuze and Guattari, [1980] 1987: 212), reconfigures and re-calibrates the non-digital or not-yet-digital realities it is designed to quantify and, in turn, to (re)construct computationally.
Today’s Internet, in other words, can be understood as a simulation machine insofar as it contributes to the production of an analogue world that is becoming primarily an expression of computation, one capable of being further modelled and made predictable. Put differently, our everyday lives in meatspace (the physical world) are a simulation (and a stimulation) of the computer model itself. The issue for our times is not that we Netizens – or the inhabitants of Simulacron-3 – need to grapple with the idea that we are living fake or fictitious virtual lives; rather, we must come to terms with how our online (and remaining off-line) activities feed the appetites of algorithmically driven machines designed to facilitate the expansion of profit and power by quantifying and calibrating our desires with their own.
A new standing reserve of bits?
It has been noted that digital mediation and communication transform social activity and information into a sort of Heideggerian (1977) ‘standing reserve’ – a resource-in-waiting in service of shortsighted and, indeed, reductive human desires and imaginaries. This position is articulated by Darin Barney (2000), who describes the Internet as a ‘standing-reserve of bits’ (p. 223). While Barney wrote his work before the emergence of social media and Big Data, he nevertheless foreshadows our present, one where ever-growing troves of human communication data reside in server farms as standing reserves of data to be put to work on behalf of simulation generation.
Social media platforms profit by mediating access to the standing reserve that is their real-time data flow, known colloquially as the ‘firehose’. The value of unmediated access to the firehose and the rich insights into the social world it grants Google, Facebook and Twitter might be rivalled, as Morozov (2013) wryly suggests, only by the National Security Agency’s (NSA) own PRISM program. Facebook, for instance, knows the value of its ability to process analogue phenomena (human relationships and desires) in order to feed it back into, and produce, the world as a simulation of the digital. Indeed, insofar as Facebook is conscious of its immense power, it is very selective about who can gain access to its data, permitting only select firms to tap in to its data flow. It has begun, for instance, sending weekly data reports to the top American television networks (the company publicly announced partnerships with ABC, NBC, Fox and CBS) (Rusli, 2013). More recently, Facebook has granted select media outlets such as Buzzfeed, CNN, NBC’s Today Show, BSkyB, Slate and Mass Relevance the ability to report Facebook’s news about the news from the source, allowing these outlets to see ‘how many people on Facebook talked about a popular subject, where it’s getting the most buzz, whether it’s most popular among males or females, and with which age group’ (Osofsky, 2013: n.p.). Speculation abounds that Facebook opened its data to compete with Twitter, which long ago saw its firehose of tweets as a source of profit. Before Facebook, Twitter partnered with Nielsen ratings to replace Nielsen’s older audience simulation technology (see Streeter,
1996) with a new Nielsen Twitter TV Rating (Nielsen Company, 2012). Twitter, however, plays with a less exclusive client list than Facebook. Nonetheless, data licensing accounted for about 10 percent of Twitter’s first-quarter 2014 revenue of US$243 million (Twitter, 2014). Although small compared to its complementary advertising revenue, Twitter continues to invest in and expand this side of its business, which serves to operationalize social media content by regarding opinions expressed on Twitter as potentially opinion-making and opinion-shaping in the ‘real’ world.
Similarly, a secondary market of data intermediaries now sells access to the simulation machine as manageable and more affordable data feeds (Helmond, 2014). Gnip, one of the top four data providers, offers a product called Firehose that aggregates the data flows of Twitter as well as Tumblr, Foursquare and the comment engine Disqus, while its Data Collector program merges public application programming interfaces (APIs) such as YouTube, Facebook and Reddit. Datasift, one of its three competitors in the market, offers data aggregation of over 22 different sources for a monthly fee (as low as US$150 per month in 2011) or by data processing units (DPUs) that roughly break down to US$0.10 per 1000 tweets (Benson, 2011; Datasift, 2014). It is notable that these firehoses extend into the past as well as the present. Both Gnip and Datasift offer historical services – the ability to query yesterday’s data flows – for a price. Insofar as these companies rely on clean commercial feeds of data, the whole Internet becomes simply data to be mined – or, as Barney suggests, a standing reserve of bits – to be put to use by those seeking to shape opinions and preemptively mould perspectives in the future.
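To give a concrete sense of this pricing model, the following toy Python sketch estimates what mining a modest tweet stream might cost at the DPU rate cited above (roughly US$0.10 per 1000 tweets). The stream volume and duration are our own hypothetical figures for illustration, not drawn from either company’s documentation:

```python
# Toy cost estimate for licensed access to a social media 'firehose'.
# The per-tweet rate comes from the figure cited above (Benson, 2011);
# the stream volume below is a purely hypothetical example.

COST_PER_1000_TWEETS_USD = 0.10  # Datasift's cited DPU rate, circa 2011

def monthly_cost(tweets_per_day: int, days: int = 30) -> float:
    """Estimated cost of processing a tweet stream at the cited rate."""
    total_tweets = tweets_per_day * days
    return total_tweets / 1000 * COST_PER_1000_TWEETS_USD

# A hypothetical brand-monitoring stream of 200,000 tweets per day:
cost = monthly_cost(200_000)
print(f"US${cost:,.2f} per month")  # 6,000,000 tweets -> US$600.00
```

At these prices, a workable slice of the standing reserve costs less per month than a single billboard, which is part of why the intermediary market described above could grow so quickly.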
The real money, however, seems to be in the more covert world of data analysis or what we might call simulectronics, the term Galouye used to describe the technology in Simulacron-3. Helmond (2014) suggests that social media data intermediaries ‘add the bling’ to their data through the enhanced use of metadata, geolocation protocols and other analysis/analytics services. Gnip might augment the feed, but another competitor, Dataminr (2014), offers clustering of data around geolocation or co-occurrence to offer ‘real-time situational awareness’ and ‘historical comparison’. These intermediaries, however, compete not so much with one another as with the numerous firms offering social media analytics, sentiment analysis and predictive analytics (cf. Andrejevic, 2011).
The horizons of possibility that stem from the world of algorithmic mining of the data flow are described by William Gibson, who ends his novel Zero History with antagonist-turned-protagonist Hubertus Bigend able to predict the future using what he calls the ‘order flow’ – an actionable rendering of reality:
It’s the aggregate of all the orders in the market. Everything anyone is about to buy or sell, all of it. Stocks, bonds, gold, anything. If I understood him, that information exists, at any given moment, but there’s no aggregator. It exists, constantly, but is unknowable. If someone were able to aggregate that, the market would cease to be real ... Because the market is the inability to aggregate the order flow at any given moment. (Gibson, 2011: 177)
In the book’s final pages, Bigend confesses that he needs only the briefest lead on the present – ‘seven seconds, in most cases’ – to construct his desired financial future. Knowing ‘seven seconds’ ahead is all anyone needs to eke out a profit. It is unclear whether an exact number has been found in the real world, but it is clear that data
intermediaries sell access to today’s order flows just like high-speed algorithmic traders buy the right to execute trades for banks so that they can know the order flow in advance (Lewis, 2014). Of course, information intermediaries and predictive analytics firms, by having access to their own order flows, can also gain predictive and preemptive advantages when it comes to shaping tomorrow’s simulations. To what extent does the slight gap between collection and dissemination enjoyed by data brokers give them foreknowledge of the future before it happens (or is made to happen)?
Simulectronics, control and recorded futures
The collective activity of humanity serves as ‘standing reserve’ in order to provide the data that informs the decision-making processes of algorithmic simulation systems such as high-frequency trading, threat mitigation or aggregated news services – systems whose outputs in turn define the worlds we will wake up to tomorrow. These simulectronics are owned by individuals and institutions who wield global power and control: banks, corporations, governments. Stock trends, news events and even global conflicts become predictable in the simulation, thanks to the simulectronics (Brandom, 2014). Consider the recent policy document entitled Big Data: Seizing Opportunities, Preserving Values (Podesta et al., 2014), created by the Office of the President of the United States, which attempts to begin the discussion that will lead to future policy directions designed to pursue digitally generated profits at the expense of – if history is our guide – what once qualified as privacy (all the while insisting that values will be preserved rather than produced). Or consider a similar policy document entitled National Strategy for Trusted Identities in Cyberspace (Grant, 2011), created for the White House in the United States. This document lays the groundwork for people to access the Internet using what amounts to an Internet ID card that will allegedly facilitate online security while, it goes without saying, also undermining the last vestiges of online anonymity (if it can still be said to exist post-Snowden).
Big Data and pervasive online IDs are only two of the many ways the simulation machine is closing the door to on- and off-line activities that operate independently of simulation system inputs. Consider also the ways Google’s investment in self-driving automobiles will give rise to persistently trackable mobility, or the ways Google Glass and similar products will pave the way for increased life-logging, or Google’s acquisition of NEST, a company that develops ‘smart’ thermostats and smoke alarms that register how people use energy and more by tracking its users (Wohlsen, 2014). The drive towards an Internet of Things means plugging ever more desires and inputs into the simulation machine. If houses, cars and even cities can sense (if not feel), then could their computable responses improve the simulation? We will only find out in the status updates of the smart homes, objects and businesses of the future.
To generate simulation is to manufacture a kind of control space (Deleuze, 1992). Access to the Internet’s flows drives the algorithmic and computational functions that ensure the future prosperity of the status quo and the control that comes with it. One only needs to look at the constant A/B testing taking place at Google, which involves running randomized experiments that compare two variants of a design. Google runs experiments on its live user base to find the optimal configurations of its interface and search results for corporate
goals. When Google released its Website Optimizer, a public version of its internal A/B testing tool that helps web developers refine their websites in pursuit of higher web traffic, anyone could experiment with their own little cog in the simulation machine. Using tools of this type – in its case the competing Optimizely product – the Obama campaign found that a black and white image of the Presidential family outperformed its initial splash page image by 13.1 percent. A/B testing is just one example of the ways the Internet both consumes data and creates influential digital products – and simulations – in near real time (Christian, 2012), creating a state of constant tweaking that Neff and Stark (2004) describe as permanently beta. Control through simulation functions as an immanent order, calibrating communication to produce an informational present, a predictable future and an infinitely mine-able past. In his social science fiction, Bogard (1996) states that ‘with simulation, sight and fore-sight, actual and virtual begin to merge’ (p. 76). We would add hind-sight to this list. In this way, we suggest the Internet as simulation machine is both forward looking (anticipatory) and retrospective (and even retroactive).
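The A/B testing logic described above can be sketched in a few lines of Python: visitors are randomly assigned to one of two variants, conversions are tallied, and the relative lift of one variant over the other is computed. The visitor and conversion counts below are illustrative numbers loosely modelled on the reported 13.1 percent improvement, not the campaign’s actual data:

```python
import random

# Minimal A/B test sketch: randomly assign visitors to variant A or B,
# record conversions, and report the relative lift of B over A.
# All figures here are illustrative, not real campaign data.

def assign_variant() -> str:
    """Randomly split incoming traffic between the two variants."""
    return random.choice(["A", "B"])

def relative_lift(conversions_a: int, visitors_a: int,
                  conversions_b: int, visitors_b: int) -> float:
    """Percentage improvement of variant B's conversion rate over A's."""
    rate_a = conversions_a / visitors_a
    rate_b = conversions_b / visitors_b
    return (rate_b - rate_a) / rate_a * 100

# Hypothetical results in which variant B converts 13.1% better than A:
lift = relative_lift(conversions_a=1000, visitors_a=50_000,
                     conversions_b=1131, visitors_b=50_000)
print(f"Variant B lift: {lift:.1f}%")  # Variant B lift: 13.1%
```

The point of the sketch is how little machinery is needed: a coin flip, two counters and a division are enough to turn a live user base into an ongoing experiment.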
Processes of data mining and indexing work to make actual an endlessly revisitable digital past composed of raw data, able to be re-calculated and refined in order for those in control of the simulation machine to guide us towards desired future outcomes. Consider the specifications of the NSA’s new data centre in Utah, known as the Intelligence Community Comprehensive National Cybersecurity Initiative Data Center, which requires that its data-storage capacity exceed a billion gigabytes. Can there be any doubt that such storage capacity is being designed to archive and mine the data collection of the entire Internet in raw form in order, in turn, to put it to work and make it useful (Bamford, 2012)? Creating a malleable past and future also happens outside the NSA’s data centre through more affordable commercial options like ISC8’s NetFalcon, which collects all traffic flowing on a network to create localized indices of the past (ISC8, 2013). These digital networks and devices, we argue, implicitly operate together to calibrate and re-calibrate the past based on what gets sold to the public in the present.
Alongside their more covert signals intelligence, law enforcement and national security agencies have begun investing in open-source intelligence (OSINT), using Internet data for security purposes. BlueJay, a tool for law enforcement to monitor Twitter, buys access to Twitter’s firehose to offer police a ‘Twitter crime scanner’ through which they can observe their city via social media. Could these scanners have played a role in the predictive and preemptive policing initiatives during recent political protests in Chicago, New York, San Francisco or Seattle (Bond-Graham and Wednesday, 2013; Economist, 2013; Elmer and Opel, 2008; Stroud, 2014)?
Even these simulectronics lag behind the real bleeding edge of the predictive analytics industry, where Google, alongside the Central Intelligence Agency, has made major investments in an industry now valued in the billions (Albergotti, 2013; Shachtman, 2010). Predictive analytics companies like Recorded Future and Palantir mine the data provided by the Internet before selling and monetizing predictions. Recorded Future, for example, has patented a Temporal Analytics™ Engine that attempts to predict the future using a combination of algorithmic analysis that assesses the sentiments, spaces and times of the data flow (Truvé and Ahlberg, 2013). While these firms might differ in their algorithmic systems, they all depend on access to the standing reserve of the Web produced by its users.
So what else can be done with these simulectronics? Finance capital has certainly found a use for these flows of actionable data. These days over 80 percent of stock trading on Western financial markets is done by computer algorithms that make decisions in nanoseconds (Tiessen, 2013). These decisions to trade result not from human intuition (let alone discernment of such fuddy-duddy things as ‘value’), but from the ‘logic’ embedded in these algorithms; as they feed upon a variety of information sources as inputs, they output decisions about what to buy and sell. The Internet has already been wired as an input for these decisions. Indeed, the BBC reports that Google searches can predict stock market movements (BBC News, 2011). The increasing role of online data in the world of finance was on display a few years ago when, in 2008, shares of United Airlines plummeted after the Google News aggregator, in concert with a Bloomberg newsletter, mistakenly categorized a 6-year-old ‘news’ story as current (Zetter, 2008). The story discussed the insolvency of the airline – long resolved – and investors (or more likely machines) reacted promptly by dumping its stock. More recently, hackers breached the Twitter account of the Associated Press and tweeted that President Obama had been attacked. Seconds after the tweet, the Dow Jones lost 144 points before returning to normal 10 minutes later. This faux-crisis could certainly have been profitable for those able to take advantage of this digitally determined instant (CBC News, 2013). Indeed, such events do and will have real-world effects as the digital simulation machine persists in shaping the analogue world it seeks to determine.
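The decision loop sketched above – news feeds in, buy/sell orders out – can be caricatured in a few lines. This is a deliberately naive Python sketch of the logic, not a model of any real trading system; the keyword lists and thresholds are invented for illustration only:

```python
# Naive caricature of a news-driven trading rule: score a headline by
# counting invented 'positive' and 'negative' keywords, then emit a
# buy/sell/hold decision. Real systems are vastly more complex; the
# word lists below are hypothetical, chosen purely for illustration.

NEGATIVE = {"insolvency", "bankruptcy", "attack", "lawsuit", "crash"}
POSITIVE = {"profit", "growth", "record", "upgrade", "beat"}

def decide(headline: str) -> str:
    """Map a headline to a trading decision via crude keyword counts."""
    words = {w.strip(".,").lower() for w in headline.split()}
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "BUY"
    if score < 0:
        return "SELL"
    return "HOLD"

# The 2008 United Airlines incident in miniature: a stale insolvency
# story resurfaces and the machine reacts as if it were news.
print(decide("United Airlines insolvency filing"))  # SELL
```

A rule this crude has no way of knowing that the story is six years old, which is precisely the failure mode the United Airlines and Associated Press incidents exposed.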
These calibrations of past, present and future actualize both the fantasies of many computer scientists and media theorists and the nightmares of science fiction authors (Edwards, 1997; Holmes, 2007). Digital computer networks once promised stability through simulation that Marshall McLuhan gestured towards in his famed Playboy inter- view from 1969:
There’s nothing at all difficult about putting computers in the position where they will be able to conduct carefully orchestrated programming of the sensory life of whole populations. I know it sounds rather science-fictional, but if you understood cybernetics you’d realize we could do it today. The computer could program the media to determine the given messages a people should hear in terms of their over-all needs, creating a total media experience absorbed and patterned by all the senses. We could program five hours less of TV in Italy to promote the reading of newspapers during an election, or lay on an additional 25 hours of TV in Venezuela to cool down the tribal temperature raised by radio the preceding month. By such orchestrated interplay of all media, whole cultures could now be programed in order to improve and stabilize their emotional climate, just as we are beginning to learn how to maintain equilibrium among the world’s competing economies. (McLuhan quoted in Norden, 1969)
Computers, McLuhan observed, had the potential to observe and intervene in the global emotional Zeitgeist. Digital simulation would blanket the earth, keeping us warm and cozy in the face of an unstable atmosphere. This promised ‘closed world’ where computers render the world manageable and under control has been refracted through countercultures and cybercultures, as both digital utopia and dystopia (Edwards, 1997). The unactualized histories of the cybernetic dreams of the 1960s have resulted in similar ideals persisting in the modern age (Turner, 2006). Personal computers, it is commonly held, allow humans to express themselves more freely. We won’t argue with that claim,
but let’s not forget that this ‘free expression’ has so far been allowed to persist primarily to nourish data aggregators and simulators in the pursuit of profit and power.
We speculate that social and financial exclusion and asymmetries of power will intensify as the Internet succeeds in making its algorithmically sorted simulations ‘real’ (cf. boyd and Crawford, 2012). As the promise and hype of ‘Big Data’, for example, demonstrate, each iteration of the Web further disavows uncertainty, contributing to an ‘evidence-based’ and data-driven simulation machine that, increasingly, will overcode our everyday and digitally simulated lives. Those inputs that cannot be aggregated will be removed or ignored. There is little hope or need for the system to self-correct should things be ‘wrong’. Those invested in the algorithmic production of simulation will not worry about the outcome so much as the probability of its realization. A future that is unequal but knowable to the privileged will, we fear, be regarded as better than an unknowable – or opaque – future with freedom, equality, privacy and equitable human flourishing as its goal (Andrejevic, 2002; Fuchs, 2010; Petersen, 2008; Rey, 2012; Terranova, 2000).
It seems, then, that at some point – a generation or two from now – the inherently binary logics and realities of the digital world will pave over any previous non-digitally simulated reality and so lead to ‘reality’ being merely an always differentiating and responsive expression or simulation of self-referential digital (or perhaps quantum) feedback loops (Aaronson, 2014). Today, we actively enjoy our participation in this fledgling simulation through free-and-friendly services that allow us to connect, surf, gain access, seek fame and pursue fortune. These are ostensibly enabling services that allow us to ‘tweet’ (Twitter), that help us ‘connect’ with friends (Facebook) and – it goes without saying – that are always represented and branded as friendly and never ‘evil’ (Google). The systems function not only to produce profits in the here and now but also to maintain and intensify the accuracy of the simulation while increasing the likelihood of its probabilities. Companies like Google or Facebook, then, have less desire to create digital commodities (Mosco, 1996) than to mediate the entire experience, to ensure the predictability of their user base, to sell products and to offer up advertising. French philosopher Philippe Mengue (2013) understands tactics such as Google’s or Facebook’s well and reminds us that control controls best when it modulates its subjects with a smile and grants them new freedoms and opportunities while, at the same time, keeping its less-than-friendly designs imperceptibly out of view and in the shadows. He points out,
Contrary to the repressive hypothesis, the power of control cannot be reduced to a negative mechanism, it does not consist in saying no, in declaring what is forbidden. Instead of withdrawing a part of the production or drawing blood, control implies a positive mechanism that aims to protect life, increase its dynamism. Control seeks to manage things, to insert them in a set of utilities, and to appear to regulate them in the direction of the common good. (n.p.)
Moreover, Mengue (2013) adds, ‘it is not in spite of power that there are liberties but thanks to power that liberties exist and increase’ (n.p.). Indeed, it seems it is only through extraordinary acts of human determination – Edward Snowden provides us with the most recent notable example (Greenwald et al., 2013) – that power’s imperceptible modes of appearing benevolent while applying (digital) control are made visible, that its modes of granting ‘liberties’ are revealed.
But despite these rather ominous developments in the area of simulation systems, we would be remiss if we failed to recall that today’s technologies of digital mediation offer as many opportunities as they do risks. Nonetheless, since the risks remain significant, our job is not simply to pass judgement on them but to engage in experimentation. Indeed, it is not enough to determine what today’s digital simulations are, but to begin experimenting with what they can do, what they are capable of becoming. As Deleuze (1992) explains about the societies of control he began to outline in the 1990s following Foucault’s examination of disciplinary societies:
There is no need to ask which is the toughest or most tolerable regime, for it’s within each of them that liberating and enslaving forces confront one another. There is no need to fear or hope, but only to look for new weapons. (p. 4)
In what follows, we suggest that nonsense, noise and even idiocy might be these new weapons. The web we have woven, however, might just be too tight to slip out of, resulting instead in our having to develop ‘a respect for the people who manipulate the threads’ if we are ever to find a gap in today’s and tomorrow’s digital fabric (Solzhenitsyn, [1967] 1991: 208–209).
Embracing the new trickster: vacuoles of non-communication and other circuit breakers
There is no doubt that our simulated present and future require new tactics of resistance (Tiessen and Elmer, 2013). In this regard, we are inspired – as are many others – by the work of Gilles Deleuze (1995) who presciently wrote,
Maybe speech and communication have been corrupted. They’re thoroughly permeated by money – and not by accident but by their very nature. We’ve got to hijack speech. Creating has always been something different from communicating. The key thing may be to create vacuoles of non-communication, circuit breakers, so we can elude control. (p. 175)
These, then, could be regarded as the tasks of contemporary resistance – not to communicate with the simulation system, but to elude control, to construct incomprehensible new realities and communities. Our suggestion is that it makes more sense to make no sense, since logical forms of antagonistic and subversive communication are merely inputs that feed both the bots making stock market decisions and the governments devising new modes of digital control. The transitory British collective known as The Deterritorial Support Group also draws on Deleuze, offering in its writings on Internet culture a perfect example of both these vacuoles of non-communication and the popularity of seemingly meaningless jokes or memes:
Internet memes originally functioned as subjects of the Internet hate machine – operating in a totally amoral fashion, where achieving ‘lulz’1 was the only aim. Within the past few years, memes have started to take on a totally different function, and what would have been perceived as a slightly pathetic bunch of bastards in the past are today global players in undermining international relations – namely in the complex interaction of Wikileaks with Anonymous, 4chan and other online hooligans.

There’s no coherent analysis to be had of this at the moment. However ‘lulz’ also demonstrate their potential as part of a policy of radical refusal to the demands of capital. When asked by liberals ‘Do you condone or condemn the violence of the [often private property destroying and occasionally violent] Black Bloc?’ we can only reply in unison ‘This cat is pushing a watermelon out of a lake. Your premise is invalid’. (Nesbitt, 2012: n.p.)
Adrian Chen (2013), a writer for the online news outlet Gawker, echoes this sentiment when he describes film director Harmony Korine’s absurd(ist) interview (technically an ‘Ask Me Anything’ or AMA) on the Internet-trend-following website Reddit: ‘It’s unclear if he’s on or off something but his typo-and-non-sequitur-filled performance in his AMA today was inspired. The only way to resist the insufferable PR machine is clogging it with pure nonsense’ (n.p.). Indeed, nonsense – or the incalculable – might be the new sabotage.
The usefulness of incalculable nonsense as a means of eluding control is extended by Mengue who, in his reading of Deleuze, suggests that in order to throw sticks in the spokes of systems of digital, corporate, political and financial control, we should all act a bit more idiotic. As Mengue (2013) suggests,
To escape control we should play the idiot. Perhaps the very idea will seem in itself idiotic. But we have to discover all that is implied by it. In order to resist we need a subject and since this cannot be the people or the proletariat, we must turn to a new form of subjectivity, one which has to be invented or constructed. What mode of subjectivation can we use? A mode in which Idiocy will be the principal. (n.p.)
The idiot is effective for Mengue because he or she makes room for indetermination and for the gaps in knowledge and judgement necessary to allow for the not-yet-imagined to occur, to allow – thanks to the idiot’s sheer ignorance, naivete, non-sensical sense-making and non-approved thinking – potential realities beyond the status quo’s perpetuation of the simulation system to emerge. Mengue (2013) notes that to counter control requires what he describes as ‘a politics of indeterminacy’ that serves as a non-causal condition capable of offering a chance to the unexpected which, by definition, ‘cannot be an object of decision and is excluded from any program or project’ (n.p.). Idiots, then, are those who allow themselves to be open to the indeterminate Void, to the becomings and affordances that exist beyond the present and outside of the simulated apparatus. This, for Mengue, is how new possibilities and realities can begin to take shape. As he suggests,
It is in encountering the Void or the Undetermined that control slips, skids, loses its grasp ... [I]t is on this indetermination that we must count ... to render possible the apparition of Something (which depends on other factors). So, ... the Deleuzian politics of event necessarily becomes a politics of indetermination, and cannot therefore be other than a politics of the Idiot, in a broad sense, since this character best incarnates the principle of indetermination. (Mengue, 2013: n.p.)


For Mengue, idiocy and indetermination are not the goal; rather, they are the strategy employed in the effort to open up a gap in the system and to allow for a sliver of sunshine through the simulation.
In our view, nonsense, or idiocy, clarifies what matters about Internet tricksters like the popular 4chan online message boards, typically used by anonymous members of a plethora of subcultures to post endless streams of randomness, subversive missives and political interventions and commentary. Or like LulzSec – the hacking collective that targeted companies like Sony and News Corporation in 2011, leaving leaked user information, published security vulnerabilities and sarcastic press releases in its wake (Coleman, 2012). The creator of 4chan, Christopher Poole, suggests that the site matters because it allows for an anonymity not possible on platforms like Facebook. As Jessica Beyer (2014) attests, the anonymous design of the 4chan platform and the ironic, sarcastic, non-sensical and subversive uploads and contributions of its users can be regarded as radical because the content on 4chan does not easily lend itself to the sense-making strategies of data mining and monitoring. For example, 4chan’s refusal to archive itself results in an online social space that consciously resists simulation (Knuttila, 2011). Similarly, because the site is (relatively) unmoderated, anyone who wishes to scrape it – to render it digestible to the simulation machine – risks downloading illegal content.
The culture of 4chan works alongside its code to confound visitors, both human and nonhuman (Simcoe, 2012). The community actively resists categorization, and the only memes or behaviours that gain significant traction are those that offer the 4chan community sufficient room for ambiguity and playful interpretation. Meaning is meaningless on 4chan; humour and a logic of the Lulz that favours ‘distanced irony and critique’ (Milner, 2013) are its principal currency. To date, even the most advanced sentiment-analyzing algorithms have had difficulty deciphering jokes and sarcasm (Reyes et al., 2012). Also relevant to our argument is the way 4chan users deploy their penchant for nonsense in a tactical fashion. The constant stream of obscenity, inside jokes and non sequiturs they insist upon – from graphic images of Goatse2 to an inexplicable fondness for Rick Astley music videos – has rendered the site one of the few large online forums unable to turn a profit. Despite a base of 22 million monthly users, Poole has struggled to find advertisers willing to have their products appear next to the aforementioned Goatse; indeed, he once even bragged that 4chan loses money (Wortham and Gallagher, 2012). Outside the confines of the site, trolls lurk under the Internet’s myriad bridges, circulating false news stories, hijacking online polls and marketing campaigns and generally making online information less reliable – or at least more difficult to parse. And when those kernels of spurious data get fed into the algorithmic feedback loop, who knows what Lulzy futures they might bring into being?
These idiot-friendly observations are hardly specific to 4chan. In her work on the massive multiplayer virtual world Second Life, Burcu Bakioglu (2009) describes how ‘griefers’ rejected the game’s built-in logic of commodification through in-game raids and hacks that spawned hordes of offensive sprites, ranging from flying penises to swastikas. This type of ‘grief-play’ jammed Second Life’s signification system, literally causing sims to crash, and disrupted its virtual economy by preventing or delaying the micro-transactions depended upon by Second Life’s owners, Linden Labs. Such actions suggest new horizons of possibility for ‘acting out’ both on- and off-line insofar as trolls and even griefers can interfere with the logics driving simulation and computational sense-making.
Conclusion
In conclusion, we propose that the relative ongoing openness of today’s Internet could itself be understood as a giant global experiment or honeypot. This honeypot has been designed – for now at least – to allow human desires to flow freely (desires to connect, curious desires, depraved desires) in order to facilitate the accumulation of data prior to the algorithmic imposition of more intense forms of digital stratification, modulation and control (which will, we suspect, be designed to be as imperceptible as possible or will, at least, be imposed piece by piece following moments of crisis). From our perspective, it wouldn’t be too far-fetched to suggest that we’ve become more valuable to the Internet and its scanbots as aggregate (meta)data inputs than we ever were as consumers of banner ads. As Bogard (1996) imagined before social media ever existed, ‘more and more, you are what you enter into a computer, you are the electronic transactions you make, you are public opinion, you are the endless forms you fill out’ (p. 24). The ‘free’ services offered by Twitter, Facebook and Google – admittedly dependent on immaterial or virtual labour (Gill and Pratt, 2008; Langlois et al., 2009; Lazzarato, 1996; Terranova, 2004) – synthesize the desires of the powerful to convert life into information and our own desires for self-expression into data to be repackaged, resold and re-consumed. It is as if Marx’s theories of alienation have been resolved since we are so connected to our work – not only at the office but our work online – that our menial everyday labours and our crafting of our digital identities have become our primary source of subjectivity. After all, these days it often seems as though there is no alternative.
We are left wondering, despite our belief in the potential of idiocy, whether there can be any effective popular resistance to today’s – and tomorrow’s – algorithmically modulated simulation. Indeed, any resistance runs the risk of simply being filtered out. Already, social media platforms have hired an army of content moderators in the Philippines and other developing nations to remove questionable content online (Roberts, 2014). We can imagine a future in which 4chan is either removed from the web to ensure that an Internet with real names offers better simulation-supporting information, or simply ignored altogether by algorithms programmed to avoid websites that traffic in troublemakers, perverts and weirdos. Moreover, will the masses see the sense in strategic and tactical applications of nonsense? Does free, simulation-resistant communication matter enough to the world’s publics that they will stop communicating (or at least, stop making sense)? And if the Internet fails to simulate effectively, will the externalities of free communication endure? These are all questions for which there are no answers on the immediate horizon. What is known, however, is that: 1) the Internet’s apparently rhizomatic tendrils are categorized and controlled by simulectronics; 2) efforts to squeeze data out of the Internet’s users in pursuit of profit and preemptive forms of control are increasingly being accomplished using digital computational capacities calibrated through analogue-to-digital feedback loops; and 3) digital simulations will continue to move beyond the digital interface into our own very real yet simulated worlds.
Acknowledgements
All authors contributed equally to this work.
Funding
This research was supported by the Social Sciences and Humanities Research Council of Canada.
Notes
1. Lulz is a pluralized form of LOL or Laugh Out Loud. Just for the lulz roughly translates from Internet slang as just for laughs.
2. A photo of a naked man stretching his anus with both hands is one of the most well-known ‘shock’ images on the Internet.
References

Aaronson S (2014) Is there anything beyond quantum computing? The Nature of Reality – NOVA Next | PBS. Available at: http://www.pbs.org/wgbh/nova/blogs/physics/2014/04/is-there-anything-beyond-quantum-computing/ (accessed 12 May 2014).

Albergotti R (2013) Big data, big dollars: Palantir valued at $9 billion. The Wall Street Journal, 6 December. Available at: http://online.wsj.com/news/articles/SB10001424052702303497804579240501078423362 (accessed 8 May 2014).

Andrejevic M (2002) The work of being watched: Interactive media and the exploitation of self-disclosure. Critical Studies in Media Communication 19(2): 230–248.

Andrejevic M (2011) The work that affective economies does. Cultural Studies 25(4–5): 604–620.

Bakioglu BS (2009) Spectacular interventions of Second Life: Goon culture, griefing, and disruption in virtual spaces. Journal of Virtual Worlds Research 1(3). Available at: http://journals.tdl.org/jvwr/index.php/jvwr/article/view/348 (accessed 12 May 2014).

Bamford J (2012) The NSA is building the country’s biggest spy center (watch what you say). WIRED. Available at: http://www.wired.com/2012/03/ff_nsadatacenter/all/1 (accessed 9 May 2014).

Barney D (2000) Prometheus Wired: The Hope for Democracy in the Age of Network Technology. Chicago, IL: University of Chicago Press.

Baudrillard J (1995) The virtual illusion: Or the automatic writing of the world. Theory, Culture & Society 12(4): 97–107.

Baudrillard J (2001) Jean Baudrillard: Selected Writings (ed. M Poster). Stanford, CA: Stanford University Press.

BBC News (2011) Twitter predicts future of stocks. BBC News, 6 April. Available at: http://www.bbc.co.uk/news/technology-12976254 (accessed 12 May 2014).

Benson J (2011) Q&A: Nick Halstead on mining Twitter’s firehose with Datasift. Wired UK. Available at: http://www.wired.co.uk/news/archive/2011-10/12/nick-halstead-mining-twitter (accessed 7 May 2014).

Bernays EL (1928) Manipulating public opinion: The why and the how. American Journal of Sociology 33(6): 958–971.

Beyer J (2014) Expect Us: Online Communities and Political Mobilization. New York: Oxford University Press.

Bogard W (1996) The Simulation of Surveillance: Hypercontrol in Telematic Societies. Cambridge: Cambridge University Press.

Bond-Graham D and Winston A (2013) All tomorrow’s crimes: The future of policing looks a lot like good branding. SF Weekly, 30 October. Available at: http://www.sfweekly.com/2013-10-30/news/predpol-sfpd-predictive-policing-compstat-lapd/ (accessed 8 May 2014).

boyd d and Crawford K (2012) Critical questions for big data: Provocations for a cultural, technological, and scholarly phenomenon. Information, Communication & Society 15(5): 662–679.

Brandom R (2014) This algorithm can predict a revolution. The Verge. Available at: http://www.theverge.com/2014/2/12/5404750/can-a-database-predict-a-revolution (accessed 9 May 2014).

Brzezinski Z (1970) Between Two Ages: America’s Role in the Technetronic Era. New York: Viking Press.

CBC News (2013) Fake White House bomb report causes brief stock market panic. CBC News. Available at: http://www.cbc.ca/1.1352024 (accessed 12 May 2014).

Chen A (2013) Spring Breakers director Harmony Korine just did the best Reddit AMA ever. Gawker. Available at: http://gawker.com/5991995/all-reddit-amas-should-be-like-spring-breakers-directory-harmony-korines (accessed 12 May 2014).

Christian B (2012) The A/B test: Inside the technology that’s changing the rules of business. WIRED. Available at: http://www.wired.com/2012/04/ff_abtesting/ (accessed 13 May 2014).

Coleman G (2012) Phreaks, hackers, and trolls and the politics of transgression and spectacle. In: Mandiberg M (ed.) The Social Media Reader. New York: NYU Press, pp.99–119.

Critical Art Ensemble (2001) Digital Resistance: Explorations in Tactical Media. Brooklyn, NY: Autonomedia.

Dataminr (2014) Dataminr’s event detection technology. Available at: http://www.dataminr.com/technology/ (accessed 12 May 2014).

Datasift (2014) Understanding billing. DataSift Developers. Available at: http://dev.datasift.com/docs/billing (accessed 12 May 2014).

Dean J (2005) Communicative capitalism: Circulation and the foreclosure of politics. Cultural Politics 1(1): 51–74.

Deibert RJ (2003) Black code: Censorship, surveillance, and the militarisation of cyberspace. Millennium: Journal of International Studies 32(3): 501–530.

Deleuze G (1992) Postscript on the societies of control. October 59(1): 3–7.

Deleuze G (1995) Negotiations, 1972–1990. New York: Columbia University Press.

Deleuze G and Guattari F ([1980] 1987) A Thousand Plateaus: Capitalism and Schizophrenia. Minneapolis, MN: University of Minnesota Press.

Edwards PN (1997) The Closed World: Computers and the Politics of Discourse in Cold War America. Cambridge, MA: MIT Press.

Elmer G (2004) Profiling Machines: Mapping the Personal Information Economy. Cambridge, MA: MIT Press.

Elmer G and Opel A (2008) Preempting Dissent: The Politics of an Inevitable Future. Winnipeg, MB: Arbeiter Ring Publishing.

Fuchs C (2010) Labor in informational capitalism and on the Internet. The Information Society: An International Journal 26(3): 179–196.

Fuchs C, Boersma K, Albrechtslund A, et al. (2013) Internet and Surveillance: The Challenges of Web 2.0 and Social Media. New York: Routledge.

Galouye D (1964) Simulacron-3. New York: Bantam Books.

Genosko G and Thompson S (2006) Tense theory: The temporalities of surveillance. In: Lyon D (ed.) Theorizing Surveillance: The Panopticon and Beyond. Cullompton: Willan Publishing, pp.123–139.

Gibson W (2011) Zero History. New York: Berkley Books.

Gill R and Pratt A (2008) In the social factory? Immaterial labour, precariousness and cultural work. Theory, Culture & Society 25(7–8): 1–30.

Grant J (2011) National strategy for trusted identities in cyberspace: Enhancing online choice, efficiency, security, and privacy. Washington, DC: Executive Office of the President. Available at: http://www.whitehouse.gov/sites/default/files/docs/big_data_privacy_report_may_1_2014.pdf (accessed 12 May 2014).

Greenwald G, MacAskill E and Poitras L (2013) Edward Snowden: The whistleblower behind the NSA surveillance revelations. The Guardian, 10 June. Available at: http://www.theguardian.com/world/2013/jun/09/edward-snowden-nsa-whistleblower-surveillance (accessed 12 May 2014).

Haggerty KD and Ericson RV (2000) The surveillant assemblage. The British Journal of Sociology 51(4): 605–622.

Heidegger M (1977) The Question Concerning Technology and Other Essays. New York: Harper & Row.

Helmond A (2014) Adding the bling: The role of social media data intermediaries. Culture Digitally. Available at: http://culturedigitally.org/2014/05/adding-the-bling-the-role-of-social-media-data-intermediaries/ (accessed 8 May 2014).

Holmes B (2007) Future map, or: How the cyborgs learned to stop worrying and love surveillance. Available at: http://brianholmes.wordpress.com/2007/09/09/future-map/ (accessed 8 May 2014).

ISC8 (2013) Cyber NetFalcon – Big data analytics. Available at: http://www.isc8.com/products/cyber-netfalcon.html (accessed 12 May 2014).

Knuttila L (2011) User unknown: 4chan, anonymity and contingency. First Monday 16(10). Available at: http://firstmonday.org/ojs/index.php/fm/article/viewArticle/3665 (accessed 25 October 2014).

Langlois G, McKelvey F, Elmer G, et al. (2009) Mapping commercial Web 2.0 worlds: Towards a new critical ontogenesis. The Fibreculture Journal 14. Available at: http://fourteen.fibreculturejournal.org/fcj-095-mapping-commercial-web-2-0-worlds-towards-a-new-critical-ontogenesis/

Lazzarato M (1996) Immaterial labour. Available at: http://www.generation-online.org/c/fcimmateriallabour3.htm

Lewis M (2014) Flash Boys: A Wall Street Revolt. New York: W.W. Norton & Company.

Light JS (2005) From Warfare to Welfare: Defense Intellectuals and Urban Problems in Cold War America. Baltimore, MD: Johns Hopkins University Press.

Massumi B (2007) Potential politics and the primacy of preemption. Theory & Event 10(2). Available at: http://muse.jhu.edu/journals/theory_and_event/v010/10.2massumi.html

Mengue P (2013) The idiot in societies of control. Theory & Event 16(3). Available at: http://muse.jhu.edu/journals/theory_and_event/v016/16.3.mengue.html (accessed 12 May 2014).

Milner RM (2013) Hacking the social: Internet memes, identity antagonism, and the logic of lulz. The Fibreculture Journal 22. Available at: http://twentytwo.fibreculturejournal.org/fcj-156-hacking-the-social-internet-memes-identity-antagonism-and-the-logic-of-lulz/ (accessed 25 October 2014).

Morozov E (2013) Let’s make the NSA’s data available for public use. Slate. Available at: http://www.slate.com/blogs/future_tense/2013/08/26/nsa_s_data_should_be_available_for_public_use.html (accessed 7 May 2014).

Mosco V (1996) The Political Economy of Communication: Rethinking and Renewal. Thousand Oaks, CA: Sage.

Neff G and Stark D (2004) Permanently beta: Responsive organization in the Internet era. In: Howard PN and Jones S (eds) Society Online: The Internet in Context. Thousand Oaks, CA: Sage, pp.173–188.

Nesbitt H (2012) Deterritorial Support Group. Dazed. Available at: http://www.dazeddigital.com/artsandculture/article/10260/1/deterritorial-support-group (accessed 12 May 2014).

Nielsen Company (2012) Nielsen and Twitter establish social TV rating. Available at: http://www.nielsen.com/us/en/press-room/2012/nielsen-and-twitter-establish-social-tv-rating.html (accessed 7 May 2014).

Norden E (1969) The Playboy interview: Marshall McLuhan. Playboy Magazine. Available at: http://www.cs.ucdavis.edu/~rogaway/classes/188/spring07/mcluhan.pdf (accessed 20 October 2014).

Osofsky J (2013) New tools for surfacing conversations on Facebook. Facebook Newsroom. Available at: https://newsroom.fb.com/news/2013/09/new-tools-for-surfacing-conversations-on-facebook/ (accessed 7 May 2014).

Petersen SM (2008) Loser generated content: From participation to exploitation. First Monday 13(3). Available at: http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/2141/1948 (accessed 20 October 2014).

Podesta J, Pritzker P, Moniz EJ, et al. (2014) Big data: Seizing opportunities, preserving values. Washington, DC: Executive Office of the President. Available at: http://www.whitehouse.gov/sites/default/files/docs/big_data_privacy_report_may_1_2014.pdf (accessed 12 May 2014).

Rey PJ (2012) Alienation, exploitation, and social media. American Behavioral Scientist 56(4): 399–420.

Reyes A, Rosso P and Buscaldi D (2012) From humor recognition to irony detection: The figurative language of social media. Data & Knowledge Engineering 74: 1–12.

Roberts S (2014) Behind the screen: The hidden digital labor of commercial content moderation. PhD Thesis, University of Illinois at Urbana-Champaign, Champaign, IL.

Rusli EM (2013) Facebook woos TV networks with data. WSJ Blogs: Digits. Available at: http://blogs.wsj.com/digits/2013/09/29/facebook-woos-tv-networks-with-more-data/ (accessed 7 May 2014).

Shachtman N (2010) Exclusive: Google, CIA invest in ‘future’ of web monitoring. WIRED. Available at: http://www.wired.com/2010/07/exclusive-google-cia/ (accessed 8 May 2014).

Simcoe L (2012) The Internet is serious business: 4chan’s /b/ board and the lulz as alternative political discourse on the Internet. Major research paper, Joint Graduate Program in Communication & Culture, Ryerson University and York University, Toronto, ON, Canada.

Solzhenitsyn A ([1967] 1991) Cancer Ward. New York: Random House.

Streeter T (1996) Selling the Air: A Critique of the Policy of Commercial Broadcasting in the United States. Chicago, IL: University of Chicago Press.

Stroud M (2014) The minority report: Chicago’s new police computer predicts crimes, but is it racist? The Verge. Available at: http://www.theverge.com/2014/2/19/5419854/the-minority-report-this-computer-predicts-crime-but-is-it-racist (accessed 8 May 2014).

Terranova T (2000) Free labor: Producing culture for the digital economy. Social Text 63 18(2): 33–58.

Terranova T (2004) Network Culture: Politics for the Information Age. Ann Arbor, MI: Pluto Press.

The Economist (2013) Don’t even think about it. The Economist. Available at: http://www.economist.com/news/briefing/21582042-it-getting-easier-foresee-wrongdoing-and-spot-likely-wrongdoers-dont-even-think-about-it (accessed 8 May 2014).

Tiessen M (2013) High-frequency trading and the centering of the (financial) periphery. The Trading Mesh. Available at: http://www.thetradingmesh.com/pg/blog/mtiessen/read/70969 (accessed 3 February 2014).

Tiessen M and Elmer G (2013) Editorial introduction: Neoliberal diagrammatics and digital control. Media Tropes 4(1): i–xvi.

Truvé S and Ahlberg C (2013) Information service for facts extracted from differing sources on a wide area network.

Turner F (2006) From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, and the Rise of Digital Utopianism. Chicago, IL: University of Chicago Press.

Twitter (2014) Twitter reports fourth quarter and fiscal year 2013 results. Available at: https://investor.twitterinc.com/releasedetail.cfm?ReleaseID=823321 (accessed 12 May 2014).

Wernick A (1991) Promotional Culture: Advertising, Ideology, and Symbolic Expression. Thousand Oaks, CA: Sage.

Wohlsen M (2014) What Google really gets out of buying Nest for $3.2 billion. WIRED. Available at: http://www.wired.com/2014/01/googles-3-billion-nest-buy-finally-make-internet-things-real-us/ (accessed 12 May 2014).

Wortham J and Gallagher DF (2012) XOXO: A festival of indie Internet creativity. Bits Blog. Available at: http://bits.blogs.nytimes.com/2012/09/18/xoxo-a-festival-of-indie-internet-creativity/ (accessed 12 May 2014).

Zetter K (2008) Six-year-old news story causes United Airlines stock to plummet – UPDATE: Google placed wrong date on story. WIRED. Available at: http://www.wired.com/2008/09/six-year-old-st/ (accessed 12 May 2014).
Biographical notes
Fenwick McKelvey is an Assistant Professor at the Department of Communication Studies at Concordia University. His research focuses on control and algorithmic media.
Matthew Tiessen is an Assistant Professor in the School of Professional Communication (ProCom) in the Faculty of Communication and Design (FCAD) at Ryerson University (Toronto) and a Research Associate at The Infoscape Research Lab: Centre for the Study of Social Media. In 2013, Dr Tiessen was awarded a SSHRC Insight Development Grant in the area of ‘Digital Economy’ to support his research on the social implications of algorithmically driven digital technologies.
Luke Simcoe is a journalist and independent scholar. He holds a joint MA in Communication & Culture from Ryerson and York universities. His research examined the online community of 4chan and the politics of Internet trolling.