How did the Cold War end? With a massive arms dismantlement program. Why wouldn’t a massive disarming be a solution to end Facebookistan?
By disarming, I do not mean throwing out stacks of artillery. Rather, I mean opening up information: make statistical user-data analysis available via an open API, and build tools that let people analyze the data themselves. That would empower anyone, anywhere to draw the kinds of conclusions that are currently available only to marketers with sufficiently deep pockets, giant social media networks, or other cloud giants. People could help one another determine what is evil and what is not.
Is this a realistic dream? I’d hope so. After all, open source didn’t just die out, even though it too was ridiculed at its birth. Sure, opening data does not mean opening up your entire company strategy. Convenient access, guaranteed availability, and similar features are things corporations will still pay for. But the core business changes slightly, the same way that open source and closed source businesses differ slightly: rather than building a business around selling other people’s data, give the data away for free and instead build a business around selling services and access to it.
In this digital age, where privacy is a commodity, where theoretical demands for privacy and actual practices do not coincide, and where anonymized statistical analyses are becoming more widespread than ever, is open data really that far away?
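To make the proposal concrete, here is a minimal sketch of the kind of tool such an open API could support: aggregate statistics over user records with small groups suppressed, a simple k-anonymity-style safeguard. Everything here — the field names, the mock data, and the threshold — is hypothetical illustration, not any platform’s actual API.

```python
# Sketch of a k-anonymous aggregate-statistics tool, the kind of open
# analysis layer proposed above. All names and the threshold are hypothetical.
from collections import Counter

K_THRESHOLD = 5  # suppress any group smaller than this to protect individuals

def anonymized_counts(records, field, k=K_THRESHOLD):
    """Return per-value counts for `field`, dropping groups smaller than k."""
    counts = Counter(r[field] for r in records)
    return {value: n for value, n in counts.items() if n >= k}

# Mock user data standing in for what an open statistics API might expose.
users = [{"region": "EU"}] * 7 + [{"region": "US"}] * 12 + [{"region": "NZ"}] * 2

print(anonymized_counts(users, "region"))
# The two NZ users form a group below the threshold, so that group is suppressed.
```

The design choice matters: by exposing only thresholded aggregates rather than raw records, anyone can run the same analyses marketers run today without the platform handing over identifiable data.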
Originally shared by +Jay Gordon
‘Ask an internet platform spokesperson why his or her firm made nearly any decision, and you’ll hear some variation on “to improve user experience.” But we all know that it’s only a certain kind of user experience that is really valued, and promoted. For Facebook to continue to meet Wall Street’s demands for growth, its user base must grow and/or individual users must become more “productive.” Predictive analytics demands standardization: forecastable estimates of revenue-per-user. The more a person clicks on ads and buys products, the better. Secondarily, the more a person draws other potential ad-clickers in–via clicked-on content, catalyzing discussions, crying for help, whatever–the more valuable they become to the platform. The “model users” gain visibility, subtly instructing by example how to act on the network. They’ll probably never attain the notoriety of a Lei Feng, but the Republic of Facebookistan gladly pays them the currency of attention, as long as the investment pays off for top managers and shareholders.
As more people understand the implications of enjoying Facebook “for free”–i.e., that they are the product of the service–they also see that its real paying customers are advertisers. As N. Katherine Hayles has stated, the critical question here is: “will ubiquitous computing be coopted as a stalking horse for predatory capitalism, or can we seize the opportunity” to deploy more emancipatory uses of it? I have expressed faith in the latter possibility, but Facebook continually validates Julie Cohen’s critique of a surveillance-innovation complex. The experiment fiasco is just the latest in a long history of ethically troubling decisions at that firm, and several others like it.
Unfortunately, many in Silicon Valley still barely get what the fuss is about. For them, A/B testing is simply a way of life. There are some revealing similarities between casinos and major internet platforms. As Rob Horning observes, “Social media platforms are engineered to be sticky . . . Like video slots, which incite extended periods of “time-on-machine” to assure “continuous gaming productivity” (i.e. money extraction from players), social-media sites are designed to maximize time-on-site, to make their users more valuable to advertisers . . . and to ratchet up user productivity in the form of data sharing and processing that social-media sites reserve the rights to.” That’s one reason we get headlines like “Teens Can’t Stop Using Facebook Even Though They Hate It.” There are sociobiological routes to conditioning action. The platforms are constantly shaping us, based on sophisticated psychological profiles.
The characteristics of Facebook’s model (i.e., exemplary) users in many ways reflect the constraints on the model users in the company–i.e., the data scientists who try to build stylized versions of reality (models) based on certain data points and theories. The Facebook emotion experiment is part of a much larger reshaping of social science. To what extent will academics study data-driven firms like Facebook, and to what extent will they try to join forces with its own researchers to study others?
Present incentives are clear: collaborate with (rather than develop a critical theory of) big data firms. As Zeynep Tufekci puts it, “the most valuable datasets have become corporate and proprietary [and] top journals love publishing from them.” “Big data” has an aura of scientific validity simply because of the velocity, volume, and variety of the phenomena it encompasses. Psychologists certainly must have learned something from looking at over 600,000 accounts’ activity, right?
The problem, though, is that the corporate “science” of manipulation is a far cry from academic science’s ethics of openness and reproducibility. That’s already led to some embarrassments in the crossover from corporate to academic modeling (such as Google’s flu trends failures). Researchers within Facebook worried about multiple experiments being performed at once on individual users, which might compromise the results of any one study. Standardized review could have prevented that. But, true to the Silicon Valley ethic of “move fast and break things,” speed was paramount: “There’s no review process. Anyone…could run a test…trying to alter peoples’ behavior,” said one former Facebook data scientist.
Why are journals so interested in this form of research? Why are academics jumping on board? Fortunately, social science has matured to the point that we now have a robust, insightful literature about the nature of social science itself[…] One of Isaac’s major contributions in that piece is to interpret the social science coming out of the academy (and entities like RAND) as a cultural practice: “Insofar as theories involve certain forms of practice, they are caught up in worldly, quotidian matters: performances, comportments, training regimes, and so on.” Government leveraged funding to mobilize research to specific ends. To maintain university patronage systems and research centers, leaders had to be on good terms with the grantors. The common goal of strengthening the US economy (and defeating the communist threat) cemented an ideological alliance.
Government still exerts influence in American social and behavioral sciences. But private industry controls critical data sets for the most glamorous, data-driven research. In the Cold War era, “grant getting” may have been the key to economic security, and to securing one’s voice in the university. Today, “exit” options are more important than voice, and what better place to exit to than an internet platform? Thus academic/corporate “flexians” shuttle between the two worlds. Their research cannot be too venal, lest the academy disdain it. But neither can it indulge in, say, critical theory (what would nonprofit social networks look like?), just as Cold War social scientists were ill-advised to, say, develop Myrdal’s or Leontief’s theories. There was a lot more money available for the Friedmanite direction economics would, eventually, take.
It is very hard to develop categories and kinds for internet firms, because they are so secretive about most of their operations. (And make no mistake about the current PR kerfuffle for Facebook: it will lead the company to become ever more secretive about its data science, just as Target started camouflaging its pregnancy-related ads and not talking to reporters after people appeared creeped out by the uncanny accuracy of its natal predictions.) But the data collection of the firms is creating whole new kinds of people—for marketers, for the NSA, and for anyone with the money or connections to access the information.’
Facebook’s Model Users | New Criticals
Frank Pasquale on Facebook users, researchers and company practices.