Data is Fire🔥🔥🔥

Yancey Strickler
Nov 12, 2020 · 7 min read


If you’ve ever slept outdoors or even just watched a survival reality show, you know the importance of fire. Fire is warmth, energy, safety. Fire bridges the line between comfort and discomfort. Even life and death.

Early humans’ ability to tame fire — which took hundreds of thousands of years to develop — changed the course of history. Human biology, too. Taming fire led to cooked food, which increased the calories in human diets, growing the size of our brains.

Fire was and is pivotal. But fire is also dangerous. It kills people. It’s difficult to tame. It’s not easy to get.

We see ourselves as far more advanced than our ancestors, but we face a strikingly similar situation today with a force just as powerful and mysterious. Our fire is called data.

Data Rules Everything Around Me

Today’s big moods are to be either fearful or possessive of data. (In the context of this essay, “data” means information about human activity that is collected, processed, and used.) Activists see data as a dangerous tool for privacy invasion, manipulation, and control. Companies treat data as a resource to be amassed, mined, and exploited.

Less appreciated is that even though we take data’s ubiquity for granted, it’s surprisingly new. The field of statistics was invented at the turn of the 19th century. The first tabulating machine, built by the startup that would become IBM, appeared only a little over a century ago. Our current digital age — taken for granted as eternal and inescapable — is less than a generation old.

Data didn’t exist until recently because until recently data was too costly to gather. Counting stuff is a lot harder and more important than you think. Money and things that can make money were both easy to count and worth the effort to count (part of how the gold standard became the gold standard). Today, value — meant in a financial or numerical sense — has surpassed values — meant in the moral sense — in many societies because it simplifies the math. Numeric value creates an empirical feedback loop. Moral values and their language of words do not.

Whether you celebrate or condemn them, the systems built to tabulate, collect, analyze, process, and share data are incredible inventions that have produced enormous value. The vast majority of these systems exist to maximize one specific value — money — which has, not coincidentally, grown exponentially over these same years. As the saying goes, you are what you measure.

New values

The digital era has changed what we measure and, as a result, changed what’s valuable. A growing amount of human behavior occurs via systems where it’s trivial to track our actions in minute detail. Because of the ease of measurement, our ability to define discrete values has exponentially increased, as have alternative forms of exchange based on alternative forms of value. Examples include:

Time. Time is a common currency in the digital age. Video games grant access to special features and character upgrades based on time spent playing the game. Video platforms give the option to pay with your time by watching thirty-second pre-roll ads or with money by buying a subscription. Forms of currency based on time are growing.

Loyalty. Loyalty is traditionally a personal/moral value (how each of us individually values our relationships) or an opt-in value (joining a loyalty program or becoming a financial supporter). Loyalty is also now a passive value that can be used to distribute goods and services. Adele invited people identified by an algorithm as her biggest fans to buy tickets to her shows. Rather than let a traditional auction determine who would see her perform — optimizing for who had the most money to spend — she created a transaction that satisfied a financial minimum (tickets cost money) but maximized for a non-financial value (loyalty and community). Loyalty as a metric and multi-value transactions like Adele’s will both become increasingly common; a rough sketch of such a transaction follows these examples.

Social desirability. The most dystopian possibilities are in social scoring systems that algorithmically determine which privileges, goods, and rights someone should have access to. Using algorithms to allot concert tickets is one thing, but when human rights are based on these scores, things get hairy. Such a system currently exists in China, and it is what some conservatives allege is happening when social media networks de-platform them. This direction will be deeply contested for years to come.
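
To make the idea concrete, here is a minimal sketch, in Python, of what a multi-value transaction like Adele’s could look like: every buyer has to meet the same financial minimum, but scarce tickets go to the most loyal fans rather than to the highest bidders. Everything in it (the Fan type, the loyalty_score metric, the prices) is invented for illustration; it is not how her ticketing system actually worked.

```python
# A minimal, hypothetical sketch of a multi-value transaction: every buyer must
# meet the same financial minimum, but scarce tickets go to the most loyal fans
# rather than to the highest bidders. The Fan fields and loyalty_score metric
# are invented for illustration, not taken from any real ticketing system.

from dataclasses import dataclass

@dataclass
class Fan:
    name: str
    loyalty_score: float  # assumed metric, e.g. shows attended or listening hours
    offered_price: float  # what the fan is willing to pay

def allocate_tickets(fans: list[Fan], face_price: float, tickets: int) -> list[Fan]:
    """Allocate tickets by loyalty once the financial minimum is met."""
    eligible = [f for f in fans if f.offered_price >= face_price]          # financial minimum
    ranked = sorted(eligible, key=lambda f: f.loyalty_score, reverse=True)  # maximize loyalty
    return ranked[:tickets]

fans = [
    Fan("superfan", loyalty_score=98.0, offered_price=120.0),
    Fan("casual listener", loyalty_score=12.0, offered_price=120.0),
    Fan("scalper", loyalty_score=1.0, offered_price=900.0),
]
print([f.name for f in allocate_tickets(fans, face_price=120.0, tickets=1)])
# ['superfan']: once the minimum is met, a higher bid buys no advantage
```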

“Fuck the algorithm”

These first forays into new values can prove controversial.

A stark example came earlier this year, when the British government used an algorithm to adjust students’ exam grades after COVID canceled exams, boosting students at elite schools and penalizing students at less academically successful ones. This led to protestors chanting “fuck the algorithm” — the first but certainly not the last time such chants will be heard.

There was a similar controversy in the US last year, for the opposite reason. The College Board, which administers the SAT, announced a new adversity score intended to capture the obstacles faced by students from less privileged backgrounds. The proposal was quickly pulled after widespread pushback: using math to quantify the difficulty in someone’s life was a step too far. This rejection of statistics in education is growing — several top schools announced in 2020 that they will no longer require the SAT at all, accusing the test of systemic bias.

We put more trust in independent decision makers than in defined processes. This distrust is not irrational. Our digital systems are new. They’re black boxes. They’re built by private companies. They discriminate against certain groups of people because they’re built on data from a world that discriminates against those groups. They aren’t democratically designed or produced.

Our impressions of data are wrapped in suspicion for good reason. Today data is primarily used to sell ads — quite possibly the lamest task it could be put to. Data remains arguably the most promising path for transforming the world. But until we learn to tame and control its downsides, we won’t realize that potential.

Democratic data

Our current age of data is only the beginning. How societies use data in the future will be significantly different than how we use it today.

Already data is a new form of capital. As with capital, companies amass, hoard, and use it to secure further dominance. But data is also the means by which new forms of capital are defined and exchanged. Bitcoin and the blockchain are only the beginning. In the coming years, people will translate centuries of moral debate about right and wrong, just and unjust, and ideal behavior into mathematical expressions of concepts like loyalty, community, and purpose that digital sensors will passively detect.

A fascinating paper by an NYU post-doc named Salome Viljoen advocates for “egalitarian data” where the measurement of our collective actions is seen as a “democratic resource.” She writes:

“Far from offering terrain on which to re-impose forms of private market ordering, data governance may plausibly retrieve spheres of life from private governance and begin to develop new alternatives.”

The writer Evgeny Morozov echoes this: “A more promising project for the left might be to find ways to deploy ‘feedback infrastructure’ for new, non-market forms of social coordination.”

Real-world examples already exist in health care. The QALY (quality-adjusted life year) is a measure of the length and quality of life a treatment is expected to provide, and it informs policies and decisions about someone’s care. The system used to allocate donated organs is another. Instead of letting money decide who gets an organ and who doesn’t, organs are matched to recipients according to medical need and expected benefit. The system is an algorithm for fairness, essentially.
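
To make “an algorithm for fairness” a little more concrete, here is a toy sketch of the idea: rank candidates by medical need and expected benefit, and leave money out of the score entirely. The fields and weights are invented for illustration; real allocation systems use far more detailed clinical criteria.

```python
# A toy sketch of an "algorithm for fairness": candidates are ranked by medical
# need and expected benefit, and ability to pay plays no role in the score.
# The fields and weights are invented for illustration; real matching systems
# use far richer clinical criteria.

from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    medical_urgency: float   # 0-1, higher means more urgent (assumed scale)
    expected_benefit: float  # 0-1, e.g. predicted quality-adjusted life years gained
    wealth: float            # present in the data, deliberately never used

def priority(c: Candidate) -> float:
    # Weighted sum of need and benefit; wealth never enters the score.
    return 0.6 * c.medical_urgency + 0.4 * c.expected_benefit

def match(candidates: list[Candidate]) -> Candidate:
    """Return the candidate the next available organ should go to."""
    return max(candidates, key=priority)

waitlist = [
    Candidate("A", medical_urgency=0.9, expected_benefit=0.7, wealth=30_000),
    Candidate("B", medical_urgency=0.4, expected_benefit=0.9, wealth=5_000_000),
]
print(match(waitlist).name)  # 'A': need and benefit decide, money doesn't
```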

The rise of data will create more systems like these and more transactions like Adele’s. Data will reveal whole new frontiers of value that we contribute to and draw from individually and collectively. Our maps to the world will be rewritten by what we find.

This is uncharted territory. It won’t be easy. There will be discomfort. We’re still in the early days of our relationship with data. Not far from our ancestors still learning how to handle fire. Their ability to control fire got us here. Our ability to control data will determine where we go next.

Further Reading

  • A great book for learning more about data and data gathering: How to Measure Anything by Douglas Hubbard. Thanks to my friend Jeff Hammerbacher for the recommendation.
  • One extreme model for a world run on data is “automated luxury communism,” as a talk by Toby Shorin explains.
  • I liked this quote from the Evgeny Morozov essay: “As G. A. Cohen put it in his last book, ‘The principal problem that faces the socialist ideal is that we do not know how to design the machinery that would make it run.’”
  • A longer, not-yet-published version of Salome Viljoen’s essay on “Democratic Data” informed parts of this essay. Thanks to her for sharing.
  • Algorithms reveal what we really want. From New York Magazine:

“In 2018, a group of data scientists at the Times unveiled Project Feels, a set of algorithms that could determine what emotions a given article might induce. “Hate” was associated with stories that used the words tax, corrupt, or Mr. — the initial study took place in the wake of the Me Too movement — while stories that included the words first, met, and York generally produced “happiness.” But the “Modern Love” column was only so appealing. “Hate drives readership more than any of us care to admit,” one employee on the business side told me.”

Written by Yancey Strickler

Author of “This Could Be Our Future: A Manifesto for a More Generous World”; Cofounder of Kickstarter; Bentoist; http://www.ystrickler.com