
The Scoreboards Where You Can’t See Your Score

Stashed in: Privacy does not exist., Quantified Self


We are being judged all the time by private companies.

In two nonfiction books, scheduled to be published in January, technology experts examine similar consumer-ranking techniques already in widespread use. Even before the appearance of these books, a report called “The Scoring of America” by the World Privacy Forum showed how analytics companies now offer categorization services like “churn scores,” which aim to predict which customers are likely to forsake their mobile phone carrier or cable TV provider for another company; “job security scores,” which factor a person’s risk of unemployment into calculations of his or her ability to pay back a loan; “charitable donor scores,” which foundations use to identify the households likeliest to make large donations; and “frailty scores,” which are typically used to predict the risk of medical complications and death in elderly patients who have surgery.

Unlike Lenny Abramov, however, most people in real life are not aware of the types and frequency of rankings to which they are subject. While a federal law called the Fair Credit Reporting Act requires consumer reporting agencies to provide individuals with copies of their credit reports on request, many other companies are free to keep their proprietary consumer scores to themselves.

“This will happen whether or not you want to participate, and these scores will be used by others to make major decisions about your life, such as whether to hire, insure, or even date you,” write Michael Fertik and David Thompson in a forthcoming book, “The Reputation Economy: How to Optimize Your Digital Footprint in a World Where Your Reputation Is Your Most Valuable Asset” (Crown Business). Mr. Fertik is the chief executive of a service that helps individuals and companies manage their online images.

In his new book, Frank Pasquale, a law professor at the University of Maryland, similarly describes the information asymmetry inherent in the scoring industry.

“Important corporate actors have unprecedented knowledge of the minutiae of our daily lives,” he writes in “The Black Box Society: The Secret Algorithms That Control Money and Information” (Harvard University Press), “while we know little to nothing about how they use this knowledge to influence important decisions that we — and they — make.”

Both books outline how consumer scoring works. Data brokers amass dossiers with thousands of details about individual consumers, like age, religion, ethnicity, profession, mortgage size, social networks, estimated income and health concerns such as impotence and irritable bowel syndrome. Then analytics engines can compare patterns in those variables against computer forecasting models. Algorithms are used to assign consumers scores — and to recommend offering, or withholding, particular products, services or fees — based on predictions about their behavior.
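The pipeline described above can be sketched in a few lines of code. This is a toy illustration only: the field names, weights, and logistic link below are hypothetical stand-ins for whatever a real analytics engine's proprietary forecasting model would use to produce something like a "churn score."

```python
import math

def churn_score(dossier):
    """Reduce a broker-style consumer dossier to a 0-1000 churn score.

    Weights are invented for illustration; a real scoring engine would
    fit them to historical behavior data with a forecasting model.
    """
    weights = {
        "months_as_customer": -0.02,      # longer tenure lowers the score
        "support_calls_last_90d": 0.30,   # friction raises churn risk
        "late_payments": 0.45,
        "competitor_site_visits": 0.25,
    }
    z = sum(weights[k] * dossier.get(k, 0) for k in weights)
    probability = 1 / (1 + math.exp(-z))  # logistic link: z -> (0, 1)
    return round(probability * 1000)      # scale to a familiar score range

profile = {"months_as_customer": 36, "support_calls_last_90d": 4,
           "late_payments": 1, "competitor_site_visits": 3}
print(churn_score(profile))  # a high score flags a likely defector
```

The asymmetry both books describe is visible even in this sketch: the consumer supplies none of the inputs knowingly, and neither the weights nor the resulting score is ever shown to them.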

But while both books emphasize the notion that consumer reputations are vulnerable to such covert scoring apparatuses, the authors differ markedly in the steps they say ordinary people might take to protect themselves.

Befitting the founder of a firm that markets reputation management, Mr. Fertik contends that individuals have some power to influence commercial scoring systems. He presents nascent technologies, like online education courses that can score people on the specific practical skills or concepts they have mastered, as democratizing forces that could enable workers to better compete for jobs on merit. His book suggests that readers curate, or hack, their digital reputations — for instance, by emphasizing certain keywords on their résumés to position them better for predictive scoring engines, or by posting positive reviews of restaurants or hotels online, in the hope that algorithms will flag them for future V.I.P. treatment.
