fusaroli's review


3.0

The topics are intriguing, the insights (at times) genuinely insightful. However, the constant I-me-I-me over-celebratory tone, continuous references to the author having created startups (you said it in the intro, now get over it), and lack of detail bring this book down to two stars. It got me reading the (much better) original papers, though.

archytas's review


2.0

This is not some pop-science book by a run-of-the-mill academic: Alex Pentland is one of the most powerful scientists in the world. The projects outlined here are not just experiments: they are prototypes exploring what a surveillance-based society might look like. Which is why Pentland's avoidance of any discussion of *power* in the book - who has it, who wields it, and to what end - feels less like ignorance than dissemblance.
There is a lot here to agree with. Pentland's explication of how systemic social dynamics can create or ruin a company's output, for example, should be compulsory reading for managers obsessed with time metrics and personality typing. I tend to agree with his analysis of the role of collaboration in driving human evolution. His argument that exploration and novel exposure drive innovation, and his enthusiasm for structuring networks and systems to support this, are compelling.
In part, I think Pentland's ideas are dangerous because of what he's right about: he knows exactly what is needed to manipulate people. The dangerous part is not so much what he is wrong about as what he glosses over and ignores: the question of who benefits from this manipulation, who has the power to carry it out, and how it is governed.
It is notable that Pentland does not discuss the issues around data analysis and equality. There is no mention of how this technology is being used to deepen racist housing and schooling segregation, to 'predict' criminality based on race, or to reduce loan limits for women. In recent years, as the evidence for drastically worsening racial injustice from large data systems has increased, Pentland has been involved in a few projects around "fairness" in tech, including one offering a tool that lets decision makers weigh how 'unfair' a system is against how much it might cost to make it 'fairer'.
He diverts from this with much content around individual privacy and control over data. How this sits with his work in organisations - in which all employees must submit to constant monitoring via a biometric badge and total system surveillance - is not specifically discussed. That Pentland's findings support a more engaging and social workplace is no doubt some incentive for the surveillance, but the end objective is not to improve employees' health and wellbeing; it is to improve productivity for the corporations employing them.
Similarly, Pentland's smart cities experiments can be sold to participants as improving traffic flow, better targeting of health services, and disease control. In reality, of course, this technology is wielded almost entirely to get people to buy things when they are most vulnerable to suggestion. Our travel routes are not designed to maximise our pleasure and relaxation but to ensure we notice that coffee shop we might love, or that pet shop with the puppy we can't resist. Our vulnerabilities are on sale to the highest bidder. The overall message - that the way we construct our societies determines what we can achieve - I find very attractive.
Pentland, of course, doesn't bring this up at all. Instead, he waxes lyrical about the potential of constant smart surveillance for social improvement. We can know, he enthuses, where people who are going to get diabetes visit. Now I'm not a data scientist or a dietician, but even I know that fast food outlets are a major contributor to poor health outcomes. This isn't exactly a secret - our problems aren't about knowledge, they are about power, and the power that profit holds. Pentland argues explicitly at several points that markets have become divorced from social good because of a lack of knowledge. Most cheekily, he tries to imply the industrial revolution was hard on the poor because of a lack of cross-class socialising (as if industrial barons would have given up all that power and wealth if they had just understood their workers' grief, physical pain, exhaustion, and starvation better).
Never is this more hubristic than in Pentland's claim that data will enable us to avoid or minimise pandemics. Of course, I get to be late-2020 hysterically hindsighty about this, but it turns out Pentland is right about what data *could* do, and totally wrong about our ability to use it for public good. 'Contact tracing' is now ordinary language and data modellers have had a hell of a year, but in the heartland of the market surveillance model Pentland advocates, it has had minimal impact on public policy, and hundreds of thousands have died. As I write this, millions of Americans gather for Thanksgiving in open violation of recommended protections, in a push largely motivated by ensuring continuing commodity consumption - a market economy in direct conflict with public safety.
I'm not much of a libertarian. I have little time for the idea of some pure individual unshaped and unmanipulated by the forces around us. I don't hate the 'social physics' discipline because it snoops on us, or because it is designed to alter our behaviour - human cognition is socially constructed, and we are shaped by our society all the time. But I certainly don't believe it is a new organising model either. It is just a tool - an extraordinarily powerful one. The question is who wields it, and at what cost?

18thstjoe's review


5.0

Interesting stuff. I'd like to learn more about the Trento smart city and the Ivory Coast data projects.

restlessunicorn's review


3.0

Interesting exploration of leveraging big data to enhance society and predict likely behaviors. A lot of it is common sense - e.g. people tend to be strongly influenced by those with whom they have the closest personal or professional ties - but the mathematical modeling (and resultant predictability) was quite intriguing. Novel methods for promoting desired behavior (incentives that work) and the theory of maximizing idea flows were also compelling. The primary downside was that the author seemed to be screaming "hooray me!" on every page, talking about his labs, spin-off companies, brilliant ideas, and the former doctoral students he mentored. I get it - he's brilliant - but have enough security to avoid filling the page with your accomplishments.