World Without Mind — Franklin Foer
Popular opinion is starting to turn against large tech companies (see Galloway's The Four, notes link), and Foer is at the forefront of the charge. The former editor (of seven years) of The New Republic, Foer writes about the slow death of journalism happening before our eyes at the hands of tech companies, for several reasons:
The increasingly brand-less nature of consumption: e.g. reading news through Facebook without recognition of source (similar to how Amazon promotes its own brand through Alexa, see The Four)
The reduced attention spans that technology, especially mobile, has created (see The Shallows, link)
The nature of digital media causing a race to the bottom for attention-grabbing headlines, e.g. the Cecil the lion incident: a photo of a hunter and a dead lion spawned 3.2M stories as everybody tried to jump on the bandwagon, including The Atlantic's article titled "From Cecil the Lion to Climate Change: A Perfect Storm of Outrage."
Creeping homogenization: "Everything looks the same, reads the same, and seems to be competing for the same eyeballs," per Joshua Topolsky, co-founder of Vox Media and The Verge
The "curiosity gap": a technique pioneered by Upworthy whereby a new story would be published under 20+ different headlines to 20+ different groups of people (A/B testing, or perhaps more accurately A–Z testing) to determine the most click-worthy
Foer's opinion of Big Tech can be summarized in this passage:
"If you stare hard enough at Google, Facebook, and Amazon they become a bit like Italy, a country where it's never entirely clear how power really operates. Rules exist but are never convincingly spelled out. We have a dim awareness that we're being subconsciously influenced, but never know when and how. We see some types of information given more favorable treatment, but not for any explicit reason. Though the tech companies preach liberal values, they crave access to markets in authoritarian countries, where compromise with ugly regimes is the cost of doing business."
World Without Mind is organized into three sections: (1) the companies, (2) what they've done, and (3) how we can fight back.
This last section is the most interesting. In it, the author suggests an impending "Big Hack," which will result in the creation of a Data Protection Bureau, much as the CFPB was spawned out of the Great Recession. The very last chapter is aptly titled 'The Paper Rebellion,' and reasonably suggests a return to reading on paper, an activity that totally isolates you from the long reach of tech companies and their algorithms. Good luck. Nicholas Carr, in The Shallows, presciently predicted a stratification of society along the lines of those who read and those who don't:
“We are now seeing [book] reading return to its former social base: a self-perpetuating minority that we shall call the reading class.”
In the first section, the Silicon Valley culture of monopoly as a social good (look no further than Peter Thiel's Zero to One for corroboration) and of engineering as a means to better, and ultimately liberate, humanity is charted out, with interesting origins in Stewart Brand's The Whole Earth Catalog (link).
He also traces historical roots of the ideas of:
The liberation of the mind from the prison of the body (what technology aims to do) to Descartes, and
The idea of a technocracy to Saint-Simon, Comte, Veblen, and Leibniz
Further, he characterizes Google and Facebook by their parallel desires to remove 'potential bias' from any equation, and charts the deleterious consequences of this: both use paternalistic nudging to guide users in the direction deemed best for them, which also happens to be the direction that thoroughly addicts them.
Algorithmic bias, a topic that deserves its own chapter, or book (which you can find in Weapons of Math Destruction, link), is outlined briefly, and yet an elegant defense of why technologists might not believe in its existence is also discussed: instead of looking for patterns in data with a priori hypotheses, researchers can blindly throw numbers into a black box and let algorithms find patterns where science cannot. The reality? (link)
In the chapter on Google, an interesting observation of Robert Geraci, an anthropologist of religion, struck me:
"Apocalyptic AI is the legitimate heir to these religious promises, not a bastardized version of them. In Apocalyptic AI, technological research and religious categories come together in a stirringly well-integrated unit."
Also, Foer observantly notes that Google's book-scanning project was mainly driven by the desire to train their AI.
In the chapter on Facebook, the idea of "ultimate transparency" is discussed. The theory holds that the more we share, the better off we'll be. Zuckerberg himself is quoted as saying, "Having two identities for yourself is an example of a lack of integrity." This idea is cleverly expounded in the recent movie The Circle (2017, trailer link), starring Tom Hanks, which is worth watching.
More worrying, however, is the statistic that 60% of Facebook's users are unaware of the existence of Facebook's News Feed algorithm, the very one that decides what you should see first thing every morning when you wake up. This power is extrapolated (considering recent events, with good reason) to the power Facebook could wield should it decide to influence an election. And this manipulative power has no oversight: one member of Facebook's data science team confessed that "Anyone on that team could run a test. They're always trying to alter people's behavior."
Finally, the chapter on Amazon re-centers the main themes of the book using the journalism industry as a guiding narrative. The atomization of digital content (i.e. albums -> songs, authors -> e-books, news channels -> 3-minute videos on Facebook) has increasingly concentrated power in the hands of a few gatekeepers: certain well-placed individuals, full of conscious and submerged biases, who exert control over the flow of information.
And this power is subject to extreme positive feedback loops, as evidenced by Amazon's "almost sadistic" negotiating tactics: "the smaller the publisher, the more extravagant the pressure to comply with Amazon's wishes." This is covered excellently in Brad Stone's The Everything Store (notes link), which details Amazon's strategy of demanding lower prices from publishers lest it shut off its recommendation algorithms for the publisher's books (which could result in a 40% fall in sales) — this from the same company that, when it was small, cited the 1936 Robinson-Patman Act, which prevented manufacturers from selling to large corporations more cheaply than to their smaller competitors.
And as also noted in The Four, regulators can't keep up with the flagrant exploitation of the market power that tech companies enjoy, so what's the logical thing for them to do? Continue to break the laws: Facebook lied about potential data sharing between WhatsApp and Facebook during acquisition talks, and when caught later simply paid a $110M fine. Additionally, tax avoidance favors tech companies that deal in slippery, intangible software products: Apple, Amazon, and Alphabet pay effective tax rates in the mid-teens, while Walmart pays double.
CHI '15: Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, April 2015, 153–62.
"Facebook Experiments Had Few Limits; Data Science Lab Conducted Tests on Users With Little Oversight," WSJ, July 2, 2014.