
Monday, February 18, 2019

Zucked: Waking Up to the Facebook Catastrophe



by Roger McNamee 



Well, this is awkward.

Roger McNamee, the author, was a mentor to Mark Zuckerberg, an early investor in Facebook, and the person who introduced Zuck to Sheryl Sandberg. But when, after the 2016 election, he told them he believed Facebook had become a danger to democracy, they didn't seem to want to talk about it.

McNamee is a smart guy, and he figured out earlier than most that Facebook's unintended consequences were harming public health, enabling any number of bad actors to influence elections (Trump, Brexit), and helping oppress minorities (Myanmar, Sri Lanka). Facebook can be, and in fact is, incredibly harmful to children--for commercial gain. This is the sharp edge of what Shoshana Zuboff calls surveillance capitalism.


If there's a silver lining at all, it's that McNamee and a small team of experts have been busy educating people about the hazard and what can be done about it. He has briefed the US Senate Intelligence Committee, the House Intelligence Committee, Nancy Pelosi's staff, academic experts, the EU, and others. But make no mistake: Facebook (and Google, and the rest of Big Tech) won't give up money and power without a fight. And they have all of our data.

Surveillance Capitalism

The Age of Surveillance Capitalism

By Shoshana Zuboff





You should read this book.

What’s your view as to the most pressing issue facing humankind today? Climate change? Nuclear war? Viral pandemic? Mine is surveillance capitalism.

Whoa, but aren’t the other issues of an existential nature, capable of wiping out the human race? Yes, but… in each case there are experts and activists working hard to make sure that doesn’t happen.

The problem with surveillance capitalism is that very few people outside of Google, Facebook, and other Big Tech even know what it is, and they aren’t working to stop it. Far from it.

You should read this book.

Shoshana Zuboff is a social psychologist who studies and writes about the impact of technology on people. I read her 1988 book, In the Age of the Smart Machine, in which, over 30 years ago, she investigated the early impact of smart machines on people's lives. But that was before Google, and Facebook, and surveillance capitalism.

Zuboff is a professor at Harvard Business School. She has spent her career researching how the information age has morphed into something not so positive: surveillance capitalism. What does that term even mean? Here are her many definitions, with which she begins the book:

Surveillance Capitalism, n:
1. A new economic order that claims human experience as free raw material for hidden commercial practices of extraction, prediction, and sales; 2. A parasitic economic logic in which the production of goods and services is subordinated to a new global architecture of behavioral modification; 3. A rogue mutation of capitalism marked by concentrations of wealth, knowledge, and power unprecedented in human history; 4. The foundational framework of a surveillance economy; 5. As significant a threat to human nature in the twenty-first century as industrial capitalism was to the natural world in the nineteenth and twentieth; 6. The origin of a new instrumentarian power that asserts dominance over society and presents startling challenges to market democracy; 7. A movement that aims to impose a new collective order based on total certainty; 8. An expropriation of critical human rights that is best understood as a coup from above: an overthrow of the people’s sovereignty.

I came across surveillance capitalism in my final job, working in cybersecurity and learning about the dark side of mobile phones with their persistent surveillance and egregious invasions of privacy. I have therefore been aware of how our corporate masters cynically capture many aspects of our lived experience so they can deliver “better” ads. The overall goal, as with all advertising, is to sell us more stuff. For this, our privacy is stealthily violated. This I already knew.

But I now know that I was only seeing the tip of the iceberg. Zuboff outlines how Google, Facebook, and the rest of Big Tech can now take a "God's eye view" of the data resulting from our lived experience--our location, calls, texts, emails, digital assistants, web surfing, online purchases, offline purchases, connections, friends, social media posts, likes, follows, and many other factors--and appropriate it as their proprietary data and inferences. These can be sold to advertisers, yes, but also to other entities including insurance companies, financial services companies, educational institutions, law enforcement, city, state, and federal governments, and anyone else willing to pay.

We know that websites track us. We know our phones also track us, to a degree we should call surveillance. But as the so-called Internet of Things (IoT) relentlessly becomes a pervasive reality, we can be (and are) tracked by smart speakers, TVs, appliances, cars, toys, thermostats, smart vacuums, security systems, and soon clothing and other wearables. Zuboff refers to the sum of these surveillance devices as Big Other (sounds like Big Brother--get it?). And here's a key point: it's not Big Brother. It's not the government. Our Fourth Amendment rights don't protect us. This pervasive surveillance infrastructure--the Big Other--is controlled by our corporate masters. Companies such as Google and Facebook gather billions of data points daily and, using millions of servers deployed around the world, unseen but powerful, deduce characteristics about us that we can't know, can't appeal, and which control our lives. Want a job? The algorithm that used to just scan resumes now adds everything it knows about you--right or wrong--to determine who gets the job. And who gets the loan. And who qualifies for the house. And what the length of your sentence is.

As time goes on, surveillance capitalism shifts from a system that can deliver you relevant ads, to one that controls your life. Without transparency or accountability.

I won't go too far in describing what these technologies can do to children, because it makes me too upset. Suffice it to say, something called "persuasive technology" has been weaponized by Facebook and others to manipulate children's emotions for commercial gain.

Zuboff is not a technologist, but she puts tech in perspective. She compares surveillance capitalism to the industrial revolution, totalitarianism, the cultural trend towards individualism, and how behavior modification became accepted (which also caused me to drop out of college as a psych major, coincident with B.F. Skinner’s publishing Beyond Freedom and Dignity). Zuboff cites philosophers, social scientists, economists, psychologists, and of course tech industry leaders. In this book she provides a 360-degree perspective on the many factors that influence this movement. And she sounds a warning.

Naomi Klein says “everyone needs to read this book as an act of digital self defense.” Robert Reich says “Her sweeping analysis demonstrates the unprecedented challenges to human autonomy, social solidarity, and democracy perpetrated by this rogue capitalism.”

Rogue capitalism. You don’t have to be a techie to understand it, but you have to pay attention.

You should read this book.

Monday, February 12, 2018

Is It Your Smartphone That's Addictive — Or Your Apps?


The recent spate of articles on the topic of smartphone addiction reflects growing concerns about our reduced cognitive capacity, increasing loneliness and depression, and our diminishing ability to control where our attention is focused—all attributed to the increasing amount of smartphone screen time in our daily lives.

Our daily smartphone use in the U.S. has grown to over 4 hours per day, according to eMarketer. And in the details we see that the vast majority of that time is spent in mobile apps. It's not the smartphone that's addictive, but the apps—which are specifically designed to keep us "engaged," which in practice means using them for longer so that the stalker economy can profit from our attention.

Make no mistake: Apps such as Facebook, Snapchat, Instagram, WhatsApp, and Twitter employ an economic model that's tied to keeping your attention on their app (despite what their marketing departments say about connecting people). That's for two reasons: first, to serve us more ads; second, to surveil us for longer so that companies such as Acxiom, Epsilon, Datalogix, RapLeaf, Reed Elsevier, BlueKai, Spokeo, and Flurry can collect more data about us. These companies are players in the $156 billion per year data surveillance industry—an industry that exists so that marketing companies can serve us the "best" ads, depending on dozens of factors including where we are at any given time. Usage patterns, what other apps we use, and how we use them allow marketers to determine our gender, profession, marital status, sexual orientation, income level, age, health conditions, and other personal characteristics. Flurry, for example, classifies app users by persona, such as Business Travelers, Pet Owners, and New Moms, among many others.

Enterprises in the U.S. don't worry all that much about protecting employees' privacy. But they are concerned about employee productivity, and ensuring that—unlike Homer Simpson in the cartoon above—their employees focus their attention on the job at hand. That's why Facebook is one of the most common apps for enterprises to blacklist. Other approaches to eliminate employee loss of attention include adoption of container strategies such as Android Enterprise and Samsung Knox so that employees can only use work-related apps while they're at work.

But employees resist corporate attempts to control what apps are on their devices, and container adoption is slowed by ease-of-use and other concerns. What other options exist for enterprise mobile security?

As we outlined in a prior post, any mobile security approach for enterprises that requires users to delete apps from their devices will be subject to resistance from app-addicted employees. That's one reason why Mobile Threat Defense (MTD) solutions face deployment headwinds. And unless app policies are developed in a strong partnership with the HR department, and employees agree to such measures as a condition of employment, enterprises will find it very challenging to enforce any but the most egregious security concerns regarding employee-owned devices.

Instead, enterprises should investigate a lightweight approach to mobile security that's transparent to employees but can prevent enterprise-selected personal apps from running while employees are at work. Every day when they leave the workplace, their apps are re-enabled and work normally while they're on personal time and away from the office. That's the security model that has served enterprise laptops for the past decade, and it makes a logical separation between work and personal use of mobile devices.
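The work-hours policy described above can be sketched as a simple rule evaluator. This is a hypothetical illustration only, not a real MDM or container API: the package names, hours, and function are my own assumptions about how such a policy might be expressed.

```python
from datetime import time

# Illustrative policy: enterprise-selected personal apps are blocked
# during work hours and re-enabled automatically afterward.
# Package names and hours below are assumptions, not a vendor API.
WORK_START = time(9, 0)
WORK_END = time(17, 0)
BLOCKED_DURING_WORK = {"com.facebook.katana", "com.snapchat.android"}

def app_allowed(package: str, now: time) -> bool:
    """Return True if the app may run at the given local time."""
    in_work_hours = WORK_START <= now < WORK_END
    if in_work_hours and package in BLOCKED_DURING_WORK:
        return False
    return True

# Facebook is blocked mid-morning but allowed again after hours;
# apps not on the list are never affected.
print(app_allowed("com.facebook.katana", time(10, 30)))  # False
print(app_allowed("com.facebook.katana", time(18, 0)))   # True
print(app_allowed("com.example.notes", time(10, 30)))    # True
```

A real deployment would also need location or network awareness (so the block applies only at the office) and an enforcement mechanism on the device, but the policy logic itself stays this simple.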

______________________________________________________________________

Note: Many of the ideas explored in this post were stimulated by two books: Future Crimes: Inside the Digital Underground and the Battle for Our Connected World, by Marc Goodman, and The Attention Merchants: The Epic Scramble to Get Inside Our Heads, by Tim Wu. I am indebted to them both.

Monday, January 8, 2018

"Modern computing security is like a flimsy house that needs to be fundamentally rebuilt"


Zeynep Tufekci has an interesting take on the latest cyber security news in her column entitled The Looming Digital Meltdown. The money quote is this: "Modern computing security is like a flimsy house that needs to be fundamentally rebuilt." Her focus is the chip-based vulnerabilities disclosed last week, but she's talking about cyber security in general. And her point is hard to argue with.

Tufekci has been thinking both deeply and broadly about these topics for quite some time. She's the author of Twitter and Tear Gas: The Power and Fragility of Networked Protest, and her TED talk "We're building a dystopia just to make people click on ads" gets right to the point about the economic incentives that enable the Internet's stalker economy.

The theme of Tufekci's column is that vendors, driven by consumer demand and the frenzy to be first to market, have sacrificed security for speed and convenience. She rightly asserts that this is a solvable problem--and maybe would have already been solved if we simply held our vendors accountable (as we do with airplane travel, for example, or with consumer products).

But I think users have been complicit. By users, I mean people who use Facebook, who buy smartphones, and who are inevitably attracted to "free" apps and services. I put the word free in quotes because the cost is real but not always evident. The stalker economy leads to exploits such as ADINT, where, by using the same techniques available to advertisers, one can easily track friends, relatives, and even strangers. We've ceded this power to the technology providers in order to have easy access to social networking apps and other "free" services.

And profit-driven technology is proving more powerful than traditional safety systems. Witness the fact that Uber Can Find You but 911 Can’t. Would we ever have designed such an outcome consciously?

Maybe there's hope on the horizon: Mark Zuckerberg says one of his goals for 2018 is to fix Facebook.

Meanwhile, here on earth, we're left to live with the chip-related vulnerabilities (known as Meltdown and Spectre). There's much hand-wringing that the proposed fixes will slow our systems down--maybe by as much as 30%. But MIT has developed an ad-blocking system that improves web page download times by up to 34%. That sounds like a fair tradeoff to me.