The New Business Model
Contrary to what you may believe, Facebook's #1 priority is not to educate or inform its users but to keep them on the site. They do so by manipulating our psychology. According to Brent Barnhart, writing for Sprout Social, there is a belief that social media algorithms exist to push brands into paying a premium for social ads.
The belief is that if brands can't reach their audience organically, they'll turn to ads instead. This means more money for the social networks. This point of view might seem cynical or even paranoid, but social marketers know that changes in how social media algorithms prioritize paid and organic content can have a huge impact. We will get deeper into that when looking at the social and psychological effects of social media.
To those who think it's just paranoia and it's not all that bad, I always ask the same questions: "Facebook doesn't make anything (ed. note: especially before acquiring Oculus and Facebook Gaming) and you don't have to pay to use it. So how do they make so much money? What are they selling, and who is buying?"
“If you are not paying for it, you are not the customer. You are the product being sold.”
– Tim O’Reilly
We all love free stuff, but if Facebook is free, the question of how they make their money should be a clue to many as to what Facebook really is. Simple answer: a marketing agency with 1.15 billion users per day.
They make on average $12 per active user per year. That doesn't sound like much until you take into account the 2.7 billion users, at which point it adds up to roughly $32 billion a year and suddenly doesn't sound so shabby anymore. Who is paying them this money? According to Professor Paul Miller, Facebook has guaranteed advertisers a level of effectiveness in getting you to buy or change your behavior, all in a way that doesn't raise awareness of how they are doing so. Why? Because if you could feel how hard you were being targeted, you might not behave the way they want you to. A repeated soft touch moves a person more easily, and more willingly, than a hard push that only produces resistance.
Shoshana Zuboff, in the Netflix documentary "The Social Dilemma", calls it surveillance capitalism: a market where it is not stock futures that are traded but human futures. What will you do, what will you buy next, where will you be, what will you be doing, and with whom? Predictive modeling. And Facebook is by no means the only company building these models and making money off them; look at Google. In this business model, the one with the best predictive model (AI) wins. But there's a catch: for the models to perform at their best, they need data, and lots of it.

So how do you keep them on a steady diet of data using a social media platform? Make sure users engage as much as possible for as long as possible, and grow the "come back rate" by pushing posts from the friends and family the user interacts with the most. This increases the value of the space (your feed), and that space gets sold with a promised level of certainty about predicting the next behavior. That ability grows with each data point collected and each passing day. The models get better at predicting your behavior, but even the best engineers at these companies can't tell you with 100% certainty how they come up with these predictions. In goes the data, through the model it goes, and out comes a result, but why that particular result comes out is not really understood. The black box problem.
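To make the idea of "human futures" a bit more concrete, here is a minimal, hypothetical sketch, not Facebook's actual system, of what such a predictive model looks like in code. The behavioral features, numbers and the choice of a scikit-learn model are all invented for illustration.

```python
# Hypothetical sketch: behavioral data goes in, a probability of a
# future action comes out. Features and labels are invented.
from sklearn.ensemble import GradientBoostingClassifier

# Each row: [minutes on site today, posts liked, ads clicked this week,
#            friends interacted with] -- toy behavioral signals.
X = [
    [35, 12, 1, 4],
    [5,   0, 0, 1],
    [120, 40, 3, 9],
    [15,  2, 0, 2],
    [90, 25, 2, 7],
    [8,   1, 0, 0],
]
# Label: did the user click a sponsored post the next day? (invented)
y = [1, 0, 1, 0, 1, 0]

model = GradientBoostingClassifier().fit(X, y)

# The "human future" being sold: a probability that a specific user
# will take a specific action.
new_user = [[60, 18, 1, 5]]
print(model.predict_proba(new_user)[0][1])  # e.g. a value close to 1

# The black box problem: an ensemble of hundreds of trees produces the
# number, but nobody can point to a simple "why" for one person.
```

The point of the sketch is not the particular model but the pipeline: the more engagement data flows in, the sharper the predictions get, and it is those predictions, not your posts, that are the product.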
We're Creating Our Own Bubbles
Facebook collects at least 125 data points on every user. Some of this is given willingly, data we hand over freely such as our birthday, telephone number and workplace. But it's the unwillingly collected data that many have a problem with, and I will honestly say: me included. This is the reason Facebook received so much backlash over WhatsApp (owned by Facebook) and Facebook sharing data, and why they are now in a data policy war with Apple over unwilling data collection. Apple is planning to launch an update that will force all apps in the App Store to ask users for permission to collect specific data and to explain what it will be used for. Facebook stated that if Apple were to go through with this plan, they feared users would decline the collection, which would have a huge impact on their business. The funny thing about this Apple move is that it is a case of the pot calling the kettle black. But let's look at the bright side: at least they are actively trying to do something.
Why is this such a big issue? AIs are calculating what information they deem we should see. They're filtering the world around us, to the point where they have filtered us into a bubble, where echoes of what we already believe are amplified and take up all the room, leaving none for other beliefs or points of view to enter.
But it's not all the AI's fault. We program a model to "figure out the most optimal, efficient, accurate, repeatable and scalable answer to a question". But optimal doesn't equal right, efficient doesn't mean worth doing, accurate leaves out flexibility, repeatable increases generalization, and scalable increases the impact. It was built to run on this basis. Who is asking the questions, what questions they are asking, and what they are doing with the answers are also at the center of this issue.
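As a rough illustration of how that optimization plays out in a feed, here is a toy sketch, with invented topics and interaction counts, of ranking purely by predicted engagement.

```python
# Toy sketch of a feed optimised only for predicted engagement.
# All topics, counts and posts are invented for illustration.

# What the user has engaged with before (the data the model feeds on).
past_interactions = {"politics_left": 14, "cats": 9,
                     "politics_right": 0, "science": 1}

candidate_posts = [
    {"id": 1, "topic": "politics_left"},
    {"id": 2, "topic": "politics_right"},
    {"id": 3, "topic": "cats"},
    {"id": 4, "topic": "science"},
]

def predicted_engagement(post):
    # Optimal and repeatable: more past clicks on a topic means a higher
    # score. Nothing here asks whether the post is true, useful or balanced.
    return past_interactions.get(post["topic"], 0)

# The feed is simply the candidates sorted by predicted engagement.
feed = sorted(candidate_posts, key=predicted_engagement, reverse=True)
print([p["topic"] for p in feed])
# -> ['politics_left', 'cats', 'science', 'politics_right']
```

The view the user disagrees with sinks to the bottom of the feed every single time, which is exactly how an "optimal" answer to the wrong question builds a bubble.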
At the end of the day, AI is a tool. A tool trying to provide us with one of the most basic human needs: connection, a feeling of belonging. Yet the users the AI was meant to connect are being isolated more and more, while each user remains unaware of how every like, comment or share plays into that isolation. The surveillance-capitalist, profit-at-all-costs business model changed the technology's true purpose, which is to help us build a better world, injecting it with persuasive techniques comparable to casino and slot machine tactics designed to create addiction. It's too easy to blame only the technology when it is merely magnifying deep-rooted issues of all types, shapes and forms that have long been lying under the surface. We will talk more about this in the third part of the series.
To many of you, most of this is not new information, and to some of you it is, but I want to leave you all with this thought: are you in a bubble?
“Personalized filters play to the most compulsive parts of you, creating ‘compulsive media’ to get you to click things more”
– Eli Pariser