
The Facebook Algorithm Problem

Dec 02, 2020 · 53 min

Episode description

Algorithms, when you get down to it, are simple in concept. In practice, they can be really useful. Or, in Facebook's case, they can lead to a lot of problems.

Learn more about your ad-choices at https://www.iheartpodcastnetwork.com

See omnystudio.com/listener for privacy information.

Transcript

Speaker 1

Welcome to TechStuff, a production from iHeartRadio. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio, and I love all things tech. And I don't get to do this very often, so indulge me. But in Shakespeare's classic tragedy Hamlet, our protagonist and perpetually moody Dane says at one point there is nothing either good or bad,

but thinking makes it so. What he means there is that, at least in his eyes, all things in the universe are just as they are. They are neither good nor bad. It's only how we think about them that makes them good or bad. This is similar to a philosophy I've talked about on this show when it comes to technology. Technology in and of itself has the capacity to help or to harm. But the underlying tech isn't necessarily good or evil.

It's how it's put to use that makes all the difference. Now, granted, there are some technologies that are harder to use in a way that is a net positive, you know, like nuclear warheads, for example. Not that people haven't tried, they have, but you get my meaning. Well, one technology that has far reaching effects and consequences that can end up being positive or negative would be the algorithms underlying

stuff like search and social networks. Specifically, I'm going to be talking about Facebook in this episode, but algorithms underlie the various computer systems and online systems that we rely upon, and they have the potential to be harmful depending on their implementation. But before I start talking about the effects of algorithms, let's back up a little bit and define that. What exactly is an algorithm? Basically, an algorithm is a set of directions or rules, and that's

really it. There's nothing special about this definition. These rules guide how computers calculate operations and solve problems. It's a finite sequence of instructions for computers to follow when they're executing a specific task. These instructions are unambiguous, which means the computer system doesn't have to make judgment calls or anything like that while executing the instructions. There is a clear path to follow based on whatever the situation is

at hand. A different situation might lead to a different outcome, but the instruction path would still remain clear if you were to dig down deeply enough to see what was going on behind the computer's actions. The number of potential paths to various outcomes could be enormous, and so to us it would look like a really complicated system

and very difficult to grok. But if you really got granular, you would see the logic, as it were, that connects the beginning state to the end state of any given operation. At Google, search algorithms include elements that determine what position a particular query result will take on a results page. So when we search for something using an engine like Google, there are several things going on in that split second

before our results pop up. For one thing, Google has an index of all the web pages it has crawled, and so Google will reference that index to find all the entries in that index that relate to the terms that you put in your search query. And then those results are ranked by a few different criteria. So way up there is relevancy, like how relevant is each search

result to the original search query. If I were to search for Captain America, Google is going to put the pages about the fictional superhero way up towards the top. Search results about someone who happens to hold the rank of captain, where there's somewhere a mention of the region of America, would likely fall much lower on the search results list, because even though those pages will have relevant terms in them, the chances are it's not what I'm

actually looking for. Context is important, but context is just part of it. There are thousands of pages that mention Captain America. Some are from really good resources, meaning they are resources that are dependable, they're informative, they might be entertaining, you know, whatever. Some are from garbage sites that don't provide any real valuable experience. So then you've got the ranking system within Google that determines how the

most relevant search engine results will appear on the results page. Now, if there are thousands or even millions of hits, how does Google determine which ones show up on the first page of results? Well, there's a complicated system that determines that ranking, and it's not really in the realm of this episode to go into it, but suffice it to say that Google takes a lot of things into consideration

when ordering search results. Ostensibly, this is all in an effort to give you the most relevant and helpful results based on your search query. For the people, companies, and organizations that are making pages, it's always a goal to rank really high on that list, preferably on that first page, and if you can, the top half of that first page, because most people will only look at the first few search results and then they'll ignore everything that comes afterward.
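The basic ranking idea described here can be sketched in a few lines of Python. This is a toy illustration only, not Google's actual algorithm: the scoring function below is a deliberately crude stand-in (counting query terms) for the many relevance signals a real search engine weighs.

```python
# Toy sketch of search ranking: score each page against the query,
# then sort so the most relevant results appear first.

def relevance(query_terms, page_text):
    """Crude relevance proxy: count occurrences of the query terms."""
    words = page_text.lower().split()
    return sum(words.count(term.lower()) for term in query_terms)

def rank_results(query, pages, top_n=10):
    """Return page titles ordered from most to least relevant."""
    terms = query.split()
    scored = sorted(
        ((relevance(terms, text), title) for title, text in pages.items()),
        reverse=True,
    )
    return [title for score, title in scored[:top_n] if score > 0]

pages = {
    "Captain America (superhero)": "Captain America is a fictional superhero "
                                   "and Captain America appears in comics",
    "Harbor log": "a captain sailed to America once",
    "Gardening tips": "grow tomatoes in spring",
}
print(rank_results("Captain America", pages))
# The superhero page outranks the page with one incidental mention,
# and the page with no relevant terms is dropped entirely.
```

A real engine layers source quality, freshness, and many other signals on top of relevance, but the score-then-sort skeleton is the same.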

Over the years, Google has made it really tricky to game this engine, which means it's hard to come up with effective strategies to show up higher in search results. Your best bet is really to create just really good resources around the particular topic of interest in question. And also, you know, happen to be a really respected resource, which

is not easy to do, especially if you're just starting out, obviously. Also, whenever Google makes changes to this algorithm, it can drastically affect how your page appears in search results, even with a perfect query. And when TechStuff was still part of HowStuffWorks, that was something we were always concerned about. A change in the Google algorithm would mean that our web traffic could either skyrocket or it could plummet. There was no change in the quality of our work.

Our work was just as good as it had been before. It would just be a change in how our work would show up in search results. However, that's the Google algorithm. With other algorithms like the ones on Facebook, those can be easier to game if you know what you're doing. And what's more, there are algorithms on Facebook that allow for advertisers to engage in some discriminatory and potentially criminal behavior.

So we'll look at how the various algorithms on Facebook can have a negative impact because of the way they are implemented and because of the way that people are taking advantage of them. Now, in the beginning, Facebook didn't have an algorithm at all. Heck, it didn't even have a news feed when it first started. Instead, you could post stuff to your profile, but then you could visit the profiles of friends to see what they had posted. You could leave comments on stuff, but it still required

you to actually go over to your friend's profile. There was no aggregator for content just yet. That changed in two thousand six, when Facebook introduced the News Feed. Now your homepage on Facebook had a record of your friends' activities, just listed in reverse chronological order, you know, the most recent things appearing at the top of the list. You could log in and see what your friends had been up to since the last time you were on and

you could catch up. There's still no algorithm to speak of yet. And, as would become a hallmark of any change made on Facebook ever, a lot of users objected to the News Feed, and the main objection was that people felt it was a violation of privacy. They didn't like the idea that the activities they were doing on Facebook were being broadcast to all of their friends. Which, okay, this one stumps me a little bit, because the profiles on Facebook were public unless you set privacy features

to make it otherwise. So if you changed your relationship status on your profile, for example, why would you object to it being broadcast to your friends? If anyone visited your profile, they would see the difference. So either you don't change it, or you change the privacy settings on your profile. But otherwise, why would you be upset? I think the main takeaway is just that people hate change, and if you don't want something broadcast to everybody, maybe

you shouldn't post it. A couple of years later, Facebook introduced the first iteration of their algorithm, which would determine what users would see in their news feed and in what order those things would appear. So rather than getting a full account of all of your friends' activities in reverse chronological order, users would see a selection of stuff that the algorithm had determined was the best for that user.

So how did that work? Well, generally speaking, the algorithm took three main criteria into account to select the stuff that would show up on news feeds. One was user affinity, which was an indicator of how close the relationship is between one Facebook user and another, or a Facebook user and a brand. And this was measured in one-way perspectives, and that can be a little confusing. So let's make an example. We'll take three fictional people to explain

how this affinity worked. So in this example, we've got Sam, Dean, and Castiel. Okay? So Sam and Dean are super close, right, and they often comment on each other's posts, and they'll hit like on each other's statuses and all that kind of stuff. So they tend to see one another's posts a lot in their own news feeds when they log onto Facebook. They each have a high affinity for each other. Then we've got Castiel. So Castiel is friends with both Sam and Dean, but really he's much closer to Dean

than he is to Sam. And Castiel responds to a lot of Dean's posts, and as a result, Castiel sees Dean's stuff pop up in his, that is, Castiel's, news feed pretty regularly. Castiel is not as close with Sam, and he doesn't interact with Sam as much. So Sam's stuff can pop up in Castiel's news feed, but it's not as frequent or as sure a thing as when Dean posts something. Now, on top of that, Dean likes Castiel, but you get the feeling that this relationship is not

a perfect two-way street. So Dean only interacts with some of the stuff that Castiel posts online. So Dean sees Castiel's posts on occasion in his news feed, but not nearly at the same frequency at which he sees Sam's posts in his, that is, Dean's, news feed. And yes, I just summarized the show Supernatural in an example about Facebook algorithms and affinity. Now, as we see, a high affinity one person has for another helps determine how much of that other person's stuff the first person is going

to see. But it only works in that one way. If I like your posts a lot, then I'm going to see more of your posts in my feed. But if you never interact with me, you're not going to see a lot of my stuff in your feed. That's what I mean by that one-way perspective. Of course, there are other ways to weight this. Like, you can tag someone in a post, which will definitely get their attention much more effectively. But let's move on. So, for the EdgeRank factor, we had

user affinity as component one. Component two is the weight of the post, w-e-i-g-h-t. So different kinds of posts have different weights, and the greater the weight, the more likely it will pop up on a given friend's feed. So, generally speaking, with EdgeRank, stuff that had photos or videos had the most weight,

followed by posts that contained links, followed by plain text posts. Now, the reason for this is that Facebook analysts had figured out that posts that had stuff like photos and videos would tend to get more engagement. I'll get back to engagement a bit later in this episode. Just know that it is a critical concept for Facebook. The third component factored into EdgeRank is called time decay, which essentially

translates to how old is this post? And the idea here is that the more recent posts are likely more relevant than older posts are. It doesn't do you any good to see the announcement of an upcoming get together if in fact that get together actually happened last week, so you're not likely to see the initial post pop up at the top of your news feed if you've already missed everything. As the post ages, the value of

the post decreases. So when you log into Facebook or open the Facebook app or whatever, this algorithm would run with your particular situation in mind. It would scan through all the stuff that had been posted by the friends you have and the brands you follow since your last session, and then it would select and rank or order those status updates that you would see in your news feed based off of those criteria. But this would also mean

you would miss out on stuff. No longer would you be able to see everything that all of your friends were doing, listed in reverse chronological order. In fact, at times, chronology didn't have anything to do with the order in which status updates would appear on your news feed. A post several hours old might appear above one that was posted just five minutes before you logged in, all because of those other factors. For some Facebook users, including me

at the time, this was really frustrating. It made us feel as though we were missing out on stuff our friends were saying, because we were. I found it particularly irritating because I wasn't one to frequently engage with updates. I didn't like a lot of them, and I wouldn't comment a ton of the time. I would read everything, but I wasn't interacting with it. So the algorithm wasn't getting a really good idea of who my good friends were versus

like, people who were my acquaintances. So it meant that some people who were my closest friends in reality were not the ones that I was seeing, you know, being active on Facebook. Even if they were, I just wasn't seeing those posts. And my case was not unusual. A lot of people went through this. And then I missed stuff, sometimes big stuff, like an engagement announcement or something.
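The three EdgeRank components boil down to one score per post: roughly, affinity times weight times time decay. Here's a minimal sketch of that idea in Python. The specific weight values and the 24-hour half-life for decay are invented for illustration; Facebook never published its actual constants.

```python
# Minimal sketch of EdgeRank: score = affinity * weight * time decay.
# The weights and the 24-hour half-life are illustrative guesses only.

POST_WEIGHTS = {"photo": 3.0, "video": 3.0, "link": 2.0, "text": 1.0}

def time_decay(age_hours, half_life=24.0):
    """Older posts count for less; here a post's value halves every 24 hours."""
    return 0.5 ** (age_hours / half_life)

def edgerank_score(affinity, post_type, age_hours):
    """affinity: 0..1 closeness between the viewer and the poster."""
    return affinity * POST_WEIGHTS[post_type] * time_decay(age_hours)

# Sam (high affinity, 0.9) posted plain text an hour ago; a distant
# acquaintance (affinity 0.1) posted a photo two hours ago.
sam_post = edgerank_score(0.9, "text", age_hours=1)
acquaintance_post = edgerank_score(0.1, "photo", age_hours=2)
print(sam_post > acquaintance_post)  # True: affinity outweighs post type here
```

In a feed, you would compute this score for every candidate post since the user's last visit and sort descending, which is exactly why low-interaction friendships like the one described here got buried.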

EdgeRank was just the beginning, and actually that particular algorithm didn't stick around that long in the grand scheme of things. By late twenty thirteen, no one within Facebook was really using the term EdgeRank anymore, because they had moved towards a machine learning based algorithm. This one was far more sophisticated than EdgeRank. It was not limited to just those three main factors, although they were

still components within the algorithm. Instead, according to Lars Backstrom, who was the engineering manager for News Feed ranking at the time, this new one used somewhere in the neighborhood of one hundred thousand weights to determine what users would see and in which order. Now, you would think that with that many weights it would be hard to game the system, but you'd be wrong. Now, a couple other things I want to mention before we go to break.

One of those is that, yes, there was an option, and still is, to try and list things in most recent order as opposed to what is most relevant, or whatever Facebook thinks is most relevant. Except that you're not really seeing everything that way either. You're not getting the most recent reverse chronological list of all of your friends' activities. It's still a selection of what your friends are posting, and it's a selection that the algorithm has determined is the

most relevant for you. It's just that they'll be listed in reverse order instead of kind of that haphazard, all over the place method. But the fact is, you're still not seeing everything, which still frustrates a lot of people. And when we come back, we'll talk about the algorithm, how people have used creative approaches to game the system, and some unintended consequences that algorithm has led to. But

first, let's take a quick break. So we're at the point where an algorithm starts to serve up content catered to each individual Facebook user, and the goal was always the same from the very beginning: keep people engaged, which really just means keep people on Facebook for longer. And this gets to Facebook's method of generating revenue. Facebook's primary

source of revenue is advertising. And when I say primary, according to Investopedia, advertising accounts for the vast majority of Facebook's revenue, which for Q three of twenty twenty was twenty one point five billion dollars. Holy cow. Princely sum doesn't even begin to cover it. Facebook sells ad space to various companies to serve up advertising to Facebook's users. So essentially, think of it like a billboard. Facebook is like a giant billboard, but

it's a very special billboard. It's incredibly valuable. There's another component to this I'll touch on later, which revolves around targeted advertising, but let's just stick with the basic value proposition here. So Facebook has space on its apps and its pages, and in that space it has the ability to show ads. They could be on the side, they can be fed into the news feed as well, so they're kind of interspersed along with the status updates

from friends and stuff. The longer that people use Facebook per any given session, the more ads they will encounter as they interact with Facebook, which also means that they could potentially interact with the ads. So part of what Facebook really needs to do is to find ways to convince people to spend as much time as possible on Facebook. In this use case, time really is money.

Greater engagement doesn't just mean more opportunities to see ads. It also means that Facebook can demonstrate its own value to advertisers and thus command higher prices. To put it another way, let's say you own a few billboards around town, since I've already mentioned them. So let's say you have a billboard that is on a back road that gets just a little bit of traffic every day, and you happen to own another billboard space which is on a much more heavily used road that goes right into

the heart of town. The rates that you would be able to charge for the back road billboard are probably gonna be a lot lower than what you can command with the heavily traveled thoroughfare. More folks are going to see that second billboard, which makes it more valuable. On a similar front, if Facebook can show numbers that indicate that more than two billion people are spending more and more time on Facebook, it can command higher prices for advertising.

Ad companies and their clients want to be where the eyeballs are, and Facebook fits the bill nicely. So with that in mind, the real purpose of the algorithm ultimately is to serve up stuff to users that will keep them on Facebook longer and convince them to come back to Facebook frequently. Now, some of the things that will bring people back fall in line with a few of the

ideas that EdgeRank demonstrated. If you logged onto Facebook but you only ever saw posts from people you only kind of sort of know, and you always missed out on the posts from your close friends, you'd probably not find it a very satisfying experience. So at least some of the stuff you'd be seeing would fall in line with the stuff you'd really want to see in the first place. But maybe there would be some posts you would really like to see, but the algorithm, for whatever reason, doesn't

judge those as being particularly engaging. Well, the chances are more slim that you would see those posts. If you've ever posted something to Facebook that you thought was particularly important or impactful, but you didn't see very much reaction to it, there's a chance that not many people actually saw what you posted, So that's a real possibility. Of course, there's also a chance that they did see it, they

just didn't care. It's kind of hard to say for any given post, but it's entirely possible for you to post something that people would have reacted to. They just never saw it, so they never got the chance to. But let's say someone in your circle posts something and it starts to catch on. Maybe it's someone in your circle that you don't interact with much, so usually you

don't see very much from them on Facebook. But for whatever reason, this particular post that they made is getting a lot of interaction from other friends, which means the algorithm might say, hey, this post is really taking off a lot of people are interacting with it, so I'm going to serve it up to even more people, including you,

who usually wouldn't see such things. From this particular distant friend. So, in other words, posts that get a lot of engagement, that being reactions and comments and shares and that kind of stuff, they tend to rise above other posts. Engagement doesn't have to be positive either. It's not really important to Facebook if the stuff you see makes you happy or unhappy, or if the stuff you see is true

or if it's false. What matters to Facebook is that you spend as much time on the platform as they can coax from you, and that you feel the need to return to Facebook frequently. Maybe it turns out you rage post on Facebook a lot. Maybe you see stuff that you vehemently disagree with, so you post lengthy comments, or you share a post you don't agree with, but you do so in an effort to put your own spin on the thesis of the original post, or to

criticize it in some way. Your motivations or goals are not important to Facebook. The company could not care less. The platform just wants your time. Well, that's being a bit too reductive. Facebook also wants your posts to inspire other people to engage with their own comments, likes, and shares, so that it gets their time too. That's really it. They want everyone's time, not just yours. I'm sure most

of you already see where this becomes an issue. As we've heard for a few years now, Facebook has become a hotbed for misinformation and disinformation, something that Facebook founder Mark Zuckerberg has tried to avoid addressing for years. There's a saying that's frequently misattributed to Mark Twain that says a lie can travel halfway around the world while the

truth is still putting on its shoes. I think it's kind of fitting that Twain is often cited as the originator of the saying, but the truth of the matter is that it predates him. It might have been the product of Jonathan Swift, but we don't really know. But the saying is apt. Misinformation is not saddled with the heavy burden of being, you know, true. Misinformation can tap

into biases and preconceptions. It can reinforce the beliefs we hold, whether they're positive or negative, even if those beliefs have no real basis in reality. And if we feel that our beliefs are validated, we're more likely to engage with and share those posts, and to feel reinforced to do more, to continue sharing that kind of stuff. Now,

I've seen examples of this across different ideologies. The narrative we frequently see here in the United States is how right wing misinformation will spread across Facebook, and that certainly does happen, but I've also seen it happen with people who have a left leaning ideology. For example, if a post indicates that someone with a right wing philosophy has done or said something terrible, that can spread rapidly

across Facebook. But often you can do a little fact checking and find out that there's no actual evidence that such a thing was ever said or done by that person. I've seen this happen multiple times, So what I'm getting at is that no one is immune. That being said, certain groups, particularly those catering to more right wing audiences, have really pounced on the opportunity more than others have. And again I'm not saying left leaning organizations and people

haven't done the same thing. They just haven't been as fast to jump on it, and it's less common; they don't do it as much. Right now, I think it's pretty fair to say that no ethical organization or person would want to perpetuate misinformation or create disinformation. This is a form of manipulation, a way to get people to support something without having to go to the trouble of actually giving evidence that that point of view is the best one.

If you're using lies to get people to support you, that's, you know, not great. But on Facebook that sort of thing can really take off, and as a post gets traction with people sharing it or commenting on it, or reacting to it, whether positively or negatively, Facebook will start serving that content to progressively larger audiences. And you can

think of this as like ripples in a pond. That initial circle of a ripple that's really small, right, But let's say that that circle represents a certain group of users and they react very strongly to that content. Well, Facebook will then expand that to the next circle outward to affect a larger group of users, and if the

trend continues, it just keeps getting bigger and bigger. Now, while this has been an issue with Facebook for ages, we really saw it come under scrutiny in twenty sixteen, when it became clear that Russian operatives were leveraging Facebook to manipulate Americans and spread disinformation, particularly when it came to politics. Complicating matters are fake Facebook accounts, also known as bot accounts.
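The ripples-in-a-pond effect described above, where strong engagement in one circle earns a post a push to the next circle outward, can be modeled as a simple threshold loop. Every number here is invented for illustration; Facebook's actual promotion logic is proprietary and far more complex. The sketch also shows why bot engagement matters: fake reactions inflate the measured engagement rate, and the rate is all the loop sees.

```python
# Toy model of engagement-driven amplification ("ripples in a pond"):
# if engagement in the current audience clears a threshold, the post
# is shown to a larger ring of users. All numbers are invented.

def simulate_reach(engagement_rate, audience=100,
                   threshold=0.05, growth=3, max_rings=6):
    """Final audience size, assuming the engagement rate holds per ring."""
    for _ in range(max_rings):
        if engagement_rate < threshold:
            break              # not enough reactions: the ripple stops
        audience *= growth     # push the post to the next ring outward
    return audience

# An engaging post keeps expanding; a dull one never leaves its first ring.
print(simulate_reach(0.10))  # 100 * 3**6 = 72900 people reached
print(simulate_reach(0.01))  # stays at the initial 100

# Bots inflate the measured rate: 40 fake likes on a post that only one
# real person liked looks like a 0.41 rate to the system, not 0.01.
print(simulate_reach((1 + 40) / 100))  # amplified anyway
```

The takeaway is that the loop never asks whether the post is true or whether the likes are human; it only checks whether the numbers cleared the bar.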

These accounts don't represent actual users, but rather are part of larger systems designed to amplify the signal of certain messages. Every content creator out there would love for every post they make to go viral, that is, to become the sort of thing that tons of people see and share with others. Making viral content is pretty darn hard to do on purpose. Often the stuff happens organically, then it becomes difficult or even impossible to recapture that same magic

on subsequent efforts. I'm sure you can think of lots of famous viral videos from people that never seemed to pop up again afterward, or if they did, it didn't have nearly the same effect. There are countless examples of that. But by creating lots of fake accounts, you can amplify a message. You can get that effect of a viral breakout because you're manipulating the system. You're using bots that are either all blasting out the same post at the

same time, which is actually really easy to detect. You can often see this on Twitter, where you'll find the exact same message posted by a bunch of seemingly disconnected Twitter accounts, which indicates they're all bots that are coming out of the same source, or more likely, you would have the bots interacting with a post and sharing that post and liking the post, thus amplifying the post's importance on Facebook in general and making it more visible as

a result. Now, the visibility is one thing, but that interaction also creates a perception among the real humans who are using Facebook that that original post must have merit. If you look at a post and you see that it has thousands of reactions connected to it, you're more likely to pay attention and perhaps even take that post to heart. You know, it has a certain cachet, that social acceptance of a piece. And by

lots of fake accounts, I really do mean lots. Last year, in twenty nineteen, Facebook announced it had shut down more than three billion fake accounts over the course of half a year. Three billion in six months. Even after that, Facebook estimated that somewhere around five percent of all accounts still on Facebook were actually bots. Considering Facebook has more than two billion users, five percent represents a lot of fake accounts. Now, this doesn't mean that every user is going to see

the posts that get that kind of engagement. The algorithm is taking a lot more into account, for example, the types of pages you like on the platform. That matters. If you like a lot of pages that have a more left leaning slant, you know, stuff that falls into like progressive politics, that category, you're less likely to see really hard right leaning posts as a rule, and in

this way, Facebook can become a true echo chamber. The platform again wants to keep you there as long as possible, as many times as possible, so the algorithm is prioritizing content that it calculates has the best chance of keeping you on Facebook. If they drive you away, they've done the wrong thing, at least as far as Facebook's revenue

generating plan goes. Facebook's crackdown on fake accounts was part of a larger effort to address the problems with misinformation that were growing concerning enough for the US Congress to call Zuckerberg to Washington, D.C. to answer questions about

fake news and the like. Zuckerberg had long maintained that he didn't want Facebook to get into the business of arbitrating which bits of information were legit and which ones represented misinformation. It's kind of this idea of the free market of ideas, where all ideas are given equal platforms. And you can sort of understand his reasoning from a business perspective, because all that engagement really meant money for

his company. But with Facebook under scrutiny and criticism, there was a very real danger of the platform being seen as a toxic place for advertisers. Companies that aren't too keen on getting looped in with content that falls in with radical ideology were getting nervous. And in fact, in twenty twenty we've seen advertisers boycott Facebook for various reasons, some of which related back to Facebook's failure to address things like groups that are radicalizing individuals, that sort of stuff. So

it does have an impact. Shutting down bot accounts was one way to diminish the spread of misinformation on Facebook without actually taking the additional step of Facebook becoming a judge of what is and isn't legit information. But that wasn't quite enough to satisfy critics, who pointed out that people were using Facebook to spread false narratives and to mislead people and to manipulate large populations, while Facebook meanwhile was just raking in the profits from all of that.

In fact, in a recent article in The New York Times titled On Facebook, Misinformation Is More Popular Now Than in 2016, reporter Davey Alba shared research from the German Marshall Fund Digital, which found that people were sharing, liking, and commenting on three times as many articles from outlets that regularly published misinformation than they were back in twenty sixteen. So

we're actually seeing more of it today. That's not a great trend there, and it's another indication that Facebook is reluctant to intervene in this. And as Karen Kornbluh, the director of the research firm, would say, counteracting this trend, quote, just runs against their economic incentives, end quote. When we come back, it gets worse. But first, let's

take another quick break. So I mentioned before the break that it gets worse, and earlier in this episode I alluded to the concept of targeted advertising, and this gets into another aspect of algorithms on Facebook that can be harmful. Okay. So the basic idea behind targeted advertising is really simple, and the goal is to get the right ads in front of the right people. The goal is always to increase the odds that someone is going to act upon

that ad. Otherwise, advertising is just throwing money away, right? So going back to my billboard example from earlier, there's only so much targeting you can do with billboards. Now, you might choose to put up an ad on a billboard that's in a part of town that most closely matches the demographic of your average customer, meaning the people that you cater to happen to live in a certain part of town, so it makes more sense to put your billboard in that part of town. But that's a

pretty primitive approach to targeted advertising. Facebook provides a laser focused, individual precision approach. Every user activity on Facebook gives Facebook more information about the user in question and what they like, which is obvious, but we need to start there. So the pages that you visit on Facebook, the posts you interact with on Facebook, the general information in your profile like your birth date, your location, your relationship status,

all of these are valuable pieces of information. You can actually go into Facebook's settings and the ad preferences page and you can see what Facebook has deduced about you. You know, the company will list out what it thinks your interests are, so you can see, oh, this is why I'm getting all of those ads for such and such. It's because Facebook thinks I'm really interested in that. For example, when I was first getting into exercising, which I really

need to get back to, Facebook would serve me up all these ads for things like muscle enhancement and protein shakes. There were all these insanely buff dudes popping up in my Facebook news feed, and I was like, well, you know, good on you guys. You've done some great work. But that's not really my jam. But anyway, that is one way that Facebook starts to build out information on you. But that's just the tip of

the iceberg. Facebook also has marketing partners, a lot of marketing partners, and these partners are also collecting information about the people who are visiting their pages and their habits, including stuff like what items people might have purchased or perhaps just searched for or looked at. Like think of something that you've searched for, maybe on a site like Amazon, and you're looking at stuff you haven't pulled the trigger

on buying anything yet, but you're just kind of comparison shopping. Well, these sorts of partners share that data back with Facebook, and then Facebook can leverage that information and target specific ads to you based on what you've been doing off

of Facebook in other parts of the web. So if you've ever done a search for something like an air fryer and then noticed that whenever you go to Facebook you're seeing ads for air fryers and other kitchen gadgets in your news feed, that's what's going on. Now, I saw this happen a lot when I was shopping around for a guitar. Every time I would log onto Facebook, I would see a ton of ads for various music shops, some of which were clearly scams. There's a lot of

that on Facebook. Maybe you'll see an ad for something that looks like an incredible product, some phenomenal, amazing mask or costume piece, or a special kit that seems to be catered right to your interests. And more than that, it's at an unbelievably low price, like it's gonna be forty bucks. You're like, wow, that would normally be hundreds

of dollars. Well, this falls into the general category of if it sounds too good to be true, it probably is. A lot of these company pages are really just fronts for stores that sell cheap knockoffs, often with huge shipping costs because a lot of them are located in places like China. So you order the thing that looks incredible, you're like, wow, this would normally set me back a grand and they're asking for forty bucks. And then if you do get a product, it doesn't resemble what you

were promised. You end up getting a super cheap knockoff of whatever it was you ordered. But the expense of returning that product would be so much higher because of shipping costs that it would not make any sense to return it. You would still be out more money by returning the product, so you just kind of suck it up. But that's kind of a tangent. Let's

get back to targeting now. After some criticism, Facebook has made it possible for people to tweak their settings on the platform to limit targeted advertising. So you can change things so that the ads you get are more general and no longer hinge upon your identity or your behaviors. Not that Facebook isn't still collecting all of this information. They are; it's just that you're not seeing it in the

effect of the advertising that appears on Facebook. But it's not always easy to spot these settings, and a lot of people just never bother to even check into it. And that suits Facebook just fine, because remember when I said Facebook can charge more for ads because of their high engagement, the fact that people spend so much time on Facebook. That gets boosted even more when Facebook can tell an advertiser that they're able to put their ads in front of the people who are most likely to

respond to those ads. So not only are people spending more time on the platform, Facebook can tell the advertisers you want to target this specific person because they're the one most likely to respond to your ad. Now, let me be clear, this doesn't have to be a bad thing necessarily. I mean, you might end up seeing ads that are really relevant to you and they could be really helpful. Maybe you would see something that you wouldn't have come across otherwise. But it can also come

across as really invasive or creepy. But now let's go to something that is just a bad thing, period. See, targeting ads to get your most likely customer sounds fairly on the up and up, except the practice can also be discriminatory. For example, let's say there's a big company and they own upscale apartment complexes in different cities, and this company wants to advertise on Facebook because it's opening

up a brand new apartment building. And in the process, the company specifically opts to show the ad only to white people, because that's something Facebook used to allow advertisers to do. Facebook was allowing for racial discrimination in advertising. Now, if you took a bird's eye view of these practices, you would see some really ugly trends here. Ads for companies that were hiring fast food workers were disproportionately directed

towards black users. Ads for companies hiring cleaners or nurses disproportionately targeted women users. The practice was reinforcing disparities in the job market. It was perpetuating and heightening these disparities. It was in effect making things worse. And this practice first came to light way back in 2016, but it wasn't until 2019 that Facebook was dealing with a lawsuit and the company promised that it would

end the practice. The announcement also wasn't exactly transparent. In a post, the Facebook for Business division said that it was removing the option as part of its quote efforts to simplify and streamline our targeting options end quote, and that it would be quote removing multicultural affinity segments and encouraging advertisers to use other targeting options such as

language or culture end quote. Multicultural affinity segments was the euphemism the company used to essentially talk about racial discrimination. Now, according to a Facebook insider named Kion Lvy, team members within Facebook had been pushing for the removal of these factors for advertisers for more than three years. In addition, it's not the only discrimination that people have faced when

it comes to trying to do advertising on Facebook. One thing I didn't touch on earlier in this episode is how brands have been affected by the various tweaks to the news feed algorithm and the news feed itself. Brands can make pages, and users can visit the pages, and they can like those pages. And in the old days, if a business posted to their page, then that update would go out to a decent number of the followers

who had liked that page. But Facebook tweaked the news feed a couple of times, and each time it would affect the organic reach of a business's post. So, for example, there is a TechStuff Facebook page. I think. I mean, I quit Facebook a couple of weeks ago, so I assume it's still there, but it wasn't updated frequently. And I'll be up front with you why that is: because there wasn't much point to updating the

Facebook page due to the way the news feed works. Posting to the TechStuff page would mean only a small fraction of the people who were actually following that page would even see the update. So if a thousand people had liked the page, maybe only ten of them get the update. That's not a good return on investment. So ostensibly this was an effort to deliver a better

experience to users. You know, Zuckerberg always positioned this as saying the idea is they wanted the users to see more posts from the actual people they know, their friends and family, and fewer posts from businesses and media outlets, even though we should remember these were outlets and businesses that users had gone to and liked on Facebook, so presumably the posts from those entities were things that the users were

interested in the first place. But what was really going on here was that Facebook was creating a market for itself. Why give away free access to users when you can create an incentive for businesses to pay for promoted posts? And that's what was really going on here. Facebook had created a system where brands felt the necessity for creating a Facebook page, and you had to have a Facebook page. There are two billion people on Facebook, so you want to be where the people are, you gotta

have a Facebook page. But to reach a significant percentage of the people who had actually gone to that page and liked it, you would also have to pay for that privilege. Now, I'm not going to argue against Facebook's strategy here. It kind of stinks if you happen to run a brand, but from Facebook's perspective, I can understand the reasoning. So while it is tough if you're someone who's trying to reach your fans, from Facebook's perspective, I get the idea, I

get the business case. However, that discrimination that we would see directed toward users would creep in again, this time toward advertisers. There's a band called Unsung Lilly, which is an independent pop band. It's headed by a same sex couple, Frankie and Sarah Golding Young. These two were attempting to promote an album on Facebook this year, in 2020, because, like pretty much everyone in the world, the pandemic

changed all of their plans for the year. They wouldn't be able to tour, they wouldn't be able to market their album the way they would typically do, and so they decided to run an ad on Facebook. But they also knew that if they relied solely on posting the ad to the band's page, like if they just made a post on their fan page, the way Facebook serves up that kind of stuff means that they wouldn't reach nearly enough of their fans for it to make an impact. So they decided to do it as

a paid ad instead. So they shot an ad, they put it together, they edited it, they submitted it. They received a rejection, and under the reason why the ad had been rejected, it said that the ad contained inappropriate sexually explicit content. Except it didn't. What it did feature was an image of Sarah and Frankie leaning into each other with their foreheads touching, just an image of affection. There was nothing remotely sexual or explicit

about this photograph. Now, the rejection notice didn't specify that the image was the reason for the rejection, so they decided they would run a test to see if, in fact, that was the problem. First, they submitted an ad that had all the same content as the original piece, only with this one image swapped out for a quote unquote non romantic picture of the couple. Boom, Facebook accepts the ad. To follow that up, they also submitted another ad.

They still used the same video, the same copy, but they got a heterosexual couple making the same pose that they had made, with the two foreheads together, so it's a man and a woman, foreheads touching. They just used that to replace their own image. Boom, Facebook accepted that ad too. Okay, so I guess I don't have

to explain that this is a really bad thing. The fact that Facebook had deemed a heterosexual display of affection as perfectly acceptable and the same sex display of the exact same act of affection as offensive is beyond infuriating. The band contacted the American Civil Liberties Union, which facilitated some communication with Facebook. Facebook said, oh, no, no, no, this whole thing was a mistake. And besides, the actual objectionable part wasn't that photograph. It was because of some

of the dancing moves in the video. But again, Facebook accepted two versions of the video that contained the dancing; it just didn't have that photograph of Sarah and Frankie, which seems pretty fishy to me. Meanwhile, the company continues to face criticism that it fails to curb truly objectionable stuff like hate speech. So hate speech spreads easily on Facebook, but an ad that had a photograph of

two women with their foreheads together was offensive. There are studies that are also showing that black users are disproportionately more likely to have their accounts suspended by automated moderating systems than white users are, which is another bad algorithm example. And there are real concerns that the targeting criteria Facebook continues to use exacerbate socioeconomic disparities in the real world.
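To make the mechanics concrete, here's a minimal, hypothetical sketch in Python of how attribute-based audience filtering works in general. None of these names or attributes come from Facebook's actual ad tools; the point is only that the exact same include/exclude mechanism that powers innocent interest targeting becomes discriminatory the moment the filter key is a protected characteristic.

```python
# Hypothetical sketch of attribute-based ad targeting. This is NOT
# Facebook's real system: an advertiser specifies attributes to include
# or exclude, and the platform filters the audience accordingly.

def build_audience(users, include=None, exclude=None):
    """Return names of users matching every 'include' attribute
    and none of the 'exclude' attributes."""
    include = include or {}
    exclude = exclude or {}
    audience = []
    for user in users:
        attrs = user["attributes"]
        matches = all(attrs.get(k) == v for k, v in include.items())
        excluded = any(attrs.get(k) == v for k, v in exclude.items())
        if matches and not excluded:
            audience.append(user["name"])
    return audience

users = [
    {"name": "alice", "attributes": {"interest": "kitchen", "group": "A"}},
    {"name": "bob",   "attributes": {"interest": "kitchen", "group": "B"}},
    {"name": "cara",  "attributes": {"interest": "guitars", "group": "A"}},
]

# Benign use: target by inferred interest.
print(build_audience(users, include={"interest": "kitchen"}))  # ['alice', 'bob']

# The same mechanism turns discriminatory when the exclusion key is a
# protected characteristic (here a stand-in 'group' attribute).
print(build_audience(users, include={"interest": "kitchen"},
                     exclude={"group": "B"}))  # ['alice']
```

Note that the filtering code itself is neutral; the harm comes entirely from which attributes the platform lets advertisers filter on.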

None of this is good news. Facebook has been under intense scrutiny recently from various governments around the world that have all floated variations of the possibility that Facebook could face some really big consequences down the road, like getting

broken up. There are a lot of different entities that argue Facebook is a monopoly or is monopolistic, and it's pretty clear that Facebook in its incarnation as it exists right now really has some big problems to iron out, things that are affecting people in a truly negative way. These are all part of the reason why I left Facebook. I did so for my own personal reasons, and I'm not advocating that anyone listening to this just go out and shut down your account right now. I'm not saying

that. For me, that was the right decision. I needed to do it for my own mental health, honestly. So I felt like this was an important thing to talk about, because again, it's those algorithms that, at the end of the day, were really just designed to optimize the amount of time people would spend on Facebook and the number and quality, or at least the variety, of ads that they would see while they spent their time there. That

was really the only thing that mattered to Facebook. It was otherwise kind of distanced from any of the content stuff. Content was not at all the concern. The concern was how long can we keep them there, how many ads can we show them, how much money can we get

for those ads? That was really all that mattered. And now we're starting to see, when you disassociate the content from the approach, how that can have a profoundly negative impact on the world, and we're trying to get to the point where we figure out what to do next. I don't have the answers, other than I walked away.

So hopefully people at Facebook will start to come up with ways to approach the business from a more ethical standpoint that will have less of a negative impact on the people who actually use the service, whether it's the advertisers or the end users. I certainly hope so. If you found this interesting, let me know if you have suggestions for future topics on TechStuff. Whether it's a company, a trend in technology, maybe it's a person in tech.

Maybe it's a specific technology you've always wanted to know how it worked. Let me know. The best way to do that is clearly to go over to Twitter. Not that that's a perfect service, but that's a different podcast for a different time. But use the handle TechStuffHSW and I'll talk to you again really soon. TechStuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.
