> centralization has taken over online advertising, too
Footnoting for the nerds – this theme is explored further in Tim Wu’s The Master Switch.
> I think this might be why we've seen some smaller makers start working with retailers. But that also has its drawbacks, since adding another chain in the process either means higher prices for the consumer or lower margins.
It does both: adding a retailer decreases margins and it increases prices!
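To put rough numbers on that, here's a toy calculation; the prices, cost, and 30% retail cut below are made up purely for illustration, not figures from the thread.

```python
# Hypothetical numbers for a maker who currently sells direct.
cost = 150.0            # cost to build the knife
direct_price = 250.0    # current direct-to-buyer price
direct_margin = direct_price - cost          # $100 margin selling direct

retailer_cut = 0.30     # assumed retail cut of the sale price

# Option A: keep the shelf price and absorb the cut -> margin collapses.
margin_if_price_unchanged = direct_price * (1 - retailer_cut) - cost   # $25

# Option B: keep the old margin -> shelf price has to jump.
price_if_margin_unchanged = (cost + direct_margin) / (1 - retailer_cut)  # ~$357

# Most makers land somewhere in between, so the buyer pays more AND the
# maker keeps less -- "it does both".
print(direct_margin, margin_if_price_unchanged, round(price_if_margin_unchanged, 2))
```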
> Eh ML systems are surprisingly chaotic. It feels like they wasted a lot of money on the metaverse stuff, but to try and see round the corner on every change would be impossible and impossibly expensive, orders of magnitude differences here.
Well yes, but no.
As in, that's the line we keep being fed, but it's not really the point, and it's not really true. Facebook would not go out of business if they had to have some real accountability for their machine learning systems.
All the money isn't even going into making the systems better or the sites more workable; it all went into Zuck's vanity metaverse project. From my perspective, them having less money has been pretty clearly demonstrated not to matter.
Aren't we tired of all this "but the innovation!" talk? I've been hearing this **** my ENTIRE life and it's never felt more hollow than it does in 2023.
> Again at these scales you can't predict everything that algos like that might do. It's not intent and not neglect.
Intent and neglect are usually the two options on the table. You can be fired for either. Saying “I didn’t see your dog” before running it over doesn’t make it less dead.
The issues I see are:
1. No communication. People’s business depends on the platform. For shame.
2. A clear double standard between user content and the ads it allows to be served. On the same day.
It’s fine to manage content with an algorithm. Algorithms are human-designed, with intent.
Eh ML systems are surprisingly chaotic. It feels like they wasted a lot of money on the metaverse stuff, but to try and see round the corner on every change would be impossible and impossibly expensive, orders of magnitude differences here.
They do plenty of evil things, but expecting all content moderation at that scale to be mistakeless is not reasonable, I'm sorry. So they'll bias toward itchy trigger fingers, as that likely leads to fewer problems. I get it.
Again at these scales you can't predict everything that algos like that might do. It's not intent and not neglect.
#2 though, yeah - insta sucks. The double standards on ads, the fact they know their algo will feed vulnerable people content they find addictive but harmful to their mental health, etc. They're morally bankrupt, but I just don't think that has anything to do with what happened here. This was just the kind of thing that happens with automated moderation at this scale, and the fact it was seemingly rectified pretty quickly lends weight to that. There was no profit motive in what happened, there was no intent, and it isn't a realistic standard to say mistakes like this can't happen. If the makers had been deplatformed for good because insta couldn't be bothered to fix it, that would be a whole different story.
> tbh I worked in the ML industry as a DS/MLE for several years and I fundamentally don't buy the "we need unfettered progress" line, is all.
That's a product of capitalism and the need for constant, unending growth to satisfy the shareholders.
> It is 100% neglect. If you didn't test it, you neglected to do your job.
@JB1's point, with which I agree, is that these systems and algorithms have scaled to a point at which thorough testing is close to impossible, and tuning to prevent one kind of mistake without creating other kinds of mistakes in consequence is literally impossible. The fear of messing with a system like that is a huge driver.
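To spell out why that tuning problem is real, here is a minimal, self-contained sketch with invented numbers: a single score threshold can only trade false takedowns against missed violations, never remove both.

```python
import random

random.seed(0)

# Invented toy model: a classifier assigns each post a risk score.
# Legitimate posts (e.g. knife makers) and actual violations overlap,
# and that overlap is why no single threshold is mistake-free.
good_posts = [random.gauss(0.3, 0.15) for _ in range(10_000)]
bad_posts = [random.gauss(0.7, 0.15) for _ in range(1_000)]

def mistakes(threshold):
    wrongly_banned = sum(score >= threshold for score in good_posts)
    wrongly_allowed = sum(score < threshold for score in bad_posts)
    return wrongly_banned, wrongly_allowed

for t in (0.4, 0.5, 0.6, 0.7):
    banned, allowed = mistakes(t)
    print(f"threshold {t}: {banned} legit posts removed, {allowed} violations missed")
# Raising the threshold spares more makers but lets more violations through;
# lowering it does the reverse. You pick which mistake you prefer.
```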
It's a reason to avoid building systems at this scale, seeking instead to break up the problem and decentralize.
None of this is to let them off the hook morally. They've created a monster, and creating a monster so big you can't possibly control it is culpable behavior.
A second problem is the organizational personality. It's a commonplace to say that organizations tend to take on the personality characteristics of the CEO. In this case, the CEO is not really in tune with human beings. I don't think we have been visited by aliens, but if I started to believe we had, I'd assume that one of them was running Meta.
> tbh I worked in the ML industry as a DS/MLE for several years and I fundamentally don't buy the "we need unfettered progress" line, is all.
We're getting a bit more philosophical here and you're not gonna find me disagreeing.
I'm not suggesting all moderation be mistakeless, but I am suggesting that making FB do a LOT more would be reasonable. IDC that it would take money out of Zuck's pocket either. More accountability would be a great first step. Splitting up FB and Insta would be the right one, though.
It is important to understand the machine - it is designed to make money. So it comes down to a choice - does your post make them money? Yes - good. Yes, but could it be illegal somewhere in the world? Will it generate enough revenue to cover the potential compliance risk? Yes - it stays; no - it gets blocked. Meta does not care if blocking that post will destroy your business. All they care about is that they make their money.
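Sketching that cost-benefit calculus as code, purely as an illustration of the comment above (the function, numbers, and thresholds are my own invention, not anything Meta has published):

```python
# Hypothetical "keep it if the expected revenue beats the expected
# compliance risk" rule -- the machine the comment above describes.
def post_stays_up(expected_ad_revenue: float,
                  prob_illegal_somewhere: float,
                  cost_if_it_goes_wrong: float) -> bool:
    expected_risk = prob_illegal_somewhere * cost_if_it_goes_wrong
    return expected_ad_revenue > expected_risk

# A small maker's knife listing: tiny revenue, non-zero legal exposure somewhere.
print(post_stays_up(0.50, 0.02, 100.0))    # False -> blocked
# A big ad campaign with the same exposure: the revenue dwarfs the risk.
print(post_stays_up(5000.0, 0.02, 100.0))  # True -> it stays
```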
I agree with this. As much as I don't like them, this has clearly not been intentional. It is unfortunate, but that is different. To test something like this with a guarantee that there are no errors is so complex and costly that it is not practical, and no one in their right mind would ever do it, so expecting them to do it is unreasonable. Breaking them up, though, would be great, and it is unfortunate that IG was bought by FB in the first place. It is also unfortunate that IG is being used to sell stuff. I wish it went back to food pictures and such; then I'd be able to ignore it as I did before....
100% it's this. Some points. One, knifemakers make them some (small) amount of money & aren't illegal, so as I said, I doubt it was intentional. They did fix it pretty quickly.
Secondly, there are a lot of people who want to do illegal things, and they're incredibly resourceful and clever. Stopping illegal stuff is a constant losing battle. It's like trying to stop people doing illegal things on the road. It's why they're always biased heavy-handed. I've seen in a different situation how that happens, even with the best intentions (and I don't think they have the best intentions). I just don't think this time they did it on purpose, and I think the appeal -> human intervention -> algo learning feedback loop worked as intended.
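As a rough picture of what that appeal -> human intervention -> algo learning loop could look like, here's a self-contained toy sketch; the class and function names are hypothetical and don't correspond to any real Instagram/Meta system.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    status: str = "live"

def automated_filter(post: Post) -> bool:
    """Stand-in classifier: flags anything that mentions a knife."""
    return "knife" in post.text.lower()

def human_review(post: Post) -> bool:
    """Stand-in moderator: handmade kitchen knives are fine."""
    return "handmade" in post.text.lower()

def moderation_cycle(posts):
    appeals, new_training_examples = [], []
    for post in posts:
        if automated_filter(post):
            post.status = "removed"
            appeals.append(post)            # the maker appeals the takedown
    for post in appeals:
        if human_review(post):
            post.status = "restored"
            # the overturned call becomes a labeled example, nudging the next
            # model version away from this class of false positive
            new_training_examples.append((post.text, "allowed"))
    return new_training_examples

posts = [Post("Handmade chef knife, ready to ship"), Post("Untraceable combat knife, DM me")]
print(moderation_cycle(posts), [(p.text, p.status) for p in posts])
```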
> tuning to prevent one kind of mistake without creating other kinds of mistakes in consequence is literally impossible. The fear of messing with a system like that is a huge driver.
Footnoting for the philosophy-of-systems-engineering nerds
> Footnoting for the philosophy-of-systems-engineering nerds
For those that don't want to sign up:
The Lessons of ValuJet 592
As a reconstruction of this terrible crash suggests, in complex systems some accidents may be "normal"—and trying to prevent them all could even make operations more dangerous.
www.theatlantic.com
> Society pooed their pants over Baby It's Cold Outside being "rapey" while, literally, at the same time, WAP by Cardi B was the no. 1 song. Cancel culture is out of control.
This seems at best... Tangentially related.
> This seems at best... Tangentially related.
What do you mean? You mean two songs that are unrelated in agency have nothing to do with instagram and knives? Who’da thunk.
Society pooed their pants over Baby It's Cold Outside being "rapey" while, literally, at the same time, WAP by Cardi B was the no. 1 song. Cancel culture is out of control.
> Not sure those two songs have much to do with each other. “Baby it’s cold outside” is a frequently played holiday family favorite that has come under fire recently (rightly so imo) for being a little creepy. WAP is a popular song about consensual sex. There’s a consistent philosophy at work here: sex is ok, nonconsensual sex is not.
WAP is disgusting and the perfect example of what is wrong with society. Stop being part of the problem, and do better. If you take baby it's cold outside in the context of the time period it was written, it's to the standard. I'm not disagreeing there are overt undertones, but sexually explicit lyrics are not appropriate to be played on the radio or in any other form, unless you are over the age of 18, just like porn.
> If you take baby it's cold outside in the context of the time period it was written, it's to the standard.
This is a really weird argument for continuing to play this song on the radio. I mean, in the time the song was written nonconsensual sex was often condoned/ignored…. Now many people have a different understanding about it and therefore are saying the song shouldn’t be played.
——
I’ve never heard WAP on the radio, and I have no idea how often it’s played on there, or how much it’s censored when it is. I would hope it’s not played in department stores. I certainly don’t plan on playing it for my kid, but I’m also not freaked out that he’ll inevitably hear it at some point in his life.
I condone this take.
> WAP is disgusting and the perfect example of what is wrong with society. Stop being part of the problem, and do better.
WAP is great and is a celebration of sexual empowerment and agency for women. It challenges traditional gender norms and promotes the idea that women can be assertive about their desires.
I'm looking forward to the audiophile discussions 50 years from now.
“This amplifier just makes the expletives in classics really come alive, without ever making them overpowering.”