Facebook curators were instructed not to let pieces critical of Facebook trend on Facebook? gizmodo.com/former-faceboo…
Claim: Facebook injected Syria and BLM content. News feed algorithm suppresses non-cheery news, curators inject it?
Look, Facebook *caring* about the content it promotes is a good thing. I think it should care, and care a lot more.
My point is NOT that it's bad for Facebook to care about content but that 1-Facebook (and Google) concentrate power; 2-They are not open about this.+
3-That many will like their current choices is no guarantee of anything; 4-That they censor news about Facebook is indicative of the issues;+
5-Facebook & Google have enormous power to shape the public sphere. They should grapple with this but also be open & more accountable.
On the right, "Former Facebook Workers" starts trending on Twitter. Will it trend on Facebook at all? Anyone?
This, plus machine learning entering this space very fast: we don't even directly understand much of it anymore. twitter.com/drewconway/sta…
The point isn't that the past was some unbiased paradise and the future is horrible; it's that we've entered a new regime with new concentrated powers.
Facebook feed algorithm sold as "what you want" but it structures the experience. FB trending is sold as algorithm but is FB's preferences.🙄
Meanwhile, Google still defines "algorithm" as computer programs that give you "exactly what you want."
I'm teaching an algorithms, big data and ethics class this Fall. I am so not writing the syllabus till August. Stuff comes out every week.😏
BTW I *do* wish there were a better term than algorithm to mean "complex and opaque computation of consequence". Language does what it does.
FB: "Trust us." Ok. Define "hoax", "sufficient source", "coherent", "effective" & "neutral". facebook.com/tstocky/posts/…
For decisions like "we show you what's most interesting to you," there's no right answer, hence no escaping values—editorial or computational.
Algorithms for, say, flying a plane, are different than algos to make a subjective decision. It's "no one right answer" all the way down.
You see it in the Facebook VP's answer: they want to retreat to "we're neutral and effective." Doesn't work too well in human affairs.
I don't mean "neutral" and "effective" don't work for Facebook. They don't work for human affairs. Welcome to your business model, Facebook.
I'm not asking why Facebook isn't neutral, surfacing only non-hoax news from sufficient sources. I'm saying that will always be contested.
Hmm. People can focus on "trending"—though it means little on Facebook—but not on the powerful newsfeed algorithm. twitter.com/CNBCnow/status…
Easier to worry about "Facebook editors selected this"—it has a journalism analogy—but harder to think about complex computational filters.
Look, every journalist and editor and NGO and activist out there is thinking HOW DO I MATCH MY STUFF TO FACEBOOK'S ALGORITHM? Trending? Meh.
Another editorial decision by a giant platform. It's a good decision but an editorial one. Shows Google's power. twitter.com/JedBracy/statu…
Seems Facebook stopped making trending topics purely algorithmic after criticism over its lack of Ferguson coverage. theguardian.com/technology/201…
I don't think Facebook bringing editorial or quality control into the mix is a bad thing. I DO think it's a hard problem, esp. given scale.
I do think the FB algorithm has a huge shaping power with a lot of effects—some intended, some not, some unforeseen, some from $$$ motives.
My criticism is this: Facebook is now among the world's most important gatekeepers, and it has to own that role. It's not an afterthought.
That Facebook mediates interactions within our social networks & flow of news is why it's such a huge, profitable company. That's its core.
What makes Facebook successful is that gatekeeping role. I'm sure many Facebook engineers are super brilliant but that's not the core value.
Facebook tries so hard to recruit engineers and to make some blue box just right. But its core business is in the humanities & social sciences.
Facebook cannot escape the humanities since it can't escape adopting values, nor the social sciences since it makes ranking judgments about our social networks.
Look, these are hard questions. The issues Facebook needs to grapple with are what humanity has grappled with for… millennia. Seriously.
Worst response is to pretend these issues don't exist. They do. Attention is scarce. Values have consequences and are contested. And so on.
And Facebook is in the middle of it all; maybe affecting more people than any predecessor, and all this in just over a decade.