How Facebook Helped Donald Trump Become President
“I hear fearful voices calling for building walls, and distancing people they label as ‘others,’” Mark Zuckerberg said. It was a veiled dig at Donald Trump.
Facebook had a bigger mission for its blue-hued network: “We stand for connecting every person. For a global community, for bringing people together, for giving all people a voice, for a free flow of ideas and culture across nations.”
It’s a noble goal, yet like so much in Silicon Valley, Zuckerberg’s idealistic view of his social network borders on naive.
Technically, yes, Facebook connects people. Realistically, it has helped divide family, friends and acquaintances into increasingly concrete silos of opinion, stoked to irrational levels of fear and anger with fake news and conspiracy theories from sites like Breitbart. "A cloud of nonsense," as President Obama himself put it. The reason is simple: Facebook’s hyper-personalized News Feed.
Facebook needs to keep you addicted to its News Feed in order to stay profitable. To do this, it shows you comments, status updates and news stories that will give you a constant stream of dopamine hits. It prioritises stories that have you nodding in agreement over stories and comments that challenge your prejudices and general worldview.
Facebook has spent years fine-tuning the algorithm that decides what the News Feed will show you, to present information that is increasingly personalised or which, in other words, appeals to your deepest desires, instincts and beliefs.
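To make the mechanism concrete, here is a deliberately simplified sketch of engagement-first ranking. This is not Facebook's actual algorithm, and the names (`rank_feed`, `user_affinity`, `base_engagement`) are hypothetical; it only illustrates how optimizing for predicted engagement naturally pushes agreeable content to the top.

```python
# Toy illustration (NOT Facebook's real algorithm): rank stories by how
# likely this particular user is to engage with them.

def rank_feed(stories, user_affinity):
    """Order stories by predicted engagement for one user.

    stories: list of dicts with a 'topic' and a 'base_engagement' score.
    user_affinity: dict mapping topic -> how strongly the user agrees (0..1).
    All names here are hypothetical, for illustration only.
    """
    def score(story):
        # Content the user already agrees with gets a higher predicted
        # click/like probability, so it is ranked first.
        return story["base_engagement"] * user_affinity.get(story["topic"], 0.1)

    return sorted(stories, key=score, reverse=True)

stories = [
    {"topic": "agrees_with_me", "base_engagement": 0.5},
    {"topic": "challenges_me", "base_engagement": 0.9},
]
feed = rank_feed(stories, {"agrees_with_me": 0.9, "challenges_me": 0.2})
# The agreeable story (0.5 * 0.9 = 0.45) outranks the more engaging but
# challenging one (0.9 * 0.2 = 0.18).
```

Note that the challenging story loses despite its higher raw engagement score: the user's own affinity weighting is what builds the echo chamber.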
While this brings more people back to Facebook again and again, the effect is that we are constantly surrounded by our own troop of Yes Men.
Spoon-feeding us information that reinforces our existing opinions damages our ability as humans to think rationally and to consider an opposing point of view. It leads to consequences like groupthink, mob mentality, and so-called filter bubbles, where the web I browse each day looks completely different from the one you do.
Facebook has been challenged on this before. As more people have got their news headlines from Facebook, newspapers and media organisations have rightly called on Facebook to take greater responsibility for its growing role as an “editor” of web content. They say Facebook should start holding itself to editorial standards: making its News Feed less personalized (gasp) by showing users opposing ideas and comments.
But Facebook has no intention of doing that.
“Facebook gives people a voice,” one senior executive told me this week when I asked him point blank if Mark Zuckerberg “felt bad” about the company’s role in stoking such hyperbolic debate in this presidential campaign. In summary, his view was that Facebook hadn’t made more people in America angry enough to vote for Trump - they were already angry. Facebook simply gave those people a voice, and that’s a good thing.
He has a point, but it's a partial cop-out. Facebook has become more than a platform for people to vent their anger. It’s become a speaker system to amplify it, and take it to places it really shouldn’t go. Real anger should be met by empathy, understanding and reasoned discourse. On Facebook, it has led to flame wars in comments and hopelessly inaccurate viral posts and conspiracy theories.
Last week BuzzFeed revealed that a handful of teens in Macedonia were responsible for dozens of the inaccurate conspiracy stories that had sometimes been getting hundreds of thousands of likes and shares on Facebook. Nearly all of these were in favour of Donald Trump. They'd gamed Facebook’s algorithm, noting that stories aimed at Clinton supporters didn’t make as many ad dollars as those aimed at Trump supporters.
So why hasn’t Facebook changed its algorithm to provide us with more balanced ideas? Because, it argues, it is not an editorial company like a newspaper and thus has no obligation to do so. A media company, Mark Zuckerberg said in August, “has people who are editing content. That’s not us… We are a tech company, not a media company.”
In a company blog post in July 2016, Facebook explicitly said it showed people stories and ideas “that an individual person most wants to see.” In many instances, Facebook uses the word “meaningful” to describe how it ranks what you see in a News Feed, but I question how it is defining “meaningful” here.
In fact, Facebook went on to say, unabashedly, that the “meaningful” stories in your News Feed were good for its bottom line: “We do this not only because we believe it’s the right thing, but also because it’s good for our business.” And that it also kept people hooked on Facebook: “When people see content they are interested in, they are more likely to spend time on News Feed and enjoy the experience.”
Hopefully, down the line, Facebook users will start to realize that their hyper-personalized echo chambers are gradually taking away their ability to make reasoned judgements. Facebook addiction is already seen as unhealthy. One day, Facebook’s inadvertent ability to reinforce prejudice (on both sides of the political divide) will hopefully be seen as bad for your health too, not to mention your relationships.
A public backlash could force Facebook to do the same kind of soul-searching McDonald’s did a few years ago, when it started adding more salads and healthy-eating options to its menu. I’d love to see Facebook make it possible to change the way our News Feeds are ranked, to show us a balance of opinions that both challenge and agree with our world view. Then, I’d love for Facebook to make that the default for its News Feed algorithm.
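The "balance of opinions" idea above could be as simple as interleaving, rather than purely engagement-ranking, the two kinds of stories. The sketch below is purely hypothetical; `balanced_feed` is not a real Facebook feature, just one way the default could work.

```python
# Hypothetical sketch of a "balanced feed": alternate stories the user
# agrees with and stories that challenge them, instead of ranking purely
# by predicted engagement. Illustrative only; not a real Facebook API.

def balanced_feed(agreeing, challenging):
    """Interleave agreeable and challenging stories, one of each in turn."""
    feed = []
    for pair in zip(agreeing, challenging):
        feed.extend(pair)
    # Append whatever remains from the longer list.
    longer = agreeing if len(agreeing) > len(challenging) else challenging
    feed.extend(longer[min(len(agreeing), len(challenging)):])
    return feed

print(balanced_feed(["a1", "a2", "a3"], ["c1", "c2"]))
# ['a1', 'c1', 'a2', 'c2', 'a3']
```

Even a crude alternation like this guarantees a challenging viewpoint appears near the top of the feed, which pure engagement ranking never does.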
There are plenty of people in tech who don’t blame Facebook in the slightest for Donald Trump’s remarkable rise to the presidency. “Facebook is fine,” says Krishna Subramanian, who runs social media analytics company Captiv8 and believes it’s up to Facebook’s users to make their newsfeeds more politically balanced. “It’s the nature of how people are optimizing what they want to see in their feeds.”
Subramanian is giving regular people too much credit. Most of us don’t read Terms of Service. Most of us don’t tweak our privacy settings the way we should. Saying it’s up to users of Facebook to re-rank their News Feeds ignores the mounting responsibility Facebook itself must take as a global communication utility for close to two billion people. Zuckerberg can no longer hide behind the “we-are-a-tech-company” line. Facebook is the most powerful communications medium in the world.
The founder of Kik, a messaging app popular with American teens, also took issue with this notion. He told me Wednesday that Trump’s ascendancy had as much to do with falling incomes as it did with Facebook. The median net worth of American families had gone down by 20% between 1998 and 2013, he pointed out, and by as much as 50% among the working class. Upper-class net worth had shot up by 75%.
Rising inequality absolutely helps explain the popular revolt that’s happening around the western world, in the U.K. with Brexit and in the U.S. with Trump’s rise.
But inequality doesn’t explain why people would see the solution in someone who uses dramatic, hate-filled statements, who is prone to exaggeration, who passes on conspiracy theories as truth.
Our new and growing addiction to Facebook’s personalised echo chambers, however, does.
At an appearance at the Techonomy conference in California on Thursday, Zuckerberg also confronted criticisms that Facebook helped create "filter bubbles" of ideology.
Information presented to people on Facebook is much more diverse today than it was 20 years ago, when people got their news from a few newspapers and TV channels, he argued. He said that Facebook had carried out a study observing how people read and engaged with different types of content on their News Feed.
"By far the biggest filter wasn’t having friends from a different opinion, but that you didn’t click on it," he said. "You’d just tune it out."
"They just floated down the feed?" asked his interviewer, Techonomy founder and author David Kirkpatrick.
"Yeah we just don’t click on them," Zuckerberg answered. "I don’t know what to do about that. I think we should work on that."
The study Zuckerberg referred to has been criticised for sampling a small subset of users who chose to publish their political affiliation on Facebook. A proper study would have examined a random sample of users, according to Zeynep Tufekci, a professor at the University of North Carolina.
Facebook's researchers actually showed that the News Feed algorithm "decreases ideologically diverse, cross-cutting content" that users see, she added.