It would have been better if challenges to “misinformation” had come sooner and more often.

By Sara Talpos, a contributing editor at Undark, where this article was first published.
In June, the journal Nature published a perspective suggesting that the harms of online misinformation have been misunderstood. The paper’s authors, representing four universities and Microsoft, reviewed the behavioral science literature and identified what they characterized as three common misperceptions: that the average person is frequently exposed to false and inflammatory content, that algorithms drive this exposure, and that many of society’s broader problems are primarily caused by social media.
“It’s very rare for someone to go to YouTube to watch a baking video and end up on a Nazi website,” says David Rothschild, an economist at Microsoft Research who is also a researcher with the University of Pennsylvania’s Penn Media Accountability Project. That’s not to say the edge cases don’t matter, he and his colleagues write, but treating them as typical can be misleading and can distract from more pressing issues.
Rothschild spoke with Undark about the paper in a video call. Our conversation has been edited for length and clarity.
Undark: What motivated you and your co-authors to write this perspective?
David Rothschild: All five co-authors of this paper have been doing research in this field for years, building a broad picture of what is happening on social media — what’s good and what’s bad — and especially trying to understand the gap between what we see in the data and the stories we hear from mainstream media and from other researchers.
Specifically, we kept narrowing in on the question of what the typical consumer’s experience looks like, versus what the more extreme examples look like. A lot of what was being cited — and a lot of the research itself — actually described fairly extreme scenarios.
The second part is focused on algorithms. There’s a lot of concern about algorithms, but what we’re seeing is that most harmful content isn’t being pushed onto people by algorithms. In fact, it’s quite the opposite: algorithms tend to pull people toward more moderate content.
And then there are questions about causation and correlation. A lot of coverage, especially in mainstream media, confuses the proximate cause of something with its underlying cause.
“Oh, there are yellow-vest riots happening in France — they were organized on Facebook.” Well, there have been riots in France for hundreds of years. People find ways to organize without social media.
The proximate cause — the immediate way people organized, say, on January 6th — was certainly largely online. But the question is whether these things would have happened in an offline world anyway, and those are difficult questions.
Writing a perspective in Nature lets us bring these problems into a broader discussion with stakeholders outside of academia, because it has real-world implications: research gets directed, funding gets allocated, and platforms face pressure to fix the problems people are talking about.
UD: Can you talk about the example of the 2016 election — what you found about it, and perhaps about the role the media played in spreading information that wasn’t entirely accurate?
DR: The bottom line is that what the Russians did in 2016 was certainly interesting and newsworthy. They invested heavily in creating sleeper Facebook organizations that posted viral content and then laced it with untrue, fake news. It’s certainly meaningful, and I understand why people were intrigued. But ultimately, what we wanted to ask is: how much of an impact could it plausibly have had?
Impact is really hard to measure, but at a minimum we can put it in the perspective of people’s news diets: direct views of Russian misinformation were a tiny fraction of people’s news consumption on Facebook — never mind that Facebook itself is only a small part of news consumption in general. Especially in 2016, the majority of people, including young people, still got far more news from television than from social media, let alone the rest of the internet.
While we agree that any amount of fake news is probably bad, there’s enough research to show that it is repeated interaction with content, however it’s presented, that drives people’s underlying understanding of the world and the narratives they hold. The occasional encounter with fake news — which is very rare for the average consumer — is simply not that driving force.
UD: My impression from reading your Nature paper is that journalists themselves are spreading misinformation — about the effects of misinformation. Is that accurate? And if so, why do you think it’s happening?
DR: At the end of the day, it’s a good story. Extreme examples are compelling, and negative framing sells.
UD: So what exactly is the good story?
DR: That social media is harming children. That social media is the problem.
In general, people tend to approach new things from a negative angle. There’s certainly a long history of people panicking over new technology — the internet, television, radio, music, books — and blaming it for all of society’s ills. Go back through time and you can find all of these same concerns.
Ultimately, some people will benefit from social media, some people will be harmed by it, and much of society will simply move forward with the new technology, as it always has. But that balanced account isn’t as interesting a story as “social media is causing these problems.”
“Social media is the problem, and it’s really the algorithm” offers a very simple, tractable solution: fix the algorithm. And it lets us avoid harder questions about human nature — questions we generally don’t want to ask.
I think what makes a lot of the research we cite here uncomfortable is that a segment of the population demands horrible things. They demand content that is racist, degrading, and violence-inciting. That demand can be met by various social media channels, but it was also being met before social media — by books, movies, radio, and other media.
Ultimately, the different channels available to us certainly change the ease and mode of distribution. But these are problems of humanity that go far beyond what I can solve as a researcher — beyond what most people, maybe anyone, can solve. I think that makes them hard, and also uncomfortable, and I think that’s why so many journalists would rather focus on “social media is bad and the algorithms are the problem.”
UD: On the same day that Nature published your paper, the journal also published a comment titled “Misinformation poses a bigger threat to democracy than you might think.” Its authors argue that misinformation can increase polarization and undermine trust in electoral processes, and that concerns about an expected blizzard of election-related misinformation are therefore justified. What should the average person make of these seemingly divergent views?
DR: We don’t want to give the impression that we condone misinformation or harmful content in any way, or that we minimize its impact on the people it does affect. Our point is that the problem is concentrated in a small, extreme population far from the typical consumer, and that reaching it requires a different approach — and a different allocation of resources — than research and interventions aimed at influencing the general public.
I read that comment, and I honestly don’t know who they’re arguing with, but I don’t necessarily think it’s wrong. I don’t think it trivializes anything to say, “Actually, we should fight where the problems really are.” In a sense, I think the two pieces are talking past each other.
UD: You’re a Microsoft employee. How would you reassure potentially skeptical readers that your research isn’t an effort to downplay the negative effects of products that are profitable for the technology industry?
DR: The paper has four academic co-authors, and it went through an incredibly rigorous process. You can’t see it from the outside, but the paper was submitted on October 13, 2021, and finally accepted on April 11, 2024. It went through a crazy vetting process. It was intense.
We started from an idea grounded in our own academic research, supplemented it with the latest work, and we will continue to update it as new research comes in — especially any studies that contradict the original thesis.
The bottom line is that Microsoft Research is a very unique place. For those unfamiliar with it, it was founded on the Bell Labs model: publications by Microsoft Research do not go through an internal company review process, because Microsoft Research believes the integrity of its research rests on not being censored at the point of publication. The aim is to use that position to examine and understand the impact of things both close to the company and unconnected to it.
In this case, I think the subject is quite far from the company’s products. It’s a really wonderful place to work. Much of the research is co-authored with academic collaborators, and there are very clear guidelines for that process, which is always important for ensuring the academic integrity of the work.
UD: I forgot to ask earlier: how did your team conduct its review?
DR: This is obviously different from a typical empirical study. It definitely started with conversations among the co-authors about the collaborative and individual work we’d been doing, which we felt hadn’t yet landed where it should. We began by laying out some theses about the gap between our own academic work — and the broader academic literature — and what we were seeing in the public debate. Then came a very thorough review of the literature.
We ended up with somewhere north of 150 citations — 154, I believe. And in Nature’s incredibly long review process, we went line by line to make sure nothing was claimed that wasn’t defended by the literature: either the published academic literature where appropriate, or, where appropriate, published research of our own.
The idea was to create a comprehensive article that would, hopefully, get people to engage with what we think are the really important questions — which is exactly why I’m so happy to be talking with you today: where the real harm is, and where the pressure should be applied.
None of us holds a position so strongly that we would cling to it in the face of new evidence. The social media model is changing: what we have now with TikTok, Reels, and YouTube Shorts is a very different experience from the dominant social media consumption of a few years ago, or from the news-feed-driven consumption before that. These are things we will need to keep monitoring and understanding.