Marina Tulin studies how good we are at spotting false information and how much we trust science stories shared on social media
‘What’s your profile on Facebook?’ Olivia asked Emma when they met at a party ten years ago. They planned to stay in touch but, since then, they haven’t shared more than a few likes on the platform. Or so they thought. Without realizing it, Emma ended up handing over her own Facebook information, and some of Olivia’s, to a research project when she agreed to take an online quiz built on the platform. In fact, by taking the quiz, Emma also shared the Facebook data of all of her friends on the network. And, like Emma, many other people did the same. Later, a company got access to all that data and used it, first, to create psychological profiles, and then to design messages tailored to each profile in order to influence people’s political choices. Does this story sound true or false to you? What would you do to check its veracity?
If you were following the news four years ago, this story probably rings a bell. In case you missed it, it is a true one: the Cambridge Analytica scandal. Back in 2016, the Facebook data of 87 million people were exposed by the social media company to Cambridge Analytica, a political consulting firm, which created targeted content for different psychological profiles. The scandal fueled a debate about the ethics of third parties using social media data. But it also made very clear how easy it was to spread personalized content, including false information, on these platforms. For many people, this revelation was upsetting; for Dr. Marina Tulin, it was more than that: it became a motivation to investigate public trust in online content.
Marina is a German social psychologist and researcher who obtained her PhD in Sociology from the University of Amsterdam. Working as a postdoctoral researcher at Erasmus University Rotterdam on the TRESCA project, she has been trying to understand how people decide whether to trust the information they consume, especially online. In other words, given the huge amount of information we receive online, how do we separate the wheat from the chaff? “We see a lot of things on the Internet and TV, sometimes you know that it is just a nice story, like when you watch a cartoon or a movie about superheroes. You know that people cannot fly and that animals cannot talk. This is a little bit like a lie, but that is fine because you know that it is just a nice story. Other times it is not so easy to understand that something is just a story,” says Marina.
While misinformation, unfortunately, can appear in any type of content, Marina’s study focused on public trust in science-related stories, “a topic that has gained urgency and importance during the coronavirus pandemic,” she adds. “The main question of my work is how we can effectively deal with the current climate of distrust in science. I am interested in whether better science communication can help rebuild some of the lost trust. While there are good reasons to be critical of scientific practices, especially when they are unethical or outright fraudulent, there are also many instances where this criticism is not well-founded. In my work I seek to take seriously the critical voices in society and strengthen not only science enthusiasm but also appropriate skepticism,” she explains.
While there are good reasons to be critical of scientific practices, especially when they are unethical or outright fraudulent, there are also many instances where this criticism is not well-founded.
In her research project, Marina and her colleagues used a post about a study on the negative impact of social media use on teenagers. With this example, they asked what makes people believe online content and what motivates them to check whether what they read is true. To study this, the team designed an online experiment that mimicked a situation in which people are simply scrolling through social media. After being shown a post about the negative effects of social media use on young people, participants were asked to rate how much they were willing to trust the information presented to them. They were then asked whether they would like to spend a few minutes checking if the information was correct. For consistency, the experiment was run by a survey company with more than 7,000 adults from seven countries (the Netherlands, Germany, France, Italy, Spain, Hungary, and Poland), who were matched on characteristics such as age, gender, and race.
The results have not yet been peer-reviewed or published in an academic journal, but according to preliminary data analyzed by Marina and her colleagues, two-thirds of participants indicated that they would like to fact-check the content of the post. “This suggests that people are quite willing to double-check the truthfulness of the information they see,” says Marina. In general, this result was consistent across the seven countries, but there were differences related to participants’ characteristics, such as age. The researchers are currently investigating what could explain these differences.
While the battle against mis- and disinformation will continue to be hard to win, studies like Marina’s help us understand what we can do about it by providing evidence-based strategies. For instance, fact-checking requires extra time, which may discourage people from doing it. To minimize this barrier, Marina suggests that “building fact-checking tools into social media platforms can make this task much easier for users. And as our study shows, the majority of users would engage with built-in fact-checking tools.” But even if users are willing to fact-check, the sources they consult may not be reliable. One way to address this could be to integrate reliable sources into the social media platforms themselves. In fact, during the coronavirus pandemic, Instagram has used such a strategy: posts related to the pandemic usually come accompanied by a link to the pages of health authorities.
It is also important to highlight that, although Marina’s study included a large number of people, the participants were all from European countries and may not represent the behavior of individuals on other continents. In addition, the experiments were done under controlled conditions and, as Marina adds, “it remains to be shown if such findings generalize to the real world, when individuals do not see only one post but are presented with an abundance of information. It will be interesting to see whether the high level of willingness to fact-check is also present in everyday life.” Hopefully, we will be able to hear more about the outcomes of this study very soon.
When she is not researching public trust in science, Marina also enjoys teaching about it: if you want to dive deeper into the topic, she was one of the instructors of the free online course “Communicating Trustworthy Information in the Digital World”. On top of that, Marina plays guitar in the post-punk band “Big Pleasure”, is training for her green belt in Japanese Jiu-Jitsu, and enjoys skateboarding.
Let’s stay tuned for more exciting results of Marina’s research. Best of luck in your journey, Marina! Thank you for participating 🙂