The majority of Twitter users are unaware that researchers freely collect and analyze their tweets—including deleted ones—in the name of science.
Many believe it shouldn’t be allowed without their consent, and some wrongly assume it would violate Twitter’s terms of service, according to a new study by researchers at the University of Colorado Boulder and the University of Kentucky.
- Scientists routinely use tweets and other social media data for studies.
- 62 percent of Twitter users said they didn't know this. Many wrongly assumed it was not allowed.
- CU Boulder researchers are helping to develop ethical guidelines and policies for such research.
“There is a ton of research right now using Twitter and other social media data,” said lead author Casey Fiesler, an assistant professor in the Department of Information Science at CU Boulder. “Yet our study found that the majority of users may not even be aware that this is a thing that happens.”
The study comes at a time when several high-profile controversies, including this week’s revelation that Cambridge Analytica used Facebook data to develop political ads, are raising questions about what is ethical and legal in the burgeoning world of big data research.
It is the first in a series of studies Fiesler is working on related to the ethics of such research.
For the study, Fiesler and co-author Nicholas Proferes, an assistant professor at the University of Kentucky, surveyed 268 Twitter users, average age 32. The average participant had posted about 2,000 tweets and followed about 350 accounts.
When asked if they knew researchers sometimes used their tweets, 62 percent said ‘no.’ When asked whether they thought researchers were permitted to do so without asking them, 43 percent said ‘no.’ Of those, 61 percent said they believed the researchers would be breaking ethical rules, and 23 percent said they believed Twitter’s terms of service forbid it.
In reality, Twitter states in its privacy policy that it “broadly and instantly disseminates your public information to a wide range of users, including search engines, developers, universities, public health agencies, and market research firms…”
When asked how they felt in general about tweets being used in research, just 20 percent said they were generally uncomfortable with it, and only 30 percent said they would opt out of having any of their tweets used in research if they could.
More than half said that if a researcher contacted them to get consent and told them what the study was about, they would probably give it.
“They were more upset about not being told than about the fact that it was happening,” said Fiesler. “And they cared about things like what the study is about.”
Levels of discomfort varied widely depending on what information was used and how.
For instance, 63 percent said they would not want deleted tweets used in a study (as they often are), and 55 percent said they would not want one of their tweets, with their username attached, quoted in a published research paper. (In at least one instance, a reporter contacted the owners of such tweets to interview them.) In contrast, if a tweet was analyzed alongside millions of others or quoted anonymously, most users didn’t object.
Fiesler and Proferes suggest researchers consider such factors when designing their research.
“I certainly don’t want to suggest we shouldn’t be doing research using Twitter data,” said Fiesler. “But I do think that we should be thinking about how to do it ethically.”
She pointed to several cases in which big data research has sparked public outcry. This week, numerous news outlets reported that data gathered from Facebook users, ostensibly for research purposes, was ultimately shared with the strategic communications firm Cambridge Analytica to develop political campaigns. In another instance, Danish researchers drew fire when they shared a dataset containing sensitive information from 70,000 users of an online dating site in a web forum for social science researchers. And last September, researchers in California came under fire for using 35,000 photos from online dating apps to develop an algorithm that assessed sexual orientation based on facial features.
“There are things that can be done with this information in the name of science that people might be uncomfortable with, even if it is technically public data,” said Fiesler, who worries such controversies could erode user trust in a vital area of research that could yield big social benefits. “Just because we can collect data doesn’t mean that we should be able to use it however we like.”
In September, the National Science Foundation awarded Fiesler and colleagues at five other institutions a $3 million grant to begin developing ethical guidance and standards for the nascent field. She said she hopes her work will inform how researchers design and conduct studies and also serve as an eye-opener for social media users.
“I would never want my work to suggest to people that they need to stop using social media or lock all their stuff down. But it is important for people to realize that there are other uses of your social media content than what you intend.”