Threats to Digital Privacy—and Ways to Protect It
When you think of digital privacy, you might worry about hackers accessing your passwords or credit card numbers. Those are nightmarish scenarios, to be sure, but privacy concerns extend far beyond you as an individual, says Thomas Qitong Cao, assistant professor of technology policy at The Fletcher School.
“Our understanding of privacy will have to evolve” because of rapid changes in today’s era of big data and artificial intelligence, Cao says. In the course Privacy in the Digital Age, he asks students to consider “to what extent privacy remains a private issue or now has a public component.”
How can privacy be a public concern? Cao explains it like this: By a conventional understanding of privacy, if individual users consent to a social media platform having their data in exchange for using the platform’s services, that’s the end of the story.
But in the age of AI and big data, “a user may individually be willing to provide informed consent, but the aggregate data collected from a huge number of such users could allow tech companies to potentially infer much more information about others who may have consented to only partial data collection or none at all, in ways that could affect the public interest,” he says.
To grapple with this and related issues, Cao and his students examine several contemporary cases. Take TikTok, for example. In 2024, Congress passed a law banning the popular social video app unless it spun off from its China-based parent company, which it failed to do. In January, the Supreme Court unanimously upheld the ban on national security grounds.
President Trump has temporarily paused enforcement of the ban, but concerns remain that the Chinese government could coerce TikTok into handing over sensitive U.S. user data.
Should the ban go into effect? Would the data TikTok collects be more secure if the app were owned by a U.S. company? These are the sorts of questions Cao asks, while stressing that the issues are larger than the TikTok case and applicable to other apps where the Chinese government is not involved.
Debating Surveillance
Privacy in the Digital Age covers a broad range of evolving threats to privacy, as well as efforts to protect it.
The course was first created as a half-term module in 2018 by Susan Landau, professor of cyber security and policy in the School of Engineering. She expanded it to a full-semester course the following year, and Josephine Wolff, associate professor of cybersecurity policy at Fletcher, taught it from 2021 through 2024. It uses Landau’s book Listening In: Cybersecurity in an Insecure Age as a primary guide.
In his version of the class, Cao devotes considerable time to comparing the different approaches to digital privacy regulation in the United States, the European Union, and China, as one might expect at a school of international relations like Fletcher. But the class is also cross-listed in the Department of Computer Science, and Cao doesn’t shy away from including technical material, albeit at an introductory level.
Thomas Qitong Cao, assistant professor of technology policy at The Fletcher School, compares the different approaches to digital privacy regulation in the United States, the European Union, and China. Photo: Courtesy of Thomas Qitong Cao
The discussions of different countries’ approaches to security have been particularly interesting to Chelsie Wei, A24, who has a Tufts undergraduate degree in cognitive and brain science and is now earning a master’s degree in computer science. Wei grew up in China, where “We were taught that it’s nice to have CCTV [closed-circuit television] cameras around you because it feels safer that way,” she says. But in the United States, she has found, “people have this very, very different sense of self, where the way that people feel protected is for people not to be surveilled on.”
The more technical topics in the class are a welcome challenge for Armaan Mathur, F26, who worked on digital privacy policy in India, where he’s from, but doesn’t have a computer science background. “Unless you have a basic understanding of how the mathematics of encryption works, you won’t be able to understand what purpose it serves in the broader privacy debate,” he says.
Mathur has felt more in his element during the semester’s six in-class debates, in which students take sides on questions such as “Should Apple keep a copy of the decryption key for every iPhone stored on its servers?” and “To what extent should [the popular messaging app] Telegram be held criminally responsible for aiding and abetting child exploitation and drug trafficking?”
The debates help students understand readings and clarify their own positions, says Mathur, a former college debater. But the format is quite different from his previous experiences.
“We were very feisty back in India,” he says. “Here it’s much more, ‘Oh, I like your point, and I respect that, but this is what I feel.’”
Collecting Data
In one of the course’s more unusual assignments, students may choose to download all their data from Apple, Amazon, Facebook, Google, or Snapchat, then write a reflection on how they think the information could be or is being used by that company or its partners.
The exercise is disturbing, Wei says. She examined what Google had collected about her and concluded that the company was gathering information from sites that she had expected to be secure.
“I do have the option for them to delete all this information,” she says—but the information may have already been sold to another company. “Maybe I should just not use Google,” she adds. “But I heavily rely on Google.”
Mathur found it unnerving to learn that companies could sense how long his cursor was lingering over a website and how many times he clicked on certain parts of a page, and that the information could be used to create a profile of him. “We actually consent to all of these things, because whenever you go to any popular website, you see this box to tick that means you agree to the terms and conditions and the privacy policy,” he says. “You’re often in a hurry and have the attitude of, ‘Let’s click this and get this over with.’”
He’s become much more cautious about ticking such boxes. “Managing preferences is something that I have started to do,” he says, “keeping your accounts private, for example, and your activity anonymous on the internet.”
Changing Circumstances
The choice to share one’s data can have consequences far beyond the individual, as the concerns about TikTok suggest.
Cao urges students to consider how digital privacy affects national security, and how technology and the law can better protect privacy. He also stresses that making good decisions requires understanding technology and social norms, which are both evolving rapidly.
“One thing I want to have the students think hard about is how we conceptualize privacy and how that conceptualization can change as our technology changes, as our lifestyle changes, and circumstances change,” he says. “And how we can have a toolkit to think about the issue of privacy even if we don't know how the future will change.”