Jan 19, 2021

Algorithmic Transparency, and Marc Faddoul’s Very Good Year

Early in 2020, Marc Faddoul discovered something: the social media company TikTok appeared to base its user recommendations on physical appearance. Faddoul began experimenting with dummy accounts: if he followed a white woman with blonde hair, the app suggested he follow other white women with blonde hair; the same held for Asian men, white men with beards, and black women.

Faddoul is an alumnus (MIMS 2019) who works as a research scientist with School of Information Head of School and Associate Dean Hany Farid.

Not only did the recommendations appear to be physiognomic, but he also discovered that, in addition to filtering for gender and ethnicity, the app adapted its follow recommendations to age, body shape, hairstyle, and even whether a person had visible disabilities. “I was trying to find political filter bubbles, and how they could be enabled by the recommender,” Faddoul said. “The effect struck me while I was looking at the recommended profiles: there was much more homogeneity from a physiognomic perspective than from a political one.”

While he never conducted a formal study (and TikTok says it was unable to replicate his results), his keen observation sparked discussions of algorithmic bias and the risks inherent in engagement-driven social media recommendations.
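
Though the test was informal, the kind of homogeneity Faddoul observed is straightforward to quantify. The sketch below is purely illustrative and is not his actual methodology: the profiles and attribute labels are invented, and it simply computes the share of recommended accounts that match a seed account on every labeled trait.

```python
# Hypothetical illustration of measuring physiognomic homogeneity in
# follow recommendations. Profiles and labels are invented for this
# sketch; this is not Faddoul's actual data or methodology.

def homogeneity(seed: dict, recommended: list) -> float:
    """Fraction of recommended profiles that match the seed account
    (the profile that was followed) on every labeled attribute."""
    matches = sum(
        all(profile.get(key) == value for key, value in seed.items())
        for profile in recommended
    )
    return matches / len(recommended)

# One hypothetical dummy-account trial: follow a seed profile, label the
# accounts the app then suggests, and score their similarity to the seed.
seed = {"gender": "woman", "ethnicity": "white", "hair": "blonde"}
suggestions = [
    {"gender": "woman", "ethnicity": "white", "hair": "blonde"},
    {"gender": "woman", "ethnicity": "white", "hair": "blonde"},
    {"gender": "woman", "ethnicity": "white", "hair": "brown"},
]
print(f"{homogeneity(seed, suggestions):.0%} of suggestions match the seed")
# -> 67% of suggestions match the seed
```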

Faddoul is deeply interested in the lack of transparency on social media platforms. “When researchers try to fill that gap, [the companies] tend to downplay the results and look the other way,” he says. Journalists, however, have successfully replicated his experiment. He also isn’t surprised that TikTok’s algorithm functions this way.

“If an algorithm is trained purely on user-engagement data, it will generate what appears to be appearance-based filter-bubbles.”
— Marc Faddoul

“Users tend to follow accounts of similar age, gender, or culture. On aggregate, these patterns are strong, and they translate into physical characteristics. Moreover, TikTok is inherently a very appearance-driven platform, and users have quite consistent tastes when it comes to the faces and bodies they enjoy looking at.

“Therefore, if an algorithm is trained purely on user-engagement data, it will generate what appears to be appearance-based filter-bubbles. In their defense, there was almost certainly no explicit intention from TikTok for their system to behave this way.”

But it's not an excuse, he says. “Such implicit biases can and need to be anticipated and accounted for, or at least acknowledged, especially for such a widely used algorithm that is so impactful in shaping our cultures.”
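
Faddoul’s point about purely engagement-trained systems can be made concrete with a toy co-follow recommender. The sketch below is a hypothetical illustration, not TikTok’s actual algorithm: it ranks accounts solely by how often they are co-followed with a user’s existing follows, so any appearance-based correlation already present in user behavior passes straight through into its suggestions.

```python
from collections import Counter

# Hypothetical follow graph: user -> set of accounts they follow.
# If users who follow creator_a also tend to follow visually similar
# creators, that correlation alone drives the recommendations below.
follows = {
    "u1": {"creator_a", "creator_b"},
    "u2": {"creator_a", "creator_b", "creator_c"},
    "u3": {"creator_a", "creator_c"},
    "u4": {"creator_d"},
}

def recommend(user: str, k: int = 3) -> list:
    """Suggest the k accounts most co-followed with `user`'s follows.

    Trained only on engagement, the recommender reproduces whatever
    demographic or appearance correlations exist in that behavior."""
    mine = follows[user]
    scores = Counter()
    for other, theirs in follows.items():
        if other == user or not (mine & theirs):
            continue  # skip ourselves and users with no shared follows
        for account in theirs - mine:
            scores[account] += 1  # co-follow count is the only signal
    return [account for account, _ in scores.most_common(k)]

print(recommend("u1"))  # ['creator_c'] -- mirrors what similar users follow
```

Nothing in the sketch inspects appearance; any homogeneity in its output emerges entirely from the engagement data it learns from.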

The Transformative Nature of Technology

With an engineering degree in computer science and a master’s in data science from the prestigious Télécom Paris under his belt, Faddoul soon realized he was more interested in the transformative nature of technology at a societal, rather than a technical, level.

“The French curriculum gave me really solid scientific foundations,” he said, “but it lacked that interdisciplinary take on technology that the School of Information offered.” After a stint in industry, he was accepted into the Master of Information Management and Systems program.

His work reflects a distinctly I School perspective. In addition to his TikTok findings, as 2020 progressed he co-authored a paper with Hany Farid and Guillaume Chaslot, a former YouTube engineer, studying the promotion of conspiracy videos on YouTube. He published an essay with the Brookings Institution examining the limits of algorithmic content moderation in relation to the pandemic. In the summer he gave a lecture titled “Recommender Systems and Power Dynamics” to the State Department and US Cyber Command. In August, his essay “Toward Algorithmic Humility,” which discusses the fallibility of pretrial risk assessment tools used by the justice system, was included in the book 97 Things About Ethics Everyone in Data Science Should Know.

Not bad for a plague year. 

Next up for Faddoul is a focus on recommendation system transparency: he is working on algotransparency.org to help create an independent third-party organization that exposes to users and lawmakers the behavior of the algorithms governing our information ecosystems.

“Marc is working on some of the most pressing issues at the intersection of society and technology,” said Hany Farid. “Marc brings a special blend of technical skills along with a deep understanding of and care for the underlying social issues.”

“The I School might just be one of the few institutions in the world where expertise in computer science and humanities is so entwined.”
— Marc Faddoul

In a year of global uncertainty, one thing Faddoul is certain about is that 21st-century technologists can no longer be educated in silos. The hardest technological challenges, he asserts, must be addressed holistically, accounting for behavioral, geopolitical, and ethical dimensions. “And the I School,” he says, “might just be one of the few institutions in the world where expertise in computer science and humanities is so entwined. I’m grateful to the school and its community for sensitizing me to some of the most interesting challenges of our century, and for giving me the tools to address a little part of them.”
