Days after shutting down an NYU team's research into political advertising on its platform, Facebook Inc. has reportedly shut down another research project, this time by a German group studying Instagram's algorithm. Researchers at AlgorithmWatch say they were forced to end their study prematurely after Facebook began issuing legal threats against them. The team went public with the issue on Friday, saying their situation was similar to that of their NYU counterparts.
In a post, the group said that many similar cases of bullying likely go unreported, and that by speaking out, AlgorithmWatch hopes to encourage other organizations to come forward as well.
Bare Skin and Faces
The study in question, launched in March 2020, examined how Instagram prioritizes photos and videos on its platform. As in the NYU study, the researchers built a browser plug-in that allowed users to share data from their Instagram feeds with the team. Through its regularly published findings, AlgorithmWatch reported that images showing bare skin or faces were ranked higher than posts containing text. Interestingly, Facebook took no concrete action against the study for the first year, despite frowning on its methodology.
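For readers curious how this kind of data-donation tooling generally works, the sketch below is a minimal, hypothetical browser-extension content script in the spirit of the plug-in described above. It is not AlgorithmWatch's actual code: the selectors, field names, and research endpoint are illustrative assumptions only. The idea is simply that the script reads basic metadata from posts visible in a volunteer's own feed and forwards it to a research server.

```typescript
// Hypothetical sketch of a data-donation content script.
// Not AlgorithmWatch's code: selectors, fields, and the endpoint
// below are illustrative assumptions, not Instagram's real markup or API.

interface DonatedPost {
  postId: string;                        // identifier taken from the post's permalink
  mediaType: "image" | "video" | "text"; // rough classification of the post
  position: number;                      // rank of the post in the volunteer's feed
  observedAt: string;                    // ISO timestamp of when the post was seen
}

// Hypothetical research endpoint that receives volunteered feed data.
const RESEARCH_ENDPOINT = "https://example.org/donate";

function collectVisiblePosts(): DonatedPost[] {
  // "article" is a placeholder selector for feed posts; a real plug-in
  // would need selectors kept in sync with Instagram's current markup.
  const articles = Array.from(document.querySelectorAll("article"));
  return articles.map((el, index) => ({
    postId: el.querySelector("a[href*='/p/']")?.getAttribute("href") ?? "unknown",
    mediaType: el.querySelector("video") ? "video"
             : el.querySelector("img") ? "image"
             : "text",
    position: index,
    observedAt: new Date().toISOString(),
  }));
}

async function donate(): Promise<void> {
  const posts = collectVisiblePosts();
  if (posts.length === 0) return;
  // Only what the volunteer sees in their own feed is sent; no credentials.
  await fetch(RESEARCH_ENDPOINT, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(posts),
  });
}

// Re-sample the feed periodically while the volunteer browses.
setInterval(donate, 60_000);
```

Whatever the exact implementation, the approach depends entirely on volunteers installing the extension themselves, which is central to the dispute over consent described below.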
That changed in May of this year, when Facebook met with the project's leaders and accused them of violating Instagram's Terms of Service. The researchers also claim they were accused of violating the European Union's GDPR, on the grounds that they had not obtained users' permission before accessing their data.
Issues With The Practice
These allegations are much like those leveled at the NYU Ad Observatory, and AlgorithmWatch's response has been strikingly similar as well: the group says it collected data only from users who voluntarily added the plug-in to their browsers.
Nevertheless, the researchers have shut down the project, since continuing would expose them to legal action. A Facebook spokesperson, however, said the company contacted the group several times only because it had issues “with their practice,” and that all it asked was for the team to bring the study into compliance with its terms.
It is also worth noting that even though AlgorithmWatch says its plug-in was used only by willing participants, the feeds of those participants inevitably contain data from many other users, most of whom never agreed to have their information shared.
The researchers, however, have hit back, saying that Facebook cannot be relied on to provide a clear picture of its algorithms or its data.