Meta: anti-disinformation tool dismantled, fine for discriminatory algorithm, Oversight Board report

Often at the center of controversy, Meta is bracing for difficult days following the ruling over a discriminatory advertising algorithm, the shutdown of an anti-fake-news tool, and the publication of the first annual report of the Oversight Board.


Meta, the group that controls Facebook and several other social platforms (such as Instagram, Messenger, and WhatsApp), is headed for new controversies, following the publication of the first annual report of its Oversight Board, the decision to dismantle an anti-disinformation tool, and the use of a discriminatory advertising algorithm.

The first piece of news concerns CrowdTangle, the service, acquired in 2016, which offered publishers a way to track the performance of their content on social networks such as Instagram and Facebook. Over time, the tool also established itself as a platform used by fact-checkers, researchers, press agencies (including the French Agence France-Presse) and journalists to follow the spread of content across public groups and Facebook pages, Instagram profiles, and subreddits. Suffice it to recall how, in 2020, the New York Times reporter Kevin Roose used it to discover that some far-right commentators, including Ben Shapiro, were generating far more engagement with their content than traditional news outlets, or how it made it possible to trace the influence exerted on the 2016 US presidential election through the billions of posts of Russian origin shared on Facebook.

Revelations like these have created several public-relations problems for Meta, and Menlo Park has consequently decided to dismantle CrowdTangle. This was revealed by Bloomberg, which reports that in July 2021 the team behind it (which previously launched new products every six months and several major updates every month) was broken up, and that in October of the same year the platform's founder and CEO, Brandon Silverman, resigned. Asked about the matter, the company set out its position.

Specifically, the company stated that CrowdTangle will remain operational until the US midterm elections in November. In any case, support is promised to researchers working to combat disinformation, who will be given new "more functional tools" that imitate some features of the platform being phased out, without granting users "full access to its original capabilities" (that is, not through an intuitive tool that can be used to ask one's own questions, but in the form of a "refined" report that gives Meta more control over the information disseminated).

The second development is also a source of concern for Meta. The company developed the Special Ad Audience algorithmic tool, which shows advertisements only to certain Facebook users, based on parameters set by advertisers. In the USA, however, the Fair Housing Act prohibits any discrimination in the display of advertisements for renting or selling homes. Having used the tool in violation of that law, Facebook was fined $115,054 by the US Department of Justice and required to develop a non-discriminatory advertising system to be in use by 31 December 2022, after which use of the old Special Ad Audience will be prohibited. Should a third-party reviewer find that this part of the agreement has not been honored, Meta will be taken to court by the US authorities.

Finally, the Oversight Board, the independent review body founded by Meta to evaluate its policy decisions and moderation measures (for now only on individual posts, although extending its scope to profiles and groups is under consideration), published its first annual report, in which it announced that it overturned Menlo Park's initial decisions in 70% of the cases it analyzed (unfortunately only 20 cases, deemed significant, out of more than a million appeals filed by Instagram and Facebook users). The board also issued general recommendations (e.g. greater scrutiny of governments that use official channels to spread health disinformation, and tighter rules on doxing), some of which were implemented, such as when Meta decided to be less ambiguous in explaining to users which hate-speech rules they had violated on its platforms.
