Facebook tries to clean up Groups with new policies
Facebook this morning announced a series of new rules designed to further penalize those who violate its Community Standards, specifically around Facebook Groups, and introduced rules meant to crack down on the spread of misinformation through these more private networks. The changes will affect those who helped lead groups that were later banned, as well as the members who participated in them. The new rules will also remove some of the more potentially harmful groups from Facebook’s Group recommendations, among other changes.
Facebook’s existing recidivism policy was meant to prevent people from creating new groups similar to those that were banned for violating its Community Standards. However, the rule had only applied to Group admins. Now, Facebook says admins and moderators alike will be barred from creating any new groups for “a period of time” after their group has been banned for a policy violation. Facebook tells us this period is 30 days. If, after those 30 days, the admin or moderator tries to create another violating group, they’ll be paused for another 30 days.
In addition, Group members who have had any Community Standards violation within a group will require post approval for the next 30 days, meaning all of their posts will have to be pre-approved by a Group admin or moderator. This could help larger groups deal with members whose behavior is frequently flagged, but it could also overwhelm the admins of groups with a large number of users. And Facebook says that if admins or moderators then approve a post that violates Community Standards, the group will be removed.
Facebook will also require that Groups have an active admin. Admins often get busy, step down or leave their group, so Facebook will now attempt to identify groups where no admin is involved and proactively suggest admin roles to members who may be interested. You may have already received notifications from some of your Groups that an admin is needed; if so, it’s because Facebook identified you as someone with the potential to lead the group, since you don’t have a history of violations.
In the weeks ahead, the company said, it will begin archiving groups that are left without an active admin: when admins leave and no one else assumes the role, the group will be archived.
This change could help to crack down on the unmoderated flow of information across groups, which can lead to spam and misinformation spreading quickly. It is helpful to have direct moderation, as other forum sites like Reddit have shown, but it’s often not enough. Group culture, too, can help to encourage certain types of content — including content that violates Facebook’s rules — and admins are often willing participants in that.
Another change will impact which Groups are suggested to users.
Health groups will no longer be recommended, because “it’s crucial that people get their health information from authoritative sources,” the company said.
Unfortunately, this change alone can only mitigate the danger of misleading health information, but does little to actually stop it. Because health groups can still be found via Search, users will be able to easily surface groups that fit their beliefs, even when those beliefs are actively harmful to themselves or to others.
There are, today, a large number of groups that continue to spread misleading health information or push users to try alternative or untested cures. These group participants may have the “right” to hold these discussions online, at least in Facebook’s view, but there’s disagreement over whether the groups should be allowed the same search billing and discoverability as more expert-led resources.
For instance, if you search Facebook today for vaccines, Facebook will gladly point you to several large groups that tell you not to get one. By doing so, Facebook has effectively taken away medical experts’ and doctors’ authority on health-related matters and handed it over to the general public. Multiply this at the scale of Facebook’s billions of users and across all subject matters, and it’s easy to see why simply not “recommending” some groups barely makes a dent.
Facebook is also tweaking its rules to reduce the spread of groups tied to violence. It already removes them from recommendations, restricts them from search, and in the near future, it says it will reduce their content in News Feed. These groups are also removed if they use veiled language and symbols in an attempt to avoid being flagged. Recently, 790 groups linked to QAnon were removed under this policy, Facebook said.
This change, however, comes too little, too late. QAnon, left unchecked for years, has tapped into the mainstream consciousness and is now involving people who may not even realize they’re being manipulated by QAnon-driven initiatives.
Then there is the not-small matter of whether Facebook can actually enforce the rules it comes up with. A quick glance at Facebook search results for QAnon indicates it cannot. It may have removed 790 QAnon groups, but after scrolling for a couple of minutes we couldn’t even reach the bottom of the group search results for QAnon content. And they weren’t anti-QAnon groups.
That demonstrates that much of Facebook’s work in this area is performative, rather than effective. A one-time sweep of harmful groups is not the same as dedicating resources and personnel to the task of pushing these dangerous, fringe movements, violence-prone organizers, or anti-medical science believers to the edges of society — a position they once held back in the offline, unconnected era. Today’s Facebook gives these groups access to all the same tools to organize as anyone else, and only limits their spread in dribs and drabs over time.
For instance, Facebook’s policy on groups tied to violence practically contradicts itself. It claims to remove groups discussing violence, but simultaneously includes a number of rules about limiting those same groups in recommendations and downranking them in search. That indicates even Facebook understands it can’t remove these groups in a timely fashion.
People disagree over whether Facebook’s role should involve moderating this sort of content, or to what extent any of this should be protected as “free speech.” But Facebook never really took a moral position here or argued that, as it’s not a governmental body, it can make its own rules based on what it stands for. Instead, it built out massive internet infrastructure where content moderation has been an afterthought and a job to be outsourced to the less fortunate. Now Facebook wants accolades for its clean-up work, before it has even effectively solved the problems it created.