Facebook today introduced a new set of tools to help Facebook group admins better manage their online communities and, potentially, prevent conversations from derailing. Among the coolest new tools is a machine learning-based feature that alerts admins to potentially unhealthy conversations going on in their group. Another allows the admin to slow down the pace of a heated conversation, by limiting how often group members can post.
Facebook groups are a big reason people continue to use the social network. Today, there are “tens of millions” of groups, managed by more than 70 million active admins and moderators around the world, according to Facebook.
For years, the company has strived to give these group owners better tools, as they’re often overwhelmed by the administrative responsibilities of running a large-scale online community. As a result, many admins give up and let their groups run largely unmanaged, allowing them to turn into breeding grounds for misinformation, spam and abuse.
Last fall, Facebook attempted to address this issue by rolling out new group policies to crack down on, among other things, groups without an active admin. Of course, the company’s preference would be to keep the groups active and growing by making them easier to operate.
That’s where today’s new feature set comes in.
A new dashboard called Admin Home will centralize administration tools, settings and features in one place, and feature “pro tips” that suggest other useful tools tailored to the needs of the group.
Another new feature, Admin Assist, will let admins automatically moderate their groups by setting criteria that restrict comments and posts proactively, rather than forcing admins to go back and delete them after the fact, which can be problematic once a discussion is underway and members are engaged in the conversation.
For example, admins can now prevent people from posting if they haven’t had a Facebook account for very long or if they’ve recently broken the group’s rules. Admins can also automatically decline posts that contain specific types of promotional content (MLM links, perhaps) and then share feedback with the post’s author explaining why such posts aren’t allowed.
Admins can also take advantage of pre-defined criteria suggested by Facebook to help limit spam and manage conflicts.
One notable addition is a new type of moderation alert dubbed “conflict alerts.” The feature, currently in testing, will notify admins when a potentially contentious or unhealthy conversation is taking place in their group, according to Facebook. This lets an admin act quickly, whether by turning off comments, restricting who can comment, removing a post, or however else they’d like to handle the situation.
Conflict alerts are powered by machine learning, Facebook explains. The model examines multiple signals, such as response time and comment volume, to determine whether an exchange between users has led, or could lead, to negative interactions, according to the company.
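To make the idea concrete, here’s a minimal, hypothetical sketch of how such signals might be combined into an alert. Facebook hasn’t published its model; the two signals (response time and comment volume) come from its description, but every name, threshold and weight below is invented for illustration.

```python
# Hypothetical conflict-alert heuristic. The thresholds, weights and
# scoring formula are invented; only the two signals come from
# Facebook's description of the feature.
from dataclasses import dataclass

@dataclass
class ThreadStats:
    avg_reply_seconds: float  # mean time between consecutive comments
    comments_last_hour: int   # comment volume over the past hour

def conflict_score(stats: ThreadStats) -> float:
    """Combine signals into a 0..1 score; higher suggests a more heated thread."""
    # Rapid back-and-forth replies can indicate an argument.
    speed = min(1.0, 60.0 / max(stats.avg_reply_seconds, 1.0))
    # A burst of comments can indicate a pile-on.
    volume = min(1.0, stats.comments_last_hour / 100.0)
    return 0.5 * speed + 0.5 * volume

def should_alert(stats: ThreadStats, threshold: float = 0.6) -> bool:
    return conflict_score(stats) >= threshold

print(should_alert(ThreadStats(avg_reply_seconds=20.0, comments_last_hour=90)))  # True
print(should_alert(ThreadStats(avg_reply_seconds=600.0, comments_last_hour=5)))  # False
```

A production system would presumably use a trained model over many more signals, but the shape of the decision, scoring a thread and alerting above a threshold, is the same.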
It’s kind of like an automated extension of the keyword alerts feature that many admins already use to find certain topics that lead to controversial conversations.
A related new feature will let admins limit how often specific members can comment, or how often comments can be added to posts the admins select.
When enabled, members can leave one comment every five minutes. The idea here is that forcing users to pause and reflect on their words in the midst of a heated debate could lead to more civilized conversations. We’ve seen this concept on other social networks as well, like Twitter’s prompts asking users to read an article before retweeting it, or its warnings that flag potentially harmful replies and give you a chance to edit before posting.
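Mechanically, the one-comment-per-five-minutes rule boils down to simple per-member rate limiting. The sketch below illustrates that mechanic; the class and method names are invented, not Facebook’s implementation.

```python
# Sketch of per-member comment throttling (one comment every five
# minutes), per the article. All names are invented for illustration.
import time

class CommentThrottle:
    def __init__(self, interval_seconds=300):
        self.interval = interval_seconds
        self.last_comment = {}  # member_id -> timestamp of last allowed comment

    def allow(self, member_id, now=None):
        """Return True if the member may comment now, recording the comment."""
        now = time.time() if now is None else now
        last = self.last_comment.get(member_id)
        if last is not None and now - last < self.interval:
            return False  # still inside the cooldown window
        self.last_comment[member_id] = now
        return True

throttle = CommentThrottle()
print(throttle.allow("member_a", now=0))    # True: first comment
print(throttle.allow("member_a", now=60))   # False: only a minute has passed
print(throttle.allow("member_a", now=301))  # True: five-minute window elapsed
```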
Facebook, however, has broadly embraced engagement on its platform, even when it doesn’t lead to positive interactions or experiences. Although small, this particular feature is an admission that building a healthy online community sometimes means people shouldn’t be able to respond immediately with the first thought that occurs to them.
Additionally, Facebook is testing tools that allow admins to temporarily limit the activity of certain group members.
If used, admins will be able to determine how many posts (between 1 and 9) per day a given member can share, and how long that limit stays in effect (12 hours, 24 hours, 3 days, 7 days, 14 days or 28 days). Admins will also be able to determine how many comments (between 1 and 30, in increments of 5) per hour a given member can leave, and for how long that limit applies (again, 12 hours, 24 hours, 3 days, 7 days, 14 days or 28 days).
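For readers keeping track of the options, here’s a small, hypothetical validator encoding the ranges above. Whether a value of 1 sits alongside the multiples of 5 for comments is an assumption based on the article’s wording, and the function name is invented.

```python
# Hypothetical validator for the member-limit options the article lists.
# The exact comment increments are an assumption; durations map
# 3/7/14/28 days to hours.
ALLOWED_DURATION_HOURS = {12, 24, 72, 168, 336, 672}
ALLOWED_POSTS_PER_DAY = set(range(1, 10))              # 1..9 posts per day
ALLOWED_COMMENTS_PER_HOUR = {1, 5, 10, 15, 20, 25, 30}

def validate_limits(posts_per_day, comments_per_hour, duration_hours):
    """Check a proposed member limit against the article's stated options."""
    return (posts_per_day in ALLOWED_POSTS_PER_DAY
            and comments_per_hour in ALLOWED_COMMENTS_PER_HOUR
            and duration_hours in ALLOWED_DURATION_HOURS)

print(validate_limits(5, 10, 24))   # True: all values within the options
print(validate_limits(12, 10, 24))  # False: posts per day capped at 9
```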
On a related note, a new member summary feature will give admins an overview of each member’s activity in their group, letting them see how many times a member has posted and commented, had posts removed, or been muted.
Facebook doesn’t say how admins should use the new tool, but one imagines they would take advantage of the detailed summaries to do the occasional cleanup of their member base, removing bad actors who continually disrupt discussions. They could also use it to find regular contributors with no violations and elevate them to moderator roles, perhaps.
Admins will also be able to flag their group’s rules in comment sections, ban certain types of posts (polls or events, for example), and appeal to Facebook to review decisions related to group violations in the event of an error.
Of particular interest is the return of Chats, which was previously announced, although a bit buried amid the multitude of other news.
Facebook abruptly removed the chat feature in 2019, possibly due to spam, some had speculated. (Facebook said it was a product infrastructure issue.) As before, chats can have up to 250 people, including active members and those who have opted into chat notifications. Once that limit is reached, other members won’t be able to engage with that specific chat until existing participants leave or turn off notifications.
Now, group members can start, search for, and engage in chats with others within Facebook Groups instead of using Messenger. Admins and moderators can also have their own chats.
Notably, this change follows the rise of messaging-based social networks like IRL, a new unicorn (thanks to its $1.17 billion valuation), as well as the growth seen by messaging apps like Telegram, Signal and other alternative social networks.
Along with this vast set of new features, Facebook has also made changes to some existing features, based on feedback from admins.
Facebook is now testing pinned comments and has introduced a new “Admin Announcement” post type that notifies group members of important news (provided they receive notifications for that group).
Additionally, admins will be able to share feedback when they decline group members.
The changes will be rolled out to Facebook groups around the world in the coming weeks.