Instagram Live Streamers Can Assign Their Own Moderators To Clean Up Chat

Instagram Live streamers can now assign someone to moderate their stream as it happens, freeing up broadcasters to focus on their content.

Instead of having to do all the moderating themselves, streamers can deputize someone to clean up chat by reporting comments, turning off comments for specific viewers, or even booting viewers from the stream entirely. To assign a moderator after starting an Instagram Live, streamers tap the "..." button in the comment bar and either choose from a list of suggested accounts or manually search for one.

Rumors of assigned moderators first surfaced in November, when leaker Alessandro Paluzzi tweeted screenshots showing the feature in action. Instagram says it's adding the capability to help streamers keep their broadcasts safe and civil for themselves and their viewers.

Read more: Instagram Boss Says App Will 'Rethink What Instagram Is' in 2022

Instagram parent company Meta has a long history with moderating its social networks, and it struggled to police extreme content on Facebook during the pandemic. Even the content it does allow has drawn scrutiny, as studies have linked young people's use of Instagram and rival social media platform TikTok to body image issues and eating disorders.

The company added more tools in June for group administrators to moderate comments themselves, though reports later in the year said self-harm content remained easy to find and that bullying and harassment were still prevalent on Meta's social media platforms. Still, adding livestream moderators -- especially trusted accounts that can clean up chat and remove viewers as aggressively as needed -- gives content creators more tools to keep their own spaces safe.
