
Meta Platforms, the parent company of social media giants Instagram and Facebook, announced this week its plan to control “potentially unwelcome or unwanted comments” regarding the ongoing conflict between Hamas and Israel.
In a blog post published this week, the tech giant said it is offering all users a “temporary measure” designed to “protect them [from] unwanted or unwelcome” comments regarding the Middle East conflict.
In addition, the default comment setting on new public Facebook posts created by users located “in their region” will be changed so that only comments from friends or followers are shown.
Users of both platforms can change or opt out of that setting at any time, the company added.
A Meta spokesperson would not specify how the company defines a “region.”
The company did say that its new policies were designed to “keep people safe … [while] giving everyone a voice.”
In the blog post, Meta wrote:
“After the terrorist attack by Hamas against Israel [October 7], and Israel’s response in Gaza, our teams introduced a series of measures to address the spike in harmful and potentially harmful content spreading on our platforms. Our policies are designed to keep people safe on our apps while giving everyone a voice.”
The new policy will be applied “equally around the world,” the company wrote, adding that “there is no truth to the suggestion that we are deliberately suppressing voice.”
Earlier this week, Meta was accused of suppressing content posted by people who supported either the citizens of Gaza or Palestine as a whole.
Mondoweiss, a website that covers the human rights of Palestinians, said Instagram twice suspended the profile of one of its video correspondents. Other Instagram users said their stories and posts related to Palestine received no views at all.
In response, Meta said it was fixing an Instagram “bug” that caused re-shared content to display incorrectly in users’ stories, which are set to disappear after 24 hours.
Meta previously said it had created what it called a “special operations center,” staffed with experts fluent in both Arabic and Hebrew, to monitor its platforms more effectively and remove content that violates its policies more swiftly.
The company added that Hamas, along with any known members of the organization, is banned from Instagram and Facebook under its policy on dangerous individuals and organizations.
In a statement, Meta said:
“We want to reiterate that our policies are designed to give everyone a voice while keeping people safe on our apps. We apply these policies regardless of who is posting or their personal beliefs, and it is never our intention to suppress a particular community or point of view.”