Meta is reforming ‘Facebook jail’ in response to the Oversight Board
It’s now going to be harder to land in “Facebook jail.” Meta says it’s reforming its penalty system so that people are less likely to have their accounts restricted for less serious violations of the company’s rules.
“Under the new system, we will focus on helping people understand why we have removed their content, which is shown to be more effective at preventing re-offending, rather than so quickly restricting their ability to post,” Meta explains in a blog post. “We will still apply account restrictions to persistent violators, typically beginning at the seventh violation, after we’ve given sufficient warnings and explanations to help the person understand why we removed their content.”
Previously, users could land in “Facebook jail,” which could prevent them from posting on the platform for 30 days at a time, for relatively minor infractions. Meta says that it sometimes imposed these types of penalties mistakenly due to “missed context.” For example, someone who jokingly told a friend they would “kidnap” them, or posted a friend’s address in order to invite others to an event, may have been wrongly penalized. These punishments were not just unfair for “well-intentioned” users, but in some cases actually made it more difficult for the company to identify actual bad actors.
With the new system, users may still be restricted from certain features, like posting in groups, after a strike, but will still be able to post elsewhere on the service. Longer, 30-day restrictions will be reserved for a user’s tenth strike, though the company may impose harsher restrictions for “severe” rule violations. Facebook users will be able to view their past violations and details about account restrictions in the “Account Status” section of the app.
Meta notes that the overhaul comes as a result of feedback from the Oversight Board, which has repeatedly criticized Meta for not providing users with information about why their posts were removed. In a statement following Meta’s new policy, the board said the changes were “a welcome step in the right direction,” but that “room for improvement remains.”
The board notes that the latest changes don’t do anything to address “severe strikes,” which can have an outsize impact on activists and journalists, especially when the company makes a mistake. The Oversight Board also said that Meta should provide users the opportunity to add context to their appeals, and that the information should be available to its moderators.
Meta’s Oversight Board will take on more cases and make decisions faster
Meta’s Oversight Board says it will review more cases and fast-track some within as little as 48 hours. “Increasing the number of decisions we produce, and the speed at which we do so, will let us tackle more of the big challenges of content moderation, and respond more quickly in situations with urgent real-world consequences,” the board wrote in a blog post.
Although previous versions of the Oversight Board’s bylaws mentioned expedited reviews of Facebook and Instagram content moderation cases, it has not used this process so far. Under the board’s revised charter and bylaws, Meta can now refer expedited cases to the board with relevant information and an explanation as to why it felt an urgent review was necessary. If the board’s co-chairs decide to take on an expedited case, Meta “agrees to be bound by the board’s ultimate determination,” the bylaws state.
A panel (instead of the board’s entire 23-strong membership) will review expedited cases and come to a decision that’s posted on the Oversight Board’s website within as little as 48 hours. The board notes, however, that this process can take up to 30 days. The target timeframe for standard decisions that demand more in-depth reviews is 90 days.
The board won’t take public comments into account for expedited cases due to time constraints. It might also choose to carry out expedited reviews of user appeals.
We have designed new procedures that will allow us to act quickly and maximize our impact in urgent situations through expedited review.
Our expedited decisions could be published as soon as 48 hours after accepting a case, but in some cases it might take longer – up to 30 days.
— Oversight Board (@OversightBoard) February 14, 2023
Meanwhile, the Oversight Board plans to publish its first summary decisions. It said that after a committee chooses a list of cases that the board may consider, Meta sometimes reverses its original decision. The company has done so around 80 times so far, mostly to restore content it originally yanked. The board notes that while it has published full decisions on a few of these cases, most have only been summarized in its transparency reports.
Moving forward, a committee will choose some of these cases in which Meta changed its mind. A panel (not the full board) will review them and publish summary decisions. These will include details about the original decision that Meta walked back and they won’t take public comments into account. “We believe that these cases hold important lessons and can help Meta avoid making the same mistakes in the future,” the board said.
Since it formed just over two years ago, the board has published 35 case decisions relating to moves by Facebook and Instagram to remove content or allow it to remain on the platforms. Last quarter alone, Meta users submitted 193,137 cases for review.
While it’s unlikely that the board’s latest steps mean it will review anything close to the full number of cases it receives, the group should be able to address high-profile, urgent cases more quickly, such as Meta’s decision to indefinitely suspend former President Donald Trump from its platforms for his role in the January 6th, 2021 insurrection. The company restored his accounts earlier this month, but Trump has yet to post on them again.
Meanwhile, the Oversight Board has published its latest quarterly transparency report (PDF). The body says it has now made 196 policy recommendations to Meta, “many of which are already improving people’s experiences of Facebook and Instagram.” By the end of October, the company had fully implemented 24 of the recommendations and had made progress on enacting dozens of others (Meta did not provide its fourth quarter update to the board before the transparency report was published).
The Oversight Board has also added a new board member. Kenji Yoshino is the Chief Justice Earl Warren Professor of Constitutional Law at New York University School of Law and the Director of the Meltzer Center for Diversity, Inclusion and Belonging. The board noted that he specializes in constitutional law; antidiscrimination law; and law and literature.
Facebook and Instagram told to overhaul nudity policies by Oversight Board
People have been fighting to #FreeTheNipple on Instagram and Facebook for years. Now, Meta’s Oversight Board – a group of academics, lawyers, and rights experts – has recommended that the company update its rules around adult nudity to “respect international human rights standards”.
In a statement on Jan. 17, the board recommended an overhaul of Meta’s Adult Nudity and Sexual Activity Community Standard, advising that the company put forth “clear, objective, rights-respecting criteria” regarding its policies in this area, “without discrimination on the basis of sex or gender”. The decision comes after the board examined two posts from an account belonging to an American couple who are non-binary and transgender.
The posts – one shared in 2021 and the other in 2022 – displayed the couple topless but with their nipples covered. The captions featured a discussion about transgender healthcare and gender-affirming surgery. These posts were flagged by users and later removed for violating “the Sexual Solicitation Community Standard,” seemingly due to a fundraising link for said surgery.
Instagram restored the posts after the couple appealed, and after investigating, the board overturned Meta’s original decision, stating that both cases “highlight fundamental issues with Meta’s policies”.
The Oversight Board operates independently from Meta but is funded by the company, advising it on content moderation. In the group’s statement, it said, “The restrictions and exceptions to the rules on female nipples are extensive and confusing,” and even more so when it comes to transgender and non-binary people. Examples cited include posts about breast cancer awareness, top surgery, childbirth, and protests.
“…the Board finds that Meta’s policies on adult nudity result in greater barriers to expression for women, trans, and gender non-binary people on its platforms,” reads the post.
Instagram and Facebook’s rules have grown more arbitrary over the years, with exceptions made in some instances and lax enforcement in others. Female nudity has always been more staunchly censored on the platforms, and continues to be.
Back in 2020, Instagram altered its nudity policy after backlash against its censorship of plus-size Black women on the platform. After a sweeping campaign from activist Nyome Nicholas-Williams, the policy change allowed for breast hugging, cupping, and holding to be shown in posts. In 2021, following an Oversight Board decision, Facebook updated its nudity policies, allowing for some nuance and permitting “health-related nudity”. This wasn’t exactly a win but a step forward.
This new guidance could mean the ban on nipples and bare breasts may soon be a thing of the past. And it’s been a long time coming.
Oversight Board presses Meta to revise ‘convoluted and poorly defined’ nudity policy
Meta’s Oversight Board, which independently evaluates difficult content moderation decisions, has overturned the company’s takedown of two posts that depicted a nonbinary and transgender person’s bare chest. The case represents a failure of a convoluted and impractical nudity policy, the Board said, and recommended that Meta take a serious look at revising it. The decision […]
Oversight Board presses Meta to revise ‘convoluted and poorly defined’ nudity policy by Devin Coldewey originally published on TechCrunch
Facebook Gives Powerful, Influential Users Special Treatment, Oversight Board Says – CNET
Meta Oversight Board Says Facebook and Instagram Skirt Moderation Rules for Famous People
On Tuesday, Meta’s Oversight Board dropped a report of more than 50 pages detailing how the company needs to overhaul its systems that have allowed major influencers and celebrities leeway to post disingenuous or harmful content that would otherwise be moderated.
Meta’s Oversight Board comes out swinging against Facebook’s VIP ‘cross-check’ program
The Oversight Board, an independent body set up to review Facebook and Instagram content decisions and policies, slammed the company on Tuesday over its cross-check program. In its statement, the Oversight Board laid out a number of changes Facebook’s parent company Meta should make regarding content moderation across its social media platforms.
Cross-check is an internal Facebook program that was portrayed as a “quality control” measure — a way to double check a content decision for potential moderation when it came to high-profile Facebook users. As Facebook reviews millions of pieces of content a day, the company is bound to make mistakes. The cross-check system was put in place to help limit faulty content takedowns from users deemed a priority to the company.
However, according to a report from the Wall Street Journal, the program effectively set up a two-tier moderation system: one for high-profile Facebook users and another for everyone else.
Basically, thanks to cross-check, celebrities, politicians, and other influencers were able to routinely break Facebook and Instagram’s rules without facing penalties like the ones doled out to every other user. As many as 5.8 million accounts made the cross-check whitelist at one point. These names included former president Donald Trump and Mark Zuckerberg himself.
The Board came out hard in a statement, accusing Meta of not initially being truthful with it about the cross-check program.
“In our review, we found several shortcomings in Meta’s cross-check program,” writes the Oversight Board. “While Meta told the Board that cross-check aims to advance Meta’s human rights commitments, we found that the program appears more directly structured to satisfy business concerns.”
While the Board said it understands that Meta is a business, it found that the company failed to uphold its own content policies, failed to track data on the program’s results, and failed to be transparent about the program.
The cross-check program first came to light when Facebook whistleblower Frances Haugen shared internal documents regarding real-world harm stemming from the social media platform. Haugen briefed the Oversight Board on cross-check, among other issues revealed in the documents.
The Oversight Board suggested a number of changes to the program, mostly surrounding transparency. For example, the Oversight Board said that Meta should mark the accounts of users that are part of the cross-check program. The Board suggested that this would “allow the public to hold privileged users accountable for whether protected entities are upholding their commitment to follow the rules.”
In addition, the Oversight Board recommended that Facebook still take down certain “high severity” content, regardless of a user being part of cross-check. If a user whitelisted by cross-check continuously breaks the rules, the Oversight Board suggests that Meta remove their account from the program.
While Meta acts on the Oversight Board’s rulings on specific content moderation decisions, such as the reinstatement of a specific post from a user, the Board’s policy change recommendations are just that: recommendations. Meta is not bound to listen to the Oversight Board’s suggested changes to the cross-check program.
Meta’s oversight board calls for changes around VIP content moderation
The oversight board said the controversial ‘cross-check’ system has led to an unequal treatment of users on Facebook and Instagram, while allowing potentially harmful content to stay online for longer.
Twitter layoffs trigger oversight risk warning from Brussels
In another move that’s being frowned upon by European Union regulators, Elon Musk-owned Twitter has closed its Brussels office per a report in the Financial Times — citing sources with knowledge of the departures. Staffers in the office were focused on European Union digital policy, working in close proximity to the seat of power of […]
Twitter layoffs trigger oversight risk warning from Brussels by Natasha Lomas originally published on TechCrunch