Source: https://techcrunch.com/2023/06/08/blueskys-growing-pains-strain-relationship-with-its-black-community-moderation/

Bluesky, the decentralized social network and frontrunner alternative to Twitter, has been hailed as a wonderland of funny posts and good vibes. But a moderation policy change that followed a death threat against a Black user has many on Bluesky questioning if the platform is safe for marginalized communities after all.

Bluesky had around 50,000 users by the end of April. Its user base has doubled since then, and as it grows, the platform faces increased pressure to crack down on hate speech and other violent comments. As a soon-to-be-federated platform, Bluesky is at a turning point that could set the precedent for moderating decentralized social networks. Though robust moderation wasn't one of Bluesky's founding principles, many users expect the site to be more proactive in refusing to platform bigotry, even if that conflicts with Bluesky's decentralized goals.

Bluesky has not announced a specific timeline for federation. In a May 5 blog post, the Bluesky team said it plans to launch a "sandbox environment" to begin the testing phase of federation "soon."

It started in the hellthread last month. The thread, which initially formed when a coding bug notified every user in it each time another user responded, grew into a chaotic, seemingly infinite discussion board with countless subthreads. Initially a shitposting outlet, the thread devolved into a hotbed of discourse, opening the door for rampant racism.

Aveta, a software engineer who has invited hundreds of Black users to Bluesky in hopes of recreating Black Twitter, replied in the thread asking people to stop posting R. Kelly memes. Aveta is well known on Bluesky for expanding the Black community on the platform, and she is an outspoken advocate for acknowledging Black influence on internet culture.

Last month, she had a dispute with Alice, a Bluesky user who went by cererean, over comments that Alice made about the growing Black community. Alice has made multiple racist posts in the past month, including one saying that Black users are welcome to create their own spaces if they don't want to be somewhere that "reflects the demographics of the Anglosphere."

In response to another comment about Aveta's hellthread interactions, Alice suggested that Aveta get shoved off "somewhere real high." Aveta, who declined to comment out of fear of harassment, described Alice's comment as a death threat in posts on Bluesky.

Other users reported Alice's comment as a violation of Bluesky's policy prohibiting extreme violence.
Bluesky's moderation team did not initially ban Alice, and it provoked further outrage among users when Bluesky CEO Jay Graber announced a change in the platform's policies that appeared to excuse comments like Alice's.

"We do not condone death threats and will continue to remove accounts when we believe their posts represent targeted harassment or a credible threat of violence. But not all heated language crosses the line into a death threat," Graber said in a weekend thread. "Wisely or not, many people use violent imagery when they're arguing or venting. We debated whether a 'death threat' needs to be specific and direct in order to cause harm, and what it would mean for people's ability to engage in heated discussions on Bluesky if we prohibited this kind of speech."

Under Bluesky's new policy, any post that threatens violence or physical harm, whether literal or metaphorical, will result in a temporary account suspension. Repeat offenders will be banned from Bluesky's server, but once Bluesky finishes the "work required for federation," Graber said, users will be able to move to a new server with their mutuals and other data intact.

Like Mastodon, Bluesky aims to be a decentralized, federated social network. It isn't federated yet, so all users still interact on Bluesky's server and have to abide by Bluesky's policies. Once it is federated, any user on any server running the AT Protocol will be able to "opt in" to a community labeling system that would include certain content filters.

That means that under Bluesky's new content moderation policy, a user who was suspended for hate speech or violent threats would still be able to engage with other servers running on the AT Protocol. Bluesky has always been transparent about becoming a decentralized social network, but the swift action it previously took against users who threatened others convinced many early adopters that the platform would continue to shut down violent or hateful rhetoric.

"While this may not be your vision necessarily, I think a lot of people are less concerned with moving to a new instance of Bluesky, than making sure bigots are not able to have *any* instance on here," Ben Perry, a Bluesky user also known as tedcruznipples, replied to Graber's thread. "They shouldn't be given the opportunity to have federation and proliferate their message."

Bluesky rolled out custom algorithms the day after Graber announced the new moderation policy. The feature allows users to choose from Bluesky's "marketplace of algorithms" instead of only seeing content from the single "master algorithm" that most social media sites employ. Like Twitter lists, users can toggle between the "What's hot" tab, a tab of people they follow and tabs for custom feeds they've pinned. The "Cat Pics" feed shows, predictably, cat pics, while other feeds lean more toward memes and NSFW content.
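To make the "marketplace of algorithms" idea concrete, the sketch below models a custom feed as nothing more than a filter plus a ranking over candidate posts. The types, feed names and sample data are invented for illustration; Bluesky's actual feed generators are separate services speaking the AT Protocol, which this does not attempt to reproduce.

```typescript
// Hypothetical illustration: a custom feed as a filter plus a ranking over posts.
// These types are invented for the sketch, not Bluesky's real feed generator interface.

interface Post {
  uri: string;        // at:// URI of the post
  text: string;
  likeCount: number;
  indexedAt: number;  // epoch milliseconds
}

// A feed is just a function from candidate posts to an ordered list of posts.
type Feed = (candidates: Post[]) => Post[];

// A "Cat Pics"-style feed: keep only cat posts, newest first.
const catPics: Feed = (candidates) =>
  candidates
    .filter((p) => /\bcat\b/i.test(p.text))
    .sort((a, b) => b.indexedAt - a.indexedAt);

// A "What's hot"-style feed: rank everything by engagement.
const whatsHot: Feed = (candidates) =>
  [...candidates].sort((a, b) => b.likeCount - a.likeCount);

// A user pins the feeds they want; the client shows each pinned feed as a tab.
const pinnedFeeds: Record<string, Feed> = {
  "cat-pics": catPics,
  "whats-hot": whatsHot,
};

// Example with made-up posts: only the cat post survives the "Cat Pics" feed.
const posts: Post[] = [
  { uri: "at://did:example:a/post/1", text: "my cat at sunset", likeCount: 12, indexedAt: 1686000000000 },
  { uri: "at://did:example:b/post/2", text: "hot take about phones", likeCount: 87, indexedAt: 1686000100000 },
];
console.log(catPics(posts).map((p) => p.uri));
```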
But many Bluesky users, particularly Black Bluesky users, questioned the timing of the rollout. Rudy Fraser, who created a custom algorithm for Black users called Blacksky, said it was unfortunate that Bluesky tried to offer custom algorithms as a "solution" to the moderation debate.

"As if a new feature would resolve the underlying issue and as if they couldn't just ban the offending user," Fraser said. "Some form of programmable moderation is on the horizon, but there's not yet a prototype to see how it would work... There are already ways around the NSFW tags for example. They need to find those bugs before they reach critical mass."

Moderating decentralized social networks is a challenge that by definition offers no easy solutions. In Bluesky's case, establishing and enforcing centralized community guidelines for all users seems antithetical to its aspirational system of federation and customizable moderation. On Mastodon, moderation is unique to each server, and one server's policies can't be enforced on another server with a different set of rules. To be listed in Mastodon's server picker, servers must commit to the Mastodon Server Covenant, which requires "active moderation against racism, sexism, homophobia and transphobia." While most prominent servers abide by the Covenant, unlisted servers aren't held to a minimum standard of moderation.

The fediverse, a portmanteau of "federated universe," promises a vast social network that can exist beyond the authority of a single institution. Though there are benefits to that level of independence, the approach to community-led moderation is often optimistic at best and negligent at worst. Platforms can absolve themselves of the burden of moderation, which is labor intensive, costly and always divisive, by letting users take the wheel instead.

Allowing communities to moderate themselves also allows violent hate to go unchecked. In a recent skeet, software developer Dare Obasanjo pointed out that many "techno-optimistic" approaches to content moderation fail to account for context.

"A user with a virulent racist history wishing harm on a BIPOC is different from the same comment in a debate about MCU versus DCEU movies from otherwise well behaved users," Obasanjo wrote. "A legalistic discussion of whether 'someone should push you off of a tall building' is a ban worthy offense misses the point completely. The question is whether you tolerate openly racist people wishing harm on BIPOC on your app or not?"

Bluesky employs automated filtering to weed out illegal content and do a first pass at labeling "objectionable material," as described in a blog post about the platform's composable moderation. Bluesky then applies server-level filters that let users hide, show or place a warning on content that may be explicit or offensive, and it plans to let users opt in to certain filters to further customize their individual feeds. The ACLU, for example, can label certain accounts and posts as "hate-speech." Other users will be able to subscribe to the ACLU's content filter to mute that content.
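As a rough sketch of how that kind of opt-in, label-based filtering could fit together, the snippet below models labels, labeler subscriptions and a per-label hide/warn/show preference. Every identifier here (the types, the did:example values, the "hate-speech" label) is a hypothetical stand-in, not the actual AT Protocol labeling API.

```typescript
// Hypothetical sketch of opt-in, label-based filtering; not the real AT Protocol API.

type LabelAction = "hide" | "warn" | "show";

interface Label {
  src: string;  // identifier of the labeling service that applied the label
  val: string;  // label value, e.g. "hate-speech"
  uri: string;  // the labeled post or account
}

interface LabelerSubscription {
  labelerDid: string;                  // a labeler the user has opted in to
  prefs: Record<string, LabelAction>;  // the user's per-label preference
}

// Decide how a client should treat a post, given its labels and the user's
// subscriptions. Labels from services the user never subscribed to are ignored.
function resolveAction(postLabels: Label[], subs: LabelerSubscription[]): LabelAction {
  let action: LabelAction = "show";
  for (const label of postLabels) {
    for (const sub of subs) {
      if (sub.labelerDid !== label.src) continue;
      const pref = sub.prefs[label.val];
      if (pref === "hide") return "hide";  // hide takes precedence over warn
      if (pref === "warn") action = "warn";
    }
  }
  return action;
}

// Example: a user subscribes to a hypothetical labeler and chooses to hide "hate-speech".
const subs: LabelerSubscription[] = [
  { labelerDid: "did:example:labeler", prefs: { "hate-speech": "hide" } },
];
const labels: Label[] = [
  { src: "did:example:labeler", val: "hate-speech", uri: "at://did:example:user/post/1" },
];
console.log(resolveAction(labels, subs)); // -> "hide"
```

The key design point the article describes is that the filtering decision happens on the subscriber's side: a suspended or labeled account still exists on the network, and each user (or server) chooses which labelers' judgments to apply.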
Graber wrote that the layered, customizable moderation system aims to "prioritize user safety while giving people more control." The company hasn't publicly clarified whether it also plans to hire human moderators.

Moderation in Bluesky's early days has been met with a mixed reception from users. In April, Bluesky banned a user who went by Hannah for responding to Matt Yglesias with, "WE ARE GOING TO BEAT YOU WITH HAMMERS." Many Bluesky users protested the ban and insisted that the user was joking. Days later, Bluesky swiftly banned another account that had harassed other users with transphobic comments.

Hannah's hammer ban resurfaced on Bluesky in the wake of the moderation policy change. Black Bluesky users questioned why the threat against Aveta wasn't taken as seriously as Hannah's reply. Mekka Okereke, director of engineering at Google Play, described Hannah's comment as "metaphorical and inappropriate," but pointed out that "people just can't empathize when Black women are the subject."

"And as I've said on here before, 'echo chamber' is a specific term mostly used by right wing news outlets to describe any place that tries to make Black, brown, and LGBTQIA people feel safe," Okereke said in a post. "And the 'truth matters' philosophical pedantism only seems to come out when we're talking about making Black women feel safe online."

Pariss Athena, who founded the job board Black Tech Pipeline, conceded that no online space is truly safe, but pointed out that direct racism, transphobia and anti-Blackness are "not blurred lines."
Bluesky's growing pains strain its relationship with Black users
2023-06-08 21:48:14