Given that the internet has made everyone a potential entertainer or commentator, it's only natural that we might be interested in how many people are reading, watching or listening to us. But that interest can quickly become an obsession – after all, getting an audience isn't easy, and sustaining it is even more difficult. An entire industry has been built to advise people on maximising the reach of their posts on Facebook, Instagram, Twitter, TikTok and other social media platforms, but the factors that make one tweet, image or video more popular than any other are largely unknown. Into this information vacuum rushes annoyance, anger and accusation, the most common complaint being that of "shadowbanning", the idea that platforms, intentionally or not, are quietly suppressing our content without our knowledge, and aren't giving it an equal chance to be seen and heard.

Only last month, shadowbanning was blamed for the disappearance of social media posts reporting on the impending eviction of Palestinian families from Sheikh Jarrah in East Jerusalem. Instagram responded with a statement that the deletions were a global issue which happened to include those Palestinian posts, but the supposition remains that they were the result of a widespread, secretive practice that penalises people unfairly.

Adam Mosseri, the head of Instagram, has sought to address some of the criticism levelled at his platform in a series of blog posts, the first of which begins: "It's hard to trust what you don't understand." He lays out in detail how its automated systems rank people's photos and videos, and the reasons why they might be demoted. Regarding shadowbanning, Mosseri had previously stated, during a Q&A on Instagram, that it "is not a thing" – that it doesn't exist. But his comments are now more wary, possibly because several studies have shown that it does exist in some form. "We can't promise you that you'll consistently reach the same amount of people when you post," he says. "But we can be more transparent and work to make fewer mistakes."

Evidence of shadowbanning tends to surface where platforms apply their own rules on inappropriate content inconsistently. Double standards surrounding nudity and semi-nudity are said to affect content posted by athletes, educators and artists, and have been shown to have a disproportionate impact on women and people of colour. Automated systems have been shown to struggle with languages such as Arabic, resulting in the overzealous removal of posts, while content that brushes up against the rules – even without breaking them – can frequently find itself demoted.

The fact that the majority of these demotions are automated and performed by algorithms means there is little transparency, which in turn breeds paranoia, says Carolina Are, an online moderation researcher at City, University of London. "People tend to believe in conspiracy theories at times of uncertainty," she says. "So if people feel uncertain about what's happening to their content on the platform, they're going to come up with their own reasons for why it's not doing well. The fact that platforms like Instagram have had to apologise for censoring users means it's only natural that people are going to think platforms are trying to reduce the reach of their content."

As a result of this uncertainty, accusations of shadowbanning have become a convenient and powerful weapon for the alt-right in the US, with former president Donald Trump tweeting about Republicans being subjected to this "discriminatory and illegal practice", thus helping to spread the notion that social media platforms have an inherent political bias. This idea continues to swirl; this year, Hungary's minister of justice, Judit Varga, accused Facebook of suppressing "Christian conservative, right-wing opinions".

Such accusations are always firmly denied by the platforms, and the real reasons for any piece of content failing to find an audience are often more prosaic. "For example, you might see that an Instagram user doesn't use the 'Reels' feature, which Instagram is trying hard to push at the moment," says Are. "So it could be that they're not recommending that account because it's not doing what they're wishing it to do." This issue was highlighted in Instagram's belated response to the accusations of Palestinian censorship: it said that its policy of favouring original content over reposts of identical content was to blame, and promised to change that prioritisation.

As new users join these platforms and content proliferates, algorithms will have to work harder to sift through it all and recommend the things we are most likely to want to see. But Are believes the platforms need to be held more accountable for the changes they make, and for the knock-on effects that may occur. "They might say, oh, we have too much content, so some mistakes are going to be made," she says. "They might say [in relation to censorship], oh, we have so many users and we need to please everyone. But do they really need to rule over that much content? I think platforms are trying to do too much. It's not realistic for them to have a successful moderation system on that scale."

But control over the promotion of online content, bound up as it is with financial, corporate and cultural considerations, is not something Big Tech will give up easily. Transparency is probably the best we can hope for. "The lack of clarity with which platforms recommend, moderate and circulate content is really striking," says Are. "As a user, you are left posting into a void and hoping for your content to do well, without the platform telling you what's going on. It's a service that we're using, and we should be told how to use it the most to our advantage. I don't think that is happening at the moment."