A recent report into user engagement with brands on Instagram revealed that throughout the pandemic, particularly at the height of lockdown, engagement dipped to its lowest level of 2020 so far.
Yet, beyond this decline in engagement with brands, engagement that may well help small businesses thrive, the platform has experienced further trouble regarding its policy on removing harmful and hateful content.
The children’s charity the NSPCC has said that the drop in removals of such content by Facebook (which runs Instagram) was a ‘significant failure in corporate responsibility’. Instagram has attempted to justify this by asserting that the restrictions imposed to curb the spread of the virus meant that content moderators were sent home. Now, it claims that as moderators return to work, the removal of harmful content has increased.
However, is this enough? Tara Hopkins, the head of Instagram’s public policy, has said that ‘from July to September we took action on 1.3m pieces of suicide and self-harm content, over 95% of which we found proactively.’ Yet, in its article on the matter, the BBC used the heading ‘not surprised’ to quote Chris Gray, a former Facebook moderator now involved in a legal dispute with the company: ‘It’s chaos. As soon as humans are out, we can see… there’s just way, way more self-harm, child exploitation, this kind of stuff on platforms, because there’s nobody there to deal with it.’
It further seems that the Facebook group is struggling to manage working in the era of COVID-19, with workers signing an open letter to Mark Zuckerberg complaining that being forced back into offices they deem unsafe is ‘needlessly risking their lives’. Facebook claims that a ‘majority’ of content reviewers are working from home, though this raises the question of why the amount of harmful content removed dropped so abruptly during the height of the pandemic.
The NSPCC sums up the issue clearly: ‘young people who needed protection from damaging content were let down by Instagram’s steep reduction in takedowns of harmful suicide and self-harm posts,’ said the head of the charity’s child safety online policy.
Over the last few years, and particularly since coronavirus ushered in a ‘new normal’, the world has become heavily dependent on social media as a means of communication between friends and loved ones. Instagram has provided a channel for people to share what they have been up to during lockdown and to engage with others, whether friends or celebrities. However, it seems that the platform itself, as well as the government, which has delayed plans to pass the Online Harms Bill (legislation that would allow big tech companies to be held accountable), needs to do more to ensure that Instagram, among other social media outlets, is a safe cyberspace for everyone.