Hate, abuse and extremism, whether online, on our streets or in schools or workplaces, are of great concern in any society.
Hate crime, the subject of this blog, is defined by a Home Affairs Committee report (published 25 April 2017) as:
“any criminal offence which is perceived, by the victim or any other person, to be motivated by hostility or prejudice based on a personal characteristic. Hate crime can be motivated by disability, gender identity, race, religion or faith and sexual orientation.”
The report recommendations
The Home Affairs Committee report goes into some detail on where things stand now, what positive work is being done, and the ways in which organisations are dealing with hate, abuse and extremism, failing to deal with it, or not acting quickly enough.
Here is a summary of the report’s recommendations:
Advertising revenue derived from extremist videos
Google’s failure to remove extremist content on YouTube is strongly criticised: “one of the world’s largest companies has profited from hatred and has allowed itself to be a platform from which extremists have generated revenue”.
Responsibility to take action
While social media and technology companies, including those who provided evidence for the inquiry, have gone some way to promote online safety and reduce online abuse, not enough is being done.
Removal of illegal content
As well as being illegal in many cases, the existence of extremist and terrorist material online is considered “completely irresponsible and indefensible”. There is particular criticism of:
a) Delays in removing content
b) The ease of people posting the same material under a different name
c) The contrast between organisations’ swift removal of material that breaches copyright compared with their slow removal of extremist content
d) Outsourcing safeguarding by relying on users to report material to moderators
e) Lack of investment in moderating and removing material
The report suggests that organisations could contribute to policing costs in the same way that football clubs are required to, and that more severe penalties should be in place for failing to remove illegal content.
While platforms such as Facebook, Twitter and YouTube all have good community standards, “the interpretation and implementation of the community standards in practice is too often slow and haphazard”. Reviewing the standards and implementing training are recommended.
Social media companies’ response to complaints
This is criticised as inadequate, particularly for members of the general public. There is also criticism of organisations’ lack of transparency about how much they spend on safeguarding and how many people are engaged in this activity.
While technological innovation to help police extremist content is welcomed, solutions are being developed too slowly, and human judgement is still required.
Because many laws came into being before the existence of social media, “The Government should review the entire legislative framework governing online hate speech, harassment and extremism and ensure that the law is up to date.”
About the report
The inquiry into hate crime was announced after Jo Cox MP was murdered in the run-up to the EU referendum vote. There were also concerns following the referendum about the increase in attacks on those from ethnic minorities and non-British nationalities.
The report was produced by a cross-party group, drawing on a large body of written and oral evidence from social media companies, the police, MPs, academics and community organisations.
Although the announcement of the 2017 General Election curtailed the scope of the report, so that its final recommendations related primarily to online hate – the major area of concern – the committee advised that its research should be used for further recommendations in future.
You can read the full report here:
For further information on the Government’s Internet Strategy Green Paper, visit:
Extremist behaviour and language
What is Prevent and why does it exist?
Prevent and Fundamental British Values