Clause 12 - Adults’ risk assessment duties

Online Safety Bill – in a Public Bill Committee at 10:30 am on 13 December 2022.


Question proposed, That the clause stand part of the Bill.

Roger Gale (Conservative, North Thanet)

With this it will be convenient to discuss the following:

Clause 13 stand part.

Government amendments 18, 23 to 25, 32, 33 and 39.

Clause 55 stand part.

Government amendments 42 to 45, 61 to 66, 68 to 70, 74, 80, 85, 92, 51 and 52, 54, 94 and 60.

Paul Scully, Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

To protect free speech and remove any possibility that the Bill could cause tech companies to censor legal content, I seek to remove the so-called “legal but harmful” duties from the Bill. These duties are currently set out in clauses 12 and 13 and apply to the largest in-scope services. They require services to undertake risk assessments for defined categories of harmful but legal content, before setting and enforcing clear terms of service for each category of content.

I share the concerns raised by Members of this House and more broadly that these provisions could have a detrimental effect on freedom of expression. It is not right that the Government define what legal content they consider harmful to adults and then require platforms to risk assess for that content. Doing so may encourage companies to remove legal speech, undermining this Government’s commitment to freedom of expression. That is why these provisions must be removed.

At the same time, I recognise the undue influence that the largest platforms have over our public discourse. These companies get to decide what we do and do not see online. They can arbitrarily remove a user’s content or ban them altogether without offering any real avenues of redress to users. On the flip side, even when companies have terms of service, these are often not enforced, as we have discussed. That was the case after the Euro 2020 final where footballers were subject to the most appalling abuse, despite most platforms clearly prohibiting that. That is why I am introducing duties to improve the transparency and accountability of platforms and to protect free speech through new clauses 3 and 4. Under these duties, category 1 platforms will only be allowed to remove or restrict access to content or ban or suspend users when this is in accordance with their terms of service or where they face another legal obligation. That protects against the arbitrary removal of content.

Companies must ensure that their terms of service are consistently enforced. If companies’ terms of service say that they will remove or restrict access to content, or will ban or suspend users in certain circumstances, they must put in place proper systems and processes to apply those terms. That will close the gap between what companies say they will do and what they do in practice. Services must ensure that their terms of service are easily understandable to users and that they operate effective reporting and redress mechanisms, enabling users to raise concerns about a company’s application of the terms of service. We will debate the substance of these changes later alongside clause 18.

Clause 55 currently defines “content that is harmful to adults”, including “priority content that is harmful to adults”, for the purposes of this legislation. As this concept would be removed with the removal of the adult safety duties, this clause will also need to be removed.

Damian Collins, Chair, Draft Online Safety Bill (Joint Committee)

My hon. Friend mentioned earlier that companies will not be able to remove content if it is not part of their safety duties or a breach of their terms of service. I want to be sure that I heard that correctly, and to ask whether Ofcom will be able to risk assess that process to ensure that companies are not over-removing content.

Paul Scully, Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

Absolutely. I will come on to Ofcom in a second and respond directly to his question.

The removal of clauses 12, 13 and 55 from the Bill, if agreed by the Committee, will require a series of further amendments to remove references to the adult safety duties elsewhere in the Bill. Those amendments are needed to keep the legislation consistent and, importantly, to ensure that platforms, Ofcom and the Secretary of State are not held to requirements relating to adult safety duties that we intend to remove from the Bill. The amendments remove requirements on platforms and Ofcom relating to the adult safety duties, including the references to those duties in the duties to provide content reporting and redress mechanisms and to keep records. They also remove references to content that is harmful to adults from the process for designating category 1, 2A and 2B companies. The amendments in this group relate mainly to the process for category 2B companies.

I also seek to amend the process for designating category 1 services to ensure that they are identified based on their influence over public discourse, rather than with regard to the risk of harm posed by content that is harmful to adults. These changes will be discussed when we debate the relevant amendments alongside clause 82 and schedule 11. The amendments will remove powers that will no longer be required, such as the Secretary of State’s ability to designate priority content that is harmful to adults. As I have already indicated, we intend to remove the adult safety duties and introduce new duties on category 1 services relating to transparency, accountability and freedom of expression. While they will mostly be discussed alongside clause 18, amendments 61 to 66, 68 to 70 and 74 will add references to the transparency, accountability and freedom of expression duties to schedule 8. That will ensure that Ofcom can require providers of category 1 services to give details in their annual transparency reports about how they comply with the new duties. Those amendments define relevant content and consumer content for the purposes of the schedule.

We will discuss the proposed transparency and accountability duties that will replace the adult safety duties in more detail later in the Committee’s deliberations. For the reasons I have set out, I do not believe that the current adult safety duties with their risks to freedom of expression should be retained. I therefore urge the Committee that clauses 12, 13 and 55 do not stand part and instead recommend that the Government amendments in this group are accepted.

Roger Gale (Conservative, North Thanet)

Before we proceed, I emphasise that we are debating clause 13 stand part as well as the litany of Government amendments that I read out.

Alex Davies-Jones, Shadow Minister (Digital, Culture, Media and Sport), Shadow Minister (Tech, Gambling and the Digital Economy) 10:45, 13 December 2022

Clause 12 is extremely important because it outlines the platforms’ duties in relation to keeping adults safe online. The Government’s attempts to remove the clause through an amendment that thankfully has not been selected are absolutely shocking. In addressing Government amendments 18, 23, 24, 25, 32, 33 and 39, I must ask the Minister: exactly how will this Bill do anything to keep adults safe online?

In the original clause 12, companies had to assess the risk of harm to adults, and the original clause 13 outlined the means by which providers had to report those assessments back to Ofcom. This block of Government amendments will make it impossible for any of us—whether users of a platform or service, researchers or civil society experts—to understand the problems that arise on these platforms. Labour has repeatedly warned the Government that this Bill does not go far enough in considering the business models and product design of platforms and service providers that contribute to harm online. By tabling this group of amendments, the Government are once again making it incredibly difficult to understand fully the role of product design in perpetuating harm online.

We are not alone in our concerns. Colleagues from Carnegie UK Trust, who are a source of expertise to hon. Members across the House when it comes to internet regulation, have raised their concerns over this grouping of amendments too. They have raised specific concerns about the removal of the transparency obligation, which Labour has heavily pushed for in previous Bill Committees.

Previously, service providers had been required to inform customers of the harms that their risk assessments had detected, but the removal of those risk assessments means that users and consumers will not have the information they need to assess the nature of the risk on a platform. The Minister may point to the Government’s approach in relation to the new content duties in platforms’ and providers’ terms of service, but we know that there are risks arising from the fact that no minimum content is specified for the terms of service for adults, although of course all providers will have to comply with the illegal content duties.

This approach, like the entire Bill, is already overly complex; that is widely recognised by colleagues across the House and is the view of many stakeholders too. In tabling this group of amendments, the Minister is showing his ignorance. Does he really think that all vulnerabilities to harm online simply disappear at the age of 18? By pushing these amendments, which seek to remove protections for adults from legal but harmful content, the Minister is, in effect, suggesting that adults are not susceptible to harm and that risk assessments are therefore simply not required. That is an extremely narrow-minded view to take, so I must push the Minister further. Does he recognise that many young, and older, adults are still highly likely to be affected by suicide and self-harm messaging, eating disorder content, disinformation and abuse, all of which will be untouched by these amendments?

Labour has been clear throughout the passage of the Bill that we need to see more, not less, transparency and protection from online harm for all of us—whether adults or children. These risk assessments are absolutely critical to the success of the Online Safety Bill and I cannot think of a good reason why the Minister would not support users in being able to make an assessment about their own safety online.

We have supported the passage of the Bill, as we know that keeping people safe online is a priority for us all and we know that the perfect cannot be the enemy of the good. The Government have made some progress towards keeping children safe, but they clearly do not consider it their responsibility to do the same for adults. Ultimately, platforms should be required to protect everyone: it does not matter whether they are a 17-year-old who falls short of being legally deemed an adult in this country, an 18-year-old or even an 80-year-old. Ultimately, we should all have the same protections and these risk assessments are critical to the online safety regime as a whole. That is why we cannot support these amendments. The Government have got this very wrong and we have genuine concerns that this wholesale approach will undermine how far the Bill will go to truly tackling harm online.

I will also make comments on clause 55 and the other associated amendments. I will keep my comments brief, as the Minister is already aware of my significant concerns about his Department’s intention to remove the adult safety duties more widely. In the previous Bill Committee, Labour made it clear that it supports, and thinks it most important, that the Bill should specify the content deemed to be harmful to adults. We have repeatedly raised concerns about missing harms, including health misinformation and disinformation, and this group of amendments once again touches on widespread concerns that the Government’s new approach will leave adults worse off online. The Government’s removal of the “legal but harmful” sections of the Online Safety Bill is a major weakening, not a strengthening, of the Bill. Does the Minister recognise that the only people celebrating these decisions will be the executives of big tech firms and online abusers? Does he agree that this delay shows that the Government have bowed to vested interests over keeping users and consumers safe?

Labour is not alone in having these concerns. We are all pleased to see that the child safety duties are still present in the Bill, but the NSPCC, among others, is concerned about knock-on implications that may introduce new risks to children. Without adult safety duties in place, children will be at greater risk of harm if platforms do not identify and protect them as children. In effect, these plans will now place a significantly greater burden on platforms to protect children than adults. As the Bill currently stands, there is a significant risk of splintering user protections, which can expose children to adult-only spaces and harmful content, while forming grooming pathways for offenders too.

The reality is that these proposals to deal with harms online for adults rely on the regulator ensuring that social media companies enforce their own terms and conditions. We already know and have heard that that can have an extremely damaging impact on online safety more widely, and we have only to consider the very obvious and well-reported case study of Elon Musk’s takeover of Twitter to get a sense of how damaging that approach is likely to be.

In late November, Twitter stopped taking action against tweets that breached its coronavirus misinformation rules. The company had suspended at least 11,000 accounts under that policy, which was designed to remove accounts posting demonstrably false or misleading content relating to covid-19 that could lead to harm. The company operated a five-strike policy, and the impact on public health around the world of removing it is likely to be tangible. The situation also raises questions about the platform’s other misinformation policies: as of December 2022 they remain active, but for how long is unclear.

Does the Minister recognise that as soon as they are inconvenient, platforms will simply change their terms and conditions, and terms of service? We know that simply holding platforms to account for their terms and conditions will not constitute robust enough regulation to deal with the threat that these platforms present, and I must press the Minister further on this point.

Kim Leadbeater (Labour, Batley and Spen)

My hon. Friend is making an excellent speech. I share her deep concerns about the removal of these clauses. The Government have taken this tricky issue of the concept of “legal but harmful”—it is a tricky issue; we all acknowledge that—and have removed it from the Bill altogether. I do not think that is the answer. My hon. Friend makes an excellent point about children becoming 18; the day after they become 18, they are suddenly open to lots more harmful and dangerous content. Does she also share my concern about the risks of people being drawn towards extremism, as well as disinformation and misinformation?

Alex Davies-Jones, Shadow Minister (Digital, Culture, Media and Sport), Shadow Minister (Tech, Gambling and the Digital Economy)

My hon. Friend makes a valid point. This is not just about misinformation and disinformation; it is about leading people to really extreme, vile content on the internet. As we all know, that is a rabbit warren. A 17-year-old does not become immune to this horrendous content the moment they turn 18 on their birthday. The rules need to be there to protect all of us.

As we have heard, terms and conditions can change overnight. Stakeholders have raised the concern that, if faced with a clearer focus on their terms of service, platforms and providers may choose to make those terms shorter, cutting out mention of harmful material that they could otherwise be held liable for failing to deal with.

In addition, the fact that there is no minimum requirement in the regime means that companies have complete freedom to set terms of service for adults, which may not reflect the risks to adults on that service. At present, service providers do not even have to include terms of service in relation to the list of harmful content proposed by the Government for the user empowerment duties—an area we will come on to in more detail shortly as we address clause 14. The Government’s approach and overreliance on terms of service, which as we know can be so susceptible to rapid change, is the wrong approach. For that reason, we cannot support these amendments.

I would just say, finally, that none of us was happy with the term “legal but harmful”. It was a phrase we all disliked, and it did not encapsulate exactly what that content is or includes. But throwing the baby out with the bathwater is not the way to tackle the problem. My hon. Friend the Member for Batley and Spen is right that this is a tricky area, and it is difficult to get it right. We need to protect free speech, which is sacrosanct, but we also need to recognise that there are so many users on the internet who do not have access to free speech as a result of being piled on or shouted down. Their free speech needs to be protected too. We believe that the clauses as they stand go some way to making the Bill a meaningful piece of legislation. I urge the Minister not to strip them out, to do the right thing and to keep them in the Bill.

Kirsty Blackman, Shadow SNP Spokesperson (Cabinet Office)

Throughout the consideration of the Bill, I have been clear that I do not want it to end up simply being the “keep MPs safe on Twitter” Bill. That is not what it should be about. I did not mean that we should therefore take out everything that protects adults; what I meant was that the Bill needs a strong focus on protecting children, which thankfully it still has. For all our concerns about the issues and inadequacies of the Bill, it will go some way towards providing better protections for children online. But saying that it should not be the “keep MPs safe on Twitter” Bill does not mean that it should not keep MPs safe on Twitter.

I understand how we have got to this situation. What I cannot understand is the Minister being willing to stand up and say, “We can’t have these clauses because they are a risk to freedom of speech.” Why were they in the Bill in the first place if they are such a big risk to freedom of speech? If the Government’s No. 1 priority is making sure that we do not have these clauses, why did they put them in? Why did the Bill go through pre-legislative scrutiny? Why were they in the draft Bill? Why were they in the Bill? Why did the Government agree with them in Committee? Why did they agree with them on Report? Why have we ended up in a situation where, suddenly, there is a massive epiphany that they are a threat to freedom of speech and therefore we cannot possibly have them?

What is it that people want to say that they will be banned from saying as a result of this Bill? What is it that freedom of speech campaigners are so desperate to say online? Do they want to promote self-harm on platforms? Is that what people want to do? Is that what freedom of speech campaigners are out for? They are now allowed to do that as a result of the Bill.

Nicholas Fletcher (Conservative, Don Valley)

I believe that the triple shield is being put in place of “legal but harmful”. It will enable users to add a layer of protection so that they can take control. But illegal content still has to be taken down: anything that promotes self-harm is illegal content and would still have to be removed. The problem with the way it was before is that we had a Secretary of State telling us what could be said out there and what could not. What may offend the hon. Lady may not offend me, and vice versa. We have to be very careful of that. It is so important that we protect free speech. We are now giving control to each individual who uses the internet.

Kirsty Blackman, Shadow SNP Spokesperson (Cabinet Office)

The promotion of self-harm is not illegal content; people are now able to do that online—congratulations, great! The promotion of incel culture is not illegal content, so this Bill will now allow people to do that online. It will allow terms of service that do not require people to be banned for promoting incel culture, self-harm, not wearing masks and not getting a covid vaccine. It will allow the platforms to allow people to say these things. That is what has been achieved by campaigners.

The Bill is making people less safe online. We will continue to have the same problems that we have with people being driven to suicide and radicalised online as a result of the changes being made in this Bill. I know the Government have been leaned on heavily by the free speech lobby. I still do not know what people want to say that they cannot say as a result of the Bill as it stands. I do not know. I cannot imagine that anybody is not offended by content online that drives people to hurt themselves. I cannot imagine anybody being okay and happy with that. Certainly, I imagine that nobody in this room is okay and happy with that.

These people have won their war over free speech. They have won a situation in which they are able to promote misogynistic incel culture and health disinformation, and to say that the covid vaccine is entirely about putting microchips in people. People are allowed to say that now—great! That is what has been achieved, and it is a societal issue. We have a generational issue whereby people online are being exposed to harmful content. That will now continue.

It is not just a generational, societal thing; it is not just an issue for society as a whole that these conspiracy theories are pervading. Some of the conspiracy theories around antisemitism are unbelievably horrific, but they do not step over into illegality; otherwise David Icke would not be able to stand up and suggest that the world is run by lizard people who happen to be Jewish. Under the Bill as it stood, he would not have been allowed to say that, because it would have been considered harmful content. But now he is. That is fine, apparently; he is allowed to say it because this Bill refuses to take action on it.

Damian Collins, Chair, Draft Online Safety Bill (Joint Committee) 11:00, 13 December 2022

Can the hon. Lady tell me where the Bill, as it is currently drafted—so, unamended—requires platforms to remove legal speech?

Kirsty Blackman, Shadow SNP Spokesperson (Cabinet Office)

It allows the platforms to do that. It allows them, and requires legal but harmful stuff to be taken into account. It requires the platforms to act—to consider, through risk assessments, the harm done to adults by content that is legal but massively harmful.

Damian Collins, Chair, Draft Online Safety Bill (Joint Committee)

The hon. Lady is right: the Bill does not require the removal of legal speech. Platforms must take the issue into account—it can be risk assessed—but it is ultimately their decision. I think the point has been massively overstated that, somehow, previously, Ofcom had the power to strike down legal but harmful speech that was not a breach of either terms of service or the law. It never had that power.

Kirsty Blackman, Shadow SNP Spokesperson (Cabinet Office)

Why do the Government now think that there is a risk to free speech? If Ofcom never had that power, if it was never an issue, why are the Government bothered about that risk—it clearly was not a risk—to free speech? If that was never a consideration, it obviously was not a risk to free speech, so I am now even more confused as to why the Government have decided that they will have to strip this measure out of the Bill because of the risk to free speech, because clearly it was not a risk in this situation. This is some of the most important stuff in the Bill for the protection of adults, and the Government are keen to remove it.

Kim Leadbeater (Labour, Batley and Spen)

The hon. Member is making an excellent and very passionate speech, and I commend her for it. Does she share one of my concerns, which is about the message that this sends to the public? It is almost as if the Government were acknowledging that there is a problem with legal but harmful content—we can all, hopefully, acknowledge that it is a problem, even though we know it is a tricky one to tackle—but, by removing these clauses from the Bill, they are now sending the message: “We were trying to clean up the wild west of the internet but, actually, we are not that bothered any more.”

Kirsty Blackman, Shadow SNP Spokesperson (Cabinet Office)

The hon. Lady is absolutely right. We have all heard from organisations and individuals who have had their lives destroyed as a result of “legal but harmful”—I don’t have a better phrase for it—content online, and of being radicalised by being driven deeper and deeper into darker and darker Discord servers, for example, that are getting further and further right wing.

A number of the people who are radicalised—who are committing terror attacks, or being referred to the Prevent programme because they are at risk of committing terror attacks—are no longer driven so much by far-right extremism, or by extreme religious ideology, but by mixed-up or unclear ideological drivers. It is not the same situation as it was before, because people are being radicalised by what they find online. They are being radicalised into believing that they “must do something” or “must take some action”, because of the culture change in society.

Alex Davies-Jones, Shadow Minister (Digital, Culture, Media and Sport), Shadow Minister (Tech, Gambling and the Digital Economy)

The hon. Member is making a powerful point. Just a few weeks ago, I asked the Secretary of State for Digital, Culture, Media and Sport, at the Dispatch Box, whether the horrendous and horrific content that led a man to shoot and kill five people in Keyham—in the constituency of my hon. Friend Luke Pollard—would be allowed to remain and proliferate online as a result of the removal of these clauses from the Bill. I did not get a substantive answer then, but we all know that the answer is yes.

Kirsty Blackman, Shadow SNP Spokesperson (Cabinet Office)

That is the thing: this Bill is supposed to be the Online Safety Bill. It is supposed to be about protecting people from the harm that can be done to them by others, and from the radicalisation and harm that they can be drawn into. It is supposed to make a difference. It is supposed to be a game changer and a world leader.

Although I absolutely recognise the importance of the child safety duties in these clauses and the difference they will make, when people turn 18 they do not suddenly become different humans. They do not wake up on their 18th birthday as a different person from the one they were the day before. They should not have to go from that level of protection, prior to turning 18, to being immediately exposed to comments and content encouraging them to self-harm, and to all the negative things that we know are present online.

Nicholas Fletcher (Conservative, Don Valley)

I understand some of the arguments the hon. Lady is making, but that is a poor argument given that the day people turn 17 they can learn to drive or the day they turn 16 they can do something else. There are lots of these things, but we have to draw a line in the sand somewhere. Eighteen is when people become adults. If we do not like that, we can change the age, but there has to be a line in the sand. I agree with much of what the hon. Lady is saying, but that is a poor argument. I am sorry, but it is.

Kirsty Blackman, Shadow SNP Spokesperson (Cabinet Office)

I do not disagree that overnight changes are involved, but the problem is that we are going from a certain level of protection to nothing; there will be a drastic, dramatic shift. We will end up with any vulnerable person who is over 18 being potentially subject to all this content online.

I still do not understand what people think they will have won as a result of having the provisions removed from the Bill. I do not understand how people can say, “This is now a substantially better Bill, and we are much freer and better off as a result of the changes.” That is not the case; removing the provisions will mean the internet continuing to be unsafe—much more unsafe than it would have been under the previous iteration of the Bill. It will ensure that more people are harmed as a result of online content. It will absolutely—

Kirsty Blackman, Shadow SNP Spokesperson (Cabinet Office)

No, I will not give way again. The change will ensure that people can absolutely say what they like online, but the damage and harm that it will cause are not balanced by the freedoms that have been won.

Damian Collins, Chair, Draft Online Safety Bill (Joint Committee)

As a Back-Bench Member of Parliament, I recommended that the “legal but harmful” provisions be removed from the Bill. When I chaired the Joint Committee of both Houses of Parliament that scrutinised the draft Bill, it was the unanimous recommendation of the Committee that those provisions be removed. As a Minister at the Dispatch Box, I said that I thought “legal but harmful” was a problematic term and that we should not use it. The term “legal but harmful” does not exist in the Bill, and has never existed in the Bill, but it has provoked a debate that has caused huge confusion. There is a belief, which we have heard expressed in the debate today, that somehow there are categories of content that Ofcom can deem subject to removal, whether they are unlawful or not.

During the Bill’s journey from publication in draft to where we are today, it has become more specific. Rather than our relying on general duties of care, areas of priority illegal activity that the companies must proactively look for, monitor and mitigate are written into the Bill. In the original version, that included only terrorist content and child sexual exploitation material, but on the recommendation of the Joint Committee, the Government moved in the direction of writing into the Bill, at schedule 7, the offences in law that will be the priority illegal offences.

The list of offences is quite wide, and it is more comprehensive than any other such list in the world in specifying exactly what offences are in scope. There is no ambiguity for the platforms as to what offences are in scope. Stalking, harassment and inciting violence, which are all serious offences, as well as the horrible abuse a person might receive as a consequence of their race or religious beliefs, are written into the Bill as priority illegal offences.

There has to be a risk assessment of whether such content exists on platforms and what action platforms should take. They are required to carry out such a risk assessment, although that was never part of the Bill before. The “legal but harmful” provisions in some ways predate that. Changes were made; the offences were written into the Bill, risk assessments were provided for, and Parliament was invited to create new offences and write them into the Bill, if there were categories of content that had not been captured. In some ways, that creates a democratic lock that says, “If we are going to start to regulate areas of speech, what is the legal reason for doing that? Where is the legal threshold? What are the grounds for us taking that decision, if it is something that is not already covered in platforms’ terms of service?”

We are moving in that direction. We have a schedule of offences written into the Bill, and those priority illegal offences cover most of the most serious behaviour and most of the concerns raised in today’s debate. On top of that, there is a risk assessment of platforms’ terms of service. When we look at the terms of service of the companies—the major platforms we have been discussing—we see that they set a higher bar again than the priority illegal harms. On the whole, platforms do not have policies that say, “We won’t do anything about this illegal activity, race hate, incitement to violence, or promotion or glorification of terrorism.” The problem is that although platforms have terms of service, they do not enforce them. Therefore, we are not relying on terms of service alone. What we are saying, and what the Bill says, is that the minimum safety standards are based on the offences written into the Bill. In addition, we have risk assessment, and we have enforcement based on the terms of service.

There may be a situation in which there is a category of content that is not in breach of a platform’s terms of service and not included in the priority areas of illegal harm. It is very difficult to think of what that could be—something that is not already covered, and over which Ofcom would not have power. There is the inclusion of the new offences of promoting self-harm and suicide. That captures not just an individual piece of content, but the systematic effect of a teenager like Molly Russell—or an adult of any age—being targeted with such content. There are also new offences for cyber-flashing, and there is Zach’s law, which was discussed in the Chamber on Report. We are creating and writing into the Bill these new priority areas of illegal harm.

Freedom of speech groups’ concern was that the Government could have a secret list of extra things that they wanted risk-assessed, rather than enforcement being clearly based either on the law or on clear terms of service. It is difficult to think of categories of harm that are not already captured in terms of service or in the priority areas of illegal harm and that would be on such a list. I think that is why the change was made. For freedom of speech campaigners, the concern was about exactly what enforcement was based on: “Is it based on the law? Is it based on terms of service? Or is it based on something else?”

I personally believed that the “legal but harmful” provisions in the Bill, as far as they existed, were not an infringement on free speech, because there was never a requirement to remove legal speech. I do not think the removal of those clauses from the Bill suddenly creates a wild west in which no enforcement will take place at all. There will be very effective enforcement based on the terms of service, and on the schedule 7 offences, which deal with the worst kinds of illegal activity; there is a broad list. The changes make it much clearer to everybody—platforms and users alike, and Ofcom—exactly what the duties are, how they are enforced and what they are based on.

For future regulation, we have to use this framework, so that we can say that when we add new offences to the scope of the legislation, they are offences that have been approved by Parliament and have gone through a proper process, and are a necessary addition because terms of service do not cover them. That is a much clearer and better structure to follow, which is why I support the Government amendments.

Charlotte Nichols (Labour, Warrington North)

I cannot help but see the Government’s planned removal of clauses 12 and 13 as essentially wrecking amendments to the Bill. Taking those provisions out of the Bill makes it a Bill not about online safety, but about child protection. We have not had five years or so of going backwards and forwards, and taken the Bill through Committee and then unprecedentedly recommitted it to Committee, in order to fundamentally change what the Bill set out to do. The fact that, at this late stage, the Government are trying to take out these aspects of the Bill melts my head, for want of a better way of putting it.

My hon. Friend the Member for Batley and Spen was absolutely right when she talked about what clauses 12 and 13 do. In effect, they are an acknowledgement that adults are also harmed online, and have different experiences online. I strongly agree with the hon. Member for Aberdeen North about this not being the protect MPs from being bullied on Twitter Bill, because obviously the provisions go much further than that, but it is worth noting, in the hope that it is illustrative to Committee members, the very different experience that the Minister and I have in using Twitter. I say that as a woman who is LGBT and Jewish—and although I would not suggest that it should be a protected characteristic, the fact that I am ginger probably plays a part as well. He and I could do the same things on Twitter on the same day and have two completely different experiences of that platform.

The risk-assessment duties set out in clause 12, particularly in subsection (5)(d) to (f), ask platforms to consider the different ways in which different adult users might experience their services. Platforms have a duty to attempt to keep certain groups of people, and categories of user, safe. When we talk about free speech, the question is: freedom of speech for whom, and at what cost? Making it easier for people to perpetuate, for example, holocaust denial on the internet—a category of speech that is lawful but awful, as it is not against the law in this country to deny that the holocaust happened—makes it much less likely that I, or other Jewish people, will want to use the platform.

Our freedom of speech and ability to express ourselves on the platform is curtailed by the platform’s decision to prioritise the freedom of expression of people who would deny the holocaust over that of Jewish people who want to use the platform safely and not be bombarded by people making memes of their relatives getting thrown into gas chambers, of Jewish people with big noses, or of the Rothschild Zionist global control conspiracy nonsense that was alluded to earlier, which is encountered online constantly by Jewish users of social media platforms.

Organisations such as the Community Security Trust and the Antisemitism Policy Trust, which do excellent work in this area, have been very clear that someone’s right to be protected from that sort of content should not end the day they turn 18. Duties should remain on platforms to do risk assessments to protect certain groups of adults who may be at increased risk from such content, in order to protect their freedom of speech and expression.

Kirsty Blackman, Shadow SNP Spokesperson (Cabinet Office) 11:15, 13 December 2022

The hon. Member makes a powerful point about the different ways in which people experience things. That tips over into real-life abusive interactions, and goes as far as terrorist incidents in some cases. Does she agree that protecting people’s freedom of expression and safety online also protects people in their real, day-to-day life?

Charlotte Nichols (Labour, Warrington North)

I could not agree more. I suppose that is why this aspect of the Bill is so important, not just to me but to all those categories of user. I mentioned paragraphs (d) to (f), which would require platforms to assess exactly that risk. This is not about being offended. Personally, I have the skin of a rhino. People can say most things to me and I am not particularly bothered by it. My concern is where things that are said online are transposed into real-life harms. I will use myself as an example. Online, we can see antisemitic and conspiratorial content, covid misinformation, and covid misinformation that meets with antisemitism and conspiracies. When people decide that I, as a Jewish Member of Parliament, am personally responsible for George Soros putting a 5G chip in their arm, or whatever other nonsense they have become persuaded by on the internet, that is exactly the kind of thing that has meant people coming to my office armed with a knife. The kind of content that they were radicalised by on the internet led to their perpetrating a real-life, in-person harm. Thank God—Baruch Hashem—neither I nor my staff were in the office that day, but that could have ended very differently, because of the sorts of content that the Bill is meant to protect online users from.

Damian Collins, Chair, Draft Online Safety Bill (Joint Committee)

The hon. Lady is talking about an incredibly important issue, but the Bill covers such matters as credible threats to life, incitement to violence against an individual, and harassment and stalking—those patterns of behaviour. Those are public order offences, and they are in the Bill. I would absolutely expect companies to risk-assess for that sort of activity, and to be required by Ofcom to mitigate it. On her point about holocaust denial, first, the shield will mean that people can protect themselves from seeing such content. The further question is whether we create new offences in law, which can then be transposed across.

Charlotte Nichols (Labour, Warrington North)

I accept the points that the hon. Member raised, but he is fundamentally missing the point. The categories of information and content that these people had seen and been radicalised by would not fall within the scope of public order offences or harassment. The person was not sending me harassing messages before they turned up at my office. Essentially, social media companies and other online platforms will have to take measures to mitigate the risk only of categories of content that are illegal, whether or not those offences are listed in the Bill. I am talking about what clauses 12 and 13 covered, whether we call it the “legal but harmful” category or “lawful but awful”. Whatever we name those provisions, by taking the clauses relating to that category out of the Bill, we are opening up an area of harm that already exists, that has a real-world impact, and that the Bill was meant to go some way towards addressing.

The amendments have taken out the risk assessments that need to be done. The Bill currently says,

“(e) the level of risk of functionalities of the service facilitating the presence or dissemination of priority content that is harmful to adults, identifying and assessing those functionalities that present higher levels of risk;

(f) the different ways in which the service is used, and the impact of such use on the level of risk of harm that might be suffered by adults;

(g) the nature, and severity, of the harm that might be suffered by adults”.

Again, the idea that we are talking merely about offence, and that the clauses need to be taken out to protect free speech, is fundamentally nonsense.

I have already mentioned holocaust denial, but it is also worth mentioning health-related disinformation. We have already seen real-world harms from some of the covid misinformation online. It led to people including Piers Corbyn turning up outside Parliament with a gallows, threatening to hang hon. Members for treason. Obviously, that was rightly dealt with by the police, but the kind of information and misinformation that he had been getting online and that led him to do that, which is legal but harmful, will now not be covered by the Bill.

I will also raise an issue I have heard about from a number of people dealing with cancer and conditions such as multiple sclerosis. People online try to discourage them from accessing the proper medical interventions for their illnesses, and instead encourage them to take more vitamin B or adopt a vegan diet. There are people with cancer who have died because they were encouraged online not to access cancer treatment, having been subjected to lawful but awful categories of harm.

Kirsty Blackman, Shadow SNP Spokesperson (Cabinet Office)

I wonder if the hon. Member saw the story online about the couple in New Zealand who refused to let their child have a life-saving operation because they could not guarantee that the blood used would not be from vaccinated people? Is the hon. Member similarly concerned that this has caused real-life harm?

Charlotte Nichols (Labour, Warrington North)

I am aware of the case that the hon. Member mentioned. I appreciate that I am probably testing the patience of everybody in the Committee Room, but I want to be clear just how abhorrent I find it that these provisions are coming out of the Bill. I am trying to be restrained, measured and reasonably concise, but that is difficult when there are so many parts of the change that I find egregious.

My final point is on self-harm and suicide content. Suicide is the biggest killer of men under the age of 45. In the Bill, we are doing as much as we can to protect young people from that sort of content. My real concern is this: many young people are protected by the Bill’s provisions relating to children, and are perhaps waiting for support from child and adolescent mental health services, which are massively oversubscribed. The minute they tick over into 18, they fall off the CAMHS waiting list and go to the bottom of the adult mental health waiting list, where they may have to wait years for treatment, and there is no longer any requirement or duty on the social media companies and platforms to do risk assessments.

The Chair adjourned the Committee without Question put (Standing Order No. 88).

Adjourned till this day at Two o’clock.