Schedule 11 - Categories of regulated user-to-user services and regulated search services: regulations

Online Safety Bill – in a Public Bill Committee at 11:30 am on 15 December 2022.


Paul Scully, The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport, 11:30, 15 December 2022

I beg to move amendment 76, in schedule 11, page 213, line 11, at end insert

“, and

(c) any other characteristics of that part of the service or factors relating to that part of the service that the Secretary of State considers relevant.”

This amendment provides that regulations specifying Category 1 threshold conditions for the user-to-user part of regulated user-to-user services must also include conditions relating to any other characteristics of that part of the service or factors relating to that part of the service that the Secretary of State considers relevant.

Angela Eagle, Labour, Wallasey

With this, it will be convenient to discuss Government amendments 77 to 79, 81 to 84, 86 to 91 and 93.

Paul Scully, The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

These Government amendments seek to change the approach to category 1 designation, following the removal from the Bill of the adult safety duties and the concept of “legal but harmful” content. Through the proposed new duties on category 1 services, we aim to hold companies accountable to their terms of service, as we have said. I seek to remove all requirements on category 1 services relating to harmful content, so it is no longer appropriate to designate them with reference to harm. Consequently, the amendments in this group change the approach to designating category 1 services, to ensure that only the largest companies with the greatest influence over public discourse are designated as category 1 services.

Specifically, these amendments will ensure that services are designated as category 1 where they have functionalities that enable the easy, quick and wide dissemination of user-generated content; the requirement for category 1 services to meet a number-of-users threshold remains unchanged.

The amendments also give the Secretary of State the flexibility to consider other characteristics of services, as well as other relevant factors. Those characteristics might include a service’s functionalities, the user base, the business model, governance, and other systems and processes. That gives the designation process greater flexibility to ensure that services are designated category 1 services only when they have significant influence over public discourse.

The amendments also remove content that is harmful to adults from the criteria used to designate category 2B services, and we have made a series of consequential amendments to the designation process for categories 2A and 2B to ensure consistency.

Alex Davies-Jones, Shadow Minister (Digital, Culture, Media and Sport), Shadow Minister (Tech, Gambling and the Digital Economy)

I have commented extensively on the flaws in the categorisation process in this and previous Committees, so I will not retread old ground. I accept the amendments in this grouping. They show that the Government are prepared to broaden the criteria for selecting which companies are likely to be in category 1. That is a very welcome, if subtle, shift in the right direction.

The amendments bring the characteristics of a company’s service into consideration, which will be a slight improvement on the previous focus on size and functionality, so we welcome them. The distinction is important, because size and functionality alone are obviously very vague indicators of harm, or the threat of harm.

We are pleased to see that the Government have allowed for a list to be drawn up of companies that are close to the margins of category 1, or that are emerging as category 1 companies. This is a positive step for regulatory certainty, and I hope that the Minister will elaborate on exactly how the assessment will be made.

However, I draw the Minister’s attention to Labour’s long-held concern about the Bill’s over-reliance on powers afforded to the Secretary of State of the day. We debated this concern in a previous sitting. I press the Minister again on why these amendments, and the regulations around the threshold conditions, are ultimately only for the Secretary of State to consider, depending on characteristics or factors that only he or she, whoever they may be, deems relevant.

We appreciate that the regulations need some flexibility, but we have genuine concerns—indeed, colleagues from all parties have expressed such concerns—that the Bill will give the Secretary of State far too much power to determine how the entire online safety regime is imposed. I ask the Minister to give the Committee an example of a situation in which it would be appropriate for the Secretary of State to make such changes without any consultation with stakeholders or the House.

It is absolutely key for all of us that transparency should lie at the heart of the Bill. Once again, we fear that the amendments are a subtle attempt by the Government to impose on what is supposed to be an independent regulatory process the whim of one person. I would appreciate assurance on that point. The Minister knows that these concerns have long been held by me and colleagues from all parties, and we are not alone in holding them. Civil society groups are also calling for clarity on exactly how decisions will be made, and particularly on what information will be used to determine a threshold. For example, do the Government plan to quantify a user base, and will the Minister explain how the regime would work in practice, given that a platform's user base can fluctuate rapidly? We have seen that already with Mastodon, whose user numbers have increased dramatically as a result of Elon Musk's takeover of Twitter. I hope that the Minister can reassure me on those concerns. He will know that this is a point of contention for colleagues across the House, and we want to get the Bill right before we progress to Report.

Kirsty Blackman, Shadow SNP Spokesperson (Cabinet Office), 11:45, 15 December 2022

My understanding is that only a very small number of platforms will reach the category 1 threshold. We are talking about the platforms that everybody has heard of—Facebook, Twitter and so on—and not about the slightly smaller platforms that lots of people have heard of and use. We are probably not talking about platforms such as Twitch, which has a much smaller user base than Facebook and Twitter but has a massive reach. My concern continues to be that the number threshold does not take into account the significant risks of harm from some of those platforms.

I have a specific question about amendment 76. I agree with my Labour Front-Bench colleague, the hon. Member for Pontypridd, that it shows that the Government are willing to take into account other factors. However, I am concerned that the Secretary of State is somehow being seen as the arbiter of knowledge—the person who is best placed to make the decisions—when much more flexibility could have been given to Ofcom instead. From all the evidence I have heard and all the people I have spoken to, Ofcom seems much more expert in dealing with what is happening today than any Secretary of State could ever hope to be. There is no suggestion about how the Secretary of State will consult, get information and make decisions on how to change the threshold conditions.

It is important that other characteristics that may not relate to functionalities are included if we discover that there is an issue with them. For example, I have mentioned livestreaming on a number of occasions in Committee, and we know that livestreaming is inherently incredibly risky. The Secretary of State could designate livestreaming as a high-risk functionality, and it could be included, for example, in category 1. I do not know whether it will be, but we know that there are risks there. How will the Secretary of State get that information?

There is no agreement to set up a user advocacy board. The requirement for Ofcom to consult the Children’s Commissioner will be brought in later, but organisations such as the National Society for the Prevention of Cruelty to Children, which deals with phone calls from children asking for help, are most aware of emerging threats. My concern is that the Secretary of State cannot possibly be close enough to the issue to make decisions, unless they are required to consult and listen to organisations that are at the coal face and that regularly support people. I shall go into more detail about high-harm platforms when we come to amendment 104.

Paul Scully, The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

The amendments give the Secretary of State the flexibility to consider other characteristics of services as well as other relevant factors, which include functionalities, user base, business model, governance, and other systems and processes. They effectively introduce greater flexibility into the designation process, so that category 1 services are designated only if they have significant influence over public discourse. Although the Secretary of State will make the regulations, Ofcom will carry out the objective and evidence-based process, which will be subject to parliamentary scrutiny via statutory instruments. The Secretary of State will have due consultation with Ofcom at every stage, but to ensure flexibility and the ability to move fast, it is important that the Secretary of State has those powers.

Amendment 76 agreed to.

Kirsty Blackman, Shadow SNP Spokesperson (Cabinet Office)

I beg to move amendment 104, in schedule 11, page 213, line 11, at end insert—

“(1A) Regulations made under sub-paragraph (1) must provide for any regulated user-to-user service which OFCOM assesses as posing a very high risk of harm to be included within Category 1, regardless of the number of users.”

This amendment allows Ofcom to impose Category 1 duties on user-to-user services which pose a very high risk of harm.

I would say this, but I think that this is the most important amendment. The key area that the Government are getting wrong is the way in which platforms, providers or services will be categorised. The threshold is based on the number of users. It is the number of users “and” one of those other things, not the number of users “or” one of those other things; even that would make a significant difference.

The Secretary of State talked about the places that have a significant influence over public discourse. It is perfectly possible to have a significant influence over public discourse with a small number of users, or with a number of users that does not number into the millions. We have seen the spread of conspiracy theories that have originated and been perpetuated on very small platforms—very small, shady places on the internet that none of us has experienced or even heard of. Those are the places that have a massive impact and effect.

We know that one person can have a significant impact on the world and on people's lives. We have heard about the physical harm that people can be incited to cause by the platforms they access, and the radicalisation and extremism they find themselves subject to. That can have massive, damaging effects on anybody they choose to take physical action against, and on some of the most marginalised communities and groups in society. We are seeing an increase in the amount of hate crime and in the number of people who believe conspiracy theories, and not all of that is because of the spread of those things on Facebook and Twitter. It is because of breadcrumbing and the spread that there can be on smaller platforms.

The most extreme views do not necessarily tip over into "illegal" or "incitement"; they do not actually say, "Please go out and kill everybody in this particular group." They say, "This particular group is responsible for all of the ills you feel and for every negative thing that is happening in your life", and people are therefore driven to take extremist, terrorist action. That is a significant issue.

I want to talk about a couple of platforms. Kiwi Farms, which is no longer in existence and has been taken down, was a very small platform that dramatically damaged the lives of trans people in particular. It was a platform where people went to incite hatred and give out the addresses of folk who they knew were members of the trans community. Some of those people had to move to another continent to get away from the physical violence and attacks they faced as a result of the behaviour on that incredibly small platform, which very few people will have heard about.

Kiwi Farms has been taken down because the internet service providers decided that it was too extreme and they could not possibly host it any more. That was eventually recognised and change was made, but the influence that that small place had on lives—the difficulties and harm it caused—is untold. Some of that did tip over into illegality, but some did not.

I also want to talk about the places where there is a significant amount of pornography. I am not going to say that I have a problem with pornography online; the internet will always have pornography on it. It attracts a chunk of people to spend time online, and some of that pornography is on large mainstream sites. Searches for incest, underage girls, or black women being abused all get massive numbers of hits. There is a significant amount of pornography on these sites that is illegal, that pretends to be illegal or that acts against people with protected characteristics. Research has found that a significant proportion—significantly more than half—of pornography on mainstream sites that involves black women also involves violence. That is completely and totally unacceptable, and it has a massive negative impact on society, because it reinforces negativity and discrimination against groups that are already struggling with being discriminated against and that do not experience the privilege of a cis white man.

It is really grim that we are requiring a number of users to be specified, when we know the harm caused by platforms that do not have 10 million or 20 million United Kingdom users. I do not know what the threshold will be, but I know it will be too high to include a lot of platforms that have a massive effect. The amendment is designed specifically to give Ofcom the power to designate as category 1 any service that it thinks poses a very high risk of harm; I have not set the bar particularly low. Now that the Minister has increased the levels of transparency that will be required of category 1 platforms, it is even more important that we subject extremist sites and platforms—the radicalising ones, which are perpetuating discrimination—to a higher bar and require them to have the transparency that they need as a category 1 service. This is a place where the Bill could really make a difference and change lives, and I am really concerned that it is massively failing to do so.

I have said that it should be Ofcom's responsibility to designate category 1 services because Ofcom has the experts who will be looking at all the risk assessments, dealing with companies on a day-to-day basis, and seeing the harms and transparency information that the rest of us will not be able to see. The reporting mechanisms will be public for only some of the category 1 platforms, and we will not be able to find out the level of information that Ofcom has, so it is right that Ofcom should be responsible for designating sites as having a very high risk of harm. That is why I tabled the amendment, which would make a massive difference to people who are already the most discriminated against and who are most at risk of harm from extremism. I urge the Minister to think again.

Alex Davies-Jones, Shadow Minister (Digital, Culture, Media and Sport), Shadow Minister (Tech, Gambling and the Digital Economy)

I rise briefly to support everything the hon. Member for Aberdeen North just said. We have long called for the Bill to take a harm-led approach; indeed, the Government initially agreed with us, given that in its first iteration the Bill was called the Online Harms Bill rather than the Online Safety Bill. Addressing harm must be a central focus of the Bill, because we know that extremist content is perpetuated on smaller, high-harm platforms; this is something that the Antisemitism Policy Trust and Hope not Hate have long called for with regard to the Bill.

I want to put on the record our huge support for the amendment. Should the hon. Lady be willing to push it to a vote—I recognise that we are small in number—we will absolutely support her.

Damian Collins, Chair, Draft Online Safety Bill (Joint Committee)

I want to speak briefly to the amendment. I totally understand the reasons that the hon. Member for Aberdeen North has tabled it, but in reality, the kinds of activities she describes would be captured anyway, because most would fall within the remit of the priority illegal harms that all platforms and user-to-user services have to follow. If there were occasions when they did not, being included in category 1 would mean that they would be subject to the additional transparency of terms of service, but the smaller platforms that allow extremist behaviour are likely to have extremely limited terms of service. We would be relying on the priority illegal activity to set the minimum safety standards, which Ofcom would be able to do.

It would also be an area where we would want to move at pace. Even if we wanted to bring in extra risk assessments on terms of service that barely exist, the time it would take to do that would not give a speedy resolution. It is important that in the way Ofcom exercises its duties, it does not just focus on the biggest category 1 platforms but looks at how risk assessments for illegal activity are conducted across a wide range of services in scope, and that it has the resources needed to do that.

Even within category 1, it is important that that is done. We often cite TikTok, Instagram and Facebook as the biggest platforms, but I recently spoke to a teacher at a larger secondary school who said that by far the worst platform they have to deal with in terms of abuse, bullying, intimidation, and even the sharing of intimate images between children, is Snapchat. We need to ensure that those services get the full scrutiny they should have, because at the moment they are operating well below their stated terms of service, and in contravention of the priority illegal areas of harm.

Paul Scully, The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport, 12:00, 15 December 2022

As we debated earlier, we are removing the adult safety duties from the Bill, which means that no company will face any duties related to legal but harmful content. In their place, the Government are introducing new transparency, accountability and free speech duties on category 1 services, which were discussed in detail earlier this session.

It would not be proportionate to apply those new duties to smaller services but, as we have heard from my hon. Friend the Member for Folkestone and Hythe, they will still have to comply with the illegal content duties and, where they are accessed by children, the child safety duties. Those services have limited resources, and applying additional duties to them across the board would divert those resources away from complying with the illegal content and child safety duties. That would be likely to weaken those duties' impact on tackling criminal activity and protecting children.

The new duties are about user choice and accountability on the largest platforms—if users do not want to use smaller harmful sites, they can choose not to—but, in recognition of the rapid pace with which companies can grow, I introduced an amendment earlier to create a watchlist of companies that are approaching the category 1 threshold, which will ensure that Ofcom can monitor rapidly scaling companies, reduce any delay in designating companies as category 1 services, and apply additional obligations on them.

The hon. Member for Aberdeen North talked about ISPs acting with respect to Kiwi Farms. I talked on Tuesday about the need for a holistic approach; there is no single silver bullet. It is important to look at Government, the platforms, parenting and ISPs, because together they make up a holistic view of how the internet works. That is the multi-stakeholder framework for governing the internet in its entirety, rather than the Government trying to do absolutely everything. We have talked a lot about illegality, and I think that a lot of the areas in that case were illegal; the hon. Lady described some very distasteful things. None the less, with the introduction of the watchlist, I do not believe amendment 104 is required.

Kirsty Blackman, Shadow SNP Spokesperson (Cabinet Office)

The hon. Member for Folkestone and Hythe made a good point. I do not disagree that Ofcom will have a significant role in policing platforms that are below the category 1 threshold. I am sure it will be very hands on, particularly with platforms that have the highest risk and are causing the most harm.

I still do not think that is enough. I do not think that the Minister’s change with regard to emerging platforms should be based on user numbers. It is reasonable for us to require platforms that encourage extremism, spread conspiracy theories and have the most horrific pornography on them to meet a higher bar of transparency. I do not really care if they only have a handful of people working there. I am not fussed if they say, “Sorry, we can’t do this.” If they cannot keep people safe on their platform, they should have to meet a higher transparency bar, provide more information on how they are meeting their terms of service and provide toggles—all those things. It does not matter how small these platforms are. What matters is that they have massive risks and cause massive amounts of harm. It is completely reasonable that we hold them to a higher regulatory bar. On that basis, I will push the amendment to a vote.

Division number 6: Online Safety Bill — Schedule 11 - Categories of regulated user-to-user services and regulated search services: regulations


The Committee divided: Ayes 4, Noes 8.

Question accordingly negatived.

Amendments made: 77, in schedule 11, page 213, line 16, after “other” insert

“characteristics of the search engine or”.

This amendment provides that regulations specifying Category 2A threshold conditions for the search engine of regulated search services must also include conditions relating to any other characteristics of the search engine that the Secretary of State considers relevant.

Amendment 78, in schedule 11, page 213, line 23, after “other” insert

“characteristics of that part of the service or”.

This amendment provides that regulations specifying Category 2B threshold conditions for the user-to-user part of regulated user-to-user services must also include conditions relating to any other characteristics of that part of the service that the Secretary of State considers relevant.

Amendment 79, in schedule 11, page 213, line 36, leave out from “on” to “disseminated” in line 37 and insert

“how easily, quickly and widely regulated user-generated content is”.

This amendment provides that in making regulations specifying Category 1 threshold conditions the Secretary of State must take into account the impact of certain matters in relation to which conditions must be specified on how easily, quickly and widely regulated user-generated content is disseminated by means of the service.

Amendment 80, in schedule 11, page 214, line 2, leave out from “illegal content” to “disseminated” in line 3 and insert

“and content that is harmful to children”.

This amendment is consequential on the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 81, in schedule 11, page 214, line 12, leave out “the relationship between”.

This amendment is consequential on Amendment 83 (which provides for additional matters that OFCOM must carry out research into).

Amendment 82, in schedule 11, page 214, line 13, leave out from beginning to “by” and insert

“how easily, quickly and widely regulated user-generated content is disseminated”.

This amendment provides that research required to be carried out by OFCOM before regulations specifying Category 1 threshold conditions may be made must include research into how easily, quickly and widely regulated user-generated content is disseminated by means of regulated user-to-user services.

Amendment 83, in schedule 11, page 214, line 16, at end insert

“, and

(c) such other characteristics of that part of such services or factors relating to that part of such services as OFCOM consider to be relevant to specifying the Category 1 threshold conditions.”

This amendment provides that research required to be carried out by OFCOM before regulations specifying Category 1 threshold conditions may be made must include research into other characteristics or factors of the user-to-user part of regulated user-to-user services as OFCOM consider relevant to specifying the Category 1 threshold conditions.

Amendment 84, in schedule 11, page 214, line 24, after “other” insert “characteristics or”.

This amendment provides that research required to be carried out by OFCOM before regulations specifying Category 2A threshold conditions may be made must also include research into characteristics of the search engine of regulated search services and combined services as OFCOM consider relevant to specifying the Category 2A threshold conditions.

Amendment 85, in schedule 11, page 214, line 29, leave out from “illegal content” to “by” in line 30 and insert

“and content that is harmful to children”.

This amendment is consequential on the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 86, in schedule 11, page 214, line 34, leave out “factors” and insert

“characteristics of that part of such services or factors relating to that part of such services”.

This amendment provides that research required to be carried out by OFCOM before regulations specifying Category 2B threshold conditions may be made must include research into such other characteristics of the user-to-user part of regulated user-to-user services as OFCOM consider relevant to specifying the Category 2B threshold conditions.

Amendment 87, in schedule 11, page 214, leave out lines 40 to 42.

This amendment and Amendments 88 to 90 (which provide that OFCOM’s advice as to what provision is appropriate for regulations under paragraph 1(1), (2) or (3) of Schedule 11 to make, may include advice that the regulations include other characteristics or factors) are consequential on Amendments 76 to 78.

Amendment 88, in schedule 11, page 214, line 44, at beginning insert “characteristic or”.

This amendment and Amendments 87, 89 and 90 (which provide that OFCOM’s advice as to what provision is appropriate for regulations under paragraph 1(1), (2) or (3) of Schedule 11 to make, may include advice that the regulations include other characteristics or factors) are consequential on Amendments 76 to 78.

Amendment 89, in schedule 11, page 214, line 45, leave out “1(3)” and insert “1(1) or (3)”.

This amendment and Amendments 87, 88 and 90 (which provide that OFCOM’s advice as to what provision is appropriate for regulations under paragraph 1(1), (2) or (3) of Schedule 11 to make, may include advice that the regulations include other characteristics or factors) are consequential on Amendments 76 to 78.

Amendment 90, in schedule 11, page 214, line 45, after “other” insert “characteristic or”.

This amendment and Amendments 87 to 89 (which provide that OFCOM’s advice as to what provision is appropriate for regulations under paragraph 1(1), (2) or (3) of Schedule 11 to make, may include advice that the regulations include other characteristics or factors) are consequential on Amendments 76 to 78.

Amendment 91, in schedule 11, page 216, line 38, at end insert—

“5A In this Schedule the ‘characteristics’ of a user-to-user part of a service or a search engine include its user base, business model, governance and other systems and processes.”

This amendment defines “characteristics” of a user-to-user part of a service or search engine for the purposes of Schedule 11.

Amendment 92, in schedule 11, page 216, leave out lines 43 and 44.

This amendment is consequential on Amendment 41 (removal of clause 55).

Amendment 93, in schedule 11, page 216, line 44, at end insert—

“‘regulated user-generated content’ has the same meaning as in Part 3 (see section 50);”—(Paul Scully.)

This amendment defines “regulated user-generated content” for the purposes of Schedule 11.

Schedule 11, as amended, agreed to.