Schedule 8 - Transparency reports by providers of Category 1 services, Category 2A services and Category 2B services

Online Safety Bill – in a Public Bill Committee at 4:45 pm on 13 December 2022.

Amendments made: 61, in schedule 8, page 203, line 13, leave out

“priority content that is harmful to adults”

and insert “relevant content”.

This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 62, in schedule 8, page 203, line 15, leave out

“priority content that is harmful to adults”

and insert “relevant content”.

This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 63, in schedule 8, page 203, line 17, leave out

“priority content that is harmful to adults”

and insert “relevant content”.

This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 64, in schedule 8, page 203, line 21, leave out from “or” to end of line 23 and insert “relevant content”.

This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about user reporting of content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 65, in schedule 8, page 203, line 25, leave out

“priority content that is harmful to adults”

and insert “relevant content”.

This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 66, in schedule 8, page 203, line 29, leave out

“priority content that is harmful to adults”

and insert “relevant content”.

This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about content which the terms of service say can be taken down or restricted. The reference to content that is harmful to adults is omitted, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 67, in schedule 8, page 203, line 41, at end insert—

“11A Measures taken or in use by a provider to comply with any duty set out in section (Duty not to act against users except in accordance with terms of service) or (Further duties about terms of service) (terms of service).”

This amendment means that OFCOM can require providers of user-to-user services to include information in their transparency report about measures taken to comply with the new duties imposed by NC3 and NC4.

Amendment 68, in schedule 8, page 204, line 2, leave out from “illegal content” to end of line 3 and insert

“or content that is harmful to children—”.

This amendment removes the reference to content that is harmful to adults, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 69, in schedule 8, page 204, line 10, leave out from “illegal content” to “, and” in line 12 and insert

“and content that is harmful to children”.

This amendment removes the reference to content that is harmful to adults, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 70, in schedule 8, page 204, line 14, leave out from “illegal content” to “present” in line 15 and insert

“and content that is harmful to children”.

This amendment removes the reference to content that is harmful to adults, as a result of the removal of the adult safety duties (see Amendments 6, 7 and 41).

Amendment 71, in schedule 8, page 205, line 38, after “Part 3” insert

“or Chapters 1 to 2A of Part 4”.—(Paul Scully.)

This amendment requires OFCOM, in considering which information to require from a provider in a transparency report, to consider whether the provider is subject to the duties imposed by Chapter 2A (the new Chapter expected to be formed by NC3 to NC6) and by Chapter 1 of Part 4.

Paul Scully (The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)

I beg to move amendment 72, in schedule 8, page 206, line 5, at end insert—

“35A (1) For the purposes of this Schedule, content of a particular kind is ‘relevant content’ if—

(a) a term of service, other than a term of service within sub-paragraph (2), states that a provider may or will take down content of that kind from the service or restrict users’ access to content of that kind, and

(b) it is regulated user-generated content.

(2) The terms of service within this sub-paragraph are as follows—

(a) terms of service which make provision of the kind mentioned in section 9(5) (protecting individuals from illegal content) or 11(5) (protecting children from content that is harmful to children);

(b) terms of service which deal with the treatment of consumer content.

(3) References in this Schedule to relevant content are to content that is relevant content in relation to the service in question.”

This amendment defines “relevant content” for the purposes of Schedule 8.

Angela Eagle (Labour, Wallasey)

With this it will be convenient to discuss Government amendments 73 and 75.

Paul Scully (The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)

The amendments to schedule 8 confirm that references to relevant content, consumer content and regulated user-generated content have the same meaning as established by other provisions of the Bill. Again, that ensures consistency, which will, in turn, support Ofcom in requiring providers of category 1 services to give details in their annual transparency reports of their compliance with the new transparency, accountability and freedom of expression duties.

Alex Davies-Jones (Shadow Minister for Digital, Culture, Media and Sport; Shadow Minister for Tech, Gambling and the Digital Economy)

I will keep my comments on this grouping brief, because I have already raised our concerns and our overarching priority on transparency reports in the previous debate, which was a good one, with all Members highlighting the need for transparency and reporting in the Bill. With the Chair’s permission, I will make some brief comments on Government amendment 72 before addressing Government amendments 73 and 75.

It will come as no surprise to the Minister that amendment 72, which defines relevant content for the purposes of schedule 8, has a key omission: it does not specify priority content that is harmful to adults. For reasons we have covered at length, we think it is a gross mistake on the Government’s part to attempt to water down the Bill in this way. If the Minister is serious about keeping adults safe online, he must reconsider this approach. However, we are happy to see amendments 73 and 75, which define consumer content and regulated user-generated content. It is important for all of us—whether we are politicians, researchers, academics, civil society, stakeholders, platforms, users or anyone else—that these definitions are in the Bill so that, when it is passed, it can be applied properly and at pace. That is why we have not sought to amend this grouping.

I must press the Minister to respond on the issues around relevant content as outlined in amendment 72. We feel strongly that more needs to be done to address this type of content and its harm to adults, so I would be grateful to hear the Minister’s assessment of how exactly these transparency reports will report on this type of harm, given its absence from this group of amendments and the lack of a definition.

Kirsty Blackman (Shadow SNP Spokesperson, Cabinet Office)

I am pleased to see the list included and the number of things on which Ofcom can ask for more information. I have a specific question about amendment 75, which provides that regulated user-generated content has the same meaning as in part 3 (see clause 50). The Minister may or may not know that there are concerns about clause 50(5), which relates to

“One-to-one live aural communications”.

One-to-one live aural communications are exempted. I understand that that is because the Government do not believe that telephony services, for example, should be part of the Online Safety Bill—that is a pretty reasonable position for them to take. However, allowing one-to-one live aural communications not to be regulated means that if someone is using voice chat in Fortnite, for example, and there are only two people on the team that they are on, or if someone is using voice chat in Discord and there are only two people online on the channel at that time, that is completely unregulated and not taken into account by the Bill.

I know that that is not the intention of the Bill, which is intended to cover user-generated content online. The exemption is purely in place for telephony services, but it is far wider than the Government intend it to be. With the advent of more and more people using virtual reality technology, for example, we will have more and more aural communication between just two people, and that needs to be regulated by the Bill. We cannot just allow a free-for-all.

If we have child protection duties, for example, they need to apply to all user-generated content and not exempt it specifically because it is a live, one-to-one aural communication. Children are still at significant risk from this type of communication. The Government have put this exemption in because they consider such communication to be analogous to telephony services, but it is not. It is analogous to telephony services if we are talking about a voice call on Skype, WhatsApp or Signal—those are voice calls, just like telephone calls—but here we are talking about a voice chat that people can have with people whom they do not know, whose phone number they do not know and with whom they have no relationship.

Some of the Discord servers are pretty horrendous, and some of the channels are created by social media influencers or people who have pretty extreme views in some cases. We could end up with a case where the Discord server and its chat functions are regulated, but if aural communication or a voice chat is happening on that server, and there are only two people online because it is 3 o’clock in the morning where most of the people live and lots of them are asleep, that would be exempted. That is not the intention of the Bill, but the Government have not yet fixed this. So I will make one more plea to the Government: will they please fix this unintended loophole, so that it does not exist? It is difficult to do, but it needs to be done, and I would appreciate it if the Minister could take that into consideration.

Paul Scully (The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)

I do not believe that Ofcom’s transparency powers have been watered down. It is really important that the Bill’s protections for adults strike the right balance with its protections for free speech, which is why we have replaced the “legal but harmful” clause. I know we will not agree on that, but there are new duties that will make platforms more accountable. Ofcom’s transparency powers will enable it to assess compliance with the new safety duties and hold platforms accountable for enforcing their terms of service to keep users safe. Companies will also have to report on the measures that they have in place to tackle illegal content or activity and content that is harmful to children, including proactive steps to address offences such as child sexual exploitation and abuse.

The legislation will set out high-level categories of information that companies may be required to include in their transparency reports, and Ofcom will then specify the information that service providers will need to include in those reports, in the form of a notice. Ofcom will consider companies’ resources and capacity, service type and audience in determining what information they will need to include. It is likely that the information that is most useful to the regulator and to users will vary between different services. To ensure that the transparency framework is proportionate and reflects the diversity of services in scope, the transparency reporting requirements set out in the Ofcom notice are likely to differ between those services, and the Secretary of State will have powers to update the list of information that Ofcom may require to reflect any changes of approach.

Let me address the interesting point made by the hon. Member for Aberdeen North about the aural exemption. As she says, the exemption is there to ensure that we do not capture traditional phone calls. Phones have moved on over the last 20 years, from POTS and PANs—from the plain old telephone service to public access networks—and beyond. Although one-to-one live aural communications are exempt, other types of interactions between adults and children, including in-game private messaging, chat functions and video calls, are in scope. If there are unintended consequences—the hon. Lady will know that I was described as the Minister for unintended consequences when I was at the Department for Business, Energy and Industrial Strategy—I would be happy to continue chatting with her and others to ensure that we get this difficult area right.

Kirsty Blackman (Shadow SNP Spokesperson, Cabinet Office), 5:00 pm, 13 December 2022

The in-game chat that children use is overwhelmingly voice chat. Children do not type if they can possibly avoid it. I am sure that that is not the case for all children, but it is for most children. Aural communication is used if someone is playing Fortnite duos, for example, with somebody they do not know. That is why that needs to be included.

Paul Scully (The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport)

I very much get that point. It is not something that I do, but I have certainly seen it myself. I am happy to chat to the hon. Lady to ensure that we get it right.

Amendment 72 agreed to.

Amendments made: 73, in schedule 8, page 206, line 6, at end insert—

“‘consumer content’ has the same meaning as in Chapter 2A of Part 4 (see section (Interpretation of this Chapter)(3));”.

This amendment defines “consumer content” for the purposes of Schedule 8.

Amendment 74, in schedule 8, page 206, leave out lines 7 and 8.

This amendment is consequential on Amendment 41 (removal of clause 55).

Amendment 75, in schedule 8, page 206, line 12, at end insert—

“‘regulated user-generated content’ has the same meaning as in Part 3 (see section 50), and references to such content are to content that is regulated user-generated content in relation to the service in question;”.—(Paul Scully.)

This amendment defines “regulated user-generated content” for the purposes of Schedule 8.

Schedule 8, as amended, agreed to.

Ordered, That further consideration be now adjourned. —(Mike Wood.)

Adjourned till Thursday 15 December at half-past Eleven o’clock.

Written evidence reported to the House

OSB101 Mencap

OSB102 News Media Association (NMA)

OSB103 Dr Edina Harbinja, Reader in law, Aston Law School, Aston Business School, and Deputy Editor of the Computer Law and Security Review

OSB104 Carnegie UK

OSB105 Full Fact

OSB106 Antisemitism Policy Trust

OSB107 Big Brother Watch

OSB108 Microsoft

OSB109 Internet Society

OSB110 Parent Zone

OSB111 Robin Wilton

OSB112 Wikimedia Foundation