Clause 20 - Duties about freedom of expression and privacy

Online Safety Bill – in a Public Bill Committee at 4:00 pm on 13 December 2022.


Paul Scully, The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport 4:00, 13 December 2022

I beg to move amendment 28, in clause 20, page 21, line 42, after “have” insert “particular”.

This amendment has the result that providers of regulated user-to-user services must have particular regard to freedom of expression when deciding on and implementing safety measures and policies.

Angela Eagle (Labour, Wallasey)

With this it will be convenient to discuss Government amendments 29, 31, 36 to 38 and 40.

Paul Scully, The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

I will be brief. The rights to freedom of expression and privacy are essential to our democracy. We have long been clear that the Bill must not interfere with those rights. The amendments will further strengthen protections for freedom of expression and privacy and ensure consistency in the Bill. They require regulated user-to-user and search services to have particular regard to freedom of expression and privacy when deciding on and implementing their safety measures and policies.

Amendments 28, 29 and 31 mean that service providers will need to thoroughly consider the impact that their safety and user empowerment measures have on users’ freedom of expression and privacy. That could mean, for example, providing detailed guidance and training for human reviewers about content that is particularly difficult to assess. Amendments 36 and 37 apply that to search services in relation to their safety duties. Ofcom can take enforcement action against services that fail to comply with those duties and will set out steps that platforms can take to safeguard freedom of expression and privacy in their codes of practice.

Those changes will not detract from platforms’ illegal content and child protection duties. Companies must tackle illegal content and ensure that children are protected on their services, but the amendments will protect against platforms taking an over-zealous approach to removing content or undermining users’ privacy when complying with their duties. Amendments 38 and 40 ensure that the rest of the Bill is consistent with those changes. The new duties will therefore ensure that companies give proper consideration to users’ rights when complying with them, and that that is reflected in Ofcom’s codes, providing greater clarity to companies.

Amendment 28 agreed to.

Amendments made: 29, in clause 20, page 22, line 2, after “have” insert “particular”.

This amendment has the result that providers of regulated user-to-user services must have particular regard to users’ privacy when deciding on and implementing safety measures and policies.

Amendment 30, in clause 20, page 22, line 6, leave out subsection (4).

This amendment removes clause 20(4), as that provision is moved to NC4.

Amendment 31, in clause 20, page 22, line 37, leave out paragraph (c) and insert—

“(c) section 14 (user empowerment),”.—

The main effect of this amendment is that providers must consider freedom of expression and privacy issues when deciding on measures and policies to comply with clause 14 (user empowerment). The reference to clause 14 replaces the previous reference to clause 13 (adults’ safety duties), which is now removed (see Amendment 7).

Question proposed, That the clause, as amended, stand part of the Bill.

Angela Eagle (Labour, Wallasey)

With this it will be convenient to discuss clause 30 stand part.

Alex Davies-Jones, Shadow Minister (Digital, Culture, Media and Sport), Shadow Minister (Tech, Gambling and the Digital Economy)

I will speak broadly to clause 20, as it is an extremely important clause, before making remarks about the group of Government amendments we have just voted on.

Clause 20 is designed to provide a set of balancing provisions that will require companies to have regard to freedom of expression and privacy when they implement their safety duties. However, as Labour has repeatedly argued, it is important that companies cannot use privacy and free expression as a basis to argue that they can comply with regulations in less substantive ways. That is a genuine fear here.

We all want to see a Bill in place that protects free speech, but that cannot come at the expense of safety online. The situation with regards to content that is harmful to adults has become even murkier with the Government’s attempts to water down the Bill and remove adult risk assessments entirely.

The Minister must acknowledge that there is a balance to be achieved. We all recognise that. The truth is—and this is something that his predecessor, or should I say his predecessor’s predecessor, touched on when we considered this clause in the previous Bill Committee—that at the moment platforms are extremely inconsistent in their approach to getting the balance right. Although Labour is broadly supportive of this clause and the group of amendments, we feel that now is an appropriate time to put on record our concerns over the important balance between safety, transparency and freedom of expression.

Labour has genuine concerns over the future of platforms’ commitment to retaining that balance, particularly if the behaviours following the recent takeover of Twitter by Elon Musk are anything to go by. Since Elon Musk took over ownership of the platform, he has repeatedly used Twitter polls, posted from his personal account, as metrics to determine public opinion on platform policy. The general amnesty policy and the reinstatement of Donald Trump both emerged from such polls.

According to former employees, those polls are not only inaccurate representations of the platform’s user base, but are actually

“designed to be spammed and gamed”.

The polls are magnets for bots and other inauthentic accounts. This approach and the reliance on polls have allowed Elon Musk to enact and dictate his platform’s policy on moderation and freedom of expression. Even if he is genuinely trusting the results of these polls and not gamifying them, they do not accurately represent the user base nor the best practices for confronting disinformation and harm online.

Elon Musk uses the results to claim that “the people have spoken”, but they have not. Research from leading anti-hate organisation the Anti-Defamation League shows that far-right extremists and neo-Nazis encouraged supporters to actively re-join Twitter to vote in these polls. The impacts of platforming neo-Nazis on Twitter do not need to be stated. Such users are explicitly trying to promote violent and hateful agendas, and they were banned initially for that exact reason. The bottom line is that those people were banned in line with Twitter’s terms of service at the time, and they should not be re-platformed just because of the findings of one Twitter poll.

These issues are at the very heart of Labour’s concerns in relation to the Bill—that the duties around freedom of expression and privacy will be different for those at the top of the platforms. We support the clause and the group of amendments, but I hope the Minister will be able to address those concerns in his remarks.

Paul Scully, The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

I endorse the general approach set out by the hon. Member for Pontypridd. We do not want to define freedom of speech based on a personal poll carried out on one platform. That is exactly why we are enshrining it in this ground-breaking Bill.

We want to get the balance right. I have talked about the protections for children. We also want to protect adults and give them the power to understand the platforms they are on and the risks involved, while having regard for freedom of expression and privacy. That is a wider approach than one man’s Twitter feed. These clauses are important to ensure that the service providers interpret and implement their safety duties in a proportionate way that limits negative impact on users’ rights to freedom of expression. However, they also have to have regard to the wider definition of freedom of expression, while protecting users, which the rest of the Bill covers in a proportionate way.

Alex Davies-Jones, Shadow Minister (Digital, Culture, Media and Sport), Shadow Minister (Tech, Gambling and the Digital Economy)

This goes to the heart of more than just one person’s Twitter feed, although we could say that that person is an incredibly powerful and influential figure on the platform. In the past 24 hours, Twitter has disbanded its trust and safety council. Members of that council included expert groups working to tackle harassment and child sexual exploitation, and to promote human rights. Does the Minister not feel that the council being disbanded goes to the heart of what we have been debating? It shows how a platform can remove its terms of service or change them at whim in order to prevent harm from being perpetrated on that platform.

Paul Scully, The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

I will come back to some of the earlier points. At the end of the day, when platforms change their terms and conditions, which they are free to do, they will be judged by their users and indeed the advertisers from whom they make their money. There are market forces—I will use that phrase as well as “commercial imperative”, to get that one in there—that will drive behaviour. It may be the usability of Facebook, or Twitter’s terms and conditions and the approach of its new owner, that will drive users of those platforms to alternatives. I am old enough to remember Myspace, CompuServe and AOL, which tried to box people into their walled gardens. What happened to them? Only yesterday, someone from Google was saying that the new artificial intelligence chatbot—ChatGPT—may well disrupt Google. These companies, as big as they are, do not have a right to exist. They have to keep innovating. If they get it wrong, then they get it wrong.

Damian Collins, Chair, Draft Online Safety Bill (Joint Committee)

Does my hon. Friend agree that this is why the Bill is structured in the way it is? We have a wide range of priority illegal offences that companies have to meet, so it is not down to Elon Musk to determine whether he has a policy on race hate. They have to meet the legal standards set, and that is why it is so important to have that wide range of priority illegal offences. If companies go beyond that and have higher safety standards in their terms of service, that is checked as well. However, a company cannot avoid its obligations simply by changing its terms of service.

Paul Scully, The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

My hon. Friend is absolutely right. We are putting in those protections, but we want companies to have due regard to freedom of speech.

I want to clarify a point that my hon. Friend made earlier about guidance on the new accountability, transparency and free speech duties. Companies will be free to set any terms of service that they want to, subject to their other legal obligations. That is related to the conversations that we have just been having. Those duties are there to properly enforce the terms of service, and not to remove content or ban users except in accordance with those terms. There will be no platform risk assessments or codes of practice associated with those new duties. Instead, Ofcom will issue guidance on how companies can comply with their duties rather than codes of practice. That will focus on how companies set their terms of service, but companies will not be required to set terms directly for specific types of content or cover risks. I hope that is clear.

To answer the point made by the hon. Member for Pontypridd, I agree with the overall sentiment about how we need to protect freedom of expression.

Damian Collins, Chair, Draft Online Safety Bill (Joint Committee)

I want to be clear on my point. My question was not related to how platforms set their terms of service, which is a matter for them and they are held to account for that. If we are now bringing in requirements to say that companies cannot go beyond terms of service or their duties in the Bill if they are going to moderate content, who will oversee that? Will Ofcom have a role in checking whether platforms are over-moderating, as the Minister referred to earlier? In that case, where those duties exist elsewhere in the Bill, we have codes of practice in place to make sure it is clear what companies should and should not do. We do not seem to be doing that with this issue.

Paul Scully, The Parliamentary Under-Secretary of State for Digital, Culture, Media and Sport

Absolutely. We have captured that in other parts of the Bill, but I wanted to make that specific bit clear because I am not sure whether I understood or answered my hon. Friend’s question correctly at the time.

Question put and agreed to.

Clause 20, as amended, accordingly ordered to stand part of the Bill.