Episode 13 - The Online Safety Bill - Part 1

To listen to this episode online, click here.
Louisa: Hello. This is the Online Resilience podcast with me, Louisa Street, and Professor Andy Phippen. We're discussing all aspects of young people's online lives and giving practical advice on how to support the young people you work with. Music is by Roo Pescod.

Welcome to another episode of the Headstart Online Resilience podcast. In the next two episodes, we're going to be talking about the Online Safety Bill. So I'm going to start straight off by asking Andy: what is the Online Safety Bill?
Andy: OK, that's an interesting one. I started writing about where we're at now about ten years ago. Basically, the Online Safety Bill is the culmination of a great deal of policy activity that probably did start about ten years ago, looking at how we introduce better legislation. The overarching aim is to make the UK "the safest place in the world to go online". I'm sure we'll come back to that in a bit and explain why that's complete nonsense. But it started off very much around "how do we stop kids looking at porn?"
And it's expanded and expanded to where we're at today, which is the draft of the Online Safety Bill: 145 pages of, essentially, responsibilities for platform providers to consider when providing online services. That's the core of it. Which is kind of interesting, because one thing you'd have thought they'd have learned over the last ten years is that chucking tech at this stuff isn't necessarily going to help. If you go back to the very early stages of it, it was, "Oh well, ISPs aren't providing home filtering on by default. If we do that, then loads of kids will not be able to access pornography." So they did that. The major ISPs did that, and then people tried to use it with family-friendly filtering on, and there was an awful lot of overblocking, so people switched it off again. I think the last Ofcom stat I saw was that somewhere between 40 and 60% of households have some level of filtering. And I'd suggest that's not because the rest of them want their kids looking at porn; it's because the technical tools that are provided are not sufficiently effective. So you had that, and then you had the other major policy success, which was applying filters to public Wi-Fi spaces.
And as I've frequently said, while I don't side with the internet libertarians who argue that that's an outrageous thing to do, equally I've never seen anyone browsing porn in Starbucks. So maybe there was an issue there, beyond the odd gentleman in a corner. But no, that was another "let's chuck technology at it" fix. Then there was the big, famous "let's make porn providers put age verification onto their sites if they're accessed by a UK user", which was actually placed into legislation in the Digital Economy Act and then revoked, because that just wouldn't work. And the reasons it wouldn't work are things we've talked about before, such as the fact that we don't have a uniform way of demonstrating we're over the age of 18 in this country, and a lot of the ways that you would demonstrate you're over 18 have a financial barrier to them: driving licences, passports, citizen cards
and similar. So you ended up with this really interesting hodgepodge of solutions, and then it was ultimately withdrawn. So now we're in a place where we've got a far bigger bill, one that is not just looking at pornography but at lots of other things as well, yet still seems to be very much focused on what platforms can do to technically prevent harm from happening. Now, on the one hand, that's a good thing, because it is an attempt to bring together lots of disparate, broken bits of legislation. I don't know if you remember, but Katie Price put out a petition a couple of years ago calling for the government to make it illegal to abuse disabled people online, which annoyed people like me, because it is already illegal under the Equality Act. You know, attacking people with protected characteristics, regardless of whether it's online or offline, is illegal, and has been since the Equality Act came in around 2010. So you've got a broadening of all these things and an attempt to bring it all together: to address the harassment laws, address the image-based abuse stuff, and put it all into one piece of legislation. But I think the thing that really stands out for me is that it's almost wholly focused on providers and what providers might do, which, as both you and I know, is not necessarily the best way of tackling these things.
Louisa: Yeah. So, just going back to one of the things you said: it's already illegal to do those things. There's a lot of stuff covered by the Online Safety Bill that is, in many ways, already covered by law. One of the things they're saying is that social media companies, for example, will be held accountable for failing to remove child sexual abuse images and language that might incite racial hatred, and other things along those sorts of lines, things which are already illegal. So what is the purpose, from the government's point of view, of putting this into the Online Safety Bill as well?
Andy: If I were being a cynic, I would suggest that even the title of the bill is quite problematic. This isn't a safety bill. I mean, we've talked about the concept of online safety before, and I'm sure we'll pick up on it again in this podcast, but this isn't going to make people safe online. If we were being more honest about it, maybe it should be referred to as the "regulation of social media bill" or something like that, because it is specifically looking at providers who provide person-to-person services, and at making them consider the potential harms that happen on their platforms. But there is a very clear focus within the legislation on what platforms can do and what more platforms can do, and on bringing in bigger fines; there is still some discussion about making directors criminally liable, which just strikes me as a bizarre step. Certainly in terms of the size of the fines, it very much aligns with what came out of the data protection legislation for illegally collecting data, data breaches, or failing to report things: the fines have gone up significantly. The other positive thing is that they are placing it in the hands of a regulator rather than trying to stick it all within the legislation itself.
Because that would just be a terrible idea. But it's a tricky one, because it's kind of like: do we need better legislation? Yes. Is this better legislation? Probably not. One of the fundamental issues I have with it is the intangibility of what they're proposing. One of the questions in the consultation was, will this encourage a thriving tech sector in this country? Well, of course not, because if you're a startup company with investors loaded in and you have to comply with this legislation, you'll go somewhere where that's not the case. But I guess we'll come onto that in a minute. I think one of the most fundamental things is that I don't see anything in it that's going to improve aspects of safety for young people. And, incredibly, there's not a mention of vulnerable adults in the entire bill; you either have young people as a vulnerable group, or adults. Now, I've spent quite a lot of time in the adult social care space, and there are specific issues there. You would have liked to have seen the vulnerability of adults with learning difficulties, or young adults with brain injuries, considered within it, but there just isn't anything in there at all.
Louisa: So, to give a concise overview of what the Online Safety Bill is: it's a way of policing online content that tries to make the people who are hosting the content accountable for what's there, rather than looking at the people who are posting the content. If that's the goal of the Online Safety Bill, let's talk a little bit about what some of the problems with that are. My immediate go-to, having just said that overview, would be that it's not holding the people who are posting the content accountable. It doesn't do anything about the root of why people are posting that kind of stuff, and it doesn't regulate the dark web at all. So actually, it's not saying "we want to remove this content from the world"; it's just saying "we want to remove it from certain social media platforms that will be covered by the bill".
Andy: Yeah. An interesting offline analogy is whether the owner of a wall would be responsible for abusive content sprayed on that wall, although I'm always a little reluctant to make these online/offline comparisons. Within the US, everyone talks about Section 230 of the Communications Decency Act which, off the top of my head, basically says the platform is not responsible for the content posted within it. Now, obviously, it would seem the Online Safety Bill runs contrary to that, although even people like Zuckerberg have acknowledged that Section 230 needs to be readdressed. But there are people who would argue that you just wouldn't have social media without Section 230, because no one would take that liability on. I think one of the problems with the bill is very much that it's only addressing part of the problem; like you said, it's not dealing with the deep-rooted social issues. At the end of the Euros, when there was a lot of racist abuse happening on Twitter, the Prime Minister was challenged on it at Prime Minister's Questions, and he just came out with, "Oh, we're going to sort this all out in the Online Safety Bill, and Twitter could sort it all out now if they wanted to." And it's kind of like: no, Twitter can't solve racism. People post racist content online because they're racist.
Louisa: Yeah, and this is essentially going to censor some of that rather than prevent people from posting it. It's going to take those posts down, but there are still people who will be exempt from it: journalists and politicians are exempt from the Online Safety Bill. My take on that was that you see a lot of people on social media quoting their favourite pundit, and they could actually be censored for doing that, but the person they're quoting wouldn't be. That could obviously cause the problem of a two-tier system, where some people are above this law and the rest of us have to toe the line.
Andy: By way of interesting comparison: when we had all the Covid lockdown regulation and Nigel Farage decided to go and visit the Kent coast to stir up some animosity towards people arriving in dinghies, and he was challenged on that, Farage's people, whoever they are, said he's a journalist, so he's exempt. One of the amusing things that happened (and this is one of those moments where people frequently accuse me of being anti-Conservative, in the party sense; I'm not, all political parties do this equally badly) is that Labour, a while ago, did say that they want to make posting misinformation a criminal act. Now, if you look at the last election, every single political party used misinformation to some degree to encourage people to vote for them. So, great, let's make misinformation a crime. But who decides what is misinformation, and who decides what is an alternative view? As with all these things, the easy cases are when someone's quoting nonsense about vaccination or something and saying, "Oh, look at this, it suggests something." I mean, first of all, is that misinformation, or is it just poor interpretation? And secondly, how do we make the platform responsible for that?
Louisa: Yeah. And one of the other things I read in preparation for this podcast was about how, because the Online Safety Bill will cover both illegal content and legal-but-harmful content, some people will be able to lobby for the removal of content based on it being, in inverted commas, "harmful", which is likely to sweep up a lot of other stuff with it. So anyone talking about issues related to the LGBT+ community, people of colour, people using African-American Vernacular English, anyone essentially not using received pronunciation, could be accused of posting harmful content. Anyone posting ironic things, or things that are in poor taste but are meant to be funny: it's all going to be swept up with that.
Andy: There are some serious concerns around freedom of expression overall. From a legal academic perspective, one of the really amusing things in it is that it talks about the issues around "legal but harmful", and it uses this wonderful phrase: "as interpreted by a person of ordinary sensibilities". But it doesn't go on from that to determine what a person of ordinary sensibilities is. Now, there has been case law discussing what a person of ordinary sensibilities is, but it does seem to be a very arbitrary definition. You know, if you look at the LGBT outrage at the Midlands schools, where there was an awful lot of animosity towards the proposal to deliver that sort of education in primary schools, those people would be arguing that that content was harmful. So who decides what is and isn't? And while we moan about the current government, there are more extreme governments out there. Where is the machinery in the legislation that prevents feature creep in this sort of thing? On the one hand, you've got illegal content, and this is one thing I admire about the Internet Watch Foundation: they say, "We deal with illegal content." There aren't many people online saying, "It's not fair, I can't share child abuse imagery on social media."
However, when you get to "legal but harmful"... I've spent a lot of time looking at disclosures from young people around what they described as harmful content, and it's a wide-ranging thing: from animal abuse videos, sexual content and people being unpleasant to them, to a great deal of current affairs. Around the time of the Manchester Arena bombing, an awful lot of young people talked about how they'd seen news coverage of it and found it upsetting, so they were disclosing that they found current-affairs material harmful. The same happened when Lee Rigby was murdered in London; and if you reflect upon the fact that the news coverage showed his murderer, with bloodstained hands, still holding a knife, talking about why he'd done it, you can sort of see why. But the other thing, and this is where it's a little bit of a sledgehammer to crack a nut, I guess, is that the harm is transient, particularly for adults. It depends what your emotional state is like at the moment. Who makes that decision? Where is this person of ordinary sensibilities to make that decision? Presumably it's down to the regulator. But if the regulator is linked into government and there's ministerial pressure... You know, it's not too long ago that we had Clause 28. That's a piece of legislation we've made great progress on since, but it's not beyond the bounds of possibility that someone might decide that that sort of thing is something we need to shut down again.
Louisa: Just in case anyone isn't aware: Section 28 was a piece of legislation that said schools were not allowed to discuss anything connected to LGBT people, their relationships or their lifestyles, on the grounds that doing so would encourage children to engage in that.
Andy: It was a weird thing that argued that providing education around something was the promotion of a lifestyle choice. It was a very, very strange thing. But when was it withdrawn? Was it the eighties?
Louisa: You know, it might have been a bit later than that; I'll just double-check. Yeah, it was repealed on the 18th of November 2003.
Andy: Jesus.
Louisa: Yeah, I thought it was in the early noughties, because I've spoken to a lot of teachers of a similar age to me, people in their thirties or late twenties, for whom Section 28 was still a thing when they were at school, so their teachers were not legally allowed to talk to them about being gay. And some of them were under the impression that that was still the case, which is really worrying: to think that in 2021, people would still be assuming that.
Andy: So it was withdrawn less than 20 years ago.
Louisa: So absolutely, that sort of stuff could come back. We're hopefully moving in a very positive direction around LGBT rights, but it doesn't take that much for these things to slowly be taken away.
Andy: Yeah, absolutely. There are a couple of things that are quite positive, like the transparency reporting. But one of the fundamental aspects of the bill is that companies need to be able to demonstrate to the regulator that they have conducted a risk assessment, considering who's using their platform and what potential harms could exist within, and as a result of, people using it, which is a reasonable thing. Obviously the Big Tech ones, the Facebooks and the Instagrams of the world, can already evidence that. There was an interesting news story last week about Instagram's internal documents acknowledging that some of the content posted on their platforms can be harmful; I guess we can pick that up in a bit, because that was an interesting one. But you do have more unscrupulous places, the 4chans and 8chans of the world, where they just go, "Not our problem what people post." So it's a reasonable thing to do.
But then you've got the question: who decides what is enough in a risk assessment? That's one of the really interesting things, again from an academic perspective. The example I always give is pornography. Say a pornography provider is identified within the bill as being required to perform a risk assessment and to put tools in place to make sure that young people can't access their content. So they put a bunch of stuff in place and they evidence that they've done their risk assessment, and they've invested quite a lot of money in it: they've employed people to carry out the risk assessment, they've employed lawyers to look at the legality of what they're doing, and they've employed technical people to put the tools in place. And then some technically competent kid decides to use a VPN to get to the porn site, so the site doesn't know the visit has come from the UK and the tools aren't triggered. Or the kid logs into the family PC and notices that one of the parents has left their porn login open. Is that still the fault of the porn company? Because there's nothing in the bill that says what is enough. That's one thing that concerns me, and it could potentially seriously damage the usability of social media platforms in general, because if you are being threatened with significant fines, you'll take the conservative position, and then everyone's going to start getting warnings and account suspensions and all those sorts of things, which is kind of happening on places like Facebook already.
Louisa: This is because social media companies, and other companies which host content online, will face fines of up to 10% of their turnover if they don't comply, which is a huge amount for companies like Facebook, whose turnover is, let's just say, a lot. So they're not going to want to risk that, and they'll be excessive in the removal of these posts. One of the criticisms I heard of that is that it doesn't actually protect victims in the first place. It could actually be really problematic for victims, because the evidence is being deleted before they have a chance to report it to the police, which again is going to be very problematic for actually holding people accountable for what they're saying online.
Andy: Mm-hmm. I think that's a really good point. One of the things that we always say to young people is: capture it, screengrab it, don't respond to it. But if the company's algorithms have decided it's offensive and taken it down already, that's not going to stop the abuse; it just means that one of the evidence trails for it has been shut down.
Louisa: Yeah. And having the foresight, when something upsetting has been sent to you or shared in your feed, to think, "Right, I must screenshot that straight away so that I've got the evidence", isn't how people respond to things. It takes people a while to process, particularly young people. But regardless, anyone is likely to be emotionally involved in it before they're able to think rationally and go, "OK, I will screenshot this so that I've got the evidence."
Andy: Absolutely. Yeah. Well, as you said earlier, it almost shifts the responsibility away from the abuser and onto the platform, which does present some fairly significant challenges. It's something that I've been saying for ten years; indeed, I've written a book on this specific issue. Technology will only have so much of a part to play. I've said this before on the podcast, but there's the very famous Ranum's Law, coined by the cybersecurity researcher Marcus Ranum, which says you don't solve social problems with software. What can the platform do? The platform can put a bunch of algorithms in place to identify things, and it can put a bunch of tools in place that allow people to report stuff, which can then be acted on. But given the traffic on these platforms, the idea that they might employ, you know, ten times more moderators to look at it all... And again, that would impact the social media experience. If your post has to be moderated before it's allowed to go onto the platform, it is potentially going to change the nature of the social media experience.
Louisa: And I think this could really see an end to social media as we currently know it, because when social media first appeared, it was very much a place where you could freely express your views and opinions, and now it's going to be more censored than almost any other part of life. We've already seen a move (and we'll talk about this in a bit more detail in the next podcast) of young people away from the platforms where you just scream into the void, like Facebook and Twitter; they're already engaging more with things like Snapchat, where you're speaking more directly to people. We're likely to see that shift even further in that direction, because there's going to be such a high level of censorship.
Andy: You also get some really weird collisions of legislation, if you like. On the one hand, you have this really bovine debate happening in politics at the moment around how, if Facebook introduce encryption into their messaging platform, then the paedophiles have won, and therefore they shouldn't be allowed to do it. There is investment at the moment in an advertising campaign to win over public opinion to the idea that encryption is bad. And there are some concerns around the more person-to-person platforms, rather than what you wonderfully describe as the screaming-into-the-void platforms: how do you know what's going on on those platforms? The bill has this overarching concept of a duty of care, which says companies need to demonstrate their duty of care. If you're demonstrating your duty of care, you have to say, "Well, we need to be able to see what people are communicating about; therefore, we can't use encryption." And then you look at the Data Protection Act, which asks how you ensure that people's data can't be intercepted. You immediately have these legislative tensions. Now, I know the government are talking about moving away from the GDPR, so companies will just be able to monetise our data more effectively, which again is kind of scary. And if you're running a business that trades with Europe and you move away from GDPR, welcome to red tape hell. But equally, you've got one piece of legislation which is about ensuring data is secure and can't be intercepted or read, and on the other hand you've got "you need to be able to read the data, otherwise you're not performing your duty of care".
Well, I was asked last week to comment on the appointment of Nadine Dorries as Secretary of State for Digital, Culture, Media and Sport, and I rather diplomatically said I think it highlights a bigger concern around the amount of science, technology, engineering and mathematics knowledge in Parliament. You get into these situations because no one's thought about these sorts of things on a technical level, or on a technical-legislative level: in order to achieve one thing, you're actually breaking the law with another. You have the Age Appropriate Design Code, which sits under the Data Protection Act and says you've thought about kids' data and you've made sure it's safe, secure and protected, and this bill has the potential to cut across that. We should stress that it's still a draft at the moment, and there's an awful lot of discussion going on. But equally, if you look at the consultation questions, there are things like, "Is this going to make the UK the safest place in the world to go online?" It seems to be very grandiose in its vision. And going back to that point: of course it's not going to make the UK the safest place in the world to go online, because geographical boundaries don't exist online. Unless, that is, you take very, very prohibitive actions; China's got a different version of the internet from most of the rest of us, and I don't think we should be looking at China, with its record on freedom of expression and freedom of speech and its own version of Google, and saying that model is a good idea.
Louisa: Yeah. And China recently brought in a new system which will block young people from playing computer games for more than an hour on a weekday. I'm sure some people listening to this might be thinking, "Well, that sounds great; my kids, or the kids I'm supporting, play games for more time than that." But it's not really solving the problem.
Andy: I always think that should be a decision made in the home, not by the government or the provider. And when Matt Hancock suggested something similar around regulating screen time on social media, one of the issues was that there'd be massive breaches of data protection law, because you'd expect the companies to share data about usage by children. It would be far better for parents to discuss how long they think is acceptable. I remember speaking to a journo when Fortnite came out, and she was feeling really guilty because she only allowed her son to play at weekends, while other kids played more than that. But those are the discussions you have at home. Most of the young people I've ever spoken to about that sort of thing are very comfortable with house rules; I think they'd be a little less comfortable with the government deciding they'd played sufficiently and shutting them off.
Louisa: And I think it's all about that nuance. We talked about this a bit in the reporting harmful content episode: within a household, you can understand the nuance. You can say, "Yeah, OK, my child isn't very well today; he or she is going to have the day off school, so they're probably going to spend more time online than they would if they were at school." Certainly, if I'm unwell, I watch more TV than I would on a regular day, and that's kind of understood, whereas the government are not going to be able to come in with that level of nuance. So we're going to talk a little bit more about what people who support young people can do in preparation for the Online Safety Bill coming in, and what they need to know about it. But to wrap up, I thought an interesting reflection would be this: the government have said very clearly that this is about wanting a culture of transparency, but that culture of transparency is about these companies, the social media companies and so on, being transparent with their data; it's not about tackling the anonymity afforded by social media. I don't know if you've got anything you want to add to that.
Andy: Yeah. I mean, we'll pick this up in the next podcast, but I think one of the positive things is expecting companies to demonstrate more openness in terms of the number of problematic posts, the number of takedowns, and similar. But again, it only seems to apply to that one stakeholder. If you look at web filtering and things, you have the Counter-Terrorism Internet Referral Unit, which is part of government, and which adds websites it decides contain terrorist content to a list. There's no recourse, no way to come back on that; it just happens, and it's entirely opaque. So we're talking about a greater culture of transparency, but it only seems to apply to one stakeholder, and as we always say, this is a multi-stakeholder issue. It's not something where we can all sit back and go, "Facebook, sort all this out", in the same way that two people planning a terrorist attack in a cafe isn't down to the cafe owner. Another horrible offline comparison, but one to reflect on: you'd have to start putting recording devices into the cruet sets on the tables to make sure that if someone is discussing something, you know about it. It just seems nonsense in that context.
Louisa: Absolutely. Excellent. Well, let's pause there and pick this back up in the next episode. So, yeah, we will be back in a couple of weeks. That's it for another episode of the Online Resilience podcast. If you liked it, please tell someone you know who might also enjoy it. You can share it on Facebook or Twitter, or even just pop a link in an email.