1 | Episode 12 - Reporting Hate Speech | |
---|---|---|
4 | Louisa | Hello. This is the online resilience podcast with me, Louisa Street and Professor Andy Phippen. We're discussing all aspects of young people's online lives and giving practical advice on how to support the young people you work with. Music is by Roo Pescod. |
5 | Louisa | Hello and welcome to another episode of the Headstart Online Resilience podcast. Today, we're talking about reporting hateful content online. Probably in the last few months you've been aware of the issues that have come up in the media about people posting racist content online. |
6 | People posting hateful content online, generally hate speech, whether it be homophobia, misogyny, racism, transphobia, Islamophobia or any other kind. Yeah, there are so many things that, you know, we sort of hear people talking about, and we want to talk a little bit about what we can do to support young people if they come across this stuff or if | |
7 | they're a victim of this sort of thing, and what we should be teaching young people as a whole about dealing with hate speech online. So, Andy, in the media there's been a lot of talk about getting an algorithm to block racism. | |
8 | What are the problems with that? | |
9 | Andy | How long have we got? This is basically a strand of my entire career: trying to push back against what happens whenever something bad happens. There's a lot of media coverage, there's an element of it that has taken place online, |
10 | and politicians will go, right, we need to make sure that the platforms are doing something about this. So after the racist abuse of footballers after the Euros, immediately it was all, the platforms should be dealing with this better. Boris Johnson, I'm not sure hilariously is the right word, but there's a lot of stuff he does that makes me laugh | |
11 | . And then I go, oh no, he's Prime Minister. But he said at Prime Minister's Questions, just before summer recess, when challenged on the racial abuse after the Euros, that the online safety bill is coming along now and it's going to make platforms more accountable, because we all know Twitter could take this all down if they wanted to | |
12 | , which is just such an utterly facile thing to say. First of all, it's not true. And secondly, a politician waving their arms around and going, oh, they could take it down if they wanted to, is utterly counterproductive, because if you've got the Prime Minister saying it, then that gets reported in the press, and then people read | |
13 | the press: well, they said it was possible, therefore I can't believe this is still going on. Now, algorithms are very, very good, if you're talking about text-based content for example, at identifying words. And if those words have been flagged as bad, then they're very good at taking those sorts of things down. | |
14 | Yeah. So if someone is using racist keywords in their posts on, for example, Twitter, Twitter will identify them and flag them, and will probably impose a suspension immediately. If, however, and there is a lot of this in far-right discourse, some fairly well-known | |
15 | far-right commentators, should we say, will make sure they don't use racist keywords. So they will give out a message that is racist, or is pushing a position of white privilege or something, but because they won't use racist keywords, they won't get caught. | |
16 | You know, and then you go, oh well, algorithms can sort this stuff out. And, you know, they can't. Regardless of what the AI researchers say about natural language processing having come a long way, it's still rubbish at context. | |
17 | The example I always give, because I spent a huge amount of time looking at porn filtering, is the example of cock. You know, if a porn filter is looking to determine whether a website is serving up sexual content, it will look for sexual keywords. | |
18 | Cock is one of those sexual keywords, and it might be fine to block it. But it might also block a site about aviaries and chickens and those sorts of things, because, quite rightly, someone might be referring to the male chicken as a cock. | |
19 | The same situation is the reason that Scunthorpe still gets blocked by some filters: within Scunthorpe you have a very bad sexual swearword. And, you know, the algorithms don't have the opportunity to sort of go, excuse me, why are you being racist? | |
20 | I know you were referring to a primate, can I just check there? What normally happens within artificial-intelligence-based approaches is you feed it what's referred to as a corpus, a whole bunch of phrases, and if it identifies similar phrases, then it will flag those up as potentially abusive. | |
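The keyword approach described above can be sketched in a few lines of Python. This is purely an illustrative toy, not anything a real platform runs, and the blocklist entry is invented for the example; it just shows why bare substring matching produces the false positives Andy mentions.

```python
# Toy keyword filter illustrating the problem Andy describes:
# a bare substring match cannot tell an innocent word from a
# blocked keyword embedded inside it (the "Scunthorpe problem").
BLOCKED_KEYWORDS = ["cock"]  # invented blocklist entry for illustration

def naive_filter(text: str) -> bool:
    """Return True if a naive substring filter would block this text."""
    lowered = text.lower()
    return any(word in lowered for word in BLOCKED_KEYWORDS)

# A post about poultry is blocked alongside genuinely sexual content,
# because "cockerel" contains the blocked keyword.
print(naive_filter("Our aviary sells a fine cockerel"))  # True (false positive)
print(naive_filter("Our aviary sells fine hens"))        # False
```

A context-aware filter has to do far more than match substrings, which is exactly where, as Andy says, the algorithms struggle.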
21 | I don't know if you've experienced this on social media platforms recently, but sometimes someone will post something and they'll get a warning going, this looks like abusive content we've seen in the past, are you sure you want to post this? | |
22 | They're doing this because they are trying to go further with algorithmic intervention. But what would happen if a platform accused you of being a racist when you were referring to a primate, for example? Incredibly bad PR for the organization, potential defamation or those sorts of things. | |
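The corpus-based warning Andy mentions, flagging a new post because it resembles phrases flagged before, can be sketched with a toy similarity check. The corpus, threshold and function name here are invented for illustration; real platforms use trained language models rather than character-level matching like this.

```python
from difflib import SequenceMatcher

# Invented examples standing in for phrases a platform has
# previously flagged as abusive.
FLAGGED_CORPUS = [
    "you people should go back where you came from",
    "your kind do not belong here",
]

def looks_like_flagged_content(post: str, threshold: float = 0.8) -> bool:
    """Warn if the post closely resembles any previously flagged phrase."""
    lowered = post.lower()
    return any(
        SequenceMatcher(None, lowered, phrase).ratio() >= threshold
        for phrase in FLAGGED_CORPUS
    )

# A near-copy of a flagged phrase triggers the warning...
print(looks_like_flagged_content("You people should go back where you came from!"))  # True
# ...while an unrelated post does not.
print(looks_like_flagged_content("Great match last night"))  # False
```

Note that a post quoting or describing abuse would match such a corpus just as readily as the abuse itself, which is the context problem raised throughout this discussion.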
23 | So the idea that the platforms can solve this on their own is frankly nonsense. After the Plymouth shooting, where the guy was posting up a lot of misogynistic stuff, sometimes referred to as incel-based content, | |
24 | again the commentators start with, oh, why did the platforms allow this to be posted? He was posting up YouTube videos where he was clearly demonstrating aspects of mental illness and posting up some fairly grim misogynistic material. Now, if we think detecting that stuff in a text-based context is problematic enough, in a video-based context it's even | |
25 | more problematic, because the algorithm has to try and, for want of a better word (because it doesn't actually listen), process the speech from the video and then make a decision on it. It becomes incredibly complicated. And I know Keir Starmer tweeted a few days ago that, as a result of the Plymouth shooting, it's about time | |
26 | the government got on with implementing the online safety bill. I can guarantee that the online safety bill won't stop this sort of thing from happening. | |
27 | Louisa | Yeah. |
28 | Andy | And it's it's rather worrying that the politicians see it as the solution. |
29 | Louisa | Absolutely. And it's interesting, because after the racism that followed on from the Euros, a few people I knew were posting online saying, if every single post that you put up about COVID gets a warning label on it with information about where you can find reliable information, why can't the same be done |
30 | for racism? And I kind of felt like it's because the algorithm is totally incapable of understanding the nuance. Say somebody who has experienced racism wants to talk about their experiences online; they should be able to do that without being flagged as potentially racist, because that is going to further solidify the | |
31 | prejudice against that person. There's a great podcast called Be Antiracist by Ibram X. Kendi, which I listen to and would highly recommend if you're interested in learning more about how to tackle racism, and in that they talk a lot about the experience of racism. | |
32 | That's the sort of thing that could be flagged as being racist because of the words that they're using, and clearly it's not. An algorithm isn't going to be able to tell that. But you, as a person with a brain and an understanding of nuance, are going to be able to tell that difference. | |
33 | You're going to be able to recognize if someone is, you know, standing up against something, or being very subtle in a way to get those posts in without being flagged by the algorithm. So we can use our brains rather than relying on artificial intelligence. | |
34 | So what can people do? If you see something online that is racist, what should you do? What is a good way of dealing with that, rather than relying on an algorithm to pick it up? | |
35 | Andy | This is the flip side of, I guess, a lot of the work we do, and certainly the conversations we have with young people over the years, and also a lot of the legal stuff that's going on at the moment: the narrative of platforms need to do more, platforms should sort this out. |
36 | The platforms are providing the tools. It's just that people aren't using those tools. I was speaking to a group of young people a while ago, and I think I've talked about this on the podcast before, but there was a TikToker in the college who was posting up some fairly offensive content on TikTok. | |
37 | It was clearly stuff that breached TikTok's community standards, and a number of the students at the college were getting upset about it. I was asked by a member of staff what they could do about it, and I said, well, why haven't the students reported the account? | |
38 | It clearly breaches community standards; TikTok are famously conservative, so they'll check it out and ban the account. I spoke to some young people at the college about that a couple of weeks afterwards, and the response was, there's no point reporting, it doesn't work. | |
39 | Well, how do you know? Well, I reported a video on YouTube once and they didn't take it down. What was that about, then? It was about animal abuse. Then you start to unpick it: it was an interpretation of a video that wasn't physically abusive to an animal; | |
40 | it was something that had nuance to it. But all of these platforms provide tools for reporting. Myself and my friends will have a happy evening getting a racist banned from Twitter. You know, you sort of see a comment. | |
41 | You report that comment as racist, a bunch of other people report it as racist, and then you check back a couple of days later and that account's been suspended. Yeah, if you have concerns about what someone's posting, you can report it. | |
42 | The idea that we all sit back while the platforms check out everything that gets posted onto them, and we don't have to do anything, is a very strange perspective on how a community operates. | |
43 | Louisa | I think, you know, even if you think, oh, they're not going to do anything, the more we all use the reporting mechanisms, the more they're actually going to have to put time and energy into improving how they deal with reports. |
44 | Because if we're all reporting that stuff, and then the government are saying, platforms, what are you doing, then they can say, look, we're doing this: we're getting all of these reports and we're responding to them. | |
45 | So the more we use it, the better it will be, the more likely it will be that those platforms will put money and energy into improving those mechanisms. | |
46 | Andy | One of the good things... it was interesting, I was asked by an organization a couple of weeks ago to think up five good things and five bad things about the online safety bill. It was very difficult to think of five good things. |
47 | However, one of the things I did say is the fact that platforms are having to do what's referred to as transparency reporting: reporting every year on the number of complaints made, the number of actions taken, the number of accounts that have been suspended, | |
48 | the number of things they've responded to positively, and so on. If they're going to start doing that, people will start believing it works. Because if you've got someone going, there's no point reporting it, they never do anything, and that platform comes back and says, we suspended half a million accounts last year because of racist content, misogynistic content or homophobic content or | |
49 | whatever, then people will start believing it more. There is certainly a knock-on effect: if you know people who have used it and it's worked, then great. But the idea that the platforms have either got to do everything or nothing... | |
50 | They don't do anything. It's like, well, have you ever reported it? No. | |
51 | Louisa | Yeah. |
52 | Andy | Hang on a minute, you know? So I think that is one of the positive things that is coming from it. One of the really interesting things about the online safety bill being held up as the thing that's going to solve everything is that it talks about how platforms need to demonstrate they are dealing with |
53 | illegal content; that's fine. But then they talk about this category of lawful but harmful, or legal but harmful, which is where a huge amount of the stuff that people get offended by, and that people will claim is abusive, will fit, because it's not using specific racist language or specific homophobic language. | |
54 | The legislation in that area starts to become incredibly strangely worded: it refers to something that would be considered harmful by a person of ordinary sensibility. Well. | |
55 | Louisa | I'm glad that we know what ordinary means. |
56 | Andy | Yes. |
57 | Excuse me, sir, would you be offended by the following? Yes, I would. Right, so let's fine Facebook. So it's almost like the legislation is demonstrating how complicated it is while still trying to put the pressure back on the platforms. | |
58 | You know, one of the very famous things about Chatroulette, which was the precursor to Omegle. Both of those are platforms where your webcam gets connected to another random webcam somewhere in the world, and they were very famous for, you know, you click on Show Me the Next Webcam and it would be someone masturbating. Chatroulette | |
59 | cleaned themselves up basically by putting a report button onto the page. So when you clicked on someone masturbating, you clicked report; it would log the IP address of that person as someone who was masturbating, and would block that IP address. | |
60 | So that's when it all moved over to Omegle: the Chatroulette community got fed up with watching people masturbating. But it was a community effort that achieved that. It wasn't everyone sitting back and writing, dear Chatroulette, I was disgusted to discover there was a man masturbating. | |
61 | You know, there is a responsibility on the whole community, something we talk about a great deal in safeguarding: the role of all of the stakeholders in the community. I think sometimes in the online world we just sit back and go, how the hell did Facebook let this happen? Whereas if we sat in a cafe and there was someone | |
62 | spouting racist language at the table over from you, you'd look at the cafe owner and go, excuse me, can you do something? You'd deal with the problem yourself or something. | |
63 | Louisa | Yeah. |
64 | Andy | You wouldn't just sort of sit there, or go on TripAdvisor afterwards: I was disgusted to see there was a racist in that cafe. |
65 | You know. | |
66 | Louisa | And I do think that context is really important. The context around the Euros is a really good example, because the Euros hype up nationalism; you know, it's all about England, and it's encouraging people to really engage with this narrative of England is the best, |
67 | everyone else is worse than us, we're amazing. That has always gone along with a certain element of racists who believe that, and who then get very involved in that scene, and it isn't challenged, isn't critiqued. We're not saying, hold on, | |
68 | do you think all of these people just raving about how amazing England is could lead to some problems down the line? And then, you know, the government saying, oh, taking the knee is unnecessary, we think that's daft. | |
69 | We're not going to condone it, or whatever. You can't then be surprised when there are, you know, consequences to that. So what we need to do is challenge those sorts of racist behaviors at every step. And it's fine for people to support England, | |
70 | but if they're expressing views that England is better than everybody else, England is the best in the world: where is that coming from? What does that mean? You know, just having those conversations with young people, like, when you say best in the world, do you mean just football? Kind of get into that with them, | |
71 | have these conversations with them. Because if we can challenge them at the beginning, then hopefully they don't get so far down that road that they might be sending racist, homophobic or misogynistic messages to other people. | |
72 | Andy | There's a very famous quote from the Secretary of State for Education, who was interviewed about the vaccine program and asked why we were getting on so well, and went, oh, because we're better than everyone else: we're better than France, we're better than Germany. |
73 | And it was just like, what do you mean, better? Well, you know, if senior politicians are using that competitive, what's the word, elitist narrative, it's no wonder that other people do as well. So, yeah, you know, | |
74 | it is a somewhat bizarre thing, but bad stuff happens in society and the immediate reaction is, what are the platforms going to do to stop this? As I'm sure I've quoted before, there was a very famous cybersecurity researcher called Marcus Ranum. | |
75 | Andy | He's got this thing called Ranum's Law, which says you don't solve social problems through software. |
76 | Yeah. So we'll see. Yeah. | |
77 | So, people tweeting racist stuff? Stop people being racist. No, that's too complicated. | |
78 | Louisa | Yeah. |
79 | And you know, I think that really does get to the heart of it. It's complicated. It's a difficult thing to challenge racism, and particularly in a county like Cornwall, where we don't have a huge amount of racial diversity, I think it can be really difficult, because you might be working with young people who don't know any | |
80 | people who aren't white or European, and that can then be really difficult. But again, there are ways that we can support young people around that, and trying to get them to engage with content creators who come from different backgrounds to them could be a really good start. | |
81 | Like, if you don't have these people in real life, why not try and engage with them online? | |
82 | Andy | You know, online does present some really excellent opportunities for that. If you are just scrolling through various random content on TikTok, you will come across people from different cultures and different experiences. Start following them. There's a very famous autistic commentator on TikTok who's just gone to a million followers now. |
83 | And you look at the comments: you're doing so much to promote autism awareness, and stuff like that. You know, there are a lot of positives that can come from looking into other people's worlds. But I think, as well, there is also the responsibility of the professional to make sure we're not chucking in throwaway remarks or comments | |
84 | that might facilitate certain types of prejudice, and one that sticks with me: I was working at an all-boys school in a southwest city that has girls in the sixth form. | |
85 | In one of the discussion groups I was having around online misogyny, one of the girls in the sixth form was saying there was an incident in the corridor before they went into class, and the girl was singled out for swearing. The teacher said, I wasn't expecting that sort of thing from a girl, in front of a bunch of males who had spent most of their secondary | |
86 | school in a male-only environment. And you kind of go, that's a really appalling thing for a professional to be saying to somebody, because all that's doing is creating isolation and saying, we can do it, but you can't. | |
87 | Louisa | Yeah, absolutely. Yeah, that's not good. |
88 | Andy | I guess the key issue is the fact that, you know, the online platforms can't solve it on their own, but they are providing tools to help people engage and not be bystanders about this sort of thing, I guess. |
89 | Louisa | Yeah. And we can encourage young people to not be bystanders and to take an active role, because you don't have to be the victim of something to report it. We shouldn't necessarily be putting that on |
90 | the victims of that abuse, to be the ones who are reporting it. If we all do our bit and report things when we see them, hopefully that saves the victim having to go through that quite difficult process. | |
91 | And hopefully, you know, off the back of those reporting mechanisms, there will start to be some more real-world consequences for people posting racist, homophobic or, you know, any kind of hate speech online. That would be a good move the government could make, but in the meantime we can at least get it taken | |
92 | down. | |
93 | Andy | What I often say is that if we were doing this ten years ago, we would be sitting here going, the platforms are appalling, the platforms are terrible, because they wouldn't do anything. Now they're doing quite a lot, but a lot of that isn't used by people, because they just go, oh, there's no point reporting. |
94 | There is a point in reporting. It works. You know, if you see something and you just disagree with somebody, don't report that; but if you see something and it's racist or misogynistic then, as I say, when you report it and other people report it as well, it will be looked at by somebody. You | |
95 | know, what are we, three or four days past the Plymouth shooting? Someone tweeted yesterday some really quite offensively misogynistic stuff about, well, if you provided 16-to-21-year-old males with women, they could be ex-offenders or something, then they wouldn't be behaving like this. | |
96 | Andy | And it was just an amazing thing. It's like, geez, are they taking the mickey? No, they're serious. Everyone reported it; a huge amount of people reported it and it was gone very, very quickly. |
97 | You know, within a space of a couple of hours of that tweet appearing, the account was suspended. | |
98 | Louisa | Yeah. |
99 | Andy | That was people using the tools. |
100 | Louisa | Yeah, absolutely. |
101 | Andy | And you know, if you're just disagreeing with someone, you also have the tools to mute and block people, even if you can't get them banned from social media, which would be a problematic thing to do in terms of freedom of speech. |
102 | If you just disagree with somebody, you don't have to listen to what they say. But if it is hate speech, or offensive, then report it. | |
103 | Louisa | Definitely. Excellent. Well, I think that seems like quite a good place to end. We will hopefully be back in a couple of weeks with another podcast. Until then, we'll see you soon. That's it for another episode of the Online Resilience podcast. |
104 | If you liked it, please tell someone you know who might also enjoy it. You can share on Facebook, Twitter, or even just pop a link in an email. |