We’re live at our #OnlineHarms event, where we will hear from industry experts, researchers and policy makers, and share interim findings from our research on young people’s perspectives on social media, acceptable use, and enforcement challenges faced in the UK and overseas.
Our panellists are @VictimsComm – Dame Vera Baird, @Dr_FaithG – a senior lecturer @ANU_Law and research lead on our latest #onlineharms work, Professor Lorna Woods – from @CarnegieUKTrust + @EssexLawSchool, and Jordan Khanu – a young person @LDN_VRU’s Young People’s Action Group.
Chaired by Catch22’s Chief Operating Officer, @Catch22Naomi, and with an update from Minister of State for Digital and Culture, @CJ_Dinenage, we are sure that this is going to be a fascinating discussion about the important issues surrounding #OnlineHarms.
Across our services, we have seen the issue of #OnlineHarms exponentially increasing as children and young people have relied on the online world for social and educational purposes this year. Our frontline staff are acutely aware of the opportunities and risks this has created.
In June, we released our #OnlineHarms Consultation, with insights from young social media users, tech platforms, youth services and youth workers on how violence and exploitation stem from online behaviour. This has been followed by an ongoing research project, led by @Dr_FaithG.
Our first speaker today is Dame Vera Baird, @VictimsComm, who is responsible for championing the interests of crime victims and witnesses and reviewing the operation of the Victims' Code of Practice.
. @VictimsComm talks about the importance of using voice to improve the lives of victims, and the timeliness of this conversation about #OnlineHarms, stating that whilst we can't blame COVID-19, we have all spent far more time online, creating a "perfect storm" for online abuse.
Opportunities and threats are rising for young people, while supervision and support have reduced. Victim support services have reported 6,000 fewer young people seeking support compared to the same time last year. #OnlineHarms
Online abuse can push young people into criminality, @VictimsComm explains, with many first targeted on social media before then being coerced into criminal activity, with threats around sharing sexually explicit images often forming part of that abuse.
. @VictimsComm explains that it has been proposed to make it a crime to threaten to publish explicit images online, as this threat can be coercive, noting that legislation has simply not kept pace with this issue.
The Online Harms Bill proposes a new framework: a statutory duty of care on companies to keep users safe online and to tackle illegal, harmful activity, @VictimsComm continues, stating that her priority is to give people a clear, straightforward route to report.
We need to ensure that frontline workers, including teachers, are people children feel confident speaking to when these issues occur, and that they are able to spot red flags when they appear, @VictimsComm continues.
"We need to keep up to date with a new dimension of criminality, in order to keep this issue under control," @VictimsComm concludes.
Next, we will hear from @Dr_FaithG, the lead on our current research into children and young people’s experience of social media, online harms, and the challenges facing law enforcement professionals. She is a Senior Lecturer at @ANU_Law.
This project builds on extensive work that Catch22 have been doing, @Dr_FaithG begins, in the area of online activity, social media, and whether this is a catalyst for violence, taking it more in-depth through workshops with young people on their experiences and ideal solutions.
The study is exploring their perceptions of social media, apps and gaming, and the benefits and risks that they identify and measures they think could work as part of regulation to make these spaces safer for children and young people, she continues.
The project has also involved researchers, tech organisations, educators, victim support services, safeguarding and law enforcement professionals, discussing the challenges they experience in monitoring acceptable behaviour.
All of the children and young people spoken to felt their voices were marginalised and unheard on these issues, but they have a lot to say about their experiences and the experiences of their peers. These voices and challenges have been missing up to now, and must be more central.
The COVID-19 context hasn't been experienced before: every aspect of our lives has moved online, and many young people have mentioned the challenges this has created, with far more opportunities for harm arising from this time online.
There are also grey areas, @Dr_FaithG continues, with the need for legislation to be able to evolve with new platforms and as the digital world changes.
In the workshops, young people have emphasised that there are lots of positives to the online world, @Dr_FaithG says, sharing direct quotes from the children and young people she has spoken to, but noting they have all expressed some experience of #OnlineHarms.
The children and young people noted that many people bypass the age verification measures, and that the education they have received is often out of date.
Many young people have also mentioned having to support and assist younger siblings, expressing anger at the platforms for the messages their siblings were receiving - particularly noting the automated, impersonal responses from the platforms, @Dr_FaithG continues.
We asked children and young people what they would like to see happen when something does happen, @Dr_FaithG says, noting that they often felt the harm was already done, and wouldn't complain now because of the responses received previously.
Children and young people really felt that "there was nobody really fighting their corner," @Dr_FaithG emphasises, and want there to be more accountability. They gave many suggestions about how to make it safer for them going forward, which will be included in the research.
Before we move onto the panel questions, we will hear from Minister of State for Digital and Culture, @CJ_Dinenage, who was unable to attend today but who has sent a video message giving an update on the department’s #OnlineHarms work.
#OnlineHarms is an incredibly important issue, she begins, highlighting the difficulties that COVID-19 has created, making this work more important than ever. @CJ_Dinenage emphasises the duty of care that companies will have, with a regulator in place to deal with non-compliance.
The full Government response will be published this year, with legislation coming early next year, @CJ_Dinenage states, with world-leading protection for children's data also coming next summer. This will be accompanied by an online media literacy strategy in the spring.
"We all have a role to play in making online spaces safer," @cj_dinenage concludes, with all of this work helping to support individuals with the time they spend online.
We will now move onto our panel discussion, where we are also joined by Professor Lorna Woods from @CarnegieUKTrust and @EssexLawSchool, and Jordan Khanu, a young person from @LDN_VRU’s Young People’s Action Group.
"Part of what is going wrong, is inadequate systems", Professor Lorna Woods begins, noting that the complaints system and age verification systems are particular points of issue, stating that platforms should take steps to fix these issues as self-regulation is not working.
The regulator needs to listen to various groups in society, she continues, including those who are likely to be victimised as well as the technology platforms themselves. This is a valuable part of ensuring the voices of those using the platforms are heard.
We've seen less emphasis on what good looks like, and what tools will empower users to have control over their own experiences so they can easily mute inappropriate content or contacts without having to engage with it. We need to create usable, appropriate tools, Lorna concludes.
For the first question, @Catch22Naomi asks how COVID-19 has impacted young people’s behaviour online and their experiences with social media. The question is framed in the context that many of us are spending a huge amount of time online now, even for education.
. @VictimsComm says that this is not new, but COVID-19 has intensified the issue. This immense concentration of online use, and the isolation of children and young people, has increased the suffering that is faced.
Many people do not realise that they have had wrong done to them, and the reaction they receive in the immediate aftermath can make a real difference to the way this harm is perceived - something that picks up on the issues about complaints that have already been discussed today.
"People can come to expect that, if there is no enforcement taken, then this is what I have to learn to put up with, this is the way it is - which runs the risk of making the environment worse and encourage potential perpetrators," Professor Lorna Woods adds to the conversation.
Providing the experience of a young person, Jordan from @LDN_VRU's Young People's Action Group, discusses the anonymity that the internet gives to people, emphasising that you can't always trust that someone is who they say they are, which has a big impact on safety.
Jordan continues by explaining that young people's experiences during lockdown have been very different for different people, heightening many people's mental health struggles and affecting their expectations for the future.
Next, @Catch22Naomi asks what our panellists think is meant by “acceptable use” in terms of appropriate behaviour online. This is a key part of our research, both in the perceptions of young people, and what degree of behaviour is accepted by the different social media platforms.
Children and young people in our workshops talked about acceptable use in terms of their own experiences on social media, wanting it to be a space they could engage with without being subject to messages from people they don't know, @Dr_FaithG begins.
Guidelines are often in a format that isn't very accessible for the age group of young people, so people just click accept, @Dr_FaithG continues, noting that they often do not understand exactly what it is that they are signing up to.
For the final question to our panel, @Catch22Naomi asks what our panellists have been seeing in terms of legislative measures internationally that are working to keep young people safe online, as we are expecting to see legislation here in the very near future.
There are some initiatives at a national and international level, but looking at what the platforms are doing to try to make this more difficult is a very important part of the picture, Professor Lorna Woods notes.
The Duty of Care element of legislation is an important concept that needs to have an active quality to it in order to safeguard people effectively, @VictimsComm adds, and notes that there needs to be work done collaboratively here and internationally by investigative agencies.
. @Catch22Naomi has now handed over questions to Jordan from @LDN_VRU’s Young People’s Action Group, who asks how you should go about dealing with the problem of receiving hateful comments from people that you don't know online.
People need to be given strong education on ways to deal with this, @VictimsComm begins, highlighting her own experience of trolling online. Continuing, she says there needs to be a mechanism by which you can separate from it (when you can't shut your door on it to escape).
As our event draws to a close, we move to questions from our audience. We will publish answers to any questions that were not answered here on our website.
We start with a question about how we make duty of care a proactive, rather than a reactive, responsibility. Professor Lorna Woods says we need to think about the likely, foreseeable, harm that could eventuate.
We also need to be clear about how we are seeing harms, rather than waiting for them to occur. We need to think about what we are seeing, what we expect, and how we could improve that, noting the difference between "legal but harmful" and "criminal", which can leave it too late.
Young people are more likely to react by blocking rather than reporting, the next question notes, asking if young people interviewed for our workshops had ideas on how these reporting functions could be improved.
We need to be thinking more broadly about the type of world and communities in which we want to live. Young people often internalise these things and feel they must deal with them themselves, which doesn't put the responsibility on the platforms, @Dr_FaithG begins.
Continuing, @Dr_FaithG talks about using technology that is already available on some platforms to improve others and give more power to users, highlighting again the importance of personalised feedback when issues are brought to the attention of a platform.
There is a need for much more transparency, @Dr_FaithG concludes, which would enable us to create a better picture of what goes on, noting this should be picked up by tech platforms to show how technology can be used for good, making these spaces safer for young people.
"Everything that has been said in this panel discussion today has been true," and "the world of social media is a big world, and almost a separate world", Jordan concludes for us, highlighting how hard it is, as a young person, to stay safe across so many apps.
Thank you to all our contributors this morning for sharing their expertise with us! It has been a really interesting conversation about the increasing issue of #OnlineHarms and the challenges and opportunities that are arising from an increased reliance on the online world.
And thank you to our attendees too. We will be sharing the full results of our #OnlineHarms research in February and running our next event at the same time. If you would like to stay up-to-date on our work in this area, please bookmark this page: https://bit.ly/3gwXiij 