Past Events - Personal Safety Webinar Aug 20
Commentary on ODA webinar on personal safety
The ODA webinar was held on 12 August. The presenters were Justin Davis from Spectrum Labs, Saskia Garner from the Suzy Lamplugh Trust, Anthony Oyogoa from UrSafe and Fred Beckley from The Meet Group. It was moderated by George Kidd of the ODA.
This commentary aims to capture the key points made and starts with some broader take-aways as seen by the ODA hosts. Our read:
- Things are much changed – with or without the Covid events. We are not likely to roll the clock back.
- With tens and tens of millions using dating apps and services around the world, the sector has a high profile. With this profile, and given the nature of the services provided, we must recognise the responsibilities we have and act on them.
- Today, safety arrangements are not optional but expected and necessary. Users want assurance that services take their safety seriously – and should welcome advice, guidance, and the safety tools they themselves can use.
- Having the right arrangements in place builds a sense of community and should drive trust, confidence, and revenue. We need, therefore, to adopt a safety by design mentality: really planning out what you should do and where, when, and how.
The implications as we see them, based on the points made
There has been a significant shift in terms of how services address safety issues, in the safeguards that exist, and in how extra levels of security can be offered to meet a higher level of user expectation and a lower level of tolerance of unacceptable behaviour online.
This may raise issues over how the legal provisions in the US Communications Decency Act and the European Community e-Commerce Directive, which limit a service's liability as a host or publisher of content, sit with its wish to apply safeguards and offer guidance and new tools to users. These measures have read-across to the ways in which services collect, use, share, store or retain personal data.
More thought is required over how entities partner to offer safeguards based on any data-sharing, and whether as much or more can be achieved by identifying and managing the devices used online rather than real or possibly false identities and personal data.
The “new normal” is not driven by the 2020 coronavirus. It has evolved over time as services have come to recognise how having millions of users brings a shift in expectations, not just from those users but also from tech partners, channel providers and others, while at the same time recognising the business case and benefits of demonstrating social responsibility.
Honesty, clarity and choice were the key messages as the webinar looked at how users engage online, how behaviours may be managed or moderated, how users can be supported when meeting others, and how services might support some users if/when a meeting results in serious harm or fears.
The art and science of protecting users
Justin Davis and Saskia Garner explored what is and is not possible in terms of moderating online content to identify behaviour that is unpleasant and unwanted, and whether particular behaviours can be flagged as strong indicators that the person in question might pose a threat in person.
Saskia reported that 80% of stalking (not specific to dating) has an online element, whether via WhatsApp, car software or Amazon Alexa. She argued the need to look at the frequency of contact and the way in which contact is being used, not just at keywords, to prevent this sort of behaviour. She spoke of potential research looking at cases where physical harm had resulted and going back through the places and forms of contact that preceded it to identify behaviours that could flag intent.
Justin saw data as an asset, with scope for building a community with other partners to create a fuller picture of behaviours. He challenged the sector to be more sophisticated in using data to identify seriously problematic behaviours. He spoke of moderation as a combination of art and science, and of the very real limits to how far poor but not demonstrably threatening behaviour online can flag risks when meeting in person. Given that past research by Mintel, Populus and others had flagged unacceptable crude behaviour online as the No. 1 concern and turn-off for users, there was a consensus that moderation and intervention would be valued by users even if their impact in deterring more serious threats could not be calculated.
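To make the frequency-of-contact point concrete, here is a minimal sketch of the kind of signal described – flagging a sender for human review once their volume of messages to one recipient within a short window crosses a threshold, regardless of whether any single message trips a keyword filter. The 24-hour window, the threshold of 50 messages and all names are illustrative assumptions, not details any service confirmed using.

```swift
import Foundation

/// Illustrative sketch only: flags contact that is unusually frequent,
/// independent of whether any individual message trips a keyword filter.
/// The window and threshold are placeholder assumptions.
struct ContactFrequencyMonitor {
    private var timestampsBySender: [String: [Date]] = [:]
    let window: TimeInterval = 24 * 60 * 60   // look-back window (24 hours)
    let threshold = 50                        // messages per window that trigger review

    /// Record one message and report whether the sender's overall pattern of
    /// contact (not this single message) should be queued for human review.
    mutating func record(sender: String, at time: Date = Date()) -> Bool {
        var times = timestampsBySender[sender, default: []]
        times.append(time)
        // Keep only the messages that fall inside the look-back window.
        times = times.filter { time.timeIntervalSince($0) <= window }
        timestampsBySender[sender] = times
        return times.count >= threshold
    }
}
```

The point of the sketch is simply that the unit of analysis is the pattern of contact over time rather than the individual message.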
Fred Beckley led discussion on how services could intervene more strategically and with greater effect. He spoke, in particular, about the ability on iOS platforms to identify a device being used inappropriately in terms of content posted or conduct online (and also in seeking to carry out a fraud) and to report it to Apple with an instruction to block that device from the app in question. There was interest in how this approach prevents a rogue user registering again and again under different profiles/identities, and in how a focus on the device, with no need or desire to process sensitive personal data, could allow intervention with fewer complications. There were questions over what misconduct would justify a ban. The Meet Group confirmed device blocking is reserved for the most troublesome activity. Like others, it has mechanisms for warnings and escalating time-outs on streaming, as well as account termination and/or device blocking for more serious misconduct. The device block was certainly seen as a game changer, resulting in double-digit percentage reductions in reports of unacceptable conduct and of attempted scams.
With over 70% of traffic now on smartphones, there was a strong desire to see this same device-blocking capability in place on Google's Android operating system.
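For illustration only, and without claiming this is the exact mechanism discussed in the webinar, one common way of flagging a device on iOS without processing personal data is Apple's DeviceCheck framework: the app generates an ephemeral device token and sends it to the service's own backend, which can then ask Apple to record per-device flags (a "blocked" bit, say) that persist across re-installs and new accounts. The backend endpoint in this sketch is a hypothetical placeholder.

```swift
import Foundation
import DeviceCheck

/// Illustrative sketch only. The app generates a DeviceCheck token and forwards
/// it to the service's own moderation backend; the backend (not shown) would use
/// Apple's DeviceCheck server API to read or set per-device flags.
/// The URL below is a made-up placeholder, not a real endpoint.
func reportDeviceForReview() {
    guard DCDevice.current.isSupported else { return }   // e.g. not available in the Simulator
    DCDevice.current.generateToken { token, error in
        guard let token = token, error == nil else { return }
        var request = URLRequest(url: URL(string: "https://example.com/moderation/device-token")!)
        request.httpMethod = "POST"
        request.httpBody = token.base64EncodedString().data(using: .utf8)
        URLSession.shared.dataTask(with: request).resume()
    }
}
```

The attraction, as described above, is that a rogue user who deletes the app and re-registers under a new profile still presents the same device, so the flag travels with the hardware rather than with an identity the service would otherwise have to verify.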
The webinar discussed different approaches to checks on past criminal convictions relevant to a decision on accepting an individual onto a service. The Meet Group checks against registered sex offender lists in the United States. This is not universal practice. Many jurisdictions do not publish widely information on those with past criminal convictions. To do so in a way that would extend to dating apps seemed sure to mean a host of other social media, commercial and other online services and channels would be expected to have similar access. This would raise issues over data security and data retention policy and any legal rights individuals have to see information held on them. It has to sit alongside legal disclaimers over liability for the comprehensiveness and detail of checks conducted.
Webinar participants discussed the extent to which users are open to advice and guidance on personal physical safety: whether they would find it unwelcome and be deterred from using a service or whether, as with fraud tips, they accept the world is not a perfect place filled with perfect people. The general view seemed to be that people recognise there are some risks in online contact and physical meetings, reflected, for example, in the tripartite safety pledge The Meet Group and its users sign up to, making sure they realise that the offline and online worlds can be dangerous.
There was a brief discussion on the extent to which safety advice could be relatively generic and on situations where different, blunter messaging may be necessary. There was a world of difference, for example, between advice supporting a middle-aged user of a mainstream service who takes good time to get to know the person behind the profile before any meeting or friendship, and a younger user of an app looking to meet a like-minded individual within hours, quite possibly with a view to an immediate physical relationship. Generally, however, the view was that most services rightly presented advice with broad relevance that is sensible and practical.
Giving users new tools and choices
Anthony Oyogoa also spoke of a “new normal” with users far more comfortable in using devices and locational data to help them stay safe generally, not just when dating. He argued it is the last bit of security that companies can offer their users in the journey from a digital presence to an actual meeting and date. Safety apps have a simple “plug and play” form that can allow messages confirming safety or messages seeking assistance to be sent and received in ways the user is comfortable with – including an ability to alert law enforcement agencies. UrSafe has just entered an advertising partnership with one international brand alongside The Meet Group offering the app to users for a trial period.
Anthony was not arguing that safety apps should be a “requirement”, but that there should be an industry standard of care that includes making a safety app available to users. If/when such apps are offered by some leading services, others may be open to questions and criticism if they do not, at least, give users the ability to choose to use one. There were analogies to car safety: seatbelts and airbags moved from being novel to being a consumer requisite, with or without, and before as well as after, law-making.
Saskia Garner addressed the challenges services could face if confronted with a serious report of abuse and harm. Although incidents may be few and far between, by the time something is reported it will have become a serious issue. If users are worried they will be dismissed, they may think it not worth reporting at all.
First contact was seen as critical. Is the perpetrator being dealt with? Does the service have the right information to hand for referring a user to law enforcement and support agencies? Does the user feel safe enough to go back to the service? Does the service have a view on how that should be managed? Expert training is essential: most calls by volume are about service levels and value/cost, and staff need to be alert to, and know how to deal with, the very infrequent but critical ones. Where in-house arrangements are not possible or proportionate, services could set up a third-party relationship that provides an expert point of contact for support.
Throughout the webinar the message from panellists was to have plans and processes in place, not just to react to bad events as they happen. Justin Davis referred to “safety by design”; a number of others referred to their broader networks of relationships and support arrangements. Support from an early stage is essential, alongside pro-active messaging. You want users to have the best possible experience, and users need to know they will be supported. The ODA reported moves in the UK to make the receipt, recording and reporting of user complaints and alerts over bad conduct a regulatory requirement.
Our speakers' final one-line thoughts:
Saskia Garner – Reassure users that safety is a priority and show what you do as a company. Have a clear process of support from the right people.
Justin Davis – Healthier communities will drive revenue. Investment in trust and safety is investment in the community.
Anthony Oyogoa – Everyone is working for safety. It's integral. Success is when a user has no need to report in on a date where all was fine, which means all elements of online and physical safety are working.
Fred Beckley – Everyone wants good, safe relationships. Have solid, accurate data and make best use of it.