Framework for User Safety
The Framework for User Safety sets out the actions dating apps and services can take to create as safe an environment as possible for daters.

The Framework is based on the user safety elements of our broader Standards Package developed in partnership with the sector, online safety agencies, law enforcement, regulators and other stakeholders.
The Framework takes the form of a tool or template that services can use when reviewing their policies and processes. It is also a checklist that we think any start-up business can use to design safety into its service.
As such, the Framework is neither prescriptive nor mandatory. We are not acting as an approvals or compliance body, but as a community of worldwide players who want to see safety at the heart of dating services. Our members display the Association logo online to show their commitment to these Principles.
The core elements of the Framework cover:
User advice & guidance
Service providers have various opportunities to guide users on how to make the best, and safest, use of their services, and should give creative thought to when and how information can be presented in ways that maximise user awareness. These opportunities include:
- at the point of sign-up
- when potential matches are presented
- when a user takes a subscription
- when users are alerted to interest from others through the messaging system
Advice and guidance should, at a minimum, address the risks of online fraud and the risks to personal safety when meeting someone in person, particularly for the first time.
Services should highlight the behaviours and requests that might signify a scammer. Services that provide more secure in-service communications platforms should point out the benefits of these and the risks of being taken off-site.
Service providers should look to make use of advice and guidance on safety online that may be produced by other agencies seeking to prevent fraud or other forms of harm such as stalking, harassment and revenge porn.
Addressing user safety
Services should give users a good understanding of what is and is not done in relation to safety and should be clear on where and why responsibility for certain risks must lie with Users.
Services should explain the steps they can take to moderate users, making clear the limits on what is permissible by way of criminal record or other checks and, therefore, that they cannot guarantee the accuracy of profile information. Services should highlight ways in which users can take steps to check another user’s profile, being clear that these may be helpful but not 100% assured.
As part of best practice, services are expected to:
- Make Users aware of what constitutes unacceptable and illegal behaviour while using a dating service. The Model Code of User Conduct may help services when reviewing the content of any statement of their expectations. Services should give thought to how these expectations are shared with users in a way that ensures they are seen and understood, for example at the point of registration.
- Ensure that Users are able to report any abuse or harm to the dating service easily, and encourage Users to do so. Services may then need to consider whether their customer support set-up is able to address potentially sensitive issues as well as routine contractual and operational questions, and, if not, to signpost users to specialist support services.
- Ensure that all reports are responded to quickly and appropriately, allowing action to highlight and secure a change in behaviour where possible, and enabling the service to remove a profile and terminate an individual’s use of the service where that is the appropriate response.
- Have arrangements in place for dealing with reports within a set timeframe, and for deciding when and how the actions taken may be explained to the person who reported the incident. Thought should be given to whether it is possible to alert others at possible risk while safeguarding the person making the report.
- Actively moderate user profiles, ensuring that appropriate arrangements exist to detect fraudulent or misleading profiles and inappropriate or harmful content and behaviours, and to remove such profiles from the service as soon as possible.
- Be alert to the possibility of under-18s trying to join a site. Services should be clear on the age restrictions that apply. In addition to profile picture and content checks, services should consider forms of cross-checking with other digital platforms, and should actively encourage users to report any possibly under-age individuals on the service.
- Respond quickly to any report of an under-age person who has managed to get onto the service, removing the profile and considering whether further action is necessary specific to the individual incident.
- Be alert to any report suggesting another user is behaving in ways that are a cause for concern, distress or actual harm. Services should have the ability to remove such users. If serious issues arise, operators should not hesitate to encourage a user to report the matter to their local police force, anonymously or otherwise.
- Investigate whether an individual or profile removed for serious unacceptable behaviour has been in contact with others on the service and might pose some threat to the safety of these other users.
- Consider whether it is appropriate to refer a user with concerns to law enforcement agencies or charitable bodies or others that specialise in safeguarding, protection and victim support.
- Understand the legal basis on which law enforcement or other regulatory agencies may require information and co-operate with these agencies.
Managing the risk of fraud
Services should use best endeavours to identify and remove scammer profiles, using software analytics; any data feeds from law enforcement bodies or agencies with sector-specific expertise and technology; and profile monitoring and user reporting tools.
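To illustrate what "software analytics" alongside user reporting might look like in practice, the sketch below scores a profile against a few common scam signals and flags it for human review. This is a minimal, hypothetical example, not part of the Framework: every field name, phrase list and threshold is an assumption, and a real system would use far richer signals and machine-learned models.

```python
# Illustrative sketch only: a crude rule-based scorer that flags profiles
# for human moderator review. All field names, phrases and thresholds are
# hypothetical assumptions, not prescribed by the Framework.

SUSPICIOUS_PHRASES = ("wire transfer", "gift card", "western union")

def scam_signal_score(profile: dict) -> int:
    """Return a rough risk score; higher means more scam signals present."""
    score = 0
    bio = profile.get("bio", "").lower()
    if any(phrase in bio for phrase in SUSPICIOUS_PHRASES):
        score += 2  # known scam vocabulary in the profile text
    if profile.get("account_age_days", 0) < 2:
        score += 1  # very new account
    if profile.get("messages_sent", 0) > 50 * max(profile.get("account_age_days", 1), 1):
        score += 2  # unusually high outbound message rate
    if profile.get("asks_to_move_off_platform"):
        score += 2  # pushes users towards less safe off-site channels
    return score

def needs_review(profile: dict, threshold: int = 3) -> bool:
    """Queue the profile for moderator review once enough signals stack up."""
    return scam_signal_score(profile) >= threshold
```

The design point is that no single signal removes a profile automatically; signals accumulate towards a review threshold, keeping a human moderator in the loop as the Framework's emphasis on appropriate responses suggests.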
Services should inform users of the safeguards that exist through use of the service’s messaging platform and any other arrangements the service has in place. At the same time, services should be clear about the limits on their rights and powers to verify every user and their good intentions.
Services should provide easy-to-find and user-friendly advice and guidance on how users can minimise the risk of fraud.
Services should allow users to report others if they suspect fraud or believe the person in question is a risk to the safety of others. Services should monitor these reporting channels and be ready to act promptly to remove wrongdoers. Where possible, services should give feedback to those making reports.
Where there is clear and demonstrable evidence of actual or attempted fraud, services could consider whether that user has had significant and close contact with people other than any known victim. We encourage operators to alert these other users to the fact that the person has been removed from the service for an apparent breach of the service’s conditions of use.
Services should always advise a victim of a scam to inform the lead law enforcement agency. Services should identify any other support agencies that could be of help to a user who is the victim of a fraud.
Services should have a clear understanding of the duties on them when presented with a legally binding instruction to release data or content to law enforcement agencies acting under national or state law.
Services should understand whether mechanisms exist in individual markets for accessing law enforcement agency data on reported scam profiles, with a view to blocking these profiles or the devices or IP addresses being used.
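Where such a data feed is available, blocking by device or IP address can be as simple as maintaining a blocklist and checking it at sign-up. The sketch below is an illustrative assumption only: the record format, field names and class are invented for the example, and any real feed's format would be defined by the agency providing it.

```python
# Illustrative sketch only: checking sign-ups against identifiers drawn from
# reported scam profiles. The record fields ("ip_address", "device_id") are
# hypothetical; a real law enforcement feed defines its own format.

class ScamBlocklist:
    def __init__(self) -> None:
        self.blocked_ips: set[str] = set()
        self.blocked_devices: set[str] = set()

    def ingest_report(self, record: dict) -> None:
        """Add identifiers from a reported scam profile to the blocklist."""
        if ip := record.get("ip_address"):
            self.blocked_ips.add(ip)
        if device := record.get("device_id"):
            self.blocked_devices.add(device)

    def should_block(self, ip_address: str, device_id: str) -> bool:
        """True if either identifier matches a previously reported scammer."""
        return ip_address in self.blocked_ips or device_id in self.blocked_devices
```

Matching on either identifier catches scammers who rotate IP addresses but reuse a device, and vice versa, though both identifiers can change, so blocklists complement rather than replace the monitoring described above.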
Services should not themselves create fake profiles to make a service appear populated, or knowingly allow users or any other party to create and post fake profiles in an attempt at fraud.
The full Framework can be downloaded here:

The Framework sits alongside our ODA Model Code of User Conduct, which sets out what is expected by way of user conduct and behaviour online. In doing so, we recognise that a great dating experience is based on the actions of both the service and its users.