Online dating has become part of the new normal, particularly at a time when stepping outside our homes can be life-threatening. Thanks to technology, meeting people is possible even within the four walls of your room. At the same time, it cannot be denied that online dating carries real risks, and users need to stay on guard, because a single swipe can change lives, for better or worse.
Match Group, the parent company of popular dating apps such as Tinder and Hinge, will deploy artificial intelligence in an effort to "red flag" and stop potential sexual offenders.
Under this world-first proposal, Match Group will work hand in hand with NSW Police, reporting cases of assault to law enforcement immediately.
The New Step to Step Up
According to NSW Police, Match Group will soon streamline the reporting of sexual assaults through a 'portal' that NSW Police can access directly, allowing officers to take immediate action and respond effectively, ensuring the safety and security of users.
A statement by Detective Superintendent Stacey Maloney highlights the need for cooperation from dating apps to bring cases of sexual violence to light.
“If they hold information that is suggestive an offense has been committed, they have a responsibility in my view to pass that on.”
The company's new move comes after an investigation by triple j Hack and Four Corners brought to light Tinder's failure to respond to sexual assault survivors, a failure that helped offenders cover their tracks and escape the law.
In response to the investigation, Match Group has been working on certain steps to ensure the safety of users.
It was Superintendent Maloney who suggested that dating apps adopt artificial intelligence systems to monitor users and messages, helping to effectively spot "red flags" indicative of sexual assault or violence.
Dr. Rosalie Gilbert, who has conducted in-depth and extensive research on women's safety on Tinder, welcomed the changes with open arms. However, she believes artificial intelligence systems will have limitations in detecting certain types of problematic and threatening behavior.
Although such systems will be effective at detecting overt abuse, they will have trouble identifying types of behavior and content that have become normalized.
“Automated systems are only as useful as the data that are used to develop them. This means that Match Group will need to consider what data it uses to train its models. An automated system designed to detect overt abuse will only ever be able to detect overt abuse.”
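Dr. Gilbert's point can be illustrated with a deliberately naive sketch. The snippet below is purely hypothetical, not Match Group's actual system (which has not been described publicly): a flagger built only from examples of overt abuse catches explicit threats but misses coercive messages that have been normalized, because nothing in its "training data" covers them.

```python
# Hypothetical illustration of Dr. Gilbert's argument: a detector is only as
# good as the data behind it. This toy term list stands in for a model
# trained exclusively on overtly abusive messages.
OVERT_TERMS = {"threaten", "hurt", "stalk"}  # hypothetical training signal

def flags_message(message: str) -> bool:
    """Return True only if the message contains an overtly abusive term."""
    words = message.lower().split()
    return any(term in words for term in OVERT_TERMS)

print(flags_message("I will hurt you"))           # overt threat: flagged
print(flags_message("Send a photo or I'm done"))  # coercive but normalized: missed
```

A real system would use a trained classifier rather than keywords, but the failure mode is the same: behavior absent from the training data, however harmful, goes undetected.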
This points to the nuances of the data these systems are trained on, and to the care needed in choosing steps that will make dating apps a safe space for users rather than a "predator's playground."