The well-known women-first dating service Bumble is making its AI tool, Private Detector, available to the public in an effort to curb the practice of sending unwanted nudes online, also known as cyberflashing, and to make the internet a safer place for everyone.
The tool automatically blurs a potentially lewd image sent in a Bumble chat. The recipient is notified and can then decide whether to view or block the image.
The company stated in a blog post that its “Data Science team has prepared a white paper detailing the technology of Private Detector and has made an open-source version of it available on GitHub.”
The post continued that, as the industry collaborates to make the internet a safer place, “it is our goal that the feature will be adopted by the larger tech community.”
This version of Private Detector is released under the Apache License, so anyone can use it to blur explicit photos, either as is or after fine-tuning it with additional training data.
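The general pattern such a feature follows can be sketched in a few lines: a classifier assigns an image a probability of containing nudity, and images above a threshold are blurred until the recipient opts in to view them. The sketch below is purely illustrative, not Bumble's actual API; the classifier is a stub standing in for a trained model like the open-sourced Private Detector, and the blur is a simple box blur over a grayscale pixel grid standing in for a real image library.

```python
def classify_nudity(pixels):
    """Stub classifier returning a probability in [0, 1] that the image
    is explicit. A real deployment would call a trained model such as
    Bumble's open-sourced Private Detector instead."""
    return 0.97  # pretend the model flagged this image


def box_blur(pixels):
    """Apply a 3x3 box blur to a 2D grid of grayscale values (0-255)."""
    h, w = len(pixels), len(pixels[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Average the pixel with its in-bounds neighbours.
            vals = [pixels[ny][nx]
                    for ny in range(max(0, y - 1), min(h, y + 2))
                    for nx in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = sum(vals) // len(vals)
    return out


def prepare_for_display(pixels, threshold=0.9):
    """Blur the image when the classifier score exceeds the threshold.
    Returns (image_to_show, was_blurred); a blurred image would only be
    revealed if the recipient explicitly chooses to view it."""
    if classify_nudity(pixels) >= threshold:
        return box_blur(pixels), True
    return pixels, False
```

In practice the threshold trades false positives against false negatives, which is also why the open-source release invites fine-tuning on additional training data.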
To address the broader problem of cyberflashing, Bumble says it worked with lawmakers from both parties in Texas in 2019 to pass a bill that made sending unsolicited lewd photos a punishable offence.
Since Texas passed HB 2789 in 2019, Bumble has successfully pushed for similar legislation around the world.
Bumble reached another significant public-policy milestone in 2022 by helping pass SB 493 in Virginia and, most recently, SB 53 in California, strengthening online safety in one of the most populous US states.