To combat the sending of unsolicited nudes online, known as cyberflashing, and make the internet a safer place for everyone, the popular women-first dating app Bumble is open-sourcing its AI tool, Private Detector.
The new tool works by automatically blurring a potential nude image shared within a chat on Bumble. Users will be notified, and it is up to them to decide whether to view or block the image.
“Bumble’s Data Science team has written a white paper explaining the technology of Private Detector and has made an open-source version of it available on GitHub,” the company said in a blog post.
“It is our hope that the feature will be adopted by the wider tech community as we work in tandem to make the internet a safer place,” it added.
This version of Private Detector is released under the Apache License, making it available for anyone to adopt as a standard for blurring lewd images, either as-is or after fine-tuning it with additional training samples.
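At a high level, the feature pairs an image classifier with a consent gate in the chat UI: a flagged image is blurred, the recipient is notified, and the recipient alone decides whether to view or block it. The following is a minimal Python sketch of that flow; the classifier here is a stand-in stub, and all names are hypothetical illustrations, not Bumble's actual API.

```python
from dataclasses import dataclass

# Score above which an image is treated as a potential nude.
# The real Private Detector uses a trained neural network; this
# stub stands in for it so the surrounding flow is runnable.
NUDITY_THRESHOLD = 0.5

def nudity_score(image_bytes: bytes) -> float:
    """Placeholder classifier: returns a probability in [0, 1].

    Purely illustrative heuristic, not the actual model.
    """
    return 0.9 if image_bytes.startswith(b"NSFW") else 0.1

@dataclass
class ChatImage:
    data: bytes
    blurred: bool = False   # rendered blurred in the chat
    notified: bool = False  # recipient was warned about the image

def receive_image(data: bytes) -> ChatImage:
    """Blur a flagged image and notify the recipient instead of showing it."""
    img = ChatImage(data)
    if nudity_score(data) >= NUDITY_THRESHOLD:
        img.blurred = True
        img.notified = True
    return img

def user_decision(img: ChatImage, choice: str) -> str:
    """The recipient, not the sender, decides what happens to a flagged image."""
    if not img.blurred:
        return "shown"
    if choice == "view":
        img.blurred = False  # un-blur only on explicit consent
        return "shown"
    return "blocked"
```

The key design point mirrored here is that the classifier never deletes anything on its own; it only withholds the image until the recipient makes an explicit choice.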
To help address the broader issue of cyberflashing, Bumble said it teamed up with legislators from across the aisle in Texas in 2019 to pass a bill that effectively made sending unsolicited lewd photos a punishable offence.
Since the passage of HB 2789 in Texas in 2019, Bumble has continued to successfully advocate for similar laws across the US and around the globe.
In 2022, Bumble reached another milestone in public policy by helping to pass SB 493 in Virginia and most recently SB 53 in California, adding another layer of online safety in one of the most populous states in the US.
–IANS