This update comes as Bumble is building new safeguards to uphold its mission to foster healthy and equitable relationships, and continue to put women at the center of its experiences.
As the dating ecosystem evolves, Bumble is focused on responsible uses of AI and on addressing new challenges brought by disingenuous usage. In a recent Bumble survey*, 71% of Gen-Z and Millennial respondents felt there should be limits to using AI-generated profile pictures and bios on dating apps. In addition, 71% of those surveyed believed that using AI-generated photos of oneself doing things one has never done, or visiting places one has never been, qualifies as catfishing.
The visual above shows the new reporting option, and a quote appears below from Bumble’s VP of Product, Risa Stein, who leads trust & safety efforts.
The new reporting option joins Bumble’s existing features that use AI for good to help members stay safe while dating online:
- Deception Detector: Rolled out earlier this year, this AI tool helps identify spam, scam, and fake profiles—within the first two months, Bumble saw member reports of spam, scam, and fake profiles reduced by 45%**. Find out more in the release in our digital media kit here.
- Private Detector: An AI tool that automatically blurs a potential nude image shared within a chat on Bumble, then notifies you that you’ve been sent something that’s been detected as inappropriate. You can easily block or report the image afterward.
- For You feature: We also recently made new AI-powered advancements to our “For You” feature to improve the member experience. This is a daily set of four curated, relevant profiles based on our community’s preferences and past matches, designed to show singles people who could be a great match.
Quote from Risa Stein, VP of Product at Bumble:
“An essential part of creating a space to build meaningful connections is removing any element that is misleading or dangerous. We are committed to continually improving our technology to ensure that Bumble is a safe and trusted dating environment. By introducing this new reporting option, we can better understand how bad actors and fake profiles are using AI disingenuously so our community feels confident in making connections.”