# 600,000 confessions. 100,000 minors. One Firebase rule.
When 404 Media published the investigation in March 2026, the numbers were these: 600,000 user records exposed. Roughly 100,000 of them minors. Not email addresses. Not user IDs. The records contained self-reported masturbation frequency, emotional triggers, content categories, and free-text personal confessions — the most intimate data the consumer internet handles, tied to verified ages.
The cause was a single Firebase rule: `allow read, write: if true`. The default. Set on project creation day and never changed.
Here's what Quittr actually collects from every new user, before it generates a "personalized recovery plan": Age. How often they watch pornography. How often they masturbate. What triggers them — stress, boredom, loneliness, specific emotional contexts. What categories of content they're trying to quit. And a free-text field: why do you want to stop?
| What was exposed | Scale |
|---|---|
| Total user records | ~600,000 |
| Self-identified minors | ~100,000 |
| Self-reported masturbation / porn-use frequency | Per-user, weekly |
| Emotional triggers | Free-text per user |
| Content categories being watched | Per user |
| "Why I want to quit" responses | Free-text per user |
There's a category of breach where the sensitivity of the data matters more than the volume. This is one of them. Email addresses are an annoyance. Email addresses tied to self-reported masturbation frequency, personal confessions, and a verified age that includes minors are a long-tail harassment, sextortion, and doxxing package.
## The app that collected it
Here's the part that makes this story sting, because Quittr's story reads like a recruiting poster for vibe coding.
- Built by a 19-year-old, Alex Slater, and his co-founder Connor. First-time founders. No security background.
- Shipped in about 10 days using SwiftUI on the client and Firebase on the backend, with Superwall for paywalls.
- 350,000+ downloads in six months, across 120 countries, generating ~$250,000 in monthly recurring revenue and topping $1.1M in total revenue — bootstrapped, no venture capital.
- Featured in Dazed, The Week, an Oprah Winfrey Facebook post, L.A. Weekly. One of the most-cited examples of the "app mafia" of young founders shipping six-figure-MRR consumer apps with a small stack and no team.
Ship fast. Stay small. Own the revenue. Skip the VCs. Every one of those facts is a headline a builder wants to read. Nobody in this story did anything wrong by building fast. The tool they built on shipped a default that was wrong for production, and nobody told them.
## The rule
Here's what Firebase ships when you select "test mode" on project creation:
```
rules_version = '2';
service cloud.firestore {
  match /databases/{database}/documents {
    match /{document=**} {
      allow read, write: if true;
    }
  }
}
```
`if true` means what it says. Read anything. Write anything. No authentication. No ownership check. No rate limit. Your users, your data, your confessions — accessible to a laptop on the other side of the planet with a single HTTPS request.
This is what Quittr's Firebase project was running. Not because someone made a mistake. Because this is the option Firebase presents when you create a new project, and because nothing between that moment and production deploy tells you it's wrong.
Security researcher Kaeden summarized the finding in a disclosure message to Quittr that 404 Media later quoted:
> Your firebase (Database) is misconfigured its possible to read/write to anything, one of the things its possible to do for example is list all users and their info, which is pretty bad for an app of this nature.
"Which is pretty bad for an app of this nature" is the understatement of the year. But Kaeden's report is technically simple because the vulnerability is technically simple. There's nothing to chain, nothing to exploit, no zero-day to reverse. The rule said `if true`. The database said yes.
## Why this isn't just a Quittr story
Quittr is the fourth major Firebase-backed consumer app to hit this exact failure mode in the last twelve months. The same `allow read, write: if true` default shipped to production at:
| App | Data exposed | Scale | Our coverage |
|---|---|---|---|
| Quittr | Age, habits, personal confessions, 100K minors | 600,000 users | This article |
| Cal AI | Health records, meal logs, 4-digit PINs, a child's data | 3.2M records | Cal AI case study |
| Tea | 13,000 government IDs, 1.1M private messages, GPS | 72,000 images | Tea case study |
| 900+ sites | Mixed PII across hundreds of Firebase projects | 125M records | — |
Four apps. Four completely independent teams. Four different data categories. One rule. The rule Firebase ships by default when you click "test mode" on day one.
The pattern is the point: if four different teams — including a 19-year-old who built a $1M app in 10 days and a calorie-tracking startup with 3.2 million users — all ship the same bug, the bug isn't in the builder. The bug is in the default.
## Three things that make the default survive
Quittr's founder built the app in ~10 days. The Firebase rules file was likely set once, on project creation, and never revisited. That's not carelessness. It's three specific properties of Firebase's architecture that make the default survive from development to production unchallenged.
1. The rules surface is invisible from the client. When you're building your SwiftUI app and reading users from Firebase, every read returns data. Every write succeeds. There is no runtime signal that the backend is also readable by everyone else. Your app's happy path and an attacker's happy path are indistinguishable from the server's perspective. "The app works" and "the database is open to the internet" are the same observation.
2. The rules live in a different tab. The Firebase Console surfaces your database, your authentication, your storage, and your rules in separate sections. You can build, test, and deploy an entire app without ever opening the Rules tab. Your IDE doesn't show the rules file. Your build system doesn't check it. Your AI scaffolder certainly doesn't.
3. There is no deployment blocker. Firebase will happily deploy `allow read, write: if true` to production. No build-time check. No pre-deploy warning email. No reviewer. The same "Deploy" button that deploys locked-down rules deploys test-mode rules. In 2026, after hundreds of millions of records have leaked across thousands of projects, there is still no `--i-know-this-is-test-mode` flag on `firebase deploy`.
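Nothing stops a team from building that blocker itself, though. A minimal sketch of a pre-deploy guard you could run in CI before `firebase deploy` — the file names are the Firebase CLI defaults and the grep pattern is an assumption about how the rule is written; adjust both to your project:

```shell
# Sketch of a pre-deploy guard: fail the pipeline if any rules file
# still contains the test-mode blanket allow.
fail=0
for f in firestore.rules storage.rules database.rules.json; do
  if [ -f "$f" ] && grep -Eq 'allow[[:space:]]+read,[[:space:]]*write:[[:space:]]*if[[:space:]]+true' "$f"; then
    echo "BLOCKED: test-mode rule found in $f" >&2
    fail=1
  fi
done
[ "$fail" -eq 0 ] || exit 1
echo "rules check passed: no blanket allow found"
```

Wire it in as the step before deploy and the test-mode default can no longer reach production silently.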
This is the gap AI code generators can't fill either. Every LLM scaffolder is trained to produce working code. Working code for Firebase is code that reads and writes without permission errors. The fastest way to get there is to leave test mode on. The rules file never surfaces in the IDE, never fails the build, never triggers an error. The scaffolder ships what works. What works is open.
## The disclosure and the fix
Three independent security researchers contacted Quittr about the open database over the course of 2025: the first in July, the second in August, the third later in the year. The first, Kaeden, was reportedly told the fix would ship within the hour. The rules remained unchanged for approximately eight months, until 404 Media reached out for comment in March 2026 and published the investigation on March 10–11.
The gap is notable less as a judgment of the founders and more as evidence that disclosure infrastructure doesn't exist in most vibe-coded projects. No security.txt. No security@ alias. No triage process. When a report arrives via Twitter DM to a 19-year-old who's never opened the Firebase Rules tab, the natural failure mode is to triage it behind the next feature release — not because the founder doesn't care, but because the fix requires navigating an unfamiliar surface under no visible pressure. The app still works. Revenue is still growing. The Rules tab is still where it always was.
This is a solvable problem. A security@ alias costs nothing. A written triage rule — "security reports get a commit within 24 hours, not a reply" — costs nothing. Adding them before you ship is the difference between a same-day fix and an eight-month gap that ends with a reporter in your inbox.
## Your 5-minute audit
If you're shipping on Firebase, do these checks on your own app today. They cost nothing and they take less time than reading this article.
1. Open your Firebase Console → Firestore Database → Rules. Search for `if true`. If it appears at the root level, you have test mode in production. That's the Quittr bug. Fix it before you do anything else.
2. Check every Firebase product separately. Realtime Database, Firestore, and Storage are three products with three separate rules files. Tea's breach was Storage. Quittr's was the database. Cal AI was Firestore. Securing one does not secure the others. Open each tab, read each file.
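For Storage in particular, the deny-by-default shape mirrors the Firestore version. A sketch assuming per-user files live under a `/users/{userId}/` prefix — adapt the match blocks to your own layout:

```
rules_version = '2';
service firebase.storage {
  match /b/{bucket}/o {
    // Signed-in users can read and write only files under their own UID.
    match /users/{userId}/{allPaths=**} {
      allow read, write: if request.auth != null
                         && request.auth.uid == userId;
    }
    // Everything else is locked.
    match /{allPaths=**} {
      allow read, write: if false;
    }
  }
}
```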
3. Test from an unauthenticated client. Curl a REST endpoint for one of your collections without signing in. If you get data back, your rules are wrong.
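A concrete shape for that check, using Firestore's public REST surface. The project ID and collection name below are placeholders; substitute the real project ID from the Firebase config in your client bundle:

```shell
# Hypothetical values — replace with your own project ID and a real
# collection name from your app.
PROJECT_ID="your-project-id"
COLLECTION="users"
URL="https://firestore.googleapis.com/v1/projects/${PROJECT_ID}/databases/(default)/documents/${COLLECTION}"

# No auth token, no signed-in user. A locked-down project returns a
# PERMISSION_DENIED error; an open one returns your users' documents.
curl -s "$URL"
```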
4. Write rules that deny by default:
```
rules_version = '2';
service cloud.firestore {
  match /databases/{database}/documents {
    match /users/{userId} {
      allow read, write: if request.auth != null
                         && request.auth.uid == userId;
    }
    match /{document=**} {
      allow read, write: if false;
    }
  }
}
```
Start from `if false`. Open what you've decided to expose. Never start from `if true` and subtract.
5. Add a disclosure inbox today. A security@yourapp.com alias. A `/.well-known/security.txt` file (the RFC 9116 location). A single paragraph explaining how to report issues. The cost is zero. The absence of it is what turns a same-day fix into an eight-month gap.
## How Flowpatrol catches this
Quittr's breach was a single Firebase rule. The entire exploit path — credential extraction, unauthenticated query, full data dump — takes under 60 seconds. Our scanner walks that exact chain automatically:
```
$ flowpatrol scan https://quittr.app
Scanning... 4 endpoints discovered

✗ Critical: Firebase Firestore — unauthenticated read access
  Rule: allow read, write: if true (test-mode default)
  Collections readable: users, confessions, streaks, plans
  Records returned without auth: 600,000+

✗ Critical: Firebase credentials exposed in client bundle
  Project ID: quittr-xxxxx
  API key: AIzaSy...

✗ High: No security.txt or disclosure endpoint

Done in 2m 34s — 3 findings, 2 critical
```
We pull Firebase and Supabase credentials from your client bundle, test every discoverable collection with the anon key, and flag anything that returns data it shouldn't. The check that catches the Quittr bug is the same check that caught Cal AI, Tea, and 916 other Firebase projects. Five minutes. One URL.
Quittr was a brilliant app, built at extraordinary speed, by a founder who shipped what the internet celebrates. The data it collected was among the most sensitive the consumer internet handles. The rule that exposed it was the default.
The app was impressive. The default wasn't. Fix the default.
This case study is based on public reporting by 404 Media (March 10–11, 2026), Cybernews, Techlicious, DataBreaches.net, and founder interviews including BoringCashCow and L.A. Weekly. Reported exposure figures (600,000 users, ~100,000 minors, eight-month disclosure gap) are as documented in 404 Media's two-part investigation.