Kin

Why Safety Apps Feel Like Surveillance


JotGen by JouleWorx

April 14, 2026 · 11 minute read

Every safety app makes a bargain with you that it doesn't announce out loud.


The deal is never written on the app store page. It doesn't appear in the onboarding flow or the explainer video. But it is there, underneath everything, shaping every design decision the product has ever made.

This post is about what that deal actually costs. Not in the abstract, but in the specific choices made by three apps that most people reach for first. And about why the deal was never necessary to begin with.


Life360: When the Business Model Is the Product

Life360 is the largest player in the family safety space globally. It is heavily funded, broadly used, and it solves a problem that is genuinely difficult: the quiet anxiety of not knowing whether the people you love are okay. For millions of families, it works. That is worth saying clearly before anything else.

But there is a structural fact about Life360 that shapes everything else about the product, and it is worth understanding.

In December 2021, investigative journalism outlet The Markup revealed that Life360 was selling precise location data, including data on children and families, to approximately a dozen data brokers, who then sold it to virtually anyone who wanted to buy it. The company reportedly generated $16 million from data sales in 2020 alone. [1] The CEO at the time described this arrangement plainly: data sales were the mechanism that allowed the core app to remain free for the majority of users. [2]

Life360 announced it would phase out those broker deals. By June 2025, independent reporting found the company listing over 4,600 audience segments on a major data marketplace, grouped by users' location visits and personal attributes. [3] The underlying logic had not changed.

This is not a scandal, precisely. It is a design consequence. When continuous surveillance is how your product works, and when monetizing that data is how your business works, the two things point in the same direction from the very beginning.

The result is something that might be called privacy fatigue, and it is among the feelings users describe most consistently in their reviews. The sense of being watched. Of wearing a tracker rather than carrying a phone. The teenagers who quietly disable it, the young adults who leave the family circle, the users who feel the app working against the relationship it was meant to support: none of this is an accident.

For families looking for Life360 alternatives, this is precisely what they are trying to escape. Not the safety, which they want. The leash, which they don't.


Apple Check-In: The Walled Garden Problem

Apple's approach to the same problem looks completely different. Check-In, introduced with iOS 17 in September 2023, is a thoughtfully designed native feature: your iPhone automatically notifies a selected contact when you arrive safely at your destination, sharing your location, battery level, and signal strength, all end-to-end encrypted through iMessage. In full sharing mode, it also shares the route taken. [4] There is no third-party app involved. No data broker in the chain. For an all-iPhone family, it is genuinely elegant.

The problem is the word all.

Check-In requires iOS 17 or later for both the sender and the recipient. [5] The moment one person in your circle uses an Android phone, the feature doesn't degrade; it simply doesn't work. That person falls outside the safety net entirely.

In India, Android holds approximately 95% of the smartphone market. [6] This is not a niche case or an edge condition. It is the default reality of almost every family in the country. Most Indian households are mixed-device households, with different brands, different price points, and different operating systems. A safety feature that requires everyone to be on Apple hardware is, in practice, a safety feature built for a world that doesn't exist here.

Apple Check-In was not designed with bad intentions. It was designed for one ecosystem, and that ecosystem happens to exclude nearly everyone in one of the world's largest smartphone markets. The result is the same regardless of intent: safety becomes a platform privilege rather than a universal standard.


bSafe: When Safety Is Designed for Crisis

bSafe is the most personal of the three. It was built by a survivor of sexual assault and her father, who asked a simple question after years of watching her recover: could this have been prevented? [7] That origin matters, and it shows in the product's commitment to women's safety specifically.

The features bSafe built are a direct answer to that question: a voice-activated SOS alarm, live GPS streaming to guardians when the alarm fires, automatic audio and video recording, a fake call button to extract yourself from a threatening situation. [8] It is available in 125 countries. It has genuinely helped people in dangerous situations.

But there is a tension in this approach that is worth being honest about.

bSafe is built for the moment of crisis. Everything about its interface, the alarm, the siren, the streaming, the panic button, is oriented toward the worst-case scenario. That is appropriate for the worst-case scenario. But most of what families actually need is not the worst case. It is the background hum. The quiet daily signal. I got home. I'm at the office. I'm okay. For that ordinary, everyday need, a tool that looks and feels like an emergency system carries its own cognitive weight. You don't reach for it before things are bad. By the time you reach for it, things already are.

There is a second tension. The same features that allow a trusted guardian to follow someone's progress, such as live location, continuous tracking, and real-time streaming, are the features that domestic violence advocates have identified as susceptible to misuse in controlling relationships. [9] Location-sharing apps, however well-intentioned in their design, can be turned by an abusive partner into instruments of monitoring and isolation. This is not a criticism of bSafe specifically. It is a structural consequence of building around continuous surveillance as the primary mechanism. The tool is only as safe as the relationship it operates within.


What All Three Have in Common

Life360, Apple Check-In, and bSafe are different products built by different people with different intentions. But they share a foundational assumption: that presence requires data.

To know someone is safe, the assumption goes, you need to know where they are. Where they have been. When they moved, and how fast, and from which device. The data collection is not incidental to these products; it is the mechanism. Safety, as designed, requires a continuous record.

Once that record exists, the questions become practical ones: who holds it, for how long, and what are their incentives? For most families, these questions feel abstract. But they are not abstract for everyone.

Consider a journalist checking in with their editor at 11 p.m. on the night of a protest. That exchange does not produce neutral data. It produces operational intelligence: a record of who communicated with whom, in a politically sensitive context, at a sensitive time. The Freedom of the Press Foundation's digital security guidance notes that metadata, including who you contact and when, can be just as sensitive as the content of those communications, and is routinely gathered by platforms that most people use without a second thought. [10] The Pegasus Project identified over 40 Indian journalists as potential targets for surveillance. [11] For anyone whose sources and movements are sensitive, a safety app that logs the shape of their social connections is not a protective tool. It is a liability.

The same logic holds in a controlling relationship, and for activists, and for anyone whose connections are themselves sensitive information. The pattern is always the same: the data trail feels harmless until it reaches the wrong hands.


The Signal, Not the Stream

Most people are not waiting for a crisis. They are waiting for the background hum. I got home. I'm at the office. I'm okay. That is the complete thought. It asks nothing further.

Think about what you are actually waiting for when someone you care about is travelling home late. You are not waiting for their coordinates, their battery level, or a timestamp of their last movement. You are waiting for one piece of information: that they are safe. A tool shaped around the worst case does not serve this ordinary moment particularly well. It carries too much weight for what is, almost always, just a simple reassurance.

Presence and surveillance do not have to be the same thing. A check-in does not require a data stream. It can be a signal, sent when the person is ready, received by someone who cares, and then finished. No record opened that the moment did not ask for. No thread left running after the loop closes.

You do not need to know where someone is to know they are okay. You need them to be able to say so, on their own terms.
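To make the contrast concrete, here is a minimal sketch of what a "signal, not stream" check-in could look like as a data shape. This is purely illustrative and not Kin's actual implementation; the `CheckIn` type, the `deliver` function, and the field names are all hypothetical. The point is what is absent: no coordinates, no battery level, no movement history, and no server-side record after delivery.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical sketch: the entire payload a "signal, not stream"
# check-in needs. No location, no route, no device telemetry --
# just who is okay, who should hear it, and when it was sent.
@dataclass(frozen=True)
class CheckIn:
    sender: str        # an opaque identifier, not a profile
    recipient: str     # one chosen person, not a broadcast
    sent_at: datetime  # the moment the sender chose to say "I'm okay"

def deliver(check_in: CheckIn, notify) -> None:
    """Deliver the signal and keep nothing: no log is written,
    no thread is left open after the loop closes."""
    notify(check_in.recipient, "I'm okay")
    # Nothing is stored here. Once delivered, the check-in exists
    # only in the recipient's notification.

# Usage: the recipient's side is stubbed with a simple callback.
received = []
deliver(
    CheckIn("priya", "amma", datetime.now(timezone.utc)),
    lambda who, msg: received.append((who, msg)),
)
```

The design choice the sketch is meant to surface: everything a continuous-tracking app collects is simply missing from the type, so there is nothing to retain, sell, or subpoena.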


What Kin Is Built On

Most safety apps were designed with data collection as a foundation, because data collection is how free products sustain themselves. The result is that even the act of letting someone know you are safe happens inside a system that is recording the shape of that connection indefinitely.

Kin is built on a different foundation. Not as a set of features, but as a set of beliefs:

Presence without continuous location. When you share where you are, it is because you chose to. A check-in is a signal (I'm okay), not a stream running in the background whether you asked for it or not.

Small, intentional circles. The people you need to hear from are few and they matter. Kin's circles are deliberately limited in size, not as a technical constraint, but as a philosophical one. Unnecessary connections should not exist.

No messaging. Ever. Kin does one thing. Better alternatives already exist for conversation. Kin closes the loop without opening a thread.

Consent as architecture, not policy. The most common approach to privacy in apps is a toggle in settings, a promise that data won't be misused. Kin's approach is different: the architecture is designed so that certain kinds of data collection are structurally not possible, not just policy-prohibited. Consent is not a feature. It is the foundation.

Minimum data. Zero compromise. Never sold. Most safety apps are free because your data pays for them. Kin asks for your phone number once, to verify who you are and to make sure only people you already know can reach you. Your contact list is used locally on your device for that filtering and it never leaves your phone. Beyond that, we collect nothing personal. We cannot read your messages, we do not build profiles, and we will never sell anything about you to anyone. What passes between you and the people you choose stays there.
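As an illustration of how on-device contact filtering can work in principle, here is a small sketch. This is an assumption about one common technique, not a description of Kin's actual code: phone numbers are hashed locally, and only the comparison result determines who can reach you, so the raw contact list never leaves the device. Real systems typically add salting or private set intersection on top of this to resist brute-force reversal of hashed numbers.

```python
import hashlib

def hash_number(number: str) -> str:
    """Hash a phone number on-device so the raw contact list never
    has to leave the phone. (Illustrative only: plain SHA-256 over a
    phone number is brute-forceable; production systems harden this.)"""
    return hashlib.sha256(number.encode("utf-8")).hexdigest()

def allowed_senders(device_contacts: list[str],
                    known_user_hashes: set[str]) -> set[str]:
    """Filter locally: only people already in your contacts can reach
    you. Both the hashing and the comparison happen on the device."""
    return {n for n in device_contacts
            if hash_number(n) in known_user_hashes}

# Usage: one contact is a known user of the service, one is not.
contacts = ["+911234567890", "+919876543210"]
known = {hash_number("+911234567890")}  # hashes the service already has
reachable = allowed_senders(contacts, known)
```

The property worth noting: the service only ever needs to answer "is this hash known?", never "who is in this person's address book?".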

Not surveillance with a privacy disclaimer attached. A signal, sent freely, received with relief, that leaves nothing behind. That is what a check-in app with no tracking should actually mean.

That is what we are building with Kin.


Sources

1. The Markup: "The Popular Family Safety App Life360 Is Selling Precise Location Data on Its Tens of Millions of Users" (December 6, 2021): Investigation revealing Life360 was selling precise location data, including data on children and families, to approximately a dozen data brokers. A former employee described Life360 as one of the largest sources of location data for the industry, with the company generating approximately $16 million from data sales in 2020.

2. 9to5Mac: "Tile owner 'Life360' reportedly sells location data of its users to 'virtually anyone'" (December 6, 2021): Life360 CEO Chris Hulls described data as an important part of the business model that allows the core Life360 services to remain free for the majority of users.

3. The Capitol Forum: "Life360: Family Safety App Selling Datasets Based on Users' Personal Information" (June 26, 2025): More than three years after Life360 announced changes to its data practices, The Capitol Forum found the company had listed more than 4,600 audience segments on LiveRamp's Data Marketplace as of March 2025, grouping users by location visits and personal attributes including age, gender, and household income. A Life360 spokesperson confirmed the company began selling segments through LiveRamp in August 2024 and planned to continue.

4. Pocket-lint: "What is Apple Check In and how does it work in Messages?" (September 2023): Apple Check-In, introduced with iOS 17, automatically notifies selected contacts when a user arrives safely at their destination, sharing encrypted information including location, route, battery level, and signal strength through iMessage.

5. Apple Support: "Use Check In for Messages on iPhone": Check In requires iOS 17 or later for both the sender and the recipient. The feature is built into iMessage and is not available cross-platform to Android users.

6. Sciflare / StatCounter: "Android vs iOS Market Share India" (2025): Android commands approximately 94.81% of India's smartphone operating system market as of March 2025, with iOS holding approximately 4.88%. India has the highest Android penetration of any major market globally.

7. ABC13 Houston: "Dating safety: bSafe, USafeUS and myPlan" (February 2024): bSafe was developed after a survivor of sexual assault and her father asked whether the incident could have been prevented, starting with the ability to trigger an alarm by voice and to prove what happened.

8. bSafe App Store listing: Features include voice-activated SOS alarm, live GPS streaming to guardians, automatic audio and video recording when the alarm fires, and a fake call button to extract a user from threatening situations.

9. National Network to End Domestic Violence (Safety Net Project): "Choosing and Using Apps: Considerations for Survivors": Domestic violence advocates note that phones, devices, and apps can create safety and privacy risks for survivors, and that abusive partners often monitor device activity. Location-sharing features intended for safety can be leveraged for monitoring and control in abusive relationships.

10. Freedom of the Press Foundation: "The 2026 Journalist's Digital Security Checklist" (December 2025): Metadata, including who you contact, when, and for how long, is collected by social media platforms and internet service providers and reveals communications patterns even when message contents are protected. For journalists, this metadata is operationally sensitive.

11. Human Rights Watch: "India: Media Freedom Under Threat" (May 2022): The Pegasus Project found that over 40 Indian journalists appeared on a leaked list of potential targets for surveillance. The Indian government has repeatedly stalled attempts to investigate these allegations.

Early Access

Kin is in early access.

If this is a feeling you recognize, we'd love to have you with us from the beginning.