Guide / Journey 04

Your data, your rules

MorningKeep is built for personal thinking, which means privacy is not a side promise. The product only works if your thoughts still feel like yours.

Private storage by default

Your Sparks live on your device and in your personal iCloud account through CloudKit's private database. That is an important distinction. MorningKeep is not built around a centralized database of everyone's thoughts sitting on company servers. Sync happens inside your Apple ecosystem, which keeps ownership of the source material much closer to you.
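As a concrete sketch of what that looks like in code (the record type and field names here are hypothetical, not MorningKeep's actual schema), saving a Spark goes through CloudKit's private database rather than any developer-operated server:

```swift
import CloudKit

// Hypothetical record type and fields; the real schema may differ.
func saveSpark(text: String) {
    let record = CKRecord(recordType: "Spark")
    record["text"] = text as CKRecordValue
    record["createdAt"] = Date() as CKRecordValue

    // privateCloudDatabase lives in the *user's* iCloud account: records
    // sync across their devices but are not visible to the developer.
    let database = CKContainer.default().privateCloudDatabase
    database.save(record) { _, error in
        if let error = error {
            // Offline or quota errors are handled locally;
            // the on-device copy remains the working record.
            print("CloudKit save failed: \(error)")
        }
    }
}
```

The design choice worth noticing is that there is no app-owned backend in this path at all: the private database is scoped to the user's account by the platform, not by application code.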

In practice, that means the core record of your thinking follows the same privacy posture as the rest of your private Apple data. It is your account, your storage, and your devices. MorningKeep is the interface and the intelligence layer on top of that, not the long-term warehouse of your inner life.

That also changes the trust model. If you delete data from MorningKeep, you are not asking a remote service to forget a copy it was built to retain forever. You are managing data that lives in your own environment. For a product centered on private reflection, that local-first posture is not marketing language. It is architecture.

Why App Attest and consent controls matter

When MorningKeep does need to talk to AI services, the app uses security controls designed to keep that path narrow and intentional. App Attest helps verify that requests come from a legitimate copy of the app rather than a spoofed client pretending to be you. Combined with secure network handling, that shrinks the surface area for abuse around the few parts of the product that do leave the device.
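A sketch of that attestation handshake, using Apple's DCAppAttestService (the server-issued challenge and what the backend does with the result are assumptions here, not MorningKeep specifics):

```swift
import DeviceCheck
import CryptoKit

// Sketch only: the challenge and upload step are hypothetical.
// DCAppAttestService is Apple's API for proving a request comes from an
// unmodified copy of the app running on genuine hardware.
func attest(challenge: Data, completion: @escaping (Data?) -> Void) {
    let service = DCAppAttestService.shared
    guard service.isSupported else {
        completion(nil)   // e.g. simulator; the server decides how to treat this
        return
    }
    service.generateKey { keyID, _ in
        guard let keyID = keyID else { return completion(nil) }
        let clientDataHash = Data(SHA256.hash(data: challenge))
        service.attestKey(keyID, clientDataHash: clientDataHash) { attestation, _ in
            // The attestation object goes to the backend, which verifies it
            // with Apple before trusting requests signed under this key.
            completion(attestation)
        }
    }
}
```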

Consent is the second half of that story. AI-powered features should not quietly assume access to everything you capture. MorningKeep gives you a clear line: when you enable AI features like Cue conversations or AI-generated briefs, content can be processed for those experiences; when you turn that off, new captures stay out of that pipeline. That makes privacy a product setting, not just a legal paragraph.
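The consent gate itself can be as simple as a single routing decision made at capture time. A minimal sketch, with hypothetical type and setting names:

```swift
// Hypothetical names: one user-controlled toggle decides whether a new
// capture may ever enter the AI pipeline.
struct AISettings {
    var aiFeaturesEnabled: Bool
}

enum CaptureDestination {
    case localAndCloudKitOnly   // never sent to an AI provider
    case eligibleForAI          // may be processed for Cue or briefs
}

func destination(for settings: AISettings) -> CaptureDestination {
    settings.aiFeaturesEnabled ? .eligibleForAI : .localAndCloudKitOnly
}
```

The point of structuring it this way is that the check happens once, at the boundary, rather than being re-implemented inside every AI feature.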

The practical effect is that you can decide how much intelligence you want versus how much isolation you want. Some people want Cue fully involved in their thinking. Others want MorningKeep primarily as a trusted capture and review system. The controls are there so the product can support both modes without pretending they are the same thing.

On-device fallback and what never leaves

The core of MorningKeep still works even when cloud AI is unavailable or disabled. You can capture new Sparks, keep your history, and continue using the app as your record of thought without being forced through a remote model call. That on-device fallback matters because it keeps the product useful even when you want a more private mode or when the network is not cooperating.
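That capture-first design can be summarized in a few lines (the function names here are hypothetical):

```swift
// Sketch: saving never depends on the network or on AI availability.
// The local write happens first; AI enrichment is an optional follow-up
// that can silently no-op when disabled or offline.
func capture(_ text: String, aiAvailable: Bool) {
    saveLocally(text)                  // always succeeds; the source of truth
    if aiAvailable {
        enqueueForAIProcessing(text)   // best-effort, skipped when off or offline
    }
}

func saveLocally(_ text: String) { /* write to the local store / CloudKit */ }
func enqueueForAIProcessing(_ text: String) { /* hand off to the AI pipeline */ }
```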

Some things are designed never to leave the device at all. Health data used to understand your rhythms is processed on-device. Biometric data for Face ID or Touch ID stays inside Apple's secure hardware model and is never exposed to the app. Sensitive credentials stay in Keychain storage rather than sitting in the app as plaintext secrets. If you never send a Spark through an AI feature, it stays in the local and CloudKit path instead of being shipped to an external provider.
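For the credentials piece, a sketch of what Keychain storage looks like in practice (the service identifier is illustrative; SecItemAdd is Apple's real Keychain Services API):

```swift
import Foundation
import Security

// Sketch: the secret is held and encrypted by the system keychain instead
// of sitting in app files or user defaults.
func storeSecret(_ secret: Data, account: String) -> Bool {
    var query: [String: Any] = [
        kSecClass as String: kSecClassGenericPassword,
        kSecAttrService as String: "com.example.morningkeep",  // hypothetical identifier
        kSecAttrAccount as String: account,
    ]
    SecItemDelete(query as CFDictionary)   // replace any stale copy

    query[kSecValueData as String] = secret
    // Readable only after first unlock, and never synced off this device.
    query[kSecAttrAccessible as String] =
        kSecAttrAccessibleAfterFirstUnlockThisDeviceOnly
    return SecItemAdd(query as CFDictionary, nil) == errSecSuccess
}
```

The accessibility attribute is the interesting knob here: it lets a "this device only" promise be enforced by the system rather than by app code.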

The same principle applies to identity. MorningKeep uses Apple account primitives instead of asking you to build a separate public profile for a private thinking app. That keeps the product narrower: fewer copies of your information, fewer places to secure, and fewer assumptions that your personal reflections should become platform data.

The practical takeaway is simple: MorningKeep is built to earn trust by limiting what moves, verifying the parts that do move, and giving you the choice to keep your thinking closer to home when that matters more than convenience.