Your clients trusted you first.
Conversations Copilot exists because of how much relationships matter. That means the data inside it, the real kind that reveals what is actually going on with someone, has to be protected like it matters. Here is exactly how we do that.
What lives inside this product.
Conversations Copilot stores relationship context. Not just names and job titles: the actual substance of your client relationships. What they are struggling with. What they told you in confidence. What you know about their organization that nobody outside it would know.
This is not contact data. This is the kind of information that took years to earn.
We built the product because we believe preparation is a form of respect. We take the same view of privacy: protecting this data is not a compliance exercise. It is the only reasonable response to what people have trusted you with.
The chain-of-trust problem. We are naming it.
Here is something most software companies will not say out loud: when you store client data in a tool like this, you are making a decision on behalf of people who never signed up for anything.
Your clients shared things with you, not with us. They consented to a relationship with you. That relationship now lives, in part, inside our product. They have no idea we exist.
That is a responsibility we do not take lightly.
It means we cannot hide behind your agreement with us. We have to behave as if your clients’ standards apply directly, because in every meaningful sense, they do.
What that means in practice.

- We will never make their data findable by anyone who is not you and your team.
- We will never use their information to build profiles, enrich other datasets, or make inferences beyond what you have explicitly entered.
- If you delete a client record, that deletion is permanent and complete.
- When we describe what we do with data, we write it so you could read it to your client and stand behind it.
Our commitments. Every one of them absolute.
Not aspirational. Not "we try to." These are things we do or do not do, full stop.
Your data is yours.
The data you enter belongs to you and your organization. Not to us. We do not claim any license to it beyond what is necessary to operate the service. When you leave, you take it with you, or you can ask us to delete it and we will.
We are not training AI on your data. Ever.
The AI features in Conversations Copilot use your data to serve you: to generate briefs, surface context, prepare you for conversations. That is it. Your client relationships will never be used to train a model, improve a model, or contribute to any dataset that benefits anyone other than you.
We do not sell data. We do not share data.
Your information does not get sold to third parties. It does not get shared with data brokers, advertisers, or research firms. It does not leave our system unless you explicitly export it yourself.
Encryption in transit and at rest.
All data transmitted between you and Conversations Copilot is encrypted using TLS. All data stored in our systems is encrypted at rest using AES-256. Your data is not readable by anyone who does not have explicit authorization.
You can delete everything.
Not "submit a request and we will get back to you." You can delete your account and all associated data, and it will be gone. We will confirm when it is done.
One thing we might do someday. And how we would do it.
Conversations Copilot is used by people navigating complex relationships inside complex organizations. Over time, we believe there are patterns worth understanding: the organizational dynamics that derail deals, the communication breakdowns that damage client trust, the warning signs that appear months before a relationship falls apart.
We might study those patterns. Not to sell insights. Not to build a product on top of yours. To publish thinking that makes the broader community better at this work.
If we ever do this, it will work exactly like this:
No individual will be identifiable. No organization will be identifiable. We would aggregate behavior patterns (signals, not stories) in a way that reveals nothing about the people who generated them. You would never be able to find yourself in our research, and neither would anyone else.
We are telling you this now because we believe you should know how we think about data before you are inside the product, not after.
Everyone who touches your data.
These are the third-party services we use that process customer data in any form. This table is kept current. If you have questions about any of them, email us.
| Subprocessor | Purpose | Data Processed | Location |
|---|---|---|---|
| Hetzner | Infrastructure and hosting | All application data | EU (Germany) |
| Cloudflare | CDN, DDoS protection, bot filtering | Request metadata, IP addresses | Global |
| Brevo | Transactional email delivery | Email addresses, notification content | EU (France) |
| | Authentication (SSO) | Account credentials | US |
| Anthropic | AI brief generation and analysis | Relationship context submitted for AI processing | US |
Anthropic processes data via API calls. Per our agreement with Anthropic, data submitted through the API is not used to train their models. We do not use consumer AI products to process any customer data.
Your rights. Our contact.
Depending on where you operate, you have legal rights around your data: the right to access it, correct it, export it, or delete it. We honor all of these regardless of jurisdiction. You should not have to cite a regulation to get a reasonable response from us.
We respond to privacy inquiries within 48 hours.
This page was last updated: April 2026.
The data inside this product is the kind that took years to earn.
We built it to help you show up better for the people who gave you that trust. Protecting it is part of the same job.