Insights from Jill DeGraff, SVP of Regulatory Affairs and Compliance, b.well Connected Health
With any new innovation, there’s a predictable arc: excitement, fear, conversation, and then clarification of how rights, responsibilities, and protections fit together in practice. That arc is unfolding right now around OpenAI’s recent announcement of ChatGPT Health, the company’s new dedicated, secured environment for health-related conversations that allows consumers to connect medical records (via b.well) and wellness applications.
As this feature begins to attract attention on social media and in tech press, it has also sparked intense discussion about consumer privacy protections, and what rights individuals really have when interacting with digital services that touch personal health information. Some commentators note that medical records uploaded into ChatGPT Health are no longer covered by HIPAA, because the service isn’t a covered entity under that law.
This conversation matters because many in the digital health community want a clear and realistic understanding of how privacy, consent, and control work today when people choose to share or connect their data into third-party consumer applications. While none of this is new, it is the foundation for how apps and services operate. Frankly, it is also where the industry needs to go, because this is how consumers make choices, and how we earn the trust that keeps them engaging with their health.
Understanding the Legal Framework: HIPAA and Consumer Privacy Laws
HIPAA (the Health Insurance Portability and Accountability Act) provides essential patient rights and privacy protections when health information is held by healthcare providers, insurers, and their business associates, including the right to access clinical records and requirements for securing protected health information (PHI).
But there are important limits to HIPAA’s reach:
- It applies to a specific set of health care actors (covered entities and business associates).
- It governs PHI within the health care system, not data shared directly into consumer products or AI tools that are not covered entities.
- It does not, on its own, create broad control rights (like deletion or portability) for data once a consumer uploads it into a service outside the traditional healthcare environment.
That’s where modern consumer privacy laws fit in. Since 2018, a wave of state privacy legislation has established rights that apply to personal data more broadly, including health data that consumers choose to share with digital services. These rights often go beyond what HIPAA alone provides:
- Consumers can know what data is collected and how it’s used.
- Consumers can delete data or withdraw consent.
- Consumers can opt out of sharing or selling personal information.
- Some laws explicitly define consumer health data and require affirmative authorization before use or sharing.
For example, California’s privacy laws allow consumers to stop businesses from sharing or selling sensitive information and require data minimization and access rights. Washington’s My Health, My Data Act specifically governs consumer health data not protected by HIPAA and mandates affirmative authorization and deletion rights.
These consumer privacy rights take effect beyond HIPAA’s boundaries, after PHI leaves the control of a covered entity or its business associate. In many cases, they extend protections into areas that HIPAA doesn’t reach.
How Consumer-Mediated Health Data Exchange Works in Practice
When a consumer uses an app or service to access and share their health information, real-world control is not just about what the law says but about what the user experiences. That’s where b.well’s three trust pillars for consumer-mediated health data access and exchange come alive:
1. Transparency: Clear Terms, Clear Decisions
Before any data is connected or shared, consumers must be able to understand:
- What data the app will access,
- Who will have visibility into it,
- How it will be used,
- For how long, and
- What controls they have over that access.
In the context of ChatGPT Health, OpenAI’s Health Privacy Notice outlines what information may be collected (including connections to medical records and wellness apps), how it’s used to support the service, and what controls users have, such as disconnecting apps and managing connected data sources.
This level of transparency is the starting point of informed consent, and it is also a federal legal requirement under the FTC’s enforcement authorities, as well as under many state consumer privacy laws.
2. Informed Consent: Empowering People to Choose
Informed consent means consumers are given sufficient context and plain-language information before they authorize data access, use or sharing. This includes:
- A clear explanation of what will be accessed and how it will be used,
- The purpose for access,
- Any third parties involved, and
- What options exist for narrowing or revoking permissions.
For example, a consumer accessing ChatGPT Health will be presented with choices about which data to connect and how that data will be used to inform the AI’s responses, and the service’s privacy notice is designed to align with these consumer privacy obligations.
Many state laws have also created more robust consent requirements. For example, Washington’s My Health My Data Act requires a separate consent to collect consumer health data and another to share it (compound consents are not permitted), plus a separate, and significantly more limiting, authorization to sell that data. These state laws operate as a significantly higher floor of baseline consumer privacy protections, built on the principle that consumer consent must be freely given, informed, and revocable.
3. Ability to Exercise Meaningful Choice: Control Throughout the Lifecycle
Beyond initial consent, meaningful consumer control means:
- You can access all the data available from a connection.
- You can know what data has been shared with third parties.
- You can disconnect specific connections at any time, including those with third parties.
- You can delete data that was previously collected from a specific connection.
- You can ask for all your data to be deleted, and have that request honored, including data that you consented to share with third parties.
In digital health applications, consumer control vastly exceeds the rights that covered entities and their business associates must honor under HIPAA. With consumer-mediated health data access, consumer privacy translates into concrete capabilities: the ability to remove a connected record source without deleting all your data (Use Case #3), the ability to delete data from a specific source (Use Case #4), and the ability to delete all health data from an application (Use Case #5).
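To make these lifecycle controls concrete, here is a minimal sketch of how an application might separate disconnecting a source, deleting one source’s data, and deleting everything. All class and method names here are illustrative assumptions, not any vendor’s actual API.

```python
from dataclasses import dataclass, field

@dataclass
class Connection:
    """A hypothetical connected data source (e.g., a patient portal)."""
    source: str
    records: list = field(default_factory=list)
    active: bool = True

class HealthDataManager:
    """Illustrative sketch of consumer lifecycle controls (assumed design)."""

    def __init__(self):
        self.connections: dict[str, Connection] = {}

    def connect(self, source: str, records: list) -> None:
        self.connections[source] = Connection(source, list(records))

    def disconnect(self, source: str) -> None:
        # Stop future retrieval from one source without deleting stored data.
        self.connections[source].active = False

    def delete_source_data(self, source: str) -> None:
        # Delete only the data previously collected from one source.
        self.connections[source].records.clear()

    def delete_all(self) -> None:
        # Honor a full deletion ("right to be forgotten") request.
        self.connections.clear()
```

The point of the sketch is the granularity: a consumer can disconnect one hospital while keeping a clinic connected, or wipe one source’s records without giving up the whole account.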
Applying Rights Through Real Use Cases
To bring these principles together, consider how these rights play out in practice:
Use Case #1: Introducing a Trusted Data Intermediary
A consumer needs clear language that explains the intermediary’s role — who is accessing their data and why — before consent is obtained. Practice has shown that introducing this early in onboarding reduces confusion and increases confidence.
Use Case #2: Portal Access and Attestation
When a user connects a patient portal, they attest that they are using their own credentials. Presenting this attestation either at first connection or globally ensures consent and aligns with current standards for safe data retrieval. (Bear in mind, use of a modern digital ID credential will make this step even more secure, and spare users the aggravation of creating a patient portal account or remembering their credentials.)
Use Case #3: Disconnecting a Specific Source
A consumer might choose to disconnect a hospital’s record source while continuing to use other connected data. This level of granular control gives consumers real choices, not just binary on/off switches.
Use Case #4: Deleting Data from One Source
State laws that grant a right to deletion mean that if someone wants their data removed, that request must be honored and clearly explained in the consent and privacy notice frameworks. Giving consumers deletion at the level of a single source also offers a choice besides all-or-nothing.
Use Case #5: Deleting All Health Data
Consumers can request deletion of all health data held by an application, a right that aligns with the “right to be forgotten” in state privacy laws.
Use Case #6: Specific Consents for Sharing or Marketing
When an application intends to use data for marketing or sharing for non-core purposes, explicit, separate consent is required. Many state consumer privacy statutes require that marketing and data sharing consents are distinct and unbundled from core service permissions. This reinforces control over uses beyond what the consumer originally intended.
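One way to operationalize unbundled consent is to record each purpose as its own consent object, so that agreeing to the core service never implies agreement to marketing or third-party sharing. This is an illustrative sketch under that assumption; the purpose names and structure are hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Each purpose requires its own consent; none are bundled together.
PURPOSES = {"core_service", "marketing", "third_party_sharing"}

@dataclass
class ConsentRecord:
    purpose: str
    granted_at: datetime
    revoked: bool = False

class ConsentLedger:
    """Hypothetical per-purpose consent store (assumed design)."""

    def __init__(self):
        self._records: dict[str, ConsentRecord] = {}

    def grant(self, purpose: str) -> None:
        if purpose not in PURPOSES:
            raise ValueError(f"Unknown purpose: {purpose}")
        self._records[purpose] = ConsentRecord(purpose, datetime.now(timezone.utc))

    def revoke(self, purpose: str) -> None:
        # Consent must be revocable at any time.
        if purpose in self._records:
            self._records[purpose].revoked = True

    def is_permitted(self, purpose: str) -> bool:
        rec = self._records.get(purpose)
        return rec is not None and not rec.revoked
```

With this structure, a check like `is_permitted("marketing")` fails unless the consumer granted that specific consent, which is the behavior unbundling requires.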
What This Means for Consumer Rights Today
In today’s regulatory environment:
- HIPAA continues to secure health data within traditional healthcare systems, ensuring PHI is protected when held by covered entities.
- State privacy laws that apply to applications outside the HIPAA framework provide significantly more robust privacy protections than HIPAA does.
- These protections are centered on well-established principles of transparency, informed consent and the ability of consumers to make meaningful choices throughout the data lifecycle.
- Developers and apps that sit outside of HIPAA must already implement these protections. Because HIPAA does not set this higher floor, consumers should not expect the same protections in apps operated by covered entities and their business associates unless those entities voluntarily agree to meet it.
Conclusion
The discussions around ChatGPT Health and consumer privacy reflect a broader moment in digital health — one where rights, technologies, and expectations are aligning. When individuals decide to connect their medical records or wellness apps to any service, they do so within a framework of transparent disclosure, informed consent, and meaningful control that is now supported by both law and operational best practice.
This framework is the foundation for how consumer-mediated data access and exchange works in practice. It gives people control over their information, enables choice among services based on value, and supports innovation built on trust and accountability.
That’s what today’s privacy ecosystem is meant to support, and what real consumer rights look like in action.
Resources:
b.well Consumer Rights for Health AI
b.well Privacy and Security Statement
OpenAI Health Privacy Notice
Collecting, Using, or Sharing Consumer Health Information? Look to HIPAA, the FTC Act, and the Health Breach Notification Rule