Regulatory Compliance: Turning Overhead into an Opportunity

There's always a balance between what some people call beneficial friction—that’s designed to make sure you are who you say you are—and the kind of friction that leaves you running, screaming down the hallway because you really need your records and can’t remember your password.
Deven McGraw, Chief Regulatory and Privacy Officer at Citizen Health
[00:00:00] Deven: Don’t wait until you’re a year down the road in developing your product to bring in somebody good on the compliance side. It’s almost always the case that with any regulatory regime, there’s both pain around the compliance and opportunity. And so get somebody in quickly when you start, so that you’re not missing the opportunities and you’re not having to rework your product to fix a compliance issue that frankly was there all along.
[00:00:29] Brendan: Welcome to Hard Problems Smart Solutions, the Newfire Podcast, where we explore the toughest challenges and smartest solutions with leaders across technology and healthcare. I’m Brendan Iglehart, Staff Healthcare Architect at Newfire Global Partners, and your host for this episode.
[00:00:54] Brendan: Today, we’re joined by Deven McGraw, Chief Regulatory and Privacy Officer at Citizen Health. Deven served as the Deputy Director for Health Information Privacy at the US Department of Health and Human Services Office for Civil Rights, Acting Chief Privacy Officer for ONC, and Co-founder of Citizen, a startup built on the idea that patients should truly own and use their health data. In addition to her work at Citizen, Deven remains one of the most influential voices in US healthcare information policy today. She shaped HIPAA enforcement, testified before Congress, and helped move the needle on patient access rights in a system where getting your own records was once nearly impossible.
[00:01:33] Brendan: Deven, thank you so much for being here.
[00:01:36] Deven: Thank you very much. I’m glad to be here.
[00:01:38] Brendan: Deven, you’ve had an incredible career across government, advocacy, and startups, and I’m curious what pulled you into the world of health privacy and regulatory policy.
[00:01:49] Deven: It’s a really good question. I really started off in healthcare generally. I sort of was drawn to the myriad of issues that we face in healthcare in the United States. And then, you know, that’s a big, that’s a really big topic, right? So pretty quickly, I kind of found my sweet spot in privacy. I was working for a nonprofit at the time, the National Partnership for Women and Families, and they were advocating for access to quality healthcare, not just access to any old healthcare.
[00:02:22] Deven: Recognizing that electronic medical records were gonna be pretty important to a greater understanding of how healthcare is delivered and how we improve the quality of that care. And, you know, anytime you’re moving data into digital form, lots of questions arise about privacy. And that became my sweet spot.
[00:02:41] Deven: It’s miles deep and inches wide, if you think of the myriad of issues in our healthcare system. Right? And there weren’t that many people that were working in that space. And it does require some legal knowledge, which is very helpful. I do have a law degree, and so it just became the space that was endlessly fascinating to me, and I decided to really make that my career focus: data governance, how we’re using data to improve the healthcare system, and how we do so in a way that people trust. That’s really what privacy is in a nutshell.
[00:03:15] Brendan: So, Citizen, which is now called Citizen Health, was founded on a very personal story, is my understanding. Can you share a little bit about that story, why that mission resonated with you, and then the problem that you’re trying to solve with your company?
[00:03:29] Deven: It’s not my story, but I will share it, because the founder of Citizen, at the time, this is back in 2017, this was his personal story, and it definitely moved me. So, Anil Sethi, he lost his youngest sister to stage four metastatic breast cancer. She was stage four at diagnosis, so she was definitely facing a very serious prognosis. And he tried to gather all her medical records from all the places where she had been seen, working mostly by looking at the portals that she had into some of her doctors, and found that there wasn’t a whole lot there. And when he came to talk to me about it, I said, whatever’s in the portal is probably just a small amount of what she has the right to under HIPAA. And I talked to him about what she did have the right to, which was essentially everything that had ever been generated about her care, far more than what she had access to in the portal. And he came to me when I was still at HHS and said, I wanna start a company that really helps patients, and particularly cancer patients at the time, to get everything that they’re entitled to access under US law.
[00:04:42] Deven: Can you help me do this? Help me start a company that’s dedicated to this. And I was like you know what? This is really what moves me. The idea that to really be an empowered patient to really take charge of your health, take charge of your healthcare journey, you need access to that data.
[00:05:03] Deven: You really can’t do much as a patient but sort of sit around on the sidelines, which is certainly not my vision of what I would want as a patient, and was not the vision for Anil at the time. And frankly, it isn’t the vision for anybody who’s facing a serious illness or trying to help a loved one through a serious illness.
[00:05:23] Deven: If you don’t have access to all that data, you can’t. You’re nowhere. So that’s really what motivated me. I felt like I knew a lot about what rights patients have to access their data, in part because when I had been in the government, I had done a lot to issue guidance to try to make it more clear to patients that their rights are quite expansive, that at the time they were far more than what people generally thought they had the right to. There was a lot of non-compliance out there, that’s for sure. So it motivated me to come to Citizen way back in 2017, and it still motivates me to be with the company to this day.
[00:06:02] Brendan: So, related to that point, patients for many years have been told that they have a right to this data, and they do, they legally do have that right. But a lot of practical barriers have been in the way of them exercising that right. So why is that patient access issue such a hard problem to solve in practice?
[00:06:19] Deven: Oh Lord. First, I wanna say that we definitely have come a long way. We, it used to be the case that patients would be told the exact opposite of what their legal rights are. They would be told, you don’t have the right to that data. Or they would be told you have the right to some of that data, but you don’t have the right to my notes.
[00:06:36] Deven: Doctor’s notes, those are mine. None of that is true. You know, the truth is you have the right to every little bit of information that’s generated about your care, but getting over that sort of cultural hurdle was a big one. Now what we have are obstacles that are about your ability to access that data virtually without you having to go into an office and go in person to get it.
[00:07:00] Deven: Identity is one of the big challenges to solve. How can you essentially present yourself for your medical records by, you know, working through an application or a platform like Citizen to be able to get that data. And there are processes that can help to ID proof you online. It does require a presentation of a government-issued photo ID that hasn’t expired.
[00:07:26] Deven: But there is a way to do it. If you don’t have one of those pieces of identification, your challenges still can be fairly steep. And then let’s say you get through that hurdle, then the other hurdle is matching you to the right data. The fact that you’re identity proofed
[00:07:44] helps to match you to the right data, ’cause they’re getting your name right. You’re getting some pieces of information that are present in your identification that can help match you to the right record, but it’s not a foregone conclusion. You know, some of us move, and the official address that’s on our identification from the state is not the official address anymore.
[00:08:05] Deven: It’s actually something that happened to me. I moved a year ago. My current address is not actually the address on my driver’s license. I don’t wanna have to go in and get another driver’s license until that thing expires. At any rate, you know, various obstacles around matching you to your records.
[00:08:22] Deven: Fortunately for me, even though I do have an address change, at least I have a somewhat uncommon name. My first name, Deven, spelled D-E-V-E-N, not I-N, not O-N, or various other more common spellings. My name together with my birthdate tends to yield a unique match upon request. But, you know, we still have issues for people with more common names, or lots of questions on the part of the healthcare systems that we’re asking to produce data for these patients.
[00:08:55] Deven: Like, even when you have a unique match in accordance with demographics, are you a hundred percent certain that this is the right person? There are still some questions that come up from time to time, and, you know, some sludge in the pipes around requests. Or maybe the preference of the healthcare system is to deliver the data that you’re requesting through your portal.
[00:09:19] Deven: You gotta remember your username and password, and maybe you have multiple portals and you gotta try to connect an app or a service like ours to multiple portals and sustain that connection over time. A lot of times that connection has to be refreshed and then you gotta remember your username and password. Or your healthcare system changes the electronic medical record vendor that they’re working with
[00:09:40] and then you gotta do all those reconnections again. So it’s better than it used to be, but there’s still sludge in the pipes. And, you know, working to make that as seamless as possible while still maintaining the security of systems, you don’t want people to be able to spoof you and grab all your medical records,
[00:09:58] that would be a problem too. You know, there’s always a balance between what some people call beneficial friction, that’s designed to make sure that you are who you’re saying you are, and the kind of friction that actually leaves you running, screaming, down the hallway because you really need your records and you can’t remember your password. And the last five passwords are not working or not acceptable, and you have to reset, and you have to do it again and again and again over multiple places.
[00:10:27] Brendan: Yeah, I can definitely relate to that.
[00:10:30] Deven: Many people can.
[00:10:32] Brendan: And I think you have a very interesting vantage point on this issue too, because you’ve obviously seen it in the past number of years from the private sector perspective, but you also spent a lot of time in government. So can you speak a little bit about your time, specifically at the Office for Civil Rights, or other work that you did in government, that really helped to shift how people thought about the right to their data?
[00:10:53] Deven: Yeah, so I was very fortunate when I was at the Office for Civil Rights that improving the ability of an individual to get access to their data was an administration priority. It still is, which is great, but it was for the first time, in terms of really making it a priority. Again, it’s always been a very important part of the HIPAA privacy rule.
[00:11:18] The ability of an individual to access their health information, and to get access to all of it, has been part of the HIPAA privacy rule since its inception, so more than 23 years, if I think about when that rule actually went into effect and was enforceable. And yet, God, for a very long time, it was one of the top five categories of complaints that the Office for Civil Rights would get from patients
[00:11:47] Deven: in terms of their HIPAA rights being violated. And it wasn’t really being very aggressively enforced. And that was actually true when I was there too. But the one thing we were able to do was to issue comprehensive guidance. Lots of FAQs, like, Can you be required to come in person to get your records?
[00:12:05] No. That would actually be a violation of your HIPAA rights. Prior to that guidance being issued, and now being more actively enforced, lots of patients were told, yeah, we’re not going to give that to you unless you come in person. That’s a hassle for people who are trying to get a record.
[00:12:22] There’s still room for improvement, don’t get me wrong, but lots of stuff that we did: Can you get your records by email? Yeah, actually, if that’s the way you want ’em, and you understand that sometimes email is unsecure. As long as that’s the way that you want it, then the healthcare institution on the other end of that transaction has to give that to you.
[00:12:43] Deven: That was something that we did a lot on in terms of issuing guidance. And it was something that I spoke a lot at conferences on and there were lots of organizations and healthcare institutions and health plans that had to do a fair amount to get their processes up to speed.
[00:13:01] Deven: Because what often happens is there’s this dynamic between the regulator and the regulated industry. And until that, until guidance comes out and really starts to seep in, regulated industry is gonna be fairly conservative in terms of how they interpret what their, legal expectations are.
[00:13:19] And particularly where you’re talking about data going out of the institution’s doors and into the hands of someone else, they’re likely to take a much more conservative posture about that. We see that all over the place. We see it also in terms of getting data to patients. How do I know this is the right patient?
[00:13:38] How do I know the patient has really hired this app? You know, there is an understandable reluctance to let data out the door. But when you’re talking about the patient and their right to get that data, some of that natural reluctance has to give way to making a process that’s easy. And that kind of balance was what we were looking for when we were regulators.
[00:14:01] And frankly, it’s also what you kind of expect on the patient side, too. But of course, now that I’m out on the patient side, I’m definitely leaning into making it easier where we can. But I guess it helps that I was on the side of regulators and actually counseled a number of organizations on how to comply with HIPAA.
[00:14:21] I understand where they’re coming from. I’m trying to figure out how we meet people in the middle. A process that, again, maybe has a little bit of friction, so that it’s not so easy that people’s records can fall into the wrong hands through spoofing and other mechanisms.
[00:14:38] Brendan: Yeah, it makes sense. Fast forward to today and your work at Citizen. I wanna make sure that we highlight some of the work that you are doing there. How is the product that you’ve developed and been involved with changing the paradigm for patients that really want to use the data, and not just download it or, you know, get it via email?
[00:14:55] Deven: Yeah, how do we get out of the digital shoebox of data and into something that’s actually useful for people? For a number of years, when Citizen Health first started, we were mostly a digital shoebox for folks, with the added benefit of giving people the opportunity to share de-identified data from their records for research purposes if they wanted to. And for the populations that we were serving, and still are serving, frankly, patients with rare diseases, patients with very complex conditions, this is of high interest, because their diseases are often poorly understood and the treatments are not ideal, so there’s very strong interest in this. But we still, you know, previously weren’t necessarily giving people a lot of use out of their records.
[00:15:44] We have developed, and are still refining, what we’re calling the AI Advocate, which allows patients to actually learn from their medical records, have conversations with their medical records, talk to their medical records, get information out of their records that they can actually use, and to do that really easily through asking questions in a chat or through a voice-activated chat interface.
[00:16:07] When was the last time I had a tetanus shot? What are the seizure medications that my child has been on? We have a lot of patients using the Citizen platform who have children whose disorders are characterized by seizures. What are some of the side effects that have been reported? I’m going to have a conversation with a new doctor,
[00:16:25] what are some of the things that I should highlight? These kinds of real questions, asked of your records, with real responses back. So that the medical records are actually useful to patients and not, again, just this digital shoebox of data that you have, and that it’s important for you to have, but that it’s hard for you to get information out of. This is thousands of pages
[00:16:48] of records for, again, most of the patients that we serve, who see two to five medical providers and have very complex conditions, where there’s just a lot of data that they have access to and that’s under their control. Let’s make that data work for them, and let’s use and continue to refine the tools of artificial intelligence for this purpose.
[00:17:09] We foresee, and not too long in the future, that we will be able to help patients not only get data and insights from their own records, but actually be able to learn from communities of patients like them. What other seizure drugs are kids with my particular disorder, or similar disorders, on? Maybe the medication that my kid is on is not working as well as something else might.
[00:17:34] Deven: And I’d like to be able to go into my doctor’s office armed with more information to ask about some of these other seizure medications or the medication’s working, but has a lot of side effects that are not great. Maybe I can find one that works just as well, but that I tolerate or that my child tolerates better. Learning from other
[00:17:53] patients who are going through the same thing that you’re going through can be very, very powerful. So first, let patients get information outta their own records, then let them get some insights back and learn from what other patients are going through. Not identifiable data, not without your consent.
[00:18:12] We would never share a medical record from one patient to another, but aggregate insights from the aggregated data of a community can be very powerful. If you think about how patients today go on social media chat groups and share data, share insights about what’s happening, and really wanna learn about what’s happening to other people, we’re taking that concept and really supersizing it with data that ultimately comes right from medical records.
[00:18:44] Brendan: So if I’m a startup leader listening right now, there are a lot of issues that you brought up, and a lot of gray areas of regulation and different expectations from parties for how things should work. What’s some advice that you’d offer a startup leader, or someone building in this space, to be responsible but progressive about how they think about these things, even when the regulations feel outdated or unclear?
[00:19:06] Deven: I think there’s a couple of sets of regulations, at a minimum, that you gotta be thinking of. And depending on your business model, you might have a whole host of others. But the two that come to mind that we face are, number one: is your AI chatbot a medical device that could potentially need to be subjected to Food and Drug Administration approval in the US? Because the kind of insights and information that you are delivering to patients is something where, if you get this wrong,
[00:19:42] Deven: there could be serious healthcare complications that would result. We are a tool that is patient-facing. We are giving patients information that we tell them they should take to their medical provider. In most circumstances, patients can’t diagnose themselves, but there’s a lot more that patients can do with online prescribing and things of that nature more than they could do in the past.
[00:20:07] So we are definitely mindful of making sure that the quality of the output of our AI is high, and that we are sticking with the types of questions that are within the lane of providing insights to patients that they can use, but not to diagnose themselves or to take on activities by themselves that could subject them to harm.
[00:20:34] So that’s one set of issues. If you are a startup in the space and you’re working with AI, and it’s intended to inform medical care, that’s definitely something that you have to think about, and it’s not something that you can just disclaim your way out of, right? Oh, you know, we never intended for you to use this for medical care.
[00:20:51] Yeah. That is just one set of facts that the regulators will take into consideration. What is this thing actually doing? What kind of information is it providing? And to what extent should this be subject to additional regulation, because you could really harm somebody if it’s not accurate?
[00:21:08] And then the second set is the privacy issues around the data. Do patients know where these insights are coming from? Do they understand what rights they have to opt out of things, or to consent to be included in them? If you’re using someone’s data to train an AI algorithm, is it also training somebody else’s algorithm, and is that part of your business model?
[00:21:32] All of this is something that you need to think about. We are part of the CARIN Alliance, which is an alliance of apps that are intended to work on behalf of patients, help patients get their health information, learn from their health information. And we subscribe to the CARIN code of conduct around data governance for patient-facing apps, because most of those apps are outside of HIPAA,
[00:21:56] and that would include us. How do we raise the bar for how we’re all conducting ourselves with respect to data governance? And we’re just now making some tweaks to that code of conduct, actually, to accommodate issues like AI and transparency, and when should people be allowed to opt out, or be required to give prior consent, before their data is used for certain purposes.
[00:22:18] So those are just two areas to consider. Obviously, if you’ve got a business model around health plans paying for things that get generated out of the AI, you might have some other issues that you have to consider. It’s a regulated space. You can’t go in with blinders on.
[00:22:37] Brendan: Yeah, and kind of on that point, obviously HIPAA was written, and really thought about deeply, a long time before AI and the current generation of health apps were part of everyday care or things that patients were using. So how do you think about balancing protecting patient privacy while still ensuring that apps can get the data they need, and that there’s liquidity to enable the next generation of innovation?
[00:22:59] Deven: Yeah, you know, a lot of times the questions come in this sort of very broad way: how do we modernize HIPAA for a modern era? And I think it helps to remember, this may seem old hat, but HIPAA is a use-case-driven set of rules, right?
[00:23:20] We have different rules for how you can use and share data when it’s for research. We have a lighter set of rules when you’re talking about using and sharing data for treatment, because we wanna make sure that data flows easily for treatment, but we allow patients to have more choice
[00:23:37] in many instances around research. I think that’s the right framing when you think about what the new use cases look like: where do we require patient consent? When should it be opt in, and when should it be opt out? When are patients just better served by transparency, so they can make the right choices about which apps they use in terms of their policies?
[00:24:03] I’ll just give you an example. You know, most privacy policies that I read are policies that apply only to identifiable data. Which means, if your data is de-identified, sufficiently stripped of identifiers that it’s considered to be anonymized, their privacy policy doesn’t even cover it, and you don’t have any transparency at all about what happens to your data.
[00:24:25] We do have transparency about de-identified data in our privacy policy. We don’t share data with a third party, even if it’s de-identified, unless the patient has consented. But I think it’s really hard for patients to figure that out in privacy policies, and it makes it even more important, I think, for the regulations to try to keep up with where all that is. Much less the fact that HIPAA was a statute that was enacted in 1996, and the regulations were finalized in 2002, 2003.
[00:24:56] That’s old stuff. We’ve got a lot more permutations in healthcare and many different types of actors, many of which wouldn’t be covered by those current laws. So this is one where the regulators can’t fix it. Congress has to decide. We need to make sure that our privacy laws cover all the different permutations of the actors that are out there today.
[00:25:19] And then, again, a use-case-driven approach, so that you make sure that you get the right balance: the ability to have innovative tools, but ones that people trust, and trust for good reason, because their data’s not being abused.
[00:25:35] Brendan: You’re inspiring me to open up some of my healthcare apps and check out the privacy policies afterwards, or if not read them, feed them into ChatGPT and see what AI has to tell me about them.
[00:25:43] Deven: You should. And because AI only works as well as the prompts, ask in particular: what about de-identified or pseudonymized or anonymized data? Is it covered by the policy? Chances are it’s not, but it may be. It should be, but yeah.
[00:26:07] Brendan: Yeah, that’s some good advice. Obviously, in recent years, with your work at Citizen, you’ve gotten a lot of exposure and involvement in shaping corporate cultures around some of these issues. And on the company side, you’ve talked about how the best compliance cultures are ones where people feel safe raising problems early and reporting issues that they see.
[00:26:25] Brendan: So what’s some advice that you have for leaders that are trying to build that environment within startups or other types of companies?
[00:26:32] Deven: I don’t know that there’s any magic to it. People have to be able to trust that they can come to you and not lose their jobs. Even though there is this sense that you wanna have a set of policies that strictly tell people, these are the rules, and if you violate them, it could possibly be grounds for your dismissal, ’cause you want people to take this stuff seriously.
[00:27:00] But there’s obviously a line between folks being intentionally abusive, you know, looking at their neighbor’s data, that should be grounds for you’re out, and: oh, I discovered something. I discovered we have a glitch and it’s potentially gonna reveal data online. Obviously, that actually should have a reward attached to it, right?
[00:27:26] You know, a bug bounty of some sort. You found it, we’re gonna get this fixed, we’re serious about it. Or, a lot of times, mistakes happen ’cause people are going too fast. They were overly dependent on automation for a certain task, and it didn’t get sufficiently tested to make sure that it continues not to expose data in ways that violate your policy.
[00:27:50] You really have to not punish the reporter of that, and figure out what are the systemic problems that might be causing us to make a human error, or might be causing us to miss things. Then it’s easier to get it fixed. And then people feel like they can come to you even with the human errors, because most of the time human errors are also systemic problems. It doesn’t feel like rocket science to me, but I guess I have worked in organizations before where people were so afraid of legal counsel that they just didn’t want them to know about anything that was happening, because they were risking their jobs, or just being yelled at or treated badly. Treated like they were incompetent or idiots, because that was just the corporate culture.
[00:28:42] I don’t think that’s helpful. If you hire good people, treat them like the good people they are, including allowing them to tell you quickly whether they’ve made a mistake, and then working to figure out why that happened. What can we put into place to help you do your job better, so that this doesn’t happen again?
[00:29:02] Brendan: Deven, when you think about the future of patient data, and some of the unlocking of that data that we’re now seeing become more commonplace, especially in the era of AI, what excites you the most about some of the possibilities that will come with that increased data liquidity?
[00:29:18] Deven: I think it’s super exciting. If I think about even the mundane things that I use AI for today that have given me hours back, if not days back in my schedule, because they’re able to take on some tasks that would’ve taken me a really long time to do, and they do it so easily, I’m just amazed.
[00:29:40] You know, when I think about some of the advances in radiology, for example, where the AI is doing as well as, or better than, the human at looking at images and detecting anomalies or things that need to be detected. On some level, humans can’t ever be fully replaced, but they can be made better, because we’re using machines to help us improve and get over the human error, get over the human bias, assuming that we’ve actually not trained the machines to be biased, which is an issue, right? It’s not an either-or. It’s a both-and kind of issue. And the fact that we have a healthcare system that only gets people the right care like 50% of the time, it still is plagued with human error.
[00:30:31] That still costs too much, and doesn’t give us the outcomes that other countries with our level of resources enjoy. I think, of course, we should look at every possible tool we have to fix it. Be mindful of where the weaknesses of over-reliance on machines can bring us, but embrace the fact that we can use machines to get better, and for patients, that they can use machines and their data to be really present in their care, to have shared decision-making with their clinical providers, and to look for things that might help them that maybe their clinical providers aren’t aware of. You know, again, I don’t know that we’re ever gonna get to the place of someone with a really serious illness being able to treat themselves, and I don’t know that we ever want to get there, but an active, smart patient who isn’t sitting around waiting is where we’re headed. In fact, we have it already in some pockets, but increasingly that person will be more the norm than the outlier.
[00:31:40] Brendan: Yeah, I think that’s a really interesting perspective, because so much of the investment and hype around using AI in healthcare is really focused on the healthcare provider, empowering them and making them more productive, or at least that’s what it’s focused on in my world.
[00:31:52] Brendan: But the patient perspective is really great too, right? Because you’re leveling the playing field for people who would otherwise not have access to the kind of information or resources that would make them better patients, essentially.
[00:32:05] Deven: Yeah. Yeah. With AI, patients don’t have to be patient. They don’t have to wait
[00:32:09] Brendan: Love that.
[00:32:10] Deven: for someone to come to them. They can actually be better prepared. They can be proactive, whether that’s for self-care or whether that’s for actually advancing medical knowledge and looking for the best possible treatments for themselves.
[00:32:25] Brendan: Deven, if you could fast-track one regulatory change that would make the biggest impact in this space, what would that be? What’s top of mind for you about things that you would try to fix?
[00:32:35] Deven: Oh God, my list is really long.
[00:32:39] Brendan: I know I said one thing, but you can give us a couple things if you want.
[00:32:41] Deven: I do think it would be helpful if we did have a HIPAA extension to personal health record platforms like ours. We get a lot of questions about why we’re not HIPAA covered, even from our own users. And I don’t have the choice to be HIPAA covered. I can contractually say that I will abide by HIPAA.
[00:33:02] Deven: That’s what’s required to participate in the Trusted Exchange Framework and Common Agreement voluntary network. Sure. But at the end of the day, there is a divide. And that creates this mistrust also on the part of the data holders that want to help patients get their data, but they’re like, why would we encourage a patient to use a tool that’s not even HIPAA covered?
[00:33:27] Deven: We’re long past the point where we should have that divide. And frankly, it’s something that California did to fix their state version of HIPAA years ago; I think it’s been about 10 years since they did that. So, we need Congress to do it.
[00:33:43] Deven: It’s not something that anyone in the administration can do with the stroke of a pen or with a federal register notice. So, I would definitely fix that. The other ones are not minor, but they’re incremental fixes that are part of that sludge in the pipes. Can we get it a little smoother?
[00:34:02] Deven: Can we take some of that sludge out? Smaller things that are a little bit more in the weeds, as opposed to expanding HIPAA to cover personal health applications that draw data out of medical records at the request of a patient, which would be huge.
[00:34:19] Brendan: So for the innovators that we have listening to this conversation, what’s a smart solution that you would advise them to act upon now if they want to be ahead of the curve in privacy and data governance?
[00:34:29] Deven: Number one, do a TL;DR, too long, didn’t read, version of your privacy policy and put it right up front. Something that people can read and get the general gist of, because nobody reads those whole privacy policies; even I don’t read them. But it’s really important for the patient to understand what they’re getting into, so put it right up front. Look into signing the CARIN Alliance Code of Conduct.
[00:34:54] Deven: It’s a little bit of free advertising there, but it’s a voluntary code of conduct that is an attempt for those of us in the space to level up w here the expected behavior should be. And frankly, anybody who wants to agree to that can sign it. So why wouldn’t you, why wouldn’t you ascribe to the highest level of trust?
[00:35:14] Deven: And if you think it’s not enough, then go ahead, sign it, and then say, we do better, and explain how you do. It’s pretty standard, and that includes being transparent with people about de-identified data, because it’s not something that you see enough of out there.
[00:35:31] Deven: And then, don’t wait until you’re a year down the road in developing your product to bring in somebody good on the compliance side. ’cause it’s not just about making sure that you’re not gonna get into trouble. It’s also about opportunity. Again, the right of patients to get their data is opportunity for those of us in that space.
[00:35:52] Deven: It’s not always a regulatory, ugh, I gotta worry about all these regulations and check the box and all this stuff, right? It’s almost always the case that with any regulatory regime, there’s both pain around the compliance and opportunity. And so get somebody in quickly when you start so that you’re not missing the opportunities and you’re not having to sort of rework your product to fix a compliance issue that frankly was there all along.
[00:36:21] Deven: And the fact that you didn’t pay attention to it means you’ve got a whole lot of work to undo what you did and fix it.
[00:36:29] Brendan: So I’m gonna close this out here with a quick lightning round question. What’s the biggest privacy challenge that you think we’re gonna face in this industry in the next five years?
[00:36:38] Deven: De-identification of data. So you know, we have a standard under HIPAA that, while only HIPAA entities have to use it, is actually one of the few standards that exist out there for people to point to. You know, it’s either the safe harbor version of de-identifying data, which requires you to remove certain types of data, or you’re using what’s called an expert opinion.
[00:37:03] Deven: You’re hiring a learned statistician who uses statistical disclosure techniques to try to reduce the risk that the data can be re-identified. The expert opinion, obviously, migrates with learning. Safe Harbor is old; it was established back in 2003.
[00:37:25] Deven: I mean, that’s when it was finalized, actually; it was established before that. So, the likelihood that something that was established in 2003 still actually works to protect data from being re-identified just doesn’t make a heck of a lot of sense. Again, if you hire an expert, then that person’s gonna be aware of sort of
[00:37:48] Deven: how the field has evolved, what data the potential recipient is likely to have access to that they could use to re-identify, what’s their motive for re-identification? There’s a whole lot of context that goes into that, that is obviously gonna reflect modern times. The safe harbor just says, oh, just remove these 18 categories of data and you’re home free. And we’re living in a world where there’s a lot more data. And we’re in an AI world, where the ability to combine data points across multiple data sets is even faster and more powerful than it was before. And yet we treat de-identified data as though it’s the holy grail, and it doesn’t get regulated once it’s de-identified, which is why you don’t see it in the privacy policy
[00:38:30] Deven: ’cause it’s not part of any privacy laws that it has to be included; they only apply to identifiable data. So it’s the boiling water that’s almost to the point of bubbling over. It’s maybe not there yet, but it’s gonna get there. And maybe, frankly, it already is, and there’s re-identification happening that we’re just a lot less aware of.
[00:38:52] Brendan: Deven, this has been a fascinating conversation. You’ve shown us that real smart solutions in privacy aren’t just about avoiding fines or enforcement; they’re about empowering patients, really designing for the future, and building cultures that treat compliance as part of the innovation.
[00:39:07] Brendan: So thanks again for joining us and sharing that perspective.
[00:39:11] Deven: Thank you for having me.
[00:39:12] Brendan: Yeah, to our listeners, I hope you’ve gained valuable insight into how to turn regulatory challenges into opportunities for patient empowerment. This has been Hard Problems, Smart Solutions, the Newfire Podcast. Until next time.
Chapters
00:00 Introduction to Compliance in Product Development
00:29 Meet Deven McGraw: A Leader in Healthcare Privacy
01:38 Deven’s Journey into Healthcare Privacy
03:16 The Founding Story of Citizen Health
06:03 Challenges in Patient Data Access
10:41 Government’s Role in Shaping Data Access
14:39 Citizen Health’s AI Advocate
18:45 Advice for Startups in Healthcare
22:59 Balancing Privacy and Innovation
26:09 Building a Culture of Compliance
29:03 Future of Patient Data and AI
32:25 Closing Thoughts and Regulatory Changes
36:29 Final Remarks and Podcast Conclusion
Much more than a compliance issue, patient data access should be (but often isn’t) seen as the foundation of trust in digital health. As technology evolves faster than regulation, many organizations still struggle to balance innovation with privacy, governance, and user empowerment.
There have been numerous efforts over the years, many still ongoing, to drive better alignment across stakeholders to properly liberate patient data while ensuring privacy protections. Today, we get a bit of a history lesson on how much of that legacy still impacts us, as accurate data becomes ever more important and more valuable.
In this episode of “Hard Problems, Smart Solutions,” host Brendan Iglehart, Staff Healthcare Architect at Newfire, speaks with Deven McGraw, Chief Regulatory and Privacy Officer at Citizen Health and one of the nation’s leading voices on patient privacy, data access, and healthcare regulation. Together, they discuss how forward-thinking companies can turn data rights, compliance, and AI regulation into opportunities for product differentiation and meaningful patient impact.
Listeners will gain perspective on how to:
- Overcome the “sludge in the pipes” that still blocks seamless patient data access.
- Design AI-driven tools that empower patients without crossing regulatory lines.
- Build compliance cultures that encourage transparency instead of fear.
- Anticipate privacy challenges that could redefine the next five years of digital health.
- Turn regulation from a constraint into a strategic advantage for innovation.
Building systems that truly serve patients means treating privacy and access not as competing priorities, but as complementary drivers of better care. This conversation shines a light on how thoughtful design and sound governance can make that possible.
Ready to hear how the next era of digital health will balance innovation, privacy, and patient empowerment? Tune in now.