Your Medical Records v Metaverse

An investigation by The Markup, reported in Stat News, found that 33 of the top 100 hospitals in the U.S. have been sharing patient medical data with Meta, Facebook's parent company. While this seems to fly in the face of HIPAA and would appear clearly illegal… it might not be. Last year, the rules governing medical records on third-party digital platforms were revised under the 21st Century Cures Act, part of an overhaul meant to modernize how hospitals operate, because it does seem archaic to be using fax machines in 2022. The new rule, called the Information Blocking Rule, was made intentionally obfuscating and difficult to understand.

The simplest definition goes something like this: the rule is a major change for patients, providers, and payors because it makes patient health information readily accessible by, and portable with, the patient, creating a sense of shared ownership and responsibility. It pushes health information access and sharing into the digital world by encouraging health IT developers to build smartphone apps. Healthcare organizations must connect with third-party apps via certified application programming interfaces (APIs) to comply with patient requests for data sharing, or cite an exception.

Long story short: once a patient moves their records onto a third-party tech platform, HIPAA protections no longer apply to that data. When the rule took effect in April 2021, Deven McGraw told Fierce Healthcare: "For the vast majority of patients, it will take a while for it to sink in. I think this will build over time. There will not be thousands of patients banging on their doors to get their data."
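To make the "certified API" piece concrete, here is a rough sketch of the kind of request a third-party app might make for a patient record over a FHIR-style API (the standard these certified interfaces are generally built on). The hospital URL, patient ID, and token below are made-up placeholders, and a real exchange also involves an OAuth consent flow; this only shows the shape of the request:

```python
from urllib.parse import urlparse


def build_fhir_request(base_url: str, patient_id: str, token: str) -> dict:
    """Assemble an HTTP GET request (as a plain dict) for a FHIR Patient resource.

    Hypothetical sketch: base_url, patient_id, and token are placeholders,
    not a real hospital endpoint or credential.
    """
    return {
        "method": "GET",
        "url": f"{base_url}/Patient/{patient_id}",
        "headers": {
            "Authorization": f"Bearer {token}",  # OAuth2 access token
            "Accept": "application/fhir+json",   # ask for FHIR's JSON format
        },
    }


req = build_fhir_request("https://fhir.example-hospital.org", "12345", "TOKEN")
print(req["url"])  # https://fhir.example-hospital.org/Patient/12345
assert urlparse(req["url"]).scheme == "https"  # transport is encrypted, at least
```

The point of the rule is that once an app like this receives the response body, the data lives on the app's servers under the app's privacy policy, not under HIPAA.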


Now there is. Meta makes the Pixel, a free tracking tool embedded in a huge number of websites. MyChart, a widely used patient portal, includes the Pixel, which is how clicking the "book appointment" button sends data to Meta. The data is not encrypted, and at this time it appears Facebook can match the patient information to individual Facebook profiles. At least 26 million patients have had their data shared, and that's just within the 33 hospitals we know about. A spokesperson for Meta responded: "If Meta's signals filtering systems detect that a business is sending potentially sensitive health data from their app or website through their use of Meta Business Tools, which in some cases can happen in error, that potentially sensitive data will be removed before it can be stored in our ads systems." Oops. It's unclear whether Meta's goal was advertising, training prediction algorithms, or profiting in some other way. The Markup found that details such as a plan to terminate a pregnancy, sexual orientation, medications, dosages, and appointment schedules were all shared. Glenn Cohen, faculty director of Harvard Law School's Petrie-Flom Center for Health Law Policy, Biotechnology, and Bioethics, told Stat: "Even if perhaps there's something in the legal architecture that permits this to be lawful, it's totally outside the expectations of what patients think the health privacy laws are doing for them." I wonder what Amazon Pharmacy thinks.

Can Anything Be Done?

Unfortunately, the best action to take is also the most boring one: call your local lawmakers and push for data protection.
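For the curious, here is a minimal sketch of how a tracking pixel like the one described above typically reports a click: a tiny script on the page assembles a GET request to the tracker's collection endpoint, with the event name and the current page URL packed into query parameters. The endpoint and parameter names below are invented for illustration, modeled on common pixel designs rather than Meta's actual ones:

```python
from urllib.parse import urlencode


def pixel_request_url(pixel_id: str, event: str, page_url: str) -> str:
    """Compose the GET request a tracking pixel typically fires on a click.

    Illustrative only: the collector domain and parameter names are
    hypothetical, not Meta's real endpoint.
    """
    params = {
        "id": pixel_id,    # identifies which site's pixel fired
        "ev": event,       # the event, e.g. a "book appointment" click
        "dl": page_url,    # the page URL, which may itself reveal the clinic
    }
    return "https://collector.example.com/tr?" + urlencode(params)


print(pixel_request_url("1234567890", "ScheduleAppointment",
                        "https://mychart.example-hospital.org/scheduling"))
```

Notice that even this toy version leaks plenty: the event name and the page URL alone can reveal that a specific browser, on a specific portal, tried to book a specific kind of appointment.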