Nomadic, but also Trustworthy Data: The Next Step for Progress in Healthcare (revisited).
Or What Data Really Requires Beyond Interoperability: Governance and Trust
Let’s face it: historically, data exchange has had a bad reputation
Data exchange has faced skepticism time and again, mostly due to past issues with consumer platforms (especially social media) mishandling user data.
Historically, these platforms have been criticized for poor data privacy practices, including unauthorized data sharing, insufficient security measures, and a lack of transparency about data use.
High-profile data breaches and scandals, like the Cambridge Analytica incident, have fueled public distrust by showing how personal information can be exploited for commercial gain without consent.
This skepticism naturally extended to healthcare, where privacy and security concerns are even more acute. Beyond the claims of ownership by private organizations - often fueled by proprietary or competitive interests - the core issue for patients remained simple: who can access my medical files, and under what conditions and for what reasons?
To overcome this skepticism, the healthcare industry must demonstrate robust data protection measures, transparency, and a commitment to ethical practices, ensuring that data exchange is handled responsibly and with the utmost respect for patient privacy.
That data can be both mobile and safe is a case we must make not just to patients, but also to providers. Across industries, consumers are increasingly choosing products and services based on qualities like trust, openness, and AI-readiness.
The complexity of the healthcare system is profound. It is characterized by a myriad of stakeholders - hospitals, clinics, insurance companies, regulatory bodies, all operating on disparate systems and standards.
This fragmentation leads to delays at a moment ripe for breakthrough medicine. Redundant tests and delays in accessing critical information still prevail, often due to outdated infrastructure, bureaucratic bottlenecks and, yes, resistance to change - the usual “This is how it’s been done so far.”
Data liquidity means ensuring patient information can flow seamlessly and accurately between different systems and providers. This requires technical interoperability and adherence to regulations such as HIPAA, GDPR and others.
Implementing secure APIs and health information exchanges (HIEs) can help facilitate this liquidity by enabling systems to communicate effectively.
The goal is to create an integrated and responsive health system where data is readily accessible to healthcare professionals, improving care coordination, reducing administrative burdens, and ultimately enhancing patient outcomes.
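As a concrete illustration of the secure APIs mentioned above, the sketch below builds a request for a patient record against a FHIR-style REST endpoint. The base URL and token are hypothetical placeholders; the `Patient/{id}` resource path and the `application/fhir+json` media type follow the HL7 FHIR RESTful API conventions.

```python
def build_fhir_request(base_url: str, patient_id: str, token: str) -> dict:
    """Describe an HTTP request for fetching a patient's record
    from a FHIR-style REST endpoint (sketch, not a full client)."""
    return {
        "method": "GET",
        "url": f"{base_url}/Patient/{patient_id}",
        "headers": {
            "Authorization": f"Bearer {token}",   # OAuth2 bearer token
            "Accept": "application/fhir+json",    # FHIR JSON representation
        },
    }

# Hypothetical HIE endpoint and credentials, for illustration only
req = build_fhir_request("https://hie.example.org/fhir", "12345", "token-abc")
```

A real integration would layer consent checks and audit logging on top of such a call, but the shape of the exchange stays this simple.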
But today it’s no longer enough to simply move data between systems. The challenge is to ensure that health information can flow securely, transparently, and purposefully, giving patients confidence that their data is both mobile and trusted.
Comprehensive data paints a full picture of a patient's health journey.
While social media platforms use data to personalize content feeds, ensuring engagement, healthcare systems can use data to improve care coordination and adherence to treatment.
Data truly shines when it’s dynamic and interactive rather than static and isolated. While individual data points can be limited on their own, their value increases significantly when they can move and integrate across different systems. Data becomes more meaningful when it travels, connects with other information, and is contextualized.
To effectively handle nomadic data, we need a few key elements:
Data Transmission: Reliable methods to move data between systems smoothly.
Path Selection: Smart routing to find the best path for data, reducing delays.
Data Integrity: Ensuring data remains accurate and uncorrupted during transfer.
Support for Different Formats and Protocols: Compatibility with various data types and communication standards.
Tailored Data Routing: Custom routing to direct data based on specific needs or preferences.
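Two of the elements above, path selection and data integrity, can be sketched in a few lines. The route names and latency figures are invented for illustration; the integrity check uses a standard SHA-256 fingerprint computed before and after transfer.

```python
import hashlib

def checksum(payload: bytes) -> str:
    # Data integrity: fingerprint the payload so the receiver can
    # verify it arrived uncorrupted
    return hashlib.sha256(payload).hexdigest()

def select_path(routes: dict[str, float]) -> str:
    # Path selection: pick the route with the lowest estimated latency (ms)
    return min(routes, key=routes.get)

record = b'{"resourceType": "Observation", "value": 98.6}'
digest_before = checksum(record)

# Hypothetical routes between two health systems
route = select_path({"direct-vpn": 12.0, "hie-relay": 45.0, "cloud-broker": 30.0})

# After transfer, the receiver recomputes the checksum to verify integrity
assert checksum(record) == digest_before
```

Production systems would use signed digests and transport-level encryption rather than bare hashes, but the principle is the same: the data carries its own proof of intactness.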
For health tech developers and innovators, this means evolving software products to meet these new demands. Rather than creating another monolithic system, we should design flexible, adaptive, modular solutions. To extend the metaphor of fluid data, think of an underwater channel with multiple connection points. This approach facilitates faster, seamless integration, bridging gaps rather than building barriers.
From Fluid Data to Trusted Nomadic Data
Data movement alone is not enough. Just as water can flow freely yet still pollute an ecosystem if uncontrolled, nomadic data needs rules, oversight, and purpose - and in healthcare, these rules are essential. The next step is to design nomadic data that is governed and, especially, ready for scientific and clinical use.
This means asking hard questions early:
Who owns the data? Patients are the moral owners, but multiple stakeholders - clinicians, payers, researchers - should also have legitimate roles in stewardship.
Patient agency is central here. Systems should be designed to allow multiple stakeholders access at the patient’s direction. Using APIs, patients could actively manage connections to their data: for example, authorizing Cigna to use certain records for reimbursement, while simultaneously sending their fresh MRI and medical report to the most appropriate clinical trial site in Germany. Or instead, they could delegate this role to an AI-driven agent that manages permissions, routes data to authorized stakeholders, and ensures compliance with consent and privacy requirements.
Who can access it? Access must be granular, revocable, and aligned with the patient’s consent and purpose.
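The granular, revocable, purpose-bound permissions described above could be sketched as a small consent ledger. This is a minimal illustration, not a standard; the resource names and the trial-site label are hypothetical (Cigna appears as the payer example from the text).

```python
from dataclasses import dataclass, field

@dataclass
class Grant:
    stakeholder: str   # e.g. a payer or a clinical trial site
    resource: str      # e.g. a specific imaging study or claims file
    purpose: str       # e.g. "reimbursement", "trial-screening"

@dataclass
class ConsentLedger:
    grants: list = field(default_factory=list)

    def authorize(self, stakeholder, resource, purpose):
        # Patient (or their delegated agent) grants purpose-bound access
        self.grants.append(Grant(stakeholder, resource, purpose))

    def revoke(self, stakeholder, resource):
        # Revocation removes the grant regardless of purpose
        self.grants = [g for g in self.grants
                       if not (g.stakeholder == stakeholder and g.resource == resource)]

    def allowed(self, stakeholder, resource, purpose) -> bool:
        return any(g.stakeholder == stakeholder and g.resource == resource
                   and g.purpose == purpose for g in self.grants)

ledger = ConsentLedger()
ledger.authorize("Cigna", "claims-2025", "reimbursement")
ledger.authorize("trial-site-DE", "MRI-2025-06", "trial-screening")
ledger.revoke("Cigna", "claims-2025")   # access is revocable at any time
```

Because every grant names both a stakeholder and a purpose, access cannot silently widen: data shared for reimbursement cannot be reused for screening without a new, explicit grant.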
Where does it reside? Cloud, on-prem, or patient devices? Each choice comes with trade-offs in security, latency, and jurisdiction.
Cloud storage offers scalability and accessibility, but raises questions about data sovereignty, legal jurisdiction (do I want my health data subject to regulations in China, the U.S., or my own country?), and exposure to large-scale breaches.
On-premises storage provides more control and compliance with local regulations, but suffers from limited interoperability and access.
Patient-held data maximizes personal control and enables rapid, patient-directed sharing, yet introduces challenges around device security, standardization and, of course, limited storage and scalability.
Can it be reused? Data should be clean, standardized, labeled, and versioned so that researchers, clinicians, and automated systems can rely on it safely.
When we design data to move safely and intelligently, it becomes an asset not just for the patient, but for collective good.
The Five Pillars of Trustworthy Nomadic Data
To operationalize this, consider the five interdependent pillars that enable data to travel safely and meaningfully:
Governed Access – Only authorized stakeholders can read, write, or compute on the data, and these permissions can adapt dynamically as contexts change.
Semantic Interoperability – Data must be standardized and labeled consistently, so that automated systems and human experts can interpret it reliably.
Modular Pathways – Data moves along flexible, secure channels that adapt to changing workflows, system updates, and patient needs.
Automation-Ready Design – Structured, clean, and labeled data enables AI and analytics tools to work safely, reducing the risk of errors and bias.
Provenance and Auditability – Every piece of data carries a verifiable record of its origin, permissions, and access history, so it remains trusted and auditable wherever it surfaces.
What if every time a patient underwent a medical imaging study that flagged a neurodegenerative or oncological condition, they were immediately presented with relevant clinical trials enrolling in their area? What if, simultaneously, each abnormality detected could be automatically labeled and paired with similar cases, using their medical history, genetic profile, and prior imaging?
In modern medical software, the competitive advantage is no longer just accuracy or immediacy, but the richness of information. Systems that combine mobility, governance, and contextual intelligence can guide patients to the right interventions, accelerate research, and support clinicians in making informed decisions in real time.
Beyond the Underwater Channel
Think back to the underwater channel metaphor. Imagine each droplet of patient data traveling along monitored pathways that connect hospitals, labs, wearable devices, and research repositories. But now, imagine each droplet carrying its provenance, permissions, and semantic labels. This ensures that wherever it surfaces - at a clinic, a research lab, or an AI system - it is trusted, interpretable, and actionable. In this way, we can move safely toward AI-driven systems that provide predictive insights, because the inputs are standardized, validated, and auditable.
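The droplet image above can be made concrete as a small data envelope: payload plus provenance, version, semantic labels, and permissions traveling together. All field values below are illustrative, not drawn from any real system.

```python
from dataclasses import dataclass

@dataclass(frozen=True)   # immutable: the envelope travels as a sealed unit
class DataDroplet:
    payload: dict      # the clinical data itself
    provenance: str    # originating system, e.g. a hospital PACS identifier
    version: str       # lets downstream consumers detect stale copies
    labels: tuple      # standardized semantic tags for automated interpretation
    permitted: tuple   # stakeholders currently authorized to read it

    def readable_by(self, stakeholder: str) -> bool:
        # A receiving system checks permissions before touching the payload
        return stakeholder in self.permitted

droplet = DataDroplet(
    payload={"finding": "lesion", "region": "frontal lobe"},
    provenance="radiology-pacs-01",
    version="v2",
    labels=("imaging", "neurology"),
    permitted=("treating-clinician", "trial-site-DE"),
)
```

Because provenance and labels travel with the payload, any endpoint - clinic, lab, or AI pipeline - can decide whether to trust and how to interpret the data without calling back to the source.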
Why Governance and Trust Are Now the Frontier
In 2024, the conversation centered on interoperability and liquidity. In 2026, the frontier has shifted: mobility without governance is no longer progress. High-quality and auditable data is what enables:
Seamless patient care across fragmented systems
Accelerated scientific research
Safe, trustworthy AI
A recent call for autonomous science instruments argues that legacy tools designed for manual use are now bottlenecks in AI‑driven discovery and that future instruments must be open, modular, and automation‑native to achieve scalable, reproducible results.
This reinforces the idea that standards alone are not enough: just like healthcare data systems must be designed with automation, contextual metadata, and open APIs at their core, science needs instruments that are built for robots, AI, and continuous experimentation. Only then can we meaningfully accelerate both clinical workflows and systematic discovery.

