If You’re Still Debating RWE, You’re Already Behind
- Adigens Health

- Aug 11, 2025
- 3 min read
The U.S. Food and Drug Administration (FDA) has spent the better part of the last decade cautiously flirting with real-world evidence (RWE). Now, it appears the agency is ready to commit.
A new paper in Clinical Pharmacology & Therapeutics, authored by John Concato and other senior FDA figures, offers the clearest view yet of where the agency stands. Spoiler: if you’re a drug developer still stuck in the debate over whether RWE will matter, you’re asking the wrong question. The better question is how to make it work.
From Footnote to Framework
The FDA’s RWE Framework, introduced in 2018, laid the groundwork for integrating data from real-world settings into regulatory decisions. But much of the industry interpreted this as polite interest rather than policy shift. That perception is no longer tenable.
The paper outlines a pragmatic but determined agenda: RWE is not a replacement for randomized controlled trials (RCTs), but a complementary tool. And one that is already influencing decisions on effectiveness, safety, and label expansion.
Critically, the authors acknowledge the central tension in RWE: its proximity to clinical practice brings relevance, but also noise. They argue that rigor, in data selection and in analytic methods, is what separates regulatory-grade RWE from the marketing brochure variety.
Turning Method into Practice
If there’s one unifying message in the paper, it is this: regulators want better emulation of trials. This means that real-world studies must be designed with the same clarity of intent that would be expected in an RCT: defined comparators, eligibility criteria, and endpoints.
As Concato and colleagues put it: “A regulatory-grade study using real-world data must reflect the core elements of a hypothetical target trial.” It’s a quiet endorsement of target trial emulation logic, even if the phrase is not used explicitly.
And the applications are growing. Recent FDA approvals and label expansions have leaned on external control arms, hybrid designs, and observational data that mirror the rigor of clinical trials, without the same time and logistical constraints.
A Call to Build, Not Just Analyze
The authors also underscore something often lost in methodological debates: RWE is only as good as the infrastructure that underpins it. “Too many studies begin with the data, rather than with the question,” one reviewer of the paper might have said (had they not been bound by editorial conventions). The FDA does not dismiss routinely collected sources like claims or electronic health records, but it does expect sponsors to demonstrate that the data are fit for the intended purpose. In some contexts, particularly those involving nuanced clinical variables or patient-reported outcomes, this may require prospective collection. In others, robust retrospective data may suffice if thoughtfully applied within a rigorous framework.
This has profound implications. Companies with ambitions in rare disease, oncology, or other rapidly evolving areas must invest not only in analytics but in deliberate data stewardship, ensuring that the right elements are captured with sufficient depth and reliability. Platform partnerships, standardized data models, and hybrid approaches are quickly becoming prerequisites. The gold standard is no longer mere data reuse: it is data readiness, defined by intentional design and transparent provenance.
If the FDA Has Moved On, So Should You
For developers waiting for an explicit signal that RWE can anchor submissions: this paper is it. But the message isn’t that anything goes. The FDA is increasingly comfortable with RWE when it resembles the structure and transparency of randomized research, even if the setting is messy, the comparator imperfect, or the endpoint novel.
This isn't the start of a revolution. It's the start of operational expectations. And sponsors who are still debating whether this is real are, bluntly, behind the curve.