Are We Ready for a Change in Implementation Science?
Implementation science is moving from infancy into adulthood. In this blog we are going to explore how this relatively new field, which emerged with noble intentions, now stands at a critical juncture.
The Rise of Implementation Science
Implementation science emerged as a field in response to a persistent problem faced by healthcare systems in the late 1990s. Despite significant advancements in medical research, many proven treatments weren’t being effectively integrated into everyday clinical practice.
This research-practice gap prompted researchers and clinicians to recognize the need for a systematic approach to address the barriers preventing evidence-based interventions from being adopted in healthcare settings.
As the field took shape, its mission became clear: to develop strategies that would accelerate the uptake of research findings and improve patient outcomes.
The Implementation Science journal was established in 2006 and early pioneers such as Martin Eccles and Brian Mittman laid the groundwork by developing theoretical frameworks to guide implementation efforts.
These frameworks aimed to provide a structured approach for understanding the complex factors influencing the adoption of new practices in healthcare environments.
The field quickly gained momentum, attracting increasing attention from researchers, policymakers, and healthcare organizations. This growth was reflected in a surge of publications, funding opportunities, and dedicated research centres.
The expansion of implementation science underscored its potential to bridge the gap between research and practice, ultimately improving the quality and efficiency of healthcare delivery. As implementation science continued to mature, we saw the development of many more frameworks and approaches.
We now have implementation outcome measures so we can evaluate implementation. We have frameworks which help us identify what might be helping or hindering implementation so we can diagnose settings and formulate strategies to reduce or remove barriers to implementation.
We have process models that can help us plan for implementation, and it's been great for people to understand that implementation is a process rather than a one-time event. We have lists of implementation strategies and hundreds of studies that have tested them.
All of these theoretical models, theories, and frameworks have been great for researchers, because we can use them as a backdrop for our research design, data collection, and analysis.
(Nilsen et al., 2015)
This proliferation of theoretical models has signalled real progress but has also foreshadowed some of the complexities that have now emerged as practitioners attempt to apply these frameworks in real-world settings.
IMPLEMENTATION FRAMEWORKS
The Theoretical Domains Framework? Normalisation Process Theory? The Consolidated Framework for Implementation Research? Sound pretty technical, right? Well, that's because they are. They are difficult to learn. We can show these implementation frameworks to practitioners, and we have tried to teach them, but it simply hasn't translated into the positive outcomes we might want.
It is the same with implementation strategies. The evidence for many of them remains inconclusive, and the ones that usually help (education, training, facilitation, promoting adaptability, audit and feedback) are pretty hit and miss depending on the context.
Although some healthcare systems perform exceptionally well, many still struggle to drive system change using implementation science. Translating research findings into practice remains a slow process; there are still gaps in our understanding, and implementation science outcomes are seldom fully integrated into healthcare systems.
As a result, many healthcare organizations still face significant challenges in effectively implementing implementation-science-driven changes and advancing evidence-based practice.
What’s more, the 170 theories, models, and frameworks now published within the field do not seem to have allowed it to achieve its original aims: supporting adoption, integration, and impact.
So what has happened?
Well as we’ll see, implementation science, in its valiant effort to establish itself as a science, has missed a very important part of implementation and it now needs to decide how it will respond.
Implementation science is only two decades old, and implementation scientists have been working to establish the field as a rigorous and legitimate science. It has been about survival and proving legitimacy. Remember, the world of academia is driven by publications and peer-reviewed studies. It's not like the world of practice; it's a different kettle of fish.
This has meant most of those working in implementation science have predominantly focused on theory advancement, arguably at the expense of practical, context-sensitive implementation. Many implementation scientists will continue to do this, but some of us, in order to maximize implementation science's potential, are going to have to learn to balance theoretical exploration with practical strategies that address the needs of healthcare practitioners and stakeholders, ensuring both relevance and sustainability in real-world contexts.
And yes, this might mean evolving past the sole notion that "evidence is king" and randomized controlled trials (RCTs) are the gold standard. RCTs are great for establishing causal inferences and demonstrating definitive truths, particularly in medical research, but their findings do not always translate seamlessly into real-world effectiveness when it comes to implementation.
RCTs work by controlling for many contextual factors in the settings where the trial is run. Future contexts will be different and ever-changing, and there won't be a research team controlling for awkward contextual barriers and challenges in the next place.
The result is that the knowledge we've been generating from implementation science hasn't been knowledge that is particularly useful to practice. Practitioners want to know "how can I implement this?" and we can't tell them. We can only make recommendations based on the experiences of a previous context in the past. Recommendations like "ensure leadership engagement", "try audit and feedback", "train your staff". But the question still remains: how? Which leaders? When will I audit? How can I audit? I barely have time to go to the bathroom because I'm always with patients.
This is why there are over 140 RCTs on audit and feedback; and even if all of them showed a positive result (they don't, the results are very mixed), this type of knowledge just doesn't help practice.
Despite this, many researchers and universities remain solely focused on assembling an extensive evidence base without putting enough energy into understanding how to actually transfer this evidence into practice.
While it has been assumed that providing incontrovertible evidence would drive outcomes, the actual impact has been modest. The extensive publication of implementation science theory and focus on evidence accumulation, has often overshadowed balanced study designs that connect theory to practical interventions or improvement efforts. This overemphasis on evidence has limited the integration of theory with actionable, real-world applications.
Even when we have something that is evidence based, many researchers struggle to select relevant evidence, ask the right stakeholder questions, or use methodologies that yield practical, impactful results. This leads to studies with limited utility, redundant data, and poor applicability to frontline practice or policy-making. Despite the vast volume of research published daily, only a small fraction influences clinical practice, leaving healthcare practitioners feeling disconnected from researchers' efforts to address system or practice challenges.
Time for something new?
I think most would probably agree that we need to integrate the theoretical, evidential, and practical dimensions of implementation equally in our work.
Methods: It’s time to get creative in our implementation science methods. Traditional implementation science (IS) methods often rely on static approaches like surveys, interviews, and focus groups, which fail to capture the dynamic, cultural, and relational complexities of healthcare settings. This has hindered the field’s ability to produce timely insights and address rapidly changing care delivery challenges.
Rapid ethnography focuses on fast, iterative data collection in local contexts. It emphasizes real-time, system-oriented analysis, capturing the relationships, behaviours, and workflows that define healthcare environments. Researchers use "mobile methods," accompanying practitioners, leveraging diverse data sources (e.g., wearables, big data), and documenting observations in the moment.
By harnessing immediacy and contextual relevance, rapid ethnography facilitates agile, actionable insights for improving healthcare delivery in complex, fast-paced environments. Rapid qualitative analysis is also useful in this regard and so are case studies.
Whatever our research approaches our methodologies need to yield meaningful, actionable results. Siloed approaches which fail to foster collaboration are no longer tenable. We know that traditional "push" models, where interventions are imposed, are less effective than "pull" models, which emphasize stakeholder engagement and participatory methods.
Co-creation models and implementation ecosystems—where researchers and practitioners collaborate over time—show promise for achieving sustainable change but require significant resources, trust, and relationship-building.
Our role is not simply to collect data; it is to work with practice to help them achieve their goals. In doing so, we all learn something valuable and, hopefully, start seeing more positive patient impact as the things that work actually get implemented and sustained.
Trust: Building trust, fostering engagement, and developing a shared vision are essential for success and a lot more of our attention needs to go into this area. While there are no quick fixes, outreach and collaboration by implementation scientists can bridge gaps between theory, practice, and research adoption, despite financial and systemic challenges. Knowledge translation relationships between researchers and frontline health and social care providers will likely be vital.
We also need to make sure teams on the ground have time to problem-solve implementation regularly and in real time, so that they can produce the very practical implementation knowledge we need within the field.
This, of course, means we will need theoretical models and frameworks which don't just focus on the micro level but also acknowledge the macro level, including community, policy, and societal contexts and entire health systems.
Working with other disciplines: Of course, we can't expect implementation science on its own to deal with the wicked problem of implementation. It would be a mistake to do that, and a mistake to reinvent the wheel.
The solution is going to require an all-out collaborative effort from stakeholders across multiple fields. A combination of different methodologies can better address the complexities of healthcare implementation and it has been refreshing to see calls for methodological pluralism.
Over the last few years we have seen implementation science combine with improvement science and practice, leading to powerful outcomes. We've also seen the very welcome addition of complexity science and frameworks such as Trish Greenhalgh's NASSS framework. A video on that soon.
Training
As a final point, training in implementation science also faces challenges. Most programs are generic, failing to account for diverse training needs or to emphasize applied experience. Embedding implementation researchers within healthcare teams and creating exchange programs could bridge gaps, fostering collaboration and providing mutual benefits.
However, capacity building remains limited, with few institutions offering specialized implementation science training or mentorship opportunities. Education in implementation science must also shift to emphasize applied skills, sustainability, and de-implementation, cultivating a workforce equipped to bridge the gap between theory and practice.
I have developed a training programme which introduces participants to the basics of implementation science, but also teaches them its limitations, and then goes on, with the use of case studies, to show them how they can use research methods which not only connect more with practice but are also able to deal with complexity.
Classes are deliberately small so the training can be tailored to the needs of each participant. One-to-one time spent solving the group's problems is paramount. Follow the training links on this website to find out more.
Conclusion
Implementation Science has established itself as a robust field, offering valuable frameworks and methodologies to help us understand implementation in healthcare systems.
However, its emphasis on theoretical rigor and evidence accumulation often limits practical application in real-world contexts. To bridge the persistent gap between research and practice, the field must embrace a more balanced approach—integrating theoretical exploration with actionable, context-sensitive strategies that address both micro-level and system-wide challenges.
Future efforts must prioritize adaptive, demand-driven methodologies that are rigorous, contextually relevant and can keep up with the needs of practice.