Knowledge translation for practice change in children's mental health

Melanie Barwick, PhD, CPsych, The Hospital for Sick Children
Katherine Boydell, MHSc, PhD, The Hospital for Sick Children
Denice Basnett, MA, The Hospital for Sick Children
Brian O'Hara, Children's Mental Health Ontario
H. Bruce Ferguson, PhD, CPsych, The Hospital for Sick Children
Rebecca Haines, MA, The Hospital for Sick Children

In 2000, Ontario mandated the use of two instruments for systematic screening and outcome measurement for children's mental health care organizations. A knowledge translation (KT) infrastructure was developed to support training in, and implementation of, the tools for service providers. Intensive face-to-face training was the mode of KT most valued by practitioners, but was complemented by regional communities of practice and an in-house train-the-trainer approach to facilitate ongoing collaborative professional education and practice change. The tools have now been introduced to over 4,000 practitioners, with uptake in over 90% of organizations.


All too frequently, children receive mental health care that is based on practices that have little supporting evidence or, at worst, poor outcomes.1,2 Yet there is substantial evidence that most children who receive an empirically supported treatment get significantly better, and do so more quickly, than with other forms of treatment or no treatment at all.3,4

In 2000, Ontario's Ministry of Children and Youth Services mandated the use of two instruments for systematic screening and outcome measurement as part of a provincial plan for children's mental health. Standardized screening practices in children's mental health can lead to better outcomes, and can help to manage long waiting lists by identifying children at greatest risk.5 Measuring outcomes can demonstrate which treatments are effective, enhance clinical practice, provide accountability and encourage practitioners to examine service quality.6,7

The screening tool, the Brief Child and Family Phone Interview (BCFPI),8 is a standardized intake instrument to screen behavioural and emotional problems of clinical significance. The outcome tool, the Child and Adolescent Functional Assessment Scale (CAFAS),9 is a standardized outcome instrument to assess level of functioning and monitor service outcomes.

Both tools are IT-based, and clinicians and intake workers require specialized training in their use. The Ontario Ministry of Children and Youth Services contracted with two organizations to deliver this training and support implementation. BCFPI is supported by Children's Mental Health Ontario, an advocacy organization that promotes the well-being of children, youth, and their families; CAFAS is supported by the Community Health Systems Resource Group at The Hospital for Sick Children, in consultation with an advisory group of service providers.

We recognized very early on that transferring knowledge about these new evidence-based tools, and implementing their use in day-to-day work with patients and families, would be extremely challenging. We therefore developed a formal KT infrastructure10 to support our training and implementation program: a composite of strategies drawn from best-practice evidence in the research literature, which evolved according to the preferences of our participants and stakeholders.

Six years later, we attribute the success of training, implementation, and adoption of these two tools in over 100 service provider organizations across Ontario to this KT infrastructure, which focused primarily on active stakeholder collaboration including providers, government, and implementers;11 face-to-face relationships;11,12 and the use of multiple methods of communication.13,14

The KT initiative

Changing the way mental health care is delivered is a formidable, slow-moving task, often requiring modifications in clinician behaviour, program restructuring, and an infusion of resources.15 With this initiative, organizations were also faced with challenges in accreditation, amalgamation, staff turnover, rising demands for service, and computer literacy.

CAFAS rater reliability training is based around intensive two-day workshops for clinicians, with ongoing website, email, and telephone support. These are delivered regionally by specialized CAFAS trainers and are well received by clinicians.

This personalized approach, while highly effective, is time-consuming and expensive, and cannot meet the differing needs of all service providers in a geographical area. Training requirements occur in waves and are ongoing, owing to individual training preferences, the dynamic nature of group learning and professional development, and staff turnover.

In recognition of this, we introduced two strategies to facilitate ongoing collaborative professional education, practice change, and KT: regional "communities of practice" and an in-house "train-the-trainer" approach.

Communities of practice can be loosely defined as a group of people who come together, either virtually or in person, around a topic. Our regional communities of practice include Ministry program supervisors, implementers, and practitioners in a unique partnership. The meetings began with our BCFPI and CAFAS teams presenting data from an implementation perspective, but have now evolved to focus on presentations by service providers who share their knowledge on how the tools are being used in clinical practice. Lessons are then transferred to the website where they can be viewed by other users.

Train-the-trainer sessions focus on training one or two people in each service provider organization to train their own staff on CAFAS rater reliability. The CAFAS team certifies the trainers on an annual basis. We also worked to introduce training of the CAFAS tool into colleges and universities to lessen the training burden of service providers, as well as to develop a culture around outcome management in children's mental health service delivery.

Other KT strategies included the development of guidelines to support the use of the CAFAS tool with special populations, namely Aboriginal children and youth,16 and communication materials describing the tools and their use to patients and their families.

We also report aggregate provincial and regional data generated by the tools to government and service providers, to elicit feedback and plan for system change.

Results of the KT experience

Over 4,100 child and youth workers, social workers, psychologists, and psychiatrists have now been trained to reliably use the CAFAS tool. Upwards of 600 specialists have been trained on the BCFPI tool, and about 250 workers now apply it across the province. The in-house train-the-trainer approach has proved to be a reliable training method, with high correlations between practitioners trained by on-site practitioner-trainers and those trained by our specialized CAFAS trainers.17 This initiative has also provided the first-ever wait list management tool and outcome data for children aged six to seventeen years who receive mental health treatment.

Our community of practice meetings have been consistently well attended and reported as relevant by participants. Practitioners report increasing buy-in and clinical utility, and service providers report increasing usefulness of the data for administrative and quality improvement purposes. Results to date show uptake of the tools in 80-90% of mandated organizations. Practitioners, Ministry personnel, and implementation teams continue to work together to determine current needs for training, support, and knowledge exchange. Research funding is now being sought to evaluate the impact of communities of practice in effecting practice change and the translation of new knowledge about use of the tools in practice.

Lessons learned

This initiative has generated several important lessons about the adoption of evidence-based practices, and the KT strategies required to support them:

  • Ongoing, clear, and direct communication is needed from funders, leaders, and champions to all those involved (agency management, clinical supervisors, and front-line workers) to fully engage participants in a spirit of meaningful collaboration. The absence of a communication plan, particularly in the crucial early stages of this initiative, was a key barrier to knowledge and uptake of these tools.
  • Both a "carrot" and a "stick" have a role in the uptake of evidence-based practices. Use of the tools was not included in service provider contracts until 2004 and, when they were, uptake increased.
  • Communities of practice appear to be valuable in supporting the use and clinical application of evidence-based practices. Pilot testing suggests time is needed to build a sense of trust among community members to allow for the exchange of tacit knowledge and to contribute to the development of a culture around new practices.
  • To be most effective, even in a mandated context, KT strategies must be developed in accordance with the individual and organizational state of readiness for change. Our approach has been to support all organizations and practitioners, but to focus on those who have a higher level of readiness for the adoption of new tools, which increases their chance of success. We highlight the successes of these "early adopter" organizations in the community-of-practice venues and on the website so that "late adopters" can share in their experiences.
  • The importance of face-to-face support cannot be overstated. Although more costly, it is the mode of KT most valued by and beneficial to practitioners. It is, in our view, most successful for sharing both the tacit and explicit knowledge required to build a professional learning culture.

Conclusions and implications

Our work continues to introduce the BCFPI and CAFAS tools to a significant proportion of Ontario's children's mental health care providers. We have implemented a targeted KT infrastructure that supports significant practice change, and allows us to use our limited financial and human resources for training more effectively, as well as for supporting a culture of practice change.

We hope to continue to learn how best to bring evidence-based practices to the field and how to support the adoption of new and innovative approaches to mental health care for children. Anecdotal reports suggest that appreciation for the tools' clinical contribution is growing with practitioner experience. However, real and perceived barriers remain to be addressed, including the time required to use the tools, the importance of assessing response to treatment, and the extent to which the data generated are viewed as purely bureaucratic rather than meaningful for clients and service provider organizations. Whether we are successful in developing a culture receptive to evidence-based practice and service delivery innovations in children's mental health remains to be seen: the journey continues.


1 Busch, A. B. 2002. Validity, reliability, and other key concepts in outcome assessment and services research. In Outcome measurement in psychiatry: A critical review, ed. W. W. IsHak, T. Burt and L. I. Sederer, 35-55. Washington, DC: American Psychiatric Publishing.
2 Dishion, T., J. McCord, and F. Poulin. 1999. When interventions harm: Peer groups and problem behavior. Am Psychol 54:755-64.
3 Chambless, D. L., and T. H. Ollendick. 2001. Empirically supported psychological interventions: Controversies and evidence. Annu Rev Psychol 52:685-716.
4 JCCP. 1998. Empirically supported psychosocial interventions for children. Special issue, J Clin Child Psychol 27:138-226.
5 Lambert, M. J., J. L. Whipple, D. W. Smart, D. A. Vermeersch, S. L. Nielsen, and E. J. Hawkins. 2001. The effects of providing therapists with feedback on patient progress during psychotherapy: Are outcomes enhanced? Psychother Res 11:49-68.
6 Barlow, D. H., S. C. Hayes, and R. O. Nelson. 1984. The scientist-practitioner: Research and accountability in clinical and educational settings. New York, NY: Pergamon Press.
7 Ogles, B. M., M. J. Lambert, and K. S. Masters. 1996. Assessing outcome in clinical practice. Boston, MA: Allyn & Bacon.
8 Cunningham, C. E., P. Pettingill, and M. Boyle. 2000. The brief child and family phone interview (BCFPI). Training manual. Hamilton, ON: Canadian Centre for the Study of Children at Risk, Hamilton Health Sciences Corporation, McMaster University.
9 Hodges, K. 2003. Child and adolescent functional assessment scale. 3rd ed. Training manual. Ypsilanti, MI: Eastern Michigan University.
10 Barwick, M., K. Boydell, and C. Omrin. 2002. A knowledge transfer infrastructure for children's mental health in Ontario: Building capacity for research and practice. Report. Toronto, ON: The Hospital for Sick Children.
11 Lomas, J. 2000. Using 'linkage and exchange' to move research into policy at a Canadian Foundation. Health Affair 19 (3): 236-40.
12 Landry, R., M. Lamari, and N. Amara. 2003. Extent and determination of utilization of university research in government agencies. Public Admin Rev 63 (2): 192-205.
13 Bero, L. A., R. Grilli, J. M. Grimshaw, E. Harvey, A. D. Oxman, and M. A. Thomson. 1998. Closing the gap between research and practice: An overview of systematic reviews of interventions to promote the implementation of research findings. BMJ 317:465-68.
14 Freemantle, N., and I. Watt. 1994. Dissemination: Implementing the findings of research. Health Libr Rev 11:133-37.
15 Huang, L. N., M. S. Hepburn, and R. C. Espiritu. 2003. To be or not to be evidence-based? In Data Matters: An evaluation newsletter 6:1-3. National Technical Assistance Center for Children's Mental Health, Georgetown University Center for Child and Human Development.
16 Barwick, M. A., Dilico Ojibway Child and Family Services, and K. Hodges. 2004. Culturally competent evaluation: Clinical considerations for rating the Child and Adolescent Functional Assessment Scale with Aboriginal children and youth. Report. Toronto, ON: The Hospital for Sick Children.
17 Barwick, M. A., C. Omrin, and D. Basnett. Under review. Maintaining reliability in Ontario's outcome initiative: Training approaches and rater drift on the Child and Adolescent Functional Assessment Scale. J Behav Health Serv Res.
