Module objectives

The purpose of this module is to help implementers fully integrate the ChYMH Suite tools into organizational practice – and keep them that way! In our experience, factors such as organizational culture and how resistance is addressed are crucial to sustaining ChYMH use. With this module you will:

  1. Compare your organization’s implementation so far against the key factors that are shown to sustain change.
  2. Identify and address post-launch issues that threaten long-term implementation sustainability.
  3. Learn how ChYMH suite data can be used to support organizational and regional decision-making.
  4. Identify and celebrate your implementation successes.

4.1 Sustaining change

The good news:

If you have been following the steps of this toolkit, you can be confident that you have laid a solid foundation for sustainability!

Sustainability depends on creating and following a clear plan that includes the key pieces known to be related to long-term success, such as ensuring readiness before starting, having good two-way communication, and so on.

Even the best laid plans can go slightly awry. In the sustainability phase it is important to check in on your organization’s sustainability factors.

Be flexible and open. You may have to return to the activities of an earlier stage to make adjustments to ensure the long-term success of ChYMH Suite use at your organization.

What does fully realized ChYMH Suite use look like?

  • ChYMH Suite assessments are completed with fidelity to your organization’s processes and to the definitions and instructions in the User’s Manual.
  • ChYMH Suite results are routinely incorporated into treatment planning and documentation. Everyone who needs to know how to read and understand the results, does. ChYMH Suite results are never filed away without being reviewed and used.
  • The culture of your organization is one of valuing data-informed decision-making. ChYMH Suite data is gathered, shared with clients and caregivers, and used to evaluate treatment progress, program effectiveness, and organizational successes.
  • Trainers are in place and are knowledgeable and accessible to staff for questions/concerns. Trainers participate in Communities of Practice.
  • Communication between leadership and staff is ongoing and continues to be two-way. Support for the ChYMH Suite is consistently visible through multiple avenues such as newsletters, and sharing data at staff meetings.

The 6 Essential Factors for Sustaining Change

  1. Leaders  
    The research is clear: organizational leadership is essential to successful implementation and sustainability. Is your management team consistently promoting the benefits of the tools, both clinically and as a support for data-informed decision-making? Are your clinical and administrative leaders continually working to remove barriers and communicate project priorities?
  2. Engagement  
    Another key to long-term success is getting staff increasingly involved as implementation evolves. It is time to move from an Implementation Team to a Sustainability Team, which will create an opportunity for new champions to emerge. Apply the same decision-making that went into creating the Implementation Team. Strive for good representation in terms of how the tools are used. For example, assessors, intake staff, primary care workers, and data analysts will all have different and valuable perspectives to contribute. Building a team with broad representation will help ensure that the needs of all those using the tools are at the forefront of sustainability conversations.
  3. Readiness  
    Readiness remains an important factor in sustaining optimal implementation outcomes. Post-launch is an excellent time to have the Sustainability Team repeat the Readiness Self-Assessment (section 1.5) to see if any new or continuing gaps need to be addressed.
  4. Embedment  
    Our experience has taught us that clinical embedment is the key to the sustainability of ChYMH tool use. How is this happening at your organization? Have clinical staff embedded the tools into their clinical documentation? Are they incorporating CAPs and scale results into care planning, and regularly utilizing monitoring assessment results at service planning meetings? Consider linking use of the interRAI ChYMH Suite to your organization’s strategy/vision to make expectations explicit.
  5. Measurement  
    Establish an ongoing measurement system and a standardized way of communicating results (such as the number of ChYMH assessments completed, or the most common needs of children at intake). Data-informed decision-making may be new to some, so showing staff how the information collected through the tools can improve outcomes for children and families will help to sustain their use.
  6. Communication  
    Don’t stop communicating with staff about the value the ChYMH Suite holds for your organization and your clients, and don’t stop listening to staff who are telling you what they need to make it work. Use your Communication Plan to address issues as they come up and to provide clarity to staff about expectations and processes.

Ultimately, ChYMH Suite implementation efforts cannot succeed unless the organization’s culture shifts to valuing the ChYMH Suite assessments as clinical tools rather than just “paperwork” and “data”.

To achieve this, your organization must ensure three crucial factors are in place:

  • Staff must understand the reason for collecting objective data about clients.
  • Staff must be supplied with the resources they need, including manageable processes and user-friendly technology.
  • Staff must understand how to use the data in their specific roles and have clear systems in place for using the data to support decision-making.

4.2 Sustainability troubleshooting

Common issues that threaten sustainability are presented below, along with suggested approaches to address them.

Each issue is framed as something you might be hearing, followed by what it might include and recommended approaches for addressing it.
“Staff are reluctant. They haven’t ‘bought-in’ to the change.”
This might include:
  • Staff struggling with the change and avoiding using the tools.
  • Staff feeling like the tools are “just for data”.
  • Staff concerned that the assessment process will interfere with their ability to build therapeutic rapport.
Recommended approaches:
  • Shine a light on the discontent. Work on helping staff feel safe to bring forward specific concerns. Give staff and leadership an opportunity for dialogue to get these issues out in the open where they can be dealt with.
  • Deliver a clear message from leadership that promotes open communication while making clear that the decision to support this change is firm.
  • If your leadership has made missteps along the path, own them! Show your staff that you hear them and that you are willing to learn and grow from your mistakes.
  • Use your Communication Plan to prepare and deliver communications to staff about the urgent need to adopt this change, the benefits of a standardized assessment process, and the ways data can enhance or demonstrate the benefits of clinical care.
  • Help assessors understand that as long as they use the intended conversational interview approach (rather than treating the assessments as questionnaires), the tools support relationship-building by providing a structure for gaining a thorough understanding of each client, their family, and their unique situation and needs.
“Staff are completing assessments, but the results aren’t getting used.”
This might include:
  • Staff feeling that they don’t understand the results well enough to make use of them.
  • Staff not trusting the results and feeling that their own judgement should suffice.
  • Staff feeling as though they don’t know how to discuss assessment results with their team, supervisors, or clients.
  • Staff feeling as though there is no clear direction on how to use the results in their programs.
Recommended approaches:
  • Make sure that clinical managers/supervisors have a strong understanding of the results and are able to guide staff in their use. Refer to the Clinical Staff Competency Checklist in the Appendices to support individual staff members.
  • Establish overall guidelines, review report templates and guides, and set expectations for discussing results with clients and families.
  • Consult with your training team about options such as booster training or a “Lunch and Learn” series to review how to interpret and use some of the most commonly identified status/outcome measures in your organization’s clientele. Examples of topics include the CAPs that are most commonly triggered, the scales that are most frequently consulted or monitored, or the decision-support algorithms for screening.
“We’ve put staff through the proper training, but assessments aren’t consistently getting done.”
This might include:
  • Confusion about who will complete the assessments, or about the requirements around time frames for completing assessments.
  • Some programs doing great, while others are lagging.
  • A small number of vocal “naysayers” who are derailing the work of the Implementation Team in pockets across the organization.
Recommended approaches:
  • Make sure that directions about processes are 1) clear and precise, 2) tailored to the program level (so there is no opportunity for “Well, they can’t possibly have meant us” misinterpretations), and 3) effectively communicated using your Communication Plan.
  • If some areas of your clinical team are struggling where others are not, investigate further to determine whether the underlying issues are about processes, individual staff who are struggling, or individual “naysayers” who are spreading negativity. For the first possibility, see the bullet above. For the second, consult with your training team to determine whether booster training or mentoring opportunities might help with skill-building. For the third, aim to have consistent positive messages spread from multiple sources (such as leadership and champions), and consider approaching naysayers directly in a manner that addresses their specific concerns.
“The process doesn’t make sense. We’re not getting the guidance we need.”
This might include:
  • Different programs drifting to use of different processes, which is leading to confusion.
  • The need to make some program-specific tweaks to processes.
  • Supervisors and managers who aren’t equipped to guide decisions about process changes because they aren’t informed about what the different tools provide in terms of clinical information.
Recommended approaches:
  • Start at the top. Focus on the clinical management team to make sure they have what they need to lead. Literacy training for this group needs to be fairly detailed. Supervisors need a strong understanding of the results so they can suggest ways of embedding results in the day-to-day practices of their teams.
  • Evaluate the current process for the program in question. Does the process work? Was it clearly communicated? Have changes been made since the initial launch, and if so, has this tailoring been helpful or harmful?
  • Hold program-based brainstorming sessions that encourage all team members to link the work their team is doing to the information produced by the assessments. It might be helpful to focus on specific scales or CAPs that highlight interventions that the program commonly uses. Start by considering what outcomes can be used to demonstrate success, and then look at how common secondary gains might be captured in the outcomes as well.
“Why can’t we all use the Screener? The ChYMH takes too long and they’re basically the same anyways.”
This might include:
  • Assessors (or clinical managers) who don’t understand the differences between the ChYMH Suite tools and the different purposes they serve.
  • Assessors who don’t have faith that their assessments are being used clinically. If assessors feel the tools are being used “just for the data”, they will likely feel the time they spend completing assessments could be better spent on “clinical work”.
Recommended approaches:
  • Use your Communication Plan and training team to help staff understand that full and screening tools are not interchangeable. On the face of it the assessments look very similar, so it’s easy to assume the tools are the same. Help your staff understand that even though the ChYMH and the Screener share many items, they produce different types of results that support different aspects of clinical work.
  • Take an objective look at how your organization is using ChYMH Suite results. Is the effort going into collecting this data matched by an equally strong emphasis on using the data to make decisions about client care? It is important to balance the focus on completing assessments with a focus on using assessment results. Administering a ChYMH Suite assessment is the most salient skill staff must develop during implementation, so it naturally becomes a very high priority. However, if the same emphasis isn’t placed on using the results, assessors will not value the assessments, leading to reduced fidelity when completing assessments and, ultimately, poor-quality data that will not be useful.

4.3 How ChYMH Suite data can help your organization

Clinical staff often find learning to apply ChYMH Suite assessment information and outputs to their client care very rewarding. This realization can take some time, as it requires a level of comfort with the tools. Once that comfort has been achieved, using data-informed decision-making to support young people and their caregivers feels good!

ChYMH Suite assessments can also promote data-informed decision-making at the organizational level. When assessment data is grouped, it becomes possible to tell clients’ stories and reveal the impact of the support provided.

Becoming agile with your datasets to examine outcomes or evaluate programs is a first step to using data to guide wider decisions. Learning how to communicate your findings with different stakeholders can motivate those involved to support improvements and underscore the importance of collecting standardized information in the first place.

Value that the ChYMH Suite data can bring to your organization:

  • The ability to build a richly detailed profile of your clientele, including identifying common “types” of clients, in order to serve them better.
  • The ability to confidently evaluate program and service effectiveness. Using the objective and quantifiable observations that are the foundation of the ChYMH Suite tool design allows you to go beyond subjective measures (such as whether someone feels that the client improved) to more credible objective measures (such as the amount of improvement in a specific behaviour).
  • The ability to tell your organization’s story, including the impact of your work, using meaningful and credible data.

Choosing the right data elements or “variables”

Part of what makes the ChYMH Suite tools so useful is the wealth of information they provide. The comprehensive nature of the tools can make it challenging to decide which data elements to focus on when looking at the data. This section gives an overview of what to consider when selecting data to use for different purposes.

Creating a profile of a ‘typical’ client

Many organizations begin working with their ChYMH Suite data by focusing on answering the question, “Who are we serving?” Once you know what your typical clients’ needs are, you can review existing services to look for gaps. You can also use this information to check whether your organization’s actual focus (the daily work) differs from what is intended (program goals).

For this purpose, consider summary scores provided by assessment results (such as CAPs, scales, algorithm results) or specific individual items that are relevant to your services.
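As an illustration of this first step, the short sketch below builds a simple “who are we serving” profile from grouped assessment data. The record fields and CAP names here are invented for illustration only; they are not actual ChYMH Suite variable names or export formats.

```python
from collections import Counter
from statistics import median

# Hypothetical, simplified assessment records; a real ChYMH Suite export
# will have its own variable names and coding.
records = [
    {"age": 9,  "triggered_caps": ["Sleep", "Attachment"]},
    {"age": 12, "triggered_caps": ["Sleep", "Caregiver Distress"]},
    {"age": 15, "triggered_caps": ["Caregiver Distress"]},
    {"age": 11, "triggered_caps": ["Sleep"]},
]

# Median age gives a quick sense of the typical client served.
median_age = median(r["age"] for r in records)

# Counting triggered CAPs across assessments highlights common needs.
cap_counts = Counter(cap for r in records for cap in r["triggered_caps"])

print(f"Median age: {median_age}")
for cap, n in cap_counts.most_common():
    print(f"{cap}: triggered in {n} of {len(records)} assessments")
```

A summary like this is exactly the kind of output that can be shared at a staff meeting or in a newsletter to show what the data is saying about your clientele.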

Evaluating change over time

It is important to understand the clinical intention of interRAI outputs, as this impacts the appropriateness of the data for measuring change. Not all scales, CAPs, or Screener algorithm results are capable of showing general and/or clinical change. Tool outputs should be carefully reviewed before determining which variables to include in any analysis that involves comparing results over time.
Creating structure to create meaning

The amount of data produced by the ChYMH Suite tools can be daunting, and it can be difficult to know where to start. A meaningful system for organizing, interpreting and communicating large quantities of data is a crucial part of turning data into knowledge, and the importance of this step should not be underestimated. To address this need for an overarching structure to help make the data meaningful, CPRI developed a Domains Framework (see Appendix A). Use this framework when organizing your own data or as a starting point for creating a framework that meets your organizational needs.

Tips for ensuring data quality

Inaccurate information collected during assessments will not give your clinical staff what they need to make sound data-informed decisions about their clients’ needs. Poor data quality can lead to inaccurate decisions at the organizational level as well. Luckily, there are some steps that can be taken to ensure data quality.

Emphasize continuous learning

Assessor drift is the gradual erosion of fidelity to the assessment guidelines over time. Issues with drift usually occur when organizations approach assessor training with a “one and done” mentality.

To reduce drift, approach learning as an ongoing process.

  • Refresh your staff training periodically with booster sessions.
  • Use competency measures to find staff who need support.
  • Review assessments as part of clinical supervision. For example, look for results that don’t “make sense” clinically and work together to determine how the assessment data went awry.
Connect the dots

To create a culture that values the ChYMH Suite data, bridge the gap between clinical staff and leadership.

  • Show staff how leadership uses the ChYMH suite data to make decisions for the organization.
  • Show how your organization’s data fits with larger comparisons (such as regional or provincial metrics) to help engage staff with setting and meeting targets.

Bridge the gap between clinical staff and data analysts too.

  • Share what you have learned from analyzing ChYMH Suite data. This will help staff see how the assessments get “rolled up”.
  • Ask for research questions! In our experience, assessors are curious about “what the numbers say” for their work areas and beyond.
Manage and use the data wisely

A few extra steps will help make sure that your datasets are in the best condition to provide the most useful information:

  • Remember that each client should only have one unique identifier assigned.
  • Create a “codebook” or “data dictionary” to ensure there are common definitions in place. Variable names typically appear differently in a data file so creating a detailed description of terms and what they mean is a best practice that reduces errors and increases efficiency. Codebooks or data dictionaries can also include notes about challenges or changes to data elements over time. This can help users understand the limitations of the data in the data file.
  • Plan analyses carefully. As noted in section 3.4, there are limitations to how CAPs, scales, and algorithms can be used to show severity and change. Other considerations include:
    • sample size: do you have enough cases to detect change?
    • analysis type: does the analysis you have planned fit the data that you have? If you are planning statistical analyses, consider reading up on how to choose between a parametric test (like a t-test) and a non-parametric test (like a Wilcoxon test).
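To make the codebook idea concrete, here is a minimal sketch. The variable names, codes, and ranges are invented for illustration; a real codebook would document your organization’s actual data elements.

```python
# A minimal "codebook" mapping short variable names from a data file to
# human-readable definitions and valid values. All names and codes here
# are invented for illustration.
codebook = {
    "anx_scale": {
        "label": "Anxiety scale score",
        "valid_range": (0, 12),
        "notes": "Higher scores indicate more frequent anxiety symptoms.",
    },
    "ref_src": {
        "label": "Referral source",
        "valid_values": {1: "Family physician", 2: "School", 3: "Self/family"},
        "notes": "Code 3 added partway through year one; earlier records use 1-2 only.",
    },
}

def check_value(var, value):
    """Validate a raw value against the codebook entry for `var`."""
    entry = codebook[var]
    if "valid_range" in entry:
        lo, hi = entry["valid_range"]
        return lo <= value <= hi
    return value in entry["valid_values"]

print(check_value("anx_scale", 7))  # value within the documented range
print(check_value("ref_src", 9))    # unknown code, flagged for review
```

Note how the `ref_src` entry records a change to the coding over time; that is exactly the kind of limitation a data dictionary should capture so later users can interpret the data file correctly.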

Measuring success: program evaluation using ChYMH Suite data

Program evaluation can help you answer the most crucial question when it comes to service delivery: Is what we’re doing working?

This toolkit does not provide a step-by-step guide for program evaluation. Many excellent resources exist to help with planning and executing program evaluations in the mental health sector. One such resource is the Ontario Centre of Excellence for Child and Youth Mental Health’s Program Evaluation Toolkit, which can be accessed at www.excellenceforchildandyouth.ca.

Instead, the intent here is to provide an example of how ChYMH Suite data can be used to support program evaluation. To that end, we have included a sample logic model that illustrates how ChYMH Suite variables can be used to evaluate a clinical program.footnote 11

What is a logic model?

A logic model is a visual diagram that represents how a program works. It details all the elements of the program (including inputs, outputs, activities) and the intended goals or outcomes.

  • Inputs: the resources needed to provide the program. Human, financial, time, technology, and physical resources are examples of inputs that would be identified here.
  • Outputs: the actual products or units of service that are produced by the program.
  • Activities: the tasks or steps taken in delivering the program, such as completing assessments, delivering education, or providing counselling.
  • Outcomes: the changes that the program is intended to cause.
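As a lightweight illustration, the four components can also be recorded as structured data; the program, inputs, and outcome measures below are entirely hypothetical. Pairing each intended outcome with a named data element makes it easy to confirm that every outcome in the model can actually be measured from the assessments you collect.

```python
# Hypothetical logic model for an imagined outpatient anxiety program.
# All inputs, activities, and measures are invented for illustration.
logic_model = {
    "inputs": ["2.0 FTE clinicians", "assessment software", "clinic space"],
    "activities": ["intake ChYMH assessments", "group sessions",
                   "reassessment at discharge"],
    "outputs": ["assessments completed", "group sessions delivered"],
    # Each intended outcome is paired with the (invented) data element
    # that would be used to measure it.
    "outcomes": {
        "reduced anxiety symptoms": "anxiety scale score, intake vs. discharge",
        "improved school attendance": "school attendance item, intake vs. discharge",
    },
}

# A quick completeness check: every intended outcome has a named measure.
unmeasured = [o for o, m in logic_model["outcomes"].items() if not m]
print(f"{len(logic_model['outcomes'])} outcomes, {len(unmeasured)} without a measure")
```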

The interRAI ChYMH Suite assessments are excellent sources of clinical data for evaluating the impact of services on clients’ functioning. The tools focus on objective, reliable, current, and holistic information. The ChYMH and ChYMH-DD provide a wealth of clinical data from which to pull mental health and service quality indicators that can be used to evaluate the outcomes of a program or service.

4.4 Celebrating successes

ChYMH Suite implementation is a marathon, not a sprint. It will be important to take time to recognize successes along the way. Which milestones your organization chooses to celebrate are entirely up to you, but here are a few that you may want to consider:

Post-Launch party!

Acknowledging the hard work and dedication of people from across the organization - including the Implementation Team, Information Technology staff, clinical staff, and managers - is a great way to create a sense of community. Launching tool use is no small feat, and getting through those early hurdles, both big (completing first assessments!) and small (accessing logins!) is something to celebrate. A post-launch party approximately one month after the "Go Live!" date is a great way to bring your organization together to celebrate their hard work and to demonstrate your commitment to the tools.

100th assessment milestone!

This milestone is a perfect opportunity to demonstrate the value that this data can have for your organization. In our experience, staff get excited when they can see something tangible created from the data that they have gathered. Consider creating a “Here’s what we’ve seen in the first 100 assessments" profile of your clients (such as primary needs, reasons for referral, median age). A staff meeting presentation, email blast, or internal newsletter feature are all excellent opportunities for acknowledging the milestone and sharing data.

Note: You may wish to adjust this number. Aim for a milestone that will occur around 3-4 months after the "Go Live!" date.

One-year anniversary!

We recommend a party - with cake! This is a great time to announce your new interRAI Sustainability Team, and of course, to show off your data. Marking the one-year milestone acknowledges challenges, successes and gratitude for a team effort well done. Make this celebration a time for reflection: What went well? What is the data telling us? What are assessors telling us? How can we improve going forward?

Mini case scenario #4

Organization D provides both inpatient and outpatient services to children and youth. They launched the ChYMH Suite 14 months ago. Implementation started off strong, but recently the organization has noticed a drop-off in assessment completion. A new Sustainability Team has been created to meet with each program and discuss potential barriers.

Members of the Sustainability Team have met with staff from the different programs, and program managers have shared some of the challenges that they are facing. Some staff feel that this is 'just another tool' in a series of tools over the years. Others feel that the tool is too comprehensive or intrusive and are uncomfortable gathering information about some of the items. Others simply don't see any benefit to changing the way that things have been done in the past. Many clinical staff report that they do not like the format, and that they would prefer a more narrative approach that fits with the way that they were trained.

A quick analysis shows that the drop in completed assessments is due to a combination of clinical staff choosing not to do the assessment or starting assessments but not completing them.

Managers also reported that although some programs incorporate the results into their documentation and utilize the CAPs manual, others do not.

After considering all of the information that they have gathered, the Sustainability Team has developed a plan that includes:

  • Providing booster training that focuses on embedding the tools into clinical practices;
  • Having some of the “champions” work directly with the programs that are struggling;
  • Asking the executive director to send an all-staff email to remind teams of the reasons why the ongoing use of the ChYMH Suite tools is an organizational priority;
  • Sharing the implementation evaluation results to date;
  • Making engagement with the data a priority by providing each program with a tailored overview of data that is relevant for them.

Mini case scenario #4 reflection

Guiding questions

  • At Organization D, the first sign of waning commitment to the ChYMH Suite was a drop in the assessment completion rate. What other signs of trouble should you watch for? Are any currently occurring at your organization?
  • Consider Organization D’s plans for addressing the challenges that have been raised. The plans include a mix of training, communication, using data, and program-specific engagement. Does your organization typically rely on just one or two of these approaches? Is there an opportunity to expand to a more multifaceted approach to solving issues?
  • Is a Sustainability Team only needed if the long-term sustainability of ChYMH Suite use is in jeopardy? What functions can a Sustainability Team serve during times when clinical staff are content with the tools?

Phase 4 actions checklist

Retire the Implementation Team and form a Sustainability Team

  • As you close out the Implementation Team, be sure to acknowledge and celebrate their accomplishments – they have worked hard to get your organization through the Planning, Launch and Initial Use phases!
  • Mix up your membership – ideally, a Sustainability Team should be composed of a mix of experienced staff from the Implementation Team who are interested in continuing to be involved, as well as new staff from different areas of the organization who can bring lived experience using the tools or results in their roles.
  • Agree upon and document the roles, responsibilities, and tasks of the Sustainability Team.
  • Re-take the Readiness Self-Assessment in section 1.5 to gain insight into any current gaps as well as the strengths you have developed.
  • Review the Sustainability Troubleshooting guide in section 4.2. Are any of these common issues present at your organization? Make a formal plan for how the Sustainability Team will support your organization with overcoming lingering hurdles.

Expand your data use

  • Check in with various teams and ensure that data is being used to support clinical decision-making.
  • If you haven’t already, develop a strategy for how the data will be used to support organizational and operational decision-making. Decide whether you will use CPRI’s Domains Framework to organize data meaningfully or develop your own framework.
  • Be sure to give staff who will be working with the data adequate time to understand the different types of variables in the data and determine which are useful at the organizational level.
  • Consult with clinical teams to understand which variables and outcomes are meaningful for developing a client profile for your organization.

Ensure that ongoing training and education efforts support sustainability

  • Check in on your Assessor and Trainer Training. Determine whether ongoing training is happening per the Training Plan, and whether it is meeting your needs.
  • Check in on staff competency. Refer to section 3.3 and Appendix M. Determine whether there are any common gaps that could be addressed with targeted Booster Training.
  • Check in on clinical supervisors. Ensure that supervisors are continuing to prioritize their staff’s ongoing tool use and learning.

Plan a celebration!

  • Getting this far is no small feat. Be sure to acknowledge and celebrate your successes!

Footnotes

  • footnote 11: To see a logic model template that incorporates ChYMH data as evaluation metrics, head to Appendix J.