
Data, outcomes measurement, and evaluation in the immigrant and refugee-serving sector

(This is the seventh in a series of posts from the Settlement Sector & Technology Task Group's final report: From Silos to Solutions: Toward Sustainable and Equitable Hybrid Service Delivery in the Immigrant & Refugee-Serving Sector in Canada. Over the coming days/weeks, I will be extracting thematic sections from the report and posting them as articles to make them more accessible. For each key theme, we provide an introduction, sector perspectives (from interviews and focus groups), a number of useful tools and practices that guided our recommendations and can help the sector and IRCC develop the themes into practice, and then a list of specific recommendations relevant to the theme.)

Institutional and Sector Resilience 

In the first phase of our work, we captured a number of promising practices that SPOs had implemented to adapt to the changing environment, both in the format of service delivery and in organizational operations. The pandemic crisis has opened up new opportunities as digital strategies have expanded to reach more audiences. We see the progress SPOs have made, which strengthens program resilience.

In this process, individual settlement practitioners and SPOs have demonstrated creativity, flexibility, and agility in developing strategies to overcome the challenges of the pandemic. Digital transformation and adoption of the hybrid service delivery model can be unique driving forces to encourage continued institutional resilience in the immigrant and refugee-serving sector.

Data, Outcomes Measurement, and Evaluation

Introduction and Discussion about Data, Outcomes Measurement, and Evaluation

The previous narratives and analysis capture vital training capacity and upskilling in hybrid service delivery, as well as the professional positions that have been created to meet existing programs’ needs. The hybrid service delivery model encompasses human resources and development, service distribution, digital literacy assessment, and digital professional training and career development. In a hybrid service delivery model, measuring and evaluating service performance and effectiveness is essential.

SPOs have worked digitally/remotely since March 2020. Questions remain about how these organizations evaluate the quality and outcomes of digital services. This section outlines useful concepts and practices for measuring and evaluating digital services, toward a better understanding of which strategies are most useful and effective in achieving missions.

Outcomes measurement is not a new term for the immigrant and refugee-serving sector. Traditionally, many SPOs evaluate outputs (“how the service was delivered” and “what service was delivered”) rather than outcomes (the impact on newcomers). Increasingly, outcomes measurement is becoming a core part of service and program evaluation. SPOs use a variety of evaluation tools, such as client feedback forms, focus groups with both clients and staff, and informal staff meetings and sharing.

Counting, or quantitative measurement, is easy with technology. The digital service landscape is vast and data-rich, and analytics are built into all digital systems. So what can be done with technology to evaluate outcomes? Data management is slowly becoming recognized as a needed core sector competency. Data can help ensure that agencies meet the needs of the communities they serve and remain accountable to diverse stakeholders. While this quantitative data can help compare service quality to before the pandemic, more can be done.

Organizations are increasingly investing in data to measure program outcomes, improve services to clients, analyze their past performance, and adjust their strategy. Sectors and organizations are increasingly being asked to create their own data ethics standards, practices, and procedures. Organizations require the skills to gain meaningful insights from data in order to unleash the strength of data. This is particularly important when it comes to outcomes measurement. Measuring service delivery outcomes in a hybrid service model practice requires a different lens than simply the quantitative one. 

The Alberta Association of Immigrant Serving Agencies (AAISA) is currently conducting research that looks at the various Data Information and Management Systems and solutions being used by the sector. As they connect with both the platform providers for a technical understanding of the platforms and with agencies for anecdotal feedback, the framework they create in the near future will be useful in guiding agencies that are looking at adding, changing, or creating solutions.

Service Providers on Data, Outcomes Measurement, and Evaluation

A digital coordinator in an employment program identified a job matching system that allowed them to better focus and match clients’ professional and educational backgrounds with potential employers in digital job fairs, resulting in efficiencies as well as more effective hiring experiences: 

We had to create something called a job matching system. So no government had anything like that where we looked at the clients and those that are job ready. We use the NOC (National Occupational Classification) codes to match them with the employer job openings, and we match them that way. That helps getting people’s job faster. But it also helps with job fairs. We have a lot of mini job fairs across the different regions. And we only invite the people that we think are relevant to the employers that we’re bringing in. And that works really well. And technology allows us to do that, right? Before it was tedious to try to do that matching. Now a bit labour intensive to get all the NOC codes. But once that is done, the rest is a lot easier. (employment, SPO, focus group)
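The matching logic this coordinator describes, pairing job-ready clients with employer openings via shared NOC codes, can be sketched in a few lines. This is an illustrative sketch only, not the SPO's actual system; all names, fields, and data below are hypothetical.

```python
# Illustrative sketch of NOC-code job matching as described in the quote.
# A real system would draw on a client database and live employer postings;
# here both are hypothetical in-memory records.
from collections import defaultdict

def match_clients_to_openings(clients, openings):
    """Pair job-ready clients with employer openings sharing a NOC code."""
    openings_by_noc = defaultdict(list)
    for opening in openings:
        openings_by_noc[opening["noc"]].append(opening)

    matches = []
    for client in clients:
        if not client["job_ready"]:
            continue  # only job-ready clients are invited, per the quote
        for opening in openings_by_noc.get(client["noc"], []):
            matches.append((client["name"], opening["employer"]))
    return matches

clients = [
    {"name": "A. Hassan", "noc": "42202", "job_ready": True},   # early childhood educator
    {"name": "B. Singh", "noc": "21231", "job_ready": True},    # software developer
    {"name": "C. Lopez", "noc": "42202", "job_ready": False},
]
openings = [
    {"employer": "Sunrise Daycare", "noc": "42202"},
    {"employer": "Maple Childcare", "noc": "42202"},
]

print(match_clients_to_openings(clients, openings))
# -> [('A. Hassan', 'Sunrise Daycare'), ('A. Hassan', 'Maple Childcare')]
```

As the participant notes, assigning NOC codes is the labour-intensive step; once codes are in place, the matching itself is mechanical, which is what makes occupation-specific mini job fairs cheap to assemble.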

One participant stressed how comprehensive the transition to digital data needed to be, despite varying staff working preferences:

Looking at change management, just with a database, where we’re switching from having paper files and notes and things like that, and then transitioning to something that’s digital. That’s really important. And that was really important during this pandemic, because one of the first things that we needed was to make sure we have access to client files, and that we’re able to pace manage, and track the activities that we’re doing, and to be able to do that online without the paper files, and to be doing it remotely as well. But it was a long process. Some staff that prefer writing, taking their notes and typing them in later and so on. And I think again, the pandemic really pushed us forward to say that’s kind of like, there’s none of that you don’t have your files, you got to do everything online. (leaders, SPO, focus group)

In addition to database administration, informants underlined that intake and referral systems needed to be internally linked. Clients should not have to refill applications that contain repetitive content when receiving services within a province: 

What is the basic information that could be attached to a client’s file when it comes to settlement and integration services that they don’t have to exhaust newcomers every single time they go to one agency to the other to be able to, and some regions do it better. I know that Saskatoon, for example, all the agencies are interlinked really well, they do this much better in terms of flowing. But when it comes to Ontario, every single time you appear at a settlement agency or even any kind of settlement service it’s like you’re starting from scratch and like you never existed from the first place. It’s that part that you also need to enhance, so that you make the newcomer feeling a little bit more supportive. (employment, focus group)

Several frontline practitioners noted that intake questions needed to be further refined. These questions should avoid collecting unnecessary information from clients, to accelerate the intake process. Each question in the virtual intake system should have its own purpose based on program service objectives:

I think for the hybrid service model, in our intake process, we need to change the way we are doing our intake to make sure that we are capturing this information about the clients, their digital literacy or their interest in attending either online or in person, because this will help us organize our thoughts and organize our programs. Otherwise, it’s going to be like, are you interested and then they may not be interested or they may not be available. So checking on their family status, we do that, all of us but make it more consistent. The baseline data. We never captured, like are you interested in doing the work online or in person, we have to start capturing this information. (technology, focus group)
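The ideas in the last two quotes, capture baseline data once (including new fields like online/in-person preference) and stop re-asking it at every agency, can be sketched as a pre-filled intake form. This is a minimal hypothetical sketch; the field names and records are invented for illustration, not taken from any SPO system.

```python
# Minimal sketch of a linked intake system: baseline client data is entered
# once and shared, so each program's intake form asks only for the fields
# it does not already have. All field names and values are hypothetical.

SHARED_BASELINE = {
    "client_id": "C-1042",
    "name": "D. Amari",
    "arrival_date": "2021-03-15",
    "preferred_language": "Arabic",
    "service_preference": "online",  # captured once, as the participant suggests
}

def build_intake_form(program_fields, baseline):
    """Pre-fill a program's intake form; return what remains to be asked."""
    prefilled = {f: baseline[f] for f in program_fields if f in baseline}
    to_ask = [f for f in program_fields if f not in baseline]
    return prefilled, to_ask

# An employment program needs two baseline fields plus one program-specific one.
prefilled, to_ask = build_intake_form(
    ["name", "preferred_language", "target_occupation"], SHARED_BASELINE
)
print(prefilled)  # {'name': 'D. Amari', 'preferred_language': 'Arabic'}
print(to_ask)     # ['target_occupation'] -- the only question left to ask
```

The design point is the one informants made: the client answers baseline questions once, and each subsequent program's intake shrinks to its purpose-specific questions.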

One SPO adopted a research-based evaluation and management system to measure their service delivery outcomes in a hybrid service model: 

In terms of measuring, some of the core measures stay the same. It’s participation, numbers, completion numbers. But I also do find that, you know, in some ways, having using digital services makes metrics and feedback actually much easier. So it’s not just again, conversation, it’s things that are getting documented, they can be tracked, they can be, you know, data can be compiled and looked at. (leader, SPO, focus group) 

This leader also emphasized that the quantitative and qualitative data collected through an evaluation model could support her organization to capture the reasons for the program drop-off rate: 

We also know where there have been drop offs in the number of clients who can participate. And we’re asking those questions very directly through the phone and through the intake process. And trying to get a sense of how many people can’t participate because of some of the same reasons because of childcare, that they have children at home, they’re finding it difficult, or maybe they don’t have the technology and so on. So we’re really looking at ways of gathering both quantitative as well as qualitative data. So that’s really an important piece. (leader, SPO, focus group) 

In one refugee service organization, settlement practitioners designed three sets of surveys: an intake survey, an onboarding survey, and an exit survey. These surveys did not aim to quantify service outputs but to examine the impact of their service delivery on refugee clients.

In another SPO, one participant mentioned a phone-call approach to collecting service feedback.

Data management has a direct effect on service efficiencies and newcomer outcomes. Settlement practitioners in an employment program shared their service outcomes when they are able to collect and analyze data:

We decided to go to occupation-specific virtual job fairs. And this was really successful because we ended up with say, we wanted people in the early childhood education we invited for five employers and 25 to 30 clients. It was very good because they ended up going into specific rooms, and they ended up like, you know, talking each person introduced themselves, and hiring happened after that. So it’s like, you know, we need to think strategically about, like maybe making it less crowded, less people, but, and more put more emphasis on specific occupations or specific labor market demand, I think that’s the best way to do it. The virtual job fairs, big ones, are not gonna work like. (employment, SPO, focus group)

It is also important to note that many SPOs already use outcome measurement approaches and tools:

So we have got measuring approaches we’ve been using for years for virtual meetings, so it wasn’t new to us to sit like this with a group of people. We have our performance management, [which] is a web-based performance management system that we’ve been using for how many years for eight years. And so people were already being all the way down to our home visitor level. We’re accustomed to entering their performance data on a weekly basis online and for us to get feedback. (multicultural community, interview).

Apart from understanding staff’s digital capacity and establishing an outcomes management system, it is interesting to note that one refugee-serving organization did not limit itself to a specific measurement approach but expanded the notion of how to share outcomes:

One thing we’ve really had to think about is how do we kind of balance this sort of survey approach to impact measurement with genuine storytelling. It’s really important to protect the identities of the newcomers in our program. So where we’ve landed this year is we’ve started a podcast series. In the podcast series, most of the newcomers who are telling the stories of their experience connecting with volunteers, and what that experience has been like, and the impact, they’re doing it anonymously. And so in this way, we don’t have to share the newcomers’ image in a newspaper article or quote them by name, but we’re still really capturing that kind of qualitative data about how the program is landing. (refugee service organization, focus group)

Data, Outcome Measurement, and Evaluation Tools & Practice 

Besides existing outcome measurement approaches, it is essential to note that many SPOs have created customized outcomes measurement methods to reflect their service quality and adjust their hybrid service delivery practices. More should be done to establish baseline data management practices in the sector.

Data Maturity Models and Assessments

The Data Management Maturity Model helps organizations evaluate data practices and data maturity against documented promising practices, identify gaps, and improve data management. There are a number of different, competing models. All offer a framework of data management practices in key categories to help organizations benchmark data management capabilities. While we see the clear benefit of applying a Data Management Maturity Model in the immigrant-serving sector, due to time and resource constraints we were unable to conduct an in-depth review and provide recommendations.

Evaluating digital and hybrid service delivery

These resources could help the settlement sector develop the skills and assets required to tap into the different types of digital tools and processes that are available today to improve the way staff operates, while also supporting the mission of delivering quality services.

  • The Data Maturity Framework by DataKind and Data Orchard is a self-assessment tool to help charities better understand and alleviate the challenges of incorporating data into their efforts. The framework presents five stages of progress in data maturity for organizations — Unaware, Emerging, Learning, Developing, and Mastering — assessed across each of seven key themes: Data, Tools, Leadership, Skills, Culture, Uses, and Analysis.
  • Microsoft’s Nonprofit Digital Assessment Worksheet is designed to assess an organization’s use of technology and approach to important topics like privacy and security. 
  • Vitus Research and Evaluation (2021) created a tool for SPOs to use in order to better understand the effect of COVID-19 on their clients' lives. Another report by Vitus - Measuring Your Impact During COVID-19: Seven Practical Considerations for Virtual & Hybrid Programs - presents strategies and practical considerations for implementing a blended mobile, virtual, and/or hybrid program. The guide is meant to spark action and reflection around designing and developing digital services, including planning and potential pitfalls that are inevitable when operating programs at scale and complexity.
  • Homewood Research Institute (2020) proposes guidelines for evaluating the efficacy of mobile apps in youth mental health services, identifying 30 criteria to test effectiveness. The report highlights the importance of a human-centred design approach, working with youth from the outset to understand whether the design objectives and interaction design are appropriate. Ongoing developmental evaluation is also needed to make sure added features and modifications are appropriately evaluated.
  • The Ontario Centre of Excellence for Child and Youth Mental Health (the Centre) and Children’s Mental Health Ontario (CMHO) published a guidance report for evaluating and improving e-mental health services. The guide is categorized into 4 areas to consider for service evaluations: client level, service provider level, organizational level, larger environmental level. The guide also provides a checklist for evaluating e-mental health services. This checklist summarizes suggested areas, such as assessing and improving overall service outcomes, client engagement and satisfaction, staff skills in delivering virtual services.
  • Digital Mental Health Tools: Resources to Support Mental Health Clinical Practice (2020) by the Centre for Addiction and Mental Health (CAMH) also provides key questions and areas of concern to think about when delivering mental health support through digital tools. Areas include computerized interventions, wearable computing and monitoring devices, and telemedicine/telehealth. The report also presents useful resources from Canada, the UK, the US, Australia, and Hungary on app rating metrics, app assessment guidelines/frameworks, implementation resources, resources to improve communication, and resources for clients.
  • The Mental Health Commission of Canada’s Toolkit for e-Mental Health Implementation for Canada (2018) includes a collection of strategies for effectively planning and implementing e-mental health advancement in clinical practice. Five modules are presented: exploring e-mental health, a roadmap for launching e-mental health programs, building digital skills, engaging clients in e-mental health, and leadership for e-mental health innovation. These five modules reflect a complex and iterative mechanism rather than a linear one. Groundwork information, planning and feedback models, self-assessments, mini-case scenarios, and links to other tools are provided in each module.
  • Centre for Addiction and Mental Health (CAMH)’s brief on virtual and remote mental health care for older adults explores the effects of different virtual and remote technologies, offers some considerations for decision-making around enablers and barriers to use technologies, and intervention adaptations for older adults, such as adjusting online, smartphone/app or video game technologies, and video or teleconferencing according to the needs of older adults.
  • Finding Digital Mental Health Tools during the Pandemic (2020) by CAMH provides insights to consider while assessing e-health tools. Key questions to ask: Does the tool work? Where did it come from/where is it going? Are risks managed and addressed? What and who is it for? How do you get it? The webinar also highlights ways of optimizing the effectiveness of digital services through collaborative and iterative testing cycles. It emphasizes that clients need assistance in recognizing and choosing digital interventions that better address their needs, as well as in deciding when and how to use them. Therefore, service providers must be better equipped and well informed about digital tools and service delivery models.
  • The IFRC Data Playbook is a recipe book or exercise book with examples, promising practices, how-to’s, session plans, training materials, matrices, scenarios, and resources. The playbook provides resources for National Red Cross/Red Crescent Societies to develop their literacy around data, including responsible data use and data protection. The content aims to be visual, remixable, collaborative, useful, and informative. There are nine modules; each has a recipe that puts raw materials into suggested steps to reach a learning objective.
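A data maturity self-assessment of the kind described above (stages assessed per theme) can be sketched in a few lines. The stage and theme names below come from the DataKind/Data Orchard framework summary; the scoring logic itself is hypothetical and purely illustrative, not the framework's actual methodology.

```python
# Hedged sketch of a data maturity self-assessment in the spirit of the
# DataKind / Data Orchard framework: each of seven themes is scored 1-5,
# mapping onto the five named stages. Scoring logic is illustrative only.

STAGES = ["Unaware", "Emerging", "Learning", "Developing", "Mastering"]
THEMES = ["Data", "Tools", "Leadership", "Skills", "Culture", "Uses", "Analysis"]

def assess(theme_scores):
    """Map per-theme scores (1-5) to named stages, plus an overall stage."""
    per_theme = {t: STAGES[s - 1] for t, s in theme_scores.items()}
    avg = sum(theme_scores.values()) / len(theme_scores)
    overall = STAGES[round(avg) - 1]
    return per_theme, overall

# Hypothetical self-assessment for a small SPO.
scores = {"Data": 2, "Tools": 3, "Leadership": 2, "Skills": 2,
          "Culture": 3, "Uses": 2, "Analysis": 1}
per_theme, overall = assess(scores)
print(per_theme["Analysis"])  # Unaware -- the weakest theme, a place to invest
print(overall)                # Emerging
```

The value of such a breakdown is that the per-theme view, not the overall label, is what tells an organization where to invest next.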

The Ontario Centre of Excellence for Child and Youth Mental Health and Children’s Mental Health Ontario (CMHO) have also compiled some useful resources, including Evaluating and improving e-mental health services. They have also run useful information webinars:

  • Evaluation of virtual care in response to COVID-19
    This webinar outlined a province-wide evaluation of virtual care conducted between April and September 2020. Throughout the webinar panelists discussed what was working well, what wasn’t and how agencies and services providers can improve virtual services going forward.
  • Three approaches to ongoing monitoring and evaluation
    This webinar features panelists sharing three different approaches to the ongoing monitoring and evaluation of virtual care in child and youth mental health. This includes a developmental evaluation and two mixed-method evaluations looking at client and caregiver perceptions, in one case, and client and staff surveys in the other.

Assessing and Communicating Data: Literacy, Fluency, and Mindfulness

Underlying any Data Management Maturity Model or approach is data literacy and competencies. Statistics Canada has created a Data Literacy Competency Framework which provides “an overview of the definitions and competency frameworks of data literacy, as well as the assessment tools used to measure it. These are based on the existing literature and current practices around the world. Data literacy, or the ability to derive meaningful information from data, is a relatively new concept. However, it is gaining increasing recognition as a vital skill set in the information age. Existing approaches to measuring data literacy—from self-assessment tools to objective measures, and from individual to organizational assessments—are discussed in this report to inform the development of an assessment tool for data literacy.”

The Data Strategy Roadmap for the Federal Public Service in Canada (2019), released by the Government of Canada, provides recommendations for public services around four themes: stronger governance, improved data literacy and skills, enabling infrastructure and legislation, and more focused treatment of data as a valuable asset.

Additionally, the Canada School of Public Service Digital Academy offers further resources to develop data related skills in support of a data-literate workforce.

Data mindfulness can be defined as an active awareness about data. The concept was put forward by Ümit Mustafa Kiziltan, Chief Data Officer at IRCC, at the 2021 Data Conference. According to Kiziltan, data mindfulness has three dimensions: awareness of the potential of data; awareness of the impact that service providers could have through data; and awareness of the limits of data. The first dimension concerns the appreciation of data as a key driver in enhancing work processes, while the second regards the potential harm that could be done with data. The last is about developing an awareness of data limits, which requires a critical lens on the exploratory power of data. Rather, one should “come to data with a sense of what constitutes public value”.

We acknowledge that a value-driven approach to data is critical and should be further explored by the sector. This approach should also entail exploring potential data sharing frameworks across the sector and government that allow organizations to enhance their service evaluation.

In their 2019 report, Powered by Data, a coalition of civil society organizations that developed a Canadian policy agenda on administrative data for social impact, highlights four areas of potential use cases of administrative data in the social sector: outcomes evaluation, research and advocacy, data-informed program planning, and integrated service delivery. Their report highlights the importance of effective data sharing across traditional boundaries within nonprofits and also between nonprofits and government for effective use of data for the objectives mentioned above. As the report illustrated in one of the use cases, an organization providing health services to refugees and immigrants can access OHIP data, aggregated by catchment areas. Knowing how many refugees, new immigrants, and returning Ontario residents in a given catchment area have applied for OHIP or are on the 3-month waiting list could help anticipate how many people they will need to serve.

In a similar effort, the Alberta Nonprofit Data Strategy, a sector-wide collaborative initiative to build a knowledge-driven nonprofit sector, shared a vision for data use in the social sector. Its mission also includes a focused Newcomer Task Team, which scanned current data initiatives in the immigrant and refugee sector, and the nonprofit sector more broadly in Alberta, to document learning and assess gaps. The report shows that service providers recognize organizational data capacity as the ability and expertise to collect, use, and share data strategically and appropriately. The following indicators are identified as data capacity components by the organizations: knowledge of available data; access to appropriate technology and infrastructure; individual, organizational, and sector buy-in; and having dedicated staff, time, and funding to work with data intentionally. Recommendations by the organizations around successful data-sharing practice echo general concerns and suggestions present in our literature review: creating a data-sharing culture that is meaningful and impactful, rather than bureaucratic; enhancing staff’s data and technological literacy; decreasing confusion about what data can be shared; working collectively and more closely with funders to set the what, why, and how of reporting and data collection; and making sure trust is embedded in data-sharing activities.

While managing, using, and sharing data is crucial, equally important is ethical and secure use of it. Responsible Innovation in Canada and Beyond: Understanding and Improving the Social Impacts of Technology (2021) provides a comprehensive guide to help the general public and the private and public sectors in their decision making pertaining to ethical and safe use of technology. It incorporates shared considerations, challenges, frameworks, and promising practices for improving the social impact of technology from a wide variety of perspectives.

Open Data Institute (2019) published the Data Ethics Canvas. It provides a high-level framework for identifying and assessing the ethical implications of any data activity within organizations handling personal information of their clients. 

The World Economic Forum’s latest Ethics by Design: An organizational approach to responsible use of technology (2020) report shares key insights to assist in the shaping of organizational decisions to encourage stronger and more routine ethical behaviour. Instead of focusing entirely on staff's personal character, the report promotes an approach that focuses on the environments that can lead ordinary people to engage in more ethical behaviours, such as organizational culture. The study discusses operational design measures and guidelines that have been seen to be more successful than traditional methods including compliance training and financial incentives.

While a general understanding of ethical and safe data use is valuable, our review suggests that sectors and organizations are increasingly expected to create their own data ethics standards, practices, and procedures.

The OECD's Good Practice Principles for Data Ethics in the Public Sector (2021) highlights the importance of data ethics in the public sector as well as its practical implications. It lays out ten promising practices for public officials to follow when implementing data ethics in digital government programs and services.

Databilities is an evidence-based data literacy competency framework. It has been used to understand organizations’ data literacy. 

Researchers from Dalhousie University defined data literacy as “the ability to collect, manage, evaluate, and apply data, in a critical manner.” Another organization recognizes it as “a skill that empowers all levels of workers to ask the right questions of data and machines, build knowledge, make decisions, and communicate meaning to others.”

How to assess and measure data literacy skills in organizations has been a key topic. The Global Data Literacy Benchmark (2020) has been used to understand data literacy in public service organizations in Australia, Canada, India, the UK, and the US. Statistics Canada describes the framework as the most comprehensive assessment tool of individual data literacy in the world. The framework outlines three areas: reading, writing, and comprehension. For each competency within the framework, there are up to six levels of proficiency. Based on proficiency, the framework identifies three cohorts of employees: the Curious, the Confident, and the Coaches (from low to advanced data literacy skills). The study found that organizations should identify opportunities to amplify the skills of the Coaches so they can reach more of the Curious and become an active part of the organization’s data literacy campaign. The Confident should be supported and encouraged to stretch toward becoming future Coaches, while the Curious should be encouraged to engage with data literacy concepts, with opportunities created for them to learn and seek guidance from their peers.

While conversant in the “people, process and technology” capabilities of organizational change, most executives and professionals do not yet “speak data” fluently, the new critical capability of a digital society.

Data, Outcomes Measurement, and Evaluation - Relevant Recommendations

  • Recommendation 1: Develop a roadmap to support organizational digital transformation -- Now: The sector should review existing Digital Maturity Models, Data Maturity Models, Digital Inclusion, and Digital Literacy models from within and outside nonprofit sectors to curate and customize models for the sector.
  • Recommendation 4: Establish baseline sector competencies -- Next: Explore models of digital transformation, digital and data maturity, hybrid service delivery in other non-profit and the private sector to bring the best and most relevant expertise into the sector.
  • Recommendation 3: Establish a hybrid service delivery lead at IRCC -- Later: Evaluate, incorporate, and establish digital and data maturity models into SPO program planning, funding, and operations, including active evaluation, learning, and knowledge mobilization of existing digital and hybrid service delivery in the sector. In particular, evaluation, learning, and knowledge mobilization attention should be paid to organizations in areas where digital and hybrid service delivery pre-dated COVID, such as pre-arrival, blended and remote language learning, and existing digital efforts funded by IRCC and other funders.
