(This is the second in a series of posts from the Settlement Sector & Technology Task Group's final report: From Silos to Solutions: Toward Sustainable and Equitable Hybrid Service Delivery in the Immigrant & Refugee-Serving Sector in Canada. Over the coming days/weeks, I will be extracting thematic sections from the report and posting them as articles to make them more accessible. For each key theme, we provide an introduction, sector perspectives (from interviews and focus groups), a number of useful tools and practices that helped guide our recommendations and that can help the sector and IRCC put the theme into practice, and a list of specific recommendations relevant to the theme.)
Addressing digital inclusion is complex. It requires recognition of the challenge and sustained effort to address it. No single strategy or method can address all populations’ needs. Instead, Service Providing Organizations (SPOs) need the flexibility to localize and customize their programs. Digital inequity is multifaceted, and intersects with culture, gender, age, class, and educational background.
The sector wants to look at technology through a social justice and service lens. It recognizes that there is a digital divide: not everyone has digital skills, has access to technology, or wants to access services through technology. Much has been written, identified, and codified in other human service sectors that can and should be readily transferred to the immigrant and refugee-serving sector. We outline these resources throughout the report; they can be borrowed, replicated, and customized for the settlement sector. Developing guidelines for professional practice on the use of technology in human service delivery is essential.
Frontline practitioners work directly and closely with their clients and comprehensively understand clients’ barriers and needs. Since March 2020, settlement frontline practitioners have worked to help clients access virtual services and participate in virtual learning. They have become digital service trainers, digital navigators, digital equity advocates, cybersecurity consultants, digital coaches, mentors, and more. They are adult educators who invest enormous effort in building practical and equitable digital services and online learning spaces that some of them and their clients had not previously explored.
Our preliminary report identified that settlement practitioners gradually acquired digital technology knowledge and integrated digital skills into service delivery. Digital literacy is a foundational skill that must be developed in a hybrid service delivery model, alongside established skill sets. This section reviews the discussions and available digital literacy frameworks to strengthen support for digital inclusion in the sector.
Digital literacy has been described by UNESCO as “the ability to define, access, manage, understand, integrate, communicate, evaluate and create information safely and appropriately through digital technologies for employment, decent jobs and entrepreneurship. It includes competencies that are variously referred to as computer literacy, ICT literacy, information literacy and media literacy." It entails the “ability to identify and use technology confidently, creatively and critically to meet the demands and challenges of living, learning and working in a digital society.”
Digital literacy encompasses not only access to technology but also clients’ ability to present themselves, perform tasks, and comprehend information in digital environments. It extends beyond simple digital consumption behaviour into digital fluency. Royal Roads University’s education technology expert Clint Lalonde describes the difference between digital literacy and digital fluency as: “In learning a foreign language, a literate person can read, speak, and listen for understanding in the new language. A fluent person can create something in the language: a story, a poem, a play, or a conversation. Similarly, digital literacy is an understanding of how to use the tools; digital fluency is the ability to create something new with those tools.”
Accenture suggests a similar approach that “digital fluency should be thought of in a manner similar to how people use languages. If someone is literate in a language, they understand the basic tools of speech, such as reading and speaking. However, if someone is fluent in a language, they are able to create something new with the tools, such as craft a poem or engage in robust conversation. Fluency unlocks newfound knowledge, creativity and innovation that literacy cannot enable on its own.” They define digital fluency as measurable within “an integrated framework measured by your digital workforce’s technology quotient (TQ) + digital operations + digital foundations + digital leadership and culture. When all four facets are in place, workers gain agility, and the organization leads in key performance metrics such as innovation and customer service.”
We recommend three important components for digital literacy assessment: 1) defining the concept of digital literacy for the sector, recognizing the uniqueness of clients in different programs; 2) contextualizing assessment in settlement practitioners’ daily work practices so that it serves their programs’ goals; and 3) considering clients’ intersectional identities when designing the assessment, including race, gender, educational and professional background, sexual orientation, disability, and cultural practices.
We recommend that the sector not only explore and define digital literacy and skills but also digital fluency. Digital literacy assessment should recognize nuances of newcomer diversity in digital fluency. In discussions with the sector, it became clear there is much interest in establishing Digital Literacy Benchmarks (DLB) for newcomers that complement Canadian Language Benchmarks (CLB). Creating a similar set of benchmarks might allow SPOs to quickly and accurately assess the digital literacy levels of newcomers to guide and support them accordingly. A newcomer-centric DLB model does not currently exist, but there are efforts both within and outside the sector to describe, assess, and benchmark digital literacy/fluency competencies that should be explored.
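To make the DLB idea concrete, a tiered, CLB-style benchmark assessment could be sketched as follows. This is a hypothetical illustration only: the level names, the skill checklist, and the `assess` helper are invented for this sketch and do not come from the report or from any existing benchmark.

```python
# Hypothetical sketch of a newcomer-centric Digital Literacy Benchmark (DLB).
# Levels and skills below are illustrative placeholders, not a real standard.
from dataclasses import dataclass

@dataclass
class DLBLevel:
    level: int
    description: str
    skills: list[str]  # skills a client must demonstrate to reach this level

# Levels are cumulative, in the spirit of the Canadian Language Benchmarks.
LEVELS = [
    DLBLevel(1, "Foundational access", ["turn on a device", "connect to Wi-Fi"]),
    DLBLevel(2, "Basic use", ["send an email", "fill in a simple web form"]),
    DLBLevel(3, "Independent use", ["join a video call", "manage privacy settings"]),
]

def assess(demonstrated: set[str]) -> int:
    """Return the highest level whose skills are all demonstrated (0 = none)."""
    achieved = 0
    for lvl in LEVELS:
        if all(skill in demonstrated for skill in lvl.skills):
            achieved = lvl.level
        else:
            break  # a gap at one level caps the benchmark there
    return achieved

print(assess({"turn on a device", "connect to Wi-Fi", "send an email"}))  # prints 1
```

A rubric like this would let an SPO place a client quickly and decide which supports to offer next, while the skill lists themselves would need to be defined by the sector with clients' intersectional identities in mind.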
Competencies for Digital Performance
The eSkills.ca project developed a useful synthesis of current digital literacy models and outlines a competency framework that incorporates digital fluency:
The OECD Skills Research framework for Digital Intelligence (DQ) offers a global framework for digital intelligence that includes a common set of definitions, language, and understanding of comprehensive digital literacy, skills, and digital readiness, organized into eight areas and 24 competencies composed of knowledge, skills, attitudes, and values.
Critical literacies
Several guidelines exist to build individual and organizational digital capacity to support digital transformation efforts. JISC UK (2019) outlined key digital capabilities:
In their recent study exploring the use of interactive and social media tools in the community work sector in the Greater Toronto Area (GTA), Jeremic and Bouchard (2019) highlight that a critical digital pedagogy approach focuses on empowering digital users and using digital technologies through a social justice lens. The authors argue that critical digital pedagogy ensures that future practitioners will develop digital fluency skills, equipping them to better respond and adapt to technological change.
Assessing clients’ digital literacy skills
Measuring clients’ digital literacy skills can be challenging, since this measurement differs from a language proficiency assessment. Despite the difficulties, many SPOs have attempted to measure clients’ literacy skills:
To assess digital levels, I had briefly reached out to different settlement practitioners from different agencies, as well as talking to our settlement practitioners. And I had one extra partner in this process. So that partner because he was a settlement worker himself, he was going directly to clients, and then doing kind of, I would say, an informal focus group. It was with, like a very small number of people…. Basically, we were trying to figure out exactly the levels that they were at, I found that within different agencies, depending on who their focus was, I found some settlement practitioners’ clientele to be like they had a lot more access and a lot more digital literacy than our own clients. Like our clients had no digital access. Some of them had never seen a computer before. So with that, we had to kind of devise our different strategies. I know for age groups, our youth program, population, they were well versed in technology. So they didn’t face the same kind of struggles as some of our settlement practitioners and some of our senior workers. (newcomers health, SPO, focus group)
Creating a baseline for digital literacy assessment means examining how digital literacy skills are interrelated with digital accessibility:
We went along and talked to our clients, and these are pieces that actually came to some of our forums, and in terms of evaluating it. And in terms of getting feedback from clients, some of our forums included a section that said, where would you like us to meet you? What’s the best place? What’s the best platform to meet you at? And that is how we then came across and said, why does this group need this, and this group needs that? And then we trained our staff to be able to run those platforms for those groups. (schools and libraries, focus group)
Here is another example that explains a similar message:
So the things that they have done to us kind of come along with their needs of technology and access. We’re finding that there’s less of a need in regards to computers, but still a need for digital literacy. I’m not saying that they’re able to use the computer, but they have one. So those are the differences. There’s still an understanding of whether they need assistance when they’re at home, do they have someone that’s home that can give them more assistance, or if there’s nobody at home, and then we have a sense of whether that person would need more help. What they helped us with is enlisting our settlement counselors to kind of assist more with translation. If there was a need, and it was an area that somebody couldn’t understand or grasp, we would bring in a settlement counselor to help us with translation that spoke their language. And then we could work through the difficulties and problem solve with them. So we’ve had to kind of leverage each other’s services in order to really meet the needs of the client. (technology, SPO, focus group)
In language service programs, instructors indicated that limitations appeared when teachers tried to create such benchmarks. It is important not simply to establish competencies, but to determine how they will be applied in curriculum and pedagogy:
The kind of testing, idea, sort of questioning, but what are you gonna do with the test results? How is it really going to inform your teaching? And how is it really going to help you cement that relationship and trust that you need to have with the learners who come back next week and the week after and persist for something that isn’t very easy to do. And [we should] try to get them to focus on the right bit of the equation. They need to know how to use the keyboard or they need to know how to use the mouse don’t really establish that relationship. (adult literacy organization, interview)
Digital literacy is also tied to literacy in general. One informant shared his experience of working with refugees who were not literate in their first language, something he had never encountered in his previous work. He underlined the importance of doing self-guided research and fully understanding clients’ nuances:
One stat I recently read, which really just helped make things click is, if we look at the refugees recently targeted by the Government of Canada for settlement, is close to 20% don’t have functional literacy in their first language. That’s interesting, right? Now, if you think about, like, our response to COVID. And we just translate something into another language. Like how, how foolhardy is that, right? A fifth of your audience doesn’t read the language? Why did you bother translating it? Some of the things we need to be thinking about is, if we’re gonna push the message to people, well, maybe we should be using WhatsApp voice messages, because that’s what’s going to be effective. It doesn’t presuppose literacy. And that literacy bias is so huge. We have that in so many different areas, and we don’t even realize that we just default to assuming people read something. And it’s like, oh, if I translate it into this language, that’s it. My job is done. Realizing that’s probably just as opaque to the person as the English document you had. Right. And so you just spent a lot of your time and effort doing something that’s utterly fruitless. (refugees serving organization, SPO, interview)
Measuring settlement practitioners’ digital literacy level
Assessing digital literacy does not stop at the client level; it is equally important to measure settlement practitioners’ digital literacy levels.
In a language service program, virtual methodology and pedagogy were viewed as the crucial components when evaluating settlement practitioners’ digital skills:
The most difficult part we are right now at the stage of pulling into the digital pedagogy, the methodology of online teaching how to engage make the learners an engaged learner, again, we move the learners from teaching them how to hold the mouse and click into how do you actually participate. And this is a big challenge. (language services, SPO, focus group)
When baselining or benchmarking digital literacy skills, it is also essential to evaluate how they are being used in practice:
So any perspective you have on, you know, the investments that are required in order to the way I put it is like everyone needs a floor of digital literacy and competency in order to do this work. So how do we get everybody to that level? And that includes frontline as well as leadership, right? And I actually created a leadership course, within it to teach, which I’ve been delivering for five years now. It is about what do you kind of walk people through to help them understand what their role is and where this is beneficial to them for the newcomers and for the staff? But what am I missing here, is to evaluate the role of evaluation. I think this is essential communication. (language services, SPO, interview)
Though many SPOs have started providing digital literacy training to their staff, challenges remained in measuring workers’ digital literacy levels after training had been delivered:
We have provided digital literacy training to our staff, as well as clients, for those individuals that were really struggling. So we put them through training almost three times over. But like you say, unfortunately, some individuals just have a level that they get stuck at the, you know, the plateau. But they’re doing the basics. So I can’t complain in the sense that the delivery has been impacted. It’s just maybe not as advanced as technologically inclusive of some of the other levels we have, but it’s there. And it’s been used by the clients and the instructor themselves. We have, I would say in the majority have come a long way. Huge from where we started. (technology, SPO, focus group)
One respondent suggested that evaluating workers’ digital skills is an ongoing and collaborative act:
It certainly for most of the people here know, having the ability to work with, you know, Microsoft Office skills. There are some minimum standards that if you don’t have them, you don’t get past the first hiring process. And we’re always evaluating. So for each position, what is the technology baseline for this position, and some, of course, require much more in terms of their abilities to be able to work in Adobe and to be able to do whatever it may be. So that’s my caution and encouragement that those baselines are important, but there needs to be a mechanism that organizations revisit and re-evaluate all the time. And it shouldn’t be just the government leading it, or the sector doing it. But it’s got to be a national working group, like the Technology Task Group to come up with some solid suggestions for actions. (health service organization, focus group)
In order to measure digital literacy skills for both clients and settlement practitioners, participants emphasized the idea of digital recredentialing:
How about certification, like certification requirements for staff just to make sure that we do not have holes in our staff education. People have learned how to use Zoom, but they still don’t understand how the devices work on the computer. So if they had a very basic understanding of those through the certification levels. Like level one, you understand basic computer literacy; Level two, you need to type at, let’s say, 25 to 30 words a minute. Level three, and so on. So just as a suggestion. And then, of course, seeing the requirements obviously, both time out of staff’s days, and also funding for that. (technology, SPO, focus group)
Assessing and Supporting Clients’ Digital Literacy
Developing tools for assessing digital literacy is necessary but not easy. Many nuances and details shape the measurement components and methods needed for a valid and reliable evaluation. We have found a variety of digital literacy competency frameworks introduced by international organizations, national or subnational bodies, and private institutions. The conceptual models outlined below demonstrate the greatest potential for adaptability and relevance to the settlement sector’s needs.
Within the sector
Canadian Resources
The British Columbia Digital Literacy Framework elaborates on six characteristics identified by B.C. educational leaders. These characteristics are based on the National Education Technology Standards for Students (NETS-S) standards developed by the International Society for Technology in Education (ISTE) and encompass the types of knowledge and skills learners need to be successful in the 21st century.
Mapping Digital Literacy Policy and Practice in the Canadian Education Landscape (2015) synthesizes the key concepts and existing promising practices in digital literacy education contexts in Canada to provide insights for those planning to develop digital literacy frameworks.
The Use, Understand, and Create Digital Literacy Framework for Canadian Schools provides a roadmap for teaching digital literacy skills in Canadian schools. The framework draws on seven key aspects of digital literacy – ethics and empathy, privacy and security, community engagement, digital health, consumer awareness, finding and verifying, and making and remixing – and provides teachers with supporting lessons and interactive resources linked to curriculum outcomes for every province and territory.
At the international level
Our review outlined existing conceptual frameworks for the role of service providers to help build client digital literacy:
Self-assessing