Covid-19 created cascades of shortages, disruptions, and problems that rolled downhill and landed in the most vulnerable neighborhoods, where it's often nonprofit organizations that provide services to members of the community. The pandemic accelerated the need for digital transformation throughout the economy, and the nonprofit sector was no exception: it had to innovate nearly overnight. As experts on the use of technology for social good, we've observed the many ways that nonprofits have been adopting "smart tech" to further social change in the wake of the pandemic, which we chronicle in our upcoming book, The Smart Nonprofit.
We use “smart tech” as an umbrella term for advanced digital technologies that make decisions for people. It includes artificial intelligence (AI) and its subsets and cousins, such as machine learning, natural language processing, smart forms, chatbots, robots, and more.
The use of smart tech by social service agencies and other nonprofits exploded during the pandemic. For example, food banks deployed robots to pack meals; homeless services agencies used chatbots to give legal and mental health advice; and fundraising departments turned to AI-powered software to identify potential donors.
When the pandemic began and schools switched to remote learning, many students who relied on school lunches could no longer receive them. Here's where nonprofits stepped in to use smart technologies for social good. For example, researchers at Carnegie Mellon University used machine learning to flip the school-bus system on its head: instead of buses delivering children to schools, they computed new routes to bring meals to children across the Pittsburgh area in the most efficient way.
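The routing idea can be sketched with a simple heuristic. This toy example is our own illustration, not CMU's actual method (which used machine learning and far richer data); it orders hypothetical meal stops with a nearest-neighbor rule so one bus covers every stop with little backtracking:

```python
# Toy sketch of route optimization: order meal stops greedily so a bus
# always drives to the closest remaining stop. All coordinates are
# hypothetical; a real system would use maps, traffic, and capacity data.

import math

def nearest_neighbor_route(depot, stops):
    """Visit stops greedily, always driving to the closest remaining stop."""
    route, current, remaining = [depot], depot, list(stops)
    while remaining:
        nxt = min(remaining, key=lambda p: math.dist(current, p))
        remaining.remove(nxt)
        route.append(nxt)
        current = nxt
    return route

# Hypothetical depot and four meal pickup points (x, y).
depot = (0.0, 0.0)
stops = [(5.0, 5.0), (1.0, 1.0), (6.0, 5.0), (2.0, 1.0)]
route = nearest_neighbor_route(depot, stops)
# route: (0,0) -> (1,1) -> (2,1) -> (5,5) -> (6,5)
```

A nearest-neighbor heuristic is only a baseline; the point is that ordering stops by proximity, rather than by a fixed school schedule, is what "efficient meal delivery" means computationally.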
The use of chatbots to provide support and deliver services to vulnerable populations increased tremendously during the pandemic. For instance, the Rentervention chatbot was developed by legal aid nonprofits in Illinois to help tenants navigate eviction and other housing issues they were experiencing due to Covid-19. It also directs renters to pro bono legal advice.
At many nonprofits, smart tech is becoming integrated into internal workflows, fundraising, communications, finance operations, and service delivery efforts. Smart tech is currently best used for rote tasks in nonprofit organizations, such as reconciling expense reports and answering the same questions online using a chatbot (e.g. “Is my contribution tax-deductible?”) — freeing up staff to focus on other activities. We call this benefit the “dividend of time,” which can be used to, say, reduce staff burnout, get to know clients on a deeper, more human level, and focus on deeper societal changes that need to be made, such as addressing the root causes of homelessness in addition to serving homeless people. For example, when Covid-19 hit, Doctors Without Borders/Médecins Sans Frontières (MSF), the international humanitarian group dedicated to providing medical care to people in distress, created an online chatbot to answer common questions about the pandemic. This freed up staff to respond to a huge increase in conversations within their social media community around mental health, anxiety, and other well-being issues.
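The kind of rote question-answering a chatbot can take off staff's plates is simple to sketch. This toy example (our own illustration, not any nonprofit's actual bot, and the answers are made up) matches an incoming message against a small FAQ list by keyword overlap and falls back to a human when nothing matches:

```python
# Toy FAQ chatbot: answer common donor questions automatically and hand
# anything unfamiliar to a staff member. Questions and answers are
# hypothetical placeholders.

FAQS = {
    "is my contribution tax-deductible":
        "Yes, we are a registered 501(c)(3), so donations are tax-deductible.",
    "how do i get a donation receipt":
        "Receipts are emailed automatically; contact us if yours didn't arrive.",
    "what are your office hours":
        "We are open Monday through Friday, 9am to 5pm.",
}

def answer(message, faqs=FAQS):
    """Return the stored answer for the FAQ sharing the most words."""
    words = set(message.lower().replace("?", "").split())
    best = max(faqs, key=lambda q: len(words & set(q.split())))
    if not words & set(best.split()):
        # No overlap at all: escalate to a person instead of guessing.
        return "Let me connect you with a staff member."
    return faqs[best]

reply = answer("Is my contribution tax-deductible?")
```

Production chatbots use natural language processing rather than word overlap, but the division of labor is the same: the machine handles the repetitive questions, and people handle the conversations that need judgment.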
The development of technology often putters along quietly until an inflection point is reached, where the cost of the technology suddenly decreases while its commercial applications increase. This adoption process is often drawn in the shape of a hockey stick. We are sitting at the heel of that stick right now as the use of smart tech has begun to skyrocket. And as more and more nonprofits use smart tech, nonprofit leaders will need to have their eyes wide open about both the benefits and the risks of using new smart technologies.
People tend to think of the work done by computers and robots as incapable of being swayed by emotions, and therefore incapable of being biased, sexist, or unfair. However, the code that powers smart tech was at some point created by people, and it carries forward their opinions, assumptions, and biases, whether implicit or explicit. As the renowned data scientist Cathy O'Neil says, "Algorithms are opinions embedded in code." We call the bias created inside of smart tech systems embedded bias. There are two main reasons embedded bias is prevalent. First, programmers, who continue to be overwhelmingly white men, make literally thousands of choices under the hood of smart tech that the rest of us can't see. Second, smart tech requires massive data sets to learn to recognize patterns and make decisions, and those data sets reflect the world as it has been, not as it should be.
Many large data sets in social service areas like housing or hiring were racist by design. When organizations use these data sets to teach smart tech to recognize patterns, they unwittingly pay historic racism forward. And once bias is baked into smart tech, not only is it likely to stay there forever; it becomes self-reinforcing as the system looks for the same patterns over time.
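The self-reinforcing loop can be made concrete with a deliberately tiny sketch. This is our own hypothetical illustration, not any real system: a "model" learns approval rates per neighborhood from biased historical decisions, denies the next applicant on that basis, and its own denial then feeds back into the training data, deepening the disparity at the next retraining:

```python
# Toy illustration of embedded bias: biased history in, biased decisions
# out, and the decisions reinforce the bias. All data is hypothetical.

from collections import defaultdict

# Hypothetical past loan decisions, skewed by zip code rather than merit:
# zip "A" was mostly approved, zip "B" mostly denied.
history = [
    ("A", True), ("A", True), ("A", True), ("A", False),
    ("B", False), ("B", False), ("B", False), ("B", True),
]

def train(records):
    """'Learn' an approval rate per zip code from past decisions."""
    approvals, totals = defaultdict(int), defaultdict(int)
    for zip_code, approved in records:
        totals[zip_code] += 1
        approvals[zip_code] += approved
    return {z: approvals[z] / totals[z] for z in totals}

def decide(model, zip_code, threshold=0.5):
    """Approve only if the learned rate for that zip code clears the bar."""
    return model[zip_code] >= threshold

model = train(history)
decision_for_b = decide(model, "B")   # denied: the old pattern carries forward

# Feedback loop: the system's own denial joins the data, pushing zip "B"'s
# approval rate even lower before the next retraining.
history.append(("B", decision_for_b))
retrained = train(history)
```

Real systems use far more features and far more data, but the mechanism is the same: nothing in the loop ever asks whether the historical pattern was fair in the first place.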
Making strategic decisions about when and how to use smart tech is therefore a leadership challenge, not a technical problem. The consequences of automating systems and processes range from losing the ability to make judgment calls (e.g. giving unusual job candidates a chance) to introducing flat-out bias against people of color (e.g. risk assessment tools used by judges and parole boards ranking Black defendants at much higher risk of recidivism than white defendants). Nonprofit leaders need to pledge to "do no harm" in using smart tech, and not to wait for something bad to happen before looking for warning signs. We call the nonprofit organizations that are using smart tech responsibly "smart nonprofits." The kind of leadership required to lead these organizations is:
Human-Centered: These leaders take a human-centered approach to adopting new technology by finding the sweet spot between people and smart tech, while ensuring that people are always in charge of the technology.
Prepared: These leaders must actively reduce bias embedded in smart tech code and systems. A thoughtful, participatory process is required to select values-aligned systems, vendors, and consultants.
Knowledgeable and Reflective: These leaders make learning about what smart tech is and what it does an ongoing process in the boardroom, the C-suite, and among the staff. Once automated systems are in place, leaders need to be vigilant about whether the technology is performing as intended, or whether unintended consequences have arisen, and how clients and end users ultimately feel about the systems.
While smart tech helped scores of nonprofits to pivot to suddenly remote and digital delivery of programs and services at the start of the pandemic, it may also enable us to turn the page on an era of frantic busyness and scarcity mindsets to one in which nonprofit organizations have the time to think and plan — and even dream. We have a once-in-a-generation opportunity to remake work and focus on social change, and it requires people and organizations who are thoughtful and knowledgeable about the use of smart tech.