Technological Innovation is Not Always Social Progress

By Hannah Summers

April 2, 2021

Fundraisers gathered around their computer screens once more last week for the annual Nonprofit Technology Conference. While virtual conferences will never match the in-person experience, NTC did not lack for thought-provoking conversations.

The three-day conference kicked off with keynote speaker Ruha Benjamin, who examines how science and technology reinforce racial inequality. As new technology is constantly being introduced into the fundraising space, it’s important for us to be aware of the implications that come with technological advancements. As Benjamin stated, technological innovation does not mean progress. We often equate the development of new technologies with social progress, but the two are not the same – the technologies we develop often reinforce social inequalities.

In the nonprofit space particularly, we often think about ways of using “do-gooding” data – data framed as serving the best interest of the public. For example, in St. Paul, Minnesota, the police department and school system joined forces to create the Innovation Project, which aimed to predict which young people were likely to have future involvement with law enforcement so that their unmet needs could be addressed. Without much critical evaluation, this seems like a good system – using data and technology to intervene proactively. But a closer look shows the data sources themselves were already harmful: young people’s suspension records, attendance records, involvement with Child Protective Services, and family involvement with the legal system. That data reflects the symptoms of institutions that are not working in the best interest of the people being discriminated against. The project was ultimately brought to an end, but it shows why we must do a better job of learning the history behind a technological development before adopting it.

As we adopt predictive technology and artificial intelligence in the fundraising and nonprofit space, we need to be accountable for interrogating algorithms that can encode racism. These technologies are often built on dirty data and simply project the patterns of the past forward. And the people creating them already monopolize resources and power, encoding their own vision of the world into these tools.
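To make the “patterns of the past” point concrete, here is a minimal, purely illustrative sketch. The neighborhoods, numbers, and scoring rule are invented for this example and are not drawn from the St. Paul project or any real tool; the point is only that a model trained on records shaped by uneven enforcement hands that unevenness back as a “prediction.”

```python
# Hypothetical sketch: a "risk score" trained on historical records
# reproduces the bias already baked into those records.
# All names and numbers here are invented for illustration only.

from collections import defaultdict

# Historical training records: (neighborhood, had_prior_contact_with_system).
# Neighborhood A was more heavily policed, so its residents show up with
# "prior contact" far more often -- not because behavior differed, but
# because enforcement did.
history = (
    [("A", True)] * 80 + [("A", False)] * 20 +
    [("B", True)] * 20 + [("B", False)] * 80
)

# "Train" the simplest possible model: the rate of prior contact per group.
contacts = defaultdict(int)
totals = defaultdict(int)
for neighborhood, prior_contact in history:
    totals[neighborhood] += 1
    contacts[neighborhood] += prior_contact

def risk_score(neighborhood: str) -> float:
    """Predicted 'risk' is just the historical contact rate for the group."""
    return contacts[neighborhood] / totals[neighborhood]

# Two young people with identical behavior get very different scores,
# because the data encodes where enforcement happened, not who did what.
print(f"Risk score, neighborhood A: {risk_score('A'):.2f}")  # 0.80
print(f"Risk score, neighborhood B: {risk_score('B'):.2f}")  # 0.20
```

Real systems are far more complex than this toy example, but the dynamic is the same: if the training data records enforcement rather than behavior, the “prediction” is just the past wearing a new label.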

We are all pattern makers. When our actions are influenced by bias – conscious or unconscious – then technology trained on and fed data about our behaviors will produce biased results, not “race-blind” ones. We have the power to make new patterns and live in new ways – and if we do, technology can follow.

