Grantee Profile: Good Data Collaborative

In Profiles & Interviews by Krysten Crawford

Like most businesses, nonprofits love data—or desperately want to. But Laura Walker McDonald has a blunt message for leaders, donors, policymakers, researchers, and service providers who see data as an elixir for doing good.

Don’t, she says, collect data for data’s sake. Think through what kind of information you want—or actually need—to collect, all of the different ways you are gathering it, who gets to see and share it, and, just as important, at what point you need to get rid of it.

Laws like the new European Union General Data Protection Regulation aren’t the only reason. The ethics of working with often-vulnerable populations demand it.

“There are legal, operational, and reputational risks and nobody is handling them well,” says Walker McDonald, the director of innovation at the Global Alliance on Humanitarian Innovation. She cites a number of recent security breaches, including last year’s release of detailed private information on aid recipients at Catholic Relief Services in West Africa.

Walker McDonald isn’t the only one sounding the alarm. A small but growing number of nonprofit players are looking to raise awareness of what is increasingly known as “responsible data.” Two years ago, four of them—the Center for Democracy & Technology, The Engine Room, the Future of Privacy Forum, and SIMLab, which Walker McDonald ran until it shut down last year—came together to launch the Good Data Collaborative.

The Collaborative’s overarching goal is to jumpstart a much-needed conversation across the sector—especially among smaller civil society players—about responsible data.

“Questions around responsible data management can be very hard for one person at a small organization to wrap his or her head around, especially when there are a million other things to do,” says Natasha Duarte, a policy analyst at the Center for Democracy & Technology and a member of the Collaborative team. “We wanted to take the first step of figuring out what it is that people need to begin to manage data responsibly and to fill any gaps in those needs.”

The Collaborative was supported by a 2016 Digital Impact Grant. Digital Impact, an initiative within the Digital Civil Society Lab at the Stanford Center for Philanthropy and Civil Society (Stanford PACS), helps fund research teams and nonprofit organizations looking to advance the safe, ethical, and effective use of digital resources for social good. With the support of the Bill & Melinda Gates Foundation, Digital Impact has given more than $475,000 in grants since 2016.

Pushing back on donors

What, exactly, does responsible data mean? The Collaborative defines it as “the duty to ensure people’s rights to consent, privacy, security and ownership around the information processes of collection, analysis, storage, presentation and reuse of data, while respecting the values of transparency and openness.”

In addition to revamping the Responsible Data website, the Collaborative reviewed 18 existing responsible data policies and identified the key themes they share, including respect for individual rights, autonomy, and accountability. The Collaborative also interviewed nearly 30 civil society players, researchers, donors, and nonprofit leaders; the interviews revealed widespread agreement on the need to manage data responsibly, but also on the challenges of doing so.

Walker McDonald says nonprofits are hobbled by the “starvation cycle” of too-little time and money to craft policies and to ensure those policies are followed—by boards of directors, software vendors, and iPad-wielding workers on the ground. Many interviewees described a do first, ask questions later mindset around data collection, and confessed to feeling overwhelmed, apprehensive, and eager to offload the job to someone else.

Another obstacle the Collaborative highlights: donors often think of data as a panacea and expect nonprofits to gather as much of it as possible. “Some of it is a lack of understanding that [data] could have potential negative consequences or unintended consequences,” said Zara Rahman, the research, engagement and communities team lead at The Engine Room, during a recent Digital Impact virtual roundtable about the Collaborative’s work.

Donors, added Rahman, need to “think of data not as a massive asset but rather as a potential kind of liability or something they should be wary of [and] think about carefully.”

Driving change

The timing of the Collaborative’s campaign could prove auspicious. Even as the number of tools and services aimed at helping nonprofits leverage data grows, so too does the number of publicized incidents of privacy violations and other data problems in philanthropy. The Responsible Data website features “reflection stories” about a handful of mishaps and their main takeaways.

Europe’s new data protection law has also spurred more discussions about responsible data, even among nonprofits that are not directly affected by it, according to Duarte of the Center for Democracy & Technology. “The idea that organizations are going to have to take their data management very seriously isn’t abstract anymore,” she says.

Duarte, for one, hopes that civil society will one day model practices that for-profit companies ultimately adopt.

“There’s an opportunity here to lead the way,” she says, “and create a set of basic rights that recognizes that people have a right to their privacy, their autonomy, and their dignity.”

In June, The Engine Room invited the responsible data community to share lessons and challenges of developing and implementing responsible data guidelines. See notes from the community call here.


Krysten Crawford is a freelance writer and editor based in the San Francisco Bay Area.

Image by Art by Lønfeldt via Unsplash (CC BY 2.0)

Comments

  1. I’m eager to learn what comes from this effort. A big concern for our team at Caravan Studios is: how do we ensure that people remain owners of their own data and the intelligence and insights that may be found in aggregated data? It’s too easy to collect information and leave, putting it into a report or folding it into broader decision making. How do we make sure that data remains with the community members — and the community — who produced it? How do we prioritize their insights when they review the data?

    That is, of course, all layered on top of meaningful consent and the right to privacy and all the other legal and ethical considerations when using data.

    1. Marnie, so good to see you here and thanks for sharing. It excites me to see how Caravan Studios continues to bring communities and tech together with innovations that make me ask, ‘why didn’t I think of that?’ Big SafeNight fan here.

      Agreed on your points. Given how communities are made of individuals: if we think of data philanthropy in terms of people donating their data for a good cause, just as they would their money, with similar incentives and systems of governance in place, we have a new sustainable form of currency that can give a voice to the disenfranchised while helping to drive independent research and good public policy across a host of issues.

      In this era of intransigent politics and polarizing ideologies, putting more individual ownership and value on data could diversify and strengthen collective bargaining for local communities and demographics in need of services and equal representation. It could also make people more aware of what they stand for, so that when elections roll around, they’ll be less apt to stand for what they think is wrong.

      We already commodify data, Facebook has seen to that. But those who reap the benefits (the shareholders) are so far removed from the data, they have no desire or incentive to know how that data is being used or to what end. Developing and mass producing secure and affordable open source tools (with autonomy, innovation and interoperability baked in) would bring an extinction level event for the freemium model, which is where much of our data seems to be going these days.

      On the question of joint ownership of data between individuals and organizations working on their behalf, the ‘data trust’ seems like a workable construct, which would earn individuals who ‘pay into the system’ certain rights with regard to how the organization is run, as well as a responsibility to participate. Similar to how a democracy is thought to function, I suppose. (A solution to the funder-driven mission issue would be sold separately.)

      For now, a question for your question. Does the Caravan Studios team consider said data as a privilege, a responsibility, a burden, or some combination of the three?
