Transcript, The Internet of Things: Risks and Rewards of a Connected Future

In Conversations, Virtual Roundtables by Digital Impact

On July 11, 2018, Lucy Bernholz, Director of the Digital Civil Society Lab, was joined by Robin Wilton of Internet Society, Sona Shah of Neopenda, and Amy Sample Ward of NTEN to discuss the current state of IoT among nonprofits and how it has sparked a new era of social impact.

Below is the full transcript. You can download the transcript PDF here and listen to the conversation by using the audio player below or by visiting the Digital Impact podcast on iTunes. To watch the video and see speaker-recommended resources and highlights from the conversation, click here.


[00:01] Lucy Bernholz: Welcome. Thanks for joining today’s Digital Impact virtual roundtable, The Internet of Things: Risks and Rewards of a Connected Future. I’m Lucy Bernholz, Director of the Digital Civil Society Lab at Stanford University’s Center on Philanthropy and Civil Society.

The Digital Impact virtual roundtable series highlights issues related to digital data and civil society. These conversations are part of the larger constellation of activities that Digital Impact and the Digital Civil Society Lab are undertaking to bring together people thinking about, and making happen, digital civil society around the globe. We invite you to learn about the initiatives and opportunities available through Digital Impact, the Digital Civil Society Lab, and Stanford PACS. Our primary goal here at Digital Impact is to advance the safe, ethical, and effective use of digital resources in civil society.

And so today we’re talking about the Internet of Things and its implications for the social sector. The Internet of Things, or IoT as it’s often referred to, is the growing network of connected objects, everything from vehicles to household appliances, buildings to bicycles, parking spaces, toys, and assembly lines that are able to store, transmit, and exchange digitized data.

If you think that IoT has nothing to do with your organization or your mission, you might want to think again. IoT is all around us. It’s on our wrists, in our homes and offices, and throughout our cities and towns.

By 2020 it’s expected that more than 30 billion devices will be connected to digital networks, which is more than double the number that were connected in 2015, so a doubling in five years. Some forecasts anticipate as many as 100 billion connected IoT devices and a global economic impact of more than $11 trillion by 2025. IoT has the potential to drive economic and social progress in new ways, but IoT development also comes with a host of privacy and security risks that could wreak havoc on the social sector and reverse gains made in public policies and service delivery ranging from public-sector accountability to equal opportunity to credit scoring.

We’re grateful to Internet Society for jointly hosting this virtual roundtable and over the next hour our panel will discuss the current state of IoT in the social sector and we’ll take a look at what nonprofits can do to get ahead of the curve.

So, a couple of housekeeping details before we get started. All microphones except the panelists’ will be muted for the length of the discussion, but we do want to hear from you, so please use the comment function on your control panel to submit your questions and I’ll pass those on to the panelists.

Today’s discussion is being recorded and it’ll be shared on the Digital Impact podcast, which is on iTunes, as well as at digitalimpact.org. You can follow the discussion on social media with the hashtag #InternetForImpact and subscribe to the mailing list for updates. And remember, these conversations are driven by and for you, the community. If you have a topic you’d like to explore, please email your conversational ideas and discussion topics to hello@digitalimpact.org.

[03:29] Let me introduce the panelists. Today we’re joined by Robin Wilton, who is the Technical Outreach Director for Identity and Privacy at Internet Society. Robin, you want to? There we go. Sona Shah, who is the Chief Executive Officer at Neopenda. Okay, Sona. And Amy Sample Ward, who is the Chief Executive Officer at the Nonprofit Technology Network, also known as NTEN. Hey, Amy. Thanks to each of you for joining us today, and let’s get started.

Robin, I’m going to start with you. In May the Online Trust Alliance, which is an Internet Society initiative, released a new version of the IoT Trust Framework, which is focused, among other things, on making consumer-grade devices and services more secure and their privacy policies more transparent. For most folks it’s hard to imagine the boundaries of something this big. So, first, can you give us the quick and dirty on how IoT works? And second, why should we care? How can we, as nonprofits, hope to understand, much less secure, such a big network of connected things if we can’t even get our existing networks right?

[04:44] Robin Wilton: Thanks, Lucy. That last phrase there, that last clause of the question, is a really key one, I think. One thing that the Internet of Things does, as you illustrated with your opening statistics, is represent a huge increase in the number of devices at the edge of the network generating data. I’m going to work my way back to how it actually works. But just thinking about generating data, and another point you made about the economics of the way the internet works these days: a lot of what we experience and what we get on the internet as consumers and citizens these days is actually fueled and funded by the monetization of personal data.

What I see happening with IoT is a real ramping up of the rate at which we generate personal data by interacting with the objects around us and walking around in this connected world. And what, to me, that represents is a massive injection of fuel for that monetization engine that powers so much of what we experience on the internet. So, I think that engine becomes bigger and more powerful and more difficult for us as individuals to influence and to direct to our own interests. So that’s part of what I think is the context here economically and in terms of its impact on us.

As for how it works, well, it’s like having your remote control for your TV. I remember, I’m old enough to remember, having to get up and cross the room to push the button on the TV to change the channels. Then there was the luxury of being able to sit there with a blipper [phonetic] and change the channels that way. What IoT does, really, is allow you to talk to physical objects of all kinds, just thinking of this from a personal viewpoint. So, you can control physical objects the same as you would a TV with a remote control. But, for example, you might turn your heating on at home before you leave the office.

You’re not dependent on being in the same room or in the same building because your command to the device is going through the internet. And similarly, you can get information back from connected objects no matter where you are because they connect to the internet. So, you can find out what your air conditioning is doing even if you’re halfway around the planet. It’s like a remote control, but the distance factor suddenly goes away because all this stuff is going through the internet.

There’s one other aspect, I think, to how it works, and that is that what you typically find for household connected things is that if you’ve got a bunch of connected lightbulbs, somewhere in your house there will be a controller for those lightbulbs. You might have an app on your phone that talks to the controller, and the controller tells the lights in the kitchen to turn on. All of that happens in your home, potentially. But when you add the internet into it, what you actually find is that in order for your command to reach the lightbulbs, it goes from your app to the internet, back to the controller again, and then to the lightbulbs.

All of that data about what you do with your connected objects is actually being stored, processed, and potentially monetized by someone in the cloud, the vendor of the lightbulbs or the manufacturer or the service provider for that online function. And I think that changes the context of what we’re doing because it adds another party to all those activities.
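The command path Robin describes can be sketched roughly as follows. This is a hypothetical illustration only; the class names and methods are invented for this sketch and do not correspond to any vendor’s actual API. The point it demonstrates is the one above: even a command to a lightbulb in the same house routes through, and is recorded by, a third party’s cloud service.

```python
# Hypothetical sketch of a cloud-routed smart-home command. All names are
# invented for illustration; no real vendor API is being modeled.

class CloudService:
    """Stands in for the vendor's servers, which see every command."""
    def __init__(self):
        self.command_log = []  # the vendor's record of your activity

    def relay(self, command, controller):
        self.command_log.append(command)  # stored, processed, possibly monetized
        return controller.execute(command)

class HomeController:
    """The hub in your house that actually talks to the lightbulbs."""
    def __init__(self):
        self.lights = {"kitchen": False}

    def execute(self, command):
        self.lights[command["room"]] = command["on"]
        return self.lights[command["room"]]

class PhoneApp:
    """The app never talks to the controller directly; it goes via the cloud."""
    def __init__(self, cloud, controller):
        self.cloud, self.controller = cloud, controller

    def turn_on(self, room):
        return self.cloud.relay({"room": room, "on": True}, self.controller)

cloud = CloudService()
hub = HomeController()
app = PhoneApp(cloud, hub)

app.turn_on("kitchen")
print(hub.lights["kitchen"])   # True: the kitchen light is on
print(len(cloud.command_log))  # 1: and the vendor saw the command
```

Note how the "extra party" Robin mentions shows up structurally: the `CloudService` sits in the middle of every interaction and keeps its own log, which is exactly the data that can later be monetized.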

[08:18] Lucy Bernholz: Thanks, Robin. That’s super helpful. It almost makes the idea of actually going online somewhat quaint, right? We’re basically going to be online all the time. Let me go now to Sona. Your company, Neopenda, is an example of IoT being used specifically to advance human well-being: a global health tech startup working to reduce newborn mortality in low-resource settings, where millions of newborns die. Eighty percent of these deaths, Neopenda believes, are preventable. The first product you’ve got is a small wearable device that monitors the vital signs of critically ill newborns being cared for in hospitals in these kinds of places. So, tell us what digital infrastructure is needed to make this kind of technology work and what this could mean for healthcare in similar areas around the world.

[09:14] Sona Shah: Yeah, thanks so much for the introduction there, Lucy. And so, as you mentioned, we’re creating medical devices particularly for emerging markets. And what’s really interesting is all of the advances in science, technology, and IoT that we’re able to apply to settings that are extremely resource-constrained. To give you context, in the typical hospitals that we work in, I’ve been in wards where there are maybe 100 critically ill newborns and really only two nurses. And there’s, you know, not much equipment around to really help care for these babies. And so oftentimes these newborns, when they’re in trouble, nobody knows, and then they die from preventable causes. That’s really what we want to tackle with our first wearable product. It’s a vital signs monitor for critically ill newborns.

I think it’s an important question to consider what the digital infrastructure looks like because in these settings you don’t necessarily see Wi-Fi. Even taking it a step further, you hardly have continuous power. I think there are certainly different constraints that we see in a lot of these hospitals and it’s really important to understand the context behind that. We’re excited to kind of iteratively design our solution with and for the users, particularly in these settings keeping in mind some of the constraints that they do have.

Our wearable device, I’m not sure if you can see my screen, but hopefully you can see the little wearable. This essentially is the monitor that will be placed on a baby’s forehead or an upper arm. And a couple of things are built in to make it more of that IoT device. The vital signs from all of the newborns in the room connect wirelessly over Bluetooth Low Energy. As I mentioned, wireless connectivity or Wi-Fi just isn’t readily available, so we are using Bluetooth Low Energy, so everything is kind of self-contained within our entire system. We are building out more of the backend database and structure so that we can aggregate all of that data and upload it to a [inaudible] eventually that can be used for a variety of different things. But the core purpose of our device is to monitor the vital signs in a local environment given what the context looks like.
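The local aggregation step Sona describes can be sketched in broad strokes. This is a hypothetical illustration of the general pattern (many wireless monitors feeding one local hub, which flags out-of-range vitals), not Neopenda’s actual system; the field names and clinical thresholds here are invented for the example.

```python
# Hypothetical sketch of local vital-signs aggregation: readings arrive
# (here, simulated) from many newborn monitors; a local hub checks each
# against a normal range and collects alerts. Thresholds are illustrative
# only and are NOT clinical guidance or Neopenda's actual values.

NORMAL_RANGES = {
    "heart_rate": (100, 180),      # beats per minute (illustrative)
    "respiratory_rate": (30, 60),  # breaths per minute (illustrative)
}

def check_reading(reading):
    """Return the vitals in one reading that fall outside the normal range."""
    alerts = []
    for vital, (low, high) in NORMAL_RANGES.items():
        value = reading[vital]
        if not (low <= value <= high):
            alerts.append((reading["patient_id"], vital, value))
    return alerts

def monitor_ward(readings):
    """Aggregate readings from every monitored newborn and collect alerts."""
    all_alerts = []
    for reading in readings:
        all_alerts.extend(check_reading(reading))
    return all_alerts

readings = [
    {"patient_id": "bed-01", "heart_rate": 140, "respiratory_rate": 45},
    {"patient_id": "bed-02", "heart_rate": 210, "respiratory_rate": 50},
]
print(monitor_ward(readings))  # [('bed-02', 'heart_rate', 210)]
```

The design point this echoes from the conversation: because everything runs on the local hub, the alerting works with no Wi-Fi and no cloud connection, and uploading aggregated data can happen later, when connectivity allows.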

And I did just briefly want to mention a little bit more about the data, because what can we do with that data? I think there are so many questions and concerns, particularly in the healthcare sector. There are challenges with data security and privacy. This is patient data. There’s a lot of really private information that is very difficult to share with other people. So, I think that’s one of the things to consider with IoT is, how do we make sure that we’re following all of the regulations and all of the confidentiality behind some of this patient data? And then once we get over those barriers, then how do we use that data and actually come up with something useful?

Knowing how many hours of monitoring occurs, that doesn’t really help us much. How do we actually take it one step further and do something useful with it? How do we analyze that data? How do we visualize it? How do we present it to a variety of stakeholders? So, I think the opportunity with IoT in so many of these devices, whether it’s here in the U.S. or in developing countries, the opportunity is endless. It just really requires a little bit of thought into how do we implement it successfully?

[12:38] Lucy Bernholz: Great. Thanks, Sona. And, you know, your mention of the, right up front, designing within those privacy and rights concerns of the individual patient and their families and thinking about those as important design constraints is one of the areas in which I think the health tech field is really leading because the constraints are so clear. But the question for the rest of the nonprofit sector, I think, is are there similar rights and principles within which we should be thinking about the use of these devices so that they’re seen as design constraints and guardrails and ethical bounds, but that’s the innovation space, if you will.

So, let me bring Amy into the conversation. Amy, NTEN has something close to 100,000 nonprofit professionals who are all using tech to do good in the world. And, you know, that’s quite a network when you think about it. So, what is the state of IoT expertise among your membership, the state of adoption, and what do the people you’re working with have to lose or gain in a world of these omnipresent connected devices?

[13:50] Amy Sample Ward: Yeah, I have to say that’s only because I always do. I think the NTEN community is incredibly huge and strong. And, just a reminder because I think we always forget this, in the U.S. alone there are 1.8 million nonprofits. So, when we’re thinking about the scale and scope of these technologies, I think it’s really beyond a lot of what we’re normally thinking about. We normally think about, like, these five nonprofits maybe we work with or that we know or that we volunteer with, but there are 1.8 million nonprofits in the U.S. If all of them had access to and were actively using data from an IoT world, like maybe they will by 2020, you know, when we double the network, I think there’s a lot that we could talk about, what will change.

So as far as the expertise and understanding of IoT within at least NTEN’s world, we see a lot of initial knowledge. Folks know what it is if we’re talking about IoT, but it’s still a bit distant. It’s still something that’s really for the for-profit industry, not necessarily for the nonprofit sector. And I think a lot of that comes from the anticipation, even if it isn’t a reality, of cost: that doing something like that would cost a ton and that we don’t have those resources as a single nonprofit. And, as we were just talking about, a lot of the anticipated fear of managing that data. You know, would we be in compliance? Who are we in compliance with? Would we even know those rules and how to operate within them? And I think the number of organizations that have moved to the cloud, or have started in the cloud because they didn’t want to have or didn’t have the internal knowledge to manage their own server or anything like that, just furthers that anticipated fear: how would we be the ones managing this data, collecting this data, protecting it, all of that? It has really caught a lot of the organizations we work with.

And, especially since our title includes the word peril, I don’t know that we are here today to quell all those fears and say that everything is fine, that you should just go ahead and start collecting data from any sensor you can. We can talk about it if you do want to collect that data, but ultimately I guess my recommendation, as far as, you know, what they stand to gain, is this: nonprofits don’t necessarily need to think about IoT as something that they, as a single entity, manage start to end. That you are buying the sensors. You’re designing where the sensor goes. You’re collecting that data. You’re storing that data, protecting that data. You’re deciding who can look at it. That data exists. The idea that it would only happen for your community if you decided to buy and invest in it is a little bit misleading, especially if we look at the amount of investment in smart cities right now.

If your city is already trying to do these things, that’s impacting your community. And just as we would recommend nonprofits remember they have a part in advocacy, even if it’s not direct advocacy it’s educating policymakers about your community. You have a huge opportunity to be part of smart cities initiatives advocating for your community. Your city wants to do something about transportation that impacts the people you serve, you should be at that table. You should be part of that IoT conversation and you don’t have to then take on buying the sensors and placing the sensors and collecting that data.

[17:41] Lucy Bernholz: Yeah, thanks, Amy. That’s a really important point: there’s both IoT within nonprofits and IoT around the social sector, writ large. It is absolutely changing the nature of daily life for lots of people, and that advocacy role is important.

Back to Robin with a little bit of shift there from advocacy and research. Internet Society partnered with the Center for European Policy Studies, which is a think tank and forum for debate on EU affairs to look at the digital gap in Europe. And about half the world’s population still isn’t connected. The report talks about how community networks can help.

So how can, or how could secure IoT connect and empower local communities in places that are currently not connected? And, of course, what do we need to be on the lookout from the get go assuming that things can and will go wrong?

[18:45] Robin Wilton: Surely nothing could go wrong.

[18:48] Lucy Bernholz: Yeah [laughing].

[18:50] Robin Wilton: I think, so two overarching things for the Internet Society are access and trust. If people can’t get access to the internet then clearly they can’t get the benefits that it can deliver. And if they can get access to the internet but what they find there is not trustworthy, then again, it compromises the potential benefits.

We think both those things are important and have to go hand-in-hand. I’ll give you a couple of examples of projects I’m aware of from a program we call Beyond the Net, which provides relatively small-scale grants for community projects. There was a town in Mexico that was at the bottom of a valley, and it kept getting flooded every time there was a storm further up the valley, and this was disrupting people’s lives. I saw a strange picture of a refrigerator half buried in the mud because it had been picked up by the floodwaters and swept halfway down the valley, and there it was, just sitting there half buried. It was bad news for them, and it was wiping out crops and flooding homes and so on. What they did, with a small grant from us, was install some weather sensors further up the valley and connect them to the village, so the villagers then got advance warning when there were storms up the valley and could protect their homes or, if necessary, move out. And that was one.

The other one was a fantastic project from some teenage students in Zimbabwe who played around with some [inaudible]-type small computers for a semester or so. And they did things like make them make siren noises. Or one student had a cool app where she could monitor the temperature of her fish tank and feed the fish remotely. Well, it was interesting. It was cool. It was kind of a toy. But that student then went on the next year to say, well, never mind feeding fish, I can set up a hydroponic farm inside a shipping container, with solar panels on the top to provide the lighting and to power the hydroponic irrigation, controlled via an app on her phone. I mean, this kid is 16 and she can grow enough in that container to feed several families throughout the year sustainably. I just thought that was fantastic, from small beginnings to a project like that that really can help her community.

What can go wrong? Well, let’s have a think about that. I mean, the flood warnings thing. Clearly one thing that can go wrong is the flood warning doesn’t arrive. If the service isn’t resilient and reliable, people might get flooded out again. But then there are the perverse consequences. You’ve put this thing on the internet, if you haven’t secured access to it what happens if some 17-year-old kids from the next town decide they’re going to set the alarm off every six hours and convince you that your village is about to be flooded? And so, we need to think of the uncharitable uses of some of this online technology as well.

[22:14] Lucy Bernholz: Yeah, I’m reminded of the — uncharitable uses, at least in part because there was a story going around about three months ago about a hack at a Las Vegas casino that actually happened through the thermometer in the fish tank. We’ve got a couple of wonderful questions coming in already from people out across the world.

I want to just fold those in and keep moving through our script as well, because all of you will have a perspective. We had a nice question that I think I can paraphrase to be about timeframe issues. That it seems we often think we’re balancing rewards and risks, as we even titled the presentation here, but the rewards often seem to come much sooner than the risks, at least I think that’s how the questioner is thinking about our general experience with things like privacy and security.

The other thing that’s somewhat out of sync in those sorts of risk and reward calculations is that while the rewards may accrue quickly and immediately to individuals, the risks often come later and might be at a population level. So just framing it that way, if there’s anyone who wants to fold in some comments about that dynamic. So, the actual specific question was then how do we, as a sector, begin to think about that in terms of making decisions about investing in and operating within an IoT world? Let me put that out there. If there’s anyone who wants to respond to it quickly. Okay, Robin, go ahead.

[24:05] Robin Wilton: Sure, I’m not going to try to give a comprehensive answer because I want to give some space for the others to give their input. But one factor in this, I think, is something you hinted at just now: the bad consequences of poor or insecure design often don’t fall back on the originator of those flaws. Economists call that a negative externality. It’s where I do something and it has bad consequences, but I don’t care because those consequences fall on someone else.

As the Internet Society, we are very concerned about the potential for insecure IoT devices to form the petri dish in which botnets can grow. If there are enough insecure toasters around the world that can be recruited into botnets, those can add up to a denial-of-service attack that knocks a country offline. So, we want to see that addressed through things like principles for secure design, for privacy by design, and for ethical data handling by design. Build it into a value-based development process.
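One concrete "secure by design" principle behind Robin’s point is refusing to activate a device while it still carries a default, guessable credential, since default passwords are a common way devices get recruited into botnets. A minimal sketch of that idea follows; the blocklist, length rule, and function names are all invented for illustration and are not drawn from any real framework’s text.

```python
# Hypothetical sketch of one secure-by-design check: a device refuses to
# activate until its default, guessable credential has been replaced.
# The blocklist and policy here are illustrative only.

COMMON_DEFAULTS = {"admin", "password", "12345", "default"}

def credential_ok(password):
    """A device credential passes only if it is non-default and non-trivial."""
    return password.lower() not in COMMON_DEFAULTS and len(password) >= 12

def provision_device(device_id, password):
    """Activate a device only once its credential meets the policy."""
    if not credential_ok(password):
        return (device_id, "refused: change the default credential first")
    return (device_id, "activated")

print(provision_device("toaster-7", "admin"))
print(provision_device("toaster-7", "c0rrect-horse-battery"))
```

The design choice worth noting: the check runs at provisioning time, on the device side, rather than relying on the owner to remember to change a password later, which is the failure mode that made large IoT botnets possible.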

[25:11] Lucy Bernholz: Great. So, they actually have to be built right into the devices and actually at a much larger level. Similar to what Amy was saying, this isn’t about individual, not just about how individual nonprofits work, but really about the policy and guidelines.

Let me go back to Sona, because your work also has a nice example of a partnership between the nonprofit and a for-profit, Cisco in particular, a maker of routers and other internet devices, and clearly a leader in building internet-connected devices and digitization. And in your particular space, infant mortality, cost seems to be the primary issue that you’re trying to solve for: the cost of being able to provide care to the number of sick children given the resources of the community. So how does this partnership with Cisco work, and how is the relationship between Neopenda and a company like Cisco, and other community organizations, able to really put these devices to work safely and ethically and improve lives?

[26:17] Sona Shah: Yeah, that’s a good question, and it’s certainly not something that we can do alone. These efforts require input and feedback from a variety of people, including strategic partners like Cisco. Originally we were actually introduced to Cisco through a funding opportunity. We were able to compete in a [inaudible] and received a $100,000 award from Cisco, which was really great and beneficial for, you know, the very start of Neopenda. So, it’s nice to have the support of such a renowned organization like Cisco. And I think from there we’ve been able to develop the relationship quite a bit over the past couple of years. They’ve given us plenty of opportunities to present and speak to the wider Cisco audience. They’ve given us feedback. Some of their corporate social responsibility team was actually in Uganda at our site. They were able to, you know, see how the device worked in action.

There are a number of different opportunities and ways to collaborate with some of the larger multinational companies. And it sometimes can extend beyond, you know, what you would initially think. For a partner like Cisco, the initial conversation, sorry if there’s a bit of background noise, was a little bit more around, you know, IoT and how we could collaborate on that level. But I think it also became more of a marketing effort, a funding effort, lots of different things like that.

I think it’s important to understand who else is an expert in the field and how we can leverage their expertise. Vodafone Americas Foundation, for example, is another one of our large partners, and they’ve committed funding in addition to some of their resources, helping us understand some of that wireless connectivity a little bit better.

I think partnerships are absolutely crucial. Understanding the wide range of partnerships is also absolutely crucial. Don’t just narrow yourself to a single type of partner. Just really think broadly about who else can provide expertise in the space that you’re working in.

[28:24] Lucy Bernholz: Thanks, Sona. And we’ve got a comment slash question here that picks right up on what you’re talking about, as well as the previous questioner’s concern about time and the different risks and rewards, so again, I’ll put it on the table and then take any thoughts on it or move to Amy. It really has to do with the IoT challenge writ large, which is tied into the constant cost constraint that social sector organizations operate under: sort of the sense that we’re never going to have the individual organizational talent or investment resources to control a network of things ourselves.

So how do we think about this collectively? Are there different kinds of partnerships with industry leaders who are building this? Is it about frameworks and principles, such as Robin described, where we’re really trying to influence the shape of the industrial creation of the devices, different kinds of incentives, any thoughts people have regarding kind of the role of the sector writ large on costs and risks, is the question that came in.

Amy, since you’re, you know, with this network, NTEN, any thoughts you might have on that in particular? Or if you’ve got additional examples where individual organizations, or even whole subsector domains, might be working with connected devices in a successful way and any lessons to share, that would be great.

[30:09] Amy Sample Ward: Yeah, for sure. I think both in response to the question that came in around cost and thinking about partnerships in different ways, there are a lot of really, to me, exciting case studies already out there where single organizations aren’t trying to do all of that themselves, but are instead partnering with a number of other agencies that work in a similar solution area. Whether that’s all the different sides of addressing homelessness in Sacramento, and partnering with the police department, too, so that everyone is accessing the same data and referencing the same, you know, options on site when you’re interfacing with someone experiencing homelessness. You can tap into everything that exists as potential solutions for them. Learn in real time which solutions people are taking, to know which solutions are maybe working for people. Or cities like Marietta [Georgia, US], where essentially the entire city is online, and you think about the number of different organizations trying to address Vision Zero-type goals and policies. When your whole city is online, you’re able to access, both for policy and for your own programmatic goals, that data: how many people are speeding through a school zone? Or, you know, how many bicyclists are getting notified that a car is going at excess speed near them?

To really be able to tell the story of what’s going on in your city, versus being in a more reactive place, which I think oftentimes we, as nonprofits, are in: a reactive place where we say something has happened and we want to advocate so it doesn’t happen again. Instead, as whole coalitions of nonprofits, you can be partnering to tap into that data in real time and tell the story as it’s happening.

[32:01] Lucy Bernholz: Robin, you have something to add to that as well?

[32:05] Robin Wilton: Sure, and I see Sona has as well, so I’ll be brief. First, it’s important to reduce the cost and effort required to adapt and deploy IoT devices, whether you’re a nonprofit or not. That’s something that, as you mentioned in your introduction, we’ve tried to do with the IoT Trust Framework: to give manufacturers and consumers a clear set of criteria to look at a product and say, well, does this stack up in terms of security, reliability, privacy protection, and so on?

The second point, and this comes back to the point Sona made I think really clearly with her example of two nurses trying to take care of 100 newborns in a single ward, is that what you can potentially do with IoT is make better-informed, selective use of resources. It might be that the key to Cory’s [assumed spelling] question is not so much "how much does it cost me to just give up and use IoT" as "if I use IoT, how can I make better use of the resources available to me and perhaps get benefits or make savings elsewhere?"

[33:19] Lucy Bernholz: Great. Thanks, Robin. Sona?

[33:22] Sona Shah: Yeah, I switched to headphones, so hopefully this is a little bit better. I am thinking a little bit about cost constraints. We’re a small startup with four people, a really small organization, but I wanted to point out a couple of things. The first is that I don’t think it necessarily has to be the nonprofit sector that’s going into developing countries or emerging markets. We’re a for-profit that certainly believes in the opportunity to have impact and sustainability, and I think that’s one thing to just keep in mind, because it can help alleviate some of the constraints with costs.

IoT is enabling things to get cheaper, smaller, more efficient, and that’s really exciting. And I think that’s where we’re able to leverage some of those benefits in more of the resource-constrained settings. Thinking about how to manage the upfront costs, I do think that’s where a lot of the partnerships come in, a lot of the relationships come in, because you’re not going to know everything. You’re not going to hire a team that will know everything.

Our approach has largely been to find the people who do have the expertise in it and bring them either onto the team, hire them as a consultant, figure out some way to get pro bono work from them. There’s a lot of ways to collaborate with other people and it requires a little bit more creativity, particularly for emerging markets, but there are ways to offset some of those upfront initial costs that can be honestly quite expensive.

[34:54] Lucy Bernholz: Yeah, and it sounds like we’re getting a lot of good ideas here about sort of thinking about these as leveraging and extending and redesigning your work around some of the capacity. Of course, I have some additional questions about that but let me stick with what’s coming in over the transom here.

A question earlier that I’ll ask each of you to think about, which is: nonprofits, social sector organizations in general, or B corps [phonetic], the whole range, are trying to get better at measuring their impact, and evaluators may be able to be helpful with certain kinds of data that may be dependent on some of this connected infrastructure. I’m curious if any of you have examples or guidance for the listeners about what the role of evaluators should be in thinking about these devices and the data that’s being collected on them. If you have any examples about that, that might be useful for folks.

It seems to me that, among other things, here’s a very practical place where thinking about who is going to have access to what data, and for what purposes, and for how long, and kind of all of the data governance responsibilities come into play. Flip that around and it also raises, I suppose, a question for you all: maybe you have examples of where evaluators are really thinking differently about how to measure program impact because of the availability of different kinds of data. Open to the panel, anyone. Amy, do you want to jump in here?

[36:42] Amy Sample Ward: Sure. I believe that what I’m about to say is related to your question, and by the end you will, too. There are two examples that kind of come to mind. The first is, the University of Glasgow has done work where they’re trying to research animals’ behavior in response to certain environmental stimuli. And doing that as a researcher where you take an animal and you’re, you know, doing something to it, obviously that’s not an authentic experience. The animal is in crisis. You’re holding an animal, right? They’ve used other technologies so that they are not having to be present but can still receive that data back and do the same research in an authentic way.

That’s not, of course, to advocate for human testing, but there are tons of applications in the same way where organizations are often in a place to speculate about impact or to restrict the reporting of their impact to the small group of people who are, you know, still checking their email a year later and willing to fill out a 10-minute survey about it or something, which is not, again, the authentic full capture. I think there’s a lot of opportunity there to think about how we put some data collection pieces in place that allow for that full story and that aren’t onerous or, you know, complicated for the folks who are experiencing that impact.

That’s the first piece. And then the other is to think about this also not just as a new thing. I think there are so many examples out there of organizations who have been using IoT or “IoT-esque” technology for reporting because it helped them get donors.

We have all seen many examples of that: a well in an African country, you know, reporting out live on the website that it’s pumped however much water. Of course, that was all invested in and framed as a donor relationship tool, right? So that if people wanted to donate they could say, that’s my well. Look at it going. That also has huge storytelling potential for impact evaluation, for program decisions, for, you know, real-time on-the-ground service decisions. So, some of this isn’t new, we’ve just only thought about it as fundraising and not necessarily our whole impact evaluation.

[39:20] Sona Shah: To add onto what Amy was saying. I think impact evaluation is really difficult. Honestly, for us the very obvious impact that we would like to have is reducing newborn mortality, but that’s not something that you can measure in a pilot study over a month. That’s something that will happen, you know, over rigorous studies over many, many years. In thinking about impact evaluation, the data that you’re collecting, I think it’s important to think about it in a few different stages, a few different levels.

One model that we typically like to use is called the theory of change or logic model, if you’re familiar with that. And essentially it’s a model or a framework that enables you to think about what are the inputs that you’re putting into the system? What are the direct outputs? And then, how does that translate to short-term outcomes and long-term outcomes? So, it’s a framework of helping you think through your particular area. And for us that has been really beneficial, because ultimately our long-term outcome or our impact is reducing newborn mortality rates. But the short-term impact that we can measure, the short-term outcomes that we can measure are things more like, how do we reduce the amount of time that it takes to identify a baby is actually in distress? And, how do we improve the response time to that?

Those are kind of the intermediate steps that I think are important to understand and evaluate and collect that data on, because those are absolutely crucial for the short term and can help guide, you know, how to measure that newborn mortality rate in the future as well.

[40:59] Robin Wilton: I’d like to sort of take an example that I think illustrates some of the potential pitfalls with this. The idea of using this technology to improve evaluations is a well-intentioned action. But, for example, let’s just work through a not entirely unrealistic case study.

Imagine that you have an institutional building and you want to increase energy efficiency and monitor air quality. You put in some connected sensors that measure temperature, carbon monoxide because we don’t want that building up, maybe CO2 for air quality. That’s not an unusual kind of building automation project and you would expect to be able to monitor the impact of that pretty well. Then suppose that actually, based on that, you want to achieve some behavior change. You don’t just want to monitor the temperature, you want to perhaps encourage people to make more efficient use of energy and lighting and heating and so on.

Then someone might notice that actually, if you have CO2 sensors in all of those rooms, you can also work out the occupancy of rooms, and that might be an efficiency saving or gain as well. But, on the other hand, you’re now starting to measure occupancy of rooms, which might turn out to be identifiable based on who is where in the office. And you might be collecting data about people’s presence in a room in a passive way, so they’re not aware that it’s being collected.

We can start with very good intentions in these things and sometimes end up in places that actually should make us stop and rethink and say, well, hold on — are we doing something that is only beneficial here, or does this raise some issues that we ought to address because this technology is producing results that we weren’t necessarily expecting? Results that maybe the people who deployed the technology were expecting, certainly, but that the occupants of the building might be unaware of.

[43:16] Lucy Bernholz: Yeah, absolutely. Thanks for bringing it up, Robin. Because it seems to me that these kinds of possibilities, you know, don’t even feel remote. They feel highly likely, and we’d actually be naive not to predict them. We’re already living in a world where it’s clear that we don’t control the data we generate when we go online actively. So, here’s a question out of the blue, but it’s one that always occurs to me when I think about the social sector writ large, which is, are there some things we just shouldn’t be doing? Once upon a time a place like a park or a museum or a library was where you went to get away from it all. Those are all parts of the sector. And it seems to me there are very important questions to be asked about what the social sector writ large should not do with these technologies, with these datasets. I don’t know if anyone has an immediate list of those things.

[44:30] Robin Wilton: Not an immediate list, but I do have a principle, which is that you can address some of those issues through value-based design. And it is never too early in the design and development process to start applying those values, even in the commercial world. If you start with an immoral business model, it doesn’t really matter how well you design your products. We need to be prepared to challenge the fundamentals of projects, whether commercial or not, if we think that actually they are setting off down an unethical path.

[45:08] Amy Sample Ward: And just to go with that, Robin, I totally agree and would couple with that user-centered design principles. That whatever you’re trying to do should not be done unto your community, but your community should be part of designing that and bringing all of their own insight into the lived experience you’re trying to evaluate or improve.

[45:34] Sona Shah: Particularly in the healthcare sector there are checks and balances regardless of where you’re trying to implement the solution. Of course, here in the US we have very, very rigorous regulations around that. But even in developing countries, even if things aren’t as highly regulated, you still have to go through an Institutional Review Board to do a clinical study, for example. There are independent review boards that will look over your protocol for safety and efficacy, and I think that’s really important to uphold, particularly when there is sensitive patient data you’re managing. I’m not sure if this applies across other sectors, but certainly in the healthcare sector if you’re not going through some kind of Institutional Review Board and you’re putting something on a patient, you’re probably not doing something that’s ethical.

[46:22] Lucy Bernholz: Yeah, it’s interesting that you bring that up, Sona, because across civil society writ large, one of the things we’ve tracked at the lab actually is the adoption of review board-type processes or institutions as more and more organizations start depending on digitized data, wherever it’s coming from, whether it’s, you know, survey data that’s being stored in the cloud or this remote sensor data. We’re recognizing that dependence on digitized data and trying to clarify what in fact distinguishes the use of that resource in civil society from its use in the commercial sector or by governments.

And I think it’s mostly an open question actually as to where those lines really are, and there’s a lot of work happening on them, but especially in, you know, an age of very hybridized business models, figuring out how the different sectors address digitized data as a resource in and of itself. Networked digitized data seems to me to be kind of the question of our time.

I want to get to some specific questions we’ve had from the participants. We’ve got about 13 minutes left and I’ll try to do these [in] rapid fire. So, a question about the United Nations Sustainable Development Goals, the 17 SDGs, which we won’t ask anybody to rattle off the top of their head here: can each of you talk about IoT in the context of at least one of the Sustainable Development Goals? Is there an effort you’re aware of, or examples you can share, where there are specific activities using networked digitized devices to either track progress or to actually achieve progress on any one of the SDGs?

[48:27] Robin Wilton: Yeah, in a way. So, a couple of the SDGs are sustainability and quality of life. And there’s, I think, technology innovation in there as well. I heard two weeks ago about a fantastic project in Portugal on a large-scale campus, 130 hectares, 10,000 people, 3,000 vehicle movements a day, something like that. Very constrained for water and yet they expect on that campus to have crops, to have animals, and to have people.

They are using IoT to achieve much better resource utilization selectivity, resource preservation, sustainable usage of water, and so on. And we’ve all seen some examples of agriculture being helped along with remote sensors and so on. That would be my example of a good campus-scale experiment that actually I think is scalable.

[49:31] Lucy Bernholz: Thanks, Robin. Sona or Amy?

[49:35] Amy Sample Ward: I’d suggest the two that I already illustrated before certainly are in line with that. You know, networked agencies, both nonprofit service providers and police, working together around homelessness as an issue directly, as well as well-being and transport in the Marietta example, where the entire city is networked, so all agencies have access to that data.

[50:02] Sona Shah: And then in our case, SDG 3 very closely relates to improving health and well-being. There’s one target in particular, 3.2, to reduce preventable newborn deaths, and that’s really the one that we’re focusing our efforts on. I think IoT can be applied to many, many of the different SDGs; it’s just how you use it and what you use it for. It doesn’t specifically say in SDG 3 that you should use IoT to reduce those deaths, but I think it requires some creative thinking around how we can leverage all of the benefits of IoT and apply them to many of the different goals.

[50:46] Lucy Bernholz: Great. Another question coming in builds on some of the comments that have been made about built infrastructure, particularly smart cities, and on what’s just been talked about in terms of IRBs or community review. It really goes toward a question of governance: do we have good models of community governance of these networks and data systems that might eliminate some of the horror stories we’ve heard about, or can imagine? We’ve heard about third parties having access to the data in ways that enable domestic violence, that enable stalking; it puts remote access to this data into the system in a way that does put people in danger.

So, the question is, are there good models, examples you can share, things to think about, where communities are building these networks themselves, or building the governance of the networks by the city, or perhaps it’s principles of data destruction by the device makers, to minimize those scenarios? Those scenarios are not even remotely future-oriented; they’re happening here and now, and we can model our threat model to [inaudible] them into the future.

[52:15] Amy Sample Ward: I will start. I imagine others have some great feedback. I guess my answer is a little bit more meta than the specifics of that example. And I’ll start by saying I don’t think that I’m being extremely pessimistic by saying, if humans are involved, there’s going to be a percentage of those humans that hack them for a bad use. And that’s not to say that inherently the product was bad, you know.

We can think of examples of any household item that is just an item but could be used for bad. To set that part aside: we can’t inherently build a good IoT device; it is all in the use of it. A big part of this, at least at NTEN, the way we think about these issues, is the lack of digital equity already in the US, in the whole world: the number of folks who are not online, yet there’s probably data about them online. They are not in a position to advocate for themselves, for the safe use of their own data, or to even know what it means that data has been collected about them.

I think there needs to be more work, and NTEN would certainly welcome more partners that want to work on digital equity because it’s — we’ve already seen through our digital equity programs it’s actually a window for bringing people online who otherwise don’t necessarily see the value in being online, who don’t necessarily know what they would do if they were to get the internet.

Now they can see, oh, I can check in on my house when I’m not at home or I can unlock the door for my child when they get home from school and I’m still at work or whatever those instances are that are more about individual citizens’ IoT experience. It can be a great gateway to get them online and involved in that way. Once they are, they can be a better advocate for themselves and their family and the use of data because they are now a part of it.

[54:16] Lucy Bernholz: Sona or Robin?

[54:24] Sona Shah: I was just going to say that I think I mostly agree with everything that Amy was saying. I think right now there are so many questions around data wherever you are in the world. And, unfortunately, I don’t think that there is a solid framework for really developing, you know, what the — what the next coming years will look like.

That’s something that is actively being developed right now, which is a good and a bad thing because there are so many different schools of thought around, you know, if my data is entirely out there on an individual basis, do I care? That’s up for debate, right?
Lots of people have different thoughts and questions around that. And the ethics behind it also comes into question, because you can of course be a good person but have different thoughts on, you know, what is ethical and what is not when it comes to somebody’s patient data.

The more transparent you are about the data that you’re collecting and what you’re doing with it, the better off you are. But I honestly don’t think that there is a solid framework that we can all abide by, which is unfortunate. But I think we’ll get there, it’ll just take a few years.

[55:32] Lucy Bernholz: Well, and that actually strikes me as almost a plea for social sector voices, civil society voices, to be involved in creating that framework. I’ve heard a lot of talk over the years about how the GDPR, for example — the EU General Data Protection Regulation — people feel like it was created and is now being imposed upon them. But, of course, there was a multiyear process of creating it, one that was highly engaged with certain elements of civil society organizations. And what I think all of you are saying is that in fact we’re all now so dependent on these technologies, all of us need to think about the potential frameworks and principles that civil society needs for these tools.

It’s all of our concern. And I think civil society writ large is woefully underprepared for this to be honest, and partly because, as, Sona, you just said and Amy said it, too, there is no single sense of how we feel about our data being collected and used for whatever purposes. It’s just not clear.

So, I’m going to ask each of you one rapid-fire last question. I’m going to fold together two that have come in, and they’re almost the must-ask questions whenever the conversation is about the social sector and data, which is two-part. Is there a different financial model here, here meaning in an age of collective data, of digitized networked data, that might support some of the kinds of services that civil society protect — provides? And the flipside of that is, are there any hard and fast norms among the three of you or, more broadly, about the selling of digitized data as a financial strategy for social sector purposes? And I’ll just leave it at that, open, because I’m going to let you assume the level of aggregation, deidentification, anonymization, destruction — so the question is, in an age when digitized data is clearly a resource, do any of you have hard and fast rules or examples of where there’s a line, where there’s a new financial model? Or an answer to the question about, do we just sell data? Do we not sell data? How does that work? Robin, you’re nodding, so I’m going to start with you [inaudible], then go to Sona, then to Amy, and then we’re going to wrap this up. Go ahead.

[58:15] Robin Wilton: Okay. As you would expect, I’m not a great fan of monetization of personal data. I think there are two key sets of principles. First, is what you’re doing what the data subject was expecting you to do with the data? Is it a surprise? I go by the principle of no surprises. If what you’re doing with the data comes as a shock or a surprise, even a welcome surprise to the data subject, then you should really question whether you should be doing it.

The three principles I would tend to apply are respect, transparency, and fairness. Respect because you act in the other person’s interests when you should. Transparency because you don’t do stuff they’re not aware of. And fairness because every time you come to a decision point in the development of your product or the usage of the data you make a decision that reflects their interest as well as yours.

There was a sort of rider to that question, I think, about anonymized data. And I’m very skeptical about anonymization as a safety mechanism. If you encrypt data you’re always told, based on the key length, how long that data is likely to be safe before technology catches up with it. Anonymization is no different. It’s a temporary step.

[59:29] Lucy Bernholz: Great, thanks. Sona, thoughts?

[59:32] Sona Shah: Yeah, I think monetization of data is really interesting and something that we are actually actively working on right now. Because there’s an enormous amount of data that we can collect, but then on the flip side there is an enormous amount of benefit that this data can actually provide as well. I do think there’s a way for us to monetize it, but we’re actively trying to think about how do we do that using the same principles that Robin had talked about as well.

It’s important to be thinking about it. We could use a freemium model, for example, where part of the data is just out there for everyone, for the general good of understanding, you know, how do we actually improve newborn health outcomes? Nobody really knows what’s happening in these hospitals. There is a way to help provide some of that data without compromising any patient security. But then when we start to do more with the analytics on a hospital basis, on an NGO basis, on an administrative health basis, that’s where it can become a little bit more monetized, as long as everyone understands what you’re doing with that data and how you use it.

[1:00:42] Lucy Bernholz: Right. Amy, you get the last 20-second word because we’re at the top of the hour.

[1:00:45] Amy Sample Ward: Okay, super-fast. I guess I’d agree with what’s already been said, and would remind folks that the principles Robin outlined aren’t unique to IoT data. Those should be in play now. Organizations are already making decisions about whether or not they’re going to sell their list. That is also personal data. Those are email addresses. Those are mailing addresses. Those same values should already be in place now. And so, I don’t have an “everyone should or should not do it,” but I think those values should be used to make that decision organization by organization.

[1:01:22] Lucy Bernholz: Thanks, Amy. And that speaks to the real question here, I think, for civil society organizations: it’s as much about the data itself as about the data governance processes around it, regardless of where the data come from.

This does wrap up our time today. I want to thank our panelists and everyone who joined us on the call and Internet Society for making this happen. Don’t forget to join in on social media with #InternetForImpact. This recording will be on the Digital Impact podcast on iTunes as well as on the Digital Impact website. You can check out Internet Society and Robin’s work at internetsociety.org. You can check out Neopenda at neopenda.com. And you can, of course, connect with Amy and NTEN at nten.org.

Please check the digitalimpact.org site for tips on how to advance the safe, ethical, and effective use of data. Thanks everybody. I’m Lucy Bernholz from the Digital Civil Society Lab at Stanford PACS, and goodbye for now.

