Digital Contact Tracing: Privacy by Design, Consent by Default
Q&A with TripleBlind, Steve Penrod, Greg Storm & Riddhiman Das
In response to global privacy concerns around digital contact tracing, which surfaced as a key measure to fight COVID-19, an organized movement has emerged around Safe Paths, focused on making transparency and consent the default settings. Contact tracing by healthcare authorities requires well-designed human-tech partnerships to ensure an accurate flow of information, and thus an extra layer of privacy-first solutions. TripleBlind is a collaborator in this effort: the company offers digital privacy as a service, enabling entities to safely provide and consume sensitive data and algorithms in encrypted space without compromising privacy or security. In this interview, we discuss state-of-the-art privacy technologies, privacy-preserving digital solutions at the intersection of TripleBlind’s work and the Safe Paths initiative, solutions to prevent surveillance capitalism, alternatives to surveillance-state measures, public trust in contact-tracing efforts, and adherence to GDPR in Safe Paths technology development. We also discuss how TripleBlind’s and Safe Paths’ efforts change the way people think about their personal data and how to take control of it, with an emphasis on why their shared mission matters for how we interact as individuals and businesses moving forward.
Deepti: COVID Safe Paths is a global movement that is developing free, open-source, privacy-by-design tools for individuals, public health officials, and larger communities to flatten the curve of COVID-19, reduce fear, and prevent a surveillance-state response to the pandemic. Over the last two months, TripleBlind has been working with COVID Safe Paths in this wonderful alliance as part of the global open-source community to build a privacy-preserving approach to contact tracing. Through digital contact tracing, we can warn individuals if they have crossed paths with a person who has COVID-19 and make the process more efficient by reducing the time it takes to inform them. TripleBlind is a company working to deliver tools that let people control their own data, providing digital privacy as a service. So today we want to talk with its people about their thoughts on why this breakthrough technology is so vital in today’s landscape and how Safe Paths as an initiative intersects with their values, principles, and missions.
Mahdi: Joining us for this conversation today are co-founders Riddhiman Das and Greg Storm of TripleBlind. With them is Steve Penrod, a leading developer at TripleBlind and a major contributor to the Safe Paths initiative. Thank you for joining us today. Please give us a brief introduction on yourself, your associations with TripleBlind, and how you contribute to Safe Paths.
Steve: Hi, I am the vice president of product development at TripleBlind. My involvement with Safe Paths began at the very start, when TripleBlind kicked off and Das brought in this idea from Ramesh to work on a privacy-preserving contact-tracing solution. I got involved from the initial stages of rolling it out and have been deeply involved since, building the architecture throughout and helping build up the organization. I also supported interactions with the healthcare authorities who were looking at a practical implementation of our solution.
Riddhiman: I am the junior worker at TripleBlind, which means a little bit of everything. Some days, a little bit of product; some days, a little bit of sales and marketing. But I am the CEO and co-founder, with Greg here, at TripleBlind. We’re building privacy enabled by mathematics, cryptography, and deep learning, and offering that as a service to unlock the value of data that has historically sat unmonetized and inaccessible inside data silos, companies, and organizations. Exciting mission here and a great team.
Greg: Hello, I am working with Das and Steve. My official title is chief operating officer, but when Das finds junior work that he doesn’t want to do, I pick it up. So that’s sort of the relationship of how things work here.
Deepti: Thank you so much for the introductions. I have actually been following what you’ve been writing, Riddhiman – I saw one of your quotes that said, “We will move the world from, don’t be evil to can’t be evil.” I believe that is your mission and purpose at TripleBlind. Would you like to talk about the deeper meaning of this purpose and mission? Not what you’re trying to do, but how did you get started with it?
Riddhiman: Sure. So the information revolution that we’re in the middle of has been accelerated significantly by the ability to share data between parties. Security was at first an afterthought and then as breaches happened it became a bigger deal. The state-of-the-art security measures take care of data when it’s at rest. Meaning you store it inside your database or in your file system and you can keep it encrypted and safe when it’s on its way to its destination. But once it reaches its destination, for the receiving party to be able to do anything useful with that data, they have historically had to decrypt it. When you decrypt it, you can do anything you want with the data because you have a copy of the data in the raw.
Deepti: Right, so that is where “We will move the world from, don’t be evil to can’t be evil” comes in?
Riddhiman: Yes, we believe that is a fundamentally broken experience and a broken way to deal with data. Because once you have a copy of the data in the raw, not only do you have the ability to do anything you want with it, the sender has no ability to enforce what should be done with it. It therefore limits the party holding the data to interacting only with trusted third parties, under contracts and lengthy negotiations, and the end user, whose data it ultimately is, ends up losing in the process. As we’ve seen with Capital One, Equifax, and all the other major breaches that happen almost every month at this point. So the idea behind “can’t be evil” is that we keep the data private, and the algorithms that operate on the data private, throughout their entire life cycle. So from the point of aggregating and building a new algorithm, to preprocessing, to utilizing the data in the model, to then encrypting and licensing the model. We have an entire pipeline that keeps data private using state-of-the-art privacy technologies in a way that does not allow a third party, whether malicious or genuine, to abuse the data even if they wanted to. So that’s the crux of “can’t be evil” versus “don’t be evil.” “Don’t be evil” is a best effort. “Can’t be evil” is a guarantee.
Deepti: That is clearly what is needed in today’s world and current crisis, where we need to create privacy-preserving tools that empower people to help fight COVID-19 and restart the economy. Can you elaborate more on how this technology works and how it intersects with the Safe Paths initiative?
Riddhiman: So clearly contact tracing has the potential to lead to a surveillance state where our individual liberties and freedoms, globally, may be at risk because of overreach by the authorities. We’ve seen a slippery slope of that in some other countries where similar contact-tracing technologies have been adopted. We wanted to show the world first that you don’t necessarily have to build a surveillance state if you just want to do contact tracing in an accurate, reliable way. Secondly, public trust is likely going to be higher if the contact-tracing efforts do not involve the government or the authorities knowing everything about your movement and everything about who you’ve come into contact with. We wanted to show that all of this can be done on a global scale without privacy being an afterthought.
Mahdi: Sure. So you guys understood that by utilizing this technology, we’ll be able to prevent a surveillance state and still get the benefits of a contact-tracing solution. Could you tell us a little bit about how TripleBlind got involved with Safe Paths and what that journey was like when Prof. Ramesh Raskar introduced you guys to the idea?
Riddhiman: Ramesh happened to be at a healthcare conference where he learned that about 60% to 70% of the U.S. population was going to be affected by COVID at some point in the next two years. To prevent that kind of widespread chaos of an infectious disease going through the entire population, there were members of the White House Task Force and others talking about, “Is there a better way to do contact tracing that does not have to rely on the individual’s memory?” Greg can talk a little bit more about how contact tracing is done because he has done a lot of customer discovery on that. It is a fundamentally broken process because it relies on people’s fallible memories. When Ramesh came to us with the idea, we wanted to jump on the opportunity because this was not only a chance to showcase that it can be done privately, but it was also going to have a lot of value for the world and the communities at large.
Greg: We had been working with Ramesh for quite a while at that point. We were doing research in privacy-preserving machine learning. His team has done a lot of research there and we were turning that research into more of a product versus a research project. It was because of that trusted relationship where Ramesh was able to recognize that the technology we had around privacy was pretty good. He was able to recognize it because of his privacy bent, and we all knew that for a contact tracing app solution to be adopted widely, we definitely need to look at bulletproof privacy. Especially here (U.S.), nobody’s going to use it if it’s not really private. When Ramesh called and asked us if we would be willing to help, we absolutely thought it would be a great thing to help with because it’s in the sweet spot of what we eventually want to do. Think of a world a couple of years from now where your personal data can be kept private and you can unlock as much of that data or any level of access to that data as you want; it really changes the way many things in the world could work. Imagine a world where you could get your Amazon shopping history onto your own private vault and turn the tables on Amazon, so that Amazon and those Amazon advertisers had to pay you to run across your data to advertise to you. That would be enormous, right? So you know, for us, the opportunity to think around what this app could mean and how much this app could change people’s perspectives on their own data and how to control their own data is a big deal. A very big deal. So not only is it a good thing for the world, but it was a good thing to get excited about. Then we turned to Steve and said, “Okay, now do all the work.”
Steve: Yeah, and one of the things that we recognized with Ramesh early on is that this particular problem we were looking at wasn’t an academic question. It was really about what we could do in an extremely short timeframe that would be usable around the world. So this really did play into the sweet spot of the thinking that Ramesh had done, as well as the experience we had building products at our previous companies. From a product-centric point of view, Greg going out and doing those interviews with contact tracers early on brought the knowledge of what we needed to build and how it needed to function for it to be not only a decent idea, but a practical idea. It all came together. This allows Safe Paths (the mobile app) and Safe Places (the web tool) to exist and be well-positioned as solutions that healthcare authorities would want to adopt, as well as consumers of the mobile app itself.
Deepti: Right. I believe an ethical data strategy with privacy and consent as default settings in products, services, and experiences is no longer seen as just an additional quality for the current COVID-19 health crisis. These elements are becoming key drivers for a more stringent legislative landscape as well. Initiatives like Safe Paths and companies like TripleBlind give a ray of hope that privacy does not have to be a trade-off for utility. That was one of my primary reasons to join this MIT-led initiative. So far we have talked about the consumer-centric or user-centric point of view, or a health-authority point of view. At the same time, we’re finding that people have been looking at all this under a legal lens as well. For example, GDPR compliance. Steve, since you’ve been working so closely with Safe Paths, would you say that Safe Paths is GDPR compliant?
Steve: Sure. One thing that we should probably note is that in my previous life I was working with an organization called Mycroft, where we were building voice assistants. It was during my tenure there that GDPR emerged into the world. It hasn’t been around very long, relatively speaking. A lot of Mycroft’s work was about voice interaction technology, which was one of the first times people started to get creeped out about having a microphone sitting in your house listening to every word you say, and about what the possibilities are when you have that sort of intimate data collection going on. That experience and that knowledge, I think, helped guide me into what we were building here at Safe Paths. GDPR, in particular, is Europe’s set of regulations that for the first time really codified what an individual’s rights are. It lists out several different rights. In the past, some rights were honored by technology companies, but not consistently. It was more a matter of convenience, or companies would respond whenever there was a data breach. GDPR was the first time that it became a requirement for these companies to consider these things throughout the user life cycle: from how a consumer grants their consent to having data collected on them, to how it’s stored while they are an active participant, to how users are able to detach themselves from a platform and how their data is treated once they have separated from it and withdrawn their consent.
Steve: At Safe Paths we are looking through this mechanism that we’re building out, knowing in addition that the organization we are building isn’t going to be the one who actually does the data storage. We are, in GDPR terms, one of the controllers of the data, but we’re not the sole controller. This is a fairly unique application in that there are really three controllers of the data. There is the technology that we’ve built and provide through the privacy-preserving mobile app. The second controller is the individual user, because they literally hold the data on their own cell phone, not in any cloud. Then the healthcare authority is a third controller, participating when they receive data from infected individuals. This is a somewhat unique use case in that data is stored in multiple places at different points in time, with different parties controlling it. Pulling all of that together is one of the things I believe we pulled off really well with Safe Paths and Safe Places: allowing all three parties to work together in a fairly seamless way to handle this data, and to do it in such a way that it meets the requirements of GDPR.
Mahdi: You bring up some trends from the past that you noticed when working with voice assistants and how the public reacted to the technology. As the world continues to evolve and transform, why are Safe Paths and TripleBlind’s mission important for today’s society with how we interact as individuals and businesses moving forward?
Riddhiman: So one of the fairly transformative reads I’ve consumed in the last couple of years is “The Age of Surveillance Capitalism” [by S. Zuboff], which is the idea that private human experience has been monetized by companies for profit. The raw material of the advertising-based business model is that, first of all, I need to be able to predict what each one of you are going to be doing next. Whether it’s what are you going to purchase, what are you in the market for, or what can I influence you to purchase? Therefore, that leads to this private human experience being monetized and exploited by these business models that have emerged as a result of the ad-supported business model of the internet. The second-order effects of that are that individuals and users have been left without control over their own data. That has had dramatic consequences for the world. Such as the influence of elections, the rise of nationalism, and all kinds of violations of individual privacy. TripleBlind exists with the long-term mission to make sure that there are alternate models here that are actually better. Models where individuals can own their own data, they can move it around their providers, and can monetize it themselves. As opposed to letting whoever happens to be holding the data monetize it on their behalf with the current models. This puts the individual back in control of the online experience. It’s a similar philosophy on the organizational side as well. Today, when two organizations or two companies work together, they exchange information in a way that can violate the trust that is placed upon that information or placed upon the business relationship. This causes lots of business opportunities to not be pursued. Enabled by privacy-preserving and privacy-enforcing technologies, like what we have, we believe it will unlock a whole new wave of innovation that’s waiting to be explored behind data sets and algorithms that are too private to share today.
Greg: We actually think there are some governments that are getting behind it as well. The California Consumer Privacy Act is their attempt to address many of the concerns Das just articulated. It’s their legislative attempt to give you the right to request your data, to take charge of your data, etc. What we’re hoping to do is align with those kinds of efforts and give people the tools to actually follow through on what they’re now legally able to do.
Deepti: Surveillance capitalism is definitely one issue which we all face, but there’s also, especially in the context of this pandemic, surveillance state measures that are being taken right now. Maybe not in the U.S., but different countries. We’d like to maintain these privacy-preserving principles at Safe Paths, but it’s still an open-source tool. So how do we ensure specifically in the context of Safe Paths that this is not leading to one, a surveillance state, and secondly, how do we manage this privacy aspect when it is an open-source tool?
Steve: Well, I think one of the things that we can talk about is privacy-by-design, and this is a great example of it. The simple reaction most people had when first thinking about digital contact tracing was, “Oh, well that’s super easy to do. All we have to do is pull everyone’s location into one central place and then do some comparisons on this one giant server.” While that appears at first to be the only way to do it, it’s bad because someone has to know everyone’s information in order to achieve it. What we were able to achieve at Safe Paths is a different way of looking at the problem: push the responsibility and the computational load out to the individuals. So instead of everyone having to send all of their information back to this “trusted” party, they trust themselves. This gets to that “can’t be evil.” We don’t know everyone’s location, so there’s nothing we can do with everyone’s location. It’s impossible for someone who adopts the Safe Paths approach to abuse that information because they never see it. The real private information remains in the control of the individual. That design is fundamentally what creates a “can’t be evil” kind of situation, and it’s a design philosophy TripleBlind follows as well.
Mahdi: Steve, please touch a little bit more on that point. Should people be concerned that their data or privacy could be breached when utilizing an app like this, and why not? More on the technical side of things. Can you elaborate on how it works and why it’s not possible for that to happen?
Steve: Sure. So it’s tricky for a consumer: when you’re just running an app, you don’t always know what’s happening under the hood (backend). Safe Paths is unique in how it traces location information. That tracing is being done by the individual and it’s being kept local. So, literally, my phone is where I’m storing the coordinates where I walk around, and I am the one in control of that information; it never gets sent to anyone else without my active consent and my action to make that sharing occur. So as an individual, I’m walking around, my cell phone is recording where I’m going, and that stays in my pocket. Then the Safe Places technology (the web tool for healthcare authorities) brings in the ability for an individual to share that information with the healthcare authority if they are a verified COVID-19 case. Then this person is able to distribute their information to all of the other people in their life and in their community, allowing those individuals to personally compare their private data against this public information that the verified positive person, in conjunction with the healthcare authority, exposed. So for example, let’s say I got diagnosed with COVID-19 and I wanted to share with my community my past several days of location history to let them know, “Oh, I went to the library on Tuesday afternoon and I was at the shopping mall on Monday.” These solutions give us the ability to distribute that information. It doesn’t expose my personal privacy, other than that someone with COVID-19 was at the library and someone with COVID-19 was at a shopping mall. It doesn’t dig down to say that I was the individual who was there. Then that information is published to the public record so that every individual’s cell phone can reach out, pull that information down, and make a comparison for themselves to see if they have a reason to be concerned.
Whether they were actually at that library or at that shopping mall at a time that means they should be concerned. This is contrasted with most of the other applications, which generate more of what is referred to as the “surveillance state”: they pull everyone’s information and keep all of it in a centralized location to make the comparison calculation. Which leads to the slippery-slope problem of, “Well, since we have all that information in one spot, what else can we possibly do with it?” We’ve eliminated that one spot. So we eliminate that temptation.
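[Editor’s note] The on-device comparison Steve describes can be made concrete with a minimal sketch. This is purely illustrative Python, not the actual Safe Paths code: the names (`Point`, `exposure_matches`) and the distance/time thresholds are our assumptions. The idea is that the phone keeps its own location trail locally and checks it against the redacted trail a health authority publishes for a confirmed case, so the trail never leaves the device.

```python
import math
from dataclasses import dataclass


@dataclass(frozen=True)
class Point:
    lat: float  # latitude in degrees
    lon: float  # longitude in degrees
    ts: int     # Unix timestamp, seconds


def haversine_m(a: Point, b: Point) -> float:
    """Great-circle distance between two points, in meters."""
    r = 6371000.0  # mean Earth radius, meters
    p1, p2 = math.radians(a.lat), math.radians(b.lat)
    dp = math.radians(b.lat - a.lat)
    dl = math.radians(b.lon - a.lon)
    h = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(h))


def exposure_matches(my_trail, published, max_m=10.0, max_s=1800):
    """On-device check: which of MY points were near a published point?

    `published` is the health authority's redacted trail of a confirmed
    case, pulled from the public record. Nothing about `my_trail` is
    ever uploaded; the intersection is computed locally.
    """
    return [
        mine for mine in my_trail
        if any(
            abs(mine.ts - p.ts) <= max_s and haversine_m(mine, p) <= max_m
            for p in published
        )
    ]
```

Because the comparison runs on each phone, no central party ever holds everyone’s full location history, which is exactly the “one spot” the centralized designs create and Safe Paths eliminates.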
Deepti: Right. Thank you so much for the details on the tech-side about how this initiative is being developed and moving forward. We would also like to know a bit about the three of you as well. You clearly have the common set of values on creating the new paradigm of compounded value. But how did you all get together in the first place as part of TripleBlind? That’d be interesting for us to know.
Greg: To shorten a very long story: I was teaching machine learning at the University of Kansas and went looking for real data to teach my course with. I met Das at a meetup in downtown Kansas City. It turned out that we are very aligned on a lot of things, privacy being one of them, but many others as well. He convinced me to quit teaching full time, transition to part-time teaching, and work full time at the company he was with at the time. We worked together for three and a half years. Last summer we decided to go do TripleBlind. When we started something focused on privacy, we went looking for the best developers in town who were also aligned with our privacy principles. It didn’t take very long for Steve’s name to pop to the top of the list, as one of the best guys in town working on one of the most interesting privacy-oriented projects at Mycroft. It turned out that he bonded with the mission of what we were doing, and the rest is history. That’s how we all came together.
Deepti: I believe that TripleBlind is clearly showing us a better way to be private and it is completely in line with what Safe Paths is here to achieve. As the future of privacy evolves, we need to look at opt-in models rather than opt-out models. We all know that data is fundamental to our situational awareness and response to such pandemics. The mining of such data informs the design and implementation of services that are required to solve this very pandemic. At Safe Paths we believe in building an architecture of trust in a privacy-preserving approach to addressing these challenges, and we are very happy to have TripleBlind as our collaborator.
Mahdi: Thank you so much for sharing everyone. We really appreciate you joining us today to share your thoughts on why and how TripleBlind came together around this initiative at Safe Paths. We believe this is a super important mission and we have hopes for launching it publicly here in the United States and in communities across the entire world.
Riddhiman, Greg, Steve: Thanks for having us. Thank you, guys. It was a pleasure.