
Are developers the new lawyers-to-be? Embedding privacy into software apps


Since the adoption of the GDPR, we all expect developers to embed privacy into existing and forthcoming software applications. But ask a thousand people to define what privacy is and you’ll get a thousand different answers.

For some, privacy is nothing more than secrecy. For others, it touches everything related to their lives. Yet somehow, we expect developers to know what the “right” answer is for everyone, arguing that all they have to do is “respect users’ privacy” when developing personal-data-collecting apps. But how exactly? This is the part we never talk about: how privacy should effectively be embedded into existing and forthcoming designs.

Hacking the privacy principles: turning legal requirements into 1s and 0s

While it is easy to say that developers should respect users’ privacy, hard-coding these principles into software systems certainly is not. Still, that is precisely what is expected of them, as the GDPR sets out numerous privacy principles for developers to implement. What it does not provide, though, is clear guidance on how these principles should be implemented.

As a result, we observe that GDPR concepts like “data minimisation” remain a blur for many. According to the regulation, “Personal data shall be: adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed (‘data minimisation’)”. But what is adequate and relevant can be interpreted in many different ways. So how can developers know that they have got it right?
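To make this less abstract, here is a minimal TypeScript sketch of what data minimisation can look like in practice. The field names, the “account creation” purpose and the record shape are our own illustrative assumptions, not anything prescribed by the GDPR; the point is simply that the code persists only the fields the stated purpose actually needs.

```typescript
// A sketch of "data minimisation" in a signup flow. Field names, the purpose
// and the retention choices are illustrative assumptions, not GDPR text.

// What the signup form may send us (forms and third-party SDKs often send
// more than we asked for).
interface SignupRequest {
  email: string;
  password: string;
  birthDate?: string;
  phoneNumber?: string;
  marketingOptIn?: boolean;
}

// What we actually persist for the purpose "create and secure an account":
// nothing beyond what that purpose requires.
interface AccountRecord {
  email: string;
  passwordHash: string;
  createdAt: string;
}

async function createAccount(
  req: SignupRequest,
  hash: (s: string) => Promise<string> // e.g. bcrypt/argon2, injected
): Promise<AccountRecord> {
  // Map explicitly, field by field: extra attributes in the request
  // (birthDate, phoneNumber, ...) are dropped instead of stored "just in case".
  return {
    email: req.email,
    passwordHash: await hash(req.password),
    createdAt: new Date().toISOString(),
  };
}
```

Even if the incoming request carries a birth date or a phone number, the stored record has no place for them, so data the purpose does not require never reaches storage.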

Mostly, they don’t. In their study Why developers cannot embed privacy into software systems – an empirical investigation, researchers Senarath and Arachchilage specifically recognize this fact as one of the main problems developers face when attempting to embed privacy into their designs: they simply don’t know how to verify that they’ve done it right.

Now, if they dug deeper into the GDPR, developers would know that “appropriate technical and organisational measures, such as pseudonymisation […] are designed to implement data-protection principles, such as data minimisation” (art. 25 § 1 GDPR). The thing is: do you know many developers who have actually read the 88 pages of the GDPR? Certainly not. And even if they had, would they be able to keep track of all the data protection principles it refers to and understand how to interpret them?
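As an illustration only, here is a small TypeScript sketch of pseudonymisation as one such technical measure: records carry a keyed pseudonym instead of the raw identifier, and the key needed to link a pseudonym back to a person is kept apart from the data. The key handling and the record shape are assumptions made for the example, not a prescribed implementation.

```typescript
// Minimal pseudonymisation sketch: analytics events reference a keyed
// pseudonym, never the raw user id.
import { createHmac } from "node:crypto";

// In a real deployment this key would live in a secret store, separate from
// the analytics dataset, so the dataset alone cannot identify anyone.
// (The env-variable name and fallback are assumptions for this example.)
const PSEUDONYMISATION_KEY = process.env.PSEUDONYMISATION_KEY ?? "dev-only-key";

function pseudonymise(userId: string): string {
  return createHmac("sha256", PSEUDONYMISATION_KEY)
    .update(userId)
    .digest("hex");
}

interface AnalyticsEvent {
  subject: string; // pseudonym, not the user id
  action: string;
  occurredAt: string;
}

function recordEvent(userId: string, action: string): AnalyticsEvent {
  return {
    subject: pseudonymise(userId),
    action,
    occurredAt: new Date().toISOString(),
  };
}

// The same user always maps to the same pseudonym, so usage can still be
// analysed without the dataset revealing who is behind each event.
console.log(recordEvent("jane@example.com", "login"));
```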

We don’t expect lawyers to be able to implement technical solutions. So why expect developers to be able to understand and navigate the law?

Minding the gap between tech and privacy experts: a collaborative approach

Clearly, this situation calls for collaboration between these different subject-matter experts – lawyers and developers – so that they can build digital solutions that are both legally compliant and technically feasible.

At Pryv, we understand the burning need for such solutions, which is why we provide developers with a ready-to-use software foundation on which they can build their own personal-data-collecting apps. Call it GDPR, CCPA, PIPEDA or any other existing or forthcoming privacy regulation, our software was built to help developers address the complexity of such privacy-by-design legal requirements and meet business constraints such as short deadlines and restricted resources.

Expectations can equal Reality

Today, we have set high expectations: everything should be available one click away. While it is easy to complain that privacy is not respected, it is much harder to face the reality of the situation…

So really, are developers lawyers-to-be? Maybe not, as solutions are there to help:

Check out our work at Pryv: embedding privacy into software apps 👇

Now available for free in open source: [github-link].

pryv.com

Stephanie & Evelina @ Pryv

Sources: