
Verified Differential Privacy for Julia


Differential privacy can be used to prevent leakage of private information from published results of analyses performed on sensitive data. Applying it correctly requires handling the extra complexity introduced by the technique, on top of the complexity of the analysis procedure itself. A proposed relief comes in the form of type systems: they track the privacy properties of functions in their types, so that successful typechecking amounts to a proof of sound privacy guarantees. This aids the programmer in reasoning about code, detects implementation errors that are otherwise very hard to notice before someone falls victim to a privacy breach, and can give formal guarantees to the people whose privacy is claimed to be protected. This project will implement such a typechecker for the Julia programming language. Julia is a high-level, high-performance, dynamic programming language; while it is a general-purpose language and can be used to write any application, many of its features are well-suited for high-performance numerical analysis and computational science. The typechecker should enable data scientists to verify the privacy guarantees of a Julia function before they start working with real user data.
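To make this concrete, here is a small Julia sketch of the kind of code such a typechecker would reason about: the Laplace mechanism, the basic building block of differential privacy. The function names and the keyword parameter are hypothetical illustrations, not the project's actual API.

```julia
# Draw a sample from the Laplace distribution with the given scale,
# using inverse-CDF sampling on a uniform variate in (-0.5, 0.5).
function laplace_noise(scale)
    u = rand() - 0.5
    return -scale * sign(u) * log(1 - 2 * abs(u))
end

# Release a counting query with epsilon-differential privacy.
# A count has sensitivity 1: changing one person's record changes the
# result by at most 1, so adding Laplace(1/epsilon) noise makes the
# released value epsilon-differentially private.
function private_count(predicate, rows; epsilon = 0.1)
    true_count = count(predicate, rows)
    return true_count + laplace_noise(1 / epsilon)
end
```

A typechecker of the kind proposed here would verify, from the code alone, that `private_count` satisfies epsilon-differential privacy with respect to its `rows` argument, while rejecting a variant that accidentally released `true_count` without noise.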

Why does this actually matter to end users?

Worries over our health and safety will in many cases take precedence over the perceived value of our privacy. When it comes to our physical health and well-being, we are often in a strongly dependent position. Especially in times of great mental stress (like when a medical doctor breaks bad news to us) or fear (a daughter who is late coming home from school), we often lack the time and knowledge to really consider which data we actually want to make available, and under which conditions. Many people in such situations reach a point of detachment and panic where they hand out whatever data is requested of them by whomever promises to resolve the stress. And once data is out there, it is hard to take back.

But what if we do not have to give up our privacy for the sake of better, more personalized health care, or for our safety? What if we can have both? To learn more about our physical and mental health, research needs to be done that analyzes sometimes very sensitive and personal information. What if we could make the analysis of this data worthwhile and beneficial to scientific knowledge, without compromising the privacy of anyone involved? That is where the concept of differential privacy comes in: it makes it possible to publish useful aggregate information from a database without disclosing anything too personal about any individual. This project aims to build differential privacy into software written in the Julia programming language. This way, analysis software programmed in Julia will protect personal data by design, offering both scientists and the subjects of their studies provable privacy guarantees.


This project was funded through the NGI0 PET Fund, a fund established by NLnet with financial support from the European Commission's Next Generation Internet programme, under the aegis of DG Communications Networks, Content and Technology under grant agreement No 825310.
