In the second installment of our Navigator MasterClass on Emerging Trends, we explore Privacy Enhancing Computation, a field of study that aims to safeguard sensitive data throughout any computing process. This goes beyond conventional encryption methods. Below are some of the well-known models within this scientific discipline:
- Adding random noise to data before sharing (Differential Privacy);
- Distributing model training across multiple machines/devices so raw data never leaves them (Federated Learning);
- Retrieving information from a database without revealing what was requested (Private Information Retrieval);
- Processing encrypted data without ever decrypting it (Homomorphic Encryption);
- Computing jointly on data shares without any party learning another party's share (Secure Multi-Party Computation).
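To make the first of these concrete, here is a minimal, illustrative sketch of Differential Privacy's classic Laplace mechanism. The data, query, and epsilon value are invented for illustration, not drawn from any of the systems discussed here:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    # Inverse-CDF sampling from a Laplace(0, scale) distribution.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def dp_count(values, predicate, epsilon=1.0):
    """Differentially private count: true count plus Laplace noise.
    A counting query has sensitivity 1, so the noise scale is 1/epsilon."""
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical survey data: release how many respondents are 40 or older.
ages = [23, 31, 45, 52, 29, 61, 38, 47]
noisy = dp_count(ages, lambda a: a >= 40, epsilon=0.5)
```

The smaller the epsilon, the larger the noise and the stronger the privacy guarantee; the released count is useful in aggregate but shields any individual respondent.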
Although the models have existed for a considerable period, the convergence of numerous factors has intensified the need for their implementation. These include concerns over privacy, security risks, regulatory scrutiny, widespread availability of computing resources, and the prevalence of ubiquitous computing - which refers to the seamless integration of technology into our everyday lives, allowing us to remain connected constantly. Before delving further into the topic, let us first examine some present-day applications.
Not surprisingly, IT companies were the early adopters of privacy enhancement advancements. Microsoft used Homomorphic Encryption for its Personal Health Information Exchange Program. It also deployed Homomorphic Encryption coupled with Secure Multi-Party Computation for its tool “Project WhiteNoise”, targeted at data scientists in the healthcare and financial services industries. Google uses Federated Learning with Homomorphic Encryption to analyze data across millions of devices. It also uses Differential Privacy to collect data on search queries and usage patterns.
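To illustrate the homomorphic idea (a toy sketch, not Microsoft's or Google's actual implementation), here is the textbook Paillier cryptosystem, which is additively homomorphic: multiplying two ciphertexts yields a ciphertext of the sum of the plaintexts. The primes below are deliberately tiny for readability and are wildly insecure:

```python
import math
import random

# Textbook Paillier keypair with tiny demo primes (NOT secure sizes).
p, q = 293, 433
n = p * q                       # public modulus
n_sq = n * n
g = n + 1                       # standard choice of generator
lam = math.lcm(p - 1, q - 1)    # private key (Carmichael function)
mu = pow((pow(g, lam, n_sq) - 1) // n, -1, n)  # precomputed inverse

def encrypt(m: int) -> int:
    r = random.randrange(1, n)  # fresh randomness per ciphertext
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(c: int) -> int:
    return ((pow(c, lam, n_sq) - 1) // n * mu) % n

a, b = 17, 25
c_sum = (encrypt(a) * encrypt(b)) % n_sq  # multiply ciphertexts...
assert decrypt(c_sum) == a + b            # ...to add the plaintexts
```

A server holding only ciphertexts can thus compute sums (and, by repeated multiplication, scalar products) over data it can never read, which is the property exploited in the analytics scenarios above.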
Apple has used Differential Privacy since iOS 10 to suggest words and phrases as users type. IBM used Secure Multi-Party Computation in work with the MIT Media Lab. MIT itself has developed a tool called "Falcon" that uses Homomorphic Encryption and Secure Multi-Party Computation to execute computations on sensitive data without revealing the underlying data to researchers.
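Secure Multi-Party Computation can be illustrated with its simplest building block, additive secret sharing. The three-party scenario below is hypothetical, chosen only to show the mechanics:

```python
import random

MOD = 2**61 - 1  # all arithmetic is done modulo a public prime

def share(secret: int, n_parties: int):
    """Split a secret into n random additive shares that sum to it mod MOD."""
    shares = [random.randrange(MOD) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % MOD)
    return shares

# Three parties each hold a private value (e.g. a patient count).
inputs = [120, 85, 203]
# Each party deals one share of its input to every party (including itself).
dealt = [share(x, 3) for x in inputs]
# Each party locally adds the shares it received -- each share is
# uniformly random, so no party learns anything about the others' inputs.
partial = [sum(col) % MOD for col in zip(*dealt)]
# Only when the partial sums are combined does the total emerge.
total = sum(partial) % MOD
assert total == sum(inputs)  # the joint sum, 120 + 85 + 203 = 408
```

Real MPC protocols build multiplications and comparisons on top of this primitive, but the core promise is the same: the parties learn the agreed-upon result and nothing else.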
The emergence of these models into mainstream computing has also started to solve some prickly problems in the IT world. As an example, McKinsey pointed out that the decentralized architecture of Web3 might put data security and privacy at risk. At least two solutions have already been proposed. In a research paper published on ScienceDirect, a team of Spanish researchers combined Homomorphic Encryption with blockchain. An IEEE paper, on the other hand, turned the data-sharing problem in industrial IoT into a machine learning problem and then solved it with Federated Learning.
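The Federated Learning pattern can be sketched as federated averaging (FedAvg): devices train on data that never leaves them and share only model weights with a coordinating server. The devices, data, and one-parameter model below are invented for illustration and bear no relation to the IEEE paper's setup:

```python
# Minimal FedAvg sketch for a one-parameter linear model y = w * x,
# trained by gradient descent on squared error. Raw data stays on
# each device; only the weight w travels to and from the server.

def local_step(w, data, lr=0.01):
    # One pass of gradient descent over this device's private data.
    for x, y in data:
        grad = 2 * (w * x - y) * x  # d/dw of (w*x - y)^2
        w -= lr * grad
    return w

# Three devices, each holding private samples of the true relation y = 3x.
devices = [
    [(1.0, 3.0), (2.0, 6.0)],
    [(3.0, 9.0), (1.5, 4.5)],
    [(2.5, 7.5), (0.5, 1.5)],
]

w_global = 0.0
for _ in range(50):
    # Each device trains locally, starting from the current global weight...
    local_ws = [local_step(w_global, d) for d in devices]
    # ...and the server averages the returned weights into a new global model.
    w_global = sum(local_ws) / len(local_ws)
```

After a few dozen rounds the averaged weight converges toward the shared relationship (w close to 3) even though the server never sees a single data point.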
In a modified approach, an article by Chinese researchers proposed a model of Differential Privacy for situations where IoT/Edge Computing is in location-sharing mode (which can lead to private data disclosure). So it is clear that privacy-enhancing computation will change how some emerging technologies are eventually implemented, especially in the world of distributed data. A reduced version of this approach is used by the US Census Bureau, which deploys Differential Privacy while collecting data; notably, the Bureau also uses Secure Multi-Party Computation when multiple analysts work on the data.
In a few cases in the emerging technology landscape, privacy enhancement will create boundaries, an example being pervasive computing. This always-on, everywhere-connected architecture is in direct conflict with privacy enhancement. An IEEE paper that examined this issue suggests that the solution lies in restricting pervasive computing and in stricter privacy regulations.
In any case, the mainstreaming of privacy consciousness has motivated many IT departments to deploy various privacy-enhancing computation techniques. Amazon uses Homomorphic Encryption during customer data analysis; the European Central Bank used the same technique while collecting sensitive data from banks on their risk exposure. The need is even more pronounced in medical research: the International Genomics Consortium (IGC), a research organization focused on studying genetic mutations and their impact on cancer, uses Secure Multi-Party Computation, and the UK NHS's "Data Safe Haven" platform uses the same model coupled with Differential Privacy to analyze patient data.
Having said all that, we must recognize that the biggest challenge in using privacy-enhancing computation is its complexity: it is just plain hard to comprehend. In a paper presented at ACM's 2021 CHI Conference on Human Factors in Computing Systems, the following implementation issues were identified: tractability of a nebulous concept, usability by developers, accountability, and explainability.
If one were to explore alternatives, one could look at the suggestion of an HP Labs researcher who published on Cornell University's arXiv: use trusted computing, with its "root of trust" encompassing hardware and software. This, however, transfers the onus of privacy from software to hardware and is therefore more cumbersome and expensive. At best it can be used for high-security applications; an example is the HSM (Hardware Security Module) used in payments processing, a black box used to generate and validate the CVV of credit and debit cards. (It can, by the way, be used to secure cryptographic keys in any application.)
And a parting thought: it is just as well that internet cookies are on their way out; privacy-enhancing technologies would have disallowed them and rendered them useless anyway.
- The author, Akash Jain, managed large IT organizations for global players like MasterCard and Reliance, as well as lean IT organizations for startups, with experience in financial and retail technologies.