FaceApp and PPA

By Ameesh Divatia, CEO and co-founder | July 22, 2019

According to the Wall Street Journal, FaceApp has been downloaded more than 95 million times worldwide, with more than 20 million of those downloads occurring since July 11. That is a lot of people who want to see what they look like in 20 years. Step away from the selfie fun, though, and this is another example of the race to compete for big-data business. What we’re seeing is a method to gather more data points that act as “bits and bytes fuel” to enhance someone’s AI model.

The Journal article also states (and this is a shortened version): According to its terms of service, FaceApp retains a “perpetual, irrevocable, nonexclusive, royalty-free [and] worldwide” license to share a user’s photos and “any name, username or likeness provided” in broad and unspecified ways… users consent to the transfer and storage of data to the U.S. and other countries “where you may not have the same rights and protections as you do under local law.”

But even now, few users ever read these exhaustive (and grossly unfair) privacy policies, which are broad in scope and weighted entirely toward the business. Now Congress is asking the FBI to investigate FaceApp because of its Russian ties, a dramatic example of what happens virtually every day on the Internet. Data has value. And despite the warnings, users continue to blindly give up their data in exchange for something of value, which could simply be entertainment.

But there must be a balance between data for business and data privacy: a way to collect and analyze increasing volumes of personal data while protecting the privacy of individual users from any threat, internal or external. The security industry (Baffle included) is rallying around the notion of privacy-preserving analytics (PPA) to allow companies to share data while maintaining confidentiality and not exposing encryption keys.

Functionally, PPA is the ability to derive business benefit from a set of data records that are too sensitive to process in the clear. It offers a way to ensure that analysis of a data set neither uncovers an individual’s contributions nor reveals encryption keys. And if legislated as a standard, it could assure users that yes, they are giving data in exchange for a service, and that data will be used and analyzed, but their confidential information will be protected.
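One common building block for the “doesn’t uncover an individual’s contributions” property is differential privacy, which perturbs aggregate results just enough to hide any single person’s record. The sketch below is only an illustration of that idea, not a description of any particular vendor’s implementation; the function name, data, and epsilon value are invented for the example.

```python
import numpy as np

def private_count(records, predicate, epsilon=1.0):
    """Count records matching a predicate, adding Laplace noise so the
    result does not reveal whether any single individual is in the data."""
    true_count = sum(1 for r in records if predicate(r))
    # A count query has sensitivity 1: adding or removing one person
    # changes the true answer by at most 1, so noise ~ Laplace(1/epsilon).
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# Example: how many users opted in, without exposing any one user's answer.
users = [{"opted_in": True}, {"opted_in": False}, {"opted_in": True}]
print(private_count(users, lambda u: u["opted_in"], epsilon=0.5))
```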

At first glance, PPA offers a tremendous opportunity to exploit the value in data collected from customers without sacrificing privacy. The devil, as always, is in the details, and in this case the details are the operational issues of implementing such a solution. Manipulating data, whether through masking, anonymization or encryption, involves an inline agent that transforms the original data. This implies a loss of control for the data owner, because the transformed data is no longer recognizable. It takes smart provisioning to make sure that retrieving the original data from the transformed data is just as easy as transforming it in the first place. This is the primary barrier to more organizations adopting PPA as a standard practice in their normal operating processes.
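To make that transform-and-restore round trip concrete, here is a minimal sketch using symmetric encryption via the Python cryptography library’s Fernet API. The field value and key handling are purely illustrative; a production deployment would manage keys through a proper key-management service rather than generating them inline.

```python
from cryptography.fernet import Fernet

# The "inline agent": transforms a sensitive field before it leaves the
# owner's control, and reverses the transformation on retrieval.
key = Fernet.generate_key()   # illustrative; in practice the owner holds this key
agent = Fernet(key)

original = b"jane.doe@example.com"
transformed = agent.encrypt(original)   # what the downstream system stores
restored = agent.decrypt(transformed)   # retrieval should be just as easy

assert restored == original
print(transformed)  # unrecognizable to anyone without the key
```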

New technologies, like Baffle’s Advanced Data Protection solution, let companies operate on data without weakening encryption or compromising privacy. The key takeaway is to learn from trends like this one and move toward a framework that maintains security and preserves privacy during analysis without compromising data utility.
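Baffle’s actual mechanism isn’t detailed here, but one simple way to see how certain analytics can still run over protected values is deterministic pseudonymization: equal inputs map to equal tokens, so grouping and counting work without ever exposing the raw identifiers. The sketch below illustrates only that general idea; the key, field names, and data are invented for the example.

```python
import hashlib
import hmac
from collections import Counter

SECRET = b"owner-held-key"  # illustrative only; a real key never leaves the data owner

def protect(value: str) -> str:
    """Deterministic pseudonym: equal inputs map to equal tokens,
    so grouping and counting still work on the protected form."""
    return hmac.new(SECRET, value.encode(), hashlib.sha256).hexdigest()

events = [("alice", "login"), ("bob", "login"), ("alice", "purchase")]
protected = [(protect(user), action) for user, action in events]

# Analytics on protected data: events per (pseudonymous) user.
print(Counter(user for user, _ in protected))
```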

Let people have some fun, and whet their curiosity about the future, without compromising their personal data.

Want to learn more? Watch our 90-second video on how Baffle can operate on encrypted data without any application code modifications and still preserve application functionality, or request a demo with one of our experts.