Risk Acceptance and Security Trade-offs: Managing Performance vs. Security
I recently read a note arguing that it is futile for engineers to spend performance cycles optimizing data encryption operations, and I couldn't help but be taken aback. In a world where everyone wants to move faster, to get information faster, and to access their data faster, someone was arguing that accelerating data encryption operations, and consequently transactions, was useless.
At Baffle, we are asked multiple times every day, and challenged constantly, to prove out our performance, deliver metrics, and make recommendations for optimization. Questions about performance are at the top of the list when it comes to how our data protection technology works and how it can be implemented. To that end, we often explain in painstaking detail how we parse data and how we have minimized performance overhead.
Let me be clear — there’s nothing wrong with being concerned about performance. It should be top of mind for anyone working in technology or charged with delivery of an application or service. It just came as a bit of a shock to read that someone felt it was useless to focus on performance as it relates to data protection and security.
But it also made me realize the balance many of our customers need to strike when recommending a security control or approach that could adversely impact the performance of an application. In a recent engagement, we worked with a Fortune 50 firm processing IoT data that told us the project initiative had been on hold for eighteen months because they couldn't find a solution that addressed both the security and performance requirements simultaneously.
Eighteen months. How’s that for the “shift left” crowd?
But what happens when the business decides the security control slows down an already slow application? In most cases, someone (e.g., a business or data owner) accepts the risk. What happens when the security control slows down development or is too difficult to implement and manage? Someone accepts the risk. What happens when data is transferred to the cloud, or through staged cloud environments, and no easy solution exists to ensure the data is protected across the various production and non-production stages?
Someone accepts the risk. Someone accepts that data will sit in the clear, exposed.
You might think, "how could anyone in security tolerate that?" The nuanced reality is that security leaders often rely on risk mitigation and risk transfer to manage the multitude of threats they face. It's akin to managing a threat portfolio, if you will. But without a viable and transparent solution for data security, the requirements of availability and performance will win over security almost every time. That puts security in the position of leaving data exposed or risking a breakdown in the business.
That's a pretty big trade-off. It pits business against security and vice versa, when that's the last thing we need in infosec right now. But it's also one that Baffle is in a unique position to help address. Our Data Protection Service remains invisible as a data-centric protection layer, integrating natively with other services and applications without any code changes. And because we remain invisible from a performance-overhead perspective as well, we eliminate the need to make the trade-off in the first place.
That's a pretty bold claim to make, but one we're confident in proving out. So you may be asking, "if your numbers are so good, what are they? What can we expect in terms of performance?"
Here are some metrics:
- In a global environment with over five billion records, Baffle’s Data Protection Services were measured at one to two milliseconds of overhead for encrypted data transactions.
- In an implementation with Postgres using pgcrypto, Baffle was measured at approximately 100 times faster than the pgcrypto implementation across more than 250 million API calls.
- Measured in tokenizations per second (tps), the solution is 4 to 20 times faster than codebook tokenization solutions.
- In a single-instance bulk migration, we encrypted roughly 100 million records per hour without any scale-out.
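If you want to sanity-check numbers like these against your own workload, the measurement itself is straightforward: time the same operation with and without the protection step and compare medians. The sketch below is a minimal, hypothetical micro-benchmark; the `protected` function uses SHA-256 hashing purely as a stand-in for whatever encryption call sits on your write path (it is not Baffle's implementation, and the helper names are mine).

```python
# Illustrative micro-benchmark for per-operation data protection overhead.
# The "protected" path uses SHA-256 as a STAND-IN for an encrypt call;
# swap in your actual encryption routine to measure real overhead.
import hashlib
import statistics
import time


def bench(op, payload: bytes, n: int = 10_000) -> list:
    """Time op(payload) n times; return per-call latencies in milliseconds."""
    samples = []
    for _ in range(n):
        start = time.perf_counter()
        op(payload)
        samples.append((time.perf_counter() - start) * 1000.0)
    return samples


def baseline(payload: bytes) -> bytes:
    # Plaintext path: no protection applied.
    return payload


def protected(payload: bytes) -> bytes:
    # Placeholder for an encryption call on the write path.
    return hashlib.sha256(payload).digest()


def overhead_ms(payload: bytes = b"x" * 256, n: int = 10_000) -> float:
    """Median added latency (ms) of the protected path vs. the plaintext path."""
    base = statistics.median(bench(baseline, payload, n))
    prot = statistics.median(bench(protected, payload, n))
    return prot - base
```

Comparing medians rather than means keeps a few slow outliers (GC pauses, scheduler noise) from skewing the result; the number you get is the per-transaction budget you'd weigh against an SLA.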
That all said, the real answer when it comes to performance is that your mileage will vary. It will vary with the nature of the application workload and the data being encrypted. But time and again, our customers have told us that our performance numbers and our simplicity are what make all the difference.
So, if you’re at that data security “fork in the road” and looking to make a trade-off of security vs. performance, do yourself and your business a favor.
Take a look at what we can do to enable your data security strategy and help accelerate your business.