What Financial Services Should Know About Cloud Data Protection and Security
According to a 2021 Google/Harris Poll survey, 83% of financial services companies are “deploying cloud technology as part of their primary computing infrastructures.” The research also found that 38% of respondents use a hybrid cloud/on-premises environment, 28% rely on a single cloud, and 17% use multi-cloud.
The benefits of cloud computing for financial institutions are well documented: The cloud allows for greater flexibility and scalability, reduced costs, increased efficiency, and critical data analysis. But as financial services companies become more digitally adept, they accrue massive amounts of valuable analyzed data, making them ever broader targets for hackers and cybercriminals.
Because this data and the attack surfaces around it are more coveted than ever, security controls for financial services firms are more important than ever. Banks and other financial institutions must manage extensive enterprise infrastructure while facing targeted campaigns and persistent threat actors.
To combat this, several regulatory bodies have established security requirements in the form of encryption, authentication steps, data access controls, and/or breach notification requirements, which add to the complexity of storing and processing sensitive data and impose severe penalties for noncompliance. Financial institution compliance and data protection standards come in the form of the following governing bodies and standards:
- GLBA - Gramm–Leach–Bliley Act
- SOX - Sarbanes-Oxley Act
- GDPR - General Data Protection Regulation
- NIST CSF - National Institute of Standards and Technology Cybersecurity Framework
- PCI-DSS - Payment Card Industry Data Security Standard
- PSD2 - Revised Payment Services Directive
Each of these regulations provides auditing mechanisms and guidelines for financial organizations and mandates security compliance for service providers in the financial sector worldwide.
With U.S. consumer spending on e-commerce expected to hit a record $1 trillion this year, one such body is getting particular attention: the PCI Security Standards Council (PCI SSC). Created by the major credit card firms in 2006, the PCI SSC governs and expands the application of the PCI DSS. Its purpose is to specify what constitutes the PCI DSS requirements and how to safeguard data in the cardholder data environment (CDE).
To shield and secure client data and credit card information, payment processors must be diligent. PCI DSS compliance means that PII, primary account numbers (PANs), and other credit card data have been tokenized and are safe. However, this is only the start: ongoing training and regular compliance audits are required to maintain compliance in real time.
Enforcing compliance is up to individual payment brands or acquiring banks. The council's responsibility is to establish PCI requirements and recommend best practices for maintaining PCI compliance at all times. The PCI 3-Step Process is one such procedure. The following steps make up the PCI 3-step process:
- Assess - Locating cardholder data, listing IT resources and business procedures for accepting credit and debit cards, and testing them for vulnerabilities.
- Remediate - Closing security holes and not storing cardholder data unless absolutely essential.
- Report - Putting together and sending the necessary reports to the right acquiring bank and card companies.
Please visit the PCI SSC's website if you want to learn more.
Standards and compliance statutes matter, but so does understanding the relationship between the data being protected and those responsible for protecting it.
As financial services organizations grow more reliant on the cloud, they must first understand what they are responsible for protecting and what their cloud provider must protect, then employ the appropriate protection methods to ensure data is protected in a way that maintains compliance and forwards business goals. The following suggestions will help financial services organizations do just that.
Understanding the shared responsibility model
Organizations operate under a shared responsibility model with cloud providers. Although it is easy to conceptualize, implementing a shared responsibility model requires much coordination. In many instances, a shared responsibility model dictates that cloud providers are responsible for the security “of” the cloud, and organizations are responsible for security “in” the cloud.
Think of it this way: A home security provider can install a protection system, but it is up to the homeowner to identify where the sensors are located and ensure that it is armed before leaving the house. Similarly, a cloud provider protects the cloud’s infrastructure to reduce intrusion risk, while the organization protects the data if a breach occurs. It is also essential to recognize that each cloud provider may vary in terms of the protections they offer.
For example, some providers do not offer field-level encryption, leaving data more exposed to attackers; financial organizations must then obtain that protection elsewhere. Other providers require organizations to adopt their own approach to de-identifying data and protecting privacy.
The following steps can help prepare organizations to protect their data under a shared responsibility model:
Identify sensitive data
Use advanced data discovery methods to find sensitive data in repositories before moving it to the cloud. Privacy regulations must stay top of mind because the scope of what counts as sensitive is rapidly expanding: IP addresses and geolocation information are now regarded as sensitive alongside traditional personally identifiable information (PII), such as Social Security numbers and birth dates.
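As a rough illustration, even a simple regex-based scan can flag common sensitive-data patterns before migration. The categories and patterns below are hypothetical; production discovery tools use far richer detection (classifiers, context, validation):

```python
import re

# Illustrative patterns only; real discovery engines go well beyond regex.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "ip_address": re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def discover_sensitive(record: str) -> dict:
    """Return the sensitive-data categories found in a text record."""
    return {name: pat.findall(record)
            for name, pat in PATTERNS.items()
            if pat.search(record)}

hits = discover_sensitive("Contact jane@example.com, SSN 123-45-6789, host 10.0.0.5")
```

The resulting category map is what drives the protection policy assigned to each record later in the pipeline.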
Determine data usage
Identify the purpose of collecting data to comply with privacy regulations, such as GDPR and CCPA. Next, map out how the data will be processed and whether it will be shared with third parties. Financial services firms must ensure this data does not fall into unauthorized hands, which can result in hefty fines.
Assign access control
Determine who can access that data for processing. One method is to create customized views for individuals based on their persona using dynamic masking tools. For example, an application developer needs a different view of the same cloud dataset than a data scientist does.
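A persona-based view can be sketched with a small masking policy. The roles, fields, and masking modes below are hypothetical stand-ins, not any particular product's configuration:

```python
# Hypothetical policy: which fields each persona sees masked, and how.
MASKING_POLICY = {
    "app_developer": {"ssn": "full", "email": "partial"},
    "data_scientist": {"ssn": "full"},  # email left clear for cohort joins
}

def mask_value(value: str, mode: str) -> str:
    """Fully mask a value, or keep only its last four characters visible."""
    if mode == "full":
        return "*" * len(value)
    return "*" * (len(value) - 4) + value[-4:]

def view_for(role: str, record: dict) -> dict:
    """Return a per-persona view of the same underlying record."""
    policy = MASKING_POLICY.get(role, {})
    return {field: mask_value(val, policy[field]) if field in policy else val
            for field, val in record.items()}

record = {"name": "Jane Doe", "ssn": "123-45-6789", "email": "jane@example.com"}
dev_view = view_for("app_developer", record)
```

The underlying record never changes; each persona simply receives a different projection of it, which is the essence of dynamic masking.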
Research the cloud provider’s security qualifications
Like any service provider, cloud providers should have quantifiable evidence demonstrating a commitment to cloud security. Conduct due diligence: research their industry-specific cloud security certifications and confirm that they publish regular compliance and audit reports.
Seek out advanced protection
Audit the protections offered by the cloud provider and the protections the organization currently deploys, then determine what additional tools are necessary to bridge the gap.
Protecting the data analytics pipeline
Financial services providers use data analytics across functions—from sales and marketing to fraud detection and risk mitigation. The cloud’s limitless storage capabilities prompt enterprises to migrate data from on-premises environments, store it in data lakes and extract valuable data into warehouses for analysis. Analyzed data stored in the cloud is a massive target for criminals due to its high value and is often not appropriately protected. Many organizations relax security controls to momentarily enable easier access but forget to restore the necessary protections. Let’s take a closer look at methods financial organizations should incorporate to ensure their data is secure through the analytics pipeline.
Early protection methods
Security controls for the analytics pipeline can be categorized into two groups: visibility and entitlement. According to Gartner, visibility pertains to implementing “controls that remove ambiguity and increase visibility” of data, while entitlement is the management of data access. Organizations can address visibility and entitlement through data discovery, protection, and monitoring strategies.
- Discovery. Data varies in sensitivity levels, and it is essential to determine how each piece of unstructured data will be protected upon entering the pipeline. In discovery, each file or record is processed to identify the data and assigned a protection policy based on the level of its sensitivity, applicable compliance standards, and eventual use. Applying detailed metadata also helps clear any confusion about what a piece of data is and who should access it.
- Masking. If a piece of data contains sensitive information, a Social Security number, for example, but will not undergo analytics, it is masked. Masking prevents anyone from seeing the actual value, but at a steep price: masked data can never be processed downstream. Such data is typically stored for a fixed period of time for compliance purposes.
- Tokenization. Data that contains sensitive information and will undergo analytics is tokenized. This protection method replaces a piece of data with other characters in the same format. For example, a tokenized Social Security number consists of nine random digits formatted like a real one. Tokenization allows existing applications to analyze the data even though it is not the real data, and if a hacker accessed tokenized data, any analytics they performed would be inaccurate because it does not reflect actual values.
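The contrast between the last two bullets can be sketched as follows. The HMAC-seeded scheme is one illustrative way to produce deterministic, format-preserving tokens; it is not a description of any particular product, and the secret shown would live in a key manager in practice:

```python
import hashlib
import hmac
import random

SECRET = b"hypothetical-tokenization-key"  # illustrative; keep in a key manager

def mask_ssn(ssn: str) -> str:
    """Irreversibly hide the value; no downstream processing is possible."""
    return "XXX-XX-XXXX"

def tokenize_ssn(ssn: str) -> str:
    """Replace the SSN with random-looking digits in the same format.

    Deterministic per input, so joins and analytics on tokens still line up.
    """
    digest = hmac.new(SECRET, ssn.encode(), hashlib.sha256).digest()
    rng = random.Random(int.from_bytes(digest, "big"))
    digits = "".join(str(rng.randrange(10)) for _ in range(9))
    return f"{digits[:3]}-{digits[3:5]}-{digits[5:]}"
```

A masked value is a dead end for analytics, while a token keeps the shape and referential consistency of the original without revealing it.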
Encryption and data sharing
At this point, data analytics can begin, which makes the data more valuable. As such, deeper protection, such as encryption, is necessary to move forward. Encryption converts plaintext data into unreadable ciphertext that can be deciphered only with a key held by a privileged few. Innovative techniques are also emerging that allow multiple parties to analyze data without ever decrypting it. Once analyzed, data must remain encrypted for the rest of its lifecycle.
Once encrypted, financial organizations can rest assured that their data has strong protection for data sharing activities. For example, analyzed data is shared within the organization to inform crucial decisions about customer acquisition, preferred services, and product offerings. Only those with authorized access should be able to view the data, and that access is mediated by encryption keys: random strings of bits used to scramble and unscramble the data.
Hold Your Own Key (HYOK) and Bring Your Own Key (BYOK) technology are user-controlled encryption keys that can be used to protect data in a given provider’s application or infrastructure by restricting who can unlock encrypted data. HYOK and BYOK solutions should:
- Implement a key virtualization layer (KVL). This allows the software to integrate with multiple key management solutions via industry-standard protocols, which is especially important for financial service providers, given the myriad regulations they must consider.
- Feature a two-tier hierarchy for key management—also commonly called envelope encryption. This method uses a master key (MK) that the customer can use to encrypt data encryption keys (DEKs). This method allows organizations to rotate and/or revoke their MK; the latter operation will invalidate DEKs and render data useless in an environment.
- Source multiple disparate keys. This action should permit sourcing from separate tenants or entities or geographical jurisdictions or nation-states and apply those to encrypt the actual data values in a multi-tenant data store at the row level.
Financial service providers also share data outside the organization, which can increase the risk of data exposure. Encryption allows organizations to participate in data pooling with multiple financial service providers to gain deeper insight into the industry at large. Security and bureaucracy often impede collaboration across organizations in these scenarios. To take advantage of the power of data sharing, financial organizations must address privacy and security concerns by anonymizing both the data and its owners while still allowing analytics and machine learning operations to proceed.
With encrypted data, organizations can participate in industry data collaboration exercises knowing only they will be able to view their data while aggregate analytics auditing occurs. Encryption significantly reduces the risk of unauthorized access and potential data breaches.
Finding the right data security partner
In its 2021 report, “2022 Strategic Roadmap for Data Security Platform Convergence,” Gartner introduced new data protection terminology, explaining how the convergence of many data security technologies (such as data masking, file encryption, and tokenization) will drive the evolution of the Data Security Platform (DSP). DSPs are emerging at a time when security is growing more complex, especially for financial organizations newly able to analyze data at scale through cloud modernization.
Organizations that previously used basic security protocols can now apply more secure, in-depth ways to protect their data with a DSP. Companies can use privacy-enhancing computation to implement security protocols at every point in the analytics pipeline. For financial organizations, this is key to staying compliant and fully protecting customer data that attackers find highly valuable. DSPs also make data security easier to manage: organizations work with a single platform instead of many, inherently reducing exposure risk.
When looking for a DSP partner, financial organizations should consider the following:
- Easy integration. A data protection system should integrate seamlessly with any primary cloud provider, such as AWS, Microsoft Azure, or GCP, as well as major container-based platforms like Docker and Kubernetes.
- Full range of services. A data security platform should allow organizations to encrypt, tokenize, and mask data in the cloud without the difficulties of modifying code or embedding software development kits (SDKs) and support data protection from production to processing in deployments with cloud services.
- Comprehensive database protection. Databases like Postgres, MySQL, MariaDB, and Microsoft SQL Server often contain highly sensitive data for financial institutions, like credit card details, social security numbers, and personally identifiable information. This information is highly valuable and sought after by attackers, so it is necessary to protect it throughout the data pipeline.
- At-rest and in-motion protection. Typical data protection systems are designed to protect data at rest but leave it unsecured for the remainder of its time in the pipeline. Look for a solution that protects the data both at rest and in motion, allowing access only to qualified, authorized users.
- No-code/low-code implementation. This feature allows financial companies to implement their own transparent data security mesh on-premises or in the cloud to secure data throughout its lifecycle. In a zero-trust world, proactively defending and protecting data is essential to ensure only the proper channels use or see data.
- Compliance expertise. Financial organizations are also under pressure to comply with long-standing industry data security baselines, like PCI-DSS and SOX. On top of finance-specific data protection laws, governments create privacy regulations like GDPR (EU) and CPRA (California). It is critical to have regulatory experts on hand who understand the current state of compliance and have a clear vision of emerging trends that could alter existing regulations or potentially spawn new ones.
As data protection grows more complex, financial services organizations must be prepared to implement best-in-class security methods. Learn how Baffle can help with its data-centric protection for financial services.
Baffle delivers an enterprise-level, transparent data security mesh that secures data at the field or file level via a "no code" model. The solution supports tokenization, format-preserving encryption (FPE), database and file AES-256 encryption, and role-based access control. As a transparent solution, it easily supports cloud-native services with almost no performance or functionality impact.