Businesses want to leverage private data for GenAI because it can save them time, money, and resources, surface new insights, and turn their data into a competitive advantage. However, GenAI systems must comply with the same security and privacy regulations as any other IT system, and most existing approaches to securing data do not work well when applied to GenAI.

The expected use of regulated data has Compliance and InfoSec teams worried. GenAI projects have already compromised massive amounts of data; for example, an AI team at Microsoft accidentally leaked 38 TB of private data. In addition, researchers are finding that adversarial prompting can get a GenAI chatbot to reveal personally identifiable information (PII) from its training set.

Watch this webinar as we discuss the:

  • Need to utilize private data with GenAI
  • Various ways GenAI projects can expose confidential data
  • Options being considered to protect private data

WATCH NOW

Additional Resources