GhangorCloud targets CCPA compliance market with new CAPE solution

GhangorCloud has been selling intelligent information security products with next-gen DLP, but in response to customer requests and frustration with CCPA compliance products that don’t work, it has built a new platform from the ground up specifically to address this issue.

Tarique Mustafa, GhangorCloud’s CEO and CTO

Today, San Jose-based GhangorCloud, which makes intelligent information security and data privacy compliance enforcement solutions with a next-generation DLP [data loss prevention] component, is announcing a major expansion into multi-cloud compliance management. Their new CAPE [Compliance and Privacy Enforcement] solution, a next-generation unified compliance platform, is aimed at the CCPA/GDPR market, and automates tasks associated with data discovery, data classification, data mapping and consumer privacy compliance enforcement in real time across all regulatory compliance mandates.

“Our ISE [Information Security Enforcer] platform has been our focus to date, and it has sold very well, to more than 100 customers,” said Tarique Mustafa, GhangorCloud’s CEO and CTO. “Many of them have been Fortune 500 organizations, including government, pharma, and banks. Now, however, customers have been asking us about this new opportunity around CCPA. The products in this space cost an arm and a leg – and they don’t work. There is a chassis but no engine. These products do a good job of capturing standard operating procedures, but they depend on working with a DLP solution or a third-party dedicated data discovery engine, neither of which is built into their products.”

Mustafa said that this problem repeats one that was endemic in the original DLP solutions.

“We saw this with DLP, where products were sold without data classification engines, so they failed to perform until very recently,” he stated. “We had this classification already in ISE, and we have already done POCs with existing customers and with new ones specifically looking for CCPA types of products.”

Mustafa emphasized that CAPE’s new additions are not simply a matter of building new workflows into the product.

“That’s just table stakes,” he said. “CAPE emphasizes three important core technologies. The first is a deep AI-based object identification and classification algorithm for data discovery through our eDiscovery engine, which allows for the discovery and classification of any object.” It identifies and classifies content automatically, generates privacy enforcement policies without requiring tedious manual intervention, and provides real-time enforcement of privacy mandates to minimize risk and exposure while significantly reducing total cost of ownership.

“Because there is no machine learning involved, it doesn’t require gobs of data or hardware for processing,” Mustafa added. “Instead, the DPI-based algorithms, built on knowledge coding and deep AI, give the ability to define and compose any kind of sophisticated object, while also providing very high scalability.”
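As a rough illustration of the no-machine-learning approach Mustafa describes, the short Python sketch below composes simple knowledge-coded patterns into higher-level “objects.” The pattern names, rules, and composite definitions are illustrative assumptions, not GhangorCloud’s actual algorithms.

```python
import re

# Hypothetical knowledge-coded primitives: hand-written patterns, no trained model.
PRIMITIVES = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "phone": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
}

# Composite "objects" are defined declaratively as combinations of primitives.
COMPOSITES = {
    "ccpa_personal_record": {"ssn", "email"},   # identity plus contact data
    "contact_record": {"email", "phone"},
}

def classify(text: str) -> set[str]:
    """Return the composite object classes whose primitive parts all appear in text."""
    found = {name for name, pattern in PRIMITIVES.items() if pattern.search(text)}
    return {obj for obj, parts in COMPOSITES.items() if parts <= found}

print(classify("Reach Jane at jane@example.com, SSN 123-45-6789"))
# {'ccpa_personal_record'}
```

Because the definitions are explicit rather than learned, adding a new object type is a matter of declaring another pattern combination, which is consistent with the low data and hardware requirements Mustafa claims.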

The second key feature in CAPE is a sophisticated Data Mapping Engine that automatically creates a persistent Universal Data Map [UDM] for the data/information objects that exist in the enterprise corpuses.

“This involves chorology, which is the study of how any phenomenon spreads, and has typically been applied in the health sciences and social sciences,” Mustafa said. “It’s about learning how things like pandemics spread. The same principles apply to data dispersion. Finding pieces of information can take forever. We have completely automated the process, building a data map across the large systems in enterprises to track the particular data that are objects of interest.”
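To make the idea of a persistent data map concrete, here is a minimal Python sketch of an index from an object of interest to every location where a copy has dispersed. The class, field names, and locations are hypothetical and only illustrate the concept, not GhangorCloud’s actual UDM structure.

```python
from collections import defaultdict

class UniversalDataMap:
    """Toy index: (subject, data class) -> every location where a copy was found."""

    def __init__(self):
        self._index = defaultdict(set)

    def record(self, subject_id, data_class, location):
        # Called by a discovery/classification pass whenever a copy is found.
        self._index[(subject_id, data_class)].add(location)

    def locations_for(self, subject_id):
        # Every place any data class belonging to this subject has spread to.
        return {
            (data_class, loc)
            for (sid, data_class), locs in self._index.items() if sid == subject_id
            for loc in locs
        }

udm = UniversalDataMap()
udm.record("cust-1001", "email", "crm.contacts")
udm.record("cust-1001", "email", "s3://backups/2020/contacts.csv")
udm.record("cust-1001", "ssn", "hr.payroll")
print(udm.locations_for("cust-1001"))
```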

The UDM created by the Data Mapping Engine is used to automatically generate the Data Subject Request (DSR) and Data Subject Access Request (DSAR) service workflow. This allows for the correlation of Actors, and reworks the incoming DSAR or DSR jobs into corresponding sets of primitive tasks, which are in turn ‘serialized’ into a task sequence using the logical and precedence dependencies between these tasks. The Workflow Engine monitors the fulfillment process and issues the appropriate notifications along the way.
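That ‘serialization’ step amounts to ordering tasks by their dependencies. The small Python sketch below shows one way to do it with a topological sort; the task names and dependency graph are assumptions for illustration, not the product’s actual workflow.

```python
from graphlib import TopologicalSorter

def plan_dsar(request_type: str):
    # Each task maps to the set of tasks that must finish before it can run.
    tasks = {
        "verify_identity": set(),
        "locate_records": {"verify_identity"},   # would consult the Universal Data Map
        "compile_report": {"locate_records"},
        "notify_consumer": {"compile_report"},
    }
    if request_type == "delete":
        tasks["erase_records"] = {"locate_records"}
        tasks["notify_consumer"] = {"erase_records"}
    # Serialize into an executable sequence that respects every precedence dependency.
    return list(TopologicalSorter(tasks).static_order())

print(plan_dsar("access"))
# ['verify_identity', 'locate_records', 'compile_report', 'notify_consumer']
print(plan_dsar("delete"))
```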

“The third feature is the robotic automation of the service process itself,” Mustafa said. “When a consumer logs into a portal and wants to delete data, our algorithms determine how many atomic pieces of data are involved in that request.”

“Collectively, these three technologies working in synergy provide the engine that has been missing in these compliance products,” Mustafa summed up. “They can be deployed on-prem, hybrid, or SaaS, and are ideally suited for channel partners or MSPs to provide services around. We have built a strong channel training program that lets them provide services around the product to end customers, and it is a very good moneymaker.”