Traverse City, Michigan-based Naveego, a startup that makes cloud-based Data Quality Solutions (DQS), has announced a new release of its Naveego DQS offering. Naveego detects and eliminates data quality issues across systems and between cloud and on-prem deployments. The new release enhances the ability to check data across different platforms, improves the dashboards, allows writing SQL against non-SQL data sets, and improves data quality handling on large data sets.
Naveego’s name is a play on ‘navigate,’ reflecting the company’s origins as a BI platform for navigating data. That BI platform was developed within Safety Net, a Michigan MSP, as part of its Application Development Group.
“I worked at Safety Net for nine years, and developed this when I was there, in 2012-2013,” said Derek Smith, Naveego’s co-founder and CEO. “Through some engagements, I fell in love with the BI side, seeing how this could be done well from the cloud. I siloed myself into a one-man division, and focused on getting it off the ground to the point where it could be a valuable business. There was always an understanding that when it got to a certain level of maturity it would spin off. Safety Net is an MSP and this doesn’t really fit within their model.”
When the spinoff took place in December 2014, with Safety Net remaining a strategic partner, Naveego defined itself as a BI platform.
“We soon pivoted from that, through engagements with some larger companies like Breitburn Energy Partners,” Smith said. “We shifted from BI down to specifically doing Master Data Management on a Data Quality Platform. We have been getting a lot of traction in oil and gas, and we are looking to expand into different verticals. We closed a funding round to do that at the end of May.”
Naveego sells entirely through a partner channel.
“Partners in different verticals bring us into their customers as part of their projects,” Smith said. “Because we solve data problems, we are really more horizontally aligned, but customers often work with vertically-aligned partners.”
Smith explained how the company adds value.
“With Data Quality, there is what I call a ‘1-10-100 Rule,’” he said. “It takes $1 to verify a record as it is entered. It takes $10 to cleanse it, and it takes $100 if nothing is done, because the ramifications of the mistake are felt over and over again. We provide a high level of Data Quality to make everything better at the end of the line, to get data validated as close to the $1 point as possible.”
“We define a business process, like vendor onboarding, and make many quality checks that analyze the data,” said Mike Dominick, Naveego’s CPO and VP, Partner Success. “We look for things like not having a Tax ID, or having an asset be set up in an ERP system, a project system, and a vendor management system – which may not be aware of each other. This helps both with compliance and with understanding blocks within the business process.”
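The kinds of checks Dominick describes can be sketched generically. The record layout, field names, and system names below are hypothetical, chosen only to illustrate the idea of a missing-Tax-ID check and a cross-system presence check; they are not Naveego's API.

```python
# Illustrative sketch only: hypothetical record layouts, not Naveego's API.
# Flags vendor records that fail simple quality checks, such as a missing
# tax ID, and reports assets registered in one system but not another.

def check_vendor(record):
    """Return a list of quality issues found in a vendor record."""
    issues = []
    if not record.get("tax_id"):
        issues.append("missing tax ID")
    return issues

def cross_system_presence(asset_id, systems):
    """Report which systems are missing a given asset ID."""
    return [name for name, ids in systems.items() if asset_id not in ids]

vendors = [
    {"name": "Acme Drilling", "tax_id": "12-3456789"},
    {"name": "Globex Services", "tax_id": ""},
]

systems = {
    "erp": {"A-100", "A-200"},
    "projects": {"A-100"},
    "vendor_mgmt": {"A-100", "A-200"},
}

for v in vendors:
    print(v["name"], check_vendor(v))
print("A-200 missing from:", cross_system_presence("A-200", systems))
# → Globex Services fails the tax-ID check; A-200 is missing from 'projects'
```

In a real platform these rules would run continuously against live systems rather than against in-memory samples.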
“What has been out there in the market that does this tends to be on-prem,” Smith said. “We also turn it from a project into an ongoing process.”
“We fit in broadly among organizations, not just the big ones, because of a focus in keeping it simple, but we do make the most sense where multiple applications are involved,” Dominick said. “We don’t require dedicated teams of data specialists.”
The new Naveego DQS release provides for Cross System Data Comparison, which makes it easy to compare the data between two different systems, regardless of where the data resides.
“This bridges that gap from analytics to business processes,” Dominick said. “It allows cloud and on-prem to be checked on the same platform.” It also lets users compare data between different types of data sets, such as between an internal SQL database and Salesforce.com.
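As a generic illustration of what a cross-system comparison involves (this is not Naveego's implementation), records from two sources can be joined on a shared identifier, surfacing rows missing from either side and field-level mismatches:

```python
# Illustrative sketch, not Naveego's implementation: compare the same
# logical records from two systems (e.g. an internal database export and
# a CRM export), keyed by a shared identifier.

def compare_systems(left, right, key="id"):
    """Return keys only in left, keys only in right, and field mismatches."""
    left_by_key = {r[key]: r for r in left}
    right_by_key = {r[key]: r for r in right}
    only_left = sorted(left_by_key.keys() - right_by_key.keys())
    only_right = sorted(right_by_key.keys() - left_by_key.keys())
    mismatches = {}
    for k in left_by_key.keys() & right_by_key.keys():
        diffs = {f: (left_by_key[k][f], right_by_key[k].get(f))
                 for f in left_by_key[k]
                 if f != key and left_by_key[k][f] != right_by_key[k].get(f)}
        if diffs:
            mismatches[k] = diffs
    return only_left, only_right, mismatches

internal_db = [{"id": 1, "email": "a@x.com"}, {"id": 2, "email": "b@x.com"}]
crm_export = [{"id": 1, "email": "a@x.com"}, {"id": 3, "email": "c@x.com"}]

print(compare_systems(internal_db, crm_export))
# → ([2], [3], {})  — record 2 only internal, record 3 only in the CRM
```

The same shape of comparison works whether the two sides come from cloud or on-prem sources, since both are reduced to plain records before diffing.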
This release also introduces a Data System Health Dashboard, which provides a high-level overview of the health of an organization’s data.
“We placed a larger emphasis on dashboarding in this release, because they are used to report up the chain of command,” Dominick said.
Naveego DQS now leverages Big Data technologies that support massive amounts of data.
“The use of these technologies allows us to manage larger data sets in a clean, easy-to-use way,” Dominick stated.
The release now facilitates writing SQL against non-SQL data sets, to let customers connect to business applications quickly and efficiently, and validate them all with SQL.
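One generic way to express SQL rules over non-SQL data (this is a sketch of the general technique, not Naveego's engine) is to load the records into an in-memory SQLite table and then validate them with ordinary queries:

```python
# Illustrative sketch (not Naveego's engine): run a SQL quality rule
# against JSON records by staging them in an in-memory SQLite table.
import json
import sqlite3

records_json = '''[
    {"vendor": "Acme Drilling", "tax_id": "12-3456789"},
    {"vendor": "Globex Services", "tax_id": ""}
]'''

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE vendors (vendor TEXT, tax_id TEXT)")
conn.executemany(
    "INSERT INTO vendors VALUES (:vendor, :tax_id)",
    json.loads(records_json),
)

# A quality rule expressed as plain SQL: find vendors missing a tax ID.
failing = conn.execute(
    "SELECT vendor FROM vendors WHERE tax_id IS NULL OR tax_id = ''"
).fetchall()
print(failing)  # → [('Globex Services',)]
```

The benefit Dominick points to is exactly this: the rule author only needs SQL, regardless of where the underlying records actually live.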
“This lowers the bar to get rules created,” Dominick said.
Finally, Templated Quality Checks now let users quickly execute common quality checks for multiple targets.
“Business processes may be aligned differently,” Dominick noted. “We want to be able to do one single check, instead of one per division, or one per geo.”
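A templated check of the kind Dominick describes can be sketched as a rule defined once and instantiated against many targets. The template name, field names, and division names below are hypothetical:

```python
# Illustrative sketch: a single templated check ("field must be populated")
# defined once and run against multiple targets, rather than duplicating
# the rule per division or per geography.

def non_empty_check(field):
    """Template: returns a check verifying `field` is populated."""
    def check(rows):
        # Return the indexes of rows failing the check.
        return [i for i, r in enumerate(rows) if not r.get(field)]
    return check

# One instantiation of the template, reused everywhere.
tax_id_check = non_empty_check("tax_id")

targets = {
    "us_division": [{"tax_id": "12-3456789"}, {"tax_id": ""}],
    "eu_division": [{"tax_id": "98-7654321"}],
}
for name, rows in targets.items():
    print(name, "failures at rows:", tax_id_check(rows))
# → us_division has one failing row; eu_division has none
```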
Looking down the road, Smith said that the company’s objective is to provide complete visibility into data points.
“We are going in a direction that will provide traceability all around the business,” he said.