
Bindplane, the Unified Telemetry Pipeline built on OpenTelemetry, has announced Pipeline Intelligence, a production-grade AI automation platform that fundamentally shifts telemetry pipeline management from manual construction to intelligent automation. As organizations grapple with exploding data volumes driven by GenAI adoption, Pipeline Intelligence automates 70-80% of pipeline work by identifying log types, applying parsers, and optimizing configurations, returning weeks of critical engineering capacity to observability, security, and data teams.
“We’ve been around since 2020, and really started by focusing on solving what I call the telemetry challenge,” said Mike Kelly, Bindplane’s CEO. “That is: how do you collect, process, standardize, and transmit all of the signals that you’re collecting, whether it’s logs, metrics, or traces, throughout an environment – all those signals that feed your security systems, your observability systems, and now feed your AI systems.
“We’ve been developing Bindplane for quite some time, but really brought it to market about two years ago now,” Kelly stated. “Bindplane is a unified telemetry pipeline and platform. We’re really focused on managing all of your agents inside of an environment. We support over a million agents that are collecting all of those signals, and then we give you visibility so you can do things like reduce the total volume and the costs associated with your security analytics, your observability analytics, and now your AI platforms – and make it much easier to do that at scale. That solves one of the big problems behind telemetry management, which is that it has been growing so quickly and exponentially. The amount of telemetry, the amount of data that systems emit, whether it’s your databases or your applications or your Kubernetes clusters – that has been growing exponentially – but the ability to manage and control it has not been. And that’s really where Bindplane comes in. It allows you to support management of millions of agents, supporting petabytes per day of data and giving you the control that you need to handle all of that.”
The new Bindplane launch addresses a growing crisis in enterprise infrastructure: GenAI workloads are creating unprecedented telemetry data volumes that manual pipeline management cannot handle. Converging data streams – from organizations monitoring LLM integrations, moving application telemetry to data lakes for model training, and managing operational scale alongside AI adoption – overwhelm traditional approaches. Building custom parsers can take days per log type, potentially across hundreds of different log types in complex environments. Meanwhile, observability teams spend cycles building pipelines instead of solving observability problems, and security teams spend time writing parsers rather than detecting threats.
Pipeline Intelligence transforms this work model by delivering solutions at 70-80% completion automatically, requiring only 20-30% fine-tuning by experts. What used to take days per log source now takes minutes: log types are identified automatically, parsers are intelligently selected and applied, and pipeline configuration is automated according to best practices, without requiring deep pipeline expertise.
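The identify-then-parse step can be pictured with a small sketch. Everything below is illustrative: the format names, regexes, and function names are assumptions made for this article, not Bindplane’s actual blueprints or API.

```python
import json
import re

# Illustrative pattern for one well-known format (Apache common log).
APACHE_COMMON = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] "(?P<request>[^"]*)" '
    r'(?P<status>\d{3}) (?P<size>\d+|-)'
)

def detect_log_type(line: str) -> str:
    """Classify a raw log line into a known format (hypothetical taxonomy)."""
    try:
        json.loads(line)
        return "json"
    except ValueError:
        pass
    if APACHE_COMMON.match(line):
        return "apache_common"
    if line.split() and all("=" in tok for tok in line.split()):
        return "logfmt"
    return "unknown"

def parse(line: str) -> dict:
    """Apply the parser matching the detected type, yielding structured data."""
    kind = detect_log_type(line)
    if kind == "json":
        return {"type": kind, **json.loads(line)}
    if kind == "apache_common":
        return {"type": kind, **APACHE_COMMON.match(line).groupdict()}
    if kind == "logfmt":
        return {"type": kind, **dict(tok.split("=", 1) for tok in line.split())}
    return {"type": "unknown", "raw": line}
```

The value proposition in the article is that this classify-and-parse work, multiplied across hundreds of formats, is what gets automated; the fallback `"unknown"` branch is where building a custom parser has traditionally cost days per log type.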
“GenAI workloads are creating an explosion of telemetry data that manual pipeline management simply cannot handle,” Kelly said. “Pipeline Intelligence represents our commitment to solving real problems, not just adding AI features as checkboxes. It’s now possible to automate the tedious work—identifying log types, applying parsers, building configurations—so teams can focus on the 20-30% that requires human expertise. This isn’t about marginal efficiency gains but returning weeks of engineering capacity every quarter and freeing experts to do the strategic work they were hired for.”
Bindplane’s own evolution has been a shift from manual construction to intelligent automation.
“The automation has happened in stages. Between five and ten years ago, the manual piece was large organizations – and this is primarily a problem at the enterprise level,” Kelly stated. “They would typically purchase a security platform and deploy security applications to every laptop, to every server. That’s a fairly large process, partially automated, but it’s a big undertaking if you’re deploying to hundreds of thousands or millions of devices. Then the next year they would maybe choose another security platform, or they would choose an observability vendor to monitor performance in their applications, take that observability vendor’s agent, and go deploy it across hundreds of thousands or millions of servers. And then the next year they’d find another one and continue this process, and you end up with a very complex environment with potentially dozens of agents throughout a very large enterprise collecting all types of data, but sometimes the same data. Bindplane specifically manages that deployment for you, allows you to deploy the same Bindplane OpenTelemetry agent to all of your devices, and then gives you visibility into the management of that. That’s the automation piece that we launched with about two years ago. What we’re announcing now takes that a step further: the pipeline and telemetry automation that was already within Bindplane, we’ve now extended with Pipeline Intelligence, the AI component within Bindplane that handles the pieces that weren’t automated before.
“So once you have your flow of data within Bindplane, you can see the types of logs, and Pipeline Intelligence will automatically detect what those logs are that you’re collecting,” Kelly continued. “And this is important for big companies, because they could be collecting dozens or hundreds of different log types. You don’t always know the format, and it may not be a known format. So with Pipeline Intelligence, we automatically detect the type of telemetry it is. We can then apply one of our prebuilt blueprints to parse it into a known structure so the data is then useful. If this is a completely custom application with an unknown log type and unknown metrics, we can build that parser automatically with Pipeline Intelligence – something that typically would take a customer significant time and expertise to build and deploy. And so it’s one more step that we’ve taken to automate and really leverage AI in a way that makes it easier to manage and solve these very tough problems.”
Unlike generic chat interfaces that can feel like experiments, Pipeline Intelligence focuses on specific, actionable tasks that teams need to accomplish. The task-based interface of the platform allows users to define concrete objectives: parse logs and route to proper destinations; find anomalies in telemetry streams; and optimize pipeline configurations for performance. Pipeline Intelligence performs the work with production-grade reliability.
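Of those tasks, “find anomalies in telemetry streams” is the most algorithmically concrete. One conventional way to frame it is a rolling z-score over a metric series; the sketch below is an assumption about the kind of check such a task performs, not Pipeline Intelligence’s actual method.

```python
from collections import deque
from statistics import mean, stdev

def find_anomalies(values, window=20, threshold=3.0):
    """Flag points deviating more than `threshold` standard deviations
    from the rolling mean of the previous `window` points."""
    history = deque(maxlen=window)
    anomalies = []
    for i, v in enumerate(values):
        if len(history) >= 2:  # stdev needs at least two samples
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(v - mu) > threshold * sigma:
                anomalies.append((i, v))
        history.append(v)
    return anomalies
```

A single-pass, bounded-memory check like this is the shape of thing that can run inline in a pipeline; the production version would need to handle seasonality, cold starts, and multi-dimensional series.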
There are a lot of observability vendors, and Kelly said Bindplane tends to co-operate with them rather than compete.
“We partner with observability vendors, and there really aren’t any observability vendors that we would consider a competitor,” Kelly said. “Our goal is to be a neutral platform that can be used by all, so Bindplane supports over two dozen different destinations between observability and security platforms. And really the goal for Bindplane is to allow the customer to collect and manage all of that data independent of the observability platform or security platform they’re choosing, and that makes us great to partner with, because observability and security vendors can work with Bindplane and it’s not a competitive solution. Our focus is really on solving that data-in-motion problem, which is a big challenge. So we’ve been a significant partner with Google and the Google SecOps solution, and Google Cloud Observability as well. We provide a version of Bindplane to Google SecOps customers as part of their Google SecOps subscription, and the same with Google Cloud Observability, and it gives them an easy way to deploy collectors, gather that security or observability data, process it, and send it to their Google Cloud infrastructure.”
Kelly said this vendor-neutral architecture is ideal for rapidly evolving markets. Because Pipeline Intelligence is built on top of the OpenTelemetry Collector, the architecture remains vendor-agnostic: organizations can switch downstream observability or security platforms without rebuilding pipelines. That flexibility guards against vendor lock-in and allows adaptation with less architectural debt as AI tooling evolves month to month, with solutions constantly leapfrogging each other.
“Before Pipeline Intelligence, we had already automated a lot of the components of deploying and managing, so let me tell you the key pieces of what we’re changing here,” Kelly commented. “Once you’re starting to collect very large volumes of data, you want to be able to see what those types of data are and understand what’s important and what’s not. That visibility is really step one. That’s what we provide with Bindplane, so customers can look at the data and say, this is really critical security or observability information and I’m going to make sure I keep it, but I also see data that I can exclude and filter out, so I’m not paying for data that I don’t actually need. Everything ends up being charged based on ingestion, so the more we can focus on just the most important data, the better. That’s where most of the operators using Bindplane spend their time: determining what data is most critical, eliminating the rest, and then routing it to the correct location. With Pipeline Intelligence, we’ve baked in our knowledge of critical observability and security data, and all of that now happens in an automated way. So with Bindplane, the first step we took was bringing the ability to deploy and manage from months down to weeks. Now we take the next step of ‘how do we actually refine that data, get it into the correct schemas, and eliminate the pieces that we don’t need,’ taking that from weeks to days, and that’s really the focus.”
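The determine-eliminate-route loop Kelly describes boils down to a per-record decision. A minimal sketch, with made-up rule and destination names (Bindplane’s real configuration model is not shown here):

```python
from typing import Optional

def route(record: dict) -> Optional[str]:
    """Decide where a parsed record goes, or drop it entirely.

    The rules are illustrative stand-ins for what an operator (or
    Pipeline Intelligence) would configure; destination names are invented.
    """
    if record.get("level") == "debug":
        return None  # excluded before egress, so it is never billed at ingestion
    if record.get("source") in {"auth", "firewall"}:
        return "security_backend"  # e.g. a SIEM destination
    return "observability_backend"
```

Dropping a record before it leaves the pipeline is what translates directly into savings, since, as Kelly notes, downstream platforms charge on ingestion.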
“We have two releases related to Pipeline Intelligence,” Kelly said. “The first, which will be out in the first week of December, is really around helping operators. We have a number of built-in tools that allow you to streamline that work and automatically perform the operations you’re looking for. We focused on not building just a chatbot interface. You can use natural language, but we really wanted to detect the logs and metrics, apply our best practices, and speed the operator along in their journey toward truly useful telemetry data. The second release is less operator-focused; it’s fully agentic and reviews everything within the system, across hundreds of thousands of agents, looking at the entire pipeline in an organization and recommending the most impactful changes based on a set of criteria. Reducing the volume, updating the data to enforce schemas, or eliminating security leaks in PII data are all things that Pipeline Intelligence does as an agent, asynchronously running in the background.”
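The PII piece of that agentic work is, at its core, pattern-based redaction. A minimal sketch, assuming only two pattern types (emails and US Social Security numbers); real coverage would need to be far broader:

```python
import re

# Illustrative stand-in for the PII scrubbing Kelly describes; these two
# patterns are assumptions, not Bindplane's detection rules.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def redact(message: str) -> str:
    """Replace each PII match with a labeled placeholder."""
    for name, pattern in PII_PATTERNS.items():
        message = pattern.sub(f"[REDACTED-{name.upper()}]", message)
    return message
```

Running this kind of scrub inside the pipeline, before data reaches any downstream platform, is what makes it possible to close a leak everywhere at once rather than per destination.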
Bindplane deliberately waited six months for AI models to mature before releasing automations crafted for tasks that meet real user needs. The result, the company says, is an AI solution that exceeds the “10x efficiency” threshold required for critical production pipelines in the enterprise, rather than a marginal improvement that demos well but fails in production.
“One of the things that we really focus on is that it’s critical to get this right,” Kelly stated. “So if we’re recommending changes to your structure or we’re detecting log types, we need to be accurate. We need to be correct. And that was the first piece: getting the model to a confidence interval high enough that we felt comfortable providing it to customers. The next was doing this at very high scale. This tends to be one of the areas where folks would say large language models and AI in their current state are not as effective as you would hope when you’re dealing with terabytes or petabytes of data. So we had to come up with some novel approaches, and through some intelligent sampling and ways of managing the data, even when customers are sending a petabyte of data every single day, we can still filter through it and determine very useful recommendations for customers. Those are a couple of things that are critical to providing a really good experience for our users, and one of the reasons this may not have been as successful a year ago or even six months ago.”
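Kelly doesn’t detail the intelligent sampling, but the constraint he alludes to is the classic one: a petabyte-a-day stream cannot be held in memory, so analysis runs over a bounded, representative sample. Reservoir sampling is one standard single-pass technique for exactly that; this sketch is a plausible building block, not Bindplane’s implementation.

```python
import random

def reservoir_sample(stream, k=1000, seed=42):
    """Uniformly sample k records from a stream of unknown (huge) size
    in a single pass, using O(k) memory (Algorithm R)."""
    rng = random.Random(seed)
    sample = []
    for i, record in enumerate(stream):
        if i < k:
            sample.append(record)          # fill the reservoir first
        else:
            j = rng.randint(0, i)          # inclusive on both ends
            if j < k:
                sample[j] = record         # replace with decreasing probability
    return sample
```

Whatever the production method, the point stands: recommendations are computed from a statistically sound subset, so accuracy can hold even when the full stream is orders of magnitude larger than what any model could ingest.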
Bindplane’s channel is growing and becoming more important to the company.
“To me it’s around 40%, but that’s a share that’s growing, and it’s been a focus of ours as we’ve expanded; the plan has become to work really closely with partners and with the channel,” Kelly said. “This is an exciting area for us and we think it’s a great opportunity. I think Bindplane tends to be a great fit for channel partners, particularly those already working with observability or security vendors, because it’s not a competitive solution. It also tends to be a great solution packaged with an observability or security platform to provide a complete offering that solves a problem for customers out of the box. It also tends to make their lives a lot easier. One of the most challenging pieces when you’re just getting started with observability or security platforms is how you get folks onboarded. If you are working with customers not only to sell the product but then to get them onboarded and successful in using it, Bindplane makes that much simpler. It makes the initial deployment easier.
“I think we’re very pro-channel, and if you join our partner program, you get access to the things you’d expect from our partner portal, training programs, and deal registration, but we also have a very active in-person training program and live training sessions, where you work with some of the people on our team to get your team familiar with Bindplane,” Kelly said. “We make sure you have a direct line with us.
“We are very focused on providing the best customer support and partner support for folks out there,” Kelly added. “I think that’s the thing that makes us different from a lot of the others: you really do get first-class support if you’re a partner.”
Bindplane provides no-cost NFR (not-for-resale) licenses.
“We’re very competitive, if not better than what you’ll find out there with other folks,” Kelly said. “When working with the channel, we provide attractive margins that are competitive with everyone out there, and probably a little better. The finer point is that you have a lot more access to the folks on our team than you may with other teams you’re working with. That can be some of our solution architects working directly with you side by side on a deal. Our field sales team is very active. We don’t tend to just talk to you once, give you some training, and then leave you on your own, although that also depends on the channel partner. Some may prefer to pull us in on deals and have us help with the more technical components, because with technical products that tends to be a challenge. We want to work with you and make sure that everyone’s successful and that things go smoothly and quickly.”
Kelly said that resellers still tend to be the focus.
“We have a number of types of partners we work with, and certainly we try and find the best fit,” he said. “Some are more traditional channel partners, and we work with them as well as solution providers. So depending on the type, there’s probably a fit for you. This tends to be a great fit for anyone working with observability and/or security, and that tends to be the focus audience for us. We also work directly with strategic vendors like Google, but we work with others as well, either directly with the observability or security vendor or with their sales teams. We join their marketplaces so they can provide more options for their customers and give some visibility to their teams and customers.”
In conclusion, Kelly noted that with OpenTelemetry collectors, customers retain ownership of the data plane – Bindplane itself takes no custody of the data – further democratizing pipeline management and enabling teams to create optimized configurations without deep pipeline expertise. For pipelines consuming 40% of an observability engineer’s time, Pipeline Intelligence could return 25-30% of that engineer’s week. For a fully-loaded engineer costing $200,000 annually, that represents $50,000-$60,000 in redirected value per year, multiplied across entire teams.
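The arithmetic behind those figures checks out; a quick reproduction, using only the article’s own numbers:

```python
# Reproducing the article's back-of-envelope ROI figures. The inputs are
# the article's own; nothing here is Bindplane pricing data.
salary = 200_000           # fully-loaded annual cost of one engineer ($)
pipeline_share = 0.40      # share of the week currently spent on pipelines
returned = (0.25, 0.30)    # share of the week Pipeline Intelligence returns

low, high = (salary * r for r in returned)
print(f"${low:,.0f}-${high:,.0f} redirected per engineer per year")
# prints: $50,000-$60,000 redirected per engineer per year
```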
Officially announced ahead of AWS re:Invent, Pipeline Intelligence is targeted for general availability by the end of 2025, with capabilities rolling out in stages. Local processing features will arrive first, followed by stored telemetry with advanced AI features for customers who opt in.
