In the beginning, there was monitoring. Back then, organizations did what they could to monitor and observe their systems, especially when troubleshooting. Telemetry data, consisting of logs, metrics and, more recently, traces, supplied the raw material that teams collected and scrutinized for monitoring.
This is when telemetry data first came into play in DevOps: troubleshooting meant figuring out what went wrong, how it happened and how to fix applications, networks and other operational issues.
Over a decade ago, early players such as New Relic began to extend monitoring, bringing observability into play. Instrumentation, the work of channeling metrics, logs and traces so that telemetry data could be made sense of, remained an obstacle and kept the barrier to entry high for observability players.
Flash forward to today: Since its inception in 2019, OpenTelemetry has grown to accommodate telemetry data consisting of traces, followed by metrics and, more recently, logs in 2023. What this means in practice is that OpenTelemetry lets users instrument their own telemetry, integrate different observability tools more easily and gain other advantages.
In other words, the barriers to entry have been lowered, both for users and now for observability players, who no longer have to devote significant resources to instrumenting data.
How OTel Changed the Game
Because OpenTelemetry is free and open source, users can easily gain insights from a collection of information about their systems, such as why a server ran out of memory, why a trace is slow, why a request is slow or why there are error logs.
This ease of use indicates we can expect to see a number of new entrants in the observability field, especially Software as a Service (SaaS) players, who seek to build services on top of the instrumentation.
The main draw of OpenTelemetry, the second-largest Cloud Native Computing Foundation project, is how it offers a standardized process for observability. It consists of three main components: standards, SDKs and the collector. The standards ensure interoperability, the SDKs simplify application instrumentation and the collector acts as a vendor-neutral agent.
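To make the SDK component concrete, here is a minimal sketch of application instrumentation with the OpenTelemetry Python SDK. It is illustrative only: the tracer name, span name and attributes are hypothetical, and a production setup would typically export spans to a collector over OTLP rather than to the console.

    # Minimal tracing setup with the OpenTelemetry Python SDK.
    # In production, the ConsoleSpanExporter would usually be replaced
    # by an OTLP exporter pointed at a collector.
    from opentelemetry import trace
    from opentelemetry.sdk.trace import TracerProvider
    from opentelemetry.sdk.trace.export import BatchSpanProcessor, ConsoleSpanExporter

    provider = TracerProvider()
    provider.add_span_processor(BatchSpanProcessor(ConsoleSpanExporter()))
    trace.set_tracer_provider(provider)

    tracer = trace.get_tracer("checkout-service")  # hypothetical service name

    # Wrap a unit of work in a span and attach attributes to it.
    with tracer.start_as_current_span("process-order") as span:
        span.set_attribute("http.request.method", "POST")
        span.set_attribute("order.item_count", 3)  # hypothetical custom attribute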
OpenTelemetry is used to make sense of telemetry data consisting of metrics, logs and traces. It is also more than just a vendor-neutral project: it is designed to let users integrate the observability tools of their choice into a common, unified approach.
“OpenTelemetry gives organizations control over the data that they generate, where it gets sent, and its contents, structure, and volume,” Morgan McLean, OpenTelemetry co-founder and senior director of product management at Splunk, a Cisco company, told The New Stack.
OTel offers a lot of value for capturing and processing data “before it gets sent to an observability solution,” McLean said. It allows users, he added, to discard data points or logs that have no value, to reduce the cardinality of metrics and to benefit from OpenTelemetry’s semantic conventions, which remove the processing or query costs that would otherwise be needed to normalize data, along with other advantages such as cost savings.
“The reasons 57% of observability leaders experience lower costs with OpenTelemetry are clear: While observability tools provide a lot of value for the organizations that use them, their cost — especially at scale — is a common concern,” McLean said. “One of the most straightforward ways to control observability costs is to ensure that you are only processing data that’s actionable.”
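One way to act on McLean’s advice about only processing actionable data is to trim high-cardinality metric attributes at the SDK level. The sketch below assumes the OpenTelemetry Python SDK’s View API; the instrument name and attribute keys are illustrative, and the console exporter stands in for whatever backend an organization actually uses.

    # Reducing metric cardinality with the OpenTelemetry Python SDK:
    # a View keeps only low-cardinality attributes on a counter,
    # dropping per-user or per-request IDs that would multiply time series.
    from opentelemetry.sdk.metrics import MeterProvider
    from opentelemetry.sdk.metrics.export import (
        ConsoleMetricExporter,
        PeriodicExportingMetricReader,
    )
    from opentelemetry.sdk.metrics.view import View

    low_cardinality_view = View(
        instrument_name="http.server.request.count",  # hypothetical instrument
        attribute_keys={"http.route", "http.response.status_code"},
    )

    reader = PeriodicExportingMetricReader(ConsoleMetricExporter())
    provider = MeterProvider(metric_readers=[reader], views=[low_cardinality_view])

    meter = provider.get_meter("web-frontend")  # hypothetical meter name
    request_counter = meter.create_counter("http.server.request.count")
    request_counter.add(
        1,
        {"http.route": "/checkout", "http.response.status_code": 200, "user.id": "42"},
    )
    # The high-cardinality "user.id" attribute is dropped by the View before export.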
OpenTelemetry is emerging and is supported by most observability vendors. It provides an open industry standard for collecting telemetry data across different applications, Mrudula Bangera, a research director at Gartner, writes in the firm’s report on the future of observability, published last September.
“While OTel simplifies the telemetry data management, it also can bring significant cost savings through the elimination of some software licensing costs and other efficiencies,” Bangera wrote. “Organizations should standardize data collection by adopting open standards such as OTel. It accelerates development cycles by eliminating the need for repeated instrumentation efforts, enabling engineers to focus more strategically.
“OTel can also help reduce operational costs such as customization/consulting services by allowing organizations to tailor telemetry data collection and processing to their specific needs. They can do this on their own, without relying on costly proprietary tools or consulting services.”
The Future of OpenTelemetry
When New Relic was founded over a decade ago, it had to devote considerable resources to building instrumentation. “We then spent a lot of time building telemetry data, like a time series database,” Nic Benders, chief technical strategist at New Relic, told The New Stack. Today, he said, observability players must go well above and beyond, building solutions in addition to what OpenTelemetry offers.
“You can find decent-quality, open source observability solutions,” Benders said. “There are a few commercial engines that you can white label and you even see weird things — there are lots of databases out there. But it’s not enough to capture some telemetry, put it in a database, be able to dashboard it and alert on it.
“That is just contributing to information overload for users. What you have to do is be able to say, ‘I can pull end-to-end: an intelligent data capture that gets the data that you want and does it efficiently; an intelligent data platform that pulls it together, lets you ask questions that you didn’t know that you had before you sat down, gives you powerful query capabilities, and an intelligent action platform that says you probably don’t actually know the questions that you need to ask.’”
In many ways, OpenTelemetry is a starting point on the road to achieving observability. Last year was a “banner year” for OpenTelemetry, with a 45% year-over-year increase in code commits on GitHub and a 100% increase in search volume on Google, Marylia Gutierrez, a staff software engineer at Grafana Labs and an OTel maintainer, told The New Stack.
“More and more organizations will turn to OTel,” Gutierrez said. “But in order for it to become fully interoperable with more systems and tools, we need to focus not only on its end use, but also on its implementation, by clearly defining the signals and their semantics within OpenTelemetry.
“By establishing clear, consistent semantic conventions for telemetry signals at the collection layer, we enable more sophisticated and standardized approaches to observability across the [open source software] ecosystem and make integration with OTel easier.”
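As a rough illustration of Gutierrez’s point, the sketch below tags a span with standardized semantic-convention attribute names rather than ad hoc keys, so any OTel-compatible backend can interpret the data the same way. The tracer name, span name and values are hypothetical; the attribute keys follow OTel’s published HTTP conventions.

    # Standardized semantic-convention attribute names instead of ad hoc keys,
    # so any OTel-compatible backend interprets the span the same way.
    from opentelemetry import trace

    tracer = trace.get_tracer("payments-service")  # hypothetical name

    with tracer.start_as_current_span("charge-card") as span:
        # Stable HTTP semantic-convention keys:
        span.set_attribute("http.request.method", "POST")
        span.set_attribute("http.response.status_code", 201)
        span.set_attribute("url.path", "/api/v1/charges")
        # Ad hoc keys such as "method" or "status" would defeat interoperability.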