The buzz phrase “data is the new oil” shouldn’t be taken too literally, but it holds true in one respect: Data has fast become the economic driver of the digital era.
And just as crude oil can be categorized by different types, there are many types of data. One that is critical to the developing field of edge computing and the internet of things is time-series data, a time-ordered sequence of data points that is also referred to as time-stamped data.
During the recent “The Future Is Built on InfluxDB” event hosted by theCUBE, SiliconANGLE Media’s livestreaming studio, and InfluxData Inc., industry analyst Dave Vellante led a series of sessions on the importance of time-series data to the growing internet of things and industrial IoT. (* Disclosure below.)
“When you think about IoT and edge scale where things are happening super-fast, ingestion is coming from many different sources, and analysis often needs to be done in real time or near real time, that’s where time-series databases come in,” Vellante stated in his introduction to the event. “They’re purpose-built and can much more efficiently support ingesting metrics at scale and then comparing data points over time.”
In case you missed it, here are three insights from the “The Future Is Built on InfluxDB” event:
1) Time-series databases provide a future-proof foundation for digital business.
Adjusting to operating in a digital world is a challenge. New technologies are pushing the boundaries of what businesses can do — and what customers demand. Data is key to remaining competitive, and that data flows in from connected devices and sensors in an incremental, time-stamped stream.
The speed and scale of data required to operate a digital business far exceeds the capability of relational databases. But managing time-stamped data at scale and in real time is what time-series databases were purpose-built to do.
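To make “time-stamped data” concrete: A time-series database such as InfluxDB ingests points that pair a measurement and its field values with descriptive tags and a timestamp. The sketch below (the measurement and tag names are illustrative, not from the event) formats one such point in InfluxDB’s line protocol, which records timestamps at nanosecond precision:

```python
from datetime import datetime, timezone

def to_line_protocol(measurement, tags, fields, ts):
    """Format one data point in InfluxDB line protocol:
    measurement,tag=... field=... timestamp(ns)."""
    tag_str = ",".join(f"{k}={v}" for k, v in sorted(tags.items()))
    field_str = ",".join(f"{k}={v}" for k, v in fields.items())
    ns = int(ts.timestamp()) * 10**9  # line protocol uses nanosecond precision
    return f"{measurement},{tag_str} {field_str} {ns}"

# Illustrative sensor reading from a connected vehicle
point = to_line_protocol(
    "engine_temp",
    {"vehicle": "car-42", "sensor": "front"},
    {"celsius": 87.5},
    datetime(2022, 5, 1, 12, 0, 0, tzinfo=timezone.utc),
)
print(point)
```

Because every point carries its own timestamp, the database can order, downsample and compare readings over time without any external bookkeeping.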
Whether it is consumer IoT projects, such as fitness trackers, instant delivery or home monitoring systems; industrial IoT smart factories; or scientific research and sustainability initiatives, such as wind farms, time-stamped data is the basis of the connected world.
“If you take a self-driving car, what you’re doing is you’re instrumenting that car to understand where it can perform in the real world in real time. And if you do that, if you run the loop which is: ‘I instrument it, I watch what happens. Oh, that’s wrong, I have to correct for that. I correct for that in the software.’ If you do that 4 billion times, you get a self-driving car,” said Evan Kaplan, chief executive officer of InfluxData.
The self-driving car is a simple example, but essentially every intelligent system moves along the same path, according to Kaplan. Companies that build their intelligent systems on top of a time-series database can access insights in real time, optimize their operations, and maximize their ability to compete in the digital economy.
Here is theCUBE’s complete video interview with Evan Kaplan:
2) Open-source server-agent Telegraf is a really cool way to collect and send data.
A time-series database might form the foundation for handling cloud-scale time-stamped data in real time, but it doesn’t solve the problem of getting that data from where it is generated to where it needs to be. The open-source project Telegraf, developed by InfluxData, works as a server-based agent to collect and send metrics and events from pretty much anywhere to pretty much anywhere else.
Built on a plug-in architecture and written in the open-source Go programming language, Telegraf provides a data pipeline with a runtime for inputs, processors, aggregators and outputs. Users can run events and metrics through aggregators, allowing them to apply statistics to data midstream, and enrich events with external metadata by running them through processors. Outputs format the data into consumable events, which can be sent to edge, cloud or on-premises databases.
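Those four plug-in stages map directly onto Telegraf’s TOML configuration. Here is a minimal sketch, assuming a local InfluxDB 2.x instance; the tag, organization, bucket and token values are placeholders, not real settings from the event:

```toml
# Input plug-in: sample CPU usage (one of Telegraf's many input plug-ins)
[[inputs.cpu]]
  percpu = true
  totalcpu = true

# Processor plug-in: enrich every metric with a static tag midstream
[[processors.override]]
  [processors.override.tags]
    site = "plant-7"             # illustrative enrichment tag

# Aggregator plug-in: emit mean/max statistics over 30-second windows
[[aggregators.basicstats]]
  period = "30s"
  drop_original = false
  stats = ["mean", "max"]

# Output plug-in: ship the results to an InfluxDB 2.x bucket
[[outputs.influxdb_v2]]
  urls = ["http://localhost:8086"]
  token = "$INFLUX_TOKEN"        # placeholder; read from the environment
  organization = "example-org"
  bucket = "telegraf"
```

Swapping any stage is a configuration change rather than a code change, which is what lets the same agent run everywhere from a factory floor to the cloud.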
“It’s unbelievable. It’s open source. It’s an edge agent; you can run it as close to the edge as you’d like. It speaks dozens of different protocols in its own right,” stated Brian Gilmore, director of internet of things and emerging technologies at InfluxData, during the event.
Use cases range from simple metrics collection in extreme situations, such as deep-earth tunnel boring, to high-level predictive analytics on both current and historical data. For the latter, Telegraf enables users to pull in telemetry, plus external enrichment data, and easily integrate with interactive platforms such as Jupyter Notebook or with scientific computing and machine learning libraries. They can then build and train AI models and send that information back to InfluxDB, where it can be applied to detect anomalies, Gilmore explained.
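The “train on history, score the live stream” loop Gilmore describes can be approximated with something as simple as a rolling z-score. This is a minimal, library-free sketch of the idea, with synthetic telemetry values rather than data from any InfluxData customer:

```python
import statistics

def detect_anomalies(series, window=5, threshold=3.0):
    """Flag (timestamp, value) points that deviate more than `threshold`
    standard deviations from the mean of the preceding `window` points."""
    anomalies = []
    for i in range(window, len(series)):
        ts, value = series[i]
        history = [v for _, v in series[i - window:i]]
        mean = statistics.mean(history)
        stdev = statistics.pstdev(history)
        if stdev and abs(value - mean) / stdev > threshold:
            anomalies.append((ts, value))
    return anomalies

# Steady synthetic telemetry with one injected spike at t=10
telemetry = [(t, 20.0 + 0.1 * (t % 3)) for t in range(10)]
telemetry.append((10, 95.0))  # anomalous reading
print(detect_anomalies(telemetry))  # flags only the spike
```

In a real deployment, the model would be trained on historical data pulled from InfluxDB and the flagged points written back as alerts; the control flow, however, is the same closed loop.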
Here is theCUBE’s complete video interview with Brian Gilmore:
3) Time-series data use cases extend into outer space.
Everywhere, across the globe and beyond, time-series data is bringing tomorrow’s innovations to life. And while InfluxData may not be a household name, its customer list crosses industries and continents, from the European Organization for Nuclear Research (CERN) to Comcast Corp. and into outer space.
The fact that InfluxDB is simple to implement yet can handle cloud-scale data at real-time speeds is a critical factor for many customers and keeps it at the top of the DB-Engines ranking of time-series database management systems.
Loft Orbital Solutions Inc. uses InfluxData to perform analysis on satellite systems data, a job that requires extremely high precision and real-time access to data.
“We often zoom out to look at a year’s worth of data. [Then] you’re zooming in, to where your screen is preoccupied by a tiny fraction of a second, and you need to see not just the actual telemetry, which is coming in at a high rate, but the events that are coming out of our controllers,” said Caleb MacLachlan, senior spacecraft operations software engineer for Loft Orbital during the event. “We want to have that at micro or even nanosecond precision so that we know, OK, we saw a spike in chamber pressure at this exact moment; was that before or after this valve opened? That kind of visibility is critical in these kinds of scientific applications and absolutely game-changing to be able to see that in near real time.”
Joining MacLachlan for a customer panel session during the event was Angelo Fausti, software engineer at the Vera C. Rubin Observatory. Having real-time visibility into their telemetry data and metrics is as crucial to the observatory’s mission of mapping the Universe as it is to Loft Orbital’s satellite-as-a-service operations.
Despite their power to enable some of humanity’s most ambitious scientific research and technological innovation, InfluxData’s tools are so simple to implement that the engineers can hand off building custom dashboards to the users themselves.
“What I’ve seen be game-changing is that, generally, I’d say anyone can learn to use Influx,” MacLachlan stated. “It gets us out of the way, us software engineers, who may not know quite as much as the scientists and engineers that are closer to the interesting math.”
Fausti has the same experience at the Rubin Observatory: “We have the astronomers making their own dashboards, because they know exactly what they need to visualize,” he said.
Here is theCUBE’s complete video interview with Angelo Fausti and Caleb MacLachlan:
And make sure to watch the full event video below:
(* Disclosure: TheCUBE is a paid media partner for “The Future Is Built on InfluxDB” event. Neither InfluxData, the sponsor for theCUBE’s event coverage, nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)