By James Blackwood-Sewell, Senior Director of Developer Advocacy, Timescale
IoT has been one of the most transformative forces of recent years, driving innovation and reshaping business operations across industries. Historically, only single-purpose time-series databases could keep pace with the waves of data it generates.
However, with recent advancements in general-purpose databases, contenders like PostgreSQL are now increasingly attractive as the cornerstone of modern IoT applications.
From managing devices to optimizing processes and enhancing user experiences, IoT offers a wealth of opportunities for the companies that implement it well. It’s not all plain sailing, though. Companies need to meet the challenge of effectively managing the constant streams of time-stamped or time-series data produced by large fleets of sensors and devices.
Now that general-purpose databases like PostgreSQL can support this workload, the time for custom databases is drawing to a close.
A brief history lesson: How custom databases became the standard
The rise of IoT technology has unlocked a realm of possibilities across diverse sectors. From the emergence of smart cities to the proliferation of connected healthcare devices and precision agriculture, IoT is ushering in an era of innovation poised to reshape industries.
In supply chain management, IoT-powered sensors provide real-time insights into inventory levels, shipping conditions, and demand fluctuations. This data-driven approach streamlines logistics, reduces waste, and enhances overall efficiency.
In the energy sector, the synergy of IoT-enabled smart grids and sensors enables the monitoring of energy consumption patterns, paving the way for optimized resource distribution and utilization. Similarly, in agriculture, IoT devices empower farmers to monitor soil conditions, weather patterns, and crop health, leading to improved yields and resource conservation.
However, beneath IoT’s potential lies the challenge of effectively centralizing and managing the data generated by these devices. The need to ingest data and have it available for querying almost immediately, while still answering both blazing-fast single-row and aggregate queries, eliminated many databases straight away. The new databases created to fill this space made trade-offs: data immutability, batch ingest, and limited query capabilities—sometimes with a purpose-built language that wasn’t used anywhere else, steepening developer teams’ learning curve.
The incoming data was often downsampled to reduce its storage volume (in manufacturing, this is called “compression,” although unlike true compression the original data cannot be recovered), closing off future opportunities for analysis. Once data was in these databases, it was siloed, and linking it to other datasets was hard or impossible.
Relational databases were well established, but they simply couldn’t handle the volume of data. Expensive commercial historian databases became king in industrial IoT, and single-purpose open-source time-series databases entered the market.
The Power of PostgreSQL: Fueling the Next IoT Wave
While historians and time-series databases served IoT well, they limited the way data was used because they lacked the flexibility of their relational counterparts. Business data couldn’t be stored alongside IoT data, joins between datasets were difficult, and developers still had to learn how to use multiple database technologies. All this changed when native time-series extensions added the ability to handle time-series data in PostgreSQL as easily as any other data type.
With PostgreSQL unlocked, the need for complex, single-purpose time-series databases has dropped. Why use an expensive, specialized tool with limitations and a steep learning curve when an open-source general-purpose database can handle the workload? Less technology to learn, fewer moving parts in your technology stack, and fewer non-standard query languages or APIs.
Now that PostgreSQL can support the demands of IoT workloads, it’s still important to ask what the benefits of using it are. What do IoT companies gain from moving their time-series workloads to PostgreSQL?
Proven resilience and trustworthiness
With a legacy spanning over two decades and a thriving community of contributors, PostgreSQL has undergone extensive testing and refinement. Its battle-tested nature makes it a reliable choice for IoT applications where data integrity, consistency, and reliability are paramount.
Adaptability for diverse data types
Database technology like Timescale’s can unlock time-series data, but PostgreSQL’s native versatility shines through its support for various other data types, including JSON and geospatial data. This adaptability is crucial for modern IoT applications, where sensor data arrives in varied formats. Whether it’s sensor readings, GPS coordinates, nested object data, or traditional business data, PostgreSQL adeptly stores and processes this information, enabling comprehensive analytics and insight across multiple datasets.
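As a sketch of what this versatility looks like in practice, a single table can mix relational columns, schema-flexible JSONB payloads, and geospatial points (the table and column names here are illustrative, and the `geography` type assumes the PostGIS extension is installed):

```sql
-- Illustrative schema: relational, JSON, and geospatial data side by side.
CREATE EXTENSION IF NOT EXISTS postgis;

CREATE TABLE sensor_readings (
    time        TIMESTAMPTZ NOT NULL,
    device_id   TEXT        NOT NULL,
    temperature DOUBLE PRECISION,
    payload     JSONB,                  -- nested, schema-flexible device data
    location    GEOGRAPHY(POINT, 4326)  -- GPS coordinates
);

-- Query structured and semi-structured data together:
-- recent readings from devices within 5 km of a point.
SELECT device_id,
       payload->>'firmware' AS firmware,
       temperature
FROM sensor_readings
WHERE time > now() - INTERVAL '1 hour'
  AND ST_DWithin(location, ST_MakePoint(151.2, -33.9)::geography, 5000);
```

No separate document store or GIS system is needed: the JSONB operators and PostGIS functions run in the same query, against the same table, as ordinary SQL.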
Ecosystem and connectivity
PostgreSQL boasts a vibrant ecosystem of tools, extensions, and connectors that enhance its functionality. This ecosystem ensures that PostgreSQL seamlessly integrates with other technologies, both those commonly employed in IoT deployments and those historically not used with IoT. Whether it’s connecting with visualization tools, machine learning frameworks, or data processing platforms, PostgreSQL’s compatibility fosters a cohesive IoT architecture.
Time-series features in a relational world
Once PostgreSQL is unlocked for time-series data, the best qualities of both worlds are combined. Ultra-fast ingest and querying, data that can be updated or arrive late, best-in-class lossless compression (often up to 95%), downsampling, retention windows, and incrementally updated materialized views for time-based aggregate queries are all within reach when the right extension is used.
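As one example of what "the right extension" can look like, the features above map onto a few SQL statements with the TimescaleDB extension (names and intervals here are illustrative, not prescriptive):

```sql
-- Illustrative setup, assuming the TimescaleDB extension is installed.
CREATE TABLE readings (
    time      TIMESTAMPTZ NOT NULL,
    device_id TEXT        NOT NULL,
    value     DOUBLE PRECISION
);
SELECT create_hypertable('readings', 'time');  -- time-partitioned storage

-- Lossless columnar compression on data older than seven days:
ALTER TABLE readings SET (
    timescaledb.compress,
    timescaledb.compress_segmentby = 'device_id'
);
SELECT add_compression_policy('readings', INTERVAL '7 days');

-- Retention window: automatically drop raw data older than a year.
SELECT add_retention_policy('readings', INTERVAL '1 year');

-- Incrementally updated materialized view for hourly aggregates:
CREATE MATERIALIZED VIEW readings_hourly
WITH (timescaledb.continuous) AS
SELECT time_bucket('1 hour', time) AS bucket,
       device_id,
       avg(value) AS avg_value
FROM readings
GROUP BY bucket, device_id;
```

Everything else—joins to business tables, standard SQL, the existing driver and tooling ecosystem—works exactly as it does for any other PostgreSQL table.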
The Best Backbone for IoT
We all know that IoT is one of the key innovations of this generation, producing massive amounts of highly valuable data that needs to be stored and analyzed, often in real time. With recent developments in extensions, PostgreSQL has risen to the challenge of handling this workload on a general-purpose commodity database, removing the need for complex and expensive historian or time-series database products.
In the age of IoT, where data is king, PostgreSQL is beginning its reign as the true monarch of modern data management.
This post originally appeared on TechToday.