In the era of “Big Data”, the ability to store, consume and analyse data appears to be as easy as subscribing to one of the multitude of cloud services clamouring for your attention. In this new era of “everything as a service”, does the Process Historian still have a place?
Big data concepts have been embedded within critical infrastructure companies for nearly two decades. Historian technologies have been used to save our environment and to manage the most stringent critical infrastructure in the world. Tier 1 process historians offer key features such as high compression, extremely fast data access, API access to proprietary stored data, standard out-of-the-box interfaces for enterprise integration, and the list goes on. The information systems “industry” attempted to use relational databases such as MS SQL Server, Oracle and MySQL to avoid the automation vendors’ proprietary industrial offerings. Many tried these alternatives, but all failed unless they compromised on the performance of the system or, more importantly, on the scope of data to be stored and retrieved. This was true from process plant to nuclear power plant. The organisations that compromised on requirements quickly realised that a limited data set means limited benefits to the enterprise in the longer term.

A decade ago, interoperability between OT and IT was starting to blur the lines in terms of providing data storage and analysis functions, but if you didn’t want to wait a day for your report to be generated, you were still using a process historian. The process historian never really received the accolades it deserved, mainly because of its unique complexity and sometimes poor implementation. Let’s face it: it’s a back-office black hole, not the cool desktop application we would show our friends. It’s an industrial application that the nerds use to solve some serious problems.
The key criteria for selecting a process historian still hold true today if you are planning an on-premise solution. The alternative is to roll out cloud-based infrastructure, but why would you do that? Cloud-based alternatives can now match the performance of on-premise process historians in many respects. Their storage compression may still not be equal, but the cost of storage has decreased enough that less efficient cloud storage is a viable alternative. Even though the technologies are fundamentally different, the major difference is not the technology; it is the location of the data and whether that location is considered “safe”. Move the solution, or part of it, to the cloud using IoT technologies and suddenly many of the core reasons for choosing a process historian disappear altogether. A whole industry revolving around process historians, including industrial automation vendors and service providers (meaning us), has a vested interest in a longer life for the process historian, but the cost of transitioning away from tradition is often prohibitive until it is time to recycle the infrastructure.
What is the life expectancy of process historians? If the large automation vendors decide not to maintain their products, or stop investing in security, application interfaces and so forth (as their support fees decline with lessening interest), that would be an important time to move on to alternative cloud-based technologies. On the other hand, if scalability, interoperability with other cloud-based infrastructure and, in particular, global management of operations are important, there may be impetus to get going sooner with your cloud solution.
With the improvement of relational databases in the cloud, dozens of TSDBs (time-series databases) are now readily available, along with dozens of standard interfaces and tools aimed at the more general data consumer. Whether industrial data should be locked down all the way from the instrument to the boardroom is a separate debate.
With all this change, is anything going to remain? Process data is unique and highly technical. It needs to be planned for, managed with consideration and transported intelligently. This is particularly important on low-bandwidth infrastructure, but even with high bandwidth the network design still needs to filter what is stored and manage the significant volumes of data that can be made available. Higher-speed systems also create more bad data if the data is not conditioned appropriately: a big warning for novice data providers. No matter what the technology or location of the data storage, the quality of the data source is key to any solution. The speed of data acquisition multiplies the effect of both good and bad data, stressing not only the choice of technology but, ultimately, the architecture and its final configuration.
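To make the conditioning point concrete, one of the simplest techniques historians and collectors use to avoid storing insignificant jitter is a deadband filter. The sketch below is a minimal, hypothetical illustration of the idea (function name and parameters are ours, not from any particular product): a sample is only kept if it moves more than a configured deadband from the last stored value.

```python
def deadband_filter(samples, deadband):
    """Keep only samples that move more than `deadband` away from the
    last stored value. A simple conditioning step that discards
    insignificant jitter before data reaches storage.

    samples: iterable of (timestamp, value) pairs, in time order.
    """
    stored = []
    last_value = None
    for timestamp, value in samples:
        # Always keep the first sample; afterwards, keep only
        # significant changes relative to the last stored value.
        if last_value is None or abs(value - last_value) > deadband:
            stored.append((timestamp, value))
            last_value = value
    return stored


# A noisy flow reading sampled four times: only two points survive.
readings = [(0, 10.00), (1, 10.05), (2, 10.50), (3, 10.55)]
kept = deadband_filter(readings, deadband=0.2)
# kept == [(0, 10.0), (2, 10.5)]
```

Real historians use more sophisticated schemes (e.g. swinging-door compression), but the principle is the same: condition at the source, so that higher acquisition speeds do not simply multiply the volume of bad data.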
The principle of managing process information with integrity will never change. In particular, for control systems with complex and customised processes, the challenge will always need to be defined, configured and delivered by a specialist. The collection and aggregation of good-quality process data is critical, and no single device, whoever supplies it, can produce that outcome in isolation from the rest of the system. Until process historians disappear from view, their data collectors will continue to take away some of the headaches of this data aggregation. Process historians have good interfaces and adapters for collecting information from standard SCADA and automation devices. This is a very strong reason to stay with the process historian for most, if not all, industrial applications.
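One of the headaches a historian data collector removes is coping with an unreliable link to the server. The sketch below illustrates the general store-and-forward pattern such collectors use; all names are hypothetical and this is not any vendor's API. Samples are buffered locally and flushed upstream when the historian is reachable, so a network outage does not mean lost data.

```python
import collections


class StoreAndForwardCollector:
    """Minimal sketch of the store-and-forward behaviour a historian
    data collector typically provides (hypothetical names throughout).
    """

    def __init__(self, send, max_buffer=10000):
        # `send` is any callable that forwards one sample upstream and
        # raises ConnectionError when the historian is unreachable.
        self.send = send
        self.buffer = collections.deque(maxlen=max_buffer)

    def collect(self, timestamp, tag, value):
        """Buffer a sample locally, regardless of link state."""
        self.buffer.append((timestamp, tag, value))

    def flush(self):
        """Forward buffered samples until empty or the link drops.
        Returns the number of samples successfully sent."""
        sent = 0
        while self.buffer:
            sample = self.buffer.popleft()
            try:
                self.send(sample)
                sent += 1
            except ConnectionError:
                # Put the sample back for the next attempt.
                self.buffer.appendleft(sample)
                break
        return sent


# Usage: collect while the link is down, then flush once it returns.
received = []
collector = StoreAndForwardCollector(received.append)
collector.collect(0, "FT101", 1.00)
collector.collect(1, "FT101", 1.05)
collector.flush()  # both buffered samples are forwarded
```

The bounded deque is a deliberate choice: on a constrained field device the oldest samples are dropped rather than exhausting memory, a trade-off real collectors make configurable.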
If you were about to implement a new system, what technology would you recommend today? The first and most important question is this: is the solution to be on premise now and forever? If so (for whatever corporate reason), stay with the process historian concept for at least the next technology cycle. If not, base the decision carefully on architecture, performance, renewal cost and out-of-the-box interoperability. Note that historian technologies have recently started to be provisioned in the cloud, which gives us more options, one of which is a hybrid solution.
Hybrid solutions are being used by military and critical infrastructure organisations around the world because the benefits are hard to refute. They leverage the process historian for data storage and newer technologies for the unification and presentation of analytics to the wider enterprise, normally including the organisation’s preferred Business Intelligence (BI) platform.
Some of the technical benefits of new tech in the cloud include:
- On demand storage.
- On demand numerical processing power.
- “Unlimited” client access and programmatic performance adjustments in real time that customise the user experience, leading into AI and extended business intelligence applications.
It’s time to think next generation, because for some non-industrial applications it is the end of the road for the process historian. When a process historian was all we had for time-series data, the choice was easy. Today we have options: a historian on premise, a historian in the cloud, alternative TSDB big-data solutions, and hybrid/blended solutions.
Previously published 2016 and updated 2019.