Our Top 5 Problems with Process Historians points out a number of product limitations and common implementation issues. The news is not all bad, but the Top 5 does illustrate common problems. As with any technology, getting a historian to do more than it was designed to do may be cool, but there will be strings attached.

The value proposition is changing.
1. Historical Data Retrieval and Analysis.
2. Systems Integration and Compatibility.
3. Data Quality and Cleansing.
4. Contextualising and Enrichment.
5. Scale and Expansion.
Closing Scene

The value proposition is changing.

These are our Top 5 Problems with Process Historians. They apply most aptly when Process Historians are deployed in the Enterprise landscape, but not exclusively.
As always, we reserve the right to change the Top 5 as new developments surface, or as new pathways being explored with emerging technology fail to offer suitable alternatives to the historical benefits people determine they simply cannot do without. We are not disputing that people cannot do without these benefits. New entrants may not appreciate these benefits, or may value them within a new narrative. That is not wrong, but it does drive the ongoing debate that is helping to redefine the primary purpose of the software systems we continue to use today, and of those we are preparing to do without tomorrow.

1. Historical Data Retrieval and Analysis.

Retrieving historical data quickly and performing complex analyses can be challenging, especially with large datasets. Even without comparing emerging technologies against current Process Historians, historians need to provide tools and features for efficient data retrieval and analysis. Data retrieval can be resource-intensive, but that is true of any technology. Within the historian ecosystem the core system is tuned for best performance, and that performance is impressive. Outside this ecosystem the narrative is different, and performance is not measured the same way. Enriched data does not only mean sub-second process information in real time. At its core, the process historian provides very little data analysis capability; that is done elsewhere.
Tip: Evaluate the technology performance requirements as they apply to the business ecosystem, not just the historian ecosystem in isolation.
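To make the point about analysis being done "elsewhere" concrete, here is a minimal, hypothetical sketch (not any vendor's API) of what a consumer outside the historian typically has to do: pull raw samples and down-sample them before analysis. All names (`Sample`, `hourly_averages`) are our own illustration.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from collections import defaultdict

@dataclass
class Sample:
    """One raw historian reading: a timestamp and a value."""
    timestamp: datetime
    value: float

def hourly_averages(samples):
    """Down-sample raw historian data into hourly means for analysis."""
    buckets = defaultdict(list)
    for s in samples:
        hour = s.timestamp.replace(minute=0, second=0, microsecond=0)
        buckets[hour].append(s.value)
    return {hour: sum(vals) / len(vals) for hour, vals in sorted(buckets.items())}

# Example: one reading per minute over two hours.
start = datetime(2024, 1, 1, 0, 0)
raw = [Sample(start + timedelta(minutes=i), float(i)) for i in range(120)]
print(hourly_averages(raw))
```

Even this trivial aggregation happens outside the archive, which is why performance measured at the business platform differs from performance measured inside the historian ecosystem.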

2. Systems Integration and Compatibility.

Closely tied to our Top issue #1 is data integration. Historians often need to integrate with various legacy systems, sensors, and devices that use different data formats and communication protocols. Ensuring seamless data integration and compatibility can be a significant challenge. This is a data historian's core capability. It is the front end of the historian, where it acts as a data consumer (client). The challenge rests with the back end, where the historian is the server.
Accessing the historian ecosystem can be limiting in terms of performance and flexibility. SDKs (APIs) are provided, which is useful to programmers, but only if the API is fully functional. Other conventional automation interfaces are almost always provided. Sometimes none of these interfaces meet the business needs, even though they meet the operational needs of the business.
Tip: Interoperability is only the starting point, and connection to a system is not capability. Consider the effort required to integrate and maintain any customisations that were done to "connect with ease".
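The integration-versus-capability distinction can be sketched in a few lines. The example below is purely illustrative (the message shapes, field names, and tag are invented): two sources deliver the same reading in different formats, and the integration layer must normalise both into one record before the historian or business platform can use them.

```python
from datetime import datetime, timezone

def from_scada(msg: dict) -> dict:
    """Normalise a hypothetical SCADA message into a common record."""
    return {
        "tag": msg["point"],
        "value": float(msg["val"]),
        "quality": msg.get("q", "good"),
        "timestamp": datetime.fromtimestamp(msg["ts"], tz=timezone.utc),
    }

def from_edge(csv_row: str) -> dict:
    """Normalise a hypothetical edge-device CSV row into the same record."""
    tag, iso_ts, value = csv_row.split(",")
    return {
        "tag": tag,
        "value": float(value),
        "quality": "unverified",  # this device sends no quality flag
        "timestamp": datetime.fromisoformat(iso_ts),
    }

a = from_scada({"point": "FIC101.PV", "val": "12.5", "ts": 1700000000})
b = from_edge("FIC101.PV,2023-11-14T22:13:20+00:00,12.5")
print(a["value"] == b["value"])  # same reading arriving via two transports
```

Writing the two adapters is the easy part; maintaining them as each source system evolves is the ongoing cost the tip above warns about.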

3. Data Quality and Cleansing.

Data historians rely on accurate and reliable data. Data is usually captured from other software systems or scanned directly from edge devices. Inconsistent or erroneous data can lead to incorrect analyses and decisions, so ensuring data quality and implementing data cleansing procedures is an ongoing challenge. Historians are purposed as the single source of truth for raw data.
There are rudimentary ways to "clean up" raw data as it is ingested and to store the modified data. In effect, unless a second data set is kept, the original data is lost. If the rudimentary clean-up rules were not right, or a new narrative develops that could have used the raw data, those benefits are gone. The ease of applying clean-up functionality may give a false sense of data integrity.
Tip: Consider the primary purpose of the data historian in terms of the business ecosystem. Does the data need to be cleansed in the historian, or can it be cleaned up in real time as it is programmatically analysed? Can intelligent operational users who consume the data, and who have the technical context, overlook some deficiencies for the sake of raw data retention? Can business users rely on noise and inconsistencies being removed? The process historian does not normally meet both of these opposing needs.
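One way to serve both audiences, sketched below under our own assumptions (the archive, limits, and rule are invented), is "clean on read": the archive keeps raw values untouched, and a cleansing rule is applied only at query time for business consumers.

```python
# Raw archive keeps every value, including two obvious sensor glitches.
RAW_ARCHIVE = [10.1, 10.2, 999.9, 10.0, -5.0, 10.3]

def clean_on_read(values, lo=0.0, hi=100.0):
    """Apply a simple range rule at query time; the raw data stays intact."""
    return [v for v in values if lo <= v <= hi]

ops_view = RAW_ARCHIVE                    # operations keep full raw fidelity
business_view = clean_on_read(RAW_ARCHIVE)  # business sees cleansed values
print(business_view)  # [10.1, 10.2, 10.0, 10.3]
```

If the range rule later proves wrong, it can be changed and re-applied, because the raw data was never overwritten; clean-on-write offers no such recovery.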

4. Contextualising and Enrichment.

This topic is interwoven with Topic 2 – Systems Integration and Compatibility. Contextual information that deepens the analysis of process data is mostly provided outside of the core historian archives. Metadata is sometimes limited, especially in older technologies. Historians can acquire metadata from data sources when the source is a software system such as SCADA or another historian. Conversely, most edge devices (however intelligent) use a transport protocol that carries time, quality, and data value (TQV) information only. In these situations, the tag or point name is usually not available from the source, and this identification information must be maintained in the historian. That is okay, because it needs to be done somewhere anyway; however, the historian must support the type of metadata the business needs to consume.
Tip: Consider the core historian technology in terms of metadata. Does it need supporting technology (wrappers) to provide context? Does the system allow for acquiring more than TQV from other software systems, which already provide vast amounts of concentrated data?
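The TQV-plus-metadata split described above can be sketched as follows. The tag name, metadata table, and field names are all hypothetical; the point is only that the transport carries time, quality, and value, while identity and context must be joined in from somewhere else.

```python
from datetime import datetime

# Metadata maintained in (or alongside) the historian, not on the wire.
METADATA = {
    "TI-204": {"description": "Reactor outlet temperature", "units": "degC"},
}

def enrich(tag: str, tqv: tuple) -> dict:
    """Join a bare TQV sample with the historian-side metadata for its tag."""
    t, q, v = tqv
    meta = METADATA.get(tag, {"description": "unknown", "units": ""})
    return {"tag": tag, "timestamp": t, "quality": q, "value": v, **meta}

# The edge device delivered only (time, quality, value); the tag was
# assigned at configuration time, not sent by the device.
record = enrich("TI-204", (datetime(2024, 1, 1, 8, 0), "good", 182.4))
print(record["units"])  # degC
```

If the historian cannot hold (or federate to) the metadata the business needs, this join has to happen in yet another layer, which is exactly the wrapper cost the tip asks you to weigh.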

5. Scale and Expansion.

As we wrap up our Top 5, this topic is impacted by all that came before it. As enterprises generate vast amounts of data, historians must scale to handle the increasing data volume while maintaining optimal performance. Balancing scalability and performance can be tricky, especially during peak data loads.
The whole premise of having a process historian is to store process data, which is unique. Scaling and expansion is a matter of the previous four topics being addressed adequately. Because these factors are not usually all addressed, scalability and ease of expansion become delicate surgery. That is not unlike any large enterprise software solution.
Tip: Consider architecting the business data ecosystem from the edge device to the business platform, and connecting with the platform consumers. Get past OT isolation.

Closing Scene

When developing a business data ecosystem that includes a process historian, consider the full scale of the system and how it may transition in terms of increased users, high demand for data, and the different methods of data acquisition, along with the previous 5 topics. Do not base important architectural decisions on interoperability and compliance statements alone.
If you need help, ask us. We have had some successes, and we have learned from our failures.


YOU MIGHT ALSO BE INTERESTED IN:

Next Generation Historians
Operational versus Enterprise Historians, what’s the difference?