What is an Enterprise Information System for Industrial Automation?

An Enterprise Information System (EIS) for Industrial Automation is a software system designed to manage and integrate various business and operational processes across an organization. In the context of industrial automation, an EIS typically refers to a software platform that integrates data from various control systems, such as Programmable Logic Controllers (PLCs) and Supervisory Control and Data Acquisition (SCADA) systems, with enterprise-level systems such as Enterprise Resource Planning (ERP) and Customer Relationship Management (CRM) systems.

The purpose of an EIS for Industrial Automation is to provide a holistic view of the organization’s operations, allowing decision-makers to make data-driven decisions and improve overall performance. The system typically includes features such as data acquisition, real-time monitoring and control, reporting and analysis, and integration with other enterprise-level systems. It may also include features such as advanced analytics, machine learning, and artificial intelligence to help identify patterns and improve operational efficiency.

Overall, an EIS for Industrial Automation is designed to provide a unified view of an organization’s operations and help optimize performance by improving communication, reducing downtime, and increasing productivity.

At the core of an Industrial Automation solution are SCADA and a Process Data Historian (click here to learn about Industrial Automation). These mature product types have become the main staple for managing critical infrastructure associated with intelligent assets. The stability of these systems is measured in "up time", which for some systems is in the order of 99.999% availability. More recently, edge devices are increasing data capture and intelligence capability, placing even greater emphasis on the criticality of information infrastructure hardware and software components.


Can Enterprise Information Systems be highly available?

Parasyn designs, implements and supports various process historian technologies for industry. We also procure and provision the software products, transition onto existing data centre infrastructure, or work closely with third-party service providers and the client's internal IT personnel to scope and specify what is required to establish suitable software solution infrastructure. Delivery can be structured as a managed service or as a turnkey project. For critical infrastructure, High Availability components and products can be specified at the plant and enterprise level to ensure the necessary overall reliability is achieved. The specification, selection and licensing of enterprise information solutions can create a world of pain if not approached from a systems engineering perspective (with an OT, or Operational Technology, bias).

What are the benefits of enterprise wide SCADA?

Organisations which take an enterprise view of SCADA widen their scope and improve operations, because the feedback loop for operational systems widens. This is a human benefit more than anything: the technology platform exposes the data sets so that people can consume trusted data and essential information, then demonstrate their findings with ease. For repeatable processes, these system-wide optimisations can be captured using information management techniques and are less dependent on intimate user knowledge, which is typical of plant-based SCADA systems. The typical enterprise data consumer's less "technical" perspective on "what they see" drives better metadata management, adding context and meaning to everything the business does from that day forward. The business assets become linked to other enterprise systems.


Where is the data lake for an Industrial Automation System?

The process data historian is the key software component of enterprise information solutions for managing critical plant, critical remote assets and metering devices. Virtually anything that is instrumented is ripe for "harvesting". How the system is designed to reliably collect information is more than a basic opinion about which brand is "best". How the collected information is displayed, often by dashboards or other "visualisation", is only as good as the underlying data. The data needs to be trusted. The data needs to be of value to the organisation. Data management is key to creating a useful data lake which hosts process data. If you need help deciding what to use or how to design your first or next system, this is something we are passionate about.

What is a Process Data Historian (Data Historian for short)?

A process historian is a software system that is used to collect, store, and analyse historical data related to industrial processes. It is typically used in industries such as manufacturing, energy, and chemicals where large amounts of data are generated by various process control systems such as Programmable Logic Controllers (PLCs) and Supervisory Control and Data Acquisition (SCADA) systems.

The purpose of a process historian is to capture and retain data from various process control systems over time, allowing operators and engineers to analyse historical trends and make data-driven decisions to improve operational efficiency, reduce downtime, and enhance product quality. Process historians typically store data in a time-series database, which enables users to easily view and analyse data in real-time or over a specific time period.


Process historians typically have features such as data acquisition, storage, retrieval, and analysis, as well as the ability to integrate with other systems such as SCADA systems, enterprise-level systems, and analytics tools. They may also include advanced features such as predictive analytics and machine learning to help operators and engineers identify patterns and make more informed decisions.

A process historian is an important tool for industrial processes as it helps to provide a complete picture of the process, allowing users to identify inefficiencies and optimize performance over time.

What is a Time Series Historian?

High performance data historians store time series data (i.e. the value of a data sample, the time and date and some metadata to provide context for the data sample) as efficiently as possible to minimise the disk space required. Fast retrieval is paramount for accessing the vast amounts of information gathered by process data historians. Information is stored efficiently in the sequence of the events (as they occurred in time) even if the data was received out of sequence.
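As a rough illustration of the idea (a toy sketch in Python, not how any commercial historian is implemented), the following keeps one tag's samples ordered by timestamp even when they arrive out of sequence; the class and tag names are invented for the example:

```python
# Toy time-series store for one tag: each sample is a value, a
# timestamp and some metadata (quality), and the archive stays in
# time order even when samples arrive late. Real historians also
# compress the archive to minimise disk space; that is omitted here.
import bisect
from dataclasses import dataclass, field

@dataclass(order=True)
class Sample:
    timestamp: float                       # seconds since epoch; the sort key
    value: float = field(compare=False)    # the process value
    quality: str = field(default="good", compare=False)  # metadata/context

class TimeSeriesTag:
    def __init__(self, name: str):
        self.name = name
        self._samples: list[Sample] = []   # always sorted by timestamp

    def insert(self, sample: Sample) -> None:
        # insort keeps the archive time-ordered, so a sample that
        # arrives late (e.g. after a comms outage) slots into place.
        bisect.insort(self._samples, sample)

    def range(self, start: float, end: float) -> list[Sample]:
        # Fast retrieval via binary search over the ordered archive.
        lo = bisect.bisect_left(self._samples, Sample(start, 0.0))
        hi = bisect.bisect_right(self._samples, Sample(end, 0.0))
        return self._samples[lo:hi]
```

For example, inserting samples timestamped 100 and 300 and then a late-arriving 200 still yields all three back in time order from `range(100, 300)`.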

Who uses Enterprise Information Solutions?


Governments, regulatory bodies and business have all been steadily increasing their demand for more accurate, validated and timely data. Organisational compliance reporting and data analysis requirements have therefore naturally become increasingly important as deregulation and privatisation take hold across the globe. Managing assets holistically, according to how the business manages the asset rather than how the operations group repairs and maintains it, means these views no longer need to be mutually exclusive or managed in isolation. More and more organisations are turning to enterprise data historians as the vehicle to store the critical information they rely on and trust. The strongest benefit of implementing an enterprise-wide historian is the sharing of information outside the boundaries of geography, divisions and areas of responsibility. The silos are removed.

What data types does an Enterprise Data Historian include?

When designed and implemented correctly, an enterprise data historian creates a lifelong dependency for organisational user groups that are serious about sharing information and quality management.


Technology now delivers high-speed data acquisition, abstracted data structures which fit specific asset models, massive storage, and caching of enterprise KPIs and production performance management information. This means the appropriate information can be presented to every user group at every level of the organisation. The enterprise historian liberates asset performance information and exposes information historically considered precious only by operators and maintenance personnel.


The data historian is usually augmented with other software tools to create data structures which can be harvested by business analysts and other data consumers who do not need to understand how the data has been organised or "connected" into one view. Viewing data from an asset or geographical perspective, or by instrument type, creates a different value proposition for each user group.
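To illustrate the idea of one data set exposed through multiple views, here is a minimal sketch in Python (all names hypothetical): a single tag is registered under both an asset path and a geographical path, so each user group can browse the same data its own way without knowing how it was connected behind the scenes.

```python
# Sketch of a multi-view tag catalogue (all names hypothetical):
# the same tag is registered under several hierarchies so analysts
# can browse by asset, by geography, or by instrument type.
class TagCatalog:
    def __init__(self):
        self.views: dict[str, dict[str, str]] = {}  # view name -> {path: tag}

    def register(self, tag: str, **paths: str) -> None:
        # e.g. register("FT101.PV", asset="Site1/Pump1/Flow",
        #               gis="QLD/Brisbane/FT101")
        for view, path in paths.items():
            self.views.setdefault(view, {})[path] = tag

    def browse(self, view: str, prefix: str) -> list[str]:
        # All tags whose path in the given view starts with the prefix.
        return sorted(tag for path, tag in self.views.get(view, {}).items()
                      if path.startswith(prefix))
```

An asset-focused engineer browses `"asset"` paths while a regional analyst browses `"gis"` paths, yet both reach the same underlying tags.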

How important is the selection of Enterprise Information System core technologies?

The choice of technology platform can be an expensive one. If the process data historian is poorly specified for the application, expansion will be painful when the time comes. Factors include speed of data storage, storage efficiency, connectivity to source devices, interoperability, client access methods and speed of data retrieval. Each vendor's product differs from the others; none of them are the same. Getting this right is something Parasyn can help with.


Where are Operational Historians used?

Sometimes plant historians are required close to the plant, which may or may not be remote from a centralised control room or the "head office". Local historians are sometimes called plant historians. The value of a plant historian usually relates to the need for localised capability if the remote connection (internet or private network) is unavailable. When the local plant historian is tiered with an enterprise historian, the enterprise user does not need to be concerned with connecting to multiple systems. The value of tiering historians at the corporate level includes running enterprise-class applications which may not be cost effective or even useful at the plant level. Some of these enterprise applications include asset optimisation, predictive analytics (machine learning), workflow, advanced analytics, and production and compliance reporting.
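A toy sketch of the tiering idea (invented names, not any vendor's API): the plant historian holds raw samples locally, while the enterprise tier ingests hourly aggregates, so corporate users connect to one system only.

```python
# Two-tier sketch (hypothetical names): raw data stays at the plant;
# the enterprise historian pulls up aggregates when the link is up.
from statistics import mean

class PlantHistorian:
    def __init__(self, site: str):
        self.site = site
        self.samples: dict[str, list[tuple[int, float]]] = {}  # tag -> [(hour, value)]

    def record(self, tag: str, hour: int, value: float) -> None:
        self.samples.setdefault(tag, []).append((hour, value))

    def hourly_average(self, tag: str, hour: int):
        vals = [v for h, v in self.samples.get(tag, []) if h == hour]
        return mean(vals) if vals else None

class EnterpriseHistorian:
    def __init__(self):
        self.rollups: dict[tuple, float] = {}  # (site, tag, hour) -> aggregate

    def ingest(self, plant: PlantHistorian, tag: str, hour: int) -> None:
        # Pull the aggregate up from the plant tier; if the link is
        # down, the plant keeps its raw data and we ingest later.
        avg = plant.hourly_average(tag, hour)
        if avg is not None:
            self.rollups[(plant.site, tag, hour)] = avg
```

The enterprise user queries `rollups` by site, tag and hour without knowing which plant system the data came from.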

What are the important features of Enterprise Historians?

Enterprise Historian solutions differ from what is typically used in a local process plant. Local plant historians may not be suitable for this type of application unless the architecture supports a cascaded model, where each plant historian rolls up into the enterprise historian, which presents aggregated values as the single source of truth. However, there are high-end historians suitable as a single enterprise-wide historian which do not need a cascaded architecture.

When choosing an Enterprise Historian, the decision often comes down to price, as the top-performing products outperform the others available in the marketplace. The product features to consider carefully are:

  • Production Server price (this is the core product)
  • Production Server price to support high availability or redundancy. These are not necessarily the same thing.
  • Pre-Production Server (expected to be a lower cost than the Production Server), Test Server, Development Server.
  • Enterprise connector access (SDKs, API and other interfaces). It is important to understand how this affects the licensing and costs.
  • Webserver Licensing
  • Interfaces (e.g. OPC, OPC HDA, DNP3 etc). Be very clear and specific. Some interfaces are out of the box however, they may not meet operational requirements.
  • Web Client Licensing (unlimited, per seat, etc)
  • Desktop Client Licensing (unlimited, per seat, etc)
  • Application Add-ins (e.g. Excel)
  • Manual Data Insert (Tool for Operator inserts)
  • Asset Hierarchy Model which supports multiple views (e.g. Asset, GIS, Devices etc)
  • Support for Business Intelligence (dimensionalised data sets which reduce the need for complex data manipulation)
  • Analytics (Alerts, Events, Notifications)
  • Annual support cost as a percentage (and of which price list)
  • Price Breaks for extending the license to expand the system
  • Local or International pricing (are you exposed to variances)

What are the most common Enterprise Historians?

The most widely used enterprise historian in the world is difficult to determine, as there are many historian software vendors in the market and their market share varies by region and industry. However, some of the most commonly used enterprise historians include:

  • Aveva PI is a widely used historian platform that is used in a variety of industries such as manufacturing, energy, and water/wastewater.
  • Honeywell PHD is used in industries such as oil and gas, petrochemicals, and refining.
  • GE Digital Proficy Historian is used in industries such as energy, utilities, and transportation.
  • ABB Ability Symphony Historian is used in industries such as power, mining, metals, and pulp and paper.
  • Wonderware Historian is used in industries such as manufacturing, food and beverage, and utilities.

It’s important to note that each enterprise historian platform has its own strengths and weaknesses, and the most suitable platform will depend on the specific needs and requirements of the application.

Can SQL or a relational database be used as an Enterprise Historian?

If the question had to be answered in one word with today's technology, the answer would be: no. The emphasis here should be on the word "Enterprise". Many vendors, including Industrial Automation vendors who do not own an Enterprise Historian, use SQL or Oracle as a means to an end. Putting aside data payload costs, which are a massive showstopper in certain locations, relational databases on cloud-based servers with "unlimited" resources are certainly a viable option for low-volume, non-time-series data warehousing of process data. Because of the performance of cloud servers, it can appear that relational databases are also suitable for Enterprise Historian solutions; however, this is not the case, yet.

What is the role of a relational database for Industrial Automation?

Even the Enterprise Historian vendors embed relational databases in their total solutions for organising information, including the asset hierarchy and enterprise integrations. This is what relational databases were designed to do. For the small end of town, it makes sense to use SQL/Oracle for everything. For the "big end of town" it doesn't. Unfortunately, Operational Historian products (or SCADA products with a relational database interface), i.e. the lower end of town, sometimes make claims about storage rates, acquisition rates and storage capabilities. It is important to compare apples with apples when assessing these claims. Some items to consider are:


  1. Does the product support time series data (time- and date-stamping of events in the field device)? This is not native to a relational database, and solutions that claim the functionality have masked the issue with workarounds. The astute end user may not notice because the data volumes and reporting requirements are minimal.
  2. Are the quoted performance figures describing the cloud infrastructure as a whole, or the performance of the server instance actually provisioned for the specific application?
  3. What security and multi-tenanting limitations are imposed by the cloud technology provider? In some cases the cloud provider provisions a much superior solution, so this must be considered in context with all of the requirements, including scalability and cost per point. The size of the infrastructure is only part of the picture.
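To make the time-series point concrete, here is the common relational workaround: a "tall" table of (tag, timestamp, value) rows, shown with SQLite and a hypothetical tag name. It is perfectly functional at low volumes, but every sample costs a full row plus index entries, which is far less compact than a purpose-built time-series archive.

```python
# The typical relational workaround for process data: one "tall"
# table holding (tag, timestamp, value) rows, indexed for range
# queries. Workable at low volume; not an enterprise historian.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE samples (tag TEXT, ts REAL, value REAL)")
conn.execute("CREATE INDEX idx_tag_ts ON samples (tag, ts)")

# Five samples for one (hypothetical) tag, one second apart.
rows = [("FT101.PV", 100.0 + i, 50.0 + i) for i in range(5)]
conn.executemany("INSERT INTO samples VALUES (?, ?, ?)", rows)

# Range retrieval: everything between two timestamps, in time order.
got = conn.execute(
    "SELECT value FROM samples WHERE tag = ? AND ts BETWEEN ? AND ? "
    "ORDER BY ts",
    ("FT101.PV", 101.0, 103.0),
).fetchall()
```

The `ORDER BY ts` clause and the compound index are doing the work a time-series engine provides natively; at enterprise data rates, that per-row overhead is where this approach breaks down.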


Relational-database-like interfaces to SCADA, though functional on some products, should be implemented carefully. Best practice is to avoid external access into SCADA unless very tight controls are imposed on the interface to limit bulk data access and avoid performance degradation, which can lead to data loss and operational issues in the SCADA system.

