Set Up High Availability of Web-based Clients

Before you begin

  1. Ensure that your network is enabled for multicast traffic. To do so, run the following command:
    netsh interface ipv4 show joins [<interface name>]
    A list of IP multicast groups that have been joined through an interface appears. If you do not specify an interface name, a list of multicast groups for all interfaces appears. (This check, along with installing the Failover Clustering feature, is also sketched in PowerShell after this list.)
  2. Create a shared drive on your network that all the nodes in the cluster can access, and create a folder on that drive.
  3. On each node that you want to add to the cluster:
    1. Install the Failover Clustering feature.
    2. If you want to use an existing Proficy Authentication instance, ensure that all the cluster nodes point to the same Proficy Authentication instance. Note that the Proficy Authentication credentials of the node on which you installed Proficy Authentication last are used for all the cluster nodes.
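
If you prefer to script these prerequisite checks, a minimal sketch (run in an elevated PowerShell session on each node) follows; the interface name is only an example.

    # Step 1: list the multicast groups joined through an interface; omit the
    # interface argument to list the groups for all interfaces ("Ethernet" is an example name).
    netsh interface ipv4 show joins interface=Ethernet

    # Step 3.1: install the Failover Clustering feature and its management tools.
    Install-WindowsFeature -Name Failover-Clustering -IncludeManagementTools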

About this task

In a cluster environment, multiple servers that share the same data are installed. Each of these servers is called a node. One of them acts as the primary node, while the others are standby nodes. If the primary node goes down, one of the standby nodes takes over.

When you install Web-based Clients in a cluster environment, the web servers are added to the cluster. This provides high availability of the connection between the Historian server and the client applications.

For example, if Configuration Hub on the primary node is unable to connect to the Historian server, the user session on the standby node is activated. Therefore, you will still be able to connect to the Historian server using Configuration Hub installed on the standby node.

The following services are shared between the primary and standby nodes in a cluster:
  • Historian Indexing Service
  • GE Historian PostgreSQL Database
Historian works with Microsoft Failover Cluster Manager to ensure high availability of Web-based Clients. Using Failover Cluster Manager, you must add these services to the cluster.
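
Before adding these services to the cluster, you can confirm that both of them exist on each node; this check assumes the service display names are exactly as listed above.

    # Confirm that both shared services are installed on this node.
    Get-Service -DisplayName "Historian Indexing Service", "GE Historian PostgreSQL Database"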

Procedure

  1. Access the primary node of the cluster.
  2. Create a failover cluster.
  3. Add storage to the failover cluster.
  4. Select Roles > Create Empty Role.
    A role is created.
  5. Add a client access point to the role:
    1. Select the role.
    2. In the Actions section, select Add Resource > Client Access Point.
    3. Follow the on-screen instructions to add a client access point to the role.
  6. Add storage to the role:
    1. Select the role.
    2. In the Actions section, select Add Storage.
    3. Follow the on-screen instructions to add the storage that you added to the failover cluster in step 3. Note that each storage can be used only once.
  7. Add resources to the role:
    1. Select the role.
    2. In the Actions section, select Add Resource > Generic Service.
      The New Resource Wizard window appears.
    3. In the list of resources, select Historian Indexing Service, and then follow the on-screen instructions to add the service.
  8. Repeat the previous step to add the GE Historian PostgreSQL Database resource.
  9. Add the following dependencies for each of these resources:
    1. Double-click a resource.
      The <resource name> Properties window appears.
    2. Select Dependencies.
    3. Select Insert, and add dependencies for each resource as described in the following table, using the AND operation.
      Resource Name                      Dependencies
      GE Historian PostgreSQL Database   • IP Address
                                         • Storage
                                         • The network name
      Historian Indexing Service         • IP Address
                                         • Storage
                                         • The network name
                                         • GE Historian PostgreSQL Database

  10. Select the role, and then in the Actions section, select Start Role.
    When you later install Web-based Clients and provide the cluster details, Web-based Clients will be part of the cluster, thus achieving high availability. A scripted equivalent of this procedure is sketched below.
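
If you prefer to script this procedure instead of using the Failover Cluster Manager UI, the following PowerShell sketch (run on the primary node, using the FailoverClusters module installed with the Failover Clustering feature) performs the equivalent steps. The cluster name, node names, IP address, role name, disk name, and Windows service names are placeholders; replace them with values from your environment and verify the result in Failover Cluster Manager.

    # Sketch of steps 2 through 10; all names and addresses are examples.
    # Step 2: create the failover cluster from the nodes.
    New-Cluster -Name "HistWebCluster" -Node "NodeA","NodeB" -StaticAddress "192.168.0.50"

    # Step 3: make the shared disk available to the cluster.
    Get-ClusterAvailableDisk | Add-ClusterDisk

    # Step 4: create an empty role (a role is a cluster group).
    Add-ClusterGroup -Name "WebClientsRole"

    # Steps 5 and 6 (client access point and role storage) are easiest to complete
    # in Failover Cluster Manager, as described above.

    # Steps 7 and 8: add the two services as Generic Service resources, then point
    # each resource at the corresponding Windows service name (placeholders below).
    Add-ClusterResource -Name "Historian Indexing Service" -ResourceType "Generic Service" -Group "WebClientsRole"
    Get-ClusterResource "Historian Indexing Service" | Set-ClusterParameter -Name ServiceName -Value "<indexing service name>"
    Add-ClusterResource -Name "GE Historian PostgreSQL Database" -ResourceType "Generic Service" -Group "WebClientsRole"
    Get-ClusterResource "GE Historian PostgreSQL Database" | Set-ClusterParameter -Name ServiceName -Value "<PostgreSQL service name>"

    # Step 9: add the dependencies; the bracketed names must match the IP address,
    # disk, and network name resources that belong to the role.
    Set-ClusterResourceDependency -Resource "GE Historian PostgreSQL Database" -Dependency "[IP Address] and [Cluster Disk 1] and [Network Name]"
    Set-ClusterResourceDependency -Resource "Historian Indexing Service" -Dependency "[IP Address] and [Cluster Disk 1] and [Network Name] and [GE Historian PostgreSQL Database]"

    # Step 10: bring the role online.
    Start-ClusterGroup -Name "WebClientsRole"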

What to do next

  1. Install Web-based Clients. During the installation, select the Cluster Node check box, and provide the details.
  2. Import the Proficy Authentication certificate into all the cluster nodes. Copy the certificate from the following path on any node in the cluster and paste it into the same folder on all the other nodes (steps 2 through 4 are sketched in PowerShell after this list): C:\Program Files\GE\Operations Hub\httpd\conf\cert
  3. Restart the following services on all the cluster nodes:
    • Historian Indexing Service
    • GE Historian PostgreSQL Database
  4. On the machine on which you have installed the Historian server, update the URI of the following registry key to point to the cluster FQDN: HKEY_LOCAL_MACHINE\SOFTWARE\Intellution, Inc\iHistorian\SecurityProvider\OAuth2
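
Steps 2 through 4 can also be scripted. The following sketch uses example node names, the service display names listed above, and assumes the registry value to change is named URI; verify these against your environment before running it.

    # Step 2: copy the Proficy Authentication certificate folder from this node to
    # the other cluster nodes ("NodeB" is an example; repeat for each node).
    $certDir = 'C:\Program Files\GE\Operations Hub\httpd\conf\cert'
    Copy-Item -Path "$certDir\*" -Destination '\\NodeB\C$\Program Files\GE\Operations Hub\httpd\conf\cert' -Recurse

    # Step 3: restart the shared services (display names as listed above) on each node.
    Restart-Service -DisplayName "Historian Indexing Service"
    Restart-Service -DisplayName "GE Historian PostgreSQL Database"

    # Step 4: on the Historian server machine, update the OAuth2 URI so that its host
    # name is the cluster FQDN. Inspect the current value first and keep its format.
    $regPath = 'HKLM:\SOFTWARE\Intellution, Inc\iHistorian\SecurityProvider\OAuth2'
    (Get-ItemProperty -Path $regPath).URI
    Set-ItemProperty -Path $regPath -Name 'URI' -Value '<current URI with the cluster FQDN as the host name>'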