Connecting to the Cloud allows you to utilise the data collected in ATS Intelligence in many different ways: big data analysis can be performed remotely, and third-party applications in the Cloud can also make use of the data.
To connect to the Cloud you must first set up Azure and then specify the Cloud endpoints so that ATS Intelligence can communicate with it. You can then specify which runtime data is sent to the Cloud.
Microsoft Azure is the Cloud service used by ATS Intelligence. The PowerShell script supplied with the ATS Intelligence installation configures Azure to work with ATS Intelligence.
Run Windows PowerShell.
Change directory to the location of the script. By default this will be the following:
C:\Program Files (x86)\Applied Tech Systems\Intelligence\Cloud\Azure\EventHubTrigger
This directory contains a number of files that are referenced by the PowerShell script named Execute_ARM_template_ADOS_intelligence.ps1. This script deploys the Azure resources that are required to store information from the Intelligence cloud service in Azure.
How the PowerShell script works
The script creates a resource group and executes an ARM template (ARM_template_ADOS_Intelligence.json). This template creates a SQL server, a database, an event hub and event hub triggers. The cloud service publishes data onto the event hub. The event hub executes a custom function that processes each message and delivers its content to the SQL database. This database can then be used by Power BI or other reporting tools to display the data.
After executing the ARM template, a PowerShell cmdlet named Invoke-SQL is executed against the newly created database to initialize it. The SQL script executed is named CreateAdosIntelligenceDatabaseTables.sql.
The final step of the PowerShell script is to deploy the event hub triggers, also known as function apps, using the content of the file named adosfunctions.zip.
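The deployment flow described above can be sketched with standard Az PowerShell cmdlets. This is an illustrative outline only, not the supplied script: the resource group name, location, server name and function app name below are placeholder assumptions, and the supplied script performs the database step through its own Invoke-SQL helper rather than Invoke-Sqlcmd.

```powershell
# Illustrative outline only; all names below are placeholders, not the
# values used by Execute_ARM_template_ADOS_intelligence.ps1.
Set-AzContext -SubscriptionId $subscriptionId

# 1. Create the resource group and deploy the ARM template.
New-AzResourceGroup -Name "AdosIntelligenceRG" -Location "West Europe"
New-AzResourceGroupDeployment -ResourceGroupName "AdosIntelligenceRG" `
    -TemplateFile ".\ARM_template_ADOS_Intelligence.json"

# 2. Initialize the database schema (the supplied script wraps this
#    step in its own Invoke-SQL cmdlet).
Invoke-Sqlcmd -ServerInstance "adosserver.database.windows.net" `
    -Database "AdosIntelligence" `
    -Username $databaseAdminLogin -Password $databaseAdminPassword `
    -InputFile ".\CreateAdosIntelligenceDatabaseTables.sql"

# 3. Deploy the event hub triggers (function apps) from the zip archive.
Publish-AzWebApp -ResourceGroupName "AdosIntelligenceRG" `
    -Name "adosfunctionapp" -ArchivePath ".\adosfunctions.zip" -Force
```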
Modify the Execute_ARM_template_ADOS_intelligence.ps1 PowerShell script as described below.
The following information is required:
Azure subscription ID: The script will fail if an invalid subscription is used.
Admin login and password for database
IP address range: The script has room for two IP address ranges that must be supplied. These ranges define which IP addresses are allowed to access the SQL server.
If more IP ranges are required, please refer to the documentation on the Microsoft website.
The changes required are grouped within the PowerShell script between BEGIN CUSTOMER SPECIFIC VARIABLES and END CUSTOMER SPECIFIC VARIABLES. These variables cover the Azure subscription ID, the SQL admin credentials and the SQL server firewall rules.
# # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
# BEGIN CUSTOMER SPECIFIC VARIABLES #
# # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
# The customers Azure subscription ID #
$subscriptionId = "12345678-1234-1234-1234-1234567890AB"
# The admin login and password for the database #
$databaseAdminLogin = "AdosAdmin"
$databaseAdminPassword = "AdminRulez123!"
# The ip address range that you want to allow to access your SQL server #
$startip1 = "192.168.0.0"
$endip1 = "192.168.255.254"
$startip2 = "192.172.0.0"
$endip2 = "192.172.255.254"
# # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
# END CUSTOMER SPECIFIC VARIABLES #
# # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #
Run Execute_ARM_template_ADOS_intelligence.ps1.
Cloud endpoints specify the connections between ATS Intelligence and the cloud (Azure Event Hub).
Select the General tab.
Click Cloud Endpoints.
A list of the existing endpoints is displayed.
Click the add icon.
Enter the connection string.
The format will be similar to the following:
EntityPath=<event hub name>;Endpoint=<connection string of event hub>
Event hub name example: demoeventhub
Endpoint example: sb://demoeventhubns.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=kzQmiO2yq4O8rNb8uFOxjrbpbh16b1DvmbM/8Fw2fWQ=
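Combining the two example values above into the format shown gives a complete connection string:
EntityPath=demoeventhub;Endpoint=sb://demoeventhubns.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=kzQmiO2yq4O8rNb8uFOxjrbpbh16b1DvmbM/8Fw2fWQ=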
Enable Is Active? so that the endpoint can be used.
Click OK.
The endpoint is added to the system.
To allow runtime data to be sent to the Cloud it must be enabled for each piece of equipment individually. Each piece of equipment can be configured to send Downtime, Counters and Process Variables.
Select the Equipment & Downtime tab.
Click Equipment Hierarchy.
The Equipment Hierarchy window opens.
Double click on the equipment that will send its runtime data to the Cloud.
The equipment configuration dialog opens.
Select the Cloud tab.
Enable the data types that will be sent to the Cloud.
Click OK.
From now on the selected data will be sent to the Cloud.
Only runtime data recorded after the Cloud connection is enabled will be sent to the Cloud automatically. To send data from before that time you must manually synchronize it for the required dates, as described below.
Select the General tab.
Click Data synchronization.
The Data synchronization window opens.
Select a From and To date in the Cloud data synchronization pane.
Click Synchronize.
All runtime data in the selected time period that has been enabled for synchronization is sent to the Cloud.
Master data, such as materials, KPIs and equipment, is automatically synchronized with the Cloud. However, in some rare instances, for example after a bug fix, the master data may not be synchronized.
You can manually synchronize master data as follows:
Select the General tab.
Click Data synchronization.
The Data synchronization window opens.
Click Synchronize in the Master Data pane.
Master data is synchronized with the Cloud.