An Industry Brief
What is Data Correlation?
Sightline has expertise in collecting, retaining & analyzing time-series data.
What is “Time-Series Data”?
What is Correlation?
The EDM Correlation data analysis tool is designed to help users perform root cause analysis across multiple devices or systems. It compares the activity of one system over a given period with the activity of multiple other systems and identifies the most highly related data points. This allows EDM to create an intuitive visualization of related events across an entire network of mixed technology, enabling root cause analysis to be performed more efficiently than ever before.
How it works:
When correlation is performed, EDM analyzes the data collected for a single system over a user-selected time range. It then scans the data from each additional selected data source over the same range and uses Pearson's r to identify correlated events. Pearson's r ranges from -1 to 1; the closer the score's absolute value is to 1, the more strongly related the two series are. Highly correlated data points help users identify issues that may be affecting multiple systems at the same time. These could be general network issues or a problem that originated in one system but affects others.
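The scoring step can be sketched in a few lines. Below is a minimal Pearson's r computation over two metric series sampled on the same time range; the metric names and values are hypothetical, and this illustrates the statistic itself rather than EDM's actual implementation:

```python
import math

def pearson_r(xs, ys):
    """Pearson's correlation coefficient between two equal-length series."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Covariance of the two series (unnormalized)...
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    # ...divided by the product of their (unnormalized) standard deviations
    std_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    std_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (std_x * std_y)

# Hypothetical samples of two metrics over the same time range
tcp_requests   = [10, 12, 15, 30, 55, 80]
active_threads = [5, 6, 8, 16, 28, 40]
print(pearson_r(tcp_requests, active_threads))  # close to 1: strongly related
```

A score near 1 (or -1) flags the pair as strongly related, while scores near 0 indicate little or no linear relationship between the two metrics.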
In the example below we use Correlation to isolate the root cause of high response time for an application called ECIS. This is a web-based application in which users enter transactions for inquiries and updates. The application features an IIS web-based user interface backed by Oracle databases, and it uses webMethods for data validation. Most transactions must go through to the database on the production mainframe to retrieve the data needed to satisfy the user's query.
Using the EDM Correlation tool, we identify that the number of inbound connections and TCP requests begins to increase dramatically, which causes the number of active threads to spike; the system then runs out of available threads, and everything begins to slow down.
Root cause analysis across a network of multiple devices or systems can be extremely difficult, and EDM Correlation gives users the power to perform that analysis in one easy-to-use tool. Additionally, EDM allows users to launch correlation from any visualization in the system (reports, quick charts, forecasts, etc.), making it possible to quickly compare the currently viewed data against any other metric or data source in the system. This enables root cause analysis to be performed quickly and easily.
A histogram is used to summarize discrete or continuous data by grouping data points into ranges of values called "bins." A histogram resembles a vertical bar graph, except that it shows no space between the bars. A histogram provides a visual representation of the data's distribution: it can display a large amount of data along with the frequency of each range of values, reveal the shape of the distribution, and expose any outliers or gaps in the data. Continue Reading Histogram Data
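As a sketch of the binning idea, the following groups a set of values into fixed-width bins and prints a simple text histogram; the sample response times are invented for illustration:

```python
def histogram(values, bin_width):
    """Count how many values fall into each fixed-width bin."""
    counts = {}
    for v in values:
        bin_start = (v // bin_width) * bin_width  # lower edge of this value's bin
        counts[bin_start] = counts.get(bin_start, 0) + 1
    return dict(sorted(counts.items()))

# Hypothetical response-time samples, in milliseconds
response_times_ms = [12, 14, 15, 21, 22, 23, 24, 31, 35, 90]
for bin_start, count in histogram(response_times_ms, 10).items():
    print(f"{bin_start:3}-{bin_start + 9} ms: {'#' * count}")
```

In this sample, the empty bins between 35 and 90 make the outlier at 90 ms stand out, which is exactly the kind of gap a histogram exposes.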
MQTT (Message Queueing Telemetry Transport) is a simple, lightweight message publishing and subscribing network protocol. It is the standard protocol for Internet of Things (IoT) and Industry 4.0 messaging.
It is lightweight, uses little bandwidth, and functions well in high-latency and unreliable environments, making it ideal for production environments. Devices send data (publish) to an MQTT Broker with a topic and a data payload, and other devices can subscribe to that topic and its subtopics to receive updates from the broker when the data changes. Topics can be defined with several levels of depth, and devices can subscribe to topics using wildcards, allowing for dynamic changes when required. Continue Reading MQTT Protocol
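The topic and wildcard behavior described above can be illustrated with a small filter matcher. This is a simplified sketch of MQTT's matching rules ("+" matches exactly one topic level, "#" matches all remaining levels), not code from any particular broker, and the topic names are invented:

```python
def topic_matches(filter_str, topic):
    """Return True if an MQTT topic matches a subscription filter.
    '+' matches exactly one level; '#' matches the rest of the topic."""
    f_parts = filter_str.split("/")
    t_parts = topic.split("/")
    for i, part in enumerate(f_parts):
        if part == "#":
            return True  # multi-level wildcard: everything below matches
        if i >= len(t_parts):
            return False  # filter is deeper than the topic
        if part != "+" and part != t_parts[i]:
            return False  # literal level must match exactly
    return len(f_parts) == len(t_parts)

print(topic_matches("factory/+/temperature", "factory/line1/temperature"))  # True
print(topic_matches("factory/#", "factory/line2/pressure/raw"))             # True
print(topic_matches("factory/+", "factory/line1/temperature"))              # False
```

A subscription to "factory/+/temperature" would thus receive temperature updates from every production line without listing each one explicitly.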
Zero-Trust Data Security
Zero-Trust is a security concept and cybersecurity framework allowing an organization to aggressively defend itself, its data, and user permissions using an advanced system of automated security protocols. Zero-Trust is centered on the belief that organizations should not automatically trust anything inside or outside their perimeters and instead must verify anything and everything trying to connect to their systems before granting access. Continue Reading Zero-Trust Data Security
Forecasting uses historical data as input to make informed estimates about the direction of future trends. Choosing an appropriate forecasting technique depends on the type of data being used and its behavior. For example, techniques suited to financial data may not be appropriate for time-series data like computer server metrics or data center network activity (though the underlying forecasting principles stay the same). Continue Reading Data Forecasting
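As a concrete sketch of one simple technique, the snippet below applies simple exponential smoothing to a series and uses the last smoothed value as a one-step-ahead forecast; the series and smoothing factor are illustrative, not tied to any particular Sightline data:

```python
def exponential_smoothing(series, alpha):
    """Simple exponential smoothing: each point blends the newest
    observation with the previous smoothed value, weighted by alpha."""
    smoothed = [series[0]]
    for value in series[1:]:
        smoothed.append(alpha * value + (1 - alpha) * smoothed[-1])
    return smoothed

# Hypothetical server CPU load samples over time
cpu_load = [40, 42, 41, 45, 50, 48, 52]
smoothed = exponential_smoothing(cpu_load, alpha=0.5)
next_forecast = smoothed[-1]  # one-step-ahead forecast
print(next_forecast)
```

A higher alpha weights recent observations more heavily, which suits fast-changing metrics; a lower alpha smooths out noise, which suits slowly drifting ones.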