
GET YOUR ANSWER BEFORE YOU KNOW YOUR QUESTION!

By Fredrik Juhlin, Solutions Architect at Help AG  |  Posted Saturday, 10th February 2018

Today there is a lot of buzz around data analysis, business intelligence, streaming data, the Internet of Things (IoT) and so on. You have probably been exposed to them all, but how should you really tackle these areas? Where do you start?

The digital revolution is opening up great opportunities as organizations gain access to more and more data from both internal and external sources. Meanwhile, organizations need to keep their focus on their core business. Among the customers I have worked with over the years, striking that balance is tricky to say the least!

In a traditional data warehouse (DW) solution, you first need to come up with the questions, then gather the data and build a data model that can answer those questions.

The problem is that the answers often lead to new questions that the existing data model cannot answer. The process becomes endless and very time consuming. At many of the customers I have worked with, it is far too inflexible. We need to be faster than that: we are all competing in a global market on completely different terms than just a few years ago.

Analysing both our own processes and our surroundings is therefore essential to standing out from the competition. We need to handle incidents fast, and we need to test different hypotheses so we can keep the good ones and discard the bad ones.

We need a solution that gathers data and indexes it in a way that lets us ask our questions as they arise, without having to know them in advance.
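This "ask questions later" approach is often called schema on read: raw events are stored as-is, and fields are extracted only at query time. The sketch below illustrates the idea in plain Python; the log format, field names and helper function are all invented for illustration, not any particular product's API.

```python
import re

# Hypothetical raw machine-data events, stored exactly as collected,
# with no schema defined up front.
raw_events = [
    '2018-02-10T09:14:03 user=alice action=login status=200 latency_ms=120',
    '2018-02-10T09:14:07 user=bob action=checkout status=500 latency_ms=950',
    '2018-02-10T09:14:09 user=alice action=checkout status=200 latency_ms=310',
]

def search(events, **filters):
    """Extract key=value fields at query time and filter on them."""
    for line in events:
        fields = dict(re.findall(r'(\w+)=(\S+)', line))
        if all(fields.get(k) == v for k, v in filters.items()):
            yield fields

# The question ("which checkouts failed?") did not have to exist when
# the data was collected -- the schema is applied at search time.
failed = list(search(raw_events, action='checkout', status='500'))
print(failed)
```

Because parsing happens at search time, a new question simply means a new filter or a new extraction pattern, not a rebuilt data model.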

One of the fastest growing data sources is machine data generated by connected systems, such as those in data centres and the IoT. This data contains far more than just error logs. In it we can find:

  • User behaviour
  • Application behaviour
  • Service levels
  • User experience
  • Cyber risks
  • Trends
  • Correlations

We want to gather this data and be able to look at it from different perspectives, for example those of different roles:

  • Application developer
  • IT administrator
  • IT security officer
  • Business analyst

If we have an easy way of collecting, normalizing and analysing this data, we can look at it from each of these perspectives and find correlations across the systems that produce it.

One thing that tends to be time consuming in organizations is root cause analysis (RCA) when things go wrong. It often involves gathering people from different areas and with different competences to build a picture of what happened. To perform an RCA quickly, we need tools that increase visibility into the different components in the environment, and we need the ability to correlate events between those components.
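One basic building block for this kind of cross-component correlation is merging timestamped events from several systems into a single timeline, so the sequence leading up to a failure is visible in one place. The component names, timestamps and messages below are invented for illustration.

```python
import heapq

# Hypothetical event streams from two components, each already
# sorted by timestamp (same clock, same format).
web = [
    ('09:14:05', 'web', 'checkout request received'),
    ('09:14:09', 'web', 'checkout returned HTTP 500'),
]
db = [
    ('09:14:06', 'db', 'slow query warning (950 ms)'),
    ('09:14:08', 'db', 'connection pool exhausted'),
]

# heapq.merge interleaves pre-sorted iterables; tuples compare by
# their first element, so the result is ordered by timestamp.
timeline = list(heapq.merge(web, db))
for ts, component, message in timeline:
    print(ts, component, message)
```

The merged view shows the database problems occurring between the web request and the failure, which points the RCA at the database tier without a room full of specialists reconstructing the order of events by hand.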

Since data comes from different sources and in different formats, normalization is a very important task that is often overlooked in discussions of data analysis. Normalization can include adjusting values measured on different scales to a common scale, or mapping variables such as customer IDs between different sources.
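Both normalization steps mentioned above can be sketched in a few lines. All names, units and values here are made up; this is an illustration of the idea, not any particular tool's behaviour.

```python
def min_max_scale(values):
    """Rescale a list of numbers onto a common 0..1 scale."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

# Source A reports response time in milliseconds, source B in seconds;
# scaling both to 0..1 makes them comparable on one chart.
ms = [120, 310, 950]
secs = [0.1, 0.4, 1.2]
print(min_max_scale(ms))    # first value 0.0, last value 1.0
print(min_max_scale(secs))  # likewise spans 0.0 to 1.0

# Mapping customer IDs: a hypothetical CRM and web shop use different
# keys for the same customer, so a lookup table joins them.
crm_to_shop = {'CRM-001': 'web-9137', 'CRM-002': 'web-2254'}
crm_record = {'id': 'CRM-001', 'name': 'Example Customer'}
shop_id = crm_to_shop[crm_record['id']]
print(shop_id)
```

Without this step, the two latency series could not share an axis, and events for the same customer in the two systems would never be linked.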

To have the time to gather, normalize and analyse data, and at the same time focus on the core business, smart tools are not just helpful but essential.

One excellent tool for this is Splunk. I have found it very useful because it is specifically designed for gathering and analysing machine data. It runs analysis in real time, and customers get a very flexible data model: schema on the fly. With this flexible model, the user doesn't have to define a schema in advance; it can change depending on the questions to be answered.

Splunk has a wide range of knowledge built into the tool for normalizing data from many different sources. There are also useful pre-built reports and dashboards from a wide range of system, cloud and application vendors.

For example, in the same dashboard you can see completed transactions from the web shop alongside latency from the database. That is perhaps not something you would normally correlate, but there is a clear relationship between the two.
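That relationship can be made concrete with a simple correlation coefficient. The sketch below computes Pearson's r between two invented metric series, one per system; the numbers are fabricated purely to illustrate the kind of relationship the dashboard would reveal.

```python
from math import sqrt

# Hypothetical per-minute samples from two different systems.
transactions = [210, 195, 160, 120, 80]  # completed web-shop transactions
latency_ms = [40, 55, 90, 140, 220]      # database latency in milliseconds

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(transactions, latency_ms)
# A strongly negative r means the metrics move in opposite directions:
# as database latency climbs, completed transactions fall.
print(round(r, 2))
```

Seeing the two series side by side is what makes the pattern visible; the coefficient just confirms that the drop in completed transactions tracks the rise in database latency.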

Data analysis projects often have a tendency to become too complicated, leaving many customers unsure where to start.

With the right tools it is possible to collect data and run the first reports out of the box, without much work. From there it becomes easier to see more and more use cases, and the questions begin to arise. With a flexible data model, you run your questions as they come, or you can even …

Get your answer before you know the question!
