Understanding data is a competitive advantage – inefficiency awaits those left behind by the analytics train

Do you know what data fabric or explainable AI are – and why you should know about them? After reading this article, you will. You will also learn why you should start refining your data no later than today, and how your organisation can get started with analytics. We put a few topical data-related questions to experts in the field.
How do you see the current analytics expertise of Finnish companies and organisations? Has there been significant development in data refinement and analytics expertise in the last few years?
Mikael Collan: The companies that have invested in their data and analytics expertise have also increased their industry expertise tremendously in just a few years. This trend has been supported by the network of data analytics service companies that has emerged in Finland, as well as the growing cohort of young, trained experts entering the industry.
In my view, there is a gap emerging between companies proficient in analytics and companies that have yet to master it. Whereas proficient companies are able to streamline their data processes by utilising analytics to create efficiency, others may feel overwhelmed by the amount of data, which may even slow them down further.
Data and analytics expertise is gradually becoming a competitive advantage. The utilisation of data and analytics also clarifies the vision required to manage a business. It cannot be stressed enough how important it is for a company to know what is truly going on and be able to reliably assess what will happen in the near future.
The timeliness and aptness of measures are factors that enormously improve a company’s competitiveness and stress tolerance. Understanding and expertise have increased nationally. The contexts in which analytics is utilised have increased in number and scope.
Antti Kuivalainen: The expertise in analytics and the technology related to it has clearly improved from a few years ago. Of course, there have always been pioneers in this sector. Today, the value of a company’s data resources is recognised better both in industry and other sectors. Organisations consider what their data could produce.
In practice, this is evident in companies hiring an increasing number of analytics experts. Expert companies like us have traditionally provided companies with analytics resources. Now, the situation is becoming more balanced. Companies have realised that they must retain ownership of their data. Of course, they should use external top expertise, but they must not rely entirely on suppliers.
With companies snatching up data scientists and architects, the recruitment market continues to heat up. The number of chief data officers is also increasing. The Chief Data Officer (CDO) has truly become an established role alongside the Chief Information Officer (CIO).
Samu Kuosmanen: The value of data is now understood better in general. One example of this is the acquisition of the food delivery company Wolt for several billion euros. Wolt has no physical facilities – only a platform and data on top of it. Perhaps an understanding of data has been the company’s competitive advantage.
Earlier in my career, in the consulting industry, I saw many different organisations and how far they had come in utilising data. Looking at my current sector, I can say that the value of data is acknowledged in public administration. For example, there is an enormous emphasis on knowledge management in the Finnish health and social services reform. Before knowledge management can be undertaken, the data and indicators must be comparable.
The understanding of the necessity of knowledge management has permeated all organisations. There are probably not many organisations that consider data to be trivial. But whether an organisation has the capability for knowledge management is a different matter entirely. In this sense, there is still much work to be done, and the work in this area will, in fact, never be complete. A capable consultant partner can help: consultants have seen how the matter has been handled in other companies and can help each organisation find what is essential for it.
What is emphasised in analytics and its application in 2022?
Minna Vakkilainen: Data, analytics, and AI solutions are no longer separate functions or capabilities. They are starting to become firmly integrated into processes and digital services. The main focus is no longer on history, i.e., reporting. Instead, we look to the future through things such as forecasts, anticipation, and recommendations.
Successful utilisation of data and analytics requires a strong understanding of business and customers, as well as leadership. Proof of concept and testing are no longer enough. We must seek actual benefits from the start and successfully see solutions through to production. Ease of use and successful service design are absolute requirements for services that use AI solutions and analytics. Digital services are never truly complete. Their development requires continuous knowledge management and responding to user feedback.
Mikael Collan: Analytics is used to seek actual results. Analytics for the sake of analytics is starting to become outdated – what does not yield results is abandoned. There is a clear upward trend in industries such as software robotics and the process industry, for which analytics provides tangible added value.
New products and product versions containing analytics are already available on the market, and they are superior to their traditional counterparts.
The public sector is also becoming interested in analytics, particularly in data-based support for decision-making, i.e. utilising analysed data as a basis for decisions.
This means that essential, up-to-date information must be made available quickly, possibly in an automated manner, to the parties that prepare decisions. At present, public-sector data is scattered across different organisations, making it difficult to see the big picture. Even though both raw data and analysed data exist, they are not available in a centralised manner.
Antti Kuivalainen: I see the data fabric as an emerging trend. The term is new enough to still lack an established Finnish translation.
A data fabric is a concept related to data management within organisations. It describes the organisation’s data management capabilities, existing data sources and data pipeline – i.e., how data is moved from one place to another and utilised where it is needed. Ideally, this fabric is solid. It allows data to be found and transferred quickly, without weeks of preparation. However, if there are holes in the fabric, this is not possible until the holes have been filled.
The guiding principle in the data fabric concept is interconnecting the organisation’s data capabilities and making data more readily available to decision-makers. No data should be kept in a silo. Different data sources must be interconnected instead of data being gathered into a single centralised chunk.
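To make the principle concrete, below is a minimal, hypothetical sketch in code: a catalogue maps logical dataset names to the systems where the data actually lives, so consumers can find and fetch data without copying everything into one pile. The class names, dataset names and source systems are illustrative assumptions, not a description of any specific data fabric product.

```python
# A minimal, hypothetical sketch of the data fabric idea described above:
# a catalogue maps logical dataset names to wherever the data actually lives,
# so a consumer can find and fetch data without knowing the source system.
# All names here are illustrative, not a real product API.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class DatasetEntry:
    name: str                         # logical name used by consumers
    source_system: str                # e.g. "erp", "crm", "iot-platform"
    loader: Callable[[], List[dict]]  # how to actually fetch the rows


class DataCatalogue:
    """Keeps data in its source systems but makes it findable and fetchable."""

    def __init__(self) -> None:
        self._entries: Dict[str, DatasetEntry] = {}

    def register(self, entry: DatasetEntry) -> None:
        self._entries[entry.name] = entry

    def fetch(self, name: str) -> List[dict]:
        return self._entries[name].loader()


# Two "sources" that stay where they are instead of being copied into one pile.
catalogue = DataCatalogue()
catalogue.register(DatasetEntry("sales_orders", "erp", lambda: [{"order_id": 1, "amount": 120.0}]))
catalogue.register(DatasetEntry("customer_contacts", "crm", lambda: [{"customer_id": 7, "segment": "B2B"}]))

print(catalogue.fetch("sales_orders"))
```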
What new trends are emerging in the field of analytics?
Minna Vakkilainen: We are increasingly noticing that advanced analytics and AI solutions can be used to produce actual value from data. For example, digital solutions in which analytics and AI solutions play a key role can significantly improve the customer experience and support the company’s profitable growth.
There has previously been – and still is to some degree – talk about commercialising data. However, organisations have now (fortunately!) set their sights and focused their actions more on data responsibility and actual, fair value creation.
Mikael Collan: The healthcare and social services sector is undergoing a transformation, yet analytics is barely addressed in the discussion on healthcare and social services. Hardly any attention has been paid to the possibility of using analytics and digitalisation to enhance processes, particularly referral to treatment.
The discussion on IT purchases in the healthcare and social services sector has also largely lacked discussion on the principles of analytics, i.e., the standardisation of the wellbeing services counties’ data architectures and interfaces.
If a standardised system is not built, we will find ourselves back in the same situation we are in now: we will have to build various intermediary systems so that the existing systems can communicate with each other. This is very expensive and inefficient. Such a development would also be very unfortunate from the perspective of analytics, automated data collection and data-based guidance.
Samu Kuosmanen: Until now, machine learning-based prediction models have often been black boxes. Analytics that supports decision-making recommends actions based on the available data, but we do not know how the machine arrives at a particular recommendation, even though it would be very important to understand what happens inside the models. Explainable AI meets this need.
Now that knowledge management is on the rise, it is becoming more important to understand what a predictive model is based on. Even if a highly precise predictive model supports our decision-making, we are still on uncertain ground if we do not know why the machine works as it does. What is its logic, and which drivers are behind the predictions?
In my opinion, an explainable model that ‘only’ has a 90 per cent predictive accuracy but can still understandably explain its operation could, in practice, be a better partner for humans than a ‘black box’ model with a 99 per cent accuracy.
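As a concrete illustration of opening up a black box, here is a minimal sketch using permutation importance from scikit-learn, which reports how much a model's accuracy drops when each feature is shuffled. The dataset and model used below are illustrative choices for the sketch, not the specific tools referred to above.

```python
# A minimal sketch of explaining which drivers lie behind a model's predictions,
# using permutation importance from scikit-learn. The dataset and model choice
# are illustrative assumptions.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Shuffle each feature in turn and measure how much the test accuracy drops:
# the larger the drop, the more the model relies on that feature.
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)

top_drivers = sorted(zip(X.columns, result.importances_mean), key=lambda p: -p[1])[:5]
for name, importance in top_drivers:
    print(f"{name}: {importance:.3f}")
```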
Antti Kuivalainen: Up until a few years ago, there was talk about the value of storing data in a manner that creates continuity – and about organisations having mass data to analyse.
Now, the discourse has moved on to the importance of managing mass data. How can the right data be curated for decision-making from cloud and on-premises environments?
The majority of data, 90 per cent, is unstructured. It is now more critical than ever to be able to recognise which data is valid. However, validation with traditional methods is very slow, which is why data validation must be automated to a greater degree. Statistical methods have long been used successfully to classify the reliability of data sources and the anomalies in them. However, these methods should be incorporated into a continuous, automated process that allows data quality to be measured and controlled.
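As a small example of what one such automated check can look like, the sketch below applies a simple statistical rule (a z-score threshold) to flag anomalous values in a batch of data. The column name, threshold and sample values are illustrative assumptions.

```python
# A minimal sketch of one automated data-quality check of the kind described
# above: a statistical rule (z-score threshold) flags anomalous values so that
# validation does not depend on slow manual inspection. The column name,
# threshold and sample data are illustrative assumptions.
import numpy as np
import pandas as pd


def flag_outliers(series: pd.Series, z_threshold: float = 3.0) -> pd.Series:
    """Return a boolean mask marking values that deviate strongly from the mean."""
    z_scores = (series - series.mean()) / series.std()
    return z_scores.abs() > z_threshold


# Example batch: mostly ordinary values around 100, plus one suspicious reading.
rng = np.random.default_rng(seed=0)
batch = pd.DataFrame({"amount": np.append(rng.normal(100, 2, size=50), 950.0)})

mask = flag_outliers(batch["amount"])
print(batch[mask])                                        # rows to quarantine for review
print(f"share of rows passing the check: {1 - mask.mean():.1%}")
```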
When various cloud environments are utilised, the significance of interconnecting them in the right way is highlighted.
In a cloud environment, the customer pays for the transfer and processing of data rather than for computational power as such. In other words, data processing must be optimised and carried out where it is most cost-effective.
A data strategy can be implemented in many ways. Sometimes it is best to carry out data processing early in the food chain – at the edge of the network – and only upload the result to the cloud. At other times, it is better to concentrate operations in one particular cloud environment, at least in some respects.
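A minimal sketch of that "process at the edge, upload only the result" pattern is shown below: raw readings are reduced to a compact summary locally, and only the summary is sent onwards. The payload structure and the upload function are illustrative assumptions rather than any particular platform's API.

```python
# A minimal sketch of processing data at the edge and uploading only the result:
# raw readings are aggregated locally so that just a small summary is transferred
# to the cloud. The payload format and upload function are illustrative assumptions.
import json
import statistics
from datetime import datetime, timezone


def summarise_readings(readings: list) -> dict:
    """Reduce a batch of raw readings to a compact summary."""
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "count": len(readings),
        "mean": statistics.fmean(readings),
        "max": max(readings),
        "min": min(readings),
    }


def upload_to_cloud(payload: dict) -> None:
    # Placeholder for an actual upload (e.g. HTTPS or a message queue).
    print("uploading", json.dumps(payload))


raw_readings = [21.4, 21.6, 21.5, 21.9, 22.0, 21.7]   # e.g. one minute of sensor data
upload_to_cloud(summarise_readings(raw_readings))      # a few bytes instead of the full batch
```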
How should organisations go about utilising analytics?
Mikael Collan: The first thing organisations must do is understand what analytics is. If they do not yet understand this, they can ‘ask a friend’, for example.
If they have no clue how to utilise analytics in some concrete function of their organisation, I think that they should not invest in it.
Once the organisation has figured out a clear use for analytics, it should start with a pilot project. Before starting, the organisation must select a group of people within the organisation who are committed to the project and have a positive attitude towards the development of analytics. This group will also act as heralds of the reform in the future.
The most essential things are concreteness, a drive for development and the desire to improve operations – which is no different from traditional development work. Everything starts from understanding. Low-hanging fruit should be picked first.
Samu Kuosmanen: A general understanding is an important factor in seeing the big picture. It is the basis for defining the key management indicators that represent the organisation’s success or the achievement of its goals. It is better to simplify the examination to five indicators rather than ten.
After this, we can address the question of what has caused the measurable quantities to develop as they have.
Antti Kuivalainen: Everything should start from creating a data strategy that takes the business needs into account. A data strategy makes it easier to assign a value and priority to the necessary development measures – what should be done and focused on, considering that the resources are always inevitably limited.
I have noticed that a practical next step after deciding on the goal is to use conceptual modelling. A conceptual model describes the business concepts examined in data processing and can be used to identify the company’s data sets. Is there room for development in the data resources?
When modelling a part of the business – be it sales, marketing or the optimisation of enterprise resource planning – there should be a mutual understanding between business and data management regarding what should be included in the data resources. Based on this understanding, the organisation can develop the building blocks of its data fabric or data management. What capabilities are needed to reach this goal?
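As an illustration, a conceptual model for a small slice of sales could be written down roughly as in the sketch below and then compared against what the source systems actually hold. The concepts, attributes and relationships are hypothetical examples, not taken from the article.

```python
# A minimal, hypothetical sketch of a conceptual model written down as code:
# business concepts and their relationships, which can then be checked against
# what the data resources actually contain. The concepts and attributes are
# illustrative examples.
from dataclasses import dataclass, field
from datetime import date
from typing import List


@dataclass
class Customer:
    customer_id: int
    name: str
    segment: str            # e.g. "B2B" or "B2C"


@dataclass
class Order:
    order_id: int
    customer_id: int        # relationship: every order belongs to a customer
    order_date: date
    total_amount: float


@dataclass
class SalesDomain:
    """The part of the business being modelled, e.g. for a sales dashboard."""
    customers: List[Customer] = field(default_factory=list)
    orders: List[Order] = field(default_factory=list)
```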
The organisation should have its own data roadmap or data management concept in order to have the ability and resources to take ownership of its data. This allows it to support its suppliers on substantive business questions.
The organisation requires a bridge builder who works at the interface between business and data or information management and is able to validate the priorities and needs and understand the target architecture. They do not have to be a guru.
The owner of the data strategy, such as the CIO, needs the support of data process owners, who can guide the suppliers and thereby streamline operations.
What in-house expertise related to analytics should companies have – and when should they utilise competent cooperation partners?
Mikael Collan: Analytics must have an owner who is responsible for it within the organisation. This ensures that the operations boost continuous improvement instead of remaining fragmented. The owner is then able to outsource services and sell analytics within the organisation.
Organisations should outsource ready-made solutions while keeping the architecture under their control. The owner of the organisation’s data and analytics architecture is the in-house ‘Spider-Man’ who keeps hold of all the reins. They play a key role in building competencies.
Samu Kuosmanen: Organisations need enough in-house expertise in analytics architecture to form and provide the management team with a concise overview of the most important systems, processes and data flows.
For understanding what goes on under the hood, however, there are outstanding specialised partners who would be happy to get their hands dirty in a grease pit.
When taking a car in for servicing, you must have a good enough understanding to be able to say what kind of servicing is needed – whether the brakes keep making noise or the car keeps shutting off – while also knowing what questions to ask, such as what the service plan covers.
By the same token, organisations must have general in-house expertise in analytics architecture and knowledge of special needs. They must know and be able to say in an understandable way what information they need from suppliers. Not many people can or want to paint their car by themselves, but they can still tell the paint shop exactly what shade of colour they want.
Of course, some cases cannot be closely defined. There is only a pile of data, and the organisation wants to know what it can get out of it. Expert assistance is also available for this type of need. Analysts who study data do not necessarily have to understand what the data concerns. They simply look for regularities in numbers and describe the links they find in data.
Few companies are large enough to have all the data expertise they require under their own roof. It is not optimal for all the data science expertise to lie in the hands of a single person, and even with two people the number is still very low. An analytics team needs a reasonable size to function well.
Instead of having its own team, it might be better for the organisation to have general architectural expertise in house and receive help from a capable network of partners. The organisation can develop things together with these partners. When the organisation does need help, it is able to identify the problem on its own, and the only task left to the partner is solving the problem. The organisation must know the big picture and keep hold of the reins.
Antti Kuivalainen: Operators like us must have the capability and insight for dialogue. We must be able to express our views on how our customers can achieve their data strategy goals.
Today, analytics suppliers must have the technical competence to operate on modern cloud platforms. Our unique expertise is identifying the most cost-effective way to operate on various cloud platforms, including multi-cloud integration. The traditional method of gathering all data into one pile, into a single data warehouse, is not the solution to everything. Of course, it is still the most functional solution in some cases.
In my opinion, there is no point in the customer acquiring in-depth knowledge of the differences between different platforms and methods of implementing multi-cloud operations. This is the type of expertise that they should acquire from a knowledgeable partner.
What new things did you last learn or notice in the field of analytics?
Mikael Collan: It does not really come as a surprise that we learned things the hard way. Analytics is neither easy nor cheap. Its advantages are not easy to ‘sell’ – and even if you succeed in doing so, there may still be setbacks.
In the public sector, it is still early days in the full-scale utilisation of analytics. The digitalisation journey is only just starting. On the other hand, some public organisations have already made it far. For example, the Finnish Tax Administration uses the best-digitised tax system in the world.
Cross-pollination and learning from success play a key role in the digitalisation efforts in the public sector. A culture of continuous development and improvement is still missing, and matters are taken forward through projects. There is a need to establish the role of the owner of digitalisation in organisations and task the person in this role with long-term development of digitalisation and analytics. The situation is exactly the same in the business world, although the rate of progress may be different.
Samu Kuosmanen: I have noticed that various dashboards or indicators can be applied on a wide range of levels. Here in the Joint Municipal Authority for North Karelia Social and Health Services, our management team uses strategy indicators. We also seek to make all our operations visible at the team level by using various indicators in our daily work.
Many people find analytics strange at first – they consider it to be something that nerds do, something that drives the management team’s traffic-light indicators somewhere far away.
But once analytics is brought to the team level and utilised as indicators in daily work, people become immensely enthusiastic – it actually yields benefits!
At first, analytics projects are often considered to be recording work that unnecessarily takes up time. That is until people realise that it is a tool that helps them focus their work and is genuinely helpful in carrying out their jobs. In the same way that smartwatches automatically offer useful data on a jog, analytics offers processed data on daily work and helps develop it.
Antti Kuivalainen: It has been a joy to see concept modelling successfully break through into public-sector data systems. It has recently been well represented in the organisation of healthcare and social services, for example.
I have monitored the development of data visualisation with interest. There has long been talk about data storytelling. Companies have now started investing in it to the degree that we have seen some impressive visualisations. I believe that this trend has, in part, contributed to a better understanding of the significance of data. It is easier to understand what makes data valuable when you see a visualisation that helps you come to a conclusion significantly more quickly than you previously did.
I believe that data visualisation will develop further once the generations of digital natives enter the workforce. For example, my daughter colours each day in her bullet journal based on what the day was like – nice, challenging or sad. The young generation has a creative way of thinking that does not necessarily occur among older people.
In this development, it is important to make easy-to-use tools available to users. Ideally, creating a visualisation should not require technical data collection skills.
While waiting for this ideal scenario to come true, you can receive help with visualisation from a service designer – and if you do not have one in your organisation, you can receive expert consulting assistance from us at Haallas, for example. In design activities, it is also important to have an understanding of business. Service designers work together with data analysts, complementing their expertise.
Text: Kari Ahokas
Experts:

Minna Vakkilainen
Director of Customer Information, K Group
Minna works as Director of Customer Information at K Group, one of the largest retail chains in Finland. She is interested in how data and technology can be harnessed into world-changing solutions that support sustainable development and bring real added value to the daily lives of companies, employees and customers.
Minna has worked for K Group since 2014, responsible for analytics and the development of knowledge management and artificial intelligence solutions. She strongly supports change in which the use of information is part of everything, both strategic and operational decision-making.

Mikael Collan
Director General, VATT Institute for Economic Research
Mikael works as Director General of the VATT Institute for Economic Research and Professor of Strategic Finance at LUT University. His research focuses on the application of fuzzy logic and analytics in company decision-making, especially in profitability calculation. Mikael is a member of the Finnish Society for Science and the former chairman of the Finnish Society for Operations Research.

Samu Kuosmanen
Director of Digital Services, Siun sote
Samu has over 20 years of experience in managing digitisation and business intelligence projects for Finnish and international organisations. He specialises in enterprise performance management (EPM), analytics, and robotic process automation (RPA).

Antti Kuivalainen
Managing Director, Haallas
Antti has 15 years of experience in the development of business data systems and knowledge management. He has managed extensive digital product and service deliveries to both Finnish and international organisations in several industries. In his current position as Managing Director at Haallas, Antti appreciates good partnerships, openness, and collaboration.