One of the most important hires for companies across most sectors these days is that of the data scientist. As data analysis and technology strategy expert Q Ethan McCallum observes on his website, there is no point in hiring a data scientist until you have the correct data infrastructure in place. Hiring one without that foundation would be akin to signing Lewis Hamilton for a racing team, but providing him with a car liable to break down before the finish line. “To invest in such a data infrastructure is to invest in the long-term success of your firm’s data science activities,” Q McCallum notes.
The principal challenges with data stem from its volume, the plurality of its sources and types, and discrepancies in how it is gathered, processed and ultimately used.
Sven Denecken, Head of Product Management and Co-Innovation for SAP’s S/4HANA business suite, embraces the challenge of navigating the ever-changing tides when it comes to data. “As a product manager, I'm like a kid in a candy store. I want to use that technology. I want to use that data. I want to use those concepts, but my job is to bring it all together with an actual business process. Big data is more important than ever and the technology is there to compute it in vast amounts and with great speed. The more you can virtualize and put into in-memory speed computing, the better you will be able to adapt your business processes. You cannot know exactly what your customers want tomorrow, but you want to predict it as much as you can.”
That’s how companies can ensure proper enterprise resource planning, and it is precisely what SAP S/4HANA provides: a real-time enterprise resource management suite for digital business, built on the company’s advanced in-memory platform, SAP HANA, and deployable in the cloud or on-premises.
Denecken, not unexpectedly, describes it as “the best enterprise resource planning software on the cloud” bar none, and he was happy to break down what he sees as the prerequisites for any company to succeed when it comes to structuring and interpreting data. “I would argue infrastructure as a service, security, and the availability of the data are three key ingredients you need to start with,” he asserts.
Whether talking about unstructured, structured or semi-structured data, Denecken is adamant that these different types need to be combined if a company hopes to optimize its business processes. Everyone talks about big data, but he demurs: “I'm actually more a fan of the right data. Big data's the starting point. It's a commodity. The right data is what brings you a competitive advantage.
“We need to realize that data itself is the new gold. It’s a case of the more data the better, in whatever shape or form: unstructured, structured, or semi-structured; we need to collect much more. The key question is how a company deals with it. For example, text messages, audio, semi-structured data, are much less voluminous… I want to make sure that we process this in the right way. This is where process knowledge and data knowledge need to come together.”
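The consolidation Denecken describes, pulling structured, semi-structured and unstructured data into one processable collection, can be sketched in code. This is a minimal, hypothetical Python illustration: the field names, sample records and helper functions are all invented for the example, and a real pipeline would be far richer.

```python
import csv
import io
import json
import re

def from_structured(csv_text):
    """Parse structured data (CSV rows) into uniform dicts."""
    return [dict(row) for row in csv.DictReader(io.StringIO(csv_text))]

def from_semi_structured(json_lines):
    """Parse semi-structured data (JSON lines), tolerating missing keys."""
    records = []
    for line in json_lines.splitlines():
        if line.strip():
            obj = json.loads(line)
            records.append({"customer": obj.get("customer"),
                            "text": obj.get("text", "")})
    return records

def from_unstructured(message, customer):
    """Wrap unstructured free text (e.g. a text message) into the same shape."""
    return {"customer": customer, "text": re.sub(r"\s+", " ", message).strip()}

# Combine all three sources into one collection, ready for shared analysis.
crm_csv = "customer,text\nacme,renewal due in March\n"
feedback_jsonl = '{"customer": "acme", "text": "support was slow"}\n'
sms = "  please   call back "

combined = (from_structured(crm_csv)
            + from_semi_structured(feedback_jsonl)
            + [from_unstructured(sms, "acme")])
```

The point of the sketch is the common record shape: once every source, however messy, is normalized into the same structure, process knowledge and data knowledge can operate on it together.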
This is also where a lot of companies tend to fall down, according to Dan Somers, CEO of predictive analytics firm Warwick Analytics. “Less than 1% of data is analyzed. This in itself is bad, but there are also a lot of types of data which are not very informative. The trick is to analyze 100% of the right data in the right way. Mostly, people are just deploying analytics for visualization. Unstructured and text data are very poorly analyzed and form the majority of data today. Much of the time there’s a ‘so what’ at the end of analysis because people are asking the wrong question.
“One example is analyzing voice of customer data for topics and sentiment whereas the better analysis is to validate (remove trolls and statistically validate across all customers removing skews) and then isolate the topics and sentiment which drive customer churn and/or loyalty, as these are the things that predictively make the difference.
“Start with the right question and analyze the right data,” Somers advises. “Then, once you start from there, find the tools that can help, don’t always just do what the data science team is capable of. It must fit the business and be flexible enough to be updated and ‘live’ as things inevitably evolve, rather than bogging the data science team down in curation.”
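Somers' two-step recipe, validate first and then isolate the topics that actually drive churn or loyalty, can be sketched as follows. This is a simplified illustration rather than Warwick Analytics' actual method; the record shape, the duplicate-and-volume heuristic standing in for troll removal, and the per-topic lift calculation are all assumptions made for the example.

```python
from collections import Counter, defaultdict

def validate(comments, max_posts_per_author=5):
    """Crude validation pass: drop duplicate comments and cap hyperactive
    authors (a stand-in for troll removal and de-skewing)."""
    seen = set()
    per_author = Counter()
    kept = []
    for c in comments:
        key = (c["author"], c["text"])
        per_author[c["author"]] += 1
        if key in seen or per_author[c["author"]] > max_posts_per_author:
            continue
        seen.add(key)
        kept.append(c)
    return kept

def churn_lift_by_topic(comments):
    """For each topic, compare the churn rate among its commenters with the
    overall churn rate; a large positive lift flags a churn-driving topic."""
    overall = sum(c["churned"] for c in comments) / len(comments)
    by_topic = defaultdict(list)
    for c in comments:
        by_topic[c["topic"]].append(c["churned"])
    return {t: sum(v) / len(v) - overall for t, v in by_topic.items()}

# Toy voice-of-customer data (invented for illustration).
comments = [
    {"author": "a1", "text": "billing is confusing", "topic": "billing", "churned": 1},
    {"author": "a1", "text": "billing is confusing", "topic": "billing", "churned": 1},  # duplicate: dropped
    {"author": "a2", "text": "love the new UI", "topic": "ui", "churned": 0},
    {"author": "a3", "text": "billing again", "topic": "billing", "churned": 1},
]
kept = validate(comments)
lift = churn_lift_by_topic(kept)
```

Here the "right question" is not "what are people saying?" but "which topics are associated with customers leaving?", which is what the lift score approximates.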
Being able to do all of these things requires a robust network, or at least one attuned to a business’s own requirements and needs. That is very much the ‘domain’ of David Goff, Head of Enterprise Network for the UK and Ireland at networking leader Cisco.
“What my team is there to do, and what Enterprise Network is there to do, is to find ways that we can drastically simplify the network or actually make the network intuitive,” he explains. “To make it intuitive – that means to be able to see, to think, and to act by itself, without manual intervention – requires data.
“Then it’s about how we use visibility of data to be able to inform the network and to ensure that the network is something that adapts and has the rigidity that business needs to be able to capture transitions on IAP, cloud and mobility.” But from a network point of view, it’s less about looking within the data itself and more about how its transportation can be facilitated.
All of the factors outlined so far need to be considered when it comes to how to structure data, but what also needs to be remembered is that technology is constantly evolving and that the playing field is always subject to disruption and change.
Goff says he expects “ongoing innovation, creating ecosystems” and Denecken agrees that there is further room for disruption in the data space – in fact, he expects it, saying that anyone who manages to marry “the combination of big data, AI and business processes” will be on to a winner in that regard. He acknowledges that more and more processes are going to become streamlined or automated thanks to artificial intelligence and machine learning.
“There will always be niches where experts and very bright people will find an even better way or will fill in holes,” he adds. “Already today, what we can do with robotic process automation disrupts many business processes. So, would I, today, invest in a shared service center company to outsource labor tasks? Personally, I wouldn’t. I think those tasks will be automated first.
“On the other side, a lot of opportunities will be created. There's a lot of discussion about things like access to big data, access with algorithms to make it more intelligent etc., but the closer you get to the business process, the more you will own that piece of the data. The further you go away, the more you will rely on third-party resources. Maybe also to pre-empt it, to pre-condition it, to pre-extract certain data. If I want to know what my customer base is doing, I'm not going to hire a consultant to dig at that for a year – but I will rely on certain market data to sense it, and then based on that sensing, drive my business processes or my automation.”