Analytics Trends for 2018
Data is set to become a major “currency” for businesses, with value to buy, sell, or lose. Beyond the disruptions already caused by next-generation technology, new processes, business models, and even whole organizations are looking at how to monetize data as an asset.
One of the reasons that analytics remains a hot topic is that the amount and variety of data available has skyrocketed, constantly creating new analytic challenges. But even more importantly, analytics has become an essential part of digital transformation.
For the last few decades, we’ve typically thought of business intelligence as a byproduct of our operational processes. In other words, we manufacture products, and ship them around the world, and sell them to customers. Each of these processes generates a lot of data, and we use that data both to keep track of operations and to create more optimized processes in the future.
This remains as true and important today as it has ever been. But now there’s another dimension coming into play: organizations are increasingly realizing that digital transformation doesn’t just require new processes; it requires a new approach to implementing processes. They have to be more agile, more intelligent, and more responsive to change.
These new processes flip the traditional equation on its head. New processes are created on the fly, powered by analytics. The typical customer journey is a great example. Think about how you purchase a product today. In the old days, it was a fairly linear process that companies characterized as a “sales funnel.” But now it’s more like a “choose your own adventure” book, where there are many different possible interaction paths, and at each point in the process, you as a customer get to choose the next chapter. Analytics is used to help the customer make the right choice at each point: “What other products might you be interested in?” or “Would a discount get you to purchase right now?” In other words, every customer process is unique; we can let analytics do the work, creating thousands or millions of personalized “processes” based on the needs of each individual.
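To make the idea concrete, here is a minimal sketch of the kind of “next best action” step that sits behind such a personalized journey. The customer attributes, candidate offers, and weights are all invented for illustration; a real system would learn these scores from behavioral data rather than hard-coding them.

```python
# Minimal sketch of a "next best action" step in a personalized customer journey.
# All attributes, actions, and weights here are illustrative, not from any real system.

def score_action(customer, action):
    """Return a crude propensity score for offering `action` to `customer`."""
    score = 0.0
    if action["type"] == "cross_sell" and customer["viewed_related_items"]:
        score += 0.4
    if action["type"] == "discount" and customer["cart_abandoned"]:
        score += 0.5
    if action["type"] == "discount":
        score -= action["discount_cost"]          # discounts carry a margin cost
    score += 0.1 * customer["loyalty_years"]      # long-time customers convert more often
    return score

def next_best_action(customer, candidate_actions):
    """Pick the candidate action with the highest score for this customer."""
    return max(candidate_actions, key=lambda a: score_action(customer, a))

customer = {"viewed_related_items": True, "cart_abandoned": False, "loyalty_years": 3}
actions = [
    {"type": "cross_sell", "label": "Suggest matching accessory", "discount_cost": 0.0},
    {"type": "discount",   "label": "Offer 10% off now",          "discount_cost": 0.1},
    {"type": "none",       "label": "Do nothing",                 "discount_cost": 0.0},
]
print(next_best_action(customer, actions)["label"])
```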
Because these new processes are analytics-powered, they can be much more agile and responsive to change – indeed, new machine learning approaches mean that they can update themselves.
Effectively creating and managing these kinds of flexible, on-the-fly processes is the big new opportunity in digital business. But it also means that analytics must have a more process-oriented approach, rather than being treated as a series of one-off decisions. SAP has always been a leader in traditional processes, and we’re now hard at work extending our leadership into these new digital transformation processes.
Gartner believes that information and analytics will be used to reinvent, digitalize, or eliminate 80% of today’s business processes. Analytics is no longer just an afterthought to the “real business”; it’s at the heart of the new business models of the future.
Analytics Adoption Rises Among Small Companies
In a NewVantage survey earlier this year, 48.4% of executives reported that their firms are achieving measurable results from their big data investments, the first time the survey has found a near-majority since it began in 2012. Furthermore, a massive 80.7% of executives characterized their big data investments as ‘successful,’ compared to just 1.6% who said they believed those investments had failed.
In 2018, more and more organizations will reach data maturity. We will see analytics practitioners face many of the same challenges they’ve been dealing with in recent years, as well as a few new ones to boot. Disruptive forces will continue to conspire to produce fresh problems on a daily basis that will make the use of data analytics both more difficult and more vital. Companies will collect more data than ever, and there will be new technologies and strategies to help exploit it to its full potential.
Large companies long ago realized the importance of data and analytics. In an IDG study last year, 78% of larger employers agreed data collection and analysis have the potential to completely change the way they do business. However, small companies are still behind in their efforts. According to an SAP-sponsored global survey of small businesses, many are still in the early stages of digital transformation. Further research from One Poll found that 56% of SMEs rarely or infrequently check their business’s data, while 3% have never looked at it at all.
This should change in 2018. In the SAP survey, 73% of small businesses and 87% of midsize businesses indicated that their expectations regarding technology investments had been met or exceeded. With the cost of data analysis and visualization technologies falling and investments bearing fruit, this year should see the adoption of data analytics extend to even the smallest of companies.
Outsourcing Of Analytics Increases
One option for SMEs unable to fund full-scale analytics programs is to outsource the work to an outside agency that specializes in data analytics.
Outsourcing analytics is an excellent way of enhancing your data capabilities when you lack the necessary funds, making it ideal for small companies. In 2018, though, the ever-growing shortage of qualified data practitioners will likely see many larger organizations also look to the insights-as-a-service market. According to a recent PwC study, by 2021, 69% of employers will demand data science and analytics skills from job candidates, yet just 23% of college graduates will have the skills needed to compete at the level employers demand. In a recent report sponsored by Dun & Bradstreet, 27% of respondents cited skills gaps as a major obstacle to their current data and analytics efforts. Of these, 60% said they are already using third parties to support organizational bandwidth, and 55% are outsourcing some or all of their analytics needs. Forrester also predicts that as many as 80% of firms will rely on insights service providers for at least some of their insights capabilities in 2018.
Tony Fross, VP and North American practice leader for digital advisory services at Capgemini Consulting, notes that ‘The decision to outsource is always about what the core competency of your business is, and where you need the speed. If you don’t have the resources or the ability to focus on it, sometimes outsourcing is a faster way to stand up a capability. So how do companies today make the leap to light speed and become big data analyzers? Do they go outside and hire data analysis consultants or try to develop the capability in-house? The fundamental question must be “how business-critical is the data?”’
Qualitative Data Is On The Up
In 2018, organizations will realize that quantitative data alone cannot provide a true understanding of customer behavior and market trends, because it does not capture human emotion. It does not account for the ebbs and flows of people’s motivations and feelings, and its insights can quickly become invalid as a result. Qualitative data bridges these knowledge gaps. It is the information found in the unstructured data of online reviews, social media, and so forth, and it provides the context needed to understand why something is the way it is and whether it is changing.
When we rely solely on big data, we end up with a warped sense of the world in which human beings are simply numbers to be fed into an algorithm. This is not to say big data is useless, or that it can never be used on its own. It is still a powerful and helpful tool that companies should invest in. However, companies should and will also invest in gathering and analyzing qualitative data to uncover the deeper, more human meaning behind big data. Data scientists and ethnographers need to collaborate to ensure that big data is grounded in actual human behavior, not just in what a machine assumes that behavior will always be.
Transforming this qualitative data into quantitative form is also becoming increasingly easy, with machine learning technologies such as Brainspace’s now able to mine and visualize unstructured data. It is not effortless, though, and many challenges remain. As Dr. Kirk Borne, principal data scientist at Booz Allen Hamilton, wrote in a blog post for MapR Technologies Inc., ‘there are far more subtleties and intricacies in language that we can use to extract deeper understanding and finer shades of meaning from our qualitative data sources about our customers, employees, and partners.’ However, the work is vital, and this year should see it become a much bigger deal.
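As a rough illustration of what turning qualitative data into quantitative form can look like, the sketch below scores free-text customer reviews against a tiny hand-built sentiment lexicon. The lexicon and reviews are invented for the example; production tools rely on far richer language models, which is exactly the subtlety Dr. Borne points to.

```python
# Minimal sketch: turning qualitative review text into a quantitative sentiment signal.
# The lexicon and reviews are invented for illustration; real systems use trained
# language models rather than a hand-built word list.

import re

POSITIVE = {"great", "love", "excellent", "fast", "helpful"}
NEGATIVE = {"slow", "broken", "poor", "rude", "disappointing"}

def sentiment_score(text):
    """Score a review as (#positive words - #negative words) / total words."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return (pos - neg) / len(words)

reviews = [
    "Great product, fast delivery, love it",
    "Arrived broken and support was rude",
    "Does the job",
]
for review in reviews:
    print(f"{sentiment_score(review):+.2f}  {review}")
```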
GDPR Leads To New Opportunities
On 25 May 2018, the EU General Data Protection Regulation (GDPR) comes into full effect after years in the making. The GDPR is the EU’s latest rewrite of its data privacy laws. Its impact will be felt by organizations across the globe: it applies to the personal data of people in the EU regardless of where in the world that data is captured and analyzed. This has had tech giants up in arms at what they perceive to be an assault on their power, and they will have their work cut out for them complying with the new rules.
This has many ramifications for data teams in terms of the data they can collect and how they use it, but it also presents an opportunity to companies that prepare fully and embrace GDPR, not just in terms of compliance but in rebuilding some of the trust with consumers that has been lost in recent years.
It will also help companies become better at managing their data. As CFO of SAP, Emmanuelle Brun Neckebrock recently wrote on this site, ‘Data is one of your company’s most valued resources, yet one of the most poorly managed. It’s the golden thread that runs through the entire organization, and in most instances, it’s managed casually and inconsistently, depending on individual employees and departments. You wouldn’t let your revenue, products, or equipment assets be handled that way, so data (given its inherent value) shouldn’t be any different. It warrants the same due care and attention. GDPR legislation is unique in that it allows you – OK, forces you – to transform the way you handle data across the whole organization, managing associated risks and compliance. In doing so, it’s actually strengthening your ability to compete on the digital playing field, making you more agile for long-term success.’
Companies Look To External Data
External data is any data generated from outside an organization. It can come from a variety of sources. For example, the US government alone makes available more than 131,000 datasets on the federally run website data.gov. Another example is weather, which is particularly useful in supply chain management. UK-based supermarket chain Tesco, for example, is renowned for its use of weather data to drive richer insights that help it predict sales and stock requirements. The company reported in 2013 that it had managed to save £6m ($7.5m) per year and reduce out-of-stocks on special offers by 30%. In fact, in a recent survey of supply chain professionals by the UK Met Office, 47% cited weather as one of the top three factors external to their business that drive consumer demand. Of these respondents, 57% said they had better sales forecast accuracy, 51% that they had better on-shelf availability, and 43% that they had reduced waste.
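To show what blending an external signal like weather with internal data can look like in practice, here is a minimal sketch that joins a made-up daily sales table with daily temperatures and fits a simple linear trend. The figures, column names, and the strength of the relationship are all invented; the Tesco and Met Office results above come from far more sophisticated models.

```python
# Minimal sketch: blending external weather data with internal sales data.
# All figures and column names are invented for illustration.

import pandas as pd
import numpy as np

sales = pd.DataFrame({
    "date": pd.date_range("2018-06-01", periods=7),
    "units_sold": [120, 135, 160, 150, 210, 230, 190],
})
weather = pd.DataFrame({
    "date": pd.date_range("2018-06-01", periods=7),
    "max_temp_c": [18, 19, 22, 21, 27, 29, 24],
})

combined = sales.merge(weather, on="date")  # join internal and external data

# Crude linear relationship between temperature and demand (least-squares fit)
slope, intercept = np.polyfit(combined["max_temp_c"], combined["units_sold"], 1)
print(f"Each extra degree C is associated with ~{slope:.1f} more units sold")

# Use a forecast temperature to sanity-check the stock plan
forecast_temp = 26
print(f"Rough demand estimate at {forecast_temp}C: {slope * forecast_temp + intercept:.0f} units")
```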
In 2012, Forbes writer Dan Woods argued that companies were suffering from what he called ‘data not invented here syndrome’, and were failing in their data initiatives because they focused solely on using data created inside the four walls of their business. This is a situation that has largely not improved. This is, to some extent, understandable. There are real risks around the use of external data and it is not right for every industry, but it can be a real driver of growth when applied correctly. In 2018, greater accessibility to external information and greater maturity in data programs should see those organizations for whom it makes sense far better positioned to use it.
NLG Helps Business Users Better Understand Visualizations
Natural Language Generation (NLG) and Natural Language Processing (NLP) are often wrongly used interchangeably and misunderstood in business. While NLP focuses on understanding the ideas being communicated by analyzing textual data for patterns, NLG is a branch of AI that communicates findings and insights, such as those discovered through NLP, by translating them into natural language. NLG is now being integrated into analytics tools to work alongside data visualizations, providing mainstream users with a clearer narrative, demystifying data, and communicating insights in real time so action can be taken. BI and analytics vendors are investing significant sums in advanced NLG technology to complement their data storytelling. Indeed, Gartner predicts that by 2019, NLG will be a standard feature of 90% of modern business intelligence and analytics platforms.
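As a toy example of what NLG adds alongside a chart, the sketch below turns a small monthly revenue series into a one-sentence narrative. The figures are invented, and real NLG engines in BI platforms do far more (selecting which facts are worth stating, varying phrasing, handling many chart types); this only illustrates the basic “numbers in, sentence out” idea.

```python
# Toy sketch of template-based natural language generation for a chart caption.
# The revenue figures are invented; real NLG engines choose facts and phrasing
# far more intelligently than this.

monthly_revenue = {"Jan": 1.20, "Feb": 1.35, "Mar": 1.10, "Apr": 1.55}  # $ millions

def describe_trend(series):
    months = list(series)
    values = list(series.values())
    best_month = max(series, key=series.get)
    change = (values[-1] - values[0]) / values[0] * 100
    direction = "rose" if change >= 0 else "fell"
    return (f"Revenue {direction} {abs(change):.0f}% between {months[0]} and {months[-1]}, "
            f"peaking at ${max(values):.2f}m in {best_month}.")

print(describe_trend(monthly_revenue))
```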
Developing Value