...
...
Building a Data Culture


Posted | Updated by the Insights team:
Dr. Evangelo Damigos, PhD | Head of Digital Futures Research Desk
  • Digital Transformation
  • Sustainable Growth and Tech Trends


Publication | Update: Oct 2020
...

Gartner says, "Culture and data literacy are the top two roadblocks for data and analytics leaders."

To achieve clarity of role and purpose, CDOs need to act quickly and decisively to determine the data-driven ambitions of the enterprise, set their scope, and actively communicate their role. 

Harnessing data can be the difference between staying ahead or falling behind, notes Katerina Hanna, APAC director for customer success at Tableau Software.

She cited a recent IDC report that underscored how data alone is no guarantee of success. Based on a survey of 1,100 organizations and interviews conducted at all levels, it established that data-driven progress is inextricably tied to the culture of the organization.

“You can have a lot of data, you can have the best leading-edge technology, you can even have the best analysts in the world. But that is not enough to guarantee success. More and more organizations around the world are recognizing that turning data into information, knowledge, insights, and actions requires a data culture,” she said.

Changing culture is difficult. It requires a strong vision, determination, and resilience, observed Hanna. This entails embedding data into the very fabric of the organization: getting everybody to see the value in data, providing proper training so users have the confidence to use it, and continually inspiring everyone to leverage data for better decision making, she said.

According to Paul Mah of CDO Trends, because a data culture is essentially the collective behaviors and beliefs of people who value, practice and encourage the use of data, a good strategy must involve every individual within the organization.

“People are at the core of any data culture. They need to practice using data every day, all of which then comes together in better decision making. Strong data culture depends on trust. Leaders in data culture believe that all their people are smart and capable. They empower people to ask questions, and they embrace critical thinking,” said Hanna.

Five keys for building a data culture

In her presentation, Hanna outlined the five keys to building a data culture that Tableau has embraced.

  • Trust: Leaders need to embrace critical thinking and trust their people to produce results.
  • Talent: Support and enable data-capable employees through their entire lifecycle, from recruitment onward.
  • Commitment: Incentivize and inspire existing employees to upskill and incorporate the use of data.
  • Sharing: Share and spread success and use cases through the organization as part of the data-driven journey.
  • Mindset: Nurture a new mindset across the organization that prioritizes insights and makes decisions based on facts, rather than intuition.

To illustrate some of the pointers in action, Hanna drew on anecdotes. For instance, an unnamed organization in China created a one-on-one training camp that accelerated its data culture. By empowering people and putting them at the front of its data-centric plans, the organization improved its analytics efficiency by 11% and simultaneously saved $1 million by not having to engage third-party designers and consultants.

For its part, Indonesia's largest financial institution, PT Bank Mandiri, teamed up with Tableau to develop more than 600 visualizations and dashboards, significantly improving its efficiency. One example given was manual data requests, which had taken headquarters an average of two weeks to process; these were cut down to just two days.

As transformation accelerates around the world, and macroeconomic and geopolitical factors force changes, business leaders strive to bring about the necessary change to help their organizations thrive.

According to Cognizant research, the following steps will help guide CDOs to create successful, thriving data cultures:

1. Map your organization’s data supply chain.

2. Focus on the “art of the possible.”

3. Be transparent about data.

4. Develop reward-sharing mechanisms.

5. Identify areas of friction within the organization.

6. Elevate the conversation to focus on strategy and innovation.

...

Source: Cognizant

David Waller, a partner and the head of data science and analytics for Oliver Wyman Labs, has distilled 10 data commandments to help create and sustain a culture with data at its core.

1. Data-driven culture starts at the (very) top. Companies with strong data-driven cultures tend to have top managers who set an expectation that decisions must be anchored in data — that this is normal, not novel or exceptional. They lead through example. At one retail bank, C-suite leaders together sift through the evidence from controlled market trials to decide on product launches. At a leading tech firm, senior executives spend 30 minutes at the start of meetings reading detailed summaries of proposals and their supporting facts, so that they can take evidence-based actions. These practices propagate downwards, as employees who want to be taken seriously have to communicate with senior leaders on their terms and in their language. The example set by a few at the top can catalyze substantial shifts in company-wide norms.

2. Choose metrics with care — and cunning. Leaders can exert a powerful effect on behavior by artfully choosing what to measure and what metrics they expect employees to use. Suppose a company can profit by anticipating competitors’ price moves. Well, there’s a metric for that: predictive accuracy through time. So a team should continuously make explicit predictions about the magnitude and direction of such moves. It should also track the quality of those predictions – they will steadily improve!

For example, a leading telco operator wanted to ensure that its network provided key customers with the best possible user experience. But it had only gathered aggregated statistics on network performance, so it knew little about who was receiving what and the service quality they experienced. By creating detailed metrics on customers’ experiences, the operator could make a quantitative analysis of the consumer impact of network upgrades. To do this, the company just needed to have a much tighter grip on the provenance and consumption of its data than is typically the case — and that’s precisely the point.
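
A minimal sketch of what tracking "predictive accuracy through time" might look like in practice; the quarterly figures and the mean-absolute-error metric are illustrative assumptions, not the article's own implementation:

```python
# A minimal sketch of tracking "predictive accuracy through time":
# each record pairs a dated prediction with the realized outcome.
from statistics import mean

predictions = [
    # (quarter, predicted competitor price move %, actual move %)
    ("2020Q1", -2.0, -3.5),
    ("2020Q2", 1.0, 0.5),
    ("2020Q3", -1.5, -1.0),
    ("2020Q4", 2.5, 2.0),
]

# mean absolute error per quarter, so the team can watch accuracy improve
errors = [(q, abs(pred - actual)) for q, pred, actual in predictions]
print(errors)
print("MAE:", mean(e for _, e in errors))
```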

3. Don't pigeonhole your data scientists. Data scientists are often sequestered within a company, with the result that they and business leaders know too little about one another. Analytics can't survive or provide value if it operates separately from the rest of a business. Those who have addressed this challenge successfully have generally done so in two ways.

The first tactic is to make any boundaries between the business and the data scientists highly porous. One leading global insurer rotates staff out of centers of excellence and into line roles, where they scale up a proof of concept. Then they may return to the center. A global commodities trading firm has designed new roles in various functional areas and lines of business to augment analytical sophistication; these roles have dotted-line relationships to centers of excellence. Ultimately, the particulars matter less than the principle, which is to find ways to fuse domain knowledge and technical know-how.

Companies at the leading edge use another tactic.  In addition to dragging data science closer to the business, they pull the business toward data science, chiefly by insisting that employees are code-literate and conceptually fluent in quantitative topics. Senior leaders don’t need to be reborn as machine-learning engineers.  But leaders of data-centric organizations cannot remain ignorant of the language of data.

4. Fix basic data-access issues quickly. By far the most common complaint we hear is that people in different parts of a business struggle to obtain even the most basic data. Curiously, this situation persists despite a spate of efforts to democratize access to data within corporations.  Starved of information, analysts don’t do a great deal of analysis, and it’s impossible for a data-driven culture to take root, let alone flourish.

Top firms use a simple strategy to break this logjam.  Instead of grand — but slow — programs to reorganize all their data, they grant universal access to just a few key measures at a time. For example, a leading global bank, which was trying to better anticipate loan refinancing needs, constructed a standard data layer for its marketing department, focusing on the most relevant measures. In this instance, these were core data pertaining to loan terms, balances, and property information; marketing channel data on how loans were originated; and data that characterized customers’ broad banking relationship. No matter the specific initiative, a canny choice for the first data to make accessible is whichever metrics are on the C-suite agenda. Demanding that other numbers eventually be tied to this data source can dramatically encourage its use.

5. Quantify uncertainty. Everyone accepts that absolute certainty is impossible. Yet most managers continue to ask their teams for answers without a corresponding measure of confidence. They're missing a trick. Requiring teams to be explicit and quantitative about their levels of uncertainty has three powerful effects.

First, it forces decision makers to grapple directly with potential sources of uncertainty: Is the data reliable? Are there too few examples for a reliable model?  How can factors be incorporated when there are no data for them, such as emerging competitive dynamics?  One retailer found that the apparent degradation in redemption rates from its direct marketing models was caused by increasingly stale address data. An update, plus a process for keeping the data fresh, fixed the problem.

Second, analysts gain a deeper understanding of their models when they have to rigorously evaluate uncertainty. For example, a U.K. insurer’s core risk models had failed to adequately adjust to market trends.  So it built an early-warning system to take these trends into account and spot cases that would otherwise have been missed. As a result, it avoided losses due to sudden spikes in claims.

Finally, an emphasis on understanding uncertainty pushes organizations to run experiments.  “At most places, ‘test and learn’ really means ‘tinker and hope,’” a retailer’s chief merchant once noted. At his firm, a team of quantitative analysts paired up with category managers to conduct statistically rigorous, controlled trials of their ideas before making widespread changes.
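
A minimal sketch of one way to attach a quantitative confidence statement to an estimate, here a bootstrap interval around a mean redemption rate; the sample data and function names are illustrative assumptions:

```python
# A minimal sketch of quantifying uncertainty around an estimate
# via a bootstrap confidence interval; the data are illustrative.
import random

random.seed(42)
redemption_rates = [0.11, 0.09, 0.13, 0.10, 0.12, 0.08, 0.14, 0.10]

def bootstrap_ci(data, n_resamples=10_000, alpha=0.05):
    means = sorted(
        sum(random.choices(data, k=len(data))) / len(data)
        for _ in range(n_resamples)
    )
    lo = means[int(n_resamples * alpha / 2)]
    hi = means[int(n_resamples * (1 - alpha / 2))]
    return lo, hi

lo, hi = bootstrap_ci(redemption_rates)
estimate = sum(redemption_rates) / len(redemption_rates)
# Report the estimate *with* its uncertainty, not as a bare number
print(f"mean redemption rate {estimate:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```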

6. Make proofs of concept simple and robust, not fancy and brittle. In analytics, promising ideas greatly outnumber practical ones. Often, it’s not until firms try to put proofs of concept into production that the difference becomes clear. One large insurer held an internal hackathon and crowned its winner — an elegant improvement of an online process — only to scrap the idea because it seemed to require costly changes to underlying systems. Snuffing out good ideas in this way can be demoralizing for organizations.

A better approach is to engineer proofs of concept where a core part of the concept is its viability in production. One good way is to start to build something that is industrial grade but trivially simple, and later ratchet up the level of sophistication. For example, to implement new risk models on a large, distributed computing system, a data products company started by implementing an extremely basic process that worked end-to-end: a small dataset flowed correctly from source systems and through a simple model and was then transmitted to end users. Once that was in place, and knowing that the whole still cohered, the firm could improve each component independently: greater data volumes, more exotic models, and better runtime performance.
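
A minimal sketch of the "industrial grade but trivially simple" idea: a complete source-to-consumer flow whose stages can later be upgraded independently. The data layout and the mean-based "model" are illustrative assumptions:

```python
# A minimal sketch of a trivially simple end-to-end flow:
# source -> model -> consumer, each stage swappable independently.
import csv, io

SOURCE = io.StringIO("exposure,loss\n100,2\n250,7\n400,11\n")  # stand-in source

def extract(fh):
    return [dict(row) for row in csv.DictReader(fh)]

def model(rows):
    # deliberately basic: average loss ratio as the "risk model"
    ratios = [float(r["loss"]) / float(r["exposure"]) for r in rows]
    return sum(ratios) / len(ratios)

def deliver(score):
    print(f"risk score delivered to end users: {score:.4f}")

deliver(model(extract(SOURCE)))  # the whole chain works before it gets fancy
```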

7. Specialized training should be offered just in time. Many companies invest in “big bang” training efforts, only for employees to rapidly forget what they’ve learned if they haven’t put it to use right away.  So while basic skills, such as coding, should be part of fundamental training, it is more effective to train staff in specialized analytical concepts and tooling just before these are needed — say, for a proof of concept. One retailer waited until shortly before a first market trial before it trained its support analysts in the finer points of experimental design.  The knowledge stuck, and once-foreign concepts, such as statistical confidence, are now part of the analysts’ vernacular.

8. Use analytics to help employees, not just customers. It's easy to forget the potential role of data fluency in making employees happier. But empowering employees to wrangle data themselves can do this, as it enables them to follow the advice in a memorably titled book on programming: Automate the Boring Stuff with Python. If the idea of learning new skills to better handle data is presented in the abstract, few employees will get excited enough to persevere and revamp their work. But if the immediate goals directly benefit them — by saving time, helping avoid rework, or fetching frequently needed information — then a chore becomes a choice. Years ago, the analytics team at a leading insurer taught itself the fundamentals of cloud computing simply so it could experiment with new models on large datasets without waiting for the IT department to catch up with its needs. That experience proved foundational when, at last, IT remade the firm's technical infrastructure. When the time came to sketch out the platform requirements for advanced analytics, the team could do more than describe an answer. They could demonstrate a working solution.
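
In that spirit, a minimal sketch of the kind of chore such skills remove, here merging weekly report files that would otherwise be combined by hand; the reports/ directory and column layout are illustrative assumptions:

```python
# A minimal "automate the boring stuff" sketch: merge weekly report
# files an analyst would otherwise combine by hand. Assumes a reports/
# directory of week_*.csv files sharing one column layout.
import csv
from pathlib import Path

rows = []
for report in sorted(Path("reports").glob("week_*.csv")):
    with report.open(newline="") as fh:
        rows.extend(csv.DictReader(fh))

if rows:
    with open("combined.csv", "w", newline="") as out:
        writer = csv.DictWriter(out, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)
    print(f"combined {len(rows)} rows into combined.csv")
```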

9. Be willing to trade flexibility for consistency — at least in the short term. Many companies that depend on data harbor different "data tribes." Each may have its own preferred sources of information, bespoke metrics, and favorite programming languages. Across an organization, this can be a disaster. Companies can waste countless hours trying to reconcile subtly different versions of a metric that should be universal. Inconsistencies in how modelers do their work take a toll too. If coding standards and languages vary across a business, every move by analytical talent entails retraining, making it hard for them to circulate. It can also be prohibitively cumbersome to share ideas internally if they always require translation. Companies should instead pick canonical metrics and programming languages. One leading global bank did this by insisting that its new hires in investment banking and asset management knew how to code in Python.

10. Get in the habit of explaining analytical choices. For most analytical problems, there’s rarely a single, correct approach.  Instead, data scientists must make choices with different tradeoffs. So it’s a good idea to ask teams how they approached a problem, what alternatives they considered, what they understood the tradeoffs to be, and why they chose one approach over another. Doing this as a matter of course gives teams a deeper understanding of the approaches and often prompts them to consider a wider set of alternatives or to rethink fundamental assumptions.  One global financial services company at first assumed that a fairly conventional machine-learning model to spot fraud couldn’t run quickly enough to be used in production. But it later realized the model could be made blazingly fast with a few simple tweaks. When the company started to utilize the model, it achieved astonishing improvements in accurately identifying fraud.

Companies — and the divisions and individuals that comprise them — often fall back on habit, because alternatives look too risky. Data can provide a form of evidence to back up hypotheses, giving managers the confidence to jump into new areas and processes without taking a leap in the dark. But simply aspiring to be data-driven is not enough. To be driven by data, companies need to develop cultures in which this mindset can flourish. Leaders can promote this shift through example, by practicing new habits and creating expectations for what it really means to root decisions in data.

Source: CDO Trends

Learn More: The Data-Driven Mindset


...

Objectives and Study Scope

This study has assimilated knowledge and insight from business and subject-matter experts, and from a broad spectrum of market initiatives. Building on this research, the objectives of this market research report are to provide actionable intelligence on opportunities, alongside the market size of various segments, as well as fact-based information on key factors influencing the market: growth drivers, industry-specific challenges, and other critical issues, in terms of detailed analysis and impact.

The report in its entirety provides a comprehensive overview of the current global condition, as well as notable opportunities and challenges. The analysis reflects market size, latest trends, growth drivers, threats, opportunities, as well as key market segments. The study addresses market dynamics in several geographic segments along with market analysis for the current market environment and future scenario over the forecast period. The report also segments the market into various categories based on the product, end user, application, type, and region.
The report also studies various growth drivers and restraints impacting the market, plus a comprehensive market and vendor landscape, in addition to a SWOT analysis of the key players. This analysis also examines the competitive landscape within each market. Market factors are assessed by examining barriers to entry and market opportunities. Strategies adopted by key players, including recent developments, new product launches, mergers and acquisitions, and other insightful updates, are provided.

Research Process & Methodology

...

We leverage extensive primary research, our contact database, knowledge of companies and industry relationships, patent and academic journal searches, and institute and university links to maintain strong visibility into the markets and technologies we cover.

We draw on available data sources and methods to profile developments. We use computerised data mining methods and analytical techniques, including cluster and regression modelling, to identify patterns from publicly available online information on enterprise web sites.
Historical, qualitative, and quantitative information is obtained principally from confidential and proprietary sources, professional networks, annual reports, investor relations presentations, and expert interviews. These cover key factors, such as recent trends in industry performance, and help identify the factors underlying those trends - the drivers, restraints, opportunities, and challenges influencing the growth of the market on both the supply and demand sides.
In addition to our own desk research, various secondary sources, such as Hoovers, Dun & Bradstreet, Bloomberg BusinessWeek, and Statista, are consulted to identify key players in the industry, the supply chain, and market size, as well as percentage shares, splits, and breakdowns into segments and subsegments with respect to individual growth trends, prospects, and contribution to the total market.

Research Portfolio Sources:

  • BBC Monitoring

  • BMI Research: Company Reports, Industry Reports, Special Reports, Industry Forecast Scenario

  • CIMB: Company Reports, Daily Market News, Economic Reports, Industry Reports, Strategy Reports, and Yearbooks

  • Dun & Bradstreet: Country Reports, Country Riskline Reports, Economic Indicators 5yr Forecast, and Industry Reports

  • EMIS: EMIS Insight and EMIS Dealwatch

  • Enerdata: Energy Data Set, Energy Market Report, Energy Prices, LNG Trade Data and World Refineries Data

  • Euromoney: China Law and Practice, Emerging Markets, International Tax Review, Latin Finance, Managing Intellectual Property, Petroleum Economist, Project Finance, and Euromoney Magazine

  • Euromonitor International: Industry Capsules, Local Company Profiles, Sector Capsules

  • Fitch Ratings: Criteria Reports, Outlook Report, Presale Report, Press Releases, Special Reports, Transition Default Study Report

  • FocusEconomics: Consensus Forecast Country Reports

  • Ken Research: Industry Reports, Regional Industry Reports and Global Industry Reports

  • MarketLine: Company Profiles and Industry Profiles

  • OECD: Economic Outlook, Economic Surveys, Energy Prices and Taxes, Main Economic Indicators, Main Science and Technology Indicators, National Accounts, Quarterly International Trade Statistics

  • Oxford Economics: Global Industry Forecasts, Country Economic Forecasts, Industry Forecast Data, and Monthly Industry Briefings

  • Progressive Digital Media: Industry Snapshots, News, Company Profiles, Energy Business Review

  • Project Syndicate: News Commentary

  • Technavio: Global Market Assessment Reports, Regional Market Assessment Reports, and Market Assessment Country Reports

  • The Economist Intelligence Unit: Country Summaries, Industry Briefings, Industry Reports and Industry Statistics

Global Business Reviews, Research Papers, Commentary & Strategy Reports

  • World Bank

  • World Trade Organization

  • The Financial Times

  • The Wall Street Journal

  • The Wall Street Transcript

  • Bloomberg

  • Standard & Poor’s Industry Surveys

  • Thomson Research

  • Thomson Street Events

  • Reuters 3000 Xtra

  • OneSource Business

  • Hoover’s

  • MGI

  • LSE

  • MIT

  • ERA

  • BBVA

  • IDC

  • IdExec

  • Moody’s

  • Factiva

  • Forrester Research

  • Computer Economics

  • Voice and Data

  • SIA / SSIR

  • Kiplinger Forecasts

  • Dialog PRO

  • LexisNexis

  • ISI Emerging Markets

  • McKinsey

  • Deloitte

  • Oliver Wyman

  • Faulkner Information Services

  • Accenture

  • Ipsos

  • Mintel

  • Statista

  • Bureau van Dijk’s Amadeus

  • EY

  • PwC

  • Berg Insight

  • ABI research

  • Pyramid Research

  • Gartner Group

  • Juniper Research

  • MarketsandMarkets

  • GSA

  • Frost & Sullivan Analysis

  • McKinsey Global Institute

  • European Mobile and Mobility Alliance

  • Open Europe

M&A and Risk Management | Regulation

  • Thomson Mergers & Acquisitions

  • MergerStat

  • Profound

  • DDAR

  • ISS Corporate Governance

  • BoardEx

  • Board Analyst

  • Securities Mosaic

  • Varonis

  • International Tax and Business Guides

  • CoreCompensation

  • CCH Research Network

...
Forecast methodology

The future outlook (forecast) is based on a set of statistical methods, such as regression analysis, industry-specific drivers, and analyst evaluations, as well as on analysis of the trends that influence economic outcomes and business decision making.
The Global Economic Model covers the political environment, the macroeconomic environment, market opportunities, policy towards free enterprise and competition, policy towards foreign investment, foreign trade and exchange controls, taxes, financing, the labour market, and infrastructure. We aim to update our market forecast to include the latest market developments and trends.

Forecasts, Data modelling and indicator normalisation

A review of independent forecasts for the main macroeconomic variables by the following organizations provides a holistic overview of the range of alternative opinions:

  • Cambridge Econometrics (CE)

  • The Centre for Economic and Business Research (CEBR)

  • Experian Economics (EE)

  • Oxford Economics (OE)

As a result, the reported forecasts derive from different forecasters and may not represent the view of any one forecaster over the whole of the forecast period. These projections provide an indication of what is, in our view, most likely to happen, not what will definitely happen.

Short- and medium-term forecasts are based on a “demand-side” forecasting framework, under the assumption that supply adjusts to meet demand either directly through changes in output or through the depletion of inventories.
Long-term projections rely on a supply-side framework, in which output is determined by the availability of labour and capital equipment and the growth in productivity.
Long-term growth prospects are impacted by factors including workforce capabilities, the openness of the economy to trade, the legal framework, fiscal policy, and the degree of government regulation.

Direct contribution to GDP
The method for calculating the direct contribution of an industry to GDP is to measure its 'gross value added' (GVA); that is, to calculate the difference between the industry's total pre-tax revenue and its total bought-in costs (costs excluding wages and salaries).

Forecasts of GDP growth: GDP = CN + IN + GS + NEX, where CN is consumption, IN is investment, GS is government spending, and NEX is net exports.

GDP growth estimates take into account the following components (a worked sketch follows the list):

  • Consumption, expressed as a function of income, wealth, prices and interest rates;

  • Investment, as a function of the return on capital and changes in capacity utilization;

  • Government spending, as a function of intervention initiatives and the state of the economy;

  • Net exports, as a function of global economic conditions.
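
A worked sketch of the identity above with illustrative figures (in billions of USD); the numbers are assumptions for demonstration only:

```python
# A worked sketch of GDP = CN + IN + GS + NEX with illustrative figures.
components = {
    "consumption": 2_450.0,          # CN
    "investment": 820.0,             # IN
    "government_spending": 610.0,    # GS
    "net_exports": -120.0,           # NEX: imports exceed exports
}
gdp = sum(components.values())

prior_gdp = 3_640.0                  # previous year's GDP, also illustrative
growth = (gdp - prior_gdp) / prior_gdp
print(f"GDP: {gdp:,.0f}bn, growth: {growth:.1%}")
```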

...

Market Quantification
All relevant markets are quantified utilizing revenue figures for the forecast period. The Compound Annual Growth Rate (CAGR) within each segment is used to measure growth and to extrapolate data when figures are not publicly available.
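
A minimal sketch of the CAGR calculation and extrapolation described above; the revenue figures and function names are illustrative assumptions:

```python
# A minimal sketch of CAGR-based extrapolation for gap-filling a series.
def cagr(start_value, end_value, years):
    """Compound annual growth rate between two observed revenue figures."""
    return (end_value / start_value) ** (1 / years) - 1

def extrapolate(last_value, rate, years_ahead):
    """Extend the series by compounding the segment's CAGR forward."""
    return [last_value * (1 + rate) ** t for t in range(1, years_ahead + 1)]

rate = cagr(120.0, 175.0, 4)        # segment revenue, year 0 -> year 4
print(f"CAGR: {rate:.1%}")          # about 9.9% per year
print(extrapolate(175.0, rate, 3))  # estimated revenues, next three years
```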

Revenues

Our market segments reflect major categories and subcategories of the global market, followed by an analysis of statistical data covering national spending and international trade relations and patterns. Market values reflect revenues paid by the final customer / end user to vendors and service providers either directly or through distribution channels, excluding VAT. Local currencies are converted to USD using the yearly average exchange rates of local currencies to the USD for the respective year as provided by the IMF World Economic Outlook Database.

Industry Life Cycle Market Phase

Market phase is determined using factors in the Industry Life Cycle model. The adapted market phase definitions are as follows:

  • Nascent: New market need not yet determined; growth begins increasing toward end of cycle

  • Growth: Growth trajectory picks up; high growth rates

  • Mature: Typically fewer firms than growth phase, as dominant solutions continue to capture the majority of market share and market consolidation occurs, displaying lower growth rates that are typically on par with the general economy

  • Decline: Further market consolidation, rapidly declining growth rates

...

The Global Economic Model
The Global Economic Model brings together macroeconomic and sectoral forecasts to quantify the key relationships between them.

The model is a hybrid statistical model that uses macroeconomic variables and inter-industry linkages to forecast sectoral output. The model is used to forecast not just output, but also prices, wages, employment, and investment. The principal variables driving the industry model are the components of final demand, which directly or indirectly determine the demand facing each industry. However, other macroeconomic assumptions — in particular exchange rates, as well as world commodity prices — also enter into the equation, as do other industry-specific factors that have affected, or are expected to affect, industry performance.

  • Vector Auto Regression (VAR) statistical models, capturing the linear interdependencies among multiple time series, are best used for short-term forecasting, whereby shocks to demand generate economic cycles that can be influenced by fiscal and monetary policy (a minimal sketch follows this list).

  • Dynamic-Stochastic Equilibrium (DSE) models replicate the behaviour of the economy by analyzing the interaction of economic variables, whereby output is determined by supply side factors, such as investment, demographics, labour participation and productivity.

  • Dynamic Econometric Error Correction (DEEC) modelling combines VAR and DSE models by estimating the speed at which a dependent variable returns to its equilibrium after a shock, as well as assessing the impact of a company, industry, new technology, regulation, or market change. DEEC modelling is best suited for forecasting.
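
A minimal sketch of the short-term VAR approach from the first bullet, using the statsmodels library on synthetic series; the series themselves and the two-lag choice are illustrative assumptions:

```python
# A minimal VAR forecasting sketch on two synthetic, related series
# standing in for, e.g., demand and output. Requires statsmodels.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
n = 120                                               # ten years, monthly
demand = np.cumsum(rng.normal(0.2, 1.0, n))           # trending demand series
output = 0.8 * demand + rng.normal(0.0, 0.5, n)       # output tracking demand
data = pd.DataFrame({"demand": demand, "output": output})

model = VAR(data)
results = model.fit(2)                                # VAR with two lags
forecast = results.forecast(data.values[-results.k_ar:], steps=6)
print(pd.DataFrame(forecast, columns=data.columns))   # six-step-ahead forecast
```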

Forecasts of GDP growth per capita based on these factors can then be combined with demographic projections to give forecasts for overall GDP growth.
Wherever possible, publicly available data from official sources are used for the latest available year. Qualitative indicators are normalised (on the basis of Normalised x = (x - Min(x)) / (Max(x) - Min(x)), where Min(x) and Max(x) are the lowest and highest values for any given indicator, respectively) and then aggregated across categories to enable an overall comparison. The normalised value is then transformed into a positive number on a scale of 0 to 100. The weighting assigned to each indicator can be changed to reflect different assumptions about their relative importance.
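
A minimal sketch of this normalisation and weighted aggregation; the indicator names and weights are illustrative assumptions:

```python
# A minimal sketch of min-max normalisation to a 0..100 scale,
# followed by a weighted aggregate score per country.
def normalise(values):
    lo, hi = min(values), max(values)
    return [100 * (v - lo) / (hi - lo) for v in values]  # rescaled to 0..100

indicators = {
    "trade_openness": [12.0, 55.0, 30.0],   # raw scores for three countries
    "regulation":     [0.4, 0.9, 0.6],
}
weights = {"trade_openness": 0.6, "regulation": 0.4}    # illustrative weights

normalised = {k: normalise(v) for k, v in indicators.items()}
scores = [sum(weights[k] * normalised[k][i] for k in indicators)
          for i in range(3)]                # weighted aggregate per country
print(scores)
```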

...

The principal explanatory variable in each industry’s output equation is the Total Demand variable, encompassing exogenous macroeconomic assumptions, consumer spending and investment, and intermediate demand for goods and services by sectors of the economy for use as inputs in the production of their own goods and services.

Elasticities
Elasticity measures the response of one economic variable to a change in another economic variable, whether the good or service is demanded as an input into a final product or whether it is the final product, and provides insight into the proportional impact of different economic actions and policy decisions.
Demand elasticities measure the change in the quantity demanded of a particular good or service as a result of changes to other economic variables, such as its own price, the price of competing or complementary goods and services, income levels, and taxes.
Demand elasticities are influenced by several factors, which interact with the specific characteristics of the product to determine the overall responsiveness of demand to changes in prices and incomes (a worked sketch follows this list):

  • Availability of substitutes: elasticity is typically higher the greater the number of available substitutes, as consumers can easily switch between different products.

  • Degree of necessity: luxury products typically have a higher elasticity, while necessities and habit-forming products tend to have a lower one.

  • Proportion of the budget consumed by the item: products that consume a large portion of the consumer's budget tend to have greater elasticity.

  • Time horizon: elasticities tend to be greater over the long run, because consumers have more time to adjust their behaviour.

  • Derived demand: if the product or service is an input into a final product, its price elasticity will depend on the price elasticity of the final product, its cost share in the production costs, and the availability of substitutes for that good or service.
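
A worked sketch of a demand-elasticity calculation using the midpoint (arc) formula; the prices and quantities are illustrative assumptions:

```python
# A worked sketch of arc (midpoint) price elasticity of demand:
# % change in quantity demanded divided by % change in price.
def price_elasticity(q0, q1, p0, p1):
    pct_q = (q1 - q0) / ((q1 + q0) / 2)
    pct_p = (p1 - p0) / ((p1 + p0) / 2)
    return pct_q / pct_p

# price rises from 10 to 11; quantity demanded falls from 100 to 92
e = price_elasticity(100, 92, 10.0, 11.0)
print(round(e, 2))  # about -0.87: demand is inelastic in this range
```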

Prices
Prices are also forecast using an input-output framework. Input costs have two components: labour costs are driven by wages, while intermediate costs are computed as an input-output weighted aggregate of input sectors' prices. Employment is a function of output and real sectoral wages, which are forecast as a function of whole-economy wage growth. Investment is forecast as a function of output and the aggregate level of business investment.
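
A minimal sketch of this input-output price logic for a two-sector economy, where unit prices solve p = A^T p + v (v being labour cost per unit of output); the coefficients are illustrative assumptions:

```python
# A minimal input-output price sketch: prices cover intermediate input
# costs (input-output weighted aggregate of sector prices) plus labour
# costs, i.e. p = A.T @ p + v, solved as p = (I - A.T)^-1 @ v.
import numpy as np

A = np.array([[0.2, 0.3],    # A[i, j]: input of sector i per unit of sector j
              [0.1, 0.4]])
v = np.array([0.5, 0.6])     # labour cost per unit of output in each sector

p = np.linalg.solve(np.eye(2) - A.T, v)
print(p)  # unit prices consistent with wage and intermediate input costs
```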
