GOVERNED SELF-SERVICE ANALYTICS: Maturity Model (10/10)

My last 9 blogs covered all aspects of governed self-service and how to scale from department self-service to enterprise self-service. I received some very positive feedback and I am glad that my blogs inspired some readers:

Devdutta Bhosale says: “I read your article governed self-service analytics and as a Tableau server professional could instantly relate with some of challenges of implementing Enterprise BI with Tableau. Compared to Legacy BI tools such as BO, Micro-strategy, etc. enterprise BI is not the strength of Tableau especially compared to “the art of possible” with visualizations. I am so glad that you are writing so much in this space …. The knowledge you have shared has helped me follow some of the best practices with my recent Enterprise BI implementation at employer. I just wanted to say ‘thank you’ “.

Other readers have asked me how to measure governed self-service maturity. There are BI maturity models from TDWI, Gartner, and others, but I have not seen a practical self-service analytics maturity model. Here is my first attempt at one. I spent a lot of time thinking through this model, and I read widely before putting this post together.

I will describe the self-service analytics maturity model as follows:

  • Level 1: Ad-hoc
  • Level 2: Department Adoption
  • Level 3: Enterprise Adoption
  • Level 4: Culture of Analytics

Level 1 (ad-hoc) is where one or a few teams start to use Tableau for quick visualizations and insights; in other words, this is where Tableau initially lands. Once Tableau’s initial value is recognized, adoption grows to the business unit or department level (level 2), which is where most Tableau implementations are today. Scaling further to enterprise adoption (level 3) requires business strategy alignment, bigger investment, and a governed self-service model, which is what this series of blogs is about. The ultimate goal is to drive a culture of analytics and enable data-driven decision-making, which is level 4.

What are the characteristics of each maturity level? I will look at each level from the data, technology, governance, and business outcome perspectives:

Level 1: Ad-hoc

  • Data
    • Heroic data discovery
    • Inconsistent data
    • Poor data quality
  • Technology
    • Team-based technology choices
    • Shadow IT tools
    • Exploration
  • Governance
    • No governance
    • Overlapping projects
  • Outcome
    • Focuses on what happened
    • Analytics does not reflect business strategy
    • Business process monitoring metrics

Level 2: Department Adoption

  • Data
    • Useful data
    • Some data definitions
    • Siloed data management
    • Limited data policies
  • Technology
    • Partially IT-supported architecture
    • Immature data preparation tools
    • Data-mart-like solutions
    • Early stage of big data technology
    • Scalability challenges
  • Governance
    • Function- and business-line governance
    • Immature metadata governance
    • Islands of information
    • Unclear roles and responsibilities
    • Multiple versions of KPIs
  • Outcome
    • Some business functions recognize analytics value and ROI
    • Analytics is used to inform decision-making
    • More focus on cause analysis, with some resistance to adopting insights
    • Data governance is managed in a piecemeal fashion

Level 3: Enterprise Adoption

  • Data
    • Data quality certification (see the sketch after the Level 3 bullets)
    • Process & data measurement
    • Data policies measured & enforced
    • Data exception management
    • Data accuracy & consistency
    • Data protection
  • Technology
    • Enterprise analytics architecture
    • Managed analytics sandboxes
    • Enterprise data warehouse
    • Content catalog
    • Enterprise tools for various power users
    • Advanced technology
    • Exploration
  • Governance
    • Executive steering committee
    • Governed self-service
    • CoE with continuous improvement
    • Data and report governance
    • Enterprise data security
    • Business and IT partnership
  • Outcome
    • Analytics insight as a competitive advantage
    • Relevant information as a differentiator
    • Predictive analytics to optimize decision-making
    • Enterprise information architecture defined
    • Mature governed self-service
    • Tiered information contents

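To make bullets like “Data quality certification” and “Content catalog” concrete, here is a minimal sketch of a certification audit, assuming the tableauserverclient (TSC) Python library and a Tableau Server site. The server URL, site name, and credentials below are placeholders, not anything from this series:

import tableauserverclient as TSC   # pip install tableauserverclient

# Placeholders: use your own server URL, site name, and service-account credentials.
tableau_auth = TSC.TableauAuth('audit_user', 'secret', site_id='analytics')
server = TSC.Server('https://tableau.example.com', use_server_version=True)

with server.auth.sign_in(tableau_auth):
    # Walk every published data source and report whether a data steward
    # has marked it certified -- the raw material for a simple content catalog.
    for ds in TSC.Pager(server.datasources):
        status = 'certified' if ds.certified else 'NOT certified'
        print(f'{ds.project_name} / {ds.name}: {status}')

A CoE could run a report like this on a schedule and publish the result as part of the tiered content catalog, so business users can tell certified sources from ad-hoc ones.
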
Level 4: Culture of Analytics

  • Data
    • Information life-cycle management
    • Data lineage & data flow impact documented
    • Data risk management and compliance
    • Value creation & monetizing
    • Business Innovation
  • Technology
    • Event detection
    • Correlation
    • Complex event processing & streaming
    • Content search
    • Data lake
    • Machine learning
    • Coherent architecture
    • Predictive analytics
  • Governance
    • Data quality certification
    • Process & data measurement
    • Data policies measured & enforced
    • Data exception management
    • Data accuracy & consistency
    • Data protection
    • Organizational process performance
  • Outcome
    • Data drives continuous business model innovation
    • Analytical insight optimizes business process
    • Insight in line with strategic business objectives
    • Information architecture underpins business strategies
    • Information governance as part of business processes

This concludes the governed self-service analytics series. Here are the key takeaways for governed self-service analytics:

  1. Enterprise self-service analytics deployment needs a strong governance process
  2. A partnership between business and IT is the foundation of good governance
  3. If you are in IT, give more trust to your business partners
  4. If you are on the business side, be a good citizen and follow the rules
  5. Community participation and a neighborhood watch are an important part of successful governance (a small example follows this list)
  6. The governance process evolves as your adoption grows
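
As an example of takeaway 5, a neighborhood watch can be as simple as a scheduled script that flags stale content for review. Here is a minimal sketch, again assuming the tableauserverclient library; the 180-day threshold, server URL, and credentials are placeholders of my own:

from datetime import datetime, timedelta, timezone
import tableauserverclient as TSC   # pip install tableauserverclient

STALE_AFTER = timedelta(days=180)   # placeholder threshold; pick your own

tableau_auth = TSC.TableauAuth('audit_user', 'secret', site_id='analytics')
server = TSC.Server('https://tableau.example.com', use_server_version=True)

with server.auth.sign_in(tableau_auth):
    now = datetime.now(timezone.utc)
    for wb in TSC.Pager(server.workbooks):
        updated = wb.updated_at
        if updated is not None and updated.tzinfo is None:
            # Normalize to UTC in case the server returns a naive timestamp.
            updated = updated.replace(tzinfo=timezone.utc)
        if updated is not None and now - updated > STALE_AFTER:
            print(f'{wb.project_name} / {wb.name}: last updated {updated:%Y-%m-%d}')

The output is a candidate list for the community to review and retire, which keeps the governed environment from silting up as adoption grows.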

Thank you for reading.
