Ashu's Blog – left, right, and all in between

Metrics Galore…
February 16, 2013

As more and more companies embark on backward-looking metrics to gauge performance, or forward-looking predictive analytics to make savvy business decisions, the debate about what to measure is almost always on in the C-suite. The importance of measurement in driving the right behavior is widely understood. Data translates into information, which finally morphs into knowledge or wisdom that the organization can use to create a sustainable competitive advantage. But before we explore why we need to measure and what we need to measure, it’s good to understand the different nuances of measurement systems:
  • A Measure is a quantitative indication of the extent, amount, dimension, capacity, or size of some attribute of a product or a process. It is a single data point (e.g., number of defects from a single product review).
  • Measurement is the act of determining a measure.
  • A Metric is a measure of the degree to which a system or process possesses a certain attribute.
  • An Indicator is a metric or series of metrics that provides insight into a process, project, or product.
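A toy sketch may help keep these terms straight: measures are raw data points, a metric derives an attribute from them, and an indicator is a metric (or series of metrics) tracked for insight. The numbers and field names below are invented for illustration:

```python
# Hypothetical illustration of measure -> measurement -> metric -> indicator.

# Measures: single data points -- defects found in individual product reviews
defects_per_review = [4, 7, 2, 5]      # each value produced by a measurement
kloc_reviewed = [1.0, 2.0, 0.5, 1.5]   # thousands of lines inspected per review

# Metric: degree to which the process possesses an attribute (defect density)
defect_density = sum(defects_per_review) / sum(kloc_reviewed)  # defects/KLOC

# Indicator: a series of metrics tracked review-by-review for process insight
indicator = [d / k for d, k in zip(defects_per_review, kloc_reviewed)]

print(round(defect_density, 2))  # 3.6
```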
  The use of metrics or scorecards should encompass the following objectives:
  • Verify achievement of deliverables associated with the initiative/project.
  • Behavior Modifier - Verify achievement of financial gains anticipated from the initiative/project.
  • Cause and Effect Relationships -Verify benefits achieved were a result of the efforts of that particular initiative/project.
  • Accountability for results - Make sponsors accountable for results within their areas.
  • Enable reuse of processes, models, etc. for future initiatives.
  Some good principles for designing these metrics:

  1. Restrict the number of KPIs at each organizational level (to about ten at Level 1)
  2. These should be linked to strategy
  3. The organizational structure should guide the KPI breakdown, supplemented by special “perspective” reports
  4. Selected KPIs must be valid, simple, measurable and controllable
  5. KPIs must be structured in a logical, mutually exclusive breakdown structure and should consolidate upwards
  6. Define clear and structured ownership of KPIs to avoid local optimization
  7. A high-quality KPI structure is crucial for organizational acceptance and needs to be prioritized during KPI design
  8. KPIs are designed to govern results at the group level; the governance culture must be in line with the governance structure on which the KPI design is based
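Principle 5 (a mutually exclusive breakdown that consolidates upwards) can be sketched in a few lines. The KPI names and figures here are invented, not from any real scorecard:

```python
# Minimal sketch of a KPI breakdown structure that consolidates upwards:
# each group-level KPI is the sum of mutually exclusive business-unit KPIs.
# All names and numbers below are hypothetical.

kpi_tree = {
    "group_revenue": {               # Level 1 KPI (group level)
        "transmission_revenue": 40,  # Level 2 KPIs (business units), in $M
        "distribution_revenue": 35,
        "services_revenue": 25,
    },
}

def consolidate(tree):
    """Roll each parent KPI up as the sum of its mutually exclusive children."""
    return {parent: sum(children.values()) for parent, children in tree.items()}

print(consolidate(kpi_tree))  # {'group_revenue': 100}
```

Because the children are mutually exclusive and collectively exhaustive, the rollup never double-counts, which is exactly what makes upward consolidation trustworthy.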


The high cost of bugs
September 13, 2012

The quality of IT applications is the backbone of most business processes today (whether customer-facing or internal), yet, according to Forrester Research, “one-third of business stakeholders are dissatisfied or very dissatisfied with the quality of their software.” Poorly tested and integrated software has a severe effect on a business’s customers, sales, partnerships, employees and financial bottom line. NIST’s conservative estimate of the cost of programming errors in component interoperability, in the U.S. capital facilities industry alone, is $15.8 billion per year; a primary driver of this high cost is fixing flaws in incorrect data exchanges between interoperating components. Software bugs cost the U.S. economy around $60 billion annually, or about 0.6% of gross domestic product, as per the Cost Analysis of Inadequate Interoperability in the U.S. Capital Facilities Industry, GCR 04-867. There are umpteen methodologies and automation mantras to get QA done in an effective and efficient manner. But the root of the issue lies in how the software is produced in the first place:


The picture above describes the interaction of holistic Quality Planning with the Quality Assurance and Quality Control disciplines and other areas. It consists of Test Planning, Test Execution and Test Management. As shown below, the key is to focus on:
  • Are we developing the right thing? – Requirements
  • Are we doing it correctly? – The Process
  • Are we using the right tools?


Project Managers and Release Managers should understand how the activities outlined in this document fit into the overall context of project and iterative workflows. It is the responsibility of Senior Management, Project Managers, Systems Project Managers, Release Managers and Software Quality Assurance to determine and ensure appropriate tailoring and any necessary compliance/deviations. A “common understanding” is critical to the successful analysis, design and implementation of a Software Quality Process that provides both Full and Iterative Life Cycle Testing and Quality; that is, testing across the entire application development or deployment lifecycle and within iterative development (and prototyping) cycles. Testing and Quality form a major process that starts at the beginning of a project (during the requirements phase) and is not considered complete until after successful product deployment, thus the phrase “Full Life Cycle Testing”. The following figure depicts an illustrative operational model of a QA testing organization:


Portals Galore
June 4, 2012

I was working recently with a government agency to move some of their public content onto a website. The questions that surfaced were around how much data to expose, how to present it, security issues, etc. That’s when one of the executives asked if this would be a portal, and whether they could provide other data. So, after Yahoo blessed us with the first mega portal a few years ago, we all know that a portal is a style of web application that integrates many different types of applications, content, and services into a flexible website framework. But what are the flavors that are used for different business purposes? These days, portal technology provides a set of basic services for:
  • Component based application development model via standards (portlet, widget, web part)
  • Real-time control for customization, page layout, look & feel
  • Fine-grained access control
  • Customization/Personalization of the components on the page. The terms “Customization” and “Personalization” tend to be used interchangeably. Customization is the ability to configure the user’s portal environment, typically without the development of code (customization is normally focused around look and feel, and page layout). Personalization is the targeting of information (data or content) to an individual or particular group of users based on what is known about the user
  Portals today are leveraged both internally and externally. For internal portals in the enterprise, the increasingly connected society is creating “Wired and Ready Workers”. There is a growing expectation that company applications will be available via laptop or mobile device. According to the Pew Internet & American Life Project report Networked Workers, Americans are using more and more information and communications technologies inside and outside the workplace. Among those who are employed, 62% could be considered “Networked Workers” who use the internet or email at their workplace. Also, 86% of employed Americans use the internet or email at least occasionally, 89% own a cell phone, and 81% have a personal or work email account. For external portals, success comes when we understand how to maintain a true dialogue with consumers via their preferred channels and technologies. As per a Gartner study, Companies Must Change Practices to Target Generation, “By 2015, more money will be spent marketing and selling to multiple anonymous online personas than marketing and selling offline. This transition in customer interaction is being driven by Generation Virtual, also known as ‘Generation V’.” “… Generation Virtual is not defined by age — or gender, social demographic or geography — but is based on demonstrated achievement, accomplishments and an increasing preference for the use of digital media channels to discover information, build knowledge and share insights.”
  • Companies should organize their products and services around multiple online personas.
  • Sell to the persona, not the person. A persona will show you how it wants to be treated.
  Many businesses haven’t fully realized the potential of integrating the business processes at the user interface layer. Portal technologies are already natural consumers of distributed computing technology and application integration technology.  “Traditional ways of selling to customers, based on demographic information, will become irrelevant in the online world, which has its own merit-based system using personas that conduct transactions and spread influence anonymously.”
  • By 2020, the average mainstream consumer will be spending 23% more time online than is the case today. Assume a 21% increase per person in the time spent using fixed/nomadic online applications, and a 46% increase per person in the time spent using mobile data applications. - Source: Gartner, “User Survey Analysis: Next-Generation Communications Consumers, Worldwide, 2020”, 17 Feb 2009
  • Young consumers are at the leading edge of fixed-to-mobile substitution of data applications but, in 2020, mainstream consumers will still get most of their news and information through broadcast rather than nonlinear channels. 1
  • Providers of third-party customer data, business intelligence (BI) and analytic tools will shift toward consumer applications, eventually arming companies with automated, artificial intelligence, self-learning "persona bots" to seek customers' needs and desires. Source: Gartner, “How 'Generation V' Will Change Your Business “, 3 Jan 2008
  Some of the recent Technology Influences are as below:
  • Emerging Cloud Computing models
  • Web 2.0/Enterprise 2.0
  • Rich Internet Applications (RIA) & AJAX
  • New standards for Portlet interoperability
  • Evolving Open Source portal alternatives
  Some of the recent Market Influences are as below:
  • Consolidating Vendor Market
  • Emerging SaaS Market
  • Collaboration and social networks
  • Mashups -> expectations of nimble development.
  • Renewed interest in multiple device delivery


Business Relationship Mngt
April 10, 2012

Working with folks who run data centers, or who are responsible for application management once the applications are built, one learns so much about operational efficiency. The concept of Service Management came about from the work done in ITIL. (The Information Technology Infrastructure Library is a registered trademark of the United Kingdom’s Office of Government Commerce. It is a set of concepts and practices for IT services management, development, and operations.) The goal of Business Relationship Management (BRM) is to move away from multiple points of contact for the business for services defined and measured in technical terms. Historically, the IT organization serviced multiple users across multiple business units with day-to-day requests as well as medium-term change management work. The traditional IT department was segmented chiefly along operational domains, in-house and/or third party (e.g., email, network, servers, etc.). This led to its services having an IT/IS internal focus, random change management disciplines, suboptimal IT/IS resource utilization, and poor cost control of service delivery. The Senior Business Relationship Manager role is accountable to the business for overall delivery of IS demand (projects and services) across a significant proportion of the business (e.g., Transmission, Distribution) and provides leadership for business relationship management within IS and the business as a whole, including leading a team focused on IS-business relationship management. The key reasons this role evolved were:
  • Operational/technology rather than business/service focus
    • Managers spent ~80% of their time on detailed Information Systems (IS) matters, mainly resolving incidents
    • Many IT folks regard SLAs as having low business impact
    • SLAs not aligned with business drivers
    • Potential ‘gap’ in the service control function
  • Role and responsibility confusion
    • Many business stakeholders used to rate IS relationship structure as unclear
    • Managers were leading incident resolution, because it is not clear who has this responsibility
  • Business leads lacked authority to drive business requirements into IS
    • Cannot always drive business requirements through IS
    • Processes often bypassed
    • Inability to have a ‘levers of cost’ dialogue in most cases
    The message from all these observations is:
  • Involve IS AND Business
  • Identify the right people in your organization
  • Deliver, deliver, deliver and build momentum


What is BI
December 14, 2010

As some sage once said, “The next best thing to knowing something is knowing where to find it.” The origins of Business Intelligence can be found in the way human cognition is used to make decisions. Whether we are faced with attending a conference, getting married, or investing in some back-office ERP system, we try to follow this process: when we are to make a decision, we first analyze the situation. We try to understand the characteristics of the circumstances and the environment. We then begin to evaluate our options. After some analysis we choose a certain course of action and act on it. Then we evaluate the results and see how the situation is changed by internal actions and external influences. Information is the result of processing, manipulating and organizing data in a way that adds knowledge to the receiver. Business Intelligence capabilities are built on exactly the same process:
  • What is happening? - Scorecards and Dashboards
  • Why did it happen? - Analytics
  • What will happen? - Forecasting
  • What do we want to happen? - Planning, Budgeting, Consolidation
  • What happened? - Reporting
    But this area of Information Technology has run into many problems because of the explosion of data and information. Falling storage costs are driving higher data volumes, and the types of data are increasing every day:
  • Transactional (structured) data
  • Office / Compound Documents
  • Fax, images, photos, graphics
  • Video and Audio
  • Web content
  • Structured  and unstructured data feeds
  The word “entropy” originated in thermodynamics, as a measure relating a system’s thermal energy to its temperature; eventually it came to be used as a measure of the disorder in a system. Like anything else in our universe, all forms of information decay when left unmanaged, and this is called Information Entropy. As time goes on in various organizations, the systems that measure and keep data lose context because:
  • Content loses its context
  • Data loses its meaning as metadata          
  • Structure remains un-enforced
  • Unstructured data is hard to find
  Tools and processes to link structured and unstructured data are inadequate, and the sources are many-fold as well:
  • Office automation & collaboration
  • LAN file systems
  • Databases, warehouses and marts
  • Email, Instant Messaging, Voice
  • Digital and video cameras
  • RFID, SCADA, telematics
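The entropy analogy has a precise counterpart in information theory: Shannon entropy, H = Σ p·log2(1/p), which measures the uncertainty (disorder) in a data source. A minimal sketch, with invented field values, shows how an unmanaged status field accumulates disorder:

```python
import math
from collections import Counter

def shannon_entropy(values):
    """H = sum(p * log2(1/p)) over the empirical distribution of values."""
    counts = Counter(values)
    n = len(values)
    return sum((c / n) * math.log2(n / c) for c in counts.values())

# A well-curated field (one consistent code) carries no uncertainty...
print(shannon_entropy(["ACTIVE"] * 8))                         # 0.0
# ...while the same field with drifting, unmanaged codes carries 2 bits
print(shannon_entropy(["ACTIVE", "active", "A", "Act."] * 2))  # 2.0
```

The second print is higher because four equally likely variants of the same value need two bits each to distinguish, which is exactly the kind of lost context the bullet list above describes.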
  The need for Information Architecture is established if one sees the signs of Information Entropy in the organization:
  • The company has multiple analytics, portals, content management, and data integration applications
  • Metrics and benefits are difficult to define and baseline
  • The company is still measuring (if at all) availability and reliability at the information silo level – data marts, databases, applications, etc.
  • The company has not defined and implemented a comprehensive information management application model aligning information criticality, data retention, and speed of access with price/performance and usage of existing software solutions
  • Ownership of data is reluctant – data is often owned by IT instead of business units


Electronic Medical Records
October 12, 2010

I have been reading about Google Health and have wondered about data quality and its significance for Electronic Medical Records in the medical/healthcare industry. A story in The Boston Globe raises some questions: a misleading informatics approach led a kidney cancer survivor to read on Google that his cancer had spread to either his brain or spine. He was not aware of this, and when he dug into it, the information turned out to be erroneous. There has been a lot of recent debate about this with a new Government in place, but I guess it remains to be seen in which direction, and how quickly, we will begin to make progress. A recent article in Health Affairs addresses the key issues of Patient Health Records vividly. The core problem in this industry is that the multitude of health care information systems are discrete and do not communicate with one another. In addition, vast bodies of medical knowledge and data do not exist in an electronic format that is usable by any decision support system. One of the ANSI standards for the healthcare arena, Health Level Seven, offers some promise. But the number of standards is mind-boggling, some of them as below:
  • NCPDP – Messaging Standard for Drug Ordering
  • IEEE1073 – Messaging Standard for Medical devices
  • DICOM – Messaging Standard for Imaging
  • LOINC – Vocabulary Standards for Laboratory Result Names
  • RxNORM/NDF-RT – Vocabulary Standards for Clinical Drug Description & Drug Classification
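To give a flavor of what an HL7 v2-style message looks like on the wire, here is a minimal sketch: segments separated by carriage returns, fields by pipes. The message content (hospital, patient, lab value) is entirely invented for illustration:

```python
# Minimal sketch of the pipe-delimited segment structure used by HL7 v2
# messaging. All identifiers and values below are hypothetical.

hl7_message = (
    "MSH|^~\\&|LAB|GENERAL_HOSP|EHR|CLINIC|20100801120000||ORU^R01|MSG001|P|2.5\r"
    "PID|1||123456^^^HOSP^MR||DOE^JOHN\r"
    "OBX|1|NM|2345-7^GLUCOSE^LN||95|mg/dL|70-105|N\r"
)

def parse_segments(message):
    """Split an HL7 v2-style message into {segment_id: [field lists]}."""
    segments = {}
    for line in message.strip().split("\r"):
        fields = line.split("|")
        segments.setdefault(fields[0], []).append(fields[1:])
    return segments

parsed = parse_segments(hl7_message)
print(parsed["OBX"][0][4])  # '95' -- the observed result value
```

Even this toy example hints at the interoperability problem: the structure only carries meaning if sender and receiver agree on vocabularies like LOINC (the `2345-7^GLUCOSE^LN` coding above).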
 Being such a Lean Service Operations person, I believe successful development of consumer-centric care models will require cross-industry collaboration to close key gaps across financing and delivery systems.


The EPM story continues…
August 16, 2010

Thanks for all the feedback I received. I guess on almost every EPM initiative the core questions lie around what metrics to use: “How much?”, “How many?”. Before we begin thinking in those terms, it is good to analyze why metrics are really needed. I discussed this in another article, Data Rich, Information Poor? Focus on the Right Metrics, and I still believe that the mantra of all EPM efforts is what Six Sigma practitioners get their brains tattooed with:
  • If you can't measure something, you really don't know much about it.
  • If you don't know much about it, you can't control it.
  • If you can't control it, you are at the mercy of chance.
  The use of metrics/scorecards should encompass the following objectives:
  • Verify achievement of deliverables associated with the initiative/project
  • Behavior Modifier - Verify achievement of financial gains anticipated from the initiative/project
  • Cause and Effect Relationships - Verify benefits achieved were a result of that particular initiative/project efforts
  • Accountability for results - Make sponsors accountable for results within their areas
  • Enable reuse of process, models, etc. for future initiatives
  Whether one executes the decision-enabling processes through balanced scorecards, dashboards or reports, the focus is on getting the right information to the right people at the right time. Like any systems implementation, the execution of reporting metrics is a trade-off of ease of capture vs. value delivered. But your BI implementation box should have all the facets/dimensions:


Enterprise Performance Management - EPM 101
August 8, 2010

Having been in the BI/DW/EPM industry for so long, I keep reading with avid interest statements like “Less than 10% of strategies effectively formulated are effectively executed” from Fortune magazine. Business executives at all levels constantly struggle to understand what drives performance and how to boost it. In all my discussions with executives, they raise the typical challenges of Performance Management:
  • How to get our strategic plan out of our executive office?
  • Are the right metrics and drivers in place to support the execution of our strategy and deliver the desired financial results?
  • Are our employees completely focused on reaching our strategic goals?
  • Are we leveraging the information in our systems to enhance decision making?
  • Do we provide our employees with clear direction to guide their decision making?
It really gets one thinking about the Effectiveness and Efficiency of any business initiative. Effectiveness typically means enabling a consistent process and framework for the evaluation of decision trade-offs (current/future) around investments, focusing management on the key drivers of value. Efficiency is all about streamlining decision making, planning and reporting processes consistently using key drivers of value; it improves the focus of resource time and effort, eliminates redundant work, and minimizes manual intervention and errors. I guess this has been a debate since someone started the concept of trade. We have been reading and doing business and defining metrics: what gets measured gets done. But like most strategies, the real effectiveness of EPM starts by holistically combining the play of Processes, People, and Tools. Since processes enable the other assets, a “leading practice” Planning, Budgeting and Forecasting process model encompasses five integrated processes:
  • Strategic Planning - Development of a long-term plan aimed at establishing the organization’s strategic positioning and driving value creation over and above competitors.
  • Target Setting & Operational Planning - At any level in the organization, the mix of controllable financial and non-financial measures will vary. It is good to use driver-based performance models to break financial targets down into operational targets and milestones, to ensure targets are aligned to operational goals. Using a shareholder value tree helps in this exercise.
  • Budget Creation & Approval - Translate and express targets in quantitative terms. Define the expected financial performance of the divisions and business units.
  • Forecasting - Predict outcomes periodically throughout the year to reflect changes that have occurred in both the internal and external environment since the budget was developed. Provide more accurate and timely information for better and less risky management planning and decision making.
  • Performance Management - Provision of performance inputs to inform planning and budgeting activity. Ongoing measurement of performance to enable identification of continuous improvement opportunities and response to gaps to plan.
An effective EPM strategy begins with understanding the stakeholders who will receive the information and what they would like to know. Of course, the number of metrics should be restricted so they are effective, and they should be well defined. Enough consideration has to be given to establishing targets for each metric. And once reported, action-based metrics should provide the requisite feedback for further action, investigation, or re-planning. Any metric that is being produced on an ongoing basis should also be revisited periodically. In some cases, once a metric is used and drives a positive change in the organization, the metric has done its job and may no longer need to be generated (or may be generated less frequently than monthly). In other cases new metrics will be added, and in some cases targets could be re-set for existing metrics. The typical systems issues I see are in the areas of wrong metrics, information overload, cost of data quality, lack of integrated information systems, and/or wrong incentives. From a tools perspective, it is important to be able to support the following. I guess this one from Scott Adams is the essence of EPM (from the official Dilbert website):


Information Supply Chain - Turning Data into Action
April 20, 2010

The way to think of information is as the classic assembly line in the manufacturing industry. Some data is received from outside the firm (suppliers), some data is produced from within; it is organized (metadata, taxonomy) for storage somewhere, and it is distributed (aka reporting, etc.) to end users (customers) for consumption. So information architecture includes reporting, portals, content management, and data integration applications (BI/DW) with the following components:
  Store
  • Data Sourcing and Ownership (as someone said, “The next best thing to knowing something is knowing where to find it.”)
  • Data Quality / Hygiene
  • RDBMS Architecture
  • Data Warehouse / Data Mart design
  • Metadata / Master Data Management
  • Taxonomy
  • Extract/Transform/Load (ETL) Tools/Techniques
  • Data Flows (within and between applications)
  Access (knowing what you need to know, when you need it, and how to act upon it)
  • Reporting
  • Analytics
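The assembly-line metaphor maps directly onto the extract-transform-load pattern listed under Store. A toy sketch, with invented source records and field names:

```python
# A toy extract-transform-load pass over the information "assembly line":
# raw supplier data in, cleansed and organized data out. All records and
# field names below are hypothetical.

raw_supplier_feed = [                          # extract: data from "suppliers"
    {"cust": " Acme Corp ", "rev": "1200.50"},
    {"cust": "Globex",      "rev": "980.00"},
    {"cust": " Acme Corp ", "rev": "300.00"},
]

def transform(records):
    """Hygiene step: trim names, coerce types (the Data Quality component)."""
    return [{"customer": r["cust"].strip(), "revenue": float(r["rev"])}
            for r in records]

def load(records):
    """Organize for storage: aggregate into a tiny 'data mart' by customer."""
    mart = {}
    for r in records:
        mart[r["customer"]] = mart.get(r["customer"], 0.0) + r["revenue"]
    return mart

mart = load(transform(raw_supplier_feed))
print(mart)  # {'Acme Corp': 1500.5, 'Globex': 980.0}
```

The reporting and analytics components under Access would then consume `mart` rather than the raw feed, which is the whole point of the supply chain: downstream consumers never see unwashed data.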
Even more so, these days business performance remains critically dependent upon timely, accurate information sources and supporting analytic capabilities. Regulatory and legal liability concerns have increased corporate concern with control over their information assets. Increasing interconnectivity is driving up the volume of information that needs to be managed. The need for Information Architecture is established if one sees the signs of Information Entropy in the organization:
  • The company has multiple analytics, portals, content management, and data integration applications
  • Metrics and benefits are difficult to define and baseline
  • The company is still measuring (if at all) availability and reliability at the information silo level – data marts, databases, applications, etc.
  • The company has not defined and implemented a comprehensive information management application model aligning information criticality, data retention, and speed of access with price/performance and usage of existing software solutions
  • Ownership of data is reluctant – data is often owned by IT instead of business units




© 2012 Ashu Bhatia Powered by Brown Books Digital