Ashu's Blog – left, right, and all in between

Entertain me...
June 21, 2015

According to the World Travel & Tourism Council (WTTC), the Entertainment and Hospitality industry is an $8 trillion industry today and will grow to $12 trillion by 2020. It is also a very accurate indicator of the global economy – the canary in the coal mine. If the global economy is down for any reason, E&H falls steeper, deeper, and almost in real time – a true barometer of the global economy. It’s typically broken into 3 sub-segments:
  1. Travel & Leisure – parks and resorts, cruise lines, car rentals, airlines, and the intermediaries in between
  2. Lodging – vacation clubs, hotel chains, etc.
  3. Media & Entertainment – content producers, content aggregators, digital rights management companies, TV distributors, etc.
  All of them are changing really fast – with the onslaught of social media and a truly digital lifestyle, this industry is seeing some paradigm shifts. Every customer touch point is evolving. Let’s look at some examples:
  • You look at a poster of the Bahamas. Right there is a QR code for you to see the top 3 hotels, car services, and sea excursions.
  • A website like TripIt – once your itinerary is up, they will email you a day before to check in. If there is a volcanic eruption in Iceland, they will show you two hours before the flight that it is cancelled, with options for another flight right there. No running to gates with 300 other passengers and trying to use your Elite status to get another seat home.
  • They offer a travel diary or “Traviary” – for before, during, and after pictures/comments
  • When you land after a certain flight – Hertz sends you a message/ email that your car is in a certain lane.
  • Looking at online behavior itself, 80% of business can be lost in less than 5-7 seconds. If your mobile app or website takes more than 5 seconds to respond, people typically hit the BACK button. After 2 more attempts you have typically lost the business.
  So all these customer touch points are dictated by only one word – EXPERIENCE. That is what the process and technology platforms have to deliver, enabling the human interacting with the customer. And that EXPERIENCE starts much BEFORE the traveler walks into the property.
[Figure: Customer experience touch points]
As an example, take cruise line customer touch points, looking at this through the customer’s eyes:
Segmenting the customers – Seniors, Boomers, Families, Couples, Singles, etc. – and the unique experience each segment seeks is key. E.g., Singles can find information in print media – men’s magazines (GQ, Cosmo, etc.), travel magazines (Condé Nast, etc.). The consumer is
  1. more knowledgeable, with internet and mobile accessibility, online consumer-created content, and social networks. Everyone who books a vacation looks at TripAdvisor, Expedia, etc. for ideas, feedback, and reviews. Price and convenience are essential differentiators for most consumers.
  2. more empowered, with unique requirements and a demand for more self-service and multi-channel options. You go online the moment your friend tells you something. The speed, volume, and transparency of internet travel distribution have put new pressure on most companies to re-calibrate their value chains.
  3. more diverse, given global and local influences; the consumer is very individualistic and demands more convenience. Everyone looks for specific personalization and destination flexibility – being data-driven here is key for companies to compete.
    [Figure: Hospitality]


Sentimental Customer – Analyze This!
April 26, 2015

I was talking to an executive of a famous cruise ship company. He was flying from South Florida to Seattle and was, of course, dreading the flight as it is. On top of that his flight was cancelled, and he got bumped down to a middle seat on a much later flight. At that point he started tweeting to some of his friends about his horrible experience. After some time a flight attendant came up to him, apologized, and offered a free ticket for another trip. He naturally appreciated the gesture, and it set him thinking whether his tweets were being tracked. He is now driving a whole new CRM effort within his own company – he knows the cruise customer is on board for 4-7 days, and there are more upselling opportunities if the company can listen and act better.
  [Figure: Sentiment Analysis 1]
At the core of any sales and marketing effort, there are really only 3 pieces:
  • Customer acquisition - Increase revenue and market share
  • Customer Insight & Innovation - Create, improve and differentiate products and services
  • Customer Experience & Service - Improve customer loyalty and reduce or avoid costs
  In today’s world of connected consumers, the Customer Insight discipline is especially important in a B2C setting. It is practiced through focus groups, buzz monitoring, community tracking, sentiment analysis, market research, and pure competitive intelligence.
The social media phenomenon has led to a new way of business engagement through a revolutionary set of constituent interaction channels. These technology platforms (Twitter, Facebook, etc.) give everyone an expanded ‘voice of the customer’ and the ability to exercise influence in interactions with companies and peers.
  • User-generated content is often triggered by emotion
  • This can be amplified via the “viral effect”
  • Impact cannot be stopped or undone
  • It can force companies to act in shorter cycle times
  • The scale can be fast and truly global
  Earlier this is what happened:
  • You talked to your customers, told them what to do
  • 95% happy customers is good
  • You wanted them to come to you
  Now this is what happens and companies need to adjust ASAP:
  • Consumers are talking among themselves; it’s time to listen
  • 5% unhappy customers is bad
  • You go to where they hang out
  The sewing circle of colleagues or clique of friends that you tell about a good doctor or a great sale has expanded to be truly global with platforms like Twitter, Facebook, etc.
1.      LISTEN   Companies need to be listening on platforms like Blogs, Open Micro-Blog (Twitter), Closed Micro-Blogs (Yammer), Open Social Network (MySpace, WhatsApp), Closed Social Network (Facebook), Commentable User Generated Content (YouTube), Discussion Forums (Ning, TripAdvisor), etc.  
The listening part has many technologies like claraBridge, Attensity, Radian 6, Overtone, SAS, etc. Some companies are using Natural Language Processing (NLP) to understand syntax & context. As an example please see sentence below posted somewhere:
  [Figure: Sentiment Analysis 3]
One can use advanced linguistics to understand topics and sentiment: “The EFTPS enrollment for my tax return was easy, but it took too long to get the confirmation package.” So the category sentiment is:
= Positive for “enrollment”
= Negative for “timeliness”
One customer verbatim can result in multiple categories within multiple taxonomies.
E.g. “After being a retired Army General and a FSCO member for 38 years, I am quite upset that a service representative would treat me so rudely over the phone when I called in to complain about the fact that you guys unexpectedly raised my auto premiums at the same time Geico is calling me offering me a major discount.”
Brand-Related
  • Core Values > Service
  • Core Values > Loyalty
  • Core Competency >  Deliver Exceptional Customer Experiences
  • Eligibility > Retiree
  • Rank > Officer > General
  • Service > Army
  • Tenure> Greater than 30 years
  • Premium Increase
  • Products > P&C > Auto-Insurance
  Customer Experience-Related
  • Moments of Truth > Rate Increase Notification
  • P&C/Auto Insurance / Geico
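A toy version of this category-and-sentiment tagging can be sketched with simple keyword rules. The category terms and sentiment lexicons below are made up for illustration; they are not from any real NLP product:

```python
import re

# Illustrative sentiment lexicons (real tools use far richer linguistics)
POSITIVE = {"easy", "great", "helpful", "fast"}
NEGATIVE = {"too long", "slow", "rude", "upset"}

# Illustrative taxonomy: category -> trigger terms
CATEGORIES = {
    "enrollment": {"enrollment", "enroll"},
    "timeliness": {"long", "wait", "confirmation"},
}

def clause_sentiment(clause):
    # negative cues win ties, since complaints are the costlier signal
    text = clause.lower()
    if any(term in text for term in NEGATIVE):
        return "negative"
    if any(term in text for term in POSITIVE):
        return "positive"
    return "neutral"

def categorize(verbatim):
    # split into clauses on commas and "but", then tag each clause
    results = {}
    for clause in re.split(r",|\bbut\b", verbatim):
        text = clause.lower()
        for category, terms in CATEGORIES.items():
            if any(term in text for term in terms):
                results[category] = clause_sentiment(clause)
    return results
```

Running this on the EFTPS sentence from earlier yields positive for “enrollment” and negative for “timeliness” – the same shape of output (category → polarity) that commercial tools produce.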
    2.      ACT   The tough part of all this is to separate the signal from the noise and do something when it’s really important. The prioritization criteria for such an action are typically based on:
  • # of customers impacted
  • Severity of impact to customer experience (i.e. moment of truth)
  • Severity of potential impact (e.g. legal liability)
  • Level of control to resolve
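Those criteria boil down to a weighted score. A minimal sketch, assuming each criterion is rated 0-10 and with purely illustrative weights:

```python
# Illustrative weights for the four prioritization criteria above
WEIGHTS = {
    "customers_impacted": 0.4,   # number of customers impacted
    "experience_severity": 0.3,  # damage to a moment of truth
    "potential_impact": 0.2,     # e.g. legal liability
    "resolvability": 0.1,        # level of control to resolve
}

def priority_score(issue):
    # issue: dict of criterion -> 0-10 rating; higher score = act sooner
    return sum(weight * issue[name] for name, weight in WEIGHTS.items())
```

Scoring each incoming social-media issue this way gives the ACT step a consistent queue instead of whoever shouts loudest.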
      [Figure: Sentiment Analysis 2]


Show me the Money - Business Value Realization
February 17, 2015

When it comes to Information Technology (IT) everyone knows the different questions the executives are interested in:  
  • How can I better use IT to deliver shareholder value?
  • How can I use IT to deliver on our business strategy?
  • Am I spending the right amount on IT?
  • How can IT reduce the costs of operation?
  • Are there better ways to use technology to become more effective?
  • Where should I focus my attention for new capabilities?
  • How do we get a return on our invested IT capital?
  • How do I measure the value and results from IT?
  • What IT assets do I need to own?
  The term “Value” is used so much in business and never completely understood. What is value, anyway? Value at the highest level has 3 components:
  • How much cash will the company generate? How much cash injection will it need?
  • How certain are the cash generation and the realization of the investment?
  • When will cash be generated? When will value be harvested?
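Those three components are exactly what a net-present-value calculation captures: the cash amounts, the timing via discounting, and the certainty via the discount rate (which can embed a risk premium). A minimal sketch:

```python
def npv(cashflows, rate):
    # cashflows[0] is at time zero (e.g. the upfront IT investment,
    # negative); later entries are one period apart and get discounted.
    # A riskier project warrants a higher rate.
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))
```

For example, a project costing 100 up front and returning 60 in each of the next two years has a positive NPV at a 10% discount rate – worth doing, but only barely.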
[Figure: Show Me the Moolah 1]
Measuring and communicating the business value of IT remains one of the biggest challenges for CIOs. The major challenges in assessing business value of IT are:  
  1. Difficult to separate value achieved from other improvements (organizational, process redesign, etc.)
  2. Measuring value is often subjective
  3. Practices for monitoring/inspecting and communicating value are often underappreciated
    To really understand what a project’s value is to the firm – and whether it ever delivered that value – some focus is needed on:
  • Value Identification - Identifies what the sources of value are from IT, that can be used to deliver business benefit
  • Value Capturing - Identifies to what extent new IT initiatives are explicitly tracked and benefit realization for the initiatives
  • Value Delivery - Formalizes the expectations set between the business and IT, both in terms of value and relationship
  • Value Measurement - This capability defines how the success of an IT organization will be measured
  Value should be measured against business capability measures – those that can be directly quantified into financial measures. Measures should be developed jointly by the business and IT, then aligned with business capabilities. For business cases, benefits should be quantified and aligned with the capability measures for the impacted capabilities. Value Management is a journey and becomes an exercise in Change Management. The components to do this are:
  1. A business case framework to define and quantify benefits captured from investments (Templates/ Model)
  2. A methodology to measure benefit realization against management expectations for investments and report performance (Process)
  3. A sustainable discipline to enable better decisions in terms of future project prioritization and investments (Governance)
    [Figure: Show Me the Moolah 2]
Some of the best practices:  
  • Think of the Shareholder Value Tree for your project- always try to get a helicopter view of the situation
  • Value should be measured against business capability measures, those that can be directly quantified into financial measures. For business cases, benefits should be quantified and aligned with the capability measures for the impacted capabilities
  • Incorporate Benefits Realization as a work stream. Include checkpoints in project lifecycle for monitoring value during implementation
  • Partner with the business to discuss value in the language that stakeholders want and then drive business results
  • Define the right metrics you need to track:
    • Behavior Modifier — Aligns employees with the IT organization’s goals and objectives in a manner that motivates employees and influences desired behaviors
    • Accountability for Results — Holds managers accountable for results and forces them to direct value to the business
    • Performance Orientation — Shows what the performance of the IT organization “has been” and “where it is headed” – and where to focus management attention. Shifts focus from reactive to proactive management
    • Vision Connected — Quantifiable statement of the IT organization’s “to-be” state or vision


Have you ever sold an old belonging - basics of Revenue Management
January 12, 2015

Have you ever sold anything? Old books, your bike, your car, or even your house? Whether you did it at a garage sale, on eBay, or with even more advanced technology like Zillow, what did you feel right after the sale? “I could have raised the price another 10% and the dude would have still bought it.” “Dang! Why did I throw in my comics for free with the sale?” Or “Oh my god, I didn’t sell this bike just because I didn’t reduce my price by $10. It’d have been good riddance.”   Well, that’s just the nature of sales and pricing. Now imagine the plethora of players in the value chain of an industry like Entertainment & Hospitality:
  • Distributors like Kayak, Expedia, etc.
  • GDS’s like Worldspan, Sabre, Amadeus
  • Operators like Contiki, Pleasant Holidays, etc.
  • Consolidators like Pegasus, Trust, etc.
  The consumer is willing to pay only so much depending on the product, the season, her expectations, and different bundlings of products. But all these players lead to many types of rates for the same room: Published, Negotiated, Packaged (Expedia), Opaque (e.g. LastMinute, Priceline), and Restricted. Of course, like any pricing strategy, there is a stark dependency on many variables: demand patterns, channels, competition, etc. Imagine a hotel or vacation club that offers rooms – different destinations, different experiences (Jacuzzi, etc.), services (a jet ski tour thrown in), security, additional room types, etc.
The basic problem is as depicted below:
[Figure: Revenue Management 5]
The only options management has are:
  • reject demand
  • inventory excess demand (queueing)
  • modulate capacity (add facilities, scheduling, resource allocation)
  • modulate demand (pricing, yield management)
  In the Hospitality industry there is a detailed science behind this pricing and demand fulfillment, aimed at maximizing revenue. It’s called Revenue Management or Yield Management. The idea is two-fold: have a market segmentation strategy (capture consumer surplus) and match price to demand (peak-load pricing). So airlines sell first class, business class, economy class, etc.; hotels create suites, single rooms, double rooms, etc. and price the products differently. Intelligently allocate fixed capacity to products: allocate more capacity to low price points if demand is weak, and more capacity to high price points if demand is strong. Ever wondered why flights are always overbooked and oversold? Hotels, travel agencies, airlines, vacation clubs, car rentals, theatres, sporting venues, and other industry players are even unlocking the power of Big Data to enhance revenue management. It is a cycle of activities that combines a lot of science with the art of pricing. Providers recognize that data analytics help establish the optimal price for their products – the right price to the right customer at the right time. The basic algorithms are as below:
1. Segmentation/product design – discriminate (sort) customers based on their actual willingness-to-pay (reservation price). Since the willingness to pay is tough to find for all customers, they try to find a variable that is correlated with willingness-to-pay (a “sorting mechanism”). Advance Purchases are encouraged.  
2. Forecasting – factor in seasonality, trends, truncation (reservations accepted vs. potential demand), special events, promotions, etc.
  • perfect forecasts (deterministic)
  • uncertain forecasts (stochastic)
3. Capacity Allocation - evaluate the opportunity cost (displacement cost/bid prices) of using resources required to meet current demand. So accept current request if Revenue is greater than Displacement Cost. Mostly companies have to rank demand from highest revenue to lowest  
4. Control – Try to control demand and prices by ad hoc negotiation, “posting” remaining availability (reservation system booking limits), open/closed status indicators, bid prices/hurdle rates, etc. Ultimately, it boils down to an accept/deny decision for each service request.   This is where statistical tools and big data help. An example of a tree function for marginal revenue analyses for overbooking a flight for example is as below:
 [Figure: Revenue Management 4 – marginal analysis tree for overbooking]
C – capacity of the flight
p – probability that a reservation shows up
r – revenue from booking a seat
s – cost (free flight, goodwill) of a denied boarding
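With those four inputs, the overbooking decision can be sketched as a small expected-profit search. This is a toy model that assumes show-ups are independent (so the number of show-ups is binomial) and the bump cost is a flat s per denied boarding:

```python
from math import comb

def expected_profit(B, C, p, r, s):
    """Expected profit of accepting B bookings on a C-seat flight."""
    # expected number of passengers bumped: show-ups beyond capacity
    exp_bumped = sum(
        comb(B, k) * p**k * (1 - p) ** (B - k) * (k - C)
        for k in range(C + 1, B + 1)
    )
    return r * B - s * exp_bumped

def best_booking_limit(C, p, r, s, max_over=30):
    # search booking limits from C (no overbooking) up to C + max_over
    return max(
        range(C, C + max_over + 1),
        key=lambda B: expected_profit(B, C, p, r, s),
    )
```

With, say, a 100-seat flight, a 90% show-up rate, $300 per seat sold, and a $600 bump cost, the search lands on a booking limit comfortably above capacity – which is exactly why flights are routinely oversold.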
[Figure: Revenue Management 2]
In a nutshell the business can be described as:
Demand Creation is           …     Marketing
Demand Capture is            …     Sales
Demand Management is   …    Revenue Management  


CRM basics from Caribbean street performers
January 3, 2015

Happy New Year to all my readers!
Like all northern folks who look to get a winter break in the warm waters of the Caribbean, we just returned after savoring our piece of heaven. There were hordes and hordes of tourists – from cruise ships, flocking in from airports, from all parts of the world.
On one of the beaches they had street performers doing all sorts of things – swallowing swords, eating fire, other pyrotechnics, etc. The way some of the performers conducted themselves to get the spectators to throw in money into their TIPS jar was nothing short of CRM basics 101.
One of the street performers – a man juggling bowling pins on a unicycle while also balancing pyrotechnics with his mouth – was amazing. He focused on the core CRM processes:
  1. Generate demand – He stood adjacent to the arrival of cruise locations and had flames lit up in the sunset-laden beach area to get everyone to be curious enough to swing by
  2. Engage customers – When he started his juggling act with bowling pins, he invited a kid standing in the front row to come up and throw him the pins one by one while he balanced on the unicycle; he talked to the kid quite a bit and complimented him on his throws.
  3. Acquire new customers – He would call out to oncoming folks and passersby. He chose a location that made it easy for folks to stop by as they enjoyed their ice creams or lemonades.
  4. Service customers – He brought stickers for kids and lured them in to watch his flame and juggling acts.
  5. Deal with Globalization – He spoke a few Mandarin words to the Chinese travelers – “Ni hao”. Asked a kid and heard he was Italian and mentioned greetings in Italian.
  6. Embrace the viral effect of Social CRM – He encouraged people making videos and taking pictures to send them to friends through Facebook or even YouTube.
As someone said, “Customer service is not a department, it’s an attitude”. The science behind it – the framework for such capabilities – has three main components:
1.     Insight-driven Marketing
  Esther Dyson summed it up in the magazine strategy+business (December 2009): “People spend a lot of time online not looking for something, or at least not for something that can be bought or sold. Marketers need to understand that the Web is not about them; it’s about us. Marketers and media sites keep thinking, ‘Well, if we can only tweak our banner ads right, we can get the same success rate as Google.’ But they can’t, because a banner ad is usually shown to someone who is not looking for the item advertised.”

2.     Customer Segmentation and Targeting  
This is the ability to classify or cluster customers/prospects based on certain business rules or inherent customer data behaviors and buying patterns. It is made possible through customer insight, data mining, segmentation, and prognosis. The key to creating customer segmentation and targeting the right customers is to have adequate insight and to drive interactions with customers per that insight.
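Rule-based segmentation of this kind is often built on recency/frequency/monetary (RFM) style attributes. A toy sketch – the thresholds and segment names below are made up for illustration, not from any real program:

```python
def segment(customer):
    # customer: dict with recency (days since last purchase),
    # frequency (purchases per year), monetary (annual spend)
    r = customer["recency"]
    f = customer["frequency"]
    m = customer["monetary"]
    if r <= 30 and f >= 10 and m >= 1000:
        return "champion"      # recent, frequent, high-value
    if r <= 90 and f >= 3:
        return "loyal"         # active and repeat-buying
    if r > 365:
        return "lapsed"        # no purchase in over a year
    return "occasional"
```

Once customers fall into segments like these, each segment can be targeted with its own offers and channels.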
3.     Customer Touch Point Transformation
  These days customers interact with companies at many touch points—call centers, online, mobile apps, point of sale in the case of the retail industry, etc. In order to offer a complete and holistic experience for the customer, the company should look at touch point transformation at every level and have a well-integrated contact-management system.
[Figure: Contact center]
With this basic framework in mind, one has to recognize that some recent global phenomena make this an even bigger mandate:
  • Exponential expansion of media options makes targeting consumers far more complex
  • “Cash-rich, time-poor” consumers are demanding more relevant offerings, experiences and communication
  • Consumers are far more technology-savvy and more active in controlling the consumption cycle
  • Demographic and social changes are creating a more diverse, fragmented consumer base and buyer values
  • Products/services, stores and messages are proliferating and becoming increasingly commoditized


Master Data Management
December 2, 2014

Master Data are the fundamental business data in the company, typically long-lived and used across multiple applications, inherently including the subset of Master Reference Data.

Master Data, including Reference Data, is not to be considered “Metadata”. Metadata is data about data: it describes data content but it is NOT the content. There is no formal and universal definition of how deep to take the definition from a content perspective. Only those metadata whose management will bring more value to the enterprise than the cost of the labor needed to create and maintain them should be managed and integrated formally. For example, “Master Data” definitions (customers, suppliers, products, organization, etc.), which are the most ever-present and shared data across an enterprise, will be most critical.
  [Figure: MDM 4]
    Master Reference Data are used to understand, navigate and query information based on or related to the Core Master Data from various business level and/or user level perspectives.  
  [Figure: MDM 2]
      Value Proposition of MDM:  
  • Enables 360º view of the enterprise and corporate performance management improvement.
  • Provides competitive advantage by enabling customer behavior insight & predictive modeling.
  • Improves the forecast management through more effective logistics and inventory control.
  • Reduces cost of manual master data reconciliation & alignment efforts, error fixing, etc.
  • Reduces the data redundancy cost by consolidating & eliminating duplicate masters (DB/apps).
  • Reduces application development & maintenance costs, by having clear master data interfaces.
  • Improves the decision quality by securing reliable, high quality master data “just in time”.
  • Improves the availability of key data – speed of access, data timeliness, common (user) support.
  • Eliminates/prevents redundant & non-coordinated master data activities within the company.
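The “consolidating & eliminating duplicate masters” point can be sketched as a toy survivorship rule: key each record on normalized attributes and keep the freshest copy as the golden record. The field names and last-update-wins rule below are illustrative, not from any real MDM product:

```python
def consolidate(records):
    # records: list of dicts with "name", "email", "updated" (a version
    # or timestamp). Key on normalized name + email; the most recently
    # updated record for each key survives as the golden record.
    golden = {}
    for rec in records:
        key = (rec["name"].strip().lower(), rec["email"].strip().lower())
        if key not in golden or rec["updated"] > golden[key]["updated"]:
            golden[key] = rec
    return list(golden.values())
```

Real MDM hubs use fuzzy matching and field-level survivorship, but the core mechanic – match, then merge to one master – is the same.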
Executing components of the MDM Strategy
[Figure: MDM 3]
  • Data Governance
    • Key roles
    • Desire to roll out data governance to other areas of the business
  • Data Quality
    • Incorporating the DQ approach into the EDW instance consolidation project
  • Process Improvement


Competitive Intelligence using Analytics
November 1, 2014

Having spent so much time in the Entertainment and Hospitality industry, I see the key systems everywhere being CRM and Revenue Management systems. Yield Management is a market segmentation strategy to capture the consumer surplus, widely used in the Entertainment and Hospitality industry. Airlines, hotels, theme parks, car rentals, cruise ships, broadcasting (TV, radio), and utilities (telecom, electricity) have always used segmentation and peak-load pricing. But with advances in computing power, this technique can now run intensive, data-heavy calculations with linear programming to optimize revenue. Just as airlines realized long ago that a flight flying with an empty seat is forgone revenue, many companies use historical data to optimize differential revenue gains. American Airlines added $1.4B in additional revenue over a three-year period in the early 1990s. Hertz added 1-5% revenue annually, and so did companies like Marriott Hotels, Royal Caribbean Cruise Line, etc.

Computer algorithms can use variables like time of purchase/usage (advance/spot purchase, day of week/season), purchase restrictions (cancellation options, minimum term, Saturday night stay), purchase volume (individual vs. group), and duration of usage (single night/weekly rate) to get the right price to the right customer. Next time you are stuck at an airport and an airline mentions an overbooked flight, know that some computer algorithm was doing marginal analysis based on the capacity of the flight, the typical cancellation rate, the revenue from booking a seat, and the cost of a denied boarding (these days, under pressure, airlines also factor in costs like loss of goodwill along with the free flights they give you). Other analytic systems exist within an organization, depending upon the goals of the business or its units:
  • Monte Carlo Simulation – This technique uses data to establish a pattern over a domain of possible inputs. The calculations generate inputs randomly from a probability distribution over the domain, perform computations on these, and aggregate the results. This tries to minimize reinventing the wheel through reuse of research results. In the Pharmaceutical industry, for example, during the Discovery phase of the value chain, data mining is used to search contextual information based on secondary relationships.
  • Regression Analysis – This is a set of techniques for modeling and analyzing several variables to establish a relationship between a dependent variable and one or more independent variables. It is widely used in the Marketing and Sales part of a company’s value chain since it can provide deep insight into customer behavior and enhance decision-making for future customer interactions. Companies are using industrialized analytics, which uses closed-loop promotion and data mining to optimize marketing budget allocation (i.e., optimize the marketing channel and product mix). Closed-loop promotion ensures feedback of mission-critical data to marketing and provides on-demand access to marketing decision support. This helps create an optimal marketing mix (what Jerome McCarthy called the 4 P’s – product, price, place, and promotion). It started with improvements in systems for statistical analysis that helped explain why certain things were happening in the business environment; these were heavily used for web traffic analysis during the emergence of eCommerce. Then these systems were extended to forecasting and extrapolation to see what would happen if the trends continued. Then the era of predictive modeling built systems that could try to predict what will happen next, based on empirical data and heuristics. Of course, all these systems are used to help optimize spend and maximize revenue. As an example, within the financial services industry, specifically banks, such models are used to predict retention & loyalty, perform portfolio analytics, and help with fraud (transaction/payment) and anti-money-laundering efforts. They help answer servicing questions: Which customers/segments are at risk? Which ones are profitable? How do I retain them? How do I win back customers? They can also help with loyalty programs: What drives loyalty for my customer base? How do I design an effective loyalty program? They are used for cross-selling and upselling: Which deposit account holders would be interested in an auto loan? Can we sell insurance along with an auto loan? Which products can be sold on a credit card welcome call? Which ‘Gold’ customers would increase their spend if we upgraded them to ‘Platinum’? Can I identify customers requiring a short-term loan? And they help with campaign management via segmentation modeling, profitability analysis, retention campaigns, win-back campaigns, etc.
  • Neural Network Analysis – Apart from the obvious applications in the study of real biological neurons, this technique is used in areas like distribution & logistics, sociology, economics, etc. Any company that produces a product is chartered with getting that product to its customers (whether the final customer, retailer, wholesaler, etc.) in the least expensive manner, constantly trying to reduce total inventory while maintaining service levels of supply at each warehouse or distribution center location. Neural network models help optimize over the variables involved: number of warehouses, location of each warehouse, size of each warehouse, allocation of products to the different warehouses, allocation of customers to each warehouse, etc. The objective is to balance the service level of supply against production/purchasing costs, inventory carrying costs, facility costs (storage, handling, and fixed costs), and transportation costs.
[Figure: Competitive Analytics]
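The regression idea above, in its simplest form, is an ordinary-least-squares fit of one predictor – say, marketing spend against sales. A self-contained sketch of the closed-form one-variable case:

```python
def ols_fit(xs, ys):
    # closed-form simple linear regression: y ≈ a + b*x
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # slope = covariance(x, y) / variance(x)
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a, b
```

Real marketing-mix models use many predictors and regularization, but the one-variable case shows the mechanics of relating a dependent variable to an independent one.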


Data Visualization
September 25, 2014

Sure, a picture is worth a thousand words. But didn’t the dude in The Matrix see numbers and know the stories behind them? Well, only in the movies. According to research, data visualization is so powerful because the human visual cortex converts objects into information quickly. As we continue the journey of Data – Information – Knowledge – Wisdom, the feedback loop of models and visualization to see patterns is key.
[Figure: Data Visualization 1]
As Big Data grows, it’s clear that the technology to gather and store data far EXCEEDS the ability to Analyze it. However, not all visualizations are actually that helpful. You may be all too familiar with lifeless bar graphs, or line graphs made with software defaults and couched in a slideshow presentation or lengthy document. The best data visualizations are ones that expose something new about the underlying patterns and relationships contained within the data. Understanding those relationships — and being able to observe them — is key to good decision making.  
  • Pizza and Cola sell together more often than any other combo – is there a cross-marketing opportunity?
  • Does Plant and Clay Pot sales IMPLY sales of Soil?
  • Milk sells well with everything – people probably come here specifically to buy it. Should we raise prices since less price elasticity?
  • What is the one item you want to have in your store in case of a hurricane?
  • Does buying any kind of pepper also denote sales of bananas?
  • Which customers are most likely not to have an accident?
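Questions like the pizza-and-cola one come from simple market-basket measures – support and confidence. A minimal sketch of the confidence of an association rule on toy transactions:

```python
def pair_confidence(transactions, a, b):
    # confidence(a -> b) = count(baskets with a and b) / count(baskets with a)
    # transactions: iterable of sets of items
    n_a = sum(1 for basket in transactions if a in basket)
    n_ab = sum(1 for basket in transactions if a in basket and b in basket)
    return n_ab / n_a if n_a else 0.0
```

If two-thirds of pizza baskets also contain cola, that is the cross-marketing signal the first bullet is asking about.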
  An important distinction lies between visualization for exploring and visualization for explaining. Exploring data is all about statistical acumen and understanding the nature of what the data represents in your enterprise. Visualization tools are an aid, but they have been around for eons. Once you have explored, you will almost always find that fewer than a handful of factors stand out and need explanation. Your presentation should not be about fancy graphs but the right PowerPoint/Keynote/video storyline for your audience. It seldom needs voluminous graphs... if you are trying to describe more than this handful of points, then you are already lost in your quest.
The key is to use the right visualization for the right data at the right time. I found this chart very helpful as a decision tree for which types of visualizations to use in different scenarios:
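A chart-chooser of that kind boils down to a small decision rule. The goal-to-chart mappings below are my own rough defaults for illustration, not taken from the referenced chart:

```python
def suggest_chart(goal):
    # toy decision rules: map the analytical goal to a default chart type
    rules = {
        "comparison": "bar chart",
        "distribution": "histogram",
        "relationship": "scatter plot",
        "composition": "stacked bar chart",
        "trend": "line chart",
    }
    return rules.get(goal, "plain table")
```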
There are so many tools to do this kind of analysis:
  • Qlik, SAP, SAS, and Tableau Software deliver the latest table stakes in visual discovery: storyboard capabilities.
  • Google Fusion Tables: Bust your data out of its silo and combine it with other data on the web. Collaborate, visualize and share
  • Datawrapper: An open source tool helping anyone to create simple, correct and embeddable charts in minutes
  • Infogram: a user-friendly interface to help develop creative, interactive infographics
  • Piktochart: Piktochart is a simple WYSIWYG editor to help develop and design charts and infographics
    Visualization for explaining is best when it is cleanest. Here, the ability to pare down the information to its simplest form — to strip away the noise entirely — will increase the efficiency with which a decision maker can understand it. As big data becomes bigger, and more companies deal with complex datasets with dozens of variables, data visualization will become even more important.  

Techniques in Predictive Analytics
September 13, 2014

So last week when I wrote about Predictive Analytics, I got responses from folks saying, “The value from such areas is clearly there. But the challenge is which technique to use and the ever-sliding sword of showing ROI so there is buy in for these analyses”.  
In framing the analytics problem, we need to balance data, SME knowledge, and performance. One thing I have noticed in my work is that the real skill in building an effective analytic model is knowing which models and algorithms to use. Analysts can draw on different techniques: neural networks, decision trees, linear regression, naïve Bayes, etc. These days many analytic workbenches automatically apply multiple models to a problem to find the combination that works best. One needs to explore different paths – look at the problem from different perspectives. When these algorithms are combined there is a resulting synergy: in my experience, once the modeling data sets were finalized, the largest incremental gain was achieved not by fine-tuning the training parameters of an individual algorithm, but by combining predictions from multiple algorithms.  
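That last point – that blending models often beats tuning any one of them – can be sketched with synthetic data. Two predictors with independent noise are averaged, and the combined error shrinks by roughly 1/√2; all numbers here are invented:

```python
import random

random.seed(0)

# Synthetic ground truth and two noisy "models" with independent errors.
truth = [random.random() for _ in range(1000)]
model_a = [t + random.gauss(0, 0.20) for t in truth]
model_b = [t + random.gauss(0, 0.20) for t in truth]
blend = [(a + b) / 2 for a, b in zip(model_a, model_b)]

def rmse(pred):
    """Root-mean-square error against the ground truth."""
    return (sum((p - t) ** 2 for p, t in zip(pred, truth)) / len(truth)) ** 0.5

# Averaging uncorrelated errors shrinks them; the blend beats either model alone.
print(f"A={rmse(model_a):.3f}  B={rmse(model_b):.3f}  blend={rmse(blend):.3f}")
```

The effect only holds when the models' errors are not strongly correlated, which is exactly why looking at the problem from different perspectives matters.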
So with the myriad tools and techniques that exist, the way to approach this is to ask the questions that are really important for what the company is trying to solve:  
  • Strategic Customer Questions
    • Who are the most/least profitable customers?
    • Who are the most/least satisfied customers?
    • Which is the fastest/slowest growing customer segment?
    • What are the reasons for customer attrition?
    • What are the costs of customer transactions?
  • Strategic Product Questions
    • What are our most/least profitable products?
    • What are our production costs & how can we lower them?
    • What is our cycle time & how can we lower it?
  • Strategic Employee Questions
    • Who are the most productive salespeople and employees?
    • Which managers have the highest retention rates? What do they do?
    • What is the cost of turnover?
  • Strategic Financial Questions
    • How accurate are the financial forecasts?
    • How much of our financial data is used to inform business decisions?
    • What impacts the demand of our product?
    • What items are affecting our margins the most?
  Based on this, one has to look at some of the following techniques:  
  • Classification – predicting an item class, “Decision Tree”
  • Association – what occurs together, “Market Basket”
  • Estimation and Time Series – predicting a continuous value
  • Web and Text Mining – extracting information from unstructured data
  • Clustering – finding natural clusters or groups in data
  • Deviation Detection – finding changes or outliers
  • Link Analysis – finding relationships
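As a tiny illustration of the deviation-detection technique in the list above, a z-score filter flags points that sit far from the mean. The daily transaction counts below are hypothetical:

```python
from statistics import mean, stdev

# Hypothetical daily transaction counts; the spike is the kind of outlier
# deviation detection is meant to surface.
daily_counts = [102, 98, 110, 95, 105, 99, 240, 101, 97, 103]

mu, sigma = mean(daily_counts), stdev(daily_counts)

# Keep any observation more than 2 standard deviations from the mean.
outliers = [x for x in daily_counts if abs(x - mu) / sigma > 2]
print(outliers)  # [240]
```

Real deviation detection uses more robust statistics (the outlier itself inflates the mean and standard deviation here), but the idea is the same: quantify “normal,” then surface what isn’t.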


Predictive Analytics
September 1, 2014

These days it’s tough to walk out of any meeting with business or IT organizations without touching the topic of Big Data or Analytics. A lot of people struggle with what this is – everyone BELIEVES it can help if done right. But what is it?
The way I look at it: as organizations mature in their Business Intelligence capability, the questions they ask mature too. It’s not only about what the data tells you about problems you need to solve – can data tell you to THINK OF NEW PROBLEMS that you can solve? Things you didn’t know. THINK of something different. Organizations face ever increasing business challenges: driving new sources of growth, cost management and cash conservation, increased business complexity and the need for operational excellence, or business restructuring in an increasingly global environment. Ubiquitous computing and technology capabilities have dramatically increased the volume of data at companies’ disposal, yet there remains little in the way of actionable insights (Big Data). Companies need timely, in-depth, actionable insights if they are to remain competitive globally – a “whole business” approach to big data analytics that delivers business results through analytics-driven optimization of key business processes:
  • Staking out distinctive market strategy (CRM Strategy and Loyalty programs)
  • Finding the best customers, and charging them the right price (Revenue Management)
  • Minimizing inventory and maximizing availability in supply chains (Inventory Optimization)
  • Understanding and managing financial performance (Forecasting)

Predictive Analytics

    Business Intelligence technologies are deductive in nature, validating hypotheses about the business problems you want to solve. Examples:
  • Product shortage by market
  • Vendor spend by category
  • Brand health by market
  • Periodic trend analysis
  • Periodic P&L and Financial Reports
  Predictive Analytics is inductive in nature – it pulls out meaningful relationships and patterns and tells you about different things that might address the same or new problems. Examples:
  • Business Mix Optimization (Product, Geography, etc.)
  • Price sensitivity by consumer segment
  • Customer Behavior Modeling
  • Performance/profitability analysis
  As an example, NETFLIX, the US movie delivery company, asked engineers and scientists around the world to solve what might have seemed like a simple problem: improve Netflix's ability to predict what movies users would like by a modest 10%. Netflix grew from $5 million in revenue in 1999 to $4.3 billion in 2013, in part by becoming an analytics competitor. By analyzing customer behavior and buying patterns, it created a recommendation engine that optimizes for both customer tastes and inventory conditions.
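A recommendation engine of that kind can be sketched, in miniature, as similarity-weighted collaborative filtering: score the titles a user has not seen by the ratings of similar users. The users, titles, and ratings below are invented, and real systems (Netflix included) use far more sophisticated models:

```python
from math import sqrt

# Hypothetical 1-5 star ratings; users and titles are invented for illustration.
ratings = {
    "ann": {"Heat": 5, "Alien": 4, "Amelie": 1},
    "bob": {"Heat": 4, "Alien": 5, "Fargo": 4},
    "cat": {"Amelie": 5, "Fargo": 2, "Heat": 1},
}

def cosine(u, v):
    """Cosine similarity over the titles both users have rated."""
    common = set(u) & set(v)
    if not common:
        return 0.0
    dot = sum(u[m] * v[m] for m in common)
    return dot / (sqrt(sum(u[m] ** 2 for m in common)) *
                  sqrt(sum(v[m] ** 2 for m in common)))

def recommend(user):
    """Rank unseen titles by similarity-weighted ratings from other users."""
    scores = {}
    for other, their in ratings.items():
        if other == user:
            continue
        sim = cosine(ratings[user], their)
        for movie, stars in their.items():
            if movie not in ratings[user]:
                scores[movie] = scores.get(movie, 0.0) + sim * stars
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("ann"))  # titles ann has not rated, best guess first
```

With only three users there is a single unseen title to rank, but the same loop scales to the “10% better predictions” problem once you have millions of rows.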
As another example (“Analyzing Love: Data Mining” in AllAnalytics), an online dating service tries to predict the likelihood of attraction between people. 95% of relationship success can be predicted by analyzing as few as 10 characteristics in each profile. They find things like:
  • Members with accounts on Twitter, which only allows for messages of no more than 140 characters, have shorter relationships.
  • People identifying themselves as Republicans are more willing to connect with Democrats than the reverse

Global Sourcing
August 18, 2014

In this day and age there is an assumed maturity in the way initiatives within a business are sourced and out-sourced. When it comes to IT applications and their development and maintenance, there are 4 possible scenarios that companies deal with:
  • Insource  - Maintain control internally (usually for reasons of intellectual property, privacy, or strategic responsiveness)
  • Staff Augmentation - Save money while maintaining responsibility for application support and maintenance activities
  • Co-source- Leverage external cost structure benefits and expertise while maintaining an appropriate level of control
  • Outsource - Delegate IT (or selected functions therein) to an external organization for which it is a core competency
As this industry has evolved over the years, the rationale for IT outsourcing decisions has shifted from cost being the sole consideration to include a number of strategic factors. No doubt cost is still top of mind, especially in this economy. But a lot of other considerations are in play:
  • Strategic Importance
    • Relative impact of a service area on the company’s revenues and overall profitability
    • How strategic is the function to my organization today? How does it fit into our future plans?
  • Current Capability
    • Relative strength of a service area's technical & business know-how, processes, and tools
    • What are the capabilities of the function?  How do those capabilities compare to our requirements, and to our peers?
  • Perceived Value / Cost
    • Perceived value of a service area relative to the costs incurred
    • What is the function’s capacity to adapt and change?
  • Ownership Preference
    • Relative preference of management to own, share, or transfer out IT assets based on company beliefs, values, and sourcing experience
    • How easily can the function be transitioned to another sourcing strategy?
    Business Quarterly indicates that 75% of US executives considered financial motivations secondary to other strategic objectives when outsourcing. Business Week reports, “The really smart business owners have figured out how to use outsourcing as a strategic tool instead of simply looking for savings.” CIO magazine reveals that strategic value rivals cost reduction as a motivation for outsourcing.   Based on reports by The Outsourcing Institute, the top reasons for outsourcing are as follows:


  No matter what the goals, the key success factors of outsourcing are always:
  • Be clear about objectives-- cost, process improvement, and the ability to focus on the core business are the most common
  • Incorporate business outcomes as a performance measure from the outset of the arrangement
  • Look beyond price and promises of cost reductions for an outsourcing provider that brings a wide set of skills and strengths, and a long-term track record of delivering results
  • Give as much attention to performance measurement and the quality of your relationship with your provider as you do to the contract
  • Use active governance to manage the outsourcing relationship for maximum performance
  • Task talented executives with optimizing outsourcing arrangements


Business Cases – Show me the Money !
July 29, 2014

Ever since Jerry Maguire blurted this out, people have been using this as a corporate euphemism for ROI/ Business case.
One of the critical roles for any organization is to manage the value achievement of the initiatives it pursues. It needs to ensure sponsor and executive ownership of the business case. The business case allows the stakeholders in IT projects to jointly address their key concerns with project investments.
  Business cases highlight the initiatives that create the greatest value, support decision- making, and help track program performance. It is good to define the business case early and plan on many iterations since it:
  • Demonstrates how a major investment creates value
  • Includes both quantitative and qualitative rationale
  • Supports business decisions by weighing choices or options
  • Creates a way to track performance and measure success after a decision has been made
  • Gains alignment and management consensus for a project
  In some organizations, the term ‘Business case’ may also be referred to as
  • Cost/benefit analysis
  • ROI analysis
  • Feasibility study
  • Capital funding request
  • Case for action
  Once the team has understood the importance of having a business case to guide the investment decisions of the initiatives, there is debate on what level of detail it should have. There are many approaches to building out a business case, and the main elements are:
    • Benefit models
    • Cost models
    • Cash flow models
    • Assumptions (timing, dependencies)
    • Sensitivity Analysis
    • Qualitative Factors Analysis (non-financial benefits, risks)
  The financial models can be Top-Down (more high level; helps form an initial hypothesis, with wider ranges to reflect uncertainty) or Bottom-Up (more quantitative; time is spent on thorough data collection and analyses). But the key point is that you need to build the business case with ranges and confidence levels. Once the numbers are compelling, the ranges can change but they will not change the decision.    
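One way to make “ranges and confidence levels” concrete is a quick Monte Carlo over the benefit assumptions, reporting the NPV at the 10th, 50th, and 90th percentiles rather than a single number. Every figure below (upfront cost, benefit range, discount rate, horizon) is an invented placeholder, not a real case:

```python
import random

random.seed(7)

def npv(cash_flows, rate=0.10):
    """Net present value of a list of yearly cash flows, year 0 first."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def simulate(trials=10_000):
    """Draw the uncertain annual benefit and collect the resulting NPVs."""
    results = []
    for _ in range(trials):
        benefit = random.uniform(300, 500)   # annual benefit range ($k), assumed
        flows = [-1000] + [benefit] * 4      # year-0 cost, 4 years of benefit
        results.append(npv(flows))
    return sorted(results)

runs = simulate()
p10, p50, p90 = (runs[int(len(runs) * p)] for p in (0.10, 0.50, 0.90))
print(f"NPV range ($k): P10={p10:.0f}  P50={p50:.0f}  P90={p90:.0f}")
```

If even the P10 case clears the hurdle, the ranges can move without moving the decision – which is exactly the point.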


IT Service Management
June 27, 2014

At a BPM event recently in Orlando, I was chatting with a colleague about IT and the BPM responsibility. This guy is the SVP of IT operations and handles infrastructure for his company. When someone asked him who from the business side was responsible for the BPM aspects in his firm, his response was, “We in IT are actually responsible for the BPM aspects and the optimization therein.” Another guy went, “The only real application the business is concerned about is e-mail.”   That set me thinking about IT Service Management. Having spent some time doing ITIL work, I am familiar with the concept of IT service management, which involves moving from:
  • Multiple points of contact with the business
  • Service defined and measured in technical terms (if at all)
  • Work driven by technology
  • Organized to support systems
  to:
  • Managed relationships established with customers
  • Service defined, measured and reported on in business terms
  • Work driven by service requirement
  • Organized to deliver service
So ITSM is all about better service at lower cost. But the challenge with a full-blown ITIL deployment is that ITIL is far too generic for an organization to implement in totality at a fast pace. Process reengineering and change management are always required and are rarely considered. Some practitioners have said that it complements other IT management methodologies like CMMI. The way I look at it: CMM focuses on improving and appraising the maturity of application development, while ITIL is focused on best practices around IT operations and services. The demarcation looks like this:  


  ITIL v2 broke these operations into Service Support (ensuring that the customer has access to appropriate services to support business functions) and Service Delivery (IT services are provided as agreed between the Service Provider and the Customer).   But the key to achieving good IT service management, even at a small scale, is using the following guiding principles:
  • Business Relationship Management: Ongoing liaison and relationship building with Client community.  Maintain an understanding of the business and IT requirements.
  • Service Delivery Management: Understand the IT Services provided and the businesses reliance on these Services.  Carry out the appropriate business liaison and escalation for Service issues.
  • Service Performance Review: Formally review service performance against agreed-upon SLAs. And good luck with that :)
  • Service Level Agreement Management: Maintain service definitions and assess implications of any changes
  • Service Enhancement Request: Receive and shape requests for new/enhanced services


Supply Chain Excellence
June 7, 2014

Achieving supply chain excellence is complex and challenging, but success in achieving supply-chain-driven competitive advantage enables superior customer service, profitable revenue growth and a significant increase in shareholder value. Inventory Management is the conductor of the symphony for retail supply chain execution. It is critical for customer service, since inventory management initiates all merchandise movement and controls the timing within the supply chain.
  • Supply chain assets and inventory usually comprise at least half of all non-store based assets
  • Supply chain activities typically account for as much as 40 – 70% of operating costs (including procurement  and markdowns)
  Some of the statements from retailers across all kinds of products:
  • “Assisted Inventory Management (AIM) helped us exceed our inventory-turn goal, making us the leader among national drugstore chains in this important productivity measure. We achieved inventory turns of 5.0 times for the year, up from 4.6 times in earlier years.” – CVS
  • “Positioned among the best in retail, our supply chain helps drive sales, reduce costs and ensure the availability of products our guests most want and need.” – Target
  • “We completed the conversion of each of our operating divisions to a common technology platform with greatly enhanced inventory management tools, permitting more sophisticated inventory planning and more precise by-store inventory allocation.” – Saks
  The three main components of the Inventory Optimization program address both the process and physical infrastructure of the supply chain.  
  1. Inventory Management Process - addresses end-to-end inventory management built on two core processes:
  • Foundational for continually replenished basic merchandise. Periodic automatic replenishment, long life, stable supply, short lead time to continually meet normal demand
  • Highly Variable which is typical of merchandise with high demand spikes due to promotions, fashion, short life and seasonal demand
  2. Network and Flow Strategy - Network Optimization starts with establishing a vision of alternative flow paths and ends with a full evaluation of the end-to-end physical supply chain and a recommended distribution network strategy. One has to assess merchandise flow paths to provide revenue growth, minimize supply chain costs and support overall inventory strategies. Then one has to determine alternative distribution strategies including building size and location, transportation strategies, inventory deployment strategies, and benefit-based business cases.
  3. Store Operations - Design and implement a well-defined process for store operations related to receiving, shelf stocking, perpetual inventory accuracy and plan-o-gram maintenance.
  • Organization & Labor Planning
  • Life Cycle Management
  • Shelf Replenishment
  • Data Integrity Maintenance
  The idea is to push operations from
  • Stores Ordering for basic merchandise to an Automatic Replenishment Approach which is centrally maintained and helps with enhanced High Performance forecasting and allocation abilities
  • Store Reviews (all replenishment orders to supplement simple forecasting & ordering logic) to Exception Only Reviews. No store review for standard items; examples of exception reviews are items with high inventories, poor service levels, etc.
  • Limited Standards & Policies (in-stock policies and service levels) to Standard Policies Across the Supply Chain, through reliable & repeatable inventory management processes and uniform service standards based on merchandise goals and category/SKU profitability
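The automatic-replenishment logic above can be sketched with the classic reorder-point formula: cycle stock (expected demand over the lead time) plus safety stock sized to a service-level target. The demand figures and the 95% service-level z-value below are illustrative assumptions, not numbers from any retailer:

```python
from math import sqrt

def reorder_point(daily_demand, demand_sd, lead_time_days, z=1.65):
    """Trigger replenishment when on-hand inventory drops below this level.

    z = 1.65 targets roughly a 95% service level under normal demand.
    """
    safety_stock = z * demand_sd * sqrt(lead_time_days)
    return daily_demand * lead_time_days + safety_stock

def order_quantity(on_hand, rop, target_stock):
    """Order up to the target only when the reorder point is breached."""
    return max(0, target_stock - on_hand) if on_hand <= rop else 0

rop = reorder_point(daily_demand=40, demand_sd=8, lead_time_days=4)
print(round(rop))  # 186: cycle stock of 160 plus ~26 units of safety stock
print(order_quantity(on_hand=150, rop=rop, target_stock=400))  # 250
```

This is the “Foundational” continuous-replenishment case; promotional or seasonal merchandise needs event-driven forecasts instead of a steady reorder point.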

RFID in the age of Mobility
May 10, 2014

Having done a lot of work in the supply chain industry, I am so intrigued by RFID and its potential once the costs go further down. Radio frequency identification (RFID) is a generic term for technologies that use radio waves to automatically identify individual items. RFID technology is not new or complex; it has been around since the early radar systems of the 1940s. What is new is how manufacturing advancements have reduced the costs of implementing RFID systems (particularly tags). These silicon-based electronic identification tags consist of a tiny processor, memory and an antenna; they can be read and written wirelessly and can be made cheaply, without a battery. The main components of this technology are:
  Tag
  • Device made up of an electronic circuit and an integrated antenna
  • Radio frequency used to transfer data between the tag and the antenna
  • Read-only or read / write
  Antenna
  • Receives and transmits the electromagnetic waves
  • Wireless data transfer
  Reader
  • Receives commands from application software
  • Interprets radio waves into digital information
  • Provides power supply to passive tags
  IT Infrastructure
  • Reads / writes data from / to the tags through the reader
  • Stores and evaluates obtained data
  • Links the transceiver to applications, e.g. ERP
Of course there has been a major drag in the adoption of this technology. The key challenges have been:
  • Not only costs of tags and readers, but the costs of integration of the RFID technology into the IT technology stack - e.g. ERP, etc.
  • Lack of worldwide data standards
  • Country-specific frequencies allocation
  • Vendors are very fragmented
  • Tag and data overload – How do we handle the data?
  • Read-rate accuracy
  • Tag and reader collision – Signals can interfere with each other
  • Privacy fears from the tracking provided by this technology
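On the “tag and data overload” challenge: a reader in range of a tag emits a burst of duplicate reads per second, so the first processing step is usually to collapse them into one event per tag per time window. A minimal sketch, with invented EPC values and timestamps:

```python
# Raw reads as (EPC, timestamp-in-seconds) pairs; a reader repeats each
# tag many times while it is in range. Values are invented.
raw_reads = [
    ("urn:epc:3001", 0.0), ("urn:epc:3001", 0.4), ("urn:epc:7042", 0.5),
    ("urn:epc:3001", 0.9), ("urn:epc:3001", 9.0), ("urn:epc:7042", 9.2),
]

def dedupe(reads, window=5.0):
    """Keep a read only if the same tag was not seen within `window` seconds."""
    last_seen, events = {}, []
    for epc, ts in reads:
        if epc not in last_seen or ts - last_seen[epc] >= window:
            events.append((epc, ts))
        last_seen[epc] = ts
    return events

print(dedupe(raw_reads))  # 6 raw reads collapse into 4 business events
```

Middleware products do this filtering (plus reader coordination) at the edge, so the ERP only sees meaningful “tag arrived / tag left” events.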
But more and more, this technology is coming into the mainstream, especially after Walmart mandated the use of RFID in its supply chain management. Walmart believes it can cut costs and make its supply chain even leaner with this deployment.  
The uses of this technology are of course endless. I was recently reading about the Cyber Tyre from Pirelli, which transmits information on road conditions and friction coefficients to the car's computer. Some hospitals are already using RFID wristbands to tag patients, which hospital staff scan using PDAs or tablet PCs, connecting to patients' data over a WLAN.  
And as this becomes more prevalent, there are other uses that are surely ridden with privacy issues. There is much research into ways to monitor health in individuals in real time. Imagine an RFID chip implanted in the human wrist that sends signals to the health insurance company at all times. You wake up in the morning and go for a jog; you arrive at work and an email from the company (always monitoring your vital stats) sits in your inbox, proclaiming a reduced premium for the day. You have breakfast at McDonald's over the weekend. Lo and behold, your premium just went up.    


IT Spend Analyses
April 26, 2014

A few days ago I was at a CIO roundtable in Atlanta, and one of the CIOs mentioned that despite the state of the economy, their IT organization was thinking of spending some of its budget on innovative initiatives so that when we get to the bottom of the J-curve in the economy, they'd be ready to win on strategic goals. That really set me thinking – how are companies dividing their IT spend between keep-the-lights-on operations and strategic or innovative investment? Top executive management these days has two main questions:
  1. How can the IT organization be transformed to be an enabler of creating  business value rather than just being a cost of doing business?
  2. How can we achieve better results at a lower cost?
  I guess it’s always important for the IT organization to evaluate internally how IT’s value contribution to the business should be planned, managed, and assessed. Unfortunately, the link between business value and IT is often not understood by executives, and especially in times like these, IT spending levels get squeezed too hard. The common issues we have seen:
  • Typically, IT spending level is based on historical or competitive benchmark levels
  • Lack of recognition for IT contribution on business side
  • Short term, simple IT cost cutting drives down value adding and innovative IT initiatives first
  • As a result, IT capabilities deteriorate and mid-term IT operating costs rise
  • Eventually, higher IT operating costs eat away funds for innovations and this furthers the overall IT budget explosion. A big vicious circle!
  Of course a company’s position on its spending is dependent upon many macro factors:
  • Number and size of competitors
  • Industry growth rate and rate of change
  • Industry margins/pricing
  • Product differentiation factors – physical products or knowledge assets
  Mandatory or non-discretionary IT investments are for keep-the-lights-on functions - IT Operations, regulatory, etc. Things like technical support, IT infrastructure management, technical upgrades to infrastructure components, required maintenance, enterprise-wide project support fall in this category.
  Discretionary spending covers IT investments that are Strategic, Enabling, and Sustaining – things like R&D (focus on future technologies), etc. These investments should create a strategic or economic advantage in the market, create barriers to entry, etc.   As Michael Treacy and Fred Wiersema wrote in their classic book, The Discipline of Market Leaders, there are three basic “value disciplines” for a company to pursue – operational excellence, customer intimacy, and/or product leadership. If the direction of the company is clear, well-communicated, and well-understood, then strategic IT investments are driven from the same:  
  • If there is a product/service innovation  focus, then the company needs to focus on increasing value to existing customers, developing new markets and channels, etc. Examples of initiatives are eInnovation, eDesign Collaboration, PLM, etc.
  • If the company is focusing on Customer Intimacy, then the company needs to improve understanding of its customer needs, increase customer insight, etc. The initiatives fall in realms like Customer Insight (Inbound Marketing), Integrated View of Customer (DW, Analytics), etc.
  • If the company is trying to create new scales and reduce interaction costs between partners and customers, it needs to invest in increasing service levels at lower costs, concepts like “Super” Distributor, Supplier Collaboration, etc.

Fleet Management
April 2, 2014

Fleet Management, of which Strategic Sourcing is a core part, is an integrated set of actions, occurring in a rational and logical manner, with the overall objective of attaining the lowest Total Cost of Ownership (TCO). Key issues in fleet management involve capital commitments and management, as well as operating effectiveness and cost. Fleet asset utilization is not typically tracked or measured, which leads to unwanted outcomes, such as having more vehicles than necessary, additional operating and maintenance costs, and not always having the right vehicles for the jobs they are needed to do. Additionally, fleet costs are usually fragmented and rarely captured in total, which leads to problems in trying to adequately and accurately assess operating efficiency and evaluate outsourcing opportunities.   The first step toward true optimization is getting a good handle on the existing fleet in terms of its make-up, utilization and operating cost; reviewing the administrative and operating practices related to procurement, operations, maintenance and disposition; and determining replacement scheduling. The foundation is based upon the following three areas:
  • Strategy (replacement scheduling, outsourcing/insourcing and fleet organization)
  • Operations (vehicle pooling, maintenance & repair, inventory management, fuel management)
  • Administration (Standards & specifications, fleet utilization, budget & cost reporting)
  The areas to explore the fleet management practices:
  • Fleet inventory (including but not limited to manufacturer and model year, type, location, VIN #, GVWR, acquisition price, options purchased, lease payment, annual operating and maintenance costs, sale price if retired, auction fees and class – how it’s used)
  • Equipment Utilization – Miles, hours or both on equipment where there may be two measures of utilization
  • Fleet “spend” at invoice level and at options level if available.
  • Current agreements and in progress negotiations
  • Current leases, short term rentals, and ownership models

Fleet Rationalization, Utilization and Fleet Mix - Once the standards and specifications process has taken place, putting rigor and focus in the area of rationalization and utilization brings value and savings to the company and fleet. The goal of this component of the process is multi-dimensional:
  • Ensuring that proper utilization targets by class and location (e.g., metro vs. rural) are set and used to reduce the number of low-use vehicles in the field
  • Rationalizing the fleet based on job function and job assignment
  • Developing a fleet policy that optimizes the use of pooled vehicles, and how and when to use short-term rentals and take-home vehicles
  • Identifying fleet operating needs that may include surplus vehicles for seasonal work requirements, construction projects, regulatory mandates, etc.
Focus on the 80/20 rule when it comes to prioritizing fleet opportunities. Develop standards and specifications for the portion of the fleet that can be standardized and will provide the highest value/impact, such as passenger vehicles, SUVs, LD and MD trucks, aerials and digger derricks. Utility and construction equipment is often overlooked.  
  • Fuel - In most cases, do not incorporate the sourcing of bulk fuel (vs. fuel management services) as part of a fleet sourcing engagement. Past experience has shown that this exercise returns almost no incremental value and usually devolves into an exercise around sourcing transportation from supplier fuel racks to client bulk tank facilities.
  • Maintenance & Repair - Achieving the lowest TCO for the fleet makes maintenance and repair an integral component of the equation. Inherently, maintenance and repair costs will decrease as an output of developing the standards-and-specifications and replacement-schedule processes. Other areas should also be evaluated, such as opportunities for network consolidation of maintenance and repair shops.
  • Determining a “Levelized” Replacement Schedule - Developing a “Levelized” Replacement Schedule is a key concept in improving fleet management and obtaining benefits from strategic sourcing. Sharing the information with both internal finance and external vendors and suppliers is instrumental in planning for future fleet acquisitions and capital needs as well as structuring multi-year deals.
In summary, maximizing fleet effectiveness depends on managing it like a business, in an integrated and holistic fashion, across two major dimensions.
  • FLEET OPERATIONS – Operating revenue, Operating costs, Contribution margin, Productivity metrics and measures, Performance metrics and measures
  • FLEET ASSET MANAGEMENT – Fleet sizing, Standards and specs, Strategic sourcing, Life-cycle management, Maintenance and repair, Disposition management
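Managing the fleet “like a business” ultimately comes down to unit economics. A back-of-the-envelope TCO sketch, where every figure (acquisition price, resale value, fuel, maintenance, mileage) is an invented placeholder:

```python
def tco(acquisition, resale, fuel_per_year, maintenance_per_year, years):
    """Total cost of ownership over the holding period:
    net capital cost plus annual operating costs."""
    return acquisition - resale + years * (fuel_per_year + maintenance_per_year)

def cost_per_mile(total_cost, miles_per_year, years):
    """Normalize TCO by lifetime mileage for cross-class comparisons."""
    return total_cost / (miles_per_year * years)

# Illustrative light-duty truck held for 5 years.
total = tco(acquisition=38_000, resale=14_000, fuel_per_year=4_500,
            maintenance_per_year=2_200, years=5)
print(total)                              # 57500
print(cost_per_mile(total, 20_000, 5))    # dollars per mile
```

Running this per vehicle class is what makes a “levelized” replacement schedule defensible: you can show exactly when rising maintenance costs overtake the capital cost of replacing the unit.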

Supply Chain Excellence
March 30, 2014

Achieving supply chain excellence is complex and challenging, but success in achieving supply-chain driven competitive advantage enables superior customer service, profitable revenue for growth and significant increase in shareholder value:
  • Supply chain assets and inventory usually comprise at least half of all non-store based assets
  • Supply chain activities typically account for as much as 40 – 70% of operating costs (including procurement and markdowns)
  1. Scientific Retailing: Overview of the High Performance Retailing framework, demand and supply value drivers
  2. Inventory Management: Inventory Management is the conductor of the symphony for Retail Supply Chain execution. It is critical for customer service since Inventory management is what initiates all merchandise movement and controls the timing within the supply chain.
    Some of the statements from retailers across all kinds of products:
  • “Assisted Inventory Management (AIM) helped us exceed our inventory-turn goal, making us the leader among national drugstore chains in this important productivity measure. We achieved inventory turns of 5.0 times for the year, up from 4.6 times in earlier years.” – CVS
  • “Positioned among the best in retail, our supply chain helps drive sales, reduce costs and ensure the availability of products our guests most want and need.” - Target
  • “We completed the conversion of each of our operating divisions to a common technology platform with greatly enhanced inventory management tools, permitting more sophisticated inventory planning and more precise by-store inventory allocation.” – Saks
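The inventory-turn metric in the CVS quote above is a simple ratio – annual cost of goods sold over average inventory value. A minimal sketch (the dollar figures below are invented for illustration, not CVS financials):

```python
# Inventory turns = annual COGS / average inventory value.
# Higher turns mean the same inventory dollars generate more sales.
def inventory_turns(annual_cogs, avg_inventory):
    return annual_cogs / avg_inventory

# e.g. $50B of COGS carried on $10B of average inventory
turns = inventory_turns(50_000_000_000, 10_000_000_000)  # -> 5.0
```

Moving from 4.6 to 5.0 turns, as in the quote, means the same inventory investment supports roughly 9% more throughput.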
    The three main components of the Inventory Optimization program address both the process and physical infrastructure of the supply chain.  
  1. IM Process:
  • Addresses end-to-end inventory management built on two core processes:
  • Foundational for continually replenished basic merchandise: Periodic automatic replenishment, long life, stable supply, short lead time to continually meet normal demand
  • Highly Variable typical of merchandise with high demand spikes or problematic supply: Demand characteristics are promotions, fashion, short life and seasonal while supply is typically private imports and private label
  2. Network and Flow Strategy
Network Optimization starts with establishing a vision of alternative flow paths and ends with a full evaluation of end-to-end physical supply chain and a recommended distribution network strategy.
  • Assesses merchandise flow paths to provide revenue growth, minimize supply chain costs and support overall inventory strategies.
  • Determines alternative distribution strategies including buildings size and location, transportation strategies, inventory deployment strategies, and benefit based business cases.
  3. Store Operations
  • Determines store-level inventory processes that maximize customer-perceived in-stock (several studies show 40-70% of out-of-stocks occur due to store-level defects).
  • Designs and implements a well-defined process for store operations related to receiving, shelf stocking, perpetual inventory accuracy and plan-o-gram maintenance.
    • Organization & Labor Planning
    • Life Cycle Management
    • Shelf Replenishment
    • Data Integrity Maintenance
    The idea is to push operations from
  • Stores ordering basic merchandise to an Automatic Replenishment approach, which is centrally maintained and helps with enhanced High Performance forecasting and allocation abilities
  • Store Reviews (all replenishment orders supplementing simple forecasting & ordering logic) to Exception-Only Reviews: no store review for standard items; examples of exception reviews include items with high inventories, poor service levels, etc.
  • Limited Standards & Policies (in-stock policies and service levels) to Standard Policies Across the Supply Chain, through reliable & repeatable inventory management processes and uniform service standards based on merchandise goals and category/SKU profitability
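The exception-only review above can be sketched as a simple filter. This is an illustrative sketch, not the post's method – the field names and thresholds (weeks of supply, minimum service level) are assumptions:

```python
# Auto-replenish standard items; surface for human review only the
# exceptions the post names: high inventories and poor service levels.
def review_exceptions(items, max_weeks_of_supply=8.0, min_service_level=0.95):
    exceptions = []
    for item in items:
        weeks_of_supply = item["on_hand"] / max(item["weekly_demand"], 1e-9)
        if weeks_of_supply > max_weeks_of_supply:
            exceptions.append((item["sku"], "high inventory"))
        elif item["service_level"] < min_service_level:
            exceptions.append((item["sku"], "poor service level"))
    return exceptions  # everything else replenishes automatically
```

The design point is that planners touch only the flagged SKUs; standard items flow through the centrally maintained replenishment logic untouched.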

Lean and Six Sigma
March 16, 2014

Recently I was on the panel of a CXO discussion on how to optimize costs and increase business productivity, when someone from the audience asked about Lean Six Sigma and its relevance, especially in today’s economy. They asked: has Six Sigma been more effective, or have Lean principles helped more? And how exactly do they differ? That dialogue encouraged me to write this post. There is always debate here because Lean and Six Sigma are so close that practitioners love to dichotomize them in their thinking.  

So what is Six Sigma?
  • A Metric? - Less than 3.4 defects per million opportunities of product produced/ service rendered
  • A Vision? - Six Sigma is an overall strategy to accelerate improvements in processes, products, and services
  • A Value? - Strive for continuous improvement in all activities
  • A philosophy? - A proven “pursuit of perfection” business initiative that creates breakthroughs in profitability, quality, and productivity
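The "3.4 defects per million opportunities" metric above has a standard calculation behind it: defects per million opportunities (DPMO), and the conventional short-term sigma level that adds the customary 1.5-sigma shift. A minimal sketch:

```python
# DPMO and the conventional short-term sigma level. The 1.5-sigma shift
# is the standard Six Sigma convention that makes 3.4 DPMO equal "six sigma".
from statistics import NormalDist

def dpmo(defects, units, opportunities_per_unit):
    return defects / (units * opportunities_per_unit) * 1_000_000

def sigma_level(dpmo_value):
    # z-score of the defect-free yield, plus the conventional 1.5 shift
    return NormalDist().inv_cdf(1 - dpmo_value / 1_000_000) + 1.5
```

For example, 34 defects across 1,000 units with 10 opportunities each is 3,400 DPMO; at 3.4 DPMO the sigma level comes out to 6.0, which is where the name comes from.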
Six Sigma practitioners follow these tenets as a business philosophy:
  • If something cannot be measured, we really do not know much about it.
  • If we don't know much about it, we cannot control it.
  • If we cannot control it, we are at the mercy of chance.
Six Sigma started in the manufacturing industry with emphasis on management of efficient processes, efficient management of people, dedication to measurement systems, etc. – mostly Operational Excellence. But it became apparent that business success was more than the absence of negatives (defects, delays, cost overruns). Six Sigma then began to encompass positives like customer loyalty and delighters in new products. From operational excellence, Six Sigma has moved towards Customer Intimacy and Product Leadership value disciplines through its DFSS/DMADV tools. There is always debate that Six Sigma does not sit well with an innovation focus. But all said and done, the tools offered are used everywhere in different flavors and under different terms:
  • SIPOC - A top level process mapping tool to document a process in the context of suppliers who provide inputs which are transformed into outputs for the customer.
  • Cause & Effect Matrix - A process of identifying problems, finding their causes, and creating the best solutions to keep them from happening again (fishbone diagram).
  • Failure Mode & Effects Analysis (FMEA) - A tool used to identify ways the process can fail, estimate the risk of the failure, identify causes of failure, prioritize actions to reduce failure risks, develop control plans to prevent failures
  • VOC - The “Voice of the Customer”; the customer specifications/ requirements that dictate acceptable and unacceptable outcomes and drive actions.
  • VOP - The “Voice of the Process”; the company's processes doing what they need to do to produce products/services.
  • CTQ - “Critical to Quality”; characteristics that significantly influence one or more of the customer requirements.
Of course, recently Six Sigma has begun to be used in the IT industry as Six Sigma for Software.  

And what is Lean?  A philosophy that shortens the time line between the customer order and the shipment by eliminating waste (non-value-adding activities). This philosophy is based on the following principles:
  • Value – what the customer buys
  • Value stream – how value is delivered
  • Flow – putting value added steps in sequence. The “flow” or “value-stream” perspective represents a shift from vertical to horizontal thinking. Flow is enabled when materials and processes are standardized across the supply chain to reduce complexity.
  • Pull – triggering flow from customer needs. E.g., take on only the IT projects that the pipeline can absorb, i.e. Demand Management.
  • Perfection – continuous improvement
  Value Added
  • Any activity that increases the market form or function of the product or service. (These are things the customer is willing to pay for.) E.g., in the airline industry – actual flying time; or in the healthcare industry – diagnosis/treatment.
Non-Value Added = Waste
  • Any activity that does not add market form or function or is not necessary. (These activities should be eliminated, simplified, reduced or integrated.) E.g., in the airline industry – lining up to check in; or in the healthcare industry – sitting in the waiting room waiting for an appointment. There are specific categories of waste that Lean targets:  
  1. Excess (or early) production - Overproduction
  2. Inventory - documents, forms, supplies
  3. Waiting  - e.g. delay in obtaining an appointment or test results
  4. Transportation (to/from processes) / Motion – e.g. leaving exam rooms for equipment, chart forms
  5. Extra Processing like inspection
  6. Defects
  7. Underutilized people / resources
The way I look at it, Lean is the management philosophy and Six Sigma is a great set of tools that help you chart your path. You have got to use Six Sigma first to reduce variation and then deploy Lean management to take your processes to a new level altogether. Some thoughts to leave you with, as I always do :-)  

SIX SIGMA is about supporting different situations with different and specific tools.

And LEAN is about looking for efficient solutions and reducing waste.


Payment and Transaction Processing economics
January 5, 2014

Ever since I worked at American Express, I have kept hearing from folks that “oh, these credit card companies make 4% of the transaction dollars”, so I feel like documenting the whole flow of this industry and examining the economics behind it. The way the transactions work is very well explained on this virtual school link: The mix of payment instruments evolves from the dynamics between customers, retailers and issuers – cash, checks, credit cards, debit cards, fleet cards, mobile payments, biometric payments, etc. From the retailers’ perspective, the range of instruments that retailers choose to endorse depends on the costs and benefits – and the remaining transaction value – of each.

For the credit card transaction processing industry in particular, there is a distinct value chain connecting merchants, processors, the card networks, and issuing banks.

But the current trends in this industry are as below:

  • Merchants, like Starbucks, have expanded pre-paid and store-card operations with payment processors, bypassing high cost clearing/ settlement networks
  • A consortium of major banks, such as BofA, Chase and Citigroup, have started or purchased a rival clearing / settlement network
  • Game-changing mobile and  internet initiatives, like GSM and Paypal, have expanded into the traditional payments marketplace and captured a small but growing market share
  • New lower cost alternatives, such as RevolutionMoney,  have successfully attracted merchants in direct competition with traditional providers and established a low cost credit card alternative
  • Core banking providers have developed integrated payment solutions embedded in their core banking applications
  • Interchange is the fee Processors pay to the issuing banks via VISA/MasterCard and these rates are set by VISA/MasterCard through their associations. Typically interchange rates are based on a % of volume for credit cards & a rate per item for debit cards. This is evolving as new players are jumping in.
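The interchange structure in the last bullet – a percentage of volume for credit cards and a per-item rate for debit cards – can be sketched directly. The rates below are illustrative assumptions, not actual Visa/MasterCard schedules:

```python
# Interchange sketch: percentage-of-volume for credit, flat per-item for
# debit. Real rates vary by card type, merchant category, and channel.
def interchange_fee(amount, card_type,
                    credit_rate=0.018, debit_per_item=0.21):
    if card_type == "credit":
        return round(amount * credit_rate, 2)  # % of transaction volume
    return debit_per_item                      # flat fee, regardless of amount
```

On a $100 sale the credit interchange here is $1.80 while the debit fee stays at $0.21 – which is why merchant economics differ so sharply between the two instruments.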

Plus, add to this mobile payments, with companies like Google Wallet jumping into the fray. The mobile ecosystem consists of the usual players – retailers/merchants and financial institutions – but also adds handset manufacturers and network operators.


Because these platforms needed to be built out, adoption varies across geographic areas; the EDC-GSMA Mobile Financial Services Survey tracks the percentage of people who use a mobile phone for banking transactions at least once a month in each region.


Big Data – within, at, and outside your Company boundaries…
November 16, 2013

So the title of this blog is so uninviting, and so overflowing in people’s inboxes, that I will have to do push marketing to make people read this :-). But with all our clients in almost all industries – from hospitality to financial services to retail – I never leave a strategic meeting without people wondering about the true strategic value of Big Data and asking questions like “Is it really that new?”, “You know us oldies in business and technology, we have seen it all and someone just came up with a new buzz,” and “What is this hoopla about the elephant called Hadoop?”
Well, like all things in life there are always perspectives and point of views. The usage of a process or technology to effect the top line growth and operational efficiencies is up to the culture, intent and execution mastery of organizations.
So Big Data from social media impacts a lot of business functions, but the greatest impact is in the front end of any business – your customer relationship management (CRM) processes and systems. It implies a fundamental shift in the way organizations interact with prospects, customers, employees, partners and other stakeholders. Leaders must take the first critical step of changing their mindsets and revising some long-held beliefs about building and managing customer relationships.
The traditional 4 Ps of marketing (the famous framework by Jerome McCarthy) have a new dimension – PEOPLE. Technology has enabled people to talk to anyone within the organization or outside.  
  • Outside the organization there are customer-to-customer peer conversations (opinion sharing, discussions, idea sharing, complaints, questions, jokes, gossip, news, etc.) for information sharing and relationship building. These happen on “Off Board” outlets (e.g. Facebook, Twitter, YouTube) – public Internet outlets that organizations do not control, but can choose to participate in
  • At the company-customer touch points there are customer-employee conversations (call centers, chats, forums, sentiment monitoring, brand association monitoring, etc.) for information sharing and relationship building. These use “On Board” technologies that can be integrated into your public-facing website to engage your constituents. Examples include discussion forums, ratings and reviews, idea solicitation communities, blogs, etc.
  • Within the company walls there are employee-employee conversations (opinion sharing, questions, blogs, etc.) for information sharing – hopefully nothing like the NSA goof-up. These rely on “In Board” technologies that leverage social media techniques to facilitate collaboration and knowledge sharing between an organization’s internal employees.
So Big Data, especially in the Entertainment & Hospitality business, is surely evolving and rapidly being put to use. Hotels, travel agencies, airlines, vacation clubs, and other industry players are unlocking the power of Big Data to dramatically improve products and services, thereby enhancing their competitive position and benefitting customers.  
  • Enhanced revenue management: Hotels recognize that data analytics are helpful in establishing the optimal price for rooms and ensuring that as few as possible are empty. Hotel chain Marriott takes this approach further, using big data for price optimization in restaurants, catering, and meeting spaces too.
  • Better relationships with hotel guests: When customer data is aggregated, instead of being fragmented across a hotel’s various divisions, the analytical insights lead to better marketing and customer service.
  • Targeted marketing: Promotions and targeting are being focused even more tightly to capture incremental marginal revenue.
But a major caveat to using all this is data quality. I was on a consulting panel recently where a good friend mentioned that companies are like true organisms – with a brain, muscle, fat, etc. But the data is truly the blood that keeps this organism alive and running. I loved this analogy and carried it further in my head. Bad data can spoil the functions. Also, polluted data (aka too much alcohol in the blood) can be devastating. Some threshold is OK. But that’s why we have data czars (aka police) catching folks doing a DUI (aka management making wrong decisions under the influence of bad data). Now, with a sudden infusion of more diluted blood into the stream, how can we use the RBCs (red blood cells) properly? The elimination of a lot of white noise in this big data, to get to the little clusters of information wealth, is where the value is. The things to note on Big Data are:
  • User-generated content is often triggered by emotion
  • Amplified via the “viral effect”
  • Impact cannot be stopped or undone
  • Does not follow the conventional rules of commerce
  • Forces companies to act in shorter cycle times
  • Fast and truly global


Healthcare Payment Integrity
October 10, 2013

A lot of health plans can generate incremental prevention savings by improving existing business practices, increasing the enforcement of claim payment policies, and developing new, robust solutions to increase claim payment accuracy.
Health claims functions everywhere face a similar set of challenges which inevitably lead to claims overpayments. Claims overpayment is one driver of increased medical cost that clearly can be controlled. In this world of Obamacare and the uncertainty associated with it, a lot of healthcare companies are honing this capability.  
The typical methodology for calculating incremental prevention savings is structured around the establishment of a savings baseline using the following four inputs:  
  1. Payment Ratio Report: To calculate a ratio that can be applied to a denied claim in order to derive the expected value of the denied claim (i.e. to determine the value of the claim, had it been paid). This is generally done by extracting all finalized paid claim lines over a 12-month period, then grouping the data at the claim type level (Facility/Professional) and calculating the ratio of the paid amount (net of any member liabilities) to the billed charges.
  2. Denials Baseline: To determine the value of historical denials associated with a particular payment integrity initiative prior to prevention implementation. This establishes a savings baseline that the company seeks to exceed through implementation of robust prevention measures.
  3. Leakage Baseline: To determine the value of claims inappropriately or inaccurately paid associated with a payment integrity initiative prior to prevention implementation.
  4. Denial Rate: The Baseline Denial Rate represents the percentage of the total suspect population that the client would expect to deny for a particular claims payment integrity initiative prior to prevention implementation. This is done by dividing the value of the Denials Baseline by the value of the total suspect population for the initiative.
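The four baseline inputs above reduce to simple ratios. A hedged sketch – all dollar figures below are invented for illustration:

```python
# Step 1: payment ratio = paid amount (net of member liability) / billed charges,
# computed over 12 months of finalized paid claim lines per claim type.
def payment_ratio(total_paid, total_billed):
    return total_paid / total_billed

# Applying the ratio to a denied claim gives its expected value had it been paid.
def expected_value_of_denial(billed_charges, ratio):
    return billed_charges * ratio

# Step 4: baseline denial rate = denials baseline / total suspect population.
def denial_rate(denials_baseline, total_suspect_population):
    return denials_baseline / total_suspect_population
```

So if $1M of billed charges historically settled for $600K, a denied claim billed at $10K is valued at $6K of prevented spend, and a $2M denials baseline against an $8M suspect population implies a 25% baseline denial rate.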


Customer Loyalty
May 21, 2013


Over the years working with Loyalty programs with so many companies in the Retail sector, I feel I can summarize some observations:
  • True customer loyalty is created when the customer becomes an advocate for the organization
  • Rewards alone don’t generate loyalty. If a loyalty marketing program is just about earning points, you end up buying loyalty, not earning it. The loyalty is to the program, not the product or the company.
  • Rewards-only programs can be easily replicated by the competition, will quickly be commoditized and become a defensive play that no competitor can afford to unwind.
  • Loyalty can be attained, but the organization has to work at it, continuously, and it will not be possible with all customers.
  • There is NO one-size-fits-all loyalty program
  • A win-win relationship must be established, and this cannot be accomplished if both parties cannot realize benefit. The two poles must be attracted to each other.


1. Compelling Value Proposition
  • That which provides the customer with a tangible benefit if he or she decides to join a benefit program.
  • Leading organizations adapt the value proposition for different segments of clients.
2. Satisfaction
  • Customer satisfaction is the degree to which customers feel their needs are met.
  • Short-term perspective, very much based on the transaction with the customer.
3. Loyalty
  • It is a feeling of connection to, and belief in, an enterprise and its proposition, created by a “feel good” factor from interactions that lead to continued relationships.
  • Loyalty is ultimately the crucial measure and it is more difficult to achieve than satisfaction.
  • A customer can be dissatisfied despite being loyal.
  • Loyalty can only be created on the basis of trust and repetitive positive experiences over time.
4. Advocacy
  • The pinnacle of customer loyalty is where the customer acts as an advocate for the enterprise.
In recent years, many banks have talked a lot about the importance of customer knowledge, but only a few of them have put successful actions behind their words. Companies still struggle with the basics of revenue growth areas:
  • Customer Segmentation: Who are my customers, and how do they differ?
  • Differentiated Treatment: How should I treat each customer segment?
  • Optimization: How can I optimize treatment decisions to maximize value at an individual level?
Segmentation is the ability to classify or cluster customers and prospects based on business rules or on inherent patterns in customer data and behavior, using advanced statistical modeling tools and techniques. To effectively use the gold mine of customer information, banks must at the same time develop the capabilities to aggregate, analyze, and use the customer data. And the best way to develop these capabilities is to create a specific unit at the headquarters/enterprise level. Let’s call this unit the “Marketing Factory”.
  • The first goal is to create an integrated view of each customer: marketing analytics achieves this goal by developing superior data management capabilities
  • The second goal is to understand and predict customer behaviors: marketing analytics achieves this goal by
    • Developing propensity scores
    • Realizing segmentation and profiling analysis
    • Realizing analysis of customer profitability and long-term potential value
    • Developing analysis of customer satisfaction and loyalty
    • Developing marketing and sales dashboards
  • The third goal is to provide insights that directly improve sales effectiveness. Marketing analytics achieves this goal by:
    • Identifying relevant commercial events and related offers
    • Defining the next product to offer for each customer
    • Identifying the most profitable combination of customer segment/channel/product thanks to optimization tools
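The segmentation and propensity work above can be illustrated with a classic RFM (recency, frequency, monetary) scoring sketch. The thresholds and segment names below are assumptions for illustration, not the post's methodology:

```python
# Score each customer 0-2 on recency, frequency, and monetary value,
# then bucket into treatment segments (names are hypothetical).
def rfm_segment(days_since_last, purchases_per_year, annual_spend):
    score = 0
    score += 2 if days_since_last <= 30 else (1 if days_since_last <= 90 else 0)
    score += 2 if purchases_per_year >= 12 else (1 if purchases_per_year >= 4 else 0)
    score += 2 if annual_spend >= 1000 else (1 if annual_spend >= 250 else 0)
    if score >= 5:
        return "advocate candidate"     # invest in advocacy
    if score >= 3:
        return "growth target"          # differentiated offers
    return "reactivation target"        # win-back treatment
```

In a real Marketing Factory these cutoffs would come from statistical models rather than hand-picked thresholds, but the segment-then-treat logic is the same.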


A fool with a tool is still a…
March 7, 2013

Today I was chatting with the global CIO of a Fortune 100 company, discussing how the alignment of his initiatives was going. He had some interesting analogies for how the business conducts the IT aspects of its operations. He said, “If you run a small business and you buy MS Excel, it doesn’t mean that you can manage cash flow better. Accounting has a process to it, a language, etc. that is key to understand before you do things in XLS.” He added, “It’s like my daughter buying an expensive digital camera – it doesn’t mean the pictures will be better. You need to understand the basics of depth of field, lighting, speed, and apertures before you can leverage that tool to your advantage.”

The debate between TOOLS vs PROCESS, and which is more important, happens at every strategic meeting and decision-making juncture. The third leg of the stool – PEOPLE – is always assumed present or easily brought in. Processes and tools go hand in hand, so the question again is which one comes first – the chicken-or-the-egg conundrum. It all depends on the industry you are playing in, the position you are in, and most importantly which CAPABILITIES you need. Technology is ever evolving, and with tools resulting from technology, one can argue that tools must lead the way for the activities we perform. But a good product, for example, has a limited life span in the marketplace. A good product development process, however, enables a company to create appealing new products over and over again. The alignment of processes and tools is about Efficiency – it is all about HOW the organization should be doing what it decides to take on. For this, companies need to think in 3 dimensions:
  • Differentiation “on the outside”—They need to have a clear view of what makes them unique—product, sales, service, brand, or business model. They need to deliver a consistently positive experience for customers in each market segment.
  • Simplification “on the inside”—They need simplicity in everything they do and this means standardized or componentized internal products, processes, and systems, with scalable and repeatable business models across the enterprise.
  • Execution mastery—They need to prioritize execution as a core capability with the right leadership skills, culture, and change and risk management.


A Smart phone a day keeps the doctor away
January 24, 2013

Another splurge on the media in the New Year is the deluge of ads for health and weight loss. They know the new year’s resolution is the time to close new members into gyms, diet courses, equipment sales, etc. But in this day and age of health options and the onslaught of mobile technology, it’s amazing to see how this industry is evolving. The interactions with people and patients is going from “episodic” to “continuous” with the advent of this technology:  
  • SIMpill: Smart pillbox to monitor medication and communicate with doctors
  • Proteus Pill: Ingestible sensor which sends digital signal to on-body receiver
  • Asthma Assistant: A 6-month pilot study of children and teens with severe persistent asthma found that technology-enabled daily communication helped patients better manage their conditions. Over the study period, patient adherence was high and there were no emergency department (ED) visits among the study population, compared to a national average of 2-3 annual visits among asthma patients. This technology enables data collection by the patient and then, on an as-needed basis, monitoring by the medical team and feedback based on medical algorithms.
  • Diabetes Assistant: LG Glucophone is already in use in S. Korea – this works alongside Infobia’s Eocene diabetic management system to ease the task of blood glucose management. The results of the blood tests will be sent to a secure server that graphs and manages the disease, sets up automatic texts of results and creates reminder alarms.
  • Texting for health: Nearly three-quarters of the people in the US have cellphones. SMS can be used to remind patients about their medications and also deliver info and encouragement to help patients manage their health.
    • In 2006, the drug maker PediaMed launched a mobile compliance campaign called 8TDAZE, involving a prescription acne treatment called TAZORAC, to remind teenagers to apply the treatment regularly.
    • Text4baby is a free mobile information service designed to promote healthy birth outcomes and to reduce infant mortality among underserved populations.
  • Mobile Imaging:
    • Nearly three-quarters of the world’s population doesn’t have access to essential medical imaging technologies (ultrasound, MRI, etc.). UCB researchers are creating portable medical imaging using mobile phones (data acquisition + display; remote computer for processing). The data acquisition device can be made with off-the-shelf parts that somebody with basic technical training can operate. And as for cell phones, you could be out in the middle of a remote village and still have cell phone access.
  • Diagnosis:
    • Symptom Navigator: Use the Symptom Navigator to figure out what you’re suffering from.
    • iEyeExam: With this app, you can give yourself a quick eye exam.
  With 85% of physicians using smartphones, there are many areas that mobile solutions will get into on the provider side too:
  • Schedule and scheduling management
  • Clinical record management
  • Patient accounts management
  • Accounts receivable management
  • Electronic insurance billing
  • Insurance claims management
  • Online patient registration and communication


Retail - Holiday shopping
December 14, 2012

With the holidays around the corner, everywhere you go, you get stuck in traffic especially if you are near a mall these days. The online and offline activity this season determines the economic flavor for at least a quarter or so. The retail industry anyway is pretty broad in that the value chains work differently for different consumer products and goods. But there are certain trends that are common:
  • “Retailization” is spreading as businesses across all industries vie for closer customer connections
  • Retail channels are continuing to blur and expand, generating new expectations from consumers and more cross-channel challenges for retailers
  • Shoppers are continuing to gravitate toward products and experiences that offer individual focus, interaction, customization, and cradle-to-grave offerings
  • Demand for online capabilities (and for a consistent experience) is increasing
  • Demographic shifts in spending power are driving retailers to rethink go-to-market strategies
  In these scenarios, when you begin analyzing the individual company needs, it is clear that winners will survive and gain market share by doing three things right:
  • Identify target customers by each purchase of target items
  • Identify the measured value and reference value of daily average contribution profit separately for purchaser and non-purchaser groups; the variance between the two values is identified as the effect
  • Convert this to the effect on overall profit increase
The winners in retail spend less money but target customers more scientifically and execute their investments more swiftly. To understand this, it is important to lay out the details of the company’s value chain. Finance, IT, human resources, and GNFR combine to manage the business, which consists of demand generation and demand fulfillment through various channels. Retailers have to connect with their customers, whenever they want, however they want, seamlessly.  

  • Product available for order on-line and collect in store or local delivery
  • Relationship with and other partners for non-stocked items
  • 24 hour operations
  • Staffed to hourly profiles
  • Loyalty support
  • Ordering capability
  • E-Payment through phone
  • Supervisor (B2E) enablement
  Loyalty across channels:
  • Extensive network of partners with shared, integrated loyalty program
  • Customer (history) identifiable in all channels
  • Customer call centers should be effective and efficient
They need to have product-centric operations, focused on “Right Product, Right Place, Right Time, Right Price”. Forecasting has to be driven by macro-factors as well as local conditions.

Application Rationalization - Get Healthy and Stay Healthy
December 4, 2012

It is widely understood that if a person eats right and exercises, then he or she can lose excess weight and be healthy. But in many cases, if the person returns to the same habits of unhealthy eating and lack of exercise, then the unwanted weight will return. A recent study by researchers at the University of Missouri (as described by the website Science Daily) takes it a step further. What they found is that even with regular physical exercise, people who are otherwise sedentary are at higher risk for chronic diseases such as diabetes, obesity, and liver disease. They found that it is not enough to exercise regularly if a person otherwise sits in one place for most of the day.

Likewise, keeping IT operations healthy requires more than occasional bursts of helpful activity to rationalize and standardize. For many companies, a majority of IS and IT budgets are allocated to application maintenance and support, often up to 85%. Not only does this decrease profitability, but it reduces available capital for discretionary spending and strategic initiatives. Several major factors have contributed to an increased focus on simplifying the application portfolio through rationalization and improved portfolio management, including:
  • Many years of distributed IS/IT spending and investment within specific functions and/or organization boundaries (no enterprise-wide investment management process)
  • Increased cost pressure and desire to improve the synergy of IS/IT investments across organization boundaries (eliminate redundant vendor/technology investments, consolidate IT assets)
  • Growing need to integrate infrastructure and enterprise solutions across external customers, suppliers and partners
  • Significant merger/integration activity to achieve economies of scale and remain competitive
  • Growing demands from the business to increase the strategic utilization of information technology and produce greater impact from the existing levels of IS/IT investment
  Unless applications are appropriately managed, the entire IT budget becomes operations and maintenance.    


  Rationalization is an ongoing activity to be re-examined as part of a regular exercise like annual planning.  Business–IT alignment and integration require that both parties take stock of the current situation, consider in what ways they wish to improve, and then determine how to get there. Different "physics" get a firm to this hairball of applications and infrastructure—governance and funding mechanisms, organization structure, changes in capabilities, leadership gaps, etc.   Some key characteristics of companies that may need rationalization:
  •  Cloned Systems
  •  Shadow Spending
  •  Past M&A Activity
  •  Lack of budget for new initiatives
  •  Stove-Piped Investment
  •  High Maintenance budget
  The Solution
  • Application Rationalization is a systematic approach to improving the business performance of IT application portfolios by reducing current system complexity and by aligning application direction to the priorities of the business. The primary objectives of an application rationalization effort include:
    • Improve the overall investment mix available to fund strategic new initiatives
    • Reduce complexity of the application and resulting infrastructure environment to improve ability to integrate business capability across internal and external parties
    • Reduce cycle time for new initiatives by eliminating unnecessary application/system complexity
  Application rationalization is most successful when completed in conjunction with an effort to change the leadership, processes and organization disciplines that manage and control IS/IT spending.  Most companies have separate efforts underway evaluating the IS/IT operating model and governance direction.  These efforts are complementary to application rationalization activities, and should be pursued in parallel to the application rationalization activities. Application Rationalization drives value for our clients across business, Information Systems (IS) and Infrastructure (IT) areas:    

Business Value Levers
  • Decrease business process cost
  • Improve asset utilization
  • Better decisions with shared information
  • Improve people and skill mobility
  • Improved agility in acquisitions
  IS Value Levers (Apps)
  • Decrease application development and maintenance cost
  • Decrease interface and integration costs
  • Decrease conversion cost of new efforts
  • Consolidate and reduce software licenses
  IT Value Levers
  • Consolidate servers and server support
  • Consolidate and reduce storage costs
  • Decrease system software licenses
  • Simplify and enhance development tools and development environment
  • Reduce Operations Support Costs
  Application rationalization initiatives should be treated like a program—they require the proper attention, training, budget, communication, staffing and skills, and partners. The application renewal strategies need to be grouped into logical programs. The team needs to identify organizational and process impacts and create near-term and long-term roadmaps. Like any other initiative, the team needs to identify a benefits realization method so that the business case progress can be managed.
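To make the rationalization idea concrete, here is a minimal sketch of how a portfolio team might score applications and suggest a disposition for each one. The dimensions, thresholds, application names, and scores are all illustrative assumptions of mine, not from any specific client engagement; a real assessment would use the firm's own criteria.

```python
# Illustrative sketch: score each application on business value and
# technical health (0-10 scales, hypothetical) and suggest a disposition.

def disposition(business_value, technical_health, threshold=5):
    """Classic 2x2: high/low business value vs. high/low technical health."""
    if business_value >= threshold and technical_health >= threshold:
        return "invest"     # strategic and sound: keep funding
    if business_value >= threshold:
        return "migrate"    # valuable but fragile: re-platform or rewrite
    if technical_health >= threshold:
        return "tolerate"   # sound but low value: minimal maintenance
    return "retire"         # low value, poor health: candidate to eliminate

# Hypothetical portfolio: app -> (business value, technical health)
portfolio = {
    "legacy_billing":  (8, 2),
    "intranet_wiki":   (3, 7),
    "order_mgmt":      (9, 8),
    "old_report_tool": (2, 2),
}

for app, (value, health) in portfolio.items():
    print(app, "->", disposition(value, health))
```

Even a toy model like this makes the budget conversation easier: the "retire" and "tolerate" buckets are where maintenance spend can be freed up for the strategic initiatives discussed above.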


Print, Print, away…
October 8, 2012

No matter how much people say we are completely digital, there is still a demographic that prints and uses paper. Electronic media certainly continue to reduce paper demand, and world demand for graphic papers especially is decelerating. But paper is all around us. Forest products, which refers to goods manufactured from the forest (trees), include everyday items like lumber, pulp, paper, corrugated packaging, and facial tissue. The North American forest products sector began inching toward a gradual recovery in 2011, with a slowing recovery expected from 2012-2017. The key emerging markets for forest products are Asia (notably China), Latin America, and Russia.   Forest products generally fall into five distinct categories. 
  1. Wood Products includes building materials like lumber, plywood and particleboard. 
  2. Tissue includes paper towels, bathroom tissue and facial tissue.  This segment will sometimes include technically complex, specialty items like baby diapers. 
  3. The paperboard segment includes products like corrugated and pharmaceutical boxes.
  4. The Market Pulp segment is a business-to-business segment and is not an end use form.  Market pulp is fluff or baled in form and is sold by companies to other forest products companies who in turn convert the pulp into paper, tissue, paperboard or some other product. 
  5. The Paper segment is fairly diverse and covers a range of items such as newspapers, paper grocery sacks, magazines and catalogs.

The forest products value chain is relatively straightforward, although individual process steps can be technically complex.
  • First, logs are harvested from a forest.  Increasingly, large forest products companies are divesting themselves of their land holdings and outsourcing the harvesting function.  The big drivers of divestment have been interest in timber holdings by the alternative investment community and the desire of forest products companies to unlock captive balance sheet value.
  • The logs are then transported to a saw or pulp mill.
    • In the saw mill, logs may be scanned to optimize yield before processing.  After unloading from a truck or train, the logs go through the sawmill and are cut into various forms of lumber and veneer. 
    • In the pulp mill, the logs are unloaded and then may be cut into smaller sizes before being chipped.  The wood chips are conveyed from the chip pile, screened and sent to a digester along with chemicals.  The chips and chemicals cook for a set amount of time, are released into a blow tank that separates the fibers, and then are sent to washing and possibly bleaching. A pulp-chemical slurry is formed in stock prep and piped to the paper machine, where it is formed on the wire.  The formed sheet then goes through several sections that remove water and is then wound on the dry end of the machine into a large roll called a jumbo roll, which can weigh many tons. The jumbo roll is typically immediately slit into smaller, easier-to-manage rolls of paper.
  • Conversion is an all-inclusive term for operations occurring after the paper mill.  Conversion can refer to sheeting, which is the process of turning web rolls into sheeted product.  It can refer to the manufacturing of corrugated products.  It can refer to the manufacturing of laminate.
  • Finished products are distributed to customers via truck, rail, and ship.
  • The final step in the value chain is recycling.  Recycled materials are sorted and reintroduced to the pulp mill.

One forest products megatrend is that biomass energy is key. As all industries set goals for the use of renewable energy, forest products, as a trendsetter in this area, will be held to higher standards for renewable energy use.  Partnerships inside and outside of the industry will be necessary to advance further in this area. Another key area is capital spending. In an effort to protect liquidity, companies are cutting back on capital spending.  Expansion projects, especially those outside of the emerging markets, will be few and far between.    Due to the ability to flex manufacturing capabilities and the global presence of major firms, the forest products industry tends to be fairly fragmented when compared to other commodity industries that have gone through major consolidation, like oil and steel.  

An important aspect of the forest products industry is the focus on sustainability.  An outgrowth of this has been the development of certification programs, which validate or verify topics like wood source, harvesting practices, and product recycled content.  There are three major certification programs currently competing in the industry:  Sustainable Forestry Initiative, Program for the Endorsement of Forest Certification, and Forest Stewardship Council.  Certified forest products could serve as an important source of competitive advantage in the future.  

Although players currently tend to concentrate in a segment of the industry, many can and do manufacture alternative products.  For example, a coated paper manufacturer like Stora Enso or NewPage will likely have the flexibility to produce coated or uncoated paper of varying thickness or weights from hardwoods and softwoods.   

Over time, there has been an oscillation between pure play in a segment and vertical and horizontal integration. International Paper is a good illustration.  At the company's integration peak in the 1990s, International Paper owned millions of acres of timberland, operated a chemicals division, and manufactured products that included kraft paper, printing and writing papers, boxboard, wood products, high and low pressure laminates, and oriented strand board, among others.  International Paper has since divested most assets except those directly related to the manufacturing of paper and packaging.  Generally speaking, forest products companies have found it difficult to extract the perceived value that integration would bring.    

<<Please PRINT this blog for the paper industry>>  


Tomorrow starts today – Future Shoppers and Retail
September 25, 2012

The changes that come across our daily lives are centered around 4 major components: shoppers, technology, society in general, and environment that we live in.  

The Changing Shopper - Over the next 10 years, demand for better and highly personalized service is bound to go up as consumers rapidly embrace newer technology. Concerns about sustainability and about consumer health & well-being will also find more prominence in the interactions that an organization has with its consumers.
  • Global demand for organic products continues to grow, with sales increasing by over $5 billion a year, according to The World of Organic Agriculture: Statistics & Emerging Trends
  • According to the Natural Marketing Institute (NMI), the American Lifestyles of Health and Sustainability (LOHAS) industry  is currently valued at US$209 billion, and estimated to comprise approximately 19% of the adult population in the United States, equaling a market of 41 million consumers
  • According to a report published by the Waste & Resources Action Program (WRAP), total food waste in the supply chain amounts to 11.3 million tonnes and total packaging waste to 5.1 million tonnes.
  • Consumers are increasingly leveraging technology (web 2.0, social networking etc.) to stay one step ahead of consumer product & retail companies
  • The growth of mobile features and device convergence such as wallet phones will drive up mobile sales
  The Changing Technology - In the coming years, newer business technology will enable manufacturers and retailers to become more adaptive to the rapidly changing environment
  • RFID already is, and will continue to be, a key technology to enable supply chain transparency in the future:
    • Conair, a company that manufactures food processors, blenders, etc., is leveraging RFID to item-level tag products from its manufacturing location in China to Wal-Mart stores in the US to enhance product visibility
    • Kimberly-Clark is collaborating closely with Wal-Mart through RFID technology by sending tagged pallets & cases to the retailer’s distribution center.
    • American Apparel successfully piloted RFID at the item level and, once funding is secured, expects to roll out RFID to all of its 260 stores in the future
  • Store visits will be enhanced by dynamic digital displays and by personalization through a hand-held device or the customer's own phone
    The Changing Society - The aging of societies in developed countries will have unexpected economic and political consequences. On the other hand, developing markets will see the rise of middle class and also witness increased levels of urbanization
  • It is estimated that the world economy will be about 80 percent larger in 2020 than it was in 2000, with average per capita income roughly 50 percent higher, according to UnHabitat
  The Changing Environment - Increased regulatory pressures from Government, depleting level of natural resources and shift of economic power towards the developing markets will completely alter the environment in which an organization functions
  • Unprecedented global economic growth will continue to put pressure on a number of highly strategic resources, including energy, food, and water, and demand is projected to outstrip easily available supplies over the next decade or so
  • Water scarcity will worsen due to unsustainable use and management of the resource as well as climate change
    • By 2030, the earth’s projected eight billion inhabitants will need 25% more water
    • By 2025, 2/3rd of the world’s population will be living in water stressed areas
    • It is interesting to see how this will impact food and beverage giants - Nestlé, Unilever, Coca-Cola Co., Anheuser-Busch, etc. - whose combined water needs are projected to approach 575 billion liters per year
    I guess the next couple of generations will really live in a different world after all.



Digital Payments
August 9, 2012

With Starbucks signing a deal with Square today, and my work in Atlanta, which is such a hub of the payment processing industry, I keep seeing the debate over what should be core competence and what should be outsourced within these companies. To elaborate, there are 3 major stakeholders in payments
  • Banks
  • Intermediaries: ACHs, SWIFT, CLS, etc.
  • Customers

The mainstream processes include
  • Transaction checking - Identification, authorization, authentication, reconciliation, warehousing
  • Processing transaction - balance verification, crediting & debiting accounts
  • Counterparty related processes - Creating settlement & clearing orders, creating channel report messages
The secondary processes include charging, billing, validation, repair, reconciliation, handling rejections, and process and monitor reporting.   It is easier to outsource clusters of related processes. Three main clusters of payment processing can be distinguished:
  • Transaction receipt and approve for further processing - receiving and preparing the order for final processing
  • Processing of the transaction - using payment information to debit and credit accounts
  • Communication with counterparts and clients - sending subsequent orders or reports to involved counterparts or reporting to clients
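The three clusters above can be sketched as stages of a simple pipeline. This is a hypothetical toy model of mine, not how any real payment processor is implemented; the function names, fields, and in-memory accounts are illustrative assumptions.

```python
# Toy sketch of the three payment-processing clusters as pipeline stages.
# All names and data structures here are hypothetical.

def receive_and_approve(order):
    """Cluster 1: receive the order, identify parties, approve for processing."""
    if not order.get("payer") or not order.get("payee") or order.get("amount", 0) <= 0:
        raise ValueError("invalid payment order")
    order["status"] = "approved"
    return order

def process_transaction(order, accounts):
    """Cluster 2: use the payment information to debit and credit accounts."""
    payer, payee, amount = order["payer"], order["payee"], order["amount"]
    if accounts[payer] < amount:
        raise ValueError("insufficient funds")
    accounts[payer] -= amount
    accounts[payee] += amount
    order["status"] = "settled"
    return order

def report(order):
    """Cluster 3: report the outcome to counterparts and clients."""
    return f"{order['payer']} -> {order['payee']}: {order['amount']} ({order['status']})"

# Example run with hypothetical accounts.
accounts = {"alice": 100, "bob": 0}
order = receive_and_approve({"payer": "alice", "payee": "bob", "amount": 40})
order = process_transaction(order, accounts)
print(report(order))
```

Because each cluster has a narrow interface (an order in, an order out), it is easy to see why clusters, rather than individual processes, are the natural unit to outsource.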
Many companies in this space are centralizing their payments for control, cost reduction, and risk management. This way they can reduce the number of banks they are dealing with. They are also implementing a payment factory solution with the following considerations:
  1. Quantitative Factors
  • Improved forecasting opportunities – better liquidity management
  • Improved reconciliation process for incoming payments
  • Less external supplier payments
  • Less interfaces
  • Improved negotiation power towards preferred bank, i.e. float and transaction costs
  • Limited spread on cross border payments
  • Less external bank accounts
  2. Qualitative Factors
  • Bank Independence
  • Enhanced overview, security, and control of payment flows
  • Competitive advantage from highly efficient standardized processes
  • Business units can focus on their core business, i.e. payment processing is handled by a central competence centre
  • Standardized EDI solutions
  • Meet demands regarding payments from sophisticated suppliers and retailers, i.e. unpack and restructure payment files


Digital Media – Charles Darwin like evolution
January 15, 2012

Life used to be easy:

But the ecosystem has evolved and is much more complex today:

Originally characterized by a strict separation of roles, the current complex ecosystem is evolving with blurred lines between players across the value chain. There is no doubt that Digital Services are disrupting the marketplace – changing consumer usage patterns, driving new distribution models, and breaking down value chain boundaries as players migrate their positions to provide new consumer-driven experiences. Digital Services are the intersection of devices, content, and networks which provide services previously delivered in physical form. Digital Services are changing the way we monetize traditional media products. A variety of business models are being deployed – all attempting to ‘crack the code’ on digital monetization. Some Examples of ‘Free-to-Consumer’ economic models are:
  • Freemium - Free basic services with charge for premium features or content
  • Ad Supported - Free content and services supported by paid advertising
  • Cross Subsidies - Free content subsidized by other products or services 
  • Infomediary - Sale of customer information to third parties
    But for many players in this industry, blended economics provide multiple potential revenue streams, allowing tradeoff between revenue generating components, revenue timing, etc.:
  • Bundle: Content or Service Subscription + Device Example: content or service contract + free / subsidized device
e.g., Digital Content + Device + Service = Verizon VCast
  • Ad-Supported + Merchandising Example: streaming content + sale of physical content & merchandise
e.g. digital content + advertising = Yahoo!, Google
  • On-Demand + IP Licensing Example: direct content rental/sale + licensing of brand rights for related content / merchandise
e.g. Digital Content + Device = Apple
  • On-Demand + Affiliate Example: direct content rental / sale + ‘white-label’ solution for 3rd parties
  • Infomediary + Free Content or Service Example: aggregate and sell consumer data + proceeds fund free content or services
e.g., Digital Content (Supply Chain/Device Integration) = Amazon Unbox

The trend toward increasing user-generated content consumption is clear (at least in some demographics). But the viability of operating a business based solely on hosting and distribution of user-generated content is still unproven – primarily due to inconsistent content quality and the significant cost of distributing rich media. However, the traffic generated by providing tools for editing, storage, sharing, and syndication of UGC, coupled with user data capture, can provide a valuable asset for indirect monetization:



Innovate or die trying…
December 10, 2011

Ever since the passing of Steve Jobs, people still talk of innovation, and Apple comes into the discussion. People, even after reading the book, are enamored of the way this company's products have really changed how we do business, how we interact with each other, seek entertainment, and just keep up with our flights, schedules, etc. The discussion of innovation vs. scalability and growth vs. process is so old that it bears no introduction. One group of people assumes that too much process can kill innovation. And the capitalist mantra is that you've got to scale and repeat your work to get economies of scale, etc. There is a debate about whether Six Sigma, being so focused on end results and numbers, can be stifling for innovation. But as William C. Taylor said in Change This, “The most creative companies I’ve met don’t aspire to learn from the ‘best in class’ in their industry—especially when best in class isn’t all that great. Instead, they aspire to learn from innovators far outside their industry as a way to shake things up and leapfrog the competition. Ideas that are routine in one industry can be revolutionary when they migrate to another industry, especially when those ideas challenge the prevailing assumptions that define so many industries”. As Alan Kay said, “The best way to predict the future is to invent it”. Some firms create visions of the future by building prototypes that combine new technologies in innovative ways, illustrating how emerging technologies will have a significant impact on their clients' businesses. Everyone agrees that growth in the size of a company creates some level of redundant assets - in both physical assets and knowledge assets as well as human resources.  More growth creates more complexity – the “carbohydrate” of business. 
But if you resemble a “hand-made shoe” business too much - customized & expensive to change, non-scalable, non-replicable, expensive to integrate new businesses - you are not viewed highly on the shareholder value tree. So what’s the right balance? Part of the answer is in the CUSTOMERS. Customer demands and expectations are evolving beyond “convenience” - as their maturity in seeking your products/services grows, they go from telling you “Be convenient” to “Recognise and know me” to “Simplify my life” to, finally, “Empower me”. So innovation becomes about doing the right things and doing them correctly in these steps:
  • Concept
  • Feasibility
  • Pilot
  • Implementation
Just reading so many quotes on innovation on Idea Match tells you that many innovations start WITHOUT knowing exactly all the future uses of that innovation. As Martin Luther King Jr. said, “Take the first step in faith. You don't have to see the whole staircase, just take the first step." But in the end you've got to do it for the joy of exploring – the inquisitive mind. As Steve Jobs did say, “I was worth over $1,000,000 when I was 23, and over $10,000,000 when I was 24, and over $100,000,000 when I was 25, and it wasn’t that important because I never did it for the money. I want to put a ding in the universe."   


Agile – The Good, The Bad, and the lets-not-go-there
November 17, 2011

More and more these days, when we walk into delivery organizations, the methodologies used for software development are replete with flavors of Agile. Dealing with so many partners on the business side, and even with IT leadership, questions abound around these methods:
  • What is agile development?
  • What are the differences between “traditional” (waterfall) and Agile?
  • What are the key tenets of agile development?
  • What types of work are appropriate for an agile approach?
  • What is the real value proposition?
  Over the last couple of years, numerous software development methodologies have been introduced to guide development teams in achieving their quality goals. Some methodologies involve a disciplined and detailed process with a strong emphasis on planning. Examples of such methods include the Capability Maturity Model (CMM), the Software Process Improvement and Capability Determination model (SPICE), the Team Software Process (TSP), etc. More recently, agile methods have been advocated as a new paradigm for high-speed, quick-to-market software development. Examples include Behavior Driven Development (BDD), Extreme Programming (XP), Scrum, etc. The basic critique of the agile software development evangelists is that conventional software development processes are too rigid to achieve the end results, let alone the desired quality factors, for contemporary user-experience-driven projects. But one thing is sure: no matter which process or methodology is applied, the bottom line is that software processes have an impact on the IT systems produced. As Alistair Cockburn mentioned in his book, Agile Software Development, the size and complexity of the problem should dictate the “heaviness” of the methodology. The methodology should be designed to support the successful delivery of large, complex projects and may be scaled down for smaller projects. “Agile Methods” is an umbrella term for methodologies and practices that endorse what is known as the Agile Manifesto. The Agile Manifesto was written in 2001 and reflects the thinking of 17 advocates for lighter-weight methodologies.  The different frameworks are listed below, but one of the principles that caught my eye when I started with this methodology was YAGNI. This principle promotes the “simplest design possible” to meet the current set of requirements—i.e., don’t spend time designing for future requirements.  “YAGNI” stands for: You Aren’t Going to Need It.  
Here’s the argument:  it is actually cheaper to implement a simple solution that meets the current requirements and to refactor later, only when necessary, to meet new requirements, than it is to implement a complex solution that attempts to address current and future requirements.
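The YAGNI argument can be made concrete with a toy contrast of my own (the discount example is illustrative, not from any particular codebase): the simple function meets today's requirement, while the speculative "engine" adds code to write, test, and maintain before anyone has asked for a second rule.

```python
# YAGNI illustration (hypothetical example). Requirement today:
# apply a flat 10% discount.

# YAGNI-style: the simplest thing that meets the current requirement.
def discounted_price(price):
    return price * 0.90

# Speculative design: a pluggable rule engine "for future requirements"
# that nobody has asked for yet.
class DiscountRule:
    def __init__(self, rate):
        self.rate = rate

    def apply(self, price):
        return price * (1 - self.rate)

class DiscountEngine:
    def __init__(self):
        self.rules = []

    def register(self, rule):
        self.rules.append(rule)

    def price(self, price):
        for rule in self.rules:
            price = rule.apply(price)
        return price

# Both produce the same answer today; the simple version can be
# refactored into the engine later, when (and if) a second rule appears.
engine = DiscountEngine()
engine.register(DiscountRule(0.10))
print(discounted_price(100), engine.price(100))
```

The refactor-when-needed path defers the cost of complexity until a real requirement justifies it, which is exactly the economic claim YAGNI makes.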
  • Extreme Programming (Kent Beck, Ward Cunningham, Ron Jeffries).  Extreme Programming, or XP, is the best known of the Agile Methods.  XP is actually a highly disciplined process that expects all of the XP practices to be followed, including  user stories, iterative development, test first, pair programming and continuous integration. 
  • Crystal (Alistair Cockburn).  Crystal is a family of methods that provides different levels of "ceremony" depending on the size of the team, the criticality of the project, and the skill level of the people.  Its practices are drawn from Agile and plan-driven methods, as well as psychology and organizational development research
  • Scrum (Ken Schwaber, Jeff Sutherland, Mike Beedle).  Scrum is an agile framework that supports software and product development.  Wrapping existing engineering practices, including Extreme Programming, Scrum generates the benefits of agile development with the advantages of a simple implementation.  The name “Scrum” is derived from an activity that occurs during a rugby match. (A group of players forms around the ball and the team-mates work together, sometimes violently, to move the ball downfield.)
  • FDD - Feature Driven Development (Jeff DeLuca, Peter Coad).  FDD is a lightweight, architecturally-based process that first establishes the overall application architecture and features list and then proceeds to design-by-feature and build-by-feature.  The Chief Architect and Chief Programmer are key roles in FDD, and the use of UML and other object-oriented design methods is strongly implied. 
  • DSDM - Dynamic Systems Development Method.  DSDM is more of a framework than a method.  DSDM is highly iterative and incremental.  It has seven phases, which are repeated during the lifecycle of the project. 

Characteristics of an Agile Project

  • Prioritized backlog of user stories
  • Evolves as stories are completed. The RUP parallel to the waterfall is that user specifications are broken into themes, stories, and features, in that hierarchical order
  • Always production ready
  • Ships early and often
  • Designated Product Owner
  • Work performed by small cross-functional teams
  • Team is self-organized, makes own commitments, and highly motivated by rapid and continuous success
  • After each Sprint, team reviews product and process to identify and implement improvements
  • Work completed in 2-4 week development  “Sprints”, each focused on small set of features
  • Work done in priority order
  • Spec, design, code and test done concurrently for each feature
  • Signoff by business client at end of each sprint
  • Progress tracked by # of features completed
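The last characteristic above, "progress tracked by # of features completed," can be sketched numerically: given stories completed per sprint, compute the team's velocity and a naive forecast of sprints remaining. The numbers and function names are illustrative assumptions, not from any real project.

```python
import math

def velocity(completed_per_sprint):
    """Average stories (or points) completed per sprint so far."""
    return sum(completed_per_sprint) / len(completed_per_sprint)

def sprints_remaining(backlog_size, completed_per_sprint):
    """Naive forecast: remaining backlog divided by average velocity,
    rounded up to whole sprints."""
    remaining = backlog_size - sum(completed_per_sprint)
    if remaining <= 0:
        return 0
    return math.ceil(remaining / velocity(completed_per_sprint))

# Hypothetical history: stories completed in sprints 1-3.
history = [8, 10, 9]
print(velocity(history))               # average stories per sprint
print(sprints_remaining(60, history))  # forecast for a 60-story backlog
```

This kind of count-based tracking is deliberately simple: it measures only finished, production-ready work, which is the point of the characteristics listed above.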

Challenges of Agile Methods

  • Difficult to apply on large engagements.  Because Agile Methods depend so much on face-to-face communication, they are difficult to apply on large engagements.  For instance, the upper limit for an Extreme Programming project is approximately twenty people, the maximum number of people you can reasonably fit in an average room.  However, in the case of large engagements, delivery is typically split in several delivery projects, and Agile Methods may be effective at the project level, if the projects are independent enough.
  • Agile Methods require a cadre of responsible developers.  Furthermore, developers must be able to actively participate in the planning of their individual tasks.  Compared to the needs of a traditional project, an agile project requires a group of developers with slightly deeper skills. 
  • Agile Methods require an agile champion.  The adoption of any methodology is a cultural change and requires a champion.  The champion should lead the team with energy and enthusiasm.  If the champion can also play the role of an agile coach, all the better.
  In Balancing Agility and Discipline, Barry Boehm and Richard Turner propose an effective technique for evaluating the applicability of Agile Methods for a given project by evaluating the following factors:  size, criticality, dynamism, personnel and culture. Successful agile projects are likely to have the following characteristics:
  • Size: The number of people on the project is small, typically less than thirty 
  • Criticality:  The solution is not so important that failure means disaster of one kind or another
  • Dynamism:  The degree of expected change to requirements is significant
  • Personnel:  Project personnel tend to be more skilled and able to direct their own work 
  • Culture:  The culture of the organization thrives on change and uncertainty

By ranking a project around the five axes, it is possible to determine if the project is a good fit for Agile Methods.  Agile projects map to the center of the graph, whereas more traditional (described here as plan-driven) projects will map to the outer fringes.
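A Boehm-Turner style assessment can be sketched as a small scoring function. To be clear, the 0-10 scale, the averaging, and the 5.0 cutoff below are my own illustrative assumptions for demonstration; only the five axis names come from the text (the book itself uses a polar "home ground" chart rather than a single score).

```python
# Hypothetical sketch: rate a project 0-10 on each Boehm-Turner axis
# (higher = more agile-friendly) and average the ratings.

AXES = ["size", "criticality", "dynamism", "personnel", "culture"]

def agile_fit(ratings):
    """Return (score, recommendation) for a dict of per-axis ratings."""
    missing = [axis for axis in AXES if axis not in ratings]
    if missing:
        raise ValueError(f"missing ratings for: {missing}")
    score = sum(ratings[axis] for axis in AXES) / len(AXES)
    return score, ("agile" if score >= 5.0 else "plan-driven")

# Example project (illustrative ratings):
project = {
    "size": 8,         # small team
    "criticality": 7,  # failure would not be disastrous
    "dynamism": 9,     # requirements change often
    "personnel": 6,    # skilled, self-directing staff
    "culture": 7,      # organization thrives on change
}
print(agile_fit(project))
```

As with the graph described above, projects scoring near the agile-friendly end of every axis map cleanly to Agile Methods, while low scores on several axes suggest a plan-driven approach.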


October 8, 2011

With my book on “IT Strategy and Business Value” coming out soon, I was even more interested in the eReaders space than just the occasional downloads on my Kindle or the nook that my wife has. I began reading about the trends in this product segment and its global success across all demographics.
  • The digital assets distribution business is pretty global already with many players and the service offerings range across the whole value chain  - connectivity (GoSpoken), infrastructure (DigiPlug), Content  Management (CodeMantra, Milibris), Content Distribution (IngramDigital), and additional software (LibreDigital).
  • Electronic consumer books are currently growing at 50% on a global scale (Source: PwC Global Media & Entertainment Outlook)
  • Consumer electronic book sales are still about 2% of total book spending. It’s projected to increase beyond 5% by next year (2012).
  • Audiobook sales are often counted together with print books and are estimated to be a $1B market (Source: Audio Publishers Association)
  • In 2007 Google introduced the continuous browsing feature for readers to get 20% of consecutive content from a book
  • In 2008, publishers like Harper Collins and Random House began putting entire books online for free. The thought is that this creates the free store-browsing experience and will not cannibalize print sales because, for a 300+ page book, readers will truly prefer the paper version.
  • Project Gutenberg offers over 36,000 free ebooks whose copyright has expired in the US
Like anything else, this is surely an evolving path, and we'll see how it goes. The music industry did see a major destructive force but has come out with new ways of distribution to offer consumers what they want at the price they seek.


Change is inevitable, except from a vending machine
July 14, 2011

When I saw this picture of org charts drawn in a lighter vein by Emmanuel Cornet, a software engineer at Google Inc., and published on his site Bonker’s World, it was funny and serious at the same time.

The operating model of a company consists of culture (norms, motivations, behaviors, values, etc.), structures, teams, jobs, roles, etc. Many organizations introducing new initiatives or undergoing transformation feel that the installation of their new systems, processes, and/or people will mean adoption by the organization. They feel that the majority of the work is getting these systems and processes deployed, and that deployment will automatically lead to realization of the value goals. But all too often, even when these initiatives (systems, processes, and tools) are properly installed, the return expected from the investments does not occur because the changes are not adopted. It is one thing to install a new CRM system for customer intimacy, but totally another to enable the internal and external sales force to adopt and use that system to offer a single view to the customer when that customer touch point happens. The root of all change enablement lies in understanding the composition of the organization and knowing which blocks are key to achieving adoption and realization of the value goals. At the basic level, each organization has the following components:
The Operating Model is the blueprint of how each component within the company functions.  The leaders need to be thinking of the following questions to effect the changes they seek:
  • Who are the stakeholders involved – the sponsors, change agents, targets, and customers?
  • Who owns the change management responsibilities?
  • How are the different change management streams interfaced: value management, demand management, resource management?
  • Are changes assessed on the value they will deliver to the business prior to their approval?
  • How are associated risks assessed?
  • Is there appropriate infrastructure in place to manage change-related activities such as training, re-skilling, communication planning, etc.?
  • How is resistance to internal and external changes managed within the organization?


Everywhere we go, people want to know…Mobility Solution Strategy
June 9, 2011

I remember when I was in high school and went to Germany for the first time as an exchange student through PAD (Germany’s Educational Exchange Service for international cooperation in education), students from all across the world would get together and spend some months with each other. We’d go through the streets of Berlin, Rheinberg, etc. and sing “everywhere we go people want to know, who we are so we tell them, we are the P-A-D, mighty, mighty P-A-D”. Of course, I hear that on the minor league baseball fields in the US too. And one hears a similar note in the world of IT, where everyone needs to evaluate and execute some Mobility Strategy depending on their industry, the players, and their positioning. Few can argue with the merits and sheer market force of mobile technology. Already, more than one billion mobile devices are in use globally. The Yankee Group wireless survey indicates that 55 percent of large U.S. businesses will deploy a wireless wide-area data solution by next year. The numbers are impressive.
  • Revenue from mobile data overtook fixed voice in the US in 2011
  • Number who access web daily from mobile nearly doubled between Jan 2008 and Jan 2009 to 22.4 million (comScore).
  • Number of US smartphone subscribers went up by 72% in 2010 (Nielsen).
  • And then last week we got Google Wallet, the final frontier for mobile solutions – paying with your smartphone in the US.
  Yet the key to creating mobile business value is not in the technology itself, but in re-engineering key business processes for market-driven, high-performance business change. Enterprise mobility is emerging as a key enabler across all aspects of an enterprise strategy, particularly as it paves the way for inevitable “everywhere connectivity.” Companies that thrive will be those that mold mobility into a standard capability and fundamentally transform the way business is conducted - not just work unplugged. Pioneer companies are using it internally as well as externally for higher revenue, lower costs, faster time-to-market and better customer service. The key benefits that companies seek: Internal uses:
  • Workforce Management
  • Assign  / Schedule tasks (alerts, completion)
  • Manage employee staffing
  • Improved workflow efficiency
  External uses:
  • Sales force enablement
  • Transaction processing (POS)
  • Multi-channel integration           
  • Increased purchase process efficiency
  Going forward, bundled solutions designed for mobility will drive adoption, and seamless network convergence will facilitate growth. Companies developing context-sensitive solutions are already winning over consumers.



Cloud Nine
May 9, 2011

It is obvious to anyone that cloud computing turns computing into a variable cost rather than a fixed cost by providing compute capacity on demand. Two key technology enablers have led to the adoption of the cloud: increased network bandwidth and infrastructure virtualization. But the narrow differences between the loosely related buzzwords beg definition:
  • Virtualization – a set of technologies to create a virtual computing infrastructure by allowing division of physical assets (processing power, storage, and network bandwidth) into virtual machines (e.g., VMware virtual server)
  • Elastic Computing (EC) – a technology for provisioning and load balancing that doles out the virtual infrastructure on demand (e.g., Amazon’s EC2)
  • Grid Computing (GC) – a computing architecture in which a large number of individual computers work in parallel to solve a problem (e.g., Google’s MapReduce)
  • Utility Computing (UC) – a business or pricing model in which the user of computing resources pays (only) for the resources they use
  • Cloud Computing (CC) – typically some combination of the above to provide a fluid pool of remote computing. “Fluid” in the sense that computing can more readily be divided (virtualization), doled out on demand, and combined.
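The Utility Computing bullet above comes down to metered usage times unit rates, with no fixed capacity cost. A minimal sketch in Python (the resource names and rates are invented for illustration, not any provider's actual price list):

```python
# Utility computing: pay only for the resources consumed.
# All rates below are illustrative, not a real provider's price list.
RATES = {
    "cpu_hours": 0.10,    # $ per virtual-machine hour
    "storage_gb": 0.05,   # $ per GB-month stored
    "bandwidth_gb": 0.08, # $ per GB transferred out
}

def monthly_bill(usage):
    """Metered usage multiplied by unit rate, summed across resources."""
    return sum(RATES[resource] * amount for resource, amount in usage.items())

# A small workload: 2 VMs running all month, 100 GB stored, 50 GB out.
bill = monthly_bill({"cpu_hours": 2 * 730, "storage_gb": 100, "bandwidth_gb": 50})
print(round(bill, 2))  # 155.0
```

The point of the model: double the workload and the bill doubles; shut everything off and it drops to zero.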
  And the types of cloud:
  • Public cloud: cloud computing in the traditional mainstream sense. An off-site third-party provider shares resources and bills on a fine-grained utility-computing basis. No assets are owned by the user, and scope is open to anyone who can pay for the service as delivered by the provider
  • Private (internal) cloud: emulating cloud computing on corporate datacenters with access through private networks. Assets are owned by the company/entity, and scope is bounded by an exclusive membership the company/entity defines
  • Hybrid cloud: some resources are provided and managed in-house, and others externally.
  The adoption of cloud requires a number of technology and operational considerations:
  • SLA – service levels agreed between provider and user on various fronts like uptime, backup, security, privacy, etc.
  • Compliance – in order to comply with various regulations, users may have to adopt community or hybrid deployments.
  • Open source – much of the cloud is based on open standards and open-source software; a detailed study is necessary before adopting certain public cloud offerings.
  • Open standards – cloud providers use a mixed set of APIs; interoperability should be assessed before adoption.
  • Legal – use of public cloud services may be restricted for certain business functions.
  • Privacy – the cloud is a shared environment, so privacy requirements have to be validated.
  • Governance – the controls and processes that make sure policies are enforced.
  • Portability – the ability to move components or systems between environments.
  • Integration – in the cloud, integrating various components and systems can be complicated by issues such as multi-tenancy, federation, and government regulations.
  • Interoperability – very critical for successful adoption of the cloud.
  • Sustainability – more relevant to the private cloud; achieved through improved resource utilization, efficient processes, etc.
  • Security – today a contentious issue that may delay cloud adoption; many companies are making large investments to develop cloud security models.
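On the SLA consideration, it helps to translate an agreed uptime percentage into the downtime it actually permits; the arithmetic is trivial but the results often surprise people. A quick sketch (assuming a 365-day year):

```python
def allowed_downtime_hours(uptime_pct, period_hours=365 * 24):
    """Hours of downtime permitted per period at a given uptime percentage."""
    return period_hours * (1 - uptime_pct / 100)

# "Three nines" sounds close to perfect but still allows most of a working day.
for pct in (99.0, 99.9, 99.99):
    print(pct, round(allowed_downtime_hours(pct), 2))
```

At 99% uptime a provider can be down roughly 87.6 hours a year; at 99.9%, about 8.76 hours; only at 99.99% does it drop below an hour.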
  From the business side of this technology, the CEO should be concerned about:
  • How will companies react to the rise of a new competitor built on the cloud “pay as you go” model? E.g., can a new company start up with near-zero investment on a pay-per-use billing system?
  • Is it possible for my company to enter new markets using cloud solutions? Which opportunities make sense?
  • How will I negotiate and partner with my IT vendors if I can instantly switch my processing power from one to another?
  • How will one global online marketplace modify the dynamics of customer relationships for non-IT products and services?
  • Will cloud computing require more aggressive moves against my competitors, so I can benefit from first-mover advantage?
  • Can I offer any cloud-based solution to my suppliers, to expand my knowledge and control through the value chain?
  • How can my IT vendors create competitive advantage for themselves through the cloud? What should my company’s position be so as not to be harmed?


The 5th P in Marketing
March 9, 2011

Well, it’s that time of the year in Atlanta – beautiful weather before the air goes green with pollen :-) and we were out riding our bikes when I saw my wife typing away on her iPhone. After inquiring what she was up to, I learnt that she had bid on an oriental rug she wanted for our dining room and was trying to log in to the website. Being as security-risk-averse as I am, I suggested she be careful, since the airwaves could enable hackers to steal her credit card info. She rolled her eyes and continued. That set me thinking: in this day and age, how many folks are really using mobile platforms to pay? I found my answers on some websites that do exhaustive surveys. One of the most respected:


So the social media experience is still evolving from the Learn/Social stage to “Do Business”. The list of the top 15 most popular social networking websites confirms this pattern. Hence the recent interest in Social CRM – the use of social media to move beyond the limitations of traditional marketing, sales, and customer service to a continuous mode of relationship-building with customers. Talking to so many leaders in this area, I have realized that leaders must take the first critical step of changing their mindsets and revising some long-held beliefs about building and managing customer relationships. The traditional principles were focused on the 4 P’s of marketing:
  • Product
  • Promotion
  • Price
  • Place
  But the 5th P – People – and their engagement for brand creation, for a consistent and appealing experience, and for adding value through interactions is key. Social CRM is not only a new channel but a fundamental shift in how to engage and interact with more empowered customers:
  • It completes the seismic “shift of power” to consumers. They are enabled to influence all things and become co-owners of the brand
  • It is about neither an anonymous mass nor individual customers, but about individuals within a community: influencers, creators, and consumers
  • It blurs the lines between Marketing, Sales and Service
  • It makes analytics and technology critical enablers for real-time agility, flexibility, security, and repeatability, especially in industries like retail and hospitality
  Of course the speed with which this area of consumer engagement is affecting us is depicted by the time to achieve 50 million users:
  • Radio = 38 years
  • TV = 13 years
  • Web = 4 years
  • Facebook = 3.5 years
  This cartoon from Hugh MacLeod summarizes how important this is:


US Freight Industry
March 2, 2011

Having spent some time in Supply Chain Management and having seen the freight industry in the US and Europe, I have always been very intrigued by these facts/patterns, per The World Factbook:
  • The U.S. population of over 307 million (2009 estimate) is exceeded only by those of China and India, and is characterized by an extraordinary diversity in ancestry.
  • U.S. exports represent more than 10% of the world total.
  • In order to serve the U.S. economy the multimodal transportation network of highways, railroads, sea ports, pipelines, and airports moves, on a typical day, about 43 million tons of goods worth about $29 billion.
  • The U.S.’ Interstate Highway System, a 44,000-mile national network of multiple-lane expressways, is found in all 50 states and connects 90% of all cities with a population of at least 50,000.
  • Railroads move about one-third of the U.S.’ intercity freight traffic.
  • Navigable waterways are extensive and center upon the Mississippi River system in the interior of the U.S., the Great Lakes–St. Lawrence Seaway system in the north, and the Gulf Coast waterways along the Gulf of Mexico.
  • The U.S. has experienced spectacular growth in airplane traffic since the mid-20th century, and currently has nearly 5000 paved public airports.
  Another great source of information is the American Transportation Statistics DB


Incident Management vs. Problem Management
February 1, 2011

I was talking to an executive at a Fortune 500 company who was struggling to put in place some SLAs for his IT organization. He wanted to show the business side that he was making operational improvements. One of the things he was struggling with was the Incidents and Problems that folks were capturing. He could not get his team to look at these separately, and we talked about how Incident Management is related to Problem Management and how the two should differ. I thought of capturing my point of view below. Whether you use ITIL terminology or not, here’s the difference:
  • An Incident is any event that is not part of the standard operation of a service and that causes an interruption to, or a reduction in, the delivery of that service. Incident Management is concerned with restoring service to a user as quickly as possible whenever an incident occurs.
  • A Problem is generally an application-related occurrence or event that causes some level of disruption to normal client business operations; e.g., incidents with a major impact or of a repetitive nature may have an underlying problem. Problem Management determines the root cause of problems, identifies interim workarounds if available, and implements long-term solutions to prevent their recurrence or mitigate their impacts.

An incident is like a car getting a fender bender; a problem is like the timing belt that has not been replaced for some time :-) The typical process steps for Incident Management are as below:
  • Detect & Record – the incident / service request is detected and recorded in the service management tool
  • Categorize & Prioritize – the appropriate ticket category is selected, and the impact and severity of the incident determine the priority, which is agreed with the user
  • Provide Initial Service Request Support – wherever possible the request is fulfilled at the time of capture; otherwise it is routed to an appropriate support group. Security requests are routed to the Security Group and the Security Management process
  • Provide Initial Incident Support – wherever possible the incident is resolved at the time of capture; otherwise it is routed to an appropriate support group
  • Receive & Accept – the receiving support group confirms its ownership and initiates resolution of the incident or fulfilment of the service request in accordance with the priority service levels
  • Investigate & Diagnose – knowledge bases are accessed to determine if there is a related open or closed incident or problem, or a similar request. Incidents are assessed to identify a solution or workaround, and service requests to determine how to fulfil them
  • Resolve / Fulfil – the solution is developed and confirmed as acceptable by the user before implementing and verifying in production, which may invoke Change Management; if the incident is thought to be caused by an underlying problem, the Problem Management process is invoked
  • Close – the ticket is updated with resolution results, and positive confirmation from the user is recorded before closing the ticket
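The Categorize & Prioritize step above is often implemented as a simple lookup matrix: impact and severity in, ticket priority out. A minimal sketch (the matrix values are illustrative, not prescribed by ITIL):

```python
# Illustrative impact/severity matrix for the "Categorize & Prioritize" step.
PRIORITY_MATRIX = {
    ("high", "high"): "P1 - Critical",
    ("high", "medium"): "P2 - High",
    ("medium", "high"): "P2 - High",
    ("high", "low"): "P3 - Medium",
    ("medium", "medium"): "P3 - Medium",
    ("low", "high"): "P3 - Medium",
    ("medium", "low"): "P4 - Low",
    ("low", "medium"): "P4 - Low",
    ("low", "low"): "P5 - Planning",
}

def prioritize(impact, severity):
    """Derive the ticket priority that is then agreed with the user."""
    return PRIORITY_MATRIX[(impact.lower(), severity.lower())]

print(prioritize("High", "Medium"))  # P2 - High
```

Encoding the matrix in the service management tool keeps prioritization consistent across support groups instead of leaving it to each agent's judgment.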


Data Center Consolidation
October 20, 2010

Today’s global organizations are inherently complex. Nowhere is this fact more evident than in a company’s data center. The scene is often chaotic: data centers with hundreds (if not thousands) of servers and storage units, multiple databases and dozens of operating systems—all needing to work together seamlessly to satisfy 24/7 user demands and business process application requirements. Over the years, a proliferation of data centers has occurred largely due to mergers and acquisitions and the historical lack of bandwidth to support performance requirements.  Many organizations are now faced with a large number of decentralized data centers in numerous locations with a varying number of servers. Many have multiple data centers and server rooms with less than five servers in each.  All this results in unnecessary cost associated with administration, maintenance and depreciation. The value of data center consolidation has three primary components:
  • Companies can typically achieve a 20-40% reduction in server Total Cost of Ownership (TCO) by centralizing, consolidating, standardizing and hardening 70-80% of existing servers.  The number of servers can often be reduced by 50-70%.
  • Increased server reliability and availability due to the use of larger and more resilient servers, implementation of high capacity storage, fault tolerant NAS/SAN platforms, migration to standard server images and the improved ability of the support organization to focus on a centralized and substantially smaller server inventory
  • Reduced capital expenditures due to increased utilization of server (CPU and memory) and storage resources
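To make those percentages concrete, here is the arithmetic for a hypothetical estate (the server count and per-server TCO are made-up inputs; the 60% and 30% figures sit inside the ranges quoted above):

```python
servers = 1000
annual_tco_per_server = 8000          # hypothetical $/server/year (made up)

baseline_tco = servers * annual_tco_per_server   # $8,000,000 per year
servers_after = round(servers * (1 - 0.60))      # 50-70% fewer servers; take 60%
tco_after = round(baseline_tco * (1 - 0.30))     # 20-40% TCO saving; take 30%
savings = baseline_tco - tco_after

print(servers_after, tco_after, savings)
```

Even at the conservative end of the ranges, the recurring savings usually dwarf the one-time migration cost.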
In order to make the data center ready for the future, operations management will still retain focus on the core disciplines (System Performance Management, Event / Fault Management, Problem Management, Operations Level Management, Service Level Management, and Security Management). But virtualization of infrastructure requires the approach to IT infrastructure management to become predictive:
  • Extension of infrastructure to Internet
  • High Availability Planning
  • Predictive Capacity Planning / Modeling
  • Workload Management
  • Business pay-per-use


Electronic Medical Records
October 12, 2010

I have been reading about Google Health and have wondered about data quality and its significance for Electronic Medical Records in the medical/healthcare industry. This news from The Boston Globe raises some questions: a misleading informatics approach led a kidney cancer survivor to read on Google that his cancer had spread to either his brain or spine. He was not aware of this, and when he dug into it, the information turned out to be erroneous. There has been a lot of recent debate about this with a new government in place, but I guess it remains to be seen in which direction and how quickly we will begin to make progress. One recent article in Health Affairs addresses the key issues of patient health records vividly. The core problem in this industry is that the multitude of health care information systems are discrete and do not communicate with one another. In addition, vast bodies of medical knowledge and data do not exist in an electronic format usable by any decision support system. One of the ANSI standards for the healthcare arena, Health Level Seven (HL7), offers some promise. But the number of standards is mind-boggling, some of them as below:
  • NCPDP – Messaging Standard for Drug Ordering
  • IEEE1073 – Messaging Standard for Medical devices
  • DICOM – Messaging Standard for Imaging
  • LOINC – Vocabulary Standards for Laboratory Result Names
  • RxNORM/NDF-RT – Vocabulary Standards for Clinical Drug Description & Drug Classification
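To give a flavor of why these systems struggle to communicate, HL7 v2 messages are plain pipe-delimited text: segments (MSH, PID, OBX, ...) separated by carriage returns, fields separated by '|', with each segment type carrying its own field layout. A minimal illustrative parse (the message content is invented):

```python
# HL7 v2 is plain text: segments separated by carriage returns, fields by '|'.
# This sample message is invented for illustration.
hl7 = (
    "MSH|^~\\&|LAB|HOSP|EMR|CLINIC|200810121030||ORU^R01|MSG001|P|2.3\r"
    "PID|1||123456||DOE^JOHN\r"
    "OBX|1|NM|GLUCOSE||105|mg/dL"
)

# Group the split fields by segment type (a segment type can repeat).
segments = {}
for raw in hl7.split("\r"):
    fields = raw.split("|")
    segments.setdefault(fields[0], []).append(fields)

patient_name = segments["PID"][0][5]   # field 5 of the first PID segment
print(patient_name)                    # DOE^JOHN
```

Every receiving system has to agree on which field index means what, for every segment type, for every message version – which is exactly where the interoperability gaps open up.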
  Being such a Lean Service Operations person, I guess successful development of consumer-centric care models will require cross-industry collaboration to close key gaps across financing and delivery systems.


USB Rechargeable Batteries
September 9, 2010

Hope all of you had a safe & pleasant Labor Day weekend. It’s funny how, when you have little ones and are driving on a long weekend, you run out of batteries for all the toys you so consciously packed. One of the best ideas in gadgets! USBCELL is a product of Moixa Energy Ltd., a spin-out of Moixa Energy Holdings. With USBCELL they aim to enable consumers to recharge anywhere, without needing or carrying an external charger. A NiMH AA cell can be used in normal battery applications and recharged simply by plugging it into a USB port. Here’s the Amazon way to get these:


BioMetrics and the future of identity
August 31, 2010

In working with some solutions for IT security, I came across the fascinating world of biometrics – from machine-readable passports to digital certificates and now to the advanced use of biometrics. The basic definition of a biometric system is essentially a pattern recognition system that operates by acquiring biometric data from an individual and comparing this feature set against the individual’s reference template set in the database, e.g. laptops with fingerprint recognition. So it’s all about identity. Biometrics add a new way to confirm an individual’s identity:
  • “What he/she possesses” (e.g., an ID card)
  • “What he/she remembers” (e.g., a password)
  • “Who he/she is” – biometrics
  Biometrics technology maturity dimensions:
  • Accuracy: use multiple biometrics and a point system to increase accuracy, reduce fraud, and augment the eligible population (those who cannot enrol in one can still enrol in others)
  • Standards: numerous standards exist, in various stages of completion; for now it seems best to store the sample rather than (or along with) the template
  • Security: regular security principles, tools, and techniques apply, e.g. tamper-proof hardware components as in ATMs; some biometrics require human supervision to deter falsification
  • Scale: for lowering costs
  Biometrics technologies:
  • Face Recognition systems will be widely used in the future, but enhanced solutions (e.g. 3D architectures, multimodal systems, new algorithms, etc.) will have to be envisioned to improve their accuracy.
  • The accuracy of currently available Fingerprint Recognition systems is adequate for verification systems and small- to medium-scale identification systems involving a few hundred users. This is by far the most used biometric technology.
  • The stability and uniqueness of the human iris make Iris Recognition the most accurate biometric technology. Iridian’s iris recognition market leadership is well known. It is expected that newcomers will provide competing products, prices will fall, and the technology’s market share will expand.
  • Hand Recognition has been widely employed due to its high user acceptance, efficiency and simplicity.
  • Voice Recognition has generated overwhelming hope for the past decade. This technology, although one of the simplest, suffers from low accuracy, dependency on environmental conditions, and weak stability over time.
  • Signature Recognition is the simplest but least used biometric technology. This is mainly due to its low accuracy, which makes it suitable only for verification. Indeed, the intrinsic characteristics such as speed, velocity, pressure, and inclination, which could increase its accuracy, are rarely captured and stored during the enrolment phase. This technology has the unique advantage of being already broadly used and accepted in its traditional form, which is no more demanding from the user’s point of view.
  The emergence of industry consensus standards is the sign of a maturing market. Several industry groups are very active, such as the BioAPI Consortium and the International Civil Aviation Organization, but generally speaking, biometric standards development has progressed slowly. Source: ICAO, OECD, “Enhancing International Travel Security (EITS)”
  Technology     Security        Cost            Suitable applications
  Face           Medium-low      Medium          Watch-list scanning, assisting other biometrics
  Fingerprint    Medium-high     Medium-low      Verification, small- to medium-scale identification
  Iris           High            High            High-security areas, large-scale identification
  Voice          Medium-low      Low             Telephone authentication, low-security verification
  Signature      Medium-low      Medium          Applications with a traditional signature
  Hand           Medium-high     Medium          Access control, verification
  Multimodal     Extremely high  Extremely high  Potentially all; cost-related issues
  PIN/Password   Medium-low      Low             Default implementation
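Underneath all the technologies in the table, verification follows the same pattern described above: extract a feature set, score it against the enrolled reference template, and accept if the score clears a threshold. A toy sketch with invented feature vectors (real systems use far richer features and carefully tuned thresholds):

```python
import math

def similarity(a, b):
    """Cosine similarity between two feature vectors (1.0 = identical)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def verify(sample, template, threshold=0.95):
    """Accept the claimed identity if the match score clears the threshold."""
    return similarity(sample, template) >= threshold

enrolled = [0.11, 0.83, 0.42, 0.67]   # template stored at enrolment (made up)
probe_ok = [0.12, 0.81, 0.44, 0.66]   # same person, slightly different capture
probe_bad = [0.90, 0.10, 0.05, 0.70]  # impostor

print(verify(probe_ok, enrolled), verify(probe_bad, enrolled))
```

The threshold is the accuracy dial: lower it and more impostors get through (false accepts); raise it and more genuine users get rejected (false rejects).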


IT Capacity and Utility Computing
May 3, 2010

I was talking to an IT executive who was chatting about how difficult it was to raise awareness on the business side of what IT capacity he has and how much his organization can really take on. He mentioned that despite some IT demand management processes, it was tough to set expectations. I guess these days IT Operations faces a number of opposing priorities:
  • Increasing demands on IT staff to “fight fires” vs. provide value-add services
  • Addressing the increasing rate of change spawned by business needs
  • Improving IT’s business value
  • “Doing more with less”
  • Measuring end-to-end quality of service
  But the real essence comes down to 3 things:
  • Do things Effectively – business partner mindset, i.e. do only the right things
  • Do things Efficiently – reduce cost; Lean and Agile
  • Be Adaptable – change management for changing internal and external ecosystems
  Moving to service-oriented operations can be accomplished in three phases that address immediate needs and lay the underpinnings for managing a dynamic and virtual environment.

Phase I – Industrialize Processes & Standardize Tools: Address critical areas of operational instability. Implement formalized service management processes. Focus on establishing foundations for standardized tools for availability management and problem resolution, asset and configuration management.

Phase II – Business Alignment: Develop a service catalog to define IT services. Migrate processes and tools toward services orientation (business process vs. technical component). Tool focus is on integration and a shared operational data store to improve IT intelligence.

Phase III – Convergence with Utility Computing: Leverage automation of processes to reduce cost and improve efficiency. Focus is on provisioning and orchestration tools to automate dynamic recovery and availability management.




© 2012 Ashu Bhatia Powered by Brown Books Digital