Opportunity Pipeline Analysis & Data Warehouse
The client is a multi-billion-dollar company that provides software and services. During a period of declining revenue, the sales team asserted that competitive software was the cause. This assertion was based on "gut feel", with no supporting evidence.
Virtually no quantitative data was available to determine the true impact of the competing software. Little of the existing information had been analysed, so customer churn was poorly understood. Strengths, weaknesses, opportunities and threats were not being handled as constructively as they should have been, and only limited conversion data was available.
Trust in the information produced by our work was always going to be a challenge, for two reasons:
- The results had a high probability of shattering current paradigms, and when that happens people often blame the data
- Vast amounts of data would be linked and heavily transformed to produce important, digestible and useful information. With that much transformation, it is critical that management trust that the transformations are correct.
The solution was to build a data warehouse to analyse the array of data sources, and it was decided to get as much leverage out of the data as possible. This included:
- Impact of competing software on sales
- Reasons for opportunities converted to sales
- Percentage of opportunities converted to sales
- Time taken to convert opportunities of different types into sales
- Accuracy of projected revenue
- Effectiveness of different sales teams
- Conversion of telephone calls into opportunities
- The ratio of telephone time to $ revenue
- General understanding of customer churn
- Long term customer churn
- Software version acceptance
- and more ...
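Two of the measures listed above, the percentage of opportunities converted to sales and the time taken to convert, are simple to express. The sketch below uses invented sample data and function names; it is illustrative only, not the client's actual calculation.

```python
from datetime import date

# Invented sample opportunities; "won" is None for open or lost deals.
opps = [
    {"type": "renewal", "opened": date(2024, 1, 1), "won": date(2024, 1, 15)},
    {"type": "renewal", "opened": date(2024, 1, 5), "won": None},
    {"type": "new",     "opened": date(2024, 2, 1), "won": date(2024, 3, 2)},
]

def conversion_rate(rows):
    """Percentage of opportunities that converted to sales."""
    won = sum(1 for r in rows if r["won"] is not None)
    return 100.0 * won / len(rows)

def avg_days_to_convert(rows, opp_type):
    """Average days from opening to winning, for one opportunity type."""
    days = [(r["won"] - r["opened"]).days
            for r in rows if r["type"] == opp_type and r["won"] is not None]
    return sum(days) / len(days)
```

Segmenting these measures by opportunity type, as the second function does, is what allows the comparisons across sales teams and opportunity types described above.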
The solution involved building a data warehouse with analysis services cubes to slice and dice the information.
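The project used SQL Server Analysis Services cubes; the sketch below merely mimics "slice and dice" with a plain group-by aggregation over invented fact rows, to show the idea of summing a measure across any chosen set of dimensions.

```python
from collections import defaultdict

# Invented fact rows; a real cube would hold millions of these.
facts = [
    {"team": "EMEA", "month": "2024-01", "revenue": 100.0},
    {"team": "EMEA", "month": "2024-02", "revenue": 150.0},
    {"team": "APAC", "month": "2024-01", "revenue": 80.0},
]

def dice(rows, dims, measure="revenue"):
    """Aggregate a measure over any subset of dimensions."""
    totals = defaultdict(float)
    for r in rows:
        key = tuple(r[d] for d in dims)
        totals[key] += r[measure]
    return dict(totals)
```

Calling `dice(facts, ["team"])` rolls revenue up by team; adding `"month"` to the dimension list drills back down, which is the essence of slicing and dicing a cube.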
Several feeds brought raw data in:
- Teams (sales team leaders and members, which changed regularly)
- Exchange rates
Daily & Weekly Feeds: Because of the massive volume of data analysed, daily "delta" (i.e. incremental) feeds containing only new data and weekly feeds containing all data were implemented.
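The daily-delta / weekly-full pattern can be sketched as an upsert plus a periodic rebuild. The function names and row shapes below are invented for illustration, not the client's actual ETL code.

```python
def apply_delta(warehouse, delta_rows):
    """Daily feed: upsert only the new or changed rows, keyed by business key."""
    for row in delta_rows:
        warehouse[row["id"]] = row  # insert or overwrite
    return warehouse

def full_refresh(all_rows):
    """Weekly feed: rebuild from the full extract, so deletions are captured too."""
    return {row["id"]: row for row in all_rows}

warehouse = {}
warehouse = apply_delta(warehouse, [{"id": 1, "stage": "lead"}])
warehouse = apply_delta(warehouse, [{"id": 1, "stage": "won"},
                                    {"id": 2, "stage": "lead"}])
# The weekly full feed reconciles the warehouse with the source system;
# here opportunity 2 no longer exists in the source and is dropped.
warehouse = full_refresh([{"id": 1, "stage": "won"}])
```

The design trade-off is the usual one: small daily deltas keep load times short, while the weekly full feed corrects any drift (deletions, missed deltas) between the warehouse and its sources.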
Multiple Front Ends: A number of front ends were configured to query the Analysis Services Cubes and it was also possible to drill down to the most granular source data.
Ensuring Data Quality: The main data sources were not already linked, so a number of mapping tables had to be produced in order to achieve a high level of data quality.
In the end, all of the challenges of a complex project with a huge amount of data from a variety of sources were met, and the client was very pleased with the outcome. For example:
- The analysis that came out of the data warehouse indicated that the competitive software was a non-issue: it posed very little risk to revenue.
- Opportunity conversion information was produced that allowed managers to determine, with a surprising degree of accuracy:
  - What would be invoiced in future months
  - Customer churn
  - Version skipping
  - Opportunity forecast revenue accuracy (i.e. what was forecast vs what was eventually billed out)
- The results were a major surprise to the client. Once they understood and then believed the analysis, they were able to change the way opportunities were handled.
The customer was left with the ability to see clearly the reasons for the declining revenues. This was made possible by a comprehensive data warehouse providing current business and market intelligence, which the customer went on to use for business decisions on an ongoing basis.