In this case study, our team demonstrates how questioning the effectiveness of a simple feedback form on a website led to a change in business processes.
Have you ever experienced a situation where the number of online applications differed significantly between the marketer’s report and the sales manager’s report? Most likely, yes.
This problem can arise for various reasons, from different calculation methods to simple miscounts. Whatever the reason, it’s essential that it doesn’t cost your business real leads.
In this case study, we’ll tell you how we prevented the loss of 20% of leads merely by answering the question ‘Why doesn’t the data from different reports match?’.
The client: telecommunications (new internet provider).
Time of cooperation: 5 months
After the client replaced the service order form on their website, the discrepancy between the number of achieved goals in Google Analytics and the number of actual leads in the client’s CRM increased sharply. We needed to find out what caused this problem.
Initially, the client’s website contained a multi-step order form. Because the number of incoming calls was significantly higher than the number of completed applications on the website, we suspected that users didn’t have the patience to fill in all the fields in this form.
We hypothesized that shortening the existing form would increase the number of submitted forms by 40% and decrease the number of calls by 10%.
Then we decided to check how this would work in practice.
To do that, we set up tracking of users’ transitions from one step to the next as they filled in the form. It turned out that most users abandoned the form on the second of the four steps.
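The funnel analysis described above can be sketched in a few lines. Given per-step counts of users who reached each step of the four-step form, we compute the share of users lost at every transition; the numbers below are illustrative, not the client’s real data.

```python
# A minimal sketch of the funnel analysis: given counts of users who
# reached each of the 4 form steps, compute the fraction lost at every
# step-to-step transition.

def step_dropoff(step_counts):
    """Return the fraction of users lost between consecutive steps."""
    return [
        round(1 - nxt / cur, 2)
        for cur, nxt in zip(step_counts, step_counts[1:])
    ]

# Illustrative numbers only -- not the client's real data.
reached_step = [1000, 820, 310, 270]   # users who reached steps 1..4
print(step_dropoff(reached_step))      # -> [0.18, 0.62, 0.13]
```

With numbers like these, the largest loss is between steps 2 and 3, matching what the tracking showed: users quit on the second step.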
Based on this, we decided to shorten the form. The new form contained only 4 fields: phone number (with an input mask), full name, address, and comments. As a result of this update, the number of orders on the website increased by 45%.
Everything went well until we compared the number of applications in the CRM with the number in Google Analytics and saw that they diverged by more than 20% (while the acceptable discrepancy is up to 5%).
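This kind of cross-source check is easy to automate. The sketch below compares lead counts from two sources and flags the gap when it exceeds the 5% threshold; the figures are illustrative, not the client’s actual numbers.

```python
# A quick sanity check: compare lead counts from two sources and flag
# the result when the discrepancy exceeds the 5% tolerance.

def discrepancy(ga_goals, crm_leads):
    """Relative difference between the GA goal count and CRM lead count."""
    return abs(ga_goals - crm_leads) / max(ga_goals, crm_leads)

# Illustrative figures only.
ga_goals, crm_leads = 500, 395
gap = discrepancy(ga_goals, crm_leads)
print(f"discrepancy: {gap:.0%}")  # -> discrepancy: 21%
print("needs investigation" if gap > 0.05 else "within tolerance")
```

Running a check like this on a schedule turns a silent data-quality problem into an alert you see the day it appears.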
Once we saw that chart, we started looking for the cause of this divergence.
First, we checked whether the Google Analytics event and goal responsible for form submission were set up correctly. The goal worked correctly.
Second, we decided to check the data-sending process. We had to answer the following question: which applications weren’t making it into the CRM?
To do that, we created a separate spreadsheet that received every application via the same action that sent the data to Google Analytics and the call center’s CRM.
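The essence of this “parallel logging” setup is a single submit handler that fans the application out to all destinations independently, so the spreadsheet receives every record even when the CRM call fails. The sketch below uses stand-in destination functions; the real integrations talked to the CRM and Google Analytics APIs.

```python
# A sketch of parallel logging: one handler sends each application to
# every destination and collects the failures instead of stopping at
# the first error. Destination senders here are hypothetical stand-ins.

def handle_submission(application, destinations):
    """Send the application to every destination; return (name, error) pairs."""
    failures = []
    for name, send in destinations:
        try:
            send(application)
        except Exception as exc:
            failures.append((name, exc))
    return failures  # an empty list means every destination got the record
```

Because each destination is attempted regardless of the others, comparing the spreadsheet against the CRM later reveals exactly which submissions were dropped and by which system.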
As a result, we received the following data sets:
We compared the data from the different sources and saw that our ‘parallel’ spreadsheet contained more entries than the others.
By comparing the data from this spreadsheet with the CRM data, we obtained a list of contacts that hadn’t been added to the CRM.
We analyzed the contacts that weren’t sent and discovered that the data didn’t reach the CRM in the following situations:
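Finding the missing contacts boils down to a set difference between the parallel spreadsheet and the CRM export, keyed by any unique field such as the phone number. The contacts below are illustrative placeholders.

```python
# Contacts present in the parallel spreadsheet but absent from the CRM,
# keyed by phone number. Values are illustrative placeholders.
sheet_contacts = {"555-0101", "555-0102", "555-0103", "555-0104"}
crm_contacts = {"555-0101", "555-0103"}

missing_from_crm = sorted(sheet_contacts - crm_contacts)
print(missing_from_crm)  # -> ['555-0102', '555-0104']
```

Each contact on that list is a submission the call center never saw, which is exactly the group we analyzed next.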
We decided to:
This update enabled us to prevent the loss of 20% of user applications.
Identifying and solving the problem proved to us that:
Keep in mind that it’s essential not only to assume the cause of a data discrepancy but also to prove it. A hypothesis that has been neither confirmed nor refuted is dangerous: it muddies your understanding and could lead to financial losses down the road.