Case Study: How Analytics Helped The Client Get 20% More Physical Leads

Helen, Head of the Digital Analytics Department

In this case study, our team wants to show how questioning the effectiveness of a simple feedback form on a website led to a change in business processes.


Have you ever experienced a situation where the number of online applications differed significantly between the marketer’s report and the sales manager’s report? Most likely, yes.

This problem can have various causes, from different calculation methods to plain miscounting. Whatever the reason, it’s essential that it doesn’t cost your business real leads.

In this case study, we’ll tell you how we prevented the loss of 20% of leads simply by answering the question ‘Why doesn’t the data from different reports match?’

The Entry Data

The client: telecommunications (a new internet provider).

Time of cooperation: 5 months

Our Task

After the client replaced the service order form on their website, the discrepancy between the number of completed goals in Google Analytics and the number of actual leads in the client’s CRM increased sharply. We had to find out what was causing this problem.

The Result

  • The cause of the lead loss was identified and eliminated.
  • The number of sales and satisfied customers increased by 20%.

The Problem

Initially, the client’s website contained a multi-step order form. Because the number of incoming calls was significantly larger than the number of completed applications on the website, we suspected that users didn’t have enough patience to fill in all the fields of this form.

Helen, Head of the Digital Analytics Department:

We assumed that if we shortened the existing form, the number of submitted forms would increase by 40%, while the number of calls would decrease by 10%.

Then we decided to check how this could work in practice.

To do that, we set up tracking of users’ transitions from one step to the next while they were filling in the form. It turned out that most users abandoned the form at the second of the four steps.
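Below is a minimal sketch of how such step-to-step tracking can be set up with gtag.js. The event name, parameters, and data attributes are illustrative assumptions rather than the exact configuration used in this project.

```typescript
// Minimal sketch: report a Google Analytics event every time the user moves
// from one step of the order form to the next. Names are illustrative.

declare function gtag(
  command: 'event',
  eventName: string,
  params?: Record<string, unknown>
): void;

let currentStep = 1;

function trackStepTransition(nextStep: number): void {
  gtag('event', 'order_form_step', {
    event_category: 'Order form',
    event_label: `Step ${currentStep} -> Step ${nextStep}`,
    value: nextStep,
  });
  currentStep = nextStep;
}

// Assumes each "Next" button carries a data-next-step attribute, e.g.
// <button data-next-step="2">Next</button>
document.querySelectorAll<HTMLButtonElement>('[data-next-step]').forEach((button) => {
  button.addEventListener('click', () => {
    trackStepTransition(Number(button.dataset.nextStep));
  });
});
```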

A chart displaying the results of the analysis of order form effectiveness

Based on this, we decided to shorten the form. The new form contained only 4 fields: phone number (entered with an input mask), full name, address, and comments. As a result of this update, the number of orders on the website increased by 45%.

However…

Everything went well until we compared the number of applications in the CRM with the number in Google Analytics and saw that they diverged by more than 20% (while the permissible discrepancy is up to 5%).
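As a simple illustration of this check, the gap between the two systems can be quantified and compared against the 5% tolerance. The counts below are placeholders, not the project’s actual numbers.

```typescript
// Sketch: quantify the divergence between GA goal completions and CRM leads.

function discrepancyPercent(gaGoals: number, crmLeads: number): number {
  return (Math.abs(gaGoals - crmLeads) / gaGoals) * 100;
}

const gaGoals = 1000; // form submissions recorded by Google Analytics (placeholder)
const crmLeads = 780; // applications that actually reached the CRM (placeholder)

const gap = discrepancyPercent(gaGoals, crmLeads);
console.log(`Discrepancy: ${gap.toFixed(1)}%`); // "Discrepancy: 22.0%"
console.log(gap > 5 ? 'Investigate data loss' : 'Within tolerance');
```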

A chart: the number of leads sent from the website to Google Analytics and CRM

After we saw that chart, we started looking for what caused this divergence.

The Solution


First, we checked whether the Google Analytics event and the goal responsible for form submission were set up correctly. They were: the goal fired as expected.

Second, we decided to check the data transfer process. We had to answer the following question: which applications don’t make it into the CRM?

To do that, we created a separate spreadsheet that stored the submitted applications, written by the same action that sent the data to Google Analytics and the call center’s CRM.
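A rough sketch of this ‘parallel’ capture is shown below, assuming a hypothetical CRM endpoint and a backup spreadsheet endpoint; the URLs, field names, and event name are illustrative only.

```typescript
// Sketch: the same submit handler reports the lead to Google Analytics,
// sends it to the CRM, and writes a copy to a temporary spreadsheet backend
// so that the three data sets can be compared later. Endpoints are hypothetical.

interface OrderFormData {
  phone: string;
  fullName: string;
  address: string;
  comment: string;
}

declare function gtag(
  command: 'event',
  eventName: string,
  params?: Record<string, unknown>
): void;

async function submitOrder(data: OrderFormData): Promise<void> {
  // 1. Register the goal in Google Analytics.
  gtag('event', 'order_form_submit', { event_category: 'Order form' });

  // 2. Send the lead to the call center's CRM (hypothetical endpoint).
  const crmRequest = fetch('/api/crm/leads', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(data),
  });

  // 3. Write the same payload to the temporary spreadsheet backend.
  const backupRequest = fetch('/api/backup-sheet/rows', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ ...data, submittedAt: new Date().toISOString() }),
  });

  await Promise.allSettled([crmRequest, backupRequest]);
}
```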

As a result, we received the following data sets:

  • anonymized data in Google Analytics;
  • the data in the temporary spreadsheet;
  • the data in the CRM.

A diagram: the scheme of data transfer on form submission

The Result

We compared the data from the different sources and saw that our ‘parallel’ spreadsheet contained more records.

After comparing the data from this spreadsheet with the CRM data, we got a list of contacts that had never been added to the CRM.
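The comparison itself can be done as a simple set difference keyed by a normalized phone number, along the lines of the sketch below; the record shape and the normalization rule are simplified assumptions.

```typescript
// Sketch: find contacts that are present in the backup spreadsheet but
// missing from the CRM, matching records by phone number.

interface LeadRecord {
  phone: string;
  fullName: string;
}

// Strip formatting and keep the last 9 digits, so that "+380 67 123 45 67"
// and "067 123 45 67" compare as equal (assumes Ukrainian mobile numbers).
function normalizePhone(phone: string): string {
  return phone.replace(/\D/g, '').slice(-9);
}

function findMissingInCrm(spreadsheet: LeadRecord[], crm: LeadRecord[]): LeadRecord[] {
  const crmPhones = new Set(crm.map((lead) => normalizePhone(lead.phone)));
  return spreadsheet.filter((lead) => !crmPhones.has(normalizePhone(lead.phone)));
}

// Placeholder example:
const missing = findMissingInCrm(
  [{ phone: '+38 (067) 123-45-67', fullName: 'Test User' }],
  [{ phone: '0501112233', fullName: 'Another User' }],
);
console.log(missing); // contacts that never reached the CRM
```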

We analyzed the contacts that hadn’t been sent and discovered that the data didn’t make it to the CRM in the following situations:

  • if a user started entering their phone number with +380;
  • if a user’s number had specific mobile operator codes: for instance, numbers starting with 067;
  • we also found that most of the data loss happened on devices with a screen resolution of less than 360x640 (nearly 45% of all users had such devices). It turned out that the notification about an incorrectly filled field was displayed above the form after the user clicked the ‘Confirm’ button, and users with small screens simply didn’t notice it.

A screenshot: the alert before the form change

We decided to:

  • refine the phone number entry validation;
  • move the incorrect entry notification to the corresponding field;

A screenshot: the alert after the form change

  • make the ‘Confirm’ button inactive until all the data is entered correctly (see the sketch below).
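A simplified sketch of these three fixes is shown below; the element IDs, the error message, and the exact validation rule are illustrative assumptions.

```typescript
// Sketch of the fixes: stricter phone validation that accepts both the
// "+380XXXXXXXXX" and "0XXXXXXXXX" formats, an inline error message next to
// the field, and a Confirm button that stays disabled until the data is valid.
// Element IDs are illustrative.

const phoneInput = document.querySelector<HTMLInputElement>('#phone')!;
const phoneError = document.querySelector<HTMLElement>('#phone-error')!;
const confirmButton = document.querySelector<HTMLButtonElement>('#confirm')!;

function isValidPhone(raw: string): boolean {
  const digits = raw.replace(/\D/g, '');
  return /^380\d{9}$/.test(digits) || /^0\d{9}$/.test(digits);
}

function validateForm(): void {
  const phoneOk = isValidPhone(phoneInput.value);

  // Show the error right next to the field instead of above the form,
  // so it stays visible on small screens.
  phoneError.textContent = phoneOk ? '' : 'Please enter a valid phone number';

  // Keep the Confirm button inactive until the data is entered correctly.
  confirmButton.disabled = !phoneOk;
}

phoneInput.addEventListener('input', validateForm);
validateForm();
```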

This update enabled us to prevent the loss of 20% of user applications.

A chart: the number of leads in the different data collection systems after the update

The Conclusion

Identifying and solving the problem proved to us that:

  • the data collection logic always has to be transparent;
  • the requirements for data quality have to be defined;
  • there has to be a process for detecting data losses (for instance, comparing the data at different stages);
  • if the data collection logic or some inconsistencies bother you, don’t stop until you find what causes them.

Keep in mind that it’s essential not only to guess at the cause of data discrepancies but also to prove it. A hypothesis that hasn’t been confirmed or refuted is dangerous because it misleads you and can lead to financial losses in the future.