Even more user research and reliable prioritization models to run growth experiments [CXL Review Week 4]


If you can’t describe what you’re doing as a process, you don’t know what you’re doing.

W. Edwards Deming

It seems that if you want to run growth experiments, user research is THE path. To write last week’s post, I dedicated extra time to the chapter called User-Centric Marketing, thinking that I would learn everything I needed and could then dive into growth experiments.

Little did I know that this week I would be learning about conversion research.

The good part is that now I understand why growth hackers get a bad reputation. When you use listicles to “generate experiment ideas”, the chances of getting relevant results are minimal. This is also why part of the optimization program is dedicated to getting deeper customer insights, a phase that can take up to three months.

Before jumping into the main topic, how to run growth experiments, it is time for this review series to say something about the learning experience at CXL Institute.

Let me be clear. The content is great. No doubt about it. It is so good that I am considering writing an article about how I take notes. There is so much actionable information that I don’t want to miss any of it, and going back over the notes is a path to mastering the fundamentals and the process.

Yet the course management system is not the best I have seen. Before reviewing it, I wanted to understand the context. And Lady Luck visited me in the form of a comment from Peep Laja on this post:

In a reply to the question "What is your 5%?", he said: "For CXL Institute it's the network and content we produce. Not in the LMS business."

Now it makes sense, and I don’t feel like critiquing the fact that I couldn’t find a shortcut to the main table of contents, or the other shortcomings of the platform itself.

How to run growth experiments

The chapter on how to run experiments begins with a workshop by Peep Laja. It is a high-level overview of how to generate ideas, prioritize them, and decide when a test is winning. On the topic of A/B testing, I will write an article next week because the amount of information is over 9000.

The first lesson of growth experiments is simple: don’t trust listicles. Those articles are written by SEO people to get traffic, and it is good practice not to build your business on them.

But where do you start?

  • best practices? Yes, a website should follow them when you build it, but following best practices is not optimization
  • design trends – not necessarily something to follow in general
  • market leaders or competition – just copy Amazon? Not a great path, and benchmarking is also not optimization

Start your own optimization process following these steps:

  1. where are the problems
  2. what are the problems
  3. why is this a problem
  4. turn these issues into test hypotheses
  5. prioritize tests and instant fixes
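
To make steps 4 and 5 concrete, here is a minimal sketch of how the output of this process could be captured before prioritization. The structure and field names are my own illustration, not something the course prescribes.

```python
from dataclasses import dataclass

@dataclass
class TestHypothesis:
    where: str        # page or funnel step where the problem shows up
    what: str         # the observed problem
    why: str          # evidence explaining why it is a problem
    hypothesis: str   # "Because we saw X, changing Y will produce Z"
    instant_fix: bool = False  # obvious bugs get fixed right away, not tested
    priority: float = 0.0      # filled in later by whichever scoring model you pick

# Illustrative example, not real data.
shipping_costs = TestHypothesis(
    where="checkout, step 2",
    what="visitors abandon after seeing shipping costs",
    why="form analytics show a sharp drop-off on that step",
    hypothesis="Because shipping costs surprise visitors, showing them on the product page will reduce abandonment",
)
```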

Discovering the problems is problem-solving 101. The foundation is research with better data that answers business questions. For this you need a process, and the one Peep Laja presents is the ResearchXL framework.

This helps you figure out what you should test and how to test the ideas. Here is, broadly, what the framework steps are:

  1. technical analysis
  2. heuristic analysis – every page should have only one desired action; assess each page for:
    • relevancy
    • clarity
    • motivation
    • friction
  3. digital analytics
    • where are the leaks
    • which segments
    • what are users doing
    • which actions correlate with higher conversion (a minimal sketch of this check follows the list)
  4. mouse tracking and form analytics
  5. qualitative survey
    1. buyer groups
    2. which problem are they solving
    3. how are they deciding
    4. what’s holding them back
    5. what else do they want to know
  6. user testing – ignore what people say, because what they do and what they say are two different worlds, especially if you pay them for the test. Try to give them three types of tasks: a specific one – find a product for ecommerce, or a specific piece of information for SaaS or lead gen; a broad one – find a pair of jeans under $50, size 34, in black; and finally, have them go through the funnel while you ask questions:
    1. what’s difficult to understand
    2. what’s difficult to do
    3. what goes wrong
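
For the digital analytics step, “which actions correlate with higher conversion” is a question you can answer with a quick query once you have a user-level export. A minimal sketch, assuming one row per visitor with a conversion flag and flags for the actions you care about; the column names and numbers are illustrative.

```python
import pandas as pd

# Illustrative user-level export: one row per visitor.
visitors = pd.DataFrame({
    "segment":       ["new", "new", "returning", "returning", "new", "returning"],
    "used_search":   [1, 0, 1, 1, 0, 0],
    "watched_video": [0, 0, 1, 0, 1, 1],
    "converted":     [0, 0, 1, 1, 0, 1],
})

# Where are the leaks? Conversion rate by segment.
print(visitors.groupby("segment")["converted"].mean())

# Which actions correlate with higher conversion?
# (Correlation only: these are candidates for experiments, not conclusions.)
for action in ["used_search", "watched_video"]:
    print(visitors.groupby(action)["converted"].mean().rename(action))
```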

After working on discovering the problems, we should now have our own list of issues to go and fix, not somebody else’s listicle.

While writing down these notes, I realized that most of this information is also available as blog posts on the CXL blog. But moving on to the conversion research chapter, the content blew me away again.

I got actionable insights into how to approach each step. So it’s not only high-level, but also action-oriented content.

How to run growth experiments by prioritizing the right idea

Choosing models can be difficult. And having experience implementing new processes in various teams, I feel that the model should be something the team picks, not a copy of whatever is popular.

Let me say it: sometimes stakeholders don’t care about your models and they might be right. Here is a quote from a famous British statistician:

Essentially, all models are wrong… but some are useful.

George E.P. Box

So I will leave you with a list of models and you should pick the one that is useful for you:

For a deep dive on this topic, read this article.
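
As an illustration of how such a model works in practice, one widely used option is ICE scoring: rate each idea from 1 to 10 on Impact, Confidence, and Ease, then rank by the average. Below is a minimal sketch with made-up ideas and scores; it may or may not be the model your team ends up picking.

```python
# ICE scoring: rate each idea 1-10 on Impact, Confidence, and Ease,
# then rank by the average. All ideas and scores below are illustrative.
ideas = {
    "Show shipping costs earlier":  {"impact": 8, "confidence": 6, "ease": 7},
    "Rewrite the hero headline":    {"impact": 5, "confidence": 4, "ease": 9},
    "Add trust badges to checkout": {"impact": 6, "confidence": 5, "ease": 8},
}

ranked = sorted(ideas.items(), key=lambda kv: sum(kv[1].values()) / 3, reverse=True)

for name, scores in ranked:
    print(f"{sum(scores.values()) / 3:.1f}  {name}")
```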

Mistakes while you run growth experiments

  1. Precious time is wasted on stupid tests (generate ideas that are relevant for your business)
  2. Thinking that you know what will work
  3. Copying others’ experiments (the competition doesn’t know any better, and benchmarks are just that, a point of reference)
  4. Having a sample size that is too small
  5. Not running growth experiments long enough (always test full weeks and keep the test live for at least 7, 14, or 21 days)
  6. Not sending the data from the experiment tool properly to the analytics tool
  7. Giving up if the hypothesis fails
  8. Ignoring validity threats: the history effect, instrumentation effect, selection effect
  9. Ignoring small gains (a 5% monthly increase in conversion rate compounds to roughly an 80% uplift over the year; see the quick calculation after this list)
  10. Not running tests all the time.
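
Two of these mistakes are easy to put numbers on. The sketch below, assuming scipy is installed, shows a back-of-the-envelope sample size per variant for a two-proportion test (why “too small” samples are a problem) and the compounding behind the “5% monthly becomes roughly 80% yearly” claim. The baseline and lift passed in are illustrative.

```python
from math import ceil, sqrt
from scipy.stats import norm

def sample_size_per_variant(baseline_cr, relative_lift, alpha=0.05, power=0.8):
    """Approximate visitors needed per variant for a two-proportion z-test."""
    p1 = baseline_cr
    p2 = baseline_cr * (1 + relative_lift)
    p_bar = (p1 + p2) / 2
    z_alpha = norm.ppf(1 - alpha / 2)  # two-sided significance level
    z_power = norm.ppf(power)
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_power * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p1 - p2) ** 2)

def yearly_uplift(monthly_gain=0.05, months=12):
    """Compound a steady monthly conversion-rate gain over a year."""
    return (1 + monthly_gain) ** months - 1

# Detecting a 10% relative lift on a 3% baseline takes tens of thousands of visitors per variant.
print(sample_size_per_variant(baseline_cr=0.03, relative_lift=0.10))
# A steady 5% monthly gain compounds to roughly 80% over twelve months.
print(f"{yearly_uplift():.0%}")
```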

Resources:

The Fogg behavior model is mentioned in the workshop video. I wanted to include it as a resource because it explains how and why a customer takes action. It is never something that you control, but it is something that you can guide.

This article is the fourth in a series of 12 reviews written while studying the Growth Marketing Minidegree at CXL Institute. Follow the minidegree tag for the entire series.

Stay In Touch.

  • contact [at] dascalescu.com

CD.
