Most marketers rely on what worked for them in the past, or on what they read in a blog post. They go for big, immediate, hacky wins.
They either get lucky or get it wrong.
The best growth strategies are made up of lots of lead bullets. One-off successes are not enough to sustainably grow a company.
A big idea isn’t enough to build a big business.
Execution is the multiplier that turns your idea into something valuable in reality. Not just getting lucky, but consistently making your own luck, over and over again.
But it’s impossible to do this if you don’t have a system.
A way of deciding on the right combination of marketing tactics, data analysis, and growth strategy. A set of rules for deciding how and when to make changes as you learn more about your business and the market within which it operates. A series of checks and balances to make sure you’re on track to achieve your goals.
A process that helps you…
- Focus on the biggest opportunities
- Know how big the opportunity is
- Help everyone know what to do
- Know when a change is needed
- Find success again and again
A process that makes your company’s growth scalable, predictable, and repeatable.
Having a process is the key to getting better.
As a growth marketing agency that has worked with over 100 top startups, we couldn’t do our jobs without a smart process.
This is that process:
Monday — Analysis
Tuesday — Strategy
Wednesday — Approval
Thursday — Execution
Friday — Learning
Let’s dive in:
At Ladder we’ve built our own platform to help us manage this growth process, which we’ll refer to with screenshots and examples throughout this guide.
You can use Google Docs (like we did before we built this software), but if you’re running into problems (like we did) and want a scalable solution, try our platform with a 14-day free trial:
We start by looking backwards, asking two major questions every week:
- What happened last week?
- What has been happening this past month / quarter / year?
Once we understand the trends we look into what’s driving them.
- What did we change?
- What changed around us?
- What tactics are working (or not)?
- Which segments of the data show interesting or unexpected results?
- What funnel stages are people dropping off at?
- How is each marketing channel performing?
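The funnel drop-off question above can be made concrete with a quick calculation. This is a minimal sketch; the stage names and counts are invented for illustration:

```python
# Hypothetical sketch: find the funnel stage with the biggest drop-off.
# Stage names and visitor counts are invented for illustration.

funnel = [
    ("visit",    10_000),
    ("signup",    1_200),
    ("activate",    600),
    ("purchase",     90),
]

drop_offs = []
for (stage, count), (next_stage, next_count) in zip(funnel, funnel[1:]):
    rate = 1 - next_count / count  # share of users lost between stages
    drop_offs.append((f"{stage} -> {next_stage}", rate))

# The transition losing the largest share of users is the one to investigate.
worst = max(drop_offs, key=lambda d: d[1])
print(worst)
```

Ranking transitions this way points the week's analysis at the leakiest part of the funnel rather than the stage with the smallest absolute numbers.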
We also audit the settings of campaigns we haven’t checked in a while to make sure we’re following best practice.
Repeated exposure to performance data helps build our mental model of the businesses we work with; our understanding of what levers we can pull to drive growth. Once we’re satisfied, we curate the most useful and actionable insights and fill in a weekly report template to send to our client.
Output: Weekly report.
An example of a weekly funnel report in our platform.
Now we look forwards.
- What goals are we trying to hit?
- How far away are we from hitting them?
- What will it take?
The goals don’t change often, but how far we are from target informs which tactics we need to choose to get things back on track.
We also look at what we learned from the previous day’s analysis; what assumptions have changed and how should that change our recommendations of what to do next?
After thinking through our goals and looking at the data, we usually have lots of ideas buzzing around in our heads. We then actively brainstorm, sometimes with other team members, to add to these ideas. We make sure we add all these to the ‘Ideas’ section in our platform, an area that the client can also contribute to.
Based on the expected impact vs. cost of implementation, we choose the top 3 tactics we want to bet on. We make mockups for any creative assets needed, and fill in more detail on how each test will be implemented. These tactics, once approved, are destined to be put live in a week’s time.
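The impact-vs-cost trade-off above can be sketched as a simple scoring function. This is a hypothetical illustration, not Ladder’s actual model; the tactic names and scores are made up:

```python
# Hypothetical sketch: rank candidate tactics by expected impact vs. cost
# of implementation. All names and numbers are invented for illustration.

def score(tactic):
    """Simple expected-value score: impact discounted by cost."""
    return tactic["impact"] / tactic["cost"]

ideas = [
    {"name": "Creative refresh on Facebook", "impact": 8, "cost": 2},
    {"name": "New landing page test",        "impact": 9, "cost": 5},
    {"name": "SEO content cluster",          "impact": 6, "cost": 8},
    {"name": "Email win-back sequence",      "impact": 5, "cost": 1},
]

# Pick the top 3 bets, highest score first.
top_three = sorted(ideas, key=score, reverse=True)[:3]
for t in top_three:
    print(t["name"], round(score(t), 2))
```

Even a crude ratio like this forces an explicit comparison, so a cheap, decent idea isn't crowded out by an expensive, flashy one.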
Output: Tactic Recommendations.
An example of tactic recommendations in our platform.
Although we’re pretty confident in our choices, now is the time to sense-check that we haven’t made any silly mistakes.
The first hurdle we have to overcome is an internal meeting called ‘stress-testing’. This involves pitching our three test ideas to the rest of the team, including at least one of the Ladder senior leadership team. As each tactic is pitched, the team pokes holes in the strategy, grills each other on the numbers, and offers their own tactic recommendations to replace those that didn’t pass.
Once we’re really confident that our recommendations are the right ones given what we know, we pitch them to the client. We usually try to arrange our weekly calls on Wednesdays so we can walk the client through our recommendations on the phone. However, they have access to our platform so they can follow along or look in more detail later.
Our clients will always know more about their business than we can, and they get a veto on each tactic (though we do try to change their minds if our logic is sound). If the client vetoes a tactic, we go back to the drawing board and decide on a replacement before the tactics are due to go live next week.
Output: Client Approval.
An example of a projected test in our platform.
Now we take all the hard work we did on strategy and make it work in practice.
Remember that we’re putting live the tactics we decided on last week, which passed ‘stress-testing’ and were approved by the client, so there aren’t usually any surprises. However, the platforms we use to execute those tactics are constantly changing, and sometimes what we wanted to do isn’t possible anymore. Sometimes we notice something new that would work even better.
In these cases we might need to change what we planned, but these are usually small changes. If there’s a major issue, we might put the tactic in our ‘Icebox’ for future consideration.
Before putting anything live, we do a final run-through to check that we didn’t make any mistakes. We dig deep into a few aspects of the campaign to see if everything looks as expected. We get a colleague to lend us an extra pair of eyes and make sure we’re not missing anything. Then, when we’re satisfied, we ship it and let the client know it’s live.
Finally, at the end of execution day, we double check that everything is working as expected.
Output: Executed Tactics.
An example of three executed tests being tracked in our platform.
Everything we needed to do for our clients is done now. But that doesn’t mean we’re finished. Just like pro athletes, we need to set aside some time for training in order to stay at the top of our game.
Friday is when we exercise our minds: reading about the latest tactics, exploring platforms we’re unsure how to use, and working towards industry-recognized certificates. We also hold a formal training session we call ‘Ladder Learning,’ where one member of the team presents on a topic they’ve mastered that others could benefit from.
Considering our workload and busy schedules, setting this time aside is tough but essential. When we’re too busy and skip this time investment, the quality and creativity of our tactic recommendations noticeably suffer for weeks afterwards.
Friday is also when we catch up on all the administrative things we need to do to keep the business running. Each employee has a 1-2-1 with their manager where they discuss how the week went and what can be improved for next week. We have a company all-hands meeting, where Ladder’s leadership update the whole team on big wins, any issues we’ve spotted, and how our product is developing.
For a little fun and to incentivize creativity, we also hold our ‘Test of the Week’ competition; where whoever pitched the best test that week wins a $25 Amazon gift card and a mini trophy to display on their desk.
Output: Professional Development.
Our training curriculum, available at FreeGrowthDegree.com.
While our main focus is finding new tactics to test, we’ll have a number of tests running at a time; some SEO or social media tests can take 3 months to conclude. Our clients also typically have other marketing campaigns running; things that we (or they) tested in the past that we keep running. For our larger clients, these ‘business as usual’ campaigns can make up a sizable amount of budget and can’t be ignored just because they’re not currently being tested.
To make sure our existing campaigns get enough love and attention, we set aside some time at the beginning of each day to do our ‘daily checks’.
Assuming nothing is amiss and things are performing as expected, we’re done on the optimization front. However, sometimes we need to adjust bids, budgets, or settings to get campaigns back on track.
Doing this every day, we start to notice patterns in performance; if a particular campaign or channel is degrading, we might then make it the focus of one or more of our tests in that week’s strategy session. For example, running a creative test is a surefire way to get an old campaign purring again.
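A daily check like this can be sketched as a simple threshold rule. This is a hypothetical illustration; the campaign names, targets, and thresholds are all invented:

```python
# Hypothetical sketch of a daily check: flag campaigns whose cost per
# acquisition (CPA) has drifted too far above target. All numbers invented.

TARGET_CPA = 50.0      # what we expect to pay per conversion
ALERT_RATIO = 1.25     # flag anything 25%+ over target

campaigns = {
    "search_brand":    {"spend": 400.0, "conversions": 10},  # CPA 40
    "social_prospect": {"spend": 900.0, "conversions": 12},  # CPA 75
}

def needs_attention(stats):
    """True if this campaign's CPA exceeds the alert threshold."""
    cpa = stats["spend"] / stats["conversions"]
    return cpa > TARGET_CPA * ALERT_RATIO

flagged = [name for name, stats in campaigns.items() if needs_attention(stats)]
print(flagged)
```

Anything flagged gets a manual look (bids, budgets, settings); anything not flagged is left alone so the daily check stays fast.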
Output: Performance uplift.
When Things Go Wrong
Of course, it’s not always possible to follow this process. Delays happen, plans change, issues arise. We hold ourselves to this process wherever possible, but not at the expense of common sense.
If a test isn’t ready, we won’t just put it live because it’s Thursday. If we haven’t done enough analysis, we don’t just recommend any old tactic just for the sake of it.
This process is designed to tolerate exceptions. That’s why we do the hard stuff (analysis & strategy) up front and leave wiggle room at the end of the week in case execution takes longer than expected. Often these are one-off slip-ups. However, if we’re repeatedly failing to implement part of the process, it’s our best indication that there’s a problem that we need to investigate further.
Some examples of this:
- If we’re constantly delaying tests because the client hasn’t approved the creative, maybe the brand guidelines aren’t clear?
- If we aren’t coming up with tactics on time, maybe we’re struggling with analysis paralysis?
- If we’re having a hard time tracking and reporting on performance, maybe we don’t have the right analytics tools installed and properly set up?
It usually comes down to training: if the process keeps falling down for more than one week in a row, that’s our best indication we need to spend extra time supporting that strategist until they can manage it themselves. How well a strategist stays on top of this process is the main way we determine resourcing and promotions, and how we predict client churn.
That’s all folks! There are of course other processes that work, some of which may work better for your specific business case. What you read was the process that we find works extremely well for us. Fully adopting this process was a real turning point for Ladder.
The month we adopted this process, our client churn went down 25% and our margin jumped up to 30% (before, we were actually losing money!). Yes, we actually made more money AND our clients were happier about it. Win, win.
If you like our process… steal it!
It’s only fair that we put our version out there for others to benefit from.
Feel free to adapt this process to your situation. If you don’t have the time, try splitting this into one-hour blocks and spreading them throughout the week. If your business gets no traffic on the weekend, try shifting everything forward so execution happens on a Monday or Tuesday. If weekly sprints are too fast for you, try doing this on a month-to-month basis. Whatever works best for you.
Need to build your own scalable, measurable “growth hacking” process?
Review your growth goals with a Ladder Strategist: