Rethink your approach to daily operations and experiments: Part II - Designing an operating system for experiments
You need to run experiments alongside the core operations of your startup. How do you learn and execute simultaneously?
Experiments can mean many things to a startup, from smaller-scale A/B copy tests run on email campaigns to larger-scale product extensions. Regardless of functional area, these experiments consist of a series of actions taken in front of potential users to test a hypothesis before deciding whether to invest additional time and resources into larger projects.
To reap the true benefits from these efforts, you need to not only execute, but design a system that nets a successful experiment. If a startup is a company structure built for learning, that means you’re not only learning externally what your customers need to achieve product-market fit, but also internally how your team works best, the tools, processes, and people that help bring your idea to life in the most effective way possible.
Alongside designing a system to run your core operations, experimenting effectively will help you put action to insights, and grow efficiently, while feeling less chaotic and more strategic in the process.
How DO you co-manage your ongoing company work and the experiments that help you grow to that crucial next stage?
Design your process management to complement your experiments.
Why this works
Managing operations and experimentation are not mutually exclusive, and actively supporting their integration can enhance internal efficiency, communication, and direction. By learning from the lessons of each approach, your team can inform and improve the work of the other, thereby designing processes that both complement and iterate on your experiments.
Why do this now
Early customers are engaged and willing to help you grow. Leverage that audience. Neither daily operations nor experiments stop, so build better systems today that will enhance the work you’ll do tomorrow.
Below (Part II), we’ll outline how to design great experiments. (Next week in Part III, we’ll close with how to design the two systems to not only coexist but strengthen each other).
Let’s break this down like an expert generalist. As you know from our prior issues, expert generalists leverage the tools, rules, and people needed to create effective systems.
Remember: there is no one-size-fits-all approach. Modern operations are the art and science of considering the tools, processes, and people that make up a system for how things get done, and the outputs will be as unique as the inputs.
how We Do: how to effectively manage learning and growth experiments.
One of the most common problems when running experiments is that too often founding teams don’t put effort into designing the full lifecycle of a successful experiment. We act like squirrels chasing the latest shiny ideas thrown into our Slack channels without effectively structuring our tests to capture learnings and make decisions with the data gathered.
It’s crucial to remember one thing here – an experiment succeeds if you learn. An experiment fails if you don’t. It’s the single most important factor in running an experiment. And by appropriately phasing your experiments, you’ll be able to cycle through them quickly alongside your existing day-to-day work. You’ll be able to learn, and over time, those learnings will lead you to the next stage of growth. (Hint: Channel your inner Giannis.)
1️⃣ Dedicate time to a clear planning process.
2️⃣ Allocate time to prepare to ensure that your team has the bandwidth, resources, and assets for an effective experiment.
3️⃣ Jump into execution, and keep an open mind as you go.
4️⃣ Take the crucial time for analysis and build this in at the start.
Planning
Tools 🛠️
Slack channels can be used to centralize brainstorming and gather ideas that could become well-defined experiments.
Start a series of idea-XYZ channels to direct your team’s suggestions.
Miro, Notion, or Slack have great brainstorming tools to whiteboard your ideas as you plan exactly what needs to be done to get started.
Notion or a Google Doc can get the ball rolling if you build out a template and get the team to weigh in asynchronously before you get started.
For those who prefer an in-person work environment, a good old-fashioned “post-it notes on a whiteboard session” is a valuable way to capture all ideas before you begin.
Rules (Process) 📝
Within any experiment planning process, it’s crucial to identify:
A clear hypothesis - what insights led you to want to run an experiment in the first place? Who or what do you want to experiment on, and why? Consider the scope of the experiment and any open questions.
A clear question to be answered - when identifying a question, make it as concise as possible without too many sub-questions. Hyper-specific questions help to get the right amount of information for an experiment to be focused from the start.
Some questions to answer that will help you to develop a clear why:
What is the problem that needs to be solved?
Why is it a problem?
Where is the problem observed?
Who is impacted?
When was the problem first observed?
How is the problem observed?
How often is the problem observed?
A goal - this can be a learning goal, an activity goal, or, where possible, a metric goal informed by learnings from former experiments. Think success metrics - goals, signals, and measures that can be used to assess this experiment.
A defined timeline - determine a clear start and end date for the experiment so you can maximize speed and efficiency while working.
A clear owner (more on that below).
A quitting plan - is there any reason we should quit midstream? Are there learnings we might surface, or variables that would taint the experiment, that we should outline before we start as triggers to change or end things? It’s a great way to avoid the sunk cost fallacy and channel your inner Annie Duke.
Once outlined, identify the tasks and owners for each, with clear accountability on who is responsible for what. Synthesize this in a summary shared with all involved.
Identify how you’re capturing the data from the experiment, and where this is happening. (This is the big one).
Relying on scratch notes, qualitative data, memory, or gut does not an excellent, data-driven experiment make. If you define how information is captured in advance, you won’t be rushing to the closest place available to write things down when the time comes to execute.
Identify the time to analyze the experiment with all participants in advance. Set a calendar date and hold the space. This is often effectively done in a team meeting collaboratively where different experiences and perspectives can be discussed.
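The planning checklist above can be captured in a lightweight, shareable template. Here is a minimal sketch in Python; every field name and example value is illustrative, not a prescribed schema:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ExperimentPlan:
    """One record per experiment; fields mirror the planning checklist."""
    hypothesis: str    # the insight that prompted the experiment
    question: str      # the single, concise question to answer
    goal: str          # learning, activity, or metric goal
    owner: str         # the one person clearly accountable
    start: date        # defined timeline: start...
    end: date          # ...and end date
    quit_criteria: list[str] = field(default_factory=list)  # reasons to stop midstream
    data_capture: str = ""  # where and how results are recorded

    def summary(self) -> str:
        """Short synthesis to share with everyone involved."""
        return (f"{self.owner} tests '{self.hypothesis}' "
                f"from {self.start} to {self.end}; success = {self.goal}")

# Hypothetical example experiment:
plan = ExperimentPlan(
    hypothesis="Shorter subject lines lift open rates",
    question="Does a sub-40-character subject line raise opens by 2 points?",
    goal="Directional read on open-rate lift",
    owner="Avery",
    start=date(2024, 5, 1),
    end=date(2024, 5, 14),
    quit_criteria=["Deliverability incident taints the send"],
    data_capture="Google Sheet: one row per send, per variant",
)
print(plan.summary())
```

Even if your team never writes code, filling in the same fields in a Notion or Google Doc template enforces the same discipline: no experiment starts without an owner, a timeline, and a quitting plan.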
People 🫶
Identify a clear experiment owner, key stakeholders, and participants in the process.
Identify key decision-makers throughout the process of the experiment in advance. Will everyone be ready to give timely input when needed?
Preparation
Tools 🛠️
Transition your ideas to Asana, Monday, Basecamp, or another project management tool when you’re ready.
Make sure you’re building a place to track data: Google Sheets, Excel, a pre-made dashboard within a sales or marketing tool like HubSpot playbooks, etc.
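If a full dashboard is overkill at your stage, even a flat file beats relying on memory: append one row per observation as it happens. A minimal sketch in Python (the filename and column names are placeholders to adapt):

```python
import csv
from datetime import datetime, timezone
from pathlib import Path

LOG = Path("experiment_results.csv")  # placeholder path
COLUMNS = ["timestamp", "experiment", "variant", "outcome", "notes"]

def record(experiment: str, variant: str, outcome: str, notes: str = "") -> None:
    """Append one observation; write the header on first use."""
    is_new = not LOG.exists()
    with LOG.open("a", newline="") as f:
        writer = csv.writer(f)
        if is_new:
            writer.writerow(COLUMNS)
        writer.writerow([datetime.now(timezone.utc).isoformat(),
                         experiment, variant, outcome, notes])

# Hypothetical observations from an email subject-line test:
record("email-subject-test", "short", "opened")
record("email-subject-test", "long", "ignored", notes="bounced once, resent")
```

The point is not the tool but the habit: every observation lands in one agreed-upon place, timestamped, the moment it happens.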
Rules (Process) 📝
Execute all steps outlined in the plan. If you need to amend any of the project’s parameters as you prepare, document the change and the reasoning so you have the context for future reference.
Try to plan through the entirety of the experiment.
Are there any conflicts on the calendar that could impact the steps?
People 🫶
If your team is large enough, have someone who hasn’t been involved in drafting the experiment review your plan. Their fresh eyes may be able to identify an obvious step or potential conflict that you missed. Experiments are about challenging assumptions so use these opportunities to test how your perspective could inadvertently sway the experiment.
Examine your sample size. While all data can be useful, if you don’t have the reach to create a critical mass, make sure you’re accounting for that in your process and particularly in your review of the results.
Ensure everyone managing the experiment will be available throughout its duration. If someone needs to be out, do you have a backup protocol?
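A quick back-of-envelope check on sample size, assuming a two-variant conversion test: the standard normal-approximation formula at 95% confidence and 80% power. The baseline and lift numbers below are placeholders; the takeaway is that small lifts need surprisingly large audiences.

```python
import math

def min_sample_per_variant(p_base: float, p_target: float,
                           z_alpha: float = 1.96,    # 95% confidence, two-sided
                           z_beta: float = 0.8416) -> int:
    """Normal-approximation minimum sample size per variant to detect
    a shift in conversion rate from p_base to p_target."""
    variance = p_base * (1 - p_base) + p_target * (1 - p_target)
    n = (z_alpha + z_beta) ** 2 * variance / (p_base - p_target) ** 2
    return math.ceil(n)

# Placeholder numbers: 10% baseline conversion, hoping to detect a lift to 12%.
n = min_sample_per_variant(0.10, 0.12)
print(n)  # a few thousand users per variant
```

If your reachable audience is far below that number, don’t cancel the experiment; just treat the result as directional and say so explicitly in the review.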
Execution
Tools 🛠️
How do you troubleshoot if something breaks mid-stream? Customers have a way of doing exactly the one thing you didn’t plan for. Do you need to set up your tools in advance to adapt?
Lean on your project management tools and communication channels to update as needed. Keep a bird’s-eye view throughout, with daily check-ins where possible to keep up the progress.
Be ready to adapt and pivot to get what you need from the experiment, even if it flies in the face of your hypothesis. Fall back on your design tools to adjust the process and/or flow.
Rules (Process) 📝
While an experiment is running, it may be tempting to intervene if the results don’t look like what you were expecting. If this happens, open your mind and see if there’s a different lesson you could learn. Were you asking the wrong question?
Don’t jump to conclusions. Whether we know it or not, it’s difficult to go into an experiment fully agnostic about the results. (We’re human!) Do your best to fight the urge to guide participants or interpret results to match your guess.
People 🫶
Experiments can be nebulous and not always obviously rewarding. It can be demoralizing to your team to “fail” an experiment, especially if you’re running experiments often while you’re chasing early growth. Again, an experiment succeeds if you learn. An experiment fails if you don’t. And the learning will come if you appropriately prepare for the experiment. This is why you’ve designed a clear system from the start that makes it nearly impossible to fail.
Analysis
Tools 🛠️
For interviews or experiments where analysis comes from lots of notes, synthesize with ChatGPT.
Use AI-notetakers such as Otter or Fathom to quickly record and decipher team meetings.
Or – try a word cloud to visually depict what you’ve learned.
Rules (Process) 📝
Follow through with the analysis. Ensure the team holds themselves to this – it’s the moral of the story.
Include both quantitative insights as well as qualitative feedback whenever possible.
Did you answer the question you wanted to answer? Or open up a hundred more questions? The analysis may lead you to your next experiment.
People 🫶
Who takes the lesson and runs with it? Is part of the analysis identifying the next best steps and the person/team best equipped to iterate on what was learned?
Disseminate results to the broader team. How will you make them contextually appropriate for everyone coming into the analysis? How do you display your analysis so it can be most effectively interpreted by all types of learners, who might have different specialties?
Take It Up A Level
Using AI to Improve how You Do
Actually Actionable
Nice article. Now what?
We’ve taken the ideas above and put them into an action plan for you and your team.
🔖 Meeting 1 - 60 minutes - how We Do | Improving Our Operations (last week!)
🔖 Meeting 2 - 45 minutes - how We Do | Improving How We Run Experiments
Work with your team to understand what’s worked in the past and what hasn’t when it comes to experimentation and sprints (10 minutes).
Create specific templates for your experiments to ensure they are easy to plan, implement, test, and review. Use your team’s insights to draft templates that codify the experimentation process above (20 minutes).
Build specific folders within your knowledge base to codify, structure, and organize this documentation to ensure it is easily accessible for your team (10 minutes).
Develop an experimentation framework that leverages your tools, rules, and people (+AI) to best navigate these experiments and sprints as they arise by establishing clear guidelines and expectations, and standardize the best you can. Whether the outcome is a new strategy, a new approach, or a simple learning, all indicate success, growth, and progress (5 minutes).
If you structure how each stage of an experiment will be intentionally managed, it’ll be clear by the end if it’s successful or not, and what needs to happen next. Suddenly the many ideas that float around will form a line, connected by dots of learnings, and a clear path forward to your next stage of growth.
Now that you’ve designed your existing operations and your experimental operations, how do you run them concurrently, and have them reinforce each other?
More on that in the week to come. We’ll see you there!
Interested in digging in now? If your startup needs a “spring clean”, the oAT team is hosting a workshop next week designed to help you optimize your tools, systems, processes, and habits for maximum efficiency and success. Learn more here.