Ep 2: Alex Birkett

Alex Birkett is the growth and experimentation lead at Workato, an integration and workflow automation platform for enterprise technology companies. Alex is also the co-founder of Omniscient Digital, an agency that helps B2B software companies accelerate growth through full-service, conversion-driven content marketing. Prior to Workato, Alex worked at HubSpot as a Senior Growth Marketing Manager focusing on User Acquisition (SEO, CRO, content, and exploring new growth channels).

Alex is a talented marketer and friend. He goes deep on enterprise experimentation, A/B testing, conversion rate optimization, and why he started Omniscient Digital with David Ly Khim. Alex thinks deeply, speaks eloquently, and shares some valuable nuggets on experimentation processes that have helped him achieve so much success in large, complex SaaS organizations.

Connect with Alex:

Watch on YouTube

Listen on Spotify


Audio Transcription

Bill King 0:00

Welcome to episode number two of the Growth Theory Optimal podcast. My name is Bill King, and I will be your host today and every time after this. My first guest is Alex Birkett. Alex is the growth and experimentation lead at Workato, and he was previously on the growth team at HubSpot focusing on user acquisition. Alex is an amazing marketer and someone I consider a friend, and I'm really impressed by all the stuff that he's been working on. On this episode, Alex breaks down what it's like to be a full-time head of experimentation, the role experimentation and testing plays in the growth process, and what data and insights he actually looks for when he thinks about how to succeed with experimentation. Not only is Alex doing that, but he also just launched his own agency with another friend, David Ly Khim, called Omniscient Digital, where they're helping B2B software companies accelerate growth through conversion-driven content marketing. Before we get into the conversation, I just want to say: if you could please like, subscribe, and do all the things that help the podcast get found, it would be really helpful. Without further ado, let's get into episode number two with Alex Birkett.

Alex Birkett 1:22

You want me to try to stay on topic? Or do you, do you mind if I go off on tangents, dude?

Bill King 1:27

I was like, we're going down the rabbit hole and I can't wait to see what's at the bottom. That's where this conversation is going. It's very difficult to predict, but that is so it. Exactly. I'm like, why are we talking about Whole Foods at the end of the podcast? I don't know what's going on. Alright, Alex Birkett, somebody who I just met about two years ago, on the podcast today. I'm a big fan of Alex Birkett. What a great guy. You have done a lot of things in the growth world. You started as a marketing manager. You started in the digital space in 2012. That was before you moved to Austin, right?

Alex Birkett 2:00

Yeah. That was like when I was in college, I acted as the manager for like arctica race, which is an e-commerce company. I was kind of stumbling around, not knowing exactly what I wanted to do. But I knew I wanted to get into marketing and probably the tech space. So those were kind of early experiences.

Bill King 2:16

I'm still trying to figure out what I want to do, Alex. So then, I think we talked before, but one of the roles that probably changed the trajectory of your career was the opportunity to be a growth marketing manager at CXL, which I think was a big influence on your career. Then you went from there, still doing growth, but focusing on user acquisition at HubSpot. You were doing SEO, CRO, and content, which is the Holy Trinity, I guess you could say, in acquisition. And then you started Omniscient recently with David Ly Khim, which is awesome, and we'll get into that, which is basically an agency that helps people acquire customers through content marketing. And now you're at Workato, and you're doing a full-time experimentation role where you actually own experimentation, which is pretty sweet. I hear the idea of experimentation all the time, and it's very rare, I think, that you see someone who just does that full time. So we'll get into that. First of all, how did this evolution happen? Because we've talked about this offline, but I feel like we both had relatively similar paths, going from one role to the next role to the next role. And they were somewhat different, but they were all unique in their own ways, challenging. You were a marketing manager, growth marketing manager, user acquisition, experimentation. How did all this happen? How did this path become what it is? And what were the inspirations that you picked up along the way as your career evolved? Yeah,

Alex Birkett 3:52

I don't, I don't plan things out super well. There's this concept that I recently learned called a pantser versus a planner. So in writing, like, if you're writing a novel, some people fly by the seat of their pants; they're a pantser. If they plan everything ahead, you know, character development, plot, everything, that's a planner. So I've typically been a pantser. But I do fly by principles and kind of general things that I want, so I move toward those macro directions through whatever opportunities come up, sort of through optionality, as Nassim Taleb would put it. So let me try to frame this from the beginning. Let me go back to my roots. Have you heard of this game called We're Not Really Strangers? No? Okay, so it's not really a game like a board game in the sense that Monopoly or something is; it's really like a conversation starter. There's all these prompts, you know, like Truth or Dare type things, basically questions that you ask each other. And I was recently on a trip with some friends, and we were kind of playing this to pass time on the car ride. And one of the questions was, what did you want to be when you grew up, when you were a little kid? And I kind of started thinking about it. And obviously there's these crazy things where you're like, oh, I want to be a baseball player, or I want to be a rock star or something like that, which I still, you know, kind of want to be, but I think the window for those is running out. But when I look back and truly think, alright, when I was really young, what inclinations did I have that seemed natural? There's the cliche, you know, you start the lemonade stand. I did that. I started a lemonade stand with my little brother when I was a little kid. But then I started a little store when I was in third grade, with a friend of mine, where we sold T-shirts and trinkets to friends.
And then I discovered CafePress, where you could actually make T-shirt designs and sell them online. So I got my first e-commerce check in like seventh grade. Not huge, but something where I

Bill King 5:40

learned that I could do this, you can get some dessert. So when you're in middle school, or whatever age that is,

Alex Birkett 5:45

Yeah, like extra Snickers candy bars at lunch or something. But yeah, I trace all this stuff and I'm like, man, I have been starting and building things from zero to one since the beginning. And I've kind of always looked at entrepreneurship as the path that was for me. It's kind of difficult, coming from a small town, thinking about how you actually crack into that. Like, I never knew tech entrepreneurs when I was a kid. I mean, I think that's also more of an emergent trend, where that's been democratized. But through college, I realized I wanted to get into that and eventually build my own things. Like, I was really into independence and freedom, flexibility. I didn't want to be chained down to a location or set of hours. And back then it was kind of like, entrepreneurship is the way to do that. I remember reading The 4-Hour Workweek and a bunch of books in that kind of vein. So my first job out of college was LawnStarter. I deliberately wanted to work at a very, very early stage company, and they were as early stage as you could feasibly get without founding it yourself. They had just graduated from the Techstars incubator program, and I joined pre-seed with just the founders. I was, you know, employee number one, and we had like an MVP of the product, and we were trying to get product-market fit and figuring out sales and customer success and basically every part of the business. So that was kind of foundational in how I looked at business building, because they wanted to raise a bunch of venture capital as well. But in terms of the marketing adventure, that started at CXL. So I studied journalism; I didn't really know a ton about the quantitative arts outside of the few, you know, statistics or social sciences type classes that I had to take.
But CXL was a blog that I always followed, you know, after reading about Sean Ellis and kind of the growth hackers in that whole world. Andrew Chen, I read his blog. I got really interested in the idea of the quantitative, data-driven, experimentation stuff, and the opportunity to join CXL as a content slash growth marketer. It's like this framework of one foot in the known and one foot in the unknown. Like, I knew I could write, but I didn't know the other stuff. So I joined CXL, and I felt like that was kind of a data-driven graduate school for me of sorts. Like, I was paid to learn all these cool things like analytics and A/B testing.

Bill King 7:56

Yeah, that's really cool. We've talked on this before, but I also started, I was in school for journalism, and then had a blog, was writing about stuff. And I was like, why is nobody finding my content? I think it's pretty good. Is there some reason for this? And I went down the rabbit hole of learning the analytics as well. Sounds like we both share this kind of enthusiasm for learning. So once you find something that's got a system that you can deploy strategy around, you just go down the rabbit hole until you eventually find the bottom. Or we're still tumbling down that rabbit hole, still trying to find the bottom.

Alex Birkett 8:31

We talked about that in our podcast too, about like, I feel like you go down rabbit holes easily, right? Like, you get interested in something, and then you start reading and watching YouTube, and then three months later you find yourself an expert in some esoteric new hobby.

Bill King 8:43

Yeah, that's totally it. And then I followed a similar path, where I was like, okay, how do you get content to be found? And then, oh, wait a second, nobody's converting once they get there. Oh, it turns out you have to actually give somebody a reason to convert, and all these things. And so now it looks like, you know, outdated information, but it was probably the most exciting time of maturation in my career, just learning and being able to test those new things and go through that process. So that's amazing. First of all, you've gone through a lot of different roles, and so you're building a skill set that is in high demand right now. And I don't think there's a lot of people who do both content and experimentation, like all the different arts of marketing. A lot of teams have a person who owns maybe some experiments. What does a full-time experimentation person do? Where does it sit inside the team? Do you have like a growth org, and experimentation is a unit inside that growth org? And how did this come about?

Alex Birkett 9:48

This is a new role, so I've only been in it for two months. So I can't say, you know, this is going to be a shallow answer of sorts, because I'm still learning myself. But it sits in the marketing org, on the growth team, and primarily works on paid acquisition and website optimization, which is, you know, semantically the same as a conversion rate optimization position. Now, this expands eventually, because there are also roles on the growth team that include, you know, SDR email sequences. And then beyond that, there's product. And Workato is not necessarily, you know, product-led, where you can go sign up self-serve, but there's still onboarding, there's still retention, there's still upsell and cross-sell stuff that eventually is going to be encompassed by growth and experimentation as well. Now, what it looks like is basically just the data-driven growth process, right? There's research, ideation, hypotheses, prioritization, execution, or experimentation, and analysis. And then you kind of go back and loop through that system. You know, you have weekly meetings where you're reporting results of what happened, what you're currently working on, and what you're planning next. And hopefully you iterate and improve and get incremental gains over time, and some of those big wins pay off for all the efforts. But yeah, it's essentially just, you know, statistical tinkering.

Bill King 10:57

I'm curious. So how do you define success as the person who owns experimentation? If I said, hey, Alex, you did a fantastic job, we talk again in another year, and you got promoted, are you responsible for the overall top-line revenue number? Or do you have specific KPIs that you're looking at to understand how experimentation impacts all those things? Because I would imagine from a data side of things, we know that Alex is doing all these phenomenal experiments to help reduce friction, but there's a lot of things that go on in the funnel that could influence that. How does that work when you have those conversations with the person you roll up to? How do you guys talk about those things? Yeah,

Alex Birkett 11:39

The ROI of conversion rate optimization is one of the most difficult questions, because if you're dealing with a proportion, you obviously have the numerator and the denominator. So if you increase the traffic via, you know, anything, if you raise a round and you get in TechCrunch, or you have an article go viral on Hacker News, or you throw a conference, you get a bunch of direct traffic that is not necessarily going to convert. You have more people, and not necessarily a higher conversion rate; you might actually lower the conversion rate but do nothing wrong. So I think proportion metrics are always difficult and slightly misleading, so there obviously has to be context. But then there's this trade-off between, like, utility and accuracy. The closer you get to the truth, the more complex you're typically going to get. That's the case when you're building growth models; that's the case when you're designing machine learning algorithms; there's always a bias versus variance trade-off. At the start, we're aiming for simplicity. We're using a conversion rate metric where we establish baselines over the past couple of quarters, we see the standard deviation, the potential variance there, and map out potential trends and what we would be happy with in the next couple of quarters. And then we assume that there are not going to be any drastic changes in terms of, you know, traffic allocation or targeting. It's a big assumption, though, right? There theoretically could be, in a Series D startup, especially. Sure. So we're trying to increase conversion rates week over week, month over month, and quarter over quarter, and we have goals based on the website as well as paid acquisition. Now, in the future, what I want to do is try to parse out incrementality. And you could do that various ways. There's holdout sets.
So you basically optimize for 90% or 95% of the audience, or 99, or whatever that proportion is, and then keep 1% to 5% of the audience on whatever old, original, untouched version there is. And then you can sort of parse out what incremental gains your program has produced. There's also a looser, more messy way, where you basically take your wins, your percentage win rate, and then how much they won by, and sort of add those up, although, you know, sometimes those gains don't sustain. So that's a difficult way as well. So this is a very difficult question, especially when you start going into forecasting, like, revenue gains from an experimentation program, which sometimes, not to say they're illusory, but sometimes they don't transpire in the way that they do in your Google Optimize platform. So there's actually a cool series that David Manheim is putting out on LinkedIn now, I think, if you search like revenue attribution. Yeah, it's a difficult problem. I mean, as it is with SEO and anything, right? Metrics are like the hardest part.
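The holdout approach described here can be sketched in a few lines of arithmetic. This is an illustration only, not Workato's actual implementation; all counts are invented:

```python
# Sketch of holdout-based incrementality measurement, as described
# above. All counts are invented for illustration.

def incremental_lift(optimized_conversions, optimized_visitors,
                     holdout_conversions, holdout_visitors):
    """Compare the optimized experience (e.g. 95% of traffic) against
    an untouched holdout (e.g. 5%) to estimate the cumulative gain
    the experimentation program produced."""
    optimized_rate = optimized_conversions / optimized_visitors
    holdout_rate = holdout_conversions / holdout_visitors
    relative_lift = (optimized_rate - holdout_rate) / holdout_rate
    return optimized_rate, holdout_rate, relative_lift

opt_rate, hold_rate, lift = incremental_lift(
    optimized_conversions=2_300, optimized_visitors=95_000,
    holdout_conversions=100, holdout_visitors=5_000,
)
print(f"optimized: {opt_rate:.2%}, holdout: {hold_rate:.2%}, "
      f"incremental lift: {lift:+.1%}")
```

The same math works at any split; the trade-off is that a tiny holdout gives a noisy estimate of the baseline rate.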

Bill King 14:14

Yeah, for sure. I mean, you're relatively new, but for the folks listening who might be saying, oh, that sounds cool, but it's also a little bit confusing, can you give an example of an experiment that you've run recently, and kind of how you approached that problem, just so people can get context for what exactly you just said?

Alex Birkett 14:35

Yeah, so, well, we're kind of undergoing some interesting stuff right now. We're working on bigger projects that aren't necessarily tactical UI experiments. Like, the go-to-market for Workato is essentially: you request a demo or trial, you get passed to sales, and then you get put into the product via a sales-led motion. There's interest in product-led stuff, but not necessarily freemium, because the product is so complex and there are so many use cases. So there's this intermediary that a lot of product-led-ish companies are doing, where you give them an interactive product tour. You can see this on Pendo; I think HubSpot actually has one as well, where you can go in and click around. It's just the HTML and CSS of a product, but it's not in the product itself; you don't need to sign up to see it. So that's a big project that we're working on. And then incrementally, through that, if we see that it works on an aggregate level, that it overall increases the conversion rate of people who see it, especially on the homepage if we put it there, then the incrementality and kind of the micro-optimization comes in the form of, what steps of the product do we show? You know, there's maybe 10 or 12 steps in the sequence of product slides. Do we shorten that? Do we lengthen that? Do we reverse the order and show the advanced features first and the simple features later? Do we change the messaging? And then we can kind of optimize people through that funnel, where, say, 1,000 people start the product tour, and 10 get to the end, and, you know, one signs up for a demo. This is just, you know, throwing numbers out there. But then, how do we get 1,000 people in, and then 500 people finishing it, and then more and more completing demos?
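The made-up tour numbers above translate into simple step-by-step funnel math. This sketch is illustrative; the step names and counts are invented, not Workato's real funnel:

```python
# Illustrative funnel math for a multi-step product tour like the
# one described above. Step names and counts are invented.

tour_funnel = [
    ("started tour", 1000),
    ("finished slide 5", 400),
    ("finished tour", 100),
    ("requested demo", 12),
]

def step_rates(funnel):
    """Conversion rate of each step relative to the previous step,
    which shows where the biggest drop-off (highest leverage) is."""
    return [
        (f"{prev_name} -> {name}", n / prev_n)
        for (prev_name, prev_n), (name, n) in zip(funnel, funnel[1:])
    ]

for label, rate in step_rates(tour_funnel):
    print(f"{label}: {rate:.0%}")
```

Eyeballing these step rates shows where the biggest drop-off, and therefore the highest-leverage optimization target, sits.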
So that would be one level of thinking in terms of micro-optimization. And then, apart from that, what we're doing on paid acquisition, on landing pages, is really just diving into our messaging, which I think is really difficult for most companies, but especially for an automation company. Anybody can use automation. It's not like it's just for email marketers, or for HR teams; literally anybody could use this. So we're doing Wynter tests. So this is Peep's, you know, CXL's Peep Laja? Yeah, I think I've heard of it. Yep. They do user testing, but for messaging. So you get a panel of your target audience, and you give them a landing page or a website, and they give you feedback: specific things that you can improve, things that are unclear, motivations, hesitations, all that stuff. So we're doing qualitative research to inform new messaging. So we may test value propositions, we may test some sub-headers, we may test some benefits-versus-features copy on the page. But that's another big project that we're doing a lot of different experiments within.

Bill King 17:11

Yeah, I think it's really fascinating what you just said, because oftentimes when people think about growth, they think about more of a consumer application, where, you know, you've got an app, you've got a bunch of paid ads, and there's a few steps, and so you're micro-optimizing every nth degree of that funnel. But it sounds like the funnel at Workato is actually quite a bit more complex, in the sense that you have to be able to zoom out, then zoom in on those specific areas of the funnel where you think you have the highest leverage. So I guess I'm really curious. One thing that I've really enjoyed about our conversations is I think you do a really good job of doing exactly that, zooming out and saying, here's the big picture, my job is to increase the conversion rate, and here are the specific silos that I can influence. So take me through how experimentation works within the larger picture of growth, and tell me a little bit more about how you work cross-functionally with your team and the whole process y'all go through. Does it all trickle down from the go-to-market process? Or is this more of, like, hey, go spot specific areas in the funnel and just try to increase leverage as much as possible? How does the process go from A to Z for experimentation? Hmm.

Alex Birkett 18:22

Yeah, again, this is something that I'm kind of revamping in terms of the process. But how it works, in terms of what we focus on, is going to be, you know, that SLA between the sales and marketing org, and essentially the numbers we're expected to hit. It's still a startup, so there are all these growth expectations to, you know, get to the next round or the next level. So that's going to lead with the sales close rates: alright, here's how many meetings were booked, here's how many of those on average become sales, and here's how much the average sale is worth. Then things track backwards. Alright, to get to those numbers, how many demos do we need? Totally. Because then you can do the conversion rate between, on average, how many demos you get and how many meetings are booked from those demos. So then you just track back and say, alright, here's how many demos the growth team, the demand gen team, is actually, you know, in charge of. This would be our expectation; here's our stretch goal. And then you sort of break that down. And this is through the growth model, which is like the cornerstone document of any growth team. You're mapping out the inputs, the outputs, the expectations, and how you can impact different levers. So there are different channels, you know, there's paid, there's organic, there's all kinds of different things going on, there's SDRs, etc., etc.
So then experimentation is functionally going to sit in with those teams, kind of embedded, you know, within paid, across all these different teams, and figure out which motions we can use to have the highest impact. In terms of, like, if paid conversion rates are lower than average, you know, we're going to focus on paid acquisition heavily that quarter. But it's always going to be shifting. We're essentially the plumbers of the growth world. We're going to go into the area that needs the most work and try to diagnose the problems, come up with hypotheses for how we can fix them, and quantitatively design experiments and say: yes, this did work; this didn't work; this is the optimal way, or this isn't the optimal way. But we're always going to lead with the kind of strategic business priorities and the highest impact areas, which come from the growth model,
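The "track backwards" exercise Alex describes is just chained division from a revenue target down to a demo goal. Every number and conversion rate below is hypothetical:

```python
# Working a growth model backwards from a revenue target, as
# described above. Every number and rate here is hypothetical.

revenue_target = 1_000_000      # quarterly new-revenue goal
avg_deal_size = 50_000          # average sale worth
meeting_to_close_rate = 0.25    # booked meetings that become sales
demo_to_meeting_rate = 0.50     # demos that become booked meetings

deals_needed = revenue_target / avg_deal_size
meetings_needed = deals_needed / meeting_to_close_rate
demos_needed = meetings_needed / demo_to_meeting_rate

print(f"deals: {deals_needed:.0f}, meetings: {meetings_needed:.0f}, "
      f"demos: {demos_needed:.0f}")
```

Each rate in the chain becomes a lever the growth team can try to move, which is why the growth model is the cornerstone document.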

Bill King 20:19

Of course, makes sense. Yeah. I mean, one of the things I was really fascinated about when I was thinking about what to ask you was: in order to get to a full-time experimentation person, and I'm sure there are people listening right now who are like, wow, it would be sweet to have that on my team, do you have a sense for what the wins were, or what the culture was like previous to you joining, that allowed them to justify the idea of having a full-time person who's just focusing on experimentation? What was it like before, and how did those conversations go when you met the team? Because I'm really curious, you know, did they go out and say, oh, we're looking for just a full-time CRO thing? Or was this more organic, like, hey, listen, we've got these problems in the funnel, and here's how I can help? Tell me a little bit more about that, because I'm sure there has to be a previous culture in order for you to get to a full-time experimentation role. I'm really curious about that.

Alex Birkett 21:07

Yeah, they reached out to me about a totally different role. And I think they actually opened this one up because I just, you know, went off, and I was such a nerd about A/B testing. So I don't know if they'd planned it fully. But I do know that, especially around this stage, with a lot of companies, you're kind of going from this position of having a lot of brilliant execution people, a lot of brilliant implementers, to kind of scaling up in terms of process and specialties. So this is happening across multiple orgs at Workato. If you're anywhere in that Series B through D, potentially, it really is more a function of the culture, and possibly the employee count. You're going to find yourself with tons of siloed work, tons of ad hoc work that's all brilliant and quickly done. But you're going to find that there's not a lot of sharing, there's not a lot of visibility, and there's not a lot of communication. So as the company grows, as the customers grow, as the revenue grows, you need more of that structure, that formality and process. So if I had to say, the biggest value that I'm adding is probably bringing things together more cohesively, instead of doing random experiments. There was a lot of experimentation going on before; it was just done in silos and ad hoc. You know, somebody working on paid would run a paid experiment, there wouldn't be as much prioritization going into it, it would just be a random idea. I think me coming in brings some sort of texture and some sort of cross-functional reference point to say: alright, I've talked to the growth team, I've talked to all these different teams, here are the initiatives, here are the themes that we're going to work on.
And then here's the prioritization model, the meeting structure, the artifacts that we need to shift ideas upwards in the prioritization model. And here's how we're going to run them, and at what cadence, and what kind of input goals we want. We want to increase our experiment tempo from, you know, four per week to eight per week or something like that (again, made-up numbers). But having a centerpiece like that allows you to focus more on the program level, and still allows individual contributors to focus on, you know, generating great ideas and pulling together great customer insights and research to come up with those ideas. But then you need somebody to cohesively bring it all together, you know, in order to scale. Otherwise, it's always just going to be an experiment here and there that doesn't ever connect to the bigger mission. Yeah,
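Alex doesn't name the prioritization model Workato uses, but a common one for experiment backlogs is an ICE score (Impact, Confidence, Ease), sketched here with invented ideas and ratings:

```python
# ICE scoring (Impact, Confidence, Ease), a common prioritization
# model for experiment backlogs. Ideas and scores are invented.

ideas = [
    {"name": "shorten product tour", "impact": 8, "confidence": 6, "ease": 7},
    {"name": "new hero value prop",  "impact": 7, "confidence": 5, "ease": 8},
    {"name": "rework pricing page",  "impact": 9, "confidence": 4, "ease": 3},
]

def ice_score(idea):
    """Average of the three 1-10 ratings; higher means test sooner."""
    return (idea["impact"] + idea["confidence"] + idea["ease"]) / 3

# Rank the backlog so the team always pulls the highest-scoring idea
for idea in sorted(ideas, key=ice_score, reverse=True):
    print(f"{idea['name']}: {ice_score(idea):.2f}")
```

The point of any such model is less the exact formula than forcing every idea through the same scoring gate before it reaches the backlog.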

Bill King 23:17

I mean, I think what you just said is really interesting, because in my experience, there have been functional silos where there are just random people working on experiments. In previous roles I've worked in, usually the acquisition side of the business, as a function of owning the number, I had to run experiments and try to increase leverage as much as possible. But I do love the idea of having somebody who owns that thing, because not only does it make it easier for you to deploy these things at the organization, since you're the experiment guy and your job is to help them increase leverage, so it's just easier to do those bigger, hairier experiments. But it's also, I think, really neat because it brings together the different teams a lot more than they would have without a full-time experiment person. Because in order for you to get those big wins, would you agree that a lot of the bigger ones are cross-functional, and they are usually the ones that are avoided because they're a little bit more difficult to implement? Has that been your experience so far?

Alex Birkett 24:18

Yeah, it's hard to say what the impact is, because I actually think that, um, well, it depends how much traffic you have and how much statistical power you can actually, you know, warrant. But I do think cheap experiments that have outsized results are probably the best-case scenario, because sometimes you are going to invest a lot in a complex idea that may have a lot of justification, and it's going to fail, and that's essentially going to be a sunk cost. So I don't necessarily know that that's true in all cases, but I do know that it is a limiting barrier on many teams. To not be able to do complex or interesting cross-functional experiments is just a functional limit, where you're going to be stuck with copy tests and stuck with CTA tests forever, so you're going to hit a local maximum on your program. So yeah, definitely, I think that's true. And having somebody who can communicate, and whose job it is to unblock the team in terms of making those things happen, I think that is pretty powerful. And just the visibility function: you know, you send a monthly email update with the experiment results, you own the meeting, you have meetings cross-functionally, and you kind of evangelize and educate on experimentation. That's a unique function, and it's hard to fit that into your day if you're doing all kinds of other things. I actually learned this at HubSpot: having just a person to go to when you have a question about a certain thing is actually pretty important. Yeah, that makes sense. Yeah, because
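The "statistical power you can warrant" point comes down to sample size. Here's a back-of-envelope sketch using the standard two-proportion normal approximation; the baseline rate and target lift are invented, and the z-values correspond to a two-sided alpha of 0.05 and 80% power:

```python
# Back-of-envelope sample size for a two-proportion A/B test using
# the normal approximation. Baseline rate and lift are invented.
import math

def sample_size_per_variant(p_baseline, min_lift_rel,
                            z_alpha=1.96, z_beta=0.84):
    """Visitors needed per variant to detect a relative lift of
    min_lift_rel over p_baseline (defaults: alpha=0.05 two-sided,
    power=0.80)."""
    p2 = p_baseline * (1 + min_lift_rel)
    p_bar = (p_baseline + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p_baseline * (1 - p_baseline)
                                      + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p2 - p_baseline) ** 2)

# e.g. 3% baseline conversion, hoping to detect a 20% relative lift
print(sample_size_per_variant(0.03, 0.20))
```

With a 3% baseline, detecting a 20% relative lift at 80% power needs roughly 14,000 visitors per variant, which is why low-traffic pages limit what a program can test.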

Bill King 25:39

if everybody owns it, then, you know, knowledge transfers are harder. Oftentimes people get pulled in a million different directions. So if the guy who does paid owns, you know, experiments for paid,

Alex Birkett 25:51

they have a lot more on their plate, and so maybe they don't see them fully through. And like you said before, there's a lot of internal stuff you have to deal with. They're also not talking to the website experiment person, and they're not talking to the email experiment person, and they're not talking to the product experiment person. They've got all these backlogs and spreadsheets and prioritization matrices that nobody's communicating. So they're running duplicate experiments; they're not sharing knowledge and coming up with iterative gains and incremental knowledge accumulation. So there are all these limiting factors when you're just dealing with silos like that.

Bill King 26:22

Makes sense. So something that's made me excited as we've been going through this conversation is, this sounds awesome. Everybody loves to run experiments; it's great to learn. So, okay, it's day one at, you know, a random company, and you've been hired as the person who owns experiments. Where do you go? How do you go from zero to one on, first of all, approaching experiments? What are the things you look for, and how would you approach running experimentation for, you know, a random company? What is the Alex Birkett playbook? How do you go into this process when you're starting at a brand new company?

Alex Birkett 27:04

Well, the first thing you do, if you're doing consulting or if you're day one in-house, is you assume that the data is fucked, the processes are all out of whack, and the experiments are not trustworthy. You go in and audit with those assumptions, with your skeptic hat on, because it's almost always the case; every company I've worked for has had some sort of data issues. And if you have data issues, then, well, one, you might not be getting accurate results. But two, you're also voiding the trust that people feel in the results, which is actually a huge matter, because more than half the problem is organizational: getting buy-in for experimentation. Because you're going to be testing interesting things, they're not always going to conform to your brand style guidelines, and there are going to be creative ideas that don't fit the strategy, which is the point of experimentation, because you're trying to cap the downside and unleash innovation in a controlled-risk environment. So I think going in, cleaning up the data, making sure that you have baseline metrics, expectations, variance, all of that stuff set in place. Whether you have a method by which you can run controlled experiments, whether you have a testing tool, and especially determining whether it fits the cultural context of your company, that's important too. You know, if you need to be doing interesting partitioning of audiences and segments, and personalizing different experiences if you have multiple audiences, is Google Optimize the right tool? Maybe, but there might be a better one. So going in and identifying what tools you have, what technology you have in place, and figuring out if you're actually capturing qualitative research. Again, in almost all cases, you're either not doing that, or it lives in silos, or you're doing it in such a limited capacity that it's not very useful.

So going in and implementing things for specific questions, like Hotjar polls, or, you know, Wynter copy tests, or UserTesting, which is one of my favorites, and implementing that stuff. And then, well, I want to also say there's an organizational factor here that's true in almost every position, whether you're coming in as an SEO, whether you're coming in as an agency or freelancer, whatever: you also have to introduce yourself and make sure that you're aligning with, you know, company culture and strategy. Because if you're going off on your own path, the organizational immune system is going to kick you out pretty quickly. So listening before acting is always important, which is part of the auditing process, by the way; that's actually just listening to what's currently being collected and done. And then this culminates in a plan. You don't want to plan things too quickly, but once you've got your data in place, once you've got your roadmap set in terms of test hypotheses from the qualitative research, then you start building the process. Then you start, you know, building a prioritization matrix, a project management system, incorporating other teams, and, you know, coming up with SLAs, agreements, metrics, all that stuff. But that's how it starts, generally speaking.

Bill King 29:42

Yeah, interesting. So it sounds like you're probably in the middle of all this stuff, right? Would you say that's accurate? There's a lot you need to get in place in order for experimentation to be successful. I think everybody can relate to the data thing. When you say data, what do you mean? You're running experiments; what exactly do you mean by getting data in check? What are some of the things you look for? And maybe somebody who is a founder, or somebody who's working at a company now, can learn about some processes that would help them get set up for experimentation faster, or just be more informed on what exactly you're looking for.

Alex Birkett 30:19

Yeah, this is great. This is going to differ so much based on, like, what kind of company you work for. This is why e-commerce is probably one of the easiest to work with, because Google Analytics is made for e-commerce sites, essentially. So if you're working with e-commerce, you basically want to work backwards and ask what kind of information you need to make decisions. You might need more than you think in advance, so, you know, anticipate some of that and work with an analytics expert to presume some of that stuff. But at a base level, you're going to need to make sure that your user, session, and pageview data is clean and accurate; that it's not, like, double-tracking sessions because you set up a bunch of UTM parameters that link to other internal pages on your website. You want to make sure that your conversion counts line up with your shopping cart, or your CRM if you're in B2B. You want to make sure that you can track downstream metrics, such as average order value in the e-commerce context, or repeat purchases. And in the B2B context, which is really, really hard, you want to make sure that you're enriching your data with sort of the downstream stuff. So if you have a sales-led process, did they actually convert and become customers? If it's a product-led process, did they activate? Did they monetize? This is where cohort analysis comes in. But essentially, you know, getting these baseline metrics in play so you can figure out what your conversion rates are and where you want to go from there. Because if you don't have that information, what are you optimizing, really? You know, you can still make good decisions based on qualitative data even if you don't have this stuff, but you're not going to be experimenting; you're not going to be running controlled A/B tests without that information.
And then there's so much more when it comes to data cleanliness. Making sure that, like, I don't know how technical you want me to go. (Bill: Whenever you want, my friend.) Like, you know, you have query parameters that you need to clean up because you're running a bunch of paid campaigns, and that causes cardinality, where you have thousands and thousands of rows for a single page, and it's very hard to aggregate that and get interesting information on a page basis. You know, there are cross-domain and subdomain-level tracking issues; there are just tons of, I would say, data cleanliness and data integrity issues. And then there are the not-must-haves: are you doing event tracking? Are you tracking different goals and building goal funnels so you can map the customer journey from a high, top-of-funnel page down to that bottom-of-funnel point where they're, you know, considering purchase? But yeah, I mean, it's really on a need-to-know basis: what information do I need? And then kind of working backwards to figure out how to get that.
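The cardinality problem Alex describes, where campaign query parameters explode one page into thousands of rows, is typically fixed by normalizing URLs before aggregating. A minimal sketch (the URLs and counts below are invented for illustration):

```python
from urllib.parse import urlsplit

def normalize_path(url):
    """Strip query strings and fragments so UTM- and gclid-tagged hits
    roll up to a single row per page path."""
    parts = urlsplit(url)
    return parts.path.rstrip("/") or "/"

raw_pageviews = [
    "https://example.com/pricing?utm_source=google&utm_campaign=q4",
    "https://example.com/pricing?utm_source=facebook",
    "https://example.com/pricing/",
    "https://example.com/blog/post?gclid=abc123",
]

# Aggregate pageviews by normalized path instead of raw URL.
counts = {}
for url in raw_pageviews:
    path = normalize_path(url)
    counts[path] = counts.get(path, 0) + 1

print(counts)  # {'/pricing': 3, '/blog/post': 1}
```

In practice most analytics tools expose this as a setting (e.g. excluding query parameters from reports) rather than requiring code, but the transformation they perform is essentially this one.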

Bill King 32:45

Yep. So that's like the data holy grail. I work at a six-person startup, and the first thing I think when I hear you talk about this is, that's the holy grail we're looking for to run an experimentation program. Where do you see people making the biggest mistakes when it comes to trying to implement an experimentation program? Because I think everybody can relate: data is just never going to be right at most places, or at least most companies don't have their data perfectly in check. If you're at a smaller company, or maybe you don't have your data right, how would you go about that problem? Would you be thinking to yourself, let's try to get some quick wins in areas that I think are working properly? Or would you say, alright, let's hold back, let's try to fix the data first and go through that process, even though it might be a little more painful and take more time before you get to some experiment results? A lot of folks, I'm sure, are thinking, I would love to run experiments, but I'm not sure I would know the results were right. How do you handle that? Because, you know, you started at Workato; I'm sure they have a phenomenal analytics team, so you're ahead of most folks. But most folks don't have their data in check. So how would you guide them through the process of getting to the first few experiments?

Alex Birkett 33:59

Well, I would say that way fewer teams and companies should run A/B tests than do, because there's a hype cycle built into it. Like, you see these case studies from companies like Optimizely, you know, increased conversion rates by 10,000% or whatever, so you want to jump on that train and do the same thing. But it's just a feasibility matter. Not every company has enough traffic. Well, some companies have enough traffic to do A/B testing, but not enough to make it business-wise worth it. You know, if you can only run one test a month, do you really think you're going to win every test? Probably not; you're probably going to go like one for five or one for ten, depending on how well-optimized your user experience is to start. And in many cases, startups are actually dealing with pretty good user experience best practices, so there's usually not actually that much low-hanging fruit if you're at a technology startup. Granted, there's sometimes stuff, forms and stuff. But as a matter of philosophy, I would say you're probably best off just mapping out the feasibility and the business case for it. Like, best-case scenario, if we win two or three experiments, and each one is a 20% lift, which would be absurd, how much is that worth to the business, versus how much is the cost of the testing tool, the person who runs these, the time, the opportunity cost where you could be doing other, more innovative and less incremental things? So, one: just question why you're doing A/B testing and experimentation in the first place. Two: people are really loose with the definition of an experiment. I think an experiment in many cases means something that's controlled, and you're trying to tease out some sort of causal factor, right? You're trying to see, like, why?
Or, like, maybe how much something won, and what elements specifically caused that. But other people use "experiment" very loosely, in the sense of, hey, we're trying something new. That's just, fuck it, do a bunch of new stuff. That's great; if you're talking about experiments in that sense, everybody should be doing them, and startups should be doing more than big companies, because you're really trying to tinker and see what works. And then third, I would say I would actually heavily index on just decision quality: pre-vetting your decisions, going for high-expected-value items instead of just random stuff. I think when you're Booking.com, or you're Facebook or Google, you can test a lot of random stuff and actually go for discovery; you can basically, through data, see what you didn't expect to win. And in those cases, that's the best possible thing, because you never would have tested it if you were thinking logically. But in the beginning, you don't have that luxury. So you're going to have to deal with qualitative research; you're going to come up with as much evidence as you can for ideas to be tested in the first place, and then put those forth and really ruthlessly prioritize in the beginning. So I would say, don't even look at A/B testing tools if you don't have enough traffic. Just do some surveys and talk to customers and do Wynter messaging tests and all that stuff. You're going to get tons of insights there.

Bill King 36:47

Yeah, shout-out to the team over at Wynter; I was checking that out. It's really interesting, because I think a lot of people think about experiments and they think about running an A/B test with one image and, you know, a different image on the side, and trying to sort of cobble together what that might mean. But oftentimes, I think the biggest experiments are just simple. It's like what you were talking about: how many people are getting there, and how much friction, physical or cognitive, is getting in the way of them realizing the goal that you're trying to get them to realize? That's basically the foundation of it, right?

Alex Birkett 37:17

Well, that's actually a good framework too: the BJ Fogg behavior model, where, when you want somebody to take action, there's motivation and ease. And if you can increase the motivation, usually by copywriting or by, you know, different messaging and persuasion, that's one lever you can pull. Or it's just making it easier to get done: if you remove form fields, if you remove steps, if you basically take out unclear messaging and make things easier to understand, then that's also a lever. So if you just look at it from that paradigm, or that framework, you know, sometimes you don't even need the qualitative tools; sometimes you can just do that kind of analysis offhand. You can just understand what is feasibly easier to do by walking through the website yourself, you know. Exactly.
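The Fogg lens Alex describes (behavior rises with motivation and with ease) can even serve as a quick-and-dirty way to rank test ideas before any tooling exists. A toy sketch; the scoring scale, the ideas, and the scores are all invented for illustration, not part of the Fogg model itself:

```python
def fogg_score(motivation, ease):
    """Toy score in the spirit of the Fogg behavior model: an idea is
    stronger the more it raises motivation (copy, persuasion) and/or
    ease (fewer fields, fewer steps). Arbitrary 1-10 scales."""
    return motivation * ease

# (motivation impact, ease impact) guessed per idea during a walkthrough.
ideas = {
    "rewrite headline to stress outcome": (7, 5),  # mostly motivation
    "cut signup form from 9 fields to 4": (5, 9),  # mostly ease
    "add third testimonial carousel": (4, 4),
}

ranked = sorted(ideas, key=lambda k: fogg_score(*ideas[k]), reverse=True)
print(ranked[0])
```

This is the "walk through the website yourself" version of prioritization: crude scores, but enough to surface the friction-removal ideas that usually dominate early backlogs.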

Bill King 37:59

Yeah, it makes a lot of sense. So this is probably a great transition. We talked briefly about Omniscient. Could you tell us a little bit more about what Omniscient is, and why you chose to start the company with David? Because you obviously have plenty of things on your plate, and, you know, you're pumping out content; I see it all over the place, and I'm like, Alex, he's got some systems, because he gets a lot of things done. Tell me a little bit more about Omniscient, kind of why it exists, and how you help companies grow faster.

Alex Birkett 38:30

Yeah, this is the bifurcation of my personal brand. I try to keep these separate; it's difficult to do, and, you know, maybe eventually they're going to have to merge, but I have my content marketing self and my experimentation self. So the content marketing self: I've been involved with content marketing for years, since the start of my career. Even when I was doing internships, I was always kind of writing and blogging. I was a big fan of the Ryan Holidays, the Tim Ferrisses, you know, just the bloggers who built businesses on the backs of their pen. So I love that model. And obviously, you know, LawnStarter crushes it with SEO and content. CXL built their business on a blog, essentially, and so did HubSpot, by and large, right? That was the whole inbound marketing playbook that they made a name for themselves with. I've always been involved with it. I've done a lot of writing; I've done freelance writing. David Ly Khim, my co-founder, was also doing freelancing. He's been involved in a bunch of things; he worked at Single Grain and then at HubSpot for many years, so he's seen the insides of very advanced content programs. And then, you know, we were sitting at a growth marketing conference in San Francisco, and my former boss at CXL, Peep, was there. And we were skipping the talks, having beers, as we normally do; it was me and David, and he was explaining to us how hard it is to find a content marketer and how expensive the content marketing agencies he was looking at are. And David and I just looked at each other. We'd actually talked a lot about starting a business together, because both of us are kind of entrepreneurial-minded and interested in building our own stuff. We just looked at each other, and we're like, we could do that. We could start this right now; let's just do this.
Because I got kind of tired of waiting for the right idea to come around, and I'm like, you know what, let's just build a revenue-generating business. There are so many people out there who are like, agencies aren't a scalable business, or it's, you know, bad margins. It's like, I don't know, it's not that bad of a business, honestly. It's fun to run, I get to work with great people, and our clients are all badass. I'm enjoying it. So we started a content marketing agency. I think we differentiated, one, because we have great networks, so we started just reaching out to people in our networks and got our first clients that way. But David and I are both technical: I come from an experimentation background, and he comes from a product management background. So I think we really bring an interesting angle to content marketing, outside of just, you know, here's some fluff. I didn't want to say fluff, but: here's some pretty writing, you know, you're going to really like the quality of this article. Whereas we're like, we're going to drive some business results, and here's how we're going to measure it, here's how we're going to model it and predict it and forecast it. We treat it like a product, or like a growth program. I think our first headline was "turn content into a performance channel," or something like that. And we still work with clients on that basis. We do last-click attribution, we model things up beforehand, we track everything. It's very performance-based. So that's where we're at. Yeah,

Bill King 41:30

yeah, I mean, jeez, how we got connected: I actually used to read your stuff, I think on the CXL blog, and you were in town for Inbound, and I was like, I need to meet this dude. He's going deep on some concepts. You know, a lot of people just put out stuff, but I was like, you can tell he cares when he writes about something. It's like a passion project more than just an assignment, right?

Alex Birkett 41:50

It's actually hard for me to let go sometimes, and I'm trying to do that, because I want my personal blog to get bigger and get more traffic, but I obviously have limited time. So I decided, I'm now hiring freelancers to write content, and I built a content roadmap report for my own site. But it was actually psychologically pretty difficult to let go and just be okay with publishing the content. Sure, it's not bad content, but it's not content that I would write, necessarily. And you kind of have to let the ego go sometimes, too. The stuff that I write, I like it, but it takes me so long, and I'm such a perfectionist that it's becoming a bottleneck to actual growth. And I've done this with tons of things, actually, just psychologically, this purist's trap. Like, before we launched a course (and, you know, it's doing pretty well, and it was fun to do), in my mind there was this idea that people who sell courses are, like, you know, scammy info marketers. And that's obviously not the paradigm anymore, because there are all these platforms, like Teachable, that are democratizing it. But in my mind, it was always, like, the Tai Lopez type or something, and I don't want to be like that. There are all these limiting factors in my mind. So the purist content stuff, I'm trying to learn to let go of a little bit, for better results.

Bill King 43:01

Yeah, I mean, you know, I also had the same kind of thoughts, and then I did Reforge, and that changed my perception of further education online. If anybody's interested, definitely check that out, and also check out Alex's content as well, and CXL Institute too. Yeah, me too. Like, to get started today, to be good at your job and to get to whatever state you want, it's out there; it's just about how hard you want to work and, you know, how clear you are on what it is you want to do, right? So I think this whole story about how you started with marketing, went to growth marketing, and now you're doing experimentation, all of that came about because you've been playing around in the lab, finding what you're passionate about outside of just your day-to-day work, you know what I mean? You get exposed to things by taking on projects and not assuming too much out of them.

Alex Birkett 43:52

I've done a ton of this, like, through the courses, obviously, working on the courses at CXL Institute, and just reading a ton. But, you know, sometimes I feel like it's a little frustrating that there's so much out there, because you always feel behind. If you're like me, you do. And so I'm always like, oh, I'm not an expert in ABM, so, you know, am I going to be employable in two years? That anxiety is there. And I don't know, I feel like the next step for me is just, alright, how do I spend less time always trying to be ahead of the game? I feel like that's not an entirely sustainable thing for your entire career, to just always be voraciously reading and consuming and all this stuff. And sometimes I want to just go out and kayak, you know?

Bill King 44:29

well, you live in a beautiful area for kayaking, and, you know, you should spend as much time outside as possible. You're always under this looming threat in the digital world that things change. I mean, I just did a podcast interview with Kevin Indig yesterday, about GPT-3 and the influx of AI now in content, and how, again, things are just completely different. The world's changing every single day, and, you know, you just kind of have to accept that. It's not like, oh, what am I going to do? It's like, the world is going to change. The most difficult thing is, if you don't know what you want, you can get caught in these information-overload traps, because everything's shiny and everything's new. I mean, the easiest way to get over those is just to go try stuff. If you find something interesting, actually go do it, because sometimes doing the job is very different from what you read. You know, if you're in marketing, or if you're in growth, or if you're running an experimentation program (I would imagine; I haven't done this full time), doing it full time and at scale, there are probably a lot of things to that process that a lot of folks didn't consider. So I'm a big advocate of: learn the minimum that you need to possibly get into something, but then, as soon as you feel at least reasonably confident that you could execute that thing, just go do it and get some muscle memory. I love that,

Alex Birkett 45:50

I would say that's probably a trap: this idea that there's so much knowledge out there, when actually, on the advanced fringes, things get pretty marginal. And I feel like that's probably a trap you and I fall into, because we go down these rabbit holes. But you get into a career like SEO, and what it really comes down to, and this is obviously going to be simplified, and people are maybe going to get angry at this, but it's links and content; those are the things that matter. And if you know that, you just go start doing shit. And then you start to pile on the marginal stuff, like, alright, what is my internal PageRank, and how can I sculpt that with internal links? There are all these, you know, weird things there. But experimentation is similar. It's like, alright, how do I communicate this stuff? How do I run trustworthy experiments? How do I get good ideas? You know, there are these couple of pieces. And then, you know, instead of reading all these advanced books on experiment design and sequential testing and non-inferiority tests, it's like, maybe let's get the basics in play first, and then, if we need those things, we can go out and learn them. But I do feel like I sometimes inundate myself with the fringe, you know, the marginal stuff.

Bill King 46:53

Yeah, I mean, you know, it's easy to go down the rabbit hole and never apply it to any real-world scenario. So I think that's why there are two things that are important. Number one is assuming you don't know a lot of things. I think that's really important when you're in an experimentation role, because you can get trapped in this kind of reinforcement loop where you see everybody doing the same stuff, and usually the breakout stuff is either the most simple, or it's something that nobody has quite thought about, because they're not getting exposed to things that are different from the world they're in. And number two is just having frameworks. There are people who have built these amazing experimentation programs at scale at some of the biggest companies in the world, and if you can level up and talk to people like Alex and read his content, you can learn what the fundamentals are fast, and then deploy your creativity on top of those frameworks, because it will help you put some systems around things, right?

Alex Birkett 47:50

I love that idea of assuming you know very little. I like the epistemological humility. I think that's an interesting framework for anything, but especially experiments. You start to realize that even with tests or ideas that you thought were highly certain, there's always a degree of uncertainty, even if, well, especially if you A/B test. And A/B testing is a way to quantify your uncertainty. That's all

Bill King 48:10

it is. For sure, for sure. Alex, where can people find you? What do people need to walk away with? If people heard this and said, wow, he's got a lot of big words (like I do every time we talk), where can they find more of those big words and find out more about you and what you're working on? And if they need some advice on, you know, testing, or they need some help with content marketing, where can people find you?

Alex Birkett 48:32

Yeah, the best way to do it is alexbirkett.com. I'm writing a lot of blog posts there; I have a section where you can contact me, with my email and all that stuff, and then I have a section with all my best articles, in my opinion. And then beomniscient.com is the agency website. So if you want content marketing services, or you want to read more of my SEO and content stuff, that's over there. Other than that, you know, I'm not really doing Zoom calls anymore, so if you're in Austin, just hit me up and get tacos or a coffee.

Bill King 49:03

Yes. If you'd like to go play poker with myself and Alex at some point, we are going to be doing that, and it's going to be fantastic. I can't wait. We're going to go

Alex Birkett 49:11

play poker, and maybe we can find an emo band or something.

Bill King 49:14

That would be the ultimate weekend, honestly. Just all I need, you know: coffee, tacos, and emo poker. Alright, Alex, thanks for joining, man. It was awesome having you. Everybody, connect with Alex. Looking forward to the next one. Peace.
