In this episode of The Breakout Growth Podcast, hosts Sean Ellis and Ethan Garr sit down with Teresa Torres, Product Discovery Coach at Product Talk and author of Continuous Discovery Habits.

Teresa is a recognized leader in the product discovery space, helping product teams integrate continuous learning into their workflows to drive growth and innovation.

Teresa shares her thoughts on how teams can shift from being feature-focused to outcome-focused and why that matters. She also explains how companies can build products that improve engagement and retention by keeping customer needs at the center of decision-making. With real-world examples and a clear framework for implementing continuous discovery, this episode is a must-listen for product leaders and growth professionals alike.

What You’ll Learn:
The Power of Continuous Discovery: Teresa explains how embedding continuous discovery practices into your product development cycle helps teams stay aligned with customer needs and drive sustainable growth.
Moving from Outputs to Outcomes: Learn why product teams should focus on the impact their products have, rather than simply delivering features, and how this mindset shift leads to better customer engagement.
Balancing Business and Customer Value: Teresa dives deep into how product teams can create a balance between solving for customer problems while ensuring that business goals are met.
Assumption Testing Over A/B Testing: Why testing assumptions early on is critical for reducing risks, saving time, and building products that truly resonate with customers.
Building Continuous Discovery Habits: Practical advice on how teams can establish the right habits to continuously learn from their customers, and how these habits fuel innovation.
Cross-Functional Alignment: Teresa explains how continuous discovery can align teams across functions, helping everyone from product managers to engineers stay focused on delivering value.

Key Takeaways:
“Discovery is about continuously learning and adjusting, not waiting for the perfect idea.” — Teresa Torres
“The shift from outputs to outcomes is critical for driving real value for customers and businesses.” — Teresa Torres
“Your assumptions are the riskiest parts of your product—test them before you build anything.” — Teresa Torres

Timestamps:
[00:00] – Intro: Sean and Ethan introduce Teresa Torres and her background in product discovery.
[06:00] – The importance of continuous discovery and how it fuels engagement and retention.
[12:30] – Moving from outputs to outcomes: How product teams can stay focused on the right goals.
[20:45] – Testing assumptions early to avoid building the wrong product.
[30:15] – How to build habits that support continuous discovery within teams.
[38:20] – Aligning cross-functional teams around continuous discovery and customer value.

Why Listen:
If you’re a product leader, startup founder, or growth professional looking to improve customer engagement and retention, this episode with Teresa Torres offers invaluable insights on how to integrate continuous discovery into your organization. Teresa’s practical advice on testing assumptions, focusing on outcomes, and building customer-centric teams will help you unlock innovation and drive long-term growth.

Links & Resources:
Learn more about Teresa’s work and continuous discovery at Product Talk and Product Talk Academy.
Here’s a great article by Teresa to help you get started with her work
Follow Teresa Torres on LinkedIn – Connect here
Here’s the article that Sean wrote on Why Curiosity is Key to Growth Success
Subscribe to The Breakout Growth Podcast – Subscribe here

Join the Conversation:
If you enjoyed this episode, share your thoughts on social media and tag us! Don’t forget to subscribe and leave a review—it helps more people find the show.

Transcript:

Sean (00:01)
Hi, Teresa. It’s great to have you on the Breakout Growth Podcast. Ethan and I are excited to dive into product discovery with you. Particularly, we want to try to connect it to engagement, which is obviously a key lever of growth. And so that’s one of our big goals of this conversation.

Teresa Torres (00:18)
Excellent, I’m excited to be here.

Sean (00:20)
Yeah, and as I mentioned, joined by Ethan Garr. Hey, Ethan.

Ethan Garr (00:20)
Yeah.

Hey Sean, hey Teresa. Yeah, Teresa, I’ve been really looking forward to this one. So let’s jump right in.

Teresa Torres (00:29)
Let’s do it.

Sean (00:30)
Yes. So, yeah, Teresa, you’ve obviously been a key voice in advancing product discovery. Your book is really popular. I was looking on Amazon and seeing how many reviews you had there and we’re at 750,000 books sold. A lot of it non-English versions of our book, but I think you have twice as many Amazon reviews as my book. So if there’s any proxy on books sold, that’s awesome. Congrats on that.

Teresa Torres (00:59)
Thank you.

Sean (01:00)
Yeah, so for anyone who doesn’t know your background as much and what you’re doing, can you tell us a bit about how Product Talk is helping companies build more customer centric products and ideally if you can tie a bit into what you’re doing to help drive user engagement.

Teresa Torres (01:20)
Yeah, I think the easiest way to think about this is I think as an industry, we’re seeing a big shift from outputs to outcomes. So instead of just building things, starting to think about what’s the impact those things are having. And then I think we’re also seeing companies realize if they want to scale, if they want to have a lot of impact, decision making is getting pushed down to more product teams. But product teams aren’t always equipped to handle both of those changes.

Right? They’re used to being given a roadmap of build these features, and now they’re being asked to impact these outcomes. And so I think continuous discovery is really about: our products are never done. We’re always going to be looking to improve them. There’s always more engagement we can drive. We can always acquire more customers. We can always retain more customers. How do we do that continuously? The way that I look at that is, how do we build fast feedback loops with our customers so that we’re always learning with them? We’re designing with them.

we’re creating with them so that we’re sure that what we’re doing matters. And I kind of got into this just because I spent most of my full-time employee experience in the San Francisco Bay Area working at early stage startups. I saw the exact same mistakes everywhere. Product teams didn’t know very much about their customers. They weren’t talking to them. Everything was very founder vision led, which can get you somewhere, not always where you intend. And I just…

wanted to start chipping away at, can we help teams be a little bit more customer centric? And so for the last 13 years, I’ve been working as a product discovery coach.

Ethan Garr (02:57)
Very cool. Can you tell us a little bit about maybe what you’ve seen evolve in that time in terms of product discovery? I mean, obviously just in that time as a product guy, I’ve seen a lot change and like you said, sort of that product decision-making being pushed down to product. So I’m curious like what you’ve seen in terms of that evolution since then.

Teresa Torres (03:17)
Yeah, I think there’s some big arcs that even predate my time as a coach. I would say as an employee, what I saw happening in the world, like in the early 2000s, it was sort of the rise of UX. We saw a lot of debates about design, information architecture, interaction design, visual design. All of that kind of came together into the UX role. And that helped a lot because I think UXers did a lot for advancing, hey, maybe we should get feedback from customers on our designs. And then that started to trickle.

across the organization. I think another big inflection point was The Lean Startup by Eric Ries. He sort of introduced this idea of assumption testing, fast cycles, don’t build too much, learn before you build. Although I think his cycle starts with build, but learn quickly. Let’s just put it that way. And then I think really the biggest catalyst has been as companies become, let’s say, internet native.

Sean (04:01)
You

Teresa Torres (04:13)
like they’re really thinking about delivering value over the internet, our ability to measure impact has just gone through the roof. Like we literally can measure where we’re losing people, where people fall off, what’s not going as expected. And I think that’s raised some awareness around, things don’t always have the impact we thought they would. How do we get better at this? And then I think when we ask the question, how do we get better at this?

Sean (04:25)
Mm

Right.

Teresa Torres (04:41)
We started from a project world of like, let’s do big usability studies. Let’s do big research projects. And I think in the last five to 10 years, we’re seeing a shift to smaller, more continuous activities as opposed to these large project-based activities.

Sean (04:58)
Is most of your coaching on kind of like products right in the early stage and trying to get them to product market fit or is it more about just the ongoing evolution of products?

Teresa Torres (05:11)
All of the above. Really, here was my goal. When I started to think about how do we help teams just be more customer-centric, increase their hit rate, what really got me started on this path was learning what percentage of startups fail. And it blew my mind. I was like, how in the world is this the world we live in? And so that was sort of the catalyst that got me thinking about how do we make better decisions? How do we reduce the error rate? And I think…

Sean (05:12)
Okay.

Teresa Torres (05:41)
A lot of my employee experience with small early stage startups, like I was the 10th employee one place, the 23rd another place, just a lot of zero to one. But a lot of my coaching was larger companies. And when I first started coaching at larger companies, I had a little bit of like imposter syndrome, like what can I teach these companies? The largest company I ever worked at had 100 people. But I quickly learned it’s the same problem. It’s the same challenge, right?

Ultimately in product, we’re trying to get customers to behave in a certain way. Hopefully in a way that creates value for them, but ultimately we’re trying to get them to behave in a certain way. And it doesn’t matter if it’s zero to one or if it’s 99 to 100, it’s, or whatever number we want to put on mature products. It’s really about humans and getting humans to do things and helping humans understand things and helping humans get more value out of things.

Ethan Garr (06:36)
I just wanted to follow up. I think people are always talking about being customer centric. And I feel like sometimes it becomes just a buzzword because people, you know, it’s like, we’re customer centric, but it has no depth. What do you think is really the key for product teams that want to be more customer centric? Obviously product discovery is important to that. But can you dive in a little bit about what really drives a true, authentic customer centric experience?

Teresa Torres (07:04)
Yeah, customer centric is becoming one of those words that doesn’t mean anything. Because everybody is customer centric, right? For me, it’s not just about customer centricity, because this is where I think some of the UX community gets it wrong, right? There’s this belief that if I just create value for my customers, I’ve done my job. Whereas what I think is really important is to make sure we’re creating both customer value and business value. And that if we’re not doing both,

We’re doing a disservice to our customers. It’s a little bit counterintuitive, right? If I create customer value, but I don’t create business value, what happens? My product is going to get shut down. Companies aren’t charities, right? And so I think aligning those two is really critical. And so for me, what does that look like? It means I have to understand my business context. I have to understand how my product supports the business context. That’s usually how does my product generate revenue?

or save costs, reduce costs for my company. And then I have to look at how do I understand how I can create customer value, something that’s valuable for my customer in a way that aligns with that business value. And that’s what a lot of my product discovery framework is really designed around: how do we align those things? How do we make sure that every product team understands how to create value for the customer so that they’re satisfied, but also creates value for the business? And that usually means the customer is so satisfied.

Sean (08:25)
Mm

Teresa Torres (08:31)
they’re gonna open their wallet and pay for something.

Sean (08:33)
Mm-hmm. Yeah, and I think that the value piece obviously comes back to product market fit. It’s just: is there enough value in there in the first place to actually get people using it and sticking around? And obviously, as you said, the business value is going to require monetizing in the right way and building a real business around it. With our focus on this podcast on growth, a lot of that then becomes how do we make sure enough people are experiencing that value

in the right way. And so I’m curious if you’ve looked at ways to tie customer discovery to actually improving user engagement or other things that relate to growth.

Teresa Torres (09:17)
I’m going to push back on this a little, because I know there’s been a long dance in the product, growth, and marketing worlds of where’s the overlap, who owns what. I don’t care. I don’t care who owns what. Here’s how I think about growth, right? To grow, I can acquire customers, I can reduce the customers that I’m losing, I can retain the customers I have, I can grow the size of my customers, whether that’s through net retained revenue or

Sean (09:24)
Mm

Right. Same, yeah.

Ethan Garr (09:29)
You

Teresa Torres (09:46)
growing seats or whatever, right? To me, those are all outcomes. So if I have a product team that’s tasked with one of those areas, or maybe even all of those areas, as we see at smaller companies, I wanna look at, okay, I know the business need is retain more customers or acquire more customers, that’s easy. I now need to do the work to understand who are the right customers, who’s gonna get value from what we have today, and or what do we need to…

Sean (10:12)
Mm

Teresa Torres (10:16)
add to our product, what value do we need to add to our product to go find more customers, right? And so there’s a little bit of this like, how do we discover what’s working today and how many people match that? And is there more capacity in our current market? In which case, great, I just need to go find more people like that.

Sean (10:21)
And yeah.

Right. And then how does customer discovery actually help you find those answers?

Teresa Torres (10:41)
Yeah, so in my framework, so let’s back up. One of my goals, like I didn’t set out to think I’m gonna come up with this product discovery framework. What I did was I just started talking to teams and I started coaching teams and one of the things that I learned really early in my career, actually in college, was interviews and to interview people and to learn about their lives and to make it about them. We’re designing for them, we better know who they are, right? And I just started getting teams doing this and…

A few things started to happen. When you start interviewing customers, first of all, you have to know what to ask, which a lot of people don’t. Like our intuition doesn’t always get us reliable feedback. So there’s sort of this art of interviewing. And then even if you learn what to ask, you get pretty quickly overwhelmed with everybody’s a little bit different. There’s a ton of things you could do. How do we decide what to do? Or worse, everybody looks exactly the same, which means you’re probably falling

prey to confirmation bias and not actually learning from your customers, right? And so then there was this sort of second challenge of, what do I do with all this stuff? And then there was this third challenge of, it’s great, this is interesting, but my business wants me to ship something tomorrow and I need to start building. And so really what happened pretty organically of just working with teams, I started to look at, how do we add scaffolding around these three things? Like how do we help teams ask the right questions?

How do we help them synthesize what they’re learning and make it really actionable? And then how do we help them determine if their ideas for what to build are actually addressing those things? And that’s sort of the heart of my framework. So I start with, you have to have an outcome. You have to know what success looks like. Where are we headed? You have to interview. As you interview, you’re going to hear about all sorts of unmet customer needs, pain points, and desires. How do we synthesize that and make a strategic decision about where we want to play?

And then we have to explore solutions. And as we explore solutions, we’re looking for a match between those things. Does a solution address this need in a way that drives our business outcome? And I use assumption testing for that. And that’s just building on Eric Ries’s world. I didn’t make that up. It’s just a thing that was already out there. And I think a lot of what I did was how to make that really actionable for teams as opposed to a big concept. So I would say discovery is that simple. We’re starting with…

We’re starting with an outcome. We’re talking to our customers to learn about what they need. We’re evaluating solutions based on how well do they match those needs.

Sean (13:12)
And ultimately, I assume ultimately testing those solutions for how effective they are.

Teresa Torres (13:18)
Yeah, so part of evaluating solutions, this is one of the big changes I think with moving from a project world to a more continuous world. When we talk about testing solutions, what I see happen is teams build a bunch of stuff and they A/B test it, and then they learn they built the wrong thing. And what I don’t love about that is you just did all the work before you learned you were on the right track, right? So I do love A/B testing. I think it’s a great measurement tool. But I don’t want…

Sean (13:39)
Okay, yeah, yeah.

Yeah.

Teresa Torres (13:45)
teams to build everything. I don’t even want teams to do all the design work before they know they’re on the right track. And I think Eric Ries hinted at this in the Lean Startup, even more than hinted at it, but I think a lot of people missed the message, which is you don’t have to test your whole solution. You can break it down into its underlying assumptions and do assumption testing. And we can assumption test much quicker. We can test multiple ideas. I think that really unlocks just better decision making.

Sean (13:49)
Mm-hmm. Mm-hmm.

Mm-hmm.

Teresa Torres (14:14)
So when I talk about evaluating solutions, I prefer to do that through really rapid-fire assumption testing.

Sean (14:21)
One other quick one before I bring Ethan in, and I’m sure you’re chomping at the bit with some questions. I’m curious, a lot of times when I talk about improving engagement, there’s a really tactical model that can help with that, and that’s Nir Eyal’s Hooked model. Clearly it feels like…

Teresa Torres (14:39)
Mm

Sean (14:42)
You’re ultimately trying to kind of understand the drivers underneath that model instead of just kind of thinking through it. So are there any other kind of limitations to that model? Or do you feel like that model complements what you’re talking about? What are your thoughts on that?

Teresa Torres (14:55)
Yeah, I’m going to say I’ve always thought this about growth, but I could replace growth with product. I think it’s the same. Not that growth and product are the same, but this statement applies equally to both. So I think we can look at growth from a really narrow optimizing lens. And for a lot of us, we get a ton of value out of that. That’s where our low hanging fruit is. We often see, this is where we see the blog posts about, changed this call to action and I got a 400% increase in my conversion rate, right? Like,

I’m not poo-pooing that because there are a lot of wins there and I think we often underestimate those wins. But I think there’s another scope and that’s are we even in the right ballpark? Did we build the right thing? Is this the right value? Can we get a better value fit? And that’s where conversion rate optimization isn’t going to get you there. There’s a little bit more of like, I need to understand the humans I’m impacting, the humans I’m designing for and hopefully with.

And this is where we see what I would call more of a step function change. But don’t take that to mean bigger impact, because sometimes we get a huge impact from those little optimizations, right? But it’s more like, instead of being in this lane optimizing, we might learn there’s a lane right next to us that has more impact. Yeah.

Sean (16:16)
Right. Kind of a local maxima versus a much bigger opportunity on the next mountain over.

Teresa Torres (16:22)
And I would say that like all the habits I write about in my book and I like, I named the book Continuous Discovery Habits because I want people to think about it as a collection of habits. Like it’s not a rigorous process or framework that you adhere to perfectly. It’s here’s some tools in your toolbox. And I think all of those habits apply even in the optimization realm, right? So like if I’m trying to optimize a checkout flow,

I’m still gonna benefit from talking to people who recently purchased about their experience. I’m still gonna benefit from assumption testing. But if I expand my scope and say, okay, I actually wanna know are we offering the right products? Is that how I can get people to buy more? Okay, well now I can interview people about what they’re looking for and I can assumption test around new products to add. So my goal was not to like come up with a framework that we should all be dogmatic about and argue about, is this way better than this way?

My goal was to look at what’s the underlying framework that underpins all of our work. Right? And it’s really like, I really draw from problem solving and decision making research, because I feel like that’s what makes this universal, is that we’re essentially solving problems and we’re making decisions, so how do we do that well?

Ethan Garr (17:39)
So as teams adopt continuous discovery, how do you see them measuring the real impact on metrics? Are there key signals that they should always be looking for?

Teresa Torres (17:50)
I think there’s two ways to think about this. So the first is, as we’re learning how to be a continuous discovery team, how do we track our progress? And then the second is, continuous discovery is designed to help a team drive an outcome. So the ultimate measure of continuous discovery is, are we impacting that outcome? That can be a lagging indicator, and that’s why in this sort of learning process, some of the things I like to look at is, are you doing the right activities? So.

Are you interviewing regularly? Are you running assumption tests regularly? Is your outcome providing focus? Or is it just a thing that you set and then you do all the other things that you were gonna do anyway? And then ultimately, what I like to do in that learning phase is before you release anything, literally anything, take some time to document why did you build this? What impact did you expect it to have?

And then a week later, 30 days later, six months later, whatever the right timeframe is, revisit it. Did it have that impact? And when teams do this, they realize lots of stuff didn’t have the impact they thought. And then when we realize that we can ask why not? What was the gap? What assumption did we get wrong? And then that’s a feedback loop on how well did we do in our discovery, right? And so I don’t think we’re ever going to be perfect. I don’t think we’re ever going to release things that always have the exact impact we thought, but I think we can get closer.
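
One lightweight way to make that "document what you expected, then revisit" habit concrete is a simple release log. The sketch below is a hypothetical Python illustration (the fields, feature name, and numbers are invented, not a template from Teresa): write down the expected impact before shipping, then fill in what actually happened at the review.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

@dataclass
class ReleaseLogEntry:
    feature: str
    why_built: str                   # the customer need / opportunity it targets
    expected_impact: str             # what you expect to move, and roughly by how much
    review_on: date                  # when you'll come back and check
    observed_impact: Optional[str] = None
    lessons: List[str] = field(default_factory=list)

# Hypothetical entry, written down before the release ships...
entry = ReleaseLogEntry(
    feature="Saved searches",
    why_built="Interviewees said they re-run the same search every visit",
    expected_impact="+10% weekly return visits among active searchers",
    review_on=date(2025, 3, 1),
)

# ...and filled in at the review, closing the discovery feedback loop.
entry.observed_impact = "+3% weekly return visits"
entry.lessons.append("We assumed users would notice the save button; most did not")
print(entry.expected_impact, "->", entry.observed_impact)
```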

Ethan Garr (19:13)
Is there, as teams adopt this, is there one key mistake that you see a lot of them make? Is there something that you’ve learned along the way that you’re like, if you can avoid this, you’re going to be a lot more successful?

Teresa Torres (19:26)
If I had to pick one, language is so vague. Let me tell you a story. I started coaching teams and a team would show up for a coaching session for the first time and they’d say, Teresa, we already do continuous discovery. I’d be like, great, let’s talk about what you’re doing. And what they would describe is they were doing usability testing or they were getting pulled in on sales calls or they were good at A/B testing. Like they were doing activities

that a lot of us would put in the discovery bucket. But there was nothing continuous about it. And one of the things that I like to look for is, are you throwing ideas away? Like you could do all the right activities, but if you’re not changing your mind about anything, those activities aren’t having an impact. So I think the big mistake teams make is they get stuck in a project world. They sort of miss the continuous part and they’re occasionally doing big discovery activities.

and haven’t really found this continuous cadence. We see the same thing when companies adopt Agile, right? They move from this big heavy waterfall process to, I think Marty Cagan calls it, mini waterfall. Like they’re doing two week sprints, but they’re still planned six months ago. It’s a little bit of that, like they miss the continuous mindset part of it. And then because of that,

Sean (20:44)
Mm-hmm.

Mm

Teresa Torres (20:51)
They’re not really learning from their activities. They’re still doing what they would have always done.

Sean (20:57)
So one of the things that I like to do when I approach growth is I’ll kind of pick an area of the business where it feels like we’re underperforming for whatever reason. Like we don’t retain customers, and I feel like we could retain them much better. There’s a lot of different ways we could potentially do that, but maybe we set the goal of improving that. And then what I found is, if I just randomly go in and start experimenting, I might get some wins, but it’s not going to be that great.

But I’ve learned over time that if I can deeply contextualize the situation, deeply understand what’s happening there, maybe where the customer’s having a problem, why they’re not coming back, why they are coming back, that the experiments we run tend to have a lot higher potential impact, but it still doesn’t necessarily lead to the perfect solution based on that deep context.

I still do find the need to have three or four solutions that I might be wanting to test and one ends up performing a lot better than the others. So I’m curious if you see that kind of experimentation as part of this process or is it that generally enough context, the solution you come up with, it’s kind of like, did it lift results or not? Roll it back if it didn’t, keep it if it did, or are you kind of simultaneously testing potentially more than one solution?

Teresa Torres (22:25)
Yeah, this is one of the biggest things I hammer home, is I want people comparing and contrasting. I want them comparing and contrasting at the outcome level. What if we tripled retention or like reduced churn by 3x? What if we acquired 3x more customers? Right, where’s our leverage? What outcome should we be focused on? I want them comparing and contrasting when we’re talking about user needs and pain points and desires. Like, how much does the customer care about this?

If we address this one versus this one, how does it change our position in the market? And then once we choose an opportunity or a customer need to go after, I want teams comparing and contrasting solutions. I get asked all the time, like, hey, I ran this test. This was my conversion rate. Is that good enough? I don’t know. Or like, I ran a prototype test and I got feedback. How do I make sense of it? You know, like, we tend to think about like,

Sean (23:10)
Yeah, if it’s a hundred percent

Teresa Torres (23:20)
I think we’re trained to think in absolutes. Like we grew up going to school, there’s a right answer. And I think what we see in business is there’s lots of good answers. Some are better than others. And the only way to really evaluate things is to compare and contrast. And when I said I like to draw from decision -making research, this is really grounded. Like we have decades of decision -making research that suggests if we compare and contrast, we make better decisions. The challenge is most product teams and probably growth teams too,

like, have full backlogs and are drowning in work. And when you tell them, test more than one idea, they go, yeah, right. I don’t have time for that. And that’s where this concept of don’t test your ideas, test their underlying assumptions makes this sustainable. I work with teams that test multiple assumptions in a single day because that’s the goal. Right, we’re going to frame our assumptions so small that we can test them really quickly.

And then that starts to give us the data we need to compare and contrast.

Sean (24:20)
One of the things I found in my last hands-on role with a company called Bounce was that we initially had that feedback. We’re just so busy, we can’t really implement this as an A/B test. Let’s just implement it. We’ll analyze the data, see if it improved against the goal, the output goal that we’re trying to do. And what we found is that there’s so much noise that almost everything was inconclusive, in which case, like,

Teresa Torres (24:46)
Yeah.

Sean (24:50)
for that extra 20% of work, we kind of got no learning versus at least some, even if you’re like before and after testing, like we make a change and try to see does customer behavior change based on that change. And so fortunately over time, we did move a lot more toward everything that we wanted to introduce, trying to set it up as a test so that we actually

could see the incremental outcome on before or after, or solution A versus solution B. And it was a very integrated effort between growth and product, and ultimately, a lot of the product teams’ features and other things they were releasing, we ended up following that same methodology. So it’s kind of everybody bought into the, we really don’t know the answer until we release it and test it.

So yeah, I don’t know if there’s a question in there or just the observation that ultimately it’s really hard to predict customer behavior and improvements in customer behavior. I think you’re much more likely to do that if you can deeply contextualize it and understand the customer needs. But validating a solution is tough without running it as a test. That’s my own experience anyway.

Teresa Torres (26:11)
There are a few layers to this. So, I agree with what you said 100%: A/B testing is the best way to measure, did the thing we built have the impact we thought it would? Right? So we built this thing, we thought it would change behavior in this way, we’re going to A/B test it. That is the gold standard of, did that happen? A challenge with A/B testing, first of all, is a large number of product teams work on products and/or features

that don’t have enough traffic to A/B test in a reasonable amount of time. Right? I don’t want a team waiting a month to decide, was this the right thing to build? Now, if you built it and you’re waiting a month to measure impact, that’s fine. But when you’re deciding what to build, I want you to have fast feedback loops of, does this idea look better than this other idea? And that’s where A/B testing, unless you’re Google or Facebook or maybe Netflix, probably not even Netflix, right?

Sean (26:45)
Right, yeah.

Teresa Torres (27:11)
And even those companies, like even the companies with a billion users, a billion people don’t use all the features. So you have to be working on a feature that a billion people use, right? Like, relying on A/B testing to tell you what to build, you’ve done all the work upfront, you’re waiting forever. It’s just, it’s slow. So I like to frame A/B testing as, it’s a measurement stick. It’s not informing should we build this or not. It’s, we decided to build this.

Sean (27:21)
Right?

Teresa Torres (27:40)
Did it have the impact we thought it would?
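
To put rough numbers on Teresa’s traffic point, here is a minimal sketch (not from the episode; the 5% baseline conversion rate and 10% relative lift are invented assumptions) of the standard two-proportion sample size calculation. It shows why a low-traffic product can wait a very long time for an A/B test to conclude.

```python
from math import ceil, sqrt
from statistics import NormalDist

def ab_test_sample_size(baseline_rate, relative_lift, alpha=0.05, power=0.8):
    """Approximate visitors needed per variant to detect a relative lift
    in a conversion rate with a two-sided two-proportion z-test."""
    p1 = baseline_rate
    p2 = baseline_rate * (1 + relative_lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    pooled = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * pooled * (1 - pooled))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p1 - p2) ** 2)

# Hypothetical numbers: a 5% signup rate and a hoped-for 10% relative lift.
per_variant = ab_test_sample_size(0.05, 0.10)
print(per_variant)  # roughly 31,000 visitors per variant for these assumptions
```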

Sean (27:42)
Yeah, and that’s the way I’m looking at it as well. Like it’s after-the-fact validation, or not quite what we expected to happen, but just some kind of a feedback loop to let you know.

Teresa Torres (27:44)
Yeah. So…

Yeah, and that’s where using that as a feedback loop of did we miss something in our discovery? And I always tell teams you’re always gonna miss something in discovery. It’s really rare that you build a thing and it has exactly the impact you thought it would, right? And so that is a really important feedback loop of like, hey, we saw this gap, let’s go back and look at what we explored in discovery, we missed this, how do we make sure we don’t miss that the next time?

Sean (28:09)
Sure.

Ethan Garr (28:20)
So I have a question related to that. Years ago, Sean came up with ICE scoring: impact, confidence, and effort. And you hear there’s some variations on that, there’s RICE and some others. I’m curious how that fits in with your thinking on this framework. Because what you’re describing is, be careful what you build, right? Like focus first on figuring out what to build. That process can be really good in organizing tests and prioritizing. Do you see a role for that kind of

Teresa Torres (28:31)
Yeah.

Ethan Garr (28:49)
framework in what you do.

Teresa Torres (28:53)
Here’s what I like about those frameworks and all the variations on them. If you have stakeholders in a room, it starts to get out their underlying assumptions. We’re assuming the effort is this. We’re assuming the business impact is this. We’re assuming the customer impact is this. What I don’t like is if you’re not testing those assumptions, you still just have a bunch of people in a room guessing at what they should build. So what I want to see is,

Sean (29:17)
Mm

Teresa Torres (29:20)
Use that to have a really good conversation. And now let’s go get some feedback. If we think it has this impact on customers, let’s test that. So a lot of what I teach is, okay, so we have an idea. We want to build a thing. And in order to build this thing, we believe it’s going to create value for the customer. So one of the things that I like to do is I like to use story mapping, the same idea popularized by Jeff Patton. Literally map out what does the customer have to do to get value from that solution.

It doesn’t exist yet, it’s just an idea on a whiteboard. Can we map out what the customer has to do to get value from that? Okay, we now have a map of everything we’re assuming a customer will do. And if they are willing to do all those things, at the end, there’s some value creation. Great. Now let’s look at each step and we can ask what needs to be true in order for a customer to take this step. That’s how I’m generating my assumptions that my idea depends upon. It’s one of the ways I’m generating the assumptions my ideas depend upon.

And so I’m gonna come up with things like, let’s suppose that you work at a startup and you don’t wanna do identity management. So you’re like, okay, let’s use Google as our login. That’s your idea. And then someone says, well no, let’s use Facebook as our login. I know that’s not really a thing anymore, but humor me. And there’s a debate, right? And executives can sit in a room or the product team can sit in a room and come up with, well, Google’s API is easier, well, more people use Facebook, whatever, right?

Ethan Garr (30:33)
You

Sean (30:33)
you

Teresa Torres (30:46)
All assumptions. If we map out, in order for this to be a successful way for people to log in, what does the customer actually have to do? Well, they have to go to our signup page, they have to see the options, they have to understand the options, they have to have one of those accounts, they have to select one, they have to do the little connection to make sure that I know what your Google ID is, right? We have to be able to store that accurately and abide by Google’s API.

Then when they come back, they have to remember that they selected Google and not Facebook. Okay, this is a really simple example. I just enumerated a dozen assumptions. So now we’re having this debate. What if someone just went out and launched a little survey on your website and said, which of the following accounts do you have Google or Facebook? Now I can learn in a day or two, do our customers use Google more or Facebook more? And instead of being in a room full of opinions that have no data whatsoever,

I just collected data about my customers that’s telling me which direction should I go, right? And I could do that for any of those assumptions so that I’m starting to learn like one of these options is better than the other.
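
As a concrete illustration of that flow, here is a small hypothetical sketch: enumerate the story-map steps for the login idea, treat each step as an assumption, and tally a one-question survey. The steps and response counts are invented for illustration; this is not data from the episode.

```python
# Hypothetical story map for the "sign in with Google or Facebook" idea Teresa walks through.
story_map_steps = [
    "visit the signup page",
    "see the login options",
    "understand what 'continue with Google/Facebook' means",
    "have a Google or Facebook account",
    "select one of the providers",
    "complete the OAuth connection",
    "remember which provider they chose when they return",
]

# Each step implies an assumption the idea depends on: customers are willing and able to do it.
for step in story_map_steps:
    print(f"Assumption: customers will {step}")

# One cheap assumption test: a single-question survey about which accounts people actually have.
# These response counts are invented purely for illustration.
survey_responses = {"Google": 412, "Facebook": 145, "Neither": 31}
total = sum(survey_responses.values())
for provider, count in survey_responses.items():
    print(f"{provider}: {count / total:.0%} of respondents")
```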

Sean (31:55)
So yeah, just to tie the two pieces together, ICE scoring as I envisioned it when I came out with it was that confidence is largely going to be based on how much customer validation do you have in advance. Are customers struggling with this? Are they not? So that ultimately you can increase your confidence in a solution based on spending more time analyzing with customers. If it’s a really easy test to run,

Teresa Torres (32:12)
Yeah.

Sean (32:24)
maybe it’s easier to just run the test rather than spend the time on gathering that feedback. And that’s always obviously debatable. I’m a big proponent of customer feedback. I used to run Qualaroo and basically built that company to be able to gather feedback in user flow. My product market fit survey was all about kind of, I don’t want to try to grow something until I can deeply understand.

Teresa Torres (32:50)
Yeah.

Sean (32:53)
how customers are getting value from it. So sustainable growth, I think, is really about deeply understanding value and being able to double down on what matters to customers and fix the things that prevent them from getting to value. But ultimately, it does come down to the test. Part of the process is that feedback loop from the test. But again, that’s just my view on it.
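
For readers who haven’t seen ICE written out, here is a minimal sketch of scoring a backlog with it. ICE is commonly scored as Impact, Confidence, and Ease (each 1 to 10), and one common convention averages the three; the ideas and scores below are hypothetical, not recommendations from Sean or Teresa.

```python
from dataclasses import dataclass

@dataclass
class Idea:
    name: str
    impact: int      # 1-10: expected effect on the goal metric
    confidence: int  # 1-10: how much customer evidence backs the idea
    ease: int        # 1-10: how cheap and fast it is to test

    @property
    def ice(self) -> float:
        # One common convention is to average the three scores; others multiply them.
        return (self.impact + self.confidence + self.ease) / 3

# Hypothetical backlog; confidence rises with the amount of discovery behind each idea.
backlog = [
    Idea("Redesign onboarding checklist", impact=8, confidence=4, ease=5),
    Idea("Add social login", impact=5, confidence=7, ease=8),
    Idea("Weekly usage digest email", impact=6, confidence=6, ease=9),
]

for idea in sorted(backlog, key=lambda i: i.ice, reverse=True):
    print(f"{idea.ice:.1f}  {idea.name}")
```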

Teresa Torres (33:18)
If I’ve learned anything, it’s the way that people use tools is not necessarily the way the tool is intended. Right? So, like I said, I love it as a conversation starter. I love it like you’re saying as like, summarize what you’ve learned. It’s just not what I see people do. I see too many teams throw out opinions, they put it in a spreadsheet, they get a math formula, it looks like the right answer, and then they’re way overconfident in that right answer, right?

Sean (33:24)
Right. Yeah.

Mm

for sure. Yeah.

Ethan Garr (33:42)
Yeah.

Sean (33:47)
For sure. And what I advocate and what I see companies do is two completely different things. So I completely agree with you. Yeah.

Teresa Torres (33:52)
Yeah, and same with my stuff. I mean, I argue with people constantly about please don’t use generative AI to create your opportunity solution tree. You’re kind of missing the point. But, you know, people do what they want to do, so.

Sean (34:02)
Yeah

Ethan Garr (34:03)
Hahaha

Sean (34:05)
Right, I think this ultimately comes down to like at what point do you just run a test and get the feedback to see how behavior may change or not change based on a solution versus spending a lot of time on the customer feedback and just customer discovery to figure out what is the right solution to test. Do you have any thoughts on the kind of best way to balance that?

Teresa Torres (34:32)
This is hard, because I used to say, if it’s faster to run the test, run the test. Or if it’s faster to build it, just build it. But then I saw teams literally build 30 things in 30 days, and they were 30 wrong things. We wasted a month, right? So I think there’s a balance of both. I really want teams continuously engaging with customers. So one of the benchmarks that I set years ago was I like to see product teams talking, like doing a customer interview every week.

Sean (34:38)
Mm-hmm. Yeah.

Right. Yeah.

Ethan Garr (34:45)
hehe

Teresa Torres (35:02)
We’re just investing in our understanding of who our customer is incrementally over time. And then of course there’s synthesis work that has to be happening. What are we learning over time? How is it changing? If you’re doing that and you are looking at a solution and it literally is faster to build it than to really evaluate it.

Maybe you could say we do have a rich understanding of our customer and we should just build this and there will be times that that’s the right answer. But I think if a team has a pattern of just building it because it’s fast and they’re measuring impact and they are realizing they’re getting miss after miss after miss, maybe they should stop just building it because it’s faster because it’s not faster, right?

Sean (35:48)
Right, and also tying back to something else you mentioned, just the sample sizes are so hard, just because it’s easy to build doesn’t mean that you should be using that sample size on something because you didn’t take the time to figure out if your solution is tied to what customer needs are. You can have lots of conversations, but you just don’t have that much sample size to be testing different solutions in.

Teresa Torres (36:12)
Yeah, this sample size problem is real. Like, anytime somebody asks me, yeah, but that’s not a representative sample, I’m like, you don’t have an audience. Like, your audience isn’t big enough to even be talking about quantitative testing. There’s math here and you’re asking me a mathy question, but I don’t think you understand the math behind the mathy question, right? Like, there’s just not, you have 100 users. Like, we’re not doing anything quantitative. I mean,

Sean (36:15)
Yeah, for sure.

Hey.

Right. And that’s a big part of the reason why I personally lean into consumer, just because it’s more likely to have some volume there where I can actually get feedback from testing different solutions.

Ethan Garr (36:41)
Yeah.

Teresa Torres (36:44)
Yeah.

once you have product market fit. There’s plenty of consumer companies that are still struggling with, do I have enough people? Or they’re an occasional use product, so maybe they have a lot of customers, but they don’t have a lot of customers in a short window of time. And if we’re continuously building, which is now the expectation in our industry, we are continuously building, you need to be continuously deciding what to build. So we need fast feedback loops, not these like,

Sean (36:53)
Exactly, only after product market fit, definitely. Yeah.

Teresa Torres (37:19)
I see teams that are like, no, ours is six weeks. I’m like, well are you just gonna… Yeah, like are you just not gonna build anything for six weeks? Are you gonna build the wrong stuff for six weeks? Like, we can do better than that.

Sean (37:22)
Yeah, we’ve sent it over to research and we’ll have an answer back in a while.

Ethan Garr (37:27)
haha

Sean (37:30)
Right. Yeah.

Ethan Garr (37:33)
Do you find that a lot of what you end up coaching teams on is really how to listen to customers? I mean, the idea of how do you go out and talk to customers is one thing, but I feel like a lot of the challenge I face when I’m coaching teams is how do we continuously ask the right questions in the same organized way so that we can start to spot trends? And it’s a lot about listening. So I’m curious if that comes really into what you’re espousing.

Teresa Torres (37:59)
Yeah, there’s a few things that I hammer on. One is how to ask the right questions. I really focus on teaching teams to collect specific stories about past behavior. And that sounds really simple. We all know to ask, tell me about the last time you did this thing. That is not the work. Asking that question is easy. The problem is, if I say, tell me about the last time you watched Netflix, you’re going to be like, I watched a movie last night after dinner. Cool story, right? Like, that’s not enough.

Sean (38:27)
Mm

Teresa Torres (38:28)
There’s like this art of collecting a story and like helping the participant remember their story and walking through it in enough detail to be useful to you and asking it at the right scope so that you get the full context and you understand what their goals were and not just tell me about the last time you used my product, right? So there’s a ton around that. On the assumption testing side, there’s a lot around research methods and

We see this with surveys. Sean, you probably saw this at Qualaroo. Survey data is only as good as the question you ask. And a lot of people, I mean, I know there’s been decades now of people saying, don’t ask people about their hypothetical future behavior. But what is our most popular customer satisfaction metric on earth? A hypothetical question about future behavior, right? We’re just bad at this. So that’s…

Sean (39:04)
for sure.

Uh-huh. Are you referring to the NPS question? Is that the…?

Teresa Torres (39:27)
I am, I am. When we have perfectly valid, like, we have really good satisfaction measurements that we could have used instead, but that’s a whole different side rant. And here’s the thing, actually hypothetical future questions can be great sentiment metrics. How do you feel about this thing? So like the PMF survey is actually a really great sentiment metric, right? Like,

Sean (39:36)
Yeah, absolutely.

Ethan Garr (39:39)
Hehehehe

Teresa Torres (39:55)
If this thing happened, how do you feel? That’s great. What I don’t like about NPS is how likely are you to do this behavior? We’re terrible at predicting our future behavior. Like we’re terrible at it, right? I can imagine if this thing went away, how do I feel about that? Because I’m feeling that, right? I can’t predict what I’m gonna do tomorrow. Not with good reliability, like I just can’t. And so that’s…

It’s that kind of stuff. So we do a lot around mindset. We do a lot around research methods. On the mindset front, so many people still believe there is one best answer. There is a right answer. And there’s a lot of work to undo there because there’s not a right answer.

Sean (40:41)
How much does curiosity play into that? Are people who are very curious better at asking the right questions or is it really just, yeah, train the right way to ask questions and that’s all that matters?

Teresa Torres (40:54)
I think curiosity matters a lot. I think I have a blog post about being curious. Yeah, there you go. I think it is a really critical trait, but the thing is, I think a lot of curious people may not be curious in their work because of their organizational context. Right? So I could be a naturally curious person, but if I know my CEO is asking me to build this feature and I’m told to test it,

Sean (40:59)
I just wrote about that in my last blog post as well.

Ethan Garr (41:01)
Hehehehe

Teresa Torres (41:23)
and I’m learning and there might be flaws in it, but I don’t want to have to go tell the CEO their idea sucks. I’m not going to be that curious about those flaws. I’m going to look for why it could work and I’m going to focus on that. Right?

Sean (41:34)
A confirmation bias kind of approach. Is it a cultural thing then, that really executives should be looking to shape the culture in a way that fosters the right type of curiosity?

Teresa Torres (41:47)
I think this is what’s so hard, right? I think it’s easy for leaders to think my product teams aren’t curious enough. My product teams don’t have enough intellectual honesty. They don’t have the scientific mindset. And they don’t look at, well, you’re incentivizing them not to do all those things, right? There is a skills gap. Like there is legitimately a skills gap. So where those leaders are right is our product teams don’t know how to do this and we have to close that skills gap.

But that’s not the whole story. And I get hired to close the skills gap. And I have really hard conversations with leaders about like, okay, I’m gonna come in and I’m gonna teach your teams. And they’re gonna get really excited about working this way, because frankly, it’s a lot more fun than being told what to build. But if you don’t close your leadership gap, all that’s gonna happen is 100 % turnover of your team. So are you sure you want me to come in and train your teams? And they think I’m crazy, because it’s like I don’t want the business.

Sean (42:39)
Yeah

Teresa Torres (42:42)
And in a lot of ways, I don’t want the business if they’re not serious because I’m pretty tired of seeing this, right? So Marty Cagan’s new book, Transformed, he talks about at all levels of the organization what has to happen for product teams to start working this way. And I love that he did look at it all the way from the top, all the way to the bottom. And I think people underestimate what’s really required to create the environment to work this way.

Sean (42:44)
Right. If you’re not going to succeed with it, what’s the point?

Ethan Garr (43:13)
So is it, for you, really both a top-down and bottom-up effort with teams, or do you always have to start with the executive team and work down? How do you approach that from a cultural perspective?

Teresa Torres (43:27)
It’s all of the above. So in an ideal world, which none of us live in, we would have a CEO who drank the Kool-Aid, who’s driving the change, and everybody on their executive team, including the sales executive, would be on board. And they would train their teams, and every individual on those teams would be on board, and we would see even change across the organization. It never happens that way, right?

A CEO maybe drank the Kool-Aid, but they’re not individually doing the work of being ready to let go of their pet ideas. They haven’t aligned their executive teams. They’re still sort of infighting around, wait, what do you mean we’re not going to do this feature request from this big revenue customer that asked for it? Maybe they upskill their teams, but during that training, their teams think it’s just the flavor of the week and don’t really commit to it. Right. So even when it’s top down, things can go wrong. I have also seen stories

Ethan Garr (44:22)
Mm

Teresa Torres (44:25)
where an individual product team changed the way they worked. They found a way to do this in their feature factory, weird organizational context, and they inspired change in the organization. They had a bigger impact. Other people got curious about what they were doing. So my takeaway is this simple. Change is really hard. Like it’s really hard. So what I encourage people to do, whether you’re a leader or an individual contributor, start with yourself.

Change your own habits. Let people get curious about what you’re doing. When they’re curious about what you’re doing, that’s when you have the impact to influence.

Sean (45:03)
So you mentioned that there are certain situations that you’d prefer not to go into because your chances of success are pretty low. So I’m curious, and then you took it all the way down to change yourself first. Is there an organization that’s sort of the ideal organization for you to go be able to step in there and

make enough changes that they can really embrace this approach of customer discovery and see real impact on their goals that’s sort of like, yes, this is the one I want to dive into and work on next.

Teresa Torres (45:39)
So, like in Transformed, Marty Cagan outlines three things an organization needs to change. This is a very simplified model, but I think he did a pretty good job with it. He talks about, let’s see if I can get him right. We have to change the way that we deliver. So that’s like the basics of continuous deployment, right? Like, do you have automations in place? Are you doing nightly builds? Does it take two weeks for a release engineer to get a release out the door? Like, are you good in that realm?

20 years ago I worked at a startup that was pushing to get to nightly builds, and I’m a little bit amazed that in 2024 this is not universal, but it’s not, right? Like our engineering practices need to mature. The second thing he says is we have to change the way we decide what to build, and/or that we have to change which problems we choose to solve, and then we have to change the way we decide what to build. And so

Sean (46:18)
Yeah, not even close.

Teresa Torres (46:39)
He puts the deciding what problems to solve at the leadership level. What is messy about his language, and I don’t love it even though I think he got it right, is I don’t think leaders think about it as choosing problems to solve. A lot of leaders dictate solutions. Let’s walk through a real world example. So Spotify recently, a couple of years ago, decided to make a bet on podcasts, right? They didn’t start with an outcome. They didn’t like tell their product teams,

Ethan Garr (46:56)
you

Teresa Torres (47:06)
We’re gonna just drive engagement. They made a bet. They said, we’re gonna put a lot of money in, we wanna win at podcasts. That’s a solution, right? So some people would argue they did it wrong in this empowered model. I don’t think that’s true at all, because if we say we’re making a bet on podcasts, there’s still a billion, if not a trillion, downstream decisions that need to be made about how we’re gonna win at podcasts. So for me, it’s, and like we’re seeing the same argument at Airbnb with

Ethan Garr (47:27)
hehe

Sean (47:30)
Right, okay.

Teresa Torres (47:34)
product managers versus product marketers, and how much control, and founder mode, and how much micromanagement. I actually think this is entirely the wrong conversation. I think leaders are always going to make bets. They’re always going to say, we’re going to go into podcasts. There’s still a trillion decisions that have to happen downstream. And we want those teams empowered to learn how to make those decisions in the best way for the customer and the business.

So then, so I think the better question is not founder mode versus manager mode or empowered versus not or micromanage versus not. It’s about given where we are as an organization, how narrow or wide of a scope do we want to give our teams? And when you have a ton of cash, you can give your teams lots of scope, right? Cause you, it’s, you’re like a VC. You’re having your teams go off and explore and your organization can afford a lot of exploration.

And when cash is not flush and it’s a little tighter, guess what you need to move to? You need to move to exploiting. And what’s going to happen? We’re going to give all our teams a narrower scope. Right? And so I really think about this more as the, who was it? March, the explore versus exploit guy. If you’re not familiar with it, it’s like a classic business school article about managing innovation in an organization: how much you explore versus how much you exploit what’s already working. And I think like,

Sean (48:40)
You

I don’t

Teresa Torres (49:03)
So in order for this to work, leaders have to be conscious about that decision. And there is no organization on earth, including Steve Jobs running Apple. There is no organization on earth where a leader made every single decision, right? Teams do have to make decisions. So continuous discovery to me is how do we help them make better decisions at whatever scope they’re making them at? And it’s up to the leaders to decide like,

Sean (49:19)
Mm

Teresa Torres (49:32)
How much scope do we want to give our teams?

Ethan Garr (49:36)
So I know we’re running short on time. I wanted to ask you, just as you have really led this field, what does your own continuous learning process look like? How do you stay curious and continue to improve yourself?

Teresa Torres (49:50)
Yeah, a lot of ways. For my work with my products, I literally follow the exact same framework I teach. So I interview product people every week. I have a community of product people. It’s just a Slack community. We do monthly calls. So that’s sort of a more continuous, just constantly engaging with people. The purpose of the community is for them to get help putting the habits into practice. So then that’s great feedback on my content and on my curriculum and on my book.

because I see where there are gaps. I am constantly running assumption tests in my community, on social media, with my blog posts, with our course content. Like right now I’m designing a new course, and I actually integrated all of the content into an existing course because our alumni get access to it. So it was just a way to get real live feedback on how the content is performing before I offer it as a separate course. And then I’m just…

I take inspiration from everywhere, so I read a lot. Blogs, books. I listen a little bit. I’m not a huge podcast listener because I’m not a very good audio learner. But I’m starting to… there’s such good podcast content out there, so I’m starting to integrate it. But yeah, I really live by the always-be-learning kind of mantra.

Sean (50:54)
Mm

Ethan Garr (50:59)
hehe

Sean (51:12)
Yeah, you and I have the benefit of both being alumni from Lenny’s podcast. That’s one of my favorite ones for learning. There’s so much good stuff on there.

Teresa Torres (51:18)
Yeah. You know what? I shouldn’t say this publicly, but it’s one of the few that I listen to every episode. I mean, I can say that publicly because Lenny’s great, but there are so many friends I have that have podcasts that I’m like, it’s just… yeah. It takes my whole brain to pay attention to auditory content. It’s like I’ve learned this as an adult. I wish I knew this when I was in school.

Sean (51:26)
Yeah, yeah. My intent is to listen to a lot more, but that’s the must-listen.

Ethan Garr (51:37)
Hehehehe

Teresa Torres (51:45)
Like I really can’t. People will go on a walk and listen to a podcast; I will have no idea what I just heard. Either that or I’ll run into a street sign. It literally takes my whole brain to listen, whereas reading for me is not that way. So podcasts are tough for me. But I started using Readwise Reader, and Readwise Reader will turn anything into a transcript. So I now have started reading podcasts, so that’s great.

Sean (51:52)
Hmm.

Well, I like it. Yeah.

Ethan Garr (52:10)
Yeah.

Sean (52:12)
Yeah, that’s awesome. So that’s how you go about your own learning. Do you have any recommendations for people who want to get stronger in discovery themselves, how they can build those skills?

Teresa Torres (52:26)
Yeah, obviously the book is a good place to start. So if you don’t have it, it’s Continuous Discovery Habits. My tip for people for how to get good at this: pick one habit at a time. Too many people think, I’ve got to completely, radically change the way that I work. And I don’t recommend that. That’s not the way that we change. So what I recommend is look at the set of habits, find one

that you think would be the easiest for you to adopt, given your organizational context, and start there. And then when it’s ingrained in the way that you work, pick your next habit. And then there are a lot of great resources. So I blog at producttalk.org. I think I have almost 300 articles. Most of them are long form. They’re meant to be actionable. We share a lot of stories about how other teams are putting these ideas into practice. And then, product discovery is really trendy right now. So

if my stuff doesn’t resonate, find somebody whose stuff does, right? There are so many people writing in this space. There’s a lot of overlap with the growth community.

Sean (53:35)
What is one of your favorites, of other people that are writing on this?

Teresa Torres (53:40)
Let’s see. Good question. I have so many. So first of all, we already mentioned Lenny. I think every person in the product space should be listening to his podcast. He’s a phenomenal interviewer, a lot of really great content. Pay for his subscription, because his behind-the-paywall stuff is amazing. Yeah, he’s just raising the bar for all of us. I’m kind of mad at him. I’m like, come on, Lenny, why are you this good?

Sean (53:56)
Yeah, I’ve started paying as well.

Ethan Garr (53:58)
You

Sean (54:02)
Yes. My first feedback to Ethan was, man, if we want to really be as successful as Lenny with a podcast, we’ve got a lot of work to do. That guy works really hard.

Ethan Garr (54:04)
Hahaha

Teresa Torres (54:13)
Yeah, totally. He does work really hard. If people aren’t familiar with Marty Cagan’s books, where I think he excels is he is the best missionary this industry has. Like, if you just need a dose of inspiration, any of his books help with that. On the tactical side, choose who you learn from wisely. I feel like with LinkedIn kind of taking over where Twitter left off, there are so many people trying to build a personal brand by just

repackaging other people’s ideas. And there’s a lot of misinterpretation. It’s like what we talked about with ICE. So I encourage everybody to go to primary sources. Even with my own stuff, like in my book, I tried to cite my primary sources, right? Because I don’t want you to just take my word for it. I want you to dig in and come up with your own model. So I think it’s less about who and more about how. If you see an idea that resonates with you, do the work to figure out where it came from, whether it’s evidence-based,

Ethan Garr (54:49)
Hmm.

Teresa Torres (55:12)
and does it work in different organizational contexts? Because I think that’s really how we learn.

Ethan Garr (55:17)
Look no further than Eric Ries and what people think an MVP is, yeah.

Teresa Torres (55:24)
Yeah, I mean that term is now useless, right? Yeah, so I think that it comes back to curiosity. Just because I say something doesn’t mean it’s true. I sold some books, who cares? Right? Take it as an idea and play with it and try it out and see if it works, and learn where the idea came from.

Sean (55:24)
Yeah

Ethan Garr (55:45)
Absolutely. Well, I know we’re a little bit over time here, so we’ll ask you our final question and wrap up. This has been great, Teresa. I really enjoyed it. Is there anything you feel you understand about improving engagement and retention today that maybe you didn’t know a couple of years ago?

Teresa Torres (56:02)
You’ve got to play with scope. So we talked about this a little bit. There’s sort of the optimizing scope, but then there’s also the “are we even in the right lane” scope. And I think maybe in the early days of this, I thought you could focus on one or the other, like, we’re going to go through this optimizing period and then we’re going to go through this exploration period. Whereas what I think works really well is to be continuously doing both. And so you can even think about it as, we’re going to spend a percentage of our time on optimizing

and a percentage of our time on bigger, bolder, more exploratory bets. And I think that you learn from the feedback between the two, which I think is really valuable.

Sean (56:44)
Awesome. Well, so I have some takeaways, but Ethan, do you have any takeaways before I jump into mine?

Ethan Garr (56:49)
Yeah, you know, one thing that just stood out to me, Teresa, is that I think you’ve put a lot of thought and effort into how to do continuous discovery well. But I really appreciate that you look at it as a framework and not dogma. It’s not, there’s one way to do this, you have to fall into this. I feel like there are a lot of things out there, like OKRs, for example, where it’s very much, if you don’t do it this way…

And then I think it makes it really hard for people to be successful with it. So I really appreciate the fact that you’ve given us sort of this, I wouldn’t say general, but this loose framework of pieces to put together, and given us a really good sense of how that all works together. So that was really helpful for me. I love the idea of just picking one habit at a time, because trying to change cultures is super hard. So it feels like when you say, don’t try to

don’t try to change everything all at once, try to get better at one thing, that’s effective. And finally, the last thing, which you’ve hammered home repeatedly and I think it’s really good: the key word in continuous discovery is continuous. Anybody can do project work, and there is value to that, but this idea of continuous discovery is super important. And as I’m coaching teams, I think the biggest challenge I have, and even on my own,

Teresa Torres (57:59)
Definitely.

Ethan Garr (58:13)
when it comes to customer discovery and talking to customers, is just getting that habit, that continuous habit, going. And this really helped with that. So a lot of great stuff here, thank you.

Sean (58:22)
I’ll give my takeaways, and then you can let us know if we were misrepresenting anything that you covered, Teresa, and then we’ll wrap it up. So my takeaways: I also really like that very actionable one-habit-at-a-time idea. But on the other end of the spectrum, it does feel like this is almost an operating system for the whole business. And I’m not sure how you get to that operating system, but a business that operates this way

feels like it would be a super powerful business and that you should aspire to get to that. But you start the journey one habit at a time. Those were my takeaways. So did we misrepresent anything you said there or miss anything that you feel like is also really important to emphasize before we wrap up?

Teresa Torres (59:08)
No, I think those are both spot on. I’ll just say on the dogma front, I’m kind of allergic to dogma. And selfishly, I wanted to write a timeless book, right? So if you write a book about OKRs, that is not gonna be a timeless book. And I think part of it came out of working with a lot of teams. Like I often say, there’s no right tool; there’s a right tool for the right team. I think that’s really true. And when you work with a lot of teams, you see

Ethan Garr (59:23)
hahaha

Teresa Torres (59:37)
the same thing doesn’t work in every context. And so my goal was, what are those underlying guiding principles that help people pick the right tool for them? And that was my goal with the collection of habits. And then I think on the organizational change front, I actually went back and got a master’s in learning and organizational change. And my takeaway from doing that master’s was, nobody should focus on organizational change, it’s impossibly hard.

But there is one book that I read as part of that that really stood out to me and it’s called Managing Transitions. It’s by William Bridges. And what he talks about is when we go through a change, there’s an ending, the thing that we’re leaving behind, and then there’s this messy transition period, and then there’s the new beginning. And so if you’re trying to inspire change, you have to manage those transitions. And when we talk about organizational change,

There’s like this model of like an organization is a thing and it’s changing. I prefer the model of an organization as a group of people and change happens when enough people in the organization change that everybody else is forced to come along. And so if I then combine those two things, okay, at the individual level, I have to go through this messy transition where I’m going to wander in what he calls a neutral zone until I figure out my new beginning, right? And enough people have to go through that messy change.

that the organization is pulled in that direction. Okay, if I’m a leader, I don’t force people through those transitions. I support people as they go through those transitions and they go through them on their own time. As an individual contributor, I have almost zero impact on whether people go through that transition. So this idea of like, just focus on your own habits is just pragmatic. Like, I don’t know what else you can do.

Sean (1:01:24)
Yeah, well, that definitely sounds like a book that I need to read, so that’s on my list. Well, I feel like we could keep talking on this topic for a long time, but what we covered is amazing. So thank you so much for sharing your insights. And for everyone listening, thanks for tuning in.

Teresa Torres (1:01:29)
It’s really good. Yeah.

Ethan Garr (1:01:29)
hehe

Teresa Torres (1:01:44)
Thanks for having me.

Ethan Garr (1:01:44)
Thank you.