Value-First AI Daily - Feb 4, 2026
Recording from live stream on 2/4/2026
Generated via AI Transcription (Gemini) • 90% confidence
[00:01] **Introduction** Chris Carolan: Good morning, LinkedIn friends, Value-First Nation. Welcome to another episode of Value-First AI Daily, your collaborative intelligence report. It is Wednesday, February 4th, 2026. Chris Carolan: Uh, George, I need to take us back in time, like eons ago, like 18 months. Chris Carolan: Do you remember we were at Inbound 2024, and you were getting ready for your talk, but you were listening to some stuff in the days before, and you had this great idea to record, I think it was Andy's product talk.
[01:11] **Inbound 2024** George B. Thomas: It was Andy, Andy Pitre, the Spotlight, um, Product Spotlight, yeah. Chris Carolan: Yeah. And your talk was going to be about how to make content with AI, among other things. Chris Carolan: And you had the bright idea, and it felt so awesome, that you could take that transcript, send it to somebody, do some stuff with it, and within a couple of days, you had an article and you had to update. George B. Thomas: A couple of hours. Chris Carolan: You had to update, uh, but then you updated your presentation. George B. Thomas: Yeah. Chris Carolan: Yeah. George B. Thomas: The timeline is a little bit shrunk. Like, it was one day: recorded a transcript, created an actual draft with my GPT, sent the draft to Liz just to kind of give it a once-over. And by the next day, I had the article posted online, and it was part of my presentation deck about real-time content. Chris Carolan: Yes. George B. Thomas: Yep. Chris Carolan: Um, I was reminded of that last night, as I'm gonna share this example, as I continue to just have a blast, uh, but from unexpected places. Like, none of this is planned. I get this notification. Um, and honestly, I just might have a problem right now, because this is like, all right, time to have dinner. Chris Carolan: Right? I'm going to step away from the desk. I think I just told Ryan Ginsberg, I was like, man, I've been going hard, like, need a break. Chris Carolan: Get to the desk, or get to the table, and see a notification from Perplexity on my phone with this article. Chris Carolan: Um, and of course, not surprising that this is happening, but my response was just, I mean, it makes sense, right? Nico Lafakis: Yeah. Chris Carolan: And Perplexity gives me an answer. Um, and this is me just seeing how the conversation can start from an article and just, like, see what he says.
Nico Lafakis: Yeah. Chris Carolan: Um, and of course, he keeps it in perspective: markets may be overreacting, like they tend to do. So I ask, where do you see HubSpot in all of this? Especially when I view HubSpot in this light, and I give him a link to the Customer Value Platform page on the Value-First Team website. Right. Chris Carolan: That talks about the value path and views, and using all the objects, like, HubSpot in a different light. He says, well, viewed as a Customer Value Platform, HubSpot is structurally set up to be one of the winners of this shift rather than a pure victim of it. He gives his reasons why. And then: if you want to go one level deeper, we can sketch a HubSpot-as-Agent-OS architecture on top of, blah, blah, blah, blah. Nico Lafakis: Oh, interesting. Chris Carolan: I'm like, uh, yeah, let's go. So then, over the course of the next couple hours. George B. Thomas: There went dinner. There went dinner, folks. Chris Carolan: Uh, well, finished dinner. Um, but after I did this, so I had this conversation with Perplexity. I think I got to a prompt where I was going to give it to Claude Code, but Perplexity doesn't have a lot of context on, you know, the rest of the business. Claude has all that context. So we take the prompt over to Claude, make it a little bit better. Chris Carolan: Bring it into Claude Code, make it a little bit better. And we get to this page. Um, and mind you, like, I'm in the process of moving right now, so I'm doing some packing. You know, so it happens over three hours, but it's not like me sitting down, typing on the keyboard, for three hours, right? And, um, at the very least, it's so fun. Um, but to be able to go from just an article to, you know, graphic content, a webpage with interactive React elements.
Chris Carolan: Um, and that's the closest thing I can think of, kind of that moment that we first started, or at least I first started realizing this, at Inbound with you. Yeah. Chris Carolan: And now, and that was just, like, an article, and, like, you still had to update your slides, stuff like that. George B. Thomas: Oh, yeah. Chris Carolan: Now it's just like, that was back in the stone ages of AI. Right. Chris Carolan: And then, just like, I wake up this morning and see this post from Dharmesh, and I don't know if anything's, like, serendipitous at this point, but I'm sure they had to be, like, waiting for this moment. Um, but he announced the Agentic Customer Platform that HubSpot is positioning. Chris Carolan: So, well, I mean, I wasn't so surprised that Perplexity responded the way it did, because of the way HubSpot has been messaging, like, in a way that's different from everybody else. But then to see this right next to it, um, it just makes sense, I think. Uh, I still have trouble with the word agentic. Um, but luckily, Customer Value Platform still holds up, because it's about the value, whether it's coming from agents or humans. Um, but yeah, just, like, this is the way that that started, right? Chris Carolan: Like, human creativity is just going to be unleashed in a way where I didn't have any of that capability. I wasn't planning to build that page. I wasn't planning to build those modules. It just took the spark of that article, and then being able to have a conversation, and then continuing to spark ideas, and then just, bam, right? Done. Chris Carolan: Like, that is part of the reason for the story itself, where, because we built the AI-native stack, I didn't use any software. Like, I didn't use any software licenses outside of, uh, Claude Code. I'm trying real hard to think if there's anything else right now.
I guess Vercel is hosting. Nico Lafakis: Yeah. Nico Lafakis: I mean, well. George B. Thomas: Go ahead, Nico. Nico Lafakis: No, just, uh, my tech stack has, you know, I mean, we even have a show on Sundays about this, but, like, my tech stack has changed entirely, to where if it's not in a CLI, for me, it's weird. Like, if I have to operate in somebody's software, it's annoying to me. Like, I would rather just talk and move the data, talk and manipulate the data, you know, do what I have to, like, all via just working with the agent, and not ever, like, actually doing it myself, you know what I mean? Nico Lafakis: Because it's like, I imagine, so, the equivalent, right? Okay, so this is how I see these conversations. I think that maybe, maybe, this might be a bad metaphor, hopefully it's not. Um, when it comes to human-AI collaboration, I kind of see it like human-robotic construction, right? So, like, once that comes about, right, and we have robots that are accepted in the construction field, and they start helping out with construction.
[31:31] **Construction Metaphor** Nico Lafakis: I see that, like, human-AI interaction is like, yeah, it's just a human and a robot, and they're both hammering nails right next to each other, right? Um, and I see the way that I interface is much more like a foreman, just directing the robots to go build whatever it is, right? Like, I don't have humans on site, because to me, it's getting in the way, right? Like, every one represents a blockade, where the decision-making becomes blurry. Nico Lafakis: Like, you don't necessarily have control over what that output is going to be, right? Where, with my robot, I know exactly what the output is going to be. And even if it's off by a degree, I'm only going to have to correct it the one time, right? Um, assuming that it doesn't come, like, pre-contextual with stuff, right? So yeah, I understand the whole SaaS-pocalypse thing. I understand the whole market-taking-a-dive thing. Nico Lafakis: We've been talking about it for months now, so it's weird to me that, like, ah, we just figured out today that you can code practically any platform you need to. Um, plus the fact that, like, the tools are rolling out to legal, they're rolling out to medical, right? So we've been talking about the fact that they're going to begin to, like, erode larger platform companies, you know, from the top down, and also raise people from the bottom up. Nico Lafakis: So, you know, I guess there's one point where I disagree with Perplexity, in terms of, you know, how stable larger companies are. I mean, we're already seeing it, right? Like, the market reacting to this is pretty much a sign of them understanding, like, hey, big-box software companies are in trouble. Right? Because now there's going to be more options. So I agree, like, context is super important. But to your point, and even to Dharmesh's point, your context is important.
Nico Lafakis: It's no longer necessarily important to take what you have and put it somewhere else. It's way easier to just inject an agent and tell it, hey, here's where all the stuff is, just thought you should know, right? And then you never, like, interface with all the individual stuff, you just talk to the one agent, right? Chris Carolan: Right. Chris Carolan: There's got to be a place for all the stuff. And that's what the difference is. Like, since Customer Value Platform is using all of the objects, and having all of your data, and unifying everything there, um, it's much different. Uh, so one of the lines here is: if HubSpot is configured as records to process, then agents that do tasks cheaper are a direct threat. Yeah. Chris Carolan: If it's configured as relationships to grow and value to deliver, HubSpot becomes a coordination layer where human teams and AI agents collaborate around the customer story. Nico Lafakis: Disagree. Chris Carolan: Where is the data coming from, then, Nico? Nico Lafakis: The data is in the CRM. It doesn't mean that you have to work in the CRM in order to actually deliver that value. We're all collaborating, just like, you're a foreman, foreman collaboration, whatever you want to label it, right? Nico Lafakis: But the collaboration is, again, instead of me being next to the robot, I'm a step back now. I'm a foreman. The whole point of agents, and we've talked about this before, is one person becomes four. The person who is an individual contributor becomes a manager. Everybody takes a step back from that ground-level role. So to me, software platforms represent the ground level, where I was swinging the hammer. Nico Lafakis: But me telling the agent to go swing the hammer is me being a foreman, or, like, I'm just an orchestrator now. I'm just a manager. I just direct, right? So, like, I understand it. I really do. I understand it.
And look, from a HubSpot user perspective, the data agent is, like, the best thing that's ever happened, right? And, like, Assistant is beyond the best thing that has ever happened at this point, right? Nico Lafakis: If and when, uh, because it's been in private beta, and it's been in private beta for months, so I'm not telling you guys anything new. If and when the agent marketplace launches, okay, now we're talking a very, very different story, where, again, I think you're going to find users working out of just this, like, home space in HubSpot, as opposed to, like, going all over the CRM, which I think is the way it should be, right? Nico Lafakis: Like, you should log into HubSpot and just be interfacing with, you know, Breeze, right? Chris Carolan: Yeah. Chris Carolan: No, I just think, uh, you know, the coordination layer doesn't mean humans are working in it. And let me just share this example real quick, and then we'll get to George. Um, and, like, this is also a meta example of what we're talking about: how all of this stuff was developed by AI, and now humans at the end of the line are communicating to other humans the insights gained, right? Chris Carolan: So this is the end-to-end flow, right? Event detected: usage drops. No human spent time on that. Five seconds later, the data layer is capturing stuff: product usage, NPS survey. Thirty seconds later, the Customer Value Model is interpreting it. One minute later, an agent analyzes and acts on what needs to happen. Two minutes later, a workflow is coordinating, um, you know, meeting link ready, timeline logged, email draft prepared. Chris Carolan: Then a human comes in and decides what to do next, including a personal call and scheduling with an exec. And then, two weeks later, relationship saved.
Now, this is just an example, but that's where, like, everything at the front end of the process, where the data is and where the architecture is, like, humans have never been good at that part, right? Like, I mean, HubSpot has had some come-to-Jesus moments themselves over the past couple years, realizing, hey, people don't want to come into the CRM. Chris Carolan: Like, they want to be talking to humans, they want to be, you know, interacting outside in the real world. Um, so now that AI is here, it's forcing everybody to get on board with that. Um, so, George, you've had 20 minutes to soak this in. Uh. George B. Thomas: In a way, yeah. George B. Thomas: Um, I don't know. Um, I mean, I'm listening to you guys talk about this, and my brain has been going in a couple places. Um, some that I agree with, some that I don't agree with. Um, one, I'll back up to kind of the beginning of this conversation. I find it very interesting, because I'm finding myself doing the same thing: your conversation with Perplexity started being question-based. Well, what if? And how about? George B. Thomas: And, right? And so I think there's this interesting idea of being smart enough to ask good questions when you're working with your AI assistant. Um, the understanding that we might not know it all, but we can learn it all, and we can make decisions based on the opportunities that are presented to us: we could do this, or we could do that, or we could go in this direction, or, hey, I could do this. Yes, my questions got us to that point. That's amazing. Let's move forward on this. Um, so I think questions are interesting. George B. Thomas: Um, honestly, for a lot of this conversation, I've been standing here, listening, and trying to figure out if I'm getting the ick or not. Meaning, the amount of what feels like a non-human conversation.
Um, and the fact that there are so many humans out there that don't know diddly-squat about what we're talking about, and will go to work today, and they will just do their job like they've always done their job. George B. Thomas: Um, and that will continue to happen for years, in many organizations, in many industries. And, you know, what I want us to be careful of is this potential to get into what could feel like a doomsday scenario. Um, look, the market has always been volatile, even when we're made to believe that it's not. And look, we live in a world where people are trying to get people to look at things and read things, because it's an attention economy. Um, headlines are headlines for a reason. George B. Thomas: Um, things go viral for a reason: because we're humans, and we act like freaking humans when we see this stuff. This conversation, for me, has been a little bit of a struggle on, like, what I would want somebody to take away from this that is positive in nature. Make me want to move forward. Um, peace. And so I go back to my first statement I made when you said, George, you've been listening for 20 minutes. I don't know. George B. Thomas: I don't know. Nico Lafakis: I think it's something to think about. George B. Thomas: Yeah, I like. Chris Carolan: I mean, that's the key. We need you to be thinking about it. Like, and that's where some people require these kinds of headlines, which is why people use these kinds of headlines: to pay attention and wake up. George B. Thomas: But, but I don't. Chris Carolan: I mean, that's why I went there. George B. Thomas: But I don't know if you should wake up because of being fearful. Maybe that's what I'm struggling with. Like, that post, which, by the way, you don't have to bring it back up, but it's even in our Slack channel. Like, um, you guys even referenced their faces, right? The faces in the post.
Um, I find no great joy in the faces. George B. Thomas: Um, I find no great joy in fear. I find no great joy in it being painted that way. Like, the positive is: you can build anything that you want to build. The positive is that you might not be handcuffed or burdened by a big piece of software anymore. Great. That's the positive. Unless you still want to have your organization in one of those pieces of SaaS software, and then it becomes: which one is the best one, which one is moving forward? Which, again, goes to, Chris, kind of what you were talking about: at least HubSpot's trying to keep up. At least HubSpot's moving in a direction, like they always have, by the way, trying to move in a direction of, like, the puck's headed that way, so let's skate towards the puck, versus, like, chasing it, okay? George B. Thomas: So it's like, I don't know, there's just a lot going on in today's conversation, and I don't think it's surface-level. I think it could be a way deeper conversation. I also feel like that might even be, like, a different podcast, with the therapy couch and all sorts of stuff, for a lot of humans that would be trying to, like, go through this. Chris Carolan: Yeah. Chris Carolan: I think that's fair, but we know humans react for a couple different base reasons, and one of them is fear. And if this is what it takes for certain leaders to wake up and say, um, like, okay, it's time to go, for the rest of my organization, or else we're not going to move. I mean, that's why I think it's important for us. And exactly where I took that conversation, it was like, all right, we could just, you know, talk about how Salesforce is also down 50% on the year, and, like, all the signs are pointing to this, but it does represent opportunity for anybody that can see through this. Chris Carolan: Oh, wait. Like, oh, I was holding onto that software because that's just what I do every day, and now I might not be able to.
Like, it's not going to be there. So now, can we help them think about what's possible? Like, what can they give themselves permission to do? Like, to be creators. George B. Thomas: I don't disagree with that at all. Like, listen, it's funny that we're having this conversation, and it's funny that I feel like I'm probably pushing back against the conversation you're having. And the reason I'm saying it's funny is because I canceled my Aoma yesterday, because I built a system where I automatically download my Zoom meetings, get my transcripts from another SaaS software that I was already paying for, and have Echo analyze it. So I understand what you're saying, and I know the positive side of it. There's just, like, I can't get behind, and maybe this is just because of who I am, I've never been a big fan of, like, fear-based activation. George B. Thomas: I've never been a big fan of clickbait titles. Um, and maybe that's why I'm struggling today, because on one side, I'm doing and believing parts of what we're talking about, and then part of me is just going, I can't get the freak out of here fast enough today. Nico Lafakis: That's fair. Nico Lafakis: Well, I mean, I sit back and just think about, like, a few months back, when, so I guess I'll just ask this question, because I think it's an easier place to start. Um, wouldn't you rather just walk in the room and just be working with Echo? George B. Thomas: No. No. Um, I like humans. Um. Nico Lafakis: I know, but that's not what I mean. George B. Thomas: Okay. Nico Lafakis: What I mean is, like, instead of interfacing with software directly, wouldn't you rather just walk in and just interface with Echo? George B. Thomas: My honest opinion is maybe, but my knee-jerk response in my brain when you ask that question is, I feel like that might get boring. Like, in my life, I don't mind jumping into Photoshop, and I don't mind jumping in to do some stuff in HubSpot.
I don't mind going to these different places, because it mixes it up. Like, but don't get me wrong, I love sitting in front of Echo and having conversations and building stuff and getting it to do all the things that it can do. But there are some things that I still enjoy doing as a human. Nico Lafakis: Yeah, for sure. George B. Thomas: And so maybe I don't want to hand some things off to Echo, right? And so, like, a happy mix of maybe and no in there. And again, this is internal: you guys are touching on something internally that I don't think I've unpacked, or understand how I would rather navigate it than the way that the world is trying to get us to navigate it right now, and I don't like it. And I also know that, for me, there are parts of my life and existence and team that are struggles, but I still love what I learn through the struggles of how it is. George B. Thomas: Clip that and freaking put it on the internet. I don't know. Chris Carolan: Uh, tomorrow, when I have my new setup. I got it all the way to transcribing today. Um, but I had to come back home, because I'm not fully moved. But yeah, maybe in the coming days, I'll just be able to clip that, and it'll be ready. Chris Carolan: Um, yeah, no, I'm with you. It's always better to start from a place of love than a place of fear, in terms of, like, the outcomes of what's going to happen. Like, even if you get to the value, if you started from a place of fear, it's usually going to take longer, and maybe be a little less value. Um, but at least as I work with clients across a spectrum, like, it's very easy to see there's conversations where AI doesn't come up at all. Right? Chris Carolan: And it's just, like, a great reminder that, you know, there's a huge spectrum in business right now.
And, like, the fact is, they don't need to pay attention to AI to do their job, to serve their customer. Like, nobody's asking them to do it, so it's understandable. Um, but I'm grateful to be in a position where we can try and, you know, take this narrative, take this moment, and turn it into a more valuable conversation, I think. Chris Carolan: Um, but the key, I think, is that it's impossible to do that if you're sleeping on this stuff. Uh, and I wouldn't put it on any of the individuals of the organization. But if this is how some leaders need to be woken up, then so be it. We need them to move way faster than they are, in most cases. So, you know, this will help some of them do that. Um, and we'll be here in the coming days, weeks, months to help you navigate, on Value-First AI Daily. Thanks so much, fellas. We'll see you tomorrow. Nico Lafakis: See you guys. George B. Thomas: Peace out.