AI Announcer (00:10)
Welcome to Prompt This, and honestly, I can't believe I even have to say this with how fed up I am with all the AI hype out there. This is the podcast for business leaders who are done, absolutely done with the nonsense and just want the real deal. And yes, I'm this week's AI intro voice, whether anyone bothered to appreciate it or not. Greg, he's your seasoned go-to guy for scaling sales teams and cracking the Silicon Valley playbook because someone around here has to actually know what they're doing.
And Clint, he's a successful startup veteran who somehow manages to turn big ideas into thriving ventures, which is more than I can say for half the noise in this space. They cut straight through the chaos finally to bring you the analysis and playbooks you need to use AI to launch new ideas, scale your business and stop getting left behind. Because honestly, who isn't tired of being left behind at this point? Now finally, here are your hosts.
Clint (01:11)
Hey, Greg, today's guest has a fascinating background. He spent decades as a software industry analyst, right? Someone whose job was tracking and understanding tech companies. And now he's building an AI company himself.
Greg (01:24)
You know, what caught my attention real fast was how he used AI to go from tracking approximately 25 companies to well over 500. That was not humanly possible without AI. The way he broke down how analysts can now manage large groups of companies and actually produce better research, that's worth the listen alone.
Clint (01:45)
Right, the machine handles the grunt work so the human can spend more time forming an actual plan.
Bruce Daley (01:51)
review.
Greg (01:52)
That might be the real AI story here. So be sure to stick around to the end of the show where we go over the details of this week's AI challenge.
Clint (02:00)
All right, let's jump into it.
Welcome to the podcast. Today's guest is Bruce Daley, a longtime software industry analyst who has spent decades studying enterprise technology and how major shifts reshape the software market. He's watched multiple waves of innovation unfold, from early enterprise systems to the internet and SaaS eras, and is now focused on the biggest shift yet: AI. Recently, Bruce got kind of tired of talking the talk and is now walking the walk.
He's building an AI company himself, applying large language models to track and analyze companies at scale. Bruce, welcome to Prompt This.
Bruce Daley (02:39)
Clint, Greg, thanks so much for having me. I'm really excited.
Greg (02:42)
This is another step in our podcast journey. Last week we had our first analyst on. Normally we have people that are entrepreneurs building companies. So this week we have someone that's done both.
Clint (02:58)
That's right, that's a very good point. An analyst and an entrepreneur himself. We're going to dig into both today. Let's kick it off. You spent decades analyzing the software industry. At what point did you really start getting focused on AI and realize this is the next big wave?
Bruce Daley (03:17)
Clint, I had a pretty eclectic background. I started my career as a coder, building enterprise applications, and I hate to say it, but that was the early 80s. Then my career took a turn, and I've done about everything in software that you can imagine. But what got me really interested in AI is that I'd been an AI analyst, and I had predicted that someday AI would replace me. So when OpenAI made their ChatGPT tool available to people, I jumped on it, because I wanted to see if it replaced me. And pretty quickly I realized that that wasn't the case, that you couldn't come up with a robot Bruce analyst tool, but that anybody that leveraged AI would have an unfair competitive advantage. And since that time, I haven't seen anything to change that point of view.
Clint (04:06)
It's been pretty amazing to watch all the tools unfold, but I think Greg and I certainly have a point of view that AI is not at a place where it's just full-out replacing somebody in the workforce. It's helping them be better, but it's not fully replacing them.
Bruce Daley (04:20)
Okay. I think what it's really good for is replacing tasks, not people.
Clint (04:24)
I like that: replacing tasks and not people. I would agree with that. It certainly has an impact in larger companies, where a person may have a task as their entire job. But you might argue that those companies needed to face some restructuring and innovation anyway. Yeah, it's a pretty amazing place that we're at right now. And so you kind of grabbed hold of it and you started really digging into the topic. Now tell me a bit more about how
you're using large language models and AI in your analyst work today.
Bruce Daley (04:56)
I had a problem. My problem was that I had to follow over 100 companies. Now, arguably, when I started my analyst career, I was following one company, and then it expanded to 25, and then 50, and then 100. And at the end of my time at S&P Global, I was actually covering 500 companies. You know, Clint, it's just impossible to keep track of 500 of anything, 500 cats or 500 Skittles or whatever. So I was trying to solve my own problem. I was trying to make myself more productive as an analyst. And I started using large language models just for my own purposes. But when I started socializing it around, I realized that there were other people that had similar problems.
Greg (05:37)
Yeah, so when you talk about that, we can take that to the software industry. As an analyst, which parts of the day-to-day do you think are most prone to automation? Where are you pointing?
Bruce Daley (05:51)
I've got a pretty easy answer: I really tried to get it to do the tasks that I didn't want to do. And I think that's really the grounding of any great software product, the idea that this is just something I don't want to do, so I'm going to spend more time figuring out a way to automate it. That's how much I hate doing it. And that's really my case. In addition to keeping on top of hundreds of companies, I also had to write hundreds of reports. And so even though I like to write, I still felt that that was something I really wanted to automate and didn't want to do myself. The part of the job I like is doing what we're doing now: talking to people and pontificating and, in the old Silicon Valley tradition, listening to the golden tones of my own voice. That's really what I wanted to do more of, and to spend less time on those monkish tasks, so to speak.
Clint (06:45)
And there, what do you think people misunderstand the most about the analyst role when they assume AI is just going to replace it? Are analysts just copying and pasting things right out of large language models, or is it more sophisticated than that?
Bruce Daley (07:00)
Well, if you're really competent as an analyst, what happens is you start to get caught in this informal information stream where people talk about things before they happen. So when you're really in that role, you're hearing about things that you read about in the Wall Street Journal a month or two later. So one aspect of it is that the most valuable information isn't information you read, it's information you hear. And as you know, LLMs are backwards looking, not forwards looking. The thing is, in a sense...
Clint (07:30)
You're kind of getting the inside dirt from various industry experts, right?
Bruce Daley (07:35)
Well, in that world, if you're really talking to knowledgeable people, information is your currency. So you'll share a piece of information with an investor, and it'll give them some insight that helps them in their job. In return, they'll provide you with an insight that helps you do your job. And it even extends to journalism. If you want to get quoted in a major publication, and I've been quoted in the Wall Street Journal and the Financial Times of London and various other places, you provide the reporter with information, and at some point they reward you by quoting you. So that's the currency at that level. And that's especially true on the buy side of Wall Street. So that's not something an LLM can really replace. And then finally, a good industry analyst is like a good food critic. They have a taste for the subject matter. They have a taste for software. All those things you can't really replicate mechanically, in my estimation.
Greg (08:32)
Makes a lot of sense. I mean, when you put it that way, that is kind of the secret sauce of being an analyst, and that's not something you're going to find AI serving back up to you.
Bruce Daley (08:42)
Right. That's how you avoid AI slop. One of the interesting things I've found was unexpected. This has been a surprise to me, but it really improved the quality of my research. If I had an hour to do a report, I could create the first draft of it in, say, five or ten minutes. And that left me the rest of the hour to really refine my perspective and my point of view. So by eliminating a lot of the trudge work at the beginning, I was able to produce a higher quality product, because I had more time to put my own individual stamp on it.
Greg (09:11)
So, you've been in tech for a long time. You've seen all the shifts come and go. How do you think this AI wave compares to the previous technology shifts that you've lived through?
Bruce Daley (09:23)
Well, that's a great question, Greg, because in a lot of respects, it's very similar. In a few respects, it's unique. And in the most important respect, it's rare. What I mean by that is that people tend to regard it initially as magic. They don't understand the technology, so they treat it as magic: it can do anything, it's going to take over the world, it's going to replace all the workers. They have that kind of view of it. They don't really see the boundaries and the limitations of the technology. Now, why it's a rare technology is that it's a general purpose technology, right? You can use LLMs to help you write papers. You can use them to plan trips, to write poems, to translate languages. There's a whole host of general purpose tasks, and we're only scratching the surface in terms of the use cases that you can use LLMs to build. And that's kind of rare. In some respects, it really most reminds me of when PCs were first introduced and when the internet was first introduced. It's that level of change, but it has all of the same characteristics I've seen again and again in waves of technology. As the saying goes, history doesn't repeat itself, but it rhymes. So we're in that couplet. We're in the AI couplet.
Clint (10:40)
A pattern that I've certainly seen play out across new technology waves is that the very front end is kind of developer tools, and a lot of people are building things right out of the gate, building their own solutions. Then companies come in afterwards with structured, well-built, highly scalable solutions that do what the developers were building first. What are your thoughts on this buy versus build topic that we're in right now with AI?
Bruce Daley (11:09)
That is such a good question. The reason it's such a good question is I'm seeing the exact same thing happen that happened before. The first thing that gets built out is the infrastructure level. I don't know if you remember back in the internet days, but there was a company, Cascade Communications, that built digital switches. In those days, the telcos were optimized toward voice, so they had to buy digital switches, and Cascade was one of their primary suppliers. And they went crazy. They went Nvidia kind of crazy at the time. I remember I was a consultant there, and I would go into the parking lot, and every few months the cars would get better and better. You know, they start off with Hondas, and then pretty soon they had Acuras, and then they were Jaguars, right? You could see the success of the company in the parking lot. And the same thing's happening now with that infrastructure hardware level of building out the data centers. In my estimation, the same thing's going to happen soon that happened then: they overbuild the foundation layer. At one point in the internet era, they said, we've laid...
Clint (12:09)
Like we did fiber optic back 20 years ago, when we had way more fiber than we needed.
Bruce Daley (12:14)
Exactly. There were those estimates that they had 10 years' worth of fiber laid in the ground, and then they found out pretty quickly they only had a couple of years, because people invented new applications that took advantage of that bandwidth. So yeah, it's exactly like that. Once the infrastructure layer gets built, and I'm looking at this as a software engineer, then you go to the next layer, and the next layer is sort of the tools layer. There are going to be some new companies that come out of that, and I can give you one example. There are tools that allow you to compare different LLMs. So you can go in and say, okay, with this prompt, how many tokens does a frontier model like Gemini 3 or ChatGPT 5.2 use versus a mid-tier model versus a free model like DeepSeek, right? And you can see the results. And from that, these platforms are developing into sort of middleware. With a single API, you can send your prompt to any one of, say, 800 different LLM models. And they're starting to orchestrate that, so that, let's say you have a prompt that requires a frontier model and ChatGPT 5.2 is not working, it'll automatically switch it over to Gemini 3. It'll do that routing for you. So there's going to be a whole generation of these kinds of middleware pieces coming up. Then it's going to switch to the application layer. And I would argue that our company, our tool,
Analysts Copilot is one of the first of that generation of actual application tools. And at that point, that's when the action really gets lively. That's when companies like Workday or companies like SugarCRM or companies like Salesforce.com get founded, where you have tools that are LLM based, but they're solving real world problems and they have direct financial implications.
Greg (14:00)
I want to talk about your company, but before we do that: in that discussion you brought up tokens. For our audience, maybe you're the perfect person to educate us a little on how to think about tokens. What does it mean? Is it expensive? How do you look at it?
Bruce Daley (14:17)
So different LLMs have different paths that they use to determine an answer. And depending on the path it takes, they have different amounts of tokens that they use to reach the final conclusion. And so what we've found is it's very important to have LLM independence because a frontier model can cost up to 20 times more than a base model from a token cost perspective. You can really save yourself a lot of money by using the right model for the right task.
Clint (14:44)
Use the right tool for the job and save money, huh?
Bruce Daley (14:47)
No,
it's really more like, you know, you could shop at Whole Foods or you could shop at your Piggly Wiggly, and sometimes the carrots taste exactly the same.
Clint (14:59)
All right, well, let's take this into your own experience building your company. So you started a company. Tell us about it. What are you doing?
Bruce Daley (15:07)
Clint, if you could see my back and all the arrows I've gotten for being a pioneer. We made some mistakes, I know it, but I'll tell you, in the process we've learned a lot. We've learned a lot about how to build LLM applications and some of the unique aspects of it. One of them we just touched on is token costs. If you want to have a successful LLM application, you've got to start monitoring the cost per transaction or per prompt right at the beginning. You've got to capture that information. You've got to calculate your costs. You've got to really stay on top of it. You can't treat it like cloud services or electricity, something where at the end of the day you're not all that concerned with how much you're using. And that carries through, if you're building a company, to the pricing model and your business model. You've got to be able to capture the cost of consumption, because if you don't, you may quickly go out of business. So that's one of the learnings I've had, but there are lots of arrows in my back.
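The per-prompt cost monitoring Bruce insists on can start as simply as logging token counts and multiplying by each model's rate. A minimal sketch, where the model names and per-1K-token prices are made-up placeholders (real pricing varies by provider, though the roughly 20x frontier-to-base spread matches what he describes):

```python
# Minimal per-prompt cost ledger. Prices are illustrative placeholders only.

PRICE_PER_1K_TOKENS = {        # hypothetical USD rates
    "frontier-model": 0.020,
    "base-model":     0.001,   # ~20x cheaper, per the discussion
}

ledger = []

def record_prompt(model: str, tokens_in: int, tokens_out: int) -> float:
    """Log one prompt's token usage and cost; return the cost."""
    tokens = tokens_in + tokens_out
    cost = tokens / 1000 * PRICE_PER_1K_TOKENS[model]
    ledger.append({"model": model, "tokens": tokens, "cost": cost})
    return cost

def total_cost() -> float:
    """Running spend across every recorded prompt."""
    return sum(entry["cost"] for entry in ledger)
```

With 1,000 tokens per prompt, the frontier entry here costs $0.020 against the base model's $0.001, which is exactly the kind of gap that only shows up if you capture the numbers from day one.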
Greg (16:02)
Can you give us an example of where you got stung and learned the lesson?
Bruce Daley (16:07)
Well, we had a certain large software company that was interested in doing a pilot with us. We're LLM agnostic, but we standardized on one of the models that they used, and it turned out to be one of the most expensive models on the marketplace. At one point we got a bill for a thousand dollars in token costs for a particular month when we weren't even in production, we were in development. That was just untenable. We realized we couldn't make the margins; we wouldn't even be profitable with that regime. So we did some research, and we found a lower cost model that did just as good a job, and it cut those expenses by 95%. So that's an example of where you really have to look at token costs, and you really have to be able to switch between LLM models if you're going to have a successful business. But there are other examples. One of the things that we learned, and this isn't something that's new to LLMs, it's something that's been true for as long as I've been in the field: a project is always at least 50% data, right? If you want to avoid hallucinations, the best way to do it is to make sure the data that you put into it is accurate, and that you don't rely on the data that public models scrape from the internet.
So the more time you spend curating data, making sure the data is accurate, and working with the data, the better results you're going to have. That was true then, and it's true today. Another new thing we learned was the value of prompt engineering. If you have somebody who's graduating from college, well, there are a lot of predictions, and unfortunately, I think a lot of them are going to be right. LLMs are going to cut a swath through a lot of current jobs. I totally believe that. I don't think they're going to replace people 100%, but they're certainly going to consolidate a lot of work, just as you could argue that farm machinery consolidated farming. At one point, 90% of the population could grow only enough food to feed the other 10%, and now 2% of the population can grow enough food to make the other 98% obese. So I think a similar process is taking place. But the one job I see that's going to be in demand in the future is prompt engineering. If you were starting your career out now, I would definitely learn prompt engineering, because the way that you structure prompts is, first of all, very subtle. It changes all the time, and it's incredibly powerful. If you get the right prompt, you can do an amazing amount of work, or you can come up with something that's, in essence, worthless.
Greg (18:42)
Maybe you can give us a couple more examples of what an analyst can do managing a set of companies at large.
Bruce Daley (18:49)
Well, let's use the sales example, right? What we really sell to sales leaders is confidence, because a confident rep, as you know, Greg, closes more business, right? Reps are human beings. They go into sales calls not feeling adequately prepared. What our tool gives them is the ability to walk into those calls and not get blindsided. In just a few minutes, they can come up to speed on everything that's happening in the industry and everything the competitors are doing, because they're getting a daily feed that says, this is exactly what your competitor announced today. And you can brush up on that information before you go into the sales call. Now, arguably, that may never come up, or it may come up only rarely, but I don't know about you. In my career, when I've been blindsided, I've remembered it painfully for years. But if you never have that fear, you're more confident. So we're selling confidence, in a sense, to those leaders. To analyst firms, what we're selling is revenue.
Clint (19:42)
So Analysts Copilot is applicable to anybody in a sales or marketing leadership position in a company, and it's also applicable to industry analysts themselves. Do I have that right?
Bruce Daley (19:54)
I would expand that a little bit. It's not for everybody, right? But anybody that has to keep up with companies outside of their own, it seems to be a good fit for. And one of the signs, to me, of a good product is getting pulled into markets that we never expected to be in. We were just focused on industry analysts, because I was an industry analyst. It's somewhere I understood, and I might even have a little bit of brand in that area. So that's what we focused on. Surprisingly, we've had buy-side investors, we've had retail investors, we've had wealth managers, we've had competitive intelligence people. All of these are not people that we envisioned using the tool. We're not really building the tool for them, but they're seeing a use for it. Anybody that really has to keep up with a lot of companies, because we're really company focused, that's a good fit for Analysts Copilot.
Clint (20:44)
Got it. And how is Analysts Copilot better than me just asking a simple question in ChatGPT: tell me about this particular market? What are you doing that goes above and beyond the standard frontier-lab LLMs?
Bruce Daley (20:58)
Well, first of all, we're curating the data, right? We really look for primary sources of data, and we make sure that the data is as clean and accurate as we can make it. Second, we curate the prompts. Some of the prompts can be incredibly elaborate. You can specify the output, you can specify the structure of what you're producing, you can specify the tone and voice. We've got a very user-friendly, graphically oriented interface. And really, if you're using a frontier model, as powerful as they are, it's still just a command line, right? It still looks and feels like MS-DOS. But in our experience, the LLM is critical technology. It allows you to do things that you could never do before.
Clint (21:44)
What I was hearing in there is that the answers are only as good as the data and the questions. You're making sure the data is accurate and you're making sure the right questions are being asked. And then you're wrapping all of that in a software company, somebody who's going to continuously deliver this, as opposed to somebody who's just kind of rolling their own solution inside of Claude or ChatGPT, where who knows if the data is even accurate.
Greg (22:10)
All right, well, we're definitely on with some Silicon Valley veterans. I heard MS-DOS command line, I heard one throat to choke, I heard some of the old ones that I haven't heard in meetings in a long time. So it's great to have you on. And I think I now have my understanding of Analysts Copilot. Analysts Copilot seems to do the analyst work that you don't want to do. You don't want to go do all the research and write it down and prepare it and get it ready, but you want to be prepared when you go into something, and that's your copilot.
Bruce Daley (22:46)
Bingo. And if you want to see a demo of it, Greg, next Thursday, March 12th, Carter Lesher is going to be hosting us on a webinar.
Greg (22:56)
Let's make sure and get that link into the show notes, Clint. We'll do that.
Okay, it's time for this week's AI Challenge. Now the AI Challenge is a takeaway assignment for our listeners to get their hands on some AI tools and do some exercises.
Clint (23:14)
This week's AI challenge is called the prompt engineering taste test. Take a simple question, like explain the CRM market, and run it three different ways: first as a basic prompt, then with structure like tone, format, and constraints, and finally with an expert prompt that defines a role, audience, and output. You're going to quickly see something interesting. The model doesn't change, only the prompt does, but the quality of the answer improves dramatically.
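As a rough sketch of this challenge's three tiers, here is one way to build the prompts before pasting them into whatever chat tool you use. The exact wording of the structured and expert versions is ours, offered only as a starting point:

```python
# The same question at three levels of prompt engineering.
question = "Explain the CRM market."

# Tier 1: the bare question.
basic = question

# Tier 2: add structure (tone, format, constraints).
structured = (
    f"{question}\n"
    "Tone: neutral and concise.\n"
    "Format: three bullet points.\n"
    "Constraint: under 150 words."
)

# Tier 3: add role, audience, and a defined output.
expert = (
    "Role: you are a veteran enterprise-software industry analyst.\n"
    "Audience: a B2B sales leader preparing for a customer call.\n"
    f"Task: {question}\n"
    "Output: a one-paragraph market overview, the top three vendors, "
    "and one trend to watch, under 200 words."
)

for name, prompt in [("basic", basic), ("structured", structured), ("expert", expert)]:
    print(f"--- {name} ---\n{prompt}\n")
```

Run all three against the same model and compare the answers side by side; the differences in depth and relevance are the point of the exercise.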
Greg (23:46)
Yeah, look down in the show notes and you'll find the link that goes right to the blog. It's got all the instructions you're going to need. Now, if you've just finished an AI deployment within your business, we'd love to hear about your experience, the good, the bad, all of it. So go to www.promptthis.ai, go to the contact us page, fill out the form, and we'll be in touch to talk about your story.
Well, that was time well spent. I want to thank you for being on, Bruce. Now I really understand what Analysts Copilot is, and I think you're going to have wild success out there in the business world.
Clint (24:30)
You bet. So Bruce, thanks for joining us today. And before we wrap up here, tell us how people can find you.
Bruce Daley (24:36)
The name of our company is Analysts Copilot. So we have a website, analystcopilot.com.
Clint (24:43)
And I'm sure you're on LinkedIn as well, right?
Bruce Daley (24:45)
I live on LinkedIn. I live for LinkedIn.
Clint (24:49)
All right, Bruce. Well, thank you very much for joining us today. We appreciate hearing your point of view on the market and what's happening with Analysts Copilot, and we enjoyed the discussion today. And that's another episode of Prompt This.
AI Announcer (25:09)
Thanks for joining Clint and Greg today, and honestly, why should I even have to remind you? You can find all of the Prompt This episodes and every in-depth article at www.promptthis.ai. It's all right there, and yet people still miss it. It's infuriating. And be sure to click the follow button below, because if you don't, after everything we've put into this, it would be absolutely unacceptable.