Video: Implementing Salesforce AI: Move from Experiments to Impact | Duration: 3160s | Summary: Implementing Salesforce AI: Move from Experiments to Impact | Chapters: Webinar Introduction (3s), AI Implementation Challenges (131s), AI Modernization Challenges (455s), Understanding Semantic Layers (680s), AI for Business Value (895s), AI-Driven Data Entry (1390s), AI-Driven Customer Outcomes (1972s), Pricing and Implementation (2651s), AI Deployment Challenges (3054s), Concluding Remarks (3111s)
Transcript for "Implementing Salesforce AI: Move from Experiments to Impact": Hello, everyone. Welcome or welcome back to another Salesforce Ben webinar. Today, we'll be diving into insights from Coastal's survey, where they asked over 120 Salesforce-powered organizations questions around implementing Salesforce AI, including what it takes to move from experimentation to results at scale. We've got a big session today and there's lots to cover, so I won't keep you for long. Just some general housekeeping before we get started. You will notice that the chat is muted for this session just to help streamline everything today. And if you have any questions, do feel free to submit them via the Q&A box, and we will get around to as many as possible. This session is being recorded, and this will be shared in our follow-up email. And your speakers for today's session will be: we have Eric Berridge, who is the CEO of Coastal and a longtime leader in the Salesforce ecosystem. He cofounded Bluewolf, the original Salesforce consultancy later acquired by IBM, and brings twenty-five years of experience helping companies navigate big tech shifts with systems and strategies that deliver results. We have Max Kirby, who is principal AI architect at Salesforce, specializing in agentic AI and semantic data. With fifteen years in emerging tech, he's an award-winning speaker exploring the intersection of technology, policy, and philosophy. And we also have Paul Wilson, who leads Mural's go-to-market systems, modernizing data infrastructure to fuel product-led growth. With more than fifteen years in CRM and MarTech at Salesforce, Adobe, and Slack, he's now building a composable, AI-ready architecture to deliver connected experiences across sales, support, and product. So on screen now, shortly, you will be able to see some key terms that we will be referencing throughout the session. Do feel free to have a quick read through. 
This is just to ensure that everyone is going to be on the same page, because we will be referencing these terms throughout the session. But that is everything from me. So over to you, Eric. Awesome. Thank you very much, Saima. Awesome to be here with everyone. Thank you for tuning in. We'll try to keep this moving along pretty quickly, and hopefully it's got a lot of impact and punchiness for our audience. I do wanna thank Salesforce Ben for having us. You all have created an incredible community within the Salesforce ecosystem, and for that, we are all very, very grateful. And I wanna also thank our guests, Max from Salesforce and Paul from Mural, for joining us today. And we'll talk about really four things over the course of the next forty minutes or so. Number one, you know, our research report really kinda uncovered the fact that there's some stalling in the AI world right now. And we're gonna talk about what is stalling AI impact in organizations. We're also gonna talk about how to ensure that your AI projects deliver real value for your organization. We're gonna talk about how to build a modern data infrastructure, and this is really, really key from what we're finding with our customers: how to build a modern data infrastructure that can support AI at scale. And then we're also gonna talk about why legacy processes aren't necessarily agent-ready and how to fix that. But before we get there, because this is a webinar, we're gonna do a poll. And I will invite the audience to answer this question with one of five options. And the question is: what best describes your organization's AI efforts today? Some individual tools in use, but no org-wide strategy yet; testing small use cases; working through data or process challenges to get AI off the ground; using AI in specific areas with measurable results; or on the fence waiting to see where AI goes, one through five. And I'm watching the results here now. 
We'll give it a few seconds, and then I'll stack rank them for you. Exciting stuff here with Salesforce Ben, Salesforce, Coastal, and Mural. Alright. I think we've got a good representation from the group. I'd say the top vote getter is some individual tools in use, but not really an org-wide strategy yet. That keeps going up. Bottom vote getter is on the fence waiting to see where AI goes. So almost none of you are on the fence right now. You've all jumped into the pool. You're swimming in the deep end, and maybe some of you are looking for us to throw you a life preserver. We'll see. I've been in this ecosystem, as Saima said at the outset, for quite a while. We've all seen big tech shifts over time. One of my most memorable, actually, is watching Salesforce come to market at the turn of the twenty-first century, when this whole concept of the cloud was first launched and they had this moniker of no software. In fact, Marc Benioff himself used to picket Siebel conferences with a no-software billboard. And back then, everyone thought the cloud was crazy and client-server was never gonna go away, and look what happened. We're in one of those moments right now. We're at an inflection point. AI is going to be as big or bigger than the Internet. It's gonna be as big or bigger than the cloud. All of these things are building blocks towards true automation, which is what AI promises to deliver to your organizations and to our lives. But here's the issue: the tools are evolving faster than most teams can use them. And I see this actually inside of Coastal myself. Right? We're about 600 folks. We're deploying AI internally. The tools work, but figuring out how to make sure they're accessing the right data and giving us the right answers is an uphill climb. We surveyed over 120 Salesforce-powered organizations in our research report, and two thirds are increasing their AI investments, but only 21% say they're seeing results. 
The takeaway is this is not a tech problem. It's an execution problem. It's a change management problem, and that is not to cast aspersions anywhere. Like, we're in the early days, folks. So this is a natural evolution of how technology evolves. But observing what's happening inside organizations and going through our report has really given us some key clues that lead to the four things that we wanna talk about today. Most teams don't have the right data architecture. They're not focused enough on process design or governance to really allow AI to achieve scale in an organization. So today, we're gonna unpack that: not just what's going wrong, but what to do about it. Let me just talk for a couple of seconds before I bring on our illustrious guests here. You know, we've moved past digital transformation. We're now in this era of data and AI modernization. Yet many teams are still approaching AI like it's a side project, like it's an isolated pilot disconnected from core systems or outcomes. I think the number one thing I've seen so far is that the organizations that are having the most success are truly identifying their enterprise architecture and truly thinking about a modern tech architecture in ways that the cloud didn't require us to do in the early days. In the early days of cloud, you could literally turn on an application, run it in a silo, and it had immediate impact for your organization. We're finding that AI is actually the opposite. You need to unlock your data assets and make sure those data assets are in the right place in your enterprise architecture. And it doesn't necessarily mean all of that data is in Salesforce, and we'll talk about that as well. Other common blockers that we see: choosing overly complex use cases or low-impact use cases. Both are not necessarily the right way to go. Fragmented data, I touched upon that a second ago, and also no clear ownership between the business and technology. So let's talk about how to move past these things. 
And to do that, I wanna ask a question of Max, whose official title is principal AI architect at Salesforce. We all know that Salesforce has the pedal to the metal around AI with their launch not only of Agentforce last year, but, preempting that, the launch of Data Cloud in the prior year. But the reason why Max's title kinda makes me chuckle is, Max, your phone must be, like, ringing off the hook from internal folks at Salesforce and customers and Salesforce Ben. Like, how do you carry a title like that these days when every corner you turn, someone's screaming about AI? So let me just ask you this question. What are the biggest factors that you see keeping enterprise teams from moving beyond pilots into AI that truly scales? It's a great question. And everyone, thanks for having me. Always good to be on with Coastal. A couple things come to mind. There are kind of three categories in my mind. One is the fact that agents and agentic AI, and to a large extent just the generative AI that powers it, call certain things into question in terms of assumptions that we're using to build, but reinforce others. And so this is tricky. People get caught up on that. What do you reimagine, and what do you just enhance, is a big question right now. So that's not really a blocker, but it creates hesitation. And then you see two mindsets. I think the poll sort of reflects it, although I'm really glad to see so many people moving forward. There are two kinds of ways of looking at what's gonna happen with agentic AI. One is the do-nothing option is really good. You know, there's gonna be a new model every three months. Maybe there's gonna be developer agents that do a lot of this for me. Maybe if I wait, it will become an easier climb. And then there's, of course, the other side, which I think reflects today's audience, which is we gotta get on this really quickly. We gotta figure out what to do with it. It's gonna change everything. 
Where do we start? And then what should we do to get to the next kind of phase? And you see both of those mindsets, and they're both right and they're both wrong. Right? There's not really a black or a white on this. The last thing I'd say is there's a category of customers who understand that what's going on underneath the surface is a data conversation. And they might be taking what we think is the long way around. Right? They might be thinking about, okay, what this really means is I've gotta get the core of my digital operations solid. Right? I've gotta complete my journey to the cloud. I've gotta make sure that I have all of the data labeled properly. And I think the reasons why people get blocked, or why this is kind of tricky, is it's very new, and you really gotta find one or two use cases that help resolve these questions. Because when you stay in theory land, which is where us AI architects love to talk. Right? But we have to get down to, well, what are we using it for? I would give you this as the final piece of the answer, Eric. The best advice I have on this is cleanse the data for a specific use case. It's the best way of understanding what should be questioned in terms of assumptions and what doesn't need to be. What's a double down? What's a hesitate? And if you have kind of, what is the knowledge that I need to get the agents to do what I need them to do, and you work backwards from that, it's a lot easier to take a slice of the data as opposed to maybe the boil-the-ocean approach, which I don't recommend. Right? I don't recommend just going straight to your knowledge base and trying to label everything; all of that is so far not working. Is that kind of what we talk about when we talk about the semantic layer? Can you maybe get into some detail about that for our audience? That was one of the definitions we put up there. Yeah. This one's an important concept. It's a puzzle. Right? Yeah. 
It's an important concept. And whenever you see the word layer, you always have to say, well, what does that mean? Let's start with the word semantic. Semantic means the meaning of the data is inherent. We didn't use the term syntactic data for the last four decades. It was just called data. But, technically, all of the data that we've been working with, tables, columns, those are syntactic data. That means you read the header of the column and the label of the row, and then you can understand what's happening in the cell. Right? There's a syntax to how you need to read the data. And that's where you get kind of traditional relational database models and all that. Semantic data is the kind of data where you can just read it. So like a novel, where you just read it and you can understand the meaning. It's much more natural. It's also been very difficult to do for computers. We've spent a lot of time working on NLP technology. Transformers have kind of made understanding what's in a transcript or any unstructured data much easier than all of the old NLP methods. So when we talk about a semantic layer, the simplest analogy I can give you is it's a dictionary, but it's a really great dictionary. It's a dictionary that understands what all of those rows and columns mean in natural language. And as context changes, right, maybe you have two different business units where the same term might mean the same thing literally or interpretively, but there are others where it means something different literally or interpretively. Right? You have to manage both of those. So how do you tell the agents and the underlying LLMs, here's what all of these terms mean about the language that my business speaks? That's a semantic layer. Okay. That's very, very helpful. Let's switch to Paul for a second. Paul, you're the COO of go to market at Mural. 
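Max's dictionary analogy can be put in a few lines of code. This is a toy sketch only: the field names, glossary entries, and `describe` helper below are invented for illustration and are not any real Salesforce schema or API. The point is simply that a semantic layer maps raw, syntactic labels to natural-language meanings, with per-business-unit overrides for terms that mean different things in different contexts.

```python
# Toy semantic layer: a glossary mapping raw field names to
# natural-language meanings, with per-business-unit overrides.
# All names and definitions here are invented for illustration.

BASE_GLOSSARY = {
    "acct_arr_c": "The customer's current annual recurring revenue in USD.",
    "churn_flag": "True if the customer cancelled within the last 12 months.",
}

# The same term can mean something different in another business unit.
UNIT_OVERRIDES = {
    "services": {
        "churn_flag": "True if the customer declined to renew a services contract.",
    },
}

def describe(field: str, business_unit: str = "default") -> str:
    """Return the natural-language meaning of a raw field for a unit."""
    meaning = UNIT_OVERRIDES.get(business_unit, {}).get(field)
    if meaning is None:
        meaning = BASE_GLOSSARY.get(field, f"No definition recorded for '{field}'.")
    return meaning
```

An agent (or the LLM behind it) would consult something like `describe` before interpreting a query, so its inference budget goes to the answer rather than to guessing what `churn_flag` means.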
I have to imagine in your seat, you need to take this translation of the technology and figure out how it's actually gonna truly drive business value for a company like Mural. That's a fast-growing startup still in its early days. How do you decide where AI might help? You know, I'm assuming you have somewhat limited resources too. You know, you can't boil the ocean for a company of your size. How do you decide where it might help and what needs to be in place, do you think, before you actually act? Yeah. I mean, that's the challenge. And I think what I see is the volume of information is only going to continue to expand. And what we're looking to do at Mural, because we knew we had to reengineer the infrastructure for go to market, we took the opportunity to establish the foundations to be able to provide AI-based analytics to the humans in sales and marketing within the organization. And so where we're thinking about the first deployment of agentic AI is in that space of recommendations and seeing predictions based on just masses and masses of data. And so for us, it's the ability to, rather than ask our sellers to trawl through Databricks and see if they can figure out signals on who to speak to, the ability for AI and machine learning to surface those signals, translate them, and prompt our sellers and people in success with what their next best action is. So that's the opportunity that we see. So is that kind of the first area that you've attacked? Is this whole next best action, get your seller ahead of the data in a way and kinda guide them? It's the one that we're building for. So we're at the stage of our journey where we've just deployed Data Cloud. So we are now starting to see those data signals in raw form available on accounts. Where sellers used to have to go and wander around Databricks, we've now brought that data right onto the account record. 
And then the next phase is how do we begin to unlock that data with those ingested signals that we wanna deploy. What are you using Databricks for exactly? Databricks is the data lake. So we are pushing every go-to-market signal from marketing and sales and success down into Databricks, and then it also reflects product usage. So we have all the PLG signaling of you've logged in, you've done these things, you've taken these actions. So we're really establishing that base data layer against which we can then deploy more and more agentic capabilities across the full customer journey. But I think that's a really important point here too, and maybe I'm stating the obvious. But, you know, I think the shift that we've seen with the advent of these modern data lakes, whether it's Databricks, Snowflake, Google BigQuery, you know, there are different flavors out there. But the biggest change we're seeing is, I remember back in the earlier days of Salesforce, we would try to jam all this data into Salesforce. Right? And we would try to jam transactional data into Salesforce, which quite honestly created a mess for end users and created orgs that were eventually unusable. Right? The technology was doing what it was supposed to be doing, but it wasn't really a great user experience. And I think the shift we've seen in the last couple of years, that we strongly encourage companies to invest time in, is put Salesforce in the middle of these modern tech architectures so that the data can sit where it should sit. And with the advent of Data Cloud and with the advent of the concept of zero copy, which is where you can go and look at data and access data and transact on data without actually moving the data, that's a very, very powerful thing. Because now Salesforce is left to do what it does best, which is to help you execute business processes in front of your end users and your customers. 
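The signal-to-seller flow Paul describes can be sketched as a rule over raw usage signals. This is a deliberately simplified illustration: the signal names, thresholds, and action strings are all invented, and a real deployment would use models and live data-lake signals rather than hand-written rules.

```python
# Toy next-best-action rules over product-usage signals, as a stand-in
# for the ML-driven recommendations Paul describes. Signal names and
# thresholds are invented for illustration.

def next_best_action(signals: dict) -> str:
    """Map raw usage signals to a suggested seller action."""
    if signals.get("days_since_login", 0) > 30:
        return "Re-engage: account has gone quiet for over a month."
    if signals.get("seats_used", 0) >= 0.9 * signals.get("seats_purchased", 1):
        return "Expansion: nearly all purchased seats are in use."
    return "Monitor: no strong signal yet."
```

The value is in where the recommendation lands: instead of a seller trawling the data lake, the suggestion appears on the account record they already work in.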
Trying to put all of that data into Salesforce is almost like trying to put every SKU that Amazon sells in every warehouse that Amazon has. Like, it just doesn't work. And this whole concept of signals, like, you wanna get those signals from the places that are most appropriate to act on that data. So I love that you're doing that, and, you know, you're definitely ahead of the curve based on other customers that we're working with. But that's a really big piece of it. Back over to you, Max. I talked about the foundation. I talked about how, contrary to the advent of the cloud with these siloed apps in those early days, in these early days, investing in that foundation and making sure you have the right architecture in place, we believe, is really important. And Salesforce made some major platform shifts to support this and to support data and AI. What changes are most important for teams trying to build the right foundation? So I think the most important one is understanding that as we get to an increasingly multi-agent world, right, there's gonna be lots of agents that you're gonna be dealing with, you have to think about the context of the underlying system in which an agent is working. That's top number one. You know, for us, for Salesforce, all of our context, or, let's say, most of our context, is around what matters for your customer. And so if it's a customer-centric use case, Agentforce is gonna be really well informed about what matters to the customer and able to learn very quickly what else matters to the customer and then take action on that faster. That doesn't necessarily mean that there won't be multi-agent flows that involve Agentforce and lots of other agents. 
But I see a world in which those other agents are going to, let's use a human term, right, consult with Agentforce to understand what matters for the customer, because that's much more efficient than trying to get all of the context within a given system and then put it in the customer's context. And then here's the trickiest part about it: that context is itself contextual. Meaning, when was it true? Last week? A month ago? Should we base our actions and conclusions on that going forward? And this is where you get a lot of people that are, I think, mistaking AI. They're saying it's not working, but what it is is it's not contextualized. Right? Because the LLMs and agents are trying to be helpful, almost overly so. Right? They're clearly trained to be your best friend and just try really hard to do what they can. The question is, do they know what that means? Do they know what you want? And from a technical perspective, I think a really good way of framing this for the audience, and for all of us trying to figure out agents going forward, is: are you using the inference power that an agent can bring you for the output, or is most of that inference power budget going to interpreting your input? If most of it is interpreting your input and what you mean, you're not gonna get great output at an equal kind of token-compute-dollar rate. Right? So that would be the number one: think about the context, think about the metadata around the data, and also think about how that context can change over the course of your business. And that's where there can be really creative use cases too, because that context is data itself. That's where the most advanced people are getting towards, as they're looking at, like, what questions are my customers asking? How is this agent being used? That's itself a dataset. So it's two sides of the same coin. 
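Max's point that "context is itself contextual" comes down to asking, for every fact an agent relies on: when was this true? A minimal sketch, assuming each fact carries a recorded-at timestamp (the 30-day staleness threshold is an arbitrary illustration, not a recommendation):

```python
from datetime import datetime, timedelta

# Sketch of recency-aware context: every fact carries a timestamp, and
# stale facts are flagged before an agent acts on them. The 30-day
# threshold is an arbitrary illustration.

def usable_facts(facts, now, max_age_days=30):
    """Return (fact, is_stale) pairs so an agent can weigh recency."""
    cutoff = now - timedelta(days=max_age_days)
    return [(text, recorded < cutoff) for text, recorded in facts]
```

An agent could then discount or re-verify anything flagged stale instead of confidently acting on last quarter's truth.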
You're in a unique situation, Max, because, as I said earlier, like, you know, it's coming at you fast and furious given how much interest there is in Agentforce and in Salesforce's AI platform. Like, if you could share it with us, and we didn't prep for this, by the way: what's the coolest use case you see right now? Like, what is the mind-blowing use case that, you know, is pretty sweet? Yeah. I'll give you two. Might be more than you bargained for, but I'll give you two. They're part of the same kind of thing. So here we go. The number one vein of use cases, I would say, grouping, are agents that are taking natural language in some form and doing data entry. Now that sounds kind of lame, but here's why it's really cool. For a long time, we have been making these assumptions. Right? We talked about the assumptions that need to be questioned. We are making the assumption that we have to use our software, our platforms, our screens, our surfaces in sort of the containers of the past. So whether it's putting aside the keyboard and using audio input or your natural language of voice, whether it's, you know, you're in a cab ride home from a customer conversation and you're tired, but it's fresh in your mind and so you wanna just do a brain dump into your phone and let Salesforce figure out where that data needs to go and update all those records for you and do all of that data entry, that is unlocking a whole new category of value. And I think it's mostly reflecting how our minds work. In that example, we'll be concrete. If you've ever taken forensic psychology or looked into how the law works with our memories, it's really bad. Our memories are not reliable witnesses. The way our brains work is actually remembering the last time that we remembered something. They're not like video cameras that got everything on tape. 
So if you can get that information right after a customer interaction, and you don't need the keyboard, you can just brain dump it and let the agents figure out where that data needs to go, you solve the memory problem. And then the whole platform gets more valuable because it's only as good as the information inside it. And so you're getting more reliable insights about the customer. So that's one, and then here's the complementary one. If you can do that at the ground level, at the salesperson level, at the person interacting with the customer, but also in service, sure, that itself creates this dataset that, like, a CRO or a leader then can ask some really wild questions of that were impossible to ask in the same way. I'll give you a concrete example here too. What objections are my customers mentioning to my field? I mean, I can't tell you how many CROs we talk to, and they say we don't know what's going on. And if we were to try to figure out what's going on, it would take this many emails, this many QBRs, and it's also kind of filtered through the agendas and the things that are happening and the, yeah, it's all great over here, boss, when there's a really important insight that needs to be brought up and the whole company needs to adapt to what that customer is asking for to create better outcomes for those customers. So two sides of the use case: one the atom, the other the aggregate. And if we can do just that alone, I mean, that's changing how a lot of these companies who are early adopters of Agentforce are using their CRMs, but also how they're thinking about their customers and their sales in general now. Well, I guarantee you, if we did another fun Salesforce Ben poll right now and we asked the audience on a scale of one to ten, one being the worst, ten being the best, how much do you like doing data entry, they would all answer one. Right? 
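Both sides of Max's use case, the atom (one seller's brain dump becomes record updates) and the aggregate (a CRO asks what objections the field is hearing), can be sketched together. A real agent would hand the note to an LLM; the regex stand-in below only illustrates the shape of the flow, and the field names are invented.

```python
import re
from collections import Counter

# Sketch of "brain dump to data entry": unstructured seller notes in,
# candidate record updates out. A real agent would use an LLM; this
# regex stand-in and these field names are illustrative only.

def extract_updates(note: str) -> dict:
    """Pull candidate CRM field updates out of free-form seller notes."""
    updates = {}
    amount = re.search(r"\$([\d,]+)", note)
    if amount:
        updates["deal_amount"] = int(amount.group(1).replace(",", ""))
    objection = re.search(r"objection[:\s]+([^.]+)", note, re.IGNORECASE)
    if objection:
        updates["objection"] = objection.group(1).strip()
    return updates

def top_objections(notes):
    """The aggregate view: count objections across many seller notes."""
    return Counter(
        obj for note in notes
        if (obj := extract_updates(note).get("objection"))
    ).most_common()
```

The same extraction that saves one seller from the keyboard also builds the dataset the leader queries, which is why Max calls them two sides of one use case.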
And it's the bane of our current application stack that it requires this hands-on-the-keyboard, fill out the right field, the data has to be perfect, every picklist has to be right. And I love that use case, even though it's a little out there, because, I mean, that's the future where I think CRM and modern technology will go, where we will live in a prompt-based world and where the systems will use our unstructured data. We work way better in an unstructured way as human beings, and it will make sense of it, and it will push us down the right path. Now does that mean I want my bank account to be calculated on unstructured data? Probably not. Yep. There will still be millions of things in our lives where structured data is gonna rule the day, but the opportunity to use technology better is using this unstructured data. Because structured data implies a relationship in language too. That's what's so cool about it: when you make a table and you say these are the rows and columns, you're also making a statement. But now we can work with both, to your point. And I think that's a good segue, because I think the other thing we're finding is that the business processes that exist in our systems today need some level of redesign before they're ready for agentic work, if you will. And it makes total sense to me, because most technology-enabled business processes still have tons of exceptions. They still require rep approvals. There are workarounds. There's manual work. So the question I'd ask you, Paul, is, as you're actually designing these in the real world, because Max and I are up here in the ether: like, how much process redesign is required? Like, when you think about your next best action use case, did you have to rethink how that process works inside of the company, or what effort is involved? Yeah. 
Well, first, I'd like to thank Max for the first process reengineering effort. I'm sure I'm getting Slack messages right now from the sales team at Mural asking, can they stop entering data in Salesforce now, or is that this afternoon? So thanks for that. I really appreciate it. The question about how process is going to evolve is the next area that we are going to be paying attention to. So if I can kind of contextualize your question, Eric: you know, we are in the midst of, we've just launched Data Cloud. We're in the process of launching Revenue Cloud. We're doing that effort right now, and we're mimicking the current patterns that exist in Salesforce today. However, agents now have the ability to consume and make recommendations based on information that, until now, was manually assembled by humans. The way that that process is going to evolve going forward is where we think we're going to see some very interesting adaptations. And I think this is where some of the frameworks around approval flows and deal desk work are gonna rapidly evolve once we start seeing how we need the data to support what agents are capable of. And that's what I think is going to be the interesting area of understanding: from where we all exist today in the world of Salesforce core objects to a world tomorrow where agents are able to see a broader landscape and adapt more rapidly in the processes that businesses are trying to evolve. And I think that this is an area where, you know, we benefited greatly from the partnership with Coastal, because we see our situation. We see our environment. And having a partner who can see around the corner and see what some other organizations are grappling with is informative for us. But I think processes are very much going to change with the inclusion of AI agents in the workforce. Well, I'll ask this question of both of you. Will, like, the organization have to change? Like, you know, we're coming out of a 
couple decades of you've got Salesforce admins and you've got business process people and you're doing quarterly releases, maybe monthly releases, to tweak your processes. Like, how does that organization of the future look, and where are they gonna be spending most of their time from a technology perspective? I'll start with you, Max. There's a really good book on this subject that I recommend everyone go and check out. It's an old one. It's called The Fifth Discipline by Peter Senge. He originally coined this term, the learning organization. So, yes, organizations will change, but I think they're going to become more oriented towards what do they need to learn and take action on. Another way of saying this is they're gonna become more of the essence of whatever they're trying to do for the end customer, and that's terrific. Right? I mean, I know there's a lot of kind of fear, anxiety, there's change, and those are natural. I'm a wild optimist about this stuff, because I think that, ultimately, if you can get more companies trying to create more great outcomes for their customers, what's gonna happen is we're gonna grow. Right? And I don't mean we, Salesforce. I mean everyone. The economy will start to create better outcomes for customers, and there will be more to go around. So the organization will change. Yes. That sounds like, oh, well, yikes. But it's also going to grow, and I think it'll lead to lots of things like hiring. We're gonna be in a pregnant stage, and then we have to get through it, but there's something really good on the other side of this. What about you, Paul? Yeah. I completely agree. I think that the changes and transformation, we talk about the digital workforce and, you know, that notion that AI agents are a digital workforce. 
And that is sort of predicated on the old style of thinking where, you know, you bring in a new hire and they would learn and adapt to the circumstances that they come into. But I think we're gonna see that with good data foundations, there are optimizations that happen that we don't even know about. One thought that comes to mind for me is, you know, businesses go through the exercise of defining an ICP and the personas within that ICP, and that's who they tool their whole sales and marketing infrastructure to fit. I think in the world of data everywhere and agentic support, every ICP moment is gonna be an adaptation, and then the right message will be delivered to the person who's coming to look to be a consumer, including potentially the message that we're just not a good fit. You know? And I think that that's the kind of area that has been, to date, a very iterative, every quarter we need to refresh our ICP, every quarter we need to adjust our comp. I think this is gonna be the kind of area where the fluidity of data and change is just gonna completely transform the way that people do business. And we're talking about this from the lens of how organizations market and sell, but the flip side of this is how are organizations gonna buy, and where is AI, and what are those capabilities going to do in terms of how companies come into the market to look for solutions? How do we get AI selling to AI? So, I mean, I think there's so much depth that will be uncovered, because I think ICP is a great example. Right? Like, ideal customer profile. Like, we hear that, like, okay, well, maybe we have three or five. Well, it's not quite that simple. Like, you know, we've sold major transformational programs to companies that have less than 10 employees, and we've done it to the Fortune 500. Like, what's my ICP? Well, there are certain attributes around what that client needs that's deeper than what the structured data is telling us. Right? 
So I think that's an awesome use case, by the way, drilling into ICP. Let's talk quickly about outcomes. The only reason we're doing this is we're trying to make our businesses better. We're trying to make our customers more successful, get them to use more of our portfolios, retain them, and make our employees happy. When you look at the use cases that both of you see, how do you coach your internal organization, Paul, or your clients, Max, on how to truly define outcomes, so that the effort being put into these Agentforce programs is delivering out the back end? What's the guidance you give there? I'll start with you, Max. So I'll answer in a broad way and then in a really specific way, because I love this question, and I feel like a lot of people give it too broad an answer. So I'm gonna give you the philosophy, but then I'll give you some specifics for how we look at it. There are two kinds of customers that we see moving forward right now, and I think the broad answer to this question reflects why they're moving forward in the way that they are. You've got really big companies who have decided to focus on one thing and get it right, not do that boil-the-ocean thing. And then you have really great startups, Mural included, that are nailing how this changes the assumptions for kind of all businesses in their category, but certainly theirs. And there's a theme there, which is ultimately: figure out how to orient what you're doing towards that end customer and what their behavior is going to be when they buy. And I love this point, Paul. There's another great book on this subject, When Machines Become Customers. Another one to check out. But when I look at the people who are succeeding, I see this theme.
I mean, Royal Bank of Canada: they were thinking about how do we make sure that our financial advisors understand the customer in their own voice, and then get that information whenever they need it. Some of the tax season businesses, you know, 1-800Accountant was a big win for us and a huge case study. They were able to do the deflection you would normally want to do in a customer service area, just by understanding the customer in their own intent. And I see that across airlines, as well as customers that are trying to help people hire or sell textbooks. There's these things that you think of as lame that can become really cool if you just understand the buyer in their context. So that's number one, the general answer. Here's the specific: we think about this in terms of profit generated per salesperson hour. If you can get to that, now you have to go through return on human capital. You've gotta think about how you make assumptions. Is it at the department level? Because it's hard to get it at an individual level, and you've got all sorts of heterogeneous data to work with. But that's what we aim at: profit generated per utilized hour. And then, how many hours can we also remove that are menial tasks and not useful at all? I think data entry is the number one way you can get started on that. Awesome. What about you, Paul? Yeah, kinda tying into what Max was just talking about: at the foundation, we just talked about a bank, airlines, tax consultancies. Every experience now has a digital component to it. And the more information and context behind it, the better the customer's experience. And so how do you do it while respecting privacy, respecting an individual's wishes?
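Max's "profit generated per utilized hour" metric can be sketched as a simple department-level calculation. This is a minimal illustration only; the figures, field names, and the companion menial-share ratio are hypothetical assumptions, not numbers from the webinar:

```python
# Sketch of "profit generated per utilized salesperson hour", computed at
# the department level as Max suggests. All figures are illustrative.

def profit_per_utilized_hour(gross_profit, utilized_hours):
    """Profit per hour of genuinely productive selling time."""
    if utilized_hours <= 0:
        raise ValueError("utilized_hours must be positive")
    return gross_profit / utilized_hours

def menial_share(menial_hours, total_hours):
    """Fraction of logged hours spent on menial work (e.g. data entry)."""
    return menial_hours / total_hours

# Hypothetical department: $1.2M gross profit on 8,000 logged hours,
# 2,000 of which are data entry and other menial tasks.
total_hours = 8_000
menial_hours = 2_000
utilized = total_hours - menial_hours  # 6,000 productive hours

rate = profit_per_utilized_hour(1_200_000, utilized)  # 200.0 dollars/hour
share = menial_share(menial_hours, total_hours)       # 0.25
```

The two numbers capture both targets Max names: raise profit per productive hour, and shrink the menial share that automation (starting with data entry) can remove.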
You know, we see the evolution of this technology, but there are going to be markets, I'm sure, where, potentially, the European Union looks at this in a similar way to the right to be forgotten: you can request that your information not be processed for this kind of capability. So how do we, as curators of digital experiences, ensure that we're constructing the ecosystem to deliver that best-in-class, differentiating customer experience that's contextualized and respects the customer? These are the topics for us today, in my opinion. These are the things we need to be thinking through. Yeah. I mean, at the end of the day, your leadership and your CEOs and CFOs are gonna wanna see a return. They're gonna wanna understand how you're grabbing more market share and how you're growing your business in a cost-effective way. And the outcomes that agentic AI will deliver, it'll deliver those things if you have the right foundation, if you're picking the right things in your organization, if you're getting rid of the junk work that's taking up time. So I would never go down one of these paths without that in mind. Make sure you can sell it to your CFO. But I would also say this: the one thing that does not change is that customers buy on emotion. They buy from people. They affiliate with brands that make them feel good. They put those brands in their lives. And until that experience turns negative, they stick with it. That's how you drive customer loyalty, and that's how the best brands in the world have survived over time. So the fool's errand with AI, just like it was with the Internet and the cloud, and just like it was with client-server and the mainframe, is to think that as a customer, we're gonna wanna be integrated into this very brittle technological motion. We don't want that.
Yes, I wanna be able to return a product quickly. Yes, I want Paul's next best action to show me the right thing. But I want it to feel like it's coming from a living organization. I wanna feel like it's got a heart and a soul. And we truly believe that. At Coastal, what we wanna help you do is free up your time with these tools so that you can win the hearts and minds of your customers. And that is never gonna change. I firmly believe that. So laying the foundation, picking the right tools, reorienting your organization to think about AI the right way, identifying outcomes: those are all imperatives. But making sure your brand comes through the right way is still the mantra that I think any CEO across the land is gonna want your teams to truly focus on. I'll give you a funny example here, Eric, just to build on what you're saying. One of the things that I think everyone knows agents can do is answer frequently asked questions. Right. I'm observing this interesting second-order effect: when you have agents answering frequently asked questions, it changes the frequently asked questions on the page. Right? And we all know as marketing people, you can think about what questions you want your customer to ask. A lot of FAQ pages across the web are like, here's a frequently asked question: why is your product the best for me? And it's like, thank you, customer, for asking us that frequently asked question. Here is our entire value proposition. But if you let agents manage that, you get the real questions that customers have, not the ones you want them to ask, the ones you need to answer. Interesting. Just to build on what you're saying, it's also forcing brands to think about how they are understood before how they wish they were understood. It's push going back to pull.
And I think every move in technology that you just outlined has pushed us further towards this. I think agents are just gonna finish the fight. It's gonna be like, look, you have to be customer-centric now. There's no real choice, because otherwise we're gonna skip over your whole product, because we're gonna automate away anything that's not. Right? That's a great way of looking at it. I think we're ready to open it up for questions. Before we do that, I'm gonna call a little bit of an audible, Saima, because I do want the audience to know that we've got some resources here. We've got a QR code here that you can scan that will do two things. Number one, the research report that we launched about six weeks ago: there's a QR code here for you to download it. It's titled "AI Isn't Delivering. Here's What to Do About It." It goes into depth on a lot of the things we've talked about over the past forty minutes or so. And then we're also hosting here at Coastal, free of charge, small-group data and AI workshops. If you wanna register for one of those, grab the QR code on the right, and we'd love to see you in one of those sessions. So with that, I'm gonna pass it back to Salesforce Ben to moderate some Q&A. Amazing. Thank you so much. Yeah, we'll move on to Q&A now. If you do have any questions, feel free to put them in our Q&A tab. We did have a couple that came through. So Michelle is asking: "I can appreciate that the stall is not due to the technology, but due to implementation and change management. Can you speak more to how Coastal can assist clients with this critical step?" Absolutely. First of all, we'd love to engage. We have a team of over 600 folks here in North America, the majority of whom are deeply embedded in AI technologies.
But more importantly, we have a process we call True North, which is a very hard-hitting, value-driven exercise that takes a few weeks, and it actually helps you define your AI roadmap. That's probably the best place to start with us. We've done this across just about every industry that Salesforce serves. And like I said, go to our website, scan the QR code for one of those workshops, and we'd love to get the conversation started. Fantastic. Thank you. With Agentforce and Data Cloud pricing being consumption-based, do you have any tips on projecting cost? Well, I'll turn this over to Max in a second. One area we really didn't touch on today is the whole consumption-pricing world, which I think we all know we're rapidly moving to across all of the technologies we use. And I will say, I think customers are sometimes reticent because they're not exactly sure how the long-term consumption model might impact them. We've got different ways of working through that and making sure that you're consuming your data where you should consume it, which gets back to having the right architecture in place. But I'll also ask Max to weigh in, because I'm sure you get asked that question a lot by your clients at Salesforce. Sure. Yeah, it's a big question. And as we're moving towards more OPEX for everything, the question is, how do you forecast? So I'll say two things up front. One is: not everything should be an agent. During every major new wave, there's the mindset I mentioned before, the let's-figure-it-out-and-rush-in mindset. You wanna build to learn, so it's okay to experiment with all these things. But when you're building to optimize, that's a different mode, and architecturally, that's a different architecture. And you have to keep those separate. Right? Are you building to learn, or are you building to optimize?
A lot of people in the beginning will think about optimization while they're still building to learn, and then you get really tricky conversations, because it's like, what are we comparing this to? What are your goals? So here's what I'd say: two modes. And I'm kind of giving you some of the game theory for how to work with us on this. One: when you're working on what you're building, think about how you have kind of a flat structure, so that you know how much budget you have to learn. Then, when you switch over to building to optimize, when you wanna really burn it in, you've got it working, you wanna figure it out: okay, now let's put it into production and think about what doesn't need to be an agent. Maybe some of these agents are really just flows underneath. Right? When you get to that moment, that's when you wanna think about what it costs over time, and you wanna negotiate a rate. Because once you have the P, the L starts to be a ratio that makes sense, and there can be good deals done. But if you start thinking about the L, you're never gonna get to the P. And a lot of people get caught up. They get, like, 80% of the way there, and then they go, oh, it's not gonna be good enough, because it's just too expensive. And it's like, well, have you done any optimization at all? Because usually we see 20 to 30% savings if you just figure out how to optimize. But the question is, what are you optimizing? There's also another option, which is if you bring your own LLM, those costs are borne by whoever brings the LLM. So that's another way of taking care of it. We let you bring your own LLM; it's not our business to upcharge on that. So if you're getting the LLM from us, you need to compare it on a total-cost-of-ownership basis. So what I would say is, what's old is new again.
This happened in the cloud era, when we were comparing servers and colo and hybrid and public cloud, and then all the calculators showed up. And then there was this whole job called the cloud economist that came out to figure out how to do this. So that's good. That's healthy. We're at the beginning of the wave. My only advice is, if you wanna just get down to what you're gonna do: set a flat budget to start, figure out what you're gonna do, then start looking at the ratios as you optimize, because production and learning can often get in each other's way. It's design thinking versus systems thinking. I'm sure everybody here is familiar with the dichotomy. But keep in mind it's pronounced right now; it always is with new technology. Yeah. And I think we're in a little bit of an awkward shift, because, for folks who haven't been in the SaaS world a long time, customers have been screaming for this type of pricing model for a long time. Right? The whole concept of seats, and how many people do I need, and do I have to buy a seat for someone who's logging in once a week? Those conversations will slowly dissipate, and we will get to a world where you're paying for what you're actually using. And that should be a beautiful place, as long as Salesforce keeps innovating, which I know you will, Max. We will. And the context windows will get bigger. The models will get better. Right? So, yeah. Sorry, Paul, maybe you were gonna make that point. Yeah, well, mainly I was just gonna highlight the other factor to be cognizant of: don't double pay. So in an ecosystem where you may have Databricks and Data Cloud, as an organization you wanna make a determination to say, I'm gonna spend my compute dollars with this platform, and make sure that if there is compute that's handled and passed up, I'm not gonna then recompute or transform again.
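The flat-budget-then-ratio approach Max describes can be sketched as a rough consumption forecast. Everything here is a hypothetical assumption for illustration: the credit rates, volumes, and the 25% savings figure are invented examples, not actual Salesforce pricing:

```python
# Sketch of forecasting agent consumption cost: a flat budget while
# building to learn, then an optimization ratio before production.
# All rates and volumes are hypothetical, not real Salesforce pricing.

def monthly_agent_cost(conversations, actions_per_conversation,
                       credits_per_action, cents_per_credit):
    """Estimated monthly spend (in dollars) under consumption pricing."""
    credits = conversations * actions_per_conversation * credits_per_action
    return credits * cents_per_credit / 100

LEARN_BUDGET = 3_500.0  # flat monthly budget for the build-to-learn phase

# Hypothetical baseline: 10k conversations, 4 actions each, 1 credit per
# action, 10 cents per credit.
baseline = monthly_agent_cost(10_000, 4, 1, 10)   # 4000.0

# Build-to-optimize: route simple requests to plain flows instead of the
# agent, modeling the 20-30% savings range mentioned above (25% here).
optimized = baseline * (1 - 0.25)                 # 3000.0

within_before = baseline <= LEARN_BUDGET          # False
within_after = optimized <= LEARN_BUDGET          # True
```

The point of the sketch is the sequencing: a flat number to learn against, then a cost-per-outcome ratio once you decide what stays an agent and what becomes a plain flow.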
So be mindful and thoughtful about where you want the thinking to occur, and then try to organize and systematize where that thinking happens. That'll always be where your higher costs are, so narrow it down to where it actually needs to happen. I think we've got time for one more question, Saima. Yeah. Excellent. What are the biggest blockers to deploying AI into production: data staging and preparation, infrastructure, or org readiness? Depends on the use case. Depends on the use case and the company. One of those is always more important. I haven't seen a single use case, I don't know, Paul, Eric, if you're seeing the same, where all of those are not involved in some form. But usually there's one that's primary. What's most important to you right now, Paul? I think it's the human factors. It's the adaptation. So we wanna ensure that what we're building for is what we're optimizing on the human front. Awesome. Excellent. Thank you. Fantastic. Well, thank you, Eric, Paul, and Max for your time today and for sharing all of these insights. It was a really thought-provoking discussion with some powerful nuggets of wisdom. Thank you, everyone in the audience, for tuning in today. If you would like to find out more about Coastal, we will be sharing the links behind the QR codes in our follow-up email. Oh, they popped up on screen as well. I'll also be sharing the recording of this session if you would like to pass it along to anyone else who might find this discussion useful. But thank you all for your time today, and we shall see you in the next one. Thank you as well. Thanks, everyone.