Video: It’s time to get serious about your data strategy | Duration: 3124s | Summary: It’s time to get serious about your data strategy | Chapters: AI Week Introduction (24.83s), Evolution of Conversational AI (66.965s), Augmented Intelligence Breakthrough (305.19s), Conversational AI Advantages (444.76s), Chatbot Challenges Explored (851.145s), Analytics Terminology Introduction (1052.06s), Data Intelligence Platform (1155.7799s), Databricks Governance Features (1628.45s), ThoughtSpot Platform Demo (1895.795s), AI Trends and Applications (2650.825s), Session Wrap-Up (2737.4849s)
Transcript for "It’s time to get serious about your data strategy": I lead our EMEA region here at ThoughtSpot. Welcome back to the second installment of our Gen AI week. Hopefully you were able to join us yesterday, where we talked about how AI is the new BI. If you didn't, make sure you pop onto our website to catch the session on demand. Today, we're taking you beyond traditional BI and data visualization to reveal how AI powered platforms and AI agents are enabling faster, smarter decision making. We have got a whopper of a lineup. Cindy Howson, our chief data strategy officer, will be joined by the industry legend Donald Farmer and Ari Kaplan, the head evangelist at Databricks. It's gonna be so insightful. Let's dive in. Yeah, it's great to be here again, and I'm delighted to be taking part in this webinar. It's such an interesting topic, conversational analytics. One of the great things about working with TDWI, as I often say, is that we're able to take a lot of research and bring it into the conversations that we have with customers and with vendors. So we're not just making this up; we're driving it from basic research that we have done. And one of the things which came out from this research on generative AI, when we asked what kind of applications people want to build with generative AI, was that one of the first things they said was: we want chatbots. We want conversational experiences. And that seems really interesting. Then there are other things like generating marketing content, and I'm sure you've all been on the receiving end of generative AI generated marketing content. But chatbots are an interesting one. And then, just about five steps down the list, you'll see: to act as a front end for analyzing company data.
So there's a chat experience, which is the top priority. But as soon as we get into what you might call the internal issues that people are interested in, developing code and acting as a front end for analyzing data is really important. Now, we can theorize about why that is, and some of our other questions do dig into this. But one of the key things is that conversations are seen as an easier way of engaging with analytic data. I want to make the point today that conversations are not only an easier way. This is not just about user experience. They're actually a better way of engaging with data. And I think in the conversations we're going to have, through my discussion and the discussion that our guests will bring, you'll see that this is a very important development in analytics. It's a new development, but it's a development with quite a lot of history behind it. The evolution of conversational AI actually goes all the way back to the nineteen sixties. Some of you may have come across the chatbot ELIZA, an early psychological chatbot that was run more as an experiment. But even at that time, the developers were very surprised at the way in which people were willing to engage with this chatbot. And it was very basic. It essentially gave canned answers, or it rephrased your question and asked it back to you. If it really didn't know what you were saying, because it was pretending to be a psychologist, it would say something like, well, what does your mother think about this? And then you would carry on the conversation from there. But what was surprising was that, given how simple it was, it still had a great effect: people really enjoyed using it. By the nineteen nineties and early two thousands, we started seeing natural language processing that was very much rule based. I was quite involved in that at one point, developing natural language connections to databases.
And why did we call them natural language? The truth was they weren't really. What you were doing was using fairly natural terms and syntax, but you were generating a very structured query underneath. And you needed to understand that process very well. To a certain extent, you needed to understand the query you were trying to generate in order to use the type of natural language that would result in a good query. And to be honest, that's not really conversational. If your idea of conversation is generating a query, then you're not gonna be welcome at my dinner table anytime soon. But by the twenty twenties, we got into slightly more sophisticated systems, especially with the breakthroughs in neural networks, where systems were much less rule driven but were able to engage with natural language in a way that, in a sense, was less predictable, which made it more interesting, and not only more engaging but able to give a much wider range of answers. By 2018, Gartner had started to talk about augmented intelligence, which was a way of taking business intelligence and using machine learning to add new insight to it. And in particular, we started to see a breakthrough in question and answer systems, or search based systems, which provided a much more natural experience for users trying to deal with business intelligence. Part of the challenge of business intelligence, in dashboards and reports and so on, was that there was always a hypothesis presented to you: this is what's important, this is what the dashboard is showing you. And that was great, but very often, if you were an inquisitive, curious executive, you're gonna have more questions. You're not just satisfied with what you see on screen. In fact, the whole purpose of showing you this data is so that you have another question and say, yes.
But why are sales going lower? How can I understand my dropping sales? Or why is this marketing campaign so much better than that one? How do I dig in and discover more of that? And the augmented intelligence applications which emerged in 2018 and onward were actually much easier, much better for people doing that kind of exploratory analysis, rather than just the kind of situational awareness that you got with traditional business intelligence. But I think we're all well aware that in 2022, things really did change. For most of us, this might have come out of nowhere, but actually the industry had been developing towards it for some time. Large language models and generative AI had come out of what we call transformer technology, which itself had emerged from experiments with a technology known as GANs, generative adversarial networks. And many of you will remember some of the early attempts at drawing pictures and interpreting images with those early systems. They were crude, but you could see that they were getting somewhere. In twenty twenty two, we had this breakthrough: large language models really started to be capable of having what felt like an authentic conversation, and indeed of generating rich natural language responses. That breakthrough is really what has led to much of our conversation today. Now, I said that I wanted to make the point that conversations and conversational interfaces are not just about simplicity and ease of use, and I think this is very important. One of the things we hear a lot, a buzzword you'll hear quite often, is prompt engineering. Prompt engineering is the idea that if you can create the right prompt, the right input, then you will get better answers back from a generative AI, from a chatbot, from ChatGPT, or from any of those systems. So the idea is to create the perfect prompt.
Prompt engineering puts a lot of effort into that. But that turns out to be not that much better than the old way of trying to create a query. After all, what you're trying to do is give the correct input in order to get the correct output. That might be natural language, but it's not natural conversation. That's not the way we work as human beings. Natural interactions mirror human dialogue. As you and I talk, or any of us talk in the street or in the office, every sentence, every exchange actually enriches our understanding. It's not just a question of needing to utter the perfect sentence in order to get a result; we have a to and fro, a genuine dialogue. This is flexible because it allows us not only to refine our understanding but to change direction if we need to. You can imagine a conversation with a chatbot that is very like the conversation you might have with a server in a restaurant. The server says, well, we don't have that, this dish is off, but can I recommend something else? And then the conversation goes in another direction. It's that ability to be flexible and change direction that really matters in human conversation. It enables you to have more insights. It enables you to discover unexpected things that you may not have known about in the first place. And this is one of the great benefits of the conversational interface. It doesn't necessarily go where you expect. It can reveal and give you glimpses of new understanding that you can then follow. And that could be critically important for people who are investigating problems, looking for new strategies, or looking to innovate. Those are exactly the kind of responses that you need. And then it is also true, I think, that the conversational interface is, in human terms, more efficient.
Now, this all sounds as if conversations are simply a good thing in themselves; it's nice to have a conversation, and it takes you in these different directions. This is all true, but I also want to emphasize that behind that, there is actually something more. Technically, conversations beat prompt engineering. There's a whole list of reasons on here, and you don't need to dig into them in detail; we won't even go through all of them. But what I want to get across is that there are very good technical reasons why conversations are not just satisfying for human beings but technically better. In generative AI, we have what are called context windows and attention mechanisms, which are the way in which the generative AI looks at your question and sets it in the context of its knowledge base, all the information it has been trained with. When you give one command, one question to the Gen AI, it only has one context window, and its attention mechanism is limited to that. As soon as you ask another question, the context window moves and grows, and there's a larger attention window. So the more questions you ask, the more the generative AI is able to explore its own knowledge base and give you much better contextualized answers. There is also what we call vector space navigation. Data is stored in these systems in the form of what we call vectors, and as you ask more questions, the system can navigate and explore more vectors, which it can't do unless it has more input from you. This results in the ability to chain together multiple prompts and start to build up a really rich context for answering questions. If you ask only one question, even if it's the perfect prompt, you will not get as good an answer as if you're able to explore, or at least enable the generative AI to explore, the rich context space behind it.
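The idea that each extra turn enriches retrieval can be sketched in a few lines of Python. Everything below is invented for illustration: the three-dimensional "embeddings", the tiny knowledge base, and the keyword-based embed function stand in for the learned, high-dimensional vectors a real large language model uses.

```python
import math

# Invented toy knowledge base: each "fact" has a hand-made 3-d vector.
KNOWLEDGE_BASE = {
    "q3 sales fell in the midwest": [0.9, 0.1, 0.0],
    "marketing spend rose in q3":   [0.2, 0.9, 0.0],
    "churn is highest in retail":   [0.0, 0.2, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two vectors (0.0 if either is zero)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def embed(text):
    """Stand-in embedding: map keywords onto sales/marketing/churn axes."""
    vec = [0.0, 0.0, 0.0]
    for i, word in enumerate(["sales", "marketing", "churn"]):
        if word in text.lower():
            vec[i] = 1.0
    return vec

def answer(conversation):
    """Retrieve the best fact using the WHOLE conversation, not just the
    last turn: every earlier question keeps enriching the query vector."""
    query = [0.0, 0.0, 0.0]
    for turn in conversation:
        query = [q + x for q, x in zip(query, embed(turn))]
    return max(KNOWLEDGE_BASE, key=lambda k: cosine(query, KNOWLEDGE_BASE[k]))

convo = ["why did sales drop?"]
print(answer(convo))                     # retrieval driven by turn 1 only
convo.append("was marketing involved?")
print(answer(convo))                     # turn 2 shifts the retrieval
```

A single perfect prompt in this sketch can only ever point at one region of the vector space; the second turn moves the accumulated query vector toward a fact the first turn alone would never surface.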
The other thing that happens is that when we give a question, a prompt, to generative AI, it's what we call tokenized, which is to say the system breaks it down into tokens. Those tokens represent linguistic units, if you like. And the more of those tokens you have, again, the larger your context space. The latest model from OpenAI at the time, for instance, had the code name Strawberry. The reason is that just a few months earlier, there was a rather famous problem in generative AI chatbots: if you asked them how many r's are in strawberry, they couldn't give a good answer. They would typically give you two, even though there are three r's in strawberry, one in straw and two in berry. The latest version of ChatGPT was code named Strawberry because it could answer that question. And the way it did it was that rather than seeing the word strawberry as one token, it understood that, for that kind of question, it may need to break the word down into more tokens and apply more logic to it. So it takes strawberry, breaks it down into nine tokens, counts three r's, and it can give the answer. Tokenization, therefore, is a really important part of the communication between you and the GPT, even though you don't necessarily know that it's happening. You might come across it under those circumstances where something goes wrong. The more we can tokenize, the more we can navigate the vector space, the better the context we get, and the larger the attention window we can apply to the problem, the richer the answers you're going to get. So it's not just that this is an easy way to deal with it, a simple way to craft your questions. It's a better way. Have a conversation with your systems rather than just trying to ask the perfect question. It's a great discovery process to have a conversation with your data.
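The strawberry example can be made concrete with a toy sketch. The two "tokenizers" below are invented stand-ins, not any real model's vocabulary; the point is only that a question about letters is unanswerable at word-token granularity and trivial once the word is split into finer tokens.

```python
def word_tokens(text):
    """Coarse tokenization: 'strawberry' stays one opaque unit."""
    return text.split()

def char_tokens(word):
    """Finer tokenization: break the word into per-letter tokens."""
    return list(word)

def count_letter(word, letter):
    # Counting letters is only possible once the word is split into
    # tokens fine enough to expose each letter individually.
    return sum(1 for tok in char_tokens(word) if tok == letter)

print(word_tokens("how many r's in strawberry"))  # 'strawberry' is one token
print(count_letter("strawberry", "r"))            # -> 3
```

A model operating only on the output of `word_tokens` sees "strawberry" as a single symbol with no internal letters; the character-level split is what makes the count possible.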
And the technologies we're gonna be talking about today are great at doing this. They really enable you to have rich, exploratory, complex conversations with your data that let you discover things you didn't know before and answer questions you didn't even know you had. And that really is a breakthrough in analytics, if you ask me. It's something we've been wanting for a very, very long time in the history of analytics and business intelligence. Now, that's not to say there are not challenges. And frankly, the biggest problem in any computer system is the human beings using it, and the same is exactly true for chatbots. Users will attempt to manipulate the chatbot. I look at the logs from some of the bots that I've been involved with deploying, even just in the last couple of years, and looking at the prompts that people issue, I'd say something like 20% of them are inappropriate. People try to trick the chatbot into doing things. They ask it questions they shouldn't be asking. And there's always a risk that the chatbot might actually respond to some of these things. There's a potential, therefore, for misalignment with what the chatbot is intended to do. There's also misalignment with company values, and it can have quite a bad impact on integrity and the public perception of your brand. But not only that, it can also lead to specific problems. I think it was Air Canada where somebody asked a chatbot for a price on an airfare. The chatbot responded incorrectly with a ridiculously low price. The customer bought it, and then Air Canada tried to roll that back and say, no, that was a mistake. The courts told them, no, you have to honor it. The chatbot is your company representative and offered that fare, and therefore you have to honor it. So chatbots have potential problems there.
We need to worry about issues like accuracy, especially in commercially sensitive situations like that. I read somewhere else that somebody tried to trick a chatbot into making a promise that whatever deal was made, the company would honor it, and then they tried to buy a GMC Yukon SUV for $1. Again, the chatbot went along with that process, and people were trying to trick it into it. So if you're building a chatbot for commercial purposes, to be fair, some of those silly cases are going to go away, and the courts will deal with them in due time. But there are still potential problems, especially if these are questions about information, best practices, safety, or commercial terms. You really want to make sure that your chatbot is capable of not only giving accurate answers, but answers which are aligned with your corporate ethos, so that your brand integrity and the public sense of your brand are maintained. The chatbot is your representative if it's public facing, and therefore it has to be a good representative. And if you're using chatbots internally, that is also true. When you're connected to data and people are using generative AI, using conversational interfaces for data, it's really important that the results are accurate, contextualized, and appropriate. That might not carry the same issue of public perception, but it absolutely does carry the issue of trust and reliability within the organization. And of course, you need your results to be accurate, because the very purpose of doing this is that people will take action on them. So they need to be accurate results. With that, I think we can say that analytics as an experience is going to be greatly enhanced by chatbots. There are special things about chatbots and analytics that we need to be aware of, but, again, more detail here than we need to go into.
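One practical guardrail implied by these stories is to never let the model quote a commercial term directly: check any price it produces against the system of record before replying. A minimal sketch; the fare table, route code, and response wording are all invented for illustration.

```python
# Invented authoritative fare table (the "system of record").
FARES = {"YYZ-LHR": 780.00}

def vet_quote(route, model_quoted_price):
    """Validate a chatbot-produced price before it reaches the customer."""
    actual = FARES.get(route)
    if actual is None:
        return "I can't quote that route; let me connect you with an agent."
    if abs(model_quoted_price - actual) > 0.01:
        # The model hallucinated a price: answer from the record instead.
        return f"The fare for {route} is ${actual:.2f}."
    return f"Confirmed: {route} is ${actual:.2f}."

print(vet_quote("YYZ-LHR", 78.00))   # hallucinated low fare gets corrected
```

The design choice here is that the generative model is allowed to shape the conversation, but the number the customer sees always comes from the authoritative table, never from the model's own text.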
But the key thing is that there are special analytic terms that we use. Some of those terms you'll be aware of: things like what's an average, what's a mean, what's a median. But even a question like "what are total sales in Wisconsin" is somewhat vague, in the sense that it may need to be disambiguated. Sales for what year? Sales of what category? Do you mean total gross sales, net sales, that sort of thing? And so, for any analytic system, the chat interfaces, the conversational experiences, have to be able to disambiguate vocabulary, but they also have to know the specialized vocabulary of your business and the specialized vocabulary of analytics in order to be effective. With that, I think it's a great time for us to dig into how this works in the real world. Andrew, would you like to introduce Ari? Yes, please. Yeah, thank you, Donald Farmer. That was a great presentation, and it's my pleasure to introduce our first guest speaker, Ari Kaplan with Databricks. Ari is the global head of evangelism at Databricks and a leading influencer in data and AI. The popular movie Moneyball was partly based on Ari's analytical and scouting experiences innovating Major League Baseball and creating the Cubs, Dodgers, and Orioles analytics departments. He's also the co-author of The Data Intelligence Platform for Dummies, a DataIQ top 20 influencer in AI for 2024, co-host of the popular show Live from the Lakehouse, and was also the president of the Worldwide Oracle Users Group. Before joining Databricks, he traveled the world with McLaren Formula One, assisting the racing strategy team to bring AI models to production. With that, please welcome Ari, and I'll hand it over to you now. Hey, thank you so much, Andrew. And, yeah, Donald Farmer, that was definitely enlightening; it brought back good memories of using ELIZA back when I was a child.
It was fascinating to be able to converse to some degree, and I loved how you walked us through where we are today and where we're heading in the future. One of the key things I see as an evangelist traveling around the world, talking with customers and partners like ThoughtSpot, is that everywhere I go, this is one of the top requests: to be able to converse with their data. Companies want to be able to do it on their own data, where something like ChatGPT and the others that were mentioned is one giant model trained on everything it could pull in, so it's trained on Taylor Swift concerts or World War One statistics. What companies want is to be able to look at their own data, with security and access control. If a person has access to HR salaries, you want them to be able to use that chatbot, or run business intelligence questions, on salary data; but somebody who doesn't have access to the salary data should not be able to ask that. Also, if things are trained on your own data, you get more accurate information that's in the context of your business, and it understands what you're asking for. For example, if you ask about FY, your company may know that stands for fiscal year; or the word churn at, say, Unilever or whatever joint ThoughtSpot and Databricks customer may have one specific definition. But if you use just a general purpose LLM, it's diluted, and it might not really align to what your business means. So companies wanna be able to get accuracy on their own data, with security. They also want to be able to self-serve. With traditional BI tools, you have to have some expert who really understands your data and really understands how to create these dashboards.
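The salary example above comes down to enforcing grants before any conversational query runs. Here is a minimal sketch with invented users, table names, and grants; in Databricks this role is played by Unity Catalog permissions rather than a hand-rolled dictionary.

```python
# Invented grant table: which user may read which table.
GRANTS = {
    "hr_manager": {"hr.salaries", "sales.orders"},
    "analyst":    {"sales.orders"},
}

def run_chat_query(user, table, question):
    """Gate every conversational query on the asker's grants."""
    if table not in GRANTS.get(user, set()):
        return f"Access denied: {user} has no SELECT grant on {table}"
    return f"Running '{question}' against {table}"

print(run_chat_query("hr_manager", "hr.salaries", "average salary by level"))
print(run_chat_query("analyst",    "hr.salaries", "average salary by level"))
```

The important property is that the check happens in the platform layer, before the question ever reaches the model or the data, so the same chatbot gives two different users two different answers to the identical question.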
So if you can make it more democratized, so the business users can get answers that perhaps their BI tool isn't able to provide, or that it does provide but they can now get more simply, then you're gonna expand the value of the data that you already have. For every traditional data scientist, there are 50 or 60 business analysts out in the real world. So you're gonna democratize and get more value out of all of the data that you have. And then, in terms of trust, I heard Donald mention it: you want lineage. You want the ability to see, from your raw data, how it's leveraged all the way to your dashboards, and back. So that's the setup of where the demand is. I'd also say one of our cofounders put it to me nicely: about six, seven years ago, everyone was shouting data, data, data, and we would whisper, but you should check out AI, it's kinda helpful. And today everyone seems to be shouting AI, and we wanna keep reminding people that to get the best AI insights, you're gonna need that solid underlying foundation of your data. You need good data to get good insights, and that's where Databricks fits in. This is a little overview of our company. If we can go to the $14 billion investment: very happy that in December we raised a $10 billion, quite large, round, which shows the interest in this market and the growth. We have over 10,000 global customers, and we're very much centered around open source. Apache Spark a lot of people have heard of, and a lot of people who haven't been in touch with Databricks for seven years think we're the Spark company, but we've grown a lot since then. There's MLflow for machine learning and MLOps, Delta Lake, Unity Catalog for governing it all, and even our own LLM, DBRX. So you could see a lot of great analyst firms.
You know, Cindy, speaking next, came from this world: Gartner, Forrester, IDC, the three most reputable, and we're in those leader quadrants. And you could see we're the inventor of the lakehouse, and this was the first real paradigm shift after traditional databases and data warehouses. The challenge was that you would have a structured data warehouse, or a database, where you could put numbers and letters, very structured information. And then there was this whole other paradigm, the data lake, which was more volume and file based. You could have unstructured data: social media, which is big these days, videos, PDFs, Word docs. So these were two very different platforms with two governance frameworks, two access controls, and data being copied back and forth everywhere. So the lakehouse was formed, which was a nice play on the words data warehouse and data lake. And it was just that: a unifying platform to house everything. So that's really at the foundation of it all. You could take SQL, you could take Python, and access it all. And at the very end, you have the business intelligence, or even better, ThoughtSpot, solutions that, with a Databricks lakehouse underneath, really work the best. You want to have that good data. So that's where Databricks really blossomed. And now almost every single enterprise company has a lakehouse of some sort, and we're super excited about that, but we didn't wanna end there. So recently we introduced the data intelligence category with our Databricks Data Intelligence Platform. This really sprouted from the whole Gen AI and LLM wave, where you have your data being housed, but to get extra value out of it, you want to get intelligence. And intelligence could be in any of these different areas. Mosaic AI, an acquisition Databricks made for over a billion dollars, provides end to end AI for both generative and classical AI.
You have notebooks that offer a deep set of data science capabilities. Databricks SQL, at the serverless tier, offers the most performant data warehouse in the cloud. We also have AI/BI, which integrates deeply with Databricks SQL so you can easily extend business intelligence across the business. And then Lakeflow ensures that you can reliably ingest and transform all the data that the workloads require. So that's taking raw data and being able to track it all the way through the workflows. How do you merge all the data together? How do you govern it? How do you apply intelligence to it? And then how do you serve it out to solutions like ThoughtSpot? And it's all federated. So whether the data resides in any of the three major formats underneath, Delta Lake, Iceberg, or Parquet, it's all uniform, as well as multi cloud: Google, Azure, AWS. Then there's the governance: Unity Catalog, which we famously open sourced at our summit. Unity Catalog is the construct that provides unified governance for everything. That's not just your data; it could be your notebooks, it could be all of the different assets in there, it could be your large language models, anything that you register, dashboards, etcetera. All of it can be governed through Unity Catalog. So when you're in ThoughtSpot, for example, asking a question, you will know the full lineage out there. You will have access control, so the right people have access to the right information, not just in queries but in these chatbots and large language models. You as a company can be more confident that the right data is limited or granted to the right people, as well as auditing who asked what question. You know, Donald had mentioned he would look through what prompts people asked.
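The lineage idea, seeing how raw data is leveraged all the way to dashboards and back, can be sketched as a small graph walk. The asset names (a bronze/silver/gold layering and a dashboard) are invented for illustration; a catalog like Unity Catalog records these derivation edges automatically rather than requiring manual calls.

```python
LINEAGE = {}  # child asset -> list of parent assets

def record(child, parents):
    """Register that `child` was derived from `parents`."""
    LINEAGE.setdefault(child, []).extend(parents)

def upstream(asset):
    """Walk the graph backward to find an asset's raw sources."""
    sources = set()
    for parent in LINEAGE.get(asset, []):
        deeper = upstream(parent)
        sources |= deeper if deeper else {parent}
    return sources

record("silver.orders_clean", ["bronze.orders_raw"])
record("gold.sales_by_region", ["silver.orders_clean", "bronze.regions_raw"])
record("dashboard.exec_sales", ["gold.sales_by_region"])

# Every raw table feeding the executive dashboard:
print(upstream("dashboard.exec_sales"))
```

The same graph answers both directions: walking backward gives trust ("where did this dashboard number come from?"), and inverting the edges would give impact analysis ("which dashboards break if this raw table changes?").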
So that's the auditing: a list of everything that happens in your environment, and also things that didn't happen. What data do you have that's not being discovered or leveraged? And then, speaking of discovery, with the catalog aspect you will have a better grasp of everything in your environment through LLMs. So if you ask what tables have information on sales and you have 8,000 tables, Databricks will more intelligently help you discover them; if you are looking for sales in Europe versus sales in South America, those might be two different tables. That's part of the data intelligence aspect: if you wanna find data, Databricks will more intelligently find where the data is. And then cost controls, an unsaid part of this new Gen AI: without controls, you can ask a question, and it goes off and does some computationally heavy interrogation and costs thousands of dollars. But if you have controls, you can limit that, or at least better understand what money you're spending. And most importantly, it's all in the business semantics. One part of this fast growth is Databricks SQL. It's built on a lakehouse architecture, and we've had phenomenal growth, up about 80%, something like $600 million in revenue from zero in just a couple of years. So this is our warehouse; it helps reduce costs and scale out, not just to billions of records like when I was with Oracle. Now you're talking trillions of records, petabytes of data daily, for single customers we're seeing, and it helps improve your performance and your collaboration. And finally, this is all built on open formats: sharing data, sharing assets, the whole ecosystem. So Databricks supports ThoughtSpot. It supports the whole BI ecosystem even better than your classic warehouse. And with that, let me hand it back to Andrew to introduce Cindy. Thank you so much, Ari. That was a great presentation.
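Ari's table-discovery point, finding the right table among thousands from a natural language request, can be sketched with simple word overlap over catalog metadata. The catalog entries below are invented; Databricks' data intelligence uses an LLM over Unity Catalog metadata rather than this keyword matching.

```python
# Invented catalog metadata: table names, comments, and column names.
CATALOG = {
    "sales_eu":  {"comment": "sales orders for europe",
                  "columns": ["order_id", "country", "amount"]},
    "sales_sa":  {"comment": "sales orders for south america",
                  "columns": ["order_id", "country", "amount"]},
    "hr_people": {"comment": "employee records",
                  "columns": ["emp_id", "name", "salary"]},
}

def discover(request):
    """Rank tables by word overlap between the request and metadata."""
    words = set(request.lower().split())

    def score(name):
        meta = CATALOG[name]
        text = set((name + " " + meta["comment"]).lower().split())
        text |= set(meta["columns"])
        return len(words & text)

    return sorted(CATALOG, key=score, reverse=True)

print(discover("which tables have sales in europe"))
```

Even this crude overlap separates the European sales table from its South American sibling; the point of putting an LLM on top is to handle requests where the user's words don't literally appear in any table's metadata.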
And now it's my pleasure to introduce our next guest speaker, Cindy Howson with ThoughtSpot. Cindy is the Chief Data Strategy Officer at ThoughtSpot. She is an analytics and BI thought leader and expert with a flair for bridging business needs with technology. As Chief Data Strategy Officer at ThoughtSpot, she advises top clients on data strategy and best practices for becoming data driven, influences ThoughtSpot product strategy, and interviews the industry's top data and analytics leaders on The Data Chief podcast. Cindy was previously a Gartner research vice president, the lead author for the Data and Analytics Maturity Model and the Analytics and BI Magic Quadrant, and a popular keynote speaker. Prior to joining Gartner, she was the founder of BI Scorecard, a resource for in-depth product reviews based on exclusive hands-on testing, contributed to InformationWeek, and authored several books. Please welcome Cindy, and I will hand it over to you now. Thank you, Andrew, Donald, Ari. Maybe I should also mention I was a TDWI faculty member for a number of years. And I do feel like so many things are finally coming together in our industry. I was writing about search as a way of democratizing insights way back in 2007, in one of my books on the industry, so it's nice to actually see it happening now. Seventeen years, maybe, before it became mainstream. So I'll start with who ThoughtSpot is and how it works, and then show it to you in action, building on some of the points that Donald and Ari have made. ThoughtSpot is your trusted intelligence platform. As Donald mentioned, we have challenges with Gen AI as an industry: hallucinations, poisoning the well, so to speak. Because we've been working on this for more than a decade now, ThoughtSpot is a product providing you 100% accuracy with no hallucinations on all your data.
We were named a Gartner Magic Quadrant leader, and we're also highly rated as a customers' choice on Peer Insights and on G2 Crowd and some other review sites, and we're a trusted Databricks partner. These are some of the companies that we support. Our mission is to make the world more fact driven, and our vision is to be the simplest, most trusted part of your ecosystem, an integrated provider of production grade AI generated analytics. It is a full platform, which I'll show you a few things about. And I also want you to think about where we have been. What are the data analysts bogged down by? If they are still just answering the basic questions, the descriptive and diagnostic ones, and business users are having to wait three weeks, or even, as one told me, three months, that is far too slow in our fast paced digital economy. In fact, one financial services customer shared with me how they are decommissioning one of their legacy BI tools, repurposing those hundreds of dashboard developers, and having them finally work on predictive and prescriptive analytics and generative AI chatbots and use cases. We really want to leverage the data analysts and the data scientists to work on higher value things, always aligning to the business outcomes. It's not about the tech; it's about the business outcomes. Spotter: we actually launched this in October. It is a substantial second generation of the first product that we launched in May 2023, the generative AI enhanced ThoughtSpot Sage. So this is any data, any question. Donald Farmer said I'm not going to get invited to his dinner table if I only ask a simple question. I want an invitation to that table, Donald Farmer, as long as you're not serving haggis, but that's a different discussion. We adapt to your industry based on your data, and it's actionable insights anywhere you work, with no dead ends, and enterprise grade trust that we provide by integrating with other products in the ecosystem.
If you want to use Unity Catalog, you can use that. If you don't have a catalog, we give you an analytic catalog that's baked in. But a demo is worth a thousand slides, so let me go ahead and share my screen and show you what this looks like in action, hopefully. Okay. And, Andrew, okay, great, I am seeing it. I was going to say, Andrew, speak up; when I get into demo mode, I kind of get going and I'm not watching what's broadcasting. So this is what the traditional ThoughtSpot experience looks like. I could also start this in an embedded app if I'm working in Salesforce or ServiceNow, for example, or a branded portal, which is what a number of our customers use as well. I have some of my favorite live boards up here. If I know some of the questions I want to ask, I can just dive straight into Spotter and start asking those questions. If I feel a little overwhelmed, it will profile your data and propose a few questions. I also start my day looking at my watch list, which will give me the KPIs over defined periods. So what I'm going to do, I'm going to assume I'm a merchandiser. Now, some people might say, Cindy, this looks like a dashboard. But we call it a live board because every widget on this page is interactive. If I want to drill into this, I can. Or I can generate outliers or clustering algorithms, and I could be calling a machine learning model that somebody built in Databricks or another platform to generate these insights, or add a comment: hey, what happened here? This looks great. But really, the KPI that I'm concerned about is, whoa, what happened that sales were down 43% in November? That is not good. So I'm going to go ahead and launch this interactive experience. It preserves the context, the context windows that Donald was talking about. And so here, I have sales monthly, but maybe I want to say, and let me scroll up a little bit, I'm gonna say, how are the top seven states performing?
And so it will take that first context, and already it's showing me some smarts. It's displaying it as a geo map because I brought in states. If I wanted to see the tabular data, the actual numbers, I could hover over this, or I could easily switch it to a table. Every answer is highly interactive. So, again, I can go ahead and drill down here and maybe explore it by product as well. So I see that California is doing well, but maybe I want to look at, well, what were the worst performing? And so I'm talking in natural language. "How about" is a natural phrase: how about the worst performing seven states? And performing, as Donald mentioned, because we have the context that this is sales dollars from our live board, it keeps that. If I wanted to train it, if I'm an inventory manager, I also could edit this and say, I want to go by Idaho, maybe. Okay, that kind of makes sense. Missouri, where I was born, yeah, that's a shame. I don't know what's going on. And, well, I now live in Delaware, so that should be doing better. What's going on here in Maryland? Now, there's also a couple things I could do. I started with just certain months, but I could say, show me this for the past three years. And so I can broaden it out so I'm not just looking at particular months. And I don't want the bottom seven anymore; I want all states. So now I have a more holistic view of the sales. And Donald also referred to the tokens. So I have my natural language here, but it's showing me these tokens. That way, I have the trust. Now, as one industry analyst said to me, Cindy, it's easy to do a demo; it's harder to actually build a product. And as you evaluate options on the market, I want you to pay attention to the robustness of the questions as well as this look and feel: how you can start anywhere with no dead ends. Because the other thing we can do is create formulas on the fly.
So I can say, show me the percentage difference from 2023 to 2024. And now I really wanna proof this a little bit, so I'm gonna wanna look at the tabular view. And you see it's added this percentage difference in sales. So as a business person, I can look at this and say, well, these differences were not so alarming, but I'm gonna sort this by descending. And I'm gonna be calling the store managers, or, sorry, I should have done ascending there. And I'm gonna be calling those store managers that had the biggest drop in sales. And so, Delaware, gonna have to call that local person. Now, for those of you that really like to geek out on the SQL, this is part of what we think reinforces the trust: you have full visibility into this. So I can actually go in and look at the SQL that was generated. I wanted to show you the actual SQL. And you see it's creating these case statements, and this is all your data in Databricks. So no dead ends, no pre-aggregation, no subsetting the data. In a modern cloud world, we would say leverage all the data for the best insights. So I'll stop sharing, and I want to, okay, are you seeing the slides? Yes, we are. The screen share has ended. Okay. Great. Thank you. And so, as I mentioned, the insights that you saw there are quite robust, and you can experience that in the full ThoughtSpot platform. Agentic AI is a top trend in our industry and something I wrote about in our trends report. So as Donald shared, the top use case from the TDWI survey was also customer experience. You could call Spotter in kind of a headless way from a customer experience agent to say, hey, we're coming up to the Super Bowl. This swag for the Super Bowl, hopefully it'll be the Packers, is selling out in this state. Tell me the stores where I have inventory on hand, and it will call the Spotter agent and loop that back inside the customer experience. And then also creating data apps.
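[Editor's note] The on-the-fly formula and the generated CASE-statement SQL described in the demo can be sketched roughly in Python with pandas. This is a minimal illustration with made-up sample figures, not ThoughtSpot's actual generated SQL or data:

```python
import pandas as pd

# Made-up sample data standing in for the demo's sales-by-state figures.
sales = pd.DataFrame({
    "state": ["Delaware", "Maryland", "California",
              "Delaware", "Maryland", "California"],
    "year":  [2023, 2023, 2023, 2024, 2024, 2024],
    "sales": [120_000, 95_000, 400_000, 90_000, 88_000, 430_000],
})

# Pivot the years into columns -- roughly what the generated SQL's CASE
# statements do, e.g. SUM(CASE WHEN year = 2023 THEN sales END).
by_year = sales.pivot_table(index="state", columns="year",
                            values="sales", aggfunc="sum")

# The formula created on the fly: percentage difference, 2023 to 2024.
by_year["pct_diff"] = (by_year[2024] - by_year[2023]) / by_year[2023] * 100

# Sort ascending so the biggest drops come first -- those are the
# store managers to call.
print(by_year.sort_values("pct_diff"))
```

With these sample numbers, Delaware shows the largest drop and sorts to the top, mirroring the "sort ascending, call the worst performers" step in the walkthrough.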
This is what some of our best financial services and retail customers are doing to create a supplier portal. As Ari mentioned, these are some of the few customers that we're allowed to speak about: Albertsons, or Unilever, think Ben and Jerry's ice cream or Magnum ice cream among other great products, and Fabuwood. They make great kitchen cabinets; their home base is New Jersey. So it's start anywhere and no dead ends. We would call this a full intelligence platform. Wow. What a session that was. Thanks for joining us today, folks. Hopefully, you found it insightful, and hopefully, it helps all of you take your data strategy to the next level. But we're not stopping there. Actually, tomorrow is the highlight of the week. We've got Tom Cronin, the head of data from EasyJet Holidays, alongside our director of customer engagement, Maria Shale, talking about EasyJet Holidays' journey with AI, how they have deployed AI powered analytics to every single person in their business to help drive their business performance forward. It's gonna be a cracker. And then to close out the week on Thursday, we've got Ricky and Aman from ThoughtSpot talking through the methodologies and frameworks that you can take away and implement in your business to achieve the same success that EasyJet have. You really won't wanna miss it, so I'm looking forward to seeing you all there. Thanks. Bye bye.