The vast majority of current GenAI projects will fail, not because of inherent flaws in large language models (LLMs), but because of misconceptions about how to use them and the lack of capabilities needed to successfully design, develop, and operationalize GenAI-driven applications.
At Data Summit 2024, Kjell Carlsson, head of AI strategy, Domino Data Lab, presented his session, “Shatter the Seven Myths of GenAI to Operationalize Impact,” debunking the most harmful myths that set projects up for failure.
The annual Data Summit conference returned to Boston, May 8-9, 2024, with pre-conference workshops on May 7.
He also looked at case studies of how advanced AI teams in industries ranging from pharma to food delivery are shattering these myths and delivering transformative outcomes.
“I get to speak to lots and lots of companies about how they use generative AI,” Carlsson said.
He explained that drug discovery, customer service, and productivity use cases are seeing success with generative AI.
The common characteristics of impactful GenAI case studies span people, processes, and technology: these companies tap into the broad ecosystem of GenAI and ML technologies, backed by advanced MLOps capabilities.
Carlsson predicted that 90% of GenAI initiatives will fail to deliver transformative value.
“We’re all going to be using GenAI but it’s a question of doing something transformative, going to the core of the business,” Carlsson said. “I’m worried we’re setting ourselves up for a lot of failure here.”
According to Carlsson, some AI myths to ignore include:
- AI becoming sentient
- AI becoming omniscient
- AI causing mass unemployment
- AI leading to human extinction
Several AI myths worth debunking include:
- GenAI replaces predictive AI/ML: The reality is GenAI complements predictive AI/ML
- Bigger is better: The reality is bigger is slower, more expensive, and generic
- GenAI is all about the LLM: The reality is everything else is just as important
- GenAI isn’t data science: The reality is data science is the best foundation for GenAI
- Responsible AI is someone else’s problem: In reality it is everyone’s problem
- You can outsource GenAI: The reality is in-house capabilities are key to AI transformation
- You can ignore AI and ML: The reality is you should have started ages ago
“It’s often helpful to think about [models] as pipelines,” he said. “Connecting multiple components is where the magic happens.”
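Carlsson’s session didn’t include code, but the pipeline framing is straightforward to sketch: each stage is a plain function, and the GenAI application is their composition. The sketch below is purely illustrative; every name is hypothetical, and `call_llm` is a stub standing in for a real model API.

```python
# A minimal sketch of the "models as pipelines" idea: each stage is a plain
# function, and the application is just their composition. All names here are
# illustrative; call_llm is a stub standing in for a real model API.

def retrieve_context(query: str) -> str:
    """Stand-in for a retrieval step (vector search, keyword lookup, etc.)."""
    toy_knowledge = {"drug discovery": "Internal notes on candidate screening."}
    return toy_knowledge.get(query.lower(), "No matching documents found.")

def build_prompt(query: str, context: str) -> str:
    """Combine the user query and retrieved context into a single prompt."""
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

def call_llm(prompt: str) -> str:
    """Placeholder for an actual LLM call."""
    return f"[model response to a {len(prompt)}-character prompt]"

def postprocess(raw_answer: str) -> str:
    """Validate and format model output before returning it to the caller."""
    return raw_answer.strip()

def pipeline(query: str) -> str:
    # Connecting the components is "where the magic happens."
    return postprocess(call_llm(build_prompt(query, retrieve_context(query))))

print(pipeline("drug discovery"))
```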
He recommended designing your strategy for real-world GenAI impact:
- Empower experienced data product and data science teams to lead your GenAI projects
- Implement iterative, unified processes for developing, testing, and operationalizing GenAI and predictive AI pipelines
- Implement open, extensible capabilities that span the AI ecosystem and accelerate, streamline, and govern the AI lifecycle
“Expect the unexpected,” Carlsson said. “This isn’t optional. This is the best bet we have for driving performance and value.”
In this era where AI is reshaping industries, the integration of large language models (LLMs) like ChatGPT with private knowledge platforms is a groundbreaking development.
Clive Smith, chief revenue officer, sales, Datavid Limited, and Tim Padilla, director, sales and consulting, North America, Datavid Limited, discussed “Integrating LLMs With a Private Knowledge Platform” during their Data Summit 2024 session, which followed Carlsson’s.
They shared experiences and lessons learned from internal R&D and from benchmarking several LLMs with customers, as well as from the subsequent integration of those models with existing knowledge management (KM) platforms.
Datavid is a data consultancy that specializes in extracting business value from structured and unstructured data. Smith and Padilla help customers create a well-defined FAIR data strategy (Findable, Accessible, Interoperable, Reusable), which is foundational for building data-centric applications.
“What I’m seeing is a new generation of ‘data service providers’,” Smith said. “We treat generative AI as just another application. You have to get the experts to validate the results of the generative application you are building because they are the only people who can.”
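Smith and Padilla didn’t walk through code, but the integration pattern they described, grounding an LLM’s answer in documents from a private knowledge base and surfacing the sources so experts can validate the result, can be sketched roughly as follows. The knowledge base, the toy word-overlap search, and the stubbed model call are all placeholder assumptions, not Datavid’s implementation.

```python
# A sketch of grounding an LLM answer in a private knowledge base and
# returning the supporting sources, so that domain experts can validate the
# generative application's output. Everything here (the documents, the toy
# word-overlap search, the stubbed model call) is an illustrative assumption.

from dataclasses import dataclass

@dataclass
class Document:
    doc_id: str
    text: str

PRIVATE_KB = [
    Document("policy-001", "Refunds are processed within 14 business days."),
    Document("policy-002", "Enterprise contracts renew annually in January."),
]

def search(query: str, top_k: int = 1) -> list[Document]:
    """Toy relevance search: rank documents by word overlap with the query."""
    words = set(query.lower().split())
    def score(doc: Document) -> int:
        return len(words & set(doc.text.lower().split()))
    return sorted(PRIVATE_KB, key=score, reverse=True)[:top_k]

def call_llm(prompt: str) -> str:
    """Placeholder for a real LLM call against the assembled prompt."""
    return "[generated answer based only on the supplied context]"

def answer_with_sources(query: str) -> dict:
    docs = search(query)
    context = "\n".join(d.text for d in docs)
    prompt = f"Answer using only this context:\n{context}\n\nQ: {query}"
    # Returning source IDs alongside the answer lets the experts who own
    # these documents check the result.
    return {"answer": call_llm(prompt), "sources": [d.doc_id for d in docs]}

print(answer_with_sources("How long do refunds take?"))
```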
A unified data framework, Padilla explained, is an approach that eliminates data silos arising from structural and technological factors. It enables business agility, rapid delivery of innovative solutions, increased ROI, and reduced time to value.
According to Smith and Padilla, organizations should make four shifts:
- Adopt leaner, faster, and more agile operations
- Enable change across the organization for in-time insights
- Invest in next-gen technology innovation
- Harmonize tools and standards
“We come from a world of data,” Padilla said.
Many Data Summit 2024 presentations are available for review at https://www.dbta.com/DataSummit/2024/Presentations.aspx.