AI in 2024: Hot Takes From Dataiku, Deloitte, & Snowflake

A Coming Era of Compute Cost Volatility
Florian Douetteau
CO-FOUNDER AND CEO, DATAIKU
Florian Douetteau is the Chief Executive Officer and co-founder of Dataiku, the platform for Everyday AI, enabling data experts and domain experts to work together to build AI into their daily operations, from advanced analytics to Generative AI. More than 600 companies worldwide use Dataiku, driving diverse use cases from predictive maintenance and supply chain optimization, to quality control in precision engineering, to marketing optimization, Generative AI use cases, and everything in between.
To underscore how AI will become increasingly essential to businesses in every sector, I thought the point was best summarized by Mustafa Suleyman, co-founder and CEO of Inflection AI and co-founder of DeepMind (which has since been acquired by Google), in an article for TIME entitled "How the AI Revolution Will Reshape the World."1

If this statement is not evidence enough that the ability to augment and automate business processes and business decisions with AI will increasingly become a core capability of every business, this might: According to Bank of America Global Research and IDC, global revenue associated with AI software, hardware, services, and sales will likely grow.2

1 https://time.com/6310115/ai-revolution-reshape-the-world/
2 https://business.bofa.com/en-us/content/economic-impact-of-ai.html#footnote-2
The GPU Advantage in Mastering LLMs
As a technical refresher, GPUs are computer processors designed to accelerate the calculations necessary to render images and videos on a display. Originally developed for rendering graphics in video games, GPUs have evolved into highly parallel processors capable of handling a large number of calculations simultaneously.

This parallel processing capability makes GPUs well suited for certain types of computations, such as those involved in machine learning (ML), deep learning, and AI. Before GPUs came on the scene in the late 1990s and early 2000s, Central Processing Units (CPUs) performed the calculations necessary to render graphics. While both GPUs and CPUs are essential components of modern computing systems, they are optimized for different types of tasks.

LLMs are trained on large datasets and can generate human-like text based on the inputs they receive in a process known as inference. This process is most efficiently done on GPUs, which are adapted for parallel processing (the architecture of these models allows for the simultaneous processing of multiple tokens or words). The parallel processing capabilities and computational power of GPUs are instrumental in the training and deployment of LLMs; they contribute to faster training times and more efficient inference.

So, for organizations aiming to achieve Everyday AI (building Generative AI as well as traditional ML and advanced analytics into the processes and operations throughout their business), this has implications: The costs associated with GPUs will exhibit significant variability and prove challenging to anticipate, primarily influenced by the fundamental dynamics of supply and demand.

That demand will increase (64% of senior AI professionals are "Likely" or "Very Likely" to use Generative AI for their business over the next year, according to a Dataiku and Databricks survey), though we don't know by how much or how quickly. Supply is uncertain, but will undoubtedly be influenced by manufacturing capacity within the semiconductor supply chain (which includes the production of GPUs and is concentrated in a few locations). In turn, manufacturing capacity may be influenced by global events such as natural disasters or geopolitical events.
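To make the parallelism point concrete, here is a minimal sketch (assuming PyTorch and an optional CUDA-capable GPU; the matrix size and timings are illustrative, not benchmarks) comparing the same matrix multiplication, the core operation inside LLM training and inference, on CPU and GPU:

```python
import time
import torch

def time_matmul(device: str, n: int = 4096) -> float:
    """Time a single n x n matrix multiplication on the given device."""
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)
    if device == "cuda":
        torch.cuda.synchronize()  # make sure setup is done before timing
    start = time.perf_counter()
    _ = a @ b
    if device == "cuda":
        torch.cuda.synchronize()  # GPU kernels run async; wait for completion
    return time.perf_counter() - start

print(f"CPU: {time_matmul('cpu'):.3f}s")
if torch.cuda.is_available():
    print(f"GPU: {time_matmul('cuda'):.3f}s")
```

On typical hardware the GPU run is often one to two orders of magnitude faster for matrices this size, which is precisely the advantage that makes GPU cycles so sought after.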
Where compute cost volatility differs is that companies that are not used to this kind of work will need to do it now, as GPU cycles become critical to their operations. For example, financial services companies or pharmaceutical companies don't usually engage in energy and shipping trading, but they will need to build such practices for GPU compute. This evolution of compute cost dynamics leads to several consequences:

#1 Companies will need the ability to analyze and assess the tradeoffs between cost and quality of output to strike the most effective balance. This will require a nuanced understanding of the variables at play, ensuring that computational resources are deployed efficiently (a minimal sketch of such a cost-quality routing rule follows this list).

#2 Companies will need the ability to change between models and service providers to adapt their cost posture in the face of variable costs.

#3 Organizations will look to shift flexible workloads to capture lower costs during periods of lower demand.

#4 Technologies that optimize models for specific use cases will become more popular, delivering longer-term cost benefits. Organizations are increasingly seeking solutions that enhance the efficiency of their GPU utilization, aligning computational resources with the unique requirements of their applications.
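To make the first two consequences concrete, here is a minimal, hypothetical routing sketch: pick the cheapest model or provider that still clears the quality bar a task requires. The model names, prices, and quality scores below are invented for illustration, not vendor figures:

```python
# Hypothetical catalog: price per 1K tokens and a benchmark-style quality
# score. All numbers are illustrative assumptions, not real vendor pricing.
MODELS = [
    {"name": "small-llm",  "usd_per_1k_tokens": 0.0005, "quality": 0.70},
    {"name": "medium-llm", "usd_per_1k_tokens": 0.0030, "quality": 0.82},
    {"name": "large-llm",  "usd_per_1k_tokens": 0.0200, "quality": 0.93},
]

def route(min_quality: float) -> dict:
    """Return the cheapest model whose quality score meets the task's bar."""
    eligible = [m for m in MODELS if m["quality"] >= min_quality]
    if not eligible:
        raise ValueError("No model meets the requested quality bar")
    return min(eligible, key=lambda m: m["usd_per_1k_tokens"])

# Routine summarization can tolerate lower quality; legal drafting cannot.
print(route(0.65)["name"])  # -> small-llm
print(route(0.90)["name"])  # -> large-llm
```

Because the catalog is just data, repricing a provider or swapping one in or out changes the routing decision without touching application code, which is the posture shift the list above describes.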
The GPU Paradigm Shift: How Dataiku Can Help

The ability to scale horizontally and vertically enables Dataiku to accommodate varying workloads and demands. This scalability enables organizations to adapt to changing compute requirements without major disruptions, thus helping to manage costs effectively.

Next, efficient data management and feature engineering are essential for building high-performing ML models. Dataiku provides tools for managing and processing data, contributing to the creation of optimized datasets that can lead to more efficient model training, potentially reducing overall compute costs. Further, the collaborative environment of Dataiku helps teams avoid duplicated work and, with it, unnecessary compute costs.

With the ability to integrate with leading cloud platforms, users can take advantage of cloud services with elastic computing capabilities, paying only for the resources they consume. In times of compute cost volatility, the ability to scale up or down in the cloud can be a cost-effective strategy. Plus, organizations can capitalize on cloud-specific features such as auto-scaling to optimize compute costs and ensure seamless deployment and scalability of machine learning models in cloud environments; a toy sketch of this scaling logic follows below.
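As a toy illustration of the auto-scaling idea (not Dataiku's or any cloud provider's actual API; the thresholds and names are assumptions), the core logic is simply to size capacity to demand within cost guardrails:

```python
def desired_workers(queued_jobs: int, jobs_per_worker: int = 4,
                    min_workers: int = 1, max_workers: int = 20) -> int:
    """Size a worker pool to current demand, within cost guardrails.

    Scaling down in quiet periods is what converts elastic capacity
    into savings when compute prices or workloads are volatile.
    """
    needed = -(-queued_jobs // jobs_per_worker)  # ceiling division
    return max(min_workers, min(max_workers, needed))

print(desired_workers(0))    # -> 1   (scale in when idle)
print(desired_workers(30))   # -> 8   (scale out under load)
print(desired_workers(500))  # -> 20  (cap protects the budget)
```

Real auto-scalers add smoothing and cooldown periods, but the cost lever is the same: release capacity the moment demand no longer justifies it.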
To build trust in AI projects and programs, Dataiku enables organizations via teamwork and transparency in a single, centralized workspace, explainability and safety guardrails, and robust AI Governance and oversight.

Conclusion

Particularly in the age of Generative AI, teams will continue to generate and accumulate more data, and thus the need for computational power to process and analyze this data will increase. So, while organizations should be mindful of potential compute cost volatility, it shouldn't deter them from continuing to scale ML and AI initiatives.

By adopting a strategic and measured approach (such as with Dataiku), organizations can harness the benefits of these technologies while managing and optimizing computational resources effectively.
The age of AI is upon us with the dawn of a broad spectrum of Generative AI capabilities available to consumers. But much like the technologies themselves, trust in these paradigm-shifting technologies will not be built in a day.

As with prior cutting-edge tech, AI is greeted with a healthy skepticism from both organizations and consumers seeking assurances on the privacy and security of their data.

The question of "Whom can I trust?" is one we ponder when assessing another's character, intent, or motivations. Personal trust is carefully cultivated over time; it requires consistency, familiarity, and prioritization of meeting that individual's needs.

So, when faced with a technology solution that obfuscates the "how" and "why" of its outputs while introducing questions like "Is my data the product?", AI users must prioritize establishing and continually proving trustworthiness.

At Deloitte, we work to understand the risks, rewards, utility, and trustworthiness of AI so that we may help clients across industries and government.

Human trust in AI is an uphill climb. The organizations that focus on building AI solutions with trust by design may claim an advantage in the marketplace.

1 Deloitte's TrustID Generative AI Analysis. August 2023.
"Forward-thinking [...] and technological [...] require a rebalancing of accountabilities and responsibilities across the organization."

AI failures can range from misbehaving bots to intellectual property infringement to data exfiltration. Understanding and accepting that AI control mechanisms may fall short of human expectations can help limit the consequences of those failures and formulate responsible planning and response. Waymo, for example, reports that the company's cars experience 76% fewer accidents involving property damage compared to human-driven cars.2 But comparative statistics alone do not shape public perception, and the perceived risk of an AI solution often pales in comparison to its perceived reward for widespread adoption.

Leadership will increasingly find itself answering for AI incidents similarly to cyber events or regulatory noncompliance. Management that cannot or does not effectively speak to the organization's use of AI could risk fostering a culture of nonaccountability.

To earn trust from data subjects, organizations will need to both comprehensively understand and communicate their AI uses, not simply for regulatory compliance purposes, but to keep the lines of communication open with crucial stakeholders inside the organization and out.
Real-World Case Study of the Waymo One Service
[Platform diagram: Dataiku's elastic architecture built for the cloud, spanning Governance, DataOps & MLOps, Analytic Apps, and Generative AI]