We hosted our inaugural AI Center of Excellence meetup in New York, bringing together product and engineering leaders from over two dozen enterprises to discuss their experiences building applications with LLMs. Many thanks to our friends from BlackRock, Cisco, Citi, Cooley LLP, Credit Karma, CrowdStrike, Duolingo, Goldman Sachs, Instagram, MetLife, Morgan Stanley, Jefferies, JPMorgan Chase, Nielsen, Nutanix, Point72, Scale AI, S&P Global, and the University of Michigan for their contributions to our community.
Deploying AI is a strategic priority across the Fortune 500, and in 2024 the focus is on which applications will move from prototype to production to create real ROI. As more organizations embrace AI and integrate LLMs into their workflows, it is clear that transformative gains can be made once technical and operational challenges are overcome. Where are enterprises finding the most significant ROI? And where is the promise of AI not living up to the hype? It is often hard to tell without digging deep.
In 2023, OpenAI’s ChatGPT and Microsoft’s GitHub Copilot were the only pervasive tools in the enterprise, and much of the industry focused on building a pipeline of possibilities for AI in the enterprise. Entering 2024, every major software company is piloting or developing copilot capabilities for its existing products, and internal teams have dozens, if not hundreds, of potential use cases where LLMs and AI could automate or augment human work. Finding the applications that drive the strongest end-user adoption and financial return is now the key to unlocking budgets and organizational support for larger AI-enabled deployments. Building an enterprise-wide roadmap can help align stakeholders and prevent widespread experimentation that may never lead to commercial success.
The map above is a crowdsourced depiction of where AI is being deployed across our enterprise community: a subjective visualization of how our AI Center of Excellence members perceive the rate at which LLM use cases are adopted internally and the value delivered by each piloted application. Individual companies may have a completely different experience or viewpoint from the one depicted here, but many agree with the overall trends seen across the industry. Some commonly shared, and at times counterintuitive, observations:
Though there is always friction in finding budget and executive sponsorship, most enterprises are still accelerating their adoption and use of AI despite the technical and organizational hurdles. One of the larger speed bumps seen internally is meeting compliance and security requirements; navigating these concerns is new territory, and every organization must learn the emerging rules and regulations around AI. Stakeholders outside of product and engineering play an increasingly important role in the pace at which any AI application is deployed, and collaborating across these domains will be essential for the foreseeable future.
We created our AI Center of Excellence to bring together product and engineering executives from Fortune 500 companies, high-growth startups, and the major AI platform companies to facilitate knowledge sharing and collaboration across the industry. Our initial group of 35 members represents some of the largest and earliest adopters of LLMs in the industry, and we are excited to expand this group in the coming months.
As enterprises embark on their AI deployment journeys, they often encounter challenges similar to those faced by peers across the industry. Through this collaborative effort, we believe organizations of all sizes and sectors can benefit from the best practices and lessons learned in deploying AI. If you are a senior executive building with LLMs today, please reach out to join our community. We convene in physical and virtual forums throughout the year and will share insights regularly on this blog.
Thanks to all for participating in our first event - we look forward to seeing you soon!