Google kicked off its annual I/O conference today with a central focus on what it’s doing to advance artificial intelligence (AI) across its domain. (Spoiler alert: it’s about PaLM 2.)
Google I/O has long been Google’s premier developer conference, tackling a whole host of different topics. But 2023 is different: AI dominates almost every aspect of the event. This year, Google is trying to establish a leadership position in the market as rivals Microsoft and OpenAI bask in the afterglow of ChatGPT’s runaway success. The foundation of Google’s effort is its new PaLM 2 large language model (LLM), which will power at least 25 Google products and services detailed during sessions at I/O, including Bard, Workspace, Cloud, Security and Vertex AI.
The original PaLM (short for Pathways Language Model) was released in April 2022 as the first iteration of Google’s core LLM for generative AI. Google claims that PaLM 2 dramatically expands the company’s generative AI capabilities in significant ways.
“At Google, our mission is to make the world’s information universally accessible and useful. And this is a perennial mission that has taken on new meaning with the recent acceleration of AI,” Zoubin Ghahramani, vice president of Google DeepMind, said during a roundtable press conference. “AI is creating the opportunity to understand more about the world and make our products much more useful.”
Putting next-gen AI in the “palm” of developers’ hands with PaLM 2
Ghahramani explained that PaLM 2 is a next-generation language model that is good at math, coding, reasoning, multilingual translation, and natural language generation.
He stressed that it is better than Google’s previous LLMs in almost every measurable way. That being said, one way that previous models were measured was by the number of parameters. For example, in 2022, when the first version of PaLM was released, Google claimed that it had 540 billion parameters for its largest model. In response to a question posed by VentureBeat, Ghahramani declined to provide a specific figure for PaLM 2’s parameter size, only noting that counting parameters is not an ideal way to measure performance or capability.
Ghahramani instead said that the model has been trained and built in a way that makes it better. Google trained PaLM 2 on the latest Tensor Processing Unit (TPU), which is Google’s custom silicon for machine learning (ML) training.
PaLM 2 is also better at AI inference. Ghahramani noted that by combining computation, optimal scaling, and improved mixing of data sets, as well as improvements to model architectures, PaLM 2 is more efficient at serving models and has better overall performance.
In terms of core capabilities enhanced for PaLM 2, there are three in particular that Ghahramani mentioned:
Multilingualism: The new model has been trained on more than 100 spoken languages, allowing PaLM 2 to excel at multilingual tasks. Going a step further, Ghahramani said that it can understand nuanced sentences in different languages, including the use of ambiguous or figurative meanings of words instead of the literal meaning.
Reasoning: PaLM 2 provides stronger logic, common-sense reasoning, and mathematics than previous models. “We have trained on a large number of math and science texts, including scientific papers and mathematical expressions,” Ghahramani said.
Coding: PaLM 2 also understands, generates, and debugs code, and has been pretrained on more than 20 programming languages. Along with popular programming languages like Python and JavaScript, PaLM 2 can also handle older languages like Fortran.
“If you’re looking for help fixing a piece of code, PaLM 2 can not only fix the code, but also provide you with the documentation you need in any language,” Ghahramani said. “So this helps programmers around the world learn to code better and also collaborate.”
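As a concrete illustration of the fix-and-document workflow Ghahramani describes, here is a hypothetical example of our own (the bug, the correction, and the docstring are not actual PaLM 2 output):

```python
# Hypothetical illustration of the "fix the code and provide documentation"
# task described above; this is our own example, not PaLM 2 output.

# Buggy version a developer might submit: the loop stops one element
# early, so the last value is silently dropped from the sum.
def buggy_average(values):
    total = 0
    for i in range(len(values) - 1):  # bug: skips the final element
        total += values[i]
    return total / len(values)

# Corrected version, with the kind of documentation a code-capable
# model would be asked to supply alongside the fix.
def average(values):
    """Return the arithmetic mean of a non-empty sequence of numbers.

    Fix: the original loop iterated over range(len(values) - 1),
    which excluded the last element from the total.
    """
    return sum(values) / len(values)

print(buggy_average([2, 4, 6]))  # 2.0 (wrong: 6 was never added)
print(average([2, 4, 6]))        # 4.0 (correct)
```

The point of pairing the fix with a docstring is the collaboration angle Ghahramani raises: the explanation of *why* the code was wrong is what helps other programmers learn from the correction.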
PaLM 2 is a model that powers 25 Google apps, including Bard
Ghahramani said that PaLM 2 can be adapted to a wide range of tasks, and at Google I/O the company detailed how it supports 25 products that touch almost every aspect of the user experience.
Building on the general-purpose PaLM 2, Google has also developed Med-PaLM 2, a model for the medical profession. For security use cases, Google has trained Sec-PaLM. Google’s ChatGPT competitor Bard will now also take advantage of the power of PaLM 2, providing an intuitive message-based user interface that anyone can use, regardless of technical ability. Google’s suite of Workspace productivity apps will also get an intelligence boost, thanks to PaLM 2.
“PaLM 2 excels when you tune it on domain-specific data,” Ghahramani said. “So think of PaLM 2 as a general model that can be tuned to accomplish specific tasks.”