

The first improvement is a serious language boost. PaLM 2 is meant to serve as the building block for Google's AI products: a large general-purpose LLM with expanded reasoning, language, and coding capabilities. While Google has not disclosed how many parameters PaLM 2 has - its predecessor, PaLM, had 540 billion - the company promises improved capabilities along with faster, more efficient performance. According to Google, PaLM 2 has already been quietly powering Google Bard, with many more integrations planned. Google is setting up PaLM 2 to be the foundational model behind its future AI products.
