Google has introduced PaLM 2 (Pathways Language Model 2), the next generation of its large language model, with improved multilingual, coding, and reasoning capabilities.
For multilingual tasks, PaLM 2 was more heavily pre-trained on multilingual text spanning more than 100 languages, improving its ability to understand and translate nuanced text, including idioms, poems, and riddles.
In the coding realm, PaLM 2 was pre-trained on a large quantity of available source code datasets. The model “excels” at popular programming languages such as Python and JavaScript, Google said, but can also generate specialized code in languages such as Fortran, Prolog, and Verilog.
InfoWorld