Google PaLM
If you’ve been on social media in the last 4 months, you’ve heard of GPT-3, the language model from OpenAI, but you probably haven’t heard of Google PaLM yet. The Google Research team has published the Pathways Language Model, also known as PaLM, and trained it on multilingual datasets that include Wikipedia as well as data scraped from the web.

What can it do?
It can complete many NLP tasks, such as:
- Code Generation
- Math Word Problems
- Explaining Jokes

How does it work?
You may not have heard of chain-of-thought prompting. It lets you describe a multi-step problem as a series of intermediate reasoning steps. Normally, you prompt the model with plain question-and-answer pairs, but the problem is that the model will sometimes get the question wrong.
With chain-of-thought prompting, you still give the model a question-and-answer pair, but the example answer also includes an explanation of the reasoning. When you then ask the model a new question, you are far more likely to get the correct answer (depending on the question), along with an explanation in the output, as shown in the sketch below.
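Here is a minimal sketch of the difference, using the well-known tennis-ball example from the chain-of-thought prompting literature; the exact wording is illustrative, and you would send either string to whatever text-generation model you have access to:

```python
# Standard few-shot prompt: the example shows only a question and a bare answer.
standard_prompt = """Q: Roger has 5 tennis balls. He buys 2 more cans of tennis balls.
Each can has 3 tennis balls. How many tennis balls does he have now?
A: The answer is 11.

Q: The cafeteria had 23 apples. If they used 20 to make lunch and bought 6 more,
how many apples do they have?
A:"""

# Chain-of-thought prompt: the example answer also spells out the intermediate
# reasoning steps, nudging the model to reason step by step before answering.
cot_prompt = """Q: Roger has 5 tennis balls. He buys 2 more cans of tennis balls.
Each can has 3 tennis balls. How many tennis balls does he have now?
A: Roger started with 5 balls. 2 cans of 3 tennis balls each is 6 balls. 5 + 6 = 11.
The answer is 11.

Q: The cafeteria had 23 apples. If they used 20 to make lunch and bought 6 more,
how many apples do they have?
A:"""

print(cot_prompt)  # this is the string you would send to the model
```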
PaLM uses the standard Transformer architecture in a decoder-only setup, with a few modifications, including SwiGLU activations, multi-query attention, and RoPE (rotary position) embeddings.
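To make the SwiGLU part concrete, here is a toy NumPy sketch of a SwiGLU (Swish-gated linear unit) feed-forward block; the dimensions and random weights are purely illustrative and far smaller than PaLM's:

```python
import numpy as np

def swish(x):
    # Swish / SiLU activation: x * sigmoid(x)
    return x / (1.0 + np.exp(-x))

def swiglu_ffn(x, W, V, W_out):
    # SwiGLU feed-forward block: one projection is passed through Swish and
    # gates a second projection, then the result is projected back down.
    return (swish(x @ W) * (x @ V)) @ W_out

# Toy sizes for illustration only.
d_model, d_ff = 8, 32
x = np.random.randn(4, d_model)          # 4 token embeddings
W = np.random.randn(d_model, d_ff)
V = np.random.randn(d_model, d_ff)
W_out = np.random.randn(d_ff, d_model)

print(swiglu_ffn(x, W, V, W_out).shape)  # (4, 8)
```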

What makes PaLM so special?
PaLM has been trained on GitHub code, Wikipedia, books, and more, and across multiple languages. That multilingual training makes it easier for businesses to support customers in their native language, and PaLM performs translation and question answering (Q&A) very well.
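As a rough illustration of what translation and Q&A prompts might look like, here is a hedged sketch; `generate` is a hypothetical placeholder for whichever text-generation call your model of choice exposes, not a real PaLM API:

```python
# Hypothetical placeholder: wire this up to your model's text-generation call.
def generate(prompt: str) -> str:
    raise NotImplementedError("plug in your model's generation call here")

translation_prompt = (
    "Translate the following customer message from German to English:\n"
    "German: Wo ist meine Bestellung?\n"
    "English:"
)

qa_prompt = (
    "Answer the question using the context below.\n"
    "Context: Orders ship within 2 business days.\n"
    "Question: How long does shipping take?\n"
    "Answer:"
)

# answer = generate(translation_prompt)  # uncomment once generate() is implemented
```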

CONNECT
Have a great week everyone!
Follow me on Twitter, LinkedIn, Medium, and AIapplicationsblog.com
Prepare for your next job application with this Cover Letter Generator!