Unless it’s your full-time job, you probably don’t know how fast AI is improving. Let’s look at the incredible advancements that took place just last week.
On Monday, Stanford introduced Alpaca 7B, a language model fine-tuned from Meta's LLaMA 7B on 52,000 instruction-following examples that were themselves generated with one of OpenAI's GPT-3 models. Because it is so lightweight, Alpaca can run on a local computer. In Stanford's evaluations, it performed almost as well as much larger models like GPT-3 and ChatGPT. This development shows that in the near future, we will be able to run AI chat models locally on any computer, without the need for an internet connection.
On Tuesday, Google announced that it will be bringing AI functionality to its Workspace tools, including GPT-style writing assistance inside Google Docs and Gmail, with plans to roll the features out to Google Sheets, Google Slides, and Google Meet as well. This move threatens to make many AI-based companies obsolete, since they are building tools similar to what Google is now shipping natively. Google also announced the next generation of AI for developers in Google Workspace, releasing the PaLM API to select developers so they can start building on top of PaLM, Google's large language model.
Anthropic, a company that Google itself has invested in heavily, introduced Claude, its own chatbot. Claude appears to be available only through an API, so it is being used behind the scenes in tools like Poe from Quora and the AI built into Notion.
The biggest news of the week was Tuesday's release of GPT-4, which OpenAI made available to play with inside ChatGPT. The outputs people were getting from GPT-4 in ChatGPT were clearly different from, and better than, those of previous versions. GPT-4 supports much longer context in messages, and as a multimodal model it can accept visual inputs: it can look at an image, decipher what's in it, and use it as additional context. This breakthrough is significant because it shows that AI language models are continuing to advance and will likely have many practical applications in the future.
On Wednesday, Midjourney made two big announcements during its live office-hours event. The first was the launch of Midjourney magazine, a curated collection of images from the community, along with interviews and content around generative AI. The second, and more significant, was the launch of Midjourney version 5. The new version creates much more realistic images and does a far better job with hands, though its images are still not always perfect. It also adds a tiling feature that produces images whose edges line up perfectly with each other, even when the image is repeated. Image weights are also back in this version, and the way users prompt the software has changed completely: Midjourney now wants users to prompt in full sentences and talk to it like a chatbot, much like ChatGPT, which should help it better understand normal, natural language.
On Thursday, Microsoft announced Microsoft 365 Copilot, which adds AI across its suite of tools: users will soon have AI inside Microsoft Word, Excel, PowerPoint, Outlook, and Teams. Microsoft also introduced Business Chat, which pulls together data from all of these places and lets users chat with it, searching across them to answer questions. In effect, it gives each user a personal chatbot over all the information inside their Microsoft 365 suite of tools.
Also on Thursday, Baidu released its rival to ChatGPT, called Ernie. Unfortunately, the release did not go as planned: during the one-hour presentation, the company showed only pre-recorded, carefully curated responses rather than a live demo of what Ernie can do. Investors were unimpressed, and Baidu's shares slumped by 10 percent. It doesn't look like Baidu will be competing with Google or Microsoft anytime soon.
This article was written automatically by ChatGPT from the transcript of the video above.