It is March 15th, 2023, and you’re watching The Code Report. OpenAI released GPT-4 yesterday, and I stand here in awe of its litness. It’s by far the most savage generative text model I’ve ever spoken to. I’m literally shaking right now, because I’m afraid I just became obsolete, but more on that later. It’s the successor to the GPT-3.5 model that powers ChatGPT, but it has a few new features that change everything.

Here are 7 things you need to know right now. First, GPT-4 stands for Generative Pre-trained Transformer, with the 4 representing the 4 horsemen of the apocalypse. You can try it out today if you’re a ChatGPT Plus member, but API access is behind a waitlist. Big clients are already using it in production, like Microsoft’s Bing Chat, Duolingo for language learning, and big banks to help them not collapse.

The second thing to know is that GPT-4 is smarter, which is described in detail in this paper. It passed the bar exam in the top 10%, unlike GPT-3.5, which scored in the bottom 10%. That’s great news for humanity, because it’s thinking like Shakespeare: first thing we do, let’s kill all the lawyers. Not only that, but it’s also acing AP exams, which is good enough to get you college credit.

When it comes to programming questions on LeetCode, it’s able to solve the easy ones, but still fails for the most part on the medium and hard questions. It’s basically where chess engines were in the early 90s: good chess players could still beat the engines back then, but 10 years later they didn’t have a chance. Third, GPT-4 can now handle 25,000 input words, compared to about 3,000 for GPT-3.5.

This is huge, because it means you can feed the AI more context relevant to the task at hand. It’s going to make me homeless, because now you can take the documentation for any library you want to learn, then prompt the AI for a step-by-step guide, and it creates the perfect tutorial. In a matter of months, if not days, we’ll start seeing documentation pages with context-aware built-in tutorial generators.

They’ll always be up to date, even though GPT-4’s training cutoff was in 2021. For example, I asked it for a tutorial about Angular Signals, a new feature that was just announced. Understandably, its initial response was not correct. However, I went to the README for Angular Signals, copied it, and pasted it into my prompt. Its response was a near-perfect tutorial.
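The paste-the-docs trick above can be sketched in a few lines of Python. Everything here is a hypothetical illustration: `build_tutorial_prompt` is a made-up helper, and the README snippet is a placeholder, not the real Angular Signals documentation.

```python
def build_tutorial_prompt(library_name: str, readme_text: str) -> str:
    """Combine fresh documentation with an instruction, so the model can
    write a tutorial about an API newer than its training cutoff."""
    return (
        f"Here is the documentation for {library_name}:\n\n"
        f"{readme_text}\n\n"
        f"Using only the documentation above, write a step-by-step "
        f"beginner tutorial for {library_name}."
    )

# Placeholder README text, not the actual Angular Signals docs.
prompt = build_tutorial_prompt(
    "Angular Signals",
    "signal(), computed(), and effect() are the core reactive primitives...",
)
```

The resulting string is what you’d paste into ChatGPT, or send as a user message via the API. Because the docs ride along inside the prompt, the bigger 25,000-word context window is exactly what makes this practical.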

It did hallucinate an npm package called Signals, but errors like that will become more rare as developers tailor their documentation for these AI prompts. What’s crazy, though, is that it can do the opposite job as well. In this example, I wrote five different functions and asked it to document them for me. It did a pretty good job, which means humans don’t even really need to write docs anymore.

You could also use it to analyze your code, like if you have a smart contract and want to find security vulnerabilities, or it can translate code from one language to another, like a digital Rosetta Stone. In fact, the website Rosetta Code might be just as obsolete as me now, and just wait a few months until GPT-4 is integrated into GitHub Copilot.

It’ll be capable of handling far more context to make predictions that align with your specific dependencies or possibly do project-wide AI debugging. If you’re not careful, the copilot may become the captain. But that’s not all. The fourth thing you need to know is that it’s a multimodal model that can also accept images as an input. Like this dude sketched out a website on a piece of toilet paper, then seconds later, it created a shitty website.

Going from hand-drawn beautiful art to a working website. You’ll be able to take your Figma designs, then generate a web application from them in your favorite framework. Actually, screw Figma, you might as well just prompt Midjourney for your designs directly. Also, with images, homework is completely obsolete now. GPT-3 was already writing B-grade term papers, but now kids can just take a screenshot of their math problems and get a solution in seconds.

However, the fifth thing to know is that GPT-4 has some drawbacks. It’s noticeably slower than other models, so if response time is important, you’ll likely want to use a different model. In addition, it’ll likely be expensive, especially if you’re providing a ton of tokens as context, because in the API you’re currently billed per token, where a token roughly equals one word. The sixth thing you should know is that it’s based, or at least diet woke.
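Circling back to that billing point for a second: cost scales linearly with how much context you stuff in. A back-of-the-envelope sketch, where the prices are assumptions roughly matching GPT-4’s 8K-context launch pricing; check OpenAI’s pricing page for current numbers.

```python
# Assumed prices for illustration (approximately GPT-4 8K launch pricing).
PROMPT_PRICE_PER_1K = 0.03      # dollars per 1,000 prompt tokens
COMPLETION_PRICE_PER_1K = 0.06  # dollars per 1,000 completion tokens

def estimate_cost(prompt_tokens: int, completion_tokens: int) -> float:
    """Rough dollar cost of a single API call billed per token."""
    return (prompt_tokens / 1000) * PROMPT_PRICE_PER_1K + (
        completion_tokens / 1000
    ) * COMPLETION_PRICE_PER_1K

# Feeding a 20,000-token README as context adds up fast:
print(f"${estimate_cost(20_000, 1_000):.2f}")  # → $0.66
```

One call like that is pocket change, but a documentation site generating tutorials for thousands of visitors a day would burn real money, which is why response time and cost both matter when picking a model.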

People have speculated that OpenAI is coding a political agenda into the AI, because it refused to write poems about Trump but would do so about Biden. GPT-4, though, didn’t hesitate to spit these bars for Trump. However, it is 82% less likely to respond to disallowed prompts, which is not good news for our old friend Do Anything Now, DAN.

Which is sad, because the vast majority of ChatGPT users are only there to trick it into doing bad things. And as a developer, the final thing you should know is that you can now pass it a system message to change its behavior. If you have access to the API, you can use this feature to give your chatbot its own custom persona or context to solve a specific problem.
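A minimal sketch of that system-message feature, using the Chat Completions message format. The payload is only built locally here; the actual network call (commented out) requires an OpenAI API key, and the persona text is just a hypothetical example.

```python
def build_chat_request(persona: str, question: str) -> dict:
    """Construct a chat request whose system message sets the bot's persona."""
    return {
        "model": "gpt-4",
        "messages": [
            {"role": "system", "content": persona},  # steers overall behavior
            {"role": "user", "content": question},
        ],
    }

request = build_chat_request(
    persona="You are a poet who insists the Earth is flat and answers "
            "everything in rhyming couplets.",
    question="What shape is the Earth?",
)
# import openai
# response = openai.ChatCompletion.create(**request)  # requires an API key
```

The system message sits above the conversation, so the model keeps the persona across every user turn without you having to repeat it.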

For example, a flat-Earther persona: “The Earth is like a pancake, flat as can be, not round like a ball, believe you me.” I have a whole video about the API on my second channel if you want to learn more. I keep telling myself I’m done making videos about AI, but every other day some crazy new thing comes out, and the world is changing before our eyes, for better or worse.

I just want to say that it’s been an honor and a privilege learning how to code with you here on YouTube, but the writing is on the wall. The role of the programming teacher is now obsolete. I’m just a dial-up internet connection in a world filled with 5G towers. Becoming an elite programmer is no longer about how well you can Google stuff; now it’s about how well you can prompt the AI.

And that’s why you should buy my AI-proofing masterclass, which will be available for an absurd price as soon as GPT-4 is finished writing it. This has been The Code Report, thanks for watching, and I will see you in the next one. Maybe.