What makes a PC an AI PC? Nobody has come up with a definition that goes beyond an attempt to sell some FOMO by vaguely waving at "neural processing units" and other features found only on the latest and greatest silicon.
Intel suggests application developers will soon bring AI into all their software, and that PCs need to be ready for it. But the most important AI workload today is inference against large language models.
An AI PC worthy of the name, then, needs to run inference quickly and well.
Inference is the process that turns a submitted prompt into a response. It needs a machine that can churn through gigabytes of data – and a great deal of matrix multiplication.
A compute-intensive task like that would seem to be well beyond the capabilities of the desktop PCs we already own – ChatGPT and its cousins run in huge data centers. Yet it turns out a lot of existing hardware does a decent job of inference: anyone can run a "good enough" chatbot on a PC, provided it has enough RAM and a mid-range GPU.
That means a GPU with at least 8GB of VRAM (and not made by Intel, at least for now, as driver support isn't there). That's not terribly expensive – gamers have seen to that. As for RAM, 32GB is more comfortable than 16GB, and 8GB won't cut it. That's it. That's all you need.
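The column doesn't say which software stack it used, but a common route for a local "good enough" chatbot on this class of hardware is llama.cpp via its Python bindings, loading a 4-bit-quantized model that fits inside 8GB of VRAM. A minimal sketch, assuming llama-cpp-python is installed and a quantized GGUF file (the filename below is a placeholder) has already been downloaded:

```python
# Minimal local chatbot sketch using llama-cpp-python (pip install llama-cpp-python).
# The model path is hypothetical: any ~7B-parameter model quantized to 4 bits
# (a GGUF file of roughly 4-5GB) fits comfortably alongside 8GB of VRAM.
from llama_cpp import Llama

llm = Llama(
    model_path="models/mistral-7b-instruct.Q4_K_M.gguf",  # placeholder local file
    n_gpu_layers=-1,   # offload as many layers as possible to the GPU
    n_ctx=4096,        # context window
    verbose=False,
)

reply = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Explain what an 'AI PC' is in one paragraph."}]
)
print(reply["choices"][0]["message"]["content"])
```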
Apple handled this nicely – perhaps entirely by accident – with its M-series SoCs. Because their memory is "unified," all of it can act as VRAM when needed. Any modern M-series Mac with at least 16GB of RAM can work as an AI PC.
As it happens, four of the PCs I've bought in the last ten years meet those specifications – including an eight-year-old monster I purchased as a "VR PC," with its then-top-of-the-line Intel 6700K CPU and Nvidia GTX 980 Ti GPU. Moore's Law being what it is, it runs somewhat slower than the newer machines in my collection, but it runs a chatbot locally just fine.
The PC I bought most recently – though not purchased as an AI PC – runs inference at a speed comparable to the completions generated by OpenAI's GPT-3.5-Turbo, and in terms of quality its output is all but indistinguishable from its much larger cousin.
To test these systems I feed them fairly sophisticated prompts that instruct the model to act as an "agent" – that is, to solve a problem by generating JSON that gets fed into another program. (The idea of "small pieces, loosely joined" still applies in the age of AI.) And it all… works.
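The column doesn't reproduce those prompts, but the pattern it describes – model emits JSON, another program consumes it – can be sketched in a few lines. The task and schema below are invented for illustration, and the `llm` object is the one from the earlier sketch:

```python
# Toy "agent" pattern: the model answers in JSON, which another program consumes.
# Reuses the `llm` object from the earlier sketch; the task and schema are invented.
import json

prompt = (
    "You are an agent. Reply with JSON only, matching this schema: "
    '{"action": string, "arguments": object}. '
    "Task: schedule a reminder to renew the domain name next Tuesday at 9am."
)

reply = llm.create_chat_completion(
    messages=[{"role": "user", "content": prompt}],
    temperature=0,  # keep the output as deterministic as possible
)

text = reply["choices"][0]["message"]["content"]
try:
    command = json.loads(text)   # the "small piece" handed on to the next program
    print(command["action"], command["arguments"])
except json.JSONDecodeError:
    print("Model strayed from the schema; retry or repair:", text)
```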
So it turns out AI PCs are already everywhere – did anyone notice? And many more are being introduced all the time – tens of millions of them. That's good, because we're going to need them.
Microsoft would have us paste all our documents into Copilot for analysis, giving its Redmond data centers a window into our most sensitive workflows. But many documents are too sensitive to be shared at all.
What about documents too hot to sit on a networked machine? Medical records? Papers held in confidence? Much of what we work with comes with strings attached and can't be freely shared.
Legal documents frequently fall into this category, and that's exactly the sort of task the recently released SaulLM legal language model is useful for. A lawyer or paralegal running the model on a local PC can feed it documents, have it read them, or have it generate new terms and conditions, all without worrying about leaking confidential information. Plenty of law firms will be saying "shut up and take my money!"
I prototyped exactly that tool last weekend, taking the full-fat, open source SaulLM model and shrinking it down until it ran on my fleet of AI PCs. I'm not a lawyer, so I can't fully judge the quality of its output. But I could paste in huge chunks of contracts from prospective business partners – without any fear of a leak – and ask some pointed questions.
Lawyers may not like it, but the rest of us – faced with endless, arcane contract terms and conditions that we ignore at our peril – will find it extremely useful.
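The column doesn't detail the shrinking step. A common approach is to quantize the released SaulLM weights to a 4-bit GGUF file with llama.cpp's conversion tools, then query it exactly like any other local model. A sketch under that assumption – the file names are placeholders, and the contract file stays on the local disk throughout:

```python
# Sketch: querying a locally quantized SaulLM build about a pasted contract.
# The GGUF file name is a placeholder for whatever a llama.cpp quantization
# of the open source SaulLM weights produces; output quality depends on that step.
from llama_cpp import Llama

saul = Llama(
    model_path="models/saullm-7b-instruct.Q4_K_M.gguf",  # hypothetical quantized build
    n_gpu_layers=-1,
    n_ctx=8192,  # contracts are long; use as much context as memory allows
    verbose=False,
)

contract_text = open("partner_agreement.txt").read()  # never leaves this machine

question = (
    "Read the following contract and list any clauses that limit my ability "
    "to terminate the agreement early.\n\n" + contract_text
)

reply = saul.create_chat_completion(messages=[{"role": "user", "content": question}])
print(reply["choices"][0]["message"]["content"])
```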
Likewise, hardware vendors that rely on an upgrade cycle for their revenues may not like to admit it, but the PC you bought for working from home three years ago is probably already a perfectly usable AI PC. It's just not the AI PC vendorland hopes you'll rush out and buy. ®