I’ve gone on and on about how Large Language Models (the so-called AIs) are crippled by training on bad data. Garbage in, garbage out! So let’s run an LLM locally and feed it good data for a specific purpose. Welcome to Virtual Grad Student! We’re going to set up a Large Language Model to run locally, feed it a clean dataset, and then make it available to authors as a virtual writer’s assistant that can, for example, pull together a few paragraphs of background on Roman aqueduct architecture. Today we’re upgrading the hardware to run h2oGPT locally.
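To make the “run it locally” part concrete, here’s a minimal sketch of querying an h2oGPT checkpoint through the Hugging Face transformers library. Treat the model ID as an assumption on my part: it’s one of h2o.ai’s published checkpoints, and you’d swap in whatever your hardware can hold. h2oGPT also ships its own launcher scripts and web UI, so consider this the bare-bones version.

```python
# Minimal local-inference sketch. Assumes the transformers and
# accelerate packages are installed, and that the model ID below is
# the h2oGPT checkpoint you want (it's a placeholder; pick one that
# fits your GPU's VRAM).
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "h2oai/h2ogpt-oig-oasst1-512-6.9b"  # assumption: example checkpoint

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    device_map="auto",   # place layers on GPU/CPU as memory allows
    torch_dtype="auto",  # use the checkpoint's native precision
)

# The Virtual Grad Student use case: ask for background on a topic.
prompt = "Summarize the key engineering features of Roman aqueducts."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=300)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```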
Links:
The h2oGPT open source Large Language Model
Common Corpus, the largest public domain dataset for training LLMs
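On the “good data” side, Common Corpus is far too large to download casually, so the sane approach is to stream it and inspect what the model will actually be fed. A quick sketch, with the caveat that the dataset ID and record schema here are my assumptions based on the public Hugging Face release:

```python
# Sketch: peek at Common Corpus without downloading it wholesale.
# Assumptions: the corpus is hosted on Hugging Face under the ID
# below, with a "train" split whose records carry a "text" field.
from datasets import load_dataset

ds = load_dataset("PleIAs/common_corpus", split="train", streaming=True)

# Print the first few records so you can eyeball data quality.
for i, record in enumerate(ds):
    print(record.get("text", "")[:200])
    if i >= 2:
        break
```

Streaming keeps the footprint tiny, which matters when the whole point of this series is that the current hardware is already struggling.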
Here’s the old hardware struggling with Photoshop:
Adobe AI tools' hit on computer performance
I’ve started to think I need to upgrade my computer, as I’ve been wrestling with Adobe AI tools for the last few weeks. I’d never seen so much as a pause while working in the half-decade since I built it, and now it occasionally locks up while running the Photoshop Beta AI tools. I thought it might be instructive to take a look at how these tools impact system per…
Next on Perfecting Equilibrium
Friday April 12th - Foto.Feola.Friday
Sunday April 14th - About that time I accidentally spent all my money on lenses. And tripods. And art. Every month for four years and four months... I was a typical soldier: always broke. But I was probably the only soldier ever to go broke because I’d spent every last dollar on Pentax lenses and woodblock prints…