
Build Your Own AI: Configuring h2oGPT

The Perfecting Equilibrium Vlog for March 28, 2024

I’ve gone on and on about how Large Language Models — so-called AIs — are crippled by training on bad data. Garbage In; Garbage Out! So let’s build a local LLM and feed it good data for a specific purpose. Welcome to Virtual Grad Student! We’re going to set up a Large Language Model to run locally, feed it a clean set of data, then make it available to authors as a virtual writer’s assistant. For example, to pull together a few paragraphs of background on Roman aqueduct architecture. Today we're walking through h2oGPT options.


Links:

The h2oGPT open source Large Language Model

Common Corpus, the largest public domain dataset for training LLMs

Part 1 of Build Your Own AI

Next on Perfecting Equilibrium

Friday March 29th - Foto.Feola.Friday

Sunday March 31st - About that time I accidentally spent all my money on lenses. And tripods. And art. Every month for four years and four months... I was a typical soldier; always broke. But I was probably the only soldier ever broke because I’d spent every last dollar on Pentax lenses and woodblock prints…

Perfecting Equilibrium Podcast
For a brief, shining moment Web1 democratized data. Then Web2 came along and made George Orwell look like an optimist. Now we are building Web3, Perfecting John Nash’s Information Equilibrium.