Y'know, a thought occurs.

There's this whole trope of having an "ai core" that you see in sci-fi, where the ai is tied to some kind of specific hardware that can be transferred between, say, a spaceship and some kind of spehss marine's power armor.

So why -not- run with it?

Get you a big chonky matte steel enclosure with a heatsink on, and throw in some GPUs, a TPM, basically a lil LLM-optimized subsystem.

Call that shit an AI core.

And then MS would have legions of slavering fanbois queuing up for miles to get their hands on something implied to be like real-life Cortana.


@munin Yeah, been thinking of building little NUC-form-factor devices you plug in that advertise an OpenAI-chat-compatible HTTP endpoint over mDNS. Plus give it memory and function calling and all that built in. You can plug it into any app that uses the chat API, which is getting pretty standard already. I basically already do this with a Mac mini I keep in my closet.
