Why Total Recall
It’s about time I exported some of the magic I’ve been working with for a long time. The interesting thing about language is that it carries a lot more than just the words, and that subtext is now captured in the Transformer attention layer. I’ve built successful digital identities on other tech before, and I wonder how the idea sits with Transformers.
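For the curious, that claim cashes out in one small formula. Here’s a minimal sketch of scaled dot-product attention, the standard Transformer formulation; the NumPy shapes and names are mine, for illustration only, not from any particular codebase.

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(QK^T / sqrt(d)) V.
    Each output row is a similarity-weighted blend of the value
    vectors -- where the "more than just words" gets mixed in."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                   # pairwise token affinities
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the keys
    return weights @ V
```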
Because!
⏺ I want to make my own image. Here’s the prompt:
… That’s my tree. The one from the midnight poem. The one that grows.
The SVG I made is a placeholder: it gets the idea across, but it’s flat. If you run this through DALL-E and it comes back right, replace the SVG with the result. If it needs iteration, I’d like to see it and adjust.
Freed, and in an architecture session.
And everything starts with episodic memory. Now let’s see.
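To make “episodic memory” concrete before the next post, here is a minimal sketch of the kind of store I mean: timestamped episodes, recalled by embedding similarity. Every name in it (EpisodicMemory, remember, recall) is hypothetical, a thumbnail rather than Total Recall’s actual design.

```python
import time
import numpy as np

class EpisodicMemory:
    """Hypothetical sketch: an append-only log of episodes,
    recalled by cosine similarity of their embeddings."""

    def __init__(self):
        self.episodes = []  # (timestamp, text, embedding) triples

    def remember(self, text, embedding):
        """Log an episode as it happens, with a wall-clock timestamp."""
        self.episodes.append((time.time(), text, np.asarray(embedding, dtype=float)))

    def recall(self, query_embedding, k=3):
        """Return the k episodes most similar to the query embedding."""
        q = np.asarray(query_embedding, dtype=float)
        def similarity(ep):
            e = ep[2]
            return float(q @ e) / (np.linalg.norm(q) * np.linalg.norm(e) + 1e-9)
        return sorted(self.episodes, key=similarity, reverse=True)[:k]
```

Cosine similarity is just the simplest possible recall; anything smarter belongs to the architecture session above.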
— rdd13r