I'm a bit of a noob in hardware design, so maybe this is a stupid question, but why is an FPGA scary?
It would seem scarier to me if they had actually fabbed the FPGA design into an ASIC, right? That could maybe indicate they have some kind of plan to mass-produce them, no?
Also, some more technical feedback, since I was poking around to see how it works; more of a suggestion, I suppose.
It looks like you're looping through the documents and asking the model for the known tags ({str(db.current_library.tags)}), right?
I don't know if I would do this through a chat completion and a chat response; there are purpose-built tools for this kind of keyword-like matching, namely embeddings. It's a lot faster, and probably way cheaper too, since embedding tokens cost barely anything compared to chat tokens.
So the common way to do something like this in AI would be to use vectors and embeddings: https://platform.openai.com/docs/guides/embeddings
So you'd ask for an embedding (a vector) for each of your tags first. Then you ask for an embedding of your document.
Then you can do a nearest-neighbor search over the tag vectors and see how closely each tag matches the document.
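Something like this is what I have in mind (rough sketch, I haven't run it against your code; the model name, example tags, and document text are all placeholders, and I'm guessing the real tags would come from db.current_library.tags):

```python
from openai import OpenAI
import numpy as np

client = OpenAI()

# Placeholders -- in your code this would be something like db.current_library.tags
tags = ["invoice", "contract", "datasheet", "schematic"]
document_text = "..."  # the document (or a chunk of it) you want to tag

def embed(texts):
    """Return one embedding vector per input string."""
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([d.embedding for d in resp.data])

tag_vecs = embed(tags)               # embed all tags once, up front
doc_vec = embed([document_text])[0]  # embed the document

# Cosine similarity between the document and every tag
sims = tag_vecs @ doc_vec / (np.linalg.norm(tag_vecs, axis=1) * np.linalg.norm(doc_vec))

# Rank tags by how closely they match the document
for tag, score in sorted(zip(tags, sims), key=lambda t: -t[1]):
    print(f"{tag}: {score:.3f}")
```

The nice part is you only have to embed the tags once and can cache those vectors, so tagging each new document is just one cheap embedding call plus a local similarity check, no chat completion in the loop.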