Did anyone get this to run?
Oof I’m sorry, sounds super bad. It’s interesting because I think the frontal lobe is exactly what would make someone overthink stuff or worry too much. So, I’m still considering it ;)
Amazing, can you share where exactly I need to bonk my head for this?
I’ve wanted to set this up for a while now. Guess it’s time
I’ve read about this method in the GitHub issues, but it seemed impractical to me to need a separate model just to change the context size. That was the point where I started looking for alternatives
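For anyone curious, the workaround from those issues is roughly this: Ollama bakes `num_ctx` into a model via a Modelfile, so every context size means creating another derived model. A minimal sketch (the base model name and context value here are just examples):

```
# Modelfile — derive a copy of a base model with a larger context window
FROM llama3:8b
PARAMETER num_ctx 16384
```

Then you’d build it with `ollama create llama3-16k -f Modelfile` and run that variant instead of the base model, which is exactly the duplication that makes this feel impractical.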
It was multiple models, mainly 32-70B
There are many projects out there that optimize the speed significantly. Ollama is unbeaten in convenience, though
Yeah, but there are many open issues on GitHub about these settings not working correctly. I’m using the API and just couldn’t get it to work. I used a request to generate a JSON file, and it never produced one longer than about 500 lines. With the same model on vLLM, it worked instantly and generated about 2000 lines
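For reference, the per-request way this is supposed to work is passing `num_ctx` in the `options` of a `/api/generate` call. A sketch of the request body (model name and values are placeholders, not what I actually ran):

```json
{
  "model": "qwen2.5:32b",
  "prompt": "Generate the JSON file described above",
  "format": "json",
  "options": { "num_ctx": 16384 }
}
```

The open issues are about requests like this: the option is accepted, but the effective context doesn’t always seem to match, which would explain output getting cut off around 500 lines.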
Nothing beats a good salami sandwich, though
Yo I think we Path of Exile gamers made it pretty clear he is not one of us
Take a look at NVIDIA Project Digits. It’s supposed to release in May for $3k USD and will be pretty much the only sensible way to host LLMs at home then
Super cool! I’d be interested in how to fit this to my head shape too. It’s now on my list of contenders for the concert