Hi
This might be a very noob question, so bear with me!
I’m trying to deploy a vLLM instance in order to run inference with an HF model.
I’ve checked the official vLLM one-click app, but there doesn’t seem to be a way to specify which model to load. Is there a specific env var that I can use for this?
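
For context, what I’m hoping the deployed instance can do is the equivalent of pointing vLLM at an arbitrary HF model id, like the plain Python API lets you do locally (minimal sketch; the model id here is just an example, not the one I actually need):

```python
# Minimal local example of what I'd like the deployed instance to do:
# load a specific HF model by id and serve generations from it.
from vllm import LLM, SamplingParams

llm = LLM(model="facebook/opt-125m")  # example HF model id only
sampling_params = SamplingParams(temperature=0.8, max_tokens=64)

outputs = llm.generate(["Hello, my name is"], sampling_params)
print(outputs[0].outputs[0].text)
```

Basically I’d like to pass a model id like that to the one-click app, ideally through an env var.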
When I try to use custom AI images from Docker Hub instead, I get this error:

