Troubleshooting Ollama one-click deploy

Hello, I am trying to deploy the one-click Ollama service. It starts, takes ages, and never completes. Is it because the underlying GPUs/systems are not provisioned or are saturated? (RTX 4000 in fra).
Thanks, JB

The service is not starting.

Hi @jeanbapt,

Indeed, all those Instance types are being used at the moment. I’d suggest deploying with another Instance type.

Best,
Alisdair
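
Once the service does come up on another Instance type, a quick call to Ollama's REST API is an easy way to confirm the model is actually serving. Here is a minimal sketch using only the Python standard library; the base URL is a placeholder for the deployed service's public URL, and the model tag (`llama3.2:1b`) is just an example of a small quantized model:

```python
import json
import urllib.request

def build_generate_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming POST request for Ollama's /api/generate endpoint."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode("utf-8")
    return urllib.request.Request(
        base_url.rstrip("/") + "/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# "https://example.koyeb.app" is a placeholder; substitute your service's URL.
req = build_generate_request("https://example.koyeb.app", "llama3.2:1b", "Say hello in one word.")

# Send it once the service is up:
# with urllib.request.urlopen(req, timeout=60) as resp:
#     print(json.loads(resp.read())["response"])
```

With `"stream": False` the endpoint returns a single JSON object whose `response` field holds the full completion, which keeps the check simple.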

Thanks Alisdair. Wow, so Koyeb is short of entry-level GPUs? What a pity.
I just need a simple, basic GPU to run a small quantized model; the alternatives you propose are overkill.

Regards,

JB

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.