App up and healthy but response says it is still deploying

I have a Flask app deployed on Koyeb. The app has an endpoint `/predict`, which accepts a POST request with an image and uses a pre-trained model to determine whether the image is fake or real.
Locally, it works with the following command:

curl -X POST -F "file=@adidas-green.JPG" http://localhost:5000/predict

The app is deployed and healthy. The console output is similar to what I see locally. I tried the following command:

curl -X POST -F "file=@000001.jpg" https://poised-orelie-vepay.koyeb.app/predict

The above command gives the following message:

<p class="text-kgray-40">We are deploying your application. Refresh this page in a few seconds
            or check out your deployment status in the Koyeb <a class="text-kgreen-default"
              href="https://app.koyeb.com">control
              panel</a>.</p>

Please help me to figure this out.

Hello,

It seems the error message you see happens during the first deployment of your application. Your application should eventually become healthy after a few seconds (definitely less than 2 minutes), and your POST request should then succeed.

Is that the case?

If not, I suspect your application does not become healthy. This can be for two reasons:

  1. Is your application listening on 0.0.0.0:5000?
  2. In the ports section of your service configuration, did you expose port 5000?
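
For point 1, here is a minimal sketch of what the binding should look like (assuming the app lives in `app.py` and the `/predict` handler is a placeholder for your real model inference):

```python
import os
from flask import Flask

app = Flask(__name__)

@app.route("/predict", methods=["POST"])
def predict():
    # Placeholder for the real model inference (the actual app would
    # read the uploaded file and run the pre-trained model here).
    return {"result": "Fake"}

if __name__ == "__main__":
    # Bind to 0.0.0.0 (not the default 127.0.0.1) so connections from
    # outside the container -- including the platform's health checks --
    # are accepted. PORT is read from the environment if the platform
    # injects it, falling back to 5000 for local development.
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 5000)))
```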

Hi Julien

Thank you for the response. Sorry for the delay in replying from my side.
The error message remained even hours after deployment, and the app is shown as healthy.

I hadn’t bound my application to 0.0.0.0.
This is my first time deploying :sweat_smile:
I tried this now, and I got the response:

error code: 524

My local setup uses the Flask local development server, which runs on port 5000. As part of the deployment, I used Gunicorn, which runs on port 8000. I did expose port 8000 in my service config.

I’m sorry, but I don’t think I’ll be able to help without reproducing the issue. Could you create a minimal repository on GitHub that reproduces the issue?

Sure.

Thank you for looking into this.

In my opinion, you are just setting up your Flask app wrong; nothing needs to be bound to anything manually. You just need to create a Procfile with this in it:

web: gunicorn --bind :$PORT app:app

Change `app:app` to match however you have yours; the PORT is passed in by Koyeb, not something you set. If your instance is configured for port 8000 as you specified, but you run your Flask server on 5000, the two cannot talk to each other or pass the instance health check.
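
To make the PORT handoff concrete, here is a small sketch (the `resolve_port` helper is hypothetical, not part of Flask, Gunicorn, or Koyeb) of how a platform-injected `PORT` environment variable overrides a local default:

```python
import os

def resolve_port(default: int = 8000) -> int:
    # If the platform injected PORT into the environment, use it;
    # otherwise fall back to a local default (8000 here, matching
    # Gunicorn's default bind port).
    return int(os.environ.get("PORT", default))

print(resolve_port())
```

This mirrors what `--bind :$PORT` achieves in the Procfile: the app listens on whatever port the platform assigns, instead of a hard-coded one.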

@Julien_Castets1 Were you able to check my code?

@InfamyStudio Thank you for your response. Apologies for my late reply.

My first deployment didn’t bind to any address, and the server logs showed it starting on 8000, the default port for Gunicorn.
5000 is the default port for the local development server.

I tried what you pointed out, but there is still no response. The app shows healthy, but the request times out.

I tried to run your application on a gpu-nvidia-rtx-4000-sff-ada instance, and it seems to work fine:

curl -X POST -F file=@./xx.png  https://xxx/predict
{"result":"Fake"}

Thank you for checking.
I want to confirm: are you referring to an instance on Koyeb?
I’m currently on the free instance.

I just tried your service, and it times out, but I don’t see the message you mentioned. Did you make progress?

Thank you David for checking.
I’m getting a request timeout as well. I wonder if it’s because I’m on the free instance.

Could be. That is why Julien said it was working on a GPU instance. You might need to upgrade and deploy on an instance with higher performance.

Yes. I don’t have the logs anymore, but it was indeed working on a GPU instance hosted on Koyeb. On a regular instance, it was not working, and there were warnings about the missing GPU.

Thank you for the input @David and @Julien_Castets1
We are only in the ideation phase and are trying out free resources. Any suggestions on how we can go about achieving this?

Hi @Sebastian_James ,

You could sign up for our GPU preview. We provide credits as part of the program.

@David Thank you so much for sharing. I’ll try this.