Error: "a port configuration was specified but the required port was never opened" While deploying

I'm getting this error while trying to deploy my API.

The Repl is private, so I can't really share a link.

import tensorflow as tf
import numpy as np
import requests
from io import BytesIO
import time
import os

import logging
from flask import Flask, request
from concurrent.futures import ThreadPoolExecutor

app = Flask(__name__)
KEY = os.environ['KEY']
INTERPRETER_POOL_SIZE = 5  # Number of interpreters in the pool

# Load the labels (ast.literal_eval parses the stored Python literal
# without eval's arbitrary-code-execution risk)
import ast
labels = ast.literal_eval(os.environ["POKEMONS_ALT"])

def preprocess_image(image):
    img = tf.image.resize(image, [224, 224])
    img = img / 255.0  # Normalize input data
    return img

def predict_pokemon_from_image(interpreter, image):
    # Get input and output details
    input_details = interpreter.get_input_details()
    output_details = interpreter.get_output_details()

    # Set the input tensor value
    input_data = tf.convert_to_tensor(image, dtype=tf.float32)
    interpreter.set_tensor(input_details[0]['index'], input_data)

    # Run the inference
    interpreter.invoke()
    # Get the prediction output
    output_data = interpreter.get_tensor(output_details[0]['index'])
    predicted_class_index = np.argmax(output_data)
    predicted_label = labels[predicted_class_index]

    return predicted_label

def predict_pokemon_from_url(image_url):
    try:
        response = requests.get(image_url)
        image_data = response.content

        # Decode the image data as a tensor
        image = tf.image.decode_image(image_data, channels=3)
        image = preprocess_image(image)

        # Borrow a TFLite interpreter from the interpreter pool
        interpreter = get_interpreter_from_pool()

        predicted_pokemon = predict_pokemon_from_image(interpreter, [image.numpy()])

        # Return the interpreter back to the pool
        return_interpreter_to_pool(interpreter)

        return predicted_pokemon
    except requests.exceptions.RequestException as e:
        print("Request to", image_url, "failed:", e)

def initialize_interpreter():
    # Load the TFLite model and allocate its tensors
    tflite_model_path = 'pokefier_t1.tflite'
    interpreter = tf.lite.Interpreter(model_path=tflite_model_path)
    interpreter.allocate_tensors()
    return interpreter

def get_interpreter_from_pool():
    return interpreter_pool.pop()

def return_interpreter_to_pool(interpreter):
    interpreter_pool.append(interpreter)

@app.route('/', methods=['GET'])
def handle_get_request():
    logging.info('GET request received')
    return 'Hello, World!'

@app.route('/identifyPokemon', methods=['POST'])
def identify_pokemon():
    received_data = request.json

    if received_data['key'] != KEY:
        return ''

    pokemon_image = received_data['image']
    pokemon_name = predict_pokemon_from_url(pokemon_image)

    return pokemon_name

if __name__ == '__main__':
    logging.basicConfig(level=logging.INFO)
    logging.info('Server is starting...')

    # Initialize the interpreter pool, then start the server
    interpreter_pool = [initialize_interpreter() for _ in range(INTERPRETER_POOL_SIZE)]
    app.run(host='0.0.0.0', port=8080)
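A side note on the pool itself: a plain list isn't safe under concurrent requests, because `interpreter_pool.pop()` raises `IndexError` as soon as the pool is empty. A minimal sketch of a blocking pool built on the standard-library `queue.Queue` (the `InterpreterPool` name and the stand-in `object` factory are illustrative, not from the original code):

```python
import queue

class InterpreterPool:
    """Blocking pool: get() waits until an item is returned,
    instead of raising IndexError like list.pop() on an empty list."""
    def __init__(self, factory, size):
        self._q = queue.Queue()
        for _ in range(size):
            self._q.put(factory())

    def get(self):
        return self._q.get()  # blocks while the pool is empty

    def put(self, interpreter):
        self._q.put(interpreter)

# Usage with a stand-in factory (a real app would pass initialize_interpreter):
pool = InterpreterPool(object, 2)
a = pool.get()
b = pool.get()
pool.put(a)
assert pool.get() is a  # the returned interpreter is handed out again
```

`Queue` is thread-safe, so handlers running on different threads can borrow and return interpreters without extra locking.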

Here are my deployment configurations:


@KetoGamed Hi there and welcome. Would you mind inviting me to your Repl as a collaborator? I won’t make any changes. My username is:

Sure, I've sent the invite.

Thanks @KetoGamed. It looks like the Repl is in a Team, so I can't fork it to my personal account, but I did make a manual copy. It seems to deploy okay for me on a fresh Repl using the same requirements.txt and main files, though it does give various package warnings. Let me go ahead and invite you, and you can fork it as desired.


Hi @SuzyAtReplit, is there any way you could help me as well, please? When I run my bot in the editor, everything works as intended.

When I try to deploy, it tells me the port was never opened. I'm not sure what is going on, and I've been struggling for weeks now. :frowning:

a port configuration was specified but the required port was never opened
2024-06-11T04:52:16Z error: The application must serve traffic on either port 80 (if configured) or the first port in the port configuration. This port was never opened. Check the logs for more information.
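That error means the deployment's health check never found a listening socket on the expected port. A minimal Flask sketch that would satisfy it, assuming the deployment's port configuration expects port 80 (the host and port values here are illustrative, not taken from your config):

```python
from flask import Flask

app = Flask(__name__)

@app.route('/')
def index():
    return 'ok'

if __name__ == '__main__':
    # Bind to 0.0.0.0 (all interfaces) so the deployment proxy can
    # reach the server; 127.0.0.1 would only accept local traffic.
    # The port must match the first port in the deployment config.
    app.run(host='0.0.0.0', port=80)
```

Two things worth double-checking: that the server binds `0.0.0.0` rather than localhost, and that your deployment run command actually executes the code that calls `app.run()`. If the run command starts the app through a production server (for example `gunicorn main:app`), the `if __name__ == '__main__':` block never runs, so any setup inside it (like the interpreter pool) is skipped and the server binds whatever port that server was told to use instead.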