Did someone complete Day 3 of "Power of LLMs like GPT with Python"?

Question: I want to get the code running on my local machine, but I am facing a compatibility issue with the typing-extensions versions.
tensorflow-intel requires typing_extensions >=3.5.1 and <4.6.0, but when I run the code it fails with: ImportError: cannot import name 'TypeAliasType' from 'typing_extensions', and TypeAliasType is only available in versions >=4.6.0.
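A quick way to confirm the conflict locally is a small diagnostic sketch (not part of the course code; it assumes a pip-managed environment, and the package names checked are assumptions based on the error above) using the standard-library importlib.metadata:

import importlib.metadata as md

# Show which typing_extensions version is actually resolved in this environment.
print("typing_extensions:", md.version("typing_extensions"))

# Show the declared requirements of the packages involved in the conflict.
# "tensorflow-intel" comes from the error above; checking gradio and pydantic
# as well is an assumption, since one of them likely imports TypeAliasType.
for pkg in ("tensorflow-intel", "gradio", "pydantic"):
    try:
        print(pkg, "->", md.requires(pkg))
    except md.PackageNotFoundError:
        print(pkg, "is not installed")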

Repl link: https://replit.com/@SaarthakSaxena1/Day-3-Implementing-GPT3-and-Flan-T5

from transformers import TFAutoModelForSeq2SeqLM, AutoTokenizer
import gradio as gr

# Load the TensorFlow Flan-T5 small checkpoint and its tokenizer.
model = TFAutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-small")
tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-small")


def call_flantt5(prompt, word_count):
    # The model is a TF class, so ask the tokenizer for TensorFlow tensors
    # ("tf"), not PyTorch tensors ("pt").
    input_ids = tokenizer(prompt, return_tensors="tf").input_ids
    # input_ids is a tensor, so pass it positionally; the Gradio slider
    # returns a float, so cast it to int for max_length.
    outputs = model.generate(input_ids, max_length=int(word_count))
    # Decode the first (and only) generated sequence.
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


def to_gradio():
    demo = gr.Interface(
        fn=call_flantt5,
        inputs=["text", gr.Slider(0, 300)],
        outputs=["text"],
    )
    demo.launch(debug=True, share=True)


if __name__ == "__main__":
    to_gradio()
  

Hi @SaarthakSaxena1!
Does this also happen in your repl?

No, but the console sometimes fails to produce any output, and sometimes it only shows errors (some errors are expected) with no output, which is weird.