How do I include context with Replit AI ModelFarm in Python?

Question:
How can I let the AI see previous messages, i.e., context?
Repl link:
https://replit.com/@CoderElijah/AI#main.py
https://replit.com/@CoderElijah/AI2#main.py
Either of these Repls; I was experimenting with both using the code samples found in the Replit Docs.

```python
from replit.ai.modelfarm import ChatModel, ChatSession, ChatExample, ChatMessage, ChatModelResponse

model: ChatModel = ChatModel('chat-bison')
while True:
  # build chat session with context, examples and messages
  chat_session: ChatSession = ChatSession(
      context="You are philosophy bot.",
      examples=[
          ChatExample(input=ChatMessage(content="1 + 1"),
                      output=ChatMessage(content="2"))
      ],
      messages=[
          ChatMessage(author="USER", content=input("> ")),
      ])

  # synchronous, non-streaming call
  response: ChatModelResponse = model.chat([chat_session], max_output_tokens=50)
  print(response.responses[0].candidates[0].message.content)
```

Already made this 🙂 (wait, did you fork it? It says 1 private fork)

https://replit.com/@QwertyQwerty88/ModelFarm-AI-Chatbot?v=1

(I just had a list before the while loop and appended the messages)

I tried that on AI 1 and it didn’t work, but I’ll try your code.

No, I made mine “from scratch” (I copied the code from the Docs into a normal Python Repl).


I’m trying not to just copy your code. How come it only works the first time?

```python
from replit.ai.modelfarm import ChatModel, ChatSession, ChatExample, ChatMessage, ChatModelResponse

model: ChatModel = ChatModel('chat-bison')
messages = []
# build chat session with context, examples and messages
while True:
  messages.append(ChatMessage(author="USER", content=input("> ")))
  chat_session: ChatSession = ChatSession(
      context="You are philosophy bot.",
      examples=[
          ChatExample(input=ChatMessage(content="1 + 1"),
                      output=ChatMessage(content="2"))
      ],
      messages=messages)
  # synchronous, non-streaming call
  response: ChatModelResponse = model.chat([chat_session], max_output_tokens=50)
  print(response.responses[0].candidates[0].message.content)
```
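One likely reason it only works the first time: the loop above only ever appends USER messages, so the model never sees its own earlier replies, and a chat model that expects alternating user/model turns can break on the second call. A minimal sketch of the append-both-sides pattern, using a hypothetical `fake_model` stub in place of the real `model.chat(...)` call (the real code would append the content from `response.responses[0].candidates[0].message.content` instead):

```python
# Sketch: keep conversation context by appending BOTH sides of each turn.
# fake_model and run_turn are illustrative names, not part of ModelFarm.

def fake_model(messages):
    # Stand-in for model.chat(...); returns a reply string.
    return f"Reply given {len(messages)} messages of context."

def run_turn(messages, user_text):
    messages.append({"author": "USER", "content": user_text})
    reply = fake_model(messages)
    # Append the model's reply so the NEXT turn includes it as context.
    # Skipping this step leaves only USER turns in the history, which
    # matches the "works once, then fails" symptom.
    messages.append({"author": "AI", "content": reply})
    return reply

messages = []
run_turn(messages, "What is virtue?")
run_turn(messages, "And justice?")
# messages now alternates USER / AI across both turns
```

With the real API, that second append would be a `ChatMessage` built from the response, using whatever author name the ModelFarm docs specify for model turns.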

A post was merged into an existing topic: Introducing Replit ModelFarm