Chat UI in FastHTML

Chat UI implemented in FastHTML

Chat App initialization

Start by creating the chat application with FastHTML.


source

execute_handler

 execute_handler (function_name:str, session_id:str, **kwargs)
Type Details
function_name str Name of the function to execute
session_id str Session ID to use
kwargs Keyword arguments to pass to the function
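
The dispatch pattern above can be sketched framework-free: a registry of callables keyed by session, looked up by name and invoked with keyword arguments. Everything below (the `session_tools` dict, the sample functions) is a hypothetical illustration, not the actual implementation.

```python
# Hypothetical sketch of per-session dispatch; the real execute_handler
# resolves functions registered as tools for a chat session.
session_tools = {
    "abc123": {
        "add": lambda a, b: a + b,
        "greet": lambda name: f"Hello, {name}!",
    },
}

def execute_handler(function_name: str, session_id: str, **kwargs):
    """Look up function_name in the session's registry and call it with kwargs."""
    tools = session_tools.get(session_id, {})
    if function_name not in tools:
        raise KeyError(f"No function {function_name!r} in session {session_id!r}")
    return tools[function_name](**kwargs)

print(execute_handler("add", "abc123", a=2, b=3))  # prints 5
```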

source

prepare_handler_schemas

 prepare_handler_schemas (session_id:str, fixup:Callable=None)
Type Default Details
session_id str Session ID to use
fixup Callable None Optional function to fix up the execution
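
A framework-free sketch of schema preparation with an optional fixup hook: each registered function is described, and every schema is passed through `fixup` before being returned. The schema shape and the `funcs` argument are assumptions for illustration; the real implementation presumably emits model-ready tool schemas.

```python
import inspect

def prepare_handler_schemas(session_id: str, funcs: dict, fixup=None):
    """Describe each registered function; pass every schema through
    the optional fixup callable before returning it."""
    schemas = []
    for name, fn in funcs.items():
        schema = {
            "name": name,
            "parameters": list(inspect.signature(fn).parameters),
        }
        if fixup is not None:
            schema = fixup(schema)  # e.g. patch descriptions or defaults
        schemas.append(schema)
    return schemas

funcs = {"capture": lambda camera_id, res="1080p": None}
print(prepare_handler_schemas("abc123", funcs))
```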

Chat components

Basic chat UI components include a Chat Message and a Chat Input. For a Chat Message, the important attributes are the message content (str) and the role of the message owner (user: a boolean indicating whether the owner is the user rather than the AI assistant).


source

ChatMessage

 ChatMessage (msg:str, user:bool)
Type Details
msg str Message to display
user bool Whether the message is from the user or assistant
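
The message/role pair can be sketched as a plain HTML renderer: the boolean picks the header text and which side the bubble sits on. The class names below are assumptions in the style of common chat-bubble CSS, not the component's actual markup.

```python
def chat_message(msg: str, user: bool) -> str:
    """Render one chat bubble as an HTML string; the role flag picks
    the header text and which side the bubble sits on."""
    role = "user" if user else "assistant"
    side = "chat-end" if user else "chat-start"  # class names are assumptions
    return (
        f'<div class="chat {side}">'
        f'<div class="chat-header">{role}</div>'
        f'<div class="chat-bubble">{msg}</div>'
        f"</div>"
    )

print(chat_message("Hello!", user=True))
```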

For the chat input, set the name attribute so that a new message can be submitted via the form.


source

ChatInput

 ChatInput ()
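
As a sketch, the input only needs a name so the form submission carries the new message. The field name `msg` and the other attributes here are assumptions for illustration.

```python
def chat_input() -> str:
    """Text input whose name ("msg" here, an assumption) is the form
    field carrying the new message on submission."""
    return ('<input type="text" name="msg" id="msg-input" '
            'placeholder="Type a message" required>')

print(chat_input())
```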

Action Buttons

Simple action buttons for creating a new message on the user's side.


source

ActionButton

 ActionButton (content:str, message:str=None)
Type Default Details
content str Text to display on the button
message str None Message to send when the button is clicked
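
A minimal sketch of the label/message split: the button shows `content`, and clicking it sends `message`, falling back to the label when no message is given. The `data-message` attribute is an illustrative assumption, not the component's actual wiring.

```python
def action_button(content: str, message: str = None) -> str:
    """Button labelled with content; clicking it sends message
    (falling back to the label when no message is given)."""
    payload = message if message is not None else content
    return f'<button type="button" data-message="{payload}">{content}</button>'

print(action_button("Say hi", "Hi there"))
```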

source

ActionPanel

 ActionPanel ()

Router

Home page

The home page should contain our message list and the Chat Input. The main page is served from the index (root) endpoint.


source

index

 index ()

Form submission

At submission, this function should:

  • Extract the new message and all previous chat history
  • Prompt ChatGPT with all these messages and get an answer
  • Return a new ChatMessage
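
The three steps above can be sketched with the model call stubbed out (the `complete` hook and the dict-based return are assumptions; the real handler calls ChatGPT and returns ChatMessage components):

```python
def send(msg, contents=None, roles=None, complete=None):
    """Walk the three steps: gather history + new message,
    query the model, return the new chat messages."""
    contents, roles = list(contents or []), list(roles or [])
    contents.append(msg)          # 1. add the new message to the history
    roles.append("user")
    history = [{"role": r, "content": c} for r, c in zip(roles, contents)]
    complete = complete or (lambda h: f"Echo: {h[-1]['content']}")  # stub model
    answer = complete(history)    # 2. prompt the model with the full history
    return [                      # 3. return the new messages (dicts here)
        {"role": "user", "content": msg},
        {"role": "assistant", "content": answer},
    ]

print(send("What is in the image?"))
```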

source

send

 send (session, msg:str, contents:list[str]=None, roles:list[str]=None)

Static files

In case the user needs to display images, serve files from the directory ../data.


source

get_file

 get_file (file_name:str)

Serve files dynamically from the ‘data’ directory.
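
A sketch of the path handling such an endpoint needs: resolve the requested name inside the data directory and reject anything that escapes it. The directory constant and the traversal check are assumptions; the real endpoint returns a file response rather than a path.

```python
from pathlib import Path

DATA_DIR = Path("../data").resolve()

def get_file(file_name: str) -> Path:
    """Resolve file_name inside DATA_DIR, rejecting path traversal."""
    target = (DATA_DIR / file_name).resolve()
    if DATA_DIR != target and DATA_DIR not in target.parents:
        raise PermissionError(f"{file_name!r} escapes the data directory")
    return target

print(get_file("cap_2024.11.22_10:57:25_unclear.jpg"))
```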

Runner

In addition to the main app, a utility function is implemented to run the app simply by importing and calling this function from a Python file.


source

llmcam_chatbot

 llmcam_chatbot (host='0.0.0.0', port=5001)
Type Default Details
host str 0.0.0.0 The host to listen on
port int 5001 The port to listen on

For running while testing in a Jupyter notebook, use JupyUvi from fasthtml to run the server in a separate thread.

from fasthtml.jupyter import *

server = JupyUvi(app=app)
server.stop()