from fasthtml.jupyter import *
server = JupyUvi(app=app)
Chat UI in FastHTML
Chat App initialization
Start by creating the chat application with FastHTML.
Chat components
Basic chat UI components include a ChatMessage and a ChatInput. For a ChatMessage, the important attributes are the actual message (`msg`, a string) and the role of the message owner (`user`, a boolean indicating whether the owner is the user rather than the AI assistant).
ChatMessage
ChatMessage (msg:str, user:bool)
 | Type | Details |
---|---|---|
msg | str | Message to display |
user | bool | Whether the message is from the user or assistant |
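As a purely illustrative sketch (using plain string templating rather than the package's actual FastHTML component code), a `ChatMessage(msg, user)` might render roughly this kind of HTML, with the `user` flag selecting the role label and alignment:

```python
import html

def chat_message(msg: str, user: bool) -> str:
    """Hypothetical sketch of the markup a ChatMessage might produce."""
    role = "user" if user else "assistant"   # label shown above the bubble
    side = "right" if user else "left"       # align user messages to the right
    return (
        f'<div class="chat chat-{side}">'
        f'<div class="chat-header">{role}</div>'
        f'<div class="chat-bubble">{html.escape(msg)}</div>'
        f'</div>'
    )

print(chat_message("Hello!", user=True))
```

Class names and layout here are assumptions for illustration; the real component is built from FastHTML tag objects, not strings.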
For the chat input, set the `name` attribute so that a new message can be submitted via a form.
ChatInput
ChatInput ()
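A minimal sketch of what `ChatInput()` might render: a text input whose `name` is the form field carrying the new message. The field name, id, and placeholder below are assumptions, not the package's actual values:

```python
def chat_input() -> str:
    """Hypothetical sketch: the input element a ChatInput might render."""
    # name="msg" is what the form submits; the id lets the server swap in
    # a fresh, empty input after each send.
    return ('<input id="msg-input" name="msg" type="text" '
            'placeholder="Type a message" required>')

print(chat_input())
```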
ActionPanel
ActionPanel ()
Router
Home page
The home page should contain our message list and the chat input. The main page is served from the index (root) endpoint.
index
index ()
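Structurally, the index page composes the rendered message history with a form wrapping the chat input. The sketch below uses plain string templating; the `/send` endpoint name, element ids, and htmx attributes are illustrative assumptions:

```python
def index_page(history: list) -> str:
    """Hypothetical sketch of the index page: message list + input form."""
    parts = []
    for msg, user in history:
        role = "user" if user else "assistant"
        parts.append(f'<div class="msg {role}">{msg}</div>')
    messages = "".join(parts)
    # The form posts the new message and appends the response to the list.
    form = ('<form hx-post="/send" hx-target="#chatlist" hx-swap="beforeend">'
            '<input name="msg" type="text" required>'
            '<button type="submit">Send</button></form>')
    return f'<div id="chatlist">{messages}</div>{form}'

page = index_page([("Hi", True), ("Hello! How can I help?", False)])
```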
Form submission
At submission, this function should:
- Extract the new message and all previous chat history
- Prompt ChatGPT with all these messages and get an answer
- Return a new ChatMessage
send
send (msg:str, messages:list[str]=None, roles:list[str]=None)
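The core bookkeeping in `send` is combining the stored history (the parallel `messages`/`roles` lists) with the newly submitted message into the standard chat-completion payload shape. A sketch of that step, with the model call itself omitted:

```python
def build_payload(msg, messages=None, roles=None):
    """Sketch: merge history and the new message into OpenAI-style messages."""
    history = zip(roles or [], messages or [])   # pair each role with its text
    payload = [{"role": r, "content": c} for r, c in history]
    payload.append({"role": "user", "content": msg})  # the new submission
    return payload

build_payload("What's the weather?", ["Hi", "Hello!"], ["user", "assistant"])
```

The payload can then be passed to the chat-completion call, and the reply rendered back as a new assistant ChatMessage.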
Runner
In addition to the main app, a utility function is implemented so the app can be run simply by importing and calling this function from a Python file.
llmcam_chatbot
llmcam_chatbot (package_name='ninjalabo.llmcam', module_name='chat_ui', app_variable='app', host='0.0.0.0', port=5001, **uvicorn_kwargs)
Find and run the FastAPI app in the specified module within the given package.
 | Type | Default | Details |
---|---|---|---|
package_name | str | ninjalabo.llmcam | The installed package name |
module_name | str | chat_ui | The module containing the FastAPI app |
app_variable | str | app | The FastAPI app variable name |
host | str | 0.0.0.0 | The host to listen on |
port | int | 5001 | The port to listen on |
uvicorn_kwargs | | | |
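The lookup part of this runner can be sketched with `importlib`: import the module by its dotted name and fetch the app variable, then hand the result to uvicorn. The uvicorn call is shown but not executed here; the names follow the defaults in the signature above:

```python
import importlib

def find_app(package_name: str, module_name: str, app_variable: str):
    """Import `package_name.module_name` and return its app variable."""
    module = importlib.import_module(f"{package_name}.{module_name}")
    return getattr(module, app_variable)

# Running would then be roughly:
#   import uvicorn
#   uvicorn.run(find_app("ninjalabo.llmcam", "chat_ui", "app"),
#               host="0.0.0.0", port=5001)
```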
For testing in a Jupyter notebook, use JupyUvi from fasthtml.jupyter (as shown at the top of this page) to run the server in a separate thread. Stop it with:
server.stop()