Overview
The `useChat` hook provides a simple interface for sending prompts to the language model and managing the response state. It handles loading states and error handling, and provides both streaming and final answer states.
Import
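The original import snippet is missing here. Assuming the hook is exported from `src/hooks/useChat.ts`, a relative import might look like the following (adjust the path to your project layout):

```typescript
// Path is an assumption based on the source file location; adjust as needed.
import { useChat } from './hooks/useChat';
```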
Usage
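A minimal usage sketch is shown below. The component and handler names are illustrative, not part of the hook's API:

```tsx
import { useChat } from './hooks/useChat'; // assumed path

// Illustrative component: sends a fixed prompt and renders the result.
function ChatPanel() {
  const { answer, loading, error, send } = useChat();

  return (
    <div>
      <button onClick={() => send('Hello!')} disabled={loading}>
        {loading ? 'Sending…' : 'Send'}
      </button>
      {error && <p>{error}</p>}
      <p>{answer}</p>
    </div>
  );
}
```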
Return Value
The hook returns an object with the following properties:

- `answer` (string): The current answer from the LLM. This value is set when the LLM call completes successfully.
- `finalAnswer` (string): The final answer state. This is set to the same value as `answer` after a successful LLM call, but can be modified independently using `setFinalAnswer()`.
- `loading` (boolean): Indicates whether an LLM call is currently in progress. Set to `true` when `send()` is called and `false` when the call completes or fails.
- `error` (string): Error message if the LLM call fails. Empty string when there is no error. Contains the error message from the exception, or “Something went wrong.” as a fallback.
- `send(prompt)`: Async function that sends a prompt to the LLM. Automatically manages loading state and error handling, and updates both `answer` and `finalAnswer` on success.

  Parameters:
  - `prompt` (string): The user prompt to send to the language model

  Behavior:
  - Clears the previous `answer` and `error` states
  - Sets `loading` to `true`
  - Calls the LLM with the prompt and the system prompt from config
  - Updates `answer` and `finalAnswer` with the response
  - Sets `error` if the call fails
  - Sets `loading` to `false` when complete
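The steps above can be sketched without React as a plain state object and an async `send()` function. The `llmCall` stub below is a stand-in for the real implementation in `model/openRouter.js`, and the system prompt string is a placeholder:

```typescript
// Framework-free sketch of the send() flow described above.
type ChatState = { answer: string; finalAnswer: string; loading: boolean; error: string };

const state: ChatState = { answer: '', finalAnswer: '', loading: false, error: '' };

// Stub standing in for llmCall() from model/openRouter.js.
async function llmCall(prompt: string, systemPrompt: string): Promise<string> {
  return `echo: ${prompt}`;
}

async function send(prompt: string): Promise<void> {
  // Clear previous answer and error, then enter the loading state.
  state.answer = '';
  state.error = '';
  state.loading = true;
  try {
    const response = await llmCall(prompt, 'placeholder system prompt');
    // On success, both answer and finalAnswer receive the response.
    state.answer = response;
    state.finalAnswer = response;
  } catch (err) {
    // Fall back to a generic message when the exception has none.
    state.error = err instanceof Error && err.message ? err.message : 'Something went wrong.';
  } finally {
    state.loading = false;
  }
}
```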
- `setFinalAnswer(value)`: State setter function to manually update the `finalAnswer` value. Useful for modifying the final answer independently of the LLM response.

  Parameters:
  - `value` (string): The new value for `finalAnswer`
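Taken together, the return shape can be summarized as a TypeScript interface. This is a sketch inferred from the descriptions above; the actual exported types may differ:

```typescript
// Sketch of the hook's return shape; the name UseChatReturn is illustrative.
interface UseChatReturn {
  answer: string;
  finalAnswer: string;
  loading: boolean;
  error: string;
  send: (prompt: string) => Promise<void>;
  setFinalAnswer: (value: string) => void;
}
```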
Error Handling
The hook automatically catches errors during LLM calls and sets the `error` state to the exception's `message` property, with a fallback to “Something went wrong.” if no message is available.
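That fallback logic can be isolated as a small helper. The function name `toErrorMessage` is illustrative, not part of the hook's API:

```typescript
// Sketch of the error-message fallback: prefer the exception's message,
// otherwise use the generic fallback string.
function toErrorMessage(err: unknown): string {
  if (err instanceof Error && err.message) {
    return err.message;
  }
  return 'Something went wrong.';
}
```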
State Management
The hook uses React’s `useState` internally to manage:

- `answer`: reset to an empty string before each `send()` call
- `finalAnswer`: set to the LLM response after successful completion
- `loading`: `true` during the LLM call, `false` otherwise
- `error`: reset to an empty string before each `send()` call, set on failure
Implementation Details
The hook integrates with:

- `getConfig()` from `config/configManage.js` to retrieve configuration
- `llmCall()` from `model/openRouter.js` for LLM communication
- `systemPrompt` from `model/systemPrompt.js` for system-level instructions
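Combining the pieces described in this document, the hook body plausibly resembles the sketch below. The relative import paths and the `llmCall` signature (including whether it accepts a config argument) are assumptions, not confirmed by the source:

```typescript
import { useState } from 'react';
import { getConfig } from '../config/configManage';   // assumed relative path
import { llmCall } from '../model/openRouter';        // assumed relative path
import { systemPrompt } from '../model/systemPrompt'; // assumed relative path

export function useChat() {
  const [answer, setAnswer] = useState('');
  const [finalAnswer, setFinalAnswer] = useState('');
  const [loading, setLoading] = useState(false);
  const [error, setError] = useState('');

  const send = async (prompt: string) => {
    // Clear previous results and enter the loading state.
    setAnswer('');
    setError('');
    setLoading(true);
    try {
      const config = getConfig();
      // Assumed signature: prompt, system prompt, then config.
      const response = await llmCall(prompt, systemPrompt, config);
      setAnswer(response);
      setFinalAnswer(response);
    } catch (err) {
      setError(err instanceof Error && err.message ? err.message : 'Something went wrong.');
    } finally {
      setLoading(false);
    }
  };

  return { answer, finalAnswer, loading, error, send, setFinalAnswer };
}
```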