"""
This example demonstrates how to use the reply() method to have a conversation with the
agent/LLM. After getting an initial response, you can use reply() to ask follow-up
questions or request confirmation. The agent/LLM maintains context from the previous
interaction, allowing it to:

1. Confirm its previous output
2. Correct mistakes if needed
3. Provide additional explanation
4. Refine its response based on new information

Example:
    run = await my_agent(input)  # Initial response
    run = await run.reply(user_message="Are you sure?")  # Ask for confirmation
    ...
"""

import asyncio

from dotenv import load_dotenv
# ...

async def main():
    # ...

    print(f"Extracted: {run.output.first_name} {run.output.last_name}")

    # The reply() method allows you to continue the conversation with the LLM
    # by sending a follow-up message. The LLM will maintain context from the
    # previous interaction and can confirm or revise its previous output.
    # Here we ask it to double check its extraction.
    run = await run.reply(user_message="Are you sure?")

    print("\nAfter double-checking:")
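

The context-carrying behavior that reply() relies on can be illustrated with a minimal, self-contained sketch. The `Run` class and its echo-style answer below are hypothetical stand-ins for the library's actual run object, used only to show how each reply() call folds the prior messages into the next request:

```python
import asyncio
from dataclasses import dataclass, field


@dataclass
class Run:
    """Hypothetical stand-in for the library's run object."""
    messages: list = field(default_factory=list)  # accumulated conversation turns
    output: str = ""

    async def reply(self, user_message: str) -> "Run":
        # Each reply() carries the full prior history forward, which is what
        # lets the model confirm or revise its earlier output in context.
        history = self.messages + [("user", user_message)]
        answer = f"(answer after {len(history)} messages)"
        return Run(messages=history + [("assistant", answer)], output=answer)


async def demo():
    run = Run(messages=[("user", "Extract the name")], output="Jane Doe")
    follow_up = await run.reply(user_message="Are you sure?")
    print(follow_up.output)        # the new answer was produced with full history
    print(len(follow_up.messages)) # original turn + follow-up + new answer


asyncio.run(demo())
```

The design point is that reply() returns a new run rather than mutating the old one, so earlier runs remain inspectable and the conversation can branch from any point.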