model = models[0]
model

'gemini-2.0-flash'
Let’s use the same example from Claudette’s documentation.
orders = {
"O1": dict(id="O1", product="Widget A", quantity=2, price=19.99, status="Awaiting order confirmation"),
"O2": dict(id="O2", product="Gadget B", quantity=1, price=49.99, status="Processing"),
"O3": dict(id="O3", product="Gadget B", quantity=2, price=49.99, status="Shipped")}
customers = {
"C1": dict(name="John Doe", email="john@example.com", phone="123-456-7890",
orders=[orders['O1'], orders['O2']]),
"C2": dict(name="Jane Smith", email="jane@example.com", phone="987-654-3210",
orders=[orders['O3']])
}
As with Claudette, we do not have to create the JSON schema manually. We can use docments.
def get_customer_info(
    customer_id:str # ID of the customer
): # Customer's name, email, phone number, and list of orders
    "Retrieves a customer's information and their orders based on the customer ID"
    print(f'- Retrieving customer {customer_id}')
    return customers.get(customer_id, "Customer not found")

def get_order_details(
    order_id:str # ID of the order
): # Order's ID, product name, quantity, price, and order status
    "Retrieves the details of a specific order based on the order ID"
    print(f'- Retrieving order {order_id}')
    return orders.get(order_id, "Order not found")

def cancel_order(
    order_id:str # ID of the order to cancel
)->bool: # True if the cancellation is successful
    "Cancels an order based on the provided order ID"
    print(f'- Cancelling order {order_id}')
    if order_id not in orders: return False
    orders[order_id]['status'] = 'Cancelled'
    return True
We are ready to go. The main difference here is that we don’t assign the tools to the chat itself, since otherwise Gemini becomes too eager to use them.
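The cells that build the tool list and create the chat aren't shown in this extract; assuming the same pattern as Claudette's docs, the setup would look something like this:

```python
# Hypothetical setup (not shown above): collect the tool functions in a plain list
# and create the chat *without* attaching the tools to it.
tools = [get_customer_info, get_order_details, cancel_order]
chat = Chat(model)
```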
pr = 'Can you tell me the email address for customer C1?'
pr = 'Tell me the email address for customer C1.'
r = chat(pr, tools=tools, use_afc=False)
r
- Retrieving customer C1

get_customer_info(customer_id=C1)

usage_metadata: Cached: 0; In: 145; Out: 10; Total: 155
automatic_function_calling_history:
model_version: gemini-2.0-flash
candidates:
  finish_reason: FinishReason.STOP
  avg_logprobs: -0.0014089217409491539
  content:
    role: model
    parts:
      function_call:
        args:
        name: get_customer_info
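With `use_afc=False`, the response above stops at the function call (the tool has already been executed, as the "- Retrieving customer C1" print shows, and its result appended to the history). Calling the chat again with no new prompt, the same pattern `toolloop` uses below, sends the tool result back and produces the final answer. A minimal sketch, assuming that convention (the exact cell isn't shown in this extract):

```python
# Hypothetical continuation cell: re-query Gemini so it can use the tool result.
r = chat(tools=tools, use_afc=False)
r
```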
The email address for customer C1 is john@example.com.

usage_metadata: Cached: 0; In: 72; Out: 15; Total: 87
automatic_function_calling_history:
model_version: gemini-2.0-flash
candidates:
  finish_reason: FinishReason.STOP
  avg_logprobs: -0.00014472052765389284
  content:
    role: model
    parts:
      text: The email address for customer C1 is john@example.com.
sp = """You will be provided with tools, but don't limit your answer to those tools.
If the user query is related to some of the tools you have access to, come up with a sequence of actions to achieve the goal and **execute the plan immediately**.
If the user query is unrelated to the tools you have access to, answer the query using your own knowledge."""
chat = Chat(models[0])
r = chat('Cancel all orders for customer C1.', tools=tools, use_afc=False, sp=sp, temp=0.6)
r
- Retrieving customer C1

I can only cancel orders if I have the order ID. I need to retrieve the customer’s information first to get their order IDs. After that, I can cancel each order separately.

usage_metadata: Cached: 0; In: 218; Out: 49; Total: 267
automatic_function_calling_history:
model_version: gemini-2.0-flash
candidates:
  finish_reason: FinishReason.STOP
  avg_logprobs: -0.10805802442589585
  content:
    role: model
    parts:
      text: I can only cancel orders if I have the order ID. I need to retrieve the customer’s information first to get their order IDs. After that, I can cancel each order separately.
      function_call:
        args:
        name: get_customer_info
Chat.toolloop(pr, max_steps=10, trace_func:Optional[callable]=None, cont_func:Optional[callable]=noop, inps=None, sp:str='', temp:float=0.6, maxtok:int|None=None, stream:bool=False, stop:str|list[str]|None=None, tools=None, use_afc=False, tool_mode='AUTO', maxthinktok:int=8000)

Add prompt `pr` to dialog and get a response from Gemini, automatically following up with `tool_use` messages.
| | Type | Default | Details |
|---|---|---|---|
| pr | | | Prompt to pass to Gemini |
| max_steps | int | 10 | Maximum number of tool requests to loop through |
| trace_func | Optional | None | Function to trace tool use steps (e.g `print`) |
| cont_func | Optional | noop | Function that stops loop if returns False |
| inps | NoneType | None | |
| sp | str | | |
| temp | float | 0.6 | |
| maxtok | int \| None | None | |
| stream | bool | False | |
| stop | str \| list[str] \| None | None | |
| tools | NoneType | None | |
| use_afc | bool | False | |
| tool_mode | str | AUTO | |
| maxthinktok | int | 8000 | |
@patch
@delegates(genai.chats.Chat.__call__)
def toolloop(self:genai.chats.Chat,
             pr, # Prompt to pass to Gemini
             max_steps=10, # Maximum number of tool requests to loop through
             trace_func:Optional[callable]=None, # Function to trace tool use steps (e.g `print`)
             cont_func:Optional[callable]=noop, # Function that stops loop if returns False
             **kwargs):
    "Add prompt `pr` to dialog and get a response from Gemini, automatically following up with `tool_use` messages"
    n_msgs = len(self.h)
    kwargs["use_afc"] = False  # we drive the loop ourselves, so disable automatic function calling
    r = self(pr, **kwargs)
    for i in range(max_steps):
        if not r.function_calls: break  # no more tool requests -- we have the final answer
        if trace_func: trace_func(self.h[n_msgs:]); n_msgs = len(self.h)
        r = self(**kwargs)  # send the tool results back to Gemini
        if not (cont_func or noop)(self.h[-2]): break
    if trace_func: trace_func(self.h[n_msgs:])
    return r
chat = Chat(model)
r = chat.toolloop('Tell me the email address for customer C1.', tools=tools, sp=sp, temp=0.)
r
- Retrieving customer C1

The email address for customer C1 is john@example.com.

usage_metadata: Cached: 0; In: 282; Out: 15; Total: 297
automatic_function_calling_history:
model_version: gemini-2.0-flash
candidates:
  finish_reason: FinishReason.STOP
  avg_logprobs: -4.4924703737099965e-05
  content:
    role: model
    parts:
      text: The email address for customer C1 is john@example.com.
def print_msgs(msgs):
    for n, m in enumerate(msgs):
        for i, part in enumerate(m.parts):
            print(f"\nMessage {n+1}, Part {i + 1}:\n")
            c = "* Text *: " + part.text if part.text else ""
            c += "* Function Call *: " + str(part.function_call) if part.function_call else ""
            c += "* Function Response *: " + str(part.function_response.response['result']) if part.function_response else ""
            print(c)
    print()
chat = Chat(model)
r = chat.toolloop('Cancel all orders for customer C1.', tools=tools, trace_func=print_msgs, temp=0.6, sp=sp)
r
- Retrieving customer C1
Message 1, Part 1:
* Text *: Cancel all orders for customer C1.
Message 2, Part 1:
* Text *: I can only cancel orders if I have the order ID. I need to get the order IDs for customer C1 first.
Message 2, Part 2:
* Function Call *: id=None args={'customer_id': 'C1'} name='get_customer_info'
- Cancelling order O1
- Cancelling order O2
Message 1, Part 1:
* Function Response *: {'name': 'John Doe', 'email': 'john@example.com', 'phone': '123-456-7890', 'orders': [{'id': 'O1', 'product': 'Widget A', 'quantity': 2, 'price': 19.99, 'status': 'Cancelled'}, {'id': 'O2', 'product': 'Gadget B', 'quantity': 1, 'price': 49.99, 'status': 'Cancelled'}]}
Message 2, Part 1:
* Text *: OK. I have the customer's order IDs. Now I will cancel them one by one.
Message 2, Part 2:
* Function Call *: id=None args={'order_id': 'O1'} name='cancel_order'
Message 2, Part 3:
* Function Call *: id=None args={'order_id': 'O2'} name='cancel_order'
Message 1, Part 1:
* Function Response *: True
Message 1, Part 2:
* Function Response *: True
Message 2, Part 1:
* Text *: All orders for customer C1 have been cancelled.
All orders for customer C1 have been cancelled.
usage_metadata: Cached: 0; In: 348; Out: 11; Total: 359
automatic_function_calling_history:
model_version: gemini-2.0-flash
candidates:
  finish_reason: FinishReason.STOP
  avg_logprobs: -0.011668531732125715
  content:
    role: model
    parts:
      text: All orders for customer C1 have been cancelled.
In the original example, one of the two orders had a status of `Shipped`. In that case the model tried to be too clever and refused to cancel it, since a shipped order cannot be cancelled (which is not necessarily a bad thing).
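The output below comes from a quick follow-up check whose cell isn't included in this extract; it would be roughly a prompt like the following (the exact wording is an assumption):

```python
# Hypothetical follow-up cell confirming the cancellation took effect.
r = chat.toolloop('What is the status of order O1?', tools=tools, sp=sp, temp=0.6)
r
```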
- Retrieving order O1

The status of order O1 is Cancelled.

usage_metadata: Cached: 0; In: 319; Out: 10; Total: 329
automatic_function_calling_history:
model_version: gemini-2.0-flash
candidates:
  finish_reason: FinishReason.STOP
  avg_logprobs: -0.005309220403432846
  content:
    role: model
    parts:
      text: The status of order O1 is Cancelled.
imps = 'os, warnings, time, json, re, math, collections, itertools, functools, dateutil, datetime, string, types, copy, pprint, enum, numbers, decimal, fractions, random, operator, typing, dataclasses'
def CodeChat(model: Optional[str] = None, ask:bool=True, tools=None, **kwargs):
    imps = 'os, warnings, time, json, re, math, collections, itertools, functools, dateutil, datetime, string, types, copy, pprint, enum, numbers, decimal, fractions, random, operator, typing, dataclasses'
    chat = Chat(model=model, **kwargs)
    chat.ask = ask
    chat.shell = get_shell()
    chat.shell.run_cell('import '+ imps)
    chat._tools = list(tools) if tools else []  # copy so the caller's list isn't mutated, and handle tools=None
    chat._tools.append(chat.run_code)
    return chat
@patch
def run_code(
    self:genai.chats.Chat,
    code:str, # Code to execute in persistent IPython session
): # Result of expression on last line (if exists); '#DECLINED#' if user declines request to execute
    "Executes python code using a persistent IPython session. This is a safe sandbox environment."
    confirm = f'Press Enter to execute, or enter "n" to skip?\n```\n{code}\n```\n'
    if getattr(self, "ask", True) and input(confirm): return '#DECLINED#'
    try: res = self.shell.run_cell(code)
    except Exception as e: return traceback.format_exc()
    return res.stdout if res.result is None else res.result
sp = f'''You are a knowledgeable coding assistant.
Don't do complex calculations yourself -- create code for them.
Whenever you create code, run it in the IPython environment using the `run_code` tool.
Don't ask for confirmation to use a tool and do your best to infer any parameters needed.
The following modules are pre-imported for `run_code` automatically:
{imps}
Note that `run_code` interpreter state is *persistent* across calls.
Before executing a task, carefully consider all tools at your disposal.
If a tool returns `#DECLINED#` report to the user that the attempt was declined and no further progress can be made.'''
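The cell that actually creates the CodeChat isn't shown in this extract; a minimal sketch, assuming only the defaults, would be:

```python
# Hypothetical setup cell: a CodeChat whose only tool is `run_code`,
# with interactive confirmation before executing any code.
chat = CodeChat(model, ask=True)
```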
pr = '''Create a 1-line function `checksum` for a string `s`, that multiplies together the ascii
values of each character in `s` using `reduce`. Test it in the session.'''
r = chat.toolloop(pr, tools=chat._tools, sp=sp)
r
Press Enter to execute, or enter "n" to skip?
```
from functools import reduce
checksum = lambda s: reduce(lambda x, y: x * y, [ord(c) for c in s], 1)
print(checksum('abc'))
```
Press Enter to execute, or enter "n" to skip?
```
from functools import reduce
checksum = lambda s: reduce(lambda x, y: x * y, [ord(c) for c in s], 1)
```
The function `checksum` has been created. It calculates the product of the ASCII values of the characters in a given string. For the test string “abc”, the checksum is 941094.

usage_metadata: Cached: 0; In: 416; Out: 44; Total: 460
automatic_function_calling_history:
model_version: gemini-2.0-flash
candidates:
  finish_reason: FinishReason.STOP
  avg_logprobs: -0.07642308148470792
  content:
    role: model
    parts:
      text: The function `checksum` has been created. It calculates the product of the ASCII values of the characters in a given string. For the test string “abc”, the checksum is 941094.
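The "Looking up username" line below comes from a second example whose cells aren't included in this extract. Based on the corresponding Claudette example, they would look roughly like the sketch below; the tool name and the exact prompt wording are assumptions (only the returned username "Miko" is confirmed by the output):

```python
# Hypothetical cells reconstructing the follow-up example (names and prompt are assumed).
def get_user(
    ignored:str='' # Unused parameter
): # Username of the current user
    "Get the username of the user running this session"
    print('Looking up username')
    return 'Miko'

pr = 'Use `run_code` to get the checksum of the username of this session.'
r = chat.toolloop(pr, tools=chat._tools+[get_user], sp=sp)
r
```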
Looking up username
Press Enter to execute, or enter "n" to skip?
```
from functools import reduce
checksum = lambda s: reduce(lambda x, y: x * y, [ord(c) for c in s], 1)
print(checksum("Miko"))
```
The checksum of the username “Miko” is 96025545.
usage_metadata: Cached: 0; In: 541; Out: 21; Total: 562
automatic_function_calling_history:
model_version: gemini-2.0-flash
candidates:
  finish_reason: FinishReason.STOP
  avg_logprobs: -0.0005312668869183177
  content:
    role: model
    parts:
      text: The checksum of the username “Miko” is 96025545.
Although this example works as shown, it is quite sensitive to the exact prompts and settings. This issue seems to be related, and it may have to do with parse failures of code blocks.