Async versions


source

AsyncChat

 AsyncChat (model:str, sp:str='', temp:float=0.6, text_only:bool=False,
            cli:google.genai.client.AsyncClient|None=None)
|  | Type | Default | Details |
|---|---|---|---|
| model | str |  | The model to be used |
| sp | str | '' | System prompt |
| temp | float | 0.6 | Default temperature |
| text_only | bool | False | Suppress multimodality even if the model allows for it |
| cli | google.genai.client.AsyncClient \| None | None | Optional Client to be passed (to keep track of usage) |
Exported source
from gaspare import *

import os

from google import genai

from fastcore.all import *
Exported source
def AsyncClient(model:str, # The model to be used by default (can be overridden when generating)
           sp:str='', # System prompt
           temp:float=0.6, # Default temperature
           text_only:bool=False, # Suppress multimodality even if the model allows for it
          ): 
    cli = genai.Client(api_key=os.environ['GEMINI_API_KEY'])
    c = cli.aio
    c.models.post_cbs = [c.models._call_tools]
    c.models.model, c.models.sp, c.models.temp, c.models.text_only = model, sp, temp, text_only
    return c


def AsyncChat(model:str, # The model to be used 
           sp:str='', # System prompt
           temp:float=0.6, # Default temperature
           text_only:bool=False, # Suppress multimodality even if the model allows for it
           cli:genai.client.AsyncClient|None=None, # Optional Client to be passed (to keep track of usage)
          ):
    c = AsyncClient(model, sp, temp, text_only) if cli is None else cli
    c.model, c.sp, c.temp, c.text_only = model, sp, temp, text_only
    chat = c.chats.create(model=c.model)
    chat.c.post_cbs.insert(0, chat._rec_res)
    return chat

source

AsyncClient

 AsyncClient (model:str, sp:str='', temp:float=0.6, text_only:bool=False)
|  | Type | Default | Details |
|---|---|---|---|
| model | str |  | The model to be used by default (can be overridden when generating) |
| sp | str | '' | System prompt |
| temp | float | 0.6 | Default temperature |
| text_only | bool | False | Suppress multimodality even if the model allows for it |

Testing the AsyncClient

Everything should work with exactly the same API; you just add an `await` in front.
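Outside a notebook (where top-level `await` is unavailable), the same calls can be driven with `asyncio.run`. A minimal sketch, assuming a model name such as "gemini-2.0-flash" (the names below are illustrative, not part of gaspare's docs):

import asyncio
from gaspare import *

async def main():
    cli = AsyncClient("gemini-2.0-flash")  # model name assumed for illustration
    res = await cli("Hi there")            # same call as the sync client, just awaited
    print(res)
    print(cli.use)                         # usage tracking works the same way

asyncio.run(main())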

cli = AsyncClient(models[0])
await cli("Hi there")

Hi! How can I help you today?

  • usage_metadata: Cached: 0; In: 2; Out: 10; Total: 12
  • automatic_function_calling_history:
  • model_version: gemini-2.0-flash
  • candidates:
    candidates[0]
    • avg_logprobs: -0.009500809758901597
    • content:
      • role: model
      • parts:
        parts[0]
        • text: Hi! How can I help you today?
    • finish_reason: FinishReason.STOP
cli.use

Cached: 0; In: 2; Out: 10; Total: 12

s = cli("write me a haiku", stream=True)
async for c in await s: print(c, end='')
Green leaves softly sway,
Sunlight warms the gentle breeze,
Nature's peace unfolds.

Since we did not need an async __call__ method, we could have had it return an async generator rather than a coroutine (in which case the loop above would have been `async for c in s` rather than `async for c in **await** s`). To keep compatibility with Claudette’s async client, we wrap the async generator in a coroutine to be awaited, even though this technically adds a slight overhead.
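To make the trade-off concrete, here is a minimal, self-contained sketch of the two designs (illustrative only, not gaspare's actual implementation):

import asyncio
from typing import AsyncIterator

async def _chunks() -> AsyncIterator[str]:
    # Stand-in for the streamed model response.
    for c in ("Green leaves ", "softly sway"): yield c

def stream_direct() -> AsyncIterator[str]:
    # Design 1: hand back the async generator itself,
    # consumed with `async for c in s`.
    return _chunks()

async def stream_wrapped():
    # Design 2 (the Claudette-compatible choice): a coroutine that
    # resolves to the async generator, so callers write
    # `async for c in await s`.
    return _chunks()

async def main():
    async for c in stream_direct(): print(c, end='')
    print()
    async for c in await stream_wrapped(): print(c, end='')

asyncio.run(main())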

cli.use

Cached: 0; In: 6; Out: 30; Total: 36

def sums(
    a:int,  # First thing to sum
    b:int=1 # Second thing to sum
) -> int: # The sum of the inputs
    "Adds a + b."
    print(f"Finding the sum of {a} and {b}")
    return a + b
a,b = 604542,6458932
pr = f"What is {a}+{b}?"
sp = "You are a summing expert."
r = await cli(pr, sp=sp, tools=[sums], use_afc=True)
r
Finding the sum of 604542 and 6458932
The sum of 604542 and 6458932 is 7063474.
  • usage_metadata: Cached: 0; In: 66; Out: 29; Total: 95
  • automatic_function_calling_history:
    automatic_function_calling_history[0]
    • role: user
    • parts:
      parts[0]
      • text: What is 604542+6458932?
    automatic_function_calling_history[1]
    • role: model
    • parts:
      parts[0]
      • function_call:
        • name: sums
        • args:
          • a: 604542
          • b: 6458932
    automatic_function_calling_history[2]
    • role: user
    • parts:
      parts[0]
      • function_response:
        • name: sums
        • response:
          • result: 7063474
  • model_version: gemini-2.0-flash
  • candidates:
    candidates[0]
    • avg_logprobs: -0.021660833523191255
    • content:
      • role: model
      • parts:
        parts[0]
        • text: The sum of 604542 and 6458932 is 7063474.
    • finish_reason: FinishReason.STOP
r = await cli(pr, sp=sp, tools=[sums], use_afc=False)
r
Finding the sum of 604542 and 6458932
  • sums(a=604542, b=6458932)
  • usage_metadata: Cached: 0; In: 70; Out: 3; Total: 73
  • automatic_function_calling_history:
  • model_version: gemini-2.0-flash
  • candidates:
    candidates[0]
    • avg_logprobs: 1.1818580484638611e-05
    • content:
      • role: model
      • parts:
        parts[0]
        • function_call:
          • name: sums
          • args:
            • a: 604542
            • b: 6458932
    • finish_reason: FinishReason.STOP
await cli.structured(pr, sums)
Finding the sum of 604542 and 6458932
[7063474]
await cli.imagen("Charcoal drawing of a cute otter")
  • generated_images:
    generated_images[0]
    • image:
      • image_bytes: b'\x89PNG\r\n\x1a…'
      • mime_type: image/png
await cli("count to 10", stop="3")

Okay, here we go:

1

2

  • usage_metadata: Cached: 0; In: 5; Out: 11; Total: 16
  • automatic_function_calling_history:
  • model_version: gemini-2.0-flash
  • candidates:
    candidates[0]
    • avg_logprobs: -0.15975962985645642
    • content:
      • role: model
      • parts:
        parts[0]
        • text: Okay, here we go:

          1 2
    • finish_reason: FinishReason.STOP
cli

Okay, here we go:

1

2

  • usage_metadata: Cached: 0; In: 5; Out: 11; Total: 16
  • automatic_function_calling_history:
  • model_version: gemini-2.0-flash
  • candidates:
    candidates[0]
    • avg_logprobs: -0.15975962985645642
    • content:
      • role: model
      • parts:
        parts[0]
        • text: Okay, here we go:

          1 2
    • finish_reason: FinishReason.STOP
|  | Input | Output | Cached |
|---|---|---|---|
| Tokens | 211 | 76 | 0 |
| Totals | Tokens: 287 | $0.000052 |  |

Testing the AsyncChat

Once again, everything should just work with an await in front.
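One payoff of the async interface is that independent chats can be awaited concurrently. A hedged sketch using asyncio.gather (the model name and prompts are placeholders, not from the docs):

import asyncio
from gaspare import *

async def greet(name):
    # Each call gets its own chat, so the requests are independent.
    chat = AsyncChat(model="gemini-2.0-flash")
    return await chat(f"Hi, I am {name}")

async def main():
    replies = await asyncio.gather(greet("Miko"), greet("Rin"))
    for r in replies: print(r)

asyncio.run(main())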

chat = AsyncChat(model=models[0])
await chat("Hi, I am Miko")

Hi Miko! It’s nice to meet you. How can I help you today?

  • usage_metadata: Cached: 0; In: 5; Out: 19; Total: 24
  • automatic_function_calling_history:
  • model_version: gemini-2.0-flash
  • candidates:
    candidates[0]
    • avg_logprobs: -0.006754375602069654
    • content:
      • role: model
      • parts:
        parts[0]
        • text: Hi Miko! It’s nice to meet you. How can I help you today?
    • finish_reason: FinishReason.STOP
await chat("What is my name again?")

Your name is Miko.

  • usage_metadata: Cached: 0; In: 30; Out: 6; Total: 36
  • automatic_function_calling_history:
  • model_version: gemini-2.0-flash
  • candidates:
    candidates[0]
    • avg_logprobs: -0.002763839748998483
    • content:
      • role: model
      • parts:
        parts[0]
        • text: Your name is Miko.
    • finish_reason: FinishReason.STOP
pr = f"What is {a}+{b}?"
r = await chat(pr, tools=[sums])
r
Finding the sum of 604542 and 6458932
  • sums(a=604542, b=6458932)
  • usage_metadata: Cached: 0; In: 143; Out: 3; Total: 146
  • automatic_function_calling_history:
  • model_version: gemini-2.0-flash
  • candidates:
    candidates[0]
    • avg_logprobs: 2.1037994883954525e-06
    • content:
      • role: model
      • parts:
        parts[0]
        • function_call:
          • name: sums
          • args:
            • a: 604542
            • b: 6458932
    • finish_reason: FinishReason.STOP
await chat()

The answer is 7063474.

  • usage_metadata: Cached: 0; In: 102; Out: 13; Total: 115
  • automatic_function_calling_history:
  • model_version: gemini-2.0-flash
  • candidates:
    candidates[0]
    • avg_logprobs: -0.1015774470109206
    • content:
      • role: model
      • parts:
        parts[0]
        • text: The answer is 7063474.
    • finish_reason: FinishReason.STOP
await chat(["Write a haiku about this image", "../examples/match.png"])

Feathered fists clash now,

Otter and bird in the ring,

Crowd roars, dogs all cheer.

  • usage_metadata: Cached: 0; In: 1411; Out: 23; Total: 1434
  • automatic_function_calling_history:
  • model_version: gemini-2.0-flash
  • candidates:
    candidates[0]
    • avg_logprobs: -0.4689888746842094
    • content:
      • role: model
      • parts:
        parts[0]
        • text: Feathered fists clash now, Otter and bird in the ring, Crowd roars, dogs all cheer.
    • finish_reason: FinishReason.STOP
async for chunk in await chat("Now do a short poem on it", stream=True): print(chunk, end='')
In mosaic's gleam, a boxing ring,
A bird of old, with feathered swing.
Against an otter, sleek and brown,
Before a cheering, canine town.
A Sasquatch looms, a watchful eye,
As ancient battles reach for sky.
The referee stands, stripes so neat,
A whimsical, historic feat.
chat

In mosaic’s gleam, a boxing ring,

A bird of old, with feathered swing.

Against an otter, sleek and brown,

Before a cheering, canine town.

A Sasquatch looms, a watchful eye,

As ancient battles reach for sky.

The referee stands, stripes so neat,

A whimsical, historic feat.

History

user: Hi, I am Miko

model: Hi Miko! It’s nice to meet you. How can I help you today?

user: What is my name again?

model: Your name is Miko.

user: What is 604542+6458932?

model: 604542 + 6458932 = 7063474

user: What is 604542+6458932?

model:

tool:

model: The answer is 7063474.

user: Write a haiku about this image IMAGE_0

model: Feathered fists clash now, Otter and bird in the ring, Crowd roars, dogs all cheer.

user: Now do a short poem on it

model: In mosaic’s gleam, a boxing ring, A bird of old, with feathered swing. Against an otter, sleek and brown, Before a cheering, canine town. A Sasquatch looms, a watchful eye, As ancient battles reach for sky. The referee stands, stripes so neat, A whimsical, historic feat.

|  | Input | Output | Cached |
|---|---|---|---|
| Tokens | 3,186 | 161 | 0 |
| Totals | Tokens: 3,347 | $0.000383 |  |