
Streaming generation is not working with automatic function calling #106

Open
lmsh7 opened this issue Jan 9, 2025 · 6 comments
Labels
priority: p2 (Moderately-important priority. Fix may not be included in next release.)
type: bug (Error or flaw in code with unintended results or allowing sub-optimal usage patterns.)

Comments


lmsh7 commented Jan 9, 2025

Environment details

  • Programming language: Python
  • OS: macOS
  • Language runtime version: 3.12.7
  • Package version: 0.4.0

Steps to reproduce

  1. Run this script:

    import os
    from google import genai
    from google.genai import types
    from google.genai.types import (
        GoogleSearch,
    )
    
    
    client = genai.Client(api_key=os.environ["GEMINI_API_KEY"])
    
    
    def search(query: str) -> str:
        return query
    
    
    # mytools = [{"google_search": GoogleSearch()}]
    mytools = [search]
    
    
    generation_config = types.GenerateContentConfig(
        temperature=1,
        top_p=0.95,
        top_k=40,
        max_output_tokens=8192,
        tools=mytools,
    )
    params = {
        "model": "gemini-2.0-flash-exp",
        "config": generation_config,
        "contents": "Search something about hongkong",
    }
    for chunk in client.models.generate_content_stream(**params):
        print(chunk.text)
  2. Observe the traceback:

    Traceback (most recent call last):
      File "/Users/lmsh7/Code/project8/./bug_report.py", line 33, in <module>
        print(chunk.text)
              ^^^^^^^^^^
      File "/Users/lmsh7/Code/project8/.venv/lib/python3.12/site-packages/google/genai/types.py", line 2483, in text
        raise ValueError(
    ValueError: GenerateContentResponse.text only supports text parts, but got function_call partvideo_metadata=None thought=None code_execution_result=None executable_code=None file_data=None function_call=FunctionCall(id=None, args={'query': 'hongkong'}, name='search') function_response=None inline_data=None text=None
lmsh7 added the priority: p2 and type: bug labels on Jan 9, 2025

Giom-V commented Jan 9, 2025

Hello @lmsh7, I think the issue is that you can't simply use chunk.text. You need to do something like:

if chunk.text is not None:
  print(chunk.text)
elif chunk.function_call is not None:
  # do something, or maybe nothing.


lmsh7 commented Jan 9, 2025

Hello @lmsh7, I think the issue is that you can't simply use chunk.text. You need to do something like:

if chunk.text is not None:
  print(chunk.text)
elif chunk.function_call is not None:
  # do something, or maybe nothing.

Could you provide a working example? When I attempt this, I get:

Traceback (most recent call last):
  File "/Users/lmsh7/Code/project8/bug_report.py", line 33, in <module>
    if chunk.text is not None:
       ^^^^^^^^^^
  File "/Users/lmsh7/Code/project8/.venv/lib/python3.12/site-packages/google/genai/types.py", line 2483, in text
    raise ValueError(
ValueError: GenerateContentResponse.text only supports text parts, but got function_call partvideo_metadata=None thought=None code_execution_result=None executable_code=None file_data=None function_call=FunctionCall(id=None, args={'query': 'hongkong'}, name='search') function_response=None inline_data=None text=None


Giom-V commented Jan 9, 2025

Try using chunk.candidates[0].content.parts[0] instead of chunk. That solves your error, but as far as I can see you still do not get the grounding data.
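Since a streamed part can carry either text or a function_call, a small defensive helper can branch on whichever field is populated instead of touching the .text accessor unconditionally. The helper below is a hypothetical sketch, shown with stand-in SimpleNamespace objects rather than the real google.genai part types, so it runs without the SDK:

```python
from types import SimpleNamespace

def describe_part(part) -> str:
    """Hypothetical helper: render a response part whether it holds
    plain text or a function call, without raising on either."""
    if getattr(part, "text", None):
        return part.text
    fc = getattr(part, "function_call", None)
    if fc is not None:
        # The model asked to call a tool; show the call instead of text.
        return f"[function_call: {fc.name}({fc.args})]"
    return ""

# Stand-in objects shaped like google.genai response parts, for illustration.
text_part = SimpleNamespace(text="hello", function_call=None)
call_part = SimpleNamespace(
    text=None,
    function_call=SimpleNamespace(name="search", args={"query": "hongkong"}),
)

print(describe_part(text_part))
print(describe_part(call_part))
```

In a real loop you would apply this to each element of chunk.candidates[0].content.parts rather than to the chunk itself.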


lmsh7 commented Jan 9, 2025

Try using chunk.candidates[0].content.parts[0] instead of chunk.

I tried this approach but the results were not what I expected:

# Using streaming approach
for chunk in client.models.generate_content_stream(**params):
    if chunk.candidates[0].content.parts[0] is not None:
        print(chunk.candidates[0].content.parts[0])

Output:

video_metadata=None thought=None code_execution_result=None executable_code=None file_data=None function_call=FunctionCall(id=None, args={'query': 'hongkong'}, name='search') function_response=None inline_data=None text=None
video_metadata=None thought=None code_execution_result=None executable_code=None file_data=None function_call=None function_response=None inline_data=None text=''

In comparison, when using the non-streaming approach:

# Using regular approach
response = client.models.generate_content(**params)
print(response.text)

Output:

I searched for "hongkong" and the API returned a result "hongkong". Is there anything else I can help you with?


Giom-V commented Jan 9, 2025

You forgot the .text in your code.

Here's a more complete loop you can use:

for chunk in client.models.generate_content_stream(**params):
    for candidate in chunk.candidates:
      for part in candidate.content.parts:
        print(part.text)


lmsh7 commented Jan 10, 2025

You forgot the .text in your code.

Here's a more complete loop you can use:

for chunk in client.models.generate_content_stream(**params):
    for candidate in chunk.candidates:
      for part in candidate.content.parts:
        print(part.text)

Here is the full code to reproduce; the result is still None.

import os
from google import genai
from google.genai import types
from google.genai.types import (
    GoogleSearch,
)


client = genai.Client(api_key=os.environ["GEMINI_API_KEY"])


def search(query: str) -> str:
    return query


# mytools = [{"google_search": GoogleSearch()}]
mytools = [search]


generation_config = types.GenerateContentConfig(
    temperature=1,
    top_p=0.95,
    top_k=40,
    max_output_tokens=8192,
    tools=mytools,
)
params = {
    "model": "gemini-2.0-flash-exp",
    "config": generation_config,
    "contents": "Search something about hongkong",
}

# response = client.models.generate_content(**params)
# print(response.text)

for chunk in client.models.generate_content_stream(**params):
    for candidate in chunk.candidates:
        for part in candidate.content.parts:
            print(part.text)
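The symptom in this thread is that automatic function calling does not appear to be applied by generate_content_stream in 0.4.0. Until that is fixed, one possible workaround is to handle the FunctionCall part manually: execute the tool yourself, append a function_response to the history, and make a second streaming call. This is an untested sketch, not a confirmed fix; disabling automatic_function_calling and the two-pass loop are my assumptions. It requires google-genai installed and GEMINI_API_KEY set, and does nothing otherwise:

```python
import os

def search(query: str) -> str:
    """Same toy tool as in the report above: echoes the query."""
    return query

def stream_with_manual_tool_calls(prompt: str) -> None:
    # Imports kept local so the sketch can be read without the SDK installed.
    from google import genai
    from google.genai import types

    client = genai.Client(api_key=os.environ["GEMINI_API_KEY"])
    config = types.GenerateContentConfig(
        tools=[search],
        # Ask the SDK not to auto-call, so the FunctionCall part is ours to handle.
        automatic_function_calling=types.AutomaticFunctionCallingConfig(disable=True),
    )
    history = [types.Content(role="user", parts=[types.Part(text=prompt)])]

    # First pass: print any text and collect any function calls the model emits.
    calls = []
    for chunk in client.models.generate_content_stream(
        model="gemini-2.0-flash-exp", contents=history, config=config
    ):
        for part in chunk.candidates[0].content.parts:
            if part.function_call is not None:
                calls.append(part.function_call)
                history.append(chunk.candidates[0].content)
            elif part.text:
                print(part.text, end="")

    # Execute each call ourselves and append the result to the history.
    for fc in calls:
        result = search(**fc.args)
        history.append(
            types.Content(
                role="user",
                parts=[
                    types.Part.from_function_response(
                        name=fc.name, response={"result": result}
                    )
                ],
            )
        )

    # Second pass: stream the final answer with the tool result in context.
    if calls:
        for chunk in client.models.generate_content_stream(
            model="gemini-2.0-flash-exp", contents=history, config=config
        ):
            for part in chunk.candidates[0].content.parts:
                if part.text:
                    print(part.text, end="")

if __name__ == "__main__" and os.environ.get("GEMINI_API_KEY"):
    stream_with_manual_tool_calls("Search something about hongkong")
```

This mirrors what automatic function calling does for the non-streaming generate_content path, just spelled out by hand.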
