Streaming generation is not working with automatic function calling #106
Hello @lmsh7, I think your issue is because you can't simply use chunk.text when a streamed chunk contains a function-call part.
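For illustration, a minimal sketch of the two access patterns (the names client and params are assumed to match the repro further down; this is not the snippet from the original comment):

# Raises ValueError as soon as a streamed chunk carries a function-call part:
for chunk in client.models.generate_content_stream(**params):
    print(chunk.text)

# Works: inspect the parts directly instead of going through the .text accessor.
for chunk in client.models.generate_content_stream(**params):
    for part in chunk.candidates[0].content.parts:
        print(part)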
Could you provide a working example? When I attempt this...

Traceback (most recent call last):
  File "/Users/lmsh7/Code/project8/bug_report.py", line 33, in <module>
    if chunk.text is not None:
       ^^^^^^^^^^
  File "/Users/lmsh7/Code/project8/.venv/lib/python3.12/site-packages/google/genai/types.py", line 2483, in text
    raise ValueError(
ValueError: GenerateContentResponse.text only supports text parts, but got function_call part
video_metadata=None thought=None code_execution_result=None executable_code=None file_data=None function_call=FunctionCall(id=None, args={'query': 'hongkong'}, name='search') function_response=None inline_data=None text=None
Try using chunk.candidates[0].content.parts instead of chunk.text.
I tried this approach but the results were not what I expected:

# Using streaming approach
for chunk in client.models.generate_content_stream(**params):
    if chunk.candidates[0].content.parts[0] is not None:
        print(chunk.candidates[0].content.parts[0])

Output:

In comparison, when using the non-streaming approach:

# Using regular approach
response = client.models.generate_content(**params)
print(response.text)

Output:
You forgot the loop over all the candidates and parts. Here's a more complete loop you can use:
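A plausible sketch of such a loop (an illustration, not the exact snippet from this comment), assuming the params dict defined in the repro below:

full_text = []
function_calls = []
for chunk in client.models.generate_content_stream(**params):
    for candidate in chunk.candidates or []:
        for part in candidate.content.parts or []:
            if part.text is not None:
                full_text.append(part.text)
            if part.function_call is not None:
                function_calls.append(part.function_call)

print("".join(full_text))
print(function_calls)

Collecting text parts and function-call parts separately means a chunk that only carries a function call no longer shows up as None.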
Here is the full code to reproduce; the result is still None.

import os

from google import genai
from google.genai import types
from google.genai.types import (
    GoogleSearch,
)

client = genai.Client(api_key=os.environ["GEMINI_API_KEY"])


def search(query: str) -> str:
    return query


# mytools = [{"google_search": GoogleSearch()}]
mytools = [search]

generation_config = types.GenerateContentConfig(
    temperature=1,
    top_p=0.95,
    top_k=40,
    max_output_tokens=8192,
    tools=mytools,
)

params = {
    "model": "gemini-2.0-flash-exp",
    "config": generation_config,
    "contents": "Search something about hongkong",
}

# response = client.models.generate_content(**params)
# print(response.text)

for chunk in client.models.generate_content_stream(**params):
    for candidate in chunk.candidates:
        for part in candidate.content.parts:
            print(part.text)
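The None here is presumably because the only part in the streamed chunk is the function call itself, whose text field is None (as the traceback above shows). As an illustration (not a fix for the underlying streaming issue), printing the other field shows what the chunk actually contains:

for chunk in client.models.generate_content_stream(**params):
    for candidate in chunk.candidates:
        for part in candidate.content.parts:
            if part.text is not None:
                print(part.text)
            else:
                # for this repro, something like:
                # FunctionCall(id=None, args={'query': 'hongkong'}, name='search')
                print(part.function_call)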