
I need to implement a chatbot that uses function calling with helper functions to interact with a database. This is a sandbox task from a lab exercise.

I have a problem sending the model a ready-made API response to be formatted for the user.

First, the functions needed for the database calls are declared:

list_tables_func = vertexai.generative_models.FunctionDeclaration.from_func(list_tables)
describe_table_func = vertexai.generative_models.FunctionDeclaration.from_func(describe_table)
execute_query_func = vertexai.generative_models.FunctionDeclaration.from_func(execute_query)
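
For context, the helpers themselves are thin wrappers around a SQLite connection; the sketch below is a simplified stand-in for my actual implementations (the database file name and the return formats here are assumptions):

import sqlite3

db_conn = sqlite3.connect("sample.db")  # assumed database file

def list_tables() -> str:
    """Retrieve the names of all tables in the database."""
    cursor = db_conn.execute("SELECT name FROM sqlite_master WHERE type = 'table'")
    return ", ".join(row[0] for row in cursor.fetchall())

def describe_table(table_name: str) -> str:
    """Look up the column names and types of the given table."""
    cursor = db_conn.execute(f"PRAGMA table_info({table_name})")
    return ", ".join(f"{row[1]} {row[2]}" for row in cursor.fetchall())

def execute_query(sql: str) -> str:
    """Execute an SQL statement and return the resulting rows as text."""
    cursor = db_conn.execute(sql)
    return "; ".join(", ".join(str(col) for col in row) for row in cursor.fetchall())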

The declarations are then packed into a tool, which is passed to a model from the vertexai Python SDK:

db_tool = vertexai.generative_models.Tool(
    function_declarations = [list_tables_func, describe_table_func, execute_query_func]
)

instruction = ...
vertexai.init(project=..., location=...)

model = vertexai.generative_models.GenerativeModel(
                        "gemini-1.5-flash-001",
                        tools = [db_tool],
                        system_instruction = instruction)

chat = model.start_chat()

I queried the model and received the SQL scripts needed by my declared functions. At this stage the model only generates the query scripts, not the actual responses from the database:

query = "What is the most expensive product?" 
resp = chat.send_message(content=query,
                         tools=[db_tool])
function_calls = resp.candidates[0].function_calls
func = function_calls[-1]  # the last call should contain the query that answers the user's question

However, the contents of func are incorrect, since there is no name column in the products table:

name: "execute_query"
args {
  fields {
    key: "sql"
    value {
      string_value: "SELECT name FROM products ORDER BY price DESC LIMIT 1"
    }
  }
}
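
For reference, a single model turn can contain more than one function call; a quick way to see everything that was requested in that turn is:

# print every function call the model produced in this turn
for call in function_calls:
    print(call.name, call.args)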

So I put the issue aside, assuming the model otherwise returned a correct string_value, and manually substituted name with product_name - I know that sounds a bit insane, but so far so good:

sql_query = func.args["sql"].replace("name", "product_name") 
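
(A narrower substitution would avoid touching other occurrences of "name" in longer queries; this is just a variation on the same workaround:)

# only rewrite the column reference, not every "name" substring
sql_query = func.args["sql"].replace("SELECT name", "SELECT product_name")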

To generate a user-readable response from Gemini, I need to send a new request to the chat:

api_response = {"text": "; ".join([", ".join(items) for items in sql_query])}
user_response = chat.send_message(
    vertexai.generative_models.Content(
            role = "user",
            parts = [vertexai.generative_models.Part.from_function_response(
                     name = func.name, 
                     response = {"content": api_response})]
    )
)

Here, when executing send_message, an _InactiveRpcError occurs, stating:

<_InactiveRpcError of RPC that terminated with:
    status = StatusCode.INVALID_ARGUMENT
    details = "Please ensure that the number of function response parts should be equal to number of function call parts of the function call turn.
... >
InvalidArgument                           Traceback (most recent call last)
Cell In[24], line 1
----> 1 user_response = chat.send_message(
      2     Content(
      3             role = "user",
      4             parts = [Part.from_function_response(name = func.name, response = {"content": api_response})]
      5     )
      6 )
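
My current reading of the message is that the reply turn has to contain one function_response part for every function_call part the model produced in its turn, something like the sketch below (run_call is a hypothetical dispatcher to my own functions, so this is a guess, not working code):

# guess: answer every function call from the model's turn, not just the last one
response_parts = [
    vertexai.generative_models.Part.from_function_response(
        name=call.name,
        response={"content": run_call(call)},  # run_call: hypothetical dispatcher
    )
    for call in function_calls
]
user_response = chat.send_message(response_parts)

But I am not sure this is the right direction, or how it interacts with the manual SQL fix above.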

I have not been able to get past this error. If someone could take the time to review the full code on GitHub for more clarity if needed and offer some ideas on how to fix it, I would be encouraged to keep going. Thank you for your time.
