
The goal is to use 'LangChain + local (small) Transformers models + LangChain Tools + LangChain Agent'.

I have an idea about the cause of the error-

  • distilgpt2 might not be able to generate output in the format expected by the function that parses the LLM's response.

If that is the case, how can I change the parsing mechanism? I tried to go through the documentation but no luck so far.

The code I have-

from langchain.llms import HuggingFacePipeline
from langchain.agents import initialize_agent, Tool, AgentType
from langchain.llms import OpenAI
import os

os.environ['OPENAI_API_KEY'] = 'sk-********************************'

tools = [
    Tool(
        name="Music Search",
        func=lambda x: "'All I Want For Christmas Is You' by Mariah Carey.",  # Mock Function
        description="A Music search engine. Use this more than the normal search if the question is about Music, like 'who is the singer of yesterday?' or 'what is the most popular song in 2022?'",
    ),
]

model_load = 'distilgpt2'
llm_huggingface = HuggingFacePipeline.from_model_id(
    model_id=model_load,
    task="text-generation",
    model_kwargs={"max_length": 500},
)

llm_openai = OpenAI(temperature=0.1)
agent = initialize_agent(
    tools,
    llm_openai, # llm_huggingface
    agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
    verbose=True,
)
ans = agent.run("what is the most famous song of christmas")
print(ans)

This code works fine when I use the OpenAI model.

Example Output-

D:\Python\python.exe D:\GitHub\tmp\b.py 


> Entering new AgentExecutor chain...
 I should look for a song that is popular during the christmas season
Action: Music Search
Action Input: most famous christmas song
Observation: 'All I Want For Christmas Is You' by Mariah Carey.
Thought: I now know the final answer
Final Answer: 'All I Want For Christmas Is You' by Mariah Carey.

> Finished chain.
'All I Want For Christmas Is You' by Mariah Carey.

Process finished with exit code 0

But when I use llm_huggingface, it gives me a parsing error-

D:\Python\python.exe D:\GitHub\tmp\b.py 


> Entering new AgentExecutor chain...
Setting `pad_token_id` to `eos_token_id`:50256 for open-end generation.
D:\Python\Lib\site-packages\transformers\generation\utils.py:1268: UserWarning: Input length of input_ids is 185, but `max_length` is set to 50. This can lead to unexpected behavior. You should consider increasing `max_new_tokens`.
  warnings.warn(
Traceback (most recent call last):
  File "D:\GitHub\tmp\b.py", line 30, in <module>
    ans = agent.run("what is the most famous song of christmas")
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Python\Lib\site-packages\langchain\chains\base.py", line 487, in run
    return self(args[0], callbacks=callbacks, tags=tags, metadata=metadata)[
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Python\Lib\site-packages\langchain\chains\base.py", line 292, in __call__
    raise e
  File "D:\Python\Lib\site-packages\langchain\chains\base.py", line 286, in __call__
    self._call(inputs, run_manager=run_manager)
  File "D:\Python\Lib\site-packages\langchain\agents\agent.py", line 1122, in _call
    next_step_output = self._take_next_step(
                       ^^^^^^^^^^^^^^^^^^^^^
  File "D:\Python\Lib\site-packages\langchain\agents\agent.py", line 930, in _take_next_step
    raise e
  File "D:\Python\Lib\site-packages\langchain\agents\agent.py", line 919, in _take_next_step
    output = self.agent.plan(
             ^^^^^^^^^^^^^^^^
  File "D:\Python\Lib\site-packages\langchain\agents\agent.py", line 532, in plan
    return self.output_parser.parse(full_output)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Python\Lib\site-packages\langchain\agents\mrkl\output_parser.py", line 52, in parse
    raise OutputParserException(
langchain.schema.output_parser.OutputParserException: Could not parse LLM output: ` I`

Process finished with exit code 1

I tried this- https://python.langchain.com/docs/modules/agents/how_to/custom_llm_agent

and https://python.langchain.com/docs/modules/model_io/models/llms/custom_llm

But these examples need an OpenAI model, and I am using distilgpt2 or another similarly small model. I don't want to use Llama 2 either. A rough sketch of the kind of custom parser hook I was imagining is below.
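LenientOutputParser is just a placeholder name I made up, and I am not sure agent_kwargs={"output_parser": ...} is even the right hook - I could not get anything like this working:

from langchain.agents import initialize_agent, AgentType, AgentOutputParser
from langchain.schema import AgentAction, AgentFinish

class LenientOutputParser(AgentOutputParser):
    # Guess at a more forgiving parser for small models (placeholder)
    def parse(self, text: str):
        # If the model managed to emit the expected format, extract the tool call
        if "Action:" in text and "Action Input:" in text:
            action = text.split("Action:")[1].split("Action Input:")[0].strip()
            action_input = text.split("Action Input:")[1].split("\n")[0].strip()
            return AgentAction(tool=action, tool_input=action_input, log=text)
        # Otherwise give up and treat whatever came back as the final answer
        return AgentFinish(return_values={"output": text.strip()}, log=text)

# tools and llm_huggingface are defined as in the code above
agent = initialize_agent(
    tools,
    llm_huggingface,
    agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
    agent_kwargs={"output_parser": LenientOutputParser()},
    handle_parsing_errors=True,  # tried this flag on its own too, not sure it is the right fix
    verbose=True,
)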

1 Answer


The exception comes from langchain/agents/mrkl/output_parser.py (the last frame of your traceback), which has

        if not re.search(r"Action\s*\d*\s*:[\s]*(.*?)", text, re.DOTALL):
            raise OutputParserException(
                f"Could not parse LLM output: `{text}`",
                observation=MISSING_ACTION_AFTER_THOUGHT_ERROR_MESSAGE,
                llm_output=text,
                send_to_llm=True,
            )

The exception is raised because the input text does not match the pattern defined by the regular expression r"Action\s*\d*\s*:[\s]*(.*?)". In your trace, text is just ` I` - that is all distilgpt2 generated.

The OpenAI output follows the expected Action ...: ... format (Action: Music Search in your working trace), while the distilgpt2 output does not.

The regular expression Action\s*\d*\s*:[\s]*(.*?) can be broken down into several components:

Action: matches the literal text "Action"

\s*: matches any number of whitespace characters (including zero). The \s represents a whitespace character, and the * is a quantifier that means "zero or more"

\d*: matches any number of digits (including zero). The \d represents a digit, and the * again is a quantifier that means "zero or more"

:: matches a literal colon

[\s]*: matches any amount of whitespace after the colon (equivalent to \s*)

(.*?): a lazy capturing group; since nothing follows it in the pattern, it can match nothing at all, so the search effectively just checks that the text contains "Action ... :"
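
You can verify this directly by running the pattern against the two outputs from the traces above (a quick standalone check, not LangChain code):

import re

pattern = r"Action\s*\d*\s*:[\s]*(.*?)"

# What OpenAI produced (from the working trace)
openai_output = (
    " I should look for a song that is popular during the christmas season\n"
    "Action: Music Search\n"
    "Action Input: most famous christmas song"
)

# What distilgpt2 produced (from the failing trace)
distilgpt2_output = " I"

print(bool(re.search(pattern, openai_output, re.DOTALL)))      # True  -> parses
print(bool(re.search(pattern, distilgpt2_output, re.DOTALL)))  # False -> OutputParserException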
