
I use llama-cpp-python to run LLMs locally on Ubuntu. While generating responses, it prints its own logs.

How do I stop it from printing logs?

I found a way to suppress log printing for llama.cpp, but not for llama-cpp-python. I just want to print the generated response.


1 Answer


You can stop the log printing by setting `verbose=False`. The `llama_cpp.Llama` constructor accepts this argument directly, and LangChain's `LlamaCpp` wrapper passes it through.

An example is as follows:

from langchain_community.llms import LlamaCpp

llm = LlamaCpp(
    model_path="app/resources/llama-7b-chat-q4/llama-2-7b-chat.Q4_0.gguf",
    temperature=0.01,
    top_p=1,
    verbose=False,
)
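Note that some of llama.cpp's output comes from native C code writing straight to the process's stderr, so `verbose=False` may not silence everything (model-loading banners in particular). A stdlib-only workaround is to temporarily redirect file descriptor 2 to `/dev/null` while the noisy call runs. This is a sketch, not part of llama-cpp-python's API:

```python
import os
import contextlib

@contextlib.contextmanager
def suppress_native_stderr():
    """Temporarily redirect the C-level stderr (fd 2) to /dev/null.

    Reassigning sys.stderr only affects Python-level writes; native
    libraries like llama.cpp write to fd 2 directly, so we duplicate
    and swap the file descriptor itself.
    """
    devnull = os.open(os.devnull, os.O_WRONLY)
    saved_fd = os.dup(2)           # keep a copy of the real stderr
    try:
        os.dup2(devnull, 2)        # point fd 2 at /dev/null
        yield
    finally:
        os.dup2(saved_fd, 2)       # restore the original stderr
        os.close(saved_fd)
        os.close(devnull)
```

You would wrap the noisy calls in it, e.g. (assuming llama-cpp-python is installed and the model path is yours):

    with suppress_native_stderr():
        llm = Llama(model_path="llama-2-7b-chat.Q4_0.gguf", verbose=False)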

