
[Usage]: How do I use langchain for tool calls? #9692

Open
1 task done
2500035435 opened this issue Oct 25, 2024 · 1 comment
Labels
usage How to use vllm

Comments

2500035435 commented Oct 25, 2024

Your current environment

vllm = 0.6.3.post1

How would you like to use vllm

langchain test code

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    api_key="api_key",
    temperature=0.1,
    top_p=0.7,
    max_tokens=8192,
    model="glm4-9b-chat",
    base_url="http://host:port/v1/",
)

tools = {"weather": weather}

# Bind the tool schemas to the model so it can emit tool calls.
llm_with_tools = llm.bind_tools(list(tools.values()))

context = []

def process_query(query):
    global context
    context.append({"role": "user", "content": query})

    response = llm_with_tools.invoke(context)
    print(response)

    if response.tool_calls:
        tool_call = response.tool_calls[0]
        tool_name = tool_call["name"]
        tool = tools[tool_name]
        tool_arguments = tool_call["args"]
        tool_result = tool(**tool_arguments)

        context.append({
            "role": "system",
            "content": (
                "You can get real-time weather information through a tool. "
                f"The tool returned:\n\n{tool_result}\n\n"
                "This result is fully accurate; you can use it directly in your answer."
            ),
        })

        response = llm.invoke(context)

    context.append({"role": "assistant", "content": response.content})
    return response.content

query_1 = "What's the weather like in Shenzhen today?"
response_1 = process_query(query_1)
print("LLM_response:", response_1)
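The snippet above references a `weather` callable that is not shown in the issue. A minimal stand-in might look like the following; the function name and signature match the usage above, but the body is a hypothetical stub for illustration, not the reporter's actual tool:

```python
# Hypothetical stand-in for the `weather` tool referenced above; a real
# implementation would query a weather API instead of returning canned data.
def weather(city: str) -> str:
    """Return current weather for a city (stub data for illustration only)."""
    fake_data = {"Shenzhen": "28°C, partly cloudy"}
    report = fake_data.get(city, "no data available")
    return f"Weather in {city}: {report}"

print(weather("Shenzhen"))  # Weather in Shenzhen: 28°C, partly cloudy
```

Because `llm.bind_tools()` can derive a tool schema from a plain function's signature and docstring, a callable like this can be passed to the binding step directly.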

Run command:

vllm serve glm-4-9b-chat_path --served-model-name glm4-9b-chat --host xxx --port xxx --max_model_len=128000 --tensor_parallel_size 2 --gpu_memory_utilization 0.4 --trust_remote_code

How do I use langchain for vllm serve tool calls?
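For context, the serve command above starts a plain OpenAI-compatible server with no tool-call parsing enabled. Newer vLLM versions expose server-side tool calling via `--enable-auto-tool-choice` and `--tool-call-parser`; whether a parser suited to glm4 exists depends on the vLLM version, so the variant below is a sketch rather than a confirmed recipe for this model (`hermes` is shown only as an example parser name):

```shell
# Hypothetical variant of the serve command with vLLM's built-in tool-call
# parsing enabled (requires a vLLM version that ships a parser compatible
# with the chosen model).
vllm serve glm-4-9b-chat_path \
    --served-model-name glm4-9b-chat \
    --enable-auto-tool-choice \
    --tool-call-parser hermes \
    --trust_remote_code
```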

Before submitting a new issue...

  • Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.
@2500035435 2500035435 added the usage How to use vllm label Oct 25, 2024
@DarkLight1337
Member

We don't maintain the langchain integration. I suggest asking on their repo instead.
