The hands of large AI models: the basic principles of how Function Calling invokes external tools

This article digs into the basic principles of how Function Calling lets a large model invoke external tools, helping you understand how the model completes tasks with outside help through steps such as autonomously analyzing content, deciding which tool to use, and filling slots in a structured format.

If we compare a large model to a “human”, it has only a brain, eyes, and a mouth, but no hands or feet. That is, it can only “think”, “see”, and “speak”; it cannot use “tools”. For example, when we ask a large model “What is the weather like today?”, the model cannot answer this question by itself; it must rely on an “external force” and call the relevant “tool” to complete the answer. (A flowchart-style walkthrough in plain language appears later in this article.)

  • “External force”: for example, Moji Weather (a weather app), because Moji Weather “knows” the weather. As long as the large model can call Moji Weather, it can indirectly answer the user’s weather question. So how does this “help” happen? Through the protagonist of today’s article: “Function Calling”. (There is no perfect Chinese translation; roughly understand it as a function call or a tool call.)
  • Function Calling: the model autonomously “analyzes” content and fills slots in a structured format, then autonomously “decides” to use a tool and calls it in a structured format.
  • Autonomously analyzes content: as soon as the model sees the content in the prompt, it can automatically parse that content into a specific format;
  • Autonomously decides to use tools: based on the parsed, structured content, the model automatically calls tools (e.g. weather, geolocation, date), calling whichever ones it needs;
  • Structured: this simply means a JSON format;
  • Fill the slot: the large model parses the required information, in structured form, out of the user’s conversation history. What is a slot? It can be understood as a “parameter”. For example, when a user asks about the weather in Beijing today, the slots are the two parameters “date” and “city”. The examples below show a few typical slot sets; a small sketch of filled slots follows them.

Slot examples:

Example 1:

  • Departure:
  • Destination:

Example 2:

  • City:
  • Date:

Example 3:

  • Reason for reimbursement:
  • Reimbursement Amount:
  • Occurrence time:

Example 4:

  • Destination City:
  • Maximum hotel price per night:
  • Minimum hotel price per night:
  • Regional Preferences:
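
The four examples above are empty slot templates. Once the model has filled them from the conversation, they become ordinary structured data. A minimal sketch in Python (the field names and values here are illustrative, not from any particular product):

# Hypothetical filled slots, written as Python dicts.
# The keys mirror the example slots above; the values are what the model
# extracts from the user's conversation history.
weather_slots = {
    "city": "Beijing",     # Example 2: City
    "date": "2025-05-31",  # Example 2: Date
}
reimbursement_slots = {
    "reason": "client dinner",  # Example 3: Reason for reimbursement
    "amount": 300,              # Example 3: Reimbursement amount
    "time_incurred": None,      # still unknown, so the model should ask the user
}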

The basic principle of how a large model calls tools (that is, how the large model calls external tools with the help of Function Calling): don't worry if it isn't clear yet; a flowchart-style explanation follows below.


1. The System Prompt clearly states which tools may be used, what parameters must be supplied, and in what format they must be written. The system prompt can be understood as a command written for the large model and issued by the system; since it is a command, the model must obey it. When the model performs a task and runs into a problem it cannot solve on its own (such as a weather question), it can call a tool at any time, but only in the format the tool requires.

  • If it is an application we use every day, such as DeepSeek or Doubao, its System Prompt has already been packaged by the vendor and is not visible to users; the relevant tool calls are written into the System Prompt by default. For example, in the early days, before DeepSeek had packaged a weather tool, it could not answer weather questions.
  • If it is an agent-building platform such as Coze, then when we build an agent we can write by hand, in the System Prompt and according to the actual business process, which tools to call and how to call them (a minimal API-style sketch of such a tool declaration follows below).
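
For reference, in API-based development the same idea usually appears as a machine-readable tool list passed alongside the messages, rather than free-form prompt text. A minimal sketch, assuming an OpenAI-style chat completions SDK (the model name, descriptions, and values are illustrative):

from openai import OpenAI

client = OpenAI()  # assumes an API key is configured in the environment

# Declare the tools the model may call, including each parameter (the "slots").
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up the weather for a city on a given date",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "The city to query"},
                "date": {"type": "string", "description": "The date to query, e.g. 2025-05-31"},
            },
            "required": ["city", "date"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    messages=[{"role": "user", "content": "What's the weather like in Beijing today?"}],
    tools=tools,
)
print(response.choices[0].message.tool_calls)  # the structured tool call, if the model chose one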

2. Intent recognition: based on the conversation with the user, the large model autonomously identifies what task the user is asking for;

3. The large model decides on its own whether answering the user's current question requires a tool, and if so, which tool to use (e.g. the weather tool);

4. After deciding which tool to use, the model retrieves the information structure that tool expects (the prompt states, for each tool, the calling format and the parameters to be passed, and the model follows that instruction; for example, the weather tool needs the “city” and “date” parameters);

5. The model parses the entire conversation history with the user and fills in the slots (the goal is to find the two parameters “city” and “date”);

6. If a parameter is missing, the model asks the user until all slots are filled (that is, until both “city” and “date” are determined);

7. With the slots filled in the required format, the large model calls the tool and then generates its response (a code sketch of this loop follows below).
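
To tie steps 2 to 7 together, here is a minimal sketch of the application-side loop, again assuming an OpenAI-style SDK; the weather lookup itself is a hypothetical placeholder:

import json
from openai import OpenAI

client = OpenAI()

def get_weather(city: str, date: str) -> str:
    # Placeholder: a real application would call an actual weather service here.
    return f"Sunny in {city} on {date}, 18-26 C"

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up the weather for a city on a given date",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}, "date": {"type": "string"}},
            "required": ["city", "date"],
        },
    },
}]

messages = [{"role": "user", "content": "What's the weather like in Beijing today?"}]

# Steps 2-4: the model recognizes the intent and decides whether a tool is needed.
first = client.chat.completions.create(model="gpt-4o-mini", messages=messages, tools=tools)
msg = first.choices[0].message

if msg.tool_calls:
    messages.append(msg)  # keep the model's tool-call turn in the history
    for call in msg.tool_calls:
        # Steps 5-7: read the filled slots and execute the tool.
        args = json.loads(call.function.arguments)
        result = get_weather(**args)
        messages.append({"role": "tool", "tool_call_id": call.id, "content": result})
    # Final round: the model turns the tool result into a natural-language answer.
    final = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    print(final.choices[0].message.content)
else:
    print(msg.content)  # no tool needed; the model answered directly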

That is the basic principle of Function Calling. Don't worry if it isn't fully clear yet; let's break it down next:

What should be written in the System Prompt so that the large model will obediently use Function Calling to call a tool? Let's take a look.

Note: in the description below, any text after “//” is narration explaining that line; it is not part of the prompt itself.

System Prompt example:

Persona and reply logic:

If a user asks about the weather, you may use the tool when completing the user’s request.    //This line of the prompt can be understood as a task command issued by the system to the large model, and the model will obediently carry it out. Because the large model has semantic understanding, that is, it understands language like a “human”, this line is commanding it: if the user asks about the weather, use the tool on your own; how to use it is described in the prompt below. The “you” here is the large model. Which “tool” it is depends on the setup: if you have R&D colleagues develop the agent, the business decides which weather tool to call (it might be Moji Weather or some other weather MCP server; if you don’t know MCP, just think of it as an API or a plugin for now), and if you build the agent in Coze, you choose which “weather” tool to call during configuration.

## Tool 1    //This is a tool provided in the prompt. The previous line already told the large model: if you need a tool, use it on your own. The model, as the “person” executing the command, trusts the prompt that commands it, so it will call the tool; the prompt, as the model’s “master”, must not deceive it. So the prompt lists the tool here, meaning: if you need this tool, go ahead and use it. The tool is as follows.

Look up the weather based on city and date. When you use this tool, you must provide the two parameters “city” and “date”, and they must follow the structure below.    //“The structure below” is the aforementioned JSON format, which is what we mean by “structured”. This JSON can be understood as the switch that triggers the weather tool, and the switch has two parts, “city” and “date”, both indispensable (the large model judges on its own, from its conversation history with the user, whether it can identify the city and date the user is asking about; if not, it asks the user). An application-side handler sketch follows the JSON.

{
  "name": "get_weather",
  "arguments": {
    "city": "<the city to be queried>",
    "date": "<the date to be queried>"
  }
}
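
On the application side, something has to receive this JSON and actually run the lookup when the model emits it. A minimal, hypothetical handler sketch (the weather endpoint and its response fields are made up for illustration; a real app would call Moji Weather, a weather MCP server, or whatever weather API it has access to):

import json
import urllib.parse
import urllib.request

def handle_get_weather(arguments: dict) -> str:
    # Execute the model's get_weather call and return a plain-text result.
    city, day = arguments["city"], arguments["date"]
    query = urllib.parse.urlencode({"city": city, "date": day})
    url = f"https://example-weather-api.test/v1/forecast?{query}"  # hypothetical endpoint
    with urllib.request.urlopen(url) as resp:
        data = json.loads(resp.read())
    return f"{city} on {day}: {data['condition']}, {data['low']}-{data['high']} C"

# Dispatch on the "name" field of the model's structured output.
tool_call = {"name": "get_weather", "arguments": {"city": "Beijing", "date": "2025-05-31"}}
if tool_call["name"] == "get_weather":
    print(handle_get_weather(tool_call["arguments"]))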

## Tool 2    //This is another tool the prompt provides to the large model, for it to pick up and use.

Based on the user’s description of a date, such as “today”, “yesterday”, “tomorrow”, or “next Monday”, obtain the actual date. When you use this tool, you must provide the user’s description of the date as a parameter, in the structure below (a handler sketch follows the JSON):

{
  "name": "get_date",
  "arguments": {
    "query": "<the user's description of the date>"
  }
}
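
The handler for this tool resolves relative descriptions into concrete dates. A minimal sketch (the recognized phrases are an illustrative subset; “next Monday” and similar would need more rules or a real date parser):

from datetime import date, timedelta

def handle_get_date(arguments: dict) -> str:
    # Resolve a relative date description into an ISO date string.
    query = arguments["query"].strip().lower()
    offsets = {"today": 0, "yesterday": -1, "tomorrow": 1}  # illustrative subset
    if query in offsets:
        return (date.today() + timedelta(days=offsets[query])).isoformat()
    raise ValueError(f"Cannot resolve date description: {query!r}")

print(handle_get_date({"query": "today"}))  # e.g. "2025-05-31"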

That’s all for this System Prompt.

Let’s use an example to illustrate:

Suppose a user asks: “What’s the weather like in Beijing today?”

The large model starts analyzing on its own: the user is asking about the weather, so I need to call the weather tool “get_weather” that my master gave me. My master also told me that before calling this tool, the “city” and “date” parameters must be determined, so I will search my conversation history with the user to see whether I can extract these two parameters. The user said “today”, so I can call the date tool “get_date” my master also provided; it needs only one parameter (query: the user’s description of the date), so I fill in “today” (slot filling). After filling it in, the large model returns this “get_date” JSON to DeepSeek (in Coze it is returned to Coze; the same applies to other applications). DeepSeek, as a packaged application, already has code or a plugin dedicated to receiving and processing this JSON; after processing, it returns the value of “today” (e.g. May 31, 2025) to the large model, so the model now has the “date” parameter.

Next, the “city” parameter: the model finds that the user has already said the city is “Beijing”. Since both parameters, “city” and “date”, are now determined, the model fills them into the weather tool “get_weather” (slot filling) and returns the “get_weather” JSON to DeepSeek (as above). DeepSeek executes this tool call and returns the weather information to the large model; with the weather information in hand, the model organizes its language (speaks plainly) and outputs the answer to the user.
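
Put together, the whole exchange looks roughly like this trace (illustrative only; the exact message format and the wrapping of tool results differ between applications):

# A rough trace of the two-step tool use described above.
trace = [
    {"role": "user", "content": "What's the weather like in Beijing today?"},
    {"role": "assistant", "tool_call": {"name": "get_date", "arguments": {"query": "today"}}},
    {"role": "tool", "name": "get_date", "content": "2025-05-31"},
    {"role": "assistant", "tool_call": {"name": "get_weather",
                                        "arguments": {"city": "Beijing", "date": "2025-05-31"}}},
    {"role": "tool", "name": "get_weather", "content": "Sunny, 18-26 C"},
    {"role": "assistant", "content": "It's sunny in Beijing today, 18 to 26 degrees C."},
]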

If the user does not say which city when asking, then when the model analyzes “get_weather” on its own it finds that two parameters, “city” and “date”, are needed, but “city” cannot be confirmed, so it proactively asks the user: “Which city’s weather would you like to know?” Of course, this is just an example; most applications today also package a “location” tool, and during its reasoning the model will call the location tool to obtain the user’s current city, so it doesn’t need to ask.

Brief summary: remember one sentence, because it is very important. At the application layer, no matter how fancy the design we build, in the end there is only one purpose: to find a relatively better prompt and pass it to the large model. Prompts are the only way we can interact with a large model. If we want the model to “work”, we must tell it what to do and how to do it, and the only channel for telling it is the prompt; only the prompt “communicates” directly with the model. If the prompt is not clear, the model can only improvise, and the hallucination rate will naturally rise. (Worth savoring.)

End of text