To give your LLM the ability to interact with the outside world, you will need tool calling.

Note that not every model supports tool calling. If a model was not trained for it, it may fail to call your tools or produce malformed calls. For reliable tool calling, we recommend trying the Qwen family of models.

Declaring a tool

A tool can be created from any synchronous Python function that returns a string. To perform the conversion, use the @tool decorator. To get a sense of what such a tool can look like, consider this geometry example:

import math
from quaynor import tool, Chat

@tool(description="Calculates the area of a circle given its radius")
def circle_area(radius: float) -> str:
    area = math.pi * radius ** 2
    return f"Circle with radius {radius} has area {area:.2f}"

As you can see, every @tool definition must be accompanied by a description of what the tool does. To let your LLM use it, simply pass it when creating a Chat:

chat = Chat('./model.gguf', tools=[circle_area])

Quaynor then figures out the right tool-calling format, inspects the names and types of the parameters, and configures the sampler accordingly.
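Under the hood this kind of inspection relies on standard Python introspection. The following is a minimal sketch of how parameter names and types can be recovered from a decorated function; it is an illustration, not Quaynor's actual implementation:

```python
import inspect
import math

def circle_area(radius: float) -> str:
    area = math.pi * radius ** 2
    return f"Circle with radius {radius} has area {area:.2f}"

# Read the signature the same way a @tool decorator could
sig = inspect.signature(circle_area)
params = {name: p.annotation.__name__ for name, p in sig.parameters.items()}
print(params)                          # {'radius': 'float'}
print(sig.return_annotation.__name__)  # str
```

From this information alone, a tool-calling library can render the function into whatever schema the model's chat template expects.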

Naturally, you can define multiple tools, and the model can chain calls across them:

import os
from pathlib import Path
from quaynor import Chat, tool

@tool(description="Gets path of the current directory")
def get_current_dir() -> str:
    return os.getcwd()

@tool(description="Lists files in the given directory", params={"path": "a relative or absolute path to a directory"})  
def list_files(path: str) -> str:
    files = [f.name for f in Path(path).iterdir() if f.is_file()]
    return f"Files: {', '.join(files)}"

@tool(description="Gets the size of a file in bytes")
def get_file_size(filepath: str) -> str:
    size = Path(filepath).stat().st_size
    return f"File size: {size} bytes"

chat = Chat('./model.gguf', tools=[get_current_dir, list_files, get_file_size])
response = chat.ask('What is the biggest file in my current directory?').completed()
print(response) # The largest file in your current directory is `model.gguf`.

Providing parameter descriptions

When a tool is declared, its description, parameter names, and parameter types are all provided to the model, so it knows when and how to use it.

If those are not descriptive enough, you can provide additional information via the params argument:

from quaynor import tool
@tool(
    description="Given a longitude and latitude, gets the current temperature.",
    params={
        "lon": "Longitude - that is the vertical one!",
        "lat": "Latitude - that is the horizontal one!"
    }
)
def get_current_temperature(lon: str, lat: str) -> str:
    ...

These are then appended to the information provided to the model, helping it fill in the tool's parameters correctly.
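The exact wire format depends on the model's tool-calling template, but conceptually the merged information resembles a schema like the one sketched below. The schema shape is purely illustrative, not Quaynor's actual format:

```python
import inspect

def get_current_temperature(lon: str, lat: str) -> str:
    ...

# Extra per-parameter descriptions, as passed via params=
param_docs = {
    "lon": "Longitude - that is the vertical one!",
    "lat": "Latitude - that is the horizontal one!",
}

# Merge signature information with the descriptions (hypothetical shape)
schema = {
    "name": get_current_temperature.__name__,
    "description": "Given a longitude and latitude, gets the current temperature.",
    "parameters": {
        name: {"type": p.annotation.__name__, "description": param_docs.get(name, "")}
        for name, p in inspect.signature(get_current_temperature).parameters.items()
    },
}
print(schema["parameters"]["lon"])
# {'type': 'str', 'description': 'Longitude - that is the vertical one!'}
```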

Pre-packaged tools

We ship Quaynor with two built-in tools that are general enough for multiple use-cases: the monty Python interpreter and the bashkit Bash interpreter. Both serve a similar purpose: giving your small LLM a better chance at answering questions that require precise reasoning or some kind of computation, possibly over a large context.

The usage is straightforward. Import python_tool or bash_tool from quaynor and pass them to Chat:

from quaynor import Chat, python_tool, bash_tool

chat = Chat('./model.gguf', tools=[python_tool(), bash_tool()])

Lastly, keep in mind that for most use-cases it is reasonable to constrain the tools with limits on memory and computation time, so that you don't end up executing an infinite loop. To that end, python_tool provides max_duration, max_memory and max_recursion_depth, and bash_tool provides max_commands.
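Quaynor's enforcement mechanism is internal, but to illustrate what a duration limit of this kind protects against, here is a simplified stdlib sketch (not the actual implementation) that runs code in a subprocess and kills it after a deadline:

```python
import subprocess
import sys

def run_python_limited(code: str, max_duration: float) -> str:
    """Run code in a separate process, killing it after max_duration seconds."""
    try:
        result = subprocess.run(
            [sys.executable, "-c", code],
            capture_output=True,
            text=True,
            timeout=max_duration,
        )
        return result.stdout
    except subprocess.TimeoutExpired:
        return "Error: execution exceeded the time limit"

print(run_python_limited("print(2 + 2)", max_duration=5))      # 4
print(run_python_limited("while True: pass", max_duration=1))  # Error: execution exceeded the time limit
```

Returning the error as a string rather than raising lets the model see that its code timed out and try a cheaper approach.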

Tool calling and the context

As with anything that improves response quality, tool calls fill up the context faster than plain chatting with an LLM. Be aware that you might need a larger context size than expected when using tools.