this post was submitted on 19 Apr 2025

Large Language Models

I'm running ollama with llama3.2:1b, smollm, all-minilm, moondream, and more. I've integrated it with coder/code-server, VS Code, VSCodium, Page Assist, and the CLI, and I also created a Discord AI user.

I'm an infrastructure and automation guy, not so much a developer, although my field is technically DevOps.

Now, I hear that some LLMs have "tools." How do I use them? How do I find the list of tools a model supports?

I don't think I can simply prompt "Hi llama3.2, list your tools." Is this part of prompt engineering?

What, do you take a model and retrain it or something?

Anybody able to point me in the right direction?

[–] [email protected] 2 points 1 month ago (1 children)

What I'm wondering is: is there a standard format for instructing models to produce outputs that use a tool? They're specifically trained to be better at this, right?

[–] [email protected] 1 points 1 month ago (1 children)

Ah, for training a new model from scratch? Yes, there is a specific format; you can look at the Ollama source code, or at any of the big models that accept tool use (like Llama 4), for the format both to and from the model. However, unless you're secretly a billionaire, I doubt you could compete with these pretrained models at tool calling.
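For context, the format most tool-capable models (and Ollama's `/api/chat` endpoint) accept is an OpenAI-style JSON schema describing each function, and the model replies with a structured tool call instead of prose. A minimal sketch, where `get_weather` is an invented example tool and the response shape is illustrative:

```python
# Minimal sketch of the OpenAI-style tool schema that Ollama's
# /api/chat endpoint accepts; "get_weather" is an invented example.
import json

weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}

# A tool-capable model answers with a structured tool call rather
# than plain text; the response shape looks roughly like this:
example_response = {
    "message": {
        "role": "assistant",
        "tool_calls": [
            {"function": {"name": "get_weather",
                          "arguments": {"city": "Berlin"}}}
        ],
    }
}

call = example_response["message"]["tool_calls"][0]["function"]
print(call["name"], json.dumps(call["arguments"]))
```

Your code then executes the named function and sends the result back to the model as a `tool`-role message so it can compose a final answer.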

Ollama's model list on their website has a filter for tool-using models. To be honest, all open-source models suck at tool use compared to the big players (OpenAI, Anthropic, Google). To be fair, I don't have any hardware capable of running DeepSeek's newest models, so I haven't tested those for tool use.

[–] [email protected] 1 points 1 month ago* (last edited 1 month ago) (1 children)

No, I meant prompting tool-supporting models so they're aware of the functions you're making available to them. I've tried arbitrary prompts telling the model to do this, and it sort of works, but the models I've tried don't seem very good at it. I was mainly wondering whether using a specific format in the prompt would improve performance.
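Absent native tool support, the usual workaround is exactly this: dump the function definitions into the system prompt as JSON, ask the model to answer only with a JSON tool call, and parse its reply yourself. A hedged sketch of that pattern (the `search_docs` function and the sample reply are invented for illustration):

```python
# Sketch of the "manual" approach: advertise functions in the system
# prompt and parse a JSON tool call out of the model's reply.
# "search_docs" and the sample reply below are invented examples.
import json

functions = [
    {
        "name": "search_docs",
        "description": "Search internal documentation.",
        "parameters": {"query": "string"},
    }
]

system_prompt = (
    "You can call these functions:\n"
    + json.dumps(functions, indent=2)
    + "\nWhen you want to call one, reply ONLY with JSON like "
    '{"name": "...", "arguments": {...}}.'
)

def parse_tool_call(reply: str):
    """Return (name, arguments) if the reply is a JSON tool call, else None."""
    try:
        data = json.loads(reply.strip())
        return data["name"], data.get("arguments", {})
    except (json.JSONDecodeError, KeyError, TypeError):
        return None  # model answered in plain text instead

# Pretend the model replied with a tool call:
sample_reply = '{"name": "search_docs", "arguments": {"query": "vpn setup"}}'
result = parse_tool_call(sample_reply)
print(result)
```

Models that were fine-tuned on a tool-calling format tend to follow it far more reliably than models coerced through a prompt like this, which matches the flakiness you're seeing.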

[–] [email protected] 2 points 6 days ago

I think what you're looking for is MCP (Model Context Protocol). It's an effort to standardize exactly what you're describing.
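MCP is built on JSON-RPC 2.0: a client asks a server what tools it exposes (`tools/list`) and invokes one (`tools/call`), and the host application relays results to the model. A rough sketch of the message shapes, not tied to any real server; the `lookup_host` tool is an invented example:

```python
# Rough sketch of MCP's JSON-RPC 2.0 message shapes: the client
# discovers tools with "tools/list" and invokes one with "tools/call".
# The "lookup_host" tool is an invented example.
import json

list_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
}

call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "lookup_host",               # invented example tool
        "arguments": {"hostname": "db01"},
    },
}

# On the wire these are plain JSON strings (over stdio or HTTP):
wire = json.dumps(call_request)
decoded = json.loads(wire)
print(decoded["method"], decoded["params"]["name"])
```

The appeal is that any MCP-aware client (editor, chat UI, agent framework) can use any MCP server's tools without model-specific prompt formats.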