How to Integrate Local LLMs With Ollama and Python - Real Python
Use Ollama with Python to run local LLMs such as llama3.2 and codellama. Before you start, you'll need Ollama installed, Python 3.8 or later, and the models downloaded.
You may want the model to query a database, call an external API, or perform calculations. Function calling (also known as tool calling) lets you define functions, along with schemas describing them, that the model can invoke as part of its response. This is a first step toward richer interactions, where an LLM not only generates text but also performs actions based on that text.
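To make this concrete, here is a minimal sketch of the tool-calling pattern. The function name (`get_order_total`), its fake data, and the simulated tool call are all hypothetical examples, not part of the original article. In a real session you would pass the `tools` list to `ollama.chat()` and read the tool calls from the model's response; here the model's side is simulated so the dispatch logic can run on its own:

```python
# A plain Python function the model may be allowed to call
# (hypothetical example with stubbed data).
def get_order_total(order_id: str) -> float:
    """Look up the total price for an order (fake in-memory data)."""
    fake_db = {"A-100": 42.5, "A-101": 13.99}
    return fake_db.get(order_id, 0.0)

# Schema describing the function to the model, in the JSON-style
# "tools" format that function-calling APIs expect.
tools = [{
    "type": "function",
    "function": {
        "name": "get_order_total",
        "description": "Return the total price of an order.",
        "parameters": {
            "type": "object",
            "properties": {
                "order_id": {
                    "type": "string",
                    "description": "The order identifier.",
                },
            },
            "required": ["order_id"],
        },
    },
}]

# Simulated tool call, shaped like what a model would return after
# deciding to use the function. With Ollama you would instead pass
# `tools` to ollama.chat() and inspect the response's tool calls.
tool_call = {
    "function": {
        "name": "get_order_total",
        "arguments": {"order_id": "A-100"},
    },
}

# Map tool names to the actual Python functions, then dispatch.
available_functions = {"get_order_total": get_order_total}

def dispatch(call: dict) -> float:
    """Execute the function named in a tool call with its arguments."""
    fn = available_functions[call["function"]["name"]]
    return fn(**call["function"]["arguments"])

result = dispatch(tool_call)
print(result)
```

The result would typically be appended to the conversation as a tool message so the model can incorporate it into its final answer.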