# Using Noir with Ollama
Integrate Noir with Ollama to run local large language models (LLMs) for in-depth endpoint analysis. Because the models run on your own machine, no code or analysis data is sent to external services. This guide covers setup and example commands.
## Setup
- **Install Ollama**: Download it from [ollama.com](https://ollama.com).
- **Download a model**: Pull a model (e.g., `phi-3`):

  ```
  ollama pull phi-3
  ```

- **Serve the model**: Start the Ollama server (a quick reachability check is sketched after this list):

  ```
  ollama serve
  ```
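Before running Noir, you can optionally confirm the server is up by querying Ollama's local HTTP API. This is a sanity check, not part of Noir itself, and it assumes Ollama's default port of 11434:

```
# List locally available models; the model you pulled should appear in the output
curl http://localhost:11434/api/tags
```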
## Usage
Run Noir with Ollama as the AI provider:

```
noir -b ./spec/functional_test/fixtures/hahwul \
     --ai-provider=ollama \
     --ai-model=phi-3
```
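Noir talks to the same local HTTP endpoint you can call directly. As a rough illustration only (the prompt below is hypothetical; the prompts Noir actually sends are internal to the tool), a minimal generation request against the local server looks like this:

```
# Hypothetical prompt for illustration; Noir builds its own prompts internally
curl http://localhost:11434/api/generate -d '{
  "model": "phi-3",
  "prompt": "Describe the purpose of this endpoint: GET /users/:id",
  "stream": false
}'
```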
With Ollama as the provider, all AI analysis runs locally: vulnerability detection, code-improvement suggestions, and descriptions of endpoint functionality.