Using Noir with Ollama
Run large language models locally with Ollama to get AI-assisted code analysis without sending data to external services.
Setup

- Install Ollama: download it from ollama.com.
- Download a model: pull the model you want to use (e.g., `gemma4`):

  ```
  ollama pull gemma4
  ```

- Start the server:

  ```
  ollama serve
  ```
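Once `ollama serve` is running, the server listens on `http://localhost:11434` by default. A quick way to confirm it is reachable is to query Ollama's `/api/tags` endpoint, which lists the models that have been pulled (a minimal sketch; it only assumes the default port):

```shell
# Health-check the local Ollama server (default port 11434).
# /api/tags returns the list of locally available models as JSON.
if curl -fsS http://localhost:11434/api/tags >/dev/null 2>&1; then
  ollama_status="up"
else
  ollama_status="down"
fi
echo "ollama: $ollama_status"
```

If this prints `ollama: down`, check that `ollama serve` is still running before invoking Noir.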
Usage
Run Noir with the Ollama provider and model:

```
noir -b ./spec/functional_test/fixtures/hahwul \
  --ai-provider=ollama \
  --ai-model=gemma4
```
Ollama provides local AI analysis for vulnerability detection, code improvements, and endpoint functionality descriptions.
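To sanity-check the model itself, independent of Noir, you can send a single prompt through Ollama's REST API. This sketch uses Ollama's documented `/api/generate` endpoint with the same model name pulled above; the prompt text is an arbitrary example:

```shell
# Send one non-streaming prompt to the local model.
# Prints the JSON response, or a hint if the server is not running.
response=$(curl -fsS http://localhost:11434/api/generate \
  -d '{"model": "gemma4", "prompt": "Summarize what a GET /health endpoint typically does.", "stream": false}' \
  2>/dev/null || echo "ollama server not reachable on localhost:11434")
echo "$response"
```

A successful call returns a JSON object whose `response` field contains the model's answer, confirming that the same model Noir will use is working locally.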