Using Noir with Ollama

Run large language models locally using Ollama for code analysis without sending data to external services.

Setup

  1. Install Ollama: Download from ollama.com

  2. Download a Model: Pull a model (e.g., gemma3)

    ollama pull gemma3
    
  3. Start the Server:

    ollama serve
    
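Before running Noir, you can confirm the server is reachable. The sketch below (a hypothetical helper, not part of Noir or Ollama) uses only the Python standard library and Ollama's default port; a healthy server answers GET / on port 11434.

```python
import urllib.request
import urllib.error

def is_ollama_running(base_url: str = "http://localhost:11434") -> bool:
    """Return True if an Ollama server answers at base_url.

    A running server responds to GET / with HTTP 200 and the
    text "Ollama is running".
    """
    try:
        with urllib.request.urlopen(base_url, timeout=2) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Connection refused or timed out: server is not up.
        return False

if __name__ == "__main__":
    print("Ollama reachable:", is_ollama_running())
```

If this prints `False`, start the server with `ollama serve` before invoking Noir.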

Usage

Run Noir with Ollama:

noir -b ./spec/functional_test/fixtures/hahwul \
     --ai-provider=ollama \
     --ai-model=gemma3

Ollama provides local AI analysis for vulnerability detection, code improvements, and endpoint functionality descriptions.
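Providers like Ollama are queried over a local HTTP API, so nothing leaves your machine. As an illustration only (this is not Noir's actual client code, and the prompt text is invented), a minimal sketch of a request against Ollama's `/api/generate` endpoint:

```python
import json
import urllib.request

def build_generate_request(model: str, prompt: str) -> dict:
    # Payload for Ollama's /api/generate endpoint. stream=False
    # requests a single JSON reply instead of a chunked stream.
    return {"model": model, "prompt": prompt, "stream": False}

def analyze(snippet: str, model: str = "gemma3") -> str:
    # Hypothetical prompt for illustration; Noir builds its own
    # prompts internally for each analysis task.
    payload = build_generate_request(
        model, f"Describe any security issues in this code:\n{snippet}"
    )
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    # Requires a running `ollama serve` with the model pulled.
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Because the model runs locally, analysis speed and quality depend on the model you pull and your hardware.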
