Using Noir with Ollama

Run large language models locally using Ollama for code analysis without sending data to external services.

Setup

  1. Install Ollama: Download from ollama.com

  2. Download a Model: Pull a model (e.g., phi-3)

    ollama pull phi-3
    
  3. Start the Server: Launch the local Ollama API server (note that ollama serve takes no model argument; the pulled model is loaded on first use)

    ollama serve

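Before running Noir, you can confirm the local server is actually reachable. Ollama's HTTP API listens on http://localhost:11434 by default; the sketch below uses its `/api/tags` endpoint (the `ollama_up` helper is illustrative, not part of Noir or Ollama):

```shell
# Illustrative helper: succeeds only if the Ollama API answers at the given URL
ollama_up() {
  curl -fsS "${1:-http://localhost:11434}/api/tags" >/dev/null 2>&1
}

if ollama_up; then
  echo "Ollama server is reachable"
else
  echo "Ollama server is not reachable; start it with 'ollama serve'"
fi
```

The `/api/tags` response also lists the models you have pulled, so it doubles as a check that the model from step 2 is available.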

Usage

Run Noir with Ollama:

noir -b ./spec/functional_test/fixtures/hahwul \
     --ai-provider=ollama \
     --ai-model=phi-3

Ollama provides local AI analysis for vulnerability detection, code improvements, and endpoint functionality descriptions.
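Because the analysis requests go to the local Ollama HTTP server, no source code leaves your machine. If you want to probe the model outside of Noir, you can post to Ollama's standard `/api/generate` endpoint yourself; the helper below is a hypothetical sketch that only builds the JSON payload (model name and prompt are placeholders, and the prompt must not contain characters that need JSON escaping):

```shell
# Hypothetical helper: build a JSON payload for Ollama's /api/generate endpoint
ollama_payload() {
  printf '{"model": "%s", "stream": false, "prompt": "%s"}' "$1" "$2"
}

# To send a one-off prompt to a running local server:
#   ollama_payload phi-3 "Describe what a /login endpoint typically does." | \
#     curl -s http://localhost:11434/api/generate -d @-
ollama_payload phi-3 "Describe what a /login endpoint typically does."
```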
