Ollama Integration
Setup Ollama
- Install Ollama: Follow the installation instructions on the official Ollama website.
- Run the Model: Pull the desired model and make sure the Ollama server is running.
Example Commands to Serve the Model
To pull the "phi4" model and start the Ollama server, use the following commands:
ollama pull phi4
ollama serve
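Before running Noir, you can confirm the model was pulled by querying the server's /api/tags endpoint (Ollama listens on http://localhost:11434 by default). A minimal sketch; the helper names are illustrative, not part of Noir:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default listen address


def model_available(tags_response: dict, name: str) -> bool:
    """Check whether a model name appears in an /api/tags response.

    Ollama lists pulled models under the "models" key; each entry's
    "name" may carry a tag suffix such as ":latest", so compare only
    the base name.
    """
    return any(
        m.get("name", "").split(":")[0] == name
        for m in tags_response.get("models", [])
    )


def fetch_tags(url: str = OLLAMA_URL) -> dict:
    """Fetch the list of locally pulled models from a running server."""
    with urllib.request.urlopen(f"{url}/api/tags") as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Requires a running `ollama serve`; prints True if phi4 is pulled.
    print(model_available(fetch_tags(), "phi4"))
```

If this prints False, re-run `ollama pull phi4` before invoking Noir.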
Run Noir with Ollama
To use a local Ollama model for additional analysis, run Noir with the following flags:
noir -b ./spec/functional_test/fixtures/hahwul \
--ai-provider=ollama \
--ai-model=phi4
This command runs the standard Noir scan while using the specified Ollama model for enhanced analysis.
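Noir drives the model through Ollama's local HTTP API. For reference, a direct call to the same /api/generate endpoint looks roughly like this; the prompt and the default port 11434 are illustrative, and the function names are not part of Noir:

```python
import json
import urllib.request


def build_generate_payload(model: str, prompt: str) -> dict:
    # Non-streaming request body for Ollama's /api/generate endpoint.
    return {"model": model, "prompt": prompt, "stream": False}


def generate(payload: dict, url: str = "http://localhost:11434") -> str:
    # POST the payload and return the model's response text.
    req = urllib.request.Request(
        f"{url}/api/generate",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]


if __name__ == "__main__":
    # Requires a running `ollama serve` with phi4 pulled.
    payload = build_generate_payload("phi4", "Summarize this endpoint list.")
    print(generate(payload))
```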