CROP Agents
Repository: CT-CROP/CROP-Agents Last updated: 2026-02-21 Last synced to docs: 2026-03-10
FastAPI-based conversational parts consultant service that uses LangChain and an LLM to answer natural-language queries about tractor parts.
Features
- Natural Language Processing with an LLM
- Structured JSON Responses for frontend integration
- Modular Architecture (routes, services, prompts, models)
- Dynamic Token Management based on the model's context window
- Cloud Run Ready with Swagger documentation
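The "Structured JSON Responses" feature implies a response contract along these lines. This is a hypothetical sketch: the real fields live in app/models/schemas.py, and every name below (`PartSuggestion`, `QueryResponse`, `answer`, `suggestions`, the part number) is illustrative, not taken from the repository.

```python
# Hypothetical sketch of a structured response model. The actual schema is
# defined in app/models/schemas.py; field names here are illustrative only.
from pydantic import BaseModel


class PartSuggestion(BaseModel):
    part_number: str
    name: str


class QueryResponse(BaseModel):
    answer: str                        # natural-language answer from the LLM
    suggestions: list[PartSuggestion] = []


resp = QueryResponse(
    answer="Use filter RE504836 for that engine.",
    suggestions=[PartSuggestion(part_number="RE504836", name="Oil filter")],
)
print(resp.model_dump_json())
```

Returning a Pydantic model rather than raw LLM text is what makes the responses safe for frontend integration: the frontend can rely on the field names regardless of how the model phrases its answer.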
Quick Start
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
uvicorn main:app --host 0.0.0.0 --port 8080 --reload
The API is then available at http://localhost:8080, with Swagger UI at http://localhost:8080/docs.
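Once the service is running, the query endpoint can be exercised with a minimal stdlib client. The request field name `query` is an assumption — check the actual schema in the Swagger UI at /docs before relying on it.

```python
import json
import urllib.request

# Hypothetical request shape -- the "query" field name is an assumption;
# verify the real schema in the Swagger UI at /docs.
payload = {"query": "Which oil filter fits my tractor engine?"}
req = urllib.request.Request(
    "http://localhost:8080/api/query",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# With the service from Quick Start running, send it like this:
#   with urllib.request.urlopen(req, timeout=30) as resp:
#       print(json.load(resp))
```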
API Endpoints
GET /health # Health check
POST /api/query # Query the parts consultant
Configuration
| Variable | Required | Description |
|---|---|---|
| VLLM_API_URL | Yes | URL of the vLLM server |
| VLLM_MODEL | Yes | Model name (default: Meta-Llama-3.1-8B-Instruct) |
| LLM_MAX_TOKENS | No | Max tokens per response (default: 150) |
| LLM_TEMPERATURE | No | Sampling temperature (default: 0.0) |
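The table above can be mirrored by a small settings loader. This is a sketch, not the project's actual code — the real logic lives in app/config/settings.py, and the function and key names here are assumptions.

```python
import os

# Hypothetical loader mirroring the configuration table; the actual
# implementation lives in app/config/settings.py.
def load_settings() -> dict:
    return {
        "vllm_api_url": os.environ["VLLM_API_URL"],  # required, no default
        "vllm_model": os.environ.get("VLLM_MODEL", "Meta-Llama-3.1-8B-Instruct"),
        "llm_max_tokens": int(os.environ.get("LLM_MAX_TOKENS", "150")),
        "llm_temperature": float(os.environ.get("LLM_TEMPERATURE", "0.0")),
    }

# Example usage (the URL below is a placeholder):
os.environ.setdefault("VLLM_API_URL", "http://localhost:8000/v1")
settings = load_settings()
```

Failing fast on a missing VLLM_API_URL (via the bare `os.environ[...]` lookup) surfaces misconfiguration at startup rather than on the first query.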
Project Structure
CROP-Agents/
├── app/
│ ├── config/settings.py # Configuration
│ ├── models/schemas.py # Pydantic models
│ ├── prompts/consultant.py # LLM prompts
│ ├── routes/ # FastAPI endpoints
│ └── services/ # Business logic
├── main.py # Entry point
├── Dockerfile
└── cloudbuild.yaml