LLM Deploy

Infrastructure for deploying and serving the large language models used by CROP AI services.

Repository

CROP-LLM-Deploy

Overview

Manages deployment of LLMs used for:

  • Parts data extraction from PDFs
  • RAG-based parts lookup
  • AI assistant functionality

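The RAG-based parts lookup above retrieves candidate parts by embedding similarity before handing context to the model. A minimal sketch of the retrieval step, using a hand-written toy vector store (the part names, vectors, and function names here are illustrative assumptions, not this repository's actual code; in production the embeddings would come from an embedding model):

```python
import math

# Toy in-memory "vector store" of parts. Hypothetical data: real
# embeddings would be produced by an embedding model, not hand-written.
PARTS = {
    "hex bolt M8": [0.9, 0.1, 0.0],
    "washer 8mm":  [0.8, 0.2, 0.1],
    "o-ring 12mm": [0.1, 0.9, 0.2],
}

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def retrieve(query_vec, k=2):
    """Return the k part names most similar to the query embedding."""
    ranked = sorted(
        PARTS,
        key=lambda name: cosine(query_vec, PARTS[name]),
        reverse=True,
    )
    return ranked[:k]

print(retrieve([0.85, 0.15, 0.05]))  # parts closest to the query embedding
```

The retrieved part records would then be inserted into the LLM prompt as grounding context for the lookup answer.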
Stack

  • Python, vLLM
  • GCP GPU instances

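vLLM exposes an OpenAI-compatible HTTP API when run as a server, so services can talk to the deployed model with standard chat-completion requests. A sketch of building such a request for the parts-extraction use case (the endpoint URL, model name, and function are illustrative assumptions, not this repository's actual configuration):

```python
import json

# Hypothetical endpoint for a vLLM server started with, e.g.:
#   python -m vllm.entrypoints.openai.api_server --model <model>
VLLM_URL = "http://localhost:8000/v1/chat/completions"

def build_extraction_request(model: str, pdf_text: str, max_tokens: int = 512) -> dict:
    """Build an OpenAI-style chat request asking the model to extract
    part numbers and descriptions from raw PDF text."""
    return {
        "model": model,
        "messages": [
            {
                "role": "system",
                "content": "Extract part numbers and descriptions as JSON.",
            },
            {"role": "user", "content": pdf_text},
        ],
        "max_tokens": max_tokens,
        "temperature": 0.0,  # deterministic output for structured extraction
    }

payload = build_extraction_request("example-model", "Part 123-ABC: hex bolt, M8")
print(json.dumps(payload, indent=2))
```

The payload would be POSTed to `VLLM_URL`; setting temperature to 0 keeps the structured-extraction output reproducible.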