
feat: add Chutes AI as LLM provider#5380

Open
het4rk wants to merge 1 commit into Mintplex-Labs:master from het4rk:feat/add-chutes-provider

Conversation


@het4rk het4rk commented Apr 7, 2026

Closes #5379

Summary

  • Adds Chutes AI (https://chutes.ai) as an OpenAI-compatible LLM provider
  • Chutes runs decentralized AI inference on Bittensor subnet 64 (SN64), with computation secured by Trusted Execution Environments (TEEs)
  • Base URL: https://llm.chutes.ai/v1 — fully OpenAI-compatible
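Because the endpoint is OpenAI-compatible, any OpenAI-style client can target it by swapping the base URL. A minimal sketch (not the PR's actual code; the `buildChatRequest` helper is hypothetical, and the base URL and model name come from this description):

```javascript
// Build an OpenAI-style chat-completions request against Chutes'
// OpenAI-compatible endpoint. Pure function: it only constructs the
// URL and fetch options, it does not perform the network call.
const CHUTES_BASE_URL = "https://llm.chutes.ai/v1";

function buildChatRequest(apiKey, model, messages, stream = false) {
  return {
    url: `${CHUTES_BASE_URL}/chat/completions`,
    options: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${apiKey}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ model, messages, stream }),
    },
  };
}

// Usage (network call commented out; requires a real CHUTES_API_KEY):
// const { url, options } = buildChatRequest(
//   process.env.CHUTES_API_KEY,
//   "chutes/DeepSeek-V3.2-TEE",
//   [{ role: "user", content: "Hello" }]
// );
// const res = await fetch(url, options);
```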

Changes

Backend

  • server/utils/AiProviders/chutes/index.js — ChutesLLM class (streaming + non-streaming, chat completions, embeddings via native embedder)
  • server/utils/helpers/index.js — provider switch case
  • server/utils/helpers/customModels.js — adds "chutes" to SUPPORT_CUSTOM_MODELS, switch case, and getChutesModels() (fetches live model list from the API)
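The `getChutesModels()` helper presumably fetches the OpenAI-style `/v1/models` listing and maps it into the shape the settings UI consumes. A hedged sketch of that mapping (the PR's real implementation is not shown here; `parseModelList` is a hypothetical name, and only the pure parsing step is exercised):

```javascript
// Map an OpenAI-compatible /v1/models response body into the
// { id, name } pairs a model selector can render.
// OpenAI-compatible shape: { data: [{ id: "..." }, ...] }
function parseModelList(responseBody) {
  return (responseBody.data || []).map((m) => ({ id: m.id, name: m.id }));
}

// Illustrative fetch wrapper (not invoked here; needs a real API key):
// async function getChutesModels(apiKey) {
//   const res = await fetch("https://llm.chutes.ai/v1/models", {
//     headers: { Authorization: `Bearer ${apiKey}` },
//   });
//   return parseModelList(await res.json());
// }
```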

Frontend

  • frontend/src/components/LLMSelection/ChutesAiOptions/index.jsx — API key input + dynamic model selector
  • frontend/src/media/llmprovider/chutes.png — provider logo
  • Registered in GeneralSettings/LLMPreference and OnboardingFlow/Steps/LLMPreference
  • Added to useGetProvidersModels.js default models map
  • Added to ProviderPrivacy/constants.js
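The default-models map entry keyed by the provider ID might look roughly like the following. This is a hypothetical shape, not the file's actual contents; the model names are the ones listed in this PR:

```javascript
// Plausible per-provider default model list, keyed by provider ID,
// used when the live model fetch has not run yet.
const PROVIDER_DEFAULT_MODELS = {
  chutes: [
    "chutes/DeepSeek-V3.2-TEE",
    "Qwen3-32B-TEE",
    "Kimi-K2.5-TEE",
  ],
};
```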

Provider details

  • Provider ID: chutes
  • Env vars: CHUTES_API_KEY, CHUTES_MODEL_PREF
  • Default model: chutes/DeepSeek-V3.2-TEE
  • Notable models: DeepSeek-V3.2-TEE, Qwen3-32B-TEE, Kimi-K2.5-TEE
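An example `.env` fragment using the variables above (values are placeholders; the key names and default model come from this PR):

```shell
# Chutes AI provider configuration
CHUTES_API_KEY=sk-your-chutes-api-key
CHUTES_MODEL_PREF=chutes/DeepSeek-V3.2-TEE
```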

Test plan

  • Set CHUTES_API_KEY and verify models load in the settings UI
  • Send a chat message and confirm streaming response works
  • Verify Chutes appears in the onboarding LLM selection step

🤖 Generated with Claude Code
