ServiceStack

From the blog

All posts tagged in llms


Article

llms.py gets a UI 🚀

Simple ChatGPT-like UI to access ALL Your LLMs, Locally or Remotely!


Demis Bellot

September 30, 2025 · 5 min read

Article

llms.py - Lightweight OpenAI-compatible CLI and server gateway for multiple LLMs

Support for Text, Image and Audio generation. Seamlessly mix and match local models with premium cloud LLMs


Demis Bellot

September 25, 2025 · 5 min read
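Since llms.py is described as an OpenAI-compatible gateway, a client should be able to reach it with a standard chat-completions payload. A minimal sketch using only the standard library; the gateway address `http://localhost:8000` and model name `llama3.3` are illustrative assumptions, not details confirmed by this page:

```python
import json
from urllib import request


def chat_request(base_url: str, model: str, prompt: str) -> request.Request:
    """Build a POST request in the OpenAI chat-completions format.

    Any OpenAI-compatible server exposes this shape at
    {base_url}/v1/chat/completions.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


# Hypothetical local gateway address; llms.py's actual port may differ.
req = chat_request("http://localhost:8000", "llama3.3", "Hello!")
print(req.full_url)  # http://localhost:8000/v1/chat/completions
# To send it: request.urlopen(req) -- requires the gateway to be running.
```

Because the payload shape is the same whether the model is local or a premium cloud LLM, switching providers is just a matter of changing `base_url` and `model`.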
About ServiceStack
ServiceStack began development in 2008 with the mission of creating a best-practices services framework that emphasizes simplicity and speed, reducing the effort required to create and maintain resilient message-based SOA services and rich web apps like ubixar.

Copyright © ServiceStack Inc. 2024
