# llms.txt and llms-full.txt support for Razor SSG and Razor Press
Source: https://servicestack.net/posts/llms-txt

## Benefits for Large Language Models

Large language models gain a lot of their knowledge from information on websites, but their limited context windows can't handle most websites, whose complex HTML pages contain a mix of content, scripts, styles, navigation, ads and structural elements that introduce noise and ambiguity. Parsing this content requires additional preprocessing to extract meaningful content while filtering out irrelevant elements, which can be inefficient and error-prone.

Instead, LLMs thrive on clean, structured text. To address this, [llmstxt.org](https://llmstxt.org) has proposed adding `/llms.txt` and `/llms-full.txt` markdown files to websites to provide concise, expert-level information that LLMs can consume from a single, accessible location. This is ideal for quickly adding context to LLMs: it improves text comprehension, enhances model accuracy, and streamlines content ingestion for RAG workflows and other AI-powered applications.

Markdown was chosen for the content as it's lightweight, clean, and structured specifically for readability. Markdown files contain pure text with minimal formatting, making them an ideal source for LLM training and retrieval. Because markdown is designed for capturing content, it ensures that LLMs receive high-quality, contextually relevant information without the distractions of complex page layouts or extraneous code.

## Support for llms.txt and llms-full.txt

Fortunately [razor-ssg](https://razor-ssg.web-templates.io) and [razor-press](https://razor-press.web-templates.io) have fully embraced Markdown for maintaining their content, making it trivial to generate `/llms.txt` and `/llms-full.txt` files from your original static markdown content. This is done by the new [Llms.cshtml](https://github.com/NetCoreTemplates/razor-ssg/blob/main/MyApp/Pages/Llms.cshtml) and [LlmsFull.cshtml](https://github.com/NetCoreTemplates/razor-ssg/blob/main/MyApp/Pages/LlmsFull.cshtml) pages, whose content is sourced from the website's Markdown [/_pages](https://github.com/NetCoreTemplates/razor-ssg/tree/main/MyApp/_pages) folder, rendered from the links in its [sidebar.json](https://github.com/NetCoreTemplates/razor-ssg/blob/main/MyApp/_pages/sidebar.json), and from the Blog posts in the [/_posts](https://github.com/NetCoreTemplates/razor-ssg/tree/main/MyApp/_posts) folder in descending order.
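The `llms.txt` format proposed by llmstxt.org is itself plain markdown: an H1 with the site name, a blockquote summary, and H2 sections listing links to markdown content. A minimal hypothetical example, where the site name, URLs and descriptions are all illustrative:

```md
# Example Site

> Concise summary of what this site is about, for LLM consumption.

## Docs

- [Getting Started](https://example.org/getting-started.md): Install and setup guide

## Posts

- [Latest Post](https://example.org/posts/latest.md): Short description of the post
```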
Here are some examples of `llms.txt` and `llms-full.txt` files generated from the Razor SSG and Razor Press website Templates:

| Website | llms.txt | llms-full.txt |
|---------|----------|---------------|
| https://servicestack.net | [llms.txt](https://servicestack.net/llms.txt) | [llms-full.txt](https://servicestack.net/llms-full.txt) |
| https://docs.servicestack.net | [llms.txt](https://docs.servicestack.net/llms.txt) | [llms-full.txt](https://docs.servicestack.net/llms-full.txt) |
| https://razor-ssg.web-templates.io | [llms.txt](https://razor-ssg.web-templates.io/llms.txt) | [llms-full.txt](https://razor-ssg.web-templates.io/llms-full.txt) |
| https://razor-press.web-templates.io | [llms.txt](https://razor-press.web-templates.io/llms.txt) | [llms-full.txt](https://razor-press.web-templates.io/llms-full.txt) |

:::{.text-center}
## Create a new Razor SSG or Razor Press Project
:::

[Razor SSG](https://razor-ssg.web-templates.io) is our FREE Project Template for creating fast, statically generated Websites and Blogs with Markdown & C# Razor Pages, whilst [Razor Press](https://razor-press.web-templates.io) is more focused on maintaining documentation websites.

- [Razor SSG](https://razor-ssg.web-templates.io)
- [Razor Press](https://razor-press.web-templates.io)
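You can create a new project from either template with the `x` dotnet tool. A sketch using its standard `x new` syntax, where `MyApp` is an illustrative project name:

:::sh
x new razor-ssg MyApp
:::

Or `x new razor-press MyApp` for a documentation website.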
# Generate CRUD APIs and Admin UIs from existing DBs
Source: https://servicestack.net/posts/crud-app-from-existing-db

A core piece of functionality in the [Text to Blazor CRUD App](/posts/text-to-blazor) feature is distilling an AI Prompt into TypeScript classes that can be [further customized](/posts/text-to-blazor#customize-data-models) to generate AutoQuery CRUD APIs and Admin UIs for managing the underlying RDBMS tables.

## TypeScript Data Models

TypeScript is a flexible and effortless way to define your data models: it offers a DSL-like format with minimal boilerplate that's human-friendly to read and write, and it can leverage TypeScript's powerful Type System to provide a rich authoring experience with strong typing and intellisense. Models are validated against the referenced [api.d.ts](https://okai.servicestack.com/api.d.ts) schema containing all the C# Types, interfaces, and attributes used in defining APIs, DTOs and Data Models.

### Blueprint for Code Generation

The TypeScript Data Models then serve as the blueprint for generating everything needed to support the feature in your App, including the AutoQuery CRUD APIs, Admin UIs and DB Migrations, should you prefer to re-create it from scratch.

### RDBMS Metadata AppTask

The first step in generating TypeScript Data Models is to capture the metadata from the existing RDBMS tables, which we can do with the `App.json` [AppTask](https://docs.servicestack.net/app-tasks) below. It uses your App's configured RDBMS connection to write the Table Definitions for all tables in the specified RDBMS connection and schema to the file of your choice (e.g. `App_Data/App.json`):

```csharp
AppTasks.Register("App.json", args =>
    appHost.VirtualFiles.WriteFile("App_Data/App.json", ClientConfig.ToSystemJson(
        migrator.DbFactory.GetTables(namedConnection:null, schema:null))));
```

This task can then be run from the command line with:

:::sh
dotnet run --AppTasks=App.json
:::

Which will generate the `App_Data/App.json` file containing the table definition metadata for all tables in the specified RDBMS.

### Different Connection or DB Schema

If you prefer to generate the metadata for a different connection or schema, you can create a new AppTask with your preferred `namedConnection` and/or `schema`, e.g:

```csharp
AppTasks.Register("Sales.json", args =>
    appHost.VirtualFiles.WriteFile("Sales.json", ClientConfig.ToSystemJson(
        migrator.DbFactory.GetTables(namedConnection:"reports", schema:"sales"))));
```

### Generate TypeScript Data Models

The next step is to generate TypeScript Data Models from the captured metadata, which can be done with the `okai` tool by running the `convert` command with the path to the `App.json` JSON table definitions. This writes the TypeScript Data Models to stdout, which can be redirected to a file in your **ServiceModel** project, e.g:

:::sh
npx okai convert App_Data/App.json > ../MyApp.ServiceModel/App.d.ts
:::

## Generate CRUD APIs and Admin UIs

The data models defined in the `App.d.ts` TypeScript Declaration file are what drives the generation of the Data Models, APIs, DB Migrations and Admin UIs.

### Customize Data Models

This can be further customized by editing the TypeScript Declaration file and re-running the `okai` tool with just the filename, e.g:

:::sh
npx okai App.d.ts
:::

Which will re-generate the Data Models, APIs, DB Migrations and Admin UIs based on the updated Data Models.
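For example, you might edit the generated `App.d.ts` to add validation or make a table auditable before re-running the tool. A hypothetical sketch, where the `Job` model and its members are illustrative and reuse the attributes shown later in these posts:

```ts
// App.d.ts - illustrative model, edited by hand before re-running `npx okai App.d.ts`
export class Job extends AuditBase {
    @autoIncrement()
    id: number
    @validateNotEmpty()
    title: string
    // new property added after the initial conversion
    salaryRangeUpper?: number
}
```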
![](/img/posts/okai-models/npx-okai-App.png)

:::tip
You only need to specify the `App.d.ts` TypeScript filename (i.e. not the filepath) from anywhere within your .NET solution
:::

### Live Code Generation

If you'd prefer to see the generated code in real-time, you can add the `--watch` flag to watch the TypeScript Declaration file for changes and automatically re-generate the generated files on Save:

:::sh
npx okai App.d.ts --watch
:::

# FREE CLI Access to the world's most popular AI Models
Source: https://servicestack.net/posts/okai-chat

As part of the development of [okai](/posts/okai-models) for generating [Blazor CRUD Apps from a text prompt](/posts/text-to-blazor) using your preferred AI Models, we've also made available a generic **chat** prompt that can be used as a convenient way to conduct personal research against many of the world's most popular Large Language Models - for Free!

![](/img/posts/okai-chat/okai-chat.webp)

No API Keys, no Signups, no installs, no cost: you can just start immediately using the `npx okai chat` script to ask LLMs for assistance:

:::sh
npx okai chat "command to copy a folder with rsync?"
:::

This will use the default model (currently codestral:22b) to answer your question.

### Select Preferred Model

You can also use your preferred model with the `-m` flag with either the model **name** or its **alias**, e.g. you can use [Microsoft's PHI-4 14B](https://techcommunity.microsoft.com/blog/aiplatformblog/introducing-phi-4-microsoft%E2%80%99s-newest-small-language-model-specializing-in-comple/4357090) model with:

:::sh
npx okai -m phi chat "command to copy folder with rsync?"
:::

### List Available Models

We're actively adding more great performing and leading experimental models as they're released. You can view the list of available models with `ls models`:

:::sh
npx okai ls models
:::

Which at this time will return the following list of available models along with instructions for how to use them:

```txt
USAGE (5 models max):
a) OKAI_MODELS=codestral,llama3.3,flash
b) okai -models codestral,llama3.3,flash
c) okai -m flash chat

FREE MODELS:
claude-3-haiku            (alias haiku)
codestral:22b             (alias codestral)
deepseek-r1:32b
deepseek-r1:671b          (alias deepseek-r1)
deepseek-r1:70b
deepseek-r2:32b
deepseek-v3:671b          (alias deepseek)
gemini-flash-1.5
gemini-flash-1.5-8b       (alias flash-8b)
gemini-flash-2.0          (alias flash)
gemini-flash-lite-2.0     (alias flash-lite)
gemini-flash-thinking-2.0 (alias flash-thinking)
gemini-pro-2.0            (alias gemini-pro)
gemma2:9b                 (alias gemma)
gpt-3.5-turbo             (alias gpt-3.5)
gpt-4o-mini
llama3.1:70b              (alias llama3.1)
llama3.3:70b              (alias llama3.3)
llama3:8b                 (alias llama3)
mistral-nemo:12b          (alias mistral-nemo)
mistral-small:24b         (alias mistral-small)
mistral:7b                (alias mistral)
mixtral:8x22b
mixtral:8x7b              (alias mixtral)
nova-lite
nova-micro
phi-4:14b                 (alias phi,phi-4)
qwen-plus
qwen-turbo
qwen2.5-coder:32b         (alias qwen2.5-coder)
qwen2.5:32b
qwen2.5:72b               (alias qwen2.5)
qwq:32b                   (alias qwq)
qwq:72b

PREMIUM MODELS: *
claude-3-5-haiku
claude-3-5-sonnet
claude-3-7-sonnet         (alias sonnet)
claude-3-sonnet
gemini-pro-1.5
gpt-4
gpt-4-turbo
gpt-4o
mistral-large:123b
nova-pro
o1-mini
o1-preview
o3-mini
qwen-max

* requires valid license:
a) SERVICESTACK_LICENSE=
b) SERVICESTACK_CERTIFICATE=
c) okai -models -license
```

Where you'll be able to use any of the great performing inexpensive models listed under `FREE MODELS` for Free.
Whilst ServiceStack customers with an active commercial license can also use any of the more expensive and better performing models listed under `PREMIUM MODELS` by either:

a) Setting the `SERVICESTACK_LICENSE` Environment Variable with your **License Key**
b) Setting the `SERVICESTACK_CERTIFICATE` Variable with your **License Certificate**
c) Inline using the `-license` flag with either the **License Key** or **Certificate**

### FREE for Personal Usage

To be able to maintain this as a free service for developers' personal assistance and research, we're limiting usage to **60 requests/hour**, which should be more than enough for most personal usage and research whilst deterring usage in automated tools.

:::tip info
Rate limiting is implemented with a sliding [Token Bucket algorithm](https://en.wikipedia.org/wiki/Token_bucket) that replenishes 1 additional request every 60s
:::

# New okai tool for Rapid App Development
Source: https://servicestack.net/posts/okai-models

## AI powered Rapid App Development Workflow

The new `okai` npm tool works similar to the online [Text to Blazor App](/posts/text-to-blazor) generator, except it's a local tool that can add additional functionality to an existing project.

The syntax for adding a new feature to your Web App is `npx okai <prompt>`, e.g:

:::sh
npx okai "The kind of Feature you would like to add"
:::

Where it will generate the Data Models, AutoQuery CRUD APIs, DB Migrations and Admin UI for the selected feature, which you'll see after selecting the LLM Data Models you want to use, e.g:

```sh
Selected 'deepseek-r1:70b' data models

Saved: /home/mythz/src/MyApp/MyApp.ServiceModel/Jobs.d.ts
Saved: /home/mythz/src/MyApp/MyApp.ServiceModel/Jobs.cs
Saved: /home/mythz/src/MyApp/wwwroot/admin/sections/Jobs.mjs
Saved: /home/mythz/src/MyApp/wwwroot/admin/sections/index.mjs
Saved: /home/mythz/src/MyApp/Migrations/Migration1001.cs

Run 'dotnet run --AppTasks=migrate' to apply new migration and create tables

To regenerate classes, update 'Jobs.d.ts' then run:
$ okai Jobs.d.ts
```

Where okai will generate everything needed to support the feature in your App, including:

- `MyApp.ServiceModel/Jobs.d.ts` - TypeScript Data Models
- `MyApp.ServiceModel/Jobs.cs` - AutoQuery CRUD APIs and Data Models
- `wwwroot/admin/sections/Jobs.mjs` - Admin UI Section - requires `blazor-admin` or `blazor-vue` template
- `MyApp/Migrations/Migration1001.cs` - DB Migrations - requires project with [OrmLite DB Migrations](https://docs.servicestack.net/ormlite/db-migrations)

Then to apply the migration and create the tables you can run:

:::sh
npm run migrate
:::

## Declarative AI powered Features

The approach okai uses is very different from most AI tools: instead of using AI to generate an entire App or the source code for a feature, AI is only used to generate the initial Data Models within a TypeScript Declaration file, which we've found is the best format supported by AI models and also the best typed DSL for defining data models with minimal syntax that's easy for humans to read and write.

This is possible for ServiceStack Apps since a significant portion of an App's functionality can be [declaratively applied](https://docs.servicestack.net/locode/declarative), including all [AutoQuery CRUD APIs](https://docs.servicestack.net/autoquery/crud) which can be implemented just using typed Request DTOs to define the shape of the API AutoQuery should implement.
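For example, a couple of typed Request DTOs are all that's needed for AutoQuery to implement query and create APIs for a data model. A minimal sketch reusing the declaration patterns from the explicit APIs example later in these posts, with an illustrative `Job` model:

```ts
// Typed Request DTOs define the shape of the APIs AutoQuery should implement
export class QueryJobs extends QueryDb<Job> {
    id?: number
}

export class CreateJob implements ICreateDb<Job>, IReturn<IdResponse> {
    @validateNotEmpty()
    title?: string
}
```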
From the Data Models, the rest of the feature is generated using declarative code-first APIs depending on the template used.

### Generated Admin UI

To have okai generate an Admin UI you'll need to use it within a new Blazor Admin project or Blazor Vue ([blazor-vue](https://blazor-vue.web-templates.io)) project:

:::sh
x new blazor-admin Acme
:::

Both templates support a "Modular no-touch" Admin UI which will appear under a new group in the Admin Sidebar:

![](/img/posts/text-to-blazor/okai-blazor-admin.webp)

## Customize Data Models

The data models defined in the TypeScript Declaration file, e.g. `Jobs.d.ts`, are what drives the generation of the Data Models, APIs, DB Migrations and Admin UIs.

This can be further customized by editing the TypeScript Declaration file and re-running the `okai` tool with the name of the TypeScript Declaration file, e.g:

:::sh
npx okai Jobs.d.ts
:::

Which will re-generate the Data Models, APIs, DB Migrations and Admin UIs based on the updated Data Models.

![](/img/posts/text-to-blazor/okai-Employees.webp)

:::tip
You only need to specify the `Jobs.d.ts` TypeScript filename (i.e. not the filepath) from anywhere within your .NET solution
:::

### Live Code Generation

If you'd prefer to see the generated code in real-time, you can add the `--watch` flag to watch the TypeScript Declaration file for changes and automatically re-generate the generated files on Save:

:::sh
npx okai Jobs.d.ts --watch
:::

# Text to Blazor Vue CRUD Apps
Source: https://servicestack.net/posts/text-to-blazor

Text to Blazor is our first initiative for harnessing AI to help rapidly generate new Blazor Admin CRUD Apps from just a text description.
[![](/img/posts/text-to-blazor/text-to-blazor-prompt.webp)](/text-to-blazor)

This will query 5 different high quality AI models to generate 5 different Data Models, APIs, DB Migrations and Admin UIs which you can browse to find the one that best matches your requirements.

[![](/img/posts/text-to-blazor/text-to-blazor-gen.webp)](/text-to-blazor)

### Using AI to only generate Data Models

Whilst the result is a working CRUD App, the approach taken is very different from most AI tools, which use AI to generate the entire App, leaving developers with a whole new code-base they didn't write but would now need to maintain.

Instead AI is only used to generate the initial Data Models within a **TypeScript Declaration file**, which we've found is the best format supported by AI models and also the best typed DSL for defining data models with minimal syntax that's easy for humans to read and write.

### Download preferred Blazor Vue CRUD App

Once you've decided on the Data Models that best match your requirements, you can download your preferred generated Blazor Vue CRUD App:

[![](/img/posts/text-to-blazor/text-to-blazor-download.webp)](/text-to-blazor)

### Blazor Admin App

**Admin Only** - ideal for internal Admin Apps where the Admin UI is the Primary UI

![](/img/posts/text-to-blazor/okai-blazor-admin.webp)

### Blazor Vue App

**UI + Admin** - Creates a new [blazor-vue](https://blazor-vue.web-templates.io) template that's ideal for Internet or public facing Apps, sporting a full-featured public facing UI for a Web App's users whilst enabling a back-office CRUD UI for Admin Users to manage their App's data.

![](/img/posts/text-to-blazor/okai-blazor-vue.webp)

Clicking on the **Admin UI** button will take you to the Admin UI at `/admin`:

![](/img/posts/text-to-blazor/okai-blazor-vue-admin.webp)

## Modular Code Generation

Instead of unleashing AI on your code-base unabated, we're only using AI to generate isolated functionality into grouped "no touch" source files that can be easily maintained and extended.

Creating a new Project with a prompt similar to the one above would create a new project with the new source files (marked with `*`) added to the existing project:

### APIs

```files
/MyApp.ServiceModel
    Bookings.cs
    api.d.ts*
    Employees.cs*
    Employees.d.ts*
```

### Migration

```files
/MyApp/Migrations
    Migration1000.cs
    Migration1001.cs*
```

### UI

```files
/MyApp/wwwroot/admin
    /sections
        Bookings.mjs
        Employees.mjs*
        index.mjs*
    index.html
```

After downloading a new project, you just need to run the [DB Migrations](https://docs.servicestack.net/ormlite/db-migrations) to create the tables required for any new functionality:

:::sh
npm run migrate
:::

## Run Migrations

In order to create the necessary tables for the new functionality, you'll need to run the DB Migrations.

If migrations have never been run before, you can run the `migrate` npm script to create the initial database:

:::sh
npm run migrate
:::

If you've already run the migrations before, you can run the `rerun:last` npm script to drop and re-run the last migration:

:::sh
npm run rerun:last
:::

Alternatively you can nuke the App's database (e.g. `App_Data/app.db`) and recreate it from scratch with `npm run migrate`.
## Instant CRUD UI

After running the DB migrations, you can hit the ground running and start using the Admin UI to manage the new Data Model RDBMS Tables:

:::youtube 8buo_ce3SNM
Using AutoQuery CRUD UI in a Text to Blazor App
:::

### Create new Records from Search Dialog

We're continually improving the UX of the [AutoQueryGrid Component](/vue/autoquerygrid) used in generating CRUD UIs to enable a more productive and seamless workflow. A change added to that end, which you can see in the above video, is the ability to add new Records from a Search dialog:

![](/img/posts/text-to-blazor/autoquerygrid-new2.webp)

This now lets you immediately start creating new records without needing to create any lookup entries beforehand.

## Audited Data Models

The TypeScript Data Models enable a rapid development experience for defining an App's Data Models, which are used to generate the necessary AutoQuery CRUD APIs to support an Admin UI.

An example of the productivity of this approach is the effortless support for maintaining a detailed audit history for changes to select tables by inheriting from the `AuditBase` base class, e.g:

```ts
export class Job extends AuditBase {
    ...
}
```

Which can then be regenerated using the name of the TypeScript Model definitions:

:::sh
npx okai Jobs.d.ts
:::

This will add the additional `CreatedBy`, `CreatedDate`, `ModifiedBy`, `ModifiedDate`, `DeletedBy` and `DeletedDate` properties to the specified Table and also generate the necessary [Audit Behaviors](https://docs.servicestack.net/autoquery/crud#apply-generic-crud-behaviors) on the AutoQuery APIs to maintain the audit history for each CRUD operation.

### AutoQuery CRUD Audit Log

As the **blazor-admin** and **blazor-vue** templates are configured to use the [AutoQuery CRUD Executable Audit Log](https://docs.servicestack.net/autoquery/audit-log) in their [Configure.AutoQuery.cs](https://github.com/NetCoreTemplates/blazor-admin/blob/main/MyApp/Configure.AutoQuery.cs), the Audit Behaviors will also maintain an Audit Trail of all CRUD operations which can be viewed in the Admin UI:

![](/img/posts/text-to-blazor/okai-audit-form.webp)

## TypeScript Schema

In addition to being a great DSL for defining Data Models, using TypeScript also lets us define a schema containing all the C# Types, interfaces, and attributes used in defining APIs, DTOs and Data Models in the accompanying [api.d.ts](https://okai.servicestack.com/api.d.ts) file.
This now lets us use TypeScript to define the [Bookings.cs](https://github.com/NetCoreTemplates/blazor-vue/blob/main/MyApp.ServiceModel/Bookings.cs) AutoQuery APIs and Data Models, which blazor-admin instead defines in its [Bookings.d.ts](https://github.com/NetCoreTemplates/blazor-admin/blob/main/MyApp.ServiceModel/Bookings.d.ts):

```ts
/// <reference path="./api.d.ts" />

export type Config = {
    prompt: "New Booking"
    api: "~/MyApp.ServiceModel/Bookings.cs"
    migration: "~/MyApp/Migrations/Migration1001.cs"
    uiMjs: "~/MyApp/wwwroot/admin/sections/Bookings.mjs"
}

export enum RoomType {
    Single,
    Double,
    Queen,
    Twin,
    Suite,
}

@Read.route("/bookings","GET")
@Read.route("/bookings/{Id}","GET")
@Read.description("Find Bookings")
@Create.route("/bookings","POST")
@Create.description("Create a new Booking")
@Update.notes("Find out how to quickly create a C# Bookings App from Scratch")
@Update.route("/booking/{Id}","PATCH")
@Update.description("Update an existing Booking")
@Delete.route("/booking/{Id}","DELETE")
@Delete.description("Delete a Booking")
@tag("Bookings")
@icon({svg:"..."})
@notes("Captures a Persons Name & Room Booking information")
@description("Booking Details")
@validateHasRole("Employee")
export class Booking extends AuditBase {
    @autoIncrement()
    id: number
    @Create.description("Name this Booking is for")
    @validateNotEmpty()
    name: string
    roomType: RoomType
    @validateGreaterThan(0)
    roomNumber: number
    @intlDateTime(DateStyle.Long)
    bookingStartDate: Date
    @intlRelativeTime()
    bookingEndDate?: Date
    @intlNumber({currency:"USD"})
    @validateGreaterThan(0)
    cost: decimal
    @ref({model:"nameof(Coupon)",refId:"nameof(Coupon.Id)",refLabel:"nameof(Coupon.Description)"})
    @references("typeof(Coupon)")
    couponId?: string
    @reference()
    discount?: Coupon
    @input({type:"textarea"})
    notes?: string
    cancelled?: boolean
    @reference({selfId:"nameof(CreatedBy)",refId:"nameof(User.UserName)",refLabel:"nameof(User.DisplayName)"})
    employee: User
}

@tag("Bookings")
@icon({svg:"..."})
export class Coupon extends AuditBase {
    id: string
    description: string
    discount: number
    expiryDate: Date
}
```

The benefit of this approach is that you can make a change to the Data Models and rerun the okai tool to regenerate the AutoQuery APIs, DB Migrations and Admin UIs:

:::sh
npx okai Bookings.d.ts
:::

Which will regenerate its:

- APIs: [MyApp.ServiceModel/Bookings.cs](https://github.com/NetCoreTemplates/blazor-admin/blob/main/MyApp.ServiceModel/Bookings.cs)
- DB Migration: [MyApp/Migrations/Migration1000.cs](https://github.com/NetCoreTemplates/blazor-admin/blob/main/MyApp/Migrations/Migration1000.cs)
- Admin UI: [/wwwroot/admin/sections/Bookings.mjs](https://github.com/NetCoreTemplates/blazor-admin/blob/main/MyApp/wwwroot/admin/sections/Bookings.mjs)

Which files are generated is controlled by the `Config` section:

```ts
export type Config = {
    prompt: "New Booking"
    api: "~/MyApp.ServiceModel/Bookings.cs"
    migration: "~/MyApp/Migrations/Migration1001.cs"
    uiMjs: "~/MyApp/wwwroot/admin/sections/Bookings.mjs"
}
```

So if you no longer want the code regeneration to update the DB Migration, you can just remove it from the Config.

## Customize Data Models

The data models defined in the TypeScript Declaration file, e.g. `Bookings.d.ts`, are what drives the generation of the Data Models, APIs, DB Migrations and Admin UIs.

This can be further customized by editing the TypeScript Declaration file and re-running the `okai` tool with the name of the TypeScript Declaration file, e.g.
`Bookings.d.ts`:

:::sh
npx okai Bookings.d.ts
:::

Which will re-generate the Data Models, APIs, DB Migrations and Admin UIs based on the updated Data Models.

![](/img/posts/text-to-blazor/okai-Employees.webp)

Or add `--watch` to watch the TypeScript Declaration file for changes and automatically re-generate the generated files on Save:

:::sh
npx okai Bookings.d.ts --watch
:::

:::tip
You only need to specify the `Bookings.d.ts` TypeScript filename (i.e. not the filepath) from anywhere within your .NET solution
:::

One challenge with this approach is that we only have a single class on which to define attributes for both the Request and Response DTOs of all AutoQuery CRUD APIs as well as the Data Model itself.

### API and Data Model attributes

The okai tool resolves some of these issues with smart generation of attributes, where "Data Model Attributes" like the `[Icon]` class attribute and `[AutoIncrement]` property attribute are only generated on the Data Model:

```ts
@icon({svg:"..."})
export class Booking {
    @autoIncrement()
    id: number
    @intlNumber({currency:"USD"})
    cost: decimal
}
```

Whilst "API Attributes" like the `[Tag]` and `[ValidateHasRole]` class attributes and the `[ValidateGreaterThan]` property attribute are only generated on the APIs' Request DTOs:

```ts
@tag("Bookings")
@validateHasRole("Employee")
export class Booking {
    @validateGreaterThan(0)
    cost: decimal
}
```

### C# Types

As JavaScript only has a limited set of types, the TypeScript **api.d.ts** schema also includes the built-in C# Types used when defining APIs, DTOs and Data Models, which you'll be able to use when your APIs need a specific .NET type, e.g:

```ts
export class Booking extends AuditBase {
    id: number
    name: string
    roomNumber: number
    bookingStartDate: Date
    bookingEndDate?: DateOnly
    cost: decimal
    cancelled?: boolean
}
```

Which uses the `DateOnly` and `decimal` .NET Types to generate:

```csharp
public class Booking : AuditBase
{
    [AutoIncrement]
    public int Id { get; set; }
    public string Name { get; set; }
    public int RoomNumber { get; set; }
    public DateTime BookingStartDate { get; set; }
    public DateOnly? BookingEndDate { get; set; }
    public decimal Cost { get; set; }
    public bool? Cancelled { get; set; }
}
```

### API Targeted Attributes

When you need to add attributes to a specific API Request DTO you can use a CRUD prefix to have it only applied to that specific AutoQuery API, e.g:

```ts
@Read.route("/bookings","GET")
@Read.route("/bookings/{Id}","GET")
@Create.route("/bookings","POST")
```

Where it would only be generated on the AutoQuery API it targets, e.g:

```csharp
[Route("/bookings", "GET")]
[Route("/bookings/{Id}", "GET")]
public class QueryBookings : QueryDb<Booking> { ... }

[Route("/bookings", "POST")]
public class CreateBooking : ICreateDb<Booking>, IReturn<IdResponse> { ... }
```

In addition to the `Create.`, `Read.`, `Update.`, `Delete.` attributes to target specific AutoQuery CRUD APIs, you can also use `Write.` to target all of the `Create.`, `Update.`, `Delete.` Write APIs.
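For example, a `Write.` prefixed attribute is generated on the Create, Update and Delete Request DTOs but not on the Query API or the Data Model. A minimal sketch based on the `@Write.input` usage in the explicit APIs example later in this post:

```ts
export class Booking extends AuditBase {
    // generated on the Create, Update and Delete Request DTOs only
    @Write.input({type:"textarea"})
    notes?: string
}
```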
### Ambiguous Attributes

Attributes that can be annotated on both the Data Model and API Request DTOs, like `[Notes]` and `[Description]`, are only generated on the Data Model and require using targeted attributes to apply them to API Request DTOs, e.g:

```ts
@Read.description("Find Bookings")
@Create.description("Create a new Booking")
@Update.notes("Find out how to quickly create a C# Bookings App from Scratch")
@Update.description("Update an existing Booking")
@Delete.description("Delete a Booking")
@notes("Captures a Persons Name & Room Booking information")
@description("Booking Details")
export class Booking extends AuditBase { ... }
```

Where the naked `@notes` and `@description` attributes are only generated on the Data Model whilst the targeted attributes are generated on their respective DTOs, e.g:

```csharp
[Description("Find Bookings")]
public class QueryBookings : QueryDb<Booking> { ... }

[Description("Create a new Booking")]
public class CreateBooking : ICreateDb<Booking>, IReturn<IdResponse> { ... }

[Notes("Find out how to quickly create a C# Bookings App from Scratch")]
[Description("Update an existing Booking")]
public class UpdateBooking : IPatchDb<Booking>, IReturn<IdResponse> { ... }

[Description("Delete a Booking")]
public class DeleteBooking : IDeleteDb<Booking>, IReturnVoid { ... }

[Description("Booking Details")]
[Notes("Captures a Persons Name & Room Booking information")]
public class Booking : AuditBase { ... }
```

### Special Attribute Values

There's special behavior for `"nameof(...)"` and `"typeof(...)"` string attribute values where:

```ts
export class Booking extends AuditBase {
    @ref({model: "nameof(Coupon)", refId: "nameof(Coupon.Id)", refLabel: "nameof(Coupon.Description)"})
    @references("typeof(Coupon)")
    couponId?: string
}
```

Will be generated with native C# syntax, i.e. instead of as strings:

```csharp
public class Booking : AuditBase
{
    [Ref(Model=nameof(Coupon), RefId=nameof(Coupon.Id), RefLabel=nameof(Coupon.Description))]
    [References(typeof(Coupon))]
    public string? CouponId { get; set; }
}
```

### Changing Default Attributes

To improve the default out-of-the-box experience some attributes are included by default, including:

- `[Icon]` attribute on Data Models based on the Data Model name - prevent by adding an empty `@icon()` attribute
- `[AutoIncrement]` on `id` number properties if no other `[PrimaryKey]` attribute is defined - prevent by adding `@primaryKey()` or `@autoId()`
- `[Validate*]` attributes added to Create/Update APIs on non-nullable properties - prevent by adding an empty `@validate()` attribute

Here's an example which changes the default behavior for the default attributes above:

```ts
@icon()
export class Todo {
    @primaryKey()
    id: number
    @validate()
    name: string
}
```

Which will generate the C# APIs without the `[Icon]` and `[Validate]` attributes and replace `[AutoIncrement]` with `[PrimaryKey]`, e.g:

```csharp
public class CreateTodo : ICreateDb<Todo>, IReturn<IdResponse>
{
    [ValidateGreaterThan(0)]
    public int Id { get; set; }
    public string Name { get; set; }
}

public class Todo
{
    [PrimaryKey]
    public int Id { get; set; }
    public string Name { get; set; }
}
```

### Modifying ApplicationUser

In many cases the AI Models will want to generate a `User` class for their data models. But as Blazor Apps are already configured to use an ApplicationUser Identity Auth User class, the C# code generation only generates the User class in a comment so you can merge it with your existing `User` class, e.g:

```csharp
/* merge with User DTO
/// <summary>
/// Interface defining the structure for a JobApplication.
/// Represents a user's application to a specific job.
/// </summary>
public class User
{
    [AutoIncrement]
    public int Id { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public string Email { get; set; }
    /// <summary>
    /// Optional URL to the user's resume
    /// </summary>
    public string? ResumeUrl { get; set; }
}
*/
```

If you wish to add additional properties, you'll first need to add them to your `ApplicationUser` class, e.g:

```csharp
public class ApplicationUser : IdentityUser
{
    public string? FirstName { get; set; }
    public string? LastName { get; set; }
    public string? DisplayName { get; set; }
    public string? ProfileUrl { get; set; }
    /// <summary>
    /// Optional URL to the user's resume
    /// </summary>
    public string? ResumeUrl { get; set; }
}
```

You'll then need to regenerate the EF Migration to update the `AspNetUsers` table with the new columns by running the `init-ef` npm script:

:::sh
npm run init-ef
:::

Which will delete the existing Migrations and create a new Migration to update the Identity Auth tables:

```json
{
  "scripts": {
    "init-ef": "node -e 'fs.readdirSync(`Migrations`).filter(x => !x.startsWith(`Migration`)).forEach(x => fs.rmSync(`Migrations/${x}`))' && dotnet ef migrations add CreateIdentitySchema"
  }
}
```

You can then delete your Primary Database (e.g. App_Data/app.db) and re-run the `migrate` npm script to recreate it:

:::sh
npm run migrate
:::

If you want the additional property to be included in API Responses you'll also need to add it to your `User` DTO, e.g:

```csharp
/// <summary>
/// Public User DTO
/// </summary>
[Alias("AspNetUsers")]
public class User
{
    public string Id { get; set; }
    public string UserName { get; set; }
    public string? FirstName { get; set; }
    public string? LastName { get; set; }
    public string? DisplayName { get; set; }
    public string? ProfileUrl { get; set; }
    public string? ResumeUrl { get; set; }
}
```

Which OrmLite and AutoQuery will use to query Identity Auth's `AspNetUsers` table.

### Custom APIs

When you need more fine-grained control over the generated APIs, you can "take over" the generation of an AutoQuery API by explicitly defining it yourself. So if you prefer to use explicit API Request DTOs instead of targeted attributes, or need to control the exact properties that are generated in each API, you can define the API Request DTOs yourself, and when they exist, code generation for that API is skipped.
To showcase the differences, you can rewrite the above single-class Data Model using an explicit class for each API:

```ts
export enum RoomType {
    Single,
    Double,
    Queen,
    Twin,
    Suite,
}

@tag("Bookings")
@notes("Captures a Persons Name & Room Booking information")
@route("/bookings","GET")
@route("/bookings/{Id}","GET")
@autoApply(Behavior.AuditQuery)
@description("Find Bookings")
export class QueryBookings extends QueryDb<Booking> {
    id?: number
}

@tag("Bookings")
@route("/bookings","POST")
@autoApply(Behavior.AuditCreate)
@description("Create a new Booking")
@validateHasRole("Employee")
export class CreateBooking implements ICreateDb<Booking>, IReturn<IdResponse> {
    name?: string
    roomType?: RoomType
    @validateGreaterThan(0)
    roomNumber?: number
    bookingStartDate?: Date
    bookingEndDate?: Date
    @validateGreaterThan(0)
    cost?: decimal
    couponId?: string
    discount?: Coupon
    @input({type:"textarea"})
    notes?: string
    cancelled?: boolean
}

@tag("Bookings")
@route("/bookings","PATCH")
@autoApply(Behavior.AuditModify)
@description("Update an existing Booking")
@validateHasRole("Employee")
export class UpdateBooking implements IPatchDb<Booking>, IReturn<IdResponse> {
    name?: string
    roomType?: RoomType
    @validateGreaterThan(0)
    roomNumber?: number
    bookingStartDate?: Date
    bookingEndDate?: Date
    @validateGreaterThan(0)
    cost?: decimal
    couponId?: string
    discount?: Coupon
    @input({type:"textarea"})
    notes?: string
    cancelled?: boolean
}

@tag("Bookings")
@route("/bookings/{Id}","DELETE")
@autoApply(Behavior.AuditSoftDelete)
@description("Delete a Booking")
@validateHasRole("Manager")
export class DeleteBookings implements IDeleteDb<Booking>, IReturnVoid {
    id?: number
}

@tag("Bookings")
@notes("Captures a Persons Name & Room Booking information")
@route("/bookings","GET")
@route("/bookings/{Id}","GET")
@description("Find Bookings")
export class Booking extends AuditBase {
    @autoIncrement()
    id: number
    @Create.description("Name this Booking is for")
    @Create.validateNotEmpty()
    name: string
    roomType: RoomType
    roomNumber: number
    @intlDateTime(DateStyle.Long)
    bookingStartDate: Date
    @intlRelativeTime()
    bookingEndDate?: Date
    @intlNumber({currency:"USD"})
    cost: decimal
    @ref({model:"nameof(Coupon)",refId:"nameof(Coupon.Id)",refLabel:"nameof(Coupon.Description)"})
    @references("typeof(Coupon)")
    couponId?: string
    @reference()
    discount?: Coupon
    @Write.input({type:"textarea"})
    notes?: string
    cancelled?: boolean
    @reference({selfId:"nameof(CreatedBy)",refId:"nameof(User.UserName)",refLabel:"nameof(User.DisplayName)"})
    employee: User
}

@description("Discount Coupons")
export class Coupon extends AuditBase {
    id: string
    description: string
    discount: number
    expiryDate: Date
}

@tag("Bookings")
@route("/coupons","GET")
@autoApply(Behavior.AuditQuery)
@description("Find Coupons")
export class QueryCoupons extends QueryDb<Coupon> {
    id?: string
}

@tag("Bookings")
@route("/coupons","POST")
@autoApply(Behavior.AuditCreate)
@description("Create a new Coupon")
@validateHasRole("Employee")
export class CreateCoupon implements ICreateDb<Coupon>, IReturn<IdResponse> {
    id: string
    description: string
    discount: number
    expiryDate: Date
}

@tag("Bookings")
@route("/coupons","PATCH")
@autoApply(Behavior.AuditModify)
@description("Update an existing Coupon")
@validateHasRole("Employee")
export class UpdateCoupon implements IPatchDb<Coupon>, IReturnVoid {
    id: string
    description?: string
    discount?: number
    expiryDate?: Date
}

@tag("Bookings")
@route("/coupons/{Id}","DELETE")
@autoApply(Behavior.AuditSoftDelete)
@description("Delete a Coupon")
@validateHasRole("Manager")
export class DeleteCoupon implements IDeleteDb<Coupon>, IReturnVoid {
    id?: string
}
```

# Self Hosted AI Server for LLMs, Ollama, Comfy UI & FFmpeg
Source: https://servicestack.net/posts/ai-server

## AI Server now ready to serve!

We're excited to announce the first release of AI Server - a Free OSS self-hosted Docker private gateway to manage API access to multiple LLM APIs, Ollama endpoints, Media APIs, Comfy UI and FFmpeg Agents.

:::youtube Ojo80oFQte8
Introducing AI Server
:::

### Centralized Management

Designed as a one-stop solution to manage an organization's AI integrations for all their System Apps, by utilizing developer friendly HTTP JSON APIs that support any programming language or framework.

[![](/img/svgs/ai-server-overview.svg)](https://openai.servicestack.net)

### Distribute load across multiple Ollama, Open AI Gateway and Comfy UI Agents

It works as a private gateway to process the LLM, AI and image transformation requests that any of our Apps need, dynamically load balancing requests across our local GPU Servers, Cloud GPU instances and API Gateways running multiple instances of Ollama, Open AI Chat, LLM Gateway, Comfy UI, Whisper and ffmpeg providers.

In addition to maintaining a history of AI Requests, it also provides file storage for its CDN-hostable AI generated assets and on-the-fly, cacheable image transformations.

### Native Typed Integrations

Uses [Add ServiceStack Reference](https://docs.servicestack.net/add-servicestack-reference) to enable simple, native typed integrations for most popular Web, Mobile and Desktop languages including: C#, TypeScript, JavaScript, Python, Java, Kotlin, Dart, PHP, Swift, F# and VB.NET.

Each AI Feature supports multiple call styles for optimal integration of different usages:

- **Synchronous API** · Simplest API ideal for small workloads where the Response is returned in the same Request
- **Queued API** · Returns a reference to the queued job executing the AI Request which can be used to poll for the API Response
- **Reply to Web Callback** · Ideal for reliable App integrations where responses are posted back to a custom URL Endpoint

### Live Monitoring and Analytics

Monitor performance and statistics of all your App's AI Usage, with real-time logging of executing APIs and auto archival of completed AI Requests into monthly rolling SQLite databases.

### Protected Access with API Keys

AI Server utilizes [Simple Auth with API Keys](https://docs.servicestack.net/auth/admin-apikeys) letting Admins create and distribute API Keys to only allow authorized clients to access their AI Server's APIs, which can be optionally further restricted to only [allow access to specific APIs](https://docs.servicestack.net/auth/apikeys#creating-user-api-keys).

## Install

AI Server can be installed on macOS and Linux with Docker by running [install.sh](https://github.com/ServiceStack/ai-server/blob/main/install.sh):

1. Clone the AI Server repository from GitHub:

:::sh
git clone https://github.com/ServiceStack/ai-server
:::

2. Run the Installer:

:::sh
cd ai-server && cat install.sh | bash
:::

The installer will detect common environment variables for the supported AI Providers like OpenAI, Google, Anthropic, and others, and prompt you to add them to your AI Server configuration.
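If you want the installer to pick up your existing provider keys, you can export them before running it. A hypothetical sketch assuming the providers' conventional environment variable names (verify which variables your version of install.sh actually detects):

```sh
# Assumed conventional env var names - confirm against the installer's docs
export OPENAI_API_KEY=...
export ANTHROPIC_API_KEY=...
export GOOGLE_API_KEY=...
cd ai-server && cat install.sh | bash
```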
### Optional - Install ComfyUI Agent

If your server also has a GPU, you can ask the installer to install the [ComfyUI Agent](/ai-server/comfy-extension).

The ComfyUI Agent is a separate Docker agent for running [ComfyUI](https://www.comfy.org), [Whisper](https://github.com/openai/whisper) and [FFmpeg](https://www.ffmpeg.org) on servers with GPUs to handle AI Server's [Image](https://docs.servicestack.net/ai-server/transform/image) and [Video transformations](https://docs.servicestack.net/ai-server/transform/video) and Media Requests, including:

- [Text to Image](https://docs.servicestack.net/ai-server/text-to-image)
- [Image to Text](https://docs.servicestack.net/ai-server/image-to-text)
- [Image to Image](https://docs.servicestack.net/ai-server/image-to-image)
- [Image with Mask](https://docs.servicestack.net/ai-server/image-with-mask)
- [Image Upscale](https://docs.servicestack.net/ai-server/image-upscale)
- [Speech to Text](https://docs.servicestack.net/ai-server/speech-to-text)
- [Text to Speech](https://docs.servicestack.net/ai-server/text-to-speech)

#### Comfy UI Agent Installer

To install the ComfyUI Agent on a separate server (with a GPU), you can clone and run the ComfyUI Agent installer on that server instead:

1. Clone the Comfy Agent:

:::sh
git clone https://github.com/ServiceStack/agent-comfy.git
:::

2. Run the Installer:

:::sh
cd agent-comfy && cat install.sh | bash
:::

## Running in Production

We've been developing and running AI Server for several months now, processing millions of LLM and Comfy UI Requests to generate the Open AI Chat Answers and Generated Images used to populate the [pvq.app](https://pvq.app) and [blazordiffusion.com](https://blazordiffusion.com) websites.

Our production instance with more info about AI Server is available at:

:::{.m-0 .text-center .text-2xl .font-semibold .text-indigo-600}
https://openai.servicestack.net
:::

[![](/img/posts/ai-server/ai-server-languages.png)](https://openai.servicestack.net)

## API Explorer

Whilst our production instance is protected by API Keys, you can still use it to explore the available APIs in its API Explorer:

:::{.m-0 .text-center .text-2xl .font-semibold .text-indigo-600}
[https://openai.servicestack.net/ui/](https://openai.servicestack.net/ui/OpenAiChatCompletion)
:::

## Documentation

The documentation for AI Server is being maintained at:

:::{.m-0 .text-center .text-2xl .font-semibold .text-indigo-600}
https://docs.servicestack.net/ai-server/
:::

## Built-in UIs

Built-in UIs allow users with API Keys access to custom UIs for different AI features:

[![](/img/posts/ai-server/ai-server-builtin-uis.png)](https://openai.servicestack.net)

## Admin UIs

Use the Admin UI to manage the API Keys that can access AI Server APIs and Features:

[![](/img/posts/ai-server/ai-server-admin-uis.png)](https://openai.servicestack.net)

## Features

The current release of AI Server supports a number of different modalities, including:

### Large Language Models

- [Open AI Chat](https://docs.servicestack.net/ai-server/chat)
- Support for Ollama endpoints
- Support for Open Router, Anthropic, Open AI, Mistral AI, Google and Groq API Gateways

### Comfy UI Agent / Replicate / DALL-E 3

- [Text to Image](https://docs.servicestack.net/ai-server/text-to-image)

### Comfy UI Agent

- [Image to Image](https://docs.servicestack.net/ai-server/image-to-image)
- [Image Upscaling](https://docs.servicestack.net/ai-server/image-upscale)
- [Image with Mask](https://docs.servicestack.net/ai-server/image-with-mask)
- [Image to Text](https://docs.servicestack.net/ai-server/image-to-text)
- [Text to Speech](https://docs.servicestack.net/ai-server/text-to-speech)
- [Speech to Text](https://docs.servicestack.net/ai-server/speech-to-text)

### FFmpeg

- [Image Transformations](https://docs.servicestack.net/ai-server/transform/image)
  - **Crop Image** - Crop an image to a specific size
  - **Convert Image** - Convert an image to a different format
  - **Scale Image** - Scale an image to a different resolution
  - **Watermark Image** - Add a watermark to an image
- [Video Transformations](https://docs.servicestack.net/ai-server/transform/video)
  - **Crop Video** - Crop a video to a specific size
  - **Convert Video** - Convert a video to a different format
  - **Scale Video** - Scale a video to a different resolution
  - **Watermark Video** - Add a watermark to a video
  - **Trim Video** - Trim a video to a specific length

### Managed File Storage

- Blob Storage - isolated and restricted by API Key

## AI Server API Examples

To simplify integrations with AI Server, each API Request can be called with 3 different call styles to better support different use-cases and integration patterns.

### Synchronous Open AI Chat Example

The **Synchronous API** is the simplest API, ideal for small workloads where the Response is returned in the same Request:

```csharp
var client = new JsonApiClient(baseUrl);
client.BearerToken = apiKey;

var response = client.Post(new OpenAiChatCompletion {
    Model = "mixtral:8x22b",
    Messages = [
        new() { Role = "user", Content = "What's the capital of France?" }
    ],
    MaxTokens = 50
});

var answer = response.Choices[0].Message.Content;
```

### Synchronous Media Generation Request Example

Other AI Requests can be called synchronously in the same way, where each API is named after the modality it implements, e.g. you'd instead call `TextToImage` to generate an Image from a Text description:

```csharp
var response = client.Post(new TextToImage {
    PositivePrompt = "A serene landscape with mountains and a lake",
    Model = "flux-schnell",
    Width = 1024,
    Height = 1024,
    BatchSize = 1
});

File.WriteAllBytes(saveToPath, response.Results[0].Url.GetBytesFromUrl());
```

### Queued Open AI Chat Example

The **Queued API** immediately returns a reference to the queued job executing the AI Request:

```csharp
var response = client.Post(new QueueOpenAiChatCompletion {
    Request = new() {
        Model = "gpt-4-turbo",
        Messages = [
            new() { Role = "system", Content = "You are a helpful AI assistant." },
            new() { Role = "user", Content = "How do LLMs work?" }
        ],
        MaxTokens = 50
    }
});
```

Which can be used to poll for the API Response of any Job by calling the `GetOpenAiChatStatus` API and checking when its state has finished running to get the completed `OpenAiChatResponse`:

```csharp
GetOpenAiChatStatusResponse status = new();
while (status.JobState is BackgroundJobState.Started or BackgroundJobState.Queued)
{
    status = await client.GetAsync(new GetOpenAiChatStatus { RefId = response.RefId });
    await Task.Delay(1000);
}

var answer = status.Result.Choices[0].Message.Content;
```

### Queued Media Artifact Generation Request Example

Most other AI Server Requests are Artifact generation requests which would instead call `GetArtifactGenerationStatus` to get the artifacts response of a queued job, e.g:

```csharp
var response = client.Post(new QueueTextToImage {
    PositivePrompt = "A serene landscape with mountains and a lake",
    Model = "flux-schnell",
    Width = 1024,
    Height = 1024,
    BatchSize = 1
});

// Poll for Job Completion Status
GetArtifactGenerationStatusResponse status = new();
while (status.JobState is BackgroundJobState.Queued or BackgroundJobState.Started)
{
    status = client.Get(new GetArtifactGenerationStatus { JobId = response.JobId });
    Thread.Sleep(1000);
}

File.WriteAllBytes(saveToPath, status.Results[0].Url.GetBytesFromUrl());
```

### Queued Media Text Generation Request Example

Whilst the Media API Requests that generate text, like `SpeechToText` or `ImageToText`, would instead call `GetTextGenerationStatus` to get the text response of a queued job, e.g:

```csharp
using var fsAudio = File.OpenRead("files/test_audio.wav");
var response = client.PostFileWithRequest(new QueueSpeechToText(),
    new UploadFile("test_audio.wav", fsAudio, "audio"));

// Poll for Job Completion Status
GetTextGenerationStatusResponse status = new();
while (status.JobState is BackgroundJobState.Started or BackgroundJobState.Queued)
{
    status = client.Get(new GetTextGenerationStatus { RefId = response.RefId });
    Thread.Sleep(1000);
}

var answer = status.Results[0].Text;
```

### Open AI Chat with Callback Example

The Queued API also accepts a **Reply to Web Callback** for a more reliable push-based App integration where responses are posted back to a custom URL Endpoint:

```csharp
var correlationId = Guid.NewGuid().ToString("N");
var response = client.Post(new QueueOpenAiChatCompletion {
    //...
    ReplyTo = $"https://example.org/api/OpenAiChatResponseCallback?CorrelationId={correlationId}"
});
```

Your callback URL can include any additional metadata to assist your App in correlating the response with the initiating request. The callback DTO just needs to contain the properties of the `OpenAiChatResponse` you're interested in, along with any metadata added to the callback URL, e.g:

```csharp
public class OpenAiChatResponseCallback : OpenAiChatResponse, IPost, IReturnVoid
{
    public Guid CorrelationId { get; set; }
}

public object Post(OpenAiChatResponseCallback request)
{
    // Handle OpenAiChatResponse callback
}
```

Unless your callback API is restricted to only accept requests from your AI Server, you should include a unique Id like a `Guid` in the callback URL that can be validated against an initiating request to ensure the callback can't be spoofed.

## Feedback

Feel free to reach us at [ai-server/discussions](https://github.com/ServiceStack/ai-server/discussions) with any AI Server questions.
# .NET 8 Templates migrated to use Kamal for deployments
Source: https://servicestack.net/posts/kamal-deployments

Since introducing [GitHub Actions support to our templates](https://docs.servicestack.net/ssh-docker-compose-deploment), we've promoted simplified deployments, focusing on tooling like SSH and Docker Compose to give projects the most portability by default.

This was partly inspired by the fact that cloud providers' value offerings have been decreasing, especially over the last 5 years. We've previously shown [the significant savings](https://servicestack.net/posts/hetzner-cloud) available by utilizing hosting providers like Hetzner (who we've been using for several years), and moved all our templates and live demos to Hetzner, resulting in a cost of roughly **$0.50 per month** per .NET App.

Along with this decrease in value from the major cloud vendors, and the general hardware improvements, we've also been leaning into [using SQLite for server .NET Apps](/posts/scalable-sqlite), using it as the primary database for some of our larger example applications like [pvq.app](https://pvq.app), [blazordiffusion.com](https://blazordiffusion.com), and most recently, [AI Server](https://openai.servicestack.net).

We're delighted to see that the folks at BaseCamp estimate they'll [save millions from their cloud exit](https://world.hey.com/dhh/our-cloud-exit-savings-will-now-top-ten-million-over-five-years-c7d9b5bd) and have doubled down on their general purpose Docker deployment solution with their initial MRSK project, now known as Kamal.

### Use Kamal to deploy .NET Apps to any Linux server

:::youtube -mDJfRG8mLQ
Use Kamal with GitHub Actions to deploy .NET Apps to any Linux server
:::

## What is [Kamal](https://kamal-deploy.org/)?

Kamal is a tool that offers the same flexibility by wrapping up fundamental tooling like SSH and Docker into a great CLI tool that simplifies the management of containerized applications, enabling them to be deployed anywhere there is a Linux host accessible via SSH.

It handles the reverse proxying of web traffic automatically, as well as the initial setup of the reverse proxy and related tooling on any target Linux host. This means you get the same great ergonomics of just pointing your DNS and configuration file to a server, and *Kamal takes care of the rest*, including TLS certificates via LetsEncrypt.

It even has commands that allow you to check on your running applications, view logs, etc, and all you need to do is run the commands from your local repository directory. While our own templates have used the same approach for GitHub Actions, the usage was always awkward and lacked any dedicated CLI tooling you could run locally to check on your running applications.

## What's in the templates?

We still believe that having a CI process is important, and while Kamal deployments are repeatable from your local machine and use locking to avoid multiple developers deploying changes at the same time, the single consistent process of a CI is hard to beat.

So while we have moved the templates to use Kamal, we've incorporated GitHub Actions by default, so you can still get the benefits of running commands like `kamal app logs` locally from your development machine when looking at production issues, but have that consistent workflow for deployment on your repository's GitHub Actions.

## How it works

One of the big benefits of Kamal is the focus on ergonomics and the really well done documentation that the BaseCamp team has put together.
So if you need to know more about Kamal, [checkout their docs](https://kamal-deploy.org/docs/).

For the ServiceStack templates, you will need to add a valid `PRIVATE_SSH_KEY` as a GitHub Actions secret to get it working, along with customizing your `config/deploy.yml` file, which is a part of any Kamal setup.

In short, you will need to:

- Get a Linux host running with SSH access
- Update your DNS configuration with an A record pointing to that host's IP address
- Create a new project using one of our updated templates using a command like:

:::sh
x new blazor-vue MyApp
:::

Then update the `config/deploy.yml` with the following details:

### GitHub Container Registry Image

Update with your preferred container image name:

```yml
# Name of the container image
image: my-user/myapp
```

### Server Web

Configure with your Linux Host IP Address:

```yml
servers:
  # IP address of server, optionally use env variable
  web:
    - 123.123.123.123
```

Alternatively, you can use an environment variable for the server IP address, e.g:

```yml
  web:
    - <%= ENV['KAMAL_DEPLOY_IP'] %>
```

### Proxy Host

Configure with your domain pointing to the same IP as your host:

```yml
proxy:
  ssl: true
  host: myapp.example.com
```

### Health Checks

The template includes the use of ASP.NET Core Health Checks, which use the default Kamal path of `/up` to check if the application is running before deploying:

```csharp
public class HealthCheck : IHealthCheck
{
    public async Task<HealthCheckResult> CheckHealthAsync(HealthCheckContext context,
        CancellationToken token = default)
    {
        // Perform health check logic here
        return HealthCheckResult.Healthy();
    }
}
```

Kamal checks this path before deploying your application, so you can add any custom health checks to this path to ensure your application is ready to receive traffic.

## GitHub Repository

With your application created and configured for deployment, you can create a new GitHub Repository and add the GitHub Actions Secret of `PRIVATE_SSH_KEY`, which should be a separate SSH key for deployments that has access to your Linux host.

You can use the GitHub CLI to do both of these steps:

```bash
gh repo create
```

When prompted, create an empty repository. Then add the `PRIVATE_SSH_KEY` secret:

```bash
gh secret set PRIVATE_SSH_KEY < deploy-key
```

Where `deploy-key` is your deployment specific SSH key file.

Once created, you can follow the steps in your empty repository to init your templated `MyApp` project and push your initial commit. If your `deploy.yml` config and DNS were set up correctly, the GitHub Action will do the following:

- Build and test your application, running the MyApp.Tests project by default
- Publish your application as a Docker container to GitHub's `ghcr.io` repository
- Use Kamal to initialize your Linux host so it can run Kamal applications, using their default `kamal-proxy`
- Fix volume permissions for your application, since ASP.NET containers don't run as the root user
- Run your `AppTasks=migrate` command before running your application, initializing the SQLite database
- Run your AppHost using the `kamal deploy -P --version latest` command

## Summary

We're excited to be moving our templates to Kamal for deployments, as it has distilled the simple approach we have baked into our templates for a number of years whilst massively improving the ergonomics. We're excited to see what the BaseCamp team does with the project, and we're looking forward to seeing the community grow around it.
If you have any questions about the templates or Kamal, feel free to reach out to us on our Discord, GitHub Discussions or Customer Forums.

# DTOs in all languages downloadable without .NET
Source: https://servicestack.net/posts/npx-get-dtos

To make it easier to consume ServiceStack APIs in any language, we've added the ability to download Typed DTOs from any ServiceStack API in all languages without needing .NET installed, with the new `npx get-dtos` npm script.

It has the same syntax and functionality as the `x` dotnet tool for adding and updating ServiceStack References, where in most cases you can replace `x` with `npx get-dtos` to achieve the same result.

Running `npx get-dtos` without any arguments will display the available options:

```txt
get-dtos <lang>             Update all ServiceStack References in directory (recursive)
get-dtos <lang> <file>      Update existing ServiceStack Reference (e.g. dtos.cs)
get-dtos <lang> <base-url>  Add ServiceStack Reference and save to file name

get-dtos csharp       Add C# ServiceStack Reference          (Alias 'cs')
get-dtos typescript   Add TypeScript ServiceStack Reference  (Alias 'ts')
get-dtos javascript   Add JavaScript ServiceStack Reference  (Alias 'js')
get-dtos python       Add Python ServiceStack Reference      (Alias 'py')
get-dtos dart         Add Dart ServiceStack Reference        (Alias 'da')
get-dtos php          Add PHP ServiceStack Reference         (Alias 'ph')
get-dtos java         Add Java ServiceStack Reference        (Alias 'ja')
get-dtos kotlin       Add Kotlin ServiceStack Reference      (Alias 'kt')
get-dtos swift        Add Swift ServiceStack Reference       (Alias 'sw')
get-dtos fsharp       Add F# ServiceStack Reference          (Alias 'fs')
get-dtos vbnet        Add VB.NET ServiceStack Reference      (Alias 'vb')
get-dtos tsd          Add TypeScript Definition ServiceStack Reference

Options:
    -h, --help, ?        Print this message
    -v, --version        Print tool version
    --include <tag>      Include all APIs in specified tag group
    --qs <key=value>     Add query string to Add ServiceStack Reference URL
    --verbose            Display verbose logging
    --ignore-ssl-errors  Ignore SSL Errors
```

## Reusable DTOs and Reusable Clients in any language

A benefit of [Add ServiceStack Reference](https://docs.servicestack.net/add-servicestack-reference) is that only the API DTOs need to be generated, which can then be used to call any remote instance running that API. E.g. DTOs generated for our deployed AI Server instance at [openai.servicestack.net](https://openai.servicestack.net) can be used to call any self-hosted AI Server instance, and likewise the same generic client can also be used to call any other ServiceStack API.

### Typed Open AI Chat & Ollama APIs in 11 Languages

A good example of its versatility is in the [Typed OpenAI Chat & Ollama APIs](/posts/typed-openai-chat-ollama-apis), in which AI Server's Typed DTOs can be used to call **any Open AI Chat compatible API** in its 11 supported languages.

### TypeScript Example

For example you can get the TypeScript DTOs for the just released [AI Server](/posts/ai-server) by:

1. Installing the `@servicestack/client` npm package:

:::copy
npm install @servicestack/client
:::
2. Downloading AI Server's TypeScript DTOs:

:::copy
`npx get-dtos typescript https://openai.servicestack.net`
:::

Which just like the `x` tool will add the TypeScript DTOs to the `dtos.ts` file.

### Calling Ollama from TypeScript

Call Ollama by sending the `OpenAiChatCompletion` Request DTO with `JsonServiceClient`:

```ts
import { JsonServiceClient } from "@servicestack/client"
import { OpenAiChatCompletion } from "./dtos"

const client = new JsonServiceClient(baseUrl)
const response = await client.postToUrl("/v1/chat/completions",
    new OpenAiChatCompletion({
        model: "mixtral:8x22b",
        messages: [
            { role: "user", content: "What's the capital of France?" }
        ],
        max_tokens: 50
    })
)
const answer = response.choices[0].message.content
```

### Update TypeScript DTOs

And later update all TypeScript ServiceStack References in the current directory with:

:::sh
`npx get-dtos typescript`
:::

### Install and Run in a single command

This can be used as a more flexible alternative to the `x` tool, as it's often easier to install Node in CI environments than a full .NET SDK, and easier to use npx scripts than global dotnet tools.

For example you can use the `--yes` flag to implicitly install (if needed) and run the `get-dtos` script in a single command, e.g:

:::sh
`npx --yes get-dtos typescript`
:::

### C# Example

As such you may want to replace the `x` dotnet tool with `npx get-dtos` in your C#/.NET projects as well, which can either use the language name or its more wrist-friendly shorter alias, e.g:

:::sh
`npx get-dtos cs https://openai.servicestack.net`
:::

Then later update all C# DTOs in the current directory (including sub directories) with:

:::sh
`npx get-dtos cs`
:::

# ServiceStack.Swift client library rewritten for Swift 6

Source: https://servicestack.net/posts/swift6-upgrade

![](https://docs.servicestack.net/img/pages/servicestack-reference/swift-logo-banner.jpg)

As part of the release of [AI Server](/posts/ai-server) we've upgraded all generic service client libraries to support multiple file uploads with API requests to take advantage of AI Server APIs that accept file uploads like [Image to Image](https://docs.servicestack.net/ai-server/image-to-image), [Speech to Text](https://docs.servicestack.net/ai-server/speech-to-text) or its [FFmpeg Image](https://docs.servicestack.net/ai-server/transform/image) and [Video Transforms](https://docs.servicestack.net/ai-server/transform/video).

## ServiceStack.Swift rewritten for Swift 6

[ServiceStack.Swift](https://github.com/ServiceStack/ServiceStack.Swift) received the biggest upgrade, which was also rewritten to take advantage of Swift 6 features, including Swift's native async/await concurrency which replaced the previous [PromiseKit](https://github.com/mxcl/PromiseKit) dependency - making it now dependency-free!
For example you can request a [Speech to Text](https://docs.servicestack.net/ai-server/speech-to-text) transcription by sending an audio file to the `SpeechToText` API using the new `postFilesWithRequest` method:

### Calling AI Server to transcribe an Audio Recording

```swift
let client = JsonServiceClient(baseUrl: "https://openai.servicestack.net")
client.bearerToken = apiKey

let request = SpeechToText()
request.refId = "uniqueUserIdForRequest"

let response = try client.postFilesWithRequest(request: request,
    file: UploadFile(fileName: "audio.mp3", data: mp3Data, fieldName: "audio"))

Inspect.printDump(response)
```

### Async Upload Files with API Example

Alternatively use the new `postFileWithRequestAsync` method to call the API asynchronously using [Swift 6 Concurrency](https://docs.swift.org/swift-book/documentation/the-swift-programming-language/concurrency/)'s **async/await** feature:

```swift
let response = try await client.postFileWithRequestAsync(request: request,
    file: UploadFile(fileName: "audio.mp3", data: mp3Data, fieldName: "audio"))

Inspect.printDump(response)
```

### Multiple file upload with API Request examples

The `postFilesWithRequest` methods can also be used to upload multiple files with an API Request, e.g:

```swift
let request = WatermarkVideo()
request.position = .BottomRight

let response = try client.postFilesWithRequest(request: request,
    files: [
        UploadFile(fileName: "video.mp4", data: videoData, fieldName: "video"),
        UploadFile(fileName: "mark.jpg", data: imgData, fieldName: "watermark")
    ])
```

Async Example:

```swift
let response = try await client.postFilesWithRequestAsync(request: request,
    files: [
        UploadFile(fileName: "video.mp4", data: videoData, fieldName: "video"),
        UploadFile(fileName: "mark.jpg", data: imgData, fieldName: "watermark")
    ])
```

### Sending typed Open AI Chat Ollama Requests with Swift

Even if you're not running AI Server you can still use its typed DTOs to call any Open AI Chat compatible API like a self-hosted [Ollama](https://ollama.com) API.

To call an Ollama endpoint from Swift:

1. Include the `ServiceStack` package in your project's `Package.swift`:

```swift
dependencies: [
    .package(url: "https://github.com/ServiceStack/ServiceStack.Swift.git",
        Version(6,0,0)..<Version(7,0,0)),
],
```

2. Download AI Server's Swift DTOs:

:::copy
`npx get-dtos swift https://openai.servicestack.net`
:::

# Typed OpenAI Chat & Ollama APIs in 11 Languages

Source: https://servicestack.net/posts/typed-openai-chat-ollama-apis

## C#

Install the `ServiceStack.Client` NuGet package:

:::copy
`<PackageReference Include="ServiceStack.Client" Version="8.*" />`
:::

Download AI Server's C# DTOs with [x dotnet tool](https://docs.servicestack.net/dotnet-tool):

:::copy
`x csharp https://openai.servicestack.net`
:::

Call API by sending `OpenAiChatCompletion` Request DTO with `JsonApiClient`:

```csharp
using ServiceStack;

var client = new JsonApiClient(baseUrl);

var result = await client.PostAsync<OpenAiChatResponse>("/v1/chat/completions",
    new OpenAiChatCompletion {
        Model = "mixtral:8x22b",
        Messages = [
            new() { Role = "user", Content = "What's the capital of France?" }
        ],
        MaxTokens = 50
    });
```

## TypeScript

Install the `@servicestack/client` npm package:

:::copy
npm install @servicestack/client
:::

Download AI Server's TypeScript DTOs:

:::copy
`npx get-dtos typescript https://openai.servicestack.net`
:::

Call API by sending `OpenAiChatCompletion` Request DTO with `JsonServiceClient`:

```ts
import { JsonServiceClient } from "@servicestack/client"
import { OpenAiChatCompletion } from "./dtos"

const client = new JsonServiceClient(baseUrl)
const result = await client.postToUrl("/v1/chat/completions",
    new OpenAiChatCompletion({
        model: "mixtral:8x22b",
        messages: [
            { role: "user", content: "What's the capital of France?" }
        ],
        max_tokens: 50
    })
)
```

## JavaScript

Save [servicestack-client.mjs](https://unpkg.com/@servicestack/client@2/dist/servicestack-client.mjs) to your project.

Define an Import Map referencing its saved location:

```html
<script type="importmap">
{
    "imports": {
        "@servicestack/client": "/js/servicestack-client.mjs"
    }
}
</script>
```

Download AI Server's ESM JavaScript DTOs:

:::copy
`npx get-dtos mjs https://openai.servicestack.net`
:::

Call API by sending `OpenAiChatCompletion` Request DTO with `JsonServiceClient`:

```js
import { JsonServiceClient } from "@servicestack/client"
import { OpenAiChatCompletion } from "./dtos.mjs"

const client = new JsonServiceClient(baseUrl)
const result = await client.postToUrl("/v1/chat/completions",
    new OpenAiChatCompletion({
        model: "mixtral:8x22b",
        messages: [
            { role: "user", content: "What's the capital of France?" }
        ],
        max_tokens: 50
    })
)
```

## Python

Install the `servicestack` PyPI package:

:::copy
pip install servicestack
:::

Download AI Server's Python DTOs:

:::copy
`npx get-dtos python https://openai.servicestack.net`
:::

Call API by sending `OpenAiChatCompletion` Request DTO with `JsonServiceClient`:

```py
from servicestack import JsonServiceClient
from my_app.dtos import *

client = JsonServiceClient(baseUrl)

result = client.post_url("/v1/chat/completions",
    OpenAiChatCompletion(
        model="mixtral:8x22b",
        messages=[
            OpenAiMessage(role="user", content="What's the capital of France?")
        ],
        max_tokens=50
    ))
```

## Dart

Include the `servicestack` package in your project's `pubspec.yaml`:

:::copy
servicestack: ^3.0.1
:::

Download AI Server's Dart DTOs:

:::copy
`npx get-dtos dart https://openai.servicestack.net`
:::

Call API by sending `OpenAiChatCompletion` Request DTO with `JsonServiceClient`:

```dart
import 'dart:io';
import 'dart:typed_data';
import 'package:servicestack/client.dart';

var client = JsonServiceClient(baseUrl);

var result = await client.postToUrl('/v1/chat/completions',
    OpenAiChatCompletion()
      ..model = 'mixtral:8x22b'
      ..max_tokens = 50
      ..messages = [
        OpenAiMessage()
          ..role = 'user'
          ..content = "What's the capital of France?"
      ]);
```

## PHP

Include the `servicestack/client` package in your project's `composer.json`:

:::copy
"servicestack/client": "^1.0"
:::

Download AI Server's PHP DTOs:

:::copy
`npx get-dtos php https://openai.servicestack.net`
:::

Call API by sending `OpenAiChatCompletion` Request DTO with `JsonServiceClient`:

```php
use ServiceStack\JsonServiceClient;
use dtos\OpenAiChatCompletion;
use dtos\OpenAiMessage;

$client = new JsonServiceClient(baseUrl);
$client->bearerToken = apiKey;

/** @var OpenAiChatCompletionResponse $response */
$result = $client->postUrl('/v1/chat/completions',
    body: new OpenAiChatCompletion(
        model: "mixtral:8x22b",
        messages: [
            new OpenAiMessage(
                role: "user",
                content: "What's the capital of France?"
            )
        ],
        max_tokens: 50
    ));
```

## Java

Include the `net.servicestack:client` package in your project's `build.gradle`:

:::copy
implementation 'net.servicestack:client:1.1.3'
:::

Download AI Server's Java DTOs:

:::copy
`npx get-dtos java https://openai.servicestack.net`
:::

Call API by sending `OpenAiChatCompletion` Request DTO with `JsonServiceClient`:

```java
import net.servicestack.client.*;
import java.util.Collections;

var client = new JsonServiceClient(baseUrl);

OpenAiChatResponse result = client.post("/v1/chat/completions",
    new OpenAiChatCompletion()
        .setModel("mixtral:8x22b")
        .setMaxTokens(50)
        .setMessages(Utils.createList(new OpenAiMessage()
            .setRole("user")
            .setContent("What's the capital of France?")
        )),
    OpenAiChatResponse.class);
```

## Kotlin

Include the `net.servicestack:client` package in your project's `build.gradle`:

:::copy
implementation 'net.servicestack:client:1.1.3'
:::

Download AI Server's Kotlin DTOs:

:::copy
`npx get-dtos kotlin https://openai.servicestack.net`
:::

Call API by sending `OpenAiChatCompletion` Request DTO with `JsonServiceClient`:

```kotlin
package myapp
import net.servicestack.client.*

val client = JsonServiceClient(baseUrl)

val result: OpenAiChatResponse = client.post("/v1/chat/completions",
    OpenAiChatCompletion().apply {
        model = "mixtral:8x22b"
        messages = arrayListOf(OpenAiMessage().apply {
            role = "user"
            content = "What's the capital of France?"
        })
        maxTokens = 50
    },
    OpenAiChatResponse::class.java)
```

## Swift

Include the `ServiceStack` package in your project's `Package.swift`:

```swift
dependencies: [
    .package(url: "https://github.com/ServiceStack/ServiceStack.Swift.git",
        Version(6,0,0)..<Version(7,0,0)),
],
```

Download AI Server's Swift DTOs:

:::copy
`npx get-dtos swift https://openai.servicestack.net`
:::

## F#

Install the `ServiceStack.Client` NuGet package:

:::copy
`<PackageReference Include="ServiceStack.Client" Version="8.*" />`
:::

Download AI Server's F# DTOs with [x dotnet tool](https://docs.servicestack.net/dotnet-tool):

:::copy
`x fsharp https://openai.servicestack.net`
:::

Call API by sending `OpenAiChatCompletion` Request DTO with `JsonApiClient`:

```fsharp
open ServiceStack
open ServiceStack.Text

let client = new JsonApiClient(baseUrl)

let result = client.Post<OpenAiChatResponse>("/v1/chat/completions",
    OpenAiChatCompletion(
        Model = "mixtral:8x22b",
        Messages = ResizeArray [
            OpenAiMessage(
                Role = "user",
                Content = "What's the capital of France?"
            )
        ],
        MaxTokens = 50))
```

## VB.NET

Install the `ServiceStack.Client` NuGet package:

:::copy
`<PackageReference Include="ServiceStack.Client" Version="8.*" />`
:::

Download AI Server's VB.NET DTOs with [x dotnet tool](https://docs.servicestack.net/dotnet-tool):

:::copy
`x vbnet https://openai.servicestack.net`
:::

Call API by sending `OpenAiChatCompletion` Request DTO with `JsonApiClient`:

```vb
Imports ServiceStack
Imports ServiceStack.Text

Dim client = New JsonApiClient(baseUrl)

Dim result = Await client.PostAsync(Of OpenAiChatResponse)(
    "/v1/chat/completions",
    New OpenAiChatCompletion() With {
        .Model = "mixtral:8x22b",
        .Messages = New List(Of OpenAiMessage) From {
            New OpenAiMessage With {
                .Role = "user",
                .Content = "What's the capital of France?"
            }
        },
        .MaxTokens = 50
    })
```

# Simple API Keys Credentials Provider for .NET 8 C# Microservices

Source: https://servicestack.net/posts/apikey-credentials-auth

The usability of the [Simple Auth with API Keys](https://docs.servicestack.net/auth/admin-apikeys) story has been significantly improved with the new `ApiKeyCredentialsProvider`, which enables .NET Microservices to provide persistent UserSession-like behavior using simple API Keys. It can be configured together with the `AuthSecretAuthProvider` and `ApiKeysFeature` to enable a Credentials Auth implementation which users can sign in with using their API Keys or Admin Auth Secret.
A typical configuration for .NET Microservices looking to enable Simple Auth access, whose APIs are protected by API Keys and whose Admin functionality is protected by an Admin Auth Secret, can be configured with:

```csharp
public class ConfigureAuth : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices(services => {
            services.AddPlugin(new AuthFeature([
                new ApiKeyCredentialsProvider(),
                new AuthSecretAuthProvider("MyAuthSecret"),
            ]));
            services.AddPlugin(new SessionFeature());
            services.AddPlugin(new ApiKeysFeature());
        })
        .ConfigureAppHost(appHost => {
            using var db = HostContext.AppHost.GetDbConnection();
            appHost.GetPlugin<ApiKeysFeature>().InitSchema(db);
        });
}
```

When registered, a Credentials Auth dialog will appear for [ServiceStack Built-in UIs](https://servicestack.net/auto-ui) allowing users to Sign In with their **API Keys** or Admin **Auth Secret**.

![](/img/posts/apikey-credentials-auth/ai-server-auth-apiexplorer.png)

### Session Auth with API Keys

Behind the scenes this creates a Server [Auth Session](https://docs.servicestack.net/auth/sessions), but instead of maintaining an Authenticated User Session it saves the API Key in the session, then attaches the API Key to each request. This makes it possible to make API Key validated requests with just a session cookie instead of requiring resubmission of API Keys for each request.

### AI Server

This is an ideal Auth Configuration for .NET Docker Appliances and Microservices like [AI Server](/posts/ai-server) that don't need the complexity of ASP.NET Core's Identity Auth machinery and just want to restrict access to their APIs with API Keys and restrict Admin functionality to Administrators with an Auth Secret.

The benefit of `ApiKeyCredentialsProvider` is that it maintains a persistent Session, so end users only need to enter their API Key a single time and they'll be able to navigate to all of AI Server's protected pages using the API Key maintained in their Server User Session without needing to re-enter it for each UI and every request.

### User Access with API Keys

AI Server uses **API Keys** to restrict access to its AI Features to **authorized Users** with valid API Keys, who are able to use its Built-in UIs for its AI Features with the User's preferred Name and issued API Key:

![](/img/posts/apikey-credentials-auth/ai-server-auth-user.png)

After signing in a single time they'll be able to navigate to any protected page and start using AI Server's AI features:

![](/img/posts/apikey-credentials-auth/ai-server-auth-user-chat.png)

### User Access to API Explorer

This also lets users use their existing Auth Session across completely different UIs like [API Explorer](https://docs.servicestack.net/api-explorer) where they'll have the same access to APIs as they would when calling APIs programmatically with their API Keys, e.g:

![](/img/posts/apikey-credentials-auth/ai-server-auth-apiexplorer-api.png)

### Coarse or fine-grained API Key access

By default **any** valid API Key can access services restricted with `[ValidateApiKey]`:

```csharp
[ValidateApiKey]
public class Hello : IGet, IReturn<HelloResponse>
{
    public required string Name { get; set; }
}
```

### API Key Scopes

API Keys can be given elevated privileges where only Keys with user-defined scopes:

![](https://docs.servicestack.net/img/pages/auth/simple/admin-ui-apikeys-edit.png)

are allowed to access APIs restricted with that scope:

```csharp
[ValidateApiKey("todo:read")]
public class QueryTodos : QueryDb<Todo>
{
    public long? Id { get; set; }
    public List<long>? Ids { get; set; }
    public string? TextContains { get; set; }
}
```

### Restrict API Keys to specific APIs

API Keys can also be locked down to only be allowed to call specific APIs:

![](https://docs.servicestack.net/img/pages/auth/simple/admin-ui-apikeys-restrict-to.png)

## Admin Access

AI Server also maintains an Admin UI and Admin APIs that are only accessible to **Admin** users who authenticate with the App's configured Admin Auth Secret, giving them access to AI Server's Admin UIs to monitor live AI Requests, create new User API Keys, manage registered AI Providers, etc.

![](/img/posts/apikey-credentials-auth/ai-server-auth-admin-jobs.png)

### Admin Restricted APIs

You can restrict APIs to Admin Users by using `[ValidateAuthSecret]`:

```csharp
[Tag(Tags.Admin)]
[ValidateAuthSecret]
[Api("Add an AI Provider to process AI Requests")]
public class CreateAiProvider : ICreateDb<AiProvider>, IReturn<IdResponse>
{
    //...
}
```

Which are identified in API Explorer with a **padlock** icon, whilst APIs restricted by API Key are identified with a **key** icon:

![](/img/posts/apikey-credentials-auth/ai-server-auth-apiexplorer-admin.png)

# Podcasts now in Razor SSG

Source: https://servicestack.net/posts/razor-ssg-podcasts

## Razor SSG now supports Podcasts!

[Razor SSG](https://razor-ssg.web-templates.io) is our FREE Project Template for creating fast, statically generated Websites and Blogs with Markdown & C# Razor Pages.

A benefit of using Razor SSG to maintain our [github.com/ServiceStack/servicestack.net](https://github.com/ServiceStack/servicestack.net) website is that any improvements added to **servicestack.net** end up being rolled into the Razor SSG Project Template for everyone else to enjoy.

The latest feature recently added is [ServiceStack Podcasts](https://servicestack.net/podcasts), providing an easy alternative to learning about new features in our [TL;DR Release Notes](https://docs.servicestack.net/releases/v8_04) during a commute, as well as a fun and more informative experience whilst reading [blog posts](https://servicestack.net/blog).

The same podcast feature has now been rolled into the Razor SSG template, allowing anyone to add the same feature to their Razor SSG Websites which can be developed and hosted for FREE on GitHub Pages CDN:

### Create a new Razor SSG Project
[Razor SSG](https://razor-ssg.web-templates.io)
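If you prefer the command line, you should also be able to create a new project with the `x` dotnet tool, assuming the same `x new` template convention used by the other project templates in these posts:

:::sh
x new razor-ssg MyApp
:::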
### Markdown Powered

The Podcast feature is very similar to the Markdown Blog Posts, where each podcast is a simple `.md` Markdown page separated by a publish date and its unique slug, e.g: **[/_podcasts](https://github.com/NetCoreTemplates/razor-ssg/tree/main/MyApp/_podcasts)**

```files
/_pages
/_podcasts
  config.json
  2024-10-02_razor-ssg-podcasts.md
  2024-09-19_scalable-sqlite.md
  2024-09-17_sqlite-request-logs.md
  ...
/_posts
/_videos
/_whatsnew
```

All editable content within different Podcast pages like the Podcast Sidebar is customizable within [_podcasts/config.json](https://github.com/NetCoreTemplates/razor-ssg/blob/main/MyApp/_podcasts/config.json).

[![](/img/posts/razor-ssg-podcasts/razor-ssg-podcast-layout.webp)](https://razor-ssg.web-templates.io/podcasts)

### Podcast Page

All content about a podcast is contained within its `.md` file and frontmatter, which just like Blog Posts can contain interactive Vue Components and custom [Markdown Containers](https://razor-press.web-templates.io/containers).

The [Backgrounds Jobs Podcast Page](https://razor-ssg.web-templates.io/podcasts/background-jobs) is a good example of this, where its [2024-09-12_background-jobs.md](https://github.com/NetCoreTemplates/razor-ssg/blob/main/MyApp/_podcasts/2024-09-12_background-jobs.md?plain=1) contains both an `AudioPlayer` Vue Component as well as `sh` and `youtube` custom markdown containers to render its page:

[![](/img/posts/razor-ssg-podcasts/razor-ssg-podcast-page.webp)](https://razor-ssg.web-templates.io/podcasts/background-jobs)

### Audio Player

Podcasts are played using the [AudioPlayer.mjs](https://github.com/NetCoreTemplates/razor-ssg/blob/main/MyApp/wwwroot/pages/podcasts/AudioPlayer.mjs) Vue Component that's enabled on each podcast page, which will appear at the bottom of the page when played:

[![](/img/posts/razor-ssg-podcasts/razor-ssg-podcast-audioplayer.webp)](https://razor-ssg.web-templates.io/podcasts)

The `AudioPlayer` component is also independently usable as a standard Vue Component in markdown content like [this .md page](https://github.com/NetCoreTemplates/razor-ssg/blob/main/MyApp/_posts/2024-10-02_razor-ssg-podcasts.md?plain=1#L72).

It can also be embedded inside Razor `.cshtml` pages using [Declarative Vue Components](https://servicestack.net/posts/net8-best-blazor#declarative-vue-components), e.g:

```html
@{
    var episode = Podcasts.GetEpisodes().FirstOrDefault(x => x.Slug == doc.Slug);
}
```

### Dark Mode

As Razor SSG is built with Tailwind CSS, Dark Mode is also easily supported:

[![](/img/posts/razor-ssg-podcasts/razor-ssg-podcast-dark.webp)](https://razor-ssg.web-templates.io/podcasts/background-jobs)

### Browse by Tags

Just like [blog post archives](https://razor-ssg.web-templates.io/posts/), the frontmatter collection of `tags` is used to generate related podcast pages, aiding discoverability by grouping related podcasts by **tag** at the following route:

`/podcasts/tagged/{tag}`

https://razor-ssg.web-templates.io/podcasts/tagged/release

[![](/img/posts/razor-ssg-podcasts/razor-ssg-podcast-tag.webp)](https://razor-ssg.web-templates.io/podcasts/tagged/release)

### Browse by Year

Likewise podcast archives are also browsable by the year they're published at the route:

`/podcasts/year/{year}`

https://razor-ssg.web-templates.io/podcasts/year/2024

[![](/img/posts/razor-ssg-podcasts/razor-ssg-podcast-year.webp)](https://razor-ssg.web-templates.io/podcasts/year/2024)

### iTunes-compatible Podcast RSS Feed

The information in [config.json](https://github.com/NetCoreTemplates/razor-ssg/blob/main/MyApp/_podcasts/config.json) is also used in the generated podcast RSS feed at: [/podcasts/feed.xml](https://razor-ssg.web-templates.io/podcasts/feed.xml)

Which is a popular format podcast Applications can use to get notified when new Podcast episodes are available. The RSS Feed is also compatible with [podcasters.apple.com](https://podcasters.apple.com) and can be used to publish your podcast to [Apple Podcasts](https://podcasts.apple.com).

```xml
<rss version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd">
<channel>
  <title>Their Side</title>
  <link>https://razor-ssg.web-templates.io/podcasts</link>
  <image>
    <url>https://razor-ssg.web-templates.io/img/posts/cover.png</url>
    <title>Their Side</title>
    <link>/podcasts</link>
  </image>
  <generator>razor-ssg</generator>
  <description>Razor SSG</description>
  <lastBuildDate>Wed, 02 Oct 2024 03:54:03 GMT</lastBuildDate>
  <managingEditor>email@example.org (Razor SSG)</managingEditor>
  <webMaster>email@example.org (Razor SSG)</webMaster>
  <itunes:author>Razor SSG</itunes:author>
  <itunes:owner>
    <itunes:name>Razor SSG</itunes:name>
    <itunes:email>email@example.org</itunes:email>
  </itunes:owner>
  ...
</channel>
</rss>
```

# Scalable Server SQLite Apps

Source: https://servicestack.net/posts/scalable-sqlite

Ever since adding [support for Litestream](https://docs.servicestack.net/ormlite/litestream) to our project templates' [GitHub Action Deployments](https://servicestack.net/posts/kubernetes_not_required) we've been using SQLite as the backend for our new .NET C# Apps, as it's the [most cost-effective option](https://docs.servicestack.net/ormlite/litestream#savings-at-scale) that frees us from needing to use a cloud managed database, which lets us make use of Hetzner's much cheaper [US Cloud VMs](https://www.hetzner.com/cloud/).

We're also seeing increased usage of SQLite Server Apps, with [Bluesky Social](https://github.com/bluesky-social/atproto/pull/1705) having moved to SQLite, all of 37signals' new [Once](https://once.com) Web Apps [using SQLite](https://world.hey.com/dhh/multi-tenancy-is-what-s-hard-about-scaling-web-services-dd1e0e81), Tailscale having migrated their [primary database to SQLite](https://tailscale.com/blog/database-for-2022) (whose ex-Google founders have [been using it since 2018](https://www.youtube.com/watch?v=RqubKSF3wig)), and Cloud Providers building distributed databases on top of SQLite like [Cloudflare D1](https://blog.cloudflare.com/introducing-d1/) and Fly.io's multi-region distributed [LiteFS](https://fly.io/docs/litefs/) solution.
SQLite is a highly performant DB that can handle a large number of concurrent read operations, with [35% faster](https://www.sqlite.org/fasterthanfs.html) filesystem performance for write operations with next to no latency. It's often faster than other RDBMS's courtesy of its proximity to the running application, which gives it unique advantages over traditional client/server RDBMS's: it's not susceptible to the [N+1 Queries problem](https://www.sqlite.org/np1queryprob.html) and it's also able to execute your custom C# Logic inside SQL Queries using [Application SQL Functions](https://www.sqlite.org/appfunc.html).

With [litestream.io](https://litestream.io) taking care of real-time replication to managed storage, we just need to work around SQLite's single concurrent writer to unlock the value, performance and unique features of SQLite in our Apps, which we cover in this release with integrated support for Database Locks and Sync Commands.

## Single Concurrent Writer

The primary limitation of SQLite is that it only supports a single concurrent writer, which means if you have multiple requests writing to the same database at the same time, they will need to coordinate access. As long as we can overcome this limitation, SQLite can be an excellent choice to power many Web Apps.

In the previous ServiceStack v8.3 release we [worked around this limitation](https://docs.servicestack.net/commands#use-case-sqlite-writes) by using [MQ Command DTOs](https://docs.servicestack.net/commands#mq-command-dtos) to route all DB Writes to be executed by a single Background MQ Thread. This works great for [messaging-based architectures](https://docs.servicestack.net/commands#messaging-workflow) where you can queue commands to be processed serially, but the overhead of using commands for all DB writes can be cumbersome when needing to perform sporadic writes within complex logic.

## Multiple SQLite Databases

Firstly, a great way to reduce contention is to use separate SQLite databases for different subsystems of your Application; that way load is distributed across multiple DBs and writes across each SQLite database can be executed concurrently. This is especially important for write-heavy operations like [SQLite Request Logging](/posts/sqlite-request-logs), or if your App stores every interaction of your App for A/B testing: storing them in separate `analytics.db` databases will remove any contention from your primary database.

The other techniques below demonstrate concurrent-safe techniques for accessing a SQLite DB:

### Always use Synchronous APIs for SQLite

Generally it's recommended to use non-blocking Async APIs for any I/O Operations, however as SQLite doesn't make Network I/O requests and its native implementation is blocking, its Async DB APIs are just pseudo-async wrappers around SQLite's blocking APIs which just adds unnecessary overhead. For this reason we recommend **always** using synchronous APIs for SQLite, especially as it's not possible to await inside a lock:

```csharp
lock (Locks.AppDb)
{
    //Can't await inside a lock
    //await Db.UpdateAsync(row);
    Db.Update(row);
}
```

It's also safe to assume SQLite will always block, since all [Asynchronous I/O efforts](https://www.sqlite.org/asyncvfs.html) were abandoned in favor of [WAL mode](https://www.sqlite.org/wal.html) which mitigates the cost of blocking **fsync()**.
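Since WAL mode is what makes SQLite's blocking writes cheap, it's worth verifying your databases are actually running with it. A minimal sketch of checking and enabling WAL using OrmLite's raw SQL APIs - the `app.db` connection string is just an example:

```csharp
using ServiceStack.OrmLite;

var dbFactory = new OrmLiteConnectionFactory(
    "DataSource=App_Data/app.db;Cache=Shared", SqliteDialect.Provider);

using var db = dbFactory.Open();

// Report the current journal mode, e.g. "delete" or "wal"
var journalMode = db.Scalar<string>("PRAGMA journal_mode");

// Enable Write-Ahead Logging if it's not already on
if (journalMode != "wal")
    db.ExecuteSql("PRAGMA journal_mode=WAL;");
```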
## Database Locks

The new `Locks` class maintains an object lock for each registered database connection that can be used to synchronize **write access** for different SQLite databases, e.g:

```csharp
var row = db.SingleById<Table>(request.Id);
row.PopulateWithNonDefaultValues(request);
lock (Locks.AppDb)
{
    Db.Update(row);
}
```

`Locks.AppDb` can be used to synchronize db writes for the App's primary database, e.g. `App_Data/app.db`. Whilst `Locks.GetDbLock(namedConnection)` can be used to get the DB Write Lock for any other [registered SQLite Database](https://docs.servicestack.net/ormlite/multi-database-app) by using the same named connection the SQLite Database Connection was registered against, e.g:

```csharp
var dbFactory = new OrmLiteConnectionFactory(connStr, SqliteDialect.Provider);
dbFactory.RegisterConnection(Databases.Search,
    "DataSource=App_Data/search.db;Cache=Shared", SqliteDialect.Provider);
dbFactory.RegisterConnection(Databases.Analytics,
    "DataSource=App_Data/analytics.db;Cache=Shared", SqliteDialect.Provider);

//...
using var dbSearch = dbFactory.Open(Databases.Search);
lock (Locks.GetDbLock(Databases.Search))
{
    dbSearch.Insert(row);
}

using var dbAnalytics = dbFactory.Open(Databases.Analytics);
lock (Locks.GetDbLock(Databases.Analytics))
{
    dbAnalytics.Insert(row);
}
```

## Queuing DB Writes with SyncCommand

`Locks` are a great option for synchronizing DB Writes that need to be executed within complex logic blocks, however locks can cause contention in highly concurrent Apps. One way to remove contention is to serially execute DB Writes instead, which we can do by executing DB Writes within `SyncCommand*` commands and using a named `[Worker(Workers.AppDb)]` attribute for Writes to the primary database, e.g:

```csharp
[Worker(Workers.AppDb)]
public class DeleteCreativeCommand(IDbConnection db) : SyncCommand<DeleteCreative>
{
    protected override void Run(DeleteCreative request)
    {
        var artifactIds = request.ArtifactIds;
        db.Delete<AlbumArtifact>(x => artifactIds.Contains(x.ArtifactId));
        db.Delete<ArtifactLike>(x => artifactIds.Contains(x.ArtifactId));
        db.Delete<ArtifactReport>(x => artifactIds.Contains(x.ArtifactId));
        db.Delete<Artifact>(x => x.CreativeId == request.Id);
        db.Delete<CreativeArtist>(x => x.CreativeId == request.Id);
        db.Delete<CreativeModifier>(x => x.CreativeId == request.Id);
        db.Delete<Creative>(x => x.Id == request.Id);
    }
}
```

Other databases should use their named connection for their named worker, e.g:

```csharp
[Worker(Databases.Search)]
public class DeleteSearchCommand(IDbConnectionFactory dbFactory) 
    : SyncCommand<DeleteSearch>
{
    protected override void Run(DeleteSearch request)
    {
        using var db = dbFactory.Open(Databases.Search);
        db.DeleteById<ArtifactFts>(request.Id);
        //...
    }
}
```

Where it will be executed within its Database Lock.

## Executing Commands

Now every time the commands are executed they will be added to a `ConcurrentQueue` where they'll be serially executed by the worker's Background Task:

```csharp
public class MyServices(IBackgroundJobs jobs) : Service
{
    public void Any(DeleteCreative request)
    {
        // Queues a durable job to execute the command with the named worker
        var jobRef = jobs.EnqueueCommand<DeleteCreativeCommand>(request);
        // Returns immediately with a reference to the Background Job
    }

    public async Task Any(DeleteSearch request)
    {
        // Executes a transient (i.e. non-durable) job with the named worker
        var result = await jobs.RunCommandAsync<DeleteSearchCommand>(request);
        // Returns after the command is executed with its result (if any)
    }
}
```

When using any `SyncCommand*` base class, its execution still uses database locks, but any contention is alleviated as they're executed serially by a single worker thread.
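Note the `Workers` and `Databases` names used in these examples aren't framework types - they're your App's own constants for naming workers and registered connections. A minimal sketch of what they might look like:

```csharp
// App-defined names for named workers & registered named connections
public static class Workers
{
    // Named worker that serializes writes to the primary database
    public const string AppDb = "app.db";
}

public static class Databases
{
    // Named connections registered with dbFactory.RegisterConnection()
    public const string Search = nameof(Search);
    public const string Analytics = nameof(Analytics);
}
```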
### AutoQuery CRUD Database Write Locks

To avoid SQLite concurrency write exceptions, all DB Writes should be executed within its database lock or a named worker - including the auto-generated [AutoQuery CRUD](https://docs.servicestack.net/autoquery/crud) APIs, which will implicitly use Database Locks if the **primary database is SQLite**.

AutoQuery CRUD can also be explicitly configured to always be executed within Database Locks with:

```csharp
services.AddPlugin(new AutoQueryFeature {
    UseDatabaseWriteLocks = true
});
```

## SQLite Web Apps

That's about it. By using any of the above techniques to guard against concurrent writes, you can take advantage of the [simplicity, value and performance benefits](https://docs.servicestack.net/ormlite/litestream#the-right-time-for-server-side-sqlite) of SQLite in your Apps and utilize a solution like [litestream.io](https://litestream.io) for real-time replication of your SQLite databases to highly reliable managed storage.

SQLite's [Checklist For Choosing The Right Database Engine](https://www.sqlite.org/whentouse.html#checklist_for_choosing_the_right_database_engine) covers the few situations when a traditional Client/Server RDBMS will be more appropriate. The primary use-case would be when your App needs to be distributed across multiple App Servers, as using SQLite essentially forces you into scaling up - which gets more appealing every year with hardware getting cheaper and faster, and cheap hosting providers like [hetzner.com](https://www.hetzner.com) where you can get bare metal 48 Core/96 vCore EPYC Servers with fast NVMe SSDs for **€236** per month:

[![](/img/posts/scalable-sqlite/hetzner-epyc-48.webp)](https://www.hetzner.com/dedicated-rootserver/)

Which is a fraction of what it would cost for comparable performance using cloud managed databases:

[![](/img/posts/scalable-sqlite/azure-sql-database.webp)](https://azure.microsoft.com/en-us/pricing/details/azure-sql-database/single/)

In the rare cases where you need to scale beyond a single server, you'll initially be able to scale out your different App databases onto different servers. Beyond that, if your App permits, you may be able to adopt a multi-tenant architecture like [Bluesky Social](https://bsky.social/about) with each tenant having their own SQLite database to effectively enable infinite scaling.

For further info on using high performance SQLite in production web apps, check out [@aarondfrancis](https://x.com/aarondfrancis)'s comprehensive website and course at [highperformancesqlite.com](https://highperformancesqlite.com) - which contains a lot of great content accessible for free.
## Example SQLite Apps

Our confidence in SQLite being the best choice for many web applications has led us to adopt it to power our latest web applications, which are all [deployed to a shared Hetzner VM](/posts/kubernetes_not_required) whose [inexpensive hosting costs](/posts/jamstacks_hosting) allow us to host and make them **available for free!**

All projects are open-source and employ the different techniques detailed above, and should serve as a great resource for how they're used in real-world Web Applications:

### Blazor Diffusion

Generate images for free using custom [Civit AI](https://civitai.com) and [FLUX-schnell](https://huggingface.co/black-forest-labs/FLUX.1-schnell) models:

[![](/img/posts/scalable-sqlite/blazordiffusion.webp)](https://blazordiffusion.com)

- Website: [blazordiffusion.com](https://blazordiffusion.com)
- GitHub: [github.com/NetCoreApps/BlazorDiffusionVue](https://github.com/NetCoreApps/BlazorDiffusionVue/)

### pvq.app

An OSS alternative to StackOverflow which uses the best proprietary and OSS Large Language Models to answer your technical questions. [pvq.app](https://pvq.app) is populated with over **1M+ answers** for the highest rated StackOverflow questions - check out [pvq.app/leaderboard](https://pvq.app/leaderboard) to find the best performing LLM models (results are surprising!)

[![](/img/posts/scalable-sqlite/pvq.webp)](https://pvq.app)

- Website: [pvq.app](https://pvq.app)
- GitHub: [github.com/ServiceStack/pvq.app](https://github.com/ServiceStack/pvq.app)

### AI Server

The independent Microservice used to provide all AI Features used by the above applications. It's already been used to execute millions of LLM and Comfy UI Requests to generate the Open AI Chat Answers and Generated Images used to populate the [blazordiffusion.com](https://blazordiffusion.com) and [pvq.app](https://pvq.app) websites.

It was the project used to develop and test [Background Jobs](/posts/background-jobs) in action, where it serves as a private gateway to process all LLM, AI and image transformation requests that any of our Apps need, dynamically delegating requests across multiple Ollama, Open AI Chat, LLM Gateway, Comfy UI, Whisper and ffmpeg providers.

[![](/img/posts/scalable-sqlite/ai-server.webp)](https://openai.servicestack.net)

[![](/img/posts/scalable-sqlite/ai-server-chat.webp)](https://openai.servicestack.net)

- Website: [openai.servicestack.net](https://openai.servicestack.net)
- GitHub: [github.com/ServiceStack/ai-server](https://github.com/ServiceStack/ai-server)

In addition to maintaining a history of AI Requests, it also provides file storage for its CDN-hostable AI generated assets and on-the-fly, cacheable image transformations.

### Private AI Gateway

We're developing AI Server as a **Free OSS Product** that runs as a single Docker Container Microservice, where Admins can use its built-in UIs to add multiple Ollama instances and Open AI Gateways to execute LLM requests, and Client Docker agents installed with Comfy UI, ffmpeg and Whisper to handle all other non-LLM Requests.

#### Multiple Ollama, Open AI Gateway and Comfy UI Agents

The AI Server Docker container itself won't require any infrastructure dependencies or specific hardware requirements, however any Ollama endpoints or Docker Comfy UI Agents added will need to run on GPU-equipped servers.
#### Native end-to-end Typed Integrations to most popular languages

ServiceStack's [Add ServiceStack Reference](https://docs.servicestack.net/add-servicestack-reference) feature is used to provide native typed integrations to C#, TypeScript, JavaScript, Python, PHP, Swift, Java, Kotlin, Dart, F# and VB.NET projects which organizations can drop into their heterogeneous environments to manage their private AI Services used across their different Apps.

#### Protected Access with API Keys

AI Server utilizes [Simple Auth with API Keys](https://docs.servicestack.net/auth/admin-apikeys) letting Admins create and distribute API Keys to only allow authorized clients to access their AI Server's APIs, which can be optionally further restricted to only [allow access to specific APIs](https://docs.servicestack.net/auth/apikeys#creating-user-api-keys).

### AI Server V1

[AI Server V1](/posts/ai-server) is now released! The initial V1 release comes packed with features, including:

#### Large Language Models
- Open AI Chat
- Support for Ollama endpoints
- Support for Open Router, Open AI, Mistral AI, Google and Groq API Gateways

#### Comfy UI Agent / Replicate / DALL-E 3
- Text to Image

#### Comfy UI Agent
- Image to Image
- Image Upscaling
- Image with Mask
- Image to Text
- Text to Audio
- Text to Speech
- Speech to Text

#### ffmpeg
- image/video/audio format conversions
- image/video scaling
- image/video cropping
- image/video watermarking
- video trimming

#### Managed File Storage
- Blob Storage - isolated and restricted by API Key

### AI Server Feedback

Feel free to reach us at [ai-server/discussions](https://github.com/ServiceStack/ai-server/discussions) with any AI Server questions.

# SQLite C# Request Logs

Source: https://servicestack.net/posts/sqlite-request-logs

Up until this release, all of ServiceStack's database features like [AutoQuery](https://servicestack.net/autoquery) have been database agnostic, courtesy of OrmLite's [support for popular RDBMS's](https://docs.servicestack.net/ormlite/installation), so that they integrate into an App's existing configured database.

[Background Jobs](/posts/background-jobs) is our first foray into a SQLite-only backend, as it's the only RDBMS that enables us to provide encapsulated black-box functionality without requiring any infrastructure dependencies. Its low latency, high performance and ability to create lightweight databases on the fly make it ideal for self-managing isolated appliance backends like Background Jobs and Request Logging, which don't benefit from integrating with your existing RDBMS.

The new [ServiceStack.Jobs](https://nuget.org/packages/ServiceStack.Jobs) NuGet package allows us to deliver plug and play SQLite-backed features into .NET 8 C# Apps that are configured with any RDBMS or without one.
The next feature added is a SQLite-backed provider for [Request Logs](https://docs.servicestack.net/request-logger) with the new `SqliteRequestLogger`, which can be added to existing .NET 8 Apps with the [mix tool](https://docs.servicestack.net/mix-tool):

:::sh
x mix sqlitelogs
:::

Which adds a reference to **ServiceStack.Jobs** and the [Modular Startup](https://docs.servicestack.net/modular-startup) config below:

```csharp
using ServiceStack.Jobs;
using ServiceStack.Web;

[assembly: HostingStartup(typeof(MyApp.ConfigureRequestLogs))]

namespace MyApp;

public class ConfigureRequestLogs : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices((context, services) => {
            services.AddPlugin(new RequestLogsFeature {
                RequestLogger = new SqliteRequestLogger(),
                EnableResponseTracking = true,
                EnableRequestBodyTracking = true,
                EnableErrorTracking = true
            });
            services.AddHostedService<RequestLogsHostedService>();
        });
}

public class RequestLogsHostedService(
    ILogger<RequestLogsHostedService> log, IRequestLogger requestLogger) : BackgroundService
{
    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        var dbRequestLogger = (SqliteRequestLogger)requestLogger;
        using var timer = new PeriodicTimer(TimeSpan.FromSeconds(3));
        while (!stoppingToken.IsCancellationRequested &&
               await timer.WaitForNextTickAsync(stoppingToken))
        {
            dbRequestLogger.Tick(log);
        }
    }
}
```

This uses a Hosted Background Service to flush Request Logs to the requests SQLite database every **3** seconds (configurable in the `PeriodicTimer`).

If your App already has `RequestLogsFeature` configured (e.g. with Profiling) you'll want to remove it:

```csharp
public class ConfigureProfiling : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices((context, services) => {
            // services.AddPlugin(new RequestLogsFeature());
            services.AddPlugin(new ProfilingFeature
            {
                IncludeStackTrace = true,
            });
        });
}
```

## Rolling Monthly SQLite Databases

The benefit of using SQLite is that databases can be created on-the-fly, where Requests will be persisted into isolated **requests** Monthly databases which can be easily archived into managed file storage instead of a singular growing database, visible in the [Database Admin UI](https://docs.servicestack.net/admin-ui-database):

![](/img/posts/sqlite-request-logs/sqlite-databases.webp)

SQLite logs will also make it easier to generate monthly aggregate reports to provide key insights into the usage of your App.

## AutoQuery Grid Admin Logging UI

As SQLite Request Logs also make it efficiently possible to sort and filter through logs, the Logging UI will switch to using a fully queryable `AutoQueryGrid` when using `SqliteRequestLogger`:

![](/img/posts/sqlite-request-logs/sqlite-request-logs.webp)

# Simple C# Background Jobs & Recurring Tasks for .NET 8

Source: https://servicestack.net/posts/background-jobs

We're excited to announce **Background Jobs**, our effortless solution for queueing and managing background jobs and scheduled tasks in any .NET 8 C# App, implemented in true ServiceStack fashion where it seamlessly integrates into existing Apps, calls existing APIs and sports a [built-in](/auto-ui) Management UI to provide real-time monitoring, inspection and management of background jobs.
:::youtube 2Cza_a_rrjA
Durable C# Background Jobs and Scheduled Tasks for .NET
:::

### Durable and Infrastructure-Free

Prior to Background Jobs we've been using [Background MQ](https://docs.servicestack.net/background-mq) for executing our background tasks, which lets you queue any Request DTO to execute its API in a background worker. It's been our preferred choice as it didn't require any infrastructure dependencies, since its concurrent queues are maintained in memory; this also meant they were non-durable and didn't survive across App restarts. Whilst [ServiceStack MQ](https://docs.servicestack.net/messaging) enables an additional endpoint for your APIs, our main use-case for it was executing background tasks, which would be better suited by purpose-specific software designed for the task.

#### SQLite Persistence

It uses SQLite as the backing store for its durability, since its low latency, [fast disk persistence](https://www.sqlite.org/fasterthanfs.html) and embeddable file-based database make it ideally suited for the task, allowing creation of naturally partition-able and archivable monthly databases on-the-fly without any maintenance overhead or infrastructure dependencies - making it easy to add to any .NET App without impacting or adding increased load to their existing configured databases.

### Queue APIs or Commands

For even greater reuse you're able to queue your existing ServiceStack APIs as a Background Job, in addition to [Commands](https://docs.servicestack.net/commands) added in the [last v8.3 release](https://docs.servicestack.net/releases/v8_03) for encapsulating units of logic into internal invokable, inspectable and auto-retryable building blocks.

### Real Time Admin UI

The Background Jobs Admin UI provides a real time view into the status of all background jobs including their progress, completion times, Executed, Failed and Cancelled Jobs, etc. which is useful for monitoring and debugging purposes.

![](/img/posts/background-jobs/jobs-dashboard.webp)

View Real-time progress of queued Jobs:

![](/img/posts/background-jobs/jobs-queue.webp)

View real-time progress logs of executing Jobs:

![](/img/posts/background-jobs/jobs-logs.webp)

View Job Summary and Monthly Databases of Completed and Failed Jobs:

![](/img/posts/background-jobs/jobs-completed.webp)

View full state and execution history of each Job:

![](/img/posts/background-jobs/jobs-failed.webp)

Cancel Running jobs and Requeue failed jobs.

### Feature Overview

Even in its v1 release it packs all the features we wanted in a Background Jobs solution:

- No infrastructure dependencies
- Monthly archivable rolling Databases with full Job Execution History
- Execute existing APIs or versatile Commands
- Commands auto registered in IOC
- Scheduled Recurring Tasks - Track Last Job Run
- Serially execute jobs with the same named Worker
- Queue Jobs dependent on successful completion of parent Job
- Queue Jobs to be executed after a specified Date
- Execute Jobs within the context of an Authenticated User
- Auto retry failed jobs on a default or per-job limit
- Timeout Jobs on a default or per-job limit
- Cancellable Jobs
- Requeue Failed Jobs
- Execute custom callbacks on successful execution of Job
- Maintain Status, Logs and Progress of Executing Jobs
- Execute transient (i.e. non-durable) jobs using named workers
- Attach optional `Tag`, `BatchId`, `CreatedBy`, `ReplyTo` and `Args` with Jobs

Please [let us know](https://servicestack.net/ideas) of any other missing features you'd love to see implemented.
## Install

As it's more versatile and better suited, we've replaced the usage of Background MQ with **ServiceStack.Jobs** in all **.NET 8 Identity Auth Templates** for sending Identity Auth Confirmation Emails when SMTP is enabled. So the easiest way to get started with ServiceStack.Jobs is to [create a new Identity Auth Project](https://servicestack.net/start), e.g:

:::sh
x new blazor-vue MyApp
:::

### Existing .NET 8 Templates

Existing .NET 8 Projects can configure their app to use **ServiceStack.Jobs** by mixing in:

:::sh
x mix jobs
:::

Which adds the `Configure.BackgroundJobs.cs` [Modular Startup](https://docs.servicestack.net/modular-startup) configuration and a **ServiceStack.Jobs** NuGet package reference to your project.

## Usage

Any API, Controller or Minimal API can execute jobs with the `IBackgroundJobs` dependency, e.g. here's how you can run a background job to send a new email when an API is called in any new Identity Auth template:

```csharp
class MyService(IBackgroundJobs jobs) : Service
{
    public object Any(MyOrder request)
    {
        var jobRef = jobs.EnqueueCommand<SendEmailCommand>(new SendEmail {
            To = "my@email.com",
            Subject = $"Received New Order {request.Id}",
            BodyText = $"""
                       Order Details:
                       {request.OrderDetails.DumpTable()}
                       """,
        });
        //...
    }
}
```

This records the job and immediately executes it on a background worker, invoking `SendEmailCommand` with the specified `SendEmail` Request argument. It also returns a reference to the Job which can be used later to query and track its execution.

Alternatively a `SendEmail` API could be executed with just the Request DTO:

```csharp
var jobRef = jobs.EnqueueApi(new SendEmail {
    To = "my@email.com",
    Subject = $"Received New Order {request.Id}",
    BodyText = $"""
               Order Details:
               {request.OrderDetails.DumpTable()}
               """,
});
```

Although Sending Emails is typically not an API you want to make externally available, so you would want to either [Restrict access](https://docs.servicestack.net/auth/restricting-services) or [limit usage to specified users](https://docs.servicestack.net/auth/identity-auth#declarative-validation-attributes).

In both cases the `SendEmail` Request is persisted into the Jobs SQLite database for durability, where it gets updated as it progresses through the queue.

For execution, the API or command is resolved from the IOC before being invoked with the Request. APIs are executed via the [MQ Request Pipeline](https://docs.servicestack.net/order-of-operations) and commands executed using the [Commands Feature](https://docs.servicestack.net/commands) where they'll also be visible in the [Commands Admin UI](https://docs.servicestack.net/commands#command-admin-ui).
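For reference, a minimal sketch of what the `SendEmailCommand` being queued above could look like - the `SendEmail` DTO shape and logging body are assumptions, but the `SyncCommand<T>` base class and `[Worker]` attribute match the examples later in this post:

```csharp
public class SendEmail
{
    public string To { get; set; }
    public string Subject { get; set; }
    public string BodyText { get; set; }
}

// [Worker("smtp")] ensures all queued emails are executed serially
// by the same named "smtp" background worker
[Worker("smtp")]
public class SendEmailCommand(ILogger<SendEmailCommand> log) : SyncCommand<SendEmail>
{
    protected override void Run(SendEmail request)
    {
        // Replace with your App's configured SMTP client
        log.LogInformation("Sending email to {To}: {Subject}",
            request.To, request.Subject);
    }
}
```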
### Background Job Options

The behavior for each `Enqueue*` method for executing background jobs can be customized with the following options:

- `Worker` - Serially process job using a named worker thread
- `Callback` - Invoke another command with the result of a successful job
- `DependsOn` - Execute jobs after successful completion of a dependent job
  - If the parent job fails, all its dependent jobs are cancelled
- `UserId` - Execute within an Authenticated User Context
- `RunAfter` - Queue jobs that are only run after a specified date
- `RetryLimit` - Override default retry limit for how many attempts should be made to execute a job
- `TimeoutSecs` - Override default timeout for how long a job should run before being cancelled
- `RefId` - Allow clients to specify a unique Id (e.g Guid) to track job
- `Tag` - Group related jobs under a user specified tag
- `CreatedBy` - Optional field for capturing the owner of a job
- `BatchId` - Group multiple jobs with the same Id
- `ReplyTo` - Optional field for capturing where to send notification for completion of a Job
- `Args` - Optional String Dictionary of Arguments that can be attached to a Job

## Schedule Recurring Tasks

In addition to queueing jobs to run in the background, it also supports scheduling recurring tasks to execute APIs or Commands at fixed intervals.

:::youtube DtB8KaXXMCM
Schedule your Recurring Tasks with Background Jobs!
:::

APIs and Commands can be scheduled to run at either a `TimeSpan` or [CRON Expression](https://github.com/HangfireIO/Cronos?tab=readme-ov-file#cron-format) interval, e.g:

### CRON Expression Examples

```csharp
// Every Minute Expression
jobs.RecurringCommand<CheckUrlsCommand>(Schedule.Cron("* * * * *"));

// Every Minute Constant
jobs.RecurringCommand<CheckUrlsCommand>(Schedule.EveryMinute, new CheckUrls {
    Urls = urls
});
```

### CRON Format

You can use any **unix-cron format** expression supported by the [HangfireIO/Cronos](https://github.com/HangfireIO/Cronos) library:

```txt
|------------------------------- Minute (0-59)
|     |------------------------- Hour (0-23)
|     |     |------------------- Day of the month (1-31)
|     |     |     |------------- Month (1-12; or JAN to DEC)
|     |     |     |     |------- Day of the week (0-6; or SUN to SAT)
|     |     |     |     |
*     *     *     *     *
```

The allowed formats for each field include:

| Field            | Format of valid values               |
|------------------|--------------------------------------|
| Minute           | 0-59                                 |
| Hour             | 0-23                                 |
| Day of the month | 1-31                                 |
| Month            | 1-12 (or JAN to DEC)                 |
| Day of the week  | 0-6 (or SUN to SAT; or 7 for Sunday) |

#### Matching all values

To match all values for a field, use the asterisk: `*`, e.g here are two examples in which the minute field is left unrestricted:

- `* 0 1 1 1` - the job runs every minute of the midnight hour on January 1st and Mondays.
- `* * * * *` - the job runs every minute (of every hour, of every day of the month, of every month, every day of the week, because each of these fields is unrestricted too).

#### Matching a range

To match a range of values, specify your start and stop values, separated by a hyphen (-). Do not include spaces in the range. Ranges are inclusive. The first value must be less than the second.

The following equivalent examples run at midnight on Mondays, Tuesdays, Wednesdays, Thursdays, and Fridays (for all months):

- `0 0 * * 1-5`
- `0 0 * * MON-FRI`

#### Matching a list

Lists can contain any valid value for the field, including ranges. Specify your values, separated by a comma (,). Do not include spaces in the list, e.g:

- `0 0,12 * * *` - the job runs at midnight and noon.
- `0-5,30-35 * * * *` - the job runs in each of the first five minutes of every half hour (at the top of the hour and at half past the hour).

### TimeSpan Interval Examples

```csharp
jobs.RecurringCommand<CheckUrlsCommand>(
    Schedule.Interval(TimeSpan.FromMinutes(1)));

// With Request DTO Example
jobs.RecurringApi(Schedule.Interval(TimeSpan.FromMinutes(1)),
    new CheckUrls { Urls = urls });
```

That can be registered with an optional **Task Name** and **Background Options**, e.g:

```csharp
jobs.RecurringCommand<CheckUrlsCommand>("Check URLs", Schedule.EveryMinute,
    new() {
        RunCommand = true // don't persist job
    });
```

:::info
If no name is provided, the Command's Name or API's Request DTO name will be used
:::

### Idempotent Registration

Scheduled Tasks are idempotent, where the same registration with the same name will either create or update the scheduled task registration without losing track of the last time the Recurring Task was run. As such, it's recommended to always define your App's Scheduled Tasks on Startup:

```csharp
public class ConfigureBackgroundJobs : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices((context, services) => {
            services.AddPlugin(new CommandsFeature());
            services.AddPlugin(new BackgroundsJobFeature());
            services.AddHostedService<JobsHostedService>();
        }).ConfigureAppHost(afterAppHostInit: appHost => {
            var services = appHost.GetApplicationServices();
            var jobs = services.GetRequiredService<IBackgroundJobs>();

            // App's Scheduled Tasks Registrations:
            jobs.RecurringCommand<MyCommand>(Schedule.Hourly);
        });
}
```

### Background Jobs Admin UI

The last job the Recurring Task ran is also viewable in the Jobs Admin UI:

![](/img/posts/background-jobs/jobs-scheduled-tasks-last-job.webp)

### Executing non-durable jobs

`IBackgroundJobs` also supports `RunCommand*` methods for executing background jobs transiently (i.e. non-durable), which is useful for commands that want to be serially executed by a named worker but don't need to be persisted.

#### Execute in Background and return immediately

You could use this to queue system emails to be sent by the same **smtp** worker when you're happy to not have their state and execution history tracked in the Jobs database:

```csharp
var job = jobs.RunCommand<SendEmailCommand>(new SendEmail { ... },
    new() {
        Worker = "smtp"
    });
```

In this case `RunCommand` returns the actual `BackgroundJob` instance that will be updated by the worker.

#### Execute in Background and wait for completion

You can also use `RunCommandAsync` if you prefer to wait until the job has been executed. Instead of a Job it returns the **Result** of the command if it returned one.

```csharp
var result = await jobs.RunCommandAsync<SendEmailCommand>(new SendEmail {...},
    new() {
        Worker = "smtp"
    });
```

### Serially Execute Jobs with named Workers

By default jobs are executed immediately in a new Task. We can instead execute jobs one-by-one in a serial queue by specifying them to use the same named worker, as seen in the example above. Alternatively you can annotate the command with the `[Worker]` attribute if you **always** want all jobs executing the command to use the same worker:

```csharp
[Worker("smtp")]
public class SendEmailCommand(IBackgroundJobs jobs) : SyncCommand<SendEmail>
{
    //...
}
```

### Use Callbacks to process the results of Commands

Callbacks can be used to extend the lifetime of a job to include processing a callback to process its results. This is useful where you would like to reuse the same command but handle the results differently, e.g.
the same command can email results or invoke a webhook by using a callback:

```csharp
jobs.EnqueueCommand<CheckUrlsCommand>(new CheckUrls { Urls = allUrls },
    new() {
        Callback = nameof(EmailUrlResultsCommand),
    });

jobs.EnqueueCommand<CheckUrlsCommand>(new CheckUrls { Urls = criticalUrls },
    new() {
        Callback = nameof(WebhookUrlResultsCommand),
        ReplyTo = callbackUrl
    });
```

Callbacks that fail are auto-retried the same number of times as their jobs; if they all fail then the entire job is also marked as failed.

### Run Job dependent on successful completion of parent

Jobs can be queued to only run after the successful completion of another job. This is useful for when you need to kick off multiple jobs after a long running task has finished, like generating monthly reports after monthly data has been aggregated, e.g:

```csharp
var jobRef = jobs.EnqueueCommand<AggregateMonthlyDataCommand>(new Aggregate {
    Month = DateTime.UtcNow
});
jobs.EnqueueCommand<GenerateSalesReportCommand>(new() {
    DependsOn = jobRef.Id,
});
jobs.EnqueueCommand<GenerateExpenseReportCommand>(new() {
    DependsOn = jobRef.Id,
});
```

Inside your command you can get a reference to your current job with `Request.GetBackgroundJob()`, which will have its `ParentId` populated with the parent job Id and `job.ParentJob` containing a reference to the completed Parent Job, where you can access its Request, Results and other job information:

```csharp
public class GenerateSalesReportCommand(ILogger<GenerateSalesReportCommand> log) 
    : SyncCommand
{
    protected override void Run()
    {
        var job = Request.GetBackgroundJob();
        var parentJob = job.ParentJob;
    }
}
```

### Atomic Batching Behavior

We can also use `DependsOn` to implement atomic batching behavior, where from inside our executing command we can queue new jobs that are dependent on the successful execution of the current job, e.g:

```csharp
public class CheckUrlsCommand(IHttpClientFactory factory, IBackgroundJobs jobs) 
    : AsyncCommand<CheckUrls>
{
    protected override async Task RunAsync(CheckUrls req, CancellationToken ct)
    {
        var job = Request.GetBackgroundJob();
        var batchId = Guid.NewGuid().ToString("N");
        using var client = factory.CreateClient();
        foreach (var url in req.Urls)
        {
            var msg = new HttpRequestMessage(HttpMethod.Get, url);
            var response = await client.SendAsync(msg, ct);
            response.EnsureSuccessStatusCode();
            jobs.EnqueueCommand<SendEmailCommand>(new SendEmail {
                To = "my@email.com",
                Subject = $"{new Uri(url).Host} status",
                BodyText = $"{url} is up",
            }, new() {
                DependsOn = job.Id,
                BatchId = batchId,
            });
        }
    }
}
```

Where any dependent jobs are only executed if the job was successfully completed. If instead an exception was thrown during execution, the job will be failed and all its dependent jobs cancelled and removed from the queue.
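Tying the last few sections together, here's a minimal sketch of what the `EmailUrlResultsCommand` callback referenced above might look like - a callback command receives the result of the completed job as its request, and the `CheckUrlsResult` shape is assumed from the command shown later in this post:

```csharp
public class EmailUrlResultsCommand(IBackgroundJobs jobs) 
    : SyncCommand<CheckUrlsResult>
{
    protected override void Run(CheckUrlsResult result)
    {
        // Summarize the URLs the completed job reported as down
        var down = result.Statuses.Where(x => !x.Value)
            .Select(x => x.Key).ToList();

        // Queue the email on the same named "smtp" worker as SendEmailCommand
        jobs.EnqueueCommand<SendEmailCommand>(new SendEmail {
            To = "ops@example.org",
            Subject = $"{down.Count} URLs are down",
            BodyText = string.Join("\n", down),
        });
    }
}
```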
### Run Job dependent on successful completion of parent

Jobs can be queued to only run after the successful completion of another job. This is useful when you need to kick off multiple jobs after a long-running task has finished, like generating monthly reports after monthly data has been aggregated, e.g:

```csharp
var jobRef = jobs.EnqueueCommand<AggregateCommand>(new Aggregate {
    Month = DateTime.UtcNow
});
jobs.EnqueueCommand<GenerateSalesReportCommand>(new() {
    DependsOn = jobRef.Id,
});
jobs.EnqueueCommand<GenerateExpenseReportCommand>(new() {
    DependsOn = jobRef.Id,
});
```

Inside your command you can get a reference to your current job with `Request.GetBackgroundJob()`, which will have its `ParentId` populated with the parent job Id and `job.ParentJob` containing a reference to the completed Parent Job where you can access its Request, Results and other job information:

```csharp
public class GenerateSalesReportCommand(ILogger<GenerateSalesReportCommand> log)
    : SyncCommand
{
    protected override void Run()
    {
        var job = Request.GetBackgroundJob();
        var parentJob = job.ParentJob;
    }
}
```

### Atomic Batching Behavior

We can also use `DependsOn` to implement atomic batching behavior where, from inside our executing command, we can queue new jobs that are dependent on the successful execution of the current job, e.g:

```csharp
public class CheckUrlsCommand(IHttpClientFactory factory, IBackgroundJobs jobs)
    : AsyncCommand<CheckUrls>
{
    protected override async Task RunAsync(CheckUrls req, CancellationToken ct)
    {
        var job = Request.GetBackgroundJob();
        var batchId = Guid.NewGuid().ToString("N");
        using var client = factory.CreateClient();
        foreach (var url in req.Urls)
        {
            var msg = new HttpRequestMessage(HttpMethod.Get, url);
            var response = await client.SendAsync(msg, ct);
            response.EnsureSuccessStatusCode();
            jobs.EnqueueCommand<SendEmailCommand>(new SendEmail {
                To = "my@email.com",
                Subject = $"{new Uri(url).Host} status",
                BodyText = $"{url} is up",
            }, new() {
                DependsOn = job.Id,
                BatchId = batchId,
            });
        }
    }
}
```

Any dependent jobs are only executed if the job successfully completes. If instead an exception was thrown during execution, the job will be failed and all its dependent jobs cancelled and removed from the queue.

### Executing jobs with an Authenticated User Context

If you have existing logic dependent on an Authenticated `ClaimsPrincipal` or ServiceStack `IAuthSession`, you can have your APIs and Commands also be executed with that user context by specifying the `UserId` the job should be executed as:

```csharp
var openAiRequest = new CreateOpenAiChat {
    Request = new() {
        Model = "gpt-4",
        Messages = [
            new() { Content = request.Question }
        ]
    },
};

// Example executing API Job with User Context
jobs.EnqueueApi(openAiRequest,
    new() {
        UserId = Request.GetClaimsPrincipal().GetUserId(),
        CreatedBy = Request.GetClaimsPrincipal().GetUserName(),
    });

// Example executing Command Job with User Context
jobs.EnqueueCommand<CreateOpenAiChatCommand>(openAiRequest,
    new() {
        UserId = Request.GetClaimsPrincipal().GetUserId(),
        CreatedBy = Request.GetClaimsPrincipal().GetUserName(),
    });
```

Inside your API or Command you can access the populated User `ClaimsPrincipal` or ServiceStack `IAuthSession` using the same APIs that you'd use inside your ServiceStack APIs, e.g:

```csharp
public class CreateOpenAiChatCommand(IBackgroundJobs jobs)
    : AsyncCommand<CreateOpenAiChat>
{
    protected override async Task RunAsync(
        CreateOpenAiChat request, CancellationToken token)
    {
        var user = Request.GetClaimsPrincipal();
        var session = Request.GetSession();
        //...
    }
}
```

### Queue Job to run after a specified date

Using `RunAfter` lets you queue jobs that are only executed after a specified `DateTime`, useful for executing resource intensive tasks at low traffic times, e.g:

```csharp
var jobRef = jobs.EnqueueCommand<AggregateCommand>(new Aggregate {
    Month = DateTime.UtcNow
}, new() {
    RunAfter = DateTime.UtcNow.Date.AddDays(1)
});
```

### Attach Metadata to Jobs

All the above Background Job Options have an effect on when and how Jobs are executed. There are also a number of properties that can be attached to a Job which, whilst having no effect on how jobs are executed, can be useful in background job processing. These properties can be accessed by the commands or APIs executing the Job and are visible and filterable in the Jobs Admin UI to help find and analyze executed jobs:

```csharp
var jobRef = jobs.EnqueueCommand<CreateOpenAiChatCommand>(openAiRequest,
    new() {
        // Group related jobs under a common tag
        Tag = "ai",
        // A user-specified or system-generated unique Id to track the job
        RefId = request.RefId,
        // Capture who created the job
        CreatedBy = Request.GetClaimsPrincipal().GetUserName(),
        // Link jobs together that are sent together in a batch
        BatchId = batchId,
        // Capture where to notify the completion of the job to
        ReplyTo = "https://example.org/callback",
        // Additional properties about the job that aren't in the Request
        Args = new() {
            ["Additional"] = "Metadata"
        }
    });
```

### Querying a Job

A job can be queried by either its auto-incrementing `Id` Primary Key or by a unique `RefId` that can be user-specified:

```csharp
var jobResult = jobs.GetJob(jobRef.Id);

var jobResult = jobs.GetJobByRefId(jobRef.RefId);
```

At a minimum a `JobResult` will contain the Summary Information about a Job as well as the full information about a job depending on where it's located:

```csharp
class JobResult
{
    // Summary Metadata about a Job in the JobSummary Table
    JobSummary Summary
    // Job that's still in the BackgroundJob Queue
    BackgroundJob? Queued
    // Full Job information in Monthly DB CompletedJob Table
    CompletedJob? Completed
    // Full Job information in Monthly DB FailedJob Table
    FailedJob? Failed
    // Helper to access full Job Information
    BackgroundJobBase? Job => Queued ?? Completed ?? Failed
}
```
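As a rough usage sketch, the `JobResult` helpers above can be used to check on a job's state. The individual `CompletedJob`/`FailedJob` properties aren't detailed in this post, so only the null checks below should be treated as grounded:

```csharp
var jobResult = jobs.GetJobByRefId(refId); // refId is a placeholder
if (jobResult?.Completed != null)
    Console.WriteLine("Job finished successfully"); // details in CompletedJob
else if (jobResult?.Failed != null)
    Console.WriteLine("Job failed after its retries"); // details in FailedJob
else if (jobResult?.Queued != null)
    Console.WriteLine("Job still queued or executing");
```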
### Job Execution Limits

Default Retry and Timeout Limits can be configured on the Background Jobs plugin:

```csharp
services.AddPlugin(new BackgroundsJobFeature {
    DefaultRetryLimit = 2,
    DefaultTimeout = TimeSpan.FromMinutes(10),
});
```

These limits are also overridable on a per-job basis, e.g:

```csharp
var jobRef = jobs.EnqueueCommand<AggregateCommand>(new Aggregate {
    Month = DateTime.UtcNow
}, new() {
    RetryLimit = 3,
    Timeout = TimeSpan.FromMinutes(30),
});
```

### Logging, Cancellation and Status Updates

We'll use the command for checking multiple URLs to demonstrate some recommended patterns and how to enlist different job processing features:

```csharp
public class CheckUrlsCommand(
    ILogger<CheckUrlsCommand> logger,
    IBackgroundJobs jobs,
    IHttpClientFactory clientFactory) : AsyncCommand<CheckUrls>
{
    protected override async Task RunAsync(CheckUrls req, CancellationToken ct)
    {
        // 1. Create Logger that Logs and maintains logging in Jobs DB
        var log = Request.CreateJobLogger(jobs, logger);
        // 2. Get Current Executing Job
        var job = Request.GetBackgroundJob();
        var result = new CheckUrlsResult {
            Statuses = new()
        };
        using var client = clientFactory.CreateClient();
        for (var i = 0; i < req.Urls.Count; i++)
        {
            // 3. Stop processing Job if it's been cancelled
            ct.ThrowIfCancellationRequested();
            var url = req.Urls[i];
            try
            {
                var msg = new HttpRequestMessage(HttpMethod.Get, url);
                var response = await client.SendAsync(msg, ct);
                result.Statuses[url] = response.IsSuccessStatusCode;
                log.LogInformation("{Url} is {Status}",
                    url, response.IsSuccessStatusCode ? "up" : "down");
                // 4. Optional: Maintain explicit progress and status updates
                log.UpdateStatus(i / (double)req.Urls.Count, $"Checked {i} URLs");
            }
            catch (Exception e)
            {
                log.LogError(e, "Error checking {Url}", url);
                result.Statuses[url] = false;
            }
        }
        // 5. Send Results to WebHook Callback if specified
        if (job.ReplyTo != null)
        {
            jobs.EnqueueCommand<NotifyCheckUrlsCommand>(result,
                new() {
                    ParentId = job.Id,
                    ReplyTo = job.ReplyTo,
                });
        }
    }
}
```

We'll cover some of the notable parts useful when executing Jobs:

#### 1. Job Logger

We can use a Job logger to enable database logging that can be monitored in real-time in the Admin Jobs UI. Creating it with both `IBackgroundJobs` and `ILogger` will return a combined logger that logs both to standard output and to the Jobs database:

```csharp
var log = Request.CreateJobLogger(jobs, logger);
```

Or just use `Request.CreateJobLogger(jobs)` to only save logs to the database.

#### 2. Resolve Executing Job

If needed, the currently executing job can be accessed with:

```csharp
var job = Request.GetBackgroundJob();
```

Where you'll be able to access all the metadata the job was created with, including `ReplyTo` and `Args`.

#### 3. Check if Job has been cancelled

To be able to cancel a long-running job you'll need to periodically check if a Cancellation has been requested and throw a `TaskCanceledException` if it has, to short-circuit the command, which can be done with:

```csharp
ct.ThrowIfCancellationRequested();
```

You'll typically want to call this at the start of any loops to prevent them from doing any more work.

#### 4. Optionally record progress and status updates

By default Background Jobs looks at the last API or Command run and the worker used to estimate the duration and progress of a running job.
If preferred, your command can explicitly set a more precise progress and optional status update that should be used instead, e.g:

```csharp
log.UpdateStatus(progress: i / (double)req.Urls.Count, $"Checked {i} URLs");
```

Although generally the estimated duration and live logs provide a good indication of a job's progress.

#### 5. Notify completion of Job

Calling a Web Hook is a good way to notify externally initiated job requests of the completion of a job. You could invoke the callback within the command itself, but there are a few benefits to initiating another job to handle the callback:

- Frees up the named worker immediately to process the next task
- Callbacks are durable, auto-retried and their success recorded like any job
- If a callback fails the entire command doesn't need to be re-run again

We can queue a callback with the result by passing through the `ReplyTo` and link it to the existing job with:

```csharp
if (job.ReplyTo != null)
{
    jobs.EnqueueCommand<NotifyCheckUrlsCommand>(result,
        new() {
            ParentId = job.Id,
            ReplyTo = job.ReplyTo,
        });
}
```

Which we can implement by calling the `SendJsonCallbackAsync` extension method with the Callback URL and the Result DTO it should be called with:

```csharp
public class NotifyCheckUrlsCommand(IHttpClientFactory clientFactory)
    : AsyncCommand<CheckUrlsResult>
{
    protected override async Task RunAsync(
        CheckUrlsResult request, CancellationToken token)
    {
        await clientFactory.SendJsonCallbackAsync(
            Request.GetBackgroundJob().ReplyTo, request, token);
    }
}
```

#### Callback URLs

`ReplyTo` can be any URL, which by default will have the result POST'ed back to it with a JSON Content-Type. Typically URLs will contain a reference Id so external clients can correlate a callback with the internal process that initiated the job. If the callback API is publicly available, you'll want to use an internal Id that can't be guessed so the callback can't be spoofed, like a Guid, e.g:

`$"https://api.example.com?refId={RefId}"`

If needed, the callback URL can customize how the HTTP Request callback is sent:

- If the URL contains a space, the text before the space is treated as the HTTP method: `"PUT https://api.example.com"`
- If the auth part contains a colon `:` it's treated as Basic Auth: `"username:password@https://api.example.com"`
- If the auth part starts with `http.` it's sent as an HTTP Header: `"http.X-API-Key:myApiKey@https://api.example.com"`
- Otherwise the auth part is sent as a Bearer Token: `"myToken123@https://api.example.com"`
- Bearer Tokens or HTTP Headers starting with `$` are substituted with the Environment Variable of the same name if it exists: `"$API_TOKEN@https://api.example.com"`

When needed, headers, passwords and tokens can be URL encoded if they contain any delimiter characters.
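Putting those conventions together, here's a quick reference sketch; every value is illustrative:

```csharp
// Each entry shows one supported ReplyTo format described above
string[] replyToExamples =
[
    $"https://api.example.com?refId={refId}",           // POST JSON (default)
    "PUT https://api.example.com",                      // explicit HTTP method
    "username:password@https://api.example.com",       // Basic Auth
    "http.X-API-Key:myApiKey@https://api.example.com", // HTTP Header
    "myToken123@https://api.example.com",              // Bearer Token
    "$API_TOKEN@https://api.example.com",              // from Environment Variable
];
```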
## Implementing Commands

At a minimum a command need only implement the simple [IAsyncCommand interface](https://docs.servicestack.net/commands#commands-feature):

```csharp
public interface IAsyncCommand<in T>
{
    Task ExecuteAsync(T request);
}
```

Which is the singular interface that can execute any command. However commands executed via Background Jobs have additional context your commands may need to access during execution, including the `BackgroundJob` itself, the `CancellationToken` and an Authenticated User Context.

To reduce the effort in creating commands with an `IRequest` context, we've added a number of ergonomic base classes to better capture the different call-styles a unit of logic can have, including **Sync** or **Async** execution and whether they require **Input Arguments** or have **Result Outputs**.

Choosing the appropriate abstract base class benefits from IDE tooling in generating the method signature that needs to be implemented, whilst Async commands with Cancellation Tokens in their method signature highlight any missing async methods that are called without the token.

### Sync Commands

- `SyncCommand` - Requires No Arguments
- `SyncCommand<TRequest>` - Requires TRequest Argument
- `SyncCommandWithResult<TResult>` - Requires No Args and returns Result
- `SyncCommandWithResult<TRequest,TResult>` - Requires Arg and returns Result

```csharp
public record MyArgs(int Id);
public record MyResult(string Message);

public class MyCommandNoArgs(ILogger<MyCommandNoArgs> log) : SyncCommand
{
    protected override void Run()
    {
        log.LogInformation("Called with No Args");
    }
}

public class MyCommandArgs(ILogger<MyCommandArgs> log) : SyncCommand<MyArgs>
{
    protected override void Run(MyArgs request)
    {
        log.LogInformation("Called with {Id}", request.Id);
    }
}

public class MyCommandWithResult(ILogger<MyCommandWithResult> log)
    : SyncCommandWithResult<MyResult>
{
    protected override MyResult Run()
    {
        log.LogInformation("Called with No Args and returns Result");
        return new MyResult("Hello World");
    }
}

public class MyCommandWithArgsAndResult(ILogger<MyCommandWithArgsAndResult> log)
    : SyncCommandWithResult<MyArgs,MyResult>
{
    protected override MyResult Run(MyArgs request)
    {
        log.LogInformation("Called with {Id} and returns Result", request.Id);
        return new MyResult("Hello World");
    }
}
```

### Async Commands

- `AsyncCommand` - Requires No Arguments
- `AsyncCommand<TRequest>` - Requires TRequest Argument
- `AsyncCommandWithResult<TResult>` - Requires No Args and returns Result
- `AsyncCommandWithResult<TRequest,TResult>` - Requires Arg and returns Result

```csharp
public class MyAsyncCommandNoArgs(ILogger<MyAsyncCommandNoArgs> log) : AsyncCommand
{
    protected override async Task RunAsync(CancellationToken token)
    {
        log.LogInformation("Async called with No Args");
    }
}

public class MyAsyncCommandArgs(ILogger<MyAsyncCommandArgs> log)
    : AsyncCommand<MyArgs>
{
    protected override async Task RunAsync(MyArgs request, CancellationToken t)
    {
        log.LogInformation("Async called with {Id}", request.Id);
    }
}

public class MyAsyncCommandWithResult(ILogger<MyAsyncCommandWithResult> log)
    : AsyncCommandWithResult<MyResult>
{
    protected override async Task<MyResult> RunAsync(CancellationToken token)
    {
        log.LogInformation("Async called with No Args and returns Result");
        return new MyResult("Hello World");
    }
}

public class MyAsyncCommandWithArgsAndResult(ILogger<MyAsyncCommandWithArgsAndResult> log)
    : AsyncCommandWithResult<MyArgs,MyResult>
{
    protected override async Task<MyResult> RunAsync(
        MyArgs request, CancellationToken token)
    {
        log.LogInformation("Called with {Id} and returns Result", request.Id);
        return new MyResult("Hello World");
    }
}
```
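Since commands remain plain classes, they're easy to exercise in isolation. A minimal sketch, assuming the base classes above implement `IAsyncCommand<T>` and using `NullLogger` from `Microsoft.Extensions.Logging.Abstractions`:

```csharp
using Microsoft.Extensions.Logging.Abstractions;

// Exercise the MyCommandArgs example above via its IAsyncCommand entry point
IAsyncCommand<MyArgs> command = new MyCommandArgs(NullLogger<MyCommandArgs>.Instance);
await command.ExecuteAsync(new MyArgs(Id: 1));
```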
# Utilize C# Commands to build more robust and observable systems

Source: https://servicestack.net/posts/commands-feature

Much of ServiceStack has been focused on providing a productive [API First Development](https://docs.servicestack.net/api-first-development) experience and adding value-added features around your System's external APIs, including:

- [Native end-to-end typed API integrations](https://docs.servicestack.net/add-servicestack-reference) to **11 popular languages**
- Built-in [API Explorer](https://docs.servicestack.net/api-explorer) to discover, browse and invoke APIs
- Instant CRUD UIs with [Auto Query](https://docs.servicestack.net/autoquery/) and [Locode](https://docs.servicestack.net/locode/)
- Custom CRUD UIs with [Blazor components](https://blazor-gallery.jamstacks.net) and [Vue Components](https://docs.servicestack.net/vue/)

As well as [Declarative Validation](https://docs.servicestack.net/declarative-validation), multiple [Auth Integrations](https://docs.servicestack.net/auth/) and other extensive [Declarative Features](https://docs.servicestack.net/locode/declarative) to enhance your external facing APIs.

### Internal API Implementation

Little attention has been given to the internal implementations of APIs since they can use anything that fulfils their service contract by returning the API's populated Response DTO. How code-bases are structured is largely a matter of developer preference, however we believe we've also been able to add value in this area by introducing an appealing option with our new managed **Commands** Feature.

:::youtube SXPdBHbncPc
Use C# Commands to build robust and observable systems with Admin UI
:::

## Code Architecture

Ultimately nothing beats the simplicity of "No Architecture": maintaining all logic within a Service implementation which just needs to call a few App dependencies to implement its functionality and return a populated Response DTO:

```csharp
public object Any(MyRequest request) => new MyResponse { ... };
```

This is still the best option for small implementations where the Service is the only consumer of logic that should be run on the HTTP Worker Request Thread.

#### When to restructure

The times when you may want to consider moving logic out of your Service into separate classes include:

- **Code Reuse**: Make it easier to reuse your Service logic in other Services
- **Complexity**: Break down complex logic into smaller more manageable pieces
- **Testability**: Make it easier to test your Logic in isolation
- **Observability**: Make it easier to log and monitor
- **Robustness**: Make it easier to handle, retry and recover from errors
- **Flexibility**: Make it easier to run in parallel or in a different managed thread

We'll look at how the new **Commands Feature** can help with these concerns.

### Code Reuse

Following the YAGNI principle of doing the simplest thing that could possibly work, whenever we want to reuse logic across Services we'd first start by moving it to an extension method on the dependency that it uses, e.g:

```csharp
public static async Task<List<Contact>> GetActiveSubscribersAsync(
    this IDbConnection db, MailingList mailingList)
{
    return await db.SelectAsync(db.From<Contact>(db.TableAlias("c"))
        .Where(x => x.DeletedDate == null && x.UnsubscribedDate == null
            && x.VerifiedDate != null && (mailingList & x.MailingLists) == mailingList)
        .WhereNotExists(db.From<InvalidEmail>()
            .Where<Contact,InvalidEmail>((c,e) => e.EmailLower == Sql.TableAlias(c.EmailLower, "c"))
            .Select(x => x.Id))
    );
}
```

Which does a great job at encapsulating logic and making it reusable and readable:

```csharp
foreach (var sub in await Db.GetActiveSubscribersAsync(MailingList.Newsletter))
{
    //...
}
```

Where it can be reused without referencing any external classes whilst also being easily discoverable via IntelliSense.
This works great for 1 or 2 dependencies, but becomes more cumbersome as the number of dependencies grows, e.g:

```csharp
public static async Task<List<Contact>> GetActiveSubscribersAsync(
    this IDbConnection db, ILogger log, ICacheClient cache, MailingList mailingList)
```

In which case the extension method's dependencies leak into and impact all calling classes that need to supply them, which also starts to hurt its readability, e.g:

```csharp
public class MyService(ILogger<MyService> log, ICacheClient cache, IDbConnection db)
    : Service
{
    public async Task<object> Any(MyRequest request)
    {
        var subs = await db.GetActiveSubscribersAsync(log, cache, request.MailList);
        //...
    }
}
```

### Refactoring Logic into separate classes

The solution to this is to refactor the logic into a separate class and leverage the IOC to inject the dependencies it needs. Fortunately with Primary Constructors this now requires minimal boilerplate, e.g:

```csharp
class MyLogic(ILogger<MyLogic> log, ICacheClient cache, IDbConnection db)
{
    //...
}
```

But it still requires manual registration, adding additional complexity to your Host project's `Program.cs` or [Modular Configurations](https://docs.servicestack.net/modular-startup) which need to manage registrations for all these new logic classes, e.g:

```csharp
builder.Services.AddTransient<MyLogic>();
```

## Commands Feature

Which touches on the first benefit of the **Commands Feature**: like ServiceStack Services, it auto-registers all classes implementing the intentionally simple and implementation-free `IAsyncCommand` interface:

```csharp
public interface IAsyncCommand<in T>
{
    Task ExecuteAsync(T request);
}
```

Allowing for maximum flexibility in how to implement your logic classes, which are essentially encapsulated units of logic with a single method to execute them, e.g:

```csharp
public class AddTodoCommand(ILogger<AddTodoCommand> log, IDbConnection db)
    : IAsyncCommand<CreateTodo>
{
    public async Task ExecuteAsync(CreateTodo request)
    {
        var newTodo = request.ConvertTo<Todo>();
        newTodo.Id = await db.InsertAsync(newTodo, selectIdentity: true);
        log.LogDebug("Created Todo {Id}: {Text}", newTodo.Id, newTodo.Text);
    }
}
```

Where we immediately get the benefits of code reuse, encapsulation and readability without needing to manually register the classes and pollute your App's configuration with them.
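For reference, here's a minimal sketch of the `CreateTodo` Request DTO and `Todo` data model assumed by `AddTodoCommand`. Their shapes are assumptions inferred from the examples in this post (e.g. `selectIdentity:true` implies an auto-incrementing Id), not from the original:

```csharp
// Assumed Request DTO
public class CreateTodo
{
    public string Text { get; set; }
}

// Assumed OrmLite data model
public class Todo
{
    [AutoIncrement]
    public long Id { get; set; }
    public string Text { get; set; }
    public bool IsFinished { get; set; }
}
```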
By default Commands are registered as transient dependencies, but you can also register them with a different lifetime scope using the `[Lifetime]` attribute, e.g:

```csharp
[Lifetime(Lifetime.Scoped)]
public class AddTodoCommand(ILogger<AddTodoCommand> log, IDbConnection db)
    : IAsyncCommand<CreateTodo> {}
```

Or by manually registering them if you need a custom registration:

```csharp
services.AddTransient<AddTodoCommand>(c => CreateAddTodoCommand(c));
```

### Commands with Results

For maximum flexibility we want to encourage temporal decoupling by separating initiating a command from its execution. So instead of adding a different method to execute commands with results, we're instead recommending the convention of storing the result of a command in a `Result` property, e.g:

```csharp
public interface IAsyncCommand<in TRequest, out TResult>
    : IAsyncCommand<TRequest>, IHasResult<TResult> { }

public interface IHasResult<out T>
{
    T Result { get; }
}
```

So we could implement a command with a result like:

```csharp
public class AddTodoCommand(ILogger<AddTodoCommand> log, IDbConnection db)
    : IAsyncCommand<CreateTodo,Todo>
{
    public Todo Result { get; private set; }

    public async Task ExecuteAsync(CreateTodo request)
    {
        Result = request.ConvertTo<Todo>();
        Result.Id = await db.InsertAsync(Result, selectIdentity: true);
        log.LogDebug("Created Todo {Id}: {Text}", Result.Id, Result.Text);
    }
}
```

### Messaging

Although for better resilience and scalability we recommend utilizing a messaging pattern to notify the outputs of a command by publishing messages to invoke dependent logic, instead of returning a result, e.g:

```csharp
public class AddTodoCommand(IDbConnection db, IMessageProducer mq)
    : IAsyncCommand<CreateTodo>
{
    public async Task ExecuteAsync(CreateTodo request)
    {
        var newTodo = request.ConvertTo<Todo>();
        newTodo.Id = await db.InsertAsync(newTodo, selectIdentity: true);
        mq.Publish(new SendNotification { TodoCreated = newTodo });
    }
}
```

This decouples the sender and receiver of the message, allowing the command to finish without needing to wait for, or concern itself with, how subsequent logic is processed, e.g. how to handle errors, whether to execute it in a different managed thread, in parallel, etc.

Messaging encourages adopting a more reliable asynchronous one-way workflow instead of implementing logic serially, where the sender is temporally coupled to the successful execution of all subsequent logic before being able to complete, e.g:

```csharp
await cmd.ExecuteAsync(createTodo);
var newTodo = cmd.Result;
await SendNewTodoNotificationAsync(newTodo);
```

It allows for more reliable and observable workflows that remove the temporal coupling between components, where each execution step can be executed on different threads, independently monitored and retried if needed:

```txt
[A] -> [B] -> [C]
```

### Commands as Application Building Blocks

As they're not dependent on any framework and can support multiple execution patterns, we believe Commands make great building blocks for insulating units of logic: they're simple and testable and allow for managed execution which can easily add logging, monitoring and resilience around your logic.

### Background MQ

It should be noted that adopting a messaging pattern doesn't require the additional infrastructure complexity of an external MQ Server, as you can use the [Background MQ](https://docs.servicestack.net/background-mq) to execute messages in configurable managed background threads.
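As a hedged sketch of the consuming side, here's what handling the published message might look like. The `SendNotification` DTO shape and the Service are assumptions based on the `mq.Publish` call above:

```csharp
// Assumed DTO published by AddTodoCommand above
public class SendNotification : IReturnVoid
{
    public Todo TodoCreated { get; set; }
}

// A ServiceStack API invoked by Background MQ on a managed background thread,
// after registering it with e.g:
//   mqService.RegisterHandler<SendNotification>(appHost.ExecuteMessage);
public class SendNotificationService : Service
{
    public void Any(SendNotification request)
    {
        // Illustrative only: notify subscribers that request.TodoCreated was added
    }
}
```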
### Executing Commands

Commands are effectively a pattern for structuring your logic that doesn't depend on any implementation assembly or framework, so they can just be executed directly, e.g:

```csharp
using var db = dbFactory.Open();
var cmd = new AddTodoCommand(new NullLogger<AddTodoCommand>(), db);
await cmd.ExecuteAsync(new CreateTodo { Text = "New Todo" });
```

### Command Executor

They also allow for managed execution, which the **CommandsFeature** provides with its `ICommandExecutor` that can be used like:

```csharp
public class MyService(ICommandExecutor executor) : Service
{
    public async Task<object> Any(MyRequest request)
    {
        var cmd = executor.Command<AddTodoCommand>();
        await cmd.ExecuteAsync(new CreateTodo { Text = "New Todo" });
        //...
    }
}
```

This still results in the same behavior where exceptions are bubbled, but also adds observability, resilience and other niceties like executing any Fluent or Declarative Validation on Command Requests.

### Retry Failed Commands

We can make commands more resilient by adding the `[Retry]` attribute to opt into auto-retrying failed commands:

```csharp
[Retry]
public class AddTodoCommand() : IAsyncCommand<CreateTodo> {}
```

Which will automatically retry the command as per the default Retry Policy:

```csharp
services.AddPlugin(new CommandsFeature {
    DefaultRetryPolicy = new(
        Times: 3,
        Behavior: RetryBehavior.FullJitterBackoff,
        DelayMs: 100,
        MaxDelayMs: 60_000,
        DelayFirst: false
    )
});
```

That can be overridden on a per-command basis with the `[Retry]` attribute, e.g:

```csharp
[Retry(Times=4, MaxDelayMs=300_000, Behavior=RetryBehavior.LinearBackoff)]
public class AddTodoCommand() : IAsyncCommand<CreateTodo> {}
```

The different Retry Behaviors available include:

```csharp
public enum RetryBehavior
{
    // Use the default retry behavior
    Default,
    // Always retry the operation after the same delay
    Standard,
    // Should be retried with a linear backoff delay strategy
    LinearBackoff,
    // Should be retried with an exponential backoff strategy
    ExponentialBackoff,
    // Should be retried with a full jittered exponential backoff strategy
    FullJitterBackoff,
}
```

## Command Admin UI

Command executions can be inspected in the new **Command Admin UI**, where you can view summary stats of all executed Commands and **APIs** in the **Summary** tab, e.g:

[![](/img/posts/commands-feature/AddTodoCommand-summary.png)](/img/posts/commands-feature/AddTodoCommand-summary.png)

### Latest Command Executions

It also maintains a rolling log of the latest executed commands in the **Latest** tab:

[![](/img/posts/commands-feature/AddTodoCommand-latest.png)](/img/posts/commands-feature/AddTodoCommand-latest.png)

### Failed Command Executions

Whilst the **Errors** tab shows a list of all failed **Command** and **API** executions:

[![](/img/posts/commands-feature/AddTodoCommand-errors.png)](/img/posts/commands-feature/AddTodoCommand-errors.png)

### Execute Internal Commands

A benefit of using Commands as the building block for your internal logic is that they enjoy many of the same benefits as ServiceStack's message-based Services: they can be invoked using just the Command **Name** and a **Request** Body, which allows them to be discovered and executed from the **Explore** tab:

[![](/img/posts/commands-feature/AddTodoCommand-execute.png)](/img/posts/commands-feature/AddTodoCommand-execute.png)

In this way they can be treated like **Internal APIs** for invoking internal functionality that's only accessible by **Admin** Users.
### Group Commands by Tag

Just like ServiceStack Services, commands can be grouped by **Tag** to group related commands:

```csharp
[Tag("Todos")]
public class AddTodoCommand() : IAsyncCommand<CreateTodo> {}
```

## MQ Integration

Although `CommandsFeature` is a standalone feature, we're registering it in the new Identity Auth Templates' `Configure.Mq.cs`, which already uses the Background MQ to execute messages in managed background threads, where it's used to send Identity Auth emails:

```csharp
public class ConfigureMq : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices((context, services) => {
            services.AddSingleton<IMessageService>(c => new BackgroundMqService());
            services.AddPlugin(new CommandsFeature());
        })
        .ConfigureAppHost(afterAppHostInit: appHost => {
            var mqService = appHost.Resolve<IMessageService>();

            //Register ServiceStack APIs you want to be able to invoke via MQ
            mqService.RegisterHandler<SendEmail>(appHost.ExecuteMessage);
            mqService.Start();
        });
}
```

Despite being 2 independent features, they work well together as the Background MQ can be used to execute Commands in managed background threads, of which a single thread is used to execute each Request Type by default (configurable per request).

You'd typically want to use queues to improve scalability by reducing locking and concurrency contention on heavy resources, by having requests queued and executed in a managed background thread where they're able to execute as fast as possible without contention. Queues are also a great solution for working around the single-writer limitations of resources like SQLite databases.

## Use Case - SQLite Writes

As we've started to use server-side SQLite databases for our new Apps given their [many benefits](https://docs.servicestack.net/ormlite/litestream), we needed a solution to work around SQLite's limitation of not being able to handle multiple concurrent writes.

One of the benefits of using SQLite is that creating and managing multiple databases is relatively cheap, so we can mitigate this limitation somewhat by maintaining different subsystems in separate databases, e.g:

[![](/img/posts/commands-feature/pvq-databases.png)](/img/posts/commands-feature/pvq-databases.png)

But each database can still only be written to by a single thread at a time, which we can now easily facilitate with **Background MQ** and **MQ Command DTOs**.

### MQ Command DTOs

We can use the new `[Command]` attribute to execute multiple commands from a single Request DTO's properties, e.g:

```csharp
[Tag(Tag.Tasks)]
[Restrict(RequestAttributes.MessageQueue), ExcludeMetadata]
public class DbWrites : IGet, IReturn<EmptyResponse>
{
    [Command<CreatePostVoteCommand>]
    public Vote? CreatePostVote { get; set; }

    [Command<CreateCommentVoteCommand>]
    public Vote? CreateCommentVote { get; set; }

    [Command<CreatePostCommand>]
    public Post? CreatePost { get; set; }

    [Command<UpdatePostCommand>]
    public Post? UpdatePost { get; set; }

    [Command<DeletePostsCommand>]
    public DeletePosts? DeletePosts { get; set; }

    [Command<DeleteAnswersCommand>]
    public DeleteAnswers? DeleteAnswers { get; set; }

    [Command<CreateAnswerCommand>]
    public Post? CreateAnswer { get; set; }

    [Command<PostSubscriptionsCommand>]
    public PostSubscriptions? PostSubscriptions { get; set; }

    [Command<TagSubscriptionsCommand>]
    public TagSubscriptions? TagSubscriptions { get; set; }

    //...
}
```
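For illustration, here's a minimal sketch of what one of the referenced commands might look like. The name matches the `[Command<T>]` attribute above, but its implementation is an assumption; each command simply receives its paired property's value as its Request:

```csharp
// Illustrative: persists the Vote in the relevant subsystem's SQLite database
public class CreatePostVoteCommand(IDbConnection db) : IAsyncCommand<Vote>
{
    public async Task ExecuteAsync(Vote vote)
    {
        await db.InsertAsync(vote);
    }
}
```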
To execute the commands we can use the `Request.ExecuteCommandsAsync` extension method in its Background MQ API implementation:

```csharp
public class BackgroundMqServices : Service
{
    public Task Any(DbWrites request) => Request.ExecuteCommandsAsync(request);
}
```

Which goes through all the Request DTO's properties and executes each populated property with its associated command, using the property's value as the command's request.

So after registering the `DbWrites` Command DTO with the MQ Service:

```csharp
mqService.RegisterHandler<DbWrites>(appHost.ExecuteMessage);
```

We can now publish a single `DbWrites` message to execute multiple commands in a single managed background thread, e.g:

```csharp
public class NotificationServices(IMessageProducer mq) : Service
{
    public void Any(Watch request)
    {
        var userName = Request.GetClaimsPrincipal().GetUserName();

        mq.Publish(new DbWrites {
            PostSubscriptions = request.PostId == null ? null : new() {
                UserName = userName,
                Subscriptions = [request.PostId.Value],
            },
            TagSubscriptions = request.Tag == null ? null : new() {
                UserName = userName,
                Subscriptions = [request.Tag],
            },
        });

        mq.Publish(new AnalyticsTasks {
            WatchRequest = request,
        });
    }
}
```

We also benefit from its natural parallelism, where write requests to different databases are executed in parallel.

# Simple Auth Story for .NET 8 C# Microservices

Source: https://servicestack.net/posts/simple-auth-microservices

With ServiceStack now [fully integrated with ASP.NET Core Identity Auth](https://docs.servicestack.net/auth/identity-auth), our latest [.NET 8 Tailwind Templates](/start) now include a full-featured Auth Configuration complete with User Registration, Login, Password Recovery, Two Factor Auth, and more.

Whilst this is great for C# Web Applications which need it, it neglects the class of Apps which don't need User Auth and the additional complexity it brings with Identity and Password Management, EF Migrations, Token Expirations, etc.

For these stand-alone Apps, Microservices and Docker Appliances that would still like to restrict access to their APIs but don't need the complexity of ASP.NET Core's Authentication machinery, a simpler Auth Story is ideal.

With the [introduction of API Keys](/posts/apikeys) we're able to provide a simpler Auth Story for stand-alone .NET 8 Microservices that's easy for **Admin** Users to manage and control which trusted clients and B2B Integrations can access their functionality.

:::youtube 0ceU91ZBhTQ
Simple Auth Story with API Keys ideal for .NET 8 Microservices
:::

The easiest way to get started is by creating a new Empty project with API Keys enabled, with your preferred database to store the API Keys in. SQLite is a good choice for stand-alone Apps as it doesn't require any infrastructure dependencies.

[Create a new Empty project with API Keys](/start)

### Existing Projects

Existing projects not configured with Authentication can enable this simple Auth configuration by running:

:::sh
x mix apikeys-auth
:::

Which will add the [ServiceStack.Server](https://nuget.org/packages/ServiceStack.Server) dependency and the [Modular Startup](https://docs.servicestack.net/modular-startup) configuration below:

```csharp
public class ConfigureApiKeys : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices(services => {
            services.AddPlugin(new AuthFeature(new AuthSecretAuthProvider("p@55wOrd")));
            services.AddPlugin(new ApiKeysFeature {
                // Optional: Available Features Admin Users can assign to any API Key
                // Features = [
                //     "Paid",
                //     "Tracking",
                // ],
                // Optional: Available Scopes Admin Users can assign to any API Key
                // Scopes = [
                //     "todo:read",
                //     "todo:write",
                // ],
            });
        })
        .ConfigureAppHost(appHost => {
            using var db = appHost.Resolve<IDbConnectionFactory>().Open();
            var feature = appHost.GetPlugin<ApiKeysFeature>();
            feature.InitSchema(db);
        });
}
```

Which configures the **AuthSecretAuthProvider** with the **Admin** password and the **ApiKeysFeature** to enable support for [API Keys](https://docs.servicestack.net/auth/apikeys).

### Admin UI

The **Admin** password will give you access to the [Admin UI](https://docs.servicestack.net/admin-ui) at:

:::{.text-4xl .text-center .text-indigo-800}
/admin-ui
:::

![](/img/posts/simple-auth-microservices/admin-ui-signin.png)

![](/img/posts/simple-auth-microservices/admin-ui-dashboard.png)

### API Keys Admin UI

Clicking on the **API Keys** menu item will take you to the API Keys Admin UI, where you'll be able to create new API Keys that you can distribute to the different API consumers you want to be able to access your APIs:

![](/img/posts/simple-auth-microservices/admin-ui-apikeys.png)

The **ApiKeysFeature** plugin lets you control different parts of the UI, including what **Features** you want to assign to API Keys and what **Scopes** you want individual API Keys to be able to have access to:

```csharp
services.AddPlugin(new ApiKeysFeature {
    Features = [
        "Paid",
        "Tracking",
    ],
    Scopes = [
        "todo:read",
        "todo:write",
    ],
    // ExpiresIn = [
    //     new("", "Never"),
    //     new("30", "30 days"),
    //     new("365", "365 days"),
    // ],
    // Hide = ["RestrictTo","Notes"],
});
```

Any configuration on the plugin will be reflected in the UI:

![](/img/posts/simple-auth-microservices/admin-ui-apikeys-new.png)

The API Keys Admin UI also lets you view and manage all API Keys in your App, including the ability to revoke API Keys, extend their Expiration date as well as manage any Scopes and Features assigned to API Keys.

![](/img/posts/simple-auth-microservices/admin-ui-apikeys-edit.png)

### Protect APIs with API Keys

You'll now be able to protect APIs by annotating Request DTOs with the `[ValidateApiKey]` attribute:

```csharp
[ValidateApiKey]
public class Hello : IGet, IReturn<HelloResponse>
{
    public required string Name { get; set; }
}
```

Which only allows requests with a **valid API Key** to access the Service.
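For example, a typed client configured with one of the generated API Keys can now call it. A quick sketch, where `BaseUrl` and `apiKey` are placeholders (client configuration is covered in more detail below):

```csharp
// Calls the API Key protected Hello API above
var client = new JsonApiClient(BaseUrl) { BearerToken = apiKey };
var response = await client.GetAsync(new Hello { Name = "World" });
```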
### Scopes

We can further restrict API access by assigning APIs a scope, which only allows access to valid API Keys configured with that scope, e.g:

```csharp
[ValidateApiKey("todo:read")]
public class QueryTodos : QueryDb<Todo>
{
    public long? Id { get; set; }
    public List<long>? Ids { get; set; }
    public string? TextContains { get; set; }
}

[ValidateApiKey("todo:write")]
public class CreateTodo : ICreateDb<Todo>, IReturn<Todo>
{
    [ValidateNotEmpty]
    public required string Text { get; set; }
    public bool IsFinished { get; set; }
}

[ValidateApiKey("todo:write")]
public class UpdateTodo : IUpdateDb<Todo>, IReturn<Todo>
{
    public long Id { get; set; }
    [ValidateNotEmpty]
    public required string Text { get; set; }
    public bool IsFinished { get; set; }
}

[ValidateApiKey("todo:write")]
public class DeleteTodos : IDeleteDb<Todo>, IReturnVoid
{
    public long? Id { get; set; }
    public List<long>? Ids { get; set; }
}
```

### Restrict To APIs

Scopes allow for coarse-grained access control, where a single scope can access a logical group of APIs. For more fine-grained control you can use **Restrict To APIs** to specify just the APIs an API Key can access:

![](/img/posts/simple-auth-microservices/admin-ui-apikeys-restrict-to.png)

Unlike scopes, which can access APIs with the **same scope** or **without a scope**, valid API Keys configured with **Restrict To APIs** can only access those specific APIs.

### Features

Features are user-defined strings accessible within your Service implementation to provide different behavior based on the Features assigned to the API Key, e.g:

```csharp
public object Any(QueryTodos request)
{
    if (Request.GetApiKey().HasFeature("Paid"))
    {
        //...
    }
}
```

### API Explorer

Support for API Keys is also integrated into the [API Explorer](https://docs.servicestack.net/api-explorer), allowing users to use their API Keys to access API Key protected Services, which are highlighted with a **Key** icon:

![](/img/posts/simple-auth-microservices/apiexplorer-requires-apikey.png)

Users can enter their API Key by clicking on the **Key** icon in the top right, or the link in the Warning alert when trying to access an API Key protected Service:

![](/img/posts/simple-auth-microservices/apiexplorer-apikey-dialog.png)

### Client Usage

All HTTP and existing [Service Clients](https://docs.servicestack.net/clients-overview) can be configured to use API Keys for machine-to-machine communication, which like most API Key implementations can be passed in a [HTTP Authorization Bearer Token](https://datatracker.ietf.org/doc/html/rfc6750#section-2.1) configured in Service Clients with:

#### C#

```csharp
var client = new JsonApiClient(BaseUrl) {
    BearerToken = apiKey
};
```

#### TypeScript

```ts
const client = new JsonServiceClient(BaseUrl)
client.bearerToken = apiKey
```

### API Key HTTP Header

Alternatively, API Keys can also be passed in the `X-Api-Key` HTTP Header, which allows clients to be configured with an alternative Bearer Token, allowing the same client to call both **Authenticated** and **API Key** protected APIs, e.g:

#### C#

```csharp
var client = new JsonApiClient(BaseUrl) {
    BearerToken = jwt,
    Headers = {
        [HttpHeaders.XApiKey] = apiKey
    }
};
```

#### TypeScript

```ts
const client = new JsonServiceClient(BaseUrl)
client.bearerToken = jwt
client.headers.set('X-Api-Key', apiKey)
```

## Conclusion

We hope this shows how stand-alone .NET 8 Microservices and self-contained Docker Apps can use the simple **Admin** and **API Keys** configuration to easily secure their APIs with API Keys, complete with a **Management UI** and **typed Service Client** integrations.
# Using API Keys to secure .NET 8 C# APIs

Source: https://servicestack.net/posts/apikeys

As we continue to embrace and natively integrate with ASP.NET Core's .NET 8 platform, we've reimplemented the last major feature missing from ServiceStack Auth: support for API Keys, now available from **ServiceStack v8.3**.

### What are API Keys?

API Keys are a simple and effective way to authenticate and authorize access to your C# APIs. They're typically used for machine-to-machine communication, where a client application needs to access an API without user intervention. API Keys are often used to control access to specific resources or features in your API, providing a simple way to manage access control.

### Redesigning API Keys

Building on our experience with API Keys in previous versions of ServiceStack, we've taken the opportunity to redesign how API Keys work to provide a more flexible and powerful way to manage access control for your APIs.

The existing [API Key Auth Provider](https://docs.servicestack.net/auth/api-key-authprovider) was implemented as another Auth Provider that provided another way to authenticate a single user. The consequences of this were:

- The initial API request was slow, as it required going through the Authentication workflow to authenticate the user and set up authentication for that request
- No support for fine-grained access control, as API Keys had the same access as the authenticated user
- API Keys had to be associated with a User, which was unnecessary for machine-to-machine communication

Given the primary use-case for API Keys is machine-to-machine communication where the client is not a User, nor do API Key owners want the systems they give their API Keys to to have access to their User Account, we've changed how API Keys work in .NET 8.

## .NET 8 API Keys Feature

:::youtube U4vqOIHOs_Q
New .NET 8 API Keys Feature with Built-In UIs!
:::

The first design decision to overcome the above issues was to separate API Keys from Users and Authentication itself: the new `ApiKeysFeature` is now just a plugin instead of an Auth Provider, which can be added to existing Identity Auth Apps with:

:::sh
x mix apikeys
:::

Which will add the API Keys [Modular Startup](https://docs.servicestack.net/modular-startup) to your Host project, a minimal example of which looks like:

```csharp
public class ConfigureApiKeys : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices(services => {
            services.AddPlugin(new ApiKeysFeature());
        })
        .ConfigureAppHost(appHost => {
            using var db = appHost.Resolve<IDbConnectionFactory>().Open();
            var feature = appHost.GetPlugin<ApiKeysFeature>();
            feature.InitSchema(db);
        });
}
```

Where it registers the `ApiKeysFeature` plugin and creates the `ApiKey` table in the App's configured database if it doesn't already exist.

### Creating Seed API Keys

The plugin can also be used to programmatically generate API Keys for specified Users:

```csharp
if (feature.ApiKeyCount(db) == 0)
{
    var createApiKeysFor = new[] { "admin@email.com", "manager@email.com" };
    var users = IdentityUsers.GetByUserNames(db, createApiKeysFor);
    foreach (var user in users)
    {
        // Create a super API Key for the admin user
        List<string> scopes = user.UserName == "admin@email.com"
            ? [RoleNames.Admin]
            : [];
        var apiKey = feature.Insert(db,
            new() { Name="Seed Key", UserId=user.Id, UserName=user.UserName, Scopes=scopes });
        var generatedApiKey = apiKey.Key;
    }
}
```
### Basic Usage

With the plugin registered, you can now use the `[ValidateApiKey]` attribute to limit APIs to only be accessible with a valid API Key, e.g:

```csharp
[ValidateApiKey]
public class MyRequest {}
```

### Use API Keys with or without Users and Authentication

API Keys can optionally be associated with a User, but they don't have to be, nor do they run in the context of a User or have the ability to invoke any Authenticated APIs on their own. Users who create them can also limit their scope to only call APIs they have access to, which can be done with user-defined scopes:

### Scopes

Scopes are user-defined strings that can be used to limit APIs to only being accessible with API Keys that have the required scope. For example, we could generate API Keys that have **read only**, **write only** or **read/write** access to APIs by assigning them different scopes, e.g:

```csharp
public static class Scopes
{
    public const string TodoRead = "todo:read";
    public const string TodoWrite = "todo:write";
}

[ValidateApiKey(Scopes.TodoRead)]
public class QueryTodos : QueryDb<Todo> {}

[ValidateApiKey(Scopes.TodoWrite)]
public class CreateTodo : ICreateDb<Todo>, IReturn<Todo> {}

[ValidateApiKey(Scopes.TodoWrite)]
public class UpdateTodo : IUpdateDb<Todo>, IReturn<Todo> {}

[ValidateApiKey(Scopes.TodoWrite)]
public class DeleteTodos : IDeleteDb<Todo>, IReturnVoid {}
```

Where only API Keys with the `todo:read` scope can access the `QueryTodos` API and only API Keys with the `todo:write` scope can access the `CreateTodo`, `UpdateTodo` and `DeleteTodos` APIs.

APIs that aren't assigned a scope can be accessed by any valid API Key. The only built-in scope is `Admin` which, like the `Admin` role, enables full access to all `[ValidateApiKey]` APIs.

### Fine-grained Access Control

Alternatively, API Keys can be restricted to only be able to access specific APIs.

### Features

In addition to scopes, API Keys can also be tagged with user-defined **Features** which APIs can inspect to enable different behavior, e.g. a **Paid** feature could be used to increase rate limits or return premium content whilst a **Tracking** feature could be used to keep a record of API requests, etc. These can be accessed in your Services with:

```csharp
public object Any(QueryTodos request)
{
    if (Request.GetApiKey().HasFeature(Features.Paid))
    {
        // return premium content
    }
    //...
}
```

## Integrated UIs

Like many of ServiceStack's other premium features, API Keys are fully integrated into [ServiceStack's built-in UIs](https://servicestack.net/auto-ui) including [API Explorer](https://docs.servicestack.net/api-explorer) and the [Admin UI](https://docs.servicestack.net/admin-ui).

### API Explorer

Your Users and API Consumers can use API Explorer to invoke protected APIs with their API Key. API Key protected APIs will display a **key** icon next to the API instead of the **padlock** used to distinguish APIs that require Authentication.
Users can configure API Explorer with their API Key by either clicking the **key** icon in the top right or the **API Key** link on the alert message that appears when trying to access an API requiring an API Key:

![](/img/posts/apikeys/apiexplorer-apikeys.png)

Both of these will open the **API Key** dialog where they can paste their API Key:

![](/img/posts/apikeys/apiexplorer-apikeys-dialog.png)

:::info NOTE
API Keys are not stored in localStorage and are only available in the current session
:::

### Admin UI

**Admin** users can view and manage API Keys in the API Key [Admin UI](https://docs.servicestack.net/admin-ui) at:

:::{.text-4xl .text-center .text-indigo-800}
/admin-ui/apikeys
:::

![](/img/posts/apikeys/admin-ui-apikeys.png)

This will let you view and manage all API Keys in your App, including the ability to revoke API Keys, extend their Expiration date as well as manage any Scopes and Features assigned to API Keys.

### Customizing API Key UIs

The `ApiKeysFeature` plugin can be configured to specify which **Scopes** and **Features** can be assigned to API Keys as well as the different Expiration Options you want available in the API Key management UIs, e.g:

```csharp
services.AddPlugin(new ApiKeysFeature {
    // Optional: Available Features Admin Users can assign to any API Key
    Features = [
        Features.Paid,
        Features.Tracking,
    ],
    // Optional: Available Scopes Admin Users can assign to any API Key
    Scopes = [
        Scopes.TodoRead,
        Scopes.TodoWrite,
    ],
    // Optional: Limit available Expiry options that can be assigned to API Keys
    // ExpiresIn = [
    //     new("", "Never"),
    //     new("7", "7 days"),
    //     new("30", "30 days"),
    //     new("365", "365 days"),
    // ],
});
```

### Admin User API Keys

When the `ApiKeysFeature` plugin is registered, the [User Admin UI](https://docs.servicestack.net/admin-ui-identity-users) is enhanced to include the ability to create and manage API Keys for the user at the bottom of the **Edit User** form:

![](/img/posts/apikeys/admin-ui-user-apikeys.png)

#### Creating User API Keys

When creating API Keys, you can assign them a **Name**, their **Expiration** date and any **Scopes**, **Features** and **Notes**:

![](/img/posts/apikeys/admin-ui-user-apikeys-create.png)

### Restrict to APIs

`Scopes` provide a simple way to logically group a collection of related APIs behind UX-friendly names, without Users needing to know the behavior of each individual API.
In addition, Users who want fine-grained control can also restrict API Keys to only be able to access the specific APIs their systems make use of by selecting them from the **Restrict to APIs** option:

![](/img/posts/apikeys/apikeys-restrict-to.png)

#### One-time-only access of generated API Keys

All UIs limit access to the generated API Key token so that it's only accessible at the time of creation:

![](/img/posts/apikeys/admin-ui-user-apikeys-create-dialog.png)

#### Editing User API Keys

Everything about an API Key can be edited after it's created except for the generated API Key token itself, in addition to being able to cancel and revoke the API Key:

![](/img/posts/apikeys/admin-ui-user-apikeys-edit.png)

Invalid API Keys that have expired or have been disabled will appear disabled in the UI:

![](/img/posts/apikeys/admin-ui-user-apikeys-disabled.png)

## User Management API Keys

In addition to the built-in Admin UIs to manage API Keys, all Identity Auth Tailwind templates have also been updated to include support for managing API Keys in their User Account pages.

The templates aren't configured to use API Keys by default, but new projects can be configured to use API Keys by selecting the **API Keys** feature on the [Start Page](/start):

[![](/img/posts/apikeys/start-apikeys.png)](/start)

Or by mixing the `apikeys` project into your host project:

:::sh
x mix apikeys
:::

Which adds the `Configure.ApiKeys.cs` modular startup to your Host project, registering the `ApiKeysFeature` plugin where you'd use the `UserScopes` and `UserFeatures` collections to control which scopes and features Users can assign to their own API Keys, e.g:

```csharp
services.AddPlugin(new ApiKeysFeature {
    // Optional: Available Features Admin Users can assign to any API Key
    Features = [
        Features.Paid,
        Features.Tracking,
    ],
    // Optional: Available Scopes Admin Users can assign to any API Key
    Scopes = [
        Scopes.TodoRead,
        Scopes.TodoWrite,
    ],
    // Optional: Limit available Scopes Users can assign to their own API Keys
    UserScopes = [
        Scopes.TodoRead,
    ],
    // Optional: Limit available Features Users can assign to their own API Keys
    UserFeatures = [
        Features.Tracking,
    ],
});
```

### Identity Auth API Keys

When enabled, users will be able to create and manage their own API Keys from their Identity UI pages, which will use any configured `UserScopes` and `UserFeatures`:

![](/img/posts/apikeys/identity-auth-apikeys.png)

### Client Usage

Like most API Key implementations, API Keys can be passed in a [HTTP Authorization Bearer Token](https://datatracker.ietf.org/doc/html/rfc6750#section-2.1) that can be configured in ServiceStack Service Clients with:

#### C#

```csharp
var client = new JsonApiClient(BaseUrl) {
    BearerToken = apiKey
};
```

#### TypeScript

```ts
const client = new JsonServiceClient(BaseUrl)
client.bearerToken = apiKey
```

### API Key HTTP Header

Alternatively, API Keys can also be passed in the `X-Api-Key` HTTP Header, which allows clients to be configured with an alternative Bearer Token, allowing the same client to call both **Authenticated** and **API Key** protected APIs, e.g:

#### C#

```csharp
var client = new JsonApiClient(BaseUrl) {
    BearerToken = jwt,
    Headers = {
        [HttpHeaders.XApiKey] = apiKey
    }
};
```

#### TypeScript

```ts
const client = new JsonServiceClient(BaseUrl)
client.bearerToken = jwt
client.headers.set('X-Api-Key', apiKey)
```

Or use a different HTTP Header by configuring `ApiKeysFeature.HttpHeader`, e.g:

```csharp
services.AddPlugin(new ApiKeysFeature {
    HttpHeader = "X-Alt-Key"
});
```
# Support for RHEL 9's hardened cryptography policy

Source: https://servicestack.net/posts/rhel9-cryptography

A consequence of Red Hat Enterprise Linux 9's hardened [system-wide cryptographic policies](https://docs.redhat.com/en/documentation/red_hat_enterprise_linux/8/html/security_hardening/using-the-system-wide-cryptographic-policies_security-hardening) is that they're incompatible with ServiceStack's current licensing mechanism, which uses RSA encryption and the SHA1 hashing algorithm to protect and validate license keys. Unfortunately this makes it no longer possible to use License Keys to run unrestricted ServiceStack Apps on default installs of RHEL 9.

The difficulty is that we can't both support RHEL 9's hardened cryptography policy and maintain compatibility with being able to use newer License Keys on all previous versions of ServiceStack - vital for enabling frictionless rotation of License Keys. As it's a system-wide policy, we're unable to work around the restriction in the library to allow usage of RSA+SHA1 just to validate License Keys, which is the only place it's used.

As it initially only affected a small number of users, we recommended that they switch to [RHEL's Legacy Cryptography Policy](https://docs.redhat.com/en/documentation/red_hat_enterprise_linux/8/html/security_hardening/using-the-system-wide-cryptographic-policies_security-hardening) which allows for maximum compatibility with existing software.

### Road to Solution

![](/img/posts/rhel9-cryptography/bg-redhat.webp)

As more customers upgraded to RHEL 9 and started experiencing the same issue, we decided to invest the time to address it, starting with adding support for a configurable hashing algorithm when creating and validating License Keys.

That still leaves the issue of not being able to generate a new License Key that's compatible with both default RHEL 9 installs and all previous versions of ServiceStack. The solutions under consideration were:

- Generate a new License Key that's compatible with RHEL 9's hardened cryptography policy, but inform customers that they'll be unable to use the new License Key on their existing versions of ServiceStack and should continue to use their existing License Key for existing versions
- Generate 2 License Keys, and explain to Customers which key to use for previous versions of ServiceStack and which key to use for RHEL 9
- Provide a way for customers to regenerate their License Key to support RHEL 9's hardened cryptography policy

Since this issue only affected a minority of our Customers, we decided to go with the last option to avoid inflicting any additional complexity on the majority of our Customers who are unaffected by it.

### Generate License Key for RHEL 9+

Starting from ServiceStack v8.3+, Customers can regenerate a new License Key with a stronger **SHA512** Hash Algorithm that's compatible with RHEL 9's default hardened cryptography policy by visiting:

:::{.text-3xl .text-indigo-600}
https://account.servicestack.net/regenerate-license
:::

### Future

We'll need to wait at least 1-2 years before we can make the stronger Hash Algorithm the default, in order to reduce the impact of not being able to use new License Keys on versions of ServiceStack prior to **v8.2**. After the switch is made, regenerating license keys will no longer be necessary.
# Using ASP.NET Core Output Caching

Source: https://servicestack.net/posts/redis-outputcache

With the release of ServiceStack 8.1, we've embraced tighter integration with ASP.NET Core, including support for registering ServiceStack services with ASP.NET Core's Endpoint Routing system. This opens up exciting opportunities to leverage more of ASP.NET Core's rich feature set in your ServiceStack applications.

One such feature is ASP.NET Core's built-in support for Output Caching (also known as Response Caching). Output Caching allows you to dramatically improve the performance of your APIs by caching their output and serving it directly from the cache for subsequent requests. This can significantly reduce the load on your server and database for frequently accessed, cacheable responses.

## Enabling Output Caching

To utilize Output Caching with your ServiceStack Endpoints, you first need to add the Output Caching middleware to your ASP.NET Core request pipeline in `Program.cs`:

```csharp
// Program.cs
var builder = WebApplication.CreateBuilder(args);
// ...

var app = builder.Build();
app.UseOutputCache();
// ...
app.UseServiceStack(new AppHost(), options => options.MapEndpoints());
```

You also need to register the Output Caching services:

```csharp
builder.Services.AddOutputCache();
```

The order in which `UseOutputCache()` is added to your request pipeline can be sensitive, so where it goes will depend largely on your application and the other middleware you're already using. For example, below is an example of using it in a Blazor application:

```csharp
var app = builder.Build();

// Configure the HTTP request pipeline.
if (app.Environment.IsDevelopment())
{
    app.UseMigrationsEndPoint();
}
else
{
    app.UseExceptionHandler("/Error", createScopeForErrors: true);
    app.UseHsts();
}

app.UseHttpsRedirection();
app.UseStaticFiles();
app.UseAntiforgery();

// Add OutputCache after Antiforgery and before Auth related middleware
app.UseOutputCache(); // Required for OutputCache

app.UseAuthentication();
app.UseAuthorization();

app.MapRazorComponents<App>()
    .AddInteractiveServerRenderMode();

// Add additional endpoints required by the Identity /Account Razor components.
app.MapAdditionalIdentityEndpoints();

app.UseServiceStack(new AppHost(), options =>
{
    options.MapEndpoints();
});
```

## Configuring Caching Behavior

With the middleware in place, you can now configure caching behaviors for your ServiceStack Endpoints by registering against the Route Handlers within the ServiceStack `options`:
```csharp
app.UseServiceStack(new AppHost(), options =>
{
    options.MapEndpoints();
    options.RouteHandlerBuilders.Add((routeHandlerBuilder, operation, verb, route) =>
    {
        routeHandlerBuilder.CacheOutput(c =>
        {
            // Configure caching per-endpoint
            c.Expire(TimeSpan.FromSeconds(30));
        });
    });
});
```

You can also vary the cache by specific query string parameters, e.g:

```csharp
builder.CacheOutput(c => c.SetVaryByQuery("userRole","region"));
```

Or define named policies for reusable caching strategies:

```csharp
builder.Services.AddOutputCache(options =>
{
    options.AddPolicy("Default30", p => p.Expire(TimeSpan.FromSeconds(30)));
});
```

Then apply the named policy to your endpoints:

```csharp
builder.CacheOutput("Default30");
```

## Finer-grained Control

For more granular control, you can apply the `[OutputCache]` attribute directly on your Service class, then use the ServiceStack AppHost metadata in your `RouteHandlerBuilders` to detect and cache only the routes of Services attributed with `[OutputCache]`:

```csharp
app.UseServiceStack(new AppHost(), options =>
{
    options.MapEndpoints();
    options.RouteHandlerBuilders.Add((routeHandlerBuilder, operation, verb, route) =>
    {
        // Find the Service implementing the RequestType of the operation
        var appHost = HostContext.AppHost;
        var operationType = operation.RequestType;
        appHost.Metadata.OperationsMap.TryGetValue(operationType, out var operationMap);
        var serviceType = operationMap?.ServiceType;
        if (serviceType == null)
            return;
        if (serviceType.HasAttributeOf<OutputCacheAttribute>())
        {
            // Apply the duration from the OutputCacheAttribute
            var outputCacheAttribute = serviceType.FirstAttribute<OutputCacheAttribute>();
            routeHandlerBuilder.CacheOutput(policyBuilder => {
                policyBuilder.Cache().Expire(TimeSpan.FromSeconds(outputCacheAttribute.Duration));
            });
        }
    });
});
```

```csharp
[OutputCache(Duration = 60)]
public class MyServices : Service
{
    public object Any(Hello request)
    {
        return new HelloResponse { Result = $"Hello, {request.Name}!" };
    }
}
```

This enables fine-grained control of the built-in `OutputCache` functionality, compatible with using the same attribute on your MVC Controllers, and you can extend its use by updating the code above within the ServiceStack options.

## ServiceStack Redis Distributed Cache

The examples so far have used the in-memory cache store that comes with the OutputCache package, which isn't suitable for a distributed application. Thankfully, you can override the `IOutputCacheStore` interface in your IoC to swap in an implementation that uses a centralized store like a Redis server:

```csharp
public class RedisOutputCacheStore(IRedisClientsManager redisManager) : IOutputCacheStore
{
    public async ValueTask<byte[]?> GetAsync(string key, CancellationToken cancellationToken)
    {
        await using var redis = await redisManager.GetClientAsync(token: cancellationToken);
        var value = await redis.GetAsync<byte[]>(key, cancellationToken);
        return value;
    }

    public async ValueTask SetAsync(string key, byte[] value, string[]? tags,
        TimeSpan validFor, CancellationToken cancellationToken)
    {
        await using var redis = await redisManager.GetClientAsync(token: cancellationToken);

        // First persist the entry in the cache
        await redis.SetAsync(key, value, validFor, cancellationToken);
        if (tags == null)
            return;

        // Track the entry's key against each of its tags
        foreach (var tag in tags)
        {
            await redis.AddItemToSetAsync($"tag:{tag}", key, cancellationToken);
        }
    }

    public async ValueTask EvictByTagAsync(string tag, CancellationToken cancellationToken)
    {
        await using var redis = await redisManager.GetClientAsync(token: cancellationToken);
        var keys = await redis.GetAllItemsFromSetAsync($"tag:{tag}", cancellationToken);
        foreach (var key in keys)
        {
            await redis.RemoveEntryAsync(key);
            await redis.RemoveItemFromSetAsync($"tag:{tag}", key, cancellationToken);
        }
    }
}
```
## ServiceStack Redis Distributed Cache

The examples so far have used the default in-memory cache store that comes with the Output Caching package, which isn't suitable for a distributed application. Thankfully, you can replace the `IOutputCacheStore` implementation registered in your IoC with one backed by a centralized system like a Redis server:

```csharp
public class RedisOutputCacheStore(IRedisClientsManager redisManager) : IOutputCacheStore
{
    public async ValueTask<byte[]?> GetAsync(string key, CancellationToken cancellationToken)
    {
        await using var redis = await redisManager.GetClientAsync(token: cancellationToken);
        var value = await redis.GetAsync<byte[]>(key, cancellationToken);
        return value;
    }

    public async ValueTask SetAsync(string key, byte[] value, string[]? tags, TimeSpan validFor,
        CancellationToken cancellationToken)
    {
        await using var redis = await redisManager.GetClientAsync(token: cancellationToken);

        // Persist the cache entry with its expiry
        await redis.SetAsync(key, value, validFor, cancellationToken);
        if (tags == null)
            return;

        // Track the entry against each tag so it can be evicted by tag
        foreach (var tag in tags)
        {
            await redis.AddItemToSetAsync($"tag:{tag}", key, cancellationToken);
        }
    }

    public async ValueTask EvictByTagAsync(string tag, CancellationToken cancellationToken)
    {
        await using var redis = await redisManager.GetClientAsync(token: cancellationToken);
        var keys = await redis.GetAllItemsFromSetAsync($"tag:{tag}", cancellationToken);
        foreach (var key in keys)
        {
            await redis.RemoveEntryAsync(key);
            await redis.RemoveItemFromSetAsync($"tag:{tag}", key, cancellationToken);
        }
    }
}
```

The above is a simple implementation of `IOutputCacheStore` using the ServiceStack.Redis client as a centralized distributed cache. Using the class above, we can create a `Configure.OutputCache.cs` file that registers our IoC dependencies:

```csharp
[assembly: HostingStartup(typeof(BlazorOutputCaching.ConfigureOutputCache))]

namespace BlazorOutputCaching;

public class ConfigureOutputCache : IHostingStartup
{
    public void Configure(IWebHostBuilder builder)
    {
        builder.ConfigureServices(services =>
        {
            services.AddSingleton<IRedisClientsManager>(c =>
                new BasicRedisClientManager("localhost:6379"));
            services.AddSingleton<IOutputCacheStore, RedisOutputCacheStore>();
        });
    }
}
```

Here we register our Redis client manager used by the `RedisOutputCacheStore`, and then the store itself.
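Tag-based eviction is what makes this store useful in practice — e.g. you could tag cached endpoints and expose a hypothetical purge endpoint to invalidate them all at once (a sketch, not part of the template; `GetProducts()` is illustrative):

```csharp
// Sketch: tag cached responses, then evict them all at once by tag
app.MapGet("/products", () => GetProducts()) // GetProducts() is illustrative
   .CacheOutput(c => c.Expire(TimeSpan.FromMinutes(5)).Tag("products"));

// Hypothetical admin endpoint that purges all "products" cache entries
app.MapPost("/admin/purge-products", async (IOutputCacheStore cache, CancellationToken ct) =>
    await cache.EvictByTagAsync("products", ct));
```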
## Summary

ASP.NET Core Output Caching is a powerful tool for improving the performance of your ServiceStack endpoints. With ServiceStack 8.1's tight integration with ASP.NET Core Endpoint Routing, utilizing this feature is now straightforward.

As always, caching is a balancing act. Apply it judiciously to frequently accessed, cacheable data, and be sure to implement appropriate invalidation strategies to keep your application's data fresh.

By leveraging Output Caching effectively, you can dramatically improve the scalability and responsiveness of your ServiceStack powered applications. Try it out in your ServiceStack 8.1+ projects and let us know how it goes!

# Using ASP.NET Core Rate Limiter Middleware

Source: https://servicestack.net/posts/asp-rate-limiter-middleware

## Introduction

Rate limiting is an important technique for protecting web APIs and applications from excessive traffic and abuse. By throttling the number of requests a client can make in a given time period, rate limiting helps ensure fair usage, maintains performance and availability, and defends against denial-of-service attacks.

ASP.NET Core provides built-in middleware for rate limiting based on client IP address or client ID. And now with the latest release, ServiceStack has added support for ASP.NET Core endpoints, making it possible to leverage the same rate limiting middleware across all ASP.NET Core endpoints, including ServiceStack APIs.

In this post, we'll look at how to enable rate limiting in an ASP.NET Core app using the standard middleware. Then we'll explore some more advanced options to fine-tune the rate limiting behavior. Finally, we'll see how to implement per-user rate limiting for multi-tenant SaaS applications using ASP.NET Core Identity and ServiceStack Mapped Endpoints.

## Setting Up Rate Limiting

To get started, let's enable the basic rate limiting middleware in an ASP.NET Core application:

1. Install the `Microsoft.AspNetCore.RateLimiting` NuGet package

2. In `Program.cs`, add `AddRateLimiter` to register the rate limiting services:

```csharp
builder.Services.AddRateLimiter(options =>
{
    options.GlobalLimiter = PartitionedRateLimiter.Create<HttpContext, string>(httpContext =>
        RateLimitPartition.GetFixedWindowLimiter(
            partitionKey: httpContext.User.Identity?.Name ?? httpContext.Request.Headers.Host.ToString(),
            factory: partition => new FixedWindowRateLimiterOptions
            {
                AutoReplenishment = true,
                PermitLimit = 100,
                QueueLimit = 0,
                Window = TimeSpan.FromMinutes(1)
            }));

    options.OnRejected = (context, cancellationToken) =>
    {
        if (context.Lease.TryGetMetadata(MetadataName.RetryAfter, out var retryAfter))
        {
            context.HttpContext.Response.Headers.RetryAfter = retryAfter.TotalSeconds.ToString();
        }
        context.HttpContext.Response.StatusCode = StatusCodes.Status429TooManyRequests;
        context.HttpContext.Response.WriteAsync("Too many requests. Please try again later.");
        return ValueTask.CompletedTask;
    };
});
```

This sets up a fixed window rate limiter with a limit of 100 requests per minute, partitioned by either the authenticated username or the client host name.

3. Add the `UseRateLimiter` middleware to the pipeline:

```csharp
app.UseRateLimiter();
```

With this basic setup, the API is now protected from excessive requests from individual clients. If a client exceeds the limit of 100 requests/minute, subsequent requests will receive a `HTTP 429 Too Many Requests` response.

## Advanced Options

The rate limiting middleware provides several options to customize the behavior:

- `PermitLimit` - The maximum number of requests allowed in the time window
- `QueueLimit` - The maximum number of requests that can be queued when the limit is exceeded. Set to 0 to disable queueing.
- `Window` - The time window for the limit, e.g. 1 minute, 1 hour, etc.
- `AutoReplenishment` - Whether the rate limit should reset automatically at the end of each window

For example, to allow short bursts but constrain the average rate, we could use a sliding window algorithm instead:

```csharp
options.GlobalLimiter = PartitionedRateLimiter.Create<HttpContext, string>(httpContext =>
    RateLimitPartition.GetSlidingWindowLimiter(
        partitionKey: httpContext.User.Identity?.Name ?? httpContext.Request.Headers.Host.ToString(),
        factory: partition => new SlidingWindowRateLimiterOptions
        {
            AutoReplenishment = true,
            PermitLimit = 100,
            QueueLimit = 25,
            Window = TimeSpan.FromMinutes(1),
            SegmentsPerWindow = 4
        }));

options.OnRejected = (context, cancellationToken) =>
{
    if (context.Lease.TryGetMetadata(MetadataName.RetryAfter, out var retryAfter))
    {
        context.HttpContext.Response.Headers.RetryAfter = retryAfter.TotalSeconds.ToString();
    }
    context.HttpContext.Response.StatusCode = StatusCodes.Status429TooManyRequests;
    context.HttpContext.Response.WriteAsync("Too many requests. Please try again later.");
    return ValueTask.CompletedTask;
};
```

This allows up to 100 requests per minute on average, with the ability to burst up to 25 additional requests which are queued.
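The other built-in algorithms can be swapped in the same way — e.g. a token bucket limiter, shown here as a sketch reusing the same partitioning scheme:

```csharp
// Sketch: token bucket limiter - allows bursts up to the bucket capacity,
// with tokens steadily refilled at a fixed rate
options.GlobalLimiter = PartitionedRateLimiter.Create<HttpContext, string>(httpContext =>
    RateLimitPartition.GetTokenBucketLimiter(
        partitionKey: httpContext.User.Identity?.Name ?? httpContext.Request.Headers.Host.ToString(),
        factory: partition => new TokenBucketRateLimiterOptions
        {
            AutoReplenishment = true,
            TokenLimit = 100,                               // bucket capacity
            TokensPerPeriod = 20,                           // refill amount
            ReplenishmentPeriod = TimeSpan.FromSeconds(10), // refill interval
            QueueLimit = 0
        }));
```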
## Per-User Rate Limiting for SaaS Applications

In a typical SaaS application, each user or tenant may have a different subscription plan that entitles them to a certain level of API usage. We can implement this per-user rate limiting by leveraging ASP.NET Core Identity to authenticate users and retrieve their plan details, and then configuring the rate limiter accordingly.

First, ensure you have ASP.NET Core Identity set up in your application to handle user authentication. Then, add a property to your user class to store the rate limit for each user based on their plan:

```csharp
public class ApplicationUser : IdentityUser
{
    public int RateLimit { get; set; }
}
```

Next, update the rate limiter configuration to partition by user and read the rate limit from the user's plan:

```csharp
builder.Services.AddRateLimiter(options =>
{
    options.AddPolicy("per-user", context =>
    {
        var user = context.User.Identity?.Name;
        if (string.IsNullOrEmpty(user))
        {
            // Fallback to host name for unauthenticated requests
            return RateLimitPartition.GetFixedWindowLimiter(
                partitionKey: context.Request.Headers.Host.ToString(),
                factory: partition => new FixedWindowRateLimiterOptions
                {
                    AutoReplenishment = true,
                    PermitLimit = 100,
                    QueueLimit = 0,
                    Window = TimeSpan.FromMinutes(1)
                });
        }

        // Get the authenticated user's rate limit from their plan
        var userId = context.User.FindFirstValue(ClaimTypes.NameIdentifier);
        var userManager = context.RequestServices.GetService<UserManager<ApplicationUser>>();
        var appUser = userManager.FindByIdAsync(userId).Result;
        var rateLimit = appUser?.RateLimit ?? 0;

        // Create a user-specific rate limiter
        return RateLimitPartition.GetFixedWindowLimiter(
            partitionKey: user,
            factory: partition => new FixedWindowRateLimiterOptions
            {
                AutoReplenishment = true,
                PermitLimit = rateLimit,
                QueueLimit = 0,
                Window = TimeSpan.FromMinutes(1)
            });
    });
});
```

This configuration first checks if the request is authenticated. If not, it falls back to the default host-based rate limiting.

For authenticated requests, it retrieves the user ID from the authentication claims and looks up the user in the ASP.NET Core Identity `UserManager`. It then reads the `RateLimit` property from the user object, which should be set based on the user's subscription plan.
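How `RateLimit` gets populated is up to your registration or billing workflow — e.g. a sketch that derives it from a hypothetical plan name whenever a user signs up or changes plans:

```csharp
// Sketch: assign a user's RateLimit from a hypothetical subscription plan
public static int GetRateLimitForPlan(string plan) => plan switch
{
    "Enterprise" => 10_000, // requests/minute
    "Pro"        => 1_000,
    _            => 100,    // Free tier default
};

// e.g. on registration or plan change:
// user.RateLimit = GetRateLimitForPlan(user.Plan);
// await userManager.UpdateAsync(user);
```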
Finally, it creates a user-specific rate limiter using the `PartitionedRateLimiter` with the user's name as the partition key and their personal rate limit as the `PermitLimit`.

With this setup, each user will be rate limited independently based on their plan allowance. If a user exceeds their personal limit, they will receive a `429 Too Many Requests` response, while other users can continue making requests up to their own limits.

Not only that, our rate limiting is consistent across ASP.NET Core Endpoints regardless of how they are implemented, be it ServiceStack APIs, MVC Controllers, Minimal APIs etc.

If you do need to target ServiceStack APIs with a specific policy, you can register a named policy and apply it when calling `UseServiceStack`:

```csharp
services.AddRateLimiter(options =>
{
    // Policy name "per-user" is used by ServiceStack Mapped Endpoints
    options.AddPolicy("per-user", context =>
    {
        var user = context.User.Identity?.Name;
        if (string.IsNullOrEmpty(user))
        {
            // Fallback to host name for unauthenticated requests
            return RateLimitPartition.GetFixedWindowLimiter(
                partitionKey: context.Request.Headers.Host.ToString(),
                factory: partition => new FixedWindowRateLimiterOptions
                {
                    AutoReplenishment = true,
                    PermitLimit = 100,
                    QueueLimit = 0,
                    Window = TimeSpan.FromMinutes(1)
                });
        }

        // Get the authenticated user's rate limit from their plan
        var userId = context.User.FindFirstValue(ClaimTypes.NameIdentifier);
        var userManager = context.RequestServices.GetService<UserManager<ApplicationUser>>();
        var appUser = userManager.FindByIdAsync(userId).Result;
        var rateLimit = appUser?.RateLimit ?? 0;

        // Create a user-specific rate limiter
        return RateLimitPartition.GetFixedWindowLimiter(
            partitionKey: user,
            factory: partition => new FixedWindowRateLimiterOptions
            {
                AutoReplenishment = true,
                PermitLimit = rateLimit,
                QueueLimit = 0,
                Window = TimeSpan.FromMinutes(1)
            });
    });
    // ...
});

//...

// Make sure to call UseRateLimiter
app.UseRateLimiter();

// Specify which policy is used by ServiceStack Mapped Endpoints
app.UseServiceStack(new AppHost(), options =>
{
    options.MapEndpoints();
    options.RouteHandlerBuilders.Add((routeBuilder, operation, method, route) =>
    {
        routeBuilder.RequireRateLimiting(policyName: "per-user");
    });
});
```

By combining ASP.NET Core rate limiting with ASP.NET Core Identity in this way, you can implement flexible, per-user rate limiting suitable for multi-tenant SaaS applications. The same approach can be extended to handle different rate limits for different API endpoints or user roles as needed.

By upgrading ServiceStack to use ASP.NET Core Endpoints, you can now leverage the same rate limiting middleware across all ASP.NET Core Endpoints, including ServiceStack APIs.

# Kotlin Compose Multiplatform with end-to-end typed Kotlin & C# APIs

Source: https://servicestack.net/posts/kotlin-compose-multiplatform

The last few years of neglect have slid Xamarin into irrelevance, removing it from consideration in the already short list of viable options for creating native multi-platform iOS, Android and Desktop Apps, which leaves just Flutter and React Native as the only viable options.

Thanks to the vast language ecosystem covered by [Add ServiceStack Reference](https://docs.servicestack.net/add-servicestack-reference), whichever technology you end up choosing to develop native Mobile and Desktop Apps with, you'll always be able to develop with the productivity and type safety benefits of end-to-end typed APIs in your preferred language, whether it's [TypeScript](https://docs.servicestack.net/typescript-add-servicestack-reference) or [JavaScript](https://docs.servicestack.net/javascript-add-servicestack-reference) for React Native, [Dart](https://docs.servicestack.net/dart-add-servicestack-reference) for Flutter, [Java](https://docs.servicestack.net/java-add-servicestack-reference) or [Kotlin](https://docs.servicestack.net/kotlin-add-servicestack-reference) for Android, or [Swift](https://docs.servicestack.net/swift-add-servicestack-reference) for iOS.

Fortunately JetBrains has stepped in to fill the void with Compose Multiplatform, offering a modern alternative for creating native Mobile, Desktop & Web Apps which can also leverage [Kotlin ServiceStack Reference](https://docs.servicestack.net/kotlin-add-servicestack-reference) for end-to-end typed APIs.

[Compose Multiplatform](https://www.jetbrains.com/lp/compose-multiplatform/) builds on [Jetpack Compose](https://developer.android.com/jetpack/compose) - Google's modern toolkit for building native Android UIs - bringing it to more platforms, including Windows, macOS and Linux Desktops, Web UIs with [Kotlin Wasm](https://kotlinlang.org/docs/wasm-overview.html) and iOS with [Kotlin/Native](https://kotlinlang.org/docs/native-overview.html).
We'll look at the latest [Compose Multiplatform v1.6 Release](https://blog.jetbrains.com/kotlin/2024/02/compose-multiplatform-1-6-0-release/) and use it to build a cross-platform Desktop App integrated with a .NET API backend, utilizing [Kotlin ServiceStack Reference](https://docs.servicestack.net/kotlin-add-servicestack-reference) to generate Kotlin DTOs that can be used with the generic ServiceStack Java `JsonServiceClient` to enable its end-to-end typed API integration.

### JVM Platform Required

Whilst Compose Multiplatform supports both JVM and non-JVM platforms, targeting a non-JVM platform is very limited as you won't be able to reference and use any Java packages like ServiceStack's Java Client library in `net.servicestack:client`, which is required for this example utilizing [Kotlin ServiceStack Reference](https://docs.servicestack.net/kotlin-add-servicestack-reference) typed Kotlin DTOs.

## Compose Multiplatform iOS & Android Apps

:::youtube r6T3B7o1GYE
JetBrains Compose Multiplatform iOS & Android Apps
:::

The quickest way to a working Compose Multiplatform App integrated with a .NET API backend is to create a new project from the Compose Desktop template:

Create a new Compose Desktop App

Create a new Kotlin Multiplatform App with your preferred project name:

Or install from the command-line with the [x dotnet tool](https://docs.servicestack.net/dotnet-tool):

:::sh
x new kmp-desktop MyApp
:::

### Install JetBrains IDE

As a JetBrains technology, you're spoilt for choice for which IDE to use.

#### Android Studio

If you're primarily developing for Android, Android Studio is likely the best option, which you can setup by following their [Getting Started with Android Studio](https://www.jetbrains.com/help/kotlin-multiplatform-dev/compose-multiplatform-setup.html) guide.

#### JetBrains Fleet

Otherwise if you're primarily developing a Desktop App, it's recommended to use [Fleet](https://www.jetbrains.com/fleet/) - JetBrains' alternative to VS Code as a lightweight IDE for Kotlin Multiplatform Development. It's the preferred IDE when developing against a .NET API as you can develop both the Kotlin front-end UI and the backend .NET APIs from a single IDE.

To get setup with Fleet, follow the [Getting Started with JetBrains Fleet](https://www.jetbrains.com/help/kotlin-multiplatform-dev/fleet.html) guide.

### Open Project with Fleet

Once you've installed Fleet, you can open your Desktop App project by opening the folder in the Fleet IDE, or like VS Code you can launch it to open your Project's folder from the command-line with:

:::sh
fleet MyApp
:::

### Setup Fleet

When first opening Fleet you'll start with an empty canvas. I'd recommend adding the **Files** tool window on the left panel to manage the Kotlin UI and the **Solution** tool window on the bottom left panel to manage the .NET API backend.

### Run .NET API and Kotlin Desktop App

Once setup, you can run both the Desktop App and the .NET API backend from the Run Dialog with the `Ctrl+R` keyboard shortcut, or by clicking on the play button icon in the top menu bar:

![](/img/posts/kotlin-compose-multiplatform/fleet-run.webp)

You'll want to run the .NET API backend first by selecting your Project Name, which should launch your browser at `https://localhost:5001`, then launch the Desktop App by selecting the **composeApp [Desktop]** configuration, which should launch a working Desktop App that calls your .NET API on each keystroke to search for matching files in your project:

![](/img/posts/kotlin-compose-multiplatform/search-files-app.webp)

The majority of the UI is maintained in [/commonMain/kotlin/App.kt](https://github.com/NetCoreTemplates/kmp-desktop/blob/main/kmp/composeApp/src/commonMain/kotlin/App.kt) created using Jetpack Compose's declarative Kotlin UI.

### Update Kotlin DTOs

The typed Kotlin DTOs for your .NET APIs are generated in [dtos.kt](https://github.com/NetCoreTemplates/kmp-desktop/blob/main/kmp/composeApp/src/commonMain/kotlin/dtos.kt), which can be regenerated by running **Update DTOs** in the Run Dialog.
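For context, these Kotlin DTOs are generated from your .NET API's C# Request/Response DTOs — e.g. an illustrative Hello API (names are representative, not the template's exact definitions):

```csharp
// Illustrative C# DTOs that "Add ServiceStack Reference" converts
// into equivalent typed Kotlin DTOs in dtos.kt
[Route("/hello/{Name}")]
public class Hello : IReturn<HelloResponse>
{
    public string? Name { get; set; }
}

public class HelloResponse
{
    public string? Result { get; set; }
}
```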
Alternatively they can also be regenerated by running the `dtos` npm script from the command-line in your .NET Host project:

:::sh
npm run dtos
:::

#### Android Studio

If you're using Android Studio, you can also install the [ServiceStack Plugin](https://plugins.jetbrains.com/plugin/7749-servicestack) from the JetBrains Marketplace:

![](/img/posts/kotlin-compose-multiplatform/android-studio-plugins.webp)

Which provides an **Add ServiceStack Reference** UI on the Context Menu, by right-clicking the folder where you want the DTOs generated:

![](/img/posts/kotlin-compose-multiplatform/add-servicestack-reference-dialog.webp)

Then to update, just right-click `dtos.kt` and click **Update ServiceStack Reference** on the context menu:

![](/img/posts/kotlin-compose-multiplatform/update-servicestack-reference-dialog.webp)

### Command Line

For any other Text Editors or IDEs, a Kotlin ServiceStack Reference can also be added from the command-line using the [x dotnet tool](https://docs.servicestack.net/dotnet-tool) by specifying the BaseUrl where the ServiceStack APIs are hosted, e.g:

:::sh
x kotlin https://localhost:5001
:::

To update and regenerate all Kotlin DTOs within a folder, run:

:::sh
x kotlin
:::

## Create a new Kotlin Multiplatform App from Scratch

For a customized Compose Multiplatform App, you can create a new App with the [Kotlin Multiplatform Wizard](https://kmp.jetbrains.com) with just the options you need:

[![](/img/posts/kotlin-compose-multiplatform/kmp-wizard.webp)](https://kmp.jetbrains.com)

Which you can download into an empty Web Project:

:::sh
x new web MyApp
:::

Then open the folder with both the Kotlin Multiplatform and .NET Web App in Fleet:

:::sh
fleet MyApp
:::

# New React SPA Template

Source: https://servicestack.net/posts/net8-react-spa-template

## ServiceStack React SPA Template

Just as we've enhanced the built-in ASP.NET Core Vue SPA template with the new [ServiceStack Vue SPA template](/posts/net8-vue-spa-template), we've also enhanced the built-in ASP.NET Core React SPA template with the new TypeScript [Vite React SPA template](https://react-spa.web-templates.io) with many new value-added and high-productivity features.

Vite React SPA Template

Explore the high productivity features in the new ServiceStack React SPA template

:::{.text-center}
## Live Demo
:::

:::{.shadow .pb-1}
[![](https://raw.githubusercontent.com/ServiceStack/Assets/master/csharp-templates/react-spa.png)](https://react-spa.web-templates.io)
:::

## ASP.NET Core React SPA Template

The [React and ASP.NET Core](https://learn.microsoft.com/en-us/visualstudio/javascript/tutorial-asp-net-core-with-react) template provides a seamless starting solution which runs both the .NET API backend and Vite React frontend during development.

It's a modern template enabling an excellent developer workflow for .NET React Apps, configured with Vite's fast HMR (Hot Module Reload) and TypeScript support with TSX, enabling development of concise and expressive type-safe components.

### Minimal API integration

Whilst a great starting point, it's still only a basic template configured with a bare-bones React Vite App that's modified to show an example of calling a Minimal API.

### Built-in API Integration

Although the approach used isn't very scalable, with a proxy rule needed for every user-defined API route:

```ts
export default defineConfig({
    //...
    server: {
        proxy: {
            '^/weatherforecast': { target, secure: false }
        },
    }
})
```

And the need for hand maintained Types to describe the shape of the API responses with [Stringly Typed](https://wiki.c2.com/?StringlyTyped) fetch API calls referencing **string** routes:

```ts
interface Forecast {
    date: string;
    temperatureC: number;
    temperatureF: number;
    summary: string;
}

function App() {
    const [forecasts, setForecasts] = useState<Forecast[]>();

    useEffect(() => {
        populateWeatherData();
    }, []);
    //...
}

async function populateWeatherData() {
    const response = await fetch('weatherforecast');
    const data = await response.json();
    setForecasts(data);
}
```

Which is used to render the API response in a hand rolled table:

```tsx
function App() {
    //...
    const contents = forecasts === undefined

        ? <p><em>Loading... Please refresh once the ASP.NET backend has started. See jsps for more details.</em></p>
        : <table className="table table-striped" aria-labelledby="tableLabel">
            <thead>
                <tr>
                    <th>Date</th>
                    <th>Temp. (C)</th>
                    <th>Temp. (F)</th>
                    <th>Summary</th>
                </tr>
            </thead>
            <tbody>
                {forecasts.map(forecast =>
                    <tr key={forecast.date}>
                        <td>{forecast.date}</td>
                        <td>{forecast.temperatureC}</td>
                        <td>{forecast.temperatureF}</td>
                        <td>{forecast.summary}</td>
                    </tr>
                )}
            </tbody>
        </table>;
}
```

### ServiceStack API Integration

Fortunately ServiceStack can significantly improve this development experience with the [/api pre-defined route](https://docs.servicestack.net/endpoint-routing#api-pre-defined-route) where only a single proxy rule is needed to proxy all APIs:

```ts
export default defineConfig({
    //...
    server: {
        proxy: {
            '^/api': { target, secure: false }
        },
    }
})
```

### End-to-end Typed APIs

Instead of hand-rolled types and Stringly Typed API calls, it utilizes server [generated TypeScript DTOs](https://docs.servicestack.net/typescript-add-servicestack-reference) with a generic JsonServiceClient to enable end-to-end Typed APIs:

```ts
import { useState, useEffect } from "react"
import { useClient } from "@/gateway"
import { GetWeatherForecast } from "@/dtos"

const client = useClient()
const [forecasts, setForecasts] = useState<Forecast[]>([])

useEffect(() => {
    (async () => {
        const api = await client.api(new GetWeatherForecast())
        if (api.succeeded) {
            setForecasts(api.response!)
        }
    })()
}, [])
```

This results in less code to maintain, immediate static typing analysis to ensure correct usage of APIs, and valuable feedback when APIs are changed, which is easily updated with a single command:

:::sh
npm run dtos
:::

### React Component Ecosystem

Given its popularity, React has arguably the richest ecosystem of freely available libraries and components; a good example is the popular [shadcn/ui](https://ui.shadcn.com) Tailwind components. Unlike most libraries, their source is copied piecemeal into your project where it's locally modifiable, instead of being referenced as an immutable package.

As they're just blueprints, they're not dependent on a single library and will utilize the best library to implement each component as needed. E.g. the [Data Table](https://ui.shadcn.com/docs/components/data-table) component documents how to implement your own Data Table utilizing the headless [TanStack Table](https://tanstack.com/table/latest) - a version of which we've built into [DataTable.tsx](https://github.com/NetCoreTemplates/react-spa/blob/main/MyApp.Client/src/components/DataTable.tsx) which is used in the template to implement both complex CRUD UIs and [weather.tsx](https://github.com/NetCoreTemplates/react-spa/blob/main/MyApp.Client/src/pages/weather.tsx) simple table results:

```tsx
import { columnDefs, DataTable, getCoreRowModel } from "@/components/DataTable.tsx"

const columns = columnDefs(['date', 'temperatureC', 'temperatureF', 'summary'],
    ({ temperatureC, temperatureF }) => {
        temperatureC.header = "Temp. (C)"
        temperatureF.header = "Temp. (F)"
        temperatureC.cell = temperatureF.cell = ({ getValue }) => <>{getValue()}°</>
    })

return (<DataTable columns={columns} data={forecasts} getCoreRowModel={getCoreRowModel()} />)
```

To render the [/weather](https://react-spa.web-templates.io/weather) customized Data Table:

:::{.mx-auto .max-w-lg .shadow .rounded}
[![](/img/posts/net8-react-spa-template/data-table.png)](https://react-spa.web-templates.io/weather)
:::

The template also includes customizable [Form.tsx](https://github.com/NetCoreTemplates/react-spa/blob/main/MyApp.Client/src/components/Form.tsx) Input components which can be used to create beautiful validation-bound forms which effortlessly integrate with ServiceStack's [Error Handling](https://docs.servicestack.net/error-handling) and [Declarative Validation](https://docs.servicestack.net/declarative-validation) attributes.
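On the server, those validation rules live on your C# Request DTOs — e.g. an illustrative DTO using ServiceStack's declarative validation attributes (names representative, not from the template):

```csharp
// Illustrative Request DTO: validation errors from these attributes are
// returned in the API's ResponseStatus which the Form Input components bind to
public class CreateContact : IReturn<IdResponse>
{
    [ValidateNotEmpty]
    public string Name { get; set; }

    [ValidateEmail]
    public string Email { get; set; }
}
```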
## ServiceStack React SPA Features

Other high-productivity features available in the ServiceStack React SPA template include:

### Integrated Identity Auth

Pre-configured with ASP.NET Core Identity Auth, including Sign In and Custom Registration APIs and UI Pages which can be customized as needed, along with examples of Role-based security and a turn key solution for integrating the Identity Auth Registration workflow with your [SMTP Provider](https://docs.servicestack.net/auth/identity-auth#smtp-iemailsender), with all emails sent from a managed non-blocking [Background MQ](https://docs.servicestack.net/background-mq) for optimal responsiveness and execution.

### tailwindcss

[Tailwind](https://tailwindcss.com) has quickly become the best modern CSS framework for creating scalable, [mobile-first](https://tailwindcss.com/#mobile-first) responsive websites built upon a beautiful expert-crafted constraint-based [Design System](https://tailwindcss.com/#constraint-based) that enables effortless reuse of a growing suite of [Free Community](https://tailwindcomponents.com) and professionally-designed [Tailwind UI Component](https://tailwindui.com) Libraries, invaluable for quickly creating beautiful websites.

[![](/img/pages/blazor/tailwindui.png)](https://tailwindcss.com)

In addition to revolutionizing how we style mobile-first responsive Apps, Tailwind's [Dark Mode](https://tailwindcss.com/#dark-mode) does the same for enabling dark mode, a feature supported throughout the template and its Tailwind UI Components.

[![](/img/posts/net8-react-spa-template/dark-mode.png)](https://tailwindcss.com/#dark-mode)

### Built for Productivity

So that you're immediately productive out-of-the-box, the template includes a rich set of high-productivity features, including:

| | |
|---------------------------------------------------------------------|--------------------------------------------------------------|
| [tailwind/typography](https://tailwindcss-typography.vercel.app) | Beautiful css typography for markdown articles & blog posts |
| [tailwind/forms](https://github.com/tailwindlabs/tailwindcss-forms) | Beautiful css form & input styles that's easily overridable |
| [Markdown](https://mdxjs.com/docs/getting-started/) | Native [mdx](https://mdxjs.com) Markdown integration |
| [React Router](https://reactrouter.com) | Full featured routing library for React |
| [plugin/press](https://github.com/ServiceStack/vite-plugin-press) | Static markdown for creating blogs, videos and other content |
| [plugin/pages](https://github.com/hannoeru/vite-plugin-pages) | Conventional file system based routing for Vite |
| [plugin/svg](https://github.com/pd4d10/vite-plugin-svgr) | Load SVG files as React components |
| [Iconify](https://iconify.design) | Unified registry to access 100k+ high quality SVG icons |

### Bookings CRUD Pages

The [Bookings CRUD example](https://react-spa.web-templates.io/bookings-crud) shows how you can utilize a customized Data Table and the template's Form components to create a beautifully styled CRUD UI with minimal effort.
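On the server side, CRUD UIs like this are driven by plain AutoQuery APIs — a representative sketch (DTO names are illustrative, not the template's exact definitions):

```csharp
// Representative AutoQuery CRUD APIs a Bookings UI can bind to
public class QueryBookings : QueryDb<Booking> {}

public class CreateBooking : ICreateDb<Booking>, IReturn<IdResponse>
{
    [ValidateNotEmpty]
    public string Name { get; set; }
    public decimal Cost { get; set; }
}

public class DeleteBooking : IDeleteDb<Booking>, IReturnVoid
{
    public int Id { get; set; }
}
```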
## Vite Press Plugin

[![](https://images.unsplash.com/photo-1524668951403-d44b28200ce0?crop=entropy&fit=crop&h=384&w=768)](https://vue-spa.web-templates.io/posts/vite-press-plugin)

Most Apps typically have a mix of dynamic functionality and static content, which in our experience is best maintained in Markdown, which is why we're excited about the new [Vite Press Plugin](https://vue-spa.web-templates.io/posts/vite-press-plugin) which takes the same Markdown features in our [razor-ssg](https://razor-ssg.web-templates.io), [razor-press](https://razor-press.web-templates.io) and [blazor-vue](https://blazor-vue.web-templates.io) templates and re-implements them in Vite, where they can be used to add the same rich content features to Vite Vue and Vite React Apps.

A goal for vite-press-plugin is to implement a suite of universal markdown-powered features that can be reused across all our Vue, React and .NET Razor and Blazor project templates, allowing you to freely copy and incorporate the same set of markdown feature folders to power markdown content features across a range of websites built with different technologies.

All of Razor SSG's features are available in Vite Press Plugin, including:

- [Blog](https://vue-spa.web-templates.io/blog) - Full featured, beautiful Tailwind Blog with multiple discoverable views
- [What's New](https://vue-spa.web-templates.io/whatsnew) - Build Product and Feature Release pages
- [Videos](https://vue-spa.web-templates.io/videos) - Maintain Video Libraries and Playlists
- [Metadata APIs](https://vue-spa.web-templates.io/posts/vite-press-plugin#metadata-apis-feature) - Generate queryable static .json metadata APIs for all content
- [Includes](https://vue-spa.web-templates.io/posts/vite-press-plugin#includes-feature) - Create and reuse Markdown fragments

It also supports an enhanced version of markdown for embedding richer UI markup in markdown content where most of [VitePress Containers](https://vitepress.dev/guide/markdown#custom-containers) are supported, including:

- [Custom Markdown Containers](https://vue-spa.web-templates.io/posts/vite-press-plugin#markdown-containers)
  - **Alerts**
  - `info`
  - `tip`
  - `warning`
  - `danger`
  - `copy`
  - `sh`
  - `youtube`
- [Markdown Fenced Code Blocks](https://vue-spa.web-templates.io/posts/vite-press-plugin#markdown-fenced-code-blocks) - Convert fenced code blocks into richer UIs

### React Components In Markdown

At the cost of reduced portability, you're also able to embed richer interactive React components directly in markdown:

- [React Components in Markdown](https://react-spa.web-templates.io/posts/markdown-components-in-react)

# New Vue SPA Template

Source: https://servicestack.net/posts/net8-vue-spa-template

With ServiceStack [now fully integrated with .NET 8](/posts/servicestack-endpoint-routing), our focus has shifted from providing platform-agnostic solutions that support all of ServiceStack's .NET Framework and .NET hosts to building on the new capabilities of .NET 8, enhancing ASP.NET Core's built-in features and templates with ServiceStack's high-productivity features.

## ServiceStack Vue SPA Template

The latest [Vue SPA](https://vue-spa.web-templates.io) template is a good example of this, building on and enhancing the built-in ASP.NET Core Vue SPA template with many high-productivity features.

Vite Vue SPA Template

Explore the high productivity features in the new ServiceStack Vue SPA template

:::{.text-center}
## Live Demo
:::

:::{.shadow .pb-1}
[![](https://raw.githubusercontent.com/ServiceStack/Assets/master/csharp-templates/vue-spa.png)](https://vue-spa.web-templates.io)
:::

## ASP.NET Core Vue SPA Template

The [Vue and ASP.NET Core](https://learn.microsoft.com/en-us/visualstudio/javascript/tutorial-asp-net-core-with-vue) template provides a seamless starting solution which runs both the .NET API backend and Vite Vue frontend during development.

It's a modern template capturing the best Vue has to offer, configured with Vite's fast HMR (Hot Module Reload) and TypeScript support - it allows Apps to be developed with Vue's typed [Single File Components](https://vuejs.org/guide/scaling-up/sfc.html), enabling both a productive development experience and an optimal high-performance production build at runtime.

### Minimal API integration

Whilst a great starting point, it's still only a basic template configured with a bare-bones Vue Vite App that's modified to show an example of calling a Minimal API.

### Built-in API Integration

Although the approach used isn't very scalable, with a proxy rule needed for every user-defined API route:

```ts
export default defineConfig({
    //...
    server: {
        proxy: {
            '^/weatherforecast': { target, secure: false }
        },
    }
})
```

And the need for hand maintained Types to describe the shape of the API responses with [Stringly Typed](https://wiki.c2.com/?StringlyTyped) fetch API calls referencing **string** routes:

```ts
import { defineComponent } from 'vue';

type Forecasts = {
    date: string,
    temperatureC: string,
    temperatureF: string,
    summary: string
}[];

interface Data {
    loading: boolean,
    post: null | Forecasts
}

export default defineComponent({
    data(): Data {
        return {
            loading: false,
            post: null
        };
    },
    created() {
        // fetch the data when the view is created and the data is
        // already being observed
        this.fetchData();
    },
    watch: {
        // call again the method if the route changes
        '$route': 'fetchData'
    },
    methods: {
        fetchData(): void {
            this.post = null;
            this.loading = true;

            fetch('weatherforecast')
                .then(r => r.json())
                .then(json => {
                    this.post = json as Forecasts;
                    this.loading = false;
                    return;
                });
        }
    },
});
```

Which is used to render the API response in a hand rolled table:

```html
<table v-if="post" class="table table-striped" aria-labelledby="tableLabel">
    <thead>
        <tr>
            <th>Date</th>
            <th>Temp. (C)</th>
            <th>Temp. (F)</th>
            <th>Summary</th>
        </tr>
    </thead>
    <tbody>
        <tr v-for="forecast of post" :key="forecast.date">
            <td>{{ forecast.date }}</td>
            <td>{{ forecast.temperatureC }}</td>
            <td>{{ forecast.temperatureF }}</td>
            <td>{{ forecast.summary }}</td>
        </tr>
    </tbody>
</table>
```

### ServiceStack API Integration

Fortunately ServiceStack can significantly improve this development experience with the [/api pre-defined route](https://docs.servicestack.net/endpoint-routing#api-pre-defined-route) where only a single proxy rule is needed to proxy all APIs:

```ts
export default defineConfig({
    //...
    server: {
        proxy: {
            '^/api': { target, secure: false }
        },
    }
})
```

### End-to-end Typed APIs

Instead of hand-rolled types and Stringly Typed API calls, it utilizes server [generated TypeScript DTOs](https://docs.servicestack.net/typescript-add-servicestack-reference) with a generic JsonServiceClient to enable end-to-end Typed APIs:

```ts
import { ref, onMounted } from 'vue'
import { ApiResult } from "@servicestack/client"
import { useClient } from "@servicestack/vue"
import { GetWeatherForecast } from "@/dtos"

const client = useClient()
const api = ref(new ApiResult())

onMounted(async () => {
    api.value = await client.api(new GetWeatherForecast())
})
```

This results in less code to maintain, immediate static typing analysis to ensure correct usage of APIs, and valuable feedback when APIs are changed, which is easily updated with a single command:

:::sh
npm run dtos
:::

### High Productivity Vue Components

With access to the [ServiceStack Vue Components](https://docs.servicestack.net/vue/) library there's also less code to maintain in the UI, where you can render a beautiful tailwind-styled DataGrid with just:

```html
<DataGrid :items="api.response" />
```

## ServiceStack Vue SPA Features

Other high-productivity features available in the ServiceStack Vue SPA template include:

### Integrated Identity Auth

Pre-configured with ASP.NET Core Identity Auth, including Sign In and Custom Registration APIs and UI Pages which can be customized as needed, along with examples of Role-based security and a turn key solution for integrating the Identity Auth Registration workflow with your [SMTP Provider](https://docs.servicestack.net/auth/identity-auth#smtp-iemailsender), with all emails sent from a managed non-blocking [Background MQ](https://docs.servicestack.net/background-mq) for optimal responsiveness and execution.

### tailwindcss

[Tailwind](https://tailwindcss.com) has quickly become the best modern CSS framework for creating scalable, [mobile-first](https://tailwindcss.com/#mobile-first) responsive websites built upon a beautiful expert-crafted constraint-based [Design System](https://tailwindcss.com/#constraint-based) that enables effortless reuse of a growing suite of [Free Community](https://tailwindcomponents.com) and professionally-designed [Tailwind UI Component](https://tailwindui.com) Libraries, invaluable for quickly creating beautiful websites.

[![](/img/pages/blazor/tailwindui.png)](https://tailwindcss.com)

### Dark Mode

In addition to revolutionizing how we style mobile-first responsive Apps, Tailwind's [Dark Mode](https://tailwindcss.com/#dark-mode) does the same for enabling dark mode, a feature supported throughout all of ServiceStack's [Vue Component Library](https://docs.servicestack.net/vue/).
![](/img/whatsnew/v6.5/dark-and-light-mode.png)

### Built for Productivity

So that you're immediately productive out-of-the-box, the template includes a rich set of high-productivity features, including:

| | |
|----------------------------------------------------------------------------|--------------------------------------------------------------|
| [tailwind/typography](https://tailwindcss-typography.vercel.app) | Beautiful css typography for markdown articles & blog posts |
| [tailwind/forms](https://github.com/tailwindlabs/tailwindcss-forms) | Beautiful css form & input styles that's easily overridable |
| [Markdown](https://github.com/markdown-it/markdown-it) | Native Markdown integration |
| [plugin/press](https://github.com/ServiceStack/vite-plugin-press) | Static markdown for creating blogs, videos and other content |
| [plugin/vue-router](https://github.com/posva/unplugin-vue-router) | Conventional file system based routing for Vue 3 on Vite |
| [plugin/layouts](https://github.com/JohnCampionJr/vite-plugin-vue-layouts) | Support for multiple page layouts |
| [plugin/components](https://github.com/antfu/unplugin-vue-components) | Auto importing & registering of components on-demand |
| [plugin/svg](https://github.com/jpkleemans/vite-svg-loader) | Load SVG files as Vue components |
| [Iconify](https://iconify.design) | Unified registry to access 100k+ high quality SVG icons |

### Bookings CRUD Pages

The Bookings CRUD example shows how you can rapidly develop beautiful, responsive, customized CRUD UIs with minimal effort using [AutoQuery APIs](https://docs.servicestack.net/autoquery/), [AutoForms](https://docs.servicestack.net/autoform) & [AutoQueryGrid](https://blazor-gallery.servicestack.net/gallery/autoquerygrid) Vue Components.

### Admin Pages

Whilst Bookings CRUD is a good example of creating custom UIs for end users, you may also want to quickly develop a set of back-office CRUD Admin UIs to manage your App's Database tables, which is easily achievable with AutoQueryGrid's default behavior.
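Each Admin UI only needs an AutoQuery API for its underlying table on the server — a representative sketch (DTO name illustrative):

```csharp
// Representative AutoQuery API for the Coupon table an Admin Page manages
public class QueryCoupons : QueryDb<Coupon> {}
```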
The development UX of Admin Pages is further improved in Vue Vite which is able to use SFC Pages and conventional file system routing to quickly add Admin Pages to manage an App's back-end tables with a single component, e.g:

#### [/admin/coupons.vue](https://github.com/NetCoreTemplates/vue-spa/blob/main/MyApp.Client/src/pages/admin/coupons.vue)

```html
<AutoQueryGrid type="Coupon" />
```

#### [/admin/bookings.vue](https://github.com/NetCoreTemplates/vue-spa/blob/main/MyApp.Client/src/pages/admin/bookings.vue)

```html
<AutoQueryGrid type="Booking" />
```

## Vite Press Plugin

[![](https://images.unsplash.com/photo-1524668951403-d44b28200ce0?crop=entropy&fit=crop&h=384&w=768)](https://vue-spa.web-templates.io/posts/vite-press-plugin)

Most Apps typically have a mix of dynamic functionality and static content, which in our experience is best maintained in Markdown, which is why we're excited about the new [Vite Press Plugin](https://vue-spa.web-templates.io/posts/vite-press-plugin) which takes the same Markdown features in our [razor-ssg](https://razor-ssg.web-templates.io), [razor-press](https://razor-press.web-templates.io) and [blazor-vue](https://blazor-vue.web-templates.io) templates and re-implements them in Vite, where they can be used to add the same rich content features to Vite Vue and Vite React Apps.

A goal for vite-press-plugin is to implement a suite of universal markdown-powered features that can be reused across all our Vue, React and .NET Razor and Blazor project templates, allowing you to freely copy and incorporate the same set of markdown feature folders to power markdown content features across a range of websites built with different technologies.

All of Razor SSG's features are available in Vite Press Plugin, including:

- [Blog](https://vue-spa.web-templates.io/blog) - Full featured, beautiful Tailwind Blog with multiple discoverable views
- [What's New](https://vue-spa.web-templates.io/whatsnew) - Build Product and Feature Release pages
- [Videos](https://vue-spa.web-templates.io/videos) - Maintain Video Libraries and Playlists
- [Metadata APIs](https://vue-spa.web-templates.io/posts/vite-press-plugin#metadata-apis-feature) - Generate queryable static .json metadata APIs for all content
- [Includes](https://vue-spa.web-templates.io/posts/vite-press-plugin#includes-feature) - Create and reuse Markdown fragments

It also supports an enhanced version of markdown for embedding richer UI markup in markdown content where most of [VitePress Containers](https://vitepress.dev/guide/markdown#custom-containers) are supported, including:

- [Custom Markdown Containers](https://vue-spa.web-templates.io/posts/vite-press-plugin#markdown-containers)
  - **Alerts**
  - `info`
  - `tip`
  - `warning`
  - `danger`
  - `copy`
  - `sh`
  - `youtube`
- [Markdown Fenced Code Blocks](https://vue-spa.web-templates.io/posts/vite-press-plugin#markdown-fenced-code-blocks) - Convert fenced code blocks into richer UIs

### Vue Components In Markdown

At the cost of reduced portability, you're also able to embed richer interactive Vue components directly in markdown:

- [Vue Components in Markdown](https://vue-spa.web-templates.io/posts/markdown-components-in-vue)

# Vite Press Plugin

Source: https://servicestack.net/posts/vite-press-plugin

The Vite Press Plugin is an alternative to [VitePress](https://vitepress.dev) for adding Markdown features to existing Vite Vue or React projects. It's a non-intrusive plugin for Vue and React Vite apps that want to add markdown powered content features without needing to adopt an opinionated framework for their entire App.
## Universal Markdown Features

A goal for **vite-press-plugin** is to implement a suite of universal markdown-powered features that can be reused across Vue, React and .NET Razor and Blazor projects, allowing you to incorporate the same set of markdown feature folders to power markdown content features across a range of websites built with different technologies.

### Vite Apps with vite-press-plugin

The **vite-press-plugin** currently powers the markdown features in the static Vite Vue and React templates which are ideal for creating static websites, blogs, documentation and marketing websites that can be hosted FREE on [GitHub Pages CDN](https://pages.github.com):

#### Static Vite Templates with vite-press-plugin

- [press-vue](https://press-vue.servicestack.net) - Vite Vue App
- [press-react](https://press-react.servicestack.net) - Vite React App

The **vite-press-plugin** makes the Markdown features available to the Vite App, whilst the markdown rendering itself is optimally implemented in:

- Vue Templates - with [markdown-it](https://github.com/markdown-it/markdown-it) in [Vue SFC](https://vuejs.org/guide/scaling-up/sfc.html) Components
- React Templates - with [remark](https://github.com/remarkjs/remark) and [MDX](https://mdxjs.com) in [React](https://react.dev) Components

#### .NET 8 API backend with Vite Vue & React SPA frontend

When more capabilities are required and you want a .NET API backend for your Vite Vue or React SPA frontend, you can use one of our integrated .NET 8 SPA templates:

- [vue-spa](https://vue-spa.web-templates.io) - .NET 8 API with Vite Vue SPA frontend
- [react-spa](https://react-spa.web-templates.io) - .NET 8 API with Vite React SPA frontend

### .NET Templates with C# and Markdig

The same Markdown feature folders are also [implemented in C#](https://razor-ssg.web-templates.io/posts/razor-ssg) and rendered with [Markdig](https://github.com/xoofx/markdig) and either Razor Pages or Blazor Components in:

#### .NET 8 Razor SSG and Blazor SSR Templates

- [razor-ssg](https://razor-ssg.web-templates.io) - .NET Razor SSG Blog and Marketing Website with **Markdig**
- [razor-press](https://razor-press.web-templates.io) - .NET Razor SSG Documentation Website with **Markdig**
- [blazor-vue](https://blazor-vue.web-templates.io) - .NET 8 Blazor Server Rendered Website with **Markdig**

### Markdown Feature Folders

The content for each Markdown feature is maintained within its own feature folder with a `_` prefix:

```files
/_includes
/_posts
/_videos
/_whatsnew
```

#### Markdown Document Structure

Additional metadata for each markdown page is maintained in the frontmatter of each markdown page, e.g. the front matter for this blog post contains:

```md
---
title: Vite Press Plugin
summary: Introducing the Vite Press Plugin
author: Lucy Bates
tags: [docs,markdown]
image: https://picsum.photos/2000/1000
---
```

The frontmatter is used in combination with file attributes to populate the document metadata.
The schema used to support the current markdown features includes:

```ts
type Doc = {
    title: string         // title of Markdown page (frontmatter)
    slug: string          // slug to page (populated)
    path: string          // path to page (populated)
    fileName: string      // filename of markdown file (populated)
    content: string       // markdown content (populated)
    date: string          // date of page (frontmatter)
    tags: string[]        // related tags (frontmatter)
    order?: number        // explicit page ordering (frontmatter)
    group?: string        // which group page belongs to (populated)
    draft?: boolean       // exclude from production builds (frontmatter)
    wordCount: number     // (populated)
    lineCount: number     // (populated)
    minutesToRead: number // (populated)
}

type Post = Doc & {
    summary: string       // short summary of blog post (frontmatter)
    author: string        // author of blog post (frontmatter)
    image: string         // hero image of blog post (frontmatter)
}

type Video = Doc & {
    url: string           // URL of YouTube Video
}

type WhatsNew = Doc & {
    url: string           // URL of YouTube Video
    image: string         // Image to display for feature
}
```

Markdown files can contain additional frontmatter which is also merged with the document metadata.

### Accessing Markdown Metadata

In Vue Apps the Metadata is available as an injected dependency that's navigable with the typed `VirtualPress` schema, e.g:

```ts
import type { VirtualPress } from "vite-plugin-press"

const press:VirtualPress = inject('press')!
```

In React Apps it's available via an injected context:

```ts
import { PressContext } from "@/contexts"

const press = useContext(PressContext)
```

Which is defined as:

```ts
import { createContext } from 'react'
import type { VirtualPress } from 'vite-plugin-press'

export const PressContext = createContext({} as VirtualPress)
```

This `VirtualPress` metadata is used to power all markdown features.

### Blog

The blog maintains its markdown posts in a flat [/_posts](https://github.com/NetCoreTemplates/vue-spa/tree/main/MyApp.Client/src/_posts) folder, with each Markdown post containing its publish date and the URL slug it should be published under, e.g:

```files
/_posts
    2023-01-21_start.md
    2024-02-11_jwt-identity-auth.md
    2024-03-01_vite-press-plugin.md
```

Supporting all Blog features requires several different pages to render each of its views:
| Description | Example | Vue | React |
| - | - | - | - |
| Main Blog layout | [/blog](/blog) | [blog.vue](https://github.com/NetCoreTemplates/vue-spa/blob/main/MyApp.Client/src/pages/blog.vue) | [blog.tsx](https://github.com/NetCoreTemplates/react-spa/blob/main/MyApp.Client/src/pages/blog.tsx) |
| Navigable Archive of Posts | [/posts](/posts) | [index.vue](https://github.com/NetCoreTemplates/vue-spa/blob/main/MyApp.Client/src/pages/posts/index.vue) | [index.tsx](https://github.com/NetCoreTemplates/react-spa/blob/main/MyApp.Client/src/pages/posts/index.tsx) |
| Individual Blog Post (like this!) | [/posts/vite-press-plugin](/posts/vite-press-plugin) | [\[slug\].vue](https://github.com/NetCoreTemplates/vue-spa/blob/main/MyApp.Client/src/pages/posts/%5Bslug%5D.vue) | [\[slug\].tsx](https://github.com/NetCoreTemplates/react-spa/blob/main/MyApp.Client/src/pages/posts/%5Bslug%5D.tsx) |
| Display Posts by Author | [/posts/author/lucy-bates](/posts/author/lucy-bates) | [\[name\].vue](https://github.com/NetCoreTemplates/vue-spa/blob/main/MyApp.Client/src/pages/posts/author/%5Bname%5D.vue) | [\[name\].tsx](https://github.com/NetCoreTemplates/react-spa/blob/main/MyApp.Client/src/pages/posts/author/%5Bname%5D.tsx) |
| Display Posts by Tag | [/posts/tagged/markdown](/posts/tagged/markdown) | [\[tag\].vue](https://github.com/NetCoreTemplates/vue-spa/blob/main/MyApp.Client/src/pages/posts/tagged/%5Btag%5D.vue) | [\[tag\].tsx](https://github.com/NetCoreTemplates/react-spa/blob/main/MyApp.Client/src/pages/posts/tagged/%5Btag%5D.tsx) |
| Display Posts by Year | [/posts/year/2024](/posts/year/2024) | [\[year\].vue](https://github.com/NetCoreTemplates/vue-spa/blob/main/MyApp.Client/src/pages/posts/year/%5Byear%5D.vue) | [\[year\].tsx](https://github.com/NetCoreTemplates/react-spa/blob/main/MyApp.Client/src/pages/posts/year/%5Byear%5D.tsx) |

#### Configuration

Additional information about the Website Blog is maintained in `_posts/config.json`:

```json
{
    "localBaseUrl": "http://localhost:5173",
    "publicBaseUrl": "https://press-vue.servicestack.net",
    "siteTwitter": "@Vue",
    "blogTitle": "From the blog",
    "blogDescription": "Writing on software design and aerospace industry.",
    "blogEmail": "email@example.org (Vue)",
    "blogImageUrl": "https://servicestack.net/img/logo.png"
}
```

#### Authors

Whilst information about Post Authors is maintained in `_posts/authors.json`:

```json
[
    {
        "name": "Lucy Bates",
        "email": "lucy@email.org",
        "bio": "Writing on software design and aerospace industry.",
        "profileUrl": "/img/profiles/user1.svg",
        "twitterUrl": "https://twitter.com/lucy",
        "threadsUrl": "https://threads.net/@lucy",
        "gitHubUrl": "https://github.com/lucy"
    },
]
```

To associate an Author, the **name** property is used to match a post's frontmatter **author**.
### General Features

Most unique markdown features are captured in their Markdown's frontmatter metadata, but in general these features are broadly available for all features:

- **Live Reload** - Latest Markdown content is displayed during **Development**
- **Drafts** - Prevent posts being worked on from being published with `draft: true`
- **Future Dates** - Posts with a future date won't be published until that date

### What's New Feature

The [/whatsnew](/whatsnew) page is an example of creating a custom Markdown feature to implement a portfolio or a product releases page, where a new folder is created per release containing both the release date and release or project name, with all features in that release maintained as markdown content sorted in alphabetical order:

```files
/_whatsnew
    /2023-03-08_Animaginary
        feature1.md
    /2023-03-18_OpenShuttle
        feature1.md
    /2023-03-28_Planetaria
        feature1.md
```

What's New follows the same structure as the Pages feature, which is rendered in:

- [whatsnew.vue](https://github.com/NetCoreTemplates/vue-spa/blob/main/MyApp.Client/src/pages/whatsnew.vue)
- [whatsnew.tsx](https://github.com/NetCoreTemplates/react-spa/blob/main/MyApp.Client/src/pages/whatsnew.tsx)

### Videos Feature

The videos feature maintained in the `_videos` folder allows grouping of related videos into different folder groups, e.g:

```files
/_videos
    /vue
        admin.md
        autoquerygrid.md
        components.md
    /react
        locode.md
        bookings.md
        nextjs.md
```

These can then be rendered as UI fragments using the `VideoGroup` component.

### Includes Feature

The includes feature allows maintaining reusable markdown fragments in the `_includes` folder, e.g:

```files
/_includes
    /features
        videos.md
        whatsnew.md
    privacy.md
```

Which can be included in other Markdown files with:

```md
:::include privacy.md:::

:::include features/include.md:::
```

Alternatively they can be included in other Vue, React or Markdown pages with the `Include` component.

### Metadata APIs Feature

To support external clients querying static markdown metadata, you can export it to pre-rendered static `*.json` data structures by configuring `metadataPath` to the location you want the `*.json` files published to, e.g:

```ts
export default defineConfig({
    plugins: [
        Press({
            metadataPath: 'public/api',
        }),
    ]
})
```

This will publish all the content of each content type in the year they were published in, along with an `all.json` containing all content published in that year, as well as for all time, e.g:

```files
/meta
    /2022
        all.json
        posts.json
        videos.json
    /2023
        all.json
        posts.json
    /2024
        all.json
        posts.json
        videos.json
        whatsnew.json
    all.json
    index.json
```

With this you can fetch the metadata of all the new **Blog Posts** added in **2024** from: [/api/2024/blog.json](/api/2024/blog.json)

Or all the website content added in **2024** from: [/api/2024/all.json](/api/2024/all.json)

Or **ALL** the website metadata content from: [/api/all.json](/api/all.json)

This feature makes it possible to support use-cases like CreatorKit's [Generating Newsletters](https://servicestack.net/creatorkit/portal-mailruns#generating-newsletters) feature which generates a Monthly Newsletter Email with all new content added within a specified period.

## Markdown Containers

Most of [VitePress Containers](https://vitepress.dev/guide/markdown#custom-containers) are also implemented, enabling rich markup to enhance markdown content and documentation universally across all Markdown App implementations:
#### Input

```md
:::info
This is an info box.
:::

:::tip
This is a tip.
:::

:::warning
This is a warning.
:::

:::danger
This is a dangerous warning.
:::
```

#### Output

:::info
This is an info box.
:::

:::tip
This is a tip.
:::

:::warning
This is a warning.
:::

:::danger
This is a dangerous warning.
:::

### Custom Title

You can specify a custom title by appending the text right after the container type:

#### Input

```md
:::danger STOP
Danger zone, do not proceed
:::
```

#### Output

:::danger STOP
Danger zone, do not proceed
:::

### copy

The **copy** container is ideal for displaying text snippets in a component that allows for easy copying:

#### Input

```md
:::copy
Copy Me!
:::
```

#### Output

:::copy
Copy Me!
:::

HTML or XML fragments can also be copied by escaping them first:

#### Input

```md
:::copy
`<PackageReference Include="ServiceStack" Version="8.*" />`
:::
```

#### Output

:::copy
`<PackageReference Include="ServiceStack" Version="8.*" />`
:::

### sh

Similarly the **sh** container is ideal for displaying and copying shell commands:

#### Input

```md
:::sh
npm run dev
:::
```

#### Output

:::sh
npm run dev
:::

### YouTube

For embedding YouTube Videos, optimally rendered using the `youtube` container, e.g:

#### Input

```md
:::youtube YIa0w6whe2U
Vue Components Library
:::
```

#### Output

:::youtube YIa0w6whe2U
Vue Components Library
:::

## Markdown Fenced Code Blocks

For more flexibility you can utilize custom fenced components like the `files` fenced code block which can be used to capture an ascii representation of structured documentation like a folder & file structure, e.g:

````md
```files
/_videos
    /vue
        admin.md
        autoquerygrid.md
        components.md
    /react
        locode.md
        bookings.md
        nextjs.md
```
````

That we can render into a more UX-friendly representation by calling the `Files` component with the body of the code-block, converting the structured ascii layout into a more familiar GUI layout:

```files
/_videos
    /vue
        admin.md
        autoquerygrid.md
        components.md
    /react
        locode.md
        bookings.md
        nextjs.md
```

The benefit of this approach to marking up documentation is that the markdown content still remains in an optimal human-readable form even when the markdown renderer lacks the custom fenced components to render the richer UI.

## Components In Markdown

All the features above let you render the same markdown content in all available Vue, React, Razor or Blazor templates. At the cost of reduced portability, you're also able to embed rich Interactive Vue or React components directly in markdown.

:::include component-links.md:::

# New Blazor Interactive Auto Template with Custom Admin UIs

Source: https://servicestack.net/posts/blazor-8-admin

Since the release of .NET 8, we have been upgrading our [templates](https://github.com/NetCoreTemplates) and example applications to take advantage of some of the new features, especially for Blazor. Our templates now make use of static Server Side Rendering (SSR) for Blazor, which allows for faster initial page loads and better SEO, and our `blazor-wasm` template uses `InteractiveAuto` by default to provide a more responsive UI.
## What is InteractiveAuto?

Blazor for .NET 8 has [four different rendering modes](https://learn.microsoft.com/en-us/aspnet/core/blazor/components/render-modes?view=aspnetcore-8.0#render-modes) you can take advantage of:

- Static Server (static SSR)
- Interactive Server
- Interactive WebAssembly (WASM)
- Interactive Auto

For non-interactive pages, the static SSR mode is the fastest, as it renders the page on the server and sends the HTML to the client. However, when your page needs to be interactive, you need to use one of the interactive modes.

Prior to .NET 8, there was a trade-off between the two available render modes (static server rendering wasn't yet available). The `Interactive Server` mode was faster to load, but the `Interactive WASM` mode was more responsive. The initial load times for `Interactive WASM` could be quite slow, as the entire application and all its dependencies needed to be downloaded before the page could render most of the content.

> The initial load time for the `Interactive WASM` mode can be quite slow even for a minimal app

Our templates previously worked around this limitation with a custom Pre-Rendering solution, as the wait times were too long for a good user experience.

With .NET 8, the new `Interactive Auto` mode provides the best of both worlds, as pre-rendering is now enabled by default.

When the page is first loaded, it uses the `Interactive Server` mode, which is faster than `Interactive WASM` as it doesn't need to download the WASM resources. The user can start interacting with the page straight away, albeit with a slight delay on each interaction due to the round-trip to the server. In the background, the WASM resources are downloaded and can then be used to render the site on the client for subsequent visits.

## Using InteractiveAuto in your Blazor application

In Blazor for .NET 8, render modes can be set on both a per-page and per-component basis.

```html
@page "/counter"
@rendermode InteractiveAuto
```

```html
<Counter @rendermode="InteractiveAuto" />
```

## ServiceStack.Blazor Components

The [ServiceStack.Blazor Components](https://blazor-gallery.jamstacks.net) have been updated for .NET 8 and work with the new `InteractiveAuto` render mode.

This means you can focus more on your application logic and less on the UI, as the components provide a high-productivity UI for common tasks such as CRUD operations.
### AutoQueryGrid

The [AutoQueryGrid](https://blazor-gallery.servicestack.net/gallery/autoquerygrid) component provides a full-featured data grid that can be used to display and edit data from an AutoQuery service. This is ideal for creating custom admin pages for your application. By integrating your admin screens into your application, you can optimize the user experience for specific workflows and get a huge amount of reuse of your existing AutoQuery services.

![](/img/posts/blazor-8-admin/autoquerygrid.png)

For BlazorDiffusion, our Stable Diffusion example application, we used the AutoQueryGrid to create a custom admin page for managing the modifiers in the application.

![](/img/posts/blazor-8-admin/blazordiffusion-modifiers.png)

This is the simplest and fastest use of the AutoQueryGrid component, but it can also be heavily customized for lots of different use cases.

In BlazorDiffusion we customize the grid to enable easy contextual navigation between separate customized admin screens for each Creative, linking to related table data.

![](/img/posts/blazor-8-admin/blazordiffusion-creatives.png)

In the above example, we use the `ConfigureQuery` parameter to customize the query used by the AutoQueryGrid when displaying values. This is ideal if you want to filter the data for specific workflows, for example, only showing the data that is relevant to the current user.

We combine this with a `Tabs` component to provide a navigation bar for the user to switch between the different filters on the same AutoQueryGrid.

![](/img/posts/blazor-8-admin/blazordiffusion-tab.png)
![](/img/posts/blazor-8-admin/blazordiffusion-tab1.png)

We also use the `EditForm` parameter to customize the edit form for the AutoQueryGrid, so the workflow for editing a Creative is optimized using your own completely custom UI.

## Upgrading to .NET 8

BlazorDiffusion was an example application we originally developed for .NET 6. We upgraded the production release of this application to use our `blazor-vue` template, which can be perfect for public-facing web applications and teams that don't mind including a JavaScript framework in their application.

However, to show the flexibility of Blazor for .NET 8, we also upgraded the whole application from our updated `blazor-wasm` template to take advantage of the new `InteractiveAuto` mode.

### Component Compatibility

Since the ServiceStack.Blazor library has been updated for .NET 8, we just needed to bring over the shared components from the original application and update the references to the new library.

When upgrading your application pages and components, you will need to avoid any JavaScript interop that runs during the `OnInitializedAsync` lifecycle method, as this is not supported in the `InteractiveAuto` mode.

### Running on both Server and Client

When using the `InteractiveAuto` mode, first visits will run on the server, so your pages and components need to be available to both projects, as well as have any required dependencies registered in both projects' `Program.cs` files.

By placing your shared pages and components in a shared project like the `.Client` project in the `blazor-wasm` template, you can easily share them between the two projects.

Look for any of your pages or components that use the `@inject` directive, as these dependencies will need to be registered in both projects.
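For example, if your shared components inject a dependency, both hosts need a registration for it. Here's a minimal sketch, assuming a hypothetical `IClientGateway` service used by your shared `.Client` components:

```csharp
// MyApp/Program.cs (Server) - serves the first InteractiveAuto visits
// IClientGateway/ServerGateway are hypothetical names for illustration
builder.Services.AddScoped<IClientGateway, ServerGateway>();

// MyApp.Client/Program.cs (WASM) - used once rendering moves to the client
builder.Services.AddScoped<IClientGateway, WasmGateway>();
```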
:::info
Avoid sharing sensitive information via dependency injection, as it will be available to the client at runtime where it can be decompiled and inspected.
:::

## Source code and live demo

The source code for the upgraded `BlazorDiffusionAuto` application is [available on GitHub](https://github.com/NetCoreApps/BlazorDiffusionAuto) and you can view a live demo of the application at [auto.blazordiffusion.com](https://auto.blazordiffusion.com).

## Conclusion

The new `InteractiveAuto` mode in Blazor for .NET 8 provides the best of both worlds for Blazor applications. A built-in pre-rendering solution means that you can have a fast initial load time, but still have a responsive UI for subsequent visits.

And since the ServiceStack.Blazor components have been updated for .NET 8, you can take advantage of the high-productivity UI components to quickly create customizable and professional-looking admin pages in a Blazor application.

## Feedback

If you have any questions or feedback, please feel free to reach out to us on [our forums](https://forums.servicestack.net) or [GitHub Discussions](https://servicestack.net/ask).

# ASP.NET Core JWT Identity Auth

Source: https://servicestack.net/posts/jwt-identity-auth

JWTs enable stateless authentication of clients without servers needing to maintain any Auth state in server infrastructure or perform any I/O to validate a token. As such, [JWTs are a popular choice for Microservices](https://docs.servicestack.net/auth/jwt-authprovider#stateless-auth-microservices) as they only need to be configured with confidential keys to validate access.

### ASP.NET Core JWT Authentication

ServiceStack's JWT Identity Auth reimplements many of the existing [ServiceStack JWT AuthProvider](https://docs.servicestack.net/auth/jwt-authprovider) features, but instead of using its own implementation, it integrates with and utilizes ASP.NET Core's built-in JWT Authentication that's configurable in .NET Apps with the `.AddJwtBearer()` extension method, e.g:

#### Program.cs

```csharp
services.AddAuthentication()
    .AddJwtBearer(options => {
        options.TokenValidationParameters = new()
        {
            ValidIssuer = config["JwtBearer:ValidIssuer"],
            ValidAudience = config["JwtBearer:ValidAudience"],
            IssuerSigningKey = new SymmetricSecurityKey(
                Encoding.UTF8.GetBytes(config["JwtBearer:IssuerSigningKey"]!)),
            ValidateIssuerSigningKey = true,
        };
    })
    .AddIdentityCookies(options => options.DisableRedirectsForApis());
```

Then use the `JwtAuth()` method to enable and configure ServiceStack's support for ASP.NET Core JWT Identity Auth:

#### Configure.Auth.cs

```csharp
public class ConfigureAuth : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices(services => {
            services.AddPlugin(new AuthFeature(IdentityAuth.For<ApplicationUser>(
                options => {
                    options.SessionFactory = () => new CustomUserSession();
                    options.CredentialsAuth();
                    options.JwtAuth(x => {
                        // Enable JWT Auth Features...
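                        // e.g. options from the JWT Options section below:
                        // x.ExpireTokensIn = TimeSpan.FromDays(14);
                        // x.IncludeConvertSessionToTokenService = true;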
                    });
                })));
        });
}
```

### Enable in Swagger UI

Once configured we can enable JWT Auth in Swagger UI by installing **Swashbuckle.AspNetCore**:

:::copy
`<PackageReference Include="Swashbuckle.AspNetCore" />`
:::

Then enable Open API, Swagger UI, ServiceStack's support for Swagger UI and the JWT Bearer Auth option:

```csharp
public class ConfigureOpenApi : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices((context, services) => {
            if (context.HostingEnvironment.IsDevelopment())
            {
                services.AddEndpointsApiExplorer();
                services.AddSwaggerGen();
                services.AddServiceStackSwagger();
                services.AddJwtAuth();
                //services.AddBasicAuth();
                services.AddTransient<IStartupFilter, StartupFilter>();
            }
        });

    public class StartupFilter : IStartupFilter
    {
        public Action<IApplicationBuilder> Configure(Action<IApplicationBuilder> next) => app => {
            // Provided by Swashbuckle library
            app.UseSwagger();
            app.UseSwaggerUI();
            next(app);
        };
    }
}
```

This will enable the **Authorize** button in Swagger UI where you can authenticate with a JWT Token:

![](/img/posts/jwt-identity-auth/jwt-swagger-ui.png)

### JWT Auth in Built-in UIs

This also enables the **JWT** Auth Option in ServiceStack's built-in [API Explorer](https://docs.servicestack.net/api-explorer), [Locode](https://docs.servicestack.net/locode/) and [Admin UIs](https://docs.servicestack.net/admin-ui):

### Authenticating with JWT

JWT Identity Auth is a drop-in replacement for ServiceStack's JWT AuthProvider, where Authenticating via Credentials will convert the Authenticated User into a JWT Bearer Token returned in the **HttpOnly**, **Secure** `ss-tok` Cookie that will be used to Authenticate the client:

```csharp
var client = new JsonApiClient(BaseUrl);
await client.SendAsync(new Authenticate {
    provider = "credentials",
    UserName = Username,
    Password = Password,
});

var bearerToken = client.GetTokenCookie(); // ss-tok Cookie
```

## JWT Refresh Tokens

Refresh Tokens can be used to allow users to request a new JWT Access Token when the current one expires.

To enable support for JWT Refresh Tokens your `IdentityUser` model should implement the `IRequireRefreshToken` interface, which will be used to store the 64 byte Base64 URL-safe `RefreshToken` and its `RefreshTokenExpiry` in its persisted properties:

```csharp
public class ApplicationUser : IdentityUser, IRequireRefreshToken
{
    public string? RefreshToken { get; set; }
    public DateTime? RefreshTokenExpiry { get; set; }
}
```

Now after successful authentication, the `RefreshToken` will also be returned in the `ss-reftok` Cookie:

```csharp
var refreshToken = client.GetRefreshTokenCookie(); // ss-reftok Cookie
```

### Transparent Server Auto Refresh of JWT Tokens

To be able to terminate a user's access, users need to revalidate their eligibility to verify they're still allowed access (e.g. to deny locked-out users). This JWT revalidation pattern is implemented using Refresh Tokens, which are used to request revalidation of their access and the reissuing of a new JWT Access Token that can be used to make authenticated requests until it expires.

As Cookies are used to return Bearer and Refresh Tokens, ServiceStack is able to implement the revalidation logic on the server, where it transparently validates Refresh Tokens, and if a User is eligible will reissue a new JWT Token Cookie that replaces the expired Access Token Cookie.
Thanks to this behavior, HTTP Clients are able to Authenticate with just the Refresh Token, which will transparently reissue a new JWT Access Token Cookie and then continue to perform the Authenticated Request:

```csharp
var client = new JsonApiClient(BaseUrl);
client.SetRefreshTokenCookie(RefreshToken);

var response = await client.SendAsync(new Secured { ... });
```

There's also opt-in sliding support for extending a User's RefreshToken after usage, which allows Users to treat their Refresh Token like an API Key, where it will continue to be extended whilst they're continuously using it to make API requests, and otherwise expires if they stop. How long to extend the expiry of Refresh Tokens after usage can be configured with:

```csharp
options.JwtAuth(x => {
    // How long to extend the expiry of Refresh Tokens after usage (default None)
    x.ExtendRefreshTokenExpiryAfterUsage = TimeSpan.FromDays(90);
});
```

## Convert Session to Token Service

Another useful Service is `ConvertSessionToToken`, which converts your current Authenticated Session into a Token and can be enabled with:

```csharp
options.JwtAuth(x => {
    x.IncludeConvertSessionToTokenService = true;
});
```

This can be useful when you want to Authenticate via an external OAuth Provider that you then want to convert into a stateless JWT Token, by calling `ConvertSessionToToken` on the client, e.g:

#### .NET Clients

```csharp
await client.SendAsync(new ConvertSessionToToken());
```

#### TypeScript/JavaScript

```ts
fetch('/session-to-token', { method:'POST', credentials:'include' })
```

The default behavior of `ConvertSessionToToken` is to remove the current Session from the Auth Server, which will prevent access to protected Services using your previously Authenticated Session. If you still want to preserve your existing Session you can indicate this with:

```csharp
await client.SendAsync(new ConvertSessionToToken {
    PreserveSession = true
});
```

### JWT Options

Other configuration options available for Identity JWT Auth include:

```csharp
options.JwtAuth(x => {
    // How long should JWT Tokens be valid for. (default 14 days)
    x.ExpireTokensIn = TimeSpan.FromDays(14);

    // How long should JWT Refresh Tokens be valid for. (default 90 days)
    x.ExpireRefreshTokensIn = TimeSpan.FromDays(90);

    x.OnTokenCreated = (req, user, claims) => {
        // Customize which claims are included in the JWT Token
    };

    // Whether to invalidate Refresh Tokens on Logout (default true)
    x.InvalidateRefreshTokenOnLogout = true;

    // How long to extend the expiry of Refresh Tokens after usage (default None)
    x.ExtendRefreshTokenExpiryAfterUsage = null;
});
```

# Built-In Identity Auth Admin UI

Source: https://servicestack.net/posts/identity-auth-admin-ui

With ServiceStack now [deeply integrated into ASP.NET Core Apps](/posts/servicestack-endpoint-routing), we're back to refocusing on adding value-added features that can benefit all .NET Core Apps.
## Registration

The new Identity Auth Admin UI is an example of this, which can be enabled when registering the `AuthFeature` Plugin:

```csharp
public class ConfigureAuth : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices(services => {
            services.AddPlugin(new AuthFeature(IdentityAuth.For<ApplicationUser>(
                options => {
                    options.SessionFactory = () => new CustomUserSession();
                    options.CredentialsAuth();
                    options.AdminUsersFeature();
                })));
        });
}
```

Just like the ServiceStack Auth [Admin Users UI](https://docs.servicestack.net/admin-ui-users), this enables an Admin UI that's only accessible to **Admin** Users for managing **Identity Auth** users at `/admin-ui/users`.

## User Search Results

The user search results display a limited view due to the minimal properties on the default `IdentityAuth` model:
### Custom Search Result Properties

These user search results are customizable by specifying the `ApplicationUser` properties to display instead, e.g:

```csharp
options.AdminUsersFeature(feature => {
    feature.QueryIdentityUserProperties =
    [
        nameof(ApplicationUser.Id),
        nameof(ApplicationUser.DisplayName),
        nameof(ApplicationUser.Email),
        nameof(ApplicationUser.UserName),
        nameof(ApplicationUser.LockoutEnd),
    ];
});
```
### Custom Search Result Behavior

The default display order of users is also customizable:

```csharp
feature.DefaultOrderBy = nameof(ApplicationUser.DisplayName);
```

The search behavior can also be replaced to search any custom fields, e.g:

```csharp
feature.SearchUsersFilter = (q, query) => {
    var queryUpper = query.ToUpper();
    return q.Where(x =>
        x.DisplayName!.Contains(query) ||
        x.Id.Contains(queryUpper) ||
        x.NormalizedUserName!.Contains(queryUpper) ||
        x.NormalizedEmail!.Contains(queryUpper));
};
```

## Default Create and Edit Users Forms

The default Create and Edit Admin Users UI are also limited to editing the minimal `IdentityAuth` properties:
Whilst the Edit page includes standard features to lock out users, change user passwords and manage their roles:
### Custom Create and Edit Forms

By default, users are locked out indefinitely, but this can also be changed to lock users out to a specific date, e.g:

```csharp
feature.ResolveLockoutDate = user => DateTimeOffset.Now.AddDays(7);
```

The form's editable fields can also be customized to include additional properties, e.g:

```csharp
feature.FormLayout =
[
    Input.For<ApplicationUser>(x => x.UserName, c => c.FieldsPerRow(2)),
    Input.For<ApplicationUser>(x => x.Email, c => {
        c.Type = Input.Types.Email;
        c.FieldsPerRow(2);
    }),
    Input.For<ApplicationUser>(x => x.FirstName, c => c.FieldsPerRow(2)),
    Input.For<ApplicationUser>(x => x.LastName, c => c.FieldsPerRow(2)),
    Input.For<ApplicationUser>(x => x.DisplayName, c => c.FieldsPerRow(2)),
    Input.For<ApplicationUser>(x => x.PhoneNumber, c => {
        c.Type = Input.Types.Tel;
        c.FieldsPerRow(2);
    }),
];
```

You can also override the creation of new `ApplicationUser` Models and their Validation:

### Custom User Creation

```csharp
feature.CreateUser = () => new ApplicationUser { EmailConfirmed = true };

feature.CreateUserValidation = async (req, createUser) => {
    await IdentityAdminUsers.ValidateCreateUserAsync(req, createUser);
    var displayName = createUser.GetUserProperty(nameof(ApplicationUser.DisplayName));
    if (string.IsNullOrEmpty(displayName))
        throw new ArgumentNullException(nameof(AdminUserBase.DisplayName));
    return null;
};
```
### Admin User Events

Should you need to, Admin User Events can be used to execute custom logic before and after creating, updating and deleting users, e.g:

```csharp
feature.OnBeforeCreateUser = (request, user) => { ... };
feature.OnAfterCreateUser  = (request, user) => { ... };
feature.OnBeforeUpdateUser = (request, user) => { ... };
feature.OnAfterUpdateUser  = (request, user) => { ... };
feature.OnBeforeDeleteUser = (request, userId) => { ... };
feature.OnAfterDeleteUser  = (request, userId) => { ... };
```

# System.Text.Json ServiceStack APIs

Source: https://servicestack.net/posts/system-text-json-apis

In continuing our focus on enabling ServiceStack to become a deeply integrated part of .NET 8 Applications, ServiceStack's latest .NET 8 templates now default to using standardized ASP.NET Core features wherever possible, including:

- [ASP.NET Core Identity Auth](/posts/net8-identity-auth)
- [ASP.NET Core IOC](/posts/servicestack-endpoint-routing#asp.net-core-ioc)
- [Endpoint Routing](/posts/servicestack-endpoint-routing#endpoint-routing)
- [Swashbuckle for Open API v3 and Swagger UI](/posts/openapi-v3-support)
- [System.Text.Json APIs](/posts/system-text-json-apis)

This reduces friction for integrating ServiceStack into existing .NET 8 Apps, encourages greater knowledge and reuse, and simplifies .NET development, as developers have fewer concepts to learn and fewer technology implementations to configure and maintain, now applied across their entire .NET App.

The last integration piece is support for **System.Text.Json**, the default high-performance async JSON serializer used in .NET Applications, which can now be used by ServiceStack APIs to serialize and deserialize their JSON API Responses, and is enabled by default when using **Endpoint Routing**.

This integrates ServiceStack APIs more than ever: just like Minimal APIs and Web API, they use **ASP.NET Core's IOC** to resolve dependencies, **Endpoint Routing** to execute APIs secured with **ASP.NET Core Identity Auth**, then **System.Text.Json** to deserialize and serialize their JSON payloads.
### Enabled by Default when using Endpoint Routing

```csharp
app.UseServiceStack(new AppHost(), options => {
    options.MapEndpoints();
});
```

### Enhanced Configuration

ServiceStack uses a custom `JsonSerializerOptions` to improve compatibility with existing ServiceStack DTOs and ServiceStack's rich ecosystem of generic [Add ServiceStack Reference](https://docs.servicestack.net/add-servicestack-reference) Service Clients, which is configured to:

- Not serialize `null` properties
- Support Case Insensitive Properties
- Use `CamelCaseNamingPolicy` for property names
- Serialize `TimeSpan` and `TimeOnly` Data Types with [XML Schema Time format](https://www.w3.org/TR/xmlschema-2/#isoformats)
- Support `[DataContract]` annotations
- Support Custom Enum Serialization

### Benefits all Add ServiceStack Reference Languages

This compatibility immediately benefits all of ServiceStack's [Add ServiceStack Reference](https://docs.servicestack.net/add-servicestack-reference) native typed integrations for **11 programming languages**, which all utilize ServiceStack's JSON API endpoints - now serialized with System.Text.Json.

### Support for DataContract Annotations

Support for .NET's `DataContract` serialization attributes was added using a custom `TypeInfoResolver`; specifically it supports:

- `[DataContract]` - When annotated, only `[DataMember]` properties are serialized
- `[DataMember]` - Specify a custom **Name** or **Order** of properties
- `[IgnoreDataMember]` - Ignore properties from serialization
- `[EnumMember]` - Specify a custom value for Enum values

### Custom Enum Serialization

Below is a good demonstration of the custom Enum serialization support, which matches ServiceStack.Text's behavior:

```csharp
public enum EnumType { Value1, Value2, Value3 }

[Flags]
public enum EnumTypeFlags { Value1, Value2, Value3 }

public enum EnumStyleMembers
{
    [EnumMember(Value = "lower")]
    Lower,
    [EnumMember(Value = "UPPER")]
    Upper,
}

return new EnumExamples {
    EnumProp = EnumType.Value2,                              // String value by default
    EnumFlags = EnumTypeFlags.Value2 | EnumTypeFlags.Value3, // [Flags] as int
    EnumStyleMembers = EnumStyleMembers.Upper,               // Serializes [EnumMember] value
    NullableEnumProp = null,                                 // Ignores nullable enums
};
```

Which serializes to:

```json
{
  "enumProp": "Value2",
  "enumFlags": 3,
  "enumStyleMembers": "UPPER"
}
```

### Custom Configuration

You can further customize the `JsonSerializerOptions` used by ServiceStack with `ConfigureJsonOptions()` to add any customizations, which you can optionally apply to ASP.NET Core's JSON APIs and MVC with:

```csharp
builder.Services.ConfigureJsonOptions(options => {
    options.PropertyNamingPolicy = JsonNamingPolicy.SnakeCaseLower;
})
.ApplyToApiJsonOptions()  // Apply to ASP.NET Core's JSON APIs
.ApplyToMvcJsonOptions(); // Apply to MVC
```

### Control over when and where System.Text.Json is used

Whilst `System.Text.Json` is highly efficient, it's also very strict in the inputs it accepts, so you may want to revert back to using ServiceStack's JSON Serializer for specific APIs, especially when you need to support external clients that can't be updated.

This can be done by annotating Request DTOs with the `[SystemJson]` attribute, e.g. you can limit the use of `System.Text.Json` to only an **API's Response** with:

```csharp
[SystemJson(UseSystemJson.Response)]
public class CreateUser : IReturn
{
    //...
}
```

Or limit it to only an **API's Request** with:

```csharp
[SystemJson(UseSystemJson.Request)]
public class CreateUser : IReturn
{
    //...
}
```

Or not use `System.Text.Json` at all for an API with:

```csharp
[SystemJson(UseSystemJson.Never)]
public class CreateUser : IReturn
{
    //...
}
```

### JsonApiClient Support

When Endpoint Routing is configured, the `JsonApiClient` will also be configured to utilize the same `System.Text.Json` options to send and receive its JSON API Requests, which also respects the `[SystemJson]` specified behavior.

Clients external to the .NET App can be configured to use `System.Text.Json` with:

```csharp
ClientConfig.UseSystemJson = UseSystemJson.Always;
```

Whilst any custom configuration can be applied to its `JsonSerializerOptions` with:

```csharp
TextConfig.ConfigureSystemJsonOptions(options => {
    options.PropertyNamingPolicy = JsonNamingPolicy.SnakeCaseLower;
});
```

### Scoped JSON Configuration

We've also added partial support for [Customized JSON Responses](https://docs.servicestack.net/customize-json-responses) for the following customization options:

:::{.table,w-full}
| Name                         | Alias |
|------------------------------|-------|
| EmitCamelCaseNames           | eccn  |
| EmitLowercaseUnderscoreNames | elun  |
| EmitPascalCaseNames          | epcn  |
| ExcludeDefaultValues         | edv   |
| IncludeNullValues            | inv   |
| Indent                       | pp    |
:::

These can be applied to the JSON Response by returning a decorated `HttpResult` with a custom `ResultScope`, e.g:

```csharp
return new HttpResult(responseDto) {
    ResultScope = () => JsConfig.With(new() {
        IncludeNullValues = true,
        ExcludeDefaultValues = true
    })
};
```

They can also be requested by API consumers by adding a `?jsconfig` query string with the desired option or its alias, e.g:

```
/api/MyRequest?jsconfig=EmitLowercaseUnderscoreNames,ExcludeDefaultValues
/api/MyRequest?jsconfig=eccn,edv
```

### SystemJsonCompatible

Another configuration automatically applied when `System.Text.Json` is enabled is:

```csharp
JsConfig.SystemJsonCompatible = true;
```

This is used to make ServiceStack's JSON Serializer more compatible with `System.Text.Json` output, so it's easier to switch between the two with minimal effort and incompatibility. Currently this is only used to override `DateTime` and `DateTimeOffset` behavior, which uses `System.Text.Json` for its Serialization/Deserialization.

# OpenAPI v3 and Swagger UI

Source: https://servicestack.net/posts/openapi-v3

In the ServiceStack v8.1 release, we have introduced a way to better incorporate your ServiceStack APIs into the larger ASP.NET Core ecosystem by mapping your ServiceStack APIs to standard [ASP.NET Core Endpoints](https://learn.microsoft.com/en-us/aspnet/core/fundamentals/routing?view=aspnetcore-8.0#endpoints). This enables your ServiceStack APIs to integrate with your larger ASP.NET Core application in the same way other middleware does, opening up more opportunities for reuse of your ServiceStack APIs.

This opens up the ability to use common third-party tooling. A good example of this is adding OpenAPI v3 specification generation for your endpoints, offered by the `Swashbuckle.AspNetCore` package. Included in the v8.1 Release is the `ServiceStack.AspNetCore.OpenApi` package to make this integration as easy as possible and to incorporate additional information from your ServiceStack APIs into Swagger metadata.

![](/img/posts/openapi-v3/openapi-v3-swagger-ui.png)

Previously, without the ability to map Endpoints, we maintained a ServiceStack-specific OpenAPI specification generation via the `OpenApiFeature` plugin.
While this provided a lot of functionality by accurately describing your ServiceStack APIs, it could be tricky to customize those API descriptions in the way some users wanted.

In this post we will look at how you can take advantage of the new OpenAPI v3 Swagger support using mapped Endpoints, customize the generated specification, as well as touch on other related changes in ServiceStack v8.1.

## AppHost Initialization

To use ServiceStack APIs as mapped Endpoints, the way ServiceStack is initialized in your `Program.cs` needs to be updated. To convert your App to use [Endpoint Routing and ASP.NET Core IOC](/posts/servicestack-endpoint-routing), your ASP.NET Core application needs to be updated to replace any usage of the `Funq` IoC container with ASP.NET Core's IOC.

Previously, the following was used to initialize your ServiceStack `AppHost`:

#### Program.cs

```csharp
app.UseServiceStack(new AppHost());
```

The `app` in this example is a `WebApplication` resulting from an `IHostApplicationBuilder` calling `builder.Build()`. Whilst we still need to call `app.UseServiceStack()`, we also need to move the discovery of your ServiceStack APIs to earlier in the setup, before the `WebApplication` is built, e.g:

```csharp
// Register ServiceStack APIs, Dependencies and Plugins:
services.AddServiceStack(typeof(MyServices).Assembly);

var app = builder.Build();
//...

// Register ServiceStack AppHost
app.UseServiceStack(new AppHost(), options => {
    options.MapEndpoints();
});

app.Run();
```

Once configured to use Endpoint Routing, we can use the [mix](https://docs.servicestack.net/mix-tool) tool to apply the [openapi3](https://gist.github.com/gistlyn/dac47b68e77796902cde0f0b7b9c6ac2) Startup Configuration with:

:::sh
x mix openapi3
:::

### Manually Configure OpenAPI v3 and Swagger UI

This will install the required ASP.NET Core Microsoft, Swashbuckle and ServiceStack Open API NuGet packages:

```xml
<!-- Versions omitted, use the latest compatible releases -->
<PackageReference Include="Microsoft.AspNetCore.OpenApi" />
<PackageReference Include="Swashbuckle.AspNetCore" />
<PackageReference Include="ServiceStack.AspNetCore.OpenApi" />
```

Then add the `Configure.OpenApi.cs` [Modular Startup](https://docs.servicestack.net/modular-startup) class to your project:

```csharp
[assembly: HostingStartup(typeof(MyApp.ConfigureOpenApi))]

namespace MyApp;

public class ConfigureOpenApi : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices((context, services) => {
            if (context.HostingEnvironment.IsDevelopment())
            {
                services.AddEndpointsApiExplorer();
                services.AddSwaggerGen(); // Swashbuckle
                services.AddServiceStackSwagger();
                services.AddBasicAuth();  // Enable HTTP Basic Auth
                //services.AddJwtAuth();  // Enable & Use JWT Auth
                services.AddTransient<IStartupFilter, StartupFilter>();
            }
        });

    public class StartupFilter : IStartupFilter
    {
        public Action<IApplicationBuilder> Configure(Action<IApplicationBuilder> next) => app => {
            // Provided by Swashbuckle library
            app.UseSwagger();
            app.UseSwaggerUI();
            next(app);
        };
    }
}
```

All this setup is done for you in ServiceStack's updated [Identity Auth .NET 8 Templates](https://servicestack.net/start), but for existing applications, you will need to [convert to use Endpoint Routing](https://docs.servicestack.net/endpoints-migration) to support this new way of running your ServiceStack applications.

## More Control

One point of friction with our previous `OpenApiFeature` plugin was the missing ability to customize the OpenAPI spec in ways that somewhat disconnect from the defined ServiceStack Service and its related C# Request and Response Data Transfer Objects (DTOs). Since the `OpenApiFeature` plugin used class and property attributes on your Request DTOs, the *structure* of the OpenAPI schema mapping was quite rigid, preventing certain customizations.
For example, if we have an `UpdateTodo` Request DTO that looks like the following:

```csharp
[Route("/todos/{Id}", "PUT")]
public class UpdateTodo : IPut, IReturn<Todo>
{
    public long Id { get; set; }
    [ValidateNotEmpty]
    public string Text { get; set; }
    public bool IsFinished { get; set; }
}
```

Previously, we would get a default Swagger UI that enabled all the properties as `Parameters` to populate:

![](/img/posts/openapi-v3/openapi-v2-defaults.png)

While this correctly describes the Request DTO structure, sometimes as developers we get requirements for how we want to present our APIs to our users from within the Swagger UI. With the updated Swagger UI and the use of the `Swashbuckle` library, we get the following UI by default:

![](/img/posts/openapi-v3/openapi-v3-defaults-application-json.png)

These are essentially the same: we have a CRUD Todo API that takes an `UpdateTodo` Request DTO and returns a `Todo` Response DTO. ServiceStack needs to have uniquely named Request DTOs, so we can't have a `Todo` schema as the Request DTO, despite the fact that it has the same structure as our `Todo` model. This is a good thing, as it allows us to have a clean API contract and a separation of concerns between our Request DTOs and our models.

However, it might not be desirable to present this to our users, since it can be convenient to think about CRUD services as taking the same resource type as the response. To achieve this, we use the Swashbuckle library to customize the OpenAPI spec generation. Depending on what you want to customize, you can use the `SchemaFilter` or `OperationFilter` options. In this case, we want to customize the matching operation to reference the `Todo` schema for the Request Body.

First, we create a new class that implements the `IOperationFilter` interface:

```csharp
public class OperationRenameFilter : IOperationFilter
{
    public void Apply(OpenApiOperation operation, OperationFilterContext context)
    {
        if (context.ApiDescription.HttpMethod == "PUT" &&
            context.ApiDescription.RelativePath == "todos/{Id}")
        {
            operation.RequestBody.Content["application/json"].Schema.Reference = new OpenApiReference
            {
                Type = ReferenceType.Schema,
                Id = "Todo"
            };
        }
    }
}
```

The above matches some information about the `UpdateTodo` request we want to customize, and then sets the `Reference` property of the `RequestBody` to the `Todo` schema.

We can then add this to the `AddSwaggerGen` options in the `Program.cs` file:

```csharp
builder.Services.AddSwaggerGen(o => {
    o.OperationFilter<OperationRenameFilter>();
});
```

The result is the following Swagger UI:

![](/img/posts/openapi-v3/openapi-v3-customized-application-json.png)

This is just one simple example of how you can customize the OpenAPI spec generation, and `Swashbuckle` has some great documentation on the different ways you can customize the generated spec. And these customizations impact any of your ASP.NET Core Endpoints, not just your ServiceStack APIs.

## Closing

Now that ServiceStack APIs can be mapped to standard ASP.NET Core Endpoints, it opens up a lot of possibilities for integrating your ServiceStack APIs into the larger ASP.NET Core ecosystem. The use of the `Swashbuckle` library via the `ServiceStack.AspNetCore.OpenApi` library is just one example of how you can take advantage of this new functionality.
# ServiceStack Endpoint Routing

Source: https://servicestack.net/posts/servicestack-endpoint-routing

In an effort to reduce friction and improve integration with ASP.NET Core Apps, we've continued the trend from last year of embracing ASP.NET Core's built-in features and conventions, which saw the latest ServiceStack v8 release convert all its newest .NET 8 templates to adopt [ASP.NET Core Identity Auth](https://docs.servicestack.net/auth/identity-auth).

This is a departure from building upon our own platform-agnostic abstractions, which allowed the same ServiceStack code-base to run on both .NET Core and .NET Framework. Our focus going forward will be to instead adopt the de facto standards and conventions of the latest .NET platform, which also means ServiceStack's new value-added features are only available in the latest **.NET 8+** release.

### ServiceStack Middleware

Whilst ServiceStack integrates into ASP.NET Core Apps as custom middleware in ASP.NET Core's HTTP Request Pipeline, it invokes its own black box of functionality from there, implemented using its own suite of overlapping features. Whilst this allows ServiceStack full control over how to implement its features, it's not as integrated as it could be: there were limits on what ServiceStack functionality could be reused within external ASP.NET Core MVC Controllers, Razor Pages, etc., which inhibited the ability to apply application-wide authorization policies across an Application's entire surface area, or to use and configure different JSON Serialization implementations.

### Areas for tighter integration

The major areas we've identified that would benefit from tighter integration with ASP.NET Core include:

- [Funq IOC Container](https://docs.servicestack.net/ioc)
- [ServiceStack Routing](https://docs.servicestack.net/routing) and [Request Pipeline](https://docs.servicestack.net/order-of-operations)
- [ServiceStack.Text JSON Serializer](https://docs.servicestack.net/json-format)

### ServiceStack v8.1 is fully integrated!

We're happy to announce that the latest release of ServiceStack v8.1 now supports utilizing the optimal ASP.NET Core standardized features to reimplement all these key areas - fostering seamless integration and greater reuse, which you can learn about below:

- [ASP.NET Core Identity Auth](https://docs.servicestack.net/auth/identity-auth)
- [ASP.NET Core IOC](https://docs.servicestack.net/releases/v8_01#asp.net-core-ioc)
- [Endpoint Routing](https://docs.servicestack.net/releases/v8_01#endpoint-routing)
- [System.Text.Json APIs](https://docs.servicestack.net/releases/v8_01#system.text.json)
- [Open API v3 and Swagger UI](https://docs.servicestack.net/releases/v8_01#openapi-v3)
- [ASP.NET Core Identity Auth Admin UI](https://docs.servicestack.net/releases/v8_01#asp.net-core-identity-auth-admin-ui)
- [JWT Identity Auth](https://docs.servicestack.net/releases/v8_01#jwt-identity-auth)

Better yet, this new behavior is enabled by default in all of ServiceStack's new ASP.NET Core Identity Auth .NET 8 templates!

### Migrating to ASP.NET Core Endpoints

To assist ServiceStack users in upgrading their existing projects we've created a [migration guide](https://docs.servicestack.net/endpoints-migration) walking through the steps required to adopt these new defaults.

### ASP.NET Core IOC

The primary limitation of ServiceStack using its own Funq IOC is that any dependencies registered in Funq are not injected into Razor Pages, Blazor Components, MVC Controllers, etc.
That's why our [Modular Startup](https://docs.servicestack.net/modular-startup) configurations recommend utilizing custom `IHostingStartup` configurations to register application dependencies in ASP.NET Core's IOC, where they can be injected into both ServiceStack Services and ASP.NET Core's external components, e.g:

```csharp
[assembly: HostingStartup(typeof(MyApp.ConfigureDb))]

namespace MyApp;

public class ConfigureDb : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices((context, services) => {
            services.AddSingleton<IDbConnectionFactory>(new OrmLiteConnectionFactory(
                context.Configuration.GetConnectionString("DefaultConnection"),
                SqliteDialect.Provider));
        });
}
```

But there were fundamental restrictions on what could be registered in ASP.NET Core's IOC, as everything needed to be registered before AspNetCore's `WebApplication` was built and before ServiceStack's AppHost could be initialized, which prohibited registering any dependencies created by the AppHost, including Services, AutoGen Services, Validators and internal functionality like App Settings, Virtual File System and Caching providers, etc.

## Switch to use ASP.NET Core IOC

To enable ServiceStack to switch to using ASP.NET Core's IOC, you'll need to move registration of all dependencies and Services to before the WebApplication is built, by calling the `AddServiceStack()` extension method with the Assemblies where your ServiceStack Services are located, e.g:

```csharp
builder.Services.AddServiceStack(typeof(MyServices).Assembly);

var app = builder.Build();
//...
app.UseServiceStack(new AppHost());
```

This registers all ServiceStack dependencies in ASP.NET Core's IOC, including all ServiceStack Services, prior to the AppHost being initialized. The AppHost no longer needs to specify the Assemblies where ServiceStack Services are created, and no longer needs to use Funq, as all dependencies should now be registered in ASP.NET Core's IOC.

### Registering Dependencies and Plugins

Additionally, ASP.NET Core IOC's requirement for all dependencies to be registered before the WebApplication is built means you'll no longer be able to register any dependencies or plugins in ServiceStack's `AppHost.Configure()` method.
```csharp
public class AppHost() : AppHostBase("MyApp"), IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices(services => {
            // Register IOC Dependencies and ServiceStack Plugins
        });

    public override void Configure()
    {
        // DO NOT REGISTER ANY PLUGINS OR DEPENDENCIES HERE
    }
}
```

Instead, anything that needs to register dependencies in ASP.NET Core's IOC should now use the `IServiceCollection` extension methods:

- Use `IServiceCollection.Add*` APIs to register dependencies
- Use the `IServiceCollection.AddPlugin` API to register ServiceStack Plugins
- Use `IServiceCollection.RegisterService*` APIs to dynamically register ServiceStack Services in external Assemblies

This can be done whenever you have access to `IServiceCollection`, either in `Program.cs`:

```csharp
builder.Services.AddPlugin(new AdminDatabaseFeature());
```

Or in any Modular Startup `IHostingStartup` configuration class, e.g:

```csharp
public class ConfigureDb : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices((context, services) => {
            services.AddSingleton<IDbConnectionFactory>(new OrmLiteConnectionFactory(
                context.Configuration.GetConnectionString("DefaultConnection"),
                SqliteDialect.Provider));

            // Enable Audit History
            services.AddSingleton<ICrudEvents>(c =>
                new OrmLiteCrudEvents(c.GetRequiredService<IDbConnectionFactory>()));

            // Enable AutoQuery RDBMS APIs
            services.AddPlugin(new AutoQueryFeature {
                MaxLimit = 1000,
            });

            // Enable AutoQuery Data APIs
            services.AddPlugin(new AutoQueryDataFeature());

            // Enable built-in Database Admin UI at /admin-ui/database
            services.AddPlugin(new AdminDatabaseFeature());
        })
        .ConfigureAppHost(appHost => {
            appHost.Resolve<ICrudEvents>().InitSchema();
        });
}
```

The `ConfigureAppHost()` extension method can continue to be used to execute any startup logic that requires access to registered dependencies.

### Authoring ServiceStack Plugins

To enable ServiceStack Plugins to support both Funq and ASP.NET Core's IOC, any dependencies and Services a plugin needs should be registered in the `IConfigureServices.Configure(IServiceCollection)` method, as seen in the refactored [ServerEventsFeature.cs](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack/src/ServiceStack/ServerEventsFeature.cs) plugin, e.g:

```csharp
public class ServerEventsFeature : IPlugin, IConfigureServices
{
    //...
    public void Configure(IServiceCollection services)
    {
        if (!services.Exists<IServerEvents>())
        {
            services.AddSingleton<IServerEvents>(new MemoryServerEvents
            {
                IdleTimeout = IdleTimeout,
                HouseKeepingInterval = HouseKeepingInterval,
                OnSubscribeAsync = OnSubscribeAsync,
                OnUnsubscribeAsync = OnUnsubscribeAsync,
                OnUpdateAsync = OnUpdateAsync,
                NotifyChannelOfSubscriptions = NotifyChannelOfSubscriptions,
                Serialize = Serialize,
                OnError = OnError,
            });
        }

        if (UnRegisterPath != null)
            services.RegisterService<ServerEventsUnRegisterService>(UnRegisterPath);

        if (SubscribersPath != null)
            services.RegisterService<ServerEventsSubscribersService>(SubscribersPath);
    }

    public void Register(IAppHost appHost)
    {
        //...
    }
}
```

#### All Plugins refactored to support ASP.NET Core IOC

All of ServiceStack's Plugins have been refactored to make use of `IConfigureServices`, which supports registering in both Funq and ASP.NET Core's IOC when enabled.
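As a rough guide, a custom plugin following this same pattern might look like the sketch below, where `MyFeature` and `MyFeatureService` are hypothetical names:

```csharp
// A minimal sketch of a plugin that supports both Funq and ASP.NET Core's IOC
public class MyFeature : IPlugin, IConfigureServices
{
    // Called before the WebApplication is built:
    // register any IOC dependencies and Services the plugin needs here
    public void Configure(IServiceCollection services)
    {
        services.AddSingleton<MyFeatureService>();
    }

    // Called once the AppHost is initialized:
    // apply any runtime configuration here, but don't register dependencies
    public void Register(IAppHost appHost)
    {
    }
}
```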
#### Funq IOC implements IServiceCollection and IServiceProvider interfaces

To enable this, Funq now implements both the `IServiceCollection` and `IServiceProvider` interfaces to enable 100% source-code compatibility for registering and resolving dependencies with either IOC, which we now recommend using over Funq's native Registration and Resolution APIs to simplify future migration efforts to ASP.NET Core's IOC.

## Dependency Injection

The primary difference between the IOCs is that ASP.NET Core's IOC does not support property injection by default, which will require you to refactor your ServiceStack Services to use constructor injection of dependencies, although this has become a lot more pleasant with C# 12's [Primary Constructors](https://learn.microsoft.com/en-us/dotnet/csharp/whats-new/tutorials/primary-constructors), which now require a lot less boilerplate to define, assign and access dependencies, e.g:

```csharp
public class TechStackServices(IAutoQueryDb autoQuery) : Service
{
    public async Task<object> Any(QueryTechStacks request)
    {
        using var db = autoQuery.GetDb(request, base.Request);
        var q = autoQuery.CreateQuery(request, Request, db);
        return await autoQuery.ExecuteAsync(request, q, db);
    }
}
```

This has become our preferred approach for injecting dependencies in ServiceStack Services, which have all been refactored to use constructor injection utilizing primary constructors in order to support both IOCs.

To make migrations easier we've also added support for a property injection convention in **ServiceStack Services** using ASP.NET Core's IOC, where you can add the `[FromServices]` attribute to any public properties you want injected, e.g:

```csharp
public class TechStackServices : Service
{
    [FromServices]
    public required IAutoQueryDb AutoQuery { get; set; }

    [FromServices]
    public MyDependency? OptionalDependency { get; set; }
}
```

This feature can be useful for Services wanting to access optional dependencies that may or may not be registered.

:::info NOTE
`[FromServices]` is only supported in ServiceStack Services (i.e. not other dependencies)
:::

### Built-in ServiceStack Dependencies

This integration now makes it effortless to inject and utilize optional ServiceStack features like [AutoQuery](https://docs.servicestack.net/autoquery/) and [Server Events](https://docs.servicestack.net/server-events) in other parts of ASP.NET Core, inc. Blazor Components, Razor Pages, MVC Controllers, Minimal APIs, etc.

The built-in ServiceStack features that are registered by default and immediately available to be injected include:

- `IVirtualFiles` - Read/Write [Virtual File System](https://docs.servicestack.net/virtual-file-system), defaults to `FileSystemVirtualFiles` at `ContentRootPath`
- `IVirtualPathProvider` - Multi Virtual File System configured to scan multiple read-only sources, inc. `WebRootPath`, In Memory and Embedded Resource files
- `ICacheClient` and `ICacheClientAsync` - In Memory Cache, or distributed Redis cache if [ServiceStack.Redis](https://docs.servicestack.net/redis/) is configured
- `IAppSettings` - Multiple [AppSettings](https://docs.servicestack.net/appsettings) configuration sources

With ASP.NET Core's IOC now deeply integrated (see the sketch below), we moved onto the next area of integration: API Integration and Endpoint Routing.
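For example, a built-in dependency like `ICacheClient` can be resolved in a Minimal API like any other registered service; a minimal sketch, where the `/cached-time` route is a hypothetical example:

```csharp
// Resolves ServiceStack's registered ICacheClient from ASP.NET Core's IOC
app.MapGet("/cached-time", (ICacheClient cache) =>
{
    var time = cache.Get<string>("time");
    if (time == null)
    {
        time = DateTime.UtcNow.ToString("O");
        cache.Set("time", time, TimeSpan.FromSeconds(60)); // cache for 60s
    }
    return time;
});
```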
## Endpoint Routing

Whilst ASP.NET Core's middleware is a flexible way to compose and execute different middleware in a HTTP Request pipeline, each middleware is effectively its own island of functionality that's able to handle HTTP Requests in whichever way it sees fit.

In particular, ServiceStack's middleware would execute its own [Request Pipeline](https://docs.servicestack.net/order-of-operations), which would execute ServiceStack APIs registered at user-defined routes with its own [ServiceStack Routing](https://docs.servicestack.net/routing).

We're happy to announce that ServiceStack **.NET 8** Apps support an entirely new and integrated way to run all ServiceStack requests, including all APIs, metadata and built-in UIs, with support for [ASP.NET Core Endpoint Routing](https://learn.microsoft.com/en-us/aspnet/core/fundamentals/routing) - enabled by calling the `MapEndpoints()` extension method when configuring ServiceStack, e.g:

```csharp
app.UseServiceStack(new AppHost(), options => {
    options.MapEndpoints();
});
```

This configures ServiceStack APIs to be registered and executed alongside Minimal APIs, Razor Pages, SignalR, MVC and Web API Controllers, etc., utilizing the same routing, metadata and execution pipeline.

#### View ServiceStack APIs along-side ASP.NET Core APIs

Amongst other benefits, this integration is evident in endpoint metadata explorers like the `Swashbuckle` library, which can now show ServiceStack APIs in its Swagger UI alongside other ASP.NET Core APIs in ServiceStack's new [Open API v3](/posts/openapi-v3) support.

### Routing

Using Endpoint Routing also means using ASP.NET Core's Routing System, which now lets you use ASP.NET Core's [Route constraints](https://learn.microsoft.com/en-us/aspnet/core/fundamentals/routing#route-constraints) when defining user-defined routes for your ServiceStack APIs, e.g:

```csharp
[Route("/users/{Id:int}")]
[Route("/users/{UserName:string}")]
public class GetUser : IGet, IReturn
{
    public int? Id { get; set; }
    public string? UserName { get; set; }
}
```

For the most part ServiceStack Routing implements a subset of ASP.NET Core's Routing features, so your existing user-defined routes should continue to work as expected.

### Wildcard Routes

The only incompatibility we found was when using wildcard paths, which in ServiceStack Routing would use an `*` suffix, e.g: `[Route("/wildcard/{Path*}")]`, which will need to change to use ASP.NET Core Routing's prefix, e.g:

```csharp
[Route("/wildcard/{*Path}")]
[Route("/wildcard/{**Path}")]
public class GetFile : IGet, IReturn
{
    public string Path { get; set; }
}
```

#### ServiceStack Routing Compatibility

To improve compatibility with ASP.NET Core's Routing, ServiceStack Routing (when not using Endpoint Routing) now supports parsing ASP.NET Core's Route Constraints, but as they're inert you would need to continue to use [Custom Route Rules](https://docs.servicestack.net/routing#custom-rules) to distinguish between different routes matching the same path at different specificity:

```csharp
[Route("/users/{Id:int}", Matches = "**/{int}")]
[Route("/users/{UserName:string}")]
public class GetUser : IGet, IReturn
{
    public int? Id { get; set; }
    public string? UserName { get; set; }
}
```

It also supports defining Wildcard Routes using ASP.NET Core's syntax, which we now recommend using instead for compatibility when switching to Endpoint Routing:

```csharp
[Route("/wildcard/{*Path}")]
[Route("/wildcard/{**Path}")]
public class GetFile : IGet, IReturn
{
    public string Path { get; set; }
}
```

### Primary HTTP Method

Another difference is that an API will only register its Endpoint Route for its [primary HTTP Method](https://docs.servicestack.net/api-design#all-apis-have-a-preferred-default-method); if you want an API to be registered for multiple HTTP Methods you can specify them in the `Route` attribute, e.g:

```csharp
[Route("/users/{Id:int}", "GET,POST")]
public class GetUser : IGet, IReturn
{
    public required int Id { get; set; }
}
```

As such, we recommend using the IVerb `IGet`, `IPost`, `IPut`, `IPatch`, `IDelete` interface markers to specify the primary HTTP Method for an API. This isn't needed for [AutoQuery Services](https://docs.servicestack.net/autoquery/), which are implicitly configured to use their optimal HTTP Method. If no HTTP Method is specified, the Primary HTTP Method defaults to HTTP **POST**.

### Authorization

Using Endpoint Routing also means ServiceStack's APIs are authorized the same way, where ServiceStack's [Declarative Validation attributes](https://docs.servicestack.net/auth/#declarative-validation-attributes) are converted into ASP.NET Core's `[Authorize]` attribute to secure the endpoint:

```csharp
[ValidateIsAuthenticated]
[ValidateIsAdmin]
[ValidateHasRole(role)]
[ValidateHasClaim(type,value)]
[ValidateHasScope(scope)]
public class Secured {}
```

#### Authorize Attribute on ServiceStack APIs

Alternatively you can now use ASP.NET Core's `[Authorize]` attribute directly to secure ServiceStack APIs, should you need more fine-grained Authorization:

```csharp
[Authorize(Roles = "RequiredRole")]
[Authorize(Policy = "RequiredPolicy")]
[Authorize(AuthenticationSchemes = "Identity.Application,Bearer")]
public class Secured {}
```

#### Configuring Authentication Schemes

ServiceStack defaults to using the major Authentication Schemes configured for your App to secure the API endpoints with; this can be overridden to specify which Authentication Schemes to use to restrict ServiceStack APIs by default, e.g:

```csharp
app.UseServiceStack(new AppHost(), options => {
    options.AuthenticationSchemes = "Identity.Application,Bearer";
    options.MapEndpoints();
});
```

### Hidden ServiceStack Endpoints

Whilst ServiceStack Requests are registered and executed as endpoints, most of them are marked with `builder.ExcludeFromDescription()` to hide them from polluting metadata and API Explorers like Swagger UI and [API Explorer](https://docs.servicestack.net/api-explorer).
To also hide your ServiceStack APIs, you can use the `[ExcludeMetadata]` attribute to hide them from all metadata services, or use `[Exclude(Feature.ApiExplorer)]` to just hide them from API Explorer UIs:

```csharp
[ExcludeMetadata]
[Exclude(Feature.ApiExplorer)]
public class HiddenRequest {}
```

### Content Negotiation

An example of these hidden routes is the support for invoking and returning ServiceStack APIs in different Content Types, via hidden Endpoint Routes mapped with the format `/api/{Request}.{format}`, e.g:

- [/api/QueryBookings](https://blazor-vue.web-templates.io/api/QueryBookings)
- [/api/QueryBookings.jsonl](https://blazor-vue.web-templates.io/api/QueryBookings.jsonl)
- [/api/QueryBookings.csv](https://blazor-vue.web-templates.io/api/QueryBookings.csv)
- [/api/QueryBookings.xml](https://blazor-vue.web-templates.io/api/QueryBookings.xml)
- [/api/QueryBookings.html](https://blazor-vue.web-templates.io/api/QueryBookings.html)

#### Query String Format

These continue to support specifying the Mime Type via the `?format` query string, e.g:

- [/api/QueryBookings?format=jsonl](https://blazor-vue.web-templates.io/api/QueryBookings?format=jsonl)
- [/api/QueryBookings?format=csv](https://blazor-vue.web-templates.io/api/QueryBookings?format=csv)

### Predefined Routes

Endpoints are only created for the newer `/api/{Request}` [pre-defined routes](https://docs.servicestack.net/routing#pre-defined-routes), which should be easier to use with fewer conflicts now that ServiceStack APIs are executed alongside other endpoint-routed APIs, which can share the same `/api` base path with non-conflicting routes, e.g: `app.MapGet("/api/minimal-api")`.

As a result, clients configured to use the older `/json/reply/{Request}` pre-defined route will need to be configured to use the newer `/api` base path.

No change is required for C#/.NET clients using the recommended `JsonApiClient` JSON Service Client, which is already configured to use the newer `/api` base path:
```csharp
var client = new JsonApiClient(baseUri);
```

Older .NET clients can be configured to use the newer `/api` pre-defined routes with:

```csharp
var client = new JsonServiceClient(baseUri) {
    UseBasePath = "/api"
};
var client = new JsonHttpClient(baseUri) {
    UseBasePath = "/api"
};
```

To further solidify `/api` as the preferred pre-defined route, we've also **updated all generic service clients** of other languages to use the `/api` base path by default:

#### JavaScript/TypeScript

```ts
const client = new JsonServiceClient(baseUrl)
```

#### Dart

```dart
var client = ClientFactory.api(baseUrl);
```

#### Java/Kotlin

```java
JsonServiceClient client = new JsonServiceClient(baseUrl);
```

#### Python

```python
client = JsonServiceClient(baseUrl)
```

#### PHP

```php
$client = new JsonServiceClient(baseUrl);
```

### Revert to Legacy Predefined Routes

You can unset the base path to revert back to using the older `/json/reply/{Request}` pre-defined route, e.g:

#### JavaScript/TypeScript

```ts
client.basePath = null;
```

#### Dart

```dart
var client = ClientFactory.create(baseUrl);
```

#### Java/Kotlin

```java
client.setBasePath();
```

#### Python

```python
client.set_base_path()
```

#### PHP

```php
$client->setBasePath();
```

### Customize Endpoint Mapping

You can register `RouteHandlerBuilders` to customize how ServiceStack API endpoints are registered, which is also what ServiceStack uses to annotate its API endpoints to enable its new [Open API v3](/posts/openapi-v3) support:

```csharp
options.RouteHandlerBuilders.Add((builder, operation, method, route) => {
    builder.WithOpenApi(op => { ... });
});
```

### Endpoint Routing Compatibility Levels

The default behavior of `MapEndpoints()` uses the strictest and recommended configuration that we want future ServiceStack Apps to use, however if you're migrating existing Apps you may want to relax these defaults to improve compatibility with existing behavior. The configurable defaults for mapping endpoints are:

```csharp
app.UseServiceStack(new AppHost(), options => {
    options.MapEndpoints(use:true, force:true, useSystemJson:UseSystemJson.Always);
});
```

- `use` - Whether to use registered endpoints for executing ServiceStack APIs
- `force` - Whether to only allow APIs to be executed through endpoints
- `useSystemJson` - Whether to use System.Text.Json for JSON API Serialization

So you could, for instance, register endpoints and not `use` them, where they'll be visible in endpoint API explorers like [Swagger UI](https://docs.servicestack.net/releases/v8_01#openapi-v3) but continue to execute in ServiceStack's Request Pipeline.

`force` disables fallback execution of ServiceStack Requests through ServiceStack's Request Pipeline for requests that don't match registered endpoints. You may need to disable this if you have clients calling ServiceStack APIs through multiple HTTP Methods, as only the primary HTTP Method is registered as an endpoint. When enabled, `force` ensures the only ServiceStack Requests not executed through registered endpoints are `IAppHost.CatchAllHandlers` and `IAppHost.FallbackHandler` handlers.

`useSystemJson` is a new feature that lets you specify when to use `System.Text.Json` for JSON API Serialization, which is our next exciting feature to standardize on using [ASP.NET Core's fast async System.Text.Json](https://docs.servicestack.net/releases/v8_01#system.text.json) Serializer.
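For example, a migrating App could register endpoints so they're visible in API explorers whilst continuing to execute requests through ServiceStack's Request Pipeline and its own JSON Serializer; a sketch using the parameters described above:

```csharp
// Relaxed migration settings
app.UseServiceStack(new AppHost(), options => {
    options.MapEndpoints(use:false, force:false, useSystemJson:UseSystemJson.Never);
});
```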
## Endpoint Routing Everywhere

Whilst the compatibility levels of Endpoint Routing can be relaxed, we recommend new projects use the strictest and most integrated defaults that are now configured on all [ASP.NET Core Identity Auth .NET 8 Projects](/start).

For additional testing we've also upgraded many of our existing .NET Example Applications, which are now all running with our latest recommended Endpoint Routing configuration:

- [BlazorDiffusionVue](https://github.com/NetCoreApps/BlazorDiffusionVue)
- [BlazorDiffusionAuto](https://github.com/NetCoreApps/BlazorDiffusionAuto)
- [TypeChatExamples](https://github.com/NetCoreApps/TypeChatExamples)
- [TalentBlazor](https://github.com/NetCoreApps/TalentBlazor)
- [TechStacks](https://github.com/NetCoreApps/TechStacks)
- [Validation](https://github.com/NetCoreApps/Validation)
- [NorthwindAuto](https://github.com/NetCoreApps/NorthwindAuto)
- [FileBlazor](https://github.com/NetCoreApps/FileBlazor)
- [Chinook](https://github.com/NetCoreApps/Chinook)
- [Chat](https://github.com/NetCoreApps/Chat)

# Migrating to ASP.NET Core Identity for Authentication

Source: https://servicestack.net/posts/identity-migration

## ASP.NET Core Identity

Since the release of ServiceStack v8 we have started to include [ASP.NET Core Identity for authentication](https://learn.microsoft.com/en-us/aspnet/core/security/authentication/identity?view=aspnetcore-8.0&tabs=visual-studio) in [our templates](https://github.com/NetCoreTemplates), giving developers the option to use either the built-in ASP.NET Core Identity authentication system or ServiceStack's own authentication system when building their next system. This provides closer alignment with the ASP.NET Core ecosystem and lets developers use the built-in ASP.NET Core Identity authentication system they may already be familiar with.

If you are already using ServiceStack's authentication system, you can continue to do so, but if you are looking to migrate to ASP.NET Core Identity, this guide will walk you through the process with a concrete example of migrating our [BlazorDiffusion](https://github.com/NetCoreApps/BlazorDiffusion) example application.

## Overview of the migration process

The migration process can be broken down into the following steps:

- Add NuGet dependencies
- Create the ASP.NET Core Identity `AspNetUsers` class based on your existing custom `UserAuth` class
- Create the ASP.NET Core Identity `AspNetRoles` class, ensuring its primary key type matches `AspNetUsers`
- Create the ASP.NET Core Identity `ApplicationDbContext` class, again matching the primary key type of `AspNetUsers`
- Create an EntityFrameworkCore migration to initialize the ASP.NET Core Identity tables
- Update the `AuthFeature` registration to use ASP.NET Core Identity
- Update `Program.cs` to use ASP.NET Core Identity
- Implement the Migrate Users Task
- Migrate Roles
- Migrate Foreign Keys from UserAuth to AspNetUsers

In this guide we will walk through each of these steps in detail and show how we migrated our BlazorDiffusion example application over to ASP.NET Core Identity to help you with your own migration.

### Add ASP.NET Core Identity EntityFrameworkCore NuGet package

The first step is to add the required ASP.NET Core Identity NuGet packages to your project, either using the dotnet CLI or via Visual Studio's NuGet package manager:
:::shell
dotnet add package Microsoft.AspNetCore.Identity.EntityFrameworkCore
dotnet add package Microsoft.EntityFrameworkCore.Tools
:::

Since BlazorDiffusion was an existing Blazor project, [we created a new `blazor-wasm` project](https://github.com/NetCoreTemplates/blazor-wasm) using `x new blazor-wasm BlazorDiffusion` and migrated the Services and Components over to the new project. We can do this because the `blazor-wasm` template and others have been [updated to use ASP.NET Core Identity by default](https://docs.servicestack.net/auth/identity-auth). So if your project previously used a ServiceStack template, first check if there is an updated version of the template available with ASP.NET Core Identity support.

### Create ASP.NET Core Identity `AspNetUsers` class

Next you will need to create a class that inherits from `IdentityUser` to represent your users. [This class will be used by ASP.NET Core Identity to store user information in the database](https://learn.microsoft.com/en-us/aspnet/core/security/authentication/customize-identity-model?view=aspnetcore-8.0). You will want to mirror customizations from your own `UserAuth` class onto this new class, which will be stored in the database as the `AspNetUsers` table.

To minimize changes, you can rename your existing `AppUser` class to something like `OldAppUser` and then create a new `AppUser` class that inherits from `IdentityUser` and copy over any customizations from `OldAppUser`.

:::info
In this case `AppUser` is the name of our custom `UserAuth` class.
:::

You will still need to reference your `OldAppUser` class for migrating users, so you will want to point it to the `AppUser` table by using the `[Alias("AppUser")]` attribute.

```csharp
[Alias("AppUser")]
public class OldAppUser
{
    [AutoIncrement]
    public int Id { get; set; }
    public string UserName { get; set; }
    public string DisplayName { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public string? Handle { get; set; }
    public string Email { get; set; }
    public string PasswordHash { get; set; }
    public string? ProfileUrl { get; set; }
    public string? Avatar { get; set; } //overrides ProfileUrl
    public string? LastLoginIp { get; set; }
    public DateTime? LastLoginDate { get; set; }
    public string RefIdStr { get; set; }
    public DateTime? LockedDate { get; set; }
    public DateTime CreatedDate { get; set; }
    public DateTime ModifiedDate { get; set; }
}
```

When creating your new `AppUser` class, you will want to copy over any customizations from your `OldAppUser` class. In this case we have added a `Handle` property to our `OldAppUser` class, so this will need to be included in the new `AppUser` class as well. Essentially your custom EF IdentityUser will want a copy of all the properties you want to migrate other than `Id`, `Email` and `PasswordHash` that are already defined in the base `IdentityUser` class. Since BlazorDiffusion's existing users use an `int` primary key, `AppUser` inherits `IdentityUser<int>`:

```csharp
// Add profile data for application users by adding properties to the AppUser class
[Alias("AspNetUsers")]
public class AppUser : IdentityUser<int>
{
    public string? FirstName { get; set; }
    public string? LastName { get; set; }
    public string? DisplayName { get; set; }
    public string? ProfileUrl { get; set; }
    [Input(Type = "file"), UploadTo("avatars")]
    public string? Avatar { get; set; } //overrides ProfileUrl
    public string? Handle { get; set; }
    public int? RefId { get; set; }
    public string RefIdStr { get; set; } = Guid.NewGuid().ToString();
    public bool IsArchived { get; set; }
    public DateTime? ArchivedDate { get; set; }
    public string? LastLoginIp { get; set; }
    public DateTime? LastLoginDate { get; set; }
    public DateTime CreatedDate { get; set; } = DateTime.UtcNow;
    public DateTime ModifiedDate { get; set; } = DateTime.UtcNow;
}
```

### Create ASP.NET Core Identity `AspNetRoles`

Next you will need to create a class that inherits from `IdentityRole` to represent your user roles, which ASP.NET Core Identity will use to store role information in the database. Because our `AppUser` class uses a different primary key type than the default `string`, you will need to specify the matching primary key type for your `AppRole` class:

```csharp
[Alias("AspNetRoles")]
public class AppRole : IdentityRole<int>
{
    public AppRole() {}
    public AppRole(string roleName) : base(roleName) {}
}
```

### Create ASP.NET Core Identity `ApplicationDbContext` class

Now to use our `AppUser` and `AppRole` classes, you will need to create a class that inherits from `IdentityDbContext` to represent your database context. Just like any other EntityFrameworkCore database context, this class will be used to query and save data to the database.

```csharp
public class ApplicationDbContext(DbContextOptions options)
    : IdentityDbContext<AppUser, AppRole, int>(options)
{
    protected override void OnModelCreating(ModelBuilder builder)
    {
        base.OnModelCreating(builder);
        builder.Entity<AppUser>()
            .HasIndex(x => x.Handle)
            .IsUnique();
    }
}
```

The above uses the `Handle` property on the `AppUser` class to create a unique index on the `Handle` column in the `AspNetUsers` table. You can add other custom restrictions to your schema here as needed.

### Create EntityFrameworkCore migration to initialize ASP.NET Core Identity tables

Now that you have your `AppUser` and `AppRole` classes, and can access them via your newly created `ApplicationDbContext` class, you can create an EntityFrameworkCore migration to initialize the ASP.NET Core Identity tables. [You can generate your initial migration using the dotnet CLI or via Visual Studio's Package Manager Console](https://learn.microsoft.com/en-us/ef/core/cli/dotnet).

:::shell
dotnet ef migrations add CreateIdentitySchema
:::

You should run this command from the AppHost project directory, which in our case is `BlazorDiffusion`. This will generate your new EntityFrameworkCore migration in the `Migrations` directory of your AppHost project, alongside your ServiceStack migrations.

With your new migration created, you can now update your database schema to include the ASP.NET Core Identity tables:

:::shell
dotnet ef database update
:::

Using the [dotnet EntityFramework CLI is great for local development](https://learn.microsoft.com/en-us/ef/core/cli/dotnet), but for production deployments you will need to run the migrations on your server. You can do this using ServiceStack's AppTasks feature prior to the standard ServiceStack migrations.
```csharp
public class ConfigureDbMigrations : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureAppHost(appHost => {
            var migrator = new Migrator(appHost.Resolve<IDbConnectionFactory>(), typeof(Migration1000).Assembly);
            AppTasks.Register("migrate", _ =>
            {
                var log = appHost.GetApplicationServices().GetRequiredService<ILogger<ConfigureDbMigrations>>();

                log.LogInformation("Running EF Migrations...");
                var scopeFactory = appHost.GetApplicationServices().GetRequiredService<IServiceScopeFactory>();
                using (var scope = scopeFactory.CreateScope())
                {
                    using var dbContext = scope.ServiceProvider.GetRequiredService<ApplicationDbContext>();
                    dbContext.Database.EnsureCreated();
                    dbContext.Database.Migrate();
                }
            });
        });
}
```

In the above example we are ensuring the database is created, which creates the required schema, then running the migrations to update the schema to the latest version.

### Update `AuthFeature` registration to use ASP.NET Core Identity

With your ASP.NET Core Identity tables created, you can now update your [`AuthFeature` registration](https://docs.servicestack.net/auth/authentication-and-authorization) to use ASP.NET Core Identity.

```csharp
public class ConfigureAuth : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureAppHost(appHost =>
        {
            appHost.Plugins.Add(new AuthFeature(IdentityAuth.For<AppUser, int>(options => {
                options.EnableCredentialsAuth = true;
                options.SessionFactory = () => new CustomUserSession();
            })));
        });
}
```

In the above example we are using the `IdentityAuth` class to register ASP.NET Core Identity with ServiceStack. This class is a wrapper around the standard ASP.NET Core Identity registration process and allows you to configure ASP.NET Core Identity options. [ServiceStack uses a compatible Identity v2 password hashing format](https://docs.servicestack.net/auth/migrate-to-identity-auth), which should let you migrate your users to Identity Auth without the need to reset their passwords.

### Update `Program.cs` to use ASP.NET Core Identity

Now you will need to configure the IdentityCore middleware in your `Program.cs` file.

```csharp
services.AddAuthentication(options => {
        options.DefaultScheme = IdentityConstants.ApplicationScheme;
        options.DefaultSignInScheme = IdentityConstants.ExternalScheme;
    })
    .AddIdentityCookies();

services.AddDataProtection()
    .PersistKeysToFileSystem(new DirectoryInfo("App_Data"));

// $ dotnet ef migrations add CreateIdentitySchema
// $ dotnet ef database update
var connectionString = config.GetConnectionString("DefaultConnection")
    ?? throw new InvalidOperationException("Connection string 'DefaultConnection' not found.");
services.AddDbContext<ApplicationDbContext>(options =>
    options.UseSqlite(connectionString, b => b.MigrationsAssembly(nameof(BlazorDiffusion))));
services.AddDatabaseDeveloperPageExceptionFilter();

services.AddIdentityCore<AppUser>(options => options.SignIn.RequireConfirmedAccount = true)
    .AddRoles<AppRole>()
    .AddEntityFrameworkStores<ApplicationDbContext>()
    .AddSignInManager()
    .AddDefaultTokenProviders();
```

Since BlazorDiffusion is a .NET 8 Blazor WASM application, we also needed some additional dependencies setup:

```csharp
services.AddCascadingAuthenticationState();
services.AddScoped<IdentityUserAccessor>();
services.AddScoped<IdentityRedirectManager>();
services.AddScoped<AuthenticationStateProvider, IdentityRevalidatingAuthenticationStateProvider>();
```

If you are migrating to Identity from an existing Blazor application, our templates have [tailwind-css styled login and register pages that you can use to get started](https://github.com/NetCoreTemplates/blazor-wasm/tree/main/MyApp/Components/Account). For these, you will also need the additional Identity endpoints mapped.
```csharp
// Add additional endpoints required by the Identity /Account Razor components.
app.MapAdditionalIdentityEndpoints();
```

### Implement the Migrate Users Task

So far we have prepared the application to use ASP.NET Core Identity, but we still need to migrate our existing users to the new ASP.NET Core Identity tables. This requires:

- Migrating users from the `AppUser` table to the `AspNetUsers` table
- Migrating the custom roles table to the `AspNetRoles` table
- Migrating foreign keys from the `UserAuth` table to the `AspNetUsers` table
- Migrating foreign keys from the `UserAuthRole` table to the `AspNetUserRoles` table (if any)

To do this we will create a new AppTask that will migrate our users to the new ASP.NET Core Identity tables.

```csharp
AppTasks.Register("migrate.users", _ =>
{
    var log = appHost.GetApplicationServices().GetRequiredService<ILogger<ConfigureDbMigrations>>();
    log.LogInformation("Running migrate.users...");
    var scopeFactory = appHost.GetApplicationServices().GetRequiredService<IServiceScopeFactory>();
    using var scope = scopeFactory.CreateScope();
    using var dbContext = scope.ServiceProvider.GetRequiredService<ApplicationDbContext>();
    using var db = scope.ServiceProvider.GetRequiredService<IDbConnectionFactory>().Open();

    var migrateUsers = db.Select(db.From<OldAppUser>().OrderBy(x => x.Id));

    log.LogInformation("Migrating {Count} Existing ServiceStack Users to Identity Auth Users...", migrateUsers.Count);
    MigrateExistingUsers(dbContext, scope.ServiceProvider, migrateUsers).Wait();
});
```

In the above example we are using the `IDbConnectionFactory` to open a connection to our database and select all of our existing users from the `AppUser` table. The `MigrateExistingUsers` method then migrates our existing users to the new ASP.NET Core Identity tables.

```csharp
private async Task MigrateExistingUsers(ApplicationDbContext dbContext, IServiceProvider services,
    List<OldAppUser> migrateUsers, string tempPassword="p@55wOrd")
{
    var userManager = services.GetRequiredService<UserManager<AppUser>>();
    var now = DateTime.UtcNow;

    foreach (var user in migrateUsers)
    {
        var appUser = new AppUser
        {
            Id = user.Id,
            UserName = user.Email,
            Email = user.Email,
            DisplayName = user.DisplayName,
            FirstName = user.FirstName,
            LastName = user.LastName,
            Handle = user.Handle,
            ProfileUrl = user.ProfileUrl,
            Avatar = user.Avatar,
            RefIdStr = user.RefIdStr ?? Guid.NewGuid().ToString(),
            LockoutEnabled = true,
            LockoutEnd = user.LockedDate != null ? now.AddYears(10) : now,
            LastLoginDate = user.LastLoginDate,
            LastLoginIp = user.LastLoginIp,
            CreatedDate = user.CreatedDate,
            ModifiedDate = user.ModifiedDate,
            EmailConfirmed = true,
        };
        await userManager.CreateAsync(appUser, tempPassword);
        if (user.PasswordHash != null)
        {
            // Update raw PasswordHash (which uses the older ASP.NET Identity v2 format), after users
            // successfully sign in the password is re-hashed using the latest ASP.NET Identity v3 implementation
            dbContext.Users
                .Where(x => x.Id == user.Id)
                .ExecuteUpdate(setters => setters.SetProperty(x => x.PasswordHash, user.PasswordHash));
        }
    }
}
```

In the above example we are using [the `UserManager<AppUser>`](https://learn.microsoft.com/en-us/dotnet/api/microsoft.aspnetcore.identity.usermanager-1?view=aspnetcore-8.0) to create a new `AppUser` for each of our existing users and then updating the `PasswordHash` property from the `OldAppUser` table.

### Migrating Roles

If you are using custom roles, you will also need to migrate these to the new ASP.NET Core Identity tables, and then assign them to your users based on their existing roles in your previous setup.
Your ServiceStack Authentication roles will be stored in the separate `UserAuthRole` table or in the `Roles` property of your `UserAuth` class. You will need to migrate these roles to the new ASP.NET Core Identity tables and then assign them to your users.

```csharp
foreach (var roleName in allRoles)
{
    var roleExist = await roleManager.RoleExistsAsync(roleName);
    if (!roleExist)
    {
        //create the roles and seed them to the database
        assertResult(await roleManager.CreateAsync(new AppRole(roleName)));
    }
}
```

If your use of roles is static, you can create the list of all your roles from your application's code. If your use of roles is dynamic, you will need to query your database for all the roles that exist in your `UserAuthRole` table.

One difference between ServiceStack Authentication and ASP.NET Core Identity is the use of the `Admin` role. In ServiceStack Authentication, the `Admin` role is a special role that gives the user access to all protected resources. In ASP.NET Core Identity, the `Admin` role is just a regular role that can be assigned to users. This means users with the `Admin` role in your existing application will need additional roles assigned to them in ASP.NET Core Identity to give them access to the same protected resources. In BlazorDiffusion we used the `Admin` role as well as others, but to preserve the existing behavior, we assigned all roles to users with the `Admin` role.

```csharp
foreach (var user in Users.All)
{
    var appUser = new AppUser
    {
        Id = user.Id,
        Email = user.Email,
        DisplayName = user.DisplayName,
        UserName = user.Email,
        Handle = user.Handle,
        Avatar = user.Avatar,
        EmailConfirmed = true,
    };
    if (appUser.Email == "admin@email.com")
        await EnsureUserAsync(appUser, "p@55wOrd", AppRoles.All);
    else
        await EnsureUserAsync(appUser, "p@55wOrd", user.Roles);
}
```

The `EnsureUserAsync` method will assign the roles to the user.

```csharp
async Task EnsureUserAsync(AppUser user, string password, string[]? roles = null)
{
    var existingUser = await userManager.FindByEmailAsync(user.Email!);
    if (existingUser != null) return;

    await userManager!.CreateAsync(user, password);
    if (roles?.Length > 0)
    {
        var newUser = await userManager.FindByEmailAsync(user.Email!);
        assertResult(await userManager.AddToRolesAsync(newUser!, roles));
    }
}
```

### Migrate Foreign Keys from UserAuth to AspNetUsers

If you are using foreign keys to your user tables in your existing application, you will need to migrate these to the new ASP.NET Core Identity tables. For databases like PostgreSQL, you can use the `ALTER TABLE` command to drop the foreign key constraint to the `UserAuth` or `AppUser` table and add a foreign key constraint to the `AspNetUsers` table. In BlazorDiffusion for example, the `Creative` table was using a foreign key to the `AppUser` table, so we needed to replace this with a foreign key to the `AspNetUsers` table.

```sql
ALTER TABLE "Creative" DROP CONSTRAINT "FK_Creative_AppUser_UserId";
ALTER TABLE "Creative" ADD CONSTRAINT "FK_Creative_AspNetUsers_UserId"
    FOREIGN KEY ("UserId") REFERENCES "AspNetUsers" ("Id");
```

For databases like SQLite, you will need to create a new table with the foreign key constraint and then copy the data over from the old table. We use SQLite for BlazorDiffusion since it makes it easy to deploy the application, and SQLite is a great option for small applications that don't need to scale. Since we have to migrate several tables, we can create a `ReplaceForeignKeyConstraint` method to handle this for us.
```csharp
private void ReplaceForeignKeyConstraint<TModel>()
{
    var modelDef = typeof(TModel).GetModelMetadata();
    var createTable = SqliteDialect.Provider.ToCreateTableStatement(typeof(TModel));
    var sql = $@"PRAGMA foreign_keys = OFF;
ALTER TABLE {modelDef.ModelName} RENAME TO {modelDef.ModelName}_old;
{createTable}
INSERT INTO {modelDef.ModelName} SELECT * FROM {modelDef.ModelName}_old;
-- DROP TABLE {modelDef.ModelName}_old;
PRAGMA foreign_keys = ON;";
    Db.ExecuteSql(sql);
}
```

When replacing tables like this, you will need to be aware of the order in which you replace them. For example, if you have a foreign key from the `Creative` table to the `AppUser` table, and a foreign key from the `Artifact` table to the `Creative` table, you will need to replace the `Creative` table first, and then the `Artifact` table. This is because the `Artifact` table has a foreign key to the `Creative` table, and if you replace the `Artifact` table first, its foreign key will still be pointing to the old `Creative` table.

#### Incorrect migration order result
*Incorrect DB mapping after migration.*
#### Correct migration order result
*Correct DB mapping after migration.*
The rule of thumb is that you will want to replace tables from most depended on to least depended on.

Another limitation of the SQL above is that the order of the columns in the new table must match the order of the columns in the old table. `INSERT INTO` inserts the data based on the column order of the new table, so if the column order differs, data will be inserted into the wrong columns.

During the migration of BlazorDiffusion, we hit this issue with the `Artifact` table. The `Artifact` C# class uses the `AuditBase` base class which has the `CreatedDate` and `ModifiedDate` properties, and in a previous migration we added some additional columns as features were added. So when creating a copy of the `Artifact` class in the ServiceStack migration to handle fixing the foreign key, the order of the columns was different. Thankfully, since the internal classes used in migrations are kept completely separate so migrations stay repeatable, we can just create an `Artifact` class specific to this migration. So instead of inheriting from `AuditBase`, we can just copy the properties from `AuditBase` into the `Artifact` class in the order required.

Putting it all together, we have a migration `Up` method that replaces each table with a foreign key to `AppUser`, in most-depended-on-first order (`Creative` and `Artifact` are the tables named above; the full list matches your own schema):

```csharp
public class Migration1006 : MigrationBase
{
    public override void Up()
    {
        var appHost = HostContext.AppHost;
        var log = appHost.GetApplicationServices().GetRequiredService<ILogger<Migration1006>>();
        log.LogInformation("Migrating FKs from AppUser to AspNetUsers...");

        ReplaceForeignKeyConstraint<Creative>();
        ReplaceForeignKeyConstraint<Artifact>();
        //... repeat for each remaining table with a foreign key to AppUser
    }
}
```

In a separate migration, we can then drop the old tables after confirming the migration was successful and the previous data has been migrated to the new tables.

```csharp
public class Migration1007 : MigrationBase
{
    public override void Up()
    {
        DropOldTable<Creative>();
        DropOldTable<Artifact>();
        //... repeat for each table replaced in the previous migration
    }

    private void DropOldTable<TModel>()
    {
        var modelDef = typeof(TModel).GetModelMetadata();
        Db.ExecuteSql($@"PRAGMA foreign_keys = OFF;
DROP TABLE IF EXISTS {modelDef.ModelName}_old;
PRAGMA foreign_keys = ON;");
    }
}
```

### Migrate Foreign Keys from UserAuthRole to AspNetUserRoles

If you have any tables with a foreign key to your custom `UserAuthRole` table, you will need to do the same as above and migrate these to the new ASP.NET Core Identity tables.

## Why Migrate to ASP.NET Core Identity?

ServiceStack's built-in authentication system remains a great option for many applications, providing a simple, easy to use authentication system that works out of the box with ServiceStack's built-in features like Sessions, Caching and OrmLite. However, as part of making ServiceStack more compatible with the ASP.NET Core ecosystem, we have started to include ASP.NET Core Identity since a lot of ASP.NET developers are already familiar with it. It also provides features like two-factor authentication, external authentication providers, and more that are not available in ServiceStack's built-in authentication system.
![](/img/posts/identity-migration/two-factor-auth-example.png)

So while you don't have to migrate to ASP.NET Core Identity, it is a great option if you are already familiar with it, or if you are looking to use some of the additional features it provides. If you are looking to migrate, we hope this guide helps you with your migration. If you have any questions, feel free to reach out on our [forums](https://forums.servicestack.net).

# Docker Containerization in .NET 8

Source: https://servicestack.net/posts/net8-docker-containers

### All .NET Project Templates upgraded to .NET 8

Included in the release of [ServiceStack v8](https://docs.servicestack.net/releases/v8_00), all of ServiceStack's [.NET project templates](https://github.com/NetCoreTemplates/) have been upgraded to use **ServiceStack v8** and the **.NET 8** target framework. In addition, the built-in CI/CD deployment GitHub Actions have been upgraded to use the [secure rootless Linux Docker containers](https://devblogs.microsoft.com/dotnet/securing-containers-with-rootless/) now built into .NET 8, which let you effortlessly deploy your containerized .NET 8 Apps with Docker and GitHub Registry via SSH to any Linux Server.
*Video: .NET 8 Docker Containers. Learn about the latest streamlined containerization support built into .NET 8.*
### .NET 8 Docker Containerization

.NET 8 simplifies Docker integration using functionality built into the .NET SDK tooling, where `dotnet publish` can publish your .NET 8 App to a container image without a Dockerfile, adhering to the latest least-privilege and hardened security best practices of running Apps as non-root by default.

This **publish to container** feature also supports creating Docker images for different architectures like ARM64, which sees [significant improvements in .NET 8](https://devblogs.microsoft.com/dotnet/this-arm64-performance-in-dotnet-8/), making deploying your .NET Apps to ARM64 an [even better value proposition](https://servicestack.net/posts/cloud-value-between-architectures).

If you need more control over your Docker image, you can still use a Dockerfile to customize your image, and there are [even tools to generate Dockerfiles for you from your project](https://github.com/tmds/build-image).

### GitHub Action Deployments

In today's DevOps ecosystem, [GitHub Actions](https://github.com/features/actions) stand out as an invaluable asset for automating CI/CD workflows directly within your GitHub repository. The introduction of .NET 8 takes this a step further, offering a streamlined approach to generating Docker images through the `DefaultContainer` setting in your `.csproj`. This ensures consistent application packaging, making it deployment-ready with just `dotnet publish`.

ServiceStack's project templates bring additional flexibility: by utilizing foundational tools like [Docker](https://www.docker.com) for containerization and [SSH](https://en.wikipedia.org/wiki/Secure_Shell) for secure deployments, they're able to deploy your Dockerized .NET applications to any Linux server, whether self-hosted or on any cloud provider.

#### Live Demos use their GitHub Actions to deploy themselves

Each template's Live Demo is itself deployed using its included GitHub Actions to a Linux server running on a **€13.60/month** shared 8GB RAM [Hetzner Cloud VM](https://www.hetzner.com/cloud) that's currently running 50+ Docker Containers.

This guide aims to walk you through the hosting setup and the GitHub Actions release process as introduced in ServiceStack's latest .NET 8 project templates.

## Deployment Files

Deployment files included in project templates to facilitate GitHub Actions deployments:

#### .deploy/

- [nginx-proxy-compose.yml](https://github.com/NetCoreTemplates/blazor/blob/master/.deploy/nginx-proxy-compose.yml) - Manage nginx reverse proxy and Let's Encrypt companion container (one-time setup per server)
- [docker-compose.yml](https://github.com/NetCoreTemplates/blazor/blob/main/.deploy/docker-compose.yml) - Manage .NET App Docker Container

#### .github/workflows/

- [build.yml](https://github.com/NetCoreTemplates/blazor/blob/master/.github/workflows/build.yml) - Build .NET Project and Run Tests
- [release.yml](https://github.com/NetCoreTemplates/blazor/blob/master/.github/workflows/release.yml) - Build container, Push to GitHub Packages Registry, SSH deploy to Linux server, Run DB Migrations and start the new Docker Container if successful, otherwise revert the Migration

## Prerequisites

Before your Linux server can accept GitHub Actions deployments, we need to set up your Linux deployment server. For a step-by-step walkthrough of these steps and more information about this solution, check out our video guide below:
*Video: Use GitHub Actions for Auto Deployments.*
### Setup Deployment Server

#### 1. Install Docker and Docker-Compose

Follow [Docker's installation instructions](https://docs.docker.com/engine/install/ubuntu/) to install the latest version of Docker.

#### 2. Configure SSH for GitHub Actions

Generate a dedicated SSH key pair to be used by GitHub Actions:

:::sh
ssh-keygen -t rsa -b 4096 -f ~/.ssh/github_actions
:::

Add the **public key** to your server's SSH **authorized_keys**:

:::sh
cat ~/.ssh/github_actions.pub >> ~/.ssh/authorized_keys
:::

Add the **private key** to your repo's `DEPLOY_KEY` GitHub Action Secret, which GitHub Actions will use to securely SSH into the server.

#### 3. Set Up nginx-reverse-proxy

You should have a `docker-compose` file similar to the `nginx-proxy-compose.yml` in your repository. Upload this file to your server:

:::sh
scp nginx-proxy-compose.yml user@your_server:~/
:::

To bring up the nginx reverse proxy and its companion container for handling TLS certificates, run:

:::sh
docker compose -f ~/nginx-proxy-compose.yml up -d
:::

This will start an nginx reverse proxy along with a companion Let's Encrypt container that automatically watches for additional Docker containers on the same network and initializes them with valid TLS certificates.

### Ready to host containerized .NET Apps

Your Linux server is now ready to accept multiple .NET App deployments from GitHub Actions. The guide below walks through the process of setting up your GitHub repository to deploy new ServiceStack .NET Apps to your Linux server.

## Step-by-Step Guide

### 1. Create Your ServiceStack Application

Start by creating your ServiceStack application, either from [ServiceStack's Start Page](https://servicestack.net/start) or by using the [x dotnet tool](https://docs.servicestack.net/dotnet-tool):

:::sh
x new blazor ProjectName
:::

Replace `ProjectName` with your desired project name to generate a new ServiceStack application pre-configured with the necessary Docker compose files and GitHub Action workflows as above.

### 2. Configure DNS for Your Application

You need a domain to point to your Linux server. Create an A Record in your DNS settings that points to the IP address of your Linux server:

- **Subdomain**: `app.example.org`
- **Record Type**: A
- **Value/Address**: IP address of your Linux server

This ensures that any requests to `app.example.org` are directed to your server.

### 3. Setting Up GitHub Secrets

Navigate to your GitHub repository's settings, find the "Secrets and variables" section, and add the following secrets:

- `DEPLOY_HOST`: IP address or hostname of your Linux server
- `DEPLOY_USERNAME`: SSH Username to use for deployments
- `DEPLOY_KEY`: Private key generated for GitHub Actions to SSH into your server
- `LETSENCRYPT_EMAIL`: Your email address for Let's Encrypt notifications

#### Using GitHub CLI for Secret Management

You can use the [GitHub CLI](https://cli.github.com/manual/gh_secret_set) for a quicker setup of these GitHub Action Secrets, e.g:

```bash
gh secret set DEPLOY_HOST --body="linux-server-host"
gh secret set DEPLOY_USERNAME --body="linux-server-username"
gh secret set DEPLOY_KEY --bodyFile="path/to/ssh-private-key"
gh secret set LETSENCRYPT_EMAIL --body="your-email@example.org"
```

These secrets will populate environment variables within your GitHub Actions workflow and other configuration files, enabling secure and automated deployment of your ServiceStack applications.

### 4. Push to Main Branch to Trigger Deployment
With everything set up, pushing code to the main branch of your repository will trigger the GitHub Action workflow, initiating the deployment process:

```bash
git add .
git commit -m "Initial commit"
git push origin main
```

### 5. Verifying the Deployment

After the GitHub Actions workflow completes, you can verify the deployment by:

- Checking the workflow's logs in your GitHub repository to ensure it completed successfully
- Navigating to your application's URL (e.g. `https://app.example.org`) in a web browser, where you should see your ServiceStack application up and running with a secure HTTPS connection

## Features

### DB Migrations

The GitHub Actions workflow includes a step to run database migrations on the remote server. The **Run remote db migrations** step automatically runs the `migrate` AppTask in the `app-migration` companion Docker container on the Linux host server to validate the migration was successful before completing deployment of the new App. A failed migration will cause the deployment to fail and the previous App version to continue to run.

### Patch appsettings.json with production secrets

One way to maintain sensitive information like API keys and connection strings for your production App outside of its source code GitHub repository is to patch the `appsettings.json` file with a [JSON Patch](https://jsonpatch.com) that's stored in your repo's `APPSETTINGS_PATCH` GitHub Action Secret and applied to the deployed App's `appsettings.json` file.

For example, the JSON Patch below will replace values and objects in your App's **appsettings.json**:

```json
[
    { "op":"add", "path":"/oauth.facebook.AppSecret", "value":"xxxx" },
    { "op":"add", "path":"/oauth.microsoft.AppSecret", "value":"xxxx" },
    { "op":"add", "path":"/smtp", "value":{
        "UserName": "xxxx",
        "Password": "xxxx",
        "Host": "smtp-server.example.org",
        "Port": 587,
        "From": "noreply@example.org",
        "FromName": "No Reply"
    } }
]
```

You can test your JSON Patch by saving it to `appsettings.json.patch` and applying it with the [patch feature](https://docs.servicestack.net/dotnet-tool#patch-json-files) of the `x` dotnet tool:

:::sh
x patch appsettings.json.patch
:::

## Anatomy of GitHub Actions Workflow

GitHub Actions workflows are defined in YAML files, and they provide a powerful way to automate your development process. This guide will take you through the key sections of the workflow to give you a comprehensive understanding of how it functions.

## Permissions

In this workflow, two permissions are specified:

- `packages: write`: Allows the workflow to upload Docker images to GitHub Packages
- `contents: write`: Required to access the repository content

Specifying permissions ensures that the GitHub Actions runner has just enough access to perform the tasks in the workflow.

## Jobs

This workflow consists of two jobs: `push_to_registry` and `deploy_via_ssh`.

### push_to_registry

This job runs on an Ubuntu 22.04 runner and is responsible for pushing the Docker image to the GitHub Container Registry. It proceeds only if the previous workflow did not fail. The job includes the following steps:

1. **Checkout**: Retrieves the latest or specific tag of the repository's code
2. **Env variable assignment**: Assigns necessary environment variables for subsequent steps
3. **Login to GitHub Container Registry**: Authenticates to the GitHub Container Registry
4. **Setup .NET Core**: Prepares the environment for .NET 8
5. **Build and push Docker image**: Creates and uploads the Docker image to GitHub Container Registry (ghcr.io)

### deploy_via_ssh

This job also runs on an Ubuntu 22.04 runner and depends on the successful completion of the `push_to_registry` job. Its role is to deploy the application via SSH. The steps involved are:

1. **Checkout**: Retrieves the latest or specific tag of the repository's code
2. **Repository name fix and env**: Sets up necessary environment variables
3. **Create .env file**: Generates a .env file required for deployment
4. **Copy files to target server via scp**: Securely copies files to the remote server
5. **Run remote db migrations**: Executes database migrations on the remote server
6. **Remote docker-compose up via ssh**: Deploys the Docker image with the application

## Triggers (on)

The workflow is designed to be triggered by:

1. **New GitHub Release**: Activates when a new release is published
2. **Successful Build action**: Runs whenever the specified Build action completes successfully on the main or master branches
3. **Manual trigger**: Allows for rollback to a specific release or redeployment of the latest release, with an input for specifying the version tag

Understanding these sections will help you navigate and modify the workflow as per your needs, ensuring a smooth and automated deployment process.

## Deployment Server Setup Expanded

### Ubuntu as the Reference Point

Though our example leverages Ubuntu, it's important to emphasize that the primary requirements for this deployment architecture are a Linux operating system, Docker, and SSH. Many popular Linux distributions like CentOS, Fedora, or Debian will work just as efficiently, provided they support Docker and SSH.

### The Crucial Role of SSH in GitHub Actions

**SSH** (Secure Shell) is not just a protocol to remotely access your server's terminal. In the context of GitHub Actions:

- SSH offers a **secure channel** between GitHub Actions and your Linux server
- Enables GitHub to **execute commands directly** on your server
- Provides a mechanism to **transfer files** (like Docker compose configurations or environment files) from the GitHub repository to the server

By generating a dedicated SSH key pair specifically for GitHub Actions (as above), we ensure a secure and isolated access mechanism. Only the entities possessing the private key (in this case, only GitHub Actions) can initiate an authenticated connection.

### Docker & Docker-Compose: Powering the Architecture

**Docker** encapsulates your ServiceStack application into containers, ensuring consistency across different environments. Some of its advantages include:

- **Isolation**: Your application runs in a consistent environment, irrespective of where Docker runs
- **Scalability**: Easily replicate containers to handle more requests
- **Version Control for Environments**: Create, maintain, and switch between different container images

**Docker-Compose** extends Docker's benefits by orchestrating the deployment of multi-container applications:

- **Ease of Configuration**: Describe your application's entire stack, including the application, database, cache, etc., in a single YAML file
- **Consistency Across Multiple Containers**: Ensures that containers are spun up in the right order and with the correct configurations
- **Simplifies Commands**: Instead of a long string of Docker CLI commands, a single `docker compose up` brings your whole stack online
### NGINX Reverse Proxy: The Silent Workhorse

Using an **nginx reverse proxy** in this deployment design offers several powerful advantages:

- **Load Balancing**: Distributes incoming requests across multiple ServiceStack applications, ensuring optimal resource utilization
- **TLS Management**: Together with its companion container, the nginx reverse proxy automates the process of obtaining and renewing TLS certificates, ensuring your applications are always securely accessible over HTTPS
- **Routing**: Directs incoming traffic to the correct application based on the domain or subdomain
- **Performance**: Caches content to reduce load times and reduce the load on your ServiceStack applications

With an nginx reverse proxy, you can host multiple ServiceStack (or non-ServiceStack) applications on a single server while providing each with its own domain or subdomain.

## Additional Resources

### Docker & Docker-Compose

- **[Docker Documentation](https://docs.docker.com/)**: Core concepts, CLI usage, and practical applications
- **[Docker-Compose Documentation](https://docs.docker.com/compose/)**: Define and manage multi-container applications

### GitHub Actions

- **[GitHub Actions Documentation](https://docs.github.com/en/actions)**: Creating workflows, managing secrets, and automation tips
- **[Starter Workflows](https://github.com/actions/starter-workflows)**: Templates for various languages and tools

### SSH & Security

- **[SSH Key Management](https://www.ssh.com/academy/ssh/keygen)**: Guidelines on generating and managing SSH keys
- **[GitHub Actions Secrets](https://docs.github.com/en/actions/security-guides/encrypted-secrets)**: Securely store and use sensitive information

# PHP typed client DTOs for .NET APIs

Source: https://servicestack.net/posts/php-typed-apis

We're happy to announce the **11th** [Add ServiceStack Reference](https://docs.servicestack.net/add-servicestack-reference) language to enjoy end-to-end typed support for calling .NET APIs - [PHP](https://www.php.net)!

The **Add ServiceStack Reference** feature enables a simple way for PHP clients and Applications to generate native PHP DTO classes to access your ServiceStack APIs.
*Video: End-to-end typed PHP. Learn about the rich JsonServiceClient & end-to-end typed API support for PHP.*
### PhpStorm ServiceStack Plugin

PHP developers using [PhpStorm](https://www.jetbrains.com/phpstorm/) can get a simplified development experience for consuming ServiceStack Services by installing the [ServiceStack Plugin](https://plugins.jetbrains.com/plugin/7749-servicestack) from the JetBrains Marketplace:

[![](/img/posts/php-typed-apis/phpstorm-servicestack-plugin.webp)](https://plugins.jetbrains.com/plugin/7749-servicestack)

Where you'll be able to right-click on a directory and click on **ServiceStack Reference** on the context menu:

![](/img/posts/php-typed-apis/phpstorm-add-servicestack-reference.webp)

This launches the **Add PHP ServiceStack Reference** dialog where you can enter the remote URL of the ServiceStack endpoint you wish to call to generate the typed PHP DTOs for all its APIs, which by default will be saved to `dtos.php`:

![](/img/posts/php-typed-apis/phpstorm-add-servicestack-reference-dialog.webp)

Then just import the DTOs and `JsonServiceClient` to be able to consume any of the remote ServiceStack APIs:

```php
<?php

use Servicestack\JsonServiceClient;
use dtos\FindTechnologies;

$client = new JsonServiceClient("https://techstacks.io");

$response = $client->send(new FindTechnologies(
    ids: [1,2,4,6],
    vendorName: "Google"));

print_r($response);
```

If any of the remote APIs change, their DTOs can be updated by right-clicking on `dtos.php` and clicking **Update ServiceStack Reference**:

![](/img/posts/php-typed-apis/phpstorm-update-servicestack-reference.webp)

### Install PHP ServiceStack Client

The only requirements for PHP apps to perform typed API Requests are the generated PHP DTOs and the generic `JsonServiceClient`, which can be installed in Composer projects with:

```bash
$ composer require servicestack/client
```

Or by adding the package to your `composer.json` then installing the dependencies:

```json
{
  "require": {
    "servicestack/client": "^1.0"
  }
}
```

## First class development experience

[PHP](https://www.php.net) is one of the world's most popular programming languages thanks to its ease of use, platform independence, large standard library, flexibility and fast development experience, which sees it excel as a popular language for web development and for popular CMS products like WordPress, Drupal and Joomla thanks to its flexibility, embeddability and ease of customization.

To maximize the experience for calling ServiceStack APIs within these environments, ServiceStack now supports PHP as a 1st class Add ServiceStack Reference supported language, giving PHP developers an end-to-end typed API for consuming ServiceStack APIs, complete with IDE integration in [PhpStorm](https://www.jetbrains.com/phpstorm/) as well as [built-in support in the x dotnet tool](https://docs.servicestack.net/dotnet-tool#addupdate-servicestack-references) to generate typed and annotated PHP DTOs for a remote ServiceStack instance from a single command-line.

### Ideal idiomatic Typed Message-based API

To maximize the utility of PHP DTOs and enable richer tooling support and a greater development experience, PHP DTOs are generated as typed [JsonSerializable](https://www.php.net/manual/en/class.jsonserializable.php) classes with [promoted constructors](https://www.php.net/manual/en/language.oop5.decon.php#language.oop5.decon.constructor.promotion) and annotated with [PHPDoc Types](https://phpstan.org/writing-php-code/phpdoc-types) - invaluable when scaling large PHP code-bases as it greatly improves the discoverability of a remote API.
DTOs are also enriched with interface markers and annotations which enable its optimal end-to-end typed API. The PHP DTOs and `JsonServiceClient` library follow [PHP naming conventions](https://infinum.com/handbook/wordpress/coding-standards/php-coding-standards/naming) so they'll naturally fit into existing PHP code bases.

Here's a sample of [techstacks.io](https://techstacks.io) generated PHP DTOs containing string and int Enums, an example AutoQuery and a standard Request & Response DTO showcasing the rich typing annotations and naming conventions used:

```php
enum TechnologyTier : string
{
    case ProgrammingLanguage = 'ProgrammingLanguage';
    case Client = 'Client';
    case Http = 'Http';
    case Server = 'Server';
    case Data = 'Data';
    case SoftwareInfrastructure = 'SoftwareInfrastructure';
    case OperatingSystem = 'OperatingSystem';
    case HardwareInfrastructure = 'HardwareInfrastructure';
    case ThirdPartyServices = 'ThirdPartyServices';
}

enum Frequency : int
{
    case Daily = 1;
    case Weekly = 7;
    case Monthly = 30;
    case Quarterly = 90;
}

// @Route("/technology/search")
#[Returns('QueryResponse')]
/**
 * @template QueryDb of Technology
 * @template QueryDb1 of TechnologyView
 */
class FindTechnologies extends QueryDb implements IReturn, IGet, JsonSerializable
{
    public function __construct(
        /** @var array<int>|null */
        public ?array $ids=null,
        /** @var string|null */
        public ?string $name=null,
        /** @var string|null */
        public ?string $vendorName=null,
        /** @var string|null */
        public ?string $nameContains=null,
        /** @var string|null */
        public ?string $vendorNameContains=null,
        /** @var string|null */
        public ?string $descriptionContains=null
    ) {
    }

    /** @throws Exception */
    public function fromMap($o): void {
        parent::fromMap($o);
        if (isset($o['ids'])) $this->ids = JsonConverters::fromArray('int', $o['ids']);
        if (isset($o['name'])) $this->name = $o['name'];
        if (isset($o['vendorName'])) $this->vendorName = $o['vendorName'];
        if (isset($o['nameContains'])) $this->nameContains = $o['nameContains'];
        if (isset($o['vendorNameContains'])) $this->vendorNameContains = $o['vendorNameContains'];
        if (isset($o['descriptionContains'])) $this->descriptionContains = $o['descriptionContains'];
    }

    /** @throws Exception */
    public function jsonSerialize(): mixed
    {
        $o = parent::jsonSerialize();
        if (isset($this->ids)) $o['ids'] = JsonConverters::toArray('int', $this->ids);
        if (isset($this->name)) $o['name'] = $this->name;
        if (isset($this->vendorName)) $o['vendorName'] = $this->vendorName;
        if (isset($this->nameContains)) $o['nameContains'] = $this->nameContains;
        if (isset($this->vendorNameContains)) $o['vendorNameContains'] = $this->vendorNameContains;
        if (isset($this->descriptionContains)) $o['descriptionContains'] = $this->descriptionContains;
        return empty($o) ? new class(){} : $o;
    }

    public function getTypeName(): string { return 'FindTechnologies'; }
    public function getMethod(): string { return 'GET'; }
    public function createResponse(): mixed { return QueryResponse::create(genericArgs:['TechnologyView']); }
}

// @Route("/orgs/{Id}", "DELETE")
class DeleteOrganization implements IReturnVoid, IDelete, JsonSerializable
{
    public function __construct(
        /** @var int */
        public int $id=0
    ) {
    }

    /** @throws Exception */
    public function fromMap($o): void {
        if (isset($o['id'])) $this->id = $o['id'];
    }

    /** @throws Exception */
    public function jsonSerialize(): mixed
    {
        $o = [];
        if (isset($this->id)) $o['id'] = $this->id;
        return empty($o) ? new class(){} : $o;
    }

    public function getTypeName(): string { return 'DeleteOrganization'; }
    public function getMethod(): string { return 'DELETE'; }
    public function createResponse(): void {}
}
```

The smart PHP `JsonServiceClient` available in the [servicestack/client](https://packagist.org/packages/servicestack/client) packagist package enables the same productive, typed API development experience available in our other 1st-class supported client platforms.

Using promoted constructors enables DTOs to be populated using a single constructor expression utilizing named parameters, which together with the generic `JsonServiceClient` enables end-to-end typed API Requests in a single LOC:

```php
use Servicestack\JsonServiceClient;
use dtos\Hello;

$client = new JsonServiceClient("https://test.servicestack.net");

/** @var HelloResponse $response */
$response = $client->get(new Hello(name: "World"));
```

> The `HelloResponse` optional type hint doesn't change runtime behavior but enables static analysis tools and IDEs like PhpStorm to provide rich intelli-sense and development time feedback.

For more usage examples and information about ServiceStack's PHP support check out the [PHP Add ServiceStack Reference](https://docs.servicestack.net/php-add-servicestack-reference) docs.

# ASP.NET Core Identity Auth in .NET 8

Source: https://servicestack.net/posts/net8-identity-auth

### ASP.NET Core Identity Auth now the default

A significant change from **ServiceStack v8** is the adoption of the same ASP.NET Core Identity Authentication that's configured in Microsoft's default project templates in ServiceStack's new project templates.

## History of ServiceStack Authentication

ServiceStack has always maintained its own [Authentication and Authorization](https://docs.servicestack.net/auth/authentication-and-authorization) provider model, primarily as it was the only way to provide an integrated and unified Authentication model that worked across all our supported hosting platforms, inc. .NET Framework, ASP.NET Core on .NET Framework, HttpListener and .NET (fka .NET Core).

Whilst the Authentication story in ASP.NET has undergone several cycles of changes over the years, the ServiceStack Auth Model has remained relatively consistent and stable, with no schema changes required since release, whilst still providing flexible options for [extending UserAuth tables](https://docs.servicestack.net/auth/auth-repository#extending-userauth-tables) and typed [User Sessions](https://docs.servicestack.net/auth/sessions#using-typed-sessions-in-servicestack).

### .NET Framework considered legacy

Although the multi-platform support of the unified Authentication model has been vital for Organizations migrating their systems to .NET (Core), where ServiceStack Customers have been able to enjoy [Exceptional Code reuse](https://docs.servicestack.net/netcore#exceptional-code-reuse), it's become clear that the .NET platform (e.g. .NET 8) is the only platform that should be considered for new projects, and that .NET Framework should only be considered a stable legacy platform for running existing systems on.

Given Microsoft has committed to [Authentication Improvements in .NET 8](https://devblogs.microsoft.com/dotnet/whats-new-with-identity-in-dotnet-8/), it's become more important to easily integrate ServiceStack with new and existing .NET projects to access these new features than to continue recommending ServiceStack's unified Auth Providers as the default option for new projects.
### ServiceStack will use Identity Auth in new projects

ASP.NET Core Identity Auth is the default Auth Model adopted in new ServiceStack projects, which closely follow the same approach as the Microsoft project templates they integrate ServiceStack with, e.g. the .NET 8 **Blazor** and **Blazor Vue** project templates adopt the exact same Auth configuration as Microsoft's default Blazor project template configured with **Individual** Identity Auth, likewise with the **Bootstrap** and **Tailwind** styled **MVC** and **Razor Pages** templates.

You can find ServiceStack Integrated Identity Auth Templates for each of ASP.NET Core's major Blazor, Razor Pages and MVC project templates:
*Create a Project with ASP.NET Identity Auth*
### Identity Auth Template Live Demos

For a quick preview of what these look like, check out their Internet-hosted Live Demos. The configuration and source code for the above projects are a good reference for how to configure ServiceStack with Identity Auth in your own projects:

- [blazor](https://github.com/NetCoreTemplates/blazor)
- [blazor-vue](https://github.com/NetCoreTemplates/blazor-vue)
- [razor](https://github.com/NetCoreTemplates/razor)
- [mvc](https://github.com/NetCoreTemplates/mvc)
- [razor-bootstrap](https://github.com/NetCoreTemplates/razor-bootstrap)
- [mvc-bootstrap](https://github.com/NetCoreTemplates/mvc-bootstrap)

The **Bootstrap** versions use the same Individual Identity Auth Pages that Microsoft's **Razor Pages** and **MVC** templates use, whilst the **Tailwind** versions have been enhanced to use **Tailwind CSS** instead of Bootstrap, include a **visual QR Code** implementation that was missing, and include an `IEmailSender` SMTP solution that's easily enabled via configuration to use your preferred **SMTP Server**.

## Migrating to ASP.NET Core Identity Auth

Migrating from ServiceStack Auth to Identity Auth should be relatively straightforward as ServiceStack uses a compatible Identity v2 password hashing format, which should let you migrate your users to Identity Auth without them noticing.

## ServiceStack's Identity Auth Integration

ServiceStack's Identity Auth integration is focused on high compatibility so existing ServiceStack Customers require minimal effort to migrate existing code bases to use the new Identity Auth integration, despite Identity Auth being an entirely different Auth Provider model and implementation.

It does this by retaining a lot of the existing user-facing Authentication and Session abstractions that ServiceStack APIs use for Authorization, as well as the existing endpoints and Request/Response DTOs that ServiceStack Clients use to Authenticate, but replaces their internal implementation to use ASP.NET Identity Auth instead.

The new Identity Auth integration is contained in the .NET 6+ **ServiceStack.Extensions** NuGet package:

```xml
<PackageReference Include="ServiceStack.Extensions" Version="8.*" />
```

Which at a minimum lets you configure ServiceStack to use Identity Auth by simply registering the existing `AuthFeature` plugin with the Application's custom EF `ApplicationUser` Data Model:

```csharp
Plugins.Add(new AuthFeature(IdentityAuth.For<ApplicationUser>()));
```

It requires minimal configuration as all Authorization is configured using ASP.NET Core's standard APIs; any configuration in this plugin is then just used to customize Identity Auth's integration with ServiceStack.

There's also no new concepts to learn as all ASP.NET Core endpoints, pages and controllers continue to authenticate against the populated `ClaimsPrincipal` whilst all ServiceStack APIs continue to authenticate against the populated typed [User Session](https://docs.servicestack.net/auth/sessions).
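To make the distinction concrete, here's a minimal sketch of both models co-existing in the same App, where `ProfileController`, `GetMyProfile` and `ProfileServices` are hypothetical names used for illustration:

```csharp
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc;
using ServiceStack;

// ASP.NET Core Controllers & pages keep authenticating against the ClaimsPrincipal
[Authorize]
public class ProfileController : Controller
{
    public IActionResult Index() => Content($"Hello, {User.Identity?.Name}");
}

// ServiceStack APIs keep authenticating against the populated typed User Session
[ValidateIsAuthenticated]
public class GetMyProfile : IReturn<GetMyProfileResponse> {}

public class GetMyProfileResponse
{
    public string? DisplayName { get; set; }
}

public class ProfileServices : Service
{
    public object Any(GetMyProfile request)
    {
        var session = SessionAs<CustomUserSession>(); // typed ServiceStack Session
        return new GetMyProfileResponse { DisplayName = session.DisplayName };
    }
}
```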
The `AuthFeature` works by registering the following Identity Auth Providers: ### Identity Auth Providers - [IdentityApplicationAuthProvider](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack/src/ServiceStack.Extensions/Auth/IdentityApplicationAuthProvider.cs) - Converts an Identity Auth `ClaimsPrincipal` into a ServiceStack Session - [IdentityCredentialsAuthProvider](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack/src/ServiceStack.Extensions/Auth/IdentityCredentialsAuthProvider.cs) - Implements ServiceStack's `Authenticate` API using Identity Auth - [IdentityJwtAuthProvider](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack/src/ServiceStack.Extensions/Auth/IdentityJwtAuthProvider.cs) - Converts an Identity Auth JWT into an Authenticated ServiceStack Session Only the `IdentityApplicationAuthProvider` is registered by default which is required to convert Identity Auth's `ClaimPrincipal` into an Authenticated ServiceStack [Session](https://docs.servicestack.net/auth/sessions). The other Auth Providers are required if you want to enable authentication with ServiceStack's endpoints. E.g. ServiceStack's [Built-in UIs](https://servicestack.net/auto-ui) would require the Credentials Auth to be enabled to authenticate via the built-in Sign In dialogs. ### Configuring Auth Providers Which is what all the Blazor and MVC Identity Auth templates enable by default in [Configure.Auth.cs](https://github.com/NetCoreTemplates/blazor/blob/main/MyApp/Configure.Auth.cs): ```csharp public class ConfigureAuth : IHostingStartup { public void Configure(IWebHostBuilder builder) => builder .ConfigureAppHost(appHost => { appHost.Plugins.Add(new AuthFeature(IdentityAuth.For( // Configure ServiceStack's Integration with Identity Auth options => { options.EnableCredentialsAuth = true; options.SessionFactory = () => new CustomUserSession(); }) )); }); } ``` If you're using a `CustomUserSession` you'll also need to register it with the `SessionFactory` for it to be used. Each of the Identity Auth Providers can also be customized individually: ```csharp Plugins.Add(new AuthFeature(IdentityAuth.For(options => { // Configure IdentityApplicationAuthProvider options.AuthApplication... // Configure IdentityCredentialsAuthProvider options.EnableCredentialsAuth = true; options.AuthCredentials... // Configure IdentityJwtAuthProvider options.EnableJwtAuth = true; options.AuthJwt... 
Each of the Identity Auth Providers can also be customized individually:

```csharp
Plugins.Add(new AuthFeature(IdentityAuth.For<ApplicationUser>(options => {
    // Configure IdentityApplicationAuthProvider
    options.AuthApplication...

    // Configure IdentityCredentialsAuthProvider
    options.EnableCredentialsAuth = true;
    options.AuthCredentials...

    // Configure IdentityJwtAuthProvider
    options.EnableJwtAuth = true;
    options.AuthJwt...
})));
```

Typically you'll want to use the included Identity UI Pages and dependencies to register new users and assign roles, but if you have any existing client integrations that use ServiceStack APIs they can also be enabled with:

```csharp
Plugins.Add(new AuthFeature(IdentityAuth.For<ApplicationUser>(options => {
    // Include ServiceStack's Register API
    options.IncludeRegisterService = true;

    // Include ServiceStack's AssignRoles and UnAssignRoles APIs
    options.IncludeAssignRoleServices = true;
})));
```

### Extending Identity Auth Cookies and User Sessions

By default all [well known Claim Names](https://github.com/ServiceStack/ServiceStack/blob/3ab3d23c85cf48435b8cd9386f25afab79fcb542/ServiceStack/src/ServiceStack.Extensions/Auth/IdentityApplicationAuthProvider.cs#L49) are used to populate the User Session, but you can also include additional claims in the Identity Auth Cookie and use them to populate the User Session by overriding `PopulateFromClaims()` in your [CustomUserSession.cs](https://github.com/NetCoreTemplates/blazor/blob/main/MyApp.ServiceInterface/Data/CustomUserSession.cs), e.g:

```csharp
public class CustomUserSession : AuthUserSession
{
    public override void PopulateFromClaims(IRequest httpReq, ClaimsPrincipal principal)
    {
        // Populate Session with data from Identity Auth Claims
        ProfileUrl = principal.FindFirstValue(JwtClaimTypes.Picture);
    }
}

// Add additional claims to the Identity Auth Cookie
public class AdditionalUserClaimsPrincipalFactory(
    UserManager<ApplicationUser> userManager,
    RoleManager<IdentityRole> roleManager,
    IOptions<IdentityOptions> optionsAccessor)
    : UserClaimsPrincipalFactory<ApplicationUser, IdentityRole>(userManager, roleManager, optionsAccessor)
{
    public override async Task<ClaimsPrincipal> CreateAsync(ApplicationUser user)
    {
        var principal = await base.CreateAsync(user);
        var identity = (ClaimsIdentity)principal.Identity!;

        var claims = new List<Claim>();
        // Add additional claims here
        if (user.ProfileUrl != null)
        {
            claims.Add(new Claim(JwtClaimTypes.Picture, user.ProfileUrl));
        }

        identity.AddClaims(claims);
        return principal;
    }
}
```

### Custom Application User Primary Key

The default `IdentityUser` uses a `string` as the primary key populated with a Guid, but you could also change it to use an `int` by having your EF `IdentityUser` Data Model inherit from `IdentityUser<int>` instead:

```csharp
public class AppUser : IdentityUser<int>
{
    //...
}
```

You'll also need to specify the Key Type when registering the `AuthFeature` plugin:

```csharp
public class ConfigureAuth : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureAppHost(appHost => {
            appHost.Plugins.Add(new AuthFeature(IdentityAuth.For<AppUser, int>(
                options => {
                    options.EnableCredentialsAuth = true;
                    options.SessionFactory = () => new CustomUserSession();
                })
            ));
        });
}
```

Which is what the new .NET 8 BlazorDiffusion App does in [Configure.Auth.cs](https://github.com/NetCoreApps/BlazorDiffusionVue/blob/main/BlazorDiffusion/Configure.Auth.cs) to remain compatible with its existing ServiceStack `UserAuth` tables which used an `int` primary key.
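When changing the Key Type, your EF `DbContext` needs to use the same key for its User and Role entities. A minimal sketch, assuming a conventional `IdentityDbContext` (class names are illustrative):

```csharp
// Use the same int Key Type for the User, Role and DbContext
public class ApplicationDbContext(DbContextOptions<ApplicationDbContext> options)
    : IdentityDbContext<AppUser, IdentityRole<int>, int>(options)
{
}
```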
## Using Identity Auth in ServiceStack Apps

One of the primary benefits of adopting Identity Auth is the wealth of documentation and resources available for it, which also applies to how you would use Identity Auth to secure your own Apps.

If you're new to Identity Auth we recommend starting with the official introduction from Microsoft:

- [Introduction to Identity on ASP.NET Core](https://learn.microsoft.com/en-us/aspnet/core/security/authentication/identity)

To learn about securing Blazor Apps, go to:

- [ASP.NET Core Blazor authentication and authorization](https://learn.microsoft.com/en-us/aspnet/core/blazor/security/)

### Declarative Validation Attributes

The recommended way to protect your ServiceStack APIs is to continue to use the [Declarative Validation](https://docs.servicestack.net/declarative-validation) attributes which, being decoupled from any implementation, can be safely annotated on Request DTOs without adding any implementation dependencies, where they're also accessible to Clients and UIs using the Request DTOs to invoke your APIs.

The available Typed Authorization Attributes include:

| Attribute                   | Description                                             |
|-----------------------------|---------------------------------------------------------|
| `[ValidateIsAuthenticated]` | Restrict access to Authenticated Users only             |
| `[ValidateIsAdmin]`         | Restrict access to Admin Users only                     |
| `[ValidateHasRole]`         | Restrict access to only Users assigned with this Role   |
| `[ValidateHasClaim]`        | Restrict access to only Users assigned with this Claim  |
| `[ValidateHasScope]`        | Restrict access to only Users assigned with this Scope  |

That can be annotated on **Request DTOs** to protect APIs:

```csharp
[ValidateIsAuthenticated]
[ValidateIsAdmin]
[ValidateHasRole(role)]
[ValidateHasClaim(type,value)]
[ValidateHasScope(scope)]
public class Secured {}
```
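For instance, a hypothetical API restricted to users assigned the **Manager** role might look like (the DTO and its properties are illustrative only):

```csharp
// Only Authenticated Users assigned the Manager role can call this API
[ValidateHasRole("Manager")]
public class UpdateBooking : IReturn<EmptyResponse>
{
    public int Id { get; set; }
    public decimal? Cost { get; set; }
}
```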
## Migrating from ServiceStack Auth

Migrating from ServiceStack Auth to Identity Auth should be relatively straight-forward as ServiceStack uses a compatible Identity v2 password hashing format, which should let you migrate your users to Identity Auth without them noticing.

:::info TIP
Please ensure your App database is backed up before running any migrations
:::

#### 1. Rename old AppUser table

You'll want to use a different name so it doesn't conflict with the new Identity Auth `AppUser` Data Model. It's only needed to query the User data to migrate to Identity Auth, and can be removed after successfully migrating all your Users.

You don't need to include all the properties of the `UserAuth` base table, just the ones you want to migrate to Identity Auth, which for Blazor Diffusion was only:

```csharp
// Used by OrmLite to fetch User data to migrate from old ServiceStack `AppUser` table
[Alias("AppUser")]
public class OldAppUser
{
    [AutoIncrement]
    public int Id { get; set; }
    public string UserName { get; set; }
    public string DisplayName { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public string? Handle { get; set; }
    public string Email { get; set; }
    public string PasswordHash { get; set; }
    public string? ProfileUrl { get; set; }
    public string? Avatar { get; set; } //overrides ProfileUrl
    public string? LastLoginIp { get; set; }
    public DateTime? LastLoginDate { get; set; }
    public string RefIdStr { get; set; }
    public DateTime? LockedDate { get; set; }
    public DateTime CreatedDate { get; set; }
    public DateTime ModifiedDate { get; set; }
}
```

#### 2. Create Identity Auth Data Model

If you have a lot of existing references to the `AppUser` name you'll want to retain the same name so the existing references won't need to be updated.

Essentially your custom EF `IdentityUser` will want a copy of all the properties you want to migrate other than `Id`, `Email`, and `PasswordHash` that are already defined in the base `IdentityUser` class:

```csharp
[Alias("AspNetUsers")] // Tell OrmLite which table this EF Data Model maps to
public class AppUser : IdentityUser<int>
{
    public string? FirstName { get; set; }
    public string? LastName { get; set; }
    public string? DisplayName { get; set; }
    public string? ProfileUrl { get; set; }
    [Input(Type = "file"), UploadTo("avatars")]
    public string? Avatar { get; set; } //overrides ProfileUrl
    public string? Handle { get; set; }
    public int? RefId { get; set; }
    public string RefIdStr { get; set; } = Guid.NewGuid().ToString();
    public bool IsArchived { get; set; }
    public DateTime? ArchivedDate { get; set; }
    public string? LastLoginIp { get; set; }
    public DateTime? LastLoginDate { get; set; }
    public DateTime CreatedDate { get; set; } = DateTime.UtcNow;
    public DateTime ModifiedDate { get; set; } = DateTime.UtcNow;
}
```

The `AppUser` Data Model and `int` primary key would also need to be registered in your [Configure.Auth.cs](https://github.com/NetCoreApps/BlazorDiffusionVue/blob/main/BlazorDiffusion/Configure.Auth.cs) configuration class:

```csharp
public class ConfigureAuth : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureAppHost(appHost => {
            appHost.Plugins.Add(new AuthFeature(IdentityAuth.For<AppUser, int>(
                options => {
                    options.EnableCredentialsAuth = true;
                    options.SessionFactory = () => new CustomUserSession();
                })
            ));
        });
}
```

#### 3. Add Authentication Configuration

You'll need to configure Entity Framework and add your desired ASP.NET Identity Auth configuration to your App's `Program.cs`.

We'd recommend copying from a new Microsoft or [ServiceStack .NET 8 Project](https://docs.servicestack.net/auth/identity-auth) which closely matches the Authentication options you want to enable, e.g. you can start with the recommended Authentication for a new Blazor Project from its [Program.cs](https://github.com/NetCoreTemplates/blazor/blob/main/MyApp/Program.cs):

```csharp
services.AddAuthentication(IdentityConstants.ApplicationScheme)
    .AddIdentityCookies();
services.AddDataProtection()
    .PersistKeysToFileSystem(new DirectoryInfo("App_Data"));

// $ dotnet ef migrations add CreateIdentitySchema
// $ dotnet ef database update
var connectionString = config.GetConnectionString("DefaultConnection")
    ?? throw new InvalidOperationException("Connection string 'DefaultConnection' not found.");
services.AddDbContext<ApplicationDbContext>(options =>
    options.UseSqlite(connectionString, b => b.MigrationsAssembly(nameof(MyApp))));
services.AddDatabaseDeveloperPageExceptionFilter();

services.AddIdentityCore<ApplicationUser>(options => options.SignIn.RequireConfirmedAccount = true)
    .AddRoles<IdentityRole>()
    .AddEntityFrameworkStores<ApplicationDbContext>()
    .AddSignInManager()
    .AddDefaultTokenProviders();
services.AddSingleton<IEmailSender, EmailSender>();
services.AddScoped<IUserClaimsPrincipalFactory<ApplicationUser>, AdditionalUserClaimsPrincipalFactory>();
```

Alternatively if you want to add support for external OAuth logins you can copy from the **MVC Tailwind** Authentication configuration in its [Program.cs](https://github.com/NetCoreTemplates/mvc/blob/main/MyApp/Program.cs), which will also require adding the NuGet dependencies of the OAuth providers you want to support, which you can get from its [MyApp.csproj](https://github.com/NetCoreTemplates/mvc/blob/main/MyApp/MyApp.csproj)
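As a rough sketch of what enabling an external OAuth provider involves, assuming the `Microsoft.AspNetCore.Authentication.Google` NuGet package and the configuration keys below (both illustrative):

```csharp
services.AddAuthentication(IdentityConstants.ApplicationScheme)
    .AddGoogle(options => {
        // Illustrative configuration keys, e.g. populated from appsettings.json
        options.ClientId = config["Authentication:Google:ClientId"]!;
        options.ClientSecret = config["Authentication:Google:ClientSecret"]!;
    })
    .AddIdentityCookies();
```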
#### 4. Create and Run EF Migrations

After your App is properly configured you'll want to create the EF Migrations for the Identity Auth User tables by installing the [dotnet-ef tool](https://learn.microsoft.com/en-us/ef/core/cli/dotnet) and running:

:::sh
dotnet ef migrations add CreateIdentitySchema
:::

Which should create the EF Migrations in the `/Migrations` folder. You can then run the migrations to create the Identity Auth tables in your App's configured database:

:::sh
dotnet ef database update
:::

#### 5. Implement the Migrate Users Task

This could be implemented in a separate Application or Unit Test, although we've found the easiest way to migrate existing users is to implement a custom [App Task](https://docs.servicestack.net/app-tasks), as it's able to make use of your App's configured Authentication, EF and OrmLite dependencies and can be run from the command-line.

The implementation should be fairly straight-forward: you basically just need to create a new Identity Auth User using the `UserManager<AppUser>` dependency for each of your existing users:

```csharp
public class ConfigureDbMigrations : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureAppHost(appHost => {
            AppTasks.Register("migrate.users", _ =>
            {
                var log = appHost.GetApplicationServices().GetRequiredService<ILogger<ConfigureDbMigrations>>();
                log.LogInformation("Running migrate.users...");
                var scopeFactory = appHost.GetApplicationServices().GetRequiredService<IServiceScopeFactory>();
                using var scope = scopeFactory.CreateScope();
                using var dbContext = scope.ServiceProvider.GetRequiredService<ApplicationDbContext>();
                using var db = scope.ServiceProvider.GetRequiredService<IDbConnectionFactory>().Open();

                var migrateUsers = db.Select(db.From<OldAppUser>().OrderBy(x => x.Id));

                log.LogInformation("Migrating {Count} Existing ServiceStack Users to Identity Auth Users...", migrateUsers.Count);
                MigrateExistingUsers(dbContext, scope.ServiceProvider, migrateUsers).Wait();
            });
            AppTasks.Run();
        });

    private async Task MigrateExistingUsers(ApplicationDbContext dbContext, IServiceProvider services,
        List<OldAppUser> migrateUsers, string tempPassword="p@55wOrd")
    {
        var userManager = services.GetRequiredService<UserManager<AppUser>>();
        var now = DateTime.UtcNow;

        foreach (var user in migrateUsers)
        {
            var appUser = new AppUser
            {
                Id = user.Id,
                UserName = user.Email,
                Email = user.Email,
                DisplayName = user.DisplayName,
                FirstName = user.FirstName,
                LastName = user.LastName,
                Handle = user.Handle,
                ProfileUrl = user.ProfileUrl,
                Avatar = user.Avatar,
                RefIdStr = user.RefIdStr ?? Guid.NewGuid().ToString(),
                LockoutEnabled = true,
                LockoutEnd = user.LockedDate != null ? now.AddYears(10) : now,
                LastLoginDate = user.LastLoginDate,
                LastLoginIp = user.LastLoginIp,
                CreatedDate = user.CreatedDate,
                ModifiedDate = user.ModifiedDate,
                // Verify you want existing Users emails to be confirmed
                EmailConfirmed = true,
            };
            await userManager.CreateAsync(appUser, tempPassword);

            // Update raw Password Hash using EF
            if (user.PasswordHash != null)
            {
                dbContext.Users
                    .Where(x => x.Id == user.Id)
                    .ExecuteUpdate(setters => setters.SetProperty(x => x.PasswordHash, user.PasswordHash));
            }
        }
    }
}
```

As there's no official API for updating the raw `PasswordHash` you'll need to use EF's `ExecuteUpdate()` API to update it on the `AspNetUsers` table directly.

It should be noted that ServiceStack Auth still uses ASP.NET Core's previous Identity v2 format for hashing its passwords; these will be automatically re-hashed using the latest ASP.NET Identity v3 format after users successfully sign in.
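The upgrade works because Identity's `IPasswordHasher<TUser>` reports when a successfully verified password was hashed with an older format. A hedged sketch of the mechanism, not the framework's literal code, where `providedPassword` is the plaintext supplied at sign in:

```csharp
var hasher = services.GetRequiredService<IPasswordHasher<AppUser>>();

var result = hasher.VerifyHashedPassword(appUser, appUser.PasswordHash!, providedPassword);
if (result == PasswordVerificationResult.SuccessRehashNeeded)
{
    // Sign in succeeded against the old v2 hash, so the password can be
    // transparently re-hashed using the latest v3 format
    appUser.PasswordHash = hasher.HashPassword(appUser, providedPassword);
}
```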
#### Optimizing the PasswordHash Update

Whilst migrating users should be a once-off task, if you have a lot of users you may want to optimize the `PasswordHash` update from an **N+1** query per user to a single query that updates all users in a single command.

You'll need to use the **UPDATE FROM** syntax supported by your RDBMS; here's an example of how to do it in SQLite:

```sql
UPDATE AspNetUsers
SET PasswordHash = u.PasswordHash
FROM (SELECT Email, PasswordHash FROM AppUser WHERE PasswordHash IS NOT NULL) AS u
WHERE u.Email = AspNetUsers.Email;
```

#### Migrating Roles

Migrating Roles will depend on how they're stored in your App. You'll first need to ensure each role is created in the `AspNetRoles` table with:

```csharp
string[] allRoles = [...]; // All Roles in your App

var roleManager = services.GetRequiredService<RoleManager<IdentityRole>>();
foreach (var roleName in allRoles)
{
    var roleExist = await roleManager.RoleExistsAsync(roleName);
    if (!roleExist)
    {
        await roleManager.CreateAsync(new IdentityRole(roleName));
    }
}
```

You can then assign Roles to Users using the `UserManager<AppUser>`, e.g:

```csharp
string[] roles = [...]; // Roles to assign to User
var newUser = await userManager.FindByEmailAsync(user.Email!);
await userManager.AddToRolesAsync(newUser!, roles);
```

#### 6. Run the migrate.users Task

With everything in place, all that's left is to run the `migrate.users` App Task from the command-line:

:::sh
dotnet run --AppTasks=migrate.users
:::

#### 7. Verify Users can Sign In

After successfully migrating all your users you should check the new `IdentityUser` table to verify all the User data you want has been migrated, as well as verifying users can sign in with their existing credentials.

#### Create a new ASP.NET Identity Auth Project to copy from

The easiest way to include the Identity Auth UI Pages in your App is to copy your Application into a new .NET 8 Project that already includes them. You can create a new Blazor App with:

:::sh
x new blazor ProjectName
:::

Or create a new Razor Pages Tailwind or Bootstrap App:

:::sh
x new razor ProjectName
:::

:::sh
x new razor-bootstrap ProjectName
:::

Or a new MVC Tailwind or Bootstrap App with:

:::sh
x new mvc ProjectName
:::

:::sh
x new mvc-bootstrap ProjectName
:::

Alternatively you can manually copy the pages from the project template repositories. For Blazor, most of the Identity Auth UI Pages are in the [Components/Identity](https://github.com/NetCoreTemplates/blazor/tree/main/MyApp/Components/Identity) and [Pages/Account](https://github.com/NetCoreTemplates/blazor/tree/main/MyApp/Components/Pages/Account) folders.

For MVC, most of the Identity UI is in the [Account](https://github.com/NetCoreTemplates/mvc/blob/main/MyApp/Controllers/AccountController.cs) and [Manage](https://github.com/NetCoreTemplates/mvc/blob/main/MyApp/Controllers/ManageController.cs) controllers as well as their [Views/Account](https://github.com/NetCoreTemplates/mvc/tree/main/MyApp/Views/Account) and [Views/Manage](https://github.com/NetCoreTemplates/mvc/tree/main/MyApp/Views/Manage) folders.
### SMTP IEmailSender

The .NET 8 Templates also include a nice solution for sending Identity Auth emails through the `IEmailSender` interface, which drops the Email Request in the Background MQ registered in [Configure.Mq.cs](https://github.com/NetCoreTemplates/blazor/blob/main/MyApp/Configure.Mq.cs), which uses it to invoke the `SendEmail` API in [EmailServices](https://github.com/NetCoreTemplates/blazor/blob/main/MyApp.ServiceInterface/EmailServices.cs) in a managed background worker:

```csharp
public class EmailSender(IMessageService messageService) : IEmailSender
{
    public Task SendEmailAsync(string email, string subject, string htmlMessage)
    {
        using var mqClient = messageService.CreateMessageProducer();
        mqClient.Publish(new SendEmail {
            To = email,
            Subject = subject,
            BodyHtml = htmlMessage,
        });
        return Task.CompletedTask;
    }
}
```

To enable it you'll need to register your preferred SMTP Server in your App's `appsettings.json`:

```json
{
  "SmtpConfig": {
    "Username": "username",
    "Password": "password",
    "Host": "smtp.mailtrap.io",
    "Port": 587,
    "FromEmail": "mail@example.org"
  }
}
```

Then uncomment the `EmailSender` registration in your `Program.cs`:

```csharp
services.AddSingleton<IEmailSender, EmailSender>();
```

### Send any App Email

The nice part about this solution is that it's not limited to just sending Identity Auth emails, you can also use it to send any App Email, either by publishing a message to the registered MQ with `PublishMessage` or by using the [Service Gateway](https://docs.servicestack.net/service-gateway) to invoke the API directly, e.g:

```csharp
public class MyServices : Service
{
    public object Any(MyRequest request)
    {
        // email, subject and body are placeholders, e.g. populated from the request

        // Send Email in managed Background MQ Worker
        PublishMessage(new SendEmail {
            To = email,
            Subject = subject,
            BodyHtml = body,
        });

        // Block until Email is sent to SMTP Server
        Gateway.Send(new SendEmail {
            To = email,
            Subject = subject,
            BodyHtml = body,
        });

        return new EmptyResponse();
    }
}
```
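For reference, a minimal sketch of what the `SendEmail` Request DTO and its `EmailServices` implementation might look like using `System.Net.Mail`; the template's actual implementation may differ, and `SmtpConfig` here is an assumed configuration class bound from the `SmtpConfig` section of `appsettings.json` above:

```csharp
// using System.Net;
// using System.Net.Mail;

public class SendEmail : IReturn<EmptyResponse>
{
    public string To { get; set; }
    public string Subject { get; set; }
    public string? BodyHtml { get; set; }
}

public class EmailServices(SmtpConfig config) : Service
{
    public object Any(SendEmail request)
    {
        // Connect to the SMTP Server registered in appsettings.json
        using var client = new SmtpClient(config.Host, config.Port) {
            Credentials = new NetworkCredential(config.Username, config.Password),
            EnableSsl = true,
        };
        var msg = new MailMessage(config.FromEmail, request.To) {
            Subject = request.Subject,
            Body = request.BodyHtml ?? string.Empty,
            IsBodyHtml = true,
        };
        client.Send(msg);
        return new EmptyResponse();
    }
}
```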
# .NET 8's Best Blazor is not Blazor as we know it

Source: https://servicestack.net/posts/net8-best-blazor

The best way to find out what's new in .NET 8 Blazor is to watch the excellent [Full stack web UI with Blazor in .NET 8](https://www.youtube.com/watch?v=QD2-DwuOfKM) presentation by Daniel Roth and Steve Sanderson, which covers how Blazor has become a Full Stack UI Web Technology for developing any kind of .NET Web App.

## Your first .NET 8 Blazor App

You don't get to appreciate what this means until you create your first .NET 8 Blazor App, where you'll be pleasantly surprised that Blazor Apps render fast, clean HTML without needing to load the large Web Assembly assets required by Blazor WebAssembly Apps or start the stateful Web Socket connection required by Blazor Server Interactive Apps.

This is because the **default rendering mode** for Blazor uses neither of these technologies. Instead it returns to traditional Web App development where Blazor Pages now return clean, glorious HTML - courtesy of Blazor's [Static render mode](https://learn.microsoft.com/en-us/aspnet/core/blazor/components/render-modes).

[![](/img/posts/net8-best-blazor/blazor-ssr.png)](https://learn.microsoft.com/en-us/aspnet/core/blazor/components/render-modes)

## Choose your compromise

Previously we were forced to choose upfront whether we wanted to build a Blazor Web Assembly App or a Blazor Server App, and accept the compromises that came with them. For public Internet Web Apps this wasn't even a choice, as Blazor Server Apps perform poorly over high latency Internet connections.

This meant choosing Blazor Web Assembly Apps, which required downloading a large Web Assembly runtime with users experiencing a long delay before the App was functional. To minimize this impact our Blazor WebAssembly Tailwind template included [built-in prerendering](https://github.com/LegacyTemplates/blazor-tailwind/blob/main/MyApp.Client/wwwroot/content/prerender.md) which, as part of deployment, would generate **static .html pages** that were deployed with the Blazor Web Assembly front-end UI so it could be hosted on CDN edge networks to further improve load times.

Whilst this meant the App's UI would be rendered immediately, it still wouldn't be functional until the Web Assembly runtime was downloaded and initialized, which would flicker as the static UI was replaced with Blazor's WASM rendered UI. Authenticated users would then experience further delay and UI jank whilst the App signed in the Authenticated User.

Whilst prerendering is an improvement over Blazor WASM's default blank loading screen, it's still not ideal for public facing Web Apps.

## .NET 8 Blazor is a Game Changer

The situation has greatly improved in .NET 8 where your entire App no longer needs to be bound to a single Interactivity mode. Even better, Blazor's default **static rendering** mode results in the best UX where the Website Layout and important landing pages can be rendered instantly.

### Interactive only when you need it

Only pages that need Blazor's interactivity features have to opt-in to whichever Blazor interactive rendering mode makes the most sense, either on a page-by-page or component basis, or by choosing `RenderMode.InteractiveAuto` which uses **InteractiveWebAssembly** if the WASM runtime is loaded or **InteractiveServer** if it isn't.

### Enhanced Navigation FTW

Ultimately I expect Blazor's new **Enhanced Navigation** is the feature that will deliver the biggest UX improvement users will experience, given it's enabled by default and gives traditional statically rendered Web Apps instant SPA-like navigation responsiveness, where new pages are swapped in without needing to perform expensive full page reloads.

Its beauty lies in being able to do this as a mostly transparent detail, without the traditional SPA complexity of needing to manage complex state or client-side routing.
It's a smart implementation that's able to perform fine-grained DOM updates to only the parts of pages that have changed, providing the ultimate UX of preserving page state, like populated form fields and scroll position, to deliver a fast and responsive UX that previously wasn't attainable from the simplicity of a Server Rendered App.

Its implementation does pose some challenges in implementing certain features, but below we'll cover some approaches we've used to overcome them.

### Full Stack Web UI

Blazor's static rendering with enhanced navigation and its opt-in flexibility makes .NET 8 Blazor a game changer, expanding it from a very niche set of use-cases that weren't too adversely affected by its Interactivity mode downsides, to becoming a viable solution for developing any kind of .NET Web App, especially as it can also be utilized within existing ASP.NET MVC and Razor Pages Apps.

### Benefits over MVC and Razor Pages

In addition, Blazor's superior component model allows building better encapsulated, more reusable and easier-to-use UI components, which has enabled Blazor's rich 3rd Party library ecosystem to flourish, and which we ourselves utilize to develop the high productivity Tailwind Components in the [ServiceStack.Blazor](https://blazor-gallery.servicestack.net) component library.

So far there are only upsides for .NET Web App development; the compromises only kick in when you need Blazor's interactivity features. Luckily these can now be scoped to just the Pages and Components that need them. But how often do we need them?

### When do you need Blazor's Interactivity features?

It ultimately depends on what App you're building, but a lot of Websites can happily display dynamic content, navigate quickly with enhanced navigation, and fill out and submit forms - all in Blazor's default static rendering mode.

Not even advanced features like **Streaming Rendering** used in the Blazor Template's [Weather.razor](https://github.com/dotnet/aspnetcore/blob/v8.0.0-rc.2.23480.2/src/ProjectTemplates/Web.ProjectTemplates/content/BlazorWeb-CSharp/BlazorWeb-CSharp/Components/Pages/Weather.razor) page require interactivity, as its progressively rendered UI updates are achieved in a single request without interactivity.

In fact the only time `@rendermode InteractiveServer` is needed in the default Blazor template is in the [Counter.razor](https://github.com/dotnet/aspnetcore/blob/v8.0.0-rc.2.23480.2/src/ProjectTemplates/Web.ProjectTemplates/content/BlazorWeb-CSharp/BlazorWeb-CSharp/Components/Pages/Counter.razor#L3) page whose C# Event Handling requires it.

Ultimately some form of Interactivity is going to be required in order to add behavior or client-side functionality that runs after pages have been rendered, but you still have some options left before being forced to opt into an Interactive Blazor solution.

### Interactive Feature Options

We can see some of these options utilized in the Blazor Template [NavMenu.razor](https://github.com/dotnet/aspnetcore/blob/v8.0.0-rc.2.23480.2/src/ProjectTemplates/Web.ProjectTemplates/content/BlazorWeb-CSharp/BlazorWeb-CSharp/Components/Layout/NavMenu.razor) component which uses JavaScript `onclick` event handlers to add client-side behavior to simulate mouse clicks to toggle UI elements:

```html