docs: fix slugs

Nicole Zhu 2024-03-14 20:55:03 +08:00
parent f878555598
commit b4eff9a108
5 changed files with 8 additions and 6 deletions

View File

@@ -1,6 +1,6 @@
 ---
 title: Extensions
-slug: /guides/inference/
+slug: /guides/engines
 ---
 import DocCardList from "@theme/DocCardList";

View File

Binary image changed (before: 27 KiB, after: 27 KiB).

View File

@@ -1,5 +1,6 @@
 ---
 title: Llama-CPP Extension
+slug: /guides/engines/llama-cpp
 ---
 ## Overview

View File

@@ -1,5 +1,6 @@
 ---
 title: TensorRT-LLM Extension
+slug: /guides/engines/tensorrt-llm
 ---
 Users with Nvidia GPUs can get 20-40% faster* token speeds on their laptop or desktops by using [TensorRT-LLM](https://github.com/NVIDIA/TensorRT-LLM).
@@ -48,7 +49,7 @@ We offer a handful of precompiled models for Ampere and Ada cards that you can i
 ![alt text](image.png)
 :::info
-Due to our limited resources, we only prebuilt a few demo models. You can always build your desired models directly on your machine. [Read here](##Build-your-own-TensorRT-models).
+Due to our limited resources, we only prebuilt a few demo models. You can always build your desired models directly on your machine. [Read here](#build-your-own-tensorrt-models).
 :::
 ## Configure Settings

View File

@@ -201,15 +201,15 @@ const sidebars = {
     },
     {
       type: "category",
-      label: "Inference Providers",
+      label: "AI Engines",
       className: "head_SubMenu",
       link: {
        type: 'doc',
-        id: "guides/inference/README",
+        id: "guides/engines/README",
       },
       items: [
-        "guides/inference/llama-cpp",
-        "guides/inference/tensorrt-llm",
+        "guides/engines/llama-cpp",
+        "guides/engines/tensorrt-llm",
       ]
     },
     {
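For reference, a minimal sketch of how the retired /guides/inference/* URLs could be mapped to the new /guides/engines/* slugs. This is illustrative only and not part of this commit: it assumes the site is a standard Docusaurus build with @docusaurus/plugin-client-redirects installed, and the old per-page paths are inferred from the sidebar ids in the hunk above.

```js
// docusaurus.config.js (sketch only, not part of this commit)
// Assumes @docusaurus/plugin-client-redirects is installed; old paths are
// inferred from the previous sidebar ids ("guides/inference/...").
module.exports = {
  // ...existing site config...
  plugins: [
    [
      "@docusaurus/plugin-client-redirects",
      {
        redirects: [
          // old slug -> new slug
          { from: "/guides/inference", to: "/guides/engines" },
          { from: "/guides/inference/llama-cpp", to: "/guides/engines/llama-cpp" },
          { from: "/guides/inference/tensorrt-llm", to: "/guides/engines/tensorrt-llm" },
        ],
      },
    ],
  ],
};
```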