diff --git a/docs/src/pages/docs/remote-models/generic-openai.mdx b/docs/src/pages/docs/remote-models/generic-openai.mdx
index 22f483372..7e8966237 100644
--- a/docs/src/pages/docs/remote-models/generic-openai.mdx
+++ b/docs/src/pages/docs/remote-models/generic-openai.mdx
@@ -44,8 +44,6 @@ The **OpenAI** fields can be used for any OpenAI-compatible API.
Please note that, currently, the code that supports any OpenAI-compatible endpoint only reads the `~/jan/data/extensions/@janhq/inference-openai-extension/settings.json` file, which corresponds to **OpenAI Inference Engines** on the Extensions page. It will not read any other files in this directory.
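If you edit that file by hand, a minimal sketch of the kind of values it might hold is shown below. The field names here (`chat-completions-endpoint`, `api-key`) are illustrative assumptions rather than the extension's documented schema, so check the `settings.json` generated in your own install for the exact keys it expects.

```json
{
  "_comment": "Hypothetical sketch only; the real settings.json generated by the extension may use different keys.",
  "chat-completions-endpoint": "https://api.openai.com/v1/chat/completions",
  "api-key": "sk-..."
}
```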
-
-
### Step 2: Start Chatting with the Model
@@ -53,8 +51,7 @@ The **OpenAI** fields can be used for any OpenAI-compatible API.
2. Select the model you want to use.
3. Specify the model's parameters.
4. Start the conversation with the model.
-
-
+
If you have questions or want more preconfigured GGUF models, please join our [Discord community](https://discord.gg/Dt7MxDyNNZ) for support, updates, and discussions.