Merge branch 'main' into chore/add-mermaid

Hieu 2023-11-22 09:26:32 +09:00 committed by GitHub
commit 4c8efdd9fe
6 changed files with 183 additions and 333 deletions


@ -2,344 +2,105 @@
title: Models
---
import ApiSchema from '@theme/ApiSchema';
:::caution
Draft Specification: functionality has not been implemented yet.
Feedback: [HackMD: Models Spec](https://hackmd.io/ulO3uB1AQCqLa5SAAMFOQw)
:::
## Overview
Jan's Model API aims to be as similar as possible to [OpenAI's Models API](https://platform.openai.com/docs/api-reference/models), with additional methods for managing and running models locally.
In Jan, models are primary entities with the following capabilities:
### Objectives
- Users can import, configure, and run models locally.
- An [OpenAI Models API](https://platform.openai.com/docs/api-reference/models)-compatible endpoint at `localhost:3000/v1/models`.
- Supported model formats: `ggufv3`, and more.
- Users can download, import, and delete models.
- Users can use remote models (e.g. OpenAI, OpenRouter).
- Users can start/stop models and use them in a thread (or via the Chat Completions API).
- Users can configure default model parameters at the model level (to be overridden later at the message or thread level).
## Models Folder
Models in Jan are stored in the `/models` folder.
- Models are organized into individual folders, each an atomic representation of a model containing the binaries and configurations needed to run it. This makes for easy packaging, sharing, and version control.
- A model's folder name is unique and is used as the default `model_id`.
- Each model folder contains:
  - `<model-id>.json`, i.e. the [Model Object](#model-object)
  - Binaries (may be downloaded later)
```bash
jan/                          # Jan root folder
  models/
    llama2-70b-q4_k_m/        # Standard GGUF model
      model.json
      model-binary-1.gguf
    mistral-7b-gguf-q3_k_l/   # Quantizations are separate folders
      model.json
      mistral-7b-q3-K-L.gguf
    mistral-7b-gguf-q8_k_m/   # Quantizations are separate folders
      model.json
      mistral-7b-q8_k_m.gguf
    llava-ggml-Q5/            # Model with multiple partitions
      model.json
      mmprj.bin
      model_q5.ggml
    hermes-7b/                # Recommended model, binaries not yet downloaded
      model.json              # Contains download instructions
    azure-openai-gpt3-5/      # Remote model (no binaries)
      model.json
    random-model-q4_k_m.bin   # Imported loose binary; Jan will move it into an autogenerated
                              # /random-model-q4_k_m folder with an autogenerated model.json
```
### Importing Models
You can import a model by just dragging it into the `/models` folder, similar to Oobabooga.
- Jan will detect and generate a corresponding `model-filename.json` file based on the filename
- Jan will move it into its own `/model-id` folder once you define a `model-id` via the UI
- Jan will populate the model's `model-id.json` as you add metadata through the UI

## `model.json`
- Each `model` folder contains a `model.json` file, which is a representation of a model.
- `model.json` contains metadata and default parameters used to run a model.
- The only required field is `source_url`.

:::warning
- This has not been confirmed
- Dan's view: Jan should auto-detect and create folders automatically
- Jan's UI will allow users to rename folders and add metadata
:::

### GGUF Example
Here's a standard example `model.json` for a GGUF model.
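The exact schema is still being finalized (see [Model Object](#model-object) below). As a rough minimal sketch, assuming the defaults described there:
```json
// ./models/zephyr-7b/zephyr-7b.json (minimal illustrative sketch, not a finalized schema)
"source_url": "https://huggingface.co/TheBloke/zephyr-7B-beta-GGUF/blob/main/zephyr-7b-beta.Q4_K_M.gguf",
"id": "zephyr-7b",   // defaults to the folder name
"format": "ggufv3",
"metadata": { "engine": "llamacpp", "quantization": "Q4_K_M" }
```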
## Model Object
:::warning
- This is currently not finalized
- Dan's view: I think the current JSON is extremely clunky
- We should move `init` to top-level (e.g. "settings"?)
- We should move `runtime` to top-level (e.g. "parameters"?)
- `metadata` is extremely overloaded and should be refactored
- Dan's view: we should make a model object very extensible
- A `GGUF` model would "extend" a common model object with extra fields (at top level)
- Dan's view: State is extremely badly named
- Recommended: `downloaded`, `started`, `stopped`, null (for yet-to-download)
- We should also note that this is only for local models (not remote)
:::
Jan represents models as `json`-based Model Object files, known colloquially as `model.jsons`. Jan aims for rough equivalence with [OpenAI's Model Object](https://platform.openai.com/docs/api-reference/models/object) with additional properties to support local models.
Jan's models follow a `model_id.json` naming convention, and are built to be extremely lightweight, with the only mandatory field being a `source_url` to download the model binaries.
<ApiSchema example pointer="#/components/schemas/Model" />
### Types of Models
:::warning
- This is currently not in the Model Object, and requires further discussion.
- Dan's view: we should have a field to differentiate between `local` and `remote` models
:::
There are three types of models:
- Local models
- Local models that are yet to be downloaded (we have the URL)
- Remote models (e.g. the OpenAI API)
#### Local Models
:::warning
- This is currently not finalized
- Dan's view: we should have `download_url` and `local_url` for local models (and possibly more)
:::
A `model.json` for a local model should always reference the following fields:
- `download_url`: the original download source of the model
- `local_url`: the current location of the model binaries (may be an array if there are multiple binaries)
- `source_url`: the original source of the model, e.g. https://huggingface.co/TheBloke/zephyr-7B-beta-GGUF/
```json
// ./models/llama2/llama2-7bn-gguf.json
"local_url": "~/Downloads/llama-2-7bn-q5-k-l.gguf",
```
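For illustration, a local model entry under this (not yet finalized) proposal might carry both fields; the URL and path below are placeholders:
```json
// Illustrative sketch only: download_url / local_url are proposed fields, not finalized
"download_url": "https://huggingface.co/TheBloke/zephyr-7B-beta-GGUF/blob/main/zephyr-7b-beta.Q4_K_M.gguf",
"local_url": "~/jan/models/zephyr-7b/zephyr-7b-beta.Q4_K_M.gguf"
```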
#### Remote Models
:::warning
- This is currently not finalized
- Dan's view: each cloud model should be provided via a system module, or define its own params field on the `model` or `model.init` object
:::
A `model.json` for a remote model should always reference the following fields:
- `api_url`: the API endpoint of the model
- Any authentication parameters
```json
// Dan's view: This needs to be refactored pretty significantly
"source_url": "https://docs-test-001.openai.azure.com/openai.azure.com/docs-test-001/gpt4-turbo",
"parameters": {
"init" {
"API-KEY": "",
"DEPLOYMENT-NAME": "",
"api-version": "2023-05-15"
},
"runtime": {
"temperature": "0.7",
"max_tokens": "2048",
"presence_penalty": "0",
"top_p": "1",
"stream": "true"
}
}
"metadata": {
"engine": "api", // Dan's view: this should be a `type` field
}
```
### Importers
:::caution
- This is only an idea, has not been confirmed as part of spec
:::
Jan builds "importers" for users to seamlessly import models from a single URL.
We currently only provide this for [TheBloke models on Huggingface](https://huggingface.co/TheBloke) (i.e. one of the patron saints of llama.cpp), but we plan to add more in the future.
Currently, pasting a TheBloke Huggingface link into the Explore Models page will fire an importer, resulting in:
- A nicely-formatted model card
- A fully-annotated `model.json` file
### Multiple Binaries
:::warning
- This is currently not finalized
- Dan's view: having these fields under `model.metadata` is not maintainable
- We should explore some sort of `local_url` structure
:::
- A model with multiple binaries, e.g. `model-llava-1.5-ggml.json`
- See [source](https://huggingface.co/mys/ggml_llava-v1.5-13b)
```json
"source_url": "https://huggingface.co/mys/ggml_llava-v1.5-13b",
"parameters": {"init": {}, "runtime": {}}
"metadata": {
"mmproj_binary": "https://huggingface.co/mys/ggml_llava-v1.5-13b/blob/main/mmproj-model-f16.gguf",
"ggml_binary": "https://huggingface.co/mys/ggml_llava-v1.5-13b/blob/main/ggml-model-q5_k.gguf",
"engine": "llamacpp",
"quantization": "Q5_K"
}
```
## Models API
:::warning
- We should use the OpenAPI spec to discuss APIs
- Dan's view: This needs @louis and App Pod to review as they are more familiar with this
- Dan's view: Start/Stop model should have some UI indicator (show state, block input)
:::
See http://localhost:3001/api-reference#tag/Models.
| Method | API Call | OpenAI-equivalent |
| -------------- | ------------------------------- | ----------------- |
| List Models | GET /v1/models | true |
| Get Model | GET /v1/models/{model_id} | true |
| Delete Model | DELETE /v1/models/{model_id} | true |
| Start Model | PUT /v1/models/{model_id}/start | |
| Stop Model     | PUT /v1/models/{model_id}/stop  |                   |
| Download Model | POST /v1/models/ | |
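For illustration, assuming the default `localhost:3000` server from the [Objectives](#objectives) and a model folder named `zephyr-7b`, the local endpoints above could be exercised as follows:
```sh
# List all models (OpenAI-compatible)
curl http://localhost:3000/v1/models

# Get a single model by its model_id (here, the folder name "zephyr-7b")
curl http://localhost:3000/v1/models/zephyr-7b

# Start and stop a local model (Jan-specific extensions)
curl -X PUT http://localhost:3000/v1/models/zephyr-7b/start
curl -X PUT http://localhost:3000/v1/models/zephyr-7b/stop
```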
## Examples
### Local Model
- A model with a single binary, e.g. `model-zephyr-7B.json`
- See [source](https://huggingface.co/TheBloke/zephyr-7B-beta-GGUF/)
```json
// ./models/zephyr-7b/zephyr-7b-beta-Q4_K_M.json
// Note: Default fields omitted for brevity
"source_url": "https://huggingface.co/TheBloke/zephyr-7B-beta-GGUF/blob/main/zephyr-7b-beta.Q4_K_M.gguf",
"type": "model",          // Defaults to "model"
"version": "1",           // Defaults to 1
"id": "zephyr-7b",        // Defaults to foldername
"name": "Zephyr 7B",      // Defaults to foldername
"owned_by": "you",        // Defaults to "you"
"created": 1231231,       // Defaults to file creation time
"description": "",
"state": null,            // enum: [null, "downloading", "ready", "starting", "stopping", ...]
"format": "ggufv3",       // Defaults to "ggufv3"
"settings": {             // Models are initialized with these settings
  "ctx_len": "2048",
  "ngl": "100",
  "embedding": "true",
  "n_parallel": "4",
  "pre_prompt": "A chat between a curious user and an artificial intelligence",
  "user_prompt": "USER: ",
  "ai_prompt": "ASSISTANT: "
},
"parameters": {           // Models are called with these parameters
  // KIV: "pre_prompt": "A chat between a curious user and an artificial intelligence",
  // KIV: "user_prompt": "USER: ",
  // KIV: "ai_prompt": "ASSISTANT: "
  "temperature": "0.7",
  "token_limit": "2048",
  "top_k": "0",
  "top_p": "1",
  "stream": "true"
},
"metadata": {             // Defaults to {}
  "engine": "llamacpp",
  "quantization": "Q4_K_M",
  "size": "7B"
},
"assets": [               // Filepaths to model binaries; defaults to the current dir
  "file://.../zephyr-7b-q4_k_m.bin"
]
```
### Remote Model
- Uses a remote API to access the model, e.g. `model-azure-openai-gpt4-turbo.json`
- See [source](https://learn.microsoft.com/en-us/azure/ai-services/openai/quickstart?tabs=command-line%2Cpython&pivots=rest-api)
```json
"source_url": "https://docs-test-001.openai.azure.com/openai.azure.com/docs-test-001/gpt4-turbo",
"parameters": {
"init" {
"API-KEY": "",
"DEPLOYMENT-NAME": "",
"api-version": "2023-05-15"
},
"runtime": {
"temperature": "0.7",
"max_tokens": "2048",
"presence_penalty": "0",
"top_p": "1",
"stream": "true"
}
}
"metadata": {
"engine": "api",
}
```
### Deferred Download
- Jan ships with default model folders containing recommended models.
- Only the Model Object `json` files are included.
- Users must later explicitly download the model binaries.
```sh
models/
  mistral-7b/
    mistral-7b.json
  hermes-7b/
    hermes-7b.json
```
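For illustration, such a not-yet-downloaded entry might contain little more than the mandatory `source_url` (the URL below is a placeholder):
```json
// ./models/mistral-7b/mistral-7b.json (illustrative; binaries not downloaded yet)
"source_url": "https://huggingface.co/TheBloke/Mistral-7B-v0.1-GGUF"
```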
### Multiple quantizations
:::caution
This is currently under development.
- Each quantization has its own `Jan Model Object` file
- TODO: `model.json`?
:::
```sh
llama2-7b-gguf/
  llama2-7b-gguf-Q2.json
  llama2-7b-gguf-Q3_K_L.json
  .bin
```
### Multiple model partitions
- A model that is partitioned into several binaries still uses just one Model Object file
```sh
llava-ggml/
  llava-ggml-Q5.json
  .proj
  ggml
```
### Locally fine-tuned model
```sh
llama-70b-finetune/
  llama-70b-finetune-q5.json
  .bin
```
## API Reference
Jan's Model API is compatible with [OpenAI's Models API](https://platform.openai.com/docs/api-reference/models), with additional methods for managing and running models locally.
See [Jan Models API](https://jan.ai/api-reference#tag/Models).

## Importing Models
You can import a model by dragging the model binary or `gguf` file into the `/models` folder.
- Jan automatically generates a corresponding `model.json` file based on the binary filename.
- Jan automatically organizes it into its own `/models/model-id` folder.
- Jan automatically populates the `model.json` properties, which you can subsequently modify.
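As a hypothetical example, dropping a loose `random-model-q4_k_m.bin` into `/models` might yield an autogenerated file along these lines (field names are illustrative, not finalized):
```json
// ./models/random-model-q4_k_m/random-model-q4_k_m.json (autogenerated, illustrative)
"id": "random-model-q4_k_m",   // derived from the binary filename
"local_url": "file://./random-model-q4_k_m.bin"
```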


@ -108,7 +108,7 @@ const config = {
specs: [
{
spec: "openapi/jan.yaml", // can be local file, url, or parsed json object
route: "/api-reference", // path where to render docs
route: "/api-reference/", // path where to render docs
},
],
theme: {


@ -21,6 +21,7 @@
"@headlessui/react": "^1.7.17",
"@heroicons/react": "^2.0.18",
"@mdx-js/react": "^1.6.22",
"@redocly/cli": "^1.4.1",
"autoprefixer": "^10.4.16",
"axios": "^1.5.1",
"clsx": "^1.2.1",
@ -32,8 +33,8 @@
"postcss": "^8.4.30",
"posthog-docusaurus": "^2.0.0",
"prism-react-renderer": "^1.3.5",
"react": "^17.0.2",
"react-dom": "^17.0.2",
"react": "^18.2.0",
"react-dom": "^18.2.0",
"react-icons": "^4.11.0",
"redocusaurus": "^2.0.0",
"sass": "^1.69.3",

docs/redocly.yaml

@ -0,0 +1,18 @@
# NOTE: Only supports options marked as "Supported in Redoc CE"
# See https://redocly.com/docs/cli/configuration/ for more information.
extends:
  - recommended
rules:
  no-unused-components: error
theme:
  openapi:
    schemaExpansionLevel: 2
    generateCodeSamples:
      languages:
        - lang: curl
        - lang: Python
        - lang: JavaScript
        - lang: Node.js


@ -1927,6 +1927,28 @@
require-from-string "^2.0.2"
uri-js "^4.2.2"
"@redocly/cli@^1.4.1":
version "1.4.1"
resolved "https://registry.yarnpkg.com/@redocly/cli/-/cli-1.4.1.tgz#6ca3a02a272d0fa33b687ec0a964bca5d4c4c3c7"
integrity sha512-c0v8SYyqC1QvImJrGhw6+wIPXn10Zhp1sUGmvXIIXZQzS7fL15+qBH4f19nh3ChpSdO4x0EdBFvvTovrupV4sQ==
dependencies:
"@redocly/openapi-core" "1.4.1"
chokidar "^3.5.1"
colorette "^1.2.0"
core-js "^3.32.1"
get-port-please "^3.0.1"
glob "^7.1.6"
handlebars "^4.7.6"
mobx "^6.0.4"
node-fetch "^2.6.1"
react "^17.0.0 || ^18.2.0"
react-dom "^17.0.0 || ^18.2.0"
redoc "~2.1.2"
semver "^7.5.2"
simple-websocket "^9.0.0"
styled-components "^6.0.7"
yargs "17.0.1"
"@redocly/openapi-core@1.4.0":
version "1.4.0"
resolved "https://registry.yarnpkg.com/@redocly/openapi-core/-/openapi-core-1.4.0.tgz#d1ce8e391b32452082f754315c8eb265690b784f"
@ -1943,7 +1965,7 @@
pluralize "^8.0.0"
yaml-ast-parser "0.0.43"
"@redocly/openapi-core@^1.0.0-rc.2":
"@redocly/openapi-core@1.4.1", "@redocly/openapi-core@^1.0.0-rc.2":
version "1.4.1"
resolved "https://registry.yarnpkg.com/@redocly/openapi-core/-/openapi-core-1.4.1.tgz#0620a5e204159626a1d99b88f758e23ef0cb5740"
integrity sha512-oAhnG8MKocM9LuP++NGFxdniNKWSLA7hzHPQoOK92LIP/DdvXx8pEeZ68UTNxIXhKonoUcO6s86I3L0zj143zg==
@ -3135,7 +3157,7 @@ cheerio@^1.0.0-rc.12:
parse5 "^7.0.0"
parse5-htmlparser2-tree-adapter "^7.0.0"
"chokidar@>=3.0.0 <4.0.0", chokidar@^3.4.2, chokidar@^3.5.3:
"chokidar@>=3.0.0 <4.0.0", chokidar@^3.4.2, chokidar@^3.5.1, chokidar@^3.5.3:
version "3.5.3"
resolved "https://registry.yarnpkg.com/chokidar/-/chokidar-3.5.3.tgz#1cf37c8707b932bd1af1ae22c0432e2acd1903bd"
integrity sha512-Dr3sfKRP6oTcjf2JmUmFJfeVMvXBdegxB0iVQ5eb2V10uFJUCAS8OByZdVAyVb8xXNz3GjjTgj9kLWsZTqE6kw==
@ -3477,7 +3499,7 @@ core-js@^2.4.1:
resolved "https://registry.yarnpkg.com/core-js/-/core-js-2.6.12.tgz#d9333dfa7b065e347cc5682219d6f690859cc2ec"
integrity sha512-Kb2wC0fvsWfQrgk8HU5lW6U/Lcs8+9aaYcy4ZFc6DDlo4nZ7n70dEgE5rtR0oG6ufKDUnrwfWL1mXR5ljDatrQ==
core-js@^3.23.3:
core-js@^3.23.3, core-js@^3.32.1:
version "3.33.3"
resolved "https://registry.yarnpkg.com/core-js/-/core-js-3.33.3.tgz#3c644a323f0f533a0d360e9191e37f7fc059088d"
integrity sha512-lo0kOocUlLKmm6kv/FswQL8zbkH7mVsLJ/FULClOhv8WRVmKLVcs6XPNQAzstfeJTCHMyButEwG+z1kHxHoDZw==
@ -4003,7 +4025,7 @@ debug@2.6.9, debug@^2.6.0:
dependencies:
ms "2.0.0"
debug@4, debug@^4.1.0, debug@^4.1.1:
debug@4, debug@^4.1.0, debug@^4.1.1, debug@^4.3.1:
version "4.3.4"
resolved "https://registry.yarnpkg.com/debug/-/debug-4.3.4.tgz#1319f6579357f2338d3337d2cdd4914bb5dcc865"
integrity sha512-PRWFHuSU3eDtQJPvnNY7Jcket1j0t5OuOsFzPPzsekD52Zl8qUfFIPEiswXqIvHWGVHOgX+7G/vCNNhehwxfkQ==
@ -4816,6 +4838,11 @@ get-own-enumerable-property-symbols@^3.0.0:
resolved "https://registry.yarnpkg.com/get-own-enumerable-property-symbols/-/get-own-enumerable-property-symbols-3.0.2.tgz#b5fde77f22cbe35f390b4e089922c50bce6ef664"
integrity sha512-I0UBV/XOz1XkIJHEUDMZAbzCThU/H8DxmSfmdGcKPnVhu2VfFqr34jr9777IyaTYvxjedWhqVIilEDsCdP5G6g==
get-port-please@^3.0.1:
version "3.1.1"
resolved "https://registry.yarnpkg.com/get-port-please/-/get-port-please-3.1.1.tgz#2556623cddb4801d823c0a6a15eec038abb483be"
integrity sha512-3UBAyM3u4ZBVYDsxOQfJDxEa6XTbpBDrOjp4mf7ExFRt5BKs/QywQQiJsh2B+hxcZLSapWqCRvElUe8DnKcFHA==
get-stream@^4.1.0:
version "4.1.0"
resolved "https://registry.yarnpkg.com/get-stream/-/get-stream-4.1.0.tgz#c1b255575f3dc21d59bfc79cd3d2b46b1c3a54b5"
@ -4985,6 +5012,18 @@ handle-thing@^2.0.0:
resolved "https://registry.yarnpkg.com/handle-thing/-/handle-thing-2.0.1.tgz#857f79ce359580c340d43081cc648970d0bb234e"
integrity sha512-9Qn4yBxelxoh2Ow62nP+Ka/kMnOXRi8BXnRaUwezLNhqelnN49xKz4F/dPP8OYLxLxq6JDtZb2i9XznUQbNPTg==
handlebars@^4.7.6:
version "4.7.8"
resolved "https://registry.yarnpkg.com/handlebars/-/handlebars-4.7.8.tgz#41c42c18b1be2365439188c77c6afae71c0cd9e9"
integrity sha512-vafaFqs8MZkRrSX7sFVUdo3ap/eNiLnb4IakshzvP56X5Nr1iGKAIqdX6tMlm6HcNRIkr6AxO5jFEoJzzpT8aQ==
dependencies:
minimist "^1.2.5"
neo-async "^2.6.2"
source-map "^0.6.1"
wordwrap "^1.0.0"
optionalDependencies:
uglify-js "^3.1.4"
has-flag@^3.0.0:
version "3.0.0"
resolved "https://registry.yarnpkg.com/has-flag/-/has-flag-3.0.0.tgz#b5d454dc2199ae225699f3467e5a07f3b955bafd"
@ -6133,7 +6172,7 @@ mobx-react@^7.2.0:
dependencies:
mobx-react-lite "^3.4.0"
mobx@^6.10.2:
mobx@^6.0.4, mobx@^6.10.2:
version "6.11.0"
resolved "https://registry.yarnpkg.com/mobx/-/mobx-6.11.0.tgz#8a748b18c140892d1d0f28b71315f1f639180006"
integrity sha512-qngYCmr0WJiFRSAtYe82DB7SbzvbhehkJjONs8ydynUwoazzUQHZdAlaJqUfks5j4HarhWsZrMRhV7HtSO9HOQ==
@ -7204,14 +7243,13 @@ react-dev-utils@^12.0.1:
strip-ansi "^6.0.1"
text-table "^0.2.0"
react-dom@^17.0.2:
version "17.0.2"
resolved "https://registry.yarnpkg.com/react-dom/-/react-dom-17.0.2.tgz#ecffb6845e3ad8dbfcdc498f0d0a939736502c23"
integrity sha512-s4h96KtLDUQlsENhMn1ar8t2bEa+q/YAtj8pPPdIjPDGBDIVNsrD9aXNWqspUe6AzKCIG0C1HZZLqLV7qpOBGA==
"react-dom@^17.0.0 || ^18.2.0", react-dom@^18.2.0:
version "18.2.0"
resolved "https://registry.yarnpkg.com/react-dom/-/react-dom-18.2.0.tgz#22aaf38708db2674ed9ada224ca4aa708d821e3d"
integrity sha512-6IMTriUmvsjHUjNtEDudZfuDQUoWXVxKHhlEGSk81n4YFS+r/Kl99wXiwlVXtPBtJenozv2P+hxDsw9eA7Xo6g==
dependencies:
loose-envify "^1.1.0"
object-assign "^4.1.1"
scheduler "^0.20.2"
scheduler "^0.23.0"
react-error-overlay@^6.0.11:
version "6.0.11"
@ -7345,13 +7383,12 @@ react-textarea-autosize@^8.3.2:
use-composed-ref "^1.3.0"
use-latest "^1.2.1"
react@^17.0.2:
version "17.0.2"
resolved "https://registry.yarnpkg.com/react/-/react-17.0.2.tgz#d0b5cc516d29eb3eee383f75b62864cfb6800037"
integrity sha512-gnhPt75i/dq/z3/6q/0asP78D0u592D5L1pd7M8P+dck6Fu/jJeL6iVVK23fptSUZj8Vjf++7wXA8UNclGQcbA==
"react@^17.0.0 || ^18.2.0", react@^18.2.0:
version "18.2.0"
resolved "https://registry.yarnpkg.com/react/-/react-18.2.0.tgz#555bd98592883255fa00de14f1151a917b5d77d5"
integrity sha512-/3IjMdb2L9QbBdWiW5e3P2/npwMBaU9mHCSCUzNln0ZCYbcfTsGbTJrU/kGemdH2IWmB2ioZ+zkxtmq6g09fGQ==
dependencies:
loose-envify "^1.1.0"
object-assign "^4.1.1"
read-cache@^1.0.0:
version "1.0.0"
@ -7373,7 +7410,7 @@ readable-stream@^2.0.1, readable-stream@~2.3.6:
string_decoder "~1.1.1"
util-deprecate "~1.0.1"
readable-stream@^3.0.6:
readable-stream@^3.0.6, readable-stream@^3.6.0:
version "3.6.2"
resolved "https://registry.yarnpkg.com/readable-stream/-/readable-stream-3.6.2.tgz#56a9b36ea965c00c5a93ef31eb111a0f11056967"
integrity sha512-9u/sniCrY3D5WdsERHzHE4G2YCXqoG5FTHUiCC4SIbr6XcLZBY05ya9EKjYek9O5xOAwjGq+1JdGBAS7Q9ScoA==
@ -7418,7 +7455,7 @@ recursive-readdir@^2.2.2:
dependencies:
minimatch "^3.0.5"
redoc@2.1.3:
redoc@2.1.3, redoc@~2.1.2:
version "2.1.3"
resolved "https://registry.yarnpkg.com/redoc/-/redoc-2.1.3.tgz#612c9fed744993d5fc99cbf39fe9056bd1034fa5"
integrity sha512-d7F9qLLxaiFW4GC03VkwlX9wuRIpx9aiIIf3o6mzMnqPfhxrn2IRKGndrkJeVdItgCfmg9jXZiFEowm60f1meQ==
@ -7766,13 +7803,12 @@ sax@^1.2.4:
resolved "https://registry.yarnpkg.com/sax/-/sax-1.3.0.tgz#a5dbe77db3be05c9d1ee7785dbd3ea9de51593d0"
integrity sha512-0s+oAmw9zLl1V1cS9BtZN7JAd0cW5e0QH4W3LWEK6a4LaLEA2OTpGYWDY+6XasBLtz6wkm3u1xRw95mRuJ59WA==
scheduler@^0.20.2:
version "0.20.2"
resolved "https://registry.yarnpkg.com/scheduler/-/scheduler-0.20.2.tgz#4baee39436e34aa93b4874bddcbf0fe8b8b50e91"
integrity sha512-2eWfGgAqqWFGqtdMmcL5zCMK1U8KlXv8SQFGglL3CEtd0aDVDWgeF/YoCmvln55m5zSk3J/20hTaSBeSObsQDQ==
scheduler@^0.23.0:
version "0.23.0"
resolved "https://registry.yarnpkg.com/scheduler/-/scheduler-0.23.0.tgz#ba8041afc3d30eb206a487b6b384002e4e61fdfe"
integrity sha512-CtuThmgHNg7zIZWAXi3AsyIzA3n4xx7aNyjwC2VJldO2LMVDhFK+63xGqq6CsJH4rTAt6/M+N4GhZiDYPx9eUw==
dependencies:
loose-envify "^1.1.0"
object-assign "^4.1.1"
schema-utils@2.7.0:
version "2.7.0"
@ -7849,7 +7885,7 @@ semver@^6.0.0, semver@^6.2.0, semver@^6.3.0, semver@^6.3.1:
resolved "https://registry.yarnpkg.com/semver/-/semver-6.3.1.tgz#556d2ef8689146e46dcea4bfdd095f3434dffcb4"
integrity sha512-BR7VvDCVHO+q2xBEWskxS6DJE1qRnb7DxzUrogb71CWoSficBxYsiAGd+Kl0mmq/MprG9yArRkyrQxTO6XjMzA==
semver@^7.3.2, semver@^7.3.4, semver@^7.3.7, semver@^7.3.8:
semver@^7.3.2, semver@^7.3.4, semver@^7.3.7, semver@^7.3.8, semver@^7.5.2:
version "7.5.4"
resolved "https://registry.yarnpkg.com/semver/-/semver-7.5.4.tgz#483986ec4ed38e1c6c48c34894a9182dbff68a6e"
integrity sha512-1bCSESV6Pv+i21Hvpxp3Dx+pSD8lIPt8uVjRrxAUt/nbswYc+tK6Y2btiULjd4+fnq15PX+nqQDC7Oft7WkwcA==
@ -8040,6 +8076,17 @@ signal-exit@^3.0.2, signal-exit@^3.0.3:
resolved "https://registry.yarnpkg.com/signal-exit/-/signal-exit-3.0.7.tgz#a9a1767f8af84155114eaabd73f99273c8f59ad9"
integrity sha512-wnD2ZE+l+SPC/uoS0vXeE9L1+0wuaMqKlfz9AMUo38JsyLSBWSFcHR1Rri62LZc12vLr1gb3jl7iwQhgwpAbGQ==
simple-websocket@^9.0.0:
version "9.1.0"
resolved "https://registry.yarnpkg.com/simple-websocket/-/simple-websocket-9.1.0.tgz#91cbb39eafefbe7e66979da6c639109352786a7f"
integrity sha512-8MJPnjRN6A8UCp1I+H/dSFyjwJhp6wta4hsVRhjf8w9qBHRzxYt14RaOcjvQnhD1N4yKOddEjflwMnQM4VtXjQ==
dependencies:
debug "^4.3.1"
queue-microtask "^1.2.2"
randombytes "^2.1.0"
readable-stream "^3.6.0"
ws "^7.4.2"
sirv@^2.0.3:
version "2.0.3"
resolved "https://registry.yarnpkg.com/sirv/-/sirv-2.0.3.tgz#ca5868b87205a74bef62a469ed0296abceccd446"
@ -8271,7 +8318,7 @@ style-to-object@0.3.0, style-to-object@^0.3.0:
dependencies:
inline-style-parser "0.1.1"
styled-components@^6.1.0:
styled-components@^6.0.7, styled-components@^6.1.0:
version "6.1.1"
resolved "https://registry.yarnpkg.com/styled-components/-/styled-components-6.1.1.tgz#a5414ada07fb1c17b96a26a05369daa4e2ad55e5"
integrity sha512-cpZZP5RrKRIClBW5Eby4JM1wElLVP4NQrJbJ0h10TidTyJf4SIIwa3zLXOoPb4gJi8MsJ8mjq5mu2IrEhZIAcQ==
@ -8571,6 +8618,11 @@ ua-parser-js@^1.0.35:
resolved "https://registry.yarnpkg.com/ua-parser-js/-/ua-parser-js-1.0.37.tgz#b5dc7b163a5c1f0c510b08446aed4da92c46373f"
integrity sha512-bhTyI94tZofjo+Dn8SN6Zv8nBDvyXTymAdM3LDI/0IboIUwTu1rEhW7v2TfiVsoYWgkQ4kOVqnI8APUFbIQIFQ==
uglify-js@^3.1.4:
version "3.17.4"
resolved "https://registry.yarnpkg.com/uglify-js/-/uglify-js-3.17.4.tgz#61678cf5fa3f5b7eb789bb345df29afb8257c22c"
integrity sha512-T9q82TJI9e/C1TAxYvfb16xO120tMVFZrGA3f9/P4424DNu6ypK103y0GPFVa17yotwSyZW5iYXgjYHkGrJW/g==
undici-types@~5.26.4:
version "5.26.5"
resolved "https://registry.yarnpkg.com/undici-types/-/undici-types-5.26.5.tgz#bcd539893d00b56e964fd2657a4866b221a65617"
@ -9080,6 +9132,11 @@ wildcard@^2.0.0:
resolved "https://registry.yarnpkg.com/wildcard/-/wildcard-2.0.1.tgz#5ab10d02487198954836b6349f74fff961e10f67"
integrity sha512-CC1bOL87PIWSBhDcTrdeLo6eGT7mCFtrg0uIJtqJUFyK+eJnzl8A1niH56uu7KMa5XFrtiV+AQuHO3n7DsHnLQ==
wordwrap@^1.0.0:
version "1.0.0"
resolved "https://registry.yarnpkg.com/wordwrap/-/wordwrap-1.0.0.tgz#27584810891456a4171c8d0226441ade90cbcaeb"
integrity sha512-gvVzJFlPycKc5dZN4yPkP8w7Dc37BtP1yczEneOb4uq34pXZcvrtRTmWV8W+Ume+XCxKgbjM+nevkyFPMybd4Q==
wrap-ansi@^7.0.0:
version "7.0.0"
resolved "https://registry.yarnpkg.com/wrap-ansi/-/wrap-ansi-7.0.0.tgz#67e145cff510a6a6984bdf1152911d69d2eb9e43"
@ -9113,7 +9170,7 @@ write-file-atomic@^3.0.0:
signal-exit "^3.0.2"
typedarray-to-buffer "^3.1.5"
ws@^7.3.1:
ws@^7.3.1, ws@^7.4.2:
version "7.5.9"
resolved "https://registry.yarnpkg.com/ws/-/ws-7.5.9.tgz#54fa7db29f4c7cec68b1ddd3a89de099942bb591"
integrity sha512-F+P9Jil7UiSKSkppIiD94dN07AwvFixvLIj1Og1Rl9GGMuNipJnV9JzjD6XuqmAeiswGvUmNLjr5cFuXwNS77Q==
@ -9180,6 +9237,19 @@ yargs-parser@^21.1.1:
resolved "https://registry.yarnpkg.com/yargs-parser/-/yargs-parser-21.1.1.tgz#9096bceebf990d21bb31fa9516e0ede294a77d35"
integrity sha512-tVpsJW7DdjecAiFpbIB1e3qxIQsE6NoPc5/eTdrbbIC4h0LVsWhnoa3g+m2HclBIujHzsxZ4VJVA+GUuc2/LBw==
yargs@17.0.1:
version "17.0.1"
resolved "https://registry.yarnpkg.com/yargs/-/yargs-17.0.1.tgz#6a1ced4ed5ee0b388010ba9fd67af83b9362e0bb"
integrity sha512-xBBulfCc8Y6gLFcrPvtqKz9hz8SO0l1Ni8GgDekvBX2ro0HRQImDGnikfc33cgzcYUSncapnNcZDjVFIH3f6KQ==
dependencies:
cliui "^7.0.2"
escalade "^3.1.1"
get-caller-file "^2.0.5"
require-directory "^2.1.1"
string-width "^4.2.0"
y18n "^5.0.5"
yargs-parser "^20.2.2"
yargs@^16.1.0:
version "16.2.0"
resolved "https://registry.yarnpkg.com/yargs/-/yargs-16.2.0.tgz#1c82bf0f6b6a66eafce7ef30e376f49a12477f66"


@ -2,11 +2,11 @@
rem Attempt to run nitro_windows_amd64_cuda.exe
cd win-cuda
nitro.exe
nitro.exe %*
rem Check the exit code of the previous command
if %errorlevel% neq 0 (
echo nitro_windows_amd64_cuda.exe encountered an error, attempting to run nitro_windows_amd64.exe...
cd ..\win-cpu
nitro.exe
nitro.exe %*
)