added-new-content
New binary files (images) added:

- docs/docs/hardware/concepts/concepts-images/GPU.png (388 KiB)
- docs/docs/hardware/concepts/concepts-images/GPU_Image.png (945 KiB)
- docs/docs/hardware/concepts/concepts-images/PCIex16.png (2.2 MiB)
- docs/docs/hardware/concepts/concepts-images/Power.png (453 KiB)
- docs/docs/hardware/concepts/concepts-images/RAM-VRAM.png (349 KiB)
- docs/docs/hardware/concepts/concepts-images/VRAM-Image.png (553 KiB)
- docs/docs/hardware/concepts/concepts-images/slot.png (636 KiB)
@@ -2,7 +2,131 @@
title: GPUs and VRAM
---

- GPUs plugging into the motherboard via PCIe
- Multiple GPUs
- NVLink
- PCIe (and motherboard limitations)

## What Is a GPU?

A graphics card, or GPU (Graphics Processing Unit), is a fundamental component in modern computing. Think of it as the powerhouse behind rendering the visuals you see on your screen. Like the motherboard, the graphics card is a printed circuit board. It is not just a passive piece of hardware, however: it is a sophisticated device equipped with fans, onboard RAM, a dedicated memory controller, a BIOS, and various other components. To learn more about GPUs, read [Understanding the Architecture of a GPU](https://medium.com/codex/understanding-the-architecture-of-a-gpu-d5d2d2e8978b).

![GPU-Image](concepts-images/GPU_Image.png)
## What Are GPUs Used For?

Two decades ago, GPUs primarily enhanced real-time 3D graphics in gaming. As the 21st century dawned, computer scientists recognized that GPUs held untapped potential to tackle some of the world's most intricate computing tasks.

That realization marked the dawn of the general-purpose GPU era. Today's GPUs have evolved into versatile tools, more adaptable than ever before, capable of accelerating a diverse range of applications that stretch well beyond their original graphics-focused purpose.

### **Here are some example use cases:**

1. **Gaming**: They make games look good and run smoothly.
2. **Content Creation**: They help with video editing, 3D design, and graphics work.
3. **AI and Machine Learning**: They are used to train machine-learning models.
4. **Science**: They speed up scientific calculations and simulations.
5. **Cryptocurrency Mining**: They mine digital currencies such as Bitcoin.
6. **Medical Imaging**: They aid in analyzing medical images.
7. **Self-Driving Cars**: They help cars navigate autonomously.
8. **Simulations**: They create realistic virtual experiences.
9. **Data Analysis**: They speed up data processing and visualization.
10. **Video Streaming**: They improve video quality and streaming efficiency.
## What Is VRAM in a GPU?

VRAM, or video random-access memory, is a type of high-speed memory designed specifically for use with graphics processing units (GPUs). VRAM stores the textures, images, and other data that the GPU needs to render graphics. This allows the GPU to access that data quickly and efficiently, which is essential for rendering complex graphics at high frame rates.

VRAM is different from other types of memory, such as the system RAM used by the CPU. VRAM is optimized for high bandwidth and low latency, meaning it can read and write data very quickly. The amount of VRAM a GPU has is one of the factors that determines its performance: more VRAM allows the GPU to store more data and render more complex graphics. However, VRAM is also one of the most expensive components of a GPU, so when choosing a graphics card, it is important to consider how much VRAM it has. If you plan on running demanding LLMs, video games, or 3D graphics software, you will need a graphics card with more VRAM.

![VRAM-Image](concepts-images/VRAM-Image.png)
## What Makes VRAM and RAM Different from Each Other?

RAM (Random Access Memory) and VRAM (Video Random Access Memory) are both types of memory used in computers, but they have different functions and characteristics. Here are the differences between RAM and VRAM.

### RAM (Random Access Memory):

- RAM is general-purpose memory that stores data and instructions the CPU needs to access quickly.
- RAM is used for short-term data storage and is volatile, meaning it loses its contents when the computer is turned off.
- RAM is connected to the motherboard and is accessed by the CPU.
- RAM typically has a larger capacity than VRAM, which is designed to store smaller amounts of data with faster access times.
- RAM stores data related to the operating system and the various programs that are running, including code, program files, and user data.

### VRAM (Video Random Access Memory):

- VRAM is a type of RAM used specifically to store image data for a computer display.
- VRAM is a graphics card component connected directly to the GPU (Graphics Processing Unit).
- VRAM is used exclusively by the GPU and does not need to store as much data as system RAM.
- VRAM, like RAM, is volatile and loses its contents when the computer is turned off.
- VRAM stores data related specifically to graphics, such as textures, frames, and other graphical data.
- VRAM is designed to store smaller amounts of data with faster access times than RAM.

In summary, RAM is used as general-purpose memory, while VRAM is used for graphics-related tasks. RAM has a larger capacity and is accessed by the CPU, while VRAM has a smaller capacity and is accessed by the GPU.

**Key differences between VRAM and RAM:**
| Characteristic | VRAM                  | RAM                   |
| -------------- | --------------------- | --------------------- |
| Purpose        | Graphics processing   | General processing    |
| Speed          | Faster                | Slower                |
| Latency        | Lower                 | Higher                |
| Bandwidth      | Higher                | Lower                 |
| Cost           | More expensive        | Less expensive        |
| Availability   | Less widely available | More widely available |

![RAM-VRAM](concepts-images/RAM-VRAM.png)
## How to Connect a GPU to the Motherboard via PCIe

Connecting hardware components to a motherboard is often likened to assembling LEGO pieces: if the parts fit together seamlessly, you're on the right track. Experienced PC builders find this process straightforward. For first-time builders, however, identifying where each hardware component belongs on the motherboard can be a bit perplexing.

**Follow these five steps to connect your GPU to the motherboard:**

1. First, make sure your computer is powered off and unplugged from the electrical outlet to ensure safety.

2. Open your computer case if necessary to access the motherboard. Locate the PCIe x16 slot on the motherboard where you'll install the GPU. These slots are typically longer than other expansion slots and are used for graphics cards.

   Remove slot covers (if applicable): some PCIe slots may have protective covers or brackets over them. Remove these by unscrewing them from the case with a Phillips-head screwdriver. A PCIe x16 slot has a plastic lock on one side only. There may be more than one PCIe x16 slot depending on the motherboard; you can use whichever slot you prefer.

![PCIe x16](concepts-images/PCIex16.png)

3. Now insert the graphics card slowly:

- Unlock the plastic lock on one side of the PCIe x16 slot by pulling it outwards.

![Slot](concepts-images/slot.png)

- Align the graphics card with the PCIe slot, making sure that the HDMI-port side of the GPU faces the rear of the PC case.
- Gently press on the card until you hear it snap securely into place.

![GPU](concepts-images/GPU.png)

4. Insert the power connector: if your GPU requires additional power (most modern GPUs do), connect the necessary power cables from your power supply to the GPU's power connectors. These connectors are usually located on the top or side of the GPU.

![Power](concepts-images/Power.png)

5. Power on the system: after turning on the PC, check whether the fans on your graphics card spin. If they do not, remove the power cable from the GPU, reconnect it, and power on the PC again.
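Once the card is seated and powered, you can also confirm that the operating system sees it. Below is a minimal sketch, assuming an NVIDIA card and that the `nvidia-smi` driver tool is installed; the parsing helper is illustrative:

```python
import shutil
import subprocess

def parse_gpu_names(smi_csv: str) -> list[str]:
    """Parse the output of `nvidia-smi --query-gpu=name --format=csv,noheader`."""
    return [line.strip() for line in smi_csv.splitlines() if line.strip()]

def detected_gpus() -> list[str]:
    # Returns [] when the NVIDIA driver tools are not installed or the query fails.
    if shutil.which("nvidia-smi") is None:
        return []
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=name", "--format=csv,noheader"],
        capture_output=True, text=True,
    )
    return parse_gpu_names(out.stdout) if out.returncode == 0 else []

if __name__ == "__main__":
    gpus = detected_gpus()
    print("Detected GPUs:", gpus or "none (check seating, power cables, and drivers)")
```

If the list comes back empty even though the card is installed, recheck the seating and power connectors before suspecting drivers.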

> :memo: Note: For a clearer picture, you can also watch YouTube tutorials on how to connect a GPU to the motherboard via PCIe.
## How to Choose a Graphics Card for Your AI Work

Selecting the optimal GPU for running Large Language Models (LLMs) on your home PC is a decision influenced by your budget and the specific LLMs you intend to work with. Your choice should strike a balance between performance, efficiency, and cost-effectiveness.

In general, the following GPU features are important for running LLMs:

- **High VRAM:** LLMs are typically very large and complex models, so they require a GPU with a large amount of VRAM. This allows the model to be loaded into memory and processed efficiently.
- **CUDA compatibility:** When running LLMs on a GPU, CUDA compatibility is paramount. CUDA is NVIDIA's parallel computing platform, and it plays a vital role in accelerating deep learning tasks. LLMs, with their extensive matrix calculations, rely heavily on parallel processing. Ensuring your GPU supports CUDA is like having the right tool for the job: it allows the LLM to leverage the GPU's parallel processing capabilities, significantly speeding up model training and inference.
- **Number of CUDA, Tensor, and RT cores:** High-performance NVIDIA GPUs have both CUDA and Tensor cores. These cores execute the neural network computations that underpin an LLM's language understanding and generation. The more CUDA cores your GPU has, the better equipped it is to handle the massive computational load that LLMs impose. Tensor cores further enhance LLM performance by accelerating the matrix operations integral to language modeling tasks.
- **Generation (series):** When selecting a GPU for LLMs, consider its generation or series (e.g., RTX 30 series). Newer GPU generations often come with improved architectures and features. For LLM tasks, opting for the latest generation can mean better performance, energy efficiency, and support for emerging AI technologies. Avoid the RTX 20 series, which is quite dated by now.
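The criteria above can be turned into a quick sanity check before buying. This is a rough sketch; the fp16 byte size and the 20% headroom factor are common conventions, and the thresholds are illustrative assumptions rather than hard rules:

```python
def vram_needed_gb(params_billions: float, bytes_per_param: float = 2.0) -> float:
    """Rough VRAM needed for the weights (fp16 = 2 bytes/param), plus ~20% headroom.

    billions of params x bytes/param gives gigabytes directly (1e9 bytes ~ 1 GB).
    """
    return params_billions * bytes_per_param * 1.2

def gpu_can_run(vram_gb: float, supports_cuda: bool, params_billions: float) -> bool:
    """Does this card plausibly fit the model? CUDA support is required here."""
    return supports_cuda and vram_gb >= vram_needed_gb(params_billions)

# A 7B model in fp16 needs roughly 7 x 2 x 1.2 ~ 16.8 GB, so a 24 GB RTX 3090
# qualifies while a 16 GB card falls just short.
```

Swapping `bytes_per_param` to 0.5 models 4-bit quantization, which is how large models are squeezed onto smaller cards.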

### Here are some of the best GPU options for this purpose:

1. **NVIDIA RTX 3090**: The NVIDIA RTX 3090 is a high-end GPU with a substantial 24 GB of VRAM. This copious VRAM capacity makes it exceptionally well suited to handling large LLMs. Moreover, it is relatively efficient, so it won't overheat or excessively strain your home PC's cooling system. The RTX 3090's robust capabilities are a boon for those who need to work with hefty language models.
2. **NVIDIA RTX 4090**: If you're looking for peak performance and can afford the investment, the NVIDIA RTX 4090 represents the pinnacle of consumer GPU power. Boasting 24 GB of VRAM and a cutting-edge Tensor Core architecture tailored for AI workloads, it outshines the RTX 3090 in sheer capability. However, the RTX 4090 is also pricier and more power-hungry than its predecessor, the RTX 3090.
3. **AMD Radeon RX 6900 XT**: On the AMD side, the Radeon RX 6900 XT stands out as a high-end GPU with 16 GB of VRAM. While it may not quite match the raw power of the RTX 3090 or RTX 4090, it strikes a balance between performance and affordability. It also tends to be more power-efficient, which can translate into a more sustainable and quieter home PC setup.

If budget constraints are a consideration, there are more cost-effective GPU options available:

- **NVIDIA RTX 3070**: The RTX 3070 is a solid mid-range GPU that can handle LLMs effectively. While it may not excel with the most massive or complex language models, it's a reliable choice for users looking for a balance between price and performance.
- **AMD Radeon RX 6800 XT**: Similarly, the RX 6800 XT from AMD offers commendable performance without breaking the bank. It's well suited to running mid-sized LLMs and provides a competitive option in terms of both power and cost.

When selecting a GPU for LLMs, remember that it's not just about the GPU itself. Consider the synergy with other components in your PC:

- **CPU**: To ensure efficient processing, pair your GPU with a powerful CPU. LLMs benefit from fast processors, so having a capable CPU is essential.
- **RAM**: Sufficient RAM is crucial for LLMs. They can be memory-intensive, and having enough RAM ensures smooth operation.
- **Cooling system**: LLMs can push your PC's hardware to the limit. A robust cooling system helps maintain optimal temperatures, preventing overheating and performance throttling.

By taking all of these factors into account, you can build a home PC setup that's well equipped to handle the demands of running LLMs effectively and efficiently.
@@ -1,3 +1,19 @@
---
title: "@janhq: 2x4090 Workstation"
---



## Jan's 2x4090 Workstation setup components list:

| Type                 | Item                                             | Price   |
| :------------------- | :----------------------------------------------- | :------ |
| **CPU**              | [RYZEN THREADRIPPER PRO 5965WX 280W SP3 WOF](#)  | $2,229  |
| **Motherboard**      | [ASUS PRO WS WRX80E SAGE SE WIFI](#)             | $933    |
| **GPU**              | [ASUS STRIX RTX 4090 24GB OC](#)                 | $4,345  |
| **RAM**              | [G.SKILL RIPJAWS S5 2x32 6000C32](#)             | $92.99  |
| **Storage PCIe-SSD** | [SAMSUNG 990 PRO 2TB NVME 2.0](#)                | $134.99 |
| **Cooler**           | [BE QUIET! DARK ROCK PRO 4 TR4](#)               | $89.90  |
| **Power Supply**     | [FSP CANNON 2000W PRO 92+ FULL MODULAR PSU](#)   | $449.99 |
| **Case**             | [VEDDHA 6GPUS FRAME BLACK](#)                    | $59.99  |
| **Total cost**       |                                                  | $8,334  |
@@ -2,59 +2,3 @@
sidebar_position: 1
title: Introduction
---
## How to choose LLMs for your work

Choosing the right Large Language Model (LLM) doesn't have to be complicated; it's all about finding one that works well for your needs. Here's a simple guide to help you pick the right LLM:

1. **Set up the basics**: First, get everything ready on your computer. Make sure you have the right software and tools to run these models, then give them a try on your system.
2. **Watch your memory**: Pay attention to how much memory these models use. Some are bigger than others, and you need to make sure your computer can handle them.
3. **Find compatible models**: Look for models that are known to work well with the tools you're using, and keep them on your shortlist.
4. **Test them out**: Try the models on your shortlist with your specific task. Like comparing cars by taking them for a test drive, this helps you see which one works best for what you need.
5. **Pick the best fit**: After testing, you'll have a better idea of which model is the winner for your project. Consider how well it performs, how fast it is, and whether it works with your computer and software.
6. **Stay updated**: Remember that this field is always changing and improving. Keep an eye out for updates and new models that might suit your needs even better.

And the good news is that finding the right LLM is easier now. The Extractum LLM Explorer is a handy online tool that helps you discover, compare, and rank lots of different LLMs. Check it out at **[Extractum](http://llm.extractum.io/)** to make your selection process a breeze.

You can also use the [Model Memory Calculator](https://huggingface.co/spaces/hf-accelerate/model-memory-usage), a tool designed to help determine the VRAM required for training and inference with large models hosted on the Hugging Face Hub. It identifies the minimum recommended VRAM based on the size of the model's largest layer. Note that training typically requires approximately four times the size of the model, especially when using the Adam optimizer, and for inference you should expect up to an additional 20% on top of the model size, as found by [EleutherAI](https://blog.eleuther.ai/transformer-math/). More tests will be performed in the future to get a more accurate benchmark for each model.
## How to Calculate How Much VRAM Is Required for My Selected LLM

**For example:** calculating the VRAM required to run a 13-billion-parameter Large Language Model (LLM) involves considering the model size, the precision the weights are stored in, the batch size, the sequence length, and any additional overhead. Here's how you can estimate the VRAM required for a 13B LLM:

1. **Model size**: Find out the size of the LLM in terms of the number of parameters. This information is typically provided in the model's documentation; a 13-billion-parameter model has 13,000,000,000 parameters.
2. **Bytes per parameter**: Determine how much memory each parameter occupies. Full precision (fp32) uses 4 bytes per parameter, half precision (fp16) uses 2 bytes, and 4-bit quantization uses 0.5 bytes.
3. **Batch size**: Decide on the batch size you want to use during inference. The batch size is how many input samples you process simultaneously; smaller batch sizes require less VRAM for activations and the KV cache.
4. **Sequence length**: Determine the average length of the input sequences you'll work with. Longer sequences enlarge the KV cache and need more memory.
5. **Overhead**: Allow additional memory for the KV cache, intermediate activations, and framework requirements. Overhead varies, grows with batch size and sequence length, and should be estimated for your specific setup.

Use the following formula to estimate the VRAM required:

**VRAM required** = `Model Parameters x Bytes per Parameter + Overhead`

- **Model Parameters**: 13,000,000,000 for a 13B LLM.
- **Bytes per Parameter**: 4 (fp32), 2 (fp16), or 0.5 (4-bit quantization).
- **Overhead**: Additional VRAM for the KV cache and activations, driven by batch size and sequence length.

Here's an example. Suppose you want to run a 13B LLM with the following parameters:

- **Bytes per parameter**: 0.5 (4-bit quantization)
- **Batch size**: 4
- **Sequence length**: 512 tokens
- **Estimated overhead**: 2 GB

VRAM for the weights = `13,000,000,000 x 0.5 = 6,500,000,000 bytes`

To convert this to gibibytes, divide by `1,073,741,824 (1 GiB)`:

`6,500,000,000 / 1,073,741,824 ≈ 6.1 GB`

VRAM required ≈ `6.1 GB + 2 GB ≈ 8.1 GB`

So, to run a 13-billion-parameter LLM quantized to 4 bits with the specified batch size, sequence length, and overhead, you would need approximately 8 GB of VRAM on your GPU. Keep some additional VRAM free for stable operation, and test the setup in practice to monitor actual VRAM usage.
@@ -16,7 +16,7 @@ title: Recommended AI Hardware by Budget

| **Storage PCIe-SSD** | [ADATA XPG SX8200 Pro 512GB NVMe M.2 Solid State Drive](#) | $46.50 |
| **Power Supply**     | [Corsair CX-M Series CX450M 450W ATX 2.4 Power Supply](#)  | $89.99 |
| **Case**             | [be quiet! Pure Base 600 Black ATX Mid Tower Case](#)      | $97.00 |
| **Total**            |                                                            | $870   |
| **Total cost**       |                                                            | $870   |

## Entry-level PC Build at $1,500
@@ -30,7 +30,7 @@ title: Recommended AI Hardware by Budget

| **Storage PCIe-SSD** | [ADATA XPG SX8200 Pro 1TB NVMe M.2 Solid State Drive](#) | $109.99 |
| **Power Supply**     | [Corsair RMx Series RM650x 650W ATX 2.4 Power Supply](#) | $119.99 |
| **Case**             | [Corsair Carbide Series 200R ATX Mid Tower Case](#)      | $59.99  |
| **Total**            |                                                          | $1371   |
| **Total cost**       |                                                          | $1371   |

## Mid-range PC Build at $3000
@@ -45,7 +45,7 @@ title: Recommended AI Hardware by Budget

| **Case**         | [Fractal Design Pop Air ATX Mid Tower Case](https://de.pcpartpicker.com/product/QnD7YJ/fractal-design-pop-air-atx-mid-tower-case-fd-c-poa1a-02) | $89.99  |
| **Power Supply** | [Thermaltake Toughpower GF A3 - TT Premium Edition 1050 W 80+ Gold](https://de.pcpartpicker.com/product/4v3NnQ/thermaltake-toughpower-gf-a3-1050-w-80-gold-certified-fully-modular-atx-power-supply-ps-tpd-1050fnfagu-l) | $139.99 |
| **Total**        | **$3000** |
| **Total cost**   | **$3000** |

## High-End PC Build at $6,000
@@ -59,4 +59,4 @@ title: Recommended AI Hardware by Budget

| **GPU**          | [PNY RTX A-Series RTX A6000 48 GB Video Card](https://pcpartpicker.com/product/HWt9TW/pny-rtx-a-series-rtx-a6000-48-gb-video-card-vcnrtxa6000-pb) | $4269.00 |
| **Power Supply** | [EVGA SuperNOVA 850 G2 850 W 80+ Gold](https://pcpartpicker.com/product/LCfp99/evga-supernova-850-g2-850-w-80-gold-certified-fully-modular-atx-power-supply-220-g2-0850-xr) | $322.42 |
| **Total**        | **$6026.34** |
| **Total cost**   | **$6026.34** |
@@ -1,5 +1,5 @@
---
title: Recommended AI Models by Hardware
title: Recommended AI Hardware
---

## Overview

@@ -94,7 +94,6 @@ **Recommended RAM options for running LLMs:**

In addition to the amount of RAM, it is also important to consider the speed of the RAM. LLMs can benefit from fast RAM, so it is recommended to use DDR4 or DDR5 RAM with a speed of at least 3200 MHz.

## Motherboard Selection

When picking a motherboard to run advanced language models, you need to think about a few things. First, consider the specific language model you want to use, the type of CPU and GPU in your computer, and your budget. Here are some suggestions:
@@ -141,6 +140,62 @@ To understand the impact of memory bandwidth, let's consider an example. If your

For broader context, consider top-tier GPUs like the Nvidia RTX 3090, which offers an impressive 936 GB/s of VRAM bandwidth. In contrast, the latest DDR5 RAM provides up to about 100 GB/s of memory bandwidth. Recognizing and optimizing for memory bandwidth is paramount to efficiently running models like CodeLlama, as it directly influences the speed at which you can generate tokens during inference.
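Because each generated token requires streaming essentially all of the model's weights from memory, memory bandwidth puts a hard ceiling on generation speed. A back-of-the-envelope sketch (an upper bound that ignores compute time and caching effects; the bandwidth figures are approximate published specs):

```python
def max_tokens_per_second(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Upper bound on tokens/s: generating each token reads the full weights once."""
    return bandwidth_gb_s / model_size_gb

# A 13B model in fp16 is about 26 GB of weights.
rtx_3090 = max_tokens_per_second(936, 26)  # from VRAM: roughly 36 tokens/s
ddr5 = max_tokens_per_second(100, 26)      # from system RAM: roughly 4 tokens/s
print(f"RTX 3090: ~{rtx_3090:.0f} tok/s, DDR5: ~{ddr5:.0f} tok/s")
```

The roughly 9x gap between the two estimates is exactly why offloading model layers to system RAM slows generation so dramatically.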

## How to choose LLMs for your work

Choosing the right Large Language Model (LLM) doesn't have to be complicated; it's all about finding one that works well for your needs. Here's a simple guide to help you pick the right LLM:

1. **Set up the basics**: First, get everything ready on your computer. Make sure you have the right software and tools to run these models, then give them a try on your system.
2. **Watch your memory**: Pay attention to how much memory these models use. Some are bigger than others, and you need to make sure your computer can handle them.
3. **Find compatible models**: Look for models that are known to work well with the tools you're using, and keep them on your shortlist.
4. **Test them out**: Try the models on your shortlist with your specific task. Like comparing cars by taking them for a test drive, this helps you see which one works best for what you need.
5. **Pick the best fit**: After testing, you'll have a better idea of which model is the winner for your project. Consider how well it performs, how fast it is, and whether it works with your computer and software.
6. **Stay updated**: Remember that this field is always changing and improving. Keep an eye out for updates and new models that might suit your needs even better.

And the good news is that finding the right LLM is easier now. The Extractum LLM Explorer is a handy online tool that helps you discover, compare, and rank lots of different LLMs. Check it out at **[Extractum](http://llm.extractum.io/)** to make your selection process a breeze.

You can also use the [Model Memory Calculator](https://huggingface.co/spaces/hf-accelerate/model-memory-usage), a tool designed to help determine the VRAM required for training and inference with large models hosted on the Hugging Face Hub. It identifies the minimum recommended VRAM based on the size of the model's largest layer. Note that training typically requires approximately four times the size of the model, especially when using the Adam optimizer, and for inference you should expect up to an additional 20% on top of the model size, as found by [EleutherAI](https://blog.eleuther.ai/transformer-math/). More tests will be performed in the future to get a more accurate benchmark for each model.
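The two rules of thumb above (roughly 4x the model size for Adam training, up to 20% extra for inference) are easy to apply directly. A small sketch; both multipliers are the coarse estimates cited above, not exact figures:

```python
def inference_vram_gb(model_size_gb: float) -> float:
    """Model weights plus up to ~20% extra for inference."""
    return model_size_gb * 1.2

def training_vram_gb(model_size_gb: float) -> float:
    """Weights + gradients + Adam optimizer states: roughly 4x the model size."""
    return model_size_gb * 4

# A 7B model in fp16 is ~14 GB of weights:
print(inference_vram_gb(14))  # about 16.8 GB to run it
print(training_vram_gb(14))   # about 56 GB to train it with Adam
```

This is why a card that comfortably runs a model can still be far too small to fine-tune it.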

## How to Calculate How Much VRAM Is Required for My Selected LLM

**For example:** calculating the VRAM required to run a 13-billion-parameter Large Language Model (LLM) involves considering the model size, the precision the weights are stored in, the batch size, the sequence length, and any additional overhead. Here's how you can estimate the VRAM required for a 13B LLM:

1. **Model size**: Find out the size of the LLM in terms of the number of parameters. This information is typically provided in the model's documentation; a 13-billion-parameter model has 13,000,000,000 parameters.
2. **Bytes per parameter**: Determine how much memory each parameter occupies. Full precision (fp32) uses 4 bytes per parameter, half precision (fp16) uses 2 bytes, and 4-bit quantization uses 0.5 bytes.
3. **Batch size**: Decide on the batch size you want to use during inference. The batch size is how many input samples you process simultaneously; smaller batch sizes require less VRAM for activations and the KV cache.
4. **Sequence length**: Determine the average length of the input sequences you'll work with. Longer sequences enlarge the KV cache and need more memory.
5. **Overhead**: Allow additional memory for the KV cache, intermediate activations, and framework requirements. Overhead varies, grows with batch size and sequence length, and should be estimated for your specific setup.

Use the following formula to estimate the VRAM required:

**VRAM required** = `Model Parameters x Bytes per Parameter + Overhead`

- **Model Parameters**: 13,000,000,000 for a 13B LLM.
- **Bytes per Parameter**: 4 (fp32), 2 (fp16), or 0.5 (4-bit quantization).
- **Overhead**: Additional VRAM for the KV cache and activations, driven by batch size and sequence length.

Here's an example. Suppose you want to run a 13B LLM with the following parameters:

- **Bytes per parameter**: 0.5 (4-bit quantization)
- **Batch size**: 4
- **Sequence length**: 512 tokens
- **Estimated overhead**: 2 GB

VRAM for the weights = `13,000,000,000 x 0.5 = 6,500,000,000 bytes`

To convert this to gibibytes, divide by `1,073,741,824 (1 GiB)`:

`6,500,000,000 / 1,073,741,824 ≈ 6.1 GB`

VRAM required ≈ `6.1 GB + 2 GB ≈ 8.1 GB`

So, to run a 13-billion-parameter LLM quantized to 4 bits with the specified batch size, sequence length, and overhead, you would need approximately 8 GB of VRAM on your GPU. Keep some additional VRAM free for stable operation, and test the setup in practice to monitor actual VRAM usage.
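This estimate is easy to wrap in a small helper for other model sizes and precisions. A sketch; the byte sizes per parameter are the usual quantization conventions, and the overhead is a caller-supplied guess:

```python
def estimate_vram_gb(params: int, bytes_per_param: float, overhead_gb: float = 2.0) -> float:
    """Estimate VRAM to run a model: weights plus a fixed overhead for KV cache etc."""
    weights_gib = params * bytes_per_param / 1_073_741_824  # bytes -> GiB
    return weights_gib + overhead_gb

# 13B parameters at 4-bit quantization (0.5 bytes/param) with 2 GB overhead:
print(round(estimate_vram_gb(13_000_000_000, 0.5), 1))  # about 8.1

# The same model in fp16 (2 bytes/param) needs roughly 26 GB, which is why
# quantization is what makes 13B models fit on consumer cards.
```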

<!--
## Macbook 8GB RAM

@@ -14,9 +14,9 @@
    "write-heading-ids": "docusaurus write-heading-ids"
  },
  "dependencies": {
    "@docusaurus/core": "2.4.1",
    "@docusaurus/preset-classic": "2.4.1",
    "@docusaurus/theme-live-codeblock": "^2.4.1",
    "@docusaurus/core": "^2.4.3",
    "@docusaurus/preset-classic": "^2.4.3",
    "@docusaurus/theme-live-codeblock": "^2.4.3",
    "@headlessui/react": "^1.7.17",
    "@heroicons/react": "^2.0.18",
    "@mdx-js/react": "^1.6.22",
@@ -30,7 +30,7 @@
    "tailwindcss": "^3.3.3"
  },
  "devDependencies": {
    "@docusaurus/module-type-aliases": "2.4.1"
    "@docusaurus/module-type-aliases": "^2.4.3"
  },
  "browserslist": {
    "production": [
||||
@ -32,12 +32,7 @@ const sidebars = {
|
||||
collapsible: true,
|
||||
collapsed: false,
|
||||
link: { type: "doc", id: "features/features" },
|
||||
items: [
|
||||
"features/ai-models",
|
||||
"features/control",
|
||||
"features/acceleration",
|
||||
"features/extensions",
|
||||
],
|
||||
items: ["features/ai-models", "features/control", "features/acceleration", "features/extensions"],
|
||||
},
|
||||
],
|
||||
|
||||
@ -217,4 +212,22 @@ const sidebars = {
|
||||
],
|
||||
};
|
||||
|
||||
module.exports = {
|
||||
docs: [
|
||||
{
|
||||
type: "category",
|
||||
label: "Overview",
|
||||
link: {
|
||||
type: "generated-index",
|
||||
title: "Hardware Guides",
|
||||
description: "Learn about the most important Docusaurus concepts!",
|
||||
slug: "/category/docusaurus-guides",
|
||||
keywords: ["guides"],
|
||||
image: "/img/docusaurus.png",
|
||||
},
|
||||
items: ["pages", "docs", "blog", "search"],
|
||||
},
|
||||
],
|
||||
};
|
||||
|
||||
module.exports = sidebars;
|
||||
|
||||
docs/yarn.lock (368 changed lines)
@ -1250,10 +1250,10 @@
|
||||
"@docsearch/css" "3.5.2"
|
||||
algoliasearch "^4.19.1"
|
||||
|
||||
"@docusaurus/core@2.4.1":
|
||||
version "2.4.1"
|
||||
resolved "https://registry.yarnpkg.com/@docusaurus/core/-/core-2.4.1.tgz#4b8ff5766131ce3fbccaad0b1daf2ad4dc76f62d"
|
||||
integrity sha512-SNsY7PshK3Ri7vtsLXVeAJGS50nJN3RgF836zkyUfAD01Fq+sAk5EwWgLw+nnm5KVNGDu7PRR2kRGDsWvqpo0g==
|
||||
"@docusaurus/core@2.4.3", "@docusaurus/core@^2.4.3":
|
||||
version "2.4.3"
|
||||
resolved "https://registry.yarnpkg.com/@docusaurus/core/-/core-2.4.3.tgz#d86624901386fd8164ce4bff9cc7f16fde57f523"
|
||||
integrity sha512-dWH5P7cgeNSIg9ufReX6gaCl/TmrGKD38Orbwuz05WPhAQtFXHd5B8Qym1TiXfvUNvwoYKkAJOJuGe8ou0Z7PA==
|
||||
dependencies:
|
||||
"@babel/core" "^7.18.6"
|
||||
"@babel/generator" "^7.18.7"
|
||||
@@ -1265,13 +1265,13 @@
     "@babel/runtime" "^7.18.6"
     "@babel/runtime-corejs3" "^7.18.6"
     "@babel/traverse" "^7.18.8"
-    "@docusaurus/cssnano-preset" "2.4.1"
-    "@docusaurus/logger" "2.4.1"
-    "@docusaurus/mdx-loader" "2.4.1"
+    "@docusaurus/cssnano-preset" "2.4.3"
+    "@docusaurus/logger" "2.4.3"
+    "@docusaurus/mdx-loader" "2.4.3"
     "@docusaurus/react-loadable" "5.5.2"
-    "@docusaurus/utils" "2.4.1"
-    "@docusaurus/utils-common" "2.4.1"
-    "@docusaurus/utils-validation" "2.4.1"
+    "@docusaurus/utils" "2.4.3"
+    "@docusaurus/utils-common" "2.4.3"
+    "@docusaurus/utils-validation" "2.4.3"
     "@slorber/static-site-generator-webpack-plugin" "^4.0.7"
     "@svgr/webpack" "^6.2.1"
     autoprefixer "^10.4.7"
@@ -1327,33 +1327,33 @@
     webpack-merge "^5.8.0"
     webpackbar "^5.0.2"
 
-"@docusaurus/cssnano-preset@2.4.1":
-  version "2.4.1"
-  resolved "https://registry.yarnpkg.com/@docusaurus/cssnano-preset/-/cssnano-preset-2.4.1.tgz#eacadefb1e2e0f59df3467a0fe83e4ff79eed163"
-  integrity sha512-ka+vqXwtcW1NbXxWsh6yA1Ckii1klY9E53cJ4O9J09nkMBgrNX3iEFED1fWdv8wf4mJjvGi5RLZ2p9hJNjsLyQ==
+"@docusaurus/cssnano-preset@2.4.3":
+  version "2.4.3"
+  resolved "https://registry.yarnpkg.com/@docusaurus/cssnano-preset/-/cssnano-preset-2.4.3.tgz#1d7e833c41ce240fcc2812a2ac27f7b862f32de0"
+  integrity sha512-ZvGSRCi7z9wLnZrXNPG6DmVPHdKGd8dIn9pYbEOFiYihfv4uDR3UtxogmKf+rT8ZlKFf5Lqne8E8nt08zNM8CA==
   dependencies:
     cssnano-preset-advanced "^5.3.8"
     postcss "^8.4.14"
     postcss-sort-media-queries "^4.2.1"
     tslib "^2.4.0"
 
-"@docusaurus/logger@2.4.1":
-  version "2.4.1"
-  resolved "https://registry.yarnpkg.com/@docusaurus/logger/-/logger-2.4.1.tgz#4d2c0626b40752641f9fdd93ad9b5a7a0792f767"
-  integrity sha512-5h5ysIIWYIDHyTVd8BjheZmQZmEgWDR54aQ1BX9pjFfpyzFo5puKXKYrYJXbjEHGyVhEzmB9UXwbxGfaZhOjcg==
+"@docusaurus/logger@2.4.3":
+  version "2.4.3"
+  resolved "https://registry.yarnpkg.com/@docusaurus/logger/-/logger-2.4.3.tgz#518bbc965fb4ebe8f1d0b14e5f4161607552d34c"
+  integrity sha512-Zxws7r3yLufk9xM1zq9ged0YHs65mlRmtsobnFkdZTxWXdTYlWWLWdKyNKAsVC+D7zg+pv2fGbyabdOnyZOM3w==
   dependencies:
     chalk "^4.1.2"
     tslib "^2.4.0"
 
-"@docusaurus/mdx-loader@2.4.1":
-  version "2.4.1"
-  resolved "https://registry.yarnpkg.com/@docusaurus/mdx-loader/-/mdx-loader-2.4.1.tgz#6425075d7fc136dbfdc121349060cedd64118393"
-  integrity sha512-4KhUhEavteIAmbBj7LVFnrVYDiU51H5YWW1zY6SmBSte/YLhDutztLTBE0PQl1Grux1jzUJeaSvAzHpTn6JJDQ==
+"@docusaurus/mdx-loader@2.4.3":
+  version "2.4.3"
+  resolved "https://registry.yarnpkg.com/@docusaurus/mdx-loader/-/mdx-loader-2.4.3.tgz#e8ff37f30a060eaa97b8121c135f74cb531a4a3e"
+  integrity sha512-b1+fDnWtl3GiqkL0BRjYtc94FZrcDDBV1j8446+4tptB9BAOlePwG2p/pK6vGvfL53lkOsszXMghr2g67M0vCw==
   dependencies:
     "@babel/parser" "^7.18.8"
     "@babel/traverse" "^7.18.8"
-    "@docusaurus/logger" "2.4.1"
-    "@docusaurus/utils" "2.4.1"
+    "@docusaurus/logger" "2.4.3"
+    "@docusaurus/utils" "2.4.3"
     "@mdx-js/mdx" "^1.6.22"
     escape-html "^1.0.3"
     file-loader "^6.2.0"
@@ -1368,13 +1368,13 @@
     url-loader "^4.1.1"
     webpack "^5.73.0"
 
-"@docusaurus/module-type-aliases@2.4.1":
-  version "2.4.1"
-  resolved "https://registry.yarnpkg.com/@docusaurus/module-type-aliases/-/module-type-aliases-2.4.1.tgz#38b3c2d2ae44bea6d57506eccd84280216f0171c"
-  integrity sha512-gLBuIFM8Dp2XOCWffUDSjtxY7jQgKvYujt7Mx5s4FCTfoL5dN1EVbnrn+O2Wvh8b0a77D57qoIDY7ghgmatR1A==
+"@docusaurus/module-type-aliases@2.4.3", "@docusaurus/module-type-aliases@^2.4.3":
+  version "2.4.3"
+  resolved "https://registry.yarnpkg.com/@docusaurus/module-type-aliases/-/module-type-aliases-2.4.3.tgz#d08ef67e4151e02f352a2836bcf9ecde3b9c56ac"
+  integrity sha512-cwkBkt1UCiduuvEAo7XZY01dJfRn7UR/75mBgOdb1hKknhrabJZ8YH+7savd/y9kLExPyrhe0QwdS9GuzsRRIA==
   dependencies:
     "@docusaurus/react-loadable" "5.5.2"
-    "@docusaurus/types" "2.4.1"
+    "@docusaurus/types" "2.4.3"
     "@types/history" "^4.7.11"
     "@types/react" "*"
     "@types/react-router-config" "*"
@@ -1382,18 +1382,18 @@
     react-helmet-async "*"
     react-loadable "npm:@docusaurus/react-loadable@5.5.2"
 
-"@docusaurus/plugin-content-blog@2.4.1":
-  version "2.4.1"
-  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-content-blog/-/plugin-content-blog-2.4.1.tgz#c705a8b1a36a34f181dcf43b7770532e4dcdc4a3"
-  integrity sha512-E2i7Knz5YIbE1XELI6RlTnZnGgS52cUO4BlCiCUCvQHbR+s1xeIWz4C6BtaVnlug0Ccz7nFSksfwDpVlkujg5Q==
+"@docusaurus/plugin-content-blog@2.4.3":
+  version "2.4.3"
+  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-content-blog/-/plugin-content-blog-2.4.3.tgz#6473b974acab98e967414d8bbb0d37e0cedcea14"
+  integrity sha512-PVhypqaA0t98zVDpOeTqWUTvRqCEjJubtfFUQ7zJNYdbYTbS/E/ytq6zbLVsN/dImvemtO/5JQgjLxsh8XLo8Q==
   dependencies:
-    "@docusaurus/core" "2.4.1"
-    "@docusaurus/logger" "2.4.1"
-    "@docusaurus/mdx-loader" "2.4.1"
-    "@docusaurus/types" "2.4.1"
-    "@docusaurus/utils" "2.4.1"
-    "@docusaurus/utils-common" "2.4.1"
-    "@docusaurus/utils-validation" "2.4.1"
+    "@docusaurus/core" "2.4.3"
+    "@docusaurus/logger" "2.4.3"
+    "@docusaurus/mdx-loader" "2.4.3"
+    "@docusaurus/types" "2.4.3"
+    "@docusaurus/utils" "2.4.3"
+    "@docusaurus/utils-common" "2.4.3"
+    "@docusaurus/utils-validation" "2.4.3"
     cheerio "^1.0.0-rc.12"
     feed "^4.2.2"
     fs-extra "^10.1.0"
@@ -1404,18 +1404,18 @@
     utility-types "^3.10.0"
     webpack "^5.73.0"
 
-"@docusaurus/plugin-content-docs@2.4.1":
-  version "2.4.1"
-  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-content-docs/-/plugin-content-docs-2.4.1.tgz#ed94d9721b5ce7a956fb01cc06c40d8eee8dfca7"
-  integrity sha512-Lo7lSIcpswa2Kv4HEeUcGYqaasMUQNpjTXpV0N8G6jXgZaQurqp7E8NGYeGbDXnb48czmHWbzDL4S3+BbK0VzA==
+"@docusaurus/plugin-content-docs@2.4.3":
+  version "2.4.3"
+  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-content-docs/-/plugin-content-docs-2.4.3.tgz#aa224c0512351e81807adf778ca59fd9cd136973"
+  integrity sha512-N7Po2LSH6UejQhzTCsvuX5NOzlC+HiXOVvofnEPj0WhMu1etpLEXE6a4aTxrtg95lQ5kf0xUIdjX9sh3d3G76A==
   dependencies:
-    "@docusaurus/core" "2.4.1"
-    "@docusaurus/logger" "2.4.1"
-    "@docusaurus/mdx-loader" "2.4.1"
-    "@docusaurus/module-type-aliases" "2.4.1"
-    "@docusaurus/types" "2.4.1"
-    "@docusaurus/utils" "2.4.1"
-    "@docusaurus/utils-validation" "2.4.1"
+    "@docusaurus/core" "2.4.3"
+    "@docusaurus/logger" "2.4.3"
+    "@docusaurus/mdx-loader" "2.4.3"
+    "@docusaurus/module-type-aliases" "2.4.3"
+    "@docusaurus/types" "2.4.3"
+    "@docusaurus/utils" "2.4.3"
+    "@docusaurus/utils-validation" "2.4.3"
     "@types/react-router-config" "^5.0.6"
     combine-promises "^1.1.0"
     fs-extra "^10.1.0"
@@ -1426,95 +1426,95 @@
     utility-types "^3.10.0"
     webpack "^5.73.0"
 
-"@docusaurus/plugin-content-pages@2.4.1":
-  version "2.4.1"
-  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-content-pages/-/plugin-content-pages-2.4.1.tgz#c534f7e49967699a45bbe67050d1605ebbf3d285"
-  integrity sha512-/UjuH/76KLaUlL+o1OvyORynv6FURzjurSjvn2lbWTFc4tpYY2qLYTlKpTCBVPhlLUQsfyFnshEJDLmPneq2oA==
+"@docusaurus/plugin-content-pages@2.4.3":
+  version "2.4.3"
+  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-content-pages/-/plugin-content-pages-2.4.3.tgz#7f285e718b53da8c8d0101e70840c75b9c0a1ac0"
+  integrity sha512-txtDVz7y3zGk67q0HjG0gRttVPodkHqE0bpJ+7dOaTH40CQFLSh7+aBeGnPOTl+oCPG+hxkim4SndqPqXjQ8Bg==
   dependencies:
-    "@docusaurus/core" "2.4.1"
-    "@docusaurus/mdx-loader" "2.4.1"
-    "@docusaurus/types" "2.4.1"
-    "@docusaurus/utils" "2.4.1"
-    "@docusaurus/utils-validation" "2.4.1"
+    "@docusaurus/core" "2.4.3"
+    "@docusaurus/mdx-loader" "2.4.3"
+    "@docusaurus/types" "2.4.3"
+    "@docusaurus/utils" "2.4.3"
+    "@docusaurus/utils-validation" "2.4.3"
     fs-extra "^10.1.0"
     tslib "^2.4.0"
     webpack "^5.73.0"
 
-"@docusaurus/plugin-debug@2.4.1":
-  version "2.4.1"
-  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-debug/-/plugin-debug-2.4.1.tgz#461a2c77b0c5a91b2c05257c8f9585412aaa59dc"
-  integrity sha512-7Yu9UPzRShlrH/G8btOpR0e6INFZr0EegWplMjOqelIwAcx3PKyR8mgPTxGTxcqiYj6hxSCRN0D8R7YrzImwNA==
+"@docusaurus/plugin-debug@2.4.3":
+  version "2.4.3"
+  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-debug/-/plugin-debug-2.4.3.tgz#2f90eb0c9286a9f225444e3a88315676fe02c245"
+  integrity sha512-LkUbuq3zCmINlFb+gAd4ZvYr+bPAzMC0hwND4F7V9bZ852dCX8YoWyovVUBKq4er1XsOwSQaHmNGtObtn8Av8Q==
   dependencies:
-    "@docusaurus/core" "2.4.1"
-    "@docusaurus/types" "2.4.1"
-    "@docusaurus/utils" "2.4.1"
+    "@docusaurus/core" "2.4.3"
+    "@docusaurus/types" "2.4.3"
+    "@docusaurus/utils" "2.4.3"
     fs-extra "^10.1.0"
     react-json-view "^1.21.3"
     tslib "^2.4.0"
 
-"@docusaurus/plugin-google-analytics@2.4.1":
-  version "2.4.1"
-  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-google-analytics/-/plugin-google-analytics-2.4.1.tgz#30de1c35773bf9d52bb2d79b201b23eb98022613"
-  integrity sha512-dyZJdJiCoL+rcfnm0RPkLt/o732HvLiEwmtoNzOoz9MSZz117UH2J6U2vUDtzUzwtFLIf32KkeyzisbwUCgcaQ==
+"@docusaurus/plugin-google-analytics@2.4.3":
+  version "2.4.3"
+  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-google-analytics/-/plugin-google-analytics-2.4.3.tgz#0d19993136ade6f7a7741251b4f617400d92ab45"
+  integrity sha512-KzBV3k8lDkWOhg/oYGxlK5o9bOwX7KpPc/FTWoB+SfKhlHfhq7qcQdMi1elAaVEIop8tgK6gD1E58Q+XC6otSQ==
   dependencies:
-    "@docusaurus/core" "2.4.1"
-    "@docusaurus/types" "2.4.1"
-    "@docusaurus/utils-validation" "2.4.1"
+    "@docusaurus/core" "2.4.3"
+    "@docusaurus/types" "2.4.3"
+    "@docusaurus/utils-validation" "2.4.3"
     tslib "^2.4.0"
 
-"@docusaurus/plugin-google-gtag@2.4.1":
-  version "2.4.1"
-  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-google-gtag/-/plugin-google-gtag-2.4.1.tgz#6a3eb91022714735e625c7ca70ef5188fa7bd0dc"
-  integrity sha512-mKIefK+2kGTQBYvloNEKtDmnRD7bxHLsBcxgnbt4oZwzi2nxCGjPX6+9SQO2KCN5HZbNrYmGo5GJfMgoRvy6uA==
+"@docusaurus/plugin-google-gtag@2.4.3":
+  version "2.4.3"
+  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-google-gtag/-/plugin-google-gtag-2.4.3.tgz#e1a80b0696771b488562e5b60eff21c9932d9e1c"
+  integrity sha512-5FMg0rT7sDy4i9AGsvJC71MQrqQZwgLNdDetLEGDHLfSHLvJhQbTCUGbGXknUgWXQJckcV/AILYeJy+HhxeIFA==
   dependencies:
-    "@docusaurus/core" "2.4.1"
-    "@docusaurus/types" "2.4.1"
-    "@docusaurus/utils-validation" "2.4.1"
+    "@docusaurus/core" "2.4.3"
+    "@docusaurus/types" "2.4.3"
+    "@docusaurus/utils-validation" "2.4.3"
     tslib "^2.4.0"
 
-"@docusaurus/plugin-google-tag-manager@2.4.1":
-  version "2.4.1"
-  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-google-tag-manager/-/plugin-google-tag-manager-2.4.1.tgz#b99f71aec00b112bbf509ef2416e404a95eb607e"
-  integrity sha512-Zg4Ii9CMOLfpeV2nG74lVTWNtisFaH9QNtEw48R5QE1KIwDBdTVaiSA18G1EujZjrzJJzXN79VhINSbOJO/r3g==
+"@docusaurus/plugin-google-tag-manager@2.4.3":
+  version "2.4.3"
+  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-google-tag-manager/-/plugin-google-tag-manager-2.4.3.tgz#e41fbf79b0ffc2de1cc4013eb77798cff0ad98e3"
+  integrity sha512-1jTzp71yDGuQiX9Bi0pVp3alArV0LSnHXempvQTxwCGAEzUWWaBg4d8pocAlTpbP9aULQQqhgzrs8hgTRPOM0A==
   dependencies:
-    "@docusaurus/core" "2.4.1"
-    "@docusaurus/types" "2.4.1"
-    "@docusaurus/utils-validation" "2.4.1"
+    "@docusaurus/core" "2.4.3"
+    "@docusaurus/types" "2.4.3"
+    "@docusaurus/utils-validation" "2.4.3"
     tslib "^2.4.0"
 
-"@docusaurus/plugin-sitemap@2.4.1":
-  version "2.4.1"
-  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-sitemap/-/plugin-sitemap-2.4.1.tgz#8a7a76ed69dc3e6b4474b6abb10bb03336a9de6d"
-  integrity sha512-lZx+ijt/+atQ3FVE8FOHV/+X3kuok688OydDXrqKRJyXBJZKgGjA2Qa8RjQ4f27V2woaXhtnyrdPop/+OjVMRg==
+"@docusaurus/plugin-sitemap@2.4.3":
+  version "2.4.3"
+  resolved "https://registry.yarnpkg.com/@docusaurus/plugin-sitemap/-/plugin-sitemap-2.4.3.tgz#1b3930900a8f89670ce7e8f83fb4730cd3298c32"
+  integrity sha512-LRQYrK1oH1rNfr4YvWBmRzTL0LN9UAPxBbghgeFRBm5yloF6P+zv1tm2pe2hQTX/QP5bSKdnajCvfnScgKXMZQ==
   dependencies:
-    "@docusaurus/core" "2.4.1"
-    "@docusaurus/logger" "2.4.1"
-    "@docusaurus/types" "2.4.1"
-    "@docusaurus/utils" "2.4.1"
-    "@docusaurus/utils-common" "2.4.1"
-    "@docusaurus/utils-validation" "2.4.1"
+    "@docusaurus/core" "2.4.3"
+    "@docusaurus/logger" "2.4.3"
+    "@docusaurus/types" "2.4.3"
+    "@docusaurus/utils" "2.4.3"
+    "@docusaurus/utils-common" "2.4.3"
+    "@docusaurus/utils-validation" "2.4.3"
     fs-extra "^10.1.0"
     sitemap "^7.1.1"
     tslib "^2.4.0"
 
-"@docusaurus/preset-classic@2.4.1":
-  version "2.4.1"
-  resolved "https://registry.yarnpkg.com/@docusaurus/preset-classic/-/preset-classic-2.4.1.tgz#072f22d0332588e9c5f512d4bded8d7c99f91497"
-  integrity sha512-P4//+I4zDqQJ+UDgoFrjIFaQ1MeS9UD1cvxVQaI6O7iBmiHQm0MGROP1TbE7HlxlDPXFJjZUK3x3cAoK63smGQ==
+"@docusaurus/preset-classic@^2.4.3":
+  version "2.4.3"
+  resolved "https://registry.yarnpkg.com/@docusaurus/preset-classic/-/preset-classic-2.4.3.tgz#074c57ebf29fa43d23bd1c8ce691226f542bc262"
+  integrity sha512-tRyMliepY11Ym6hB1rAFSNGwQDpmszvWYJvlK1E+md4SW8i6ylNHtpZjaYFff9Mdk3i/Pg8ItQq9P0daOJAvQw==
   dependencies:
-    "@docusaurus/core" "2.4.1"
-    "@docusaurus/plugin-content-blog" "2.4.1"
-    "@docusaurus/plugin-content-docs" "2.4.1"
-    "@docusaurus/plugin-content-pages" "2.4.1"
-    "@docusaurus/plugin-debug" "2.4.1"
-    "@docusaurus/plugin-google-analytics" "2.4.1"
-    "@docusaurus/plugin-google-gtag" "2.4.1"
-    "@docusaurus/plugin-google-tag-manager" "2.4.1"
-    "@docusaurus/plugin-sitemap" "2.4.1"
-    "@docusaurus/theme-classic" "2.4.1"
-    "@docusaurus/theme-common" "2.4.1"
-    "@docusaurus/theme-search-algolia" "2.4.1"
-    "@docusaurus/types" "2.4.1"
+    "@docusaurus/core" "2.4.3"
+    "@docusaurus/plugin-content-blog" "2.4.3"
+    "@docusaurus/plugin-content-docs" "2.4.3"
+    "@docusaurus/plugin-content-pages" "2.4.3"
+    "@docusaurus/plugin-debug" "2.4.3"
+    "@docusaurus/plugin-google-analytics" "2.4.3"
+    "@docusaurus/plugin-google-gtag" "2.4.3"
+    "@docusaurus/plugin-google-tag-manager" "2.4.3"
+    "@docusaurus/plugin-sitemap" "2.4.3"
+    "@docusaurus/theme-classic" "2.4.3"
+    "@docusaurus/theme-common" "2.4.3"
+    "@docusaurus/theme-search-algolia" "2.4.3"
+    "@docusaurus/types" "2.4.3"
 
 "@docusaurus/react-loadable@5.5.2", "react-loadable@npm:@docusaurus/react-loadable@5.5.2":
   version "5.5.2"
@@ -1524,23 +1524,23 @@
     "@types/react" "*"
     prop-types "^15.6.2"
 
-"@docusaurus/theme-classic@2.4.1":
-  version "2.4.1"
-  resolved "https://registry.yarnpkg.com/@docusaurus/theme-classic/-/theme-classic-2.4.1.tgz#0060cb263c1a73a33ac33f79bb6bc2a12a56ad9e"
-  integrity sha512-Rz0wKUa+LTW1PLXmwnf8mn85EBzaGSt6qamqtmnh9Hflkc+EqiYMhtUJeLdV+wsgYq4aG0ANc+bpUDpsUhdnwg==
+"@docusaurus/theme-classic@2.4.3":
+  version "2.4.3"
+  resolved "https://registry.yarnpkg.com/@docusaurus/theme-classic/-/theme-classic-2.4.3.tgz#29360f2eb03a0e1686eb19668633ef313970ee8f"
+  integrity sha512-QKRAJPSGPfDY2yCiPMIVyr+MqwZCIV2lxNzqbyUW0YkrlmdzzP3WuQJPMGLCjWgQp/5c9kpWMvMxjhpZx1R32Q==
   dependencies:
-    "@docusaurus/core" "2.4.1"
-    "@docusaurus/mdx-loader" "2.4.1"
-    "@docusaurus/module-type-aliases" "2.4.1"
-    "@docusaurus/plugin-content-blog" "2.4.1"
-    "@docusaurus/plugin-content-docs" "2.4.1"
-    "@docusaurus/plugin-content-pages" "2.4.1"
-    "@docusaurus/theme-common" "2.4.1"
-    "@docusaurus/theme-translations" "2.4.1"
-    "@docusaurus/types" "2.4.1"
-    "@docusaurus/utils" "2.4.1"
-    "@docusaurus/utils-common" "2.4.1"
-    "@docusaurus/utils-validation" "2.4.1"
+    "@docusaurus/core" "2.4.3"
+    "@docusaurus/mdx-loader" "2.4.3"
+    "@docusaurus/module-type-aliases" "2.4.3"
+    "@docusaurus/plugin-content-blog" "2.4.3"
+    "@docusaurus/plugin-content-docs" "2.4.3"
+    "@docusaurus/plugin-content-pages" "2.4.3"
+    "@docusaurus/theme-common" "2.4.3"
+    "@docusaurus/theme-translations" "2.4.3"
+    "@docusaurus/types" "2.4.3"
+    "@docusaurus/utils" "2.4.3"
+    "@docusaurus/utils-common" "2.4.3"
+    "@docusaurus/utils-validation" "2.4.3"
     "@mdx-js/react" "^1.6.22"
     clsx "^1.2.1"
     copy-text-to-clipboard "^3.0.1"
@@ -1555,18 +1555,18 @@
     tslib "^2.4.0"
     utility-types "^3.10.0"
 
-"@docusaurus/theme-common@2.4.1":
-  version "2.4.1"
-  resolved "https://registry.yarnpkg.com/@docusaurus/theme-common/-/theme-common-2.4.1.tgz#03e16f7aa96455e952f3243ac99757b01a3c83d4"
-  integrity sha512-G7Zau1W5rQTaFFB3x3soQoZpkgMbl/SYNG8PfMFIjKa3M3q8n0m/GRf5/H/e5BqOvt8c+ZWIXGCiz+kUCSHovA==
+"@docusaurus/theme-common@2.4.3":
+  version "2.4.3"
+  resolved "https://registry.yarnpkg.com/@docusaurus/theme-common/-/theme-common-2.4.3.tgz#bb31d70b6b67d0bdef9baa343192dcec49946a2e"
+  integrity sha512-7KaDJBXKBVGXw5WOVt84FtN8czGWhM0lbyWEZXGp8AFfL6sZQfRTluFp4QriR97qwzSyOfQb+nzcDZZU4tezUw==
   dependencies:
-    "@docusaurus/mdx-loader" "2.4.1"
-    "@docusaurus/module-type-aliases" "2.4.1"
-    "@docusaurus/plugin-content-blog" "2.4.1"
-    "@docusaurus/plugin-content-docs" "2.4.1"
-    "@docusaurus/plugin-content-pages" "2.4.1"
-    "@docusaurus/utils" "2.4.1"
-    "@docusaurus/utils-common" "2.4.1"
+    "@docusaurus/mdx-loader" "2.4.3"
+    "@docusaurus/module-type-aliases" "2.4.3"
+    "@docusaurus/plugin-content-blog" "2.4.3"
+    "@docusaurus/plugin-content-docs" "2.4.3"
+    "@docusaurus/plugin-content-pages" "2.4.3"
+    "@docusaurus/utils" "2.4.3"
+    "@docusaurus/utils-common" "2.4.3"
     "@types/history" "^4.7.11"
     "@types/react" "*"
     "@types/react-router-config" "*"
@@ -1577,34 +1577,34 @@
     use-sync-external-store "^1.2.0"
     utility-types "^3.10.0"
 
-"@docusaurus/theme-live-codeblock@^2.4.1":
-  version "2.4.1"
-  resolved "https://registry.yarnpkg.com/@docusaurus/theme-live-codeblock/-/theme-live-codeblock-2.4.1.tgz#214168e29041efc1eed1540f4989dd2efcfe6658"
-  integrity sha512-KBKrm34kcdNbSeEm6RujN5GWWg4F2dmAYZyHMMQM8FXokx8mNShRx6uq17WXi23JNm7niyMhNOBRfZWay+5Hkg==
+"@docusaurus/theme-live-codeblock@^2.4.3":
+  version "2.4.3"
+  resolved "https://registry.yarnpkg.com/@docusaurus/theme-live-codeblock/-/theme-live-codeblock-2.4.3.tgz#889eb4e740d2e9f2dc5516f9407f1bc147887387"
+  integrity sha512-wx+iJCCoSewUkMzFy7pnbhDBCRcJRTLkpx1/zwnHhfiNWVvJ2XjtBKIviRyMhynZYyvO4sLTpCclzK8JOctkxw==
   dependencies:
-    "@docusaurus/core" "2.4.1"
-    "@docusaurus/theme-common" "2.4.1"
-    "@docusaurus/theme-translations" "2.4.1"
-    "@docusaurus/utils-validation" "2.4.1"
+    "@docusaurus/core" "2.4.3"
+    "@docusaurus/theme-common" "2.4.3"
+    "@docusaurus/theme-translations" "2.4.3"
+    "@docusaurus/utils-validation" "2.4.3"
     "@philpl/buble" "^0.19.7"
     clsx "^1.2.1"
     fs-extra "^10.1.0"
     react-live "2.2.3"
     tslib "^2.4.0"
 
-"@docusaurus/theme-search-algolia@2.4.1":
-  version "2.4.1"
-  resolved "https://registry.yarnpkg.com/@docusaurus/theme-search-algolia/-/theme-search-algolia-2.4.1.tgz#906bd2cca3fced0241985ef502c892f58ff380fc"
-  integrity sha512-6BcqW2lnLhZCXuMAvPRezFs1DpmEKzXFKlYjruuas+Xy3AQeFzDJKTJFIm49N77WFCTyxff8d3E4Q9pi/+5McQ==
+"@docusaurus/theme-search-algolia@2.4.3":
+  version "2.4.3"
+  resolved "https://registry.yarnpkg.com/@docusaurus/theme-search-algolia/-/theme-search-algolia-2.4.3.tgz#32d4cbefc3deba4112068fbdb0bde11ac51ece53"
+  integrity sha512-jziq4f6YVUB5hZOB85ELATwnxBz/RmSLD3ksGQOLDPKVzat4pmI8tddNWtriPpxR04BNT+ZfpPUMFkNFetSW1Q==
   dependencies:
     "@docsearch/react" "^3.1.1"
-    "@docusaurus/core" "2.4.1"
-    "@docusaurus/logger" "2.4.1"
-    "@docusaurus/plugin-content-docs" "2.4.1"
-    "@docusaurus/theme-common" "2.4.1"
-    "@docusaurus/theme-translations" "2.4.1"
-    "@docusaurus/utils" "2.4.1"
-    "@docusaurus/utils-validation" "2.4.1"
+    "@docusaurus/core" "2.4.3"
+    "@docusaurus/logger" "2.4.3"
+    "@docusaurus/plugin-content-docs" "2.4.3"
+    "@docusaurus/theme-common" "2.4.3"
+    "@docusaurus/theme-translations" "2.4.3"
+    "@docusaurus/utils" "2.4.3"
+    "@docusaurus/utils-validation" "2.4.3"
     algoliasearch "^4.13.1"
     algoliasearch-helper "^3.10.0"
     clsx "^1.2.1"
@@ -1614,18 +1614,18 @@
     tslib "^2.4.0"
     utility-types "^3.10.0"
 
-"@docusaurus/theme-translations@2.4.1":
-  version "2.4.1"
-  resolved "https://registry.yarnpkg.com/@docusaurus/theme-translations/-/theme-translations-2.4.1.tgz#4d49df5865dae9ef4b98a19284ede62ae6f98726"
-  integrity sha512-T1RAGP+f86CA1kfE8ejZ3T3pUU3XcyvrGMfC/zxCtc2BsnoexuNI9Vk2CmuKCb+Tacvhxjv5unhxXce0+NKyvA==
+"@docusaurus/theme-translations@2.4.3":
+  version "2.4.3"
+  resolved "https://registry.yarnpkg.com/@docusaurus/theme-translations/-/theme-translations-2.4.3.tgz#91ac73fc49b8c652b7a54e88b679af57d6ac6102"
+  integrity sha512-H4D+lbZbjbKNS/Zw1Lel64PioUAIT3cLYYJLUf3KkuO/oc9e0QCVhIYVtUI2SfBCF2NNdlyhBDQEEMygsCedIg==
   dependencies:
     fs-extra "^10.1.0"
     tslib "^2.4.0"
 
-"@docusaurus/types@2.4.1":
-  version "2.4.1"
-  resolved "https://registry.yarnpkg.com/@docusaurus/types/-/types-2.4.1.tgz#d8e82f9e0f704984f98df1f93d6b4554d5458705"
-  integrity sha512-0R+cbhpMkhbRXX138UOc/2XZFF8hiZa6ooZAEEJFp5scytzCw4tC1gChMFXrpa3d2tYE6AX8IrOEpSonLmfQuQ==
+"@docusaurus/types@2.4.3":
+  version "2.4.3"
+  resolved "https://registry.yarnpkg.com/@docusaurus/types/-/types-2.4.3.tgz#4aead281ca09f721b3c0a9b926818450cfa3db31"
+  integrity sha512-W6zNLGQqfrp/EoPD0bhb9n7OobP+RHpmvVzpA+Z/IuU3Q63njJM24hmT0GYboovWcDtFmnIJC9wcyx4RVPQscw==
   dependencies:
     "@types/history" "^4.7.11"
     "@types/react" "*"
@@ -1636,30 +1636,30 @@
     webpack "^5.73.0"
     webpack-merge "^5.8.0"
 
-"@docusaurus/utils-common@2.4.1":
-  version "2.4.1"
-  resolved "https://registry.yarnpkg.com/@docusaurus/utils-common/-/utils-common-2.4.1.tgz#7f72e873e49bd5179588869cc3ab7449a56aae63"
-  integrity sha512-bCVGdZU+z/qVcIiEQdyx0K13OC5mYwxhSuDUR95oFbKVuXYRrTVrwZIqQljuo1fyJvFTKHiL9L9skQOPokuFNQ==
+"@docusaurus/utils-common@2.4.3":
+  version "2.4.3"
+  resolved "https://registry.yarnpkg.com/@docusaurus/utils-common/-/utils-common-2.4.3.tgz#30656c39ef1ce7e002af7ba39ea08330f58efcfb"
+  integrity sha512-/jascp4GbLQCPVmcGkPzEQjNaAk3ADVfMtudk49Ggb+131B1WDD6HqlSmDf8MxGdy7Dja2gc+StHf01kiWoTDQ==
   dependencies:
     tslib "^2.4.0"
 
-"@docusaurus/utils-validation@2.4.1":
-  version "2.4.1"
-  resolved "https://registry.yarnpkg.com/@docusaurus/utils-validation/-/utils-validation-2.4.1.tgz#19959856d4a886af0c5cfb357f4ef68b51151244"
-  integrity sha512-unII3hlJlDwZ3w8U+pMO3Lx3RhI4YEbY3YNsQj4yzrkZzlpqZOLuAiZK2JyULnD+TKbceKU0WyWkQXtYbLNDFA==
+"@docusaurus/utils-validation@2.4.3":
+  version "2.4.3"
+  resolved "https://registry.yarnpkg.com/@docusaurus/utils-validation/-/utils-validation-2.4.3.tgz#8122c394feef3e96c73f6433987837ec206a63fb"
+  integrity sha512-G2+Vt3WR5E/9drAobP+hhZQMaswRwDlp6qOMi7o7ZypB+VO7N//DZWhZEwhcRGepMDJGQEwtPv7UxtYwPL9PBw==
   dependencies:
-    "@docusaurus/logger" "2.4.1"
-    "@docusaurus/utils" "2.4.1"
+    "@docusaurus/logger" "2.4.3"
+    "@docusaurus/utils" "2.4.3"
     joi "^17.6.0"
     js-yaml "^4.1.0"
     tslib "^2.4.0"
 
-"@docusaurus/utils@2.4.1":
-  version "2.4.1"
-  resolved "https://registry.yarnpkg.com/@docusaurus/utils/-/utils-2.4.1.tgz#9c5f76eae37b71f3819c1c1f0e26e6807c99a4fc"
-  integrity sha512-1lvEZdAQhKNht9aPXPoh69eeKnV0/62ROhQeFKKxmzd0zkcuE/Oc5Gpnt00y/f5bIsmOsYMY7Pqfm/5rteT5GA==
+"@docusaurus/utils@2.4.3":
+  version "2.4.3"
+  resolved "https://registry.yarnpkg.com/@docusaurus/utils/-/utils-2.4.3.tgz#52b000d989380a2125831b84e3a7327bef471e89"
+  integrity sha512-fKcXsjrD86Smxv8Pt0TBFqYieZZCPh4cbf9oszUq/AMhZn3ujwpKaVYZACPX8mmjtYx0JOgNx52CREBfiGQB4A==
   dependencies:
-    "@docusaurus/logger" "2.4.1"
+    "@docusaurus/logger" "2.4.3"
     "@svgr/webpack" "^6.2.1"
     escape-string-regexp "^4.0.0"
     file-loader "^6.2.0"
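The `yarn.lock` churn above is mechanical: every `@docusaurus/*` entry moves from 2.4.1 to 2.4.3, with `resolved` URLs and `integrity` hashes updated to match. A quick way to audit a lockfile diff like this is to extract the `version` field under each package key and compare the two sides. The sketch below is a hypothetical helper (not part of this change) that does exactly that for two yarn v1 lockfile fragments; the `old`/`new` sample strings are taken from the first hunk of the diff.

```python
import re

def lock_versions(lock_text: str) -> dict:
    """Map each top-level package key to the version declared beneath it."""
    versions = {}
    current = None
    for line in lock_text.splitlines():
        # A yarn v1 entry key starts at column 0 and ends with ':';
        # multiple ranges may share one entry, separated by commas.
        if line and not line.startswith(" ") and line.rstrip().endswith(":"):
            current = line.rstrip().rstrip(":").split(",")[0].strip().strip('"')
        m = re.match(r'\s+version\s+"([^"]+)"', line)
        if m and current:
            versions[current] = m.group(1)
    return versions

def bumps(old_text: str, new_text: str) -> dict:
    """Report packages whose resolved version changed between two fragments."""
    # Strip the range after the last '@' so keys compare by bare package name.
    strip = lambda k: k.rsplit("@", 1)[0]
    old = {strip(k): v for k, v in lock_versions(old_text).items()}
    new = {strip(k): v for k, v in lock_versions(new_text).items()}
    return {p: (old[p], new[p]) for p in old if p in new and old[p] != new[p]}

old = '''"@docusaurus/core@2.4.1":
  version "2.4.1"
'''
new = '''"@docusaurus/core@2.4.3", "@docusaurus/core@^2.4.3":
  version "2.4.3"
'''

print(bumps(old, new))  # {'@docusaurus/core': ('2.4.1', '2.4.3')}
```

Run over the full before/after lockfiles, a helper like this would list all the `@docusaurus/*` packages bumped in this commit in one pass, rather than reading 368 changed lines by hand.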