🚧 Cortex.cpp is currently under development. Our documentation outlines the intended behavior of Cortex, which may not yet be fully implemented in the codebase.
cortex engines
This command allows you to manage various engines available within Cortex.
Usage:
cortex engines <command|parameter> [options] [subcommand]
Options:
| Option | Description | Required | Default value | Example |
|---|---|---|---|---|
| `-h`, `--help` | Display help information for the command. | No | - | `-h` |
cortex engines get
This CLI command calls the corresponding API endpoint.
This command returns the details of the engine specified by `engine_name`.
Usage:
cortex engines get <engine_name>
For example, it returns the following:
┌─────────────┬────────────────────────────────────────────────────────────────────────────┐
│ (index)     │ Values                                                                     │
├─────────────┼────────────────────────────────────────────────────────────────────────────┤
│ name        │ 'onnx'                                                                     │
│ description │ 'This extension enables chat completion API calls using the Cortex engine' │
│ version     │ '0.0.1'                                                                    │
│ productName │ 'Cortex Inference Engine'                                                  │
└─────────────┴────────────────────────────────────────────────────────────────────────────┘
To get an engine name, run the `engines list` command first.
Options:
| Option | Description | Required | Default value | Example |
|---|---|---|---|---|
| `engine_name` | The name of the engine that you want to retrieve. | Yes | - | `llamacpp` |
| `-h`, `--help` | Display help information for the command. | No | - | `-h` |
cortex engines list
This CLI command calls the corresponding API endpoint.
This command lists all of Cortex's engines.
Usage:
cortex engines list [options]
For example, it returns the following:
+---------+--------------+-------------------------------------------------------------------------------+---------+------------------------------+-----------------+
| (Index) | name         | description                                                                   | version | product name                 | status          |
+---------+--------------+-------------------------------------------------------------------------------+---------+------------------------------+-----------------+
| 1       | onnx         | This extension enables chat completion API calls using the Onnx engine        | 0.0.1   | Onnx Inference Engine        | not_initialized |
+---------+--------------+-------------------------------------------------------------------------------+---------+------------------------------+-----------------+
| 2       | llamacpp     | This extension enables chat completion API calls using the LlamaCPP engine    | 0.0.1   | LlamaCPP Inference Engine    | ready           |
+---------+--------------+-------------------------------------------------------------------------------+---------+------------------------------+-----------------+
| 3       | tensorrt-llm | This extension enables chat completion API calls using the TensorrtLLM engine | 0.0.1   | TensorrtLLM Inference Engine | not_initialized |
+---------+--------------+-------------------------------------------------------------------------------+---------+------------------------------+-----------------+
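If you want to post-process this tabular output in a script, a minimal sketch is shown below. It extracts each engine's name and status from rows shaped like the table above. The rows are hard-coded sample data (an assumption, since the exact output format may change while Cortex is under development), so the snippet runs without Cortex installed.

```shell
#!/bin/sh
# Sample rows mimicking `cortex engines list` output (hard-coded here
# so this sketch runs without Cortex installed; descriptions shortened).
rows='| 1 | onnx | Onnx engine | 0.0.1 | Onnx Inference Engine | not_initialized |
| 2 | llamacpp | LlamaCPP engine | 0.0.1 | LlamaCPP Inference Engine | ready |'

# Split each row on "|": field 3 is the engine name, field 7 its status.
# Trim surrounding spaces and print "name: status" per engine.
summary="$(printf '%s\n' "$rows" | awk -F'|' '{
  name = $3; status = $7
  gsub(/^ +| +$/, "", name); gsub(/^ +| +$/, "", status)
  print name ": " status
}')"
printf '%s\n' "$summary"
```

Running the sketch prints one `name: status` line per engine, which makes it easy to, say, grep for engines still in the `not_initialized` state.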
Options:
| Option | Description | Required | Default value | Example |
|---|---|---|---|---|
| `-h`, `--help` | Display help information for the command. | No | - | `-h` |
cortex engines install
This CLI command calls the corresponding API endpoint.
This command downloads the required dependencies and installs the engine within Cortex. Currently, Cortex supports three engines:
- Llama.cpp
- ONNX
- TensorRT-LLM
Usage:
cortex engines install [options] <engine_name>
For example:
## Llama.cpp engine
cortex engines install llamacpp

## ONNX engine
cortex engines install onnx

## Tensorrt-LLM engine
cortex engines install tensorrt-llm
Options:
| Option | Description | Required | Default value | Example |
|---|---|---|---|---|
| `engine_name` | The name of the engine you want to install. | Yes | - | `llamacpp` |
| `-h`, `--help` | Display help information for the command. | No | - | `-h` |
cortex engines uninstall
This command uninstalls the engine within Cortex.
Usage:
cortex engines uninstall [options] <engine_name>
For example:
## Llama.cpp engine
cortex engines uninstall llamacpp

## ONNX engine
cortex engines uninstall onnx

## Tensorrt-LLM engine
cortex engines uninstall tensorrt-llm
Options:
| Option | Description | Required | Default value | Example |
|---|---|---|---|---|
| `engine_name` | The name of the engine you want to uninstall. | Yes | - | `llamacpp` |
| `-h`, `--help` | Display help information for the command. | No | - | `-h` |