🚧 Cortex.cpp is currently under development. Our documentation outlines the intended behavior of Cortex, which may not yet be fully implemented in the codebase.

# `cortex engines init`

This command sets up and downloads the required dependencies to run the available engines within Cortex. Currently, Cortex supports three engines:

- Llama.cpp
- ONNX
- TensorRT-LLM

## Usage


```bash
cortex engines init [options] <name>
```

For example:


```bash
## Llama.cpp engine
cortex engines init llamacpp
## ONNX engine
cortex engines init onnx
## TensorRT-LLM engine
cortex engines init tensorrt-llm
```

## Options

| Option         | Description                             | Required | Default value | Example |
|----------------|-----------------------------------------|----------|---------------|---------|
| `name`         | The name of the engine you want to run. | Yes      | -             | -       |
| `-h`, `--help` | Display help for command.               | No       | -             | `-h`    |
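
As a quick check, assuming the Cortex CLI is installed and available on your `PATH`, you can print the subcommand's help text to review the options listed above before initializing an engine:

```bash
# Print usage and available options for the init subcommand
cortex engines init -h
```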