:::warning
🚧 Cortex.cpp is currently under development. Our documentation outlines the intended behavior of Cortex, which may not yet be fully implemented in the codebase.
:::
# `cortex chat`
:::info
This CLI command calls a corresponding Cortex API endpoint.
:::
This command starts an interactive chat session with a specified model, allowing you to interact with it directly from the terminal.
## Usage
:::info
You can use the `--verbose` flag to display more detailed output of the internal processes. To apply this flag, use the following format: `cortex --verbose [subcommand]`.
:::
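For example, running the `chat` subcommand with verbose output might look like this (a hedged sketch, assuming the stable `cortex` build and a model named `mistral` that you have already downloaded):

```shell
# Print detailed internal process output while starting a chat session
cortex --verbose chat mistral -m "Hello, model!"
```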
**macOS/Linux:**

```sh
# Stable
cortex chat [options] <model_id> -m <message>

# Beta
cortex-beta chat [options] <model_id> -m <message>

# Nightly
cortex-nightly chat [options] <model_id> -m <message>
```

**Windows:**

```sh
# Stable
cortex.exe chat [options] <model_id> -m <message>

# Beta
cortex-beta.exe chat [options] <model_id> -m <message>

# Nightly
cortex-nightly.exe chat [options] <model_id> -m <message>
```
:::info
This command uses a `model_id` from a model that you have downloaded or that is available in your file system.
:::
## Options
| Option | Description | Required | Default value | Example |
|---|---|---|---|---|
| `model_id` | Model ID to chat with. | Yes | - | `mistral` |
| `-m`, `--message <message>` | Message to send to the model. | Yes | - | `-m "Hello, model!"` |
| `-h`, `--help` | Display help information for the command. | No | - | `-h` |
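Putting the options together, a minimal invocation might look like this (an illustrative sketch, assuming a model named `mistral` is already available locally):

```shell
# Start a chat session with the mistral model and send an initial message
cortex chat mistral -m "Hello, model!"
```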