.cortexrc
Cortex supports a config-based approach to configuring most of its functionality. During installation, a .cortexrc file is generated with sensible defaults. Using this file, you can change the location and name of the data directory, the Cortex API server host and port, and more.
File Location
The configuration file is stored in the following locations:
- Windows: C:\Users\<username>\.cortexrc
- Linux: /home/<username>/.cortexrc
- macOS: /Users/<username>/.cortexrc
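All three locations resolve to the user's home directory, so a script can find the file the same way on every platform. A minimal sketch in Python (the home-directory lookup simply mirrors the paths above; nothing else is assumed):

```python
# Minimal sketch: resolve the .cortexrc path on any supported OS.
# Path.home() expands to C:\Users\<username>, /home/<username>, or
# /Users/<username>, matching the locations listed above.
from pathlib import Path

cortexrc = Path.home() / ".cortexrc"

if cortexrc.exists():
    print(f"Found Cortex config at {cortexrc}")
else:
    print(".cortexrc not found; it is normally generated during installation.")
```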
Configuration Parameters
You can configure the following parameters in the .cortexrc file:
Parameter | Description | Default Value |
---|---|---|
dataFolderPath | Path to the folder where Cortex stores its data. | User's home folder. |
apiServerHost | Host address for the Cortex.cpp API server. | 127.0.0.1 |
apiServerPort | Port number for the Cortex.cpp API server. | 39281 |
logFolderPath | Path to the folder where log files are stored. | User's home folder. |
logLlamaCppPath | The llama-cpp engine log file path. | ./logs/cortex.log |
logOnnxPath | The onnxruntime engine log file path. | ./logs/cortex.log |
maxLogLines | The maximum number of log lines written to the log file. | 100000 |
checkedForUpdateAt | The timestamp of the last check for updates. | 0 |
latestRelease | The latest release version. | Empty string |
huggingFaceToken | HuggingFace token. | Empty string |
In the future, every parameter will be editable from the Cortex CLI. At present, only a select few are configurable.
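Because the file uses plain YAML key: value syntax (see the example below), other tools can read it directly, for instance to discover where the API server is listening. A hedged sketch in Python, assuming PyYAML is installed and using the key names from the table above:

```python
# Sketch: read .cortexrc and derive the API server address.
# Assumes PyYAML (pip install pyyaml); apiServerHost and apiServerPort
# are the parameters documented in the table above.
from pathlib import Path
import yaml

with open(Path.home() / ".cortexrc") as f:
    config = yaml.safe_load(f)

host = config.get("apiServerHost", "127.0.0.1")
port = config.get("apiServerPort", 39281)
print(f"Cortex API server expected at http://{host}:{port}")
```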
Example of the .cortexrc file:
```yaml
logFolderPath: /home/<user>/cortexcpp
logLlamaCppPath: ./logs/cortex.log
logTensorrtLLMPath: ./logs/cortex.log
logOnnxPath: ./logs/cortex.log
dataFolderPath: /home/<user>/cortexcpp
maxLogLines: 100000
apiServerHost: 127.0.0.1
apiServerPort: 39281
checkedForUpdateAt: 1737636738
checkedForLlamacppUpdateAt: 1737636592699
latestRelease: v1.0.8
latestLlamacppRelease: v0.1.49
huggingFaceToken: ""
gitHubUserAgent: ""
gitHubToken: ""
llamacppVariant: linux-amd64-avx2-cuda-12-0
llamacppVersion: v0.1.49
enableCors: true
allowedOrigins:
  - http://localhost:39281
  - http://127.0.0.1:39281
  - http://0.0.0.0:39281
proxyUrl: ""
verifyProxySsl: true
verifyProxyHostSsl: true
proxyUsername: ""
proxyPassword: ""
noProxy: example.com,::1,localhost,127.0.0.1
verifyPeerSsl: true
verifyHostSsl: true
sslCertPath: ""
sslKeyPath: ""
supportedEngines:
  - llama-cpp
  - onnxruntime
  - tensorrt-llm
```
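A setting that is not yet exposed through the CLI can be changed by editing the file by hand or by scripting the edit. A sketch under the same PyYAML assumption; the new port value is only an illustration, and it is assumed that Cortex has to be restarted before the change takes effect:

```python
# Sketch: change one parameter (apiServerPort) and write the file back.
# Assumes PyYAML. The dumper rewrites the whole file; sort_keys=False
# keeps the original key order, but any comments would be lost.
# Restart Cortex afterwards (assumption: .cortexrc is read at startup).
from pathlib import Path
import yaml

cortexrc = Path.home() / ".cortexrc"

with open(cortexrc) as f:
    config = yaml.safe_load(f) or {}

config["apiServerPort"] = 39282  # hypothetical new port

with open(cortexrc, "w") as f:
    yaml.safe_dump(config, f, default_flow_style=False, sort_keys=False)
```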