- Change `ModelMapping` column type from varchar(1024) to TEXT in channels table
- Add MySQL migration script to alter column type during database initialization
- Improve database schema flexibility for storing complex model mappings
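A minimal sketch of what the widened column and a MySQL-only migration could look like with GORM; the struct fragment, function name, and dialect check are illustrative, not the project's exact code.

```go
package model

import "gorm.io/gorm"

// Illustrative fragment of the Channel model; only ModelMapping is relevant here.
type Channel struct {
	Id           int     `json:"id"`
	ModelMapping *string `json:"model_mapping" gorm:"type:text"` // previously varchar(1024)
}

// migrateModelMappingColumn widens the column on MySQL databases created
// before the gorm tag changed to TEXT. Hypothetical helper for illustration.
func migrateModelMappingColumn(db *gorm.DB) error {
	if db.Dialector.Name() != "mysql" {
		return nil // other dialects are assumed to be covered by AutoMigrate
	}
	return db.Exec("ALTER TABLE channels MODIFY COLUMN model_mapping TEXT").Error
}
```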
- Enhance `NewProxyHttpClient` to handle SOCKS5 proxy authentication
- Extract username and password from proxy URL for SOCKS5 proxy configuration
- Provide optional authentication for SOCKS5 proxy connections
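The SOCKS5 credential handling could look roughly like this using `golang.org/x/net/proxy`; the function name and the client timeout are assumptions, only the user/password extraction from the proxy URL reflects the change above.

```go
package service

import (
	"net/http"
	"net/url"
	"time"

	"golang.org/x/net/proxy"
)

// newSocks5HTTPClient sketches how NewProxyHttpClient could pass credentials
// from a socks5://user:pass@host:port URL to the SOCKS5 dialer.
func newSocks5HTTPClient(proxyURL string) (*http.Client, error) {
	u, err := url.Parse(proxyURL)
	if err != nil {
		return nil, err
	}
	var auth *proxy.Auth
	if u.User != nil { // authentication stays optional
		password, _ := u.User.Password()
		auth = &proxy.Auth{User: u.User.Username(), Password: password}
	}
	dialer, err := proxy.SOCKS5("tcp", u.Host, auth, proxy.Direct)
	if err != nil {
		return nil, err
	}
	return &http.Client{
		Transport: &http.Transport{Dial: dialer.Dial},
		Timeout:   30 * time.Second,
	}, nil
}
```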
- Introduce `AZURE_DEFAULT_API_VERSION` environment variable
- Set default Azure API version to `2024-12-01-preview`
- Update README documentation for new environment configuration
- Modify Azure channel relay to use default API version when not specified
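A sketch of the environment fallback; aside from `AZURE_DEFAULT_API_VERSION` and the `2024-12-01-preview` default, the variable and helper names are illustrative.

```go
package constant

import "os"

// AzureDefaultAPIVersion falls back to 2024-12-01-preview when the
// AZURE_DEFAULT_API_VERSION environment variable is not set.
var AzureDefaultAPIVersion = getEnvOrDefault("AZURE_DEFAULT_API_VERSION", "2024-12-01-preview")

func getEnvOrDefault(key, fallback string) string {
	if v := os.Getenv(key); v != "" {
		return v
	}
	return fallback
}
```

In the Azure relay, this default would then be used as the `api-version` query parameter whenever the channel does not specify its own version.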
- Add `RecodeModelName` to `RelayInfo` struct for more flexible model name tracking
- Update text relay and quota consumption to use `RecodeModelName`
- Move reasoning effort from admin info to other info in log generation
- Ensure consistent model name handling across relay components
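For context, a hedged fragment of what the struct might carry; only `RecodeModelName` comes from the changes above, the neighbouring fields and comments are assumptions.

```go
package relaycommon

// Illustrative fragment of RelayInfo; field placement is an assumption.
type RelayInfo struct {
	OriginModelName   string // model name as received from the client
	UpstreamModelName string // model name actually sent to the provider
	RecodeModelName   string // model name used for quota deduction and log rows
}
```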
- Add `ReasoningEffort` field to `RelayInfo` struct
- Update log generation to include reasoning effort in admin info
- Modify logs table component to display reasoning effort when available
- Preserve reasoning effort information during request processing
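A hedged sketch of attaching reasoning effort to a log's extra payload once the `ReasoningEffort` field exists on `RelayInfo`; the helper name and map keys are assumptions, not the project's exact API.

```go
package relaycommon

// buildLogOtherInfo collects per-request details for log generation.
// Hypothetical helper shown only to illustrate where ReasoningEffort ends up.
func buildLogOtherInfo(info *RelayInfo) map[string]interface{} {
	other := map[string]interface{}{
		"model_name": info.RecodeModelName,
	}
	if info.ReasoningEffort != "" {
		other["reasoning_effort"] = info.ReasoningEffort
	}
	return other
}
```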
- Remove model name suffixes after extracting reasoning effort
- Update upstream model name to reflect the base model
- Ensure clean model name is passed to the upstream service
- Modify model suffix parsing to use hyphen-separated suffixes
- Ensure consistent parsing of `-high`, `-medium`, and `-low` reasoning effort indicators
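The hyphen-suffix parsing amounts to something like the following sketch; the function name is illustrative.

```go
package relaycommon

import "strings"

// parseReasoningEffortSuffix strips a trailing -high/-medium/-low marker and
// returns the base model name plus the effort it encodes.
func parseReasoningEffortSuffix(model string) (baseModel, effort string) {
	for _, e := range []string{"high", "medium", "low"} {
		if strings.HasSuffix(model, "-"+e) {
			return strings.TrimSuffix(model, "-"+e), e
		}
	}
	return model, ""
}
```

For example, `o3-mini-high` would yield `o3-mini` and `high`: the stripped name is what goes upstream, while the effort is kept on the relay info for the request.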
- Support setting reasoning effort via model name suffix
- Add `-high`, `-medium`, and `-low` suffixes to control reasoning effort
- Update README with new model configuration option
- Modify OpenAI adaptor to handle reasoning effort settings
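A hedged sketch of the adaptor side: the effort parsed from the suffix is copied into the request and the cleaned model name is used. Only the `reasoning_effort` JSON key follows the OpenAI chat-completions schema; the trimmed struct and function are illustrative.

```go
package openai

// generalOpenAIRequest is reduced to the fields relevant to this sketch.
type generalOpenAIRequest struct {
	Model           string `json:"model"`
	ReasoningEffort string `json:"reasoning_effort,omitempty"`
}

// applyReasoningEffort injects the parsed effort and the suffix-stripped
// model name into the outgoing request. Hypothetical helper.
func applyReasoningEffort(req *generalOpenAIRequest, upstreamModel, effort string) {
	if effort != "" {
		req.Model = upstreamModel
		req.ReasoningEffort = effort
	}
}
```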
The issue was caused by the `omitempty` tag in the Go struct, which prevented the `temperature` field from being included in the JSON output when it was set to 0.
Signed-off-by: Butui Hu <hot123tea123@gmail.com>
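A minimal illustration of the failure mode described above; the pointer form is shown as one common remedy and is not necessarily the exact fix applied in this commit.

```go
package dto

// With a plain float64, omitempty drops temperature whenever it is exactly 0:
//
//	Temperature float64 `json:"temperature,omitempty"` // 0 never reaches the JSON
//
// A pointer distinguishes "unset" from "explicitly 0" while still omitting
// fields the client never sent.
type chatRequest struct {
	Temperature *float64 `json:"temperature,omitempty"`
}
```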
- Implemented conditional logic to double the streaming timeout for models starting with "o1" or "o3".
- Adjusted streaming timeout handling by model type so slower reasoning models are not cut off prematurely.
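A sketch of the conditional, assuming the timeout is expressed as a `time.Duration`; the function name and parameter are illustrative.

```go
package relay

import (
	"strings"
	"time"
)

// streamTimeoutFor doubles the configured streaming timeout for models whose
// names start with "o1" or "o3".
func streamTimeoutFor(modelName string, base time.Duration) time.Duration {
	if strings.HasPrefix(modelName, "o1") || strings.HasPrefix(modelName, "o3") {
		return base * 2
	}
	return base
}
```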
- Changed base image from Node.js to Bun for improved performance.
- Replaced npm install with bun install for dependency management.
- Updated build command to use Bun for building the application.
- Added new bun.lockb file to track Bun dependencies.