- Update error message for missing model ratio to be more user-friendly
- Modify `ModelRatioNotSetEditor` to filter models without a price or ratio
- Enhance model data initialization with fallback values
- Implement `/api/channel/models_enabled` endpoint to retrieve enabled models
- Add an `EnabledListModels` handler to the controller (sketched below)
- Create new `ModelRatioNotSetEditor` component for managing unset model ratios
- Update router to include the new `models_enabled` route
- Add internationalization support for new model management UI
- Include GPT-4.5 preview model in OpenAI model list
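
A minimal sketch of the new endpoint's handler, assuming the project's Gin-based routing and a `success`/`message`/`data` response shape; `getEnabledModels` is a hypothetical stand-in for the real channel query:

```go
package controller

import (
	"net/http"

	"github.com/gin-gonic/gin"
)

// getEnabledModels is a hypothetical stand-in for the real query that
// collects the distinct model names enabled across all channels.
func getEnabledModels() ([]string, error) {
	return []string{"gpt-4o", "text-embedding-3-small"}, nil
}

// EnabledListModels handles GET /api/channel/models_enabled and returns
// the models currently enabled on at least one channel.
func EnabledListModels(c *gin.Context) {
	models, err := getEnabledModels()
	if err != nil {
		c.JSON(http.StatusOK, gin.H{"success": false, "message": err.Error()})
		return
	}
	c.JSON(http.StatusOK, gin.H{"success": true, "data": models})
}
```

The route would then be registered alongside the other channel routes, along the lines of `apiRouter.GET("/channel/models_enabled", controller.EnabledListModels)` (the exact router group is an assumption).
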
- Replace `ThinkingAdapterMaxTokens` with a more flexible `DefaultMaxTokens` map (sketched below)
- Add support for model-specific default max tokens configuration
- Update relay and web interface to use the new configuration approach
- Implement a fallback mechanism for default max tokens
- Update Claude relay to set the default `MaxTokens` dynamically
- Modify web interface to clarify the purpose of the default `MaxTokens` input
- Improve token configuration logic for thinking adapter models
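
A sketch of how the `DefaultMaxTokens` map and its fallback could look; the model name, value, and helper name here are illustrative, not the project's actual entries:

```go
package config

// DefaultMaxTokens maps a model name to the max_tokens value applied when a
// request does not set one. Entries here are illustrative only.
var DefaultMaxTokens = map[string]int{
	"claude-3-7-sonnet-thinking": 8192,
}

// GetDefaultMaxTokens returns the model-specific default, or the supplied
// fallback when the model has no entry (the fallback mechanism above).
func GetDefaultMaxTokens(model string, fallback int) int {
	if v, ok := DefaultMaxTokens[model]; ok {
		return v
	}
	return fallback
}
```

The Claude relay would then fill `request.MaxTokens` from this lookup only when the client leaves it unset.
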
- Introduce a new configuration management approach for model-specific settings (sketched below)
- Update Gemini settings to use the new config system with more flexible management
- Add support for dynamic configuration updates in option handling
- Modify Claude and Vertex adaptors to use new configuration methods
- Enhance web interface to support namespaced configuration keys
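
A rough sketch of what namespaced option handling could look like, assuming dot-separated keys such as `gemini.safety_setting`; the registry layout and names are assumptions, not the project's actual implementation:

```go
package config

import (
	"strings"
	"sync"
)

// registry groups option values by namespace, so a key like
// "gemini.safety_setting" lands in the "gemini" group. Illustrative only.
var (
	mu       sync.RWMutex
	registry = map[string]map[string]string{}
)

// UpdateOption routes a namespaced key to its group, allowing settings to be
// changed at runtime without a restart.
func UpdateOption(key, value string) {
	ns, name, found := strings.Cut(key, ".")
	if !found {
		ns, name = "global", key
	}
	mu.Lock()
	defer mu.Unlock()
	if registry[ns] == nil {
		registry[ns] = map[string]string{}
	}
	registry[ns][name] = value
}
```

Adaptors such as Claude and Vertex would then read their settings from their own namespace instead of from flat global options.
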
- Enhance channel test to detect more embedding models (sketched below)
- Update Azure OpenAI default API version to 2024-12-01-preview
- Remove redundant default API version setting in channel edit
- Add user cache writing in channel test
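
One way the broader embedding detection might work is a simple name heuristic in the channel test; the hint substrings below are assumptions rather than the exact list used:

```go
package controller

import "strings"

// isEmbeddingModel reports whether a model name looks like an embedding
// model, so the channel test can send an embeddings request instead of a
// chat completion. The hint list is illustrative.
func isEmbeddingModel(name string) bool {
	lower := strings.ToLower(name)
	for _, hint := range []string{"embedding", "embed", "bge-", "e5-"} {
		if strings.Contains(lower, hint) {
			return true
		}
	}
	return false
}
```
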
- Add new context keys for user-related information
- Modify user cache and authentication middleware to populate the request context (sketched below)
- Refactor quota and notification services to use context-based user data
- Remove redundant database queries by leveraging context information
- Update various components to use new context-based user retrieval methods
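
A sketch of the context-based user data flow, assuming Gin middleware; the key names, `CachedUser` type, and helper are hypothetical stand-ins for the project's real definitions:

```go
package middleware

import "github.com/gin-gonic/gin"

// Context keys for user-related information (names are assumptions).
const (
	ContextKeyUserGroup = "user_group"
	ContextKeyUserEmail = "user_email"
	ContextKeyUserQuota = "user_quota"
)

// CachedUser mirrors the cached fields the middleware needs (illustrative).
type CachedUser struct {
	Group string
	Email string
	Quota int64
}

// populateUserContext copies cached user fields into the request context so
// quota and notification services can read them without hitting the database
// again.
func populateUserContext(c *gin.Context, user *CachedUser) {
	c.Set(ContextKeyUserGroup, user.Group)
	c.Set(ContextKeyUserEmail, user.Email)
	c.Set(ContextKeyUserQuota, user.Quota)
}
```

Downstream services would read these back with `c.GetString(ContextKeyUserGroup)` or `c.GetInt64(ContextKeyUserQuota)` instead of issuing a fresh user query.
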