Backend (Go)
- Include custom endpoints in each model’s SupportedEndpointTypes by parsing Model.Endpoints (JSON) and appending its keys alongside the native endpoint types.
- Build a global supportedEndpointMap map[string]EndpointInfo{path, method} by:
- Seeding with native defaults.
- Overriding/adding entries from models.endpoints (a value may be a string path, which defaults to POST, or a {path, method} object); see the sketch after this list.
- Expose supported_endpoint at the top level of /api/pricing (similar to the existing vendors field), removing per-model duplication.
- Fix default path for EndpointTypeOpenAIResponse to /v1/responses.
- Keep concurrency/caching for pricing retrieval intact.
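As a rough illustration of the backend flow above, here is a minimal Go sketch of seeding the endpoint map with native defaults and then overriding/extending it from each model's endpoints JSON. The Model stub, the EndpointInfo field names, and the native defaults shown are assumptions for illustration, not the actual code.

```go
package model

import (
	"encoding/json"
	"strings"
)

// Model is a minimal stand-in for the real model struct; only the field used here.
type Model struct {
	Endpoints string // TEXT column storing a JSON object: endpoint type -> path or {path, method}
}

// EndpointInfo describes how a supported endpoint is exposed to clients.
// The field names are assumptions for illustration.
type EndpointInfo struct {
	Path   string `json:"path"`
	Method string `json:"method"`
}

// buildSupportedEndpointMap seeds the map with native defaults, then
// overrides/extends it from each model's endpoints JSON. A bare string value
// means "custom path, method defaults to POST"; an object may also set the method.
func buildSupportedEndpointMap(models []*Model) map[string]EndpointInfo {
	endpointMap := map[string]EndpointInfo{
		// Native defaults (partial, illustrative list).
		"openai":          {Path: "/v1/chat/completions", Method: "POST"},
		"openai_response": {Path: "/v1/responses", Method: "POST"},
	}
	for _, m := range models {
		if m.Endpoints == "" {
			continue
		}
		var raw map[string]json.RawMessage
		if err := json.Unmarshal([]byte(m.Endpoints), &raw); err != nil {
			continue // tolerate a malformed row rather than failing the whole pricing response
		}
		for endpointType, value := range raw {
			var path string
			if err := json.Unmarshal(value, &path); err == nil {
				endpointMap[endpointType] = EndpointInfo{Path: path, Method: "POST"}
				continue
			}
			var info EndpointInfo
			if err := json.Unmarshal(value, &info); err == nil && info.Path != "" {
				if info.Method == "" {
					info.Method = "POST"
				}
				info.Method = strings.ToUpper(info.Method)
				endpointMap[endpointType] = info
			}
		}
	}
	return endpointMap
}
```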
Frontend (React)
- Fetch supported_endpoint in useModelPricingData and propagate to PricingPage → ModelDetailSideSheet → ModelEndpoints.
- ModelEndpoints
- Resolve path+method via endpointMap; replace the {model} placeholder with the actual model name.
- Fix mobile visibility; always show the path and HTTP method.
- JSONEditor
- Wrap with Form.Slot to inherit form layout; simplify visual styles.
- Use Tabs for “Visual” / “Manual” modes.
- Unify editors: key-value editor now supports nested JSON:
- “+” to convert a primitive into an object and add nested fields.
- Add a “Convert to value” action to toggle back from an object to a primitive.
- Renaming a key keeps row order stable; new rows append at the bottom.
- Use Row/Col grid for clean alignment; region editor uses Form.Slot + grid.
- Editing flows
- EditModelModal / EditPrefillGroupModal use JSONEditor (editorType='object') for endpoint mappings.
- PrefillGroupManagement renders endpoint group items by JSON keys.
Data expectations / compatibility
- models.endpoints should be a JSON object mapping endpoint type → string path or {path, method}; bare strings default to POST (see the example after this list).
- No schema changes; existing TEXT field continues to store JSON.
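For reference, a small Go snippet showing the accepted shape of a models.endpoints value; the endpoint type keys and paths below are illustrative, not required names.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// exampleEndpoints shows the accepted shape of models.endpoints: keys are
// endpoint types (names here are illustrative); values are either a bare path
// string (method defaults to POST) or an object with an explicit path/method.
const exampleEndpoints = `{
  "openai": "/v1/chat/completions",
  "embeddings": {"path": "/v1/embeddings"},
  "task-query": {"path": "/v1/tasks/{model}", "method": "GET"}
}`

func main() {
	var parsed map[string]any
	if err := json.Unmarshal([]byte(exampleEndpoints), &parsed); err != nil {
		panic(err)
	}
	fmt.Printf("parsed %d endpoint entries\n", len(parsed))
}
```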
QA
- /api/pricing now returns custom endpoint types and global supported_endpoint.
- UI shows both native and custom endpoints; paths/methods render on mobile; nested editing works and preserves order.
This commit refactors the application's error handling mechanism by introducing a new standardized error type, `types.NewAPIError`. It also renames common JSON utility functions for better clarity.
Previously, internal error handling was tightly coupled to the `dto.OpenAIError` format. This change decouples the internal logic from the external API representation.
Key changes:
- A new `types.NewAPIError` struct is introduced to serve as a canonical internal representation for all API errors.
- All relay adapters (OpenAI, Claude, Gemini, etc.) are updated to return `*types.NewAPIError`.
- Controllers now convert the internal `NewAPIError` to the client-facing `OpenAIError` format at the API boundary, ensuring backward compatibility (see the sketch after this list).
- Channel auto-disable/enable logic is updated to use the new standardized error type.
- JSON utility functions are renamed to align with Go's standard library conventions (e.g., `UnmarshalJson` -> `Unmarshal`, `EncodeJson` -> `Marshal`).
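A minimal sketch of what the internal error type and the boundary conversion could look like, assuming illustrative field names; the real definitions in types/ and dto/ may differ.

```go
package types

import "fmt"

// openAIError is a local stand-in for dto.OpenAIError, included only so the
// sketch is self-contained; the real DTO lives in the dto package.
type openAIError struct {
	Message string `json:"message"`
	Type    string `json:"type"`
	Code    string `json:"code,omitempty"`
}

// NewAPIError sketches the canonical internal error described above; the
// actual field set in types/ may differ.
type NewAPIError struct {
	StatusCode int    // HTTP status to surface at the boundary
	Code       string // stable, machine-readable error code
	Message    string // human-readable description
	Err        error  // wrapped underlying error, if any
}

func (e *NewAPIError) Error() string {
	if e.Err != nil {
		return fmt.Sprintf("%s: %v", e.Message, e.Err)
	}
	return e.Message
}

// toOpenAIError is the conversion controllers perform at the API boundary
// before writing the response, keeping the external wire format unchanged.
func (e *NewAPIError) toOpenAIError() openAIError {
	return openAIError{
		Message: e.Message,
		Type:    "api_error",
		Code:    e.Code,
	}
}
```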
* backend
- constant/endpoint_type.go
• Add EndpointTypeMidjourney, EndpointTypeSuno, EndpointTypeKling, EndpointTypeJimeng.
- common/endpoint_type.go
• Map Midjourney / MidjourneyPlus, SunoAPI, Kling, and Jimeng channel types to the new endpoint types (see the sketch after this note).
* frontend
- ModelPricing.js
• Add “Supported Endpoint Type” column.
• Implement renderSupportedEndpoints with `stringToColor` for consistent tag colors.
These changes allow `/api/pricing` and model lists to return accurate `supported_endpoint_types` covering all non-OpenAI providers, and the UI now displays them clearly.
No breaking changes.
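A hedged Go sketch of the new endpoint-type constants and the channel-type mapping described above; the string values, channel-type IDs, and the helper name are placeholders for illustration.

```go
package constant

// EndpointType identifies which API surface a model/channel is served through.
// The string values below are illustrative.
type EndpointType string

const (
	EndpointTypeOpenAI     EndpointType = "openai"
	EndpointTypeMidjourney EndpointType = "midjourney"
	EndpointTypeSuno       EndpointType = "suno"
	EndpointTypeKling      EndpointType = "kling"
	EndpointTypeJimeng     EndpointType = "jimeng"
)

// Placeholder channel-type IDs so the sketch is self-contained; the real
// values live with the channel constants.
const (
	ChannelTypeMidjourney = iota + 100
	ChannelTypeMidjourneyPlus
	ChannelTypeSunoAPI
	ChannelTypeKling
	ChannelTypeJimeng
)

// endpointTypeForChannel mirrors the mapping described above: each non-OpenAI
// provider channel reports its own endpoint type; everything else falls back
// to the OpenAI-compatible endpoint.
func endpointTypeForChannel(channelType int) EndpointType {
	switch channelType {
	case ChannelTypeMidjourney, ChannelTypeMidjourneyPlus:
		return EndpointTypeMidjourney
	case ChannelTypeSunoAPI:
		return EndpointTypeSuno
	case ChannelTypeKling:
		return EndpointTypeKling
	case ChannelTypeJimeng:
		return EndpointTypeJimeng
	default:
		return EndpointTypeOpenAI
	}
}
```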
This update replaces DecodeJson and DecodeJsonStr calls with UnmarshalJson and UnmarshalJsonStr across the relay handlers, keeping JSON handling consistent with the recent utility renames and making it easier to maintain.
This update adds an IOCopyBytesGracefully helper to the common package that encapsulates setting response headers and writing the buffered response body, tightening error handling and resource management. The OpenAI handlers now call this helper instead of duplicating the copy logic, which keeps them clearer and easier to maintain.
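Below is a minimal sketch of what such a helper could look like, assuming it receives the gin context, the upstream response, and the already-buffered body; the real signature and behavior in common/ may differ.

```go
package common

import (
	"fmt"
	"net/http"
	"strconv"

	"github.com/gin-gonic/gin"
)

// IOCopyBytesGracefully writes an already-buffered upstream body back to the
// client: it mirrors upstream headers, corrects Content-Length to match the
// buffered bytes, and reports partial writes instead of dropping them silently.
func IOCopyBytesGracefully(c *gin.Context, resp *http.Response, body []byte) error {
	status := http.StatusOK
	if resp != nil {
		for key, values := range resp.Header {
			for _, v := range values {
				c.Writer.Header().Add(key, v)
			}
		}
		status = resp.StatusCode
	}
	// The body is fully buffered, so Content-Length must reflect its real size.
	c.Writer.Header().Set("Content-Length", strconv.Itoa(len(body)))
	c.Writer.WriteHeader(status)

	n, err := c.Writer.Write(body)
	if err != nil {
		return fmt.Errorf("write response body: %w", err)
	}
	if n != len(body) {
		return fmt.Errorf("short write: %d of %d bytes", n, len(body))
	}
	return nil
}
```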