feat: add MiniMax LLM provider instrumentation #140

octo-patch wants to merge 1 commit into future-agi:main
Conversation
> - Both international (api.minimax.io) and domestic (api.minimaxi.com) endpoints
> """
>
> __slots__ = (
`__slots__` declares only 3 attributes, but lines 105 and 112 assign `self._original_protect`. This will raise AttributeError at runtime when evaluations (guardrailing) are used. Add `"_original_protect"` to `__slots__`, or alternatively drop the protect assignment entirely, since it is traced by other instrumentors.
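A minimal sketch of the fix; the class name and the other slot names here are assumptions for illustration, not the actual PR code — only `"_original_protect"` is the attribute in question:

```python
class MiniMaxInstrumentor:
    # Sketch: hypothetical slot names; the point is that every attribute
    # assigned on self must appear in __slots__.
    __slots__ = (
        "_original_chat",
        "_original_async_chat",
        "_tracer",
        "_original_protect",  # added: assigned during _instrument, so it needs a slot
    )

    def _instrument(self) -> None:
        # Without "_original_protect" in __slots__, this assignment
        # raises AttributeError at runtime.
        self._original_protect = None
```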
> else:
>     self._original_protect = None

> def _uninstrument(self, **kwargs: Any) -> None:
Same here: `_uninstrument` doesn't restore the wrapper. I would suggest removing the protect wrapper itself.
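If the protect wrapper is kept rather than removed, `_uninstrument` should restore the saved original. A minimal sketch with hypothetical names (`target` stands in for the module that owns `protect()`; this is not the PR's actual wrapping code):

```python
from types import SimpleNamespace


class ProtectInstrumentor:
    def __init__(self) -> None:
        self._original_protect = None

    def _instrument(self, target) -> None:
        # Save the original before replacing it with a traced wrapper.
        original = target.protect
        self._original_protect = original
        target.protect = lambda *args, **kwargs: ("traced", original(*args, **kwargs))

    def _uninstrument(self, target) -> None:
        # Restore the original so repeated instrument/uninstrument
        # cycles do not stack wrappers.
        if self._original_protect is not None:
            target.protect = self._original_protect
            self._original_protect = None
```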
> content = message.get("content", "")
> if content:
>     yield SpanAttributes.OUTPUT_VALUE, content
Line 101 yields OUTPUT_VALUE with readable content, then line 104 unconditionally yields OUTPUT_VALUE again with the raw JSON dump. Since `set_attribute` overwrites, the human-readable content is silently lost.
Either remove line 104 or use a different attribute key for the raw output (I would suggest checking the semantic conventions).
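One way to keep both values, sketched with illustrative attribute keys — `"output.value"` and `"raw.output"` are placeholders, and the real keys should come from the project's semantic conventions:

```python
import json


def extract_output(message: dict, raw_response: dict):
    # Yield the readable content under the main output key, and the raw
    # JSON dump under a separate key so neither overwrites the other.
    content = message.get("content", "")
    if content:
        yield "output.value", content
        yield "raw.output", json.dumps(raw_response)
    else:
        # Fall back to the raw dump only when there is no readable content.
        yield "output.value", json.dumps(raw_response)
```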
>     "tool_calls": list(self._tool_calls.values()),
>     "finish_reason": self._finish_reason
> }
> yield SpanAttributes.OUTPUT_VALUE, safe_json_dumps(output_summary)
Same issue here: the dual overwrite loses the content that the function is trying to extract.
> MINIMAX_BASE_URLS = [
>     "api.minimax.io",
>     "api.minimaxi.com",
> ]
Consider using an Enum for better accessibility.
> if base_url is not None:
>     base_url_str = str(base_url).lower()
>     return any(url in base_url_str for url in MINIMAX_BASE_URLS)
Not verified, but if MiniMax allows custom deployments with custom domains, then this substring matching could be intentional. If that isn't the case, please consider using an Enum and strict matching for safety purposes: as written, any URL that merely contains "api.minimax.io" as a substring (e.g. a lookalike domain) would match.
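A sketch of strict host matching with an Enum, assuming custom domains are not supported (function and member names here are illustrative, not the PR's):

```python
from enum import Enum
from urllib.parse import urlparse


class MiniMaxBaseURL(str, Enum):
    INTERNATIONAL = "api.minimax.io"
    DOMESTIC = "api.minimaxi.com"


_MINIMAX_HOSTS = {member.value for member in MiniMaxBaseURL}


def is_minimax_base_url(base_url) -> bool:
    # Compare the parsed hostname exactly instead of substring matching,
    # so "api.minimax.io.evil.com" no longer matches.
    if base_url is None:
        return False
    host = urlparse(str(base_url)).hostname or ""
    return host.lower() in _MINIMAX_HOSTS
```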
NVJKKartik left a comment:
Hey @octo-patch, thank you for contributing to the repo. The MiniMax provider is a great addition. The implementation is clean and follows the existing patterns well. I have flagged a few issues on the specific lines; please address them and I will be happy to approve and publish.
Add OpenTelemetry instrumentation for MiniMax (https://www.minimax.io/), which provides an OpenAI-compatible API. This integration supports:
- MiniMax-M2.5 and MiniMax-M2.5-highspeed models (204K context)
- Sync and async chat completions
- Streaming responses
- Function/tool calling
- Token usage tracking

The implementation follows the same pattern as the existing DeepSeek integration, detecting MiniMax clients by their base_url (api.minimax.io) and wrapping OpenAI SDK calls accordingly.

Changes:
- New package: python/frameworks/minimax/ (traceai_minimax)
- Added MINIMAX to FiLLMProviderValues enum
- Updated README.md with MiniMax in supported frameworks
- Includes comprehensive tests (19 passing) and usage examples
Force-pushed from ac507f6 to f91e16c
Summary
Add OpenTelemetry instrumentation for MiniMax, an AI platform providing OpenAI-compatible API with powerful language models.
What is included:
- traceai-minimax: full instrumentation for MiniMax LLM API calls
- MINIMAX added to the FiLLMProviderValues enum

Implementation approach:
MiniMax uses the OpenAI-compatible API (https://api.minimax.io/v1), so this integration follows the same pattern as the DeepSeek instrumentor: wrapping OpenAI SDK calls and detecting MiniMax clients by their base_url.

Files changed:
- python/frameworks/minimax/
- python/fi_instrumentation/fi_types.py
- README.md

Test plan
- All 19 tests pass (pytest tests/ -v)