
Fix AttributeError in _get_finish_reason_for_choice when finish_reason is a string #45873

Open
happywhaler wants to merge 1 commit into Azure:main from happywhaler:patch-2

Conversation

@happywhaler

Fix AttributeError in _get_finish_reason_for_choice when finish_reason is a string

Description

This PR fixes a bug in _get_finish_reason_for_choice() that causes an AttributeError when the model returns finish_reason as a string instead of an enum.

Some models (e.g., Mistral Large 3 on Azure AI Foundry MaaS) return finish_reason as a raw string ("stop") rather than a CompletionsFinishReason enum. The current implementation unconditionally calls .value on finish_reason, which fails with:

AttributeError: 'str' object has no attribute 'value'
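The failure is easy to reproduce in isolation, independent of the SDK. A minimal sketch of the unconditional attribute access failing on a raw string:

```python
finish_reason = "stop"  # raw string, as some MaaS models return it

try:
    # Mirrors the unconditional attribute access in the old tracing code
    value = finish_reason.value
except AttributeError as exc:
    print(f"{type(exc).__name__}: {exc}")  # AttributeError: 'str' object has no attribute 'value'
```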

Root Cause

In tracing.py, _get_finish_reason_for_choice() assumes finish_reason is always an enum:

def _get_finish_reason_for_choice(self, choice):
    finish_reason = getattr(choice, "finish_reason", None)
    if finish_reason is None:
        return "none"
    return finish_reason.value  # Fails when finish_reason is already a string

However, the similar method _get_finish_reasons() already handles both cases correctly:

if hasattr(finish_reason, "value"):
    finish_reasons.append(finish_reason.value)
elif isinstance(finish_reason, str):
    finish_reasons.append(finish_reason)

Fix

Apply the same defensive pattern to _get_finish_reason_for_choice():

def _get_finish_reason_for_choice(self, choice):
    finish_reason = getattr(choice, "finish_reason", None)
    if finish_reason is None:
        return "none"
    elif hasattr(finish_reason, "value"):
        return finish_reason.value
    elif isinstance(finish_reason, str):
        return finish_reason
    else:
        return "none"

Impact

Without this fix, tracing fails for any model that returns string finish_reason, causing the entire API call to fail. Users must disable tracing (AZURE_SDK_TRACING_IMPLEMENTATION=none or tracing_enabled=False) as a workaround.

## Reproduction
1. Use Azure AI Inference SDK with tracing enabled
2. Call a Mistral model via Azure AI Foundry MaaS endpoint
3. Tracing code crashes on response processing

## Proposed Fix
Apply the same defensive pattern from `_get_finish_reasons()` to `_get_finish_reason_for_choice()`, as shown in the Fix section above.
Copilot AI review requested due to automatic review settings March 24, 2026 10:36
@github-actions github-actions bot added labels Mar 24, 2026: AI Model Inference (Issues related to the client library for Azure AI Model Inference, \sdk\ai\azure-ai-inference); Community Contribution (Community members are working on the issue); customer-reported (Issues that are reported by GitHub users external to the Azure organization).
@github-actions

Thank you for your contribution @happywhaler! We will review the pull request and get back to you soon.

Contributor

Copilot AI left a comment


Pull request overview

This PR fixes tracing robustness in azure-ai-inference by handling finish_reason values that are returned as raw strings (instead of enums), preventing AttributeError during span/event creation for certain models.

Changes:

  • Update _get_finish_reason_for_choice() to return "none", finish_reason.value, or the raw finish_reason string depending on the runtime type.
  • Align _get_finish_reason_for_choice() behavior with the already-defensive _get_finish_reasons() implementation.

Comment on lines 281 to +290

 def _get_finish_reason_for_choice(self, choice):
     finish_reason = getattr(choice, "finish_reason", None)
-    if finish_reason is not None:
-        return finish_reason.value
-
-    return "none"
+    if finish_reason is None:
+        return "none"
+    elif hasattr(finish_reason, "value"):
+        return finish_reason.value
+    elif isinstance(finish_reason, str):
+        return finish_reason
+    else:
+        return "none"

Copilot AI Mar 24, 2026


_get_finish_reason_for_choice now duplicates the same enum/string normalization logic already present in _get_finish_reasons. To avoid the two implementations drifting over time, consider extracting a small helper (e.g., _normalize_finish_reason(finish_reason)) and using it from both places (or have _get_finish_reasons call _get_finish_reason_for_choice per choice).

Comment on lines +283 to +288

+    if finish_reason is None:
+        return "none"
+    elif hasattr(finish_reason, "value"):
+        return finish_reason.value
+    elif isinstance(finish_reason, str):
+        return finish_reason

Copilot AI Mar 24, 2026


This fix changes behavior for choices where finish_reason is already a str, but the existing tracing tests appear to only validate the serialized output (and likely exercise enum-backed finish_reason). Please add a focused test that constructs a minimal choice object with finish_reason="stop" (string) and asserts tracing emits "stop" without raising (covering the reported AttributeError regression).

