strands.telemetry.tracer

OpenTelemetry integration.

This module provides tracing capabilities using OpenTelemetry, enabling trace data to be sent to OTLP endpoints.

AttributeValue = Union[str, bool, float, int, List[str], List[bool], List[float], List[int], Sequence[str], Sequence[bool], Sequence[int], Sequence[float]] module-attribute

Attributes = Optional[Mapping[str, AttributeValue]] module-attribute

Messages = List[Message] module-attribute

A list of messages representing a conversation.

MultiAgentInput = str | list[ContentBlock] | list[InterruptResponseContent] module-attribute

StopReason = Literal['content_filtered', 'end_turn', 'guardrail_intervened', 'interrupt', 'max_tokens', 'stop_sequence', 'tool_use'] module-attribute

Reason for the model ending its response generation.

  • "content_filtered": Content was filtered due to policy violation
  • "end_turn": Normal completion of the response
  • "guardrail_intervened": Guardrail system intervened
  • "interrupt": Agent was interrupted for human input
  • "max_tokens": Maximum token limit reached
  • "stop_sequence": Stop sequence encountered
  • "tool_use": Model requested to use a tool
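Since StopReason is a plain string Literal, callers can branch on the values with ordinary equality checks. A minimal sketch (the describe_stop helper is hypothetical, not part of the library):

```python
from typing import Literal

# Mirror of the documented StopReason literal values.
StopReason = Literal[
    "content_filtered", "end_turn", "guardrail_intervened",
    "interrupt", "max_tokens", "stop_sequence", "tool_use",
]

def describe_stop(reason: StopReason) -> str:
    """Map a stop reason to a short human-readable note."""
    if reason == "tool_use":
        return "model requested a tool call"
    if reason == "max_tokens":
        return "response truncated at the token limit"
    if reason == "interrupt":
        return "agent paused for human input"
    return f"generation ended: {reason}"

print(describe_stop("max_tokens"))  # response truncated at the token limit
```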

_tracer_instance = None module-attribute

logger = logging.getLogger(__name__) module-attribute

AgentResult dataclass

Represents the last result of invoking an agent with a prompt.

Attributes:

Name Type Description
stop_reason StopReason

The reason why the agent's processing stopped.

message Message

The last message generated by the agent.

metrics EventLoopMetrics

Performance metrics collected during processing.

state Any

Additional state information from the event loop.

interrupts Sequence[Interrupt] | None

List of interrupts if raised by user.

structured_output BaseModel | None

Parsed structured output when structured_output_model was specified.

Source code in strands/agent/agent_result.py
17
18
19
20
21
22
23
24
25
26
27
28
29
30
31
32
33
34
35
36
37
38
39
40
41
42
43
44
45
46
47
48
49
50
51
52
53
54
55
56
57
58
59
60
61
62
63
64
65
66
67
68
69
70
71
72
73
74
75
76
77
78
79
80
81
82
83
84
85
86
87
88
@dataclass
class AgentResult:
    """Represents the last result of invoking an agent with a prompt.

    Attributes:
        stop_reason: The reason why the agent's processing stopped.
        message: The last message generated by the agent.
        metrics: Performance metrics collected during processing.
        state: Additional state information from the event loop.
        interrupts: List of interrupts if raised by user.
        structured_output: Parsed structured output when structured_output_model was specified.
    """

    stop_reason: StopReason
    message: Message
    metrics: EventLoopMetrics
    state: Any
    interrupts: Sequence[Interrupt] | None = None
    structured_output: BaseModel | None = None

    def __str__(self) -> str:
        """Get the agent's last message as a string.

        This method extracts and concatenates all text content from the final message, ignoring any non-text content
        like images or structured data. If there's no text content but structured output is present, it serializes
        the structured output instead.

        Returns:
            The agent's last message as a string.
        """
        content_array = self.message.get("content", [])

        result = ""
        for item in content_array:
            if isinstance(item, dict) and "text" in item:
                result += item.get("text", "") + "\n"

        if not result and self.structured_output:
            result = self.structured_output.model_dump_json()

        return result

    @classmethod
    def from_dict(cls, data: dict[str, Any]) -> "AgentResult":
        """Rehydrate an AgentResult from persisted JSON.

        Args:
            data: Dictionary containing the serialized AgentResult data
        Returns:
            AgentResult instance
        Raises:
            TypeError: If the data format is invalid
        """
        if data.get("type") != "agent_result":
            raise TypeError(f"AgentResult.from_dict: unexpected type {data.get('type')!r}")

        message = cast(Message, data.get("message"))
        stop_reason = cast(StopReason, data.get("stop_reason"))

        return cls(message=message, stop_reason=stop_reason, metrics=EventLoopMetrics(), state={})

    def to_dict(self) -> dict[str, Any]:
        """Convert this AgentResult to JSON-serializable dictionary.

        Returns:
            Dictionary containing serialized AgentResult data
        """
        return {
            "type": "agent_result",
            "message": self.message,
            "stop_reason": self.stop_reason,
        }
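The text-flattening behavior of __str__ can be shown standalone; this sketch reproduces the loop from the listing above on a plain dict rather than a real AgentResult:

```python
# A message with mixed content: only "text" blocks survive flattening,
# each followed by a newline; other block types are skipped.
message = {
    "role": "assistant",
    "content": [
        {"text": "Hello"},
        {"image": {"format": "png"}},  # non-text block, ignored
        {"text": "world"},
    ],
}

result = ""
for item in message.get("content", []):
    if isinstance(item, dict) and "text" in item:
        result += item.get("text", "") + "\n"

print(repr(result))  # 'Hello\nworld\n'
```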

__str__()

Get the agent's last message as a string.

This method extracts and concatenates all text content from the final message, ignoring any non-text content like images or structured data. If there's no text content but structured output is present, it serializes the structured output instead.

Returns:

Type Description
str

The agent's last message as a string.

Source code in strands/agent/agent_result.py
def __str__(self) -> str:
    """Get the agent's last message as a string.

    This method extracts and concatenates all text content from the final message, ignoring any non-text content
    like images or structured data. If there's no text content but structured output is present, it serializes
    the structured output instead.

    Returns:
        The agent's last message as a string.
    """
    content_array = self.message.get("content", [])

    result = ""
    for item in content_array:
        if isinstance(item, dict) and "text" in item:
            result += item.get("text", "") + "\n"

    if not result and self.structured_output:
        result = self.structured_output.model_dump_json()

    return result

from_dict(data) classmethod

Rehydrate an AgentResult from persisted JSON.

Parameters:

Name Type Description Default
data dict[str, Any]

Dictionary containing the serialized AgentResult data

required

Returns:

Type Description
AgentResult

AgentResult instance

Raises:

Type Description
TypeError

If the data format is invalid

Source code in strands/agent/agent_result.py
@classmethod
def from_dict(cls, data: dict[str, Any]) -> "AgentResult":
    """Rehydrate an AgentResult from persisted JSON.

    Args:
        data: Dictionary containing the serialized AgentResult data
    Returns:
        AgentResult instance
    Raises:
        TypeError: If the data format is invalid
    """
    if data.get("type") != "agent_result":
        raise TypeError(f"AgentResult.from_dict: unexpected type {data.get('type')!r}")

    message = cast(Message, data.get("message"))
    stop_reason = cast(StopReason, data.get("stop_reason"))

    return cls(message=message, stop_reason=stop_reason, metrics=EventLoopMetrics(), state={})

to_dict()

Convert this AgentResult to JSON-serializable dictionary.

Returns:

Type Description
dict[str, Any]

Dictionary containing serialized AgentResult data

Source code in strands/agent/agent_result.py
def to_dict(self) -> dict[str, Any]:
    """Convert this AgentResult to JSON-serializable dictionary.

    Returns:
        Dictionary containing serialized AgentResult data
    """
    return {
        "type": "agent_result",
        "message": self.message,
        "stop_reason": self.stop_reason,
    }
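A round trip through to_dict and from_dict preserves only message and stop_reason; metrics, state, interrupts, and structured_output are reset on rehydration. This standalone sketch mirrors the type check and field extraction from the listings above:

```python
# Serialized form, as produced by to_dict():
serialized = {
    "type": "agent_result",
    "message": {"role": "assistant", "content": [{"text": "done"}]},
    "stop_reason": "end_turn",
}

# from_dict() validates the "type" discriminator before rebuilding:
if serialized.get("type") != "agent_result":
    raise TypeError(f"unexpected type {serialized.get('type')!r}")

message = serialized.get("message")
stop_reason = serialized.get("stop_reason")
print(stop_reason)  # end_turn
```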

ContentBlock

Bases: TypedDict

A block of content for a message that you pass to, or receive from, a model.

Attributes:

Name Type Description
cachePoint CachePoint

A cache point configuration to optimize conversation history.

document DocumentContent

A document to include in the message.

guardContent GuardContent

Contains the content to assess with the guardrail.

image ImageContent

Image to include in the message.

reasoningContent ReasoningContentBlock

Contains content regarding the reasoning that is carried out by the model.

text str

Text to include in the message.

toolResult ToolResult

The result for a tool request that a model makes.

toolUse ToolUse

Information about a tool use request from a model.

video VideoContent

Video to include in the message.

citationsContent CitationsContentBlock

Contains the citations for a document.

Source code in strands/types/content.py
class ContentBlock(TypedDict, total=False):
    """A block of content for a message that you pass to, or receive from, a model.

    Attributes:
        cachePoint: A cache point configuration to optimize conversation history.
        document: A document to include in the message.
        guardContent: Contains the content to assess with the guardrail.
        image: Image to include in the message.
        reasoningContent: Contains content regarding the reasoning that is carried out by the model.
        text: Text to include in the message.
        toolResult: The result for a tool request that a model makes.
        toolUse: Information about a tool use request from a model.
        video: Video to include in the message.
        citationsContent: Contains the citations for a document.
    """

    cachePoint: CachePoint
    document: DocumentContent
    guardContent: GuardContent
    image: ImageContent
    reasoningContent: ReasoningContentBlock
    text: str
    toolResult: ToolResult
    toolUse: ToolUse
    video: VideoContent
    citationsContent: CitationsContentBlock
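Because ContentBlock is declared with total=False, every key is optional and a block typically carries exactly one key. A sketch with hypothetical tool names and ids:

```python
# One text block and one toolUse block inside a single assistant message.
text_block = {"text": "Here is the chart you asked for:"}
tool_use_block = {
    "toolUse": {
        "toolUseId": "tooluse_123",       # hypothetical id
        "name": "plot_chart",             # hypothetical tool name
        "input": {"series": [1, 2, 3]},
    }
}

message = {"role": "assistant", "content": [text_block, tool_use_block]}
print(len(message["content"]))  # 2
```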

InterruptResponseContent

Bases: TypedDict

Content block containing a user response to an interrupt.

Attributes:

Name Type Description
interruptResponse InterruptResponse

User response to an interrupt event.

Source code in strands/types/interrupt.py
class InterruptResponseContent(TypedDict):
    """Content block containing a user response to an interrupt.

    Attributes:
        interruptResponse: User response to an interrupt event.
    """

    interruptResponse: InterruptResponse

JSONEncoder

Bases: JSONEncoder

Custom JSON encoder that handles non-serializable types.

Source code in strands/telemetry/tracer.py
class JSONEncoder(json.JSONEncoder):
    """Custom JSON encoder that handles non-serializable types."""

    def encode(self, obj: Any) -> str:
        """Recursively encode objects, preserving structure and only replacing unserializable values.

        Args:
            obj: The object to encode

        Returns:
            JSON string representation of the object
        """
        # Process the object to handle non-serializable values
        processed_obj = self._process_value(obj)
        # Use the parent class to encode the processed object
        return super().encode(processed_obj)

    def _process_value(self, value: Any) -> Any:
        """Process any value, handling containers recursively.

        Args:
            value: The value to process

        Returns:
            Processed value with unserializable parts replaced
        """
        # Handle datetime objects directly
        if isinstance(value, (datetime, date)):
            return value.isoformat()

        # Handle dictionaries
        elif isinstance(value, dict):
            return {k: self._process_value(v) for k, v in value.items()}

        # Handle lists
        elif isinstance(value, list):
            return [self._process_value(item) for item in value]

        # Handle all other values
        else:
            try:
                # Test if the value is JSON serializable
                json.dumps(value)
                return value
            except (TypeError, OverflowError, ValueError):
                return "<replaced>"
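The replacement behavior can be demonstrated with a condensed standalone version of _process_value (same logic as the listing above, written as a free function):

```python
import json
from datetime import date, datetime
from typing import Any

# Datetimes become ISO strings, containers are walked recursively, and
# anything json.dumps rejects is replaced with the "<replaced>" placeholder.
def process_value(value: Any) -> Any:
    if isinstance(value, (datetime, date)):
        return value.isoformat()
    if isinstance(value, dict):
        return {k: process_value(v) for k, v in value.items()}
    if isinstance(value, list):
        return [process_value(v) for v in value]
    try:
        json.dumps(value)  # probe for serializability
        return value
    except (TypeError, OverflowError, ValueError):
        return "<replaced>"

payload = {"when": date(2024, 1, 2), "tags": {1, 2}, "n": 3}
print(json.dumps(process_value(payload)))
# {"when": "2024-01-02", "tags": "<replaced>", "n": 3}
```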

encode(obj)

Recursively encode objects, preserving structure and only replacing unserializable values.

Parameters:

Name Type Description Default
obj Any

The object to encode

required

Returns:

Type Description
str

JSON string representation of the object

Source code in strands/telemetry/tracer.py
def encode(self, obj: Any) -> str:
    """Recursively encode objects, preserving structure and only replacing unserializable values.

    Args:
        obj: The object to encode

    Returns:
        JSON string representation of the object
    """
    # Process the object to handle non-serializable values
    processed_obj = self._process_value(obj)
    # Use the parent class to encode the processed object
    return super().encode(processed_obj)

Message

Bases: TypedDict

A message in a conversation with the agent.

Attributes:

Name Type Description
content List[ContentBlock]

The message content.

role Role

The role of the message sender.

Source code in strands/types/content.py
class Message(TypedDict):
    """A message in a conversation with the agent.

    Attributes:
        content: The message content.
        role: The role of the message sender.
    """

    content: List[ContentBlock]
    role: Role

Metrics

Bases: TypedDict

Performance metrics for model interactions.

Attributes:

Name Type Description
latencyMs int

Latency of the model request in milliseconds.

timeToFirstByteMs int

Latency from sending model request to first content chunk (contentBlockDelta or contentBlockStart) from the model in milliseconds.

Source code in strands/types/event_loop.py
class Metrics(TypedDict, total=False):
    """Performance metrics for model interactions.

    Attributes:
        latencyMs (int): Latency of the model request in milliseconds.
        timeToFirstByteMs (int): Latency from sending model request to first
            content chunk (contentBlockDelta or contentBlockStart) from the model in milliseconds.
    """

    latencyMs: Required[int]
    timeToFirstByteMs: int
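Because latencyMs is wrapped in Required[...] inside a total=False TypedDict, type checkers demand it while timeToFirstByteMs stays optional. At runtime a Metrics value is an ordinary dict, so the optional key is read defensively:

```python
# A Metrics value with the optional timeToFirstByteMs key omitted;
# latencyMs must always be present for type checkers.
m = {"latencyMs": 320}

ttfb = m.get("timeToFirstByteMs", 0)  # safe read of the optional key
print(m["latencyMs"], ttfb)  # 320 0
```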

ToolResult

Bases: TypedDict

Result of a tool execution.

Attributes:

Name Type Description
content list[ToolResultContent]

List of result content returned by the tool.

status ToolResultStatus

The status of the tool execution ("success" or "error").

toolUseId str

The unique identifier of the tool use request that produced this result.

Source code in strands/types/tools.py
class ToolResult(TypedDict):
    """Result of a tool execution.

    Attributes:
        content: List of result content returned by the tool.
        status: The status of the tool execution ("success" or "error").
        toolUseId: The unique identifier of the tool use request that produced this result.
    """

    content: list[ToolResultContent]
    status: ToolResultStatus
    toolUseId: str

ToolUse

Bases: TypedDict

A request from the model to use a specific tool with the provided input.

Attributes:

Name Type Description
input Any

The input parameters for the tool. Can be any JSON-serializable type.

name str

The name of the tool to invoke.

toolUseId str

A unique identifier for this specific tool use request.

Source code in strands/types/tools.py
class ToolUse(TypedDict):
    """A request from the model to use a specific tool with the provided input.

    Attributes:
        input: The input parameters for the tool.
            Can be any JSON-serializable type.
        name: The name of the tool to invoke.
        toolUseId: A unique identifier for this specific tool use request.
    """

    input: Any
    name: str
    toolUseId: str
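A ToolUse request and the ToolResult that answers it are linked by toolUseId. A sketch with hypothetical names and ids:

```python
# The model's request to invoke a tool:
tool_use = {
    "toolUseId": "tooluse_abc",      # hypothetical id
    "name": "get_weather",           # hypothetical tool name
    "input": {"city": "Paris"},
}

# The result must echo the request's toolUseId so the two can be paired:
tool_result = {
    "toolUseId": tool_use["toolUseId"],
    "status": "success",             # "success" or "error"
    "content": [{"text": "18C, partly cloudy"}],
}

print(tool_result["toolUseId"] == tool_use["toolUseId"])  # True
```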

Tracer

Handles OpenTelemetry tracing.

This class provides a simple interface for creating and managing traces, with support for sending to OTLP endpoints.

When the OTEL_EXPORTER_OTLP_ENDPOINT environment variable is set, traces are sent to the OTLP endpoint.

The use_latest_genai_conventions and _include_tool_definitions attributes are enabled by including "gen_ai_latest_experimental" or "gen_ai_tool_definitions", respectively, in the OTEL_SEMCONV_STABILITY_OPT_IN environment variable.
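The parsing sketched below mirrors _parse_semconv_opt_in: the environment variable holds a comma-separated list, and each flag is a simple membership test on the parsed set:

```python
import os

# Simulate the environment for the sketch; in practice this is set by the deployment.
os.environ["OTEL_SEMCONV_STABILITY_OPT_IN"] = "gen_ai_latest_experimental, gen_ai_tool_definitions"

# Split on commas and strip whitespace, as the tracer does.
opt_in = {v.strip() for v in os.getenv("OTEL_SEMCONV_STABILITY_OPT_IN", "").split(",")}

use_latest = "gen_ai_latest_experimental" in opt_in
include_tool_defs = "gen_ai_tool_definitions" in opt_in
print(use_latest, include_tool_defs)  # True True
```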

Source code in strands/telemetry/tracer.py
class Tracer:
    """Handles OpenTelemetry tracing.

    This class provides a simple interface for creating and managing traces,
    with support for sending to OTLP endpoints.

    When the OTEL_EXPORTER_OTLP_ENDPOINT environment variable is set, traces
    are sent to the OTLP endpoint.

    Both attributes are controlled by including "gen_ai_latest_experimental" or "gen_ai_tool_definitions",
    respectively, in the OTEL_SEMCONV_STABILITY_OPT_IN environment variable.
    """

    def __init__(self) -> None:
        """Initialize the tracer."""
        self.service_name = __name__
        self.tracer_provider: Optional[trace_api.TracerProvider] = None
        self.tracer_provider = trace_api.get_tracer_provider()
        self.tracer = self.tracer_provider.get_tracer(self.service_name)
        ThreadingInstrumentor().instrument()

        # Read OTEL_SEMCONV_STABILITY_OPT_IN environment variable
        opt_in_values = self._parse_semconv_opt_in()
        ## To-do: should not set below attributes directly, use env var instead
        self.use_latest_genai_conventions = "gen_ai_latest_experimental" in opt_in_values
        self._include_tool_definitions = "gen_ai_tool_definitions" in opt_in_values

    def _parse_semconv_opt_in(self) -> set[str]:
        """Parse the OTEL_SEMCONV_STABILITY_OPT_IN environment variable.

        Returns:
            A set of opt-in values from the environment variable.
        """
        opt_in_env = os.getenv("OTEL_SEMCONV_STABILITY_OPT_IN", "")
        return {value.strip() for value in opt_in_env.split(",")}

    def _start_span(
        self,
        span_name: str,
        parent_span: Optional[Span] = None,
        attributes: Optional[Dict[str, AttributeValue]] = None,
        span_kind: trace_api.SpanKind = trace_api.SpanKind.INTERNAL,
    ) -> Span:
        """Generic helper method to start a span with common attributes.

        Args:
            span_name: Name of the span to create
            parent_span: Optional parent span to link this span to
            attributes: Dictionary of attributes to set on the span
            span_kind: OpenTelemetry SpanKind enum value

        Returns:
            The created span.
        """
        if not parent_span:
            parent_span = trace_api.get_current_span()

        context = None
        if parent_span and parent_span.is_recording() and parent_span != trace_api.INVALID_SPAN:
            context = trace_api.set_span_in_context(parent_span)

        span = self.tracer.start_span(name=span_name, context=context, kind=span_kind)

        # Set start time as a common attribute
        span.set_attribute("gen_ai.event.start_time", datetime.now(timezone.utc).isoformat())

        # Add all provided attributes
        if attributes:
            self._set_attributes(span, attributes)

        return span

    def _set_attributes(self, span: Span, attributes: Dict[str, AttributeValue]) -> None:
        """Set attributes on a span, handling different value types appropriately.

        Args:
            span: The span to set attributes on
            attributes: Dictionary of attributes to set
        """
        if not span:
            return

        for key, value in attributes.items():
            span.set_attribute(key, value)

    def _add_optional_usage_and_metrics_attributes(
        self, attributes: Dict[str, AttributeValue], usage: Usage, metrics: Metrics
    ) -> None:
        """Add optional usage and metrics attributes if they have values.

        Args:
            attributes: Dictionary to add attributes to
            usage: Token usage information from the model call
            metrics: Metrics from the model call
        """
        if "cacheReadInputTokens" in usage:
            attributes["gen_ai.usage.cache_read_input_tokens"] = usage["cacheReadInputTokens"]

        if "cacheWriteInputTokens" in usage:
            attributes["gen_ai.usage.cache_write_input_tokens"] = usage["cacheWriteInputTokens"]

        if metrics.get("timeToFirstByteMs", 0) > 0:
            attributes["gen_ai.server.time_to_first_token"] = metrics["timeToFirstByteMs"]

        if metrics.get("latencyMs", 0) > 0:
            attributes["gen_ai.server.request.duration"] = metrics["latencyMs"]

    def _end_span(
        self,
        span: Span,
        attributes: Optional[Dict[str, AttributeValue]] = None,
        error: Optional[Exception] = None,
    ) -> None:
        """Generic helper method to end a span.

        Args:
            span: The span to end
            attributes: Optional attributes to set before ending the span
            error: Optional exception if an error occurred
        """
        if not span:
            return

        try:
            # Set end time as a common attribute
            span.set_attribute("gen_ai.event.end_time", datetime.now(timezone.utc).isoformat())

            # Add any additional attributes
            if attributes:
                self._set_attributes(span, attributes)

            # Handle error if present
            if error:
                span.set_status(StatusCode.ERROR, str(error))
                span.record_exception(error)
            else:
                span.set_status(StatusCode.OK)
        except Exception as e:
            logger.warning("error=<%s> | error while ending span", e, exc_info=True)
        finally:
            span.end()
            # Force flush to ensure spans are exported
            if self.tracer_provider and hasattr(self.tracer_provider, "force_flush"):
                try:
                    self.tracer_provider.force_flush()
                except Exception as e:
                    logger.warning("error=<%s> | failed to force flush tracer provider", e)

    def end_span_with_error(self, span: Span, error_message: str, exception: Optional[Exception] = None) -> None:
        """End a span with error status.

        Args:
            span: The span to end.
            error_message: Error message to set in the span status.
            exception: Optional exception to record in the span.
        """
        if not span:
            return

        error = exception or Exception(error_message)
        self._end_span(span, error=error)

    def _add_event(self, span: Optional[Span], event_name: str, event_attributes: Attributes) -> None:
        """Add an event with attributes to a span.

        Args:
            span: The span to add the event to
            event_name: Name of the event
            event_attributes: Dictionary of attributes to set on the event
        """
        if not span:
            return

        span.add_event(event_name, attributes=event_attributes)

    def _get_event_name_for_message(self, message: Message) -> str:
        """Determine the appropriate OpenTelemetry event name for a message.

        According to OpenTelemetry semantic conventions v1.36.0, messages containing tool results
        should be labeled as 'gen_ai.tool.message' regardless of their role field.
        This ensures proper categorization of tool responses in traces.

        Note: The GenAI namespace is experimental and may change in future versions.

        Reference: https://github.com/open-telemetry/semantic-conventions/blob/v1.36.0/docs/gen-ai/gen-ai-events.md#event-gen_aitoolmessage

        Args:
            message: The message to determine the event name for

        Returns:
            The OpenTelemetry event name (e.g., 'gen_ai.user.message', 'gen_ai.tool.message')
        """
        # Check if the message contains a tool result
        for content_block in message.get("content", []):
            if "toolResult" in content_block:
                return "gen_ai.tool.message"

        return f"gen_ai.{message['role']}.message"

    def start_model_invoke_span(
        self,
        messages: Messages,
        parent_span: Optional[Span] = None,
        model_id: Optional[str] = None,
        custom_trace_attributes: Optional[Mapping[str, AttributeValue]] = None,
        **kwargs: Any,
    ) -> Span:
        """Start a new span for a model invocation.

        Args:
            messages: Messages being sent to the model.
            parent_span: Optional parent span to link this span to.
            model_id: Optional identifier for the model being invoked.
            custom_trace_attributes: Optional mapping of custom trace attributes to include in the span.
            **kwargs: Additional attributes to add to the span.

        Returns:
            The created span.
        """
        attributes: Dict[str, AttributeValue] = self._get_common_attributes(operation_name="chat")

        if custom_trace_attributes:
            attributes.update(custom_trace_attributes)

        if model_id:
            attributes["gen_ai.request.model"] = model_id

        # Add additional kwargs as attributes
        attributes.update({k: v for k, v in kwargs.items() if isinstance(v, (str, int, float, bool))})

        span = self._start_span("chat", parent_span, attributes=attributes, span_kind=trace_api.SpanKind.INTERNAL)
        self._add_event_messages(span, messages)

        return span

    def end_model_invoke_span(
        self,
        span: Span,
        message: Message,
        usage: Usage,
        metrics: Metrics,
        stop_reason: StopReason,
        error: Optional[Exception] = None,
    ) -> None:
        """End a model invocation span with results and metrics.

        Args:
            span: The span to end.
            message: The message response from the model.
            usage: Token usage information from the model call.
            metrics: Metrics from the model call.
            stop_reason (StopReason): The reason the model stopped generating.
            error: Optional exception if the model call failed.
        """
        attributes: Dict[str, AttributeValue] = {
            "gen_ai.usage.prompt_tokens": usage["inputTokens"],
            "gen_ai.usage.input_tokens": usage["inputTokens"],
            "gen_ai.usage.completion_tokens": usage["outputTokens"],
            "gen_ai.usage.output_tokens": usage["outputTokens"],
            "gen_ai.usage.total_tokens": usage["totalTokens"],
        }

        # Add optional attributes if they have values
        self._add_optional_usage_and_metrics_attributes(attributes, usage, metrics)

        if self.use_latest_genai_conventions:
            self._add_event(
                span,
                "gen_ai.client.inference.operation.details",
                {
                    "gen_ai.output.messages": serialize(
                        [
                            {
                                "role": message["role"],
                                "parts": self._map_content_blocks_to_otel_parts(message["content"]),
                                "finish_reason": str(stop_reason),
                            }
                        ]
                    ),
                },
            )
        else:
            self._add_event(
                span,
                "gen_ai.choice",
                event_attributes={"finish_reason": str(stop_reason), "message": serialize(message["content"])},
            )

        self._end_span(span, attributes, error)

    def start_tool_call_span(
        self,
        tool: ToolUse,
        parent_span: Optional[Span] = None,
        custom_trace_attributes: Optional[Mapping[str, AttributeValue]] = None,
        **kwargs: Any,
    ) -> Span:
        """Start a new span for a tool call.

        Args:
            tool: The tool being used.
            parent_span: Optional parent span to link this span to.
            custom_trace_attributes: Optional mapping of custom trace attributes to include in the span.
            **kwargs: Additional attributes to add to the span.

        Returns:
            The created span.
        """
        attributes: Dict[str, AttributeValue] = self._get_common_attributes(operation_name="execute_tool")
        attributes.update(
            {
                "gen_ai.tool.name": tool["name"],
                "gen_ai.tool.call.id": tool["toolUseId"],
            }
        )

        if custom_trace_attributes:
            attributes.update(custom_trace_attributes)
        # Add additional kwargs as attributes
        attributes.update(kwargs)

        span_name = f"execute_tool {tool['name']}"
        span = self._start_span(span_name, parent_span, attributes=attributes, span_kind=trace_api.SpanKind.INTERNAL)

        if self.use_latest_genai_conventions:
            self._add_event(
                span,
                "gen_ai.client.inference.operation.details",
                {
                    "gen_ai.input.messages": serialize(
                        [
                            {
                                "role": "tool",
                                "parts": [
                                    {
                                        "type": "tool_call",
                                        "name": tool["name"],
                                        "id": tool["toolUseId"],
                                        "arguments": tool["input"],
                                    }
                                ],
                            }
                        ]
                    )
                },
            )
        else:
            self._add_event(
                span,
                "gen_ai.tool.message",
                event_attributes={
                    "role": "tool",
                    "content": serialize(tool["input"]),
                    "id": tool["toolUseId"],
                },
            )

        return span

    def end_tool_call_span(
        self, span: Span, tool_result: Optional[ToolResult], error: Optional[Exception] = None
    ) -> None:
        """End a tool call span with results.

        Args:
            span: The span to end.
            tool_result: The result from the tool execution.
            error: Optional exception if the tool call failed.
        """
        attributes: Dict[str, AttributeValue] = {}
        if tool_result is not None:
            status = tool_result.get("status")
            status_str = str(status) if status is not None else ""

            attributes.update(
                {
                    "gen_ai.tool.status": status_str,
                }
            )

            if self.use_latest_genai_conventions:
                self._add_event(
                    span,
                    "gen_ai.client.inference.operation.details",
                    {
                        "gen_ai.output.messages": serialize(
                            [
                                {
                                    "role": "tool",
                                    "parts": [
                                        {
                                            "type": "tool_call_response",
                                            "id": tool_result.get("toolUseId", ""),
                                            "response": tool_result.get("content"),
                                        }
                                    ],
                                }
                            ]
                        )
                    },
                )
            else:
                self._add_event(
                    span,
                    "gen_ai.choice",
                    event_attributes={
                        "message": serialize(tool_result.get("content")),
                        "id": tool_result.get("toolUseId", ""),
                    },
                )

        self._end_span(span, attributes, error)

    def start_event_loop_cycle_span(
        self,
        invocation_state: Any,
        messages: Messages,
        parent_span: Optional[Span] = None,
        custom_trace_attributes: Optional[Mapping[str, AttributeValue]] = None,
        **kwargs: Any,
    ) -> Optional[Span]:
        """Start a new span for an event loop cycle.

        Args:
            invocation_state: Arguments for the event loop cycle.
            messages: Messages being processed in this cycle.
            parent_span: Optional parent span to link this span to.
            custom_trace_attributes: Optional mapping of custom trace attributes to include in the span.
            **kwargs: Additional attributes to add to the span.

        Returns:
            The created span, or None if tracing is not enabled.
        """
        event_loop_cycle_id = str(invocation_state.get("event_loop_cycle_id"))
        parent_span = parent_span if parent_span else invocation_state.get("event_loop_parent_span")

        attributes: Dict[str, AttributeValue] = {
            "event_loop.cycle_id": event_loop_cycle_id,
        }

        if custom_trace_attributes:
            attributes.update(custom_trace_attributes)

        if "event_loop_parent_cycle_id" in invocation_state:
            attributes["event_loop.parent_cycle_id"] = str(invocation_state["event_loop_parent_cycle_id"])

        # Add additional kwargs as attributes
        attributes.update({k: v for k, v in kwargs.items() if isinstance(v, (str, int, float, bool))})

        span_name = "execute_event_loop_cycle"
        span = self._start_span(span_name, parent_span, attributes)
        self._add_event_messages(span, messages)

        return span

    def end_event_loop_cycle_span(
        self,
        span: Span,
        message: Message,
        tool_result_message: Optional[Message] = None,
        error: Optional[Exception] = None,
    ) -> None:
        """End an event loop cycle span with results.

        Args:
            span: The span to end.
            message: The message response from this cycle.
            tool_result_message: Optional tool result message if a tool was called.
            error: Optional exception if the cycle failed.
        """
        attributes: Dict[str, AttributeValue] = {}
        event_attributes: Dict[str, AttributeValue] = {"message": serialize(message["content"])}

        if tool_result_message:
            event_attributes["tool.result"] = serialize(tool_result_message["content"])

            if self.use_latest_genai_conventions:
                self._add_event(
                    span,
                    "gen_ai.client.inference.operation.details",
                    {
                        "gen_ai.output.messages": serialize(
                            [
                                {
                                    "role": tool_result_message["role"],
                                    "parts": self._map_content_blocks_to_otel_parts(tool_result_message["content"]),
                                }
                            ]
                        )
                    },
                )
            else:
                self._add_event(span, "gen_ai.choice", event_attributes=event_attributes)
        self._end_span(span, attributes, error)

    def start_agent_span(
        self,
        messages: Messages,
        agent_name: str,
        model_id: Optional[str] = None,
        tools: Optional[list] = None,
        custom_trace_attributes: Optional[Mapping[str, AttributeValue]] = None,
        tools_config: Optional[dict] = None,
        **kwargs: Any,
    ) -> Span:
        """Start a new span for an agent invocation.

        Args:
            messages: List of messages being sent to the agent.
            agent_name: Name of the agent.
            model_id: Optional model identifier.
            tools: Optional list of tools being used.
            custom_trace_attributes: Optional mapping of custom trace attributes to include in the span.
            tools_config: Optional dictionary of tool configurations.
            **kwargs: Additional attributes to add to the span.

        Returns:
            The created span.
        """
        attributes: Dict[str, AttributeValue] = self._get_common_attributes(operation_name="invoke_agent")
        attributes.update(
            {
                "gen_ai.agent.name": agent_name,
            }
        )

        if model_id:
            attributes["gen_ai.request.model"] = model_id

        if tools:
            attributes["gen_ai.agent.tools"] = serialize(tools)

        if self._include_tool_definitions and tools_config:
            try:
                tool_definitions = self._construct_tool_definitions(tools_config)
                attributes["gen_ai.tool.definitions"] = serialize(tool_definitions)
            except Exception:
                # A failure in telemetry should not crash the agent
                logger.warning("failed to attach tool metadata to agent span", exc_info=True)

        # Add custom trace attributes if provided
        if custom_trace_attributes:
            attributes.update(custom_trace_attributes)

        # Add additional kwargs as attributes
        attributes.update({k: v for k, v in kwargs.items() if isinstance(v, (str, int, float, bool))})

        span = self._start_span(
            f"invoke_agent {agent_name}", attributes=attributes, span_kind=trace_api.SpanKind.INTERNAL
        )
        self._add_event_messages(span, messages)

        return span

    def end_agent_span(
        self,
        span: Span,
        response: Optional[AgentResult] = None,
        error: Optional[Exception] = None,
    ) -> None:
        """End an agent span with results and metrics.

        Args:
            span: The span to end.
            response: The response from the agent.
            error: Any error that occurred.
        """
        attributes: Dict[str, AttributeValue] = {}

        if response:
            if self.use_latest_genai_conventions:
                self._add_event(
                    span,
                    "gen_ai.client.inference.operation.details",
                    {
                        "gen_ai.output.messages": serialize(
                            [
                                {
                                    "role": "assistant",
                                    "parts": [{"type": "text", "content": str(response)}],
                                    "finish_reason": str(response.stop_reason),
                                }
                            ]
                        )
                    },
                )
            else:
                self._add_event(
                    span,
                    "gen_ai.choice",
                    event_attributes={"message": str(response), "finish_reason": str(response.stop_reason)},
                )

            if hasattr(response, "metrics") and hasattr(response.metrics, "accumulated_usage"):
                if "langfuse" in os.getenv("OTEL_EXPORTER_OTLP_ENDPOINT", "") or "langfuse" in os.getenv(
                    "OTEL_EXPORTER_OTLP_TRACES_ENDPOINT", ""
                ):
                    attributes.update({"langfuse.observation.type": "span"})
                accumulated_usage = response.metrics.accumulated_usage
                attributes.update(
                    {
                        "gen_ai.usage.prompt_tokens": accumulated_usage["inputTokens"],
                        "gen_ai.usage.completion_tokens": accumulated_usage["outputTokens"],
                        "gen_ai.usage.input_tokens": accumulated_usage["inputTokens"],
                        "gen_ai.usage.output_tokens": accumulated_usage["outputTokens"],
                        "gen_ai.usage.total_tokens": accumulated_usage["totalTokens"],
                        "gen_ai.usage.cache_read_input_tokens": accumulated_usage.get("cacheReadInputTokens", 0),
                        "gen_ai.usage.cache_write_input_tokens": accumulated_usage.get("cacheWriteInputTokens", 0),
                    }
                )

        self._end_span(span, attributes, error)

    def _construct_tool_definitions(self, tools_config: dict) -> list[dict[str, Any]]:
        """Constructs a list of tool definitions from the provided tools_config."""
        return [
            {
                "name": name,
                "description": spec.get("description"),
                "inputSchema": spec.get("inputSchema"),
                "outputSchema": spec.get("outputSchema"),
            }
            for name, spec in tools_config.items()
        ]

    def start_multiagent_span(
        self,
        task: MultiAgentInput,
        instance: str,
        custom_trace_attributes: Optional[Mapping[str, AttributeValue]] = None,
    ) -> Span:
        """Start a new span for swarm invocation."""
        operation = f"invoke_{instance}"
        attributes: Dict[str, AttributeValue] = self._get_common_attributes(operation)
        attributes.update(
            {
                "gen_ai.agent.name": instance,
            }
        )

        if custom_trace_attributes:
            attributes.update(custom_trace_attributes)

        span = self._start_span(operation, attributes=attributes, span_kind=trace_api.SpanKind.CLIENT)

        if self.use_latest_genai_conventions:
            parts: list[dict[str, Any]] = []
            if isinstance(task, list):
                parts = self._map_content_blocks_to_otel_parts(task)
            else:
                parts = [{"type": "text", "content": task}]
            self._add_event(
                span,
                "gen_ai.client.inference.operation.details",
                {"gen_ai.input.messages": serialize([{"role": "user", "parts": parts}])},
            )
        else:
            self._add_event(
                span,
                "gen_ai.user.message",
                event_attributes={"content": serialize(task) if isinstance(task, list) else task},
            )

        return span

    def end_swarm_span(
        self,
        span: Span,
        result: Optional[str] = None,
    ) -> None:
        """End a swarm span with results."""
        if result:
            if self.use_latest_genai_conventions:
                self._add_event(
                    span,
                    "gen_ai.client.inference.operation.details",
                    {
                        "gen_ai.output.messages": serialize(
                            [
                                {
                                    "role": "assistant",
                                    "parts": [{"type": "text", "content": result}],
                                }
                            ]
                        )
                    },
                )
            else:
                self._add_event(
                    span,
                    "gen_ai.choice",
                    event_attributes={"message": result},
                )

    def _get_common_attributes(
        self,
        operation_name: str,
    ) -> Dict[str, AttributeValue]:
        """Returns a dictionary of common attributes based on the convention version used.

        Args:
            operation_name: The name of the operation.

        Returns:
            A dictionary of attributes following the appropriate GenAI conventions.
        """
        common_attributes = {"gen_ai.operation.name": operation_name}
        if self.use_latest_genai_conventions:
            common_attributes.update(
                {
                    "gen_ai.provider.name": "strands-agents",
                }
            )
        else:
            common_attributes.update(
                {
                    "gen_ai.system": "strands-agents",
                }
            )
        return dict(common_attributes)

    def _add_event_messages(self, span: Span, messages: Messages) -> None:
        """Adds messages as event to the provided span based on the current GenAI conventions.

        Args:
            span: The span to which events will be added.
            messages: List of messages being sent to the agent.
        """
        if self.use_latest_genai_conventions:
            input_messages: list = []
            for message in messages:
                input_messages.append(
                    {"role": message["role"], "parts": self._map_content_blocks_to_otel_parts(message["content"])}
                )
            self._add_event(
                span, "gen_ai.client.inference.operation.details", {"gen_ai.input.messages": serialize(input_messages)}
            )
        else:
            for message in messages:
                self._add_event(
                    span,
                    self._get_event_name_for_message(message),
                    {"content": serialize(message["content"])},
                )

    def _map_content_blocks_to_otel_parts(
        self, content_blocks: list[ContentBlock] | list[InterruptResponseContent]
    ) -> list[dict[str, Any]]:
        """Map content blocks to OpenTelemetry parts format."""
        parts: list[dict[str, Any]] = []

        for block in cast(list[dict[str, Any]], content_blocks):
            if "interruptResponse" in block:
                interrupt_response = block["interruptResponse"]
                parts.append(
                    {
                        "type": "interrupt_response",
                        "id": interrupt_response["interruptId"],
                        "response": interrupt_response["response"],
                    },
                )
            elif "text" in block:
                # Standard TextPart
                parts.append({"type": "text", "content": block["text"]})
            elif "toolUse" in block:
                # Standard ToolCallRequestPart
                tool_use = block["toolUse"]
                parts.append(
                    {
                        "type": "tool_call",
                        "name": tool_use["name"],
                        "id": tool_use["toolUseId"],
                        "arguments": tool_use["input"],
                    }
                )
            elif "toolResult" in block:
                # Standard ToolCallResponsePart
                tool_result = block["toolResult"]
                parts.append(
                    {
                        "type": "tool_call_response",
                        "id": tool_result["toolUseId"],
                        "response": tool_result["content"],
                    }
                )
            else:
                # For all other ContentBlock types, use the key as type and value as content
                for key, value in block.items():
                    parts.append({"type": key, "content": value})
        return parts
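
The mapping above can be exercised in isolation. Below is a minimal standalone sketch mirroring `_map_content_blocks_to_otel_parts`, with hypothetical input data; the block shapes (`text`, `toolUse`, `toolResult`) follow the source, and the fallback branch treats any other key as the part type.

```python
from typing import Any


def map_blocks_to_parts(blocks: list[dict[str, Any]]) -> list[dict[str, Any]]:
    """Illustrative mirror of _map_content_blocks_to_otel_parts."""
    parts: list[dict[str, Any]] = []
    for block in blocks:
        if "text" in block:
            parts.append({"type": "text", "content": block["text"]})
        elif "toolUse" in block:
            tool_use = block["toolUse"]
            parts.append(
                {
                    "type": "tool_call",
                    "name": tool_use["name"],
                    "id": tool_use["toolUseId"],
                    "arguments": tool_use["input"],
                }
            )
        elif "toolResult" in block:
            tool_result = block["toolResult"]
            parts.append(
                {
                    "type": "tool_call_response",
                    "id": tool_result["toolUseId"],
                    "response": tool_result["content"],
                }
            )
        else:
            # Fallback: the block key becomes the part type, its value the content.
            parts.extend({"type": key, "content": value} for key, value in block.items())
    return parts


# Hypothetical content blocks, for illustration only.
blocks = [
    {"text": "What is the weather in Paris?"},
    {"toolUse": {"name": "get_weather", "toolUseId": "t-1", "input": {"city": "Paris"}}},
]
parts = map_blocks_to_parts(blocks)
```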

__init__()

Initialize the tracer.

Source code in strands/telemetry/tracer.py
def __init__(self) -> None:
    """Initialize the tracer."""
    self.service_name = __name__
    self.tracer_provider: Optional[trace_api.TracerProvider] = None
    self.tracer_provider = trace_api.get_tracer_provider()
    self.tracer = self.tracer_provider.get_tracer(self.service_name)
    ThreadingInstrumentor().instrument()

    # Read OTEL_SEMCONV_STABILITY_OPT_IN environment variable
    opt_in_values = self._parse_semconv_opt_in()
    # TODO: avoid setting these attributes directly; derive them from the env var instead
    self.use_latest_genai_conventions = "gen_ai_latest_experimental" in opt_in_values
    self._include_tool_definitions = "gen_ai_tool_definitions" in opt_in_values
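
`_parse_semconv_opt_in` is not shown above. Assuming it follows the OpenTelemetry convention of a comma-separated `OTEL_SEMCONV_STABILITY_OPT_IN` value, the flag derivation could be sketched like this (the function name and parsing details here are assumptions, not the library's actual implementation):

```python
import os


def parse_semconv_opt_in() -> set[str]:
    # OTEL_SEMCONV_STABILITY_OPT_IN is a comma-separated list of opt-in values
    # per the OpenTelemetry semantic-convention stability migration plan.
    raw = os.environ.get("OTEL_SEMCONV_STABILITY_OPT_IN", "")
    return {value.strip() for value in raw.split(",") if value.strip()}


os.environ["OTEL_SEMCONV_STABILITY_OPT_IN"] = "gen_ai_latest_experimental, gen_ai_tool_definitions"
opt_in_values = parse_semconv_opt_in()
use_latest_genai_conventions = "gen_ai_latest_experimental" in opt_in_values
include_tool_definitions = "gen_ai_tool_definitions" in opt_in_values
```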

end_agent_span(span, response=None, error=None)

End an agent span with results and metrics.

Parameters:

- span (Span): The span to end. [required]
- response (Optional[AgentResult]): The response from the agent. Default: None
- error (Optional[Exception]): Any error that occurred. Default: None
Source code in strands/telemetry/tracer.py
def end_agent_span(
    self,
    span: Span,
    response: Optional[AgentResult] = None,
    error: Optional[Exception] = None,
) -> None:
    """End an agent span with results and metrics.

    Args:
        span: The span to end.
        response: The response from the agent.
        error: Any error that occurred.
    """
    attributes: Dict[str, AttributeValue] = {}

    if response:
        if self.use_latest_genai_conventions:
            self._add_event(
                span,
                "gen_ai.client.inference.operation.details",
                {
                    "gen_ai.output.messages": serialize(
                        [
                            {
                                "role": "assistant",
                                "parts": [{"type": "text", "content": str(response)}],
                                "finish_reason": str(response.stop_reason),
                            }
                        ]
                    )
                },
            )
        else:
            self._add_event(
                span,
                "gen_ai.choice",
                event_attributes={"message": str(response), "finish_reason": str(response.stop_reason)},
            )

        if hasattr(response, "metrics") and hasattr(response.metrics, "accumulated_usage"):
            if "langfuse" in os.getenv("OTEL_EXPORTER_OTLP_ENDPOINT", "") or "langfuse" in os.getenv(
                "OTEL_EXPORTER_OTLP_TRACES_ENDPOINT", ""
            ):
                attributes.update({"langfuse.observation.type": "span"})
            accumulated_usage = response.metrics.accumulated_usage
            attributes.update(
                {
                    "gen_ai.usage.prompt_tokens": accumulated_usage["inputTokens"],
                    "gen_ai.usage.completion_tokens": accumulated_usage["outputTokens"],
                    "gen_ai.usage.input_tokens": accumulated_usage["inputTokens"],
                    "gen_ai.usage.output_tokens": accumulated_usage["outputTokens"],
                    "gen_ai.usage.total_tokens": accumulated_usage["totalTokens"],
                    "gen_ai.usage.cache_read_input_tokens": accumulated_usage.get("cacheReadInputTokens", 0),
                    "gen_ai.usage.cache_write_input_tokens": accumulated_usage.get("cacheWriteInputTokens", 0),
                }
            )

    self._end_span(span, attributes, error)
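
The token-usage mapping above emits both the older `prompt`/`completion` attribute names and the newer `input`/`output` names. A standalone sketch of that mapping, applied to a hypothetical `accumulated_usage` dict:

```python
from typing import Any


def usage_attributes(accumulated_usage: dict[str, int]) -> dict[str, Any]:
    # Emit both legacy (prompt/completion) and current (input/output) names
    # so consumers of either convention can read the span.
    return {
        "gen_ai.usage.prompt_tokens": accumulated_usage["inputTokens"],
        "gen_ai.usage.completion_tokens": accumulated_usage["outputTokens"],
        "gen_ai.usage.input_tokens": accumulated_usage["inputTokens"],
        "gen_ai.usage.output_tokens": accumulated_usage["outputTokens"],
        "gen_ai.usage.total_tokens": accumulated_usage["totalTokens"],
        "gen_ai.usage.cache_read_input_tokens": accumulated_usage.get("cacheReadInputTokens", 0),
        "gen_ai.usage.cache_write_input_tokens": accumulated_usage.get("cacheWriteInputTokens", 0),
    }


attrs = usage_attributes({"inputTokens": 120, "outputTokens": 30, "totalTokens": 150})
```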

end_event_loop_cycle_span(span, message, tool_result_message=None, error=None)

End an event loop cycle span with results.

Parameters:

- span (Span): The span to end. [required]
- message (Message): The message response from this cycle. [required]
- tool_result_message (Optional[Message]): Optional tool result message if a tool was called. Default: None
- error (Optional[Exception]): Optional exception if the cycle failed. Default: None
Source code in strands/telemetry/tracer.py
def end_event_loop_cycle_span(
    self,
    span: Span,
    message: Message,
    tool_result_message: Optional[Message] = None,
    error: Optional[Exception] = None,
) -> None:
    """End an event loop cycle span with results.

    Args:
        span: The span to end.
        message: The message response from this cycle.
        tool_result_message: Optional tool result message if a tool was called.
        error: Optional exception if the cycle failed.
    """
    attributes: Dict[str, AttributeValue] = {}
    event_attributes: Dict[str, AttributeValue] = {"message": serialize(message["content"])}

    if tool_result_message:
        event_attributes["tool.result"] = serialize(tool_result_message["content"])

        if self.use_latest_genai_conventions:
            self._add_event(
                span,
                "gen_ai.client.inference.operation.details",
                {
                    "gen_ai.output.messages": serialize(
                        [
                            {
                                "role": tool_result_message["role"],
                                "parts": self._map_content_blocks_to_otel_parts(tool_result_message["content"]),
                            }
                        ]
                    )
                },
            )
        else:
            self._add_event(span, "gen_ai.choice", event_attributes=event_attributes)
    self._end_span(span, attributes, error)

end_model_invoke_span(span, message, usage, metrics, stop_reason, error=None)

End a model invocation span with results and metrics.

Parameters:

- span (Span): The span to end. [required]
- message (Message): The message response from the model. [required]
- usage (Usage): Token usage information from the model call. [required]
- metrics (Metrics): Metrics from the model call. [required]
- stop_reason (StopReason): The reason the model stopped generating. [required]
- error (Optional[Exception]): Optional exception if the model call failed. Default: None
Source code in strands/telemetry/tracer.py
def end_model_invoke_span(
    self,
    span: Span,
    message: Message,
    usage: Usage,
    metrics: Metrics,
    stop_reason: StopReason,
    error: Optional[Exception] = None,
) -> None:
    """End a model invocation span with results and metrics.

    Args:
        span: The span to end.
        message: The message response from the model.
        usage: Token usage information from the model call.
        metrics: Metrics from the model call.
        stop_reason (StopReason): The reason the model stopped generating.
        error: Optional exception if the model call failed.
    """
    attributes: Dict[str, AttributeValue] = {
        "gen_ai.usage.prompt_tokens": usage["inputTokens"],
        "gen_ai.usage.input_tokens": usage["inputTokens"],
        "gen_ai.usage.completion_tokens": usage["outputTokens"],
        "gen_ai.usage.output_tokens": usage["outputTokens"],
        "gen_ai.usage.total_tokens": usage["totalTokens"],
    }

    # Add optional attributes if they have values
    self._add_optional_usage_and_metrics_attributes(attributes, usage, metrics)

    if self.use_latest_genai_conventions:
        self._add_event(
            span,
            "gen_ai.client.inference.operation.details",
            {
                "gen_ai.output.messages": serialize(
                    [
                        {
                            "role": message["role"],
                            "parts": self._map_content_blocks_to_otel_parts(message["content"]),
                            "finish_reason": str(stop_reason),
                        }
                    ]
                ),
            },
        )
    else:
        self._add_event(
            span,
            "gen_ai.choice",
            event_attributes={"finish_reason": str(stop_reason), "message": serialize(message["content"])},
        )

    self._end_span(span, attributes, error)
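
Under the latest conventions, the method above records the model response as a `gen_ai.output.messages` event attribute. A sketch of that payload shape, assuming `serialize` performs JSON serialization (an assumption; the actual helper is defined elsewhere in the library):

```python
import json
from typing import Any


def output_messages_event(role: str, parts: list[dict[str, Any]], stop_reason: str) -> dict[str, str]:
    # Build the single-message payload recorded on the
    # "gen_ai.client.inference.operation.details" event.
    payload = [{"role": role, "parts": parts, "finish_reason": stop_reason}]
    return {"gen_ai.output.messages": json.dumps(payload)}


event = output_messages_event("assistant", [{"type": "text", "content": "Hello"}], "end_turn")
```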

end_span_with_error(span, error_message, exception=None)

End a span with error status.

Parameters:

- span (Span): The span to end. [required]
- error_message (str): Error message to set in the span status. [required]
- exception (Optional[Exception]): Optional exception to record in the span. Default: None
Source code in strands/telemetry/tracer.py
def end_span_with_error(self, span: Span, error_message: str, exception: Optional[Exception] = None) -> None:
    """End a span with error status.

    Args:
        span: The span to end.
        error_message: Error message to set in the span status.
        exception: Optional exception to record in the span.
    """
    if not span:
        return

    error = exception or Exception(error_message)
    self._end_span(span, error=error)

end_swarm_span(span, result=None)

End a swarm span with results.

Source code in strands/telemetry/tracer.py
def end_swarm_span(
    self,
    span: Span,
    result: Optional[str] = None,
) -> None:
    """End a swarm span with results."""
    if result:
        if self.use_latest_genai_conventions:
            self._add_event(
                span,
                "gen_ai.client.inference.operation.details",
                {
                    "gen_ai.output.messages": serialize(
                        [
                            {
                                "role": "assistant",
                                "parts": [{"type": "text", "content": result}],
                            }
                        ]
                    )
                },
            )
        else:
            self._add_event(
                span,
                "gen_ai.choice",
                event_attributes={"message": result},
            )

end_tool_call_span(span, tool_result, error=None)

End a tool call span with results.

Parameters:

- span (Span): The span to end. [required]
- tool_result (Optional[ToolResult]): The result from the tool execution. [required]
- error (Optional[Exception]): Optional exception if the tool call failed. Default: None
Source code in strands/telemetry/tracer.py
def end_tool_call_span(
    self, span: Span, tool_result: Optional[ToolResult], error: Optional[Exception] = None
) -> None:
    """End a tool call span with results.

    Args:
        span: The span to end.
        tool_result: The result from the tool execution.
        error: Optional exception if the tool call failed.
    """
    attributes: Dict[str, AttributeValue] = {}
    if tool_result is not None:
        status = tool_result.get("status")
        status_str = str(status) if status is not None else ""

        attributes.update(
            {
                "gen_ai.tool.status": status_str,
            }
        )

        if self.use_latest_genai_conventions:
            self._add_event(
                span,
                "gen_ai.client.inference.operation.details",
                {
                    "gen_ai.output.messages": serialize(
                        [
                            {
                                "role": "tool",
                                "parts": [
                                    {
                                        "type": "tool_call_response",
                                        "id": tool_result.get("toolUseId", ""),
                                        "response": tool_result.get("content"),
                                    }
                                ],
                            }
                        ]
                    )
                },
            )
        else:
            self._add_event(
                span,
                "gen_ai.choice",
                event_attributes={
                    "message": serialize(tool_result.get("content")),
                    "id": tool_result.get("toolUseId", ""),
                },
            )

    self._end_span(span, attributes, error)
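
The method above picks one of two event shapes depending on the convention version. A standalone sketch of that branch, again assuming `serialize` is JSON serialization and using a hypothetical tool result:

```python
import json
from typing import Any


def tool_result_event(tool_result: dict[str, Any], use_latest: bool) -> tuple[str, dict[str, str]]:
    # Shapes mirror end_tool_call_span: new conventions wrap the result in a
    # tool_call_response part; legacy conventions use a flat gen_ai.choice event.
    if use_latest:
        message = {
            "role": "tool",
            "parts": [
                {
                    "type": "tool_call_response",
                    "id": tool_result.get("toolUseId", ""),
                    "response": tool_result.get("content"),
                }
            ],
        }
        return "gen_ai.client.inference.operation.details", {"gen_ai.output.messages": json.dumps([message])}
    return "gen_ai.choice", {
        "message": json.dumps(tool_result.get("content")),
        "id": tool_result.get("toolUseId", ""),
    }


# Hypothetical tool result, for illustration only.
result = {"toolUseId": "t-1", "status": "success", "content": [{"text": "22C"}]}
name_latest, attrs_latest = tool_result_event(result, use_latest=True)
name_legacy, attrs_legacy = tool_result_event(result, use_latest=False)
```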

start_agent_span(messages, agent_name, model_id=None, tools=None, custom_trace_attributes=None, tools_config=None, **kwargs)

Start a new span for an agent invocation.

Parameters:

- messages (Messages): List of messages being sent to the agent. [required]
- agent_name (str): Name of the agent. [required]
- model_id (Optional[str]): Optional model identifier. Default: None
- tools (Optional[list]): Optional list of tools being used. Default: None
- custom_trace_attributes (Optional[Mapping[str, AttributeValue]]): Optional mapping of custom trace attributes to include in the span. Default: None
- tools_config (Optional[dict]): Optional dictionary of tool configurations. Default: None
- **kwargs (Any): Additional attributes to add to the span. Default: {}

Returns:

- Span: The created span.

Source code in strands/telemetry/tracer.py
def start_agent_span(
    self,
    messages: Messages,
    agent_name: str,
    model_id: Optional[str] = None,
    tools: Optional[list] = None,
    custom_trace_attributes: Optional[Mapping[str, AttributeValue]] = None,
    tools_config: Optional[dict] = None,
    **kwargs: Any,
) -> Span:
    """Start a new span for an agent invocation.

    Args:
        messages: List of messages being sent to the agent.
        agent_name: Name of the agent.
        model_id: Optional model identifier.
        tools: Optional list of tools being used.
        custom_trace_attributes: Optional mapping of custom trace attributes to include in the span.
        tools_config: Optional dictionary of tool configurations.
        **kwargs: Additional attributes to add to the span.

    Returns:
        The created span, or None if tracing is not enabled.
    """
    attributes: Dict[str, AttributeValue] = self._get_common_attributes(operation_name="invoke_agent")
    attributes.update(
        {
            "gen_ai.agent.name": agent_name,
        }
    )

    if model_id:
        attributes["gen_ai.request.model"] = model_id

    if tools:
        attributes["gen_ai.agent.tools"] = serialize(tools)

    if self._include_tool_definitions and tools_config:
        try:
            tool_definitions = self._construct_tool_definitions(tools_config)
            attributes["gen_ai.tool.definitions"] = serialize(tool_definitions)
        except Exception:
            # A failure in telemetry should not crash the agent
            logger.warning("failed to attach tool metadata to agent span", exc_info=True)

    # Add custom trace attributes if provided
    if custom_trace_attributes:
        attributes.update(custom_trace_attributes)

    # Add additional kwargs as attributes
    attributes.update({k: v for k, v in kwargs.items() if isinstance(v, (str, int, float, bool))})

    span = self._start_span(
        f"invoke_agent {agent_name}", attributes=attributes, span_kind=trace_api.SpanKind.INTERNAL
    )
    self._add_event_messages(span, messages)

    return span
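The `**kwargs` handling above keeps only OpenTelemetry-compatible primitive values before merging them into the span attributes. A minimal standalone sketch of that filter (the helper name `filter_primitive_attributes` is illustrative, not part of the library):

```python
from typing import Any, Dict


def filter_primitive_attributes(kwargs: Dict[str, Any]) -> Dict[str, Any]:
    """Keep only str/int/float/bool values, mirroring the span-attribute filter above."""
    return {k: v for k, v in kwargs.items() if isinstance(v, (str, int, float, bool))}


attrs = filter_primitive_attributes(
    {"session_id": "abc", "retries": 3, "raw_config": {"nested": True}}
)
# "raw_config" is dropped because dicts are not valid OTel attribute values.
```

Complex values such as the tool list are serialized to JSON strings first for the same reason.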

start_event_loop_cycle_span(invocation_state, messages, parent_span=None, custom_trace_attributes=None, **kwargs)

Start a new span for an event loop cycle.

Parameters:

Name Type Description Default
invocation_state Any

Arguments for the event loop cycle.

required
messages Messages

Messages being processed in this cycle.

required
parent_span Optional[Span]

Optional parent span to link this span to.

None
custom_trace_attributes Optional[Mapping[str, AttributeValue]]

Optional mapping of custom trace attributes to include in the span.

None
**kwargs Any

Additional attributes to add to the span.

{}

Returns:

Type Description
Optional[Span]

The created span, or None if tracing is not enabled.

Source code in strands/telemetry/tracer.py
def start_event_loop_cycle_span(
    self,
    invocation_state: Any,
    messages: Messages,
    parent_span: Optional[Span] = None,
    custom_trace_attributes: Optional[Mapping[str, AttributeValue]] = None,
    **kwargs: Any,
) -> Optional[Span]:
    """Start a new span for an event loop cycle.

    Args:
        invocation_state: Arguments for the event loop cycle.
        messages: Messages being processed in this cycle.
        parent_span: Optional parent span to link this span to.
        custom_trace_attributes: Optional mapping of custom trace attributes to include in the span.
        **kwargs: Additional attributes to add to the span.

    Returns:
        The created span, or None if tracing is not enabled.
    """
    event_loop_cycle_id = str(invocation_state.get("event_loop_cycle_id"))
    parent_span = parent_span if parent_span else invocation_state.get("event_loop_parent_span")

    attributes: Dict[str, AttributeValue] = {
        "event_loop.cycle_id": event_loop_cycle_id,
    }

    if custom_trace_attributes:
        attributes.update(custom_trace_attributes)

    if "event_loop_parent_cycle_id" in invocation_state:
        attributes["event_loop.parent_cycle_id"] = str(invocation_state["event_loop_parent_cycle_id"])

    # Add additional kwargs as attributes
    attributes.update({k: v for k, v in kwargs.items() if isinstance(v, (str, int, float, bool))})

    span_name = "execute_event_loop_cycle"
    span = self._start_span(span_name, parent_span, attributes)
    self._add_event_messages(span, messages)

    return span
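The cycle attributes above are derived entirely from the `invocation_state` dict. A standalone sketch of that derivation, without the span machinery:

```python
from typing import Any, Dict


def build_cycle_attributes(invocation_state: Dict[str, Any]) -> Dict[str, str]:
    """Mirror the cycle-id attribute assembly in start_event_loop_cycle_span."""
    attributes = {"event_loop.cycle_id": str(invocation_state.get("event_loop_cycle_id"))}
    # The parent cycle id is only attached when the key is present.
    if "event_loop_parent_cycle_id" in invocation_state:
        attributes["event_loop.parent_cycle_id"] = str(
            invocation_state["event_loop_parent_cycle_id"]
        )
    return attributes


attrs = build_cycle_attributes({"event_loop_cycle_id": 7, "event_loop_parent_cycle_id": 3})
```

Note that both ids are coerced to strings, so numeric cycle ids are safe to pass.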

start_model_invoke_span(messages, parent_span=None, model_id=None, custom_trace_attributes=None, **kwargs)

Start a new span for a model invocation.

Parameters:

Name Type Description Default
messages Messages

Messages being sent to the model.

required
parent_span Optional[Span]

Optional parent span to link this span to.

None
model_id Optional[str]

Optional identifier for the model being invoked.

None
custom_trace_attributes Optional[Mapping[str, AttributeValue]]

Optional mapping of custom trace attributes to include in the span.

None
**kwargs Any

Additional attributes to add to the span.

{}

Returns:

Type Description
Span

The created span, or None if tracing is not enabled.

Source code in strands/telemetry/tracer.py
def start_model_invoke_span(
    self,
    messages: Messages,
    parent_span: Optional[Span] = None,
    model_id: Optional[str] = None,
    custom_trace_attributes: Optional[Mapping[str, AttributeValue]] = None,
    **kwargs: Any,
) -> Span:
    """Start a new span for a model invocation.

    Args:
        messages: Messages being sent to the model.
        parent_span: Optional parent span to link this span to.
        model_id: Optional identifier for the model being invoked.
        custom_trace_attributes: Optional mapping of custom trace attributes to include in the span.
        **kwargs: Additional attributes to add to the span.

    Returns:
        The created span, or None if tracing is not enabled.
    """
    attributes: Dict[str, AttributeValue] = self._get_common_attributes(operation_name="chat")

    if custom_trace_attributes:
        attributes.update(custom_trace_attributes)

    if model_id:
        attributes["gen_ai.request.model"] = model_id

    # Add additional kwargs as attributes
    attributes.update({k: v for k, v in kwargs.items() if isinstance(v, (str, int, float, bool))})

    span = self._start_span("chat", parent_span, attributes=attributes, span_kind=trace_api.SpanKind.INTERNAL)
    self._add_event_messages(span, messages)

    return span
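Because the attribute dict is built with successive `update` calls, later sources win on key collisions: common attributes first, then `custom_trace_attributes`, then `model_id`, then the filtered `**kwargs`. A small sketch of that precedence with hypothetical attribute values:

```python
# Hypothetical base attributes standing in for _get_common_attributes("chat").
base = {"gen_ai.operation.name": "chat", "gen_ai.request.model": "default-model"}
custom = {"gen_ai.request.model": "custom-model", "team": "search"}

attributes = dict(base)
attributes.update(custom)                      # custom attributes override the base
attributes["gen_ai.request.model"] = "gpt-x"   # an explicit model_id overrides both
```

So passing a `model_id` always determines the final `gen_ai.request.model` value, regardless of what the custom attributes contained.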

start_multiagent_span(task, instance, custom_trace_attributes=None)

Start a new span for a multi-agent invocation (e.g. a swarm or graph).

Source code in strands/telemetry/tracer.py
def start_multiagent_span(
    self,
    task: MultiAgentInput,
    instance: str,
    custom_trace_attributes: Optional[Mapping[str, AttributeValue]] = None,
) -> Span:
    """Start a new span for a multi-agent invocation (e.g. a swarm or graph)."""
    operation = f"invoke_{instance}"
    attributes: Dict[str, AttributeValue] = self._get_common_attributes(operation)
    attributes.update(
        {
            "gen_ai.agent.name": instance,
        }
    )

    if custom_trace_attributes:
        attributes.update(custom_trace_attributes)

    span = self._start_span(operation, attributes=attributes, span_kind=trace_api.SpanKind.CLIENT)

    if self.use_latest_genai_conventions:
        parts: list[dict[str, Any]] = []
        if isinstance(task, list):
            parts = self._map_content_blocks_to_otel_parts(task)
        else:
            parts = [{"type": "text", "content": task}]
        self._add_event(
            span,
            "gen_ai.client.inference.operation.details",
            {"gen_ai.input.messages": serialize([{"role": "user", "parts": parts}])},
        )
    else:
        self._add_event(
            span,
            "gen_ai.user.message",
            event_attributes={"content": serialize(task) if isinstance(task, list) else task},
        )

    return span
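The branch above normalizes both accepted task shapes into OTel-style message parts. A minimal sketch of that normalization, with the library-internal `_map_content_blocks_to_otel_parts` helper stubbed out for illustration:

```python
from typing import Any, Dict, List, Union


def task_to_parts(task: Union[str, List[Dict[str, Any]]]) -> List[Dict[str, Any]]:
    """Normalize a multi-agent task into OTel message parts."""
    if isinstance(task, list):
        # Stand-in for Tracer._map_content_blocks_to_otel_parts; the real
        # helper maps each content block to its typed OTel part.
        return [{"type": "text", "content": str(block)} for block in task]
    # Plain string tasks become a single text part.
    return [{"type": "text", "content": task}]


parts = task_to_parts("summarize the report")
```

Either way, the result is wrapped in a single `{"role": "user", "parts": [...]}` message before being serialized onto the event.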

start_tool_call_span(tool, parent_span=None, custom_trace_attributes=None, **kwargs)

Start a new span for a tool call.

Parameters:

Name Type Description Default
tool ToolUse

The tool being used.

required
parent_span Optional[Span]

Optional parent span to link this span to.

None
custom_trace_attributes Optional[Mapping[str, AttributeValue]]

Optional mapping of custom trace attributes to include in the span.

None
**kwargs Any

Additional attributes to add to the span.

{}

Returns:

Type Description
Span

The created span, or None if tracing is not enabled.

Source code in strands/telemetry/tracer.py
def start_tool_call_span(
    self,
    tool: ToolUse,
    parent_span: Optional[Span] = None,
    custom_trace_attributes: Optional[Mapping[str, AttributeValue]] = None,
    **kwargs: Any,
) -> Span:
    """Start a new span for a tool call.

    Args:
        tool: The tool being used.
        parent_span: Optional parent span to link this span to.
        custom_trace_attributes: Optional mapping of custom trace attributes to include in the span.
        **kwargs: Additional attributes to add to the span.

    Returns:
        The created span, or None if tracing is not enabled.
    """
    attributes: Dict[str, AttributeValue] = self._get_common_attributes(operation_name="execute_tool")
    attributes.update(
        {
            "gen_ai.tool.name": tool["name"],
            "gen_ai.tool.call.id": tool["toolUseId"],
        }
    )

    if custom_trace_attributes:
        attributes.update(custom_trace_attributes)
    # Add additional kwargs as attributes
    attributes.update(kwargs)

    span_name = f"execute_tool {tool['name']}"
    span = self._start_span(span_name, parent_span, attributes=attributes, span_kind=trace_api.SpanKind.INTERNAL)

    if self.use_latest_genai_conventions:
        self._add_event(
            span,
            "gen_ai.client.inference.operation.details",
            {
                "gen_ai.input.messages": serialize(
                    [
                        {
                            "role": "tool",
                            "parts": [
                                {
                                    "type": "tool_call",
                                    "name": tool["name"],
                                    "id": tool["toolUseId"],
                                    "arguments": tool["input"],
                                }
                            ],
                        }
                    ]
                )
            },
        )
    else:
        self._add_event(
            span,
            "gen_ai.tool.message",
            event_attributes={
                "role": "tool",
                "content": serialize(tool["input"]),
                "id": tool["toolUseId"],
            },
        )

    return span
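Under the latest GenAI conventions, the tool call is recorded as a single `"tool"` message containing one `tool_call` part. A sketch of that payload construction using plain `json` in place of the library's `serialize` helper:

```python
import json
from typing import Any, Dict


def tool_call_event_payload(tool: Dict[str, Any]) -> str:
    """Build the gen_ai.input.messages payload shape used for tool-call spans."""
    message = {
        "role": "tool",
        "parts": [
            {
                "type": "tool_call",
                "name": tool["name"],
                "id": tool["toolUseId"],
                "arguments": tool["input"],
            }
        ],
    }
    # The event attribute value is a JSON string, not a nested object.
    return json.dumps([message], ensure_ascii=False)


payload = tool_call_event_payload(
    {"name": "calculator", "toolUseId": "t-1", "input": {"expression": "2+2"}}
)
```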

Usage

Bases: TypedDict

Token usage information for model interactions.

Attributes:

Name Type Description
inputTokens Required[int]

Number of tokens sent in the request to the model.

outputTokens Required[int]

Number of tokens that the model generated for the request.

totalTokens Required[int]

Total number of tokens (input + output).

cacheReadInputTokens int

Number of tokens read from cache (optional).

cacheWriteInputTokens int

Number of tokens written to cache (optional).

Source code in strands/types/event_loop.py
class Usage(TypedDict, total=False):
    """Token usage information for model interactions.

    Attributes:
        inputTokens: Number of tokens sent in the request to the model.
        outputTokens: Number of tokens that the model generated for the request.
        totalTokens: Total number of tokens (input + output).
        cacheReadInputTokens: Number of tokens read from cache (optional).
        cacheWriteInputTokens: Number of tokens written to cache (optional).
    """

    inputTokens: Required[int]
    outputTokens: Required[int]
    totalTokens: Required[int]
    cacheReadInputTokens: int
    cacheWriteInputTokens: int
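Because `Usage` is a `TypedDict`, a value is just a plain dict at runtime: the three `Required` keys must be present, the cache counters may be omitted, and `totalTokens` is expected to equal `inputTokens + outputTokens`. A quick sketch without importing the library:

```python
# Shape of strands.types.event_loop.Usage, built as a plain dict.
usage = {
    "inputTokens": 120,
    "outputTokens": 80,
    "totalTokens": 200,
    "cacheReadInputTokens": 40,  # optional; cacheWriteInputTokens omitted here
}

# Sanity check the documented invariant.
assert usage["totalTokens"] == usage["inputTokens"] + usage["outputTokens"]
```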

get_tracer()

Get or create the global tracer.

Returns:

Type Description
Tracer

The global tracer instance.

Source code in strands/telemetry/tracer.py
def get_tracer() -> Tracer:
    """Get or create the global tracer.

    Returns:
        The global tracer instance.
    """
    global _tracer_instance

    if not _tracer_instance:
        _tracer_instance = Tracer()

    return _tracer_instance
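`get_tracer` is a lazy module-level singleton: the first call constructs the `Tracer`, and every later call returns the same cached instance. The pattern in isolation, with a stand-in class:

```python
from typing import Optional


class FakeTracer:
    """Stand-in for strands.telemetry.tracer.Tracer."""


_tracer_instance: Optional[FakeTracer] = None


def get_tracer() -> FakeTracer:
    """Lazily create and cache the module-level tracer instance."""
    global _tracer_instance
    if not _tracer_instance:
        _tracer_instance = FakeTracer()
    return _tracer_instance


assert get_tracer() is get_tracer()  # every call yields the same object
```

One consequence of this design is that all agents in a process share one tracer configuration.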

serialize(obj)

Serialize an object to JSON with consistent settings.

Parameters:

Name Type Description Default
obj Any

The object to serialize

required

Returns:

Type Description
str

JSON string representation of the object

Source code in strands/telemetry/tracer.py
def serialize(obj: Any) -> str:
    """Serialize an object to JSON with consistent settings.

    Args:
        obj: The object to serialize

    Returns:
        JSON string representation of the object
    """
    return json.dumps(obj, ensure_ascii=False, cls=JSONEncoder)
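The `ensure_ascii=False` setting keeps non-ASCII text readable in span payloads instead of escaping it. Demonstrated with plain `json.dumps` (omitting the library's custom `JSONEncoder`):

```python
import json

# ensure_ascii=True (the json module default) escapes non-ASCII characters...
escaped = json.dumps({"text": "héllo"})
# ...while the tracer's setting keeps them intact.
readable = json.dumps({"text": "héllo"}, ensure_ascii=False)
```

Here `escaped` contains the escape sequence `\u00e9`, while `readable` preserves `héllo` verbatim.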