Merged
18 changes: 17 additions & 1 deletion .generator/schemas/v2/openapi.yaml
@@ -43045,12 +43045,14 @@ components:
ObservabilityPipelineAmazonS3Source:
description: |-
The `amazon_s3` source ingests logs from an Amazon S3 bucket.
It supports AWS authentication and TLS encryption.
It supports AWS authentication, TLS encryption, and configurable compression.

**Supported pipeline types:** logs
properties:
auth:
$ref: "#/components/schemas/ObservabilityPipelineAwsAuth"
compression:
$ref: "#/components/schemas/ObservabilityPipelineAmazonS3SourceCompression"
id:
description: The unique identifier for this component. Used in other parts of the pipeline to reference this component (for example, as the `input` to downstream components).
example: aws-s3-source
@@ -43073,6 +43075,20 @@ components:
- region
type: object
x-pipeline-types: [logs]
ObservabilityPipelineAmazonS3SourceCompression:
description: Compression format for objects retrieved from the S3 bucket. Use `auto` to detect compression from the object's Content-Encoding header or file extension.
enum:
- auto
- none
- gzip
- zstd
example: gzip
type: string
x-enum-varnames:
- AUTO
- NONE
- GZIP
- ZSTD
ObservabilityPipelineAmazonS3SourceType:
default: amazon_s3
description: The source type. Always `amazon_s3`.
7 changes: 7 additions & 0 deletions docs/datadog_api_client.v2.model.rst
@@ -19198,6 +19198,13 @@ datadog\_api\_client.v2.model.observability\_pipeline\_amazon\_s3\_source module
:members:
:show-inheritance:

datadog\_api\_client.v2.model.observability\_pipeline\_amazon\_s3\_source\_compression module
---------------------------------------------------------------------------------------------

.. automodule:: datadog_api_client.v2.model.observability_pipeline_amazon_s3_source_compression
:members:
:show-inheritance:

datadog\_api\_client.v2.model.observability\_pipeline\_amazon\_s3\_source\_type module
--------------------------------------------------------------------------------------

@@ -0,0 +1,83 @@
"""
Validate an observability pipeline with amazon S3 source compression returns "OK" response
"""

from datadog_api_client import ApiClient, Configuration
from datadog_api_client.v2.api.observability_pipelines_api import ObservabilityPipelinesApi
from datadog_api_client.v2.model.observability_pipeline_amazon_s3_source import ObservabilityPipelineAmazonS3Source
from datadog_api_client.v2.model.observability_pipeline_amazon_s3_source_compression import (
ObservabilityPipelineAmazonS3SourceCompression,
)
from datadog_api_client.v2.model.observability_pipeline_amazon_s3_source_type import (
ObservabilityPipelineAmazonS3SourceType,
)
from datadog_api_client.v2.model.observability_pipeline_config import ObservabilityPipelineConfig
from datadog_api_client.v2.model.observability_pipeline_config_processor_group import (
ObservabilityPipelineConfigProcessorGroup,
)
from datadog_api_client.v2.model.observability_pipeline_data_attributes import ObservabilityPipelineDataAttributes
from datadog_api_client.v2.model.observability_pipeline_datadog_logs_destination import (
ObservabilityPipelineDatadogLogsDestination,
)
from datadog_api_client.v2.model.observability_pipeline_datadog_logs_destination_type import (
ObservabilityPipelineDatadogLogsDestinationType,
)
from datadog_api_client.v2.model.observability_pipeline_filter_processor import ObservabilityPipelineFilterProcessor
from datadog_api_client.v2.model.observability_pipeline_filter_processor_type import (
ObservabilityPipelineFilterProcessorType,
)
from datadog_api_client.v2.model.observability_pipeline_spec import ObservabilityPipelineSpec
from datadog_api_client.v2.model.observability_pipeline_spec_data import ObservabilityPipelineSpecData

body = ObservabilityPipelineSpec(
data=ObservabilityPipelineSpecData(
attributes=ObservabilityPipelineDataAttributes(
config=ObservabilityPipelineConfig(
destinations=[
ObservabilityPipelineDatadogLogsDestination(
id="datadog-logs-destination",
inputs=[
"my-processor-group",
],
type=ObservabilityPipelineDatadogLogsDestinationType.DATADOG_LOGS,
),
],
processor_groups=[
ObservabilityPipelineConfigProcessorGroup(
enabled=True,
id="my-processor-group",
include="service:my-service",
inputs=[
"amazon-s3-source",
],
processors=[
ObservabilityPipelineFilterProcessor(
enabled=True,
id="filter-processor",
include="service:my-service",
type=ObservabilityPipelineFilterProcessorType.FILTER,
),
],
),
],
sources=[
ObservabilityPipelineAmazonS3Source(
id="amazon-s3-source",
type=ObservabilityPipelineAmazonS3SourceType.AMAZON_S3,
region="us-east-1",
compression=ObservabilityPipelineAmazonS3SourceCompression.GZIP,
),
],
),
name="Pipeline with S3 Source Compression",
),
type="pipelines",
),
)

configuration = Configuration()
with ApiClient(configuration) as api_client:
api_instance = ObservabilityPipelinesApi(api_client)
response = api_instance.validate_pipeline(body=body)

print(response)
@@ -15,6 +15,9 @@

if TYPE_CHECKING:
from datadog_api_client.v2.model.observability_pipeline_aws_auth import ObservabilityPipelineAwsAuth
from datadog_api_client.v2.model.observability_pipeline_amazon_s3_source_compression import (
ObservabilityPipelineAmazonS3SourceCompression,
)
from datadog_api_client.v2.model.observability_pipeline_tls import ObservabilityPipelineTls
from datadog_api_client.v2.model.observability_pipeline_amazon_s3_source_type import (
ObservabilityPipelineAmazonS3SourceType,
@@ -25,13 +28,17 @@ class ObservabilityPipelineAmazonS3Source(ModelNormal):
@cached_property
def openapi_types(_):
from datadog_api_client.v2.model.observability_pipeline_aws_auth import ObservabilityPipelineAwsAuth
from datadog_api_client.v2.model.observability_pipeline_amazon_s3_source_compression import (
ObservabilityPipelineAmazonS3SourceCompression,
)
from datadog_api_client.v2.model.observability_pipeline_tls import ObservabilityPipelineTls
from datadog_api_client.v2.model.observability_pipeline_amazon_s3_source_type import (
ObservabilityPipelineAmazonS3SourceType,
)

return {
"auth": (ObservabilityPipelineAwsAuth,),
"compression": (ObservabilityPipelineAmazonS3SourceCompression,),
"id": (str,),
"region": (str,),
"tls": (ObservabilityPipelineTls,),
@@ -41,6 +48,7 @@ def openapi_types(_):

attribute_map = {
"auth": "auth",
"compression": "compression",
"id": "id",
"region": "region",
"tls": "tls",
@@ -54,20 +62,24 @@ def __init__(
region: str,
type: ObservabilityPipelineAmazonS3SourceType,
auth: Union[ObservabilityPipelineAwsAuth, UnsetType] = unset,
compression: Union[ObservabilityPipelineAmazonS3SourceCompression, UnsetType] = unset,
tls: Union[ObservabilityPipelineTls, UnsetType] = unset,
url_key: Union[str, UnsetType] = unset,
**kwargs,
):
"""
The ``amazon_s3`` source ingests logs from an Amazon S3 bucket.
It supports AWS authentication and TLS encryption.
It supports AWS authentication, TLS encryption, and configurable compression.

**Supported pipeline types:** logs

:param auth: AWS authentication credentials used for accessing AWS services such as S3.
If omitted, the system’s default credentials are used (for example, the IAM role and environment variables).
:type auth: ObservabilityPipelineAwsAuth, optional

:param compression: Compression format for objects retrieved from the S3 bucket. Use ``auto`` to detect compression from the object's Content-Encoding header or file extension.
:type compression: ObservabilityPipelineAmazonS3SourceCompression, optional

:param id: The unique identifier for this component. Used in other parts of the pipeline to reference this component (for example, as the ``input`` to downstream components).
:type id: str

@@ -85,6 +97,8 @@ def __init__(self, **kwargs):
"""
if auth is not unset:
kwargs["auth"] = auth
if compression is not unset:
kwargs["compression"] = compression
if tls is not unset:
kwargs["tls"] = tls
if url_key is not unset:
@@ -0,0 +1,44 @@
# Unless explicitly stated otherwise all files in this repository are licensed under the Apache-2.0 License.
# This product includes software developed at Datadog (https://www.datadoghq.com/).
# Copyright 2019-Present Datadog, Inc.
from __future__ import annotations


from datadog_api_client.model_utils import (
ModelSimple,
cached_property,
)

from typing import ClassVar


class ObservabilityPipelineAmazonS3SourceCompression(ModelSimple):
"""
Compression format for objects retrieved from the S3 bucket. Use `auto` to detect compression from the object's Content-Encoding header or file extension.

:param value: Must be one of ["auto", "none", "gzip", "zstd"].
:type value: str
"""

allowed_values = {
"auto",
"none",
"gzip",
"zstd",
}
AUTO: ClassVar["ObservabilityPipelineAmazonS3SourceCompression"]
NONE: ClassVar["ObservabilityPipelineAmazonS3SourceCompression"]
GZIP: ClassVar["ObservabilityPipelineAmazonS3SourceCompression"]
ZSTD: ClassVar["ObservabilityPipelineAmazonS3SourceCompression"]

@cached_property
def openapi_types(_):
return {
"value": (str,),
}


ObservabilityPipelineAmazonS3SourceCompression.AUTO = ObservabilityPipelineAmazonS3SourceCompression("auto")
ObservabilityPipelineAmazonS3SourceCompression.NONE = ObservabilityPipelineAmazonS3SourceCompression("none")
ObservabilityPipelineAmazonS3SourceCompression.GZIP = ObservabilityPipelineAmazonS3SourceCompression("gzip")
ObservabilityPipelineAmazonS3SourceCompression.ZSTD = ObservabilityPipelineAmazonS3SourceCompression("zstd")
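The docstring above says `auto` detects compression from the object's Content-Encoding header or file extension. The detection itself happens server-side in the pipeline, not in this client; purely as an illustration of that described behavior, it can be sketched like this (the function name, detection order, and recognized extensions are assumptions, not part of datadog-api-client):

```python
from typing import Optional


def detect_compression(content_encoding: Optional[str], object_key: str) -> str:
    """Illustrative sketch of `auto` mode: return "gzip", "zstd", or "none".

    Assumption for illustration only -- the header is checked first, then
    the S3 object key's file extension.
    """
    # Prefer the Content-Encoding header when the object carries one.
    if content_encoding:
        encoding = content_encoding.strip().lower()
        if encoding in ("gzip", "zstd"):
            return encoding
    # Otherwise fall back to the object key's extension.
    if object_key.endswith(".gz"):
        return "gzip"
    if object_key.endswith(".zst"):
        return "zstd"
    return "none"


print(detect_compression(None, "logs/app.log.gz"))  # gzip
```

When neither signal is present, falling back to `none` mirrors the explicit `none` enum value, while callers who know the format up front would set `gzip` or `zstd` directly instead of relying on `auto`.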
@@ -31,6 +31,9 @@ def __init__(self, **kwargs):
If omitted, the system’s default credentials are used (for example, the IAM role and environment variables).
:type auth: ObservabilityPipelineAwsAuth, optional

:param compression: Compression format for objects retrieved from the S3 bucket. Use `auto` to detect compression from the object's Content-Encoding header or file extension.
:type compression: ObservabilityPipelineAmazonS3SourceCompression, optional

:param region: AWS region where the S3 bucket resides.
:type region: str

4 changes: 4 additions & 0 deletions src/datadog_api_client/v2/models/__init__.py
@@ -3675,6 +3675,9 @@
ObservabilityPipelineAmazonS3GenericEncodingParquetType,
)
from datadog_api_client.v2.model.observability_pipeline_amazon_s3_source import ObservabilityPipelineAmazonS3Source
from datadog_api_client.v2.model.observability_pipeline_amazon_s3_source_compression import (
ObservabilityPipelineAmazonS3SourceCompression,
)
from datadog_api_client.v2.model.observability_pipeline_amazon_s3_source_type import (
ObservabilityPipelineAmazonS3SourceType,
)
@@ -9568,6 +9571,7 @@
"ObservabilityPipelineAmazonS3GenericEncodingParquet",
"ObservabilityPipelineAmazonS3GenericEncodingParquetType",
"ObservabilityPipelineAmazonS3Source",
"ObservabilityPipelineAmazonS3SourceCompression",
"ObservabilityPipelineAmazonS3SourceType",
"ObservabilityPipelineAmazonSecurityLakeDestination",
"ObservabilityPipelineAmazonSecurityLakeDestinationType",
@@ -0,0 +1 @@
2026-04-08T12:44:25.060Z
@@ -0,0 +1,23 @@
interactions:
- request:
body: '{"data":{"attributes":{"config":{"destinations":[{"id":"datadog-logs-destination","inputs":["my-processor-group"],"type":"datadog_logs"}],"processor_groups":[{"enabled":true,"id":"my-processor-group","include":"service:my-service","inputs":["amazon-s3-source"],"processors":[{"enabled":true,"id":"filter-processor","include":"service:my-service","type":"filter"}]}],"sources":[{"compression":"gzip","id":"amazon-s3-source","region":"us-east-1","type":"amazon_s3"}]},"name":"Pipeline
with S3 Source Compression"},"type":"pipelines"}}'
headers:
accept:
- application/json
content-type:
- application/json
method: POST
uri: https://api.datadoghq.com/api/v2/obs-pipelines/pipelines/validate
response:
body:
string: '{"errors":[]}

'
headers:
content-type:
- application/vnd.api+json
status:
code: 200
message: OK
version: 1
8 changes: 8 additions & 0 deletions tests/v2/features/observability_pipelines.feature
@@ -207,6 +207,14 @@ Feature: Observability Pipelines
Then the response status is 200 OK
And the response "errors" has length 0

@team:DataDog/observability-pipelines
Scenario: Validate an observability pipeline with amazon S3 source compression returns "OK" response
Given new "ValidatePipeline" request
And body with value {"data": {"attributes": {"config": {"destinations": [{"id": "datadog-logs-destination", "inputs": ["my-processor-group"], "type": "datadog_logs"}], "processor_groups": [{"enabled": true, "id": "my-processor-group", "include": "service:my-service", "inputs": ["amazon-s3-source"], "processors": [{"enabled": true, "id": "filter-processor", "include": "service:my-service", "type": "filter"}]}], "sources": [{"id": "amazon-s3-source", "type": "amazon_s3", "region": "us-east-1", "compression": "gzip"}]}, "name": "Pipeline with S3 Source Compression"}, "type": "pipelines"}}
When the request is sent
Then the response status is 200 OK
And the response "errors" has length 0

@team:DataDog/observability-pipelines
Scenario: Validate an observability pipeline with destination secret key returns "OK" response
Given new "ValidatePipeline" request