chore: Refactor OTel concerns in preparation for metric provider #123
morgan-wowk merged 1 commit into master
Conversation
Force-pushed from 8a5aa83 to 7672232
Force-pushed from 7672232 to 0d953d1
Resolved review threads:
- cloud_pipelines_backend/instrumentation/opentelemetry/_internal/configuration.py
- cloud_pipelines_backend/instrumentation/opentelemetry/_internal/config.py (2 outdated threads)
Force-pushed from 0d953d1 to f2c09c1
@yuechao-qin Feedback addressed.
```python
# region: OpenTelemetry initialization
from cloud_pipelines_backend.instrumentation import opentelemetry as otel

otel.setup_providers()
```
Question: for local setup, should the otel service be run separately? What would happen if no OTEL endpoint is configured (e.g. for e2e tests on CI)?
Great question.
I'll answer in two ways.
For local development, we have this example stack I added and have personally been using to test traces and metrics for OSS: https://github.com/TangleML/tangle/tree/master/examples/observability/otel-jaeger-prometheus
For writing tests, or adding new metrics, here are some important things to know:
- The OTel Python SDK that we consume (and that our FastAPI auto-instrumentation consumes) is intentionally designed so that every action is a no-op if no providers are set.
- That means no exceptions from emitting metrics or opening traces when OTel is not configured.
- We only set providers on our end if the environment variables are filled out with correctly shaped values.
- We only auto-instrument FastAPI if providers are set.
- Verifying that a metric or trace was used in testing (whether unit or e2e) is possible; I'll follow up with instructions and examples in a separate PR. Consider this a TODO. We will have to manipulate the type of reader/exporter we use at the start of tests.
Hmm. Are you sure we're simplifying things? It's hard to get simpler than the current one-liner: `otel_tracing.setup_api_tracing(app)`
It's more about decoupling. The reason it's better this way is that we will have scenarios where we want providers but don't want auto-instrumentation for FastAPI. I can add a function …

If you would like such a function, I'll add it!
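For concreteness, a rough sketch of the decoupled shape being discussed. The function names come from this PR's description and the env var from its tests; the bodies are placeholders, not the real implementation:

```python
import os

def setup_providers() -> bool:
    """Register tracer/meter providers only when exporter config is present."""
    endpoint = os.environ.get("TANGLE_OTEL_EXPORTER_ENDPOINT")
    if not endpoint:
        # Leave the OTel API in its default no-op state: emitting metrics
        # and opening traces stays safe, it just goes nowhere.
        return False
    # ... build TracerProvider / MeterProvider pointed at `endpoint` here ...
    return True

def instrument_fastapi(app) -> None:
    """Opt-in FastAPI auto-instrumentation, separate from provider setup."""
    # ... FastAPIInstrumentor.instrument_app(app) would go here ...
    pass
```

Callers that want providers without HTTP auto-instrumentation simply skip the second call.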
Let's keep the tests inside /tests/. We can have subfolders there like instrumentation or telemetry.
```python
def test_uses_custom_service_name(self, monkeypatch):
    monkeypatch.setenv("TANGLE_OTEL_EXPORTER_ENDPOINT", "http://localhost:4317")

    result = config.resolve(service_name="oasis-api")
```
We need to switch naming from Oasis to Tangle
Naming a module with a singular noun often conflicts with local variables. Better to use a plural or a more uncountable name. (Maybe `configuration` is OK despite technically being a singular noun...)
Kubernetes has
from kubernetes import config
from kubernetes import client
and it's very inconvenient, forcing me to rename those imports every time.
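The clash is easy to reproduce with any stdlib module whose name is a natural local variable. A toy illustration (not project code):

```python
# `from kubernetes import config` has the same failure mode as this:
from email import message

first = message.Message          # works: `message` is still the module
message = "just a local string"  # a perfectly natural variable name...
# message.Message                # ...would now raise AttributeError

# The usual workaround is renaming at import time, every time:
from email import message as email_message
assert email_message.Message is first
```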
Sure thing. Increasingly over time, our root … We can stick with the root-level tests for this PR. Here is the comparison of the two options for the future.
☝️ We can talk about this more in the future. There might be some "Pythonic" preference I am not aware of here, so I have an open mind on the current root-level tests strategy. I've done both root-level and co-located testing on small and large projects; co-location's many benefits have outweighed any cons for my previous teams. As our backend grows, we may experience some of the pains of a root-level tests directory. If we do, let's consider giving co-location a try. I know we plan to add many tests in the near future, so this could come up sooner rather than later. CC: @yuechao-qin
Force-pushed from f2c09c1 to 767ac8c
Moved …
Commit: chore: Refactor OTel concerns in preparation for metric provider

**Changes:**
* Better organize opentelemetry code to create a better API for callers (various entrypoints) to:
  * Set up providers (tracing, metrics, eventually logging)
  * Set up FastAPI auto-instrumentation

Made-with: Cursor
Force-pushed from 767ac8c to 7251a89
I do prefer a co-located project structure myself. However, if my memory serves me correctly, Python doesn't support co-located tests (like Angular's .spec.ts) because there's no build step to exclude them: test files end up getting packaged and imported accidentally, while pytest is designed to auto-discover from a separate tests/ directory out of the box.
Co-located tests have advantages, but also several cons. Usually there is a 1:1 mapping between modules and test modules. But when tests start to get their own …
```toml
[tool.pytest.ini_options]
testpaths = ["tests"]
addopts = "--import-mode=importlib"
```
Which errors required making this change?
Is this still needed?
I will run the tests again tomorrow and confirm. And paste the error if it is still present.
The answer is yes, it is still needed.
Context:
`--import-mode=importlib` is required because the test layout has two files with the same name: test_providers.py exists in both tests/instrumentation/opentelemetry/ and tests/instrumentation/opentelemetry/_internal/. Pytest's default prepend import mode can't distinguish identically named test modules in sibling directories; it errors with "which is not the same as the test file we want to collect". importlib mode gives each file a unique identity based on its full path rather than its bare module name, which resolves the collision.
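A stdlib-only sketch of why path-derived module identities fix this. It mimics what importlib mode does; it is not pytest's actual implementation:

```python
import importlib.util
import pathlib
import tempfile

# Two test files sharing the module name "test_providers", mirroring
# tests/instrumentation/opentelemetry/ and .../opentelemetry/_internal/.
root = pathlib.Path(tempfile.mkdtemp())
(root / "opentelemetry" / "_internal").mkdir(parents=True)
(root / "opentelemetry" / "test_providers.py").write_text("WHERE = 'public'\n")
(root / "opentelemetry" / "_internal" / "test_providers.py").write_text("WHERE = 'internal'\n")

# Prepend mode would import both as the bare module "test_providers",
# which is ambiguous. Deriving a unique name from each file's full path
# lets both load side by side:
def load_by_path(path: pathlib.Path):
    name = ".".join(path.relative_to(root).with_suffix("").parts)
    spec = importlib.util.spec_from_file_location(name, path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module

public = load_by_path(root / "opentelemetry" / "test_providers.py")
internal = load_by_path(root / "opentelemetry" / "_internal" / "test_providers.py")
assert public.WHERE == "public" and internal.WHERE == "internal"
```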
Ark-kun left a comment: Thank you, Morgan. Approved with comment (about the pyproject.toml change)
Good to know about this behaviour in Python. Thanks, guys!
Refactor OpenTelemetry instrumentation into modular package

Changes:
* Public entrypoints (`setup_providers()` and `instrument_fastapi()`)