feat(python): Add the 3.x migration guide #11327
Merged

24 commits:
- b6ee7e3 feat(python): Start the 3.x migration guide (sentrivana)
- 9aa5b80 start with custom tracing (sentrivana)
- b33676c more stuff (sentrivana)
- 4910c68 more stuff (sentrivana)
- 0ed2c28 more (sentrivana)
- eea95f1 . (sentrivana)
- fbde46e . (sentrivana)
- 3414be0 remove todo (sentrivana)
- d724fb2 improvements (sentrivana)
- f4986b1 formatting (sentrivana)
- 6ca947f . (sentrivana)
- 10857a0 Apply suggestions from code review (sentrivana)
- 72c2dae wording (sentrivana)
- 3ed11af wording (sentrivana)
- 57f14b8 fix typo (sentrivana)
- fb1f29e code formatting (sentrivana)
- 98af38e add set_measurement (sentrivana)
- ec37656 remove traces_sample_rate change (sentrivana)
- cb7d6cb Add a note about sdk being pre-release (sentrivana)
- a9f2d17 wording (sentrivana)
- 8366572 add expandable sampling context sections (sentrivana)
- 34091c2 formatting (sentrivana)
- 1c6c2af Add add_attachment (sentrivana)
- bda77c9 Apply suggestions from code review (sentrivana)
---
title: Migrate from 2.x to 3.x
sidebar_order: 8997
description: "Learn about migrating from sentry-python 2.x to 3.x"
---

<Alert title="Python SDK 3.0 pre-release">

Version 3.0 of the Sentry Python SDK is currently in pre-release. If you feel like giving it a spin, check out [our most recent releases](https://pypi.org/project/sentry-sdk/#history). Your feedback at this stage is invaluable, so please let us know about your experience, whether positive or negative, [on GitHub](https://github.com/getsentry/sentry-python/discussions/3936) or [on Discord](https://discord.gg/wdNEHETs87): How did the migration go? Did you encounter any issues? Is everything working as expected?

</Alert>

This guide describes the common patterns involved in migrating to version `3.x` of the `sentry-python` SDK. For the full list of changes, check out the [detailed migration guide in the repository](https://github.com/getsentry/sentry-python/blob/potel-base/MIGRATION_GUIDE.md).

## Python Version Support

Sentry Python SDK `3.x` only supports Python 3.7 and higher. If you're on an older Python version, you'll need to stay on an older version of the SDK:

- Python 2.7-3.5: SDK `1.x`
- Python 3.6: SDK `2.x`

## Configuration

The `enable_tracing` option was removed. Use [`traces_sample_rate`](/platforms/python/configuration/options/#traces_sample_rate) directly, or configure a [`traces_sampler`](/platforms/python/configuration/options/#traces_sampler) for more fine-grained control over which spans should be sampled.

```python diff
sentry_sdk.init(
-    enable_tracing=True,
+    traces_sample_rate=1.0,
)
```

The deprecated `propagate_traces` option was removed. Use [`trace_propagation_targets`](/platforms/python/configuration/options/#trace_propagation_targets) instead.

```python diff
sentry_sdk.init(
    # don't propagate trace info downstream
-    propagate_traces=False,
+    trace_propagation_targets=[],
)
```

Note that this only affects the global SDK option. The [`propagate_traces`](/platforms/python/integrations/celery/#options) option of the Celery integration remains unchanged.
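
For example, if you still want to disable trace propagation for Celery tasks specifically, keep doing it on the integration itself (a minimal sketch; configure the integration as you normally would):

```python
import sentry_sdk
from sentry_sdk.integrations.celery import CeleryIntegration

sentry_sdk.init(
    integrations=[
        # Celery-specific setting, unchanged in 3.x: don't attach
        # Sentry trace headers to outgoing Celery tasks
        CeleryIntegration(propagate_traces=False),
    ],
)
```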

The `profiles_sample_rate` and `profiler_mode` options previously nested under `_experiments` have been removed. They're replaced by top-level options of the same name:

```python diff
sentry_sdk.init(
-    _experiments={
-        "profiles_sample_rate": 1.0,
-        "profiler_mode": "thread",
-    },
+    profiles_sample_rate=1.0,
+    profiler_mode="thread",
)
```

## API Changes

`add_attachment()` is now part of the top-level API and should be imported and used directly from `sentry_sdk`.

```python diff
import sentry_sdk

-scope = sentry_sdk.get_current_scope()
-scope.add_attachment(bytes=b"Hello World!", filename="attachment.txt")
+sentry_sdk.add_attachment(bytes=b"Hello World!", filename="attachment.txt")
```

Using `sentry_sdk.add_attachment()` directly also makes sure the attachment is added to the correct scope internally.

### Custom Tracing API

Tracing in the Sentry Python SDK `3.x` is powered by [OpenTelemetry](https://opentelemetry.io/) in the background, which also means we're moving away from the Sentry-specific concept of transactions and towards a span-only future. `sentry_sdk.start_transaction()` is now deprecated in favor of `sentry_sdk.start_span()`.

```python diff
-with sentry_sdk.start_transaction():
+with sentry_sdk.start_span():
    ...
```

Any spans without a parent span will become transactions by default. If you want to avoid promoting a span without a parent to a transaction, you can pass the `only_if_parent=True` keyword argument to `sentry_sdk.start_span()`.
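
For example, a minimal sketch of both cases:

```python
import sentry_sdk

with sentry_sdk.start_span():
    # There is an active parent span, so this starts a regular child span.
    with sentry_sdk.start_span(only_if_parent=True):
        ...

# No active parent span here: only_if_parent=True prevents this span
# from being promoted to a transaction.
with sentry_sdk.start_span(only_if_parent=True):
    ...
```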

`sentry_sdk.start_transaction()` and `sentry_sdk.start_span()` no longer take the following arguments: `trace_id`, `baggage`, `span_id`, `parent_span_id`. Use `sentry_sdk.continue_trace()` for propagating trace data.

`sentry_sdk.continue_trace()` no longer returns a `Transaction` and is now a context manager. To continue a trace from headers or environment variables, start a new span inside `sentry_sdk.continue_trace()`:

```python diff
-transaction = sentry_sdk.continue_trace({...})
-with sentry_sdk.start_transaction(transaction=transaction):
-    ...
+with sentry_sdk.continue_trace({...}):
+    with sentry_sdk.start_span():
+        ...
```

The functions `continue_from_headers`, `continue_from_environ` and `from_traceparent` have been removed. Use the `sentry_sdk.continue_trace()` context manager instead.

## Span Data

In OpenTelemetry, there is no concept of separate categories of data on a span: everything is simply a span attribute. This is a concept the Sentry SDK is also adopting. We deprecated `set_data()` and added a new span method called `set_attribute()`:

```python diff
with sentry_sdk.start_span(...) as span:
-    span.set_data("my_attribute", "my_value")
+    span.set_attribute("my_attribute", "my_value")
```

You can also set attributes directly when creating the span. This has the advantage that these initial attributes will be accessible in the sampling context in your `traces_sampler`/`profiles_sampler` (see also the [Sampling section](#sampling)).

```python
with sentry_sdk.start_span(attributes={"my_attribute": "my_value"}):
    ...
```

<Alert title="Span attribute type restrictions" level="warning">

There are important type restrictions to consider when setting attributes on a span via `span.set_attribute()` and `start_span(attributes={...})`. Keys must be non-empty strings, and values may only be certain primitive types (excluding `None`) or a homogeneous list of one of those types. See [the OpenTelemetry specification](https://opentelemetry.io/docs/specs/otel/common/#attribute) for details.

Since the SDK now uses span attributes exclusively, this restriction also applies to the other ways of setting data on a span, such as `span.set_data()`, `span.set_measurement()`, and `span.set_context()`.

</Alert>
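
For example, with illustrative attribute names and values:

```python
with sentry_sdk.start_span() as span:
    # Allowed: primitive values and homogeneous lists of primitives
    span.set_attribute("http.response.status_code", 200)
    span.set_attribute("my.flag", True)
    span.set_attribute("my.tags", ["a", "b", "c"])

    # Not allowed: None, dicts, nested structures, or mixed-type lists
    # are not valid attribute values
    span.set_attribute("my.payload", {"nested": "dict"})  # won't be stored as-is
```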

## Sampling

It's no longer possible to change the sampling decision of a span by setting `span.sampled` directly after the span has been created. Use either a custom `traces_sampler` (preferred) or the `sampled` argument to `start_span()` for determining whether a span should be sampled.

```python
with sentry_sdk.start_span(sampled=True) as span:
    ...
```

<Alert title="Sampling non-root spans" level="warning">

Both `traces_sampler` and the `sampled` argument will only influence whether root spans (transactions) are sampled. They can't be used for sampling child spans.

</Alert>

The `sampling_context` argument of `traces_sampler` and `profiles_sampler` has changed considerably for spans coming from our auto-instrumented integrations. As a consequence of using OpenTelemetry under the hood, spans can only carry specific, primitive types of data. This prevents us from making custom objects, such as the `Request` object of several web frameworks, accessible on the span.
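
The expandable sections below list the new keys per integration. A `traces_sampler` that uses them might look like this (a sketch; the route and sample rates are made up):

```python
import sentry_sdk

def traces_sampler(sampling_context):
    # Request data from integrations now arrives as flat, OpenTelemetry-style
    # keys instead of framework-specific objects.
    if sampling_context.get("url.path") == "/healthcheck":
        return 0  # drop health check transactions
    if sampling_context.get("http.request.method") == "OPTIONS":
        return 0.01
    return 0.25

sentry_sdk.init(traces_sampler=traces_sampler)
```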

<Expandable title='AIOHTTP sampling context changes'>
The AIOHTTP integration doesn't add the `aiohttp_request` object anymore. Instead, some of the individual properties of the request are accessible, if available, as follows:

| Request property  | Sampling context key(s)         |
| ----------------- | ------------------------------- |
| `path`            | `url.path`                      |
| `query_string`    | `url.query`                     |
| `method`          | `http.request.method`           |
| `host`            | `server.address`, `server.port` |
| `scheme`          | `url.scheme`                    |
| full URL          | `url.full`                      |
| `request.headers` | `http.request.header.{header}`  |
</Expandable>

<Expandable title='Celery sampling context changes'>
The Celery integration doesn't add the `celery_job` dictionary anymore. Instead, the individual keys are now available as:

| Dictionary keys        | Sampling context key        | Example                        |
| ---------------------- | --------------------------- | ------------------------------ |
| `celery_job["args"]`   | `celery.job.args.{index}`   | `celery.job.args.0`            |
| `celery_job["kwargs"]` | `celery.job.kwargs.{kwarg}` | `celery.job.kwargs.kwarg_name` |
| `celery_job["task"]`   | `celery.job.task`           |                                |
</Expandable>

<Expandable title='Tornado sampling context changes'>
The Tornado integration doesn't add the `tornado_request` object anymore. Instead, some of the individual properties of the request are accessible, if available, as follows:

| Request property  | Sampling context key(s)                             |
| ----------------- | --------------------------------------------------- |
| `path`            | `url.path`                                          |
| `query`           | `url.query`                                         |
| `protocol`        | `url.scheme`                                        |
| `method`          | `http.request.method`                               |
| `host`            | `server.address`, `server.port`                     |
| `version`         | `network.protocol.name`, `network.protocol.version` |
| full URL          | `url.full`                                          |
| `request.headers` | `http.request.header.{header}`                      |
</Expandable>

<Expandable title='WSGI sampling context changes'>
The WSGI integration doesn't add the `wsgi_environ` object anymore. Instead, the individual properties of the environment are accessible, if available, as follows:

| Env property      | Sampling context key(s)                           |
| ----------------- | ------------------------------------------------- |
| `PATH_INFO`       | `url.path`                                        |
| `QUERY_STRING`    | `url.query`                                       |
| `REQUEST_METHOD`  | `http.request.method`                             |
| `SERVER_NAME`     | `server.address`                                  |
| `SERVER_PORT`     | `server.port`                                     |
| `SERVER_PROTOCOL` | `server.protocol.name`, `server.protocol.version` |
| `wsgi.url_scheme` | `url.scheme`                                      |
| full URL          | `url.full`                                        |
| `HTTP_*`          | `http.request.header.{header}`                    |
</Expandable>

<Expandable title='ASGI sampling context changes'>
The ASGI integration doesn't add the `asgi_scope` object anymore. Instead, the individual properties of the scope, if available, are accessible as follows:

| Scope property | Sampling context key(s)         |
| -------------- | ------------------------------- |
| `type`         | `network.protocol.name`         |
| `scheme`       | `url.scheme`                    |
| `path`         | `url.path`                      |
| `query`        | `url.query`                     |
| `http_version` | `network.protocol.version`      |
| `method`       | `http.request.method`           |
| `server`       | `server.address`, `server.port` |
| `client`       | `client.address`, `client.port` |
| full URL       | `url.full`                      |
| `headers`      | `http.request.header.{header}`  |
</Expandable>

<Expandable title='RQ sampling context changes'>
The RQ integration doesn't add the `rq_job` object anymore. Instead, the individual properties of the job and the queue, if available, are accessible as follows:

| RQ property     | Sampling context key         | Example                  |
| --------------- | ---------------------------- | ------------------------ |
| `rq_job.args`   | `rq.job.args.{index}`        | `rq.job.args.0`          |
| `rq_job.kwargs` | `rq.job.kwargs.{kwarg}`      | `rq.job.kwargs.my_kwarg` |
| `rq_job.func`   | `rq.job.func`                |                          |
| `queue.name`    | `messaging.destination.name` |                          |
| `rq_job.id`     | `messaging.message.id`       |                          |

Note that `rq.job.args`, `rq.job.kwargs`, and `rq.job.func` are serialized and not the actual objects on the job.
</Expandable>

<Expandable title='AWS Lambda sampling context changes'>
The AWS Lambda integration doesn't add the `aws_event` and `aws_context` objects anymore. Instead, the following, if available, is accessible:

| AWS property                                | Sampling context key(s)        |
| ------------------------------------------- | ------------------------------ |
| `aws_event["httpMethod"]`                   | `http.request.method`          |
| `aws_event["queryStringParameters"]`        | `url.query`                    |
| `aws_event["path"]`                         | `url.path`                     |
| full URL                                    | `url.full`                     |
| `aws_event["headers"]["X-Forwarded-Proto"]` | `network.protocol.name`        |
| `aws_event["headers"]["Host"]`              | `server.address`               |
| `aws_context["function_name"]`              | `faas.name`                    |
| `aws_event["headers"]`                      | `http.request.header.{header}` |
</Expandable>

<Expandable title='GCP sampling context changes'>
The GCP integration doesn't add the `gcp_env` and `gcp_event` keys anymore. Instead, the following, if available, is accessible:

| Old sampling context key          | New sampling context key       |
| --------------------------------- | ------------------------------ |
| `gcp_env["function_name"]`        | `faas.name`                    |
| `gcp_env["function_region"]`      | `faas.region`                  |
| `gcp_env["function_project"]`     | `gcp.function.project`         |
| `gcp_env["function_identity"]`    | `gcp.function.identity`        |
| `gcp_env["function_entry_point"]` | `gcp.function.entry_point`     |
| `gcp_event.method`                | `http.request.method`          |
| `gcp_event.query_string`          | `url.query`                    |
| `gcp_event.headers`               | `http.request.header.{header}` |
</Expandable>

The ability to set `custom_sampling_context` on `start_transaction` was removed. If there is custom data that you want to have accessible in the `sampling_context` of a `traces_sampler` or `profiles_sampler`, set it on the span via the `attributes` argument, as all span attributes are now included in the `sampling_context` by default:

```python diff
-with start_transaction(custom_sampling_context={"custom_attribute": "custom_value"}):
+with start_span(attributes={"custom_attribute": "custom_value"}) as span:
    # custom_attribute will now be accessible in the sampling context
    # of your traces_sampler/profiles_sampler
    ...
```

<Alert title="Span attribute type restrictions" level="warning">

As mentioned above, span attribute keys must be non-empty strings and values may only be certain primitive types (excluding `None`) or a homogeneous list of one of those types. See [the OpenTelemetry specification](https://opentelemetry.io/docs/specs/otel/common/#attribute) for details.

</Alert>

## Errors

We've updated how we handle `ExceptionGroup`s. You will now get more data if `ExceptionGroup`s appear in chained exceptions. As an indirect consequence, you might notice a change in how issues are grouped in Sentry.

## Integrations

Additional integrations will now be activated automatically if the SDK detects the respective package is installed: Ariadne, ARQ, asyncpg, Chalice, clickhouse-driver, GQL, Graphene, huey, Loguru, PyMongo, Quart, Starlite, Strawberry. You can [opt out of specific integrations with the `disabled_integrations` option](/platforms/python/integrations/#disabling-integrations).
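
For example, to keep one of these integrations from being enabled (a sketch using Loguru; swap in whichever integration you want to turn off):

```python
import sentry_sdk
from sentry_sdk.integrations.loguru import LoguruIntegration

sentry_sdk.init(
    disabled_integrations=[
        # Don't enable the Loguru integration, even though the package
        # is installed.
        LoguruIntegration(),
    ],
)
```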

We no longer support Django older than 2.0, trytond older than 5.0, and Falcon older than 3.0.

### Logging

The logging integration, which implements out-of-the-box support for the Python standard library `logging` framework, no longer captures error logs as events by default. The original behavior can still be achieved by providing a custom `event_level` to the `LoggingIntegration`:

```python
import sentry_sdk
from sentry_sdk.integrations.logging import LoggingIntegration

sentry_sdk.init(
    integrations=[
        # capture error, critical, exception logs
        # and send them to Sentry as errors
        LoggingIntegration(event_level="ERROR"),
    ],
)
```

### clickhouse-driver

The query being executed is now available under the `db.query.text` span attribute (only if `send_default_pii` is `True`).
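
For example, to have query text attached to clickhouse-driver spans (a minimal sketch of the relevant option):

```python
import sentry_sdk

sentry_sdk.init(
    # Without send_default_pii=True, the query text is not attached
    # to clickhouse-driver spans.
    send_default_pii=True,
)
```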

### PyMongo

The PyMongo integration no longer sets tags automatically. The data is still accessible via span attributes.

The PyMongo integration doesn't set `operation_ids` anymore. The individual IDs (`operation_id`, `request_id`, `session_id`) are now accessible as separate span attributes.

### Redis

In Redis pipeline spans, there is no longer a `span["data"]["redis.commands"]` entry containing a dictionary like `{"count": 3, "first_ten": ["cmd1", "cmd2", ...]}`. Instead, there are separate `span["data"]["redis.commands.count"]` (containing `3`) and `span["data"]["redis.commands.first_ten"]` (containing `["cmd1", "cmd2", ...]`) attributes.

## Measurements

The `set_measurement()` API was removed. You can set custom attributes on the span instead with `set_attribute()`.
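
For example, a sketch of the replacement (the measurement name, value, and unit are made up; if you need the unit, encode it yourself, for example in the attribute name):

```python diff
with sentry_sdk.start_span() as span:
-    span.set_measurement("memory_used", 123, "byte")
+    span.set_attribute("memory_used_bytes", 123)
```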

## Sessions

The `auto_session_tracking()` context manager was removed. Use `track_session()` instead.

## Scope

Setting `Scope.user` directly is no longer supported. Use `Scope.set_user()` instead.
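
For example (the user payload is illustrative):

```python diff
import sentry_sdk

scope = sentry_sdk.get_current_scope()
-scope.user = {"id": "42", "email": "jane@example.com"}
+scope.set_user({"id": "42", "email": "jane@example.com"})
```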

## Metrics

The `sentry_sdk.metrics` API doesn't exist anymore in SDK `3.x` as the [metrics beta has come to an end](https://sentry.zendesk.com/hc/en-us/articles/26369339769883-Metrics-Beta-Coming-to-an-End). The associated experimental options `enable_metrics`, `before_emit_metric` and `metric_code_locations` have been removed as well.

## Internals

There is no concept of a hub anymore and all APIs and attributes that were connected to hubs have been removed.
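
If you still have code that reaches for a hub, switch to the equivalent top-level or scope APIs. A minimal sketch, assuming typical 2.x hub usage:

```python diff
-from sentry_sdk import Hub
-
-Hub.current.capture_message("Something went wrong")
+import sentry_sdk
+
+sentry_sdk.capture_message("Something went wrong")
```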
I don't think we reference `ExceptionGroup`s in our documentation today. Is this something you want added into our Python docs, @sentrivana?

We could, but I believe that's more of a nice-to-have. The SDK supports Python `ExceptionGroup`s natively so folks don't really have to care about them, and there is also no way to configure them so there's not much to document.