Commit dd364bf

sentrivana, sfanahata, and antonpirker authored
feat(python): Add the 3.x migration guide (#11327)
We'll release an alpha version of the new major of the Python SDK soon. Preparing a migration guide to go with it. --------- Co-authored-by: Shannon Anahata <[email protected]> Co-authored-by: Anton Pirker <[email protected]>
1 parent aafc199 commit dd364bf

1 file changed: +340 -0 lines changed

---
title: Migrate from 2.x to 3.x
sidebar_order: 8997
description: "Learn about migrating from sentry-python 2.x to 3.x"
---

<Alert title="Python SDK 3.0 pre-release">

Version 3.0 of the Sentry Python SDK is currently in pre-release. If you feel like giving it a spin, check out [our most recent releases](https://pypi.org/project/sentry-sdk/#history). Your feedback at this stage is invaluable, so please let us know about your experience, whether positive or negative, [on GitHub](https://github.com/getsentry/sentry-python/discussions/3936) or [on Discord](https://discord.gg/wdNEHETs87): How did the migration go? Did you encounter any issues? Is everything working as expected?

</Alert>

This guide describes the common patterns involved in migrating to version `3.x` of the `sentry-python` SDK. For the full list of changes, check out the [detailed migration guide in the repository](https://github.com/getsentry/sentry-python/blob/potel-base/MIGRATION_GUIDE.md).

## Python Version Support

Sentry Python SDK `3.x` only supports Python 3.7 and higher. If you're on an older Python version, you'll need to stay on an older version of the SDK:

- Python 2.7-3.5: SDK `1.x`
- Python 3.6: SDK `2.x`

## Configuration

The `enable_tracing` option was removed. Use [`traces_sample_rate`](/platforms/python/configuration/options/#traces_sample_rate) directly, or configure a [`traces_sampler`](/platforms/python/configuration/options/#traces_sampler) for more fine-grained control over which spans should be sampled.

```python diff
sentry_sdk.init(
-    enable_tracing=True,
+    traces_sample_rate=1.0,
)
```

The deprecated `propagate_traces` option was removed. Use [`trace_propagation_targets`](/platforms/python/configuration/options/#trace_propagation_targets) instead.

```python diff
sentry_sdk.init(
    # don't propagate trace info downstream
-    propagate_traces=False,
+    trace_propagation_targets=[],
)
```

Note that this only affects the global SDK option. The [`propagate_traces`](/platforms/python/integrations/celery/#options) option of the Celery integration remains unchanged.

The `profiles_sample_rate` and `profiler_mode` options previously nested under `_experiments` have been removed. They're replaced by top-level options of the same name:

```python diff
sentry_sdk.init(
-    _experiments={
-        "profiles_sample_rate": 1.0,
-        "profiler_mode": "thread",
-    },
+    profiles_sample_rate=1.0,
+    profiler_mode="thread",
)
```

## API Changes

`add_attachment()` is now part of the top-level API and should be imported and used directly from `sentry_sdk`.

```python diff
import sentry_sdk

- scope = sentry_sdk.get_current_scope()
- scope.add_attachment(bytes=b"Hello World!", filename="attachment.txt")
+ sentry_sdk.add_attachment(bytes=b"Hello World!", filename="attachment.txt")
```

Using `sentry_sdk.add_attachment()` directly also makes sure the attachment is added to the correct scope internally.

### Custom Tracing API

Tracing in the Sentry Python SDK `3.x` is powered by [OpenTelemetry](https://opentelemetry.io/) in the background, which also means we're moving away from the Sentry-specific concept of transactions and towards a span-only future. `sentry_sdk.start_transaction()` is now deprecated in favor of `sentry_sdk.start_span()`.

```python diff
- with sentry_sdk.start_transaction():
+ with sentry_sdk.start_span():
      ...
```

Any spans without a parent span will become transactions by default. If you want to avoid promoting a span without a parent to a transaction, you can pass the `only_if_parent=True` keyword argument to `sentry_sdk.start_span()`.

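For example, a minimal sketch of a span that should only be recorded inside an existing trace (the span name and the `name` argument here are illustrative assumptions, not taken from this guide):

```python
import sentry_sdk

# Only recorded if a parent span is already active; without a parent,
# this span will not be promoted to a transaction.
with sentry_sdk.start_span(name="cache-lookup", only_if_parent=True):
    ...
```
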
`sentry_sdk.start_transaction()` and `sentry_sdk.start_span()` no longer take the following arguments: `trace_id`, `baggage`, `span_id`, `parent_span_id`. Use `sentry_sdk.continue_trace()` for propagating trace data.

`sentry_sdk.continue_trace()` no longer returns a `Transaction` and is now a context manager. To continue a trace from headers or environment variables, start a new span inside `sentry_sdk.continue_trace()`:

```python diff
- transaction = sentry_sdk.continue_trace({...})
- with sentry_sdk.start_transaction(transaction=transaction):
-     ...
+ with sentry_sdk.continue_trace({...}):
+     with sentry_sdk.start_span():
+         ...
```

The functions `continue_from_headers`, `continue_from_environ` and `from_traceparent` have been removed. Use the `sentry_sdk.continue_trace()` context manager instead.

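For instance, continuing a trace from the headers of an incoming request could look roughly like this (the header values and the span name are illustrative placeholders):

```python
import sentry_sdk

# Illustrative headers of an incoming HTTP request
incoming_headers = {
    "sentry-trace": "771a43a4192642f0b136d5159a501700-1234567890abcdef-1",
    "baggage": "sentry-trace_id=771a43a4192642f0b136d5159a501700",
}

with sentry_sdk.continue_trace(incoming_headers):
    with sentry_sdk.start_span(name="handle-request"):
        ...
```
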
## Span Data

In OpenTelemetry, there is no concept of separate categories of data on a span: everything is simply a span attribute. This is a concept the Sentry SDK is also adopting. We deprecated `set_data()` and added a new span method called `set_attribute()`:

```python diff
with sentry_sdk.start_span(...) as span:
-     span.set_data("my_attribute", "my_value")
+     span.set_attribute("my_attribute", "my_value")
```

You can also set attributes directly when creating the span. This has the advantage that these initial attributes will be accessible in the sampling context in your `traces_sampler`/`profiles_sampler` (see also the [Sampling section](#sampling)).

```python
with sentry_sdk.start_span(attributes={"my_attribute": "my_value"}):
    ...
```

<Alert title="Span attribute type restrictions" level="warning">

There are important type restrictions to consider when setting attributes on a span via `span.set_attribute()` and `start_span(attributes={...})`. The keys must be non-empty strings, and the values must be one of a handful of primitive types (excluding `None`) or a list of a single primitive type. See [the OpenTelemetry specification](https://opentelemetry.io/docs/specs/otel/common/#attribute) for details.

Note that since the SDK now exclusively uses span attributes, this restriction also applies to other ways of setting data on a span, such as `span.set_data()`, `span.set_measurement()`, and `span.set_context()`.

</Alert>

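For illustration, here is a minimal sketch of attribute values that satisfy these restrictions (the span and attribute names are placeholders, and the `name` argument is an assumption):

```python
import sentry_sdk

with sentry_sdk.start_span(name="attribute-demo") as span:
    # Allowed: strings, booleans, integers, floats ...
    span.set_attribute("string_attr", "value")
    span.set_attribute("bool_attr", True)
    span.set_attribute("int_attr", 42)
    span.set_attribute("float_attr", 3.14)
    # ... as well as lists of a single primitive type
    span.set_attribute("tags_attr", ["a", "b", "c"])
    # Not allowed: None, dicts, or other complex objects
```
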
## Sampling

It's no longer possible to change the sampling decision of a span by setting `span.sampled` directly after the span has been created. Use either a custom `traces_sampler` (preferred) or the `sampled` argument to `start_span()` for determining whether a span should be sampled.

```python
with sentry_sdk.start_span(sampled=True) as span:
    ...
```

<Alert title="Sampling non-root spans" level="warning">

Both `traces_sampler` and the `sampled` argument will only influence whether root spans (transactions) are sampled. They can't be used for sampling child spans.

</Alert>

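As a rough sketch of the `traces_sampler` approach (the `url.path` key is one of the sampling context keys listed below; the path value and the rates are illustrative assumptions):

```python
import sentry_sdk

def traces_sampler(sampling_context):
    # sampling_context contains the span attributes, e.g. "url.path"
    # for spans created by the web framework integrations.
    if sampling_context.get("url.path") == "/health":
        return 0  # drop health checks
    return 0.25  # sample a quarter of everything else

sentry_sdk.init(
    dsn="...",
    traces_sampler=traces_sampler,
)
```
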
The `sampling_context` argument of `traces_sampler` and `profiles_sampler` has changed considerably for spans coming from our auto-instrumented integrations. As a consequence of using OpenTelemetry under the hood, spans can only carry specific, primitive types of data. This prevents us from exposing custom objects, such as the `Request` object of several web frameworks, on the span.

<Expandable title='AIOHTTP sampling context changes'>
The AIOHTTP integration doesn't add the `aiohttp_request` object anymore. Instead, some of the individual properties of the request are accessible, if available, as follows:

| Request property | Sampling context key(s) |
| ----------------- | ------------------------------- |
| `path` | `url.path` |
| `query_string` | `url.query` |
| `method` | `http.request.method` |
| `host` | `server.address`, `server.port` |
| `scheme` | `url.scheme` |
| full URL | `url.full` |
| `request.headers` | `http.request.header.{header}` |
</Expandable>

<Expandable title='Celery sampling context changes'>
The Celery integration doesn't add the `celery_job` dictionary anymore. Instead, the individual keys are now available as:

| Dictionary keys | Sampling context key | Example |
| ---------------------- | --------------------------- | ------------------------------ |
| `celery_job["args"]` | `celery.job.args.{index}` | `celery.job.args.0` |
| `celery_job["kwargs"]` | `celery.job.kwargs.{kwarg}` | `celery.job.kwargs.kwarg_name` |
| `celery_job["task"]` | `celery.job.task` | |
</Expandable>

<Expandable title='Tornado sampling context changes'>
The Tornado integration doesn't add the `tornado_request` object anymore. Instead, some of the individual properties of the request are accessible, if available, as follows:

| Request property | Sampling context key(s) |
| ----------------- | --------------------------------------------------- |
| `path` | `url.path` |
| `query` | `url.query` |
| `protocol` | `url.scheme` |
| `method` | `http.request.method` |
| `host` | `server.address`, `server.port` |
| `version` | `network.protocol.name`, `network.protocol.version` |
| full URL | `url.full` |
| `request.headers` | `http.request.header.{header}` |
</Expandable>

<Expandable title='WSGI sampling context changes'>
The WSGI integration doesn't add the `wsgi_environ` object anymore. Instead, the individual properties of the environment are accessible, if available, as follows:

| Env property | Sampling context key(s) |
| ----------------- | ------------------------------------------------- |
| `PATH_INFO` | `url.path` |
| `QUERY_STRING` | `url.query` |
| `REQUEST_METHOD` | `http.request.method` |
| `SERVER_NAME` | `server.address` |
| `SERVER_PORT` | `server.port` |
| `SERVER_PROTOCOL` | `server.protocol.name`, `server.protocol.version` |
| `wsgi.url_scheme` | `url.scheme` |
| full URL | `url.full` |
| `HTTP_*` | `http.request.header.{header}` |
</Expandable>

<Expandable title='ASGI sampling context changes'>
The ASGI integration doesn't add the `asgi_scope` object anymore. Instead, the individual properties of the scope, if available, are accessible as follows:

| Scope property | Sampling context key(s) |
| -------------- | ------------------------------- |
| `type` | `network.protocol.name` |
| `scheme` | `url.scheme` |
| `path` | `url.path` |
| `query` | `url.query` |
| `http_version` | `network.protocol.version` |
| `method` | `http.request.method` |
| `server` | `server.address`, `server.port` |
| `client` | `client.address`, `client.port` |
| full URL | `url.full` |
| `headers` | `http.request.header.{header}` |
</Expandable>

<Expandable title='RQ sampling context changes'>
The RQ integration doesn't add the `rq_job` object anymore. Instead, the individual properties of the job and the queue, if available, are accessible as follows:

| RQ property | Sampling context key | Example |
| --------------- | ---------------------------- | ------------------------ |
| `rq_job.args` | `rq.job.args.{index}` | `rq.job.args.0` |
| `rq_job.kwargs` | `rq.job.kwargs.{kwarg}` | `rq.job.kwargs.my_kwarg` |
| `rq_job.func` | `rq.job.func` | |
| `queue.name` | `messaging.destination.name` | |
| `rq_job.id` | `messaging.message.id` | |

Note that `rq.job.args`, `rq.job.kwargs`, and `rq.job.func` are serialized and not the actual objects on the job.
</Expandable>

<Expandable title='AWS Lambda sampling context changes'>
The AWS Lambda integration doesn't add the `aws_event` and `aws_context` objects anymore. Instead, the following, if available, is accessible:

| AWS property | Sampling context key(s) |
| ------------------------------------------- | ------------------------------- |
| `aws_event["httpMethod"]` | `http.request.method` |
| `aws_event["queryStringParameters"]` | `url.query` |
| `aws_event["path"]` | `url.path` |
| full URL | `url.full` |
| `aws_event["headers"]["X-Forwarded-Proto"]` | `network.protocol.name` |
| `aws_event["headers"]["Host"]` | `server.address` |
| `aws_context["function_name"]` | `faas.name` |
| `aws_event["headers"]` | `http.request.headers.{header}` |
</Expandable>

<Expandable title='GCP sampling context changes'>
The GCP integration doesn't add the `gcp_env` and `gcp_event` keys anymore. Instead, the following, if available, is accessible:

| Old sampling context key | New sampling context key |
| --------------------------------- | ------------------------------ |
| `gcp_env["function_name"]` | `faas.name` |
| `gcp_env["function_region"]` | `faas.region` |
| `gcp_env["function_project"]` | `gcp.function.project` |
| `gcp_env["function_identity"]` | `gcp.function.identity` |
| `gcp_env["function_entry_point"]` | `gcp.function.entry_point` |
| `gcp_event.method` | `http.request.method` |
| `gcp_event.query_string` | `url.query` |
| `gcp_event.headers` | `http.request.header.{header}` |
</Expandable>

The ability to set `custom_sampling_context` on `start_transaction` was removed. If there is custom data that you want to have accessible in the `sampling_context` of a `traces_sampler` or `profiles_sampler`, set it on the span via the `attributes` argument, as all span attributes are now included in the `sampling_context` by default:

```python diff
- with start_transaction(custom_sampling_context={"custom_attribute": "custom_value"}):
+ with start_span(attributes={"custom_attribute": "custom_value"}) as span:
      # custom_attribute will now be accessible in the sampling context
      # of your traces_sampler/profiles_sampler
      ...
```

<Alert title="Span attribute type restrictions" level="warning">

As mentioned above, span attribute keys must be non-empty strings, and values must be one of a handful of primitive types (excluding `None`) or a list of a single primitive type. See [the OpenTelemetry specification](https://opentelemetry.io/docs/specs/otel/common/#attribute) for details.

</Alert>

## Errors

We've updated how we handle `ExceptionGroup`s. You will now get more data if `ExceptionGroup`s appear in chained exceptions. As an indirect consequence, you might notice a change in how issues are grouped in Sentry.

## Integrations

Additional integrations will now be activated automatically if the SDK detects the respective package is installed: Ariadne, ARQ, asyncpg, Chalice, clickhouse-driver, GQL, Graphene, huey, Loguru, PyMongo, Quart, Starlite, Strawberry. You can [opt out of specific integrations with the `disabled_integrations` option](/platforms/python/integrations/#disabling-integrations).

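For example, a sketch of disabling one of the auto-enabled integrations (Loguru is used here purely as an illustration):

```python
import sentry_sdk
from sentry_sdk.integrations.loguru import LoguruIntegration

sentry_sdk.init(
    dsn="...",
    # keep the Loguru integration from being enabled automatically
    disabled_integrations=[LoguruIntegration()],
)
```
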
We no longer support Django older than 2.0, trytond older than 5.0, and Falcon older than 3.0.

### Logging

The logging integration, which implements out-of-the-box support for the Python standard library `logging` framework, no longer captures error logs as events by default. The original behavior can still be achieved by providing a custom `event_level` to the `LoggingIntegration`:

```python
import sentry_sdk
from sentry_sdk.integrations.logging import LoggingIntegration

sentry_sdk.init(
    integrations=[
        # capture error, critical, exception logs
        # and send them to Sentry as errors
        LoggingIntegration(event_level="ERROR"),
    ],
)
```

### clickhouse-driver

The query being executed is now available under the `db.query.text` span attribute (only if `send_default_pii` is `True`).

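A minimal sketch of the relevant init option, in case you want the query text attached:

```python
import sentry_sdk

sentry_sdk.init(
    dsn="...",
    # without this, the query text is not attached to the span
    send_default_pii=True,
)
```
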
### PyMongo

The PyMongo integration no longer sets tags automatically. The data is still accessible via span attributes.

The PyMongo integration doesn't set `operation_ids` anymore. The individual IDs (`operation_id`, `request_id`, `session_id`) are now accessible as separate span attributes.

### Redis

In Redis pipeline spans, there is no longer a `span["data"]["redis.commands"]` entry containing a dictionary `{"count": 3, "first_ten": ["cmd1", "cmd2", ...]}`. Instead, there are two separate entries: `span["data"]["redis.commands.count"]` (containing `3`) and `span["data"]["redis.commands.first_ten"]` (containing `["cmd1", "cmd2", ...]`).

## Measurements

The `set_measurement()` API was removed. You can set custom attributes on the span instead with `set_attribute()`.

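A rough equivalent under the new API (the span name, attribute name, and value are placeholders):

```python
import sentry_sdk

with sentry_sdk.start_span(name="process-items") as span:
    # previously: span.set_measurement("items_processed", 42)
    span.set_attribute("items_processed", 42)
```
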
## Sessions

The `auto_session_tracking()` context manager was removed. Use `track_session()` instead.

## Scope

Setting `Scope.user` directly is no longer supported. Use `Scope.set_user()` instead.

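A small sketch of the change (how the scope is obtained and the user fields are illustrative placeholders):

```python
import sentry_sdk

scope = sentry_sdk.get_current_scope()
# previously: scope.user = {"id": "42", "username": "placeholder"}
scope.set_user({"id": "42", "username": "placeholder"})
```
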
## Metrics

The `sentry_sdk.metrics` API doesn't exist anymore in SDK `3.x` as the [metrics beta has come to an end](https://sentry.zendesk.com/hc/en-us/articles/26369339769883-Metrics-Beta-Coming-to-an-End). The associated experimental options `enable_metrics`, `before_emit_metric` and `metric_code_locations` have been removed as well.

## Internals

There is no concept of a hub anymore and all APIs and attributes that were connected to hubs have been removed.
