Description
How do you use Sentry?
Sentry SaaS (sentry.io)
Version
1.45.0
Steps to Reproduce
Run the following Python script, setting SENTRY_DSN accordingly.
Alternatively, clone from https://github.com/jwhitaker-gridcog/sentry-bug .
Python script
import subprocess
import sentry_sdk
import multiprocessing
import argparse
import sys
import os

# set SENTRY_DSN in env

# worker: this worker is used in two demos: first by multiprocessing, second by subprocess.
def worker(baggage_and_tp):
    sentry_sdk.init(traces_sample_rate=1.0, environment="jarrad-local", debug=True)
    baggage, traceparent = baggage_and_tp
    try:
        with sentry_sdk.continue_trace(
            {"baggage": baggage, "sentry-trace": traceparent}, op="task", name="child"
        ) as worker_tx:
            # # workaround:
            # if worker_tx.parent_sampled:
            #     worker_tx.sampled = True
            #     worker_tx.init_span_recorder(1000)
            sentry_sdk.capture_message("hi")
    finally:
        sentry_sdk.Hub.current.flush()

# demo 1: worker is started with multiprocessing.Process(). baggage is passed with multiprocessing magic.
def main_multiprocessing():
    sentry_sdk.init(traces_sample_rate=1.0, environment="jarrad-local", debug=True)
    with sentry_sdk.start_transaction(op="task", name="parent") as tx:
        baggage, traceparent = sentry_sdk.get_baggage(), sentry_sdk.get_traceparent()
        proc = multiprocessing.Process(
            target=worker, name="worker", args=[(baggage, traceparent)]
        )
        proc.start()
        proc.join()

# helper for demo 2: pull baggage from env and pass to worker.
def worker_subprocess():
    baggage, traceparent = (
        os.environ["SENTRY_BAGGAGE"],
        os.environ["SENTRY_TRACEPARENT"],
    )
    worker((baggage, traceparent))

# demo 2: worker is started with subprocess. baggage is passed with env vars.
def main_subprocess():
    sentry_sdk.init(traces_sample_rate=1.0, environment="jarrad-local", debug=True)
    with sentry_sdk.start_transaction(op="task", name="parent") as tx:
        baggage, traceparent = sentry_sdk.get_baggage(), sentry_sdk.get_traceparent()
        subprocess.run(
            [sys.executable, sys.argv[0], "worker_subp"],
            check=True,
            env={
                "SENTRY_BAGGAGE": baggage or "",
                "SENTRY_TRACEPARENT": traceparent or "",
            },
        )

if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument(
        "command",
        choices=["main_mp", "main_subp", "worker_subp"],
    )
    args = parser.parse_args()
    match args.command:
        # demo 1
        case "main_mp":
            main_multiprocessing()
        # demo 2
        case "main_subp":
            main_subprocess()
        # helper for demo 2, you shouldn't need to call this yourself
        case "worker_subp":
            worker_subprocess()
        case other:
            raise Exception(f"unreachable: {other}")
Run with demo.py main_mp or demo.py main_subp. Each demonstrates the issue in a different context.
I came across this issue using multiprocessing and suspected some strangeness around sentry_sdk.init() vs fork(), so I wanted to include what I'm really doing. However, I was also able to reproduce it without multiprocessing being involved at all, and I didn't want this closed prematurely as a dupe of #291 :)
Expected Result
The worker transaction should be sent to Sentry, because its parent is sampled. According to the docs at https://docs.sentry.io/platforms/python/configuration/sampling/#inheritance:
Whatever a transaction's sampling decision, that decision will be passed to its child spans and from there to any transactions they subsequently cause in other services.
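To make the documented rule concrete, here is a toy model of the inheritance I would expect (my own illustration with made-up helper names, not SDK internals): a continued trace should take the parent's decision from the propagated sentry-trace header, and only a root transaction should consult the sample rate.

```python
# Toy illustration of the documented inheritance rule -- not SDK internals.
# A sentry-trace header looks like "<trace_id>-<span_id>-<sampled_flag>".
import random


def parse_sentry_trace(header: str):
    """Split a sentry-trace header into trace id, parent span id, and the
    parent's sampling decision ("1" means sampled)."""
    trace_id, parent_span_id, flag = header.split("-")
    return trace_id, parent_span_id, flag == "1"


def child_sampling_decision(parent_sampled, sample_rate):
    """A continued trace inherits the parent's decision; only a root
    transaction (no parent decision) rolls the dice against sample_rate."""
    if parent_sampled is not None:
        return parent_sampled
    return random.random() < sample_rate


# Trace and span ids here match the ones in the debug log below.
_, _, parent_sampled = parse_sentry_trace(
    "300a4d0c53a944119ef5a2de17655f0d-b9c38c13eed5ca6b-1"
)
print(child_sampling_decision(parent_sampled, 1.0))  # parent sampled -> True
```

By this reading, the worker's transaction should come out sampled regardless of its own sample rate, because parent_sampled arrives as True.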
Actual Result
However, in the logs I can see:
[sentry] DEBUG: Setting SDK name to 'sentry.python'
[sentry] DEBUG: [Tracing] Starting <task> transaction <parent>
...
[sentry] DEBUG: Setting SDK name to 'sentry.python'
[sentry] DEBUG: [Tracing] Extracted propagation context from incoming data:
... { 'dynamic_sampling_context': {
... 'trace_id': '300a4d0c53a944119ef5a2de17655f0d', 'environment': 'jarrad-local', 'release': '56f30f3b5f41cc68cae84e20b0913eb4c91cb6aa',
... 'public_key': '0166f7d1273fb9a6793b04c3b7e5b858', 'transaction': 'parent', 'sample_rate': '1.0', 'sampled': 'true'
... }, 'trace_id': '300a4d0c53a944119ef5a2de17655f0d', 'parent_span_id': 'b9c38c13eed5ca6b', 'parent_sampled': True, 'span_id': 'a3094c45744e14cd'}
[sentry] DEBUG: Discarding transaction because sampled = False
Full logs:
$> poetry run python demo.py main_mp
[sentry] DEBUG: Setting up integrations (with default = True)
[sentry] DEBUG: Did not import default integration sentry_sdk.integrations.aiohttp.AioHttpIntegration: AIOHTTP not installed
[sentry] DEBUG: Did not import default integration sentry_sdk.integrations.boto3.Boto3Integration: botocore is not installed
[sentry] DEBUG: Did not import default integration sentry_sdk.integrations.bottle.BottleIntegration: Bottle not installed
[sentry] DEBUG: Did not import default integration sentry_sdk.integrations.celery.CeleryIntegration: Celery not installed
[sentry] DEBUG: Did not import default integration sentry_sdk.integrations.django.DjangoIntegration: Django not installed
[sentry] DEBUG: Did not import default integration sentry_sdk.integrations.falcon.FalconIntegration: Falcon not installed
[sentry] DEBUG: Did not import default integration sentry_sdk.integrations.fastapi.FastApiIntegration: Starlette is not installed
[sentry] DEBUG: Did not import default integration sentry_sdk.integrations.flask.FlaskIntegration: Flask is not installed
[sentry] DEBUG: Did not import default integration sentry_sdk.integrations.httpx.HttpxIntegration: httpx is not installed
[sentry] DEBUG: Did not import default integration sentry_sdk.integrations.openai.OpenAIIntegration: OpenAI not installed
[sentry] DEBUG: Did not import default integration sentry_sdk.integrations.pyramid.PyramidIntegration: Pyramid not installed
[sentry] DEBUG: Did not import default integration sentry_sdk.integrations.rq.RqIntegration: RQ not installed
[sentry] DEBUG: Did not import default integration sentry_sdk.integrations.sanic.SanicIntegration: Sanic not installed
[sentry] DEBUG: Did not import default integration sentry_sdk.integrations.sqlalchemy.SqlalchemyIntegration: SQLAlchemy not installed.
[sentry] DEBUG: Did not import default integration sentry_sdk.integrations.starlette.StarletteIntegration: Starlette is not installed
[sentry] DEBUG: Did not import default integration sentry_sdk.integrations.tornado.TornadoIntegration: Tornado not installed
[sentry] DEBUG: Setting up previously not enabled integration argv
[sentry] DEBUG: Setting up previously not enabled integration atexit
[sentry] DEBUG: Setting up previously not enabled integration dedupe
[sentry] DEBUG: Setting up previously not enabled integration excepthook
[sentry] DEBUG: Setting up previously not enabled integration logging
[sentry] DEBUG: Setting up previously not enabled integration modules
[sentry] DEBUG: Setting up previously not enabled integration stdlib
[sentry] DEBUG: Setting up previously not enabled integration threading
[sentry] DEBUG: Setting up previously not enabled integration redis
[sentry] DEBUG: Did not enable default integration redis: Redis client not installed
[sentry] DEBUG: Enabling integration argv
[sentry] DEBUG: Enabling integration atexit
[sentry] DEBUG: Enabling integration dedupe
[sentry] DEBUG: Enabling integration excepthook
[sentry] DEBUG: Enabling integration logging
[sentry] DEBUG: Enabling integration modules
[sentry] DEBUG: Enabling integration stdlib
[sentry] DEBUG: Enabling integration threading
[sentry] DEBUG: Setting SDK name to 'sentry.python'
[sentry] DEBUG: [Tracing] Starting <task> transaction <parent>
[sentry] DEBUG: [Profiling] Discarding profile because profiler was not started.
[sentry] DEBUG: Setting up integrations (with default = True)
[sentry] DEBUG: Did not import default integration sentry_sdk.integrations.aiohttp.AioHttpIntegration: AIOHTTP not installed
[sentry] DEBUG: Did not import default integration sentry_sdk.integrations.boto3.Boto3Integration: botocore is not installed
[sentry] DEBUG: Did not import default integration sentry_sdk.integrations.bottle.BottleIntegration: Bottle not installed
[sentry] DEBUG: Did not import default integration sentry_sdk.integrations.celery.CeleryIntegration: Celery not installed
[sentry] DEBUG: Did not import default integration sentry_sdk.integrations.django.DjangoIntegration: Django not installed
[sentry] DEBUG: Did not import default integration sentry_sdk.integrations.falcon.FalconIntegration: Falcon not installed
[sentry] DEBUG: Did not import default integration sentry_sdk.integrations.fastapi.FastApiIntegration: Starlette is not installed
[sentry] DEBUG: Did not import default integration sentry_sdk.integrations.flask.FlaskIntegration: Flask is not installed
[sentry] DEBUG: Did not import default integration sentry_sdk.integrations.httpx.HttpxIntegration: httpx is not installed
[sentry] DEBUG: Did not import default integration sentry_sdk.integrations.openai.OpenAIIntegration: OpenAI not installed
[sentry] DEBUG: Did not import default integration sentry_sdk.integrations.pyramid.PyramidIntegration: Pyramid not installed
[sentry] DEBUG: Did not import default integration sentry_sdk.integrations.rq.RqIntegration: RQ not installed
[sentry] DEBUG: Did not import default integration sentry_sdk.integrations.sanic.SanicIntegration: Sanic not installed
[sentry] DEBUG: Did not import default integration sentry_sdk.integrations.sqlalchemy.SqlalchemyIntegration: SQLAlchemy not installed.
[sentry] DEBUG: Did not import default integration sentry_sdk.integrations.starlette.StarletteIntegration: Starlette is not installed
[sentry] DEBUG: Did not import default integration sentry_sdk.integrations.tornado.TornadoIntegration: Tornado not installed
[sentry] DEBUG: Enabling integration argv
[sentry] DEBUG: Enabling integration atexit
[sentry] DEBUG: Enabling integration dedupe
[sentry] DEBUG: Enabling integration excepthook
[sentry] DEBUG: Enabling integration logging
[sentry] DEBUG: Enabling integration modules
[sentry] DEBUG: Enabling integration stdlib
[sentry] DEBUG: Enabling integration threading
[sentry] DEBUG: Setting SDK name to 'sentry.python'
[sentry] DEBUG: [Tracing] Extracted propagation context from incoming data: {'dynamic_sampling_context': {'trace_id': '300a4d0c53a944119ef5a2de17655f0d', 'environment': 'jarrad-local', 'release': '56f30f3b5f41cc68cae84e20b0913eb4c91cb6aa', 'public_key': '0166f7d1273fb9a6793b04c3b7e5b858', 'transaction': 'parent', 'sample_rate': '1.0', 'sampled': 'true'}, 'trace_id': '300a4d0c53a944119ef5a2de17655f0d', 'parent_span_id': 'b9c38c13eed5ca6b', 'parent_sampled': True, 'span_id': 'a3094c45744e14cd'}
[sentry] DEBUG: Discarding transaction because sampled = False
[sentry] DEBUG: Flushing HTTP transport
[sentry] DEBUG: background worker got flush request
[sentry] DEBUG: Sending envelope [envelope with 1 items (error)] project:4506627083468800 host:o430711.ingest.sentry.io
[sentry] DEBUG: Sending envelope [envelope with 1 items (internal)] project:4506627083468800 host:o430711.ingest.sentry.io
[sentry] DEBUG: 1 event(s) pending on flush
[sentry] DEBUG: background worker flushed
[sentry] DEBUG: atexit: got shutdown signal
[sentry] DEBUG: atexit: shutting down client
[sentry] DEBUG: Flushing HTTP transport
[sentry] DEBUG: background worker got flush request
[sentry] DEBUG: Sending envelope [envelope with 1 items (transaction)] project:4506627083468800 host:o430711.ingest.sentry.io
[sentry] DEBUG: background worker flushed
[sentry] DEBUG: Killing HTTP transport
[sentry] DEBUG: background worker got kill request
If I work around the issue with
def worker():
    with sentry_sdk.continue_trace(...) as worker_tx:
        if worker_tx.parent_sampled:
            worker_tx.sampled = True
            worker_tx.init_span_recorder(1000)
then everything works.
It looks like continue_trace() should be doing the sampling work that start_transaction() does, but instead it just calls Transaction() on its own, so the inherited sampling decision got forgotten?
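To sketch the hypothesis (a toy model with names I made up, not the actual SDK code): if the transaction is constructed without ever running the sampling step, its sampled attribute stays unset and falsy, which would explain "Discarding transaction because sampled = False" even with parent_sampled=True.

```python
# Toy model of the suspected bug -- structure and names are mine, not the SDK's.

class Tx:
    def __init__(self, parent_sampled=None):
        self.parent_sampled = parent_sampled
        self.sampled = None  # no decision made yet


def apply_sampling(tx, sample_rate):
    # The step start_transaction() presumably runs: children inherit the
    # parent's decision; only roots consult the sample rate.
    if tx.parent_sampled is not None:
        tx.sampled = tx.parent_sampled
    else:
        tx.sampled = sample_rate >= 1.0  # simplified stand-in for a random roll


def continue_trace_as_observed(parent_sampled):
    # Builds the transaction but never applies sampling, so tx.sampled
    # stays None (falsy) and the transaction gets discarded.
    return Tx(parent_sampled)


def continue_trace_as_expected(parent_sampled, sample_rate=1.0):
    tx = Tx(parent_sampled)
    apply_sampling(tx, sample_rate)
    return tx


print(continue_trace_as_observed(True).sampled)   # None -> discarded
print(continue_trace_as_expected(True).sampled)   # True -> kept
```

The manual workaround above is effectively doing apply_sampling() by hand.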