Fix intermittent test failure in TestLevelQueue #15777
Conversation
Fix go-gitea#15776

Signed-off-by: Andrew Thornton <[email protected]>
Did not resolve it for me; here is another failure with this PR applied:

$ while true; do go test -count=1 -run TestLevelQueue ./modules/queue; done
ok code.gitea.io/gitea/modules/queue 0.450s
ok code.gitea.io/gitea/modules/queue 0.455s
ok code.gitea.io/gitea/modules/queue 0.458s
ok code.gitea.io/gitea/modules/queue 0.467s
ok code.gitea.io/gitea/modules/queue 0.489s
2021/05/07 20:43:52 ...les/queue/manager.go:134:Add() [T] Queue Manager registered: queue-1 (QID: 1)
2021/05/07 20:43:52 ...ue/queue_bytefifo.go:96:Run() [D] level: Starting
2021/05/07 20:43:52 ...ue/queue_bytefifo.go:104:Run() [T] level: Waiting til closed
2021/05/07 20:43:52 ...ue/queue_bytefifo.go:147:readToChan() [T] level : Task found: &queue.testData{TestString:"A", TestInt:1}
2021/05/07 20:43:52 .../queue/workerpool.go:251:commonRegisterWorkers() [T] WorkerPool: 1 (for queue-1) adding 1 workers with group id: 1
2021/05/07 20:43:52 .../queue/workerpool.go:93:zeroBoost() [W] WorkerPool: 1 (for queue-1) has zero workers - adding 5 temporary workers for 5m0s
2021/05/07 20:43:52 ...ue/queue_bytefifo.go:147:readToChan() [T] level : Task found: &queue.testData{TestString:"B", TestInt:2}
2021/05/07 20:43:52 .../queue/workerpool.go:420:doWork() [T] Handling: 1 data, [0xc000c7a1b0]
2021/05/07 20:43:52 .../queue/workerpool.go:420:doWork() [T] Handling: 1 data, [0xc00067c180]
2021/05/07 20:43:52 ...ue/queue_bytefifo.go:156:Shutdown() [T] level: Shutting down
2021/05/07 20:43:52 ...ue/queue_bytefifo.go:164:Shutdown() [D] level: Shutdown
2021/05/07 20:43:52 ...ue/queue_bytefifo.go:106:Run() [T] level: Waiting til done
2021/05/07 20:43:52 .../queue/workerpool.go:396:doWork() [T] Worker shutting down
2021/05/07 20:43:52 .../queue/workerpool.go:396:doWork() [T] Worker shutting down
2021/05/07 20:43:52 .../queue/workerpool.go:396:doWork() [T] Worker shutting down
2021/05/07 20:43:52 .../queue/workerpool.go:396:doWork() [T] Worker shutting down
2021/05/07 20:43:52 .../queue/workerpool.go:396:doWork() [T] Worker shutting down
2021/05/07 20:43:52 .../queue/workerpool.go:396:doWork() [T] Worker shutting down
2021/05/07 20:43:52 ...ue/queue_bytefifo.go:109:Run() [T] level: Waiting til cleaned
2021/05/07 20:43:52 .../queue/workerpool.go:309:CleanUp() [T] WorkerPool: 1 CleanUp
2021/05/07 20:43:52 .../queue/workerpool.go:321:CleanUp() [T] WorkerPool: 1 CleanUp Done
2021/05/07 20:43:52 ...ue/queue_bytefifo.go:174:Terminate() [T] level: Terminating
2021/05/07 20:43:52 ...ue/queue_bytefifo.go:156:Shutdown() [T] level: Shutting down
2021/05/07 20:43:52 ...ue/queue_bytefifo.go:164:Shutdown() [D] level: Shutdown
2021/05/07 20:43:52 ...ue/queue_bytefifo.go:186:Terminate() [D] level: Closing with 2 tasks left in queue
2021/05/07 20:43:52 ...ue/queue_bytefifo.go:191:Terminate() [D] level: Terminated
2021/05/07 20:43:52 ...les/queue/manager.go:134:Add() [T] Queue Manager registered: queue-2 (QID: 2)
2021/05/07 20:43:52 ...ue/queue_bytefifo.go:96:Run() [D] level: Starting
2021/05/07 20:43:52 ...ue/queue_bytefifo.go:104:Run() [T] level: Waiting til closed
2021/05/07 20:43:52 .../queue/workerpool.go:251:commonRegisterWorkers() [T] WorkerPool: 2 (for queue-2) adding 1 workers with group id: 1
2021/05/07 20:43:52 ...ue/queue_bytefifo.go:147:readToChan() [T] level : Task found: &queue.testData{TestString:"A", TestInt:1}
2021/05/07 20:43:52 ...ue/queue_bytefifo.go:147:readToChan() [T] level : Task found: &queue.testData{TestString:"B", TestInt:2}
2021/05/07 20:43:52 .../queue/workerpool.go:412:doWork() [T] Handling: 2 data, [0xc00058e378 0xc00058e5b8]
2021/05/07 20:43:52 ...ue/queue_bytefifo.go:156:Shutdown() [T] level: Shutting down
2021/05/07 20:43:52 ...ue/queue_bytefifo.go:164:Shutdown() [D] level: Shutdown
2021/05/07 20:43:52 ...ue/queue_bytefifo.go:156:Shutdown() [T] level: Shutting down
2021/05/07 20:43:52 ...ue/queue_bytefifo.go:164:Shutdown() [D] level: Shutdown
2021/05/07 20:43:52 ...ue/queue_bytefifo.go:106:Run() [T] level: Waiting til done
2021/05/07 20:43:52 ...ue/queue_bytefifo.go:174:Terminate() [T] level: Terminating
2021/05/07 20:43:52 ...ue/queue_bytefifo.go:156:Shutdown() [T] level: Shutting down
2021/05/07 20:43:52 ...ue/queue_bytefifo.go:164:Shutdown() [D] level: Shutdown
2021/05/07 20:43:52 ...ue/queue_bytefifo.go:174:Terminate() [T] level: Terminating
2021/05/07 20:43:52 ...ue/queue_bytefifo.go:156:Shutdown() [T] level: Shutting down
2021/05/07 20:43:52 ...ue/queue_bytefifo.go:164:Shutdown() [D] level: Shutdown
2021/05/07 20:43:52 ...ue/queue_bytefifo.go:186:Terminate() [D] level: Closing with 0 tasks left in queue
2021/05/07 20:43:52 ...ue/queue_bytefifo.go:191:Terminate() [D] level: Terminated
--- FAIL: TestLevelQueue (0.70s)
queue_disk_test.go:21:
Error Trace: queue_disk_test.go:21
workerpool.go:421
workerpool.go:276
asm_amd64.s:1371
Error: Should be true
Test: TestLevelQueue
queue_disk_test.go:21:
Error Trace: queue_disk_test.go:21
workerpool.go:421
workerpool.go:276
asm_amd64.s:1371
Error: Should be true
Test: TestLevelQueue
FAIL
FAIL code.gitea.io/gitea/modules/queue 0.742s
FAIL
ok code.gitea.io/gitea/modules/queue 0.438s
ok code.gitea.io/gitea/modules/queue 0.447s
ok code.gitea.io/gitea/modules/queue 0.438s
ok code.gitea.io/gitea/modules/queue 0.447s
ok code.gitea.io/gitea/modules/queue 0.496s
Can you try the wait-on-empty PR? I would hate to spend an evening fixing this if I have already fixed and prevented it there.
Signed-off-by: Andrew Thornton <[email protected]>
169a702 to a2d3420 (compare)
Ah, it's a boost situation. I think there's a race between zeroWorkers and addWorkers, so you may get a boost before the initial workers have started. I will look at fixing this in my wait-on-empty branch.
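A minimal sketch of the suspected interleaving, using hypothetical, simplified names rather than the actual modules/queue/workerpool.go code: a zero-worker boost that checks the worker count can observe zero and add temporary workers before the initial worker registration for the queue has run.

```go
// Hypothetical, simplified model of the suspected race; not the real
// workerpool.go, only an illustration of the interleaving described above.
package main

import (
	"fmt"
	"sync"
	"time"
)

type pool struct {
	mu              sync.Mutex
	numberOfWorkers int
}

// zeroBoost adds temporary workers when it observes a pool with no workers,
// mirroring the "has zero workers - adding 5 temporary workers" log line.
func (p *pool) zeroBoost() {
	p.mu.Lock()
	defer p.mu.Unlock()
	if p.numberOfWorkers == 0 {
		fmt.Println("boost: zero workers observed, adding 5 temporary workers")
		p.numberOfWorkers += 5
	}
}

// addWorkers registers the workers the queue was configured with.
func (p *pool) addWorkers(n int) {
	p.mu.Lock()
	defer p.mu.Unlock()
	fmt.Printf("registering %d configured worker(s)\n", n)
	p.numberOfWorkers += n
}

func main() {
	p := &pool{}
	var wg sync.WaitGroup
	wg.Add(2)

	// Incoming data triggers the boost check...
	go func() {
		defer wg.Done()
		p.zeroBoost()
	}()

	// ...while the initial registration has not happened yet, so the boost
	// fires even though a configured worker was about to be added.
	go func() {
		defer wg.Done()
		time.Sleep(time.Millisecond)
		p.addWorkers(1)
	}()

	wg.Wait()
	fmt.Println("total workers:", p.numberOfWorkers)
}
```

Whether the real pool interleaves exactly like this is an assumption; the sketch is only meant to show why a boost can appear in the log before the queue's configured worker starts.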
Fails with a similar error on that branch as well, just with different logging: failure log on
I've just pushed up a fix to wait-on-empty.
That last change seems to have done it: 150 runs without failure 👍
To clear up any misunderstanding: this PR does not fix it, but #15693 does. Closing this one.