
Explicitly pass strong ref as raw pointer to prevent UB in Arc::drop #58611


Closed
9 changes: 5 additions & 4 deletions src/liballoc/sync.rs
@@ -552,7 +552,7 @@ impl<T: ?Sized> Arc<T> {
// allocation itself (there may still be weak pointers lying around).
ptr::drop_in_place(&mut self.ptr.as_mut().data);

-        if self.inner().weak.fetch_sub(1, Release) == 1 {
+        if atomic::AtomicUsize::fetch_sub_explicit(&self.inner().weak, 1, Release) == 1 {
atomic::fence(Acquire);
Global.dealloc(self.ptr.cast(), Layout::for_value(self.ptr.as_ref()))
}
@@ -970,10 +970,11 @@ unsafe impl<#[may_dangle] T: ?Sized> Drop for Arc<T> {
/// [`Weak`]: ../../std/sync/struct.Weak.html
#[inline]
fn drop(&mut self) {
-        // Because `fetch_sub` is already atomic, we do not need to synchronize
+        // Because `fetch_sub_explicit` is already atomic, we do not need to synchronize
         // with other threads unless we are going to delete the object. This
         // same logic applies to the below `fetch_sub` to the `weak` count.
-        if self.inner().strong.fetch_sub(1, Release) != 1 {
+        // To prevent `self` from dangling across the unsafe block, the strong
+        // ref is passed as a raw pointer.
+        if atomic::AtomicUsize::fetch_sub_explicit(&self.inner().strong, 1, Release) != 1 {
return;
}

@@ -1350,7 +1351,7 @@ impl<T: ?Sized> Drop for Weak<T> {
return
};

-        if inner.weak.fetch_sub(1, Release) == 1 {
+        if atomic::AtomicUsize::fetch_sub_explicit(&inner.weak, 1, Release) == 1 {
atomic::fence(Acquire);
unsafe {
Global.dealloc(self.ptr.cast(), Layout::for_value(self.ptr.as_ref()))
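For context on the hunks above: the `Release` decrement paired with an `Acquire` fence is the standard handshake for dropping a reference-counted object, and it is unchanged by this PR — only the way the counter is reached (raw pointer vs. `&self`) changes. A minimal standalone sketch of that handshake, using the ordinary `fetch_sub` method rather than the patch's `fetch_sub_explicit`:

```rust
use std::sync::atomic::{fence, AtomicUsize, Ordering};

// Sketch (not the liballoc code) of the release/acquire refcount-drop
// pattern used in `Arc::drop`: the decrement uses `Release` so that
// earlier writes through this handle happen-before the deallocation,
// and the last owner issues an `Acquire` fence before freeing so it
// observes all of those writes from other threads.
fn drop_ref(count: &AtomicUsize) -> bool {
    if count.fetch_sub(1, Ordering::Release) != 1 {
        return false; // other owners remain; nothing to free
    }
    fence(Ordering::Acquire); // synchronize with every prior Release decrement
    true // caller may now destroy the payload
}

fn main() {
    let count = AtomicUsize::new(2);
    assert!(!drop_ref(&count)); // 2 -> 1: not the last owner
    assert!(drop_ref(&count)); // 1 -> 0: last owner may free
}
```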
35 changes: 35 additions & 0 deletions src/libcore/sync/atomic.rs
@@ -1575,6 +1575,41 @@ assert_eq!(foo.load(Ordering::SeqCst), 10);
}
}

doc_comment! {
concat!("Subtracts from the given atomic value, returning the previous value.

This operation wraps around on overflow.

`fetch_sub_explicit` takes ", stringify!($atomic_type), " which will be substracted by given value
in respect to [`Ordering`] argument which describes the memory ordering
of this operation. All ordering modes are possible. Note that using
[`Acquire`] makes the store part of this operation [`Relaxed`], and
using [`Release`] makes the load part [`Relaxed`].

[`Ordering`]: enum.Ordering.html
[`Relaxed`]: enum.Ordering.html#variant.Relaxed
[`Release`]: enum.Ordering.html#variant.Release
[`Acquire`]: enum.Ordering.html#variant.Acquire

# Examples

```
", $extra_feature, "use std::sync::atomic::{", stringify!($atomic_type), ", Ordering};

let foo = ", stringify!($atomic_type), "::new(20);
assert_eq!(", stringify!($atomic_type), ".fetch_sub_explicit(foo, Ordering::SeqCst), 20);
assert_eq!(foo.load(Ordering::SeqCst), 10);
```"),
#[inline]
#[$stable]
#[cfg(target_has_atomic = "cas")]
pub fn fetch_sub_explicit(f: *const $atomic_type,
val: $int_type,
order: Ordering) -> $int_type {
unsafe { atomic_sub((*f).v.get(), val, order) }
Member commented:
What's the type of atomic_sub? Won't this create a reference again?

@bjorn3 (Member) commented on Feb 21, 2019:
atomic_sub takes *mut T:

unsafe fn atomic_sub<T>(dst: *mut T, val: T, order: Ordering) -> T {

Member commented:
Okay, thanks. So this would indeed fix the problem even under the most strict semantics -- but at the cost of duplicating the entire API surface of the Atomic* types.

Member (Author) commented:
True that. See my comment under the PR description:

  • I am unsure how this can be done without extending the public api of AtomicUsize.

edit: I meant basically all Atomic* types.

}
}

doc_comment! {
concat!("Bitwise \"and\" with the current value.

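The review thread above turns on whether taking the counter as a raw pointer actually avoids materializing a reference. A user-space sketch of the `fetch_sub_explicit` shape proposed in this PR (the name and signature mirror the diff; this is not the in-tree code) makes the subtlety visible: the `(*ptr)` dereference below still creates a reference before calling `fetch_sub`, which is exactly what bjorn3's question probes, whereas the real patch hands `atomic_sub` the raw pointer obtained from the inner `UnsafeCell` directly.

```rust
use std::sync::atomic::{AtomicUsize, Ordering};

// Hypothetical user-space approximation of the PR's free-function API:
// a raw pointer replaces `&self` in the signature. Note that unlike the
// in-tree version (which calls `atomic_sub` on the `UnsafeCell` pointer),
// `(*ptr).fetch_sub(..)` re-creates a `&AtomicUsize` reference here.
unsafe fn fetch_sub_explicit(ptr: *const AtomicUsize, val: usize, order: Ordering) -> usize {
    (*ptr).fetch_sub(val, order)
}

fn main() {
    // Mirrors the doc example in the diff: start at 20, subtract 10.
    let foo = AtomicUsize::new(20);
    assert_eq!(unsafe { fetch_sub_explicit(&foo, 10, Ordering::SeqCst) }, 20);
    assert_eq!(foo.load(Ordering::SeqCst), 10);
}
```

This also illustrates the reviewer's cost concern: each `fetch_*` method would need such a raw-pointer twin, duplicating the entire `Atomic*` API surface.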