//! Single-threaded reference-counting pointers. 'Rc' stands for 'Reference
//! Counted'.
//!
//! The type [`Rc<T>`][`Rc`] provides shared ownership of a value of type `T`,
//! allocated in the heap. Invoking [`clone`][clone] on [`Rc`] produces a new
//! pointer to the same allocation in the heap. When the last [`Rc`] pointer to a
//! given allocation is destroyed, the value stored in that allocation (often
//! referred to as "inner value") is also dropped.
//!
//! Shared references in Rust disallow mutation by default, and [`Rc`]
//! is no exception: you cannot generally obtain a mutable reference to
//! something inside an [`Rc`]. If you need mutability, put a [`Cell`]
//! or [`RefCell`] inside the [`Rc`]; see [an example of mutability
//! inside an `Rc`][mutability].
//!
//! [`Rc`] uses non-atomic reference counting. This means that overhead is very
//! low, but an [`Rc`] cannot be sent between threads, and consequently [`Rc`]
//! does not implement [`Send`]. As a result, the Rust compiler
//! will check *at compile time* that you are not sending [`Rc`]s between
//! threads. If you need multi-threaded, atomic reference counting, use
//! [`sync::Arc`][arc].
//!
//! The [`downgrade`][downgrade] method can be used to create a non-owning
//! [`Weak`] pointer. A [`Weak`] pointer can be [`upgrade`][upgrade]d
//! to an [`Rc`], but this will return [`None`] if the value stored in the allocation has
//! already been dropped. In other words, `Weak` pointers do not keep the value
//! inside the allocation alive; however, they *do* keep the allocation
//! (the backing store for the inner value) alive.
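//!
//! For example (a minimal sketch of the downgrade/upgrade round trip):
//!
//! ```
//! use std::rc::Rc;
//!
//! let strong = Rc::new(5);
//! let weak = Rc::downgrade(&strong);
//! assert!(weak.upgrade().is_some());
//! drop(strong);
//! // The inner value has been dropped, so `upgrade` now returns `None`.
//! assert!(weak.upgrade().is_none());
//! ```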
//!
//! A cycle between [`Rc`] pointers will never be deallocated. For this reason,
//! [`Weak`] is used to break cycles. For example, a tree could have strong
//! [`Rc`] pointers from parent nodes to children, and [`Weak`] pointers from
//! children back to their parents.
//!
//! `Rc<T>` automatically dereferences to `T` (via the [`Deref`] trait),
//! so you can call `T`'s methods on a value of type [`Rc<T>`][`Rc`]. To avoid name
//! clashes with `T`'s methods, the methods of [`Rc<T>`][`Rc`] itself are associated
//! functions, called using [fully qualified syntax]:
//!
//! ```
//! use std::rc::Rc;
//!
//! let my_rc = Rc::new(());
//! let my_weak = Rc::downgrade(&my_rc);
//! ```
//!
//! `Rc<T>`'s implementations of traits like `Clone` may also be called using
//! fully qualified syntax. Some people prefer to use fully qualified syntax,
//! while others prefer using method-call syntax.
//!
//! ```
//! use std::rc::Rc;
//!
//! let rc = Rc::new(());
//! // Method-call syntax
//! let rc2 = rc.clone();
//! // Fully qualified syntax
//! let rc3 = Rc::clone(&rc);
//! ```
//!
//! [`Weak<T>`][`Weak`] does not auto-dereference to `T`, because the inner value may have
//! already been dropped.
//!
//! # Cloning references
//!
//! Creating a new reference to the same allocation as an existing reference counted pointer
//! is done using the `Clone` trait implemented for [`Rc<T>`][`Rc`] and [`Weak<T>`][`Weak`].
//!
//! ```
//! use std::rc::Rc;
//!
//! let foo = Rc::new(vec![1.0, 2.0, 3.0]);
//! // The two syntaxes below are equivalent.
//! let a = foo.clone();
//! let b = Rc::clone(&foo);
//! // a and b both point to the same memory location as foo.
//! ```
//!
//! The `Rc::clone(&from)` syntax is the most idiomatic because it conveys more explicitly
//! the meaning of the code. In the example above, this syntax makes it easier to see that
//! this code is creating a new reference rather than copying the whole content of `foo`.
//!
//! # Examples
//!
//! Consider a scenario where a set of `Gadget`s are owned by a given `Owner`.
//! We want to have our `Gadget`s point to their `Owner`. We can't do this with
//! unique ownership, because more than one gadget may belong to the same
//! `Owner`. [`Rc`] allows us to share an `Owner` between multiple `Gadget`s,
//! and have the `Owner` remain allocated as long as any `Gadget` points at it.
//!
//! ```
//! use std::rc::Rc;
//!
//! struct Owner {
//!     name: String,
//!     // ...other fields
//! }
//!
//! struct Gadget {
//!     id: i32,
//!     owner: Rc<Owner>,
//!     // ...other fields
//! }
//!
//! fn main() {
//!     // Create a reference-counted `Owner`.
//!     let gadget_owner: Rc<Owner> = Rc::new(
//!         Owner {
//!             name: "Gadget Man".to_string(),
//!         }
//!     );
//!
//!     // Create `Gadget`s belonging to `gadget_owner`. Cloning the `Rc<Owner>`
//!     // gives us a new pointer to the same `Owner` allocation, incrementing
//!     // the reference count in the process.
//!     let gadget1 = Gadget {
//!         id: 1,
//!         owner: Rc::clone(&gadget_owner),
//!     };
//!     let gadget2 = Gadget {
//!         id: 2,
//!         owner: Rc::clone(&gadget_owner),
//!     };
//!
//!     // Dispose of our local variable `gadget_owner`.
//!     drop(gadget_owner);
//!
//!     // Despite dropping `gadget_owner`, we're still able to print out the name
//!     // of the `Owner` of the `Gadget`s. This is because we've only dropped a
//!     // single `Rc<Owner>`, not the `Owner` it points to. As long as there are
//!     // other `Rc<Owner>` pointing at the same `Owner` allocation, it will remain
//!     // live. The field projection `gadget1.owner.name` works because
//!     // `Rc<Owner>` automatically dereferences to `Owner`.
//!     println!("Gadget {} owned by {}", gadget1.id, gadget1.owner.name);
//!     println!("Gadget {} owned by {}", gadget2.id, gadget2.owner.name);
//!
//!     // At the end of the function, `gadget1` and `gadget2` are destroyed, and
//!     // with them the last counted references to our `Owner`. Gadget Man now
//!     // gets destroyed as well.
//! }
//! ```
//!
//! If our requirements change, and we also need to be able to traverse from
//! `Owner` to `Gadget`, we will run into problems. An [`Rc`] pointer from `Owner`
//! to `Gadget` introduces a cycle. This means that their
//! reference counts can never reach 0, and the allocation will never be destroyed:
//! a memory leak. In order to get around this, we can use [`Weak`]
//! pointers.
//!
//! Rust actually makes it somewhat difficult to produce this loop in the first
//! place. In order to end up with two values that point at each other, one of
//! them needs to be mutable. This is difficult because [`Rc`] enforces
//! memory safety by only giving out shared references to the value it wraps,
//! and these don't allow direct mutation. We need to wrap the part of the
//! value we wish to mutate in a [`RefCell`], which provides *interior
//! mutability*: a method to achieve mutability through a shared reference.
//! [`RefCell`] enforces Rust's borrowing rules at runtime.
//!
//! ```
//! use std::rc::Rc;
//! use std::rc::Weak;
//! use std::cell::RefCell;
//!
//! struct Owner {
//!     name: String,
//!     gadgets: RefCell<Vec<Weak<Gadget>>>,
//!     // ...other fields
//! }
//!
//! struct Gadget {
//!     id: i32,
//!     owner: Rc<Owner>,
//!     // ...other fields
//! }
//!
//! fn main() {
//!     // Create a reference-counted `Owner`. Note that we've put the `Owner`'s
//!     // vector of `Gadget`s inside a `RefCell` so that we can mutate it through
//!     // a shared reference.
//!     let gadget_owner: Rc<Owner> = Rc::new(
//!         Owner {
//!             name: "Gadget Man".to_string(),
//!             gadgets: RefCell::new(vec![]),
//!         }
//!     );
//!
//!     // Create `Gadget`s belonging to `gadget_owner`, as before.
//!     let gadget1 = Rc::new(
//!         Gadget {
//!             id: 1,
//!             owner: Rc::clone(&gadget_owner),
//!         }
//!     );
//!     let gadget2 = Rc::new(
//!         Gadget {
//!             id: 2,
//!             owner: Rc::clone(&gadget_owner),
//!         }
//!     );
//!
//!     // Add the `Gadget`s to their `Owner`.
//!     {
//!         let mut gadgets = gadget_owner.gadgets.borrow_mut();
//!         gadgets.push(Rc::downgrade(&gadget1));
//!         gadgets.push(Rc::downgrade(&gadget2));
//!
//!         // `RefCell` dynamic borrow ends here.
//!     }
//!
//!     // Iterate over our `Gadget`s, printing their details out.
//!     for gadget_weak in gadget_owner.gadgets.borrow().iter() {
//!
//!         // `gadget_weak` is a `Weak<Gadget>`. Since `Weak` pointers can't
//!         // guarantee the allocation still exists, we need to call
//!         // `upgrade`, which returns an `Option<Rc<Gadget>>`.
//!         //
//!         // In this case we know the allocation still exists, so we simply
//!         // `unwrap` the `Option`. In a more complicated program, you might
//!         // need graceful error handling for a `None` result.
//!
//!         let gadget = gadget_weak.upgrade().unwrap();
//!         println!("Gadget {} owned by {}", gadget.id, gadget.owner.name);
//!     }
//!
//!     // At the end of the function, `gadget_owner`, `gadget1`, and `gadget2`
//!     // are destroyed. There are now no strong (`Rc`) pointers to the
//!     // gadgets, so they are destroyed. This zeroes the reference count on
//!     // Gadget Man, so he gets destroyed as well.
//! }
//! ```
//!
//! [clone]: Clone::clone
//! [`Cell`]: core::cell::Cell
//! [`RefCell`]: core::cell::RefCell
//! [arc]: crate::sync::Arc
//! [`Deref`]: core::ops::Deref
//! [downgrade]: Rc::downgrade
//! [upgrade]: Weak::upgrade
//! [mutability]: core::cell#introducing-mutability-inside-of-something-immutable
//! [fully qualified syntax]: https://doc.rust-lang.org/book/ch19-03-advanced-traits.html#fully-qualified-syntax-for-disambiguation-calling-methods-with-the-same-name

#![stable(feature = "rust1", since = "1.0.0")]

use core::any::Any;
use core::cell::Cell;
#[cfg(not(no_global_oom_handling))]
use core::clone::CloneToUninit;
use core::clone::UseCloned;
use core::cmp::Ordering;
use core::hash::{Hash, Hasher};
use core::intrinsics::abort;
#[cfg(not(no_global_oom_handling))]
use core::iter;
use core::marker::{PhantomData, Unsize};
use core::mem::{self, ManuallyDrop, align_of_val_raw};
use core::num::NonZeroUsize;
use core::ops::{CoerceUnsized, Deref, DerefMut, DerefPure, DispatchFromDyn, LegacyReceiver};
use core::panic::{RefUnwindSafe, UnwindSafe};
#[cfg(not(no_global_oom_handling))]
use core::pin::Pin;
use core::pin::PinCoerceUnsized;
use core::ptr::{self, NonNull, drop_in_place};
#[cfg(not(no_global_oom_handling))]
use core::slice::from_raw_parts_mut;
use core::{borrow, fmt, hint};

#[cfg(not(no_global_oom_handling))]
use crate::alloc::handle_alloc_error;
use crate::alloc::{AllocError, Allocator, Global, Layout};
use crate::borrow::{Cow, ToOwned};
use crate::boxed::Box;
#[cfg(not(no_global_oom_handling))]
use crate::string::String;
#[cfg(not(no_global_oom_handling))]
use crate::vec::Vec;

// This is repr(C) to future-proof against possible field-reordering, which
// would interfere with otherwise safe [into|from]_raw() of transmutable
// inner types.
#[repr(C)]
struct RcInner<T: ?Sized> {
    strong: Cell<usize>,
    weak: Cell<usize>,
    value: T,
}

/// Calculate layout for `RcInner<T>` using the inner value's layout
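///
/// For example (illustrative, assuming a 64-bit target): for `layout ==
/// Layout::new::<u8>()`, the two `usize` reference counts occupy 16 bytes at
/// alignment 8, `extend` then yields size 17, and `pad_to_align` rounds the
/// final size up to 24.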
fn rc_inner_layout_for_value_layout(layout: Layout) -> Layout {
    // Calculate layout using the given value layout.
    // Previously, layout was calculated on the expression
    // `&*(ptr as *const RcInner<T>)`, but this created a misaligned
    // reference (see #54908).
    Layout::new::<RcInner<()>>().extend(layout).unwrap().0.pad_to_align()
}

/// A single-threaded reference-counting pointer. 'Rc' stands for 'Reference
/// Counted'.
///
/// See the [module-level documentation](./index.html) for more details.
///
/// The inherent methods of `Rc` are all associated functions, which means
/// that you have to call them as e.g., [`Rc::get_mut(&mut value)`][get_mut] instead of
/// `value.get_mut()`. This avoids conflicts with methods of the inner type `T`.
///
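/// For example (a minimal sketch using [`Rc::get_mut`][get_mut]):
///
/// ```
/// use std::rc::Rc;
///
/// let mut value = Rc::new(5);
/// // Succeeds because `value` holds the only strong pointer.
/// *Rc::get_mut(&mut value).unwrap() += 1;
/// assert_eq!(*value, 6);
/// ```
///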
/// [get_mut]: Rc::get_mut
#[doc(search_unbox)]
#[rustc_diagnostic_item = "Rc"]
#[stable(feature = "rust1", since = "1.0.0")]
#[rustc_insignificant_dtor]
pub struct Rc<
    T: ?Sized,
    #[unstable(feature = "allocator_api", issue = "32838")] A: Allocator = Global,
> {
    ptr: NonNull<RcInner<T>>,
    phantom: PhantomData<RcInner<T>>,
    alloc: A,
}

#[stable(feature = "rust1", since = "1.0.0")]
impl<T: ?Sized, A: Allocator> !Send for Rc<T, A> {}

// Note that this negative impl isn't strictly necessary for correctness,
// as `Rc` transitively contains a `Cell`, which is itself `!Sync`.
// However, given how important `Rc`'s `!Sync`-ness is,
// having an explicit negative impl is nice for documentation purposes
// and results in nicer error messages.
#[stable(feature = "rust1", since = "1.0.0")]
impl<T: ?Sized, A: Allocator> !Sync for Rc<T, A> {}

#[stable(feature = "catch_unwind", since = "1.9.0")]
impl<T: RefUnwindSafe + ?Sized, A: Allocator + UnwindSafe> UnwindSafe for Rc<T, A> {}
#[stable(feature = "rc_ref_unwind_safe", since = "1.58.0")]
impl<T: RefUnwindSafe + ?Sized, A: Allocator + UnwindSafe> RefUnwindSafe for Rc<T, A> {}

#[unstable(feature = "coerce_unsized", issue = "18598")]
impl<T: ?Sized + Unsize<U>, U: ?Sized, A: Allocator> CoerceUnsized<Rc<U, A>> for Rc<T, A> {}

#[unstable(feature = "dispatch_from_dyn", issue = "none")]
impl<T: ?Sized + Unsize<U>, U: ?Sized> DispatchFromDyn<Rc<U>> for Rc<T> {}

impl<T: ?Sized> Rc<T> {
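    // Safety: `ptr` must point to a live `RcInner<T>` allocated by `Global`,
    // and the returned `Rc` conceptually takes ownership of one strong reference.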
    #[inline]
    unsafe fn from_inner(ptr: NonNull<RcInner<T>>) -> Self {
        unsafe { Self::from_inner_in(ptr, Global) }
    }

    #[inline]
    unsafe fn from_ptr(ptr: *mut RcInner<T>) -> Self {
        unsafe { Self::from_inner(NonNull::new_unchecked(ptr)) }
    }
}

impl<T: ?Sized, A: Allocator> Rc<T, A> {
    #[inline(always)]
    fn inner(&self) -> &RcInner<T> {
        // This unsafety is ok because while this Rc is alive we're guaranteed
        // that the inner pointer is valid.
        unsafe { self.ptr.as_ref() }
    }

    #[inline]
    fn into_inner_with_allocator(this: Self) -> (NonNull<RcInner<T>>, A) {
        let this = mem::ManuallyDrop::new(this);
        (this.ptr, unsafe { ptr::read(&this.alloc) })
    }

    #[inline]
    unsafe fn from_inner_in(ptr: NonNull<RcInner<T>>, alloc: A) -> Self {
        Self { ptr, phantom: PhantomData, alloc }
    }

    #[inline]
    unsafe fn from_ptr_in(ptr: *mut RcInner<T>, alloc: A) -> Self {
        unsafe { Self::from_inner_in(NonNull::new_unchecked(ptr), alloc) }
    }

    // Non-inlined part of `drop`.
    #[inline(never)]
    unsafe fn drop_slow(&mut self) {
        // Reconstruct the "strong weak" pointer and drop it when this
        // variable goes out of scope. This ensures that the memory is
        // deallocated even if the destructor of `T` panics.
        let _weak = Weak { ptr: self.ptr, alloc: &self.alloc };

        // Destroy the contained object.
        // We cannot use `get_mut_unchecked` here, because `self.alloc` is borrowed.
        unsafe {
            ptr::drop_in_place(&mut (*self.ptr.as_ptr()).value);
        }
    }
}

impl<T> Rc<T> {
    /// Constructs a new `Rc<T>`.
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let five = Rc::new(5);
    /// ```
    #[cfg(not(no_global_oom_handling))]
    #[stable(feature = "rust1", since = "1.0.0")]
    pub fn new(value: T) -> Rc<T> {
        // There is an implicit weak pointer owned by all the strong
        // pointers, which ensures that the weak destructor never frees
        // the allocation while the strong destructor is running, even
        // if the weak pointer is stored inside the strong one.
        unsafe {
            Self::from_inner(
                Box::leak(Box::new(RcInner { strong: Cell::new(1), weak: Cell::new(1), value }))
                    .into(),
            )
        }
    }

    /// Constructs a new `Rc<T>` while giving you a `Weak<T>` to the allocation,
    /// to allow you to construct a `T` which holds a weak pointer to itself.
    ///
    /// Generally, a structure circularly referencing itself, either directly or
    /// indirectly, should not hold a strong reference to itself to prevent a memory leak.
    /// Using this function, you get access to the weak pointer during the
    /// initialization of `T`, before the `Rc<T>` is created, such that you can
    /// clone and store it inside the `T`.
    ///
    /// `new_cyclic` first allocates the managed allocation for the `Rc<T>`,
    /// then calls your closure, giving it a `Weak<T>` to this allocation,
    /// and only afterwards completes the construction of the `Rc<T>` by placing
    /// the `T` returned from your closure into the allocation.
    ///
    /// Since the new `Rc<T>` is not fully-constructed until `Rc<T>::new_cyclic`
    /// returns, calling [`upgrade`] on the weak reference inside your closure will
    /// fail and result in a `None` value.
    ///
    /// # Panics
    ///
    /// If `data_fn` panics, the panic is propagated to the caller, and the
    /// temporary [`Weak<T>`] is dropped normally.
    ///
    /// # Examples
    ///
    /// ```
    /// # #![allow(dead_code)]
    /// use std::rc::{Rc, Weak};
    ///
    /// struct Gadget {
    ///     me: Weak<Gadget>,
    /// }
    ///
    /// impl Gadget {
    ///     /// Constructs a reference counted Gadget.
    ///     fn new() -> Rc<Self> {
    ///         // `me` is a `Weak<Gadget>` pointing at the new allocation of the
    ///         // `Rc` we're constructing.
    ///         Rc::new_cyclic(|me| {
    ///             // Create the actual struct here.
    ///             Gadget { me: me.clone() }
    ///         })
    ///     }
    ///
    ///     /// Returns a reference counted pointer to Self.
    ///     fn me(&self) -> Rc<Self> {
    ///         self.me.upgrade().unwrap()
    ///     }
    /// }
    /// ```
    /// [`upgrade`]: Weak::upgrade
    #[cfg(not(no_global_oom_handling))]
    #[stable(feature = "arc_new_cyclic", since = "1.60.0")]
    pub fn new_cyclic<F>(data_fn: F) -> Rc<T>
    where
        F: FnOnce(&Weak<T>) -> T,
    {
        Self::new_cyclic_in(data_fn, Global)
    }

    /// Constructs a new `Rc` with uninitialized contents.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(get_mut_unchecked)]
    ///
    /// use std::rc::Rc;
    ///
    /// let mut five = Rc::<u32>::new_uninit();
    ///
    /// // Deferred initialization:
    /// Rc::get_mut(&mut five).unwrap().write(5);
    ///
    /// let five = unsafe { five.assume_init() };
    ///
    /// assert_eq!(*five, 5)
    /// ```
    #[cfg(not(no_global_oom_handling))]
    #[stable(feature = "new_uninit", since = "1.82.0")]
    #[must_use]
    pub fn new_uninit() -> Rc<mem::MaybeUninit<T>> {
        unsafe {
            Rc::from_ptr(Rc::allocate_for_layout(
                Layout::new::<T>(),
                |layout| Global.allocate(layout),
                <*mut u8>::cast,
            ))
        }
    }

    /// Constructs a new `Rc` with uninitialized contents, with the memory
    /// being filled with `0` bytes.
    ///
    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and
    /// incorrect usage of this method.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(new_zeroed_alloc)]
    ///
    /// use std::rc::Rc;
    ///
    /// let zero = Rc::<u32>::new_zeroed();
    /// let zero = unsafe { zero.assume_init() };
    ///
    /// assert_eq!(*zero, 0)
    /// ```
    ///
    /// [zeroed]: mem::MaybeUninit::zeroed
    #[cfg(not(no_global_oom_handling))]
    #[unstable(feature = "new_zeroed_alloc", issue = "129396")]
    #[must_use]
    pub fn new_zeroed() -> Rc<mem::MaybeUninit<T>> {
        unsafe {
            Rc::from_ptr(Rc::allocate_for_layout(
                Layout::new::<T>(),
                |layout| Global.allocate_zeroed(layout),
                <*mut u8>::cast,
            ))
        }
    }

    /// Constructs a new `Rc<T>`, returning an error if the allocation fails
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    /// use std::rc::Rc;
    ///
    /// let five = Rc::try_new(5);
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    #[unstable(feature = "allocator_api", issue = "32838")]
    pub fn try_new(value: T) -> Result<Rc<T>, AllocError> {
        // There is an implicit weak pointer owned by all the strong
        // pointers, which ensures that the weak destructor never frees
        // the allocation while the strong destructor is running, even
        // if the weak pointer is stored inside the strong one.
        unsafe {
            Ok(Self::from_inner(
                Box::leak(Box::try_new(RcInner {
                    strong: Cell::new(1),
                    weak: Cell::new(1),
                    value,
                })?)
                .into(),
            ))
        }
    }

    /// Constructs a new `Rc` with uninitialized contents, returning an error if the allocation fails
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    /// #![feature(get_mut_unchecked)]
    ///
    /// use std::rc::Rc;
    ///
    /// let mut five = Rc::<u32>::try_new_uninit()?;
    ///
    /// // Deferred initialization:
    /// Rc::get_mut(&mut five).unwrap().write(5);
    ///
    /// let five = unsafe { five.assume_init() };
    ///
    /// assert_eq!(*five, 5);
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    #[unstable(feature = "allocator_api", issue = "32838")]
    // #[unstable(feature = "new_uninit", issue = "63291")]
    pub fn try_new_uninit() -> Result<Rc<mem::MaybeUninit<T>>, AllocError> {
        unsafe {
            Ok(Rc::from_ptr(Rc::try_allocate_for_layout(
                Layout::new::<T>(),
                |layout| Global.allocate(layout),
                <*mut u8>::cast,
            )?))
        }
    }

    /// Constructs a new `Rc` with uninitialized contents, with the memory
    /// being filled with `0` bytes, returning an error if the allocation fails
    ///
    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and
    /// incorrect usage of this method.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::rc::Rc;
    ///
    /// let zero = Rc::<u32>::try_new_zeroed()?;
    /// let zero = unsafe { zero.assume_init() };
    ///
    /// assert_eq!(*zero, 0);
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    ///
    /// [zeroed]: mem::MaybeUninit::zeroed
    #[unstable(feature = "allocator_api", issue = "32838")]
    // #[unstable(feature = "new_uninit", issue = "63291")]
    pub fn try_new_zeroed() -> Result<Rc<mem::MaybeUninit<T>>, AllocError> {
        unsafe {
            Ok(Rc::from_ptr(Rc::try_allocate_for_layout(
                Layout::new::<T>(),
                |layout| Global.allocate_zeroed(layout),
                <*mut u8>::cast,
            )?))
        }
    }

    /// Constructs a new `Pin<Rc<T>>`. If `T` does not implement `Unpin`, then
    /// `value` will be pinned in memory and unable to be moved.
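    ///
    /// # Examples
    ///
    /// A minimal sketch (the pinned pointer still dereferences to `T`):
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let pinned = Rc::pin(5);
    /// assert_eq!(*pinned, 5);
    /// ```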
    #[cfg(not(no_global_oom_handling))]
    #[stable(feature = "pin", since = "1.33.0")]
    #[must_use]
    pub fn pin(value: T) -> Pin<Rc<T>> {
        unsafe { Pin::new_unchecked(Rc::new(value)) }
    }
}

impl<T, A: Allocator> Rc<T, A> {
    /// Constructs a new `Rc` in the provided allocator.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    /// use std::rc::Rc;
    /// use std::alloc::System;
    ///
    /// let five = Rc::new_in(5, System);
    /// ```
    #[cfg(not(no_global_oom_handling))]
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[inline]
    pub fn new_in(value: T, alloc: A) -> Rc<T, A> {
        // NOTE: Prefer match over unwrap_or_else since closure sometimes not inlineable.
        // That would make code size bigger.
        match Self::try_new_in(value, alloc) {
            Ok(m) => m,
            Err(_) => handle_alloc_error(Layout::new::<RcInner<T>>()),
        }
    }

    /// Constructs a new `Rc` with uninitialized contents in the provided allocator.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(get_mut_unchecked)]
    /// #![feature(allocator_api)]
    ///
    /// use std::rc::Rc;
    /// use std::alloc::System;
    ///
    /// let mut five = Rc::<u32, _>::new_uninit_in(System);
    ///
    /// let five = unsafe {
    ///     // Deferred initialization:
    ///     Rc::get_mut_unchecked(&mut five).as_mut_ptr().write(5);
    ///
    ///     five.assume_init()
    /// };
    ///
    /// assert_eq!(*five, 5)
    /// ```
    #[cfg(not(no_global_oom_handling))]
    #[unstable(feature = "allocator_api", issue = "32838")]
    // #[unstable(feature = "new_uninit", issue = "63291")]
    #[inline]
    pub fn new_uninit_in(alloc: A) -> Rc<mem::MaybeUninit<T>, A> {
        unsafe {
            Rc::from_ptr_in(
                Rc::allocate_for_layout(
                    Layout::new::<T>(),
                    |layout| alloc.allocate(layout),
                    <*mut u8>::cast,
                ),
                alloc,
            )
        }
    }

    /// Constructs a new `Rc` with uninitialized contents, with the memory
    /// being filled with `0` bytes, in the provided allocator.
    ///
    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and
    /// incorrect usage of this method.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::rc::Rc;
    /// use std::alloc::System;
    ///
    /// let zero = Rc::<u32, _>::new_zeroed_in(System);
    /// let zero = unsafe { zero.assume_init() };
    ///
    /// assert_eq!(*zero, 0)
    /// ```
    ///
    /// [zeroed]: mem::MaybeUninit::zeroed
    #[cfg(not(no_global_oom_handling))]
    #[unstable(feature = "allocator_api", issue = "32838")]
    // #[unstable(feature = "new_uninit", issue = "63291")]
    #[inline]
    pub fn new_zeroed_in(alloc: A) -> Rc<mem::MaybeUninit<T>, A> {
        unsafe {
            Rc::from_ptr_in(
                Rc::allocate_for_layout(
                    Layout::new::<T>(),
                    |layout| alloc.allocate_zeroed(layout),
                    <*mut u8>::cast,
                ),
                alloc,
            )
        }
    }

    /// Constructs a new `Rc<T, A>` in the given allocator while giving you a `Weak<T, A>` to the allocation,
    /// to allow you to construct a `T` which holds a weak pointer to itself.
    ///
    /// Generally, a structure circularly referencing itself, either directly or
    /// indirectly, should not hold a strong reference to itself to prevent a memory leak.
    /// Using this function, you get access to the weak pointer during the
    /// initialization of `T`, before the `Rc<T, A>` is created, such that you can
    /// clone and store it inside the `T`.
    ///
    /// `new_cyclic_in` first allocates the managed allocation for the `Rc<T, A>`,
    /// then calls your closure, giving it a `Weak<T, A>` to this allocation,
    /// and only afterwards completes the construction of the `Rc<T, A>` by placing
    /// the `T` returned from your closure into the allocation.
    ///
    /// Since the new `Rc<T, A>` is not fully-constructed until `Rc<T, A>::new_cyclic_in`
    /// returns, calling [`upgrade`] on the weak reference inside your closure will
    /// fail and result in a `None` value.
    ///
    /// # Panics
    ///
    /// If `data_fn` panics, the panic is propagated to the caller, and the
    /// temporary [`Weak<T, A>`] is dropped normally.
    ///
    /// # Examples
    ///
    /// See [`new_cyclic`].
    ///
    /// [`new_cyclic`]: Rc::new_cyclic
    /// [`upgrade`]: Weak::upgrade
    #[cfg(not(no_global_oom_handling))]
    #[unstable(feature = "allocator_api", issue = "32838")]
    pub fn new_cyclic_in<F>(data_fn: F, alloc: A) -> Rc<T, A>
    where
        F: FnOnce(&Weak<T, A>) -> T,
    {
        // Construct the inner in the "uninitialized" state with a single
        // weak reference.
        let (uninit_raw_ptr, alloc) = Box::into_raw_with_allocator(Box::new_in(
            RcInner {
                strong: Cell::new(0),
                weak: Cell::new(1),
                value: mem::MaybeUninit::<T>::uninit(),
            },
            alloc,
        ));
        let uninit_ptr: NonNull<_> = (unsafe { &mut *uninit_raw_ptr }).into();
        let init_ptr: NonNull<RcInner<T>> = uninit_ptr.cast();

        let weak = Weak { ptr: init_ptr, alloc };

        // It's important we don't give up ownership of the weak pointer, or
        // else the memory might be freed by the time `data_fn` returns. If
        // we really wanted to pass ownership, we could create an additional
        // weak pointer for ourselves, but this would result in additional
        // updates to the weak reference count which might not be necessary
        // otherwise.
        let data = data_fn(&weak);

        let strong = unsafe {
            let inner = init_ptr.as_ptr();
            ptr::write(&raw mut (*inner).value, data);

            let prev_value = (*inner).strong.get();
            debug_assert_eq!(prev_value, 0, "No prior strong references should exist");
            (*inner).strong.set(1);

            // Strong references should collectively own a shared weak reference,
            // so don't run the destructor for our old weak reference.
            // Calling into_raw_with_allocator has the double effect of giving us back the allocator,
            // and forgetting the weak reference.
            let alloc = weak.into_raw_with_allocator().1;

            Rc::from_inner_in(init_ptr, alloc)
        };

        strong
    }

    /// Constructs a new `Rc<T>` in the provided allocator, returning an error if the allocation
    /// fails
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    /// use std::rc::Rc;
    /// use std::alloc::System;
    ///
    /// let five = Rc::try_new_in(5, System);
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[inline]
    pub fn try_new_in(value: T, alloc: A) -> Result<Self, AllocError> {
        // There is an implicit weak pointer owned by all the strong
        // pointers, which ensures that the weak destructor never frees
        // the allocation while the strong destructor is running, even
        // if the weak pointer is stored inside the strong one.
        let (ptr, alloc) = Box::into_unique(Box::try_new_in(
            RcInner { strong: Cell::new(1), weak: Cell::new(1), value },
            alloc,
        )?);
        Ok(unsafe { Self::from_inner_in(ptr.into(), alloc) })
    }

    /// Constructs a new `Rc` with uninitialized contents, in the provided allocator, returning an
    /// error if the allocation fails
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    /// #![feature(get_mut_unchecked)]
    ///
    /// use std::rc::Rc;
    /// use std::alloc::System;
    ///
    /// let mut five = Rc::<u32, _>::try_new_uninit_in(System)?;
    ///
    /// let five = unsafe {
    ///     // Deferred initialization:
    ///     Rc::get_mut_unchecked(&mut five).as_mut_ptr().write(5);
    ///
    ///     five.assume_init()
    /// };
    ///
    /// assert_eq!(*five, 5);
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    #[unstable(feature = "allocator_api", issue = "32838")]
    // #[unstable(feature = "new_uninit", issue = "63291")]
    #[inline]
    pub fn try_new_uninit_in(alloc: A) -> Result<Rc<mem::MaybeUninit<T>, A>, AllocError> {
        unsafe {
            Ok(Rc::from_ptr_in(
                Rc::try_allocate_for_layout(
                    Layout::new::<T>(),
                    |layout| alloc.allocate(layout),
                    <*mut u8>::cast,
                )?,
                alloc,
            ))
        }
    }

    /// Constructs a new `Rc` with uninitialized contents, with the memory
    /// being filled with `0` bytes, in the provided allocator, returning an error if the allocation
    /// fails
    ///
    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and
    /// incorrect usage of this method.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::rc::Rc;
    /// use std::alloc::System;
    ///
    /// let zero = Rc::<u32, _>::try_new_zeroed_in(System)?;
    /// let zero = unsafe { zero.assume_init() };
    ///
    /// assert_eq!(*zero, 0);
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    ///
    /// [zeroed]: mem::MaybeUninit::zeroed
    #[unstable(feature = "allocator_api", issue = "32838")]
    // #[unstable(feature = "new_uninit", issue = "63291")]
    #[inline]
    pub fn try_new_zeroed_in(alloc: A) -> Result<Rc<mem::MaybeUninit<T>, A>, AllocError> {
        unsafe {
            Ok(Rc::from_ptr_in(
                Rc::try_allocate_for_layout(
                    Layout::new::<T>(),
                    |layout| alloc.allocate_zeroed(layout),
                    <*mut u8>::cast,
                )?,
                alloc,
            ))
        }
    }

    /// Constructs a new `Pin<Rc<T>>` in the provided allocator. If `T` does not implement `Unpin`, then
    /// `value` will be pinned in memory and unable to be moved.
    #[cfg(not(no_global_oom_handling))]
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[inline]
    pub fn pin_in(value: T, alloc: A) -> Pin<Self>
    where
        A: 'static,
    {
        unsafe { Pin::new_unchecked(Rc::new_in(value, alloc)) }
    }

    /// Returns the inner value, if the `Rc` has exactly one strong reference.
    ///
    /// Otherwise, an [`Err`] is returned with the same `Rc` that was
    /// passed in.
    ///
    /// This will succeed even if there are outstanding weak references.
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let x = Rc::new(3);
    /// assert_eq!(Rc::try_unwrap(x), Ok(3));
    ///
    /// let x = Rc::new(4);
    /// let _y = Rc::clone(&x);
    /// assert_eq!(*Rc::try_unwrap(x).unwrap_err(), 4);
    /// ```
    #[inline]
    #[stable(feature = "rc_unique", since = "1.4.0")]
    pub fn try_unwrap(this: Self) -> Result<T, Self> {
        if Rc::strong_count(&this) == 1 {
            let this = ManuallyDrop::new(this);

            let val: T = unsafe { ptr::read(&**this) }; // copy the contained object
            let alloc: A = unsafe { ptr::read(&this.alloc) }; // copy the allocator

            // Indicate to Weaks that they can't be promoted by decrementing
            // the strong count, and then remove the implicit "strong weak"
            // pointer while also handling drop logic by just crafting a
            // fake Weak.
            this.inner().dec_strong();
            let _weak = Weak { ptr: this.ptr, alloc };
            Ok(val)
        } else {
            Err(this)
        }
    }

    /// Returns the inner value, if the `Rc` has exactly one strong reference.
    ///
    /// Otherwise, [`None`] is returned and the `Rc` is dropped.
    ///
    /// This will succeed even if there are outstanding weak references.
    ///
    /// If `Rc::into_inner` is called on every clone of this `Rc`,
    /// it is guaranteed that exactly one of the calls returns the inner value.
    /// This means in particular that the inner value is not dropped.
    ///
    /// [`Rc::try_unwrap`] is conceptually similar to `Rc::into_inner`.
    /// And while they are meant for different use-cases, `Rc::into_inner(this)`
    /// is in fact equivalent to <code>[Rc::try_unwrap]\(this).[ok][Result::ok]()</code>.
    /// (Note that the same kind of equivalence does **not** hold true for
    /// [`Arc`](crate::sync::Arc), due to race conditions that do not apply to `Rc`!)
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let x = Rc::new(3);
    /// assert_eq!(Rc::into_inner(x), Some(3));
    ///
    /// let x = Rc::new(4);
    /// let y = Rc::clone(&x);
    ///
    /// assert_eq!(Rc::into_inner(y), None);
    /// assert_eq!(Rc::into_inner(x), Some(4));
    /// ```
    #[inline]
    #[stable(feature = "rc_into_inner", since = "1.70.0")]
    pub fn into_inner(this: Self) -> Option<T> {
        Rc::try_unwrap(this).ok()
    }
}

impl<T> Rc<[T]> {
    /// Constructs a new reference-counted slice with uninitialized contents.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(get_mut_unchecked)]
    ///
    /// use std::rc::Rc;
    ///
    /// let mut values = Rc::<[u32]>::new_uninit_slice(3);
    ///
    /// // Deferred initialization:
    /// let data = Rc::get_mut(&mut values).unwrap();
    /// data[0].write(1);
    /// data[1].write(2);
    /// data[2].write(3);
    ///
    /// let values = unsafe { values.assume_init() };
    ///
    /// assert_eq!(*values, [1, 2, 3])
    /// ```
    #[cfg(not(no_global_oom_handling))]
    #[stable(feature = "new_uninit", since = "1.82.0")]
    #[must_use]
    pub fn new_uninit_slice(len: usize) -> Rc<[mem::MaybeUninit<T>]> {
        unsafe { Rc::from_ptr(Rc::allocate_for_slice(len)) }
    }

    /// Constructs a new reference-counted slice with uninitialized contents, with the memory being
    /// filled with `0` bytes.
    ///
    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and
    /// incorrect usage of this method.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(new_zeroed_alloc)]
    ///
    /// use std::rc::Rc;
    ///
    /// let values = Rc::<[u32]>::new_zeroed_slice(3);
    /// let values = unsafe { values.assume_init() };
    ///
    /// assert_eq!(*values, [0, 0, 0])
    /// ```
    ///
    /// [zeroed]: mem::MaybeUninit::zeroed
    #[cfg(not(no_global_oom_handling))]
    #[unstable(feature = "new_zeroed_alloc", issue = "129396")]
    #[must_use]
    pub fn new_zeroed_slice(len: usize) -> Rc<[mem::MaybeUninit<T>]> {
        unsafe {
            Rc::from_ptr(Rc::allocate_for_layout(
                Layout::array::<T>(len).unwrap(),
                |layout| Global.allocate_zeroed(layout),
                |mem| {
                    ptr::slice_from_raw_parts_mut(mem.cast::<T>(), len)
                        as *mut RcInner<[mem::MaybeUninit<T>]>
                },
            ))
        }
    }

    /// Converts the reference-counted slice into a reference-counted array.
    ///
    /// This operation does not reallocate; the underlying array of the slice is simply reinterpreted as an array type.
    ///
    /// If `N` is not exactly equal to the length of `self`, then this method returns `None`.
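    ///
    /// # Examples
    ///
    /// A minimal sketch, assuming the unstable `slice_as_array` feature:
    ///
    /// ```
    /// #![feature(slice_as_array)]
    ///
    /// use std::rc::Rc;
    ///
    /// let slice: Rc<[u32]> = Rc::new([1, 2, 3]);
    /// // Succeeds because the slice's length matches `N == 3`.
    /// let array: Rc<[u32; 3]> = slice.into_array().unwrap();
    /// assert_eq!(*array, [1, 2, 3]);
    /// ```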
    #[unstable(feature = "slice_as_array", issue = "133508")]
    #[inline]
    #[must_use]
    pub fn into_array<const N: usize>(self) -> Option<Rc<[T; N]>> {
        if self.len() == N {
            let ptr = Self::into_raw(self) as *const [T; N];

            // SAFETY: The underlying array of a slice has the exact same layout as an actual array `[T; N]` if `N` is equal to the slice's length.
            let me = unsafe { Rc::from_raw(ptr) };
            Some(me)
        } else {
            None
        }
    }
}

impl<T, A: Allocator> Rc<[T], A> {
    /// Constructs a new reference-counted slice with uninitialized contents.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(get_mut_unchecked)]
    /// #![feature(allocator_api)]
    ///
    /// use std::rc::Rc;
    /// use std::alloc::System;
    ///
    /// let mut values = Rc::<[u32], _>::new_uninit_slice_in(3, System);
    ///
    /// let values = unsafe {
    ///     // Deferred initialization:
    ///     Rc::get_mut_unchecked(&mut values)[0].as_mut_ptr().write(1);
    ///     Rc::get_mut_unchecked(&mut values)[1].as_mut_ptr().write(2);
    ///     Rc::get_mut_unchecked(&mut values)[2].as_mut_ptr().write(3);
    ///
    ///     values.assume_init()
    /// };
    ///
    /// assert_eq!(*values, [1, 2, 3])
    /// ```
    #[cfg(not(no_global_oom_handling))]
    #[unstable(feature = "allocator_api", issue = "32838")]
    // #[unstable(feature = "new_uninit", issue = "63291")]
    #[inline]
    pub fn new_uninit_slice_in(len: usize, alloc: A) -> Rc<[mem::MaybeUninit<T>], A> {
        unsafe { Rc::from_ptr_in(Rc::allocate_for_slice_in(len, &alloc), alloc) }
    }

    /// Constructs a new reference-counted slice with uninitialized contents, with the memory being
    /// filled with `0` bytes.
    ///
    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and
    /// incorrect usage of this method.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::rc::Rc;
    /// use std::alloc::System;
    ///
    /// let values = Rc::<[u32], _>::new_zeroed_slice_in(3, System);
    /// let values = unsafe { values.assume_init() };
    ///
    /// assert_eq!(*values, [0, 0, 0])
    /// ```
    ///
    /// [zeroed]: mem::MaybeUninit::zeroed
    #[cfg(not(no_global_oom_handling))]
    #[unstable(feature = "allocator_api", issue = "32838")]
    // #[unstable(feature = "new_uninit", issue = "63291")]
    #[inline]
    pub fn new_zeroed_slice_in(len: usize, alloc: A) -> Rc<[mem::MaybeUninit<T>], A> {
        unsafe {
            Rc::from_ptr_in(
                Rc::allocate_for_layout(
                    Layout::array::<T>(len).unwrap(),
                    |layout| alloc.allocate_zeroed(layout),
                    |mem| {
                        ptr::slice_from_raw_parts_mut(mem.cast::<T>(), len)
                            as *mut RcInner<[mem::MaybeUninit<T>]>
                    },
                ),
                alloc,
            )
        }
    }
}

impl<T, A: Allocator> Rc<mem::MaybeUninit<T>, A> {
    /// Converts to `Rc<T>`.
    ///
    /// # Safety
    ///
    /// As with [`MaybeUninit::assume_init`],
    /// it is up to the caller to guarantee that the inner value
    /// really is in an initialized state.
    /// Calling this when the content is not yet fully initialized
    /// causes immediate undefined behavior.
    ///
    /// [`MaybeUninit::assume_init`]: mem::MaybeUninit::assume_init
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(get_mut_unchecked)]
    ///
    /// use std::rc::Rc;
    ///
    /// let mut five = Rc::<u32>::new_uninit();
    ///
    /// // Deferred initialization:
    /// Rc::get_mut(&mut five).unwrap().write(5);
    ///
    /// let five = unsafe { five.assume_init() };
    ///
    /// assert_eq!(*five, 5)
    /// ```
    #[stable(feature = "new_uninit", since = "1.82.0")]
    #[inline]
    pub unsafe fn assume_init(self) -> Rc<T, A> {
        let (ptr, alloc) = Rc::into_inner_with_allocator(self);
        unsafe { Rc::from_inner_in(ptr.cast(), alloc) }
    }
}

impl<T, A: Allocator> Rc<[mem::MaybeUninit<T>], A> {
    /// Converts to `Rc<[T]>`.
    ///
    /// # Safety
    ///
    /// As with [`MaybeUninit::assume_init`],
    /// it is up to the caller to guarantee that the inner value
    /// really is in an initialized state.
    /// Calling this when the content is not yet fully initialized
    /// causes immediate undefined behavior.
    ///
    /// [`MaybeUninit::assume_init`]: mem::MaybeUninit::assume_init
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(get_mut_unchecked)]
    ///
    /// use std::rc::Rc;
    ///
    /// let mut values = Rc::<[u32]>::new_uninit_slice(3);
    ///
    /// // Deferred initialization:
    /// let data = Rc::get_mut(&mut values).unwrap();
    /// data[0].write(1);
    /// data[1].write(2);
    /// data[2].write(3);
    ///
    /// let values = unsafe { values.assume_init() };
    ///
    /// assert_eq!(*values, [1, 2, 3])
    /// ```
    #[stable(feature = "new_uninit", since = "1.82.0")]
    #[inline]
    pub unsafe fn assume_init(self) -> Rc<[T], A> {
        let (ptr, alloc) = Rc::into_inner_with_allocator(self);
        unsafe { Rc::from_ptr_in(ptr.as_ptr() as _, alloc) }
    }
}

impl<T: ?Sized> Rc<T> {
    /// Constructs an `Rc<T>` from a raw pointer.
    ///
    /// The raw pointer must have been previously returned by a call to
    /// [`Rc<U>::into_raw`][into_raw] with the following requirements:
    ///
    /// * If `U` is sized, it must have the same size and alignment as `T`. This
    ///   is trivially true if `U` is `T`.
    /// * If `U` is unsized, its data pointer must have the same size and
    ///   alignment as `T`. This is trivially true if `Rc<U>` was constructed
    ///   through `Rc<T>` and then converted to `Rc<U>` through an [unsized
    ///   coercion].
    ///
    /// Note that if `U` or `U`'s data pointer is not `T` but has the same size
    /// and alignment, this is basically like transmuting references of
    /// different types. See [`mem::transmute`][transmute] for more information
    /// on what restrictions apply in this case.
    ///
    /// The raw pointer must point to a block of memory allocated by the global allocator
    ///
    /// The user of `from_raw` has to make sure a specific value of `T` is only
    /// dropped once.
    ///
    /// This function is unsafe because improper use may lead to memory unsafety,
    /// even if the returned `Rc<T>` is never accessed.
    ///
    /// [into_raw]: Rc::into_raw
    /// [transmute]: core::mem::transmute
    /// [unsized coercion]: https://doc.rust-lang.org/reference/type-coercions.html#unsized-coercions
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let x = Rc::new("hello".to_owned());
    /// let x_ptr = Rc::into_raw(x);
    ///
    /// unsafe {
    ///     // Convert back to an `Rc` to prevent leak.
    ///     let x = Rc::from_raw(x_ptr);
    ///     assert_eq!(&*x, "hello");
    ///
    ///     // Further calls to `Rc::from_raw(x_ptr)` would be memory-unsafe.
    /// }
    ///
    /// // The memory was freed when `x` went out of scope above, so `x_ptr` is now dangling!
    /// ```
    ///
    /// Convert a slice back into its original array:
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let x: Rc<[u32]> = Rc::new([1, 2, 3]);
    /// let x_ptr: *const [u32] = Rc::into_raw(x);
    ///
    /// unsafe {
    ///     let x: Rc<[u32; 3]> = Rc::from_raw(x_ptr.cast::<[u32; 3]>());
    ///     assert_eq!(&*x, &[1, 2, 3]);
    /// }
    /// ```
    #[inline]
    #[stable(feature = "rc_raw", since = "1.17.0")]
    pub unsafe fn from_raw(ptr: *const T) -> Self {
        unsafe { Self::from_raw_in(ptr, Global) }
    }

    /// Increments the strong reference count on the `Rc<T>` associated with the
    /// provided pointer by one.
    ///
    /// # Safety
    ///
    /// The pointer must have been obtained through `Rc::into_raw` and must satisfy the
    /// same layout requirements specified in [`Rc::from_raw_in`][from_raw_in].
    /// The associated `Rc` instance must be valid (i.e. the strong count must be at
    /// least 1) for the duration of this method, and `ptr` must point to a block of memory
    /// allocated by the global allocator.
    ///
    /// [from_raw_in]: Rc::from_raw_in
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let five = Rc::new(5);
    ///
    /// unsafe {
    ///     let ptr = Rc::into_raw(five);
    ///     Rc::increment_strong_count(ptr);
    ///
    ///     let five = Rc::from_raw(ptr);
    ///     assert_eq!(2, Rc::strong_count(&five));
    /// #   // Prevent leaks for Miri.
    /// #   Rc::decrement_strong_count(ptr);
    /// }
    /// ```
    #[inline]
    #[stable(feature = "rc_mutate_strong_count", since = "1.53.0")]
    pub unsafe fn increment_strong_count(ptr: *const T) {
        unsafe { Self::increment_strong_count_in(ptr, Global) }
    }

    /// Decrements the strong reference count on the `Rc<T>` associated with the
    /// provided pointer by one.
    ///
    /// # Safety
    ///
    /// The pointer must have been obtained through `Rc::into_raw` and must satisfy the
    /// same layout requirements specified in [`Rc::from_raw_in`][from_raw_in].
    /// The associated `Rc` instance must be valid (i.e. the strong count must be at
    /// least 1) when invoking this method, and `ptr` must point to a block of memory
    /// allocated by the global allocator. This method can be used to release the final `Rc` and
    /// backing storage, but **should not** be called after the final `Rc` has been released.
    ///
    /// [from_raw_in]: Rc::from_raw_in
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let five = Rc::new(5);
    ///
    /// unsafe {
    ///     let ptr = Rc::into_raw(five);
    ///     Rc::increment_strong_count(ptr);
    ///
    ///     let five = Rc::from_raw(ptr);
    ///     assert_eq!(2, Rc::strong_count(&five));
    ///     Rc::decrement_strong_count(ptr);
    ///     assert_eq!(1, Rc::strong_count(&five));
    /// }
    /// ```
    #[inline]
    #[stable(feature = "rc_mutate_strong_count", since = "1.53.0")]
    pub unsafe fn decrement_strong_count(ptr: *const T) {
        unsafe { Self::decrement_strong_count_in(ptr, Global) }
    }
}

impl<T: ?Sized, A: Allocator> Rc<T, A> {
    /// Returns a reference to the underlying allocator.
    ///
    /// Note: this is an associated function, which means that you have
    /// to call it as `Rc::allocator(&r)` instead of `r.allocator()`. This
    /// is so that there is no conflict with a method on the inner type.
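    ///
    /// # Examples
    ///
    /// A minimal sketch, assuming the unstable `allocator_api` feature:
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::rc::Rc;
    /// use std::alloc::System;
    ///
    /// let five = Rc::new_in(5, System);
    /// // Borrow the allocator without consuming or cloning the `Rc`.
    /// let _alloc: &System = Rc::allocator(&five);
    /// ```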
1405    #[inline]
1406    #[unstable(feature = "allocator_api", issue = "32838")]
1407    pub fn allocator(this: &Self) -> &A {
1408        &this.alloc
1409    }
1410
1411    /// Consumes the `Rc`, returning the wrapped pointer.
1412    ///
1413    /// To avoid a memory leak the pointer must be converted back to an `Rc` using
1414    /// [`Rc::from_raw`].
1415    ///
1416    /// # Examples
1417    ///
1418    /// ```
1419    /// use std::rc::Rc;
1420    ///
1421    /// let x = Rc::new("hello".to_owned());
1422    /// let x_ptr = Rc::into_raw(x);
1423    /// assert_eq!(unsafe { &*x_ptr }, "hello");
1424    /// # // Prevent leaks for Miri.
1425    /// # drop(unsafe { Rc::from_raw(x_ptr) });
1426    /// ```
1427    #[must_use = "losing the pointer will leak memory"]
1428    #[stable(feature = "rc_raw", since = "1.17.0")]
1429    #[rustc_never_returns_null_ptr]
1430    pub fn into_raw(this: Self) -> *const T {
1431        let this = ManuallyDrop::new(this);
1432        Self::as_ptr(&*this)
1433    }
1434
1435    /// Consumes the `Rc`, returning the wrapped pointer and allocator.
1436    ///
1437    /// To avoid a memory leak the pointer must be converted back to an `Rc` using
1438    /// [`Rc::from_raw_in`].
1439    ///
1440    /// # Examples
1441    ///
1442    /// ```
1443    /// #![feature(allocator_api)]
1444    /// use std::rc::Rc;
1445    /// use std::alloc::System;
1446    ///
1447    /// let x = Rc::new_in("hello".to_owned(), System);
1448    /// let (ptr, alloc) = Rc::into_raw_with_allocator(x);
1449    /// assert_eq!(unsafe { &*ptr }, "hello");
1450    /// let x = unsafe { Rc::from_raw_in(ptr, alloc) };
1451    /// assert_eq!(&*x, "hello");
1452    /// ```
1453    #[must_use = "losing the pointer will leak memory"]
1454    #[unstable(feature = "allocator_api", issue = "32838")]
1455    pub fn into_raw_with_allocator(this: Self) -> (*const T, A) {
1456        let this = mem::ManuallyDrop::new(this);
1457        let ptr = Self::as_ptr(&this);
1458        // Safety: `this` is ManuallyDrop so the allocator will not be double-dropped
1459        let alloc = unsafe { ptr::read(&this.alloc) };
1460        (ptr, alloc)
1461    }
1462
1463    /// Provides a raw pointer to the data.
1464    ///
1465    /// The counts are not affected in any way and the `Rc` is not consumed. The pointer is valid
1466    /// for as long as there are strong counts in the `Rc`.
1467    ///
1468    /// # Examples
1469    ///
1470    /// ```
1471    /// use std::rc::Rc;
1472    ///
1473    /// let x = Rc::new(0);
1474    /// let y = Rc::clone(&x);
1475    /// let x_ptr = Rc::as_ptr(&x);
1476    /// assert_eq!(x_ptr, Rc::as_ptr(&y));
1477    /// assert_eq!(unsafe { *x_ptr }, 0);
1478    /// ```
1479    #[stable(feature = "weak_into_raw", since = "1.45.0")]
1480    #[rustc_never_returns_null_ptr]
1481    pub fn as_ptr(this: &Self) -> *const T {
1482        let ptr: *mut RcInner<T> = NonNull::as_ptr(this.ptr);
1483
1484        // SAFETY: This cannot go through Deref::deref or Rc::inner because
1485        // this is required to retain raw/mut provenance such that e.g. `get_mut` can
1486        // write through the pointer after the Rc is recovered through `from_raw`.
1487        unsafe { &raw mut (*ptr).value }
1488    }
1489
1490    /// Constructs an `Rc<T, A>` from a raw pointer in the provided allocator.
1491    ///
1492    /// The raw pointer must have been previously returned by a call to [`Rc<U,
1493    /// A>::into_raw`][into_raw] with the following requirements:
1494    ///
1495    /// * If `U` is sized, it must have the same size and alignment as `T`. This
1496    ///   is trivially true if `U` is `T`.
1497    /// * If `U` is unsized, its data pointer must have the same size and
1498    ///   alignment as `T`. This is trivially true if `Rc<U>` was constructed
1499    ///   through `Rc<T>` and then converted to `Rc<U>` through an [unsized
1500    ///   coercion].
1501    ///
1502    /// Note that if `U` or `U`'s data pointer is not `T` but has the same size
1503    /// and alignment, this is basically like transmuting references of
1504    /// different types. See [`mem::transmute`][transmute] for more information
1505    /// on what restrictions apply in this case.
1506    ///
1507    /// The raw pointer must point to a block of memory allocated by `alloc`.
1508    ///
1509    /// The user of `from_raw` has to make sure a specific value of `T` is only
1510    /// dropped once.
1511    ///
1512    /// This function is unsafe because improper use may lead to memory unsafety,
1513    /// even if the returned `Rc<T>` is never accessed.
1514    ///
1515    /// [into_raw]: Rc::into_raw
1516    /// [transmute]: core::mem::transmute
1517    /// [unsized coercion]: https://doc.rust-lang.org/reference/type-coercions.html#unsized-coercions
1518    ///
1519    /// # Examples
1520    ///
1521    /// ```
1522    /// #![feature(allocator_api)]
1523    ///
1524    /// use std::rc::Rc;
1525    /// use std::alloc::System;
1526    ///
1527    /// let x = Rc::new_in("hello".to_owned(), System);
1528    /// let x_ptr = Rc::into_raw(x);
1529    ///
1530    /// unsafe {
1531    ///     // Convert back to an `Rc` to prevent a leak.
1532    ///     let x = Rc::from_raw_in(x_ptr, System);
1533    ///     assert_eq!(&*x, "hello");
1534    ///
1535    ///     // Further calls to `Rc::from_raw_in(x_ptr, System)` would be memory-unsafe.
1536    /// }
1537    ///
1538    /// // The memory was freed when `x` went out of scope above, so `x_ptr` is now dangling!
1539    /// ```
1540    ///
1541    /// Convert a slice back into its original array:
1542    ///
1543    /// ```
1544    /// #![feature(allocator_api)]
1545    ///
1546    /// use std::rc::Rc;
1547    /// use std::alloc::System;
1548    ///
1549    /// let x: Rc<[u32], _> = Rc::new_in([1, 2, 3], System);
1550    /// let x_ptr: *const [u32] = Rc::into_raw(x);
1551    ///
1552    /// unsafe {
1553    ///     let x: Rc<[u32; 3], _> = Rc::from_raw_in(x_ptr.cast::<[u32; 3]>(), System);
1554    ///     assert_eq!(&*x, &[1, 2, 3]);
1555    /// }
1556    /// ```
1557    #[unstable(feature = "allocator_api", issue = "32838")]
1558    pub unsafe fn from_raw_in(ptr: *const T, alloc: A) -> Self {
1559        let offset = unsafe { data_offset(ptr) };
1560
1561        // Reverse the offset to find the original RcInner.
1562        let rc_ptr = unsafe { ptr.byte_sub(offset) as *mut RcInner<T> };
1563
1564        unsafe { Self::from_ptr_in(rc_ptr, alloc) }
1565    }
1566
1567    /// Creates a new [`Weak`] pointer to this allocation.
1568    ///
1569    /// # Examples
1570    ///
1571    /// ```
1572    /// use std::rc::Rc;
1573    ///
1574    /// let five = Rc::new(5);
1575    ///
1576    /// let weak_five = Rc::downgrade(&five);
1577    /// ```
1578    #[must_use = "this returns a new `Weak` pointer, \
1579                  without modifying the original `Rc`"]
1580    #[stable(feature = "rc_weak", since = "1.4.0")]
1581    pub fn downgrade(this: &Self) -> Weak<T, A>
1582    where
1583        A: Clone,
1584    {
1585        this.inner().inc_weak();
1586        // Make sure we do not create a dangling Weak
1587        debug_assert!(!is_dangling(this.ptr.as_ptr()));
1588        Weak { ptr: this.ptr, alloc: this.alloc.clone() }
1589    }
1590
1591    /// Gets the number of [`Weak`] pointers to this allocation.
1592    ///
1593    /// # Examples
1594    ///
1595    /// ```
1596    /// use std::rc::Rc;
1597    ///
1598    /// let five = Rc::new(5);
1599    /// let _weak_five = Rc::downgrade(&five);
1600    ///
1601    /// assert_eq!(1, Rc::weak_count(&five));
1602    /// ```
1603    #[inline]
1604    #[stable(feature = "rc_counts", since = "1.15.0")]
1605    pub fn weak_count(this: &Self) -> usize {
1606        this.inner().weak() - 1
1607    }
1608
1609    /// Gets the number of strong (`Rc`) pointers to this allocation.
1610    ///
1611    /// # Examples
1612    ///
1613    /// ```
1614    /// use std::rc::Rc;
1615    ///
1616    /// let five = Rc::new(5);
1617    /// let _also_five = Rc::clone(&five);
1618    ///
1619    /// assert_eq!(2, Rc::strong_count(&five));
1620    /// ```
1621    #[inline]
1622    #[stable(feature = "rc_counts", since = "1.15.0")]
1623    pub fn strong_count(this: &Self) -> usize {
1624        this.inner().strong()
1625    }
1626
1627    /// Increments the strong reference count on the `Rc<T>` associated with the
1628    /// provided pointer by one.
1629    ///
1630    /// # Safety
1631    ///
1632    /// The pointer must have been obtained through `Rc::into_raw` and must satisfy the
1633    /// same layout requirements specified in [`Rc::from_raw_in`][from_raw_in].
1634    /// The associated `Rc` instance must be valid (i.e. the strong count must be at
1635    /// least 1) for the duration of this method, and `ptr` must point to a block of memory
1636    /// allocated by `alloc`.
1637    ///
1638    /// [from_raw_in]: Rc::from_raw_in
1639    ///
1640    /// # Examples
1641    ///
1642    /// ```
1643    /// #![feature(allocator_api)]
1644    ///
1645    /// use std::rc::Rc;
1646    /// use std::alloc::System;
1647    ///
1648    /// let five = Rc::new_in(5, System);
1649    ///
1650    /// unsafe {
1651    ///     let ptr = Rc::into_raw(five);
1652    ///     Rc::increment_strong_count_in(ptr, System);
1653    ///
1654    ///     let five = Rc::from_raw_in(ptr, System);
1655    ///     assert_eq!(2, Rc::strong_count(&five));
1656    /// #   // Prevent leaks for Miri.
1657    /// #   Rc::decrement_strong_count_in(ptr, System);
1658    /// }
1659    /// ```
1660    #[inline]
1661    #[unstable(feature = "allocator_api", issue = "32838")]
1662    pub unsafe fn increment_strong_count_in(ptr: *const T, alloc: A)
1663    where
1664        A: Clone,
1665    {
1666        // Retain the Rc without touching the refcount, by wrapping it in ManuallyDrop.
1667        let rc = unsafe { mem::ManuallyDrop::new(Rc::<T, A>::from_raw_in(ptr, alloc)) };
1668        // Now increase the refcount, without dropping the new clone either.
1669        let _rc_clone: mem::ManuallyDrop<_> = rc.clone();
1670    }
1671
1672    /// Decrements the strong reference count on the `Rc<T>` associated with the
1673    /// provided pointer by one.
1674    ///
1675    /// # Safety
1676    ///
1677    /// The pointer must have been obtained through `Rc::into_raw` and must satisfy the
1678    /// same layout requirements specified in [`Rc::from_raw_in`][from_raw_in].
1679    /// The associated `Rc` instance must be valid (i.e. the strong count must be at
1680    /// least 1) when invoking this method, and `ptr` must point to a block of memory
1681    /// allocated by `alloc`. This method can be used to release the final `Rc` and
1682    /// backing storage, but **should not** be called after the final `Rc` has been released.
1683    ///
1684    /// [from_raw_in]: Rc::from_raw_in
1685    ///
1686    /// # Examples
1687    ///
1688    /// ```
1689    /// #![feature(allocator_api)]
1690    ///
1691    /// use std::rc::Rc;
1692    /// use std::alloc::System;
1693    ///
1694    /// let five = Rc::new_in(5, System);
1695    ///
1696    /// unsafe {
1697    ///     let ptr = Rc::into_raw(five);
1698    ///     Rc::increment_strong_count_in(ptr, System);
1699    ///
1700    ///     let five = Rc::from_raw_in(ptr, System);
1701    ///     assert_eq!(2, Rc::strong_count(&five));
1702    ///     Rc::decrement_strong_count_in(ptr, System);
1703    ///     assert_eq!(1, Rc::strong_count(&five));
1704    /// }
1705    /// ```
1706    #[inline]
1707    #[unstable(feature = "allocator_api", issue = "32838")]
1708    pub unsafe fn decrement_strong_count_in(ptr: *const T, alloc: A) {
1709        unsafe { drop(Rc::from_raw_in(ptr, alloc)) };
1710    }
1711
1712    /// Returns `true` if there are no other `Rc` or [`Weak`] pointers to
1713    /// this allocation.
1714    #[inline]
1715    fn is_unique(this: &Self) -> bool {
1716        Rc::weak_count(this) == 0 && Rc::strong_count(this) == 1
1717    }
1718
1719    /// Returns a mutable reference into the given `Rc`, if there are
1720    /// no other `Rc` or [`Weak`] pointers to the same allocation.
1721    ///
1722    /// Returns [`None`] otherwise, because it is not safe to
1723    /// mutate a shared value.
1724    ///
1725    /// See also [`make_mut`][make_mut], which will [`clone`][clone]
1726    /// the inner value when there are other `Rc` pointers.
1727    ///
1728    /// [make_mut]: Rc::make_mut
1729    /// [clone]: Clone::clone
1730    ///
1731    /// # Examples
1732    ///
1733    /// ```
1734    /// use std::rc::Rc;
1735    ///
1736    /// let mut x = Rc::new(3);
1737    /// *Rc::get_mut(&mut x).unwrap() = 4;
1738    /// assert_eq!(*x, 4);
1739    ///
1740    /// let _y = Rc::clone(&x);
1741    /// assert!(Rc::get_mut(&mut x).is_none());
1742    /// ```
1743    #[inline]
1744    #[stable(feature = "rc_unique", since = "1.4.0")]
1745    pub fn get_mut(this: &mut Self) -> Option<&mut T> {
1746        if Rc::is_unique(this) { unsafe { Some(Rc::get_mut_unchecked(this)) } } else { None }
1747    }
1748
1749    /// Returns a mutable reference into the given `Rc`,
1750    /// without any check.
1751    ///
1752    /// See also [`get_mut`], which is safe and does appropriate checks.
1753    ///
1754    /// [`get_mut`]: Rc::get_mut
1755    ///
1756    /// # Safety
1757    ///
1758    /// If any other `Rc` or [`Weak`] pointers to the same allocation exist, then
1759    /// they must not be dereferenced or have active borrows for the duration
1760    /// of the returned borrow, and their inner type must be exactly the same as the
1761    /// inner type of this Rc (including lifetimes). This is trivially the case if no
1762    /// such pointers exist, for example immediately after `Rc::new`.
1763    ///
1764    /// # Examples
1765    ///
1766    /// ```
1767    /// #![feature(get_mut_unchecked)]
1768    ///
1769    /// use std::rc::Rc;
1770    ///
1771    /// let mut x = Rc::new(String::new());
1772    /// unsafe {
1773    ///     Rc::get_mut_unchecked(&mut x).push_str("foo")
1774    /// }
1775    /// assert_eq!(*x, "foo");
1776    /// ```
1777    /// Other `Rc` pointers to the same allocation must be to the same type.
1778    /// ```no_run
1779    /// #![feature(get_mut_unchecked)]
1780    ///
1781    /// use std::rc::Rc;
1782    ///
1783    /// let x: Rc<str> = Rc::from("Hello, world!");
1784    /// let mut y: Rc<[u8]> = x.clone().into();
1785    /// unsafe {
1786    ///     // this is Undefined Behavior, because x's inner type is str, not [u8]
1787    ///     Rc::get_mut_unchecked(&mut y).fill(0xff); // 0xff is invalid in UTF-8
1788    /// }
1789    /// println!("{}", &*x); // Invalid UTF-8 in a str
1790    /// ```
1791    /// Other `Rc` pointers to the same allocation must be to the exact same type, including lifetimes.
1792    /// ```no_run
1793    /// #![feature(get_mut_unchecked)]
1794    ///
1795    /// use std::rc::Rc;
1796    ///
1797    /// let x: Rc<&str> = Rc::new("Hello, world!");
1798    /// {
1799    ///     let s = String::from("Oh, no!");
1800    ///     let mut y: Rc<&str> = x.clone();
1801    ///     unsafe {
1802    ///         // this is Undefined Behavior, because x's inner type
1803    ///         // is &'long str, not &'short str
1804    ///         *Rc::get_mut_unchecked(&mut y) = &s;
1805    ///     }
1806    /// }
1807    /// println!("{}", &*x); // Use-after-free
1808    /// ```
1809    #[inline]
1810    #[unstable(feature = "get_mut_unchecked", issue = "63292")]
1811    pub unsafe fn get_mut_unchecked(this: &mut Self) -> &mut T {
1812        // We are careful to *not* create a reference covering the "count" fields, as
1813        // this would conflict with accesses to the reference counts (e.g. by `Weak`).
1814        unsafe { &mut (*this.ptr.as_ptr()).value }
1815    }
1816
1817    #[inline]
1818    #[stable(feature = "ptr_eq", since = "1.17.0")]
1819    /// Returns `true` if the two `Rc`s point to the same allocation in a vein similar to
1820    /// [`ptr::eq`]. This function ignores the metadata of `dyn Trait` pointers.
1821    ///
1822    /// # Examples
1823    ///
1824    /// ```
1825    /// use std::rc::Rc;
1826    ///
1827    /// let five = Rc::new(5);
1828    /// let same_five = Rc::clone(&five);
1829    /// let other_five = Rc::new(5);
1830    ///
1831    /// assert!(Rc::ptr_eq(&five, &same_five));
1832    /// assert!(!Rc::ptr_eq(&five, &other_five));
1833    /// ```
1834    pub fn ptr_eq(this: &Self, other: &Self) -> bool {
1835        ptr::addr_eq(this.ptr.as_ptr(), other.ptr.as_ptr())
1836    }
1837}
1838
1839#[cfg(not(no_global_oom_handling))]
1840impl<T: ?Sized + CloneToUninit, A: Allocator + Clone> Rc<T, A> {
1841    /// Makes a mutable reference into the given `Rc`.
1842    ///
1843    /// If there are other `Rc` pointers to the same allocation, then `make_mut` will
1844    /// [`clone`] the inner value to a new allocation to ensure unique ownership. This is also
1845    /// referred to as clone-on-write.
1846    ///
1847    /// However, if there are no other `Rc` pointers to this allocation, but some [`Weak`]
1848    /// pointers, then the [`Weak`] pointers will be disassociated and the inner value will not
1849    /// be cloned.
1850    ///
1851    /// See also [`get_mut`], which will fail rather than cloning the inner value
1852    /// or disassociating [`Weak`] pointers.
1853    ///
1854    /// [`clone`]: Clone::clone
1855    /// [`get_mut`]: Rc::get_mut
1856    ///
1857    /// # Examples
1858    ///
1859    /// ```
1860    /// use std::rc::Rc;
1861    ///
1862    /// let mut data = Rc::new(5);
1863    ///
1864    /// *Rc::make_mut(&mut data) += 1;         // Won't clone anything
1865    /// let mut other_data = Rc::clone(&data); // Won't clone inner data
1866    /// *Rc::make_mut(&mut data) += 1;         // Clones inner data
1867    /// *Rc::make_mut(&mut data) += 1;         // Won't clone anything
1868    /// *Rc::make_mut(&mut other_data) *= 2;   // Won't clone anything
1869    ///
1870    /// // Now `data` and `other_data` point to different allocations.
1871    /// assert_eq!(*data, 8);
1872    /// assert_eq!(*other_data, 12);
1873    /// ```
1874    ///
1875    /// [`Weak`] pointers will be disassociated:
1876    ///
1877    /// ```
1878    /// use std::rc::Rc;
1879    ///
1880    /// let mut data = Rc::new(75);
1881    /// let weak = Rc::downgrade(&data);
1882    ///
1883    /// assert!(75 == *data);
1884    /// assert!(75 == *weak.upgrade().unwrap());
1885    ///
1886    /// *Rc::make_mut(&mut data) += 1;
1887    ///
1888    /// assert!(76 == *data);
1889    /// assert!(weak.upgrade().is_none());
1890    /// ```
1891    #[inline]
1892    #[stable(feature = "rc_unique", since = "1.4.0")]
1893    pub fn make_mut(this: &mut Self) -> &mut T {
1894        let size_of_val = size_of_val::<T>(&**this);
1895
1896        if Rc::strong_count(this) != 1 {
1897            // Gotta clone the data, there are other Rcs.
1898
1899            let this_data_ref: &T = &**this;
1900            // `in_progress` drops the allocation if we panic before finishing initializing it.
1901            let mut in_progress: UniqueRcUninit<T, A> =
1902                UniqueRcUninit::new(this_data_ref, this.alloc.clone());
1903
1904            // Initialize with clone of this.
1905            let initialized_clone = unsafe {
1906                // Clone. If the clone panics, `in_progress` will be dropped and clean up.
1907                this_data_ref.clone_to_uninit(in_progress.data_ptr().cast());
1908                // Cast type of pointer, now that it is initialized.
1909                in_progress.into_rc()
1910            };
1911
1912            // Replace `this` with newly constructed Rc.
1913            *this = initialized_clone;
1914        } else if Rc::weak_count(this) != 0 {
1915            // Can just steal the data, all that's left is Weaks
1916
1917            // We don't need panic-protection like the above branch does, but we might as well
1918            // use the same mechanism.
1919            let mut in_progress: UniqueRcUninit<T, A> =
1920                UniqueRcUninit::new(&**this, this.alloc.clone());
1921            unsafe {
1922                // Initialize `in_progress` with move of **this.
1923                // We have to express this in terms of bytes because `T: ?Sized`; there is no
1924                // operation that just copies a value based on its `size_of_val()`.
1925                ptr::copy_nonoverlapping(
1926                    ptr::from_ref(&**this).cast::<u8>(),
1927                    in_progress.data_ptr().cast::<u8>(),
1928                    size_of_val,
1929                );
1930
1931                this.inner().dec_strong();
1932                // Remove implicit strong-weak ref (no need to craft a fake
1933                // Weak here -- we know other Weaks can clean up for us)
1934                this.inner().dec_weak();
1935                // Replace `this` with newly constructed Rc that has the moved data.
1936                ptr::write(this, in_progress.into_rc());
1937            }
1938        }
1939        // This unsafety is ok because we're guaranteed that the pointer
1940        // returned is the *only* pointer that will ever be returned to T. Our
1941        // reference count is guaranteed to be 1 at this point, and we required
1942        // the `Rc<T>` itself to be `mut`, so we're returning the only possible
1943        // reference to the allocation.
1944        unsafe { &mut this.ptr.as_mut().value }
1945    }
1946}
1947
1948impl<T: Clone, A: Allocator> Rc<T, A> {
1949    /// If we have the only reference to `T` then unwrap it. Otherwise, clone `T` and return the
1950    /// clone.
1951    ///
1952    /// Assuming `rc_t` is of type `Rc<T>`, this function is functionally equivalent to
1953    /// `(*rc_t).clone()`, but will avoid cloning the inner value where possible.
1954    ///
1955    /// # Examples
1956    ///
1957    /// ```
1958    /// # use std::{ptr, rc::Rc};
1959    /// let inner = String::from("test");
1960    /// let ptr = inner.as_ptr();
1961    ///
1962    /// let rc = Rc::new(inner);
1963    /// let inner = Rc::unwrap_or_clone(rc);
1964    /// // The inner value was not cloned
1965    /// assert!(ptr::eq(ptr, inner.as_ptr()));
1966    ///
1967    /// let rc = Rc::new(inner);
1968    /// let rc2 = rc.clone();
1969    /// let inner = Rc::unwrap_or_clone(rc);
1970    /// // Because there were 2 references, we had to clone the inner value.
1971    /// assert!(!ptr::eq(ptr, inner.as_ptr()));
1972    /// // `rc2` is the last reference, so when we unwrap it we get back
1973    /// // the original `String`.
1974    /// let inner = Rc::unwrap_or_clone(rc2);
1975    /// assert!(ptr::eq(ptr, inner.as_ptr()));
1976    /// ```
1977    #[inline]
1978    #[stable(feature = "arc_unwrap_or_clone", since = "1.76.0")]
1979    pub fn unwrap_or_clone(this: Self) -> T {
1980        Rc::try_unwrap(this).unwrap_or_else(|rc| (*rc).clone())
1981    }
1982}
1983
1984impl<A: Allocator> Rc<dyn Any, A> {
1985    /// Attempts to downcast the `Rc<dyn Any>` to a concrete type.
1986    ///
1987    /// # Examples
1988    ///
1989    /// ```
1990    /// use std::any::Any;
1991    /// use std::rc::Rc;
1992    ///
1993    /// fn print_if_string(value: Rc<dyn Any>) {
1994    ///     if let Ok(string) = value.downcast::<String>() {
1995    ///         println!("String ({}): {}", string.len(), string);
1996    ///     }
1997    /// }
1998    ///
1999    /// let my_string = "Hello World".to_string();
2000    /// print_if_string(Rc::new(my_string));
2001    /// print_if_string(Rc::new(0i8));
2002    /// ```
2003    #[inline]
2004    #[stable(feature = "rc_downcast", since = "1.29.0")]
2005    pub fn downcast<T: Any>(self) -> Result<Rc<T, A>, Self> {
2006        if (*self).is::<T>() {
2007            unsafe {
2008                let (ptr, alloc) = Rc::into_inner_with_allocator(self);
2009                Ok(Rc::from_inner_in(ptr.cast(), alloc))
2010            }
2011        } else {
2012            Err(self)
2013        }
2014    }
2015
2016    /// Downcasts the `Rc<dyn Any>` to a concrete type.
2017    ///
2018    /// For a safe alternative see [`downcast`].
2019    ///
2020    /// # Examples
2021    ///
2022    /// ```
2023    /// #![feature(downcast_unchecked)]
2024    ///
2025    /// use std::any::Any;
2026    /// use std::rc::Rc;
2027    ///
2028    /// let x: Rc<dyn Any> = Rc::new(1_usize);
2029    ///
2030    /// unsafe {
2031    ///     assert_eq!(*x.downcast_unchecked::<usize>(), 1);
2032    /// }
2033    /// ```
2034    ///
2035    /// # Safety
2036    ///
2037    /// The contained value must be of type `T`. Calling this method
2038    /// with the incorrect type is *undefined behavior*.
2039    ///
2041    /// [`downcast`]: Self::downcast
2042    #[inline]
2043    #[unstable(feature = "downcast_unchecked", issue = "90850")]
2044    pub unsafe fn downcast_unchecked<T: Any>(self) -> Rc<T, A> {
2045        unsafe {
2046            let (ptr, alloc) = Rc::into_inner_with_allocator(self);
2047            Rc::from_inner_in(ptr.cast(), alloc)
2048        }
2049    }
2050}
2051
2052impl<T: ?Sized> Rc<T> {
2053    /// Allocates an `RcInner<T>` with sufficient space for
2054    /// a possibly-unsized inner value where the value has the layout provided.
2055    ///
2056    /// The function `mem_to_rc_inner` is called with the data pointer
2057    /// and must return a (potentially fat) pointer to the `RcInner<T>`.
2058    #[cfg(not(no_global_oom_handling))]
2059    unsafe fn allocate_for_layout(
2060        value_layout: Layout,
2061        allocate: impl FnOnce(Layout) -> Result<NonNull<[u8]>, AllocError>,
2062        mem_to_rc_inner: impl FnOnce(*mut u8) -> *mut RcInner<T>,
2063    ) -> *mut RcInner<T> {
2064        let layout = rc_inner_layout_for_value_layout(value_layout);
2065        unsafe {
2066            Rc::try_allocate_for_layout(value_layout, allocate, mem_to_rc_inner)
2067                .unwrap_or_else(|_| handle_alloc_error(layout))
2068        }
2069    }
2070
2071    /// Allocates an `RcInner<T>` with sufficient space for
2072    /// a possibly-unsized inner value where the value has the layout provided,
2073    /// returning an error if allocation fails.
2074    ///
2075    /// The function `mem_to_rc_inner` is called with the data pointer
2076    /// and must return a (potentially fat) pointer to the `RcInner<T>`.
2077    #[inline]
2078    unsafe fn try_allocate_for_layout(
2079        value_layout: Layout,
2080        allocate: impl FnOnce(Layout) -> Result<NonNull<[u8]>, AllocError>,
2081        mem_to_rc_inner: impl FnOnce(*mut u8) -> *mut RcInner<T>,
2082    ) -> Result<*mut RcInner<T>, AllocError> {
2083        let layout = rc_inner_layout_for_value_layout(value_layout);
2084
2085        // Allocate for the layout.
2086        let ptr = allocate(layout)?;
2087
2088        // Initialize the RcInner
2089        let inner = mem_to_rc_inner(ptr.as_non_null_ptr().as_ptr());
2090        unsafe {
2091            debug_assert_eq!(Layout::for_value_raw(inner), layout);
2092
2093            (&raw mut (*inner).strong).write(Cell::new(1));
2094            (&raw mut (*inner).weak).write(Cell::new(1));
2095        }
2096
2097        Ok(inner)
2098    }
2099}
2100
2101impl<T: ?Sized, A: Allocator> Rc<T, A> {
2102    /// Allocates an `RcInner<T>` with sufficient space for an unsized inner value
2103    #[cfg(not(no_global_oom_handling))]
2104    unsafe fn allocate_for_ptr_in(ptr: *const T, alloc: &A) -> *mut RcInner<T> {
2105        // Allocate for the `RcInner<T>` using the given value.
2106        unsafe {
2107            Rc::<T>::allocate_for_layout(
2108                Layout::for_value_raw(ptr),
2109                |layout| alloc.allocate(layout),
2110                |mem| mem.with_metadata_of(ptr as *const RcInner<T>),
2111            )
2112        }
2113    }
2114
2115    #[cfg(not(no_global_oom_handling))]
2116    fn from_box_in(src: Box<T, A>) -> Rc<T, A> {
2117        unsafe {
2118            let value_size = size_of_val(&*src);
2119            let ptr = Self::allocate_for_ptr_in(&*src, Box::allocator(&src));
2120
2121            // Copy value as bytes
2122            ptr::copy_nonoverlapping(
2123                (&raw const *src) as *const u8,
2124                (&raw mut (*ptr).value) as *mut u8,
2125                value_size,
2126            );
2127
2128            // Free the allocation without dropping its contents
2129            let (bptr, alloc) = Box::into_raw_with_allocator(src);
2130            let src = Box::from_raw_in(bptr as *mut mem::ManuallyDrop<T>, alloc.by_ref());
2131            drop(src);
2132
2133            Self::from_ptr_in(ptr, alloc)
2134        }
2135    }
2136}
2137
2138impl<T> Rc<[T]> {
2139    /// Allocates an `RcInner<[T]>` with the given length.
2140    #[cfg(not(no_global_oom_handling))]
2141    unsafe fn allocate_for_slice(len: usize) -> *mut RcInner<[T]> {
2142        unsafe {
2143            Self::allocate_for_layout(
2144                Layout::array::<T>(len).unwrap(),
2145                |layout| Global.allocate(layout),
2146                |mem| ptr::slice_from_raw_parts_mut(mem.cast::<T>(), len) as *mut RcInner<[T]>,
2147            )
2148        }
2149    }
2150
2151    /// Copy elements from slice into newly allocated `Rc<[T]>`
2152    ///
2153    /// Unsafe because the caller must either take ownership or bind `T: Copy`
2154    #[cfg(not(no_global_oom_handling))]
2155    unsafe fn copy_from_slice(v: &[T]) -> Rc<[T]> {
2156        unsafe {
2157            let ptr = Self::allocate_for_slice(v.len());
2158            ptr::copy_nonoverlapping(v.as_ptr(), (&raw mut (*ptr).value) as *mut T, v.len());
2159            Self::from_ptr(ptr)
2160        }
2161    }
2162
2163    /// Constructs an `Rc<[T]>` from an iterator known to be of a certain size.
2164    ///
2165    /// Behavior is undefined should the size be wrong.
2166    #[cfg(not(no_global_oom_handling))]
2167    unsafe fn from_iter_exact(iter: impl Iterator<Item = T>, len: usize) -> Rc<[T]> {
2168        // Panic guard while cloning T elements.
2169        // In the event of a panic, elements that have been written
2170        // into the new RcInner will be dropped, then the memory freed.
2171        struct Guard<T> {
2172            mem: NonNull<u8>,
2173            elems: *mut T,
2174            layout: Layout,
2175            n_elems: usize,
2176        }
2177
2178        impl<T> Drop for Guard<T> {
2179            fn drop(&mut self) {
2180                unsafe {
2181                    let slice = from_raw_parts_mut(self.elems, self.n_elems);
2182                    ptr::drop_in_place(slice);
2183
2184                    Global.deallocate(self.mem, self.layout);
2185                }
2186            }
2187        }
2188
2189        unsafe {
2190            let ptr = Self::allocate_for_slice(len);
2191
2192            let mem = ptr as *mut _ as *mut u8;
2193            let layout = Layout::for_value_raw(ptr);
2194
2195            // Pointer to first element
2196            let elems = (&raw mut (*ptr).value) as *mut T;
2197
2198            let mut guard = Guard { mem: NonNull::new_unchecked(mem), elems, layout, n_elems: 0 };
2199
2200            for (i, item) in iter.enumerate() {
2201                ptr::write(elems.add(i), item);
2202                guard.n_elems += 1;
2203            }
2204
2205            // All clear. Forget the guard so it doesn't free the new RcInner.
2206            mem::forget(guard);
2207
2208            Self::from_ptr(ptr)
2209        }
2210    }
2211}
2212
2213impl<T, A: Allocator> Rc<[T], A> {
2214    /// Allocates an `RcInner<[T]>` with the given length.
2215    #[inline]
2216    #[cfg(not(no_global_oom_handling))]
2217    unsafe fn allocate_for_slice_in(len: usize, alloc: &A) -> *mut RcInner<[T]> {
2218        unsafe {
2219            Rc::<[T]>::allocate_for_layout(
2220                Layout::array::<T>(len).unwrap(),
2221                |layout| alloc.allocate(layout),
2222                |mem| ptr::slice_from_raw_parts_mut(mem.cast::<T>(), len) as *mut RcInner<[T]>,
2223            )
2224        }
2225    }
2226}
2227
2228#[cfg(not(no_global_oom_handling))]
2229/// Specialization trait used for `From<&[T]>`.
2230trait RcFromSlice<T> {
2231    fn from_slice(slice: &[T]) -> Self;
2232}
2233
2234#[cfg(not(no_global_oom_handling))]
2235impl<T: Clone> RcFromSlice<T> for Rc<[T]> {
2236    #[inline]
2237    default fn from_slice(v: &[T]) -> Self {
2238        unsafe { Self::from_iter_exact(v.iter().cloned(), v.len()) }
2239    }
2240}
2241
2242#[cfg(not(no_global_oom_handling))]
2243impl<T: Copy> RcFromSlice<T> for Rc<[T]> {
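    /// Since `T: Copy` guarantees that a bitwise copy is a valid clone, the
    /// whole slice can be copied with a single `ptr::copy_nonoverlapping`.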
2244    #[inline]
2245    fn from_slice(v: &[T]) -> Self {
2246        unsafe { Rc::copy_from_slice(v) }
2247    }
2248}
2249
2250#[stable(feature = "rust1", since = "1.0.0")]
2251impl<T: ?Sized, A: Allocator> Deref for Rc<T, A> {
2252    type Target = T;
2253
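    /// Returns a shared reference to the inner value. For example:
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let s = Rc::new(String::from("hello"));
    /// assert_eq!(s.len(), 5); // `String::len`, reached through `Deref`
    /// assert_eq!(*s, "hello");
    /// ```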
2254    #[inline(always)]
2255    fn deref(&self) -> &T {
2256        &self.inner().value
2257    }
2258}
2259
2260#[unstable(feature = "pin_coerce_unsized_trait", issue = "123430")]
2261unsafe impl<T: ?Sized, A: Allocator> PinCoerceUnsized for Rc<T, A> {}
2262
2263//#[unstable(feature = "unique_rc_arc", issue = "112566")]
2264#[unstable(feature = "pin_coerce_unsized_trait", issue = "123430")]
2265unsafe impl<T: ?Sized, A: Allocator> PinCoerceUnsized for UniqueRc<T, A> {}
2266
2267#[unstable(feature = "pin_coerce_unsized_trait", issue = "123430")]
2268unsafe impl<T: ?Sized, A: Allocator> PinCoerceUnsized for Weak<T, A> {}
2269
2270#[unstable(feature = "deref_pure_trait", issue = "87121")]
2271unsafe impl<T: ?Sized, A: Allocator> DerefPure for Rc<T, A> {}
2272
2273//#[unstable(feature = "unique_rc_arc", issue = "112566")]
2274#[unstable(feature = "deref_pure_trait", issue = "87121")]
2275unsafe impl<T: ?Sized, A: Allocator> DerefPure for UniqueRc<T, A> {}
2276
2277#[unstable(feature = "legacy_receiver_trait", issue = "none")]
2278impl<T: ?Sized> LegacyReceiver for Rc<T> {}
2279
2280#[stable(feature = "rust1", since = "1.0.0")]
2281unsafe impl<#[may_dangle] T: ?Sized, A: Allocator> Drop for Rc<T, A> {
2282    /// Drops the `Rc`.
2283    ///
2284    /// This will decrement the strong reference count. If the strong reference
2285    /// count reaches zero then the only other references (if any) are
2286    /// [`Weak`], so we `drop` the inner value.
2287    ///
2288    /// # Examples
2289    ///
2290    /// ```
2291    /// use std::rc::Rc;
2292    ///
2293    /// struct Foo;
2294    ///
2295    /// impl Drop for Foo {
2296    ///     fn drop(&mut self) {
2297    ///         println!("dropped!");
2298    ///     }
2299    /// }
2300    ///
2301    /// let foo  = Rc::new(Foo);
2302    /// let foo2 = Rc::clone(&foo);
2303    ///
2304    /// drop(foo);    // Doesn't print anything
2305    /// drop(foo2);   // Prints "dropped!"
2306    /// ```
2307    #[inline]
2308    fn drop(&mut self) {
2309        unsafe {
2310            self.inner().dec_strong();
2311            if self.inner().strong() == 0 {
2312                self.drop_slow();
2313            }
2314        }
2315    }
2316}
2317
2318#[stable(feature = "rust1", since = "1.0.0")]
2319impl<T: ?Sized, A: Allocator + Clone> Clone for Rc<T, A> {
2320    /// Makes a clone of the `Rc` pointer.
2321    ///
2322    /// This creates another pointer to the same allocation, increasing the
2323    /// strong reference count.
2324    ///
2325    /// # Examples
2326    ///
2327    /// ```
2328    /// use std::rc::Rc;
2329    ///
2330    /// let five = Rc::new(5);
2331    ///
2332    /// let _ = Rc::clone(&five);
2333    /// ```
2334    #[inline]
2335    fn clone(&self) -> Self {
2336        unsafe {
2337            self.inner().inc_strong();
2338            Self::from_inner_in(self.ptr, self.alloc.clone())
2339        }
2340    }
2341}
2342
2343#[unstable(feature = "ergonomic_clones", issue = "132290")]
2344impl<T: ?Sized, A: Allocator + Clone> UseCloned for Rc<T, A> {}
2345
2346#[cfg(not(no_global_oom_handling))]
2347#[stable(feature = "rust1", since = "1.0.0")]
2348impl<T: Default> Default for Rc<T> {
2349    /// Creates a new `Rc<T>`, with the `Default` value for `T`.
2350    ///
2351    /// # Examples
2352    ///
2353    /// ```
2354    /// use std::rc::Rc;
2355    ///
2356    /// let x: Rc<i32> = Default::default();
2357    /// assert_eq!(*x, 0);
2358    /// ```
2359    #[inline]
2360    fn default() -> Rc<T> {
2361        unsafe {
2362            Self::from_inner(
2363                Box::leak(Box::write(
2364                    Box::new_uninit(),
2365                    RcInner { strong: Cell::new(1), weak: Cell::new(1), value: T::default() },
2366                ))
2367                .into(),
2368            )
2369        }
2370    }
2371}
2372
2373#[cfg(not(no_global_oom_handling))]
2374#[stable(feature = "more_rc_default_impls", since = "1.80.0")]
2375impl Default for Rc<str> {
2376    /// Creates an empty `str` inside an `Rc`.
2377    ///
2378    /// This may or may not share an allocation with other Rcs on the same thread.
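    ///
    /// # Examples
    ///
    /// A quick check that the result is empty:
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let s: Rc<str> = Default::default();
    /// assert_eq!(&*s, "");
    /// ```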
2379    #[inline]
2380    fn default() -> Self {
2381        let rc = Rc::<[u8]>::default();
2382        // `[u8]` has the same layout as `str`.
2383        unsafe { Rc::from_raw(Rc::into_raw(rc) as *const str) }
2384    }
2385}
2386
2387#[cfg(not(no_global_oom_handling))]
2388#[stable(feature = "more_rc_default_impls", since = "1.80.0")]
2389impl<T> Default for Rc<[T]> {
2390    /// Creates an empty `[T]` inside an `Rc`.
2391    ///
2392    /// This may or may not share an allocation with other Rcs on the same thread.
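    ///
    /// # Examples
    ///
    /// A quick check that the result is empty:
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let v: Rc<[i32]> = Default::default();
    /// assert!(v.is_empty());
    /// ```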
2393    #[inline]
2394    fn default() -> Self {
2395        let arr: [T; 0] = [];
2396        Rc::from(arr)
2397    }
2398}
2399
2400#[stable(feature = "rust1", since = "1.0.0")]
2401trait RcEqIdent<T: ?Sized + PartialEq, A: Allocator> {
2402    fn eq(&self, other: &Rc<T, A>) -> bool;
2403    fn ne(&self, other: &Rc<T, A>) -> bool;
2404}
2405
2406#[stable(feature = "rust1", since = "1.0.0")]
2407impl<T: ?Sized + PartialEq, A: Allocator> RcEqIdent<T, A> for Rc<T, A> {
2408    #[inline]
2409    default fn eq(&self, other: &Rc<T, A>) -> bool {
2410        **self == **other
2411    }
2412
2413    #[inline]
2414    default fn ne(&self, other: &Rc<T, A>) -> bool {
2415        **self != **other
2416    }
2417}
2418
2419// Hack to allow specializing on `Eq` even though `Eq` has a method.
2420#[rustc_unsafe_specialization_marker]
2421pub(crate) trait MarkerEq: PartialEq<Self> {}
2422
2423impl<T: Eq> MarkerEq for T {}
2424
2425/// We're doing this specialization here, and not as a more general optimization on `&T`, because it
2426/// would otherwise add a cost to all equality checks on refs. We assume that `Rc`s are used to
2427/// store large values that are slow to clone but are also heavy to check for equality, so this
2428/// cost pays off more easily. It's also more likely for two `Rc` clones to point to
2429/// the same value than for two `&T`s.
2430///
2431/// We can only do this when `T: Eq` as a `PartialEq` might be deliberately irreflexive.
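///
/// For example, with `T: Eq`, comparing an `Rc` against its own clone is
/// resolved by pointer identity alone, without reading the inner values:
///
/// ```
/// use std::rc::Rc;
///
/// let a = Rc::new(vec![1, 2, 3]);
/// let b = Rc::clone(&a);
/// assert!(a == b); // short-circuits on `Rc::ptr_eq`
/// ```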
2432#[stable(feature = "rust1", since = "1.0.0")]
2433impl<T: ?Sized + MarkerEq, A: Allocator> RcEqIdent<T, A> for Rc<T, A> {
2434    #[inline]
2435    fn eq(&self, other: &Rc<T, A>) -> bool {
2436        Rc::ptr_eq(self, other) || **self == **other
2437    }
2438
2439    #[inline]
2440    fn ne(&self, other: &Rc<T, A>) -> bool {
2441        !Rc::ptr_eq(self, other) && **self != **other
2442    }
2443}
2444
2445#[stable(feature = "rust1", since = "1.0.0")]
2446impl<T: ?Sized + PartialEq, A: Allocator> PartialEq for Rc<T, A> {
2447    /// Equality for two `Rc`s.
2448    ///
2449    /// Two `Rc`s are equal if their inner values are equal, even if they are
2450    /// stored in different allocations.
2451    ///
2452    /// If `T` also implements `Eq` (implying reflexivity of equality),
2453    /// two `Rc`s that point to the same allocation are
2454    /// always equal.
2455    ///
2456    /// # Examples
2457    ///
2458    /// ```
2459    /// use std::rc::Rc;
2460    ///
2461    /// let five = Rc::new(5);
2462    ///
2463    /// assert!(five == Rc::new(5));
2464    /// ```
2465    #[inline]
2466    fn eq(&self, other: &Rc<T, A>) -> bool {
2467        RcEqIdent::eq(self, other)
2468    }
2469
2470    /// Inequality for two `Rc`s.
2471    ///
2472    /// Two `Rc`s are not equal if their inner values are not equal.
2473    ///
2474    /// If `T` also implements `Eq` (implying reflexivity of equality),
2475    /// two `Rc`s that point to the same allocation are
2476    /// always equal.
2477    ///
2478    /// # Examples
2479    ///
2480    /// ```
2481    /// use std::rc::Rc;
2482    ///
2483    /// let five = Rc::new(5);
2484    ///
2485    /// assert!(five != Rc::new(6));
2486    /// ```
2487    #[inline]
2488    fn ne(&self, other: &Rc<T, A>) -> bool {
2489        RcEqIdent::ne(self, other)
2490    }
2491}
2492
2493#[stable(feature = "rust1", since = "1.0.0")]
2494impl<T: ?Sized + Eq, A: Allocator> Eq for Rc<T, A> {}
2495
2496#[stable(feature = "rust1", since = "1.0.0")]
2497impl<T: ?Sized + PartialOrd, A: Allocator> PartialOrd for Rc<T, A> {
2498    /// Partial comparison for two `Rc`s.
2499    ///
2500    /// The two are compared by calling `partial_cmp()` on their inner values.
2501    ///
2502    /// # Examples
2503    ///
2504    /// ```
2505    /// use std::rc::Rc;
2506    /// use std::cmp::Ordering;
2507    ///
2508    /// let five = Rc::new(5);
2509    ///
2510    /// assert_eq!(Some(Ordering::Less), five.partial_cmp(&Rc::new(6)));
2511    /// ```
2512    #[inline(always)]
2513    fn partial_cmp(&self, other: &Rc<T, A>) -> Option<Ordering> {
2514        (**self).partial_cmp(&**other)
2515    }
2516
2517    /// Less-than comparison for two `Rc`s.
2518    ///
2519    /// The two are compared by calling `<` on their inner values.
2520    ///
2521    /// # Examples
2522    ///
2523    /// ```
2524    /// use std::rc::Rc;
2525    ///
2526    /// let five = Rc::new(5);
2527    ///
2528    /// assert!(five < Rc::new(6));
2529    /// ```
2530    #[inline(always)]
2531    fn lt(&self, other: &Rc<T, A>) -> bool {
2532        **self < **other
2533    }
2534
2535    /// 'Less than or equal to' comparison for two `Rc`s.
2536    ///
2537    /// The two are compared by calling `<=` on their inner values.
2538    ///
2539    /// # Examples
2540    ///
2541    /// ```
2542    /// use std::rc::Rc;
2543    ///
2544    /// let five = Rc::new(5);
2545    ///
2546    /// assert!(five <= Rc::new(5));
2547    /// ```
2548    #[inline(always)]
2549    fn le(&self, other: &Rc<T, A>) -> bool {
2550        **self <= **other
2551    }
2552
2553    /// Greater-than comparison for two `Rc`s.
2554    ///
2555    /// The two are compared by calling `>` on their inner values.
2556    ///
2557    /// # Examples
2558    ///
2559    /// ```
2560    /// use std::rc::Rc;
2561    ///
2562    /// let five = Rc::new(5);
2563    ///
2564    /// assert!(five > Rc::new(4));
2565    /// ```
2566    #[inline(always)]
2567    fn gt(&self, other: &Rc<T, A>) -> bool {
2568        **self > **other
2569    }
2570
2571    /// 'Greater than or equal to' comparison for two `Rc`s.
2572    ///
2573    /// The two are compared by calling `>=` on their inner values.
2574    ///
2575    /// # Examples
2576    ///
2577    /// ```
2578    /// use std::rc::Rc;
2579    ///
2580    /// let five = Rc::new(5);
2581    ///
2582    /// assert!(five >= Rc::new(5));
2583    /// ```
2584    #[inline(always)]
2585    fn ge(&self, other: &Rc<T, A>) -> bool {
2586        **self >= **other
2587    }
2588}
2589
2590#[stable(feature = "rust1", since = "1.0.0")]
2591impl<T: ?Sized + Ord, A: Allocator> Ord for Rc<T, A> {
2592    /// Comparison for two `Rc`s.
2593    ///
2594    /// The two are compared by calling `cmp()` on their inner values.
2595    ///
2596    /// # Examples
2597    ///
2598    /// ```
2599    /// use std::rc::Rc;
2600    /// use std::cmp::Ordering;
2601    ///
2602    /// let five = Rc::new(5);
2603    ///
2604    /// assert_eq!(Ordering::Less, five.cmp(&Rc::new(6)));
2605    /// ```
2606    #[inline]
2607    fn cmp(&self, other: &Rc<T, A>) -> Ordering {
2608        (**self).cmp(&**other)
2609    }
2610}
2611
2612#[stable(feature = "rust1", since = "1.0.0")]
2613impl<T: ?Sized + Hash, A: Allocator> Hash for Rc<T, A> {
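    /// Hashes the inner value, so an `Rc<T>` hashes the same as the `T` it
    /// wraps; a small sketch:
    ///
    /// ```
    /// use std::collections::hash_map::DefaultHasher;
    /// use std::hash::{Hash, Hasher};
    /// use std::rc::Rc;
    ///
    /// fn hash_one(value: impl Hash) -> u64 {
    ///     let mut hasher = DefaultHasher::new();
    ///     value.hash(&mut hasher);
    ///     hasher.finish()
    /// }
    ///
    /// assert_eq!(hash_one(Rc::new(5)), hash_one(5));
    /// ```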
2614    fn hash<H: Hasher>(&self, state: &mut H) {
2615        (**self).hash(state);
2616    }
2617}
2618
2619#[stable(feature = "rust1", since = "1.0.0")]
2620impl<T: ?Sized + fmt::Display, A: Allocator> fmt::Display for Rc<T, A> {
2621    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
2622        fmt::Display::fmt(&**self, f)
2623    }
2624}
2625
2626#[stable(feature = "rust1", since = "1.0.0")]
2627impl<T: ?Sized + fmt::Debug, A: Allocator> fmt::Debug for Rc<T, A> {
2628    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
2629        fmt::Debug::fmt(&**self, f)
2630    }
2631}
2632
2633#[stable(feature = "rust1", since = "1.0.0")]
2634impl<T: ?Sized, A: Allocator> fmt::Pointer for Rc<T, A> {
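    /// Formats the address of the inner value, so two `Rc`s that share an
    /// allocation print the same pointer:
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let x = Rc::new(5);
    /// let y = Rc::clone(&x);
    /// assert_eq!(format!("{:p}", x), format!("{:p}", y));
    /// ```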
2635    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
2636        fmt::Pointer::fmt(&(&raw const **self), f)
2637    }
2638}
2639
2640#[cfg(not(no_global_oom_handling))]
2641#[stable(feature = "from_for_ptrs", since = "1.6.0")]
2642impl<T> From<T> for Rc<T> {
2643    /// Converts a generic type `T` into an `Rc<T>`.
2644    ///
2645    /// The conversion allocates on the heap and moves `t`
2646    /// from the stack into it.
2647    ///
2648    /// # Example
2649    /// ```rust
2650    /// # use std::rc::Rc;
2651    /// let x = 5;
2652    /// let rc = Rc::new(5);
2653    ///
2654    /// assert_eq!(Rc::from(x), rc);
2655    /// ```
2656    fn from(t: T) -> Self {
2657        Rc::new(t)
2658    }
2659}
2660
2661#[cfg(not(no_global_oom_handling))]
2662#[stable(feature = "shared_from_array", since = "1.74.0")]
2663impl<T, const N: usize> From<[T; N]> for Rc<[T]> {
2664    /// Converts a [`[T; N]`](prim@array) into an `Rc<[T]>`.
2665    ///
2666    /// The conversion moves the array into a newly allocated `Rc`.
2667    ///
2668    /// # Example
2669    ///
2670    /// ```
2671    /// # use std::rc::Rc;
2672    /// let original: [i32; 3] = [1, 2, 3];
2673    /// let shared: Rc<[i32]> = Rc::from(original);
2674    /// assert_eq!(&[1, 2, 3], &shared[..]);
2675    /// ```
2676    #[inline]
2677    fn from(v: [T; N]) -> Rc<[T]> {
2678        Rc::<[T; N]>::from(v)
2679    }
2680}
2681
2682#[cfg(not(no_global_oom_handling))]
2683#[stable(feature = "shared_from_slice", since = "1.21.0")]
2684impl<T: Clone> From<&[T]> for Rc<[T]> {
2685    /// Allocates a reference-counted slice and fills it by cloning `v`'s items.
2686    ///
2687    /// # Example
2688    ///
2689    /// ```
2690    /// # use std::rc::Rc;
2691    /// let original: &[i32] = &[1, 2, 3];
2692    /// let shared: Rc<[i32]> = Rc::from(original);
2693    /// assert_eq!(&[1, 2, 3], &shared[..]);
2694    /// ```
2695    #[inline]
2696    fn from(v: &[T]) -> Rc<[T]> {
2697        <Self as RcFromSlice<T>>::from_slice(v)
2698    }
2699}
2700
2701#[cfg(not(no_global_oom_handling))]
2702#[stable(feature = "shared_from_mut_slice", since = "1.84.0")]
2703impl<T: Clone> From<&mut [T]> for Rc<[T]> {
2704    /// Allocates a reference-counted slice and fills it by cloning `v`'s items.
2705    ///
2706    /// # Example
2707    ///
2708    /// ```
2709    /// # use std::rc::Rc;
2710    /// let mut original = [1, 2, 3];
2711    /// let original: &mut [i32] = &mut original;
2712    /// let shared: Rc<[i32]> = Rc::from(original);
2713    /// assert_eq!(&[1, 2, 3], &shared[..]);
2714    /// ```
2715    #[inline]
2716    fn from(v: &mut [T]) -> Rc<[T]> {
2717        Rc::from(&*v)
2718    }
2719}
2720
2721#[cfg(not(no_global_oom_handling))]
2722#[stable(feature = "shared_from_slice", since = "1.21.0")]
2723impl From<&str> for Rc<str> {
2724    /// Allocates a reference-counted string slice and copies `v` into it.
2725    ///
2726    /// # Example
2727    ///
2728    /// ```
2729    /// # use std::rc::Rc;
2730    /// let shared: Rc<str> = Rc::from("statue");
2731    /// assert_eq!("statue", &shared[..]);
2732    /// ```
2733    #[inline]
2734    fn from(v: &str) -> Rc<str> {
2735        let rc = Rc::<[u8]>::from(v.as_bytes());
2736        unsafe { Rc::from_raw(Rc::into_raw(rc) as *const str) }
2737    }
2738}
2739
2740#[cfg(not(no_global_oom_handling))]
2741#[stable(feature = "shared_from_mut_slice", since = "1.84.0")]
2742impl From<&mut str> for Rc<str> {
2743    /// Allocates a reference-counted string slice and copies `v` into it.
2744    ///
2745    /// # Example
2746    ///
2747    /// ```
2748    /// # use std::rc::Rc;
2749    /// let mut original = String::from("statue");
2750    /// let original: &mut str = &mut original;
2751    /// let shared: Rc<str> = Rc::from(original);
2752    /// assert_eq!("statue", &shared[..]);
2753    /// ```
2754    #[inline]
2755    fn from(v: &mut str) -> Rc<str> {
2756        Rc::from(&*v)
2757    }
2758}
2759
2760#[cfg(not(no_global_oom_handling))]
2761#[stable(feature = "shared_from_slice", since = "1.21.0")]
2762impl From<String> for Rc<str> {
2763    /// Allocates a reference-counted string slice and copies `v` into it.
2764    ///
2765    /// # Example
2766    ///
2767    /// ```
2768    /// # use std::rc::Rc;
2769    /// let original: String = "statue".to_owned();
2770    /// let shared: Rc<str> = Rc::from(original);
2771    /// assert_eq!("statue", &shared[..]);
2772    /// ```
2773    #[inline]
2774    fn from(v: String) -> Rc<str> {
2775        Rc::from(&v[..])
2776    }
2777}
2778
2779#[cfg(not(no_global_oom_handling))]
2780#[stable(feature = "shared_from_slice", since = "1.21.0")]
2781impl<T: ?Sized, A: Allocator> From<Box<T, A>> for Rc<T, A> {
2782    /// Moves a boxed object to a new, reference-counted allocation.
2783    ///
2784    /// # Example
2785    ///
2786    /// ```
2787    /// # use std::rc::Rc;
2788    /// let original: Box<i32> = Box::new(1);
2789    /// let shared: Rc<i32> = Rc::from(original);
2790    /// assert_eq!(1, *shared);
2791    /// ```
2792    #[inline]
2793    fn from(v: Box<T, A>) -> Rc<T, A> {
2794        Rc::from_box_in(v)
2795    }
2796}
2797
2798#[cfg(not(no_global_oom_handling))]
2799#[stable(feature = "shared_from_slice", since = "1.21.0")]
2800impl<T, A: Allocator> From<Vec<T, A>> for Rc<[T], A> {
2801    /// Allocates a reference-counted slice and moves `v`'s items into it.
2802    ///
2803    /// # Example
2804    ///
2805    /// ```
2806    /// # use std::rc::Rc;
2807    /// let unique: Vec<i32> = vec![1, 2, 3];
2808    /// let shared: Rc<[i32]> = Rc::from(unique);
2809    /// assert_eq!(&[1, 2, 3], &shared[..]);
2810    /// ```
2811    #[inline]
2812    fn from(v: Vec<T, A>) -> Rc<[T], A> {
2813        unsafe {
2814            let (vec_ptr, len, cap, alloc) = v.into_raw_parts_with_alloc();
2815
2816            let rc_ptr = Self::allocate_for_slice_in(len, &alloc);
2817            ptr::copy_nonoverlapping(vec_ptr, (&raw mut (*rc_ptr).value) as *mut T, len);
2818
2819            // Create a `Vec<T, &A>` with length 0, to deallocate the buffer
2820            // without dropping its contents or the allocator
2821            let _ = Vec::from_raw_parts_in(vec_ptr, 0, cap, &alloc);
2822
2823            Self::from_ptr_in(rc_ptr, alloc)
2824        }
2825    }
2826}
2827
2828#[stable(feature = "shared_from_cow", since = "1.45.0")]
2829impl<'a, B> From<Cow<'a, B>> for Rc<B>
2830where
2831    B: ToOwned + ?Sized,
2832    Rc<B>: From<&'a B> + From<B::Owned>,
2833{
2834    /// Creates a reference-counted pointer from a clone-on-write pointer by
2835    /// copying its content.
2836    ///
2837    /// # Example
2838    ///
2839    /// ```rust
2840    /// # use std::rc::Rc;
2841    /// # use std::borrow::Cow;
2842    /// let cow: Cow<'_, str> = Cow::Borrowed("eggplant");
2843    /// let shared: Rc<str> = Rc::from(cow);
2844    /// assert_eq!("eggplant", &shared[..]);
2845    /// ```
2846    #[inline]
2847    fn from(cow: Cow<'a, B>) -> Rc<B> {
2848        match cow {
2849            Cow::Borrowed(s) => Rc::from(s),
2850            Cow::Owned(s) => Rc::from(s),
2851        }
2852    }
2853}
2854
2855#[stable(feature = "shared_from_str", since = "1.62.0")]
2856impl From<Rc<str>> for Rc<[u8]> {
2857    /// Converts a reference-counted string slice into a byte slice.
2858    ///
2859    /// # Example
2860    ///
2861    /// ```
2862    /// # use std::rc::Rc;
2863    /// let string: Rc<str> = Rc::from("eggplant");
2864    /// let bytes: Rc<[u8]> = Rc::from(string);
2865    /// assert_eq!("eggplant".as_bytes(), bytes.as_ref());
2866    /// ```
2867    #[inline]
2868    fn from(rc: Rc<str>) -> Self {
2869        // SAFETY: `str` has the same layout as `[u8]`.
2870        unsafe { Rc::from_raw(Rc::into_raw(rc) as *const [u8]) }
2871    }
2872}
2873
2874#[stable(feature = "boxed_slice_try_from", since = "1.43.0")]
2875impl<T, A: Allocator, const N: usize> TryFrom<Rc<[T], A>> for Rc<[T; N], A> {
2876    type Error = Rc<[T], A>;
2877
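    /// Attempts to convert a reference-counted slice into a reference-counted
    /// array, returning the original `Rc<[T], A>` unchanged if its length is
    /// not exactly `N`.
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let slice: Rc<[i32]> = Rc::from([1, 2, 3]);
    /// let array: Rc<[i32; 3]> = slice.try_into().unwrap();
    /// assert_eq!(*array, [1, 2, 3]);
    ///
    /// // A length mismatch hands the slice back as the error.
    /// let slice: Rc<[i32]> = Rc::from([1, 2, 3]);
    /// assert!(Rc::<[i32; 4]>::try_from(slice).is_err());
    /// ```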
2878    fn try_from(boxed_slice: Rc<[T], A>) -> Result<Self, Self::Error> {
2879        if boxed_slice.len() == N {
2880            let (ptr, alloc) = Rc::into_inner_with_allocator(boxed_slice);
2881            Ok(unsafe { Rc::from_inner_in(ptr.cast(), alloc) })
2882        } else {
2883            Err(boxed_slice)
2884        }
2885    }
2886}
2887
2888#[cfg(not(no_global_oom_handling))]
2889#[stable(feature = "shared_from_iter", since = "1.37.0")]
2890impl<T> FromIterator<T> for Rc<[T]> {
2891    /// Takes each element in the `Iterator` and collects it into an `Rc<[T]>`.
2892    ///
2893    /// # Performance characteristics
2894    ///
2895    /// ## The general case
2896    ///
2897    /// In the general case, collecting into `Rc<[T]>` is done by first
2898    /// collecting into a `Vec<T>`. That is, when writing the following:
2899    ///
2900    /// ```rust
2901    /// # use std::rc::Rc;
2902    /// let evens: Rc<[u8]> = (0..10).filter(|&x| x % 2 == 0).collect();
2903    /// # assert_eq!(&*evens, &[0, 2, 4, 6, 8]);
2904    /// ```
2905    ///
2906    /// this behaves as if we wrote:
2907    ///
2908    /// ```rust
2909    /// # use std::rc::Rc;
2910    /// let evens: Rc<[u8]> = (0..10).filter(|&x| x % 2 == 0)
2911    ///     .collect::<Vec<_>>() // The first set of allocations happens here.
2912    ///     .into(); // A second allocation for `Rc<[T]>` happens here.
2913    /// # assert_eq!(&*evens, &[0, 2, 4, 6, 8]);
2914    /// ```
2915    ///
2916    /// This will allocate as many times as needed for constructing the `Vec<T>`
2917    /// and then it will allocate once for turning the `Vec<T>` into the `Rc<[T]>`.
2918    ///
2919    /// ## Iterators of known length
2920    ///
2921    /// When your `Iterator` implements `TrustedLen` and is of an exact size,
2922    /// a single allocation will be made for the `Rc<[T]>`. For example:
2923    ///
2924    /// ```rust
2925    /// # use std::rc::Rc;
2926    /// let evens: Rc<[u8]> = (0..10).collect(); // Just a single allocation happens here.
2927    /// # assert_eq!(&*evens, &*(0..10).collect::<Vec<_>>());
2928    /// ```
2929    fn from_iter<I: IntoIterator<Item = T>>(iter: I) -> Self {
2930        ToRcSlice::to_rc_slice(iter.into_iter())
2931    }
2932}
2933
2934/// Specialization trait used for collecting into `Rc<[T]>`.
2935#[cfg(not(no_global_oom_handling))]
2936trait ToRcSlice<T>: Iterator<Item = T> + Sized {
2937    fn to_rc_slice(self) -> Rc<[T]>;
2938}
2939
2940#[cfg(not(no_global_oom_handling))]
2941impl<T, I: Iterator<Item = T>> ToRcSlice<T> for I {
2942    default fn to_rc_slice(self) -> Rc<[T]> {
2943        self.collect::<Vec<T>>().into()
2944    }
2945}
2946
2947#[cfg(not(no_global_oom_handling))]
2948impl<T, I: iter::TrustedLen<Item = T>> ToRcSlice<T> for I {
2949    fn to_rc_slice(self) -> Rc<[T]> {
2950        // This is the case for a `TrustedLen` iterator.
2951        let (low, high) = self.size_hint();
2952        if let Some(high) = high {
2953            debug_assert_eq!(
2954                low,
2955                high,
2956                "TrustedLen iterator's size hint is not exact: {:?}",
2957                (low, high)
2958            );
2959
2960            unsafe {
2961                // SAFETY: `TrustedLen` guarantees the size hint is exact, so the iterator yields exactly `low` elements.
2962                Rc::from_iter_exact(self, low)
2963            }
2964        } else {
2965            // TrustedLen contract guarantees that `upper_bound == None` implies an iterator
2966            // length exceeding `usize::MAX`.
2967            // The default implementation would collect into a vec which would panic.
2968            // Thus we panic here immediately without invoking `Vec` code.
2969            panic!("capacity overflow");
2970        }
2971    }
2972}
2973
2974/// `Weak` is a version of [`Rc`] that holds a non-owning reference to the
2975/// managed allocation.
2976///
2977/// The allocation is accessed by calling [`upgrade`] on the `Weak`
2978/// pointer, which returns an <code>[Option]<[Rc]\<T>></code>.
2979///
2980/// Since a `Weak` reference does not count towards ownership, it will not
2981/// prevent the value stored in the allocation from being dropped, and `Weak` itself makes no
2982/// guarantees about the value still being present. Thus it may return [`None`]
2983/// when [`upgrade`]d. Note however that a `Weak` reference *does* prevent the allocation
2984/// itself (the backing store) from being deallocated.
2985///
2986/// A `Weak` pointer is useful for keeping a temporary reference to the allocation
2987/// managed by [`Rc`] without preventing its inner value from being dropped. It is also used to
2988/// prevent circular references between [`Rc`] pointers, since mutual owning references
2989/// would never allow either [`Rc`] to be dropped. For example, a tree could
2990/// have strong [`Rc`] pointers from parent nodes to children, and `Weak`
2991/// pointers from children back to their parents.
2992///
2993/// The typical way to obtain a `Weak` pointer is to call [`Rc::downgrade`].
2994///
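/// # Examples
///
/// A short round-trip through `downgrade` and `upgrade`:
///
/// ```
/// use std::rc::Rc;
///
/// let strong = Rc::new(5);
/// let weak = Rc::downgrade(&strong);
///
/// // While a strong pointer exists, upgrading succeeds...
/// assert_eq!(weak.upgrade().as_deref(), Some(&5));
///
/// // ...but once the value is dropped, `upgrade` returns `None`.
/// drop(strong);
/// assert!(weak.upgrade().is_none());
/// ```
///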
2995/// [`upgrade`]: Weak::upgrade
2996#[stable(feature = "rc_weak", since = "1.4.0")]
2997#[rustc_diagnostic_item = "RcWeak"]
2998pub struct Weak<
2999    T: ?Sized,
3000    #[unstable(feature = "allocator_api", issue = "32838")] A: Allocator = Global,
3001> {
3002    // This is a `NonNull` to allow optimizing the size of this type in enums,
3003    // but it is not necessarily a valid pointer.
3004    // `Weak::new` sets this to `usize::MAX` so that it doesn’t need
3005    // to allocate space on the heap. That's not a value a real pointer
3006    // will ever have because RcInner has alignment at least 2.
3007    // This is only possible when `T: Sized`; unsized `T` never dangle.
3008    ptr: NonNull<RcInner<T>>,
3009    alloc: A,
3010}
3011
3012#[stable(feature = "rc_weak", since = "1.4.0")]
3013impl<T: ?Sized, A: Allocator> !Send for Weak<T, A> {}
3014#[stable(feature = "rc_weak", since = "1.4.0")]
3015impl<T: ?Sized, A: Allocator> !Sync for Weak<T, A> {}
3016
3017#[unstable(feature = "coerce_unsized", issue = "18598")]
3018impl<T: ?Sized + Unsize<U>, U: ?Sized, A: Allocator> CoerceUnsized<Weak<U, A>> for Weak<T, A> {}
3019
3020#[unstable(feature = "dispatch_from_dyn", issue = "none")]
3021impl<T: ?Sized + Unsize<U>, U: ?Sized> DispatchFromDyn<Weak<U>> for Weak<T> {}
3022
3023impl<T> Weak<T> {
3024    /// Constructs a new `Weak<T>`, without allocating any memory.
3025    /// Calling [`upgrade`] on the return value always gives [`None`].
3026    ///
3027    /// [`upgrade`]: Weak::upgrade
3028    ///
3029    /// # Examples
3030    ///
3031    /// ```
3032    /// use std::rc::Weak;
3033    ///
3034    /// let empty: Weak<i64> = Weak::new();
3035    /// assert!(empty.upgrade().is_none());
3036    /// ```
3037    #[inline]
3038    #[stable(feature = "downgraded_weak", since = "1.10.0")]
3039    #[rustc_const_stable(feature = "const_weak_new", since = "1.73.0")]
3040    #[must_use]
3041    pub const fn new() -> Weak<T> {
3042        Weak { ptr: NonNull::without_provenance(NonZeroUsize::MAX), alloc: Global }
3043    }
3044}
3045
3046impl<T, A: Allocator> Weak<T, A> {
3047    /// Constructs a new `Weak<T>` associated with the provided allocator, without allocating
3048    /// any memory.
3049    /// Calling [`upgrade`] on the return value always gives [`None`].
3050    ///
3051    /// [`upgrade`]: Weak::upgrade
3052    ///
3053    /// # Examples
3054    ///
3055    /// ```
3056    /// #![feature(allocator_api)]
3057    /// use std::{alloc::System, rc::Weak};
3058    /// let empty: Weak<i64, _> = Weak::new_in(System);
3059    /// assert!(empty.upgrade().is_none());
3060    /// ```
3061    #[inline]
3062    #[unstable(feature = "allocator_api", issue = "32838")]
3063    pub fn new_in(alloc: A) -> Weak<T, A> {
3064        Weak { ptr: NonNull::without_provenance(NonZeroUsize::MAX), alloc }
3065    }
3066}
3067
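// A `Weak` created by `Weak::new` stores the sentinel address `usize::MAX` rather than
// a real allocation (see the comment on the `ptr` field of `Weak`), so comparing the
// address against `usize::MAX` identifies a dangling weak pointer that must never be
// dereferenced.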
3068pub(crate) fn is_dangling<T: ?Sized>(ptr: *const T) -> bool {
3069    (ptr.cast::<()>()).addr() == usize::MAX
3070}
3071
3072/// Helper type to allow accessing the reference counts without
3073/// making any assertions about the data field.
3074struct WeakInner<'a> {
3075    weak: &'a Cell<usize>,
3076    strong: &'a Cell<usize>,
3077}
3078
3079impl<T: ?Sized> Weak<T> {
3080    /// Converts a raw pointer previously created by [`into_raw`] back into `Weak<T>`.
3081    ///
3082    /// This can be used to safely get a strong reference (by calling [`upgrade`]
3083    /// later) or to release the weak reference by dropping the `Weak<T>`.
3084    ///
3085    /// It takes ownership of one weak reference (with the exception of pointers created by [`new`],
3086    /// as these don't own anything; the method still works on them).
3087    ///
3088    /// # Safety
3089    ///
3090    /// The pointer must have originated from a call to [`into_raw`] and must still own its potential
3091    /// weak reference, and `ptr` must point to a block of memory allocated by the global allocator.
3092    ///
3093    /// It is allowed for the strong count to be 0 at the time of calling this. Nevertheless, this
3094    /// takes ownership of one weak reference currently represented as a raw pointer (the weak
3095    /// count is not modified by this operation) and therefore it must be paired with a previous
3096    /// call to [`into_raw`].
3097    ///
3098    /// # Examples
3099    ///
3100    /// ```
3101    /// use std::rc::{Rc, Weak};
3102    ///
3103    /// let strong = Rc::new("hello".to_owned());
3104    ///
3105    /// let raw_1 = Rc::downgrade(&strong).into_raw();
3106    /// let raw_2 = Rc::downgrade(&strong).into_raw();
3107    ///
3108    /// assert_eq!(2, Rc::weak_count(&strong));
3109    ///
3110    /// assert_eq!("hello", &*unsafe { Weak::from_raw(raw_1) }.upgrade().unwrap());
3111    /// assert_eq!(1, Rc::weak_count(&strong));
3112    ///
3113    /// drop(strong);
3114    ///
3115    /// // Decrement the last weak count.
3116    /// assert!(unsafe { Weak::from_raw(raw_2) }.upgrade().is_none());
3117    /// ```
3118    ///
3119    /// [`into_raw`]: Weak::into_raw
3120    /// [`upgrade`]: Weak::upgrade
3121    /// [`new`]: Weak::new
3122    #[inline]
3123    #[stable(feature = "weak_into_raw", since = "1.45.0")]
3124    pub unsafe fn from_raw(ptr: *const T) -> Self {
3125        unsafe { Self::from_raw_in(ptr, Global) }
3126    }
3127}
3128
3129impl<T: ?Sized, A: Allocator> Weak<T, A> {
3130    /// Returns a reference to the underlying allocator.
3131    #[inline]
3132    #[unstable(feature = "allocator_api", issue = "32838")]
3133    pub fn allocator(&self) -> &A {
3134        &self.alloc
3135    }
3136
3137    /// Returns a raw pointer to the object `T` pointed to by this `Weak<T>`.
3138    ///
3139    /// The pointer is valid only if there are some strong references. The pointer may be dangling,
3140    /// unaligned or even [`null`] otherwise.
3141    ///
3142    /// # Examples
3143    ///
3144    /// ```
3145    /// use std::rc::Rc;
3146    /// use std::ptr;
3147    ///
3148    /// let strong = Rc::new("hello".to_owned());
3149    /// let weak = Rc::downgrade(&strong);
3150    /// // Both point to the same object
3151    /// assert!(ptr::eq(&*strong, weak.as_ptr()));
3152    /// // The strong here keeps it alive, so we can still access the object.
3153    /// assert_eq!("hello", unsafe { &*weak.as_ptr() });
3154    ///
3155    /// drop(strong);
3156    /// // But not any more. We can do weak.as_ptr(), but accessing the pointer would lead to
3157    /// // undefined behavior.
3158    /// // assert_eq!("hello", unsafe { &*weak.as_ptr() });
3159    /// ```
3160    ///
3161    /// [`null`]: ptr::null
3162    #[must_use]
3163    #[stable(feature = "rc_as_ptr", since = "1.45.0")]
3164    pub fn as_ptr(&self) -> *const T {
3165        let ptr: *mut RcInner<T> = NonNull::as_ptr(self.ptr);
3166
3167        if is_dangling(ptr) {
3168            // If the pointer is dangling, we return the sentinel directly. This cannot be
3169            // a valid payload address, as the payload is at least as aligned as RcInner (usize).
3170            ptr as *const T
3171        } else {
3172            // SAFETY: if is_dangling returns false, then the pointer is dereferenceable.
3173            // The payload may be dropped at this point, and we have to maintain provenance,
3174            // so use raw pointer manipulation.
3175            unsafe { &raw mut (*ptr).value }
3176        }
3177    }
3178
3179    /// Consumes the `Weak<T>` and turns it into a raw pointer.
3180    ///
3181    /// This converts the weak pointer into a raw pointer, while still preserving the ownership of
3182    /// one weak reference (the weak count is not modified by this operation). It can be turned
3183    /// back into the `Weak<T>` with [`from_raw`].
3184    ///
3185    /// The same restrictions of accessing the target of the pointer as with
3186    /// [`as_ptr`] apply.
3187    ///
3188    /// # Examples
3189    ///
3190    /// ```
3191    /// use std::rc::{Rc, Weak};
3192    ///
3193    /// let strong = Rc::new("hello".to_owned());
3194    /// let weak = Rc::downgrade(&strong);
3195    /// let raw = weak.into_raw();
3196    ///
3197    /// assert_eq!(1, Rc::weak_count(&strong));
3198    /// assert_eq!("hello", unsafe { &*raw });
3199    ///
3200    /// drop(unsafe { Weak::from_raw(raw) });
3201    /// assert_eq!(0, Rc::weak_count(&strong));
3202    /// ```
3203    ///
3204    /// [`from_raw`]: Weak::from_raw
3205    /// [`as_ptr`]: Weak::as_ptr
3206    #[must_use = "losing the pointer will leak memory"]
3207    #[stable(feature = "weak_into_raw", since = "1.45.0")]
3208    pub fn into_raw(self) -> *const T {
3209        mem::ManuallyDrop::new(self).as_ptr()
3210    }
3211
3212    /// Consumes the `Weak<T>`, returning the wrapped pointer and allocator.
3213    ///
3214    /// This converts the weak pointer into a raw pointer, while still preserving the ownership of
3215    /// one weak reference (the weak count is not modified by this operation). It can be turned
3216    /// back into the `Weak<T>` with [`from_raw_in`].
3217    ///
3218    /// The same restrictions of accessing the target of the pointer as with
3219    /// [`as_ptr`] apply.
3220    ///
3221    /// # Examples
3222    ///
3223    /// ```
3224    /// #![feature(allocator_api)]
3225    /// use std::rc::{Rc, Weak};
3226    /// use std::alloc::System;
3227    ///
3228    /// let strong = Rc::new_in("hello".to_owned(), System);
3229    /// let weak = Rc::downgrade(&strong);
3230    /// let (raw, alloc) = weak.into_raw_with_allocator();
3231    ///
3232    /// assert_eq!(1, Rc::weak_count(&strong));
3233    /// assert_eq!("hello", unsafe { &*raw });
3234    ///
3235    /// drop(unsafe { Weak::from_raw_in(raw, alloc) });
3236    /// assert_eq!(0, Rc::weak_count(&strong));
3237    /// ```
3238    ///
3239    /// [`from_raw_in`]: Weak::from_raw_in
3240    /// [`as_ptr`]: Weak::as_ptr
3241    #[must_use = "losing the pointer will leak memory"]
3242    #[inline]
3243    #[unstable(feature = "allocator_api", issue = "32838")]
3244    pub fn into_raw_with_allocator(self) -> (*const T, A) {
3245        let this = mem::ManuallyDrop::new(self);
3246        let result = this.as_ptr();
3247        // Safety: `this` is ManuallyDrop so the allocator will not be double-dropped
3248        let alloc = unsafe { ptr::read(&this.alloc) };
3249        (result, alloc)
3250    }
3251
3252    /// Converts a raw pointer previously created by [`into_raw`] back into `Weak<T>`.
3253    ///
3254    /// This can be used to safely get a strong reference (by calling [`upgrade`]
3255    /// later) or to release the weak reference by dropping the `Weak<T>`.
3256    ///
3257    /// It takes ownership of one weak reference (with the exception of pointers created by [`new`],
3258    /// as these don't own anything; the method still works on them).
3259    ///
3260    /// # Safety
3261    ///
3262    /// The pointer must have originated from a call to [`into_raw`] and must still own its potential
3263    /// weak reference, and `ptr` must point to a block of memory allocated by `alloc`.
3264    ///
3265    /// It is allowed for the strong count to be 0 at the time of calling this. Nevertheless, this
3266    /// takes ownership of one weak reference currently represented as a raw pointer (the weak
3267    /// count is not modified by this operation) and therefore it must be paired with a previous
3268    /// call to [`into_raw`].
3269    ///
3270    /// # Examples
3271    ///
3272    /// ```
3273    /// use std::rc::{Rc, Weak};
3274    ///
3275    /// let strong = Rc::new("hello".to_owned());
3276    ///
3277    /// let raw_1 = Rc::downgrade(&strong).into_raw();
3278    /// let raw_2 = Rc::downgrade(&strong).into_raw();
3279    ///
3280    /// assert_eq!(2, Rc::weak_count(&strong));
3281    ///
3282    /// assert_eq!("hello", &*unsafe { Weak::from_raw(raw_1) }.upgrade().unwrap());
3283    /// assert_eq!(1, Rc::weak_count(&strong));
3284    ///
3285    /// drop(strong);
3286    ///
3287    /// // Decrement the last weak count.
3288    /// assert!(unsafe { Weak::from_raw(raw_2) }.upgrade().is_none());
3289    /// ```
3290    ///
3291    /// [`into_raw`]: Weak::into_raw
3292    /// [`upgrade`]: Weak::upgrade
3293    /// [`new`]: Weak::new
3294    #[inline]
3295    #[unstable(feature = "allocator_api", issue = "32838")]
3296    pub unsafe fn from_raw_in(ptr: *const T, alloc: A) -> Self {
3297        // See Weak::as_ptr for context on how the input pointer is derived.
3298
3299        let ptr = if is_dangling(ptr) {
3300            // This is a dangling Weak.
3301            ptr as *mut RcInner<T>
3302        } else {
3303            // Otherwise, we're guaranteed the pointer came from a nondangling Weak.
3304            // SAFETY: data_offset is safe to call, as ptr references a real (potentially dropped) T.
3305            let offset = unsafe { data_offset(ptr) };
3306            // Thus, we reverse the offset to get the whole RcInner.
3307            // SAFETY: the pointer originated from a Weak, so this offset is safe.
3308            unsafe { ptr.byte_sub(offset) as *mut RcInner<T> }
3309        };
3310
3311        // SAFETY: we now have recovered the original Weak pointer, so can create the Weak.
3312        Weak { ptr: unsafe { NonNull::new_unchecked(ptr) }, alloc }
3313    }
3314
3315    /// Attempts to upgrade the `Weak` pointer to an [`Rc`], delaying
3316    /// dropping of the inner value if successful.
3317    ///
3318    /// Returns [`None`] if the inner value has since been dropped.
3319    ///
3320    /// # Examples
3321    ///
3322    /// ```
3323    /// use std::rc::Rc;
3324    ///
3325    /// let five = Rc::new(5);
3326    ///
3327    /// let weak_five = Rc::downgrade(&five);
3328    ///
3329    /// let strong_five: Option<Rc<_>> = weak_five.upgrade();
3330    /// assert!(strong_five.is_some());
3331    ///
3332    /// // Destroy all strong pointers.
3333    /// drop(strong_five);
3334    /// drop(five);
3335    ///
3336    /// assert!(weak_five.upgrade().is_none());
3337    /// ```
3338    #[must_use = "this returns a new `Rc`, \
3339                  without modifying the original weak pointer"]
3340    #[stable(feature = "rc_weak", since = "1.4.0")]
3341    pub fn upgrade(&self) -> Option<Rc<T, A>>
3342    where
3343        A: Clone,
3344    {
3345        let inner = self.inner()?;
3346
3347        if inner.strong() == 0 {
3348            None
3349        } else {
3350            unsafe {
3351                inner.inc_strong();
3352                Some(Rc::from_inner_in(self.ptr, self.alloc.clone()))
3353            }
3354        }
3355    }
3356
3357    /// Gets the number of strong (`Rc`) pointers pointing to this allocation.
3358    ///
3359    /// If `self` was created using [`Weak::new`], this will return 0.
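    ///
    /// # Examples
    ///
    /// A short illustration of the count dropping to zero once the last `Rc` goes away:
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let five = Rc::new(5);
    /// let weak_five = Rc::downgrade(&five);
    /// assert_eq!(1, weak_five.strong_count());
    ///
    /// drop(five);
    /// assert_eq!(0, weak_five.strong_count());
    /// ```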
3360    #[must_use]
3361    #[stable(feature = "weak_counts", since = "1.41.0")]
3362    pub fn strong_count(&self) -> usize {
3363        if let Some(inner) = self.inner() { inner.strong() } else { 0 }
3364    }
3365
3366    /// Gets the number of `Weak` pointers pointing to this allocation.
3367    ///
3368    /// If no strong pointers remain, this will return zero.
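    ///
    /// # Examples
    ///
    /// A short illustration; the count reflects only `Weak` pointers (not the implicit
    /// weak reference held by the strong pointers), and reports zero once all strong
    /// pointers are gone:
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let five = Rc::new(5);
    /// let weak_five = Rc::downgrade(&five);
    /// assert_eq!(1, weak_five.weak_count());
    ///
    /// drop(five);
    /// assert_eq!(0, weak_five.weak_count());
    /// ```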
3369    #[must_use]
3370    #[stable(feature = "weak_counts", since = "1.41.0")]
3371    pub fn weak_count(&self) -> usize {
3372        if let Some(inner) = self.inner() {
3373            if inner.strong() > 0 {
3374                inner.weak() - 1 // subtract the implicit weak ptr
3375            } else {
3376                0
3377            }
3378        } else {
3379            0
3380        }
3381    }
3382
3383    /// Returns `None` when the pointer is dangling and there is no allocated `RcInner`,
3384    /// (i.e., when this `Weak` was created by `Weak::new`).
3385    #[inline]
3386    fn inner(&self) -> Option<WeakInner<'_>> {
3387        if is_dangling(self.ptr.as_ptr()) {
3388            None
3389        } else {
3390            // We are careful to *not* create a reference covering the "data" field, as
3391            // the field may be mutated concurrently (for example, if the last `Rc`
3392            // is dropped, the data field will be dropped in-place).
3393            Some(unsafe {
3394                let ptr = self.ptr.as_ptr();
3395                WeakInner { strong: &(*ptr).strong, weak: &(*ptr).weak }
3396            })
3397        }
3398    }
3399
3400    /// Returns `true` if the two `Weak`s point to the same allocation, similar to [`ptr::eq`], or if
3401    /// both don't point to any allocation (because they were created with `Weak::new()`). However,
3402    /// this function ignores the metadata of `dyn Trait` pointers.
3403    ///
3404    /// # Notes
3405    ///
3406    /// Since this compares pointers, any two `Weak`s created by `Weak::new()` will compare
3407    /// equal to each other, even though they don't point to any allocation.
3408    ///
3409    /// # Examples
3410    ///
3411    /// ```
3412    /// use std::rc::Rc;
3413    ///
3414    /// let first_rc = Rc::new(5);
3415    /// let first = Rc::downgrade(&first_rc);
3416    /// let second = Rc::downgrade(&first_rc);
3417    ///
3418    /// assert!(first.ptr_eq(&second));
3419    ///
3420    /// let third_rc = Rc::new(5);
3421    /// let third = Rc::downgrade(&third_rc);
3422    ///
3423    /// assert!(!first.ptr_eq(&third));
3424    /// ```
3425    ///
3426    /// Comparing `Weak::new`.
3427    ///
3428    /// ```
3429    /// use std::rc::{Rc, Weak};
3430    ///
3431    /// let first = Weak::new();
3432    /// let second = Weak::new();
3433    /// assert!(first.ptr_eq(&second));
3434    ///
3435    /// let third_rc = Rc::new(());
3436    /// let third = Rc::downgrade(&third_rc);
3437    /// assert!(!first.ptr_eq(&third));
3438    /// ```
3439    #[inline]
3440    #[must_use]
3441    #[stable(feature = "weak_ptr_eq", since = "1.39.0")]
3442    pub fn ptr_eq(&self, other: &Self) -> bool {
3443        ptr::addr_eq(self.ptr.as_ptr(), other.ptr.as_ptr())
3444    }
3445}
3446
3447#[stable(feature = "rc_weak", since = "1.4.0")]
3448unsafe impl<#[may_dangle] T: ?Sized, A: Allocator> Drop for Weak<T, A> {
3449    /// Drops the `Weak` pointer.
3450    ///
3451    /// # Examples
3452    ///
3453    /// ```
3454    /// use std::rc::{Rc, Weak};
3455    ///
3456    /// struct Foo;
3457    ///
3458    /// impl Drop for Foo {
3459    ///     fn drop(&mut self) {
3460    ///         println!("dropped!");
3461    ///     }
3462    /// }
3463    ///
3464    /// let foo = Rc::new(Foo);
3465    /// let weak_foo = Rc::downgrade(&foo);
3466    /// let other_weak_foo = Weak::clone(&weak_foo);
3467    ///
3468    /// drop(weak_foo);   // Doesn't print anything
3469    /// drop(foo);        // Prints "dropped!"
3470    ///
3471    /// assert!(other_weak_foo.upgrade().is_none());
3472    /// ```
3473    fn drop(&mut self) {
3474        let inner = if let Some(inner) = self.inner() { inner } else { return };
3475
3476        inner.dec_weak();
3477        // the weak count starts at 1, and will only go to zero if all
3478        // the strong pointers have disappeared.
3479        if inner.weak() == 0 {
3480            unsafe {
3481                self.alloc.deallocate(self.ptr.cast(), Layout::for_value_raw(self.ptr.as_ptr()));
3482            }
3483        }
3484    }
3485}
3486
3487#[stable(feature = "rc_weak", since = "1.4.0")]
3488impl<T: ?Sized, A: Allocator + Clone> Clone for Weak<T, A> {
3489    /// Makes a clone of the `Weak` pointer that points to the same allocation.
3490    ///
3491    /// # Examples
3492    ///
3493    /// ```
3494    /// use std::rc::{Rc, Weak};
3495    ///
3496    /// let weak_five = Rc::downgrade(&Rc::new(5));
3497    ///
3498    /// let _ = Weak::clone(&weak_five);
3499    /// ```
3500    #[inline]
3501    fn clone(&self) -> Weak<T, A> {
3502        if let Some(inner) = self.inner() {
3503            inner.inc_weak()
3504        }
3505        Weak { ptr: self.ptr, alloc: self.alloc.clone() }
3506    }
3507}
3508
3509#[unstable(feature = "ergonomic_clones", issue = "132290")]
3510impl<T: ?Sized, A: Allocator + Clone> UseCloned for Weak<T, A> {}
3511
3512#[stable(feature = "rc_weak", since = "1.4.0")]
3513impl<T: ?Sized, A: Allocator> fmt::Debug for Weak<T, A> {
3514    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
3515        write!(f, "(Weak)")
3516    }
3517}
3518
3519#[stable(feature = "downgraded_weak", since = "1.10.0")]
3520impl<T> Default for Weak<T> {
3521    /// Constructs a new `Weak<T>`, without allocating any memory.
3522    /// Calling [`upgrade`] on the return value always gives [`None`].
3523    ///
3524    /// [`upgrade`]: Weak::upgrade
3525    ///
3526    /// # Examples
3527    ///
3528    /// ```
3529    /// use std::rc::Weak;
3530    ///
3531    /// let empty: Weak<i64> = Default::default();
3532    /// assert!(empty.upgrade().is_none());
3533    /// ```
3534    fn default() -> Weak<T> {
3535        Weak::new()
3536    }
3537}
3538
3539// NOTE: We check the ref-counts for overflow below to deal with mem::forget safely.
3540// In particular, if you mem::forget Rcs (or Weaks), the ref-count can overflow, and
3541// then you can free the allocation while outstanding Rcs (or Weaks) exist.
3542// We abort because this is such a degenerate scenario that we don't care about
3543// what happens -- no real program should ever experience this.
3544//
3545// This should have negligible overhead since you don't actually need to
3546// clone these much in Rust thanks to ownership and move-semantics.
3547
3548#[doc(hidden)]
3549trait RcInnerPtr {
3550    fn weak_ref(&self) -> &Cell<usize>;
3551    fn strong_ref(&self) -> &Cell<usize>;
3552
3553    #[inline]
3554    fn strong(&self) -> usize {
3555        self.strong_ref().get()
3556    }
3557
3558    #[inline]
3559    fn inc_strong(&self) {
3560        let strong = self.strong();
3561
3562        // We insert an `assume` here to hint LLVM at an otherwise
3563        // missed optimization.
3564        // SAFETY: The reference count will never be zero when this is
3565        // called.
3566        unsafe {
3567            hint::assert_unchecked(strong != 0);
3568        }
3569
3570        let strong = strong.wrapping_add(1);
3571        self.strong_ref().set(strong);
3572
3573        // We want to abort on overflow instead of dropping the value.
3574        // Checking for overflow after the store instead of before
3575        // allows for slightly better code generation.
3576        if core::intrinsics::unlikely(strong == 0) {
3577            abort();
3578        }
3579    }
3580
3581    #[inline]
3582    fn dec_strong(&self) {
3583        self.strong_ref().set(self.strong() - 1);
3584    }
3585
3586    #[inline]
3587    fn weak(&self) -> usize {
3588        self.weak_ref().get()
3589    }
3590
3591    #[inline]
3592    fn inc_weak(&self) {
3593        let weak = self.weak();
3594
3595        // We insert an `assume` here to hint LLVM at an otherwise
3596        // missed optimization.
3597        // SAFETY: The reference count will never be zero when this is
3598        // called.
3599        unsafe {
3600            hint::assert_unchecked(weak != 0);
3601        }
3602
3603        let weak = weak.wrapping_add(1);
3604        self.weak_ref().set(weak);
3605
3606        // We want to abort on overflow instead of dropping the value.
3607        // Checking for overflow after the store instead of before
3608        // allows for slightly better code generation.
3609        if core::intrinsics::unlikely(weak == 0) {
3610            abort();
3611        }
3612    }
3613
3614    #[inline]
3615    fn dec_weak(&self) {
3616        self.weak_ref().set(self.weak() - 1);
3617    }
3618}
3619
3620impl<T: ?Sized> RcInnerPtr for RcInner<T> {
3621    #[inline(always)]
3622    fn weak_ref(&self) -> &Cell<usize> {
3623        &self.weak
3624    }
3625
3626    #[inline(always)]
3627    fn strong_ref(&self) -> &Cell<usize> {
3628        &self.strong
3629    }
3630}
3631
3632impl<'a> RcInnerPtr for WeakInner<'a> {
3633    #[inline(always)]
3634    fn weak_ref(&self) -> &Cell<usize> {
3635        self.weak
3636    }
3637
3638    #[inline(always)]
3639    fn strong_ref(&self) -> &Cell<usize> {
3640        self.strong
3641    }
3642}
3643
3644#[stable(feature = "rust1", since = "1.0.0")]
3645impl<T: ?Sized, A: Allocator> borrow::Borrow<T> for Rc<T, A> {
3646    fn borrow(&self) -> &T {
3647        &**self
3648    }
3649}
3650
3651#[stable(since = "1.5.0", feature = "smart_ptr_as_ref")]
3652impl<T: ?Sized, A: Allocator> AsRef<T> for Rc<T, A> {
3653    fn as_ref(&self) -> &T {
3654        &**self
3655    }
3656}
3657
3658#[stable(feature = "pin", since = "1.33.0")]
3659impl<T: ?Sized, A: Allocator> Unpin for Rc<T, A> {}
3660
3661/// Gets the offset within an `RcInner` for the payload behind a pointer.
3662///
3663/// # Safety
3664///
3665/// The pointer must point to (and have valid metadata for) a previously
3666/// valid instance of T, but the T is allowed to be dropped.
3667unsafe fn data_offset<T: ?Sized>(ptr: *const T) -> usize {
3668    // Align the unsized value to the end of the RcInner.
3669    // Because RcInner is repr(C), it will always be the last field in memory.
3670    // SAFETY: since the only unsized types possible are slices, trait objects,
3671    // and extern types, the input safety requirement is currently enough to
3672    // satisfy the requirements of align_of_val_raw; this is an implementation
3673    // detail of the language that must not be relied upon outside of std.
3674    unsafe { data_offset_align(align_of_val_raw(ptr)) }
3675}
3676
3677#[inline]
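// For illustration (assuming a typical 64-bit target): `RcInner<()>` is just the two
// `usize` reference counts (16 bytes, align 8), so a payload with alignment <= 8 starts
// at offset 16, while a payload with alignment 32 would start at offset 32.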
3678fn data_offset_align(align: usize) -> usize {
3679    let layout = Layout::new::<RcInner<()>>();
3680    layout.size() + layout.padding_needed_for(align)
3681}
3682
3683/// A uniquely owned [`Rc`].
3684///
3685/// This represents an `Rc` that is known to be uniquely owned -- that is, have exactly one strong
3686/// reference. Multiple weak pointers can be created, but attempts to upgrade those to strong
3687/// references will fail unless the `UniqueRc` they point to has been converted into a regular `Rc`.
3688///
3689/// Because they are uniquely owned, the contents of a `UniqueRc` can be freely mutated. A common
3690/// use case is to have an object be mutable during its initialization phase but then have it become
3691/// immutable and be converted into a normal `Rc`.
3692///
3693/// This can be used as a flexible way to create cyclic data structures, as in the example below.
3694///
3695/// ```
3696/// #![feature(unique_rc_arc)]
3697/// use std::rc::{Rc, Weak, UniqueRc};
3698///
3699/// struct Gadget {
3700///     #[allow(dead_code)]
3701///     me: Weak<Gadget>,
3702/// }
3703///
3704/// fn create_gadget() -> Option<Rc<Gadget>> {
3705///     let mut rc = UniqueRc::new(Gadget {
3706///         me: Weak::new(),
3707///     });
3708///     rc.me = UniqueRc::downgrade(&rc);
3709///     Some(UniqueRc::into_rc(rc))
3710/// }
3711///
3712/// create_gadget().unwrap();
3713/// ```
3714///
3715/// An advantage of using `UniqueRc` over [`Rc::new_cyclic`] to build cyclic data structures is that
3716/// [`Rc::new_cyclic`]'s `data_fn` parameter cannot be async or return a [`Result`]. As shown in the
3717/// previous example, `UniqueRc` allows for more flexibility in the construction of cyclic data,
3718/// including fallible or async constructors.
3719#[unstable(feature = "unique_rc_arc", issue = "112566")]
3720pub struct UniqueRc<
3721    T: ?Sized,
3722    #[unstable(feature = "allocator_api", issue = "32838")] A: Allocator = Global,
3723> {
3724    ptr: NonNull<RcInner<T>>,
3725    // Define the ownership of `RcInner<T>` for drop-check
3726    _marker: PhantomData<RcInner<T>>,
3727    // Invariance is necessary for soundness: once other `Weak`
3728    // references exist, we already have a form of shared mutability!
3729    _marker2: PhantomData<*mut T>,
3730    alloc: A,
3731}
3732
3733// Not necessary for correctness since `UniqueRc` contains `NonNull`,
3734// but having an explicit negative impl is nice for documentation purposes
3735// and results in nicer error messages.
3736#[unstable(feature = "unique_rc_arc", issue = "112566")]
3737impl<T: ?Sized, A: Allocator> !Send for UniqueRc<T, A> {}
3738
3739// Not necessary for correctness since `UniqueRc` contains `NonNull`,
3740// but having an explicit negative impl is nice for documentation purposes
3741// and results in nicer error messages.
3742#[unstable(feature = "unique_rc_arc", issue = "112566")]
3743impl<T: ?Sized, A: Allocator> !Sync for UniqueRc<T, A> {}
3744
3745#[unstable(feature = "unique_rc_arc", issue = "112566")]
3746impl<T: ?Sized + Unsize<U>, U: ?Sized, A: Allocator> CoerceUnsized<UniqueRc<U, A>>
3747    for UniqueRc<T, A>
3748{
3749}
3750
3751//#[unstable(feature = "unique_rc_arc", issue = "112566")]
3752#[unstable(feature = "dispatch_from_dyn", issue = "none")]
3753impl<T: ?Sized + Unsize<U>, U: ?Sized> DispatchFromDyn<UniqueRc<U>> for UniqueRc<T> {}
3754
3755#[unstable(feature = "unique_rc_arc", issue = "112566")]
3756impl<T: ?Sized + fmt::Display, A: Allocator> fmt::Display for UniqueRc<T, A> {
3757    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
3758        fmt::Display::fmt(&**self, f)
3759    }
3760}
3761
3762#[unstable(feature = "unique_rc_arc", issue = "112566")]
3763impl<T: ?Sized + fmt::Debug, A: Allocator> fmt::Debug for UniqueRc<T, A> {
3764    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
3765        fmt::Debug::fmt(&**self, f)
3766    }
3767}
3768
3769#[unstable(feature = "unique_rc_arc", issue = "112566")]
3770impl<T: ?Sized, A: Allocator> fmt::Pointer for UniqueRc<T, A> {
3771    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
3772        fmt::Pointer::fmt(&(&raw const **self), f)
3773    }
3774}
3775
3776#[unstable(feature = "unique_rc_arc", issue = "112566")]
3777impl<T: ?Sized, A: Allocator> borrow::Borrow<T> for UniqueRc<T, A> {
3778    fn borrow(&self) -> &T {
3779        &**self
3780    }
3781}
3782
3783#[unstable(feature = "unique_rc_arc", issue = "112566")]
3784impl<T: ?Sized, A: Allocator> borrow::BorrowMut<T> for UniqueRc<T, A> {
3785    fn borrow_mut(&mut self) -> &mut T {
3786        &mut **self
3787    }
3788}
3789
3790#[unstable(feature = "unique_rc_arc", issue = "112566")]
3791impl<T: ?Sized, A: Allocator> AsRef<T> for UniqueRc<T, A> {
3792    fn as_ref(&self) -> &T {
3793        &**self
3794    }
3795}
3796
3797#[unstable(feature = "unique_rc_arc", issue = "112566")]
3798impl<T: ?Sized, A: Allocator> AsMut<T> for UniqueRc<T, A> {
3799    fn as_mut(&mut self) -> &mut T {
3800        &mut **self
3801    }
3802}
3803
3804#[unstable(feature = "unique_rc_arc", issue = "112566")]
3805impl<T: ?Sized, A: Allocator> Unpin for UniqueRc<T, A> {}
3806
3807#[unstable(feature = "unique_rc_arc", issue = "112566")]
3808impl<T: ?Sized + PartialEq, A: Allocator> PartialEq for UniqueRc<T, A> {
3809    /// Equality for two `UniqueRc`s.
3810    ///
3811    /// Two `UniqueRc`s are equal if their inner values are equal.
3812    ///
3813    /// # Examples
3814    ///
3815    /// ```
3816    /// #![feature(unique_rc_arc)]
3817    /// use std::rc::UniqueRc;
3818    ///
3819    /// let five = UniqueRc::new(5);
3820    ///
3821    /// assert!(five == UniqueRc::new(5));
3822    /// ```
3823    #[inline]
3824    fn eq(&self, other: &Self) -> bool {
3825        PartialEq::eq(&**self, &**other)
3826    }
3827
3828    /// Inequality for two `UniqueRc`s.
3829    ///
3830    /// Two `UniqueRc`s are not equal if their inner values are not equal.
3831    ///
3832    /// # Examples
3833    ///
3834    /// ```
3835    /// #![feature(unique_rc_arc)]
3836    /// use std::rc::UniqueRc;
3837    ///
3838    /// let five = UniqueRc::new(5);
3839    ///
3840    /// assert!(five != UniqueRc::new(6));
3841    /// ```
3842    #[inline]
3843    fn ne(&self, other: &Self) -> bool {
3844        PartialEq::ne(&**self, &**other)
3845    }
3846}
3847
3848#[unstable(feature = "unique_rc_arc", issue = "112566")]
3849impl<T: ?Sized + PartialOrd, A: Allocator> PartialOrd for UniqueRc<T, A> {
3850    /// Partial comparison for two `UniqueRc`s.
3851    ///
3852    /// The two are compared by calling `partial_cmp()` on their inner values.
3853    ///
3854    /// # Examples
3855    ///
3856    /// ```
3857    /// #![feature(unique_rc_arc)]
3858    /// use std::rc::UniqueRc;
3859    /// use std::cmp::Ordering;
3860    ///
3861    /// let five = UniqueRc::new(5);
3862    ///
3863    /// assert_eq!(Some(Ordering::Less), five.partial_cmp(&UniqueRc::new(6)));
3864    /// ```
3865    #[inline(always)]
3866    fn partial_cmp(&self, other: &UniqueRc<T, A>) -> Option<Ordering> {
3867        (**self).partial_cmp(&**other)
3868    }
3869
3870    /// Less-than comparison for two `UniqueRc`s.
3871    ///
3872    /// The two are compared by calling `<` on their inner values.
3873    ///
3874    /// # Examples
3875    ///
3876    /// ```
3877    /// #![feature(unique_rc_arc)]
3878    /// use std::rc::UniqueRc;
3879    ///
3880    /// let five = UniqueRc::new(5);
3881    ///
3882    /// assert!(five < UniqueRc::new(6));
3883    /// ```
3884    #[inline(always)]
3885    fn lt(&self, other: &UniqueRc<T, A>) -> bool {
3886        **self < **other
3887    }
3888
3889    /// 'Less than or equal to' comparison for two `UniqueRc`s.
3890    ///
3891    /// The two are compared by calling `<=` on their inner values.
3892    ///
3893    /// # Examples
3894    ///
3895    /// ```
3896    /// #![feature(unique_rc_arc)]
3897    /// use std::rc::UniqueRc;
3898    ///
3899    /// let five = UniqueRc::new(5);
3900    ///
3901    /// assert!(five <= UniqueRc::new(5));
3902    /// ```
3903    #[inline(always)]
3904    fn le(&self, other: &UniqueRc<T, A>) -> bool {
3905        **self <= **other
3906    }
3907
3908    /// Greater-than comparison for two `UniqueRc`s.
3909    ///
3910    /// The two are compared by calling `>` on their inner values.
3911    ///
3912    /// # Examples
3913    ///
3914    /// ```
3915    /// #![feature(unique_rc_arc)]
3916    /// use std::rc::UniqueRc;
3917    ///
3918    /// let five = UniqueRc::new(5);
3919    ///
3920    /// assert!(five > UniqueRc::new(4));
3921    /// ```
3922    #[inline(always)]
3923    fn gt(&self, other: &UniqueRc<T, A>) -> bool {
3924        **self > **other
3925    }
3926
3927    /// 'Greater than or equal to' comparison for two `UniqueRc`s.
3928    ///
3929    /// The two are compared by calling `>=` on their inner values.
3930    ///
3931    /// # Examples
3932    ///
3933    /// ```
3934    /// #![feature(unique_rc_arc)]
3935    /// use std::rc::UniqueRc;
3936    ///
3937    /// let five = UniqueRc::new(5);
3938    ///
3939    /// assert!(five >= UniqueRc::new(5));
3940    /// ```
3941    #[inline(always)]
3942    fn ge(&self, other: &UniqueRc<T, A>) -> bool {
3943        **self >= **other
3944    }
3945}
3946
3947#[unstable(feature = "unique_rc_arc", issue = "112566")]
3948impl<T: ?Sized + Ord, A: Allocator> Ord for UniqueRc<T, A> {
3949    /// Comparison for two `UniqueRc`s.
3950    ///
3951    /// The two are compared by calling `cmp()` on their inner values.
3952    ///
3953    /// # Examples
3954    ///
3955    /// ```
3956    /// #![feature(unique_rc_arc)]
3957    /// use std::rc::UniqueRc;
3958    /// use std::cmp::Ordering;
3959    ///
3960    /// let five = UniqueRc::new(5);
3961    ///
3962    /// assert_eq!(Ordering::Less, five.cmp(&UniqueRc::new(6)));
3963    /// ```
3964    #[inline]
3965    fn cmp(&self, other: &UniqueRc<T, A>) -> Ordering {
3966        (**self).cmp(&**other)
3967    }
3968}
3969
3970#[unstable(feature = "unique_rc_arc", issue = "112566")]
3971impl<T: ?Sized + Eq, A: Allocator> Eq for UniqueRc<T, A> {}
3972
3973#[unstable(feature = "unique_rc_arc", issue = "112566")]
3974impl<T: ?Sized + Hash, A: Allocator> Hash for UniqueRc<T, A> {
3975    fn hash<H: Hasher>(&self, state: &mut H) {
3976        (**self).hash(state);
3977    }
3978}
3979
3980// Depends on A = Global
3981impl<T> UniqueRc<T> {
3982    /// Creates a new `UniqueRc`.
3983    ///
3984    /// Weak references to this `UniqueRc` can be created with [`UniqueRc::downgrade`]. Upgrading
3985    /// these weak references will fail before the `UniqueRc` has been converted into an [`Rc`].
3986    /// After converting the `UniqueRc` into an [`Rc`], any weak references created beforehand will
3987    /// point to the new [`Rc`].
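    ///
    /// # Examples
    ///
    /// A minimal sketch of mutating the contents while they are still uniquely owned:
    ///
    /// ```
    /// #![feature(unique_rc_arc)]
    /// use std::rc::UniqueRc;
    ///
    /// let mut five = UniqueRc::new(5);
    /// *five += 1;
    /// assert_eq!(*five, 6);
    /// ```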
3988    #[cfg(not(no_global_oom_handling))]
3989    #[unstable(feature = "unique_rc_arc", issue = "112566")]
3990    pub fn new(value: T) -> Self {
3991        Self::new_in(value, Global)
3992    }
3993}
3994
3995impl<T, A: Allocator> UniqueRc<T, A> {
3996    /// Creates a new `UniqueRc` in the provided allocator.
3997    ///
3998    /// Weak references to this `UniqueRc` can be created with [`UniqueRc::downgrade`]. Upgrading
3999    /// these weak references will fail before the `UniqueRc` has been converted into an [`Rc`].
4000    /// After converting the `UniqueRc` into an [`Rc`], any weak references created beforehand will
4001    /// point to the new [`Rc`].
4002    #[cfg(not(no_global_oom_handling))]
4003    #[unstable(feature = "unique_rc_arc", issue = "112566")]
4004    pub fn new_in(value: T, alloc: A) -> Self {
4005        let (ptr, alloc) = Box::into_unique(Box::new_in(
4006            RcInner {
4007                strong: Cell::new(0),
4008                // keep one weak reference so if all the weak pointers that are created are dropped
4009                // the UniqueRc still stays valid.
4010                weak: Cell::new(1),
4011                value,
4012            },
4013            alloc,
4014        ));
4015        Self { ptr: ptr.into(), _marker: PhantomData, _marker2: PhantomData, alloc }
4016    }
4017}
4018
4019impl<T: ?Sized, A: Allocator> UniqueRc<T, A> {
4020    /// Converts the `UniqueRc` into a regular [`Rc`].
4021    ///
4022    /// This consumes the `UniqueRc` and returns a regular [`Rc`] containing the value stored in
4023    /// the `UniqueRc` that is passed to `into_rc`.
4024    ///
4025    /// Any weak references created before this method is called can now be upgraded to strong
4026    /// references.
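    ///
    /// # Examples
    ///
    /// A minimal sketch of the initialize-then-share pattern:
    ///
    /// ```
    /// #![feature(unique_rc_arc)]
    /// use std::rc::{Rc, UniqueRc};
    ///
    /// let unique = UniqueRc::new(5);
    /// let weak = UniqueRc::downgrade(&unique);
    /// // The weak reference cannot be upgraded while the value is uniquely owned.
    /// assert!(weak.upgrade().is_none());
    ///
    /// let rc: Rc<i32> = UniqueRc::into_rc(unique);
    /// // After conversion, the weak reference created earlier can be upgraded.
    /// assert_eq!(*weak.upgrade().unwrap(), 5);
    /// ```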
4027    #[unstable(feature = "unique_rc_arc", issue = "112566")]
4028    pub fn into_rc(this: Self) -> Rc<T, A> {
4029        let mut this = ManuallyDrop::new(this);
4030
4031        // Move the allocator out.
4032        // SAFETY: `this.alloc` will not be accessed again, nor dropped because it is in
4033        // a `ManuallyDrop`.
4034        let alloc: A = unsafe { ptr::read(&this.alloc) };
4035
4036        // SAFETY: This pointer was allocated at creation time so we know it is valid.
4037        unsafe {
4038            // Convert our weak reference into a strong reference
4039            this.ptr.as_mut().strong.set(1);
4040            Rc::from_inner_in(this.ptr, alloc)
4041        }
4042    }
4043}
4044
4045impl<T: ?Sized, A: Allocator + Clone> UniqueRc<T, A> {
4046    /// Creates a new weak reference to the `UniqueRc`.
4047    ///
4048    /// Attempting to upgrade this weak reference will fail before the `UniqueRc` has been converted
4049    /// to an [`Rc`] using [`UniqueRc::into_rc`].
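    ///
    /// # Examples
    ///
    /// A short illustration:
    ///
    /// ```
    /// #![feature(unique_rc_arc)]
    /// use std::rc::UniqueRc;
    ///
    /// let unique = UniqueRc::new("hello".to_owned());
    /// let weak = UniqueRc::downgrade(&unique);
    /// // The value is still uniquely owned, so the upgrade fails.
    /// assert!(weak.upgrade().is_none());
    /// ```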
4050    #[unstable(feature = "unique_rc_arc", issue = "112566")]
4051    pub fn downgrade(this: &Self) -> Weak<T, A> {
4052        // SAFETY: This pointer was allocated at creation time and we guarantee that we only have
4053        // one strong reference before converting to a regular Rc.
4054        unsafe {
4055            this.ptr.as_ref().inc_weak();
4056        }
4057        Weak { ptr: this.ptr, alloc: this.alloc.clone() }
4058    }
4059}
4060
4061#[unstable(feature = "unique_rc_arc", issue = "112566")]
4062impl<T: ?Sized, A: Allocator> Deref for UniqueRc<T, A> {
4063    type Target = T;
4064
4065    fn deref(&self) -> &T {
4066        // SAFETY: This pointer was allocated at creation time so we know it is valid.
4067        unsafe { &self.ptr.as_ref().value }
4068    }
4069}
4070
4071#[unstable(feature = "unique_rc_arc", issue = "112566")]
4072impl<T: ?Sized, A: Allocator> DerefMut for UniqueRc<T, A> {
4073    fn deref_mut(&mut self) -> &mut T {
4074        // SAFETY: This pointer was allocated at creation time so we know it is valid. We know we
4075        // have unique ownership and therefore it's safe to make a mutable reference because
4076        // `UniqueRc` owns the only strong reference to itself.
4077        unsafe { &mut (*self.ptr.as_ptr()).value }
4078    }
4079}
4080
4081#[unstable(feature = "unique_rc_arc", issue = "112566")]
4082unsafe impl<#[may_dangle] T: ?Sized, A: Allocator> Drop for UniqueRc<T, A> {
4083    fn drop(&mut self) {
4084        unsafe {
4085            // destroy the contained object
4086            drop_in_place(DerefMut::deref_mut(self));
4087
4088            // remove the implicit "strong weak" pointer now that we've destroyed the contents.
4089            self.ptr.as_ref().dec_weak();
4090
4091            if self.ptr.as_ref().weak() == 0 {
4092                self.alloc.deallocate(self.ptr.cast(), Layout::for_value_raw(self.ptr.as_ptr()));
4093            }
4094        }
4095    }
4096}
4097
4098/// A unique owning pointer to a [`RcInner`] **that does not imply the contents are initialized,**
4099/// but will deallocate it (without dropping the value) when dropped.
4100///
4101/// This is a helper for [`Rc::make_mut()`] to ensure correct cleanup on panic.
4102/// It is nearly a duplicate of `UniqueRc<MaybeUninit<T>, A>` except that it allows `T: !Sized`,
4103/// which `MaybeUninit` does not.
4104#[cfg(not(no_global_oom_handling))]
4105struct UniqueRcUninit<T: ?Sized, A: Allocator> {
4106    ptr: NonNull<RcInner<T>>,
4107    layout_for_value: Layout,
4108    alloc: Option<A>,
4109}
4110
4111#[cfg(not(no_global_oom_handling))]
4112impl<T: ?Sized, A: Allocator> UniqueRcUninit<T, A> {
4113    /// Allocates an `RcInner` with a layout suitable to contain `for_value` or a clone of it.
4114    fn new(for_value: &T, alloc: A) -> UniqueRcUninit<T, A> {
4115        let layout = Layout::for_value(for_value);
4116        let ptr = unsafe {
4117            Rc::allocate_for_layout(
4118                layout,
4119                |layout_for_rc_inner| alloc.allocate(layout_for_rc_inner),
4120                |mem| mem.with_metadata_of(ptr::from_ref(for_value) as *const RcInner<T>),
4121            )
4122        };
4123        Self { ptr: NonNull::new(ptr).unwrap(), layout_for_value: layout, alloc: Some(alloc) }
4124    }
4125
4126    /// Returns the pointer to be written into to initialize the [`Rc`].
4127    fn data_ptr(&mut self) -> *mut T {
4128        let offset = data_offset_align(self.layout_for_value.align());
4129        unsafe { self.ptr.as_ptr().byte_add(offset) as *mut T }
4130    }
4131
4132    /// Upgrade this into a normal [`Rc`].
4133    ///
4134    /// # Safety
4135    ///
4136    /// The data must have been initialized (by writing to [`Self::data_ptr()`]).
4137    unsafe fn into_rc(self) -> Rc<T, A> {
4138        let mut this = ManuallyDrop::new(self);
4139        let ptr = this.ptr;
4140        let alloc = this.alloc.take().unwrap();
4141
4142        // SAFETY: The pointer is valid as per `UniqueRcUninit::new`, and the caller is responsible
4143        // for having initialized the data.
4144        unsafe { Rc::from_ptr_in(ptr.as_ptr(), alloc) }
4145    }
4146}
4147
4148#[cfg(not(no_global_oom_handling))]
4149impl<T: ?Sized, A: Allocator> Drop for UniqueRcUninit<T, A> {
4150    fn drop(&mut self) {
4151        // SAFETY:
4152        // * new() produced a pointer safe to deallocate.
4153        // * We own the pointer unless into_rc() was called, which forgets us.
4154        unsafe {
4155            self.alloc.take().unwrap().deallocate(
4156                self.ptr.cast(),
4157                rc_inner_layout_for_value_layout(self.layout_for_value),
4158            );
4159        }
4160    }
4161}