alloc/boxed.rs

1//! The `Box<T>` type for heap allocation.
2//!
3//! [`Box<T>`], casually referred to as a 'box', provides the simplest form of
4//! heap allocation in Rust. Boxes provide ownership for this allocation, and
5//! drop their contents when they go out of scope. Boxes also ensure that they
6//! never allocate more than `isize::MAX` bytes.
7//!
8//! # Examples
9//!
10//! Move a value from the stack to the heap by creating a [`Box`]:
11//!
12//! ```
13//! let val: u8 = 5;
14//! let boxed: Box<u8> = Box::new(val);
15//! ```
16//!
17//! Move a value from a [`Box`] back to the stack by [dereferencing]:
18//!
19//! ```
20//! let boxed: Box<u8> = Box::new(5);
21//! let val: u8 = *boxed;
22//! ```
23//!
24//! Creating a recursive data structure:
25//!
26//! ```
27//! # #[allow(dead_code)]
28//! #[derive(Debug)]
29//! enum List<T> {
30//!     Cons(T, Box<List<T>>),
31//!     Nil,
32//! }
33//!
34//! let list: List<i32> = List::Cons(1, Box::new(List::Cons(2, Box::new(List::Nil))));
35//! println!("{list:?}");
36//! ```
37//!
38//! This will print `Cons(1, Cons(2, Nil))`.
39//!
40//! Recursive structures must be boxed, because if the definition of `Cons`
41//! looked like this:
42//!
43//! ```compile_fail,E0072
44//! # enum List<T> {
45//! Cons(T, List<T>),
46//! # }
47//! ```
48//!
49//! It wouldn't work. This is because the size of a `List` depends on how many
50//! elements are in the list, and so we don't know how much memory to allocate
51//! for a `Cons`. By introducing a [`Box<T>`], which has a defined size, we know how
52//! big `Cons` needs to be.
53//!
54//! # Memory layout
55//!
56//! For non-zero-sized values, a [`Box`] will use the [`Global`] allocator for its allocation. It is
57//! valid to convert both ways between a [`Box`] and a raw pointer allocated with the [`Global`]
58//! allocator, given that the [`Layout`] used with the allocator is correct for the type and the raw
59//! pointer points to a valid value of the right type. More precisely, a `value: *mut T` that has
60//! been allocated with the [`Global`] allocator with `Layout::for_value(&*value)` may be converted
61//! into a box using [`Box::<T>::from_raw(value)`]. Conversely, the memory backing a `value: *mut T`
62//! obtained from [`Box::<T>::into_raw`] may be deallocated using the [`Global`] allocator with
63//! [`Layout::for_value(&*value)`].
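//!
//! For example, the round trip described above for a sized type might look like this
//! (a sketch of one valid pattern, not the only one):
//!
//! ```
//! use std::alloc::{dealloc, Layout};
//! use std::ptr;
//!
//! let value: *mut i32 = Box::into_raw(Box::new(42));
//! // A pointer obtained this way may be turned back into a `Box`...
//! let boxed: Box<i32> = unsafe { Box::from_raw(value) };
//! // ...or its memory may be released manually, using the layout of the pointed-to value.
//! let value = Box::into_raw(boxed);
//! unsafe {
//!     let layout = Layout::for_value(&*value);
//!     ptr::drop_in_place(value);
//!     dealloc(value.cast::<u8>(), layout); // frees via the global allocator
//! }
//! ```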
64//!
65//! For zero-sized values, the `Box` pointer has to be non-null and sufficiently aligned. The
66//! recommended way to build a Box to a ZST if `Box::new` cannot be used is to use
67//! [`ptr::NonNull::dangling`].
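//!
//! For instance, a minimal sketch of that recommendation:
//!
//! ```
//! use std::ptr::NonNull;
//!
//! // `()` is zero-sized, so no allocation takes place and a dangling,
//! // well-aligned pointer is sufficient.
//! let b: Box<()> = unsafe { Box::from_raw(NonNull::dangling().as_ptr()) };
//! assert_eq!(*b, ());
//! ```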
68//!
69//! On top of these basic layout requirements, a `Box<T>` must point to a valid value of `T`.
70//!
71//! So long as `T: Sized`, a `Box<T>` is guaranteed to be represented
72//! as a single pointer and is also ABI-compatible with C pointers
73//! (i.e. the C type `T*`). This means that if you have extern "C"
74//! Rust functions that will be called from C, you can define those
75//! Rust functions using `Box<T>` types, and use `T*` as corresponding
76//! type on the C side. As an example, consider this C header which
77//! declares functions that create and destroy some kind of `Foo`
78//! value:
79//!
80//! ```c
81//! /* C header */
82//!
83//! /* Returns ownership to the caller */
84//! struct Foo* foo_new(void);
85//!
86//! /* Takes ownership from the caller; no-op when invoked with null */
87//! void foo_delete(struct Foo*);
88//! ```
89//!
90//! These two functions might be implemented in Rust as follows. Here, the
91//! `struct Foo*` type from C is translated to `Box<Foo>`, which captures
92//! the ownership constraints. Note also that the nullable argument to
93//! `foo_delete` is represented in Rust as `Option<Box<Foo>>`, since `Box<Foo>`
94//! cannot be null.
95//!
96//! ```
97//! #[repr(C)]
98//! pub struct Foo;
99//!
100//! #[unsafe(no_mangle)]
101//! pub extern "C" fn foo_new() -> Box<Foo> {
102//!     Box::new(Foo)
103//! }
104//!
105//! #[unsafe(no_mangle)]
106//! pub extern "C" fn foo_delete(_: Option<Box<Foo>>) {}
107//! ```
108//!
109//! Even though `Box<T>` has the same representation and C ABI as a C pointer,
110//! this does not mean that you can convert an arbitrary `T*` into a `Box<T>`
111//! and expect things to work. `Box<T>` values will always be fully aligned,
112//! non-null pointers. Moreover, the destructor for `Box<T>` will attempt to
113//! free the value with the global allocator. In general, the best practice
114//! is to only use `Box<T>` for pointers that originated from the global
115//! allocator.
116//!
117//! **Important.** At least at present, you should avoid using
118//! `Box<T>` types for functions that are defined in C but invoked
119//! from Rust. In those cases, you should directly mirror the C types
120//! as closely as possible. Using types like `Box<T>` where the C
121//! definition is just using `T*` can lead to undefined behavior, as
122//! described in [rust-lang/unsafe-code-guidelines#198][ucg#198].
123//!
124//! # Considerations for unsafe code
125//!
126//! **Warning: This section is not normative and is subject to change, possibly
127//! being relaxed in the future! It is a simplified summary of the rules
128//! currently implemented in the compiler.**
129//!
130//! The aliasing rules for `Box<T>` are the same as for `&mut T`. `Box<T>`
131//! asserts uniqueness over its content. Using raw pointers derived from a box
//! after that box has been mutated through, moved, or borrowed as `&mut T`
//! is not allowed. For more guidance on working with boxes from unsafe code, see
134//! [rust-lang/unsafe-code-guidelines#326][ucg#326].
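//!
//! A sketch of that rule (the commented-out line would violate it):
//!
//! ```
//! let mut b = Box::new(0u8);
//! let raw = &raw mut *b;
//! unsafe { raw.write(1) }; // OK: the box has not been used in between
//! let r = &mut *b; // borrowing the box as `&mut u8`...
//! *r += 1;
//! // unsafe { raw.write(2) }; // ...would invalidate `raw`, so this is not allowed
//! assert_eq!(*b, 2);
//! ```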
135//!
136//! # Editions
137//!
138//! A special case exists for the implementation of `IntoIterator` for arrays on the Rust 2021
139//! edition, as documented [here][array]. Unfortunately, it was later found that a similar
140//! workaround should be added for boxed slices, and this was applied in the 2024 edition.
141//!
142//! Specifically, `IntoIterator` is implemented for `Box<[T]>` on all editions, but specific calls
143//! to `into_iter()` for boxed slices will defer to the slice implementation on editions before
144//! 2024:
145//!
146//! ```rust,edition2021
147//! // Rust 2015, 2018, and 2021:
148//!
149//! # #![allow(boxed_slice_into_iter)] // override our `deny(warnings)`
150//! let boxed_slice: Box<[i32]> = vec![0; 3].into_boxed_slice();
151//!
152//! // This creates a slice iterator, producing references to each value.
153//! for item in boxed_slice.into_iter().enumerate() {
154//!     let (i, x): (usize, &i32) = item;
155//!     println!("boxed_slice[{i}] = {x}");
156//! }
157//!
158//! // The `boxed_slice_into_iter` lint suggests this change for future compatibility:
159//! for item in boxed_slice.iter().enumerate() {
160//!     let (i, x): (usize, &i32) = item;
161//!     println!("boxed_slice[{i}] = {x}");
162//! }
163//!
164//! // You can explicitly iterate a boxed slice by value using `IntoIterator::into_iter`
165//! for item in IntoIterator::into_iter(boxed_slice).enumerate() {
166//!     let (i, x): (usize, i32) = item;
167//!     println!("boxed_slice[{i}] = {x}");
168//! }
169//! ```
170//!
171//! Similar to the array implementation, this may be modified in the future to remove this override,
172//! and it's best to avoid relying on this edition-dependent behavior if you wish to preserve
173//! compatibility with future versions of the compiler.
174//!
175//! [ucg#198]: https://github.com/rust-lang/unsafe-code-guidelines/issues/198
176//! [ucg#326]: https://github.com/rust-lang/unsafe-code-guidelines/issues/326
177//! [dereferencing]: core::ops::Deref
178//! [`Box::<T>::from_raw(value)`]: Box::from_raw
179//! [`Global`]: crate::alloc::Global
180//! [`Layout`]: crate::alloc::Layout
181//! [`Layout::for_value(&*value)`]: crate::alloc::Layout::for_value
182//! [valid]: ptr#safety
183
184#![stable(feature = "rust1", since = "1.0.0")]
185
186use core::borrow::{Borrow, BorrowMut};
187#[cfg(not(no_global_oom_handling))]
188use core::clone::CloneToUninit;
189use core::cmp::Ordering;
190use core::error::{self, Error};
191use core::fmt;
192use core::future::Future;
193use core::hash::{Hash, Hasher};
194use core::marker::{PointerLike, Tuple, Unsize};
195use core::mem::{self, SizedTypeProperties};
196use core::ops::{
197    AsyncFn, AsyncFnMut, AsyncFnOnce, CoerceUnsized, Coroutine, CoroutineState, Deref, DerefMut,
198    DerefPure, DispatchFromDyn, LegacyReceiver,
199};
200use core::pin::{Pin, PinCoerceUnsized};
201use core::ptr::{self, NonNull, Unique};
202use core::task::{Context, Poll};
203
204#[cfg(not(no_global_oom_handling))]
205use crate::alloc::handle_alloc_error;
206use crate::alloc::{AllocError, Allocator, Global, Layout};
207use crate::raw_vec::RawVec;
208#[cfg(not(no_global_oom_handling))]
209use crate::str::from_boxed_utf8_unchecked;
210
/// Conversion related impls for `Box<_>` (`From`, `downcast`, etc.)
212mod convert;
213/// Iterator related impls for `Box<_>`.
214mod iter;
215/// [`ThinBox`] implementation.
216mod thin;
217
218#[unstable(feature = "thin_box", issue = "92791")]
219pub use thin::ThinBox;
220
221/// A pointer type that uniquely owns a heap allocation of type `T`.
222///
223/// See the [module-level documentation](../../std/boxed/index.html) for more.
224#[lang = "owned_box"]
225#[fundamental]
226#[stable(feature = "rust1", since = "1.0.0")]
227#[rustc_insignificant_dtor]
228#[doc(search_unbox)]
229// The declaration of the `Box` struct must be kept in sync with the
230// compiler or ICEs will happen.
231pub struct Box<
232    T: ?Sized,
233    #[unstable(feature = "allocator_api", issue = "32838")] A: Allocator = Global,
234>(Unique<T>, A);
235
236/// Constructs a `Box<T>` by calling the `exchange_malloc` lang item and moving the argument into
237/// the newly allocated memory. This is an intrinsic to avoid unnecessary copies.
238///
239/// This is the surface syntax for `box <expr>` expressions.
240#[rustc_intrinsic]
241#[unstable(feature = "liballoc_internals", issue = "none")]
242pub fn box_new<T>(x: T) -> Box<T>;
243
244impl<T> Box<T> {
245    /// Allocates memory on the heap and then places `x` into it.
246    ///
247    /// This doesn't actually allocate if `T` is zero-sized.
248    ///
249    /// # Examples
250    ///
251    /// ```
252    /// let five = Box::new(5);
253    /// ```
254    #[cfg(not(no_global_oom_handling))]
255    #[inline(always)]
256    #[stable(feature = "rust1", since = "1.0.0")]
257    #[must_use]
258    #[rustc_diagnostic_item = "box_new"]
259    #[cfg_attr(miri, track_caller)] // even without panics, this helps for Miri backtraces
260    pub fn new(x: T) -> Self {
261        return box_new(x);
262    }
263
264    /// Constructs a new box with uninitialized contents.
265    ///
266    /// # Examples
267    ///
268    /// ```
269    /// let mut five = Box::<u32>::new_uninit();
270    /// // Deferred initialization:
271    /// five.write(5);
272    /// let five = unsafe { five.assume_init() };
273    ///
274    /// assert_eq!(*five, 5)
275    /// ```
276    #[cfg(not(no_global_oom_handling))]
277    #[stable(feature = "new_uninit", since = "1.82.0")]
278    #[must_use]
279    #[inline]
280    pub fn new_uninit() -> Box<mem::MaybeUninit<T>> {
281        Self::new_uninit_in(Global)
282    }
283
284    /// Constructs a new `Box` with uninitialized contents, with the memory
285    /// being filled with `0` bytes.
286    ///
287    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
288    /// of this method.
289    ///
290    /// # Examples
291    ///
292    /// ```
293    /// #![feature(new_zeroed_alloc)]
294    ///
295    /// let zero = Box::<u32>::new_zeroed();
296    /// let zero = unsafe { zero.assume_init() };
297    ///
298    /// assert_eq!(*zero, 0)
299    /// ```
300    ///
301    /// [zeroed]: mem::MaybeUninit::zeroed
302    #[cfg(not(no_global_oom_handling))]
303    #[inline]
304    #[unstable(feature = "new_zeroed_alloc", issue = "129396")]
305    #[must_use]
306    pub fn new_zeroed() -> Box<mem::MaybeUninit<T>> {
307        Self::new_zeroed_in(Global)
308    }
309
310    /// Constructs a new `Pin<Box<T>>`. If `T` does not implement [`Unpin`], then
311    /// `x` will be pinned in memory and unable to be moved.
312    ///
313    /// Constructing and pinning of the `Box` can also be done in two steps: `Box::pin(x)`
314    /// does the same as <code>[Box::into_pin]\([Box::new]\(x))</code>. Consider using
315    /// [`into_pin`](Box::into_pin) if you already have a `Box<T>`, or if you want to
316    /// construct a (pinned) `Box` in a different way than with [`Box::new`].
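    ///
    /// # Examples
    ///
    /// A minimal illustration with an [`Unpin`] type (for such a type, pinning adds no
    /// restrictions over a plain `Box`):
    ///
    /// ```
    /// let mut pinned: std::pin::Pin<Box<u32>> = Box::pin(5);
    /// *pinned += 1;
    /// assert_eq!(*pinned, 6);
    /// ```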
317    #[cfg(not(no_global_oom_handling))]
318    #[stable(feature = "pin", since = "1.33.0")]
319    #[must_use]
320    #[inline(always)]
321    pub fn pin(x: T) -> Pin<Box<T>> {
322        Box::new(x).into()
323    }
324
    /// Allocates memory on the heap and then places `x` into it,
    /// returning an error if the allocation fails.
327    ///
328    /// This doesn't actually allocate if `T` is zero-sized.
329    ///
330    /// # Examples
331    ///
332    /// ```
333    /// #![feature(allocator_api)]
334    ///
335    /// let five = Box::try_new(5)?;
336    /// # Ok::<(), std::alloc::AllocError>(())
337    /// ```
338    #[unstable(feature = "allocator_api", issue = "32838")]
339    #[inline]
340    pub fn try_new(x: T) -> Result<Self, AllocError> {
341        Self::try_new_in(x, Global)
342    }
343
344    /// Constructs a new box with uninitialized contents on the heap,
    /// returning an error if the allocation fails.
346    ///
347    /// # Examples
348    ///
349    /// ```
350    /// #![feature(allocator_api)]
351    ///
352    /// let mut five = Box::<u32>::try_new_uninit()?;
353    /// // Deferred initialization:
354    /// five.write(5);
355    /// let five = unsafe { five.assume_init() };
356    ///
357    /// assert_eq!(*five, 5);
358    /// # Ok::<(), std::alloc::AllocError>(())
359    /// ```
360    #[unstable(feature = "allocator_api", issue = "32838")]
361    // #[unstable(feature = "new_uninit", issue = "63291")]
362    #[inline]
363    pub fn try_new_uninit() -> Result<Box<mem::MaybeUninit<T>>, AllocError> {
364        Box::try_new_uninit_in(Global)
365    }
366
367    /// Constructs a new `Box` with uninitialized contents, with the memory
    /// being filled with `0` bytes on the heap, returning an error if the allocation fails.
369    ///
370    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
371    /// of this method.
372    ///
373    /// # Examples
374    ///
375    /// ```
376    /// #![feature(allocator_api)]
377    ///
378    /// let zero = Box::<u32>::try_new_zeroed()?;
379    /// let zero = unsafe { zero.assume_init() };
380    ///
381    /// assert_eq!(*zero, 0);
382    /// # Ok::<(), std::alloc::AllocError>(())
383    /// ```
384    ///
385    /// [zeroed]: mem::MaybeUninit::zeroed
386    #[unstable(feature = "allocator_api", issue = "32838")]
387    // #[unstable(feature = "new_uninit", issue = "63291")]
388    #[inline]
389    pub fn try_new_zeroed() -> Result<Box<mem::MaybeUninit<T>>, AllocError> {
390        Box::try_new_zeroed_in(Global)
391    }
392}
393
394impl<T, A: Allocator> Box<T, A> {
395    /// Allocates memory in the given allocator then places `x` into it.
396    ///
397    /// This doesn't actually allocate if `T` is zero-sized.
398    ///
399    /// # Examples
400    ///
401    /// ```
402    /// #![feature(allocator_api)]
403    ///
404    /// use std::alloc::System;
405    ///
406    /// let five = Box::new_in(5, System);
407    /// ```
408    #[cfg(not(no_global_oom_handling))]
409    #[unstable(feature = "allocator_api", issue = "32838")]
410    #[must_use]
411    #[inline]
412    pub fn new_in(x: T, alloc: A) -> Self
413    where
414        A: Allocator,
415    {
416        let mut boxed = Self::new_uninit_in(alloc);
417        boxed.write(x);
418        unsafe { boxed.assume_init() }
419    }
420
    /// Allocates memory in the given allocator and then places `x` into it,
    /// returning an error if the allocation fails.
423    ///
424    /// This doesn't actually allocate if `T` is zero-sized.
425    ///
426    /// # Examples
427    ///
428    /// ```
429    /// #![feature(allocator_api)]
430    ///
431    /// use std::alloc::System;
432    ///
433    /// let five = Box::try_new_in(5, System)?;
434    /// # Ok::<(), std::alloc::AllocError>(())
435    /// ```
436    #[unstable(feature = "allocator_api", issue = "32838")]
437    #[inline]
438    pub fn try_new_in(x: T, alloc: A) -> Result<Self, AllocError>
439    where
440        A: Allocator,
441    {
442        let mut boxed = Self::try_new_uninit_in(alloc)?;
443        boxed.write(x);
444        unsafe { Ok(boxed.assume_init()) }
445    }
446
447    /// Constructs a new box with uninitialized contents in the provided allocator.
448    ///
449    /// # Examples
450    ///
451    /// ```
452    /// #![feature(allocator_api)]
453    ///
454    /// use std::alloc::System;
455    ///
456    /// let mut five = Box::<u32, _>::new_uninit_in(System);
457    /// // Deferred initialization:
458    /// five.write(5);
459    /// let five = unsafe { five.assume_init() };
460    ///
461    /// assert_eq!(*five, 5)
462    /// ```
463    #[unstable(feature = "allocator_api", issue = "32838")]
464    #[cfg(not(no_global_oom_handling))]
465    #[must_use]
466    // #[unstable(feature = "new_uninit", issue = "63291")]
467    pub fn new_uninit_in(alloc: A) -> Box<mem::MaybeUninit<T>, A>
468    where
469        A: Allocator,
470    {
471        let layout = Layout::new::<mem::MaybeUninit<T>>();
        // NOTE: Prefer `match` over `unwrap_or_else` since the closure is sometimes not
        // inlineable, which would make the code size bigger.
474        match Box::try_new_uninit_in(alloc) {
475            Ok(m) => m,
476            Err(_) => handle_alloc_error(layout),
477        }
478    }
479
480    /// Constructs a new box with uninitialized contents in the provided allocator,
    /// returning an error if the allocation fails.
482    ///
483    /// # Examples
484    ///
485    /// ```
486    /// #![feature(allocator_api)]
487    ///
488    /// use std::alloc::System;
489    ///
490    /// let mut five = Box::<u32, _>::try_new_uninit_in(System)?;
491    /// // Deferred initialization:
492    /// five.write(5);
493    /// let five = unsafe { five.assume_init() };
494    ///
495    /// assert_eq!(*five, 5);
496    /// # Ok::<(), std::alloc::AllocError>(())
497    /// ```
498    #[unstable(feature = "allocator_api", issue = "32838")]
499    // #[unstable(feature = "new_uninit", issue = "63291")]
500    pub fn try_new_uninit_in(alloc: A) -> Result<Box<mem::MaybeUninit<T>, A>, AllocError>
501    where
502        A: Allocator,
503    {
504        let ptr = if T::IS_ZST {
505            NonNull::dangling()
506        } else {
507            let layout = Layout::new::<mem::MaybeUninit<T>>();
508            alloc.allocate(layout)?.cast()
509        };
510        unsafe { Ok(Box::from_raw_in(ptr.as_ptr(), alloc)) }
511    }
512
513    /// Constructs a new `Box` with uninitialized contents, with the memory
514    /// being filled with `0` bytes in the provided allocator.
515    ///
516    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
517    /// of this method.
518    ///
519    /// # Examples
520    ///
521    /// ```
522    /// #![feature(allocator_api)]
523    ///
524    /// use std::alloc::System;
525    ///
526    /// let zero = Box::<u32, _>::new_zeroed_in(System);
527    /// let zero = unsafe { zero.assume_init() };
528    ///
529    /// assert_eq!(*zero, 0)
530    /// ```
531    ///
532    /// [zeroed]: mem::MaybeUninit::zeroed
533    #[unstable(feature = "allocator_api", issue = "32838")]
534    #[cfg(not(no_global_oom_handling))]
535    // #[unstable(feature = "new_uninit", issue = "63291")]
536    #[must_use]
537    pub fn new_zeroed_in(alloc: A) -> Box<mem::MaybeUninit<T>, A>
538    where
539        A: Allocator,
540    {
541        let layout = Layout::new::<mem::MaybeUninit<T>>();
        // NOTE: Prefer `match` over `unwrap_or_else` since the closure is sometimes not
        // inlineable, which would make the code size bigger.
544        match Box::try_new_zeroed_in(alloc) {
545            Ok(m) => m,
546            Err(_) => handle_alloc_error(layout),
547        }
548    }
549
550    /// Constructs a new `Box` with uninitialized contents, with the memory
551    /// being filled with `0` bytes in the provided allocator,
    /// returning an error if the allocation fails.
553    ///
554    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
555    /// of this method.
556    ///
557    /// # Examples
558    ///
559    /// ```
560    /// #![feature(allocator_api)]
561    ///
562    /// use std::alloc::System;
563    ///
564    /// let zero = Box::<u32, _>::try_new_zeroed_in(System)?;
565    /// let zero = unsafe { zero.assume_init() };
566    ///
567    /// assert_eq!(*zero, 0);
568    /// # Ok::<(), std::alloc::AllocError>(())
569    /// ```
570    ///
571    /// [zeroed]: mem::MaybeUninit::zeroed
572    #[unstable(feature = "allocator_api", issue = "32838")]
573    // #[unstable(feature = "new_uninit", issue = "63291")]
574    pub fn try_new_zeroed_in(alloc: A) -> Result<Box<mem::MaybeUninit<T>, A>, AllocError>
575    where
576        A: Allocator,
577    {
578        let ptr = if T::IS_ZST {
579            NonNull::dangling()
580        } else {
581            let layout = Layout::new::<mem::MaybeUninit<T>>();
582            alloc.allocate_zeroed(layout)?.cast()
583        };
584        unsafe { Ok(Box::from_raw_in(ptr.as_ptr(), alloc)) }
585    }
586
587    /// Constructs a new `Pin<Box<T, A>>`. If `T` does not implement [`Unpin`], then
588    /// `x` will be pinned in memory and unable to be moved.
589    ///
590    /// Constructing and pinning of the `Box` can also be done in two steps: `Box::pin_in(x, alloc)`
591    /// does the same as <code>[Box::into_pin]\([Box::new_in]\(x, alloc))</code>. Consider using
592    /// [`into_pin`](Box::into_pin) if you already have a `Box<T, A>`, or if you want to
593    /// construct a (pinned) `Box` in a different way than with [`Box::new_in`].
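    ///
    /// # Examples
    ///
    /// A minimal sketch using the `System` allocator:
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let pinned = Box::pin_in(5, System);
    /// assert_eq!(*pinned, 5);
    /// ```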
594    #[cfg(not(no_global_oom_handling))]
595    #[unstable(feature = "allocator_api", issue = "32838")]
596    #[must_use]
597    #[inline(always)]
598    pub fn pin_in(x: T, alloc: A) -> Pin<Self>
599    where
600        A: 'static + Allocator,
601    {
602        Self::into_pin(Self::new_in(x, alloc))
603    }
604
    /// Converts a `Box<T>` into a `Box<[T]>`.
606    ///
607    /// This conversion does not allocate on the heap and happens in place.
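    ///
    /// # Examples
    ///
    /// A minimal sketch of the conversion:
    ///
    /// ```
    /// #![feature(box_into_boxed_slice)]
    ///
    /// let boxed: Box<u8> = Box::new(1);
    /// let boxed_slice: Box<[u8]> = Box::into_boxed_slice(boxed);
    /// assert_eq!(*boxed_slice, [1]);
    /// ```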
608    #[unstable(feature = "box_into_boxed_slice", issue = "71582")]
609    pub fn into_boxed_slice(boxed: Self) -> Box<[T], A> {
610        let (raw, alloc) = Box::into_raw_with_allocator(boxed);
611        unsafe { Box::from_raw_in(raw as *mut [T; 1], alloc) }
612    }
613
614    /// Consumes the `Box`, returning the wrapped value.
615    ///
616    /// # Examples
617    ///
618    /// ```
619    /// #![feature(box_into_inner)]
620    ///
621    /// let c = Box::new(5);
622    ///
623    /// assert_eq!(Box::into_inner(c), 5);
624    /// ```
625    #[unstable(feature = "box_into_inner", issue = "80437")]
626    #[inline]
627    pub fn into_inner(boxed: Self) -> T {
628        *boxed
629    }
630}
631
632impl<T> Box<[T]> {
633    /// Constructs a new boxed slice with uninitialized contents.
634    ///
635    /// # Examples
636    ///
637    /// ```
638    /// let mut values = Box::<[u32]>::new_uninit_slice(3);
639    /// // Deferred initialization:
640    /// values[0].write(1);
641    /// values[1].write(2);
642    /// values[2].write(3);
    /// let values = unsafe { values.assume_init() };
644    ///
645    /// assert_eq!(*values, [1, 2, 3])
646    /// ```
647    #[cfg(not(no_global_oom_handling))]
648    #[stable(feature = "new_uninit", since = "1.82.0")]
649    #[must_use]
650    pub fn new_uninit_slice(len: usize) -> Box<[mem::MaybeUninit<T>]> {
651        unsafe { RawVec::with_capacity(len).into_box(len) }
652    }
653
654    /// Constructs a new boxed slice with uninitialized contents, with the memory
655    /// being filled with `0` bytes.
656    ///
657    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
658    /// of this method.
659    ///
660    /// # Examples
661    ///
662    /// ```
663    /// #![feature(new_zeroed_alloc)]
664    ///
665    /// let values = Box::<[u32]>::new_zeroed_slice(3);
666    /// let values = unsafe { values.assume_init() };
667    ///
668    /// assert_eq!(*values, [0, 0, 0])
669    /// ```
670    ///
671    /// [zeroed]: mem::MaybeUninit::zeroed
672    #[cfg(not(no_global_oom_handling))]
673    #[unstable(feature = "new_zeroed_alloc", issue = "129396")]
674    #[must_use]
675    pub fn new_zeroed_slice(len: usize) -> Box<[mem::MaybeUninit<T>]> {
676        unsafe { RawVec::with_capacity_zeroed(len).into_box(len) }
677    }
678
679    /// Constructs a new boxed slice with uninitialized contents. Returns an error if
680    /// the allocation fails.
681    ///
682    /// # Examples
683    ///
684    /// ```
685    /// #![feature(allocator_api)]
686    ///
687    /// let mut values = Box::<[u32]>::try_new_uninit_slice(3)?;
688    /// // Deferred initialization:
689    /// values[0].write(1);
690    /// values[1].write(2);
691    /// values[2].write(3);
692    /// let values = unsafe { values.assume_init() };
693    ///
694    /// assert_eq!(*values, [1, 2, 3]);
695    /// # Ok::<(), std::alloc::AllocError>(())
696    /// ```
697    #[unstable(feature = "allocator_api", issue = "32838")]
698    #[inline]
699    pub fn try_new_uninit_slice(len: usize) -> Result<Box<[mem::MaybeUninit<T>]>, AllocError> {
700        let ptr = if T::IS_ZST || len == 0 {
701            NonNull::dangling()
702        } else {
703            let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
704                Ok(l) => l,
705                Err(_) => return Err(AllocError),
706            };
707            Global.allocate(layout)?.cast()
708        };
709        unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, Global).into_box(len)) }
710    }
711
712    /// Constructs a new boxed slice with uninitialized contents, with the memory
713    /// being filled with `0` bytes. Returns an error if the allocation fails.
714    ///
715    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
716    /// of this method.
717    ///
718    /// # Examples
719    ///
720    /// ```
721    /// #![feature(allocator_api)]
722    ///
723    /// let values = Box::<[u32]>::try_new_zeroed_slice(3)?;
724    /// let values = unsafe { values.assume_init() };
725    ///
726    /// assert_eq!(*values, [0, 0, 0]);
727    /// # Ok::<(), std::alloc::AllocError>(())
728    /// ```
729    ///
730    /// [zeroed]: mem::MaybeUninit::zeroed
731    #[unstable(feature = "allocator_api", issue = "32838")]
732    #[inline]
733    pub fn try_new_zeroed_slice(len: usize) -> Result<Box<[mem::MaybeUninit<T>]>, AllocError> {
734        let ptr = if T::IS_ZST || len == 0 {
735            NonNull::dangling()
736        } else {
737            let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
738                Ok(l) => l,
739                Err(_) => return Err(AllocError),
740            };
741            Global.allocate_zeroed(layout)?.cast()
742        };
743        unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, Global).into_box(len)) }
744    }
745
746    /// Converts the boxed slice into a boxed array.
747    ///
748    /// This operation does not reallocate; the underlying array of the slice is simply reinterpreted as an array type.
749    ///
750    /// If `N` is not exactly equal to the length of `self`, then this method returns `None`.
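    ///
    /// # Examples
    ///
    /// A sketch of both the matching and the non-matching length case:
    ///
    /// ```
    /// #![feature(slice_as_array)]
    ///
    /// let boxed_slice: Box<[i32]> = vec![1, 2, 3].into_boxed_slice();
    /// let boxed_array: Box<[i32; 3]> = boxed_slice.into_array().unwrap();
    /// assert_eq!(*boxed_array, [1, 2, 3]);
    ///
    /// let boxed_slice: Box<[i32]> = vec![1, 2, 3].into_boxed_slice();
    /// assert!(boxed_slice.into_array::<4>().is_none());
    /// ```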
751    #[unstable(feature = "slice_as_array", issue = "133508")]
752    #[inline]
753    #[must_use]
754    pub fn into_array<const N: usize>(self) -> Option<Box<[T; N]>> {
755        if self.len() == N {
756            let ptr = Self::into_raw(self) as *mut [T; N];
757
758            // SAFETY: The underlying array of a slice has the exact same layout as an actual array `[T; N]` if `N` is equal to the slice's length.
759            let me = unsafe { Box::from_raw(ptr) };
760            Some(me)
761        } else {
762            None
763        }
764    }
765}
766
767impl<T, A: Allocator> Box<[T], A> {
768    /// Constructs a new boxed slice with uninitialized contents in the provided allocator.
769    ///
770    /// # Examples
771    ///
772    /// ```
773    /// #![feature(allocator_api)]
774    ///
775    /// use std::alloc::System;
776    ///
777    /// let mut values = Box::<[u32], _>::new_uninit_slice_in(3, System);
778    /// // Deferred initialization:
779    /// values[0].write(1);
780    /// values[1].write(2);
781    /// values[2].write(3);
782    /// let values = unsafe { values.assume_init() };
783    ///
784    /// assert_eq!(*values, [1, 2, 3])
785    /// ```
786    #[cfg(not(no_global_oom_handling))]
787    #[unstable(feature = "allocator_api", issue = "32838")]
788    // #[unstable(feature = "new_uninit", issue = "63291")]
789    #[must_use]
790    pub fn new_uninit_slice_in(len: usize, alloc: A) -> Box<[mem::MaybeUninit<T>], A> {
791        unsafe { RawVec::with_capacity_in(len, alloc).into_box(len) }
792    }
793
794    /// Constructs a new boxed slice with uninitialized contents in the provided allocator,
795    /// with the memory being filled with `0` bytes.
796    ///
797    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
798    /// of this method.
799    ///
800    /// # Examples
801    ///
802    /// ```
803    /// #![feature(allocator_api)]
804    ///
805    /// use std::alloc::System;
806    ///
807    /// let values = Box::<[u32], _>::new_zeroed_slice_in(3, System);
808    /// let values = unsafe { values.assume_init() };
809    ///
810    /// assert_eq!(*values, [0, 0, 0])
811    /// ```
812    ///
813    /// [zeroed]: mem::MaybeUninit::zeroed
814    #[cfg(not(no_global_oom_handling))]
815    #[unstable(feature = "allocator_api", issue = "32838")]
816    // #[unstable(feature = "new_uninit", issue = "63291")]
817    #[must_use]
818    pub fn new_zeroed_slice_in(len: usize, alloc: A) -> Box<[mem::MaybeUninit<T>], A> {
819        unsafe { RawVec::with_capacity_zeroed_in(len, alloc).into_box(len) }
820    }
821
822    /// Constructs a new boxed slice with uninitialized contents in the provided allocator. Returns an error if
823    /// the allocation fails.
824    ///
825    /// # Examples
826    ///
827    /// ```
828    /// #![feature(allocator_api)]
829    ///
830    /// use std::alloc::System;
831    ///
832    /// let mut values = Box::<[u32], _>::try_new_uninit_slice_in(3, System)?;
833    /// // Deferred initialization:
834    /// values[0].write(1);
835    /// values[1].write(2);
836    /// values[2].write(3);
837    /// let values = unsafe { values.assume_init() };
838    ///
839    /// assert_eq!(*values, [1, 2, 3]);
840    /// # Ok::<(), std::alloc::AllocError>(())
841    /// ```
842    #[unstable(feature = "allocator_api", issue = "32838")]
843    #[inline]
844    pub fn try_new_uninit_slice_in(
845        len: usize,
846        alloc: A,
847    ) -> Result<Box<[mem::MaybeUninit<T>], A>, AllocError> {
848        let ptr = if T::IS_ZST || len == 0 {
849            NonNull::dangling()
850        } else {
851            let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
852                Ok(l) => l,
853                Err(_) => return Err(AllocError),
854            };
855            alloc.allocate(layout)?.cast()
856        };
857        unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, alloc).into_box(len)) }
858    }
859
860    /// Constructs a new boxed slice with uninitialized contents in the provided allocator, with the memory
861    /// being filled with `0` bytes. Returns an error if the allocation fails.
862    ///
863    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
864    /// of this method.
865    ///
866    /// # Examples
867    ///
868    /// ```
869    /// #![feature(allocator_api)]
870    ///
871    /// use std::alloc::System;
872    ///
873    /// let values = Box::<[u32], _>::try_new_zeroed_slice_in(3, System)?;
874    /// let values = unsafe { values.assume_init() };
875    ///
876    /// assert_eq!(*values, [0, 0, 0]);
877    /// # Ok::<(), std::alloc::AllocError>(())
878    /// ```
879    ///
880    /// [zeroed]: mem::MaybeUninit::zeroed
881    #[unstable(feature = "allocator_api", issue = "32838")]
882    #[inline]
883    pub fn try_new_zeroed_slice_in(
884        len: usize,
885        alloc: A,
886    ) -> Result<Box<[mem::MaybeUninit<T>], A>, AllocError> {
887        let ptr = if T::IS_ZST || len == 0 {
888            NonNull::dangling()
889        } else {
890            let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
891                Ok(l) => l,
892                Err(_) => return Err(AllocError),
893            };
894            alloc.allocate_zeroed(layout)?.cast()
895        };
896        unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, alloc).into_box(len)) }
897    }
898}
899
900impl<T, A: Allocator> Box<mem::MaybeUninit<T>, A> {
901    /// Converts to `Box<T, A>`.
902    ///
903    /// # Safety
904    ///
905    /// As with [`MaybeUninit::assume_init`],
906    /// it is up to the caller to guarantee that the value
907    /// really is in an initialized state.
908    /// Calling this when the content is not yet fully initialized
909    /// causes immediate undefined behavior.
910    ///
911    /// [`MaybeUninit::assume_init`]: mem::MaybeUninit::assume_init
912    ///
913    /// # Examples
914    ///
915    /// ```
916    /// let mut five = Box::<u32>::new_uninit();
917    /// // Deferred initialization:
918    /// five.write(5);
919    /// let five: Box<u32> = unsafe { five.assume_init() };
920    ///
921    /// assert_eq!(*five, 5)
922    /// ```
923    #[stable(feature = "new_uninit", since = "1.82.0")]
924    #[inline]
925    pub unsafe fn assume_init(self) -> Box<T, A> {
926        let (raw, alloc) = Box::into_raw_with_allocator(self);
927        unsafe { Box::from_raw_in(raw as *mut T, alloc) }
928    }
929
930    /// Writes the value and converts to `Box<T, A>`.
931    ///
932    /// This method converts the box similarly to [`Box::assume_init`] but
    /// writes `value` into it before the conversion, thus guaranteeing safety.
    /// In some scenarios use of this method may improve performance because
    /// the compiler may be able to optimize copying from the stack.
936    ///
937    /// # Examples
938    ///
939    /// ```
940    /// let big_box = Box::<[usize; 1024]>::new_uninit();
941    ///
942    /// let mut array = [0; 1024];
943    /// for (i, place) in array.iter_mut().enumerate() {
944    ///     *place = i;
945    /// }
946    ///
    /// // The optimizer may be able to elide this copy, so the previous code writes
    /// // directly to the heap.
949    /// let big_box = Box::write(big_box, array);
950    ///
951    /// for (i, x) in big_box.iter().enumerate() {
952    ///     assert_eq!(*x, i);
953    /// }
954    /// ```
955    #[stable(feature = "box_uninit_write", since = "1.87.0")]
956    #[inline]
957    pub fn write(mut boxed: Self, value: T) -> Box<T, A> {
958        unsafe {
959            (*boxed).write(value);
960            boxed.assume_init()
961        }
962    }
963}
964
965impl<T, A: Allocator> Box<[mem::MaybeUninit<T>], A> {
966    /// Converts to `Box<[T], A>`.
967    ///
968    /// # Safety
969    ///
970    /// As with [`MaybeUninit::assume_init`],
971    /// it is up to the caller to guarantee that the values
972    /// really are in an initialized state.
973    /// Calling this when the content is not yet fully initialized
974    /// causes immediate undefined behavior.
975    ///
976    /// [`MaybeUninit::assume_init`]: mem::MaybeUninit::assume_init
977    ///
978    /// # Examples
979    ///
980    /// ```
981    /// let mut values = Box::<[u32]>::new_uninit_slice(3);
982    /// // Deferred initialization:
983    /// values[0].write(1);
984    /// values[1].write(2);
985    /// values[2].write(3);
986    /// let values = unsafe { values.assume_init() };
987    ///
988    /// assert_eq!(*values, [1, 2, 3])
989    /// ```
990    #[stable(feature = "new_uninit", since = "1.82.0")]
991    #[inline]
992    pub unsafe fn assume_init(self) -> Box<[T], A> {
993        let (raw, alloc) = Box::into_raw_with_allocator(self);
994        unsafe { Box::from_raw_in(raw as *mut [T], alloc) }
995    }
996}
997
998impl<T: ?Sized> Box<T> {
999    /// Constructs a box from a raw pointer.
1000    ///
1001    /// After calling this function, the raw pointer is owned by the
1002    /// resulting `Box`. Specifically, the `Box` destructor will call
1003    /// the destructor of `T` and free the allocated memory. For this
1004    /// to be safe, the memory must have been allocated in accordance
    /// with the [memory layout] used by `Box`.
1006    ///
1007    /// # Safety
1008    ///
1009    /// This function is unsafe because improper use may lead to
1010    /// memory problems. For example, a double-free may occur if the
1011    /// function is called twice on the same raw pointer.
1012    ///
1013    /// The raw pointer must point to a block of memory allocated by the global allocator.
1014    ///
1015    /// The safety conditions are described in the [memory layout] section.
1016    ///
1017    /// # Examples
1018    ///
1019    /// Recreate a `Box` which was previously converted to a raw pointer
1020    /// using [`Box::into_raw`]:
1021    /// ```
1022    /// let x = Box::new(5);
1023    /// let ptr = Box::into_raw(x);
1024    /// let x = unsafe { Box::from_raw(ptr) };
1025    /// ```
1026    /// Manually create a `Box` from scratch by using the global allocator:
1027    /// ```
1028    /// use std::alloc::{alloc, Layout};
1029    ///
1030    /// unsafe {
1031    ///     let ptr = alloc(Layout::new::<i32>()) as *mut i32;
1032    ///     // In general .write is required to avoid attempting to destruct
1033    ///     // the (uninitialized) previous contents of `ptr`, though for this
1034    ///     // simple example `*ptr = 5` would have worked as well.
1035    ///     ptr.write(5);
1036    ///     let x = Box::from_raw(ptr);
1037    /// }
1038    /// ```
1039    ///
1040    /// [memory layout]: self#memory-layout
1041    #[stable(feature = "box_raw", since = "1.4.0")]
1042    #[inline]
1043    #[must_use = "call `drop(Box::from_raw(ptr))` if you intend to drop the `Box`"]
1044    pub unsafe fn from_raw(raw: *mut T) -> Self {
1045        unsafe { Self::from_raw_in(raw, Global) }
1046    }
1047
1048    /// Constructs a box from a `NonNull` pointer.
1049    ///
1050    /// After calling this function, the `NonNull` pointer is owned by
1051    /// the resulting `Box`. Specifically, the `Box` destructor will call
1052    /// the destructor of `T` and free the allocated memory. For this
1053    /// to be safe, the memory must have been allocated in accordance
    /// with the [memory layout] used by `Box`.
1055    ///
1056    /// # Safety
1057    ///
1058    /// This function is unsafe because improper use may lead to
1059    /// memory problems. For example, a double-free may occur if the
1060    /// function is called twice on the same `NonNull` pointer.
1061    ///
1062    /// The non-null pointer must point to a block of memory allocated by the global allocator.
1063    ///
1064    /// The safety conditions are described in the [memory layout] section.
1065    ///
1066    /// # Examples
1067    ///
1068    /// Recreate a `Box` which was previously converted to a `NonNull`
1069    /// pointer using [`Box::into_non_null`]:
1070    /// ```
1071    /// #![feature(box_vec_non_null)]
1072    ///
1073    /// let x = Box::new(5);
1074    /// let non_null = Box::into_non_null(x);
1075    /// let x = unsafe { Box::from_non_null(non_null) };
1076    /// ```
1077    /// Manually create a `Box` from scratch by using the global allocator:
1078    /// ```
1079    /// #![feature(box_vec_non_null)]
1080    ///
1081    /// use std::alloc::{alloc, Layout};
1082    /// use std::ptr::NonNull;
1083    ///
1084    /// unsafe {
1085    ///     let non_null = NonNull::new(alloc(Layout::new::<i32>()).cast::<i32>())
1086    ///         .expect("allocation failed");
1087    ///     // In general .write is required to avoid attempting to destruct
1088    ///     // the (uninitialized) previous contents of `non_null`.
1089    ///     non_null.write(5);
1090    ///     let x = Box::from_non_null(non_null);
1091    /// }
1092    /// ```
1093    ///
1094    /// [memory layout]: self#memory-layout
1095    #[unstable(feature = "box_vec_non_null", reason = "new API", issue = "130364")]
1096    #[inline]
1097    #[must_use = "call `drop(Box::from_non_null(ptr))` if you intend to drop the `Box`"]
1098    pub unsafe fn from_non_null(ptr: NonNull<T>) -> Self {
1099        unsafe { Self::from_raw(ptr.as_ptr()) }
1100    }
1101}
1102
1103impl<T: ?Sized, A: Allocator> Box<T, A> {
1104    /// Constructs a box from a raw pointer in the given allocator.
1105    ///
1106    /// After calling this function, the raw pointer is owned by the
1107    /// resulting `Box`. Specifically, the `Box` destructor will call
1108    /// the destructor of `T` and free the allocated memory. For this
1109    /// to be safe, the memory must have been allocated in accordance
    /// with the [memory layout] used by `Box`.
1111    ///
1112    /// # Safety
1113    ///
1114    /// This function is unsafe because improper use may lead to
1115    /// memory problems. For example, a double-free may occur if the
1116    /// function is called twice on the same raw pointer.
1117    ///
1118    /// The raw pointer must point to a block of memory allocated by `alloc`.
1119    ///
1120    /// # Examples
1121    ///
1122    /// Recreate a `Box` which was previously converted to a raw pointer
1123    /// using [`Box::into_raw_with_allocator`]:
1124    /// ```
1125    /// #![feature(allocator_api)]
1126    ///
1127    /// use std::alloc::System;
1128    ///
1129    /// let x = Box::new_in(5, System);
1130    /// let (ptr, alloc) = Box::into_raw_with_allocator(x);
1131    /// let x = unsafe { Box::from_raw_in(ptr, alloc) };
1132    /// ```
1133    /// Manually create a `Box` from scratch by using the system allocator:
1134    /// ```
1135    /// #![feature(allocator_api, slice_ptr_get)]
1136    ///
1137    /// use std::alloc::{Allocator, Layout, System};
1138    ///
1139    /// unsafe {
1140    ///     let ptr = System.allocate(Layout::new::<i32>())?.as_mut_ptr() as *mut i32;
1141    ///     // In general .write is required to avoid attempting to destruct
1142    ///     // the (uninitialized) previous contents of `ptr`, though for this
1143    ///     // simple example `*ptr = 5` would have worked as well.
1144    ///     ptr.write(5);
1145    ///     let x = Box::from_raw_in(ptr, System);
1146    /// }
1147    /// # Ok::<(), std::alloc::AllocError>(())
1148    /// ```
1149    ///
1150    /// [memory layout]: self#memory-layout
1151    #[unstable(feature = "allocator_api", issue = "32838")]
1152    #[inline]
1153    pub unsafe fn from_raw_in(raw: *mut T, alloc: A) -> Self {
1154        Box(unsafe { Unique::new_unchecked(raw) }, alloc)
1155    }
1156
1157    /// Constructs a box from a `NonNull` pointer in the given allocator.
1158    ///
1159    /// After calling this function, the `NonNull` pointer is owned by
1160    /// the resulting `Box`. Specifically, the `Box` destructor will call
1161    /// the destructor of `T` and free the allocated memory. For this
1162    /// to be safe, the memory must have been allocated in accordance
    /// with the [memory layout] used by `Box`.
1164    ///
1165    /// # Safety
1166    ///
1167    /// This function is unsafe because improper use may lead to
1168    /// memory problems. For example, a double-free may occur if the
    /// function is called twice on the same `NonNull` pointer.
1170    ///
1171    /// The non-null pointer must point to a block of memory allocated by `alloc`.
1172    ///
1173    /// # Examples
1174    ///
1175    /// Recreate a `Box` which was previously converted to a `NonNull` pointer
1176    /// using [`Box::into_non_null_with_allocator`]:
1177    /// ```
1178    /// #![feature(allocator_api, box_vec_non_null)]
1179    ///
1180    /// use std::alloc::System;
1181    ///
1182    /// let x = Box::new_in(5, System);
1183    /// let (non_null, alloc) = Box::into_non_null_with_allocator(x);
1184    /// let x = unsafe { Box::from_non_null_in(non_null, alloc) };
1185    /// ```
1186    /// Manually create a `Box` from scratch by using the system allocator:
1187    /// ```
1188    /// #![feature(allocator_api, box_vec_non_null, slice_ptr_get)]
1189    ///
1190    /// use std::alloc::{Allocator, Layout, System};
1191    ///
1192    /// unsafe {
1193    ///     let non_null = System.allocate(Layout::new::<i32>())?.cast::<i32>();
1194    ///     // In general .write is required to avoid attempting to destruct
1195    ///     // the (uninitialized) previous contents of `non_null`.
1196    ///     non_null.write(5);
1197    ///     let x = Box::from_non_null_in(non_null, System);
1198    /// }
1199    /// # Ok::<(), std::alloc::AllocError>(())
1200    /// ```
1201    ///
1202    /// [memory layout]: self#memory-layout
1203    #[unstable(feature = "allocator_api", issue = "32838")]
1204    // #[unstable(feature = "box_vec_non_null", reason = "new API", issue = "130364")]
1205    #[inline]
1206    pub unsafe fn from_non_null_in(raw: NonNull<T>, alloc: A) -> Self {
1207        // SAFETY: guaranteed by the caller.
1208        unsafe { Box::from_raw_in(raw.as_ptr(), alloc) }
1209    }
1210
1211    /// Consumes the `Box`, returning a wrapped raw pointer.
1212    ///
1213    /// The pointer will be properly aligned and non-null.
1214    ///
1215    /// After calling this function, the caller is responsible for the
1216    /// memory previously managed by the `Box`. In particular, the
1217    /// caller should properly destroy `T` and release the memory, taking
1218    /// into account the [memory layout] used by `Box`. The easiest way to
1219    /// do this is to convert the raw pointer back into a `Box` with the
1220    /// [`Box::from_raw`] function, allowing the `Box` destructor to perform
1221    /// the cleanup.
1222    ///
1223    /// Note: this is an associated function, which means that you have
1224    /// to call it as `Box::into_raw(b)` instead of `b.into_raw()`. This
1225    /// is so that there is no conflict with a method on the inner type.
1226    ///
1227    /// # Examples
1228    /// Converting the raw pointer back into a `Box` with [`Box::from_raw`]
1229    /// for automatic cleanup:
1230    /// ```
1231    /// let x = Box::new(String::from("Hello"));
1232    /// let ptr = Box::into_raw(x);
1233    /// let x = unsafe { Box::from_raw(ptr) };
1234    /// ```
1235    /// Manual cleanup by explicitly running the destructor and deallocating
1236    /// the memory:
1237    /// ```
1238    /// use std::alloc::{dealloc, Layout};
1239    /// use std::ptr;
1240    ///
1241    /// let x = Box::new(String::from("Hello"));
1242    /// let ptr = Box::into_raw(x);
1243    /// unsafe {
1244    ///     ptr::drop_in_place(ptr);
1245    ///     dealloc(ptr as *mut u8, Layout::new::<String>());
1246    /// }
1247    /// ```
1248    /// Note: This is equivalent to the following:
1249    /// ```
1250    /// let x = Box::new(String::from("Hello"));
1251    /// let ptr = Box::into_raw(x);
1252    /// unsafe {
1253    ///     drop(Box::from_raw(ptr));
1254    /// }
1255    /// ```
1256    ///
1257    /// [memory layout]: self#memory-layout
1258    #[must_use = "losing the pointer will leak memory"]
1259    #[stable(feature = "box_raw", since = "1.4.0")]
1260    #[inline]
1261    pub fn into_raw(b: Self) -> *mut T {
1262        // Make sure Miri realizes that we transition from a noalias pointer to a raw pointer here.
1263        unsafe { &raw mut *&mut *Self::into_raw_with_allocator(b).0 }
1264    }
1265
1266    /// Consumes the `Box`, returning a wrapped `NonNull` pointer.
1267    ///
1268    /// The pointer will be properly aligned.
1269    ///
1270    /// After calling this function, the caller is responsible for the
1271    /// memory previously managed by the `Box`. In particular, the
1272    /// caller should properly destroy `T` and release the memory, taking
1273    /// into account the [memory layout] used by `Box`. The easiest way to
1274    /// do this is to convert the `NonNull` pointer back into a `Box` with the
1275    /// [`Box::from_non_null`] function, allowing the `Box` destructor to
1276    /// perform the cleanup.
1277    ///
1278    /// Note: this is an associated function, which means that you have
1279    /// to call it as `Box::into_non_null(b)` instead of `b.into_non_null()`.
1280    /// This is so that there is no conflict with a method on the inner type.
1281    ///
1282    /// # Examples
1283    /// Converting the `NonNull` pointer back into a `Box` with [`Box::from_non_null`]
1284    /// for automatic cleanup:
1285    /// ```
1286    /// #![feature(box_vec_non_null)]
1287    ///
1288    /// let x = Box::new(String::from("Hello"));
1289    /// let non_null = Box::into_non_null(x);
1290    /// let x = unsafe { Box::from_non_null(non_null) };
1291    /// ```
1292    /// Manual cleanup by explicitly running the destructor and deallocating
1293    /// the memory:
1294    /// ```
1295    /// #![feature(box_vec_non_null)]
1296    ///
1297    /// use std::alloc::{dealloc, Layout};
1298    ///
1299    /// let x = Box::new(String::from("Hello"));
1300    /// let non_null = Box::into_non_null(x);
1301    /// unsafe {
1302    ///     non_null.drop_in_place();
1303    ///     dealloc(non_null.as_ptr().cast::<u8>(), Layout::new::<String>());
1304    /// }
1305    /// ```
1306    /// Note: This is equivalent to the following:
1307    /// ```
1308    /// #![feature(box_vec_non_null)]
1309    ///
1310    /// let x = Box::new(String::from("Hello"));
1311    /// let non_null = Box::into_non_null(x);
1312    /// unsafe {
1313    ///     drop(Box::from_non_null(non_null));
1314    /// }
1315    /// ```
1316    ///
1317    /// [memory layout]: self#memory-layout
1318    #[must_use = "losing the pointer will leak memory"]
1319    #[unstable(feature = "box_vec_non_null", reason = "new API", issue = "130364")]
1320    #[inline]
1321    pub fn into_non_null(b: Self) -> NonNull<T> {
1322        // SAFETY: `Box` is guaranteed to be non-null.
1323        unsafe { NonNull::new_unchecked(Self::into_raw(b)) }
1324    }
1325
1326    /// Consumes the `Box`, returning a wrapped raw pointer and the allocator.
1327    ///
1328    /// The pointer will be properly aligned and non-null.
1329    ///
1330    /// After calling this function, the caller is responsible for the
1331    /// memory previously managed by the `Box`. In particular, the
1332    /// caller should properly destroy `T` and release the memory, taking
1333    /// into account the [memory layout] used by `Box`. The easiest way to
1334    /// do this is to convert the raw pointer back into a `Box` with the
1335    /// [`Box::from_raw_in`] function, allowing the `Box` destructor to perform
1336    /// the cleanup.
1337    ///
1338    /// Note: this is an associated function, which means that you have
1339    /// to call it as `Box::into_raw_with_allocator(b)` instead of `b.into_raw_with_allocator()`. This
1340    /// is so that there is no conflict with a method on the inner type.
1341    ///
1342    /// # Examples
1343    /// Converting the raw pointer back into a `Box` with [`Box::from_raw_in`]
1344    /// for automatic cleanup:
1345    /// ```
1346    /// #![feature(allocator_api)]
1347    ///
1348    /// use std::alloc::System;
1349    ///
1350    /// let x = Box::new_in(String::from("Hello"), System);
1351    /// let (ptr, alloc) = Box::into_raw_with_allocator(x);
1352    /// let x = unsafe { Box::from_raw_in(ptr, alloc) };
1353    /// ```
1354    /// Manual cleanup by explicitly running the destructor and deallocating
1355    /// the memory:
1356    /// ```
1357    /// #![feature(allocator_api)]
1358    ///
1359    /// use std::alloc::{Allocator, Layout, System};
1360    /// use std::ptr::{self, NonNull};
1361    ///
1362    /// let x = Box::new_in(String::from("Hello"), System);
1363    /// let (ptr, alloc) = Box::into_raw_with_allocator(x);
1364    /// unsafe {
1365    ///     ptr::drop_in_place(ptr);
1366    ///     let non_null = NonNull::new_unchecked(ptr);
1367    ///     alloc.deallocate(non_null.cast(), Layout::new::<String>());
1368    /// }
1369    /// ```
1370    ///
1371    /// [memory layout]: self#memory-layout
1372    #[must_use = "losing the pointer will leak memory"]
1373    #[unstable(feature = "allocator_api", issue = "32838")]
1374    #[inline]
1375    pub fn into_raw_with_allocator(b: Self) -> (*mut T, A) {
1376        let mut b = mem::ManuallyDrop::new(b);
1377        // We carefully get the raw pointer out in a way that Miri's aliasing model understands what
1378        // is happening: using the primitive "deref" of `Box`. In case `A` is *not* `Global`, we
1379        // want *no* aliasing requirements here!
1380        // In case `A` *is* `Global`, this does not quite have the right behavior; `into_raw`
1381        // works around that.
1382        let ptr = &raw mut **b;
1383        let alloc = unsafe { ptr::read(&b.1) };
1384        (ptr, alloc)
1385    }
1386
1387    /// Consumes the `Box`, returning a wrapped `NonNull` pointer and the allocator.
1388    ///
1389    /// The pointer will be properly aligned.
1390    ///
1391    /// After calling this function, the caller is responsible for the
1392    /// memory previously managed by the `Box`. In particular, the
1393    /// caller should properly destroy `T` and release the memory, taking
1394    /// into account the [memory layout] used by `Box`. The easiest way to
1395    /// do this is to convert the `NonNull` pointer back into a `Box` with the
1396    /// [`Box::from_non_null_in`] function, allowing the `Box` destructor to
1397    /// perform the cleanup.
1398    ///
1399    /// Note: this is an associated function, which means that you have
1400    /// to call it as `Box::into_non_null_with_allocator(b)` instead of
1401    /// `b.into_non_null_with_allocator()`. This is so that there is no
1402    /// conflict with a method on the inner type.
1403    ///
1404    /// # Examples
1405    /// Converting the `NonNull` pointer back into a `Box` with
1406    /// [`Box::from_non_null_in`] for automatic cleanup:
1407    /// ```
1408    /// #![feature(allocator_api, box_vec_non_null)]
1409    ///
1410    /// use std::alloc::System;
1411    ///
1412    /// let x = Box::new_in(String::from("Hello"), System);
1413    /// let (non_null, alloc) = Box::into_non_null_with_allocator(x);
1414    /// let x = unsafe { Box::from_non_null_in(non_null, alloc) };
1415    /// ```
1416    /// Manual cleanup by explicitly running the destructor and deallocating
1417    /// the memory:
1418    /// ```
1419    /// #![feature(allocator_api, box_vec_non_null)]
1420    ///
1421    /// use std::alloc::{Allocator, Layout, System};
1422    ///
1423    /// let x = Box::new_in(String::from("Hello"), System);
1424    /// let (non_null, alloc) = Box::into_non_null_with_allocator(x);
1425    /// unsafe {
1426    ///     non_null.drop_in_place();
1427    ///     alloc.deallocate(non_null.cast::<u8>(), Layout::new::<String>());
1428    /// }
1429    /// ```
1430    ///
1431    /// [memory layout]: self#memory-layout
1432    #[must_use = "losing the pointer will leak memory"]
1433    #[unstable(feature = "allocator_api", issue = "32838")]
1434    // #[unstable(feature = "box_vec_non_null", reason = "new API", issue = "130364")]
1435    #[inline]
1436    pub fn into_non_null_with_allocator(b: Self) -> (NonNull<T>, A) {
1437        let (ptr, alloc) = Box::into_raw_with_allocator(b);
1438        // SAFETY: `Box` is guaranteed to be non-null.
1439        unsafe { (NonNull::new_unchecked(ptr), alloc) }
1440    }
1441
1442    #[unstable(
1443        feature = "ptr_internals",
1444        issue = "none",
1445        reason = "use `Box::leak(b).into()` or `Unique::from(Box::leak(b))` instead"
1446    )]
1447    #[inline]
1448    #[doc(hidden)]
1449    pub fn into_unique(b: Self) -> (Unique<T>, A) {
1450        let (ptr, alloc) = Box::into_raw_with_allocator(b);
1451        unsafe { (Unique::from(&mut *ptr), alloc) }
1452    }
1453
1454    /// Returns a raw mutable pointer to the `Box`'s contents.
1455    ///
1456    /// The caller must ensure that the `Box` outlives the pointer this
1457    /// function returns, or else it will end up dangling.
1458    ///
1459    /// This method guarantees that, for the purpose of the aliasing model, it does not
1460    /// materialize a reference to the underlying memory, and thus the returned pointer
1461    /// will remain valid when mixed with other calls to [`as_ptr`] and [`as_mut_ptr`].
1462    /// Note that calling other methods that materialize references to the memory
1463    /// may still invalidate this pointer.
1464    /// See the example below for how this guarantee can be used.
1465    ///
1466    /// # Examples
1467    ///
1468    /// Due to the aliasing guarantee, the following code is legal:
1469    ///
1470    /// ```rust
1471    /// #![feature(box_as_ptr)]
1472    ///
1473    /// unsafe {
1474    ///     let mut b = Box::new(0);
1475    ///     let ptr1 = Box::as_mut_ptr(&mut b);
1476    ///     ptr1.write(1);
1477    ///     let ptr2 = Box::as_mut_ptr(&mut b);
1478    ///     ptr2.write(2);
1479    ///     // Notably, the write to `ptr2` did *not* invalidate `ptr1`:
1480    ///     ptr1.write(3);
1481    /// }
1482    /// ```
1483    ///
1484    /// [`as_mut_ptr`]: Self::as_mut_ptr
1485    /// [`as_ptr`]: Self::as_ptr
1486    #[unstable(feature = "box_as_ptr", issue = "129090")]
1487    #[rustc_never_returns_null_ptr]
1488    #[rustc_as_ptr]
1489    #[inline]
1490    pub fn as_mut_ptr(b: &mut Self) -> *mut T {
1491        // This is a primitive deref, not going through `DerefMut`, and therefore not materializing
1492        // any references.
1493        &raw mut **b
1494    }
1495
1496    /// Returns a raw pointer to the `Box`'s contents.
1497    ///
1498    /// The caller must ensure that the `Box` outlives the pointer this
1499    /// function returns, or else it will end up dangling.
1500    ///
1501    /// The caller must also ensure that the memory the pointer (non-transitively) points to
1502    /// is never written to (except inside an `UnsafeCell`) using this pointer or any pointer
1503    /// derived from it. If you need to mutate the contents of the `Box`, use [`as_mut_ptr`].
1504    ///
1505    /// This method guarantees that, for the purpose of the aliasing model, it does not
1506    /// materialize a reference to the underlying memory, and thus the returned pointer
1507    /// will remain valid when mixed with other calls to [`as_ptr`] and [`as_mut_ptr`].
1508    /// Note that calling other methods that materialize mutable references to the memory,
1509    /// as well as writing to this memory, may still invalidate this pointer.
1510    /// See the example below for how this guarantee can be used.
1511    ///
1512    /// # Examples
1513    ///
1514    /// Due to the aliasing guarantee, the following code is legal:
1515    ///
1516    /// ```rust
1517    /// #![feature(box_as_ptr)]
1518    ///
1519    /// unsafe {
1520    ///     let mut v = Box::new(0);
1521    ///     let ptr1 = Box::as_ptr(&v);
1522    ///     let ptr2 = Box::as_mut_ptr(&mut v);
1523    ///     let _val = ptr2.read();
1524    ///     // No write to this memory has happened yet, so `ptr1` is still valid.
1525    ///     let _val = ptr1.read();
1526    ///     // However, once we do a write...
1527    ///     ptr2.write(1);
1528    ///     // ... `ptr1` is no longer valid.
1529    ///     // This would be UB: let _val = ptr1.read();
1530    /// }
1531    /// ```
1532    ///
1533    /// [`as_mut_ptr`]: Self::as_mut_ptr
1534    /// [`as_ptr`]: Self::as_ptr
1535    #[unstable(feature = "box_as_ptr", issue = "129090")]
1536    #[rustc_never_returns_null_ptr]
1537    #[rustc_as_ptr]
1538    #[inline]
1539    pub fn as_ptr(b: &Self) -> *const T {
1540        // This is a primitive deref, not going through `Deref`, and therefore not materializing
1541        // any references.
1542        &raw const **b
1543    }
1544
1545    /// Returns a reference to the underlying allocator.
1546    ///
1547    /// Note: this is an associated function, which means that you have
1548    /// to call it as `Box::allocator(&b)` instead of `b.allocator()`. This
1549    /// is so that there is no conflict with a method on the inner type.
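    ///
    /// # Examples
    ///
    /// A minimal sketch, assuming the unstable `allocator_api` feature and the
    /// `System` allocator:
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let b = Box::new_in(5, System);
    /// let _alloc: &System = Box::allocator(&b);
    /// ```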
1550    #[unstable(feature = "allocator_api", issue = "32838")]
1551    #[inline]
1552    pub fn allocator(b: &Self) -> &A {
1553        &b.1
1554    }
1555
1556    /// Consumes and leaks the `Box`, returning a mutable reference,
1557    /// `&'a mut T`.
1558    ///
1559    /// Note that the type `T` must outlive the chosen lifetime `'a`. If the type
1560    /// has only static references, or none at all, then this may be chosen to be
1561    /// `'static`.
1562    ///
1563    /// This function is mainly useful for data that lives for the remainder of
1564    /// the program's life. Dropping the returned reference will cause a memory
1565    /// leak. If this is not acceptable, the reference should first be wrapped
1566    /// with the [`Box::from_raw`] function producing a `Box`. This `Box` can
1567    /// then be dropped which will properly destroy `T` and release the
1568    /// allocated memory.
1569    ///
1570    /// Note: this is an associated function, which means that you have
1571    /// to call it as `Box::leak(b)` instead of `b.leak()`. This
1572    /// is so that there is no conflict with a method on the inner type.
1573    ///
1574    /// # Examples
1575    ///
1576    /// Simple usage:
1577    ///
1578    /// ```
1579    /// let x = Box::new(41);
1580    /// let static_ref: &'static mut usize = Box::leak(x);
1581    /// *static_ref += 1;
1582    /// assert_eq!(*static_ref, 42);
1583    /// # // FIXME(https://github.com/rust-lang/miri/issues/3670):
1584    /// # // use -Zmiri-disable-leak-check instead of unleaking in tests meant to leak.
1585    /// # drop(unsafe { Box::from_raw(static_ref) });
1586    /// ```
1587    ///
1588    /// Unsized data:
1589    ///
1590    /// ```
1591    /// let x = vec![1, 2, 3].into_boxed_slice();
1592    /// let static_ref = Box::leak(x);
1593    /// static_ref[0] = 4;
1594    /// assert_eq!(*static_ref, [4, 2, 3]);
1595    /// # // FIXME(https://github.com/rust-lang/miri/issues/3670):
1596    /// # // use -Zmiri-disable-leak-check instead of unleaking in tests meant to leak.
1597    /// # drop(unsafe { Box::from_raw(static_ref) });
1598    /// ```
1599    #[stable(feature = "box_leak", since = "1.26.0")]
1600    #[inline]
1601    pub fn leak<'a>(b: Self) -> &'a mut T
1602    where
1603        A: 'a,
1604    {
1605        unsafe { &mut *Box::into_raw(b) }
1606    }
1607
1608    /// Converts a `Box<T>` into a `Pin<Box<T>>`. If `T` does not implement [`Unpin`], then
1609    /// `*boxed` will be pinned in memory and unable to be moved.
1610    ///
1611    /// This conversion does not allocate on the heap and happens in place.
1612    ///
1613    /// This is also available via [`From`].
1614    ///
1615    /// Constructing and pinning a `Box` with <code>Box::into_pin([Box::new]\(x))</code>
1616    /// can also be written more concisely using <code>[Box::pin]\(x)</code>.
1617    /// This `into_pin` method is useful if you already have a `Box<T>`, or you are
1618    /// constructing a (pinned) `Box` in a different way than with [`Box::new`].
1619    ///
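    /// # Examples
    ///
    /// A minimal sketch of pinning an already-constructed `Box`:
    ///
    /// ```
    /// use std::pin::Pin;
    ///
    /// let boxed: Box<u32> = Box::new(5);
    /// let pinned: Pin<Box<u32>> = Box::into_pin(boxed);
    /// assert_eq!(*pinned, 5);
    /// ```
    ///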
1620    /// # Notes
1621    ///
1622    /// It's not recommended that crates add an impl like `From<Box<T>> for Pin<T>`,
1623    /// as it'll introduce an ambiguity when calling `Pin::from`.
1624    /// A demonstration of such a poor impl is shown below.
1625    ///
1626    /// ```compile_fail
1627    /// # use std::pin::Pin;
1628    /// struct Foo; // A type defined in this crate.
1629    /// impl From<Box<()>> for Pin<Foo> {
1630    ///     fn from(_: Box<()>) -> Pin<Foo> {
1631    ///         Pin::new(Foo)
1632    ///     }
1633    /// }
1634    ///
1635    /// let foo = Box::new(());
1636    /// let bar = Pin::from(foo);
1637    /// ```
1638    #[stable(feature = "box_into_pin", since = "1.63.0")]
1639    pub fn into_pin(boxed: Self) -> Pin<Self>
1640    where
1641        A: 'static,
1642    {
1643        // It's not possible to move or replace the insides of a `Pin<Box<T>>`
1644        // when `T: !Unpin`, so it's safe to pin it directly without any
1645        // additional requirements.
1646        unsafe { Pin::new_unchecked(boxed) }
1647    }
1648}
1649
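// `#[may_dangle]` asserts to the drop checker that this destructor never reads or writes
// data of type `T` (it only releases the allocation; the contents are dropped by the
// compiler-generated drop glue), so a `Box<T>` may be dropped even when `T` contains
// references whose lifetimes end at the same point the box does.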
1650#[stable(feature = "rust1", since = "1.0.0")]
1651unsafe impl<#[may_dangle] T: ?Sized, A: Allocator> Drop for Box<T, A> {
1652    #[inline]
1653    fn drop(&mut self) {
1654        // the T in the Box is dropped by the compiler before the destructor is run
1655
1656        let ptr = self.0;
1657
1658        unsafe {
1659            let layout = Layout::for_value_raw(ptr.as_ptr());
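            // Zero-sized layouts were never actually allocated (see the "Memory layout"
            // section of the module docs), so there is nothing to deallocate.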
1660            if layout.size() != 0 {
1661                self.1.deallocate(From::from(ptr.cast()), layout);
1662            }
1663        }
1664    }
1665}
1666
1667#[cfg(not(no_global_oom_handling))]
1668#[stable(feature = "rust1", since = "1.0.0")]
1669impl<T: Default> Default for Box<T> {
1670    /// Creates a `Box<T>` with the `Default` value for `T`.
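    ///
    /// # Examples
    ///
    /// A minimal sketch:
    ///
    /// ```
    /// let b: Box<i32> = Box::default();
    /// assert_eq!(*b, 0);
    /// ```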
1671    #[inline]
1672    fn default() -> Self {
1673        let mut x: Box<mem::MaybeUninit<T>> = Box::new_uninit();
1674        unsafe {
1675            // SAFETY: `x` is valid for writing and has the same layout as `T`.
1676            // If `T::default()` panics, dropping `x` will just deallocate the Box as `MaybeUninit<T>`
1677            // does not have a destructor.
1678            //
1679            // We use `ptr::write` as `MaybeUninit::write` creates
1680            // extra stack copies of `T` in debug mode.
1681            //
1682            // See https://github.com/rust-lang/rust/issues/136043 for more context.
1683            ptr::write(&raw mut *x as *mut T, T::default());
1684            // SAFETY: `x` was just initialized above.
1685            x.assume_init()
1686        }
1687    }
1688}
1689
1690#[cfg(not(no_global_oom_handling))]
1691#[stable(feature = "rust1", since = "1.0.0")]
1692impl<T> Default for Box<[T]> {
1693    #[inline]
1694    fn default() -> Self {
1695        let ptr: Unique<[T]> = Unique::<[T; 0]>::dangling();
1696        Box(ptr, Global)
1697    }
1698}
1699
1700#[cfg(not(no_global_oom_handling))]
1701#[stable(feature = "default_box_extra", since = "1.17.0")]
1702impl Default for Box<str> {
1703    #[inline]
1704    fn default() -> Self {
1705        // SAFETY: This is the same as `Unique::cast<U>` but with an unsized `U = str`.
1706        let ptr: Unique<str> = unsafe {
1707            let bytes: Unique<[u8]> = Unique::<[u8; 0]>::dangling();
1708            Unique::new_unchecked(bytes.as_ptr() as *mut str)
1709        };
1710        Box(ptr, Global)
1711    }
1712}
1713
1714#[cfg(not(no_global_oom_handling))]
1715#[stable(feature = "rust1", since = "1.0.0")]
1716impl<T: Clone, A: Allocator + Clone> Clone for Box<T, A> {
1717    /// Returns a new box with a `clone()` of this box's contents.
1718    ///
1719    /// # Examples
1720    ///
1721    /// ```
1722    /// let x = Box::new(5);
1723    /// let y = x.clone();
1724    ///
1725    /// // The value is the same
1726    /// assert_eq!(x, y);
1727    ///
1728    /// // But they are unique objects
1729    /// assert_ne!(&*x as *const i32, &*y as *const i32);
1730    /// ```
1731    #[inline]
1732    fn clone(&self) -> Self {
1733        // Pre-allocate memory to allow writing the cloned value directly.
1734        let mut boxed = Self::new_uninit_in(self.1.clone());
1735        unsafe {
1736            (**self).clone_to_uninit(boxed.as_mut_ptr().cast());
1737            boxed.assume_init()
1738        }
1739    }
1740
1741    /// Copies `source`'s contents into `self` without creating a new allocation.
1742    ///
1743    /// # Examples
1744    ///
1745    /// ```
1746    /// let x = Box::new(5);
1747    /// let mut y = Box::new(10);
1748    /// let yp: *const i32 = &*y;
1749    ///
1750    /// y.clone_from(&x);
1751    ///
1752    /// // The value is the same
1753    /// assert_eq!(x, y);
1754    ///
1755    /// // And no allocation occurred
1756    /// assert_eq!(yp, &*y);
1757    /// ```
1758    #[inline]
1759    fn clone_from(&mut self, source: &Self) {
1760        (**self).clone_from(&(**source));
1761    }
1762}
1763
1764#[cfg(not(no_global_oom_handling))]
1765#[stable(feature = "box_slice_clone", since = "1.3.0")]
1766impl<T: Clone, A: Allocator + Clone> Clone for Box<[T], A> {
1767    fn clone(&self) -> Self {
1768        let alloc = Box::allocator(self).clone();
1769        self.to_vec_in(alloc).into_boxed_slice()
1770    }
1771
1772    /// Copies `source`'s contents into `self` without creating a new allocation,
1773    /// so long as the two are of the same length.
1774    ///
1775    /// # Examples
1776    ///
1777    /// ```
1778    /// let x = Box::new([5, 6, 7]);
1779    /// let mut y = Box::new([8, 9, 10]);
1780    /// let yp: *const [i32] = &*y;
1781    ///
1782    /// y.clone_from(&x);
1783    ///
1784    /// // The value is the same
1785    /// assert_eq!(x, y);
1786    ///
1787    /// // And no allocation occurred
1788    /// assert_eq!(yp, &*y);
1789    /// ```
1790    fn clone_from(&mut self, source: &Self) {
1791        if self.len() == source.len() {
1792            self.clone_from_slice(&source);
1793        } else {
1794            *self = source.clone();
1795        }
1796    }
1797}
1798
1799#[cfg(not(no_global_oom_handling))]
1800#[stable(feature = "box_slice_clone", since = "1.3.0")]
1801impl Clone for Box<str> {
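    /// Returns a new boxed `str` with a copy of this box's contents.
    ///
    /// # Examples
    ///
    /// A minimal sketch:
    ///
    /// ```
    /// let s: Box<str> = "hello".into();
    /// let s2 = s.clone();
    /// assert_eq!(s, s2);
    /// ```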
1802    fn clone(&self) -> Self {
1803        // this makes a copy of the data
1804        let buf: Box<[u8]> = self.as_bytes().into();
1805        unsafe { from_boxed_utf8_unchecked(buf) }
1806    }
1807}
1808
1809#[stable(feature = "rust1", since = "1.0.0")]
1810impl<T: ?Sized + PartialEq, A: Allocator> PartialEq for Box<T, A> {
1811    #[inline]
1812    fn eq(&self, other: &Self) -> bool {
1813        PartialEq::eq(&**self, &**other)
1814    }
1815    #[inline]
1816    fn ne(&self, other: &Self) -> bool {
1817        PartialEq::ne(&**self, &**other)
1818    }
1819}
1820
1821#[stable(feature = "rust1", since = "1.0.0")]
1822impl<T: ?Sized + PartialOrd, A: Allocator> PartialOrd for Box<T, A> {
1823    #[inline]
1824    fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
1825        PartialOrd::partial_cmp(&**self, &**other)
1826    }
1827    #[inline]
1828    fn lt(&self, other: &Self) -> bool {
1829        PartialOrd::lt(&**self, &**other)
1830    }
1831    #[inline]
1832    fn le(&self, other: &Self) -> bool {
1833        PartialOrd::le(&**self, &**other)
1834    }
1835    #[inline]
1836    fn ge(&self, other: &Self) -> bool {
1837        PartialOrd::ge(&**self, &**other)
1838    }
1839    #[inline]
1840    fn gt(&self, other: &Self) -> bool {
1841        PartialOrd::gt(&**self, &**other)
1842    }
1843}
1844
1845#[stable(feature = "rust1", since = "1.0.0")]
1846impl<T: ?Sized + Ord, A: Allocator> Ord for Box<T, A> {
1847    #[inline]
1848    fn cmp(&self, other: &Self) -> Ordering {
1849        Ord::cmp(&**self, &**other)
1850    }
1851}
1852
1853#[stable(feature = "rust1", since = "1.0.0")]
1854impl<T: ?Sized + Eq, A: Allocator> Eq for Box<T, A> {}
1855
1856#[stable(feature = "rust1", since = "1.0.0")]
1857impl<T: ?Sized + Hash, A: Allocator> Hash for Box<T, A> {
1858    fn hash<H: Hasher>(&self, state: &mut H) {
1859        (**self).hash(state);
1860    }
1861}
1862
1863#[stable(feature = "indirect_hasher_impl", since = "1.22.0")]
1864impl<T: ?Sized + Hasher, A: Allocator> Hasher for Box<T, A> {
1865    fn finish(&self) -> u64 {
1866        (**self).finish()
1867    }
1868    fn write(&mut self, bytes: &[u8]) {
1869        (**self).write(bytes)
1870    }
1871    fn write_u8(&mut self, i: u8) {
1872        (**self).write_u8(i)
1873    }
1874    fn write_u16(&mut self, i: u16) {
1875        (**self).write_u16(i)
1876    }
1877    fn write_u32(&mut self, i: u32) {
1878        (**self).write_u32(i)
1879    }
1880    fn write_u64(&mut self, i: u64) {
1881        (**self).write_u64(i)
1882    }
1883    fn write_u128(&mut self, i: u128) {
1884        (**self).write_u128(i)
1885    }
1886    fn write_usize(&mut self, i: usize) {
1887        (**self).write_usize(i)
1888    }
1889    fn write_i8(&mut self, i: i8) {
1890        (**self).write_i8(i)
1891    }
1892    fn write_i16(&mut self, i: i16) {
1893        (**self).write_i16(i)
1894    }
1895    fn write_i32(&mut self, i: i32) {
1896        (**self).write_i32(i)
1897    }
1898    fn write_i64(&mut self, i: i64) {
1899        (**self).write_i64(i)
1900    }
1901    fn write_i128(&mut self, i: i128) {
1902        (**self).write_i128(i)
1903    }
1904    fn write_isize(&mut self, i: isize) {
1905        (**self).write_isize(i)
1906    }
1907    fn write_length_prefix(&mut self, len: usize) {
1908        (**self).write_length_prefix(len)
1909    }
1910    fn write_str(&mut self, s: &str) {
1911        (**self).write_str(s)
1912    }
1913}
1914
1915#[stable(feature = "rust1", since = "1.0.0")]
1916impl<T: fmt::Display + ?Sized, A: Allocator> fmt::Display for Box<T, A> {
1917    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
1918        fmt::Display::fmt(&**self, f)
1919    }
1920}
1921
1922#[stable(feature = "rust1", since = "1.0.0")]
1923impl<T: fmt::Debug + ?Sized, A: Allocator> fmt::Debug for Box<T, A> {
1924    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
1925        fmt::Debug::fmt(&**self, f)
1926    }
1927}
1928
1929#[stable(feature = "rust1", since = "1.0.0")]
1930impl<T: ?Sized, A: Allocator> fmt::Pointer for Box<T, A> {
1931    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
1932        // It's not possible to extract the inner Unique directly from the Box;
1933        // instead we cast it to a `*const T`, which aliases the Unique.
1934        let ptr: *const T = &**self;
1935        fmt::Pointer::fmt(&ptr, f)
1936    }
1937}
1938
1939#[stable(feature = "rust1", since = "1.0.0")]
1940impl<T: ?Sized, A: Allocator> Deref for Box<T, A> {
1941    type Target = T;
1942
1943    fn deref(&self) -> &T {
1944        &**self
1945    }
1946}
1947
1948#[stable(feature = "rust1", since = "1.0.0")]
1949impl<T: ?Sized, A: Allocator> DerefMut for Box<T, A> {
1950    fn deref_mut(&mut self) -> &mut T {
1951        &mut **self
1952    }
1953}
1954
1955#[unstable(feature = "deref_pure_trait", issue = "87121")]
1956unsafe impl<T: ?Sized, A: Allocator> DerefPure for Box<T, A> {}
1957
1958#[unstable(feature = "legacy_receiver_trait", issue = "none")]
1959impl<T: ?Sized, A: Allocator> LegacyReceiver for Box<T, A> {}
1960
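// Illustrative sketch of what the three `Fn*` impls below provide: boxed closures,
// including boxed `dyn Fn*` trait objects, can be called directly, e.g.
//
//     let f: Box<dyn FnOnce() -> i32> = Box::new(|| 42);
//     assert_eq!(f(), 42);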
1961#[stable(feature = "boxed_closure_impls", since = "1.35.0")]
1962impl<Args: Tuple, F: FnOnce<Args> + ?Sized, A: Allocator> FnOnce<Args> for Box<F, A> {
1963    type Output = <F as FnOnce<Args>>::Output;
1964
1965    extern "rust-call" fn call_once(self, args: Args) -> Self::Output {
1966        <F as FnOnce<Args>>::call_once(*self, args)
1967    }
1968}
1969
1970#[stable(feature = "boxed_closure_impls", since = "1.35.0")]
1971impl<Args: Tuple, F: FnMut<Args> + ?Sized, A: Allocator> FnMut<Args> for Box<F, A> {
1972    extern "rust-call" fn call_mut(&mut self, args: Args) -> Self::Output {
1973        <F as FnMut<Args>>::call_mut(self, args)
1974    }
1975}
1976
1977#[stable(feature = "boxed_closure_impls", since = "1.35.0")]
1978impl<Args: Tuple, F: Fn<Args> + ?Sized, A: Allocator> Fn<Args> for Box<F, A> {
1979    extern "rust-call" fn call(&self, args: Args) -> Self::Output {
1980        <F as Fn<Args>>::call(self, args)
1981    }
1982}
1983
1984#[stable(feature = "async_closure", since = "1.85.0")]
1985impl<Args: Tuple, F: AsyncFnOnce<Args> + ?Sized, A: Allocator> AsyncFnOnce<Args> for Box<F, A> {
1986    type Output = F::Output;
1987    type CallOnceFuture = F::CallOnceFuture;
1988
1989    extern "rust-call" fn async_call_once(self, args: Args) -> Self::CallOnceFuture {
1990        F::async_call_once(*self, args)
1991    }
1992}
1993
1994#[stable(feature = "async_closure", since = "1.85.0")]
1995impl<Args: Tuple, F: AsyncFnMut<Args> + ?Sized, A: Allocator> AsyncFnMut<Args> for Box<F, A> {
1996    type CallRefFuture<'a>
1997        = F::CallRefFuture<'a>
1998    where
1999        Self: 'a;
2000
2001    extern "rust-call" fn async_call_mut(&mut self, args: Args) -> Self::CallRefFuture<'_> {
2002        F::async_call_mut(self, args)
2003    }
2004}
2005
2006#[stable(feature = "async_closure", since = "1.85.0")]
2007impl<Args: Tuple, F: AsyncFn<Args> + ?Sized, A: Allocator> AsyncFn<Args> for Box<F, A> {
2008    extern "rust-call" fn async_call(&self, args: Args) -> Self::CallRefFuture<'_> {
2009        F::async_call(self, args)
2010    }
2011}
2012
2013#[unstable(feature = "coerce_unsized", issue = "18598")]
2014impl<T: ?Sized + Unsize<U>, U: ?Sized, A: Allocator> CoerceUnsized<Box<U, A>> for Box<T, A> {}
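// Illustrative sketch of the unsizing coercion the impl above enables, e.g.
//
//     let b: Box<dyn std::fmt::Display> = Box::new(5_i32);
//     assert_eq!(b.to_string(), "5");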
2015
2016#[unstable(feature = "pin_coerce_unsized_trait", issue = "123430")]
2017unsafe impl<T: ?Sized, A: Allocator> PinCoerceUnsized for Box<T, A> {}
2018
2019// It is quite crucial that we only allow the `Global` allocator here.
2020// Handling arbitrary custom allocators (which can affect the `Box` layout heavily!)
2021// would need a lot of codegen and interpreter adjustments.
2022#[unstable(feature = "dispatch_from_dyn", issue = "none")]
2023impl<T: ?Sized + Unsize<U>, U: ?Sized> DispatchFromDyn<Box<U>> for Box<T, Global> {}
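// Illustrative sketch of what the `DispatchFromDyn` impl above enables: `self: Box<Self>`
// receivers stay dyn-compatible and dispatch through a `Box<dyn Trait>`, e.g. with a
// hypothetical trait:
//
//     trait Consume { fn consume(self: Box<Self>) -> u32; }
//     impl Consume for u32 { fn consume(self: Box<Self>) -> u32 { *self } }
//     let n: Box<dyn Consume> = Box::new(7_u32);
//     assert_eq!(n.consume(), 7);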
2024
2025#[stable(feature = "box_borrow", since = "1.1.0")]
2026impl<T: ?Sized, A: Allocator> Borrow<T> for Box<T, A> {
2027    fn borrow(&self) -> &T {
2028        &**self
2029    }
2030}
2031
2032#[stable(feature = "box_borrow", since = "1.1.0")]
2033impl<T: ?Sized, A: Allocator> BorrowMut<T> for Box<T, A> {
2034    fn borrow_mut(&mut self) -> &mut T {
2035        &mut **self
2036    }
2037}
2038
2039#[stable(since = "1.5.0", feature = "smart_ptr_as_ref")]
2040impl<T: ?Sized, A: Allocator> AsRef<T> for Box<T, A> {
2041    fn as_ref(&self) -> &T {
2042        &**self
2043    }
2044}
2045
2046#[stable(since = "1.5.0", feature = "smart_ptr_as_ref")]
2047impl<T: ?Sized, A: Allocator> AsMut<T> for Box<T, A> {
2048    fn as_mut(&mut self) -> &mut T {
2049        &mut **self
2050    }
2051}
2052
2053/* Nota bene
2054 *
2055 *  We could have chosen not to add this impl, and instead have written a
2056 *  function of Pin<Box<T>> to Pin<T>. Such a function would not be sound,
2057 *  because Box<T> implements Unpin even when T does not, as a result of
2058 *  this impl.
2059 *
2060 *  We chose this API instead of the alternative for a few reasons:
2061 *      - Logically, it is helpful to understand pinning in regard to the
2062 *        memory region being pointed to. For this reason none of the
2063 *        standard library pointer types support projecting through a pin
2064 *        (Box<T> is the only pointer type in std for which this would be
2065 *        safe.)
2066 *      - It is in practice very useful to have Box<T> be unconditionally
2067 *        Unpin because of trait objects, for which the structural auto
2068 *        trait functionality does not apply (e.g., Box<dyn Foo> would
2069 *        otherwise not be Unpin).
2070 *
2071 *  Another type with the same semantics as Box but only a conditional
2072 *  implementation of `Unpin` (where `T: Unpin`) would be valid/safe, and
2073 *  could have a method to project a Pin<T> from it.
2074 */
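// Illustrative sketch of the unconditional impl below: `Box<T>` is `Unpin` even
// when `T` itself is not, e.g. with a hypothetical helper:
//
//     fn assert_unpin<T: Unpin>(_: &T) {}
//     let b = Box::new(core::marker::PhantomPinned);
//     assert_unpin(&b); // `PhantomPinned: !Unpin`, yet `Box<PhantomPinned>: Unpin`.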
2075#[stable(feature = "pin", since = "1.33.0")]
2076impl<T: ?Sized, A: Allocator> Unpin for Box<T, A> {}
2077
2078#[unstable(feature = "coroutine_trait", issue = "43122")]
2079impl<G: ?Sized + Coroutine<R> + Unpin, R, A: Allocator> Coroutine<R> for Box<G, A> {
2080    type Yield = G::Yield;
2081    type Return = G::Return;
2082
2083    fn resume(mut self: Pin<&mut Self>, arg: R) -> CoroutineState<Self::Yield, Self::Return> {
2084        G::resume(Pin::new(&mut *self), arg)
2085    }
2086}
2087
2088#[unstable(feature = "coroutine_trait", issue = "43122")]
2089impl<G: ?Sized + Coroutine<R>, R, A: Allocator> Coroutine<R> for Pin<Box<G, A>>
2090where
2091    A: 'static,
2092{
2093    type Yield = G::Yield;
2094    type Return = G::Return;
2095
2096    fn resume(mut self: Pin<&mut Self>, arg: R) -> CoroutineState<Self::Yield, Self::Return> {
2097        G::resume((*self).as_mut(), arg)
2098    }
2099}
2100
2101#[stable(feature = "futures_api", since = "1.36.0")]
2102impl<F: ?Sized + Future + Unpin, A: Allocator> Future for Box<F, A> {
2103    type Output = F::Output;
2104
2105    fn poll(mut self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<Self::Output> {
2106        F::poll(Pin::new(&mut *self), cx)
2107    }
2108}
2109
2110#[stable(feature = "box_error", since = "1.8.0")]
2111impl<E: Error> Error for Box<E> {
2112    #[allow(deprecated, deprecated_in_future)]
2113    fn description(&self) -> &str {
2114        Error::description(&**self)
2115    }
2116
2117    #[allow(deprecated)]
2118    fn cause(&self) -> Option<&dyn Error> {
2119        Error::cause(&**self)
2120    }
2121
2122    fn source(&self) -> Option<&(dyn Error + 'static)> {
2123        Error::source(&**self)
2124    }
2125
2126    fn provide<'b>(&'b self, request: &mut error::Request<'b>) {
2127        Error::provide(&**self, request);
2128    }
2129}
2130
2131#[unstable(feature = "pointer_like_trait", issue = "none")]
2132impl<T> PointerLike for Box<T> {}