alloc/
boxed.rs

1//! The `Box<T>` type for heap allocation.
2//!
3//! [`Box<T>`], casually referred to as a 'box', provides the simplest form of
4//! heap allocation in Rust. Boxes provide ownership for this allocation, and
5//! drop their contents when they go out of scope. Boxes also ensure that they
6//! never allocate more than `isize::MAX` bytes.
7//!
8//! # Examples
9//!
10//! Move a value from the stack to the heap by creating a [`Box`]:
11//!
12//! ```
13//! let val: u8 = 5;
14//! let boxed: Box<u8> = Box::new(val);
15//! ```
16//!
17//! Move a value from a [`Box`] back to the stack by [dereferencing]:
18//!
19//! ```
20//! let boxed: Box<u8> = Box::new(5);
21//! let val: u8 = *boxed;
22//! ```
23//!
24//! Creating a recursive data structure:
25//!
26//! ```
27//! # #[allow(dead_code)]
28//! #[derive(Debug)]
29//! enum List<T> {
30//!     Cons(T, Box<List<T>>),
31//!     Nil,
32//! }
33//!
34//! let list: List<i32> = List::Cons(1, Box::new(List::Cons(2, Box::new(List::Nil))));
35//! println!("{list:?}");
36//! ```
37//!
38//! This will print `Cons(1, Cons(2, Nil))`.
39//!
40//! Recursive structures must be boxed, because if the definition of `Cons`
41//! looked like this:
42//!
43//! ```compile_fail,E0072
44//! # enum List<T> {
45//! Cons(T, List<T>),
46//! # }
47//! ```
48//!
49//! It wouldn't work. This is because the size of a `List` depends on how many
50//! elements are in the list, and so we don't know how much memory to allocate
51//! for a `Cons`. By introducing a [`Box<T>`], which has a defined size, we know how
52//! big `Cons` needs to be.
53//!
54//! # Memory layout
55//!
56//! For non-zero-sized values, a [`Box`] will use the [`Global`] allocator for its allocation. It is
57//! valid to convert both ways between a [`Box`] and a raw pointer allocated with the [`Global`]
58//! allocator, given that the [`Layout`] used with the allocator is correct for the type and the raw
59//! pointer points to a valid value of the right type. More precisely, a `value: *mut T` that has
60//! been allocated with the [`Global`] allocator with `Layout::for_value(&*value)` may be converted
61//! into a box using [`Box::<T>::from_raw(value)`]. Conversely, the memory backing a `value: *mut T`
62//! obtained from [`Box::<T>::into_raw`] may be deallocated using the [`Global`] allocator with
63//! [`Layout::for_value(&*value)`].
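//!
//! For illustration, a minimal sketch of the second direction, using the stable
//! `std::alloc` front end to the global allocator (rather than the unstable [`Global`]
//! type directly), with the layout computed by [`Layout::for_value(&*value)`]:
//!
//! ```
//! use std::alloc::{dealloc, Layout};
//! use std::ptr;
//!
//! let boxed: Box<u16> = Box::new(42);
//! let value: *mut u16 = Box::into_raw(boxed);
//! unsafe {
//!     // Compute the layout from the value itself, as described above.
//!     let layout = Layout::for_value(&*value);
//!     // Run the destructor (a no-op for `u16`), then free the allocation.
//!     ptr::drop_in_place(value);
//!     dealloc(value.cast::<u8>(), layout);
//! }
//! ```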
64//!
65//! For zero-sized values, the `Box` pointer has to be non-null and sufficiently aligned. The
66//! recommended way to build a `Box` to a ZST, if `Box::new` cannot be used, is to use
67//! [`ptr::NonNull::dangling`].
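//!
//! For example, a `Box` of a zero-sized type can be built from a dangling but well-aligned
//! pointer (a sketch; in ordinary code `Box::new(())` does the same thing):
//!
//! ```
//! use std::ptr::NonNull;
//!
//! // For a zero-sized type, a non-null, sufficiently aligned pointer is all that is needed.
//! let ptr: NonNull<()> = NonNull::dangling();
//! let boxed: Box<()> = unsafe { Box::from_raw(ptr.as_ptr()) };
//! assert_eq!(*boxed, ());
//! ```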
68//!
69//! On top of these basic layout requirements, a `Box<T>` must point to a valid value of `T`.
70//!
71//! So long as `T: Sized`, a `Box<T>` is guaranteed to be represented
72//! as a single pointer and is also ABI-compatible with C pointers
73//! (i.e. the C type `T*`). This means that if you have `extern "C"`
74//! Rust functions that will be called from C, you can define those
75//! Rust functions using `Box<T>` types, and use `T*` as the corresponding
76//! type on the C side. As an example, consider this C header which
77//! declares functions that create and destroy some kind of `Foo`
78//! value:
79//!
80//! ```c
81//! /* C header */
82//!
83//! /* Returns ownership to the caller */
84//! struct Foo* foo_new(void);
85//!
86//! /* Takes ownership from the caller; no-op when invoked with null */
87//! void foo_delete(struct Foo*);
88//! ```
89//!
90//! These two functions might be implemented in Rust as follows. Here, the
91//! `struct Foo*` type from C is translated to `Box<Foo>`, which captures
92//! the ownership constraints. Note also that the nullable argument to
93//! `foo_delete` is represented in Rust as `Option<Box<Foo>>`, since `Box<Foo>`
94//! cannot be null.
95//!
96//! ```
97//! #[repr(C)]
98//! pub struct Foo;
99//!
100//! #[unsafe(no_mangle)]
101//! pub extern "C" fn foo_new() -> Box<Foo> {
102//!     Box::new(Foo)
103//! }
104//!
105//! #[unsafe(no_mangle)]
106//! pub extern "C" fn foo_delete(_: Option<Box<Foo>>) {}
107//! ```
108//!
109//! Even though `Box<T>` has the same representation and C ABI as a C pointer,
110//! this does not mean that you can convert an arbitrary `T*` into a `Box<T>`
111//! and expect things to work. `Box<T>` values will always be fully aligned,
112//! non-null pointers. Moreover, the destructor for `Box<T>` will attempt to
113//! free the value with the global allocator. In general, the best practice
114//! is to only use `Box<T>` for pointers that originated from the global
115//! allocator.
116//!
117//! **Important.** At least at present, you should avoid using
118//! `Box<T>` types for functions that are defined in C but invoked
119//! from Rust. In those cases, you should directly mirror the C types
120//! as closely as possible. Using types like `Box<T>` where the C
121//! definition is just using `T*` can lead to undefined behavior, as
122//! described in [rust-lang/unsafe-code-guidelines#198][ucg#198].
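//!
//! For instance, if the functions from the C header above were implemented in C and only
//! called from Rust, their declarations might mirror the C types directly (a sketch that is
//! not compiled here, since there is nothing to link against; the `_private` field is just an
//! opaque-type placeholder):
//!
//! ```ignore
//! #[repr(C)]
//! pub struct Foo {
//!     _private: [u8; 0],
//! }
//!
//! unsafe extern "C" {
//!     // Returns ownership to the caller.
//!     fn foo_new() -> *mut Foo;
//!     // Takes ownership from the caller; no-op when invoked with null.
//!     fn foo_delete(foo: *mut Foo);
//! }
//! ```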
123//!
124//! # Considerations for unsafe code
125//!
126//! **Warning: This section is not normative and is subject to change, possibly
127//! being relaxed in the future! It is a simplified summary of the rules
128//! currently implemented in the compiler.**
129//!
130//! The aliasing rules for `Box<T>` are the same as for `&mut T`. `Box<T>`
131//! asserts uniqueness over its content. Using raw pointers derived from a box
132//! after that box has been mutated through, moved, or borrowed as `&mut T`
133//! is not allowed. For more guidance on working with boxes from unsafe code, see
134//! [rust-lang/unsafe-code-guidelines#326][ucg#326].
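//!
//! For example, a raw pointer derived from a box stays usable only until the box is next
//! moved, mutated, or borrowed as `&mut T` (a minimal sketch of the rule above):
//!
//! ```
//! let mut b = Box::new(0u8);
//! // Derive a raw pointer from the box's contents.
//! let p: *mut u8 = &mut *b as *mut u8;
//! // Fine: `b` has not been moved, mutated, or reborrowed since `p` was derived.
//! unsafe { p.write(1) };
//! // Moving the box (like mutating or reborrowing it) ends `p`'s validity,
//! // so `p` must not be used past this point.
//! let moved = b;
//! assert_eq!(*moved, 1);
//! ```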
135//!
136//! # Editions
137//!
138//! A special case exists for the implementation of `IntoIterator` for arrays on the Rust 2021
139//! edition, as documented [here][array]. Unfortunately, it was later found that a similar
140//! workaround should be added for boxed slices, and this was applied in the 2024 edition.
141//!
142//! Specifically, `IntoIterator` is implemented for `Box<[T]>` on all editions, but specific calls
143//! to `into_iter()` for boxed slices will defer to the slice implementation on editions before
144//! 2024:
145//!
146//! ```rust,edition2021
147//! // Rust 2015, 2018, and 2021:
148//!
149//! # #![allow(boxed_slice_into_iter)] // override our `deny(warnings)`
150//! let boxed_slice: Box<[i32]> = vec![0; 3].into_boxed_slice();
151//!
152//! // This creates a slice iterator, producing references to each value.
153//! for item in boxed_slice.into_iter().enumerate() {
154//!     let (i, x): (usize, &i32) = item;
155//!     println!("boxed_slice[{i}] = {x}");
156//! }
157//!
158//! // The `boxed_slice_into_iter` lint suggests this change for future compatibility:
159//! for item in boxed_slice.iter().enumerate() {
160//!     let (i, x): (usize, &i32) = item;
161//!     println!("boxed_slice[{i}] = {x}");
162//! }
163//!
164//! // You can explicitly iterate a boxed slice by value using `IntoIterator::into_iter`
165//! for item in IntoIterator::into_iter(boxed_slice).enumerate() {
166//!     let (i, x): (usize, i32) = item;
167//!     println!("boxed_slice[{i}] = {x}");
168//! }
169//! ```
170//!
171//! Similar to the array implementation, this may be modified in the future to remove this override,
172//! and it's best to avoid relying on this edition-dependent behavior if you wish to preserve
173//! compatibility with future versions of the compiler.
174//!
175//! [ucg#198]: https://github.com/rust-lang/unsafe-code-guidelines/issues/198
176//! [ucg#326]: https://github.com/rust-lang/unsafe-code-guidelines/issues/326
177//! [dereferencing]: core::ops::Deref
178//! [`Box::<T>::from_raw(value)`]: Box::from_raw
179//! [`Global`]: crate::alloc::Global
180//! [`Layout`]: crate::alloc::Layout
181//! [`Layout::for_value(&*value)`]: crate::alloc::Layout::for_value
182//! [valid]: ptr#safety
183
184#![stable(feature = "rust1", since = "1.0.0")]
185
186use core::borrow::{Borrow, BorrowMut};
187#[cfg(not(no_global_oom_handling))]
188use core::clone::CloneToUninit;
189use core::cmp::Ordering;
190use core::error::{self, Error};
191use core::fmt;
192use core::future::Future;
193use core::hash::{Hash, Hasher};
194use core::marker::{Tuple, Unsize};
195use core::mem::{self, SizedTypeProperties};
196use core::ops::{
197    AsyncFn, AsyncFnMut, AsyncFnOnce, CoerceUnsized, Coroutine, CoroutineState, Deref, DerefMut,
198    DerefPure, DispatchFromDyn, LegacyReceiver,
199};
200use core::pin::{Pin, PinCoerceUnsized};
201use core::ptr::{self, NonNull, Unique};
202use core::task::{Context, Poll};
203
204#[cfg(not(no_global_oom_handling))]
205use crate::alloc::handle_alloc_error;
206use crate::alloc::{AllocError, Allocator, Global, Layout};
207use crate::raw_vec::RawVec;
208#[cfg(not(no_global_oom_handling))]
209use crate::str::from_boxed_utf8_unchecked;
210
211/// Conversion-related impls for `Box<_>` (`From`, `downcast`, etc.).
212mod convert;
213/// Iterator-related impls for `Box<_>`.
214mod iter;
215/// [`ThinBox`] implementation.
216mod thin;
217
218#[unstable(feature = "thin_box", issue = "92791")]
219pub use thin::ThinBox;
220
221/// A pointer type that uniquely owns a heap allocation of type `T`.
222///
223/// See the [module-level documentation](../../std/boxed/index.html) for more.
224#[lang = "owned_box"]
225#[fundamental]
226#[stable(feature = "rust1", since = "1.0.0")]
227#[rustc_insignificant_dtor]
228#[doc(search_unbox)]
229// The declaration of the `Box` struct must be kept in sync with the
230// compiler or ICEs will happen.
231pub struct Box<
232    T: ?Sized,
233    #[unstable(feature = "allocator_api", issue = "32838")] A: Allocator = Global,
234>(Unique<T>, A);
235
236/// Constructs a `Box<T>` by calling the `exchange_malloc` lang item and moving the argument into
237/// the newly allocated memory. This is an intrinsic to avoid unnecessary copies.
238///
239/// This is the surface syntax for `box <expr>` expressions.
240#[rustc_intrinsic]
241#[unstable(feature = "liballoc_internals", issue = "none")]
242pub fn box_new<T>(x: T) -> Box<T>;
243
244impl<T> Box<T> {
245    /// Allocates memory on the heap and then places `x` into it.
246    ///
247    /// This doesn't actually allocate if `T` is zero-sized.
248    ///
249    /// # Examples
250    ///
251    /// ```
252    /// let five = Box::new(5);
253    /// ```
254    #[cfg(not(no_global_oom_handling))]
255    #[inline(always)]
256    #[stable(feature = "rust1", since = "1.0.0")]
257    #[must_use]
258    #[rustc_diagnostic_item = "box_new"]
259    #[cfg_attr(miri, track_caller)] // even without panics, this helps for Miri backtraces
260    pub fn new(x: T) -> Self {
261        return box_new(x);
262    }
263
264    /// Constructs a new box with uninitialized contents.
265    ///
266    /// # Examples
267    ///
268    /// ```
269    /// let mut five = Box::<u32>::new_uninit();
270    /// // Deferred initialization:
271    /// five.write(5);
272    /// let five = unsafe { five.assume_init() };
273    ///
274    /// assert_eq!(*five, 5)
275    /// ```
276    #[cfg(not(no_global_oom_handling))]
277    #[stable(feature = "new_uninit", since = "1.82.0")]
278    #[must_use]
279    #[inline]
280    pub fn new_uninit() -> Box<mem::MaybeUninit<T>> {
281        Self::new_uninit_in(Global)
282    }
283
284    /// Constructs a new `Box` with uninitialized contents, with the memory
285    /// being filled with `0` bytes.
286    ///
287    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
288    /// of this method.
289    ///
290    /// # Examples
291    ///
292    /// ```
293    /// #![feature(new_zeroed_alloc)]
294    ///
295    /// let zero = Box::<u32>::new_zeroed();
296    /// let zero = unsafe { zero.assume_init() };
297    ///
298    /// assert_eq!(*zero, 0)
299    /// ```
300    ///
301    /// [zeroed]: mem::MaybeUninit::zeroed
302    #[cfg(not(no_global_oom_handling))]
303    #[inline]
304    #[unstable(feature = "new_zeroed_alloc", issue = "129396")]
305    #[must_use]
306    pub fn new_zeroed() -> Box<mem::MaybeUninit<T>> {
307        Self::new_zeroed_in(Global)
308    }
309
310    /// Constructs a new `Pin<Box<T>>`. If `T` does not implement [`Unpin`], then
311    /// `x` will be pinned in memory and unable to be moved.
312    ///
313    /// Constructing and pinning of the `Box` can also be done in two steps: `Box::pin(x)`
314    /// does the same as <code>[Box::into_pin]\([Box::new]\(x))</code>. Consider using
315    /// [`into_pin`](Box::into_pin) if you already have a `Box<T>`, or if you want to
316    /// construct a (pinned) `Box` in a different way than with [`Box::new`].
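    ///
    /// # Examples
    ///
    /// A minimal usage sketch:
    ///
    /// ```
    /// use std::pin::Pin;
    ///
    /// let pinned: Pin<Box<u32>> = Box::pin(5);
    /// assert_eq!(*pinned, 5);
    /// ```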
317    #[cfg(not(no_global_oom_handling))]
318    #[stable(feature = "pin", since = "1.33.0")]
319    #[must_use]
320    #[inline(always)]
321    pub fn pin(x: T) -> Pin<Box<T>> {
322        Box::new(x).into()
323    }
324
325    /// Allocates memory on the heap and then places `x` into it,
326    /// returning an error if the allocation fails.
327    ///
328    /// This doesn't actually allocate if `T` is zero-sized.
329    ///
330    /// # Examples
331    ///
332    /// ```
333    /// #![feature(allocator_api)]
334    ///
335    /// let five = Box::try_new(5)?;
336    /// # Ok::<(), std::alloc::AllocError>(())
337    /// ```
338    #[unstable(feature = "allocator_api", issue = "32838")]
339    #[inline]
340    pub fn try_new(x: T) -> Result<Self, AllocError> {
341        Self::try_new_in(x, Global)
342    }
343
344    /// Constructs a new box with uninitialized contents on the heap,
345    /// returning an error if the allocation fails.
346    ///
347    /// # Examples
348    ///
349    /// ```
350    /// #![feature(allocator_api)]
351    ///
352    /// let mut five = Box::<u32>::try_new_uninit()?;
353    /// // Deferred initialization:
354    /// five.write(5);
355    /// let five = unsafe { five.assume_init() };
356    ///
357    /// assert_eq!(*five, 5);
358    /// # Ok::<(), std::alloc::AllocError>(())
359    /// ```
360    #[unstable(feature = "allocator_api", issue = "32838")]
361    // #[unstable(feature = "new_uninit", issue = "63291")]
362    #[inline]
363    pub fn try_new_uninit() -> Result<Box<mem::MaybeUninit<T>>, AllocError> {
364        Box::try_new_uninit_in(Global)
365    }
366
367    /// Constructs a new `Box` with uninitialized contents, with the memory
368    /// being filled with `0` bytes on the heap.
369    ///
370    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
371    /// of this method.
372    ///
373    /// # Examples
374    ///
375    /// ```
376    /// #![feature(allocator_api)]
377    ///
378    /// let zero = Box::<u32>::try_new_zeroed()?;
379    /// let zero = unsafe { zero.assume_init() };
380    ///
381    /// assert_eq!(*zero, 0);
382    /// # Ok::<(), std::alloc::AllocError>(())
383    /// ```
384    ///
385    /// [zeroed]: mem::MaybeUninit::zeroed
386    #[unstable(feature = "allocator_api", issue = "32838")]
387    // #[unstable(feature = "new_uninit", issue = "63291")]
388    #[inline]
389    pub fn try_new_zeroed() -> Result<Box<mem::MaybeUninit<T>>, AllocError> {
390        Box::try_new_zeroed_in(Global)
391    }
392}
393
394impl<T, A: Allocator> Box<T, A> {
395    /// Allocates memory in the given allocator then places `x` into it.
396    ///
397    /// This doesn't actually allocate if `T` is zero-sized.
398    ///
399    /// # Examples
400    ///
401    /// ```
402    /// #![feature(allocator_api)]
403    ///
404    /// use std::alloc::System;
405    ///
406    /// let five = Box::new_in(5, System);
407    /// ```
408    #[cfg(not(no_global_oom_handling))]
409    #[unstable(feature = "allocator_api", issue = "32838")]
410    #[must_use]
411    #[inline]
412    pub fn new_in(x: T, alloc: A) -> Self
413    where
414        A: Allocator,
415    {
416        let mut boxed = Self::new_uninit_in(alloc);
417        boxed.write(x);
418        unsafe { boxed.assume_init() }
419    }
420
421    /// Allocates memory in the given allocator then places `x` into it,
422    /// returning an error if the allocation fails.
423    ///
424    /// This doesn't actually allocate if `T` is zero-sized.
425    ///
426    /// # Examples
427    ///
428    /// ```
429    /// #![feature(allocator_api)]
430    ///
431    /// use std::alloc::System;
432    ///
433    /// let five = Box::try_new_in(5, System)?;
434    /// # Ok::<(), std::alloc::AllocError>(())
435    /// ```
436    #[unstable(feature = "allocator_api", issue = "32838")]
437    #[inline]
438    pub fn try_new_in(x: T, alloc: A) -> Result<Self, AllocError>
439    where
440        A: Allocator,
441    {
442        let mut boxed = Self::try_new_uninit_in(alloc)?;
443        boxed.write(x);
444        unsafe { Ok(boxed.assume_init()) }
445    }
446
447    /// Constructs a new box with uninitialized contents in the provided allocator.
448    ///
449    /// # Examples
450    ///
451    /// ```
452    /// #![feature(allocator_api)]
453    ///
454    /// use std::alloc::System;
455    ///
456    /// let mut five = Box::<u32, _>::new_uninit_in(System);
457    /// // Deferred initialization:
458    /// five.write(5);
459    /// let five = unsafe { five.assume_init() };
460    ///
461    /// assert_eq!(*five, 5)
462    /// ```
463    #[unstable(feature = "allocator_api", issue = "32838")]
464    #[cfg(not(no_global_oom_handling))]
465    #[must_use]
466    // #[unstable(feature = "new_uninit", issue = "63291")]
467    pub fn new_uninit_in(alloc: A) -> Box<mem::MaybeUninit<T>, A>
468    where
469        A: Allocator,
470    {
471        let layout = Layout::new::<mem::MaybeUninit<T>>();
472        // NOTE: Prefer match over unwrap_or_else since the closure is sometimes not inlineable,
473        // which would make code size bigger.
474        match Box::try_new_uninit_in(alloc) {
475            Ok(m) => m,
476            Err(_) => handle_alloc_error(layout),
477        }
478    }
479
480    /// Constructs a new box with uninitialized contents in the provided allocator,
481    /// returning an error if the allocation fails.
482    ///
483    /// # Examples
484    ///
485    /// ```
486    /// #![feature(allocator_api)]
487    ///
488    /// use std::alloc::System;
489    ///
490    /// let mut five = Box::<u32, _>::try_new_uninit_in(System)?;
491    /// // Deferred initialization:
492    /// five.write(5);
493    /// let five = unsafe { five.assume_init() };
494    ///
495    /// assert_eq!(*five, 5);
496    /// # Ok::<(), std::alloc::AllocError>(())
497    /// ```
498    #[unstable(feature = "allocator_api", issue = "32838")]
499    // #[unstable(feature = "new_uninit", issue = "63291")]
500    pub fn try_new_uninit_in(alloc: A) -> Result<Box<mem::MaybeUninit<T>, A>, AllocError>
501    where
502        A: Allocator,
503    {
504        let ptr = if T::IS_ZST {
505            NonNull::dangling()
506        } else {
507            let layout = Layout::new::<mem::MaybeUninit<T>>();
508            alloc.allocate(layout)?.cast()
509        };
510        unsafe { Ok(Box::from_raw_in(ptr.as_ptr(), alloc)) }
511    }
512
513    /// Constructs a new `Box` with uninitialized contents, with the memory
514    /// being filled with `0` bytes in the provided allocator.
515    ///
516    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
517    /// of this method.
518    ///
519    /// # Examples
520    ///
521    /// ```
522    /// #![feature(allocator_api)]
523    ///
524    /// use std::alloc::System;
525    ///
526    /// let zero = Box::<u32, _>::new_zeroed_in(System);
527    /// let zero = unsafe { zero.assume_init() };
528    ///
529    /// assert_eq!(*zero, 0)
530    /// ```
531    ///
532    /// [zeroed]: mem::MaybeUninit::zeroed
533    #[unstable(feature = "allocator_api", issue = "32838")]
534    #[cfg(not(no_global_oom_handling))]
535    // #[unstable(feature = "new_uninit", issue = "63291")]
536    #[must_use]
537    pub fn new_zeroed_in(alloc: A) -> Box<mem::MaybeUninit<T>, A>
538    where
539        A: Allocator,
540    {
541        let layout = Layout::new::<mem::MaybeUninit<T>>();
542        // NOTE: Prefer match over unwrap_or_else since the closure is sometimes not inlineable,
543        // which would make code size bigger.
544        match Box::try_new_zeroed_in(alloc) {
545            Ok(m) => m,
546            Err(_) => handle_alloc_error(layout),
547        }
548    }
549
550    /// Constructs a new `Box` with uninitialized contents, with the memory
551    /// being filled with `0` bytes in the provided allocator,
552    /// returning an error if the allocation fails.
553    ///
554    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
555    /// of this method.
556    ///
557    /// # Examples
558    ///
559    /// ```
560    /// #![feature(allocator_api)]
561    ///
562    /// use std::alloc::System;
563    ///
564    /// let zero = Box::<u32, _>::try_new_zeroed_in(System)?;
565    /// let zero = unsafe { zero.assume_init() };
566    ///
567    /// assert_eq!(*zero, 0);
568    /// # Ok::<(), std::alloc::AllocError>(())
569    /// ```
570    ///
571    /// [zeroed]: mem::MaybeUninit::zeroed
572    #[unstable(feature = "allocator_api", issue = "32838")]
573    // #[unstable(feature = "new_uninit", issue = "63291")]
574    pub fn try_new_zeroed_in(alloc: A) -> Result<Box<mem::MaybeUninit<T>, A>, AllocError>
575    where
576        A: Allocator,
577    {
578        let ptr = if T::IS_ZST {
579            NonNull::dangling()
580        } else {
581            let layout = Layout::new::<mem::MaybeUninit<T>>();
582            alloc.allocate_zeroed(layout)?.cast()
583        };
584        unsafe { Ok(Box::from_raw_in(ptr.as_ptr(), alloc)) }
585    }
586
587    /// Constructs a new `Pin<Box<T, A>>`. If `T` does not implement [`Unpin`], then
588    /// `x` will be pinned in memory and unable to be moved.
589    ///
590    /// Constructing and pinning of the `Box` can also be done in two steps: `Box::pin_in(x, alloc)`
591    /// does the same as <code>[Box::into_pin]\([Box::new_in]\(x, alloc))</code>. Consider using
592    /// [`into_pin`](Box::into_pin) if you already have a `Box<T, A>`, or if you want to
593    /// construct a (pinned) `Box` in a different way than with [`Box::new_in`].
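    ///
    /// # Examples
    ///
    /// A minimal usage sketch:
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let pinned = Box::pin_in(5, System);
    /// assert_eq!(*pinned, 5);
    /// ```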
594    #[cfg(not(no_global_oom_handling))]
595    #[unstable(feature = "allocator_api", issue = "32838")]
596    #[must_use]
597    #[inline(always)]
598    pub fn pin_in(x: T, alloc: A) -> Pin<Self>
599    where
600        A: 'static + Allocator,
601    {
602        Self::into_pin(Self::new_in(x, alloc))
603    }
604
605    /// Converts a `Box<T>` into a `Box<[T]>`.
606    ///
607    /// This conversion does not allocate on the heap and happens in place.
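    ///
    /// # Examples
    ///
    /// A usage sketch (gated on the `box_into_boxed_slice` feature named below):
    ///
    /// ```
    /// #![feature(box_into_boxed_slice)]
    ///
    /// let boxed: Box<u8> = Box::new(5);
    /// let boxed_slice: Box<[u8]> = Box::into_boxed_slice(boxed);
    /// assert_eq!(*boxed_slice, [5]);
    /// ```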
608    #[unstable(feature = "box_into_boxed_slice", issue = "71582")]
609    pub fn into_boxed_slice(boxed: Self) -> Box<[T], A> {
610        let (raw, alloc) = Box::into_raw_with_allocator(boxed);
611        unsafe { Box::from_raw_in(raw as *mut [T; 1], alloc) }
612    }
613
614    /// Consumes the `Box`, returning the wrapped value.
615    ///
616    /// # Examples
617    ///
618    /// ```
619    /// #![feature(box_into_inner)]
620    ///
621    /// let c = Box::new(5);
622    ///
623    /// assert_eq!(Box::into_inner(c), 5);
624    /// ```
625    #[unstable(feature = "box_into_inner", issue = "80437")]
626    #[inline]
627    pub fn into_inner(boxed: Self) -> T {
628        *boxed
629    }
630}
631
632impl<T> Box<[T]> {
633    /// Constructs a new boxed slice with uninitialized contents.
634    ///
635    /// # Examples
636    ///
637    /// ```
638    /// let mut values = Box::<[u32]>::new_uninit_slice(3);
639    /// // Deferred initialization:
640    /// values[0].write(1);
641    /// values[1].write(2);
642    /// values[2].write(3);
643    /// let values = unsafe { values.assume_init() };
644    ///
645    /// assert_eq!(*values, [1, 2, 3])
646    /// ```
647    #[cfg(not(no_global_oom_handling))]
648    #[stable(feature = "new_uninit", since = "1.82.0")]
649    #[must_use]
650    pub fn new_uninit_slice(len: usize) -> Box<[mem::MaybeUninit<T>]> {
651        unsafe { RawVec::with_capacity(len).into_box(len) }
652    }
653
654    /// Constructs a new boxed slice with uninitialized contents, with the memory
655    /// being filled with `0` bytes.
656    ///
657    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
658    /// of this method.
659    ///
660    /// # Examples
661    ///
662    /// ```
663    /// #![feature(new_zeroed_alloc)]
664    ///
665    /// let values = Box::<[u32]>::new_zeroed_slice(3);
666    /// let values = unsafe { values.assume_init() };
667    ///
668    /// assert_eq!(*values, [0, 0, 0])
669    /// ```
670    ///
671    /// [zeroed]: mem::MaybeUninit::zeroed
672    #[cfg(not(no_global_oom_handling))]
673    #[unstable(feature = "new_zeroed_alloc", issue = "129396")]
674    #[must_use]
675    pub fn new_zeroed_slice(len: usize) -> Box<[mem::MaybeUninit<T>]> {
676        unsafe { RawVec::with_capacity_zeroed(len).into_box(len) }
677    }
678
679    /// Constructs a new boxed slice with uninitialized contents. Returns an error if
680    /// the allocation fails.
681    ///
682    /// # Examples
683    ///
684    /// ```
685    /// #![feature(allocator_api)]
686    ///
687    /// let mut values = Box::<[u32]>::try_new_uninit_slice(3)?;
688    /// // Deferred initialization:
689    /// values[0].write(1);
690    /// values[1].write(2);
691    /// values[2].write(3);
692    /// let values = unsafe { values.assume_init() };
693    ///
694    /// assert_eq!(*values, [1, 2, 3]);
695    /// # Ok::<(), std::alloc::AllocError>(())
696    /// ```
697    #[unstable(feature = "allocator_api", issue = "32838")]
698    #[inline]
699    pub fn try_new_uninit_slice(len: usize) -> Result<Box<[mem::MaybeUninit<T>]>, AllocError> {
700        let ptr = if T::IS_ZST || len == 0 {
701            NonNull::dangling()
702        } else {
703            let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
704                Ok(l) => l,
705                Err(_) => return Err(AllocError),
706            };
707            Global.allocate(layout)?.cast()
708        };
709        unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, Global).into_box(len)) }
710    }
711
712    /// Constructs a new boxed slice with uninitialized contents, with the memory
713    /// being filled with `0` bytes. Returns an error if the allocation fails.
714    ///
715    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
716    /// of this method.
717    ///
718    /// # Examples
719    ///
720    /// ```
721    /// #![feature(allocator_api)]
722    ///
723    /// let values = Box::<[u32]>::try_new_zeroed_slice(3)?;
724    /// let values = unsafe { values.assume_init() };
725    ///
726    /// assert_eq!(*values, [0, 0, 0]);
727    /// # Ok::<(), std::alloc::AllocError>(())
728    /// ```
729    ///
730    /// [zeroed]: mem::MaybeUninit::zeroed
731    #[unstable(feature = "allocator_api", issue = "32838")]
732    #[inline]
733    pub fn try_new_zeroed_slice(len: usize) -> Result<Box<[mem::MaybeUninit<T>]>, AllocError> {
734        let ptr = if T::IS_ZST || len == 0 {
735            NonNull::dangling()
736        } else {
737            let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
738                Ok(l) => l,
739                Err(_) => return Err(AllocError),
740            };
741            Global.allocate_zeroed(layout)?.cast()
742        };
743        unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, Global).into_box(len)) }
744    }
745
746    /// Converts the boxed slice into a boxed array.
747    ///
748    /// This operation does not reallocate; the underlying array of the slice is simply reinterpreted as an array type.
749    ///
750    /// If `N` is not exactly equal to the length of `self`, then this method returns `None`.
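    ///
    /// # Examples
    ///
    /// A usage sketch (gated on the `slice_as_array` feature named below):
    ///
    /// ```
    /// #![feature(slice_as_array)]
    ///
    /// let slice: Box<[i32]> = Box::new([1, 2, 3]);
    /// let array: Box<[i32; 3]> = slice.into_array().unwrap();
    /// assert_eq!(*array, [1, 2, 3]);
    /// ```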
751    #[unstable(feature = "slice_as_array", issue = "133508")]
752    #[inline]
753    #[must_use]
754    pub fn into_array<const N: usize>(self) -> Option<Box<[T; N]>> {
755        if self.len() == N {
756            let ptr = Self::into_raw(self) as *mut [T; N];
757
758            // SAFETY: The underlying array of a slice has the exact same layout as an actual array `[T; N]` if `N` is equal to the slice's length.
759            let me = unsafe { Box::from_raw(ptr) };
760            Some(me)
761        } else {
762            None
763        }
764    }
765}
766
767impl<T, A: Allocator> Box<[T], A> {
768    /// Constructs a new boxed slice with uninitialized contents in the provided allocator.
769    ///
770    /// # Examples
771    ///
772    /// ```
773    /// #![feature(allocator_api)]
774    ///
775    /// use std::alloc::System;
776    ///
777    /// let mut values = Box::<[u32], _>::new_uninit_slice_in(3, System);
778    /// // Deferred initialization:
779    /// values[0].write(1);
780    /// values[1].write(2);
781    /// values[2].write(3);
782    /// let values = unsafe { values.assume_init() };
783    ///
784    /// assert_eq!(*values, [1, 2, 3])
785    /// ```
786    #[cfg(not(no_global_oom_handling))]
787    #[unstable(feature = "allocator_api", issue = "32838")]
788    // #[unstable(feature = "new_uninit", issue = "63291")]
789    #[must_use]
790    pub fn new_uninit_slice_in(len: usize, alloc: A) -> Box<[mem::MaybeUninit<T>], A> {
791        unsafe { RawVec::with_capacity_in(len, alloc).into_box(len) }
792    }
793
794    /// Constructs a new boxed slice with uninitialized contents in the provided allocator,
795    /// with the memory being filled with `0` bytes.
796    ///
797    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
798    /// of this method.
799    ///
800    /// # Examples
801    ///
802    /// ```
803    /// #![feature(allocator_api)]
804    ///
805    /// use std::alloc::System;
806    ///
807    /// let values = Box::<[u32], _>::new_zeroed_slice_in(3, System);
808    /// let values = unsafe { values.assume_init() };
809    ///
810    /// assert_eq!(*values, [0, 0, 0])
811    /// ```
812    ///
813    /// [zeroed]: mem::MaybeUninit::zeroed
814    #[cfg(not(no_global_oom_handling))]
815    #[unstable(feature = "allocator_api", issue = "32838")]
816    // #[unstable(feature = "new_uninit", issue = "63291")]
817    #[must_use]
818    pub fn new_zeroed_slice_in(len: usize, alloc: A) -> Box<[mem::MaybeUninit<T>], A> {
819        unsafe { RawVec::with_capacity_zeroed_in(len, alloc).into_box(len) }
820    }
821
822    /// Constructs a new boxed slice with uninitialized contents in the provided allocator. Returns an error if
823    /// the allocation fails.
824    ///
825    /// # Examples
826    ///
827    /// ```
828    /// #![feature(allocator_api)]
829    ///
830    /// use std::alloc::System;
831    ///
832    /// let mut values = Box::<[u32], _>::try_new_uninit_slice_in(3, System)?;
833    /// // Deferred initialization:
834    /// values[0].write(1);
835    /// values[1].write(2);
836    /// values[2].write(3);
837    /// let values = unsafe { values.assume_init() };
838    ///
839    /// assert_eq!(*values, [1, 2, 3]);
840    /// # Ok::<(), std::alloc::AllocError>(())
841    /// ```
842    #[unstable(feature = "allocator_api", issue = "32838")]
843    #[inline]
844    pub fn try_new_uninit_slice_in(
845        len: usize,
846        alloc: A,
847    ) -> Result<Box<[mem::MaybeUninit<T>], A>, AllocError> {
848        let ptr = if T::IS_ZST || len == 0 {
849            NonNull::dangling()
850        } else {
851            let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
852                Ok(l) => l,
853                Err(_) => return Err(AllocError),
854            };
855            alloc.allocate(layout)?.cast()
856        };
857        unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, alloc).into_box(len)) }
858    }
859
860    /// Constructs a new boxed slice with uninitialized contents in the provided allocator, with the memory
861    /// being filled with `0` bytes. Returns an error if the allocation fails.
862    ///
863    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
864    /// of this method.
865    ///
866    /// # Examples
867    ///
868    /// ```
869    /// #![feature(allocator_api)]
870    ///
871    /// use std::alloc::System;
872    ///
873    /// let values = Box::<[u32], _>::try_new_zeroed_slice_in(3, System)?;
874    /// let values = unsafe { values.assume_init() };
875    ///
876    /// assert_eq!(*values, [0, 0, 0]);
877    /// # Ok::<(), std::alloc::AllocError>(())
878    /// ```
879    ///
880    /// [zeroed]: mem::MaybeUninit::zeroed
881    #[unstable(feature = "allocator_api", issue = "32838")]
882    #[inline]
883    pub fn try_new_zeroed_slice_in(
884        len: usize,
885        alloc: A,
886    ) -> Result<Box<[mem::MaybeUninit<T>], A>, AllocError> {
887        let ptr = if T::IS_ZST || len == 0 {
888            NonNull::dangling()
889        } else {
890            let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
891                Ok(l) => l,
892                Err(_) => return Err(AllocError),
893            };
894            alloc.allocate_zeroed(layout)?.cast()
895        };
896        unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, alloc).into_box(len)) }
897    }
898}
899
900impl<T, A: Allocator> Box<mem::MaybeUninit<T>, A> {
901    /// Converts to `Box<T, A>`.
902    ///
903    /// # Safety
904    ///
905    /// As with [`MaybeUninit::assume_init`],
906    /// it is up to the caller to guarantee that the value
907    /// really is in an initialized state.
908    /// Calling this when the content is not yet fully initialized
909    /// causes immediate undefined behavior.
910    ///
911    /// [`MaybeUninit::assume_init`]: mem::MaybeUninit::assume_init
912    ///
913    /// # Examples
914    ///
915    /// ```
916    /// let mut five = Box::<u32>::new_uninit();
917    /// // Deferred initialization:
918    /// five.write(5);
919    /// let five: Box<u32> = unsafe { five.assume_init() };
920    ///
921    /// assert_eq!(*five, 5)
922    /// ```
923    #[stable(feature = "new_uninit", since = "1.82.0")]
924    #[inline]
925    pub unsafe fn assume_init(self) -> Box<T, A> {
926        let (raw, alloc) = Box::into_raw_with_allocator(self);
927        unsafe { Box::from_raw_in(raw as *mut T, alloc) }
928    }
929
930    /// Writes the value and converts to `Box<T, A>`.
931    ///
932    /// This method converts the box similarly to [`Box::assume_init`] but
933    /// writes `value` into it before conversion, thus guaranteeing safety.
934    /// In some scenarios, use of this method may improve performance because
935    /// the compiler may be able to optimize copying from the stack.
936    ///
937    /// # Examples
938    ///
939    /// ```
940    /// let big_box = Box::<[usize; 1024]>::new_uninit();
941    ///
942    /// let mut array = [0; 1024];
943    /// for (i, place) in array.iter_mut().enumerate() {
944    ///     *place = i;
945    /// }
946    ///
947    /// // The optimizer may be able to elide this copy, so the previous code writes
948    /// // to the heap directly.
949    /// let big_box = Box::write(big_box, array);
950    ///
951    /// for (i, x) in big_box.iter().enumerate() {
952    ///     assert_eq!(*x, i);
953    /// }
954    /// ```
955    #[stable(feature = "box_uninit_write", since = "1.87.0")]
956    #[inline]
957    pub fn write(mut boxed: Self, value: T) -> Box<T, A> {
958        unsafe {
959            (*boxed).write(value);
960            boxed.assume_init()
961        }
962    }
963}
964
965impl<T, A: Allocator> Box<[mem::MaybeUninit<T>], A> {
966    /// Converts to `Box<[T], A>`.
967    ///
968    /// # Safety
969    ///
970    /// As with [`MaybeUninit::assume_init`],
971    /// it is up to the caller to guarantee that the values
972    /// really are in an initialized state.
973    /// Calling this when the content is not yet fully initialized
974    /// causes immediate undefined behavior.
975    ///
976    /// [`MaybeUninit::assume_init`]: mem::MaybeUninit::assume_init
977    ///
978    /// # Examples
979    ///
980    /// ```
981    /// let mut values = Box::<[u32]>::new_uninit_slice(3);
982    /// // Deferred initialization:
983    /// values[0].write(1);
984    /// values[1].write(2);
985    /// values[2].write(3);
986    /// let values = unsafe { values.assume_init() };
987    ///
988    /// assert_eq!(*values, [1, 2, 3])
989    /// ```
990    #[stable(feature = "new_uninit", since = "1.82.0")]
991    #[inline]
992    pub unsafe fn assume_init(self) -> Box<[T], A> {
993        let (raw, alloc) = Box::into_raw_with_allocator(self);
994        unsafe { Box::from_raw_in(raw as *mut [T], alloc) }
995    }
996}
997
998impl<T: ?Sized> Box<T> {
999    /// Constructs a box from a raw pointer.
1000    ///
1001    /// After calling this function, the raw pointer is owned by the
1002    /// resulting `Box`. Specifically, the `Box` destructor will call
1003    /// the destructor of `T` and free the allocated memory. For this
1004    /// to be safe, the memory must have been allocated in accordance
1005    /// with the [memory layout] used by `Box`.
1006    ///
1007    /// # Safety
1008    ///
1009    /// This function is unsafe because improper use may lead to
1010    /// memory problems. For example, a double-free may occur if the
1011    /// function is called twice on the same raw pointer.
1012    ///
1013    /// The raw pointer must point to a block of memory allocated by the global allocator.
1014    ///
1015    /// The safety conditions are described in the [memory layout] section.
1016    ///
1017    /// # Examples
1018    ///
1019    /// Recreate a `Box` which was previously converted to a raw pointer
1020    /// using [`Box::into_raw`]:
1021    /// ```
1022    /// let x = Box::new(5);
1023    /// let ptr = Box::into_raw(x);
1024    /// let x = unsafe { Box::from_raw(ptr) };
1025    /// ```
1026    /// Manually create a `Box` from scratch by using the global allocator:
1027    /// ```
1028    /// use std::alloc::{alloc, Layout};
1029    ///
1030    /// unsafe {
1031    ///     let ptr = alloc(Layout::new::<i32>()) as *mut i32;
1032    ///     // In general .write is required to avoid attempting to destruct
1033    ///     // the (uninitialized) previous contents of `ptr`, though for this
1034    ///     // simple example `*ptr = 5` would have worked as well.
1035    ///     ptr.write(5);
1036    ///     let x = Box::from_raw(ptr);
1037    /// }
1038    /// ```
1039    ///
1040    /// [memory layout]: self#memory-layout
1041    #[stable(feature = "box_raw", since = "1.4.0")]
1042    #[inline]
1043    #[must_use = "call `drop(Box::from_raw(ptr))` if you intend to drop the `Box`"]
1044    pub unsafe fn from_raw(raw: *mut T) -> Self {
1045        unsafe { Self::from_raw_in(raw, Global) }
1046    }
1047
1048    /// Constructs a box from a `NonNull` pointer.
1049    ///
1050    /// After calling this function, the `NonNull` pointer is owned by
1051    /// the resulting `Box`. Specifically, the `Box` destructor will call
1052    /// the destructor of `T` and free the allocated memory. For this
1053    /// to be safe, the memory must have been allocated in accordance
1054    /// with the [memory layout] used by `Box`.
1055    ///
1056    /// # Safety
1057    ///
1058    /// This function is unsafe because improper use may lead to
1059    /// memory problems. For example, a double-free may occur if the
1060    /// function is called twice on the same `NonNull` pointer.
1061    ///
1062    /// The non-null pointer must point to a block of memory allocated by the global allocator.
1063    ///
1064    /// The safety conditions are described in the [memory layout] section.
1065    ///
1066    /// # Examples
1067    ///
1068    /// Recreate a `Box` which was previously converted to a `NonNull`
1069    /// pointer using [`Box::into_non_null`]:
1070    /// ```
1071    /// #![feature(box_vec_non_null)]
1072    ///
1073    /// let x = Box::new(5);
1074    /// let non_null = Box::into_non_null(x);
1075    /// let x = unsafe { Box::from_non_null(non_null) };
1076    /// ```
1077    /// Manually create a `Box` from scratch by using the global allocator:
1078    /// ```
1079    /// #![feature(box_vec_non_null)]
1080    ///
1081    /// use std::alloc::{alloc, Layout};
1082    /// use std::ptr::NonNull;
1083    ///
1084    /// unsafe {
1085    ///     let non_null = NonNull::new(alloc(Layout::new::<i32>()).cast::<i32>())
1086    ///         .expect("allocation failed");
1087    ///     // In general .write is required to avoid attempting to destruct
1088    ///     // the (uninitialized) previous contents of `non_null`.
1089    ///     non_null.write(5);
1090    ///     let x = Box::from_non_null(non_null);
1091    /// }
1092    /// ```
1093    ///
1094    /// [memory layout]: self#memory-layout
1095    #[unstable(feature = "box_vec_non_null", reason = "new API", issue = "130364")]
1096    #[inline]
1097    #[must_use = "call `drop(Box::from_non_null(ptr))` if you intend to drop the `Box`"]
1098    pub unsafe fn from_non_null(ptr: NonNull<T>) -> Self {
1099        unsafe { Self::from_raw(ptr.as_ptr()) }
1100    }
1101
1102    /// Consumes the `Box`, returning a wrapped raw pointer.
1103    ///
1104    /// The pointer will be properly aligned and non-null.
1105    ///
1106    /// After calling this function, the caller is responsible for the
1107    /// memory previously managed by the `Box`. In particular, the
1108    /// caller should properly destroy `T` and release the memory, taking
1109    /// into account the [memory layout] used by `Box`. The easiest way to
1110    /// do this is to convert the raw pointer back into a `Box` with the
1111    /// [`Box::from_raw`] function, allowing the `Box` destructor to perform
1112    /// the cleanup.
1113    ///
1114    /// Note: this is an associated function, which means that you have
1115    /// to call it as `Box::into_raw(b)` instead of `b.into_raw()`. This
1116    /// is so that there is no conflict with a method on the inner type.
1117    ///
1118    /// # Examples
1119    /// Converting the raw pointer back into a `Box` with [`Box::from_raw`]
1120    /// for automatic cleanup:
1121    /// ```
1122    /// let x = Box::new(String::from("Hello"));
1123    /// let ptr = Box::into_raw(x);
1124    /// let x = unsafe { Box::from_raw(ptr) };
1125    /// ```
1126    /// Manual cleanup by explicitly running the destructor and deallocating
1127    /// the memory:
1128    /// ```
1129    /// use std::alloc::{dealloc, Layout};
1130    /// use std::ptr;
1131    ///
1132    /// let x = Box::new(String::from("Hello"));
1133    /// let ptr = Box::into_raw(x);
1134    /// unsafe {
1135    ///     ptr::drop_in_place(ptr);
1136    ///     dealloc(ptr as *mut u8, Layout::new::<String>());
1137    /// }
1138    /// ```
1139    /// Note: This is equivalent to the following:
1140    /// ```
1141    /// let x = Box::new(String::from("Hello"));
1142    /// let ptr = Box::into_raw(x);
1143    /// unsafe {
1144    ///     drop(Box::from_raw(ptr));
1145    /// }
1146    /// ```
1147    ///
1148    /// [memory layout]: self#memory-layout
1149    #[must_use = "losing the pointer will leak memory"]
1150    #[stable(feature = "box_raw", since = "1.4.0")]
1151    #[inline]
1152    pub fn into_raw(b: Self) -> *mut T {
1153        // Avoid `into_raw_with_allocator` as that interacts poorly with Miri's Stacked Borrows.
1154        let mut b = mem::ManuallyDrop::new(b);
1155        // We go through the built-in deref for `Box`, which is crucial for Miri to recognize this
1156        // operation for its alias tracking.
1157        &raw mut **b
1158    }
1159
1160    /// Consumes the `Box`, returning a wrapped `NonNull` pointer.
1161    ///
1162    /// The pointer will be properly aligned.
1163    ///
1164    /// After calling this function, the caller is responsible for the
1165    /// memory previously managed by the `Box`. In particular, the
1166    /// caller should properly destroy `T` and release the memory, taking
1167    /// into account the [memory layout] used by `Box`. The easiest way to
1168    /// do this is to convert the `NonNull` pointer back into a `Box` with the
1169    /// [`Box::from_non_null`] function, allowing the `Box` destructor to
1170    /// perform the cleanup.
1171    ///
1172    /// Note: this is an associated function, which means that you have
1173    /// to call it as `Box::into_non_null(b)` instead of `b.into_non_null()`.
1174    /// This is so that there is no conflict with a method on the inner type.
1175    ///
1176    /// # Examples
1177    /// Converting the `NonNull` pointer back into a `Box` with [`Box::from_non_null`]
1178    /// for automatic cleanup:
1179    /// ```
1180    /// #![feature(box_vec_non_null)]
1181    ///
1182    /// let x = Box::new(String::from("Hello"));
1183    /// let non_null = Box::into_non_null(x);
1184    /// let x = unsafe { Box::from_non_null(non_null) };
1185    /// ```
1186    /// Manual cleanup by explicitly running the destructor and deallocating
1187    /// the memory:
1188    /// ```
1189    /// #![feature(box_vec_non_null)]
1190    ///
1191    /// use std::alloc::{dealloc, Layout};
1192    ///
1193    /// let x = Box::new(String::from("Hello"));
1194    /// let non_null = Box::into_non_null(x);
1195    /// unsafe {
1196    ///     non_null.drop_in_place();
1197    ///     dealloc(non_null.as_ptr().cast::<u8>(), Layout::new::<String>());
1198    /// }
1199    /// ```
1200    /// Note: This is equivalent to the following:
1201    /// ```
1202    /// #![feature(box_vec_non_null)]
1203    ///
1204    /// let x = Box::new(String::from("Hello"));
1205    /// let non_null = Box::into_non_null(x);
1206    /// unsafe {
1207    ///     drop(Box::from_non_null(non_null));
1208    /// }
1209    /// ```
1210    ///
1211    /// [memory layout]: self#memory-layout
1212    #[must_use = "losing the pointer will leak memory"]
1213    #[unstable(feature = "box_vec_non_null", reason = "new API", issue = "130364")]
1214    #[inline]
1215    pub fn into_non_null(b: Self) -> NonNull<T> {
1216        // SAFETY: `Box` is guaranteed to be non-null.
1217        unsafe { NonNull::new_unchecked(Self::into_raw(b)) }
1218    }
1219}
1220
1221impl<T: ?Sized, A: Allocator> Box<T, A> {
1222    /// Constructs a box from a raw pointer in the given allocator.
1223    ///
1224    /// After calling this function, the raw pointer is owned by the
1225    /// resulting `Box`. Specifically, the `Box` destructor will call
1226    /// the destructor of `T` and free the allocated memory. For this
1227    /// to be safe, the memory must have been allocated in accordance
1228    /// with the [memory layout] used by `Box`.
1229    ///
1230    /// # Safety
1231    ///
1232    /// This function is unsafe because improper use may lead to
1233    /// memory problems. For example, a double-free may occur if the
1234    /// function is called twice on the same raw pointer.
1235    ///
1236    /// The raw pointer must point to a block of memory allocated by `alloc`.
1237    ///
1238    /// # Examples
1239    ///
1240    /// Recreate a `Box` which was previously converted to a raw pointer
1241    /// using [`Box::into_raw_with_allocator`]:
1242    /// ```
1243    /// #![feature(allocator_api)]
1244    ///
1245    /// use std::alloc::System;
1246    ///
1247    /// let x = Box::new_in(5, System);
1248    /// let (ptr, alloc) = Box::into_raw_with_allocator(x);
1249    /// let x = unsafe { Box::from_raw_in(ptr, alloc) };
1250    /// ```
1251    /// Manually create a `Box` from scratch by using the system allocator:
1252    /// ```
1253    /// #![feature(allocator_api, slice_ptr_get)]
1254    ///
1255    /// use std::alloc::{Allocator, Layout, System};
1256    ///
1257    /// unsafe {
1258    ///     let ptr = System.allocate(Layout::new::<i32>())?.as_mut_ptr() as *mut i32;
1259    ///     // In general .write is required to avoid attempting to destruct
1260    ///     // the (uninitialized) previous contents of `ptr`, though for this
1261    ///     // simple example `*ptr = 5` would have worked as well.
1262    ///     ptr.write(5);
1263    ///     let x = Box::from_raw_in(ptr, System);
1264    /// }
1265    /// # Ok::<(), std::alloc::AllocError>(())
1266    /// ```
1267    ///
1268    /// [memory layout]: self#memory-layout
1269    #[unstable(feature = "allocator_api", issue = "32838")]
1270    #[inline]
1271    pub unsafe fn from_raw_in(raw: *mut T, alloc: A) -> Self {
1272        Box(unsafe { Unique::new_unchecked(raw) }, alloc)
1273    }
1274
1275    /// Constructs a box from a `NonNull` pointer in the given allocator.
1276    ///
1277    /// After calling this function, the `NonNull` pointer is owned by
1278    /// the resulting `Box`. Specifically, the `Box` destructor will call
1279    /// the destructor of `T` and free the allocated memory. For this
1280    /// to be safe, the memory must have been allocated in accordance
1281    /// with the [memory layout] used by `Box`.
1282    ///
1283    /// # Safety
1284    ///
1285    /// This function is unsafe because improper use may lead to
1286    /// memory problems. For example, a double-free may occur if the
1287    /// function is called twice on the same raw pointer.
1288    ///
1289    /// The non-null pointer must point to a block of memory allocated by `alloc`.
1290    ///
1291    /// # Examples
1292    ///
1293    /// Recreate a `Box` which was previously converted to a `NonNull` pointer
1294    /// using [`Box::into_non_null_with_allocator`]:
1295    /// ```
1296    /// #![feature(allocator_api, box_vec_non_null)]
1297    ///
1298    /// use std::alloc::System;
1299    ///
1300    /// let x = Box::new_in(5, System);
1301    /// let (non_null, alloc) = Box::into_non_null_with_allocator(x);
1302    /// let x = unsafe { Box::from_non_null_in(non_null, alloc) };
1303    /// ```
1304    /// Manually create a `Box` from scratch by using the system allocator:
1305    /// ```
1306    /// #![feature(allocator_api, box_vec_non_null, slice_ptr_get)]
1307    ///
1308    /// use std::alloc::{Allocator, Layout, System};
1309    ///
1310    /// unsafe {
1311    ///     let non_null = System.allocate(Layout::new::<i32>())?.cast::<i32>();
1312    ///     // In general .write is required to avoid attempting to destruct
1313    ///     // the (uninitialized) previous contents of `non_null`.
1314    ///     non_null.write(5);
1315    ///     let x = Box::from_non_null_in(non_null, System);
1316    /// }
1317    /// # Ok::<(), std::alloc::AllocError>(())
1318    /// ```
1319    ///
1320    /// [memory layout]: self#memory-layout
1321    #[unstable(feature = "allocator_api", issue = "32838")]
1322    // #[unstable(feature = "box_vec_non_null", reason = "new API", issue = "130364")]
1323    #[inline]
1324    pub unsafe fn from_non_null_in(raw: NonNull<T>, alloc: A) -> Self {
1325        // SAFETY: guaranteed by the caller.
1326        unsafe { Box::from_raw_in(raw.as_ptr(), alloc) }
1327    }
1328
1329    /// Consumes the `Box`, returning a wrapped raw pointer and the allocator.
1330    ///
1331    /// The pointer will be properly aligned and non-null.
1332    ///
1333    /// After calling this function, the caller is responsible for the
1334    /// memory previously managed by the `Box`. In particular, the
1335    /// caller should properly destroy `T` and release the memory, taking
1336    /// into account the [memory layout] used by `Box`. The easiest way to
1337    /// do this is to convert the raw pointer back into a `Box` with the
1338    /// [`Box::from_raw_in`] function, allowing the `Box` destructor to perform
1339    /// the cleanup.
1340    ///
1341    /// Note: this is an associated function, which means that you have
1342    /// to call it as `Box::into_raw_with_allocator(b)` instead of `b.into_raw_with_allocator()`. This
1343    /// is so that there is no conflict with a method on the inner type.
1344    ///
1345    /// # Examples
1346    /// Converting the raw pointer back into a `Box` with [`Box::from_raw_in`]
1347    /// for automatic cleanup:
1348    /// ```
1349    /// #![feature(allocator_api)]
1350    ///
1351    /// use std::alloc::System;
1352    ///
1353    /// let x = Box::new_in(String::from("Hello"), System);
1354    /// let (ptr, alloc) = Box::into_raw_with_allocator(x);
1355    /// let x = unsafe { Box::from_raw_in(ptr, alloc) };
1356    /// ```
1357    /// Manual cleanup by explicitly running the destructor and deallocating
1358    /// the memory:
1359    /// ```
1360    /// #![feature(allocator_api)]
1361    ///
1362    /// use std::alloc::{Allocator, Layout, System};
1363    /// use std::ptr::{self, NonNull};
1364    ///
1365    /// let x = Box::new_in(String::from("Hello"), System);
1366    /// let (ptr, alloc) = Box::into_raw_with_allocator(x);
1367    /// unsafe {
1368    ///     ptr::drop_in_place(ptr);
1369    ///     let non_null = NonNull::new_unchecked(ptr);
1370    ///     alloc.deallocate(non_null.cast(), Layout::new::<String>());
1371    /// }
1372    /// ```
1373    ///
1374    /// [memory layout]: self#memory-layout
1375    #[must_use = "losing the pointer will leak memory"]
1376    #[unstable(feature = "allocator_api", issue = "32838")]
1377    #[inline]
1378    pub fn into_raw_with_allocator(b: Self) -> (*mut T, A) {
1379        let mut b = mem::ManuallyDrop::new(b);
1380        // We carefully get the raw pointer out in a way that lets Miri's aliasing model understand what
1381        // is happening: using the primitive "deref" of `Box`. In case `A` is *not* `Global`, we
1382        // want *no* aliasing requirements here!
1383        // In case `A` *is* `Global`, this does not quite have the right behavior; `into_raw`
1384        // works around that.
1385        let ptr = &raw mut **b;
1386        let alloc = unsafe { ptr::read(&b.1) };
1387        (ptr, alloc)
1388    }
1389
1390    /// Consumes the `Box`, returning a wrapped `NonNull` pointer and the allocator.
1391    ///
1392    /// The pointer will be properly aligned.
1393    ///
1394    /// After calling this function, the caller is responsible for the
1395    /// memory previously managed by the `Box`. In particular, the
1396    /// caller should properly destroy `T` and release the memory, taking
1397    /// into account the [memory layout] used by `Box`. The easiest way to
1398    /// do this is to convert the `NonNull` pointer back into a `Box` with the
1399    /// [`Box::from_non_null_in`] function, allowing the `Box` destructor to
1400    /// perform the cleanup.
1401    ///
1402    /// Note: this is an associated function, which means that you have
1403    /// to call it as `Box::into_non_null_with_allocator(b)` instead of
1404    /// `b.into_non_null_with_allocator()`. This is so that there is no
1405    /// conflict with a method on the inner type.
1406    ///
1407    /// # Examples
1408    /// Converting the `NonNull` pointer back into a `Box` with
1409    /// [`Box::from_non_null_in`] for automatic cleanup:
1410    /// ```
1411    /// #![feature(allocator_api, box_vec_non_null)]
1412    ///
1413    /// use std::alloc::System;
1414    ///
1415    /// let x = Box::new_in(String::from("Hello"), System);
1416    /// let (non_null, alloc) = Box::into_non_null_with_allocator(x);
1417    /// let x = unsafe { Box::from_non_null_in(non_null, alloc) };
1418    /// ```
1419    /// Manual cleanup by explicitly running the destructor and deallocating
1420    /// the memory:
1421    /// ```
1422    /// #![feature(allocator_api, box_vec_non_null)]
1423    ///
1424    /// use std::alloc::{Allocator, Layout, System};
1425    ///
1426    /// let x = Box::new_in(String::from("Hello"), System);
1427    /// let (non_null, alloc) = Box::into_non_null_with_allocator(x);
1428    /// unsafe {
1429    ///     non_null.drop_in_place();
1430    ///     alloc.deallocate(non_null.cast::<u8>(), Layout::new::<String>());
1431    /// }
1432    /// ```
1433    ///
1434    /// [memory layout]: self#memory-layout
1435    #[must_use = "losing the pointer will leak memory"]
1436    #[unstable(feature = "allocator_api", issue = "32838")]
1437    // #[unstable(feature = "box_vec_non_null", reason = "new API", issue = "130364")]
1438    #[inline]
1439    pub fn into_non_null_with_allocator(b: Self) -> (NonNull<T>, A) {
1440        let (ptr, alloc) = Box::into_raw_with_allocator(b);
1441        // SAFETY: `Box` is guaranteed to be non-null.
1442        unsafe { (NonNull::new_unchecked(ptr), alloc) }
1443    }
1444
1445    #[unstable(
1446        feature = "ptr_internals",
1447        issue = "none",
1448        reason = "use `Box::leak(b).into()` or `Unique::from(Box::leak(b))` instead"
1449    )]
1450    #[inline]
1451    #[doc(hidden)]
1452    pub fn into_unique(b: Self) -> (Unique<T>, A) {
1453        let (ptr, alloc) = Box::into_raw_with_allocator(b);
1454        unsafe { (Unique::from(&mut *ptr), alloc) }
1455    }
1456
1457    /// Returns a raw mutable pointer to the `Box`'s contents.
1458    ///
1459    /// The caller must ensure that the `Box` outlives the pointer this
1460    /// function returns, or else it will end up dangling.
1461    ///
1462    /// This method guarantees that, for the purposes of the aliasing model, it
1463    /// does not materialize a reference to the underlying memory, and thus the returned pointer
1464    /// will remain valid when mixed with other calls to [`as_ptr`] and [`as_mut_ptr`].
1465    /// Note that calling other methods that materialize references to the memory
1466    /// may still invalidate this pointer.
1467    /// See the example below for how this guarantee can be used.
1468    ///
1469    /// # Examples
1470    ///
1471    /// Due to the aliasing guarantee, the following code is legal:
1472    ///
1473    /// ```rust
1474    /// #![feature(box_as_ptr)]
1475    ///
1476    /// unsafe {
1477    ///     let mut b = Box::new(0);
1478    ///     let ptr1 = Box::as_mut_ptr(&mut b);
1479    ///     ptr1.write(1);
1480    ///     let ptr2 = Box::as_mut_ptr(&mut b);
1481    ///     ptr2.write(2);
1482    ///     // Notably, the write to `ptr2` did *not* invalidate `ptr1`:
1483    ///     ptr1.write(3);
1484    /// }
1485    /// ```
1486    ///
1487    /// [`as_mut_ptr`]: Self::as_mut_ptr
1488    /// [`as_ptr`]: Self::as_ptr
1489    #[unstable(feature = "box_as_ptr", issue = "129090")]
1490    #[rustc_never_returns_null_ptr]
1491    #[rustc_as_ptr]
1492    #[inline]
1493    pub fn as_mut_ptr(b: &mut Self) -> *mut T {
1494        // This is a primitive deref, not going through `DerefMut`, and therefore not materializing
1495        // any references.
1496        &raw mut **b
1497    }
1498
1499    /// Returns a raw pointer to the `Box`'s contents.
1500    ///
1501    /// The caller must ensure that the `Box` outlives the pointer this
1502    /// function returns, or else it will end up dangling.
1503    ///
1504    /// The caller must also ensure that the memory the pointer (non-transitively) points to
1505    /// is never written to (except inside an `UnsafeCell`) using this pointer or any pointer
1506    /// derived from it. If you need to mutate the contents of the `Box`, use [`as_mut_ptr`].
1507    ///
1508    /// This method guarantees that, for the purposes of the aliasing model, it
1509    /// does not materialize a reference to the underlying memory, and thus the returned pointer
1510    /// will remain valid when mixed with other calls to [`as_ptr`] and [`as_mut_ptr`].
1511    /// Note that calling other methods that materialize mutable references to the memory,
1512    /// as well as writing to this memory, may still invalidate this pointer.
1513    /// See the example below for how this guarantee can be used.
1514    ///
1515    /// # Examples
1516    ///
1517    /// Due to the aliasing guarantee, the following code is legal:
1518    ///
1519    /// ```rust
1520    /// #![feature(box_as_ptr)]
1521    ///
1522    /// unsafe {
1523    ///     let mut v = Box::new(0);
1524    ///     let ptr1 = Box::as_ptr(&v);
1525    ///     let ptr2 = Box::as_mut_ptr(&mut v);
1526    ///     let _val = ptr2.read();
1527    ///     // No write to this memory has happened yet, so `ptr1` is still valid.
1528    ///     let _val = ptr1.read();
1529    ///     // However, once we do a write...
1530    ///     ptr2.write(1);
1531    ///     // ... `ptr1` is no longer valid.
1532    ///     // This would be UB: let _val = ptr1.read();
1533    /// }
1534    /// ```
1535    ///
1536    /// [`as_mut_ptr`]: Self::as_mut_ptr
1537    /// [`as_ptr`]: Self::as_ptr
1538    #[unstable(feature = "box_as_ptr", issue = "129090")]
1539    #[rustc_never_returns_null_ptr]
1540    #[rustc_as_ptr]
1541    #[inline]
1542    pub fn as_ptr(b: &Self) -> *const T {
1543        // This is a primitive deref, not going through `Deref`, and therefore not materializing
1544        // any references.
1545        &raw const **b
1546    }
1547
1548    /// Returns a reference to the underlying allocator.
1549    ///
1550    /// Note: this is an associated function, which means that you have
1551    /// to call it as `Box::allocator(&b)` instead of `b.allocator()`. This
1552    /// is so that there is no conflict with a method on the inner type.
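    ///
    /// # Examples
    ///
    /// A small sketch using the `System` allocator (requires the unstable
    /// `allocator_api` feature):
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let b = Box::new_in(5, System);
    /// let _alloc: &System = Box::allocator(&b);
    /// ```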
1553    #[unstable(feature = "allocator_api", issue = "32838")]
1554    #[inline]
1555    pub fn allocator(b: &Self) -> &A {
1556        &b.1
1557    }
1558
1559    /// Consumes and leaks the `Box`, returning a mutable reference,
1560    /// `&'a mut T`.
1561    ///
1562    /// Note that the type `T` must outlive the chosen lifetime `'a`. If the type
1563    /// has only static references, or none at all, then this may be chosen to be
1564    /// `'static`.
1565    ///
1566    /// This function is mainly useful for data that lives for the remainder of
1567    /// the program's life. Dropping the returned reference will cause a memory
1568    /// leak. If this is not acceptable, the reference should first be wrapped
1569    /// with the [`Box::from_raw`] function producing a `Box`. This `Box` can
1570    /// then be dropped which will properly destroy `T` and release the
1571    /// allocated memory.
1572    ///
1573    /// Note: this is an associated function, which means that you have
1574    /// to call it as `Box::leak(b)` instead of `b.leak()`. This
1575    /// is so that there is no conflict with a method on the inner type.
1576    ///
1577    /// # Examples
1578    ///
1579    /// Simple usage:
1580    ///
1581    /// ```
1582    /// let x = Box::new(41);
1583    /// let static_ref: &'static mut usize = Box::leak(x);
1584    /// *static_ref += 1;
1585    /// assert_eq!(*static_ref, 42);
1586    /// # // FIXME(https://github.com/rust-lang/miri/issues/3670):
1587    /// # // use -Zmiri-disable-leak-check instead of unleaking in tests meant to leak.
1588    /// # drop(unsafe { Box::from_raw(static_ref) });
1589    /// ```
1590    ///
1591    /// Unsized data:
1592    ///
1593    /// ```
1594    /// let x = vec![1, 2, 3].into_boxed_slice();
1595    /// let static_ref = Box::leak(x);
1596    /// static_ref[0] = 4;
1597    /// assert_eq!(*static_ref, [4, 2, 3]);
1598    /// # // FIXME(https://github.com/rust-lang/miri/issues/3670):
1599    /// # // use -Zmiri-disable-leak-check instead of unleaking in tests meant to leak.
1600    /// # drop(unsafe { Box::from_raw(static_ref) });
1601    /// ```
1602    #[stable(feature = "box_leak", since = "1.26.0")]
1603    #[inline]
1604    pub fn leak<'a>(b: Self) -> &'a mut T
1605    where
1606        A: 'a,
1607    {
1608        let (ptr, alloc) = Box::into_raw_with_allocator(b);
1609        mem::forget(alloc);
1610        unsafe { &mut *ptr }
1611    }
1612
1613    /// Converts a `Box<T>` into a `Pin<Box<T>>`. If `T` does not implement [`Unpin`], then
1614    /// `*boxed` will be pinned in memory and unable to be moved.
1615    ///
1616    /// This conversion does not allocate on the heap and happens in place.
1617    ///
1618    /// This is also available via [`From`].
1619    ///
1620    /// Constructing and pinning a `Box` with <code>Box::into_pin([Box::new]\(x))</code>
1621    /// can also be written more concisely using <code>[Box::pin]\(x)</code>.
1622    /// This `into_pin` method is useful if you already have a `Box<T>`, or you are
1623    /// constructing a (pinned) `Box` in a different way than with [`Box::new`].
1624    ///
1625    /// # Notes
1626    ///
1627    /// It's not recommended that crates add an impl like `From<Box<T>> for Pin<T>`,
1628    /// as it'll introduce an ambiguity when calling `Pin::from`.
1629    /// A demonstration of such a poor impl is shown below.
1630    ///
1631    /// ```compile_fail
1632    /// # use std::pin::Pin;
1633    /// struct Foo; // A type defined in this crate.
1634    /// impl From<Box<()>> for Pin<Foo> {
1635    ///     fn from(_: Box<()>) -> Pin<Foo> {
1636    ///         Pin::new(Foo)
1637    ///     }
1638    /// }
1639    ///
1640    /// let foo = Box::new(());
1641    /// let bar = Pin::from(foo);
1642    /// ```
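    ///
    /// # Examples
    ///
    /// A brief sketch of pinning an existing `Box`:
    ///
    /// ```
    /// use std::pin::Pin;
    ///
    /// let boxed: Box<u8> = Box::new(5);
    /// let pinned: Pin<Box<u8>> = Box::into_pin(boxed);
    /// assert_eq!(*pinned, 5);
    /// ```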
1643    #[stable(feature = "box_into_pin", since = "1.63.0")]
1644    pub fn into_pin(boxed: Self) -> Pin<Self>
1645    where
1646        A: 'static,
1647    {
1648        // It's not possible to move or replace the insides of a `Pin<Box<T>>`
1649        // when `T: !Unpin`, so it's safe to pin it directly without any
1650        // additional requirements.
1651        unsafe { Pin::new_unchecked(boxed) }
1652    }
1653}
1654
1655#[stable(feature = "rust1", since = "1.0.0")]
1656unsafe impl<#[may_dangle] T: ?Sized, A: Allocator> Drop for Box<T, A> {
1657    #[inline]
1658    fn drop(&mut self) {
1659        // the T in the Box is dropped by the compiler before the destructor is run
1660
1661        let ptr = self.0;
1662
1663        unsafe {
1664            let layout = Layout::for_value_raw(ptr.as_ptr());
1665            if layout.size() != 0 {
1666                self.1.deallocate(From::from(ptr.cast()), layout);
1667            }
1668        }
1669    }
1670}
1671
1672#[cfg(not(no_global_oom_handling))]
1673#[stable(feature = "rust1", since = "1.0.0")]
1674impl<T: Default> Default for Box<T> {
1675    /// Creates a `Box<T>` with the `Default` value for `T`.
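    ///
    /// # Examples
    ///
    /// A minimal illustration:
    ///
    /// ```
    /// let b: Box<i32> = Box::default();
    /// assert_eq!(*b, 0);
    /// ```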
1676    #[inline]
1677    fn default() -> Self {
1678        let mut x: Box<mem::MaybeUninit<T>> = Box::new_uninit();
1679        unsafe {
1680            // SAFETY: `x` is valid for writing and has the same layout as `T`.
1681            // If `T::default()` panics, dropping `x` will just deallocate the Box as `MaybeUninit<T>`
1682            // does not have a destructor.
1683            //
1684            // We use `ptr::write` as `MaybeUninit::write` creates
1685            // extra stack copies of `T` in debug mode.
1686            //
1687            // See https://github.com/rust-lang/rust/issues/136043 for more context.
1688            ptr::write(&raw mut *x as *mut T, T::default());
1689            // SAFETY: `x` was just initialized above.
1690            x.assume_init()
1691        }
1692    }
1693}
1694
1695#[cfg(not(no_global_oom_handling))]
1696#[stable(feature = "rust1", since = "1.0.0")]
1697impl<T> Default for Box<[T]> {
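    /// Creates an empty boxed slice without allocating (a brief illustration):
    ///
    /// ```
    /// let b: Box<[i32]> = Default::default();
    /// assert!(b.is_empty());
    /// ```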
1698    #[inline]
1699    fn default() -> Self {
1700        let ptr: Unique<[T]> = Unique::<[T; 0]>::dangling();
1701        Box(ptr, Global)
1702    }
1703}
1704
1705#[cfg(not(no_global_oom_handling))]
1706#[stable(feature = "default_box_extra", since = "1.17.0")]
1707impl Default for Box<str> {
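    /// Creates an empty `Box<str>` (a brief illustration):
    ///
    /// ```
    /// let s: Box<str> = Default::default();
    /// assert!(s.is_empty());
    /// ```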
1708    #[inline]
1709    fn default() -> Self {
1710        // SAFETY: This is the same as `Unique::cast<U>` but with an unsized `U = str`.
1711        let ptr: Unique<str> = unsafe {
1712            let bytes: Unique<[u8]> = Unique::<[u8; 0]>::dangling();
1713            Unique::new_unchecked(bytes.as_ptr() as *mut str)
1714        };
1715        Box(ptr, Global)
1716    }
1717}
1718
1719#[cfg(not(no_global_oom_handling))]
1720#[stable(feature = "rust1", since = "1.0.0")]
1721impl<T: Clone, A: Allocator + Clone> Clone for Box<T, A> {
1722    /// Returns a new box with a `clone()` of this box's contents.
1723    ///
1724    /// # Examples
1725    ///
1726    /// ```
1727    /// let x = Box::new(5);
1728    /// let y = x.clone();
1729    ///
1730    /// // The value is the same
1731    /// assert_eq!(x, y);
1732    ///
1733    /// // But they are unique objects
1734    /// assert_ne!(&*x as *const i32, &*y as *const i32);
1735    /// ```
1736    #[inline]
1737    fn clone(&self) -> Self {
1738        // Pre-allocate memory to allow writing the cloned value directly.
1739        let mut boxed = Self::new_uninit_in(self.1.clone());
1740        unsafe {
1741            (**self).clone_to_uninit(boxed.as_mut_ptr().cast());
1742            boxed.assume_init()
1743        }
1744    }
1745
1746    /// Copies `source`'s contents into `self` without creating a new allocation.
1747    ///
1748    /// # Examples
1749    ///
1750    /// ```
1751    /// let x = Box::new(5);
1752    /// let mut y = Box::new(10);
1753    /// let yp: *const i32 = &*y;
1754    ///
1755    /// y.clone_from(&x);
1756    ///
1757    /// // The value is the same
1758    /// assert_eq!(x, y);
1759    ///
1760    /// // And no allocation occurred
1761    /// assert_eq!(yp, &*y);
1762    /// ```
1763    #[inline]
1764    fn clone_from(&mut self, source: &Self) {
1765        (**self).clone_from(&(**source));
1766    }
1767}
1768
1769#[cfg(not(no_global_oom_handling))]
1770#[stable(feature = "box_slice_clone", since = "1.3.0")]
1771impl<T: Clone, A: Allocator + Clone> Clone for Box<[T], A> {
1772    fn clone(&self) -> Self {
1773        let alloc = Box::allocator(self).clone();
1774        self.to_vec_in(alloc).into_boxed_slice()
1775    }
1776
1777    /// Copies `source`'s contents into `self` without creating a new allocation,
1778    /// so long as the two are of the same length.
1779    ///
1780    /// # Examples
1781    ///
1782    /// ```
1783    /// let x = Box::new([5, 6, 7]);
1784    /// let mut y = Box::new([8, 9, 10]);
1785    /// let yp: *const [i32] = &*y;
1786    ///
1787    /// y.clone_from(&x);
1788    ///
1789    /// // The value is the same
1790    /// assert_eq!(x, y);
1791    ///
1792    /// // And no allocation occurred
1793    /// assert_eq!(yp, &*y);
1794    /// ```
1795    fn clone_from(&mut self, source: &Self) {
1796        if self.len() == source.len() {
1797            self.clone_from_slice(&source);
1798        } else {
1799            *self = source.clone();
1800        }
1801    }
1802}
1803
1804#[cfg(not(no_global_oom_handling))]
1805#[stable(feature = "box_slice_clone", since = "1.3.0")]
1806impl Clone for Box<str> {
1807    fn clone(&self) -> Self {
1808        // this makes a copy of the data
1809        let buf: Box<[u8]> = self.as_bytes().into();
1810        unsafe { from_boxed_utf8_unchecked(buf) }
1811    }
1812}
1813
1814#[stable(feature = "rust1", since = "1.0.0")]
1815impl<T: ?Sized + PartialEq, A: Allocator> PartialEq for Box<T, A> {
1816    #[inline]
1817    fn eq(&self, other: &Self) -> bool {
1818        PartialEq::eq(&**self, &**other)
1819    }
1820    #[inline]
1821    fn ne(&self, other: &Self) -> bool {
1822        PartialEq::ne(&**self, &**other)
1823    }
1824}
1825
1826#[stable(feature = "rust1", since = "1.0.0")]
1827impl<T: ?Sized + PartialOrd, A: Allocator> PartialOrd for Box<T, A> {
1828    #[inline]
1829    fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
1830        PartialOrd::partial_cmp(&**self, &**other)
1831    }
1832    #[inline]
1833    fn lt(&self, other: &Self) -> bool {
1834        PartialOrd::lt(&**self, &**other)
1835    }
1836    #[inline]
1837    fn le(&self, other: &Self) -> bool {
1838        PartialOrd::le(&**self, &**other)
1839    }
1840    #[inline]
1841    fn ge(&self, other: &Self) -> bool {
1842        PartialOrd::ge(&**self, &**other)
1843    }
1844    #[inline]
1845    fn gt(&self, other: &Self) -> bool {
1846        PartialOrd::gt(&**self, &**other)
1847    }
1848}
1849
1850#[stable(feature = "rust1", since = "1.0.0")]
1851impl<T: ?Sized + Ord, A: Allocator> Ord for Box<T, A> {
1852    #[inline]
1853    fn cmp(&self, other: &Self) -> Ordering {
1854        Ord::cmp(&**self, &**other)
1855    }
1856}
1857
1858#[stable(feature = "rust1", since = "1.0.0")]
1859impl<T: ?Sized + Eq, A: Allocator> Eq for Box<T, A> {}
1860
1861#[stable(feature = "rust1", since = "1.0.0")]
1862impl<T: ?Sized + Hash, A: Allocator> Hash for Box<T, A> {
1863    fn hash<H: Hasher>(&self, state: &mut H) {
1864        (**self).hash(state);
1865    }
1866}
1867
1868#[stable(feature = "indirect_hasher_impl", since = "1.22.0")]
1869impl<T: ?Sized + Hasher, A: Allocator> Hasher for Box<T, A> {
1870    fn finish(&self) -> u64 {
1871        (**self).finish()
1872    }
1873    fn write(&mut self, bytes: &[u8]) {
1874        (**self).write(bytes)
1875    }
1876    fn write_u8(&mut self, i: u8) {
1877        (**self).write_u8(i)
1878    }
1879    fn write_u16(&mut self, i: u16) {
1880        (**self).write_u16(i)
1881    }
1882    fn write_u32(&mut self, i: u32) {
1883        (**self).write_u32(i)
1884    }
1885    fn write_u64(&mut self, i: u64) {
1886        (**self).write_u64(i)
1887    }
1888    fn write_u128(&mut self, i: u128) {
1889        (**self).write_u128(i)
1890    }
1891    fn write_usize(&mut self, i: usize) {
1892        (**self).write_usize(i)
1893    }
1894    fn write_i8(&mut self, i: i8) {
1895        (**self).write_i8(i)
1896    }
1897    fn write_i16(&mut self, i: i16) {
1898        (**self).write_i16(i)
1899    }
1900    fn write_i32(&mut self, i: i32) {
1901        (**self).write_i32(i)
1902    }
1903    fn write_i64(&mut self, i: i64) {
1904        (**self).write_i64(i)
1905    }
1906    fn write_i128(&mut self, i: i128) {
1907        (**self).write_i128(i)
1908    }
1909    fn write_isize(&mut self, i: isize) {
1910        (**self).write_isize(i)
1911    }
1912    fn write_length_prefix(&mut self, len: usize) {
1913        (**self).write_length_prefix(len)
1914    }
1915    fn write_str(&mut self, s: &str) {
1916        (**self).write_str(s)
1917    }
1918}
1919
1920#[stable(feature = "rust1", since = "1.0.0")]
1921impl<T: fmt::Display + ?Sized, A: Allocator> fmt::Display for Box<T, A> {
1922    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
1923        fmt::Display::fmt(&**self, f)
1924    }
1925}
1926
1927#[stable(feature = "rust1", since = "1.0.0")]
1928impl<T: fmt::Debug + ?Sized, A: Allocator> fmt::Debug for Box<T, A> {
1929    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
1930        fmt::Debug::fmt(&**self, f)
1931    }
1932}
1933
1934#[stable(feature = "rust1", since = "1.0.0")]
1935impl<T: ?Sized, A: Allocator> fmt::Pointer for Box<T, A> {
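    /// Formats the address of the heap allocation rather than the value it
    /// points to (a brief illustration):
    ///
    /// ```
    /// let b = Box::new(5);
    /// let addr = format!("{:p}", b);
    /// assert!(addr.starts_with("0x"));
    /// ```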
1936    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
1937        // It's not possible to extract the inner Unique directly from the Box;
1938        // instead we cast it to a `*const`, which aliases the Unique.
1939        let ptr: *const T = &**self;
1940        fmt::Pointer::fmt(&ptr, f)
1941    }
1942}
1943
1944#[stable(feature = "rust1", since = "1.0.0")]
1945impl<T: ?Sized, A: Allocator> Deref for Box<T, A> {
1946    type Target = T;
1947
1948    fn deref(&self) -> &T {
1949        &**self
1950    }
1951}
1952
1953#[stable(feature = "rust1", since = "1.0.0")]
1954impl<T: ?Sized, A: Allocator> DerefMut for Box<T, A> {
1955    fn deref_mut(&mut self) -> &mut T {
1956        &mut **self
1957    }
1958}
1959
1960#[unstable(feature = "deref_pure_trait", issue = "87121")]
1961unsafe impl<T: ?Sized, A: Allocator> DerefPure for Box<T, A> {}
1962
1963#[unstable(feature = "legacy_receiver_trait", issue = "none")]
1964impl<T: ?Sized, A: Allocator> LegacyReceiver for Box<T, A> {}
1965
1966#[stable(feature = "boxed_closure_impls", since = "1.35.0")]
1967impl<Args: Tuple, F: FnOnce<Args> + ?Sized, A: Allocator> FnOnce<Args> for Box<F, A> {
1968    type Output = <F as FnOnce<Args>>::Output;
1969
1970    extern "rust-call" fn call_once(self, args: Args) -> Self::Output {
1971        <F as FnOnce<Args>>::call_once(*self, args)
1972    }
1973}
1974
1975#[stable(feature = "boxed_closure_impls", since = "1.35.0")]
1976impl<Args: Tuple, F: FnMut<Args> + ?Sized, A: Allocator> FnMut<Args> for Box<F, A> {
1977    extern "rust-call" fn call_mut(&mut self, args: Args) -> Self::Output {
1978        <F as FnMut<Args>>::call_mut(self, args)
1979    }
1980}
1981
1982#[stable(feature = "boxed_closure_impls", since = "1.35.0")]
1983impl<Args: Tuple, F: Fn<Args> + ?Sized, A: Allocator> Fn<Args> for Box<F, A> {
1984    extern "rust-call" fn call(&self, args: Args) -> Self::Output {
1985        <F as Fn<Args>>::call(self, args)
1986    }
1987}
1988
1989#[stable(feature = "async_closure", since = "1.85.0")]
1990impl<Args: Tuple, F: AsyncFnOnce<Args> + ?Sized, A: Allocator> AsyncFnOnce<Args> for Box<F, A> {
1991    type Output = F::Output;
1992    type CallOnceFuture = F::CallOnceFuture;
1993
1994    extern "rust-call" fn async_call_once(self, args: Args) -> Self::CallOnceFuture {
1995        F::async_call_once(*self, args)
1996    }
1997}
1998
1999#[stable(feature = "async_closure", since = "1.85.0")]
2000impl<Args: Tuple, F: AsyncFnMut<Args> + ?Sized, A: Allocator> AsyncFnMut<Args> for Box<F, A> {
2001    type CallRefFuture<'a>
2002        = F::CallRefFuture<'a>
2003    where
2004        Self: 'a;
2005
2006    extern "rust-call" fn async_call_mut(&mut self, args: Args) -> Self::CallRefFuture<'_> {
2007        F::async_call_mut(self, args)
2008    }
2009}
2010
2011#[stable(feature = "async_closure", since = "1.85.0")]
2012impl<Args: Tuple, F: AsyncFn<Args> + ?Sized, A: Allocator> AsyncFn<Args> for Box<F, A> {
2013    extern "rust-call" fn async_call(&self, args: Args) -> Self::CallRefFuture<'_> {
2014        F::async_call(self, args)
2015    }
2016}
2017
2018#[unstable(feature = "coerce_unsized", issue = "18598")]
2019impl<T: ?Sized + Unsize<U>, U: ?Sized, A: Allocator> CoerceUnsized<Box<U, A>> for Box<T, A> {}
2020
2021#[unstable(feature = "pin_coerce_unsized_trait", issue = "123430")]
2022unsafe impl<T: ?Sized, A: Allocator> PinCoerceUnsized for Box<T, A> {}
2023
2024// It is quite crucial that we only allow the `Global` allocator here.
2025// Handling arbitrary custom allocators (which can affect the `Box` layout heavily!)
2026// would need a lot of codegen and interpreter adjustments.
2027#[unstable(feature = "dispatch_from_dyn", issue = "none")]
2028impl<T: ?Sized + Unsize<U>, U: ?Sized> DispatchFromDyn<Box<U>> for Box<T, Global> {}
2029
2030#[stable(feature = "box_borrow", since = "1.1.0")]
2031impl<T: ?Sized, A: Allocator> Borrow<T> for Box<T, A> {
2032    fn borrow(&self) -> &T {
2033        &**self
2034    }
2035}
2036
2037#[stable(feature = "box_borrow", since = "1.1.0")]
2038impl<T: ?Sized, A: Allocator> BorrowMut<T> for Box<T, A> {
2039    fn borrow_mut(&mut self) -> &mut T {
2040        &mut **self
2041    }
2042}
2043
2044#[stable(since = "1.5.0", feature = "smart_ptr_as_ref")]
2045impl<T: ?Sized, A: Allocator> AsRef<T> for Box<T, A> {
2046    fn as_ref(&self) -> &T {
2047        &**self
2048    }
2049}
2050
2051#[stable(since = "1.5.0", feature = "smart_ptr_as_ref")]
2052impl<T: ?Sized, A: Allocator> AsMut<T> for Box<T, A> {
2053    fn as_mut(&mut self) -> &mut T {
2054        &mut **self
2055    }
2056}
2057
2058/* Nota bene
2059 *
2060 *  We could have chosen not to add this impl, and instead have written a
2061 *  function from Pin<Box<T>> to Pin<T>. Such a function would not be sound,
2062 *  because Box<T> implements Unpin even when T does not, as a result of
2063 *  this impl.
2064 *
2065 *  We chose this API instead of the alternative for a few reasons:
2066 *      - Logically, it is helpful to understand pinning in regard to the
2067 *        memory region being pointed to. For this reason none of the
2068 *        standard library pointer types support projecting through a pin
2069 *        (Box<T> is the only pointer type in std for which this would be
2070 *        safe.)
2071 *      - It is in practice very useful to have Box<T> be unconditionally
2072 *        Unpin because of trait objects, for which the structural auto
2073 *        trait functionality does not apply (e.g., Box<dyn Foo> would
2074 *        otherwise not be Unpin).
2075 *
2076 *  Another type with the same semantics as Box but only a conditional
2077 *  implementation of `Unpin` (where `T: Unpin`) would be valid/safe, and
2078 *  could have a method to project a Pin<T> from it.
2079 */
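/// `Box<T>` is `Unpin` regardless of whether `T` is `Unpin` (a short
/// illustration):
///
/// ```
/// use std::marker::PhantomPinned;
///
/// fn assert_unpin<T: Unpin>(_: &T) {}
///
/// // `PhantomPinned` is `!Unpin`, but a `Box` around it is still `Unpin`.
/// let boxed = Box::new(PhantomPinned);
/// assert_unpin(&boxed);
/// ```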
2080#[stable(feature = "pin", since = "1.33.0")]
2081impl<T: ?Sized, A: Allocator> Unpin for Box<T, A> {}
2082
2083#[unstable(feature = "coroutine_trait", issue = "43122")]
2084impl<G: ?Sized + Coroutine<R> + Unpin, R, A: Allocator> Coroutine<R> for Box<G, A> {
2085    type Yield = G::Yield;
2086    type Return = G::Return;
2087
2088    fn resume(mut self: Pin<&mut Self>, arg: R) -> CoroutineState<Self::Yield, Self::Return> {
2089        G::resume(Pin::new(&mut *self), arg)
2090    }
2091}
2092
2093#[unstable(feature = "coroutine_trait", issue = "43122")]
2094impl<G: ?Sized + Coroutine<R>, R, A: Allocator> Coroutine<R> for Pin<Box<G, A>>
2095where
2096    A: 'static,
2097{
2098    type Yield = G::Yield;
2099    type Return = G::Return;
2100
2101    fn resume(mut self: Pin<&mut Self>, arg: R) -> CoroutineState<Self::Yield, Self::Return> {
2102        G::resume((*self).as_mut(), arg)
2103    }
2104}
2105
2106#[stable(feature = "futures_api", since = "1.36.0")]
2107impl<F: ?Sized + Future + Unpin, A: Allocator> Future for Box<F, A> {
2108    type Output = F::Output;
2109
2110    fn poll(mut self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<Self::Output> {
2111        F::poll(Pin::new(&mut *self), cx)
2112    }
2113}
2114
2115#[stable(feature = "box_error", since = "1.8.0")]
2116impl<E: Error> Error for Box<E> {
2117    #[allow(deprecated, deprecated_in_future)]
2118    fn description(&self) -> &str {
2119        Error::description(&**self)
2120    }
2121
2122    #[allow(deprecated)]
2123    fn cause(&self) -> Option<&dyn Error> {
2124        Error::cause(&**self)
2125    }
2126
2127    fn source(&self) -> Option<&(dyn Error + 'static)> {
2128        Error::source(&**self)
2129    }
2130
2131    fn provide<'b>(&'b self, request: &mut error::Request<'b>) {
2132        Error::provide(&**self, request);
2133    }
2134}