// alloc/boxed.rs
//! The `Box<T>` type for heap allocation.
//!
//! [`Box<T>`], casually referred to as a 'box', provides the simplest form of
//! heap allocation in Rust. Boxes provide ownership for this allocation, and
//! drop their contents when they go out of scope. Boxes also ensure that they
//! never allocate more than `isize::MAX` bytes.
//!
//! # Examples
//!
//! Move a value from the stack to the heap by creating a [`Box`]:
//!
//! ```
//! let val: u8 = 5;
//! let boxed: Box<u8> = Box::new(val);
//! ```
//!
//! Move a value from a [`Box`] back to the stack by [dereferencing]:
//!
//! ```
//! let boxed: Box<u8> = Box::new(5);
//! let val: u8 = *boxed;
//! ```
//!
//! Creating a recursive data structure:
//!
//! ```
//! # #[allow(dead_code)]
//! #[derive(Debug)]
//! enum List<T> {
//!     Cons(T, Box<List<T>>),
//!     Nil,
//! }
//!
//! let list: List<i32> = List::Cons(1, Box::new(List::Cons(2, Box::new(List::Nil))));
//! println!("{list:?}");
//! ```
//!
//! This will print `Cons(1, Cons(2, Nil))`.
//!
//! Recursive structures must be boxed, because if the definition of `Cons`
//! looked like this:
//!
//! ```compile_fail,E0072
//! # enum List<T> {
//! Cons(T, List<T>),
//! # }
//! ```
//!
//! It wouldn't work. This is because the size of a `List` depends on how many
//! elements are in the list, and so we don't know how much memory to allocate
//! for a `Cons`. By introducing a [`Box<T>`], which has a defined size, we know how
//! big `Cons` needs to be.
//!
//! # Memory layout
//!
//! For non-zero-sized values, a [`Box`] will use the [`Global`] allocator for its allocation. It is
//! valid to convert both ways between a [`Box`] and a raw pointer allocated with the [`Global`]
//! allocator, given that the [`Layout`] used with the allocator is correct for the type and the raw
//! pointer points to a valid value of the right type. More precisely, a `value: *mut T` that has
//! been allocated with the [`Global`] allocator with `Layout::for_value(&*value)` may be converted
//! into a box using [`Box::<T>::from_raw(value)`]. Conversely, the memory backing a `value: *mut T`
//! obtained from [`Box::<T>::into_raw`] may be deallocated using the [`Global`] allocator with
//! [`Layout::for_value(&*value)`].
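//!
//! As a minimal sketch of the round-trip described above, using only the
//! `into_raw`/`from_raw` pair:
//!
//! ```
//! let boxed = Box::new(42_i32);
//! let raw: *mut i32 = Box::into_raw(boxed);
//! // SAFETY: `raw` came from `Box::into_raw` and has not been freed or reused.
//! let boxed = unsafe { Box::from_raw(raw) };
//! assert_eq!(*boxed, 42);
//! ```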
//!
//! For zero-sized values, the `Box` pointer has to be non-null and sufficiently aligned. The
//! recommended way to build a Box to a ZST if `Box::new` cannot be used is to use
//! [`ptr::NonNull::dangling`].
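//!
//! For example (a sketch; `Zst` is just an illustrative zero-sized type), a box
//! of a ZST can be built from a dangling, well-aligned pointer:
//!
//! ```
//! use std::ptr::NonNull;
//!
//! struct Zst;
//!
//! // SAFETY: `Zst` is zero-sized, so a dangling but aligned, non-null pointer
//! // is valid for it; no allocation or deallocation takes place.
//! let boxed: Box<Zst> = unsafe { Box::from_raw(NonNull::<Zst>::dangling().as_ptr()) };
//! # drop(boxed);
//! ```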
//!
//! On top of these basic layout requirements, a `Box<T>` must point to a valid value of `T`.
//!
//! So long as `T: Sized`, a `Box<T>` is guaranteed to be represented
//! as a single pointer and is also ABI-compatible with C pointers
//! (i.e. the C type `T*`). This means that if you have extern "C"
//! Rust functions that will be called from C, you can define those
//! Rust functions using `Box<T>` types, and use `T*` as corresponding
//! type on the C side. As an example, consider this C header which
//! declares functions that create and destroy some kind of `Foo`
//! value:
//!
//! ```c
//! /* C header */
//!
//! /* Returns ownership to the caller */
//! struct Foo* foo_new(void);
//!
//! /* Takes ownership from the caller; no-op when invoked with null */
//! void foo_delete(struct Foo*);
//! ```
//!
//! These two functions might be implemented in Rust as follows. Here, the
//! `struct Foo*` type from C is translated to `Box<Foo>`, which captures
//! the ownership constraints. Note also that the nullable argument to
//! `foo_delete` is represented in Rust as `Option<Box<Foo>>`, since `Box<Foo>`
//! cannot be null.
//!
//! ```
//! #[repr(C)]
//! pub struct Foo;
//!
//! #[unsafe(no_mangle)]
//! pub extern "C" fn foo_new() -> Box<Foo> {
//!     Box::new(Foo)
//! }
//!
//! #[unsafe(no_mangle)]
//! pub extern "C" fn foo_delete(_: Option<Box<Foo>>) {}
//! ```
//!
//! Even though `Box<T>` has the same representation and C ABI as a C pointer,
//! this does not mean that you can convert an arbitrary `T*` into a `Box<T>`
//! and expect things to work. `Box<T>` values will always be fully aligned,
//! non-null pointers. Moreover, the destructor for `Box<T>` will attempt to
//! free the value with the global allocator. In general, the best practice
//! is to only use `Box<T>` for pointers that originated from the global
//! allocator.
//!
//! **Important.** At least at present, you should avoid using
//! `Box<T>` types for functions that are defined in C but invoked
//! from Rust. In those cases, you should directly mirror the C types
//! as closely as possible. Using types like `Box<T>` where the C
//! definition is just using `T*` can lead to undefined behavior, as
//! described in [rust-lang/unsafe-code-guidelines#198][ucg#198].
//!
//! # Considerations for unsafe code
//!
//! **Warning: This section is not normative and is subject to change, possibly
//! being relaxed in the future! It is a simplified summary of the rules
//! currently implemented in the compiler.**
//!
//! The aliasing rules for `Box<T>` are the same as for `&mut T`. `Box<T>`
//! asserts uniqueness over its content. Using raw pointers derived from a box
//! after that box has been mutated through, moved or borrowed as `&mut T`
//! is not allowed. For more guidance on working with box from unsafe code, see
//! [rust-lang/unsafe-code-guidelines#326][ucg#326].
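//!
//! A sketch of the rule above: a raw pointer derived from a box may be used
//! only until the box is next moved, mutated, or borrowed.
//!
//! ```
//! let mut boxed = Box::new(1_u8);
//! let raw: *mut u8 = &raw mut *boxed;
//! // OK: the box has not been used since `raw` was derived from it.
//! unsafe { *raw = 2 };
//! assert_eq!(*boxed, 2);
//! // After this access through `boxed`, `raw` must not be used to write again.
//! ```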
//!
//! # Editions
//!
//! A special case exists for the implementation of `IntoIterator` for arrays on the Rust 2021
//! edition, as documented [here][array]. Unfortunately, it was later found that a similar
//! workaround should be added for boxed slices, and this was applied in the 2024 edition.
//!
//! Specifically, `IntoIterator` is implemented for `Box<[T]>` on all editions, but specific calls
//! to `into_iter()` for boxed slices will defer to the slice implementation on editions before
//! 2024:
//!
//! ```rust,edition2021
//! // Rust 2015, 2018, and 2021:
//!
//! # #![allow(boxed_slice_into_iter)] // override our `deny(warnings)`
//! let boxed_slice: Box<[i32]> = vec![0; 3].into_boxed_slice();
//!
//! // This creates a slice iterator, producing references to each value.
//! for item in boxed_slice.into_iter().enumerate() {
//!     let (i, x): (usize, &i32) = item;
//!     println!("boxed_slice[{i}] = {x}");
//! }
//!
//! // The `boxed_slice_into_iter` lint suggests this change for future compatibility:
//! for item in boxed_slice.iter().enumerate() {
//!     let (i, x): (usize, &i32) = item;
//!     println!("boxed_slice[{i}] = {x}");
//! }
//!
//! // You can explicitly iterate a boxed slice by value using `IntoIterator::into_iter`
//! for item in IntoIterator::into_iter(boxed_slice).enumerate() {
//!     let (i, x): (usize, i32) = item;
//!     println!("boxed_slice[{i}] = {x}");
//! }
//! ```
//!
//! Similar to the array implementation, this may be modified in the future to remove this override,
//! and it's best to avoid relying on this edition-dependent behavior if you wish to preserve
//! compatibility with future versions of the compiler.
//!
//! [ucg#198]: https://github.com/rust-lang/unsafe-code-guidelines/issues/198
//! [ucg#326]: https://github.com/rust-lang/unsafe-code-guidelines/issues/326
//! [dereferencing]: core::ops::Deref
//! [`Box::<T>::from_raw(value)`]: Box::from_raw
//! [`Global`]: crate::alloc::Global
//! [`Layout`]: crate::alloc::Layout
//! [`Layout::for_value(&*value)`]: crate::alloc::Layout::for_value
//! [valid]: ptr#safety

#![stable(feature = "rust1", since = "1.0.0")]

use core::borrow::{Borrow, BorrowMut};
use core::clone::CloneToUninit;
use core::cmp::Ordering;
use core::error::{self, Error};
use core::fmt;
use core::future::Future;
use core::hash::{Hash, Hasher};
use core::marker::{Tuple, Unsize};
#[cfg(not(no_global_oom_handling))]
use core::mem::MaybeUninit;
use core::mem::{self, SizedTypeProperties};
use core::ops::{
    AsyncFn, AsyncFnMut, AsyncFnOnce, CoerceUnsized, Coroutine, CoroutineState, Deref, DerefMut,
    DerefPure, DispatchFromDyn, LegacyReceiver,
};
#[cfg(not(no_global_oom_handling))]
use core::ops::{Residual, Try};
use core::pin::{Pin, PinCoerceUnsized};
use core::ptr::{self, NonNull, Unique};
use core::task::{Context, Poll};

#[cfg(not(no_global_oom_handling))]
use crate::alloc::handle_alloc_error;
use crate::alloc::{AllocError, Allocator, Global, Layout};
use crate::raw_vec::RawVec;
#[cfg(not(no_global_oom_handling))]
use crate::str::from_boxed_utf8_unchecked;

/// Conversion related impls for `Box<_>` (`From`, `downcast`, etc)
mod convert;
/// Iterator related impls for `Box<_>`.
mod iter;
/// [`ThinBox`] implementation.
mod thin;

#[unstable(feature = "thin_box", issue = "92791")]
pub use thin::ThinBox;

/// A pointer type that uniquely owns a heap allocation of type `T`.
///
/// See the [module-level documentation](../../std/boxed/index.html) for more.
#[lang = "owned_box"]
#[fundamental]
#[stable(feature = "rust1", since = "1.0.0")]
#[rustc_insignificant_dtor]
#[doc(search_unbox)]
// The declaration of the `Box` struct must be kept in sync with the
// compiler or ICEs will happen.
pub struct Box<
    T: ?Sized,
    #[unstable(feature = "allocator_api", issue = "32838")] A: Allocator = Global,
>(Unique<T>, A);

/// Monomorphic function for allocating an uninit `Box`.
#[inline]
// This is a separate function to avoid doing it in every generic version, but it
// looks small to the mir inliner (particularly in panic=abort) so leave it to
// the backend to decide whether pulling it in everywhere is worth doing.
#[rustc_no_mir_inline]
#[cfg_attr(miri, track_caller)] // even without panics, this helps for Miri backtraces
#[cfg(not(no_global_oom_handling))]
fn box_new_uninit(layout: Layout) -> *mut u8 {
    match Global.allocate(layout) {
        Ok(ptr) => ptr.as_mut_ptr(),
        Err(_) => handle_alloc_error(layout),
    }
}

/// Helper for `vec!`.
///
/// This is unsafe, but has to be marked as safe or else we couldn't use it in `vec!`.
#[doc(hidden)]
#[unstable(feature = "liballoc_internals", issue = "none")]
#[inline(always)]
#[cfg(not(no_global_oom_handling))]
#[rustc_diagnostic_item = "box_assume_init_into_vec_unsafe"]
pub fn box_assume_init_into_vec_unsafe<T, const N: usize>(
    b: Box<MaybeUninit<[T; N]>>,
) -> crate::vec::Vec<T> {
    unsafe { (b.assume_init() as Box<[T]>).into_vec() }
}

impl<T> Box<T> {
    /// Allocates memory on the heap and then places `x` into it.
    ///
    /// This doesn't actually allocate if `T` is zero-sized.
    ///
    /// # Examples
    ///
    /// ```
    /// let five = Box::new(5);
    /// ```
    #[cfg(not(no_global_oom_handling))]
    #[ferrocene::prevalidated]
    #[inline(always)]
    #[stable(feature = "rust1", since = "1.0.0")]
    #[must_use]
    #[rustc_diagnostic_item = "box_new"]
    #[cfg_attr(miri, track_caller)] // even without panics, this helps for Miri backtraces
    pub fn new(x: T) -> Self {
        // This is `Box::new_uninit` but inlined to avoid build time regressions.
        let ptr = box_new_uninit(<T as SizedTypeProperties>::LAYOUT) as *mut T;
        // Nothing below can panic so we do not have to worry about deallocating `ptr`.
        // SAFETY: we just allocated the box to store `x`.
        unsafe { core::intrinsics::write_via_move(ptr, x) };
        // SAFETY: we just initialized `b`.
        unsafe { mem::transmute(ptr) }
    }

    /// Constructs a new box with uninitialized contents.
    ///
    /// # Examples
    ///
    /// ```
    /// let mut five = Box::<u32>::new_uninit();
    /// // Deferred initialization:
    /// five.write(5);
    /// let five = unsafe { five.assume_init() };
    ///
    /// assert_eq!(*five, 5)
    /// ```
    #[cfg(not(no_global_oom_handling))]
    #[stable(feature = "new_uninit", since = "1.82.0")]
    #[must_use]
    #[inline(always)]
    #[cfg_attr(miri, track_caller)] // even without panics, this helps for Miri backtraces
    pub fn new_uninit() -> Box<mem::MaybeUninit<T>> {
        // This is the same as `Self::new_uninit_in(Global)`, but manually inlined (just like
        // `Box::new`).

        // SAFETY:
        // - If `allocate` succeeds, the returned pointer exactly matches what `Box` needs.
        unsafe { mem::transmute(box_new_uninit(<T as SizedTypeProperties>::LAYOUT)) }
    }

    /// Constructs a new `Box` with uninitialized contents, with the memory
    /// being filled with `0` bytes.
    ///
    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
    /// of this method.
    ///
    /// # Examples
    ///
    /// ```
    /// let zero = Box::<u32>::new_zeroed();
    /// let zero = unsafe { zero.assume_init() };
    ///
    /// assert_eq!(*zero, 0)
    /// ```
    ///
    /// [zeroed]: mem::MaybeUninit::zeroed
    #[cfg(not(no_global_oom_handling))]
    #[inline]
    #[stable(feature = "new_zeroed_alloc", since = "1.92.0")]
    #[must_use]
    pub fn new_zeroed() -> Box<mem::MaybeUninit<T>> {
        Self::new_zeroed_in(Global)
    }

    /// Constructs a new `Pin<Box<T>>`. If `T` does not implement [`Unpin`], then
    /// `x` will be pinned in memory and unable to be moved.
    ///
    /// Constructing and pinning of the `Box` can also be done in two steps: `Box::pin(x)`
    /// does the same as <code>[Box::into_pin]\([Box::new]\(x))</code>. Consider using
    /// [`into_pin`](Box::into_pin) if you already have a `Box<T>`, or if you want to
    /// construct a (pinned) `Box` in a different way than with [`Box::new`].
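    ///
    /// # Examples
    ///
    /// A minimal sketch:
    ///
    /// ```
    /// use std::pin::Pin;
    ///
    /// let five: Pin<Box<u32>> = Box::pin(5);
    /// assert_eq!(*five, 5);
    /// ```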
    #[cfg(not(no_global_oom_handling))]
    #[stable(feature = "pin", since = "1.33.0")]
    #[must_use]
    #[inline(always)]
    pub fn pin(x: T) -> Pin<Box<T>> {
        Box::new(x).into()
    }

    /// Allocates memory on the heap then places `x` into it,
    /// returning an error if the allocation fails.
    ///
    /// This doesn't actually allocate if `T` is zero-sized.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// let five = Box::try_new(5)?;
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[inline]
    pub fn try_new(x: T) -> Result<Self, AllocError> {
        Self::try_new_in(x, Global)
    }

    /// Constructs a new box with uninitialized contents on the heap,
    /// returning an error if the allocation fails.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// let mut five = Box::<u32>::try_new_uninit()?;
    /// // Deferred initialization:
    /// five.write(5);
    /// let five = unsafe { five.assume_init() };
    ///
    /// assert_eq!(*five, 5);
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[inline]
    pub fn try_new_uninit() -> Result<Box<mem::MaybeUninit<T>>, AllocError> {
        Box::try_new_uninit_in(Global)
    }

    /// Constructs a new `Box` with uninitialized contents, with the memory
    /// being filled with `0` bytes on the heap.
    ///
    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
    /// of this method.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// let zero = Box::<u32>::try_new_zeroed()?;
    /// let zero = unsafe { zero.assume_init() };
    ///
    /// assert_eq!(*zero, 0);
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    ///
    /// [zeroed]: mem::MaybeUninit::zeroed
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[inline]
    pub fn try_new_zeroed() -> Result<Box<mem::MaybeUninit<T>>, AllocError> {
        Box::try_new_zeroed_in(Global)
    }

    /// Maps the value in a box, reusing the allocation if possible.
    ///
    /// `f` is called on the value in the box, and the result is returned, also boxed.
    ///
    /// Note: this is an associated function, which means that you have
    /// to call it as `Box::map(b, f)` instead of `b.map(f)`. This
    /// is so that there is no conflict with a method on the inner type.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(smart_pointer_try_map)]
    ///
    /// let b = Box::new(7);
    /// let new = Box::map(b, |i| i + 7);
    /// assert_eq!(*new, 14);
    /// ```
    #[cfg(not(no_global_oom_handling))]
    #[unstable(feature = "smart_pointer_try_map", issue = "144419")]
    pub fn map<U>(this: Self, f: impl FnOnce(T) -> U) -> Box<U> {
        if size_of::<T>() == size_of::<U>() && align_of::<T>() == align_of::<U>() {
            let (value, allocation) = Box::take(this);
            Box::write(
                unsafe { mem::transmute::<Box<MaybeUninit<T>>, Box<MaybeUninit<U>>>(allocation) },
                f(value),
            )
        } else {
            Box::new(f(*this))
        }
    }

    /// Attempts to map the value in a box, reusing the allocation if possible.
    ///
    /// `f` is called on the value in the box, and if the operation succeeds, the result is
    /// returned, also boxed.
    ///
    /// Note: this is an associated function, which means that you have
    /// to call it as `Box::try_map(b, f)` instead of `b.try_map(f)`. This
    /// is so that there is no conflict with a method on the inner type.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(smart_pointer_try_map)]
    ///
    /// let b = Box::new(7);
    /// let new = Box::try_map(b, u32::try_from).unwrap();
    /// assert_eq!(*new, 7);
    /// ```
    #[cfg(not(no_global_oom_handling))]
    #[unstable(feature = "smart_pointer_try_map", issue = "144419")]
    pub fn try_map<R>(
        this: Self,
        f: impl FnOnce(T) -> R,
    ) -> <R::Residual as Residual<Box<R::Output>>>::TryType
    where
        R: Try,
        R::Residual: Residual<Box<R::Output>>,
    {
        if size_of::<T>() == size_of::<R::Output>() && align_of::<T>() == align_of::<R::Output>() {
            let (value, allocation) = Box::take(this);
            try {
                Box::write(
                    unsafe {
                        mem::transmute::<Box<MaybeUninit<T>>, Box<MaybeUninit<R::Output>>>(
                            allocation,
                        )
                    },
                    f(value)?,
                )
            }
        } else {
            try { Box::new(f(*this)?) }
        }
    }
}

impl<T, A: Allocator> Box<T, A> {
    /// Allocates memory in the given allocator then places `x` into it.
    ///
    /// This doesn't actually allocate if `T` is zero-sized.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let five = Box::new_in(5, System);
    /// ```
    #[cfg(not(no_global_oom_handling))]
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[must_use]
    #[inline]
    pub fn new_in(x: T, alloc: A) -> Self
    where
        A: Allocator,
    {
        let mut boxed = Self::new_uninit_in(alloc);
        boxed.write(x);
        unsafe { boxed.assume_init() }
    }

    /// Allocates memory in the given allocator then places `x` into it,
    /// returning an error if the allocation fails.
    ///
    /// This doesn't actually allocate if `T` is zero-sized.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let five = Box::try_new_in(5, System)?;
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[inline]
    pub fn try_new_in(x: T, alloc: A) -> Result<Self, AllocError>
    where
        A: Allocator,
    {
        let mut boxed = Self::try_new_uninit_in(alloc)?;
        boxed.write(x);
        unsafe { Ok(boxed.assume_init()) }
    }

    /// Constructs a new box with uninitialized contents in the provided allocator.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let mut five = Box::<u32, _>::new_uninit_in(System);
    /// // Deferred initialization:
    /// five.write(5);
    /// let five = unsafe { five.assume_init() };
    ///
    /// assert_eq!(*five, 5)
    /// ```
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[cfg(not(no_global_oom_handling))]
    #[must_use]
    pub fn new_uninit_in(alloc: A) -> Box<mem::MaybeUninit<T>, A>
    where
        A: Allocator,
    {
        let layout = Layout::new::<mem::MaybeUninit<T>>();
        // NOTE: Prefer match over unwrap_or_else since closure sometimes not inlineable.
        // That would make code size bigger.
        match Box::try_new_uninit_in(alloc) {
            Ok(m) => m,
            Err(_) => handle_alloc_error(layout),
        }
    }

    /// Constructs a new box with uninitialized contents in the provided allocator,
    /// returning an error if the allocation fails.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let mut five = Box::<u32, _>::try_new_uninit_in(System)?;
    /// // Deferred initialization:
    /// five.write(5);
    /// let five = unsafe { five.assume_init() };
    ///
    /// assert_eq!(*five, 5);
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    #[unstable(feature = "allocator_api", issue = "32838")]
    pub fn try_new_uninit_in(alloc: A) -> Result<Box<mem::MaybeUninit<T>, A>, AllocError>
    where
        A: Allocator,
    {
        let ptr = if T::IS_ZST {
            NonNull::dangling()
        } else {
            let layout = Layout::new::<mem::MaybeUninit<T>>();
            alloc.allocate(layout)?.cast()
        };
        unsafe { Ok(Box::from_raw_in(ptr.as_ptr(), alloc)) }
    }

    /// Constructs a new `Box` with uninitialized contents, with the memory
    /// being filled with `0` bytes in the provided allocator.
    ///
    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
    /// of this method.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let zero = Box::<u32, _>::new_zeroed_in(System);
    /// let zero = unsafe { zero.assume_init() };
    ///
    /// assert_eq!(*zero, 0)
    /// ```
    ///
    /// [zeroed]: mem::MaybeUninit::zeroed
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[cfg(not(no_global_oom_handling))]
    #[must_use]
    pub fn new_zeroed_in(alloc: A) -> Box<mem::MaybeUninit<T>, A>
    where
        A: Allocator,
    {
        let layout = Layout::new::<mem::MaybeUninit<T>>();
        // NOTE: Prefer match over unwrap_or_else since closure sometimes not inlineable.
        // That would make code size bigger.
        match Box::try_new_zeroed_in(alloc) {
            Ok(m) => m,
            Err(_) => handle_alloc_error(layout),
        }
    }

    /// Constructs a new `Box` with uninitialized contents, with the memory
    /// being filled with `0` bytes in the provided allocator,
    /// returning an error if the allocation fails.
    ///
    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
    /// of this method.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let zero = Box::<u32, _>::try_new_zeroed_in(System)?;
    /// let zero = unsafe { zero.assume_init() };
    ///
    /// assert_eq!(*zero, 0);
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    ///
    /// [zeroed]: mem::MaybeUninit::zeroed
    #[unstable(feature = "allocator_api", issue = "32838")]
    pub fn try_new_zeroed_in(alloc: A) -> Result<Box<mem::MaybeUninit<T>, A>, AllocError>
    where
        A: Allocator,
    {
        let ptr = if T::IS_ZST {
            NonNull::dangling()
        } else {
            let layout = Layout::new::<mem::MaybeUninit<T>>();
            alloc.allocate_zeroed(layout)?.cast()
        };
        unsafe { Ok(Box::from_raw_in(ptr.as_ptr(), alloc)) }
    }

    /// Constructs a new `Pin<Box<T, A>>`. If `T` does not implement [`Unpin`], then
    /// `x` will be pinned in memory and unable to be moved.
    ///
    /// Constructing and pinning of the `Box` can also be done in two steps: `Box::pin_in(x, alloc)`
    /// does the same as <code>[Box::into_pin]\([Box::new_in]\(x, alloc))</code>. Consider using
    /// [`into_pin`](Box::into_pin) if you already have a `Box<T, A>`, or if you want to
    /// construct a (pinned) `Box` in a different way than with [`Box::new_in`].
    #[cfg(not(no_global_oom_handling))]
    #[unstable(feature = "allocator_api", issue = "32838")]
    #[must_use]
    #[inline(always)]
    pub fn pin_in(x: T, alloc: A) -> Pin<Self>
    where
        A: 'static + Allocator,
    {
        Self::into_pin(Self::new_in(x, alloc))
    }

    /// Converts a `Box<T>` into a `Box<[T]>`.
    ///
    /// This conversion does not allocate on the heap and happens in place.
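    ///
    /// # Examples
    ///
    /// A minimal sketch (gated on the unstable `box_into_boxed_slice` feature):
    ///
    /// ```
    /// #![feature(box_into_boxed_slice)]
    ///
    /// let boxed = Box::new(5);
    /// let slice: Box<[i32]> = Box::into_boxed_slice(boxed);
    /// assert_eq!(&*slice, &[5]);
    /// ```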
    #[unstable(feature = "box_into_boxed_slice", issue = "71582")]
    pub fn into_boxed_slice(boxed: Self) -> Box<[T], A> {
        let (raw, alloc) = Box::into_raw_with_allocator(boxed);
        unsafe { Box::from_raw_in(raw as *mut [T; 1], alloc) }
    }

    /// Consumes the `Box`, returning the wrapped value.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(box_into_inner)]
    ///
    /// let c = Box::new(5);
    ///
    /// assert_eq!(Box::into_inner(c), 5);
    /// ```
    #[unstable(feature = "box_into_inner", issue = "80437")]
    #[inline]
    pub fn into_inner(boxed: Self) -> T {
        *boxed
    }

    /// Consumes the `Box` without consuming its allocation, returning the wrapped value and a `Box`
    /// to the uninitialized memory where the wrapped value used to live.
    ///
    /// This can be used together with [`write`](Box::write) to reuse the allocation for multiple
    /// boxed values.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(box_take)]
    ///
    /// let c = Box::new(5);
    ///
    /// // take the value out of the box
    /// let (value, uninit) = Box::take(c);
    /// assert_eq!(value, 5);
    ///
    /// // reuse the box for a second value
    /// let c = Box::write(uninit, 6);
    /// assert_eq!(*c, 6);
    /// ```
    #[unstable(feature = "box_take", issue = "147212")]
    pub fn take(boxed: Self) -> (T, Box<mem::MaybeUninit<T>, A>) {
        unsafe {
            let (raw, alloc) = Box::into_non_null_with_allocator(boxed);
            let value = raw.read();
            let uninit = Box::from_non_null_in(raw.cast_uninit(), alloc);
            (value, uninit)
        }
    }
}

impl<T: ?Sized + CloneToUninit> Box<T> {
    /// Allocates memory on the heap then clones `src` into it.
    ///
    /// This doesn't actually allocate if `src` is zero-sized.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(clone_from_ref)]
    ///
    /// let hello: Box<str> = Box::clone_from_ref("hello");
    /// ```
    #[cfg(not(no_global_oom_handling))]
    #[unstable(feature = "clone_from_ref", issue = "149075")]
    #[must_use]
    #[inline]
    pub fn clone_from_ref(src: &T) -> Box<T> {
        Box::clone_from_ref_in(src, Global)
    }

    /// Allocates memory on the heap then clones `src` into it, returning an error if allocation fails.
    ///
    /// This doesn't actually allocate if `src` is zero-sized.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(clone_from_ref)]
    /// #![feature(allocator_api)]
    ///
    /// let hello: Box<str> = Box::try_clone_from_ref("hello")?;
    /// # Ok::<(), std::alloc::AllocError>(())
    /// ```
    #[unstable(feature = "clone_from_ref", issue = "149075")]
    //#[unstable(feature = "allocator_api", issue = "32838")]
    #[must_use]
    #[inline]
    pub fn try_clone_from_ref(src: &T) -> Result<Box<T>, AllocError> {
        Box::try_clone_from_ref_in(src, Global)
    }
}
809
810impl<T: ?Sized + CloneToUninit, A: Allocator> Box<T, A> {
811    /// Allocates memory in the given allocator then clones `src` into it.
812    ///
813    /// This doesn't actually allocate if `src` is zero-sized.
814    ///
815    /// # Examples
816    ///
817    /// ```
818    /// #![feature(clone_from_ref)]
819    /// #![feature(allocator_api)]
820    ///
821    /// use std::alloc::System;
822    ///
823    /// let hello: Box<str, System> = Box::clone_from_ref_in("hello", System);
824    /// ```
825    #[cfg(not(no_global_oom_handling))]
826    #[unstable(feature = "clone_from_ref", issue = "149075")]
827    //#[unstable(feature = "allocator_api", issue = "32838")]
828    #[must_use]
829    #[inline]
830    pub fn clone_from_ref_in(src: &T, alloc: A) -> Box<T, A> {
831        let layout = Layout::for_value::<T>(src);
832        match Box::try_clone_from_ref_in(src, alloc) {
833            Ok(bx) => bx,
834            Err(_) => handle_alloc_error(layout),
835        }
836    }
837
838    /// Allocates memory in the given allocator then clones `src` into it, returning an error if allocation fails.
839    ///
840    /// This doesn't actually allocate if `src` is zero-sized.
841    ///
842    /// # Examples
843    ///
844    /// ```
845    /// #![feature(clone_from_ref)]
846    /// #![feature(allocator_api)]
847    ///
848    /// use std::alloc::System;
849    ///
850    /// let hello: Box<str, System> = Box::try_clone_from_ref_in("hello", System)?;
851    /// # Ok::<(), std::alloc::AllocError>(())
852    /// ```
853    #[unstable(feature = "clone_from_ref", issue = "149075")]
854    //#[unstable(feature = "allocator_api", issue = "32838")]
855    #[must_use]
856    #[inline]
857    pub fn try_clone_from_ref_in(src: &T, alloc: A) -> Result<Box<T, A>, AllocError> {
858        struct DeallocDropGuard<'a, A: Allocator>(Layout, &'a A, NonNull<u8>);
859        impl<'a, A: Allocator> Drop for DeallocDropGuard<'a, A> {
860            fn drop(&mut self) {
861                let &mut DeallocDropGuard(layout, alloc, ptr) = self;
862                // Safety: `ptr` was allocated by `*alloc` with layout `layout`
863                unsafe {
864                    alloc.deallocate(ptr, layout);
865                }
866            }
867        }
868        let layout = Layout::for_value::<T>(src);
869        let (ptr, guard) = if layout.size() == 0 {
870            (layout.dangling_ptr(), None)
871        } else {
872            // Safety: layout is non-zero-sized
873            let ptr = alloc.allocate(layout)?.cast();
874            (ptr, Some(DeallocDropGuard(layout, &alloc, ptr)))
875        };
876        let ptr = ptr.as_ptr();
877        // Safety: `*ptr` is newly allocated, correctly aligned to `align_of_val(src)`,
878        // and is valid for writes for `size_of_val(src)`.
879        // If this panics, then `guard` will deallocate for us (if allocation occurred)
880        unsafe {
881            <T as CloneToUninit>::clone_to_uninit(src, ptr);
882        }
883        // Defuse the deallocate guard
884        core::mem::forget(guard);
885        // Safety: We just initialized `*ptr` as a clone of `src`
886        Ok(unsafe { Box::from_raw_in(ptr.with_metadata_of(src), alloc) })
887    }
888}
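The `DeallocDropGuard` above is an instance of a general panic-safety pattern: hold the fresh allocation in a guard whose `Drop` frees it, then `mem::forget` the guard once initialization has succeeded, so an unwind mid-initialization cannot leak. A minimal standalone sketch of that pattern (the names here are illustrative, not part of this module):

```rust
use std::alloc::{alloc, dealloc, Layout};
use std::mem;

/// Frees `ptr` on drop unless the guard is defused with `mem::forget`.
struct DeallocGuard {
    ptr: *mut u8,
    layout: Layout,
}

impl Drop for DeallocGuard {
    fn drop(&mut self) {
        // Runs only if initialization unwound before the guard was forgotten.
        unsafe { dealloc(self.ptr, self.layout) }
    }
}

fn init_guarded() -> u32 {
    let layout = Layout::new::<u32>();
    let ptr = unsafe { alloc(layout) };
    assert!(!ptr.is_null());
    let guard = DeallocGuard { ptr, layout };

    // Fallible or panicking initialization would go here; if it unwinds,
    // `guard` deallocates and nothing leaks.
    unsafe { ptr.cast::<u32>().write(42) };

    // Initialization succeeded: defuse the guard so the memory survives.
    mem::forget(guard);

    // Read the value back, then free the allocation normally.
    unsafe {
        let v = *ptr.cast::<u32>();
        dealloc(ptr, layout);
        v
    }
}

fn main() {
    assert_eq!(init_guarded(), 42);
}
```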
889
890impl<T> Box<[T]> {
891    /// Constructs a new boxed slice with uninitialized contents.
892    ///
893    /// # Examples
894    ///
895    /// ```
896    /// let mut values = Box::<[u32]>::new_uninit_slice(3);
897    /// // Deferred initialization:
898    /// values[0].write(1);
899    /// values[1].write(2);
900    /// values[2].write(3);
901    /// let values = unsafe { values.assume_init() };
902    ///
903    /// assert_eq!(*values, [1, 2, 3])
904    /// ```
905    #[cfg(not(no_global_oom_handling))]
906    #[stable(feature = "new_uninit", since = "1.82.0")]
907    #[must_use]
908    pub fn new_uninit_slice(len: usize) -> Box<[mem::MaybeUninit<T>]> {
909        unsafe { RawVec::with_capacity(len).into_box(len) }
910    }
911
912    /// Constructs a new boxed slice with uninitialized contents, with the memory
913    /// being filled with `0` bytes.
914    ///
915    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
916    /// of this method.
917    ///
918    /// # Examples
919    ///
920    /// ```
921    /// let values = Box::<[u32]>::new_zeroed_slice(3);
922    /// let values = unsafe { values.assume_init() };
923    ///
924    /// assert_eq!(*values, [0, 0, 0])
925    /// ```
926    ///
927    /// [zeroed]: mem::MaybeUninit::zeroed
928    #[cfg(not(no_global_oom_handling))]
929    #[stable(feature = "new_zeroed_alloc", since = "1.92.0")]
930    #[must_use]
931    pub fn new_zeroed_slice(len: usize) -> Box<[mem::MaybeUninit<T>]> {
932        unsafe { RawVec::with_capacity_zeroed(len).into_box(len) }
933    }
934
935    /// Constructs a new boxed slice with uninitialized contents. Returns an error if
936    /// the allocation fails.
937    ///
938    /// # Examples
939    ///
940    /// ```
941    /// #![feature(allocator_api)]
942    ///
943    /// let mut values = Box::<[u32]>::try_new_uninit_slice(3)?;
944    /// // Deferred initialization:
945    /// values[0].write(1);
946    /// values[1].write(2);
947    /// values[2].write(3);
948    /// let values = unsafe { values.assume_init() };
949    ///
950    /// assert_eq!(*values, [1, 2, 3]);
951    /// # Ok::<(), std::alloc::AllocError>(())
952    /// ```
953    #[unstable(feature = "allocator_api", issue = "32838")]
954    #[inline]
955    pub fn try_new_uninit_slice(len: usize) -> Result<Box<[mem::MaybeUninit<T>]>, AllocError> {
956        let ptr = if T::IS_ZST || len == 0 {
957            NonNull::dangling()
958        } else {
959            let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
960                Ok(l) => l,
961                Err(_) => return Err(AllocError),
962            };
963            Global.allocate(layout)?.cast()
964        };
965        unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, Global).into_box(len)) }
966    }
967
968    /// Constructs a new boxed slice with uninitialized contents, with the memory
969    /// being filled with `0` bytes. Returns an error if the allocation fails.
970    ///
971    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
972    /// of this method.
973    ///
974    /// # Examples
975    ///
976    /// ```
977    /// #![feature(allocator_api)]
978    ///
979    /// let values = Box::<[u32]>::try_new_zeroed_slice(3)?;
980    /// let values = unsafe { values.assume_init() };
981    ///
982    /// assert_eq!(*values, [0, 0, 0]);
983    /// # Ok::<(), std::alloc::AllocError>(())
984    /// ```
985    ///
986    /// [zeroed]: mem::MaybeUninit::zeroed
987    #[unstable(feature = "allocator_api", issue = "32838")]
988    #[inline]
989    pub fn try_new_zeroed_slice(len: usize) -> Result<Box<[mem::MaybeUninit<T>]>, AllocError> {
990        let ptr = if T::IS_ZST || len == 0 {
991            NonNull::dangling()
992        } else {
993            let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
994                Ok(l) => l,
995                Err(_) => return Err(AllocError),
996            };
997            Global.allocate_zeroed(layout)?.cast()
998        };
999        unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, Global).into_box(len)) }
1000    }
1001
1002    /// Converts the boxed slice into a boxed array.
1003    ///
1004    /// This operation does not reallocate; the underlying array of the slice is simply reinterpreted as an array type.
1005    ///
1006    /// If `N` is not exactly equal to the length of `self`, then this method returns `None`.
1007    #[unstable(feature = "alloc_slice_into_array", issue = "148082")]
1008    #[inline]
1009    #[must_use]
1010    pub fn into_array<const N: usize>(self) -> Option<Box<[T; N]>> {
1011        if self.len() == N {
1012            let ptr = Self::into_raw(self) as *mut [T; N];
1013
1014            // SAFETY: The underlying array of a slice has the exact same layout as an actual array `[T; N]` if `N` is equal to the slice's length.
1015            let me = unsafe { Box::from_raw(ptr) };
1016            Some(me)
1017        } else {
1018            None
1019        }
1020    }
1021}
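`into_array` is still feature-gated; on stable Rust the equivalent length-checked conversion is already available through the `TryFrom<Box<[T]>> for Box<[T; N]>` impl, which likewise reinterprets the allocation in place without reallocating. A short sketch:

```rust
fn demo() -> (Box<[u32; 3]>, bool) {
    // Length matches N, so the conversion succeeds in place.
    let boxed: Box<[u32]> = vec![1, 2, 3].into_boxed_slice();
    let arr: Box<[u32; 3]> = boxed.try_into().expect("length is 3");

    // A mismatched N hands the original box back in the `Err` variant.
    let short: Box<[u32]> = vec![1, 2].into_boxed_slice();
    let failed = <Box<[u32; 3]>>::try_from(short).is_err();

    (arr, failed)
}

fn main() {
    let (arr, failed) = demo();
    assert_eq!(*arr, [1, 2, 3]);
    assert!(failed);
}
```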
1022
1023impl<T, A: Allocator> Box<[T], A> {
1024    /// Constructs a new boxed slice with uninitialized contents in the provided allocator.
1025    ///
1026    /// # Examples
1027    ///
1028    /// ```
1029    /// #![feature(allocator_api)]
1030    ///
1031    /// use std::alloc::System;
1032    ///
1033    /// let mut values = Box::<[u32], _>::new_uninit_slice_in(3, System);
1034    /// // Deferred initialization:
1035    /// values[0].write(1);
1036    /// values[1].write(2);
1037    /// values[2].write(3);
1038    /// let values = unsafe { values.assume_init() };
1039    ///
1040    /// assert_eq!(*values, [1, 2, 3])
1041    /// ```
1042    #[cfg(not(no_global_oom_handling))]
1043    #[unstable(feature = "allocator_api", issue = "32838")]
1044    #[must_use]
1045    pub fn new_uninit_slice_in(len: usize, alloc: A) -> Box<[mem::MaybeUninit<T>], A> {
1046        unsafe { RawVec::with_capacity_in(len, alloc).into_box(len) }
1047    }
1048
1049    /// Constructs a new boxed slice with uninitialized contents in the provided allocator,
1050    /// with the memory being filled with `0` bytes.
1051    ///
1052    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
1053    /// of this method.
1054    ///
1055    /// # Examples
1056    ///
1057    /// ```
1058    /// #![feature(allocator_api)]
1059    ///
1060    /// use std::alloc::System;
1061    ///
1062    /// let values = Box::<[u32], _>::new_zeroed_slice_in(3, System);
1063    /// let values = unsafe { values.assume_init() };
1064    ///
1065    /// assert_eq!(*values, [0, 0, 0])
1066    /// ```
1067    ///
1068    /// [zeroed]: mem::MaybeUninit::zeroed
1069    #[cfg(not(no_global_oom_handling))]
1070    #[unstable(feature = "allocator_api", issue = "32838")]
1071    #[must_use]
1072    pub fn new_zeroed_slice_in(len: usize, alloc: A) -> Box<[mem::MaybeUninit<T>], A> {
1073        unsafe { RawVec::with_capacity_zeroed_in(len, alloc).into_box(len) }
1074    }
1075
1076    /// Constructs a new boxed slice with uninitialized contents in the provided allocator. Returns an error if
1077    /// the allocation fails.
1078    ///
1079    /// # Examples
1080    ///
1081    /// ```
1082    /// #![feature(allocator_api)]
1083    ///
1084    /// use std::alloc::System;
1085    ///
1086    /// let mut values = Box::<[u32], _>::try_new_uninit_slice_in(3, System)?;
1087    /// // Deferred initialization:
1088    /// values[0].write(1);
1089    /// values[1].write(2);
1090    /// values[2].write(3);
1091    /// let values = unsafe { values.assume_init() };
1092    ///
1093    /// assert_eq!(*values, [1, 2, 3]);
1094    /// # Ok::<(), std::alloc::AllocError>(())
1095    /// ```
1096    #[unstable(feature = "allocator_api", issue = "32838")]
1097    #[inline]
1098    pub fn try_new_uninit_slice_in(
1099        len: usize,
1100        alloc: A,
1101    ) -> Result<Box<[mem::MaybeUninit<T>], A>, AllocError> {
1102        let ptr = if T::IS_ZST || len == 0 {
1103            NonNull::dangling()
1104        } else {
1105            let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
1106                Ok(l) => l,
1107                Err(_) => return Err(AllocError),
1108            };
1109            alloc.allocate(layout)?.cast()
1110        };
1111        unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, alloc).into_box(len)) }
1112    }
1113
1114    /// Constructs a new boxed slice with uninitialized contents in the provided allocator, with the memory
1115    /// being filled with `0` bytes. Returns an error if the allocation fails.
1116    ///
1117    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
1118    /// of this method.
1119    ///
1120    /// # Examples
1121    ///
1122    /// ```
1123    /// #![feature(allocator_api)]
1124    ///
1125    /// use std::alloc::System;
1126    ///
1127    /// let values = Box::<[u32], _>::try_new_zeroed_slice_in(3, System)?;
1128    /// let values = unsafe { values.assume_init() };
1129    ///
1130    /// assert_eq!(*values, [0, 0, 0]);
1131    /// # Ok::<(), std::alloc::AllocError>(())
1132    /// ```
1133    ///
1134    /// [zeroed]: mem::MaybeUninit::zeroed
1135    #[unstable(feature = "allocator_api", issue = "32838")]
1136    #[inline]
1137    pub fn try_new_zeroed_slice_in(
1138        len: usize,
1139        alloc: A,
1140    ) -> Result<Box<[mem::MaybeUninit<T>], A>, AllocError> {
1141        let ptr = if T::IS_ZST || len == 0 {
1142            NonNull::dangling()
1143        } else {
1144            let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
1145                Ok(l) => l,
1146                Err(_) => return Err(AllocError),
1147            };
1148            alloc.allocate_zeroed(layout)?.cast()
1149        };
1150        unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, alloc).into_box(len)) }
1151    }
1152}
1153
1154impl<T, A: Allocator> Box<mem::MaybeUninit<T>, A> {
1155    /// Converts to `Box<T, A>`.
1156    ///
1157    /// # Safety
1158    ///
1159    /// As with [`MaybeUninit::assume_init`],
1160    /// it is up to the caller to guarantee that the value
1161    /// really is in an initialized state.
1162    /// Calling this when the content is not yet fully initialized
1163    /// causes immediate undefined behavior.
1164    ///
1165    /// [`MaybeUninit::assume_init`]: mem::MaybeUninit::assume_init
1166    ///
1167    /// # Examples
1168    ///
1169    /// ```
1170    /// let mut five = Box::<u32>::new_uninit();
1171    /// // Deferred initialization:
1172    /// five.write(5);
1173    /// let five: Box<u32> = unsafe { five.assume_init() };
1174    ///
1175    /// assert_eq!(*five, 5)
1176    /// ```
1177    #[stable(feature = "new_uninit", since = "1.82.0")]
1178    #[inline(always)]
1179    pub unsafe fn assume_init(self) -> Box<T, A> {
1180        // This is used in the `vec!` macro, so we optimize for minimal IR generation
1181        // even in debug builds.
1182        // SAFETY: `Box<T>` and `Box<MaybeUninit<T>>` have the same layout.
1183        unsafe { core::intrinsics::transmute_unchecked(self) }
1184    }
1185
1186    /// Writes the value and converts to `Box<T, A>`.
1187    ///
1188    /// This method converts the box similarly to [`Box::assume_init`] but
1189    /// writes `value` into it before conversion, thus guaranteeing safety.
1190    /// In some scenarios this method may improve performance because
1191    /// the compiler may be able to optimize away the copy from the stack.
1192    ///
1193    /// # Examples
1194    ///
1195    /// ```
1196    /// let big_box = Box::<[usize; 1024]>::new_uninit();
1197    ///
1198    /// let mut array = [0; 1024];
1199    /// for (i, place) in array.iter_mut().enumerate() {
1200    ///     *place = i;
1201    /// }
1202    ///
1203    /// // The optimizer may be able to elide this copy, so the previous code
1204    /// // writes to the heap directly.
1205    /// let big_box = Box::write(big_box, array);
1206    ///
1207    /// for (i, x) in big_box.iter().enumerate() {
1208    ///     assert_eq!(*x, i);
1209    /// }
1210    /// ```
1211    #[stable(feature = "box_uninit_write", since = "1.87.0")]
1212    #[inline]
1213    pub fn write(mut boxed: Self, value: T) -> Box<T, A> {
1214        unsafe {
1215            (*boxed).write(value);
1216            boxed.assume_init()
1217        }
1218    }
1219}
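Taken together, `new_uninit` and `Box::write` give a fully safe deferred-initialization path on stable Rust (1.87+): the allocation happens first, and `assume_init` is performed internally only after the value has been written. A small sketch:

```rust
fn build() -> Box<[u64; 4]> {
    // Allocate uninitialized storage on the heap first...
    let uninit = Box::<[u64; 4]>::new_uninit();
    // ...then move the value in; no `unsafe` needed because `Box::write`
    // only assumes initialization after performing the write.
    Box::write(uninit, [1, 2, 3, 4])
}

fn main() {
    assert_eq!(*build(), [1, 2, 3, 4]);
}
```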
1220
1221impl<T, A: Allocator> Box<[mem::MaybeUninit<T>], A> {
1222    /// Converts to `Box<[T], A>`.
1223    ///
1224    /// # Safety
1225    ///
1226    /// As with [`MaybeUninit::assume_init`],
1227    /// it is up to the caller to guarantee that the values
1228    /// really are in an initialized state.
1229    /// Calling this when the content is not yet fully initialized
1230    /// causes immediate undefined behavior.
1231    ///
1232    /// [`MaybeUninit::assume_init`]: mem::MaybeUninit::assume_init
1233    ///
1234    /// # Examples
1235    ///
1236    /// ```
1237    /// let mut values = Box::<[u32]>::new_uninit_slice(3);
1238    /// // Deferred initialization:
1239    /// values[0].write(1);
1240    /// values[1].write(2);
1241    /// values[2].write(3);
1242    /// let values = unsafe { values.assume_init() };
1243    ///
1244    /// assert_eq!(*values, [1, 2, 3])
1245    /// ```
1246    #[stable(feature = "new_uninit", since = "1.82.0")]
1247    #[inline]
1248    pub unsafe fn assume_init(self) -> Box<[T], A> {
1249        let (raw, alloc) = Box::into_raw_with_allocator(self);
1250        unsafe { Box::from_raw_in(raw as *mut [T], alloc) }
1251    }
1252}
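For slices whose length is only known at run time, the same deferred-initialization pattern applies element by element; every slot must be written before calling `assume_init`, or the call is undefined behavior. A sketch:

```rust
fn squares(len: usize) -> Box<[u32]> {
    let mut buf = Box::<[u32]>::new_uninit_slice(len);
    for (i, slot) in buf.iter_mut().enumerate() {
        slot.write((i as u32) * (i as u32));
    }
    // SAFETY: the loop above initialized every element.
    unsafe { buf.assume_init() }
}

fn main() {
    assert_eq!(*squares(4), [0, 1, 4, 9]);
}
```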
1253
1254impl<T: ?Sized> Box<T> {
1255    /// Constructs a box from a raw pointer.
1256    ///
1257    /// After calling this function, the raw pointer is owned by the
1258    /// resulting `Box`. Specifically, the `Box` destructor will call
1259    /// the destructor of `T` and free the allocated memory. For this
1260    /// to be safe, the memory must have been allocated in accordance
1261    /// with the [memory layout] used by `Box`.
1262    ///
1263    /// # Safety
1264    ///
1265    /// This function is unsafe because improper use may lead to
1266    /// memory problems. For example, a double-free may occur if the
1267    /// function is called twice on the same raw pointer.
1268    ///
1269    /// The raw pointer must point to a block of memory allocated by the global allocator.
1270    ///
1271    /// The safety conditions are described in the [memory layout] section.
1272    ///
1273    /// # Examples
1274    ///
1275    /// Recreate a `Box` which was previously converted to a raw pointer
1276    /// using [`Box::into_raw`]:
1277    /// ```
1278    /// let x = Box::new(5);
1279    /// let ptr = Box::into_raw(x);
1280    /// let x = unsafe { Box::from_raw(ptr) };
1281    /// ```
1282    /// Manually create a `Box` from scratch by using the global allocator:
1283    /// ```
1284    /// use std::alloc::{alloc, Layout};
1285    ///
1286    /// unsafe {
1287    ///     let ptr = alloc(Layout::new::<i32>()) as *mut i32;
1288    ///     // In general .write is required to avoid attempting to destruct
1289    ///     // the (uninitialized) previous contents of `ptr`, though for this
1290    ///     // simple example `*ptr = 5` would have worked as well.
1291    ///     ptr.write(5);
1292    ///     let x = Box::from_raw(ptr);
1293    /// }
1294    /// ```
1295    ///
1296    /// [memory layout]: self#memory-layout
1297    #[stable(feature = "box_raw", since = "1.4.0")]
1298    #[inline]
1299    #[must_use = "call `drop(Box::from_raw(ptr))` if you intend to drop the `Box`"]
1300    pub unsafe fn from_raw(raw: *mut T) -> Self {
1301        unsafe { Self::from_raw_in(raw, Global) }
1302    }
1303
1304    /// Constructs a box from a `NonNull` pointer.
1305    ///
1306    /// After calling this function, the `NonNull` pointer is owned by
1307    /// the resulting `Box`. Specifically, the `Box` destructor will call
1308    /// the destructor of `T` and free the allocated memory. For this
1309    /// to be safe, the memory must have been allocated in accordance
1310    /// with the [memory layout] used by `Box`.
1311    ///
1312    /// # Safety
1313    ///
1314    /// This function is unsafe because improper use may lead to
1315    /// memory problems. For example, a double-free may occur if the
1316    /// function is called twice on the same `NonNull` pointer.
1317    ///
1318    /// The non-null pointer must point to a block of memory allocated by the global allocator.
1319    ///
1320    /// The safety conditions are described in the [memory layout] section.
1321    ///
1322    /// # Examples
1323    ///
1324    /// Recreate a `Box` which was previously converted to a `NonNull`
1325    /// pointer using [`Box::into_non_null`]:
1326    /// ```
1327    /// #![feature(box_vec_non_null)]
1328    ///
1329    /// let x = Box::new(5);
1330    /// let non_null = Box::into_non_null(x);
1331    /// let x = unsafe { Box::from_non_null(non_null) };
1332    /// ```
1333    /// Manually create a `Box` from scratch by using the global allocator:
1334    /// ```
1335    /// #![feature(box_vec_non_null)]
1336    ///
1337    /// use std::alloc::{alloc, Layout};
1338    /// use std::ptr::NonNull;
1339    ///
1340    /// unsafe {
1341    ///     let non_null = NonNull::new(alloc(Layout::new::<i32>()).cast::<i32>())
1342    ///         .expect("allocation failed");
1343    ///     // In general .write is required to avoid attempting to destruct
1344    ///     // the (uninitialized) previous contents of `non_null`.
1345    ///     non_null.write(5);
1346    ///     let x = Box::from_non_null(non_null);
1347    /// }
1348    /// ```
1349    ///
1350    /// [memory layout]: self#memory-layout
1351    #[unstable(feature = "box_vec_non_null", issue = "130364")]
1352    #[inline]
1353    #[must_use = "call `drop(Box::from_non_null(ptr))` if you intend to drop the `Box`"]
1354    pub unsafe fn from_non_null(ptr: NonNull<T>) -> Self {
1355        unsafe { Self::from_raw(ptr.as_ptr()) }
1356    }
1357
1358    /// Consumes the `Box`, returning a wrapped raw pointer.
1359    ///
1360    /// The pointer will be properly aligned and non-null.
1361    ///
1362    /// After calling this function, the caller is responsible for the
1363    /// memory previously managed by the `Box`. In particular, the
1364    /// caller should properly destroy `T` and release the memory, taking
1365    /// into account the [memory layout] used by `Box`. The easiest way to
1366    /// do this is to convert the raw pointer back into a `Box` with the
1367    /// [`Box::from_raw`] function, allowing the `Box` destructor to perform
1368    /// the cleanup.
1369    ///
1370    /// Note: this is an associated function, which means that you have
1371    /// to call it as `Box::into_raw(b)` instead of `b.into_raw()`. This
1372    /// is so that there is no conflict with a method on the inner type.
1373    ///
1374    /// # Examples
1375    /// Converting the raw pointer back into a `Box` with [`Box::from_raw`]
1376    /// for automatic cleanup:
1377    /// ```
1378    /// let x = Box::new(String::from("Hello"));
1379    /// let ptr = Box::into_raw(x);
1380    /// let x = unsafe { Box::from_raw(ptr) };
1381    /// ```
1382    /// Manual cleanup by explicitly running the destructor and deallocating
1383    /// the memory:
1384    /// ```
1385    /// use std::alloc::{dealloc, Layout};
1386    /// use std::ptr;
1387    ///
1388    /// let x = Box::new(String::from("Hello"));
1389    /// let ptr = Box::into_raw(x);
1390    /// unsafe {
1391    ///     ptr::drop_in_place(ptr);
1392    ///     dealloc(ptr as *mut u8, Layout::new::<String>());
1393    /// }
1394    /// ```
1395    /// Note: This is equivalent to the following:
1396    /// ```
1397    /// let x = Box::new(String::from("Hello"));
1398    /// let ptr = Box::into_raw(x);
1399    /// unsafe {
1400    ///     drop(Box::from_raw(ptr));
1401    /// }
1402    /// ```
1403    ///
1404    /// [memory layout]: self#memory-layout
1405    #[must_use = "losing the pointer will leak memory"]
1406    #[stable(feature = "box_raw", since = "1.4.0")]
1407    #[inline]
1408    pub fn into_raw(b: Self) -> *mut T {
1409        // Avoid `into_raw_with_allocator` as that interacts poorly with Miri's Stacked Borrows.
1410        let mut b = mem::ManuallyDrop::new(b);
1411        // We go through the built-in deref for `Box`, which is crucial for Miri to recognize this
1412        // operation for its alias tracking.
1413        &raw mut **b
1414    }
1415
1416    /// Consumes the `Box`, returning a wrapped `NonNull` pointer.
1417    ///
1418    /// The pointer will be properly aligned.
1419    ///
1420    /// After calling this function, the caller is responsible for the
1421    /// memory previously managed by the `Box`. In particular, the
1422    /// caller should properly destroy `T` and release the memory, taking
1423    /// into account the [memory layout] used by `Box`. The easiest way to
1424    /// do this is to convert the `NonNull` pointer back into a `Box` with the
1425    /// [`Box::from_non_null`] function, allowing the `Box` destructor to
1426    /// perform the cleanup.
1427    ///
1428    /// Note: this is an associated function, which means that you have
1429    /// to call it as `Box::into_non_null(b)` instead of `b.into_non_null()`.
1430    /// This is so that there is no conflict with a method on the inner type.
1431    ///
1432    /// # Examples
1433    /// Converting the `NonNull` pointer back into a `Box` with [`Box::from_non_null`]
1434    /// for automatic cleanup:
1435    /// ```
1436    /// #![feature(box_vec_non_null)]
1437    ///
1438    /// let x = Box::new(String::from("Hello"));
1439    /// let non_null = Box::into_non_null(x);
1440    /// let x = unsafe { Box::from_non_null(non_null) };
1441    /// ```
1442    /// Manual cleanup by explicitly running the destructor and deallocating
1443    /// the memory:
1444    /// ```
1445    /// #![feature(box_vec_non_null)]
1446    ///
1447    /// use std::alloc::{dealloc, Layout};
1448    ///
1449    /// let x = Box::new(String::from("Hello"));
1450    /// let non_null = Box::into_non_null(x);
1451    /// unsafe {
1452    ///     non_null.drop_in_place();
1453    ///     dealloc(non_null.as_ptr().cast::<u8>(), Layout::new::<String>());
1454    /// }
1455    /// ```
1456    /// Note: This is equivalent to the following:
1457    /// ```
1458    /// #![feature(box_vec_non_null)]
1459    ///
1460    /// let x = Box::new(String::from("Hello"));
1461    /// let non_null = Box::into_non_null(x);
1462    /// unsafe {
1463    ///     drop(Box::from_non_null(non_null));
1464    /// }
1465    /// ```
1466    ///
1467    /// [memory layout]: self#memory-layout
1468    #[must_use = "losing the pointer will leak memory"]
1469    #[unstable(feature = "box_vec_non_null", issue = "130364")]
1470    #[inline]
1471    pub fn into_non_null(b: Self) -> NonNull<T> {
1472        // SAFETY: `Box` is guaranteed to be non-null.
1473        unsafe { NonNull::new_unchecked(Self::into_raw(b)) }
1474    }
1475}
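A common use of the `into_raw`/`from_raw` pair is to smuggle owned data across a boundary that only carries a raw pointer, such as a C callback's `void *` context argument. The sketch below models that round trip in plain Rust:

```rust
// Hand ownership to "the other side" as a raw pointer.
fn to_opaque(s: String) -> *mut String {
    Box::into_raw(Box::new(s))
}

// Reclaim ownership exactly once on the way back.
fn from_opaque(ptr: *mut String) -> String {
    // SAFETY: `ptr` came from `Box::into_raw` above and is consumed only once.
    *unsafe { Box::from_raw(ptr) }
}

fn main() {
    let ptr = to_opaque(String::from("hello"));
    // ... the pointer could cross an FFI boundary here ...
    assert_eq!(from_opaque(ptr), "hello");
}
```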
1476
1477impl<T: ?Sized, A: Allocator> Box<T, A> {
1478    /// Constructs a box from a raw pointer in the given allocator.
1479    ///
1480    /// After calling this function, the raw pointer is owned by the
1481    /// resulting `Box`. Specifically, the `Box` destructor will call
1482    /// the destructor of `T` and free the allocated memory. For this
1483    /// to be safe, the memory must have been allocated in accordance
1484    /// with the [memory layout] used by `Box`.
1485    ///
1486    /// # Safety
1487    ///
1488    /// This function is unsafe because improper use may lead to
1489    /// memory problems. For example, a double-free may occur if the
1490    /// function is called twice on the same raw pointer.
1491    ///
1492    /// The raw pointer must point to a block of memory allocated by `alloc`.
1493    ///
1494    /// # Examples
1495    ///
1496    /// Recreate a `Box` which was previously converted to a raw pointer
1497    /// using [`Box::into_raw_with_allocator`]:
1498    /// ```
1499    /// #![feature(allocator_api)]
1500    ///
1501    /// use std::alloc::System;
1502    ///
1503    /// let x = Box::new_in(5, System);
1504    /// let (ptr, alloc) = Box::into_raw_with_allocator(x);
1505    /// let x = unsafe { Box::from_raw_in(ptr, alloc) };
1506    /// ```
1507    /// Manually create a `Box` from scratch by using the system allocator:
1508    /// ```
1509    /// #![feature(allocator_api, slice_ptr_get)]
1510    ///
1511    /// use std::alloc::{Allocator, Layout, System};
1512    ///
1513    /// unsafe {
1514    ///     let ptr = System.allocate(Layout::new::<i32>())?.as_mut_ptr() as *mut i32;
1515    ///     // In general .write is required to avoid attempting to destruct
1516    ///     // the (uninitialized) previous contents of `ptr`, though for this
1517    ///     // simple example `*ptr = 5` would have worked as well.
1518    ///     ptr.write(5);
1519    ///     let x = Box::from_raw_in(ptr, System);
1520    /// }
1521    /// # Ok::<(), std::alloc::AllocError>(())
1522    /// ```
1523    ///
1524    /// [memory layout]: self#memory-layout
1525    #[unstable(feature = "allocator_api", issue = "32838")]
1526    #[inline]
1527    pub unsafe fn from_raw_in(raw: *mut T, alloc: A) -> Self {
1528        Box(unsafe { Unique::new_unchecked(raw) }, alloc)
1529    }
1530
1531    /// Constructs a box from a `NonNull` pointer in the given allocator.
1532    ///
1533    /// After calling this function, the `NonNull` pointer is owned by
1534    /// the resulting `Box`. Specifically, the `Box` destructor will call
1535    /// the destructor of `T` and free the allocated memory. For this
1536    /// to be safe, the memory must have been allocated in accordance
1537    /// with the [memory layout] used by `Box`.
1538    ///
1539    /// # Safety
1540    ///
1541    /// This function is unsafe because improper use may lead to
1542    /// memory problems. For example, a double-free may occur if the
1543    /// function is called twice on the same raw pointer.
1544    ///
1545    /// The non-null pointer must point to a block of memory allocated by `alloc`.
1546    ///
1547    /// # Examples
1548    ///
1549    /// Recreate a `Box` which was previously converted to a `NonNull` pointer
1550    /// using [`Box::into_non_null_with_allocator`]:
1551    /// ```
1552    /// #![feature(allocator_api)]
1553    ///
1554    /// use std::alloc::System;
1555    ///
1556    /// let x = Box::new_in(5, System);
1557    /// let (non_null, alloc) = Box::into_non_null_with_allocator(x);
1558    /// let x = unsafe { Box::from_non_null_in(non_null, alloc) };
1559    /// ```
1560    /// Manually create a `Box` from scratch by using the system allocator:
1561    /// ```
1562    /// #![feature(allocator_api)]
1563    ///
1564    /// use std::alloc::{Allocator, Layout, System};
1565    ///
1566    /// unsafe {
1567    ///     let non_null = System.allocate(Layout::new::<i32>())?.cast::<i32>();
1568    ///     // In general .write is required to avoid attempting to destruct
1569    ///     // the (uninitialized) previous contents of `non_null`.
1570    ///     non_null.write(5);
1571    ///     let x = Box::from_non_null_in(non_null, System);
1572    /// }
1573    /// # Ok::<(), std::alloc::AllocError>(())
1574    /// ```
1575    ///
1576    /// [memory layout]: self#memory-layout
1577    #[unstable(feature = "allocator_api", issue = "32838")]
1578    // #[unstable(feature = "box_vec_non_null", issue = "130364")]
1579    #[inline]
1580    pub unsafe fn from_non_null_in(raw: NonNull<T>, alloc: A) -> Self {
1581        // SAFETY: guaranteed by the caller.
1582        unsafe { Box::from_raw_in(raw.as_ptr(), alloc) }
1583    }
1584
1585    /// Consumes the `Box`, returning a wrapped raw pointer and the allocator.
1586    ///
1587    /// The pointer will be properly aligned and non-null.
1588    ///
1589    /// After calling this function, the caller is responsible for the
1590    /// memory previously managed by the `Box`. In particular, the
1591    /// caller should properly destroy `T` and release the memory, taking
1592    /// into account the [memory layout] used by `Box`. The easiest way to
1593    /// do this is to convert the raw pointer back into a `Box` with the
1594    /// [`Box::from_raw_in`] function, allowing the `Box` destructor to perform
1595    /// the cleanup.
1596    ///
1597    /// Note: this is an associated function, which means that you have
1598    /// to call it as `Box::into_raw_with_allocator(b)` instead of `b.into_raw_with_allocator()`. This
1599    /// is so that there is no conflict with a method on the inner type.
1600    ///
1601    /// # Examples
1602    /// Converting the raw pointer back into a `Box` with [`Box::from_raw_in`]
1603    /// for automatic cleanup:
1604    /// ```
1605    /// #![feature(allocator_api)]
1606    ///
1607    /// use std::alloc::System;
1608    ///
1609    /// let x = Box::new_in(String::from("Hello"), System);
1610    /// let (ptr, alloc) = Box::into_raw_with_allocator(x);
1611    /// let x = unsafe { Box::from_raw_in(ptr, alloc) };
1612    /// ```
1613    /// Manual cleanup by explicitly running the destructor and deallocating
1614    /// the memory:
1615    /// ```
1616    /// #![feature(allocator_api)]
1617    ///
1618    /// use std::alloc::{Allocator, Layout, System};
1619    /// use std::ptr::{self, NonNull};
1620    ///
1621    /// let x = Box::new_in(String::from("Hello"), System);
1622    /// let (ptr, alloc) = Box::into_raw_with_allocator(x);
1623    /// unsafe {
1624    ///     ptr::drop_in_place(ptr);
1625    ///     let non_null = NonNull::new_unchecked(ptr);
1626    ///     alloc.deallocate(non_null.cast(), Layout::new::<String>());
1627    /// }
1628    /// ```
1629    ///
1630    /// [memory layout]: self#memory-layout
1631    #[must_use = "losing the pointer will leak memory"]
1632    #[unstable(feature = "allocator_api", issue = "32838")]
1633    #[inline]
1634    pub fn into_raw_with_allocator(b: Self) -> (*mut T, A) {
1635        let mut b = mem::ManuallyDrop::new(b);
1636        // We carefully get the raw pointer out in a way that Miri's aliasing model understands what
1637        // is happening: using the primitive "deref" of `Box`. In case `A` is *not* `Global`, we
1638        // want *no* aliasing requirements here!
1639        // In case `A` *is* `Global`, this does not quite have the right behavior; `into_raw`
1640        // works around that.
1641        let ptr = &raw mut **b;
1642        let alloc = unsafe { ptr::read(&b.1) };
1643        (ptr, alloc)
1644    }
1645
1646    /// Consumes the `Box`, returning a wrapped `NonNull` pointer and the allocator.
1647    ///
1648    /// The pointer will be properly aligned.
1649    ///
1650    /// After calling this function, the caller is responsible for the
1651    /// memory previously managed by the `Box`. In particular, the
1652    /// caller should properly destroy `T` and release the memory, taking
1653    /// into account the [memory layout] used by `Box`. The easiest way to
1654    /// do this is to convert the `NonNull` pointer back into a `Box` with the
1655    /// [`Box::from_non_null_in`] function, allowing the `Box` destructor to
1656    /// perform the cleanup.
1657    ///
1658    /// Note: this is an associated function, which means that you have
1659    /// to call it as `Box::into_non_null_with_allocator(b)` instead of
1660    /// `b.into_non_null_with_allocator()`. This is so that there is no
1661    /// conflict with a method on the inner type.
1662    ///
1663    /// # Examples
1664    /// Converting the `NonNull` pointer back into a `Box` with
1665    /// [`Box::from_non_null_in`] for automatic cleanup:
1666    /// ```
1667    /// #![feature(allocator_api)]
1668    ///
1669    /// use std::alloc::System;
1670    ///
1671    /// let x = Box::new_in(String::from("Hello"), System);
1672    /// let (non_null, alloc) = Box::into_non_null_with_allocator(x);
1673    /// let x = unsafe { Box::from_non_null_in(non_null, alloc) };
1674    /// ```
1675    /// Manual cleanup by explicitly running the destructor and deallocating
1676    /// the memory:
1677    /// ```
1678    /// #![feature(allocator_api)]
1679    ///
1680    /// use std::alloc::{Allocator, Layout, System};
1681    ///
1682    /// let x = Box::new_in(String::from("Hello"), System);
1683    /// let (non_null, alloc) = Box::into_non_null_with_allocator(x);
1684    /// unsafe {
1685    ///     non_null.drop_in_place();
1686    ///     alloc.deallocate(non_null.cast::<u8>(), Layout::new::<String>());
1687    /// }
1688    /// ```
1689    ///
1690    /// [memory layout]: self#memory-layout
1691    #[must_use = "losing the pointer will leak memory"]
1692    #[unstable(feature = "allocator_api", issue = "32838")]
1693    // #[unstable(feature = "box_vec_non_null", issue = "130364")]
1694    #[inline]
1695    pub fn into_non_null_with_allocator(b: Self) -> (NonNull<T>, A) {
1696        let (ptr, alloc) = Box::into_raw_with_allocator(b);
1697        // SAFETY: `Box` is guaranteed to be non-null.
1698        unsafe { (NonNull::new_unchecked(ptr), alloc) }
1699    }
1700
1701    #[unstable(
1702        feature = "ptr_internals",
1703        issue = "none",
1704        reason = "use `Box::leak(b).into()` or `Unique::from(Box::leak(b))` instead"
1705    )]
1706    #[inline]
1707    #[doc(hidden)]
1708    pub fn into_unique(b: Self) -> (Unique<T>, A) {
1709        let (ptr, alloc) = Box::into_raw_with_allocator(b);
1710        unsafe { (Unique::from(&mut *ptr), alloc) }
1711    }
1712
1713    /// Returns a raw mutable pointer to the `Box`'s contents.
1714    ///
1715    /// The caller must ensure that the `Box` outlives the pointer this
1716    /// function returns, or else it will end up dangling.
1717    ///
1718    /// This method guarantees that, for the purposes of the aliasing model, it
1719    /// does not materialize a reference to the underlying memory, and thus the returned pointer
1720    /// will remain valid when mixed with other calls to [`as_ptr`] and [`as_mut_ptr`].
1721    /// Note that calling other methods that materialize references to the memory
1722    /// may still invalidate this pointer.
1723    /// See the example below for how this guarantee can be used.
1724    ///
1725    /// # Examples
1726    ///
1727    /// Due to the aliasing guarantee, the following code is legal:
1728    ///
1729    /// ```rust
1730    /// #![feature(box_as_ptr)]
1731    ///
1732    /// unsafe {
1733    ///     let mut b = Box::new(0);
1734    ///     let ptr1 = Box::as_mut_ptr(&mut b);
1735    ///     ptr1.write(1);
1736    ///     let ptr2 = Box::as_mut_ptr(&mut b);
1737    ///     ptr2.write(2);
1738    ///     // Notably, the write to `ptr2` did *not* invalidate `ptr1`:
1739    ///     ptr1.write(3);
1740    /// }
1741    /// ```
1742    ///
1743    /// [`as_mut_ptr`]: Self::as_mut_ptr
1744    /// [`as_ptr`]: Self::as_ptr
1745    #[unstable(feature = "box_as_ptr", issue = "129090")]
1746    #[rustc_never_returns_null_ptr]
1747    #[rustc_as_ptr]
1748    #[inline]
1749    pub fn as_mut_ptr(b: &mut Self) -> *mut T {
1750        // This is a primitive deref, not going through `DerefMut`, and therefore not materializing
1751        // any references.
1752        &raw mut **b
1753    }
1754
1755    /// Returns a raw pointer to the `Box`'s contents.
1756    ///
1757    /// The caller must ensure that the `Box` outlives the pointer this
1758    /// function returns, or else it will end up dangling.
1759    ///
1760    /// The caller must also ensure that the memory the pointer (non-transitively) points to
1761    /// is never written to (except inside an `UnsafeCell`) using this pointer or any pointer
1762    /// derived from it. If you need to mutate the contents of the `Box`, use [`as_mut_ptr`].
1763    ///
1764    /// This method guarantees that, for the purposes of the aliasing model, it
1765    /// does not materialize a reference to the underlying memory, and thus the returned pointer
1766    /// will remain valid when mixed with other calls to [`as_ptr`] and [`as_mut_ptr`].
1767    /// Note that calling other methods that materialize mutable references to the memory,
1768    /// as well as writing to this memory, may still invalidate this pointer.
1769    /// See the example below for how this guarantee can be used.
1770    ///
1771    /// # Examples
1772    ///
1773    /// Due to the aliasing guarantee, the following code is legal:
1774    ///
1775    /// ```rust
1776    /// #![feature(box_as_ptr)]
1777    ///
1778    /// unsafe {
1779    ///     let mut v = Box::new(0);
1780    ///     let ptr1 = Box::as_ptr(&v);
1781    ///     let ptr2 = Box::as_mut_ptr(&mut v);
1782    ///     let _val = ptr2.read();
1783    ///     // No write to this memory has happened yet, so `ptr1` is still valid.
1784    ///     let _val = ptr1.read();
1785    ///     // However, once we do a write...
1786    ///     ptr2.write(1);
1787    ///     // ... `ptr1` is no longer valid.
1788    ///     // This would be UB: let _val = ptr1.read();
1789    /// }
1790    /// ```
1791    ///
1792    /// [`as_mut_ptr`]: Self::as_mut_ptr
1793    /// [`as_ptr`]: Self::as_ptr
1794    #[unstable(feature = "box_as_ptr", issue = "129090")]
1795    #[rustc_never_returns_null_ptr]
1796    #[rustc_as_ptr]
1797    #[inline]
1798    pub fn as_ptr(b: &Self) -> *const T {
1799        // This is a primitive deref, not going through `Deref`, and therefore not materializing
1800        // any references.
1801        &raw const **b
1802    }
1803
1804    /// Returns a reference to the underlying allocator.
1805    ///
1806    /// Note: this is an associated function, which means that you have
1807    /// to call it as `Box::allocator(&b)` instead of `b.allocator()`. This
1808    /// is so that there is no conflict with a method on the inner type.
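    ///
    /// # Examples
    ///
    /// A minimal sketch using the nightly `allocator_api` feature:
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let b = Box::new_in(5, System);
    /// let _alloc: &System = Box::allocator(&b);
    /// ```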
1809    #[unstable(feature = "allocator_api", issue = "32838")]
1810    #[inline]
1811    pub fn allocator(b: &Self) -> &A {
1812        &b.1
1813    }
1814
1815    /// Consumes and leaks the `Box`, returning a mutable reference,
1816    /// `&'a mut T`.
1817    ///
1818    /// Note that the type `T` must outlive the chosen lifetime `'a`. If the type
1819    /// has only static references, or none at all, then this may be chosen to be
1820    /// `'static`.
1821    ///
1822    /// This function is mainly useful for data that lives for the remainder of
1823    /// the program's life. Dropping the returned reference will cause a memory
1824    /// leak. If this is not acceptable, the reference should first be wrapped
1825    /// with the [`Box::from_raw`] function producing a `Box`. This `Box` can
1826    /// then be dropped which will properly destroy `T` and release the
1827    /// allocated memory.
1828    ///
1829    /// Note: this is an associated function, which means that you have
1830    /// to call it as `Box::leak(b)` instead of `b.leak()`. This
1831    /// is so that there is no conflict with a method on the inner type.
1832    ///
1833    /// # Examples
1834    ///
1835    /// Simple usage:
1836    ///
1837    /// ```
1838    /// let x = Box::new(41);
1839    /// let static_ref: &'static mut usize = Box::leak(x);
1840    /// *static_ref += 1;
1841    /// assert_eq!(*static_ref, 42);
1842    /// # // FIXME(https://github.com/rust-lang/miri/issues/3670):
1843    /// # // use -Zmiri-disable-leak-check instead of unleaking in tests meant to leak.
1844    /// # drop(unsafe { Box::from_raw(static_ref) });
1845    /// ```
1846    ///
1847    /// Unsized data:
1848    ///
1849    /// ```
1850    /// let x = vec![1, 2, 3].into_boxed_slice();
1851    /// let static_ref = Box::leak(x);
1852    /// static_ref[0] = 4;
1853    /// assert_eq!(*static_ref, [4, 2, 3]);
1854    /// # // FIXME(https://github.com/rust-lang/miri/issues/3670):
1855    /// # // use -Zmiri-disable-leak-check instead of unleaking in tests meant to leak.
1856    /// # drop(unsafe { Box::from_raw(static_ref) });
1857    /// ```
1858    #[stable(feature = "box_leak", since = "1.26.0")]
1859    #[inline]
1860    pub fn leak<'a>(b: Self) -> &'a mut T
1861    where
1862        A: 'a,
1863    {
1864        let (ptr, alloc) = Box::into_raw_with_allocator(b);
1865        mem::forget(alloc);
1866        unsafe { &mut *ptr }
1867    }
1868
1869    /// Converts a `Box<T>` into a `Pin<Box<T>>`. If `T` does not implement [`Unpin`], then
1870    /// `*boxed` will be pinned in memory and unable to be moved.
1871    ///
1872    /// This conversion does not allocate on the heap and happens in place.
1873    ///
1874    /// This is also available via [`From`].
1875    ///
1876    /// Constructing and pinning a `Box` with <code>Box::into_pin([Box::new]\(x))</code>
1877    /// can also be written more concisely using <code>[Box::pin]\(x)</code>.
1878    /// This `into_pin` method is useful if you already have a `Box<T>`, or you are
1879    /// constructing a (pinned) `Box` in a different way than with [`Box::new`].
1880    ///
1881    /// # Notes
1882    ///
1883    /// It's not recommended that crates add an impl like `From<Box<T>> for Pin<T>`,
1884    /// as it'll introduce an ambiguity when calling `Pin::from`.
1885    /// A demonstration of such a poor impl is shown below.
1886    ///
1887    /// ```compile_fail
1888    /// # use std::pin::Pin;
1889    /// struct Foo; // A type defined in this crate.
1890    /// impl From<Box<()>> for Pin<Foo> {
1891    ///     fn from(_: Box<()>) -> Pin<Foo> {
1892    ///         Pin::new(Foo)
1893    ///     }
1894    /// }
1895    ///
1896    /// let foo = Box::new(());
1897    /// let bar = Pin::from(foo);
1898    /// ```
1899    #[stable(feature = "box_into_pin", since = "1.63.0")]
1900    pub fn into_pin(boxed: Self) -> Pin<Self>
1901    where
1902        A: 'static,
1903    {
1904        // It's not possible to move or replace the insides of a `Pin<Box<T>>`
1905        // when `T: !Unpin`, so it's safe to pin it directly without any
1906        // additional requirements.
1907        unsafe { Pin::new_unchecked(boxed) }
1908    }
1909}
1910
1911#[stable(feature = "rust1", since = "1.0.0")]
1912unsafe impl<#[may_dangle] T: ?Sized, A: Allocator> Drop for Box<T, A> {
1913    #[inline]
1914    fn drop(&mut self) {
1915        // the T in the Box is dropped by the compiler before the destructor is run
1916
1917        let ptr = self.0;
1918
1919        unsafe {
1920            let layout = Layout::for_value_raw(ptr.as_ptr());
1921            if layout.size() != 0 {
1922                self.1.deallocate(From::from(ptr.cast()), layout);
1923            }
1924        }
1925    }
1926}
1927
1928#[cfg(not(no_global_oom_handling))]
1929#[stable(feature = "rust1", since = "1.0.0")]
1930impl<T: Default> Default for Box<T> {
1931    /// Creates a `Box<T>`, with the `Default` value for `T`.
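    ///
    /// # Examples
    ///
    /// ```
    /// let x: Box<i32> = Box::default();
    /// assert_eq!(*x, 0);
    /// ```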
1932    #[inline]
1933    fn default() -> Self {
1934        let mut x: Box<mem::MaybeUninit<T>> = Box::new_uninit();
1935        unsafe {
1936            // SAFETY: `x` is valid for writing and has the same layout as `T`.
1937            // If `T::default()` panics, dropping `x` will just deallocate the Box as `MaybeUninit<T>`
1938            // does not have a destructor.
1939            //
1940            // We use `ptr::write` as `MaybeUninit::write` creates
1941            // extra stack copies of `T` in debug mode.
1942            //
1943            // See https://github.com/rust-lang/rust/issues/136043 for more context.
1944            ptr::write(&raw mut *x as *mut T, T::default());
1945            // SAFETY: `x` was just initialized above.
1946            x.assume_init()
1947        }
1948    }
1949}
1950
1951#[cfg(not(no_global_oom_handling))]
1952#[stable(feature = "rust1", since = "1.0.0")]
1953impl<T> Default for Box<[T]> {
1954    /// Creates an empty `[T]` inside a `Box`.
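    ///
    /// # Examples
    ///
    /// ```
    /// let b: Box<[i32]> = Box::default();
    /// assert!(b.is_empty());
    /// assert_eq!(b.len(), 0);
    /// ```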
1955    #[inline]
1956    fn default() -> Self {
1957        let ptr: Unique<[T]> = Unique::<[T; 0]>::dangling();
1958        Box(ptr, Global)
1959    }
1960}
1961
1962#[cfg(not(no_global_oom_handling))]
1963#[stable(feature = "default_box_extra", since = "1.17.0")]
1964impl Default for Box<str> {
1965    #[inline]
1966    fn default() -> Self {
1967        // SAFETY: This is the same as `Unique::cast<U>` but with an unsized `U = str`.
1968        let ptr: Unique<str> = unsafe {
1969            let bytes: Unique<[u8]> = Unique::<[u8; 0]>::dangling();
1970            Unique::new_unchecked(bytes.as_ptr() as *mut str)
1971        };
1972        Box(ptr, Global)
1973    }
1974}
1975
1976#[cfg(not(no_global_oom_handling))]
1977#[stable(feature = "pin_default_impls", since = "1.91.0")]
1978impl<T> Default for Pin<Box<T>>
1979where
1980    T: ?Sized,
1981    Box<T>: Default,
1982{
1983    #[inline]
1984    fn default() -> Self {
1985        Box::into_pin(Box::<T>::default())
1986    }
1987}
1988
1989#[cfg(not(no_global_oom_handling))]
1990#[stable(feature = "rust1", since = "1.0.0")]
1991impl<T: Clone, A: Allocator + Clone> Clone for Box<T, A> {
1992    /// Returns a new box with a `clone()` of this box's contents.
1993    ///
1994    /// # Examples
1995    ///
1996    /// ```
1997    /// let x = Box::new(5);
1998    /// let y = x.clone();
1999    ///
2000    /// // The value is the same
2001    /// assert_eq!(x, y);
2002    ///
2003    /// // But they are unique objects
2004    /// assert_ne!(&*x as *const i32, &*y as *const i32);
2005    /// ```
2006    #[inline]
2007    fn clone(&self) -> Self {
2008        // Pre-allocate memory to allow writing the cloned value directly.
2009        let mut boxed = Self::new_uninit_in(self.1.clone());
2010        unsafe {
2011            (**self).clone_to_uninit(boxed.as_mut_ptr().cast());
2012            boxed.assume_init()
2013        }
2014    }
2015
2016    /// Copies `source`'s contents into `self` without creating a new allocation.
2017    ///
2018    /// # Examples
2019    ///
2020    /// ```
2021    /// let x = Box::new(5);
2022    /// let mut y = Box::new(10);
2023    /// let yp: *const i32 = &*y;
2024    ///
2025    /// y.clone_from(&x);
2026    ///
2027    /// // The value is the same
2028    /// assert_eq!(x, y);
2029    ///
2030    /// // And no allocation occurred
2031    /// assert_eq!(yp, &*y);
2032    /// ```
2033    #[inline]
2034    fn clone_from(&mut self, source: &Self) {
2035        (**self).clone_from(&(**source));
2036    }
2037}
2038
2039#[cfg(not(no_global_oom_handling))]
2040#[stable(feature = "box_slice_clone", since = "1.3.0")]
2041impl<T: Clone, A: Allocator + Clone> Clone for Box<[T], A> {
2042    fn clone(&self) -> Self {
2043        let alloc = Box::allocator(self).clone();
2044        self.to_vec_in(alloc).into_boxed_slice()
2045    }
2046
2047    /// Copies `source`'s contents into `self` without creating a new allocation,
2048    /// so long as the two are of the same length.
2049    ///
2050    /// # Examples
2051    ///
2052    /// ```
2053    /// let x = Box::new([5, 6, 7]);
2054    /// let mut y = Box::new([8, 9, 10]);
2055    /// let yp: *const [i32] = &*y;
2056    ///
2057    /// y.clone_from(&x);
2058    ///
2059    /// // The value is the same
2060    /// assert_eq!(x, y);
2061    ///
2062    /// // And no allocation occurred
2063    /// assert_eq!(yp, &*y);
2064    /// ```
2065    fn clone_from(&mut self, source: &Self) {
2066        if self.len() == source.len() {
2067            self.clone_from_slice(&source);
2068        } else {
2069            *self = source.clone();
2070        }
2071    }
2072}
2073
2074#[cfg(not(no_global_oom_handling))]
2075#[stable(feature = "box_slice_clone", since = "1.3.0")]
2076impl Clone for Box<str> {
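    /// Returns a new `Box<str>` containing a copy of this string's contents.
    ///
    /// # Examples
    ///
    /// ```
    /// let s: Box<str> = "hello".into();
    /// let t = s.clone();
    /// assert_eq!(s, t);
    /// ```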
2077    fn clone(&self) -> Self {
2078        // this makes a copy of the data
2079        let buf: Box<[u8]> = self.as_bytes().into();
2080        unsafe { from_boxed_utf8_unchecked(buf) }
2081    }
2082}
2083
2084#[stable(feature = "rust1", since = "1.0.0")]
2085impl<T: ?Sized + PartialEq, A: Allocator> PartialEq for Box<T, A> {
2086    #[inline]
2087    fn eq(&self, other: &Self) -> bool {
2088        PartialEq::eq(&**self, &**other)
2089    }
2090    #[inline]
2091    fn ne(&self, other: &Self) -> bool {
2092        PartialEq::ne(&**self, &**other)
2093    }
2094}
2095
2096#[stable(feature = "rust1", since = "1.0.0")]
2097impl<T: ?Sized + PartialOrd, A: Allocator> PartialOrd for Box<T, A> {
2098    #[inline]
2099    fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
2100        PartialOrd::partial_cmp(&**self, &**other)
2101    }
2102    #[inline]
2103    fn lt(&self, other: &Self) -> bool {
2104        PartialOrd::lt(&**self, &**other)
2105    }
2106    #[inline]
2107    fn le(&self, other: &Self) -> bool {
2108        PartialOrd::le(&**self, &**other)
2109    }
2110    #[inline]
2111    fn ge(&self, other: &Self) -> bool {
2112        PartialOrd::ge(&**self, &**other)
2113    }
2114    #[inline]
2115    fn gt(&self, other: &Self) -> bool {
2116        PartialOrd::gt(&**self, &**other)
2117    }
2118}
2119
2120#[stable(feature = "rust1", since = "1.0.0")]
2121impl<T: ?Sized + Ord, A: Allocator> Ord for Box<T, A> {
2122    #[inline]
2123    fn cmp(&self, other: &Self) -> Ordering {
2124        Ord::cmp(&**self, &**other)
2125    }
2126}
2127
2128#[stable(feature = "rust1", since = "1.0.0")]
2129impl<T: ?Sized + Eq, A: Allocator> Eq for Box<T, A> {}
2130
2131#[stable(feature = "rust1", since = "1.0.0")]
2132impl<T: ?Sized + Hash, A: Allocator> Hash for Box<T, A> {
2133    fn hash<H: Hasher>(&self, state: &mut H) {
2134        (**self).hash(state);
2135    }
2136}
2137
2138#[stable(feature = "indirect_hasher_impl", since = "1.22.0")]
2139impl<T: ?Sized + Hasher, A: Allocator> Hasher for Box<T, A> {
2140    fn finish(&self) -> u64 {
2141        (**self).finish()
2142    }
2143    fn write(&mut self, bytes: &[u8]) {
2144        (**self).write(bytes)
2145    }
2146    fn write_u8(&mut self, i: u8) {
2147        (**self).write_u8(i)
2148    }
2149    fn write_u16(&mut self, i: u16) {
2150        (**self).write_u16(i)
2151    }
2152    fn write_u32(&mut self, i: u32) {
2153        (**self).write_u32(i)
2154    }
2155    fn write_u64(&mut self, i: u64) {
2156        (**self).write_u64(i)
2157    }
2158    fn write_u128(&mut self, i: u128) {
2159        (**self).write_u128(i)
2160    }
2161    fn write_usize(&mut self, i: usize) {
2162        (**self).write_usize(i)
2163    }
2164    fn write_i8(&mut self, i: i8) {
2165        (**self).write_i8(i)
2166    }
2167    fn write_i16(&mut self, i: i16) {
2168        (**self).write_i16(i)
2169    }
2170    fn write_i32(&mut self, i: i32) {
2171        (**self).write_i32(i)
2172    }
2173    fn write_i64(&mut self, i: i64) {
2174        (**self).write_i64(i)
2175    }
2176    fn write_i128(&mut self, i: i128) {
2177        (**self).write_i128(i)
2178    }
2179    fn write_isize(&mut self, i: isize) {
2180        (**self).write_isize(i)
2181    }
2182    fn write_length_prefix(&mut self, len: usize) {
2183        (**self).write_length_prefix(len)
2184    }
2185    fn write_str(&mut self, s: &str) {
2186        (**self).write_str(s)
2187    }
2188}
2189
2190#[stable(feature = "rust1", since = "1.0.0")]
2191impl<T: fmt::Display + ?Sized, A: Allocator> fmt::Display for Box<T, A> {
2192    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
2193        fmt::Display::fmt(&**self, f)
2194    }
2195}
2196
2197#[stable(feature = "rust1", since = "1.0.0")]
2198impl<T: fmt::Debug + ?Sized, A: Allocator> fmt::Debug for Box<T, A> {
2199    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
2200        fmt::Debug::fmt(&**self, f)
2201    }
2202}
2203
2204#[stable(feature = "rust1", since = "1.0.0")]
2205impl<T: ?Sized, A: Allocator> fmt::Pointer for Box<T, A> {
2206    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
2207        // It's not possible to extract the inner `Unique` directly from the `Box`;
2208        // instead we cast it to a `*const`, which aliases the `Unique`.
2209        let ptr: *const T = &**self;
2210        fmt::Pointer::fmt(&ptr, f)
2211    }
2212}
2213
2214#[stable(feature = "rust1", since = "1.0.0")]
2215impl<T: ?Sized, A: Allocator> Deref for Box<T, A> {
2216    type Target = T;
2217
2218    fn deref(&self) -> &T {
2219        &**self
2220    }
2221}
2222
2223#[stable(feature = "rust1", since = "1.0.0")]
2224impl<T: ?Sized, A: Allocator> DerefMut for Box<T, A> {
2225    fn deref_mut(&mut self) -> &mut T {
2226        &mut **self
2227    }
2228}
2229
2230#[unstable(feature = "deref_pure_trait", issue = "87121")]
2231unsafe impl<T: ?Sized, A: Allocator> DerefPure for Box<T, A> {}
2232
2233#[unstable(feature = "legacy_receiver_trait", issue = "none")]
2234impl<T: ?Sized, A: Allocator> LegacyReceiver for Box<T, A> {}
2235
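// Boxed closures can be called directly; these impls forward the call to the
// contained closure. For example:
//
// ```
// let f: Box<dyn FnOnce() -> i32> = Box::new(|| 42);
// assert_eq!(f(), 42);
// ```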
2236#[stable(feature = "boxed_closure_impls", since = "1.35.0")]
2237impl<Args: Tuple, F: FnOnce<Args> + ?Sized, A: Allocator> FnOnce<Args> for Box<F, A> {
2238    type Output = <F as FnOnce<Args>>::Output;
2239
2240    extern "rust-call" fn call_once(self, args: Args) -> Self::Output {
2241        <F as FnOnce<Args>>::call_once(*self, args)
2242    }
2243}
2244
2245#[stable(feature = "boxed_closure_impls", since = "1.35.0")]
2246impl<Args: Tuple, F: FnMut<Args> + ?Sized, A: Allocator> FnMut<Args> for Box<F, A> {
2247    extern "rust-call" fn call_mut(&mut self, args: Args) -> Self::Output {
2248        <F as FnMut<Args>>::call_mut(self, args)
2249    }
2250}
2251
2252#[stable(feature = "boxed_closure_impls", since = "1.35.0")]
2253impl<Args: Tuple, F: Fn<Args> + ?Sized, A: Allocator> Fn<Args> for Box<F, A> {
2254    extern "rust-call" fn call(&self, args: Args) -> Self::Output {
2255        <F as Fn<Args>>::call(self, args)
2256    }
2257}
2258
2259#[stable(feature = "async_closure", since = "1.85.0")]
2260impl<Args: Tuple, F: AsyncFnOnce<Args> + ?Sized, A: Allocator> AsyncFnOnce<Args> for Box<F, A> {
2261    type Output = F::Output;
2262    type CallOnceFuture = F::CallOnceFuture;
2263
2264    extern "rust-call" fn async_call_once(self, args: Args) -> Self::CallOnceFuture {
2265        F::async_call_once(*self, args)
2266    }
2267}
2268
2269#[stable(feature = "async_closure", since = "1.85.0")]
2270impl<Args: Tuple, F: AsyncFnMut<Args> + ?Sized, A: Allocator> AsyncFnMut<Args> for Box<F, A> {
2271    type CallRefFuture<'a>
2272        = F::CallRefFuture<'a>
2273    where
2274        Self: 'a;
2275
2276    extern "rust-call" fn async_call_mut(&mut self, args: Args) -> Self::CallRefFuture<'_> {
2277        F::async_call_mut(self, args)
2278    }
2279}
2280
2281#[stable(feature = "async_closure", since = "1.85.0")]
2282impl<Args: Tuple, F: AsyncFn<Args> + ?Sized, A: Allocator> AsyncFn<Args> for Box<F, A> {
2283    extern "rust-call" fn async_call(&self, args: Args) -> Self::CallRefFuture<'_> {
2284        F::async_call(self, args)
2285    }
2286}
2287
2288#[unstable(feature = "coerce_unsized", issue = "18598")]
2289impl<T: ?Sized + Unsize<U>, U: ?Sized, A: Allocator> CoerceUnsized<Box<U, A>> for Box<T, A> {}
2290
2291#[unstable(feature = "pin_coerce_unsized_trait", issue = "150112")]
2292unsafe impl<T: ?Sized, A: Allocator> PinCoerceUnsized for Box<T, A> {}
2293
2294// It is quite crucial that we only allow the `Global` allocator here.
2295// Handling arbitrary custom allocators (which can affect the `Box` layout heavily!)
2296// would need a lot of codegen and interpreter adjustments.
2297#[unstable(feature = "dispatch_from_dyn", issue = "none")]
2298impl<T: ?Sized + Unsize<U>, U: ?Sized> DispatchFromDyn<Box<U>> for Box<T, Global> {}
2299
2300#[stable(feature = "box_borrow", since = "1.1.0")]
2301impl<T: ?Sized, A: Allocator> Borrow<T> for Box<T, A> {
2302    fn borrow(&self) -> &T {
2303        &**self
2304    }
2305}
2306
2307#[stable(feature = "box_borrow", since = "1.1.0")]
2308impl<T: ?Sized, A: Allocator> BorrowMut<T> for Box<T, A> {
2309    fn borrow_mut(&mut self) -> &mut T {
2310        &mut **self
2311    }
2312}
2313
2314#[stable(since = "1.5.0", feature = "smart_ptr_as_ref")]
2315impl<T: ?Sized, A: Allocator> AsRef<T> for Box<T, A> {
2316    fn as_ref(&self) -> &T {
2317        &**self
2318    }
2319}
2320
2321#[stable(since = "1.5.0", feature = "smart_ptr_as_ref")]
2322impl<T: ?Sized, A: Allocator> AsMut<T> for Box<T, A> {
2323    fn as_mut(&mut self) -> &mut T {
2324        &mut **self
2325    }
2326}
2327
2328/* Nota bene
2329 *
2330 *  We could have chosen not to add this impl, and instead have written a
2331 *  function from Pin<Box<T>> to Pin<T>. Such a function would not be sound,
2332 *  because Box<T> implements Unpin even when T does not, as a result of
2333 *  this impl.
2334 *
2335 *  We chose this API instead of the alternative for a few reasons:
2336 *      - Logically, it is helpful to understand pinning in regard to the
2337 *        memory region being pointed to. For this reason none of the
2338 *        standard library pointer types support projecting through a pin
2339 *        (Box<T> is the only pointer type in std for which this would be
2340 *        safe.)
2341 *      - It is in practice very useful to have Box<T> be unconditionally
2342 *        Unpin because of trait objects, for which the structural auto
2343 *        trait functionality does not apply (e.g., Box<dyn Foo> would
2344 *        otherwise not be Unpin).
2345 *
2346 *  Another type with the same semantics as Box but only a conditional
2347 *  implementation of `Unpin` (where `T: Unpin`) would be valid/safe, and
2348 *  could have a method to project a Pin<T> from it.
2349 */
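// For instance, `Box<T>` is `Unpin` even when `T` is not (e.g. `PhantomPinned`):
//
// ```
// use std::marker::PhantomPinned;
//
// fn assert_unpin<T: Unpin>() {}
// assert_unpin::<Box<PhantomPinned>>();
// ```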
2350#[stable(feature = "pin", since = "1.33.0")]
2351impl<T: ?Sized, A: Allocator> Unpin for Box<T, A> {}
2352
2353#[unstable(feature = "coroutine_trait", issue = "43122")]
2354impl<G: ?Sized + Coroutine<R> + Unpin, R, A: Allocator> Coroutine<R> for Box<G, A> {
2355    type Yield = G::Yield;
2356    type Return = G::Return;
2357
2358    fn resume(mut self: Pin<&mut Self>, arg: R) -> CoroutineState<Self::Yield, Self::Return> {
2359        G::resume(Pin::new(&mut *self), arg)
2360    }
2361}
2362
2363#[unstable(feature = "coroutine_trait", issue = "43122")]
2364impl<G: ?Sized + Coroutine<R>, R, A: Allocator> Coroutine<R> for Pin<Box<G, A>>
2365where
2366    A: 'static,
2367{
2368    type Yield = G::Yield;
2369    type Return = G::Return;
2370
2371    fn resume(mut self: Pin<&mut Self>, arg: R) -> CoroutineState<Self::Yield, Self::Return> {
2372        G::resume((*self).as_mut(), arg)
2373    }
2374}
2375
2376#[stable(feature = "futures_api", since = "1.36.0")]
2377impl<F: ?Sized + Future + Unpin, A: Allocator> Future for Box<F, A> {
2378    type Output = F::Output;
2379
2380    fn poll(mut self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<Self::Output> {
2381        F::poll(Pin::new(&mut *self), cx)
2382    }
2383}
2384
2385#[stable(feature = "box_error", since = "1.8.0")]
2386impl<E: Error> Error for Box<E> {
2387    #[allow(deprecated)]
2388    fn cause(&self) -> Option<&dyn Error> {
2389        Error::cause(&**self)
2390    }
2391
2392    fn source(&self) -> Option<&(dyn Error + 'static)> {
2393        Error::source(&**self)
2394    }
2395
2396    fn provide<'b>(&'b self, request: &mut error::Request<'b>) {
2397        Error::provide(&**self, request);
2398    }
2399}
2400
2401#[unstable(feature = "allocator_api", issue = "32838")]
2402unsafe impl<T: ?Sized + Allocator, A: Allocator> Allocator for Box<T, A> {
2403    #[inline]
2404    fn allocate(&self, layout: Layout) -> Result<NonNull<[u8]>, AllocError> {
2405        (**self).allocate(layout)
2406    }
2407
2408    #[inline]
2409    fn allocate_zeroed(&self, layout: Layout) -> Result<NonNull<[u8]>, AllocError> {
2410        (**self).allocate_zeroed(layout)
2411    }
2412
2413    #[inline]
2414    unsafe fn deallocate(&self, ptr: NonNull<u8>, layout: Layout) {
2415        // SAFETY: the safety contract must be upheld by the caller
2416        unsafe { (**self).deallocate(ptr, layout) }
2417    }
2418
2419    #[inline]
2420    unsafe fn grow(
2421        &self,
2422        ptr: NonNull<u8>,
2423        old_layout: Layout,
2424        new_layout: Layout,
2425    ) -> Result<NonNull<[u8]>, AllocError> {
2426        // SAFETY: the safety contract must be upheld by the caller
2427        unsafe { (**self).grow(ptr, old_layout, new_layout) }
2428    }
2429
2430    #[inline]
2431    unsafe fn grow_zeroed(
2432        &self,
2433        ptr: NonNull<u8>,
2434        old_layout: Layout,
2435        new_layout: Layout,
2436    ) -> Result<NonNull<[u8]>, AllocError> {
2437        // SAFETY: the safety contract must be upheld by the caller
2438        unsafe { (**self).grow_zeroed(ptr, old_layout, new_layout) }
2439    }
2440
2441    #[inline]
2442    unsafe fn shrink(
2443        &self,
2444        ptr: NonNull<u8>,
2445        old_layout: Layout,
2446        new_layout: Layout,
2447    ) -> Result<NonNull<[u8]>, AllocError> {
2448        // SAFETY: the safety contract must be upheld by the caller
2449        unsafe { (**self).shrink(ptr, old_layout, new_layout) }
2450    }
2451}