alloc/boxed.rs

1//! The `Box<T>` type for heap allocation.
2//!
3//! [`Box<T>`], casually referred to as a 'box', provides the simplest form of
4//! heap allocation in Rust. Boxes provide ownership for this allocation, and
5//! drop their contents when they go out of scope. Boxes also ensure that they
6//! never allocate more than `isize::MAX` bytes.
7//!
8//! # Examples
9//!
10//! Move a value from the stack to the heap by creating a [`Box`]:
11//!
12//! ```
13//! let val: u8 = 5;
14//! let boxed: Box<u8> = Box::new(val);
15//! ```
16//!
17//! Move a value from a [`Box`] back to the stack by [dereferencing]:
18//!
19//! ```
20//! let boxed: Box<u8> = Box::new(5);
21//! let val: u8 = *boxed;
22//! ```
23//!
24//! Creating a recursive data structure:
25//!
26//! ```
27//! # #[allow(dead_code)]
28//! #[derive(Debug)]
29//! enum List<T> {
30//!     Cons(T, Box<List<T>>),
31//!     Nil,
32//! }
33//!
34//! let list: List<i32> = List::Cons(1, Box::new(List::Cons(2, Box::new(List::Nil))));
35//! println!("{list:?}");
36//! ```
37//!
38//! This will print `Cons(1, Cons(2, Nil))`.
39//!
40//! Recursive structures must be boxed, because if the definition of `Cons`
41//! looked like this:
42//!
43//! ```compile_fail,E0072
44//! # enum List<T> {
45//! Cons(T, List<T>),
46//! # }
47//! ```
48//!
49//! It wouldn't work. This is because the size of a `List` depends on how many
50//! elements are in the list, and so we don't know how much memory to allocate
51//! for a `Cons`. By introducing a [`Box<T>`], which has a defined size, we know how
52//! big `Cons` needs to be.
53//!
54//! # Memory layout
55//!
56//! For non-zero-sized values, a [`Box`] will use the [`Global`] allocator for its allocation. It is
57//! valid to convert both ways between a [`Box`] and a raw pointer allocated with the [`Global`]
58//! allocator, given that the [`Layout`] used with the allocator is correct for the type and the raw
59//! pointer points to a valid value of the right type. More precisely, a `value: *mut T` that has
60//! been allocated with the [`Global`] allocator with `Layout::for_value(&*value)` may be converted
61//! into a box using [`Box::<T>::from_raw(value)`]. Conversely, the memory backing a `value: *mut T`
62//! obtained from [`Box::<T>::into_raw`] may be deallocated using the [`Global`] allocator with
63//! [`Layout::for_value(&*value)`].
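//!
//! For example, the round trip through [`Box::into_raw`] and a manual deallocation
//! might look like this (a minimal sketch; the [`Layout`] is computed before the
//! value is dropped):
//!
//! ```
//! use std::alloc::{dealloc, Layout};
//! use std::ptr;
//!
//! let b = Box::new(String::from("hello"));
//! let p = Box::into_raw(b);
//! unsafe {
//!     // Compute the layout `Box` used for this allocation, then drop the value
//!     // and release the memory through the global allocator.
//!     let layout = Layout::for_value(&*p);
//!     ptr::drop_in_place(p);
//!     dealloc(p.cast::<u8>(), layout);
//! }
//! ```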
64//!
65//! For zero-sized values, the `Box` pointer has to be non-null and sufficiently aligned. The
//! recommended way to build a `Box` for a ZST when `Box::new` cannot be used is to use
67//! [`ptr::NonNull::dangling`].
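//!
//! For instance, a box of a zero-sized type can be built from a dangling pointer
//! (a minimal sketch using a local zero-sized type):
//!
//! ```
//! use std::ptr::NonNull;
//!
//! # #[allow(dead_code)]
//! struct Zst;
//!
//! // SAFETY: `Zst` is zero-sized, so a dangling, well-aligned, non-null pointer is
//! // valid for it, and dropping the resulting box will not deallocate any memory.
//! let _box: Box<Zst> = unsafe { Box::from_raw(NonNull::<Zst>::dangling().as_ptr()) };
//! ```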
68//!
69//! On top of these basic layout requirements, a `Box<T>` must point to a valid value of `T`.
70//!
71//! So long as `T: Sized`, a `Box<T>` is guaranteed to be represented
72//! as a single pointer and is also ABI-compatible with C pointers
73//! (i.e. the C type `T*`). This means that if you have extern "C"
74//! Rust functions that will be called from C, you can define those
75//! Rust functions using `Box<T>` types, and use `T*` as corresponding
76//! type on the C side. As an example, consider this C header which
77//! declares functions that create and destroy some kind of `Foo`
78//! value:
79//!
80//! ```c
81//! /* C header */
82//!
83//! /* Returns ownership to the caller */
84//! struct Foo* foo_new(void);
85//!
86//! /* Takes ownership from the caller; no-op when invoked with null */
87//! void foo_delete(struct Foo*);
88//! ```
89//!
90//! These two functions might be implemented in Rust as follows. Here, the
91//! `struct Foo*` type from C is translated to `Box<Foo>`, which captures
92//! the ownership constraints. Note also that the nullable argument to
93//! `foo_delete` is represented in Rust as `Option<Box<Foo>>`, since `Box<Foo>`
94//! cannot be null.
95//!
96//! ```
97//! #[repr(C)]
98//! pub struct Foo;
99//!
100//! #[unsafe(no_mangle)]
101//! pub extern "C" fn foo_new() -> Box<Foo> {
102//!     Box::new(Foo)
103//! }
104//!
105//! #[unsafe(no_mangle)]
106//! pub extern "C" fn foo_delete(_: Option<Box<Foo>>) {}
107//! ```
108//!
109//! Even though `Box<T>` has the same representation and C ABI as a C pointer,
110//! this does not mean that you can convert an arbitrary `T*` into a `Box<T>`
111//! and expect things to work. `Box<T>` values will always be fully aligned,
112//! non-null pointers. Moreover, the destructor for `Box<T>` will attempt to
113//! free the value with the global allocator. In general, the best practice
114//! is to only use `Box<T>` for pointers that originated from the global
115//! allocator.
116//!
117//! **Important.** At least at present, you should avoid using
118//! `Box<T>` types for functions that are defined in C but invoked
119//! from Rust. In those cases, you should directly mirror the C types
120//! as closely as possible. Using types like `Box<T>` where the C
121//! definition is just using `T*` can lead to undefined behavior, as
122//! described in [rust-lang/unsafe-code-guidelines#198][ucg#198].
123//!
124//! # Considerations for unsafe code
125//!
126//! **Warning: This section is not normative and is subject to change, possibly
127//! being relaxed in the future! It is a simplified summary of the rules
128//! currently implemented in the compiler.**
129//!
130//! The aliasing rules for `Box<T>` are the same as for `&mut T`. `Box<T>`
131//! asserts uniqueness over its content. Using raw pointers derived from a box
//! after that box has been mutated through, moved, or borrowed as `&mut T`
//! is not allowed. For more guidance on working with boxes from unsafe code, see
134//! [rust-lang/unsafe-code-guidelines#326][ucg#326].
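//!
//! As a rough sketch of the intended pattern: derive the raw pointer, finish using
//! it, and only then touch the box again.
//!
//! ```
//! let mut b = Box::new(0u8);
//! // Derive a raw pointer without materializing a reference.
//! let raw = &raw mut *b;
//! unsafe {
//!     // OK: the box has not been moved, mutated, or reborrowed since `raw` was derived.
//!     raw.write(1);
//! }
//! // Using the box again is fine, but `raw` must not be used afterwards.
//! assert_eq!(*b, 1);
//! ```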
135//!
136//! # Editions
137//!
138//! A special case exists for the implementation of `IntoIterator` for arrays on the Rust 2021
139//! edition, as documented [here][array]. Unfortunately, it was later found that a similar
140//! workaround should be added for boxed slices, and this was applied in the 2024 edition.
141//!
142//! Specifically, `IntoIterator` is implemented for `Box<[T]>` on all editions, but specific calls
143//! to `into_iter()` for boxed slices will defer to the slice implementation on editions before
144//! 2024:
145//!
146//! ```rust,edition2021
147//! // Rust 2015, 2018, and 2021:
148//!
149//! # #![allow(boxed_slice_into_iter)] // override our `deny(warnings)`
150//! let boxed_slice: Box<[i32]> = vec![0; 3].into_boxed_slice();
151//!
152//! // This creates a slice iterator, producing references to each value.
153//! for item in boxed_slice.into_iter().enumerate() {
154//!     let (i, x): (usize, &i32) = item;
155//!     println!("boxed_slice[{i}] = {x}");
156//! }
157//!
158//! // The `boxed_slice_into_iter` lint suggests this change for future compatibility:
159//! for item in boxed_slice.iter().enumerate() {
160//!     let (i, x): (usize, &i32) = item;
161//!     println!("boxed_slice[{i}] = {x}");
162//! }
163//!
164//! // You can explicitly iterate a boxed slice by value using `IntoIterator::into_iter`
165//! for item in IntoIterator::into_iter(boxed_slice).enumerate() {
166//!     let (i, x): (usize, i32) = item;
167//!     println!("boxed_slice[{i}] = {x}");
168//! }
169//! ```
170//!
171//! Similar to the array implementation, this may be modified in the future to remove this override,
172//! and it's best to avoid relying on this edition-dependent behavior if you wish to preserve
173//! compatibility with future versions of the compiler.
174//!
//! [array]: prim@array#editions
//! [ucg#198]: https://github.com/rust-lang/unsafe-code-guidelines/issues/198
176//! [ucg#326]: https://github.com/rust-lang/unsafe-code-guidelines/issues/326
177//! [dereferencing]: core::ops::Deref
178//! [`Box::<T>::from_raw(value)`]: Box::from_raw
179//! [`Global`]: crate::alloc::Global
180//! [`Layout`]: crate::alloc::Layout
181//! [`Layout::for_value(&*value)`]: crate::alloc::Layout::for_value
182//! [valid]: ptr#safety
183
184#![stable(feature = "rust1", since = "1.0.0")]
185
186use core::borrow::{Borrow, BorrowMut};
187#[cfg(not(no_global_oom_handling))]
188use core::clone::CloneToUninit;
189use core::cmp::Ordering;
190use core::error::{self, Error};
191use core::fmt;
192use core::future::Future;
193use core::hash::{Hash, Hasher};
194use core::marker::{PointerLike, Tuple, Unsize};
195use core::mem::{self, SizedTypeProperties};
196use core::ops::{
197    AsyncFn, AsyncFnMut, AsyncFnOnce, CoerceUnsized, Coroutine, CoroutineState, Deref, DerefMut,
198    DerefPure, DispatchFromDyn, LegacyReceiver,
199};
200use core::pin::{Pin, PinCoerceUnsized};
201use core::ptr::{self, NonNull, Unique};
202use core::task::{Context, Poll};
203
204#[cfg(not(no_global_oom_handling))]
205use crate::alloc::handle_alloc_error;
206use crate::alloc::{AllocError, Allocator, Global, Layout};
207use crate::raw_vec::RawVec;
208#[cfg(not(no_global_oom_handling))]
209use crate::str::from_boxed_utf8_unchecked;
210
211/// Conversion related impls for `Box<_>` (`From`, `downcast`, etc)
212mod convert;
213/// Iterator related impls for `Box<_>`.
214mod iter;
215/// [`ThinBox`] implementation.
216mod thin;
217
218#[unstable(feature = "thin_box", issue = "92791")]
219pub use thin::ThinBox;
220
221/// A pointer type that uniquely owns a heap allocation of type `T`.
222///
223/// See the [module-level documentation](../../std/boxed/index.html) for more.
224#[lang = "owned_box"]
225#[fundamental]
226#[stable(feature = "rust1", since = "1.0.0")]
227#[rustc_insignificant_dtor]
228#[doc(search_unbox)]
229// The declaration of the `Box` struct must be kept in sync with the
230// compiler or ICEs will happen.
231pub struct Box<
232    T: ?Sized,
233    #[unstable(feature = "allocator_api", issue = "32838")] A: Allocator = Global,
234>(Unique<T>, A);
235
236/// Constructs a `Box<T>` by calling the `exchange_malloc` lang item and moving the argument into
237/// the newly allocated memory. This is an intrinsic to avoid unnecessary copies.
238///
239/// This is the surface syntax for `box <expr>` expressions.
240#[rustc_intrinsic]
241#[unstable(feature = "liballoc_internals", issue = "none")]
242pub fn box_new<T>(_x: T) -> Box<T>;
243
244impl<T> Box<T> {
245    /// Allocates memory on the heap and then places `x` into it.
246    ///
247    /// This doesn't actually allocate if `T` is zero-sized.
248    ///
249    /// # Examples
250    ///
251    /// ```
252    /// let five = Box::new(5);
253    /// ```
254    #[cfg(not(no_global_oom_handling))]
255    #[inline(always)]
256    #[stable(feature = "rust1", since = "1.0.0")]
257    #[must_use]
258    #[rustc_diagnostic_item = "box_new"]
259    #[cfg_attr(miri, track_caller)] // even without panics, this helps for Miri backtraces
260    pub fn new(x: T) -> Self {
261        return box_new(x);
262    }
263
264    /// Constructs a new box with uninitialized contents.
265    ///
266    /// # Examples
267    ///
268    /// ```
269    /// let mut five = Box::<u32>::new_uninit();
270    /// // Deferred initialization:
271    /// five.write(5);
272    /// let five = unsafe { five.assume_init() };
273    ///
274    /// assert_eq!(*five, 5)
275    /// ```
276    #[cfg(not(no_global_oom_handling))]
277    #[stable(feature = "new_uninit", since = "1.82.0")]
278    #[must_use]
279    #[inline]
280    pub fn new_uninit() -> Box<mem::MaybeUninit<T>> {
281        Self::new_uninit_in(Global)
282    }
283
284    /// Constructs a new `Box` with uninitialized contents, with the memory
285    /// being filled with `0` bytes.
286    ///
287    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
288    /// of this method.
289    ///
290    /// # Examples
291    ///
292    /// ```
293    /// #![feature(new_zeroed_alloc)]
294    ///
295    /// let zero = Box::<u32>::new_zeroed();
296    /// let zero = unsafe { zero.assume_init() };
297    ///
298    /// assert_eq!(*zero, 0)
299    /// ```
300    ///
301    /// [zeroed]: mem::MaybeUninit::zeroed
302    #[cfg(not(no_global_oom_handling))]
303    #[inline]
304    #[unstable(feature = "new_zeroed_alloc", issue = "129396")]
305    #[must_use]
306    pub fn new_zeroed() -> Box<mem::MaybeUninit<T>> {
307        Self::new_zeroed_in(Global)
308    }
309
310    /// Constructs a new `Pin<Box<T>>`. If `T` does not implement [`Unpin`], then
311    /// `x` will be pinned in memory and unable to be moved.
312    ///
313    /// Constructing and pinning of the `Box` can also be done in two steps: `Box::pin(x)`
314    /// does the same as <code>[Box::into_pin]\([Box::new]\(x))</code>. Consider using
315    /// [`into_pin`](Box::into_pin) if you already have a `Box<T>`, or if you want to
316    /// construct a (pinned) `Box` in a different way than with [`Box::new`].
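    ///
    /// # Examples
    ///
    /// A minimal illustration; both forms below construct an equivalent pinned box:
    ///
    /// ```
    /// use std::pin::Pin;
    ///
    /// let pinned: Pin<Box<u8>> = Box::pin(5);
    /// let also_pinned: Pin<Box<u8>> = Box::into_pin(Box::new(5));
    /// assert_eq!(*pinned, *also_pinned);
    /// ```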
317    #[cfg(not(no_global_oom_handling))]
318    #[stable(feature = "pin", since = "1.33.0")]
319    #[must_use]
320    #[inline(always)]
321    pub fn pin(x: T) -> Pin<Box<T>> {
322        Box::new(x).into()
323    }
324
    /// Allocates memory on the heap and then places `x` into it,
    /// returning an error if the allocation fails.
327    ///
328    /// This doesn't actually allocate if `T` is zero-sized.
329    ///
330    /// # Examples
331    ///
332    /// ```
333    /// #![feature(allocator_api)]
334    ///
335    /// let five = Box::try_new(5)?;
336    /// # Ok::<(), std::alloc::AllocError>(())
337    /// ```
338    #[unstable(feature = "allocator_api", issue = "32838")]
339    #[inline]
340    pub fn try_new(x: T) -> Result<Self, AllocError> {
341        Self::try_new_in(x, Global)
342    }
343
344    /// Constructs a new box with uninitialized contents on the heap,
    /// returning an error if the allocation fails.
346    ///
347    /// # Examples
348    ///
349    /// ```
350    /// #![feature(allocator_api)]
351    ///
352    /// let mut five = Box::<u32>::try_new_uninit()?;
353    /// // Deferred initialization:
354    /// five.write(5);
355    /// let five = unsafe { five.assume_init() };
356    ///
357    /// assert_eq!(*five, 5);
358    /// # Ok::<(), std::alloc::AllocError>(())
359    /// ```
360    #[unstable(feature = "allocator_api", issue = "32838")]
361    // #[unstable(feature = "new_uninit", issue = "63291")]
362    #[inline]
363    pub fn try_new_uninit() -> Result<Box<mem::MaybeUninit<T>>, AllocError> {
364        Box::try_new_uninit_in(Global)
365    }
366
367    /// Constructs a new `Box` with uninitialized contents, with the memory
    /// being filled with `0` bytes on the heap.
369    ///
370    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
371    /// of this method.
372    ///
373    /// # Examples
374    ///
375    /// ```
376    /// #![feature(allocator_api)]
377    ///
378    /// let zero = Box::<u32>::try_new_zeroed()?;
379    /// let zero = unsafe { zero.assume_init() };
380    ///
381    /// assert_eq!(*zero, 0);
382    /// # Ok::<(), std::alloc::AllocError>(())
383    /// ```
384    ///
385    /// [zeroed]: mem::MaybeUninit::zeroed
386    #[unstable(feature = "allocator_api", issue = "32838")]
387    // #[unstable(feature = "new_uninit", issue = "63291")]
388    #[inline]
389    pub fn try_new_zeroed() -> Result<Box<mem::MaybeUninit<T>>, AllocError> {
390        Box::try_new_zeroed_in(Global)
391    }
392}
393
394impl<T, A: Allocator> Box<T, A> {
395    /// Allocates memory in the given allocator then places `x` into it.
396    ///
397    /// This doesn't actually allocate if `T` is zero-sized.
398    ///
399    /// # Examples
400    ///
401    /// ```
402    /// #![feature(allocator_api)]
403    ///
404    /// use std::alloc::System;
405    ///
406    /// let five = Box::new_in(5, System);
407    /// ```
408    #[cfg(not(no_global_oom_handling))]
409    #[unstable(feature = "allocator_api", issue = "32838")]
410    #[must_use]
411    #[inline]
412    pub fn new_in(x: T, alloc: A) -> Self
413    where
414        A: Allocator,
415    {
416        let mut boxed = Self::new_uninit_in(alloc);
417        boxed.write(x);
418        unsafe { boxed.assume_init() }
419    }
420
421    /// Allocates memory in the given allocator then places `x` into it,
    /// returning an error if the allocation fails.
423    ///
424    /// This doesn't actually allocate if `T` is zero-sized.
425    ///
426    /// # Examples
427    ///
428    /// ```
429    /// #![feature(allocator_api)]
430    ///
431    /// use std::alloc::System;
432    ///
433    /// let five = Box::try_new_in(5, System)?;
434    /// # Ok::<(), std::alloc::AllocError>(())
435    /// ```
436    #[unstable(feature = "allocator_api", issue = "32838")]
437    #[inline]
438    pub fn try_new_in(x: T, alloc: A) -> Result<Self, AllocError>
439    where
440        A: Allocator,
441    {
442        let mut boxed = Self::try_new_uninit_in(alloc)?;
443        boxed.write(x);
444        unsafe { Ok(boxed.assume_init()) }
445    }
446
447    /// Constructs a new box with uninitialized contents in the provided allocator.
448    ///
449    /// # Examples
450    ///
451    /// ```
452    /// #![feature(allocator_api)]
453    ///
454    /// use std::alloc::System;
455    ///
456    /// let mut five = Box::<u32, _>::new_uninit_in(System);
457    /// // Deferred initialization:
458    /// five.write(5);
459    /// let five = unsafe { five.assume_init() };
460    ///
461    /// assert_eq!(*five, 5)
462    /// ```
463    #[unstable(feature = "allocator_api", issue = "32838")]
464    #[cfg(not(no_global_oom_handling))]
465    #[must_use]
466    // #[unstable(feature = "new_uninit", issue = "63291")]
467    pub fn new_uninit_in(alloc: A) -> Box<mem::MaybeUninit<T>, A>
468    where
469        A: Allocator,
470    {
471        let layout = Layout::new::<mem::MaybeUninit<T>>();
        // NOTE: Prefer `match` over `unwrap_or_else` since the closure is sometimes
        // not inlinable, which would increase code size.
474        match Box::try_new_uninit_in(alloc) {
475            Ok(m) => m,
476            Err(_) => handle_alloc_error(layout),
477        }
478    }
479
480    /// Constructs a new box with uninitialized contents in the provided allocator,
    /// returning an error if the allocation fails.
482    ///
483    /// # Examples
484    ///
485    /// ```
486    /// #![feature(allocator_api)]
487    ///
488    /// use std::alloc::System;
489    ///
490    /// let mut five = Box::<u32, _>::try_new_uninit_in(System)?;
491    /// // Deferred initialization:
492    /// five.write(5);
493    /// let five = unsafe { five.assume_init() };
494    ///
495    /// assert_eq!(*five, 5);
496    /// # Ok::<(), std::alloc::AllocError>(())
497    /// ```
498    #[unstable(feature = "allocator_api", issue = "32838")]
499    // #[unstable(feature = "new_uninit", issue = "63291")]
500    pub fn try_new_uninit_in(alloc: A) -> Result<Box<mem::MaybeUninit<T>, A>, AllocError>
501    where
502        A: Allocator,
503    {
504        let ptr = if T::IS_ZST {
505            NonNull::dangling()
506        } else {
507            let layout = Layout::new::<mem::MaybeUninit<T>>();
508            alloc.allocate(layout)?.cast()
509        };
510        unsafe { Ok(Box::from_raw_in(ptr.as_ptr(), alloc)) }
511    }
512
513    /// Constructs a new `Box` with uninitialized contents, with the memory
514    /// being filled with `0` bytes in the provided allocator.
515    ///
516    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
517    /// of this method.
518    ///
519    /// # Examples
520    ///
521    /// ```
522    /// #![feature(allocator_api)]
523    ///
524    /// use std::alloc::System;
525    ///
526    /// let zero = Box::<u32, _>::new_zeroed_in(System);
527    /// let zero = unsafe { zero.assume_init() };
528    ///
529    /// assert_eq!(*zero, 0)
530    /// ```
531    ///
532    /// [zeroed]: mem::MaybeUninit::zeroed
533    #[unstable(feature = "allocator_api", issue = "32838")]
534    #[cfg(not(no_global_oom_handling))]
535    // #[unstable(feature = "new_uninit", issue = "63291")]
536    #[must_use]
537    pub fn new_zeroed_in(alloc: A) -> Box<mem::MaybeUninit<T>, A>
538    where
539        A: Allocator,
540    {
541        let layout = Layout::new::<mem::MaybeUninit<T>>();
        // NOTE: Prefer `match` over `unwrap_or_else` since the closure is sometimes
        // not inlinable, which would increase code size.
544        match Box::try_new_zeroed_in(alloc) {
545            Ok(m) => m,
546            Err(_) => handle_alloc_error(layout),
547        }
548    }
549
550    /// Constructs a new `Box` with uninitialized contents, with the memory
551    /// being filled with `0` bytes in the provided allocator,
552    /// returning an error if the allocation fails,
553    ///
554    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
555    /// of this method.
556    ///
557    /// # Examples
558    ///
559    /// ```
560    /// #![feature(allocator_api)]
561    ///
562    /// use std::alloc::System;
563    ///
564    /// let zero = Box::<u32, _>::try_new_zeroed_in(System)?;
565    /// let zero = unsafe { zero.assume_init() };
566    ///
567    /// assert_eq!(*zero, 0);
568    /// # Ok::<(), std::alloc::AllocError>(())
569    /// ```
570    ///
571    /// [zeroed]: mem::MaybeUninit::zeroed
572    #[unstable(feature = "allocator_api", issue = "32838")]
573    // #[unstable(feature = "new_uninit", issue = "63291")]
574    pub fn try_new_zeroed_in(alloc: A) -> Result<Box<mem::MaybeUninit<T>, A>, AllocError>
575    where
576        A: Allocator,
577    {
578        let ptr = if T::IS_ZST {
579            NonNull::dangling()
580        } else {
581            let layout = Layout::new::<mem::MaybeUninit<T>>();
582            alloc.allocate_zeroed(layout)?.cast()
583        };
584        unsafe { Ok(Box::from_raw_in(ptr.as_ptr(), alloc)) }
585    }
586
587    /// Constructs a new `Pin<Box<T, A>>`. If `T` does not implement [`Unpin`], then
588    /// `x` will be pinned in memory and unable to be moved.
589    ///
590    /// Constructing and pinning of the `Box` can also be done in two steps: `Box::pin_in(x, alloc)`
591    /// does the same as <code>[Box::into_pin]\([Box::new_in]\(x, alloc))</code>. Consider using
592    /// [`into_pin`](Box::into_pin) if you already have a `Box<T, A>`, or if you want to
593    /// construct a (pinned) `Box` in a different way than with [`Box::new_in`].
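    ///
    /// # Examples
    ///
    /// A minimal sketch using the `System` allocator (any other `Allocator` works the
    /// same way):
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    /// use std::pin::Pin;
    ///
    /// let pinned: Pin<Box<u8, System>> = Box::pin_in(5, System);
    /// assert_eq!(*pinned, 5);
    /// ```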
594    #[cfg(not(no_global_oom_handling))]
595    #[unstable(feature = "allocator_api", issue = "32838")]
596    #[must_use]
597    #[inline(always)]
598    pub fn pin_in(x: T, alloc: A) -> Pin<Self>
599    where
600        A: 'static + Allocator,
601    {
602        Self::into_pin(Self::new_in(x, alloc))
603    }
604
    /// Converts a `Box<T>` into a `Box<[T]>`.
606    ///
607    /// This conversion does not allocate on the heap and happens in place.
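    ///
    /// # Examples
    ///
    /// A short sketch of the conversion:
    ///
    /// ```
    /// #![feature(box_into_boxed_slice)]
    ///
    /// let boxed = Box::new(1u8);
    /// let slice: Box<[u8]> = Box::into_boxed_slice(boxed);
    /// assert_eq!(&*slice, &[1]);
    /// ```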
608    #[unstable(feature = "box_into_boxed_slice", issue = "71582")]
609    pub fn into_boxed_slice(boxed: Self) -> Box<[T], A> {
610        let (raw, alloc) = Box::into_raw_with_allocator(boxed);
611        unsafe { Box::from_raw_in(raw as *mut [T; 1], alloc) }
612    }
613
614    /// Consumes the `Box`, returning the wrapped value.
615    ///
616    /// # Examples
617    ///
618    /// ```
619    /// #![feature(box_into_inner)]
620    ///
621    /// let c = Box::new(5);
622    ///
623    /// assert_eq!(Box::into_inner(c), 5);
624    /// ```
625    #[unstable(feature = "box_into_inner", issue = "80437")]
626    #[inline]
627    pub fn into_inner(boxed: Self) -> T {
628        *boxed
629    }
630}
631
632impl<T> Box<[T]> {
633    /// Constructs a new boxed slice with uninitialized contents.
634    ///
635    /// # Examples
636    ///
637    /// ```
638    /// let mut values = Box::<[u32]>::new_uninit_slice(3);
639    /// // Deferred initialization:
640    /// values[0].write(1);
641    /// values[1].write(2);
642    /// values[2].write(3);
    /// let values = unsafe { values.assume_init() };
644    ///
645    /// assert_eq!(*values, [1, 2, 3])
646    /// ```
647    #[cfg(not(no_global_oom_handling))]
648    #[stable(feature = "new_uninit", since = "1.82.0")]
649    #[must_use]
650    pub fn new_uninit_slice(len: usize) -> Box<[mem::MaybeUninit<T>]> {
651        unsafe { RawVec::with_capacity(len).into_box(len) }
652    }
653
654    /// Constructs a new boxed slice with uninitialized contents, with the memory
655    /// being filled with `0` bytes.
656    ///
657    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
658    /// of this method.
659    ///
660    /// # Examples
661    ///
662    /// ```
663    /// #![feature(new_zeroed_alloc)]
664    ///
665    /// let values = Box::<[u32]>::new_zeroed_slice(3);
666    /// let values = unsafe { values.assume_init() };
667    ///
668    /// assert_eq!(*values, [0, 0, 0])
669    /// ```
670    ///
671    /// [zeroed]: mem::MaybeUninit::zeroed
672    #[cfg(not(no_global_oom_handling))]
673    #[unstable(feature = "new_zeroed_alloc", issue = "129396")]
674    #[must_use]
675    pub fn new_zeroed_slice(len: usize) -> Box<[mem::MaybeUninit<T>]> {
676        unsafe { RawVec::with_capacity_zeroed(len).into_box(len) }
677    }
678
679    /// Constructs a new boxed slice with uninitialized contents. Returns an error if
680    /// the allocation fails.
681    ///
682    /// # Examples
683    ///
684    /// ```
685    /// #![feature(allocator_api)]
686    ///
687    /// let mut values = Box::<[u32]>::try_new_uninit_slice(3)?;
688    /// // Deferred initialization:
689    /// values[0].write(1);
690    /// values[1].write(2);
691    /// values[2].write(3);
692    /// let values = unsafe { values.assume_init() };
693    ///
694    /// assert_eq!(*values, [1, 2, 3]);
695    /// # Ok::<(), std::alloc::AllocError>(())
696    /// ```
697    #[unstable(feature = "allocator_api", issue = "32838")]
698    #[inline]
699    pub fn try_new_uninit_slice(len: usize) -> Result<Box<[mem::MaybeUninit<T>]>, AllocError> {
700        let ptr = if T::IS_ZST || len == 0 {
701            NonNull::dangling()
702        } else {
703            let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
704                Ok(l) => l,
705                Err(_) => return Err(AllocError),
706            };
707            Global.allocate(layout)?.cast()
708        };
709        unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, Global).into_box(len)) }
710    }
711
712    /// Constructs a new boxed slice with uninitialized contents, with the memory
713    /// being filled with `0` bytes. Returns an error if the allocation fails.
714    ///
715    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
716    /// of this method.
717    ///
718    /// # Examples
719    ///
720    /// ```
721    /// #![feature(allocator_api)]
722    ///
723    /// let values = Box::<[u32]>::try_new_zeroed_slice(3)?;
724    /// let values = unsafe { values.assume_init() };
725    ///
726    /// assert_eq!(*values, [0, 0, 0]);
727    /// # Ok::<(), std::alloc::AllocError>(())
728    /// ```
729    ///
730    /// [zeroed]: mem::MaybeUninit::zeroed
731    #[unstable(feature = "allocator_api", issue = "32838")]
732    #[inline]
733    pub fn try_new_zeroed_slice(len: usize) -> Result<Box<[mem::MaybeUninit<T>]>, AllocError> {
734        let ptr = if T::IS_ZST || len == 0 {
735            NonNull::dangling()
736        } else {
737            let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
738                Ok(l) => l,
739                Err(_) => return Err(AllocError),
740            };
741            Global.allocate_zeroed(layout)?.cast()
742        };
743        unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, Global).into_box(len)) }
744    }
745
746    /// Converts the boxed slice into a boxed array.
747    ///
748    /// This operation does not reallocate; the underlying array of the slice is simply reinterpreted as an array type.
749    ///
750    /// If `N` is not exactly equal to the length of `self`, then this method returns `None`.
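    ///
    /// # Examples
    ///
    /// A short sketch; the const parameter `N` must match the slice's length exactly:
    ///
    /// ```
    /// #![feature(slice_as_array)]
    ///
    /// let boxed: Box<[i32]> = vec![1, 2, 3].into_boxed_slice();
    /// let array: Box<[i32; 3]> = boxed.into_array().expect("length matches");
    /// assert_eq!(*array, [1, 2, 3]);
    ///
    /// let too_short: Box<[i32]> = vec![1, 2].into_boxed_slice();
    /// assert!(too_short.into_array::<3>().is_none());
    /// ```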
751    #[unstable(feature = "slice_as_array", issue = "133508")]
752    #[inline]
753    #[must_use]
754    pub fn into_array<const N: usize>(self) -> Option<Box<[T; N]>> {
755        if self.len() == N {
756            let ptr = Self::into_raw(self) as *mut [T; N];
757
758            // SAFETY: The underlying array of a slice has the exact same layout as an actual array `[T; N]` if `N` is equal to the slice's length.
759            let me = unsafe { Box::from_raw(ptr) };
760            Some(me)
761        } else {
762            None
763        }
764    }
765}
766
767impl<T, A: Allocator> Box<[T], A> {
768    /// Constructs a new boxed slice with uninitialized contents in the provided allocator.
769    ///
770    /// # Examples
771    ///
772    /// ```
773    /// #![feature(allocator_api)]
774    ///
775    /// use std::alloc::System;
776    ///
777    /// let mut values = Box::<[u32], _>::new_uninit_slice_in(3, System);
778    /// // Deferred initialization:
779    /// values[0].write(1);
780    /// values[1].write(2);
781    /// values[2].write(3);
782    /// let values = unsafe { values.assume_init() };
783    ///
784    /// assert_eq!(*values, [1, 2, 3])
785    /// ```
786    #[cfg(not(no_global_oom_handling))]
787    #[unstable(feature = "allocator_api", issue = "32838")]
788    // #[unstable(feature = "new_uninit", issue = "63291")]
789    #[must_use]
790    pub fn new_uninit_slice_in(len: usize, alloc: A) -> Box<[mem::MaybeUninit<T>], A> {
791        unsafe { RawVec::with_capacity_in(len, alloc).into_box(len) }
792    }
793
794    /// Constructs a new boxed slice with uninitialized contents in the provided allocator,
795    /// with the memory being filled with `0` bytes.
796    ///
797    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
798    /// of this method.
799    ///
800    /// # Examples
801    ///
802    /// ```
803    /// #![feature(allocator_api)]
804    ///
805    /// use std::alloc::System;
806    ///
807    /// let values = Box::<[u32], _>::new_zeroed_slice_in(3, System);
808    /// let values = unsafe { values.assume_init() };
809    ///
810    /// assert_eq!(*values, [0, 0, 0])
811    /// ```
812    ///
813    /// [zeroed]: mem::MaybeUninit::zeroed
814    #[cfg(not(no_global_oom_handling))]
815    #[unstable(feature = "allocator_api", issue = "32838")]
816    // #[unstable(feature = "new_uninit", issue = "63291")]
817    #[must_use]
818    pub fn new_zeroed_slice_in(len: usize, alloc: A) -> Box<[mem::MaybeUninit<T>], A> {
819        unsafe { RawVec::with_capacity_zeroed_in(len, alloc).into_box(len) }
820    }
821
822    /// Constructs a new boxed slice with uninitialized contents in the provided allocator. Returns an error if
823    /// the allocation fails.
824    ///
825    /// # Examples
826    ///
827    /// ```
828    /// #![feature(allocator_api)]
829    ///
830    /// use std::alloc::System;
831    ///
832    /// let mut values = Box::<[u32], _>::try_new_uninit_slice_in(3, System)?;
833    /// // Deferred initialization:
834    /// values[0].write(1);
835    /// values[1].write(2);
836    /// values[2].write(3);
837    /// let values = unsafe { values.assume_init() };
838    ///
839    /// assert_eq!(*values, [1, 2, 3]);
840    /// # Ok::<(), std::alloc::AllocError>(())
841    /// ```
842    #[unstable(feature = "allocator_api", issue = "32838")]
843    #[inline]
844    pub fn try_new_uninit_slice_in(
845        len: usize,
846        alloc: A,
847    ) -> Result<Box<[mem::MaybeUninit<T>], A>, AllocError> {
848        let ptr = if T::IS_ZST || len == 0 {
849            NonNull::dangling()
850        } else {
851            let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
852                Ok(l) => l,
853                Err(_) => return Err(AllocError),
854            };
855            alloc.allocate(layout)?.cast()
856        };
857        unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, alloc).into_box(len)) }
858    }
859
860    /// Constructs a new boxed slice with uninitialized contents in the provided allocator, with the memory
861    /// being filled with `0` bytes. Returns an error if the allocation fails.
862    ///
863    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and incorrect usage
864    /// of this method.
865    ///
866    /// # Examples
867    ///
868    /// ```
869    /// #![feature(allocator_api)]
870    ///
871    /// use std::alloc::System;
872    ///
873    /// let values = Box::<[u32], _>::try_new_zeroed_slice_in(3, System)?;
874    /// let values = unsafe { values.assume_init() };
875    ///
876    /// assert_eq!(*values, [0, 0, 0]);
877    /// # Ok::<(), std::alloc::AllocError>(())
878    /// ```
879    ///
880    /// [zeroed]: mem::MaybeUninit::zeroed
881    #[unstable(feature = "allocator_api", issue = "32838")]
882    #[inline]
883    pub fn try_new_zeroed_slice_in(
884        len: usize,
885        alloc: A,
886    ) -> Result<Box<[mem::MaybeUninit<T>], A>, AllocError> {
887        let ptr = if T::IS_ZST || len == 0 {
888            NonNull::dangling()
889        } else {
890            let layout = match Layout::array::<mem::MaybeUninit<T>>(len) {
891                Ok(l) => l,
892                Err(_) => return Err(AllocError),
893            };
894            alloc.allocate_zeroed(layout)?.cast()
895        };
896        unsafe { Ok(RawVec::from_raw_parts_in(ptr.as_ptr(), len, alloc).into_box(len)) }
897    }
898}
899
900impl<T, A: Allocator> Box<mem::MaybeUninit<T>, A> {
901    /// Converts to `Box<T, A>`.
902    ///
903    /// # Safety
904    ///
905    /// As with [`MaybeUninit::assume_init`],
906    /// it is up to the caller to guarantee that the value
907    /// really is in an initialized state.
908    /// Calling this when the content is not yet fully initialized
909    /// causes immediate undefined behavior.
910    ///
911    /// [`MaybeUninit::assume_init`]: mem::MaybeUninit::assume_init
912    ///
913    /// # Examples
914    ///
915    /// ```
916    /// let mut five = Box::<u32>::new_uninit();
917    /// // Deferred initialization:
918    /// five.write(5);
919    /// let five: Box<u32> = unsafe { five.assume_init() };
920    ///
921    /// assert_eq!(*five, 5)
922    /// ```
923    #[stable(feature = "new_uninit", since = "1.82.0")]
924    #[inline]
925    pub unsafe fn assume_init(self) -> Box<T, A> {
926        let (raw, alloc) = Box::into_raw_with_allocator(self);
927        unsafe { Box::from_raw_in(raw as *mut T, alloc) }
928    }
929
930    /// Writes the value and converts to `Box<T, A>`.
931    ///
932    /// This method converts the box similarly to [`Box::assume_init`] but
    /// writes `value` into it before conversion, thus guaranteeing safety.
    /// In some scenarios, use of this method may improve performance because
    /// the compiler may be able to optimize copying from the stack.
936    ///
937    /// # Examples
938    ///
939    /// ```
940    /// #![feature(box_uninit_write)]
941    ///
942    /// let big_box = Box::<[usize; 1024]>::new_uninit();
943    ///
944    /// let mut array = [0; 1024];
945    /// for (i, place) in array.iter_mut().enumerate() {
946    ///     *place = i;
947    /// }
948    ///
    /// // The optimizer may be able to elide this copy, so the previous code
    /// // effectively writes directly to the heap.
951    /// let big_box = Box::write(big_box, array);
952    ///
953    /// for (i, x) in big_box.iter().enumerate() {
954    ///     assert_eq!(*x, i);
955    /// }
956    /// ```
957    #[unstable(feature = "box_uninit_write", issue = "129397")]
958    #[inline]
959    pub fn write(mut boxed: Self, value: T) -> Box<T, A> {
960        unsafe {
961            (*boxed).write(value);
962            boxed.assume_init()
963        }
964    }
965}
966
967impl<T, A: Allocator> Box<[mem::MaybeUninit<T>], A> {
968    /// Converts to `Box<[T], A>`.
969    ///
970    /// # Safety
971    ///
972    /// As with [`MaybeUninit::assume_init`],
973    /// it is up to the caller to guarantee that the values
974    /// really are in an initialized state.
975    /// Calling this when the content is not yet fully initialized
976    /// causes immediate undefined behavior.
977    ///
978    /// [`MaybeUninit::assume_init`]: mem::MaybeUninit::assume_init
979    ///
980    /// # Examples
981    ///
982    /// ```
983    /// let mut values = Box::<[u32]>::new_uninit_slice(3);
984    /// // Deferred initialization:
985    /// values[0].write(1);
986    /// values[1].write(2);
987    /// values[2].write(3);
988    /// let values = unsafe { values.assume_init() };
989    ///
990    /// assert_eq!(*values, [1, 2, 3])
991    /// ```
992    #[stable(feature = "new_uninit", since = "1.82.0")]
993    #[inline]
994    pub unsafe fn assume_init(self) -> Box<[T], A> {
995        let (raw, alloc) = Box::into_raw_with_allocator(self);
996        unsafe { Box::from_raw_in(raw as *mut [T], alloc) }
997    }
998}
999
1000impl<T: ?Sized> Box<T> {
1001    /// Constructs a box from a raw pointer.
1002    ///
1003    /// After calling this function, the raw pointer is owned by the
1004    /// resulting `Box`. Specifically, the `Box` destructor will call
1005    /// the destructor of `T` and free the allocated memory. For this
1006    /// to be safe, the memory must have been allocated in accordance
    /// with the [memory layout] used by `Box`.
1008    ///
1009    /// # Safety
1010    ///
1011    /// This function is unsafe because improper use may lead to
1012    /// memory problems. For example, a double-free may occur if the
1013    /// function is called twice on the same raw pointer.
1014    ///
1015    /// The raw pointer must point to a block of memory allocated by the global allocator.
1016    ///
1017    /// The safety conditions are described in the [memory layout] section.
1018    ///
1019    /// # Examples
1020    ///
1021    /// Recreate a `Box` which was previously converted to a raw pointer
1022    /// using [`Box::into_raw`]:
1023    /// ```
1024    /// let x = Box::new(5);
1025    /// let ptr = Box::into_raw(x);
1026    /// let x = unsafe { Box::from_raw(ptr) };
1027    /// ```
1028    /// Manually create a `Box` from scratch by using the global allocator:
1029    /// ```
1030    /// use std::alloc::{alloc, Layout};
1031    ///
1032    /// unsafe {
1033    ///     let ptr = alloc(Layout::new::<i32>()) as *mut i32;
1034    ///     // In general .write is required to avoid attempting to destruct
1035    ///     // the (uninitialized) previous contents of `ptr`, though for this
1036    ///     // simple example `*ptr = 5` would have worked as well.
1037    ///     ptr.write(5);
1038    ///     let x = Box::from_raw(ptr);
1039    /// }
1040    /// ```
1041    ///
1042    /// [memory layout]: self#memory-layout
1043    #[stable(feature = "box_raw", since = "1.4.0")]
1044    #[inline]
1045    #[must_use = "call `drop(Box::from_raw(ptr))` if you intend to drop the `Box`"]
1046    pub unsafe fn from_raw(raw: *mut T) -> Self {
1047        unsafe { Self::from_raw_in(raw, Global) }
1048    }
1049
1050    /// Constructs a box from a `NonNull` pointer.
1051    ///
1052    /// After calling this function, the `NonNull` pointer is owned by
1053    /// the resulting `Box`. Specifically, the `Box` destructor will call
1054    /// the destructor of `T` and free the allocated memory. For this
1055    /// to be safe, the memory must have been allocated in accordance
    /// with the [memory layout] used by `Box`.
1057    ///
1058    /// # Safety
1059    ///
1060    /// This function is unsafe because improper use may lead to
1061    /// memory problems. For example, a double-free may occur if the
1062    /// function is called twice on the same `NonNull` pointer.
1063    ///
1064    /// The non-null pointer must point to a block of memory allocated by the global allocator.
1065    ///
1066    /// The safety conditions are described in the [memory layout] section.
1067    ///
1068    /// # Examples
1069    ///
1070    /// Recreate a `Box` which was previously converted to a `NonNull`
1071    /// pointer using [`Box::into_non_null`]:
1072    /// ```
1073    /// #![feature(box_vec_non_null)]
1074    ///
1075    /// let x = Box::new(5);
1076    /// let non_null = Box::into_non_null(x);
1077    /// let x = unsafe { Box::from_non_null(non_null) };
1078    /// ```
1079    /// Manually create a `Box` from scratch by using the global allocator:
1080    /// ```
1081    /// #![feature(box_vec_non_null)]
1082    ///
1083    /// use std::alloc::{alloc, Layout};
1084    /// use std::ptr::NonNull;
1085    ///
1086    /// unsafe {
1087    ///     let non_null = NonNull::new(alloc(Layout::new::<i32>()).cast::<i32>())
1088    ///         .expect("allocation failed");
1089    ///     // In general .write is required to avoid attempting to destruct
1090    ///     // the (uninitialized) previous contents of `non_null`.
1091    ///     non_null.write(5);
1092    ///     let x = Box::from_non_null(non_null);
1093    /// }
1094    /// ```
1095    ///
1096    /// [memory layout]: self#memory-layout
1097    #[unstable(feature = "box_vec_non_null", reason = "new API", issue = "130364")]
1098    #[inline]
1099    #[must_use = "call `drop(Box::from_non_null(ptr))` if you intend to drop the `Box`"]
1100    pub unsafe fn from_non_null(ptr: NonNull<T>) -> Self {
1101        unsafe { Self::from_raw(ptr.as_ptr()) }
1102    }
1103}
1104
1105impl<T: ?Sized, A: Allocator> Box<T, A> {
1106    /// Constructs a box from a raw pointer in the given allocator.
1107    ///
1108    /// After calling this function, the raw pointer is owned by the
1109    /// resulting `Box`. Specifically, the `Box` destructor will call
1110    /// the destructor of `T` and free the allocated memory. For this
1111    /// to be safe, the memory must have been allocated in accordance
    /// with the [memory layout] used by `Box`.
1113    ///
1114    /// # Safety
1115    ///
1116    /// This function is unsafe because improper use may lead to
1117    /// memory problems. For example, a double-free may occur if the
1118    /// function is called twice on the same raw pointer.
1119    ///
1120    /// The raw pointer must point to a block of memory allocated by `alloc`.
1121    ///
1122    /// # Examples
1123    ///
1124    /// Recreate a `Box` which was previously converted to a raw pointer
1125    /// using [`Box::into_raw_with_allocator`]:
1126    /// ```
1127    /// #![feature(allocator_api)]
1128    ///
1129    /// use std::alloc::System;
1130    ///
1131    /// let x = Box::new_in(5, System);
1132    /// let (ptr, alloc) = Box::into_raw_with_allocator(x);
1133    /// let x = unsafe { Box::from_raw_in(ptr, alloc) };
1134    /// ```
1135    /// Manually create a `Box` from scratch by using the system allocator:
1136    /// ```
1137    /// #![feature(allocator_api, slice_ptr_get)]
1138    ///
1139    /// use std::alloc::{Allocator, Layout, System};
1140    ///
1141    /// unsafe {
1142    ///     let ptr = System.allocate(Layout::new::<i32>())?.as_mut_ptr() as *mut i32;
1143    ///     // In general .write is required to avoid attempting to destruct
1144    ///     // the (uninitialized) previous contents of `ptr`, though for this
1145    ///     // simple example `*ptr = 5` would have worked as well.
1146    ///     ptr.write(5);
1147    ///     let x = Box::from_raw_in(ptr, System);
1148    /// }
1149    /// # Ok::<(), std::alloc::AllocError>(())
1150    /// ```
1151    ///
1152    /// [memory layout]: self#memory-layout
1153    #[unstable(feature = "allocator_api", issue = "32838")]
1154    #[rustc_const_unstable(feature = "const_box", issue = "92521")]
1155    #[inline]
1156    pub const unsafe fn from_raw_in(raw: *mut T, alloc: A) -> Self {
1157        Box(unsafe { Unique::new_unchecked(raw) }, alloc)
1158    }
1159
1160    /// Constructs a box from a `NonNull` pointer in the given allocator.
1161    ///
1162    /// After calling this function, the `NonNull` pointer is owned by
1163    /// the resulting `Box`. Specifically, the `Box` destructor will call
1164    /// the destructor of `T` and free the allocated memory. For this
1165    /// to be safe, the memory must have been allocated in accordance
    /// with the [memory layout] used by `Box`.
1167    ///
1168    /// # Safety
1169    ///
1170    /// This function is unsafe because improper use may lead to
1171    /// memory problems. For example, a double-free may occur if the
1172    /// function is called twice on the same raw pointer.
1173    ///
1174    /// The non-null pointer must point to a block of memory allocated by `alloc`.
1175    ///
1176    /// # Examples
1177    ///
1178    /// Recreate a `Box` which was previously converted to a `NonNull` pointer
1179    /// using [`Box::into_non_null_with_allocator`]:
1180    /// ```
1181    /// #![feature(allocator_api, box_vec_non_null)]
1182    ///
1183    /// use std::alloc::System;
1184    ///
1185    /// let x = Box::new_in(5, System);
1186    /// let (non_null, alloc) = Box::into_non_null_with_allocator(x);
1187    /// let x = unsafe { Box::from_non_null_in(non_null, alloc) };
1188    /// ```
1189    /// Manually create a `Box` from scratch by using the system allocator:
1190    /// ```
1191    /// #![feature(allocator_api, box_vec_non_null, slice_ptr_get)]
1192    ///
1193    /// use std::alloc::{Allocator, Layout, System};
1194    ///
1195    /// unsafe {
1196    ///     let non_null = System.allocate(Layout::new::<i32>())?.cast::<i32>();
1197    ///     // In general .write is required to avoid attempting to destruct
1198    ///     // the (uninitialized) previous contents of `non_null`.
1199    ///     non_null.write(5);
1200    ///     let x = Box::from_non_null_in(non_null, System);
1201    /// }
1202    /// # Ok::<(), std::alloc::AllocError>(())
1203    /// ```
1204    ///
1205    /// [memory layout]: self#memory-layout
1206    #[unstable(feature = "allocator_api", issue = "32838")]
1207    // #[unstable(feature = "box_vec_non_null", reason = "new API", issue = "130364")]
1208    #[rustc_const_unstable(feature = "const_box", issue = "92521")]
1209    #[inline]
1210    pub const unsafe fn from_non_null_in(raw: NonNull<T>, alloc: A) -> Self {
1211        // SAFETY: guaranteed by the caller.
1212        unsafe { Box::from_raw_in(raw.as_ptr(), alloc) }
1213    }
1214
1215    /// Consumes the `Box`, returning a wrapped raw pointer.
1216    ///
1217    /// The pointer will be properly aligned and non-null.
1218    ///
1219    /// After calling this function, the caller is responsible for the
1220    /// memory previously managed by the `Box`. In particular, the
1221    /// caller should properly destroy `T` and release the memory, taking
1222    /// into account the [memory layout] used by `Box`. The easiest way to
1223    /// do this is to convert the raw pointer back into a `Box` with the
1224    /// [`Box::from_raw`] function, allowing the `Box` destructor to perform
1225    /// the cleanup.
1226    ///
1227    /// Note: this is an associated function, which means that you have
1228    /// to call it as `Box::into_raw(b)` instead of `b.into_raw()`. This
1229    /// is so that there is no conflict with a method on the inner type.
1230    ///
1231    /// # Examples
1232    /// Converting the raw pointer back into a `Box` with [`Box::from_raw`]
1233    /// for automatic cleanup:
1234    /// ```
1235    /// let x = Box::new(String::from("Hello"));
1236    /// let ptr = Box::into_raw(x);
1237    /// let x = unsafe { Box::from_raw(ptr) };
1238    /// ```
1239    /// Manual cleanup by explicitly running the destructor and deallocating
1240    /// the memory:
1241    /// ```
1242    /// use std::alloc::{dealloc, Layout};
1243    /// use std::ptr;
1244    ///
1245    /// let x = Box::new(String::from("Hello"));
1246    /// let ptr = Box::into_raw(x);
1247    /// unsafe {
1248    ///     ptr::drop_in_place(ptr);
1249    ///     dealloc(ptr as *mut u8, Layout::new::<String>());
1250    /// }
1251    /// ```
1252    /// Note: This is equivalent to the following:
1253    /// ```
1254    /// let x = Box::new(String::from("Hello"));
1255    /// let ptr = Box::into_raw(x);
1256    /// unsafe {
1257    ///     drop(Box::from_raw(ptr));
1258    /// }
1259    /// ```
1260    ///
1261    /// [memory layout]: self#memory-layout
1262    #[must_use = "losing the pointer will leak memory"]
1263    #[stable(feature = "box_raw", since = "1.4.0")]
1264    #[inline]
1265    pub fn into_raw(b: Self) -> *mut T {
1266        // Make sure Miri realizes that we transition from a noalias pointer to a raw pointer here.
1267        unsafe { &raw mut *&mut *Self::into_raw_with_allocator(b).0 }
1268    }
1269
1270    /// Consumes the `Box`, returning a wrapped `NonNull` pointer.
1271    ///
1272    /// The pointer will be properly aligned.
1273    ///
1274    /// After calling this function, the caller is responsible for the
1275    /// memory previously managed by the `Box`. In particular, the
1276    /// caller should properly destroy `T` and release the memory, taking
1277    /// into account the [memory layout] used by `Box`. The easiest way to
1278    /// do this is to convert the `NonNull` pointer back into a `Box` with the
1279    /// [`Box::from_non_null`] function, allowing the `Box` destructor to
1280    /// perform the cleanup.
1281    ///
1282    /// Note: this is an associated function, which means that you have
1283    /// to call it as `Box::into_non_null(b)` instead of `b.into_non_null()`.
1284    /// This is so that there is no conflict with a method on the inner type.
1285    ///
1286    /// # Examples
1287    /// Converting the `NonNull` pointer back into a `Box` with [`Box::from_non_null`]
1288    /// for automatic cleanup:
1289    /// ```
1290    /// #![feature(box_vec_non_null)]
1291    ///
1292    /// let x = Box::new(String::from("Hello"));
1293    /// let non_null = Box::into_non_null(x);
1294    /// let x = unsafe { Box::from_non_null(non_null) };
1295    /// ```
1296    /// Manual cleanup by explicitly running the destructor and deallocating
1297    /// the memory:
1298    /// ```
1299    /// #![feature(box_vec_non_null)]
1300    ///
1301    /// use std::alloc::{dealloc, Layout};
1302    ///
1303    /// let x = Box::new(String::from("Hello"));
1304    /// let non_null = Box::into_non_null(x);
1305    /// unsafe {
1306    ///     non_null.drop_in_place();
1307    ///     dealloc(non_null.as_ptr().cast::<u8>(), Layout::new::<String>());
1308    /// }
1309    /// ```
1310    /// Note: This is equivalent to the following:
1311    /// ```
1312    /// #![feature(box_vec_non_null)]
1313    ///
1314    /// let x = Box::new(String::from("Hello"));
1315    /// let non_null = Box::into_non_null(x);
1316    /// unsafe {
1317    ///     drop(Box::from_non_null(non_null));
1318    /// }
1319    /// ```
1320    ///
1321    /// [memory layout]: self#memory-layout
1322    #[must_use = "losing the pointer will leak memory"]
1323    #[unstable(feature = "box_vec_non_null", reason = "new API", issue = "130364")]
1324    #[inline]
1325    pub fn into_non_null(b: Self) -> NonNull<T> {
1326        // SAFETY: `Box` is guaranteed to be non-null.
1327        unsafe { NonNull::new_unchecked(Self::into_raw(b)) }
1328    }
1329
1330    /// Consumes the `Box`, returning a wrapped raw pointer and the allocator.
1331    ///
1332    /// The pointer will be properly aligned and non-null.
1333    ///
1334    /// After calling this function, the caller is responsible for the
1335    /// memory previously managed by the `Box`. In particular, the
1336    /// caller should properly destroy `T` and release the memory, taking
1337    /// into account the [memory layout] used by `Box`. The easiest way to
1338    /// do this is to convert the raw pointer back into a `Box` with the
1339    /// [`Box::from_raw_in`] function, allowing the `Box` destructor to perform
1340    /// the cleanup.
1341    ///
1342    /// Note: this is an associated function, which means that you have
1343    /// to call it as `Box::into_raw_with_allocator(b)` instead of `b.into_raw_with_allocator()`. This
1344    /// is so that there is no conflict with a method on the inner type.
1345    ///
1346    /// # Examples
1347    /// Converting the raw pointer back into a `Box` with [`Box::from_raw_in`]
1348    /// for automatic cleanup:
1349    /// ```
1350    /// #![feature(allocator_api)]
1351    ///
1352    /// use std::alloc::System;
1353    ///
1354    /// let x = Box::new_in(String::from("Hello"), System);
1355    /// let (ptr, alloc) = Box::into_raw_with_allocator(x);
1356    /// let x = unsafe { Box::from_raw_in(ptr, alloc) };
1357    /// ```
1358    /// Manual cleanup by explicitly running the destructor and deallocating
1359    /// the memory:
1360    /// ```
1361    /// #![feature(allocator_api)]
1362    ///
1363    /// use std::alloc::{Allocator, Layout, System};
1364    /// use std::ptr::{self, NonNull};
1365    ///
1366    /// let x = Box::new_in(String::from("Hello"), System);
1367    /// let (ptr, alloc) = Box::into_raw_with_allocator(x);
1368    /// unsafe {
1369    ///     ptr::drop_in_place(ptr);
1370    ///     let non_null = NonNull::new_unchecked(ptr);
1371    ///     alloc.deallocate(non_null.cast(), Layout::new::<String>());
1372    /// }
1373    /// ```
1374    ///
1375    /// [memory layout]: self#memory-layout
1376    #[must_use = "losing the pointer will leak memory"]
1377    #[unstable(feature = "allocator_api", issue = "32838")]
1378    #[inline]
1379    pub fn into_raw_with_allocator(b: Self) -> (*mut T, A) {
1380        let mut b = mem::ManuallyDrop::new(b);
1381        // We carefully get the raw pointer out in a way that Miri's aliasing model understands what
1382        // is happening: using the primitive "deref" of `Box`. In case `A` is *not* `Global`, we
1383        // want *no* aliasing requirements here!
1384        // In case `A` *is* `Global`, this does not quite have the right behavior; `into_raw`
1385        // works around that.
1386        let ptr = &raw mut **b;
1387        let alloc = unsafe { ptr::read(&b.1) };
1388        (ptr, alloc)
1389    }
1390
1391    /// Consumes the `Box`, returning a wrapped `NonNull` pointer and the allocator.
1392    ///
1393    /// The pointer will be properly aligned.
1394    ///
1395    /// After calling this function, the caller is responsible for the
1396    /// memory previously managed by the `Box`. In particular, the
1397    /// caller should properly destroy `T` and release the memory, taking
1398    /// into account the [memory layout] used by `Box`. The easiest way to
1399    /// do this is to convert the `NonNull` pointer back into a `Box` with the
1400    /// [`Box::from_non_null_in`] function, allowing the `Box` destructor to
1401    /// perform the cleanup.
1402    ///
1403    /// Note: this is an associated function, which means that you have
1404    /// to call it as `Box::into_non_null_with_allocator(b)` instead of
1405    /// `b.into_non_null_with_allocator()`. This is so that there is no
1406    /// conflict with a method on the inner type.
1407    ///
1408    /// # Examples
1409    /// Converting the `NonNull` pointer back into a `Box` with
1410    /// [`Box::from_non_null_in`] for automatic cleanup:
1411    /// ```
1412    /// #![feature(allocator_api, box_vec_non_null)]
1413    ///
1414    /// use std::alloc::System;
1415    ///
1416    /// let x = Box::new_in(String::from("Hello"), System);
1417    /// let (non_null, alloc) = Box::into_non_null_with_allocator(x);
1418    /// let x = unsafe { Box::from_non_null_in(non_null, alloc) };
1419    /// ```
1420    /// Manual cleanup by explicitly running the destructor and deallocating
1421    /// the memory:
1422    /// ```
1423    /// #![feature(allocator_api, box_vec_non_null)]
1424    ///
1425    /// use std::alloc::{Allocator, Layout, System};
1426    ///
1427    /// let x = Box::new_in(String::from("Hello"), System);
1428    /// let (non_null, alloc) = Box::into_non_null_with_allocator(x);
1429    /// unsafe {
1430    ///     non_null.drop_in_place();
1431    ///     alloc.deallocate(non_null.cast::<u8>(), Layout::new::<String>());
1432    /// }
1433    /// ```
1434    ///
1435    /// [memory layout]: self#memory-layout
1436    #[must_use = "losing the pointer will leak memory"]
1437    #[unstable(feature = "allocator_api", issue = "32838")]
1438    // #[unstable(feature = "box_vec_non_null", reason = "new API", issue = "130364")]
1439    #[inline]
1440    pub fn into_non_null_with_allocator(b: Self) -> (NonNull<T>, A) {
1441        let (ptr, alloc) = Box::into_raw_with_allocator(b);
1442        // SAFETY: `Box` is guaranteed to be non-null.
1443        unsafe { (NonNull::new_unchecked(ptr), alloc) }
1444    }
1445
1446    #[unstable(
1447        feature = "ptr_internals",
1448        issue = "none",
1449        reason = "use `Box::leak(b).into()` or `Unique::from(Box::leak(b))` instead"
1450    )]
1451    #[inline]
1452    #[doc(hidden)]
1453    pub fn into_unique(b: Self) -> (Unique<T>, A) {
1454        let (ptr, alloc) = Box::into_raw_with_allocator(b);
1455        unsafe { (Unique::from(&mut *ptr), alloc) }
1456    }
1457
1458    /// Returns a raw mutable pointer to the `Box`'s contents.
1459    ///
1460    /// The caller must ensure that the `Box` outlives the pointer this
1461    /// function returns, or else it will end up dangling.
1462    ///
1463    /// This method guarantees that, for the purpose of the aliasing model, it
1464    /// does not materialize a reference to the underlying memory, and thus the returned pointer
1465    /// will remain valid when mixed with other calls to [`as_ptr`] and [`as_mut_ptr`].
1466    /// Note that calling other methods that materialize references to the memory
1467    /// may still invalidate this pointer.
1468    /// See the example below for how this guarantee can be used.
1469    ///
1470    /// # Examples
1471    ///
1472    /// Due to the aliasing guarantee, the following code is legal:
1473    ///
1474    /// ```rust
1475    /// #![feature(box_as_ptr)]
1476    ///
1477    /// unsafe {
1478    ///     let mut b = Box::new(0);
1479    ///     let ptr1 = Box::as_mut_ptr(&mut b);
1480    ///     ptr1.write(1);
1481    ///     let ptr2 = Box::as_mut_ptr(&mut b);
1482    ///     ptr2.write(2);
1483    ///     // Notably, the write to `ptr2` did *not* invalidate `ptr1`:
1484    ///     ptr1.write(3);
1485    /// }
1486    /// ```
1487    ///
1488    /// [`as_mut_ptr`]: Self::as_mut_ptr
1489    /// [`as_ptr`]: Self::as_ptr
1490    #[unstable(feature = "box_as_ptr", issue = "129090")]
1491    #[rustc_never_returns_null_ptr]
1492    #[rustc_as_ptr]
1493    #[inline]
1494    pub fn as_mut_ptr(b: &mut Self) -> *mut T {
1495        // This is a primitive deref, not going through `DerefMut`, and therefore not materializing
1496        // any references.
1497        &raw mut **b
1498    }
1499
1500    /// Returns a raw pointer to the `Box`'s contents.
1501    ///
1502    /// The caller must ensure that the `Box` outlives the pointer this
1503    /// function returns, or else it will end up dangling.
1504    ///
1505    /// The caller must also ensure that the memory the pointer (non-transitively) points to
1506    /// is never written to (except inside an `UnsafeCell`) using this pointer or any pointer
1507    /// derived from it. If you need to mutate the contents of the `Box`, use [`as_mut_ptr`].
1508    ///
1509    /// This method guarantees that, for the purpose of the aliasing model, it
1510    /// does not materialize a reference to the underlying memory, and thus the returned pointer
1511    /// will remain valid when mixed with other calls to [`as_ptr`] and [`as_mut_ptr`].
1512    /// Note that calling other methods that materialize mutable references to the memory,
1513    /// as well as writing to this memory, may still invalidate this pointer.
1514    /// See the example below for how this guarantee can be used.
1515    ///
1516    /// # Examples
1517    ///
1518    /// Due to the aliasing guarantee, the following code is legal:
1519    ///
1520    /// ```rust
1521    /// #![feature(box_as_ptr)]
1522    ///
1523    /// unsafe {
1524    ///     let mut v = Box::new(0);
1525    ///     let ptr1 = Box::as_ptr(&v);
1526    ///     let ptr2 = Box::as_mut_ptr(&mut v);
1527    ///     let _val = ptr2.read();
1528    ///     // No write to this memory has happened yet, so `ptr1` is still valid.
1529    ///     let _val = ptr1.read();
1530    ///     // However, once we do a write...
1531    ///     ptr2.write(1);
1532    ///     // ... `ptr1` is no longer valid.
1533    ///     // This would be UB: let _val = ptr1.read();
1534    /// }
1535    /// ```
1536    ///
1537    /// [`as_mut_ptr`]: Self::as_mut_ptr
1538    /// [`as_ptr`]: Self::as_ptr
1539    #[unstable(feature = "box_as_ptr", issue = "129090")]
1540    #[rustc_never_returns_null_ptr]
1541    #[rustc_as_ptr]
1542    #[inline]
1543    pub fn as_ptr(b: &Self) -> *const T {
1544        // This is a primitive deref, not going through `Deref`, and therefore not materializing
1545        // any references.
1546        &raw const **b
1547    }
1548
1549    /// Returns a reference to the underlying allocator.
1550    ///
1551    /// Note: this is an associated function, which means that you have
1552    /// to call it as `Box::allocator(&b)` instead of `b.allocator()`. This
1553    /// is so that there is no conflict with a method on the inner type.
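    ///
    /// # Examples
    ///
    /// A minimal sketch using the `System` allocator (requires the unstable
    /// `allocator_api` feature):
    ///
    /// ```
    /// #![feature(allocator_api)]
    ///
    /// use std::alloc::System;
    ///
    /// let b = Box::new_in(5, System);
    /// let _alloc: &System = Box::allocator(&b);
    /// ```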
1554    #[unstable(feature = "allocator_api", issue = "32838")]
1555    #[rustc_const_unstable(feature = "const_box", issue = "92521")]
1556    #[inline]
1557    pub const fn allocator(b: &Self) -> &A {
1558        &b.1
1559    }
1560
1561    /// Consumes and leaks the `Box`, returning a mutable reference,
1562    /// `&'a mut T`.
1563    ///
1564    /// Note that the type `T` must outlive the chosen lifetime `'a`. If the type
1565    /// has only static references, or none at all, then this may be chosen to be
1566    /// `'static`.
1567    ///
1568    /// This function is mainly useful for data that lives for the remainder of
1569    /// the program's life. Dropping the returned reference will cause a memory
1570    /// leak. If this is not acceptable, the reference should first be wrapped
1571    /// with the [`Box::from_raw`] function producing a `Box`. This `Box` can
1572    /// then be dropped which will properly destroy `T` and release the
1573    /// allocated memory.
1574    ///
1575    /// Note: this is an associated function, which means that you have
1576    /// to call it as `Box::leak(b)` instead of `b.leak()`. This
1577    /// is so that there is no conflict with a method on the inner type.
1578    ///
1579    /// # Examples
1580    ///
1581    /// Simple usage:
1582    ///
1583    /// ```
1584    /// let x = Box::new(41);
1585    /// let static_ref: &'static mut usize = Box::leak(x);
1586    /// *static_ref += 1;
1587    /// assert_eq!(*static_ref, 42);
1588    /// # // FIXME(https://github.com/rust-lang/miri/issues/3670):
1589    /// # // use -Zmiri-disable-leak-check instead of unleaking in tests meant to leak.
1590    /// # drop(unsafe { Box::from_raw(static_ref) });
1591    /// ```
1592    ///
1593    /// Unsized data:
1594    ///
1595    /// ```
1596    /// let x = vec![1, 2, 3].into_boxed_slice();
1597    /// let static_ref = Box::leak(x);
1598    /// static_ref[0] = 4;
1599    /// assert_eq!(*static_ref, [4, 2, 3]);
1600    /// # // FIXME(https://github.com/rust-lang/miri/issues/3670):
1601    /// # // use -Zmiri-disable-leak-check instead of unleaking in tests meant to leak.
1602    /// # drop(unsafe { Box::from_raw(static_ref) });
1603    /// ```
1604    #[stable(feature = "box_leak", since = "1.26.0")]
1605    #[inline]
1606    pub fn leak<'a>(b: Self) -> &'a mut T
1607    where
1608        A: 'a,
1609    {
1610        unsafe { &mut *Box::into_raw(b) }
1611    }
1612
1613    /// Converts a `Box<T>` into a `Pin<Box<T>>`. If `T` does not implement [`Unpin`], then
1614    /// `*boxed` will be pinned in memory and unable to be moved.
1615    ///
1616    /// This conversion does not allocate on the heap and happens in place.
1617    ///
1618    /// This is also available via [`From`].
1619    ///
1620    /// Constructing and pinning a `Box` with <code>Box::into_pin([Box::new]\(x))</code>
1621    /// can also be written more concisely using <code>[Box::pin]\(x)</code>.
1622    /// This `into_pin` method is useful if you already have a `Box<T>`, or you are
1623    /// constructing a (pinned) `Box` in a different way than with [`Box::new`].
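    ///
    /// # Examples
    ///
    /// A brief sketch of pinning a `Box` you already have:
    ///
    /// ```
    /// use std::pin::Pin;
    ///
    /// let boxed = Box::new(5);
    /// let pinned: Pin<Box<i32>> = Box::into_pin(boxed);
    /// assert_eq!(*pinned, 5);
    /// ```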
1624    ///
1625    /// # Notes
1626    ///
1627    /// It's not recommended that crates add an impl like `From<Box<T>> for Pin<T>`,
1628    /// as it'll introduce an ambiguity when calling `Pin::from`.
1629    /// A demonstration of such a poor impl is shown below.
1630    ///
1631    /// ```compile_fail
1632    /// # use std::pin::Pin;
1633    /// struct Foo; // A type defined in this crate.
1634    /// impl From<Box<()>> for Pin<Foo> {
1635    ///     fn from(_: Box<()>) -> Pin<Foo> {
1636    ///         Pin::new(Foo)
1637    ///     }
1638    /// }
1639    ///
1640    /// let foo = Box::new(());
1641    /// let bar = Pin::from(foo);
1642    /// ```
1643    #[stable(feature = "box_into_pin", since = "1.63.0")]
1644    #[rustc_const_unstable(feature = "const_box", issue = "92521")]
1645    pub const fn into_pin(boxed: Self) -> Pin<Self>
1646    where
1647        A: 'static,
1648    {
1649        // It's not possible to move or replace the insides of a `Pin<Box<T>>`
1650        // when `T: !Unpin`, so it's safe to pin it directly without any
1651        // additional requirements.
1652        unsafe { Pin::new_unchecked(boxed) }
1653    }
1654}
1655
1656#[stable(feature = "rust1", since = "1.0.0")]
1657unsafe impl<#[may_dangle] T: ?Sized, A: Allocator> Drop for Box<T, A> {
1658    #[inline]
1659    fn drop(&mut self) {
1660        // the T in the Box is dropped by the compiler before the destructor is run
1661
1662        let ptr = self.0;
1663
1664        unsafe {
1665            let layout = Layout::for_value_raw(ptr.as_ptr());
1666            if layout.size() != 0 {
1667                self.1.deallocate(From::from(ptr.cast()), layout);
1668            }
1669        }
1670    }
1671}
1672
1673#[cfg(not(no_global_oom_handling))]
1674#[stable(feature = "rust1", since = "1.0.0")]
1675impl<T: Default> Default for Box<T> {
1676    /// Creates a `Box<T>` with the `Default` value for `T`.
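    ///
    /// # Examples
    ///
    /// For illustration:
    ///
    /// ```
    /// let x: Box<i32> = Box::default();
    /// assert_eq!(*x, 0);
    /// ```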
1677    #[inline]
1678    fn default() -> Self {
1679        let mut x: Box<mem::MaybeUninit<T>> = Box::new_uninit();
1680        unsafe {
1681            // SAFETY: `x` is valid for writing and has the same layout as `T`.
1682            // If `T::default()` panics, dropping `x` will just deallocate the Box as `MaybeUninit<T>`
1683            // does not have a destructor.
1684            //
1685            // We use `ptr::write` as `MaybeUninit::write` creates
1686            // extra stack copies of `T` in debug mode.
1687            //
1688            // See https://github.com/rust-lang/rust/issues/136043 for more context.
1689            ptr::write(&raw mut *x as *mut T, T::default());
1690            // SAFETY: `x` was just initialized above.
1691            x.assume_init()
1692        }
1693    }
1694}
1695
1696#[cfg(not(no_global_oom_handling))]
1697#[stable(feature = "rust1", since = "1.0.0")]
1698impl<T> Default for Box<[T]> {
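    /// Creates an empty boxed slice. A brief illustration:
    ///
    /// ```
    /// let x: Box<[i32]> = Box::default();
    /// assert!(x.is_empty());
    /// ```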
1699    #[inline]
1700    fn default() -> Self {
1701        let ptr: Unique<[T]> = Unique::<[T; 0]>::dangling();
1702        Box(ptr, Global)
1703    }
1704}
1705
1706#[cfg(not(no_global_oom_handling))]
1707#[stable(feature = "default_box_extra", since = "1.17.0")]
1708impl Default for Box<str> {
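    /// Creates an empty boxed `str`. A brief illustration:
    ///
    /// ```
    /// let s: Box<str> = Box::default();
    /// assert!(s.is_empty());
    /// ```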
1709    #[inline]
1710    fn default() -> Self {
1711        // SAFETY: This is the same as `Unique::cast<U>` but with an unsized `U = str`.
1712        let ptr: Unique<str> = unsafe {
1713            let bytes: Unique<[u8]> = Unique::<[u8; 0]>::dangling();
1714            Unique::new_unchecked(bytes.as_ptr() as *mut str)
1715        };
1716        Box(ptr, Global)
1717    }
1718}
1719
1720#[cfg(not(no_global_oom_handling))]
1721#[stable(feature = "rust1", since = "1.0.0")]
1722impl<T: Clone, A: Allocator + Clone> Clone for Box<T, A> {
1723    /// Returns a new box with a `clone()` of this box's contents.
1724    ///
1725    /// # Examples
1726    ///
1727    /// ```
1728    /// let x = Box::new(5);
1729    /// let y = x.clone();
1730    ///
1731    /// // The value is the same
1732    /// assert_eq!(x, y);
1733    ///
1734    /// // But they are unique objects
1735    /// assert_ne!(&*x as *const i32, &*y as *const i32);
1736    /// ```
1737    #[inline]
1738    fn clone(&self) -> Self {
1739        // Pre-allocate memory to allow writing the cloned value directly.
1740        let mut boxed = Self::new_uninit_in(self.1.clone());
1741        unsafe {
1742            (**self).clone_to_uninit(boxed.as_mut_ptr().cast());
1743            boxed.assume_init()
1744        }
1745    }
1746
1747    /// Copies `source`'s contents into `self` without creating a new allocation.
1748    ///
1749    /// # Examples
1750    ///
1751    /// ```
1752    /// let x = Box::new(5);
1753    /// let mut y = Box::new(10);
1754    /// let yp: *const i32 = &*y;
1755    ///
1756    /// y.clone_from(&x);
1757    ///
1758    /// // The value is the same
1759    /// assert_eq!(x, y);
1760    ///
1761    /// // And no allocation occurred
1762    /// assert_eq!(yp, &*y);
1763    /// ```
1764    #[inline]
1765    fn clone_from(&mut self, source: &Self) {
1766        (**self).clone_from(&(**source));
1767    }
1768}
1769
1770#[cfg(not(no_global_oom_handling))]
1771#[stable(feature = "box_slice_clone", since = "1.3.0")]
1772impl<T: Clone, A: Allocator + Clone> Clone for Box<[T], A> {
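    /// Returns a new boxed slice containing a clone of each element. For
    /// illustration with the global allocator:
    ///
    /// ```
    /// let x: Box<[i32]> = vec![1, 2, 3].into_boxed_slice();
    /// let y = x.clone();
    ///
    /// // The value is the same
    /// assert_eq!(x, y);
    ///
    /// // But the two boxes own separate allocations
    /// assert_ne!(x.as_ptr(), y.as_ptr());
    /// ```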
1773    fn clone(&self) -> Self {
1774        let alloc = Box::allocator(self).clone();
1775        self.to_vec_in(alloc).into_boxed_slice()
1776    }
1777
1778    /// Copies `source`'s contents into `self` without creating a new allocation,
1779    /// so long as the two are of the same length.
1780    ///
1781    /// # Examples
1782    ///
1783    /// ```
1784    /// let x = Box::new([5, 6, 7]);
1785    /// let mut y = Box::new([8, 9, 10]);
1786    /// let yp: *const [i32] = &*y;
1787    ///
1788    /// y.clone_from(&x);
1789    ///
1790    /// // The value is the same
1791    /// assert_eq!(x, y);
1792    ///
1793    /// // And no allocation occurred
1794    /// assert_eq!(yp, &*y);
1795    /// ```
1796    fn clone_from(&mut self, source: &Self) {
1797        if self.len() == source.len() {
1798            self.clone_from_slice(&source);
1799        } else {
1800            *self = source.clone();
1801        }
1802    }
1803}
1804
1805#[cfg(not(no_global_oom_handling))]
1806#[stable(feature = "box_slice_clone", since = "1.3.0")]
1807impl Clone for Box<str> {
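    /// Returns a new `Box<str>` containing a copy of this string's contents.
    /// For illustration:
    ///
    /// ```
    /// let s: Box<str> = Box::from("hello");
    /// let t = s.clone();
    ///
    /// assert_eq!(s, t);
    /// ```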
1808    fn clone(&self) -> Self {
1809        // this makes a copy of the data
1810        let buf: Box<[u8]> = self.as_bytes().into();
1811        unsafe { from_boxed_utf8_unchecked(buf) }
1812    }
1813}
1814
1815#[stable(feature = "rust1", since = "1.0.0")]
1816impl<T: ?Sized + PartialEq, A: Allocator> PartialEq for Box<T, A> {
1817    #[inline]
1818    fn eq(&self, other: &Self) -> bool {
1819        PartialEq::eq(&**self, &**other)
1820    }
1821    #[inline]
1822    fn ne(&self, other: &Self) -> bool {
1823        PartialEq::ne(&**self, &**other)
1824    }
1825}
1826
1827#[stable(feature = "rust1", since = "1.0.0")]
1828impl<T: ?Sized + PartialOrd, A: Allocator> PartialOrd for Box<T, A> {
1829    #[inline]
1830    fn partial_cmp(&self, other: &Self) -> Option<Ordering> {
1831        PartialOrd::partial_cmp(&**self, &**other)
1832    }
1833    #[inline]
1834    fn lt(&self, other: &Self) -> bool {
1835        PartialOrd::lt(&**self, &**other)
1836    }
1837    #[inline]
1838    fn le(&self, other: &Self) -> bool {
1839        PartialOrd::le(&**self, &**other)
1840    }
1841    #[inline]
1842    fn ge(&self, other: &Self) -> bool {
1843        PartialOrd::ge(&**self, &**other)
1844    }
1845    #[inline]
1846    fn gt(&self, other: &Self) -> bool {
1847        PartialOrd::gt(&**self, &**other)
1848    }
1849}
1850
1851#[stable(feature = "rust1", since = "1.0.0")]
1852impl<T: ?Sized + Ord, A: Allocator> Ord for Box<T, A> {
1853    #[inline]
1854    fn cmp(&self, other: &Self) -> Ordering {
1855        Ord::cmp(&**self, &**other)
1856    }
1857}
1858
1859#[stable(feature = "rust1", since = "1.0.0")]
1860impl<T: ?Sized + Eq, A: Allocator> Eq for Box<T, A> {}
1861
1862#[stable(feature = "rust1", since = "1.0.0")]
1863impl<T: ?Sized + Hash, A: Allocator> Hash for Box<T, A> {
1864    fn hash<H: Hasher>(&self, state: &mut H) {
1865        (**self).hash(state);
1866    }
1867}
1868
1869#[stable(feature = "indirect_hasher_impl", since = "1.22.0")]
1870impl<T: ?Sized + Hasher, A: Allocator> Hasher for Box<T, A> {
1871    fn finish(&self) -> u64 {
1872        (**self).finish()
1873    }
1874    fn write(&mut self, bytes: &[u8]) {
1875        (**self).write(bytes)
1876    }
1877    fn write_u8(&mut self, i: u8) {
1878        (**self).write_u8(i)
1879    }
1880    fn write_u16(&mut self, i: u16) {
1881        (**self).write_u16(i)
1882    }
1883    fn write_u32(&mut self, i: u32) {
1884        (**self).write_u32(i)
1885    }
1886    fn write_u64(&mut self, i: u64) {
1887        (**self).write_u64(i)
1888    }
1889    fn write_u128(&mut self, i: u128) {
1890        (**self).write_u128(i)
1891    }
1892    fn write_usize(&mut self, i: usize) {
1893        (**self).write_usize(i)
1894    }
1895    fn write_i8(&mut self, i: i8) {
1896        (**self).write_i8(i)
1897    }
1898    fn write_i16(&mut self, i: i16) {
1899        (**self).write_i16(i)
1900    }
1901    fn write_i32(&mut self, i: i32) {
1902        (**self).write_i32(i)
1903    }
1904    fn write_i64(&mut self, i: i64) {
1905        (**self).write_i64(i)
1906    }
1907    fn write_i128(&mut self, i: i128) {
1908        (**self).write_i128(i)
1909    }
1910    fn write_isize(&mut self, i: isize) {
1911        (**self).write_isize(i)
1912    }
1913    fn write_length_prefix(&mut self, len: usize) {
1914        (**self).write_length_prefix(len)
1915    }
1916    fn write_str(&mut self, s: &str) {
1917        (**self).write_str(s)
1918    }
1919}
1920
1921#[stable(feature = "rust1", since = "1.0.0")]
1922impl<T: fmt::Display + ?Sized, A: Allocator> fmt::Display for Box<T, A> {
1923    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
1924        fmt::Display::fmt(&**self, f)
1925    }
1926}
1927
1928#[stable(feature = "rust1", since = "1.0.0")]
1929impl<T: fmt::Debug + ?Sized, A: Allocator> fmt::Debug for Box<T, A> {
1930    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
1931        fmt::Debug::fmt(&**self, f)
1932    }
1933}
1934
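/// Formats the address that the `Box` points to. A brief illustration:
///
/// ```
/// let b = Box::new(1);
/// let addr = format!("{:p}", b);
/// assert!(addr.starts_with("0x"));
/// ```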
1935#[stable(feature = "rust1", since = "1.0.0")]
1936impl<T: ?Sized, A: Allocator> fmt::Pointer for Box<T, A> {
1937    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
1938        // It's not possible to extract the inner Unique directly from the Box;
1939        // instead we cast it to a `*const` which aliases the `Unique`.
1940        let ptr: *const T = &**self;
1941        fmt::Pointer::fmt(&ptr, f)
1942    }
1943}
1944
1945#[stable(feature = "rust1", since = "1.0.0")]
1946impl<T: ?Sized, A: Allocator> Deref for Box<T, A> {
1947    type Target = T;
1948
1949    fn deref(&self) -> &T {
1950        &**self
1951    }
1952}
1953
1954#[stable(feature = "rust1", since = "1.0.0")]
1955impl<T: ?Sized, A: Allocator> DerefMut for Box<T, A> {
1956    fn deref_mut(&mut self) -> &mut T {
1957        &mut **self
1958    }
1959}
1960
1961#[unstable(feature = "deref_pure_trait", issue = "87121")]
1962unsafe impl<T: ?Sized, A: Allocator> DerefPure for Box<T, A> {}
1963
1964#[unstable(feature = "legacy_receiver_trait", issue = "none")]
1965impl<T: ?Sized, A: Allocator> LegacyReceiver for Box<T, A> {}
1966
1967#[stable(feature = "boxed_closure_impls", since = "1.35.0")]
1968impl<Args: Tuple, F: FnOnce<Args> + ?Sized, A: Allocator> FnOnce<Args> for Box<F, A> {
1969    type Output = <F as FnOnce<Args>>::Output;
1970
1971    extern "rust-call" fn call_once(self, args: Args) -> Self::Output {
1972        <F as FnOnce<Args>>::call_once(*self, args)
1973    }
1974}
1975
1976#[stable(feature = "boxed_closure_impls", since = "1.35.0")]
1977impl<Args: Tuple, F: FnMut<Args> + ?Sized, A: Allocator> FnMut<Args> for Box<F, A> {
1978    extern "rust-call" fn call_mut(&mut self, args: Args) -> Self::Output {
1979        <F as FnMut<Args>>::call_mut(self, args)
1980    }
1981}
1982
1983#[stable(feature = "boxed_closure_impls", since = "1.35.0")]
1984impl<Args: Tuple, F: Fn<Args> + ?Sized, A: Allocator> Fn<Args> for Box<F, A> {
1985    extern "rust-call" fn call(&self, args: Args) -> Self::Output {
1986        <F as Fn<Args>>::call(self, args)
1987    }
1988}
1989
1990#[stable(feature = "async_closure", since = "1.85.0")]
1991impl<Args: Tuple, F: AsyncFnOnce<Args> + ?Sized, A: Allocator> AsyncFnOnce<Args> for Box<F, A> {
1992    type Output = F::Output;
1993    type CallOnceFuture = F::CallOnceFuture;
1994
1995    extern "rust-call" fn async_call_once(self, args: Args) -> Self::CallOnceFuture {
1996        F::async_call_once(*self, args)
1997    }
1998}
1999
2000#[stable(feature = "async_closure", since = "1.85.0")]
2001impl<Args: Tuple, F: AsyncFnMut<Args> + ?Sized, A: Allocator> AsyncFnMut<Args> for Box<F, A> {
2002    type CallRefFuture<'a>
2003        = F::CallRefFuture<'a>
2004    where
2005        Self: 'a;
2006
2007    extern "rust-call" fn async_call_mut(&mut self, args: Args) -> Self::CallRefFuture<'_> {
2008        F::async_call_mut(self, args)
2009    }
2010}
2011
2012#[stable(feature = "async_closure", since = "1.85.0")]
2013impl<Args: Tuple, F: AsyncFn<Args> + ?Sized, A: Allocator> AsyncFn<Args> for Box<F, A> {
2014    extern "rust-call" fn async_call(&self, args: Args) -> Self::CallRefFuture<'_> {
2015        F::async_call(self, args)
2016    }
2017}
2018
2019#[unstable(feature = "coerce_unsized", issue = "18598")]
2020impl<T: ?Sized + Unsize<U>, U: ?Sized, A: Allocator> CoerceUnsized<Box<U, A>> for Box<T, A> {}
2021
2022#[unstable(feature = "pin_coerce_unsized_trait", issue = "123430")]
2023unsafe impl<T: ?Sized, A: Allocator> PinCoerceUnsized for Box<T, A> {}
2024
2025// It is quite crucial that we only allow the `Global` allocator here.
2026// Handling arbitrary custom allocators (which can affect the `Box` layout heavily!)
2027// would need a lot of codegen and interpreter adjustments.
2028#[unstable(feature = "dispatch_from_dyn", issue = "none")]
2029impl<T: ?Sized + Unsize<U>, U: ?Sized> DispatchFromDyn<Box<U>> for Box<T, Global> {}
2030
2031#[stable(feature = "box_borrow", since = "1.1.0")]
2032impl<T: ?Sized, A: Allocator> Borrow<T> for Box<T, A> {
2033    fn borrow(&self) -> &T {
2034        &**self
2035    }
2036}
2037
2038#[stable(feature = "box_borrow", since = "1.1.0")]
2039impl<T: ?Sized, A: Allocator> BorrowMut<T> for Box<T, A> {
2040    fn borrow_mut(&mut self) -> &mut T {
2041        &mut **self
2042    }
2043}
2044
2045#[stable(since = "1.5.0", feature = "smart_ptr_as_ref")]
2046impl<T: ?Sized, A: Allocator> AsRef<T> for Box<T, A> {
2047    fn as_ref(&self) -> &T {
2048        &**self
2049    }
2050}
2051
2052#[stable(since = "1.5.0", feature = "smart_ptr_as_ref")]
2053impl<T: ?Sized, A: Allocator> AsMut<T> for Box<T, A> {
2054    fn as_mut(&mut self) -> &mut T {
2055        &mut **self
2056    }
2057}
2058
2059/* Nota bene
2060 *
2061 *  We could have chosen not to add this impl, and instead have written a
2062 *  function from Pin<Box<T>> to Pin<T>. Such a function would not be sound,
2063 *  because Box<T> implements Unpin even when T does not, as a result of
2064 *  this impl.
2065 *
2066 *  We chose this API instead of the alternative for a few reasons:
2067 *      - Logically, it is helpful to understand pinning in regard to the
2068 *        memory region being pointed to. For this reason none of the
2069 *        standard library pointer types support projecting through a pin
2070 *        (Box<T> is the only pointer type in std for which this would be
2071 *        safe.)
2072 *      - It is in practice very useful to have Box<T> be unconditionally
2073 *        Unpin because of trait objects, for which the structural auto
2074 *        trait functionality does not apply (e.g., Box<dyn Foo> would
2075 *        otherwise not be Unpin).
2076 *
2077 *  Another type with the same semantics as Box but only a conditional
2078 *  implementation of `Unpin` (where `T: Unpin`) would be valid/safe, and
2079 *  could have a method to project a Pin<T> from it.
2080 */
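/// `Box<T>` is `Unpin` even when `T` is not. A brief illustration:
///
/// ```
/// use std::marker::PhantomPinned;
///
/// fn assert_unpin<T: Unpin>() {}
///
/// // `PhantomPinned` is `!Unpin`, but putting it in a `Box` restores `Unpin`.
/// assert_unpin::<Box<PhantomPinned>>();
/// ```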
2081#[stable(feature = "pin", since = "1.33.0")]
2082impl<T: ?Sized, A: Allocator> Unpin for Box<T, A> {}
2083
2084#[unstable(feature = "coroutine_trait", issue = "43122")]
2085impl<G: ?Sized + Coroutine<R> + Unpin, R, A: Allocator> Coroutine<R> for Box<G, A> {
2086    type Yield = G::Yield;
2087    type Return = G::Return;
2088
2089    fn resume(mut self: Pin<&mut Self>, arg: R) -> CoroutineState<Self::Yield, Self::Return> {
2090        G::resume(Pin::new(&mut *self), arg)
2091    }
2092}
2093
2094#[unstable(feature = "coroutine_trait", issue = "43122")]
2095impl<G: ?Sized + Coroutine<R>, R, A: Allocator> Coroutine<R> for Pin<Box<G, A>>
2096where
2097    A: 'static,
2098{
2099    type Yield = G::Yield;
2100    type Return = G::Return;
2101
2102    fn resume(mut self: Pin<&mut Self>, arg: R) -> CoroutineState<Self::Yield, Self::Return> {
2103        G::resume((*self).as_mut(), arg)
2104    }
2105}
2106
2107#[stable(feature = "futures_api", since = "1.36.0")]
2108impl<F: ?Sized + Future + Unpin, A: Allocator> Future for Box<F, A> {
2109    type Output = F::Output;
2110
2111    fn poll(mut self: Pin<&mut Self>, cx: &mut Context<'_>) -> Poll<Self::Output> {
2112        F::poll(Pin::new(&mut *self), cx)
2113    }
2114}
2115
2116#[stable(feature = "box_error", since = "1.8.0")]
2117impl<E: Error> Error for Box<E> {
2118    #[allow(deprecated, deprecated_in_future)]
2119    fn description(&self) -> &str {
2120        Error::description(&**self)
2121    }
2122
2123    #[allow(deprecated)]
2124    fn cause(&self) -> Option<&dyn Error> {
2125        Error::cause(&**self)
2126    }
2127
2128    fn source(&self) -> Option<&(dyn Error + 'static)> {
2129        Error::source(&**self)
2130    }
2131
2132    fn provide<'b>(&'b self, request: &mut error::Request<'b>) {
2133        Error::provide(&**self, request);
2134    }
2135}
2136
2137#[unstable(feature = "pointer_like_trait", issue = "none")]
2138impl<T> PointerLike for Box<T> {}