# Variance rules: a recap

## Definition
Variance is a property of generic types:

- Variance for a lifetime-generic type (e.g., `Formatter<'_>`) will determine whether lifetimes can:

    - shrink (covariant),
    - grow (contravariant),
    - or neither (invariant).
- Variance for a type-generic type (`Vec<T>`) will determine:

    - intuitively, if that type parameter (`T`) were replaced with a lifetime-generic type itself (`type T<'r> = &'r str`), how the variance of that inner `T<'r>` type will propagate and affect the resulting variance of the composed type (`Vec<T<'r>>`):

      | × | Covariant `T<'lt>` | Contravariant `T<'lt>` | Invariant `T<'lt>` |
      |---|---|---|---|
      | `F<T>` covariant = "passthrough" | Covariant `F<T<'lt>>` | Contravariant `F<T<'lt>>` | Invariant `F<T<'lt>>` |
      | `F<T>` contravariant = "flips it around" | Contravariant `F<T<'lt>>` | Covariant `F<T<'lt>>` | Invariant `F<T<'lt>>` |
      | `F<T>` invariant | Invariant `F<T<'lt>>` | Invariant `F<T<'lt>>` | Invariant `F<T<'lt>>` |

      Tip: if you see covariance as "being positive" (`+`), contravariance as "being negative" (`-`), and invariance as "being zero" (`0`), these composition rules are the same as the sign multiplication rules!

      | × | `+1` | `-1` | `0` |
      |---|---|---|---|
      | `+1` | `+1` | `-1` | `0` |
      | `-1` | `-1` | `+1` | `0` |
      | `0` | `0` | `0` | `0` |
    - formally, how the subtyping relation between two choices of the type parameter results in some subtyping relation for the generic type.

      That is, if you have `T ➘ U` (`T` a subtype of `U`), and `type F<X>;`:

        - if `F` is covariant, then `F<T> ➘ F<U>`;
        - if `F` is contravariant, then `F<U> ➘ F<T>` (reversed!);
        - if `F` is invariant, then neither subtyping relation holds.
      You can have `T ➘ U` by:

        - common case: having a variant `type T<'lt>`, and picking the lifetimes accordingly (this is the "intuitively" section just above);

        - niche case: having a higher-rank `fn` pointer type and certain concrete choices of the inner generic lifetime:

          ```rust
          //! Pseudo-code:
          //! using `fn<'any>(…)` instead of `for<'any> fn(…)`

          // From a higher-order lifetime to a fixed choice:
          fn<'any>(&'any str) ➘ fn(&'fixed str)

          // From a covariant lifetime to a higher-order one:
          fn(fn(&'static str)) ➘ fn<'any>(fn(&'any str))
          ```
      Interestingly enough, it does mean that if we pick:

      ```rust
      type T = for<'any> fn(fn(&'any str));
      type U = fn(fn(&'static str));
      ```

      Then:

        - `T ≠ U`;
        - `T : 'static` and `U : 'static`;
        - `T ➘ U`, by "fixing the higher-order lifetime";

          ⚠️ A `: 'static` bound is not sufficient to prevent subtyping shenanigans from happening! ⚠️

        - `U ➘ T`, by "higher-ordering" a covariant lifetime by "induction from `'static`".
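Both niche-case subtyping directions above can be checked against the compiler; here is a minimal runnable sketch (the names `str_len` and `call_with_static` are illustrative, not from the text; `usize` return types are added only so the coercions can be exercised with assertions):

```rust
// A generic-lifetime function: its fn-pointer type is
// `for<'any> fn(&'any str) -> usize`.
fn str_len(s: &str) -> usize {
    s.len()
}

// A function whose callback parameter is fixed to `'static`.
fn call_with_static(cb: fn(&'static str) -> usize) -> usize {
    cb("static")
}

fn main() {
    // Fixing the higher-order lifetime:
    // `for<'any> fn(&'any str) -> usize` ➘ `fn(&'static str) -> usize`
    let higher_order: for<'any> fn(&'any str) -> usize = str_len;
    let fixed: fn(&'static str) -> usize = higher_order;
    assert_eq!(fixed("hello"), 5);

    // "Higher-ordering" a covariant lifetime, by induction from `'static`:
    // `fn(fn(&'static str) -> usize) -> usize`
    //   ➘ `for<'any> fn(fn(&'any str) -> usize) -> usize`
    let u: fn(fn(&'static str) -> usize) -> usize = call_with_static;
    let t: for<'any> fn(fn(&'any str) -> usize) -> usize = u;
    assert_eq!(t(str_len), "static".len());
}
```

Note that both `let` bindings with mismatched-looking types compile without any cast: the coercions are pure subtyping.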
## Main variance examples to keep in mind
- Mutable borrows and shared mutability wrappers are invariant:

    - `Mutex<T>`, `Cell<T>`, `&'cov Cell<T>`, and `&'cov mut T` are invariant (in `T`);
- otherwise, generally, owned stuff or immutably-borrowed stuff can be covariant:

    - `T`, `&'_ T`, `Box<T>`, and `Arc<T>` are covariant.

- `fn(CbArg)` is contravariant in `CbArg`.

    - But `fn(…) -> Ret` is covariant in `Ret`!
- But `impl Traits<'lt>` and `dyn Traits<'lt>` are invariant (in `'lt`);

    - this includes the `Fn{,Mut,Once}` traits.

- `+ 'usability` is covariant:

    - Bonus: there is also a no-overhead-upcasting/reünsizing coercion possible from `&mut (dyn 'big + Traits)` to `&mut (dyn 'short + Traits)`, which to the untrained eye could appear as if `&mut (dyn 'lt + Traits)` were covariant in `'lt`, which it is not.
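The `fn` pointer rules can be exercised directly; a minimal sketch, with illustrative names (`gives_static`, `use_fn`, and `any_len` are not from the text):

```rust
// `fn(…) -> Ret` is covariant in `Ret`: a pointer returning a `'static`
// borrow can be used as one returning a shorter-lived borrow.
fn gives_static() -> &'static str {
    "forever"
}

fn use_fn<'a>(f: fn() -> &'a str, _bound: &'a str) -> &'a str {
    f()
}

// `fn(CbArg)` is contravariant in `CbArg`: a callback accepting *any*
// lifetime can be used where only `&'static str` inputs are promised.
fn any_len(s: &str) -> usize {
    s.len()
}

fn main() {
    let local = String::from("scope");
    // Covariance in `Ret`: `fn() -> &'static str` ➘ `fn() -> &'local str`.
    assert_eq!(use_fn(gives_static, &local), "forever");

    // Contravariance in the argument:
    // `for<'any> fn(&'any str) -> usize` ➘ `fn(&'static str) -> usize`.
    let needs_static: fn(&'static str) -> usize = any_len;
    assert_eq!(needs_static("hi"), 2);
}
```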
## Variance of "product types" / structural records / tuples

The rule of thumb is: combine them restrictively / non-covariance and non-contravariance are infectious.
By that I mean that you could think of variance as marker traits:

```rust
use std::sync::Mutex;

trait Covariant {}
trait Contravariant {}

struct Example<'lt> {
    x: &'lt str,         // Covariant + !Contravariant
    y: Mutex<&'lt bool>, // !Covariant + !Contravariant
} // !Covariant + !Contravariant = Invariant
```
For instance, in this `Example`, we have a non-contravariant field alongside an invariant, i.e., neither-covariant-nor-contravariant, field.

Thus, the resulting `Example` can't be:

- contravariant, due to either field;
- covariant, due to the second field.

So we have a neither-covariant-nor-contravariant type, i.e., an invariant one.
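Conversely, when every field is covariant, the compound type is covariant too, so its lifetime can shrink; a minimal sketch (`AllCovariant` is a hypothetical counterpart of `Example` with the `Mutex` field replaced by a covariant `Box`):

```rust
// Every field here is covariant in `'lt`, so `AllCovariant<'lt>` is too.
struct AllCovariant<'lt> {
    x: &'lt str,
    y: Box<&'lt bool>,
}

// The lifetime may shrink: `AllCovariant<'static>` ➘ `AllCovariant<'short>`.
fn shrink<'short>(
    v: AllCovariant<'static>,
    _bound: &'short str,
) -> AllCovariant<'short> {
    v // OK, purely by covariance: no field is rebuilt.
}

fn main() {
    let local = String::from("scope");
    let v = AllCovariant { x: "hi", y: Box::new(&true) };
    let shrunk = shrink(v, &local);
    assert_eq!(shrunk.x, "hi");
    assert!(*shrunk.y);
}
```

Swapping `y` back to `Mutex<&'lt bool>`, as in `Example`, makes `shrink` fail to compile: the invariant field infects the whole struct and forbids the shrink.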