# Distributive property

In mathematics, the distributive property of binary operations generalizes the distributive law from Boolean algebra and elementary algebra. In propositional logic, distribution refers to two valid rules of replacement. The rules allow one to reformulate conjunctions and disjunctions within logical proofs.

*(Figure: visualization of the distributive law for positive numbers)*

For example, in arithmetic:

2 ⋅ (1 + 3) = (2 ⋅ 1) + (2 ⋅ 3), but 2 / (1 + 3) ≠ (2 / 1) + (2 / 3).

On the left-hand side of the first equation, the 2 multiplies the sum of 1 and 3; on the right-hand side, it multiplies the 1 and the 3 individually, with the products added afterward. Because these give the same final answer (8), multiplication by 2 is said to distribute over the addition of 1 and 3. Since one could have put any real numbers in place of 2, 1, and 3 above, and still have obtained a true equation, multiplication of real numbers distributes over addition of real numbers.
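This observation can be checked directly; the short Python snippet below (illustrative, not part of the original text) confirms both sides of the comparison:

```python
# Multiplication by 2 distributes over the addition of 1 and 3:
assert 2 * (1 + 3) == (2 * 1) + (2 * 3) == 8

# Division by the same numbers does not distribute from the left:
lhs = 2 / (1 + 3)        # 0.5
rhs = (2 / 1) + (2 / 3)  # about 2.667
assert lhs != rhs
```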

## Definition

Given a set S and two binary operators ∗ and + on S, the operation ∗ :

is left-distributive over + if, given any elements x, y and z of S,

${\displaystyle x*(y+z)=(x*y)+(x*z),}$

is right-distributive over + if, given any elements x, y, and z of S,

${\displaystyle (y+z)*x=(y*x)+(z*x),}$  and

is distributive over + if it is left- and right-distributive.[1]

Notice that when ∗ is commutative, the three conditions above are logically equivalent.
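For a finite set S, the definitions can be verified exhaustively. The following Python sketch (function names are illustrative) checks left- and right-distributivity over all triples of elements; arithmetic modulo 5 serves as the example structure:

```python
from itertools import product

def is_left_distributive(star, plus, elements):
    """Check x*(y+z) == (x*y)+(x*z) for every triple in `elements`."""
    return all(star(x, plus(y, z)) == plus(star(x, y), star(x, z))
               for x, y, z in product(elements, repeat=3))

def is_right_distributive(star, plus, elements):
    """Check (y+z)*x == (y*x)+(z*x) for every triple in `elements`."""
    return all(star(plus(y, z), x) == plus(star(y, x), star(z, x))
               for x, y, z in product(elements, repeat=3))

# Multiplication mod 5 distributes over addition mod 5 on both sides:
Z5 = range(5)
mul5 = lambda a, b: (a * b) % 5
add5 = lambda a, b: (a + b) % 5
assert is_left_distributive(mul5, add5, Z5)
assert is_right_distributive(mul5, add5, Z5)
```

Because multiplication mod 5 is commutative, either check would imply the other here, in line with the remark above.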

## Meaning

The operators used for examples in this section are those of the usual addition (${\displaystyle +}$ ) and multiplication (${\displaystyle \cdot }$ ).

If the operation denoted ${\displaystyle \cdot }$  is not commutative, there is a distinction between left-distributivity and right-distributivity:

${\displaystyle a\cdot \left(b\pm c\right)=a\cdot b\pm a\cdot c}$   (left-distributive)
${\displaystyle (a\pm b)\cdot c=a\cdot c\pm b\cdot c}$   (right-distributive)

In either case, the distributive property can be described in words as:

To multiply a sum (or difference) by a factor, each summand (or minuend and subtrahend) is multiplied by this factor and the resulting products are added (or subtracted).

If the operation outside the parentheses (in this case, the multiplication) is commutative, then left-distributivity implies right-distributivity and vice versa, and one talks simply of distributivity.

One example of an operation that is "only" right-distributive is division, which is not commutative:

${\displaystyle (a\pm b)\div c=a\div c\pm b\div c}$

In this case, left-distributivity does not apply:

${\displaystyle a\div (b\pm c)\neq a\div b\pm a\div c}$
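Using exact rational arithmetic, the asymmetry is easy to confirm; a brief sketch with Python's `fractions` module:

```python
from fractions import Fraction as F

a, b, c = F(3), F(5), F(2)

# Right-distributivity of division holds: (a + b) / c == a/c + b/c
assert (a + b) / c == a / c + b / c

# Left-distributivity fails in general: a / (b + c) != a/b + a/c
assert a / (b + c) != a / b + a / c
```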

The distributive laws are among the axioms for rings (like the ring of integers) and fields (like the field of rational numbers). Here multiplication is distributive over addition, but addition is not distributive over multiplication. Examples of structures with two operations that are each distributive over the other are Boolean algebras such as the algebra of sets or the switching algebra.

Multiplying sums can be put into words as follows: When a sum is multiplied by a sum, multiply each summand of a sum with each summand of the other sum (keeping track of signs) then add up all of the resulting products.

## Examples

### Real numbers

In the following examples, the use of the distributive law on the set of real numbers ${\displaystyle \mathbb {R} }$  is illustrated. When multiplication is mentioned in elementary mathematics, it usually refers to this kind of multiplication. From the point of view of algebra, the real numbers form a field, which ensures the validity of the distributive law.

First example (mental and written multiplication)

During mental arithmetic, distributivity is often used unconsciously:

${\displaystyle 6\cdot 16=6\cdot (10+6)=6\cdot 10+6\cdot 6=60+36=96}$

Thus, to calculate 6 ⋅ 16 in one's head, one first multiplies 6 ⋅ 10 and 6 ⋅ 6 and then adds the intermediate results. Written multiplication is also based on the distributive law.

Second example (with variables)
${\displaystyle 3a^{2}b\cdot (4a-5b)=3a^{2}b\cdot 4a-3a^{2}b\cdot 5b=12a^{3}b-15a^{2}b^{2}}$
Third example (with two sums)
${\displaystyle {\begin{aligned}(a+b)\cdot (a-b)&=a\cdot (a-b)+b\cdot (a-b)=a^{2}-ab+ba-b^{2}=a^{2}-b^{2}\\&=(a+b)\cdot a-(a+b)\cdot b=a^{2}+ba-ab-b^{2}=a^{2}-b^{2}\end{aligned}}}$
Here the distributive law was applied twice, and it does not matter which bracket is first multiplied out.
Fourth example
Here the distributive law is applied the other way around compared to the previous examples. Consider
${\displaystyle 12a^{3}b^{2}-30a^{4}bc+18a^{2}b^{3}c^{2}\,.}$
Since the factor ${\displaystyle 6a^{2}b}$  occurs in all summands, it can be factored out. That is, due to the distributive law one obtains
${\displaystyle 12a^{3}b^{2}-30a^{4}bc+18a^{2}b^{3}c^{2}=6a^{2}b(2ab-5a^{2}c+3b^{2}c^{2})\,.}$
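The factoring identity can be spot-checked numerically; the loop below (an illustrative sketch) evaluates both sides over a small grid of integer values:

```python
# Verify 12a^3b^2 - 30a^4bc + 18a^2b^3c^2 == 6a^2b(2ab - 5a^2c + 3b^2c^2)
# for a handful of sample integer values of a, b, c.
for a in (1, 2, -3):
    for b in (1, -2, 4):
        for c in (0, 1, 5):
            lhs = 12*a**3*b**2 - 30*a**4*b*c + 18*a**2*b**3*c**2
            rhs = 6*a**2*b * (2*a*b - 5*a**2*c + 3*b**2*c**2)
            assert lhs == rhs
```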

### Matrices

The distributive law is valid for matrix multiplication. More precisely,

${\displaystyle (A+B)\cdot C=A\cdot C+B\cdot C}$

for all ${\displaystyle l\times m}$ -matrices ${\displaystyle A,B}$  and ${\displaystyle m\times n}$ -matrices ${\displaystyle C}$ , as well as

${\displaystyle A\cdot (B+C)=A\cdot B+A\cdot C}$

for all ${\displaystyle l\times m}$ -matrices ${\displaystyle A}$  and ${\displaystyle m\times n}$ -matrices ${\displaystyle B,C}$ . Because the commutative property does not hold for matrix multiplication, the second law does not follow from the first law. In this case, they are two different laws.
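Both matrix laws can be verified for concrete matrices; the sketch below implements matrix addition and multiplication over nested Python lists (helper names are illustrative) and also shows that the product itself is not commutative:

```python
def mat_add(A, B):
    """Entrywise sum of two equally sized matrices."""
    return [[x + y for x, y in zip(ra, rb)] for ra, rb in zip(A, B)]

def mat_mul(A, B):
    """Standard row-by-column matrix product."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
C = [[2, 0], [1, 3]]

# Right-distributivity: (A + B)C == AC + BC
assert mat_mul(mat_add(A, B), C) == mat_add(mat_mul(A, C), mat_mul(B, C))
# Left-distributivity: A(B + C) == AB + AC
assert mat_mul(A, mat_add(B, C)) == mat_add(mat_mul(A, B), mat_mul(A, C))
# Matrix multiplication is not commutative:
assert mat_mul(A, B) != mat_mul(B, A)
```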

### Other examples

1. Multiplication of ordinal numbers is only left-distributive, not right-distributive.
2. The cross product is left- and right-distributive over vector addition, though not commutative.
3. The union of sets is distributive over intersection, and intersection is distributive over union.
4. Logical disjunction ("or") is distributive over logical conjunction ("and"), and vice versa.
5. For real numbers (and for any totally ordered set), the maximum operation is distributive over the minimum operation, and vice versa: max(a, min(b, c)) = min(max(a, b), max(a, c)) and min(a, max(b, c)) = max(min(a, b), min(a, c)).
6. For integers, the greatest common divisor is distributive over the least common multiple, and vice versa: gcd(a, lcm(b, c)) = lcm(gcd(a, b), gcd(a, c)) and lcm(a, gcd(b, c)) = gcd(lcm(a, b), lcm(a, c)).
7. For real numbers, addition distributes over the maximum operation, and also over the minimum operation: a + max(b, c) = max(a + b, a + c) and a + min(b, c) = min(a + b, a + c).
8. For binomial multiplication, distribution is sometimes referred to as the FOIL method[2] (First terms ac, Outer ad, Inner bc, and Last bd): (a + b) · (c + d) = ac + ad + bc + bd.
9. Polynomial multiplication is distributive over polynomial addition.
10. Complex number multiplication is distributive: ${\displaystyle u(v+w)=uv+uw,(u+v)w=uw+vw}$
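Several of the listed identities (max over min, gcd over lcm, and addition over max and min) can be verified exhaustively over a small range of integers; for example, in Python:

```python
from math import gcd
from itertools import product

def lcm(x, y):
    """Least common multiple of two positive integers."""
    return x * y // gcd(x, y)

for a, b, c in product(range(1, 8), repeat=3):
    # max distributes over min, and vice versa:
    assert max(a, min(b, c)) == min(max(a, b), max(a, c))
    assert min(a, max(b, c)) == max(min(a, b), min(a, c))
    # gcd distributes over lcm, and vice versa:
    assert gcd(a, lcm(b, c)) == lcm(gcd(a, b), gcd(a, c))
    assert lcm(a, gcd(b, c)) == gcd(lcm(a, b), lcm(a, c))
    # addition distributes over max and over min:
    assert a + max(b, c) == max(a + b, a + c)
    assert a + min(b, c) == min(a + b, a + c)
```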

## Propositional logic

### Rule of replacement

In standard truth-functional propositional logic, distribution[3][4] in logical proofs uses two valid rules of replacement to expand individual occurrences of certain logical connectives, within some formula, into separate applications of those connectives across subformulas of the given formula. The rules are

${\displaystyle (P\land (Q\lor R))\Leftrightarrow ((P\land Q)\lor (P\land R))}$

and

${\displaystyle (P\lor (Q\land R))\Leftrightarrow ((P\lor Q)\land (P\lor R))}$

where "${\displaystyle \Leftrightarrow }$ ", also written , is a metalogical symbol representing "can be replaced in a proof with" or "is logically equivalent to".

### Truth functional connectives

Distributivity is a property of some logical connectives of truth-functional propositional logic. The following logical equivalences demonstrate that distributivity is a property of particular connectives; each is a truth-functional tautology.

Distribution of conjunction over conjunction
${\displaystyle (P\land (Q\land R))\leftrightarrow ((P\land Q)\land (P\land R))}$
Distribution of conjunction over disjunction
${\displaystyle (P\land (Q\lor R))\leftrightarrow ((P\land Q)\lor (P\land R))}$
Distribution of disjunction over conjunction
${\displaystyle (P\lor (Q\land R))\leftrightarrow ((P\lor Q)\land (P\lor R))}$
Distribution of disjunction over disjunction
${\displaystyle (P\lor (Q\lor R))\leftrightarrow ((P\lor Q)\lor (P\lor R))}$
Distribution of implication
${\displaystyle (P\to (Q\to R))\leftrightarrow ((P\to Q)\to (P\to R))}$
Distribution of implication over equivalence
${\displaystyle (P\to (Q\leftrightarrow R))\leftrightarrow ((P\to Q)\leftrightarrow (P\to R))}$
Distribution of disjunction over equivalence
${\displaystyle (P\lor (Q\leftrightarrow R))\leftrightarrow ((P\lor Q)\leftrightarrow (P\lor R))}$
Double distribution
${\displaystyle {\begin{aligned}((P\land Q)\lor (R\land S))&\leftrightarrow (((P\lor R)\land (P\lor S))\land ((Q\lor R)\land (Q\lor S)))\\((P\lor Q)\land (R\lor S))&\leftrightarrow (((P\land R)\lor (P\land S))\lor ((Q\land R)\lor (Q\land S)))\end{aligned}}}$
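Each of these equivalences can be confirmed by brute force over truth tables; the sketch below checks three of them for all eight valuations of P, Q, R:

```python
from itertools import product

def imp(x, y):
    """Material implication: x -> y."""
    return (not x) or y

for P, Q, R in product([False, True], repeat=3):
    # Distribution of conjunction over disjunction:
    assert (P and (Q or R)) == ((P and Q) or (P and R))
    # Distribution of disjunction over conjunction:
    assert (P or (Q and R)) == ((P or Q) and (P or R))
    # Distribution of implication:
    assert imp(P, imp(Q, R)) == imp(imp(P, Q), imp(P, R))
```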

## Distributivity and rounding

In practice, the distributive property of multiplication (and division) over addition may appear to be compromised or lost because of the limitations of arithmetic precision. For example, the identity ⅓ + ⅓ + ⅓ = (1 + 1 + 1) / 3 appears to fail if the addition is conducted in decimal arithmetic; however, if many significant digits are used, the calculation will result in a closer approximation to the correct results. For example, if the arithmetical calculation takes the form: 0.33333 + 0.33333 + 0.33333 = 0.99999 ≠ 1, this result is a closer approximation than if fewer significant digits had been used. Even when fractional numbers can be represented exactly in arithmetical form, errors will be introduced if those arithmetical values are rounded or truncated. For example, buying two books, each priced at £14.99 before a tax of 17.5%, in two separate transactions will actually save £0.01, over buying them together: £14.99 × 1.175 = £17.61 to the nearest £0.01, giving a total expenditure of £35.22, but £29.98 × 1.175 = £35.23. Methods such as banker's rounding may help in some cases, as may increasing the precision used, but ultimately some calculation errors are inevitable.
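The book-buying example can be reproduced with Python's `decimal` module (variable names are illustrative, and half-up rounding to the nearest penny is assumed):

```python
from decimal import Decimal, ROUND_HALF_UP

PENNY = Decimal("0.01")
price = Decimal("14.99")
rate = Decimal("1.175")  # 17.5% tax applied as a multiplier

def round_pence(x):
    """Round a monetary amount to the nearest £0.01, half up."""
    return x.quantize(PENNY, rounding=ROUND_HALF_UP)

separately = round_pence(price * rate) * 2  # two separate transactions
together = round_pence(price * 2 * rate)    # one combined transaction
assert separately == Decimal("35.22")
assert together == Decimal("35.23")
```

Rounding inside each transaction before summing is what breaks the distributive identity here; with exact arithmetic both totals would agree.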

## In rings and other structures

Distributivity is most commonly found in rings and distributive lattices.

A ring has two binary operations, commonly denoted + and ∗, and one of the requirements of a ring is that ∗ must distribute over +. Most kinds of numbers form rings.

A lattice is another kind of algebraic structure with two binary operations, ∧ and ∨. If either of these operations (say ∧) distributes over the other (∨), then ∨ must also distribute over ∧, and the lattice is called distributive. See also Distributivity (order theory).

A Boolean algebra can be interpreted either as a special kind of ring (a Boolean ring) or a special kind of distributive lattice (a Boolean lattice). Each interpretation is responsible for different distributive laws in the Boolean algebra.

Failure of one of the two distributive laws brings about near-rings and near-fields instead of rings and division rings respectively. The operations are usually configured to have the near-ring or near-field distributive on the right but not on the left.

Rings and distributive lattices are both special kinds of rigs, which are generalizations of rings that have the distributive property. For example, natural numbers form a rig.

## Generalizations

In several mathematical areas, generalized distributivity laws are considered. This may involve the weakening of the above conditions or the extension to infinitary operations. Especially in order theory one finds numerous important variants of distributivity, some of which include infinitary operations, such as the infinite distributive law; others are defined in the presence of only one binary operation. The corresponding definitions and their relations are given in the article distributivity (order theory). This also includes the notion of a completely distributive lattice.

In the presence of an ordering relation, one can also weaken the above equalities by replacing = by either ≤ or ≥. Naturally, this will lead to meaningful concepts only in some situations. An application of this principle is the notion of sub-distributivity as explained in the article on interval arithmetic.

In category theory, if (S, μ, η) and (S′, μ′, η′) are monads on a category C, a distributive law S.S′ → S′.S is a natural transformation λ : S.S′ → S′.S such that (S′, λ) is a lax map of monads S → S and (S, λ) is a colax map of monads S′ → S′. This is exactly the data needed to define a monad structure on S′.S: the multiplication map is S′μ.μ′S².S′λS and the unit map is η′S.η. See: distributive law between monads.

A generalized distributive law has also been proposed in the area of information theory.

### Antidistributivity

The ubiquitous identity that relates inverses to the binary operation in any group, namely (xy)−1 = y−1x−1, which is taken as an axiom in the more general context of a semigroup with involution, has sometimes been called an antidistributive property (of inversion as a unary operation).[5]

In the context of a near-ring, which removes the commutativity of the additively written group and assumes only one-sided distributivity, one can speak of (two-sided) distributive elements but also of antidistributive elements. The latter reverse the order of (the non-commutative) addition; assuming a left-nearring (i.e. one in which all elements distribute when multiplied on the left), an antidistributive element a reverses the order of addition when multiplied on the right: (x + y)a = ya + xa.[6]

In the study of propositional logic and Boolean algebra, the term antidistributive law is sometimes used to denote the interchange between conjunction and disjunction when implication factors over them:[7]

• (a ∨ b) ⇒ c ≡ (a ⇒ c) ∧ (b ⇒ c)
• (a ∧ b) ⇒ c ≡ (a ⇒ c) ∨ (b ⇒ c)

These two tautologies are a direct consequence of the duality in De Morgan's laws.

## Notes

1. ^ Distributivity of Binary Operations from Mathonline
2. ^ Kim Steward (2011) Multiplying Polynomials from Virtual Math Lab at West Texas A&M University
3. ^ Elliott Mendelson (1964) Introduction to Mathematical Logic, page 21, D. Van Nostrand Company
4. ^ Alfred Tarski (1941) Introduction to Logic, page 52, Oxford University Press
5. ^ Chris Brink; Wolfram Kahl; Gunther Schmidt (1997). Relational Methods in Computer Science. Springer. p. 4. ISBN 978-3-211-82971-4.
6. ^ Celestina Cotti Ferrero; Giovanni Ferrero (2002). Nearrings: Some Developments Linked to Semigroups and Groups. Kluwer Academic Publishers. pp. 62 and 67. ISBN 978-1-4613-0267-4.
7. ^ Eric C.R. Hehner (1993). A Practical Theory of Programming. Springer Science & Business Media. p. 230. ISBN 978-1-4419-8596-5.