# Associative property


In mathematics, the associative property[1] is a property of some binary operations. In propositional logic, associativity is a valid rule of replacement for expressions in logical proofs.

Within an expression containing two or more occurrences in a row of the same associative operator, the order in which the operations are performed does not matter as long as the sequence of the operands is not changed. That is, (after rewriting the expression with parentheses and in infix notation if necessary) rearranging the parentheses in such an expression will not change its value. Consider the following equations:

${\displaystyle (2+3)+4=2+(3+4)=9\,}$
${\displaystyle 2\times (3\times 4)=(2\times 3)\times 4=24.}$

Even though the parentheses were rearranged on each line, the values of the expressions were not altered. Since this holds true when performing addition and multiplication on any real numbers, it can be said that "addition and multiplication of real numbers are associative operations".

Associativity is not the same as commutativity, which addresses whether or not the order of two operands changes the result. For example, the order does not matter in the multiplication of real numbers, that is, a × b = b × a, so we say that the multiplication of real numbers is a commutative operation.

Associative operations are abundant in mathematics; in fact, many algebraic structures (such as semigroups and categories) explicitly require their binary operations to be associative.

However, many important and interesting operations are non-associative; some examples include subtraction, exponentiation, and the vector cross product. In contrast to the theoretical properties of real numbers, the addition of floating point numbers in computer science is not associative, and the choice of how to associate an expression can have a significant effect on rounding error.
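The non-associativity of subtraction, contrasted with the associativity of addition, can be checked directly; a minimal Python sketch (the sample values are arbitrary):

```python
# Subtraction is non-associative: the grouping changes the result.
a, b, c = 5, 3, 2
print((a - b) - c)                 # 0
print(a - (b - c))                 # 4

# Addition of these (exact) integers is associative:
print((a + b) + c == a + (b + c))  # True
```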

## Definition

(Figure: a binary operation ∗ on the set S is associative when this diagram commutes, that is, when the two paths from S × S × S to S compose to the same function from S × S × S to S.)

Formally, a binary operation ∗ on a set S is called associative if it satisfies the associative law:

(x ∗ y) ∗ z = x ∗ (y ∗ z) for all x, y, z in S.

Here, ∗ serves as a placeholder for the symbol of the operation, which may be any symbol, or even the absence of a symbol (juxtaposition), as for multiplication:

(xy)z = x(yz) = xyz for all x, y, z in S.

The associative law can also be expressed in functional notation thus: f(f(x, y), z) = f(x, f(y, z)).
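In this functional notation, associativity of a particular operation can be verified pointwise; a small Python check, using max as an example of an associative function (the choice of max and the sample values are illustrative):

```python
# f(f(x, y), z) == f(x, f(y, z)) is the associative law in functional notation.
def f(x, y):
    return max(x, y)  # max is associative: the largest of three values
                      # does not depend on the grouping

x, y, z = 3, 7, 5
print(f(f(x, y), z) == f(x, f(y, z)))  # True (both sides equal 7)
```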

## Generalized associative law

(Figure: in the absence of the associative property, five factors a, b, c, d, e result in a Tamari lattice of order four, with possibly different products.)

If a binary operation is associative, repeated application of the operation produces the same result regardless of how valid pairs of parentheses are inserted in the expression.[2] This is called the generalized associative law. For instance, a product of four elements may be written, without changing the order of the factors, in five possible ways:

${\displaystyle ((ab)c)d}$
${\displaystyle (ab)(cd)}$
${\displaystyle (a(bc))d}$
${\displaystyle a((bc)d)}$
${\displaystyle a(b(cd))}$

If the product operation is associative, the generalized associative law says that all these formulas will yield the same result. So unless the formula with omitted parentheses already has a different meaning (see below), the parentheses can be considered unnecessary and "the" product can be written unambiguously as

${\displaystyle abcd.}$

As the number of elements increases, the number of possible ways to insert parentheses grows quickly, but they remain unnecessary for disambiguation.
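The count of parenthesizations is given by the Catalan numbers: a product of n factors can be fully parenthesized in C(n−1) ways, where C(k) denotes the k-th Catalan number. A short Python sketch (the function name is illustrative):

```python
from math import comb

def parenthesizations(n):
    """Number of ways to fully parenthesize a product of n factors:
    the Catalan number C(n - 1) = binomial(2(n - 1), n - 1) / n."""
    return comb(2 * (n - 1), n - 1) // n

print([parenthesizations(n) for n in range(2, 8)])  # [1, 2, 5, 14, 42, 132]
```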

An example where this does not work is the logical biconditional ${\displaystyle \leftrightarrow }$ . It is associative; thus A${\displaystyle \leftrightarrow }$ (B${\displaystyle \leftrightarrow }$ C) is equivalent to (A${\displaystyle \leftrightarrow }$ B)${\displaystyle \leftrightarrow }$ C, but A${\displaystyle \leftrightarrow }$ B${\displaystyle \leftrightarrow }$ C most commonly means (A${\displaystyle \leftrightarrow }$ B and B${\displaystyle \leftrightarrow }$ C), which is not equivalent.
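The biconditional's behaviour can be confirmed by brute force over all truth assignments; a Python sketch modelling ↔ as equality of booleans:

```python
from itertools import product

iff = lambda p, q: p == q  # the biconditional on booleans

rows = list(product([False, True], repeat=3))

# Associativity holds for every assignment:
print(all(iff(iff(A, B), C) == iff(A, iff(B, C)) for A, B, C in rows))  # True

# ...but the chained reading (A <-> B) and (B <-> C) is a different function:
chained = [iff(A, B) and iff(B, C) for A, B, C in rows]
assoc = [iff(iff(A, B), C) for A, B, C in rows]
print(chained == assoc)  # False
```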

## Examples

(Figure: for an associative operation, ${\displaystyle (x\circ y)\circ z=x\circ (y\circ z)}$ .)

(Figure: the addition of real numbers is associative.)

Some examples of associative operations include the following.

• The concatenation of the three strings "hello", " ", "world" can be computed by concatenating the first two strings (giving "hello ") and appending the third string ("world"), or by joining the second and third string (giving " world") and concatenating the first string ("hello") with the result. The two methods produce the same result; string concatenation is associative (but not commutative).
• In arithmetic, addition and multiplication of real numbers are associative; i.e.,
${\displaystyle \left.{\begin{matrix}(x+y)+z=x+(y+z)=x+y+z\quad \\(x\,y)z=x(y\,z)=x\,y\,z\qquad \qquad \qquad \quad \ \ \,\end{matrix}}\right\}{\mbox{for all }}x,y,z\in \mathbb {R} .}$
Because of associativity, the grouping parentheses can be omitted without ambiguity.
• The trivial operation x ∗ y = x (that is, the result is the first argument, no matter what the second argument is) is associative but not commutative. Likewise, the trivial operation x ∘ y = y (that is, the result is the second argument, no matter what the first argument is) is associative but not commutative.
• Addition and multiplication of complex numbers and quaternions are associative. Addition of octonions is also associative, but multiplication of octonions is non-associative.
• The greatest common divisor and least common multiple functions act associatively.
${\displaystyle \left.{\begin{matrix}\operatorname {gcd} (\operatorname {gcd} (x,y),z)=\operatorname {gcd} (x,\operatorname {gcd} (y,z))=\operatorname {gcd} (x,y,z)\ \quad \\\operatorname {lcm} (\operatorname {lcm} (x,y),z)=\operatorname {lcm} (x,\operatorname {lcm} (y,z))=\operatorname {lcm} (x,y,z)\quad \end{matrix}}\right\}{\mbox{ for all }}x,y,z\in \mathbb {Z} .}$
• Taking the intersection or the union of sets:
${\displaystyle \left.{\begin{matrix}(A\cap B)\cap C=A\cap (B\cap C)=A\cap B\cap C\quad \\(A\cup B)\cup C=A\cup (B\cup C)=A\cup B\cup C\quad \end{matrix}}\right\}{\mbox{for all sets }}A,B,C.}$
• If M is some set and S denotes the set of all functions from M to M, then the operation of function composition on S is associative:
${\displaystyle (f\circ g)\circ h=f\circ (g\circ h)=f\circ g\circ h\qquad {\mbox{for all }}f,g,h\in S.}$
• Slightly more generally, given four sets M, N, P and Q, with h : M → N, g : N → P, and f : P → Q, then
${\displaystyle (f\circ g)\circ h=f\circ (g\circ h)=f\circ g\circ h}$
as before. In short, composition of maps is always associative.
• Consider a set with three elements, A, B, and C. The following operation:

| × | A | B | C |
|---|---|---|---|
| A | A | A | A |
| B | A | B | C |
| C | A | A | A |

is associative. Thus, for example, A(BC) = (AB)C = A. This operation is not commutative.
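Claims like these can be verified mechanically by checking all triples; a Python sketch of the table above:

```python
from itertools import product

# The operation table above: table[(x, y)] is the entry in row x, column y.
table = {
    ('A', 'A'): 'A', ('A', 'B'): 'A', ('A', 'C'): 'A',
    ('B', 'A'): 'A', ('B', 'B'): 'B', ('B', 'C'): 'C',
    ('C', 'A'): 'A', ('C', 'B'): 'A', ('C', 'C'): 'A',
}
op = lambda x, y: table[(x, y)]

elems = 'ABC'
associative = all(op(op(x, y), z) == op(x, op(y, z))
                  for x, y, z in product(elems, repeat=3))
commutative = all(op(x, y) == op(y, x)
                  for x, y in product(elems, repeat=2))
print(associative, commutative)  # True False
```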

## Propositional logic

### Rule of replacement

In standard truth-functional propositional logic, association[4][5] (or associativity[6]) is the name given to two valid rules of replacement. The rules allow one to move parentheses in logical expressions in logical proofs. The rules (in logical connective notation) are:

${\displaystyle (P\lor (Q\lor R))\Leftrightarrow ((P\lor Q)\lor R)}$

and

${\displaystyle (P\land (Q\land R))\Leftrightarrow ((P\land Q)\land R),}$

where "${\displaystyle \Leftrightarrow }$ " is a metalogical symbol representing "can be replaced in a proof with."

### Truth-functional connectives

Associativity is a property of some logical connectives of truth-functional propositional logic. The following logical equivalences demonstrate that associativity is a property of particular connectives. The following are truth-functional tautologies.[7]

Associativity of disjunction:

${\displaystyle ((P\lor Q)\lor R)\leftrightarrow (P\lor (Q\lor R))}$
${\displaystyle (P\lor (Q\lor R))\leftrightarrow ((P\lor Q)\lor R)}$

Associativity of conjunction:

${\displaystyle ((P\land Q)\land R)\leftrightarrow (P\land (Q\land R))}$
${\displaystyle (P\land (Q\land R))\leftrightarrow ((P\land Q)\land R)}$

Associativity of equivalence:

${\displaystyle ((P\leftrightarrow Q)\leftrightarrow R)\leftrightarrow (P\leftrightarrow (Q\leftrightarrow R))}$
${\displaystyle (P\leftrightarrow (Q\leftrightarrow R))\leftrightarrow ((P\leftrightarrow Q)\leftrightarrow R)}$

Joint denial (logical NOR) is an example of a truth-functional connective that is not associative.
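A brute-force truth-table check confirms this; a Python sketch of joint denial:

```python
from itertools import product

nor = lambda p, q: not (p or q)  # joint denial: true only when both are false

counterexamples = [(P, Q, R)
                   for P, Q, R in product([False, True], repeat=3)
                   if nor(nor(P, Q), R) != nor(P, nor(Q, R))]
print(len(counterexamples))  # 4, so NOR is not associative
```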

## Non-associative operation

A binary operation ${\displaystyle *}$  on a set S that does not satisfy the associative law is called non-associative. Symbolically,

${\displaystyle (x*y)*z\neq x*(y*z)\qquad {\mbox{for some }}x,y,z\in S.}$

For such an operation the order of evaluation does matter. For example:

${\displaystyle (5-3)-2\,\neq \,5-(3-2)}$
${\displaystyle (4/2)/2\,\neq \,4/(2/2)}$
${\displaystyle 2^{(1^{2})}\,\neq \,(2^{1})^{2}}$

Also note that infinite sums are not generally associative, for example:

${\displaystyle (1-1)+(1-1)+(1-1)+(1-1)+(1-1)+(1-1)+\dots \,=\,0}$

whereas

${\displaystyle 1+(-1+1)+(-1+1)+(-1+1)+(-1+1)+(-1+1)+(-1+1)+\dots \,=\,1}$

The study of non-associative structures arises from reasons somewhat different from the mainstream of classical algebra. One area within non-associative algebra that has grown very large is that of Lie algebras. There the associative law is replaced by the Jacobi identity. Lie algebras abstract the essential nature of infinitesimal transformations, and have become ubiquitous in mathematics.

There are other specific types of non-associative structures that have been studied in depth; these tend to come from some specific applications or areas such as combinatorial mathematics. Other examples are quasigroup, quasifield, non-associative ring, non-associative algebra and commutative non-associative magmas.

### Nonassociativity of floating point calculation

In mathematics, addition and multiplication of real numbers is associative. By contrast, in computer science, the addition and multiplication of floating point numbers is not associative, as rounding errors are introduced when dissimilar-sized values are joined together.[8]

To illustrate this, consider a floating point representation with a 4-bit mantissa:

${\displaystyle (1.000_{2}\times 2^{0}+1.000_{2}\times 2^{0})+1.000_{2}\times 2^{4}=1.000_{2}\times 2^{1}+1.000_{2}\times 2^{4}=1.001_{2}\times 2^{4}}$
${\displaystyle 1.000_{2}\times 2^{0}+(1.000_{2}\times 2^{0}+1.000_{2}\times 2^{4})=1.000_{2}\times 2^{0}+1.000_{2}\times 2^{4}=1.000_{2}\times 2^{4}}$

Even though most computers compute with 24 or 53 bits of mantissa,[9] this is an important source of rounding error, and approaches such as the Kahan summation algorithm are ways to minimise the errors. It can be especially problematic in parallel computing.[10][11]
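The effect is easy to reproduce in any IEEE 754 environment; a Python sketch:

```python
# Double-precision addition is not associative: grouping changes rounding.
x = (0.1 + 0.2) + 0.3
y = 0.1 + (0.2 + 0.3)
print(x == y)  # False
print(x, y)    # 0.6000000000000001 0.6

# A large addend can absorb small ones entirely, as in the 4-bit example:
big, small = 1e16, 1.0
print((small + small) + big == small + (small + big))  # False
```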

### Notation for non-associative operations

In general, parentheses must be used to indicate the order of evaluation if a non-associative operation appears more than once in an expression (unless the notation specifies the order in another way, like ${\displaystyle {\dfrac {2}{3/4}}}$ ). However, mathematicians agree on a particular order of evaluation for several common non-associative operations. This is simply a notational convention to avoid parentheses.

A left-associative operation is a non-associative operation that is conventionally evaluated from left to right, i.e.,

${\displaystyle \left.{\begin{matrix}x*y*z=(x*y)*z\qquad \qquad \quad \,\\w*x*y*z=((w*x)*y)*z\quad \\{\mbox{etc.}}\qquad \qquad \qquad \qquad \qquad \qquad \ \ \,\end{matrix}}\right\}{\mbox{for all }}w,x,y,z\in S}$

while a right-associative operation is conventionally evaluated from right to left:

${\displaystyle \left.{\begin{matrix}x*y*z=x*(y*z)\qquad \qquad \quad \,\\w*x*y*z=w*(x*(y*z))\quad \\{\mbox{etc.}}\qquad \qquad \qquad \qquad \qquad \qquad \ \ \,\end{matrix}}\right\}{\mbox{for all }}w,x,y,z\in S}$
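These two conventions correspond to a left fold and a right fold over the operand list; a Python sketch using subtraction (the sample values are arbitrary):

```python
from functools import reduce

xs = [100, 10, 5, 2]
sub = lambda a, b: a - b

# Left-associative reading: ((100 - 10) - 5) - 2
left = reduce(sub, xs)

# Right-associative reading: 100 - (10 - (5 - 2)), i.e. a fold from the right
right = reduce(lambda acc, b: b - acc, reversed(xs))

print(left, right)  # 83 93
```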

Both left-associative and right-associative operations occur. Left-associative operations include the following:

• Subtraction and division of real numbers:
${\displaystyle x-y-z=(x-y)-z}$
${\displaystyle x/y/z=(x/y)/z}$
• Function application:
${\displaystyle (f\,x\,y)=((f\,x)\,y)}$
This notation can be motivated by the currying isomorphism.

Right-associative operations include the following:

• Exponentiation of real numbers in superscript notation:
${\displaystyle x^{y^{z}}=x^{(y^{z})}}$
Exponentiation is commonly used with brackets or right-associatively because a repeated left-associative exponentiation operation is of little use. Repeated powers would mostly be rewritten with multiplication:
${\displaystyle (x^{y})^{z}=x^{(yz)}}$
Formatted correctly, the superscript inherently behaves as a set of parentheses; e.g. in the expression ${\displaystyle 2^{x+3}}$  the addition is performed before the exponentiation despite there being no explicit parentheses ${\displaystyle 2^{(x+3)}}$  wrapped around it. Thus given an expression such as ${\displaystyle x^{y^{z}}}$ , the full exponent ${\displaystyle y^{z}}$  of the base ${\displaystyle x}$  is evaluated first. However, in some contexts, especially in handwriting, the difference between ${\displaystyle {x^{y}}^{z}=(x^{y})^{z}}$ , ${\displaystyle x^{yz}=x^{(yz)}}$  and ${\displaystyle x^{y^{z}}=x^{(y^{z})}}$  can be hard to see. In such a case, right-associativity is usually implied.
• Function definition:
${\displaystyle \mathbb {Z} \rightarrow \mathbb {Z} \rightarrow \mathbb {Z} =\mathbb {Z} \rightarrow (\mathbb {Z} \rightarrow \mathbb {Z} )}$
${\displaystyle x\mapsto y\mapsto x-y=x\mapsto (y\mapsto x-y)}$
Using right-associative notation for these operations can be motivated by the Curry–Howard correspondence and by the currying isomorphism.
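Python follows the exponentiation convention: its ** operator is right-associative, so a chained power is read from the top down. A quick check:

```python
# ** associates to the right: 2 ** 3 ** 2 is parsed as 2 ** (3 ** 2).
print(2 ** 3 ** 2)    # 512
print((2 ** 3) ** 2)  # 64
```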

Non-associative operations for which no conventional evaluation order is defined include the following.

• Exponentiation of real numbers in infix notation:[17]
${\displaystyle (x^{\wedge }y)^{\wedge }z\neq x^{\wedge }(y^{\wedge }z)}$
• Knuth's up-arrow operators:
${\displaystyle a\uparrow \uparrow (b\uparrow \uparrow c)\neq (a\uparrow \uparrow b)\uparrow \uparrow c}$
${\displaystyle a\uparrow \uparrow \uparrow (b\uparrow \uparrow \uparrow c)\neq (a\uparrow \uparrow \uparrow b)\uparrow \uparrow \uparrow c}$
etc.
• Taking the cross product of three vectors:
${\displaystyle {\vec {a}}\times ({\vec {b}}\times {\vec {c}})\neq ({\vec {a}}\times {\vec {b}})\times {\vec {c}}\qquad {\mbox{ for some }}{\vec {a}},{\vec {b}},{\vec {c}}\in \mathbb {R} ^{3}}$
• Taking the pairwise average of real numbers:
${\displaystyle {(x+y)/2+z \over 2}\neq {x+(y+z)/2 \over 2}\qquad {\mbox{for all }}x,y,z\in \mathbb {R} {\mbox{ with }}x\neq z.}$
• Taking the relative complement of sets: ${\displaystyle (A\backslash B)\backslash C}$  is not the same as ${\displaystyle A\backslash (B\backslash C)}$ . (Compare material nonimplication in logic.)

## Antiassociativity

A binary operation ∘ on a set S is an antiassociative operation if and only if

${\displaystyle \forall x,y,z\in S:\quad (x\circ y)\circ z\neq x\circ (y\circ z).}$[18]

Let (S, ∘) be an algebraic structure. Then (S, ∘) is an antiassociative structure if and only if ∘ is an antiassociative operation, that is, if and only if the inequality above holds for all x, y, z in S.[19]
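Antiassociative operations do exist; for example, on S = {0, 1} the operation x ∘ y = 1 − x (negate the first argument, ignore the second) is antiassociative, since (x ∘ y) ∘ z = x while x ∘ (y ∘ z) = 1 − x, and x ≠ 1 − x on {0, 1}. A Python check of all eight triples:

```python
from itertools import product

# Candidate antiassociative operation on S = {0, 1}: negate the first
# argument and ignore the second, so (x*y)*z = x but x*(y*z) = 1 - x.
op = lambda x, y: 1 - x

antiassociative = all(op(op(x, y), z) != op(x, op(y, z))
                      for x, y, z in product([0, 1], repeat=3))
print(antiassociative)  # True
```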

## References

1. ^ Hungerford, Thomas W. (1974). Algebra (1st ed.). Springer. p. 24. ISBN 978-0387905181. Definition 1.1 (i) a(bc) = (ab)c for all a, b, c in G.
2. ^ Durbin, John R. (1992). Modern Algebra: an Introduction (3rd ed.). New York: Wiley. p. 78. ISBN 978-0-471-51001-7. If ${\displaystyle a_{1},a_{2},\dots ,a_{n}\,\,(n\geq 2)}$  are elements of a set with an associative operation, then the product ${\displaystyle a_{1}a_{2}\dots a_{n}}$  is unambiguous; this is, the same element will be obtained regardless of how parentheses are inserted in the product
3. ^ "Matrix product associativity". Khan Academy. Retrieved 5 June 2016.
4. ^ Moore, Brooke Noel; Parker, Richard (2017). Critical Thinking (12th edition). New York: McGraw-Hill Education. p. 321. ISBN 9781259690877.
5. ^ Copi, Irving M.; Cohen, Carl; McMahon, Kenneth (2014). Introduction to Logic (14th edition). Essex: Pearson Education. p. 387. ISBN 9781292024820.
6. ^ Hurley, Patrick J.; Watson, Lori (2016). A Concise Introduction to Logic (13th edition). Boston: Cengage Learning. p. 427. ISBN 9781305958098.
7. ^ "Symbolic Logic Proof of Associativity". Math.stackexchange.com. 22 March 2017.
8. ^ Knuth, Donald, The Art of Computer Programming, Volume 3, section 4.2.2
9. ^ IEEE Computer Society (29 August 2008). IEEE Standard for Floating-Point Arithmetic. doi:10.1109/IEEESTD.2008.4610935. ISBN 978-0-7381-5753-5. IEEE Std 754-2008.
10. ^ Villa, Oreste; Chavarría-mir, Daniel; Gurumoorthi, Vidhya; Márquez, Andrés; Krishnamoorthy, Sriram, Effects of Floating-Point non-Associativity on Numerical Computations on Massively Multithreaded Systems (PDF), archived from the original (PDF) on 15 February 2013, retrieved 8 April 2014
11. ^ Goldberg, David (March 1991). "What Every Computer Scientist Should Know About Floating-Point Arithmetic" (PDF). ACM Computing Surveys. 23 (1): 5–48. doi:10.1145/103162.103163. Retrieved 20 January 2016.
12. ^ George Mark Bergman: Order of arithmetic operations
13. ^ Education Place: The Order of Operations
14. ^ Khan Academy: The Order of Operations, timestamp 5m40s
15. ^ Virginia Department of Education: Using Order of Operations and Exploring Properties, section 9
16. ^ Bronstein: de:Taschenbuch der Mathematik, pages 115-120, chapter: 2.4.1.1, ISBN 978-3-8085-5673-3
17. ^ "Exponentiation Associativity and Standard Math Notation". Codeplea. 23 August 2016. Retrieved 20 September 2016.
18. ^ Definition:Antiassociative Operation
19. ^ Definition:Antiassociative Structure