# Counting

**Counting** is the process of determining the number of elements of a finite set of objects. The traditional way of counting consists of continually increasing a counter by a unit for every element of the set, in some order, while marking those elements to avoid visiting the same element more than once, until no unmarked elements are left; if the counter was set to one after the first object, the value after visiting the final object gives the desired number of elements. The related term *enumeration* refers to uniquely identifying the elements of a finite or infinite set by assigning a number to each element.

Counting sometimes involves numbers other than one; for example, when counting money, counting out change, "counting by twos", or "counting by fives".

There is archaeological evidence suggesting that humans have been counting for at least 50,000 years. Counting was primarily used by ancient cultures to keep track of social and economic data such as the number of group members, prey animals, property, or debts. Notched bones were also found in the Border Caves in South Africa that may suggest that the concept of counting was known to humans as far back as 44,000 BCE. The development of counting led to the development of mathematical notation, numeral systems, and writing.

## Forms of counting

Counting can occur in a variety of forms. Counting can be verbal; that is, speaking every number out loud to keep track of progress. This is often used to count objects that are already present, instead of counting a variety of things over time.

Counting can also be in the form of tally marks, making a mark for each number and then counting all of the marks when done tallying. This is useful when counting objects over time, such as the number of times something occurs during the course of a day. Tallying is base 1 counting; normal counting is done in base 10. Computers use base 2 counting.
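A minimal sketch of the three systems just mentioned, expressing the same count in each (the variable names here are illustrative, not part of any standard API):

```python
# The same count of five expressed in the three bases above.
n = 5
tally = "|" * n            # base 1 (unary): one mark per item counted
decimal = str(n)           # base 10: ordinary counting
binary = bin(n)[2:]        # base 2: how computers count
print(tally, decimal, binary)  # ||||| 5 101
```

Note how the unary form grows linearly with the count, while the positional forms grow only logarithmically; this is why tallying suits incremental counting over time but not large totals.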

Counting can also be in the form of finger counting, especially when counting small numbers. This is often used by children to facilitate counting and simple mathematical operations. Finger counting uses unary notation and is thus limited to counting to 10. Older finger counting used the four fingers and the three bones in each finger to count to the number twelve. Other hand-gesture systems are also in use, for example the Chinese system by which one can count to 10 using only gestures of one hand. By using finger binary, it is possible to keep a finger count up to 1023.
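In finger binary, each of the ten fingers stands for one binary digit, raised (1) or lowered (0). A small sketch (the helper name `finger_binary` is invented for illustration):

```python
# Interpret ten fingers as a 10-bit binary number, most significant first.
def finger_binary(fingers):
    value = 0
    for bit in fingers:          # fold the bits into an integer
        value = value * 2 + bit
    return value

print(finger_binary([1] * 10))   # all ten fingers raised -> 1023
```

With all ten fingers raised the value is 2^10 − 1 = 1023, the maximum finger-binary count.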

Various devices can also be used to facilitate counting, such as hand tally counters and abacuses.

## Inclusive counting

Inclusive counting is usually encountered when dealing with time in the Romance languages. In exclusive counting languages such as English, when counting "8" days from Sunday, Monday will be *day 1*, Tuesday *day 2*, and the following Monday will be the *eighth day*. When counting "inclusively", the Sunday will be *day 1* and therefore the following Sunday will be the *eighth day*. For example, the French phrase for "fortnight" is *quinzaine* (from *quinze*, "fifteen"), and similar words are present in Greek, Spanish and Portuguese. In contrast, the English word "fortnight" itself derives from "a fourteen-night", as the archaic "sennight" does from "a seven-night"; the English words are not examples of inclusive counting.
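The arithmetic difference between the two conventions is a single off-by-one: exclusive counting numbers the starting day 0, inclusive counting numbers it 1. A minimal sketch (the two helper functions are hypothetical, just naming the conventions):

```python
def exclusive_day(offset):
    """Exclusive counting: the starting day itself is day 0."""
    return offset

def inclusive_day(offset):
    """Inclusive counting: the starting day itself is day 1."""
    return offset + 1

print(exclusive_day(7), inclusive_day(7))  # 7 8  (next Sunday: "eighth day" inclusively)
print(inclusive_day(14))                   # 15   (two weeks: the French "quinzaine")
```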

Names based on inclusive counting appear in other calendars as well: in the Roman calendar the *nones* (from Latin *nonus*, "ninth") is 8 days before the *ides*, and in the Christian calendar Quinquagesima (Latin for "fiftieth") is 49 days before Easter Sunday.

Musical terminology also uses inclusive counting of intervals between notes of the standard scale: going up one note is a second interval, going up two notes is a third interval, etc., and going up seven notes is an *octave*.

## Education and development

Learning to count is an important educational/developmental milestone in most cultures of the world. Learning to count is a child's very first step into mathematics, and constitutes the most fundamental idea of that discipline. However, some cultures in Amazonia and the Australian Outback do not count, and their languages do not have number words. Many children at just 2 years of age have some skill in reciting the count list. They can also answer questions of ordinality for small numbers, for example, "What comes after *three*?". They can even be skilled at pointing to each object in a set and reciting the words one after another. This leads many parents and educators to the conclusion that the child knows how to use counting to determine the size of a set. Research suggests that it takes about a year after learning these skills for a child to understand what they mean and why the procedures are performed. In the meantime, children learn how to name cardinalities that they can subitize.

## Counting in mathematics

In mathematics, the essence of counting a set and finding a result *n* is that it establishes a one-to-one correspondence (or bijection) of the set with the set of numbers {1, 2, ..., *n*}. A fundamental fact, which can be proved by mathematical induction, is that no bijection can exist between {1, 2, ..., *n*} and {1, 2, ..., *m*} unless *n* = *m*; this fact ensures that counting the same set in different ways can never result in different numbers. This is the fundamental mathematical theorem that gives counting its purpose: however you count a set, the answer is the same. In a broader context, the theorem is an example of a theorem in the mathematical field of combinatorics; hence combinatorics is sometimes referred to as "the mathematics of counting."
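As an illustration, Python's `enumerate` constructs exactly such a correspondence with {1, ..., *n*}; pairing the elements in a different order changes the bijection but not the count:

```python
items = {"apple", "pear", "plum"}

# Pair each element with a number 1..n. The iteration order is arbitrary,
# but the largest number used -- the count -- is the same either way.
pairing = {k: x for k, x in enumerate(items, start=1)}
reverse_pairing = {k: x for k, x in enumerate(sorted(items, reverse=True), start=1)}

print(len(pairing), len(reverse_pairing))  # 3 3
```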

Many sets that arise in mathematics do not allow a bijection to be established with {1, 2, ..., *n*} for *any* natural number *n*; these are called infinite sets, while those sets for which such a bijection does exist are called finite sets. Infinite sets cannot be counted in the usual sense; for one thing, the mathematical theorems which underlie this usual sense for finite sets are false for infinite sets. Furthermore, different definitions of the concepts in terms of which these theorems are stated, while equivalent for finite sets, are inequivalent in the context of infinite sets.

The notion of counting may be extended to them in the sense of establishing a bijection with some well-understood set. For instance, if a set can be brought into bijection with the set of all natural numbers, then it is called "countably infinite." This kind of counting differs in a fundamental way from counting of finite sets, in that adding new elements to a set does not necessarily increase its size, because the possibility of a bijection with the original set is not excluded. For instance, the set of all integers can be brought into bijection with the set of natural numbers, and even seemingly much larger sets like that of all finite sequences of rational numbers are still countably infinite. Nevertheless, there are sets, such as the set of real numbers, that can be shown to be "too large" to admit a bijection with the natural numbers, and these sets are called "uncountable." Sets for which there exists a bijection between them are said to have the same cardinality, and in the most general sense counting a set can be taken to mean determining its cardinality. Beyond the cardinalities given by each of the natural numbers, there is an infinite hierarchy of infinite cardinalities, although only very few such cardinalities occur in ordinary mathematics.
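One such bijection between the natural numbers and all integers can be written down explicitly, alternating positive and negative values (the function name `nat_to_int` is invented for this sketch):

```python
# Map 0, 1, 2, 3, 4, ... onto 0, 1, -1, 2, -2, ... :
# odd inputs go to positive integers, even inputs to zero and negatives.
def nat_to_int(k):
    return (k + 1) // 2 if k % 2 == 1 else -(k // 2)

print([nat_to_int(k) for k in range(9)])  # [0, 1, -1, 2, -2, 3, -3, 4, -4]
```

Every integer appears exactly once, which is what makes the integers countably infinite despite seeming "twice as large" as the naturals.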

Counting, mostly of finite sets, has various applications in mathematics. One important principle is that if two sets *X* and *Y* have the same finite number of elements, and a function *f* : *X* → *Y* is known to be injective, then it is also surjective, and vice versa. A related fact is known as the pigeonhole principle, which states that if two sets *X* and *Y* have finite numbers of elements *n* and *m* with *n* > *m*, then any map *f* : *X* → *Y* is *not* injective; this follows from the former principle, since if *f* were injective, then so would be its restriction to a strict subset *S* of *X* with *m* elements, which restriction would then be surjective, contradicting the fact that for *x* in *X* outside *S*, *f*(*x*) cannot be in the image of the restriction. Similar counting arguments can prove the existence of certain objects without explicitly providing an example. In the case of infinite sets this can even apply in situations where it is impossible to give an example.
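The pigeonhole principle can be checked concretely: any map from *n* = 4 items into *m* = 3 boxes must send two items to the same box. A small sketch (the helper `is_injective` is invented for illustration):

```python
# A map is injective exactly when no two keys share a value.
def is_injective(mapping):
    return len(set(mapping.values())) == len(mapping)

boxes = {item: item % 3 for item in range(4)}  # 4 items -> boxes 0, 1, 2, 0
print(is_injective(boxes))  # False: items 0 and 3 share box 0
```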

The domain of enumerative combinatorics deals with computing the number of elements of finite sets without actually counting them; the latter is usually impossible because infinite families of finite sets are considered at once, such as the set of permutations of {1, 2, ..., *n*} for any natural number *n*.
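For the family of permutations, the enumerative answer is the closed formula *n*!, which can be confirmed by brute-force enumeration for small *n* (though enumeration quickly becomes infeasible, which is the point of having the formula):

```python
from itertools import permutations
from math import factorial

# |S_n| = n! without listing anything; enumeration agrees for small n.
n = 5
counted = sum(1 for _ in permutations(range(n)))
print(counted, factorial(n))  # 120 120
```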