Natural number


In mathematics, the natural numbers are the numbers 0, 1, 2, 3, and so on, possibly excluding 0. The terms positive integers, non-negative integers, whole numbers, and counting numbers are also used. The set of the natural numbers is commonly denoted by a bold N or a blackboard bold ℕ.
The natural numbers are used for counting, and for labeling the result of a count, such as: "there are seven days in a week", in which case they are called cardinal numbers. They are also used to label places in an ordered series, such as: "the third day of the month", in which case they are called ordinal numbers. Natural numbers can also be used as labels, like the jersey numbers of a sports team; in this case, they have no specific mathematical properties and are called nominal numbers.
Natural numbers can be compared by magnitude, with larger numbers coming after smaller ones in the list 1, 2, 3,.... Two basic arithmetical operations are defined on natural numbers: addition and multiplication. However, the inverse operations, subtraction and division, only sometimes give natural-number results: subtracting a larger natural number from a smaller one results in a negative number and dividing one natural number by another commonly leaves a remainder.
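For instance,
\[
5 - 2 = 3 \in \mathbb{N}, \qquad 2 - 5 = -3 \notin \mathbb{N}, \qquad 7 \div 3 = 2 \ \text{remainder}\ 1,
\]
so subtraction and division do not always stay within the natural numbers (the particular numbers here are chosen only for illustration).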
The most common number systems used throughout mathematics – the integers, rational numbers, real numbers, and complex numbers – contain the natural numbers, and can be formally defined in terms of natural numbers.
Arithmetic is the study of the ways to perform basic operations on these number systems. Number theory is the study of the properties of these operations and their generalizations. Much of combinatorics involves counting mathematical objects, patterns and structures that are defined using natural numbers.

Terminology and notation

The term natural numbers has two common definitions: either 0, 1, 2, 3, ... or 1, 2, 3, .... Because there is no universal convention, the definition can be chosen to suit the context of use. To eliminate ambiguity, the sequences 1, 2, 3, ... and 0, 1, 2, 3, ... are often called the positive integers and the non-negative integers, respectively.
The phrase whole numbers is frequently used for the natural numbers that include 0, although it may also mean all integers, positive and negative. In primary education, counting numbers usually refer to the natural numbers starting at 1, though this definition can vary.
The set of all natural numbers is typically denoted N or, in blackboard bold, ℕ. Whether 0 is included is often determined by the context but may also be specified by using ℕ with a subscript or superscript. Examples include ℕ₀ or ℕ≥0 for the natural numbers including 0, and ℕ⁺, ℕ₁, or ℕ>0 for the natural numbers excluding 0.
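In set-builder style, these conventions can be summarized as follows (the exact choice of symbols varies between authors):
\[
\mathbb{N}_0 = \{0, 1, 2, 3, \dots\}, \qquad \mathbb{N}^{+} = \mathbb{N}_1 = \{1, 2, 3, \dots\},
\]
with ℕ itself standing for either set, depending on the convention in use.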

Intuitive concept

An intuitive and implicit understanding of natural numbers is developed naturally through using numbers for counting, ordering and basic arithmetic. Within this are two closely related aspects of what a natural number is: the size of a collection, and a position in a sequence.

Size of a collection

Natural numbers can be used to answer questions like: "how many apples are on the table?". A natural number used in this way describes a characteristic of a collection of objects. This characteristic, the size of a collection, is called cardinality, and a natural number used to describe or measure it is called a cardinal number.
Two collections have the same size or cardinality if there is a one-to-one correspondence between the objects in one collection and the objects in the other. For example, given a group of three apples and a group of three oranges, every apple can be paired off with one orange and every orange can be paired off with one apple. From this, even without counting or using numbers, it can be seen that the group of apples has the same cardinality as the group of oranges, meaning they are both assigned the same cardinal number.
The natural number 3 is the thing used for the particular cardinal number described above and for the cardinal number of any other collection of objects that could be paired off in the same way with one of these groups.
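In more formal terms (a standard formulation, stated here for concreteness), two collections A and B have the same cardinality exactly when there is a bijection between them:
\[
|A| = |B| \iff \text{there exists a bijection } f\colon A \to B .
\]
For the apples and oranges above, both collections are assigned the cardinal number 3.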

Position in a sequence

The natural numbers have a fixed progression, which is the familiar sequence beginning with 1, 2, 3, and so on. A natural number can be used to denote a specific position in any other sequence, in which case it is called an ordinal number. To have a specific position in a sequence means to come either before or after every other position in the sequence in a defined way, which is the concept of order.
The natural number 3 then is the thing that comes after 2 and 1, and before 4, 5 and so on. The number 2 is the thing that comes after 1, and 1 is the first element in the sequence. Each number represents the relation that position bears to the rest of the infinite sequence.

Counting

The process of counting involves both the cardinal and ordinal use of the natural numbers and illustrates the way the two fit together. To count the number of objects in a collection, each object is paired off with a natural number, usually by mentally or verbally saying the name of the number and assigning it to a particular object. The numbers must be assigned in order starting with 1 but the order of the objects chosen is arbitrary as long as each object is assigned one and only one number. When all of the objects have been assigned a number, the ordinal number assigned to the final object gives the result of the count, which is the cardinal number of the whole collection.
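As a small worked example (the objects are made up for illustration), counting the collection {cup, plate, spoon} might assign
\[
\text{cup} \mapsto 1, \qquad \text{plate} \mapsto 2, \qquad \text{spoon} \mapsto 3,
\]
and the last ordinal used, 3, is then the cardinal number of the whole collection.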

History

Ancient roots

The most primitive method of representing a natural number is to use one's fingers, as in finger counting. Putting down a tally mark for each object is another primitive method. Later, a set of objects could be tested for equality, excess or shortage—by striking out a mark and removing an object from the set.
The first major advance in abstraction was the use of numerals to represent numbers. This allowed systems to be developed for recording large numbers. The ancient Egyptians developed a powerful system of numerals with distinct hieroglyphs for 1, 10, and all powers of 10 up to over 1 million. A stone carving from Karnak, dating from around 1500 BCE and now at the Louvre in Paris, depicts 276 as 2 hundreds, 7 tens, and 6 ones; and similarly for the number 4,622. The Babylonians had a place-value system based essentially on the numerals for 1 and 10, using base sixty, so that the symbol for sixty was the same as the symbol for one—its value being determined from context.
A much later advance was the development of the idea that 0 can be considered as a number, with its own numeral. The use of a 0 digit in place-value notation dates back as early as 700 BCE by the Babylonians, who omitted such a digit when it would have been the last symbol in the number. The Olmec and Maya civilizations used 0 as a separate number, but this usage did not spread beyond Mesoamerica. The use of a numeral 0 in modern times originated with the Indian mathematician Brahmagupta in 628 CE. However, 0 had been used as a number in the medieval computus (the calculation of the date of Easter), beginning with Dionysius Exiguus in 525 CE, without being denoted by a numeral. Standard Roman numerals do not have a symbol for 0; instead, nulla, from nullus, the Latin word for "none", was employed to denote a 0 value.
The first systematic study of numbers as abstractions is usually credited to the Greek philosophers Pythagoras and Archimedes. Some Greek mathematicians treated the number 1 differently than larger numbers, sometimes even not as a number at all. Euclid, for example, defined a unit first and then a number as a multitude of units; thus by his definition, a unit is not a number and there are no unique numbers (for example, any two units out of indefinitely many units is "a 2"). However, in the definition of perfect number, which comes shortly afterward, Euclid treats 1 as a number like any other.
Independent studies on numbers also occurred at around the same time in India, China, and Mesoamerica.

Emergence as a term

Nicolas Chuquet used the term progression naturelle (natural progression) in 1484. The earliest known use of "natural number" as a complete English phrase is in 1763. The 1771 Encyclopaedia Britannica defines natural numbers in the logarithm article.
Starting at 0 or 1 has long been a matter of definition. In 1727, Bernard Le Bovier de Fontenelle wrote that his notions of distance and element led to defining the natural numbers as including or excluding 0. In 1889, Giuseppe Peano used N for the positive integers and started at 1, but he later changed to using N0 and N1. Historically, most definitions have excluded 0, but many mathematicians such as George A. Wentworth, Bertrand Russell, Nicolas Bourbaki, Paul Halmos, Stephen Cole Kleene, and John Horton Conway have preferred to include 0. This approach gained wider adoption in the 1960s and was formalized in ISO 31-11, which defines natural numbers to include zero, a convention retained in the current ISO 80000-2 standard.

Formal construction

In 19th century Europe, there was mathematical and philosophical discussion about the exact nature of the natural numbers. Henri Poincaré stated that axioms can only be demonstrated in their finite application, and concluded that it is "the power of the mind" which allows conceiving of the indefinite repetition of the same act. Leopold Kronecker summarized his belief as "God made the integers, all else is the work of man".
The constructivists saw a need to improve upon the logical rigor in the foundations of mathematics. In the 1860s, Hermann Grassmann suggested a recursive definition for natural numbers, thus stating they were not really natural—but a consequence of definitions. Later, two classes of such formal definitions emerged, using set theory and Peano's axioms respectively. Later still, they were shown to be equivalent in most practical applications.
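A recursive definition in Grassmann's spirit can be sketched as follows, writing S(n) for the successor of n (the notation is modern, not Grassmann's own):
\[
a + 0 = a, \qquad a + S(b) = S(a + b), \qquad a \times 0 = 0, \qquad a \times S(b) = (a \times b) + a .
\]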
Set-theoretical definitions of natural numbers were initiated by Frege. He initially defined a natural number as the class of all sets that are in one-to-one correspondence with a particular set. However, this definition turned out to lead to paradoxes, including Russell's paradox. To avoid such paradoxes, the formalism was modified so that a natural number is defined as a particular set, and any set that can be put into one-to-one correspondence with that set is said to have that number of elements.
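The particular sets most often used for this purpose today are the von Neumann ordinals (one standard choice among several possible constructions), in which each natural number is the set of all smaller natural numbers:
\[
0 = \varnothing, \qquad 1 = \{0\} = \{\varnothing\}, \qquad 2 = \{0, 1\} = \{\varnothing, \{\varnothing\}\}, \qquad n + 1 = n \cup \{n\} .
\]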
In 1881, Charles Sanders Peirce provided the first axiomatization of natural-number arithmetic. In 1888, Richard Dedekind proposed another axiomatization of natural-number arithmetic, and in 1889, Peano published a simplified version of Dedekind's axioms in his book The principles of arithmetic presented by a new method. This approach is now called Peano arithmetic. It is based on an axiomatization of the properties of ordinal numbers: each natural number has a successor and every non-zero natural number has a unique predecessor. Peano arithmetic is equiconsistent with several weak systems of set theory. One such system is ZFC with the axiom of infinity replaced by its negation. Theorems that can be proved in ZFC but cannot be proved using the Peano Axioms include Goodstein's theorem.
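In modern notation, the Peano axioms are commonly stated along the following lines (here with 0 as the starting point; versions beginning at 1 differ only in the base case):
\begin{enumerate}
\item $0 \in \mathbb{N}$;
\item if $n \in \mathbb{N}$, then its successor $S(n) \in \mathbb{N}$;
\item $S(m) = S(n)$ implies $m = n$;
\item $S(n) \neq 0$ for every $n \in \mathbb{N}$;
\item (induction) any set $K$ that contains $0$ and contains $S(n)$ whenever it contains $n$ contains every natural number.
\end{enumerate}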