Notation may be said to be the use of definite symbols for the representation of definite things. Although symbols are of great importance in chemistry, music, and mathematics, yet in this article we shall only deal with those which have been commonly used in counting. It must have always been necessary to have some means of representing quantity, and so we find that in the very earliest times - long before the growth of writing - certain signs were used to denote numbers. It was natural that the fingers should first be enlisted as symbols, and counting by these appendages in the course of time developed into an intricate system of reckoning. In its primitive form it is found now among certain savage tribes, and it is remarkable to note that savages, as a rule, can only deal with very small numbers. Elaborated to a finished system of calculation, it is seen in certain provincial villages in Europe itself, where the different positions of the fingers - obtained by bending or closing them - are capable of expressing numbers up to 10,000. The fingers, however, though useful for expression, were useless for record, and so a system for expressing numbers by strokes grew up. This was obviously cumbrous for large numbers, and soon different symbols were employed to stand for such numbers as 5, 10, 100, etc. The Babylonians represented all numbers below 100 by two symbols only - for 1 and 10 - these being repeated as many times as the number required; larger numbers were, in fact, obtained simply by the addition of smaller ones. Soon, however, multiplication began to be used. In the Syrian system (derived from the Egyptian), as well as in the Babylonian, a sign put to the left of the symbol for 100 denoted the number of hundreds meant. As writing became general, the alphabet became a field for symbols.
Sometimes the letters taken in order represented the numbers also in order, as in the Ionic system; and sometimes the initial letter of the word signifying the number was used to denote it, as in the "Herodian" system of the early Greeks. Later the Greeks, like the Hebrews, took certain of their letters to denote the numbers from 1 to 9; others represented 10, 20, 30, and so on, while further letters were used to signify the hundreds. The Roman system is familiar to all. Before the letters "C" and "M" (the initials of centum and mille) were used, however, we find a circle divided in different ways representing 100 and 1,000. The divided-circle sign for 1,000 may have given rise to the form (I), and half of that sign would be D, the symbol for 500. Some people think that "M" itself comes from the old sign and not from mille. The divided-circle sign for 100 may likewise have given rise to L for 50. It gradually became evident to the Greeks and Hebrews that high numbers could be expressed by merely altering the position of the symbols of the lower ones, and from this time onwards notation assumed an easy form. For a long time numbers had been mechanically represented by means of counters placed on a kind of table known as an abacus (q.v.). The use of an abacus with nine ciphers instead of the counters seems to have been known in Europe in the tenth century, before the complete modern system was introduced. This complete system came into use in the twelfth century and differed from all previous ones in having the sign 0. Until then no zero had been used, so it was not always clear whether a symbol meant a certain number or ten or a hundred times as much. The system with the zero seems to have originated in India, to have taken root in Arabia in the ninth century, and thence progressed to Europe. Besides this decimal system, we still have a trace of a sexagesimal system in the division of time and in the graduation of the circle.
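The difference the zero made may be sketched in modern terms. The short example below is an illustrative sketch, not part of the original article, and its function names are invented for the purpose; it contrasts a purely additive Roman-style rendering, in which symbols are simply repeated and added, with a positional rendering, in which the place of each digit carries a power of ten and the sign 0 holds an empty place.

```python
# Additive notation: each symbol has a fixed value; a number is written
# by repeating symbols and summing them (no subtractive forms such as IV).
ROMAN = [(1000, "M"), (500, "D"), (100, "C"), (50, "L"), (10, "X"), (5, "V"), (1, "I")]

def to_roman_additive(n):
    """Render n in the purely additive Roman style, largest symbols first."""
    out = []
    for value, symbol in ROMAN:
        count, n = divmod(n, value)
        out.append(symbol * count)
    return "".join(out)

def to_positional(n, base=10):
    """Render n as positional digits, most significant first.

    The place of each digit carries a power of the base, which is only
    unambiguous once a zero sign exists to mark the empty places.
    """
    digits = []
    while True:
        n, d = divmod(n, base)
        digits.append(d)
        if n == 0:
            break
    return digits[::-1]

print(to_roman_additive(1904))  # MDCCCCIIII - ten symbols, value read by adding
print(to_positional(1904))      # [1, 9, 0, 4] - the 0 marks the empty tens place
```

Without the zero, the positional form of 1904 and 194 would both read "1 9 4", which is precisely the ambiguity the article describes.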
This method was in use among the Babylonians, who reckoned in powers of 60.
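Reckoning in powers of 60 survives directly in the way we still break time and angle into minutes and seconds. As a hedged illustration (the function name is invented, not from the article), repeated division by 60 recovers the sexagesimal places of a quantity given in its smallest unit:

```python
def to_sexagesimal(n, places=3):
    """Break n (in the smallest unit) into base-60 places, most significant first."""
    parts = []
    for _ in range(places):
        n, r = divmod(n, 60)
        parts.append(r)
    return parts[::-1]

print(to_sexagesimal(4825))  # [1, 20, 25]: 4,825 seconds is 1 h 20 min 25 s
```

The same routine serves for the graduation of the circle, since degrees likewise divide into 60 minutes of arc and each minute into 60 seconds.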