Numerals in Unicode

A numeral (often called a number in Unicode) is a character that denotes a number. Decimal notation is used widely across the world's writing systems, but the graphemes representing the decimal digits differ widely, so Unicode includes 22 different sets of graphemes for the decimal digits, as well as various decimal points, thousands separators, negative signs, and so on. Unicode also includes several non-decimal numeral systems, such as Aegean numerals, Roman numerals, counting rod numerals, Cuneiform numerals, and ancient Greek numerals. There are also many typographical variants of the Western Arabic numerals, provided for specialized mathematical use and for compatibility with earlier character sets, such as ² or ②, and composite characters such as ½.
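The variety described above can be seen programmatically: Unicode assigns most numerals a numeric value as a character property, regardless of which script or typographical variant they belong to. A minimal sketch in Python, using the standard-library unicodedata module (the sample characters are illustrative choices, not an exhaustive list):

```python
import unicodedata

# Numerals from several Unicode sets: Western Arabic, Devanagari,
# Arabic-Indic, superscript, circled, and a vulgar fraction.
samples = ["7", "\u096D", "\u0667", "\u00B2", "\u2461", "\u00BD"]

for ch in samples:
    # unicodedata.numeric() returns the numeric value the character
    # denotes, even for non-decimal-digit characters like ½.
    value = unicodedata.numeric(ch)
    print(f"U+{ord(ch):04X} {unicodedata.name(ch)}: {value}")
```

Note that ², ②, and the Devanagari and Arabic-Indic sevens all report plain numeric values (2.0 and 7.0), while the composite character ½ reports 0.5, reflecting the fraction it denotes rather than a decimal digit.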
