# Numbering and Coding Systems Tutorials

Number systems and data representation are integral to embedded systems, and every embedded engineer should understand them. In this article we discuss the most important number systems used in embedded systems programming, including development for 8051 microcontrollers. The common ways of representing numbers in computer programming are:

1. Binary Number System
2. Decimal Number System
3. BCD (Binary Coded Decimal) Number System
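
To make the differences concrete, here is a minimal sketch (in Python, for illustration; the helper names `to_binary` and `to_bcd` are ours, not part of any standard API) showing the same value written in each of the systems listed above:

```python
def to_binary(n):
    """Return the binary (base-2) digits of n as a string."""
    return bin(n)[2:]

def to_bcd(n):
    """Encode each decimal digit of n as its own 4-bit group (BCD)."""
    return " ".join(format(int(d), "04b") for d in str(n))

value = 45
print(to_binary(value))  # 101101  (pure binary: 32 + 8 + 4 + 1)
print(to_bcd(value))     # 0100 0101  (BCD: digit 4, then digit 5)
```

Note that the pure binary and BCD encodings of 45 differ: binary encodes the whole value at once, while BCD encodes each decimal digit separately.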

## Binary Number System

The binary number system uses only two digits, 0 and 1, to represent any value. It is a base-2 system and, like decimal, a positional notation: each binary digit is weighted by a power of 2 determined by its position.
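
The positional weighting can be sketched as follows (Python used for illustration; `binary_to_decimal` is a name of our own choosing):

```python
def binary_to_decimal(bits):
    """Sum each binary digit weighted by the power of 2 for its position."""
    total = 0
    for position, digit in enumerate(reversed(bits)):
        total += int(digit) * 2 ** position
    return total

# 1011 in binary: 1*8 + 0*4 + 1*2 + 1*1
print(binary_to_decimal("1011"))  # 11
```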

Unlike analog signals, which vary continuously in amplitude and frequency during normal operation, a digital signal takes only one of two states: 0 or 1, False or True, Logic 0 or Logic 1. Every digital circuit follows this convention. In general, 0 is represented by a low voltage level and 1 by a high voltage level.

## Decimal Number System

The decimal number system is the easiest and most familiar way to represent numbers; it is used everywhere in computing and by everyone in every field. It is a base-10 system: because it uses the ten digits 0 through 9, its base is 10. In computers, decimal numbers also follow positional notation, so each digit is weighted by a power of 10 determined by its position. Since ten distinct digits can appear at each position, ten numerals (0 through 9) are needed to represent the decimal quantity of each digit position.
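
The decimal positional expansion can be sketched like this (Python for illustration; the function name `expand_decimal` is our own):

```python
def expand_decimal(digits):
    """Return each decimal digit weighted by the power of 10 for its position."""
    terms = []
    for position, digit in enumerate(reversed(digits)):
        terms.append(int(digit) * 10 ** position)
    return list(reversed(terms))  # most significant term first

# 372 = 3*100 + 7*10 + 2*1
print(expand_decimal("372"))       # [300, 70, 2]
print(sum(expand_decimal("372")))  # 372
```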