Digitizing is the process of translating an analog, or continuous, signal (such as a music recording) into a digital code the computer can interpret.
Consider a photographer taking a light reading of a subject. She looks at the dial on her light meter and writes the exposure down on paper. This is an example of digitizing a light reading: the light is a continuous analog signal source that is converted into a numerical expression.
A video digitizer uses a television camera pointed at a subject as its analog signal source. The camera sends an analog waveform (a measure of the relative brightness of the subject) in the form of scan lines to a conversion unit, which quantizes the sampled information it receives, that is, assigns each sample a numerical value.
Having more numerical values define an image means higher resolution and greater clarity in the final image. Moiré patterns and jaggies result when the quantization is too coarse to represent the analog signal accurately.
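The effect of quantization depth can be sketched in a few lines of Python. The sample values and level counts below are illustrative, not taken from any particular digitizer: each continuous brightness reading (here a number between 0.0 and 1.0) is assigned one of a fixed number of discrete levels, and more levels means a finer, more faithful result.

```python
# A minimal sketch of quantization: mapping continuous brightness
# samples (0.0 to 1.0) onto a fixed number of discrete levels.
# The readings and level counts are hypothetical examples.

def quantize(sample, levels):
    """Assign a continuous sample a numerical value from 0 to levels - 1."""
    return min(int(sample * levels), levels - 1)

brightness = [0.02, 0.37, 0.51, 0.98]  # hypothetical analog readings

# Coarse quantization: only 4 levels (2 bits per sample)
print([quantize(s, 4) for s in brightness])    # -> [0, 1, 2, 3]

# Finer quantization: 256 levels (8 bits per sample)
print([quantize(s, 256) for s in brightness])  # -> [5, 94, 130, 250]
```

With only four levels, readings as different as 0.51 and 0.74 would collapse into the same value; at 256 levels they stay distinct, which is why coarse quantization produces visible artifacts.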
Once an image is in the computer, it can be altered or combined with other images and text to create unique and personalized documents. In theory, this digital information can be copied repeatedly without degrading.
A binary numbering system is used to represent the on and off states of electric circuits: 0 means off, 1 means on. Each 0 or 1 is a binary digit, or bit. A byte is made up of 8 bits. Each character on your keyboard is represented by a distinct arrangement of 8 bits; the standardized system for these characters is the ASCII character set, an acronym for American Standard Code for Information Interchange.
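The character-to-bits mapping above can be demonstrated directly in Python, using the built-in `ord` function to look up a character's ASCII code and `format` to write that code as 8 binary digits:

```python
# Each keyboard character maps to a numeric ASCII code, which can be
# written as an 8-bit binary pattern -- one byte per character.
for ch in "Hi!":
    code = ord(ch)               # the character's ASCII code
    bits = format(code, "08b")   # the same value as 8 binary digits
    print(ch, code, bits)
# H 72 01001000
# i 105 01101001
# ! 33 00100001
```

Note that each distinct character yields a distinct pattern of eight 0s and 1s, exactly as the ASCII standard prescribes.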
A kilobyte (KB or K) is 1024 (2^10) bytes.
A megabyte (MB) is 1024 kilobytes.
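These powers-of-two sizes follow directly from the binary numbering system, and the arithmetic is easy to verify:

```python
# Storage units build up in powers of two.
KB = 2 ** 10       # 1 kilobyte = 1024 bytes
MB = 2 ** 10 * KB  # 1 megabyte = 1024 kilobytes = 1,048,576 bytes
print(KB, MB)      # -> 1024 1048576
```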