
Representation of Characters in Computers

Characters, including letters, punctuation, and digits, are encoded as binary numbers in computers. This article discusses data and character representation.

Although everything in a computer is ultimately stored as binary values, raw binary numbers alone do not capture the variety of data a computer must handle. Two common categories of representation are character data and integer data. Computers store and process information in binary form, while humans work with symbolic alphabetic and numeric information, so conventions are needed to map between the two.

Character theory

In mathematics, especially group theory, the character of a group representation is a function on the group that assigns to each group element the trace of its representing matrix. The character conveys the essential information about the representation more concisely than the representation itself. Georg Frobenius first developed the representation theory of finite groups entirely in terms of characters, without any explicit matrix realisation of the representations. This is possible because a complex representation of a finite group is determined, up to isomorphism, by its character.

The situation for representations over a field of positive characteristic, known as modular representation theory, is more delicate; nevertheless, Richard Brauer developed a robust theory of characters there as well. Many deep theorems on the structure of finite groups rely on characters of modular representations in their proofs.

All of the information inside a computer is carried as a series of electrical signals that are either on or off at any given time. Because of this, before a computer can handle any kind of data, including text, graphics, and sound, the data must first be transformed into binary form, which can be done in various ways. If the data is not turned into binary, a sequence of ones and zeros, the computer cannot interpret it, much less process it.
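To make the definition above concrete, here is a minimal sketch of computing a character as a trace, using the permutation representation of S3 (each permutation of three objects mapped to a 3×3 permutation matrix). The helper names are illustrative, not from any library.

```python
# The character of a representation assigns to each group element the trace
# of its matrix. Sketch: the permutation representation of S3.

def perm_matrix(p):
    """3x3 permutation matrix for a permutation p of (0, 1, 2)."""
    n = len(p)
    return [[1 if p[i] == j else 0 for j in range(n)] for i in range(n)]

def trace(m):
    """Sum of the diagonal entries of a square matrix."""
    return sum(m[i][i] for i in range(len(m)))

# For a permutation matrix, the trace equals the number of fixed points.
identity = (0, 1, 2)   # 3 fixed points -> character value 3
swap     = (1, 0, 2)   # 1 fixed point  -> character value 1
cycle    = (1, 2, 0)   # 0 fixed points -> character value 0

for p in (identity, swap, cycle):
    print(p, trace(perm_matrix(p)))
```

Note that the character takes the same value on the other transpositions and on the other 3-cycle, illustrating that characters are constant on conjugacy classes.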

Types of data representation methods

  • ASCII

A standard code is a code that everyone uses. Characters must be represented in the computer using the ones and zeros that the hardware can store and manipulate. Although the assignment of bit patterns to characters is arbitrary, standards have been established so that characters encoded on one computer are correctly interpreted by another. The most widely used standard is ASCII (American Standard Code for Information Interchange), which defines which character each sequence of bits represents. ASCII defines 128 characters, which require only 7 bits, so each character fits comfortably in a single byte.
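A quick sketch of this mapping using Python's built-in `ord` and `chr`, which convert between characters and their code points (for ASCII text, these are the ASCII codes):

```python
# Characters -> ASCII codes -> 7-bit patterns, and back again.
text = "Hi!"

codes = [ord(c) for c in text]             # e.g. 'H' -> 72
bits = [format(n, "07b") for n in codes]   # 7 bits suffice for 128 codes

print(codes)                               # [72, 105, 33]
print(bits)                                # ['1001000', '1101001', '0100001']
print("".join(chr(n) for n in codes))      # decoding recovers "Hi!"
```

Both machines only need to agree on the table; the bit patterns themselves carry no inherent meaning.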

  • Data and character representation

Integers are whole numbers, or fixed-point numbers with the radix point fixed after the least significant bit. In contrast, in real (floating-point) numbers the position of the radix point can vary, which is why they are said to "float". It is vital to remember that integers and floating-point numbers are processed differently in computers. Integers are usually represented as 8-bit, 16-bit, 32-bit, or 64-bit values, and may be either unsigned or signed. In computer science, the word integer commonly refers to a data type that represents a finite subset of the mathematical integers. A byte is eight bits long, so a character set one byte in size can encode at most 256 characters. Unicode goes further: it assigns code points to characters from all the world's writing systems in a single set, and encodings such as UTF-8 and UTF-16 store each code point in one or more bytes.
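The ranges above follow directly from the bit width, and the byte counts of Unicode encodings can be checked with `str.encode`. A short sketch using only the standard library:

```python
# Ranges of common fixed-width integers: n bits give 2**n distinct values.
for bits in (8, 16, 32, 64):
    unsigned = (0, 2**bits - 1)
    signed = (-(2 ** (bits - 1)), 2 ** (bits - 1) - 1)
    print(f"{bits:2}-bit  unsigned {unsigned}  signed {signed}")

# A one-byte character set holds at most 2**8 = 256 characters.
# Unicode encodings use a variable number of bytes per character:
print("é".encode("utf-8"))      # b'\xc3\xa9'  (2 bytes in UTF-8)
print("é".encode("utf-16-be"))  # b'\x00\xe9'  (2 bytes in UTF-16)
print("A".encode("utf-8"))      # b'A'         (ASCII needs only 1 byte)
```

For example, an 8-bit signed integer spans −128 to 127, while the unsigned form spans 0 to 255.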

  • Packing characters into words

Most character sets contain far fewer symbols than the range of values a computer word can hold. Note that the 64 characters numbered 00 through 77 in two-digit octal require only 6 bits each, so on some early machines it became second nature to divide each 12-bit byte into two 6-bit fields and pack ten characters into each 60-bit word. Machines with different character sets and different word sizes pack characters differently, but the general practice is the same: allot a fixed number of bits per character and store the characters in sequence.
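The packing scheme described above can be sketched with Python's arbitrary-precision integers standing in for a 60-bit word; the character codes below are illustrative, not a real historical code table.

```python
# Pack ten 6-bit character codes into one 60-bit word and unpack them again.

def pack(codes):
    """Pack 6-bit codes into a single integer word, first code highest."""
    word = 0
    for code in codes:
        assert 0 <= code < 64, "a 6-bit code must lie in 0..63 (octal 00..77)"
        word = (word << 6) | code
    return word

def unpack(word, count):
    """Recover `count` 6-bit codes from a packed word."""
    fields = [(word >> (6 * i)) & 0o77 for i in range(count)]
    return list(reversed(fields))

codes = [0o01, 0o02, 0o77, 0o40, 0o00, 0o15, 0o26, 0o33, 0o07, 0o63]
word = pack(codes)
print(f"{word:060b}")            # ten 6-bit fields, 60 bits in all
print(unpack(word, 10) == codes)  # round trip recovers the original codes
```

The same shift-and-mask idiom works for any field width and word size; only the constants 6 and 60 change.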

Conclusion

The computer is a piece of electronic equipment. Each of its circuits either conducts electric current or does not, so it recognises only two states, like a light switch. It turns out that this is sufficient to make the whole scheme work: any system capable of representing at least two states can be used for data and character representation. Consider Morse code, used in telegraphy and other applications: it transmits sound as a short beep, represented by a dot, and a long beep, represented by a dash.


Frequently asked questions

Get answers to the most common queries related to the NTA Examination Preparation.

What is the visual representation of characters in a text?

Ans. To convert text into binary, a code may be created in which each number represents a single letter. One kind of...Read full

What is the best way to represent characters on a computer?

Ans. We use a coding system to represent characters, which is nothing more than a mapping function. Examples of stan...Read full

What is the definition of data and character representation?

Ans. Two-state devices are used to store and process data in computers, and they are used in both electronic and mec...Read full

Which is the most widely used method of representing characters?

Ans. ASCII (American Standard Code for Information Interchange) is a code for encoding English characters as integer...Read full