Introduction to Information Technology

Information technology is the discipline concerned with the application of computers and telecommunications equipment to store, retrieve, transmit, and manipulate data and information. It has developed rapidly in the 20th and 21st centuries. The term “information technology” is often used as a synonym for “computers and computer networks”, but the field is not confined to the computer and its accessories. The computer is the best means of processing and communicating data, but much other equipment serves as a medium of communication: the telephone, radio, television, fax machine, typewriter, printing press, and even the postal system. All of these were in use around the globe before the development of computer science, but the advent of the computer changed the whole scenario: its speed and precision in communication have left the other means far behind. Humans have striven for thousands of years to store, retrieve, and transfer data by ever faster means, and the computer is the outcome of those millennia of effort. Indeed, the development of the 20th century alone exceeds that of all the thousands of years before it.

History of Computer Technology:

Devices have been used to aid computation for thousands of years, probably beginning with the tally stick, which was used to count and record numbers and quantities.

The Antikythera mechanism, dating from about the beginning of the first century BC, is generally considered the earliest known mechanical analog computer, and the earliest known geared mechanism. Comparable geared devices did not appear in Europe until the 16th century, and it was not until 1645 that the first mechanical calculator capable of performing the four basic arithmetical operations was developed.

Electronic computers, using either relays or valves, began to appear in the early 1940s. The electromechanical Zuse Z3, completed in 1941, was the world’s first programmable computer, and by modern standards one of the first machines that could be considered a complete computing machine. Colossus, developed during the Second World War to decrypt German messages, was the first electronic digital computer. Although it was programmable, it was not general-purpose, being designed to perform only a single task. It also lacked the ability to store its program in memory; programming was carried out using plugs and switches to alter the internal wiring. The first recognizably modern electronic digital stored-program computer was the Manchester Small-Scale Experimental Machine (SSEM), which ran its first program on 21 June 1948.

The development of the transistor in the late 1940s at Bell Laboratories allowed a new generation of computers to be designed with greatly reduced power consumption. The first commercially available stored-program computer, the Ferranti Mark I, contained 4050 valves and had a power consumption of 25 kilowatts. By comparison, the first transistorized computer, developed at the University of Manchester and operational by November 1953, consumed only 150 watts in its final version.

Data Processing:

(Figure: a punched tape, as used by early computers to represent data.)

Data processing means the storage and evaluation of data, whether already present in the computer or supplied to it. Early electronic computers such as Colossus used punched tape, a long strip of paper on which data was represented by a series of holes, a technology now obsolete.
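
To make the idea concrete, here is a minimal sketch, in Python, of encoding text as punch patterns, one row of eight hole positions per character. It is purely illustrative and does not reproduce any historical tape format:

    # Illustrative only: render text as punched-tape rows,
    # with '*' marking a hole for each set bit of the character code.
    def to_punched_tape(text: str) -> list[str]:
        rows = []
        for ch in text:
            code = ord(ch)
            # Eight hole positions per row, most significant bit first.
            rows.append("".join("*" if code & (1 << i) else "." for i in range(7, -1, -1)))
        return rows

    for row in to_punched_tape("HI"):
        print(row)
    # Output:
    # .*..*...   ('H' = 0x48)
    # .*..*..*   ('I' = 0x49)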

Electronic data storage, as used in modern computers, dates back to World War II, when a form of delay-line memory was developed to remove the clutter from radar signals; its first practical application was the mercury delay line. The first random-access digital storage device was the Williams tube, based on a standard cathode ray tube. However, the information stored in it, as in delay-line memory, was volatile: it had to be continuously refreshed, and so was lost once power was removed. The earliest form of non-volatile computer storage was the magnetic drum, invented in 1932 and used in the Ferranti Mark 1, the world’s first commercially available general-purpose electronic computer.

IBM introduced the first hard disk drive in 1956, as part of their 305 RAMAC computer system; the technology is still in use today. Most digital data today is still stored magnetically on hard disks, or optically on media such as CD-ROMs. Until 2002 most data was stored on analog devices, but in that year digital storage capacity exceeded analog for the first time. As of 2007, approximately 94% of the data stored worldwide was held digitally: 52% on hard disks, 28% on optical devices, and 11% on digital magnetic tape. It is estimated that the worldwide capacity to store information on electronic devices grew from less than 3 exabytes in 1986 to 295 exabytes in 2007, doubling roughly every three years.
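
That doubling rate can be checked with a quick back-of-the-envelope calculation from the two endpoint figures quoted above:

    import math

    # Worldwide storage capacity in exabytes (figures from the text).
    capacity_1986, capacity_2007 = 3, 295
    years = 2007 - 1986

    doublings = math.log2(capacity_2007 / capacity_1986)  # about 6.6 doublings
    print(years / doublings)  # about 3.2 years per doubling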

More recent storage media, such as memory cards and USB flash drives, are now used by almost everyone in the world. These very small but very fast devices are the result of ongoing, sustained research in the field of information technology.

Database:

The need for a database, a means of storing and retrieving large amounts of data, was felt in the middle of the 20th century. Database management systems emerged in the 1960s to address this problem, making it possible to store and retrieve very large amounts of data quickly and accurately.

One of the earliest such systems was IBM’s Information Management System (IMS), which remained in widespread use more than 40 years later. IMS stores data hierarchically, but in the 1970s Ted Codd proposed an alternative relational model, based on set theory and predicate logic and the familiar concepts of tables, rows, and columns. The first commercially available relational database management system (RDBMS) was offered by Oracle in 1980.

All database management systems consist of a number of integrated components that allow the data they store to be accessed simultaneously by many users while maintaining its integrity. A characteristic of all databases is that the structure of the data they contain is defined and stored separately from the data itself, in a database schema.
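
A minimal sketch of that separation, using Python’s built-in sqlite3 module (the employee table and its columns are invented for illustration): the CREATE TABLE statement defines the schema, which SQLite stores apart from the rows themselves and can report back from its catalog.

    import sqlite3

    conn = sqlite3.connect(":memory:")  # throwaway in-memory database
    # The schema is declared separately from the data it will hold.
    conn.execute("CREATE TABLE employee (id INTEGER PRIMARY KEY, name TEXT, dept TEXT)")
    conn.executemany("INSERT INTO employee (name, dept) VALUES (?, ?)",
                     [("Ada", "Research"), ("Grace", "Systems")])
    # The stored schema can be read back independently of the rows.
    print(conn.execute("SELECT sql FROM sqlite_master WHERE name = 'employee'").fetchone()[0])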

Extensible Markup Language (XML) has become a popular format for data representation in recent years. Although XML data can be stored in ordinary file systems, it is commonly held in relational databases to take advantage of their “robust implementation verified by years of both theoretical and practical effort”. As an evolution of the Standard Generalized Markup Language (SGML), XML’s text-based structure offers the advantage of being both machine-readable and human-readable.
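
A small illustration of that dual readability, using Python’s standard xml.etree.ElementTree module (the document and its tag names are made up):

    import xml.etree.ElementTree as ET

    # Text that a person can read directly and a machine can parse.
    doc = """<catalog>
      <book id="1"><title>Data at Rest</title></book>
      <book id="2"><title>Data in Transit</title></book>
    </catalog>"""

    root = ET.fromstring(doc)
    for book in root.findall("book"):
        print(book.get("id"), book.find("title").text)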

Data Retrieval: 

The relational data model introduced a standard, programming-language-independent query language, SQL (Structured Query Language), based on relational algebra.
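
As a sketch of how SQL expresses relational-algebra operations, here is a query against the hypothetical employee table from the earlier example, again via Python’s sqlite3: selection (WHERE) restricts rows, while projection (the SELECT list) restricts columns.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE employee (id INTEGER PRIMARY KEY, name TEXT, dept TEXT)")
    conn.executemany("INSERT INTO employee (name, dept) VALUES (?, ?)",
                     [("Ada", "Research"), ("Grace", "Systems")])

    # Selection (WHERE) and projection (the SELECT list) in one query.
    for (name,) in conn.execute("SELECT name FROM employee WHERE dept = 'Research'"):
        print(name)  # Ada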

The terms “data” and “information” are not synonymous. Anything stored is data, but it becomes information only when it is organized and presented meaningfully. Most of the world’s digital data is unstructured, and is stored in a variety of different physical formats, even within a single organization. Data warehouses began to be developed in the 1980s to integrate these disparate stores. They typically contain data extracted from a variety of sources, including external sources such as the Internet, organized in such a way as to facilitate decision support systems (DSS).

Data Transmission:

Data transmission has three aspects: transmission, propagation, and reception. It can be broadly categorized as broadcasting, in which information is transmitted unidirectionally downstream, or telecommunications, with bidirectional upstream and downstream channels.

XML has been increasingly employed as a means of data interchange since the early 2000s, particularly for machine-oriented interactions such as those involved in web-oriented protocols like SOAP, describing “data-in-transit rather than … data-at-rest”. One of the challenges of such usage is converting data from relational databases into XML Document Object Model (DOM) structures.
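
A sketch of that conversion in one direction, building an XML DOM from rows as they might come back from a relational query, using Python’s standard xml.dom.minidom (the table and element names are invented):

    from xml.dom.minidom import Document

    # Hypothetical rows returned by a relational query.
    rows = [(1, "Ada", "Research"), (2, "Grace", "Systems")]

    doc = Document()
    employees = doc.createElement("employees")
    doc.appendChild(employees)
    for emp_id, name, dept in rows:
        node = doc.createElement("employee")
        node.setAttribute("id", str(emp_id))
        node.setAttribute("dept", dept)
        node.appendChild(doc.createTextNode(name))
        employees.appendChild(node)

    print(doc.toprettyxml(indent="  "))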

Data Manipulation:

Data manipulation means processing and organizing data to make it useful and easily readable. Hilbert and Lopez identify the exponential pace of technological change (a kind of Moore’s law): machines’ application-specific capacity to compute information per capita roughly doubled every 14 months between 1986 and 2007; the per capita capacity of the world’s general-purpose computers doubled every 18 months during the same two decades; the global telecommunication capacity per capita doubled every 34 months; the world’s storage capacity per capita required roughly 40 months to double (about every three years); and per capita broadcast information has doubled every 12.3 years.
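
For intuition, a doubling time given in months converts to an annual growth factor of 2^(12/months); a quick sketch using the figures above:

    # Annual growth factor implied by a doubling time given in months.
    def annual_growth(doubling_months: float) -> float:
        return 2 ** (12 / doubling_months)

    for label, months in [("application-specific compute", 14),
                          ("general-purpose compute", 18),
                          ("telecommunication capacity", 34)]:
        print(f"{label}: x{annual_growth(months):.2f} per year")
    # Roughly x1.81, x1.59, and x1.28 per year respectively.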

Massive amounts of data are stored worldwide every day, but unless they can be analyzed and presented effectively they essentially reside in what have been called data tombs: “data archives that are seldom visited”. To address that issue, the field of data mining – “the process of discovering interesting patterns and knowledge from large amounts of data” – emerged in the late 1980s.
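
As a toy illustration of pattern discovery (frequent pair counting over made-up shopping baskets, not a production data-mining algorithm):

    from collections import Counter
    from itertools import combinations

    # Made-up market-basket transactions.
    transactions = [{"bread", "milk"}, {"bread", "butter", "milk"},
                    {"butter", "milk"}, {"bread", "butter", "milk"}]

    # Count how often each pair of items is bought together.
    pair_counts = Counter()
    for basket in transactions:
        pair_counts.update(combinations(sorted(basket), 2))

    for pair, count in pair_counts.most_common(3):
        print(pair, count)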

 
