Tuesday, August 28, 2012

Computer Information Technology

   
    A computer is a general purpose device that can be programmed to carry out a finite set of arithmetic or logical operations. Since a sequence of operations can be readily changed, the computer can solve more than one kind of problem.
    The first electronic digital computers were developed between 1940 and 1945 in the United Kingdom and United States. Originally they were the size of a large room, consuming as much power as several hundred modern personal computers (PCs).[1] In this era mechanical analog computers were used for military applications.
Modern computers based on integrated circuits are millions to billions of times more capable than the early machines, and occupy a fraction of the space. Simple computers are small enough to fit into mobile devices, and mobile computers can be powered by small batteries. Personal computers in their various forms are icons of the Information Age and are what most people think of as "computers". However, the embedded computers found in many devices from mp3 players to fighter aircraft and from toys to industrial robots are the most numerous.
    In the late 1940s, scientists developed the first machines that could store and use encoded instructions, or programs. Before this, users had to rewire the computer's basic hardware to set up each new computation. The stored-program innovation dramatically increased the flexibility and usefulness of computers: instead of rewiring the machine, users could simply direct the computer to perform one of the functions held in its memory.
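The stored-program idea described above can be sketched as a toy interpreter. This is a hypothetical illustration, not a model of any historical machine: instructions and data live in the same memory, so changing the program means changing memory contents rather than rewiring hardware.

```python
def run(memory):
    """Execute a program stored in memory (instruction names are invented)."""
    acc = 0  # single accumulator register
    pc = 0   # program counter: index of the next instruction in memory
    while True:
        op, arg = memory[pc]
        pc += 1
        if op == "LOAD":    # set the accumulator to a constant
            acc = arg
        elif op == "ADD":   # add the value stored at memory address arg
            acc += memory[arg]
        elif op == "HALT":  # stop and return the result
            return acc

# Program and data share one memory: cells 0-2 hold code, cell 3 holds data.
memory = [("LOAD", 2), ("ADD", 3), ("HALT", None), 40]
print(run(memory))  # 42
```

Changing the behaviour of this machine only requires writing different tuples into `memory`, which is exactly the flexibility the stored-program concept introduced.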
    Because computer technology has many diversified subfields, different roles are defined for the different sets of actions and activities that cover the core technology of computing. The task of adapting an application program designed to run on one operating system to another computer system, for instance, is often technically complex and costly.
    Computer technology is proving to be a necessity in the coming years, and it is surely one of the great means of entertainment: a single magical box that contains almost every available communication service.

   I keep saying it. If you learn something with technology – pass it forward. Pass it on. Share it with someone. It’s the only way. It joins the dots and fills in the learning gaps. At school we are getting better and better at this and there is a noticeable lift in confidence and competence and then that real high you get from implementing something good in a classroom. Tips?
   1. If you don’t know how to do something – ask!
   2. Plan one on one or small group meetings where you are going to share what you know. Our faculty has planned to share its individual knowledge openly at meetings in the last week of term. That way we’ll all have a bigger pool of expertise to draw on.
   3. Our technicians are very patient. They make sure each individual can do whatever it is they want to learn to do.
   4. Our resource centre has implemented a media server and that means we are all now talking about how to use it and the resource centre staff are being rewarded for the time and effort they are putting into it. Everyone is in the conversation.
   5. Students love to share. Teach them to ask and show too and include them in the learning loop.
   6. Make technical resources available and let some people get good at them and then facilitate the learning of others. Spread the expertise by snowballing.
   7. Show people what you do. Invite them to look and see. It generates some good learning conversations about how to use technology in a school.
   8. There are no dumb questions. No stupid questions. Just things people don’t know or cannot do.
   9. Use technology to circulate information. It keeps everyone in the loop and gets them to be a part of the learning curve.
  10. Enthusiasm is a wonderful thing. If you are keen, it is very infectious and breaks down barriers.

First general purpose computers:


 In 1801, Joseph Marie Jacquard made an improvement to the textile loom by introducing a series of punched paper cards as a template which allowed his loom to weave intricate patterns automatically. The resulting Jacquard loom was an important step in the development of computers because the use of punched cards to define woven patterns can be viewed as an early, albeit limited, form of programmability.

    It was the fusion of automatic calculation with programmability that produced the first recognizable computers. In 1837, Charles Babbage was the first to conceptualize and design a fully programmable mechanical computer, his analytical engine. Limited finances and Babbage's inability to resist tinkering with the design meant that the device was never completed; nevertheless, his son, Henry Babbage, completed a simplified version of the analytical engine's computing unit (the mill) in 1888. He gave a successful demonstration of its use in computing tables in 1906. This machine was given to the Science Museum in South Kensington in 1910.

      A succession of steadily more powerful and flexible computing devices were constructed in the 1930s and 1940s, gradually adding the key features that are seen in modern computers. The use of digital electronics (largely invented by Claude Shannon in 1937) and more flexible programmability were vitally important steps, but defining one point along this road as "the first digital electronic computer" is difficult (Shannon 1940). Notable achievements include:

     1. Konrad Zuse's electromechanical "Z machines". The Z3 (1941) was the first working machine featuring binary arithmetic, including floating-point arithmetic, and a measure of programmability. In 1998 the Z3 was shown to be Turing complete in principle, making it arguably the world's first operational general-purpose computer.
    2. The non-programmable Atanasoff–Berry Computer (commenced in 1937, completed in 1941) which used vacuum tube based computation, binary numbers, and regenerative capacitor memory. The use of regenerative memory allowed it to be much more compact than its peers (being approximately the size of a large desk or workbench), since intermediate results could be stored and then fed back into the same set of computation elements.
    3. The secret British Colossus computers (1943), which had limited programmability but demonstrated that a device using thousands of tubes could be reasonably reliable and electronically reprogrammable. It was used for breaking German wartime codes.
   4. The Harvard Mark I (1944), a large-scale electromechanical computer with limited programmability.


   5. The U.S. Army's Ballistic Research Laboratory ENIAC (1946), which used decimal arithmetic and is sometimes called the first general purpose electronic computer (since Konrad Zuse's Z3 of 1941 used electromagnets instead of electronics). Initially, however, ENIAC had an inflexible architecture which essentially required rewiring to change its programming.

