Written by admin on April 6th, 2011
Apple has been a major driving force in the consumer technology market for years now. Thanks to its innovative products and creative approach to the digital age, Apple has carved out a unique niche and created a trend of its own. The names iPod, iPhone, iPad and Mac have become household names. However, before Apple’s booming popularity in the world of mobile phones and mp3 players, it had already established itself as a competitive computer manufacturer. The evolution of Apple is therefore an interesting story to tell. Let’s take a look at some of Apple’s ventures and get to know some of your Mac’s ancestors.
Back in its early days, Apple offered “all-in-one” computers, sold as packages with the monitor included, though customers could choose among several monitor options for the same base unit. Some of the earliest all-in-one computers were the Apple II and Apple III. The Apple II was released in 1977, around the same time the famous rainbow Apple logo was conceived. Steve Jobs figured that color should be added to the logo to emphasize the superior color capability of the Apple II, and this marked the beginning of Apple’s distinctiveness from its competitors. The Apple III was released to cater to business customers and compete with IBM. In 1983, Apple introduced the Apple IIe, which eliminated the numeric keypad and came with a built-in keyboard.
Venture into Portability and Desktop Computers
1984 signaled the start of Apple’s focus on portability along with “out-of-the-box” functionality. The company introduced the Apple IIc, although this edition still lacked a portable power supply. In the same year, the name Macintosh opened a new era, not just for Apple but for the whole world. Macintosh started it all: Steve Jobs’ demonstration of the very first Macintosh instigated the explosion of Apple’s popularity and introduced many of the features that people still enjoy today. Apple then released the Apple IIGS and the Macintosh Plus in 1986. The IIGS came with Steve Wozniak’s signature silkscreened on the front. The Macintosh Plus originally came in beige, but the color was later changed to gray, the trademark color for many Apple computers in the years to come. Following the Macintosh series, the Macintosh II was introduced. It was the first modular computer from Apple, and it marked the beginning of desktop computers for the company. The following outlines Apple’s progression in the computer manufacturing industry:
Macintosh SE (1987). With this desktop model, Apple began adding advanced SCSI support and an internal hard disk to its computers.
Macintosh Color Classic (1993). Apple finally put out its first color compact computer.
iMac G3 Tray-Loading Bondi Blue (1998). Another computer-revolution catalyst from Apple, this iconic, streamlined desktop did away with the tower while keeping the computing capacity of a full machine.
2000 – Present
Through the 2000s, Apple invested heavily in computing speed, eventually moving from PowerPC chips to Intel processors. From the Power Mac G5 (2003) to the Mac Pro (2006), Apple continually emphasized its dedication to efficiency and fast computing. The early 2000s also saw the company’s ongoing venture into the laptop industry. In 2006, Apple introduced the famous MacBook, which became an instant hit. In 2008, the company released the revolutionary MacBook Air, so thin it fit inside a paper envelope. The same period also saw the debut of Apple’s LCD offerings such as the Apple Cinema Display. Nonetheless, among the countless revolutionary products Apple introduced, it was the iPod that set the company a cut above its competitors. From the first-generation iPod Classic (2001) to the Shuffle, Nano and iPod touch, Apple changed the way people listen to music. And in 2007, every tech aficionado’s must-have gadget was released: the iPhone. The iPhone received such immense popularity and positive reception that it proved to be a strong competitor to BlackBerry. To date, Apple is still bringing the consumer market groundbreaking products like the iPad, pushing the boundaries of our understanding of computers. More than three decades after its founding, Apple has truly become a force to be reckoned with.
Written by admin on April 6th, 2011
Our modern life is characterized by the digital age of computers. Regardless of how people view them, computers have become an integral part of society, one we rely on daily. Today’s computers provide ample functionality and capability. They no longer simply compute; they let people shop, communicate with friends, create documents, manage financial transactions, and hold global conferences. Nonetheless, it is well worth taking a trip down memory lane from time to time to see how it all began.
First Generation (1945-1956)
At the start of the Second World War, governments of different nations aimed to develop computers in order to devise effective war strategies and increase their chances of success. Thanks to increased funding, the technology underwent rapid development. In 1941, Germany developed a computer named the Z3. The machine, created by Konrad Zuse, was used to design missiles and airplanes. Meanwhile, the Allied forces embarked on their own computer development programs and produced extremely powerful machines of their own. In 1943, the British built the Colossus, a computer that could break German codes. However, the overall impact of the technology during this period was still limited.
Second Generation Computers (1956-1963)
The transistor was invented in 1947, revolutionizing computer development. In place of the large vacuum tubes used previously, the transistor provided a smaller, less cumbersome alternative, and the size of the computer began to shrink dramatically. Transistors were incorporated into computers from 1956 onward. Together with advancements in magnetic-core memory, they gave rise to a new generation of computers that were smaller, more reliable, much faster and more energy-efficient than their predecessors.
Third Generation Computers (1964-1971)
While transistors were a major breakthrough and gave second-generation computers many advantages, they still had drawbacks, chief among them the great deal of heat they generated, which could damage other computer parts. Quartz provided an effective alternative: in 1958, Jack Kilby of Texas Instruments invented the integrated circuit, or IC, which combined three electronic components on a small silicon disc made from quartz. Later on, scientists devised ways to squeeze ever more components onto a single chip, known as a semiconductor. Third-generation computers were not just built on semiconductors; they also introduced the use of operating systems, which allowed computers to run a range of programs and functions.
Fourth Generation (1971-Present)
Following the introduction of integrated circuits, computer manufacturers were able to reduce the size of the computer even further. Large-scale integration (LSI) allowed hundreds of components to fit onto a single small chip, and ultra-large-scale integration (ULSI) later pushed that to millions of components per chip. Because so many components could fit on one chip, computers shrank dramatically, and the denser chips also brought greater reliability, efficiency and power. It was Intel’s 4004 chip, introduced in 1971, that ushered in this new era: a single chip that held the memory, the central processing unit, and the input and output controls.
Fifth Generation (Present and Beyond)
The fifth generation is hard to describe because its progress is exponential compared to that of earlier generations. New computers are developed at a pace that outruns even a market as demanding as ours. Today computers come in all sizes, are embedded inside mobile phones, and appear in applications that were previously only dreamt of. Furthermore, computers are now equipped with artificial intelligence and a whole range of applications that can be customized to the preferences of the user. They are also densely connected in vast networks thanks to the internet.
Written by admin on April 6th, 2011
People today can’t seem to function without the Internet. It’s as if everything is connected to the World Wide Web, and our lives increasingly revolve around it. Whether we are at home, at school, or at work, the internet has a constant presence in our daily lives. Most devices, like cell phones, are now connected to the internet, so it touches virtually every aspect of one’s life. While accessing the internet is commonplace nowadays, it hasn’t always been like this. Considerable changes and advancements were made throughout history to produce a platform of this scale and accessibility. The following sections outline some of the most important milestones in the history of the internet. While they may seem like ancient history, knowing them helps us appreciate the internet more. The internet is not an overnight sensation; on the contrary, it took a long time to get where it is today.
In 1957, the USSR developed and launched the first artificial satellite, Sputnik. In response, the United States created the Advanced Research Projects Agency, or ARPA, under the Department of Defense (DoD). ARPA’s purpose was to help the United States gain a lead in science and technology. The launch was a catalyst that spawned many more science and technology ventures and opened a gateway into the world of networks and, eventually, the internet.
In the early 1960s, Paul Baran was hired by the U.S. Air Force to research how to maintain command and control over the country’s bombers and missiles after a nuclear attack. Baran, working at the RAND Corporation, was commissioned to study how to build a military research network that could survive a nuclear strike. The military also wanted the network to be decentralized, so that even if locations in the United States were attacked, commanders would still retain control of their nuclear arms and be able to mount a counter-attack. Baran’s research recommended several ways to make such a network work; his final proposal described a packet-switched network. Essentially, this means separating data into packets, or datagrams, each labeled with its origin and destination, so that the packets can be forwarded independently from one computer to another.
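To make packet switching concrete, here is a minimal Python sketch of the idea. It is purely illustrative: the Packet class, node names and forwarding table are invented for the example and have nothing to do with Baran’s actual designs. A message is split into labeled datagrams, and each one is passed from node to node until it reaches its destination.

```python
from dataclasses import dataclass

@dataclass
class Packet:
    """One datagram: a chunk of the message plus its origin/destination labels."""
    source: str
    destination: str
    sequence: int
    payload: str

def packetize(message, source, destination, size=8):
    """Split a message into small packets that can travel independently."""
    count = (len(message) + size - 1) // size
    return [Packet(source, destination, i, message[i * size:(i + 1) * size])
            for i in range(count)]

# Hypothetical forwarding table: each node only knows its next hop.
NEXT_HOP = {"UCLA": "SRI", "SRI": "UTAH", "UTAH": "UCSB"}

def route(packet, start):
    """Forward a packet node to node until it reaches its destination."""
    node, hops = start, [start]
    while node != packet.destination:
        node = NEXT_HOP[node]
        hops.append(node)
    return hops

for p in packetize("LOGIN REQUEST", source="UCLA", destination="UCSB"):
    print(p.sequence, route(p, "UCLA"), repr(p.payload))
```

Because each packet carries its own labels, any node along the way can decide where to send it next, which is exactly what makes a decentralized network possible.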
In 1969, the acoustics consulting firm Bolt, Beranek and Newman (BBN) won the ARPANET contract to build the first decentralized network for ARPA. The actual physical network was built during this time, connecting four nodes: the University of California at Santa Barbara, the University of California at Los Angeles, the University of Utah, and the Stanford Research Institute (SRI). The nodes were linked using 50 Kbps (kilobits per second) circuits.
In 1971, Ray Tomlinson, working at BBN, wrote the first e-mail program. At this time the ARPANET used the Network Control Protocol, or NCP, to transmit data, which allowed communication between hosts on the same network.
In 1973, DARPA (the Defense Advanced Research Projects Agency) began developing TCP (Transmission Control Protocol) and IP (Internet Protocol). These protocols were crucial in letting computers communicate with each other despite belonging to different networks.
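As a rough present-day illustration of what TCP/IP makes possible, the short sketch below uses Python’s standard socket library, not DARPA’s original implementation; the loopback address and port number are arbitrary choices for the demo. It opens a TCP connection to a tiny local echo server and reads back the bytes it sent.

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 50007   # arbitrary loopback address and port for the demo
ready = threading.Event()

def echo_server():
    """A tiny TCP server that accepts one connection and echoes what it receives."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        ready.set()                   # signal that the server is listening
        conn, _addr = srv.accept()
        with conn:
            conn.sendall(conn.recv(1024))

# Start the server in the background, then talk to it over TCP/IP.
threading.Thread(target=echo_server, daemon=True).start()
ready.wait()

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as client:
    client.connect((HOST, PORT))
    client.sendall(b"hello over TCP/IP")
    print(client.recv(1024))          # -> b'hello over TCP/IP'
```

The same code works whether the two endpoints sit on one machine or on entirely different networks, which is the point of the protocol suite.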
In 1981, the National Science Foundation developed a backbone referred to as CSNET (the Computer Science Network), a 56 Kbps network that allowed institutions to connect to the network even without access to ARPANET.
In 1983, the Internet Activities Board (IAB) was created, and machines on the ARPANET switched over to TCP/IP, which became the Internet’s central protocol. The use of domain names and IP numbers also began during this period.
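To see the relationship between domain names and IP numbers in today’s terms, a name can be resolved to its numeric address with one call from Python’s standard library; this is a simple sketch, and “example.com” is just a placeholder hostname.

```python
import socket

# Resolve a human-readable domain name to its numeric IP address.
hostname = "example.com"            # placeholder host; any reachable name works
print(hostname, "->", socket.gethostbyname(hostname))
```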
In 1990, Advanced Network & Services (ANS) was formed, a non-profit organization dedicated to research on high-speed networking. A new 45 Mbps backbone was later introduced, and the original ARPANET lines were taken out of service.
In 1991, CSNET was retired and a new network was established: the National Research and Education Network, or NREN. This network was built specifically to support the huge amounts of data being transferred over the internet.
The late 1990s and early 2000s saw the era of dot-com sites. The internet reached millions of users, and investors began taking a serious interest in the platform.