Take a look at your computer, the one you're using to read this blog. Whether you're using an old desktop or the latest tablet, you're using technology that was barely imaginable in 1971. Why did I pick 1971? Because that's when the modern computer age truly dawned. And it happened almost by accident.
I'll provide links to other sites that tell the story far better than I could, but here's the CliffsNotes version:
In 1969, a Japanese company called Busicom wanted to build a new electronic calculator. They'd designed a set of chips to do the job, and had contracted with the fledgling company Intel to make them. After accepting the job, Intel balked, believing the design was too complicated for their capabilities. Instead, Intel offered a simpler design built around a single-chip central processing unit. Busicom accepted, and in January 1971 the first chips were tested in the lab.
Although Busicom owned the rights to the design, business was poor, and Busicom asked Intel to reduce the cost of the chips. Intel agreed, on the condition that it obtain the rights to sell the chips for applications other than calculators. In November 1971 Intel announced the commercial availability of the world's first single-chip CPU: the Intel 4004 microprocessor.
OK, that's all very interesting (*yawn*), but why yet another blog devoted to some ancient chip no one uses anymore? Especially one that runs at one four-thousandth the speed of a modern gaming PC?
In 2006, for the 4004's 35th anniversary, Intel released the schematics of the 4004 under a Creative Commons license. And the 4004 contains only about 2,300 transistors, versus more than a billion in the quad-core Intel Core i7 on my desktop. That makes it possible to study the 4004's design in ways you simply can't with a modern CPU.
And most importantly for this blog, it's possible for a determined (or slightly insane) hobbyist to build one from discrete components.