Part 2 of a 3-part video mini-series on How Computers Work for people with no computer or electronics engineering background. Content is presented with ...
+Ryan R Some complexity is necessary to make the circuit scalable, i.e. so it can be combined to handle numbers > 1. Basically, it all depends on whether you want to make it usable in a general-purpose programmable computer, or merely make a 'toy' for a hobby or a very specific application.
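For anyone curious how a single-bit circuit scales up, the standard construction is the ripple-carry adder: chain one full adder per bit, feeding each carry-out into the next carry-in. A minimal software sketch of that idea (illustrative only, not something from the video):

```python
# A 1-bit full adder: two input bits plus a carry-in give a sum bit
# and a carry-out. Chaining N of them yields an N-bit ripple-carry adder.

def full_adder(a, b, carry_in):
    total = a + b + carry_in
    return total % 2, total // 2  # (sum bit, carry-out)

def ripple_carry_add(x, y, bits=8):
    """Add two non-negative ints using only 1-bit full adders."""
    carry, result = 0, 0
    for i in range(bits):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i
    return result  # overflow past `bits` is discarded, as in hardware

print(ripple_carry_add(100, 55))  # 155
```

The same structure scales to any width just by adding more full adders, which is exactly the "combine to handle numbers > 1" point above.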
Ah, the process you are looking for is called photolithography. I'm not familiar with the process, so I'd suggest looking it up online and seeing what you find. But my simplistic understanding would suggest that a stencil is made at a workable scale for each of the layers of the circuit (it's built up in 3 dimensions using multiple 2D layers). Some kind of light is used, in combination with lenses, to shine the pattern of the stencil onto a much smaller silicon wafer. This etches out or modifies the surface of the wafer, but only where there are corresponding holes / transparent regions within the stencil. By repeating the process, it's effectively possible to etch very small circuit components in 3D into a silicon wafer. As I said, it's a simplistic explanation and I'm not really familiar with the process, but hopefully it helps to illustrate what's possible. It's kind of like how photographs are enlarged, except in reverse, and the light itself is used to make physical changes to the surface (think of photographic film, or lasers). Hopefully that goes some way to answering your question!
Thanks for producing these videos; very helpful. Could I ask you a
question? Do you think that there would be any benefit in trying to learn
Assembly Language programming, nowadays? Thanks.
Hi, sorry for the late reply. It really depends on what you're trying to achieve. If you want a better understanding of how computers work, then I'd highly recommend doing a little assembly programming just to get a taste of it. If you're programming business applications, it's probably a waste of productive programming hours.
How Computers Work: A Journey Into the Walk-Through Computer, hosted by David Neil
Recorded 1990 How Computers Work: A Journey Into the Walk-Through Computer is an educational video produced by The Computer Museum and hosted by ...
"Yeah man, fuckin' A! Computers are so lame! Like this one time I was
rushin' to get my essay done, an' mah computer just totally psych'ed out on
me! I was totally spazzin' out until my mom came in and told me to take a
chill pill and watch this video and I was all like 'Fer shur penis-breath,
just gag me with a spoon'." Whoa, I just had an 80's flashback!
I was looking for a video to cover some computer basics for a college
course. I don't think this one is going to be too helpful, but it's so
entertaining, and it reminds me of how thrilled I was to discover that a
program could be stored on this "floppy disk" thing that you could stick
inside the little black-and-white Macs we used. Oh, and their clothes... :D
I can't believe the sexism and racism of this video. The blacks and girls
have all the right answers and the white kid is just a hooligan. We all
know that the Asian kids build all this stuff and they don't get a look-in.
[I'm not an American so I understand irony]
There is no continent called "Middle America". North America includes every
country from Canada to Panama. This is why I home school my children:
because the public fool system does nothing but produce functional
illiterates and retards.
I remember that thing quite well! The silver wall inside the entrance, at
left, is supposed to represent the power supply. And blended in w/ it,
there was a door -- this led to the men's room. Think it's visible around
24:04.
Wow, a flashback to the 80's. I found it funny at the end when he talked about
putting the computer back together again. I don't think the hard drive would
work after being opened, or the CPU after being touched by everyone.
Hhhhahahaha! In high school my computers teacher played this video for my
class. It was so state of the art!
Pretty much it's all the same today, but on an even smaller scale and
processes can go even faster.
Man, I can't believe we used to dress like that. I'm surprised that there
aren't any mushroom cuts :P Oh, the 90's: the music and video games were
awesome, but we dressed like tools.
Yeah, this computer's old, but it's still a computer and can give you a good
idea of how these things work. How much space and RAM you think was on here,
like 5GB and 30MB rofl
The golden age of the computer scientist... I'm a CS student but I don't get
much of a lesson like this; mostly programming and abstraction in
pseudocode.
How Computers Work: Programming (Part III)
Part 3 of a 3-part video mini-series on How Computers Work for people with no computer or electronics engineering background. Content is presented with ...
Good on ya Joshy. Even though you're a bit of a dork, and I didn't even intend
on learning about computers, I found your mini-series very informative and
encourage you to keep up the good work and keep teaching the masses useful
shit. Well done bud.
+Joshua Hawcroft Very informative... I like how you explain in simple language... how simple life would be if all the books were written with such simplicity
+hornygoatify Thanks for the kind words mate! :) Yes, 'overclocking', or pushing the processor beyond its rated speed, is generally not a great idea - unless you know what you're doing. There are some caveats. There are two main problems as I understand it. 1) Heat. Increasing processor speed tends to increase the amount of power used (as you have suggested) and that energy has to go somewhere. Basically, the energy you put into the processor ends up being radiated as heat. If the processor is not sufficiently cooled, you will cook it. There will also be temperatures above which you shorten its life. If you want details, I would suggest looking at Intel processors, which have been the workhorses of most PCs for decades. 2) Speed cannot be increased beyond a certain point because it is physically impossible for the signals to propagate around the processor fast enough. This ceiling depends on the precise configuration and fabrication of the chip. Gates do not switch on/off instantly - although it may seem like it to us humans. Likewise, electricity isn't instant. Push a processor above this ceiling and it will become dysfunctional, at least until you reduce the speed - it will behave like an insane person. Hope that helps!
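For what it's worth, the heat point can be made concrete with the usual first-order model of CMOS dynamic power, P ≈ C × V² × f (effective capacitance, supply voltage, clock frequency). The numbers below are made up for illustration, not measurements of any real chip:

```python
def dynamic_power(c_eff, voltage, freq_hz):
    """First-order CMOS dynamic power estimate: P = C * V^2 * f (watts)."""
    return c_eff * voltage**2 * freq_hz

# Hypothetical chip: 1 nF effective capacitance, 1.2 V, 3.0 GHz stock clock.
base = dynamic_power(1e-9, 1.2, 3.0e9)

# Overclocking usually needs a voltage bump too, so power (and heat) grows
# faster than the clock: here +20% clock with +10% voltage.
oc = dynamic_power(1e-9, 1.32, 3.6e9)

print(f"{oc / base:.2f}x the heat for 1.2x the speed")  # 1.45x the heat
```

Because voltage enters squared, even a modest voltage increase multiplies the heat the cooler has to remove, which is why point 1) above bites so quickly.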
Also, I heard somewhere it's possible to speed up the processor clock to increase processor speed, at the expense of sucking more juice from the power supply. Could this be detrimental to the longevity of my machine? And could you quickly explain, or link me to, some information that could show me how? Thanks Joshy
So, the adder... If you use a 32-bit or 64-bit processor, does a 32-bit
computer take two times longer to do the same (long) calculation as a 64-bit
computer, because of half the number of adders? Or are the adders not
the thing that separates a 32-bit from a 64-bit computer?
Hi. It depends on a great many factors, but the simple answer is that a 32-bit processor of an identical architecture to a 64-bit processor would necessarily take longer to perform a 64-bit calculation, since it would have to be performed in separate steps/operations. In reality, there are more factors at play, such as pipelining, clock speed, architecture, scale, etc. So yes, the 'bitness' of a processor has quite a lot to do with the size of the numbers it operates on natively.
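Those 'separate steps/operations' can be sketched as two narrow additions linked by a carry, which is roughly how a 32-bit processor adds 64-bit numbers. A toy illustration in Python, not tied to any particular instruction set:

```python
MASK32 = 0xFFFFFFFF  # lowest 32 bits

def add64_on_32bit(x, y):
    """Add two 64-bit values using only 32-bit-wide additions plus a carry."""
    lo = (x & MASK32) + (y & MASK32)        # step 1: add the low halves
    carry = lo >> 32                        # did the low addition overflow?
    hi = (x >> 32) + (y >> 32) + carry      # step 2: high halves + carry
    return ((hi << 32) | (lo & MASK32)) & 0xFFFFFFFFFFFFFFFF

print(hex(add64_on_32bit(0xFFFFFFFF, 1)))  # 0x100000000 - carry propagated
```

A 64-bit processor does the whole thing in one instruction; the 32-bit machine needs both steps, which is where the extra time comes from.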
Thanks Joshua. I found your videos useful in building my understanding of computers. I am studying for CompTIA A+. I already have my Network+. Thanks again.