Pocket calculators used to cut down on transistor count by performing arithmetic operations serially. I know this is a thing, but not the details. And of course there are such things as bit-slice CPUs.
Taking this from the larger to the smaller scale, this is what we have:
1. At the largest scale, there's a CPU which appears to have a certain word length - say 64 bits. However, it in fact consists logically of an array of 32 two-bit processors. Physically, however, it consists of a single two-bit processor used thirty-two times in succession, once per slice, with the inputs and outputs temporarily stored so as to look to the outside as if it's a sixty-four-bit processor (there's a sketch of this below). I would say this could be dealt with by some kind of shift-register type thingy, but that would involve lots of logic gates, and I want to avoid that, because that means more switching elements.
2. At a smaller scale, the ALU and registers, such as they are (probably just a stack pointer and program counter), are also serial. As with the PDP-11, memory-to-memory addressing is used and any other registers are in memory, including flags. Instructions are also decoded in series.
3. At a smaller scale still, each logic gate is only virtually realised. Whereas there may appear to be a two-input binary adder consisting of a few logic gates, again they are working in serial with delays, and in fact consist of the same gates used repeatedly, pretending to be other gates.
4. Still smaller, they are of course realised using NAND or NOR gates in various configurations, and naturally these are all the same logic gate reused.
5. Below the scale even of logic gates, the switching element itself is just a valve (or transistor if you insist) reused in various ways in order to realise the logical equivalent of a NAND or NOR gate, depending on which way you want to go.
Ultimately, the CPU consists of a single valve plugged into a socket which connects with itself in various ways using various other components arranged very cleverly, probably mainly capacitors and resistors. But to a programmer, it just looks like a normal 64-bit CPU like any other.
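A minimal sketch in Python of how point 1 might behave from the outside, assuming (purely for illustration) that the only state carried between passes is the 2-bit partial result and the carry: one tiny adder applied thirty-two times, looking like a 64-bit add to the programmer.

```python
def add64_with_2bit_slices(a, b):
    """Add two 64-bit numbers with a single 2-bit adder applied 32 times in
    series - an illustrative sketch, not the idea's actual circuitry."""
    result, carry = 0, 0
    for slice_idx in range(32):                  # 32 passes of the same tiny ALU
        a_bits = (a >> (2 * slice_idx)) & 0b11   # present the next 2-bit slice
        b_bits = (b >> (2 * slice_idx)) & 0b11
        total = a_bits + b_bits + carry          # the 2-bit addition itself
        result |= (total & 0b11) << (2 * slice_idx)
        carry = total >> 2                       # carry is the only state kept
    return result & ((1 << 64) - 1)              # wrap like a real 64-bit register

assert add64_with_2bit_slices(123456789, 987654321) == 123456789 + 987654321
assert add64_with_2bit_slices(2**63, 2**63) == 0  # overflow wraps, as a real 64-bit CPU would
```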
Why a valve? Well, obviously retro styling, but also it means you can just plug it in and let it glow. When it finally blows because of all the turning on and off, you get another one.
It would probably be a good idea to stock up in advance with a couple of million or so valves so it lasts more than about a minute. Maybe there should be an automated VLSI-based system for replacing them.
Colossus computer
https://en.wikipedi...i/Colossus_computer "The first Mark 2 Colossus, containing 2400 valves ..." [8th of 7, Jan 31 2018]
Switching speeds of valves are vastly lower than switching speeds of solid-state electronics. I expect the valve to generally last quite a bit longer than a minute. However, the real question is, with that slow switching speed in mind, and given how much work it has to do to emulate a 64-bit processor, can it last long enough to get any particular job finished? |
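A rough back-of-the-envelope on that question (every figure below is an assumption chosen for illustration, not data about any real valve):

```python
# All numbers here are illustrative assumptions, not measurements.
valve_switch_rate_hz = 100_000       # assumed usable switching rate of one valve
valve_life_hours = 10_000            # assumed service life of the valve
switchings_per_64bit_op = 50_000     # assumed cost of serially emulating one operation

ops_per_second = valve_switch_rate_hz / switchings_per_64bit_op
lifetime_ops = ops_per_second * valve_life_hours * 3600

print(f"{ops_per_second:.1f} 64-bit operations per second")     # 2.0 under these guesses
print(f"{lifetime_ops:.0f} operations before the valve blows")  # ~72 million
```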
(For those who don't know, likely Americans unaccustomed to this particular Britishism, a valve is a vacuum tube.) |
Even with a complex thyratron/dekatron valve, there's no way that a single tube can ever approach the necessary complexity. |
It might be possible - in theory - to build a "computational element" using thermionic emission in a large vacuum chamber (glass, or stainless steel) then pump it down to a level where it would work - but one failure would mean back to square one. Plus, even using a digital regime, isolation between elements would be a serious problem. |
To build a basic flip-flop, you essentially need two valves, although there are a few specialist "logic" types. |
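For anyone wondering why two is the usual minimum, here's a toy sketch (Python, not valves) of the cross-coupled pair behind a set-reset flip-flop: each element's output feeds the other's input, and that feedback loop is what holds the bit.

```python
def sr_latch(s, r, q=0, qn=1, settle=4):
    """Two cross-coupled NOR elements, iterated a few times so the feedback settles."""
    for _ in range(settle):
        q, qn = int(not (r or qn)), int(not (s or q))
    return q, qn

q, qn = sr_latch(s=1, r=0)               # set: q becomes 1
q, qn = sr_latch(s=0, r=0, q=q, qn=qn)   # hold: the bit is remembered
print(q)  # 1
```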
Building any sort of computational device using a single off-the-shelf valve is impractical. And very halfbaked. |
How would you add with such a device? |
Some sort of really fragile abacus ? |
The problem is memory: if the output of the gate in its previous state has to be fed into the next state, it needs to be saved somewhere, and memory normally means flip-flops, which are made from gates. And if you have memory, you already have enough to build logic, since a single unit of memory can function as a lookup table. This is the basis of FPGAs. |
I like the idea of memory as logic: an n-bit-wide memory with an m-bit address space can function as a lookup table mapping m input bits to n output bits. And all combinational logic can be represented by a lookup table mapping inputs to outputs. |
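For instance, here's a toy version of that in Python, along the lines of an FPGA LUT: a 3-input, 2-output table that behaves exactly like a full adder, with no gates evaluated at lookup time. The names are mine, purely for illustration.

```python
# Address bits: (a, b, carry_in); stored word: (sum, carry_out).
FULL_ADDER_LUT = {
    (a, b, cin): (a ^ b ^ cin, (a & b) | (cin & (a ^ b)))
    for a in (0, 1) for b in (0, 1) for cin in (0, 1)
}

def full_add(a, b, cin):
    return FULL_ADDER_LUT[(a, b, cin)]   # pure memory lookup, no logic gates evaluated

print(full_add(1, 1, 0))  # (0, 1): 1 + 1 = 10 in binary
```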
Instead of wasting logic elements on building memory, why not go Seriously Serial and fit a delay-line memory? This would need at least one tube to amplify and retransmit the pulses, but if the delay line was long enough you could store a fairly substantial lookup table. You can build, for instance, an ALU using just combinational logic, and perhaps with multiple delay lines feeding the same tube you could store input data and opcodes, I don't know. You would also need some magical gate-free way of starting the system up to initially load all the data. |
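A toy sketch of the recirculating part (Python, and of course nothing like a real mercury or wire delay line): bits march through a fixed delay and are re-injected at the far end unless overwritten, so the "memory" is just the signal in flight.

```python
from collections import deque

class DelayLine:
    """Bits circulate through a fixed-length delay; each bit that emerges is
    regenerated and sent round again unless a new value is written over it."""
    def __init__(self, length):
        self.line = deque([0] * length)

    def tick(self, write=None):
        bit = self.line.popleft()                          # the bit arriving at the tube
        self.line.append(bit if write is None else write)  # retransmit or overwrite
        return bit

store = DelayLine(8)
for b in [1, 0, 1, 1, 0, 0, 1, 0]:          # load a byte serially
    store.tick(write=b)
print([store.tick() for _ in range(8)])     # the same byte comes back around
```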
Colossus used paper tape. The rebuild at Bletchley Park demonstrates it perfectly. |
But there are a LOT of valves ... |
You don't need flip-flops to store things. Semiconductor-based dynamic RAM, for example, consists of capacitors, and other techniques are available. For instance, you could twang a wire, use CRTs with metal plates in front of them, use rings on wires or a mercury delay line. There's no need to squander logic gates on making memory. |
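In the same spirit, a toy model of the dynamic-RAM point (Python, figures invented for illustration): the bit is just charge on a capacitor, it leaks, and it survives only because something reads it and writes it back before it fades.

```python
class DramCell:
    THRESHOLD = 0.5   # charge above this reads as 1
    LEAK = 0.9        # fraction of charge surviving each tick (invented figure)

    def __init__(self):
        self.charge = 0.0

    def write(self, bit):
        self.charge = 1.0 if bit else 0.0

    def read(self):
        return 1 if self.charge > self.THRESHOLD else 0

    def tick(self, refresh=False):
        self.charge *= self.LEAK        # the capacitor leaks a little each tick
        if refresh:
            self.write(self.read())     # read out and rewrite at full strength

cell = DramCell()
cell.write(1)
for t in range(20):
    cell.tick(refresh=(t % 5 == 0))     # refresh often enough and the bit survives
print(cell.read())  # 1
```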
That correlates with my experience here in fact. I often feel I'm discovering ideas rather than inventing them, and I kind of feel all invention is in fact discovery. Ideally, then, a computer is a truth finder. Makes me think of quantum computers somehow. |
So a newly commissioned build, the quantum computer named "The Eye of Serialon". |
[IT], that's either a deeply profound insight into the nature of the Universe, or specious rubbish. |
If only it were possible to determine which .... |
Just ask it. If it answers, then it's probably true, or possibly enjoying a good laugh at your expense. |
//the truth doesn't need storing, merely accessing// This is fine as soon as you've built a device to scan the entire universe in real time. Don't forget transactional integrity. Again, this is easy, once you've persuaded the universe to slow down and wait for you. |
The computation only needs the prime factors and processes to the E-collar that our simple focus is interested in. A bit of extrapolation and modelling and the stating of truth will be like magic. |
[Ian], the reason for my silence on this page is that you've triggered an interest in functional programming which is taking up my time. It occurs to me that just as there was once an object-oriented CPU, maybe there could be, or perhaps already are, *purely functional* circuits. But it also occurs to me that there might be some kind of signal processing device which is effectively already this, or that it's only a particular perceptual filter which prevents me from seeing that they are already functional. |
My only experience of functional programming is APL. I can see that as a kind of over-featured calculator, and speculate as to the hardware of my imaginary device. |
Toggle relays make very nice memory modules. A bit slow, but given the single-tube computation that should be fine. |
That interests me rather too much, Ian. I'm very easily distracted, and now I know there's something interesting called Elm out there it may dissipate my energies rather. But thanks anyway. |
I currently have an image of data shooting through boxes without touching the sides and I'm kind of thinking everything could stream, but there could be major eddies in that stream. |