I think this kind of knowledge is pretty important for when you're writing code, even if you work exclusively in the JVM. It helps to understand what goes on under the hood and the fundamental limitations of what your computer can do, so you know where you need to optimize.
There's a small but growing trend of people implementing CPUs and algorithms without computers. Many strive for simplicity, to enable learning or to reduce the risk of subversion. The abacus is a simple, non-electronic device for doing arithmetic; a similar device might be constructed out of physical components to emulate a primitive CPU, or one part of it at a time.
While we're at it, why not use that opportunity to experiment with alternative fundamentals, like ternary or quaternary (like our DNA I suppose) instead of binary, or further explore things like belt architectures [1].
I see no reason why we should forever stick with things the way they are, just because they seemed simple to implement at the time. The trend you mention could be a chance to reinvent computers, as it were.
Possibly. I brought up ternary and quaternary with the hardware guys on Schneier's blog. I considered them since, in theory, they shouldn't be much harder to work with than Boolean logic. That led me to two reasons Boolean prevailed over most alternatives: its simple rules allow easy transformation and verification by computers, and having just two states means you can aim for high noise immunity at the gate level, versus the tighter precision a multi-valued analog implementation would require.
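To make the ternary idea concrete, here's a quick sketch of balanced ternary (digits -1/0/+1), which is the variant usually proposed since negation is trivial and no sign bit is needed. This is just my illustration, not anything from that blog discussion:

```python
# Balanced ternary: each "trit" is -1, 0, or +1.
# Negating a number is just flipping every trit.

def to_balanced_ternary(n):
    """Return n as a list of trits (-1, 0, 1), least significant first."""
    if n == 0:
        return [0]
    trits = []
    while n != 0:
        r = n % 3
        n //= 3
        if r == 2:      # digit 2 becomes -1 with a carry into the next trit
            r = -1
            n += 1
        trits.append(r)
    return trits

def from_balanced_ternary(trits):
    return sum(t * 3**i for i, t in enumerate(trits))

print(to_balanced_ternary(5))   # [-1, -1, 1], i.e. -1 - 3 + 9 = 5
print(from_balanced_ternary(to_balanced_ternary(-7)))  # -7
```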
However, there is research in the other stuff. Here is one of the papers I found:
There were also other types of logic used in digital design with interesting advantages. Two good ones, one for speed and one for reliability, are below. I found the computer made with diode logic particularly amazing.
Don't forget ternary content addressable memory (TCAM), which is common in contemporary network routers because it turns out having a third state to mean "don't care" is incredibly useful:
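A toy sketch of how that works: each TCAM entry stores a value and a mask, where zero bits in the mask are the "don't care" bits, and the first (highest priority) matching entry wins. The table below is made up, but it's the same shape as a route lookup:

```python
# TCAM-style lookup: compare the key against each entry only on the bits
# the mask cares about; entries are in priority order (longest prefix first).

def tcam_lookup(entries, key):
    """entries: list of (value, mask, result), highest priority first."""
    for value, mask, result in entries:
        if key & mask == value & mask:
            return result
    return None

# 8-bit toy routing table.
table = [
    (0b10110000, 0b11110000, "port A"),   # matches 1011xxxx
    (0b10000000, 0b11000000, "port B"),   # matches 10xxxxxx
    (0b00000000, 0b00000000, "default"),  # matches anything
]

print(tcam_lookup(table, 0b10110101))  # port A
print(tcam_lookup(table, 0b10011111))  # port B
print(tcam_lookup(table, 0b01010101))  # default
```

The real hardware checks every entry in parallel in one cycle, which is what makes it so attractive for routers; the loop here is just the sequential equivalent.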
Can't look at the videos yet, but the descriptions sound like exactly the kind of thing I'm talking about. Well, there are those that replace the function and others that aid understanding. So, two categories of mechanical, fluidic, paper, and similar systems.
As it happens, there've been a bunch over the years. Including a hardware implementation of BF, I think a stack machine at one point... At least one that was pipelined, and since I largely stopped following the scene, a bunch built using command blocks. (Tho that's a bit cheaty if you're looking to understand how they're built instead of just the instruction set.)
I tried building a compiler for one particular computer, the RedGame 2. Which in practice meant an assembler + installer, lol. Somebody else also built an emulator and another assembler for that computer. I also remember Ohm's computer, a 16-bit machine which was... practically vertical. There were lanes for 16 bits side by side, all of the ALU functions were stacked one above the other, and all of the registers were likewise stacked beside the ALU. It also had an assembler, and an emulator (in Bash, IIRC).
Among the interesting / odd things about Redstone computers is just how slowly signals propagate down wires compared to the gate speed. By and large, small is the only form of speed. That, and small is kind of necessary, given the sheer space constraints imposed by being inside a game. The one 64-bit ALU I saw once... was sufficiently wide that you had to stand close to the middle to keep the entire thing in simulation range. A corollary is that the low speeds can make the operation of the chip substantially intuitive. With the RedGame 2, a few times I managed to follow ROM reads, instruction decode, register reads, ALU ops, and right back to the condition in time to select the next instruction. That was a bit of an oddity of the RedGame: every instruction contained a conditional branch. You would use... 2 or 3 bits as an operand for what condition to check, then list out two (!) instruction numbers to branch to on true/false.
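To sketch that branching scheme in software (the condition encoding and field names here are my guess for illustration, not the actual RedGame format):

```python
# Every instruction carries a condition selector plus *two* next-instruction
# addresses; the condition picks which one becomes the next PC.

def next_pc(cond, flags, target_true, target_false):
    """Pick the next instruction address from the two encoded targets."""
    conditions = {
        0: lambda f: True,            # unconditional: both targets the same
        1: lambda f: f["zero"],
        2: lambda f: f["carry"],
        3: lambda f: not f["zero"],
    }
    return target_true if conditions[cond](flags) else target_false

flags = {"zero": True, "carry": False}
print(next_pc(1, flags, 10, 20))  # 10, because the zero flag is set
print(next_pc(2, flags, 10, 20))  # 20, carry is clear
```

Encoding two full targets per instruction sounds wasteful, but as noted below, instruction ROM bits are about the cheapest resource in these machines.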
I've (much more) recently mulled over building my own CPU, tho it'd be a couple months' worth of work. The most recent sketch was of something of a Mill-alike. It'd be 8-bit, with 8 belt positions, one ALU, 12-bit instructions with ~30 ops available, and maybe 128 program instructions in ROM. Not sure I could actually fit any RAM, tho... with RAM being positively huge, and any mildly compact part-analog storage taking around 5-10 seconds per read. I'm also not sure something so small it can only fit one ALU really benefits from a belt architecture, rather than a 3-operand register machine. For that matter, most Redstone computers use quite wide program ROMs instead of a Von Neumann architecture, so instruction bits are practically the cheapest part of these computers.
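For what it's worth, the belt idea itself is easy to sketch in software: results drop onto position 0 of a fixed-length queue, operands name belt positions instead of registers, and old values simply fall off the end. A rough sketch with made-up names, not the actual Mill semantics:

```python
from collections import deque

# A fixed-length belt: new results appear at position 0, the oldest falls off.
class Belt:
    def __init__(self, size=8):
        self.slots = deque([0] * size, maxlen=size)

    def drop(self, value):
        self.slots.appendleft(value)

    def __getitem__(self, pos):
        return self.slots[pos]

belt = Belt()
belt.drop(5)                   # b0 = 5
belt.drop(7)                   # b0 = 7, b1 = 5
belt.drop(belt[0] + belt[1])   # "add b0, b1" drops its result at b0
print(belt[0])                 # 12
```

The appeal is that there's no register allocation, just a lifetime window; the drawback, as mentioned, is that with only one ALU the window buys you very little over three-operand registers.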
Not sure if that's more informative than making a more-capable ISA and writing an emulator in C.
Brainfuck. That particular computer... Was an 8 instruction machine with a... very short data tape. It could increment / decrement numbers on the tape, move back and forth on the tape, use loop constructs to select instructions, and print / read numbers from a panel. Was far too diminutive to run anything of interest, but it did technically have all 8 instructions.
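For reference, the eight instructions are > < + - . , [ and ], and the full semantics fit in a tiny interpreter. A minimal sketch of what that in-game machine implements in hardware (cell width and tape length are my choices here):

```python
# Minimal Brainfuck interpreter: 8-bit cells, fixed-length tape,
# numeric I/O instead of character I/O to match a panel-style machine.

def run_bf(program, tape_len=8, inputs=()):
    tape, ptr, pc = [0] * tape_len, 0, 0
    inputs, out = list(inputs), []
    # Pre-match brackets so [ and ] can jump directly.
    jump, stack = {}, []
    for i, c in enumerate(program):
        if c == '[':
            stack.append(i)
        elif c == ']':
            j = stack.pop()
            jump[i], jump[j] = j, i
    while pc < len(program):
        c = program[pc]
        if c == '>':   ptr += 1
        elif c == '<': ptr -= 1
        elif c == '+': tape[ptr] = (tape[ptr] + 1) % 256
        elif c == '-': tape[ptr] = (tape[ptr] - 1) % 256
        elif c == '.': out.append(tape[ptr])
        elif c == ',': tape[ptr] = inputs.pop(0) if inputs else 0
        elif c == '[' and tape[ptr] == 0: pc = jump[pc]
        elif c == ']' and tape[ptr] != 0: pc = jump[pc]
        pc += 1
    return out

print(run_bf("+++[->++<]>."))  # [6]: doubles 3 into the next cell
```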
http://www.ugrad.cs.ubc.ca/~cs121/2015W2/Labs/Lab9/lab9.pdf http://www.ugrad.cs.ubc.ca/~cs121/2015W2/Labs/Lab9/playcpu.p...