I always like to point out the telegraph relay computer[0] built by one Harry Porter. I think that pretty much takes the cake for retro hardware. It also raises the question of how early a binary computer could have been built, since the relay was invented in the early-to-mid 1800s[1] and Boolean logic in 1847[2].
For some reason it took roughly 90 years for Boolean-switched hardware to appear - the idea was arrived at independently by Shannon in the US and Zuse in Germany in the late 1930s.
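To make the "Boolean-switched hardware" point concrete, here's a toy sketch (my own illustration, not from the thread): a relay is just a switch whose state is set by a control signal, so Boolean functions fall out of wiring contacts in series and parallel.

```python
# Model relay contacts as Boolean switches and compose them into logic.

def relay_and(a: bool, b: bool) -> bool:
    """Two relays in series: current flows only if both coils are energized."""
    return a and b

def relay_or(a: bool, b: bool) -> bool:
    """Two relays in parallel: current flows if either coil is energized."""
    return a or b

def relay_not(a: bool) -> bool:
    """A normally-closed contact: energizing the coil breaks the circuit."""
    return not a

def half_adder(a: bool, b: bool):
    """Sum and carry of two bits, built only from the contacts above."""
    carry = relay_and(a, b)
    # XOR = (a OR b) AND NOT (a AND b)
    total = relay_and(relay_or(a, b), relay_not(carry))
    return total, carry

for a in (False, True):
    for b in (False, True):
        print(a, b, half_adder(a, b))
```

Nothing here needed 20th-century physics - just switches and the 1847 algebra.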
It's interesting to wonder why it took so long, and what would have happened if it had been invented much earlier.
The first may be because Boolean algebra was very obscure and unlikely to cross the path of the electromechanical and telegraph engineers who worked with relays.
The second I can't even begin to speculate about - although the first FET was patented in 1925, so in theory it would have been possible to build a solid state computer not long after.
Charles Peirce also had the idea in the 1880s, but he didn't publish it. (There were a couple of others in the 1930s who did, but were unfortunate enough to not be Shannon.)
The idea of the transistor was patented in 1926, but it took the hard work of the Bell Labs team actually making one for discrete transistors to become a reality.
See [1] for an excellent summary of the birth of the transistor.
Let me don my tinfoil hat and say that one of the major functions of a computer back then was fire control for artillery, later second only to cracking Axis encryption. Maybe anyone who tried to source 10,000 reliable relays in 1938 got a knock on their door...
I don't think that is as crazy as it sounds. Remember, back then, the commercial potential of a "computer" was for the military.
Even if you had the idea & concept back then, AND the money to build it - what would you do with it? You're not using it like we use computers today...
The story of the FET is interesting. Patented in 1925 and 1928 by Lilienfeld, the invention wasn't implemented until the 1940s. Researchers at Bell Labs, Shockley and Pearson, built operational devices based on Lilienfeld's work, but never acknowledged the Lilienfeld FET in their published reports.
FET production began in the 1950s, when supporting technologies became available. Later, in the 1990s, the original FET designs were finally implemented and produced working devices.
Surprising how old many technologies really are. As with the FET it can take a long time before inventions or discoveries can be realized and put to good use.
And photography came of age in the 1840's. We could have had something like integrated circuits shortly after. What a world that would have been with steam power and solid state calculating engines!
Though keeping in mind that solid-state semiconductor physics is based heavily on quantum mechanics, which wasn't sufficiently developed or widely accepted until the late 1920s or 1930s, the earliest the transistor could have been invented was just pre-WW2. We were really quite far off from QM for most of the 19th century.
Everything depends on QM, but you don't have to understand that to make use of it. Just laying out massive circuits on film would have made computation possible a century earlier?
For example, the distances between source and drain in a transistor are based on factors like electron/hole mobility and minority-carrier recombination, which are probabilistic and can only be calculated using equations that came from quantum physics. Without QM, the challenge would be akin to reliably building a skyscraper in the Middle Ages, prior to Newtonian physics. That is, of course, why the pyramids are Wonders of the World, though we can see the Egyptians had some of the necessary mathematics and understanding of physics for them.
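As a rough illustration of the kind of number involved (my own back-of-the-envelope sketch with assumed textbook values for silicon at room temperature, not from the comment above): the minority-carrier diffusion length, which sets the scale over which carriers survive between junctions, follows from mobility via the Einstein relation.

```python
import math

# Assumed textbook values for n-type silicon at 300 K (illustrative only).
mu_n = 1400.0       # electron mobility, cm^2/(V*s)
kT_over_q = 0.0259  # thermal voltage at 300 K, volts
tau = 1e-6          # assumed minority-carrier lifetime, seconds

# Einstein relation: diffusion coefficient from mobility.
D_n = mu_n * kT_over_q      # cm^2/s
# Diffusion length: how far a carrier diffuses before recombining.
L_n = math.sqrt(D_n * tau)  # cm

print(f"D_n = {D_n:.1f} cm^2/s, L_n = {L_n * 1e4:.0f} um")
```

With these numbers the diffusion length comes out around tens of microns - the scale device geometry must respect, and one you couldn't have derived without the quantum-mechanical picture.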
If someone were to provide you with variously doped n-type and p-type semiconductor material, you could, given enough time, make a transistor by trial and error. The catch is that you probably wouldn't have considered using n-/p-type material (or even had it, for that matter) unless you had a background in solid-state physics in an industrial lab in the 1940s (i.e. everyone associated with Shockley). And at that point, being in such an organization, you'd wisely be inclined to work out the math and science on blackboards first, economizing your limited person-hours and R&D budget.
As an aside, probably the main reason no one would ever have created a transistor or even an IC in the 19th century or earlier is that they didn't need one yet. A transistor controls electricity (in a circuit) using electricity (from another, ideally isolated circuit). In the 19th century, mechanical relays existed which controlled electricity by mechanical means (which could in turn be controlled by electricity), and this was "good enough" for most applications. There was not yet any large-scale telecommunication system (requiring low-noise, low-power signal amplification), nor large populations within nation-states (requiring high-speed digital data processing by corporations and governments), nor the potential for world wars that could destroy the planet many times over (requiring data secrecy and cryptanalysis by nations fighting only by proxy, aka Cold Wars). These were 20th-century, and remain 21st-century, concerns that benefited from - and in some cases were sadly enabled by - the transistor.
OP here - my original idea was that in the mid-to-late 1800s we could have had binary computers made from telegraph relays as switches, not solid-state transistors. Relays, I believe, are still many orders of magnitude more reliable than vacuum tubes, and probably easier to manufacture reliably for large-scale provisioning.
I would be happy with the Xerox PARC developer experience becoming mainstream, instead of people insisting on replaying a PDP-11 developer experience on their hexacore workstation with 16 GB of RAM and a 2 GB GPGPU, while relying on programming languages that were already known to be unsafe in the early 70s for our IT infrastructure.
It's a nice thought, but it's not happening. Given the transition to hardware accelerators, I'd also be happy with products like this coming down to a consumer, mid-range price:
I’d like to point out that this is common at many universities as a second-year project for compsci undergrads (although it’s usually done in simulation, in Xilinx ISE and ISim).
It’s definitely interesting to do these things (I’m currently taking the above-mentioned course at my uni, designing first a 4-bit Harvard-architecture processor in Xilinx and then, once that’s done, starting in two weeks on a DLX design in Xilinx), but it’s even more amazing to see someone actually build these things IRL (and thereby prove that these are actually useful skills to have!)
I used to design and build whole systems from TTL logic in the 70s and 80s. The most integrated part was the ALU from AMD; otherwise it was mostly NAND and NOR gates. Great fun, especially because we knew how everything worked.
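NAND being functionally complete is what makes "mostly NAND gates" enough for a whole system - everything else can be derived from it. A quick sketch (my own illustration, not the commenter's actual designs):

```python
# NAND is functionally complete: NOT, AND, OR, XOR all fall out of it.

def nand(a: int, b: int) -> int:
    return 0 if (a and b) else 1

def not_(a: int) -> int:
    return nand(a, a)

def and_(a: int, b: int) -> int:
    return not_(nand(a, b))

def or_(a: int, b: int) -> int:
    return nand(not_(a), not_(b))

def xor(a: int, b: int) -> int:
    # The classic four-NAND XOR arrangement.
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, and_(a, b), or_(a, b), xor(a, b))
```

The four-NAND XOR is the same trick used on real TTL boards to save chip count.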
[0] https://www.youtube.com/watch?v=n3wPBcmSb2U
[1] https://en.wikipedia.org/wiki/Relay#History
[2] https://en.wikipedia.org/wiki/George_Boole#Symbolic_logic