Theemile wrote:I have a picture from the 1954 Popular Mechanics showing a mockup of a supposed "home computer" from the 2000s. The image bears more in common with a nuclear reactor's control room than an IBM PC - the computer is the size of a large living room, with the walls covered in banks of dials.
[snip]
And the article mentioned that many breakthroughs in miniaturization would be required for the concept to be practical.
And yet many more happened, as we know, and continue to happen - now virtually everyone holds hundreds of thousands of times the computing power of an '80s PC in their pocket. I don't think the PM writer could have imagined that.
In one of the first Star Trek: The Next Generation episodes, it was mentioned that the entire ship had 4 teraquads of computer storage - assuming a "quad" is 2 binary bits, that's 8 terabits, or 1 terabyte. I just built a gaming machine with 20 terabytes of storage (4 x 2TB NVMe drives, and a 12TB HDD data drive). I know it was just writers on a show saying that in the fall of 1987, but that was an insane number to a PC geek whose PC had a hideously expensive 40MB HD.
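The conversion above can be sanity-checked in a few lines. Note the "1 quad = 2 bits" figure is the post's own assumption, not anything the show defines:

```python
# Back-of-the-envelope check of the Enterprise-D storage figure.
BITS_PER_QUAD = 2                      # assumption from the post, not canon
teraquads = 4
terabits = teraquads * BITS_PER_QUAD   # 4 teraquads -> 8 terabits
terabytes = terabits / 8               # 8 terabits  -> 1 terabyte

# The gaming rig described above, for comparison:
rig_tb = 4 * 2 + 12                    # four 2 TB NVMe drives plus a 12 TB HDD

print(f"{teraquads} teraquads = {terabits} terabits = {terabytes:.0f} TB")
print(f"Gaming rig: {rig_tb} TB")
```

So under that reading of "quad," one desktop's storage is twenty times the whole starship's.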
Technology has made those numbers and designs seem pathetic. In another 50 years, what will be made possible by unforeseen breakthroughs?
I'd forgotten the unfathomably large specs that Heinlein gave the Mycroft computer in The Moon is a Harsh Mistress. (Okay, I looked it up, and it wasn't as bad as I remembered.) The unused memory bank Mycroft was able to lock aside for the private use of him and Mannie has "Ten-to-the-eighth-bits capacity"; or, as we'd know it, 12.5 megabytes. OTOH, when Heinlein defined the capacity at which Mycroft "woke up," he did a better job, describing "banks of associated neural nets" and saying Mycroft had more than "one and a half times [ten-to-the-tenth] neuristors".
That'd be over 15 billion, which compares not so unfavorably to one of the very largest modern neural nets: GPT-3, at 175 billion parameters, is only a bit over 11 times larger - though obviously mere size didn't make GPT-3 wake up.
Still, a 15-billion-neuristor neural net isn't as laughably out of place as a computer in 2075 having a spare memory bank as small as 12.5 MB.
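For anyone who wants to check the Heinlein numbers the same way (the bit and neuristor counts are from the novel as quoted above; the GPT-3 parameter count is from its published description):

```python
# Sanity-check of the Mycroft figures from The Moon is a Harsh Mistress.
mike_bits = 10 ** 8                    # the "ten-to-the-eighth-bits" spare bank
mike_megabytes = mike_bits / 8 / 1e6   # bits -> bytes -> decimal megabytes

mike_neuristors = 1.5 * 10 ** 10       # "one and a half times ten-to-the-tenth"
gpt3_params = 175e9                    # GPT-3's published parameter count
ratio = gpt3_params / mike_neuristors  # how much bigger GPT-3 is

print(f"Spare bank: {mike_megabytes} MB")
print(f"GPT-3 is {ratio:.1f}x Mycroft's neuristor count")
```

The ratio comes out just under 11.7, matching the "a bit over 11 times larger" figure above.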