Welcome to ICT Mythbusters, Episode One – who needs prequels? Start your numbering scheme at one!
ICT Mythbusters is inspired by the great Discovery show Mythbusters.
It’s also premiering a new concept in advertising that will revolutionise the field:
commercials that the host – in this case, me – doesn’t make any money from. So click the banner and support the REAL Mythbusters; and if you want to support me, send me some money ;-).
Was Bill Gates wrong?
A very famous quote attributed to Bill Gates is:
640KB of RAM should be enough for everyone
Everybody has been laughing at that statement, but was he actually right?
I can access the web in high fidelity from my Nintendo DS, and any modern phone with Java ME can run the Opera Mini browser – and these phones rarely have more than 1MB of RAM. I’d say that approximately 4MB should be enough for everyone.
So Bill Gates was right, then? Or why is it that he wasn’t? It’s of course because we’ve moved our storage to the server, as a colleague of mine was kind enough to point out – actually I think he was quite annoyed with me, but only because he didn’t understand what I meant.
The desktop and the portable computer are anachronisms, as I’ve written before, and we need to move ALL the storage to the server – where it belongs – and run only thin clients. VERY thin clients would actually suffice for something like 90+ percent of the world’s business users.
So yes, my colleague is right: 640K isn’t enough for everyone, and neither is 4MB. But how much is, then?
And we’re talking server storage here, enough to cater to the computing needs of the entire world today, not even considering the more than exponential growth we’re likely to see in the future.
Help me do the math – or should I just submit it to Jamie, Adam, Tory, Grant and the red-hot Kari?
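To get the ball rolling, here’s a first stab at that math: a back-of-envelope sketch in Python. Every figure in it is my own assumption (share of the world online, storage per user, redundancy), so treat it as a starting point and plug in your own numbers:

```python
# Back-of-envelope only: every figure below is an assumption of mine,
# not a measurement - change the numbers and rerun.

world_population = 6.5e9   # people alive today, roughly
online_share     = 0.15    # assumed fraction with access to a client
gb_per_user      = 10      # assumed documents/mail/photos per user, in GB
redundancy       = 3       # assumed copies kept for backup and availability

users    = world_population * online_share
total_gb = users * gb_per_user * redundancy
total_eb = total_gb / 1e9  # 1 exabyte = 10^9 gigabytes

print(f"{users:,.0f} users -> {total_gb:,.0f} GB ~ {total_eb:,.2f} EB")
# 975,000,000 users -> 29,250,000,000 GB ~ 29.25 EB
```

Even with these fairly generous guesses it lands in the tens-of-exabytes range – a big number, but not an absurd one for “the entire world”.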
3 replies on “ICT Mythbusters part one: 640K should be enough for everyone! Not exactly! But how much do we need?”
The solution may be to put all data on the server, but whose server?
I mean, I actually need my data to move around with me, hence I use Gmail for mail and del.icio.us for links.
But moving even more fundamental data to the server raises a basic problem: how reliable are those who administer the servers?
Would I move all my organization’s documents onto Google’s servers? Can I trust them to always respect my privacy, and how reliable are they?
Today, an Internet outage means no communication, etc. If we moved all data onto centralized servers, it would mean the data was unavailable and a huge number of practical things couldn’t get done. Would this make our infrastructure too fragile?
I’m thinking of establishing a “best of both worlds” approach myself: keep my data on the local machine, but rsync the important stuff to an encrypted directory on a server available on the Internet, e.g. every 24 hours. This means keeping the data and the computing power on the local machine while always being able to retrieve the data from any other machine – and since it’s encrypted, it will still be safe against intruders.
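A minimal sketch of that routine, in Python, assuming tar, gpg and rsync are on the machine – every path and hostname here is a made-up placeholder:

```python
import subprocess
from datetime import date

# Placeholders only - adjust paths, host and schedule to taste.
SOURCE_DIR = "/home/me/important"                 # the "important stuff"
ARCHIVE    = f"/tmp/backup-{date.today()}.tar.gz"
ENCRYPTED  = ARCHIVE + ".gpg"
REMOTE     = "me@backup.example.com:backups/"     # any server reachable over SSH

# 1. Bundle the directory into a single archive.
subprocess.run(["tar", "czf", ARCHIVE, SOURCE_DIR], check=True)

# 2. Encrypt locally, so the server never sees plaintext.
#    --symmetric prompts for a passphrase; for an unattended cron run
#    you would supply the passphrase non-interactively instead.
subprocess.run(
    ["gpg", "--symmetric", "--cipher-algo", "AES256",
     "--output", ENCRYPTED, ARCHIVE],
    check=True,
)

# 3. Push only the encrypted blob to the server.
subprocess.run(["rsync", "-az", ENCRYPTED, REMOTE], check=True)

# Schedule via cron for the "every 24 hours" part, e.g.:
#   0 3 * * * /usr/bin/python3 backup.py
```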
Anyway, I think we’re getting closer to a “best of both world” approach: yes, moving more and more to the server seems logical – still, devices are getting smaller and smaller and getting more and more processing and power and more and more storage (my cell phone has a 2GB memory card, e.g. – that’s a lot of document backup space), so we’ll probably be seeing a promiscuous mixture of fat, fit, lean and thin clients. 🙂
Good points.
The thing is that the data on your PC isn’t safe either, unless you encrypt it with something like FileVault.
If you ever leave your PC in a public place – your living room when you have guests, for example – you could easily be owned! “But I have a password!” No good either: it’s easy to crack, and if you have a Mac, you don’t even need a tool to achieve that…
So encrypt your files and copy them to a public server that is located in a closed vault, and your data is safer than the police will allow 😉
[…] ICT Mythbusters is inspired by the great Discovery show Mythbusters, and you’ll find Episode One here. […]