8 bits of bun

Why code is unoptimized


So I was thinking recently:

Why is it that all kinds of software had such low system requirements around 20 years ago?

It looks like many types of software have grown very bloated while adding relatively few features.

Looking at Windows 2000 compared to Windows 10, for example:

Minimum system memory (RAM) has increased 16x (64 MB to 1 GB).

Storage requirements (SSD/HDD) have increased 8x (2 GB to 16 GB).

It's a similar story for the processor, and all of that is before hardware drivers are even counted.
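
Just to show where those multipliers come from, here is a tiny sketch in Go that works them out from the numbers quoted above (the exact figures vary per Windows edition, so treat them as approximate):

```go
package main

import "fmt"

func main() {
	// Approximate published minimum requirements, as quoted above.
	ramW2000MB, ramW10MB := 64, 1024 // 64 MB vs 1 GB (32-bit Windows 10)
	diskW2000GB, diskW10GB := 2, 16  // 2 GB vs 16 GB (32-bit Windows 10)

	fmt.Printf("RAM:     %dx increase\n", ramW10MB/ramW2000MB)   // prints 16x
	fmt.Printf("Storage: %dx increase\n", diskW10GB/diskW2000GB) // prints 8x
}
```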

 

I think a logical explanation for all this is that people don't want to optimize anymore and rely too heavily on frameworks and software engines for faster rollout. I'm not saying they're wrong, but it would be nice if a web browser didn't eat up all the available system RAM with just a few tabs open (looking at Chrome, for example).

 

For the future, I think we need to get back to optimization, as PC parts are reaching the upper limit of what silicon can do. HDD capacity increases are getting smaller every generation (percentage-wise), processors are mostly just gaining more cores to stretch Moore's law, graphics cards aren't delivering the performance uplift per generation we used to see, and so on.

 

Anyhow, with that said, I think that with the limits of silicon reached, we are heading into an era where optimizing code and frameworks will be essential over the next decade. The same goes for multi-architecture builds (x86 is still dominant for now, but ARM and POWER9 are certainly getting closer in performance per watt).
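
As a concrete example of what I mean by multi-architecture builds, here is a minimal sketch using Go, whose toolchain can cross-compile the same source for x86 and ARM just by changing two environment variables (the file and output names are placeholders):

```go
// main.go - prints the OS/architecture this binary was built for.
//
// Build the same source for different targets, e.g.:
//   GOOS=linux GOARCH=amd64 go build -o hello-amd64 .
//   GOOS=linux GOARCH=arm64 go build -o hello-arm64 .
package main

import (
	"fmt"
	"runtime"
)

func main() {
	fmt.Printf("built for %s/%s\n", runtime.GOOS, runtime.GOARCH)
}
```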

 

I personally think this will be the case until we see transistors made from a better material than silicon.

But once a new material replaces silicon and things start getting steadily faster again, system requirements will climb once more, like running in a circle.

Last edited: September 19, 2019, 20:07
