We’ve had general-purpose computers for decades, but every year the hardware requirements for general-purpose operating systems keep increasing. I personally don’t think there has been a massive spike in productivity between when PCs usually had 256-512 MB of RAM and now, when you need at least 8 GB to have a decent experience. What has changed are protocol specs that have grown into a bloated mess, poorly optimised programs, and bad design decisions.
I personally don’t think there has been a massive spike in productivity between when PCs usually had 256-512 MB of RAM and now
For general use/day-to-day stuff like web browsing, sure, I agree, but what about things like productivity and content creation? Imagine throwing a 4K video at a machine with 512 MiB of RAM - it would probably have trouble even playing it, let alone editing/processing it.
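For a rough sense of scale, here’s a back-of-the-envelope sketch in Python. The pixel format is an illustrative assumption (8-bit RGBA, i.e. 4 bytes per pixel), not what any particular decoder actually uses:

```python
# Rough memory cost of a single uncompressed 4K frame.
# 4 bytes/pixel (8-bit RGBA) is an illustrative assumption.
width, height = 3840, 2160
bytes_per_pixel = 4

frame_bytes = width * height * bytes_per_pixel
print(f"One frame: {frame_bytes / 2**20:.1f} MiB")          # ~31.6 MiB

# A 512 MiB machine fits only ~16 such raw frames - barely half
# a second of 30 fps video - before RAM is gone, and that ignores
# the OS, the player, and the decoder itself.
print(f"Frames that fit in 512 MiB: {512 * 2**20 // frame_bytes}")
```

Real players decode compressed streams and don’t hold many raw frames at once, but the numbers give an idea of why 4K is hopeless in that little RAM.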
Video production is something you can do on a general-purpose computer because it runs a flexible OS that allows for a wide range of use cases, as opposed to a purpose-built embedded system that only performs the tasks for which it was designed. Hence, not general purpose. I believe that was their point anyway, not just a computer for office work or whatever.
Video production is general-purpose computing just like opening a web browser to look at pictures of cats is - the former is just way more resource-intensive. It’s done in software that runs on an OS that can run a dozen other things, which in turn runs on a CPU that can usually run other OSes - as opposed to a purpose-built system meant to do very specific things, with software often written specifically for it.
We’ve had video editing software available on most personal computers since at least 1999 with iMovie and 2000 with Windows Movie Maker. IMO that’s all general computer users need.
Professional-level video production is not general computing; it’s very niche. Yes, it’s nice that more people have access to this level of software, but is it responsible?
The post does raise some real issues; increasing hardware specs are not consequence-free. Rapidly increasing hardware requirements have meant most consumers have needed to upgrade their machines. Plenty of those machines could still be in operation to this day. There is a long trail of e-waste behind us that is morally reprehensible.
You don’t need to be a “professional” to edit 4K videos at home; people do that every day with videos they took on their effing phone.
And that’s the point. What people do with their computers today requires far more resources than computers did in the late 90s. I’m sorry, but it’s completely idiotic to believe that most people could get by with 256-512 MB of RAM.
“Morally reprehensible”? Give me a break. You simply don’t know what you’re talking about, so just stop.
My point is not that we should all go back to using old hardware right now with the current way we use our tech, because that is impossible.
My point is that the way we look at technology is wrong, as is the way we upgrade without real reason. The average person does not need a 4K camera; it does not make them a better photographer. I’ve used digital cameras with sub-15 MP sensors, and the photos generally sufficed for family/holiday snaps and professional photography. Yet there are people who have thrown out phones because they unnecessarily want the latest camera tech. Wait till people want 8K recording.
That perfectly working phone that was thrown out is an example of the e-waste I was talking about. Producing computers is not without societal and environmental cost, and to throw away perfectly serviceable machines is morally reprehensible. Current culture would agree with me that it’s not sustainable, but most people aren’t ready to keep their device for 5+ years.
Everyone should keep their current devices as long as possible (until the device breaks or can no longer run work-related software) to reduce the upgrading culture. You can shoot 4K now? That’s great! Keep the device even if the latest one supports 8K video. The same applies to other hardware/software features.
Somewhat agree. Manufacturers releasing successive models at less than a year’s interval is ridiculous, and you buying each new one even more so, but on the other hand, using the same phone for 5-6 years just because you can is also a bit drastic (even if you swap the battery midway through, by the time the second one’s dead the phone will be obsolete). It’s maybe a bit more doable with computers, especially given that you can upgrade one component at a time. 2-3 years seems doable for a phone.
I mean, it’s not that crazy; I’m writing this on a Moto Z2 Play. It was released in June 2017, so it’s not long till year 6, but I hope it goes longer. It’s perfectly usable, runs most apps fine, and can even run TFT.
Phones haven’t changed that much recently. This model has a great screen, 4 GB of RAM (more than some laptops that are still being released!), and a decent chip. The only issue is that the battery is sub-3000 mAh, but I know of a few models from around the same time that went up to 5000 mAh.
You do get better mileage running an OS like LineageOS and being degoogled, since a lot of Google’s tracking processes kill the battery and slow things down.
Computers haven’t become less efficient. They can still crunch numbers like crazy.
It’s the software. Why spend a month making something when you can just download some framework that does what you want in an hour? Sure, it uses 10 times as much memory and CPU, but that’s still only a one-second delay on a modern computer, and the release deadline is approaching fast.
Repeat that process often enough and you have a ridiculously bloated mess of layers upon layers of software. Just for fun, start up some old software in an emulator and play around with it, and be baffled by how quickly it all runs on a modern system.
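As a toy illustration of that trade-off (the file name and column are made up for the example), here’s the same trivial task done by pulling in a heavyweight framework versus a few lines of standard-library Python:

```python
# Option 1: grab a framework. pandas does the job in one line, but
# importing it pulls in NumPy and friends - far more machinery (and
# memory) than summing one column of a small CSV ever needs.
import pandas as pd

total = pd.read_csv("scores.csv")["score"].sum()

# Option 2: the standard library. A few more lines, no extra install,
# and a tiny fraction of the memory footprint.
import csv

with open("scores.csv", newline="") as f:
    total = sum(float(row["score"]) for row in csv.DictReader(f))
```

Neither choice is wrong in isolation; the mess comes from every layer of the stack making choice 1.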
People also forget that most of the actual calculations were done on paper first; the computers were basically just executing precalculated instructions.
Those are multiple printouts of the code. The computer did not only execute precalculated instructions (that would be a sequencer, BTW). Try it yourself: AGC.
I’m not quite sure even that is correct. The AGC, as far as I understand it, did quite a bit of calculation on the fly and was essentially the first digital fly-by-wire system. It did rely on input from the crew and ground control for e.g. correcting its state vector, but it even had dedicated vector instructions, if I recall correctly. You can’t really precompute all that much when you can’t be sure things will go to plan and you’re dealing with huge distances. It did have separate programs for different phases of the flight, but they weren’t really precalculated as such; they were more like different modes that read input from different sensors.
The US space program was pretty big on having a human in the loop, though, much more so than the Soviet one, which relied more on automation; there the pilot was more of a passenger in a sense, a sort of failsafe for the automatic systems.
The book Digital Apollo goes into all this in more detail. I can highly recommend it if you’re a ginormous nerd like I am and think that computers we’ve shot into space are endlessly fascinating.
Firefox. Firefox is a free and open source web browser that is not just nice to your RAM, making it run smoothly alongside games or on older machines, but also respects your privacy.
Unlike Chrome, it doesn’t track every move you make online and it’s not only more customizable, it also doesn’t threaten ad-blockers and the free web in general. Check out Firefox with the link below!
I did the same! I’m now given to understand that that was Google’s goal with Chrome - make the easiest-to-use and most lightweight browser to bring everyone in, then ramp up the trackers and bloat. I think I need to export my bookmarks and look into Firefox again…
Diehard Firefox stan since the Phoenix days. I also discovered a little social phenomenon: many (not all) tech people who call it “furryfox” and hate FF for “politics” are actually right-wing leaning or homophobes silently aligning with the righty political agenda. Do not ask me how I know. Right-wing SJWs exist en masse, silently, and they are among us.
Compute-intensive stuff usually demands those levels of RAM. I know for gaming the recommendation nowadays is 16 GB, while 8 GB is considered “works for now”. There are some games, though, that still benefit from more RAM (I upgraded to 32 GB on my old PC for the beta of a sim game, as it maxed out my 16 GB to the point of lagging my PC).
You don’t think you’ll ever really use all 32 GB at the same time, until you’re running a virtual machine or two and open Task Manager to see that you’re consistently using over 82% of your RAM, which happened to me today.