There's a rather large problem in the technology industry that almost no one within the industry seems willing to talk about, or even to admit exists. The lone voices have been the greybeards, who are largely dismissed with "okay boomer" or similar epithets, whether or not they are boomers or match whatever label is thrown at them. Ageism, elitism, and plenty of other "isms" are used to wave these folks away, even though their arguments are valid regardless of prevailing sentiment. For my own part, while I have been aware of this problem, I have vacillated and never done anything about it. What is this problem? Waste.
There are two types of waste that I am talking about: material waste and electrical power waste. Both are very serious issues, and they are often directly related to one another. Every single time a software developer chooses a language that isn't efficient, that developer is choosing to waste power. On any single machine, this waste typically isn't too serious a problem. The problem is that there are billions of computing devices on this planet, and they all use power. If the code the developer writes runs on any decent number of devices, that developer has wasted a very large amount of power, most of it generated from gas, oil, or coal.
Software developers love to make excuses for using slow languages like Python or Ruby (including me). They will argue that "C is unsafe!" while they simultaneously pull in piles of code via pip or Ruby gems that they have neither vetted nor put through a formal review process. They trade actual security and safety for memory safety. That imported code typically isn't performance oriented either, which furthers our problem. The only thing this code did was give the developer a way to do less of his or her job. Additionally, any complaint about memory safety must address the fact that C was good enough to build the programming language itself, but somehow not good enough for the programmer to use directly. It must also make the argument that protecting the planet is less important than yet another piece of garbage code (honestly, most code is garbage). The next time you decide that you need some hip programming language, remember that you might be destroying the Earth: the greenhouse gases produced by your code running on an over-provisioned cloud, using more resources than it needs because you chose to be lazy, and the strip mining of the Earth for materials for more machines and more energy, all because the entire industry is addicted to not doing its job. People are willing to use paper straws that wilt and make their beverages gross, but they aren't willing to spend a few more hours using a more performant language and running code through formal QA processes.
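To put a rough number on this, try timing the same trivial workload yourself. Below is a minimal sketch in C; the workload (summing ten million integers) and the file name sum.c are arbitrary choices for illustration. The equivalent loop written in pure Python or Ruby commonly consumes tens of times as much CPU time, and CPU time is ultimately power.

    /* A minimal sketch of measuring the CPU cost of a workload in C.
     * The workload size is an arbitrary illustrative choice.
     * Compile with: cc -O2 sum.c -o sum
     */
    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    #define N 10000000UL /* ten million elements, an arbitrary size */

    int main(void)
    {
        unsigned int *data = malloc(N * sizeof *data);
        if (!data) {
            perror("malloc");
            return 1;
        }
        for (unsigned long i = 0; i < N; i++)
            data[i] = (unsigned int)i;

        clock_t start = clock();
        unsigned long long sum = 0;
        for (unsigned long i = 0; i < N; i++)
            sum += data[i];
        clock_t end = clock();

        printf("sum = %llu in %.3f s of CPU time\n",
               sum, (double)(end - start) / CLOCKS_PER_SEC);
        free(data);
        return 0;
    }

Run both versions under the same conditions on your own machine before believing me or anyone else. The exact ratio varies with the workload, but the direction does not.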
I've touched on one part of the material waste: over-provisioning. As an industry, we've become addicted to over-provisioned cloud infrastructure. For the instant turnaround and auto-scaling that the cloud provides, there must be many thousands or millions of machines on standby at all times. This means that mankind is making more machines than are needed, and machines that are more powerful than needed. Add to this that the code running on them is typically very inefficient, and it gets worse: there is a compounding effect on both energy use and material use.
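A back-of-the-envelope sketch makes the scale visible. Every figure below is an assumption invented for illustration, not a measurement of any real fleet:

    /* Back-of-the-envelope: annual energy for an idle standby fleet.
     * ALL figures below are illustrative assumptions, not measurements.
     */
    #include <stdio.h>

    int main(void)
    {
        const double machines   = 100000.0;     /* assumed standby servers */
        const double watts_each = 150.0;        /* assumed idle draw, watts */
        const double hours_year = 24.0 * 365.0; /* hours in a year */

        double kwh = machines * watts_each * hours_year / 1000.0;
        printf("Assumed standby fleet burns %.1f million kWh per year\n",
               kwh / 1e6);
        /* ~131 million kWh per year -- very roughly the annual electricity
         * use of over ten thousand average US households, spent waiting. */
        return 0;
    }

Swap in your own estimates; even far more conservative figures land in the millions of kilowatt-hours per year, spent doing nothing.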
Every developer out there needs to realize that it is your worst code that will become the most widespread and longest lived. If it solves a problem, even if it solves it poorly, it will live forever. Any later version you make won't get used, because the crappy first release already solved the problem. That code might then get used on web servers, database servers, and other machines all over the Earth. In aggregate, this causes more energy use, more over-provisioning, more and more-powerful computers being produced, and then still more energy use and still more over-provisioning. Then, we have the churn. Every year, millions of tons of electronics are thrown away, and they aren't responsibly discarded either. Most of this crap just gets dumped on the third world, where it pollutes the environment and kills people, animals, and plants. This isn't a joke, and it isn't an exaggeration. In 2019, the world generated more than 53.6 million metric tons of e-waste. That's the equivalent of around 350 large cruise ships completely filled with discarded electronics.
But wait, there's more! The computers that get thrown away aren't even useless. They weren't dead. They weren't even outdated. The computers were not at fault; the software companies were. I can run AutoCAD, spreadsheet software, word processors, games, email, web browsing, and more on an 8088 with an 8087 coprocessor, a 64-megabyte hard disk, and about 640K of RAM. I may not be able to do all of those things at once, but really... I am a human. I can do only one thing at a time. Having more things open is a convenience, not truly a necessity. Of course, I am not advocating a return to the IBM XT; I am simply making a point. Everything we now do was possible in the past with far more modest computing machinery.
Yet again, for the sake of laziness, we must make bigger, slower, and more annoying software. We must waste screen space, and thereby get people to buy larger and higher-resolution screens. We must waste RAM and CPU cycles, making people purchase more powerful and more power-hungry machinery. We must make people believe that the computer they love is actually garbage and that something newer and "better" is needed.
The point I am trying to make with this example is that people are getting more hardware than their needs require, and that powering that hardware is wasteful. A machine that uses less power is likely adequate to their tasks. Currently, most people could get by on a Raspberry Pi with 4GB of RAM. It's a modest machine that draws only a few watts, but it can do everything the vast majority of people require. You know, it will run Chrome with a few tabs. It is less powerful than many leading smartphones, which tells us a bit about smartphone waste as well.
The hardware manufacturers are not much better than the programmers, of course. Rather than innovate in a meaningful way (which is expensive), they would rather make small iterative advancements, push them out as "magical," and sell you another iTrinket that doesn't do anything your old one didn't do. The old one then gets thrown in the garbage, the precious metals and energy wasted in creating it are forgotten, and more land is strip-mined to make your next new iTrinket.
Less performant code is bad for the planet, bad for people, and bad for economies. All of that labor and material could be used to enrich lives in meaningful ways. Instead, it is being used to make social media platforms that divide us, games that are addictive in ways similar to gambling, a constant barrage of propaganda from streaming services and glowing screens, and porn that minimizes relationships and over-sexualizes society. Is this really what we wish to be feeding? Is this really what we want out of humanity? The escapes people seek from the crushing realities of life make those realities far worse than they would otherwise be, and the addiction to escape creates still more need for it. It's heroin on a global scale.
If you are out there advocating for environmentalism, you should care about this problem. If you are out there talking about energy independence, you should care about this problem. The latest framework may help you produce a deliverable, but it is also going to contribute to e-waste. The problem starts with the software and systems engineers. Software developers need to write lean code, and systems engineers need to take the time to configure, performance-tune, capacity-plan, and then provision hardware accordingly. Programmers need to take the time to make things more performant. Hardware engineers need to make things modular and upgradable. Companies need to find new revenue streams rather than convincing people to toss out perfectly good hardware. The constant fanboyism over orchestration systems like Puppet, Chef, Ansible, and Salt is no better. The fanboyism over Docker and Kubernetes is no better. The fanboyism over the cloud and virtual machines is no better. These are all overhead. If you are using an orchestration framework, you're adding machines to the pool that will be constantly on, waiting for your command.
You are also not tuning per machine; you're using configs in the large, which drives up power consumption (there are people who take this into account and plan accordingly, but I am generalizing because I am talking about waste in the aggregate). Containers and virtualization are largely used because so-called "safer" languages and frameworks produce code that cannot be trusted and must therefore be isolated. Those containers and virtualization schemes add to the power consumption problem, especially as machines are over-provisioned to allow for the future spin-up of more virtual machines or containers! If you are using Docker because you're too lazy to learn how to install and configure something... you really need a new job. Likewise, the fact that you are pushing datasets that saturate your connection should be considered. The fact that your code is bad and requires legions of servers should be considered. Did you really need all of those machines, or did your languages, packages, frameworks, and data sets decide that you needed them while you were unwilling to make the investment required to fight back?

In the systemic laziness of the industry, we have added to a series of global problems, and at some point we will all be forced to pay the price. I am not saying that any one person is guilty; I am saying that we are all guilty. We all need to change what we do, how we do it, and why we do it. I sincerely hope that you will join me in spreading awareness and taking action.
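And taking action starts with measuring. Below is a minimal sketch using the standard POSIX getrusage(2) interface to report what a process actually consumed; do_work() is a hypothetical stand-in for whatever code you care about. You cannot right-size a machine, or honestly defend your choice of language and framework, without numbers like these.

    /* A minimal sketch: measure what your code actually consumes,
     * using POSIX getrusage(2). do_work() is a hypothetical stand-in.
     */
    #include <stdio.h>
    #include <sys/resource.h>

    static void do_work(void)
    {
        /* Stand-in workload; replace with the code you want to measure. */
        volatile unsigned long sum = 0;
        for (unsigned long i = 0; i < 50000000UL; i++)
            sum += i;
    }

    int main(void)
    {
        struct rusage ru;

        do_work();

        if (getrusage(RUSAGE_SELF, &ru) != 0) {
            perror("getrusage");
            return 1;
        }
        printf("user CPU: %ld.%06ld s, system CPU: %ld.%06ld s\n",
               (long)ru.ru_utime.tv_sec, (long)ru.ru_utime.tv_usec,
               (long)ru.ru_stime.tv_sec, (long)ru.ru_stime.tv_usec);
        /* ru_maxrss is kilobytes on Linux; units vary by platform. */
        printf("max resident set size: %ld kB\n", ru.ru_maxrss);
        return 0;
    }

Measure first, then provision. That habit alone would retire a lot of standby machines.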