I've felt like this for years. The software we use becomes, as time passes, more restrictive, more un-free, more finicky and disobedient. The modern model of software as web applications is tedious for both the user and the programmer. The user is locked into versions and features that may abruptly change under their feet, needs an internet connection for even the most trivial computations, and suffers a total invasion of privacy. The programmer is forced to comply with a platform (the web browser) that is so complex and far-reaching, a centerpiece impossible to re-create, that they end up a hostage of Google, especially after the inevitable decline of Mozilla - even if this happens in a less obvious way.

From my observation, the main trend in software nowadays is to use the aforementioned platforms in two main ways:

  1. create "users" that in turn are exploited through their computation
  2. create infrastructure that can handle the massive amount of "users" these corporations want to exploit

From a technological point of view, this offers innovation only in the space of distributed systems, and in a very centralized way at that.

Looking at the past, things seem to have seriously eluded us. Take a look at this presentation of old operating systems and try to argue that there is innovation in any of the modern equivalents. I, at least, can't find any. Over at https://datagubbe.se, it is even proposed that the usability of software is declining.

I recently encountered a series of interesting articles that resonate a lot with some of these thoughts. The first post I read was by John Ohno over at https://www.lord-enki.net. Aptly named "Silicon Valley hasn't innovated since 1978", it makes the case that the last few decades of software evolution have stagnated, at least in the level of novelty they have introduced.

It's no secret that in the late '80s and early '90s computers became big business. And business rarely goes well with novelty, since trying to innovate always carries the huge risk of economic failure. The result is a market that is very nicely summed up as follows:

The computing technologies developed prior to 1980 have mostly become cheap enough that they have become accessible to a mass audience, in part because of iteration on manufacturing techniques, & mostly because of cheap labor (in the form of fresh-out-of-college CS students who will write bad code for half of what you’d pay the PhDs to refuse to write bad code, and will work unpaid overtime if you give them a ball pit and a superiority complex).

With academia handed a business-support role, all the interesting technical topics seem to have stagnated. Even the way academia handles tenure has turned into a gamified procedure, with a rising culture of numerous dubious publications. There is indeed interesting work happening, but usually in small groups, away from the mainstream. In contrast to the evangelizing, what I see around me is that corporations in fact stall development, and that there needs to be space for the risky ideas that economic planning despises.

On the other hand, Jonathan Edwards claims that open source is stifling progress. In a nutshell, he argues that the development model promoted by the bazaar is another piece of the stagnation we see today: incremental development and "crowd-sourcing" can't lead to new ideas, which require small, cohesive teams ready to invest a lot of time in a project.

Another interesting idea on the subject is that, maybe, the whole of human culture has stagnated. This is summed up nicely in the following quote:

The risk-aversion and hyper-professionalization of Computer Science is part of a larger worrisome trend throughout Science and indeed all of Western Civilization that is the subject of much recent discussion (see The Great Stagnation, Progress Studies, It’s Time to Build). Ironically, a number of highly successful software entrepreneurs are involved in this movement, and are quite proud of the progress wrought from commercialization of the internet, yet seem oblivious to the stagnation and rot within software itself.

It's actually quite hard to assess what has happened, or even what is still happening. The modern implementation of software and its application to the world around us seem to have only just started warming up, yet there is already a lot of talk of stagnation. I think the feeling of stagnation should be traced elsewhere.

The tools we use are becoming increasingly imposing over us. The fact that web applications are so big, and that FAANG are the only mediators of innovation, destroys the possibility of innovation in the series of protocols these businesses rely upon. The fact that the "user" is someone to be controlled creates software with a very precise direction, one that doesn't allow new paradigms to evolve. My idea, both supporting and contradicting the ideas I read around, is that our premises are broken: within this point of view we have on software, new ideas are indiscernible. To have ideas about ground-breaking software is to have ground-breaking ideas about life, and this is something that is all too often forgotten.

Nonetheless, there is work that could break this circle of reproduction. Some examples include:

  1. The Guix project is a novel way to abstract and compose the software we make.
  2. Spritely is a novel approach to communication between our programs.
  3. Explainable artificial intelligence may be a way forward to understand more of the world around us.

Only one thing is for sure: there is no way forward that follows the demands of big corporations. Following the trends will only result in the stagnation and exploitation of our computations. Chasing all the latest fads will only leave the great ideas hidden. Complying with the way of life demanded of the modern programmer will only result in fewer things being attainable.