I have never heard of a project manager in software development who had the liberty to prioritise stability or performance over features, at least outside of some specialised areas (aviation, military, banking, etc.). It's not (yet) a selling point for a broad audience. Only when performance problems become truly severe and attract many negative comments does something get done (see the emphasis on speed in the latest Lightroom development announcements, for instance).
As for bloat, it can be explained by backwards-compatibility needs and by the backlash that erupts as soon as a moderately used feature is removed or replaced by a "better" version (as with the new import workflow in Lightroom 6.2). Couple this with new features that are regularly thrown in with no clear use case or with half-thought-out implementations, and the pressure on maintainers, who need to find ways to stay profitable, only grows.
I think that a change in how software companies deal with quality assurance is likely to make users' lives even more difficult. The list below comes from Microsoft's practices, but it might apply to others as well:
- Cut costs by firing all compatibility testers and promote users to unpaid beta testers.
- Gather as much input as possible passively from users via telemetry. Make it nearly impossible to opt out. Instead of using this information only for maintenance, sell it to extract even more value now or in the future (since the terms of service can be changed for data that has already been collected).
- Make it mandatory to always run the latest build of your application so that no one can escape being a beta tester by using an older version.
- Make it really hard to download and install an older version.
- Progressively restrict users' ability to perform certain actions, on the grounds that they can lead to security problems: for instance, installing unsigned drivers to use Argyll CMS with an i1 Pro.
So software companies are progressively working towards making their own lives easier. The majority of users will put up with this, either because they are told it is for their own good, because it doesn't affect their mainstream activities at all, or because they have no alternatives.
Cheers,
Fabien
That's a very good summary, Fabien.
The sad thing is that buggy, poorly conceived, and poorly written software has nowadays become the norm.
It's one thing if an inexperienced user downloads some virus or malware; that is, after all, his own fault or negligence. But if a national bank, eBay, PayPal, or another major institution releases defective software to millions of users, that is simply inexcusable.
In practical terms, software bloat causes newer versions of computer programs to become perceptibly slower and to use more memory, disk space, or processing power. These facts are well known to most users and support technicians.
Bloat is due to some extent to maintaining backward compatibility, but in many cases the old and inefficient sections of code stay in place simply because there is no budget to redesign and streamline the system. At first glance it seems easier to keep the old program modules and just interface them to some new lines of code. Quite often the maintenance programmers don't analyze and understand the legacy applications, and the IT managers don't have a clue about the programs' workings. Fortunately this doesn't happen in all companies, but it does in far more large corporations than the general public would believe.