In yesterday’s post I touted automatic software updates as the one great thing that helps software improve.
Today I want to continue that discussion by asking: does it really? Is being able to push updates to your software perhaps not as flawless an idea as you might expect? I’m mostly working from an argument I read in a column today, so you may want to read that if you’d like an actually well-written version of this post.
The point is, software isn’t permanent anymore. Before the internet and easy-to-use updaters, changes to existing software were few and far between. The version that initially shipped would likely still be the version running on customers’ computers years later. That’s pretty permanent compared to today’s situation.
Wouldn’t this make for much “better” software? Software that would actually be done once it reached users’ hands? Architects design buildings with the utmost care precisely because those buildings will likely be around for a long time to come; they have to withstand the test of time. What if we made our software that way?
But given how rapidly technology evolves, finding a middle ground is probably the better option.