Production Expert


Pro Tools Users Would Prefer Bug Fixes Rather Than New Features - Do You Agree?

Of course we would all want both, but in our intentionally binary poll, designed to force a choice, Would You Rather Have Bug Fixes Or New Features In Pro Tools? You Can't Say Both - Poll, we explored the challenge of fixing bugs versus delivering new features, and asked you to choose which is more important to you: bug fixes or new features. In this article we look at the results of the poll and at recent developments.

The Results

See this chart in the original post

As you can see from the results, just under 70% of you voted for bug fixes and just over 30% for new features. Since we ran the poll Avid has released Pro Tools 2019.5 and, as we saw in the article Pro Tools 2019.5 Bug Fixes - The Complete List, Avid has heard your pleas: the list of bugs the team at Avid has squashed is a very long one.

In the article we discussed what a bug actually is. Our view is that reproducibility is the principal identifier of a software bug, as opposed to erratic behaviour on a single computer system. Conversations that seek to establish whether a perceived bug is a bona fide bug tend to irritate the people experiencing it, but it is important to examine these “bugs” carefully. The key point, often missed, is that if the problem isn’t reproducible across multiple systems, and particularly if the so-called bug is possibly due to a lack of care in the maintenance of the host system, responsibility for the inconvenience shifts from the company that made the software back to the user. Human nature being what it is, no one likes to admit they are wrong, so it’s understandable that discussions around bugs can get heated.
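The reproducibility test described above can be expressed as a simple triage rule. As a purely hypothetical sketch (the function, labels and threshold are ours, not anything Avid or any tracker actually uses): a report only counts as a confirmed bug once it reproduces on multiple independent, well-maintained systems.

```python
# Hypothetical triage rule: a report becomes a confirmed bug only when it
# reproduces on several independent, well-maintained systems.
def classify_report(reproduced_on, min_systems=2):
    """reproduced_on: list of (system_name, well_maintained) pairs for
    systems on which the problem was reproduced."""
    clean_repros = [name for name, well_maintained in reproduced_on
                    if well_maintained]
    if len(clean_repros) >= min_systems:
        return "confirmed bug"        # responsibility sits with the vendor
    if reproduced_on and not clean_repros:
        return "suspect host system"  # responsibility moves back to the user
    return "unconfirmed"              # needs more investigation

print(classify_report([("Mac Pro", True), ("iMac", True)]))  # confirmed bug
print(classify_report([("neglected laptop", False)]))        # suspect host system
print(classify_report([]))                                   # unconfirmed
```

The point of the rule is exactly the one made above: a failure seen only on one poorly maintained machine never graduates to “confirmed bug”, however loudly it is reported.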

Clearly there is a balance to be struck here. “No bugs ever” is unrealistic, and the pressure to keep up with new operating systems means that the 100% stable version of Pro Tools everyone would like is fundamentally a moving target.

Software that doesn’t work is unacceptable, so the practical solution lies somewhere between these two extremes, and this is clearly why there is so much debate. Companies have finite resources and finite time in which to deploy them. Should they spend that time fixing borderline bugs, which affect small numbers of people, or should they spend it creating new features that far greater numbers of people are asking for and stand to benefit from?

A Coder’s Point Of View

In the comments on the article community member Bradley Eaton said…

“Having spent a few years around software developers (albeit in a different market sector than what we are discussing here), I can attest to the fact that bug vs feature (or enhancement) is almost always an ROI proposition. Time vs Money. If the bug is fairly benign--it won't destroy the computer--or has a simple workaround, it will generally get back-shelved in favor of productivity enhancements. Yet, I've seen weeks of man-hours get used up trying to reproduce a bug that a user is reporting, over and over, and over again. After this fruitless research, someone will go sit with the user to see what they are doing and then it turns out that they are employing a particularly "unique" set of keystrokes, a problem solved by 60 seconds of re-training. All that time, which could have been spent working on ways to make the software better, wasted.”
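Bradley’s “ROI proposition” can be made concrete with a toy prioritisation score. This is a hypothetical sketch of how a team might rank a backlog, not a description of any real tracker: each item is scored by how many users it affects and how severe it is, divided by the engineering effort it would cost.

```python
# Hypothetical backlog scoring: impact delivered per unit of engineering
# effort. The weights and backlog items are illustrative only.
def roi_score(users_affected, severity, effort_days):
    """Higher is better: (users x severity) per day of work."""
    return (users_affected * severity) / effort_days

backlog = [
    # (name, users affected, severity 1-5, effort in days)
    ("benign bug with a workaround", 50,     1, 10),
    ("crash on save",                2_000,  5, 4),
    ("popular feature request",      10_000, 2, 15),
]

ranked = sorted(backlog, key=lambda item: roi_score(*item[1:]), reverse=True)
for name, users, sev, days in ranked:
    print(f"{name}: {roi_score(users, sev, days):.1f}")
```

With these made-up numbers the serious crash comes first, the widely requested feature second, and the benign bug with a workaround lands at the bottom of the list, exactly the “back-shelved” outcome Bradley describes.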

In our research into bug fixes we came across an article by Joel Spolsky, a software developer in New York City. Joel had just found his hero.

“Jamie Zawinski is what I would call a duct-tape programmer. And I say that with a great deal of respect. He is the kind of programmer who is hard at work building the future, and making useful things so that people can do stuff. He is the guy you want on your team building go-carts, because he has two favorite tools: duct tape and WD-40. And he will wield them elegantly even as your go-cart is careening down the hill at a mile a minute. This will happen while other programmers are still at the starting line arguing over whether to use titanium or some kind of space-age composite material that Boeing is using in the 787 Dreamliner.”

Joel had just read an interview with Jamie in the book Coders at Work, by Peter Seibel. In this book Peter asked Jamie, “Overengineering seems to be a pet peeve of yours.” and Jamie replied…

“Yeah, at the end of the day, ship the fucking thing! It’s great to rewrite your code and make it cleaner and by the third time it’ll actually be pretty. But that’s not the point—you’re not here to write code; you’re here to ship products.”

Joel continues…

“Duct tape programmers are pragmatic. Zawinski popularized Richard Gabriel’s precept of Worse is Better. A 50%-good solution that people actually have solves more problems and survives longer than a 99% solution that nobody has because it’s in your lab where you’re endlessly polishing the damn thing. Shipping is a feature. A really important feature. Your product must have it.”

The Law Of Diminishing Returns

There you have it, a very pragmatic approach to writing code. As an audio mixer often working to deadlines, I frequently find myself asking “Is it fit for purpose?” With limited time you have to make choices and decisions. Yes, I may be able to improve this clip, but the time it would take is too costly to the overall project, to the extent that I might not get the job completed on time. This idea of Richard Gabriel’s ‘Worse Is Better’ is very interesting, and surely a 50% solution that ships is better than a 99% solution that never arrives. Or, to use a common saying here in the UK, “A bird in the hand is worth two in the bush”.

We have just seen this with the recent ZombieLoad vulnerability affecting Intel chips made since 2011. Intel, together with companies such as Apple and Microsoft, has rushed out patches to close the vulnerability, because in their view closing it quickly is more important than the small performance hit the patches bring.

Yes, in the short term it might have a small impact on performance, but I would rather have a slightly slower system that is secure than a speedy system that someone could bring down. After all, recovering from an attack would have a much bigger impact on what I can do than a slight performance hit.

But the key point here is that this won’t be the last we hear about the ZombieLoad vulnerability. Developers at Intel, Apple and Microsoft will now be considering a better, more efficient solution, one without the performance hit, safe in the knowledge that users have already been protected from the vulnerability.

Worse Is Better?

What do you think? Were you surprised by the results? What do you make of Jamie’s approach? Is ‘Worse Is Better’ OK? Please do share your thoughts below.
