Responding to the "IT Mess" at SAAQ: the Art of Continuous Testing
While reading the Journal de Québec this week, I couldn't help but react to Joseph Facal's column on the "IT mess" at the SAAQ, which we're learning a little more about every day through the Gallant commission. I decided to respond to him by email to share my point of view. Given the many ongoing discussions about digital transformation, software engineering, and technology project management, I'm taking the liberty of sharing an expanded version of my response here, to help readers better understand our industry. I'm also very curious to hear what my peers have to say on the subject; please don't hesitate to react!
Hello Mr. Facal,
I'm a regular reader of your columns and wanted to react to your latest publication about the "IT mess" at the SAAQ.
I'm a 28-year-old software developer working for Nexapp, a Quebec-based software engineering firm that has been helping companies maximize and accelerate the impact of their software investments for over 10 years.
First, I'd like to point out that the five reasons you identified for why digital transformation projects can go off the rails are spot on:
- It's a relatively new field
- It's difficult to master
- Projects are abstract
- Management teams are neophytes
- There is resistance to change
Now, I'd also like to criticize the proposed solution a little.
In our jargon, what you're suggesting has a well-known name: waterfall development.
Requirements and constraints are analyzed in detail by business analysts, domain specialists, project managers, and specific stakeholders. Following this stage, software architects review the analysis reports to define the broad outlines of the solution to be implemented, including the technologies to be used, the design patterns to be applied, the database models to support various business needs, the performance and security requirements, and other non-functional needs.
The work is then divided between the development teams. If the project claims to be modern, it will attempt to apply various methodologies drawn from the Agile philosophy, such as scrum ceremonies, sprints, story points, estimations, or, in its more extreme form, SAFe (Scaled Agile Framework). This is where the bulk of the work is done, and where the developers get down to work.
When a team has completed a piece of a feature, it is often passed to a quality assurance (QA) team to validate the constraints and requirements listed during the analysis phase.
Finally, when the feature receives the seal of approval from QA and management, it is deployed in production for end-users.
This approach may seem attractive to most people. Indeed, at first glance, one might draw several parallels with other areas of engineering and conclude that a similar method should deliver the same results: reliability and predictability.
However, to make this assumption is to ignore one of the most fundamental premises of software development: that no two software projects are identical. More often than not, a software project is an original solution that attempts to respond to specific problems experienced by a certain group of users.
And it's here that we can extract the most value from a software solution: by asking ourselves what minimal piece of automation would improve a person's life. This way of thinking has a name: agility. This methodology can sometimes get bad press, depending on which way the wind blows. In fact, it's also known by its Silicon Valley nickname: move fast and break things.
For my part, I much prefer the parallel with the scientific method: hypothesis -> experimentation -> validation. What sets IT apart and gives it a huge advantage over other fields of engineering is that it's possible to have an almost instantaneous feedback loop. With the right processes in place, developers are able to put forward a design (hypothesis), produce a solution (experimentation) and then check for themselves the result of what they have done (validation). And all in the same day.
A software project is all about hypotheses. What is the right product for my users? What are the performance requirements for our implementation? What are the security issues? The role of a software developer is to put these hypotheses to the test and see whether they meet the desired needs. The major problem with most software fiascos is that it takes too long to test each of these premises against the fires of production. And since Murphy's Law applies to IT more than to most other fields, one of the best ways to guard against it is to test every little hypothesis as frequently as possible.
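To make this concrete, the hypothesis -> experimentation -> validation loop can be sketched as a tiny executable experiment. This is a minimal illustration in Python; the domain (a vehicle-registration renewal fee) and every number in it are invented for the example, not taken from any real SAAQ rule:

```python
# Hypothesis -> experimentation -> validation, as one tiny executable experiment.
# The pricing rule and all figures below are invented for illustration.

def renewal_fee(vehicle_age_years: int) -> float:
    """Experimentation: a first implementation of our pricing hypothesis."""
    base_fee = 100.0
    # Hypothesis: vehicles older than 10 years get a 25% discount.
    if vehicle_age_years > 10:
        return base_fee * 0.75
    return base_fee

def test_renewal_fee() -> None:
    """Validation: executable checks that run in milliseconds, not months."""
    assert renewal_fee(5) == 100.0   # a recent vehicle pays the full fee
    assert renewal_fee(12) == 75.0   # an older vehicle gets the discount

test_renewal_fee()
print("hypothesis validated")
```

The point is not the code itself but the cadence: the check costs seconds to run, so a wrong assumption is caught the same day it's written instead of years later in production.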
Finally, I leave you with this quote from Melvin E. Conway, which sums up very well why we get the IT systems we deserve:
> Organizations which design systems (in the broad sense used here) are constrained to produce designs which are copies of the communication structures of these organizations.
>
> - Melvin E. Conway, How Do Committees Invent?
William Picard
Software Craftsman