In my last blog, I wrote about the software supply chain security and cloud considerations in the 2022 DORA Report. In this blog, let us look at the operational performance and delivery metrics considered in the 2022 DORA Report, and take a critical look at their implications.
Operational Performance: Reliability
In this year’s report, DORA has included not only availability, but also latency, performance, and scalability, and clubbed them together as ‘reliability’. This is not surprising: as more organisations move towards microservices, measuring latency, performance, and scalability becomes increasingly important. At the same time, these four aspects have always been considerations for enterprise and other applications when building sturdy reliability. These aspects are also important in software testing. I am surprised that the team preparing the yearly DORA report has only now, in 2022, realised the importance of latency, performance, and scalability. This, taken together with the consideration of software supply chain security (most probably after the experiences with the log4j vulnerability), gives us two important facets of application design considerations – performance and security. And in the years to come, I’m sure the DORA report team will see the importance of other aspects of design considerations and include them in their report as well! So, I predict it would become ‘Design Metric(s)’ rather than a ‘Reliability Metric’. Let’s see.
Delivery Metrics And Their Performance
This is a delicate subject, as this year the metrics that the DORA report uses to cluster organisations show signs that organisations are not aspiring to be in the ‘High’ or ‘Elite’ clusters. The DORA report attributes this to newer organisations still ramping up on those metrics, but I doubt it: the same organisations that responded to the survey in previous years would have responded this year too, along with some new organisations (whose number, as far as I have seen, is not stated in the report – correct me if I am wrong).
As we look at the clusters, we see that the ‘High’ cluster has shrunk from 40% to 11%. The ‘Low’ cluster has grown from 7% to 19%. The ‘Medium’ cluster has grown from 28% to a whopping 69%! What this means is that organisations are becoming more and more reluctant to adhere to, or aspire to be in, the ‘High’ or ‘Elite’ clusters of the DORA metrics.
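To make the shift concrete, here is a minimal sketch tabulating the percentage-point change per cluster, using the figures quoted above (the dictionary names and year labels are my own, not from the report):

```python
# Cluster percentages as quoted above: previous year's report vs. the 2022 report.
previous = {"Low": 7, "Medium": 28, "High": 40}
current = {"Low": 19, "Medium": 69, "High": 11}

# Percentage-point change for each cluster between the two reports.
shift = {cluster: current[cluster] - previous[cluster] for cluster in previous}
print(shift)  # → {'Low': 12, 'Medium': 41, 'High': -29}
```

The negative ‘High’ figure and the large positive ‘Medium’ figure are the drift the paragraph above describes: respondents pooling in the middle rather than climbing upward.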
There could be a couple of reasons for this. One, organisations that have always tracked the DORA metrics to measure their delivery performance are feeling the fatigue of implementing and measuring them. Two, organisations no longer care about these metrics because they don’t reflect the reality in which delivery performance is measured in their day-to-day operations. Either way, it looks like the DORA report team needs to do some introspection (maybe talk to the organisations at length) to figure out whether these are the right metrics to measure, and whether organisations would be inspired and motivated to measure them for their delivery performance. Otherwise, I suspect there would be no ‘Accelerate’.
In the coming blogs, we will look at a few other sections of the 2022 DORA Report in detail. Feel free to reach out if you would like to chat about DORA metrics.