DORA vs DORA

Did you miss part 1? Check it out here.

The Digital Operational Resilience Act (DORA) is a significant piece of EU legislation aimed at strengthening the financial sector’s resilience against cyber threats and operational disruptions. Although the regulation and DevOps Research and Assessment (DORA) share an acronym, they serve distinct purposes.

DORA (the regulation) sets forth stringent requirements for financial institutions and their third-party service providers to establish robust cybersecurity measures, incident response plans, and business continuity strategies. These regulations aim to minimize the impact of potential cyberattacks and operational failures on financial services.   

While DORA (the DevOps framework) is not directly referenced in the EU DORA regulation, its principles and practices can significantly contribute to achieving the regulatory goals. By adopting DevOps practices, financial institutions can improve their software delivery processes, leading to faster deployment cycles, reduced downtime, and enhanced system stability. This aligns with DORA’s objectives of promoting operational resilience and minimizing disruptions.

While DORA (the regulation) provides the regulatory framework, DORA (the DevOps framework) offers a practical approach to implementing the necessary measures to comply with the regulations and build a more resilient financial sector.

Within software development, the term DORA (the DevOps framework) has become a byword for performance excellence. The DevOps Research and Assessment team has established a framework that helps organizations measure and enhance their software delivery capabilities. So what are the key metrics introduced by DORA (the DevOps framework), what do they mean for development teams, and how can organizations leverage these insights to improve performance?

Measuring Performance: DORA’s Four Key Metrics

DORA has identified four key metrics that serve as benchmarks for measuring software delivery performance:

  1. Lead Time for Changes
    • Definition: This metric measures the time it takes for a code change to go from commit to being successfully deployed in production.
    • Importance: Shortening lead times allows organizations to respond faster to market demands and user feedback. High-performing teams aim for lead times of less than one day.
  2. Deployment Frequency
    • Definition: Deployment frequency tracks how often an organization releases code to production or end users.
    • Importance: Frequent deployments indicate a healthy development process. Elite performers deploy multiple times a day, which allows for quicker iterations and improved responsiveness.
  3. Change Failure Rate
    • Definition: This metric measures the percentage of changes that lead to degraded service in production.
    • Importance: Understanding the change failure rate helps organizations gauge the quality of their deployment processes. A lower failure rate indicates a more stable environment, whereas a high failure rate may signal issues in testing or quality assurance practices.
  4. Time to Restore Service
    • Definition: This metric assesses the time it takes to recover from a failure in production.
    • Importance: The ability to restore service quickly is crucial for maintaining user trust and satisfaction. High-performing teams strive to restore service within one hour to minimize downtime and impact on users.
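As an illustrative sketch (not an official DORA tool), the four metrics above can be computed from a team’s deployment history. The records and field names here are hypothetical, standing in for whatever a deployment pipeline or incident tracker actually emits:

```python
from datetime import datetime

# Hypothetical deployment records: commit time, deploy time, whether the
# change degraded service, and hours to recover when it did.
deployments = [
    {"committed": datetime(2024, 5, 1, 9),  "deployed": datetime(2024, 5, 1, 15), "failed": False, "restore_hours": 0},
    {"committed": datetime(2024, 5, 2, 10), "deployed": datetime(2024, 5, 2, 11), "failed": True,  "restore_hours": 0.5},
    {"committed": datetime(2024, 5, 3, 8),  "deployed": datetime(2024, 5, 3, 20), "failed": False, "restore_hours": 0},
    {"committed": datetime(2024, 5, 6, 9),  "deployed": datetime(2024, 5, 6, 10), "failed": False, "restore_hours": 0},
]

# 1. Lead time for changes: hours from commit to production deploy (upper median).
lead_times = sorted((d["deployed"] - d["committed"]).total_seconds() / 3600 for d in deployments)
median_lead_time = lead_times[len(lead_times) // 2]

# 2. Deployment frequency: deploys per day over the observed window.
window_days = (max(d["deployed"] for d in deployments)
               - min(d["deployed"] for d in deployments)).days or 1
deploy_frequency = len(deployments) / window_days

# 3. Change failure rate: share of deployments that degraded service.
failure_rate = sum(d["failed"] for d in deployments) / len(deployments)

# 4. Time to restore service: mean recovery time across failed deployments.
failures = [d for d in deployments if d["failed"]]
mean_restore = sum(d["restore_hours"] for d in failures) / len(failures)

print(f"Lead time (median): {median_lead_time:.1f}h")
print(f"Deploy frequency:   {deploy_frequency:.1f}/day")
print(f"Change failure:     {failure_rate:.0%}")
print(f"Time to restore:    {mean_restore:.1f}h")
```

With this toy data the team deploys about once per day with a 25% change failure rate, which (per the clusters below) would leave plenty of room for improvement. In practice, teams typically aggregate these figures per week or per quarter rather than over a handful of deploys.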

Performance Clusters: Understanding Levels of Excellence

DORA’s research has categorized organizations into performance clusters based on these metrics. The clusters include:

  • Elite Performers:
    • Achieve lead times of less than one day, deploy code on demand, have a change failure rate of 5% or lower, and restore service in less than one hour. These teams exemplify the ideal state of software delivery performance by demonstrating agility and reliability.
  • High Performers:
    • Typically have lead times ranging from one day to one week, deploy code between once per day and once per week, and have a change failure rate between 6% and 15%. While they may not be at the elite level, high performers show strong capabilities and are on a path toward improvement.
  • Medium Performers:
    • Experience lead times of one week to one month, deploy code between once per week and once per month, and have a change failure rate of 16% to 25%. These teams face challenges that impact their ability to deliver software efficiently.
  • Low Performers:
    • Often have lead times exceeding one month, deploy code infrequently, and have a change failure rate above 25%. Organizations in this cluster may struggle with processes, culture, or technology, leading to a cycle of inefficiency.
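The cluster thresholds above can be turned into a rough self-assessment. The sketch below is one way to encode them; the function name and the approximations (treating “deploy on demand” as daily-or-better, and “once per month” as roughly 0.25 deploys per week) are assumptions for illustration, not part of DORA’s published methodology:

```python
def classify_performance(lead_time_days: float, deploys_per_week: float,
                         change_failure_rate: float) -> str:
    """Rough cluster classification using the thresholds described above.

    A team lands in the best cluster whose criteria it meets on ALL three
    metrics, since one weak metric is enough to hold back overall performance.
    """
    # (cluster name, max lead time in days, min deploys/week, max failure rate)
    clusters = [
        ("Elite", 1, 7, 0.05),       # "on demand" approximated as daily or better
        ("High", 7, 1, 0.15),
        ("Medium", 30, 0.25, 0.25),  # ~once per month ~= 0.25 deploys/week
    ]
    for name, max_lead, min_freq, max_cfr in clusters:
        if (lead_time_days <= max_lead and deploys_per_week >= min_freq
                and change_failure_rate <= max_cfr):
            return name
    return "Low"

print(classify_performance(0.5, 14, 0.04))  # meets every Elite threshold
print(classify_performance(3, 2, 0.10))     # lead time pushes this team to High
```

A useful property of the all-metrics-must-pass rule is that it surfaces the single bottleneck metric: a team deploying hourly but with a 30% change failure rate still classifies as Low, which points directly at where to invest.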

Improving Software Delivery: Strategies for Success

Understanding an organization’s position within the DORA performance clusters is crucial for identifying areas for improvement. By implementing the following strategies, teams can enhance their software delivery performance:

Prioritize CI/CD

Continuous Integration and Continuous Delivery (CI/CD) practices can significantly reduce lead times and increase deployment frequency. Automating the testing and deployment processes ensures that code changes are integrated and released smoothly, leading to faster delivery and higher quality.

Foster Collaborative Cultures

Breaking down silos between development, operations, and other stakeholders is essential for effective collaboration. By fostering a culture of open communication and knowledge sharing, teams can accelerate problem-solving, improve decision-making, and drive innovation.

Invest in Quality Assurance

A robust testing framework is crucial for reducing the change failure rate. Implementing automated testing and continuous monitoring can help teams identify and address issues early in the development process, ensuring the delivery of high-quality software.

Learn from Failures

Conducting blameless post-mortems after incidents can provide valuable insights into the root causes of failures. By analyzing past mistakes and implementing corrective actions, teams can prevent future issues and improve their overall performance.

Measure and Adapt

Regularly measuring and analyzing performance metrics, such as lead time, deployment frequency, change failure rate, and time to restore service, allows teams to track their progress and identify areas for improvement. By using DORA’s metrics as benchmarks, organizations can make data-driven decisions to optimize their software delivery processes.

To improve software delivery performance, organizations should prioritize CI/CD practices, foster collaboration, invest in quality assurance, learn from failures, and measure and adapt their processes. By implementing these strategies and leveraging DORA metrics, teams can achieve faster, more reliable, and higher-quality software delivery.

Wrapping Up

DORA, the DevOps Research and Assessment framework, can significantly contribute to the EU’s Digital Operational Resilience Act (DORA) by improving software delivery processes. While DORA (the regulation) sets strict standards for cybersecurity and operational resilience, DORA (the framework) provides practical tools to achieve these goals. By focusing on metrics like deployment frequency, lead time, and change failure rate, financial institutions can enhance their software delivery, leading to faster innovation, reduced downtime, and improved overall resilience, thus aligning with the objectives of the EU DORA regulation.

The DORA framework provides invaluable insights into software delivery performance, enabling organizations to measure their capabilities and identify areas for improvement. By focusing on the four key metrics – lead time for changes, deployment frequency, change failure rate, and time to restore service – development teams can enhance their processes and deliver high-quality software more efficiently.

As organizations strive to improve their performance, understanding where they stand in the DORA performance clusters can be helpful. The path to excellence may require a commitment to continuous improvement, collaboration, and a culture that values learning and adaptation.

If you’re interested in exploring DORA’s metrics and how they can enhance your software delivery processes, book a time with Paul Nashawaty to discuss these insights in more detail.

Authors

  • Paul Nashawaty, Practice Leader and Lead Principal Analyst, specializes in application modernization across build, release and operations. With a wealth of expertise in digital transformation initiatives spanning front-end and back-end systems, he also possesses comprehensive knowledge of the underlying infrastructure ecosystem crucial for supporting modernization endeavors. With over 25 years of experience, Paul has a proven track record in implementing effective go-to-market strategies, including the identification of new market channels, the growth and cultivation of partner ecosystems, and the successful execution of strategic plans resulting in positive business outcomes for his clients.

  • Bringing more than a decade of varied experience across multiple sectors such as legal, financial, and tech, Sam Weston is an accomplished professional who excels in ensuring success across various industries. Currently, Sam serves as an Industry Analyst at Efficiently Connected, where she collaborates closely with clients in the areas of application modernization, DevOps, storage, and infrastructure. With a keen eye for research, Sam produces valuable insights and custom content to support strategic initiatives and enhance market understanding. Rooted in the fields of tech, law, finance operations, and marketing, Sam brings a unique viewpoint to her position, fostering innovation and delivering impactful solutions within the industry. Sam holds a Bachelor of Science degree in Management Information Systems and Business Analytics from Colorado State University and is passionate about leveraging her diverse skill set to drive growth and empower clients to succeed.
