DORA 2025: AI’s Impact on DevOps — Speed, Trust, and the Platform Effect
Alan Shimel analyzes the 2025 DORA Report, focusing on how widespread AI adoption is transforming DevOps teams’ speed, stability, and performance. He discusses productivity, trust, and the pivotal role of platforms and value stream management.
Author: Alan Shimel
Overview
The 2025 DORA Report, released by Google Cloud’s DORA research team, investigates the increasing influence of AI on software development and DevOps pipelines. Building on more than a decade of industry-shaping analysis, this report highlights how AI adoption is nearly universal across teams but brings new organizational tensions, especially around trust and stability.
Key Insights
1. AI Adoption Outpaces Trust
- 90% of respondents report using AI tools in software delivery.
- Despite widespread adoption, about a third of engineers admit they don’t fully trust AI-generated code.
- This mirrors early skepticism around CI/CD adoption—usage becomes table stakes long before confidence matures.
2. Productivity Gains, But Stability Concerns
- DORA’s 2025 data confirms AI can significantly increase development throughput—teams using AI report faster progress.
- However, these same teams experience greater instability, a sign that raw workflow speed alone does not guarantee safer or more reliable releases (see the measurement sketch after this list).
- The analogy: “AI is rocket fuel. But without a guidance system, you’re just accelerating toward the cliff.”
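That speed-versus-stability tension only shows up if a team measures both sides of it. As an illustrative sketch (not code from the report), here is how a team might compute throughput (deployments per week, lead time) alongside stability (change failure rate) from its own deployment log; the Deployment record and its field names are hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List, Tuple

@dataclass
class Deployment:
    """One production deployment, as a team might log it (hypothetical schema)."""
    deployed_at: datetime
    commit_created_at: datetime   # when the change was first committed
    caused_incident: bool         # did this deploy trigger a rollback or hotfix?

def throughput_and_stability(
    deploys: List[Deployment], window_days: int = 30
) -> Tuple[float, float, float]:
    """Return (deploys per week, rough median lead time in hours, change failure rate)."""
    cutoff = max(d.deployed_at for d in deploys) - timedelta(days=window_days)
    recent = [d for d in deploys if d.deployed_at >= cutoff]
    per_week = len(recent) / (window_days / 7)
    lead_times_h = sorted(
        (d.deployed_at - d.commit_created_at).total_seconds() / 3600 for d in recent
    )
    rough_median_lead_h = lead_times_h[len(lead_times_h) // 2]
    failure_rate = sum(d.caused_incident for d in recent) / len(recent)
    return per_week, rough_median_lead_h, failure_rate
```

Watching change failure rate climb while deployments per week rises is the "rocket fuel without a guidance system" pattern showing up in a team's own numbers.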
3. Platforms Define AI’s Real Value
- Internal developer platforms are now present in 90% of organizations, with most having dedicated platform teams.
- High-quality platforms dramatically boost the returns of AI adoption, while weak platforms limit or even undermine those benefits.
- The report's data now shows that mature platform engineering is foundational to scaling AI effectively.
4. Value Stream Management (VSM) Is Essential
- AI often drives “local optimizations” (e.g., faster code, speedier test suites).
- Without robust value stream management tying these gains to business value, overall organizational performance may not improve.
- DORA’s research underscores the need to align AI productivity wins with team and product outcomes via VSM.
5. User Focus as a Performance Lever
- Teams that implement AI without user-centered practices risk harming performance.
- High-impact AI deployments occur when user needs drive implementation, echoing DevOps’ core principle of business value delivery.
What Should DevOps Leaders Do?
Shimel distills the findings into actionable recommendations:
- Invest in strong internal platforms. Don’t introduce AI on fragile foundations.
- Strengthen value stream management. Ensure that productivity improvements contribute to business goals, not just activity.
- Define and communicate trust boundaries. Make it clear when AI output is acceptable without review and when manual checks are required (see the sketch after this list).
- Prepare safety nets. As development speed rises, robust rollback and recovery mechanisms become even more important.
- Maintain user focus. Keep end-user requirements at the center of all AI and process improvements.
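One way to make the trust-boundary recommendation concrete is a small merge gate in the delivery pipeline. The sketch below is purely illustrative and assumes a hypothetical "ai-assisted" change label, a placeholder list of sensitive paths, and a simple approval flag; it is not a feature of any tool the report or Shimel names.

```python
# Hypothetical CI gate: AI-assisted changes that touch sensitive paths must have
# an explicit human approval before they can merge. Labels, path patterns, and
# the approval flag are illustrative placeholders.
from fnmatch import fnmatch
from typing import List

SENSITIVE_PATHS = ["auth/**", "payments/**", "infra/terraform/**"]  # example trust boundary

def requires_human_review(changed_files: List[str], labels: List[str]) -> bool:
    """AI-assisted changes inside the trust boundary always need a human sign-off."""
    ai_assisted = "ai-assisted" in labels
    touches_sensitive = any(
        fnmatch(path, pattern) for path in changed_files for pattern in SENSITIVE_PATHS
    )
    return ai_assisted and touches_sensitive

def gate(changed_files: List[str], labels: List[str], human_approved: bool) -> None:
    """Fail the pipeline step when the trust boundary is crossed without review."""
    if requires_human_review(changed_files, labels) and not human_approved:
        raise SystemExit("Blocked: AI-assisted change inside the trust boundary needs human review.")

if __name__ == "__main__":
    # Example run: an AI-assisted change to payments code with no approval fails the gate.
    gate(["payments/charge.py"], ["ai-assisted"], human_approved=False)
```

As confidence in AI output matures (the trust gap the report highlights), the boundary can be loosened path by path rather than all at once.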
Conclusion
Alan Shimel closes by framing the 2025 DORA findings as a call for pragmatic leadership in an AI-saturated era. While AI tools can supercharge velocity, they can only improve outcomes where strong platforms, disciplined practices, and a culture of user-centricity exist. For organizations lagging in foundational DevOps practices, AI won’t be a magic fix.
“If you’ve got your fundamentals in place — platforms, pipelines, VSM and a relentless focus on users — AI can give you a serious edge. But if you’re hoping AI will fix broken culture or bad process, I’ve got news for you. I’ve seen this movie before, and it doesn’t end well.”
Further Reading and Resources
This post first appeared on the DevOps Blog.