- Your Cloud Architecture Was Built for the Wrong Era
This week: CIOs are turning AI on their own IT departments first, your cloud stack needs a GPU-native redesign, and Spain's blackout just made grid security a boardroom conversation.

Powered by SingleFin
Welcome to this week’s edition of CIOsurge!
This week:
57% of CIOs face pressure to improve productivity and 52% must reduce costs. The leaders getting results are applying AI to their own workflows first.
Traditional cloud architectures are failing under AI workloads. The shift from CPU-centric to GPU-native infrastructure is no longer optional.
Spain's nationwide blackout on April 28 has triggered a cybersecurity investigation into power plant vulnerabilities, with 756 million data points under analysis.
Let’s make this week a game-changer.
Stay sharp. Stay ahead.
💡 Guest Expert Insights: The Build Window Has Collapsed
I was talking with Alex Tuck on the Project Zero podcast recently, and he made a point that stuck with me.
AI is just another tool in a long line of tools we’ve adopted as technologists—dial-up to broadband, servers to cloud, and now AI. The difference this time is the pace.
Alex put it simply: if you don’t iterate constantly, you’re never going to keep up with the changes.
He’s right. Three to five years ago, building a beta of an enterprise-ready software product took at least six to nine months (sometimes twelve). Today, you can do that in a weekend.
The companies that want to win in this environment have to embrace that pace. If you’re not continuing to innovate, you become obsolete.
At SingleFin, we see this play out with customers constantly. The organizations pulling ahead are the ones willing to test, ship, and iterate at a speed that would have looked reckless five years ago.
For technology leaders, that shift demands a different posture on risk. Waiting for certainty before you act is a strategy for falling behind.
Alex added something I think a lot of CIOs need to hear: this AI wave has democratized building for functional folks, the people who are good at talking to people and synthesizing information.
You no longer need to be deeply technical to build. The tools are accessible to everyone now.
That means speed is no longer a differentiator reserved for large enterprises with big engineering teams. The differentiator is willingness to act—something startups have always had, and which AI is now handing to anyone who wants it.
— Zack Tembi, CEO, SingleFin
CIOs Are Turning AI Inward on Their Own IT Departments
According to Gartner, 57% of CIOs face pressure to improve productivity and 52% must reduce costs. The response from leading CIOs is to apply AI to their own internal workflows first. AI is boosting coding efficiency by more than 70%, though testing gains remain closer to 30%. Development teams using AI for vibe coding are slicing months off typical product development schedules. Mike Anderson, CIO at Netskope, has created Gemini Gems as digital twins of employee roles, feeding the AI technical documentation to learn about each role's tasks and knowledge needs. The approach lets him maintain a flat budget while delivering more output across the organization.
The instinct to point AI at customer-facing processes first is understandable. But the CIOs getting the most leverage right now are the ones who started with their own house. The key insight from this piece: automating a bad process just makes it fail faster. The CIOs seeing real gains are rethinking workflows from scratch, then layering AI on top. That's a fundamentally different approach than bolting a copilot onto a legacy process and hoping for productivity gains. If your IT team hasn't gone through this exercise yet, that's where I'd start. The internal gains compound, and they give you a credible proof point before you pitch AI transformation to the rest of the business.
- Zack Tembi
Your Cloud Architecture Was Designed for the Wrong Workloads
Enterprise cloud strategy is shifting from "cloud-first" to "intelligence-first," and most existing architectures were never designed for what AI demands. AI workloads expose hard limitations in traditional cloud setups built for transactional systems. GPU-accelerated compute, high-throughput data pipelines, and massive training datasets create bottlenecks that CPU-centric infrastructure cannot resolve. Organizations now need GPU-native architecture, with careful orchestration of GPU scheduling and memory management. Compliance requirements and specialized GPU availability are driving hybrid and multi-cloud deployments, adding complexity around data consistency and model versioning. Meanwhile, FinOps is becoming essential as LLM training consumes substantial budgets, requiring cross-functional collaboration between data scientists and finance teams.
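The FinOps point above is concrete enough to sketch. Here is a minimal, hypothetical illustration of attributing metered GPU spend to the team that owns each model deployment, so data science and finance are looking at the same numbers. All names, fields, and rates are invented for the example, not taken from any specific platform.

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class ModelDeployment:
    # Hypothetical fields for the sketch: a cost-accountable owner,
    # metered GPU time for the billing period, and the $/GPU-hour rate.
    name: str
    team: str
    gpu_hours: float
    gpu_rate: float

def cost_by_team(deployments: list[ModelDeployment]) -> dict[str, float]:
    """Roll per-deployment GPU cost up to the owning team."""
    totals: dict[str, float] = defaultdict(float)
    for d in deployments:
        totals[d.team] += d.gpu_hours * d.gpu_rate
    return dict(totals)

deployments = [
    ModelDeployment("fraud-llm-train", "risk", gpu_hours=1200, gpu_rate=2.50),
    ModelDeployment("support-bot-infer", "cx", gpu_hours=800, gpu_rate=1.25),
    ModelDeployment("fraud-embeddings", "risk", gpu_hours=300, gpu_rate=2.50),
]
print(cost_by_team(deployments))  # {'risk': 3750.0, 'cx': 1000.0}
```

Even a rollup this simple gives finance a per-team number to challenge and gives data scientists a reason to ask whether a training run needs 1,200 GPU-hours in the first place.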
This is the piece I'd forward to any CIO who thinks their current cloud setup can handle the AI workloads they're planning. It can't. The shift from CPU-centric to GPU-native infrastructure is happening whether you budget for it or not. The organizations that get ahead of this are the ones building FinOps into their AI strategy from day one, pairing every model deployment with a cost accountability framework. If your data science team and your finance team have never been in the same room, that meeting needs to happen this quarter.
- Zack Tembi

💡 CIO Spotlights
Citi has appointed Brian Saluzzo as its new Chief Information Officer.
- Nearly four years at Google as VP of core developer engineering and product management, building and scaling enterprise technology platforms
- Joined Citi in March 2026; appointment as CIO announced April 29
- Mandate includes optimizing technology operations and scaling AI across Citi's global business
- Reports to Tim Ryan, head of Technology and Business Enablement, who cited Saluzzo's expertise in helping "organizations work smarter and more securely"
Saluzzo's appointment comes as Citi launched Arc, a new platform for scaling AI agents across the enterprise.




