
NetApp is a cloud-native data storage and AI solutions provider based in San Jose, California. Data Management Insight spoke to chief marketing officer Gabi Boko to learn more about how the company helps financial institutions.
Data Management Insight: When was NetApp formed, and how do you service financial institutions and financial services companies?
Gabi Boko: NetApp was founded in 1992 with the goal of modernising enterprise storage by embedding software-defined data management – a “built-in, not bolted-on” approach. For more than 30 years, we’ve helped enterprises unify, manage and protect their data across hybrid and multi-cloud environments. For financial institutions, this translates into secure, high-performance access to both structured and unstructured data, which is critical for mission-critical applications and emerging AI workloads. Our solutions help financial institutions build intelligent data infrastructure that ensures data is accessible, governed, and optimised to power functions like streamlining credit decisions, managing risk, detecting fraud and even personalising financial advice.
DMI: What are the most common pain points for clients that NetApp solves?
GB: Clients frequently grapple with data fragmentation across diverse environments – on-premises, single-cloud, or multi-cloud – which hinders consistent insights and operational efficiency. They also face increasing demands for advanced governance, security and seamless data integration into both traditional enterprise applications and new AI workloads. NetApp addresses these by providing a unified data infrastructure, uniquely native across all hyperscalers. This ensures continuity in structure, governance and security, significantly enhancing cyber resilience against threats like ransomware. By building critical functionalities directly into the data layer, we provide peace of mind and enable customers to responsibly utilise their growing data estates.
DMI: Can you elaborate on the “built-in, not bolted-on” philosophy and why it’s beneficial?
GB: Our philosophy emphasises the power of having both built-in capabilities and the flexibility to bolt on specialised solutions. While customers will always integrate various enterprise applications and new workloads, we believe embedding core data management directly into the infrastructure is crucial. This ‘built-in’ approach ensures consistent power, efficiency and security at every level of the technology stack. It means the same principles of governance, security and intelligence apply to your data, regardless of where it resides or how it’s being used. This deep integration offers extra layers of managed control, differentiating us by providing a confident, secure, and optimised data foundation.
DMI: What emerging challenges are you currently solving for your customers?
GB: A significant emerging challenge comes from agentic AI systems, which autonomously retrieve, reason over, and act on enterprise data. The core problem isn’t access to computing power or AI models, but rather whether a customer’s data is truly ‘AI-ready’. This involves ensuring data can be easily accessed, identified, governed and prepared to feed these powerful computational demands. Traditional SaaS architectures weren’t designed for the continuous, autonomous data retrieval these new systems require. Building intelligent data infrastructure solves this by enabling high-performance, low-latency and permission-aware data, which is crucial for operationalising AI, allowing confident experimentation and training.
DMI: How does NetApp use AI internally?
GB: Internally, we’ve invested heavily in training our entire workforce, completing six months of AI classes for all employees. This education focuses on operationalising AI, optimising workflows and leveraging it to enhance scale and speed by automating actions, not replacing people. We call this ‘AI for work’. We rigorously apply AI to our own workloads and systems, experimenting and refining our approaches internally before deploying them for customers. NetApp is also a fervent consumer of its own products and methodologies. This commitment allows us to speak with genuine authority about what works, ensuring our solutions are proven and effective in real-world scenarios.
DMI: What do you see as the next big thing in technology that customers will adopt and NetApp will enable?
GB: Beyond evolving storage formats, the next major shift is the emergence of a robust data platform on top of these formats – what we term ‘intelligent data infrastructure’. Companies are rapidly becoming more data-driven, regardless of AI adoption, shifting focus from mere storage volume to speed. The ability to quickly analyse and act on data is a critical competitive advantage, driven by agent-based systems that continuously retrieve and act on information. NetApp is enabling this by providing real-time access to accurate, consistent data. This future will be built through innovation, strategic partnerships, and a community approach emphasising fluid data exchange via joint APIs.
DMI: What’s in NetApp’s pipeline regarding product and service releases?
GB: Following last year’s launch of our data platform and NetApp AFX – our enterprise disaggregated all-flash storage system with embedded ONTAP software – we’re focused on enhancing AI functionality within this platform. Customers can expect a continued rollout of deep AI capabilities, spanning from the visualisation layer down to the core format layer, directly powering AI-era data workloads. We’re also significantly expanding our ecosystem through numerous new partnerships. These collaborations will extend APIs and deepen connections across enterprises, particularly in areas like security and further advancements in cloud and data management. The next year will primarily centre on these deep AI integrations and strategic partnerships.