Published 03. Feb. 2022

Data Fabric – Securing Your Flexibility and Freedom to Choose

Do you want to achieve new or faster data-driven outcomes? NetApp explains why a Data Fabric is the perfect solution.

A Data Fabric delivers the right data and applications to the right place, at the right time, and with the right capabilities. You stay in control of your data and can keep it safe whether your workload runs on-premises, in the cloud, or in a hybrid of the two. You have the flexibility to transition into a hybrid, multi-cloud setup that suits your business and the freedom to choose the right service for any given workload – now and in the future.

 

Your data—where, when, and how you need it! 

For nearly three decades, NetApp has focused on innovations that enable customers to build stronger, smarter, and more efficient infrastructures. The objective: deliver the right data and applications to the right place, at the right time, and with the right capabilities. When it comes to your business, we’ll meet you where you are, explore where you want to go, and then help you get there with a Data Fabric designed for simplicity and agility.

 

What is Data Fabric? 

In 2016, Dave Hitz, co-founder of NetApp, went on stage at NetApp Insight and introduced a new term: Data Fabric. It wasn’t a product and there were no deliverables; it was a philosophy that NetApp would live by in developing its new and existing products.

He said that most new workloads were going to be cloud-based (but not all), and that while it’s really easy to deploy and destroy workload instances in the cloud, those workloads are useless unless they have relatively local access to the datasets required to achieve business outcomes. 

Any doubts about the cloud being production-ready had been clearly vanquished as AWS and Azure had already grown into behemoths, with each introducing new services seemingly every day. 

A few years after this announcement, it seemed that “Data Fabric” was going to be this overall term that fell into the category of “marketecture” — just a cool term with no real meaning or implementation. 

 
 

So what is the Data Fabric now?

NetApp has created a foundational delivery architecture for workloads and their data together, while most of the industry focuses on one or the other. Customers can provision, manage, and run production, development, or test application instances wherever makes the most sense at the time. This has a tremendously positive impact on data-driven application development and execution workflows, as organizations look to the cloud for their “use-as-you-need” compute farms.

When you consider that according to IDC, the amount of data stored globally will grow from ~40ZB in 2019 to 175ZB in 2025, with 49% of that data stored in a public cloud, it’s clear that two things are true: 1) there’s going to be a ton of data in the cloud, and 2) there’s going to be a ton of data still resident in data centers.  

These datasets will consist of millions or billions (or more) of files or objects, with capacities already exceeding a petabyte. Moving datasets of that scale by scanning filesystems file by file is simply not practical.

At the core of the NetApp Data Fabric lies NetApp SnapMirror technology. SnapMirror moves data efficiently from place to place at the block level, making the number of files irrelevant, and it does so without third-party replication software or appliances that introduce high failure rates and even higher skill requirements for administration.
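To see why block-level snapshot replication is insensitive to file count, consider this toy sketch. It is purely illustrative and not NetApp’s implementation: a volume is modeled as numbered blocks, and an incremental transfer ships only the blocks that changed between two snapshots, whether those blocks belong to three files or three billion.

```python
# Toy sketch of snapshot-diff replication (illustrative only; not
# NetApp's actual SnapMirror implementation). A volume is modeled as
# numbered blocks; replication ships only blocks that changed since
# the last snapshot, so transfer cost tracks changed blocks, not files.

def take_snapshot(volume):
    """Capture a point-in-time copy of the volume's block map."""
    return dict(volume)

def changed_blocks(base_snap, new_snap):
    """Blocks added or modified since the base snapshot."""
    return {
        blk: data
        for blk, data in new_snap.items()
        if base_snap.get(blk) != data
    }

def replicate(destination, delta):
    """Apply only the changed blocks to the destination volume."""
    destination.update(delta)
    return destination

# Source volume: block number -> block contents.
source = {0: b"superblock", 1: b"inode-table", 2: b"file-data-A"}
snap1 = take_snapshot(source)

# Full baseline transfer, then an incremental update.
dest = replicate({}, snap1)
source[2] = b"file-data-A-v2"   # one block changes...
source[3] = b"file-data-B"      # ...and one is added
snap2 = take_snapshot(source)

delta = changed_blocks(snap1, snap2)
dest = replicate(dest, delta)

print(len(delta))     # 2 blocks shipped, regardless of file count
print(dest == snap2)  # True: destination now matches the new snapshot
```

A filesystem-scanning replicator would instead have to walk every file to find what changed; the snapshot-diff approach compares two point-in-time block maps, which is why petabyte-scale volumes with billions of files remain tractable.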

NetApp redeveloped SnapMirror at the beginning of the Data Fabric movement to open it up to other platforms such as S3 to expand the Data Fabric to as many use cases as possible. 

NetApp Cloud Volumes ONTAP lets customers achieve much faster analytics results by applying large pools of ephemeral cloud compute to data that resides primarily on-premises, with the Data Fabric moving that data into the cloud. Those customers stay at the top of the food chain, unlike organizations that get disrupted because they cling to the traditional (read: slow and frustrating) 100% on-premises model of application delivery.

If your organization is looking to achieve new or faster data-driven outcomes, it is imperative that you settle on a foundational architecture that not only gets and keeps your dynamic data in the places where you’ll achieve those outcomes, but also brings your scaled applications to bear on that data for true acceleration. If you do your research, you’ll find that NetApp has led in this space from the outset and is so far ahead in its capabilities that you’ll want to grab onto the NetApp Data Fabric, hold tight, and get ready for a wild ride.

 

Read more about NetApp’s approach to securing your flexibility and freedom to choose with a Data Fabric here, and let’s connect on building the Data Fabric strategy you need.

 

This article was contributed by NetApp.