At the end of last year, a Cohesity research report found that 47% of IT executives were worried that their company's IT budget was being spent on unnecessary storage. This week, Cohesity's second survey report, "Mass Data Fragmentation in the Cloud," based on 900 responses, pointed out that the cost of migrating massive amounts of data is a major concern. Among enterprises that have not yet completed their migration to the public cloud, 42% of IT executives worry about the high cost of moving on-premises workloads to the cloud and the costs associated with data migration.
Data migration, streamlining storage, and eliminating redundancy are the focus of discussions on reducing public cloud storage spending.
In recent years, as multi-cloud deployments have become more common, the importance of data migration, coupled with data gravity, has attracted growing attention. Data gravity describes the tendency of data to stay where it is created; applications running near the data get better latency and throughput. Setting aside security requirements and cost, the best option is to store the data in the public cloud where the application runs. However, the cost of migrating data out of an enterprise data center is very high. As Datamation pointed out, cloud providers offer free inbound data migration, but for companies pursuing a multi-cloud strategy that does not involve a one-time, large-scale migration to a new cloud environment, this may not be the best choice.
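To give a sense of scale for migration out of a data center, here is a quick back-of-the-envelope calculation in Python; both the dataset size and the link speed are illustrative assumptions, not figures from the report:

```python
# Time to move a large dataset over a dedicated link, ignoring
# protocol overhead. Both numbers are illustrative assumptions.
dataset_bits = 1e15 * 8   # a hypothetical 1 PB dataset, in bits
link_bps = 1e9            # a dedicated 1 Gbps connection

seconds = dataset_bits / link_bps
print(f"{seconds / 86_400:.0f} days")  # -> 93 days
```

At this scale the transfer alone takes about three months, before counting validation, cutover, or egress fees, which is why one-time bulk migrations are treated as major projects.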
Source: Cohesity's "Secondary Data Market Study." 87% of IT executives surveyed said their organization's secondary storage data is fragmented, and this group worries that too much money is being spent on storage.
Beyond migration, data is usually also moved as part of an application's routine operation. When a public cloud application uses data stored in another cloud (private or public), the organization must pay for bandwidth or for the volume of data transferred.
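To make that billing model concrete, here is a minimal cost sketch in Python; the per-GB egress price is a hypothetical placeholder, since real prices vary by provider, region, and volume tier:

```python
# Rough estimate of monthly cross-cloud data-transfer cost.
EGRESS_PRICE_PER_GB = 0.09  # USD per GB, an illustrative assumption

def monthly_egress_cost(gb_per_day: float) -> float:
    """Estimate a month of charges for data read across cloud boundaries."""
    return gb_per_day * 30 * EGRESS_PRICE_PER_GB

# Example: an application that pulls 500 GB/day from another cloud
print(f"${monthly_egress_cost(500):,.2f}/month")  # -> $1,350.00/month
```

Even a modest daily transfer adds a recurring line item, which is why co-locating applications with their data (the data-gravity argument above) often wins on cost.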
Source: Cohesity's "Mass Data Fragmentation in the Cloud." This chart shows the concerns of IT teams among the 50% of senior managers whose organizations have not yet completed their public cloud migration. Note that this survey did not ask about public cloud security or other common concerns.
In the new research, Cohesity also found two characteristics common among respondents:
1) Data created in the public cloud is retained in an average of three copies;
2) On average, three solutions or point products from different vendors are used to manage data, including secondary data, across all public clouds. Secondary storage is usually non-critical: static secondary data lives in backups, archives, file shares, object storage, test and development systems, analytics data sets, and in both private and public clouds. Best practice holds that because secondary data does not need to be accessed frequently, it should be stored on cheap hardware. A storage lifecycle management system tags data according to age and importance, and the software then determines where the data is stored, or deletes it if necessary (a minimal sketch of this policy logic follows the list). This is also how most companies implement GDPR (General Data Protection Regulation) requirements for personal data. Without storage lifecycle management, individuals end up choosing which type of storage to use for each use case.
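As a concrete illustration of the tag-then-tier approach just described, here is a minimal sketch in Python. The tier names, age thresholds, and record fields are hypothetical, chosen only to show the shape of the policy logic, and do not correspond to any particular vendor's product:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical age thresholds for demoting data to cheaper tiers.
WARM_AFTER = timedelta(days=30)
COLD_AFTER = timedelta(days=180)
DELETE_AFTER = timedelta(days=365 * 7)  # e.g., a 7-year retention policy

@dataclass
class Record:
    name: str
    created: datetime
    important: bool  # e.g., under legal hold or GDPR-relevant

def placement(record: Record, now: datetime) -> str:
    """Decide where a record should live, or whether to delete it."""
    age = now - record.created
    if age > DELETE_AFTER and not record.important:
        return "delete"        # past retention and not under any hold
    if age > COLD_AFTER:
        return "cold-archive"  # cheapest tier, rarely accessed
    if age > WARM_AFTER:
        return "warm-object"   # infrequent access
    return "hot-primary"       # recent data, frequent access

now = datetime.now()
rec = Record("q3-report.pdf", created=now - timedelta(days=200), important=False)
print(placement(rec, now))  # -> cold-archive
```

The GDPR tie-in is the `delete` branch: once retention expires and no hold applies, the policy removes personal data automatically instead of leaving the decision to individuals.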
Worries about public cloud cost overruns are clear, and many organizations are trying to apply best practices to optimize costs. Even after choosing the most cost-effective vendor and compute type, the price of migrating and storing data will keep IT executives awake at night. It remains uncertain whether vendors' solutions can provide a simple path to better results.
Reduce the number of backups
Although storing secondary data is cheap, costs keep climbing when IT does not delete old backups. Nutanix and Veeam Software are collaborating to provide software that helps companies reduce the number of snapshots and other backups required. To explain the need for their products, Nutanix released the "Cloud Usage Report 2019" and Veeam released the "Cloud Data Management Report 2019." Taken together, these studies provide context for deciding how and when to create backups.
Source: Nutanix's "Cloud Usage Report 2019"
Nutanix's report provides detailed data about its Xi Beam customers' cloud usage. We don't believe any broad conclusions can be drawn from this data, as it represents only a small portion of Nutanix's more than 13,000 customers. That said, it illustrates how companies optimize their use of cloud resources. The figure below shows the number of actions taken to eliminate redundant resources, of which more than 92% were deletions of unused snapshots. This does not mean that reducing snapshot storage saves the most money; it is simply a record of events in these cloud environments. The report also shows that companies with annual revenues under $1 billion usually take steps to optimize virtual machines. Whether these operations can be automated will have a major impact on the relevance of this data.
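As an illustration of what deleting unused snapshots can look like in practice, here is a minimal sketch using AWS's boto3 SDK that flags EBS snapshots whose source volume no longer exists. This is an assumption-laden example for one cloud, not the tooling Nutanix or Veeam ship, and a real cleanup job would also check AMI references and retention policies before deleting anything:

```python
import boto3

ec2 = boto3.client("ec2")

# Collect the IDs of volumes that still exist in this account/region.
volume_ids = {
    vol["VolumeId"]
    for page in ec2.get_paginator("describe_volumes").paginate()
    for vol in page["Volumes"]
}

# Walk our own snapshots and flag those whose source volume is gone.
for page in ec2.get_paginator("describe_snapshots").paginate(OwnerIds=["self"]):
    for snap in page["Snapshots"]:
        if snap.get("VolumeId") not in volume_ids:
            print(f"orphaned: {snap['SnapshotId']} ({snap['StartTime']:%Y-%m-%d})")
            # A real job would verify AMI references and retention first:
            # ec2.delete_snapshot(SnapshotId=snap["SnapshotId"])
```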
Source: Nutanix's "Cloud Usage Report 2019." It is difficult to categorize the different types of storage services, and therefore difficult to define an accurate consumption model for Azure and AWS users.
Source: Veeam's "2019 Cloud Data Management Report." 54% of high-priority applications are backed up or replicated at least once an hour, compared with only 37% of ordinary applications.