
Top 8 Backup and Recovery Software Tools

Veeam Backup & Replication, Commvault, Zerto, Rubrik, Cohesity DataProtect, Nakivo, Veritas NetBackup, Vembu BDR Suite
  1. Veeam Backup & Replication: "Pricing is fairly reasonable and not overly expensive. The replication is the most valuable feature for us."
  2. Commvault: "The most valuable feature is Commvault's coverage. It has wide coverage to back up every data center environment. The product can back up OpenStack and OpenShift, and competitive solutions don't have this feature. Commvault can handle backing up any virtual workload or enterprise application."
  3. Zerto: "It's also much faster than any other migration or disaster recovery platform we have. I work with virtualization, mostly on VMware, and I must admit that Zerto is even better than VMware Site Recovery Manager. Zerto compresses the data and it works much faster."
  4. Rubrik: "Rubrik, when compared to Cohesity, Dell, and Veritas, was the best of the four. Its reporting is most valuable. I like how it reports that everything is successfully backed up. It provides a summarized report that I can give to auditors and management."
  5. Cohesity DataProtect: "I like that Cohesity offers the chance to have data on-premises and in the cloud. All the data is automatically sent to the cloud. I found value in the instant recovery with DataMove, in the protection group model for applying policies to VMs, and in the overall UI navigation and the way the application is laid out in the web browser."
  6. Nakivo: "Nakivo's backup and replication product has allowed us to implement a disaster recovery solution with a target repository in the cloud. The backup for physical machines and replication for the virtual environment are the most valuable aspects."
  7. Veritas NetBackup: "The backup and restore features work very well. Always current with market features."
  8. Vembu BDR Suite: "The ability to map a drive and restore a separate file is most valuable. The restoration activity is good. You can restore all your data or partial parts of it. You can restore a specific version of the data. It has a lot of options for restore, so you can have the correct data that you want to restore. This is very important: you must know what you are going to restore, otherwise you may be overwriting correct data with other data. You must know which specific files you are restoring as well as which version. Partial restore is very important because there might be some files that are newer than the backup and some files that are corrupted. You need to restore some files from the backup, but not all of them."

Advice From The Community

Read answers to top Backup and Recovery Software questions. 563,780 professionals have gotten help from our community of experts.
Ivan Monnier
What is the best backup solution for a hybrid environment (VMware/AWS/GCP)?
Simon Clark

Everyone has recommended good, reliable solutions; Rubrik, Veeam, and Druva would be our top choices. You need to decide: how quickly you need a solution deployed and running; how much you want to spend over the next 3-5 years; whether your business has any reason not to be in the cloud; whether you need to back up every Whirlpool location and endpoint, or the individual regions are responsible for their own local data; how quickly and how often you need to upload and download; whether you are operating M365; and whether you have any relationship with AWS and would perhaps like to buy a solution via AWS Marketplace. Druva and Veeam are both AWS vendor partners and use AWS infrastructure...

The list goes on

Steffen Hornung
Real User

OK, since you want to talk with us, here is my recommendation: Rubrik. Install it wherever you want (cloud, datacenter, edge); it is your choice. Enjoy the Google-search experience when looking for a file to retrieve. I don't know the market share, but that is just statistics anyway, and bound to be old news.

As you are working for a pretty big company, you may want to know that Home Depot deployed Rubrik to over 2,200 locations in just 3 days. Preparation for that stunt took significantly longer, but that is just how much you can do with automation on that product.
It will change your point of view on backup, for sure.

Albeez
Real User

Veeam Backup is very good, and we have been using it for 3 years.


Chris Childerhose
Real User

If you are looking for a solution for a hybrid environment that covers the solutions mentioned then Veeam is the way to go. They cover both Hyper-V and VMware as well as all the cloud platforms with their appliances.

Yes, there are other solutions; as mentioned, make sure you understand what you are trying to achieve when it comes to backup and recovery.

reviewer1559268 (AIX System Administration at a construction company with 1,001-5,000 employees)
Real User

Hello. To me, it does not matter which tool is used for backup. What is important is the value of the data being backed up, what can be restored, and how fast.

Do we have instant recovery (Veeam, Zerto), file versioning (IBM Spectrum), snapshots (VMware, SAN), replication (SAN), cloud backup, etc.?

Also, to be taken into consideration: cost, location, storage, networking, time to backup/restore.

Another item that needs to be addressed is ransomware. 

That being said, the next generation of backup tools should include all the above.

Ali Mansouri
Real User

Veeam Backup and Replication

Simon Clark

The recommendations have been spot on. However, if your budget is very limited, there are options we offer for cloud and on-premise backup which are significantly lower priced but as reliable and feature-rich as the brand leaders. Security is always a focus area, and it is essential that any solution we offer has controls in place to prevent ransomware and/or malicious deletion of critical backups.

Vladan Kojanic
Real User

We first need to define what a "next generation" backup is. Nowadays everything is NG, because it is a trend to name all new products this way. Then comes the question of what needs you have and what kind of backup you need: is it an on-premise or a cloud solution? Each of us can recommend the one we use. Whichever one you choose that is modern, that is, that follows trends and keeps developing, will do the job for you.

We, specifically, have been using Commvault for 5 years and are very satisfied. First of all, it is complete: it enables all possible (and seemingly impossible) combinations and supports almost all hardware vendors, as well as vendors that offer cloud solutions. What attracted us is that it is innovative, follows trends, and quickly adapts to the market. And what we like most is that we, as admins, can manage absolutely all processes in the backup system from one place, from one console.

Ariel Lindenfeld
Hi peers, There's a lot of vendor hype about enterprise backup and recovery software. What's really important to look for in a solution? Let the community know what you think. Share your opinions now!
Tomas Dalebjörk (CGI)
Real User

There are several things to consider.

The flexibility to fulfill the business's requirements:

- Recovery Time Objectives (RTO: how to meet the business's requirements for restoring data, i.e., "how long can the business live without the data")

- Recovery Point Objectives (RPO: how to meet the business's requirements for how much data it can afford to lose in different incidents)

- Backup Time Objectives (BTO: how efficiently the solution protects the data)

- Resource utilization (how cost-efficient the solution is with resources): inline/post-process data reduction, progressive incremental forever with/without rebuilding base data

- Maintenance tasks (data retention management), protecting the solution itself, offline/online upgrades, etc.

- Support from vendor

- Price of the solution

- Limitation of licenses, gentlemen's agreement, or hard limits

- The ability to use different retention policies, exclude content, use different storages, extra copies etc...

- Security of the solution

Philosophy: Why back up data again if the data has not changed?

The fastest way to protect data is to not back it up (again)

Progressive incremental forever (Always incremental)

Philosophy: Why restore all data if you can restore only the data needed?

Instant Recovery or restoring single objects
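The "don't back up unchanged data" philosophy above can be illustrated with a toy sketch: copy only files whose modification time is newer than the last backup. This is a simplification for illustration, not how Spectrum Protect actually tracks changes; real products use journals, block-level change tracking, and catalogs.

```python
import os
import shutil

def incremental_backup(source_dir, backup_dir, last_backup_time):
    """Copy only files modified since the last backup (incremental forever, toy version)."""
    copied = []
    for root, _dirs, files in os.walk(source_dir):
        for name in files:
            src = os.path.join(root, name)
            # Skip files that have not changed since the last run.
            if os.path.getmtime(src) > last_backup_time:
                rel = os.path.relpath(src, source_dir)
                dst = os.path.join(backup_dir, rel)
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.copy2(src, dst)  # copy2 preserves timestamps
                copied.append(rel)
    return sorted(copied)
```

Each run only touches what changed, so the backup window shrinks to the rate of change rather than the size of the dataset.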

Integrating the backup process with applications such as PostgreSQL, Oracle, etc., so that archive logs / WAL logs are protected immediately when they are created, will improve the RPO. This can be done using SPFS - a filesystem for Spectrum Protect.

Taking application-consistent snapshots stored on Spectrum Protect storage using efficient data transfer (progressive block-level incremental forever), reduces the time to take backups, and saves resources on the backup server and the server protected. This can be done using SPFS - Instant Recovery for Spectrum Protect

Restoring only what is needed, can be performed by native backup software such as Spectrum Protect. Provisioning an application consistent snapshot to a server and accessing the data while the restoration is performed in the background can be done using SPIR - Instant Recovery for Spectrum Protect. This helps clients to access data directly to select the data that is needed to copy to the origin or use as production data directly.

Raul Garcia
Real User

They are several aspects;
1) The frequency with which you need the backup files, folders (files) and / or servers in question to be running. Since this frequency is in theory your closest and farthest recovery point at the same time.

Example 1: If you back up every four hours, then in case of a problem you will be able to recover what you backed up up to four hours ago.

2) The estimated size of what you need to back up vs. the time it takes to back it up.
Example 2: If you are going to back up 300 GB every four hours and the process takes 8 hrs. (because your information is sent to an MDF - SITE mirror over an internet link or similar), then you will not be able to back up every 4 hours; you will have to do it every 8 or 9 hrs.

Example 3: If you are going to back up 50 GB every four hours and the process takes 1 hr. (because you send your information to an MDF - SITE mirror over an internet link or similar), then you will have no problem when the next backup starts 4 hours later.
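Examples 2 and 3 boil down to a simple feasibility check: a schedule only works if a run finishes before the next one starts. A sketch, with throughput figures back-derived from the examples (the GB/h numbers are hypothetical inputs, not measurements):

```python
def backup_duration_hours(size_gb, throughput_gb_per_hour):
    """Time a backup takes at a given effective throughput (link, dedupe, etc. included)."""
    return size_gb / throughput_gb_per_hour

def fits_schedule(size_gb, throughput_gb_per_hour, interval_hours):
    # Feasible only if the backup completes before the next scheduled start.
    return backup_duration_hours(size_gb, throughput_gb_per_hour) <= interval_hours

# Example 2: 300 GB at 37.5 GB/h takes 8 h, so a 4-hour schedule cannot hold.
print(fits_schedule(300, 37.5, 4))  # False
# Example 3: 50 GB at 50 GB/h takes 1 h, comfortably inside a 4-hour interval.
print(fits_schedule(50, 50, 4))     # True
```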

3) The application's ability to schedule (in sequence and/or in parallel) what you need to back up.

Example 4: Suppose that some files, folders, and/or servers need to be backed up every 4 hrs., others every 12 hrs., others every 24 hrs., and others perhaps every week. In this case you have to estimate the worst-case scenario very carefully: the moment when the scheduled backups coincide and slow the process down, so that when the following scheduled backups are triggered they still run without setbacks.
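The worst case in Example 4, when all the schedules coincide, can be found with a least common multiple. The intervals below are the hypothetical ones from the example, with "weekly" taken as 168 hours:

```python
from math import lcm  # Python 3.9+

intervals_hours = [4, 12, 24, 168]  # every 4 h, 12 h, 24 h, and weekly

# All schedules fire together every lcm(...) hours; plan capacity for that moment.
worst_case = lcm(*intervals_hours)
print(worst_case)  # 168: once a week, everything runs at once
```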

4) The flexibility of the application for the execution of incremental or full backups

Example 5: Here it is about knowing what the application will do if a backup fails. Does the incremental part that was not backed up start again from scratch? Does it leave a restart point, and if so, how reliable is that process? Will it force you to make a FULL backup that no longer takes 4 hrs. but 24 hrs. or more, so that your schedule has to be re-planned?

5) While it is true that restoration is the most relevant part, prior to this you must make sure that everything that should be backed up is in fact properly backed up.

In these aspects www.datto.com is what worked best for us.

Thang Le Toan (Victory Lee) (Robusta Technology & Training)
Real User

1. Data integrity
(e.g., fast recovery capability, scheduling backups around the most recent problem, a high restore success rate, and the ability to automatically check or open restored data for a quick verification of which backup data can be restored well).

2. Data availability
(e.g. the ability to successfully back up in the backup window).

3. Integrate with the rest of the infrastructure
(e.g. automation, the ability to create scripts when backing up or restoring or syncing data).

4. Easy to use
(for example, an easy-to-find interface for necessary functions, arranging drivers in a process sequence).
5. Confidentiality, data encryption and data protection.
6. Ability to integrate with standards such as the General Data Protection Regulation (GDPR); centralized data management; uniform data control; access to backed-up data via token or smart USB.

Ivo Dissel
Real User

The most important aspect is the time for the backup and restore to finish, and of course how easy it is to configure schedules, rules, policies, etc.

Cheyenne Harden

When deploying backup solutions we look at features that work the way we expect them to. 

Data should be deduplicated to retain quick efficient backups while actually being able to restore without issue. Restoring databases, mailboxes, and domain controllers is particularly difficult for some well-known vendors. We have observed many instances of potential clients having failed restores with "successful" backups. So, having reliable restores is a must. Test often!

Backups must be flexible to meet customer needs with custom retention times while providing quick restore options.

The UI must be easy to use or mistakes will be made during the configuration of backup jobs.

Raul Garcia
Real User

Exactly. In line with what has been mentioned, I would add the order of priority I would give them from my experience (subject, of course, to your best judgment).

From the last backup of your data until the moment you invoke the contingency is your RPO (Recovery Point Objective); and from when you invoke the contingency until you restore the data is your RTO (Recovery Time Objective).

It seems to me that the RTO (Recovery Time Objective) is the more important one to consider because it is always the longest part of the process; if we viewed it as a critical path, the RTO is your critical path.
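The distinction above can be made concrete with a hypothetical incident timeline (all timestamps invented for illustration):

```python
from datetime import datetime

last_backup = datetime(2022, 1, 10, 4, 0)    # last successful backup
incident    = datetime(2022, 1, 10, 7, 30)   # failure occurs; contingency invoked
restored    = datetime(2022, 1, 10, 13, 30)  # data restored, service resumed

rpo = incident - last_backup  # data lost: everything since the last backup
rto = restored - incident     # downtime: the critical path described above

print(rpo)  # 3:30:00
print(rto)  # 6:00:00
```

Shrinking the RPO means backing up more often; shrinking the RTO means restoring faster, which is usually the harder and longer part of the process.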

Chris Ketel

The most important thing is the speed and accuracy and flexibility of the recovery process.

David Thompson
What is the best backup for super-duper (100Gbps) fast read and write with hardware encryption?
Vuong Doan
Real User

The backup speed depends on:
- number of concurrent I/O streams
- data type
- network
- read/write speed of backup repository
- data encryption enable or not
- terabytes of front end data to be backed up

The question is not detailed enough to size a highly scalable, high-throughput environment. To achieve 100Gbps throughput, you have to pin down the information listed above.

For a very large environment, I strongly recommend using either NetBackup or CommVault.

John Askew
Real User

I would suggest Veeam with the underlying storage being provided by a Pure FlashArray//C.

The FlashArray will provide the throughput you are after (it's all-flash), the encryption (FIPS 140-2 certified, NIST compliant), and data reduction (Veeam's isn't that great), which should bring price parity with spinning disk. It also provides immutability, which you may need, and is a certified solution with Veeam.

The other storage platform worth looking at is VAST Storage, which has roughly the same feature set as the Pure arrays but uses a scale-out, disaggregated architecture and wins hands down in the throughput race against the Pure arrays.

reviewer1053252 (Technical Presales Consultant/Engineer at a wholesaler/distributor with 10,001+ employees)
Real User

There is no such thing as the best "anything," let alone backups. Plenty of enterprise solutions on the market can handle the load you mentioned, and it all comes down to your needs.

Hardware encryption may be more secure than software encryption (tougher to hack, but still hackable); however, it opens the door to vendor lock-in, which in certain situations can affect the recoverability of your data.

My advice is to focus on finding a backup solution that can guarantee the recoverability of your data in the event of a disaster, rather than on the best backup at 100Gbps with hardware encryption.

At the end of the day, what's the point of a backup solution that can do everything you mentioned but fails you in the event of a disaster?

If you can give me more environment details, such as what kinds of platforms and apps are being used, I may be able to assist further. Other than that, my answer is that there is no such thing as the best backup for 100Gbps with hardware encryption.

We live in a world where everything is software-defined and it's safe to say that that's the way everyone should go.

reviewer1183848 (User at a media company with 51-200 employees)
Real User

We use the smallest Cohesity cluster possible with three nodes and have 60GBps of available bandwidth. I assume with more nodes you could get to 100Gbps. They have flash and an unbelievable filesystem. Do you have a use case for 12,500 megabytes per second of backup throughput? I'm having trouble envisioning an admin who would be in charge of a source capable of that coming to a forum like this with your exact question!
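The 12,500 MB/s figure is just the unit conversion from 100 Gbps; a quick sketch (the 100 TB dataset used for the time estimate is a hypothetical, and real-world throughput is always lower than line rate):

```python
def gbps_to_mb_per_s(gbps):
    # Decimal units: 1 Gbps = 1e9 bits/s, 8 bits per byte, 1 MB = 1e6 bytes.
    return gbps * 1e9 / 8 / 1e6

print(gbps_to_mb_per_s(100))  # 12500.0 MB/s

# Time to push a hypothetical 100 TB of front-end data at full line rate,
# ignoring dedupe, compression, and every real-world bottleneck:
seconds = 100e12 / (gbps_to_mb_per_s(100) * 1e6)
print(seconds / 3600)  # ~2.22 hours
```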

Mike Zukerman
Real User

I don't think backup appliances with 100Gbps interfaces exist.

This speed is not needed for backups, as the network is hardly ever the bottleneck.

Saravanan Jaganathan
Real User

Nowadays Cisco and other vendors offer 25-Gig and 100-Gig ports. Your physical setup (physical servers or ESXi hosts, including backup servers) should be planned so it can connect to these switches to get a 100-Gig pipe. Data Domain, HPE StoreOnce, and Quantum DXi support hardware encryption. Identify the right hardware model that supports the right I/O for your disk backups; this will eliminate the bottleneck once you have the 100-Gig network. On the software side you can go with NetBackup, Veeam, or Commvault; each has its own option to reduce the recurring data flow through client-side deduplication.

Nick Cerrone

It seems an object store with inline dedupe could fit, but it would need to be sized for the performance; backup targets are typically tuned for ingest. Is the data dedupable or compressible? How much data are you looking to back up, and in how much time? How much data do you need to restore, and in how much time?

Muath Alhwetat
Real User

Your question is not clear enough to calculate the best scenario, because it depends on many factors, such as:
- What you are backing up: a physical or virtualized environment.
- Data type.
- Network speed across all devices.
- Storage type: flash or tape.
- The read/write speed of your disks/tape, and the bus/controller speed the disks are attached to.
- How many files, and how much data, you are backing up.
- Whether your backup application is capable of running multiple jobs and sending multiple streams of data simultaneously.

Some potential points for improvement might include:
- Upgrading switches and Ethernet adapters to Gigabit Ethernet or greater.
- Investing in higher-performing disk arrays or subsystems to improve read and write speeds.
- Investing in LTO-8 tape drives, and considering a library if you are not already using one, so that you can leverage multiplexing (multistreaming) to tape.

Office 365 has built-in backup functionality, but some people recommend having a third-party backup. Is this necessary, and what solutions do you recommend for this?
Tony Kerr

Regarding backups for 365, it all depends on backup costs, licenses, and functionality, and on what type of environment you have, say hybrid to the cloud.

If you are in a large enterprise environment, it may be necessary to change your backup strategy to cover all products and get a cost-effective solution. 365 has basic built-in functionality, but not as many features as enterprise products.





Rupert Laslett (iland Internet Solutions Corp)

Due to Microsoft's 'shared responsibility' model, it is absolutely necessary to have a backup of your O365 data, especially if the data is critical to the business. Whether you require a backup to be compliant or are looking for protection against accidental or malicious deletion (Insider Threats or Malware), a long term archive solution is well worth the price.

There are many providers offering O365 backup solutions today so be sure to check for any hidden fees or potential caps. It's also worth checking to see if the vendor supports backup for SharePoint Online, Teams, and OneDrive as well as Exchange Online.

Some companies provide licenses for you to backup locally, others provide an almost SaaS-like model, incorporating the storage and licensing.
If you don't have local storage available or do not wish to backup locally then you're best off looking at Cloud Service Providers or SaaS providers for O365 Backup. Be sure to understand where your data is held, the level of security and redundancy, and whether or not there is any level of support included in the cost.

You'll also want to be sure you can restore easily, with several different restore options as some vendors have very limited options.
iland cloud, the company I represent, offers a backup of the entire domain within O365 for a per-user price including licensing, unlimited storage, and support, with no extra fees.

Feel free to contact me via LinkedIn if you would like to find out more.
Also happy to answer any questions on other vendors that I have experience with.

Vladan Kojanic
Real User

Surely. Of course, you should first check what kind of contract you have with the cloud provider for Office 365, and what license and support you have. But I would definitely recommend a corporate backup solution as well if you use business applications and databases, or simply store copies of databases. With a backup solution, you have more flexibility and configuration options than what the cloud provider offers through an Office 365 license.

I would even recommend a cloud backup solution, specifically Commvault Metallic, which is suitable even for clients who do not have on-premise capabilities. Here you can also choose which components you want to cover with the backup solution, and, most conveniently, whenever you want to add a new component it is enough to just enter a new license for it, without any setup or installation.

Mark Pattison

The backup functionality built into Microsoft 365 only covers Microsoft losing their own systems.

If you want to recover something YOU have accidentally deleted, or any of the more advanced backup functionality (e.g. the ability to recover a single mailbox from a date in the recent past) then you need 3rd-party backup software.

Martin Mash
Real User

If you don't care about the data stored by Microsoft, then you don't have to back it up. But if you do care about your data, then look into some sort of backup solution for O365/M365. There are many good options out there. Microsoft's responsibility is the infrastructure, but if a user does something they shouldn't have, you could be in for a big headache.

We had a user, about a month after we had migrated, accidentally delete their entire inbox. Since we had a backup solution in place, we were able to recover the inbox back into their mailbox. While our solution was slow in this recovery, the user got all their mail back.

reviewer1243038 (CEO/co-founder at a tech services company with 1-10 employees)
Real User

I would use:

1. Azure Backup solutions - it's quite cheap for some amount of data;

2. Another third-party backup solution - it depends on what the whole environment looks like. Many backup software solutions exist, either per computer/session (agent-based backup) or for virtualized environments (for example, virtual desktops from Microsoft or VMware Horizon/Workspace One).

It's good to use software that can back up the whole user virtual machine:
- Veeam (OS/app agents and virtual environments),
- PureBackup by Archiware (totally free, only support costs, and those are not required; only for virtual machines),
- Networker (combined with DataDomain; a very high level of deduplication),
- agent backups: Symantec/Veritas Backup Exec, Arcserve Backup, or Veeam Agent for Windows (it's free, but there's no common management console if the number of clients is above 10, I believe). These are not expensive solutions.

There are also some built-in solutions. For example, if the data storage is a QNAP/Synology device, synchronization software exists 'in the device'. It is easy to use, but the device has to be fairly powerful because the synchronization client works in continuous mode. For small offices, this solution is enough.

Albeez
Real User

Use GFI Email Archiver. The solution helps with backup and addresses long-term retention requirements. It keeps a copy directly when an email is sent or received and avoids any mail loss due to intentional or accidental deletion of emails by users.

Nurit Sherman
Hi community, We all know it's important to conduct a trial or do a proof of concept as part of the buying process. Do you have any advice for our community about the best way to conduct a trial or PoC? How would you conduct a trial effectively? Are there any mistakes which should be avoided?
Michael Weimann (Infrascale)
Real User

I was going to write a lengthy response, but yours is spot on, Gary. I will only add that the front end and back end of every SMART goal are to be Specific and Timely: document what is important to test and what the criteria for passing are BEFORE you ever take delivery, then set an expected time for the POC to complete and define what a successful test would be.

The only other thing I would add is if the vendor is not providing technical resources to drive and/or assist during the POC...don't waste your time. But, if you expect the vendor to devote the resources, you can also expect the vendor to hold you to a purchasing decision when/if everything passes with flying colors.

Gary Cook (Commvault)

I am not sure if this question comes from a vendor or customer so the response is somewhat generic. If you are the technical customer or end user, try to be involved in the process start to end. If possible, be the hands on the keyboard. No better way to understand the solution if you are going to be the user of it in the future. If you are the vendor promoting ease of use, there is no better way to sell your product to the technical team.

I have managed a lot of data replication, protection, and archiving POCs. Two requirements always stand out. Success criteria and POC type. As a vendor delivering the POC, you will fail 90% of the time without clearly defining these up front. As a customer, you should have a clear idea about why you are investing your time in POC and what you expect to gain from it.

POCs should not be a training exercise. They are a path to purchase a solution for a budgeted project. If you are just kicking the tires, consider the free or self-paced options provided by many vendors. These include on-line labs and downloadable virtual machines or trial software. These cannot be considered a POC in my book.

Now the two key components for a successful POC.

#1 - Define as a Functional or Performance POC

Decide whether you are running a functional or performance-based POC. If you are the vendor, make sure the customer is aware of the limitation of a functional POC in a limited resource environment. Don't allow a Functional POC to become a Performance POC. Been there. Done that. It's never a success.

Functional testing is easier. There is no requirement for measured performance so sizing the environment is a minor issue. Just has to be "fast enough" to keep your attention. They usually cover base installation, backup target configuration, agent configuration, test backups and restores, reporting, alerting, etc. Data sets are generally small. It can be executed in a limited environment with virtual machines. Sometimes the vendor can supply access to a remote lab environment such as the VMware vSAN lab. Sometimes it can be delivered as a preconfigured VM downloaded from the vendor.

Performance testing is complicated. Speeds and feeds matter. You will not be able to backup your entire live environment so you have to build a test environment to mimic it as close as possible if you are looking for GB/sec measurements. Success Criteria become golden in performance tests. You will be following the recommended hardware configuration supplied by the vendor.

#2 - Success Criteria

Define clear success criteria and stay with the plan. This will avoid scope creep where testing has no endpoint.

A test plan can be extremely difficult to create from scratch. Take the time because it is key to a fair and complete test. It will make you think about the purpose of the test. Most vendors have boilerplate POC documents. They are a good starting point but they almost always focus on the strength of the product. If you are the customer performing comparison testing, blend them into a single document.

Some or all of the success criteria should meet the "must have" requirements of a published RFP if it exists.

Test criteria should not be too detailed, especially to favor a particular solution UNLESS that is a pass/fail test.

Define a start and end date based on the testing requirements. Testing should be sequenced: test backups of app A, app B, OS C. Don't jump back and forth between Oracle and SharePoint, for example. Complete one, deal with any issues, check the boxes, and move on.

DR, Performance, and SLA testing absolutely require detailed planning. Too much to detail in this short response. Imagine a POC where you are faced with "I need to recover my 50 TB Oracle server off-site with an RPO of 5 seconds and an RTO of 5 minutes".

In a large POC, you might have regularly scheduled meetings or conference calls for updates on the progress and to deal with issues.

Include a site survey covering security and the network configuration, Prepare to deal with fixed IPs, firewalls, ports, Active Directory, etc. Nothing like a backup solution to break a network and bring the testing to a standstill. Make sure you have a clear understanding of the environment. I once had a POC where they were migrating some AD domains that were part of the test infrastructure. Unknown to me. Needless to say, we faced constant failures.

Define the hardware and configuration requirements on a per server basis. OS, partition sizes, network, etc. This applies to the backup infrastructure servers and the servers that will be the source of the backup data.

Include all the key contacts with access information to servers.

Make sure you have ALL the required resources (human and compute) available on the start date. For example, you might need help from an Oracle DBA or SME on day 2 to continue the installation.

Define a process to modify the plan. I've seen cases where another department sees the shiny new object and wants to jump into testing their app after the plan was approved and tests have begun. Plan to handle this kind of exception in the testing procedure without deviating from the original success criteria, and have any change approved by management.

Define what is considered critical to the success of the test, what is a nice-to-have feature, and optionally, what doesn't matter at all. Be specific. Include application versions if it matters. You might judge test completion as pass / partial pass / no pass, or as a percentage of how well it meets the criteria. Don't use subjective rankings; instead, add a comments column next to each test for subjective observations.

If you are comparison testing two or more solutions, make sure you can test "apples to apples" across the POC candidates. All vendors should be tested to the same standard. It can be difficult to compare an appliance to an enterprise software solution: the appliance will win the easy-to-install checkbox but might fail in the ease-of-expansion category because it requires a new, larger box.

Consider the future in a POC, not just how it functions today. For example, you should think about the process to add additional capacity locally or bring on new sites/servers.

NOTE: Content here subject to updates if I think of something new or helpful.

Fred Kovacs

I know this is a simple answer but research companies that offer this service and use their free software trial versions to see if you like them or not. Research is the answer.

Dominik-Oppitz

1 - Build up a dedicated environment for the evaluation. In it, you can control and monitor all aspects (performance impact on primary storage, restore times, etc.) very granularly without jeopardizing your production infrastructure. Hardware vendors are more than willing to help out, as a new backup solution often comes hand in hand with new hardware.

2 - A man (or woman) with a plan is a man (or woman) who succeeds. Work out an agenda for the evaluation, starting from the business needs (SLOs/SLAs, etc.). Define the necessary processes with the vendor - this is a great test of how supportive they are and will be.

3 - Document the outcome person by person! Everyone looks at a vendor differently, so you need multi-vector information as a foundation for your decision. By the way, this is also a great tool to motivate your staff and to push a vendor's price tag to where you need it!

4 - Stick to the plan but be open to expanding. Never fall back from the initially defined scenario. It was based on business needs, and those needs do not disappear - but boy, do new ones come up during these evaluations. Keep them tracked and manage them accordingly. Not every input needs to be tested, but every one needs to be ticked off and addressed.

5 - Whatever solution you look at: form follows function follows usability follows security

6 - Squeeze whatever you can learn out of these scenarios. You never know when you need it again.

7 - Play fair. Vendors invest a lot of their time in these PoCs. So if they do not fit your needs, tell them. Give them the chance to bring up another solution or to withdraw. But again: never fall back from your agenda. Your business defined the needs.

it_user897210 (IBM Spectrum Protect Expert - ISMS owner at a non-tech company with 10,001+ employees)


Resilience is the keyword of any backup & restore software, whatever it is called. I see three major mistakes when people are driving the analysis and testing of their backup & restore solution.

1 – Definition: Before even starting, RTO/RPO have to be clearly defined; this will be very helpful in determining your B&R tools and architecture.

- RTO/RPO: do I need to back up all my data or only the critical data, and within what timeframe?

- Size: how much active data are we talking about? Should I keep all data indefinitely, or should I put strong data policy management in place (in my environment, the standard is 30 days, and anything above that has to be justified by a business or legal requirement)?

Also, if I put deduplication in place, what is the impact?

- Availability: should I make my B&R infrastructure highly available? What outage can I tolerate without my B&R software?
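Those sizing questions can be roughed out with simple arithmetic before any vendor is in the room. A sketch with assumed figures (the front-end size, change rate, retention, and dedup ratio below are placeholders, not recommendations):

```python
# Rough backup capacity estimate: weekly fulls plus daily incrementals,
# 30-day retention, with and without deduplication. All inputs are assumed.

front_end_tb = 100          # active data to protect
daily_change = 0.05         # 5% daily change rate
retention_days = 30
dedup_ratio = 8             # 8:1, a common vendor claim worth verifying in the POC

fulls_tb = retention_days / 7 * front_end_tb              # ~4.3 weekly fulls retained
incrementals_tb = retention_days * front_end_tb * daily_change
raw_tb = fulls_tb + incrementals_tb
print(f"Raw: {raw_tb:.0f} TB; with {dedup_ratio}:1 dedup: {raw_tb / dedup_ratio:.0f} TB")
```

A POC is exactly the place to measure the real dedup ratio on your own data, since the difference between the raw and deduplicated figure dominates the hardware bill.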

2 – Environment: What is the scope of your backup & restore software? Will you be able to use it as one tool, or many spread over your coverage (data center, workstations…), and at what scale?

3 – Testing: what is the purpose of performing a 100 GB test if I'm covering 100 TB? The test must be driven in a "prod-like" situation.

- Sizing of all daily backups

- Restore in production while the backup is running in parallel

- This will confirm my RTO/RPO or show what gap I have to address.

- My B&R "admin" tasks

- Do I use deduplication? If yes, what is the capability of my hardware to dedup the daily backups (10, 20, 100 TB/24h)?

- Do I use a Disaster Recovery capability?

- Recycling of tapes (considering a smart environment)

To protect my data in the best manner, the time all of those tasks take on my B&R resources must be known.

- Restore my B&R software.
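The deduplication-throughput question above is simple division worth doing before the POC starts. A minimal sketch, assuming an illustrative backup window and the daily volumes mentioned:

```python
# How fast must the backup target ingest (and dedup) to fit the window?
# Window length and daily volumes are illustrative assumptions.

window_hours = 8
for daily_tb in (10, 20, 100):
    gb_s = daily_tb * 1024 / (window_hours * 3600)
    print(f"{daily_tb} TB in {window_hours}h needs {gb_s:.2f} GB/s sustained")
```

If the vendor's quoted ingest rate is below the figure this produces for your daily volume, the window will be blown regardless of any other feature.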

This is a quick heads-up on a hot topic that is too often forgotten when driving a study of B&R software.

it_user871440 (Senior Solutions Consultant with 51-200 employees)

From my experience, the following aspects should be considered to avoid potential problems:
1. Choose only “known vendors” in the market for the POC, especially if the data to be secured is valuable
2. Check for a single product which can fulfill “all data management” requirements out of the box
3. Conduct a “real life” POC which includes all required scenarios (backup & restore)
4. Don’t forget about the performance of the backup & recovery solution, especially the restore speed (RTO)
5. Ask data management specialists early for advice, e.g. rules of thumb / best practices
6. Before deciding on a solution, check the total costs over a longer period (renewal, growth, …)
7. Avoid vendor lock-in (flexible components, e.g. servers, storage, …)
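Point 6 (total costs over a longer period) is worth sketching explicitly, since year-one pricing rarely tells the whole story. A hypothetical comparison - every price below is invented for illustration:

```python
# Compare 5-year TCO of two hypothetical offers: one cheap up front with
# steep renewals, one pricier up front with flat maintenance.

def tco(license_cost, annual_maintenance, growth_cost_per_year, years=5):
    # Year 1 = license purchase; subsequent years = maintenance renewals;
    # capacity growth is paid every year.
    return license_cost + (years - 1) * annual_maintenance + years * growth_cost_per_year

offer_a = tco(license_cost=50_000, annual_maintenance=20_000, growth_cost_per_year=5_000)
offer_b = tco(license_cost=80_000, annual_maintenance=8_000, growth_cost_per_year=5_000)
print(f"Offer A: {offer_a:,}  Offer B: {offer_b:,}")
```

Here the offer that looks cheaper in year one ends up more expensive over five years, which is exactly the trap the recommendation warns about.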

Raul Garcia

I agree with the previous recommendations already given, and would add the following four aspects:

1) Involve every provider of the complete solution: A proof of concept does not involve only the vendor of the application. For example, you should also consider communications providers (carriers) if your test relies on servers in geographically distant locations. Even within the same site, the test may involve other solutions such as virtualization, or even the database of the main application you want to protect (considering the size of the logs, or things of that nature).

2) Prepare your acceptance level and score the test: Testing and scoring as you go is good, but what exactly are we scoring? To be sure we are comparing "apples to apples" after the test, I recommend that before the proof of concept you have the script ready for the steps you want to test, the range over which you will score each result, the findings and assumptions, and the target score against which you will measure how far each solution falls short - even when you only test one. Imagine you are scoring three criteria with 80, 95, and 90 as the target scores, and the solution achieved 40, 55, and 50. Would you say the proof of concept was successful? In my opinion, it delivered only about half of the expected functionality for each requirement, so the test could be considered failed (we must find another solution).
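That scoring idea can be captured in a few lines. A sketch using the numbers from the example above (the criterion names, threshold, and targets are illustrative):

```python
# Score a POC against per-criterion targets; fail if overall achievement
# falls below a chosen threshold. Criteria and threshold are made up.

targets = {"restore speed": 80, "ease of admin": 95, "reporting": 90}
scores  = {"restore speed": 40, "ease of admin": 55, "reporting": 50}

achievement = sum(scores.values()) / sum(targets.values())
verdict = "pass" if achievement >= 0.8 else "fail"
print(f"Achievement: {achievement:.0%} -> {verdict}")
```

Writing the rubric down before testing starts is the point: the threshold and targets are agreed up front, so the verdict cannot be argued into a pass afterwards.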
3) Time required for the test (your time, not the provider's): Another aspect to consider is how much time the provider is willing to invest with you in the use of their solution. Sometimes the best tests are those that simulate a natural period of your operation. It may be 24 hours, a full week (7 days), or even a full month with a change from one period to the next. You should negotiate with the supplier the time you require and, if you do not want to invest more, adjust to what the provider proposes - but that then becomes part of the scoring of the possible results (your requirement vs. the time needed to replicate a scenario of the actual operation).

4) Test both the backup and the recovery in the same POC: Finally, since we are talking about replicating information (an application, a virtual server, etc.), always include in the scope of the proof of concept both backing up the information and returning to normal operation from that backup. Otherwise your proof of concept would be incomplete.

JohannFLEURY


With all that has already been said, be sure to test every technology this PoC will be used for, and do not neglect end-user testing. End users are the final step of a good PoC.

Do not rely on vendor performance stories. They can be far from the reality of your own environment, so have a baseline and a set of performance tests ready to be sure the solution fits your needs, and learn its limits. As an example, do not buy Dell EMC if you need IOPS - it is not made for that - but if you need an archiving solution, then it becomes a good candidate.

And last: a PoC is there to… test, test, test… and then test again, so your teams will be comfortable with it.
