by Kay Ewbank
System backup has changed with the advent of virtual machines and the cloud. Kay Ewbank looks at your options.
HardCopy Issue: 67 | Published: November 6, 2015
One of the most important tasks for anyone managing a computer system is to ensure that if something goes wrong with that system, the users can continue working with their data and applications intact. This holds true whether the system is a single home PC, a small network, or thousands of networked machines spread over many sites. The need is the same whether the machines are real or virtual, and whether the data those users want is stored on a local drive, a network attached drive, or in the cloud.
Along with this increase in locations for data storage has come an ever-increasing volume of data to be managed. Backing up and restoring many gigabytes or even terabytes of data consumes a lot of bandwidth, to the point where normal working can be compromised by the amount of data being transferred. Many companies bite the bullet and dedicate a separate system, with its own servers and hardware, so that data can be backed up in full on a weekly basis with daily incremental backups. If that isn’t an option, then the backup software needs to be sophisticated enough to back up data without disrupting normal network use.
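The full-plus-incremental scheme described above rests on a simple test: the weekly full copies everything, while the daily incremental copies only files modified since the last backup ran. A minimal sketch of that selection step (the timestamp-based change test is an illustration of the general idea, not any vendor's implementation):

```python
import os

def files_changed_since(root, last_backup_time):
    """Return paths under root modified after the last backup ran."""
    changed = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) > last_backup_time:
                changed.append(path)
    return changed

# A weekly full backup copies everything; a daily incremental copies
# only what files_changed_since() reports, keeping the nightly
# transfer far smaller than a full backup.
```

Real products track changes more robustly (archive bits, journals, block-level change tracking), since modification times alone can be unreliable.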
The increasing popularity of virtual machines and cloud storage makes backup more complicated, if only in being sure you’re actually backing up all the data you need to. Knowing how many virtual machines exist in your system, and what data is associated with them, is one problem – particularly as some of those machines might not be mounted when the backup is carried out. Cloud-based data is even trickier. While many cloud providers do offer backup, that might not be sufficient to meet the regulatory requirements of your particular situation. And what would happen if a problem occurred at the cloud provider?
An increasing number of companies are using the cloud as the backup location, but this does need to be carefully thought through. You can be reasonably sure a tape stored in a safety deposit box at the bank will still be there should you actually need it: can you be equally sure your cloud provider will still be in business and will have retained your data? Another drawback of cloud backups is the time it takes to back up and restore data. Some cloud backup companies have resorted to copying customers’ data onto disks and sending them out via courier, which gives an indication of just how long the process takes.
Backup is one area where third party software still offers significant advantages over that provided with the operating system – assuming of course that your operating system has backup utilities included.
There’s a strong case for saying that the third-party backup market thrives because the facilities Windows offers are so feeble, both for servers and desktops. Windows Server 2012 has its own basic solution called Windows Server Backup, made up of a Microsoft Management Console (MMC) snap-in, some command-line tools, and Windows PowerShell cmdlets. You can use it to back up a full server (all volumes), selected volumes, the system state, or specific files or folders. If you’re using the version of Windows Server aimed at small and medium-sized businesses (SMBs), namely Windows Server Essentials, then you get a slightly friendlier wizard-based system that can make use of Windows Azure Backup. You can set it to back up data from PCs connected to the network daily, and if you need to restore data, you can choose individual files and folders or entire PCs. The software applies compression, throttling and encryption before the data is transmitted to the cloud.
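The compress-throttle-encrypt pipeline mentioned above has a natural ordering: compress first (encrypted data is effectively incompressible), then encrypt, then pace the transfer so the upload doesn’t saturate the link. A toy sketch of the compression and throttling stages (zlib and the bandwidth cap are illustrative assumptions, not Microsoft’s implementation; real encryption would use a vetted crypto library and is omitted here):

```python
import time
import zlib

def throttled_chunks(data, chunk_size=64 * 1024, max_bytes_per_sec=1_000_000):
    """Compress data, then yield it in chunks no faster than the cap."""
    compressed = zlib.compress(data)   # compress BEFORE encrypting/sending
    for i in range(0, len(compressed), chunk_size):
        chunk = compressed[i:i + chunk_size]
        yield chunk                    # in real code: encrypt, then transmit
        # crude bandwidth throttle: pause in proportion to bytes sent
        time.sleep(len(chunk) / max_bytes_per_sec)
```

The point of the sketch is the ordering of the stages; a production agent would also resume interrupted transfers and adapt the cap to current network load.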
On the desktop, the backup method depends on which version of Windows you’re using. Windows 10 has both the File History option and Windows Backup and Restore, while Windows 8 is limited to File History, and earlier versions have only Backup and Restore.
File History carries out an automatic backup every hour, backing up all documents stored in the Libraries, Desktop, Favourites and Contacts folders. It can be turned off by the user of the machine, or via a group policy setting in a networked environment. Other options let you configure which external device the backup is stored on, and you can change the backup interval.
The drawback of File History is that it is limited in the data it can protect, and it doesn’t support storing backups in the cloud. One point to note is that the Windows 10 version of File History only backs up files in the Libraries list of a particular account, not the Favourites or Desktop folders.
Versions of Windows other than Windows 8 come with Windows Backup and Restore, which lets you back up folders, libraries, and drives to another drive, a DVD or the local network. There are limitations to Backup and Restore, particularly on the ‘restore’ side: if you want to restore an image, the hardware needs to be identical. Windows Backup is also notoriously slow, with runs taking hours or even days, and it has no ‘resume’ option should the machine be turned off part way through.
Symantec Backup Exec 2015
Backup Exec has been the biggest name in the backup arena since the days when the product was owned by Veritas, but it went through a troubled period after being taken over by Symantec, particularly in the incarnation of the unpopular Backup Exec 2012. Thankfully, Symantec put a lot of work into improving matters with Backup Exec 2014, which was greeted with relief and enthusiasm thanks to its improved performance.
Backup Exec 2015 has built on that goodwill, extending integration with virtualised environments and adding new cloud connectors. However, some customers have been concerned by recent moves that saw the dropping of the Backup Exec appliances, which bundled the software and hardware to provide physical and virtual server backup with integrated deduplication. More uncertainty was caused by the splitting of Veritas from Symantec into a separate company. Backup Exec is reverting to the Veritas stable, which pleases those customers who felt Symantec had taken wrong turns with the software, but adds a degree of uncertainty as to how the new company will develop.
One of the biggest complaints about Backup Exec 2012 concerned its streamlined user interface: the new management console was supposed to be easier to use, but many users found it less accessible, and some of the more advanced features of earlier releases were no longer available. Things have improved somewhat in Backup Exec 2015, but experienced administrators still say the old interface offered more options.
In practical terms, Backup Exec 2015 works with virtual, physical and cloud-based backups, and you can recover VMs, applications, databases, files, folders and individual application objects. The most recent version improves integration with VMware and Hyper-V. It recognises new VMs as they are added to your network and automatically protects them, and you can create a backup of a system that will be converted into a virtual machine for either Hyper-V or VMware. If you need to recover physical or virtual servers, there’s Simplified Disaster Recovery (SDR), a step-by-step recovery process to make things easier, and you can make use of snapshot agents to recover individual virtual machines.
arcserve Backup and arcserve UDP
Arcserve is back as an independent company, with arcserve Backup and arcserve Unified Data Protection as its major products. Those with long memories will remember arcserve as the major player back in the days of Novell NetWare and Windows NT backup, before it became part of the CA megalith. The company is now separate again, and is intent on making itself as popular as it originally was, with a strong focus on service level agreements and a new version of the product in the shape of arcserve Unified Data Protection (UDP).
In terms of features, arcserve Backup has everything bar the kitchen sink. It works well with real, virtual and cloud environments, and supports tape, disk and the cloud as storage for your backups. It is very configurable, with options ranging from all-inclusive software that lets you back up physical and virtual file servers, email servers, database servers and application servers, to more selective options for specific operating systems, file servers or application servers.
If you’re working with virtual machines, you can back up VMware, Microsoft Hyper-V and Citrix XenServer. There’s also an option where you take an image-based backup and convert it to either a bootable virtual machine or a disk image of a server that can be used for bare-metal restore. You can also create a synthetic backup which synthesises a full backup from previous incremental backups.
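A synthetic full backup works by replaying the chain of incrementals over the last real full, producing an up-to-date full without re-reading the source machine. A minimal sketch of the merge (treating each backup as a mapping of path to content, with None as a deletion marker – both simplifying assumptions, not arcserve’s actual format):

```python
def synthesize_full(full, incrementals):
    """Merge a base full backup with incrementals (oldest first) into a new full.

    Each backup is a dict of path -> file content; None marks a deletion.
    """
    merged = dict(full)
    for inc in incrementals:
        for path, content in inc.items():
            if content is None:
                merged.pop(path, None)   # file was deleted in this increment
            else:
                merged[path] = content   # file was added or changed
    return merged
```

The benefit is that the costly full-backup read happens only once; every subsequent "full" is assembled entirely on the backup server from data it already holds.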
Arcserve UDP is an integrated data protection solution that is aimed at the mid-market and SMB sectors. This isn’t a replacement for arcserve Backup, which will continue as a separate product for at least a few years, though the eventual plan is to amalgamate the product lines.
For the moment, if you want to use tape as your primary backup, arcserve Backup is the product to use, whereas arcserve UDP is aimed at companies wanting image-based backup to disk. The aim of arcserve UDP is to combine backup, replication, high availability and global deduplication. You control the software from a nicely designed web console, and there’s a quick-start wizard for most tasks.
UDP uses the concept of a recovery point server where the backing up happens, along with deduplication and replication services. This server can have multiple data stores, and you can replicate the server to a second remote server for added safety. UDP handles virtual machines well, including the idea of a virtual standby, which uses a recovery point to create a VM. The VM is then kept up to date as data changes, and if the node does fail, UDP will automatically start the VM instead.
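Global deduplication of the kind performed on a recovery point server rests on a simple idea: split incoming data into chunks, key each chunk by a hash of its content, and store any given chunk only once however many backups contain it. A toy content-addressed store illustrating the principle (fixed-size chunks and SHA-256 are illustrative choices, not arcserve’s actual scheme):

```python
import hashlib

class DedupStore:
    """Toy content-addressed chunk store illustrating global deduplication."""

    def __init__(self, chunk_size=4096):
        self.chunk_size = chunk_size
        self.chunks = {}   # sha256 digest -> chunk bytes, stored once globally

    def put(self, data):
        """Store data, returning the list of chunk digests (the 'recipe')."""
        recipe = []
        for i in range(0, len(data), self.chunk_size):
            chunk = data[i:i + self.chunk_size]
            digest = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(digest, chunk)   # duplicate chunks are free
            recipe.append(digest)
        return recipe

    def get(self, recipe):
        """Reassemble the original data from a recipe of digests."""
        return b"".join(self.chunks[d] for d in recipe)
```

Because each backup is stored as a recipe of digests rather than raw data, a second backup of a largely unchanged machine adds only the chunks that are actually new.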
Dell NetVault Backup
NetVault Backup 10 is the first major upgrade of the software since Quest was taken over by Dell. It is available as a standalone product, or as part of the Dell Backup & Disaster Recovery Suite along with AppAssure and VRanger.
The new version of NetVault has a new web user interface, along with improved database and monitoring features. NetVault’s database used to be a flat-file affair, but this has been replaced by a multi-threaded PostgreSQL database, and NetVault Backup now carries out operations such as initial backups and secondary copies in parallel.
NetVault is a great product if you run multiple operating systems thanks to its strong cross-platform support and consistent look and feel across the different platforms. You can restore indexes and catalogues across platforms, and it is great at minimising index sizes. It has native support for more backup devices (such as tape libraries) than rival products, though the focus seems increasingly to be on hardware/software combinations with Dell hardware, such as disk backup appliances. There’s also a LAN-free option so you can avoid increased network traffic by using local or SAN-attached storage devices. NetVault users praise its fast and simple device discovery, and its excellent application agents – there are plug-ins for more or less any middleware that you can think of.
The reporting options in NetVault are very customisable, and it’s strong on alerting you if problems occur. The monitoring shows active jobs, policy jobs and data transfer in real-time in a single screen, with info on how the backups are running, which devices are being used and the amount of bandwidth they’re using.
NetVault has good security options for regulatory compliance. You can choose which data should be encrypted on a job-by-job basis, and the encryption algorithms supported include CAST-128, CAST-256 and AES-256.
Dell VRanger
Dell VRanger is part of the Dell Backup & Disaster Recovery Suite, and like NetVault is also available separately if your environment doesn’t require the other elements.
Like Veeam, Dell’s VRanger Pro is aimed at situations where you want to back up virtual machines, in particular VMware virtual environments. The latest version can also be used to back up physical Windows servers, and adds support for Microsoft Hyper-V systems.
VRanger supports VMware ESX and ESXi systems, and comes in a Standard Edition that gives you data protection for small virtual environments, and a Pro edition that adds improved scalability and disaster recovery capabilities. The latest version is vSphere 5 certified, and supports vSphere 5’s improved streaming and memory limits.
VRanger is agentless for virtual machines, making it easier to install and support: backups are handled by a virtual appliance on the machine being backed up. It supports deduplication and disk-to-disk backup with Dell DR appliances, EMC Data Domain, and NetVault SmartDisk. You can take incremental backups, and backups can be taken while the virtual machines are running. If you’re backing up physical Windows servers, a local agent is installed which sends data direct from the original server to the backup target, avoiding the need for a media server.
One way in which VRanger is user friendly is the ability to browse a catalogue of available backups to find the information you’re looking for when you need to restore files or a system. You can then restore the data over a Fibre Channel storage area network rather than loading your normal network with the extra traffic.
Veeam Backup & Replication
Veeam is an increasingly popular choice for backups, having begun life as software just for backing up VMware virtual machines. Support for Microsoft Hyper-V was added some versions ago, but this is still a product that focusses solely on the virtual environment. If you want to back up physical servers, you need a second product.
Nevertheless, within the virtual environment, Veeam works well, and it’s fast, both in backing up and recovery. Veeam claims you can recover a full VM in mere minutes, and the claims are justified.
When you’re creating backups, you can do so from shared storage, and you can take incremental backups to minimise the creation time. Once you’ve started with a full backup, you can then choose to work with synthetic backups, where only the incremental changes are saved. Veeam also addresses one of the backup administrator’s biggest worries by verifying each backup: it creates a virtual machine from the backup, starts that virtual machine, and checks that it works. This means that if you do need to recover the machine, it should actually work.
You can also run a virtual machine directly from the backup file, and this ability is used by Veeam when you choose to recover a machine. The VM is started from the backup so users can get to work immediately using the machine and applications on it, and you can transfer the VM while in use to your local SAN or NAS.
Veeam has some restrictions: for example, you’re expected to back up entire volumes rather than single directories or groups of directories. If you want to back up a subset of a volume, you need to create a new volume containing that data. Veeam’s Hyper-V support also prevents you from backing up pass-through volumes and volumes connected using iSCSI.
The main take-away about Veeam is that it just works, and so long as you’re OK with the virtualisation-only restriction, this is a great option.