7 Best Practices for Backing up Your Salesforce Data
Losing your Salesforce data is probably one thing you don’t like to think about. The consequences could be drastic and it might be more comfortable to simply continue along as though a data loss event isn’t a possibility—but it is.
According to a 2019 study by LogicMonitor, 96% of companies experienced at least one system outage within the previous three years.
System outages result in an inability to provide services, loss of data, potential compliance complications, and loss of revenue.
There are a wide variety of causes for such an event. Strict digital security practices will give your company a better chance of avoiding a data loss event, but they won’t completely eliminate the possibility.
The frequency of cyberattacks continues to rise. For example, ransomware attacks are predicted to occur every 11 seconds this year. And that’s just one of the many different types of cyberattacks that can bring your system down.
However, it doesn’t need to be a malevolent outsider that threatens the stability of your Salesforce data. 95% of data breaches are the result of human error.
The leading cause of data loss actually has nothing to do with breaches, malware, or any type of cyberattack. Accidental deletions are the leading cause of data loss for SaaS applications.
A recent backup of your Salesforce data and the ability to restore it is essential to mitigating the harmful effects of a data loss event.
How to Back up Salesforce Data
There are several types of data within Salesforce, along with specialized procedures for saving that data.
Types of Data
You can employ various methods to back up your Salesforce data safely. The first step is to decide what types of data you want to save since Salesforce keeps records of two varieties—data and metadata.
Data is an umbrella term for all types of records, including deals, accounts, contacts, and more, plus custom object records. Data protection measures are important because even with the best intentions, users and administrators can accidentally delete or modify this information. Tools like Data Loader within Salesforce are valuable in that they make it easy to alter large amounts of data simultaneously, but that same capability increases the risk of accidents. That’s why it’s vital to perform regular data backups and a manual point-in-time backup before major projects.
Metadata refers to derivative information like reports, settings configurations, dashboards, custom code, and your preferred layouts. For complete protection, it’s key to back up both types of information. Administrators, developers, or even users with advanced permissions can alter configuration settings like custom fields, page layouts, or custom code, and since it’s often impossible to undo these changes, it’s vital to have metadata backups.
Data Backup Methods in Salesforce
There are a number of Salesforce data backup methods designed for data and metadata that can help you protect your records and settings.
Methods for backing up your data include:
- Data export service: This tool allows you to manually or automatically export your data as comma-separated values (CSV) files via the user interface (UI). You can generate these backup files every week or every 29 days.
- Data loader: The data loader tool allows you to export your data to a CSV file through the Salesforce API by creating a SOQL query for the data export.
- Report export: The report export feature allows you to create a manual, on-demand report of your data and present it in file formats like a Microsoft Excel sheet or CSV.
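Whichever export method you use, the result is typically a set of CSV files, one per object. The minimal sketch below shows that serialization step using sample records standing in for the results of a SOQL query; the record IDs and field names are illustrative, and a real export would come from Data Loader or a Salesforce API client rather than a hard-coded list.

```python
import csv
import io

# Sample records standing in for the results of a SOQL query such as
# "SELECT Id, Name, Phone FROM Account" (IDs and values are made up).
records = [
    {"Id": "0011x00000AbcDE", "Name": "Acme Corp", "Phone": "555-0100"},
    {"Id": "0011x00000FghIJ", "Name": "Globex Inc", "Phone": "555-0199"},
]

def records_to_csv(rows):
    """Serialize a list of record dicts to CSV text, one row per record."""
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)
    return buffer.getvalue()

csv_text = records_to_csv(records)
print(csv_text)
```

In practice you would write one such file per object and store the set together as a dated snapshot.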
There are also several tools built into Salesforce for backing up metadata:
- Change sets: You can use change sets to send customized metadata between connected Salesforce orgs. For example, after creating and testing new objects, you can send them from the sandbox org to the production org using a change set.
- Sandbox: When you refresh a related sandbox, Salesforce automatically copies over your configured metadata.
- Force.com Migration Tool: This Java/Ant-based command-line utility moves your metadata between a local directory and your Salesforce org.
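Metadata retrievals, whether through the migration tool or other Metadata API clients, are driven by a `package.xml` manifest that lists which metadata types to pull. The fragment below is a minimal illustrative manifest requesting all custom objects and page layouts; the API version number is an assumption and should match your org.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <types>
        <members>*</members>
        <name>CustomObject</name>
    </types>
    <types>
        <members>*</members>
        <name>Layout</name>
    </types>
    <!-- API version is illustrative; use the version your org supports -->
    <version>58.0</version>
</Package>
```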
7 Best Practices for Backing up Salesforce Data
Following established procedures with a Salesforce data backup policy is essential to maintaining security. Here are 7 best practices to backing up your Salesforce data:
1. Find Your Preferred Scope
All backups are not created equal. The amount of storage space available, the amount of data contained within your Salesforce system—there are many factors that can be manipulated to construct a backup plan that works best for your company.
There are four main types of backups, which dictate the scope of the process:
- Full Backup: The entire set of data and metadata
- Incremental Backup: Select incremental changes since the previous backup
- Normal Backup: Backs up the data objects defined in the backup configuration
- Hierarchical Backup: Includes the selected data objects along with all of their related records
Smaller backups will require less storage, but they will also provide less coverage in the case of a data disaster. Each company will need to find their own preferred amount of data to back up, but we recommend backing up as much as possible—preferably everything.
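The difference between full and incremental scope comes down to which records are selected on each run. The sketch below illustrates this with sample records and an illustrative `LastModifiedDate` field: a full backup takes everything, while an incremental backup takes only records changed since the previous backup.

```python
from datetime import datetime

# Sample records with a last-modified timestamp, standing in for
# Salesforce objects (IDs and field names are illustrative).
records = [
    {"Id": "001A", "LastModifiedDate": datetime(2024, 1, 10)},
    {"Id": "001B", "LastModifiedDate": datetime(2024, 3, 5)},
    {"Id": "001C", "LastModifiedDate": datetime(2024, 3, 20)},
]

def full_backup(rows):
    """Full scope: every record, every time."""
    return list(rows)

def incremental_backup(rows, since):
    """Incremental scope: only records changed after the last backup."""
    return [r for r in rows if r["LastModifiedDate"] > since]

last_backup = datetime(2024, 3, 1)
print(len(full_backup(records)))        # all records
print(len(incremental_backup(records, last_backup)))  # changed records only
```

Incremental runs are cheaper, but restoring from them requires the full backup they build on, which is one reason to keep periodic full backups in the rotation.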
2. Schedule Frequent Backups
Recovering a backup of system data isn’t going to do much good if it’s extremely outdated. The idea is to reintegrate current data sets to minimize the impact on your daily operations. This is only possible with a recent repository of backup data.
Schedule repeating backups so you can be sure you have a reliable data set to fall back on. We recommend doing this at least once a week, preferably daily.
The amount of time between your backups will affect your Recovery Point Objective (RPO): the maximum period of time—and the resulting data—you are willing to lose from your system when an incident occurs. A shorter RPO means a higher frequency of backups.
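The relationship is simple to state: an incident just before the next scheduled backup loses up to one full interval of data, so a schedule meets an RPO only when the backup interval never exceeds it. A minimal sketch of that check:

```python
from datetime import timedelta

def meets_rpo(backup_interval: timedelta, rpo: timedelta) -> bool:
    """A schedule satisfies an RPO when the gap between backups never
    exceeds the maximum data-loss window you can tolerate."""
    return backup_interval <= rpo

# A 24-hour RPO is met by daily backups but not by weekly ones.
print(meets_rpo(timedelta(days=1), timedelta(hours=24)))
print(meets_rpo(timedelta(days=7), timedelta(hours=24)))
```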
3. Archive Unused Data
Data archiving is the practice of identifying data that is no longer used, moving it out of the production system, and putting it in long-term storage.
This provides a few different benefits to your Salesforce system:
- Increases capacity, leading to faster backup and recovery efforts
- Eliminates the need to back up inactive data
- Aids in remaining compliant with applicable regulations and laws
- Reduces the effort of maintaining and managing software and infrastructure for on-site backup storage
Archiving is frequently confused with backing up. A backup is a copy of essential data that leaves the original records in place; data archiving moves the data itself to a separate repository, removing it from the functional system.
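Identifying what to archive usually means partitioning records by how long they have been idle. The sketch below illustrates this with sample records and an illustrative `LastActivityDate` field: anything idle longer than a chosen threshold is routed to long-term storage while the rest stays in production.

```python
from datetime import datetime, timedelta

# Sample records with a last-activity timestamp (field name illustrative).
records = [
    {"Id": "001A", "LastActivityDate": datetime(2019, 6, 1)},
    {"Id": "001B", "LastActivityDate": datetime(2024, 2, 14)},
]

def partition_for_archive(rows, now, max_idle):
    """Split records into those still active and those idle long enough
    to move out of the production system into long-term storage."""
    active, archive = [], []
    for row in rows:
        if now - row["LastActivityDate"] > max_idle:
            archive.append(row)
        else:
            active.append(row)
    return active, archive

active, archive = partition_for_archive(
    records, datetime(2024, 3, 1), timedelta(days=365 * 2)
)
print([r["Id"] for r in archive])
```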
4. Set Data Retention Parameters
The long-term storage of data archiving might be separate from your main data repository, but it still requires a certain degree of attention and resources.
How long is long enough when it comes to holding onto unused data? That’s where your data retention parameters will come into play.
There is going to be a variety of types of unused data—everything from outdated customer information to underutilized services. The amount of time you retain these sets of data will impact your bottom line and overall functionality of your Salesforce instance.
The retention period of your backups can be configured to address your needs. Ensure all regulatory requirements are met but keep these parameters to a reasonable length of time.
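Enforcing a retention period is the same filtering idea applied to backup snapshots themselves: anything older than the window can be deleted to control storage costs, provided regulatory minimums are met. A minimal sketch with illustrative snapshot dates:

```python
from datetime import date, timedelta

# Backup snapshots keyed by the date they were taken (dates illustrative).
snapshots = [date(2023, 1, 1), date(2023, 9, 1), date(2024, 2, 1)]

def prune_expired(backups, today, retention):
    """Keep only backups still inside the retention window."""
    cutoff = today - retention
    return [b for b in backups if b >= cutoff]

# With a one-year retention window, the oldest snapshot is eligible
# for deletion.
kept = prune_expired(snapshots, date(2024, 3, 1), timedelta(days=365))
print(kept)
```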
5. Protect Your Storage Repositories
Cybercriminals who gain entry to a target system will move through every area within it that they can reach. This gives them access to various sets of information, and if their goal is to hold your data for ransom or simply to corrupt it, the potential damage is significant.
Your data backups should be secured and separate from other data sets so they aren’t exposed in the event of a breach.
This can be accomplished by encrypting your backup data sets and utilizing strict access control methods.
The only people that should be able to access your backups are the people that need it.
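Encryption and access control are best handled by your storage platform or a vetted cryptography library, not hand-rolled code. A complementary, stdlib-only sketch is tamper detection: signing each backup file with an HMAC whose key only authorized backup jobs hold, so a backup that has been silently modified is rejected before restore.

```python
import hashlib
import hmac
import secrets

# A secret key that only authorized backup jobs hold (hypothetical;
# in practice it would live in a secrets manager, not in code).
key = secrets.token_bytes(32)

def sign_backup(data: bytes) -> str:
    """Compute an HMAC-SHA256 tag for a backup payload."""
    return hmac.new(key, data, hashlib.sha256).hexdigest()

def verify_backup(data: bytes, tag: str) -> bool:
    """Reject a backup whose contents no longer match its tag."""
    return hmac.compare_digest(sign_backup(data), tag)

payload = b"Id,Name\n001A,Acme Corp\n"
tag = sign_backup(payload)
print(verify_backup(payload, tag))
print(verify_backup(payload + b"tampered", tag))
```

`hmac.compare_digest` is used instead of `==` to avoid leaking information through comparison timing.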
6. Institute Reliable Restore Functionality
A complete and reliable backup of your Salesforce system data doesn’t do much good if you can’t pull the information out of it.
Fast restore functionality means you can get your system back up to date and get back to work. This will cover your data, your metadata, attachments, and more.
This essential step in the backup process is often overlooked but is integral to a successful backup security strategy.
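The practical way to confirm restore functionality is a restore drill: take a backup, restore it into a scratch environment, and verify the records survive the round trip. The sketch below does the in-memory equivalent with a CSV backup and sample records (values illustrative).

```python
import csv
import io

original = [
    {"Id": "001A", "Name": "Acme Corp"},
    {"Id": "001B", "Name": "Globex Inc"},
]

# Serialize the records as a CSV backup, then restore and compare.
buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=["Id", "Name"])
writer.writeheader()
writer.writerows(original)

restored = list(csv.DictReader(io.StringIO(buffer.getvalue())))

# A restore drill passes only if every record survives the round trip.
drill_passed = restored == original
print(drill_passed)
```

Running a drill like this on a schedule, rather than only after an incident, is what turns a backup policy into a recovery capability.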
7. Consider Regulatory Requirements
There are a variety of industry-specific regulations that relate to how your company handles system and customer data. The storage of this data is a particularly important aspect.
Instituting strict security measures will help you meet ISO 27001 and NIST backup requirements.
The GDPR, for example, has precise stipulations on when you need to secure—and when you need to delete—sensitive information.
Your backup and recovery plan needs to be tailored to address the exact regulatory requirements related to your industry.
Utilizing a third-party tool that also offers data recovery functionality will support your overall data security efforts and provide the best protection.
Data loss can have many different causes, so it’s important to frequently back up your system data to avoid costly outages. Utilize automation: set recurring backup schedules and configure the settings to match your needs.