Ro Backup
Ro is a multi-platform command-line application to automate backups of your Salesforce data.
Using Ro Backup, you can:
- Authenticate using username/password, OAuth, or JWT
- Back up all exportable data in your org, including attachments and (optionally) deleted records
- Back up all objects by default, or choose specific objects to back up
- Export data as compressed CSV, JSON, or Parquet files
- Optionally copy data to an S3 bucket
- Back up Salesforce data to a database
- Run backups unattended using cron or Windows Task Scheduler (see the example under Usage)
Usage
Authenticate using OAuth, and back up all objects (OAuth token will be stored for future use)
$ ro -e login.salesforce.com
Authenticate with your Salesforce org using a username and password (with the security token appended to the password), and back up all objects
$ ro -e login.salesforce.com -u username@org.example.com -p passwordToken
Back up only the given objects (using stored credentials)
$ ro Opportunity My_Custom_Object__c Document
Use the credentials of a previously authenticated user
$ ro -u username@org.example.com
Back up to a database (SQLite only for now)
$ ro -D backup.db Opportunity My_Custom_Object__c Document
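The resulting SQLite file can be inspected with the standard sqlite3 client. The commands below are a quick sanity check and assume that ro writes each backed-up object to a table named after that object; the actual table layout is not documented here.
$ sqlite3 backup.db ".tables"
$ sqlite3 backup.db "SELECT COUNT(*) FROM Opportunity;"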
Override default backup options to exclude base64-encoded fields and include deleted records
$ ro --exclude-base64 --include-deleted
Set output options such as the output path, file format, whether to zip the output, and whether to include objects with no records
$ ro --exclude-empty --output=path/filename --format=json --unzipped
Set advanced options, such as the maximum number of concurrent jobs and the PK chunking batch size
$ ro --max-jobs=100 --chunk-size=100000
Save the command-line flags and arguments to a config file that can be used for subsequent runs
$ ro --exclude Task --exclude Event --exclude-base64 --write-config ro.xml
Use a config file to set flags and arguments
$ ro --read-config ro.xml
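For unattended runs, the saved config file pairs naturally with a scheduler. The crontab entry below is a minimal sketch; the 02:00 schedule, the install path of ro, and the log file location are placeholders, not defaults of the tool.
# Crontab entry: run a full backup every night at 02:00 using the saved config file.
# Adjust the paths below for your installation.
0 2 * * * /usr/local/bin/ro --read-config /home/backup/ro.xml >> /var/log/ro-backup.log 2>&1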
Complete list of options
Usage: ro [options] [<objects>]

  <objects>   standard or custom object names to back up (empty for all)

Options:

backup
  -B, --exclude-base64         do not fetch base64 fields (default: fetch all fields)
  -E, --exclude string         object(s) to exclude from backup
  -b, --before string          limit records exported to those created/modified before this date
  -i, --include-deleted        fetch deleted objects (default: do not fetch deleted)
  -s, --since string           limit records exported to those created/modified since this date

output
  -D, --dsn string             write to database using DSN instead of file
  -S, --s3-bucket string       s3 bucket to upload to, output flag is used as key or prefix
  -U, --unzipped               do not zip output file (default: zip output)
  -X, --exclude-empty          exclude objects with no records from output file (default: include all objects)
  -f, --format string          output file format (csv, json, parquet) (default: csv)
  -o, --output string          output filename (default: time.now().zip)
  -r, --dry-run                log backup process without actually performing bulk API jobs

authentication
  -a, --connected-app string   connected app id
  -e, --endpoint string        salesforce instance to authenticate with
  -j, --jwt string             jwt signing key filename
  -p, --password string        salesforce instance password
  -u, --username string        salesforce instance username

logging
  -L, --log-level string       set log level based on string (debug, info, warn, error)
  -P, --perf                   enable performance monitoring
  -d, --debug                  set log level to debug
  -q, --quiet                  set log level to warn

config
  -R, --read-config string     read configuration from file
  -W, --write-config string    write configuration to file for future re-use

advanced
      --where string           limit records exported to those matching a custom WHERE condition
  -c, --chunk-size int         enable pk chunking with given number of records per batch, use 0 to disable (default: 100000)
  -m, --max-jobs int           max number of concurrent bulk API jobs (default: 100)

about
  -v, --version                display version
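As a combined example, the short script below sketches a nightly run that backs up all objects for a previously authenticated user, excludes base64 fields, writes a dated archive, and copies it to S3. The bucket name, output path, and the assumption that AWS credentials are available in the environment are illustrative, not requirements documented above.
#!/bin/sh
# nightly-backup.sh - sketch of a scheduled full backup (paths and bucket are placeholders)
set -eu

STAMP=$(date +%Y-%m-%d)

ro -u username@org.example.com \
  --exclude-base64 \
  --format=csv \
  --output="backups/salesforce-$STAMP" \
  --s3-bucket=my-backup-bucket \
  --log-level=info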