Description
When an infrastructure has dozens of services and databases of different types, manual backups turn into a nightmare. One server runs PostgreSQL, another MySQL, a third MongoDB, and each needs its own commands (pg_dump, mysqldump, mongodump) and its own scripts.
The Dumper project solves this problem by combining all types of databases into one universal tool.
Dumper is written in Go, works from the CLI, and is configured in YAML, which makes it easy to embed in cron, CI/CD pipelines, GitHub Actions, or a Docker environment.
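For the Docker case, a minimal sketch could look like the run below; the image name is hypothetical (you would build it yourself or use whatever image the project publishes), and the mounted paths simply match the config and backup directories used later in this article:
# Hypothetical image name; mount the config and SSH keys read-only, the backup directory read-write
docker run --rm \
  -v /opt/dumper/config.yaml:/opt/dumper/config.yaml:ro \
  -v ~/.ssh:/root/.ssh:ro \
  -v /opt/backups:/opt/backups \
  my-registry/dumper:latest \
  --config /opt/dumper/config.yaml --all
The same one-liner drops into a CI job step (GitHub Actions, GitLab Runner, Jenkins) without changes.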
Advantages of Dumper
Multi-DBMS — PostgreSQL, MySQL, MongoDB and others.
SSH connections — you can make dumps from remote servers.
Flexible file name templates — convenient for storage and automation.
Automatic archiving and deletion of old dumps.
Easy integration with cron/CI/Jenkins/GitLab Runner.
Example: backing up three databases
Dumper configuration:
settings:
  template: "{%srv%}_{%db%}_{%date%}_{%time%}"
  archive: true
  remove_dump: true
  format: "dump"
  dir_dump: "/opt/backups/dumps"
  dir_archived: "/opt/backups/archived"
  location: "server"
servers:
  prod-pg:
    name: "db-prod-pg"
    host: "10.1.1.10"
    port: 22
    user: "ubuntu"
    private_key: "~/.ssh/id_rsa"
    is_passphrase: false
  stage-mysql:
    name: "db-stage-mysql"
    host: "10.1.1.20"
    port: 22
    user: "deployer"
    password: "mysqlpass"
  analytics-mongo:
    name: "db-analytics-mongo"
    host: "10.1.1.30"
    port: 22
    user: "admin"
    private_key: "~/.ssh/id_backup"
    is_passphrase: false
databases:
  production_pg:
    name: "prod_db"
    user: "postgres"
    password: "pgpass"
    server: "prod-pg"
    driver: "psql"
    format: "plain"
  staging_mysql:
    name: "staging"
    user: "mysqluser"
    password: "mysqlpass"
    server: "stage-mysql"
    driver: "mysql"
    format: "sql"
  analytics_mongo:
    name: "analytics"
    user: "root"
    password: "mongo123"
    server: "analytics-mongo"
    driver: "mongo"
    format: "bson"
    options:
      auth_source: "admin"
      ssl: true
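With these settings, a dump of prod_db taken through the prod-pg server lands in /opt/backups/dumps under a name built from the template, roughly along the lines of:
db-prod-pg_prod_db_2025-01-15_03-00-00.sql
(the exact date/time formatting and file extension depend on Dumper and the chosen format, so treat this as an illustration). Because archive: true, the dump is then compressed into /opt/backups/archived, and remove_dump: true presumably removes the original dump file afterwards.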
To verify the setup, you can dump a single database:
./dumper --config ./config.yaml --db production_pg
Or all the databases at once:
./dumper --config ./config.yaml --all
Or select a database interactively:
./dumper --config ./config.yaml
Automation via cron
Add a daily backup entry to /etc/crontab:
0 3 * * * root /usr/local/bin/dumper --config /opt/dumper/config.yaml --all >> /var/log/dumper.log 2>&1
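After the first scheduled run, it is worth confirming that the job actually produced files and did not fail silently; the paths below match the crontab line and the config above:
# Check the job log and the resulting archives
tail -n 50 /var/log/dumper.log
ls -lh /opt/backups/archived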
Tips and recommendations:
- Make sure the SSH key is configured and the server is reachable without a password prompt (or use ssh-agent).
- Check permissions: the database user must be allowed to perform the dump (for example, run pg_dump for PostgreSQL).
- Monitor free disk space: regular backups that are never pruned fill a disk surprisingly quickly.
- Use the template setting so that the file name reflects the date and time, which makes dumps easy to find and archive.
- Test recovery: a backup is not valuable until you are sure it can be restored (see the sketch after this list).
- Keep sensitive information in mind: the configuration file contains passwords and key paths, so store it in a safe place or restrict access to it.
- Get familiar with the formats: MongoDB can use bson, while PostgreSQL supports plain, dump, and tar.
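As a minimal restore-test sketch for the PostgreSQL database from the example config, assuming a scratch database you can safely drop and an illustrative dump file name (adjust paths and names to whatever your template actually produces):
# Plain-format pg_dump output is SQL text, so it can be replayed with psql
createdb restore_test
psql -d restore_test -f /opt/backups/dumps/db-prod-pg_prod_db_2025-01-15_03-00-00.sql
psql -d restore_test -c '\dt'   # quick check that the expected tables exist
dropdb restore_test
# For MySQL the equivalent is `mysql dbname < dump.sql`; for MongoDB, `mongorestore`.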
Summary
Dumper is a simple tool for database backups, especially useful when you have several different servers and databases. It lets you standardize the process, minimize manual steps, and integrate backups into automated workflows.