
Deploy Postgres S3 Backup (Fastest)
A simple Bun app to back up your PostgreSQL database to S3 on a cron schedule
Deploy and Host Postgres S3 Backup (Fastest) on Railway
PostgreSQL S3 Backup is an automated backup service that uses Railway cron to dump PostgreSQL data and upload it to S3-compatible storage. The service is written in TypeScript and provides configurable scheduling and storage options.
About Hosting Postgres S3 Backup (Fastest)
Postgres S3 Backup runs as a Bun application that executes pg_dump on a schedule and uploads compressed database dumps to S3 storage. You'll need to manage cron job reliability, monitor backup success/failure rates, and handle S3 storage costs as backup data accumulates. The service requires database connection management, S3 authentication, and error handling for network failures during uploads. Storage lifecycle policies become important for managing backup retention and costs over time.
Common Use Cases
- Database Administrators: Automate regular PostgreSQL backups to cloud storage with configurable retention policies
- DevOps Engineers: Implement disaster recovery procedures and maintain backup compliance for production databases
Dependencies for Postgres S3 Backup (Fastest) Hosting
- Bun Runtime: TypeScript execution environment for backup and upload operations
- PostgreSQL Access: Database connection credentials and pg_dump utility availability
- S3 Storage: AWS S3 or compatible storage service with appropriate access permissions
Deployment Dependencies
Implementation Details
Overview:
The template is a TypeScript app that runs natively on the Bun runtime; a Railway cron schedule triggers it to dump your PostgreSQL data to a file and then upload that file to S3.
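As a rough illustration of that flow (not the template's exact source; the pg_dump invocation, file handling, and use of the AWS SDK client are assumptions), the core of such a script could look like this:

// Sketch of the dump-and-upload flow, assuming pg_dump is on PATH and
// @aws-sdk/client-s3 is installed. The SDK reads credentials from
// AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY automatically.
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

const dumpPath = `/tmp/backup-${Date.now()}.sql`;

// 1. Dump the database to a local file
const proc = Bun.spawn(["pg_dump", process.env.DATABASE_URL!, "-f", dumpPath]);
if ((await proc.exited) !== 0) throw new Error("pg_dump failed");

// 2. Compress the dump
const compressed = Bun.gzipSync(new Uint8Array(await Bun.file(dumpPath).arrayBuffer()));

// 3. Upload the compressed dump to the configured bucket
const s3 = new S3Client({ region: process.env.S3_REGION ?? "auto" });
await s3.send(
  new PutObjectCommand({
    Bucket: process.env.S3_BUCKET!,
    Key: `backup-${new Date().toISOString()}.sql.gz`,
    Body: compressed,
  })
);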
Key Features:
- Configurable backup schedule: set via the Cron setting in the service Settings after deployment
- Support for custom buckets: the script supports an S3_ENDPOINT environment variable to use any S3-compliant storage bucket (e.g., Wasabi)
Required Configuration:
# AWS/S3 Configuration
AWS_ACCESS_KEY_ID=your-access-key-id
AWS_SECRET_ACCESS_KEY=your-secret-access-key
S3_BUCKET=your-bucket-name
S3_REGION=us-east-1
# Database Configuration
DATABASE_URL=postgresql://user:password@host:port/database
# Backup Schedule
cron=0 5 * * *
Environment Variables:
- AWS_ACCESS_KEY_ID - AWS access key ID
- AWS_SECRET_ACCESS_KEY - AWS secret access key, sometimes also called an application key
- S3_BUCKET - The name of the bucket that the access key ID and secret access key are authorized to access
- S3_REGION - The name of the region your bucket is located in; set to auto if unknown. Default: auto
- DATABASE_URL - The connection string of the database to back up
- S3_ENDPOINT - The custom S3 endpoint you want to use. Applicable for third-party S3 services such as Cloudflare R2 or Backblaze B2
- S3_FORCE_PATH_STYLE - Use path-style addressing for the endpoint instead of the default subdomain style, useful for MinIO. Default: false
- BUCKET_STORAGE_CLASS - Storage class of the S3 bucket. Default: STANDARD
- BACKUP_FILE_PREFIX - Add a prefix to the backup file name. Default: backup
- BUCKET_SUBFOLDER - Define a subfolder to place the backup files in
- BACKUP_OPTIONS - Add any valid pg_dump option (see the pg_dump documentation for the supported options). Example: --exclude-table=pattern
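As a sketch of how these variables might be read and defaulted at startup (the validation helper and config object shape are assumptions, not the template's actual code):

// Sketch: collect configuration from the environment, applying the documented defaults
const required = (name: string): string => {
  const value = process.env[name];
  if (!value) throw new Error(`Missing required environment variable: ${name}`);
  return value;
};

const config = {
  accessKeyId: required("AWS_ACCESS_KEY_ID"),
  secretAccessKey: required("AWS_SECRET_ACCESS_KEY"),
  bucket: required("S3_BUCKET"),
  region: process.env.S3_REGION ?? "auto",
  databaseUrl: required("DATABASE_URL"),
  endpoint: process.env.S3_ENDPOINT, // optional, e.g. Cloudflare R2
  forcePathStyle: process.env.S3_FORCE_PATH_STYLE === "true",
  storageClass: process.env.BUCKET_STORAGE_CLASS ?? "STANDARD",
  filePrefix: process.env.BACKUP_FILE_PREFIX ?? "backup",
  subfolder: process.env.BUCKET_SUBFOLDER,
  backupOptions: process.env.BACKUP_OPTIONS ?? "",
};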
Backup Process:
# Basic backup workflow
1. Connect to PostgreSQL database using DATABASE_URL
2. Execute pg_dump with specified BACKUP_OPTIONS
3. Compress database dump file
4. Upload to S3 bucket with timestamp and optional prefix
5. Clean up local temporary files
6. Log backup success/failure status
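For steps 2 and 4 above, the object key and extra pg_dump flags could be assembled along these lines (the exact naming scheme is an assumption based on the prefix and subfolder variables):

// Sketch: build the S3 object key from the documented prefix/subfolder variables
const timestamp = new Date().toISOString().replace(/[:.]/g, "-");
const prefix = process.env.BACKUP_FILE_PREFIX ?? "backup";
const subfolder = process.env.BUCKET_SUBFOLDER
  ? `${process.env.BUCKET_SUBFOLDER.replace(/\/+$/, "")}/`
  : "";
const key = `${subfolder}${prefix}-${timestamp}.sql.gz`;

// Pass extra pg_dump flags through BACKUP_OPTIONS (naive whitespace split)
const extraArgs = (process.env.BACKUP_OPTIONS ?? "").split(/\s+/).filter(Boolean);
const dumpArgs = ["pg_dump", process.env.DATABASE_URL!, ...extraArgs, "-f", "/tmp/backup.sql"];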
Advanced Configuration Examples:
# Custom S3 endpoint (Cloudflare R2)
S3_ENDPOINT=https://account-id.r2.cloudflarestorage.com
S3_FORCE_PATH_STYLE=false
# Backup with exclusions
BACKUP_OPTIONS=--exclude-table=temp_* --exclude-table=logs
# Organized storage
BACKUP_FILE_PREFIX=prod-db-
BUCKET_SUBFOLDER=postgresql-backups/
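With the AWS SDK (one possible client library; the template may use Bun's built-in S3 support instead), the custom endpoint and path-style settings map onto the client roughly like this:

import { S3Client } from "@aws-sdk/client-s3";

// Sketch: honor S3_ENDPOINT / S3_FORCE_PATH_STYLE for R2, Backblaze B2, MinIO, etc.
const s3 = new S3Client({
  region: process.env.S3_REGION ?? "auto",
  endpoint: process.env.S3_ENDPOINT, // undefined falls back to the default AWS endpoint
  forcePathStyle: process.env.S3_FORCE_PATH_STYLE === "true",
  credentials: {
    accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
  },
});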
Storage Considerations:
- Monitor S3 storage costs as backups accumulate
- Implement lifecycle policies for backup retention
- Consider backup compression and deduplication
- Plan for disaster recovery and backup restoration procedures
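Retention can be handled on the bucket itself rather than in the backup service. For example, a lifecycle rule that expires old backups could be set with the AWS SDK like this (the 30-day window and the postgresql-backups/ prefix are assumptions; some S3-compatible providers manage lifecycle rules through their own dashboards instead):

import { S3Client, PutBucketLifecycleConfigurationCommand } from "@aws-sdk/client-s3";

// Sketch: expire backup objects after 30 days (adjust to your retention policy)
const s3 = new S3Client({ region: process.env.S3_REGION ?? "auto" });
await s3.send(
  new PutBucketLifecycleConfigurationCommand({
    Bucket: process.env.S3_BUCKET!,
    LifecycleConfiguration: {
      Rules: [
        {
          ID: "expire-old-backups",
          Status: "Enabled",
          Filter: { Prefix: "postgresql-backups/" }, // hypothetical subfolder
          Expiration: { Days: 30 },
        },
      ],
    },
  })
);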
Why Deploy Postgres S3 Backup (Fastest) on Railway?
Railway is a singular platform to deploy your infrastructure stack. Railway will host your infrastructure so you don't have to deal with configuration, while allowing you to vertically and horizontally scale it.
By deploying Postgres S3 Backup (Fastest) on Railway, you are one step closer to supporting a complete full-stack application with minimal burden. Host your servers, databases, AI agents, and more on Railway.
Template Content
Postgres S3 Backup
slvssb/railway-postgres-s3-backup