Cloud storage is convenient, but it can get expensive fast. I wanted a simple, reliable, and low-cost backup solution for my projects without getting locked into Google Drive or Dropbox pricing.
That’s where Hetzner Storage Boxes come in: a generous amount of storage, WebDAV support out of the box, and a price that’s hard to beat.
In this article, I’ll show you how I set up my own “personal backup drive” using Hetzner Storage Boxes, with a focus on one-way backups (local → remote only). This way, I always have an up-to-date backup of my important folders without worrying about accidental remote deletions syncing back to my laptop.
Why Hetzner Storage Boxes?
When I started looking for a backup solution, cost and flexibility were the two main factors. Hetzner Storage Boxes checked both boxes. They start at just a few euros per month and provide a generous amount of space, making them one of the most affordable options for long-term storage.
Some of the highlights include:
- Ample storage capacity – plans start at 1 TB and scale up as needed.
- Multiple access protocols – WebDAV, FTP, SFTP, and rsync are all supported out of the box.
- Built-in backup features – snapshots and backup intervals can be configured to add extra safety.
- Cost efficiency – far cheaper than popular services like Google Drive, Dropbox, or OneDrive for the same capacity.
Below is a price comparison for individual plans, as of mid-2025:
| Provider / Plan | Storage Offered | Price (Monthly) |
|---|---|---|
| Hetzner Storage Box (BX11) | 1 TB | €3.20 |
| Hetzner Storage Box (BX21) | 5 TB | €10.90 |
| Google One – 100 GB | 100 GB | €1.99 |
| Google One – 2 TB | 2 TB | €9.99 |
| Dropbox Plus – 2 TB | 2 TB | €9.99 |
| OneDrive Personal – 1 TB | 1 TB | €9.20 |
What This Means for Your Wallet
- Ridiculously low entry cost – Hetzner gives you 1 TB for just over €3/month, versus €9–10 from Google, Dropbox, or Microsoft for similar or even less storage. That’s roughly one-third of the cost for comparable space.
- No gimmicks – Hetzner includes unlimited data transfer, optional snapshot features, and support for protocols like WebDAV, FTP, SFTP, and rsync.
- Budget scaling – Paying just a few euros monthly makes it perfect for low-cost backups, especially if your goal is simplicity and affordability.
Because WebDAV is natively supported, it integrates seamlessly with Python. That means I can create a simple, lightweight sync solution, like the one in this guide, without relying on heavy third-party tools or vendor lock-in.
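As a quick illustration of that integration, here's a minimal sketch (assuming the `webdavclient3` package and placeholder credentials in the `uXXXXXX` format Hetzner uses) of how a WebDAV client gets configured:

```python
def webdav_options(hostname: str, username: str, password: str, timeout: int = 30) -> dict:
    """Build the options dict that webdavclient3's Client() constructor expects."""
    return {
        "webdav_hostname": hostname,
        "webdav_login": username,
        "webdav_password": password,
        "webdav_timeout": timeout,
    }

# With webdavclient3 installed, a connectivity check would look like:
#   from webdav3.client import Client
#   client = Client(webdav_options("https://uXXXXXX.your-storagebox.de", "uXXXXXX", "secret"))
#   client.list("/")  # lists the top-level directory if the credentials are valid
```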
Hetzner Storage Box Configuration
You can sign up for Hetzner here.
Hetzner offers several Storage Box plans, depending on your needs:

Once you’ve got your Storage Box, you’ll need a few details for the scripts in this guide to work:
- Username (normally looks like `uXXXXXX`)
- Password for your main account (note: this script doesn’t support sub-accounts)
- WebDAV access and External Reachability enabled
- Storage Box URL (see Hetzner’s WebDAV docs)
Here’s an example configuration:
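A minimal `.env` sketch, using the placeholder username format `uXXXXXX` (the values here are illustrative, not real credentials):

```
HETZNER_BASE_URL=https://uXXXXXX.your-storagebox.de
HETZNER_USERNAME=uXXXXXX
HETZNER_PASSWORD=your-password
```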

And if you’d like to go beyond just Storage Boxes, Hetzner also offers cloud VPS at a budget-friendly price. I’ve put together a PDF guide that shows you how to turn one into your own Platform-as-a-Service (PaaS). A great next step if you want to self-host projects, apps, or services, all without breaking the bank.
First Attempt: File Monitoring Only
My first approach was the simplest one: just watch my local folders for changes and push any new or modified files to my Hetzner Storage Box. The idea was to keep my backup updated automatically without having to manually run anything or do a full sync each time.
Here’s how I thought about it:
- Monitor my local folder for new or changed files (including moves and deletes).
- Upload these file changes to the remote storage via WebDAV.
Here’s the initial Python script:
```python
import time
import os
from pathlib import Path

from dotenv import load_dotenv
from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler
from webdav3.client import Client

LOCAL_DIR = Path("./HetznerDrive").resolve()
REMOTE_DIR = "/HetznerDrive"

# Load environment variables from .env file
load_dotenv()

# --- Helpers -----------------------------------------------------------------

def load_config():
    """Load configuration from environment variables."""
    # Convert to webdavclient3 format
    options = {
        'webdav_hostname': os.getenv('HETZNER_BASE_URL'),
        'webdav_login': os.getenv('HETZNER_USERNAME'),
        'webdav_password': os.getenv('HETZNER_PASSWORD'),
        'webdav_timeout': 30,
    }
    return options

def create_webdav_client(config_options):
    """Create and configure WebDAV client."""
    client = Client(config_options)
    client.verify = True  # verify TLS certificates
    return client

# --- Watchdog Local Event Handler --------------------------------------------

class LocalHandler(FileSystemEventHandler):
    def __init__(self, client):
        self.client = client

    def on_modified(self, event):
        if not event.is_directory:
            try:
                self.upload(Path(event.src_path))
            except Exception as e:
                print(f"Error handling file modification for {event.src_path}: {e}")

    def on_moved(self, event):
        try:
            rel_source = str(Path(event.src_path).relative_to(LOCAL_DIR))
            remote_path_source = f"{REMOTE_DIR}/{rel_source}".replace("\\", "/")
            rel_dest = str(Path(event.dest_path).relative_to(LOCAL_DIR))
            remote_path_dest = f"{REMOTE_DIR}/{rel_dest}".replace("\\", "/")
            print(f"Moving {rel_source} to {rel_dest}")
            self.client.move(remote_path_source, remote_path_dest)
            print(f"Moved {remote_path_source} to {remote_path_dest}")
        except Exception as e:
            # Report against event.src_path: remote_path_source may not be set yet
            print(f"Error handling file move for {event.src_path}: {e}")

    def on_created(self, event):
        if not event.is_directory:
            try:
                self.upload(Path(event.src_path))
            except Exception as e:
                print(f"Error handling file creation for {event.src_path}: {e}")
        else:
            try:
                rel = str(Path(event.src_path).relative_to(LOCAL_DIR))
                remote_path = f"{REMOTE_DIR}/{rel}".replace("\\", "/")
                self.client.mkdir(remote_path)
                print(f"Created remote directory: {remote_path}")
            except Exception as e:
                print(f"Error handling directory creation for {event.src_path}: {e}")

    def on_deleted(self, event):
        if not event.is_directory:
            try:
                rel = str(Path(event.src_path).relative_to(LOCAL_DIR))
                remote_path = f"{REMOTE_DIR}/{rel}".replace("\\", "/")
                self.client.clean(remote_path)
                print(f"Deleted remote: {rel}")
            except Exception as e:
                print(f"Error handling file deletion for {event.src_path}: {e}")

    def upload(self, path: Path):
        try:
            rel = str(path.relative_to(LOCAL_DIR)).replace("\\", "/")
            remote_path = f"{REMOTE_DIR}/{rel}"
            print(f"Uploading {rel} -> {remote_path}")
            self.client.upload_sync(remote_path=remote_path, local_path=str(path))
            print(f"Successfully uploaded: {rel}")
        except Exception as e:
            print(f"Error uploading {path}: {e}")
            # Don't re-raise - we want to continue monitoring other files

# --- Main Sync Loop ----------------------------------------------------------

def main():
    try:
        config_options = load_config()
        client = create_webdav_client(config_options)
        print(f"Local directory: {LOCAL_DIR}")
        print(f"Remote directory: {REMOTE_DIR}")
        LOCAL_DIR.mkdir(exist_ok=True)

        # Ensure remote base directory exists
        try:
            print(f"Ensuring remote directory exists: {REMOTE_DIR}")
            client.mkdir(REMOTE_DIR)
        except Exception as e:
            print(f"Warning: Could not create remote directory {REMOTE_DIR}: {e}")
            print("This might be normal if the directory already exists or if you don't have write permissions")

        # Start local watcher
        handler = LocalHandler(client)
        obs = Observer()
        obs.schedule(handler, str(LOCAL_DIR), recursive=True)
        obs.start()
        print("Starting file sync (Ctrl+C to stop)...")
        try:
            while True:
                time.sleep(30)
        except KeyboardInterrupt:
            print("\nStopping sync...")
            obs.stop()
        finally:
            obs.join()
            print("Sync stopped.")
    except Exception as e:
        print(f"Fatal error: {e}")
        import traceback
        traceback.print_exc()
        return 1
    return 0

if __name__ == "__main__":
    raise SystemExit(main())
```
I store my WebDAV credentials in a `.env` file, which the script loads using `python-dotenv`. This keeps sensitive information separate from the code and easy to update.
The script uses Watchdog to monitor the local folder for changes. Whenever a file is created, modified, deleted, or moved, the corresponding action is applied to the remote folder. This ensures the backup mirrors my local folder accurately.
Files are uploaded using a helper function that preserves relative paths, so the remote folder structure stays the same as my local one. Moves and renames are handled intelligently to avoid duplicates, and deletions remove files from the remote storage to keep it clean.
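To make the path-preserving logic concrete, here's the mapping step in isolation (a sketch with a hypothetical local root; the function name is mine, not from the script):

```python
from pathlib import Path

LOCAL_DIR = Path("/home/user/HetznerDrive")  # hypothetical local root
REMOTE_DIR = "/HetznerDrive"

def to_remote_path(local_path: Path) -> str:
    """Map a local file path to its WebDAV counterpart, preserving relative structure."""
    rel = local_path.relative_to(LOCAL_DIR).as_posix()  # forward slashes on any OS
    return f"{REMOTE_DIR}/{rel}"

# Example: /home/user/HetznerDrive/docs/notes.txt -> /HetznerDrive/docs/notes.txt
```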
The `main()` function sets up the WebDAV client, ensures the local and remote directories exist, and starts the Watchdog observer. A loop runs indefinitely, and I can stop the sync anytime with `Ctrl+C`.
I’ve actually written a separate article that dives deeper into file system monitoring with Watchdog, covering how it works under the hood and when it’s useful. You can check it out here: Mastering File System Monitoring with Watchdog in Python.
Improved Approach: Add Startup Sync
It was a good first attempt for basic incremental backups, but I realized that if I wanted a clean and reliable remote mirror, I needed to add a startup sync step before entering monitoring mode.
That’s where the improved approach comes in:
```python
# Needed to parse the HTTP-date strings WebDAV returns for 'modified'
from email.utils import parsedate_to_datetime

# --- Startup Sync -------------------------------------------------------------

def startup_sync(client):
    """Perform initial sync: upload local files to remote and delete remote files not present locally."""
    print("Starting initial sync...")

    # Get list of all local files and directories
    local_files = set()
    local_dirs = set()
    for root, dirs, files in os.walk(LOCAL_DIR):
        root_path = Path(root)
        for file in files:
            rel_path = str(root_path.joinpath(file).relative_to(LOCAL_DIR)).replace("\\", "/")
            local_files.add(rel_path)
        for dir_name in dirs:
            rel_path = str(root_path.joinpath(dir_name).relative_to(LOCAL_DIR)).replace("\\", "/")
            local_dirs.add(rel_path)
    print(f"Found {len(local_files)} local files and {len(local_dirs)} local directories")

    # Get list of all remote files and directories
    remote_files = set()
    remote_dirs = set()
    try:
        def list_remote_recursive(path):
            try:
                items = client.list(path)
                for item in items:
                    # webdavclient3 may include the listed directory itself; skip it
                    if item.rstrip('/') == path.rstrip('/').split('/')[-1]:
                        continue
                    if item.endswith('/'):
                        # Directory
                        dir_name = item.rstrip('/').split('/')[-1]
                        if dir_name:  # Skip root path
                            rel_path = path.replace(REMOTE_DIR, '').lstrip('/')
                            if rel_path:
                                remote_dirs.add(f"{rel_path}/{dir_name}")
                            else:
                                remote_dirs.add(dir_name)
                            # Recursively list subdirectories
                            list_remote_recursive(f"{path}/{dir_name}")
                    else:
                        # File
                        rel_path = path.replace(REMOTE_DIR, '').lstrip('/')
                        if rel_path:
                            remote_files.add(f"{rel_path}/{item}")
                        else:
                            remote_files.add(item)
            except Exception as e:
                print(f"Warning: Could not list remote directory {path}: {e}")

        list_remote_recursive(REMOTE_DIR)
        print(f"Found {len(remote_files)} remote files and {len(remote_dirs)} remote directories")
    except Exception as e:
        print(f"Warning: Could not list remote files: {e}")
        print("Proceeding with upload only...")

    # Delete remote files that don't exist locally
    deleted_count = 0
    for remote_file in remote_files:
        if remote_file not in local_files:
            try:
                remote_path = f"{REMOTE_DIR}/{remote_file}"
                print(f"Deleting remote file (not in local): {remote_file}")
                client.clean(remote_path)
                deleted_count += 1
            except Exception as e:
                print(f"Error deleting remote file {remote_file}: {e}")
    print(f"Deleted {deleted_count} remote files")

    # Delete remote directories that don't exist locally (deepest first)
    deleted_dirs = 0
    for remote_dir in sorted(remote_dirs, key=len, reverse=True):
        if remote_dir not in local_dirs:
            try:
                remote_path = f"{REMOTE_DIR}/{remote_dir}"
                print(f"Deleting remote directory (not in local): {remote_dir}")
                client.clean(remote_path)
                deleted_dirs += 1
            except Exception as e:
                print(f"Error deleting remote directory {remote_dir}: {e}")
    print(f"Deleted {deleted_dirs} remote directories")

    # Create remote directories that don't exist
    created_dirs = 0
    for local_dir in local_dirs:
        try:
            remote_path = f"{REMOTE_DIR}/{local_dir}"
            client.mkdir(remote_path)
            created_dirs += 1
        except Exception:
            # Directory might already exist
            pass
    print(f"Created {created_dirs} remote directories")

    # Upload local files that don't exist remotely or are newer
    uploaded_count = 0
    for local_file in local_files:
        try:
            local_path = LOCAL_DIR / local_file
            remote_path = f"{REMOTE_DIR}/{local_file}"

            # Check if remote file exists and compare modification times
            should_upload = True
            try:
                remote_info = client.info(remote_path)
                if remote_info and remote_info.get('modified'):
                    # File exists remotely; 'modified' is an HTTP-date string,
                    # so parse it before comparing with the local mtime
                    local_mtime = local_path.stat().st_mtime
                    remote_mtime = parsedate_to_datetime(remote_info['modified']).timestamp()
                    if local_mtime <= remote_mtime:
                        should_upload = False
            except Exception:
                # Remote file doesn't exist or info is unavailable
                pass

            if should_upload:
                print(f"Uploading: {local_file}")
                client.upload_sync(remote_path=remote_path, local_path=str(local_path))
                uploaded_count += 1
            else:
                print(f"Skipping (up to date): {local_file}")
        except Exception as e:
            print(f"Error uploading {local_file}: {e}")
    print(f"Uploaded {uploaded_count} files")
    print("Initial sync completed!")
```
This function works in three phases. First, it walks through my local folder, collecting a complete list of files and directories. At the same time, it queries the Hetzner Storage Box recursively to build a list of what’s already on the remote side.
With both lists in hand, the script compares them. Any files or directories that exist remotely but not locally are deleted. This prevents the remote backup from filling up with stale or renamed files. Conversely, any folders that exist locally but not remotely are created to keep the directory structure in sync.
Finally, the script uploads all local files that are either missing from the remote or have a newer modification time. That way, I don’t waste time re-uploading unchanged files, but I know everything important is up to date.
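The comparison itself boils down to two set differences. Here's a stripped-down sketch of the decision logic (the function name is mine, not from the script):

```python
def plan_mirror(local_files: set, remote_files: set) -> tuple:
    """One-way mirror plan: local is the source of truth.

    Returns (stale, missing): files to delete remotely and files to upload.
    Files present on both sides are then compared by modification time.
    """
    stale = remote_files - local_files    # exists remotely only -> delete
    missing = local_files - remote_files  # exists locally only -> upload
    return stale, missing
```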
Once this one-time sync is done, the script switches into monitoring mode using Watchdog, just like before. From that point on, only changes are pushed, which keeps the sync efficient and fast.
The end result is a setup where my Hetzner Storage Box always starts as a clean copy of my local folder, and then stays updated in real time. This gives me confidence that my remote backup is accurate, not just “close enough.”
Full source code can be downloaded from:
Usage
Once I had the script ready, I set it up with just a few simple steps.
Install Dependencies
First, I installed the required Python packages. I’m using `watchdog` for file monitoring, `webdavclient3` for WebDAV communication, and `python-dotenv` for environment variables:

```
pip install watchdog webdavclient3 python-dotenv
```
Configure Environment Variables
Next, I created a `.env` file in the same directory as the script. This keeps my credentials safe and out of the code:

```
HETZNER_BASE_URL=https://your-storagebox-url
HETZNER_USERNAME=your-username
HETZNER_PASSWORD=your-password
```
Run the Script
Finally, I just started the sync process:
```
python hetzner_drive_sync.py
```
On startup, the script performs a full sync to ensure my remote backup matches my local folder. After that, it switches into monitoring mode, watching for changes in real time and syncing them to Hetzner automatically:
```
Local directory: D:\GitHub\Hetzner-Storage-Box-Cloud-Drive\HetznerDrive
Remote directory: /HetznerDrive
Ensuring remote directory exists: /HetznerDrive
Starting initial sync...
Found 1 local files and 1 local directories
Found 1 remote files and 1 remote directories
Deleting remote file (not in local): s/r.txt
Deleted 1 remote files
Deleted 0 remote directories
Created 1 remote directories
Uploading: s/b
Uploaded 1 files
Initial sync completed!
Starting file sync (Ctrl+C to stop)...
Uploading a -> /HetznerDrive/a
Successfully uploaded: a
```
I can stop the sync at any time with `Ctrl+C`.
Wrap-Up
At the end of the day, this little script gives me exactly what I wanted: an affordable personal cloud backup that I fully control. Hetzner’s Storage Box is inexpensive, fast, and reliable, and by layering on a simple one-way sync I don’t have to worry about anything getting out of hand.
The key for me is that it’s one-way. My local folder is always the source of truth, and the remote just mirrors it. That means no confusing conflicts, no messy merge logic. Just a clean backup of my files in the cloud.
Every time I start the script, it does a full sync to tidy things up, then quietly runs in the background, catching changes as they happen. For me, that adds up to real peace of mind: my important files are safe off-site, but I don’t have to think about them.
My name is Nuno Bispo (a.k.a. Developer Service), and I love to teach and share my knowledge.
This blog is mostly focused on Python, Django and AI, but Javascript related content also appears from time to time.
Feel free to leave your comment and suggest new content ideas. I am always looking for new things to learn and share.
Follow me on Twitter: https://twitter.com/DevAsService
Follow me on Instagram: https://www.instagram.com/devasservice/
Follow me on TikTok: https://www.tiktok.com/@devasservice
Follow me on YouTube: https://www.youtube.com/@DevAsService