Importing SQLite Database Snapshots
Overview
One of the challenges of running an AR.IO Gateway is the initial synchronization time as your gateway builds its local index of the Arweave network. This process can take days or even weeks, depending on your hardware and the amount of data you want to index. To accelerate this process, you can import a pre-synchronized SQLite database snapshot that contains transaction and data item records already indexed.
This guide will walk you through the process of importing a database snapshot into your AR.IO Gateway.
Note
These instructions are written for a Linux environment. Windows and macOS users must adapt them to use the appropriate package manager and command syntax for their platform.
Unless otherwise specified, all commands should be run from the root directory of the gateway.
Quick Start
Download Database Snapshot
Download the latest database snapshot using BitTorrent:
transmission-cli "magnet:?xt=urn:btih:62ca6e05248e6df59fac9e38252e9c71951294ed&dn=2025-04-23-sqlite.tar.gz&tr=udp%3A%2F%2Ftracker.opentrackr.org%3A1337%2Fannounce&tr=http%3A%2F%2Ftracker.opentrackr.org%3A1337%2Fannounce&tr=udp%3A%2F%2Fopen.demonii.com%3A1337%2Fannounce&tr=udp%3A%2F%2Ftracker.torrent.eu.org%3A451%2Fannounce&tr=udp%3A%2F%2Fp4p.arenabg.com%3A1337%2Fannounce&tr=https%3A%2F%2Ftracker.bt4g.com%3A443%2Fannounce"
This downloads a 42.8GB snapshot current to April 23, 2025.
Extract the Snapshot
Extract the downloaded tarball:
tar -xzf 2025-04-23-sqlite.tar.gz
This creates a directory with the extracted database files.
Import the Snapshot
Replace your existing database with the snapshot:
# Stop the gateway
docker compose down
# Backup existing database (optional)
mkdir sqlite-backup
mv data/sqlite/* sqlite-backup/
# Remove old database (only needed if you skipped the backup step)
rm -f data/sqlite/*
# Import new snapshot
mv 2025-04-23-sqlite/* data/sqlite/
# Start the gateway
docker compose up -d
Detailed Instructions
Obtaining a Database Snapshot
SQLite database snapshots are very large and difficult to update incrementally. For these reasons, AR.IO distributes them using BitTorrent.
Install Torrent Client
Install a BitTorrent client. We recommend transmission-cli:
# Ubuntu/Debian
sudo apt-get install transmission-cli
# CentOS/RHEL
sudo yum install transmission-cli
# macOS
brew install transmission-cli
Download Snapshot
Download the latest snapshot using the magnet link:
transmission-cli "magnet:?xt=urn:btih:62ca6e05248e6df59fac9e38252e9c71951294ed&dn=2025-04-23-sqlite.tar.gz&tr=udp%3A%2F%2Ftracker.opentrackr.org%3A1337%2Fannounce&tr=http%3A%2F%2Ftracker.opentrackr.org%3A1337%2Fannounce&tr=udp%3A%2F%2Fopen.demonii.com%3A1337%2Fannounce&tr=udp%3A%2F%2Ftracker.torrent.eu.org%3A451%2Fannounce&tr=udp%3A%2F%2Fp4p.arenabg.com%3A1337%2Fannounce&tr=https%3A%2F%2Ftracker.bt4g.com%3A443%2Fannounce"
This will download a snapshot, current to April 23, 2025, of an unbundled data set that includes all data items uploaded via an ArDrive product, including Turbo. The file will be named 2025-04-23-sqlite.tar.gz and be approximately 42.8GB in size.
Consider Seeding
Seeding Recommendation
While continuing to seed the torrent after download is not required, it is highly recommended to help ensure the continued availability of the snapshot for others, as well as the integrity of the data. Seeding this file should not cause any issues with your internet service provider.
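If you want to resume seeding at a later time, one approach (a sketch; transmission-cli verifies the data already on disk before it begins seeding) is to re-open the torrent with the same magnet link and point the client at the directory that contains the downloaded file:
# Re-open the torrent; existing data is verified, then seeded until you stop the client
transmission-cli -w /path/to/download/directory "<magnet link from the Download Snapshot step>"
transmission-cli also typically keeps seeding after the initial download finishes, so simply leaving the original download command running accomplishes the same thing.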
Extracting the Database Snapshot
Once the file has downloaded, you can verify and extract it using the following commands.
Verify Download
Check that the file downloaded completely:
ls -lh 2025-04-23-sqlite.tar.gz
# Should show approximately 42.8GB
Extract the Archive
Extract the tarball:
tar -xzf 2025-04-23-sqlite.tar.gz
If you are not using the example above, be sure to replace the filename with the actual filename of the snapshot you downloaded.
Verify Extraction
Check that the extraction was successful:
ls -la 2025-04-23-sqlite/
# Should show SQLite database files
This will extract the file into a directory matching the filename, minus the .tar.gz extension.
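For an additional sanity check before importing, you can run SQLite's built-in quick_check against the extracted files. This is a sketch: it assumes the sqlite3 command-line tool is installed and that the extracted databases use a .db extension, and even the quick check can take a while on files this size.
# Run a structural check on each extracted database file
for db in 2025-04-23-sqlite/*.db; do
  echo "Checking $db"
  sqlite3 "$db" "PRAGMA quick_check;"
done
Each database should report ok.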
Importing the Database Snapshot
Once you have an extracted database snapshot, you can import it into your AR.IO gateway by replacing the existing SQLite database files. Make sure the gateway is stopped (docker compose down) before replacing the files, and start it again (docker compose up -d) once the import is complete.
IMPORTANT
Importing a database snapshot will delete your existing database and replace it with the snapshot you are importing.
Backup Existing Database
(Optional) Back up your existing SQLite database files:
mkdir sqlite-backup
mv data/sqlite/* sqlite-backup/
Import New Snapshot
Move the snapshot files into the data/sqlite directory:
mv 2025-04-23-sqlite/* data/sqlite/
Be sure to replace 2025-04-23-sqlite with the actual directory name of the extracted snapshot you are using.
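After moving the files, a quick listing confirms the snapshot databases are in place before you restart the gateway:
ls -lh data/sqlite/
# Should show the SQLite database files from the snapshot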
Verifying the Import
The simplest way to verify the import is to check the gateway logs to see what block number is being imported.
Check Gateway Logs
View the gateway logs to see the current block height:
docker compose logs -f gateway
Look for messages indicating the current block being processed.
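If the logs are noisy, you can filter for height-related messages. This is a sketch: it assumes the relevant log lines contain the word "height", which may vary between gateway releases.
docker compose logs -f gateway | grep -i height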
Verify Block Height
The 2025-04-23 snapshot was taken at block 1645229, so the gateway will start importing blocks after this height if the snapshot was imported successfully.
You should see logs showing blocks being processed starting from block 1645230 or higher.
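You can also query the imported database directly for its highest indexed block. This is a sketch: the core.db filename and the stable_blocks table are assumptions about the gateway's SQLite schema and may differ between releases.
# Highest stable block height recorded in the snapshot (schema names are assumptions)
sqlite3 data/sqlite/core.db "SELECT MAX(height) FROM stable_blocks;"
For the 2025-04-23 snapshot this should report a value close to 1645229.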
Use Grafana (Optional)
You can also use the Grafana Extension to view the last block imported in a more human-readable format.