I've been a paying user of Dropbox for about six years, but through UMaine I get access to G Suite. G Suite includes Drive File Stream, which essentially provides the same service as Dropbox (with some considerations). I felt very uncomfortable with transferring my nearly 2TB of data from Dropbox to Drive File Stream (what if I lost something!?!?), but as each yearly bill came around I became more inclined to make the swap. At first, I considered manually transferring my files through the desktop applications, but that was starting to lead me down a dark path of tragedy and ruin. After some forum sleuthing, I found some posts that suggested using rclone to transfer my data over.

Rclone can be run on multiple platforms, but since this was a process I wanted to run continuously, I set it up in a DigitalOcean Droplet. Initially I was skeptical, since it sounded too good to be true: something that would solve all of my cloud storage woes. The documentation says it checks files as they are transferred (duplicates, failed copies, etc.) and lets you purge directories, all from the command line. It doesn't require you to download data and then upload it; it simply initiates the transfer without tying up your local computing resources. The fantastic part is that it also has other handy features (like sync), which I am definitely going to use now that I've seen how well copy and purge worked.

The following are the steps I followed to set up rclone and transfer all of my data from Dropbox to Google Drive in an Ubuntu 18.04 DigitalOcean droplet.

Go

Once you SSH into or log in to your droplet, you'll need to install Go, the programming language rclone is written in:

Download the tarball. There will likely be a newer version of Go available by now, so it's okay if the version you download doesn't match the one listed here:

curl -O https://dl.google.com/go/go1.14.7.linux-amd64.tar.gz

Compute the checksum to see if the hash matches:

sha256sum go1.14.7.linux-amd64.tar.gz

Does the hash match? Mine matched with: 4a7fa60f323ee1416a4b1425aefc37ea359e9d64df19c326a58953a97ad41ea5
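Rather than eyeballing the two strings, you can let sha256sum do the comparison for you. The hash below is the one for go1.14.7; substitute the checksum listed on the Go downloads page for the version you actually downloaded (note the two spaces between the hash and the filename, which the --check format requires):

```shell
# verify the tarball against the published checksum; prints "OK" on a match
echo "4a7fa60f323ee1416a4b1425aefc37ea359e9d64df19c326a58953a97ad41ea5  go1.14.7.linux-amd64.tar.gz" | sha256sum --check
```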

If the hash matched, then you're ready to extract the tarball:

tar xvf go1.14.7.linux-amd64.tar.gz

Recursively set ownership of the extracted Go directory, then move it into /usr/local:

sudo chown -R root:root ./go
sudo mv go /usr/local

Now it's time to set some environment variables:

# edit your profile to set GOPATH (your Go workspace) and add Go to your PATH
nano ~/.profile

# add to last line of ~/.profile
export GOPATH=$HOME/work
export PATH=$PATH:/usr/local/go/bin:$GOPATH/bin

# refresh profile
source ~/.profile

Once these paths are set, test your Go directory with the following:

mkdir -p $HOME/work/src/github.com/user/hello

nano ~/work/src/github.com/user/hello/hello.go

# add the following:

package main

import "fmt"

func main() {
    fmt.Printf("hello, world\n")
}

# write and exit

go install github.com/user/hello

# now execute the command
hello

# "hello, world" should print out

gcc/g++ compiler

Now that you have Go installed and running, the gcc/g++ compiler needs to be installed:

sudo apt update
sudo apt install build-essential
sudo apt install manpages-dev
gcc --version

Rclone

While I was installing rclone, the current build in GitHub was failing, so I checked out and installed a previous passing build. You may not need to do this if the current version has been fixed.

#go get github.com/rclone/rclone
cd ~/work/src/github.com/user
git clone https://github.com/rclone/rclone.git
cd rclone

# current release is broken, so I'm checking out the last stable release
git checkout tags/v1.52.2

# make it
go build
./rclone version

Configure Rclone

This next step will vary slightly depending on the version of rclone, but the essence should still hold true. I had to do a "headless" configuration because I set this up on a remote droplet with no browser.

# navigate into your rclone directory
cd ~/work/src/github.com/user/rclone

./rclone config

Start your configuration with Dropbox. When prompted to auto config, select n. You'll then be given additional instructions and asked for a key, which requires accessing a link or two from your desktop.

For Dropbox, start by downloading rclone on your local machine. Open a command prompt and cd into the directory where you extracted it.

# cd into extracted folder with CMD, then:

rclone authorize "dropbox"

# browser will launch, authorize and you'll be given a key

Copy the provided key and paste it into the prompt of your "headless" machine (in my case, my droplet).

Once done, launch ./rclone config again and start a new configuration for Google Drive. When prompted for auto config, select n again. On the "headless" machine, you will be given a link. Paste the link into any accessible browser and authorize. You will again be given a token that you need to paste into your "headless" machine.
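Before transferring anything, it's worth a quick sanity check that both remotes are wired up. I'm assuming here that you named the remotes dropbox and gdrive during config, matching the names used in the commands below:

```shell
# list the remotes rclone knows about; should show dropbox: and gdrive:
./rclone listremotes

# list the top-level directories of each remote to confirm the tokens work
./rclone lsd dropbox:
./rclone lsd gdrive:
```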

Transfer Time!

Useful Flags (descriptions from the rclone documentation)

--update This forces rclone to skip any files which exist on the destination and have a modified time that is newer than the source file.

--dry-run Do a trial run with no permanent changes. Use this to see what rclone would do without actually doing it. Useful when setting up the sync command which deletes files in the destination.

--progress This flag makes rclone update the stats in a static block in the terminal providing a realtime overview of the transfer.

rclone copy

First, it might be good to do a --dry-run, which previews what would change without actually making any changes (kind of like git diff).

./rclone copy dropbox:FolderToTransfer/ gdrive:FolderToTransfer/ --update --dry-run

Then run the actual copy to physically transfer the files.

# to print progress
./rclone copy dropbox:FolderToTransfer/ gdrive:FolderToTransfer/ --update --progress

# to have it run in the background (output is appended to nohup.out)
nohup ./rclone copy dropbox:FolderToTransfer/ gdrive:FolderToTransfer/ --update &

rclone purge

Be careful with this one. If you want to completely remove a directory and all of the files within it, use purge.

./rclone purge dropbox:FolderToPurge/
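The same --dry-run flag from the copy step works here too, and with purge it's even more worth the extra command. FolderToPurge is just a placeholder name:

```shell
# preview exactly what purge would delete, without touching anything
./rclone purge dropbox:FolderToPurge/ --dry-run
```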