Upgrading

Upgrading to the Next Version

For details on the contents of each version, please see the Changelog. This document is purely about how to upgrade to newer versions.

v2.3.0 - More Taggy Goodness, Less Bugs

Shut down your Backup Brain as per usual.

Download the latest code:

git fetch --tags

Start using it:

git checkout v2.3.0

Restart Backup Brain
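
The restart command is just whichever startup script you normally use. Based on the scripts mentioned later in this document, it’s likely one of these (which script matches which setup is my guess):

./serve          # plain (non-Docker) installs
./docker/serve   # Docker installs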

v2.2.0 - Tags & Settings

This version adds the ability to apply data or schema migrations to the database on startup. The upgrades are fully automated.

Shut down your Backup Brain as per usual.

Download the latest code:

git fetch --tags

Start using it:

git checkout v2.2.0

Restart Backup Brain

Geeky Details

The ./serve and ./docker/serve startup scripts now query the database for the current schema version, then check whether any files in the scripts/data_migrations/ folder correspond to the next version.

If pending schema migrations are found, they’ll be run automatically before the server starts.

Currently this only supports upgrading by a single version. There’s an open ticket to allow migrating across multiple versions at once, but since there’s only one migration so far, this limitation isn’t a problem yet. If you’ve got any familiarity with writing Bash scripts, that’d be an easy ticket to help out with.
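
If you’re curious what that looks like, here’s a rough sketch. This is not the real serve script: the collection name, the query, and the migration file layout are all assumptions for illustration.

#!/usr/bin/env bash
# Rough sketch of a startup migration check (illustrative assumptions,
# not the actual serve script).

# Ask MongoDB for the current schema version, assuming it's stored in
# a "schema_versions" collection.
current_version=$(mongosh --quiet backup_brain_development \
  --eval 'db.schema_versions.findOne()?.version ?? 0')

next_version=$((current_version + 1))
migration="scripts/data_migrations/${next_version}.sh"

# If a migration for the next version exists, run it before starting
# the server.
if [[ -f "$migration" ]]; then
  echo "Applying schema migration ${next_version}..."
  bash "$migration"
fi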

v2.1.0 - Bug Fixes

Shut down your Backup Brain as per usual.

Download the latest code:

git fetch --tags

Start using it:

git checkout v2.1.0

Restart your Backup Brain

v2.0.0 - Docker!

Shut down your Backup Brain as per usual.

Download the latest code:

git fetch --tags

Check out the v2.0.0 tag.

git checkout v2.0.0

Switching to Docker

  • Backing Up Your Data

    DO THIS FIRST‼️ The following commands should be run from the root of your Backup Brain repository.

    First: Make sure you’ve got a fresh backup of your data.

    scripts/export.sh
    

    Then do a quick check that your mongo_exports/bookmarks.json and mongo_exports/users.json files look like they have legit content in them.

    • Checking Bookmarks Export

      head -c 80 mongo_exports/bookmarks.json
      

      The output should look something like this:

      {"_id":{"$oid":"64a838dd906b1d182175f254"},"title":"Free Fediverse","description…
      
    • Checking Users Export

      head -c 80 mongo_exports/users.json
      

      The output should look something like this:

      {"_id":{"$oid":"666f6197906b1d17c490dbef"},"email":"masukomi@masukomi.org","encr…
      
  • Setting up config files

    Follow the Getting Started instructions. You will need to run the scripts/run_me_first.sh script when instructed.

    If you’re already using Tailscale Funnel to expose your Backup Brain to the internet, you’ll need to enter an API key in the docker-compose.yml file, as per the instructions in Getting Started.

  • From a 1.x version with data

    • Loading Your Backup In Docker

      Do this after backing up your data and going through the Getting Started instructions.

      Connect to a shell in the bb_mongodb container

      docker-compose exec -it bb_mongodb bash
      

      You should see a prompt that looks like root@a1b2c3d4:/#

      Change to the app directory.

      cd app
      

      Run the import script. ⚠️ WARNING:

      • This will overwrite any bookmarks and archives you have added since you started using Docker.
      • If you have changed the database name to something custom, preface the command with DATABASE_NAME=my_custom_name followed by a space (see the example below).
      scripts/import.sh
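
      For example, if your database is named my_custom_name, the command becomes:

      DATABASE_NAME=my_custom_name scripts/import.sh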
      

      The output should look something like this:

      Importing to the "backup_brain_development" database found on
      mongodb://bb_mongodb:27017/
      ----------------------------------------------
      replacing bookmarks with exported data...
      2024-08-02T19:55:35.781+0000	connected to: mongodb://bb_mongodb:27017/
      2024-08-02T19:55:35.782+0000	dropping: backup_brain_development.bookmarks
      2024-08-02T19:55:38.782+0000	[#################.......] backup_brain_development.bookmarks	132MB/178MB (74.4%)
      2024-08-02T19:55:41.050+0000	[########################] backup_brain_development.bookmarks	178MB/178MB (100.0%)
      2024-08-02T19:55:41.050+0000	12681 document(s) imported successfully. 0 document(s) failed to import.
      replacing users with exported data...
      2024-08-02T19:55:41.069+0000	connected to: mongodb://bb_mongodb:27017/
      2024-08-02T19:55:41.070+0000	dropping: backup_brain_development.users
      2024-08-02T19:55:41.077+0000	1 document(s) imported successfully. 0 document(s) failed to import.
      

      When that’s complete, run the following to exit the shell.

      exit
      
    • Resetting your Meilisearch Admin & Search keys

      For reasons I can’t explain, the default search & admin keys will change even if your MEILI_MASTER_KEY remains the same.

      To get the updated keys, copy your MEILI_MASTER_KEY and do the following.

      Connect to a shell in the bb_rails container

      docker-compose exec -it bb_rails bash
      

      You should see a prompt that looks like root@a1b2c3d4:/#

      Change to the app directory, and launch the Rails console. Note that launching the Rails console will take a stupid-long time. I don’t know why.

      cd app
      bundle exec rails console
      

      Make sure your master key is set correctly and has made its way to Rails.

      ENV['MEILI_MASTER_KEY']
      

      That should output your master key. If it doesn’t, type exit to leave the console, exit again to leave the shell, and go address the configuration issue in your .env file.
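
      If you need to fix it, the relevant line in your .env file should look something like this (placeholder value shown):

      MEILI_MASTER_KEY=your_actual_master_key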

      Ask Meilisearch for the default keys:

      Search::Client.instance.get_default_keys(ENV['MEILI_MASTER_KEY'])
      

      That will return a Hash that looks like this:

      {:search=>"your search api key",
       :admin=>"your admin api key"}
      

      Type exit to leave the Rails console, and exit again to leave the Docker shell.

      Change the values of MEILISEARCH_SEARCH_KEY and MEILISEARCH_ADMIN_KEY in your .env file to match the corresponding values in the hash above.
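
      For example, with placeholder values:

      MEILISEARCH_SEARCH_KEY=your_search_api_key
      MEILISEARCH_ADMIN_KEY=your_admin_api_key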

      Then restart the bb_rails container.

      docker-compose restart -t 5 bb_rails
      

      If Docker is running in the foreground you can just press ^c (Control + C) to stop it, and run docker-compose up to restart it.

    • Reindexing Your Search Data

      After resetting the search and admin keys above and restarting the bb_rails container, we need to load all of your bookmark data into the new Meilisearch instance.

      Connect to a shell in the bb_rails container, and launch the Rails console as per the instructions in the prior section.

      Then run the following. Depending on how many bookmarks you have, this could take several minutes.

      Bookmark.all.each do |b|
        b.update_in_search
      end
      

      When it finishes, you should see output like this:

      #<Mongoid::Contextual::Mongo:0x0000ffff7f49e5b0
       @cache=nil,
       @cache_loaded=true,
       @collection=#<Mongo::Collection:0x111100 namespace=backup_brain_development.bookmarks>,
       @criteria=#<Mongoid::Criteria
        selector: {}
        options:  {}
        class:    Bookmark
        embedded: false>
      ,
       @klass=Bookmark,
       @view=#<Mongo::Collection::View:0x111120 namespace='backup_brain_development.bookmarks' @filter={} @options={"session"=>nil}>>
      

      Run exit to exit the Rails console, and exit again to exit the Docker shell.

  • From a 1.x version without data

    Use git to check out the v2.0.0 tag.

    Follow the new Getting Started docs. There’s very little you’ll need to do.

Ignoring Docker

No additional steps are required.

v1.1.0

Just check out the v1.1.0 tag.

git checkout v1.1.0

Bleeding Edge Changes

Work in advance of the upcoming release will be merged into the next branch as it becomes ready for you to test. It’s possible that bugs will appear in this branch, but it’s incredibly unlikely that there will be anything that could lead to data loss.
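
Trying it out is just standard git. This assumes your remote is named origin:

git fetch origin
git checkout next
git pull origin next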

In the unlikely event that you need to take any direct action to use the latest code in next, I’ll mention it on the Mastodon Account.

Because Backup Brain uses MongoDB, there’s no need to run migrations when new models are added, but I may make scripts to backfill data into old records.
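
For a sense of what such a backfill might look like, here’s a purely hypothetical sketch. The tags field and its default value are invented for illustration; only the Bookmark model is real.

# Hypothetical backfill: give pre-existing bookmarks an empty tags array.
# The "tags" field and its default are illustrative assumptions.
bundle exec rails runner 'Bookmark.where(tags: nil).update_all(tags: [])'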