Status - Quest for the One Blog, Part 13

A quick summary of the current status of the Quest for The One Blog.

I haven’t written out the current state of the blog recently, so here it is. This is a bit of a follow-up to yesterday’s list of things yet to do, inspired by some Twitter and Discord talk.

Writing Process

My current writing process is this: I write posts in a text editor on my Windows PC, in Markdown format. (I use Visual Studio Code.) I put them in a folder called “drafts,” within a directory structure of other stuff. The directory is part of a git repository for version control for the site.

It’s glorious. Writing responsiveness is instantaneous. None of that waiting for WordPress’s slow editor and my slow web host, which takes up to a full second to load. I can focus completely on the writing and not have to worry about constantly yelling angrily at the writing tools.

When I’m done with a post, I “publish” the post, which for me means adding a little bit of metadata (the date, and a slug, mainly) to the top of the Markdown file, and then copying the file into the right directory. I wrote a script to streamline the process. Otherwise, I’d have to manually put the Markdown file into the right place, name it, and date it, which is a pain. Hugo doesn’t do any of that for you.
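
For what it’s worth, the script boils down to something like this. This is a simplified sketch, not the real script; “content/posts” is just Hugo’s conventional content directory, and the real thing does a bit more:

```
#!/bin/sh
# Simplified sketch of my "publish" script.
# Usage: ./publish.sh my-new-post
post="$1"
{
  echo "---"
  echo "date: $(date +%Y-%m-%d)"   # Hugo accepts plain dates in front matter
  echo "slug: $post"
  echo "---"
  cat "drafts/$post.md"
} > "content/posts/$post.md"
```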

Next I preview my work by running the command “hugo server.” This spins up a local web server that displays a preview of the site, as it will be built out on the web. I open it in a web browser by going to “http://localhost:1313.” I usually do a few quick edits to the Markdown and make sure the images are right, and all the changes are displayed in real-time in my browser.

Then I use git commands to commit the post I’ve just written to the repository, and push the changes up to a remote repository. That’s the end of my involvement. The rest is automatic.
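
The exact commands vary a bit from post to post, but the whole loop looks roughly like this:

```
hugo server            # live preview at http://localhost:1313
git add content/
git commit -m "New post"
git push               # kicks off the automated build
```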

Pictures go into another directory on my Windows PC, which I’ve cleverly named “imagebucket.” I run a script that synchronizes everything in that directory with a remote Amazon AWS S3 bucket, which I can reference by URL. I picked AWS because I could write a script to do it all automatically. It could be anywhere, as long as it can be automated so I don’t have to manually upload pictures from my PC to the web. (I have variables in my blog so I can change the location of the images at will.)
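
The sync script is essentially a one-liner wrapped around the AWS CLI (the bucket name here is a placeholder):

```
# Mirror the local image directory up to S3.
aws s3 sync ./imagebucket s3://example-imagebucket
```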

(I do all of this inside Visual Studio Code, incidentally. It has a Terminal window for typing commands, and it has built-in git support.)

Behind-the-Scenes

My site is still a work-in-progress at this point, what you might call a Minimum Viable Product. The RSS feed and link redirection were the main things that had to work out of the gate. (Nobody in the history of the world has ever updated a link once they’ve bookmarked or stored it somewhere.)

My local git repository connects to two different remote repositories: I have one on Azure DevOps, and one on AWS CodeCommit. Those are Microsoft- and Amazon-branded git hosting solutions, very similar to GitHub. They’re free (up to a point).
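
Pointing one local repository at two remotes is a one-time setup. The remote names and URLs below are placeholders, but the URL shapes are accurate:

```
git remote add azure https://dev.azure.com/myorg/myproject/_git/blog
git remote add aws   https://git-codecommit.us-east-1.amazonaws.com/v1/repos/blog
git push azure && git push aws
```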

At the moment, my site is built from the Azure git repo, because that’s the one I started with, and it’s the one I got working first. I had intended to host the site on Azure’s Static Website service (which is free, up to a point), but ran into a major problem because it doesn’t support root domains. (That means I would have had to call my site www.endgameviable.com instead of endgameviable.com, which is a non-starter because it would break every old link out there.)

Here’s how the process works: Each time I commit changes to the Azure git repository, a build pipeline triggers, which runs Hugo. Hugo takes all the Markdown files I’ve made and turns them into HTML files. Then they’re placed into an Azure storage bucket. You can actually see my full web site there, the place I had originally intended it to live, and it loads lightning-fast, but it has a very cryptic computer-generated URL.
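
Stripped of the pipeline boilerplate, it amounts to two steps. The storage account name is a placeholder; “$web” is the special container Azure uses for static websites:

```
# 1. Markdown -> HTML, written to ./public by default
hugo

# 2. Upload the built site to the storage account
az storage blob upload-batch \
  --source public --destination '$web' \
  --account-name examplestorage
```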

But in order to get my site to continue to live at endgameviable.com, I then have to copy the entire static site over to my old Linux web host. The build pipeline I mentioned above does that using a secure FTP command. It’s slow and painful, and I would never recommend anyone do it in a million years. But I don’t have much choice at the moment.
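
In case you’re wondering what that copy step looks like, it’s batch-mode SFTP, something along these lines (host, user, and path are placeholders):

```
sftp -b - deploy@example-host <<'EOF'
put -r public/* /var/www/html/
EOF
```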

So in summary:

  • Write posts on PC.
  • Put posts into git repo.
  • Automated pipeline builds the entire site and plops it into a static file bucket.
  • Copy static site to endgameviable.com.

It’s that last step that I need to eliminate, because it’s wasteful, and my old Linux host is the only part that isn’t free right now. But there are many things I need to resolve before I can do that, most of which I mentioned in my “todo” post.

I mentioned before that I want to move to AWS. Azure just happens to be where I started, and where everything is currently wired up. But eventually I want to consolidate all of that onto Amazon AWS, because AWS seems like a more mature service than Azure at this point. At least in the area of serving static content. (It’s also helpful for my day job to have more exposure to and experience with AWS infrastructure services.)

So, at the same time all of the above is happening with Azure, I also have my AWS CodeCommit repository set up to build a second static website, which goes into an AWS S3 bucket. There’s a different cryptic link where you can see a complete mirror of my site hosted there on Amazon.
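
The AWS version of the build is the same idea with different verbs (bucket name is a placeholder):

```
hugo
aws s3 sync public s3://example-site-bucket --delete
# --delete prunes files that were removed from the source
```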

So in total I have three copies of my web site out there on the Internet right now: One on endgameviable.com (my old Linux host), one on Microsoft Azure, and one on Amazon AWS. They’re all built from the same source, which lives on my PC: the “single source of truth,” in the parlance of industry consultants.

If I’ve set everything up correctly, in theory, I can move anywhere by just flipping some switches. I just need a git repository, a build pipeline that supports Hugo, and a static file bucket that I can connect a domain name to. (It’s more complicated than that, but that’s the basics.)

The main technical challenge that I still need to solve for a static site is redirecting old permalinks. (I also want to archive two or three other blogs into this one, so I also need to redirect those links to the new site location.)
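
For posts that live in this Hugo site, Hugo’s built-in “aliases” front matter is one likely piece of the answer, though I haven’t committed to it yet: each alias generates a stub page at the old URL that redirects to the new one. The old path below is made up for illustration:

```
---
slug: new-post-slug
aliases:
  - /2014/05/some-old-wordpress-permalink/
---
```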

It’s a long-term project, especially since I typically only dedicate about an hour a month to working on it.
