How to Use GitHub Pages for Parked Domains

For domain names I’ve reserved but haven’t done anything with yet, I like to have them parked with my own simple landing page rather than one of those ad-filled “parked domain” pages hosted by the registrar. Previously I had set mine up on a DigitalOcean VM. This wasn’t too difficult (and had the side benefit of forcing me to brush up on my Apache HTTP Server skills), but I’ve since switched to a much easier method: hosting for free via GitHub Pages. This is free not only in terms of cost but also time, because it’s very simple to set up and run.

GitHub Pages

First of all, this was a good excuse to finally learn more about GitHub Pages and its support for building static sites with Jekyll. In essence, GitHub will automatically build and publish an HTML website from your repo source files. Both public and private repos are supported; GitHub will remind you that all generated sites are public though, so be careful when publishing private repos this way.

For anyone wanting to learn more, I suggest going through the Jekyll Quickstart first and then the specific notes for running Jekyll on GitHub Pages. After coming up to speed on the basics, I tried publishing things a few different ways and came to a few conclusions:

  • Pages makes it very easy to publish automatically (no need for any separate deployment tools)
  • You can use either full Jekyll, or fall back to writing your own HTML (which Jekyll will simply publish as-is)
  • Troubleshooting Jekyll build issues is difficult if not impossible
  • With an NPM module it’s possible to run the Pages build locally, which is very helpful for debugging
  • The GitHub documentation for Pages is pretty confusing
  • Pages would be a great solution for anything simple, but not for anything more complex or critical
  • SSL is included, and certificate renewals are automatic (thanks to Let’s Encrypt)
  • Custom domains are also supported (no charge)

In the end I decided to go with a very plain HTML solution without anything Jekyll specific. Read on to see the steps I followed.

Setting up Parked Domains on GitHub Pages

These are the steps I followed for setting up my parked domain name pages on GitHub Pages. The example I’ll use here is my domain

  1. For your domain name (whether it’s new or existing), lower the DNS time to live (TTL) to be very short (an hour or less). If you had an existing TTL which was longer, you can do this step the day before to give it time to propagate.
  2. On GitHub, create a new public repo and choose the option to include a starter README. I’m using the domain name as the repo name, so my repo is at bcantoni/
  3. Note that private repos work too, but since the site is going to be public anyways, it seems to make sense for the repo to be public.
  4. On your local system, do a git clone to bring the repo down.
  5. Add your web content (even if just a simple index.html).
  6. Commit and push
  7. On the GitHub repo settings, enable GitHub Pages and choose the option to publish from root of the master branch. (Other options here include using the /docs path in the master branch, or the dedicated gh-pages branch.) I also like to turn off wikis, issues and projects since none of those will be needed.
  8. Your site should now be available on
  9. Next you’ll set up your custom domain by starting at your DNS provider. Add an A round-robin pool to include the 4 GitHub IP addresses:,,,
  10. Wait a bit for this change to propagate, then check the DNS entry on your system with the command-line dig tool. If dig does not respond with the new expected IP addresses, wait and try again later.
  11. Once the DNS lookup is working, return to the GitHub Pages settings and enter your domain as the custom domain name and save it.
  12. After a moment, GitHub should say your site is now available under
  13. Finally you can enable Enforce HTTPS. After several minutes, this will be configured on the GitHub side and you can confirm by visiting your custom domain.
  14. If you have any trouble, remove and then re-add the custom domain name. That step will trigger the logic on the GitHub side to reconfigure everything.
  15. Once everything is confirmed working, return to your DNS provider and set a normal TTL once again (typically 24 hours or more).
  16. Whenever you need to make site content changes, just push them to GitHub and your results will automatically go live right away.
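As a sketch of steps 4 through 6, the web content can be as minimal as a single index.html. Here "example.com" is a placeholder for your own domain name:

```shell
# Minimal parked-domain landing page; "example.com" is a placeholder.
mkdir -p example.com
cat > example.com/index.html <<'EOF'
<!DOCTYPE html>
<html>
<head><meta charset="utf-8"><title>example.com</title></head>
<body><h1>example.com</h1><p>Coming soon.</p></body>
</html>
EOF

# Then, from inside the repo clone:
# git add index.html && git commit -m "Add parked landing page" && git push
```

Once pushed, GitHub Pages rebuilds and republishes the site automatically.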

The UI might change a bit over time, but as of this writing here’s what the GitHub Pages settings section will look like when done:

Screenshot of GitHub Pages settings

Finally the results – these are my currently parked domains which are all using this technique:

For completeness, I also used the same method to put up a really simple landing page for Gorepoint (this one uses a minimal design built with Materialize).

Photo credit: chuttersnap on Unsplash

How to Fix Dell Monitors With Mac Laptops

The Problem

I’m lucky to have a work-provided MacBook Pro as my primary system, along with a pretty nice Dell UltraSharp 30-inch monitor at the office. One of the things I’ve always struggled with in this combination is really poor resolution when the Dell is connected. This problem with Dell monitors isn’t quite the “fuzzy font” problem you’ll see if you search around (that mostly refers to font smoothing or anti-aliasing adjustments which macOS can apply). Instead it’s more like the screen is running at a much lower than native resolution.

This weekend I finally upgraded my system to macOS Mojave (10.14) and this morning in the office my favorite display bug had once again returned. These are my notes for fixing the issue with links to the original sources.

The Solution

The canonical article I found is on the forums, where the problem and solution (paraphrased) are summarized as:

Many Dell monitors (e.g., U2713H, U2713HM, …) look really bad when connected to a Mac (OS X 10.8.2) via DisplayPort, as if some sharpening or contrast enhancement was applied. Others have reported the same problem. The reason is that the DisplayPort uses YCbCr colors instead of RGB to drive the display, which limits the range of colors and apparently causes the display to apply some undesired post processing. The problem can be solved by overriding the EDID data of the display in order to tell OS X that the display only supports RGB.

The fix is to create a display EDID override profile to force macOS to use RGB mode. You can read through the forum post for details, and check the patch-edid project for a Ruby script that makes it pretty easy.

The Steps

These are the steps to run that patch script and get a display override file in the right place on macOS Mojave. This should only be done by advanced users and/or those who know what they’re doing because it involves disabling Apple’s System Integrity Protection for a short time.


  1. Restart the Mac in Recovery Mode (hold down Cmd-R once the Apple logo appears)
  2. Open a terminal window and disable SIP: csrutil disable
  3. Restart the Mac again and let it boot normally
  4. Download the patch-edid.rb script
  5. Run the script with Ruby: ruby patch-edid.rb
  6. Confirm that you have a local folder DisplayVendorID-10ac with a file DisplayProductID-xxxx
  7. Copy the folder DisplayVendorID-10ac and its contents over to /System/Library/Displays/Contents/Resources/Overrides/ (sudo will be required)
  8. Restart the Mac in Recovery Mode once again
  9. Open a terminal window and re-enable SIP: csrutil enable
  10. Restart the Mac again and let it boot normally
  11. Open the Display Preferences for the external Dell monitor and under the Color tab select the “forced RGB mode (EDID override)” option
  12. One more system restart may be needed for the changes to take effect
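A side note on the folder name in steps 6 and 7: macOS display overrides are keyed by the monitor’s EDID vendor ID in hex, and Dell’s vendor ID is 4268 decimal (0x10ac), which is where DisplayVendorID-10ac comes from. A quick shell check of that mapping:

```shell
# Dell's EDID vendor ID is 4268 decimal; the macOS override folder name
# uses the hex form, hence DisplayVendorID-10ac.
printf 'DisplayVendorID-%x\n' 4268
# prints DisplayVendorID-10ac
```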

HTTP Request Inspectors

Someone wrote in to let me know that an older project of mine, which provides a simple webservice echo test, was referencing some out-of-date projects. My HTTP test service doesn’t use any of those other services, but I’ve updated the blog post description to point to some new options for comparison purposes.

The original hosted version of RequestBin (at is no longer live. This was run by Runscope at the time, and the source code is still up on GitHub (Runscope/requestbin). You can see the status change:

We have discontinued the publicly hosted version of RequestBin due to ongoing abuse that made it very difficult to keep the site up reliably. Please see instructions below for setting up your own self-hosted instance.

Here’s the commit with the awesome (but sad) commit message of :( when they turned it off. Some good comments there from fans of the service.

The good news is that with the source code still available, people can run the service themselves (Heroku and Docker instructions are included), and the adventurous can run their own live instances. Here are a few I found with some quick searching (but have not personally used):

Converting HTML to Markdown using Pandoc

Markdown is a great plain text format for a lot of applications and is often used to convert to HTML (for example on my WordPress blog here). There are also some good use cases for the opposite: converting from HTML into Markdown. I recently had such a case converting some older blog posts from raw HTML into Markdown and found that Pandoc made it really easy.

What’s Pandoc

Pandoc is an open-source utility for converting between a number of common (and rare) document types, for example plain text, HTML, Markdown, MS Word, LaTeX, wiki, and so on. The output formats list is really extensive, and people can write their own “filters” to handle other formats as well, or to customize the existing ones to their exact needs. The project tagline sums it up nicely:

If you need to convert files from one markup format into another, pandoc is your swiss-army knife.

Screenshot of Pandoc website showing all the supported file formats
The Pandoc website lists all of the supported file types it can convert between

My Use Case

My particular use case was to convert about a dozen really old blog posts from this website. I wrote these back in the early days when I managed this site in CityDesk and later migrated to MovableType. The Google Search Console alerted me to some crawler errors which turned out to be caused by raw PHP file content being served instead of real HTML.

My approach for cleaning this up was as follows:

  1. Convert HTML original articles into Markdown format
  2. Do some manual cleanup editing and double-check links are still valid
  3. Drop the Markdown into the appropriate Posts within WordPress
  4. Modify my existing .htaccess files to do permanent (301) redirects for all of the old URLs
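For step 4, the .htaccess entries looked something like this (the paths shown here are made up for illustration; each old URL maps to its new WordPress permalink):

```apache
# Permanent (301) redirects from old raw-HTML/PHP URLs to the new posts.
Redirect 301 /writing/old-article.php /2019/01/old-article/
Redirect 301 /writing/another-post.html /2019/02/another-post/
```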


Simple HTML Example

With Pandoc installed, you can try a simple test pulling down the installation instructions page:

curl --silent | pandoc --from html --to markdown_strict -o

To see the result, consider this HTML snippet from installing.html:

<h2 id="compiling-from-source">Compiling from source</h2>
<p>If for some reason a binary package is not available for your platform, or if you want to hack on pandoc or use a non-released version, you can install from source.</p>
<h3 id="getting-the-pandoc-source-code">Getting the pandoc source code</h3>
<p>Source tarballs can be found at <a href="" class="uri"></a>. For example, to fetch the source for version</p>
<pre><code>tar xvzf pandoc-
cd pandoc-</code></pre>

We can see the resulting Markdown turned out very well:

## Compiling from source

If for some reason a binary package is not available for your platform, or if you want to hack on pandoc or use a non-released version, you can install from source.

### Getting the pandoc source code

Source tarballs can be found at <a href="" class="uri"></a>. For example, to fetch the source for version

    tar xvzf pandoc-
    cd pandoc-

My Blog Post Conversions

For my dozen old HTML articles, the straight conversion ended up being a bit noisy, especially with some old CMS template boilerplate around the content which was no longer needed. To strip that out, I used a little bit of sed before conversion:

echo "converting $1"
sed '1,/<div class="asset-header">/d' "$1" | sed '/<div class="asset-footer">/,/<\/html>/d' | pandoc --wrap=none --from html --to markdown_strict > $

(The above sed commands clean up the HTML source in two passes: first removing everything from the top of the file to <div class="asset-header">, which is where the blog post content started; and then removing everything from <div class="asset-footer"> to the end of the file.)
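To see the two sed passes in isolation, here’s a tiny hypothetical sample (with the pandoc step omitted); only the content between the two markers survives:

```shell
# Build a sample file using the same markers the old CMS templates used.
cat > sample.html <<'EOF'
<html>
<head><title>template boilerplate</title></head>
<div class="asset-header">
<h1>My Post</h1>
<p>Body text.</p>
<div class="asset-footer">
footer junk
</html>
EOF

# Pass 1 deletes from line 1 through the header marker;
# pass 2 deletes from the footer marker through </html>.
sed '1,/<div class="asset-header">/d' sample.html \
  | sed '/<div class="asset-footer">/,/<\/html>/d'
# prints:
# <h1>My Post</h1>
# <p>Body text.</p>
```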

After that, I just needed to do some minor editing cleanups on the Markdown files before bringing them in to WordPress. Success!

Further Reading

There are a few good online converters you can try; keep in mind some of these are limited in the number of characters they can handle:

To learn more and go deeper on Pandoc, I recommend going through their excellent user’s guide.

And finally a big recommendation for Dillinger, a great online tool for editing Markdown text with live HTML rendering. I use that for writing these blog articles as well, before moving them in to WordPress.

My Current Podcast Playlist

Boy singing into a microphone with a pop filter

I like to periodically drop my podcast subscription list here for anyone interested, and so I can look back and see how my interests have changed :) (Search here for some previous updates.) Lately I’m mostly listening to software or startup podcasts, but I’ve also started following a lot of woodworking ones as I try to find time for that hobby!

Tech / Software

Hanselminutes – Fresh Talk and Tech for Developers [rss]

The Changelog [rss]

.NET Rocks! [rss]

Build Your SaaS – running a startup in 2019 [rss]

The Ars Technicast [rss]

Startups / Business

DataSnax Podcast [rss]

MegaMaker [rss]

Import This [rss]

The Tim Ferriss Show [rss]

The Smart Passive Income Online Business and Blogging Podcast [rss]

Woodworking / Makers

Making It With Jimmy Diresta, Bob Clagett and David Picciuto [rss]

The Modern Maker Podcast [rss]

Made for Profit [rss]

The Green Woodworker Podcast [rss]

If You Build IT Podcast [rss]

Measuring Up Podcast [rss]

The Make or Break Show [rss]

Forked Up: A Thug Kitchen Podcast [rss]

Sports

NASCAR on NBC podcast [rss]

Sports Media with Richard Deitsch [rss]

Photo by Jason Rosewell on Unsplash