Posts From Category: tutorial

Generating Jekyll posts using Drafts and Working Copy »

I put together a script that takes a draft, grabs the title and body, and then prompts you for front matter data before sending the completed post off to Working Copy. It’s specific to my site and purposes, but it should be fairly straightforward to adapt to your needs.

When you first run the action, it’ll ask you for your repo name, posts path and Working Copy x-callback-url token. This info will be stored in Drafts and used to write out the correct file.

Site categories and tags are expected to be space delimited and are split out and mapped over to parse them into the proper format.

Post dates are pre-populated with the current date and that same date is combined with the draft file to generate the file name that’s specified when first running the action.
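The actual action is a Drafts JavaScript step, but the filename and front matter pieces it handles can be sketched in shell (the title, tags and names here are purely illustrative):

```shell
# Hypothetical sketch: slugify a title, build a Jekyll-style filename and
# split space-delimited tags into a YAML list.
title="My New Post"
tags="jekyll drafts workflow"

# Slugify the title and prepend today's date, Jekyll-style.
slug=$(printf '%s' "$title" | tr '[:upper:]' '[:lower:]' | tr ' ' '-')
filename="$(date +%Y-%m-%d)-$slug.md"
echo "$filename"

# Split the space-delimited tags into a YAML list.
printf 'tags:\n'
for t in $tags; do printf '  - %s\n' "$t"; done
```

The real action does the equivalent transformations before handing the assembled file to Working Copy via x-callback-url.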

Updating to the latest version of git on Ubuntu

If you’re using git on Ubuntu, the version distributed via apt may not be the newest version of git (I use git to deploy changes on all of the sites I manage). You can install the latest stable version of git provided by the maintainers as follows:

sudo add-apt-repository ppa:git-core/ppa
sudo apt-get update
sudo apt-get install git -y

A brief intro to git

I was recently asked to speak about and provide insight into git at a meetup I’ve been running with a friend. As a developer, a version control system is a critical part of your toolkit, no matter the size of the project or team you may find yourself working on.

I first started learning to use git by applying it to my own projects and maintaining local repositories to track those projects. From there I moved on to hosting and storing my git repositories at Bitbucket while still working independently. My first experience with working alongside other developers in git came at my first full time development job on a small team (think really small — two developers, myself included). I picked up the basics of branching, handling merges, developing different features in parallel and, ultimately, dealing with QA and production deployments that were sourced from various branches in our project repository.

I’ve expanded on my knowledge of git in the jobs I’ve held since that first position and have used svn pretty heavily as well (I don’t mind it, but I don’t love it — I’d argue git is the better choice for a number of reasons, its decentralized nature and flexibility being chief among them).

One of the many appeals of git is its flexibility, and it ships with a wide range of commands. To get started, I’d suggest digging in with the following:

# initialize git
git init

# clone repo
git clone <repo url>

# view the state of the repo
git status

# this will stage all of your modified and/or untracked files for commit
git add -A

# this will stage only the files that you pass into the command as an argument, delimited by a space
git add <path to file(s)>

# this will commit all modified files and apply the commit message that follows it
git commit -am "<commit message>"

# this will commit only the files that you've staged and apply the message that follows it
git commit -m "<commit message>"

# amend the last commit message
git commit --amend

# this will fetch changes from the remote branch that you're currently on; this will require a merge if your local copy of the branch has diverged from the remote
git pull

# you can also specify arguments and branches with git pull, for example
git pull origin master

# this will checkout a different branch from the branch you're currently on
git checkout <branch name>

# alternatively you can revert the state of your current branch to match the head of that branch, or that of an individual file
git checkout .

git checkout <path to file>

# check out a new branch, diverging from the current branch
git checkout -b <branch name>

# see available branches
git branch

# delete a branch locally
git branch -d <branch name>

# delete a branch remotely
git push origin --delete <branch name>

# merge a branch into the branch you're currently on
git merge <branch name>

# stash your current changes and return to the head of the branch you're on
git stash

# reapply your stashed changes
git stash apply

# reapply your topmost stashed changes and discard the change set
git stash pop

# show commit logs
git log

# show the reference log of all recent actions
git reflog

# fetch remote branches
git fetch

# throw away uncommitted changes and revert to the head of the branch (destructive command)
git reset --hard HEAD

# check out a previous commit (note: this leaves you in a detached HEAD state)
git checkout <commit hash value>

# revert a previous commit
git revert <git commit hash value>

Each of these commands has numerous options associated with it and allows for broad control over the flow and history of your project. There are a number of other resources worth exploring to learn more about git.
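To see how several of these commands fit together, here’s a throwaway run in a scratch repository (the identity values are placeholders, and the default branch name is detected because it varies between git versions):

```shell
# End-to-end sketch in a scratch repo: init, commit, branch, merge.
cd "$(mktemp -d)"
git init -q
git config user.email "you@example.com"   # placeholder identity for the demo
git config user.name "Your Name"
echo "hello" > file.txt
git add file.txt
git commit -qm "initial commit"
base=$(git symbolic-ref --short HEAD)     # default branch name varies (master/main)
git checkout -qb feature
echo "more" >> file.txt
git commit -qam "update file"
git checkout -q "$base"
git merge -q feature                      # fast-forward merge back into the base branch
git log --oneline
```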

Installing HTTP/2 on Ubuntu 16.04 with virtual hosts

Now that HTTP/2 is fairly stable and widely available, I decided to try and install and run it on this server. I’m currently running Ubuntu 16.04.2 LTS with virtual hosts configured so I can serve a number of sites beyond this one. All of the sites this server hosts are also served securely using certificates from LetsEncrypt.

To install HTTP/2 I SSH’d in to the server and ran the following commands:

# add the new apache repository to your server's sources

sudo add-apt-repository -y ppa:ondrej/apache2

# update apache

sudo apt-key update

sudo apt-get update

# WARNING: answering yes at the prompts following this command will overwrite your apache.conf file located in /etc/apache2

sudo apt-get --only-upgrade install apache2 -y

# enable http2

sudo a2enmod http2

Next, navigate to /etc/apache2/sites-available and edit a virtual host file of your choice, adding the following line after the ServerName declaration:

Protocols h2 h2c http/1.1
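In context, a virtual host entry would look something like this (the ServerName, paths and SSL directives are placeholders; your LetsEncrypt configuration will differ):

```apache
<VirtualHost *:443>
    ServerName example.com
    Protocols h2 h2c http/1.1

    DocumentRoot /var/www/example.com
    # SSLEngine, SSLCertificateFile, etc. from your LetsEncrypt setup go here
</VirtualHost>
```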

Finally, restart apache:

sudo service apache2 restart

Your site should now be served using HTTP/2. You can verify this using KeyCDN’s HTTP/2 test tool.

Did I miss anything? Let me know.

Clearing mod_pagespeed cache

I use mod_pagespeed on this server to help speed up asset delivery and force optimization best practices across all of the sites I host. Occasionally, during deployments, it’s helpful to clear the module cache. Doing so is as simple as the following:

touch /var/cache/mod_pagespeed/cache.flush
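The flush can be folded into a deploy script. A minimal sketch (using a scratch directory here so it can run anywhere; on the server the path would be /var/cache/mod_pagespeed):

```shell
# Sketch of a deploy hook that flushes the mod_pagespeed cache. CACHE_DIR
# defaults to a scratch dir for this demo; on the server it would be
# /var/cache/mod_pagespeed.
CACHE_DIR="${CACHE_DIR:-$(mktemp -d)}"
# ... git pull / asset build steps would go here ...
touch "$CACHE_DIR/cache.flush"
echo "flushed pagespeed cache in $CACHE_DIR"
```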

Throwing together a blog

I’ve been working on this site for longer than I’d care to admit (years at this point). It’s been through a few domains, two content management systems, multiple versions of those content management systems, countless designs and several different hosts. I’m really happy with where it’s at and what I’ve learned putting it together.

I started this site off running Kirby on shared hosting. It’s served as a design and development playground for me as I’ve learned and applied new things. It started off without being version controlled; now its source is stored on GitHub and the site runs on Statamic.

I started off writing the CSS and JS for the site manually, before generating a Grunt build process, breaking out the styles to be more modular and rewriting them in SCSS. Dependencies are now sourced from npm and Bower.

Instead of running the site on shared hosting, it now runs on a LAMP Digital Ocean box using PHP7 and mod_pagespeed, both of which have made a tremendous difference in terms of site performance.

As it stands now, I’m thrilled with where this site sits, but I’m curious to see how else I can continue improving it.

Generating a responsive CSS grid using Neat

I use a responsive grid system for this site (and a number of other projects) that’s generated by pulling in Thoughtbot’s Neat framework. To generate the framework for this grid, I’ve put together a simple SASS/SCSS mixin that looks like the following:

.grid {

    &-main-container {
        @include outer-container;
    }

    &-row {
        @include row;
        @include pad (0 10%);

        @media only screen and (max-width: 640px) {
            @include pad (0 10%);
        }

        &.collapse {
            @media only screen and (max-width: 640px) {
                @include pad (0);
            }
        }

        .grid-row { // collapse nested grid rows
            @include pad(0);
        }
    }

    $grid-columns: 12;

    @for $i from 0 through $grid-columns {

        &-columns-#{$i} {
            @include span-columns($i);
        }

        &-columns-small-#{$i} {
            @include span-columns($i);

            @media only screen and (max-width: 640px) {
                @include span-columns(12);
            }
        }
    }
    @for $i from 0 through $grid-columns {

        &-shift-left-#{$i} {
            @include shift(-$i);
        }

        &-shift-right-#{$i} {
            @include shift($i);
        }

        @media only screen and (max-width: 640px) {
            &-shift-left-#{$i},
            &-shift-right-#{$i} {
                @include shift(0);
            }
        }

    }
}

To use the grid, simply drop it in as an import after including Neat. Once your SASS/SCSS files have been compiled, you’ll end up with completed grid classes that will allow you to write responsive markup for a page. For example:

<div class="grid-main-container">
    <div class="grid-row">
        <div class="grid-columns-9">
        <!-- Content -->
        </div>
        <div class="grid-columns-3">
        <!-- Content -->
        </div>
    </div>
    <!-- Columns in this row will collapse to the full screen width on small screens -->
    <div class="grid-row">
        <div class="grid-columns-small-9">
        <!-- Content -->
        </div>
        <div class="grid-columns-small-3">
        <!-- Content -->
        </div>
    </div>
</div>

Scriptable Backups with Arq

I’ve been using Arq for my backups for several months now and have regular backups being pushed to both Amazon Cloud Drive and AWS. A big part of Arq’s appeal is its flexibility, configurability and the wide array of backup destinations it supports. In short, it allows you to own and control your backups.

In addition to being a wonderfully designed app, Arq ships with a handy command line utility that lets you pause, resume and otherwise control your backups using simple commands named for the app. In order to use these commands, however, you need to include the executable in your shell’s path variable.

To accomplish this, I symlinked the Arq executable into /usr/local/bin. If /usr/local/bin isn’t in your path, you can add it by adding the following to your .bashrc, .bash_profile or what have you:

export PATH=$PATH:/usr/local/bin

Next, symlink the Arq executable:

sudo ln -s /Applications/Arq.app/Contents/MacOS/Arq /usr/local/bin/Arq

Next, open up a new shell and try the following:

Arq pause 60
Arq resume

Now you can easily control your backups from your CLI of choice or even script them from apps like Alfred or Control Plane (context sensitive backups anyone?).
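If you want to sanity-check the PATH mechanics before touching Arq itself, a stub executable in a scratch directory (standing in for /usr/local/bin) behaves the same way:

```shell
# Demonstrate PATH lookup with a stub standing in for the Arq symlink; the
# scratch dir plays the role of /usr/local/bin.
bindir=$(mktemp -d)
printf '#!/bin/sh\necho "Arq stub"\n' > "$bindir/arq-demo"
chmod +x "$bindir/arq-demo"
export PATH="$PATH:$bindir"
arq-demo
```

Once the directory is on your PATH, anything placed (or symlinked) into it becomes callable by name, which is all the Arq symlink is doing.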

Update OS X from the command line

If you don’t want to bother dealing with the Mac App Store, you can check for and install any recent updates for OS X from the command line:

sudo softwareupdate -i -a

You can also combine this with Homebrew and Cask update commands, allowing you to update everything quickly and efficiently:

sudo softwareupdate -i -a && brew update && brew upgrade brew-cask && brew cleanup && brew cask cleanup

Syncing OSX app preferences and dot files

I’ve started using a command line tool called mackup to back up and sync many of my dot files and application settings on OS X.

You can install the tool via pip or homebrew. I installed it via homebrew and set it up as follows:

brew install mackup
mackup backup

By default mackup will back up your files to a folder named Mackup in the root of your Dropbox folder. You can also choose to back your files up to Google Drive or anywhere else on your local drive by creating .mackup.cfg in your user root and setting the options the tool provides.
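For example, a minimal .mackup.cfg redirecting backups to a custom local path might look like this (the path is a placeholder; the engine names come from mackup’s documentation):

```ini
[storage]
engine = file_system
path = Backups/Mackup
```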

Now, when you move to a new machine, you simply install the tool and run:

mackup restore

Your settings will be added to the new machine and kept in sync via the storage you chose when setting up mackup.

External links and redirects in Statamic navigation

I put together a fieldset and template that allows external links to be added to the navigation of Statamic sites alongside internal links. To implement this in your site, the fieldset should look like the following:

title: Nav link

fields:
  link:
    display: Link
    required: true
    default:
    type: text

  content:
    type: hidden

This fieldset should be accompanied by a template named link.html which will need to be added to your site’s theme. The contents of the template are simply Statamic’s redirect example.

Now you should be able to create link pages in your Statamic admin panel that can then be added to your site’s navigation. The pages created in the panel should create page files that look like the following:

title: Example link page
fieldset: link
template: link
link: http://example.com

Is there an easier or more effective way to do this? Let me know.

Fastmail in Fluid.app

I’ve spent the last few months bouncing around OSX mail clients. I went from Mail.app to Airmail, to a Mailmate trial, back to Airmail and then back to Mail.app. Now, however, I’ve finally settled on a mail client: Fastmail’s web interface in a Fluid instance.

I’ve gone with the Fastmail web app for one simple reason: I wanted every mail client I tried to essentially be a native version of their web app. I kept finding myself working in Fastmail’s web interface rather than whichever mail client I was trying out; I’d be viewing something in Safari and then jump straight to the web app, rather than a mail client, without even thinking about it.

Running Fastmail in a Fluid instance did, however, require a bit of setup. First, I set my newly created Fastmail.app up as my default mail client. Next, I modified the default Gmail URL handler created with the new Fluid instance to open mailto: links in Fastmail as follows:

function transform(inURLString) {
    inURLString = inURLString.replace('mailto:', '');
    inURLString = inURLString.replace('&amp;', '&');

    var argStr = '';
    var splits = inURLString.split('?');

    var emailAddr = null;
    var args = {};
    if (splits.length > 0) emailAddr = splits[0];
    if (splits.length > 1) argStr = splits[1];

    var outURLString = 'https://www.fastmail.com/mail/compose:to=' + emailAddr;

    if (argStr.length > 0) outURLString += '&' + argStr;
    return outURLString;
}

Add this URL handler by going to the Fluid app’s preferences, selecting URL Handlers and naming the handler Fastmail with the pattern mailto:*

Configuring the dock counter for the Fluid instance is also fairly straightforward and James Wilcox has a great writeup on setting that up.

Are you currently using Fastmail in a Fluid instance? Or do you have a particular web client you prefer? I’m currently pretty happy with this setup and already have a few other ideas for URL handlers and scripts I plan on trying out.

If you don’t use Fastmail, I would highly recommend it and you can sign up for it here.

Edit (10.29.2014): Updated the script to reflect Fastmail’s new TLD (.com as opposed to the .fm they previously used). Thanks to Keith Bradnam for the heads up.

Edit (1.29.2017): Updated the compose URL to reflect Fastmail’s new compose routing. Thanks Fred Barker!

Sublime Text 3 - ctrl + tab key bindings

I use Sublime Text as my primary text editor but have never liked the default tab behavior where ctrl + tab takes you to the most recently used tab rather than the next horizontal tab in the tab bar (ctrl + shift + tab does the reverse).

To fix this, I’ve added a few lines to the user key bindings file (located in Preferences > Key Bindings - User):

{ "keys": ["ctrl+tab"], "command": "next_view" },
{ "keys": ["ctrl+shift+tab"], "command": "prev_view" }

Sorting email using aliases and plus addressing in Fastmail

I subscribe to a number of mailing lists and, up until recently, had been using individual server-side rules to sort all incoming messages from those lists into a specific folder. However, as the number of lists I was subscribed to grew, adding and maintaining individual rules became increasingly tedious.

To make managing messages from mailing lists easier, I’ve pointed every list I subscribe to at an alias that targets the specific folder I want messages sorted into. To set this up, you need to create a new alias and target that alias at a specific folder using plus addressing as follows:

fastmailusername+targetfolder@fastmail.com

Now, instead of having to create a rule for each mailing list sender, I simply provide the alias that I have created and any messages received via that alias are sent directly to the folder I store mailing list messages in.

Automatic Feedbin subscription backups

A few weeks ago I switched from Fever to Feedbin. I had been running Fever on a shared hosting account and, over the long term, it was proving to be slower than I had expected. So far Feedbin has proven considerably faster than my old Fever install and appears to be more actively developed (I’ve also been able to use Jared Sinclair’s Unread, which is fantastic).

I plan on sticking with Feedbin as my RSS service, but also wanted to make sure I kept a backup of all the feeds I subscribe to just in case anything happens to change. Rather than manually exporting a JSON backup of my feeds on a regular basis, I threw together the following shell script to download the JSON file via Feedbin’s API and save it to Dropbox:

curl -u 'example@example.com:password' https://api.feedbin.me/v2/subscriptions.json -o ~/Dropbox/Backups/Feedbin/feedbin-subscriptions.json

I have the above script saved and use Lingon to schedule it to run automatically once a week, alleviating the need to back up my RSS subscriptions by hand. To use the script, simply drop in your Feedbin credentials, save it wherever you’d like and then schedule it to run via Lingon.
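One refinement worth considering (names hypothetical): date-stamping each backup file so weekly snapshots accumulate instead of overwriting each other. The path-building piece looks like this, with the result passed to curl’s -o flag:

```shell
# Build a date-stamped output path for the Feedbin backup. BACKUP_DIR is a
# placeholder default; pass the resulting path to curl's -o flag.
BACKUP_DIR="${BACKUP_DIR:-$HOME/Dropbox/Backups/Feedbin}"
OUT="$BACKUP_DIR/feedbin-subscriptions-$(date +%Y-%m-%d).json"
echo "$OUT"
```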

Photo management with Dropbox and Hazel

I recently abandoned iPhoto as a means of storing, organizing and managing photos on OSX and deactivated the associated iCloud Photo Sharing feature that fed photos from iOS into iPhoto via iCloud. I have replaced my iPhoto-based workflow with one centered around Dropbox (which I have subscribed to for some time). I have been asked about this workflow, and what follows is a brief explanation of how I set it up:

I began by exporting my iPhoto library to a folder using Phoshare1. I then created a simple Hazel rule to scan my iPhoto library for duplicate images or videos and discard them. Clearing duplicates from my iPhoto library saved me 6 GB in space which either speaks to how disorganized my library was to begin with or how bloated iPhoto managed to make it.

After clearing duplicate files I created another rule to rename all photos based on the date they were taken and the device they were taken with, before organizing them into a subfolder based on that date. From there, organization was simply a question of looking through each folder and appending an event title after the date in each folder’s name.

Once all of the above rules were run on my Dropbox Photos directory I edited them to run on my Dropbox Camera Uploads directory. This allows me to upload photos via the iOS Dropbox app or import it directly from my camera and have Hazel auto-organize any content based on event date which I then label and move to a folder in the Photos folder named for the year in which the pictures were taken.

I now have more Dropbox space, an organized and easy to share photo library and a simple workflow for any and all photos I take (I take a lot).

This workflow allows me to keep all my photos (and all of my edited photos) unified across all devices that I use as well as the web. If I need to edit something I edit it in Photoshop and let Dropbox take care of making sure it’s everywhere I need it to be.

To view photos on the go, I use Unbound which allows me to quickly glance through and view images without having to store them directly on the device being used to view them.

I no longer have to wonder whether my photos made it to iPhoto on my MacBook Air, or any other device, through iCloud Photo Sharing. Thanks to Dropbox, any photos I take on my phone are everywhere I need them to be without my having to worry, as is the case with any photos I take with my camera (though the process of connecting that camera to a computer feels increasingly cumbersome).

I’ve seen more complex photo workflows than mine, but tend to prefer the simplicity of the default Dropbox app, a handful of rules and a little manual sorting. Now, I have all of my photos sorted and will have any other photos I take sorted going forward.

You can download the rules I use here »

  1. It’s worth noting that Dropbox’s app also allows you to pull your photos out of iPhoto’s library file. If you import your photos this way, Dropbox attempts to sort them in to folders by date and iPhoto event. I found it easier to use Phoshare as it simply exports your photos in to a single folder, making it easier for Hazel to process them. 

Leaving Google Apps for Fastmail

I recently began re-evaluating the web services I use, the companies that provide them and where I store important data. I had used Google services extensively, with Gmail handling my email, my contacts synced through Google Contacts, calendars in Google Calendar and documents in Google Drive (I had used Google Reader extensively but switched to a Fever installation following Reader’s demise). While Google’s services are world class, it became increasingly clear to me that it was not in my interest to store significant amounts of personal data with a company that has a financial interest in profiting from that information.

I wanted to replace the free services I was using with comparable services from companies whose interests were aligned with their users’ (whose users were their customers, not advertisers) and who had a clear business model (they provide a service their users pay for).1

Enter Fastmail

I explored several options for email hosting, with Rackspace Email, Hushmail and Hover’s email offering among the services that caught my attention. Ultimately, I landed on Fastmail: a reliable IMAP email provider with extensive support for custom domains, strong spam prevention and flexible server-side filtering.

I began the transition to Fastmail by using their IMAP migration tool. The migration process itself was relatively quick, too (given the volume of email in my account)2.

While your email is being migrated you should take the time to set up the aliases associated with your Fastmail account. Rather than being tied to a single email address like Google Apps, Fastmail allows you to use virtual aliases that allow you to use multiple email addresses (and even multiple domains) with the same Fastmail account.

During my switch to Fastmail I also took the time to flatten my email folder structure and associated server-side rules. I used to use umbrella folders/labels with individual subfolders/labels for senders within each category. While migrating to Fastmail I elected to keep only the umbrella categories which has allowed me to filter through broadly related emails that have been grouped together rather than tabbing through endless folders. This means I have less fine-grained control over where individual emails go but the time saved in not having to sort through endless subfolders and associated rules has been worth it.

My next step was updating my DNS records at my domain’s registrar and waiting for propagation. Fastmail has extensive documentation on its required settings for custom DNS but, in most cases, you can simply set your MX records to point to Fastmail’s servers:

in1-smtp.messagingengine.com, priority=10
in2-smtp.messagingengine.com, priority=20

You can also point your name servers to Fastmail as follows:

ns1.messagingengine.com
ns2.messagingengine.com

Additionally, you will need to add an SPF record to your domain’s DNS records as follows:

v=spf1 include:spf.messagingengine.com -all
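Taken together, the MX and SPF entries above would look roughly like this as a zone-file fragment (example.com is a placeholder for your own domain):

```
example.com.  IN  MX   10  in1-smtp.messagingengine.com.
example.com.  IN  MX   20  in2-smtp.messagingengine.com.
example.com.  IN  TXT      "v=spf1 include:spf.messagingengine.com -all"
```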

Finally, you will also need to set up DKIM signing for your outgoing email. Fastmail has instructions on the DKIM setup process on their site. The general steps they provide are as follows:

  1. Login to your FastMail account and go to Options –> Virtual Domains (or Manage –> Domains for a family/business account).
  2. Scroll to the bottom, and you’ll see a new “DKIM signing keys” section. For each domain you have, you’ll see a DKIM public key.
  3. Login to your DNS provider, and create a new TXT record for each domain listed and use the value in the “Public Key” column as the TXT record data to publish.

Contacts and calendars

While Fastmail provides an outstanding email experience, they do not currently support CardDAV syncing for contacts (CalDAV support is currently in beta). It is worth noting that Fastmail has an LDAP server that allows you to store contacts associated with your mail account (with an option to add people you correspond with automatically), but the server is read-only.

For now I’m using iCloud to sync my calendars and contacts and will weigh Fastmail’s options for each when full support arrives. I’m currently leaning towards sticking with iCloud rather than adopting Fastmail’s solutions.3 I didn’t, admittedly, explore a host of options for calendar and contact syncing outside of iCloud. I use iCloud for a handful of other things and adopting sync services from yet another party seemed clunky.

Chat

Leaving Google Apps also meant leaving Google Hangouts (which I used semi-regularly to communicate with friends and family). Fastmail does offer XMPP support for certain accounts which I have used in place of Google Hangouts. How long Google continues to support XMPP and interoperability with Google Hangouts remains to be seen.

Fastmail so far

I’ve been using Fastmail since the end of November and couldn’t be happier with it. The service has been extremely reliable (I haven’t noticed a single instance of downtime). It’s also been nice to use a traditional IMAP implementation after having used Google’s quirky implementation for so long. Fastmail doesn’t have the host of services Google provides, but it is a bulletproof email provider that I feel I can trust with my data, which was exactly what I was looking for in switching.4

Notes

I did quite a bit of research before switching to Fastmail, and a number of posts helped push me to make the move.

Have you moved to Fastmail? Are you thinking of doing so? Let me know your thoughts on it or the move to it. You can sign up for Fastmail here.5

  1. My interest in this idea, specifically was sparked by this blog post by Marco Arment: Let us pay for this service so it won’t go down 

  2. I had previously consolidated all of my old email accounts in to my Google Apps account via forwarding and by checking them via IMAP through Gmail. 

  3. I currently use the first-party mail clients on both iOS and OSX, so not having contacts and calendars synced with Fastmail is really only an issue when I use the Fastmail web interface (which isn’t all that frequently). For now I’ve been manually uploading vCard files to Fastmail which is clunky, but not all that annoying. I do miss being able to create events by clicking on parsed text (which Google Apps supported), but not all that much. 

  4. If you do get tripped up switching from another provider, Fastmail does have extensive documentation. You can also feel free to get in touch with me at @cdransf. 

  5. This is a referral link so using it will give me credit at Fastmail.