endot

eschew obfuscation (and espouse elucidation)

Git-walklog

Most of the time, when looking at history in a git repository, I’m interested in changes at a higher level than an individual commit. From time to time, however, I really want to look at each commit on its own. So, I created git-walklog. For each commit in the specified range, it:

  1. Shows the standard log format: author, date, and commit message. Then it waits for input.
  2. Hitting enter then runs git difftool on just that commit, showing you any differences in your configured difftool.[1]

If you want to skip a commit, all you need to do is type ‘n’ or ‘no’.
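
At its core it’s a simple loop. Here’s a minimal sketch of the idea (an illustration only, not the actual git-walklog source):

# minimal sketch of the walklog idea; revision arguments (e.g. --reverse master..branch)
# are passed straight through to git rev-list
for commit in $(git rev-list "$@"); do
    git log -1 "$commit"                    # author, date, and commit message
    printf 'Show diff? [Y/n] '
    read -r answer
    case "$answer" in
        n|no) continue ;;                   # skip this commit
        *)    git difftool "$commit^" "$commit" ;;   # show just this commit's changes
    esac
done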

I usually run git log with different options until it shows just the entries I’m interested in, and then I replace log with walklog to cruise through the commits.

Examples

To see the last three commits:

git walklog -3 --reverse

To see the changes for a particular branch:

git walklog master..branch --reverse

To see what came in the last git pull:

git walklog master@{1}.. --reverse

I usually put --reverse in there, because I want to see the commits in the same order as they were created.

Enjoy.

  [1] You do have a difftool configured, don’t you? Run git config --global diff.tool vimdiff, then use git difftool instead of git diff, and all your diffs will show up in vimdiff. This works for other diffing tools too; look for “Valid merge tools” in man git-difftool.

Git Subtree Tracking Made Easy

Last year, when I made my list of pros and cons comparing git subtrees with submodules, one of the downsides listed for subtrees was that it’s hard to figure out where the code came from originally.

Well, it seems that the internet hasn’t been sitting on its hands. While the main repository has remained stable, a couple of forks have taken it upon themselves to teach git-subtree to keep a record of what it merges in a .gittrees file. When a subtree is added, something like the following is added to the file:

[subtree "bin/remotecopy"]
    url = git@github.com:justone/remotecopy.git
    path = bin/remotecopy
    branch = master
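
Since .gittrees is just git-config syntax, entries can be read back with git config. For example, given the entry above:

$ git config -f .gittrees subtree.bin/remotecopy.url
git@github.com:justone/remotecopy.git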

I created my own fork, merged in those changes, and added the ability to prune stale entries.

The next task was finding all the subtrees in my dotfiles repository and adding entries like the one above. Here’s the shell script I used:

#!/bin/bash
# note: bash (not plain sh) is needed for the [[ =~ ]] and &> constructs below

# for each path that was subtree merged
for path in `git log --grep Squashed --oneline | awk '{ print $3 }' | sort | uniq | sed "s/'//g" | sed "s/\/$//g"`; do

    # check to see if the subpath is already in .gittrees
    git config -f .gittrees subtree.$path.url &> /dev/null
    if [ $? -eq 0 ]; then
        echo "$path already configured"
    else

        # look for the most recent commit
        commit=`git log --grep "Squashed '$path/'" --oneline | head -n 1 | awk '{ print $NF }'`
        if [[ $commit =~ '..' ]]; then
            commit=`echo $commit | cut -d . -f 3`
        fi
        echo "last commit for $path is $commit";

        # ask for the git url
        echo "Enter url: "
        read URL

        # record the subtree info
        git config -f .gittrees --unset subtree.$path.url
        git config -f .gittrees --add subtree.$path.url $URL
        git config -f .gittrees --unset subtree.$path.path
        git config -f .gittrees --add subtree.$path.path $path
        git config -f .gittrees --unset subtree.$path.branch
        git config -f .gittrees --add subtree.$path.branch master
    fi

    echo "------------------------------"
done

When run, it finds any subtrees that have been squashed in and shows the path and the last short commit id. For instance:

$ sh ../subtree_commits.sh 
last commit for .vim/bundle/tabular is b7b4d87
Enter url: 

Then all I had to do was find the right git repository online and paste the url in. I usually accomplished this by searching for github [pluginname] and then appending /commit/[short sha1] to the url to see if the commit existed. For instance, looking for github tabular led me to https://github.com/godlygeek/tabular, and appending /commit/b7b4d87 shows that the commit exists in that repository, so it’s likely the right one. For the plugins that aren’t hosted by their authors on github or elsewhere, the vim-scripts mirror was usually where I ended up. The script can be run multiple times; it skips any subtrees that already have entries in .gittrees.

After running the script for a while, and searching for all the home repositories for my subtrees, I ended up with this .gittrees file.

Now, if I want to list my subtrees:

$ dfm subtree list
    bin/git-subtree        (merged from https://github.com/justone/git-subtree.git branch master) 
    bin/remotecopy        (merged from git@github.com:justone/remotecopy.git branch master) 
    .bashrc.d/resty        (merged from https://github.com/micha/resty.git branch master) 
    .bashrc.d/z        (merged from https://github.com/rupa/z.git branch master) 
    .vim/bundle/AutoTag        (merged from https://github.com/vim-scripts/AutoTag.git branch master) 
    .vim/bundle/FuzzyFinder        (merged from https://github.com/vim-scripts/FuzzyFinder.git branch master) 
    .vim/bundle/L9        (merged from https://github.com/vim-scripts/L9.git branch master) 
    .vim/bundle/ack        (merged from https://github.com/mileszs/ack.vim.git branch master) 
    .vim/bundle/bufexplorer        (merged from https://github.com/vim-scripts/bufexplorer.zip.git branch master) 
    .vim/bundle/bufkill        (merged from https://github.com/vim-scripts/bufkill.vim.git branch master) 
    .vim/bundle/conque-shell        (merged from https://github.com/vim-scripts/Conque-Shell.git branch master) 
    .vim/bundle/gundo        (merged from https://github.com/sjl/gundo.vim.git branch master) 
    .vim/bundle/regbuf        (merged from https://github.com/tyru/regbuf.vim.git branch master) 
    .vim/bundle/syntastic        (merged from https://github.com/scrooloose/syntastic.git branch master) 
    .vim/bundle/tabular        (merged from https://github.com/godlygeek/tabular.git branch master) 
    .vim/bundle/taglist        (merged from https://github.com/vim-scripts/taglist.vim.git branch master) 
    .vim/bundle/ultisnips        (merged from https://github.com/SirVer/ultisnips.git branch master) 
    .vim/bundle/vcscommand        (merged from git://repo.or.cz/vcscommand.git branch master) 
    .vim/bundle/vim-colors-solarized        (merged from https://github.com/altercation/vim-colors-solarized.git branch master) 
    .vim/bundle/vim-fugitive        (merged from https://github.com/tpope/vim-fugitive.git branch master) 
    .vim/bundle/vim-markdown-preview        (merged from https://github.com/robgleeson/hammer.vim.git branch master) 
    .vim/bundle/vim-octopress        (merged from https://github.com/tangledhelix/vim-octopress.git branch master) 
    .vim/bundle/vim-r        (merged from https://github.com/jcfaria/Vim-R-plugin.git branch master) 
    .vim/bundle/vim-speeddating        (merged from https://github.com/tpope/vim-speeddating.git branch master) 
    .vim/bundle/vim-unimpaired        (merged from https://github.com/tpope/vim-unimpaired.git branch master) 

And to update a particular subtree:

$ dfm subtree pull -P .bashrc.d/z --squash
From https://github.com/rupa/z
 * branch            master     -> FETCH_HEAD
git fetch using:  https://github.com/rupa/z.git master
Merge made by recursive.
 .bashrc.d/z/z.sh |   49 +++++++++++++++++++++++++++++++++++++++++++------
 1 files changed, 43 insertions(+), 6 deletions(-)

Alright, that’s all I have time for today. Enjoy.

Dfm Tip: Untracked Binaries

The problem

From time to time, I have scripts and binaries that I only really need on one system. Since the base dotfiles repo includes a bin directory (for dfm itself), if I just drop files in there, git continually shows me that they are untracked. For instance, if I have a script called only_on_my_mac, then running dfm status shows:

# On branch personal
# Untracked files:
#   (use "git add <file>..." to include in what will be committed)
#
#       bin/only_on_my_mac
nothing added to commit but untracked files present (use "git add" to track)

At the very least, this is annoying. Here, I offer three solutions to this problem.

Solution 1: .gitignore

The first solution is to just drop a .gitignore file in the bin directory and specify each file that needs to be ignored:

$HOME/.dotfiles/bin/.gitignore
only_on_my_mac

If you only have a couple scripts, this may work. However, after the third entry, this gets rather tedious.

Solution 2: Create an excluded directory

This solution is a slight modification of the previous one. Instead of ignoring each individual file, create a directory for excluded scripts:

mkdir .dotfiles/bin/excluded

Then, ignore all the files inside that directory (except the .gitignore file itself):

$HOME/.dotfiles/bin/excluded/.gitignore
*
!.gitignore

Finally, update .bashrc.load to add the new directory to the path:

$HOME/.dotfiles/.bashrc.load
PATH=$HOME/bin/excluded:$PATH

Now, any scripts that are only for one system can just be dropped into the bin/excluded directory and git/dfm won’t try to track them.
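
A quick sanity check (assuming the paths above and that .bashrc.load has been re-sourced):

touch ~/bin/excluded/only_on_my_mac && chmod +x ~/bin/excluded/only_on_my_mac
dfm status                # should no longer complain about untracked files
which only_on_my_mac      # should resolve to ~/bin/excluded/only_on_my_mac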

Solution 3: Use dfm recursion

This solution involves modifying the .dfminstall file in the base of the dotfiles repository. Add the following line:

$HOME/.dotfiles/.dfminstall
bin recurse

Then, when dfm installs, it will symlink any scripts in the dotfiles’ bin directory instead of symlinking the entire directory.

without recursion
$ ls -al bin
lrwxr-xr-x  1 jones  staff  13 Oct  4 20:20 bin -> .dotfiles/bin
with recursion
$ ls -al bin
drwxr-xr-x 3 vagrant vagrant 4096 2012-01-23 01:31 .
drwxr-xr-x 5 vagrant vagrant 4096 2012-01-23 01:31 ..
-rw-r--r-- 1 vagrant vagrant   12 2012-01-23 01:31 only_on_my_mac
drwxr-xr-x 2 vagrant vagrant 4096 2012-01-23 01:31 .backup
lrwxrwxrwx 1 vagrant vagrant   20 2012-01-23 01:31 dfm -> ../.dotfiles/bin/dfm

This solution only started working today. There was a bug in dfm that prevented the bin directory itself from being recurse-able. If you want to use this solution, you’ll need to merge the latest code.

Conclusion

I personally use Solution #2, mostly because I don’t want to keep a list of files to ignore (translation: lazy).

I hope these solutions are helpful. Enjoy.

My Tmux Configuration

Update: I refined my configuration. See it here.

For the longest time, I was a screen user. Then, a little while ago, I discovered tmux, the next generation terminal multiplexer. Not only is it easier to search for on google, it has a rich and consistent configuration language.

I’ve settled on a rather unusual tmux configuration and I wanted to share it.

Background

Originally, I just used tmux on remote servers to control several windows. This made it easy to create new remote windows, but I had to keep multiple Terminal tabs open, one for each remote server. I had long wanted to be able to reconnect with my tabs, much in the same way that I could reconnect with the windows on an individual server. I contemplated just running tmux locally on my mac, but then each new window would have required a new connection, and they wouldn’t be logically grouped.

So I run tmux locally and remotely.

Nested tmux

I manage my nested tmux sessions with three configuration files (see the sketch after this list):

  1. .tmux.shared - contains configuration and bindings that are shared between my master and remote sessions
  2. .tmux.master - contains configuration unique to my local (master) session
  3. .tmux.conf - contains configuration unique to the remote sessions
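
Each of the master and remote configs can pull in the shared file with tmux’s source-file command, and the local session is then started against its own config. A minimal sketch (the session name and exact invocation are my guesses, not taken from the configs themselves):

# .tmux.master and .tmux.conf would each start with:
#   source-file ~/.tmux.shared
# then the local (master) session is launched with its own config file:
tmux -f ~/.tmux.master new-session -s master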

Shared configuration

Note: bind -n maps a key that works all the time, while regular bind maps a key that has to be prefixed with the prefix key.

The shared configuration contains three basic sections:

  1. Vim-ish keybindings - I set them whenever I can get them. (-:
  2. Misc. configuration - One screen-compatible binding and one I’ll highlight later.
  3. Status bar configuration - This I mostly copied from someone else.

Local configuration

The important setting here is updating the prefix to be Ctrl-Alt-b. It took a few days to get used to hitting it, but my left ring finger now drops down to hit the alt key when I want to do local operations.

set-option -g prefix M-C-b

One convenience of using Terminal tabs is cruising between them with Shift-Cmd-[ and Shift-Cmd-]. To get a similar facility, I map Ctrl-Alt-h and Ctrl-Alt-l to previous and next:

# window navigation
bind-key -n M-C-h prev
bind-key -n M-C-l next

Remote configuration

My remote configuration file doesn’t make any modifications with regard to nesting tmux sessions. It uses the default Ctrl-b prefix and is named .tmux.conf so that it is the default when tmux is started.

It doesn’t specify next and previous window navigation like the master config does, because the corresponding keys would be Alt-h and Alt-l, which confuse vim whenever I hit Escape followed by h (something that happens rather frequently). Instead, I fall back on the normal tmux navigation for next and previous window.

Other nifty settings

Resizing panes

I often split windows into multiple panes. While tmux has some nice default layouts, it is sometimes easier to just move the divisions yourself.

Here’s the configuration section for resizing from my remote configuration:

# keybindings to make resizing easier
bind -r C-h resize-pane -L
bind -r C-j resize-pane -D
bind -r C-k resize-pane -U
bind -r C-l resize-pane -R

Hitting the sequence Ctrl-b Ctrl-h moves the division between the current pane and the adjacent one a single column to the left, and Ctrl-j, Ctrl-k, and Ctrl-l do the same in the other directions. What makes this usable is the -r flag, which means I can just keep hitting Ctrl-h as many times as I want until the panes look right.

Synchronizing input

Every so often, I want to send the same input to all panes in a particular window. With this configuration, it’s easy to toggle the built in synchronization:

# easily toggle synchronization (mnemonic: e is for echo)
bind e setw synchronize-panes on
bind E setw synchronize-panes off

Conclusion

Using these settings makes it a cinch to reconnect to my entire work environment from anywhere. I have two status bar lines at the bottom of my screen; the lower one is analogous to Terminal tabs and the upper one shows my remote windows.

Enjoy.

Remotecopy - Copy From Remote Terminals Into Your Local Clipboard

Problem

I copy and paste all the time. Most of the time, I copy short pieces of information that are too long to type (I’m lazy) but too short to set up anything more complex (wget, scp, etc.). For a while, this was fine, as most of my copy targets were either local to my system or in a terminal window on a remote server. However, as I increased my use of splits in tmux and windows in vim, highlighting remote text with my mouse became horribly cumbersome. I needed a way to copy remote text into my local clipboard.

Partial solutions

Recent versions of Terminal let you select blocks of text when holding down the alt key, but when I copied and pasted, the resulting block of text had extra trailing whitespace.

Another solution I tried was MouseTerm. It’s a SIMBL plugin that sends your mouse events straight through to the remote terminal apps. So, I could “set mouse=a” and then select text in any vim window without overlapping other windows. The only problem was that once the text was selected, I couldn’t copy it back to my local computer.

Then, I found remote-pbcopy. It’s a setup where pbcopy is running in a daemon mode on your local laptop and listening on a specific port. That port is then forwarded to the remote machine with SSH. Finally, a little alias facilitates piping output into that port. The result: remote data ends up in your local clipboard.

This is exactly what I wanted. However, I didn’t like the caveat at the end: there is no security on the listening daemon. This means that if any malicious (or prank-minded) person can figure out what port you are using, they can smash your local copy buffer.

Remotecopy

My solution to this problem, remotecopy, is an evolution of remote-pbcopy. It uses a secret value, like a password, to authenticate copy requests. To do this, it replaces the client and server with perl equivalents so that a little extra logic can be added.

Here’s the sequence of events.

  1. Start remotecopyserver on your local laptop.
  2. SSH to a remote host with the following argument: -R 12345:localhost:12345
  3. On the remote host, run remotecopy 'test string'
  4. Hit cmd-v and enter
  5. ‘test string’ is now in your clipboard.

Here’s how it works.

When remotecopy is run, it makes a connection to localhost:12345 (and therefore the remotecopyserver, via SSH). Then, a short handshake is done, followed by the transfer of the copy data.
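
To avoid typing the -R argument on every connection, the forward can also be baked into your ssh config (a sketch; “myserver” is a placeholder host and 12345 matches the example above):

$HOME/.ssh/config
Host myserver
    RemoteForward 12345 localhost:12345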

Before I describe the client side of the interaction, here is how the server operates:

  1. On startup, generate a secret string. Listen for connections.
  2. When a connection is made, the client will send its secret.
  3. If the secret matches the local secret, tell the client that it can send the copy data. Read the data and push it into the local clipboard with pbcopy.
  4. If the secret is invalid or missing, tell the client so and close the connection. Push the secret string into the local clipboard with pbcopy.

It’s important not to miss the last part of step 4. This makes the secret available later.

Now, back to remotecopy. When remotecopy runs, it doesn’t know the secret from the server. It does the following:

  1. Connect to the server and send an empty secret. The server sends back a rejection.
  2. Prompt the user for the secret value. (Because the server copied it into the paste buffer, all you need to do is paste (cmd-v) and hit enter)
  3. Reconnect to the server, sending the secret and then the copy data.

It’s quite a long description, but the process is very quick. If you already have the secret in your clipboard history, you can pass -s <secret> and remotecopy will only need to make one connection.

Example runs

For each of these examples, after the secret is entered, the data is in the server’s copy buffer.

Copy a simple string.

$ remotecopy foo
Input secret:
rc-b212f4520c3e33a689edcca88d6845b8

Copy output of another program.

$ ls | remotecopy
Input secret:
rc-b212f4520c3e33a689edcca88d6845b8

Specify secret on command line

Note: no prompt is needed.

$ ls | remotecopy -s rc-b212f4520c3e33a689edcca88d6845b8

Using remotecopy with vim

Since I use vim as much as possible, remotecopy includes a vim plugin that enables sending data from remote vim sessions.

To copy the entire file or the visual selection, use ,y. To copy a particular buffer, use ,r. The first time a copy is attempted, there will be a prompt for the secret. After that, the secret is cached so future copies are quick.

Using remotecopy with dfm

If you’re using dfm to manage your dotfiles, just copy the scripts into your bin directory. Both remotecopy and remotecopyserver are self-contained perl scripts that don’t have external module dependencies.

You can also use git subtrees and symlink the vim plugin, like I do here and here.
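
For the subtree route, the underlying command is plain git subtree; something along these lines, run from the root of the dotfiles repo (a sketch, using the remotecopy repository url from github):

git subtree add --prefix=bin/remotecopy --squash git@github.com:justone/remotecopy.git master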

More information

Each script has full documentation. Just run with the --man option to view it.

The code is available on github.

Enjoy.

Octopress Migration Details

As is customary for those who’ve converted from WordPress to Octopress, here’s a quick post about my experience converting this blog.

Getting the blog up and running was a cinch, especially with a good example to examine when I had questions.

Converting old entries

To convert my WordPress entries, I turned to exitwp. It worked pretty well, but I ran into two issues.

The first was that the YAML blob at the top of the converted posts wasn’t formatted correctly.

---
author: nate
date: '2011-10-29 18:53:01'
layout: post
slug: git-submodules-vs-subtrees-for-vim-plugins-part-2
status: publish
title: Git submodules vs. subtrees for vim plugins, part 2
wordpress_id: '328'
? ''
: - Misc
---

I solved this by switching to chitsaou’s fork.

The second problem was that html2text, which exitwp uses to do the actual conversion, was hard-wrapping lines at 78 characters. I fiddled with it for quite a while, hacking the backend code for html2text, but then I remembered that markdown parsers pass HTML straight through (and that I don’t care if old entries are regular HTML). So I just modified the exitwp script to gut the html2fmt method:

$ git di
diff --git a/exitwp.py b/exitwp.py
index ae58d24..d21a8df 100755
--- a/exitwp.py
+++ b/exitwp.py
@@ -37,12 +37,7 @@ item_field_filter = config['item_field_filter']
 date_fmt=config['date_format']

 def html2fmt(html, target_format):
-    html = html.replace("\n\n", '<br>')
-    if target_format=='html':
-        return html
-    else:
-        # This is like very stupid but I was having troubles with unicode encodings and process.POpen
-        return html2text(html, '')
+    return html

 def parse_wp_xml(file):
     ns = {

Old source highlighting plugin

I had used SyntaxHighlighter Evolved in WordPress to handle my syntax highlighting needs, so I needed to convert those to Octopress’ triple backtick format.

For this, I turned to some perl one-liners:

perl -MHTML::Entities -p0777i -e 's/(\[sourcecode[^]]*\])(.*)(\[\/(sourcecode)\])/$1.decode_entities($2).$3/mse' *.markdown
perl -p -i -e 's/\[sourcecode.*language="([^"]+)"[^]]*\]/``` \1\n/' *.markdown
perl -p -i -e 's/\[sourcecode[^]]*\]/```\n/' *.markdown
perl -p -i -e 's/\[\/sourcecode\]/\n```/' *.markdown

That first one uses a little trick to slurp in entire files (seen here) and decode HTML entities.
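
To make the transformation concrete, a (made-up) WordPress-era block like this:

[sourcecode language="bash"]
echo "hello"
[/sourcecode]

comes out roughly as:

``` bash
echo "hello"
```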

Finishing touches

And, finally, many urls on my site included the hostname, making the Octopress preview less useful. One more perl one-liner:

perl -p -i -e 's{http://endot.org/}{/}g' *.markdown

Final verdict

Octopress is awesome.

Enjoy.

Git Submodules vs. Subtrees for Vim Plugins, Part 2

When I talked about submodules vs. subtrees before, one of the things I listed as a benefit for subtrees was the speed of the initial clone.  I’d written a few scripts to help me benchmark the two, and with a little extra time that I have this weekend, I thought I’d share the data.

I generated 2, 4, 6, 8, and 10 plugin repositories for both submodules and subtrees and cloned each one ten times over both a local and a remote connection.  Here is the result:

As you can see, submodules take longer to clone for each plugin you add, while subtrees stay pretty much the same.  Here’s the R code to generate the graph:

#!/usr/bin/env Rscript

library(ggplot2) # load up the ggplot2 library

# load up the data from the google csv export
smst <- read.csv('data.csv')

# add names to the data
names(smst) <- c('type', 'count', 'time')

# force count to be a factor instead of a continuous variable
smst$count <- factor(smst$count)

# calculate the mean for each type/count group
smst_mean <- aggregate(list(time=smst$time), list(type=smst$type, count=smst$count), mean)

png(filename = "submodule_vs_subtree.png", width=700, height=700)

ggplot(smst_mean, aes(x=count, y=time, group=type, color=type)) + geom_line(size = 2) + ylab("time") + xlab("plugin count") + opts(title = "Submodule vs. Subtree checkout times")

# close the png device so the image file is actually written out
dev.off()

The generation and benchmarking scripts as well as the reported data and code are in my submodule_vs_subtree repo on github.
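
For reference, the timing side of the benchmark boils down to cloning each generated repository repeatedly and recording how long it takes. A rough sketch of that loop (the repository paths and the coarse per-second timing are illustrative, not the actual benchmark script):

# rough sketch of the clone-timing loop; repo paths are hypothetical
for n in 2 4 6 8 10; do
    for run in $(seq 1 10); do
        rm -rf work
        start=$(date +%s)
        git clone --quiet --recursive "repos/submodule-$n.git" work
        echo "submodule,$n,$(( $(date +%s) - start ))" >> data.csv
    done
done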

jQuery Conference: Boston

A couple weeks ago, I had the privilege of crossing the country to attend the jQuery conference in Boston.  It was a unique opportunity for me.  While we use jQuery extensively at work, I haven’t done much with it lately.  However, since we have a few projects coming up that will be web focused, it was a timely occurrence.  I went with my good friend and coworker David (I’d link to his blog, but he doesn’t have one yet, cough) and another coworker, Erin, who had been to Boston before.

First off, Boston was awesome.  When we arrived on Friday night, we popped over to the conference venue to register before David and I headed into north Boston.  Erin was off to meet some local friends, but not before giving us a quick overview of the city.  We headed up to Hanover Street near North End Park, where we expected to find good Italian food.  We weren’t in the mood for Italian when we reached the area, so we wandered a while before ending up at the Green Dragon Tavern, an establishment that claims to have once been the “Headquarters of the Revolution” for a while.  After starting with a couple beers, we feasted on a Hot Corned Beef (with Mustard Horseradish Sauce) sandwich and some Boston Bangers and Mash.  It was a meal to write home about (or, you know, blog about).  We hadn’t even started the conference and it was already a great trip.  We continued to go out each night after the day’s festivities, but that first meal was easily the best.

The conference itself was great too.  My lack of recent experience with jQuery meant that almost all the talks were interesting and usually resulted in me adding to my list of things to learn.  Here are some of the highlights (in roughly chronological order):

The jQuery Mobile keynote - Todd Parker & Scott Jehl

I’m quite interested in the mobile web at the moment, and so everything in this talk was fascinating.  They support a whole host of devices (not just iOS) and are very close to releasing 1.0.  Also, there’s an awesome demo of jQuery Mobile’s slide transitions (click here to load the demo directly on your phone).  It’s just regular html with links to images and jQuery Mobile transforms it into a sliding image gallery.

Plugin Authoring Best Practices - Ben Alman

I haven’t created any plugins (yet) for jQuery, but after this talk I not only have the tools and templates to get started quickly, I understand what some of the line noise is when looking at existing plugins.  Also, I learned the term IIFE.

Event Improvements in jQuery 1.7 - Dave Methvin

This was, by far, the most entertaining talk of the weekend. Not only did I learn about the cool new unified event declaration model that’s coming in jQuery 1.7, I got a refresher course on all the memes I have and haven’t seen.

There was quite a bit more from the weekend.  Here are my raw notes, with many many URLs to explore.

All in all, it was a great trip.

Dfm Updates: Uninstall and Updatemergeandinstall

In my dfm talk a couple weeks ago, I listed out some low-hanging fruit: just a few things that I thought would be easy to add to the system.  Well, this past weekend, I went to the jQuery conference in Boston.  It was a great conference in its own right, and I hope to post on it this week, but for now I want to talk about the improvements I made to dfm while on the plane.

dfm uninstall

The first thing I tackled was the ability to remove dotfiles if they weren’t needed anymore.  Sometimes, you need to log into a shared account (such as root) and you’d like to use your settings, but not leave them behind for the next person.  Now, ‘dfm uninstall’ does just that.

dfm updatemergeandinstall

Up until now, fetching your latest dotfiles changes required two steps: ‘dfm updates’ and ‘dfm mergeandinstall’.  Well, now there’s updatemergeandinstall that does both.  It takes the same flags as either updates or mergeandinstall and works just like it sounds.  Oh, and for the perennially lazy (i.e., me), it has a shortcut named ‘umi’.

The Rest

The test suite covers all the new commands and I added more coverage for the lightly tested mergeandinstall code.  ‘dfm install’ now cleans up dangling symlinks, for those times when a file is no longer needed.  And, finally, I refactored the code somewhat to make it easier to work on and to reduce duplication.

To update to the latest, just run these commands:

$ dfm remote add upstream git://github.com/justone/dotfiles.git
$ dfm pull upstream master

Enjoy.