Providing the finest product available

April 22nd, 2008 by ymendel

I do a fair amount of weekend traveling to nearby cities and states, and for various reasons (flying sucks even more now thanks to the TSA; public transportation sucks in 99.53% of the US, 99.982% of the southeast) this translates to a lot of driving. Finding your way around a new city is a bit of an adventure, especially so when that city happens to be like Atlanta, where directions invariably include at least one of “get on [some crazy huge highway]” and “turn onto Peachtree”. (I knew about Peachtree long before I first visited Atlanta, but nobody warned me about Old Hickory before I moved to Nashville. That was a lesson I had to learn quickly.)

Most of the people in these travel destinations were very helpful with directions. A printer turned out to be a valuable asset if only for taking along printouts of Google Maps information. Some of the people, either in the travel destinations or as part of the traveling gang, had GPS devices. I never really cared much for those things before, but that’s often the case with products geared towards a market you’re not part of. Everything changed once the traveling started.

Still, even after experiencing first-hand how useful they can be, and even granting that one could pay for itself with how much use we’d get out of it, I was hesitant to spend $300 to $800 on one for myself, especially since I was getting more and more familiar with Atlanta’s layout (at least for the few places that mattered during these visits), and a GPS wouldn’t help much with problems like one block of Marietta being closed (probably related to the tornado damage). But once again, that changed with new information. One of my friends showed up with a Garmin nüvi 200 he got for only $150.

Mine arrived today, and I busied myself with seeing how it thought I should get around. None of the directions matched my normal routes, but they would’ve got me there in the end. My favorite thing about it (and the company) so far has nothing to do with the functionality, but is a quote from the packaging: “Garmin reserves the right to provide you with the finest product available to date. Engineering enhancements are ongoing and may not be reflected in the pictures and specifications on this package.”

One of the biggest differences in moving from employment to independent contracting/business-owning has been the increase in management, and a lot of that means managing relationships. Much of this has manifested in developing contracts and dealing with expectations. A recent bout of it involved a lawyer on the other side wanting us to enumerate, before starting the work, any third-party software we’d be using to complete it. We pushed back, explaining that being unable to use a helpful plugin unless it had been listed before we even started would make us less efficient and more expensive for them.

The quote on the GPS packaging brought that to mind. It’s just a shame that it seems to come from legal defensiveness, saying that they reserve the right to provide the finest product available, not that they’re dedicated to providing the finest product available. I’m not sure if that says more about me or Garmin, though.

Demeter: inviolable or too sexy to resist?

April 21st, 2008 by ymendel

Lots of people talk about the Law of Demeter and how important it is. Some deride it, some want to make it seem less stringent by calling it merely a “suggestion”. Me, I’m so familiar with it I call it “James”.

That’s a joke, son. Settle in ‘cause we’re just getting started.

There’s a lot I could link right here, but it takes a bit of effort to compile the best of the best. A good starting point is to go to Jay Fields’s blog and search for “demeter”. You’ll probably come back with anywhere from one quarter to one half of the entries. I’m just going to dive into the real meat of this.

And so I introduce to you shmemeter. This came out of efforts started a few jobs ago, when I ran up against limitations of the delegation built into Rails. It’s great for a lot of things, but why should I have to write my own method just to delegate customer_name to customer.name? I would rather have used Forwardable, but there’s something to be said against going outside the Rails convention and putting in a bunch of extend Forwardable calls.
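
For reference, here’s roughly what the Forwardable route looks like; the Order model and its customer association are made up for illustration, not taken from any real project:

require 'forwardable'

class Order < ActiveRecord::Base
  extend Forwardable

  # def_delegator lets you pick the delegating method's own name,
  # so customer_name forwards to customer.name
  def_delegator :customer, :name, :customer_name
end

Note that this still blows up with a NoMethodError when customer is nil, which is exactly the problem the next bit gets into.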

More than wanting to change the target method, I wanted these “wrapper” methods, these demeter artifacts (or “demartifacts”?), these saviors-or-violations-depending-on-whom-you-ask to be safe. I was working with optional associations and didn’t want to get a NoMethodError if the association wasn’t set. Tell me which you’d rather write:

class User < ActiveRecord::Base
  delegate :username, :to => :login, :nil_is_okay_please_do_not_mess_me_up => true
end

or

class User < ActiveRecord::Base
  def username
    login && login.username
  end
end

Okay, maybe that option doesn’t have the best name. It’s a good thing I didn’t stick with it.

As I said, this started a few jobs ago, and I put it aside for a while. The latest job brought those concerns back to the forefront, especially when I found myself writing methods like

class Entry < ActiveRecord::Base
  def person_name
    person ? person.name : ''
  end
end

This is what BDD gives you. I very carefully planned out how I wanted Entry#person_name to be the name of the entry’s person and that it should be the empty string if there was no person. Good thing that screwed me out of delegation. In fact, each one of those desires screwed me out of delegation the way Rails does it, and put together they just laughed in my face.

With shmemeter, I can do something simpler and cleaner

class Entry < ActiveRecord::Base
  delegate :person_name, :to => :person, :as => :name, :missing_target => ''
end

That example shows both extra options in concert. They can be used separately, of course, with :as specifying the method to call on the target and :missing_target specifying the value to use if the target is nil.
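
For instance, here’s a rough sketch of each option on its own (the note method on Person is made up for illustration, and the nil behavior of a bare :as is my assumption that it matches stock Rails delegation):

class Entry < ActiveRecord::Base
  # :as by itself -- person_name calls person.name, with no nil protection here
  delegate :person_name, :to => :person, :as => :name

  # :missing_target by itself -- entry.note returns '' when person is nil,
  # otherwise forwards to person.note under the same name
  delegate :note, :to => :person, :missing_target => ''
end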

Do what you want. Me, I say “demeter shmemeter”.

Obligatory github link: http://github.com/flogic/shmemeter/tree/master
Obligatory tracking link: http://tasks.ogconsultin.gs/projects/show/shmemeter

C.bash_history

April 17th, 2008 by vinbarnes

Piggybacking off Ben's post, here is my bash history.

$ history 1000 | awk '{a[$2]++}END{for(i in a){print a[i] " " i}}' | sort -rn | head
105 git
48 ssh
40 airport
39 ip
30 ls
26 cd
21 gvn
17 tm
17 exit
16 qri
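
For the awk-averse, here’s a rough Ruby equivalent of the same tally (history is a shell builtin, so it still has to be piped in from bash):

history 1000 | ruby -e '
  counts = Hash.new(0)
  STDIN.each do |line|
    cmd = line.split[1]      # history lines look like "  123  git status", so field two is the command
    counts[cmd] += 1 if cmd
  end
  counts.sort_by { |cmd, n| -n }.first(10).each { |cmd, n| puts "#{n} #{cmd}" }
'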

Obviously, I've made the switch to git (some time ago, in fact), as there's no svn anywhere to be found. The airport and ip commands are from desperately seeking wifi (or weefee, as the French say) while abroad. Which reminds me, I need to pen a little something about Scotland on Rails, my new favorite Scottish regional Rails conference!

Test-driving a Rails script/runner script

April 15th, 2008 by rickbradley

We’re committed to being truly test-driven. That means writing a test for EVERYTHING before the code is written. That means you don’t have a Rails migration without writing a test first that fails until that migration is written and applied. That means when you write rake tasks you write tests for them. That means when you write scripts to do data loading, for example – a real example that came up over the past few days – you write tests before you write the scripts.

I messed around with RSpec and script/runner, dug through optparse.rb, played for a while, and finally found a simple recipe for testing script/runner scripts in Rails apps (and other scripts in other places).

Here’s a set of rspec examples:

  
require File.dirname(__FILE__) + '/../../spec_helper'
require 'claim_parser/claims_importer'

describe "running the import_claims.rb script via script/runner" do
  def run_script
    # read the script source and eval it in this process, the same trick script/runner uses to run a file
    eval File.read(File.join(RAILS_ROOT, 'lib', 'claim_parser', 'import_claims.rb'))
  end

  describe 'when no filename is specified on the command-line' do
    before :each do
      Object.send(:remove_const, :ARGV)
      ARGV = []
    end

    it 'should fail' do
      lambda { run_script }.should raise_error(ArgumentError)
    end
  end

  describe 'when a filename is specified on the command-line' do
    before :each do
      @filename = '/path/to/some/fictional/claims/file'
      Object.send(:remove_const, :ARGV)
      ARGV = [ @filename ]
      @importer = stub('claims importer', :run => true)
      ClaimsImporter.stubs(:new).returns(@importer)
    end

    it 'should create a ClaimsImporter instance' do
      ClaimsImporter.expects(:new).returns(@importer)
      run_script      
    end

    it 'should provide the command-line specified filename when creating the ClaimsImporter' do
      ClaimsImporter.expects(:new).with(@filename).returns(@importer)
      run_script
    end    

    it 'should call the run method on the ClaimsImporter instance' do
      @importer.expects(:run)
      run_script
    end
  end
end

And the corresponding script:

  
#
# Import claims data
#
#   (run with script/runner)
#

require 'claim_parser/claims_importer'

raise ArgumentError, "A claims data filename is required" if ARGV.blank? or ARGV[0].blank?

ClaimsImporter.new(ARGV[0]).run
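
One refinement I’d consider on top of this recipe (not from the original specs): the remove_const/reassign dance leaves whatever ARGV the last example set in place for anything that runs after these specs, so a little helper that swaps the array’s contents and restores them afterwards is a bit safer. A sketch, with with_argv being a made-up name:

def with_argv(args)
  original = ARGV.dup
  ARGV.replace(args)        # swap in the fake command line without redefining the constant
  yield
ensure
  ARGV.replace(original)    # restore the real arguments even if the script raises
end

it 'should fail when no filename is specified' do
  with_argv([]) do
    lambda { run_script }.should raise_error(ArgumentError)
  end
end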

Gitting on the bandwagon

April 15th, 2008 by ymendel

It’s about time. The winds of change have blown, and if the Rails team can do it, a smaller, leaner team can do it too.

I’m talking about moving to git, and specifically about using github.

So any public flogic project you can see at our tracking system should also be available at github, now and in the future.