Ben Northrop


Decisions and software development


An Architect's Morality


(1 comment)
May 20th 2010


There was an interesting article in Nature arguing that our moral intuitions are formed as much by rational deliberation as by our experiences (despite the latter being more in vogue in academic circles - who knew!).

It got me thinking about how in any field (whether software development, biology, finance, etc.) people develop very finely tuned intuitions about what is "right" or "wrong" - something that seems akin to moral judgment. This happens in software development without a doubt - people who have been in the field for a while can have powerful, immediate gut reactions to problems - they just know that a certain path is "right" or "wrong".

The question I'm kicking around is, in software development, how are our "moral" intuitions best formed - by rational deliberation (e.g. reading, observing, studying, listening) or personal experience (playing many roles, living in the trenches)? In other words, which route would lead to greater "moral wisdom" - more study or more lived experiences?

(This question is a lot like the kerfuffle over Justice Sotomayor's "wise Latina" comments a few months back - is she more "wise", coming from a rough neighborhood in the Bronx, than someone who came from the upper-middle-class 'burbs?)

My sense is that no amount of study or observation can make up for living in the trenches and feeling the pain of bad decisions (your own and others'!). I can read all day about why unit testing is great, but it doesn't sink in to the level of "moral intuition" until I live through an experience where there are a million lines of code, no unit tests, and so no clue as to whether my change broke something else or not.

Anyway, just a thought...

I believe that software development is fundamentally about making decisions, and so this is what I write about (mostly). I've been building software for about 20 years now, as a developer, tech lead, and architect. I have two degrees from Carnegie Mellon University, most recently one in philosophy (thesis here). I live in Pittsburgh, PA with my wife, 3 energetic boys, and dog. Subscribe here or write me at ben dot northrop at gmail dot com.

Got a Comment?


Sign up to hear about the next post!

If you liked this article and want to hear about the next one, enter your email below. I don't spam - you'll only receive an email when there's a new post (which is about once a month, tops). It's all low-key, straight from me.

Comments (1)

October 17, 2016
Interesting thought! I'd say almost invariably the moral intuition part of it comes from personal experience. A good example for me is in studying vs. finding design patterns.

There are some patterns I stumbled upon organically in the code I was working with, only realizing a long while later that there's an abstracted version of them with a name. This happens even more often with anti-patterns (the "anemic domain model" comes to mind).

The ones that I actually studied first, in their textbook form, are all great academic fodder. But I honestly cannot think of a single time I took an approach based on something I only read or heard about. I always find myself making educated decisions based on my previous experience with something similar, and then (hopefully) adding some new knowledge to my experience catalog after finding some shortcoming in the decision I made.