Good Software Still Takes 10 Years. Get Used to It

by Robert Moss

Just over fourteen years ago, Joel Spolsky published a wonderful essay called "Good Software Takes Ten Years. Get Used To It." The key point is captured succinctly in the title, but let’s elaborate on it just a bit.

Out of the gate, even if you launch a commercial application successfully and gain market share (or build and launch a new internal corporate application and get its users to actually use it), it takes a lot of time to make it great. Almost a full decade, in fact. There are missing features to add, performance issues to resolve, and lots of bugs to shake out. Over the course of the first 5 to 8 years, you get great feedback from the people using your software day in and day out, plus good ideas you can lift from competitors, and the ability to integrate your product with other systems and build in new third-party components. (Google Maps, anyone?)

On the flip side, though, after 10 years, as Spolsky puts it, “nobody can think of a single feature that they really need.” Thus you get bells and whistles like Microsoft Word’s talking paperclip (remember Clippy?) and unnecessary upgrades to the user experience that make things look slicker and fancier but don’t actually improve the utility of the product.

But that was back in 2001. This is the era of agile development, continuous deployment, failing fast, and all that. Kids in their parents’ garages (or in the co-working environment of tech incubators) crank out the code for a new disruptive mobile app in six months, launch it to the world, and sweep up billion-dollar valuations because their user counts are soaring and they’re going to figure out how to monetize them any day now. Right?

Spolsky’s rule of thumb, I am afraid, is as valid now as it was when he first wrote it. Oracle’s database took 10 years to get good, Lotus Notes took 10 years to get good, and so will the next new shiny thing. Building software on the web and on mobile devices and slurping down a lot of lattes and energy drinks won’t change that.

But what about Facebook and its explosive growth? Did it really take 10 years to get good? Let’s take a look.

Facebook was launched to Harvard students in February 2004 and opened up to other campuses rapidly after that. By the end of 2004 it had its millionth registered user, which is a remarkable thing to accomplish in less than a year, though admittedly it was free and took just seconds to sign up. In September 2005 the platform, originally limited to college students, was opened up to high schoolers. Support for uploading photos was added the next month—October 2005, over a year and a half after launch. The ability to tag friends in photos followed in December, and the news feed (the algorithmically generated summary of your friends’ posts) arrived in September 2006, the same month user accounts were opened up to anyone older than 13.

More features followed fast and furious over the next few years, like the ability for developers to create apps within Facebook (May 2007) and the ability to tag friends in status updates and comments (September 2009). Perhaps the most important feature for Facebook’s long-term success arrived in August 2011: the Facebook app for Android and iOS. That was some seven and a half years in.

More recent features, though, have seemed more and more grasping. 2013, Facebook’s 10th year, brought us Graph Search, emoticons to express emotions in your posts, trending topics—all those widgets that I think of as “the junk over in the far right column that I never look at.” Currently, everyone is all abuzz over the much-anticipated “dislike button,” which, in all likelihood, will not actually be a dislike button.

Earlier this year, Nilay Patel of The Verge made the perceptive argument that Facebook has become the new AOL. The firm logic of the 10-year rule dictates that it would have to: after all, they’ve already built into the original Facebook application all the features they needed to become the dominant social media platform. If you aren’t on Facebook now, what could they possibly add that would draw you in? So the ruthless logic of business dictates they must look somewhere else and try to be something else. They’ve finished the product.

OK, but what about Twitter—that got really good really fast, right? On March 21, 2006, Jack Dorsey posted “just setting up my twttr,” the first tweet, and the service was opened to the public on July 15, 2006—just over nine years ago. Twitter fought hard to resist adding features just for the sake of adding them—even when users were loudly demanding them—and to focus only on those that were essential to its success. And still it took a long time.

The hashtag debuted in August 2007. Promoted tweets showed up in April 2010—an important feature for advertisers, but also for Twitter’s business model. One of the key “features” they worked on during the early years, of course, was simply keeping the system stable so it wouldn’t crash. (Do you miss the fail whale?) As had been the case at Facebook, just keeping the servers up occupied much of the engineers’ energy.

Twitter went mobile in 2010—four years in—and it did so not by building but by acquiring Atebits, which had developed the iPhone app Tweetie. The fall of that year brought the ability to view pictures and videos inside the Twitter client (instead of jumping via a link to a website hosting the content). December 2011 saw the arrival of the “Fly” design, which brought a consistent experience across web and mobile.

And now, as Twitter begins its 10th year, it’s worth asking whether it’s entering its talking-paperclip phase, where “nobody can think of a single feature that they really need.” The recent reinstatement of Jack Dorsey as CEO was attributed to the company’s need to accelerate “the pace of product improvements.” But what if the reason Twitter isn’t adding new users at the pace it once did is not that it needs new features but that it has pretty much added all the ones it really needs?

But I’m interested in the 10-year rule not so much for its implications for those managing mature applications as for its lessons for those just setting out to create new ones. It’s a warning against the terribly unrealistic expectations of software developers, end users, and—most importantly—the business and technology leaders who make decisions about ambitious new development projects. The crucial business mistakes Spolsky warned against in 2001—Get Big Fast syndrome, overhyping early releases, belief in “Internet time”—are still with us today. Most dangerous, in my view, is the misguided notion that if we just organize projects correctly, staff them with good people, and motivate them properly, we can somehow suspend the fact that making great software simply takes a lot of time and effort. It’s hard work.

I’ve seen that dynamic recently in a team I’ve been working with that is in the closing weeks of a grueling 18-month project. The mission was basically to take a bunch of outdated internal corporate applications and replace them with a slick new web-based front end that integrates with a complex array of older back-office systems. They’re dismayed by the bugs and quality issues and “missed requirements” they’re shaking out as they take the initial version live and end users get into the system to do real work for the first time. Agile stories and collaborative scrumming and late-night coding sessions didn’t prevent those challenges, and they’re left wondering what they did wrong. (The answer: nothing. This is just how it works.)

I’ve seen it in the continuous agitation within the engineering teams at commercial software companies who want to retire a “legacy” platform that’s six or seven years old and rebuild the whole thing from scratch. “We can make it better, stronger, faster.” “How long will you need?” “18 months.” (It always seems to be 18 months. I think that’s because developers really believe they can do just about anything in a year, and they throw in a little extra buffer just in case.)

It’s understandable to want to chuck all that “technical debt” and start from scratch: our new systems need to be web-based, service-oriented, accessible outside corporate firewalls, and usable on mobile devices, either with a native app client or via responsive design. That’s a lot of stuff that old forms-based client-server applications and even older web applications simply can’t do.

But the really hard part is all of the application-specific functionality: the business logic, the long, slogging trial and error of figuring out what works for users and what doesn’t. There are plenty of wise things you can do to get good results more quickly, like using commercial software instead of trying to build everything yourself, and using open source and third-party libraries that have a community of developers shaking out the kinks. (It’s really hard to add those essential features you need when you’re busy fixing bugs and refactoring code.) Often it even makes sense to just wrap the old mainframe or put a new front end on old client-server code, as ugly and convoluted as that old code might be.
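To make that concrete, here’s a minimal sketch of what “putting a new front end on old code” can look like in practice: a thin facade that translates between a modern JSON API and a legacy back-office protocol. Everything here (the gateway URL, the fixed-width record format, the field widths) is hypothetical, invented purely for illustration.

```typescript
// A minimal facade over a hypothetical legacy order system.
// All endpoint names, URLs, and field widths are invented for this sketch.
import express from "express";

const app = express();

// Hypothetical gateway sitting in front of the old mainframe.
const LEGACY_URL = "http://mainframe-gateway.internal/orders";

// Translate one fixed-width legacy record into the JSON the new UI expects.
function parseLegacyRecord(line: string) {
  return {
    orderId: line.slice(0, 10).trim(),
    customer: line.slice(10, 40).trim(),
    status: line.slice(40, 42).trim(), // two-character status code, e.g. "SH"
  };
}

// The modern JSON endpoint that the new web front end actually calls.
app.get("/api/orders/:id", async (req, res) => {
  try {
    const legacyResponse = await fetch(
      `${LEGACY_URL}?id=${encodeURIComponent(req.params.id)}`
    );
    if (!legacyResponse.ok) {
      res.status(502).json({ error: "legacy system unavailable" });
      return;
    }
    res.json(parseLegacyRecord(await legacyResponse.text()));
  } catch {
    res.status(502).json({ error: "legacy system unreachable" });
  }
});

app.listen(3000);
```

The appeal of a facade like this is that the hard-won business logic stays where it already works. The new front end can evolve on its own schedule, and the question of whether to ever rewrite the back end becomes a separate decision rather than a prerequisite.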

Sometimes we do need to just throw it away and start over from scratch. But that’s not a decision to enter into lightly, and we should make it with eyes wide open. If you’re doing anything of significant scale, it’s not a 12-month effort or even an 18-month one. Sure, you can get something launched and in use in production, but that’s just the start of the game, not the finish. Good software takes 10 years. And we still need to get used to it.
