Who's in the Video
David Heinemeier Hansson is a Danish programmer and the creator of the Ruby on Rails open source web development framework. He is also a partner with Jason Fried at the web-based software development firm[…]

Cloud computing shouldn’t be an either/or decision. We should definitely make use of the tremendous collaborative possibilities of the Web for some tasks but utilize “the awesome local, graphical power and computing power of a modern computer” for others.

Question: The trend now is towards client-side applications, but Rails deals primarily with server code. Does Rails need to evolve to keep up?

David Hansson: So Rails has actually been interested in the client side for a long time. When AJAX sort of first got its initial push, when it got its acronym, back in, I think, 2006, Rails was one of the first server-side frameworks that said, "This is going to be huge and we're going to do something about that." So we put a JavaScript library, Prototype, straight in the Rails distribution, and we built a bunch of helpers around that to make it easier to create AJAX applications with Rails. And today it's almost inconceivable that you'd build a new, modern web application that doesn't have some aspect of AJAX in it.
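For a concrete picture of the helpers he's describing, here is a minimal sketch using the classic `link_to_remote` helper from Rails 2.x-era distributions that bundled Prototype; the controller, action, and element names are assumptions for illustration, not from the interview.

```erb
<%# Sketch of a Prototype-era Rails AJAX helper (Rails 2.x). Clicking the %>
<%# link issues an Ajax request via the bundled Prototype library and     %>
<%# replaces the contents of the #comments element with the response.     %>
<%= link_to_remote "Refresh comments",
      :url    => { :controller => "comments", :action => "index" },
      :update => "comments" %>

<div id="comments">
  <%= render :partial => "comments/comment", :collection => @comments %>
</div>
```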

Now, some people go a lot further than just having some aspects of AJAX in it. Some people have their entire application in JavaScript and just use the back end as a data store for that. I don't find that development experience that pleasurable. I have come to tolerate JavaScript now that there are great libraries and frameworks like Prototype around it to sort of make it a little more pleasurable, but it's still no Ruby. Ruby is still my first love in terms of programming languages. And however much you paint up JavaScript, it's not going to beat that. Which is fine.
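The "back end as a data store" arrangement he's describing can be sketched as a Rails controller that only serves JSON, leaving all rendering to client-side JavaScript; the model and controller names here are hypothetical.

```ruby
# Hypothetical sketch: the server does no HTML rendering of its own and
# just hands JSON to a JavaScript front end, e.g. in response to GET /photos.json.
class PhotosController < ApplicationController
  def index
    @photos = Photo.all
    respond_to do |format|
      format.json { render :json => @photos }
    end
  end
end
```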

So, from the development side of things, I don't enjoy JavaScript programming nearly as much or in the same league as I enjoy Ruby programming. Okay, fine. On the client side of things, like, is this better for the user? I think there's something special and appealing to me about the mix, the mix of how the web is discrete pages and you use hyperlinks to jump from place to place, and AJAX is sort of sprinkled across to make certain common operations a little faster. I tend not to like very heavy, single-screen-based web applications. They can be fine for some things, but I think the Web has this unique category of applications that fit into that sort of middle ground between one-screen, or mainly one-screen, applications and static web pages. And that's an awesome sweet spot, and I think it works incredibly well for a wide array of applications. And I wouldn't want them to be any different.

There are certainly some people developing for the web who long for the days of the desktop application and finally see that now AJAX is bringing that back. Well, we've heard that story a lot of times. First it was Java that was going to do this: applets were going to bring back the desktop experience and we could get rid of this nasty HTML. Then it was Flash that would bring this forward. And now AJAX, or anything else like that. There have been so many attempts to bring the desktop to the web, and none of them have succeeded in becoming the dominant approach to building web applications, and I think there's a good reason for that, because that's not what users want. That sweet spot in the middle is great, and it's actually desirable on its own terms.

Question: Will cloud computing take over completely?

David Hansson: Google is certainly trying to bring about the day where you can just have a dumb computer and everything happens in the cloud. I don't know if that's even desirable yet. I don't know why it has to be either/or. There are certain types of applications, especially information-based or network-based applications, that are just great on the web. But editing your, say, movies or pictures? Why does that have to be online? Why does the act of doing that itself have to be online? If there's not a big benefit to that, then I don't see any particular reason to force it to be there. It's fine to have local applications, too. And we don't need the either/ors. We can take the best of the web and the best of the collaboration that that enables and enjoy that, for all those applications where that's a good fit. And then we can use the awesome local, graphical power and computing power of a modern computer to do those other heavy things. I think that that split, that duality, is just fine.

Recorded July 22, 2010

Interviewed by Peter Hopkins

