Hacker News

For the time being, I will personally continue to build #!-only websites, designed exclusively for javascript-enabled browsers. Maintaining two versions (along the lines of progressive enhancement) is just too much work (maintaining and testing HTML view templates as well as jQuery or mustache templates) considering how few people lack javascript. I wouldn't let a release go live without running a selenium suite across it in any case. My perspective would be different, I imagine, if I worked on a large team that could 'afford' to take the progressive enhancement route.


The point of progressive enhancement is specifically to not maintain two versions. You have just one version of your code for all user agents, and then you enhance it for agents that have better features.

Now worrying about having to build your templates twice -- once in your back end language, and then in JavaScript -- is a valid concern. Code duplication is never good. However, mustache is supported by many many languages (http://mustache.github.com/) which means that you really can build your templates just once and call out to very similar code to populate them with data.
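To sketch that idea: the same mustache-style template string can live in one place and be rendered both in the browser and by a mustache implementation in your back-end language. The tiny render function below is an illustration only, handling just plain {{variable}} tags (the real mustache spec also covers sections, HTML escaping, and partials), and the names are mine:

```javascript
// Minimal {{variable}} substitution in the spirit of mustache.
// Illustration only: the real library also handles sections,
// HTML escaping, partials, and so on.
function render(template, data) {
  return template.replace(/\{\{\s*(\w+)\s*\}\}/g, function (match, key) {
    return key in data ? String(data[key]) : "";
  });
}

// The same template string could be populated server-side on first
// load, and client-side for subsequent ajax updates.
var itemTemplate = "<li>{{title}} by {{author}}</li>";
var html = render(itemTemplate, { title: "Hamlet", author: "Shakespeare" });
// html is "<li>Hamlet by Shakespeare</li>"
```

The point is that the template itself is written once; only the small bit of glue that feeds it data exists in two languages.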

Yes, you would need to run tests against the site with and without JavaScript (or CSS, or XHR, or any other feature you were using to progressively enhance your page; how far you go is really up to you), but if that's automated, it isn't much effort on your part.

I'll be the first to admit that it's fine to cheat in the code to give your users a better experience, but I don't think it's a good idea purely to take development short cuts or cut costs. If our roads were built that way (and in India they are), you'd end up with potholes every six months (and in India we do).


>> I will personally continue to build #!-only websites, designed exclusively for javascript enabled browsers

We as web developers just spent the last 10 years as slaves to IE6, saying "oh how I wish developers would/could develop for standards, not just with platform X in mind". And here we are again, setting ourselves up for an even worse situation than the IE6 problems. We have sites designed with only (iphone | IE6 | javascript-capable) in mind. "It's too much work" is the same answer given time and time again for sites designed only for platform X, but that's not a good reason to do something when the platform you're delivering on is by definition a network of variable-capability platforms.

It's fine if we want to try to push the state of the art of rich internet apps, but at what point do we stand back and realize that we're not building a website (a collection of HTML documents on the world wide web) but rather delivering a javascript executable to our user that just happens to be renderable in most modern web browsers?

I don't mean to single you out; it's something that we all have to deal with. But is there anyone at the standards organizations who is listening to the pulse of the new web? If people want to deliver applications to users via a URI, why do we have to include all of the extra baggage of HTTP/HTML/CSS/Javascript?

If we as the artists of the web are going to break the implied contract of what the WWW is, we should at least be honest with ourselves and work toward a real solution rather than trying to staple on yet another update to HTML to try its hardest to pretend to be its big brother Mr. Desktop App.


There's a thing called "usability". A web application may need javascript; a website presenting documents and information has no excuse not to work in pure HTML.

Typical crap: you can't access nvidia.com anymore with lynx/links. When your goddam proprietary nvidia driver doesn't play well with your kernel upgrade, you no longer have any way to download a new one without running X11, though it worked fine not too long ago.


I take almost the polar opposite approach to you, always providing a safe fallback to plain ol' HTML. I feel it's a defensive style that avoids a single point of failure. I run NoScript since it makes most pages load much faster. No offense, but I don't want to run your code; I'd rather read your content.


Fair. Would you feel the same way about a web application, as opposed to a content-heavy site?


Are there still content-heavy sites? Not to be facetious here, but it seems user demand is aligned with smaller and smaller, more and more active bits of content served by web applications. At some point you may have a lot of content, however it's not so much heavy as it is highly interactive.


Can you clarify something - what benefit does the #! syntax give your site specifically? The reason for using this seems to be misunderstood by the original article.


The #! syntax alerts google's crawler that the site supports their ajax crawling scheme [1].

The fragment (part after the hash), in general, is used to denote a specific state (or "page") in a single page ajax site.

[1] http://code.google.com/web/ajaxcrawling/docs/getting-started...
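The mapping that scheme defines is mechanical: when the crawler sees a "#!" URL, it requests the same URL with the fragment moved into an _escaped_fragment_ query parameter, and your server is expected to answer with a static HTML snapshot of that state. A rough sketch of the rewrite (the function name is mine, not part of the spec):

```javascript
// Sketch of the URL rewrite in Google's ajax crawling scheme.
// A crawler that sees "#!state" fetches "?_escaped_fragment_=state"
// and expects the server to return an HTML snapshot of that state.
function toEscapedFragmentUrl(url) {
  var i = url.indexOf("#!");
  if (i === -1) return url; // not a crawlable ajax URL
  var base = url.slice(0, i);
  var fragment = url.slice(i + 2);
  var sep = base.indexOf("?") === -1 ? "?" : "&";
  return base + sep + "_escaped_fragment_=" + encodeURIComponent(fragment);
}

var crawlerUrl = toEscapedFragmentUrl("http://example.com/#!/photos/1");
// crawlerUrl is "http://example.com/?_escaped_fragment_=%2Fphotos%2F1"
```

So the #! buys you nothing unless your server also knows how to serve snapshots for those _escaped_fragment_ requests.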


Are you trolling or just a lazy jackass?


Not trolling. But I admit that I'm overstating my position a little bit to see if any interesting discussion comes of it. The reality of my current situation (a one-, sometimes two-person team) has meant that I have indeed been following the convention I just described. That's not to say I want to, but as I mentioned, I will probably continue to for the foreseeable future without too much hesitation.


Note the account's name, 'TrollBait'. Brand new too. If he keeps those kinds of comments up, he'll be banned in no time.

On the other hand, I'm impressed by how you ignored TrollBait's insult and just replied in an absolutely normal manner.



