
Wondering how this affects SEO and non-JavaScript-enabled browsers. I assume one would still have to implement the more traditional solution as a backup option.


Try Facebook in Firefox using the Web Developer Toolbar extension to disable Javascript. You'll see that major features just silently fail to operate correctly. I think that NoScript users and Javascript-free browsers are basically non-existent. That doesn't mean you shouldn't design for the before-Javascript-is-downloaded code path, but I wouldn't really worry too much.


I was more concerned with how this would be applicable to a non-Facebook site, particularly one that relies significantly on SEO for traffic. I would think Facebook has already solved this for the pages they do want indexed (Pages, Questions, etc.).


I wouldn't say they're non-existent because every so often you'll find some user on here bragging about how he uses NoScript. Personally though, I don't think that users who intentionally cripple their browsers matter.


But that's a conscious choice they're making. It's like worrying about making your website accessible to Richard Stallman.


The official NoScript site [1] downloads via the Mozilla Add-ons page [2], so its download count is probably accurate: 67,616,402

NoScript does not publish their individual add-on usage statistics, but the global download/usage ratio can be calculated from the statistics on the Firefox Add-on home page [3]:

1,962,617,946 add-ons downloaded

157,090,095 add-ons in use

About 8% of downloaded add-ons are still in use. Assuming NoScript's usage ratio is comparable to the average, approximately 5.4 million installations of Firefox are running NoScript. Let's ignore the fact that NoScript's usage ratio is probably much lower than average, since it breaks most web pages.

I'd wager that the average NoScript user has at least two machines, so the total number of NoScript users is probably less than 2.7 million.

There are over 230 million internet users in the USA [4]

Even if 100% of NoScript users were Americans, they'd still be only about 1% of the general population. And if you, like me, believe I've been generous to NoScript here, actual NoScript users are probably no more numerous than 1 in 1,000.
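For clarity, here is the arithmetic above as a quick snippet; the add-on figures are the cited ones, and the two-machines-per-user number is my own guess:

    // Back-of-the-envelope estimate using the figures cited above.
    var noscriptDownloads = 67616402;      // NoScript downloads [2]
    var allDownloads      = 1962617946;    // all add-ons downloaded [3]
    var allInUse          = 157090095;     // all add-ons in use [3]
    var usInternetUsers   = 230000000;     // US internet users [4]

    var usageRatio = allInUse / allDownloads;          // ~0.08
    var installs   = noscriptDownloads * usageRatio;   // ~5.4 million
    var users      = installs / 2;                     // my guess: 2 machines per user
    console.log(users);                                // ~2.7 million
    console.log(100 * users / usInternetUsers);        // ~1.2% of US internet users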

Even if the user is using NoScript, they can whitelist your site. You can probably add <noscript>WARNING: THIS SITE IS BUSTED WITHOUT JS</noscript> to the top of your page and call it a day. If you are feeling generous, redirect no-script users to the mobile version of your site and tell them why.
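Something like this, say (m.example.com standing in for your real mobile site):

    <noscript>
      <!-- Only script-less visitors ever see this. -->
      WARNING: THIS SITE IS BUSTED WITHOUT JS.
      If you'd rather not enable it, try the
      <a href="http://m.example.com/">mobile version</a> instead.
    </noscript>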

tl;dr: Assume that human user agents have JavaScript.

[1] http://noscript.net/

[2] https://addons.mozilla.org/en-US/firefox/addon/722/

[3] https://addons.mozilla.org/en-US/firefox/

[4] http://www.google.com/publicdata?ds=wb-wdi&met=it_net_us...

EDIT: Converted to blog post -- http://news.ycombinator.com/item?id=1406233


People use NoScript to prevent crappy sites from doing annoying, stupid, or malicious things. I'm pretty sure most NoScript users whitelist sites that they visit that are legit and require JavaScript. That's how I use it anyway.


I think following the progressive enhancement paradigm on your website for the sole purpose of including a marginal percentage of users is, just like supporting IE6, not really worth your time. If you can afford to, however, then it's a great strategy to make sure that all of your content will be indexable. I think Google had some proposal for servers parsing #anchor URLs on behalf of visiting spiders, but I never understood how that was supposed to reduce complexity.

Meanwhile, if your project is not a ginormous behemoth, following progressive enhancement can, in addition to the SEO benefits mentioned, help ensure, for example, that you validate all of your input on the server side, and so on. Do it as long as it makes sense, just like TDD. (A sketch of the pattern follows below.)

But to prevent the "before-Javascript-is-downloaded code path" scenario, all you need to do is include your <script/> tags in the head section. The one exception is when the user's browser times out or hits an error while requesting your script, but that's very, very unlikely. Just make sure you don't name your JavaScript files "free_porn_sex.js."
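Here is a minimal sketch of what I mean; the /search endpoint, the element ids, and the markup are all made up. The form works as a plain GET submission on its own, and the script only upgrades it if it actually runs:

    <div id="results"></div>
    <form id="search" action="/search" method="get">
      <input type="text" name="q">
      <input type="submit" value="Search">
    </form>

    <script>
      // Progressive enhancement: if this script runs, intercept the
      // submit and fetch results over XHR; if it never runs (crawler,
      // NoScript, script failed to load), the plain form still works
      // and the server validates "q" either way.
      document.getElementById('search').onsubmit = function () {
        var xhr = new XMLHttpRequest();
        xhr.open('GET', '/search?q=' + encodeURIComponent(this.q.value));
        xhr.onload = function () {
          document.getElementById('results').innerHTML = xhr.responseText;
        };
        xhr.send();
        return false; // suppress the full-page navigation
      };
    </script>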


I thought that Google's and Yahoo's crawlers did not parse JavaScript. And those are "browsers" I care a great deal about.


In Facebook's case, I think they are already serving completely different content to logged-in users and non-logged-in users/crawlers; most of the complexity of Facebook is invisible to crawlers anyway.
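A minimal sketch of that kind of split, assuming a plain Node server and a session cookie; both render functions are hypothetical stand-ins:

    // Logged-out visitors (including crawlers) get plain, indexable
    // HTML; logged-in users get the script-driven, BigPipe-style shell.
    var http = require('http');

    // Hypothetical renderers, standing in for real templating.
    function renderStaticPage()     { return '<html>full, crawlable markup</html>'; }
    function renderPipelinedShell() { return '<html>shell + pagelet scripts</html>'; }

    http.createServer(function (req, res) {
      var loggedIn = /(^|;\s*)session=/.test(req.headers.cookie || '');
      res.writeHead(200, { 'Content-Type': 'text/html' });
      res.end(loggedIn ? renderPipelinedShell() : renderStaticPage());
    }).listen(8080);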


To me it sounds like BigPipe is about as applicable as Flash or GWT - solutions already known for their problems with SEO. However, if that's not a problem for you, then it's probably a good optimization for your site.

Keeping two different output formats for a site (one for crawlers and one for humans) sounds complex. To date, none of the sites I've developed could have justified such overhead in development.


This would be suicide for a site relying on SEO to drive traffic. It's a cool approach if you're building something for which that doesn't apply though (web-based enterprise software, privately-accessed tools, or if you're a multi-billion dollar company that's so large you can create your own world wide web and the search engines can go screw).



