
The irony is that, because of the lean structure behind the server, this forum actually responds faster than most web forums that do use AJAX/SPA.

Funny, given that the whole purpose of AJAX/SPA was to reduce response time. That's its reason for existing.

Turns out it just complicates things ...



> Turns out it just complicates things ...

No, a good car is not evidence that the plane just complicates things; it's evidence that you've got a good car.


We should be arguing that most people use cars and don't need to take the plane for some simple CRUD apps...


From what I recall, AJAX wasn't for making things faster or for scaling like other sibling posts describe.

The way I remember it, it was created for user experience reasons. Not having to submit a form and refresh the entire page, causing the page to bounce back to the top, etc.

AJAX (at least when I learned and implemented it) was for updating the user about an action they performed without refreshing the entire universe in front of their eyes, only updating the contents of a div or something else.
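Roughly the pattern I mean, sketched with the modern fetch API (back then it was XMLHttpRequest, but the shape is the same; the endpoint and element id here are made up for illustration):

    // Ask the server for just the new data and update one element,
    // instead of reloading the whole page.
    fetch('/api/comments?thread=42')                         // hypothetical endpoint
      .then(function (response) { return response.json(); })
      .then(function (comments) {
        var list = document.querySelector('#comment-list');  // hypothetical element
        list.innerHTML = comments
          .map(function (c) { return '<li>' + c.text + '</li>'; })
          .join('');
      });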

The whole "web app as a JavaScript app.js blob" idea took the AJAX idea to an extreme. At this extreme it often does slow down page load speeds and negatively affect the user experience when JavaScript doesn't work, for whatever reason.


I think what you say makes no sense at all.

The idea of AJAX/SPA was to provide interactivity and a less technical UI that caters to the average Joe. AJAX/SPA moves some of the computation from the backend to the frontend so the service scales better for millions of users.

This forum is not going to be fast for millions of users (it might not even be for hundreds of simultaneous users) because the server has to render everything again for everyone.

Server side rendering is good for response times, that's why the SPA world went back to it (Google: react ssr). SPA is good for scalability and average Joe UI.

...and AJAX is there so we don't need to do a full page reload all the time. Again, not because of response times.


> AJAX/SPA was to provide [...] a less technical UI

I hadn't heard that reason before. Can you please talk more about this?

> This forum is not going to be fast for millions of users (it might not even be for hundreds of simultaneous users) because the server has to render everything again for everyone.

This theory doesn't match reality. In reality, multipage applications are faster than single-page ones. In fact, most things on the internet still are multipage, including web forums. Most web forums are powered by phpBB, a multipage web app, and they were powered by phpBB 15 years ago, when hardware and PHP were much slower. This very site is a multipage web app, and it holds up just fine, even though it has enough users to bring down other websites.

Single-page applications seemed like they would be faster, but usually they aren't.

First, let's look closer at a multipage application. All of the scripts, styles, and images should be cached after loading the first page, if you set the headers right (the Expires header, mainly). Therefore, what is left? The content. And I have found that the size of the HTML content is often close to the size of the same thing in JSON --- at least the way I write HTML (I try to keep it lean: few classes, no extraneous divs, etc.). This is because JSON has all those keys:

    {
        "color": "red"
    }
while HTML has just the values:

    <div>red</div>
That was a simple example, but I measured a bigger one, and the size turned out to be about the same, especially after compression. All of those HTML tags compress well, because of repetition.

Now if they are the same size, then they should load in the same time. Except they don't. The server-rendered page loads faster. Why? Because of progressive rendering. When just some of the HTML has come down the pipe, the page appears. But with AJAX, all of the JSON must first load, then be parsed, then wrapped in a template, then inserted into the DOM. Then it appears all at once.
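To spell out that last difference, here's the client-side sequence as a sketch (renderTemplate is a stand-in for whatever templating you use, not a real library call):

    // With AJAX, nothing is painted until all of these steps finish:
    fetch('/api/page-data')                               // 1. the whole JSON must arrive
      .then(function (res) { return res.json(); })        // 2. then be parsed
      .then(function (data) {
        var html = renderTemplate(data);                  // 3. then wrapped in a template (stand-in helper)
        document.getElementById('app').innerHTML = html;  // 4. then inserted into the DOM
      });
    // A server-rendered page can start painting as soon as the first
    // chunk of HTML arrives, which is why it feels faster.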


> Can you please talk more about this?

Average Joes want spinners, buttons, draggable stuff, toggles, etc. that don't lead to a page load every time you press them. Sure, you can say "well, just make a multipage app with some jQuery". But at some point you're just approaching a SPA with SSR support.

Imagine Facebook's chat, but every time you hit enter you had a full page reload :D

> In reality, multipage applications are faster than single-page ones.

I agree. They are faster for a single user when the use case is a semi-static page like a forum. However, the SPA + API model makes stuff easier on the backend side and allows that average Joe UI stuff. It scales well to millions of users.

The thing is, it's not about you. It's about everyone. We want to build services that please the majority and backends that stay up when the majority is logged in.

Also, it really isn't possible to build realtime/dynamic stuff with a "let's render some HTML on the server" mindset. How do you do realtime chat? Realtime games? How do you do a collaborative text editor? and so on.
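Even the minimal version of that chat case needs client-side script along these lines (the endpoint and element ids here are invented):

    var socket = new WebSocket('wss://example.com/chat');   // placeholder endpoint

    // Append each incoming message without touching the rest of the page.
    socket.onmessage = function (event) {
      var li = document.createElement('li');
      li.textContent = event.data;
      document.querySelector('#messages').appendChild(li);  // placeholder element
    };

    // Send on Enter; no page reload involved.
    document.querySelector('#input').addEventListener('keydown', function (e) {
      if (e.key === 'Enter') {
        socket.send(e.target.value);
        e.target.value = '';
      }
    });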

JSON/SPA/AJAX are not there just so your static forum page loads fast. They are there because of the next generation of the web.


If it were to scale to millions of users, caching would help.

At that point, you'd have to cache the JSON responses just the same, so the complexity of cache invalidation would be there in the exact same way.

But you would still not have the additional load and bloat on the user browser.


You are correct; caching is always hard.

Load and bloat? Well, that depends. If you're doing a realtime collaborative text editor with a realtime chat, I just don't see how you'd get anything even remotely nice without what you call "load and bloat".

However, to return to the original topic. Which is simpler: a) creating a performant data API, or b) creating a performant server that handles data stuff just like a) AND all the rendering & UI stuff?

Surely the latter is more complex since it has more responsibilities?


When you say "irony", are you genuinely surprised by this?


No, but I've seen comments on here from people under the impression that rendering on the client is a more efficient use of CPU, or that rendering a page in HTML is significantly more resource intensive than rendering the data as JSON. There is a generation out there that doesn't seem to know that server side rendering exists.


>There is a generation out there that doesn't seem to know that server side rendering exists.

Oh, don't worry, they're reinventing it in JS. I recently had a project proposal where one of the engineers said that, to make the webpage faster, we could prerender the pages on the server side with JS.

Like, at that point, you're just using PHP with a different name and worse performance (a small $4 shared hoster can easily handle 1M users per month; I've yet to see a NodeJS app handle 1M users per month on a $5 VPS without problems).


Do you get why you'd prerender a JS app and why an isomorphic setup is fundamentally different than server side rendering with PHP?

You're comparing different things. Also, JS performs as well or better than PHP. Not sure why you guys get so ideological about this.
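In case it's not clear what I mean by an isomorphic setup, it's roughly this shape, sketched with React and Express as one possible stack (App and bundle.js are placeholder names, not anything from your project):

    // server.js
    const express = require('express');
    const React = require('react');
    const { renderToString } = require('react-dom/server');
    const App = require('./App');        // assumed shared component

    const app = express();
    app.get('/', function (req, res) {
      // Send real HTML on first load, so the page is readable before any JS runs.
      const html = renderToString(React.createElement(App));
      res.send('<div id="root">' + html + '</div><script src="/bundle.js"></script>');
    });
    app.listen(3000);

    // client.js (built into bundle.js by whatever bundler you use)
    const React = require('react');
    const ReactDOM = require('react-dom');
    const App = require('./App');
    // Attach event handlers to the already-rendered markup instead of re-creating it.
    ReactDOM.hydrate(React.createElement(App), document.getElementById('root'));

The point is that the same component code produces the markup on both sides, which is not something a PHP backend plus a separate JS frontend gives you.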


>Do you get why you'd prerender a JS app and why an isomorphic setup is fundamentally different than server side rendering with PHP?

Tbh, I'm not really interested. I can squeeze better performance out of a 10kB self-written JS library with noscript fallbacks than most JS-heavy apps out there manage. The sites I develop work with dialup connections or worse, because my phone regularly has only such a connection. Modern JS apps are a pain to use under such conditions; I've not noticed much difference between prerendered JS and unprerendered JS, both are crap under these conditions.
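The noscript-fallback approach is nothing fancy: plain HTML that works on its own, plus a small script that upgrades it when it happens to load. Roughly like this (the form action and endpoint are invented):

    <!-- Works as a normal POST with a full page render when JS is off or hasn't loaded. -->
    <form id="reply-form" method="post" action="/reply">
      <textarea name="text"></textarea>
      <button type="submit">Reply</button>
    </form>

    <script>
      // If the script does load, intercept the submit and post in the background.
      document.getElementById('reply-form').addEventListener('submit', function (e) {
        e.preventDefault();
        fetch('/reply', { method: 'POST', body: new FormData(e.target) })
          .then(function () { e.target.reset(); });
      });
    </script>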

On the other hand, Orange Forum loaded almost instantly (less than 10 seconds) on a crappy GPRS 16kbps connection. Discourse or NodeBB don't load at all, and if I'm lucky I might see some error message or a crapped-out webpage.

>Also, JS performs as well or better than PHP. Not sure why you guys get so ideological about this.

I have only ever seen evidence to the contrary.

I can run a 1M user/month website with 128MB of RAM on a shared hoster using PHP. If you get a good shared hoster you can probably hit 10M users/month.

I have not seen a NodeJS app that can handle 1M users/month on a $5 VPS, which has 512MB RAM and probably more CPU and disk than the shared hoster offering.

But I'm willing to rethink this if I see real-world evidence that a comparable software set runs better and more efficiently in JS than PHP. I won't consider synthetic benchmarks, since those rarely model real-world usage, and comparable means the software must be usable with and without JS enabled on the client.


Discourse is a Rails forum with a JS frontend. It's bloated. I don't disagree with your PHP experiences, but you're extrapolating from:

1) your personal experience

2) mature software vs immature software

3) bad software vs good software

You can't say you're "not interested" in understanding the other side, throw out your own anecdotal benchmarks, draw conclusions from that, and then demand that others provide "non-synthetic" benchmarks in order to prove you wrong. Well, you can, but it doesn't seem especially objective.


Then please, show me a performant NodeJS webapp that works with noscript and slow dialup.

There is not much personal experience about that since it's a simple on/off comparison. Either it works without scripts on dialup or it doesn't.


Here is one:

https://nodejs.org/en/about/

My point is, you can write any kind of app with almost any kind of server-side tech. If you don't like the culture, you may be right, but please be explicit about it.


Which is why I never bought into the JS framework fad of the month and keep happily using .NET/Java stacks with server side rendering and minimal JavaScript.

We should make the best use of pure HTML/CSS web pages, that is what the web was made for.

For anything else, better go native with Web APIs.


The benefits of client side vs server side are really a matter of scale. If you're running a personal forum / whatever, then you're not going to notice a whole lot. But when you start having several hundred thousand or more concurrent users, then being able to cache your pages in a CDN and only generate JSON responses via APIs really can have a profound impact on your server side resources.


You can cache server rendered pages in a CDN or Varnish or whatever you like, and there's also the use of the proper HTTP headers to drive client side cache control. All of this works for server side rendered content.


Obviously there are a great many layers to caching (there's a whole multitude of other solutions out there that you've also missed off :p). However with regards to the specific points you've raised:

1) You cannot cache server rendered pages in a CDN (see footnote) if those pages contain user specific information (like this topic does). Information such as a user name, message read status - even on public messages, etc. If you do CDN cache those pages you'll leak user-specific information to other users. This is why you need tools like Varnish that cache page fragments rather than whole pages; or why you serve up HTML templates and then populate user information on the client side via RESTful API calls.

2) For similar reasons as above, and again very relevant to the project in this discussion, HTTP Cache-Control headers also wouldn't help with the HTML if your backend is generating the HTML. In fact, in those instances you'd probably want to set your max-age to 0 (again, speaking strictly about dynamically generated HTML; static assets like JS, CSS, and images are a different matter, but they're not server-generated dynamic content; see the sketch below). Granted, with browser caching there isn't the risk of leaking user information to other users; the risk is just the browser not fetching an updated forum view / whatever.

Caching is one of those things that is easy to set up but also very easy to get wrong.
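To make point 2 concrete, the usual split looks something like this (an Express-style sketch; the route, the durations, and renderTopicFor are all made up for illustration):

    const express = require('express');
    const app = express();

    // Dynamic, user-specific HTML: tell shared caches not to store it.
    app.get('/topic/:id', function (req, res) {
      res.set('Cache-Control', 'private, max-age=0, must-revalidate');
      res.send(renderTopicFor(req.user, req.params.id));   // hypothetical renderer
    });

    // Static assets (JS, CSS, images): no user data, so cache them aggressively.
    app.use('/static', express.static('public', { maxAge: '30d' }));

    app.listen(8080);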

Footnote: Akamai might support it - if you're willing to pay their premium - as it's an immensely sophisticated product. However it's not an option I've seen when using Akamai and the other solutions I've used definitely don't support caching page fragments.


1) Nor can you cache client-side rendered pages in a CDN ... you can only cache EMPTY, information-free pages in a CDN

2) Agreed, but again, same problem as 1)

Truth of the matter: JavaScript-rendered pages need

1) request(s) for the page
2) response for the page
3) at least one AJAX request (in practice: dozens), which cannot be Akamai'ed
4) waiting for all responses

Server side rendering without AJAX:

1) request page
2) respond with the page

Seems to me server side rendering is guaranteed to be faster, if you count the full cycle. But sure, with AJAX you can show your logo faster. With server side rendering only, you can show the actual data people are looking for faster.


Again, this is where the question of scale comes in. Server side may well work out quicker when you only have a few hundred visitors. But when you've got hundreds of thousands, being able to offload > 90% of your content makes a massive difference. JSON APIs are less expensive to generate than full HTML pages which include the same content. It might only be a fraction less work on the servers, but multiply that by several hundred thousand and you quickly enable yourself to scale down your back end infrastructure.

This isn't stuff I'm blindly guessing at either; I've worked on several hugely busy UK and international services that started out as monolithic code bases pushing HTML and migrated them to APIs. Each time the language remained the same, and the main body of the backend logic even remained the same, but instead of filling HTML templates on the backend and pushing out the completed content, the move was to push out the templates and have the client side render them. Each time that change was made, the back end infrastructure could shrink, in some instances by a factor of 10! The drawback is the client side took longer to render the page on first impression, however the application ran faster from then on and the hosting costs were reduced as well.

So this is why I keep saying scale matters when discussing APIs vs server side HTML generation. For the vast majority of projects that people build, there isn't going to be much in the way of difference (heck, even my own personal projects are built API-less unless I specifically need an API for other, non-performance-related purposes). But when you start talking about hundreds of thousands or millions of concurrent users, then even small changes can have a real impact, and that's when you really start to appreciate some of the advantages API-driven development can offer.


If you lose performance on every single request, scale will only make you lose more. But yes, you can offload a tiny amount of server-side processing onto the client side. If that makes a difference in your costs ...

No, I can't say "then go for it": what you say is bullshit. Offloading string mangling to the client at the cost of doing extra requests is absurd.

We established that a single request (i.e. a directly filled-in page) is going to be faster than JavaScript doing requests. You appear to agree. The only thing scale will bring is a bigger difference.

JavaScript/AJAX is simply a manifestation of "Worse is better":

1) programmers start making websites. Good ones, and bad ones.

2) we see lots of examples of badly programmed server-side code.

3) "we must fix this", and so now we do client side programming

4) an incredible number of examples of bad client-side programming



