Hacker News

You can cache server-rendered pages in a CDN or Varnish or whatever you like, and you can also use the proper HTTP headers to drive client-side cache control. All of this works for server-side rendered content.


Obviously there are a great many layers to caching (and a whole multitude of other solutions out there that you've also missed off :p). However, with regard to the specific points you've raised:

1) You cannot cache server-rendered pages in a CDN (see footnote) if those pages contain user-specific information (as this topic does): a user name, message read status (even on public messages), and so on. If you do CDN-cache those pages, you'll leak user-specific information to other users. This is why you need tools like Varnish that can cache page fragments rather than whole pages, or why you serve up HTML templates and then populate user information on the client side via RESTful API calls.

2) For similar reasons, and again very relevant to the project in this discussion, HTTP Cache-Control headers also won't help with the HTML if your backend is generating it. In fact, in those instances you'd probably want to set your max-age to 0 (again, speaking strictly about dynamically generated HTML; static assets like JS, CSS, and images are a different matter, but they're not server-generated dynamic content). Granted, with browser caching there isn't the risk of leaking user information to other users; the risk is just the browser not fetching an updated forum view / whatever.
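To make points 1 and 2 concrete, here's a minimal, hypothetical sketch of how a backend might pick Cache-Control headers per response type. The function name and path checks are illustrative, not from any particular framework:

```python
def cache_headers(path: str, is_user_specific: bool) -> dict:
    """Return HTTP caching headers for a response (illustrative sketch).

    Per-user HTML must never be stored by a shared cache (CDN), or it
    leaks one user's view to another; max-age=0 also stops the browser
    from showing a stale forum view.
    """
    if is_user_specific:
        # Dynamic, user-specific HTML: private to this browser, always revalidated.
        return {"Cache-Control": "private, max-age=0"}
    if path.endswith((".js", ".css", ".png", ".jpg")):
        # Static assets: safe to cache aggressively in CDNs and browsers alike.
        return {"Cache-Control": "public, max-age=31536000, immutable"}
    # Anonymous dynamic HTML: a short shared-cache lifetime may be acceptable.
    return {"Cache-Control": "public, max-age=60"}
```

The key distinction is `private` vs `public`: `private` forbids shared caches (CDNs) from storing the response at all, which is exactly the leak described above.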

Caching is one of those things that are easy to set up but also very easy to get wrong.

Footnote: Akamai might support it - if you're willing to pay their premium - as it's an immensely sophisticated product. However it's not an option I've seen when using Akamai and the other solutions I've used definitely don't support caching page fragments.


1) Nor can you cache client-side rendered pages in a CDN ... you can only cache EMPTY, information-free pages in a CDN.

2) Agreed, but again, same problem as point one.

Truth of the matter: JavaScript-rendered pages involve

1) request(s) for the page

2) response for the page

3) at least one AJAX request (in practice: dozens), which cannot be Akamai'ed

4) waiting for all responses

Server side rendering without AJAX

1) request for the page

2) response with the page

Seems to me server-side rendering is guaranteed to be faster if you count the full cycle. But sure, with AJAX you can show your logo faster. With server-side rendering alone you can show the actual data people are looking for faster.
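The full-cycle argument above can be sketched as back-of-envelope arithmetic. The round-trip and render times below are made-up placeholders, and the model pessimistically treats the AJAX calls as sequential (in practice some run in parallel, but the un-cacheable per-user calls still each cost at least one round trip):

```python
def server_rendered_ms(rtt: float, render: float) -> float:
    # One request/response pair; the HTML arrives with the data already in it.
    return rtt + render

def client_rendered_ms(rtt: float, render: float, ajax_calls: int) -> float:
    # First fetch returns a near-empty shell (possibly CDN-cached), then each
    # AJAX call for the actual data adds another round trip plus processing.
    return rtt + ajax_calls * (rtt + render)

# Assumed figures: 100 ms round trip, 20 ms of processing per response.
# Server-rendered: 120 ms total. Client-rendered with 3 sequential AJAX
# calls: 460 ms before the user sees the data they asked for.
```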


Again, this is where the question of scale comes in. Server-side may well work out quicker when you only have a few hundred visitors. But when you've got hundreds of thousands, being able to offload > 90% of your content makes a massive difference. JSON APIs are less expensive to generate than full HTML pages which include the same content. It might only be a fraction less work on the servers, but multiply that by several hundred thousand requests and you quickly enable yourself to scale down your back-end infrastructure.
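The JSON-vs-HTML cost claim is easy to demonstrate on a toy payload. The data and markup below are invented for illustration; the point is just that serialising raw data produces fewer bytes (and less server-side string work) than expanding it into full markup:

```python
import json

# Hypothetical forum data, purely for illustration.
POSTS = [{"author": "alice", "body": "hello"},
         {"author": "bob", "body": "hi"}]

def as_json(posts) -> str:
    # API response: just the data, serialised once.
    return json.dumps(posts)

def as_html(posts) -> str:
    # Server-rendered response: the same data wrapped in repeated markup.
    rows = "".join(
        f"<article><h2>{p['author']}</h2><p>{p['body']}</p></article>"
        for p in posts)
    return f"<html><body>{rows}</body></html>"
```

Per request the difference is small, but the markup overhead grows with every item rendered, which is what multiplies under heavy load.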

This isn't stuff I'm blindly guessing at either; I've worked on several hugely busy UK and international services that started out as monolithic code bases pushing HTML and migrated to APIs. Each time the language remained the same, and the main body of the backend logic even remained the same, but instead of filling HTML templates on the backend and pushing out the completed contents, the move was to push out the templates and have the client side render them. Each time that change was made, the back-end infrastructure could shrink. In some instances by a factor of 10! The drawback is that the client side took longer to render the page on first impression; however, the application ran faster from then on, and the hosting costs were reduced as well.

So this is why I keep saying scale matters when discussing APIs vs server-side HTML generation. For the vast majority of projects that people build, there isn't going to be much in the way of difference (heck, even my own personal projects are built API-less unless I specifically need an API for other, non-performance-related, purposes). But when you start talking about hundreds of thousands or millions of concurrent users, even small changes can have a real impact, and that's when you really start to appreciate some of the advantages API-driven development can offer.


If you lose performance on every single request, scale will only make you lose more. But yes, you can offload a tiny amount of server-side processing onto the client side. If that makes a difference in your costs ...

No, I can't say "then go for it". What you say is bullshit. Offloading string mangling to the client at the cost of doing extra requests is absurd.

We established that a single request (i.e. a directly filled-in page) is going to be faster than JavaScript doing requests. You appear to agree. The only thing scale will bring is a bigger difference.

JavaScript/AJAX is simply a manifestation of "Worse is better":

1) programmers start making websites. Good ones, and bad ones.

2) we see lots of examples of badly programmed server-side programming.

3) "we must fix this", and so now we do client-side programming.

4) an incredible number of examples of bad client-side programming.




