
When you say "irony", are you genuinely surprised by this?


No, but I've seen comments on here from people under the impression that rendering on the client is a more efficient use of CPU, or that rendering a page as HTML is significantly more resource-intensive than rendering the data as JSON. There is a generation out there that doesn't seem to know that server side rendering exists.


>There is a generation out there that doesn't seem to know that server side rendering exists.

Oh, don't worry, they're reinventing it in JS. On a recent project proposal, one of the engineers said that to make the webpage faster we could prerender the pages on the server side with JS.

Like, at that point, you're just using PHP with a different name and worse performance (a small $4 shared host can easily handle 1M users per month; I've yet to see a NodeJS app handle 1M users per month on a $5 VPS without problems).
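
(For reference, this is roughly what was being proposed - a sketch, assuming Express and React, with App as a hypothetical component shared between server and client:)

    // "Prerendering on the server side with JS": an Express handler
    // renders the same React component tree the client would, then
    // ships it as plain HTML.
    const express = require('express');
    const React = require('react');
    const { renderToString } = require('react-dom/server');
    const App = require('./App'); // hypothetical shared component

    const app = express();
    app.get('/', (req, res) => {
      const html = renderToString(React.createElement(App));
      res.send('<!doctype html><div id="root">' + html + '</div>' +
               '<script src="/bundle.js"></script>'); // client re-attaches ("hydrates") here
    });
    app.listen(3000);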


Do you get why you'd prerender a JS app and why an isomorphic setup is fundamentally different than server side rendering with PHP?

You're comparing different things. Also, JS performs as well as or better than PHP. Not sure why you guys get so ideological about this.


>Do you get why you'd prerender a JS app and why an isomorphic setup is fundamentally different than server side rendering with PHP?

Tbh, I'm not really interested. I can squeeze better performance out of a 10kB self-written JS library with noscript fallbacks than most JS-heavy apps out there manage. The sites I develop work on dialup connections or worse, because my phone regularly has only such a connection. Modern JS apps are a pain to use under such conditions, and I've not noticed much difference between prerendered JS and unprerendered JS; both are crap under these conditions.
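
(To illustrate the noscript-fallback pattern I mean: the server always renders a plain HTML form that works on its own, and a few lines of JS merely upgrade it when they can run. A sketch; the data-enhance attribute and POST-only forms are my assumptions:)

    // Progressive enhancement: the server-rendered <form> works without
    // JS; this only upgrades it to an in-place AJAX submit when JS runs,
    // so noscript/dialup users keep the plain-HTML path.
    document.querySelectorAll('form[data-enhance]').forEach((form) => {
      form.addEventListener('submit', async (e) => {
        e.preventDefault();
        const res = await fetch(form.action, {
          method: 'post',                    // sketch assumes POST forms
          body: new FormData(form),
        });
        form.outerHTML = await res.text();   // swap in the server's partial response
      });
    });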

On the other hand, Orange Forum loaded almost instantly (less than 10 seconds) on a crappy 16kbps GPRS connection. Discourse and NodeBB don't load at all, and if I'm lucky I might see some error message or a crapped-out webpage.

>Also, JS performs as well as or better than PHP. Not sure why you guys get so ideological about this.

I have only ever seen evidence to the contrary.

I can run a 1M user/month website with 128MB of RAM on a shared host using PHP. With a good shared host you can probably hit 10M users/month.

I have not seen a NodeJS app that can handle 1M users/month on a $5 VPS, which has 512MB of RAM and probably more CPU and disk than the shared hosting offering.

But I'm willing to rethink this if I see real-world evidence that a comparable piece of software runs better and more efficiently in JS than in PHP. I won't consider synthetic benchmarks, since those rarely model real-world usage, and comparable means the software must be usable both with and without JS enabled on the client.


Discourse is a Rails forum with a JS frontend. It's bloated. I don't disagree with your PHP experiences, but you're extrapolating from:

1) your personal experience

2) mature software vs immature software

3) bad software vs good software

You can't say you're "not interested" in understanding the other side, throw out your own anecdotal benchmarks, draw conclusions from that, and then demand that others provide "non-synthetic" benchmarks in order to prove you wrong. Well, you can, but it doesn't seem especially objective.


Then please, show me a performant NodeJS webapp that works with noscript and slow dialup.

There's not much personal experience involved, since it's a simple on/off comparison: either it works without scripts on dialup or it doesn't.


Here is one:

https://nodejs.org/en/about/

My point is, you can write any kind of app with almost any kind of server-side tech. If you don't like the culture, you may be right, but please be explicit about it.


Which is why I never bought into the JS framework fad of the month and keep happily using .NET/Java stacks with server side rendering and minimal JavaScript.

We should make the best use of pure HTML/CSS web pages; that is what the web was made for.

For anything else, better go native with Web APIs.


The benefits of client side vs server side are really a matter of scale. If you're running a personal forum / whatever, you're not going to notice a whole lot. But when you start having several hundred thousand or more concurrent users, being able to cache your pages in a CDN and only generate JSON responses via APIs really can have a profound impact on your server-side resources.


You can cache server-rendered pages in a CDN or Varnish or whatever you like, and you can also use the proper HTTP headers to drive client-side cache control. All of this works for server-side rendered content.
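
(A sketch of what that looks like, assuming an Express-style app; renderThread is a hypothetical server-side render function:)

    // One header makes the same server-rendered page CDN-cacheable:
    // s-maxage is honoured by shared caches (CDN/Varnish), max-age by
    // the browser.
    const express = require('express');
    const app = express();
    app.get('/thread/:id', (req, res) => {
      res.set('Cache-Control', 'public, max-age=60, s-maxage=300');
      res.send(renderThread(req.params.id)); // renderThread: hypothetical
    });
    app.listen(3000);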


Obviously there are a great many layers to caching (there's a whole multitude of other solutions out there that you've also missed off :p). However, with regards to the specific points you've raised:

1) You cannot cache server-rendered pages in a CDN (see footnote) if those pages contain user-specific information (like this topic does): information such as the user's name, message read status (even on public messages), and so on. If you do CDN-cache those pages, you'll leak user-specific information to other users. This is why you need tools like Varnish that cache page fragments rather than whole pages, or why you serve up HTML templates and then populate user information on the client side via RESTful API calls.

2) For similar reasons as above, and again very relevant to the project in this discussion, HTTP Cache-Control headers also wouldn't help with the HTML if your backend is generating the HTML. In fact, in those instances you'd probably want to set your max-age to 0 (again, speaking strictly about dynamically generated HTML; static assets like JS, CSS, and images are a different matter, but they're not server-generated dynamic content). Granted, with browser caching there isn't the risk of leaking user information to other users; the risk is just the browser not fetching an updated forum view / whatever.
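
(A sketch of both points, assuming Express; the route names and render functions are hypothetical:)

    // Point 1: the anonymous page shell is safe to cache publicly
    // because it contains no user data. Point 2: the user-specific
    // fragment must never enter a shared cache.
    const express = require('express');
    const app = express();
    app.get('/topic/:id', (req, res) => {
      res.set('Cache-Control', 'public, s-maxage=300');         // shell: no user data
      res.send(renderShell(req.params.id));                     // includes a placeholder for the user bar
    });
    app.get('/fragments/userbar', (req, res) => {
      res.set('Cache-Control', 'private, no-cache, max-age=0'); // per-user: never shared
      res.send(renderUserBar(req.session.user));                // assumes a session middleware
    });
    app.listen(3000);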

Caching is one of those things that are easy to set up but also very easy to get wrong.

Footnote: Akamai might support it - if you're willing to pay their premium - as it's an immensely sophisticated product. However, it's not an option I've seen when using Akamai, and the other solutions I've used definitely don't support caching page fragments.


1) nor can you cache client-side rendered pages in a CDN ... you can only cache EMPTY information-free pages in a CDN

2) Agreed, but again, same problem as (1).

Truth of the matter: JavaScript-rendered pages need

1) request(s) for the page

2) response for the page

3) at least one AJAX request (in practice: dozens), which cannot be Akamai'ed

4) waiting for all responses

Server side rendering without AJAX needs

1) request page

2) respond with page

Seems to me server side rendering is guaranteed to be faster if you count the full cycle: on a 300ms round-trip connection, one round trip is 300ms, while page + JS bundle + one AJAX call is at least three sequential round trips, i.e. 900ms before any data appears. But sure, with AJAX you can show your logo faster. With server side rendering alone, you can show the actual data people are looking for faster.


Again, this is where the question of scale comes in. Server side may well work out quicker when you only have a few hundred visitors. But when you've got hundreds of thousands, being able to offload > 90% of your content makes a massive difference. JSON APIs are less expensive to generate than full HTML pages which include the same content. It might only be a fraction less work on the servers, but multiply that by several hundred thousand and you quickly enable yourself to scale down your back end infrastructure.
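
(Concretely, the saving is the per-request templating step - a sketch assuming Express, with loadTopic and renderTopic as hypothetical functions:)

    // Same data served two ways. The JSON route does serialisation
    // only; the HTML route adds a templating step on every request.
    const express = require('express');
    const app = express();
    app.get('/api/topic/:id', async (req, res) => {
      res.json(await loadTopic(req.params.id));              // data only
    });
    app.get('/topic/:id', async (req, res) => {
      res.send(renderTopic(await loadTopic(req.params.id))); // data + HTML templating
    });
    app.listen(3000);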

This isn't stuff I'm blindly guessing at either; I've worked on several hugely busy UK and international services that started out as monolithic code bases pushing HTML and were migrated to APIs. Each time the language remained the same, and the main body of the backend logic even remained the same, but instead of pulling HTML templates from the backend and pushing out the completed contents, the move was to push out the templates and have the client side render them. Each time that change was made, the back end infrastructure could shrink - in some instances by a factor of 10!! The drawback was that the client side took longer to render the page on first impression; however, the application ran faster from then on, and the hosting costs were reduced as well.

So this is why I keep saying scale matters when discussing APIs vs server side HTML generation. For the vast majority of projects that people build, there isn't going to be much in the way of difference (heck, even my own personal projects are built API-less unless I specifically need an API for other, non-performance-related, purposes). But when you start talking about hundreds of thousands or millions of concurrent users, even small changes can have a real impact, and that's when you really start to appreciate some of the advantages API-driven development can offer.


If you lose performance on every single request, scale will only make you lose more. But yes, you can offload a tiny amount of server-side processing onto the client side. If that makes a difference in your costs ...

No, I can't say "then go for it". What you say is bullshit: offloading string mangling to the client at the cost of doing extra requests is absurd.

We established that a single request (i.e. a directly filled-in page) is going to be faster than JavaScript doing multiple requests. You appear to agree. The only thing scale will bring is a bigger difference.

JavaScript/AJAX is simply a manifestation of "Worse is better":

1) programmers start making websites. Good ones, and bad ones.

2) we see lots of examples of badly written server-side code.

3) "we must fix this", and so now we do client-side programming

4) countless examples of bad client-side programming



