Submitted by rgthree t3_ygl2ll in InternetIsBeautiful
Asyncrosaurus t1_iua1b1i wrote
Reply to comment by IllyrioMoParties in Nostalgic website in 90s “Geocities” style for tracking Halloween movies 📼🎃 by rgthree
>how did they do it
Well, text and low-res images are very quick to load. They may not even be using a database, meaning the load time could be entirely down to transmission.
Then look at Reddit's infrastructure. It's a large system with many active components engaged on each page load, built to scale to millions of active users, not optimized for a single page-load score. For what looks like a simple site ("new" Reddit's bloated front-end monstrosity notwithstanding), Reddit is deceptively complex behind the scenes.
rgthree OP t1_iuagxyn wrote
It’s a custom Python CLI that scrapes movie data and generates a static JSON file with the data. Then another Python file that generates all the static HTML files. Some GIFs, CSS, and JS for interactivity. All static and hosted off an old MacBook behind Cloudflare CDN for fast delivery.
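A minimal sketch of that kind of build step, purely for illustration: it reads a pre-scraped JSON file and writes one static HTML page per movie. The file names, fields, and template here are assumptions, not rgthree's actual code.

```python
import json
from pathlib import Path

# Hypothetical page template; the real site's markup, GIFs, and CSS are its own.
TEMPLATE = """<!DOCTYPE html>
<html>
<head><meta charset="utf-8"><title>{title}</title><link rel="stylesheet" href="style.css"></head>
<body>
  <h1>{title} ({year})</h1>
  <p>{summary}</p>
</body>
</html>
"""

def build(data_path="movies.json", out_dir="site"):
    # Load the static JSON produced by the scraping step.
    movies = json.loads(Path(data_path).read_text())
    out = Path(out_dir)
    out.mkdir(exist_ok=True)
    for movie in movies:
        page = TEMPLATE.format(
            title=movie["title"],
            year=movie.get("year", ""),
            summary=movie.get("summary", ""),
        )
        # Each page is plain HTML on disk; a CDN like Cloudflare can cache and
        # serve it with zero server-side work per request.
        (out / f"{movie['slug']}.html").write_text(page)

if __name__ == "__main__":
    build()
```

The point of the design is that all the expensive work (scraping, templating) happens once at build time, so serving is just handing out cached static files.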