zlacker

[return to "Show HN: Wikipedia as a doomscrollable social media feed"]
1. pinkmu+Hc 2026-02-02 02:06:23
>>rebane+(OP)
Please fix the loading issue and I'll return! I think you don't need to pull all the data at initialization; you could lazily grab a couple of posts from each category and just keep doing that as people scroll.
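Something like this, say (a rough TypeScript sketch - the /api/sample endpoint, its response shape, and the category names are made-up stand-ins, not your actual API):

    // Grab a couple of posts from each category; call this again
    // whenever the user nears the bottom of the feed.
    type Post = { id: number; title: string; summary: string };

    const categories = ['science', 'history', 'art']; // hypothetical
    const offsets = new Map<string, number>();

    async function fetchBatch(perCategory = 2): Promise<Post[]> {
      const batch: Post[] = [];
      for (const cat of categories) {
        const off = offsets.get(cat) ?? 0;
        const res = await fetch(
          `/api/sample?category=${cat}&offset=${off}&limit=${perCategory}`,
        );
        const posts: Post[] = await res.json();
        offsets.set(cat, off + posts.length); // remember where we left off
        batch.push(...posts);
      }
      return batch;
    }
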
2. rebane+Fg 2026-02-02 02:48:22
>>pinkmu+Hc
The loading issue is just a hug of death: the site's currently getting multiple visitors per second, and that requires more than a gigabit of bandwidth to handle.

I sort of need to pull all the data at initialization because I need to map out how every post affects every other - the links between posts are what take up the majority of the storage, not the text inside the posts. It's also kind of the only way to preserve privacy.
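To give a picture of it (a simplified TypeScript sketch - these types are an illustration, not the real format):

    // One short summary per post: small. The outbound links per
    // post are what blow up the size - hundreds of edges per
    // article dwarf the text itself.
    type PostId = number;

    interface FeedData {
      posts: Map<PostId, { title: string; summary: string }>;
      links: Map<PostId, PostId[]>;
    }

    // Because the whole graph sits in the browser, choosing what to
    // show next never reports your reading history to the server.
    function nextCandidates(data: FeedData, justRead: PostId): PostId[] {
      return data.links.get(justRead) ?? [];
    }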

3. goodmy+th 2026-02-02 02:55:41
>>rebane+Fg
I feel very strongly that you should be able to serve hundreds or thousands of requests at gbps speeds.

Why are you serving so much data personally instead of just reformatting theirs?

Even if you're serving it locally... I mean, a regular 100mbit line should easily support tens or hundreds of text users...

What am I missing?

4. rebane+Xh 2026-02-02 03:01:31
>>goodmy+th
> Why are you serving so much data personally instead of just reformatting theirs?

Because then you only need to download 40MB of data and do minimal processing. If you were to take the dumps straight off of Wikimedia, you would need to download 400MB of data and then spend minutes processing it.

And also it's kind of rude to hotlink half a gig of data from someone else's site.

> What am I missing?

40MB is 320 megabits, so even 3 visitors per second maxes out a gigabit connection.
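Spelled out (a back-of-the-envelope check, assuming every visit really pulls the full 40MB):

    const megabitsPerVisitor = 40 * 8;  // 40 MB = 320 megabits
    const visitorsPerSecond = 3;
    // 960 Mbps - essentially a full gigabit line
    const totalMbps = megabitsPerVisitor * visitorsPerSecond;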

5. goodmy+kj 2026-02-02 03:12:36
>>rebane+Xh
no but... why are you sending 40MB from your server to my device in one lump like that?

All I'm getting from your server is a title, a sentence, and an image.

Why not give me, say, the first 20 and start loading the next 20 when I reach the 10th?

That way you're not getting hit with 40MB for every single click - just a couple of MB per click, and a couple more per scroll, for users who are actually using the service.
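Something like this (a TypeScript sketch - the page size, the /api/posts endpoint, and the rendering are stand-ins for whatever you actually use):

    const PAGE = 20;
    const PREFETCH_AT = 10; // prefetch once this many unseen items remain
    let nextOffset = 0;
    let inFlight = false;

    function renderPost(p: { title: string; summary: string }): void {
      const el = document.createElement('article');
      el.textContent = `${p.title}: ${p.summary}`;
      document.body.append(el);
    }

    async function loadPage(): Promise<void> {
      if (inFlight) return; // don't stack duplicate requests
      inFlight = true;
      const res = await fetch(`/api/posts?offset=${nextOffset}&limit=${PAGE}`);
      const posts: { title: string; summary: string }[] = await res.json();
      nextOffset += posts.length;
      posts.forEach(renderPost);
      inFlight = false;
    }

    // Report the index of the newest post scrolled into view; the
    // next page starts loading when the user hits the 10th of 20.
    function onItemSeen(index: number): void {
      if (nextOffset - index <= PREFETCH_AT) void loadPage();
    }

    void loadPage(); // first 20 on page load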

Look at your logs. How many people only ever got the first 40 and clicked off because you're getting ddosed? Every single time that's happened (which is more than a few times, judging by the HN posts), you've not only lost a user, you've also worsened the experience for everyone who chose to wait, by inflating their load time with that mandatory 40MB download.

I am just having trouble understanding why you've decided to make me and your server sit through a 40MB transfer for text and images...
