zlacker

[return to "Starlink's laser system is beaming 42 petabytes of data per day"]
1. mrb+b46[view] [source] 2024-02-02 01:00:29
>>alden5+(OP)
So that is "432 Mbit/s per laser, and 9000 lasers total". I don't know about you guys, but I find that statement much more relatable than "42 PB/day". Interestingly, they also say each laser "can sustain a 100Gbps connection per link" (although another part of the article claims 200 Gbit/s). That means each laser is grossly underused on average, at 0.432% of its maximum capacity. Which makes sense: 100 Gbit/s is probably only achievable in ideal conditions (e.g. two satellites very close to each other), so these laser links are used in bursts, and a link stays established for only a few tens of seconds or minutes, until the satellites drift apart and are no longer within line of sight of each other.

And with 2.3M customers, that's an average of 1.7 Mbit/s per customer, or about 550 GB per customer per month, which is kinda high. The average American internet user probably consumes less than 100 GB/month. (HN readers are probably outliers; I consume about 1 TB/month.)
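
If anyone wants to check the arithmetic, here's a quick back-of-envelope in Python (decimal petabytes, an 86,400-second day, and a 30-day month are my assumptions; the 42 PB/day, 9000 lasers, 100 Gbit/s per link, and 2.3M customer figures come from the article and the numbers above):

    # back-of-envelope check of the per-laser and per-customer figures
    bits_per_day  = 42e15 * 8               # 42 PB/day, decimal petabytes
    aggregate_bps = bits_per_day / 86400    # ~3.9 Tbit/s across the constellation
    per_laser_bps = aggregate_bps / 9000    # ~432 Mbit/s per laser
    utilization   = per_laser_bps / 100e9   # vs. the quoted 100 Gbit/s per link

    per_customer_bps = aggregate_bps / 2.3e6                    # ~1.7 Mbit/s average
    gb_per_month     = per_customer_bps * 86400 * 30 / 8 / 1e9  # ~550 GB/month

    print(f"{per_laser_bps/1e6:.0f} Mbit/s per laser, {utilization:.2%} utilization")
    print(f"{per_customer_bps/1e6:.1f} Mbit/s per customer, {gb_per_month:.0f} GB/month")

That prints roughly "432 Mbit/s per laser, 0.43% utilization" and "1.7 Mbit/s per customer, 548 GB/month", matching the figures above.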

2. yosito+986[view] [source] 2024-02-02 01:37:03
>>mrb+b46
I think the average Instagram or TikTok user must be using more than 100 GB/month. And if you count YouTube and Netflix, it's probably more than that.
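
Rough numbers (the bitrates and hours per day here are my guesses, nothing from the thread or the article):

    # GB/month for a given average video bitrate and daily watch time
    def gb_per_month(bitrate_mbps, hours_per_day, days=30):
        return bitrate_mbps * 1e6 * hours_per_day * 3600 * days / 8 / 1e9

    print(gb_per_month(3, 2))  # ~81 GB/month at 3 Mbit/s, 2 h/day
    print(gb_per_month(5, 3))  # ~203 GB/month at 5 Mbit/s, 3 h/day

So a few hours a day of video streaming plausibly gets a user past 100 GB/month on its own.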
3. calvin+k86[view] [source] 2024-02-02 01:39:18
>>yosito+986
Is resolution going to peak? Like speeding on a highway, are there diminishing returns? On the other hand, bandwidth availability seems to also drive demand...
4. colech+W96[view] [source] 2024-02-02 01:56:44
>>calvin+k86
Sure, but "4K" is still being used as a differentiator by streaming companies in how much they charge. Even then, they serve up some pretty compressed streams, so there's room to compress less for a noticeable bump in quality.

There's of course a limit. The "native" bitrate equivalent of your retina isn't infinite.

The next step, though, is going to be lightfield displays (each "pixel" is actually a tiny display with a lens, so the panel produces "real" 3D images). I assume that will be a thing; we shall see if it does better than the last generation of 3D TVs/movies/etc. That's a big bump in bitrate.

There's also bitrate for things like game streaming and general screen/desktop streaming, where you need lots of headroom to make the latency work; you can't buffer several seconds of that.

The next-gen sci-fi stuff of more integrated sensory experiences is certainly going to be a thing eventually too. Who knows how much information that will need.

When more bandwidth becomes available, new things become possible, sometimes things that are hard to imagine before somebody gets bored and tries to figure them out.

When I'm futzing around with ML models, I'm loading tens of gigabytes from disk into memory. Eventually something like that, and things orders of magnitude larger, will probably be streamed over the network like nothing. PCIe 4.0 x16 is, what, about 32 GB/s? Why not that over a network link for every device in the house in 10 years?
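
For what it's worth, that figure roughly checks out (the 128b/130b encoding detail and the Ethernet comparison are mine, not from the comment):

    # PCIe 4.0 x16 usable bandwidth, back-of-envelope
    lane_rate  = 16e9                       # PCIe 4.0: 16 GT/s per lane
    lane_bytes = lane_rate * 128 / 130 / 8  # 128b/130b encoding, bits -> bytes
    x16_bytes  = lane_bytes * 16            # ~31.5e9 bytes/s for an x16 slot

    print(x16_bytes / 1e9)  # ~31.5 GB/s, vs 12.5 GB/s for a 100 GbE link

So "PCIe 4.0 x16 over the network" would mean roughly a 250 Gbit/s link to every device, which today is still datacenter-switch territory.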
