And with 2.3M customers, that's an average of 1.7 Mbit/s per customer, or about 550 GB per customer per month, which is kinda high. The average American internet user probably consumes less than 100 GB/month. (HN readers are probably outliers; I consume about 1 TB/month.)
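Back-of-envelope, if anyone wants to check the sustained-rate-to-monthly-volume conversion (a quick sketch; only the 1.7 Mbit/s average from above goes in):

    # Convert an average sustained rate to monthly data volume.
    avg_rate_mbit_s = 1.7                  # Mbit/s, sustained 24/7
    seconds_per_month = 30 * 24 * 3600     # ~2.59 million seconds
    gb_per_month = avg_rate_mbit_s * seconds_per_month / 8 / 1000
    print(f"{gb_per_month:.0f} GB/month")  # ~551 GB/month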
There's of course a limit. The "native" bitrate equivalent of your retina isn't infinite.
The next step, though, is going to be lightfield displays (each "pixel" is actually a tiny display with a lens that produces "real" 3D images). I assume that will be a thing; we'll see if it does better than the last generation of 3D TVs/movies/etc. That's a big bump in bitrate.
There's also bitrate for things like game/general-computing screen streaming, where you need lots of bandwidth headroom to make the latency work; you can't buffer several seconds of that.
The next gen sci-fi of more integrated sensory experiences is certainly going to be a thing eventually too. Who knows how much information that will need.
When more bandwidth becomes available, new things become possible, sometimes things that are hard to imagine until somebody gets bored and tries to figure them out.
When I'm futzing around with ML models, I'm loading tens of gigabytes from disk into memory. Eventually something like that, and things orders of magnitude larger, will probably be streamed over the network like nothing. PCIe 4.0 x16 is, what, 32 GB/s? Why not that over a network link for every device in the house in 10 years?
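Rough numbers on what that would feel like (a sketch; the 50 GB model size is a made-up example, and link speeds are nominal, ignoring protocol overhead):

    # Time to move a big model blob at various link speeds (illustrative).
    model_gb = 50                          # hypothetical model size in GB
    links_gb_s = {
        "1 Gbit/s ethernet": 0.125,        # ~0.125 GB/s
        "10 Gbit/s ethernet": 1.25,
        "PCIe 4.0 x16 (~32 GB/s)": 32.0,
    }
    for name, gb_s in links_gb_s.items():
        print(f"{name}: {model_gb / gb_s:.1f} s")
    # ~400 s at 1 Gbit/s, ~40 s at 10 Gbit/s, ~1.6 s at PCIe 4.0 x16 speed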