I understand this is napkin math, but shouldn't we consider that the load isn't evenly distributed? In that case, 50% average utilization seems extremely high.
100k a year for 100GBps; I'll leave it to you to calculate how many petabytes per year you can push with that.
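For a rough sense of scale, here's a minimal sketch of that calculation. It assumes "100GBps" means 100 gigabytes per second (divide by 8 if the figure is really gigabits) and a plain 365-day year; the 50% utilization factor is the one questioned above.

    # Napkin math: petabytes per year through a 100 GB/s link.
    # Assumes decimal units (1 PB = 1e6 GB) and a 365-day year.
    SECONDS_PER_YEAR = 365 * 24 * 3600                 # 31,536,000 s

    rate_gb_per_s = 100                                # gigabytes per second (assumed)
    full_util_pb = rate_gb_per_s * SECONDS_PER_YEAR / 1e6   # GB -> PB

    print(f"100% utilization: ~{full_util_pb:,.0f} PB/year")        # ~3,154 PB
    print(f"50% utilization:  ~{full_util_pb / 2:,.0f} PB/year")    # ~1,577 PB

    # If the link is actually 100 Gbps (gigabits), divide by 8:
    print(f"100 Gbps at 50%:  ~{full_util_pb / 8 / 2:,.0f} PB/year")  # ~197 PB

So the gigabytes-vs-gigabits reading changes the answer by an order of magnitude, which is why the unit in "100GBps" matters here.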