- `time wget https://t.co/4fs609qwWt` -> `0m5.389s`
- `time curl -L https://t.co/4fs609qwWt` -> `0m1.158s`
This is precisely why I did believe OP. This is Elon Musk we're talking about.
Or even something like a junior dev having removed an index, although it seems unlikely it would just happen to hit the NYT.
Or did you mean failing to resolve some internal service's hostname?
curl -A "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:109.0) Gecko/20100101 Firefox/117.0" -I "https://t.co/4fs609qwWt"
x-response-time: 4521

Even if it's deliberate, I don't see how people can complain. Google has outright blocked Breitbart for years. They prevent results from that domain from appearing at all unless you specifically force it with site: and apparently HN does the same. Politically motivated censorship and restricting "reach" is just how Silicon Valley rolls. Pre-Musk Twitter did freeze the New York Post's account and many other much worse things. It'd be a shame for Musk to be doing this deliberately, even though it seems unlikely. But that's the problem with creating a culture where that sort of behavior is tolerated, isn't it? One day it might be turned around on you.
Does the value added by sources like the NYT outweigh the negatives of being occasionally biased or outright wrong? Yes.
- `time curl -A "Mozilla/5.0 (X11; Linux x86_64; rv:60.0) Gecko/20100101 Firefox/81.0" -L https://t.co/4fs609qwW` -> 4.730 total
- `time curl -L https://t.co/4fs609qwWt` -> 1.313 total
Same request, the only difference is user-agent.
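One way to make that comparison reproducible (a sketch of my own, not from the thread; the helper name and the fallback UA string are made up) is to let curl itself report the timing, so the only variable between the two runs really is the User-Agent:

```shell
# Sketch: time the same URL under different User-Agent strings.
# timed_fetch URL [UA] discards the body and prints curl's
# total transfer time in seconds.
timed_fetch() {
  url=$1
  ua=${2:-curl/8.0}   # hypothetical default UA; any string works
  curl -gsS -o /dev/null -A "$ua" -w '%{time_total}\n' -- "$url"
}

# Against the live t.co link (network required), something like:
#   timed_fetch https://t.co/4fs609qwWt                    # curl-ish UA: fast
#   timed_fetch https://t.co/4fs609qwWt 'Mozilla/5.0 ...'  # browser UA: ~4.5s
```

Running both back to back keeps everything but the UA identical, and `-o /dev/null` takes terminal rendering out of the measurement, unlike wrapping the whole command in `time`.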
The only "values" that matter are the personal whims of whoever happens to own Twitter, or Google or Facebook.
Yeah, something more like that where the internal service is somehow 'sharded' due to some overly complicated distributed database nonsense, and there's a DNS lookup that is failing. Of course that'd mean the DNS lookup wasn't cached, so you're taking that normal latency on every single hit, which would be terrible architecture. The curl-vs-wget performance isn't explained by that though (although that's a bit weird in and of itself, and might suggest that they had to allow that for some internal tool that they didn't want to punish).
> glibc defaults to 5 sec,
The timeout being close to 5 seconds is what made me wonder about it. It's just slightly off, though.
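For context (my addition, not the commenter's): glibc's stub resolver does default to a 5-second per-retry timeout (RES_TIMEOUT), which is what makes a ~5s stall smell like one failed DNS attempt. It's tunable in resolv.conf:

```text
# /etc/resolv.conf — glibc resolver options (defaults shown)
# timeout:  seconds to wait per query before retrying (default 5, max 30)
# attempts: number of tries per nameserver (default 2)
options timeout:5 attempts:2
```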
The NYT may have more reach and definitely isn't neutral, but it's a far cry from the nakedly partisan nonsense that Breitbart publishes.
% curl -gsSIw'foo %{time_total}\n' -- https://t.co/4fs609qwWt https://t.co/iigzas6QBx | grep '^\(HTTP/\)\|\(location: \)\|\(foo \)'
HTTP/2 301
location: https://nyti.ms/453cLzc
foo 0.119295
HTTP/2 301
location: https://www.gov.uk/government/news/uk-acknowledges-acts-of-genocide-committed-by-daesh-against-yazidis
foo 0.037376

[Edit:] I'm still seeing it with threads.net:
curl -v -A 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/16.6 Safari/605.1.15' https://t.co/DzIiCFp7Ti

% curl -gsSIw'foo %{time_total}\n' https://t.co/DzIiCFp7Ti | grep '^\(HTTP/\)\|\(location: \)\|\(foo \)'
HTTP/2 301
location: https://www.threads.net/@chaco_mmm_room
foo 0.123137
Doesn't matter if I do an HTTP/2 HEAD or GET:
% curl -gsSw'%{time_total}\n' https://t.co/DzIiCFp7Ti
0.121503
HTTP/1.1 also shows no delay:
% curl -gsSw'%{time_total}\n' --http1.1 https://t.co/DzIiCFp7Ti
0.120044
I chalk this up to rot at X/twitter that is being fixed now that it was noticed.

That's because you're not spoofing the User-Agent to be a browser rather than curl.
% curl -gsSw'%{time_total}\n' -A 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/16.6 Safari/605.1.15' https://t.co/DzIiCFp7Ti
<head><noscript><META http-equiv="refresh" content="0;URL=https://www.threads.net/@chaco_mmm_room"></noscript><title>https://www.threads.net/@chaco_mmm_room</title></head><script>window.opener = null; location.replace("https:\/\/www.threads.net\/@chaco_mmm_room")</script>
4.690000
% curl -gsSIw'%{time_total}\n' -A 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/16.6 Safari/605.1.15' https://t.co/DzIiCFp7Ti
HTTP/2 200
...
content-length: 272
...
x-response-time: 4524
...
4.660211
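Side note (my arithmetic, not the commenter's): `x-response-time` is in milliseconds while curl's `%{time_total}` is in seconds, and the two line up almost exactly, which points at a delay added server-side rather than in transit:

```shell
# Values copied from the headers above.
server_ms=4524      # x-response-time (milliseconds, server-side)
total_s=4.660211    # curl %{time_total} (seconds, wall clock)

# Time spent outside the server (DNS + TCP + TLS + transfer):
awk -v ms="$server_ms" -v s="$total_s" \
  'BEGIN { printf "%.0f ms outside the server\n", s * 1000 - ms }'
# prints: 136 ms outside the server
```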
The delay is not there for nyti.ms (anymore), but once you use the Safari UA it's handled as a 200 response:
% curl -gsSIw'foo %{time_total}\n' -A 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/16.6 Safari/605.1.15' https://t.co/4fs609qwWt https://t.co/iigzas6QBx | grep '^\(HTTP/\)\|\(location: \)\|\(foo \)'
HTTP/2 200
foo 0.126043
HTTP/2 200
foo 0.037255
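A note on the grep used throughout (my explanation, with made-up sample headers): it's a basic regex with GNU alternation that keeps lines starting with `HTTP/`, plus any line containing `location: ` or the `foo ` marker that `-w 'foo %{time_total}\n'` emits, and it can be sanity-checked offline:

```shell
# Offline check of the header filter used in the commands above:
# keeps status lines, Location headers, and the "foo <time>" -w output.
printf '%s\n' \
  'HTTP/2 301' \
  'location: https://nyti.ms/453cLzc' \
  'server: tsa_b' \
  'foo 0.119295' |
  grep '^\(HTTP/\)\|\(location: \)\|\(foo \)'
# the three interesting lines survive; "server: tsa_b" is dropped
```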
It really does seem that twitter is adding a 4.5s delay to some sites from web browsers. Could be malicious, could be rot...

What gets a website censored, in the modern corporation-dominated Internet, is going against the interests and preferences of Big Tech owners - and nothing else. Nobody with any power is bound to look out for the public interest, however defined; ICANN is perhaps the only exception that comes to mind.
We can waste our time and attention debating over which targets were more or less deserving of censorship, based on our personal ideas of public interest. But as long as Big Tech is allowed to exist in its current form, we're like powerless peasants arguing about the decisions of kings.