In other news, possibly the best designed website of 2020: http://www.muskfoundation.org/
First, go to "Site settings" and disable it globally.
To make a site-wide exception, click on the icon in the address bar and enable it for the site there.
It used to be crippled and unusable, but they fixed it recently.
I also recommend Bromite on Android.
So how do I trace what Javascript is doing on my machine?
Generally mitmproxy gives a feel for what sites the browser talks to. And strace often gives a good feel for what a Linux binary does. But the browser is too big and complicated to read strace output in most cases.
Can anybody recommend a tool to look what Javascript code loaded by a certain page is doing?
What? :)
Almost all modern browsers have debug panels that will list all of the requests made by a page, assets cached, cookies and local db, and of course it's trivial to just view the source of a site and read the javascript if it isn't compressed.
Open your browser's developer tools, go to the Script/Debugger tab and have at it. It's about as obtuse a tool to use as gdb, but you'll see exactly what it does. Chrome dev tools has automatic formatting of the code, maybe Firefox too. But you'll be stuck with shitty variable names if they've been mangled. Although you could try http://www.jsnice.org/; I had variable luck with it.
It would be interesting to have a browser tool that is like strace and you could filter by calls, so you can see exactly where window.navigator is being used for example, or localStorage.setItem. For now best you can do is searching for "navigator" which works, but can be minified/hidden away by coder as well.
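Something in that direction can be hacked together today by wrapping the APIs you care about before the page's own scripts run (e.g. from a user script set to run at document-start, or a DevTools snippet). A minimal sketch, assuming you only care about reads of window.navigator and calls to localStorage.setItem:

    // Log every read of window.navigator, with a stack trace to locate the caller.
    const realNavigator = window.navigator;
    Object.defineProperty(window, "navigator", {
      get() {
        console.trace("navigator accessed");
        return realNavigator;
      }
    });

    // Log every localStorage/sessionStorage setItem call and its arguments.
    const realSetItem = Storage.prototype.setItem;
    Storage.prototype.setItem = function (key, value) {
      console.trace("setItem", key, value);
      return realSetItem.call(this, key, value);
    };

It won't catch everything (code can grab references before you patch, or detect the patching), but it's surprisingly effective for answering "who is touching this API?".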
Here[1] is the link to disabling JS for a site in Chromium.
[1] chrome://settings/content/javascript
Haven't really used the Javascript debugger, but my guess is it would be completely infeasible to follow everything a random "modern" Web site might do. And as you say, some Javascript might be compressed or obfuscated. What I would really want is a somewhat higher-level / more filtered approach, like how strace lets me trace just file operations, for example.
https://support.apple.com/en-gb/guide/mac-help/mchlp2271/mac
If someone knows how to achieve the same on Linux for Chrome and Firefox, I'd love to hear it (browser plugins are a bit of a security and stability shitshow, so non-plugin solution would be preferred, all else being equal).
Exactly, that's what I meant.
I combine this with another ad blocker (Wipr) to block everything else.
I have no idea why they made the SVG image inline but the CSS style external, though. That same image is used on every page.
I think Chrome actually is fine although I don't know of a keybinding for it: Don't they still have a toggle right in the site menu (click the icon to the left of the URL to toggle all kinds of these things)?
The other rule is that the JS is all hand-written. No frameworks or other dependencies.
JS doesn't have any magic to it, location information is opt-in, but your IP is a much better advertising identifier.
Would love to have heard that final talk!
It's about the best UI I could come up with for this particular knob and the other things adjacent to it.
Additionally, you can set breakpoints on event handlers and Chromium has deobfuscation built in. You can usually tell approximately what's going on by stepping through the code and watching the variables in local scope.
https://news.ycombinator.com/item?id=11411982
[1] I once enabled JS on a site that claimed it would provide "a better experience", and was bombarded with a bunch of ads and other irritations that just made me turn it off again. It was not a "better experience".
Things like this are seriously creepy: https://www.crazyegg.com/blog/mouse-recorder/
> IANAL
> What I think they mean by this is that you shouldn't link to resources on their website to make it seem like they endorse your (product, website, whatever).
i-may-not-be-totally-perfect-but-parts-of-me-are-excellent.com
and sue anyone who links to them. Hopefully the author will be so grateful for this insight that they won't sue me for reproducing their copyrighted work in this comment.
[0] https://fairuse.stanford.edu/2003/09/09/copyright_protection...
A great way to make your browsing better is to disable 3rd party scripts by default and whitelist when needed, but <noscript> fails to work in those conditions.
Canvas fingerprinting, WebGL fingerprinting, GPU, fonts etc etc etc.
Please, stop arguing, JS is a nightmare for privacy. Period
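For anyone who hasn't seen how little code canvas fingerprinting takes, here's a rough sketch of the general technique (not any particular vendor's implementation): draw fixed content and hash the pixels, which differ subtly across GPU, driver and font stacks:

    // Draw fixed text and shapes off-screen; tiny rendering differences across
    // GPUs, drivers and installed fonts make the output nearly unique per machine.
    const canvas = document.createElement("canvas");
    canvas.width = 200;
    canvas.height = 50;
    const ctx = canvas.getContext("2d");
    ctx.textBaseline = "top";
    ctx.font = "14px Arial";
    ctx.fillStyle = "#f60";
    ctx.fillRect(10, 10, 100, 20);
    ctx.fillStyle = "#069";
    ctx.fillText("fingerprint 1.0", 2, 15);

    // Hash the rendered pixels into a stable identifier.
    const bytes = new TextEncoder().encode(canvas.toDataURL());
    crypto.subtle.digest("SHA-256", bytes).then(buf => {
      const id = [...new Uint8Array(buf)]
        .map(b => b.toString(16).padStart(2, "0")).join("");
      console.log("canvas fingerprint:", id);
    });

No permission prompt, no visible UI, and the result stays the same on every visit until the browser or drivers change.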
Nowadays OSes have protection for this sort of thing. But I'd imagine you could still fingerprint an OS like that. Combine that with TLS, HTTP, etc. specifics and you could narrow it down quite a bit I bet.
most people don't run their own resolvers, so at best you're fingerprinting the DNS server of the ISP.
> http caches
can be easily cleared, or mitigated entirely by extensions or browser features (eg. multi-account containers).
Of course, you’d need to have JavaScript enabled to do that...
A good contrast with web pages which are not apps telling you "This app requires Javascript to run".
[1] https://en.wikipedia.org/wiki/Transmission_Control_Protocol#...
It's worth saying that most people don't even know what Javascript is, full stop. Weirdly enough, my mother now does, but my younger sister doesn't - we now have a generation that has effectively grown up post-smartphone, which is fascinating to me.
* Washington Post no longer has a paywall
* anandtech.com is seemingly unaffected, but tomshardware.com is very different (and less pushy)
* SoundCloud says "Oh no! | JavaScript is disabled | You need to enable JavaScript to use SoundCloud"
* nationalreview.com is broken -- I can only read a few paragraphs in a NR PLUS story, and there's no way to keep scrolling (I can read the article fine in another tab)
* An article I co-wrote, published in a Cambridge University Press journal, is now sans tables and figures, but the console reports no errors or exceptions.
An interesting experiment! Overall, it seems my internet experience is better without JS (but reading an academic article online is way worse).
I would suggest that most websites work better if you have javascript and CSS enabled, since that's what they were designed for, but use an ad blocker like uBlock Origin to remove the ads.
If you're worried about tracking, you can block ads and tracking scripts without disabling javascript. If you're worried about viruses, well, all I can say there is that in my experience and understanding, if you keep your browser updated, the odds of getting a virus via browser JS are exceedingly low. Doubly so if you're not frequenting sketchy sites.
I don't know, it seems to me like advice from a time before security was a priority for browser makers, and high-quality ad blockers existed. At this point, I really don't see the value.
eg: hotjar.com sessioncam.com
Legitimate tools for measuring effectiveness of pages with little in the way of nefarious tracking afaics. Also very useful for replaying user errors/problems.
Not to mention that a host of vulnerabilities were image related a few years back (one of the original rootkits exploited a TGA bug).
> uBlock Origin
Honestly, this is the antivirus of the web. I helped my niece set up my old computer for Minecraft today, and she was explaining how her friend had installed viruses (adware, really) 3 times. Every one of those instances was caused by download link confusion for Minecraft mods. Disabling JavaScript isn't going to save you from being tricked into downloading shady software, only an adblocker will.
I forget about the back button. By default, I always open links in new tabs which means back button has no data. Also, SPAs have hijacked the back button or just broken it completely, so I've been trained to not count on it behaving as expected. There's also mobile experience where getting to the back button itself is often painful after the UI hides navigation from you.
Otherwise, I am 100% in agreement. If a page is so user hostile as to not provide a friendly non-JS version, the tab gets closed.
I really wish, even if it was an optional setting, browsers would copy the history of the source tab when you did that. If I hit back in a tab I opened that way, I still want “where I got here from”, not “stay here” or “new tab page” or especially “close the tab” (thanks a lot, Android Chrome).
They would need to have compromised one of the root certificates on your machine to not give you a giant security warning.
In modern browsers there’s not even a button to bypass them (although I know in Chrome you can type "thisisunsafe" on the error page and it will let you bypass it temporarily).
That's literally the only thing it did until I reconfigured my browser to access it. It's a misuse of `<noscript>` and it's completely unnecessarily intruding on how I use my own computer to access the content. I thought that was the kind of thing people here (especially the anti-JS people) frown upon.
How can I generate pages with dynamic content easily? Ideally with absolute minimal dependencies.
That’s not how it’s tracked commonly. Similar to HTTP caches, you can fingerprint visitors by how quickly a domain request resolves for them. Sure, all of this can be mitigated. But you have to even know what to mitigate. And the fact that even the most fanatical privacy folks aren’t aware of basic timing fingerprints is a good indicator that no one is mitigating it nearly as well as they might think.
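To make the timing point concrete, here's a rough sketch of the idea (not a production technique): time how quickly a resource on a third-party host comes back; a warm DNS/connection path answers noticeably faster than a cold one. The host name and threshold below are made up for illustration:

    // Time a tiny request to a third-party host; a warm DNS/connection cache
    // answers much faster than a cold one, hinting the visitor has been there before.
    // "static.example-cdn.com" and the 50 ms threshold are hypothetical.
    async function probe(host) {
      const start = performance.now();
      try {
        // no-cors lets the request complete cross-origin even though we can't read the body
        await fetch(`https://${host}/favicon.ico`, { mode: "no-cors", cache: "no-store" });
      } catch (e) {
        // Even a failed request yields usable timing.
      }
      return performance.now() - start;
    }

    probe("static.example-cdn.com").then(ms =>
      console.log(ms < 50 ? "probably warm (recent visitor)" : "probably cold", ms));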
[1]: https://www.macrumors.com/how-to/disable-content-blockers-sa...
BTW, you probably want to move off of uM given gorhill has abandoned it in favor of uBO. (I converted all my rules to a mix of uBO dynamic rules for JS and static rules for everything else, except for cookies which I still use uM for because uBO can't manage them.)
I think it's a conflict between Tor trying to be as mainstream as possible and onion services trying to reduce users' attack surface; JavaScript has been used against Tor users.
My philosophy is: set a good example, make the benefit clear, and communicate with other devs who might not realize the horror show they’re sending down the wire/executing in their users’ browsers. But forcing people to figure out how to disable something increasingly hard to disable before they can even hear you is not good communication.
I only clicked the link because I was hoping there would be a regular site under default config and some special treat with JS disabled. Progressive enhancement. That would have been a clever and compelling execution.
Javascript is a privacy and security nightmare. It's almost equivalent to downloading and silently executing untrusted code on your machine. I say "almost" because Javascript code is virtualized and sandboxed. Though I have no doubt people have already discovered vulnerabilities that enable code to break out of the sandbox.
You still need Ajax (at least) for truly dynamic content. The bad news is that Ajax requires client-side JS. The good news is minimal Ajax functionality only takes about 30 lines of pure Javascript. The bad news is you will need to write those 30 lines yourself because every JS library is at least 100x bigger than that and packed with cruft you don't need.
My advice is to pretend React doesn't exist, learn Javascript, and write just the Javascript you need and no more.
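For what it's worth, here's a rough sketch of what that minimal, hand-written Ajax might look like using plain fetch; the endpoint and element id are made up for illustration:

    // Fetch JSON from the server and render it into the page - no framework needed.
    // "/api/items" and "item-list" are hypothetical names, purely for illustration.
    function loadItems() {
      fetch("/api/items")
        .then(res => {
          if (!res.ok) throw new Error("HTTP " + res.status);
          return res.json();
        })
        .then(items => {
          const list = document.getElementById("item-list");
          list.textContent = "";           // clear any previous contents
          for (const item of items) {
            const li = document.createElement("li");
            li.textContent = item.name;    // textContent avoids HTML injection
            list.appendChild(li);
          }
        })
        .catch(err => console.error("loadItems failed:", err));
    }

    document.addEventListener("DOMContentLoaded", loadItems);

Swap fetch for XMLHttpRequest if you need to support very old browsers; the shape stays the same.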
Disabling Javascript makes most of this insulting crap go away, and sometimes it is the only way to read the content.
Watch his videos. Check out his articles on A List Apart and in Smashing Magazine, among others. Pay attention, he's very thoughtful and you'll probably learn a lot.
IME, 9 times out of 10, web developers are using JS for unnecessary reasons. The user-configurable settings of popular browsers make it easy to designate the small number of sites that actually require JS and keep JS disabled for all other sites. They anticipate that the user will not have one default JS policy for all websites. In other words, these web browsers do not expect that all users should just leave JS enabled/disabled for every website; they acknowledge there will be situations where it should be disabled.
However as we all know most users probably never change settings. Doubtful it is a coincidence that all these browsers have JS enabled by default.
The number of pages I visit that actually require JS for me to retrieve the content is so small that I can use a client that does not contain a JS interpreter. Warnings and such one finds on web pages informing users that "Javascript is required" are usually false IME. I can still retrieve the content with the use of an HTTP request and no JS.
There is nothing inherently wrong with the use of JS. It is nice to have a built-in interpreter in a web browser for certain uses. For example, it makes web-based commerce much easier. However, I believe the largest use of JS today is to support the internet ad industry. Without the browser automatically executing code (with no user review, approval or even interaction), I do not believe the internet ad "industry" would exist as we know it.
I believe this not because I think having a JS or other interpreter is technically necessary, but because these companies have become wholly reliant upon it.
That's why disabling JS stops a remarkable amount of ads and tracking.
I browse the web with JS disabled by default. If I encounter a site that has trouble with that, I enable it for that site until I can determine if it is worth leaving it enabled, which usually means at some point I'll be back there again and need it on.
For the most part, it is a superior experience to what I was seeing before with just an ad blocker. The most noticeable thing about it is probably how many images simply don't load because developers lean on JS for loading and scaling them.
Brave's browser claims a speedup over Adblock Plus, but it was inspired by uBO, so the performance is fairly similar; the blocker is baked into the browser instead of being an extension.
> We therefore rebuilt our ad-blocker taking inspiration from uBlock Origin and Ghostery’s ad-blocker approach.
AFAIK, JavaScript the language has neither privacy nor security issues of "nightmare" level.
> It's almost equivalent to downloading and silently executing untrusted code on your machine.
No it's not. The code is run in a VM, which is run in a browser. So, the code is limited in doing things to the browser, which itself is limited in what it can do to your computer (files and whatnot). So it's not at all like running untrusted code "on your machine".
> I say "almost" because Javascript code is virtualized and sandboxed.
It's virtualized (in the browser) such that all the code will run almost the same on different browsers and chipsets. Again, the browser code is what keeps the computer safe from any code it runs, including CSS code or other VMs it may use, like Java or Flash. Also the OS keeps the computer safe from the browser (or at least it should).
So, no it's not JavaScript that is the boogeyman here.
Right, so you are describing the implementation of the tool I was looking for. Obviously I don't want to do that manually while tracing a page.
Calling out your histrionics for what they are is not ad hominem. I’m attacking your statement, not your person.
Further, that’s not some sort of axiomatic law, that’s just a phrase. Even if it was, losers using ad hominem doesn’t mean winners don’t, that’s not how logic works.
> FOR A FREE CAR INSURANCE RATE QUOTE THAT COULD SAVE YOU SUBSTANTIAL MONEY WWW.GEICO.COM OR CALL 1-888-395-6349, 24 HOURS A DAY
...on the homepage of a quarter-trillion dollar company, with no other ads.
Ad hominem: You're wrong because you're an idiot.
Just an insult: You're an idiot because you're wrong.
Furthermore, concluding that somebody is wrong because they used a logical fallacy is itself a logical fallacy. If I said "2+2=4 because you're an idiot" my reasoning would be fallacious, but to conclude that the answer must therefore not be four is also fallacious.
(I occasionally used it out of curiosity, but found it too tedious in the long term. I have settled on CookieAutoDelete, which seems to address most tracking. Not many seem to run a completely server-based fingerprint database.)
BTW, the site works as expected with my Linux/Firefox/uMatrix setup... the inline scripts are disabled by default and I see the page content. I'm not sure why GP had issues.
foo.com bar.com css allow
which means "allow foo.com to fetch css from bar.com", the corresponding uBO static rule is: @@||bar.com^$domain=foo.com,css,allow
The full list of things that can be allow/block'd by uBO is at https://github.com/gorhill/uBlock/wiki/Static-filter-syntax#...

I have a "block everything by default" rule at the top that's:
*$css,font,frame,media,object,ping,script,websocket,xhr
*$image,redirect=1x1.gif
*$csp=worker-src 'none'
@@*$1p,css,frame,image
which means:

1. Block a bunch of things by default.
2. Block images by replacing them with the built-in 1x1 GIF instead of canceling the request.
3. Disable web workers by setting the CSP worker-src.
4. Override the previous rules by allowing first-party CSS, frames and images. (The @@ means it's an override rule.)
(The fact that my default is to block everything is why the first example I gave above starts with @@ too.)
Web workers can be allowed on a per-site basis by overriding the csp directive with a reset:
@@||foo.com^$csp
Lastly, I have a dynamic rule to allow `<noscript>` tags to be rendered:

no-scripting: * true

Then, for every static rule where I enable JS for a domain, I add a corresponding `no-scripting: $domain false` in the dynamic rules. It's annoying to have to move between static and dynamic rules when deciding to enable JS on a site, but I'm not sure there's a better way. Neither static nor dynamic rules individually support everything that uM could do - static rules can't block inline JS nor render `<noscript>` content, and dynamic rules can't block every kind of request.
Static rules are also nice in that you can have empty lines and comments and arbitrary ordering of your rules, so it's easier to group rules in sections based on the domain names, add comments, etc. Dynamic rules however are like uM's rules and are forced to be sorted by domain name with no empty lines or comments.
As a lover of old image formats and the security issues they can cause* this sounds fascinating, but some quick google searches don’t seem to surface what you are referencing. Can you share any more details?
* I once fell into discovering a memory disclosure flaw with Firefox and XBM images
Takes max 10 seconds, on any site. Can you do it in less than 10 seconds using that validator?
Shows a banner "You Don't Need JavaScript to Run This Site (turn it off here)"
It's a response to all the "You Need JavaScript to Run This Site" banners we see everywhere even on plain text/image sites.
Also, just so you know, Brave isn't "written" in Rust alone; it is a big piece of software with a lot of parts, including but not limited to a rendering engine, a JS VM and a WASM engine.
The Rust part at most (unconfirmed) would be the glue that connects them together, and I doubt that's where the bottleneck is for most browsers.
>The new algorithm with optimised set of rules is 69x faster on average than the current engine.
The "security features" of popular browsers will never protect the user from the tentacles of internet advertising. Companies/organizations that author popular web browsers generally rely on the success of internet advertising in order to continue as going concerns; as such, they are obviously not focused on internet advertising, and collection of user data, as a "security threat".
I swear I've had ones that pop up as I move the mouse to close the tab.
As someone pointed out, there's a button on uBlock to disable it.
I didn't quite spend so much time on Firefox preferences but I didn't find the option. I'm sure it's there somewhere
You can open the developer console, go into its settings and look for "Disable JavaScript". Checking the box will disable JS and reload the page.
With SSR, you need some component that's aware of every change, and that triggers those re-renders at sensible times (every render takes server resources). This all feels messy, compared to rendering just-in-time on the client-side.
Jesus, why does everyone these days automatically assume that everyone else is using Chrome or Chromium? It's almost as crazy as calling Windows a "PC".
If you disable 3rd-party javascript (using uBlock Origin or others), noscript tags don't trigger, because scripting is still technically turned on and noscript tags aren't tied to the specific script they complement, so the browser has no way of knowing which ones to run or not run in the 3rd-party situation.
This is not about how "you" do things, it is more about how it should be done! JS almost never provides what I want when I browse; I expect to get some information! I am not at the circus looking for adventures!
People love to use javascript to prove that they have some kind of taste about UX etc. I think these people should use some other platform, one meant for people who are interested in show business. Think of it as public transportation designed around who the driver is that day! Does that sound ok to you? Do you understand this one?
The web is just a connection to other people, not a tool for others to bully you by showing off how brilliant "the code" they wrote is!
Heydon is a developer, and an influencer of developers. He's saying: web development is now absolutely obsessed with JavaScript, and it in no way has to be. The basics, HTML, CSS. That's what's important.
[1] A beta that you can download from the github page. I assume the latest stable version also works fine, but the beta had a few additional bugfixes and features and I haven't encountered any instability.
But I must say I hate GDPR banners and this could convert me.
What part of it do you think is bad for usability or accessibility?
But my point was that the markup is invalid.
fetch("https://heydonworks.com").then(x => x.text()).then(x => {
var f = document.createElement("iframe");
document.body.append(f);
f.style.left = "0px";
f.style.top = "0px";
f.style.width = "100%";
f.style.height = "100%";
f.style.position = "absolute";
f.style.border = 0;
x = x.replace(/\<\/?noscript\>/gi, "");
x = x.replace(/\<script\>.*\<\/script\>/gi, "");
f.contentDocument.write(x);
}); <meta name="noscript">
The browser could put an extra icon next to the https padlock so the user would know they were viewing a document rather than an application.

[1] https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Co...
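Something close to this already exists at the HTTP layer: a Content-Security-Policy header that forbids script execution is a machine-readable way for a page to declare itself a document rather than an application. A minimal example:

    Content-Security-Policy: script-src 'none'

A browser that wanted to could surface that as a "no scripts here" indicator next to the padlock.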
[1]: https://apps.apple.com/us/app/purify-block-ads-and-tracking/...
If you want to build something that's nice for humans and machines, look up best practices for this sort of thing - plenty of information is widely available on how to build things in usable and accessible ways (and it's simpler to do it correctly than to use these 'hack'-like workarounds anyway!)
Use a tracking pixel (eg. an image) to make further requests and the cookie will be included in the request.
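A rough sketch of how little that takes (the tracker hostname and parameters are hypothetical, purely for illustration):

    // Requesting a 1x1 image from the tracking domain sends that domain's cookies
    // along with the request; the query string carries whatever else you want logged.
    const pixel = new Image(1, 1);
    pixel.src = "https://tracker.example.com/pixel.gif" +
      "?page=" + encodeURIComponent(location.href) +
      "&ref=" + encodeURIComponent(document.referrer);

(Whether the cookie actually goes along depends on the browser's third-party cookie policy, of course.)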
Inspect element -> remove
Browsers should really add "remove element" directly to the context menu.
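In the meantime, the console equivalent is a one-liner; the selector is just an example:

    // Remove the first element matching a CSS selector; ".newsletter-popup" is
    // a hypothetical example - substitute whatever is in the way.
    document.querySelector(".newsletter-popup")?.remove();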