1) The code, for now, runs locally, which is good. To guard against the code being tampered with at a later date (for example, modified to send copies of the image to a server), download the webpage and use the saved copy, not the live one.
2) Do not use the blur functionality. For maximum privacy, it should be removed from the app entirely: there are _a lot_ of forensic techniques for reversing blur (see the sketch after this list).
3) Be wary of other things in the photograph that might identify someone: reflections, shadows, and so on.
4) Really a subset of 2 and 3, but be aware that blocking out faces is often not sufficient to anonymise the subject of the photo. Identifying marks like tattoos, or even something as basic as the shoes they are wearing, can be used to identify the target.
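On point 2, here is a minimal sketch of why the distinction matters (this uses Pillow, not whatever the app does internally; the file names and coordinates are made up): a solid fill replaces the pixels outright, while a blur only transforms them, leaving data that forensic methods can try to recover.

```python
from PIL import Image, ImageDraw, ImageFilter

def redact_solid(path_in: str, path_out: str, box: tuple[int, int, int, int]) -> None:
    """Cover the region `box` (left, top, right, bottom) with a flat black rectangle."""
    img = Image.open(path_in).convert("RGB")
    ImageDraw.Draw(img).rectangle(box, fill=(0, 0, 0))  # original pixels are gone
    img.save(path_out)

def redact_blur(path_in: str, path_out: str, box: tuple[int, int, int, int]) -> None:
    """Blur the region `box` -- shown only to illustrate what NOT to rely on."""
    img = Image.open(path_in).convert("RGB")
    region = img.crop(box).filter(ImageFilter.GaussianBlur(radius=12))
    img.paste(region, box)  # still a deterministic function of the original pixels
    img.save(path_out)

# redact_solid("photo.jpg", "photo_redacted.jpg", (100, 50, 300, 250))
```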
Free online metadata viewer: http://exif.regex.info
Powered by the FOSS (Perl-based) ExifTool: https://exiftool.org
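If you would rather not upload a sensitive photo to an online viewer, a rough local equivalent is easy to script. This is only a sketch (it assumes Pillow is installed and only reads the standard EXIF block; exiftool itself handles far more metadata formats):

```python
from PIL import Image
from PIL.ExifTags import TAGS

def dump_exif(path: str) -> None:
    """Print whatever EXIF tags Pillow can decode from the file."""
    exif = Image.open(path).getexif()
    for tag_id, value in exif.items():
        print(f"{TAGS.get(tag_id, tag_id)}: {value}")

def strip_metadata(path_in: str, path_out: str) -> None:
    """One blunt approach: copy only the pixel data into a fresh image and save that."""
    img = Image.open(path_in)
    clean = Image.new(img.mode, img.size)
    clean.putdata(list(img.getdata()))
    clean.save(path_out)

# dump_exif("photo.jpg")
# strip_metadata("photo.jpg", "photo_clean.jpg")
```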
This bug (closed as "Expected Behavior") has a demonstration: https://gitlab.gnome.org/GNOME/gimp/-/issues/4487
There should be a test suite for image-editing applications that validates the different ways of editing a file, to see which operations behave as expected and which do not. I’m thinking of something similar to the web-standards tests for browsers. Does something like this already exist?
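I'm not aware of one, but a minimal sketch of what a single check could look like: plant a recognisable canary string in a source file's metadata, run that file through the editor under test, and scan the exported file for survivors. (Pillow is assumed here only to plant the canary; the file names are hypothetical.)

```python
from PIL import Image

MARKER = b"PRIVACY-TEST-CANARY-0001"

def make_test_image(path: str) -> None:
    """Create a small JPEG whose EXIF contains a recognisable canary string."""
    img = Image.new("RGB", (64, 64), (128, 128, 128))
    exif = Image.Exif()
    exif[0x010E] = MARKER.decode()  # 0x010E = ImageDescription tag
    img.save(path, exif=exif)

def leaked(edited_path: str) -> bool:
    """Return True if the exported file still contains the canary bytes anywhere."""
    with open(edited_path, "rb") as f:
        return MARKER in f.read()

# make_test_image("canary.jpg")
# ...open canary.jpg in the editor, export it as edited.jpg, then:
# print("metadata leaked" if leaked("edited.jpg") else "metadata stripped")
```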