1) The code, for now, runs locally. This is good. To avoid the possibility of the code being tampered with at a later date (for example, it could be modified to send copies of the image to a server), download the webpage and use the saved copy, not the live one.
2) Do not use the blur functionality. For maximum privacy, it should be removed from the app entirely: there are _a lot_ of forensic methods for reversing blurring.
3) Be wary of other things in the photograph that might identify someone: reflections, shadows, and so on.
4) Really a subset of 2 and 3, but be aware that blocking out faces is often not sufficient to anonymise the subject of the photo. Identifying marks like tattoos, or even something as basic as the shoes they are wearing, can be used to identify the target.
Free online metadata viewer http://exif.regex.info
Powered by FOSS (Perl-based) https://exiftool.org
This bug (closed as Expected Behavior) has a demonstration: https://gitlab.gnome.org/GNOME/gimp/-/issues/4487
Edit: to be clear, I meant this as commentary on the technology, not the people making the mistakes.
All the program has to do is scrub all EXIF data, offer a censor box/brush that is 100% black, and re-encode the image so no unneeded data remains.
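To illustrate the metadata-scrubbing step, here is a minimal sketch of dropping EXIF and other metadata segments from a JPEG byte stream using only the Python standard library. `strip_exif` is a hypothetical helper, not part of the app; it keeps the APP0 (JFIF) header so viewers still recognise the file, and a real tool like exiftool handles far more edge cases (thumbnails, XMP packets inside other segments, and so on).

```python
import struct

def strip_exif(jpeg_bytes: bytes) -> bytes:
    """Remove APP1-APPF (EXIF/XMP) and COM segments from a JPEG stream."""
    assert jpeg_bytes[:2] == b"\xff\xd8", "not a JPEG (missing SOI marker)"
    out = bytearray(b"\xff\xd8")
    i = 2
    while i < len(jpeg_bytes):
        if jpeg_bytes[i] != 0xFF:
            # Marker stream broken / entropy-coded data: copy the rest as-is.
            out += jpeg_bytes[i:]
            break
        marker = jpeg_bytes[i + 1]
        if marker == 0xDA:
            # SOS: compressed image data follows until EOF; copy verbatim.
            out += jpeg_bytes[i:]
            break
        # Segment length field covers itself (2 bytes) plus the payload.
        length = struct.unpack(">H", jpeg_bytes[i + 2:i + 4])[0]
        segment = jpeg_bytes[i:i + 2 + length]
        # Drop APP1-APPF (0xE1-0xEF) and COM (0xFE); keep APP0 and the rest.
        if not (0xE1 <= marker <= 0xEF or marker == 0xFE):
            out += segment
        i += 2 + length
    return bytes(out)
```

The opaque censor box and the re-encode are then a matter of painting the pixels before compression ever happens, so the covered data never reaches the output file at all, unlike blurring, which leaks information.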