zlacker

[parent] [thread] 3 comments
1. macrol+(OP)[view] [source] 2023-04-01 08:19:55
Seems it creates an "iris code" from the iris scan. Then they want to develop some new hash function to hash the iris code to get an iris hash.

This iris hash should then be stored in some decentralised database like a blockchain or something.

https://worldcoin.org/blog/developers/privacy-deep-dive
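
A rough sketch of what that pipeline might look like (all the names here are my guesses, not from the blog post; SHA-256 is just a stand-in for whatever purpose-built hash they develop, which presumably has to tolerate the fuzziness of repeat scans, since two scans of the same iris never produce bit-identical codes):

    import hashlib

    def iris_code_from_scan(scan: bytes) -> bytes:
        """Placeholder for the proprietary feature extraction that turns
        a raw scan into a fixed-length bit vector (the "iris code")."""
        raise NotImplementedError

    def iris_hash(iris_code: bytes) -> str:
        # Stand-in: a real scheme would need a fuzzy/locality-sensitive
        # hash so slightly different scans of the same iris still match.
        return hashlib.sha256(iris_code).hexdigest()

    def register(registry: set[str], iris_code: bytes) -> bool:
        """Store only the hash; reject duplicate enrollments."""
        h = iris_hash(iris_code)
        if h in registry:
            return False  # this iris has already signed up
        registry.add(h)
        return True

The point of storing only the hash would be deduplication (one sign-up per iris) without keeping the iris code itself in the decentralised database.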

replies(1): >>moonch+Oe
2. moonch+Oe[view] [source] 2023-04-01 11:15:14
>>macrol+(OP)
So if you damage your iris, you're fucked?
replies(1): >>aramac+mm
3. aramac+mm[view] [source] [discussion] 2023-04-01 12:31:34
>>moonch+Oe
Yes. That is how biometric data works; if my finger gets cut off, I can no longer unlock my iPhone with Touch ID.

Wonder if they have a backup ID method of some sort, or if it’s solely iris data.

replies(1): >>Phemis+Bn
4. Phemis+Bn[view] [source] [discussion] 2023-04-01 12:41:12
>>aramac+mm
It is likely recording the whole time, e.g. also while the person is moving their iris into view of the camera.

So I would expect them to have captured a seriously large AND deduplicated (through the iris) database of faces as well.

Obviously, this is an extremely valuable database in its own right, because it contains biometric data on a ton of people from "non-restricted" countries and may well solve some of the systemic biases current biometric systems show due to the lack of a representative dataset to train on.

These systemic biases are currently one of the major arguments _against_ the use of biometric systems at an even bigger scale, so solving them would take away a big tool in the privacy-rights activists' toolbox.

Are biases still systemic if at some point a significant number of people from every geographical location on the planet have been included in the training set?

I can imagine that arguing _for_ this position at the very least becomes a lot more complex.
