Ahmedahmed5000
Vorbeck
AI could do good and a lot of harm. How do we prevent the latter?
Madows and some rural, traditional indhoyars be too robust. Only Euro HGs could compare.
Serena and Michelle have robust bodies, but their faces still look pretty womanly to me. Clearly the software is biased towards Caucasian-leaning faces.
> While I can see she's a woman, this is a very robust face compared to delicate little cadaan women:
> [two attached photos]

wlhi she looks like a handsome man in the second pic, very unfortunate
> They have face recognition at airports now. I wonder if madows are having issues with those.

These facial recognition technologies are racist: NIST testing found higher false-positive rates for Black people (including West Africans and East Africans) and for East Asians, with the biggest divergence in error rate for Black women.
A good example of how human bias carries over: these algorithms and their data are created and handled by people, so an imbalance in the training data used for these neural network engines shows up in the results. And then there is the issue of darker skin reflecting less light, which makes it harder for the system to process facial features properly at night. They certainly need to better fine-tune the algorithms.
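The kind of skew described above is exactly what audits like NIST's quantify: the false match rate (impostor pairs wrongly accepted as the same person) computed separately per demographic group. Here's a minimal sketch of that measurement; the threshold, scores, and group names are all made up for illustration, real audits use millions of comparisons per group.

```python
# Toy audit: per-group false match rate at a fixed similarity threshold.
# All data below is invented for illustration only.

THRESHOLD = 0.6  # hypothetical cutoff: scores at or above this count as a "match"

# (similarity_score, demographic_group) for impostor pairs,
# i.e. comparisons between photos of two *different* people.
impostor_pairs = [
    (0.72, "group_a"), (0.41, "group_a"), (0.55, "group_a"), (0.30, "group_a"),
    (0.65, "group_b"), (0.68, "group_b"), (0.44, "group_b"), (0.71, "group_b"),
]

def false_match_rate(pairs, group, threshold=THRESHOLD):
    """Fraction of impostor pairs in `group` wrongly scored as a match."""
    scores = [s for s, g in pairs if g == group]
    if not scores:
        return 0.0
    return sum(s >= threshold for s in scores) / len(scores)

for group in ("group_a", "group_b"):
    print(group, false_match_rate(impostor_pairs, group))
# group_a: 1 of 4 impostor pairs accepted -> 0.25
# group_b: 3 of 4 impostor pairs accepted -> 0.75
```

A gap like 0.25 vs 0.75 at the same threshold is what "higher false-positive rates for some groups" means in practice: the one global threshold that works for one group over-matches another.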
> They have face recognition at airports now. I wonder if madows are having issues with those.

One of my brothers gets bothered at airports more than usual, I suspect it's cause he has a huge Arab-like schnoz
> One of my brothers gets bothered at airports more than usual, I suspect it's cause he has a huge Arab-like schnoz

Ah, so the biometric face recognition doesn't recognize his face?
> Ah, so the biometric face recognition doesn't recognize his face?

Nah it does, but I wonder if the software(s) have a bias against brown hook-nosed people at airports, cause we all have obvious Muslim names so that can't be it.
Maybe it's just prejudiced TSA agents. They do "thick hair pat-downs" on Black ppl cause apparently their full-body scanners can't see inside kinky hair.
How airport scanners discriminate against passengers of color (www.vox.com): "Full-body scanners often have trouble reading thick hair and certain head coverings, contributing to racist profiling."
> Still not as accurate from a distance, I wager. Or probably in general. Just recently a saxiib of mine got stopped at a Dubai metro station and asked by a CID plainclothes officer to join him in a back office. There the officer showed him metro camera footage stills of him from a few days prior and asked if that was him, to which my friend said yes. The officer asked for his ID and such, and a little while after apologized for the inconvenience and said he could go. My friend asked for details, but the officer would only say there was a guy wanted for something classified and my friend resembled him. Then the same thing happened to my friend again at another station (lol). If mix-ups like that are still a seemingly regular thing in late 2021, then this stuff is clearly not perfect.

No, it isn't perfect yet at all. I doubt it will ever be 100% perfect, but it's only a matter of time before face recognition is everywhere and it picks up on nearly everyone's face. Soon we won't even need to carry around documents; we might just keep them for 'just in case' situations.