AI mistakes Serena Williams & Michelle Obama for men.

Shimbiris

بىَر كَىَل إيه عآنه له
VIP
Madows (Blacks) and some rural, traditional indhoyars (East Asians) be too robust. Only Euro HGs could compare.
 

Basra

LOVE is a product of Doqoniimo (foolishness) mixed with lust
Let Them Eat Cake
VIP

AI could do a lot of good and a lot of harm. How do we prevent the latter?


whispers: and so do their respective husbands


 
AI depends on good data. The problem with Google has been that their data is mostly of white people. Also, the image recognition technology doesn't work well with darker skin, so it's going to make more errors with darker people.
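The point about skewed data can be made concrete with a simple audit: compare the model's error rate per skin-tone group and the bias shows up as a gap. A minimal sketch with made-up labels and group names (not any real system's data):

```python
from collections import defaultdict

def error_rate_by_group(records):
    """Compute the misclassification rate per demographic group.

    records: list of (group, predicted_label, true_label) tuples.
    Returns {group: error_rate}.
    """
    errors = defaultdict(int)
    totals = defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        if predicted != actual:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Hypothetical audit data: (group, model's gender guess, actual gender)
records = [
    ("lighter-skin", "woman", "woman"),
    ("lighter-skin", "woman", "woman"),
    ("lighter-skin", "man", "man"),
    ("lighter-skin", "man", "man"),
    ("darker-skin", "man", "woman"),   # misclassification, as in the OP
    ("darker-skin", "woman", "woman"),
    ("darker-skin", "man", "man"),
    ("darker-skin", "man", "woman"),   # another miss
]
print(error_rate_by_group(records))
# → {'lighter-skin': 0.0, 'darker-skin': 0.5}
```

The gap between the two groups' error rates is exactly the bias signal audits like NIST's measure, just on far larger datasets.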
 

Shimbiris

بىَر كَىَل إيه عآنه له
VIP
Serena and Michelle have robust bodies, but their faces still look pretty womanly to me. Clearly the software is biased towards Caucasian-leaning faces.

While I can see she's a woman, this is a very robust face compared to delicate little cadaan (white) women:

(Attached: photos of Serena Williams, including at the 2013 US Open)
 
These facial recognition technologies are racist, showing discrimination with higher false-positive rates for Blacks (including West Africans and East Africans) and East Asians, according to NIST findings. The most divergent error rate is for Black women.

A good example of how human bias carries over: these algorithms and their data are created and handled by people, e.g., the imbalance in the training data used for these neural network engines. And then there is the issue of darker skin reflecting less light, so in low-light conditions it is harder to process the facial features properly. They certainly need to fine-tune the algorithms better.
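One standard way to counteract the training-data imbalance mentioned above is to weight each example inversely to its group's frequency, so under-represented groups contribute equally to the training loss. A minimal sketch (the group names and 6:2 split are illustrative):

```python
from collections import Counter

def inverse_frequency_weights(groups):
    """Return a per-sample weight inversely proportional to the
    sample's group frequency. A perfectly balanced dataset yields
    weight 1.0 for every sample."""
    counts = Counter(groups)
    n, k = len(groups), len(counts)
    # weight = n / (k * count_of_this_group)
    return [n / (k * counts[g]) for g in groups]

# Hypothetical dataset: 6 light-skinned faces for every 2 dark-skinned ones
groups = ["light"] * 6 + ["dark"] * 2
weights = inverse_frequency_weights(groups)
print(weights)
# → light samples get ~0.667 each, dark samples get 2.0 each
```

The weighting doesn't add any information that isn't in the data, of course; collecting more diverse training images fixes the root cause, which is the thread's actual point.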
 
They have face recognition at airports now. I wonder if madows are having issues with those.
 
Even though face recognition makes it easier to travel and I don't have to line up for passport checks, there is something freaky about it for me. When I travelled recently and got off the plane, I didn't even need to take out my passport at check point/smart gate. The face recognition machine merely scanned my face and I was able to get through to the baggage area.
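For what it's worth, those smart gates typically work by turning the camera image into an embedding (a numeric "fingerprint" of the face) and comparing it against the one enrolled from the passport photo, accepting the match only if the similarity clears a threshold. A toy sketch with made-up 4-dimensional vectors (real systems use hundreds of dimensions and tuned thresholds):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means identical
    direction, values near 0 mean unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def gate_decision(live_embedding, passport_embedding, threshold=0.8):
    """Open the gate only if the live face matches the enrolled one."""
    return cosine_similarity(live_embedding, passport_embedding) >= threshold

# Made-up embeddings for illustration
passport = [0.9, 0.1, 0.3, 0.5]
same_person = [0.85, 0.15, 0.35, 0.45]
stranger = [0.1, 0.9, 0.6, 0.0]
print(gate_decision(same_person, passport))  # → True
print(gate_decision(stranger, passport))     # → False
```

The threshold is the whole game: set it too low and strangers get through (false matches); set it too high and legitimate travellers get rejected, and the earlier NIST point is that the error rates on either side of that threshold differ by demographic group.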

Am I the only one that finds it rather uncanny?
 

Jacko

VIP
Ah so the biometric face recognition doesn't recognize his face?
Nah it does, but I wonder if the software(s) have a bias against brown hook-nosed people at airports, cause we all have obvious Muslim names, so that can't be it. :cosbyhmm:

Maybe it's just prejudiced TSA agents. They do have racist-seeming policies, like the "thick hair pat-downs" on black ppl, cause apparently their full-body scanners can't see through kinky hair.

 

 

Shimbiris

بىَر كَىَل إيه عآنه له
VIP

Still not as accurate from a distance, I wager. Or probably in general. Just recently a saxiib (friend) of mine got stopped at a Dubai metro station and asked by a CID plainclothes officer to join him in a back office. There the officer showed him stills from metro camera footage of him from a few days prior and asked if that was him, to which my friend said yes. The officer asked for his ID and such, and then a little while later apologized for the inconvenience and said he could go. My friend asked for details, but the officer only said there was a guy wanted for something classified and my friend resembled him. Then the same thing happened to my friend again at another station (lol). If mix-ups like that are still a seemingly regular thing in late 2021, then this stuff is clearly not perfect.
 
No, it isn't perfect yet at all. I doubt it will ever be 100% perfect, but it's only a matter of time before face recognition is everywhere and picks up nearly everyone's face. Soon we won't even need to carry around documents; we might just keep them for "just in case" situations.
 
