I was training a model on my own photos for a month before I saw the problem

I kept feeding it pictures from my phone, about 200 total, but every output looked weirdly similar. I couldn't figure out why until a friend pointed out they all had the same lighting from my living room lamp. I realized I'd basically taught it to only understand one specific time of day in one room. Has anyone else messed up their training data by not checking for something simple like that?
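
If anyone else wants to sanity-check a photo set before training, here's a rough sketch of the kind of check that would have caught my lamp problem. It assumes Pillow is installed and a "photos/" folder of JPEGs (both placeholders for your own setup), and the red-minus-blue warmth number is a crude proxy I made up for this, not a real color-temperature measurement:

    from pathlib import Path
    from statistics import mean, stdev
    from PIL import Image

    def image_stats(path):
        # Downsample hard; tiny thumbnails are enough for global lighting stats.
        img = Image.open(path).convert("RGB").resize((64, 64))
        pixels = list(img.getdata())
        r = mean(p[0] for p in pixels)
        g = mean(p[1] for p in pixels)
        b = mean(p[2] for p in pixels)
        brightness = (r + g + b) / 3
        warmth = r - b  # crude proxy: positive skews warm, like incandescent lamps
        return brightness, warmth

    # "photos" is a placeholder; point it at your actual training folder.
    stats = [image_stats(p) for p in Path("photos").glob("*.jpg")]
    brightness_vals = [s[0] for s in stats]
    warmth_vals = [s[1] for s in stats]

    # A tight spread across ~200 photos suggests one lighting setup dominates.
    print(f"brightness: mean={mean(brightness_vals):.1f} stdev={stdev(brightness_vals):.1f}")
    print(f"warmth:     mean={mean(warmth_vals):.1f} stdev={stdev(warmth_vals):.1f}")

A low stdev on both numbers is the "everything shot under one lamp" signature: my set would presumably have shown a warm mean with almost no spread.
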
3 comments

johnr73 · 23d ago
Ever think about how many other hidden patterns we're accidentally teaching these things? Like if all your photos are from chest height, does the model just think that's the only way to see the world? Makes you wonder what else we're missing because it just seems normal to us.
xena_fox39 · 23d ago
Oh man, that's a wild point. I read something about how image sets are full of people smiling, so AIs start linking "person" with "showing teeth" in a way that feels creepy. Or how most pictures of "dinner" are taken from above, looking down at the plate, so the whole idea of a meal gets framed from that one angle. We're probably baking in a ton of weird biases without even noticing.
lisak26 · 22d ago · OG Member
Totally! And it's not just smiling, it's like how most "office worker" pictures are people in suits at desks, so AI might not get that a person in jeans coding from a couch is also working. Or how "doctor" images are often older men with stethoscopes, ignoring nurses or younger women. We're feeding it a really narrow slice of life.