AI trained on photos from kids’ entire childhood without their consent
Don't store your personal stuff online. If you want to share stuff, send it directly and encrypt it.
Idk, this kind of feels like victim blaming. Why should you expect your photos to be used in a way so far removed from the original purpose you shared them for? It's like telling people not to leave the house with money on them: you don't expect to be robbed, so why should your entire way of living be changed by it, instead of punishing robbers when that does happen, or in this case, companies that abuse good will?
It's a violation of trust for sure, but users made the decision to post something publicly accessible and actually requested distribution. The lower tech version is putting your phone number on a flier and receiving a prank call. Ultimately it's a consequence of releasing that data to the public, and giving rights to said platform by allowing them to distribute it.
I would also apply it in reverse: if you're a company or artist who created content and put it online, why would you not expect that somebody will download it without paying you? If they can, it should be totally fine.
Let's compare an apple to a car to software... an apple is physical; if you take it without paying, the company has one less apple. Same with a car. With software, that's not the case. You can't touch it, and there is an infinite number of copies to be had.
The Internet is similar to a street, except that thieves can walk on it without anyone knowing or caring what they are doing. So if you leave software or artwork on the street, there's a good chance it will get stolen. Same with the interwebs.
It's all of us, whoever has had an online presence, I'd bet. The depth of what has been done will not come to light for a while.
When you post something online, it's almost as if it's become a public thing, like a newspaper thrown in the street. Take care of your online privacy! 🏴
Kids "easily traceable" from photos used to train AI models, advocates warn.
I mean, that's true, and could be a perfectly legitimate privacy issue, but that seems like an issue independent of training AI models. Like, doing facial recognition and such isn't really new.
Stable Diffusion or similar generative image AI stuff is pretty much the last concern I'd have over a photo of me. I'd be concerned about things like:
Like, I feel that there are very real privacy issues associated with having a massive image database, and that those may have been ignored. It just...seems a little odd that people would ignore all that, and then only have someone write about it when it comes to running an LLM on it, which is pretty limited in actual issues that I'd have.
And all that aside, let's say that someone is worried about someone generating images of 'em with an LLM.
Even if you culled photos of kids from Stable Diffusion's base set, the "someone could generate porn" concern in the article isn't addressed. Someone can build their own model or -- with less training time -- a LoRA for a specific person.
kagis
Here's an entire collection of models and LoRAs trained on a particular actress on Civitai. The Stable Diffusion base model doesn't have them, which is exactly why people went out and built their own. And "actress" alone isn't gonna be every model trained on a particular person, just probably a popular one.
https://civitai.com/tag/actress
4303 models
And that is even before you get to various techniques that start with a base image of a person, do no training on that image at all, and then try to generate surrounding parts of the image using a model.
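For anyone unfamiliar with that last technique: it's usually called inpainting/outpainting, and a rough sketch of the workflow looks like this (pseudocode only; the step names are illustrative, not any specific library's API):

```
base   = load_image("person.jpg")
mask   = everything_except(base, face_region)  # pixels the model may replace
prompt = "..."                                 # attacker-chosen description
# A pretrained diffusion model fills in the masked region; note that
# no training on the base image happens at any point.
result = pretrained_model.inpaint(base, mask, prompt)
```

The point being: removing someone's photos from a training set doesn't stop this, because one ordinary photo is enough input.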
Thank you 🙏 this is an underrated comment
The way I see it, if they're too young to have social media, they're too young to be on social media.
It's real odd when you consider how society is now okay with parents posting pictures of their children openly for the world to see. Yet when the kids start sharing pictures of themselves with friends, it's super dangerous for them.
The sad part is that now private photos are at risk with all the cloud mining and "AI" crap. The idea that, no matter how much I lock down my privacy, simply sending a picture of my kid to their grandma, who will save it to her auto-syncing cloud phone gallery, still feeds that picture to the collective is sickening.
The only way to win is not to play
This is the best summary I could come up with:
Photos of Brazilian kids—sometimes spanning their entire childhood—have been used without their consent to power AI tools, including popular image generators like Stable Diffusion, Human Rights Watch (HRW) warned on Monday.
The dataset does not contain the actual photos but includes image-text pairs derived from 5.85 billion images and captions posted online since 2008.
HRW's report warned that the removed links are "likely to be a significant undercount of the total amount of children’s personal data that exists in LAION-5B."
Han told Wired that she fears that the dataset may still be referencing personal photos of kids "from all over the world."
There is less risk that the Brazilian kids' photos are currently powering AI tools since "all publicly available versions of LAION-5B were taken down" in December, Tyler told Ars.
That decision came out of an "abundance of caution" after a Stanford University report "found links in the dataset pointing to illegal content on the public web," Tyler said, including 3,226 suspected instances of child sexual abuse material.
The original article contains 677 words, the summary contains 169 words. Saved 75%. I'm a bot and I'm open source!
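To make the summary's "links, not photos" point concrete: a LAION-style dataset is essentially a list of URL/caption pairs, so "taking down" a child's image from it only drops the pointer, not the image itself. A minimal stdlib sketch (all entries, URLs, and field names here are made up for illustration; real LAION-5B rows carry more metadata):

```python
from dataclasses import dataclass

@dataclass
class Entry:
    url: str      # link to an image somewhere on the public web
    caption: str  # alt text / caption scraped alongside it

# Hypothetical dataset rows standing in for LAION-style entries.
dataset = [
    Entry("https://example.com/blog/photo1.jpg", "my daughter's 5th birthday"),
    Entry("https://example.com/cars/photo2.jpg", "1998 sedan, low mileage"),
]

# URLs flagged for removal, e.g. by an HRW-style audit.
flagged = {"https://example.com/blog/photo1.jpg"}

# Removing flagged entries deletes pointers from the dataset only;
# the images themselves remain wherever they were originally posted.
dataset = [e for e in dataset if e.url not in flagged]

print(len(dataset))
```

That's why removed links are "likely to be a significant undercount": culling the dataset doesn't touch the photos still sitting on the open web, or copies baked into models already trained.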
That's what I feared, and I removed all my content from Google Photos 6 years ago. Also my spouse's.
Fuck that's so nasty
Where do you think AI gets all of its information?
There's nothing left to do but ban AI. If we can't even agree to this, we are absolutely lost.
Trying to ban AI is like trying to ban math. Or staple Jello to a tree. It just doesn't work that way.
You have a system that steals copyrighted materials, sucks up power, and spits out constantly wrong and occasionally dangerous "facts", something created by people that can be removed from our world by having governments step in and forbid its use, and you think it's like a natural constant of the world?
Go fuck yourself. With a sharp stick. You are part of the problem right now along with the fucking fascist right-wing assholes. Go away.
That's just so wrong-headed. How else do you expect billionaires to monetize every aspect of our lives?
Lol the idea that you need consent to look at someone's publicly posted pictures is laughably wrong.
Viewing is not the same as "use in a commercial enterprise to turn a profit". Only a fool would think those are the same thing.
This. Anyone can view content online.
Training a visual model off those images requires feeding those images into a model, and that is not the terms under which you originally viewed them.
It's why OpenAI is currently facing tons of lawsuits it may legitimately lose in court.
Probably not though, they can just settle and pay a fee. Deep pockets.
You're allowed to videotape in public for profit. Do we consider posting photos online to be public?
Another rubbish hit piece on open source.
It's not, and you don't speak for the free software community.
Even if you're not on social media, you'll probably still have a shadow profile on Google's or Meta's servers. My 13-month-old baby has a library of images searchable in Google Photos and a profile photo in the app. It's convenient, but incredibly creepy.
Yeah, why would you allow this to happen though?
It's not opt-in as far as I'm aware. Just using Google Photos makes it so. I suppose I'm deep enough in the Google ecosystem (well, let's say my wife is not going to move away from it) to be desensitised to how messed up it kind of is.
I was more talking about how other people (i.e. your friends) will take photos of you and post them on social media, or even just keep them in their Google Photos, and Meta/Google will build a shadow profile for you without your consent via facial recognition.
I want to defend that poster, but I can't disagree with you… There is one person responsible, and it's definitely not the child…
Wait until you have photos spanning not only your child but your cousins' children, who are photographed less often. Google can easily match up an infant to the same 10-year-old child. Hell, I can barely do that sometimes and have to use context clues to figure out who the infant was.
To be fair to you, you don't have a photo library of millions of children from infant to teen to train your neurons on.
I scanned a ton of my mom's family photos after she passed, and uploaded them to Google Photos. It's a bit shocking how good it is at guessing the same person at different ages, even 20+ years' difference.