I mean, it looks fine to me too, though if you already know it's generated and go looking, I guess it's possible to meet the lower bar of finding things that are off.
If someone is taking a picture with a sunset in the background, then their face should be in shadow.
Also the straps on the shirt are all sorts of fucky.
That's true, but it could just as well be an Instagram filter? So many photos nowadays are heavily processed.
Besides the lighting, the only thing I could find is the straps on her top, which seem to get a bit mixed up.
True, didn't think about that.
It does if you have seen a lot of AI images, especially SD.
It's cool how quickly the brain can pick out the subtle issues, despite how near-perfect the image is.
True, but only when you expect it. I've seen real pictures with weird lighting before, and if I hadn't known they were real, I would've assumed Photoshop. With some experience you know what to look for, but there have already been plenty of studies showing that AI-generated people can't reliably be distinguished from real people in pictures.
Definitely true
I wonder if we're going to end up with a new field of forensic medicine determining whether media is real based on subtle anatomy/biomechanics details. Even if the person is real, a particular photo or video might not be.
Doesn't look fake to me actually.
https://live.staticflickr.com/7008/13532530895_a52ee219eb.jpg