Accessibility is the only moral use that Generative AI can have nowadays
I don't personally use the generative AIs that create alt text from images, but I've seen a lot of people who do, and I think it's the one use of AI I can't find anything negative about.
The fact is that many people don't know how to write good alt text, so they either leave it out or write something like just the word "photo" or "image", which is arguably worse.
I think if AI had started there, as a tool to help accessibility, it wouldn't have the stigma it has today.
But of course, there are not enough blind people in the world from whom to get absurdly, ridiculously, vulgarly obscene amounts of profit...
(edit: I kind of regret posting this image for reasons described below, but I'll leave it up for context.)
Disabled people aren't all going to agree on everything, especially since they're a pretty diverse group. Some will like gen AI, and some will not.
Whether it could help or not depends on what the disability is.
I'm glad Mary speaks for all disabled people and we can finally put this topic to rest.
Mary doesn't get an opinion because she doesn't represent 100% of people like her!
I don't think that applies in this context, because the person in the image is talking about people who use accessibility as an excuse to keep asking ChatGPT to do their homework, whereas I'm referring specifically to the legitimate use of AI solely to facilitate accessibility. As an example, there are bots on Mastodon that, if you follow them, will reply to any toot you post with an image missing alt text, adding a detailed description of the image.
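Very roughly, a bot like that only needs to watch for posts with undescribed images and reply with a caption. Here's a minimal, untested sketch, assuming the Mastodon.py library; `describe_image()` is a hypothetical placeholder for whichever captioning model the bot actually calls:

```python
# Rough sketch of a Mastodon alt-text helper bot (assumes Mastodon.py).
from mastodon import Mastodon

def describe_image(url: str) -> str:
    # Hypothetical placeholder: pass the image at this URL to a captioning
    # model (local or hosted) and return its description as plain text.
    return "Automatically generated description would go here."

mastodon = Mastodon(
    access_token="BOT_ACCESS_TOKEN",        # assumed bot credentials
    api_base_url="https://example.social",  # assumed home instance
)

# Look through recent posts in the bot's home timeline and find images
# that were uploaded without any alt text.
for status in mastodon.timeline_home(limit=20):
    undescribed = [
        m for m in status["media_attachments"]
        if m["type"] == "image" and not m["description"]
    ]
    if not undescribed:
        continue

    captions = "\n\n".join(describe_image(m["url"]) for m in undescribed)
    # Reply to the toot with the generated description(s).
    mastodon.status_post(
        f"Image description:\n\n{captions}",
        in_reply_to_id=status["id"],
        visibility="unlisted",
    )
```

A real bot would presumably stream mentions or notifications instead of polling the home timeline, and would keep track of posts it has already replied to so it doesn't answer the same toot twice.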
Where is the context about using AI to do their homework?
Yeah, and all it requires is for poorly paid workers in Kenya to be tortured and forced to watch CSAM all day.