There are probably safeguards in place to prevent the creation of CSAM, just like there are for other illegal and offensive things, but determined people work around them.
The AI has to be trained on something first. It has to somehow know what a naked minor looks like. And to do that, well... you need to feed it CSAM.
First of all, not every image of a naked child is CSAM. This has actually been something of a problem with automated CSAM detection systems: they trigger false positives on non-sexual images and get innocent people into trouble.
But also, AI systems can blend multiple elements together. They don't need CSAM training material to create CSAM — just the individual elements, combined into a prompt crafted to produce the image while avoiding any safeguards.
I think it's rash to judge the tone of his writing like that. It can be a struggle to identify and admit one's own flaws, and it's certainly a struggle for most people in the modern era to write elegantly with only a pen and a few sheets of paper.
No.
Some of the worst politicians are young.
Some of the best politicians are old.
Age isn't a problem. Undemocratic systems and bad politics are problems.
GOG is good too.