PowerShell is available for Linux and will run the same modules that have made it such a success on Windows. Want to fire up VMware containers or get a list of VMs? Want to talk to Exchange servers? Azure? AWX? $large-corporate-thing? PowerShell is a very good tool for that, even if it smells very Microsofty.
The Linux version works well. It has some quirks (excessive logging, and an MS repo that needs manual approval, which breaks automatic updates), but aside from those it just works. I have several multi-year scripts that tick away nicely in the background.
Perl is already installed on most Linux machines, and unless you start delving into modules you won't need to install anything else.
Python is more fashionable, but needs installing on the host and environments can get complicated. I don't think it scales as well as Perl, if that's a concern of yours.
Perl is core to most distros and will be there already. Python isn't, and can be quite heavy - plus some of us are still smarting over the major version change breaking everything, and the need for complicated environments.
You're right to be paranoid - the ways of those wanting to take advantage are unrelentingly many and varied. I hope you find a good compromise for your dad.
Better than anything else, IME. My home server hasn't had a fresh install since Debian 8. It's now on 12 and each time I just dist-upgrade.
There's sometimes the odd breakage, but it's a lot less hassle than reinstalling everything. (We use EL at work and that takes months to migrate to new machines.)
I sympathise with your Dad - everyone's had updates go bad, and it's easy to fall back on the "don't fix what ain't broke" mantra. But doing so is willfully ignoring basic computer security. And to be fair, Debian stable is one of the least troublesome things to just let update automatically.
Debian and Ubuntu have the unattended-upgrades package which is designed to take a lot of the sting out of automatic updating. I'd recommend setting that up and you won't have to touch it again.
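If it helps, here's a minimal setup sketch on Debian/Ubuntu (package and file names are the stock ones; exactly which origins you let it upgrade is up to you):

```shell
# Install the package and enable the periodic apt hooks.
sudo apt-get install unattended-upgrades
sudo dpkg-reconfigure -plow unattended-upgrades   # writes /etc/apt/apt.conf.d/20auto-upgrades

# 20auto-upgrades should end up containing:
#   APT::Periodic::Update-Package-Lists "1";
#   APT::Periodic::Unattended-Upgrade "1";
# Fine-tune what gets upgraded (e.g. security-only) in
# /etc/apt/apt.conf.d/50unattended-upgrades.
```

By default it sticks to the official Debian/Ubuntu origins, which is exactly the conservative behaviour you want for a set-and-forget box.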
There's also the crontab way - "apt-get update && apt-get upgrade" at frequencies that suit you. (A check for reboot afterwards is a good idea).
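For the crontab route, a sketch (times are just examples, and the -y matters - without it apt will sit waiting at a prompt that nobody's there to answer):

```shell
# Root crontab entries: refresh package lists and upgrade at 04:00 daily.
0 4 * * * apt-get update -qq && apt-get upgrade -y -qq

# Debian/Ubuntu create this flag file when an upgrade wants a reboot;
# the mail line assumes a working local mail setup - adapt to taste.
30 4 * * * [ -f /var/run/reboot-required ] && echo "reboot needed" | mail -s "reboot required" root
```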
In my experience, the AI bots are absolutely not honouring robots.txt - and there are literally hundreds of unique ones. Everyone and their dog has unleashed AI/LLM harvesters over the past year without much thought for the impact on low-bandwidth sites.
Many of them aren't even identifying themselves as AI bots, but faking human user-agents.
robots.txt does not work. I don't think it ever has - it's an honour system with no penalty for ignoring it.
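For anyone who hasn't looked at one lately: robots.txt is just a plain-text file at the site root politely asking crawlers to stay away. Something like:

```
User-agent: *
Disallow: /private/

User-agent: GPTBot
Disallow: /
```

A well-behaved bot reads that and skips those paths; a badly-behaved one fetches them anyway - which is the whole problem.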
I have a few low-traffic sites hosted at home, and when a crawler takes an interest it can totally flood my connection. I'm using Cloudflare and being incredibly aggressive with my filtering, but so many bots ignore robots.txt and lie about who they are with humanesque UAs that it's having a real impact on my ability to serve the sites to humans.
Over the past year it's got around ten times worse. I woke up this morning to find my connection at a crawl; checking the logs, AmazonBot had been hitting one site 12,000 times an hour - and that's one of the better-behaved bots. There are thousands and thousands of them.
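If you want to see who's hammering you, a quick sketch against a combined-format access log (the sample lines here stand in for your real access.log):

```shell
# A few sample lines in combined log format, standing in for a real access.log.
printf '%s\n' \
  '1.2.3.4 - - [01/Jan/2025:00:00:00 +0000] "GET / HTTP/1.1" 200 10 "-" "AmazonBot"' \
  '1.2.3.4 - - [01/Jan/2025:00:00:01 +0000] "GET /a HTTP/1.1" 200 10 "-" "AmazonBot"' \
  '5.6.7.8 - - [01/Jan/2025:00:00:02 +0000] "GET /b HTTP/1.1" 200 10 "-" "Mozilla/5.0"' > access.log

# The user-agent is the 6th double-quote-delimited field in combined log
# format; count requests per UA, busiest first.
awk -F'"' '{print $6}' access.log | sort | uniq -c | sort -rn
```

Run that over a day's log and the harvesters jump straight out of the top of the list.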
If cookie prompts annoy you (and why wouldn't they? They're complicated, time-wasting prompts caused by terrible, compromised legislation that's led to far more intrusion instead of enforcing the use of browser settings), and you don't care about cookies, then the browser extension "I don't care about cookies" suppresses the vast majority.
Can't say I've noticed much pain beyond what I mentioned - PowerShell just works.