I think another factor is that this LLM was apparently trained much more cheaply than other mainstream models. Or at least that's what I picked up from other forum discussions; I don't care enough to dig deeper.
Yep, that's what I meant. There's an org-transclusion package that aims to reproduce Logseq's behaviour, but editing from the embedded copy and updating the source isn't as seamless.
I also use standalone Zotero for my bibliography, but I wanna move everything into Emacs. There's org-noter, plus bibliography support in org(roam).
But you're absolutely right, it's overwhelming how much one can do, and the learning curve is steep.
I hate wiping, I hate toilet paper