Lol. That's the hardware. Of course it has access to the device hardware. You still need software. All of Google's local AI features use Gemini Nano, which I absolutely, 100% guarantee will not ship with GrapheneOS.
None of those are cutting-edge AI models that could be ripped open and examined if people had access to the files. It's not just an app or something; there are internal trade secrets at risk.
Because it's proprietary software. They have an open-source model based on it, called Gemma, but Gemini Nano is super locked down. There aren't even public APIs for third-party developers to use it through the OS yet.
The screenshot you're showing is scrolled down the page, past "More results". What you're showing comes after all of the actual search results (which, for me, is just the app and no ad).
Google started making the Tensor mobile SoC because Qualcomm (and everyone else) wasn't investing enough in hardware for ML/AI. We're just seeing years of that investment finally culminating now.
This is straight-up tinfoil hat. You really think they architected a whole new chip and had it fabricated just to data-mine what songs you're listening to? You don't think it would be easier to just send that data from Android? Apple, Sony, and everybody else have custom chips for ANC and audio processing; it is in no way a generally solved problem.
It's actually sad that shit comments that don't even make logical sense get upvoted here just "because Google bad".
When are you going to admit you have no idea what you are talking about?
An LLM literally is a "general AI model that powers a variety of tasks".