VCs are starting to partner with private equity to buy up call centers, accounting firms and other "mature companies" to replace their operations with AI
vivendi @programming.dev · 0 Posts · 138 Comments · Joined 3 mo. ago
You do realize how data science works, right? The data is cleaned by a human team before it's fed to a model; it's also normalized and filtered according to set criteria, etc.
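To make the point concrete, here is a minimal sketch of the kind of pre-training cleanup step described above: filtering, normalizing, and deduplicating raw scraped text. Every name here (`clean_corpus`, the blocklist, the thresholds) is hypothetical, not any lab's actual pipeline.

```python
def clean_corpus(records, min_len=20, blocklist=("lorem ipsum",)):
    """Filter and normalize raw scraped text before it reaches a model.

    - normalizes whitespace and case
    - drops near-empty fragments
    - drops entries matching known boilerplate
    - deduplicates exact repeats
    """
    seen = set()
    cleaned = []
    for text in records:
        norm = " ".join(text.split()).lower()   # normalize whitespace/case
        if len(norm) < min_len:                 # drop near-empty junk
            continue
        if any(b in norm for b in blocklist):   # drop known boilerplate
            continue
        if norm in seen:                        # deduplicate exact repeats
            continue
        seen.add(norm)
        cleaned.append(norm)
    return cleaned
```

Real pipelines add fuzzy dedup, language ID, quality classifiers, and human review on top, but the shape is the same: data is curated long before training.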
Also, if you hold some seriously high-value data and I were designing the system, I'd flag your site for more advanced visual retrieval (a multimodal LLM can use a website like a human user via tool use).
If you just want to stop scrapers, use Anubis. Why tf are you moving your own goalposts?
Also, if your website is trash enough that it gets downed by 15,000 requests, you should either hire a proper network engineer or fire yours. Like, wtf man, I made trash-tier WordPress websites that handled orders of magnitude more traffic in 2010.
EDIT: And in ScummVM's case, stop using PHP. Jesus Christ, this isn't 2005.
It's not adapting to change; it's fighting change.
I can't really provide any further insight without finding the damn paper again (academia is cooked), but inference is famously low-cost. This is basically an "average user damage to the environment" comparison: for example, a user chatting with ChatGPT consumes less water than downloading 4K porn (at least according to this particular paper).
As with any science, the statistics vary, and to analyze this rigorously we'd need to sit down and really dig into the data. Which is more than I intended when I made a passing comment lol
According to https://arxiv.org/abs/2405.21015
The absolute most monstrous, energy-guzzling model tested drew 10 MW of power during training.
Most models need less than that, and non-frontier models can even be trained on gaming hardware with comparatively little energy consumption.
That paper, by the way, reports a 2.4x year-over-year increase in model training compute, BUT it doesn't cover DeepSeek, which rocked the Western AI world with a comparatively tiny training cost (2.7M GPU-hours in total).
Some companies offset the environmental damage of model training with renewables and similar schemes, so the actual daily usage cost matters more than the huge one-time cost at the start ("Drop by drop is an ocean formed", a Persian proverb).
This particular graph exists because a lot of people freaked out over "AI draining the oceans"; that's why the original paper (I'll look for it when I have time, I have an exam tomorrow. Fucking higher ed, man) made this graph.
This is actually misleading in the other direction: ChatGPT is a particularly intensive model. You can run a GPT-4o-class model on a mid-to-high-end consumer GPU, which would then land in the ballpark of gaming in terms of environmental impact.
You can also run a cluster of 3090s or 4090s to train or fine-tune a model, which is actually what people do, and that's still in the same range as gaming. (And more productive than an 8-hour WoW grind while chugging a warmed-up Nutella glass as a drink.)
Models like Google's Gemma (NOT Gemini; these are two completely different things) are insanely power-efficient.
Every image has a few color channels/layers. In a natural photograph, the noise patterns in these layers differ from each other; in an AI diffusion output, however, those layers will be uniform.
One thing you can do is overlay noise that resembles features that don't exist (using e.g. Stable Diffusion) inside the color channels of a picture. This makes the AI see features that aren't there.
Nightshade layers a form of feature noise on top of an image as an alpha-inlaid pattern, which makes the image quality look ASS, and it's also defeated if a model is specifically trained to remove Nightshade.
Ultimately this kind of stupid arms-race shit is futile. We need to adopt completely new paradigms for completely new situations.
Fuck with their noise models.
Create a system that generates pseudorandom hostile noise (noise that triggers neural feature detectors) and layer it on top of the image. This will trigger false feature activations.
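The overlay mechanics can be sketched in a few lines, assuming NumPy. Important hedge: actual adversarial perturbations (FGSM, PGD, Nightshade-style attacks) are optimized against a target model's gradients; plain pseudorandom noise like this won't reliably fool anything and only shows the blending step.

```python
import numpy as np

def overlay_hostile_noise(img, strength=4.0, seed=42):
    """Blend pseudorandom noise into every channel of an 8-bit image.

    NOTE: this is a mechanics-only sketch. A real attack would replace
    `noise` with a perturbation optimized to trigger specific feature
    detectors in a target model, not raw Gaussian noise.
    """
    rng = np.random.default_rng(seed)
    noise = rng.normal(0.0, strength, img.shape)
    out = img.astype(float) + noise
    return np.clip(out, 0, 255).astype(np.uint8)
```

A fixed seed keeps the perturbation reproducible; a low `strength` keeps it visually subtle, which is exactly the property Nightshade struggles with (subtle enough to look fine, strong enough to survive retraining).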
China has that massive rate because it manufactures for the US; the US itself is a huge polluter because of its military and luxury consumption, NOT manufacturing.
I will cite the scientific article later when I find it, but essentially you're wrong.
Lol, LMAO even
Have an infographic
It's an S for Summation
It should be a D though, because it fucks students
Yeah, this is on me, I've never done web dev in my life; it's always been low-level shit for me. (The project is a highly networked system with isolates, async/await, futures, etc.)
But damn, man, comparatively JS/TS seems a lot easier ¯\_(ツ)_/¯
Honestly? They're based for being so easy to make
For the record, I'm a native C/Dart/Rust dev, 2+ years deep in a pretty big project full of highly async code. This shit would've been done a year ago if the stack had been web-based instead of 100% native code.
Permanently Deleted
AI would do a better job
Not in Iran. Only some restaurants, mostly old-school ones, carry them now. They also take the bottle back and send it to the factory, where it's cleaned and refilled.
You still get the microplastic bonus tho
This is because autoregressive LLMs operate on high-level "tokens", not characters. There are experimental LLMs that can access byte-level information and correctly answer such questions.
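A toy tokenizer makes the problem obvious. The vocabulary below is made up for illustration; real BPE vocabularies are learned from data and split words differently, but the effect is the same: the model sees a handful of opaque token IDs, not individual letters.

```python
# Hypothetical toy vocabulary; real BPE vocabularies differ.
TOY_VOCAB = {"str": 101, "aw": 102, "berry": 103}

def toy_tokenize(word):
    """Greedy longest-match split against the toy vocabulary."""
    tokens, i = [], 0
    while i < len(word):
        for j in range(len(word), i, -1):
            if word[i:j] in TOY_VOCAB:
                tokens.append(word[i:j])
                i = j
                break
        else:
            tokens.append(word[i])  # unknown character stands alone
            i += 1
    return tokens

word = "strawberry"
tokens = toy_tokenize(word)
# The model receives 3 token IDs for a 10-letter word, so counting
# the letter "r" requires byte/character access the tokens don't expose.
print(tokens)            # ['str', 'aw', 'berry']
print(word.count("r"))   # 3 — trivial at the character level
```

Byte-level models sidestep this by operating on the raw characters directly, at the cost of much longer sequences.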
Also, they don't want to support you, omegalul. Do you really think call centers are paid to give a fuck about you? This is intentional.