Honestly, people being able to get tested, known doses of their various drugs of choice would save a lot of lives and create a lot of opportunity to intervene and help people recover. Making drugs illegal just causes misery and funds crime.
Yeah, and sure he can pass a drug test if you consider:
LSD: 12 hour wash-out time for a blood test
Cocaine: 24 hour wash-out time for a blood test
MDMA: 24-36 hour wash-out time for a blood test
Ketamine: 24 hour wash-out time for a blood test, and it's easy to get a prescription for off-label use to treat depression
So basically, do all the drugs you want on Friday night, and by Monday morning you'll be clean enough for a blood test.
If you compiled some code and then decompiled it, you would get the most efficient version of it ... ?
Sorta. An optimizing compiler will trim dead code that isn't needed, but it will also do things that are more efficient yet make the code harder to understand, like unrolling loops. E.g. you might have some code that says "for numbers 1-100, call some function"; the compiler can look at that and say "let's just go ahead and insert 100 calls to that function with the specific number", so instead of a small loop you'll see a big block of almost-identical function calls.
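A rough sketch of what that looks like in C (do_work() here is just a made-up stand-in for "some function"):

    void do_work(int n);  /* made-up function, assumed defined elsewhere */

    /* What the programmer actually wrote: a small loop. */
    void original(void) {
        for (int i = 1; i <= 100; i++) {
            do_work(i);
        }
    }

    /* Roughly what a decompiler might show you back after the compiler
       unrolled the loop: a long run of nearly identical calls with the
       loop counter baked in as constants. */
    void unrolled(void) {
        do_work(1);
        do_work(2);
        do_work(3);
        /* ... and so on ... */
        do_work(100);
    }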
Other optimizations will similarly obfuscate the original programmer's intent, and things like assertions are meant to be compiled out of production code, so those won't appear in the decompiled version of the sources either.
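For example, in C (a minimal sketch; the divide() function is just made up for illustration), an assert checks its condition in a debug build, but when built with -DNDEBUG, which is the usual setting for release builds, the assert expands to nothing, so no trace of it survives into the binary or a decompilation of it:

    #include <assert.h>

    /* In a debug build this checks the invariant at runtime.
       Compiled with -DNDEBUG, the assert expands to nothing,
       so the check simply isn't in the binary at all. */
    int divide(int a, int b) {
        assert(b != 0);
        return a / b;
    }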
Historically, reverse proxies were invented to manage large numbers of slow connections to application servers, which were relatively resource-intensive. If your application requires N bytes of memory per transaction, those bytes stay pinned in memory for the whole time between the request coming in and the response going out, because the server can't move on to the next request until the client confirms it got the whole page.
A reverse proxy can spool in requests from slow clients and, once they are complete, hand them off to the app servers on the backend. The response is generated and sent back to the reverse proxy, which can slowly spool the response data out to the client while the app server moves on to the next request.
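As a rough illustration (nginx shown here; the backend address and values are just placeholders, not anything from the thread), the buffering directives are what do the spooling:

    # Hedged sketch: nginx as a buffering reverse proxy in front of a
    # hypothetical app server on 127.0.0.1:8000.
    events {}

    http {
        upstream app_backend {
            server 127.0.0.1:8000;   # the relatively expensive app server
        }

        server {
            listen 80;

            location / {
                # Read the whole request from the (possibly slow) client
                # before opening a connection to the backend.
                proxy_request_buffering on;

                # Take the backend's response as fast as it can produce it,
                # then dribble it out to the client while the backend moves
                # on to the next request.
                proxy_buffering on;

                proxy_pass http://app_backend;
            }
        }
    }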
ZFS will let you set up a RAID-like set of small volumes which mirror one larger volume. It takes some setup, but that's the most "elegant" solution in that, once it's configured, you only need to touch it when you add a volume to the system, and otherwise it's just a mounted filesystem that you use.
It does not solve the off-site problem, though: one fire and it's all gone.
Web sites and pages come and go, but the search engine indices are forever. The Internet Archive, for example, used crawl data from Alexa to populate its archive of the internet until Alexa was shut down by Amazon; they do their own crawling now. Google likely has a lot of old internet data in its archives as well.
Before capitalism there was feudalism, along with more basic market economies organized around market towns. Before you get to that level of density (i.e. in purely agrarian or hunter-gatherer societies) we generally see gift economies, which have been the default economic system for the majority of human history.
True, but if all the data was encrypted and the drive then formatted, recovering anything would require physically dismantling the HDD in a clean room. For SSDs, wear leveling makes it hard to fully erase anything, but again, after encrypting and formatting, the cost of the tools needed to get the data back is well above the potential benefit (i.e. there are easier ways to get people's personal info).
Chevron? You mean the oil company previously known as Exxon until they had a little boating accident in Alaska?