Bit of a weird observation: "Seeing a new computing paradigm coming out of Data Science / Observability"
lysdexic @programming.dev
Perhaps I'm being dense and the coffee hasn't kicked in yet, but I fail to see the new computing paradigm mentioned in the title.
From their inception, computers have been used to read sensors, collect their values, and compute on them. A few examples that have been routine for decades:

- Every single consumer-grade laptop has adaptive active cooling, which means spinning up fans and throttling down CPUs when sensors report values over a threshold.
- One of the most basic aspects of programming is checking whether a memory allocation succeeded, and otherwise handling the out-of-memory scenario.
- Updating app state when network connections go up or down is also a very basic feature.
- Retries, jitter, and exponential backoff have become basic features provided by dedicated modules (see the first sketch after this list).
- From the start, Docker provided support for health checks, which is basically an endpoint designed to be probed periodically (second sketch below).
- There are also canary tests to check whether services are reachable and usable.
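To make the "this has existed forever" point concrete, here's a minimal sketch of retry with exponential backoff and full jitter in Python. The function name, parameters, and the blanket `except` are illustrative assumptions, not any particular library's API:

```python
import random
import time

def retry_with_backoff(op, max_attempts=5, base_delay=0.5, max_delay=30.0):
    """Call op(), retrying on failure with exponential backoff and jitter."""
    for attempt in range(max_attempts):
        try:
            return op()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of attempts, surface the error
            # Delay doubles each attempt, capped at max_delay; full jitter
            # keeps simultaneous clients from retrying in lockstep.
            delay = min(max_delay, base_delay * 2 ** attempt)
            time.sleep(random.uniform(0, delay))
```

Usage is just `retry_with_backoff(lambda: flaky_network_call())` — exactly what dedicated retry modules have packaged up for years.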
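And a health check is equally unremarkable: an endpoint that reports liveness so something else can probe it on a schedule. A minimal sketch using only Python's standard library; the port and path are made-up examples:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Report liveness; a real check might verify dependencies too.
        if self.path == "/health":
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"ok")
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    # An orchestrator (Docker, a load balancer, ...) probes this
    # endpoint periodically and acts on the status code.
    HTTPServer(("", 8080), HealthHandler).serve_forever()
```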
These have existed for decades. This stuff has been done in production software since the 90s.
Where's the novelty?