
  • Not for no reason. They made claims, I provided links, they whined about it. They provided zero links backing up their 40-year-old claim that FPGA would replace anything that didn't run away fast enough.

  • There are several write-ups covering it all.

  • They can be a xenophobic, ageist jagoff all they want. I'm not engaging with them anymore. They're the carpenter who thinks a hammer solves all problems, if we pretend they actually did anything with FPGA as their day job.

  • I mean, you're such an absolute know-nothing that it's hilarious. Nice xenophobic bullshit sprinkled in too. Sorry, no university for me, let alone FPGA in university in the 90s. When my friends were in university, they were still spending their time learning Java.

    The world has changed since 30 years ago

    Indeed. And people like me have been there every step of the way. Your ageism is showing.

    and the future of integer operations is in reprogrammable chips

    Yes, I remember hearing this exact sentiment 30 years ago. Right around the time we were hearing (again) how neural networks were going to take over the world. People like you are a dime a dozen and end up learning their lesson through a painfully humbling experience. Good luck with that; I hope you take it for the lesson it is.

    All the benefit of a fab chip

    Except the amount of wasted energy, and the extreme amount of logic necessary to make it actually work. You know. The very fucking problem everybody's working hard to address.

    The very idea that you think all these companies are looking to design and build their own single purpose chips

    The very idea that you haven't kept up with the industry and how many companies have developed their own silicon is laugh-out-loud comedy to me. Hahahaha. TSMC has some news for you.

    You’re only describing how ASIC is used in switches

    Nope, I actually described how they are used in SoCs, not in switching fabrics.

    That’s not how general use computing works in the world anymore, buddy

    Except all those Intel processors I mentioned, those ARM chips in your iPhones and Pixels, the ARM processors in your MacBooks. You know. Real nobodies in the industry.

    It’s never going to be a co-proc in a laptop that can load models and do general inference, or be a useful function for localized NN.

    Intel has news for you; see the sketch at the end of this comment. It's impressive how in touch with "the industry" you pretend to be, and how little you seem to know about actual products actually being sold today.

    Hey, quick question. Does nvidia have FPGAs in their GPUs? No? Hmm. Is the H100 just a huge set of FPGAs? No? Oh, weird. I wonder why, since you, in all your genius, have said that's the way everybody's going. Strange that their entire product roadmap shows zero FPGA on their DPUs, GPUs, or on their soon-to-arrive SoCs. You should call Jensen, I bet he has so much to learn from a know-it-all like you who has some amazing ideas about US universities. Hey, where is it that all these tech startup CEOs went to university?

    Tell you what. Don't bother responding; nothing you've said holds any water or value.
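    Footnote for anyone reading along, since "never going to be a co-proc in a laptop that can load models and do general inference" is already wrong on shipping hardware: here's roughly what using one of those laptop NPUs looks like. A minimal sketch, assuming OpenVINO's Python API and its NPU device plugin on a recent Intel machine; "model.xml" is a placeholder path, not a real model.

    ```python
    import numpy as np
    from openvino.runtime import Core

    core = Core()
    print(core.available_devices)  # e.g. ['CPU', 'GPU', 'NPU'] on recent Intel laptops

    model = core.read_model("model.xml")                     # placeholder IR model
    compiled = core.compile_model(model, device_name="NPU")  # target the co-processor

    x = np.random.rand(1, 3, 224, 224).astype(np.float32)    # dummy input
    result = compiled([x])[compiled.output(0)]               # local, general inference
    ```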

  • I’m assuming you’re a big crypto fan

    Swing and a miss.

    because that’s about all I could say of ASIC in an HPC type of environment to be good for

    Really? Gee, I think switching fabrics might have a thing to tell you. For someone who does this for a living, not knowing the extremely common places ASICs are used is a bit of a shock.

    want a CHEAP solution

    Yeah, I already covered that in my initial comment, thanks for repeating my idea back to me.

    and ASIC is the most short-term

    Literally being attached to the Intel tiles in Sapphire Rapids and beyond. Used in every switch, network card, and millions of other devices. Every accelerator you can list is an ASIC. Shit, I've got a Xilinx Alveo U30 in my basement at home. But yeah, because you can get an FPGA instance in AWS, you think ASICs aren't used. lmao

    e-wastey

    I've got bad news for you about ML as a whole.

    inflexible

    Sometimes the flexibility isn't in the device itself, but in how it's used. Again: if I can do thousands, tens of thousands, or hundreds of thousands of integer operations in a tenth of the power and a tenth of the clock cycles, then load those results into a bank of activation functions that can do the same, and all I have to do is move that data with HBM, perhaps add some cheap ARM cores, and bridge it all into a single SoC product to sell on the open market, then I've created every single modern ARM product that has ML acceleration. And also nvidia's latest products. There's a toy sketch of that dataflow at the end of this comment.

    Whoops.

    When you get a job in the industry

    I've been a hardware engineer for longer than you've been alive, most likely. I built my first FPGA product in the 90s. I strongly suspect you just found this hammer and don't actually know what the market as a whole entails, let alone the long LONG history of all of these things.

    Do look up ASICs in switching, BTW. You might learn something.
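    And since the integer-op point keeps sailing past you, here's the dataflow I described above, in toy form. Pure NumPy and purely illustrative; every name in it is mine, not any vendor's. The int8 multiply-accumulate is the part dedicated silicon does at a fraction of the power; the activation stage is the fixed-function block it feeds.

    ```python
    import numpy as np

    def quantize(x, scale):
        # float -> int8, the cheap currency of inference accelerators
        return np.clip(np.round(x / scale), -128, 127).astype(np.int8)

    def mac_array(x_q, w_q):
        # what the dedicated MAC block does: int8 multiplies, int32 accumulates
        return x_q.astype(np.int32) @ w_q.astype(np.int32)

    def activation_stage(acc, scale):
        # fixed-function stage: rescale back to float, apply ReLU
        return np.maximum(acc.astype(np.float32) * scale, 0.0)

    rng = np.random.default_rng(0)
    x = rng.standard_normal((1, 64)).astype(np.float32)
    w = rng.standard_normal((64, 32)).astype(np.float32)

    sx, sw = float(np.abs(x).max()) / 127, float(np.abs(w).max()) / 127
    y = activation_stage(mac_array(quantize(x, sx), quantize(w, sw)), sx * sw)
    print(y.shape)  # (1, 32): one layer of integer-dominated inference
    ```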

  • Now ask OpenAI to type for you what the drawbacks of FPGA are. Also, the newest slew of chips is using partially charged NAND gates instead of FPGA.

    Almost all ASIC in use right now implements the basic math functions, activations, etc., and the higher-level work happens in more generalized silicon. You cannot get the transistor densities necessary for modern accelerator work in FPGA.
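    To make that split concrete, here's a toy sketch (pure Python, nothing vendor-specific, all names made up): the kernel functions stand in for fixed-function ASIC blocks, and the plain loop stands in for the generalized silicon doing the higher-level work.

    ```python
    import numpy as np

    def matmul_kernel(a, b):
        # stand-in for a hard-wired matrix block: no control flow, just math
        return a @ b

    def relu_kernel(x):
        # stand-in for a fixed activation unit
        return np.maximum(x, 0.0)

    def run_network(x, layers):
        # the "generalized silicon" part: sequencing, control flow, data movement
        for w in layers:
            x = relu_kernel(matmul_kernel(x, w))
        return x

    rng = np.random.default_rng(1)
    layers = [rng.standard_normal((16, 16)).astype(np.float32) for _ in range(3)]
    out = run_network(rng.standard_normal((1, 16)).astype(np.float32), layers)
    ```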

  • Not at all useful for something like running neural networks

    Um. lol What? You may want to do your research here, because you're so far off base I don't think you're even playing the right game.

    There’s a reason why datacenters don’t lease ASIC instances.

    Ok, so you should just go ahead and tell all the ASIC companies then.

    https://www.allaboutcircuits.com/news/intel-and-google-collaborate-on-computing-asic-data-centers/

    https://www.datacenterfrontier.com/servers/article/33005340/closer-look-metas-custom-asic-for-ai-computing

    https://ieeexplore.ieee.org/document/7551392

    Seriously. You realize that the most successful TPUs in the industry are ASICs, right? And that all the "AI" components in your phone are too? What are you even talking about here?

  • Their "investments" in their largest customers are showing up as sales volume when they're essentially giving products away. This coming year has four major companies coming to market with deadly serious competition, and the magic money investment in AI scams is drying up very quickly. So, if I was nvidia, doing what Jensen has been up to with the CEO-to-CEO customer meetings to arrange delivery timelines, and royally screwing over his most reliable channel partners, I'd hope with everything I have that the customers I have keep buying needlessly so the bubble never bursts.

  • Dedicated ASIC is where all the hotness lies. The flexibility of FPGA doesn't seem to overcome its overhead for most users. Not sure if that will change when custom ASIC becomes too expensive again and all the magic-money furnaces run out of bills to burn.

  • "Please buy more of my hardware so nobody finds out how deep in trouble my company is. PLEASE."

  • The only correct format. Least to most specific.

  • This is a foolish response. We aren't going to live anywhere but this planet, and only a moron thinks humanity is leaving it. Truly stupid shit, spoon-fed to the credulous by a billionaire dipshit and a century of science FICTION stories.

    Back to the Zubrin books for you.

  • Nothing we're doing is going to prevent "one big rock" from changing life on Earth. And there's absolutely no possibility of moving humanity anywhere else. Science fiction isn't a reason to support nonsense.

  • ... You don't realize that commercial entities have always built spacecraft for the government, so nothing you're saying has any credibility. You should probably know something about the topic before arguing about it.

  • NASA isn't a company, and they've always paid contractors to make their vehicles. Crucially, no one is doing anything that hasn't been done before.

  • lmao

    Hot staging didn't hot-stage, flight termination failed again, it didn't reach its target altitude, and a bunch of engines flamed out unexpectedly again.

    That's not a success. Not exploding on the pad doesn't make it a success. Stop believing the YouTube simps.

  • Pretty much, yep. What did end up in discovery was an absolute embarrassment to anybody with a shred of dignity. Obviously Musk has none of that, so we're here again.

  • The instant they enter discovery, he's going to have to drop the suit like usual or hand over an awful lot of evidence of interacting with some extremely shitty racist asshats. I'm excited for either outcome.

  • That's my camp as well. But I'm willing to call it shitter when necessary. :D