coyotino [he/him]
Posts
614
Comments
1,520
Joined
2 yr. ago

  • in souls, the 2-handed longswords that are a realistic size are realistically fast when you wield them in 2-handed mode. the ones that are slow are more anime sized.

    baseball bat was not the best metaphor; i went with that because most people (including souls fans) haven't swung an actual sword in their lives.

  • there is a new "alt text" box when i am posting, and I entered that alt text, but it doesn't seem like that alt text is showing up anywhere. Any clues about this?

  • have you swung a melee weapon before in real life, like maybe a baseball bat? Does swinging the bat happen instantaneously, or do you have to wind up the swing for momentum? Would a bigger, heavier bat be faster or slower to swing?

    Souls games have slow melee attacks compared to something like Devil May Cry, but the speed is intended to be more realistic compared to those kinds of speedy action games. Just apply real life logic to it, and it should make more sense. If the weapon you are using is too slow for you, find a smaller, lighter one that would be easier to swing in real life. If starting out as a regular dude and then becoming more powerful is not appealing to you, or if realistic fights are not exciting to you, then maybe the Souls games just aren't your bag.

  • exactly! it's a way to own a complete copy on disc, independent from the servers. I know there are other companies offering that specific thing, but more players in the space is a good thing imo

  • Big article, but a great read! Some key excerpts:

    This isn’t simply the norm of a digital world. It’s unique to AI, and a marked departure from Big Tech’s electricity appetite in the recent past. From 2005 to 2017, the amount of electricity going to data centers remained quite flat thanks to increases in efficiency, despite the construction of armies of new data centers to serve the rise of cloud-based online services, from Facebook to Netflix. In 2017, AI began to change everything. Data centers started getting built with energy-intensive hardware designed for AI, which led them to double their electricity consumption by 2023. The latest reports show that 4.4% of all the energy in the US now goes toward data centers. Given the direction AI is headed—more personalized, able to reason and solve complex problems on our behalf, and everywhere we look—it’s likely that our AI footprint today is the smallest it will ever be. According to new projections published by Lawrence Berkeley National Laboratory in December, by 2028 more than half of the electricity going to data centers will be used for AI. At that point, AI alone could consume as much electricity annually as 22% of all US households.

    Let’s say you’re running a marathon as a charity runner and organizing a fundraiser to support your cause. You ask an AI model 15 questions about the best way to fundraise. Then you make 10 attempts at an image for your flyer before you get one you are happy with, and three attempts at a five-second video to post on Instagram. You’d use about 2.9 kilowatt-hours of electricity—enough to ride over 100 miles on an e-bike (or around 10 miles in the average electric vehicle) or run the microwave for over three and a half hours.

    One can do some very rough math to estimate the energy impact. In February the AI research firm Epoch AI published an estimate of how much energy is used for a single ChatGPT query—an estimate that, as discussed, makes lots of assumptions that can’t be verified. Still, they calculated about 0.3 watt-hours, or 1,080 joules, per message. This falls in between our estimates for the smallest and largest Meta Llama models (and experts we consulted say that if anything, the real number is likely higher, not lower).

    One billion of these every day for a year would mean over 109 gigawatt-hours of electricity, enough to power 10,400 US homes for a year. If we add images and imagine that generating each one requires as much energy as it does with our high-quality image models, it’d mean an additional 35 gigawatt-hours, enough to power another 3,300 homes for a year. This is on top of the energy demands of OpenAI’s other products, like video generators, and that for all the other AI companies and startups.

    But here’s the problem: These estimates don’t capture the near future of how we’ll use AI. In that future, we won’t simply ping AI models with a question or two throughout the day, or have them generate a photo. Instead, leading labs are racing us toward a world where AI “agents” perform tasks for us without our supervising their every move. We will speak to models in voice mode, chat with companions for 2 hours a day, and point our phone cameras at our surroundings in video mode. We will give complex tasks to so-called “reasoning models” that work through tasks logically but have been found to require 43 times more energy for simple problems, or “deep research” models that spend hours creating reports for us. We will have AI models that are “personalized” by training on our data and preferences.

    By 2028, the researchers estimate, the power going to AI-specific purposes will rise to between 165 and 326 terawatt-hours per year. That’s more than all electricity currently used by US data centers for all purposes; it’s enough to power 22% of US households each year. That could generate the same emissions as driving over 300 billion miles—over 1,600 round trips to the sun from Earth.
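
    The arithmetic in those excerpts is easy to sanity-check. A minimal sketch in Python, using only the figures quoted above; the ~10,500 kWh/year figure for an average US household is my own rough assumption, not from the article:

    ```python
    # Rough reproduction of the article's per-query energy arithmetic.
    WH_PER_QUERY = 0.3      # Epoch AI's estimate quoted above (0.3 Wh ~ 1,080 J)
    QUERIES_PER_DAY = 1e9   # "one billion of these every day"
    DAYS = 365

    total_wh = WH_PER_QUERY * QUERIES_PER_DAY * DAYS
    total_gwh = total_wh / 1e9                  # 1 GWh = 1e9 Wh
    print(f"{total_gwh:.1f} GWh per year")      # ~109.5, matching "over 109 gigawatt-hours"

    # Assumption (not from the article): an average US household uses ~10,500 kWh/year.
    HOUSEHOLD_KWH_PER_YEAR = 10_500
    homes = (total_gwh * 1e6) / HOUSEHOLD_KWH_PER_YEAR   # 1 GWh = 1e6 kWh
    print(f"~{homes:,.0f} US homes for a year")           # ~10,400, matching the excerpt
    ```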

  • you have a mental fortitude i cannot imagine. getting all those sewer-tier jokes pure, uncut, while holding down L2+R2 for 30 hours...I thought they banned that sort of thing in the Geneva Conventions

  • i would be happy if we had like, double the users. beyond that it would start to look too much like Reddit, but 2x users would make for some more lively discussions.

  • Just putting this out there: wouldn't a multiplayer-focused game like Borderlands be relatively shielded from the lads out on the high seas? Sure, fitgirl could hook you up with the single-player campaign, but who tf wants to play Borderlands solo?

  • "as long as people spend less money on games overall things will be fine!" Easy to say when you're retired from the industry. I don't think anyone in the industry would appreciate the implications of that...

  • That is a fair point! That could be neat. Still not worth the environmental cost of using this technology, but interesting in a vacuum!

  • developers have been working on this, but it doesn't scale to games in the way you might think. For one, games have to communicate with data centers to run the LLMs, so we would still have to deal with the lag of data transmission and processing. The other problem is that, in general, the AI just isn't very good. ChatGPT has all the hype because it is very convincing, but it does not actually know what it is talking about. Go ask ChatGPT to add up 5 multi-digit numbers and watch it fail at a task your pocket calculator can complete in seconds. All these LLMs are doing is taking your input and spitting out a response that sounds correct based on how people usually respond to that input. In the context of a game, this means that any dynamic conversation you might have with an NPC would go flying off the rails in ways that would make the game feel broken and/or unfinished. Go watch the video in the linked article and make your own judgment.
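
    If you want to try that comparison yourself, here's a quick sketch. It assumes the OpenAI Python SDK and an API key; the model name is just an illustrative example:

    ```python
    from openai import OpenAI  # assumes the `openai` package and an API key are set up

    # Exact addition is trivial and instant in ordinary code...
    numbers = [48213, 90417, 55308, 12764, 83951]
    print(sum(numbers))  # 290653 -- exact, no data-center round trip needed

    # ...while an LLM only produces text that *sounds* like an answer.
    client = OpenAI()
    reply = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[{"role": "user", "content": f"Add these numbers: {numbers}"}],
    )
    print(reply.choices[0].message.content)  # plausible-sounding, not guaranteed to be 290653
    ```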

  • Honestly, Jiménez Zorrilla's article linked at the start of this interview seems way more interesting than the interview itself. Sadly, the evidence seems sufficient to say that shrimp have sentience akin to that of lobsters, and yet people still feel comfortable boiling lobsters alive. So if someone is comfortable eating pig (which is very common), it is hard to imagine that person ever caring about the well-being of a shrimp in their lifetime. At least vegetarians and vegans avoid shrimp along with everything else, i guess. poor little buddies.

  • So now the US president is openly threatening US citizens with denied entry at the border? Am I getting that right?

  • makes sense. Epic immediately started offering in-app store software, so other companies could implement Fortnite-like stores into their own apps in a way that bypasses Apple's payment system. It's plain to see that Apple will do everything they can to stop that from happening. Services are about 25% of Apple's revenue but carry much higher margins than the hardware, which means that if they lose most of that revenue stream, their gross profit is almost cut in half (rough math below). Combine that with how much tariffs are going to cut into iPhone revenue, and now this is more like an existential fight for Apple.
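
    To put ballpark numbers on that: a sketch using approximate revenue shares and gross margins from Apple's public filings. The exact percentages are my own rough assumptions, not from the linked article:

    ```python
    # Illustrative only: rough revenue shares and gross margins for Apple.
    services_share, products_share = 0.25, 0.75    # share of total revenue
    services_margin, products_margin = 0.74, 0.37  # approximate gross margins

    profit_before = services_share * services_margin + products_share * products_margin
    profit_after = products_share * products_margin  # services revenue (mostly) gone

    print(f"gross profit falls by about {1 - profit_after / profit_before:.0%}")  # ~40%
    ```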

  • I need to look Ella Purnell up

    Yellowjackets is the big one! Sweetpea is a recent one that was pretty solid. She also voices Jinx in Arcane.

  • tbh i feel like that was less than a year ago, so this turnaround for Season 2 feels fast to me. But maybe that's because i've still been seeing Walton Goggins and Ella Purnell in everything.

  • interesting to discover that MAGA does have a line for what they are willing to accept. Up until this moment I truly thought he could say or do whatever he wanted and MAGA would drink it down like castor oil. Also fascinating that Trump has brushed up against this line already, not even 6 months into this term. We are in for a long 4+ years...

  • Jokes and Humor @beehaw.org

    they NEED you

    Gaming @beehaw.org

    Warhammer 40,000: Space Marine 2 is a Russian game by Saber Interactive. Should gamers care if they care about Ukraine?

    Jokes and Humor @beehaw.org

    You shoulda gave me the window seat, Lynda...

    Gaming @beehaw.org

    passing on treasures

    Jokes and Humor @beehaw.org

    one day...

    Gaming @beehaw.org

    have we checked on them lately?

    Gaming @beehaw.org

    is this true?

    Jokes and Humor @beehaw.org

    boil em. mash em. stick em in a...

    Gaming @beehaw.org

    did you hear that Dr Disrespect tried to make a comeback the other day with a Deadlock stream?

    Gaming @beehaw.org

    Palworld will not change to free to play model, dev claims

    Jokes and Humor @beehaw.org

    i know i left it around here somewhere...

    Jokes and Humor @beehaw.org

    Happy Friday!

    U.S. News @beehaw.org

    USPS' long-awaited new mail truck makes its debut to rave reviews from carriers

    Gaming @beehaw.org

    Tony Hawk claims he is ‘talking to Activision again’

    Jokes and Humor @beehaw.org

    surprise!

    Jokes and Humor @beehaw.org

    know the difference

    Gaming @beehaw.org

    Sony to Start Selling Refurbished PS5 Consoles

    Jokes and Humor @beehaw.org

    high octane addiction

    Jokes and Humor @beehaw.org

    ehheheheheh

    Entertainment @beehaw.org

    Now YOU could own the Iron Throne or Jon Snow's sword from 'Game of Thrones'