Is it over for Apex?
The first game was named Battlefield 1942, so technically there hasn't been a "1" in the series before this :) It came out in 2016 so it's not really new, but I bought it last year and played it on Linux for a few hours with friends, and it still has an active player base.
Have they stated that they're going to support Linux or at least Proton/Wine? I did a quick search on the game's Steam forum and it sounded like it doesn't work currently.
Ahh, now I get it :P
Nope, Norwegian company until they were bought by Chinese investors a few years ago. They did have a lot of developers in Sweden and Poland though.
My last four employers have used desktop Linux to some extent:
- Ericsson (Swedish telecoms), default was to have a Windows laptop with X server (Citrix?) but a few of us were lucky enough to get a Linux laptop.
- Vector (German automotive), Linux dev. environment in a VM on Windows laptops.
- Opera Software (Norwegian web browser), first day I was given a stack of components and told to assemble my PC and then install my Linux distribution of choice.
- And a smaller company, which shall remain unnamed, also used Windows laptops with Linux dev. env. in VM.
Sure most of it was on top of Windows, but if you fullscreen it you can barely tell the difference :)
It sounds like bed adhesion might have gotten worse, perhaps you've touched the print surface with your fingers while removing prints? You could try removing the plate and washing it with warm water and soap. Some people use IPA, but if you do then you need to make sure you really wipe it clean before it evaporates, otherwise the dissolved fats will stay on the bed. If your bed has some kind of anti-stick coating, I think there's also a risk that you damage it if you use stronger solvents.
As for warping in general, it could be an indication that your flow rate is exceeding your hotend's melting capacity. If you have an all-metal hotend you could try printing at a higher temp; if not, try reducing print speed instead.
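To put rough numbers on that, volumetric flow is approximately layer height × line width × print speed. A minimal sketch, where the example values are just illustrative assumptions:

```python
# Rough volumetric flow estimate: layer_height * line_width * print_speed.
# The example numbers below are assumptions for illustration, not recommendations.

def volumetric_flow(layer_height_mm: float, line_width_mm: float, speed_mm_s: float) -> float:
    """Approximate volumetric flow in mm^3/s."""
    return layer_height_mm * line_width_mm * speed_mm_s

flow = volumetric_flow(layer_height_mm=0.2, line_width_mm=0.4, speed_mm_s=150)
print(f"~{flow:.1f} mm^3/s")  # ~12 mm^3/s, around the limit often quoted for basic hotends
```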
My first couple of computers had AmigaOS and even from the start Windows felt like complete garbage in comparison, but eventually I had to buy a PC to keep up with the times. After that I kept looking for alternative OSes, and tried dual booting Linux, but kept going back to Windows since all the programs and hardware I needed to use required it. When I finally decided to go full-time Linux, some time between 2005 and 2010, it was because I felt like I was just wasting my life in front of the computer every day. With Windows it was too easy to fire up some game when I had nothing else to do, and at that time there were barely any games for Linux, so it removed that temptation. But that has of course changed now and pretty much all Windows games work equally well on Linux :)
Ahh, I thought you meant you had a 0.2mm nozzle, but now I see you probably meant layer height.
Moisture absorption is rarely a problem with PLA, but hopefully dehydration won't hurt, as long as you don't accidentally overheat it so that it deforms. I've left rolls of PLA out in the open for 6 months without noticing any deterioration. Both your filaments used to print fine, and then the oozing spontaneously started with both of them?
2mm retraction should be more than enough for a direct drive extruder.
The only certification I have is from the Kansas City Barbeque Society, allowing me to act as a judge in BBQ competitions.
Things are probably different nowadays, but at least 15-25 years ago you could just apply for IT jobs and if someone lied about their skills it would hopefully show during the technical interviews. I don't know if that counts as getting in very early.
What filament and other slicer settings? Could be too hot. Could be retraction settings. Did the oozing start when you switched nozzles? If it's a cheap Amazon nozzle it might be faulty and have a different diameter than advertised. Did you follow the correct procedure with hot tightening when switching nozzles? If not, you might have got molten filament in between the nozzle and the heat break.
Easiest GUI toolkit I've used was NiceGUI. The end result is a web app, but the Python code you write is extremely simple, and it felt very logical to me.
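For reference, a minimal example following NiceGUI's usual hello-world pattern looks roughly like this:

```python
from nicegui import ui

# A label and a button that pops up a notification when clicked.
ui.label('Hello from NiceGUI')
ui.button('Click me', on_click=lambda: ui.notify('Button clicked!'))

# Starts a local web server and serves the app in the browser.
ui.run()
```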
Assuming they already own a PC, if someone buys two 3090s for it they'll probably also have to upgrade their PSU, so that might be worth including in the budget. But it's definitely a relatively low-cost way to get more VRAM, and there are people who run 3 or 4 RTX 3090s too.
Probably a joke, since there's always someone commenting that the item isn't food safe whenever a model for something food related is posted.
It's not easy trying to research which 3D printer to buy; there's more clickbait and marketing than impartial reviews out there, and search engines tend to promote the garbage. And without a lot of 3D printing experience, it can be difficult to know if a "review" is paid for by the printer's manufacturer, or just trying to trick you into clicking affiliate links. There are also no consistently good brands if you're looking for a cheap printer, pretty much all of them have produced a few good printers and others that have more flaws. For example, the old Ender 3 and Ender 3 Pro were very good at the time, and Creality built up a lot of brand recognition, but then they switched to low-quality components, seemingly stopped doing quality control, and made a bunch of crap. Now it might be turning around again, as Creality's latest printers are starting to look decent again, although perhaps a little overpriced.
Personally I use this spreadsheet to compare pros and cons of budget printers. It's maintained by a group of users on a 3D printing Discord server, and while one can't know for sure that none of them have ties to, for example, Sovol (currently the most recommended budget brand), they've seemed quite impartial to me so far.
Interesting... I've never had this issue in Fedora KDE, which I run on my PC, but exactly the same thing happens on my wife's PC and the HTPC, which both run Xubuntu. I tried changing the screen saver and power save options, and eventually even uninstalled the screensaver completely. At least in my case it's caused by Xorg DPMS, if I remember correctly. I fixed it a while ago but then it came back on one of the computers at some point. Check out https://wiki.archlinux.org/title/Display_Power_Management_Signaling in case it's the same for you.
The only issue is (layer height), which of course can change from print to print.
Technically not the only issue 😀 it can also change from layer to layer.
Maybe calculate it as (the length of filament being pushed out / the length of the movement) × the filament's cross-section area? I'm on my phone so I can't check right now, but that info should be possible to extract from the gcode iirc.
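Something along these lines, as a simplified sketch (the 1.75 mm filament diameter and the single-move parsing are assumptions):

```python
import math
import re

FILAMENT_DIAMETER_MM = 1.75  # assumption; adjust for your filament
FILAMENT_AREA_MM2 = math.pi * (FILAMENT_DIAMETER_MM / 2) ** 2

def extruded_cross_section(gcode_line: str, prev_x: float, prev_y: float, prev_e: float) -> float:
    """Approximate cross-section area (width * height, mm^2) extruded by one G1 move."""
    coords = dict(re.findall(r"([XYE])([-\d.]+)", gcode_line))
    x = float(coords.get("X", prev_x))
    y = float(coords.get("Y", prev_y))
    e = float(coords.get("E", prev_e))
    move_len = math.hypot(x - prev_x, y - prev_y)
    if move_len == 0:
        return 0.0
    return (e - prev_e) * FILAMENT_AREA_MM2 / move_len

# Example: a 10 mm move that extrudes 0.33 mm of filament
area = extruded_cross_section("G1 X10 Y0 E0.33 F1800", prev_x=0, prev_y=0, prev_e=0)
print(f"~{area:.3f} mm^2")  # divide by line width (or layer height) to get the other one
```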
Never heard of this printer brand, but if it turns out that its mainboard isn't compatible with what you're trying to do then it might be an option to replace it with a cheap stepper driver board from AliExpress. You wouldn't need anything fancy if you're running Klipper on your RPi
I think a 650 W PSU should be enough for a workload of 490 W idle. Please, correct me, if I am wrong.
You mean 490W under load, right? One would hope that your computer uses less than 100W idle, otherwise it's going to get toasty in your room :) I would say this depends on how much cheaper a 650W PSU is, and how likely it is that you'll upgrade your GPU. It really sucks saving up for a ridiculously expensive new GPU and then realizing you also need to fork out an additional €150 to replace your fully functional PSU. On the other hand, going from 650W to 850W might double the cost of the PSU, and it would be a waste of money if you don't buy a high-end GPU in the future. For PSUs, check out https://cultists.network/140/psu-tier-list/ . If you're buying a decent quality unit I wouldn't worry about efficiency loss from running at a lower % of its rated max wattage, I doubt it's going to be enough to be noticeable on your power bill.
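For a rough sanity check, this is the kind of back-of-the-envelope calculation I mean (the 80% headroom factor is just a common rule of thumb, and the wattages are illustrative assumptions):

```python
# Rough PSU sizing sketch. The 0.8 headroom factor is a common rule of thumb,
# not an official spec, and the example wattages are assumptions.

def recommended_psu_watts(gpu_w: float, cpu_w: float, other_w: float = 100.0,
                          headroom: float = 0.8) -> float:
    """Estimated peak system load divided by a headroom factor."""
    return (gpu_w + cpu_w + other_w) / headroom

# Example: ~350 W GPU + ~105 W CPU + ~100 W for the rest of the system
print(f"{recommended_psu_watts(350, 105):.0f} W")  # ~694 W, so an 850 W unit leaves headroom
```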
I've always had Nvidia GPUs and they've worked great for me, though I've stayed with X11 and never bothered with Wayland. If you're conscious about power usage, many cards can be power limited + overclocked to compensate. For example I could limit my old RTX3080 to 200W (it draws up to 350W with stock settings) and with some clock speed adjustments I would only lose about 10% fps in games, which isn't really noticeable if you're still hitting 120+ fps. My current RTX3090 can't go below 300W (stock is 370W) without significant performance loss though.
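If you want to see what limits your card actually accepts, a small read-only sketch using the nvidia-ml-py (pynvml) bindings, assuming they're installed, could look like this:

```python
import pynvml  # from the nvidia-ml-py package; assumed to be installed

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

# Current power limit and the min/max the card accepts, reported in milliwatts.
current_mw = pynvml.nvmlDeviceGetPowerManagementLimit(handle)
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
print(f"current limit: {current_mw / 1000:.0f} W "
      f"(allowed range {min_mw / 1000:.0f}-{max_mw / 1000:.0f} W)")

pynvml.nvmlShutdown()
```

Actually changing the limit needs elevated privileges, so that part is usually done with nvidia-smi instead.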
If you have any interest in running AI stuff, especially LLM (text generation / chat), then get as much VRAM as you possibly can. Unfortunately I discovered local LLMs just after buying the 3080, which was great for games, and realized that 12GB VRAM is not that much. CUDA (i.e. Nvidia GPUs) is still dominant in AI, but ROCm (AMD) is getting more support so you might be able to run some things at least.
Another mistake I made when speccing my PC was to buy 2×16GB RAM. It sounded like a lot at the time, but once again when dealing with LLMs there are models larger than 32GB that I would like to run with partial offloading (splitting work between GPU and CPU, though usually quite slow). Turns out that DDR5 is quite unstable with four sticks, and I don't know if it's my motherboard or the Ryzen CPU which is to blame, but I can't just add 2 more sticks of RAM. I.e. there are 4 slots, but then it would run at 3800MHz instead of the 6200MHz that the individual sticks are rated for. Don't know if Intel mobos can run 4x DDR5 sticks at full speed.
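To give a rough idea of the sizes involved, here's the kind of back-of-the-envelope estimate I mean (the bytes-per-parameter figures are approximations for common quantization levels, and real usage is a bit higher due to context/KV cache):

```python
# Back-of-the-envelope model size: parameters * bytes per parameter.
# The bytes-per-parameter values are approximations for common quantization levels.

BYTES_PER_PARAM = {
    "fp16": 2.0,
    "q8": 1.0,   # ~8-bit quantization
    "q4": 0.5,   # ~4-bit quantization
}

def model_size_gb(params_billion: float, quant: str) -> float:
    return params_billion * BYTES_PER_PARAM[quant]  # 1e9 params * bytes / 1e9 = GB

for quant in ("fp16", "q8", "q4"):
    print(f"70B @ {quant}: ~{model_size_gb(70, quant):.0f} GB")
# Even at ~4 bits a 70B model is ~35 GB, which doesn't fit in 24 GB VRAM or 32 GB RAM
# on its own, hence partial offloading and wanting more RAM.
```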
And a piece of general advice, in case this isn't common knowledge at this point: be wary when trying to find buying advice using search engines. Most of the time it'll only give you low quality "reviews" which are written only to convince readers to click on their affiliate links :( There are still a few sites which actually test the components rather than just AI-generating articles. Personally I look for tier lists compiled by users (like this one for mobos), and when it comes to reviews I tend to trust those which get very technical with component analyses, measurements and multiple benchmarks.
It's not that bad. Of course I've had a few games that didn't work, like CoD:MW2, but nearly all multiplayer games my friends play also work on Linux. The last couple of years we've been playing Apex Legends, Overwatch, WoWs, Dota 2, Helldivers 2, Diablo 4, BF1, BFV, Hell Let Loose, Payday 3, Darktide, Isonzo, Ready or Not, Hunt: Showdown to name a few.
So sad that they didn't fix the AC until the game had been around for years, I would've loved to play it in the beginning when the player skill was more varied. Tried to get into it when Linux was allowed but it seemed like mostly the try-hards were still playing. Had some good games but it was a bit too sweaty for my friends at times.
I tried playing it through Wine during season 2 or 3, the game worked flawlessly but you would get kicked after 1-5 minutes due to missing AC.