
I don't think it's quite the same. Rollback netcode is like lockstep netcode, in that the entire game is simulated locally and only inputs are networked. Since it's still only inputs being networked, network drops (or slow computers) affect everyone, requiring the simulation to slow down. Not just fighting games but RTS games would do this too: if you've ever played Starcraft/Warcraft 3, the game would freeze when a player disconnected.

With rollback/lockstep, there's no need for a server simulation at all. Most games don't work that way: the clients' local simulations are subordinate to the server's simulation, and clients may even be missing information (good for preventing wallhacks). Dropped packets are handled by the server telling the client the exact positions of everything, which leads to warping. Dropped packets and latency also only affect the problem player, rather than pausing everyone's simulation.
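The rollback side of this can be sketched in a few lines. This is a minimal illustration, not any real engine's code: every peer simulates the whole game, missing remote inputs are predicted, and a late input triggers a rewind and resimulation from the frame it belongs to.

```python
def step(state, inputs):
    # Deterministic simulation step; here each player just moves by their input.
    return tuple(pos + inp for pos, inp in zip(state, inputs))

class RollbackSession:
    def __init__(self, initial_state):
        self.states = [initial_state]  # states[f] = state before frame f
        self.inputs = []               # per-frame inputs, None = not yet received

    def advance(self, frame_inputs):
        """Advance one frame, predicting any missing remote input as 0."""
        predicted = tuple(0 if i is None else i for i in frame_inputs)
        self.inputs.append(list(frame_inputs))
        self.states.append(step(self.states[-1], predicted))

    def receive_remote_input(self, frame, player, inp):
        """A late remote input arrived: roll back and resimulate from `frame`."""
        self.inputs[frame][player] = inp
        for f in range(frame, len(self.inputs)):
            concrete = tuple(0 if i is None else i for i in self.inputs[f])
            self.states[f + 1] = step(self.states[f], concrete)

session = RollbackSession(initial_state=(0, 0))
session.advance((1, None))             # local input known, remote predicted as 0
session.advance((1, None))
session.receive_remote_input(0, 1, 5)  # remote input for frame 0 arrives late
print(session.states[-1])              # (2, 5): as if the input had been on time
```

Note that nothing here is authoritative: both peers run this same loop, which is why a peer that can't keep up (or whose inputs stop arriving beyond the rollback window) stalls everyone.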


> there's apparently a checkbox to enable Linux support on EasyAntiCheat - and some don't "check" it

Because support doesn't mean the full feature set. It's like saying the iPad supports Microsoft Excel. At some point it's the same name for different software.

I think especially because it's running under Proton, it's the Windows version of the game whose anti-cheat you're weakening, too. Even Valve's own VAC has issues running under Proton.


Reading comments like that really doesn't motivate me to switch to Linux for gaming with my 2000+ game library.


Variable refresh rate is nice when your output frame rate doesn't match your display's refresh rate, especially as you get into higher refresh rates. If your display is running at 120Hz but you're only outputting 100fps, you cannot fit 100 frames evenly into 120 refreshes: 1/6 of your refreshes will have to repeat other frames, and in an inconsistent manner. This is usually called judder.

Most TVs will not let you set the refresh rate to 100Hz. Even if my computer could run a game at 100fps, without VRR my choices are either lots of judder or lowering the refresh rate to 60Hz. That's a wide range of possible frame rates you're missing out on.
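The 100-on-120 mismatch above can be checked with a quick sketch, assuming plain v-sync where each display refresh shows the most recent completed frame:

```python
from collections import Counter

DISPLAY_HZ = 120   # display refresh rate
CONTENT_FPS = 100  # game's output frame rate

# With v-sync and no VRR, refresh k shows the latest frame finished by then.
shown = [k * CONTENT_FPS // DISPLAY_HZ for k in range(DISPLAY_HZ)]

counts = Counter(shown)
repeats = DISPLAY_HZ - len(counts)  # refreshes that re-show an old frame
print(repeats, "of", DISPLAY_HZ)    # 20 of 120 refreshes are repeats: 1 in 6
```

And because 120/100 reduces to 6/5, the doubled frame recurs every 6 refreshes rather than being spread smoothly, which is exactly the judder pattern.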

V-Sync and console games do this too at 60Hz: if the game can't reach 60fps, cap it at 30fps to prevent the judder that anything between 31 and 59fps would cause. The Steam Deck's display actually does not support VRR; instead, the display itself supports being set to any refresh rate from 40 to 60Hz.

This is also sometimes an issue with movies filmed at 24fps on 60Hz displays: https://www.rtings.com/tv/tests/motion/24p


Modern GPUs aren't constrained by the number of ports. You can use DisplayPort MST (hubs) to output multiple displays off a single port.

But most GPUs are limited to driving 4 displays, except some AMD Eyefinity cards, which can do 6.


Those are mostly reports for the Windows build of Baldur's Gate 3 running through Proton/Wine. He's talking about the newer native Linux build of the game from 3 months ago.

There are a few reports there for the native version of the game: https://www.protondb.com/app/1086940#9GT638Fuyx , with similar Nvidia GPU issues and a fix.


A recent one seems to work with VRR: https://www.reddit.com/r/linux_gaming/comments/1pkdfcm/ugree...

DP1.4 though, so you're still going to need compression.


Yup, this works, but there's as yet no UHBR13.5-or-better input, so you're not getting full HDMI 2.1-equivalent bandwidth. But if you don't mind DSC on a 24-bit-per-pixel signal, you can have an otherwise flawless 4K 120Hz experience.

https://trychen.com/feature/video-bandwidth
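As a rough cross-check of the numbers on that page (raw pixel data only; real link timings add blanking overhead on top of this):

```python
def raw_gbps(width, height, hz, bits_per_pixel):
    """Raw pixel data rate in Gbit/s, ignoring blanking intervals."""
    return width * height * hz * bits_per_pixel / 1e9

print(round(raw_gbps(3840, 2160, 120, 24), 1))  # 23.9 - 4K120 at 8-bit
print(round(raw_gbps(3840, 2160, 120, 30), 1))  # 29.9 - 4K120 at 10-bit HDR
# DP 1.4 HBR3 carries roughly 25.9 Gbit/s of payload and HDMI 2.1 FRL roughly
# 42.7 Gbit/s, so 10-bit 4K120 needs either HDMI 2.1-class bandwidth or DSC.
```

Which is why "full HDMI 2.1 equivalent" matters here: uncompressed 10-bit 4K120 is comfortably over what a DP 1.4-class input can carry without DSC.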


Thanks for this link! I'd heard of the CableMatters dongle but this seems like a better route to go assuming the patches are accepted.


Wow that's awesome work in that post! I've also bought a few things from UGreen now, they're great.


It was still enough of an issue that some developers made BattlEye for anti-cheat 20 years ago for Battlefield games. It's still one of the more popular anti-cheats today.

Other games did similarly: Quake 3 Arena added PunkBuster in a patch, and the competitive third-party Starcraft 1 server iCCup had an "anti-hack client" as a requirement.


Up until a year or two ago, the majority of monitors (and graphics cards) used DisplayPort 1.4 and HDMI 2.1, with HDMI 2.1 (42 Gbps) having more bandwidth than DisplayPort 1.4 (26 Gbps).

That's the case with my relatively new/high-end RTX 4080 and OLED monitor. So until I upgrade both, I use HDMI to be able to drive a 1440p 240Hz 10-bit HDR signal at ~30 Gbps.
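The ~30 Gbps figure is roughly what you'd expect (a quick sketch; blanking overhead on top of the raw pixel data accounts for the gap):

```python
w, h, hz, bpp = 2560, 1440, 240, 30  # 1440p, 240 Hz, 10-bit RGB
raw = w * h * hz * bpp / 1e9
print(round(raw, 1))  # 26.5 Gbit/s of raw pixel data; blanking intervals push
                      # the on-wire signal to roughly 30 Gbit/s, past DP 1.4's
                      # ~25.9 Gbit/s payload but within HDMI 2.1's ~42.7 Gbit/s
```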


I had said I wouldn’t upgrade from my RTX 3080 until I could run “true 4K”.

I finally got 240Hz 4K uncompressed, but it required buying a $1300 Asus OLED monitor and the RTX 5090. It looks amazing though, even with frame gen. Monster Hunter had some particularly breathtaking HDR scenes. I think it uses DisplayPort 2.1? Even finding the cable is difficult; Microcenter didn't have them in April, and the only one that worked was the one that came with the monitor.


Did you ever get those local SSDs as copy-on-write overlays on Linux? I imagine it'd be easier with btrfs support for seeding device: https://btrfs.readthedocs.io/en/latest/Seeding-device.html


Yes, on Linux I was able to move the copy-on-write overlays to use local disks, which is one reason it performs much better (admittedly not a reason that would affect most people).

I am just using dm-snapshot for this -- block device level, no fancy filesystems.


> EAC has the support for Linux

This does not mean it supports the full feature set of EAC on Windows. As an analogy, it's like saying Microsoft Excel supports the iPad. It's true, but without VBA support, there aren't going to be many serious attempts to port more complicated spreadsheets to the iPad.

