Sunshine Encode Performance Windows vs. CachyOS #963
Hi there,

I've been using Sunshine for a while now with my virtualized Windows gaming machine. I recently built a VM on my Proxmox host with the exact same specs and shut the Windows VM down so that the CachyOS VM could use the GPU via PCIe passthrough. Everything went smoothly and worked right out of the box; I just had to figure out how to create a virtual display using an EDID file, but that was easy enough once I knew it was necessary. In case anyone asks, these are the specs of the VM:

The game runs well in 4K with vsync @60 Hz and frame pacing turned on in Moonlight, and it looks quite good when I set the bitrate to 80-100 Mbit/s; below that it gets atrocious and horrible to watch. This all worked very well in Windows. When I use CachyOS on the game server, though, things seem to go south, and I really don't understand what is happening. I did notice one major difference in Sunshine, though:

Greetings,
audiocrush
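For anyone stuck on the virtual-display step mentioned above, here is a hedged sketch of one common approach on Linux: loading a custom EDID at boot so the GPU exposes a display even with nothing attached. The connector name (`DP-1`) and EDID filename are placeholders, not values from the original post; adjust them to your GPU's actual outputs.

```shell
# Assumed sketch: force a virtual display on a headless Linux guest by
# loading a custom EDID. Connector (DP-1) and filename are placeholders.

# Put the EDID binary where the kernel firmware loader can find it:
sudo mkdir -p /usr/lib/firmware/edid
sudo cp my-4k-display.bin /usr/lib/firmware/edid/

# Then add kernel parameters (e.g. to GRUB_CMDLINE_LINUX_DEFAULT in
# /etc/default/grub) to load the EDID and force the connector on:
#   drm.edid_firmware=DP-1:edid/my-4k-display.bin video=DP-1:D
# followed by update-grub (or your distro's equivalent) and a reboot.
```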
Same experience with a 30-series Nvidia card on CachyOS as well. Installed through Octopi; it's not a problem on Windows or Fedora.
No, the MTU size is the same on all interfaces across this subnet.
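For anyone else ruling out MTU mismatches, these standard iproute2/ping checks confirm the MTU actually in effect on each interface and that the path passes full-size frames without fragmentation. The target IP is a placeholder.

```shell
# List each interface together with its configured MTU:
ip -o link show | awk '{print $2, $5}'

# Ping with "don't fragment" set. For a 1500-byte MTU the maximum ICMP
# payload is 1500 - 28 bytes of IP/ICMP headers = 1472; if this fails
# while a smaller size succeeds, something on the path has a smaller MTU.
ping -M do -s 1472 -c 3 192.168.1.50
```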
It doesn't matter anyway; I found the culprit. This was such a damn pain to find, I swear.
When I found it, I remembered I once had a similar problem with a remote desktop server farm on a couple of VMware clusters: the customer had issues with sessions lagging and hanging, and at one point we found out the ring buffers on the NICs were overflowing, so packets were being discarded.
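To check for the same symptom on a Linux host, the standard `ethtool` and `ip` tools expose ring sizes and drop counters. The interface name (`enp1s0`) is a placeholder for your actual uplink; exact counter names vary by driver.

```shell
# Show current ring sizes vs. the hardware's pre-set maximums:
ethtool -g enp1s0

# Driver statistics; steadily rising values in fields like rx_dropped,
# tx_dropped, or *_fifo_errors point at rings filling up under load:
ethtool -S enp1s0 | grep -Ei 'drop|fifo|miss'

# Kernel-level per-interface RX/TX statistics, including drops:
ip -s link show enp1s0
```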
So all that damn time, it was overflowing TX ring buffers in Proxmox...
By default, for some insanely dumb reason, they were set to 256 entries on all my NICs.
I set them to 4096 and now I can crank the bitrate all the way up as far as it goes and I get butt…
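The change described above can be sketched with `ethtool`; the interface name is a placeholder, and 4096 is a common hardware ceiling but not universal, so check the "Pre-set maximums" from `ethtool -g` first. The persistence hook is one assumed approach for Proxmox (Debian with ifupdown2), not the only one.

```shell
# Enlarge the RX and TX rings to 4096 descriptors (takes effect
# immediately, but does not survive a reboot):
ethtool -G enp1s0 rx 4096 tx 4096

# One way to persist it: a post-up hook in /etc/network/interfaces,
# so the rings are resized every time the interface comes up:
#   iface enp1s0 inet manual
#       post-up /usr/sbin/ethtool -G enp1s0 rx 4096 tx 4096
```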