Hello, I wanted to see whether bumblebee is a good way of switching the GPU back to Linux after running a VM. Maybe this will be of use to someone.
I ran all of my benchmarks in Unigine Heaven on OpenGL at 1920x1080, 8xAA, fullscreen, ultra quality, with extreme tessellation. My VM setup is KVM/QEMU with the virt-manager frontend. Sound goes through PulseAudio with an emulated ICH9 device, as the other sound settings gave me an FPS drop.
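For reference, the guest sound device in the libvirt domain XML presumably looked something like this (a minimal sketch based on the ich9/PulseAudio combination mentioned above; the exact audio plumbing depends on your QEMU and libvirt versions):

```xml
<!-- Emulated ICH9 HDA sound card in the guest; playback goes
     through the host's PulseAudio daemon. -->
<sound model='ich9'/>
```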
Here are my hardware and other stats:
    System:    Host: warden Kernel: 4.9.8-1-ARCH x86_64 (64 bit gcc: 6.3.1)
               Desktop: Cinnamon 3.2.8 (Gtk 3.22.8) Distro: Antergos Linux
    Machine:   Device: desktop Mobo: ASUSTeK model: PRIME Z270-A v: Rev 1.xx
               UEFI [Legacy]: American Megatrends v: 0701 date: 12/27/2016
    CPU:       Quad core Intel Core i7-7700K (-HT-MCP-) cache: 8192 KB
               flags: (lm nx sse sse2 sse3 sse4_1 sse4_2 ssse3 vmx) bmips: 33612
               clock speeds: max: 4700 MHz 1: 799 MHz 2: 1458 MHz 3: 1974 MHz
               4: 800 MHz 5: 799 MHz 6: 1298 MHz 7: 854 MHz 8: 799 MHz
    Graphics:  Card-1: Intel Device 5912 bus-ID: 00:02.0
               Card-2: NVIDIA GP104 [GeForce GTX 1070] bus-ID: 01:00.0
               Display Server: X.Org 1.19.1 drivers: nvidia (unloaded: fbdev,vesa,nouveau)
               FAILED: modesetting Resolution: [email protected]
               GLX Renderer: GeForce GTX 1070/PCIe/SSE2 GLX Version: 4.5.0 NVIDIA 375.26
               Direct Rendering: Yes
Unigine Heaven Benchmark (score, higher is better) | run 1 | run 2 | run 3 | run 4 | run 5 | Average
---|---|---|---|---|---|---
Native Windows | 2205 | 2185 | 2185 | 2204 | 2208 | 2197.4
Native Linux, same X server | 2034 | 2051 | 2056 | 2050 | 2040 | 2046.2
Native Linux, separate X server | 2215 | 2197 | 2197 | 2197 | 2217 | 2204.6
Native Linux, iGPU | 118 | 118 | | | | 118.0
Primus, no vsync | 2173 | 2170 | 2172 | 2174 | 2161 | 2170.0
VirtualGL, no compression | 1338 | 1360 | 1357 | 1357 | 1357 | 1353.8
VirtualGL, jpeg | 1341 | 1338 | 1339 | 1339 | 1339 | 1339.2
VirtualGL, proxy | 1345 | 1343 | 1341 | 1340 | 1338 | 1341.4
VirtualGL, rgb | 1371 | 1369 | 1370 | 1369 | 1369 | 1369.6
VirtualGL, xv | 1196 | 1198 | 1198 | 1198 | 1197 | 1197.4
VirtualGL, yuv | 1358 | 1342 | 1341 | 1340 | 1337 | 1343.6
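The Average column is the plain arithmetic mean of the runs; a small shell helper (awk does the arithmetic) recomputes the means from the raw scores:

```shell
#!/bin/sh
# Print the mean of the benchmark scores passed as arguments,
# rounded to one decimal place.
avg() {
  echo "$@" | awk '{ s = 0; for (i = 1; i <= NF; i++) s += $i; printf "%.1f\n", s / NF }'
}

avg 2205 2185 2185 2204 2208   # Native Windows  -> 2197.4
avg 2173 2170 2172 2174 2161   # Primus no vsync -> 2170.0
avg 1371 1369 1370 1369 1369   # VirtualGL rgb   -> 1369.6
```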
A bar graph of this data can be seen here on imgur.
The different compression methods for the VirtualGL bridge are mostly not worth it; only RGB compression gives an actual performance increase over using no compression. None of them comes close to the primus bridge, however.
Using the primus bridge results in enormous performance gains. I was puzzled until I read more about how the X server and bumblebee work (bumblebee renders in a separate X server). So I decided to test Unigine Heaven in a different X server myself, by switching to another tty and running Heaven from there:
    xinit ./heaven -- :1
So it seems that when the iGPU renders the desktop, the dGPU can focus solely on the application being benchmarked.
Running a separate X server for games might work for some people, but it's not my cup of tea. I am really happy with bumblebee and primus, and this will be my setup for a while.
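For completeness, the "Primus, no vsync" runs were presumably launched along these lines (a sketch, assuming the primusrun wrapper from the primus package; vblank_mode=0 is the Mesa environment switch that disables vsync, which primus also honors):

```shell
# Launch Heaven through the primus bridge with vsync disabled;
# without vblank_mode=0, the frame rate is capped to the display's refresh.
vblank_mode=0 primusrun ./heaven
```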
tl;dr: bumblebee and primus good.
edit: formatting galore
Posted 7 years ago · reddit.com/r/VFIO/commen...