Bugzilla – Bug 152026
Using Xgl as DisplayManager_XServer, 3D apps don't detect 3D graphics card, etc.
Last modified: 2006-09-01 16:29:08 UTC
When DISPLAYMANAGER_XSERVER="Xgl", 3D applications don't detect the 3D graphics card, and display a warning. This is independent of Compiz. If you continue past the warning, they work, but colors are not correct. I used Neverputt and Neverball in my tests, using the proprietary nvidia driver.
"Don't detect the 3D graphics card" is not exactly a good description. Please post application, version, and exact output. Neverputt works fine for me on Xgl. Did you specify '-accel glx:pbuffer' in DISPLAYMANAGER_XGL_OPTS?
Created attachment 71492 [details] Tar of screenshot and /etc/sysconfig/displaymanager. In the large screenshot, you can see two instances of Neverball, and a third being launched, which produces the message that no 3D was detected. I made one Neverball window transparent to show that Xgl is running (this one with Compiz). Initially, Neverball appears normal, but once gameplay starts you will notice that the colors are off, i.e., the course is black rather than green. Also, the 3D effects are much cruder than with Neverball on plain Xorg. Another example is Armagetron: when you finish a round, the text is rendered wrong, in that whole text lines are filled with the text color rather than the text being drawn over the 3D background.
Please invoke 'glxinfo', once without Xgl running directly on Xorg, and once on Xgl (just like you called neverball), and post the output. Please use separate attachments instead of tarballs, that eases access a lot. Also, please attach /etc/X11/xorg.conf, /var/log/Xorg.0.log, and /var/log/Xorg.93.log (if present).
Created attachment 73797 [details] glxinfo with XGL running
Created attachment 73798 [details] glxinfo with xorg
Created attachment 73799 [details] xorg.conf
Created attachment 73800 [details] xorg.0.log
Created attachment 73801 [details] Xorg.93.log
I believe the reason is that CheckHardware does test for Direct Rendering in verify.c which Xgl doesn't provide.
Yes, that's correct.
(In reply to comment #9) > I believe the reason is that CheckHardware does test for Direct Rendering in > verify.c which Xgl doesn't provide. In that case this is a bug in neverball. The concepts of direct rendering and hardware acceleration are somewhat orthogonal. Somewhat, because direct rendering w/o hardware acceleration makes no sense. OpenGL provides no way to check whether hardware acceleration is available. Heck, what is hardware acceleration anyway? Does it count if we only have hardware accelerated gouraud shaded triangles without textures? Or are PixelShaders 2.0 an absolute must? *Every* test for hardware accelerated OpenGL is broken by design. Stefan, Marcus, I do not completely get what CheckHardware does, there is no manpage or other docs... Added Gary to CC as the maintainer of neverball. What do you think about this issue?
Matthias, CheckHardware tries to detect whether sound and/or 3D support is available (options "--3D", "--sound"). It's used when 3D and/or sound support is required for the package in the PDB. See e.g. the neverball desktop file /usr/share/applications/neverball.desktop: [...] Exec=CheckHardware --3D neverball CheckHardware was a requirement for our desktop to inform the user about his current 3D/sound configuration, since so many applications in the menu ran so slowly, or sound didn't work, and the user didn't know why ... For detecting 3D support, CheckHardware does exactly the same as glxinfo does for detecting whether direct rendering is available.
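In other words, the 3D test boils down to the "direct rendering" line glxinfo prints. A minimal sketch of that logic as a shell function (this is illustrative, not CheckHardware's actual code; the function name is made up):

```shell
# has_direct_rendering: succeed if glxinfo-style output (on stdin)
# reports "direct rendering: Yes". This mirrors what a 3D check like
# CheckHardware effectively does -- which is why it fails on Xgl,
# where glxinfo reports "No" even though 3D may still work.
has_direct_rendering() {
    grep -qi '^direct rendering:[[:space:]]*yes' -
}

# Example usage:
#   if glxinfo | has_direct_rendering; then echo "3D: ok"; fi
```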
As I said - it is impossible to detect in a fool-proof way whether 3D hardware support is available. I don't have an idea how to test this, as indirect rendering can be hardware accelerated as well. You can only check whether 3D support is working at all, which should be the case as long as the glx module is loaded in the X server; and with direct rendering you can be pretty sure that you have some sort of 3D hardware support. To what extent is another question.
How it detects 3D is not as important as how the 3D application runs. For example, the colors are not what they should be. The problem is not unique to neverball, that is just one example. From what I've read, I wonder if this is because Xgl is a layer between the application and the real Xorg x server. Some articles (can't seem to find any right now) I have read indicate that this is the intended behavior, and the disadvantage Xgl has compared to XEgl. If that's correct, this is working as designed, and I'm fine if you want to close the bug.
Would you say that current OpenGL applications run reliably on Xgl? If you say yes, we should consider to drop this 3D test completely. Otherwise - and WRT neverball/neverputt this seems to be the case - we should keep the warning and adjust it accordingly.
Created attachment 74327 [details] Screenshot of 3d app on xorg This is a screenshot of the commercial game Unreal Tournament 2004 in 3D using nvidia's driver with xorg. Notice how things look normal. This screenshot is taken when the game is inside a building, where things are darker.
Created attachment 74328 [details] Screenshot of 3d app on XGL This is the same game, same map as the previous attachment. Notice that only some objects appear normal. It looks very dark, but this is actually an "outdoor" area, where it should be lighter. Not only is it just darker, but it appears to simply not render some things (such as the floor). A few UT2004 maps appear okay with XGL, this is an example of one that does not.
(In reply to comment #15) > Would you say that current OpenGL applications run reliably on Xgl? From the missing rendering and wrong colors when using Xgl, I would say no. I also attempted to look at performance, but top showed utilization around 100% for a processor in both scenarios. Running the game on Xgl was not as smooth as running it on plain Xorg. As a side note, the hardware I ran the tests on is a dual-core system, and I have to wonder if the second core is picking up the slack in doing the rendering in my tests. Unfortunately, my older single-processor hardware is too old to be valid test hardware for these cases.
(In reply to comment #14) > How it detects 3D is not as important as how the 3D application runs. For Agreed. > example, the colors are not what they should be. The problem is not unique to > neverball, that is just one example. OpenGL on top of Xgl isn't widely tested yet. AFAIK Quake3 works fine already, but there hasn't been much stress testing yet (and it is low priority). > From what I've read, I wonder if this is because Xgl is a layer between the > application and the real Xorg x server. Some articles (can't seem to find any > right now) I have read indicate that this is the intended behavior, and the > disadvantage Xgl has compared to XEgl. If that's correct, this is working as > designed, and I'm fine if you want to close the bug. This is utterly wrong. Where did you read this crap? Guess I have to document this even more. The only difference between Xgl and Xegl is that Xegl doesn't need Xorg for initialization of the framebuffer. Xgl uses the direct rendering model, that is, the underlying X server does not get involved anymore as soon as the frame buffer is initialized and the graphics context created. Well, maybe for resource allocation, but that doesn't really count. (In reply to comment #17) > This is the same game, same map as the previous attachment. Notice that only > some objects appear normal. It looks very dark, but this is actually an Seems like the lightmap texture is missing / misinterpreted. Which map is that (name)? (In reply to comment #18) > also attempted to look at performance, but top showed utilization around 100% > for a processor in both scenarios. Running the game in XGL was not as smooth > as just running it on xorg. This is well known and I try to emphasize it everywhere I go. However, there is still this myth around that Xgl would accelerate OpenGL, which it won't. Due to indirect rendering, things will be slower for high-end applications. > to wonder if the 2nd core is picking up slack in doing the rendering on my > tests.
Unfortunately, my older single processor hardware is old, and would not > be valid test hardware for these cases. Give Xgl and compiz of 10.1 another chance, as soon as they are out. We had a multithreading issue which has been worked around now. I'm leaving this bug on NEEDINFO until then.
on comment #15: I am programming a pretty standard OpenGL app in my free time (multitexturing, no extensions, no shaders), and have done so under Xgl for a while now. I can say I have no major issues apart from these: - My frame rate drops heavily (about 20 fps lower on average on an NVidia 5500) compared to Xorg, but that's not a problem for my app, as the general frame rate is around 100 fps. - I can't switch screen resolutions as with X.org (SDL only returns the resolution Xgl is running in; if I force another resolution I just get a small window). This is a problem for my app, as the user is forced to use the resolution Xgl is running in. On X.org, the user could switch to 640x480, for example, to achieve higher frame rates. Other than that, I have seen no visual difference with my app in Xgl compared to Xorg. I recently switched back to Xorg for entirely different reasons: after some weeks with Xgl I found it stressful on the eyes, especially the cube switching. I did not file a bug report because I don't know whether it's just the long switching time of my TFT. While moving wobbly windows and switching the cube, the fonts are somehow antialiased during movement and then suddenly become sharp when the movement ends. This was unpleasant to work with, which I only noticed after some weeks of use.
I can't see why this one is still set to NEEDINFO.
(In reply to comment #19) > Give Xgl and compiz of 10.1 another chance, as soon as they are out. We had a > multithreading issue which has been worked around now. I'm leaving this bug on > NEEDINFO until then. Due to this. I actually meant 10.1 RC1... You can leave it assigned, of course.
Wrong colors and such are likely related to multi-texturing when using vertex arrays being broken in nvidia's libGL for indirect rendering. You can see this problem when running quake3 with multi-texturing turned on. Try using mesa's libGL with the application that produce wrong colors on Xgl.
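One way to try Mesa's libGL for a single application, without reinstalling anything system-wide, is to put Mesa's library directory first on the dynamic loader path. This is a sketch; /usr/lib/mesa is an assumed location and varies by distribution and architecture.

```shell
# with_mesa_gl: run a command with Mesa's libGL directory first on
# LD_LIBRARY_PATH, so libGL.so.1 resolves to Mesa instead of the
# nvidia library. /usr/lib/mesa is an assumption -- adjust as needed.
with_mesa_gl() {
    mesa_dir=${MESA_LIBGL_DIR:-/usr/lib/mesa}
    LD_LIBRARY_PATH="$mesa_dir${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}" "$@"
}

# Example (verify which renderer is actually in use):
#   with_mesa_gl glxinfo | grep -E 'direct rendering|OpenGL renderer'
```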
On Beta9, I see the same issues. I'm not quite sure how to substitute mesa's libGL. I installed the nvidia drivers, then reinstalled the Mesa* packages. When I tried to restart X, it failed. What's the correct way to try Mesa's GL?
This should work. Could you attach /var/log/Xorg.0.log? Thanks.
/var/log/Xorg.0.log is not modified when I do this -- I moved the existing logs to another location and re-did the test. Xorg.93.log was recreated, but not Xorg.0.log. Also, I noticed that the clock "busy" icon appeared on my terminal, and it exists on each console. The nvidia driver installer thinks there is an X session running and won't reinstall (even if I go to init 1, then to init 3) until I boot the system to runlevel 3 directly. This also happened the first time. I installed the nvidia drivers (original and reinstall) according to your (Stefan) documentation. Will attach Xorg.93.log in case there is something there.
Created attachment 76064 [details] Xorg.93.log when Mesa is installed over Nvidia drivers
Just for the record, Chad: you're still using the NVIDIA driver. Anyway, this is a discussion about whether to still use CheckHardware or not. IMHO for CODE10 (SL 10.1/SLES10/SLED10) we should still use it, since OpenGL apps do not run properly on top of Xgl yet.
It should still be used, but CheckHardware should itself check if it's running on top of Xgl. If yes, it should not do the Check at all, because 3D support can be assumed to be available in this case.
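A minimal sketch of such a check (this is not the wrapper that was eventually shipped; detecting Xgl by reading DISPLAYMANAGER_XSERVER from /etc/sysconfig/displaymanager is an assumption based on the configuration discussed in this bug):

```shell
# xserver_is_xgl: succeed if the display manager is configured to
# start Xgl. The sysconfig-based detection is an assumption taken
# from this bug's setup, not the actual CheckHardware wrapper.
xserver_is_xgl() {
    sysconfig=${SYSCONFIG:-/etc/sysconfig/displaymanager}
    [ -r "$sysconfig" ] || return 1
    grep -q '^DISPLAYMANAGER_XSERVER="Xgl"' "$sysconfig"
}

# A wrapper could then do, roughly:
#   if [ "$1" = "--3D" ] && xserver_is_xgl; then
#       shift; exec "$@"          # skip the 3D check, run the app
#   fi
#   exec /usr/bin/CheckHardware.real "$@"   # hypothetical real binary
```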
No, it can't, if you consider '3D support' to be hardware accelerated 3D support. That only works if the drivers support pbuffers or FBOs (which currently only nvidia and partially fglrx do).
*** Bug 163822 has been marked as a duplicate of this bug. ***
CheckHardware.changes ------------------------------------------------------------------ Fri Sep 1 18:26:16 CEST 2006 - sndirsch@suse.de - added wrapper script to disable 3D check when Xgl is active (Bug #152026)