Bugzilla – Bug 154947
NVIDIA: 1920x1200 not supported?
Last modified: 2006-04-07 19:41:24 UTC
Looks like a 1920x1200 mode is not supported by the NVIDIA driver on a GeForce FX 5900XT whereas the "nv" driver does support this mode. I'll attach nvidia-bug-report.log.
Created attachment 71128 [details] nvidia-bug-report.log
The error messages in the logfile are not really helpful. :-(

[...]
(II) NVIDIA(0): Not using mode "1920x1200" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using mode "1920x1200" (bad mode clock/interlace/doublescan)
[...]
(II) NVIDIA(0): Not using mode "1920x1200" (no mode of this name)

It doesn't get better when not specifying the 1920x1200 modeline. The only difference is that these lines no longer occur in the logfile:

(II) NVIDIA(0): Not using mode "1920x1200" (bad mode clock/interlace/doublescan)
(II) NVIDIA(0): Not using mode "1920x1200" (bad mode clock/interlace/doublescan)

Looks like the NVIDIA driver does not support this mode, at least not on this board. Andy, is this a known limitation in the driver?
> (--) NVIDIA(0): DFP-0: maximum pixel clock: 165 MHz

Might be a problem. ' Option "IgnoreEDID" "on" ' might help here, but will likely be dangerous (pixel clock too high).

Modeline "1920x1200" 239.63 1920 2064 2272 2624 1200 1201 1204 1251
Modeline "1920x1200" 204.39 1920 2056 2264 2608 1200 1201 1204 1244

We should try the modeline used by the nv driver instead:

(II) NV(0): clock: 154.0 MHz   Image Size: 495 x 310 mm
(II) NV(0): h_active: 1920 h_sync: 1968 h_sync_end 2000 h_blank_end 2080 h_border: 0
(II) NV(0): v_active: 1200 v_sync: 1203 v_sync_end 1209 v_blanking: 1235 v_border: 0
(II) NV(0): Ranges: V min: 48 V max: 85 Hz, H min: 30 H max: 94 kHz, PixClock max 170 MHz

Modeline "1920x1200" 1920 1968 2000 2008 1200 1203 1209 1235

Mike, could you try this modeline?
> Modeline "1920x1200" 1920 1968 2000 2008 1200 1203 1209 1235

Modeline "1920x1200" 154 1920 1968 2000 2008 1200 1203 1209 1235
Yes, this seems to work. I tested only remotely and could not see the X screen, but now I see the following lines in the Xorg.0.log file:

(**) NVIDIA(0): Validated modes for display device DFP-0:
(**) NVIDIA(0):      Mode "1920x1200": 154.0 MHz, 76.7 kHz, 62.1 Hz
This looks good, but having a deeper look at the logfile it seems to me that the NVIDIA driver does *not* have any built-in modelines for some high-resolution modes, e.g.:

(II) NVIDIA(0): Not using mode "1920x1200" (no mode of this name)
(II) NVIDIA(0): Not using mode "1900x1200" (no mode of this name)
(II) NVIDIA(0): Not using mode "1600x1200" (no mode of this name)
http://en.wikipedia.org/wiki/DVI

The maximum dot clock is 165 MHz on single-link DVI. For calculating regular modelines there is 'gtf', which produces:

# 1920x1200 @ 60.00 Hz (GTF) hsync: 74.52 kHz; pclk: 193.16 MHz
Modeline "1920x1200_60.00"  193.16  1920 2048 2256 2592  1200 1201 1204 1242  -HSync +Vsync

In xorg CVS there is 'cvt', which can create modelines for the reduced blanking timing formula as well:

# 1920x1200 59.95 Hz (CVT 2.30MA-R) hsync: 74.04 kHz; pclk: 154.00 MHz
Modeline "1920x1200R"  154.00  1920 1968 2000 2080  1200 1203 1209 1235 +hsync -vsync

This looks similar to the one nv is using, but not equivalent (maybe a typo above? 2008 vs 2080). Note that you should rename the modeline to "1920x1200" when using it. Also the +hsync -vsync is important.
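As a sanity check, the sync rates reported by the server can be recomputed directly from the modeline numbers: hsync = pclk / htotal, vrefresh = hsync / vtotal. A minimal sketch (the helper is illustrative, not part of any X.Org tool) also shows why the 2008-vs-2080 difference matters:

```python
# Sketch: recompute sync rates from modeline timings.
# The helper name is my own; it is not from gtf, cvt, or the X server.
def rates(pclk_mhz, htotal, vtotal):
    """Return (hsync in kHz, vrefresh in Hz) for given modeline totals."""
    hsync_khz = pclk_mhz * 1000.0 / htotal
    vrefresh_hz = hsync_khz * 1000.0 / vtotal
    return hsync_khz, vrefresh_hz

# CVT reduced-blanking modeline: htotal 2080, vtotal 1235
print(rates(154.00, 2080, 1235))  # ~74.04 kHz, ~59.95 Hz -- matches the cvt comment

# Mistyped modeline with htotal 2008 instead of 2080
print(rates(154.00, 2008, 1235))  # ~76.69 kHz, ~62.10 Hz -- matches the
                                  # "76.7 kHz, 62.1 Hz" the server validated

# Single-link DVI limit: the GTF mode's 193.16 MHz pixel clock exceeds
# 165 MHz, while the reduced-blanking 154.00 MHz mode fits under it.
print(193.16 > 165, 154.00 > 165)  # True False
```

This also explains why only the reduced-blanking (CVT "-R") variant can work over this single-link DVI connection.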
Yes, typo. Mike, please use the modeline from Matthias instead of mine.
Mike, Olaf, you might also need this one to make the modeline active:

Option "ExactModeTimingsDVI" "boolean"

Forces the initialization of the X server with the exact timings specified in the ModeLine. Default: false (for DVI devices, the X server initializes with the closest mode in the EDID list).
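Putting the pieces from this thread together, an xorg.conf fragment could look like the following (section identifiers are placeholders; adjust to your existing config):

```
Section "Monitor"
    Identifier "DFP-0"
    # CVT reduced-blanking mode, renamed to plain "1920x1200";
    # +hsync -vsync as noted above is important
    Modeline "1920x1200"  154.00  1920 1968 2000 2080  1200 1203 1209 1235 +hsync -vsync
EndSection

Section "Device"
    Identifier "nvidia0"
    Driver     "nvidia"
    # Force the exact ModeLine timings instead of the closest EDID mode
    Option     "ExactModeTimingsDVI" "true"
EndSection
```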
Mike told me that this modeline did not work for him with his FX5200. The FX5200 seems to be limited to a mode of 1440x1050 when using the DVI output. :-( At least this is what the X server log says.
This bug report has been INVALID from the beginning. The NVIDIA driver uses the timings it gets from EDID.