WQUXGA a.k.a. OMGWTF – IBM T221 3840×2400 204dpi Monitor – Part 4: Nvidia and DL-DVI Adapters

When I wrote the previous installment of this series I said some good things about the quality of Nvidia's drivers and the nice touches Nvidia have put in their firmware. Unfortunately, this didn't carry through to the GTX 5xx series of cards as well as I'd hoped (my previous testing was done with an 8800GT card). It turns out that my new GTX 580 isn't very compatible with the T221, and the cause is the evolution of those very firmware touches I praised before. The 8800GT limited itself to 1280×1024 when it did anti-aliasing; the GTX 580 takes it to the maximum and goes into the highest resolution the monitor reports being capable of. In this particular case that causes severe problems: when the primary port starts to drive the T221 at 3840×2400@13Hz, the secondary port seems to switch off, and for some reason the card won't talk to it afterwards. That means 2×SL-DVI mode won't work to give you 3840×2400@20Hz. The problem is, unfortunately, in the firmware rather than the drivers, and both the Windows and Linux drivers exhibit the same issue.

I did, however, manage to procure some DL-DVI adapters that saved the day. They come with their own EDID programmed in, and they don't report the 3840×2400 modes, only 1920×2400@48Hz. The AA therefore doesn't go into full tiled-resolution mode, both halves of the monitor are detected separately, and the tiled 3840×2400@48Hz mode becomes available. The end result is quite awesome and well worth the cost and the effort. Not only is this monitor fantastic for coding work (I like having lots of terminals open), it is in fact fully gaming-capable (assuming you have a powerful enough card to drive it). As you can see, Crysis looks pretty amazing on it.
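The refresh rates above fall straight out of DVI bandwidth arithmetic. As a rough sketch (ignoring blanking intervals, so the real pixel clocks run somewhat higher than these figures), you can compare each mode's pixel rate against the nominal DVI link limits of 165 MHz per single link and 330 MHz for dual link:

```python
# Rough pixel-clock estimates for the T221 modes discussed above.
# Assumption: blanking overhead is ignored, so actual pixel clocks
# are somewhat higher than these active-pixel figures.

SL_DVI_MAX_MHZ = 165.0  # nominal single-link DVI pixel clock limit
DL_DVI_MAX_MHZ = 330.0  # nominal dual-link DVI limit (two TMDS links)

def approx_pixel_clock_mhz(width, height, refresh_hz):
    """Active-pixel rate in MHz, ignoring blanking."""
    return width * height * refresh_hz / 1e6

modes = [
    ("3840x2400 @ 13 Hz (whole panel, one SL-DVI)",      3840, 2400, 13),
    ("1920x2400 @ 20 Hz (half panel, 2x SL-DVI tiled)",  1920, 2400, 20),
    ("1920x2400 @ 48 Hz (half panel, DL-DVI adapter)",   1920, 2400, 48),
]

for name, w, h, hz in modes:
    mhz = approx_pixel_clock_mhz(w, h, hz)
    print(f"{name}: ~{mhz:.0f} MHz "
          f"(fits single link: {mhz <= SL_DVI_MAX_MHZ}, "
          f"fits dual link: {mhz <= DL_DVI_MAX_MHZ})")
```

The half-panel 48 Hz mode comes out at roughly 221 MHz, which is why it needs a dual link, while the 13 Hz full-panel and 20 Hz half-panel modes squeeze under the single-link limit.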

If you are getting one of these monitors, make sure you get the DL-DVI adapters to go with it, unless you are using a G9x series Nvidia card. You can get the T221 monitors complete with the DL-DVI adapters from here.

10 thoughts on “WQUXGA a.k.a. OMGWTF – IBM T221 3840×2400 204dpi Monitor – Part 4: Nvidia and DL-DVI Adapters”

  1. No, the same place had an adapter on its own available once, but they don’t do that any more. With a G9x-class Nvidia card and a dual SL-DVI cable it works lovely, though, if you cannot find the adapters. I found that 23Hz was good enough even for gaming.

  2. Cool article, I’ve been looking into switching over to Nvidia for my T221 (I have the adapters mentioned above), since the newer ATI cards only have one dual-link DVI port, and if you mix DP and DVI you end up with screen tearing.

    Did you need to enable Nvidia Surround for gaming, or is it supported natively?

    • Actually, ATI cards exhibit tearing with dual screens just the same with two dual-link DVI inputs to the T221. It has nothing to do with mixing DP and DVI ports. The tearing was obvious both when playing videos and when gaming.

      I use XP and all I did was configure the desktop to stretch across both “monitors”. All the gaming from there on behaves as if I had a single 3840×2400 monitor. IIRC the “desktop stretch” feature was removed in Vista and 7.

  3. Thanks for the reply and information. It’s interesting that you experienced the same results with an ATI card with two dual-link DVI cables; my understanding was that it was mixing the DVI and DP ports that resulted in tearing, but apparently not.

    It’s unfortunate that Nvidia removed the desktop stretch for Vista and 7; guess I’ll have to look for another option.

    • You misunderstood – it wasn’t Nvidia/ATI that removed the desktop stretch feature – it was Microsoft. In XP it was a built-in Windows feature that wasn’t GPU dependent. Whether Nvidia or ATI drivers include a replacement feature – I don’t know. I seem to remember that there was some OS-related issue that made it difficult to re-add the feature in Vista/7.

      Also note that I wasn’t using 2 ATI cards – I was using one card with two DVI outputs, and still experienced tearing. The main reason I switched to Nvidia is because I had a 4870X2, and the evidence (mainly GPU temperatures) strongly implied that when using two monitors, only one GPU was being used, while the other was idle, regardless of whether CrossFire was enabled. And despite it running off of a single GPU there was still tearing evident between the two screens.

Comments are closed.