WQUXGA a.k.a. OMGWTF – IBM T221 3840×2400 204dpi Monitor – Part 3: ATI vs. Nvidia

It is at times like this that I get to fully consider just how bad a decision it was to jump ship from ATI to Nvidia when it came to graphics cards. But now that sense has been forced back upon me, I will hopefully not consider such madness again for the best part of the next decade.

Due to the ATI drivers being fundamentally unable to handle the T221 reliably, I bit the bullet and decided to put my old 8800GT card back in. The first WTF came when it transpired that the ATI drivers cannot be uninstalled from Windows XP using their bundled uninstaller in Safe or VGA mode. This is quite bad when you consider that it could be the ATI drivers that are preventing the machine from booting into normal mode in the first place. To its credit, though, the ATI uninstaller was not too bad once I ran it in normal mode, and after using it to remove all ATI software and uninstalling the ATI devices in Device Manager, there wasn’t enough left to cause problems on the next reboot, by which point the machine contained an Nvidia card. Everything booted up fine, and after a quick run of the Auslogics Registry Cleaner (just to make sure – easily the best registry cleaner I have used to date), everything was ready for the installation of the Nvidia drivers. That went quite painlessly, and a reboot later I had the T221 configured for 2x1920x2400@20Hz mode. The only thing that didn’t come up perfectly by default was that I had to add the 1920×2400@20Hz mode in Nvidia Control Panel (click the Customize button).

By this point, the superior features of Nvidia were already becoming apparent:

  • Anti-aliasing of text and low-resolution modes in firmware – such modes look vastly better than they do on ATI hardware.
  • The secondary port remains disabled until the driver enables it. This is really nice on the T221 because it means you don’t get the same thing on the screen twice during the BIOS POST and the early stages of the boot process. I can imagine the duplicated output also being annoying on a multi-head setup.
  • The primary port no longer switches around for no apparent reason in Windows with multiple screens plugged in.
  • Best of all – Windows XP drivers Just Work ™. They don’t forget their settings between reboots.
  • No tearing down the middle of the screen where the two halves meet. With the ATI card, the mouse pointer couldn’t be drawn on both halves at the same time; in the middle, you could make it virtually disappear. Not a big deal, but yet another example of general bugginess. Also, in games the tearing along the same line disappeared (I always run with vsync forced on, and it was still visible from time to time with the ATI card).

Just the properly working drivers would have easily convinced me of the error of my recent ways, but all the other niceties really make for a positive difference to the experience.

After I got Windows working (it only took 20 minutes, after giving up on ATI having wasted a whole day trying to get it to work properly and remember its settings between reboots), it was time to get things working in Linux. The first thing that jumped out at me about this part of the exercise is just how much better ATI’s Linux drivers are compared to their Windows drivers. It is obvious that they are actually being developed by somebody competent. Unlike the Windows drivers, the Linux drivers worked out of the box, and the only unusual thing I needed to do was make sure Fake Xinerama was configured and preloaded. Removing them was a simple case of:

# rpm -e fglrx

Simple, efficient, reliable. Seems ATI’s Windows driver team have a lot to learn from their Linux driver team.

The machine came up fine with the nouveau drivers loaded, but I wanted to get Nvidia’s binary drivers working. The experience here was a little more problematic than it had been with the ATI drivers. The nvidia-xconfig and nvidia-settings utilities weren’t as intuitive as the ATI configuration utility, and the setup suffered from a particularly annoying problem where GPU scaling would default to on. This resulted in the screen mode being left stretched and unusable, although sometimes just starting the nvidia-settings program would fix some of it. In the end I gave up on the utilities and wrote my own xorg.conf according to the documentation – and that worked perfectly. You may want to set the following environment variable to force vsync in GL modes (e.g. for mplayer’s GL output):

# export __GL_SYNC_TO_VBLANK=1

This ensured there was no tearing visible during video playback.

One thing worth noting is that Nvidia drivers bring their own Xinerama layer with them, so the Xorg Xinerama should be disabled. There is also an option for faking the Xinerama information (NoTwinViewXineramaInfo), so no need for Fake Xinerama, either.
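
For reference, here is roughly what the relevant part of such a configuration can look like. This is a minimal sketch built from the standard nvidia TwinView options rather than a copy of my exact file, so the identifiers and MetaModes will need adjusting, and you may still need to define the 20Hz mode as a custom resolution:

Section "Device"
    Identifier "nvidia0"
    Driver     "nvidia"
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "nvidia0"
    Option     "TwinView" "true"
    Option     "TwinViewOrientation" "RightOf"
    Option     "MetaModes" "1920x2400 +0+0, 1920x2400 +1920+0"
    Option     "NoTwinViewXineramaInfo" "true"
EndSection

With NoTwinViewXineramaInfo enabled, applications see one 3840×2400 screen rather than two 1920×2400 halves, which is exactly what you want on the T221.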

In conclusion, it is quite clear that Nvidia win hands down in terms of features and user experience, especially on Windows, due to their more stable drivers and more intuitive configuration utilities. The story is different on Linux – I would put ATI slightly ahead on that platform, at least in terms of configuration utilities. Having to use Fake Xinerama isn’t a big deal for the technically minded. Even on Linux, however, in terms of the overall outcome and the end experience, I feel Nvidia still come out ahead, since the ATI drivers still occasionally produce visible tearing when playing back high definition video.

All this made me think about what the most important thing about a product such as a graphics card really is. In the end it is not just about performance. Performance is only a part of the overall package. What I find is that the most important thing about a product is the whole experience of configuring it and using it. How easy is it to get working under edge case conditions? How reliable is it – once it is working, does it stay working? Are there any experience-ruining artifacts such as tearing visible in applications, even with vsync enabled? These sorts of things, along with crowning touches such as anti-aliasing of low resolution modes and only having one active video output until the drivers specifically enable the others, are what really impact the experience. And based on my experience of Nvidia and ATI cards over the past few years, I hope somebody talks some sense into me if I ever consider an ATI product again – except perhaps if their FireGL team starts writing their Windows drivers.

WQUXGA a.k.a. OMGWTF – IBM T221 3840×2400 204dpi Monitor – Part 2: Windows

When I set out to do this, I thought getting everything working under Windows would be easier than it was under Linux. After all, the drivers should be more mature and AMD would have likely put more effort into making sure things “just work” with their drivers. The experience has shown this expectation was unfounded. Getting the T221 working in SL-DVI 3840×2400@13Hz mode was trivial enough, but getting the 2xSL-DVI 2x1920x2400@20Hz mode working reliably has proven to be quite impossible.

The first problem has been the utter lack of intuitiveness in the Catalyst Control Center. It took a significant amount of research to finally find that the option for desktop stretching across two monitors lies behind a right click menu on an otherwise unmarked object:

CCC Desktop Stretch Option

Results, however, were intermittent. Sometimes the resolution for the second half of the screen would randomly get mis-set, sometimes it would work. Sometimes the desktop stretching would fail. Eventually, when it all worked (and it would usually require a lot of unplugging of the secondary port to get a usable screen back), it would be fine for that Windows session, but it would all go wrong again after a reboot. The screen would just go to sleep at the point where the login screen should come up, and the only way to wake it up was to unplug the secondary DVI link, log in, and then plug the second cable back in, usually a few times, before it would come up in a usable mode. Then the same resolution and desktop stretching configuration process would have to be repeated – with a non-deterministic number of attempts required, using both the Windows display settings and the Catalyst Control Center.

At first I thought it could be due to the fact that I am using an ATI HD4870X2 card, so I disabled one of the GPUs. That didn’t help. Then I tried using a different monitor driver, rather than the “Default Monitor” one, which is purely based on the EDID settings the monitor provides. I tried a ViewSonic VP2290b-3 driver (the VP2290b was a rebranded T221), and a custom driver created using PowerStrip based on the EDID settings, and neither helped. Since I only use Windows for occasional gaming and not for any serious work, this isn’t a show-stopping issue for me, but I am stunned that AMD’s Linux drivers are more stable and usable than the Windows ones when using even slightly unusual configurations.

To add a final insult to injury, the 4870X2 doesn’t end up running the monitor with one GPU driving each 1920×2400 section. Instead, one GPU ends up running both, and the second GPU remains idle. At first I attributed the tearing between the two halves of the screen to each half being rendered by a different GPU. Unfortunately, considering that all tests show one GPU remaining cold and idle while the other is under heavy load, I have to conclude that this is not the case. This is particularly disappointing because the experience is both visually bad (tearing between the two 1920×2400 sections) and poorly performing (one GPU always remains idle and the frame rates suffer quite badly – 7-9fps in the Crysis Demo Benchmark). I clearly recall that the Nvidia 9800GX2 card I had before had a configuration option to enable a dual-screen, dual-GPU mode.

I am just about ready to give up on AMD GPUs, purely because the drivers are of such poor quality and lack important features (e.g. requiring Fake Xinerama under Linux, something that Nvidia drivers have a built-in option for). I’m going to dig out my trusty old 8800GT card and see how that compares.

Genesi Efika MX Smartbook’s 0 Button Mouse

I love my Genesi Efika MX Smartbook – it’s an awesome little machine. But there have been three things that have bothered me about it since I got mine, and they are the sort of things that can make a difference between sub-mediocrity and brilliance. I have already covered one of the issues in a previous post concerning the screen upgrade.

The second big problem I have with it is that the buttons on the touchpad are completely unusable. This is not an exaggeration. Due to the way they are designed, it is only possible to use them for dragging with a copious amount of luck – not skill, luck. Clicking using the touchpad buttons requires only an infinitesimally smaller amount of luck than dragging. This isn’t acceptable, and since I otherwise rather like the Smartbook, I decided to find a good workaround that doesn’t involve carrying a mouse or a trackball with me, as that would ruin one of the best things about it: the portability.

I used to have Sony Vaio PCG-U1 and PCG-U3 machines. They were quite awesome, and competed quite successfully on spec with the Genesi Efika MX Smartbook – which is fairly impressive considering the Vaios in question were produced in 2002, 9 years ago. The main reason why I finally needed to upgrade from the old Vaio was that its 1024×768 screen resolution simply stopped being sufficient for any serious use. The standard Efika would have failed this requirement even worse were it not for the possibility of the 1280×720 screen upgrade. Plus, the Efika is much thinner and doesn’t require a battery pack as big as the rest of the laptop for 6 hours of battery life. But I digress. The main point I was getting to is that the Vaio had mouse buttons that were quite separate from the joypad, while still being very ergonomic and easy to use. This made me think about using a similar trick on the Efika. All I needed was two conveniently placed yet redundant keys on the keyboard to remap into mouse buttons. The “House” key (the one with an icon of a house, as opposed to “Home”) and the “Alt” key in the bottom left corner seemed perfect for this task.

To do this, we need to do three things (a sketch of each file follows the list):

  1. Disable Xorg’s usage of the keys using xmodmap. I put mine in /etc/X11/xmodmap.
  2. Configure actkbd to trap the low-level keystrokes and execute xdotool commands to issue Xorg mouse button events. Put this in /etc/actkbd.conf.
  3. Put the two together and make it happen automatically on login using a script /etc/X11/Xsession.d/95-keyremap.
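
To give an idea of what goes into each file, here is a rough sketch. The keycodes below are placeholders rather than the values from my machine (find the X keycodes with xev and the kernel input codes with actkbd -s), and the DISPLAY/XAUTHORITY values need to match your session, since actkbd runs as root outside of X.

/etc/X11/xmodmap (stops Xorg from treating the two keys as ordinary keyboard keys):

! X keycodes are placeholders; check yours with xev
remove mod1 = Alt_L
keycode 180 = NoSymbol
keycode 64 = NoSymbol

/etc/actkbd.conf (translates the raw key events into mouse button events via xdotool; using mousedown/mouseup rather than click means dragging works too):

# kernel input codes are placeholders; check yours with actkbd -s
172:key::env DISPLAY=:0 XAUTHORITY=/home/user/.Xauthority xdotool mousedown 1
172:rel::env DISPLAY=:0 XAUTHORITY=/home/user/.Xauthority xdotool mouseup 1
56:key::env DISPLAY=:0 XAUTHORITY=/home/user/.Xauthority xdotool mousedown 3
56:rel::env DISPLAY=:0 XAUTHORITY=/home/user/.Xauthority xdotool mouseup 3

/etc/X11/Xsession.d/95-keyremap (applies the map automatically at login):

xmodmap /etc/X11/xmodmap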

That is pretty much it. The “House” and “Left Alt” keys will now act as the left and right mouse buttons respectively. I hope you find it to be as big an improvement as I did. It feels like having mouse buttons again after being stuck with a 0 button mouse.

These instructions are for Ubuntu, since that is what the Efika ships with and I haven’t gotten around to putting Fedora on it yet. It shouldn’t be difficult to adapt the above approach for other distributions.

WQUXGA a.k.a. OMGWTF – IBM T221 3840×2400 204dpi Monitor – Part 1: Linux

I’m not sure how many people occasionally stop to notice this sort of thing, but to me it frequently seems that technology regresses for long periods from its infrequent peaks. In the 60s we saw the flights of the likes of the XB-70 Valkyrie and the SR-71 Blackbird, and people walked on the moon. Yet in 2011 we are reading about the last flight of the Space Shuttle rather than about the first colony on Mars. It makes a quote from Idiocracy all the more uncanny: “… sadly the world’s greatest minds and resources were focused on conquering hair loss and prolonging erections.”

The same pattern seems to apply to some aspects of the computer industry, where cost pressures take precedence over quality, features and innovation. In 2001, we saw the introduction of the IBM T220 monitor, with a resolution of 3840×2400 on a 22.2″ panel. It was later superseded by the T221 with very similar specifications, but it was ultimately discontinued in 2005. Nothing matching it has been available since. Today, screen resolutions seem to be undergoing an erosion. On small panels the “standards” (sub-standards?) have settled at the completely unusable 1024×600, and with a handful of exceptions from Dell (3007WFP, 3008WFP), Samsung (305T) and Apple (Cinema HD), the commonly available screens are limited to 1920×1080. Even 1920×1200 screens are getting more and more rare, especially on laptops, because screens are marketed by diagonal size, and for any given diagonal a 16:9 screen has a smaller surface area than a 16:10 one.

IBM T221 monitors, especially of the latest DG5 variety, are very hard to come by and still expensive if you can ever find one. Typically they sell for double what you can get a Dell 3007WFP for. But you do get more than twice the pixel count and more than twice the pixel density. I have recently acquired a T221 and if your eyes can handle it (and mine can), the experience is quite amazing – once you get it working properly. Getting it working properly, however, can be quite a painful experience if you want to get the most out of it.

My T221 came with a single LFH-60 -> 2x SL-DVI (single link DVI) cable. There are two LFH-60 connectors on the T221, which allows the screen to be run using 4x SL-DVI inputs, providing a maximum refresh rate of 48Hz. There is also a way to run this monitor using 2x DL-DVI inputs at 48Hz, but that requires special adapters and is a subject for another article, since I haven’t got any of those yet.

Using a single LFH-60 -> 2x SL-DVI cable, there are only two modes in which the T221 can be run:

1) As a single 3840×2400 panel @ 13Hz using a single SL-DVI port

2) As two separate monitors, each being 1920×2400 @ 20Hz, using two SL-DVI ports

The 13Hz mode is completely straightforward to get working on both RHEL6 and XP x64, but 13Hz is just not fast enough. You can actually see the mouse pointer skipping as you move it, and playing back a video also results in visible frame skipping. So I have spent the effort to get the 2x1920x2400@20Hz mode working on my ATI HD4870X2. The end results are worth it, but the process isn’t entirely straightforward. The important thing to bear in mind is that, when running in anything other than the 3840×2400@13Hz mode, the T221 appears to the computer as two completely separate 1920×2400 monitors.

IBM T221 with Linux

ATI’s Linux drivers aren’t really mature enough for the job, and to achieve the best results you have to use aticonfig to generate an xorg.conf without Xinerama support, start X, fire up the amdcccle configuration utility for ATI cards, enable dual screens, then add Xinerama support. If all this sounds complicated to you – it is, and it took a lot of trial and error to get right. So to save you the effort, here is a copy of my xorg.conf file. This is from a RHEL6 machine using the ATI fglrx driver. It will almost certainly work on other distributions, too, with little or no modification.
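
In outline, what the aticonfig/amdcccle dance ends up producing is a standard two-device, two-screen layout with Xinerama enabled, roughly along these lines (the identifiers, BusID and modes here are illustrative, not a verbatim copy of my file):

Section "Device"
    Identifier "ati0"
    Driver     "fglrx"
    # BusID is illustrative; check yours with lspci
    BusID      "PCI:2:0:0"
    Screen     0
EndSection

Section "Device"
    Identifier "ati1"
    Driver     "fglrx"
    BusID      "PCI:2:0:0"
    Screen     1
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "ati0"
    DefaultDepth 24
    SubSection "Display"
        Modes "1920x2400"
    EndSubSection
EndSection

Section "Screen"
    Identifier "Screen1"
    Device     "ati1"
    DefaultDepth 24
    SubSection "Display"
        Modes "1920x2400"
    EndSubSection
EndSection

Section "ServerLayout"
    Identifier "T221"
    Screen 0 "Screen0" 0 0
    Screen 1 "Screen1" RightOf "Screen0"
    Option "Xinerama" "on"
EndSection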

This still won’t work quite as you’d hope, though – Xinerama passes information about the geometry of the desktop to applications, and apps will only maximize to one screen. This also goes for the task bar, and applies to video playback. The last bit of magic involves faking the Xinerama information. Nvidia drivers come with a built-in option for this: “NoTwinViewXineramaInfo”. Unfortunately, ATI drivers have no such option. But, this being the world of Linux, there is a backup plan. There is an LD_PRELOAD library called Fake Xinerama that can be used to override the screen geometry passed to applications, and make them think they are on a single 3840×2400 screen. All you need to do is the following:

1) Compile Fake Xinerama from the link above
2) Add the line “/usr/local/lib64/libXinerama.so” to your /etc/ld.so.preload file.
3) Create a file ~/.fakexinerama containing:

1
0 0 3840 2400

The first line contains the number of screens; each subsequent line has the format:
<origin X> <origin Y> <width> <height>
If you are booting straight into the graphical environment (runlevel 5), you will need the .fakexinerama file in root’s home directory too, since gdm/kdm run as root.
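
For reference, step 2 and the copy into root’s home directory boil down to something like this (with /home/youruser standing in for your actual home directory):

# echo /usr/local/lib64/libXinerama.so >> /etc/ld.so.preload
# cp /home/youruser/.fakexinerama /root/.fakexinerama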

And if you have managed to follow all that, you will have a single seamless 3840×2400@20Hz desktop.