After some playing, hacking, and testing, I've figured something out.
When you're running any NT-based OS (Windows NT, Windows 2000, or Windows XP) AND you have a GeForce 4 card, the way the program parses its rendering instructions means it will die if the server doesn't give it what it expects. If the config file says there will be 32-bit color textures and so on, then the OS/hardware expects 32-bit color. Since the server is, at this point, only sending 16-bit color, the program generates an unhandled exception: it's trying to build polygons without enough data to do so.
Think of it this way. I'm going to render a 1 by 1 by 1 polygon. Doing that in 32-bit color requires at least 96 bits of information (enough to color each side in). The game knows this, the rendering engine knows this, so it requests that much data from the server. The server sends back only 48 bits and signals the end of the transfer. The rendering engine goes, "WHAT? No way, dude. Resend." (That's why it takes so long for the program to die.) After finally figuring out that, hey buddy, that's all that's coming, the program says, "Fine, that's not enough, I can't render," and errors out to the desktop with an unhandled, unspecified error.
However. Set your color depth to 16 bit, and the program knows it only needs 48 bits of data to fill the vertices. The server IS sending 16-bit data, so the program is happy.
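The mismatch described above can be sketched in a few lines of Python. This is purely an illustration of the failure mode, not the game's actual code: the function names, the retry count, and the triangle-sized numbers are all assumptions made up for the example.

```python
VERTICES = 3  # assume one triangle per polygon, as in the 96-bit example


def bits_needed(color_depth):
    """Bits of color data one polygon needs at a given depth.

    32-bit depth -> 96 bits, 16-bit depth -> 48 bits, matching
    the numbers in the explanation above.
    """
    return color_depth * VERTICES


def try_render(client_depth, server_depth, max_retries=3):
    """Simulate the exchange: the client expects data for its configured
    depth, the server sends data for the depth it actually runs at.

    Returns "rendered" on success; raises RuntimeError (standing in for
    the unhandled exception) once the resend loop gives up.
    """
    expected = bits_needed(client_depth)
    received = bits_needed(server_depth)
    for _ in range(max_retries):  # the slow "no way dude, resend" loop
        if received >= expected:
            return "rendered"
    raise RuntimeError(
        f"unhandled error: expected {expected} bits, got {received}"
    )
```

With both sides at 16 bit, `try_render(16, 16)` returns `"rendered"`; with the client at 32 bit and the server at 16 bit, `try_render(32, 16)` keeps retrying, comes up short, and raises, which is the crash-to-desktop case.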
Clearer now?