An overview of the GXT6500P and GXT4500P
#1
An overview of the GXT6500P and GXT4500P
[Image: O6EVJI7.gif]
The GXT4500P (single GPU) and GXT6500P (dual GPU) were released in 2002 as 64-bit PCI-X cards for the RS/6000 platform; however, they can also operate in a 32-bit PCI slot--obviously at a performance loss. They were generally paired with the T115, T117, T119 and T120 monitors, the T120 being a fairly famous cheaper substitute for the T220/T221. Hilariously enough, they're not compatible with all VGA monitors--including IBM's own L190; I suspect this is due to the way AIX and/or the GXT cards handle VGA signals. It's fairly common to experience some weirdness with old IBM video cards; for instance, the Micro Channel XGA-2 cards can have display problems at certain resolutions like 800x600 on some CRTs--but that's getting a little off topic. Needless to say, if you're running a VGA monitor, don't expect a seamless experience with *all* monitors.

Because it may come up or be wondered about, the GXT135P was not designed by IBM; it is in fact a Matrox GPU with PowerPC firmware, and therefore it does not run graPHIGS. XiG released a 3rd-party AIX driver for this card to take more advantage of its capabilities and enhance a few things, since IBM just slapped it in as an afterthought rather than developing it in-house like the 4500 and 6500. Reference: http://www.xig.com/Pages/Hdwe/Graphics/GXT135P.html
Please be advised XiG no longer sells any of their souped-up Matrox AIX drivers, so... that's the end of that.
  • GXT4500P: graPHIGS, OpenGL 1.2
  • GXT6500P: graPHIGS, OpenGL 1.2.1
I'm (guessing) the 6500 has firmware that allows it to support OpenGL 1.2.1 over the 4500's 1.2? Or perhaps the distinction was deliberate on IBM's part? I'm not sure. At any rate, because the cards came out in 2002, the OpenGL version support is on the older side, unfortunately. IBM also lists 'X11' and 'Motif' under the API section, but... those aren't really in the same category as OpenGL, graPHIGS, Glide, DirectX, etc. Any card will run X11, AFAIK.

The GXT4500P and GXT6500P are only supported under AIX (they require AIX 4.3.3 and higher), not Linux; to my knowledge nobody has written any open-source Linux drivers, or any drivers at all. I'm also not aware of how widespread graPHIGS usage was; I can't name a single person who has mentioned using the API. I wouldn't even have known it was a thing without looking at the spec sheets for these cards.

Systems these cards were approved to run in:
  • RS/6000 7044-170 (both)
  • RS/6000 7044-270 (both)
  • RS/6000 7043-150 (GXT4500P)
  • RS/6000 7043-270 (both)
  • IntelliStation POWER 9111-285 (both)
  • IntelliStation POWER 7047-185 (both)
  • IntelliStation POWER 9114-275 (both)
  • IntelliStation POWER 9112-265 (both)
Eventually IBM just reused the same old GXT4500 and 6500 cards in the later IntelliStation POWER series and never refreshed the card platform, meaning these were the last of the AIX CAD cards. I'm guessing this was due to the high R&D cost of designing and building these cards and the shrinking market for them; not all IntelliStations sold were used for CAD, which would make the demand for these cards even lower than the number of machines sold. That does mean they were actively produced from 2002 all the way up to 2009, which is impressive, as I imagine the memory modules would have become fairly difficult to source by that point. The old memory ICs wouldn't have been actively manufactured in 2009, unless IBM still had contracts for them.

According to IBM's marketing at the time:
Quote:Ideal for mid- to high-end MCAD, front-end graphic processing and other floating-point-intensive applications, the IntelliStation POWER 265 with GXT6500P graphics convincingly beats comparably configured HP 3700 and Sun Blade 1000 workstations in CATIA V4 benchmarks.[2] A new generation of IBM 3D graphics technology—the GXT4500P and GXT6500P with 128MB unified frame buffer—delivers up to a 15-20% boost in performance over the GXT4000P/6000P adapters, at a new lower price. Both PCI graphics accelerators feature analog and digital output, 24-bit double buffer with resolutions up to 2048 x 1536 at 60 Hz, and API support for OpenGL 1.2.1, graPHIGS, and X11. Advanced 3D features include an on-board geometry accelerator, hardware lighting, 24-bit Z-buffer, 4/8-bit overlay, 8-bit double buffered Alpha, 8-bit stencil, up to 110MB texture memory, dual texture, and 3D texture. The GXT6500P has an additional geometry and lighting processor that further increases performance.

2. Jan. 2002 CATIA V4.2.2 R2 TAGITT results. Each system configured with 512MB memory, 9.1GB SCSI hard drive, and CD-ROM drive. HP 3700 at 750 MHz with fx 5 or fx 10 pro adapter; Sun Blade 1000 at 900 MHz with Expert3D adapter. All performance data contained in this publication was obtained in a specific environment, and is presented as an illustration. The results obtained in other operating environments may vary. http://www-1.ibm.com/servers/eserver/pse...urces.html


In terms of running these adapters, you can in fact run *two* GXT4500Ps in an IntelliStation POWER 185 with dual-monitor support; however, only one GXT6500P can be run in the system at any given time. This restriction could differ on other systems, but I only have the one, and limited experience outside the 185 (which is probably the oddest of the bunch, anyways). If you're settling for the anemic Matrox GXT135P, you can stuff a total of four of them in a system and (presumably) get quad-monitor support out of AIX! Some day in the future I may consider two GXT4500Ps if I want to dabble in AIX multi-monitor support, that is if my 185 ever becomes a workstation I like to use on a daily basis.
micrex22
PS/2 it!

Posts: 144
Threads: 27
Joined: May 2018
09-01-2018, 11:35 PM
#2
RE: An overview of the GXT6500P and GXT4500P
(09-01-2018, 11:35 PM)micrex22 Wrote:  [...]

Very interesting information there.
I happen to have an RS/6000 7043-140 with a GXT6500P video card in it.
AIX 4.1.1 doesn't seem to have the drivers for the card, and I have all kinds of issues trying to run Quake 2 on AIX 5.

I can't find the software drivers anywhere for AIX 4.1.1. Do you happen to have them, or know where to get them?
soviet
Octane

Posts: 192
Threads: 22
Joined: Apr 2019
Location: Uruguay
08-17-2021, 08:48 PM
#3
RE: An overview of the GXT6500P and GXT4500P
they require at least AIX 4.3.3 and higher
Irinikus
Hardware Connoisseur

Posts: 3,475
Threads: 319
Joined: Dec 2017
Location: South Africa
08-17-2021, 08:54 PM
#4
RE: An overview of the GXT6500P and GXT4500P
I miss my 285 dearly.
I had the GXT135P, and it's quite easy to get it running in framebuffer mode with a bit of Xconf tweaking.
I also managed to run my former 285 under Linux with a Radeon 7000 flashed for the Mac, with full OpenGL acceleration. The only problem was that non-power-of-two texture handling sometimes caused problems when resizing windows. I resolved that by switching to Window Maker at the time.
The only thing you lose, obviously, is the firmware prompt. It was far more painful to get the partitioning set up right; it took me six months of trial and error.

Unfortunately, I never got a taste of running AIX with the better cards and trying a proper 3D application. I did run Blender under Linux, but ugh, the lack of AltiVec is very noticeable, also for VLC and other multimedia tasks.

IBM updated the graPHIGS documentation until 2007 (or perhaps even longer).
https://www.ibm.com/docs/en/ssw_aix_71/n...igstrf.pdf
graPHIGS also supports fully accelerated remote visualisation (I don't think that's the case with OpenGL) - CATIA used it.

I wanted to document it, and at some point even thought I was going to find a way to run CATIA. But meh... in the meantime I will document what I can on HP-UX - I will probably need access to some older workstation to document HP's API.
Shiunbird
Administrator

Posts: 553
Threads: 45
Joined: Mar 2021
Location: Czech Republic
08-18-2021, 09:46 PM
#5
RE: An overview of the GXT6500P and GXT4500P
(08-17-2021, 08:54 PM)Irinikus Wrote:  they require at least AIX 4.3.3 and higher


What's weird is that IBM references higher OpenGL (or rather 'GL') support elsewhere, particularly on early model RS/6000s:

https://www.ibm.com/common/ssi/cgi-bin/s...subtype=CA

Quote:The new PCI-based graphics adapters (POWER GXT110P, GXT500P, GXT550P, and GXT800P) provide outstanding 2D and 3D graphics performance for technical workstations and offer you an exciting new level of 2D and 3D price/performance. The GXT500P, GXT550P, and GXT800P adapters provide native support for the OpenGL and graPHIGS™ 3D APIs and additionally support the IBM GL 3.2 and PEX APIs. The POWER GXT250P and GXT255P support the new RS/6000™ PowerPC™ PCI-based workstations, the RS/6000 43P Series Models 140 and 240, and 7025 Model F40. The GXT110P and GXT250P now support the current RS/6000 Models E20, E30, and F30. IBM also offers the popular GXT1000™ integrated into the new Model F40 providing you with a powerful technical workstation in a single package. The IBM 7250 Model 002 POWER GXT1000 is now available on the new Models 140, 240, and F40.

And then later on we have this note:


Quote:OpenGL and GL 3.2 for AIX, Version 4.1.5 or PEX and PHIGS for AIX, Version 4.1.5


I'm too lazy to go through it, but there were other documents and PDFs mentioning OpenGL/GL 3.x and higher on AIX. Is GL even the same as OpenGL? I don't know. Is it only supported by the older GXT800P adapters? I don't know. Is GL 3.2 older than OpenGL 1.2? I don't know. But whatever the case, someone else can investigate these inconsistencies and curiosities.

There is this from the "history of OpenGL" page that might shed light on it.
Quote:OpenGL 3.0 adds the concept of deprecation: marking certain features as subject to removal in later versions. GL 3.1 removed most deprecated features, and GL 3.2 created the notion of core and compatibility OpenGL contexts.
Official versions of OpenGL released to date are 1.0, 1.1, 1.2, 1.2.1, 1.3, 1.4, 1.5, 2.0, 2.1, 3.0, 3.1, 3.2, 3.3, 4.0, 4.1, 4.2, 4.3, 4.4, 4.5, 4.6.
https://www.khronos.org/opengl/wiki/History_of_OpenGL
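To make the "is GL 3.2 older than OpenGL 1.2?" question concrete: if IBM's plain "GL" is, as it appears, a port of SGI's pre-OpenGL IRIS GL, then the two APIs' version numbers live on separate timelines and only release dates are comparable. A tiny sketch of that reasoning (the IRIS GL years are my rough assumptions for illustration; only the OpenGL dates are well documented):

```python
# "GL 3.2" and "OpenGL 1.2" are versions of two different APIs, so the
# numbers themselves say nothing; compare (approximate) ship years instead.
IRIS_GL_YEARS = {"3.0": 1988, "3.2": 1990}   # assumed years, pre-OpenGL
OPENGL_YEARS = {"1.0": 1992, "1.2": 1998, "1.2.1": 1998}

def predates(years_a, ver_a, years_b, ver_b):
    """True if ver_a of the first API shipped before ver_b of the second."""
    return years_a[ver_a] < years_b[ver_b]

# GL 3.2 predates OpenGL 1.2 despite the larger version number.
print(predates(IRIS_GL_YEARS, "3.2", OPENGL_YEARS, "1.2"))
```

The point stands under any reasonable choice of IRIS GL dates, since all of IRIS GL predates OpenGL 1.0 (1992).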

"Unlike Windows, OS/2 is a true operating system" - Steven S. Ross, How to Maximize Your PC
(This post was last modified: 09-25-2021, 05:26 PM by micrex22.)
micrex22
PS/2 it!

Posts: 144
Threads: 27
Joined: May 2018
09-25-2021, 05:24 PM
#6
RE: An overview of the GXT6500P and GXT4500P
(09-25-2021, 05:24 PM)micrex22 Wrote:  I'm too lazy to go through it, but there were other documents and PDFs mentioning OpenGL/GL 3.x and higher on AIX. Is GL even the same as OpenGL? I don't know. Is it only supported by the older GXT800P adapters? I don't know. Is GL 3.2 older than OpenGL 1.2? I don't know. But whatever the case, someone else can investigate these inconsistencies and curiosities.

GL is certainly older than OpenGL based solely on the version numbers and timeline.

I know some of IBM's products supported IRIS GL (SGI's proprietary predecessor to OpenGL). I suspect that (or a derivative) is what IBM is talking about with the references to plain GL. It's before the "Open" standard, but they wouldn't have wanted to call it "IRIS GL" and thereby acknowledge that they licensed it from a competitor. At least that's where I'd start looking, if I were doing the research.

SGI:  Indigo, Indigo2, Octane, Origin 300
Sun:  SPARCstation 20 (x4), Ultra 2, Blade 2500, T5240
HP:  9000/380, 425e, C8000
Digital: DECstation 5000/125, PWS 600au
jpstewart
Developer

Posts: 444
Threads: 6
Joined: May 2018
Location: SW Ontario, CA
09-25-2021, 11:24 PM
#7
RE: An overview of the GXT6500P and GXT4500P
(09-01-2018, 11:35 PM)micrex22 Wrote:  Some day in the future I may consider two GXT4500Ps if I want to dabble in AIX multi-monitor support, that is if my 185 ever becomes a workstation I like to use on a daily basis.

I suppose I should elaborate on this years later in the event someone bugs me about it (as I never publicly disclosed the tests). I'm not actually convinced there is a limit on any GPU in any PCI slot; as far as I can tell, IBM's statements are entirely arbitrary, made so that the systems don't suffer performance loss. You can load up as many cards as can physically fit, and the system(s) will take them all.

I ran two GXT4500Ps without issue, and four (Matrox) GXT135Ps without issue. With four GXT135Ps you can drive a T221, but not quite at full resolution, as even the Matrox cards don't support the maximum required quadrant on the four single-link DVI connectors. If you could fit four GXT4500Ps, that would be more than sufficient to drive a T221 display.
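For anyone wondering why driving a T221 takes this many connectors, the arithmetic is simple: single-link DVI tops out at a 165 MHz pixel clock, and 3840x2400 is a lot of pixels. A back-of-envelope sketch (the 25% blanking overhead is an assumed round figure; real timings and the monitor's own limits cap things below these upper bounds, hence the 14 Hz and 41 Hz modes seen in practice):

```python
# Upper bound on refresh rate for a 3840x2400 panel driven over N
# single-link DVI connections, each limited to a 165 MHz pixel clock.
PIXEL_CLOCK = 165e6          # Hz per single-link DVI (DVI 1.0 spec limit)
WIDTH, HEIGHT = 3840, 2400   # T221 native resolution
BLANKING = 1.25              # assumed overhead factor for blanking intervals

def max_refresh(links):
    """Theoretical max refresh (Hz) with `links` single-link DVI inputs."""
    return links * PIXEL_CLOCK / (WIDTH * HEIGHT * BLANKING)

for links in (1, 2, 4):
    print(f"{links} link(s): ~{max_refresh(links):.0f} Hz upper bound")
```

With one link the bound lands right around the T221's familiar 14 Hz single-input mode, which is why the panel splits its surface across multiple DVI inputs at all.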

It's all old tech that can't really do much (without a serious time investment that I'll never be able to afford), so this is the end of the road. A while back I was trying to get the recent Debian PPC installed on it, but the maintainer never set it up to be functional for non-G5 systems, and working around the errors was far too annoying (I lost my baby SCSI pins, so transplanting the disk into another U320 system is just not enjoyable, and I don't feel like converting it to IDE because you'd have to scramble the guts). *sigh*

Also, the Linux kernel does have GXT support now, thanks to a certain buddy who had it added in, so it is [now] a myth that Linux cannot run on non-Matrox GXTs.

"Unlike Windows, OS/2 is a true operating system" - Steven S. Ross, How to Maximize Your PC
micrex22
PS/2 it!

Posts: 144
Threads: 27
Joined: May 2018
04-05-2024, 10:51 AM
#8
RE: An overview of the GXT6500P and GXT4500P
I have two T221s (I've owned three), and I've tried all I could with them.

The two I currently have are:
- one 2x dual-link DVI, overclocked to 60Hz.
- a gen1 4x single-link DVI, at 41Hz.

I tried getting it to run at full resolution over 1x DVI at 14Hz under HP-UX 11.11 (FireGL in a C8000), and also tried 1x dual-link DVI at 30Hz and 2x single-link DVI.
In both cases, the system configurator detects the full resolution and I can configure it, but X never starts; it gets stuck at 100% CPU.

However, I can run it perfectly at 2560x1600 60Hz (that's the resolution I actually run HP-UX at), so maybe there's some limitation in X11 that causes it to poop.

I was obsessed with the T221s a few years ago and documented all I could on YouTube, and grabbed everything I found on archive.org, including IBM running a visualisation engine at full resolution. I never found the video of Quake at 4K, though. =(

So I am curious whether they really ran it at 4K under AIX, or used Linux, or if it's a driver/X11 version issue. The FireGL should be able to do it - it's a much newer card. And it works under Linux. I had it running once, but without hardware acceleration (a known issue for the FireGL on hppa under Linux).

When I had AIX 7.1 running, I don't recall OpenGL newer than 1.2. And I believe multiple GPUs would do the trick - it just takes a bit of X conf wizardry, as it gets very annoying if the OS sees multiple monitors and multiple canvases rather than all the monitors combined into a single canvas.
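On a Linux/Xorg-style server, the usual trick for combining the heads into one canvas is Xinerama in the X config; AIX's own X server is configured differently, so treat this purely as an illustrative sketch (identifiers, driver choice, and BusIDs are placeholders to be replaced with real values from your system):

```
# Hypothetical xorg.conf fragment: stitch two heads into a single canvas.
Section "ServerLayout"
    Identifier "Stitched"
    Screen 0 "Left" 0 0
    Screen 1 "Right" RightOf "Left"
    Option "Xinerama" "on"      # one combined canvas, not two desktops
EndSection

Section "Device"
    Identifier "Card0"
    BusID "PCI:1:0:0"           # placeholder - check your bus layout
EndSection

Section "Screen"
    Identifier "Left"
    Device "Card0"
EndSection

# ...plus matching "Card1"/"Right" Device and Screen sections
# for the second adapter.
```

Without the Xinerama option, each Screen becomes its own independent canvas, which is exactly the annoyance described above.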

(Sorry for going off-topic - I see T221, my eyes glow and my heart beats with joy)
Shiunbird
Administrator

Posts: 553
Threads: 45
Joined: Mar 2021
Location: Czech Republic
04-07-2024, 06:11 PM
#9
RE: An overview of the GXT6500P and GXT4500P
(04-07-2024, 06:11 PM)Shiunbird Wrote:  So I am curious whether they really run it at 4K under AIX or used Linux or if it's a driver/X11 version issue. The FireGL should be able to do it - it's a much newer card. And it works under Linux. I had it running once, but without hardware acceleration (a known issue for the FireGL in hppa under Linux).

When I had AIX 7.1 running, I don't recall OpenGL newer than 1.2. And I believe that multiple GPUs would do the trick - it only takes a bit of x.conf wizardry as it gets very annoying if the OS sees multiple monitors and multiple canvases rather than all monitors combined into a single canvas.
Yeah, I was referring to the multiple monitors (it's the same under Windows and IRIX too: you feed it all of the independent quadrants and stitch them together).

I don't really know how some of that old tech would respond to a real 4K display.

"Unlike Windows, OS/2 is a true operating system" - Steven S. Ross, How to Maximize Your PC
micrex22
PS/2 it!

Posts: 144
Threads: 27
Joined: May 2018
04-10-2024, 07:07 AM
#10
RE: An overview of the GXT6500P and GXT4500P
I never managed to get it to work with modern Windows.

It insists on automatically picking the highest resolution, meaning 3840x2400 @ 14Hz.
Then, when I plug in a second DVI, it goes bananas.

If I go to the legacy monitor settings and force 1920x1200, then when I add the 2nd DVI, that second DVI gets set to 3840x2400, the monitor doesn't sync, and I can't change settings.

The NVIDIA Quadro multi-monitor wizard got severely dumbified over the years, and I can't get it set up there either =(

Then I installed a 2nd GPU and connected the 4x DVI to a Quadro and a standard monitor to an RX580, and when I try to set up the T221, all the settings panes, even the legacy resolution/refresh-rate picker, go to the out-of-sync monitor.

And someone at MSFT has shit for brains. When you set the resolution in the new monitor settings app, if you put 1920x1200, it keeps syncing at the highest resolution and does the downscaling blur in software. Seriously, wtf. Whatever resolution you set, only the canvas changes; you are not actually setting the monitor. =(

Nowadays I only get it to work via X conf. I've heard there are ways, via editing the monitor INI file or regedit, to force Windows not to go for the highest possible resolution, but meh, I gave up.
Shiunbird
Administrator

Posts: 553
Threads: 45
Joined: Mar 2021
Location: Czech Republic
04-10-2024, 03:00 PM

