Why all the work on Ultra Mode and no SLI?
City of Heroes Ultra Mode uses the OpenGL 3.x API for its graphics and, at the time of launch, was the only commercially launched game to utilize the OpenGL 3.x API.
As such, neither Nvidia nor AMD has had their hands on fast-paced OpenGL 3.x rendering samples to optimize their drivers against. They haven't had "time" to optimize their drivers for multi-GPU support atop OpenGL 3.x.
There are other titles out there now leveraging the OpenGL 3.x and 4.x APIs, such as the Unigine Heaven Benchmark and Valve's Source Engine. id's Rage will also reportedly be driving an OpenGL 3.x rendering path for the PC release, although it will be using an OpenGL 2.x rendering path for the Xbox 360 / PlayStation 3 releases.
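Side note for the technically curious: a 3.x context is something an engine has to explicitly ask the driver for; you don't get one by default. Here's a rough sketch of that handshake using the GLFW library (my choice purely for brevity — this is an illustration, not CoH's actual startup code, which nobody outside Paragon has seen):

```c
/* Sketch: explicitly requesting an OpenGL 3.x context.
 * GLFW is used for brevity; CoH's real startup code is not public,
 * so treat this purely as an illustration of the API handshake. */
#include <stdio.h>
#include <GLFW/glfw3.h>

int main(void)
{
    if (!glfwInit())
        return 1;

    /* Without these hints, most drivers hand back a legacy 2.x context. */
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);

    GLFWwindow *win = glfwCreateWindow(640, 480, "GL 3.x test", NULL, NULL);
    if (!win) {
        /* The driver couldn't satisfy the 3.x request. */
        glfwTerminate();
        return 1;
    }
    glfwMakeContextCurrent(win);

    /* Confirm what the driver actually gave us. */
    printf("GL_VERSION: %s\n", (const char *)glGetString(GL_VERSION));

    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}
```

The point being: everything past that request, multi-GPU scheduling included, lives on the driver's side of the fence.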
Now, it is technically possible for a developer to make a game sabotage, or make better use of, various multi-GPU modes. For example, when Borderlands launched, it was deliberately sabotaged at the code level against AMD Crossfire configurations.
During the I17 Beta, CoH also had a special command-line-accessed rendering mode enabled on the test server, taking advantage of 2x SLI setups. As of the current test server revision, this command-line mode seems to have been pulled.
In the ever-popular words of the Devs: "Soon". They have been working on it; it's just not ready. During the I17 Beta, they actually had a working, if very unsupported, switch to force SLI 'on' in CoH. From what I hear, it didn't improve much, but it was a step in that direction. So, have patience.
It is known that there are an infinite number of worlds, simply because there is an infinite amount of space for them to be in. However, not every one of them is inhabited. Therefore, there must be a finite number of inhabited worlds. Any finite number divided by infinity is as near to nothing as makes no odds, so the average population of all the planets in the Universe can be said to be zero. From this it follows that the population of the whole Universe is also zero, and that any people you may meet from time to time are merely the products of a deranged imagination.
short version: Multi-GPU is normally accelerated through the driver itself. This is not a case of CoH not supporting SLI or Crossfire. It is a case of the Driver Vendors not having added Multi-GPU support.
SLI technology can be enabled for every gaming application, including both OpenGL and Direct3D gaming applications. SLI technology provides either 3D performance scaling using alternate frame rendering (AFR) or split-frame rendering (SFR) or increased visual quality using the SLI Antialiasing mode. In order to provide the optimal 'out-of-box' experience for its customers, NVIDIA has created an extensive set of optimized game profiles which enable SLI scaling automatically. The full list of these SLI-optimized games can be found here.
NVIDIA® SLI technology can be enabled for every gaming application. In addition, to provide the optimal 'out-of-box' experience, NVIDIA has created an extensive set of over 500 custom application profiles which enable SLI technology automatically and optimize scaling performance. These optimized applications, shown below, are enabled automatically with no control panel changes required.
Okay, so you can copy from Nvidia's press-site.
That still does not change what I said.
You said it is not supported; Nvidia says it is. You're saying the reason SLI is not supported in CoH is because it uses OpenGL 3.x, but according to Nvidia, 3.x is supported with SLI...

Originally Posted by ShadowNate
;_; ?!?! What the heck is wrong with you, my god, I have never been so confused in my life!
So if I go back to non-Ultra Mode I should get SLI then... or Crossfire?
They said OpenGL, not OpenGL 3.x. Also, according to that press release, SLI just magically works and the Devs don't have to do anything, so this thread shouldn't exist.
Bingo.
Let me spell this out, B_Witched: Nvidia. Lies.
They've disabled PhysX support in their drivers if you don't use an Nvidia card to render a scene... Not really a "big" deal since Intel killed off the Larrabee add-in cards and that means you'd have to be one of those users with an ATi primary card and an Nvidia secondary card.
They've been caught with their pants down sabotaging competitors graphics cards in games like Need for Speed, Borderlands, and Batman Arkham Asylum. Again, maybe not the biggest deal since again, the only competitor is AMD/ATi.
Oh... and they've got numerous class-action lawsuits over multiple product recalls of literally exploding laptops... where Nvidia outright lied to vendors about just what the actual thermal outputs of their chipsets were. And in keeping with that theme, yet another vendor issued a recall for yet ANOTHER line-up of EXPLODING laptops... THIS WEEK.
I'm sorry, it's not that I'm pro-AMD/ATi. I just hate companies that outright play nasty every chance they get. As of late, Nvidia's been paying game developers to sabotage game code so that it breaks on competitors' equipment; refusing to tell OEMs and ODMs just how hot their chips actually run, letting sub-standard cooling solutions hit the market; and essentially telling add-in board vendors that if they want to make a profit, they'll have to use cheaper board materials (Sparkle, Zotac, Galaxy).
Okay. Fine. I'm biased. Nvidia won't support what actually matters to me. They won't release the specifications on their graphics cards. They won't recognize the Nouveau driver. They have no open-source / Linux strategy beyond a binary driver.
Hell, even Intel has partially won me over with the i7 processor and with what Larrabee could have done. I really think Intel made a bone-headed mistake pulling Larrabee from the market; it could have been the graphics tech that lit a fire under both AMD and Nvidia. The i7's a really good processor, and you no longer get reamed up the rump on the infrastructure costs of building an Intel rig. That, and Intel's actually pretty damn good on the open-source front. Sure, their normal IGPs resemble a hybrid mix of a Rage 128 and a Hoover on Windows, and the Linux driver is even worse... but at least they decided to match what AMD does.
I'm getting away from the point here.
Just because Nvidia says SLI support is automagic across DirectX and OpenGL, and should work no matter what... DOES NOT MEAN IT ACTUALLY WORKS THAT WAY.
First: not every game is going to benefit from AFR.
Second: support still needs to be entered in at the driver level. Nvidia hasn't had real games to optimize OpenGL 3.x for.
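To make the first point concrete: AFR only scales when frames are independent of each other. Here's a hypothetical sketch (draw_scene and swap_buffers are placeholders I made up, not CoH functions) of the kind of per-frame readback that chains frame N+1 to frame N and leaves two GPUs running in lockstep:

```c
/* Sketch: an inter-frame dependency that defeats AFR scaling.
 * If every frame reads back the previous frame's result, the GPU
 * drawing frame N+1 can't start until the GPU drawing frame N is
 * done -- the two cards run in lockstep instead of in parallel.
 * Hypothetical render loop, NOT actual CoH code. */
#include <GL/gl.h>

extern void draw_scene(void);   /* placeholder: the game's draw calls */
extern void swap_buffers(void); /* placeholder: platform buffer swap */

static unsigned char last_frame[640 * 480 * 4];

void render_loop_bad_for_afr(void)
{
    for (;;) {
        draw_scene();

        /* Forces a full pipeline flush AND makes frame N+1's input
         * depend on frame N's output: poison for alternate frame
         * rendering, no matter how good the driver profile is. */
        glReadPixels(0, 0, 640, 480, GL_RGBA, GL_UNSIGNED_BYTE,
                     last_frame);

        swap_buffers();
    }
}
```

No driver profile can paper over that; the game itself has to stop depending on the last frame's backbuffer before AFR buys it anything.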
Okay, there is a fair point in that, yes, OpenGL 3.0 has been out for a couple of years now, and quite frankly both AMD and Nvidia could have used their own internal demos to prepare for potential rendering methods... and there are some really talented coders on Rage3D and Nzone who could have provided a quick and dirty OpenGL 3.x rendering demo to work against... so... yeah, I think both companies have a bit to answer for on why the support is so poor.
On the other hand, the gaming industry has been chasing after DirectX like it was some kind of liquid crack... not least because OpenGL stagnated like a Georgia swamp when SGI decided to take a nap... only to find out that when you piss off several million gamers still using Windows XP (hello, Vista-only DirectX 10), it does bad things to earnings reports.
Now, I could go into the full-on spiel here about why DirectX, and any proprietary platform API, is a bad idea... but I don't think anybody wants to read lecture #42 again.
Hence the resurgence now of visible commercial products turning back to OpenGL. Khronos, as an organization, has gotten far more involved in driving OpenGL adoption, and when you can present the only selection of programming APIs that will allow a programmer to hit any platform, regardless of OS, that tends to sell itself.
Here's the thing:
Anytime you read something from AMD, Intel, or Nvidia, have a barrel of salt on hand.
Then talk to the guys who actually write the code.
Granted, I will admit that hanging around the X.org dev channels on IRC is a bit... scary.
SLI / Crossfire support was never triggered for CoH in the past.
I really don't know why Nvidia didn't support it, given that the game was one of their "babies" at launch... Given that games like Doom 3, Prey, and Quake 4 could leverage SLI, in some cases with some amazing performance results... I know it wasn't an issue with CoH's usage of OpenGL as a rendering API.
So, I couldn't tell you why Nvidia never did support SLI, in any form, with the old graphics engine.
Now, given that Nvidia is supposed to be the champion of multi-GPU support... I mean... seriously... you think multi-GPU, you think SLI... I would have thought they would have been chomping at the bit to have CoH's Ultra Mode running in SLI at launch.
I'm guessing that Nvidia still has engineers working on it, and indications from Television suggest SLI support could be arriving later this year. It may, for reasons I'm not sure I understand, ultimately require a hard switch, as the test server build did.
As far as AMD/ATi goes... back then, ATi couldn't have cared less about OpenGL.
ATi's OpenGL support was actually so bad in its pre-2007 state that AMD pretty much ordered the old engine junked, and they've been through at least one more complete engine re-write since then for OpenGL support. I think I'm also right in saying that the OpenGL 4.0 driver is using yet another completely new engine. I could be wrong there... Terry Makedon runs screaming from AMD's headquarters any time I send him an email. (No, I don't know if he actually does run, but it has been hinted in the past that one of my emails did generate a hole in a wall.)
Anyways, the rumor is that AMD does have a Crossfire solution in the works for CoH. However, this solution will likely only be for the Radeon HD 2xxx series and newer. If you are running Crossfire on an x800, x1800, or x1900 series card, it probably won't work. On the other hand, I know what the Crossfire editions of those cards sold like... and I'm pretty sure that AMD/ATi isn't exactly concerned with Crossfire support on them.
For that matter, I know what the HD 2000 series sold like too... and honestly... I'm not sure AMD/ATi should bother.
Anyways, my theory is that SLI / Crossfire don't work with the original graphics engine because the code was fundamentally hacked apart to deliver certain effects on various cards, such as Intel's integrated graphics. The graphics engine has been described to me as a load of rusty plumbing with band-aids every 2 inches.
The Ultra-Mode update reportedly cleared out a lot of the old custom patchwork code, which promptly generated problems for many users with older graphics cards / Intel graphics accelerators.
Theoretically, the new engine should be easier to accelerate in a multi-GPU environment.
Again, this is largely dependent on what the driver sees and is programmed for.
Multi-GPU support isn't automatic, no matter who says it is. There are certain aspects of a 3D engine that need to be coded to take advantage of a multi-GPU setup. A game needs to be coded to blast as many graphics calls at once as it can, so that while one GPU is busy digesting one frame, the CPU can be blasting graphics calls to the second card for the next frame.
Simply put, the 3D engine in this game was coded back in the early 2000s, and while they may have added additional effects on the GPU side of the pipeline, the CPU side still isn't what it should be to get a high scaling factor across multiple GPUs. It's all about the underlying structure of how all the objects in the zones and levels are designed and stored, and that's not the same way you would design for an FPS.
Edit: And the funny thing is, coding a game so it can handle multiple GPUs can make it perform worse on a single GPU by eliminating the natural parallelism between the CPU and GPU.
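To put that in rough code terms (a hypothetical loop of my own, not CoH's — build_and_submit_frame and swap_buffers are placeholders): the "natural parallelism" is the CPU building frame N+1 while the GPU is still drawing frame N. An explicit sync every frame throws that overlap away, and it's exactly the kind of thing a naive multi-GPU retrofit can introduce:

```c
/* Sketch of the CPU/GPU parallelism described above. Hypothetical
 * code; build_and_submit_frame and swap_buffers are placeholders. */
#include <GL/gl.h>

extern void build_and_submit_frame(int frame); /* placeholder draw calls */
extern void swap_buffers(void);                /* placeholder swap */

void render_loop(int sync_every_frame)
{
    for (int frame = 0; ; ++frame) {
        /* The CPU queues frame N's commands; the GPU may still be
         * busy drawing frame N-1 (or N-2, with two cards doing AFR). */
        build_and_submit_frame(frame);
        swap_buffers();

        if (sync_every_frame) {
            /* Blocks the CPU until the GPU goes completely idle.
             * On one GPU this wastes CPU time that could be spent
             * building the next frame; on two GPUs it guarantees
             * there's never a second frame in flight for AFR. */
            glFinish();
        }
    }
}
```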
Father Xmas - Level 50 Ice/Ice Tanker - Victory
$725 and $1350 parts lists --- My guide to computer components
Tempus unum hominem manet