Yes and no.
Quote: Really? Do you have anything linking to this?? I ask because I use the website a lot and have always found them to be accurate. Also, I couldn't find anything on the web about this, though that may just be my search abilities.
Yes, there is evidence still on the web, like this piece by Charlie from The Inquirer back in 2006 : http://www.theinquirer.net/inquirer/...-hardware-rant
Other evidence can be found on some blogs, like this 2007 post : http://scientiasblog.blogspot.com/20...-its-soul.html
And the Inquirer brought it up again in 2007 as well : http://www.theinquirer.net/inquirer/...-hardware-sold
The allegations first came around back in 2001, and the first proof reached the web, and I think Slashdot, in 2002. The topic was discussed again when THG was up for sale, since one of the hopes was that THG would regain its credibility.
***
and no. The old slashdot discussions and forum postings I had saved now go to defunct links, such as ThatForum, StarEmu, and the old Gamenikki.com forums. Theoretically the content may still be somewhere in http://www.archive.org/index.php ... which, as far as I know, search engines such as Google, Yahoo, and Bing / LiveSearch don't index.
***
edit: yes. I know. Citing TheInquirer is a bit like citing a politician. You just know it's going to come back and bite you on the rump. TheInq, and Charlie in particular, also have reputations for stretching the truth. Not as bad as TheReg's Andrew Orlowski, whose posts could often be used as a foundation for a drinking game. The problems with THG's credibility were actually one of the few cases where Charlie, a couple of his hardware reviewers at the time, and I agreed on something. I'm a bit sad I didn't save the information as it happened... but I don't think it really occurred to anybody back in 2001 or 2002 that the information put on the web would one day be unavailable or dead-linked. -
part of me wants to know what your slotting is to do that.
The other part of me says... I can't afford it even if I did know -
Steam's Black Friday Weekend continues. Today one of the deals is City of Heroes: AE edition for $10.
Right now Steam's site seems to be under a DDOS as everybody checks the deals, but drop by here if you have some gift giving to plan : http://store.steampowered.com/early-holiday -
... I think if I saw somebody actually wearing those in public, I'd either asphyxiate myself by laughing too hard, or look for the cameraman shooting the latest Vanilla Ice music video.
Quote: I was browsing through Brooks Brothers online when I found this amazing product.
I wish I had someone to buy these for. Anyone else find fantastic products? -
mewit. You have private messages turned off.
First of all, I wouldn't touch Alienware with a 10-foot pole. They aren't exactly the... premium... vendor they were a couple of years back. Their buy-out by Dell saw several cuts in product quality, and the customer service went from an industry best to typical Dell levels. Which means if you have a problem and you pick up the phone, you get India.
Now, I do run a small business building custom computers, but I'm pretty sure the forum terms frown on self-advertising.
If I were given $1300 to spend on a computer, I'd buy most of the stuff from Newegg: the processor, motherboard, heatsink, hard drive, memory, graphics card(s), optical drive, and so on. I made a quick list here of stuff I'd spend my own money on: $1300 Computer. (Note: I think you'll need to be logged in to Newegg to read it.)
That's about $1173. I'd drop another $100 on one of Thermaltake's refurbished power supplies, because you still get a 2-year warranty and save a chunk of cash.
Which leaves some money left over for the chassis, and I'd simply point you to a listing of available chassis and ask what you want.
Now, there are a couple of problems with this wish list that you'll actually run into with smaller computer vendors. I'm not eligible for Microsoft's discount OS program, so in order for me to sell copies of Microsoft Windows, I have to pass the full cost of the operating system over to you if you go with Windows. Legally speaking, I could sell you the OEM version of Vista Home Premium, which is just over $100. The full version that Microsoft would prefer I re-sell is $190.
So we've shot over $1300 and are closer to $1400, if not $1500, if you want Windows. Then there's the problem of shipping: how exactly do I get this to you in one piece... without panicking that UPS or FedEx is gonna screw something up?
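To put the running total in one place, here's a quick bit of arithmetic using the figures above; the chassis price is a placeholder guess on my part, not a real quote:

```c
#include <stdio.h>

int main(void) {
    /* Figures from the build above; the chassis price is a placeholder guess. */
    const double parts   = 1173.0;  /* Newegg wish list                  */
    const double psu     =  100.0;  /* refurbished Thermaltake PSU       */
    const double chassis =   80.0;  /* hypothetical mid-range case       */
    const double os_oem  =  105.0;  /* OEM Vista Home Premium, roughly   */
    const double os_full =  190.0;  /* full retail version               */

    printf("With OEM Windows:  $%.0f\n", parts + psu + chassis + os_oem);
    printf("With full Windows: $%.0f\n", parts + psu + chassis + os_full);
    return 0;
}
```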
Now, I could re-sort this list by buying similar components but choosing lowest price instead of best rating or what I know to be good, and I could drop from Crossfired Radeon 5770's to a single RadeonHD card, and easily undershoot the Dell Alienware system.
Which is one of the huge problems I have with Alienware. Dell uses the cheapest components possible now. Really, with what was listed on that system, you should be paying closer to $1600 or $1700 for the hardware alone. That it's a couple hundred off either screams Black Friday or something is catastrophically wrong. -
Yes / no / not really.
Quote: Ok, here's my question. I have a 9800 GTX right now on a quad-core CPU (2.8 GHz) with 8 gigs of RAM.
Would just getting a 2nd 9800 GTX for SLI be a better choice than trying to spend a lot on a 275?
Yes, two 9800 GTX's in SLI are pretty powerful, but you are dependent on software support of multi-gpu rendering. It'd be a cheaper way to gain more performance while you wait for Nvidia to go bankrupt or get bought out. (and no, that's actually not a joke, I think Human Being addressed the financial problem Nvidia has trying to move GTX chips right now... or somebody did in the thread)
No, I'm pretty sure that a single GTX 275 would outrun two 9800's. Granted, I can't actually test this. I don't have a GTX 275 on hand, although I do have GTS 250's... which are pretty much just die-shrunk rebadged 9800's.
I'll get to the Not Really at the end.
Quote: After reading Tom's Hardware Guide on where video cards stand, I'm reluctant to purchase any of the 2xx series at this time.
********************************************
Okay, here's the Not Really part for those not interested in a soft analysis.
Not Really: it's a toss-up. If you're an Nvidia fan, well. To broach the subject again, it's really questionable whether or not Nvidia is actually going to matter in a few months. Fermi's going to be launching against Larrabee and shortly before the product refreshes on the RadeonHD 5x00 series. Intel and AMD are also going to be pushing CPU's with integrated GPU's in 6 to 8 months... and that's going to once again create a three-horse race in the low to mid-range graphics processor segment. I'm not entirely sure Nvidia can survive as a vendor of gaming graphics cards if its traditional bread-and-butter market, the low-end market where the likes of the Geforce2 MX, Geforce4 MX, Geforce FX 5200, and Geforce 6600 dominated, goes away.
Nvidia's going to have to launch Fermi with parts in all segments, from the low end to the high end, and deliver them in large quantities. Since they are dependent on TSMC, that's... not really something I'm sure Nvidia can pull off. AMD is having enough problems getting its own completed 40nm parts out of TSMC.
In order for Nvidia to survive, they are going to need a megabucks deal. One of the hot rumors going around now is that Nvidia has won the contract for the next Nintendo handheld. Personally, I doubt it, given Nintendo's decades-long relationship with AMD. Remember, the N64 was created with help from SGI, and that SGI team left to become ArtX. ArtX did the design for the Gamecube's LSI, but was bought up by ATi shortly before 2003. ArtX turned ATi around, helping to launch the Radeon 9700 series of cards and revamp ATi's... broken... driver program. ATi then did the graphics design for the Wii, something AMD was rather grateful for after the merger, as Nintendo's profits helped shore up crashing processor revenue streams. At one time ATi had a contract to do the GPU for the so-called GameBoy 2, a console that was halted, then dropped, after the DS went from side-toy to main-show.
With this kind of history, Nvidia would have to offer an astoundingly good deal to beat what has been a financially successful string of consoles for Nintendo. So I don't think this kind of guaranteed-income deal is in the cards for Nvidia.
We also know that Nvidia's president has very kind words for Apple, and has been developing mobile platforms targeted towards the markets serviced by PowerVR / Imagination... a company that Apple has invested in. Some have suggested Nvidia is trying to maneuver themselves into a position to be bought out by Apple.
Now, with this sort of background, and multiple questions surrounding what's going to happen to / with Nvidia, the sensible thing to do is wait it out. Wait for Fermi parts to be delivered.
TL;DR version: Honestly, I'd save your money for next year rather than rushing to upgrade now. -
heh, I always refer to Infernia as The Queen of Yap and Glacia as The Princess of Yap.
-
Yes and no.
Quote: I know this probably wants to be asked by a few people, but I'm itching to ask now.
I'm useless at graphics cards and what their equivalents are, so how does anyone here think a 2GB NVIDIA GeForce 9500 GT will handle this based on Posi's recommendations? I'll be looking to upgrade if I need to, but am so damned curious as is.
Yes, you'll be able to run all of the Going Rogue features. There's nothing in the feature sets of the cards listed by Mr. Miller that is not found in their lower-end counterparts.
However, the question is performance. You may have to accept a low resolution, like 1024*768, in order to obtain acceptable frames-per-second performance.
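To put a rough number on why resolution matters so much, here's a back-of-the-envelope sketch. It assumes rendering cost scales roughly with pixel count, which ignores CPU limits and memory bandwidth, so treat it as illustration only:

```c
#include <stdio.h>

int main(void) {
    /* Fill-rate-bound rendering scales roughly with the number of pixels drawn. */
    const long low  = 1024L * 768;   /*   786,432 pixels */
    const long high = 1680L * 1050;  /* 1,764,000 pixels */

    printf("Pixel ratio: %.2fx\n", (double)high / (double)low);
    /* If a card manages ~20 fps at 1680x1050, it has ~2.2x fewer pixels
       to shade per frame at 1024x768 -- very roughly ~45 fps.           */
    printf("20 fps at 1680x1050 -> ~%.0f fps at 1024x768 (very rough)\n",
           20.0 * (double)high / (double)low);
    return 0;
}
```
-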
Can do. It will probably take me a bit to get something set up that's... benchmarkable... across all 3 systems. I've gotta work LaserTron support for Adventure-Crossing tomorrow, so I'll likely not have something till Sunday or Monday.
Quote: Can you tell us if there is *any* difference between single mode and multi-GPU mode right now? Everyone has always stated that CoX will not take advantage of multi-GPU setups, but I haven't seen any real benchmarking on whether there is any effect on CoX (+ or -). -
Oh yeah, something else while my mind is headed in that direction.
Human Being brought up the 300 watt limit that the GTX 295 tries to stay under by using two GTX 275 chips. AMD's 5970 is a similar two-chip / single-card monster, and is also limited to 300 watts. However, AMD is reportedly cherry-picking the HD 5870 processors that go into the 5970, picking the ones best suited to overclocking. The stock heatsink is also designed to accommodate 400 watts of heat output. If you don't mind smashing through the 300 watt limit, the 5970 will, by default, run at stock 5870 speeds for each chip, and generally reach higher clocks.
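As a rough sanity check on that 300 watt figure, assuming the commonly cited ~188 watt board power for a single HD 5870 (an approximation on my part, not a number from this thread):

```c
#include <stdio.h>

int main(void) {
    /* Approximate board-power figures; treat these as ballpark numbers. */
    const double hd5870_watts = 188.0;  /* single HD 5870, approx.            */
    const double pcie_ceiling = 300.0;  /* what the PCIe spec allows per card */

    double two_chips = 2.0 * hd5870_watts;
    printf("Two 5870-class chips at stock speeds: ~%.0f W\n", two_chips);
    printf("Over the 300 W ceiling by:            ~%.0f W\n", two_chips - pcie_ceiling);
    return 0;
}
```
Which is presumably why the 5970 ships with both chips clocked below stock 5870 speeds in the first place.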
One concern is that AMD cherry-picking processors for the 5970 might mean regular 5870 cards suffer on the available clock-speed headroom front. If you aren't in the market to overclock... and seriously, why would you need to overclock a 5870 to begin with... this really isn't a big deal. -
real quick:
No, you can't, or at least not anymore. Nvidia disabled this with recent driver updates. It's one of the reasons why Intel and AMD are pushing OpenCL as a gaming physics solution.
Quote: Here's the kicker though: you can do the same thing with an ATI card as main GPU!
(still reading the rest of the thread since my last post)
... and I see you actually mentioned this as I read further.
Quote: Would anyone know how well COH does with Crossfire using AFR these days? Do you know, by chance? I couldn't find anything, but I don't know if that's my search-fu being weak or there not being much.
This will likely change as of Going Rogue as the new engine supposedly will take advantage of multi-gpu rendering.
Quote:Another good company is XFX.
Sooo.. quick note on Sapphire. Sapphire's traditionally been a bit closer to ATi than other vendors. Back when ATi was selling their own branded cards, Sapphire was the actual manufacturer. One of the things to keep in mind is that for a long time ATi didn't sell chips to 3rd party vendors.
However, I did pay for that cheapness on the RadeonHD 4850's I picked up. They have the most annoying fans I've heard since the 6600 GT's I had, and remind me quite a lot of an FX 5800 Ultra I borrowed. At full tilt it's like a banshee howling fest.
Quote:Just out of interest, any opinions on the Geforce 295? I'm looking to build a new hi-spec system (yay tax deductions!) and have seen that going as part of certain deals.
A GTX 275 starts around $240.
A good one with 1.7gb of memory is around $320.
A GTX 295 starts around $530.... well, the only one Newegg has is $530.
So you're basically paying a $50 premium to use up 2 slots in your computer rather than 4 slots... since I'm not aware of any single-slot GTX 275's.
Now, if you've got the cash, yeah, the GTX 295 is a freaking monster. And the one on Newegg is actually cheaper than a 4870x2 I found on Dell.
But I can get a Stock 4870 from XFX for around $154. Two of those are only $308...
I can also find RadeonHD 4890's for around $200... and two of those in Crossfire are going to set you back around $400... and you'd be running away from GTX 285's just on single cards.
So, I wouldn't be buying a GTX 295. I can save a lot more money by being willing to sacrifice space in the case. -
Just commenting on this, but I wouldn't really cite Tom's site as a reference on how processors perform. THG has been busted multiple times in the past accepting vendor money to slant reviews.
Quote: I hate to get involved in religious wars but here is some objective information for cpu comparison
for general performance
http://www.tomshardware.com/charts/2...tage,1394.html
here we go for gaming related
http://www.tomshardware.com/charts/2....0.2,1396.html
Quote: The problem I'm having with ATI cards is the fact that half the games I play besides CoH have a page-long list of issues with ATI cards. Yeah, the new cards run rings around nVidia cards, but at least nVidia runs a lot more stable than the ATI cards. And the sticky for playing CoH with ATI cards still scares me away from buying a new one.
Quote:Firstly, If I do end up going for the ATI cards, it will be my first ATI Card. Is there anything I would need to know about differences between nVidia and ATI?
Quote: Second, what company should I shoot for with the card's board? I'm leaning towards the ASUS board, since I do have some experience with their products, but any information you could give there would be helpful.
Quote:And, Lastly, the ATI card has 1GB memory while the nVidia card I selected has 2GB. Will this difference be noticeable in the performance of the card? -
*shifty eyes to the left and to the right*
Quote: The name i marked with the orange color actually gave me an idea of a toon...
A Broadsword/Willpower scrapper, minimal to no clothes, named Weapon Ecchi... ^_^
Ecchi Shibari! The demon summoning Mastermind out for a stroll through Atlas Park! -
There's really not much performance difference between a PCIE x16 and a PCIE 2.0 x16 slot. As much as I detest referencing Wikipedia, I have no desire to scrounge through PCI-SIG's site for the correct data, so take a look here: http://en.wikipedia.org/wiki/PCI_Exp...CI_Express_2.0
Quote: Ya, that's the board I currently have. I was just wondering what the PCIe 2 meant over it just being an x16 PCIe slot.
I would hate to buy an expensive card and not use it to its full potential.
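If you want the raw numbers behind that Wikipedia link, the per-direction bandwidth math is simple enough to sketch out; both generations use 8b/10b encoding, so 20% of the raw signalling rate is overhead:

```c
#include <stdio.h>

int main(void) {
    /* Per-lane, per-direction bandwidth = transfer rate * 8/10 (8b/10b encoding). */
    const double gen1_gtps = 2.5;  /* PCIe 1.x: 2.5 gigatransfers/s per lane */
    const double gen2_gtps = 5.0;  /* PCIe 2.0 doubles the signalling rate   */
    const int    lanes     = 16;

    double gen1_gbs = gen1_gtps * 0.8 * lanes / 8.0;  /* GB/s for an x16 slot */
    double gen2_gbs = gen2_gtps * 0.8 * lanes / 8.0;

    printf("PCIe 1.x x16: ~%.1f GB/s each way\n", gen1_gbs);  /* ~4 GB/s */
    printf("PCIe 2.0 x16: ~%.1f GB/s each way\n", gen2_gbs);  /* ~8 GB/s */
    return 0;
}
```
So a PCIe 2.0 card dropped into a plain x16 slot still has roughly 4 GB/s each way to work with, which is why the real-world difference in games is small.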
Quote:Ya know, I have to ask "What constitutes a decent processor these days?"
While I've no doubt about i5 and i7's, I was more curious about AMD's offerings these days. I'm debating upgrading my poor old C2D 6300 @ 1.8ghz, and the Athlon X4 620 is looking good.
For the purposes of gaming in resolutions of 1280*720 (720p) or 1440*900, two common resolutions for low end LCD's, even Socket 754 or late Pentium4's can deliver enough punch for most games.
For the purposes of gaming in resolutions like 1680*1050, 1920*1080 (1080p), or 1920*1200, processors from the Intel Core2 Duo lineup and the AMD Socket 939 and AM2 dual core CPU's can deliver enough back-end power.
Right now, for most shipping games against the Socket AM3 Phenom's or Intel I5 / I7 processors, you'll run into a limit on your graphics card long before you run into the limit on your processor.
From a price-for-performance standpoint I have a hard time suggesting people buy Intel processors and motherboards. Earlier in the thread I was talking about the Intel D975XBX. This board was a contemporary of the Asus M2R32-MVP, as both were launched in 2006. The M2R32-MVP cost... well... quite a lot less than the Intel board. I think I paid around $130 for the Asus board, and the Intel board was somewhere in the neighborhood of $250 if memory serves correctly. The Asus board had more PCIE bandwidth for each PCIE slot (both slots were 16x; the slots on the Intel were 4x, 8x, then 16x, and if you ran in Crossfire it was 8x/8x). The Asus board used less power. And so on and so on.
What really got me though is that as AMD introduced new processors, Asus pretty much kept updating the BIOS to the literal physical limits of what the board could handle. That M2R32-MVP of mine is currently running with a Phenom 9600 that's overclocked.
The Intel board, as noted earlier, never had a bios update with new processors. Rather, as I found out when researching the board name for this thread, if I wanted to utilize new processors from Intel, I would have to buy a completely new $200+ motherboard... with the same exact physical hardware. I'm sure some accountant somewhere is sitting there, nodding his head, saying Yeah, that's how we make money. I'm sitting here going that's how you *CENSORED* your customers!
***
Now, we do know that AMD intends to keep the ongoing Socket Compatibility that they have right now for future processors. The upcoming Bulldozer will have a new socket, but reportedly will also work on currently shipping Socket AM3 motherboards.
So if you bought an AM3 motherboard today, there's some assurance that just like Socket AM3 processors will work on most Socket AM2+ motherboards, consumer oriented Bulldozer designs will work on most Socket AM3 motherboards.
You don't get that kind of... assurance with Intel. Rather, you get the opposite. Case in point is the recent release of the LGA 1156. Rather than re-using the existing LGA 1366 and maintaining Socket Compatibility across processors, Intel simply released a new socket design. Which is rather rotten if you want to stop and think about it for a minute.
***
So if you were buying now, I'd push you towards a Socket AM3 motherboard with a Socket AM3 processor.
I'm just not sure which motherboard I'd push you toward.
DFI burnt me pretty badly with a couple of their Intel i7 motherboard offerings that refused to run in triple-channel memory mode, so I'm a little hesitant to point out the nice-looking DFI LANParty DK 790FXB-M3H5. Sure, all of the DFI AMD boards I've had have been awesome, so maybe it was just an Intel thing or something. Asus makes a wonderful Quad-Crossfire board... but it's almost $200, which is Intel board territory. MSI has the 790FX-GD70... but it's an MSI... and every single MSI board I've had with an AMD chipset has been complete and utter *CENSORED*.
I'd actually be tempted to live with the ASUS M4A79XTD EVO. Yeah, it's only got dual 8x PCIE slots, so you do lose a bit of bandwidth if you run in Crossfire mode. It's also available for around $110, so it's a bit of a deal for its feature set. -
-
!!!
*pulls out a hatchet and chases Cien around the thread* -
I have one of the D975XBX motherboards: http://www.intel.com/products/motherboard/D975XBX/
Quote: So with this motherboard http://www.motherboards.org/reviews/...ds/1690_2.html
Will it run one of the gtx 285's ? I see they are pci-e 2 ?
As far as I can tell, Intel doesn't actually specify what's different between the D975XBX and D975XBX2. Never mind, found it: the 2 model added support for several processors that really should have been added in a BIOS update to the first motherboard...
D975XBX 1 Processor List
D975XBX 2 Processor List
The only reason you'd buy one is if you wanted to run a really early Core 2 Duo with Crossfired Radeons, since this was one of, if not the, first Intel boards to carry Crossfire support. It does not, however, support SLI the way later and more modern Intel motherboards do.
I'd also have a hard time suggesting this board on its own merits. The BIOS for the original model is rubbish, and since it looks like Intel just re-released the board with an updated BIOS for new processors instead of... you know... actually releasing that BIOS so owners of the original board could use the new processors... I'm somewhat doubtful that the 2 model is any better.
Getting a SATA drive to boot from the system was difficult. And it chewed through more power than it really should have for its feature set.
***
If I misread this and you actually have one rather than just looking at buying one: yes, its PCI-Express slots will support Nvidia cards.
Just not in SLI. -
and you didn't edit Wikipedia to state that Thanksgiving is celebrated in your country?
-
my gut reaction is pipe dream
One of the... things... I worry about is class separation, which is going to be a big deal in Going Rogue. One of the design points of CoH is that no class should play like any other class. If you play a Fire Aura Brute you should have a different play experience than a Fire Aura Scrapper, or a Fire Aura Tank. If you play a Dark Miasma Mastermind, your play experience should be different from that of a Corruptor or a Defender.
Since Going Rogue will allow players of the base Hero and Villain archetypes to start in Praetoria, then choose a side, I think it will be very important that players have different experiences for each archetype. Otherwise there will be a flood of players that choose one particular power-set combination, because it's clearly better than a similar combo on a different archetype. Why play an Elec Armor Brute when an Elec Armor Tank or Scrapper has better survivability? With Going Rogue, just roll a Tank or Scrapper in Praetoria, choose the villain side, and presto, you've got something that fills the aggro-control or damage jobs of an Elec Armor Brute, and you'll live longer in a fight.
IO's blur the line even more right now. Some of the common forum threads now concern softcapping various archetypes against positional defenses. There already are Defenders, Blasters, and Corruptors achieving soft-capped positional ranged defense... so they can fire from range, and generally live at range with no real risk to themselves.
In that respect, I think it's pretty fair to say that a ranged-attack archetype with Scrapper / Brute / Tank level shields is going to be a tad bit... broken.
I rather suspect that future archetypes won't be so much about blurring the lines of archetypes that already exist by just combining two separate and distinct sets. I suspect that future development will be more about making sure each archetype in the game now presents a different play experience, and that future archetypes will follow the current Epic Archetype design of multiple sets with trade-offs in each set. -
*peeks into the channel, and sees puppeh in a turkey outfit*
Sometimes I feel sowwy for canine kind... other times, I just laugh my rump off. -
*head tilts*
Quote: Right now I would estimate that the current mac implementation runs at 40%-70% GPU efficiency of an identical windows platform. With Ultra you'll probably see 70%-85% GPU efficiency. (The first numbers are seat of the pants numbers by taking my 8800GTX off my windows box and plunking the same card into my hackintosh - identical hardware otherwise).
Um... the Cedega overhead on Nvidia cards is closer to offering 85%-95% of the same performance on the same card at the same detail levels with CoH, and I'm basing that on a Geforce 6200, Geforce 6600 GT, Geforce 6800, Geforce 7900 GT KO, Geforce 9500 GT, and Geforce GTS 250. I'm sort of curious as to why you'd only be getting 40% of that same performance on Cider. Although given my propensity for knocking Nvidia on the quality, or lack thereof, of their drivers, that could be why.
mewit. I should have been checking the combat log more closely. I always presumed that extra damage was from the ghost axe's undead damage.
Quote: That's because they're weak to lethal. Guide to Enemy Resistances by Culex. (requires Excel or the ability to open Excel documents)
You weren't getting the bonus damage, you were just getting extra damage because they have negative lethal resistance. (lethal damage does 1.3x as much) Dr. Vahz has a weakness to lethal as well.
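For anyone who wants the arithmetic behind that 1.3x figure, it falls straight out of the usual damage-after-resistance formula; the -30% value here is just the number implied by the 1.3x multiplier, not something I've pulled out of the game data myself:

```c
#include <stdio.h>

int main(void) {
    /* Damage after resistance: final = base * (1 - resistance).     */
    /* A negative resistance therefore *increases* the damage taken. */
    const double base_damage = 100.0;
    const double lethal_res  = -0.30;  /* -30%, i.e. a weakness to lethal */

    printf("Damage taken: %.0f\n", base_damage * (1.0 - lethal_res));  /* 130 */
    return 0;
}
```
-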
This actually has been answered as well, although not directly.
Quote: But it hasn't been answered. We don't know if the limiting factor is the pixel-pushing ability of the card (in which case the playability depends on your tolerance for low framerates), or if it's the presence of specific hardware features (in which case a sub-par graphics card can't do Ultra Mode at any framerate).
CoH uses OpenGL as its rendering API, which is why the graphics in Going Rogue aren't tied to NT6 / DirectX 11. We also know that neither Nvidia nor ATi has had different feature sets in differently named graphics cards for years. The last time Nvidia had different feature sets in graphics cards that were branded the same was back during the Geforce 4 years: the Geforce 4 Ti series were DirectX 8 cards, while the Geforce 4 MX cards were simply overclocked Geforce 2 MX's with DX 7 support.
There are a couple of oddities in AMD's mobile lineup... for example the Mobile Radeon 2100 and X2300... which were DirectX 9.0c cards. They weren't branded with the RadeonHD tag that signifies DirectX 10 support.
Anyways, since the RadeonHD 4850 was named by Ghost Falcon, and the 9800 GT was named by Positron, we can compare the technical specifications of these cards. For the purposes of OpenGL, most of the features of each card should be exposed to a developer through OpenGL 3.0, although it's possible that the developers are building against OpenGL 2.0 ES.
Since the feature set in the cards listed by the developers is pretty much identical to lower end cards, like the Radeon 46xx series, or the Geforce 9500 and 9600 cards, we can pretty much say that the limiting factor is performance. Not features.
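If you're curious what your own card's driver actually exposes, here's a minimal sketch that just dumps the vendor, renderer, and version strings; it assumes a system with GLUT installed, and it's obviously not how CoH itself probes hardware:

```c
#include <stdio.h>
#include <GL/glut.h>

/* A window (and therefore a GL context) must exist before glGetString()
   returns anything useful.                                              */
int main(int argc, char **argv) {
    glutInit(&argc, argv);
    glutCreateWindow("gl-info");

    printf("Vendor:   %s\n", (const char *)glGetString(GL_VENDOR));
    printf("Renderer: %s\n", (const char *)glGetString(GL_RENDERER));
    printf("Version:  %s\n", (const char *)glGetString(GL_VERSION));
    return 0;
}
```
If the version string reads 3.x, the OpenGL 3.0 feature level is there; a card stuck at 2.1 may still be fine if the engine only builds against 2.0-level features.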
***
edit: also, Positron uses the terms "best bet" and "recommend,"
which also reinforces the point that it's about performance more than features.
If the developers are using OpenGL 3.0, all of the RadeonHD series and all of the Geforce 8x00 series onwards should be able to drive Going Rogue's graphics, although maybe not at 30 fps. Crossfired 3870's, for example, should be able to match a single 4850 since they can do so in most other shipping games.
If the developers are using OpenGL 2.0 ES, theoretically, you could go back to the Radeon 9700 series or the Geforce 6x00 and still have all of Going Rogue's visuals... I... somewhat doubt... that OpenGL 2.0 ES is the break point though.
**
double edit: doh. I forgot that ATi also did the renaming / rebranding thing with the Radeon 8500 series. They kept re-releasing the 8500 as the Radeon 9000, then the 9200, and used the Radeon 8500 in the integrated chipset, the Radeon 9100. Again though, that was also years ago. -
It's... pretty much already been answered. Hardware is hardware. If the RadeonHD 4850 is the hardware performance starting point... that's the hardware performance starting point, regardless of what OS you are using.
Quote: Can somebody speak to what this means for Mac users?
I've got a 2007 iMac with an ATI RadeonHD2600, which according to the handy chart posted earlier, is several tiers below the NVidia 9800 GT.
If I want to stick with an iMac, I would NEED to go with the high end 27", as its ATI Radeon HD 4850 is the only one that's better than the NVidia 9800 GT.
I'm sure Mac Pro, Mac Mini, and MacBook users also will at some point want this question addressed.
Thanks for the other info, though. -
That'll pretty much only happen if the game switches to OpenCL for physics. Nvidia's behavior with PhysX leads me to doubt that they'll... actively... develop acceleration on CPU's.
Quote: As I mentioned earlier, I believe it will use one core for main processing and a second for physics (if you don't have a dedicated card), but I do wonder if Ultra-mode will allow it to do something with the other 3/4 of my i7.
Quote:Doesn't COH already do that?
One of the big problems with game development coming off of the Xbox, Gamecube, and PS2 systems onto the Xbox 360, PS3, and Wii consoles is that the current crop of consoles is built to be multi-threaded, with multiple cores. If you want to get technical, the Gamecube was actually ahead of that curve with its split 32/32-bit - 64-bit processor pipeline. The change-over produced ill feelings in a development industry trained on single-threaded production techniques, which suddenly had to develop techniques and code that could utilize a larger number of instruction threads rather than just a faster instruction thread.
The basic problem is that software written to perform well with multiple instruction threads generally doesn't work too well on single-threaded / single-core processors. This is one of the reasons why Microsoft Windows systems do so poorly in server-related tasks: since Microsoft's main money-making market is the desktop, they've never really been able to optimize the NT kernel for multi-threaded environments. You'll actually find that Microsoft tends to maintain a separate kernel, the HPC kernel, for tasks that require multi-threading. One of the big deals about Windows 7 is that the kernel's ability to handle additional processing cores and threads has been improved for consumers, which theoretically means that those with triple, quad, hex, or oct cores / processors will see larger performance gains in basic tasks.
If you are actually interested in the coding side of making SMP and single-thread instructions work together, and how to optimize for each, you can check out the Linux Kernel Mailing List and look up the work that went into making SMP-enabled Linux kernels work properly on single-thread systems.
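If you want a picture of what that main-thread-plus-worker split looks like in code, here's a toy sketch using pthreads; it's purely an illustration of the "one core for the main loop, one for physics" idea from the quote above, not how CoH's engine is actually structured:

```c
#include <pthread.h>
#include <stdio.h>

/* Toy "physics" job handed off to a second thread while the main
   thread keeps doing its own work.                                */
static void *physics_step(void *arg) {
    int steps = *(int *)arg;
    for (int i = 0; i < steps; ++i) {
        /* ... integrate positions, resolve collisions, etc. ... */
    }
    printf("physics: finished %d steps on a worker thread\n", steps);
    return NULL;
}

int main(void) {
    int steps = 1000;
    pthread_t worker;

    pthread_create(&worker, NULL, physics_step, &steps);
    for (int i = 0; i < steps; ++i) {
        /* ... main thread: game logic, draw calls, etc. ... */
    }
    pthread_join(worker, NULL);  /* on a single-core CPU these just time-slice */
    printf("main: render loop done\n");
    return 0;
}
```
On a dual-core machine the two loops genuinely run in parallel; on a single core they just take turns, which is the overhead problem described above.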
If you aren't interested and just want to know how this relates to CoH, keep reading.
One of the points of development we can take away from HeroCon is that the developers are interested in maintaining the existing playerbase and system specifications, which are largely based on processors around the 800 MHz mark from the Pentium III and original Athlon line-ups. One of the theorized reasons is that NCSoft wants the game to be viable on netbook-type systems, where something like an Intel Atom has around the same instructions per clock (IPC) as the years-old PIII and Athlon designs. Looking ahead, the netbook / notebook market is going to do nothing but grow, and there's the huge problem for Microsoft that Google's getting into the act...
No, I'm not saying that we'll see a downloadable app for ChromeOS on launch that installs CoH to an SD card or USB memory stick and runs through Transgaming Cedega... although Transgaming is working with both Google and Paragon Studios in different capacities. As a future option to keep the customer base growing, it's something that NCSoft, as a publisher, and Paragon Studios, as a developer, would have a hard time not thinking about or considering.
Since the developers seem to be interested in maintaining that older performance profile, there's only so much that can be done with the base engine to harness multiple cores or processing threads. Thus, while CoH can take some advantage of multiple cores, short of an update that enables a new underlying CPU engine, it's never really going to be able to take advantage of quad-cores, hex-cores, or oct-cores.