Two 5770 reviews. The xbit review compares the 5770 and the GTX 260; they come in at about the same class. So the question is: do you want ATI or do you want Nvidia?
(*Warning, gratuitous handwaving ahead*)
If a 9800 GT is the "entry point" for GR Ultra Mode, and a GTX 285 will run it maxed out without breaking a sweat, a GTX 260 sits right between them: 9800 GT < 9800 GTX+/GTS 250 < GTX 260 < GTX 275 < GTX 285. So either a GTX 260 or a 5770 should be able to run Ultra Mode to some reasonable degree.
(*End handwaving; drive safely*)
Since ATI is now the official graphics partner for Paragon (instead of Nvidia), I would expect CoX GR to run better on an ATI board, or at least not have any difficulties. However, the game on the Live servers does have "graphical oddities" with ATI at the present moment. If you want prettiness *right now*, the GTX 260 is your board. If you can put up with no water effects, etc., for a couple of months until GR, go with the 5770.
Your problem had better not be power, though. If you can't feed a 9800 GT, you're not going to be able to run a GTX 260. The 5770 draws less power than the 260, but it's still at least as much draw as a 9800 GT. If your system runs hot enough that it's conking out a 9800 GT, those two cards are going to be too hot as well: tie down your cabling, clear out your air paths, and evict all dust bunnies! -
Quote:So, is it worth me buying a new PC with a GTX 295 or waiting for the 300s? I'd really like one sooner rather than later, but if I can get something better for a similar (or lower) price when they come out, I may just wait.

That depends on how much money you have and how much you value your sanity and hearing. In operation, reviewers describe the sound coming from dual-chip boards as "uncomfortable". If you've got the money to watercool the board or are someone who never plays without headphones on, that could be different.
Thing is, though, ATI is now the official graphics partner for Paragon Studios, rather than Nvidia. The only hesitation I have in recommending someone go with an ATI board is that we don't yet have confirmation that the regular graphics engine (the standard one, not Ultra Mode) will have its ATI bugs fixed with GR.
If those remain in the regular version, there's the possibility of someone "upgrading" to an ATI board on a budget, landing just below what they personally consider playable for Ultra Mode, and being stuck with ATI graphical oddities in regular mode.
If you have the money to spend on a GTX 295, you're shopping for something in the class well above that grey area. I'd go with a Radeon HD 5850 or 5870, depending on your preference.
As to waiting, the most reliable estimates I've seen put Fermi/G300 hitting actual market shelves in April/May of 2010. Assuming Fermi turns out to be a competent offering, ATI boards might drop a bit in price at that point; I don't see a reason for them to do so any earlier than that. -
-
Quote:Buying Nvidia for PhysX support won't do you a lot of good. You'll actually need a PhysX PPU card, and they are no longer sold new. They are not that expensive used though.

I don't know that a PPU would actually gain you anything either; at least not without some fiddling. When Nvidia disabled their drivers to prevent people from using an Nvidia card for dedicated PhysX while an ATI card is being used as the main GPU, they also disabled legacy Ageia PCI PPU support. The inference being: "buy one of our new cards, fanboy!"
Have I mentioned I think this is a counterproductive business strategy they are following? -
Quote:Yes, two 9800 GTX's in SLI are pretty powerful, but you are dependent on software support of multi-gpu rendering. It'd be a cheaper way to gain more performance while you wait for Nvidia to go bankrupt or get bought out. (and no, that's actually not a joke, I think Human Being addressed the financial problem Nvidia has trying to move GTX chips right now... or somebody did in the thread)

Yes, that was me. I'm not quite ready to order a funerary floral arrangement for Nvidia, but they are in a very nasty position.
Financially, Nvidia actually isn't in too bad a shape. According to the most recent numbers I saw, the company had a few hundred million in debt and a billion or two cash-on-hand. So Nvidia isn't in danger of immediate monetary collapse. Its problem is that the outlook for any new sources of revenue is increasingly bleak.
Ironically (and I love ironies), ATI and Nvidia essentially have mirror-image predicaments. ATI is carrying about $5 billion (with a "B") of debt. They are very deeply in the hole. However, they have a winning solution in the graphics card market right now, and AMD recently settled all their lawsuits with Intel for a $1.25 billion payout (and a promise to "play nice" from now on). The same agreement also allows AMD to sell its 30% stake in the Global Foundries chip-manufacturing business. Liabilities from Global Foundries caused AMD to take a loss rather than turn a profit last year. So they should shed the most draining portion of their business while gaining several billion dollars in cash. That cash on hand matters less for paying down their debt than for giving them their own funding (without searching for more credit) to research and develop new microchips. The R&D plus manufacturing work-up for a new chip can easily cost several billion dollars.
And that multi-billion dollar price tag is what may ruin Nvidia.
As I said earlier, Nvidia has basically had two core products over the last few years: the G92 and the G200. Their other recent chips are derived from the architectures and research of those two.
The G92 was quite successful when it debuted, and Nvidia decided to elaborate the design into something larger during development of G200. The thinking was that ATI would fill its "traditional role" and produce something that was decent-and-low-priced-but-never-high-performance. If Nvidia built something impressively large, they would certainly dominate the high-end market.
Unfortunately, ATI surprised and outflanked them badly with their 4xxx series of cards late last year. To begin with, the 4xxxs were much more powerful than Nvidia had expected. Compounding that strength, ATI had jumped to a 55 nm manufacturing process for the chips, versus Nvidia's 65 nm; the chips were smaller and cheaper per batch of silicon. Furthermore, ATI opted for "more expensive" and twice-as-fast GDDR5 memory instead of Nvidia's GDDR3...but with half the memory controllers Nvidia was using. The price of the memory controllers turned out to be the greater factor, and that made ATI's boards cheaper to construct. With half the memory components to power and a smaller chip, the boards needed fewer power-regulation components as well. The final result was products that could equal or outpace the equivalent Nvidia board at a 20-50% lower price.
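To put rough, from-memory numbers on that memory trade-off (illustrative figures only, not exact board specs): peak memory bandwidth is roughly the bus width times the effective per-pin data rate, so half the controllers at about twice the rate comes out nearly even.

$$\text{Bandwidth} \approx \frac{\text{bus width (bits)} \times \text{data rate (Gbit/s per pin)}}{8}, \qquad \frac{256 \times 3.6}{8} \approx 115\ \text{GB/s (GDDR5, half the controllers)} \quad \text{vs.} \quad \frac{512 \times 2.2}{8} \approx 141\ \text{GB/s (GDDR3, twice the controllers)}$$

So ATI gave up a little peak bandwidth, but saved a lot of board complexity and cost doing it.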
Nvidia scrambled to move to a 55 nm process (GTX 280 -> GTX 285, etc.), but they were still stuck with a larger chip than ATI was working with. Physical production costs per unit were simply higher for Nvidia, so their potential profit per board at each performance price-point was lower. ATI took full advantage of this and dragged prices down to the point where Nvidia essentially had to "pay people" to take their cards; selling them at a loss.
Buying market share that way is okay for a while, so long as you can transition to a new product. That new product was supposed to be the G300 chip (aka "Fermi"). But Fermi was supposed to arrive this Fall; it is now delayed until at least late spring/early summer. A year and a half of selling products for less than they cost coming out of the factory is *not* okay. So rather than continue to throw good money after bad with the commercially failed G200 line, they are shutting it down and leaving the high-end graphics market to ATI for now.
So what else does Nvidia have to survive on? je_saist already went over the problems with Nvidia's future as an integrated/mobile graphics chip provider. I'd add that laptop manufacturers may actually be *looking* for an opportunity to wash their hands of Nvidia due to a series of faulty chips Nvidia had been selling them up to August of last year.
The "new" thing that Nvidia is trying to do is make money from the academic "Compute" market. If you have some sort of massive computational task that requires a supercomputer, it's more efficient to have a large number of small, specialized processors rather than a few generalized ones. Well, that's exactly what a modern GPU is. Following that logic, Nvidia recently started shopping their graphics cards around to universities and research institutes. Nvidia sees this as a huge and untapped market of applications. However, the idea is just beginning to circulate, and these are not the kinds of customers who make snap buying decisions or decide to be "early adopters". Compute application sales were a tiny fraction of Nvidia's revenues last year.
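For anyone curious what "Compute" code actually looks like, here's a minimal CUDA sketch (my own illustration, not anything Nvidia ships to those customers): instead of one big core looping over a million numbers, you launch a million tiny threads and each one handles a single element.

```cpp
#include <cstdio>
#include <cuda_runtime.h>

// Each thread adds exactly one pair of elements -- the "lots of small,
// specialized processors" idea in its simplest form.
__global__ void vectorAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                     // one million elements
    const size_t bytes = n * sizeof(float);

    float *a, *b, *c;
    cudaMallocManaged(&a, bytes);              // unified memory keeps the sketch short
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vectorAdd<<<blocks, threads>>>(a, b, c, n); // one thread per element
    cudaDeviceSynchronize();

    printf("c[0] = %f\n", c[0]);               // expect 3.0

    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

Scale that pattern up to simulating proteins or crunching telescope data and you have the pitch Nvidia is making to those institutions.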
Nvidia does make the graphics core of the Sony PS3, but ATI turns out to supply the graphics in the Xbox 360.
Nvidia basically has no product it can rely on for a solid revenue stream in the coming year. The G92 and its derivatives have been a tremendous workhorse, but they're really showing their age now. The G200 essentially resulted in Nvidia flushing their billion-dollar research costs down the toilet; it's a dead end. Nvidia needs a new flagship product to maintain their business. And that brings us back to the G300/Fermi.
In response to the shockingly effective ATI 4xxxs, Nvidia decided to bring as much firepower as they could to bear on the next generational competition with its rival. Furthermore, they couldn't get caught with an older manufacturing process than ATI either. They also decided that they needed to incorporate capabilities for robust double precision calculation and memory error correction to accommodate academic Compute customers.
The G200 was massive. The G300 is monstrous. Fermi consists of 3 billion (yes, with a "B") transistors in a die that is larger than 500 mm^2. The ATI "Cypress" chip in the 5xxxs consists of only 2 billion transistors in a 334 mm^2 die. Remember what I said about the ATI 4xxxs costing less to produce than the G200s? Nvidia is once again in the exact....same....situation. Nvidia needs to unveil a G300 chip that turns out to eat Global Warming and poop unicorns. The only way Fermi will be able to compete with Cypress and recoup its development costs is if it completely blows the ATI chip out of the water in performance.
And that's where Taiwan Semiconductor Manufacturing Company's bungling becomes important... The new 40 nm manufacturing process that both Nvidia and ATI are using from TSMC has turned out to be more problematic than advertised. ATI was getting very low yields of functional chips per batch of silicon. Fermi, being half-again more complex than Cypress, is even more vulnerable to random fatal defects. Reports were that the original test production runs resulted in 1.7% functional chips. Those that functioned turned out to do so at much lower clock rates than expected. If you have a chip with massive capabilities that works at a relatively low frequency, it may not outperform a less elaborate chip that can run faster...
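To see why die size hurts yield so disproportionately, here's a toy version of the standard Poisson yield model (the defect density here is made up for illustration; TSMC's real numbers aren't public):

$$Y \approx e^{-A \cdot D}, \qquad \text{at } D = 0.005\ \text{defects/mm}^2: \quad Y_{334\ \text{mm}^2} \approx e^{-1.67} \approx 19\%, \qquad Y_{530\ \text{mm}^2} \approx e^{-2.65} \approx 7\%$$

The bigger chip doesn't just yield a little worse; the gap widens exponentially with area, and every dead die still costs the same silicon.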
Nvidia is in better shape than ATI to take a(nother) financial hit. The question is what, if anything, the company will be able to do *afterwards*?
So yeah, Nvidia? Deep trouble. -
A GTS 240, depending on manufacturer's modifications, might work around the 9800 GT "entry" mark. However, the GTS 240 (and 220 and 210/310) does not support SLI. Unlike the 9x00s, there is no way to combine two of them for potentially greater performance.
-
Quote:Ok here's my question. I have a 9800 GTX right now on a quad core CPU (2.8 GHz) with 8 gigs of RAM. Would just getting a 2nd 9800 GTX for SLI be a better choice than trying to spend a lot on a 275?

"Maybe." It depends on a lot of things we don't have information on. Under ideal circumstances, an SLI-ed pair of 9800 GTXs should actually outstrip a single GTX 275. The operative phrase there is "ideal circumstances". Not every game supports multi-GPU configurations (CoX currently doesn't). Those that do support multiple GPUs don't necessarily scale 1:1; you might get 1.3x the performance of a single card rather than 2x. They also don't necessarily react to Nvidia's SLI and ATI's Crossfire formats the same way.
Assuming Ultra Mode responds to SLI with 1:1 scaling, there are other factors to consider. I presume that you have an open PCIe 2.0 x16 slot on your motherboard to put the second card in, but does the board actually support SLI? The last generation of motherboards tended to license either Nvidia SLI or ATI Crossfire, but not both. Also, do you have the requisite additional 6-pin power connectors on your power supply to hook it up? If you've got the connectors, is your power supply rated high enough to support both cards, your quad core, and whatever else is in your system?
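As a back-of-the-envelope example of that last question (these are ballpark TDP figures, not measurements from any particular system):

$$2 \times 140\ \text{W (two 9800 GTX-class cards)} + 95\ \text{W (quad core)} + \sim 100\ \text{W (board, RAM, drives, fans)} \approx 475\ \text{W}$$

which is why SLI builds in this class usually get pointed at a good-quality supply comfortably above that figure, with enough 6-pin PCIe plugs for both cards, rather than whatever came bundled with the case.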
If the answer to all of that is "yes", then there still remains the problem of getting a "matching" card. I'm not sure how precisely you are naming the board, but you can't find "9800 GTX"s anymore. You can find 9800 GTX+s and GTS 250s. The 9800 GTX, 9800 GTX+, and GTS 250 are all using the same G92 chip, which originally debuted in late-model 8800s, with a progressively more impressive sounding label. However, each one is clocked a >little< bit higher than its predecessor. In order to match the new card to the older one, you might have to underclock it a bit (which you can do with EVGA's Precision program).
Potentially, this could be a cheap(er) upgrade. It's not necessarily trivial though. -
Quote:I'm sort of tossed on XFX. My first couple of cards with them were the AGP Geforce 6600 GT's I had... which had bloody awful heatsink designs. However, it seems that Nvidia's 6600 AGP design was something vendors couldn't deviate from, so I really couldn't blame XFX for the cards.

Oh, those things were wretched. >_< It wasn't just XFX. I had a PCIe 1.0 6600, and it resulted in my very, very first-est aftermarket component modification: I bought a Zalman VF700 cooler to make the hurting stop!
I've never trusted those little single-height fan-sink designs on any other card since then. Case In Point. -
Quote:Just out of interest, any opinions on the Geforce 295? I'm looking to build a new hi-spec system (yay tax deductions!) and have seen that going as part of certain deals.

Wait, I just reread that... Okay, I expect you meant tax refunds, not that you were able to write off a gaming PC as a tax deduction, but that makes me think of one of my favorite Penny Arcade strips. (Warning: some language)
You can click the "News" link to read the story behind the strip. -
Quote:Just out of interest, any opinions on the Geforce 295? I'm looking to build a new hi-spec system (yay tax deductions!) and have seen that going as part of certain deals.

A GTX 295 is two GTX 275-class chips stuck on the same board and slightly under-clocked in a desperate attempt to stay below 300 Watts TDP. Dual-chip boards are how the two companies compete for the "fastest single graphics card in the world" title. Despite lackluster performance in the rest of the GTX 2xx line this year, the GTX 295 barely managed to hold onto that title against the ATI 4870 X2 (two 4870s built into the same package). However, it lost that standing just last week to the ATI 5970 (two 5870s on the same board, underclocked to 5850 speeds in a desperate attempt to stay under 300 Watts TDP =P).
These cards are the "best discrete cards in the world". They also are huge, run hot, and tend to make noise like a Basset Hound being sucked into a vacuum cleaner. (Of course, if you've got the money to buy one or more double-chip cards, you probably have the money to drop on a hefty water-cooling system).
I would note that I think there have been three revisions of the GTX 295, one of them being two distinct circuit boards with one chip each inside the cowling instead of two chips on one board. I'm not sure what the dissatisfaction with them was, but apparently not all GTX 295s are built the same. -
-
Quote:Firstly, if I do end up going for the ATI cards, it will be my first ATI card. Is there anything I would need to know about differences between nVidia and ATI?

They're different companies with different architectures, and each will behave a little differently in any given situation. There are two significant things to note, however.
Firstly, in the current implementation of the game, ATI cards have a number of "graphical glitches" in CoX. (See Bill Z's aforementioned sticky thread in the Technical Issues and Bugs section.) We've had no official word on whether these are going to be remedied in Going Rogue. It is likely that Ultra Mode will not see these problems with ATI, since they were demoing it on ATI boards and ATI is the official graphics partner. However, we can't know for certain whether these oddities will go away entirely when Going Rogue is live, or whether they will only disappear in Ultra Mode but remain in the standard version of the game.
Heck, for all we know, Nvidia cards might end up sporting "graphical oddities" in Ultra Mode =P.
For now though, be aware that there are quirks with using ATI in the current game.
Secondly, ATI does not support hardware-accelerated PhysX.
Wait, what does that mean? For CoX, pretty much nothing.
Ageia was purchased by Nvidia a while ago, and a proprietary physics engine got even more proprietary. All Nvidia cards of 8800 or newer vintage (except the "brand new" G210/G310 =P) will run the PhysX calculations on their GPU. If you *don't* have an Nvidia card, the PhysX gets done entirely by the CPU.
In CoX, that's not a huge burden. I've never heard of ATI users complaining of massive framerate drops when flying debris appears on their screen in-game. The PhysX calculations are light enough that the CPU can handle it as implemented; which makes sense considering the minimum hardware specifications Paragon wants to maintain. Given the processor you've suggested, you shouldn't even notice the "lack of hardware PhysX".
Where it might become important though, is in other games. There aren't very many that use PhysX, and even fewer that use them intensively. The only one I can think of, though it's definitely significant, is Batman: Arkham Asylum. The PhysX implementation in the game is very intense when set to "High". You can be using an ATI card in B:AA with PhysX effects turned on, zoom along at 60 frames per second, and as soon as you encounter volumetric smoke or flying sheets of folding paper, the PhysX will crush the frame rate until it passes. Even a high end Nvidia card will see a dip in frame rate under those circumstances.
Some people have taken to using a second Nvidia card in another motherboard slot as a dedicated PhysX processor; there's an option in the Nvidia driver control panel to enable that when two graphics cards are present. They don't have to be the same card either. People have been taking low-end 9600 GTs or 9800 GTs, pairing them with GTX 260s, and seeing a boost in overall performance, since neither the CPU nor the main graphics processor has to deal with the PhysX work. The weaker, older cards are plenty for that specialized task.
Here's the kicker though: you can do the same thing with an ATI card as the main GPU! In fact, I've seen some reviews that say an ATI card outperforms the equivalent Nvidia one in B:AA while using a 9x00 GT as a dedicated PhysX unit. You don't need an SLI or Crossfire motherboard (the two companies' proprietary dual-card standards). You do have to be using either Win XP or Win 7, though; both have the required "Driver Test" mode wherein you can run two graphics drivers at the same time. Vista doesn't have this capability.
A little while ago, Nvidia caught on to this and disabled dedicated PhysX operation in their drivers unless there was an Nvidia card in the other motherboard slot. Amongst the antics of an increasingly erratic company, I find this particularly asinine. Nvidia has essentially abandoned the high-end graphics card market, and all they have to compete on are their lower-class boards. Those same boards are going to become even less competitive in Jan/Feb when the new ATI bargain cards arrive. If people are choosing ATI performance cards for quality, but could still pick up a cheap Nvidia card to run alongside them for specialized purposes, Nvidia might still make some money. But I'm not in marketing, so what do I know =P.
You can still do the different-cards trick if you use older (pre-190.xx I think) drivers. If you are using Win 7, there is also a homemade patch (predictably) floating around that will let you use the newest Nvidia drivers while telling them "No no, that Radeon chip in the other PCIe slot is really a Geforce! Really!"
Off the top of my head, those are the two things you need to know: as currently implemented, CoX has graphics quirks with ATI cards, and a small handful of games (but not CoX) can see drastic performance fluctuations with PhysX enabled in the absence of an Nvidia card.
Quote:Second, what company should I shoot for with the card's board? I'm leaning towards the ASUS board, since I do have some experience with their products, but any information you could give there would be helpful.
I do know that Sapphire is well respected for ATI cards. Sapphire is basically to ATI what EVGA is to Nvidia. Sapphire also has a good line of custom, non-reference-standard versions of the cards they sell, like the Vapor-X line with better cooling/quieter fan/slight overclocking.
Another good company is XFX. XFX is EVGA's closest quality competitor in the Nvidia line, though never quite overtaking them. About a year ago, they looked at Nvidia, went "these guys are nuts!", and started setting up to build ATI cards as well. From looking at Newegg after following your links, I also noticed that XFX had (past tense) both ATI 5850s and 5870s available for a $60 premium today; apparently no other company had them stocked with Newegg for Black Friday. Definitely no flies on these guys.
Quote:And, Lastly, the ATI card has 1GB memory while the nVidia card I selected has 2GB. Will this difference be noticeable in the performance of the card? -
Quote:Ok, I've been thinking about it, and have decided to post the main equipment I plan to purchase for my new system (CPU, GFX, etc.) to get a feel for the performance it will have from someone more experienced in this area. I'm a diagnostician, not a power PC builder.
So with that in mind:
CPU: AMD Phenom II x4
http://www.newegg.com/Product/Produc...82E16819103471
GIGABYTE GA-MA790X-UD4P AM3/AM2+/AM2 AMD 790X ATX AMD Motherboard
http://www.newegg.com/Product/Produc...82E16813128387
Any thoughts? Will this system stack up to GR's Ultra Mode and handle the load? I know that, according to Posi, the card will handle Ultra Mode at full tilt, but I want to make sure before making any purchases that the rest can handle it.

That should be fine.
Quote:GFX Card: EVGA 02G-P3-1185-AR GeForce GTX 285
http://www.newegg.com/Product/Produc...82E16814130486
If the price point is no object, then you could also grab an ATI 5870 for the same money ($400-$420) and get something that flatly outstrips the GTX 285 head to head. Since the 5850 and 5870 use the same chip, the available dies are all going to the 5870 and it's a little easier to actually find a 5870 for sale.
Unless there's something bizarre about the implementation, I would expect the ATI cards to run at their maximum strength in CoX; as Bill Z pointed out to me, the ATI sticker has replaced the Nvidia one at the bottom of the CoH homepage. We don't have empirical data that the 58xxs will perform that well in CoX, but given that the 48xxs are being used as a baseline, that the 58xxs are a scale-up of similar architecture, and that ATI is now the official graphics partner for CoX, it's quite likely to be so. Supposedly, new 58xxs are hurtling towards market as we speak and will appear magically on Dec 15th, just in time to be sold for Christmas. If they don't appear then, they'll certainly show up in January.
So if you want the card now, it would be easier to get a GTX 285. If you wait a bit, you could probably get the same performance for cheaper or better performance for the same price.
With all that said, EVGA is an excellent company. They're kind of the Cadillac of Nvidia board makers. If you decided to go with the Nvidia card, that's likely to be a quality one. -
Quote:If you're looking to build a true low-end budget PC (that will still play games ok), this is a fine suggestion. But given that you can find core i7 920s for $200 (as of 4mo ago), and they run circles around any AMD processor, if you're not looking at a total budget system, I'd get the core i7 920.

Where are you seeing this? i7 920s on Newegg are going for $280.
However, the i5 750 gives most of the performance of the i7 920, costs $200 currently, and will require much less money to give it a motherboard and RAM than an i7 920. That's also a possibility if you're working inside a strict budget. -
Quote:Give it a few months. Prices for things like this drop very rapidly. By the time GR is released, these cards will probably cost significantly less.

Don't count on it!
The price of graphics cards is usually driven perpetually-downward by the back and forth between Nvidia and ATI. Right now though, that situation doesn't apply for either company.
Nvidia is in a very nasty developmental position right now. Their new G300 processor is delayed until at least late spring/early summer of next year; it was supposed to have debuted before the ATI 5xxx cards a few months ago. Early sample reports about their performance are not encouraging either.
Meanwhile, the company is bleeding money on the GTX 2xx series cards introduced around this time last year. The G200 processor is fairly huge, larger than the contemporary ATI processor in the 4xxxs, and is relatively expensive to produce because of that. But the 4xxxs not only equaled or outperformed the GTX 2xxs, they undercut them on total price as well. Nvidia had to reduce prices on the GTX 2xxs to the point where they were basically unprofitable in order to maintain their leading market share. But Nvidia has a sales guarantee with its card producers wherein Nvidia will pay the difference if the producer ends up having to sell their cards for less than Nvidia's reference MSRP. I've seen it described as Nvidia having to wrap each chip in a $20 bill for someone to be willing to take them out of the warehouse.
Now the 5xxxs have hit and Nvidia still doesn't have an equivalent offering. To prevent having to lose even *more* money from a price decrease in response to the flatly superior 5xxxs, Nvidia told its card producers that they were "having production difficulties" and started to taper off distribution of GTX 2xx chips. The GTX 2xxs have been made "artificially scarce" to keep their price up. Then Nvidia announced a few weeks ago that they were End Of Lifing the GTX 260, 275, and 285; what's in the production chain *right now* is all there will ever be of these cards.
So Nvidia has to keep prices of their current cards up while they struggle for a new product, have constructed a situation to make that happen, and are now stopping production of the cards entirely to staunch the bleeding.
So no, the price of Nvidia cards isn't likely to drop between now and next spring.
For ATI's part, they are currently dominating the market with superior products, and Nvidia has no equivalent competing product on the immediate horizon. However, a superior product doesn't help much when there aren't any around to sell. The same company (Taiwan Semiconductor Manuf. Co.) that is fabricating Nvidia's G300 chips is also making ATI's 5xxxs with the same manufacturing process...and it's suffering similar production problems. There are mid-range 57xx boards available, but there are virtually no 58xxs. Indeed, the price of the 5850s has gone UP $50 since they were released in September. There's rumor that supply will get better around Dec 15th, but we'll have to wait and see.
ATI has a free run at the market, with Nvidia sitting on its hands, and they can't take full advantage of it because of supply problems! Add in that ATI is carrying a large amount of structured debt they have to service and they have no reason to reduce prices, every reason to keep them up, and real logistical problems forcing them to keep it that way.
So no, ATI won't be dropping prices on their cards any time soon either. (They will however, be dropping low-range versions of their new cards on the market in the first quarter of next year.)
If you want a cheaper card, I recommend trying to catch a holiday sale of some sort.
As a related note, I'd like to mention that Nvidia has just announced release of a G310 card. Do not buy this thing thinking it is part of a new high-performance line and will be better than a 2xx card! The G310 is a re-named G210 (which was a piece of junk) that Nvidia hopes will sound attractive because "300" is higher than "200". It's the same thing they did with the GTS 250 actually being a renamed 9800 GTX+; still using an older G92 chip rather than one of the newer G200s despite the name. The more I learn about Nvidia's business practices, the less I like the company. -
A few take-aways from the new Winter Event:
- Lord Winter's Northern Lights pets are absolutely Brutal.
I approve!
I like this kind of thinking from the Powers team: a distributed, seemingly minor threat that becomes lethal if ignored. Using the Kheldian-like energy blasts to "play the odds" against highly protected Players made the situation dangerous. GM fights are often threatless with heavily-IOed characters and just involve standing in one place while pressing a series of buttons (or worse, having to chase after this big thing that can't hurt you). The Northern Lights force one to pay attention to the fight. Do you stay on the GM, or try to reduce the population of scattered pets with the clock ticking? If you don't break off to "Follow the Lights", will you be overwhelmed?
Innovative situations like this that require new tactical thinking are a step in the right direction. Keep following that path.
- There was something about making it so you had Heroes and Villains in the same Lord Winter instance...?
Vague requirements are only part of the problem, though. I have no idea where I might go in game to find out how to make that co-op situation happen. Instructions and information on how to make things work (especially when it comes to playing with one's online friends) need to be readily available. And the more complicated the requirements are, the greater the need to have that information accessible to the Players.
- There was story text that sort of carried on the story from previous years.
However, the story text as presented doesn't actually reference those previous years. There is also no nod to the Malleus Mundi, which was presented as being involved in the original Halloween and Winter Events (2004). Nor is there any explanation for why Lord Winter's Frostlings are popping out of presents originally let loose by "The Gamester" (Winter '05). I somehow doubt Lord Winter is the hidden Gamester, so what is The Gamester's involvement with the Frostlings? Or perhaps the Malleus Mundi? Are the town-and-existential-being-stealing Redcaps mixed in somehow?
This is a step up from the original implementation of the Halloween Banner Event (which at first had no text whatsoever). But these are ongoing events in the game world. They return annually and as such involve a cumulative story. "Because it's a holiday Event" is not enough reason for something "eventful" to be going on in the game world. More effort needs to be made to integrate the Annual Holiday Events into the larger CoX backstory (and their own). Even if you don't like an old story thread, you still have to tie it off.
There's quite a bit of criticism about the pace of progression in the ongoing CoX story. The annual Events present a minor, but significant and regular opportunity to create a sense of a "living story". All that is required is that any new features of the events be given a rationale that is coherent with, and acknowledges, the Events from previous years. There's an opportunity here to add a great deal of vitality to the backstory and ongoing-world with very little attention.
I'm sure that new elements of the Holiday Events start out from an Engineering perspective rather than a story one. (i.e. "What can we do that's completely new and fun this year?") Integrating those new game-elements into the pre-existing world is important for making them part of that game though; rather than just "that thingy we get to do once a year".
All in all, a good addition to the seasonal repertoire. -
The rank, certainly. Every army that lasted more than a season has also had "veterans" who carried knowledge and experience with them. But the British instituted the formal, professional role and degree of responsibility within a unit. The Sergeant became an institution of the army itself, rather than of any given unit, with the intended purpose of providing constancy and continuity from unit to unit over time.
The Romans had "Principalis" who were soldiers recognized for becoming highly experienced veterans within a given unit. But they did not pass between units, nor did they deliberately set out to gain that status, nor did the legion commands have a policy of cultivating and maintaining a set number of them with explicit purpose.
The Soviets in modern times, again, had a "Sergeant" rank, but that was only a recognition of seniority among enlisted soldiers. All the authority and responsibilities taken up by the NCO in current, Western-model armies were left to the Lieutenant. Should the Lieutenant be promoted, they left and the unit got someone brand new.
Rather than let experience come and go ad-hoc, the British formalized a role within their armies to create and maintain it universally; granting authority, responsibility, and even specific career-path to do so. -
Quote:*note* Just for a little bit of a sneak peek, the "Colonel" I'm looking at is my own namesake and oldest character, whom I never got around to writing a finished story about.

*Blink* *Blink* Er, does that mean your last name actually does refer to this? I kept telling myself that couldn't be it. -
Quote:That was kind of what prompted me to ask this in the first place, though. In my case, it's a soldier who goes up to around Sergeant rank, but then leaves the military almost entirely and joins an entirely separate branch, essentially abandoning his old post. I understand that such a thing doesn't happen in real life, because the sort of "entirely separate branch" wouldn't exist in a non-sci-fi world (at least I wouldn't think, we don't have genetically remodified super soldiers yet, do we?) and as such is subject to an entirely different hierarchy.

Actually, that sounds like what happens when a regular soldier gets recruited for the US Special Forces. Each branch of the US military has its "own" Special Forces unit(s) (Army Rangers, Marine Corps Force Recon, Navy SEALs, etc.), but even though they're recruited within a single branch, they are all now under a separate, independent command that answers to the Department of Defense (the Pentagon).
If you want a soldier who proved himself as a dynamic fighting man before being recruited into an elite organization and given strong, independent authority, I'd recommend: an enlisted soldier who was sent to Officer Training School after attaining the rank of Corporal, distinguished himself in battle with his men as a Lieutenant, and was promoted to Captain in recognition, where he showed he could handle responsibilities larger than a single unit. He took university courses while a Lieutenant and then a Captain, earning a graduate degree in [insert here]. At which point he was cherry-picked by Special Operations Command for their new Special Biologics unit and given a courtesy promotion to Major with his new responsibilities.
He is now an Officer and a Fighting Man who is outside of the regular chain of command from [insert original branch of the military], who answers to someone high in Special Operations Command, who in turn probably reports directly to the Joint Chiefs of Staff (the highest ranking officers in each branch).
Quote:In fact, and I will go back to Mass Effect again because that actually has a PERFECT representation of what I was talking about, consider him something akin to Mass Effect's Specters. He is an operative who receives his missions directly from central high command, rather than from any commanding officer, and whose missions are given about the highest priority there, such that he could commandeer military supplies, resources, personnel, facilities and whatever else he may require for the completion of this mission. Not QUITE Specter level, in that the Specters were the highest authority AT ALL and answered to absolutely no-one, and the operative I have in mind would still work WITH the armed force, rather than separate from it, but that sort of working relationship is what I had in mind.
"Lensmen" are selected through a multi-year screening and training process that takes in several million recruits from a galaxy-spanning civilization and produces a spare handful of graduates at the end. Lensmen, by their nature, are self-starters, totally focused, incorruptible, and completely devoted to the principles of Civilization. Lensmen are the face and leaders of galactic law enforcement and defense, given authority for independent action and ad-hoc command over any regular military units. -
Quote:My Colonel equivalent, on the other hand, is quite the opposite. He starts out as a common soldier in charge of a particularly proficient squad (which would probably make him a Sergeant at the time), but eventually moves out of active front line combat and into special forces, to specialise in covert operations and sensitive missions. This both gives him significantly high security and confidentiality clearance, as well as the authority to commandeer resources for his missions as he sees fit, because the tasks he is given tend to be more critical than most of the things more regular army units are typically engaged in. I don't think I ever directly gave him the rank of a Colonel, and if I ever gave him a rank, it'd be Special Agent or something to this effect, but his active authority, the way I see it, would be just around that of an acting colonel.

If an enlisted soldier is raised to Officer rank through Officer Training School, it tends to happen fairly early in their career. If a soldier spends a lot of time as a Sergeant, they tend to stay a Sergeant for their career. A Sergeant may eventually be granted the respect of a Colonel (see: Plumley in We Were Soldiers; who, yes, was a real person), but would never, ever have equivalent authority.
-
A little more on the Officer/Non-Commissioned Officer (NCO) difference:
NCOs were an invention of the British army and have largely been adopted by all modern militaries since then. Sergeants are referred to as the "backbone of the [insert force here]". The reason for that is the concept of Institutional Memory. Institutional Memory is a collection of "ways of doing things", traditions, and knowledge of past mistakes, so that earlier errors don't have to be repeated endlessly. Since a Sergeant spends his entire career with the enlisted soldiers and is up close with the end results of any mistakes made by higher ranks, they are the repository for Institutional Memory.
The invention of the NCO is credited to a lot of the effectiveness of Western militaries in modern history.
A Lieutenant's job is to direct a small unit of soldiers towards the accomplishment of a mission. A Sergeant's job is to make sure the unit is *able* to do that mission (that includes making sure the Lieutenant does his job right). The Sergeant spends time thinking about whether individual soldiers are in the right frame of mind, which ones need a little extra training or "encouragement", etc. The Lieutenant has to be aware of these things as well, but the Sergeant is the one who makes them happen. If the Lieutenant spent all his time doing the Sergeant's job, he wouldn't have time to do anything else; if he spent all his time thinking tactically, the day to day running of the unit would suffer.
That's actually what happened to Lieutenants in militaries organized on the old Soviet model of recent history. The Soviets considered regular enlisted men too stupid and uneducated to be trusted with any real responsibility. While they had "Sergeant" ranks, those soldiers were essentially Privates with a bit more seniority. So Soviet-model militaries put a lot more strain on the responsibilities of the Lieutenants and ended up with less effective individual units that also had to keep "re-inventing the wheel" every time they got a new officer.
I've read that the hardest thing for the American trainers in Iraq to do was establish an effective, professional NCO corps in the new Iraqi army.
That also segues into your remaining unanswered question: what's an XO? An XO, or Executive Officer (also First Officer), is the right hand of the commanding officer. It's easiest to illustrate with the navy: a Captain in the navy directs where the ship goes and who it fights when it gets there. The Captain's First Officer, and second in command, makes sure the ship is able to get there, that its various departments are functioning optimally, and that it's ready to fight when it arrives. Again, the Captain must check off that these things have been done, but it's the XO who gets them done.
As a final thing to bring it all around to the beginning, in the navy senior-Sergeants are referred to as "Chiefs". As in the "Master Chief" from Halo.
I highly recommend that you do watch We Were Soldiers, which another poster mentioned. It well illustrates "good" examples of all the ranks up to Colonel and how they operate (plus one "bad" Lt). The Lake Scene from the deleted scenes of that film is also priceless for its depiction of a Sergeant who's had enough of another "bad" Lt. I'll PM you links for both but not include them here; it's as bloody as any other recent war movie and thus NSFW. -
Did you know our own Melissa "War Witch" Bianco had written a novel?
Before one of the Devs dropped it into conversation the night before Herocon last year, neither did I...
War Witch creates the zones we play in and writes the ambient dialogue you hear while traveling around them. Just on the strength of that body of work, I knew I wanted to see what she produced for a book**.
Asking around about the volume, I was told that they would have some the following day at Herocon's "Wentworth's" table. When I got there early the next day, the person manning the table wasn't sure what they were being sold for (and there is no MSRP on the jacket). They finally had to go get War Witch herself and ask how much it was. Consequently, I actually got to have mine signed by the author herself.
So after hearing about it out of the blue and a bit of searching, I had a copy of War Witch's self-published novel: Real Life.
Real Life follows the experiences of one Jessie Sutherland, beginning with her unceremoniously and unwillingly finding herself stuck in the Australian rain forest. Ms Sutherland has little familiarity with anything but city living and feels both sullen and apprehensive about her predicament. Her attempts to adapt are not made any easier by simultaneously dealing with conflicts among her (more acclimated) companions in the wild. She's deprived of all the assumed conveniences and artificial comforts, both physical and emotional, that she has dwelt within all her life.
Ms Bianco's writing well conveys the minor irritations that will continually impinge on your attention out "in the rough"; unexpected, petty indignities like shoes that won't dry out and grit between one's teeth. Yet the prose doesn't get bogged down in a catalogue of description. It's illustrative without slowing everything down. At one point, she fully depicts an encounter with an Australian Big-Eared Horseshoe Bat by sketching the animal with a few adjectives and then communicating the rest through everyone else's reactions to it. Pausing to look up a picture of the creature, I felt that I "recognized" the thing when I saw it, even though I'd never heard of one before. A quick scene got to move rapidly without sacrificing support for why it played out as it did.
Each character is also given a sufficiently distinctive "voice" in their dialogue to point them out as individuals. They have their own ways of speaking and verbal tics. Again, this is from the person who writes our in-zone NPC dialogue, and it shows.
Ms Bianco also peppers the story with a seasoning of quirks and oddities that make the narrative feel more "real". I found myself wondering if she was selecting things from her own experience and placing them in the story for authenticity, or if she was deliberately fabricating them from whole-cloth. For example, do you think she actually has a clock radio at home that will only pick up Spanish language stations? Has she actually been to Aria? (If she has though, how could she not have ordered the venison??) Whether she cherry-picked them from personal experience or created them in-step, their presence makes her characters and their world more memorable and believable.
All of the effort comes together to put you inside Jessie's skin for a sensually intimate (which is not to necessarily say "pleasant") examination of the main character's world and herself; at higher magnification than Jessie ever might have sought.
If I was going to level a literary criticism at the book, I'd say that it takes a little too long for the "stakes" of the plot to be established. 'X' is happening, but if it turns out one way or another, how is it going to affect the main character's life in the long term? Honestly though, I've seen multiply-published authors that have done the same thing and it ultimately doesn't harm the book. In the end, every last little thing that's happened turns out to have mattered a great deal. In a clever parallel, the seemingly understated front cover actually shows you what's to come if you look closely enough. But the reader doesn't see it any more than the characters do, even though it's right in front of you.
I should note for myself that I don't normally read contemporary fiction. I read science fiction, urban horror, and historic non-fiction; but contemporary fiction isn't something I look for. I enjoyed this book. In fact I was quite surprised with how well done it was. At the end of the novel I found myself wanting to read more of it.
In short: the lady can write.
If Ms Bianco ever chose to pen another book, I would buy that one as well.
Real Life was published by Booksurge, which is a printing house associated with Amazon.com. You can obtain the book from either place. There is also a Kindle(TM) edition of the novel if you have one of Amazon's spiffy portable displays. It's further possible that if you go to Herocon next weekend, they may have a few at Wentworth's again this year. And if you ask very, very nicely, you might get the author to sign your copy as well.
Real Life
ISBN: 1-4392-0422-5
ISBN-13: 978-1439204221
(**The reverse is also true. After having seen the descriptions, puzzles, dialogue, and understanding of people in the text, the author is definitely the kind of person you want working on your Game as well.) -
In No Particular Order:
The Beastly Power:
I attempted to take the Beastly reward power twice. Neither time did I receive the buff. This was on Hero side with a Scrapper; lv36, auto-SKed to 49. The buff did not appear in either the icon list beneath the HP bar or Combat Attributes.
New Spawn System At Banners
Mobs now appear in a puff of smoke within visible distance of a banner. Previously, they ran in from outside.
The new Spawn animation is pretty, but I find it tactically far less interesting than the old version. Previously, Mobs would come in large groups and sometimes came through one of several narrow choke points around the banner. You could be constantly moving from spot to spot as you tried to keep them clear. The Mobs also ran all the way in to the banner and were extremely likely to see you if you were in proximity. With the new Spawns, the Mobs will sometimes just stand there and do nothing while they wait for you to come kill them. They show up too far away to see you if you are on the opposite side of the Banner sigil.
This is boring when it happens. If spawning the Mobs at a distance was problematic for some reason, then it would be better to still give them a waypoint at the Banner so they run in as soon as they spawn.
More mechanically, the new spawning method is causing Mobs to appear inside buildings. I had this happen twice in Port Oakes. I was too "busy" to write a proper bug report at the time, but I gave the /loc coordinates for both locations to Ghost Falcon. Mobs spawned inside buildings could still fire through the walls; Players could only respond with AoEs.
For future content, I'd prefer to see a more tactically complex situation.
The Story
Story has been added for where the Banners came from. You can see it by clicking on the tiny "i" that appears in the Nav bar and scrolling all the way down to the bottom. This is not something that springs out at you and I don't know that I ever would have seen it if someone else hadn't told me it was there. This may be the only way it could have been added to an already "in the can" project (with a million other things going on), but for future content this is a "non-optimal" solution. I would not like to see it repeated.
Breaking Banner Immunity
That you have to defeat enemies at all of the banners before the red bar begins to descend is non-intuitive. I also didn't see it in the instruction window when I read through it. I'm afraid this will end up frustrating some players who gather in a large group and turtle together in the same place (standard Invasion tactics). From their perspective, they brought a large amount of firepower and several teams' worth of players to the task, yet still accomplished nothing; obviously it must take even more effort than that. So some of them will conclude "why bother".
The idea of trying to get everyone separated a bit rather than in one big lag-ball of green numbers is a good idea. But if the players don't understand that's what you're trying to do, it could back-fire on you.
Minor Bug In Aspect GM Meter
I was looking at the Nav bar when the final Banner went down in a successful Event. When the range-to-Aspect meter appeared, it showed up for just a second as something like "aspect:distance". It then flashed over to being an actual number. I'm not even sure that's exactly what it said for that second, but it was certainly not a number. That's way too vague to put in a bug report, but I still wanted to mention the observation since it's a "new" bit of tech. -
Quote:The special enemies for the banners all started converging on the GM (again, Avatea pointed that out) and I followed one set straight to the GM, so not that bad a hunt really.

Well, when you have a couple of Taunting characters keeping enemies from running anywhere and several MMs with pets set to Aggressive, all of those "waypointer" mobs tend to get extremely dead long before they point to anything...