The poor performance is probably due to the fact that a lot of games use proprietary Nvidia tech, which does not play nice with ATI cards.
Allow me to elaborate: PhysX is a good example -- initially made by Ageia, later purchased by Nvidia, and subsequently integrated into Nvidia hardware. AMD wasn't allowed to license the tech, so PhysX could never become a requirement for games, because ATI hardware wouldn't be allowed to run it even though it was more than capable of doing so. Nvidia hoarded the tech for themselves.
Tessellation is another example -- although in this case AMD was actually the first to experiment with it, calling it TruForm. A textbook example can be seen in the original Deus Ex. The tech was never widely adopted until DirectX 11, and ironically, Nvidia hardware is now better at it than ATI, and the gap only gets worse the more tessellation is used. This is at the core of a lot of problems, intentional and unintentional.
Take Crysis 2 as an example -- the upgrade to DirectX 11 was heavily encouraged and supported by Nvidia and made heavy use of tessellation, which proved to be a problem for ATI users because, as I mentioned, ATI hardware has a harder time with tessellation these days. The tech has been overused in games, and a part of me wonders if that wasn't done intentionally to make ATI hardware look bad. To be fair, maybe the patch was only tested on Nvidia hardware, but to me that would be a massive omission.
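To put some rough numbers on why "more tessellation" hurts so much: the amount of geometry the GPU has to generate grows roughly with the square of the tessellation factor. Here's a quick back-of-the-envelope sketch in C++ (the patch count is made up, and real hardware tessellator patterns aren't exactly this simple, but the quadratic scaling is the point):

- Code :
#include <cstdio>

// Simplified model: splitting each edge of a triangular patch into n segments
// yields n*n sub-triangles, so geometry work grows quadratically with the
// tessellation factor. (Real tessellator patterns differ in the details,
// but they scale the same way.)
long long trianglesPerPatch(int tessFactor) {
    return static_cast<long long>(tessFactor) * tessFactor;
}

int main() {
    const int factors[] = {2, 8, 16, 64};   // 64 is the DirectX 11 maximum
    const long long patches = 100000;       // hypothetical patch count for a scene

    for (int f : factors) {
        long long perPatch = trianglesPerPatch(f);
        std::printf("factor %2d -> %4lld tris/patch, %10lld tris total\n",
                    f, perPatch, perPatch * patches);
    }
    return 0;
}

Going from a factor of 16 to the maximum of 64 multiplies the triangle count by 16, which is why forcing a lower cap in the driver (more on that below) recovers so much performance with barely any visible difference.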
And it's not just that -- Nvidia also collaborated with Bethesda on Fallout 4; case in point, godrays. As an Nvidia user you probably haven't been affected much, but I run ATI hardware, and if I don't turn that shite off, my framerate tanks. Again, because of the very heavy use of tessellation.
The main culprit in all of this is Nvidia GameWorks. Basically, Nvidia works with developers to make sure their games run well and look good on Nvidia hardware. The problem is that they push tech that favors Nvidia hardware, making those same games run a lot worse on ATI hardware.
Another example: The Witcher 3. It uses Nvidia's HairWorks, which again relies heavily on tessellation, so ATI hardware struggles with it. Here is a quote from AMD's chief gaming scientist Richard Huddy:
- Quote :
- We've been working with CD Projekt Red from the beginning. We've been giving them detailed feedback all the way through. Around two months before release, or thereabouts, the GameWorks code arrived with HairWorks, and it completely sabotaged our performance as far as we're concerned. We were running well before that... it's wrecked our performance, almost as if it was put in to achieve that goal.
Now, to be fair, I haven't played The Witcher 3 and don't plan to, so I can't speak from first-hand experience about how much of a performance hit the game takes on ATI hardware. BUT, I have seen, and experienced, the effects of those bloody godrays in Fallout 4.
According to AMD, Nvidia isn't willing to share the source code for its proprietary GameWorks libraries. And without that code, AMD finds itself unable to optimize its drivers for Nvidia's tech. But, playing devil's advocate for a second: Nvidia invested a lot of money in HairWorks, so it makes sense that they would want to protect their investment.
But even so, AMD's older GCN 1.1 architecture was never that good at tessellation. The good news is that the tessellation level can be capped in Catalyst Control Center -- but then again, why can't AMD's drivers handle that automatically?
Again, to be fair, Nvidia are usually the first to come up with the tech in the first place. But while AMD's solutions tend to come later down the line, they also tend to be open source, which needless to say is a GOOD thing for the gaming community, because everybody can use them, and those games end up running like a champ on both Nvidia and ATI hardware.
Back to PhysX: AMD partnered with Havok and with the open-source Bullet engine as alternatives for game physics. And instead of HairWorks, AMD has TressFX.
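Since Bullet is fully open source, anyone can pull it into a project. Here's a minimal sketch of what setting up a Bullet dynamics world looks like (the scene itself is made up for illustration; the class names are standard Bullet):

- Code :
#include <btBulletDynamicsCommon.h>
#include <cstdio>

int main() {
    // Standard Bullet setup: collision config, dispatcher, broadphase, solver, world.
    btDefaultCollisionConfiguration config;
    btCollisionDispatcher dispatcher(&config);
    btDbvtBroadphase broadphase;
    btSequentialImpulseConstraintSolver solver;
    btDiscreteDynamicsWorld world(&dispatcher, &broadphase, &solver, &config);
    world.setGravity(btVector3(0, -9.81f, 0));

    // One falling sphere, 1 kg, dropped from 10 units up (made-up test scene).
    btSphereShape sphere(1.0f);
    btVector3 inertia(0, 0, 0);
    sphere.calculateLocalInertia(1.0f, inertia);
    btDefaultMotionState motion(btTransform(btQuaternion::getIdentity(), btVector3(0, 10, 0)));
    btRigidBody body(btRigidBody::btRigidBodyConstructionInfo(1.0f, &motion, &sphere, inertia));
    world.addRigidBody(&body);

    // Simulate one second at 60 Hz and report where the sphere ended up.
    for (int i = 0; i < 60; ++i)
        world.stepSimulation(1.0f / 60.0f);
    std::printf("sphere height after 1s: %.2f\n", body.getCenterOfMassPosition().getY());

    world.removeRigidBody(&body);
    return 0;
}

Nothing vendor-specific in there -- it compiles and runs the same whether the box has an Nvidia or an ATI card in it, which is exactly the point.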
Tomb Raider uses TressFX, and Nvidia cards initially ran into the same kind of problem. But after a quick patch it was fixed, and the game ran well on both Nvidia and ATI hardware. Which, going back to The Witcher 3, is a stark contrast: without Nvidia's code, AMD cannot easily put out a fix of their own.
Nvidia doesn't lose out if a game uses AMD's solutions, but AMD loses out if a game uses Nvidia's. It all boils down to the fact that Nvidia keeps all their tech to themselves, like a spoiled brat, while AMD freely shares theirs with the entire community. I can understand that Nvidia are protecting their intellectual property, but it harms the gaming community, and I am NOT okay with that.
In fact, AMD has announced GPUOpen, which bundles all of AMD's tech into one handy package. And this is a HUGE step, because it could be the catalyst (pun intended) for a fairer, more level playing field that would benefit everyone except Nvidia, who would have to give up the advantage GameWorks gives them.
There are many documented instances of Nvidia refusing to collaborate with AMD -- over Mantle, for example, and over AMD's requests to make GameWorks open source. Fixing this will require the support of game and software developers, and maybe even a helping hand from Intel. Again, it's a problem that one harmonized standard would solve.
Bottom line -- there are a lot of variables to consider, but yes, I feel CrossFire can be a valid option. However, it is a steep investment, and one has to keep in mind it will NOT play nice with games designed around Nvidia tech.
Wow. That was quite a wall of text. But, having been a longtime AMD user, I can provide a counterpoint to your Nvidia experience.