Yes.
This anti-Nvidia craze is VERY fishy.
I'm reposting a very good analysis from Ars Technica here; English speakers will appreciate it. Everyone else can go learn English.
Here we go! In substance, it says you have to look past the hype of the day (“Nvidia sucks”)
and see that ATI is throwing cash around like crazy to get game developers to say “Nvidia are the bad guys, they cheat, and on top of that their cards are slow in my game.” HL2 is a very good example. See the text... Of course NVIDIA has its problems, but the proportions this is taking on are complete nonsense... as usual in this kind of religious debate... And I'm not pro-Nvidia or pro-ATI.
For once, I couldn't care less.
Last week Hannibal wrote up this bit analyzing what can only be called a brewing storm in the graphics world: Valve called out NVIDIA’s GeForce FX’s performance and quality. Never mind the fact that the comments were made at an ATI-sponsored event or the fact that Valve and ATI have entered into a lucrative bundling deal–NVIDIA is the new evil and scorned Alexis, ostensibly guilty of cheating on benchmarks. And we need not concern ourselves with the fact that ATI admitted to a certain kind of cheating themselves, albeit with rather smooth language. NVIDIA’s own response to Valve’s charges seemed to fall largely on deaf ears: the community has for the most part decided that NVIDIA sucks, and you can’t trust what they say.
Yet NVIDIA raised some interesting points; appealing to a number of factors, they defended their performance while questioning two things: 1) why did Valve not use the most recent drivers, and more importantly 2) why did Valve, with whom NVIDIA has a working relationship on this very game, not say a word to NVIDIA about these problems, instead waiting to release them in a sensational fashion? For the most part, these questions are going unanalyzed because some think it political suicide to give NVIDIA the time of day right now. Perhaps it’s a side effect of the geek lust for all things binary, but I’m wary of a situation where reports depict NVIDIA as the Debil, ATI as God, and Gabe Newell of Valve as the Revelator. Lest we forget… Half-Life 2 is not on the shelf right now, and if NVIDIA is to be believed, they’ve not had a full chance to look at the situation with Half-Life 2 (or with pixel shaders). The situation is remarkable: people are burning effigies, boiling pots of oil, and sharpening their knives over the speculation of how the GeForce FX will perform on an unreleased game. And better yet: most gamers are too broke to afford a Radeon 9800 Pro or a GeForce FX 5900 Ultra, anyway (but if you’re in the market, go with ATI).
The tone above is one of frustration. There’s a lot of sensationalism swelling around these events, and it doesn’t seem as though many people want to think about the bigger questions. It’s far easier to point fingers and laugh, or to spin stories until there’s just no interest anymore. Here’s what I’d like to hear more about. Just who is controlling the DirectX development path, and just who is saying “sure, we can call that a DX 9 part, even if it’s not”? Where is the certification process for DX 9? How did NVIDIA release a line of DX 9 parts that actually can’t run some games well at anything but DX8? Hannibal already linked this up, but I think more can be said about this technical article at 3DCenter. In particular, this is the quote I found most interesting:
A pixel shader 2.0 written according to the recommendations from ATi or created with the current DX9-SDK HLSL compiler unfortunately highlights all those problems [with NVIDIA’s architecture]. Such shaders use many temp registers and have texture instructions packed in a block. A shader of version 1.4 uses the same principles. nVidia will have to tackle these problems, the earlier the better. Microsoft is working on an SDK update which will include a new version of the compiler that can produce more “NV30-friendly” code with fewer temp registers and paired texture ops. But if developers use PS 1.4 or generate only one PS 2.0 shader, this compiler won’t help nVidia much. They will have to solve the problem in the driver.
Pixel Shader 1.4 and 2.0 performance isn’t great because NVIDIA’s hardware isn’t designed to handle it optimally. ATI’s is. So it is Microsoft who plans on releasing an updated compiler to help NVIDIA out of this mess, since as it stands now, Microsoft’s DX compiler produces code that works great for ATI, and not so great for NVIDIA (it works great for ATI because they stayed close to spec, as far as I can tell). This is actually old news: Carmack has already said that pixel shader performance on NVIDIA’s new GPUs will be lackluster unless you code directly for it (which curiously he is, but Valve isn’t…I guess they gave up). Coding for specific hardware isn’t anything new. In fact, it’s pretty much the legacy of 3D accelerators. The difference now is that DX 9 is very mature, very complicated, and perhaps too open-ended. It’s also not a hardware spec. There’s more than one way to get anything done according to the DX specifications.
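To make the “more than one way under the DX spec” point concrete, here is a rough C++/Direct3D 9 sketch of the kind of per-adapter codepath selection an engine of this era ends up doing. It is purely illustrative: the ShaderPath enum, ChooseShaderPath(), and the choice to give NV30-class parts their own PS 2.0 path are assumptions of mine, not Valve’s actual code.

```cpp
// Rough sketch only: query the D3D9 caps and the adapter's PCI vendor ID, then pick a
// shader codepath. Error handling is omitted for brevity.
#include <d3d9.h>

enum ShaderPath { PATH_FIXED_FUNCTION, PATH_PS11, PATH_PS14, PATH_PS20_GENERIC, PATH_PS20_NV };

ShaderPath ChooseShaderPath(IDirect3D9* d3d)
{
    D3DCAPS9 caps = {};
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    D3DADAPTER_IDENTIFIER9 id = {};
    d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id);

    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0)) {
        // Both kinds of hardware report "PS 2.0", but the sweet spots differ: NV30
        // reportedly wants fewer temp registers (and partial precision), so a
        // vendor-specific path may be worth the extra maintenance.
        const bool isNvidia = (id.VendorId == 0x10DE);   // NVIDIA's PCI vendor ID
        return isNvidia ? PATH_PS20_NV : PATH_PS20_GENERIC;
    }
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4))
        return PATH_PS14;            // DX8.1-class fallback
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
        return PATH_PS11;            // DX8-class fallback
    return PATH_FIXED_FUNCTION;      // DX7-class hardware
}
```

The awkward part, and the point being made above, is that both branches of that first check are equally “DX 9” on paper; the spec doesn’t tell you which one will actually run well.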
NVIDIA will continue to update their drivers (they already have, and will continue to do so), but everyone will call it “cheating.” Their silicon can’t handle PS 1.4 and 2.0 in an optimal fashion, so they have to find a solution in the driver to fix the problem. As I see it, they can only “cheat” from now on (and one person’s “cheat” is another person’s “optimization”). And if you’ve been watching the scene for a few years, this should all be eerily familiar. Back when NVIDIA was the King, there would be newsposts and freakin’ articles about beta driver releases! Not that long ago, NVIDIA managed to squeeze more and more performance from its driver updates, and the world cheered.
Calvin: The one thing I’ve come to appreciate about John Carmack more than his coding skills is the level-headedness he displays when it comes to benchmarks and driver problems. He seems to have good working relationships with both ATI and Nvidia, and I haven’t seen him make a strong comment on anything before discussing it with the concerned developers. The thing that worries me, though, isn’t Gabe Newell’s seeming dislike for Nvidia, nor the much-less-than-perfect conditions people have been allowed to run benchmarks in, nor is it some of the cheating/optimizing that ATI and Nvidia have done.
My main concern is that game and console developers are cozying up with hardware developers, sales channels and review sites, and this can only hurt the average gamer. It’s rumored that ATI won a bidding war to be tied into Half-Life 2, just as Nvidia had a tie-in with UT 2003. That would be fine, except that it’s not unreasonable, given the circumstances, to suspect that Nvidia and Valve could do better if only the latest drivers were used. Gamers who are looking to buy a new video card deserve to know which card will serve them best, and having ATI’s money involved clouds the picture.
If it were just Half-Life 2 I wouldn’t be upset, but I’m bothered because it’s a trend I’m seeing. In the last couple of weeks IGN and GameSpot both got “NGage” sections. EBGames got one too, with a big flashy icon so it stands out. Do I think that these three entities all decided that the NGage was so great that gamers had to know about it, all of a sudden? No. I think Nokia’s money is involved, and I don’t think I’ll be trusting an NGage review from those three sources.
Yes, it’s long. Don’t care.