NV40 Reviews and Comments (Pre-Launch Discussion)

Well, Newegg doesn't ship outside the US. What I'll do is ask relatives of mine to buy one and send it to me! :)

Metro,

Following Nemesis11's theory, which seems coherent to me, maybe this isn't the card that was originally meant to be launched. The memory speed is much higher, and maybe that's why we're seeing these results.

No offense, but looking at the very link you posted, the NV40's memory clocks have little or no influence on the results... the core clock, on the other hand, makes a big difference.
 
It's not just the memory that should be higher (Samsung is selling GDDR at 600/700/800 and Micron at 833); the core should be higher too. The boards that went out to developers have the core at 475, just like the ones that were at the GDC.
It must have been with that core clock (or higher) that they hit the 14K at 1024 (nVidia's presentation).

I hadn't seen the Hexus review yet. I'm surprised it gains more from raising the core than from raising the memory.
The fill rate went up much more than the bandwidth compared to the NV30/5/8. It ought to be memory-limited (more or less what happened with the GeForce2).
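A rough back-of-envelope makes the point. The pipe counts and clocks below are assumptions based on the commonly quoted specs (16 pipes at 400 MHz core / 550 MHz GDDR3 for the 6800 Ultra, 4 pipes at 475/475 for the 5950 Ultra, both on a 256-bit bus), not measured figures:

```c
/* Back-of-envelope: fill rate vs. memory bandwidth, NV38 vs. NV40.
 * Pipe counts and clocks are assumed from commonly quoted specs. */
#include <stdio.h>

static void card(const char *name, int pipes, double core_mhz,
                 double mem_mhz, int bus_bits)
{
    double fill_gpix = pipes * core_mhz / 1000.0;               /* Gpixels/s  */
    double bw_gbs    = bus_bits / 8.0 * mem_mhz * 2.0 / 1000.0; /* GB/s (DDR) */
    printf("%-14s fill %5.2f Gpix/s  bw %5.1f GB/s  %4.1f bytes/pixel\n",
           name, fill_gpix, bw_gbs, bw_gbs / fill_gpix);
}

int main(void)
{
    card("NV38 (5950U)", 4, 475, 475, 256);   /* assumed specs */
    card("NV40 (6800U)", 16, 400, 550, 256);  /* assumed specs */
    return 0;
}
```

Under those assumptions the NV40 has roughly a third of the bandwidth per pixel of fill that the NV38 had, so you would expect it to lean harder on its memory; that is what makes the Hexus core-vs-memory scaling result surprising.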

What surprises me most is how positively everyone is reacting. We have a card here that draws more power, makes more noise and more heat than the "famous" 5800, and everyone loves it, without having seen the R420's numbers.
Don't get me wrong, I hope the market ends up more balanced.
 
There was a real need, in people's minds, for nVidia to drop a bombshell...
and as it happens, I think they actually did :P at least that's the conclusion I drew from the little I've read...
but the truth is, gains of around 10% would probably have got much the same reaction from people...
these turnarounds are good for the market... :)

We hadn't seen such a big leap in a single card since the 9700 Pro... and how many years ago was that? :P And now it even looks like games are finally starting to keep up with the power of the graphics cards... thanks, perhaps, to DX9 staying stable for a few more years.

As for the memory speed... I think I read somewhere that 600 MHz memory from Samsung was still scarce, which is why the first cards would ship with the 500 MHz parts... but the idea was floated that board makers would do the "upgrade" as soon as the faster memory is available...
As for OC... it does seem low to me (I think I read 40 MHz on the memory)... but you have to remember it's a new technology, it needs to settle :P


Anyway... these launches are always quite a show... they make me want to go buy something I don't need... lol. Which only means the marketing is working, and well! Is it time to buy some nVidia shares? Nahhh... I should have done that a while ago :P
_zZz_
 
Metro said:
Edit: I like this cooler. I'm one of those who don't mind the cooler taking up to three bays. Like I care :D
[attached image: 00084083.jpg]

Oops, I only just saw that card.
That's not a graphics card, it's a brick. I hope it's at least quieter than the 50 dB in the Hexus review.
 
Nemesis11 said:
What surprises me most is how positively everyone is reacting. We have a card here that draws more power, makes more noise and more heat than the "famous" 5800, and everyone loves it, without having seen the R420's numbers.
Don't get me wrong, I hope the market ends up more balanced.

It doesn't draw that much more power after all, and it isn't noisier: it makes exactly as much noise as the FX5950 Ultra, which is to say quite a bit less than the FX5800 Ultra, and it doesn't run hotter.

To give you an idea, my FX5900 Ultra sat at 58°C at idle, with the stock cooler and everything at default... this one sits at 44°C with some overclock and after running 3DMark...
 
50 dB? Quieter than the 5950?

As for heat, with that heatsink I can believe the core readings are lower than on the 5950, but it has no rear "exhaust" like the 5800 did. The inside of the case is bound to get nice and toasty with a 6800 (heat pipes.....)

As for power, I'm still wondering: if it doesn't draw that much, why two Molex connectors on separate rails and a 480 W PSU requirement?
The PSU spec might just be playing it safe, but two Molex connectors make no sense if it draws "so little".

P.S. - The way I see it: if on the 26th the R420 is 10% faster, nobody will remember the NV40; if the NV40 is 10% faster, nobody will remember the 480 W / 50 dB / etc., etc.
 
ATi's R300 family already does all of its video processing through the pixel shaders. Implemented intelligently, video decoding/encoding doesn't "cost" that many transistors.
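For a sense of why video work maps onto pixel shaders so naturally: the bulk of playback is independent per-pixel arithmetic such as colour-space conversion. Here is a minimal sketch in plain C of what each per-pixel "shader invocation" would compute (full-range BT.601 coefficients assumed; a real decoder also handles chroma subsampling, range limits, deblocking, and so on):

```c
/* Minimal sketch: per-pixel YCbCr (full-range BT.601) -> RGB conversion,
 * the kind of independent per-pixel math a pixel shader can run for video. */
#include <stdio.h>

static unsigned char clamp8(double v)
{
    return v < 0 ? 0 : v > 255 ? 255 : (unsigned char)(v + 0.5);
}

static void ycbcr_to_rgb(double y, double cb, double cr, unsigned char rgb[3])
{
    cb -= 128.0;
    cr -= 128.0;
    rgb[0] = clamp8(y + 1.402    * cr);                  /* R */
    rgb[1] = clamp8(y - 0.344136 * cb - 0.714136 * cr);  /* G */
    rgb[2] = clamp8(y + 1.772    * cb);                  /* B */
}

int main(void)
{
    unsigned char rgb[3];
    ycbcr_to_rgb(180, 100, 160, rgb);   /* one "pixel" of a decoded frame */
    printf("R=%u G=%u B=%u\n", rgb[0], rgb[1], rgb[2]);
    return 0;
}
```

Every output pixel depends only on its own inputs, which is exactly the execution model of a pixel shader, so the extra hardware cost can stay small.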
 
_zZz_ said:
greven, did you emigrate to the USA? Because if I'm not mistaken, Newegg is only for US residents :(

That would be nice, wouldn't it...? :P

_zZz_

Is it? Oops... I saw Raptor's post mentioning Newegg, so I went to have a look.... All together now... OHHHHHHH.... :sad:

:)
 
And so they start heading out of the shipyard for the final touches...

"Availability
The first GPUs based on the NVIDIA GeForce 6 Series, the GeForce 6800 Ultra and GeForce 6800 models, are manufactured using IBM’s high-volume 0.13-micron process technology and are currently shipping to leading add-in-card partners, OEMs, system builders, and game developers.

Retail graphics boards based on the GeForce 6800 models are slated for release in the next 45 days."
 
"The arrival of the GeForce 6 Series has the industry buzzing:

“The arrival of the new GeForce 6800 will further open the door for any game developer looking to push the limits of their imaginations.”

Dusty Welch, Vice President, Global Brand Management, Activision
“The GeForce 6800 delivers screaming performance on demanding games like Unreal Tournament 2004 from start to finish. NVIDIA’s newest powerhouse technology is a perfect match for any gamer who demands non-stop action and blazing fast frame rates.”

David T. Brown, Producer for Unreal Tournament 2004, Atari
“The pace of innovation in graphics technology is a very exciting prospect for both game developers and gamers and NVIDIA has set yet another milestone with the GeForce 6800. For Blizzard, the affordable cost and consistent compatibility and reliability of NVIDIA GPUs make it easy for us to recommend GeForce cards to our customers.”

Blizzard Entertainment
“Not only is the GeForce 6800 extremely fast, it allows us to take our games to the next generation of cinematic realism. The advanced shader capabilities alone open up a whole new realm of possibilities for us.”

Steven Lux, Vice President of US Marketing, Codemasters
“With the GeForce 6800, NVIDIA brings long-awaited, true high dynamic-range imaging to gaming.”

Dean Sekulic, Programmer, Croteam
“The GeForce 6800 is a flexible platform that allows us to fully exploit DirectX 9.0 shader technology. We were able to achieve a very beautiful, polished game with Far Cry, but now we have the ability to push technologies like high-dynamic range, floating point blending and dynamic branching to render pixels at double the speed, and that’s a particularly exciting prospect as we begin to focus on the advancement of the CryEngine for new big games!”

Cevat Yerli, CEO and President, Crytek.
“The newest generation of NVIDIA GPUs will more than satisfy a gamer’s hunger for intense, fast-paced action. For cutting-edge games like Painkiller, where fast, reliable performance is essential, NVIDIA has delivered the goods.”

Brian Gladman, Product Manager, DreamCatcher Games
“With the introduction of the GeForce 6800, NVIDIA has once again raised the bar for what’s possible to achieve in PC gaming.”

Anand Gupta, Director of Business Development, Eidos Interactive
"With its combination of new features and performance, NVIDIA's new GeForce 6800 is impressive and exciting. For the high-quality cinematic look and feel we're trying to achieve for our latest game, the Lord of the Rings: Battle for Middle-earth, this technology couldn't have come along at a better time."

Mark Skaags, Vice President and Executive Producer, Electronic Arts
“The Grafan game engine’s use of high dynamic-range lighting, multiple real-time shadows, and multipass rendering techniques requires a high-performance graphics card. We’re currently working with the GeForce 6800 Ultra GPU and using pixel shader 3.0; all we can say is ‘wow’.”

Herb Marselas, Co-Founder and Chief Executive Officer, Emogence, LLC
“The sheer performance power of the GeForce 6800 will allow gamers to make the detailed 3D environments of Soldner work for them, as they seamlessly move through complex and highly detailed maps to achieve their missions.”

Greg Bauman, Senior Marketing Manager, Encore Software
“The GeForce 6800 is a great leap forward in PC graphics, bringing next-generation DirectX 9.0 rendering performance to new heights while continuing forward with the high-definition floating-point rendering, driver performance, and stability NVIDIA is famous for. As the first GPU supporting the new pixel shader 3.0 programming model, the GeForce 6800 enables games to use entirely new rendering approaches and obtain higher performance per-pixel lighting, shadowing, and special effects than was ever possible before.”

Tim Sweeney, Founder and President, Epic Games
“With 16 pixel pipelines and comprehensive support of Shader Model 3.0, the GeForce 6800 offers a wide spectrum of new features at the highest level of performance. We’re looking forward to combining two new capabilities—vertex texture sampling and stream frequency dividing—to pack our mesh data, thus minimizing our video memory usage. Furthermore, OpenEXR support in texture filtering and pixel blending will finally allow us to efficiently implement high dynamic-range rendering, while also allowing our artists to fully express themselves in this extended domain.”

Janos Boudet, Senior Engine Programmer, Eugensystems
“We are really impressed by NVIDIA’s GeForce 6800. Being at the forefront of technology, NVIDIA is an ideal partner.”

Brian Jobling, Chief Executive Officer, Eutechnyx
“The GeForce 6800 is actually where dreams become reality. We’ve been waiting for a hardware platform to arrive that supports pixel and vertex 3.0 shaders so we can take the Torque Shader Engine to its fullest potential and showcase technology that we’ve only imagined before.”

Jay Moore, Evangelist, Garage Games
“With this new generation we’re really getting things down and not only are we faster and better, but we're now perfecting and mastering the technology.”

Randy Pitchford, President and CEO, Gearbox
“The vertex stream instancing enabled by the GeForce 6800 is a killer feature that allows our SpeedTree technology to drive forests of unprecedented depth and density. Coupled with raw performance, we fully expect our customers’ games to boast far richer and more immersive environments.”

Chris King, President, IDV, Inc.
“The GeForce 6800 is proof of NVIDIA's commitment to providing developers with the tools and hardware we need to hone our craft. Intensity is a hallmark of the Call of Duty franchise, and having a hardware platform that allows us to take the immersion to the next level, with cinematic quality visuals at amazing frame rates, is extremely exciting.”

Vince Zampella, Chief Creative Officer, Infinity Ward
“These new GeForce 6800s are scary fast; it runs Thief: Deadly Shadows at 1600×1200 with AA and all the detail settings cranked all the way up faster than anything we have tested on. I can’t wait to get these in our development boxes as we are starting work on our next rendering technology. Having FP16 buffers fully supported as a texture format is awesome and 3.0 shader support gives us the ideal platform to develop on today for our next titles. Actually I can’t wait to get one in my computer!"

Tim Little, Director of Technology, Ion Storm, LP
“What most excites me about the GeForce 6 Series is the Vertex Stream Frequency Divider and Pixel Shader 3.0 support. The combination will give us the ability to create richer game environments and complex special effects. At last we will be able to do realistic gas and fluid simulations in real time!”

Tim Ramsay, Senior Graphics Programmer, LucasArts
“NVIDIA's GeForce 6 Series changes the playing field and sets new standards for image clarity and quality through floating point capabilities in shading, filtering, texturing, and blending.”

Kevin Stephens, Director of Technology, Monolith
“It’s clear that NVIDIA has done it again with the GeForce 6800. Not only is it wicked fast for Joint Operations: Typhoon Rising, but it provides us with a forward-looking platform for creating next-generation games today. The ability to create and deploy pixel and vertex 3.0 shaders will allow us to reach new levels of realism that weren’t possible before.”

Mark Davis, Chief Scientist, Novalogic
“We have been absolutely blown away by NVIDIA's GeForce 6800 and its amazing set of features. Thanks NVIDIA for putting so much power in our hands!”

Laurent Paret, Vice President of Business Development and Co-Founder, NP Cube
“I think that NVIDIA's GeForce 6800 is a brilliant card. Because it implements Microsoft DirectX 9.0 features at full speed and quality, it is going to dramatically increase overall performance of today's DirectX 9.0 games, including Painkiller.”

Klaudiusz Zych, Technology Programmer, People Can Fly
“I can’t wait to get my hands on some GeForce 6800 hardware with its full support for floating point textures and pixel and vertex shader branching!”

Greg Hjelstrom, Lead 3D Engineer, Petroglyph
“We’ve been flabbergasted by GeForce 6800’s sheer power delivered by 16 pixel pipelines at incredible speed. Its unsurpassed fill rate and shader performance is of tremendous value to all game developers.”

Kurt Pelzer, Senior Software Engineer, Piranha Bytes
“Just by running the samples from the NVIDIA Developer SDK 7.0 on a GeForce 6800 Ultra, it’s easy to see the immense power and graphical possibilities of this next-generation-today graphics card.”

Bob Troughton, PC Rendering Lead, Pitbull Syndicate
“NVIDIA has stepped it up a few more gears with the GeForce 6 Series. Vertex and pixel shader 3.0, with dynamic branching is exactly what developers have been longing for since the inception of shaders on the PC.”

Mark Robinson, Senior PC Programmer, Rockstar San Diego
“We’ve made a serious commitment to NVIDIA, and they to us, as the graphics platform of choice for EverQuest II. Our 3D artists and technologists rely on the feature set and programmability of NVIDIA GPUs to keep our online world running smoothly and looking spectacular. The introduction of the GeForce 6800 presents a particularly exciting proposition for us, as it unlocks a ton of new features that our artists have been asking for. With this new technology, the beautiful worlds of EverQuest II will be a level beyond every other massively multiplayer game available.”

John Smedley, President, Sony Online Entertainment
“Due to the fact that the NVIDIA GeForce 6800 Ultra runs our latest tech flawlessly right out of the box, we are absolutely delighted with both the level of performance and the rock-solid drivers, ensuring developers, publishers, and most importantly customers, benefit to the max without any hassle whatsoever.”

Martyn Brown, Creative Director, Team17 Software Limited
“With the GeForce 6800, NVIDIA is raising the bar by delivering tons of stunning and highly useful features. Totally orthogonal support for 64-bit floating point color across the entire pipeline—including texture filtering and frame buffer blending—makes GeForce 6800 the perfect graphics hardware for high dynamic-range rendering which is at the basis of our next-generation game engine!”

Jakub Klarowicz, Senior Engine Programmer, Techland
“NVIDIA’s GeForce 6800 is the most powerful graphics hardware I've ever seen. Its performance and capabilities will help us reach new levels of image quality.”

Julien Merceron, Chief Technology Officer, Ubisoft
“We have a very clear goal for the Xubl game engine—render 80,000 polygons per frame without a hitch in performance. With NVIDIA’s GeForce 6800, not only are we able to do just that, but we’re also able to push out unique and stylized effects on a high-performance hardware platform that supports advanced rendering techniques like normal mapping, high dynamic-range lighting and complex 3.0 shaders.”

Bostjan Troha, CEO, Zootfly
“The feature set of the GeForce 6800 is truly astounding. Just when I thought graphics hardware couldn’t possibly evolve at a faster pace, NVIDIA once again exceeds my wildest expectations by delivering a GPU with a design that is remarkably ahead of its time.”

Eric Lengyel, Author, Mathematics of 3D Programming and The OpenGL Extensions Guide"
 
This new Nvidia GPU creates some difficulties for Intel. With the inclusion of a video-processing module that is nothing more than a 16-way SIMD vector processing unit, it becomes possible to move work that ran well under SSE2/SSE3 onto the Nvidia GPU. What remains is to test this little demon's performance. The GPU clock being only 400 MHz doesn't help, but with good memory backing it up, who knows.

Since one of the P4's big advantages over the AMD chips was precisely the SSE SIMD instructions, I think that with a GPU like this the P4 may lose some of that edge.
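To make the comparison concrete, here is a minimal sketch (plain C with SSE2 intrinsics) of the kind of packed 8-bit pixel work that SSE2 handles well today and that a wide SIMD video unit on the GPU could, in principle, absorb; nothing in it is NV40-specific:

```c
/* Minimal sketch: averaging two rows of 8-bit pixels, 16 at a time, with
 * SSE2 intrinsics -- the class of packed-pixel SIMD work discussed above. */
#include <emmintrin.h>
#include <stdio.h>

static void average_rows(const unsigned char *a, const unsigned char *b,
                         unsigned char *out, int n)
{
    int i;
    for (i = 0; i + 16 <= n; i += 16) {
        __m128i va = _mm_loadu_si128((const __m128i *)(a + i));
        __m128i vb = _mm_loadu_si128((const __m128i *)(b + i));
        /* rounded average of 16 unsigned bytes in one instruction */
        _mm_storeu_si128((__m128i *)(out + i), _mm_avg_epu8(va, vb));
    }
    for (; i < n; i++)                       /* scalar tail */
        out[i] = (unsigned char)((a[i] + b[i] + 1) / 2);
}

int main(void)
{
    unsigned char a[32], b[32], out[32];
    for (int i = 0; i < 32; i++) {
        a[i] = (unsigned char)i;
        b[i] = (unsigned char)(255 - i);
    }
    average_rows(a, b, out, 32);
    printf("out[0]=%u out[31]=%u\n", out[0], out[31]);
    return 0;
}
```

The whole loop is throughput-bound, data-parallel arithmetic, which is why a 16-way vector unit on the graphics card is a plausible home for it.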
 
iJFerreira said:
This new Nvidia GPU creates some difficulties for Intel. With the inclusion of a video-processing module that is nothing more than a 16-way SIMD vector processing unit, it becomes possible to move work that ran well under SSE2/SSE3 onto the Nvidia GPU. What remains is to test this little demon's performance. The GPU clock being only 400 MHz doesn't help, but with good memory backing it up, who knows.

Since one of the P4's big advantages over the AMD chips was precisely the SSE SIMD instructions, I think that with a GPU like this the P4 may lose some of that edge.

In short, this line from Anand sums it up:
This could actually really help even the playing field between Intel and AMD if it catches on ...

Everyone seems excited. The major developers all make a point of PS 3.0. I was convinced it wouldn't have that much impact. Since the ATI R420 apparently won't support PS 3.0, it has a serious competitive handicap. Perhaps more at the marketing level than in practice, rare exceptions aside, but with all these forward-looking statements about these "instructions" being put to use soon, the bar keeps getting higher for ATI. Not only will it have to beat this performance, it will also have to make people forget the absence of PS 3.0, and possibly the lack of the NV40's video encoding/decoding capabilities.

As someone rightly said a while back: "If ATI thinks it can rest on the laurels of its performance lead, it should think again" (something like that).
 