AMD's R600 to suffer another delay
Spin the respin action, Feb launch looking unlikely
By Theo Valich: Wednesday 03 January 2007, 08:30
THE FIRST AMD/ATI graphics chip just cannot enter the world without much screaming and wriggling.
In fact, it's looking more like Nvidia will be first to market with an 80 nanometre version of the G80 graphics chip, rather than ATI with its 720 million transistor monster.
The last revision of the prototype chip - upon which a certain "pre-review" is based - also suffers from problems serious enough to warrant another re-spin, sources tell me. This re-spin puts the launch on hold for another couple of weeks, and R600 is now looking at an early March launch, probably the week before SnowBIT in Hangover. However, AMD/ATI is making sweeping changes to the whole line-up, and we can say that this launch, when it happens, will be very, very shocking for the 3D industry.
I am not talking about pseudo-penismarks here.
We have learned about the performance and feature characteristics, and they are looking pretty impressive in more than one way, although you can expect Dr. Spinola's engine to go into defensive mode, saying people "don't understand what the R600 is".
However, engineering teams are working around the clock to make this launch as good as possible.
One small note on the tech demos. Prepare for your mandibles to drop, because Adrienne should look like a high-school attempt at 3D graphics compared to ATI's thing. µ
I don't know if this news has already been posted here.
http://theinquirer.net/default.aspx?article=36673
If true, the launch will only happen in March.
It looks like whoever bought a G80 two months ago (yes, two months have gone by and still no R600...) bought very well, since the first competition will only show up two months from now, that is, five months later!
ATI really is....
Every time I play and push my 8800GTX to its limits, I am always impressed by the performance and graphics quality the G80 produces. I have no doubt it was one of my best purchases in recent times, along with the eVGA 680i SLI board and my E6600, a "dream" marriage...
As for the R600, I don't think the news is surprising. It feels like AMD and ATI are putting their "house" in order, and it wouldn't surprise me to see ATI rise from the ashes in force at AMD's hand. In fact, this apparent calm and silence on ATI's part already suggests their procedure has changed, very much in line with AMD's practices around protecting its NDAs.
Best not to write off AMD/ATI, especially since 2007 is an important milestone for AMD and the bet is a big one... They will be back!!
If it comes out in March but the results are similar to those shown at Level5, that's already pretty good
Cheers
First, keep in mind that the memory interface will be 512 bits wide. Pair that with up to 2GB of GDDR4 RAM (the card will actually feature 1 to 2GB, so this is no mistake!) and you get about 140GB/s of data throughput, which will leave the 8800 cards in the dust. With a maximum of 86GB/s, the 8800 GTX seems far behind ATI's new toy.
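Those throughput figures fall out of simple peak-bandwidth arithmetic: bus width in bytes multiplied by the effective per-pin data rate. A minimal sketch, assuming per-pin rates inferred from the quoted totals (the 1.8 and 2.2 GT/s figures are assumptions, not specs from the article):

```python
def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gt_s: float) -> float:
    """Peak theoretical memory bandwidth in GB/s:
    (bus width / 8) bytes per transfer times effective transfers per second."""
    return bus_width_bits / 8 * data_rate_gt_s

# 8800 GTX: 384-bit bus with 900MHz GDDR3, i.e. 1.8 GT/s effective (double data rate)
print(peak_bandwidth_gb_s(384, 1.8))   # 86.4, matching the ~86GB/s above

# Rumored R600: a 512-bit bus needs only ~2.2 GT/s GDDR4 for the quoted ~140GB/s
print(peak_bandwidth_gb_s(512, 2.2))   # 140.8
```

Note the design trade-off this implies: a wider bus reaches higher bandwidth at modest memory clocks, which is why even early GDDR4 speed grades would suffice for R600.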
The PCB is probably the most complex ever used in the video card industry. It's a 12-layer design, a little shorter than the 8800 GTX's but a lot "fatter". And so is the card, since it houses the R600 and a total of 2GB of RAM. Add the cooling device, the power lines and the memory connectors, and you'll quickly realize that R600 will also be the heaviest card on the market. The "guilty" party in this equation is the mighty quad-heatpipe cooler produced by Arctic Cooling especially for R600. Since the cooler and heatsink are actually integrated into the PCB to improve its structural integrity, the final card should look quite interesting.
A new Rage Theater 200 chip will handle the card's VIVO functions. As for the actual performance part, you may already know that R600 will integrate 64 unified shaders, similar to Nvidia's 128 stream processors in the 8800 GTX but far more complex. The unified shader units are 4-way SIMD capable, making them comparable to 256 conventional shader units. And that should be enough to challenge Nvidia again. On the downside, R600 can only manage 16 pixels per clock, making it slower than the 8800 GTX there, but officials from AMD said it won't be a problem, since R600's actual frequency of around 750MHz will compensate for this.
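The "256" figure is just the SIMD width multiplied out, and it only pays off when shader code fills all four lanes. A back-of-the-envelope comparison (the 1.35GHz figure is an assumption taken from the shipping 8800 GTX's shader-domain clock, not from this article):

```python
# Scalar ALU lanes per clock: 64 unified 4-way SIMD shaders vs 128 scalar units
r600_lanes = 64 * 4      # 256, the "comparable to 256 shader units" figure
g80_lanes = 128 * 1      # 128

# Factor in clocks: rumored 750MHz R600 core vs the 8800 GTX's 1.35GHz shader domain
r600_lane_rate = r600_lanes * 0.75    # billions of lane-clocks per second
g80_lane_rate = g80_lanes * 1.35

print(r600_lanes / g80_lanes)           # 2.0x advantage on paper, per clock
print(r600_lane_rate / g80_lane_rate)   # only ~1.11x once clocks are considered
```

This is marketing arithmetic, not a performance prediction: scalar-heavy shader code that cannot be vectorized would leave R600's SIMD lanes partly idle, while G80's scalar units stay fully utilized.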
Rumors inside ATI have it that early R600 samples running at 750MHz have produced a score of 11,000 points in Futuremark's 3DMark 2006. While that remains to be proven once the final product reaches the market, several voices also say ATI is already pre-testing some of its samples at a whopping 1GHz.
Various reports coming out of ATI (again, not an official statement) suggest that the postponed launch had a lot to do with the cooling system being unable to cope with the card's heat dissipation, especially when the GPU runs at high speeds. I rather doubt R600 will actually ship at a full 1GHz, but it will surely be clocked higher than the G80. And if you keep in mind that maybe 80% of buyers will overclock the card as much as they can, this becomes a real bottleneck.
At the moment, the G80 is the way to go. ATI still has a lot to take care of, especially problems with high clock speeds and the integration of GDDR4 onto the PCB. The G80 pre-calculates loads of answers and stores them in its texture units (of which the G80 has plenty). ATI has more raw computing power but less of this pre-load design, so it's hard to say which will win. Anyhow, it's good to see that the two giants' architectures seem to differ a lot.
ATI is curiously targeting February 14th as the actual launch date. Yep, it's Valentine's Day, and this time it will coincide with the R600 release. Give or take a few bucks, it's a "perfect" gift for your girlfriend. As illogical as it may seem, the launch also comes at the perfect time for CeBIT, which runs from March 15 until March 21. Whether R600 will be the new king at CeBIT is hard to say, especially since Nvidia has three months of silence ahead, and I doubt they're going to rest their brains all that time. Several voices from Nvidia have already pointed out that the company is preparing a tweaked 80nm part which will perform better than the G80. The chip is called G81 and is expected to come out just in time to meet R600.
Since September, Samsung has kept announcing how GDDR4 was going to be the next big thing in IT. And I use the word "since" deliberately, because the Samsung GDDR4 wafers from which the first GDDR4 chips emerged were in many cases either faulty or simply not working as they should. Then came Nvidia's G80, equipped with GDDR3 on a 384-bit bus. Meanwhile, ATI continued to develop the R600 product line, keeping both the GDDR3 and GDDR4 lines alive.
It seems the future equals GDDR4, since ATI has recently announced that along with the R600 line, the Stream Processor 2 line (a GPGPU board based on R600) will only support GDDR4 memory chips. The name of the new ATI graphics board is still unknown, but it's becoming quite clear that both products will rely solely on GDDR4. And that will surely translate into two things.
A projected 2GB GDDR4 R600 will probably be even more expensive than a G80-based card. And then there's the problem of availability, since 2GB GDDR4 monsters will force memory producers to supply a far larger quantity of chips. And since GDDR4 is only at the beginning of its life, this could easily turn into a great GDDR4 shortage.
No other details about R600 are available. Rumors claim that the final chip is rotated at a 45-degree angle (not 60 as originally presumed) in order to keep the memory traces as short as possible. Moreover, R600's backplate could end up with a G80-style DIN-DVI-DVI layout, with the HD output connector located at the top of the PCB rather than in the middle as on the older X1950 series. Lots of good news, except for one item: R600 was postponed until February. The reasons are unknown, but that seems to be no problem for DirectX 10 users, since Nvidia has problems of its own with a working DirectX 10 driver for Vista, slated for December 15.
Pretty good????
Coming out 5 months after the G80 and trading blows with it is pretty good?????
To me that's pretty bad! For the R600 to be really good, it has to be as good as the G81, so as to pre-empt its more than likely performance.
And those 5 months should be more than enough for you to get tired of playing DX10 games with the drivers that don't exist
The 5 months you're assuming are the 5 months of Nvidia's reign, because from what I've seen, ATI arrives later but ends up doing better.
What's the best chip out there that is DX9c-only?
However, AMD/ATI is making severe changes to the whole line up and we can say that this launch, when it happens, will be very, very shocking for the 3D industry.
One small note on the tech demos. Prepare for your mandibles to drop, because Adrienne should look like a high-school attempt at 3D graphics compared to ATI's thing. µ
http://www.theinquirer.net/default.aspx?article=36673
LOL, what could the INQ have meant by this?? :
Cheers
Sorry, but I can't resist asking this question:
What interest could nVidia have in launching the G81 before the R600? Reinforcing its lead? Lowering the G80's price just to please people, and losing the money it could make from a few more months of the G80 at the top of the performance charts... count on that (not)
[the other option would be to keep the G80's price and launch the G81 at astronomical prices... say 700~800€, which seems highly unlikely to me]
Besides, it would be launching a second DX10 chip, even more powerful, when there are still no DX10 games... like... what for?
And the G80 is priced like any high-end card; it's not even at an exorbitant price.
The G80 is the only chip out there with DX10 support (but right now most people couldn't care less, because what really matters is its DX9 performance... a bit like what happened with the Athlon 64, where most people couldn't care less about 64-bit support and cared only about 32-bit performance)
And then, what hurry can ATI be in to launch a DX10 card when there are no games for it? The only point would be to answer nVidia, and answering isn't the heart of the matter (in my opinion, of course)
I mean... sure, there's obviously value in having the best card, but we all know this isn't the segment where the most money is made. And where the money is made (value for money), ATI is actually not badly represented.
Christmas is already lost (and Christmas means sales), it will miss the Windows Vista launch (will anyone buy a card like this just to support Windows Vista? I really don't think so), and apparently it will come out in March (and I think that will be around the time of the first DX10 titles... when the Radeon 9xxx launched, it was also much closer to the launch of DX9 games... but the 9xxx series wasn't high-end only, it came out for the mid and low ranges too)
Anyway, I realize this post may be a bit controversial, but I don't think it's hard to see my point of view...
A dominant flagship exerts a powerful "halo effect" over the rest of the lineup.