GEFORCE GTX 280



ATI HD 2900 GT


Wednesday, March 31, 2010

ATI Radeon HD 5800


As many of you know, today is the day ATI lifts the curtain on its latest generation of desktop graphics cards, with two models kicking things off: the HD 5850 and the top-end single-GPU offering, the HD 5870. In the near future we'll see others added to the lineup, the most anticipated being the dual-GPU HD 5870 X2. All of these are the first DirectX 11 cards to hit the market.
AMD today launched the most powerful processor ever created, found in its next-generation graphics cards, the ATI Radeon™ HD 5800 series graphics cards, and the world's first and only to fully support Microsoft DirectX® 11, the new gaming and compute standard shipping shortly with the Microsoft Windows® 7 operating system. Boasting up to 2.72 TeraFLOPS of compute power, the ATI Radeon™ HD 5800 series effectively doubles the value consumers can expect of their graphics purchases, delivering twice the performance-per-dollar of previous generations of graphics products. AMD will initially release two cards: the ATI Radeon HD 5870 and the ATI Radeon HD 5850, each with 1GB of GDDR5 memory. With the ATI Radeon™ HD 5800 series of graphics cards, PC users can expand their computing experience with ATI Eyefinity multi-display technology, accelerate their computing experience with ATI Stream technology, and dominate the competition with superior gaming performance and full support of Microsoft DirectX® 11, making it a "must-have" consumer purchase just in time for the Microsoft Windows® 7 operating system.
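
If you're wondering where that 2.72 TeraFLOPS number comes from, here's a quick back-of-the-envelope check in Python. It assumes the HD 5870's published specs of 1600 stream processors at 850 MHz, each able to issue a multiply-add (two floating-point operations) per clock; treat it as a sketch of the arithmetic, not an AMD formula.

# Theoretical single-precision peak throughput, in GFLOPS.
# Assumed figures: 1600 stream processors at 850 MHz, 2 FLOPs per clock each.
def peak_gflops(stream_processors, clock_mhz, flops_per_clock=2):
    # units * MHz * ops/clock = millions of FLOPs per second; /1000 gives GFLOPS
    return stream_processors * clock_mhz * flops_per_clock / 1000.0

print(peak_gflops(1600, 850))  # 2720.0 GFLOPS, i.e. the quoted 2.72 TeraFLOPS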

“With the ATI Radeon HD 5800 series of graphics cards driven by the most powerful processor on the planet, AMD is changing the game, both in terms of performance and the experience,” said Rick Bergman, senior vice president and general manager, Products Group, AMD. “As the first to market with full DirectX 11 support…”

How well does the new HD 5870 perform against the still-solid previous-generation HD 4890 and competing models from NVIDIA? You can find out in our fresh review of Sapphire's stock model, which has just gone live thanks to the NDA lift.
Also watch out for several other articles over the next day or so, including two of these bad boy HD 5870s in CrossFire later tonight, followed by an overclocking article.
Designed and built for purpose: Modeled on the full DirectX 11 specification, the ATI Radeon HD 5800 series of graphics cards delivers up to 2.72 TeraFLOPS of compute power in a single card, translating to superior performance in the latest DirectX 11 games, as well as in DirectX 9, DirectX 10, DirectX 10.1 and OpenGL titles, in single-card configurations or multi-card configurations using ATI CrossFireX™ technology. When measured in terms of game performance experienced in some of today's most popular games, the ATI Radeon HD 5800 series is up to twice as fast as the closest competing product in its class, allowing gamers to enjoy incredible new DirectX 11 games – including the forthcoming DiRT™ 2 from Codemasters, Aliens vs. Predator™ from Rebellion, and updated versions of The Lord of the Rings Online™ and Dungeons and Dragons Online® Eberron Unlimited™ from Turbine – all in stunning detail with incredible frame rates.
Generations ahead of the competition: Building on the success of the ATI Radeon™ HD 4000 series products, the ATI Radeon HD 5800 series of graphics cards is two generations ahead of DirectX 10.0 support, and features 6th-generation evolved AMD tessellation technology, 3rd-generation evolved GDDR5 support, and 2nd-generation evolved…
There's a stack of other coverage of the new 5800 family of Radeon cards showing up around the web, which you can get to via the links below:

Monday, March 15, 2010

GPU cooler



Graphics card cooling has become almost as important as CPU cooling, if not more so, and with many products to choose from it's difficult to know which one to go for.

Coolink is a brand of the Kolink International Corporation and stands for an effective conjunction of no-frills performance, excellent quality and attractive pricing. While Coolink has held a strong presence in the Asian market ever since the late '90s, it was not until 2005 that the brand was introduced to the European market on a large scale. After the launch of Coolink-Europe.com in late 2005, Coolink quickly became a well-recognized brand for high-quality cooling components in Europe too. As the company's tagline puts it: "Coolink – the direct link to affordable high-end cooling!"

The GPU cooler up for review today is the Coolink GFXChilla, and its low-profile design promises high-performance cooling with low noise output. Let's see how it performs.

Sunday, February 21, 2010

GeForce 7800 GTX = PS3 GPU


NVIDIA's success in the graphics industry comes down to two things: the first is a competitive product, and the second is the ability to execute. 3dfx learned the hard way (as did NVIDIA with the NV30) that releasing hardware in a timely fashion is critical. With the NV40 family of products, which spans from the 6200 TC all the way to the 6800 Ultra, NVIDIA nailed down both performance leadership and, for the most part, high product availability.
The graphics manufacturers are usually pretty predictable in terms of product launches; there is usually an introduction of a part in the spring/summer timeframe and a product refresh in fall/winter. Although there was the announcement of a 512MB GeForce 6800 Ultra this past February, that was not really a refresh, though the introduction of SLI could be read as such. The last few months have been filled with rumors about new cards as spring rolled around. Rumors on the NVIDIA front were contradictory at best, with some sources suggesting an NV40 refresh while others pointed to a whole new architecture. In any event, it has been over a year since the debut of the 6800 Ultra, and no other single-board, mainstream NVIDIA part has superseded it until today: the 7800 GTX, codenamed G70.

Today, NVIDIA will be announcing the availability of their next-generation GeForce 7800 GTX graphics processing unit (GPU). The GeForce 7800 GTX is based on the PCI Express interface and replaces the popular GeForce 6800 Ultra at the high end. With NVIDIA's Scalable Link Interface (SLI) technology, a second GeForce 7800 GTX can easily be added to a system to achieve an unprecedented level of 3D graphics performance.


The GeForce 7800 GTX will be available to order today from online retailers, system builders, and PC OEMs and carries a suggested retail price of $599. The following PriceGrabber search will provide a list of online retailers that are selling the GeForce 7800 GTX. Keep in mind that the price of a product is typically determined by supply and demand. One of the main reasons for the $599 price tag is that the PCI Express version of the GeForce 6800 Ultra has retained much of its original value due to the tremendous popularity of SLI.
Today may also mark the end of an era, as NVIDIA did not provide information regarding an AGP version of the GeForce 7800 GTX. They are only announcing the GeForce 7800 GTX today, and declined to provide any further information at this time concerning the additional GPUs that are expected to make up the GeForce 7 Series.

The NVIDIA GeForce GTX 200 series


THIS IS SPARTA! Oh ... hi everyone. Yeah, once you get to play games with the beast we'll be showing off today, you tend to get a little melodramatic. I have this weird thing where, once I really like a piece of hardware, I try to find a fitting euphemism for the product. The actual GeForce GTX 280 we put to the test reminded me of pure brute force. But all that brute force, streamlined, well organized and efficient, could very well be a euphemism for SPARTAAAANS. If you have no frickin' clue what I'm talking about, first go see the movie '300' and then come back and read this article, as I like to evoke that feeling in this article.
Welcome to today's introduction of the GeForce GTX 200 release. The long-awaited successor to the GeForce 8 series architecture is finally here, with one keyword: loads of additional transistors.
Specifically, two new products are being released today: the GeForce GTX 260 and 280.
Yeah, the rumor was right ... 1.4 frickin' billion transistors slapped onto a slab of silicon -- quite amazing! I guess NVIDIA's one-billion-dollars-a-year investment in R&D shows today. Weirdly enough, it's also the day that NVIDIA decided to go strong on something other than gaming. They call it their 'GPU and Beyond' approach. In short, they want you guys to be very aware of the fact that the good old GeForce series graphics card is more than a piece of machinery to only play games with. And that's true ... we see more and more features merged into the graphics card, and they broaden that PC experience we all love so much. Today we'll actually show you some very interesting examples of that, yet obviously we will also go a little deeper into the architecture and fire off a dozen or two games at the product. And yes, we finally found a product that can play Crysis at a decent resolution with high image-quality settings.
Before we dive into the review of the architecture and features of the new GeForce GTX 200 series, I really want to take you on a shader crash course. We, the press, talk about shaders all the time in our articles, but it is certainly very difficult for the end user to understand what a shader or a shader processor even is. Next page please, and if you're not interested in that explanation, just jump to page three.
But I feel you first should have a glimpse of the GeForce GTX 280.
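
As a teaser for that crash course: the core idea is that a shader is just a tiny program that runs once per pixel (or per vertex), and the GPU's shader processors run huge numbers of those invocations in parallel. Here's a toy Python sketch of that model; it's purely illustrative and not how any real driver or GPU is programmed.

# A toy "pixel shader": one small function, run independently for every pixel.
# A GPU's shader processors execute thousands of these invocations in parallel;
# this sequential loop only illustrates the programming model.
def pixel_shader(x, y, width, height):
    r = x / (width - 1)   # red ramps from left to right
    g = y / (height - 1)  # green ramps from top to bottom
    b = 0.5               # constant blue
    return (r, g, b)

width, height = 4, 3
framebuffer = [[pixel_shader(x, y, width, height) for x in range(width)]
               for y in range(height)]
print(framebuffer[0][3])  # top-right pixel: (1.0, 0.0, 0.5)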

ATI Radeon HD 2900 XT: Calling a Spade a Spade





While AMD will tell us that R600 is not late and hasn't been delayed, this is simply because they never actually set a public date from which to be delayed. We all know that AMD would rather have seen their hardware hit the streets at or around the time Vista launched, or better yet, alongside G80. But the fact is that AMD had quite a few problems in getting R600 out the door.
While we couldn't really get the whole story from anyone, we heard bits and pieces here and there during our three day briefing event in Tunis, Tunisia. These conversations were short and scattered and not the kind of thing that it's easy to get a straight answer about when asking direct questions. Keeping that in mind, we do have some information and speculation about a few of the road bumps AMD faced with R600.
Apparently, the first spin of R600 silicon could only communicate over the debugging interface. While the upside is that the chip wasn't totally dead, this is not a good problem to have. We also overheard that a later revision of the hardware suffered from fragments getting stuck in pixel shaders. We even overheard one conversation where someone jokingly remarked that AMD should design hardware but leave the execution to NVIDIA.
In a wild bout of pure speculation on our part, we would have to guess about one other problem that popped up during R600's creation. It seems to us that AMD was unable to get their MSAA hardware to work properly and was forced to use shader hardware to handle MSAA rather than go back for yet another silicon revision. Please know that this is not a confirmed fact, but just an educated guess.
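
To unpack what that speculation would mean in practice: the "resolve" step of MSAA, normally handled by fixed-function hardware, would instead run as an ordinary shader program that averages each pixel's subsamples. Here's a minimal Python sketch of that resolve math; it's our illustration of the idea, not anything from AMD's documentation.

# Shader-based MSAA resolve, conceptually: average a pixel's N subsamples.
def resolve_msaa(subsamples):
    # subsamples: list of (r, g, b) tuples, one per sample point in the pixel
    n = len(subsamples)
    return tuple(sum(channel) / n for channel in zip(*subsamples))

# 4x MSAA on a pixel straddling a triangle edge: two samples hit red geometry,
# two hit the black background.
pixel = [(1.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 0.0, 0.0), (0.0, 0.0, 0.0)]
print(resolve_msaa(pixel))  # (0.5, 0.0, 0.0) -- a smoothly blended edge color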
In another unique move, there is no high end part in AMD's R600 lineup. The Radeon HD 2900 XT is the highest end graphics card in the lineup and it's priced at $399. While we appreciate AMD's intent to keep prices in check, the justification is what we have an issue with. According to AMD, it loses money on high end parts which is why we won't see anything more expensive than the 2900 XT this time around. The real story is that AMD would lose money on a high end part if it wasn't competitive, which is why we feel that there's nothing more expensive than the 2900 XT. It's not a huge deal because the number of people buying > $399 graphics cards is limited, but before we've started the review AMD is already giving up ground to NVIDIA, which isn't a good sign.
More than anything, we'd guess that the lack of a high end part has a lot to do with the delays and struggles AMD saw this time around in bringing R600 to market. We expect to see the return of a very high end part by the time R700 comes around, assuming that there aren't similarly debilitating delays.
The delays and the lack of a high-end part could be forgiven if the Radeon HD 2900 XT could do to NVIDIA what the G80 launch did to ATI; unfortunately, the picture just isn't that rosy. ATI's latest and greatest doesn't exactly deliver the best performance per watt: while it doesn't compete performance-wise with the GeForce 8800 GTX, it requires more power. An ultra-high-end power requirement in a sub-$400 graphics card isn't exactly ideal.
Despite all of this, there's a great deal of cool technology in the R600, and as ATI is now a part of a CPU company, we received more detail on the GPU than we've gotten during any other GPU launch. AMD takes graphics very seriously, and it recently reaffirmed its commitment to continue to deliver high end discrete graphics cards, so amidst countless delays and rumors of strange problems, the R600 architecture is quite possibly more important to AMD than the graphics cards themselves. An eventual derivative of this architecture will be used in AMD's Fusion processors, eventually making their way into a heterogeneous multi-core AMD microprocessor.
With AMD's disappointing Q1, it can't rest too much on the hope of Fusion changing the market, so we'll have to start by looking at where R600 is today and how it stacks up to NVIDIA's latest and almost greatest.
The new ATI Radeon HD 2900 XT runs at 740 MHz and accesses its 512 MB of GDDR3 memory at 1.65 GHz (825 MHz x 2) using a new 512-bit memory interface, which boosts the maximum theoretical memory transfer rate to 105.6 GB/s – the Radeon X1950 XTX has a maximum memory transfer rate of 64 GB/s and the GeForce 8800 GTX of 86.4 GB/s, but the new GeForce 8800 Ultra reaches 103.6 GB/s.
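
Those bandwidth figures all fall out of the same simple formula: bus width in bits, divided by eight to get bytes per transfer, times the effective data rate. A quick Python check (the 256-bit and 384-bit bus widths of the competing cards come from their published specs, not from the text above):

# Max theoretical memory bandwidth: (bus width / 8) bytes * effective GT/s = GB/s
def mem_bandwidth_gbs(bus_width_bits, effective_clock_ghz):
    return bus_width_bits / 8 * effective_clock_ghz

print(mem_bandwidth_gbs(512, 1.65))  # Radeon HD 2900 XT:  105.6
print(mem_bandwidth_gbs(256, 2.00))  # Radeon X1950 XTX:    64.0
print(mem_bandwidth_gbs(384, 1.80))  # GeForce 8800 GTX:    86.4
print(mem_bandwidth_gbs(384, 2.16))  # GeForce 8800 Ultra: 103.68 (quoted as 103.6)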
Since it is based on Shader Model 4.0 (DirectX 10), this video card uses a unified shader processing architecture, i.e. instead of having separate units for processing pixel shaders, vertex shaders, physics and geometry, it has several generic units, called “stream processors”. This model has 320 processing units, versus 128 on the GeForce 8800 GTX and Ultra.
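
The practical meaning of "unified" is that any of those generic units can run any kind of shader program, so the hardware can shift its capacity between vertex and pixel work from frame to frame. A toy Python sketch of the idea (the function names are ours, purely illustrative):

# Unified shading, conceptually: one pool of generic units, any program.
def run_on_stream_processor(program, data):
    # A generic unit doesn't care whether it gets vertex or pixel work.
    return program(data)

vertex_shader = lambda position: [c * 2.0 for c in position]    # toy transform
pixel_shader = lambda color: tuple(min(c, 1.0) for c in color)  # toy clamp

# The scheduler can hand either workload to the same units as load demands:
print(run_on_stream_processor(vertex_shader, [0.1, 0.2, 0.3]))
print(run_on_stream_processor(pixel_shader, (1.5, 0.3, 0.0)))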
There are several other new features on this video card, like the new generation of Avivo (ATI's 2D-enhancement technology), dubbed Avivo HD, now capable of decoding HD DVD and Blu-ray discs directly on the GPU, thus freeing the CPU from this task, and the use of 128-bit High Dynamic Range (HDR) rendering, versus 64-bit on the previous ATI generation. We've written a full article explaining the architecture of this new video card and we recommend reading it to complement the present review.
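
The "128-bit" in 128-bit HDR is simply four color channels at 32 bits of floating point each, where the previous generation stored four 16-bit half floats per pixel. A trivial Python illustration of the storage math:

# HDR pixel format width: bits per channel * number of channels (RGBA = 4).
def hdr_bits_per_pixel(bits_per_channel, channels=4):
    return bits_per_channel * channels

print(hdr_bits_per_pixel(32))  # 128-bit HDR (FP32 per channel) on the HD 2900 XT
print(hdr_bits_per_pixel(16))  # 64-bit HDR (FP16 per channel) on the prior generation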
For a full comparison between the ATI Radeon HD 2900 XT and competing chips from nVidia, please read our nVidia Chips Comparison Table tutorial.
In Figures 2 and 3 you can see the reviewed card from AMD, which is a reference model. When a high-end video card is released, AMD (or nVidia) manufactures the cards (actually the cards are manufactured by an Asian manufacturer contracted by them) and then sells them to their partners, the video card manufacturers, which add their sticker, change the BIOS to their settings, add their cables and CDs, and put everything into their box. In this case, the cards provided by different manufacturers are exactly the same. This is the case with the Radeon HD 2900 XT, at least right now.