
Today we are going to talk about the fastest air-cooled, mass-production GeForce GTX TITAN graphics accelerator, as well as the results of a pair of TITAN cards in a 2-way SLI configuration.

Read more...



The new Asus graphics accelerator can be called the true apogee of Radeon HD 7970 evolution. It is hard to believe that any faster or more advanced graphics accelerator will appear before the HD 7000 series epoch comes to its logical end.

Read more...

<p style="box-sizing: border-box; margin: 13px 0px; padding: 0px; border: 0px; font-family: Arimo, sans-serif; font-size: 14px; line-height: 21px; vertical-align: baseline; color: rgb(68, 68, 68); -webkit-text-stroke-color: rgba(0, 0, 0, 0); -webkit-text-stroke-width: 1px; background-color: rgb(246, 246, 246);"><span style="background-color: rgb(255, 255, 255);"><span style="color: rgb(0, 0, 0);"><span style="font-size: small;">As Intel got into the chipset business it quickly found itself faced with an interesting problem. As the number of supported IO interfaces increased (back then we were talking about things like AGP, FSB), the size of the North Bridge die had to increase in order to accommodate all of the external facing IO. Eventually Intel ended up in a situation where IO dictated a minimum die area for the chipset, but the actual controllers driving that IO didn&rsquo;t need all of that die area. Intel effectively had some free space on its North Bridge die to do whatever it wanted with. In the late 90s Micron saw this problem and contemplating throwing some L3 cache onto its North Bridges. Intel&rsquo;s solution was to give graphics away for free.</span></span></span></p> <p style="box-sizing: border-box; margin: 13px 0px; padding: 0px; border: 0px; font-family: Arimo, sans-serif; font-size: 14px; line-height: 21px; vertical-align: baseline; color: rgb(68, 68, 68); -webkit-text-stroke-color: rgba(0, 0, 0, 0); -webkit-text-stroke-width: 1px; background-color: rgb(246, 246, 246);"><span style="background-color: rgb(255, 255, 255);"><span style="color: rgb(0, 0, 0);"><span style="font-size: small;">The budget for Intel graphics was always whatever free space remained once all other necessary controllers in the North Bridge were accounted for. As a result, Intel&rsquo;s integrated graphics was never particularly good. Intel didn&rsquo;t care about graphics, it just had some free space on a necessary piece of silicon and decided to do something with it. High performance GPUs need lots of transistors, something Intel would never give its graphics architects - they only got the bare minimum. It also didn&rsquo;t make sense to focus on things like driver optimizations and image quality. Investing in people and infrastructure to support something you&rsquo;re giving away for free never made a lot of sense.</span></span></span></p> <p style="box-sizing: border-box; margin: 13px 0px; padding: 0px; border: 0px; font-family: Arimo, sans-serif; font-size: 14px; line-height: 21px; vertical-align: baseline; color: rgb(68, 68, 68); -webkit-text-stroke-color: rgba(0, 0, 0, 0); -webkit-text-stroke-width: 1px; background-color: rgb(246, 246, 246);"><span style="background-color: rgb(255, 255, 255);"><span style="color: rgb(0, 0, 0);"><span style="font-size: small;">Intel hired some very passionate graphics engineers, who always petitioned Intel management to give them more die area to work with, but the answer always came back no. Intel was a pure blooded CPU company, and the GPU industry wasn&rsquo;t interesting enough at the time. 
Intel&rsquo;s GPU leadership needed another approach.</span></span></span></p> <p style="box-sizing: border-box; margin: 13px 0px; padding: 0px; border: 0px; font-family: Arimo, sans-serif; font-size: 14px; line-height: 21px; vertical-align: baseline; color: rgb(68, 68, 68); -webkit-text-stroke-color: rgba(0, 0, 0, 0); -webkit-text-stroke-width: 1px; background-color: rgb(246, 246, 246);"><span style="background-color: rgb(255, 255, 255);"><span style="color: rgb(0, 0, 0);"><span style="font-size: small;">A few years ago they got that break. Once again, it had to do with IO demands on chipset die area. Intel&rsquo;s chipsets were always built on a n-1 or n-2 process. If Intel was building a 45nm CPU, the chipset would be built on 65nm or 90nm. This waterfall effect allowed Intel to help get more mileage out of its older fabs, which made the accountants at Intel quite happy as those $2 - $3B buildings are painfully useless once obsolete. As the PC industry grew, so did shipments of Intel chipsets. Each Intel CPU sold needed at least one other Intel chip built on a previous generation node. Interface widths as well as the number of IOs required on chipsets continued to increase, driving chipset die areas up once again. This time however, the problem wasn&rsquo;t as easy to deal with as giving the graphics guys more die area to work with. Looking at demand for Intel chipsets, and the increasing die area, it became clear that one of two things had to happen: Intel would either have to build more fabs on older process nodes to keep up with demand, or Intel would have to integrate parts of the chipset into the CPU.</span></span></span></p> <p style="box-sizing: border-box; margin: 13px 0px; padding: 0px; border: 0px; font-family: Arimo, sans-serif; font-size: 14px; line-height: 21px; vertical-align: baseline; color: rgb(68, 68, 68); -webkit-text-stroke-color: rgba(0, 0, 0, 0); -webkit-text-stroke-width: 1px; background-color: rgb(246, 246, 246);"><span style="background-color: rgb(255, 255, 255);"><span style="color: rgb(0, 0, 0);"><span style="font-size: small;">Not wanting to invest in older fab technology, Intel management green-lit the second option: to move the Graphics and Memory Controller Hub onto the CPU die. All that would remain off-die would be a lightweight IO controller for things like SATA and USB. PCIe, the memory controller, and graphics would all move onto the CPU package, and then eventually share the same die with the CPU cores.</span></span></span></p> <p style="box-sizing: border-box; margin: 13px 0px; padding: 0px; border: 0px; font-family: Arimo, sans-serif; font-size: 14px; line-height: 21px; vertical-align: baseline; color: rgb(68, 68, 68); -webkit-text-stroke-color: rgba(0, 0, 0, 0); -webkit-text-stroke-width: 1px; background-color: rgb(246, 246, 246);"><span style="background-color: rgb(255, 255, 255);"><span style="color: rgb(0, 0, 0);"><span style="font-size: small;">Pure economics and an unwillingness to invest in older fabs made the GPU a first class citizen in Intel silicon terms, but Intel management still didn&rsquo;t have the motivation to dedicate more die area to the GPU. 
That encouragement would come externally, from Apple.</span></span></span></p> <p style="box-sizing: border-box; margin: 13px 0px; padding: 0px; border: 0px; font-family: Arimo, sans-serif; font-size: 14px; line-height: 21px; vertical-align: baseline; color: rgb(68, 68, 68); -webkit-text-stroke-color: rgba(0, 0, 0, 0); -webkit-text-stroke-width: 1px; background-color: rgb(246, 246, 246);"><span style="background-color: rgb(255, 255, 255);"><span style="color: rgb(0, 0, 0);"><span style="font-size: small;">Looking at the past few years of Apple products, you&rsquo;ll recognize one common thread: Apple as a company values GPU performance. As a small customer of Intel&rsquo;s, Apple&rsquo;s GPU desires didn&rsquo;t really matter, but as Apple grew, so did its influence within Intel. With every microprocessor generation, Intel talks to its major customers and uses their input to help shape the designs. There&rsquo;s no sense in building silicon that no one wants to buy, so Intel engages its customers and rolls their feedback into silicon. Apple eventually got to the point where it was buying enough high-margin Intel silicon to influence Intel&rsquo;s roadmap. That&rsquo;s how we got Intel&rsquo;s HD 3000. And that&rsquo;s how we got here.</span></span></span></p> <p style="box-sizing: border-box; margin: 13px 0px; padding: 0px; border: 0px; font-family: Arimo, sans-serif; font-size: 14px; line-height: 21px; vertical-align: baseline; color: rgb(68, 68, 68); -webkit-text-stroke-color: rgba(0, 0, 0, 0); -webkit-text-stroke-width: 1px; background-color: rgb(246, 246, 246);"><span style="background-color: rgb(255, 255, 255);"><span style="color: rgb(0, 0, 0);"><span style="font-size: small;"><a href="http://www.anandtech.com/show/6993/intel-iris-pro-5200-graphics-review-core-i74950hq-tested" target="_blank">Read more...</a></span></span></span></p>


As spring gets ready to roll over to summer, last week we saw the first phase of NVIDIA’s annual desktop product line refresh, with the launch of the GeForce GTX 780. Based on a cut-down GK110 GPU, the GTX 780 was by most metrics a Titan Mini, offering a significant performance boost for a mid-generation part, albeit a part that forwent the usual $500 price tier in the process. With the launch of GTX 780 the stage has been set for the rest of the GeForce 700 series refresh, and NVIDIA is wasting no time on getting to the next part in their lineup. So what’s up next? GeForce GTX 770, of course.

In our closing thoughts on the GTX 780, we ended on the subject of what NVIDIA would do for a GTX 770. Without a new mid/high-end GPU on the horizon, NVIDIA has instead gone to incremental adjustments for their 2013 refreshes, GTX 780 being a prime example through its use of a cut-down GK110, something that has always been the most logical choice for the company. But any potential GTX 770 is far more nebulous, as both a 3rd tier GK110 part and a top-tier GK104 part could conceivably fill the role just as well. With the launch of the GTX 770 now upon us we finally have the answer to that question, and the answer is that NVIDIA has taken the GK104 option.

What is GTX 770 then? GTX 770 is essentially GTX 680 on steroids. Higher core and memory clockspeeds give it performance exceeding the GTX 680, while higher voltages and a higher TDP allow it to reach those clockspeeds and sustain them. As a result GTX 770 is still very much a product cut from the same cloth as GTX 680, but as the fastest GK104 card yet it is a potent successor to the outgoing GTX 670.

Read more...


As the two year GPU cycle continues in earnest, we’ve reached the point where NVIDIA is gearing up for their annual desktop product line refresh. With the GeForce 600 series proper having launched over a year ago, all the way back in March of 2012, most GeForce 600 series products are at or are approaching a year old, putting us roughly halfway through Kepler’s expected 2 year lifecycle. With their business strongly rooted in annual upgrades, this means NVIDIA’s GPU lineup is due for a refresh.

How NVIDIA goes about their refreshes has differed throughout the years. Unlike the CPU industry (specifically Intel), the GPU industry doesn’t currently live on any kind of tick-tock progression method. New architectures are launched on new process nodes, which in turn ties everything to the launch of those new process nodes by TSMC. Last decade saw TSMC doing yearly half-node steps, allowing incremental fab-driven improvements every year. But with TSMC no longer doing half-node steps as of 40nm, fab-driven improvements now come only every two years.

In lieu of new process nodes and new architectures, NVIDIA has opted to refresh based on incremental improvements within their product lineups. With the Fermi generation, NVIDIA initially shipped most GeForce 400 Fermi GPUs with one or more disabled functional units. This helped to boost yields on a highly temperamental 40nm process, but it also left NVIDIA an obvious route of progression for the GeForce 500 series. With the GeForce 600 series on the other hand, 28nm is relatively well behaved and NVIDIA has launched fully-enabled products at almost every tier, leaving them without an obvious route of progression for the Kepler refresh.

So where does NVIDIA go from here? As it turns out NVIDIA’s solution for their annual refresh is essentially the same: add more functional units. NVIDIA of course doesn’t have more functional units to turn on within their existing GPUs, so instead they’re doing the next best thing, acquiring more functional units by climbing up the GPU ladder itself. And with this in mind, this brings us to today’s launch, the GeForce GTX 780.

Read more...


Two new graphics cards in the entry-level price segment have entered into fierce competition with each other. Which one will come out on top?


<p style="box-sizing: border-box; margin: 13px 0px; padding: 0px; border: 0px; font-family: Arimo, sans-serif; font-size: 14px; line-height: 21px; vertical-align: baseline; color: rgb(68, 68, 68); -webkit-text-stroke-color: rgba(0, 0, 0, 0); -webkit-text-stroke-width: 1px; background-color: rgb(246, 246, 246);"><span style="font-size: small;">Officially canonized back in 2008 with AMD&rsquo;s &ldquo;small die&rdquo; strategy, dual-GPU cards have since become a staple of AMD&rsquo;s product lineup. Filling a small-but-important niche for AMD, dual-GPU cards allow AMD to both deliver ultra-enthusiast performance levels their traditional single-GPU products can&rsquo;t offer, and at the same time compete with NVIDIA&rsquo;s big die flagship cards without AMD needing to produce a big die GPU of their own. As a result, though these cards aren&rsquo;t necessarily obligatory, with each generation we&rsquo;re left eagerly awaiting just what AMD has in store for their capstone product.</span></p> <p style="box-sizing: border-box; margin: 13px 0px; padding: 0px; border: 0px; font-family: Arimo, sans-serif; font-size: 14px; line-height: 21px; vertical-align: baseline; color: rgb(68, 68, 68); -webkit-text-stroke-color: rgba(0, 0, 0, 0); -webkit-text-stroke-width: 1px; background-color: rgb(246, 246, 246);"><span style="font-size: small;">Of course with that said, like so many other facets of the 7000 series, the dual-GPU situation has played out rather unusually in the past year. In a typical year we would see AMD release a standard design, and then later on partners like Asus and PowerColor would release their own custom designs in the name of product differentiation and squeezing out just a bit more performance. Instead the 7000 series has played out in reverse: Asus and PowerColor released their designs first. Consequently, up until this point the 7990 has been &ldquo;officially unofficial&rdquo;, reflecting the fact that the first 7990s were AMD sanctioned products, but not based on AMD designs.</span></p> <p style="box-sizing: border-box; margin: 13px 0px; padding: 0px; border: 0px; font-family: Arimo, sans-serif; font-size: 14px; line-height: 21px; vertical-align: baseline; color: rgb(68, 68, 68); -webkit-text-stroke-color: rgba(0, 0, 0, 0); -webkit-text-stroke-width: 1px; background-color: rgb(246, 246, 246);"><span style="font-size: small;">But at long last the 7990 is becoming fully official. AMD is getting into the game with their own 7990 design, and perhaps more importantly they&rsquo;re doing so while coming to bear with the kind of engineering resources that only a GPU manufacturer can provide. 
This isn&rsquo;t going to be the first 7990 &ndash; that honor belongs to&nbsp;</span><span style="font-size: small;">PowerColor&rsquo;s 7990</span><span style="font-size: small;">&nbsp;&ndash; but this is unquestionably the most important 7990.&nbsp; For AMD and their partners going official doesn&rsquo;t just mean the AMD is taking a greater role in matters, but as we&rsquo;ll see it means changing the rules of the game entirely.</span></p> <p style="box-sizing: border-box; margin: 13px 0px; padding: 0px; border: 0px; font-family: Arimo, sans-serif; font-size: 14px; line-height: 21px; vertical-align: baseline; color: rgb(68, 68, 68); -webkit-text-stroke-color: rgba(0, 0, 0, 0); -webkit-text-stroke-width: 1px; background-color: rgb(246, 246, 246);"><span style="font-size: small;"><a href="http://www.anandtech.com/show/6915/amd-radeon-hd-7990-review-7990-gets-official" target="_blank">Read more...</a></span></p>

To get our weekly geekiness quota out of the way early, the desktop video card industry is a lot like The Force. There are two sides constantly at odds with each other for dominance of the galaxy/market, and balance between the two sides is considered one of the central tenets of the system. Furthermore, when the system isn’t in balance something bad happens, whether it’s galactic domination or uncompetitive video card prices and designs.

To that end – and to bring things back to a technical discussion – while AMD and NVIDIA’s ultimate goals are to rule the video card market, in practice they serve to keep each other in check and keep the market as a whole balanced. This is accomplished by their doing what they can to offer similarly competitive video cards at most price points, particularly the sub-$300 market where the bulk of all video card sales take place. When that balance is disrupted by the introduction of a new GPU and/or new video card, AMD and NVIDIA will try to roll out new products to restore it.

This brings us to the subject of today’s launch. Friday saw the launch of AMD’s Radeon HD 7790, a $149 entry-level 1080p card based on their new Bonaire GPU. AMD had for roughly the last half-year been operating with a significant price and performance gap between their 7770 and 7850 products, leaving the mid-$100 market open to NVIDIA’s GTX 650 Ti. With the 7790 AMD finally has a GTX 650 Ti competitor and more, and left unchallenged this would mean AMD would control the market between $150 and $200.

NVIDIA for their part has no interest in letting AMD take that piece of the market without a fight, and as such will be immediately countering with a new video card: the GTX 650 Ti Boost. Launching today, the GTX 650 Ti Boost is based on the same GK106 GPU as the GTX 650 Ti and GTX 660, and is essentially a filler card to bridge the gap between them. By adding GPU Boost back into the mix and using a slightly more powerful core configuration, NVIDIA intends to plug their own performance gap and at the same time counter AMD’s 7850 and 7790 before the latter even reaches retail. It’s never quite that simple of course, but as we’ll see the GTX 650 Ti Boost does indeed bring some balance back to the Force.

Read more: http://www.anandtech.com/show/6838/nvidia-geforce-gtx-650-ti-boost-review-


Let’s meet the fastest single-processor graphics card with mind-blowing performance. But it is not only the performance that will blow you away…

Read more...

In an industry that has long grown accustomed to annual product updates, the flip of the calendar to a new year brings a lot of excitement, anticipation, speculation, and maybe even a bit of dread for consumers and manufacturers alike. It’s no secret, then, that with AMD having launched most of their Radeon HD 7000 series parts in Q1 of 2012, the company would be looking to refresh their product lineup this year. Indeed, they removed any doubt before 2012 even came to a close when they laid out their 8000M plans for the first half of 2013, revealing their first 2013 GPU and giving us a mobile roadmap with clear spots for further GPUs. So we have known for months that new GPUs would be on their way; the questions were what they would be and when they would arrive.

The answer to that, as it turns out, is a lot more complex than anyone was expecting. It’s been something of an epic journey getting to AMD’s 2013 GPU launches, and not all for good reasons. A PR attempt to explain that the existing Radeon HD 7000 series parts would not be going away backfired in a big way, with AMD’s description of their existing product stack as “stable through 2013” being incorrectly interpreted as an intention not to release any new products in 2013. This in turn led AMD to go one step further to rectify the problem by publicly laying out their 2013 plans in greater (but not complete) detail, which thankfully cleared up a lot of confusion. Though not all confusion and doubt has been erased – after all, AMD has to save something for the GPU introductions – we learned that AMD would be launching new retail desktop 7000 series cards in the first half of this year, and that brings us to today.

Launching today is AMD’s second new GPU for 2013 and the first GPU to make it to the retail desktop market: Bonaire. Bonaire in turn will be powering AMD’s first new retail desktop card for 2013, the Radeon HD 7790. With the 7790 AMD intends to fill the sometimes wide chasm in price and performance between their existing 7770 (Cape Verde) and 7850 (Pitcairn) products, and as a result today we’ll see just how Bonaire and the 7790 fit into the big picture for AMD’s 2013 plans.
