
Tuesday, 31 May 2011

Computex 2011: NVIDIA Announces Wired 3D Vision Glasses

Technically Computex 2011 doesn’t start for another day, but companies are already announcing new hardware ahead of the event in an effort to beat the rush. NVIDIA is one of those companies; at Computex they’re announcing their new wired 3D Vision glasses.

As a bit of background, since the introduction of 3D Vision in 2009 the hardware has remained relatively unchanged. NVIDIA launched with their wireless 3D Vision glasses kit at $199; more recently they dropped the price to $149, but other than a slightly tweaked revision of the glasses offering double the battery life, the glasses themselves haven’t changed. Meanwhile, though wireless glasses are generally the best way to use an active shutter 3D system, they have a few drawbacks: there’s only so much cost cutting that can be done, the need for on-board batteries and USB connectors dictates the design to some degree, and $150 objects that are not tied down tend to grow legs and walk away.

As the market for 3D Vision expands, driven by declining prices for the necessary 120Hz LCD monitors, NVIDIA has finally had to deal with these problems, both to keep costs in-line with cheaper monitors and to build a set of glasses suitable for some new markets. The result is that we’ve come full-circle: 3D shutter glasses are now wired once again.

Being announced today and shipping in late June are the NVIDIA 3D Vision wired glasses, which are intended to fill the above niche. As the name implies, it’s a set of 3D Vision glasses that are wired – in this case using USB 2.0 – instead of the wireless glasses + hub solution that NVIDIA’s wireless kit uses. By ditching the batteries and the IR gear, and by integrating the functions of the 3D Vision hub into the USB connector itself, NVIDIA has been able to cut production costs. Priced at $99, these are intended to be the new low-end glasses to go with the aforementioned cheaper 120Hz monitors, while the wireless glasses will continue to be offered at $149. Besides the fact that the wireless glasses are wireless, it looks like the other feature differentiating the two will be the 3D Vision control functionality the wireless hub offers - convergence controls and turning 3D Vision on & off - as there's no analog on the wired glasses.

The wired glasses will also serve as NVIDIA’s first official foray into the LAN/cyber-café business. As wireless glasses can’t easily be secured and are easily stolen, NVIDIA designed the wired glasses as a practical alternative for café owners that want to offer 3D Vision without their investments walking out the door. The wired glasses feature a Kensington lock slot in the USB connector/hub, which means the glasses can be locked down like the rest of a café’s hardware. We wouldn’t venture to guess just how many cafés are actually interested in offering 3D Vision right now, but clearly NVIDIA believes it’s a worthwhile market to take a chance on.


The existing wireless glasses

Finally, in spite of the wired nature of the glasses, they may end up being more comfortable than the existing wireless glasses. NVIDIA isn’t heavily promoting it, but the wired glasses do have a different fit thanks to the fact that there’s no longer a need to embed batteries, IR receivers, or a controller into the glasses themselves. Among the editors here at AnandTech who have used the wireless 3D Vision glasses, the most common complaint is the fit; as the glasses are one size fits all, we’ve found that they pinch our (apparently) big heads. Although only NVIDIA really has any data on how many users shy away from 3D Vision due to the fit of the glasses, we suspect we’re not the only ones the existing glasses fit poorly – the merits of the technology mean little if the glasses themselves can’t be worn comfortably for any length of time. Once we receive our sample glasses we’ll be taking a look at the fit to see if these are any better than the wireless glasses.

Wrapping things up, the wired glasses will cost $99 and be shipping in late June from NVIDIA and other retailers. NVIDIA hasn’t talked about any long-term plans for the wired glasses, but we wouldn’t rule out the possibility that they’re going to replace the wireless glasses in monitor bundles that already include glasses. A few manufacturers are building the IR transmitter directly into the monitor’s bezel these days, but for the rest this would be another way to bring down the price of a complete 3D Vision kit.


NVIDIA GeForce GTX 560M: High-End Mobile Graphics with Optimus

Our collective wishes have been granted by the fine folks at NVIDIA: you can now buy a notebook with high-end graphics that supports Optimus and thus is capable of offering excellent battery life. NVIDIA is refreshing their GeForce GTX 460M with the 560M. This will be a faster GPU, naturally, using the updated GF116 instead of the 460M's older GF106.


The upcoming ASUS G74Sx will be powered by the GeForce GTX 560M.

Notebooks using the new chip should be available in the near future, though keep in mind that not all notebooks will support all features. Read on for more details.


Monday, 30 May 2011

Linux Dell XPS L501X - switching nvidia off

One of the hybrid-graphics-linux team members has found a way to switch off the nvidia graphics card on the Dell XPS L501X laptop. Here are the details of the process:

https://lists.launchpad.net/hybrid-graphics-linux/msg00450.html

For more information, join the hybrid-graphics-linux team and reply to the email in the mailing list:

https://lists.launchpad.net/hybrid-graphics-linux/
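
For readers who want a concrete starting point, here is a minimal Python sketch of the generic approach, assuming the kernel exposes the vga_switcheroo interface under debugfs and the script is run as root. The linked post may rely on a different, model-specific method (for example direct ACPI calls), so treat this only as an illustration.

#!/usr/bin/env python3
# Illustrative sketch only (not the exact method from the linked post): ask the
# kernel's vga_switcheroo interface to power down the inactive discrete GPU.
# Assumes vga_switcheroo is available, debugfs is mounted at /sys/kernel/debug,
# and the script is run as root.

SWITCH_FILE = "/sys/kernel/debug/vgaswitcheroo/switch"

def power_off_inactive_gpu():
    # Writing "OFF" powers down whichever GPU is currently inactive
    # (the discrete nvidia card on these hybrid-graphics laptops).
    with open(SWITCH_FILE, "w") as switch:
        switch.write("OFF\n")

if __name__ == "__main__":
    power_off_inactive_gpu()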


Switch on/off Nvidia card on Dell Vostro 3700/3500

This URL provides the details to switch on/off the Nvidia card on a Dell Vostro 3700 or 3500 laptop:

https://bbs.archlinux.org/viewtopic.php?pid=882737#p882737

When the nvidia card is switched on, it uses the nouveau drivers, and some stability issues have been found.
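
To check which card is currently active and powered before or after switching, the same vga_switcheroo file can simply be read. This is a hedged sketch under the same assumptions as the example in the previous post (vga_switcheroo present, debugfs mounted, root access), not the exact procedure from the forum post:

#!/usr/bin/env python3
# Illustrative sketch: print which GPU vga_switcheroo considers active and
# whether each one is powered.

SWITCH_FILE = "/sys/kernel/debug/vgaswitcheroo/switch"

def print_gpu_status():
    # Each line looks like "0:DIS: :Off:0000:01:00.0", i.e.
    # id : IGD (integrated) or DIS (discrete) : '+' marks the active GPU :
    # power state : PCI address.
    with open(SWITCH_FILE) as switch:
        for line in switch:
            print(line.rstrip())

if __name__ == "__main__":
    print_gpu_status()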


Good news for linux hybrid graphics on nvidia -- nouveau drivers leap forward @ www.phoronix.com

This is good news for those with hybrid graphics laptops that work with the nouveau drivers but who are unable to install and use the binary closed-source nvidia drivers. As you can read here:

[Phoronix] On Low-End GPUs, Nouveau Speeds Past The NVIDIA Driver

the results of the Phoronix benchmarking show that nouveau, which has been steadily improving since it was merged into the Linux kernel, promises to deliver great performance on a range of nvidia GPUs.


New ASUS U31SD with Nvidia Optimus @ www.engadget.com

ASUS works Sandy Bridge magic on thin-and-light U31E, U31SD, and U36SD -- Engadget

ASUS U31SD

It appears that ASUS is finally ready to show its line of thin-and-light machines some Sandy Bridge love. Swedish site Technytt claims to have the exclusive scoop on a trio of laptops -- the U31SD, U31E, and U36SD -- that will find their way to retail channels in late May. The U31SD is already showing up on the ASUS site, with the option of either a Core i5 2410M or Core i3 2310M, and a choice of Intel integrated graphics or a 1GB GeForce GT 520M card. All three 13.3-inch machines will reportedly have similar specs, though the U31E will supposedly lack a discrete graphics option. The U31SD tips the scales at a perfectly portable 3.9 pounds and it's safe to assume the U36SD will match up size-wise with the svelte U36JG, which is just 0.75-inches thick and weighs 3.5 pounds. There's no official word from ASUS regarding price or availability but, while you wait to get your paws on one, feast your eyes on the U31SD in the gallery below.


Samsung SF310 with nvidia optimus @ www.pcmag.com

Samsung SF310 Review & Rating | PCMag.com

May 10, 2011

Samsung SF310

  • Pros

    Intel Core i5-480M processor. Nvidia GeForce 310M graphics with automatic switching. Lots of extra features, like Bluetooth 3.0, Sleep-and-Charge, and quick restarts. Good battery life.

  • Cons

    Covered ports are less accessible. Wimpy speakers. Middling performance. Internal battery can't be accessed or replaced by buyer. Small hard drive.

  • Bottom Line

    Some buyers will be intrigued by features like Bluetooth 3.0, Sleep-and-Charge functionality, and quick restarts, but more discerning buyers may shun the Samsung SF310 for its underutilized components and small hard drive.


Sunday, 29 May 2011

NVIDIA announces Optimus for Desktops @ www.anandtech.com

NVIDIA Synergy to Bring Optimus to Desktops


Dell XPS 15z with nvidia optimus @ www.engadget.com

Dell XPS 15z review

For years, Dell's been teasing supermodel-thin laptops, each one flawed out of the gate: too pricey, too underpowered, and with underwhelming battery life. This time, Dell told us we'd get something different -- a laptop without compromise. Recently, Round Rock killed off the Adamo and nixed the XPS 14, and then rumors started to spin -- a spiritual successor would be the slimmest 15.6-inch notebook we'd ever seen, be crafted from "special materials" and yet cost less than $1,000. Dell even stated that it would have an "innovative new form factor" of some sort.

The company neglected to mention it would look like a MacBook Pro.

This is the Dell XPS 15z, and we're sorry to say it's not a thin-and-light -- it's actually a few hairs thicker than a 15-inch MacBook Pro, wider, and at 5.54 pounds, it weighs practically the same. It is, however, constructed of aluminum and magnesium alloy and carries some pretty peppy silicon inside, and the base model really does ring up at $999. That's a pretty low price to garner comparisons to Apple's flagship, and yet here we are. Has Dell set a new bar for the notebook PC market? Find out after the break.


Design

Clean lines, smooth curves, and vast expanses of beautifully textured metal, cool to the touch -- the MacBook Pro has captivated Apple fans for years, and there's no doubt Dell's trying to capture much of the same charm. From the aluminum chassis to the placement of the speakers, DVD drive and the majority of ports... heck, even the tiny arrow key bars on the backlit keyboard and the feet on the bottom of the chassis are cribbed from Apple's product.


It's honestly difficult to find anything on the entire notebook that feels wholly original, though there are a few Dell tweaks -- the speakers and vents have the same pattern as those on the Inspiron Duo, and last year's XPS lineup contributed its distinctive hinged screen, which lies flat on top of the notebook rather than forming a traditional clamshell case. You'll also find plenty of chrome trim, ringing both the chassis and the oversized touchpad.

But let's get this out of the way right now: though the XPS 15z most definitely looks like a MacBook Pro and sports similar materials, you'll wind up disappointed if you're expecting the same exacting attention to detail. You're looking at an aluminum and magnesium alloy sandwich here, not a unibody frame, and much of that metal is thin enough to flex under a little bit of pressure. While typing, we noticed that if we put a little weight on the keyboard, we'd oh-so-slightly squish the whole frame, not enough to make a lasting impression, but enough to audibly restrict airflow to the system fan. And -- at least in our pre-production model -- that aforementioned chrome trim had rough edges that slightly chafed our wrists. We also have to laugh at Dell's decision to place all the I/O ports in a row on the left-hand side, just like Apple's rig, as we've often felt Cupertino sacrificed function for form in so doing.

Still, it's a very attractive machine...

Display / Speakers / Keyboard / Trackpad

...and it looks even better when you lift the lid to find this optional 300-nit 1920 x 1080 screen. Yes, while Apple fans still have to settle for a 1680 x 1050 pixel picture in a 15-inch chassis, an extra $150 buys the XPS 15z a full 1080p display, allowing for high-res movies, games, and wonderfully roomy split-screen multitasking. It's a pretty bright, beautiful picture on this particular screen, too, and though the contrast isn't quite as high as we'd like, Dell's software will automatically adjust the backlight to give you the best out of your blacks and whites. It's also quite glossy, unfortunately, and viewing angles are pretty terrible here, as the picture becomes far less vibrant if you shift your head even slightly to the left or right.

That's a shame, considering that the XPS 15z's speakers sound like they're designed to be shared. We're not sure what kind of drivers lie beneath those patterned grilles, but they sure are loud, and create a wide soundstage perfect for movies and games even though they have little bass to speak of. They're also a little shrill at maximum volume, but they're still a cut above most laptop speakers we've used.

Dell's been on a chiclet keyboard bent as of late -- following the global trend -- and while opinions differ regarding whether floating islands or distinct grids make for better typing, we can't complain about the sea of squares on the XPS 15z. As we've already alluded to, Mac users will find themselves completely at home with the layout, and the keys themselves are fairly friendly, too -- rounded, comfortable, slightly convex little squares and rectangles with a smooth, rubbery action and nice big tactile guides on the home row. There's no dedicated numpad here, a bit of a shame considering that there's definitely space, but we suppose some things have to be sacrificed for symmetry and a pair of speakers loud enough to fill the room.

Speaking of symmetry, you'll find the XPS 15z trackpad front and center in the experience, and we're happy to say it's a fairly pleasant one -- the oversized Cypress pad is quick, responsive and accurate for single-finger input, and comes with a pair of large, clicky and satisfying mouse buttons. What's more, it does two, three and four-finger multitouch gestures, though you'll note we didn't include them in the "quick, responsive and accurate" part. Some work amazingly well (swipe four fingers sideways to engage Windows Flip 3D, then drag one to flip through your open apps) and some fail miserably (far too often, the trackpad detected a pinch-to-zoom motion when we intended to do two-finger scrolling). You can tailor gestures at whim in the Cypress settings page, but we were surprised to find that our changes didn't stick. The next time we rebooted the machine, those problematic default settings were back again.

Did we mention that the entire palmrest is made of magnesium alloy, including those speaker grilles? The whole surface you interact with is smooth, durable, and dirt-resistant too, as the only way we were able to leave a visible fingerprint was by touching the glossy screen itself. We should note, however, that the metallic surface is a double-edged sword here. We noticed that our fingers were getting mighty toasty during a benchmark, as if the computer was venting hot air right onto our skin, and during an intensive session of Bulletstorm, we found the magnesium spacers between the crucial WASD keys were burning hot to the touch. It seems that Dell may have put some important silicon right underneath those keys, so you may want an external keyboard at your next LAN party.

Performance and battery life

Yes, you heard us right, a LAN -- the XPS 15z may not be a gaming rig per se, but for $999 there's more than enough power under the hood for a few frag sessions. Even the base model is loaded with a dual-core 2.3GHz Intel Core i5-2410M processor, switchable NVIDIA GeForce GT525M graphics with 1GB of memory, 6GB of DDR3 RAM, a 7200RPM hard drive, and loads of high-end connectivity. You'll find a gigabit Ethernet jack keeping the power socket company around back, two USB 3.0 ports on the left-hand side, along with one eSATA / USB 2.0 combo port, one Mini DisplayPort, and an HDMI 1.4 jack, a pair of 3.5mm headphone and microphone jacks on the right, and a dual-band Intel 802.11a/g/n WiFi radio inside.

What can all that do in practice? Well, we're actually not quite sure about those particular specs, since we actually received the 2.7GHz Core i7-2620M version with 2GB of dedicated graphics memory and 8GB of RAM. That kind of rig will run you $1,534, but it'll also do some potent processing.

Case in point: We launched our Chrome browser with a dozen Engadget tabs, started playing a DVD copy of Hitch, fired up a 720p windowed version of Batman: Arkham Asylum, and started opening windowed 1080p movie trailers for The Dark Knight all at the same time... and it was only after the third concurrent trailer on top of our perfectly playable game session and DVD movie that we started noticing a little slowdown. In other words, multitasking is a go, and in case you're wondering, Windows told us that particular load still only used 80 percent of our available CPU cycles.

The XPS 15z also pulled its weight in a dedicated gaming scenario, playing that same Batman: Arkham Asylum at 1080p with all settings maxed save AA, and managed to deliver 30FPS on average. Similarly, Call of Duty 4: Modern Warfare gave us around 40FPS with 4xAA and all settings maxed. Bulletstorm proved pretty taxing, though: we had to drop resolution to 1366 x 768 and reduce details to medium to make that game playable. If you're aching for some more theoretical benchmarks, our XPS 15z pulled scores of E1511, P894 and X282 in 3DMark11, and earned 8,023 PCMarks in PCMark Vantage and 7,317 in 3DMark06. By the by, boot times weren't amazing, but they're certainly not bad, as we clocked 40 seconds for the machine to load into Windows, or about a minute for the machine to finish loading startup programs and be completely ready for use.

System                                            | PCMark Vantage | 3DMark06      | Battery Life
Dell XPS 15z (Core i7-2620M, GeForce GT525M 2GB)  | 8,023          | 7,317         | 3:41 / 4:26
MacBook Pro (Core i7-2720QM, Radeon 6750M)        | 8,041          | 10,262        | 7:27
Lenovo ThinkPad X1 (Core i5-2410M)                | 7,787          | 3,726         | 3:31 / 6:57
Samsung Series 9 (Core i5-2537M)                  | 7,582          | 2,240         | 4:20
Lenovo ThinkPad X220                              | 7,635          | 3,517         | 7:19
ASUS U36Jc (Core i5 / NVIDIA GeForce 310M)        | 5,981          | 2,048 / 3,524 | 5:30
ASUS U33Jc-A1 (Core i3-370M, GeForce 310M)        | 5,574          | 1,860 / 3,403 | 5:10
Toshiba Portege R705 (Core i3-350M)               | 5,024          | 1,739 / 3,686 | 4:25
Notes: the higher the score the better. For 3DMark06, the first number reflects the score with the GPU off, the second with it on.


We'd mentioned that Dell's previous attempts at premium systems failed price, power and battery life tests. With the XPS 15z, well... two out of three ain't bad. Despite the fact that the NVIDIA Optimus GPU turns off when not fully taxed (powering the laptop's display with integrated Intel HD 3000 Graphics instead), we weren't able to get much more than three and a half hours of regular use out of our review unit. Turning to our standard battery drain test (where we loop a standard-definition video with the screen at roughly 65 percent brightness, and with WiFi on), we saw much the same thing -- 3 hours, 41 minutes of use from the sealed 8-cell, 2.6Ah battery. It occurred to us that perhaps Optimus wasn't actually switching off the discrete GPU at the most appropriate intervals, and sure enough, we were able to eke out a little more runtime by completely disabling it, but you're still looking at 4 hours, 26 minutes of use. That's not bad, all things considered, but it's a good sight worse than the 8.5 hours of life that Dell's advertising here, and if the company wants to make a dent in the MacBook Pro's armor, it'll have to do better than that.

Software and Stage UI

The XPS 15z comes with the usual array of mostly unobtrusive bloatware, including a trial subscription to Norton Antivirus, the token copies of Microsoft Office Starter and Roxio for your disc burning needs -- but there is one thing out of the ordinary, and that's Dell's Stage UI. That's right -- the company's divorced its custom touchscreen interface from the Inspiron Duo and Streak, and turned it into a launcher bar that sits at the bottom of your desktop. There's no need to fear for your Windows 7 taskbar, though, as Stage buttons are just shortcuts to quickly launch your favorite multimedia, and the gallery, audio, video and podcast players are actually rather good-looking in our honest opinion. If you don't care for the bloat, it's all quickly uninstalled. Everybody wins.

Wrap-up

These two laptops are not equals, but they never had to be -- for hundreds upon hundreds of dollars less than the Mac competition, Dell's unleashed an attractive, powerful and definitely desirable Windows PC -- perhaps desirable enough to woo buyers who prefer Windows but love the Mac aesthetic. We suspect that's Dell's plan here, because while we really appreciate the XPS 15z's metal construction and choice parts, it hasn't really changed the game.

It's no lighter, thinner or particularly better armed than the competition, and when it tried to borrow the MacBook Pro's flair it picked up some of Apple's failings along the way. We're not just talking about the inability to have chunky USB peripherals plugged in at the same time, but also the loss of the ability to configure and upgrade the machine at will. While that dual-core Core i7 processor, NVIDIA GT525M GPU, 8-cell battery and DVD drive are nice to have, that's the best you'll get here -- even though Dell's slightly chunkier XPS 15 is configurable with quad-core processors, faster video options and a Blu-ray drive to deliver extra value to that 1080p screen.

When Dell tells you that the XPS 15z has no compromises, that's not quite the case -- it's a solid choice at this price point, but corners were cut to get there. 


Friday, 27 May 2011

Why So Many Mobiles and Tablets Use Nvidia Tegra Chipsets

Many of today’s mobile and tablet manufacturers use Nvidia Tegra chipsets because Tegra brings extreme multitasking with the first mobile dual-core CPU, the best mobile Web experience with up to two times faster browsing and hardware-accelerated Flash, and console-quality gaming with an ultra-low power (ULP) NVIDIA GeForce GPU that delivers outstanding mobile 3D game playability and a visually engaging, highly responsive 3D user interface.
Here are four reasons why so many mobiles and tablets use Nvidia Tegra chipsets to provide a realistic experience to users.

1. Extreme Multitasking - do more, faster
Facebook, Twitter, Pandora, E-mail, Pictures, Movies — everyone is multitasking. And as mobile devices become your primary computing device, you expect the same snappiness and performance that you get on your PC.

2. Web with Flash - the best mobile Web experience
Surf the Web in all its multimedia glory. Web pages and Flash-based content load up to two times faster so you never have to skip a beat.

3. Console-Quality Gaming - take your gaming on the road
For the first time ever, get console-quality gaming on your mobile device and the fastest graphics performance. Whether you're into racing games, sports games, or even role-playing games (RPGs), experience the future of mobile gaming on your NVIDIA Tegra powered super phone or tablet.

4. HD Video - HD movies, games and more — big time
Watch 1080p HD movies, view photos, and play games on your mobile device without compromising battery life.


Sunday, 22 May 2011

AMD Radeon 6990 vs. Nvidia GTX 590. Do We Have A Winner?


In my opinion, the short answer is no. I have been reading and watching literally dozens of reviews and benchmarks on these two cards and I have come to one conclusion. When it comes down to what these cards are designed to do, and that's offer up a single card dual-GPU solution for gaming, they are both evenly matched. Period.

So here we have the two flagship cards from AMD and Nvidia for this generation, once again competing for dominance, and once again fans on both sides of the fence are demanding a clear-cut winner. "So what card is better?!?!?!"

Many people will not do any actual research or make up their own minds. They always want to be told what to think and then follow the crowd, no matter what side of the fence they may be on, and then blindly proclaim that their card is better, as if it's some sort of pissing contest. I see countless replies and user comments like the following from several different reviews.

  • "do the math man, the nvidia card is around 16% faster without considering Hawx 2. if you add Hawx bench theres a 25% diference between them" - Obviously this user has no idea how to do statistical math as he is basing his numbers on a few game benchmarks from a single source.
  • "amd radeon graphics sucks!!!NVIDIA IS THE BEST !" - I do love this user's advanced understanding of GPU technology. /sarcasm
  • "nvidia is a joke, everyone knows AMD is better, nvidia fanboys are just too stupid to admit it" - Seriously? Where does this logic come from?

These are just three of literally hundreds of such comments on both sides of the fence I have found while doing all my research. There are just far too many people out there that really have no clue and constantly show their inability to actually think for themselves. They look at one review, allow themselves to be tainted by one train of thought, and never stop to actually weigh all the facts about the technology they obviously have no real understanding of.

Then you have those who readily admit they don't have a good technical understanding of current GPUs and are just trying to decide what will work best for them, and they are bombarded with all the aforementioned idiocy, left to sift through it all without really knowing which way to turn, often getting drawn into misinformation or fanboy proclamations.

So that's why I decided to write this blog entry. I want to share the FACTS about both cards and then share my OPINIONS on both cards, and why I would choose the one I would. I wanted to compile the information I have gathered from those dozens of reviews I have read and watched all in one place so anyone who happens across my humble little blog can make up their own mind and not be bombarded by "THE HD 6990 IS BEST" or "THE GTX 590 IS KING" opinions of many of the reviewers.

The facts, information, and approximated benchmarks included below are compiled from dozens of different sites and reviews; here are just a few of the ones I visit the most. There are several other sites I came across while doing this specific research, but these are the most popular and the ones I recommend the most.
  • Tom's Hardware
  • Motherboards.org
  • Hardware Heaven
  • Linus Tech Tips
  • Overclock 3D
  • Geeks 3D
  • Tech PowerUp
  • PassMark Software
  • Legit Reviews
  • Hot Hardware
  • AnandTech
  • And many more individual reviews and benchmarks
Basically, I wrote down ALL the test results I found in ALL of my research and then averaged them out and rounded to the nearest five. Because let's be honest, does 1-4 FPS make a difference when we are talking numbers this high? No. So let's look at the facts and the numbers for both cards.
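
To make that concrete, here is a small illustrative Python sketch of the averaging and rounding just described; the per-site FPS numbers in it are made up for the example, not taken from any particular review.

def average_to_nearest_five(samples):
    # Plain mean of the per-site results, rounded to the nearest multiple of 5.
    return 5 * round(sum(samples) / len(samples) / 5)

# Hypothetical per-site FPS results for one game (one value per review site):
hd_6990_fps = [101, 97, 108, 95]
gtx_590_fps = [104, 99, 110, 102]

print("HD 6990:", average_to_nearest_five(hd_6990_fps))  # -> HD 6990: 100
print("GTX 590:", average_to_nearest_five(gtx_590_fps))  # -> GTX 590: 105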

Gaming Performance

I chose to list only DX10 and DX11 capable games and no DX9 games since to be perfectly honest, DX9 is becoming outdated as are the games that use it. These cards are designed to take advantage of DX10 and DX11, pure and simple.

I listed the average overall FPS scores from over a dozen different benchmark sites, including the ones I list above. This way you can get a better overall comparative average on both of these cards and how they perform across different test benches.

I did not list any specific brands since everyone has their own personal preference, and from everything I have read, they all perform almost identically. So you are free to choose your brand. Besides, if you want to stick with a specific brand, then you are better off focusing your research on that brand.

All of the benchmarks were at stock speeds unless otherwise noted with an "OC" indicating they were overclocked.

I also tried to list the same basic graphic settings and most common resolution (1920x1080) across all the reviews to keep it all as level as possible for the best possible comparison.

Now, to the games!

Bulletstorm DX10 (Average FPS - Max Settings - 8xAA - 1920x1080)
  • HD 6990 OC: 105
  • GTX 590 OC: 105
Dragon Age 2 DX11 (Average FPS - Max Settings - 8xAA - 1920x1080)
  • HD 6990 OC: 70
  • GTX 590 OC: 45
Stalker Call of Pripyat DX11 (Average FPS - Default Settings - 1920x1200)
  • HD 6990: 100
  • GTX 590: 105
Alien vs Predator DX11 (Average FPS - Default Settings - 1920x1200)
  • HD 6990: 100
  • GTX 590: 85
Hawx 2 DX11 (Average FPS - High Settings - 8xAA - 16xAF - 1920x1200)
  • HD 6990: 190
  • GTX 590: 250
Battlefield Bad Company 2 DX11 (Average FPS - Highest Setting - 8xAA - 8x/16x MSAA/AF - 1920x1080)
  • HD 6990: 130
  • GTX 590: 130
F1 2010 DX11 (Average FPS - Ultra Setting - 8xAA - 1920x1080)
  • HD 6990: 100
  • GTX 590: 85
Metro 2033 DX11 (Average FPS - High Settings - AAA - 4xAA - 1920x1080)
  • HD 6990: 55
  • GTX 590: 55
Lost Planet 2 DX11 (Average FPS - High Settings - 8xMSAA - 32xCSAA - AA - 1920x1080)
  • HD 6990: 85
  • GTX 590: 90
Just Cause 2 DX11 (Average FPS - Highest Settings - 8xMSAA - 16xAF - 8xAA - 1920x1080)
  • HD 6990: 75
  • GTX 590: 70

So as you can see, both of these cards trade punches back and forth, and they both kick ass no matter how you look at it. In most cases, the differences in FPS are minimal, and in EVERY SINGLE CASE totally unnoticeable without benchmark software running in the background.

What does all that mean? Simple: there is NO clear-cut winner when it comes to framerates between the HD 6990 and the GTX 590. It will boil down to the game YOU play and your own personal preferences.

Synthetic Benchmarks

I put very little stock in these numbers since I do not play synthetic benchmarks, I play video games, but I know a lot of people do. They can be a general overall indication of how a card may perform, but they are intended as a standardized platform for scoring based on the benchmark software being used, not as an end-all measure of how the card will perform in individual games and on individual systems. But here are the numbers anyway. Again, averaged over multiple sites.

3DMark 11 Overall Scores
  • HD 6990 GPU: 9300
  • GTX 590 GPU: 8930
  • HD 6990 3D Marks: 9180
  • GTX 590 3D Marks: 8820
3DMark Vantage
  • HD 6990: 34500
  • GTX 590: 38000
  • HD 6990 GPU: 22500
  • GTX 590 GPU: 19400
Again, notice a trend here? The numbers flop back and forth between both cards and are not exactly earth-shattering in their differences.

So what does it come down to now? When it comes to gaming, both the AMD HD 6990 and Nvidia GTX 590 are pretty darn evenly matched. So now it's really going to come down to the individual user. Here are some other things you can factor in.

Power Consumption and Heat

Both cards are rated for the same wattage; however, the HD 6990 will draw more under load than the GTX 590.

Thus, the HD 6990 can run hotter than the GTX 590. The HD 6990 does vent more air outside of the case, whereas the GTX 590 vents more air into the case.

So this will come down to your power needs, case cooling, and your personal preference yet again.

Noise Level

The HD 6990 is a considerably louder card than the GTX 590 when those fans ramp up to speed under load.

So if you want quiet, then the GTX 590 is better suited for that. If you don't care or control your fan speeds manually, then the HD 6990 may fit your bill nicely. Either way, here we are sitting at personal preference again.

Features

These can be deal makers and breakers for some people, including myself, so we better talk about those.

Do you want CUDA and PhysX Engine support? Looks like you better go with the GTX 590, or any other modern Nvidia GPU.

How about multi-display support? Both the HD 6990 and GTX 590 have it. So which one is "better"? Well, that's going to depend on a few different things.

AMD Radeon cards have had Eyefinity for 3+ display support for a long time now, and they can also support more than 3 displays, whereas the GTX 590 cannot; it is limited to 3 displays. Also, the extra GB of VRAM the HD 6990 has over the GTX 590 will come in handy when gaming on multiple displays, especially if you go nuts and push it to 5 displays.

So in the multi-display gaming category, the HD 6990 would be the better choice.

Want SLI and Crossfire scaling? Well, that depends on the games you play of course, but overall in this case, the GTX 590 scales better than the HD 6990. So the GTX 590 may be the way to go if you are one of those people wanting to run two dual-GPU cards in a quad-GPU setup.

If you're going to scale and you want to use AMD, you're much better off going with a 6950 or 6970; they scale much better than the 6990.

What about the BIOS/OC switch? The HD 6990 has it and the GTX 590 does not. So what does that mean? Flip a switch and you instantly OC the 6990 to 880MHz, and it does have a noticeable impact on performance. So do you want BIOS-level overclocking? Do you want it under warranty? Then you may want to go with the HD 6990, and more specifically, the XFX HD 6990.

Not to mention, that switch toggles between two separate BIOSes on the card, a protected one and a flashable one, the same as on the 6950 and 6970. If you ever flash the one BIOS and it fails for some unforeseen reason, it's no big deal; no more worries about bricking your card. Just flip the switch back to the protected BIOS and re-flash the failed one.

Final Thoughts

Yes, I am an AMD user, but seriously, how can anyone crown one card the overall winner over the other when they are so evenly matched? Simply put, you can't. Both of these cards run neck and neck, trading punches back and forth all the way down the charts, and they BOTH are still standing at the end of it all. Both of these cards kick ass, pure and simple.

So in the end, like I have said multiple times throughout this review, it's going to come down to YOU and your own preferences. For the love of Pete, if you are on the fence, do some research. Think for yourself, make an educated decision. Once you have done all that, and you still can't decide, go with whatever your fanboy heart tells you to.

My Personal Choice?

XFX Radeon HD 6990. Here are my personal reasons.

  • Both cards are even when it comes to gaming. So no preference there.
  • I couldn't care less about synthetic benchmarks. So no preference there.
  • Power consumption and heat. I have a power supply that's more than capable of powering both cards. I have a case that's more than capable of keeping them both nice and cool. So again, no preference here either.
  • Noise level. The HD 6990 is louder, but I am one of the few that likes that. I was raised on loud cars and even louder music. I also have a case with 7 huge case fans, not including those on the processor, PSU, and RAM. I also game with high quality headphones and manually control my fan speeds. So I have to go with the HD 6990 on this one.
  • CUDA and PhysX. I don't care for either. Why? AMD has DirectCompute and that works just fine for me. Far fewer games support PhysX than don't. The simple fact of the matter is, most of the games out there and coming out over this next year do not use PhysX. They either use Havok or their own physics engine. Game developers are not willing to alienate such a large portion of their customer base and use PhysX when it's proprietary and only allowed to run on Nvidia cards, not when it comes to their money-maker titles. It's far more logical to use an engine, or make one yourself, that runs on all cards, regardless of brand. So I have to go HD 6990 here too.
  • Multi-display support. I like Eyefinity; Nvidia is just too new to multi-display gaming. The only reason the GTX 590 can even support 3 displays is because it's a dual-GPU card, whereas most modern AMD/ATI cards have support for 3, or more, displays. So the HD 6990 would be my choice. Not to mention that extra GB of VRAM.
  • SLI or Crossfire. The GTX 590 scales better than the HD 6990. However, I would never run a multi-card setup with either card. There is no point. It's overkill and the only benefit from it would be bragging rights and synthetic scores. Neither means a damn thing to me. If I am going to run a multi-card setup, I would run dual 6950's for under $500. You get great scaling, anywhere from 30 to 100% depending on the game. It's cheaper than both the 6990 and 590, and you get real performance boosts. Anything beyond that is back to bragging rights and synthetic numbers again. When we are already talking FPS numbers this high and stable with two cards in SLI or Crossfire, there is no point to go any further. The game will see no gameplay benefits. So I have no preference here.
  • The BIOS/OC switch. I like that feature. I never have to worry about bricking my card and I can get that instant OC. If I buy XFX, my card is still covered when running at the OC rate of 880MHz. So I have to choose AMD over Nvidia on this one.

So there is my own personal thought process and why I would buy the HD 6990 over the GTX 590. That, and for the simple fact that they are just too evenly matched and I would just have to go with my "fanboy" roots.

The point of all this is: use your own thought process and make up your own mind, because based on all the facts these cards are equal. So I have to give kudos to both Nvidia and AMD for giving us two kick-ass cards to choose from that have the power to deliver great performance across the board. When the race is this close, the real winners are the end users. We are the ones that end up benefiting from the race, with each company pushing the envelope to try and come out on top and thus giving us more and more power to play with.

In the end, though, why did I do all this? Do I plan on actually buying one of these dual-GPU cards? The answer is no. I am very happy with my XFX 6950 2GB card, as it powers through any and all games at maximum settings at 1920x1080 with no problems. I do plan on going Crossfire, but it will be two 6950's, and that will have me set for what I do, game, for a good time to come. Both of these cards are overkill for someone like me and, in my opinion, a waste of money when available SLI and Crossfire solutions are often less expensive, offer similar performance, and provide enough of an overall boost to keep any gamer playing at max everything. However, these are my own personal opinions. Like I say over and over, I am not an enthusiast, I am a realist.

I did this because even though I am an admitted AMD "fanboy", I get sick of all the BS, blind proclamations, and misguided information I see when I sit down and research these sorts of things. So it was a way for me to vent a little, bring together facts and information in one location for others to compare, and share my opinions. If someone finds all this useful, then great; I am not looking for anyone to agree or disagree with me, just to think for themselves and look at the entire picture.

So who is the winner?

WE ARE!

In most game benchmarks these cards are within only a few percent of each other. In a few they are all over the place because Nvidia and AMD cards have completely different architectures. In each and every case, these cards trade blows back and forth, and they BOTH come out standing at the end. Both cards would be fantastic for anyone looking for the best dual-GPU solution out there.

With AMD and Nvidia both competing this hard for top billing and the title of "Fastest Video Card in the World," we will be the ones that come out on top, with better and faster-maturing drivers, new features and technologies, and thus better gaming performance.

Want to follow the battle anyway? http://www.legitreviews.com/news/10358/

No matter what either company says, I stand by my conclusion: these cards are equal.


Friday, 20 May 2011

Nvidia Graphic Card Geforce 8800GTX Review

If you want the fastest graphics card on the planet, buy an Nvidia GeForce 8800 series card; if you want the fastest gaming system on the planet, buy two Nvidia GeForce 8800GTX cards and put them together using SLI.

The GeForce 8800 series has a new 90 nanometre chip with over 680 million transistors; it is considered to be the largest graphics chip ever built. The fastest version, the GTX, is clocked at 575 MHz, while the GTS version clocks in at 500 MHz.

Nvidia graphic card EVGA Geforce 8800 GTX

The GeForce 8800 series graphics cards are the first to use a DirectX 10 chip. "It is also the first to support the 'unified' architecture", in which the same "pipeline calculates vertex, pixel and geometry Shader information".

"Nvidia graphic card Geforce 8800 GTX has a core working at 575MHz. GDDR3 memory works in a rather odd 384-bit mode and it has twelve chips, for a total of 768MB of memory. While we all expected 1024 would be the next step, Nvidia decided to go for 768MB due to its different memory controller. Nvidia clocked the memory at 1800MHz with a respectable total bandwidth of 86.4 GB/s and a fill rate of 36.8 billion a second."

The GeForce 8800 series' unified parallel shader design supports 128 individual stream processors running at 1.35GHz; the speed numbers came from Nvidia. Because DirectX 10 is only available on Windows Vista, the card's DirectX 10 features require a Vista operating system.

This card is hot, literally. It has two 6-pin power connectors, which means the card can draw 2x75 watts from the cables plus an additional 75 watts from the PCIe bus. This brings total power consumption to an amazing 225 watts.

The Nvidia graphic card Geforce 8800 GTX "has a dual slot cooler - massive and heavy but it does the job. The card always worked around 55 Celsius in 2D mode and a bit higher in 3D."

Nvidia Graphic Card Leadtek Geforce 8800 GTX

The test of two cards

The first card tested was the EVGA Geforce 8800 GTX with a brand-new ACS3 cooler.
"The ACS3 cooler is actually more efficient than Nvidia's reference cooler, as the card works at 55C in 2D mode while the Nvidia reference cooler only cools the card to 60 Celsius. This is what makes the difference between the EVGA card and the rest of the cards based on the reference cooler design." The box the card comes in is super small.

The second card was a Leadtek Geforce 8800 GTX card with SLI, or Scalable Link Interface, for linking two graphics cards together. The Leadtek card supports HDCP, or High-bandwidth Digital Content Protection, and the driver CD also includes Vista drivers.

The driver for both cards has a couple of new features. One new feature is support for 16X anisotropic filtering. The big news is that Nvidia finally supports 8X and 16X FSAA. Nvidia's Lumenex is a marketing name for incredible image quality that includes support for 16X anti-aliasing, 128-bit HDR (high dynamic range) and support for 2560x1600 resolution with a high frame rate.

Benchmarking: the cards were tested using the following:
"Foxconn C51XE M2aa Nforce 590 SLI motherboard
Sapphire AM2RD580 with SB600 board for Crossfire
Athlon FX 62 2800 MHz 90 nanometre Windsor core
2x1024 MB DDR2 Corsair CM2X1024-6400C3 memory
Seagate Barracuda 7200.9 500GB SATA NCQ hard drive
Thermaltake Mini Typhoon Athlon 64/X2/FX cooler and Intel CPU's
OCZ 700W GameXstream power supply"

The two nvidia graphics cards were tested in 3DMark03. A "single EVGA G80 card, EVGA eGeforce 8800GTX ACS3 575/1800MHz scores 30485. It is just slightly faster than the 7950 GX2 card, 2500 slower than Crossfire and more than 10000 faster than a single fastest X1950XTX card."

The EVGA-Leadtek 8800GTX SLI 2x 575/1800MHz combo scores 49390, almost 50K, practically a perfect score for 3DMark03. With SLI it is sixty-two percent faster than a single card. A single GeForce 8800 graphics card is fifty-four percent faster than an X1950XTX. In the vertex shader test the GeForce 8800 graphics card is twice as fast as ATI's fastest card.
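
As a quick sanity check of that scaling figure, the percentage can be recomputed from the two 3DMark03 scores quoted above (30485 for the single card, 49390 for the SLI pair); this is just illustrative arithmetic, not part of the original review.

single_card_score = 30485  # single EVGA 8800 GTX in 3DMark03
sli_pair_score = 49390     # EVGA + Leadtek 8800 GTX pair in SLI

# Percentage uplift of the SLI pair over a single card.
uplift = (sli_pair_score - single_card_score) / single_card_score * 100
print(f"SLI uplift: {uplift:.0f}%")  # -> SLI uplift: 62%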

"Nvidia Graphic card EVGA eGeforce 8800GTX ACS3 575/1800MHz scores is eighty four percent faster than the ATI's X1950XTX in Shader model 2.0 test and seventy one in Shader model 3.0 / HDR testing."

List of computer video games used in the test:
"Doom 3 scores around 135 FPS at the first three resolutions and drops to 125 at 20x15; SLI even then scores 135, so this is clearly CPU limited. EVGA eGeforce 8800GTX ACS3 575/1800MHz is almost 80 percent faster at 2048x1536. Doom 3 with effects on scores 130 FPS at the first two resolutions and later starts to drop, but is still faster than the 7950 GX2 cards all the time. SLI doesn’t drop at all at the first three resolutions and only slightly drops at 20x15.

FEAR scores are sky high, with the weakest score of 95 FPS at 20x15, faster than Crossfire in the last two resolutions and faster than the GX2 and X1950XTX at all times. It is up to 66 percent faster than the X1950XTX and 68 percent faster than the Gainward 7950 GX2 card. SLI is again 68 percent faster than Crossfire, a massive difference.

Quake 4 runs up to forty-seven frames faster on the G80, and SLI improves the score further, but not by much. The G80 is always faster than the GX2 and Crossfire. Quake 4 with FSAA and Aniso runs some forty percent faster than ATI's fastest card, and SLI G80 is 30 percent faster than Crossfire.

Far Cry with effects on is a performance match between the G80 and X1950XTX, while SLI can outperform both at 20x15.

Serious Sam 2 is faster on ATI in the first two resolutions by three frames, while EVGA eGeforce 8800GTX ACS3 575/1800MHz wins by eight to fifteen frames at higher resolutions. SLI is 23 percent faster than a single G80 and 43 percent faster than the X1950XTX.

Serious Sam 2 with FSAA and Aniso on always scores faster on the EVGA eGeforce 8800GTX ACS3 575/1800MHz card, but not by that much, some nine to ten percent, while SLI is 54 percent faster than a single card and sixty-eight percent faster than ATI's card."
The two Nvidia Geforce 8800 graphic cards connected using SLI.

In Short
The Nvidia GeForce 8800 series graphics cards are amazing, beating out the latest ATI card in virtually every test. The card is pricey; expect to pay around $600 or more for one. It is a very long card and it does get hot, but it can be considered the fastest graphics card on the market right now, and it is very stable as well.

As stated at the beginning: if you want the fastest graphics card on the planet, buy an Nvidia GeForce 8800 series card; if you want the fastest gaming system on the planet, buy two Nvidia GeForce 8800GTX cards and put them together using SLI.


Sunday, 15 May 2011

Release 270.51 NVIDIA Beta Drivers

On 30th March, NVIDIA made the Release 270.51 beta drivers official and available to users. As the latest NVIDIA drivers, they promise a big boost over the older release, providing more features and extra performance.

It is said that the Release 270.51 NVIDIA beta drivers can boost gaming performance by up to 6 times in certain cases. Additionally, they include NVIDIA Update, which automatically searches for and installs the latest GeForce drivers on your machine.

The Release 270.51 NVIDIA beta drivers fully integrate the 3D Vision drivers into the main driver package and support 3DTV Play, Windows Aero, and 3D Vision notebooks.

The Release 270.51 NVIDIA beta drivers also add support for applications using the new CUDA 4.0 features.

Check out the complete list of features of the Release 270.51 NVIDIA Beta Drivers
 
Download the Release 270.51 NVIDIA Beta Drivers


Saturday, 14 May 2011

NVIDIA GT 440 GV-N440TC-1GI Video Card

Gigabyte has launched a new video card today that uses the NVIDIA GT 440 GPU. The new card carries part number GV-N440TC-1GI and it is aimed at the lower end of the gaming and technology spectrum. The card has some nice features and requires at least a 400W PSU.
Features of the card include 512MB of GDDR5 onboard, with 1GB total thanks to TurboCache. The core clock of the video card is 830MHz, the shader clock is 1660MHz, and the memory clock is 3200MHz. The card supports DirectX 11 and has several connectivity options.

Connectivity includes D-Sub, DVI, and HDMI outputs. The card fits into a PCI-E 2.0 slot and has a 128-bit memory interface. TurboCache uses some of the computer's system memory to add more RAM for video use. The GPU uses a 40nm process and supports PhysX, CUDA and more. Pricing and availability are unknown.


Wednesday, 11 May 2011

NVIDIA Synergy to Bring Optimus to Desktops

We first encountered NVIDIA’s Optimus Technology in February of last year. It has done wonders for laptop battery life on midrange systems, where manufacturers no longer need to worry about killing mobility by including a discrete GPU. Over the past fourteen months, we have seen the number of Optimus enabled laptops balloon from a few initial offerings to well over 50—and very likely more than 100. That sort of uptake is indicative of a successful technology and feature, and while we do encounter the occasional glitch it’s not much worse than the usual driver bugs we deal with.

If you were among those who thought, “This sounds like a great technology—when will they bring it to the desktop?” you’re not alone. So far it has only been available in laptops, and even then we haven’t seen any notebook vendors support the technology with anything faster than a GT 555M (i.e. there are so far no notebooks with GTX GPUs that support Optimus; the closest we get is Alienware’s M17x, which uses their own BinaryGFX switching technology).

Previously, the lack of switchable graphics on desktops—particularly something as elegant as NVIDIA’s Optimus—hasn’t been a big deal. That all changed when Intel released Sandy Bridge and introduced their Quick Sync technology. In our Sandy Bridge review we looked at Quick Sync and found it was the fastest way to transcode videos, providing up to double the performance of an i7-2600K CPU and potentially four times the performance of dual-core SNB processors. Unfortunately, there’s a catch: as we mentioned in our SNB review, Quick Sync works only if the IGP is enabled and has at least one display connected.

This limitation is particularly irksome as the only way you can get the IGP is if you use the H67 chipset (and give up the overclocking and enthusiast features offered by P67). The Z68 chipset should provide both overclocking and IGP support in the near future, but you’re still left with the IGP use requirement, making Quick Sync essentially unavailable to users with discrete GPUs—who are very possibly the most likely candidates for actually making use of the feature.

There appears to be some good news on the horizon. It’s hardly a surprise, as we’ve suspected as much since Optimus first reared its head, but VR-Zone reports that NVIDIA is finally bringing the technology to desktops. There’s a name change, as it will now go by the name Synergy (though you may also see it referred to as Desktop Optimus at times). Rumors are that Synergy will see the light of day late next month or in early June.

While it’s true that you can already get access to Quick Sync while using a discrete GPU using Lucid’s Virtu, there are a few differences worth noting. First and foremost, Synergy is software based, free, and requires no license agreement. Any recent NVIDIA GPU (400 or 500 series) should work on H67, H61 or Z68 chipset motherboards. (P67 does not support the SNB IGP and thus won’t work.) You’ll need the appropriate drivers and BIOS (and maybe VBIOS), but that should be it. No special hardware needs to be present on the GPU or motherboard and anyone with the appropriate GPU and motherboard chipset should have the option of using Synergy.

This is in contrast to Virtu, which only comes bundled with certain motherboards and incurs a price premium on those boards. However, Virtu still has the advantage of working with both AMD and NVIDIA GPUs. Owners of AMD GPUs will have to rely on Virtu or wait for AMD to come out with their own equivalent to Virtu and Synergy.

One final note is that both Virtu and Optimus/Synergy function in a similar fashion at a low level. There are profiles for supported games/applications, and when the driver detects a supported executable it will route the API calls to the discrete GPU. Here’s where NVIDIA has a big leg up on Lucid: they’ve been doing Optimus profiles for over a year, and while Lucid now lists support for 157 titles, NVIDIA has a lot more (and the ability to create custom profiles that generally work). You also don’t have to worry about new GPU drivers breaking support with Virtu, as NVIDIA handles all of that in their own drivers.

We’ll certainly be keeping an eye out for Synergy and will report our findings when it becomes available. In the meantime, gamers interested in Quick Sync as well as people looking to cut down on power use when they’re not using their GPU have something to look forward to. Now bring on the Z68 motherboards, Intel.


Thursday, 5 May 2011

NVIDIA Announces CUDA 4.0

The last time we discussed CUDA and Tesla in depth was in September of 2010. At the time NVIDIA had just recently launched their lineup of Fermi-powered Tesla products, and was using the occasion to announce the 3.2 version of their CUDA GPGPU toolchain. And though when we’re discussing the fast pace of the GPU industry we’re normally referring to NVIDIA’s massive consumer GPU products arm, the Tesla and Quadro businesses are not to be underestimated. An aggressive 6 month refresh schedule is not just good for consumer products it seems, but it’s good for the professional side too. NVIDIA’s CUDA team seems to have taken this to heart, and so here we are just shy of 6 months later with NVIDIA preparing to launch the next version of CUDA: CUDA 4.0.


Wednesday, 4 May 2011

Nvidia challenges ATI Radeon by releasing the GT 520


My earlier posts showed how these big companies keep challenging each other with enthusiast-level graphics cards such as the GTX 590 and HD 6990. But even while they battle at the enthusiast level, Nvidia won't concede the entry level either. Nvidia has now released its cheapest graphics card with DirectX 11 support to challenge the Radeon HD 6450.
The new GT 520 takes its place as the cheapest DirectX 11 GPU in Nvidia's product lineup, offering lifelike characters and terrain in DirectX 11 games at lower resolutions, and support for viewing Blu-ray movies, photographs, and Web content – all in stereoscopic 3D, courtesy of Nvidia's 3D Vision feature.
This card is equipped with 48 CUDA cores, 1GB of GDDR3 memory on a 64-bit bus, an 810MHz core clock, and 1800MHz memory. The expected retail price is approximately USD $50.


AMD takes on Nvidia in a mobile battle


We know that last year, in 2010, Nvidia brought its Tegra chips to market, and they found their way into a number of great tablets and smartphones, particularly Android devices. Nvidia Tegra brought big improvements, especially in device graphics.
Nvidia has dominated that device market for several years. But now AMD sees a chance and has already been recruiting Android engineers. MSI is marketing its latest tablet PC built using AMD's Brazos APUs, and the fact that such a beast exists indicates that AMD's APUs have increased their presence in the PC tablet market.
AMD is reportedly recruiting talent to develop Android driver software – that means it could offer notebook and tablet PC partners chipsets that support Android at a later date, the source noted.
On the other hand, Nvidia is not only putting GPUs into phones; in time it will probably also control the dual-core device market. AMD has a lot of ground to make up if it wants to catch up.


GIGABYTE Unveils NVIDIA GeForce™ GTX 550 Ti Overclock Edition Graphics Cards


Gigabyte Technology Co. Ltd., a leading maker of motherboards and graphics cards, presents an overclocked edition of the GeForce™ GTX 550 Ti named GV-N550OC-1GI. This graphics card is built on a 40nm process and equipped with GDDR5 memory; it is factory overclocked and outperforms the standard card by 6% (based on 3DMark Vantage Extreme Mode). You won't be disappointed by the design either, because it uses the latest cooler design from Gigabyte. With this cooler and Gigabyte Ultra Durable VGA materials, the GV-N550OC-1GI runs much cooler, and thanks to the Ultra Durable materials it is also quieter than the standard version. For features, it is equipped with NVIDIA 3D Vision™, SLI, CUDA™, PhysX®, and Microsoft DirectX 11 technologies.

This graphics card combines overclocking ability and stability to provide gamers with a reliable gaming environment. Even though the fan is much larger, the noise level runs from 21.7 dBA to 28 dBA at the loudest. That is really quiet for a 10 cm fan, and with all of this innovation the card runs at a heat level 8% lower than the generic version.
 
 Gigabyte’s own Ultra Durable VGA Technology guarantees better overclocking capability, lower GPU temperatures, and great power efficiency by using a 2 oz copper PCB, Samsung and Hynix memory, Japanese solid capacitors, ferrite/metal core chokes, and low RDS(on) MOSFETs. With industry-leading quality, GIGABYTE graphics cards can satisfy the most enthusiastic gamers.
For more details of GIGABYTE GV-N550OC-1GI, please visit the GIGABYTE VGA website: http://www.gigabyte.com/products/main.aspx?s=43

