Thursday, February 22, 2018

Sparkle GeForce GTX 260 Plus 1792MB Video Card Review


The NVIDIA GeForce GTX 260 was originally released in June 2008, nearly a year and a half ago. At the time, it and its big brother, the GTX 280, were the most powerful single-GPU gaming cards on the market. Its $400 launch price was pretty steep, but it was still $150 less than the only two cards that really came close: the GTX 280 and the Radeon HD 4870 X2, which was released a couple of weeks later.

Though NVIDIA has recently come out with a few newer cards, the GTX 295, GTX 285, and GTX 275, the venerable GTX 260 is still quite a powerful gaming card. And it isn't quite the same card it was at launch: originally built on a 65nm process, it has since moved to 55nm, and its original 192 stream processors have grown to 216.

I recently commented that the GTX 260 896MB is probably the best bang-for-the-buck gaming card deal going. It is a lot of video card for around $175, and it delivers respectable frame rates even when paired with inexpensive processors like the Phenom II X3 720.

Today I will be looking at a GTX 260 from Sparkle, the GTX 260 Plus. This isn't your typical GTX 260, as it sports 1792MB of memory. I have reviewed a couple of Radeon HD 4870 X2s that each carried two gigabytes of memory, but those were really two 4870s with 1GB per GPU. This is a single GPU with nearly two gigabytes all to itself. Will this massive amount of graphics memory make a difference? Read on to see!

Special thanks to Sparkle for providing us with the GeForce GTX 260 Plus Video Card to review!

Specifications:
Model Number: SXX2601792D3S-VP
Interface: PCI Express 2.0 x16
GPU: GeForce GTX 260
Core Clock: 576MHz
Shader Clock: 1242 MHz
Stream Processors: 216 Processor Cores
Memory: 1792MB GDDR3
Memory Clock: 2214 MHz
Memory Interface: 448-bit
RAMDAC: Dual 400 MHz RAMDAC
Max Resolution: 2560 x 1600
Power Connectors: 2 x 6 Pin PCI-E
Ports: 2 x DVI
3D API: DirectX 10, OpenGL 2.1
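As a quick sanity check on the spec sheet, the 448-bit memory interface and 2214 MHz memory clock together determine the card's peak memory bandwidth. Assuming the listed 2214 MHz is already the effective (double data rate) figure, a short calculation puts it at roughly 124 GB/s:

```python
# Peak theoretical memory bandwidth from the specs above.
# Assumption: the listed 2214 MHz is the effective (DDR) data rate.
effective_rate_mhz = 2214      # memory clock, effective
bus_width_bits = 448           # memory interface width

bytes_per_transfer = bus_width_bits / 8                       # 56 bytes per clock
bandwidth_gb_s = effective_rate_mhz * 1e6 * bytes_per_transfer / 1e9

print(f"Peak memory bandwidth: {bandwidth_gb_s:.1f} GB/s")    # ~124.0 GB/s
```

Note that the standard 896MB GTX 260 uses the same 448-bit bus, so the doubled memory on this card adds capacity, not bandwidth.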

Features:
– 55nm manufacturing process for lower power consumption
– Dual-link DVI supported
– HDCP ready
– Warranty: Lifetime Limited Parts and Labor

Packaging
The card comes in a sleeved, white-themed box with the typical icons denoting the card's capabilities and features.


Sparkle GeForce GTX 260 Plus 1792MB Video Card

Inside, the card is packed in bubble wrap and separated from the bundled accessories by cardboard.


Sparkle GeForce GTX 260 Plus 1792MB Video Card


  • Using skived fins in the heatsink is a nice touch.

  • Nice to see someone still releasing a GTX 260. A lot of other manufacturers seem to have stopped making them.

  • What about cooling? I heard it needs a separate exhaust?

  • F. Buskirk

    IT LOOKS GREAAAAAT BOB. TOP OF THE LINE. YOU GOT IT BOY. DAD

  • with the cooling shell so large i wonder why they didn't make it a 2U PCI card?

  • What do you mean? It is a 2-slot video card…

  • It looks to me like the video exhausts some of the heat back into the case.

  • 2U as in it takes up two slots. Not internally, but externally. All mobo's are spaced differently so the internals don't matter. But, slot spacing on the back of a case is always the same. SO i was wondering why the mounting bracket on the card wasn't a 2U since the cooling shield was so big?

    i.e. https://www.thinkcomputers.org/reviews/asus_engt

  • Good question and since technically it takes up 2-slots I wonder why Sparkle never used a traditional PCI bracket with exhaust holes.

  • lewislau

    This is a pretty good graphics card for the price you're paying, plus i've always been a fan of nvidia =D. top notch

  • True dat, but it's too bad the GT200 and GT200b stocks are quite low.

  • www.hardocp.com

    Why would you benchmark that card in only 1280×1024?

    Do you think that an additional 896 megs of GDDR3 (on top of the 260GTX's original 896mb) is going to make any appearance at resolution that low? Why would you even bother doing a benchmarking review if you weren't going to test this thing on more than 3 benchmarking utilities and on such a low resolution? You aren't going to need a card with more than 512mb of GDDR3 at that resolution. (Are we in 1998?) I know! How about 1680×1050 or 1920×1200, because we all know in 2009 everyone uses CRT monitors only capable of 1280×1024 resolution! How long did it take you to run these tests? 15 minutes? 20 minutes? Man, put a little effort into the next review. ThinkComputers… I don't see a lot of work or think going on here.

  • You are entitled to your opinion but whining like a little baby doesn't help.

  • IMO 1792MB of memory is a bit much.

  • lewislau

    good for higher resolutions

  • lewislau

    Yea, it's a pretty good card, and sad to see it go.

  • lewislau

    i agree, that testing on a larger resolution should have been included. Perhaps, you could have not been so harsh in your comments though.
