
GeForce 300 to outpace dual Radeons?

updated 10:35 am EST, Mon December 14, 2009

NVIDIA may claim absolute speed win

NVIDIA's first GeForce 300 series cards may provide it with an unambiguous lead over AMD if purportedly leaked benchmarks are accurate. The tests suggest that the fastest single-chip GeForce card, the GTX 380, would be faster than the dual-chip Radeon HD 5970. They go so far as to suggest that a slightly speed-reduced card, the GTX 360, will often come close to, and in one case exceed, AMD's flagship card.

Both would be even faster than the much more conventional Radeon HD 5870. It's not clear where the advantages would stem from, though the Fermi architecture that will underpin most GeForce 300 chips promises lifelike 3D graphics. It centers on a 512-core design ostensibly aimed at general computing, but one that should also help with massively parallel graphics duties.
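
As a rough illustration of the kind of general-purpose, many-core computing such a design targets, the minimal OpenCL sketch below (a hypothetical C example, not anything drawn from the leaked benchmarks) runs one tiny work-item per array element and lets the driver spread the work across however many cores the GPU provides:

```c
/* Hypothetical OpenCL sketch: one work-item per array element. */
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

/* Data-parallel kernel: each work-item computes one element of y = a*x + y,
 * the same per-element pattern that per-pixel shading work follows. */
static const char *src =
    "__kernel void saxpy(const float a, __global const float *x,\n"
    "                    __global float *y) {\n"
    "    size_t i = get_global_id(0);\n"
    "    y[i] = a * x[i] + y[i];\n"
    "}\n";

int main(void)
{
    enum { N = 1 << 20 };                 /* one million elements */
    float a = 2.0f;

    /* Grab the first GPU device and set up a context and command queue.
     * Error checking is omitted to keep the sketch short. */
    cl_platform_id platform;
    cl_device_id device;
    clGetPlatformIDs(1, &platform, NULL);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL);
    cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, NULL);
    cl_command_queue queue = clCreateCommandQueue(ctx, device, 0, NULL);

    /* Device buffers; initializing x and y is omitted for brevity. */
    cl_mem x = clCreateBuffer(ctx, CL_MEM_READ_ONLY, N * sizeof(float), NULL, NULL);
    cl_mem y = clCreateBuffer(ctx, CL_MEM_READ_WRITE, N * sizeof(float), NULL, NULL);

    /* Compile the kernel and bind its arguments. */
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &device, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "saxpy", NULL);
    clSetKernelArg(k, 0, sizeof(float), &a);
    clSetKernelArg(k, 1, sizeof(cl_mem), &x);
    clSetKernelArg(k, 2, sizeof(cl_mem), &y);

    /* Launch N work-items; the runtime spreads them across the GPU's cores. */
    size_t global = N;
    clEnqueueNDRangeKernel(queue, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clFinish(queue);

    clReleaseKernel(k);
    clReleaseProgram(prog);
    clReleaseMemObject(x);
    clReleaseMemObject(y);
    clReleaseCommandQueue(queue);
    clReleaseContext(ctx);
    return 0;
}
```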

The recipient of the benchmarks, Guru3D, notes that it's hard to verify their authenticity. However, it adds that the formatting is at least partly consistent with NVIDIA's pre-release benchmarks and that the performance isn't necessarily unrealistic.

NVIDIA is unofficially due to launch the GeForce 300 line in early 2010, possibly as early as CES.

By Electronista Staff

Comments

  1. TomSawyer

    Fresh-Faced Recruit

    Joined: Jan 2008

    +2

    Anyone else

    feel kind of awed and nostalgic when they see GRAPHICS CARDS that are on a whole other plane of performance than the COMPUTERS they used to think of as zippy or cutting-edge?

  2. iphonerulez

    Dedicated MacNNer

    Joined: Nov 2008

    -3

    All these powerful graphics cards

    have nothing to do with Macs, though. Do they even work in Mac Pro models? And now, aren't most people buying computers that cost less than one of these cards would cost? I really don't understand how these companies are making money off such powerful cards. Even though Apple is pioneering Grand Central Dispatch, none of the most powerful cards are being used in Macs, so what's the point? I think the Mac uses the pokiest of the bunch, which is the GTX 285.

  3. zaghahzag

    Dedicated MacNNer

    Joined: Aug 2006

    0

    @iphonerulez

    Not even. The Mac Pro has as its top card the ATI Radeon HD 4870, which I think is a whole generation behind the slower ATI card in those charts.

    OTOH, Apple doesn't sell a gaming computer and the iMac has a non-upgradeable GPU, so it's not a shock they wouldn't bother with faster video cards.

    It does point out a huge gap in performance that Apple has never seriously tried to fill.

  4. testudo

    Forum Regular

    Joined: Aug 2001

    +4

    @iphonerulez

    ...have nothing to do with Macs, though. Do they even work in Mac Pro models?

    Probably not until Apple writes a driver for them, and that would only occur if it decided to use the card. The card makers aren't going to write their own drivers for OS X, considering the cost involved in doing so versus the expected gain in sales (since only Mac Pros could use them, and those aren't high in the Mac market share department).

    And now, aren't most people buying computers that cost less than one of these cards would cost?

    Many people are buying computers that cost less than these. But there are also a lot of people who need or want computers that end up costing well over a thousand dollars, especially in the gaming and tinkering markets. They eat this c*** up.

    I really don't understand how these companies are making money off such powerful cards.

    Hasn't Apple proved you don't need a huge market to be successful?

    Even though Apple is pioneering Grand Central Dispatch, none of the most powerful cards are being used in Macs, so what's the point? I think the Mac uses the pokiest of the bunch, which is the GTX 285.

    Well, for one, the entire computing world does not revolve around Apple. In fact, of all computers sold, most aren't made by Apple (I know, hard to imagine).

    Second, Grand Central Dispatch is basically a framework for creating multi-processing applications. But it is only for the CPU; it doesn't offload work to the GPU. Core Image offloads work to the GPU, as does (did?) Quartz Extreme. (See the GCD sketch after the comments below.)

  5. bbbl67

    Fresh-Faced Recruit

    Joined: Dec 2009

    +1

    DirectX 10 not 11

    Look at the graphs: they all say DirectX 10 at the bottom. Nvidia is trying to show its own cards in the best light possible by disabling all of the new performance-enhancing features of DX11. One has to wonder whether Nvidia has even gotten DX11 working right on its cards, if it won't show DX11 performance. I suspect Nvidia will come to regret having released these charts.

  6. testudo

    Forum Regular

    Joined: Aug 2001

    +2

    Re: DirectX

    What? A graphics card maker showing off test results that show their products in the best possible light? Say it isn't so!
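
As testudo notes above, Grand Central Dispatch spreads work across the CPU's cores rather than the GPU's. A minimal sketch of that pattern in C (a hypothetical example, not Apple sample code) looks like this:

```c
/* Hypothetical Grand Central Dispatch sketch: the work below is split across
 * the machine's CPU cores, not its GPU. */
#include <stdio.h>
#include <dispatch/dispatch.h>

#define N 8

static float results[N];

/* Worker invoked once per index, potentially on several CPU cores at once. */
static void square(void *ctx, size_t i)
{
    float *input = ctx;
    results[i] = input[i] * input[i];
}

int main(void)
{
    float input[N] = {1, 2, 3, 4, 5, 6, 7, 8};

    /* dispatch_apply_f blocks until all N iterations have run on the
     * default concurrent (CPU) queue. */
    dispatch_queue_t q = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    dispatch_apply_f(N, q, input, square);

    for (size_t i = 0; i < N; i++)
        printf("%g ", results[i]);
    printf("\n");
    return 0;
}
```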
