    Nvidia Fermi/GT300/GF100 info might be available soon


    #1  Edited By Geno

With CES at hand and the proposed March 2010 release imminent, it's quite possible that Nvidia will fully disclose specs and benchmarks of its GT300 series very soon. Why is this exciting? It's been speculated that Nvidia's heavily delayed next generation of cards, built on a new architecture, will break all performance benchmarks thus far, including those set by ATi's rather impressive and recently released HD5000 series, and by a larger margin than the usual generational leap. According to this post from 3 days ago, http://www.vistax64.com/graphic-cards/265211-gt300-nda-has-been-partially-lifted.html , an Nvidia insider has suggested that full disclosure may come as early as today. Until then, take a look at the details given so far through the partial NDA lift: 
      
- March 2nd release date 
- Crysis (ambiguous whether it's Warhead or not) gets 73fps at 1920x1200, 4x SSAA, 16x AF 
- 148fps in Dirt 2 (I'm assuming max settings) 
- Will feature 32x AA 
- New SLI rendering method 
- GF100 (single GPU) to outperform GTX 295 (dual GPU) by 40%, for less money 
- GF100 will outperform HD 5970 (dual GPU) and cost less as well 
- GF104 (supposedly the dual GPU card) will be twice as powerful (ambiguous as to whether that means twice the GF100, GTX 295 or HD 5970) 
- GF104's max load temp is 55 degrees C (which is insanely cool for a dual GPU card)
     
And in other news pertaining to Nvidia, Nintendo has ordered several million of their powerful new Tegra2 units to be implemented in their next handheld (DS2 imminent?).
     
    All credit goes to original poster and source of course. 
     
Edit: Fermi GF100 demo while you wait: [video embed]
    Edit2: This was posted on the Rage3D forums by a member with over 9000 posts, I'd call it fairly reliable: 
     
    (Translated to English using Google Translate, it's mostly readable) 
     
The high-end model will use the full array of 16 clusters for a total of 512 CUDA cores, paired with a 384-bit GDDR5 memory interface and 1.5GB of memory. Although the Fermi core is built on a 40nm manufacturing process, these cards draw 250W to 300W and require one PCI-E 6-pin plus one PCI-E 8-pin power connector. A second-tier model drops two clusters and one memory controller, falling to 448 CUDA cores and a 320-bit GDDR5 interface with 1.28GB of memory; power consumption is reduced to 225W, and it needs only two PCI-E 6-pin connectors.

In addition to these two single-GPU products, NVIDIA also plans to launch a limited-edition GF100 GX2 dual-GPU model, produced by board partners under NVIDIA's coordination. With two Fermi cores on board, the total CUDA core count reaches 1024, with an amazing 3GB of memory and a transistor count of more than 6 billion, which would make it the biggest, most robust display card product in the industry's history.

Easily beyond the HD 5870
As for GF100's actual performance, which the industry and users are highly curious about: sources indicated that even the single-GPU version with 448 CUDA cores will easily exceed its direct rival, the ATi Radeon HD 5870. In mainstream games such as Crysis Warhead, Far Cry 2, HAWX, and Fallout 3, GF100's performance is respectively 1.6x, 1.2x, 1.55x and 1.2x that of the HD 5870.
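The core counts in that leak are at least internally consistent: 512 cores over 16 clusters implies 32 CUDA cores per cluster, and dropping two clusters gives the 448-core part. Here's a rough sketch of that arithmetic; note the 32-cores-per-cluster and 256MB-per-memory-controller figures are inferences chosen to match the leak, not confirmed specs:

```python
# Back-of-envelope check of the leaked Fermi numbers (assumes 32 CUDA
# cores per cluster, which is what 512 cores / 16 clusters implies).
CORES_PER_CLUSTER = 512 // 16  # = 32

def fermi_config(clusters: int, bus_bits: int, mb_per_controller: int = 256) -> dict:
    """Derive core count and memory size from cluster/bus counts.
    mb_per_controller is a hypothetical figure picked to match the leak:
    384-bit bus = 6 x 64-bit controllers x 256 MB = 1.5 GB."""
    controllers = bus_bits // 64
    return {
        "cuda_cores": clusters * CORES_PER_CLUSTER,
        "memory_mb": controllers * mb_per_controller,
    }

full = fermi_config(clusters=16, bus_bits=384)  # rumored high-end part
cut = fermi_config(clusters=14, bus_bits=320)   # rumored second-tier part

print(full)  # {'cuda_cores': 512, 'memory_mb': 1536} -> 1.5 GB
print(cut)   # {'cuda_cores': 448, 'memory_mb': 1280} -> 1.28 GB, as leaked
```

The 320-bit bus (5 controllers) times the inferred 256MB per controller lands exactly on the leaked 1.28GB, which is a point in the leak's favor.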
             
     
Edit3: Here are some demo videos; they were originally stored on a private Vimeo account, but somebody has since copied them publicly onto YouTube. I'm 99% sure these are real. Enjoy. [video embeds]
 
These appear to be the GTX 360 (GF100 = GTX 360), since that would be the lowest-end model based on the previous generation and the naming convention. Not bad, I think.
     
Edit4: Apparently the NDA will be lifted TONIGHT, guys.
     
     Nvidia will take the wraps off their fancy new DirectX 11 graphics card tomorrow night. Ahead of the embargo's expiry a number of videos have leaked out, showing real-time ray tracing, tessellation, and Far Cry 2 running at significantly faster framerates when compared to the GeForce GTX 360.     
     
    http://ve3d.ign.com/articles/news/52473/Nvidia-GeForce-300-Videos-Leak-Ahead-Of-Tomorrows-Big-Reveal     
      
    Settings and framerates for the FC2 benchmark, thanks to a poster on the ve3D page.  
     
    ranch small, ultra, 1920x1200, 4xAA

    gtx285
    avg/max/min 
    50/72/38

    gf100 ( = GTX 360)
    avg/max/min
    84/126/65 +68%/+75%/+71%

    5870
    avg 70fps       
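The percentage gains in that table check out against the raw framerates; a quick sanity check, rounding to whole percent:

```python
# Verify the claimed FC2 gains: each +% is just
# (gf100 - gtx285) / gtx285 for the avg/max/min framerates.
gtx285 = {"avg": 50, "max": 72, "min": 38}
gf100 = {"avg": 84, "max": 126, "min": 65}

gains = {k: round(100 * (gf100[k] - gtx285[k]) / gtx285[k]) for k in gtx285}
print(gains)  # {'avg': 68, 'max': 75, 'min': 71} -- matches +68%/+75%/+71%
```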
     
    Edit5: Looks like someone posted a small preview.
     
    http://www.bjorn3d.com/forum/showpost.php?p=215717&postcount=8  
     
     Design Article releases tomorrow 7PM CST with complete Whitepaper info.


    New Features, new cache, new Memory setup, and yes it's about 100% performance increase over GTX-2xx so figure single GTX-285 vs 5870 then double the GTX-285 performance.


Then it handles triangles differently; triangles on any given frame can number in the hundreds of thousands, so that's very important.


    It will fold a lot better.


    Increased efficiency in several areas.


It's a revolutionary new design oriented toward tessellation (those pesky triangles) and geometric programming. The thing is, every wireframe is made up of triangles, and tessellation takes those triangles and breaks them down into many smaller triangles. This core is uniquely designed to handle that, so in geometry- and shader-heavy games you will see more than the 100% raw power increase.


520USD might handle it. At 2x GTX-285 performance that puts it above GTX-295 performance, and it's DX11-ready and designed for that specifically. Current ATI offerings are really good, but they are basically double the hardware on the same core design to provide more raw power. GF100 is a core designed to take advantage of what the industry needs today and for some time in the future.


Read the article tomorrow because that's about all I can say tonight.
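The tessellation description in that preview can be illustrated with the simplest possible subdivision scheme: split every triangle at its edge midpoints into four smaller ones, so triangle counts grow 4x per level. This is a toy sketch of why triangle throughput matters, not how the GF100 tessellator actually works:

```python
# One pass of midpoint subdivision turns each triangle into four smaller
# ones -- tessellated meshes explode in triangle count very quickly.
def midpoint(a, b):
    return tuple((a[i] + b[i]) / 2 for i in range(len(a)))

def subdivide(tri):
    """Split one triangle (three 2D/3D points) into four."""
    a, b, c = tri
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]

mesh = [((0.0, 0.0), (1.0, 0.0), (0.0, 1.0))]
for level in range(3):
    mesh = [t for tri in mesh for t in subdivide(tri)]
print(len(mesh))  # 64 triangles from 1 after three levels (4**3)
```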
        
     
    Edit6: I don't know why it keeps posting what I'm saying in italics, some forum bug.  
     
Edit 7: Extremely amusing video incoming, "birth of GT300". [video embed]

    #2  Edited By HitmanAgent47

    Sounds good.


    #3  Edited By Diamond
@Geno: Sounds like a pie in the sky wishlist to me. Or rather a bunch of BS put up on a forum by someone with the intention to BS. Crysis at that resolution with 4xSS at that framerate would take a card MANY times as powerful as a 5970. A 5970 can't even do 60fps at that resolution with NO AA, let alone super sampled AA. And what would be the point of 32xAA when you can start moving to super sampling anyways?
     
    Utter 100% BS.

    #4  Edited By Driadon

    I'll believe it when I see it. Rumors, especially when it comes to computer hardware, are incredibly unreliable.


    #5  Edited By wolf_blitzer85

    What would even be the point of 32xAA? Anyhoo if these numbers are for real, I'm down for picking one of these up in the near future.


    #6  Edited By PufferFiz

    fucking FINALLY, been waiting so long for this fucking card.


    #7  Edited By Diagnostic

    Riiiiiiiight.


    #8  Edited By lhaymehr

Yay, hardware porn. Luckily, I'm in no hurry with my solid card. One thing is for sure though; it won't be much more powerful than the latest Radeon, IMO. The recent announcement from AMD/ATI is probably the cause of nV's delay. They're probably bumping up the specs just barely.
     
    I could be wrong.


    #9  Edited By Evilsbane

    Those numbers, if shown to be true, are very exciting.


    #10  Edited By scarace360

If this is true then that sounds awesome.


    #11  Edited By Kblt

Yeah, how much of this is marketing and how much is true? I'm waiting for independent benchmarks before I'll upgrade my nvidia machine (ALSO LOL 500€ PRICE TAG TYVM).


    #12  Edited By Geno
    @Diamond said:

    " @Geno: Sounds like a pie in the sky wishlist to me.  Or rather a bunch of BS put up on a forum by someone with the intention to BS.  Crysis at that resolution, with 4xSS and that framerate would be MANY times as powerful as a 5970.  A 5970 can't even do 60fps at that resolution with NO AA, let alone super sampled AA.  What would be the point of 32xAA when you can start moving to super sampling anyways? Utter 100% BS. "

    Uh, Crysis can be played at 80+ fps with 5970 CF at 1920x1200 4xAA (and that's with poor drivers), a single 5970 averages 74.5fps. http://www.techpowerup.com/reviews/HIS/Radeon_HD_5970_CrossFire/7.html  
     
    If indeed the GF100 alone is 8% stronger than the 5970 (not too far a possibility; the 5870 is almost as powerful as the GTX295) then it wouldn't be too far of a stretch to say the dual GPU solution could reach that performance. If the post is referring to Crysis Warhead (which is a possibility) the probability is even higher.  That guy also seems to be a rather established member of their forums so I doubt he was just trolling or spouting BS. 
     
    @lhaymehr said:

    " Yay, hardware porn. Luckily, I'm in no hurry with my solid card. One thing is for sure tho; It won't be much more powerful than the latest Radeon, IMO. The recent announcement from AMD/ATI is probably the cause for the nV's delay. They're probably bumping up the specs just barely.  I could be wrong. "

I'm in no hurry to upgrade either; as tempting as the performance gains are, really the only game out there that would benefit is Crysis (and Stalker, but even fewer people play that than Crysis). Plus the second-generation Fermi is likely going to be much better as they iron out some of the kinks of the first generation. However, I'm just extremely curious to see what the green team has in store after this loooooong gestation period. 

    #13  Edited By Diamond
    @Geno said:
    " @Diamond said:

    " @Geno: Sounds like a pie in the sky wishlist to me.  Or rather a bunch of BS put up on a forum by someone with the intention to BS.  Crysis at that resolution, with 4xSS and that framerate would be MANY times as powerful as a 5970.  A 5970 can't even do 60fps at that resolution with NO AA, let alone super sampled AA.  What would be the point of 32xAA when you can start moving to super sampling anyways? Utter 100% BS. "

    Uh, Crysis can be played at 80+ fps with 5970 CF at 1920x1200 4xAA (and that's with poor drivers), a single 5970 averages 74.5fps. http://www.techpowerup.com/reviews/HIS/Radeon_HD_5970_CrossFire/7.html  
     
    If indeed the GF100 alone is 8% stronger than the 5970 (not too far a possibility; the 5870 is almost as powerful as the GTX295) then it wouldn't be too far of a stretch to say the dual GPU solution could reach that performance. If the post is referring to Crysis Warhead (which is a possibility) the probability is even higher.  That guy also seems to be a rather established member of their forums so I doubt he was just trolling or spouting BS.
    Probably not full details then.  That doesn't mesh with any other 5970 benchmark I've seen.  It takes a HELL of a lot more power to run super sampled AA than regular AA.  SSAA means you'd basically be running the game at 3840x2400 then scaling down.
     
If Nvidia manages to pull off a multi-generational leap such as that, ATI/AMD are done, period. That's why it's BS.
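Diamond's point about the cost of supersampling can be put in numbers: 4x SSAA shades four samples per pixel, which is the same shading work as rendering at double the width and height and then downscaling. A rough model that ignores bandwidth and ROP costs:

```python
# 4x SSAA at 1920x1200 shades as many samples as rendering at 3840x2400.
def shaded_pixels(width, height, ss_factor=1):
    # ss_factor is the total sample count (4x SSAA = a 2x2 grid per pixel)
    return width * height * ss_factor

base = shaded_pixels(1920, 1200)      # 2,304,000 shaded pixels
ssaa = shaded_pixels(1920, 1200, 4)   # 9,216,000 -- same as 3840x2400
print(ssaa // base)  # 4x the pixel shading work of no-AA
```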

    #14  Edited By Geno
    @Diamond said:
    " @Geno said:
    " @Diamond said:

    " @Geno: Sounds like a pie in the sky wishlist to me.  Or rather a bunch of BS put up on a forum by someone with the intention to BS.  Crysis at that resolution, with 4xSS and that framerate would be MANY times as powerful as a 5970.  A 5970 can't even do 60fps at that resolution with NO AA, let alone super sampled AA.  What would be the point of 32xAA when you can start moving to super sampling anyways? Utter 100% BS. "

    Uh, Crysis can be played at 80+ fps with 5970 CF at 1920x1200 4xAA (and that's with poor drivers), a single 5970 averages 74.5fps. http://www.techpowerup.com/reviews/HIS/Radeon_HD_5970_CrossFire/7.html  
     
    If indeed the GF100 alone is 8% stronger than the 5970 (not too far a possibility; the 5870 is almost as powerful as the GTX295) then it wouldn't be too far of a stretch to say the dual GPU solution could reach that performance. If the post is referring to Crysis Warhead (which is a possibility) the probability is even higher.  That guy also seems to be a rather established member of their forums so I doubt he was just trolling or spouting BS.
Probably not full details then.  That doesn't mesh with any other 5970 benchmark I've seen.  It takes a HELL of a lot more power to run super sampled AA than regular AA.  SSAA means you'd basically be running the game at 3840x2400 then scaling down.  If Nvidia manages to pull off a multi-generational leap such as that, ATI/AMD are done, period.  That's why it's BS. "
    My guess is that he meant the 4xAA available in the game options menu, and was referring to the GF100 single gpu card. That would make the most sense based on the rest of the information. 

    #15  Edited By eroticfishcake

Threads like these remind me of how crappy my graphics card is. Ugh. Anyway, it's good to hear there's a card that can run Crysis almost effortlessly, though I won't fully embrace it until it's released.


    #16  Edited By AndrewB

Nvidia also has to hope that whatever AMD is sure to have in reserve for their card refreshes doesn't one-up Fermi in the price/performance arms race. Nvidia has been known for creating massive, top-performing products, but failing to keep prices at a level that stays competitive.
     
    Also, I just want to say again how crazy it will be to have a DS that outperforms the current generation Wii. Hopefully it'll have TV out support or something.


    #17  Edited By Diamond
    @Geno said:
    My guess is that he meant the 4xAA available in the game options menu, and was referring to the GF100 single gpu card. That would make the most sense based on the rest of the information.
    Well I could believe that, but it seems like a pretty big leap to type SSAA rather than just AA.  Most people don't even really understand what SSAA is, to even type it suggests you're paying attention to a detail such as that.

    #18  Edited By ShaunassNZ

Sounds lovely. And are we getting the new DS, which will be called the Deca Screen since it will now have 10 screens?


    #19  Edited By Geno
    @Diamond said:

    " @Geno said:

    My guess is that he meant the 4xAA available in the game options menu, and was referring to the GF100 single gpu card. That would make the most sense based on the rest of the information.

    Well I could believe that, but it seems like a pretty big leap to type SSAA rather than just AA.  Most people don't even really understand what SSAA is, to even type it suggests you're paying attention to a detail such as that. "
As hard as that is to believe, it's easier to believe that than to believe a person with over 2000 posts on a tech forum would randomly troll this information. Maybe he was excited. Or maybe *zounds* Nvidia actually CAN do Crysis at those settings with that fps. One can hope.  
     
Edit: Oh, I just noticed something: in the first part of the post he said "all tests run at 1920x1200 4xaa 16AF", and only later does he say 4xSSAA. I'm guessing it's a typo. 

    #20  Edited By Black_Raven

    If these rumours are true then I might get one.


    #21  Edited By Suicidal_SNiper

    Gah, it makes me feel as if my 4890 is already obsolete =\ . Oh well, that's the life of a PC Gamer I guess.


    #22  Edited By Time_Lord

And I just finished building my computer, god damn it.


    #23  Edited By Jiggah
    @Time_Lord said:

    " And I just finished building my computer god dam it. "

Don't worry, if Nvidia's track record is anything to go by, this will probably be delayed until next year.
     
    Seriously though, I'm thinking these cards are going to cost an arm and a leg.  By the time they roll out, ATI/AMD will be rolling out a competitive product probably at better cost.  Nvidia is playing a catch up game, and so far it doesn't look like it's going well.

    #24  Edited By SeriouslyNow

    It's real as far as I'm concerned.


    #25  Edited By spazmaster666

I really hope these cards launch in Q1 of this year, because I've been waiting ages to see what nvidia is bringing to the table next. I have nothing against ATI (I've owned many ATI cards in the past), but I want to see what nvidia has on offer before I decide what card to purchase next.


    #26  Edited By SeriouslyNow

I think it's more likely they'll be released in Q2.  Just in time for an ATI 5970++.  Nvidia and ATI collude on release scheduling and it's been investigated already; it was settled via a per-customer payout as a result of a class action against both companies.  People should think about that kind of history before they take sides in the fanboi war.  Just saying...


    #27  Edited By ajamafalous

    I hope these release before the middle of March and there are plenty in stock. If so, I'll trade up from my GTX 275.


    #28  Edited By Slippy

    As long as it doesn't cost an arm more than ATi's cards.


    #29  Edited By DCFGS3

    Looks like a good mid year buy for me, assuming it's true.


    #30  Edited By Snail

Great! I was considering getting an ATI Radeon HD 5970, but now I'm glad I didn't. I'll wait and see what this is all about. Nvidia has really fallen behind, so they need something really great to get back in the lead of the graphics card industry. They haven't released a new product in a long time; they don't even have a DirectX 11 graphics card, that's how far behind they are right now.
     
    In the Nvidia vs. ATI battle I've always been an Nvidia kinda guy. They let me down there, for taking so long to release a new product, but I hope it will be worth the wait.


    #31  Edited By MrKlorox

    So these haven't been disproved yet? Practically the only number we're missing is a price tag.
     
    And I guess clock speeds.


    #32  Edited By pause422

This is DX11 supported, right? I'm assuming. I need a DX11 card soon when I redo my computer with Windows 7 and upgrade some other things; once some more info comes out for this baby I might see if it's worth it.


    #33  Edited By Geno

    Just got wind that the NDA may be lifted on the 12th. Crossing my fingers.  
     
Sled demo at CES for anyone who's interested. Note: this is using a single GF100 card. [video embed]

    #34  Edited By Chyro

Yeah, I watched the CES streamcast, though he only mentions GT100 at the end.  I did enjoy the 3D thing.  I haven't experienced it myself because it's extremely expensive at $450, and that's if you buy it separately from Newegg.  Not much GT100 information, unfortunately.


    #35  Edited By JJWeatherman

    That tech demo was pretty impressive. I need a new computer.    : (


    #36  Edited By Barrock

    Any word on a price point?


    #37  Edited By SeriouslyNow

    You can guarantee it will sit below the ATI 5970.


    #38  Edited By Geno

    Still nothing on things like benchmarks, but it's basically been confirmed that production will start in February and a paper launch will happen in March. Also there's this little tidbit: 
     
    http://games.on.net/article/7836/Metro_2033_-_Technical_QA     
     
    In which there's the statement: 
     
    games.on.net: What are the system requirements to run the game at full detail for a PC user, everything turned on, at 1680 x 1050 resolution?

    Olez: At that resolution, the last generation of high end hardware – a single GPU NVIDIA card - will suffice. The new generation of cards from ATI, and we’ve seen the new ones from NVIDIA too, they’re way more powerful. With these you can turn everything on, with image enhancing features such as Super Sampling of every surface and every texture. Everything will be more detailed with this – not just edges.

        
I think some developers may already have prototypes of GF100, judging from that statement, and by the looks of it, it doesn't seem to disappoint. This is presuming that "last generation" means the GT200 series.

    #39  Edited By Chyro

    Yeah everything sounds great about it.  Except it will probably be $500.  I'm not rich.  So I will just keep my 260/216 for now until I see Nvidia DX11 cards around $200.


    #41  Edited By Geno

This was posted earlier today on the Rage3D forums by a member with over 9000 posts, I'd call it fairly reliable; it's the same translated spec rundown quoted in Edit2 of the OP. 
     
    Go to http://www.rage3d.com/board/showthread.php?t=33953264&page=57 for videos.    

    #42  Edited By captain_clayman

    cool... 
    but uh... 
     
    it's probably gonna be REALLY REALLY REALLY expensive.


    #43  Edited By Geno
    @captain_clayman said:

    " cool... but uh...  it's probably gonna be REALLY REALLY REALLY expensive. "

I think they mentioned that prices are going to be pretty competitive; the GTX 380 should cost less than the GTX 295 and perform around 30% better. That's quite a good deal from Nvidia. The one thing I am worried about is power consumption and heat; it's a very, very big GPU. My current setup already runs at 90 degrees C at max load, so I shudder to think what this will do. 
     
     

    #45  Edited By wolf_blitzer85
@Geno: Wow, thanks for keeping us up to date on this stuff. The more info that comes out, the better these cards are looking. Even though I won't be able to afford them, I'm curious about the high-end card specs and seeing what those beasts can do.
    Avatar image for geno
    Geno

    6767

    Forum Posts

    5538

    Wiki Points

    0

    Followers

    Reviews: 15

    User Lists: 3

    #46  Edited By Geno
    @wolf_blitzer85 said:
    " @Geno: Wow thanks for keeping us up to date on this stuff. The more info the better these cards are looking. Even though I wont be able to afford them, I'm curious about the high end card specs and see what those beasts can do. "
    No problem, it's a labor of love. 
    Avatar image for dr_feelgood38
    Dr_Feelgood38

    1582

    Forum Posts

    780

    Wiki Points

    0

    Followers

    Reviews: 4

    User Lists: 1

    #47  Edited By Dr_Feelgood38

Yeah, seriously, Geno, keep it up. I'm in the market for a laptop at the moment, so I'm not getting one of these anytime soon, but I like to know how far behind the curve I am. These cards look ungodly awesome; let's hope they weren't kidding about the prices being competitive.

    #48  Edited By Geno

    Settings and framerates for the FC2 benchmark, thanks to a poster on the ve3D page.  
     
Far Cry 2 benchmark, "Ranch Small", Ultra, 1920x1200, 4xAA (avg/max/min fps):

GTX 285: 50 / 72 / 38
GF100 (= GTX 360): 84 / 126 / 65 (+68% / +75% / +71% over GTX 285)
HD 5870: 70 avg
     
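Just to sanity-check those numbers, the quoted percentage gains follow directly from the avg/max/min figures; a quick sketch (the fps values are the ones quoted above):

```python
# Far Cry 2 "Ranch Small" figures quoted above, as (avg, max, min) fps.
gtx285 = (50, 72, 38)
gf100 = (84, 126, 65)  # GF100 = GTX 360

# Percentage improvement of GF100 over GTX 285 for each measure.
gains = tuple(round((new / old - 1) * 100) for new, old in zip(gf100, gtx285))
print(gains)  # (68, 75, 71) -- matches the +68%/+75%/+71% in the post
```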
From the looks of it we are in for something special. The GTX 360 is the successor to the GTX 260, the lower-end model of the GT200 line, and it is already matching or exceeding the GTX 295 (the highest-end dual-GPU card, roughly equivalent to GTX 260 SLI). That could mean the higher-end GTX 380 and 385 will show a 40-50% improvement on top of that, which in turn means GTX 380 or 385 SLI would make even Crysis at max settings bow down. It looks like we will be seeing a 2x performance increase or more over the previous generation, along with tessellation features and other optimizations. On top of that, DX11 has been reported to provide a 20-30% performance boost over DX10 in games coded with it in mind. 
     
The ray tracing I find interesting; it looks like it will be the next big image-enhancing feature alongside antialiasing. That said, the setup they showed required Tri-SLI GF100 to reach a mere 0.6 fps, so real-time Pixar-level graphics is still quite a ways off, at least until somebody comes up with a groundbreaking algorithm for it.  
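To get a feel for why the framerate collapses, here's a minimal, hypothetical sketch of the core of a ray tracer: fire at least one ray per pixel and test it against the scene geometry. The camera and the single sphere here are made up for illustration; real scenes have millions of primitives plus shadow and bounce rays per hit, which is where the cost explodes:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return distance to the nearest ray/sphere intersection, or None."""
    # Solve |o + t*d - c|^2 = r^2 for t (a quadratic in t).
    oc = tuple(o - c for o, c in zip(origin, center))
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t > 0 else None

# One primary ray per pixel: even a tiny 320x240 frame fires 76,800 rays
# against every object, per frame, before any shadows, bounces, or shading.
width, height = 320, 240
sphere_center, sphere_radius = (0.0, 0.0, -5.0), 1.0
tests = hits = 0
for y in range(height):
    for x in range(width):
        # Map the pixel to a view direction through a simple pinhole camera.
        dx = (x / width) * 2 - 1
        dy = (y / height) * 2 - 1
        if ray_sphere_hit((0.0, 0.0, 0.0), (dx, dy, -1.0),
                          sphere_center, sphere_radius) is not None:
            hits += 1
        tests += 1

print(tests)  # 76800 intersection tests: one sphere, one ray depth, one frame
```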
     

@Dr_Feelgood38 said:
    " Yeah, seriously, Geno, keep it up. I'm in the market for a laptop at the moment so I'm not gonna be getting one of these anytime soon or anything, but I like to know how far behind the curve I am. These cards look ungodly awesome, let's hope they weren't kidding about being competitive in price. "

    Thanks. 
    sandwich_adjustment

    703

    Forum Posts

    318

    Wiki Points

    0

    Followers

    Reviews: 0

    User Lists: 5

It will be interesting to see when real-time game engines and offline 3D rendering reach that kind of singularity.

    #50  Edited By Diamond
    @Geno said:

    The ray tracing I find interesting; it looks like it will be the next feature alongside antialiasing in terms of image enhancing. Though, the setup they provided required Tri-SLI GF100 to get a mere 0.6fps, so I guess real time Pixar level graphics is still quite a ways off, or at least until somebody formulates a groundbreaking algorithm for it.

It doesn't seem like the card is an especially good fit for ray tracing. 0.2 fps per GPU in a simple environment isn't very hopeful for near-future applications.
     
FWIW, a lot of Pixar's stuff is just rasterized (RenderMan's REYES pipeline is scanline-based, not ray traced); not sure if that's still true for their more recent films.
     

@Geno said:
    GTX 260, which is the lower end model of the GT200 line

It's definitely more mid-level; the GTS 240 and GTS 250 (and the GT series) are the low-end parts.
