
    Cyberpunk 2077

    Game » consists of 11 releases. Released Dec 10, 2020

    An open-world action role-playing game by CD Projekt RED based on the pen and paper RPG Cyberpunk 2020.

    Cyberpunk 2077’s E3 2018 demo was running on an Intel i7-8700K with an NVIDIA GeForce GTX 1080 Ti.

    Tesla

    @rethla: Wooooow dude, S tier pun. High five.

    qrdl

    @tesla Don't encourage such behaviour!

    deactivated-5d1d502761653

    Developers demo a game that's supposed to come out in a few years on high-end hardware to make it look as impressive as possible? Color me surprised!

    TurtleFish

    #54  Edited By TurtleFish
    @rethla said:
    @ghoti221: I haven't seen HDR done well in any PC game, and the monitors are crap. Why is that a must-have for Cyberpunk?

    Also, have you tried running Witcher 3 maxed out on anything but a 1080 Ti?

    Running Witcher 2 maxed out is still a feat...

    I think HDR is a must-have because they're positioning CP2077 to be both immersive and graphically impressive, technically and aesthetically. By 2020, HDR gaming monitors and cards should be there, at least on the high end (the first 4K 120Hz HDR G-Sync monitors went on sale just a couple of months ago, so there should be some product in two years).

    And what you're asking is exactly my point: Witcher is pretty crazy demanding, which means they aren't afraid to go for broke. But I wasn't referring to which card can run the game at max settings; I was referring to the 1080 Ti being defined as the minimum-spec card. I've got no problem with the game being so crazy at max that you can't run it at 4K/60 without a set of Nvidia GTX 1199 Ti's running in quad SLI with a 2 kW power supply. But if the 1080 Ti or comparable is the minimum spec, then I think they've gone too far, since that means pretty much nobody will be able to play the game at all. Which is why I think there will definitely be several rounds of optimization before we see the final spec.

    vortextk

    @rethla said:
    @salarn said:

    @rethla: the E3 demo was done in Unreal, makes sense since REDengine isn't known for making FPS games.

    What? Source?

    An E3 interview Jason Schreier did with a producer(?) from CD Projekt confirmed that the game is running on the next version of their engine, REDengine 4.

    frytup

    @vortextk said:
    @rethla said:
    @salarn said:

    @rethla: the E3 demo was done in Unreal, makes sense since REDengine isn't known for making FPS games.

    What? Source?

    An E3 interview Jason Schreier did with a producer(?) from CD Projekt confirmed that the game is running on the next version of their engine, REDengine 4.

    Obviously. Very odd to claim they'd build the demo in a completely different engine from the one they're using to develop the game. That would be a ton of wasted work and, come on, they've had five years to figure out how to do playable first person in REDengine.

    Here's the Kotaku interview.

    qrdl

    It's beyond me how anybody would think that a company which created an engine from the ground up would have any trouble binding the camera to the character's head. This is like a complete non-issue compared to going from FPP to TPP and having to develop algorithms for camera behavior.

    rethla

    @qrdl said:

    It's beyond me how anybody would think that a company which created an engine from the ground up would have any trouble binding the camera to the character's head. This is like a complete non-issue compared to going from FPP to TPP and having to develop algorithms for camera behavior.

    Not to mention that creating an engine from the ground up means you don't have to go from TPP to FPP...

    Still, I don't expect the first-person action to be stellar in this game; it's the world and the story that are the draw of CDPR games, not the second-rate action.

    John1912

    #60  Edited By John1912

    @ghoti221 said:
    @rethla said:
    @ghoti221: I haven't seen HDR done well in any PC game, and the monitors are crap. Why is that a must-have for Cyberpunk?

    Also, have you tried running Witcher 3 maxed out on anything but a 1080 Ti?

    Running Witcher 2 maxed out is still a feat...

    I just loaded up Witcher 3 on my 1080 to check FPS. More or less set to ultra, without HairWorks obviously. I easily get a stable 55-60 FPS outdoors and in small towns; it dips to around 40-48 FPS in the big cities with large crowds. There is zero chance a 1080 Ti will be the minimum. The minimum will likely be a 970; I can't even see a 980 being the minimum. If you want full 4K with a constant 60 FPS, you'll likely need two 1070s or 1080s. I'd expect 30-45 FPS at 1440p for Cyberpunk on more or less maxed settings. You'll likely have to turn down some options like the highest AA and shadows, which aren't worth it anyway.
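    The resolution reasoning in the post above can be sketched as a crude pixel-count heuristic. This is a back-of-envelope assumption (`estimate_fps` is a hypothetical helper, not anything from a real benchmark tool): when a game is fully GPU-bound, frame time scales roughly with pixel count, so it ignores CPU limits, memory bandwidth, and settings differences. Treat the output as a rough ceiling, not a measurement.

```python
# Rough heuristic: assume frame time scales linearly with pixel count
# when GPU-bound, so FPS scales with the inverse of the pixel ratio.

def estimate_fps(measured_fps, measured_res, target_res):
    """Scale a measured FPS figure from one resolution to another."""
    measured_pixels = measured_res[0] * measured_res[1]
    target_pixels = target_res[0] * target_res[1]
    return measured_fps * measured_pixels / target_pixels

# ~58 FPS measured at 1080p would suggest roughly:
print(estimate_fps(58, (1920, 1080), (2560, 1440)))  # 32.625 -> ~33 FPS at 1440p
print(estimate_fps(58, (1920, 1080), (3840, 2160)))  # 14.5   -> ~15 FPS at 4K
```

    On this heuristic, the "30-45 FPS at 1440p" guess above is plausible for a card that holds around 60 FPS at 1080p.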

    rethla

    @john1912: Yeah, again, by turning stuff off you'll be able to run the game. The 1080 Ti won't be the minimum, I agree.

    Max_Cherry

    This game looks like a worthy successor to Deus Ex.

    deltamind

    Not entirely sure if I'll be able to play this game, unless Santa is satisfied that I haven't been naughty this year. Would a GTX 1060 6GB blow up at 1080p? The game isn't out yet, so I don't think we can conclude anything, right?

    ShaggE

    @deltamind: I'd imagine it'd be fine (of course, this is all speculation, but hey). I haven't found a game yet that isn't perfectly playable at 1080p on my 1050ti. If this is the system crusher people are expecting it to be, I'm sure mid-range cards will have to settle for mostly medium settings at that res, but I'll eat my hat if this game is so demanding that you'd need a high end setup to run it at 1080p. Especially if they really are aiming to release on this generation of consoles.
