Is Titanfall using the DX11 API?

#1  Edited By scaramoosh

According to the paperwork the Xbox One has the more powerful CPU, as it is clocked around 150MHz higher (1.75GHz vs 1.6GHz). Microsoft also went on about it having a dedicated sound chip to take load off the CPU, where the PS4 has to use the CPU, though I don't know if that is true. The Xbox One's GPU is clocked higher too, but it is probably more like 1.31 TFLOPS vs the PS4's 1.84, on paper at least. Can Microsoft clock it even higher? We know the GPU is capable of being clocked to 1GHz, but can it be in the Xbox One? How many resources are being locked off from developers? Could the PS4's advantage in GPU CUs actually be reserved for other things?
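
For what it's worth, those TFLOPS figures fall straight out of the shader counts and clock speeds. A quick sketch, using the publicly reported specs (768 shaders at 853MHz for the Xbox One, 1152 at 800MHz for the PS4, with GCN shaders doing 2 FLOPs per cycle):

```python
# Peak single-precision TFLOPS = shaders * clock * FLOPs per cycle.
# Shader counts and clocks are the publicly reported specs.
def tflops(shaders, clock_mhz, flops_per_cycle=2):
    return shaders * clock_mhz * 1e6 * flops_per_cycle / 1e12

print(f"Xbox One: {tflops(768, 853):.2f} TFLOPS")   # ~1.31
print(f"PS4:      {tflops(1152, 800):.2f} TFLOPS")  # ~1.84
```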

We just don't know all the facts yet, and tbh the bottleneck right now seems to be the ESRAM: on paper it is faster, but there is so little of it to work with. The PS4 seems easier to straight-port to from the PC because it is just unified GDDR5, so more work goes into an Xbox One port. Will developers get the hang of the ESRAM as time goes on? Yes, it'll only get better.

One thing that interested me was Respawn talking about upgrading their engine to DX11 and 64-bit once they knew it was going to be an Xbox One game. Is Titanfall using the DX11 API? Could DX12 be a factor in getting the game to run better? The API could be the major bottleneck for Titanfall.

We know that Killzone Shadow Fall's multiplayer renders at 960x1080 and is then upscaled to 1920x1080. So I don't think we can count the Xbox One out on power yet; I just wonder if developers need to learn how to use the ESRAM more efficiently. The ESRAM buffer is not too small for 1080p. It'll never do 4K gaming, but neither console will; even 780 Ti SLI setups struggle at 4K. I think the ESRAM is just something developers need to work out, and the PS4 is simply easier to port to.
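
To put rough numbers on "not too small for 1080p": a single 32-bit 1080p render target is about 8MB, so a simple colour-plus-depth setup fits in the 32MB of ESRAM, but a fat deferred G-buffer doesn't. A back-of-the-envelope sketch (the target counts here are illustrative, not any particular engine's actual layout):

```python
MB = 1024 * 1024

def target_mb(width, height, bytes_per_pixel=4):
    # One render target at 32 bits (4 bytes) per pixel.
    return width * height * bytes_per_pixel / MB

full = target_mb(1920, 1080)
print(f"One 1080p target: {full:.1f} MB")                # ~7.9 MB
print(f"Colour + depth:   {2 * full:.1f} MB")            # ~15.8 MB, fits in 32 MB
print(f"4 G-buffer targets + depth: {5 * full:.1f} MB")  # ~39.6 MB, doesn't fit

# And Killzone's 960x1080 multiplayer mode is exactly half the pixels of full 1080p:
print(960 * 1080 / (1920 * 1080))  # 0.5
```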

It'll be interesting to see; you cannot write the Xbox One off yet like everyone seems to have done.

What I really want to find out, though, is whether Titanfall is using the DX11 API. That could explain everything, and if they can make the switch to DX12 it could suddenly be a massive change. The low-level Mantle API increased BF4's framerate dramatically; in one benchmark I saw it went from 100FPS to 160FPS with Mantle, a 60% gain right there.
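
The 60% figure from that benchmark checks out, and it's worth seeing what it means in frame time:

```python
before_fps, after_fps = 100, 160  # the quoted DX11 vs Mantle BF4 benchmark
gain = (after_fps - before_fps) / before_fps
print(f"Gain: {gain:.0%}")  # 60%
print(f"Frame time: {1000/before_fps:.1f} ms -> {1000/after_fps:.2f} ms")  # 10.0 -> 6.25
```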

#2  Edited By audioBusting

Uhh, like you said, they talked about using DX11. So... yes??? I'm not sure what part of that isn't clear enough.

Edit: And I'm not sure what you're getting at, really; they upgraded it to DX11 because they're putting it on the Xbox One. You have to use DirectX on the Xbox, as far as I know.

#3  Posted By Nashvilleskyline
DX12 will work on the Xbox One.

"The panel's description highlights how the company plans to deliver on demands of better tools for developers, added performance, and support for an "unparalleled assortment of hardware," with specific mention of PCs, tablets, phones, and consoles"

IGN article

I have high hopes for the Xbox One's future.

#4  Edited By AndrewB

Slightly higher clock speeds on the CPU and GPU side would not make up for the smaller number of hardware shaders on the GPU side. But yes, a lower-level API could help, though at the same time it would make the platform even more proprietary/difficult to develop for. A negative, sure, but developing for the PS4 is just as proprietary* given its unusual setup (unified GDDR5 and OpenGL, compared to the iron grip Microsoft still has on x86 gaming with DirectX and PCs), so I'd be glad to see it happen if it means developers make the most of the hardware.

* I realize that proprietary is the wrong word, given that OpenGL is anything but and DirectX is the definition of it, but I think you get the point. What I was getting at was comparing the ease of porting between the two consoles and the PC.

#5  Edited By scaramoosh

The main thing, though, is that the ESRAM is harder to port to because developers have to optimize for it. On paper it is actually faster than the PS4's GDDR5, but in reality it comes down to the developer, it doesn't really work out that way in practice, and there isn't a lot of it, which makes things harder. So I think it is something we'll see optimized as time goes by; resolution and AA are largely a memory question, so hopefully as developers get more time with the Xbox One, things will improve. That said, on the PS4 Killzone Shadow Fall runs at 960x1080 upscaled, so quite obviously that console struggles to hit 60fps too...

The one thing I want to know, though, is whether Titanfall uses DX11 on the Xbox One. That could explain why it doesn't run as well as it looks like it should: DX11 has proven to carry a lot of overhead, and as a high-level API it isn't like writing to the metal.
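
To make the "API overhead" argument concrete, here's a toy model: if every draw call costs the CPU some fixed amount of driver and validation work, submission alone can blow the frame budget. The per-call costs below are made-up illustrative numbers, not measured DX11 or DX12 figures:

```python
FRAME_BUDGET_MS = 1000 / 60  # ~16.7 ms per frame at 60 FPS

def submission_ms(draw_calls, overhead_us_per_call):
    # CPU time spent just issuing draw calls.
    return draw_calls * overhead_us_per_call / 1000

for api, overhead_us in [("high-level API (hypothetical 10 us/call)", 10),
                         ("low-level API  (hypothetical  2 us/call)", 2)]:
    cost = submission_ms(3000, overhead_us)
    print(f"{api}: 3000 draws -> {cost:.1f} ms of {FRAME_BUDGET_MS:.1f} ms")
```

On the hypothetical numbers above, the high-level API spends 30ms a frame just submitting work, already double the 60FPS budget, while the low-level API spends 6ms.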

That said, Titanfall looks great on my TV with the Xbox One, a lot better than the screenshots suggest.

#6  Edited By scaramoosh

@nashvilleskyline:

DX12 will work on the Xbox One.

"The panel's description highlights how the company plans to deliver on demands of better tools for developers, added performance, and support for an "unparalleled assortment of hardware," with specific mention of PCs, tablets, phones, and consoles"

IGN article

I have high hopes for the Xbox One's future.

The main point I'm making is that if it is using a high-level API, that explains everything; it isn't a highly optimized written-to-the-metal game.

It could be using only something like 40% of the console's power because of the API.

#7  Posted By Cinnase7en

The Xbone's GPU is 1.31 TFLOPS; they've said so and we know this. Increasing performance on PC isn't the issue here, and a Mantle-like API doesn't make much sense considering console APIs are already low-level. The reason for the performance issues is this: a new system with bad tools. We know this; Digital Foundry talked about it in September last year, and several devs have talked about it publicly. A new SDK should help, and patches down the line. Another issue: lots of smoke effects and particles when the titans come out to play. Particles always kick the shit out of a GPU. Plus, it can be the CPU, depends. You'd be surprised how many framerate dips come down to the CPU.
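
On the particles point: the cost of alpha-blended particles scales with overdraw, because the same pixels get filled over and over. A toy sketch against the Xbox One's peak pixel fill rate (16 ROPs at 853MHz; the overdraw depths are illustrative, and real blended fill runs well below peak, so treat these as lower bounds):

```python
PIXELS = 1408 * 792     # Titanfall's reported Xbox One resolution
PEAK_FILL = 16 * 853e6  # ROPs * clock = ~13.6 Gpixels/s theoretical peak

def fill_time_ms(pixels, overdraw):
    # Time just to write the blended pixels, at peak throughput.
    return pixels * overdraw / PEAK_FILL * 1000

for layers in (1, 10, 30):
    print(f"overdraw x{layers}: {fill_time_ms(PIXELS, layers):.2f} ms")
```

Even at theoretical peak, 30 layers of full-screen smoke costs ~2.5ms of a 16.7ms frame in raw fill alone, before any particle shading or the CPU work of simulating them.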

The improvements from Mantle come from the CPU end, since the API is, seemingly, designed to improve CPU efficiency. These are low-end chips, so you'd think that would help the CPU side of the Xbone too, but consoles are already using low-level APIs.

That said, on the PS4 Killzone Shadow Fall runs at 960x1080 upscaled, so quite obviously that console struggles to hit 60fps too...

The one thing I want to know, though, is whether Titanfall uses DX11 on the Xbox One.

To be fair, Killzone Shadow Fall's MP was clearly designed to run at the same resolution and framerate as the single-player, which was 1080p locked at 30. But people whinged and complained about that, so they unlocked the framerate for the SP and then did what they could in the time they had to get the MP framerate as high as possible.

It's unlikely they are using a high-level API like that on the Xbone, but maybe they are? This early in a console cycle, devs do whatever they have to do to get their game running as fast as possible. We won't know unless they say.