Mark Cerny on the PS5 Pro, Flopflation and PlayStation’s partnership with AMD

Today, Sony released a new technical presentation from PlayStation architect Mark Cerny, taking a deep dive into the PS5 Pro – specifically the system’s new AI upscaling technology, PlayStation Spectral Super Resolution. During the lecture, Cerny also announced that Sony and AMD are entering into a strategic partnership, called Project Amethyst, with the aim of advancing game graphics through machine learning – i.e. the technology behind Sony’s PSSR and AMD’s FSR.

Last month, I had the opportunity to visit PlayStation headquarters to preview today’s presentation and sit down with Cerny to get answers to a few of our lingering questions about the PS5 Pro, as well as more details about Project Amethyst – and how it could be relevant not only to the future of game graphics on PlayStation, but also to Xbox and PC.

The following conversation has been lightly edited for length and clarity.

IGN: You talked about the difference between RDNA 2 and RDNA 3 and how you wanted to stick with RDNA 2 because you didn’t want to create a lot of work for developers upgrading to RDNA 3.

Mark Cerny: So it’s RDNA 2.X, right? So we asked which features we could take from RDNA 3 that wouldn’t cause a lot of work for developers – and then we took them.

My question is: why is that a problem on console, whereas on PC you can just plug in a new-generation GPU and games just work?

Drivers in the console world tend to be very, very thin. That’s seen as one of the advantages of consoles – you can take full advantage of all the hardware features. But it means that when you go from generation to generation, there’s more work. You’ll see problems in the PC world too, though. If you have a new card, you either have to compile all your shaders live, which of course can lead to problems, or you have to have a strategy for delivering all of those shaders already compiled for the new GPU.
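To make that tradeoff concrete, here is a minimal, hypothetical sketch – not a real driver or PlayStation API, all names are invented – of a cache of GPU-specific shader binaries that either serves a precompiled binary or falls back to compiling live, which is the case that can cause runtime hitches.

```cpp
#include <cstdint>
#include <string>
#include <unordered_map>
#include <vector>

// Hypothetical types, purely for illustration.
using GpuId = std::string;                 // e.g. an architecture/revision identifier
using ShaderBinary = std::vector<uint8_t>;
struct ShaderSource { std::string text; };

// Stand-in for a slow, GPU-specific shader compiler.
ShaderBinary compile_for_gpu(const ShaderSource& src, const GpuId& /*gpu*/) {
    return ShaderBinary(src.text.begin(), src.text.end());  // placeholder "compilation"
}

class ShaderCache {
public:
    // Strategy 2: deliver binaries already compiled for the installed GPU.
    void preload(const GpuId& gpu, const std::string& name, ShaderBinary bin) {
        cache_[gpu + "/" + name] = std::move(bin);
    }

    // Strategy 1 as the fallback: compile live on a cache miss,
    // which is where runtime hitches can come from.
    const ShaderBinary& get(const GpuId& gpu, const std::string& name, const ShaderSource& src) {
        const std::string key = gpu + "/" + name;
        auto it = cache_.find(key);
        if (it == cache_.end()) {
            it = cache_.emplace(key, compile_for_gpu(src, gpu)).first;
        }
        return it->second;
    }

private:
    std::unordered_map<std::string, ShaderBinary> cache_;
};
```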

I love the term flopflation. Did you come up with that expression? Or is it something from the marketing department?

Yes, that was me. I’m not sure how much credit I want to take for it, but yeah, I came up with it.

So, on that subject, how do you feel about the general teraflops arms race? I know there are many different factors that go into game performance, but on the other hand, the general public likes having a hard number they can latch onto. What’s your take on that?

In fact, I don’t see that happening in the PC world. In the PC world, I don’t think consumers talk much about teraflops. It seems to be only for consoles and only in recent generations. As I’ve said many times, teraflops are not a good indication of GPU performance.

This isn’t about creating proprietary technology for PlayStation – the goal is to create something that can be used broadly across PC and console and cloud.

Tell me more about the Amethyst partnership. What kind of information is shared between the companies? And how does this partnership differ from the existing relationship between SIE and AMD – you’ve built the PS5 and previous consoles on their hardware, so what’s new?

So first I have to explain the nature of the collaboration. There are two goals we’re working on with AMD. One is better hardware architectures for machine learning – and it’s not about creating proprietary technology for PlayStation; the goal is to create something that can be used broadly across PC and console and cloud. The second is these lightweight CNNs for game graphics – so, you know, the kind of thing that’s used in PSSR and maybe the kind of thing that would be used in a future FSR.

So that’s the broad nature of the collaboration. What kind of information, then, is shared between the companies to achieve these goals?

We collaborate directly on both of these goals.

Does that mean we can expect the results of that collaboration to be reflected in future AMD hardware that isn’t necessarily PlayStation hardware?

Absolutely. This is not about creating proprietary technology or hardware for the PlayStation.

To expand a step even further, does this mean it could potentially be used for Xbox hardware?

It can be used by anyone who wants to use it.

Who started the partnership? Was it Sony who wanted to get more in bed with AMD? Or did they come to you and say how can we learn from your experience on the gaming side?

Well, they’re a long-term partner, and it was very clear that we were going after similar goals.

Let’s go back to the PS5 Pro. I want to talk about some of the internals. You didn’t mention the CPU at all in the presentation today. Is there any difference between the PS5 Pro’s CPU and the base PS5’s?

There are minor improvements throughout, and the CPU clock frequency is one of them. If you want the CPU to run at a 10% higher clock frequency, you can have it run at a 10% higher clock frequency.

And it is?

That would be 3.85 GHz.
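[For context: the base PS5’s CPU runs at up to 3.5 GHz, and a 10% bump works out to 3.5 GHz × 1.10 = 3.85 GHz.]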

Architecturally, it’s still on…

Zen 2. It’s the same Zen 2 CPU.

As for the AI upscaler that you use for PSSR – is that a discrete piece of hardware, or is it built into the GPU itself?

We needed hardware with very high performance for machine learning, so we went in and modified the shader core to make that happen. Specifically, in terms of what you touch on the software side, there are 44 new machine learning instructions that take a freer approach to accessing RAM – really using the register RAM as RAM – and that also implement the math needed for the CNNs.

To put it differently, we improved the GPU. But we didn’t add a tensor unit or anything to it.
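For a rough sense of what “the math needed for the CNNs” boils down to, here is a plain C++ sketch of the multiply-accumulate pattern at the heart of a lightweight convolutional layer, over a small tile standing in for register-resident data. The actual PS5 Pro instructions and their semantics aren’t public; this illustrates the workload, not the hardware.

```cpp
#include <array>
#include <cstddef>

// Illustrative only: a 3x3 convolution over an 8x8 tile held in local arrays,
// showing the multiply-accumulate loop a lightweight CNN layer reduces to.
constexpr std::size_t kTile = 8;

using Tile = std::array<std::array<float, kTile>, kTile>;
using Kernel3x3 = std::array<std::array<float, 3>, 3>;

Tile conv3x3(const Tile& in, const Kernel3x3& k) {
    Tile out{};  // borders stay zero for brevity
    for (std::size_t y = 1; y + 1 < kTile; ++y) {
        for (std::size_t x = 1; x + 1 < kTile; ++x) {
            float acc = 0.0f;
            for (int dy = -1; dy <= 1; ++dy)
                for (int dx = -1; dx <= 1; ++dx)
                    acc += k[dy + 1][dx + 1] * in[y + dy][x + dx];
            out[y][x] = acc;
        }
    }
    return out;
}
```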

You mentioned frame generation, but it sounded like that wasn’t the focus of the machine learning aspects of the PS5 Pro. Is there any kind of frame generation technology in the PS5 Pro?

On the PS5 Pro, at the moment, all we have is PSSR, which is super resolution. We are incredibly interested in all the things that can be done with machine learning to make game graphics better. And I think it’s pretty obvious to everyone that frame generation, frame extrapolation, and ray-tracing denoising are also very interesting goals.

Ray tracing is significantly more performance-demanding than traditional rasterization and traditional reflection and lighting techniques. Do you feel that the gaming community’s interest in ray tracing, which you mentioned in the talk, justifies continuing development in that direction?

Ray tracing is not one thing, it is many things. You can use ray tracing for audio queries – that’s pretty cheap – or you can use ray tracing to enhance your lighting, which is a bit more expensive, but not terribly so. You can do reflections, and because the reflections can be rendered at a lower resolution, that can also be done without breaking the budget. And then if you keep going, you end up with path tracing, where everything is basically built on ray tracing, and if you don’t have incredibly performant ray tracing hardware, you won’t get very far with that. So what we do is provide tools to the developers and let them figure out where on that spectrum they want to be.
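As an illustration of that spectrum, here is a hypothetical per-title settings struct – not a real PlayStation SDK interface – with the relative costs Cerny describes noted in comments.

```cpp
// Hypothetical configuration, purely to illustrate the cost/quality spectrum
// from cheap ray-traced audio queries up to full path tracing.
struct RayTracingBudget {
    bool  audio_occlusion_queries = true;   // cheap: a few rays per sound source
    bool  rt_lighting             = false;  // moderate: supplements rasterized lighting
    bool  rt_reflections          = false;  // affordable if traced at reduced resolution
    float reflection_scale        = 0.5f;   // e.g. half resolution to stay in budget
    bool  path_tracing            = false;  // expensive: needs very fast RT hardware
};
```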

In the development of the PS5 Pro, which started about four years ago… knowing what you know now about the advances you found along the way and everything that has happened in the last four years, what is the main thing you would have liked to get into the PS5 Pro, technology-wise, that you weren’t able to – that is now “next on your hit list”?

I can tell you, building the machine learning hardware for the PlayStation 5 Pro was an incredible education. And instead of saying “oh, we would have done it differently,” I look at it as: we now have a good understanding of how this works, and as a result everything we do in the future has significantly more potential.

We now have a good understanding of how this works, and as a result everything we do in the future has significantly more potential.

You also mentioned the prospect of building it yourself instead of buying or outsourcing the technology. Can you elaborate on the thought process there?

A very simple way to look at this is: do we take the next AMD technology on the roadmap, or do we actually go in and try to design the circuitry ourselves? We chose the latter, because we really wanted to start working in that space ourselves. It was clear that the future was very ML-driven. And, you know, the world is talking about LLMs and generative AI, but I’m really mostly just looking at game graphics and the boost for game graphics that we can get. That’s why we wanted to work in that space.

More generally, what are your thoughts on the world’s recent fascination with artificial intelligence as a buzzword?

Well, we live in very interesting times.

That’s for sure.

I would say these are very different questions. So you might see something like a smartphone coming out with AI capabilities or a laptop coming out with AI capabilities, and it might not be immediately clear what those capabilities will bring to you as a consumer. But the console space is nice and simple for this, because there are several things that have already been shown to be of great benefit – if you just have sufficiently powerful hardware.

Jumping to the future, you said console development is about a four-year process. Does this mean we can take it as a hint that work has begun on the PS6?

We are not discussing PS6 at this time.

Can we expect it in about four years?

Same answer.