How does Hz relate to FPS? A 144Hz monitor, or 240Hz?

Ha
- in Hardware
12

I bought a PC the day before yesterday:

Intel Core i7 8700k
GeForce RTX 2070
16GB RAM

And I'm currently playing on a 60Hz monitor, since I plan to buy a better one this weekend.

I play Fortnite on low-to-mid settings with a stretched resolution (1280x1080).

A coworker claimed that you can only get as many FPS as your monitor has Hz, no matter how strong the PC is.

With my settings I get 240 FPS. Is that possible even though I only have a 60Hz monitor?

Is it just a readout error, or does it show what the PC would actually be capable of?

I obviously don't understand what Hz really means, so please clarify.

The next question would be whether I should buy a 144Hz monitor or a 240Hz one, and why.

Kn

In my opinion, 144Hz is enough, but every eye is different. There are certainly some people who can still tell the difference at 240Hz.

But as I said, you have to judge that for yourself. I think 144Hz makes more sense, because you won't reach 240 FPS in most games anyway.

As for the thing with Hz and FPS:

For monitors the value is always given in Hz; that is the refresh rate, i.e. how many images are displayed per second. The graphics card works independently of your monitor: it might only manage 60 frames per second, but it can just as well calculate 240 fps, for example. The monitor doesn't care; it always shows exactly as many images as its refresh rate allows. Accordingly, it makes little sense to play at 240 fps on a 60Hz monitor, for example, since the monitor will only display every fourth frame.

Bo
1

Your question actually has nothing to do with the poll.

Hertz is the frequency of a periodic signal, i.e. how often it repeats within one second.

A 60Hz panel shows a frame every 16-17ms, a 144Hz panel about every 7ms, and a 240Hz panel roughly every 4ms.
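
Those intervals are just 1000ms divided by the refresh rate; a quick Python loop, purely for illustration, reproduces the numbers:

```python
for hz in (60, 144, 240):
    print(f"{hz}Hz -> a new frame roughly every {1000 / hz:.1f}ms")
```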

FPS is the average number of frames per second your video card generates. Frames are not periodic and sometimes fluctuate very strongly; the frametime is almost never the same. The fps therefore have nothing to do with your monitor's refresh rate.

Since fps and frametime are not periodic, you generally need significantly more fps than the monitor has Hertz so that a freshly rendered frame is actually available at every swap (the monitor's screen change).

Take, for example, a very short sequence of three frames. The first scene is not complicated to calculate and takes only 5ms. The following frame requires significantly more work (the viewport changes, you suddenly have x enemies in front of you, etc.) and takes 10ms, and the third frame takes 20ms. That averages out to 87 fps (3 frames in 35ms is an average frametime of 11.66ms, or about 87 frames in one second).

Now imagine you have a 60Hz monitor, i.e. a swap every ~16ms. Before the first swap even takes place, the second frame is already calculated (at 15ms), so the first swap already shows you the second frame. But at the second swap (after 32ms) the third frame is not yet finished (it takes until 35ms), so the monitor has no new information to display and nothing changes: the second frame simply stays on screen past 32ms. In this example you have an average of 87 calculated fps on your 60Hz monitor, but you were only shown about 40 frames per second.
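
To make that walkthrough concrete, here is a minimal Python sketch (my own illustration, not code from any real display pipeline; it assumes an idealized monitor that shows the newest fully finished frame at each swap, with the frame numbers taken from the example above):

```python
# Frames finish at 5ms, 15ms and 35ms (frametimes 5 + 10 + 20).
finish_times = [5, 15, 35]
swap_interval = 1000 / 60            # ~16.7ms between swaps on a 60Hz panel

last_shown = None
for k in range(1, 4):                # the first three swaps
    t = k * swap_interval
    ready = [f for f in finish_times if f <= t]
    if ready and ready[-1] != last_shown:
        last_shown = ready[-1]
        print(f"swap at {t:.1f}ms: show the frame finished at {last_shown}ms")
    else:
        print(f"swap at {t:.1f}ms: nothing new, the previous frame stays")
```

Three frames were rendered in 35ms (~87 fps on average), but the first frame is never displayed at all and the third one only appears at the third swap around 50ms, which is where the roughly 40 displayed frames per second come from.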

In reality this looks a little different, since partial images are also displayed when new information is available at the swap (which is where tearing comes from), but I hope you have roughly understood what fps and Hertz are and how they relate to each other.

ta

The monitor's Hertz is the maximum number of FPS that monitor can display. A 144Hz monitor will show at most 144 FPS, no matter whether the graphics card delivers 200 or 500 FPS.

And of course, if the video card delivers fewer than 144 FPS, the monitor won't show more frames than that either.

So before you buy a 240Hz monitor, ask yourself whether your PC can deliver 240 FPS at all. I would therefore first invest in a PC that almost continuously reaches more than 144 FPS, and if money is still left over after that, you can consider the 240Hz monitor.

Whether it is necessary or not, you can't really answer in general. I have played Counter-Strike on both a 240Hz monitor and a 144Hz monitor and frankly couldn't see a difference. However, some gamers swear that a 240Hz monitor is much better. So I would say that as a normal player you won't notice the difference, but it may well be that in e-sports such a monitor gives a small advantage.

Bo

In case it is not quite clear yet: high-refresh monitors also do well when you have significantly fewer fps, because more of the rendered information can actually be displayed. In the example above, the 240Hz monitor has a swap every ~4ms, so within the 35ms of the sequence there are 8 swaps, and all three new frames get shown. Extrapolated to a full second, of the 87 fps your GPU rendered you would be shown about 80 fps, twice as much as on the 60Hz panel.
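
Rerunning the same idealized-swap sketch from above at 240Hz (again just my own illustration, with the same 5/10/20ms sequence) shows all three frames reaching the screen:

```python
# The same three frames, finishing at 5ms, 15ms and 35ms.
finish_times = [5, 15, 35]
swap_interval = 1000 / 240           # ~4.2ms between swaps on a 240Hz panel

last_shown = None
for k in range(1, 10):               # swaps up to ~37.5ms, just past the sequence
    t = k * swap_interval
    ready = [f for f in finish_times if f <= t]
    if ready and ready[-1] != last_shown:
        last_shown = ready[-1]
        print(f"swap at {t:.1f}ms: show the frame finished at {last_shown}ms")
```

This prints three lines, one per frame: nothing is discarded, even though 87 fps is far below the panel's 240Hz.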

So first of all, it is not correct to say that a 120Hz/144Hz/240Hz panel is only worthwhile if you also have correspondingly high fps.

And secondly, it is wrong to say that to exploit the potential of a panel with x Hz you would also need x fps… to exploit the full potential, you actually need significantly more fps than the monitor's refresh rate.

Bo

A 240Hz monitor, or even a 120/144Hz one, will display much more information even when you have far fewer fps than the panel has Hertz (compared to a 60Hz panel).

ta

Of course a 144Hz monitor will show you 100 FPS where a 60Hz monitor can't. My answer doesn't claim anything else.

But if your graphics card only manages 30 FPS, a 60Hz monitor will give you exactly the same picture as a 144Hz monitor.

Bo

No, as a rule a 144Hz monitor will show you more frames than a 60Hz monitor even at 30 fps. Above all, remember that without hardware sync a monitor can almost never show the full fps, even when they are below the refresh rate.

ta

That depends on the settings.

At 30 FPS, a 60Hz monitor will certainly show you all the frames, because the frame rate is exactly half the refresh rate.

Of course that's no longer the case at other frame rates, but it depends on what the graphics card does. If it simply spits out the frames as they are calculated, then all frames end up being displayed on the monitor.

The problem you mention, I only know from when VSYNC is activated. In that case there is indeed a problem when the frame rate is lower than the refresh rate of the screen, because individual frames have to wait for the next refresh cycle. The refresh cycle is of course much shorter at 144Hz than at 60Hz, so your point holds when VSYNC is enabled.

If VSYNC is not active, the graphics card doesn't care at all what the monitor is currently doing: as soon as a new image is rendered, it is written into the output buffer and the monitor picks it up from there. That can lead to screen tearing, but the monitor then displays every frame; each frame stays on screen for at least one refresh cycle. Screen tearing is usually not severe at lower frame rates anyway, so low frame rates should always work fine without VSYNC.

Bo

This has nothing to do with frame rate, but with frametime.

Simple example: a 2Hz monitor and a 2 fps signal. As soon as the frametime varies, the 2Hz monitor may only be able to display one of the (full) frames, since it only swaps every 500ms.

Same example with a 4Hz panel, i.e. a swap every 250ms: here the frametime can vary a great deal and the panel still shows both frames without problems.
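
You can sanity-check the 2Hz vs. 4Hz example with the same idealized "show the newest finished frame at each swap" model (a sketch of my own, with made-up frametimes of 600ms and 400ms so that both frames finish within one second):

```python
def frames_shown(finish_times, refresh_hz, duration_ms=1000):
    # Count distinct frames an idealized panel displays within duration_ms.
    swap_interval = 1000 / refresh_hz
    shown, last = 0, None
    t = swap_interval
    while t <= duration_ms:
        ready = [f for f in finish_times if f <= t]
        if ready and ready[-1] != last:
            last = ready[-1]
            shown += 1
        t += swap_interval
    return shown

finishes = [600, 1000]   # 2 fps, but uneven frametimes: 600ms, then 400ms
print(frames_shown(finishes, 2))  # 1 -> the frame finished at 600ms is skipped
print(frames_shown(finishes, 4))  # 2 -> both frames reach the screen
```

The 4Hz panel's extra swap at 750ms catches the first frame before the second one replaces it; the 2Hz panel never gets the chance.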

Or imagine sequences of fast consecutive frames with frametimes well under 16ms, followed by frames with longer frametimes. The average fps can stay below the refresh rate, yet because two or more frames are rendered within a single swap during the fast sequences, some of them get discarded again.

tl;dr: The higher the panel's refresh rate, the more likely the displayed frames are to match the rendered ones; and the more the frametime varies, the stronger this effect becomes.

ta

Yes, the average FPS can, of course.

But if the frame rate is constantly below the refresh rate of the screen, then nothing is lost, because a screen refresh is then always faster than the time it takes to calculate the new frame.

If the graphics card renders slower than the screen refreshes and then faster again, then of course there's a problem, but that is exactly what I meant in my answer.

If my computer never even gets below a minimum frametime of 7ms, then I don't need to think about a 240Hz monitor.

And even if the computer achieves a frametime shorter than 7ms once every 60 frames, the much more expensive monitor still doesn't gain me anything.

Bo

In practice, and considering that partial frames are also output, there is in fact no real difference in an example like 30 fps. But the closer the average fps and the refresh rate get to each other, the more likely such a scenario becomes.

ta

You might well notice a difference if you run the 144Hz monitor at an average of at most 120 FPS.

Nevertheless, as I said, I see no sense in buying a 240Hz monitor if you can't even make full use of a 144Hz one, and that's exactly what my overall answer is about.

It simply does not make sense to run a 240Hz monitor with a weak graphics card because you think you need a fast monitor and then skimp on the graphics card.