Test results for the RTX 4070 Super have leaked

Permalink
Member

NVIDIA CES special address https://www.youtube.com/watch?v=-JbSg2UnK2k
The new Super cards, ray tracing in Diablo IV, and more.

Now that we know all the specs and approximate Swedish prices - is it worth upgrading from an RTX 3070 to the new 4070 Ti Super, or better to wait a year for, say, the 5070/5080...?
I recently upgraded my main monitor from ultrawide to super ultrawide (Odyssey OLED G9) and lost a fair bit of performance in the process. Before that, the 3070 worked fine for my needs.

Permalink
Member
Quote from Donnerstal:

G-Sync Ultimate has nothing to do with adaptive sync. Essentially, all Ultimate means is that the monitor supports HDR1000. The G-Sync module is found in monitors labeled G-Sync (including Ultimate, but not Compatible). The module has no real advantage when it comes to adaptive sync, though, which is what most people think of when talking about G-Sync/FreeSync.

So a FreeSync monitor handles adaptive sync just as well as one labeled G-Sync Ultimate.

That's not quite right, though. The G-Sync module has an advantage over FreeSync that means G-Sync will likely always be better, even in the future.
Quoting a good explanation of why from the user "cwm9" on Reddit:

"So, I think the misunderstanding comes from what GSYNC and FREESYNC actually do.

I'm not 100% sure about all of this, but I think what I'm about to write is correct.

If all GSYNC and FREESYNC did was to send multiple copies of a frame to a monitor, why even require the hardware?

Because that's not what's going on (generally).

Traditionally, the chip inside a monitor expects data to come in from the computer at regular intervals. It then pipes that data to the LCD to refresh it, with a small buffer to account for any timing differences between the two devices and possibly a scaling step. (This is where display lag comes from.)

GSYNC and FREESYNC do away with the idea of expected intervals, and simply wait to update the display until the next frame arrives.

But there's a hitch: the panel HAS to be updated within a certain number of milliseconds in the exact same way that memory has to be refreshed every so many milliseconds, otherwise the pixels will lose their memory of what they were supposed to be displaying.

And here's where the two differ: FREESYNC doesn't have any on-board memory that stores the previous frame. It only has the ability to delay updating the display until the next frame starts being transmitted. So, when time "runs out" for the panel, the computer has to resend the frame from the video card's buffer over the DisplayPort cable. Of course, that takes a finite amount of time --- and that introduces one standard display frame of lag.

GSYNC, on the other hand, already has the prior frame in memory. It doesn't have to wait for the frame to be transmitted by the video card again, it simply refreshes the display from the internal buffer.

Now, so far, it sounds like there shouldn't be any difference, and indeed, there isn't any difference at this point from the perspective of the user. The frames haven't changed, so you don't even see anything happen.

But the key is, what happens during the next frame?

That will depend on when the next frame is finished by the video card.

If the next frame happens to be available right after the panel refresh starts, in FreeSync monitors the DisplayPort is busy even though the frame is available. The video card can't start transmitting the next frame until the previous frame has been sent.

Contrast that with GSYNC: GSYNC has in-built memory, so when the frame is available, the video card can immediately begin transmitting the frame to the GSYNC module, even though the GSYNC module is currently updating the panel.

When the panel has been refreshed with the duplicated frame, GSYNC is able to immediately begin displaying the next frame without having to wait for the GPU.

Now, it's not like the GSYNC module can just interrupt the refresh and display the new data: that would result in a tear. But what it can do is begin displaying the next frame immediately and push every other incoming frame out slightly in time --- basically introduce a little extra lag, but make sure that every frame gets displayed --- until the GPU gets slightly behind and the buffer empties.

FREESYNC will kind-of do the same thing, but because of the extra delay introduced by having to transmit over the DisplayPort, it will lag the performance of GSYNC by one display-lag interval.

This, by the way, is why it's important to set your in-game max FPS to 1 less than what your monitor can actually handle when using GSYNC. (119 FPS instead of 120 FPS.) If your GPU hiccups and has to double a frame, the small amount of extra time that 1 FPS leaves allows the module to catch up within 1 second, even if your GPU is spitting out frames at full speed. (If you drop it to 118 FPS, it catches up within 1/2 a second.) Without doing this, the display can potentially get up to 1 full frame behind and stay that way until the GPU falls below full framerate for a moment, defeating the whole point of having a high-FPS monitor.

What about FREESYNC? The only way it can catch up is by altering the next frame. That is, the driver has to be aware that a frame is still waiting to be transmitted, and the software has to calculate the next frame update with that lag accounted for.

GSYNC is all built into one module, and no cooperation is required between the module and the video card. Additionally, the module is physically wired to the panel, so there is no lag introduced when retransmitting.

FREESYNC requires cooperation between the display, the video driver, the software drivers of the video card, and the game, and it introduces lag both from communication with the video driver and from retransmission over the DisplayPort.

The two should, under ideal circumstances, perform nearly identically, with the GSYNC module beating the FREESYNC system by a few milliseconds at most (something like 1-10 ms). But if the monitor and the video card or video card drivers don't play well together, things can fall apart completely, resulting in very nasty glitches and dropped frames.

Another MAJOR difference comes into play if the video card is (why would you do this?) intentionally set to a framerate below that of the monitor and a fixed video source (rather than a game) is being displayed. Suppose you are watching a movie and set your video card's output framerate to 48 Hz. Now suppose your video card hiccups, requiring a display panel refresh. FREESYNC must resend a frame, but now it's a full frame behind, and it can't ever fix this fact because there's no "extra time" in which to display the lagging data. The only fix is to skip the frame or adjust the audio to match.

Compare that to GSYNC: GSYNC is perfectly aware that the display can handle 120 Hz, and when it receives the late frame, it is able to immediately transmit the data to the display. Because the display is so much faster than the movie, the following frame will be displayed on time. Yes, one frame of movie video will be delivered, say, 10 ms late, but the frame is normally 41 ms long, so who cares... the gap for the next video frame will be only 31 ms and you'll be caught up --- a minor hiccup you won't even notice.

Of course, it's silly to intentionally set your framerate that low --- simply leave your framerate set normally and play the movie using a player that is able to work with FreeSync. Then the player will be able to catch up, just as it would with GSYNC.

Does any of this really matter? Well, if you can tell that your game or movie is de-synched by 10 ms for a few frames, then, I guess? As long as the display, card, and video drivers all play nice with each other, there's very little difference. We're talking, worst case scenario, at 120 FPS while gaming, a difference in lag of 10 ms between the two, and more typically just a few ms. (That's 1 full frame plus a few ms of buffering.) If you set your FPS to 24, then it could be up to 44 ms, but who would do that?

TLDR: GSYNC will always work and will always beat FreeSync by at least a few milliseconds, and potentially by up to one full frame at the designated FPS. GSYNC catches up naturally, with no help from the computer, when a frame is missed (if the FPS set for the game/display mode is at least 1 FPS below what the display is capable of). FREESYNC requires software cooperation to catch up. FreeSync only works properly if the monitor, video card, driver, and software all play nice with each other. GSYNC always works."
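The timing arithmetic in the quoted post can be sanity-checked with a short sketch. The 2 ms of buffering overhead below is an assumed figure for illustration, not from the post:

```python
# Catch-up time when the in-game FPS cap sits below the display refresh.
# One doubled frame puts the module 1/refresh_hz seconds behind schedule;
# each capped frame claws back (1/cap_fps - 1/refresh_hz) seconds of slack.

def catch_up_seconds(cap_fps: float, refresh_hz: float) -> float:
    deficit = 1.0 / refresh_hz                           # one doubled display frame
    slack_per_frame = 1.0 / cap_fps - 1.0 / refresh_hz   # spare time gained per frame
    frames_needed = deficit / slack_per_frame
    return frames_needed / cap_fps                       # frames -> seconds at capped rate

print(round(catch_up_seconds(119, 120), 3))  # 1.0, matching "catches up within 1 second"
print(round(catch_up_seconds(118, 120), 3))  # 0.5, matching "within 1/2 a second"

# Worst-case FreeSync penalty: one full display frame of retransmission
# plus a few ms of buffering (2 ms assumed here).

def worst_case_lag_ms(refresh_hz: float, buffering_ms: float = 2.0) -> float:
    return 1000.0 / refresh_hz + buffering_ms

print(round(worst_case_lag_ms(120)))  # ~10 ms at 120 Hz
print(round(worst_case_lag_ms(24)))   # ~44 ms at 24 Hz
```

Capping even one FPS below the refresh rate gives the module a sliver of slack every frame, which is why the catch-up time halves again at 118 FPS.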

Permalink
Member
Quote from wibbe44:

That's not quite right, though. The G-Sync module has an advantage over FreeSync that means G-Sync will likely always be better, even in the future.


So G-Sync is half a frame faster on average? But only in cases where a frame gets doubled, i.e. less than all the time.

At 120 FPS that's maybe a 2 ms difference (half of half of 8 ms), though somewhat uneven depending on the frame flow.

I wouldn't call that a decisive difference.

Many of the fastest and best monitors don't even have a G-Sync module, so a module requirement restricts your choice of monitor quite severely. Not even worth taking into account, I'd say.

That the monitor has some sync technology, and that it works with your graphics card, is of course good, but even for NVIDIA owners it's not a marked difference.
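The ~2 ms estimate at 120 FPS works out roughly as follows (a sketch; the at-most-half-the-time doubling factor is the estimate's own assumption):

```python
# Back-of-envelope average lag difference at 120 FPS: the one-frame
# retransmit penalty (~8.3 ms) averages half a frame when it occurs, and
# doubled frames occur at most half the time - "half of half of 8 ms".

frame_ms = 1000.0 / 120.0           # ~8.33 ms per frame at 120 Hz
avg_penalty_ms = frame_ms / 2 / 2   # half a frame, at most half the time
print(round(avg_penalty_ms, 1))     # ~2.1 ms
```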

Permalink
Member
Quote from medbor:

So G-Sync is half a frame faster on average? But only in cases where a frame gets doubled, i.e. less than all the time. At 120 FPS that's maybe a 2 ms difference.

I just bought a used monitor with a G-Sync module, and the worst part is that it has to have a damn fan inside, which I didn't know when I bought it. Luckily it isn't super loud, but the monitor generates noticeably more heat and gains an extra point of failure in the fan, which isn't exactly easy to source new. And if you want to replace the thermal paste, you have to open up the whole monitor.

Permalink
Member
Quote from medbor:

I wouldn't call that a decisive difference. Many of the fastest and best monitors don't even have a G-Sync module, so a module requirement restricts your choice of monitor quite severely.

Still better than FreeSync, though, which was what was being discussed. If you want the best, it's G-Sync with the module that counts.

Permalink
Member
Quote from wibbe44:

Still better than FreeSync, though, which was what was being discussed. If you want the best, it's G-Sync with the module that counts.

If you want the best, surely a datacenter card is what counts, along with at least a 32-core Threadripper.

No, the best you can buy is what you can afford and what gives the best features for the money. I doubt the G-Sync module is often the best value for the money, or even sits in the monitor that gives the best experience.

Permalink
Member
Quote from medbor:

If you want the best, surely a datacenter card is what counts, along with at least a 32-core Threadripper.

No, the best you can buy is what you can afford and what gives the best features for the money. I doubt the G-Sync module is often the best value for the money, or even sits in the monitor that gives the best experience.

Neither datacenter cards nor money were involved here, only the technologies, FreeSync vs G-Sync. We now know which is best without having to drift off into a bunch of "ifs, buts, money" and so on. That leads nowhere constructive, and it isn't relevant either, since only the technologies in question were the topic.
Even if the difference is small, it is there, which can be decisive if you really want the best.

Permalink
Member

If the G-Sync module were so incredibly good that it made a substantial difference, more manufacturers would build monitors with it. Instead, adoption declines every year, which suggests that neither customers nor manufacturers are willing to pay for the feature, i.e. the difference in perceived quality is too small.

I'd also add that I used a G-Sync Ultimate monitor for several years before the label was watered down (the HDR1000 requirement was dropped). In parallel I have also used FreeSync/G-Sync Compatible, and I doubt I could tell which is which in a blind test.

Regards

Signature

Asus ROG Strix B650E-F - AMD Ryzen 7 7800X3D - 32GB DDR5 - Galax RTX 4090 Hall Of Fame OC Lab - Corsair MP700 - WD Black SN850 - WD Black SN850X - Samsung QVO 870 - WD Black HDD - Noctua NH-U12A Chromax - Fractal Design Define 7 - Seasonic Prime Ultra Gold 1000W - Alienware AW3423DWF QD-OLED