Fluctuating physics update count per render frame #15270

Closed
1 task done
zmanuel opened this issue Jan 2, 2018 · 11 comments

Comments

@zmanuel
Contributor

zmanuel commented Jan 2, 2018

Godot version:
3.0-beta2, but goes back to 2.1 at least

OS/device including version:
Linux x64, Ubuntu 16.04.3 LTS
Nvidia driver 384.90, GeForce GTX 970

Issue description:
Not exactly a duplicate of #10388, but the same situation with a worse result. The conditions are:

  • The render framerate (the monitor refresh rate if vsync is enabled and conditions are ideal) and the physics framerate are almost, but not exactly, equal
  • The measured render frame time fluctuates

The issue is the simple `while (time_accum > frame_slice)` loop in Main::iteration() that determines the number of physics updates. Usually, if the two framerates are almost equal, the loop runs exactly once, because the comparison yields stable results every frame. But over time, due to the slight framerate difference, time_accum at the start of the update drifts upwards or downwards. When it drifts close to 0 or frame_slice, the loop runs zero times or twice. If that happened only once, it would be fine, but unfortunately the variable measured frame time then comes into play and the loop comparison becomes unstable from frame to frame, leading to many subsequent render frames with fluctuating numbers of physics updates (0, 1 and 2 if the monitor refresh rate is higher than the physics rate; 1, 2 and 3 if it is lower). That looks ugly for a bit. The countermeasures discussed in #10388 do help here, too, but I think we can do better by default.
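
For illustration, here is a minimal standalone sketch of the kind of accumulator loop described above (not the actual Godot source; `do_physics_step()` and the constants are placeholders):

```cpp
// Sketch of a fixed-timestep accumulator loop of the kind described above.
// Names and constants are placeholders, not Godot internals.
static const double physics_fps = 60.0;
static const double frame_slice = 1.0 / physics_fps; // fixed physics step size
static double time_accum = 0.0; // remainder carried over between render frames

static void do_physics_step(double delta) {
    (void)delta; // stand-in for the engine's fixed-step physics update
}

int iteration(double measured_frame_time) {
    time_accum += measured_frame_time;

    int physics_steps = 0;
    while (time_accum > frame_slice) {
        do_physics_step(frame_slice);
        time_accum -= frame_slice;
        ++physics_steps;
    }
    // With nearly equal render and physics rates this is usually 1, but once
    // time_accum hovers near 0 or frame_slice and frame-time noise kicks in,
    // it flips between 0 and 2 (or 1 and 3), which is the jitter reported here.
    return physics_steps;
}
```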

Steps to reproduce:
Make sure your monitor refresh rate and the physics framerate are only a tiny bit apart (setting the physics update to 59 or 61 in the project settings is too much of a difference). Run the attached project for a bit. It measures the recent max and min number of physics steps per render frame. It also shoots a string of bullets that move a bit down on every _process() and a bit to the right on every _physics_process(), making differences between the two visible as the stream becomes jagged. Obviously nothing anyone would do in a real project.

The test project also shows the recent max and min render frame times so you can see the fluctuation.

The ideal display when the two rates are exactly equal is this:
[screenshot: ideal]
The stream is a nice diagonal line, and min and max physics steps are both 1. However, sometimes (for me, once or twice a minute), the display changes and shows this:
[screenshot: jitter]
The physics update cadence becomes something like 1, 1, 1, 0, 2, 1, 0, 2, 0, 2, 0, 1, 1, 1, 1 physics frames per render frame, and it can go on for much longer than that. For me, with a refresh rate slightly above 60, the min and max updates are 0 and 2; for someone with a slightly lower refresh rate, they would be 1 and 3.

Now, one total dropped physics frame is of course inevitable if you insist on correct timing in your game, but it should look like this:
[screenshot: jitter_fixed]

I implemented a primitive first fix attempt here: zmanuel@90a11ef; the last screenshot was taken with that fix.
The idea of the fix is to simply add some hysteresis that makes the updater prefer physics step counts close to the one used in the previous frame.
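
To make the idea concrete, here is a rough sketch of one possible hysteresis rule (this is not the code in zmanuel@90a11ef; the names and the tolerance parameter are made up for illustration):

```cpp
// Rough sketch of a hysteresis rule: if the raw accumulator only barely
// crosses a step boundary, keep the step count used in the previous frame.
// Not the actual patch; the tolerance parameter is hypothetical.
static int last_physics_steps = 1;

int steps_with_hysteresis(double time_accum, double frame_slice, double tolerance) {
    int steps = (int)(time_accum / frame_slice); // raw step count

    if (steps > last_physics_steps &&
        time_accum - steps * frame_slice < tolerance * frame_slice) {
        // Only just crossed the boundary upwards: stick with the old count.
        steps = last_physics_steps;
    } else if (steps < last_physics_steps &&
               (steps + 1) * frame_slice - time_accum < tolerance * frame_slice) {
        // Only just dropped below the boundary: stick with the old count.
        steps = last_physics_steps;
    }

    last_physics_steps = steps;
    // The caller still subtracts steps * frame_slice from its accumulator,
    // so small surpluses or deficits correct themselves over later frames.
    return steps;
}
```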

The fix does not handle the case where the physics update is 30 fps and the display refresh is close to 60 Hz (or, more relevant to the hardcore, where the physics update is 60 fps and the monitor is close to 120 Hz):
[screenshot: sadly_no_fix_for_30fps_physics]
If there is demand, I can extend the fix to that situation. It would then not just stabilize frame-to-frame fluctuations, but also fluctuations across groups of render frames. That would probably require a helper class for clarity.

Minimal reproduction project:
timer_test.zip

  • I searched the existing GitHub issues for potential duplicates.
@Causeless

Causeless commented Jan 30, 2018

I'd like to mention that this and #10388 are fundamentally different issues - you say "The countermeasures discussed in #10388 do help here, too, but I think we can do better by default" as if render interpolation were a hack to work around this issue.

Render interpolation is something we should have regardless, given that we cannot always guarantee a 100% static render frametime of 16⅔ms.

That aside - I would argue that this fix is really more of a hack than render interpolation.

The point of physics and render is that they are fundamentally disconnected in timing, and there's not really any issue with a non-synchronized render and physics timestep. This fix unnecessarily ties the render and physics timesteps together. They do not - and indeed should not - need to be tied together, and once physics runs in a different thread altogether it's immediately obvious that tying them together in this way isn't correct.

@zmanuel
Contributor Author

zmanuel commented Jan 30, 2018

Umm, threads? That has nothing to do with this. Yes, the physics update may happen in a different thread, so say the docs (I couldn't find anything in the source, but then I'm currently looking at 2.1), but it nevertheless has to sync up with the main thread after every update. If it didn't, render interpolation couldn't work either. Besides, the two timers are already tied together.

Also, while render interpolation would indeed largely do away with the visual glitches, it's no silver bullet and can't be applied to every game. It would be poison for precision platformers and fighting games. Those are best if what's shown on the screen is an exact representation of the current gamestate (well, a well-defined recent gamestate anyway, thinking about inevitable queues). That all assumes the screen refresh and physics framerate are identical; if they're not, well, interpolation would be the lesser evil.

And render interpolation is a hack here :) It does not fix the fact that you have two physics updates in one render frame, zero in the next, then two again. Think about where input happens: outside of the iteration() call (some is threaded, but again, that doesn't help). Think about where the game physics can react to input: not in the update with zero physics steps. During the periods of janky updates, you've effectively reduced your input response to that of a stuttery 45 fps game, give or take. With interpolation, players are unlikely to notice consciously, but if the game is timing sensitive, it will feel just a little bit jankier than it needs to.

And that's all this is about. The update is a little bit jankier than it needs to be. The resulting game has a little less fidelity than it could have. I do realize that I'm probably a little too obsessed with these little things, but it's like taking a pristine, beautiful pixel font and rendering it offset by 1/3 of a pixel, filtered so it still gets blurred - and don't get me started on that :)

@Causeless

Causeless commented Jan 31, 2018

If the physics update is executed in parallel to the render, in a separate thread, the simulation tick doesn't necessarily need to be tied to and synchronized with the render (although this would incur input lag).

I believe Bullet has functionality that allows buffering of simulation states (at the very least it double-buffers, though I think you can use MotionStates to triple-buffer or more), which means that it isn't strictly necessary to tie the physics frame end to the render frame end. You could start simulating the next physics frame immediately and only render with the latest complete frame, albeit at the cost of incurring input lag.
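
As a sketch of that buffering idea (my own illustration, not Godot's or Bullet's internals; the class and helper names are made up): a custom btMotionState can keep a copy of the last completed transform for the render thread to read while the physics thread keeps simulating:

```cpp
// Illustration only: a btMotionState that hands the render thread the most
// recently completed transform, so rendering does not have to wait for the
// physics step currently in flight.
#include <LinearMath/btMotionState.h>
#include <mutex>

class BufferedMotionState : public btMotionState {
    btTransform current;      // last transform completed by the physics thread
    mutable std::mutex mutex; // guards current across the two threads

public:
    explicit BufferedMotionState(const btTransform &start) : current(start) {}

    // Bullet queries the body's initial transform through this.
    void getWorldTransform(btTransform &worldTrans) const override {
        std::lock_guard<std::mutex> lock(mutex);
        worldTrans = current;
    }

    // Bullet calls this on the physics thread after each simulation step.
    void setWorldTransform(const btTransform &worldTrans) override {
        std::lock_guard<std::mutex> lock(mutex);
        current = worldTrans;
    }

    // Render thread: grab the latest complete transform for drawing.
    btTransform render_transform() const {
        std::lock_guard<std::mutex> lock(mutex);
        return current;
    }
};
```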

This is what a few engines, such as Naughty Dog's engine, do. I think their engine allows render and simulation deviations of up to three ticks before it starts sleep()ing to sync. That said, their games are console third-person shooters with big, heavy animations, so obviously they can tolerate a fair bit of input lag.

I think at this point we're really just coming at this from entirely different perspectives, however. For input-lag-sensitive games, everything I'm suggesting is a truly awful idea. Ideally both systems should be implemented, although I wouldn't really call render interpolation a hack: render state interpolation is a feature supported by most physics engines and, by extension, most game engines. It's on by default, and expected, in most cases.

@ghost

ghost commented Feb 1, 2018

I'm creating a simple 3D game and I have this problem: when shooting a weapon or moving a rigidbody character, I notice some lag after a few seconds.
Then I found this issue, downloaded the project, and this is what I got:

[screenshot: physic]

It lags exactly like my game, exactly like the picture: about once per second everything stops for less than 1/10 of a second, then continues, then stops again, and so on.

Edit :
8 GB RAM
NVidia GTX 980
i7 4790 K
Godot 3 Stable using Bullet

@ghost

ghost commented Feb 2, 2018

And here is a video showing the problem:

https://www.youtube.com/watch?v=hhAHV6CQgag

@zmanuel
Contributor Author

zmanuel commented Feb 3, 2018

CCODE05B: Those horizontal ticks in the 'graph' say that the app is genuinely freezing up for a bit without rendering anything. I'd say it's unrelated, and you should check for some nasty background process that blocks the GPU every second, like a silly large dynamic desktop background with a ticking clock (the freezes really do come at roughly 1 s intervals). CPU hogs are also a possibility.

Causeless: Yeah, it's possible to increase physics throughput by allowing a background physics process to run a bit ahead and feeding script input to it with a delay. This would need to be optional, though, because you then have to be very careful about how you handle two-way interactions between scripts and physics.

I've been thinking a bit; my patch is essentially input value sanitization. And before doing that, one should first check whether the input data is as good as it can be. It isn't. Timer values are taken at a pretty much arbitrary point of the main loop (well, at the beginning of the function, but since it's run in a loop, that too is an arbitrary point) and measure CPU time.
A couple of possibly better timers:

  • There seems to be an API for measuring when a GPU command completed: http://www.lighthouse3d.com/tutorials/opengl-timer-query/ One could use that to measure when the buffer swap completes.
  • The timer could be queried further down the loop, after the VisualServer::get_singleton()->draw() call, which does some syncing with the GPU.
  • VisualServer::get_singleton()->draw() could be modified to do more syncing with the GPU. RasterizerGLES3::end_frame() could either use plain glFinish in all branches (yes, that has issues) or fences (https://www.khronos.org/opengl/wiki/Sync_Object) to wait for a previous frame to finish.
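
For the last option, a hedged sketch of what the fence variant could look like in plain OpenGL (this is not Godot's rasterizer code; the surrounding swap and timing calls are omitted):

```cpp
// Sketch of the fence idea in plain OpenGL (not RasterizerGLES3 code):
// before timing a new frame, wait until the previously submitted frame has
// actually finished on the GPU, then fence the frame just submitted.
#include <GL/glcorearb.h> // or whatever GL loader the build already uses

static GLsync previous_frame_fence = nullptr;

void wait_for_previous_frame_and_fence_this_one() {
    if (previous_frame_fence) {
        // Block (up to 1 s) until the GPU has executed last frame's commands.
        glClientWaitSync(previous_frame_fence, GL_SYNC_FLUSH_COMMANDS_BIT,
                         1000000000ull);
        glDeleteSync(previous_frame_fence);
    }
    // Insert a fence after this frame's commands; it signals when they finish.
    previous_frame_fence = glFenceSync(GL_SYNC_GPU_COMMANDS_COMPLETE, 0);
    // ...buffer swap and frame-time measurement would happen after this...
}
```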

I tried them all except for the GPU timer queries; they did not help much. Oh well. Guess I'll have to try the complicated one. But, to be honest, even if it works for me, how compatible is it going to be?

@ghost

ghost commented Feb 3, 2018

Hi again.
Sorry, I don't understand you well because English is not my first language, but I already created this game using UE4 and released a previous part using UDK. I just wanted to try Godot and ran into this problem.

There is no process running that would cause this, and I'm using a freshly installed Windows 7 x64. The same project in UE4 runs smoothly at a higher FPS without any problem, but in Godot I get low FPS and freezing.

This problem also happens on another PC: a freeze every second.

Not only that, but when I create an empty 3D project and try to move the camera using lerp for smooth movement, I notice it sometimes jumps or freezes. I don't know how to explain it. It's not only the camera: everything that moves using linear interpolation (lerp) has this problem, but if I don't use it, things move fine without any problem.

My PC:
Win7 x64, i7 4790K, 8 GB RAM, GTX 980

The other PC:
Win10 x64, i7 3770K, 8 GB RAM, GTX 780 Ti

@zmanuel
Contributor Author

zmanuel commented Feb 5, 2018

What I'm trying to say is that your case looks like genuine short freezes, that is, the engine not putting out any frames for a period of 50 ms or so. But I may be wrong. A little test may help: in the editor, go to "Project/Project Settings" in the menu. There, go to "Debug/Settings" and check the "Print Fps" box. Then run your game. It should print "FPS: ..." to the console every second. If the FPS values are about 55 to 57 more or less consistently, you have small actual freezes and your problem is unrelated to what I'm reporting. If it's around 60, your problem is related.

@zmanuel
Contributor Author

zmanuel commented Feb 20, 2018

(Just a quick status report) The glQueryCounter-based timer works better, but still has enough fluctuation to cause this particular jitter if taken unfiltered. Oh well. I'll clean up my code a bit and then submit it.

Edit: Three strikes against that plan.

  1. the relevant functions are not actually part of GLES
  2. even for desktop OGL, word on the web is that the timer always returns 0
  3. GLES3 support has just been wiped off the roadmap and replaced with Vulkan on desktop platforms

The good news is that Vulkan does have equivalent functions, but that part of the implementation had better wait until at least a base Vulkan renderer is implemented. I'll just submit the sanitizing code, then.
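
For reference, this is roughly what the (now abandoned) timer-query approach looks like in desktop OpenGL / ARB_timer_query; the helpers below are a sketch, not the code from my branch:

```cpp
// Sketch of the abandoned glQueryCounter approach (desktop GL only):
// record a GPU timestamp after the frame's commands and read it back once
// the result is available, typically a frame or two later.
#include <GL/glcorearb.h>

static GLuint timestamp_query = 0;

void issue_gpu_timestamp() {
    if (timestamp_query == 0)
        glGenQueries(1, &timestamp_query);
    // Records the GPU clock once all prior commands have been executed.
    glQueryCounter(timestamp_query, GL_TIMESTAMP);
}

bool read_gpu_timestamp(GLuint64 *out_nanoseconds) {
    GLint available = 0;
    glGetQueryObjectiv(timestamp_query, GL_QUERY_RESULT_AVAILABLE, &available);
    if (!available)
        return false; // result not ready yet; try again next frame
    glGetQueryObjectui64v(timestamp_query, GL_QUERY_RESULT, out_nanoseconds);
    return true; // *out_nanoseconds is the GPU timestamp in nanoseconds
}
```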

@zmanuel
Contributor Author

zmanuel commented Mar 7, 2018

Submitted PR #17353.

I created a new test project:
frame_timer_test.zip

It shows on the top of the screen a graph of the time_accum variable in main.cpp, the remainder of time that hasn't been included in a physics update yet.
Below that, the individual frame times are plotted in red.
And below that, two, umm, cannonballs move from left to right, the top one moving with each _process, the lower one with each _physics_process. Here's how it looks in a normal situation:
[screenshot: screenshot_1]
Never mind the buttons; they're there to test various situations the timer handling has to cope with: dropped frames and below-60 FPS performance.
In an ideal world where the monitor has exactly 60 Hz, the green line would stay put and each render frame would be one physics frame. If the frequency is off, it slowly drifts up or down. When it reaches the top or bottom, it jumps to the other side. In the unpatched engine, that looks like this:
[screenshot: 60s_screenshot_1]
The green line can't decide for a bit whether it wants to be near the top or the bottom. As it jumps from top to bottom, zero physics steps are performed in that render frame, and when it jumps back up, two are done at once. This causes the lower blue cannonball to jitter. This is with singlethreaded rendering; multithreading makes it worse, as there is some extra timer jitter once per second or so (I'll look into the reasons for that later):
[screenshot: 60m_screenshot_2]
As does having a monitor that gets really close to 60 Hz (though, to compensate, that also makes it rarer):
[screenshot: 60s_veryclose_screenshot_1]

With the fix, multithreaded rendering looks like this:
[screenshot: 60m_screenshot_1]
Just one jump down of the green line (sorry for the confusion: the graphs continuously move from left to right, so the time axis in stills points to the left...) and one single jerk of the blue cannonball, which is unavoidable under the given constraints.

120 Hz is also improved. Before (single-threaded):
[screenshot: 120s_screenshot_1]
The effect is less perceptible here, of course, with physics steps fluctuating between 1 and 3 per render frame and the ball therefore jittering back and forth only half as much.
After (multithreaded, hence the single/dual-pixel kinks):
[screenshot: 120m_screenshot_1]

One bad side effect of the current code at the sample's jitter fix value can be seen here on the right side:
[screenshot: 120m_screenshot_3]
While the hysteresis code refuses to change the accepted number of physics updates per render frame, it tolerates variation in animation step size. Lowering the jitter fix configuration to something sensible (the default is 0.05 right now; 0.1 would be fine) lowers the maximum step variation, too. It should be possible to avoid this entirely in code for all jitter fix configurations; I'll see what I can do.

@bojidar-bg
Contributor

Closing as the PR was merged.
