Fluctuating physics update count per render frame #15270
Comments
I'd like to mention that this and #10388 are fundamentally different issues - you say "The countermeasures discussed in #10388 do help here, too, but I think we can do better by default." as if render interpolation were a hack to work around this issue. Render interpolation is something we should have regardless, given that we cannot always guarantee a 100% static render frametime of 16⅔ ms.

That aside, I would argue that this fix is really more of a hack than render interpolation. The point of physics and render is that they are fundamentally disconnected in timing, and there's not really any issue with a non-synchronized render and physics timestep. This fix unnecessarily ties the render and physics timestep together. They do not - and indeed should not - need to be tied together, and when physics runs in a different thread altogether, it's immediately obvious that tying them together in this way isn't correct.
Umm, threads? That has nothing to do with anything. Yes, the physics update may happen in a different thread, or so say the docs (I couldn't find anything in the source, but then I am currently viewing 2.1), but it nevertheless has to sync up with the main thread after every update. If it didn't, render interpolation couldn't work either. Besides, the two timers are already tied together.

Also, while render interpolation would indeed largely do away with the visual glitches, it's no silver bullet and can't be applied to every game. It would be poison for precision platformers and fighting games. Those are best when what's shown on the screen is an exact representation of the current game state (well, a well-defined recent game state anyway, thinking about inevitable queues). All of this assumes the screen refresh and physics framerate are identical; if they're not, well, interpolation would be the lesser evil.

And render interpolation is a hack here :) It does not fix the fact that you have two physics updates in one render frame, zero in the next, then two again. Think about where input happens: outside of the iterate() call (some is threaded, but again, that doesn't help). Think about where the game physics can react to input - not in an update with zero physics steps. During the periods of janky updates, you have effectively reduced your input response to that of a stuttery 45 fps game, give or take. With interpolation, players are unlikely to notice consciously, but if the game is timing sensitive, it will feel just a little bit jankier than it needs to.

And that's all this is about. The update is a little bit jankier than it needs to be. The resulting game has a little less fidelity than it could have. I do realize that I am probably a little too obsessed with these little things, but it's like taking a pristine, beautiful pixel font and rendering it 1/3 of a pixel to the side, filtered so it still gets blurred - and don't get me started on that :)
If the physics update is executed in parallel to the render, in a separate thread, the simulation tick doesn't necessarily need to be tied to and synchronized with the render (although this would incur input lag). I believe Bullet has functionality to allow buffering of simulation states (at the very least it double-buffers, though I think you can use MotionStates to triple-buffer or more), which means that it isn't strictly necessary to tie the physics frame end to the render frame end. You could start simulating the next physics frame immediately, only rendering with the latest complete frames, albeit at the cost of incurring input lag. This is what a few engines, such as Naughty Dog's, do. I think their engine allows render and simulation to deviate by up to three ticks before they start sleep()ing to sync. That said, their games are console third-person shooters with big, heavy animations, so obviously they can afford a fair bit of input lag.

I think at this point we're both really just thinking from entirely different perspectives, however. For input-lag-sensitive games, everything I am suggesting is a truly awful idea. Ideally both systems should be implemented, although I wouldn't call render interpolation a hack, really: render state interpolation is a feature supported by most physics engines and, by extension, most game engines. It's on by default, and expected, in most cases.
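To make the disagreement concrete, here is a hedged sketch of what render interpolation means in a fixed-timestep loop. Everything here (the State struct, the shown_positions helper, the one-unit-per-step physics) is invented for illustration; only the time_accum and frame_slice names mirror Godot's main.cpp, and this is not engine code.

```cpp
// Illustration only: fixed-timestep loop with render interpolation.
// Names time_accum/frame_slice mirror Godot's main.cpp; the rest is made up.
#include <vector>

struct State { double x; };

// Blend the two most recent physics states by the leftover step fraction.
inline State interpolate(const State &prev, const State &curr, double alpha) {
    return { prev.x + (curr.x - prev.x) * alpha };
}

// Simulate `frames` render frames and return the interpolated position shown
// each frame. The toy physics moves one unit per fixed step.
inline std::vector<double> shown_positions(double render_hz, double physics_hz,
                                           int frames) {
    const double frame_slice = 1.0 / physics_hz;
    double time_accum = 0.0;
    State prev{0.0}, curr{0.0};
    std::vector<double> shown;
    for (int i = 0; i < frames; ++i) {
        time_accum += 1.0 / render_hz;
        while (time_accum >= frame_slice) { // 0, 1 or 2 steps per frame
            prev = curr;
            curr.x += 1.0;
            time_accum -= frame_slice;
        }
        // Render between the last two physics states instead of snapping to
        // the newest one; this hides the fluctuating step count.
        shown.push_back(interpolate(prev, curr, time_accum / frame_slice).x);
    }
    return shown;
}
```

Because frames with zero physics steps still advance the blend factor, the displayed position keeps moving smoothly even when the step count fluctuates - which is exactly the property being debated above.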
I'm creating a simple 3D game and I have this problem: when shooting a weapon or moving a RigidBody character I notice some lag, and after a few seconds the game lags constantly, exactly like the picture. About every second there is a hitch (everything stops for less than 1/10 of a second), then it continues, then it lags again, and so on. Edit:
And here is a video showing the problem:
CCODE05B: Those horizontal ticks in the 'graph' say that the app is properly freezing up for a bit without rendering anything. I'd say it's unrelated, and you should check for some nasty background process that blocks the GPU every second, like a silly large dynamic desktop background with a ticking clock (the freezes really do come in pretty much 1 s intervals). CPU hogs are also a possibility.

Causeless: Yeah, it's possible to increase physics throughput by allowing a background physics process to run a bit ahead and feed script input to it with a delay. This would need to be optional, though, because you then have to be very careful about how you do your two-way interactions between scripts and physics.

I've been thinking a bit; my patch is essentially input value sanitation. And before one does that, one should first check whether the input data is as good as it can be. It's not. Timer values are taken at a pretty much arbitrary point of the main loop (well, at the beginning of the function, but since it's run in a loop, that too is an arbitrary point) and measure CPU time. There seems to be an API for measuring when a GPU command completed: http://www.lighthouse3d.com/tutorials/opengl-timer-query/

Some options:
- The timer could be queried further down the loop, after VisualServer::get_singleton()->draw().
- VisualServer::get_singleton()->draw() could be modified to do more syncing with the GPU.
- RasterizerGLES3::end_frame() could either use plain glFinish in all branches (yes, that has issues) or fences (https://www.khronos.org/opengl/wiki/Sync_Object) to wait for a previous frame to finish.

I tried them all except for the GPU timer queries; they did not help much. Oh well. Guess I'll have to try the complicated one. But, to be honest, even if it works for me, how compatible is it going to be?
Hi again. There is no running process causing this problem, and I'm using a freshly installed Windows 7 x64. The same project runs pretty smoothly in UE4 with higher FPS and without any problem, but in Godot I get low FPS and freezing. The problem also happens on another PC: a freeze every second. Not only that, but when I create an empty 3D project and try to move the camera using lerp (smooth movement), I notice it sometimes jumps or freezes - I don't know how to explain it. My PC: and the other PC:
What I am trying to say is that your case looks like proper short freezing, that is, the engine not putting out any frames for a period of 50 ms or so. But I may be wrong. A little test may help: in the editor, go to "Project/Project Settings" in the menu. There, go to "Debug/Settings" and check the "Print Fps" box. Then run your game. It should then print "FPS: ..." to the console every second. If the FPS values are about 55 to 57 more or less consistently, you have small actual freezes and your problem is unrelated to what I'm reporting. If it's around 60, your problem is related.
(Just a quick status report) The glQueryCounter-based timer works better, but still has enough fluctuation to cause this particular jitter if taken unfiltered. Oh well. I'll clean up my code a bit and then submit it. Edit: three strikes against that plan.
The good news is that Vulkan does have equivalent functions, but that part of the implementation had better wait until at least a base Vulkan renderer is implemented. I'll just submit the sanitizing code, then.
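The "sanitizing" mentioned here could take many forms. As a hedged illustration only (this is not the code that was submitted, and median_filter is an invented name), a sliding-window median is one common way to keep a single outlier frame-time sample from flipping the step comparison:

```cpp
// Illustrative moving-median filter for noisy frame-time samples. This is
// just one common sanitation technique, not the code from the actual patch.
#include <algorithm>
#include <cstddef>
#include <deque>
#include <vector>

// Push a new sample into a sliding window and return the window median,
// which discards isolated spikes such as a single slow frame.
inline double median_filter(std::deque<double> &window, double sample,
                            std::size_t max_size = 5) {
    window.push_back(sample);
    if (window.size() > max_size)
        window.pop_front();
    std::vector<double> sorted(window.begin(), window.end());
    std::sort(sorted.begin(), sorted.end());
    return sorted[sorted.size() / 2];
}
```

A median has the nice property that one wildly wrong sample (a 33 ms spike among 16.7 ms frames) leaves the output untouched, whereas an averaging filter would smear the spike across several frames.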
Submitted PR #17353. I created a new test project: it shows at the top of the screen a graph of the time_accum variable in main.cpp, the remainder of time that hasn't been included in a physics update yet. With the fix, multithreaded looks like this: 120 Hz is also improved. Before (single thread): One bad side effect of the current code at the jitter fix value of the sample can be seen here on the right side:
Closing as the PR was merged. |
Godot version:
3.0-beta2, but goes back to 2.1 at least
OS/device including version:
Linux x64, Ubuntu 16.04.3 LTS
Nvidia driver 384.90, GeForce GTX 970
Issue description:
Not exactly a duplicate of #10388, but the identical situation with a worse result. The conditions are:
The issue is the simple
while (time_accum > frame_slice) {
loop in Main::iteration() that determines the physics updates. Usually, if the two framerates are almost equal, the loop runs exactly once because the comparison yields stable results every frame. But over time, due to the slight framerate difference, time_accum at the start of the update drifts upwards or downwards. When it drifts close to 0 or frame_slice, the loop won't run at all, or will run twice. If that only happened once, that would be fine, but unfortunately the variable measured frame time then comes into play and the loop comparison becomes unstable from frame to frame, leading to many subsequent render frames with fluctuating numbers of physics updates (0, 1 and 2 if the monitor refresh rate is higher than the physics refresh rate; 1, 2 and 3 if it is lower). That looks ugly for a bit. The countermeasures discussed in #10388 do help here, too, but I think we can do better by default.

Steps to reproduce:
Make sure your monitor refresh rate and the physics update rate are only a tiny bit different (setting the physics update to 59 or 61 in the project settings is too much). Run the attached project for a bit. It measures the recent max and min number of physics steps per render frame. It also shoots a string of bullets that move a bit down on every _process() and a bit to the right on every _physics_process(), making differences between the two visible as the stream becomes jaggy. Obviously nothing anyone would do in a real project.
The test project also shows the recent max and min render frame times so you can see the fluctuation.
The ideal display when the two rates are exactly equal is this:
The stream is a nice diagonal line, min and max physics steps are both 1. However, sometimes (for me, once or twice a minute), the display changes and shows this:
The physics update cadence is something like 1, 1, 1, 0, 2, 1, 0, 2, 0, 2, 0, 1, 1, 1, 1 physics frames per render frame, and it can extend for much longer than that. For me, with a refresh rate slightly above 60, the min and max updates are 0 and 2; for someone with a slightly lower refresh rate, they would be 1 and 3.
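The drift described above is easy to reproduce outside the engine. The following sketch is standalone code, not engine code; only the variable names mirror main.cpp. With ideal, noise-free frame times it yields a clean run of 1s with a lone 0 (or 2) where the dropped frame lands; the 0, 2, 0, 2 cadence only appears once real, noisy frame-time measurements make the comparison flip back and forth.

```cpp
// Standalone simulation: count physics steps per render frame when the
// render rate differs slightly from the physics rate. Variable names
// mirror Godot's main.cpp, but this is an illustration, not engine code.
#include <vector>

inline std::vector<int> steps_per_frame(double render_hz, double physics_hz,
                                        int frames) {
    const double frame_slice = 1.0 / physics_hz;
    double time_accum = 0.0;
    std::vector<int> counts;
    for (int i = 0; i < frames; ++i) {
        time_accum += 1.0 / render_hz;     // idealized render frame time
        int steps = 0;
        while (time_accum > frame_slice) { // the loop from Main::iteration()
            time_accum -= frame_slice;
            ++steps;
        }
        counts.push_back(steps);
    }
    return counts;
}
```

For example, steps_per_frame(61.0, 60.0, 61) starts with a frame of 0 steps (the accumulator hasn't reached a full slice yet) and then settles into runs of 1s with an occasional 0, which is the noise-free version of the pattern reported here.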
Now, one total dropped physics frame is of course inevitable if you insist on correct timing in your game, but it should look like this:
I implemented a primitive first fix attempt here: zmanuel@90a11ef, the last screenshot was taken with that fix.
The idea of the fix is to simply add some hysteresis that makes the updater choose physics update iteration counts close to the one from the last frame.
The fix does not handle the case where the physics update is 30 fps and the display refresh is close to 60 Hz (or, more relevant to the hardcore, the physics update is 60 fps and the monitor has close to 120 Hz):
If there is demand, I can extend the fix to that situation. It would simply not just stabilize the render frame to render frame fluctuations, but fluctuations over groups of render frames. Probably requires a helper class for clarity.
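The hysteresis idea can be sketched as follows. This is an illustration of the principle only, with an invented function name and tolerance value; it is not the actual change in zmanuel@90a11ef.

```cpp
// Illustration of hysteresis for the physics step count: stick with last
// frame's count when the accumulator is within a small tolerance of a step
// boundary. choose_steps and the tolerance value are made up; this is not
// the actual patch.
#include <cmath>

inline int choose_steps(double time_accum, double frame_slice, int last_steps,
                        double tolerance = 0.1) {
    int steps = static_cast<int>(std::floor(time_accum / frame_slice));
    double frac = time_accum / frame_slice - steps;
    if (steps == last_steps + 1 && frac < tolerance) {
        // Barely crossed a boundary upward: hold the old count one more frame.
        steps = last_steps;
    } else if (steps == last_steps - 1 && frac > 1.0 - tolerance) {
        // Barely short of a boundary: take the extra step anyway, letting the
        // accumulator go slightly negative for one frame.
        steps = last_steps;
    }
    return steps;
}
```

For instance, choose_steps(1.05, 1.0, 0) stays at 0 even though the accumulator has just crossed a boundary; the cost is that the accumulator may run up to tolerance × frame_slice out of sync for a frame, which is exactly the trade the hysteresis makes to avoid the 0, 2, 0, 2 flip-flopping.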
Minimal reproduction project:
timer_test.zip