I’m having a strange problem: the physics in my game runs faster in a browser than it does in the simulator. In my game I can aim in 360 degrees and fire off a shot. The shot (a rigidbody attached to a sprite, not a bullet) is always given the same speed and is fired in the direction the player aimed. The shot is moved by setting its linear velocity: I have an array of unit vectors, one for each of the 360 degrees, and I multiply the aimed direction’s unit vector by my speed (always 500) to get the velocity.
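For reference, this is roughly what the setup looks like (a hypothetical sketch, not my engine’s actual API — `shotVelocity` and the vector shape are just illustration):

```typescript
// Precompute one unit vector per degree, then scale by a constant
// speed to get the shot's linear velocity.
const SPEED = 500; // same speed for every shot

// unitVectors[d] points in direction d (degrees, counter-clockwise from +x).
const unitVectors: { x: number; y: number }[] = [];
for (let d = 0; d < 360; d++) {
  const rad = (d * Math.PI) / 180;
  unitVectors.push({ x: Math.cos(rad), y: Math.sin(rad) });
}

// Velocity for an aim angle: the unit vector times the speed.
function shotVelocity(angleDeg: number): { x: number; y: number } {
  const u = unitVectors[((angleDeg % 360) + 360) % 360];
  return { x: u.x * SPEED, y: u.y * SPEED };
}
```

Every velocity produced this way has magnitude 500, so the shot should travel the same distance per second regardless of aim direction.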
In the simulator, everything looks as I expect. In the browser, the shot moves much faster than I expect. I added some log statements to find out how far the shot travels in the first frame after the velocity is applied:
Shot position after frame 1 (Simulator): (-2.58, 7.93, 0.00) dt: 0.016667999999999666
Shot position after frame 1 (Browser): (-2.22, 8.03, 0.00) dt: 0.006881000000000313
This is a 2D game so I’ll ignore z values. In the simulator the shot moves from (0, 0) to (-2.58, 7.93) in 0.01667 seconds (60 fps). This is what I expect: the distance traveled is √(2.58² + 7.93²) ≈ 8.34, which matches speed × dt = 500 × 0.01667 ≈ 8.33.
Looking at the browser, the shot moves from (0, 0) to (-2.22, 8.03), a nearly identical distance of √(2.22² + 8.03²) ≈ 8.33, but it did so in only 0.006881 seconds (≈145 fps). It traveled almost exactly the same distance in one frame at ~145 fps in the browser as it did at 60 fps in the simulator, so it effectively moves more than twice as fast.
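Double-checking the arithmetic straight from the logged positions and dt values:

```typescript
// Per-frame distance traveled, from the logged positions.
function dist(x: number, y: number): number {
  return Math.hypot(x, y);
}

const simDist = dist(-2.58, 7.93);     // distance in one simulator frame, ≈ 8.34
const browserDist = dist(-2.22, 8.03); // distance in one browser frame, ≈ 8.33

// Effective frame rates, from the logged dt values.
const simFps = 1 / 0.016668;           // ≈ 60 fps
const browserFps = 1 / 0.006881;       // ≈ 145 fps

// What one frame at speed 500 *should* cover in each case.
const expectedSim = 500 * 0.016668;     // ≈ 8.33, matches simDist
const expectedBrowser = 500 * 0.006881; // ≈ 3.44, far less than browserDist
```

So the browser frame covers ~8.33 units when, at that dt, it should only cover ~3.44 — the per-frame distance is constant instead of scaling with dt.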
I’ve tried capping the frame rate to 60 fps in the browser, but that doesn’t seem to have any effect.
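In case it helps frame the question: the behavior above looks like physics being stepped once per rendered frame with a fixed step size, rather than scaling with the real elapsed time. An engine-agnostic pattern that makes movement frame-rate independent is a fixed-timestep accumulator (this is a generic sketch under that assumption, not my engine’s API — `stepPhysics` is a hypothetical stand-in):

```typescript
// Fixed-timestep accumulator: physics always advances in 1/60 s
// increments, so the result is the same whether the render loop
// runs at 60 fps or 145 fps.
const FIXED_DT = 1 / 60;
let accumulator = 0;

function update(frameDt: number, stepPhysics: (dt: number) => void): number {
  accumulator += frameDt;
  let steps = 0;
  // Take as many fixed steps as the accumulated time allows;
  // leftover time carries over to the next frame.
  while (accumulator >= FIXED_DT) {
    stepPhysics(FIXED_DT);
    accumulator -= FIXED_DT;
    steps++;
  }
  return steps; // fixed steps taken this frame
}
```

At ~145 fps most frames take zero or one fixed steps, averaging out to 60 physics steps per second, so a velocity of 500 covers 500 units per second regardless of frame rate. I don’t know whether my engine exposes something equivalent, which is part of what I’m asking.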
Am I doing something wrong? This behavior makes no sense to me; I feel like the fix is something completely obvious that I’m missing. Any help would be appreciated, thanks!