I'm currently working on a multiplayer game project as a network developer. Unfortunately, I'm facing an interpolation problem that sometimes makes game objects look like they are 'jumping/teleporting'. Movement doesn't look smooth compared to the server, even though the client runs at 60 frames per second.

Things about the architecture:

  • It's based on a client/server model. The server sends regular transform updates every 33 milliseconds (30 updates per second).
  • The client receives position updates and stores them in a queue, then plays them back from 1 second behind, interpolating between consecutive updates (a rough sketch of the queue entries follows this list).
  • Before processing the next packet, the client takes the difference between the send times of the current and next packets to work out how many milliseconds the interpolation should take. Every frame it subtracts the frame's delta time from the remaining interpolation time. When the interpolation time reaches zero or below, it advances to the next packet; if the value went below zero, that overshoot is a leftover time fragment, which is carried into the interpolation of the next packet within the same frame.
  • The client doesn't deal with the rigidbody, only position and rotation.
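
As a rough illustration of the buffering described above, the queue entries amount to something like this (a minimal sketch; `Packet`, its fields, and the class name are my assumptions, not the actual project code):

using System.Collections.Generic;
using UnityEngine;

// Illustrative only: one buffered snapshot plus the queue the client plays back from.
public class SnapshotBufferSketch : MonoBehaviour
{
    // One server update as stored on the client (field names are assumptions).
    struct Packet
    {
        public float time;        // server send time of the snapshot, in seconds
        public Vector3 position;
        public Quaternion rotation;
    }

    // Snapshots in arrival order; playback runs ~1 second behind the newest
    // entry, so there is normally about a second's worth of packets to
    // interpolate across.
    readonly List<Packet> packets = new List<Packet>();
}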

The per-frame code looks like this:

interpolationTime -= deltaTime;

if (interpolationTime <= 0)
{
    // How far past the end of the current segment this frame landed.
    float timeFragment = Mathf.Abs(interpolationTime);

    // Advance to the next pair of buffered packets.
    popCurrentPacket();

    // Duration of the new segment, taken from the server send times.
    targetDuration = packets[1].time - packets[0].time;

    // Carry the leftover time fragment into the new segment.
    interpolationTime = targetDuration - timeFragment;
}

interpolate(interpolationTime, targetDuration);

I'm using Unity3D's Lerp method from the Vector3 class. Switching to Slerp didn't solve the problem either.
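
For context, the interpolate call above boils down to something like the following (a minimal sketch; the real body isn't shown here, so the packet fields and the use of Quaternion.Slerp for rotation are assumptions):

// Runs inside the same component that owns the packets queue.
void interpolate(float interpolationTime, float targetDuration)
{
    // t goes from 0 at the start of the segment to 1 when it is fully consumed.
    float t = 1f - (interpolationTime / targetDuration);

    transform.position = Vector3.Lerp(packets[0].position, packets[1].position, t);
    transform.rotation = Quaternion.Slerp(packets[0].rotation, packets[1].rotation, t);
}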

Any ideas? Everything looks great on paper.
