SWBFGamers


Title: Tick Rate vs Frame Rate
Post by: Led on November 24, 2009, 08:19:42 AM
Hi MPCers,

If you have ever hosted a SWBF game, you are aware that you can set the TPS (ticks per second) of the server.  For a DC-based game hosted with the game CD/DVD, you can set 15, 20, or 30 TPS.

The 1.2 version PC dedicated server software allows the admin to set other TPS values with the server command /tps 30 for 30 TPS, for example.

The TPS setting sets the minimum frames per second (FPS) for users on the server.

My question is: are TPS the same as FPS?

The question arises because I have recently become aware that it is possible for players to un-cap their client-side framerates to whatever their video card can handle, using an undocumented client command.

My impression is that TPS is used to define server syncing with the client, while FPS determines the visual smoothness for the player.

The motivation for asking the question is to determine if players are gaining an advantage using an uncapped framerate.

What do you think?

Buckler






Title: Re: Tick Rate vs Frame Rate
Post by: Hardcore on November 24, 2009, 09:34:09 AM
From what I understand, ticks per second simply means how fast the GAME itself is updating. This is also often called "logic updates per second".

FPS is how fast the graphics are rendering, making things smoother visually, as you said.

Basically-  During each tick, the server processes incoming user commands, runs a physical simulation step, checks the game rules, and updates all object states. After simulating a tick, the server decides if any client needs a world update and takes a snapshot of the current world state if necessary. A higher tickrate increases the simulation precision, but also requires more CPU power and available bandwidth on both server and client.
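To make that concrete, here is a rough sketch of what a fixed-timestep tick loop generally looks like. This is purely illustrative Python with made-up function names, not SWBF's or Source's actual code:

[code]
import time

TICKS_PER_SECOND = 30          # the server "logic" rate, e.g. /tps 30
TICK_DT = 1.0 / TICKS_PER_SECOND

def process_commands():        # read queued player input (stub)
    pass

def simulate(dt):              # one step of physics / game rules (stub)
    pass

def send_snapshots():          # world updates to clients that need them (stub)
    pass

def server_loop(run_seconds=1.0):
    """Run the simulation at a fixed tick rate, independent of any rendering."""
    next_tick = time.perf_counter()
    end_time = next_tick + run_seconds
    while time.perf_counter() < end_time:
        process_commands()
        simulate(TICK_DT)
        send_snapshots()
        next_tick += TICK_DT
        # sleep off whatever is left of this tick's time slice
        delay = next_tick - time.perf_counter()
        if delay > 0:
            time.sleep(delay)

server_loop()
[/code]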

This may sound a bit out of place, but this is how MOST games that I know of are built, and this is how I understand it works. This is of course based on my understanding, so if someone knows otherwise, feel free to correct me, because SWBF could be done differently.

So basically, if they have powerful computers and you have a powerful computer, it would in practice help to raise the TPS. However, I do not know if SWBF itself can handle a faster rate; that is something that would need to be tested, I'd guess.
Title: Re: Tick Rate vs Frame Rate
Post by: aeria. on November 24, 2009, 02:24:41 PM
Quote from: Buckler on November 24, 2009, 08:19:42 AM

The question arises because I have recently become aware that it is possible for players to un-cap their client-side framerates to whatever their video card can handle, using an undocumented client command.
The motivation for asking the question is to determine if players are gaining an advantage using an uncapped framerate.

Hmm.. may we know the command?
Title: Re: Tick Rate vs Frame Rate
Post by: Led on November 24, 2009, 03:07:15 PM
Quote from: ag on November 24, 2009, 02:24:41 PM
Hmm.. may we know the command?

Well, I want to know what you think, first   ;)
Title: Re: Tick Rate vs Frame Rate
Post by: aeria. on November 24, 2009, 05:27:42 PM
As Hardcore put it, the game is updating at a slower rate than your FPS, so I believe there is no true advantage. Playing at 30 FPS versus playing at 60+ FPS isn't that big of a difference when it comes to responsiveness. However, if you compare 20 FPS and 60+ FPS, there is a sure difference.
Title: Re: Tick Rate vs Frame Rate
Post by: Led on November 24, 2009, 06:03:22 PM
Fair enough. 

The command is /noframelock

It gets inserted into the command line of the program, as described for the /nointro command
near the bottom of this page:

http://www.tweakguides.com/SWB_7.html


I get 150-200 fps on my video card with it.

Buck


Title: Re: Tick Rate vs Frame Rate
Post by: aeria. on November 24, 2009, 07:17:20 PM
I'll probably try it next time I'm playing Battlefront. But good find!
Title: Re: Tick Rate vs Frame Rate
Post by: Xfire Keenmike aka cull on November 25, 2009, 01:51:04 AM
I don't know; at least, if I have an advantage, I am not a person to flaunt it. The problems with my video card have been happening since long before I learned about the no framelock.
Title: Re: Tick Rate vs Frame Rate
Post by: Led on November 25, 2009, 05:21:28 AM
Quote from: keenmike on November 25, 2009, 01:51:04 AM
I don't know; at least, if I have an advantage, I am not a person to flaunt it. The problems with my video card have been happening since long before I learned about the no framelock.


Read in a more cynical way, your post could mean "If I have an unfair advantage, I don't want anyone else to know about it."


But first, is it really an advantage if the tick rate remains the same?  That is the question on which I am soliciting opinions.

Second, if it is an advantage, I feel that everyone should be aware of it!  And if it is not, it certainly makes the game's visual experience much smoother, perhaps meaning the game will continue to hold interest for someone.

Also, from the perspective of someone that runs a server, why should I waste my bandwidth providing a 60 TPS server (that gives everyone a minimum of 60 FPS) when I can provide a 20 TPS server and let everyone be aware that they can get 100-200 FPS anyway?


Buck
Title: Re: Tick Rate vs Frame Rate
Post by: Xfire Keenmike aka cull on November 25, 2009, 09:47:30 PM
Text is easy to misunderstand, so let me say honestly that I am not being cynical. And I am not hiding a thing; if you ask, I will tell, and if I can give, I will give. Back on topic, I ask: how do I get 100-200 FPS while in the map and in game? The most I have ever seen is 60 FPS. And as you can see in my screenshot, I got 20 FPS while in my own server.
Title: Re: Tick Rate vs Frame Rate
Post by: MjrNuT on November 25, 2009, 11:21:34 PM
Quote from: Buckler on November 24, 2009, 08:19:42 AM
Hi MPCers,

If you have ever hosted a SWBF game, you are aware that you can set the TPS (ticks per second) of the server.  For a DC-based game hosted with the game CD/DVD, you can set 15, 20, or 30 TPS.

The 1.2 version PC dedicated server software allows the admin to set other TPS values with the server command /tps 30 for 30 TPS, for example.

The TPS setting sets the minimum frames per second (FPS) for users on the server.

My question is: are TPS the same as FPS?

The question arises because I have recently become aware that it is possible for players to un-cap their client-side framerates to whatever their video card can handle, using an undocumented client command.

My impression is that TPS is used to define server syncing with the client, while FPS determines the visual smoothness for the player.

The motivation for asking the question is to determine if players are gaining an advantage using an uncapped framerate.

What do you think?

Buckler

Buckler!!

I just had to chime in here for you b/c it's something I investigated a long time ago when undertaking TF2 servers.  You ask great questions, and the principles are the same for SWBF.  The only thing is that SWBF has fewer controls than TF2 or other newer games, but the ones you are speaking of follow the same principles across games.

Here is a link that explains a lot, some of which will be quite familiar from JK's post.  ;P

http://whisper.ausgamers.com/wiki/index.php/Tickrate


Answer to your first question: No

Answer to your second question: No


The /tps command is ticks per second.  On the tweak guides it says it sets the maximum frames per second, which is correct.  The reason for this is to create a level playing field.


That playing field is meant to weed out the high latency players (i.e., poor connections).  They send less data and receive less data.  The effects of this are players that skip around on your screen, you dying inexplicably, or other similar situations.

To you, at 30 FPS, you possibly didn't see anything but a blip or a glitch.  To the opponent, you appear somewhat smooth.  The server calculates the world at those times when that opponent sees you in their sights and fires, hence you die, whereas it didn't look anything like that on your end.


So by this, the advantage goes to the person with the highest ping!! 


This is not related to the command that you've mentioned, which is only client side.  All that does is allow them not to be locked to the server rate, if it is set at /tps X.  The only effect is what you've said: higher frames are possible, which is only b/c the gfx card is allowed to receive and update more due to more available I/O from you, the client.  Another effect is smoother or more accurate gameplay, to a certain extent, b/c of what I described earlier.
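For what it's worth, a frame lock on the client is conceptually just a throttle on the drawing loop. A toy sketch (illustrative Python with invented names, only to show the idea) of capped vs. uncapped rendering:

[code]
import time

def render_frame():
    pass  # draw the latest known world state (stub)

def client_render_loop(frame_cap=None, run_seconds=0.5):
    """Render as fast as possible, or throttle to frame_cap FPS if given.

    Removing the cap (frame_cap=None) only changes how often the client
    draws; it does not change how often the server simulates ticks.
    """
    frames = 0
    start = time.perf_counter()
    while time.perf_counter() - start < run_seconds:
        frame_start = time.perf_counter()
        render_frame()
        frames += 1
        if frame_cap is not None:
            leftover = (1.0 / frame_cap) - (time.perf_counter() - frame_start)
            if leftover > 0:
                time.sleep(leftover)
    return frames / run_seconds

print("capped at 30:", round(client_render_loop(frame_cap=30)))
print("uncapped:    ", round(client_render_loop(frame_cap=None)))
[/code]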


Games that really benefit from a higher tickrate are ones where the player models are the same for each side, and thus have the same hit zones.  So accuracy and precision mean a lot more in those games (e.g., Counter-Strike, CoD, Battlefield, etc.).  SWBF, I think, has varying player models, more than just different skins... I think.  It's been a long time for me, so my accuracy is out the door.


The other part about SWBF and advantage, whether /tps is set or not, is the people who use a lag switch.  They are acting like a player with a bad connection, but in this case they know they are, rather than it being unbeknownst to them.

If I were to make a recommendation: keep the server at a set /tps and use /noframelock.

Happy TG man!
Title: Re: Tick Rate vs Frame Rate
Post by: Soap Mactavish on November 26, 2009, 05:25:52 AM
Well, I don't really know the actual difference, but FPS and TPS have to be different because they are spelled differently. I know it sounds like a stupid answer, but that's how I usually approach these questions.
Title: Re: Tick Rate vs Frame Rate
Post by: Xfire Keenmike aka cull on November 26, 2009, 07:58:07 AM
Let me add that the majority of the time I play SWBF1 with my little PC, which has onboard graphics. The rig setup is in my Xfire profile.

Quote from: Buckler on November 24, 2009, 08:19:42 AM
The question arises because I have recently become aware that it is possible for players to un-cap their client-side framerates to whatever their video card can handle, using an undocumented client command.

Question: is the answer /noframelock?
Title: Re: Tick Rate vs Frame Rate
Post by: Led on November 26, 2009, 08:00:21 AM
Hi MJR,

Thanks for the good info!

I have been a bit bugged that I have not heard about the /noframelock command before now.
It is much easier on the eyes, although my player movement doesn't seem as "crisp" to me when I have 30 fps.

It seems to have been well known in the ESL communities.  (I suspect Skullz knows about it too, but I haven't seen him lately to ask him about it. ;) )  My personal thought is that we should tell everyone.

keen mike:  I have a 50 TPS server set up so you can get 50 FPS minimum.  Go ahead and try it. If you want to make your own server with a 30 TPS/30 FPS minimum, you can set that option in the hosting settings when you set up a game with your disk.  *If you have a decent video card* you can get 100-200 FPS when you use the /noframelock command.  Let me know if I can assist you in enabling the command.

Buckler


Title: Re: Tick Rate vs Frame Rate
Post by: MjrNuT on November 26, 2009, 08:19:50 AM
No problem, my good sir.  Most likely you haven't heard of it b/c, quite frankly... there just aren't that many adults in the game.  :P

Admittedly, the game has a small following, and of course remember its genre and when it came out.

Considering the link I gave you, /noframelock effectively allows you, the client, to provide more I/O ONLY on the client side.  So when you're seeing 100 FPS, of course a 30 FPS lock will look "choppy" by comparison.  Kind of a bad analogy, but I KNOW you will understand this:

A finite element model with 4 elements through the thickness vs. 40.  How much is the fidelity of the results improved with the latter?  :D

FPS is NEVER the same as TPS.  A bad thing is when the FPS (server or client) goes BELOW the TPS.  That means the gfx card on the client can't handle the action and/or the internet connection is getting choked.  Also, server FPS is NOT the same as client FPS.
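One way to picture the difference is with a toy timeline (illustrative Python, assuming a 20 TPS server and a 60 FPS client): frames drawn above the tick rate just redraw, or interpolate around, the most recent server state; they don't contain any newer server information.

[code]
TPS, FPS = 20, 60
SECONDS = 0.2

tick_times  = [i / TPS for i in range(int(SECONDS * TPS) + 1)]
frame_times = [i / FPS for i in range(int(SECONDS * FPS) + 1)]

for t in frame_times:
    # the latest server tick the client could have received by frame time t
    latest_tick = max(i for i, tick_t in enumerate(tick_times) if tick_t <= t)
    print(f"frame at {t:0.3f}s draws server tick {latest_tick}")
[/code]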
Title: Re: Tick Rate vs Frame Rate
Post by: quik on December 04, 2009, 04:54:48 PM
Overall, yes, removing the lock on your frame/tick rate DOES affect your player. It makes everything feel more realistic, reducing common server lag. I get about 80. The command you use is /noframelock.
Title: Re: Tick Rate vs Frame Rate
Post by: (212)Ldr.Lando on June 22, 2012, 10:33:26 PM
Hi guys,
I'm looking in my 1.2 dedicated server for how to change my FPS rate.
Can you help me?
Title: Re: Tick Rate vs Frame Rate
Post by: Led on June 23, 2012, 12:38:07 AM
Hi Lando,

It is explained in this post.  Use /tps 30 or /tps 40 in the server ini file to set the tick rate, which is the minimum FPS for players.  Players can use /noframelock to uncap client-side FPS.

To see how to do it, read the posts.  Let me know if you need help.


Also, here is the same information:

for players:
http://www.xfire.com/communities/perfectlittleangels/forums/157275/topic/1548076/

for server hosts:
http://www.xfire.com/communities/perfectlittleangels/forums/157275/topic/1504084/
(look for section on battlefront.ini)





Title: Re: Tick Rate vs Frame Rate
Post by: (212)Ldr.Lando on June 23, 2012, 01:05:43 AM
Heya Buckler...
Thank you very much for your fast answer.
Title: Re: Tick Rate vs Frame Rate
Post by: Phobos on July 05, 2014, 09:47:05 AM
Capping the framerate above 30
If you use /noframelock in conjunction with VSync, it will cap your framerate at the monitor's native refresh rate. Mine is 60 Hz, which caps SWBF1 at 60 FPS on all servers. With VSync off I normally get 120+ FPS, which is super smooth, but usually after the server has been up for a while the player models will start to glitch into black spikes all over the map. This seems to prevent that so far; I haven't seen any players spiking at 60 FPS. Update: there still appears to be spiking even with the 60 FPS lock, though it is less frequent.

Enabling VSync is useful to help prevent tearing/spiking, but it is less smooth if your video card can handle higher framerates. If your game spikes a lot with /NFL, I would suggest enabling VSync. If the glitch happens rarely, then you will get higher FPS with VSync disabled.

I host most of my servers at 60 TPS because NFO allows for that much bandwidth, and more shots seem to register than when hosting at 30.

What I am really wondering, though, is whether shots register less often (deflected) when the client's FPS is higher than the server's TPS, as opposed to when the client FPS and server TPS are identical (i.e. 30/30 or 60/60).

Quote from: http://www.tweakguides.com/SWB_5.html
VSync: Vertical Synchronization (VSync) is the synchronization of your graphics card and monitor's abilities to redraw the screen a number of times each second (measured in FPS or Hz). If this setting is set to On, the most noticeable effect is that your framerate will be capped at the maximum Hz rating for your monitor at your chosen resolution. For example, if you've chosen 1280x1024 resolution, and your monitor only performs at 75Hz at this resolution, your FPS cannot exceed 75FPS. If you set VSync to Off, your FPS can exceed your monitor's maximum refresh rate, however you may see some image "tearing" whenever your monitor and graphics card go slightly out of synchronization. For maximum performance I generally recommend disabling VSync, however if you really cannot tolerate any tearing then enable VSync. Note that you should also make sure your Direct3D VSync setting in your graphics card's control panel is set to 'Application Preference' otherwise that setting will override any in-game settings.

I have also enabled OpenGL Triple Buffering, but it didn't seem to have much effect on framerates.
Title: Re: Tick Rate vs Frame Rate
Post by: aeria. on July 05, 2014, 10:22:01 AM
This thread is ancient, Phobos.
Title: Re: Tick Rate vs Frame Rate
Post by: Phobos on July 05, 2014, 10:34:33 AM
Quote from: aeria. on July 05, 2014, 10:22:01 AM
This thread is ancient Phobos.
That is quite true; so is this one:
http://www.swbfgamers.com/index.php?topic=1870

It seems like much of this would apply to SWBF1 also
[spoiler]https://developer.valvesoftware.com/wiki/Source_Multiplayer_Networking

Quote
Multiplayer games based on the Source Engine use a Client-Server networking architecture. Usually a server is a dedicated host that runs the game and is authoritative about world simulation, game rules, and player input processing. A client is a player's computer connected to a game server. The client and server communicate with each other by sending small data packets at a high frequency (usually 20 to 30 packets per second). A client receives the current world state from the server and generates video and audio output based on these updates. The client also samples data from input devices (keyboard, mouse, microphone, etc.) and sends these input samples back to the server for further processing. Clients only communicate with the game server and not between each other (like in a peer-to-peer application). In contrast with a single player game, a multiplayer game has to deal with a variety of new problems caused by packet-based communication.

Network bandwidth is limited, so the server can't send a new update packet to all clients for every single world change. Instead, the server takes snapshots of the current world state at a constant rate and broadcasts these snapshots to the clients. Network packets take a certain amount of time to travel between the client and the server (i.e. the ping time). This means that the client time is always a little bit behind the server time. Furthermore, client input packets are also delayed on their way back, so the server is processing temporally delayed user commands. In addition, each client has a different network delay which varies over time due to other background traffic and the client's framerate. These time differences between server and client causes logical problems, becoming worse with increasing network latencies. In fast-paced action games, even a delay of a few milliseconds can cause a laggy gameplay feeling and make it hard to hit other players or interact with moving objects. Besides bandwidth limitations and network latencies, information can get lost due to network packet loss.
(https://developer.valvesoftware.com/w/images/e/ea/Networking1.gif)
To cope with the issues introduced by network communication, the Source engine server employs techniques such as data compression and lag compensation which are invisible to the client. The client then performs prediction and interpolation to further improve the experience.

Basic Networking
The server simulates the game in discrete time steps called ticks. By default, the timestep is 15ms, so 66.666... ticks per second are simulated, but mods can specify their own tickrate. During each tick, the server processes incoming user commands, runs a physical simulation step, checks the game rules, and updates all object states. After simulating a tick, the server decides if any client needs a world update and takes a snapshot of the current world state if necessary. A higher tickrate increases the simulation precision, but also requires more CPU power and available bandwidth on both server and client. The server admin may override the default tickrate with the -tickrate command line parameter, though tickrate changes done this way are not recommended because the mod may not work as designed if its tickrate is changed.

Note: The -tickrate command line parameter is not available on CSS, DoD S, TF2, L4D and L4D2 because changing tickrate causes server timing issues. The tickrate is set to 66 in CSS, DoD S and TF2, and 30 in L4D and L4D2.

Clients usually have only a limited amount of available bandwidth. In the worst case, players with a modem connection can't receive more than 5 to 7 KB/sec. If the server tried to send them updates with a higher data rate, packet loss would be unavoidable. Therefore, the client has to tell the server its incoming bandwidth capacity by setting the console variable rate (in bytes/second). This is the most important network variable for clients and it has to be set correctly for an optimal gameplay experience. The client can request a certain snapshot rate by changing cl_updaterate (default 20), but the server will never send more updates than simulated ticks or exceed the requested client rate limit. Server admins can limit data rate values requested by clients with sv_minrate and sv_maxrate (both in bytes/second). Also the snapshot rate can be restricted with sv_minupdaterate and sv_maxupdaterate (both in snapshots/second).

The client creates user commands from sampling input devices with the same tick rate that the server is running with. A user command is basically a snapshot of the current keyboard and mouse state. But instead of sending a new packet to the server for each user command, the client sends command packets at a certain rate of packets per second (usually 30). This means two or more user commands are transmitted within the same packet. Clients can increase the command rate with cl_cmdrate. This will increase responsiveness but requires more outgoing bandwidth, too.

Game data is compressed using delta compression to reduce network load. That means the server doesn't send a full world snapshot each time, but rather only changes (a delta snapshot) that happened since the last acknowledged update. With each packet sent between the client and server, acknowledge numbers are attached to keep track of their data flow. Usually full (non-delta) snapshots are only sent when a game starts or a client suffers from heavy packet loss for a couple of seconds. Clients can request a full snapshot manually with the cl_fullupdate command.
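As a rough illustration of that idea (invented Python, not engine code; the dictionary layout is made up), a delta snapshot might be built like this:

[code]
def delta_snapshot(last_acked, current):
    """Send only the entity fields that changed since the last acknowledged update."""
    delta = {}
    for ent_id, state in current.items():
        old = last_acked.get(ent_id, {})
        changed = {key: value for key, value in state.items() if old.get(key) != value}
        if changed:
            delta[ent_id] = changed
    return delta

last_acked = {1: {"x": 0.0, "y": 0.0, "hp": 100}}
current    = {1: {"x": 1.5, "y": 0.0, "hp": 100},          # entity 1 moved
              2: {"x": 9.0, "y": 3.0, "hp": 100}}          # entity 2 is new

print(delta_snapshot(last_acked, current))
# -> {1: {'x': 1.5}, 2: {'x': 9.0, 'y': 3.0, 'hp': 100}}
[/code]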

Responsiveness, or the time between user input and its visible feedback in the game world, are determined by lots of factors, including the server/client CPU load, simulation tickrate, data rate and snapshot update settings, but mostly by the network packet traveling time. The time between the client sending a user command, the server responding to it, and the client receiving the server's response is called the latency or ping (or round trip time). Low latency is a significant advantage when playing a multiplayer online game. Techniques like prediction and lag compensation try to minimize that advantage and allow a fair game for players with slower connections. Tweaking networking setting can help to gain a better experience if the necessary bandwidth and CPU power is available. We recommend keeping the default settings, since improper changes may cause more negative side effects than actual benefits.

Entity Interpolation
By default, the client receives about 20 snapshots per second. If the objects (entities) in the world were only rendered at the positions received from the server, moving objects and animation would look choppy and jittery. Dropped packets would also cause noticeable glitches. The trick to solve this problem is to go back in time for rendering, so positions and animations can be continuously interpolated between two recently received snapshots. With 20 snapshots per second, a new update arrives about every 50 milliseconds. If the client render time is shifted back by 50 milliseconds, entities can always be interpolated between the last received snapshot and the snapshot before that.

Source defaults to an interpolation period ('lerp') of 100-milliseconds (cl_interp 0.1); this way, even if one snapshot is lost, there are always two valid snapshots to interpolate between. Take a look at the following figure showing the arrival times of incoming world snapshots:
(https://developer.valvesoftware.com/w/images/4/49/Interpolation.gif)
The last snapshot received on the client was at tick 344 or 10.30 seconds. The client time continues to increase based on this snapshot and the client frame rate. If a new video frame is rendered, the rendering time is the current client time 10.32 minus the view interpolation delay of 0.1 seconds. This would be 10.22 in our example and all entities and their animations are interpolated using the correct fraction between snapshot 340 and 342.

Since we have an interpolation delay of 100 milliseconds, the interpolation would even work if snapshot 342 were missing due to packet loss. Then the interpolation could use snapshots 340 and 344. If more than one snapshot in a row is dropped, interpolation can't work perfectly because it runs out of snapshots in the history buffer. In that case the renderer uses extrapolation (cl_extrapolate 1) and tries a simple linear extrapolation of entities based on their known history so far. The extrapolation is done only for 0.25 seconds of packet loss (cl_extrapolate_amount), since the prediction errors would become too big after that.

Entity interpolation causes a constant view "lag" of 100 milliseconds by default (cl_interp 0.1), even if you're playing on a listenserver (server and client on the same machine). This doesn't mean you have to lead your aiming when shooting at other players since the server-side lag compensation knows about client entity interpolation and corrects this error.
Tip: More recent Source games have the cl_interp_ratio cvar. With this you can easily and safely decrease the interpolation period by setting cl_interp to 0, then increasing the value of cl_updaterate (the useful limit of which depends on server tickrate). You can check your final lerp with net_graph 1.
Note: If you turn on sv_showhitboxes (not available in Source 2009) you will see player hitboxes drawn in server time, meaning they are ahead of the rendered player model by the lerp period. This is perfectly normal!
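A small sketch of that interpolation step (illustrative Python, not code from the wiki or the engine; the snapshot layout is invented), using numbers similar to the example above (client time 10.32, rendering at 10.22):

[code]
def lerp(a, b, t):
    return a + (b - a) * t

def interpolated_position(snapshots, client_time, interp_delay=0.1):
    """snapshots: list of (server_time, position) pairs, sorted by time."""
    render_time = client_time - interp_delay
    for (t0, p0), (t1, p1) in zip(snapshots, snapshots[1:]):
        if t0 <= render_time <= t1:
            frac = (render_time - t0) / (t1 - t0)
            return lerp(p0, p1, frac)
    return snapshots[-1][1]   # no bracketing pair: fall back to the newest snapshot

# ~20 snapshots/second; client time is 10.32, so the world is rendered as of 10.22
snaps = [(10.20, 0.0), (10.25, 1.0), (10.30, 2.0)]
print(interpolated_position(snaps, client_time=10.32))   # -> ~0.4 (about 40% between the 10.20 and 10.25 snapshots)
[/code]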

Input Prediction
Let's assume a player has a network latency of 150 milliseconds and starts to move forward. The information that the +FORWARD key is pressed is stored in a user command and sent to the server. There the user command is processed by the movement code and the player's character is moved forward in the game world. This world state change is transmitted to all clients with the next snapshot update. So the player would see his own change of movement with a 150 millisecond delay after he started walking. This delay applies to all player actions like movement, shooting weapons, etc., and becomes worse with higher latencies.

A delay between player input and corresponding visual feedback creates a strange, unnatural feeling and makes it hard to move or aim precisely. Client-side input prediction (cl_predict 1) is a way to remove this delay and let the player's actions feel more instant. Instead of waiting for the server to update your own position, the local client just predicts the results of its own user commands. Therefore, the client runs exactly the same code and rules the server will use to process the user commands. After the prediction is finished, the local player will move instantly to the new location while the server still sees him at the old place.

After 150 milliseconds, the client will receive the server snapshot that contains the changes based on the user command he predicted earlier. Then the client compares the server position with his predicted position. If they are different, a prediction error has occurred. This indicates that the client didn't have the correct information about other entities and the environment when it processed the user command. Then the client has to correct its own position, since the server has final authority over client-side prediction. If cl_showerror 1 is turned on, clients can see when prediction errors happen. Prediction error correction can be quite noticeable and may cause the client's view to jump erratically. By gradually correcting this error over a short amount of time (cl_smoothtime), errors can be smoothly corrected. Prediction error smoothing can be turned off with cl_smooth 0.

Prediction is only possible for the local player and entities affected only by him, since prediction works by using the client's keypresses to make a "best guess" of where the player will end up. Predicting other players would require literally predicting the future with no data, since there's no way to instantaneously get keypresses from them.
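A toy version of that predict-then-reconcile flow (simplified, invented Python; real engines do this per tick with the full movement code):

[code]
MOVE_PER_TICK = 0.1   # units moved per "+forward" command (made-up constant)

def apply_command(position, command):
    return position + (MOVE_PER_TICK if command == "+forward" else 0.0)

# Client presses forward for 5 ticks and predicts its own movement immediately.
pending = []            # commands sent to the server but not yet acknowledged
predicted = 0.0
for seq in range(5):
    predicted = apply_command(predicted, "+forward")
    pending.append((seq, "+forward"))

# Later, the server's authoritative state arrives: it has processed commands 0..2.
acked_seq = 2
server_position = 0.0
for _ in range(acked_seq + 1):
    server_position = apply_command(server_position, "+forward")

# Reconcile: start from the server's position and replay the un-acked commands.
corrected = server_position
for seq, command in pending:
    if seq > acked_seq:
        corrected = apply_command(corrected, command)

print(predicted, corrected)   # equal here, so no prediction error / visible snap
[/code]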

Lag Compensation
Let's say a player shoots at a target at client time 10.5. The firing information is packed into a user command and sent to the server. While the packet is on its way through the network, the server continues to simulate the world, and the target might have moved to a different position. The user command arrives at server time 10.6 and the server wouldn't detect the hit, even though the player has aimed exactly at the target. This error is corrected by the server-side lag compensation.

The lag compensation system keeps a history of all recent player positions for one second. If a user command is executed, the server estimates at what time the command was created as follows:

Command Execution Time = Current Server Time - Packet Latency - Client View Interpolation

Then the server moves all other players - only players - back to where they were at the command execution time. The user command is executed and the hit is detected correctly. After the user command has been processed, the players revert to their original positions.
Note: Since entity interpolation is included in the equation, failing to have it on can cause undesired results.
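A simplified sketch of that rewind (illustrative Python; the history layout and numbers are invented), using the formula above:

[code]
import bisect

# the server keeps ~1 second of (server_time, position) history per player
history = {
    "target": [(10.40, 4.0), (10.45, 5.0), (10.50, 6.0), (10.55, 7.0), (10.60, 8.0)],
}

def position_at(player_id, t):
    """The recorded position at the last history entry not later than time t."""
    times = [entry_time for entry_time, _ in history[player_id]]
    index = max(bisect.bisect_right(times, t) - 1, 0)
    return history[player_id][index][1]

def rewound_target_position(server_time, packet_latency, view_interp, target_id):
    command_time = server_time - packet_latency - view_interp
    return position_at(target_id, command_time)

# A shot arrives at server time 10.60 from a shooter with 100 ms latency and a
# 100 ms interpolation delay, so the hit is tested where the target was at ~10.40.
print(rewound_target_position(10.60, 0.10, 0.10, "target"))   # -> 4.0
[/code]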

On a listen server you can enable sv_showimpacts 1 to see the different server and client hitboxes:
(https://developer.valvesoftware.com/w/images/c/ca/Lag_compensation.jpg)
This screenshot was taken on a listen server with 200 milliseconds of lag (using net_fakelag), right after the server confirmed the hit. The red hitbox shows the target position on the client where it was 100ms + interp period ago. Since then, the target continued to move to the left while the user command was travelling to the server. After the user command arrived, the server restored the target position (blue hitbox) based on the estimated command execution time. The server traces the shot and confirms the hit (the client sees blood effects).

Client and server hitboxes don't exactly match because of small precision errors in time measurement. Even a small difference of a few milliseconds can cause an error of several inches for fast-moving objects. Multiplayer hit detection is not pixel perfect and has known precision limitations based on the tickrate and the speed of moving objects.

The question arises: why is hit detection so complicated on the server, with all this back tracking of player positions and handling of precision errors, when hit detection could be done client-side far more easily and with pixel precision? The client would just tell the server with a "hit" message which player has been hit and where. We can't allow that simply because a game server can't trust the clients on such important decisions. Even if the client is "clean" and protected by Valve Anti-Cheat, the packets could still be modified on a 3rd machine while routed to the game server. These "cheat proxies" could inject "hit" messages into the network packet without being detected by VAC (a "man-in-the-middle" attack).

Network latencies and lag compensation can create paradoxes that seem illogical compared to the real world. For example, you can be hit by an attacker you can't even see anymore because you already took cover. What happened is that the server moved your player hitboxes back in time, where you were still exposed to your attacker. This inconsistency problem can't be solved in general because of the relatively slow packet speeds. In the real world, you don't notice this problem because light (the packets) travels so fast and you and everybody around you sees the same world as it is right now.

Net Graph
The Source engine offers a couple of tools to check your client connection speed and quality. The most popular one is the net graph, which can be enabled with net_graph 2 (or +graph). Incoming packets are represented by small lines moving from right to left. The height of each line reflects size of a packet. If a gap appears between lines, a packet was lost or arrived out of order. The lines are color-coded depending on what kind of data they contain.

Under the net graph, the first line shows your current rendered frames per second, your average latency, and the current value of cl_updaterate. The second line shows the size in bytes of the last incoming packet (snapshots), the average incoming bandwidth, and received packets per second. The third line shows the same data just for outgoing packets (user commands).
(https://developer.valvesoftware.com/w/images/9/96/Net_graph.jpg)

Optimizations
The default networking settings are designed for playing on a dedicated server on the Internet. The settings are balanced to work well for most client/server hardware and network configurations. For Internet games the only console variable that should be adjusted on the client is "rate", which defines the available bytes/second bandwidth of your network connection. Good values for "rate" are 4500 for modems, 6000 for ISDN, and 10000 for DSL and above.

In a high-performance network environment, where the server and all clients have the necessary hardware resources available, it's possible to tweak bandwidth and tickrate settings to gain more gameplay precision. Increasing the server tickrate generally improves movement and shooting precision but comes with a higher CPU cost. A Source server running with tickrate 100 generates about 1.5x more CPU load than a default tickrate 66. That can cause serious calculation lag, especially when lots of people are shooting at the same time. It's not suggested to run a game server with a tickrate higher than 66, to reserve the necessary CPU resources for critical situations.
Note: It is not possible to change tickrate on CSS, DoD S, TF2, L4D and L4D2 because changing tickrate causes server timing issues. The tickrate is set to 66 in CSS, DoD S and TF2, and 30 in L4D and L4D2.

If the game server is running with a higher tickrate, clients can increase their snapshot update rate (cl_updaterate) and user command rate (cl_cmdrate), if the necessary bandwidth (rate) is available. The snapshot update rate is limited by the server tickrate; a server can't send more than one update per tick. So for a tickrate 66 server, the highest client value for cl_updaterate would be 66. If you increase the snapshot rate and encounter packet loss or choke, you have to turn it down again. With an increased cl_updaterate you can also lower the view interpolation delay (cl_interp). The default interpolation delay is 0.1 seconds, which derives from the default cl_updaterate 20. View interpolation delay gives a moving player a small advantage over a stationary player since the moving player can see his target a split second earlier. This effect is unavoidable, but it can be reduced by decreasing the view interpolation delay. If both players are moving, the view lag delay is affecting both players and nobody has an advantage.

The relation between snapshot rate and view interpolation delay is the following:

interpolation period = max( cl_interp, cl_interp_ratio / cl_updaterate )
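Plugging example numbers into this formula (illustrative Python; the first set of values is a stock-style configuration, and the tuned set assumes cl_interp_ratio 1 on a 66-tick server):

[code]
def interpolation_period(cl_interp, cl_interp_ratio, cl_updaterate):
    return max(cl_interp, cl_interp_ratio / cl_updaterate)

print(interpolation_period(0.1, 2, 20))    # stock-style values -> 0.1 (a 100 ms lerp)
print(interpolation_period(0.0, 1, 66))    # tuned values       -> ~0.015 (about 15 ms)
[/code]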

"Max(x,y)" means "whichever of these is higher". You can set cl_interp to 0 and still have a safe amount of interp. You can then increase cl_updaterate to decrease your interp period further, but don't exceed tickrate (66) or flood your connection with more data than it can handle.
Tips

Don't change console settings unless you are 100% sure what you are doing
    Most "high-performance" settings cause exactly the opposite effect, if the server or network can't handle the load.
Don't turn off view interpolation and/or lag compensation
    It will not improve movement or shooting precision.
Optimized setting for one client may not work for other clients
    Do not just use settings from other clients without verifying them for your system.
If you follow a player in "First-Person" as a spectator in a game or SourceTV, you don't exactly see what the player sees
    Spectators see the game world without lag compensation.
[/spoiler]
Title: Re: Tick Rate vs Frame Rate
Post by: SleepKiller on July 09, 2014, 02:24:00 PM
Quote from: Phobos on July 05, 2014, 10:34:33 AM
It seems like much of this would apply to SWBF1 also
[spoiler]https://developer.valvesoftware.com/wiki/Source_Multiplayer_Networking
[/spoiler]
I actually find that unlikely. Source has a very robust networking system that, although it sometimes creates some oddities (facestabs, etc.), is still very different from and leagues ahead of SWBF's.