Controls how much deviance is added when you fire. The arguments mean:
x - the base deviance of a shot
y - amount of deviance added per shot
z - amount of added deviance reduced per server frame (30 frames per second)
For example, the BAR 1918 is:
ObjectTemplate.SetFireDev 3.5 0.25 0.03
so the shot itself has a base deviance of 3.5, each preceding shot adds 0.25 to this (so in a burst of fire each successive shot is less accurate), and 0.03 is how much of that added deviance is subtracted each server frame (30 frames per second).
The sniper/engineer rifles are all:
ObjectTemplate.SetFireDev 0 0 0
so they add no deviation due to firing.
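The accumulation and recovery described above can be sketched as a small simulation. This is a hypothetical model, not engine code: the function name and the assumption that added deviance never drops below zero are illustrative. The values are the BAR 1918's (base 3.5, +0.25 per shot, -0.03 per server frame at 30 frames per second):

```python
# Illustrative model of SetFireDev accumulation (not actual engine code).
# BAR 1918 values: ObjectTemplate.SetFireDev 3.5 0.25 0.03
BASE_DEV = 3.5       # x: base deviance of every shot
ADD_PER_SHOT = 0.25  # y: deviance added by each previous shot
SUB_PER_FRAME = 0.03 # z: recovery per server frame (30 frames/second)

def shot_deviance(previous_shots, recovery_frames=0):
    """Deviance of the next shot after `previous_shots` shots in a burst,
    with `recovery_frames` server frames elapsed since the last shot.
    Assumes accumulated deviance cannot go below zero."""
    added = previous_shots * ADD_PER_SHOT - recovery_frames * SUB_PER_FRAME
    return BASE_DEV + max(0.0, added)

print(shot_deviance(0))      # first shot of a burst: 3.5
print(shot_deviance(4))      # fifth shot, no recovery: 3.5 + 4*0.25 = 4.5
print(shot_deviance(4, 30))  # after one second of recovery: close to base again
```

Note how slow the recovery is relative to the buildup: one shot adds 0.25, but clawing that back takes over 8 server frames (roughly a quarter of a second).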
All x-values (maximum deviation) of SetFireDev, SetMiscDev, SetSpeedDev, and SetTurnDev are added together.
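As a quick illustration of that summation, the combined maximum deviation is just the sum of the four x-values. The numbers below are made up for the example, not taken from any real weapon:

```python
# Hypothetical x-values from the four *Dev settings of one weapon.
fire_dev  = 3.5  # SetFireDev x
misc_dev  = 0.3  # SetMiscDev x
speed_dev = 1.0  # SetSpeedDev x
turn_dev  = 0.5  # SetTurnDev x

# The engine adds all four together for the total maximum deviation.
total_max_deviation = fire_dev + misc_dev + speed_dev + turn_dev
print(round(total_max_deviation, 2))  # 5.3
```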
Source and more details