Is that a trick question?
![Image](https://cdn.betterttv.net/emote/60abb17567644f1d67e8c114/1x.webp)
2025 - 19 = 2006
Sengoku Strider wrote: ↑Mon Aug 21, 2023 6:59 pm
Thanks for the detailed response, sorry I missed it until now. This was essentially what I thought in broad strokes, which is why the initial dev statement threw me off. Your point about older hardware/programming technique also makes plenty of sense, especially with 80s hardware that only had single or double digit KB of RAM and slower single-thread processors.
No worries
![Smile :)](./images/smilies/icon_smile.gif)
The most generous interpretation I can come up with in Destiny's case is that the game tick runs at 30Hz on all machines as a capitulation to the lowest common denominator, since it has to do crossplay, and that requires all clients to agree on a given tick rate for sane results.
That would mean continuous damage-over-time (like the Quake 3 lightning gun, or whatever equivalents D2 has) has to be chopped up and applied in 1/30sec chunks, which could theoretically introduce precision issues.
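Roughly what I'm picturing, as a made-up sketch (names and numbers are mine, not anything from Destiny's actual code):

```c
/* Toy sketch of a DoT effect applied in fixed 1/30s tick chunks.
 * All values here are invented for illustration. */
#include <stdio.h>

#define TICK_RATE 30

int main(void) {
    float dot_dps = 85.0f;     /* damage per second of the effect */
    float duration_s = 3.0f;   /* effect lasts 3 seconds */
    int ticks = (int)(duration_s * TICK_RATE);

    float per_tick = dot_dps / TICK_RATE;  /* chunk applied each game tick */
    float total = 0.0f;
    for (int i = 0; i < ticks; i++)
        total += per_tick;

    /* any gap between these two is the precision loss from quantizing to 30Hz */
    printf("intended: %.4f  applied: %.4f\n", dot_dps * duration_s, total);
    return 0;
}
```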
It shouldn't really matter for other kinds of damage - i.e. anything that shoots bullets or makes explosions - since those sources are naturally quantized to <30Hz.
Thinking on it, that also implies that they're using integers to calculate damage instead of decimals, since dividing by 30 results in a trivially tiny amount of precision loss with floating-point numbers - certainly small enough to preserve the original damage-vs-time ratio after quantization.
Ints are a different case, since you'd need a multiple of 30 in order to divide without any loss of information.
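Quick made-up example of the difference (assuming a nice round 100 damage per second, which obviously isn't a real D2 number):

```c
/* Float vs integer per-tick damage at a 30Hz tick rate.
 * With floats the total comes back to ~100; with ints the
 * division remainder is silently thrown away and you land on 90. */
#include <stdio.h>

int main(void) {
    int dps = 100;

    float per_tick_f = (float)dps / 30.0f;  /* 3.3333... */
    int   per_tick_i = dps / 30;            /* 3 - remainder lost */

    printf("float: %f damage per second\n", per_tick_f * 30.0f);
    printf("int:   %d damage per second\n", per_tick_i * 30);
    return 0;
}
```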
Could be that they're using coarse 8-bit ints to save on network packet size, but I'm not sure why you would if that kind of fine-grained TTK tuning is a concern.
Hardware float widths generally come in multiples of 32 bits, but you can do a precise-enough 8-bit (or whatever-bittage-you-like) format in software if values < 1.0 need to be representable.
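Something like this, just to show the general idea (the Q4.4 split is an arbitrary choice on my part):

```c
/* Toy 8-bit fixed-point format: 4 integer bits, 4 fractional bits,
 * so values below 1.0 are representable in a single byte. */
#include <stdio.h>
#include <stdint.h>

#define FRAC_BITS 4
#define ONE (1 << FRAC_BITS)   /* 1.0 is stored as 16 */

static uint8_t to_fixed(float x)   { return (uint8_t)(x * ONE + 0.5f); }
static float   to_float(uint8_t q) { return (float)q / ONE; }

int main(void) {
    uint8_t third = to_fixed(1.0f / 3.0f);  /* stored as 5, i.e. 0.3125 */
    printf("stored: %u/16 = %f\n", third, to_float(third));
    return 0;
}
```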