That's not a bad choice at all (I've got a Hori VX myself, and it's the only 360 stick I've kept), but if you ever happen to find an HRAP EX for cheap I'd snap it up anyway - personally I regret not getting the one that went for €40. A superior lever is worth risking up to 10-15ms of extra latency imho (unless your setup is already overflowing with lag left and right, but knowing you that's certainly not the case). Of course, if you ever feel like modding the VX, great - a JLF should fit in more easily, btw, and the buttons aren't so bad that they absolutely need replacing.
Wouldn't it be better to measure the actual lag in milliseconds with some sort of stopwatch app, like when they test monitors? Having said that, I really don't know whether that's actually possible on consoles.
Actually, I've been thinking about that method of measuring a controller's input lag, and now I tend to believe that doing any better is really non-trivial. Comparing controllers with a high-refresh-rate camera and two (identical) setups would still add more variables with no clear advantage*, and direct, non-referenced results (which for displays are possible using the Leo Bodnar device) are probably very difficult - can you build a device that measures processing lag at the logic level of the board? Probably, but who's gonna do it?
A more feasible way to improve on his results would be to test the sticks with games (and displays) that run at more than 60Hz (making sure the polling rates of the two controller ports are exactly the same would also be nice). Of course, the results would still be differential lags, not absolute ones.
* Edit: actually, I now remember reading on the SRK forum about using a high-refresh-rate camera to measure the absolute delay between an illuminated button press and a CRT screen update. However, for very precise results the screen would still need to run at a high refresh rate...
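To get a rough feel for why >60Hz testing would tighten things up, here's a quick Python sketch (all the numbers are made up): a fixed controller lag of 5ms is "measured" with presses landing uniformly within the frame, and the observed lag gets quantized to the next display refresh. The per-sample scatter is roughly frame/sqrt(12), so it halves going from 60Hz to 120Hz, which is why averaging at a higher refresh rate can resolve smaller differences with the same number of presses.

```python
import math
import random

def observed_lags(true_lag_ms, refresh_hz, trials=1000, seed=1):
    """Each button press lands uniformly within a frame; the display
    only reflects it on the next refresh boundary, so the measured lag
    is the true lag plus a quantization residual in [0, frame_ms)."""
    frame = 1000.0 / refresh_hz
    rng = random.Random(seed)
    out = []
    for _ in range(trials):
        press = rng.uniform(0.0, frame)
        shown = math.ceil((press + true_lag_ms) / frame) * frame
        out.append(shown - press)
    return out

for hz in (60, 120):
    lags = observed_lags(true_lag_ms=5.0, refresh_hz=hz)  # 5ms is made up
    mean = sum(lags) / len(lags)
    sd = math.sqrt(sum((x - mean) ** 2 for x in lags) / (len(lags) - 1))
    print(f"{hz:3d} Hz: mean {mean:5.2f} ms, per-sample sd {sd:4.2f} ms")
```

Note the mean also carries a fixed ~half-frame offset; that part cancels out anyway when you subtract two sticks' results, which is exactly why differential figures are the easy ones to get.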
So yeah, the methodology employed in his testing is good, as in, statistically sound. The button presses will be uniformly distributed across the frame, and averaging over 1000 repetitions is surely enough to tame most of the intrinsic variability and get a meaningful (differential) ms figure (btw, it should be possible to calculate a confidence interval for the population/true mean too). I'll just shoot a number out of my ass and say that those lag figures can be trusted down to +/-2ms (of additional latency wrt the reference stick, whose source for the quoted "0.00" lag was unfortunately not provided).

There's a minor assumption that must hold for this, though: that the consoles poll the controller ports at the same rate as, and in sync with, the game's refresh rate (60Hz); or else, if the rates differ (which may well be the case - for example, the standard USB 2.0 polling rate is 125Hz, i.e. 8ms), that the consoles render all inputs polled within the same frame (but in different polling cycles) in the following one, which is how it should be really.
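And since I brought up the confidence interval: with ~1000 presses the normal approximation is fine, so it's just mean +/- 1.96*sd/sqrt(n). A quick Python sketch with made-up data (a true 4ms difference vs the reference stick, buried in roughly +/-8ms of frame-quantization noise):

```python
import math
import random

def mean_ci(samples, z=1.96):
    """95% confidence interval for the population mean, using the
    normal approximation (fine for n around 1000)."""
    n = len(samples)
    mean = sum(samples) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in samples) / (n - 1))
    half = z * sd / math.sqrt(n)
    return mean - half, mean + half

# fake data: per-press lag differences wrt the reference stick;
# true mean 4 ms, uniform quantization noise of about +/- half a frame
rng = random.Random(0)
diffs = [4.0 + rng.uniform(-8.33, 8.33) for _ in range(1000)]
lo, hi = mean_ci(diffs)
print(f"mean diff in [{lo:.2f}, {hi:.2f}] ms (95% CI)")
```

Under those assumed noise levels the interval comes out only a few tenths of a ms wide, so if anything my +/-2ms gut figure above is pessimistic.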