I know this will drift this thread somewhat to port...or maybe starboard...but:
I'm trying to understand what possible advantage there is in determining engine timing (valve or ignition) by measuring piston/crank/rod movement and doing some sort of conversion to degrees as opposed to using a degree wheel and actually measuring degrees directly.
Ignoring the fact that the crank moves several degrees around TDC while the piston hardly moves at all, the measuring method seems to me to add the possibility of error, both in the measurement itself and in the conversion. Am I missing or not understanding something?
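For what it's worth, the conversion itself is just slider-crank geometry, and working it through shows exactly why so little happens at the piston right around TDC. Here's a minimal Python sketch; the 89 mm stroke and 150 mm rod length are only placeholder numbers for illustration, not verified Commando dimensions, so substitute your own engine's figures:

```python
import math

def piston_drop(theta_deg, stroke_mm=89.0, rod_mm=150.0):
    """Piston distance below TDC (mm) for a given crank angle past TDC.

    stroke_mm and rod_mm are placeholder example values, not measured
    Commando dimensions - plug in your engine's actual figures.
    """
    r = stroke_mm / 2.0          # crank radius
    theta = math.radians(theta_deg)
    # Standard slider-crank geometry: piston height relative to TDC
    return (r + rod_mm) - (r * math.cos(theta)
                           + math.sqrt(rod_mm**2 - (r * math.sin(theta))**2))

if __name__ == "__main__":
    for deg in (1, 2, 5, 10, 28):
        print(f"{deg:3d} deg past TDC -> piston {piston_drop(deg):6.3f} mm below TDC")
```

With those example dimensions, the first 5 degrees of crank rotation move the piston only about 0.2 mm, which is the dwell around TDC mentioned above.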
As far as ignition timing is concerned, as mentioned, the best way to set timing is based on engine performance. Using the factory timing setting is fine - it will always work - but it's commonly not optimum for a specific engine. The old method of advancing under load until the engine pings, then backing off two degrees, works fine. Better, of course, is to use a dyno and set the timing where the engine makes its maximum power. As noted, both methods will yield a different timing number depending on the fuel. I have my Commando's timing set for regular fuel because it operates in Mexico. Fuel is easily available even in rather remote locations, but in those areas it is often only regular (87). If it were in TX, for example, I would set it for 93.
Setting the timing based on fuel and the bike's performance has another advantage - any "error" in the timing scale makes no difference at all. You set the timing by performance, then observe where the timing mark falls on the scale, and that's the timing for that bike. With that method, unless you are degreeing in a new cam or something similar, NONE of the other stuff matters - there is no need to find TDC, or to care about it, or about the accuracy of the timing scale/mark on the rotor.