OK, all you [civil] engineer types, I need your help. I received a nice little yellow summons for speeding (40 in a 25, how lame) on my way to work this morning. The method used was "ESP," a two-laser contraption that measures the time it takes for your front tires to cross two lasers spaced only 3 ft apart. I want to argue that 3 ft is not a statistically meaningful distance over which to measure my speed if I was accelerating.

Now I know the old "rate = distance / time" formula, but I'd like to go beyond that with a judge. It's been a long time since engineering calculus in college (I started as a mechanical engineer but switched to ECON later), but I do remember that acceleration is the rate of change of velocity, i.e. distance / time². I want to show that it is mathematically possible for the ESP device to read 40 mph while my speedometer registered less than that. For instance, if I accelerated quickly from 10 mph to 25 mph across that 3-ft span, could the device not kick out a falsely high reading? I mean... there's no way I know of for a device like that to measure acceleration, only rate. Am I onto something here?

If it makes any difference, I was in my Jeep, not a sports car. (I'd prefer not to get into calibration issues if I can help it.) Thanks for any and all creativity.
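Here's my own back-of-the-envelope check of that 10-to-25 scenario in Python. I'm assuming the device simply divides the 3 ft by the elapsed time, and that my acceleration across the span was constant (both are my assumptions, not anything from the ticket):

```python
# Sketch: what would the ESP device report if I accelerated from
# 10 mph to 25 mph at constant acceleration across its 3-ft span?
# Assumption: the device just divides span distance by elapsed time.

FT_PER_MILE = 5280.0
SEC_PER_HOUR = 3600.0

def mph_to_fps(mph):
    return mph * FT_PER_MILE / SEC_PER_HOUR

def esp_reading_mph(v_enter_mph, v_exit_mph, span_ft=3.0):
    """Average speed (mph) the device would report over span_ft,
    assuming constant acceleration between the two lasers."""
    v0 = mph_to_fps(v_enter_mph)
    v1 = mph_to_fps(v_exit_mph)
    # constant acceleration: v1^2 = v0^2 + 2*a*d
    a = (v1 ** 2 - v0 ** 2) / (2.0 * span_ft)
    t = (v1 - v0) / a          # time to cross the span
    avg_fps = span_ft / t      # what the device measures
    return avg_fps * SEC_PER_HOUR / FT_PER_MILE

print(round(esp_reading_mph(10, 25), 1))  # ≈ 17.5, midway between entry and exit
```

If I've done this right, the device reads an average over the span, so accelerating from 10 to 25 would show roughly 17.5 mph, never anything above 25. That's not quite the argument I was hoping for, so feel free to poke holes in it.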