Hi.
The relatively new "Express" feature -- starting from some typical speed and then working up or down -- is a good idea operationally and statistically, but only if the tests from which that speed is estimated are reasonably stable. In a situation (like the one I'm trying to debug) where speeds fluctuate wildly (e.g., circa 80% of download readings in the 9-12 Mbps range but the other 20% clustering around 15-20 kbps), working from an average is a disaster. I'm seeing automatic tests that start with 7 MB downloads simply stall, sitting for several hours at less than 50% complete.
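To put rough numbers on that -- these are my back-of-the-envelope figures from the distribution described above, not anything taken from your code -- here is the arithmetic as a small Python sketch:

    # Back-of-the-envelope: why a mean-based starting size misleads here.
    # The speeds are my rough reading of the distribution above, not measured constants.
    fast_mbps = 10.5    # midpoint of the 9-12 Mbps majority (~80% of readings)
    slow_mbps = 0.017   # midpoint of the 15-20 kbps minority (~20% of readings)
    mean_mbps = 0.8 * fast_mbps + 0.2 * slow_mbps    # ~8.4 Mbps "typical" speed

    payload_mbit = 7 * 8                             # the 7 MB initial download
    print(f"mean estimate:         {mean_mbps:.1f} Mbps")
    print(f"time at the mean:      {payload_mbit / mean_mbps:.0f} s")         # ~7 s
    print(f"time in the slow mode: {payload_mbit / slow_mbps / 3600:.1f} h")  # ~0.9 h

So the average says the 7 MB download is a non-issue, while any test that happens to land in the slow mode needs the better part of an hour (or longer, judging by the less-than-50% progress I'm actually seeing) before it can finish.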
If there were an effective and very quick downgrade procedure, that might still be OK, but whatever downgrade procedure is built in appears (from observation) to be triggered not by a timer but by the initial data download actually completing.
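For what it's worth, here is the sort of behaviour I would have expected, as a minimal Python sketch. I obviously don't know your internals; timed_download and read_some_bytes are names I made up, and the 10-second cap is an arbitrary placeholder. The point is only that the initial transfer is cut off by a timer and the test re-sized from whatever rate was measured so far, rather than waiting for the whole payload:

    import time
    import random

    def read_some_bytes() -> float:
        """Stand-in for one read off the wire; returns MB received this pass."""
        time.sleep(0.05)
        return random.uniform(0.0, 0.2)

    def timed_download(size_mb: float, downgrade_after_s: float = 10.0) -> float:
        """Stop the initial transfer when the timer expires, not when it completes,
        and return the rate (Mbit/s) measured so far so a smaller test can be chosen."""
        start = time.monotonic()
        received = 0.0
        while received < size_mb and time.monotonic() - start < downgrade_after_s:
            received += read_some_bytes()
        return received * 8 / (time.monotonic() - start)

On a connection stuck in the slow mode, a cap like that fires after a few seconds instead of letting a 7 MB transfer sit for hours.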
To head off comments about the connection itself (more on that when I figure out what is going on): I've had packet analysers on both the LAN and between the local router and the cable modem, and they see fairly consistent local traffic density except when tests are being run. Running your tests from multiple computers on the LAN, and using different testing procedures, yields roughly consistent results (the numbers may differ, but the wide fluctuations, and when they occur, are consistent).
IMO, there needs to be a way to easily disable the "Express" mechanism. Even if you fix the downgrade procedure, there will probably always be edge cases extreme enough for it to fail.