On 1/23/13 6:28 PM, Peter La wrote:
> This is fantastic. :)
>
> To Dietrich's points, I'd like to add:
>
> 1. Let's separate 'cold' and 'hot' startup times. I'm not sure if the 30 runs include 1 'cold' startup, but if they do, it probably shouldn't be counted.
Yes, there's currently 1 cold launch. I can look into enhancing the
test run to include both hot and cold launches.
> Except for apps that depend on hardware (like Camera), or that need to load data upon launch, most apps' hot startup times haven't been a big issue; they generally take less than 1 second, or about the time of the screen transition animation. It's the cold startup times that tend to take anywhere from 2.5-5 seconds, sometimes more.
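As a rough illustration of the separation being asked for here, the harness could report the first (cold) launch on its own and summarize the remaining hot launches separately, so the cold outlier doesn't skew the hot numbers. This is only a sketch; `split_cold_hot` and the timings below are made up for illustration and are not part of b2gperf:

```python
from statistics import median

def split_cold_hot(timings_ms):
    """Split a series of launch timings into the first ('cold') launch
    and the remaining ('hot') launches, so each can be reported
    separately. Assumes the first launch in the series is the only
    cold one, as in the test run described in this thread."""
    if not timings_ms:
        return None, []
    return timings_ms[0], timings_ms[1:]

# Simulated timings (ms): one slow cold launch, then hot launches.
timings = [3200, 850, 910, 880, 870, 900]
cold, hot = split_cold_hot(timings)
print("cold launch: %d ms" % cold)        # reported on its own
print("hot median:  %d ms" % median(hot)) # not skewed by the cold launch
```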
>
> 2. In addition to time to first paint, I think a useful metric to add is the time it takes to reach an interactable state, i.e. when you can start tapping on things. The latter is really important for a good user experience.
Do you have suggestions for events to listen to that would indicate
this state has been reached?
> This is *awesome*, thanks so much.
>
> A couple of questions:
>
> 1. Does each datapoint in the graph represent the 30-runs of each app
> combined? (e.g.: 30 * NumberOfAppsTested)
>
> 2. Is first-paint the right metric to use? Cc'ing Taras Glek for input.
>
> 3. Can we get measurements per-app? This will help detect per-app
> regressions, as well as communicate performance behavior to partners.
>
> 4. Can we add tests with user data? Eg, a common test our partners are
> doing is "add 1000 contacts, restart, load Contacts app". Telemetry showed
> us in Firefox desktop that testing without user data is of limited value.
>
> 5. I'm worried that testing "hot" doesn't give us visibility into the worst-
> case performance. The pathologically bad performance cases are a lot of
> what got Firefox a bad performance rap, so we should ensure we're either
> testing only cold start, or testing it separately.
>
>
> On Wed, Jan 23, 2013 at 2:45 AM, Dave Hunt <dh...@mozilla.com> wrote:
>
>> We now have basic B2G performance tests reporting to
>> https://datazilla.mozilla.org/b2
>>
>> There are a few things worth noting:
>>
>> 1. The version number is taken from the device (these are running against
>> nightly unagi engineering builds with latest master Gaia flashed on top).
>> Until recently this was 1.0.0-prerelease, but it now appears to be
>> 1.0.0.0-prerelease, which is why you will see two charts.
>>
>> 2. The revision identifier for these tests in DataZilla is the git commit
>> hash for Gaia; however, DataZilla is currently only set up to work with
>> Mercurial, so any links to the commit are broken. One of the next steps is
>> to switch this to use GitHub and leverage the API to detect and report
>> performance regressions.
>>
>> 3. Clicking the summary chart at the top of the results page will display
>> a larger version of the chart below. In this chart, each point represents a
>> test run, and clicking data points will display detailed results for the
>> test run in an additional chart below. If you then click a data point in
>> this chart, a final chart showing the data submitted for that
>> application/metric will be displayed.
>>
>> It's also worth a brief explanation of what these tests do. For now they
>> simply open the Phone, Messages, and Settings apps and take a measurement
>> of the time before the first paint event is fired. Each app is launched 30
>> times, and these tests are triggered when a new nightly build is available,
>> or when Gaia master branch is updated.
>>
>> I encourage and look forward to your questions, feedback, and suggestions.
>> :)
>>
>> For those interested, the source code for these tests can be found here:
>>
>> http://hg.mozilla.org/users/tmielczarek_mozilla.com/b2gperf
>>
>> Cheers,
>> Dave Hunt
>> Automation Development
>> _______________________________________________
>> dev-gaia mailing list
>> dev-...@lists.mozilla.org
>> https://lists.mozilla.org/listinfo/dev-gaia