Waterfall, metrics and median reporting questions #4255

Open
ygarde opened this issue Aug 13, 2024 · 1 comment

Comments


ygarde commented Aug 13, 2024

Your question

I have a handful of questions about various aspects of Sitespeed.io's functionality:

  • When comparing features between Sitespeed.io and WebPageTest, the waterfall chart in sitespeed.io reported significantly fewer requests than the exact same page load on WPT. Is there any known reason for this?
  • Is there anywhere that Sitespeed.io reports the Content Download Time metric that WPT shows on a per-request basis in its waterfall chart? I could not find this metric reported anywhere.
  • Is there a way to base which run is reported as the median run (on pages like the Waterfall view, for example) on the LastVisualChange metric instead of SpeedIndex?
  • I am currently using the analysisstorer plugin to get JSON versions of my test results. Is there a clean way to get run-by-run results (including all metrics for each run) in a single file, rather than the current behaviour where there is an individual JSON file per run per sitespeed.io module?

I will update this thread with more questions as necessary.

@soulgalore
Member

Hi @ygarde

> When comparing features between Sitespeed.io and WebPageTest, the waterfall chart in sitespeed.io reported significantly fewer requests than the exact same page load on WPT. Is there any known reason for this?

Can you share an example so I can have a look? I think by default WebPageTest ends a test after X seconds of no network activity. In sitespeed.io, by default (though you can change this), the test ends 2 seconds after loadEventEnd. You can check out https://www.sitespeed.io/documentation/sitespeed.io/browsers/#choose-when-to-end-your-test and try --pageCompleteCheckNetworkIdle to see if that makes any difference.
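For reference, the default end-of-test condition can be sketched like this. This is an illustration only, not Browsertime's actual code:

```javascript
// Illustrative sketch (not Browsertime's actual implementation) of the
// default "end the test 2 seconds after loadEventEnd" condition.
function pageComplete(timing, now) {
  // loadEventEnd is 0 until the load event has actually fired.
  return timing.loadEventEnd > 0 && now >= timing.loadEventEnd + 2000;
}

// In the browser this would be evaluated roughly as:
//   pageComplete(window.performance.timing, Date.now())
console.log(pageComplete({ loadEventEnd: 1000 }, 3000)); // → true
console.log(pageComplete({ loadEventEnd: 0 }, 3000));    // → false
```

With a network-idle check instead, requests that start after loadEventEnd (third-party scripts, lazy-loaded assets) keep the test alive longer, which is one plausible reason WPT shows more requests.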

> Is there anywhere that Sitespeed.io reports the Content Download Time metric that WPT shows on a per-request basis in its waterfall chart? I could not find this metric reported anywhere.

Maybe. What does Content Download Time measure, and how is it defined?

> Is there a way to base which run is reported as the median run (on pages like the Waterfall view, for example) on the LastVisualChange metric instead of SpeedIndex?

No, it's not configurable today, but it could be done with some changes. The current logic is: if we have SpeedIndex, use that; if not, fall back to loadEventEnd.
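That selection can be sketched like this. This is an assumed reimplementation for illustration, not sitespeed.io's actual code:

```javascript
// Sketch of median-run selection (illustrative, not sitespeed.io's code):
// prefer SpeedIndex when every run has it, otherwise fall back to
// loadEventEnd, and pick the run sitting at the median of that metric.
function pickMedianRun(runs) {
  const metric = runs.every(r => r.SpeedIndex !== undefined)
    ? 'SpeedIndex'
    : 'loadEventEnd';
  const sorted = [...runs].sort((a, b) => a[metric] - b[metric]);
  // Lower median for an even number of runs.
  return sorted[Math.floor((sorted.length - 1) / 2)];
}

// Example: three runs, all with SpeedIndex available.
const runs = [
  { run: 1, SpeedIndex: 1200, loadEventEnd: 1800 },
  { run: 2, SpeedIndex: 900,  loadEventEnd: 1500 },
  { run: 3, SpeedIndex: 1100, loadEventEnd: 1700 }
];
console.log(pickMedianRun(runs).run); // → 3
```

Swapping in LastVisualChange would mostly be a matter of changing which metric name this logic prefers.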

> I am currently using the analysisstorer plugin to get JSON versions of my test results. Is there a clean way to get run-by-run results (including all metrics for each run) in a single file, rather than the current behaviour where there is an individual JSON file per run per sitespeed.io module?

I think the easiest way is to create your own plugin, collect the messages you need, and store them the way you want. Let me know if you need any guidance.
