Slashing Bad User Experience Using DevTools - Addendum.
by Henri Helvetica
These are notes meant to complement the FITC presentation during Web Unleashed 2019.
I’d like to add that devtools are 80%-90% the same across all major browsers, with the odd difference here and there. If you have any questions about the talk and/or this doc, feel free to hit me on Twitter at @HenriHelvetica.
Since the deck was pretty much a series of screenshots, this will consist of the odd comments and links to the info on the slides, in order of pages as they appear. Sorry about the size of the deck; it was full of screenshots (PNGs).
+ 29. This was a great example of what poor performance looks like, and I spotted it when I was looking for some info for a conference happening in the fall: https://experience.afrotech.com (⚠️ Nota Bene: visit the site at your own peril, as it’s 90MB+). We looked at the network waterfall, where we spotted long load times.
Here we take a look at the number of requests, which points to what is likely a set of large resources and a heavy payload. And this was confirmed.
We then went to the table heading, where everything can be sorted alphabetically or ascending/descending (in size). In this case, we hit SIZE, which will sort all resources being loaded by size.
This is where we see that the largest resources in this case were in fact several images (which is very often the case), and thus the culprit.
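If you want to do the same size sort outside the browser, the Network panel lets you export the whole waterfall as a HAR file (right-click in the request table → "Save all as HAR with content"). Here's a minimal sketch of sorting those entries by transferred bytes; `example.har` is a placeholder filename, and `_transferSize` is a nonstandard field Chrome adds to its HAR exports, so we fall back to the standard `bodySize` when it's absent:

```python
import json

def largest_resources(har_path, top=10):
    """Sort HAR entries by transferred bytes, largest first."""
    with open(har_path) as f:
        har = json.load(f)

    def size_of(entry):
        # Chrome's HAR export adds a nonstandard _transferSize;
        # fall back to the standard bodySize field otherwise.
        return entry["response"].get("_transferSize", entry["response"]["bodySize"])

    by_size = sorted(har["log"]["entries"], key=size_of, reverse=True)
    return [(e["request"]["url"], size_of(e)) for e in by_size[:top]]
```

Running this against a HAR capture of a heavy page surfaces the same culprits the SIZE column does, just in a scriptable form.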
Getting back to an ordinary page load, we like to immediately look at some quick info, like the number of requests made, the transferred and resource sizes (side by side, and more on that later), and some timing information like the onLoad event.
Like we mentioned in #31, by going to the table heading, we can start to cycle through the resources to look for anomalies: 404s, duplicate files, odd caching directives, and so much more.
We can also filter by resource type. Again, this is pretty important in how we can start to isolate the type of resource to make sure things loaded correctly. Something I did mention is the opportunity to Command-click the resource type filters to show two or more resource types on the same page. For example, you might Command-click all text resources to make sure they’re all compressed at a glance.
You can additionally filter the resource results in several ways, isolating by, say, header, MIME type, domain, or resource size (100KB or higher, or lower). Super handy for investigating.
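The same kind of filtering (the Network panel's filter box accepts keywords like `mime-type:`, `domain:`, and `larger-than:100k`) can be reproduced against a HAR export. A rough sketch, assuming a HAR file saved from the Network panel; the `filter_entries` name and `example.har` path are my own:

```python
import json

def filter_entries(har_path, mime_prefix=None, min_bytes=0):
    """Filter HAR entries by MIME type prefix and minimum body size,
    mimicking the Network panel's mime-type: / larger-than: filters."""
    with open(har_path) as f:
        entries = json.load(f)["log"]["entries"]
    matches = []
    for e in entries:
        mime = e["response"]["content"].get("mimeType", "")
        size = e["response"]["bodySize"]
        if mime_prefix and not mime.startswith(mime_prefix):
            continue  # wrong resource type
        if size < min_bytes:
            continue  # too small to care about
        matches.append((e["request"]["url"], mime, size))
    return matches
```

For example, `filter_entries("example.har", mime_prefix="image", min_bytes=100_000)` would list every image over 100KB, much like typing `larger-than:100k` with the Img filter active.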
These are ways to spot things like duplicate files that may have been left behind by incessant updates.
see 39
+ 42, 43. This gear in the top left will allow you to expose some more features. Here we looked at the opportunity to display the resource list and clearly expose whether or not each resource was compressed. You can look at more network analysis info on the Google reference page here: https://developers.google.com/web/tools/chrome-devtools/network/reference
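That compression check can also be scripted from a HAR export: a text resource arriving without a `Content-Encoding` response header was most likely served uncompressed. A small sketch (my own helper name; it's a heuristic, since a missing header occasionally just means the server stripped it):

```python
import json

def uncompressed_text(har_path):
    """List text-ish resources (HTML/CSS/JS/JSON) served without a
    Content-Encoding header, i.e. likely not gzip/brotli compressed."""
    with open(har_path) as f:
        entries = json.load(f)["log"]["entries"]
    flagged = []
    for e in entries:
        mime = e["response"]["content"].get("mimeType", "")
        if not (mime.startswith("text/") or "javascript" in mime or "json" in mime):
            continue  # binary formats like images carry their own compression
        headers = {h["name"].lower(): h["value"] for h in e["response"]["headers"]}
        if "content-encoding" not in headers:
            flagged.append(e["request"]["url"])
    return flagged
```

Anything this flags is worth a second look in the panel's Content-Encoding column.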
+ 45-47. Pat Meenan mentioned how he spends a lot of time looking at the filmstrip. It’s a great tool, as it allows you to see a page load in increments, like a series of screenshots. In Chrome DevTools, you will see individual screenshots of the page load, with a timestamp at the base, showing what painting took place. This allows you to pinpoint what appeared and when, at which point you can make an informed decision and deep dive further if adjustments need making. https://developers.google.com/web/updates/2015/05/film-strips-in-network-panel
+ 50. A waterfall will always show you which parts of the load process took place and how long each lasted. You can look at the actual resource for quick timing info, or hover over it to get a little tooltip with additional info. Additionally, to look at that expanded info, you can click on the resource name (far left column), then go to the far right and click on TIMING at the top. This will provide info on the resource load and where possible bottlenecks were. https://developers.google.com/web/tools/chrome-devtools/network/reference#timing
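The phases the Timing tab shows (blocked, DNS, connect, send, wait, receive) are also recorded per entry in a HAR export, under `timings`, where `-1` marks a phase that didn't apply. A sketch of pulling them out for one resource; the function name and `url_substring` matching are my own convenience:

```python
import json

def timing_breakdown(har_path, url_substring):
    """Return the per-phase timings (ms) for the first entry whose URL
    contains url_substring, similar to what the Timing tab displays."""
    with open(har_path) as f:
        entries = json.load(f)["log"]["entries"]
    for e in entries:
        if url_substring in e["request"]["url"]:
            # -1 means the phase does not apply to this request
            return {phase: ms for phase, ms in e["timings"].items() if ms != -1}
    return None
```

A long `wait` value here is the same stalled server response you'd spot as a long light-colored bar in the waterfall.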
+ 56, 57. Code coverage is important. As some of you may know, both JavaScript and CSS have varying taxing effects on the rendering of a page. All of it must be parsed. We can now find out how much of your CSS and JS was left unused by looking at the Coverage tab. You can then make decisions about editing parts of the unused blocks of code. DevTools will not only provide a percentage of unused code; you can also look at the actual code that was left unused by clicking on the mentioned percentage info. At that point, you can investigate and see what additional steps to take. https://developers.google.com/web/tools/chrome-devtools/coverage/
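The Coverage tab also has an export (download) button that saves the results as JSON: an array of entries with the resource `url`, its full `text`, and the used byte `ranges`. Assuming that export format, here's a sketch of recomputing the unused percentage per file outside the browser:

```python
import json

def unused_percentage(coverage_path):
    """Compute the percentage of unused bytes per file from a JSON
    export of the DevTools Coverage tab (url / ranges / text entries)."""
    with open(coverage_path) as f:
        report = json.load(f)
    results = {}
    for entry in report:
        total = len(entry["text"])
        used = sum(r["end"] - r["start"] for r in entry["ranges"])
        if total:
            results[entry["url"]] = round(100 * (total - used) / total, 1)
    return results
```

Handy for tracking whether that unused percentage actually shrinks release over release, rather than eyeballing it in the panel each time.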
Lighthouse has proven to be a fan fave, as it has allowed developers to get some direction as to what may be a current challenge with their page. You can reach Lighthouse under the Audits panel, where you then pick PERFORMANCE as the audit you want. Lighthouse will take your page through a series of tests, under throttled conditions, and return a score out of 100. There you will be offered some ways to reduce various bottlenecks, with additional explanation provided by drop-down menus.
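Lighthouse also runs from the command line (`lighthouse <url> --output=json`), producing a JSON report where the performance score lives at `categories.performance.score` as a 0-1 value. Assuming that report format, a tiny sketch of pulling the 0-100 score out, say, to fail a CI build when it drops:

```python
import json

def performance_score(report_path):
    """Extract the 0-100 performance score from a Lighthouse JSON report
    (the raw score is stored as a 0-1 float)."""
    with open(report_path) as f:
        report = json.load(f)
    return round(report["categories"]["performance"]["score"] * 100)
```

The `report_path` here is whatever you passed to `--output-path` when running the CLI.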
Slides (⚠️ 30MB+): http://bit.ly/SlashingBadUX_slides