27. A dependency graph is a graph of the modules in
your application. Starting from the entry point,
webpack recursively adds all imported
modules to the graph and bundles them into
(usually) one file.
@filrakowski
35. Code splitting allows you to split your code
into various bundles. You can think of this
as deferring a piece of your dependency
graph to be loaded later.
36. Lazy loading is loading a chunk of code on demand,
usually in response to a user action.
62. Rule #5 - Make use of cache for static assets
63. Service workers essentially act as proxy
servers that sit between web applications, the
browser, and the network (when available).
They can be used to serve responses
from a cache instead of the network.
90. Thank you!
If you already forgot something - don’t worry.
I’ll share the slides :)
Editor's Notes
How many of you think performance is important?
For those who did not raise their hands, I will pretend that I'm smart and show some numbers to prove my point that web performance is extremely important.
First of all, as you can see, only one second of waiting is enough for the user to make a mental context switch and potentially leave our website. Now let's see how performance affects other numbers.
Given all these numbers we can assume a simple equation: speed equals money. Do you agree?
Now that you're hopefully convinced of how important web performance is, let's find the main factors affecting it.
When we want some resource, we need to request it and wait for the response. The bigger the file, the longer it takes to download it.
Next, the JavaScript engine needs to parse it. Again, the bigger the file, the longer it takes.
Once parsing is finished and our JS engine has created a nice AST the browser can work with, it's time to execute the code, and guess what: the more code there is to execute, the longer it takes.
Ok so it looks like the main factor influencing loading performance is file size.
As frontend developers we can mostly optimize the parsing and execution phases, but we will see later that there are some things we can do to improve response time as well.
For now let’s focus on the area that we certainly can fully control.
Let's see what problems we can encounter by creating a simple app.
It usually starts like this: an entry point with Vue attached and a root component. But over time the application grows.
We will almost certainly add Vue Router. Following this we will add some routes, and probably some 3rd party libs.
Our initially downloaded JS bundle will grow with every module we add
As we remember, speed equals money, so does that mean that more features equals less money?
As you probably guessed, it doesn't. To understand how to solve performance issues with a growing bundle we need to understand how this bundle is constructed.
It turns out that webpack under the hood creates something called a dependency graph. It just recursively checks the imports in JS modules, starting from your entry point, and adds them to the graph. To better understand this process let's see an example.
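The process can be sketched like this (file and module names are illustrative; this is not the exact example from the slides):

```javascript
// main.js — the entry point, where webpack starts building the graph
import Vue from 'vue'        // node_modules/vue      → added to the graph
import App from './App.vue'  // ./App.vue             → added to the graph

// App.vue in turn imports Header.vue, Header.vue imports Logo.vue, …
// webpack follows every import recursively, and every module it
// discovers ends up bundled into (usually) one output file.
new Vue({ render: h => h(App) }).$mount('#app')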
The problem with bundling all modules into one file is that we might not need some of the modules, depending on the current state of the application.
This is why webpack has a feature called code splitting. It allows you to split your code into different bundles and load them on demand.
Lazy loading is just loading these code-split chunks on demand. So we are cutting out some modules and loading them only when they are needed.
We can tell webpack which parts of the code it should split by using a dynamic import. Unlike a regular import, a dynamically imported module is an entry point for a completely new bundle. The bundle will be lazily loaded ONLY when this function is invoked.
By dynamically importing routes we make sure that only the visited route will be downloaded.
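A minimal sketch of per-route code splitting with vue-router (component paths are hypothetical):

```javascript
import Vue from 'vue'
import Router from 'vue-router'

Vue.use(Router)

export default new Router({
  routes: [
    // A statically imported component would land in the main bundle.
    // Each dynamic import below becomes its own lazily loaded chunk,
    // fetched only when the user actually visits the route.
    { path: '/', component: () => import('./views/Home.vue') },
    { path: '/about', component: () => import('./views/About.vue') },
  ],
})
```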
Code-split JS bundles generated by webpack will look like this. We have three bundles instead of one big file.
If you are using Nuxt you have this feature out of the box
So rule number one…
Per-route code splitting will give you more than any other lazy loading technique, but we can do more.
There are still many things that are not needed right after the user enters your website and can be loaded lazily.
Like popups
Sidebars or any other off-screen components
So rule number 2
You can use v-if to conditionally load off-screen components. When v-if is false, there is no render, so no invocation and no download. It's a great way to conditionally load components.
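For illustration, here is how v-if and a dynamic import combine (Vue 2 style; component and flag names are hypothetical):

```javascript
// The popup is both conditionally rendered and lazily loaded.
// While showPopup is false, <Popup> is never rendered, its factory
// is never invoked, and its chunk is never downloaded.
export default {
  components: {
    Popup: () => import('./Popup.vue'), // separate webpack chunk
  },
  data: () => ({ showPopup: false }),
  // template: '<Popup v-if="showPopup" />' — toggled by a user action
}
```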
It turns out that most 3rd party libs can also be loaded lazily.
And the syntax is the same. The import function returns a promise resolving to the lazily loaded lib.
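A runnable sketch of the pattern. Node's built-in 'path' module stands in for a 3rd-party lib so the snippet runs anywhere; with webpack, the imported lib would become its own lazily loaded chunk:

```javascript
// import() returns a promise resolving to the module's exports —
// the lib is only fetched the first time this function runs.
async function buildAssetUrl(...segments) {
  const { posix } = await import('node:path') // stand-in for a real 3rd-party lib
  return posix.join('/assets', ...segments)
}

buildAssetUrl('img', 'logo.png').then(url => console.log(url)) // logs "/assets/img/logo.png"
```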
When speaking about 3rd party libs, there is a common approach of keeping all node modules in a vendor bundle.
Don't do this. It's much better to keep 3rd party libs in route bundles, or even better, load them lazily whenever possible. Otherwise users will download redundant code.
You'll probably say: "Hey Filip, this leads to code duplication. Aren't vendor bundles meant to prevent it?"
What if we have the same lib imported in two routes? How do we deal with this duplication?
Now the shared modules will be bundled into a separate file and therefore downloaded only once
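With webpack 4+, a splitChunks option along these lines does the job (a sketch, not necessarily the exact config from the slides):

```javascript
// webpack.config.js (fragment)
module.exports = {
  optimization: {
    splitChunks: {
      chunks: 'all', // consider both lazy and initial chunks
      minChunks: 2,  // extract modules shared by at least two chunks
    },
  },
}
```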
Staying on the topic of 3rd party libs: there are tons of them, and not all of them are good for your use case, or good overall, so choose them carefully.
There is a great website that can help you choose appropriate libraries - Bundlephobia.
It will give you all the information on how adding a particular library will affect your app's performance.
What is even better - it will also suggest alternatives, so you can really choose the best option.
Previously I mentioned that as frontend devs we don't have influence on how fast the server responds, but it turns out there are some things we can do to improve this part of the process.
We can reduce the number of network requests by making use of assets that we already downloaded
And we can use service workers for this. They work as a proxy between your client web app and the server.
We send a request to the service worker, which proxies it to the network.
But it can also proxy the request to a cache if the asset was previously downloaded, and respond almost instantly. The Service Worker cache persists even when the browser has been closed.
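For illustration, a minimal cache-first fetch handler a service worker might use (a hand-written sketch with an illustrative cache name; in practice the Vue CLI and Nuxt PWA plugins generate this logic for you):

```javascript
// sw.js — serve from cache when possible, fall back to the network
const CACHE = 'static-assets-v1'

self.addEventListener('fetch', event => {
  event.respondWith(
    caches.match(event.request).then(cached => {
      if (cached) return cached // cache hit: respond without a network round trip
      return fetch(event.request).then(response => {
        const copy = response.clone() // a response body can be read only once
        caches.open(CACHE).then(cache => cache.put(event.request, copy))
        return response
      })
    })
  )
})
```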
You can either install Vue-cli PWA plugin for this
..or Nuxt PWA module
In both cases you will see the cached data under the Application tab of your devtools. Of course you can configure it as you wish.
We can make use of prefetching
Which is just downloading assets before they are needed
So rule number 5
We can explicitly tell webpack which chunks of code we want to prefetch with webpack magic comments, which will dynamically add a prefetch link to the head of our application.
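A sketch of the magic-comment syntax (component path is hypothetical):

```javascript
// webpack emits <link rel="prefetch"> for this chunk, so the browser
// downloads it during idle time, before the user navigates to it.
const ProductPage = () => import(/* webpackPrefetch: true */ './views/Product.vue')
```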
Or use more general solutions with regular expressions
If you are using Vue CLI 3, it prefetches code-split chunks out of the box.
Ok, we learned a lot about performance optimization, but it's equally important to know where to optimize and how to measure our results.
You can use the devtools "Coverage" tool to see how much of the shipped code was actually executed.
You can also use Webpack Bundle Analyzer to actually see the size of your modules. This one is a must-have.
…and installation is extremely easy.
You can use the 'bundlesize' package to make sure you are not exceeding a reasonable package size.
..it also has a very nice GitHub integration
And you can use the Import Cost plugin to keep track of the size of your imports.