What this page weight checker measures
The checker runs a single GET request and reads the response. It records the uncompressed HTML size, the compressed wire size from Content-Length, response time in milliseconds, and the Content-Encoding value (gzip, br, or none). It parses the HTML and counts external assets: stylesheets, scripts, images, fonts, and iframes.
It does not download every asset. Fetching 80 sub-resources per page would be slow and would hammer the target site. Instead the tool uses Web Almanac 2024 medians (HTML 30 KB median, 75 KB at p75; total 2.4 MB median, 5 MB at p75) and asset-type multipliers to estimate total weight. The estimate is an honest one, labeled as such, and the whole check runs in under two seconds.
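Under the hood, the flow looks roughly like the sketch below. The per-type weight table, helper names, and regexes are illustrative assumptions, not the tool's actual source:

```ts
// Sketch of the checker's single-request flow. The per-type weights
// below are illustrative placeholders, not the tool's real constants.
const ASSET_WEIGHT_KB: Record<string, number> = {
  stylesheet: 25,
  script: 45,
  image: 80,
  font: 50,
  iframe: 150,
};

async function checkPageWeight(url: string) {
  const start = Date.now();
  const res = await fetch(url); // single GET; sub-resources are never fetched
  const html = await res.text(); // most runtimes decompress transparently,
  const responseMs = Date.now() - start; // so html.length is the raw size

  const encoding = res.headers.get("content-encoding") ?? "none";
  const wireBytes = Number(res.headers.get("content-length") ?? 0); // size as sent
  const cacheControl = res.headers.get("cache-control") ?? "";

  // Count declared external assets. A real implementation would use an
  // HTML parser; the regexes only illustrate the counting step.
  const counts: Record<string, number> = {
    stylesheet: (html.match(/<link[^>]+rel=["']stylesheet["']/gi) ?? []).length,
    script: (html.match(/<script[^>]+src=/gi) ?? []).length,
    image: (html.match(/<img[^>]+src=/gi) ?? []).length,
    font: (html.match(/<link[^>]+as=["']font["']/gi) ?? []).length,
    iframe: (html.match(/<iframe[^>]+src=/gi) ?? []).length,
  };

  // Estimated total weight: raw HTML plus a per-type median per asset.
  let estimatedKb = html.length / 1024;
  for (const [type, n] of Object.entries(counts)) {
    estimatedKb += n * ASSET_WEIGHT_KB[type];
  }

  return { htmlBytes: html.length, wireBytes, encoding, cacheControl, responseMs, counts, estimatedKb };
}
```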
How to use this page size checker
- Enter Page URL. Paste the full URL you want to audit, including https://. The checker accepts any public page that returns HTML. JS-only single-page apps return whatever the server sends before client-side rendering, which is also what Googlebot sees first.
- Hit Check page size. The tool fetches the URL, reads headers, parses asset references, and returns a card with HTML doc size, gzip wire size, response time, encoding, cache-control, asset counts, and an estimated total weight with a benchmark badge.
Try this with https://www.nytimes.com. The checker returns roughly 280 KB raw HTML, 55 KB gzipped, a 600 ms response, br encoding, 18 stylesheets, 42 scripts, 60+ images, 8 fonts, and an estimated 4.8 MB total. That sits near the Web Almanac p75, flagged amber. A lean portfolio page typically returns 8 KB HTML, 3 KB gzipped, and an estimated 200 KB total, flagged green.
Why page weight matters for Core Web Vitals
Page weight drives Largest Contentful Paint (LCP). The Web Almanac 2024 found pages above 3 MB failed the 2.5-second LCP threshold 62% of the time on mobile, while pages under 1 MB failed only 18%. Google uses Core Web Vitals as a ranking signal, so heavy pages lose positions even when content is strong.
Wire size matters more than raw size. A 250 KB HTML doc compressed to 35 KB on the wire ships faster than a 60 KB doc sent uncompressed. The checker shows both. If Content-Encoding is empty on a text response over 5 KB, you waste bandwidth on every visitor. Enabling brotli at the edge cuts text payloads 70-80% with no code change.
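To see the gap on your own pages, a short Node sketch (the URL is a placeholder; run it as an ES module) fetches the raw bytes and compares what gzip and brotli would ship:

```ts
import { brotliCompressSync, gzipSync } from "node:zlib";

// Fetch the uncompressed bytes of a text response, then compare the
// gzip and brotli output sizes. example.com is just a placeholder.
const res = await fetch("https://example.com", {
  headers: { "accept-encoding": "identity" }, // ask the server not to compress
});
const raw = Buffer.from(await res.arrayBuffer());

console.log(`raw:    ${raw.length} bytes`);
console.log(`gzip:   ${gzipSync(raw).length} bytes`);
console.log(`brotli: ${brotliCompressSync(raw).length} bytes`);
```

On typical HTML the compressed numbers land 70-80% below the raw one, which is the saving the checker flags when Content-Encoding comes back empty.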
Common mistakes
- Trusting the uncompressed HTML size. Browsers download the gzipped or brotli-compressed bytes. Read the wire size, not the doc size.
- Counting requests instead of weight. Forty 5 KB images weigh less than two 4 MB hero photos. Fix the heaviest assets first.
- Ignoring third-party scripts. Analytics, chat widgets, and tag managers often inject 500 KB+ of JS that never appears in your build. Run the checker on the live page.
- Optimizing only the homepage. Product, blog, and category templates usually weigh 2-3x more because they pull dynamic images and embedded video.
- Treating the asset count as the full picture. The checker reports declared assets; lazy-injected scripts are not counted. Cross-check with google-crawler-simulator for the rendered DOM, or see the stand-in sketch after this list.
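A rough stand-in for that cross-check, sketched here with Puppeteer (the URL and wait condition are assumptions), compares the scripts declared in the raw HTML against the rendered DOM:

```ts
import puppeteer from "puppeteer";

const url = "https://example.com"; // placeholder: use your live page

// Scripts declared in the raw HTML (what this checker counts).
const staticHtml = await (await fetch(url)).text();
const declared = (staticHtml.match(/<script[^>]+src=/gi) ?? []).length;

// Scripts present after JS runs (what the browser actually loads).
const browser = await puppeteer.launch();
const page = await browser.newPage();
await page.goto(url, { waitUntil: "networkidle0" });
const rendered = await page.evaluate(
  () => document.querySelectorAll("script[src]").length,
);
await browser.close();

console.log(`declared: ${declared}, after rendering: ${rendered}`);
```

A large gap between the two numbers usually points at tag managers and widgets injecting scripts at runtime.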
Advanced tips
- Set a budget. Aim for under 1.5 MB on landing pages and under 800 KB on conversion pages. Pages under 1 MB pass LCP 82% of the time on 4G.
- Switch from gzip to brotli. Brotli compresses text 15-25% smaller than gzip and is supported in 96% of browsers as of 2026.
- Convert hero images to AVIF with a WebP fallback. AVIF averages 50% smaller than JPEG; a 400 KB hero usually drops to 90 KB. A conversion sketch follows this list.
- Subset web fonts. A full Google Font weight is 80-120 KB; subsetting to basic Latin drops it to 18-25 KB.
- Pair with the alt-text-checker to clean up the image inventory the size checker counts.
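For the AVIF conversion, a minimal sketch with the sharp image library (file names and quality values are placeholders to tune):

```ts
import sharp from "sharp";

// Emit AVIF plus a WebP fallback from a JPEG hero image.
await sharp("hero.jpg").avif({ quality: 50 }).toFile("hero.avif");
await sharp("hero.jpg").webp({ quality: 75 }).toFile("hero.webp");
```

Serve both through a `<picture>` element with the AVIF `<source>` listed first so browsers take the smallest format they support.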
Once you have a number, fix the heaviest assets first and re-check. Use the google-crawler-simulator to see what a bot sees after JS runs, and the h1-checker to confirm the leaner HTML still ships a valid heading structure.