feat(perf): Beasties critical CSS — homepage-only #335
Install Beasties dependency and implement critical CSS extraction:

- Extract above-the-fold CSS and inline it in a `<head>` `<style>` block
- Defer the main CSS bundle via preload with an onload swap strategy
- Add a `<noscript>` fallback for no-JS browsers

Build process flow:

1. Hugo builds with the production environment
2. `bin/inline-critical` post-processes the `public/` output
3. Converts absolute URLs to relative for Beasties processing
4. Applies the preload + swap strategy for async CSS loading
5. Restores absolute URLs to match `baseURL`

Performance impact: defers the render-blocking main CSS bundle, allowing critical CSS to render without waiting for the full stylesheet download.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
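The preload + onload-swap rewrite described in the commit above can be sketched as a string transform over the built HTML. This is a minimal illustrative sketch, not the actual `bin/inline-critical.mjs` code; the function name and regex are hypothetical.

```javascript
// Hypothetical sketch: rewrite blocking stylesheet links into a
// preload-then-swap pattern, with a <noscript> fallback for no-JS browsers.
function deferStylesheet(html) {
  return html.replace(
    /<link rel="stylesheet" href="([^"]+)">/g,
    (_, href) =>
      // Preload the CSS, then flip rel to "stylesheet" once it has loaded.
      `<link rel="preload" href="${href}" as="style" ` +
      `onload="this.onload=null;this.rel='stylesheet'">` +
      `<noscript><link rel="stylesheet" href="${href}"></noscript>`
  );
}

const out = deferStylesheet('<link rel="stylesheet" href="/css/main.css">');
// out now contains the preload link followed by the <noscript> fallback
```

The onload handler nulls itself before swapping `rel` so the swap runs only once; the `<noscript>` copy keeps the page styled when JavaScript is disabled.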
…ion elsewhere

Refactor `bin/inline-critical.mjs` to process only the root `index.html` (homepage) instead of recursively processing all HTML files.

Performance measurements showed:

- Homepage: LCP improved 23.3% (4993ms → 3829ms) — critical CSS inlining pays off
- Other pages: FCP/LCP regressed +7.9% mean due to 75KB of inline CSS overhead exceeding the first-paint budget at simulated mobile bandwidth

Solution: apply Beasties critical CSS inlining only to the homepage, where the above-the-fold content is large enough to justify the trade-off. The other 6 templates use a plain `<link rel="stylesheet">` to avoid the overhead.

Changes:

- Replace the `walk()` recursive traversal with `processHomepageOnly()`, targeting the root `index.html`
- Homepage gets inline critical CSS + `<link rel="preload">` + a `<noscript>` fallback
- All other pages skip Beasties, rendering as-is with regular stylesheet links

Verification:

- Homepage: 1+ preload links with onload swap handlers ✓
- Services/blog/careers: 0 preload links (untouched) ✓
- `bin/dtest`: 84/0 (no visual regressions) ✓

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
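The homepage-only scoping amounts to replacing a recursive walk with a filter that keeps just the root `index.html`. A minimal sketch, assuming the real script works on a list of built HTML paths (the function name mirrors the `processHomepageOnly()` mentioned above, but this body is illustrative):

```javascript
// Hypothetical sketch: select only the root index.html for Beasties
// processing; every other page keeps its plain <link rel="stylesheet">.
function selectTargets(outputDir, allHtmlFiles) {
  const homepage = `${outputDir}/index.html`;
  return allHtmlFiles.filter((file) => file === homepage);
}

const targets = selectTargets('public', [
  'public/index.html',
  'public/services/index.html',
  'public/blog/index.html',
]);
// only 'public/index.html' survives the filter
```

The design point is that the cut happens before any file is touched, so non-homepage templates are provably byte-identical to Hugo's output.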
Summary
Adds Beasties to the production build pipeline, scoped to the homepage only. Improves homepage LCP by 23.3% with no regression on the other 6 layouts.
Why scoped to homepage?
Initial implementation (f0b91c41) ran Beasties across all HTML output. Lighthouse measurement showed it regressed FCP on 6 of 7 pages.

Root cause: Beasties extracted 75KB of "critical" CSS into an inline `<style>` block. At simulated mobile bandwidth (~1.6Mbps), transferring 75KB takes ~375ms, exceeding the round-trip it was meant to eliminate. Only the homepage benefited, because its LCP element is a hero image whose render was previously blocked by the 220KB CSS bundle. Deferring that bundle let the image paint sooner.

Final results (Option B — homepage only, commit eb423f20)

Headline number: homepage LCP 4993ms → 3827ms (−1166ms, −23.3%), Lighthouse perf score 79 → 84.
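The ~375ms overhead figure follows from simple transfer arithmetic (assuming 75 KB means 75,000 bytes and ignoring compression):

```javascript
// Back-of-envelope check: time to transfer 75 KB of inline CSS
// over a simulated ~1.6 Mbps mobile link.
const inlineCssBytes = 75 * 1000;   // 75 KB of inlined "critical" CSS
const linkBitsPerSec = 1.6e6;       // ~1.6 Mbps simulated mobile bandwidth
const transferMs = (inlineCssBytes * 8 / linkBitsPerSec) * 1000;
console.log(Math.round(transferMs)); // 375
```

That cost lands before first paint on every page carrying the inline block, which is why only the homepage (whose LCP was bottlenecked on the 220KB bundle) comes out ahead.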
The honest tradeoff
Homepage FCP regressed +12.4% (2405ms → 2702ms). This is the known cost of inlining 75KB of CSS — the browser must parse it before first paint. Two reasons we accept this trade:
`inlineThreshold`). For now, accept the trade.

Visual regression
`bin/dtest`: 84 screenshots, 0 failures. The implementation is correct — only the strategy was wrong in v1, fixed in v2.

Methodology note (saved to memory for next time)
During the first measurement run, Lighthouse silently captured a broken-render baseline because the production build hardcodes `https://jetthoughts.com/...` URLs, which got blocked by Cross-Origin Read Blocking when served from localhost. Fixed by passing `BASE_URL=http://localhost:1313/` to `bin/hugo-build` (which already supports this env var). All measurements in this PR used the corrected methodology. Manual Chrome DevTools verification caught this — Lighthouse alone would have shipped misleading data.

Files changed
- `package.json` + `bun.lockb`: add `beasties` devDependency (0.4.2)
- `bin/inline-critical` (new): shell wrapper
- `bin/inline-critical.mjs` (new): processes ONLY `<output>/index.html` (root homepage)
- `bin/hugo-build`: invokes `bin/inline-critical` after Hugo when `ENVIRONMENT=production`
- `.github/workflows/_hugo.yml`: invokes `bin/inline-critical public` after the Hugo build step

Test plan
- Run the production build (`bin/hugo-build` with `ENVIRONMENT=production`)
- Homepage: inline `<style>` + `<link rel="preload" onload>` + `<noscript>` fallback
- Other pages: plain `<link rel="stylesheet">` (Beasties skipped them)
- `bin/dtest`: 84/0 visual regression

🤖 Generated with Claude Code