Hi - I have identified the root cause and a Caddyfile fix for a previous help request. The original thread has now been closed, but it ranks well on Google for this specific issue, and I'd like to help the next person who comes along.
The original thread is:
Long story short:
Webpack's dev server streams updates using chunked transfer encoding. With Caddy's default response buffering, those chunks are held back instead of being forwarded immediately, which breaks the update stream.
Fix:
In your Caddyfile, specify:
flush_interval -1
as part of your reverse_proxy directive, and hot refresh/hot module reload will work.
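For anyone landing here from search, a minimal sketch of where the directive goes (the site address and upstream port are placeholders; adjust to your setup):

```
example.com {
	reverse_proxy localhost:8080 {
		# -1 disables Caddy's response buffering entirely, so
		# chunked/streamed responses (like Webpack HMR updates)
		# are flushed to the client as soon as they arrive
		flush_interval -1
	}
}
```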
(Sorry for skipping the template, not sure how else to submit a fix/solution to an old thread.)
Nice, thanks for posting this! It’s not obvious to someone like me who has no knowledge of or experience with Webpack HMR, but it makes sense that flushing immediately could cause some clients to behave differently in such a way that those log emissions are reduced/eliminated.
This could/should probably be a wiki article if you’re willing to expand or explain it a bit more!
I could’ve sworn I’d tried something similar to this previously, but I may have been putting it in the wrong spot as well…d’oh! After implementing this, the specific message I saw being spammed constantly has pretty much disappeared. Good to know the equivalent of proxy_buffering off; from nginx.