Re^3: Using a git filter to Cache HTML Resources Locally
by Anonymous Monk on Oct 12, 2018 at 08:04 UTC
What I'm trying to get at is the good practice of local caching without rewriting URLs, e.g. <script src="//cdn..../../../">
Or using something like url_for('resource.js') in JavaScript or Perl, to load resources based on configuration/environment.
Something that's one and done, not a physical rewrite on each change.
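The url_for idea above can be sketched in a few lines; the urlFor name, the config shape, and the resources/cdn directory are all hypothetical, not something from the thread:

```javascript
// Hypothetical url_for-style helper: resolve a resource name against a
// configured base, so switching between the CDN and a local cache is a
// single config change rather than a rewrite of every HTML file.
const config = {
  // Flip this (or detect the environment) to serve from a local cache.
  useLocalCache: false,
  cdnBase: "https://cdnjs.cloudflare.com/ajax/libs",
  localBase: "resources/cdn", // hypothetical local cache directory
};

function urlFor(path) {
  const base = config.useLocalCache ? config.localBase : config.cdnBase;
  return base + "/" + path;
}

console.log(urlFor("normalize/8.0.0/normalize.min.css"));
// → https://cdnjs.cloudflare.com/ajax/libs/normalize/8.0.0/normalize.min.css
```

In a page this would be used when emitting the tag, e.g. building the href for a link element from urlFor(...), so the environment decision is made in exactly one place.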
> local caching without rewriting urls, ex <script src="//cdn..../../../"
Maybe I'm missing something obvious here, but how would you do this for URLs like the ones I showed, e.g. https://cdnjs.cloudflare.com/ajax/libs/normalize/8.0.0/normalize.min.css? Remember I said these files are for public distribution, and I can't rely on people having a local server - it's even possible to open the HTML files from the local disk (file://). Using the public CDN URLs is easy, and doesn't require me to distribute a bunch of extra files along with my own.
> something like url_for('resource.js') in javascript or perl, to load resources based on configuration/environment
Sure, that's a possibility, but then that code gets run for everybody. This tool is really only meant to be a development aid: it applies only to the local git working copy, and only when the user has configured it. The files in the repository are the ones for distribution, and thanks to the filter they always keep their public CDN URLs.
The only downside of the git filter approach that I can see so far is the small performance penalty on some git operations. So I'm not yet convinced that there is something wrong with this approach.
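The mechanics being defended here can be sketched as a pair of inverse transforms; the filter name cdncache and the resources/cdn directory are made-up placeholders, and the real filter script presumably handles more than a single prefix swap:

```javascript
// A git smudge/clean filter is just a pair of inverse text transforms.
// On checkout, smudge points the HTML at the local cache; on commit,
// clean restores the public CDN URLs, so the repository (and anything
// distributed from it) always carries the distribution form.
const CDN = "https://cdnjs.cloudflare.com/ajax/libs";
const LOCAL = "resources/cdn"; // hypothetical local cache directory

const smudge = (text) => text.split(CDN).join(LOCAL);
const clean = (text) => text.split(LOCAL).join(CDN);

// Wired into git (one-time, per working copy) with something like:
//   git config filter.cdncache.smudge <smudge command>
//   git config filter.cdncache.clean  <clean command>
// and in .gitattributes:
//   *.html filter=cdncache

const tracked =
  '<link rel="stylesheet" href="' + CDN + '/normalize/8.0.0/normalize.min.css">';
const checkedOut = smudge(tracked);
console.log(checkedOut);
console.log(clean(checkedOut) === tracked); // the round trip is lossless
```

Because clean is the exact inverse of smudge for this prefix, the working copy can diverge from the repository without the distribution files ever changing.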
> Maybe I'm missing something obvious here, but how would you do this for URLs like the ones I showed, e.g. https://cdnjs.cloudflare.com/ajax/libs/normalize/8.0.0/normalize.min.css? Remember I said these files are for public distribution, and I can't rely on people having a local server - it's even possible to open the HTML files from the local disk (file://). Using the public CDN URLs is easy, and doesn't require me to distribute a bunch of extra files along with my own.
Huh?
The HTML files merely detect whether a local cache exists; otherwise they load from the internet.
So a wannabe developer, or a clever end user, merely runs cacheresources.pl and they're set.
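One way to read "detect if a local cache exists, otherwise load from the internet" is a fallback at load time; the sketch below is hypothetical (chooseUrl and the paths are not from the thread), with the browser wiring shown only in comments:

```javascript
// Hypothetical load-time fallback: prefer the local cache, fall back to
// the CDN if the local copy is missing. The decision itself is a pure
// function; in a browser, "missing" is signalled by the element's
// onerror event firing when the local file fails to load.
function chooseUrl(localUrl, cdnUrl, localExists) {
  return localExists ? localUrl : cdnUrl;
}

// Browser wiring (sketch):
//   const s = document.createElement("script");
//   s.src = "resources/cdn/jquery/3.3.1/jquery.min.js"; // local first
//   s.onerror = () => { s.remove(); /* re-append with the CDN src */ };
//   document.head.appendChild(s);

console.log(chooseUrl("resources/cdn/x.js", "https://cdn.example/x.js", false));
// → https://cdn.example/x.js
```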
The interesting part is not the URL rewriting but the automatic download of the remote URL to a local file.
It wouldn't be hard to try to load both URLs from Javascript, or whatever, but why add additional complexity when you can just rewrite the file?
> The interesting part is not the URL rewriting but the automatic download of the remote URL to a local file.
Heh
Downloading/mirroring is the boring part :) (lwp-rget/wget/curl/httrack...)
The URL rewriting is the eyebrow-raising part (sourcefilter)
> It wouldn't be hard to try to load both URLs from Javascript, or whatever, but why add additional complexity when you can just rewrite the file?
Um, ... Why is one additional complexity but not the other?