in reply to Re: Using a git filter to Cache HTML Resources Locally
in thread Using a git filter to Cache HTML Resources Locally

How do you do this without filters?

I'm not quite sure what you're asking specifically... you can edit the files manually, you can use a hacked solution like this, or you can use a different script like this (the now-removed predecessor of htmlrescache). This is how you would use htmlrescache standalone.
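To give a concrete idea of the "manual" end of that spectrum: a one-off rewrite is little more than a Perl one-liner (URL taken from my example, cache path assumed), although then you have to remember to undo it before committing, which is exactly the chore the filter takes over.

    # one-off, in-place rewrite of a CDN URL to an assumed local copy (illustration only)
    perl -i -pe 's{\Qhttps://cdnjs.cloudflare.com/ajax/libs/normalize/8.0.0/normalize.min.css\E}{cache/normalize.min.css}g' *.html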

Re^3: Using a git filter to Cache HTML Resources Locally
by Anonymous Monk on Oct 12, 2018 at 08:04 UTC

    What I'm trying to get at is the good practice of local caching without rewriting URLs, e.g. <script src="//cdn..../../../"

    Or using something like url_for('resource.js') in JavaScript or Perl, to load resources based on configuration/environment.

    Something that's one and done, not a physical rewrite on each change.

      local caching without rewriting URLs, e.g. <script src="//cdn..../../../"

      Maybe I'm missing something obvious here, but how would you do this for URLs like the ones I showed, e.g. https://cdnjs.cloudflare.com/ajax/libs/normalize/8.0.0/normalize.min.css? Remember I said these files are for public distribution, and I can't rely on people having a local server - it's even possible to open the HTML files from the local disk (file://). Using the public CDN URLs is easy, and doesn't require me to distribute a bunch of extra files along with my own.

      something like url_for('resource.js') in JavaScript or Perl, to load resources based on configuration/environment

      Sure, that's a possibility - but then that code gets run for everybody. This tool is really only meant to be a development aid: it applies only to the local git working copy, and only when it's configured by the user. The files in the repository are those for distribution, and thanks to the filter they always keep their public CDN URLs.

      The only downside of the git filter approach that I can see so far is the small performance penalty on some git operations. So I'm not yet convinced that there is something wrong with this approach.
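
      For anyone who hasn't used clean/smudge filters before: git pipes the file's content through a "smudge" command on checkout and through a "clean" command when the file is staged, each reading STDIN and writing STDOUT. Just to illustrate the mechanism (this is a rough sketch with an assumed URL-to-file mapping and script name, not htmlrescache's actual code), such a filter script could look like this:

          #!/usr/bin/env perl
          # rescache.pl (hypothetical sketch of a clean/smudge filter script)
          # Wiring, with assumed names: put "*.html filter=rescache" in .gitattributes, then
          #   git config filter.rescache.smudge "perl rescache.pl smudge"
          #   git config filter.rescache.clean  "perl rescache.pl clean"
          use warnings;
          use strict;

          my %cache = (   # assumed mapping: public CDN URL => locally cached file
              'https://cdnjs.cloudflare.com/ajax/libs/normalize/8.0.0/normalize.min.css'
                  => 'cache/normalize.min.css',
          );

          my $mode = shift @ARGV;
          die "Usage: $0 smudge|clean\n"
              unless defined $mode && $mode =~ /\A(?:smudge|clean)\z/;

          my $html = do { local $/; <STDIN> };    # slurp the whole file from git
          while ( my ($url, $file) = each %cache ) {
              # "smudge" (checkout) points the page at the local copy,
              # "clean" (staging) restores the public CDN URL
              if   ( $mode eq 'smudge' ) { $html =~ s/\Q$url\E/$file/g }
              else                       { $html =~ s/\Q$file\E/$url/g }
          }
          print $html;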

        Maybe I'm missing something obvious here, but how would you do this for URLs like the ones I showed, e.g. https://cdnjs.cloudflare.com/ajax/libs/normalize/8.0.0/normalize.min.css? Remember I said these files are for public distribution, and I can't rely on people having a local server - it's even possible to open the HTML files from the local disk (file://). Using the public CDN URLs is easy, and doesn't require me to distribute a bunch of extra files along with my own.

        Huh?

        The HTML files merely detect whether a local cache exists; otherwise they load from the internet.

        So a wannabe developer, or clever end user, merely runs cacheresources.pl and they're set.
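
        Something along these lines, say - cacheresources.pl here is only a sketch of the idea, with a made-up URL list and cache/ layout:

            #!/usr/bin/env perl
            # cacheresources.pl (imaginary sketch): fetch the CDN resources once into
            # ./cache/ so that pages which prefer a local copy at runtime will find one
            use warnings;
            use strict;
            use File::Basename qw/basename/;
            use File::Path qw/make_path/;
            use LWP::UserAgent;

            my @urls = (    # whatever resources the pages reference on the CDN
                'https://cdnjs.cloudflare.com/ajax/libs/normalize/8.0.0/normalize.min.css',
            );

            make_path('cache');
            my $ua = LWP::UserAgent->new;
            for my $url (@urls) {
                my $file = 'cache/' . basename($url);
                # mirror() only re-downloads when the remote copy is newer
                my $resp = $ua->mirror($url, $file);
                print "$url => $file: ", $resp->status_line, "\n";
            }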

      The interesting part is not the URL rewriting but the automatic download of the remote URL to a local file.

      It wouldn't be hard to try to load both URLs from JavaScript, or whatever, but why add additional complexity when you can just rewrite the file?

        This remark reminds me of a caching proxy I had in use when internet connection time was metered in seconds. It might be possible to configure it to the same effect.

        For shielding your "work in progress" from local resources (CSS/JS/...) that are WIP themselves, however, the git filter can do what the proxy definitely can't.

        The interesting part is not the URL rewriting but the automatic download of the remote URL to a local file.

        Heh

        Downloading/mirroring is the boring part :) (lwp-rget/wget/curl/httrack...)
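
        e.g. it's a one-liner with LWP (URL and target filename picked just for illustration):

            perl -MLWP::Simple -e 'mirror(shift, "normalize.min.css")' https://cdnjs.cloudflare.com/ajax/libs/normalize/8.0.0/normalize.min.css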

        The URL rewriting is the eyebrow-raising part (sourcefilter)

        It wouldn't be hard to try to load both URLs from JavaScript, or whatever, but why add additional complexity when you can just rewrite the file?

        Um, ... Why is one additional complexity but not the other?