Quick Tip
Crawl a URL and edit it instantly
How do you crawl a URL and instantly edit its source code to check that things are as expected? Well, when you have a browser it's easy. But in a terminal on a remote server it's a bit more complicated.
On most Unix-like systems curl is already installed to fetch the URL; you just need to get the output displayed somehow. Of course you could write the output of curl into a file with curl "http://example.com/" -o temp.txt and then open that file in a pager or editor.
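Spelled out, that detour looks something like this (temp.txt is just a throwaway name picked for illustration):
curl "http://example.com/" -o temp.txt    # save the response to a file
less temp.txt                             # open the saved file in a pager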
But that's kind of boring and feels clunky. With the power of Unix pipes it works as a short one-liner, without an intermediate file:
curl -s "http://example.com/" | less
Boom, the source is in a pager for your viewing pleasure. This works with more as well. Simply piping into vim does not work, though. For that you can hand vim the curl output as a file, using process substitution:
vim -R <(curl -s "http://example.com/")
That works with less and more as well:
less -f <(curl -s "http://example.com/")
more -f <(curl -s "http://example.com/")
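As an aside, vim can also read a plain pipe if you pass - as the file name; that is standard Vim behaviour, though whether your particular build behaves identically is an assumption on my part:
curl -s "http://example.com/" | vim -R -    # "-" makes vim read the buffer from stdin, -R keeps it read-only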
And in case you are still in your GUI and quickly want that crawl result opened in VSCode, here you go:
curl -s "http://example.com" | code -