Best of Browsers

by debianjoe


Recently, in a discussion regarding Mozilla Firefox/Ice-Weasel plug-ins that remove photos from the web, a few text-based browsers were brought up as a very simple way to get a similar (or possibly better) effect.  One of the things that I totally insist on having on any Unix-based system is a text-based web browser.  I once found myself using curl to pull pages down one at a time and manually reading the HTML in nano to find a mirror list.  After that, I vowed never to do it again if it could be avoided.  If you are prone to experimenting with the X server, like the framebuffer, or write programs that could crash whatever display server you use, it is worthwhile to keep a text web browser available as a fallback.  Beyond simple need, there are a few upsides to them that you won’t find in more graphical browsers.  They help you focus on what is actually written, as opposed to the CSS, Flash, and JavaScript applets that overrun the world-wide-web these days.  They’re also generally more compact, and they adapt better to dumping content to stdout.

Text browsers are a large enough part of my general workflow that, on the little web-browser project I co-wrote, I spent time making certain there would be cross-integration of bookmarks between it and links2.  Obviously, I consider them not only worth having, but something I’d prefer to see everyone adopt.

As for comparing and contrasting, I tend to use links2 when I have mouse support, and I prefer emacs-w3m on systems without mouse integration.  In the end, this really comes down to the speed at which I can navigate with them.  All of my other opinions and preferences are more “chocolate vs. vanilla,” and I don’t expect everyone to share my tastes.  The ones I’d strongly recommend are links2, links, elinks, lynx, and w3m.  While I do swear by emacs-w3m, that one is probably only well suited for emacs users.

I’d mentioned that these also adapt well to scripting, so I suppose I owe at least one example.  Let’s assume I’d like to check the daily news from BBC News, but only if it pertains to Russia.  I could grab headlines with something like:

elinks -dump www.bbc.co.uk/news | grep -i 'Russia'

That may not be a sensible use, but I think that it’s pretty easy to see how this idea can be used to get your local weather or any such web content dumped into a script.  Let your mind run free, and I’m sure that lots of useful (or at least interesting) ideas will come.  Happy browsing.
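To show how the same idea drops into a script, here’s a minimal sketch.  The function names are my own invention, but the browsers and their -dump flags are the real thing — elinks, lynx, and w3m all take -dump, so a script can use whichever one happens to be installed:

```shell
#!/bin/sh
# Echo the name of the first installed text browser, or fail if none is found.
pick_dumper() {
    for b in elinks lynx w3m; do
        if command -v "$b" >/dev/null 2>&1; then
            echo "$b"
            return 0
        fi
    done
    echo "no text browser found" >&2
    return 1
}

# Dump a URL as plain text and keep only the lines matching a pattern.
# All three browsers accept -dump, so the invocation is identical for each.
dump_grep() {
    url=$1
    pattern=$2
    browser=$(pick_dumper) || return 1
    "$browser" -dump "$url" | grep -i "$pattern"
}

# Same result as the one-liner above:
# dump_grep www.bbc.co.uk/news Russia
```

From here, a cron job that mails you the matching lines, or a weather check against whatever local page you prefer, is only a line or two away.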
