[clug] Why do we do what we do in Linux? [was Re: Text editor]

Hugh Fisher Hugh.Fisher at anu.edu.au
Mon Oct 27 18:14:02 MDT 2014

On 28/10/14 09:28, Hal Ashburner wrote:
> it pretty disappointing. So I guess your mileage may vary. I'm
> completely sold on the command line as a powerful tool, but that essay
> didn't make the sale. GUIs are ideal for infrequently done, ad-hoc or
> one off tasks because their commands are more discoverable and you
> can get immediate feedback to see if it did what you thought it would
> and undo it if not.

This is an attitude I've often seen expressed by Linux/Unix people. It's 
usually true for programming and system administration, but not in the 
many other fields where computers are being used. And IMNSHO, it is very 
damaging if you want to promote open/free software.

To start with, why can't command line or text-only tools be 
discoverable? Nano for example is great for new Linux people because it 
is discoverable with a few lines of help at the bottom, while vi is 
notorious for its "Help! How do I get out?" problem.

> When you find yourself repeating the same steps 3
> or more times the command line wins, then being able to cut and paste
> that command line command into a text file with any name you give it
> and run it as a script - no gui can touch that. Recording VBA macros
> in MS office is about as good as you can get in a GUI equivalent and
> it's greatly inferior to intermediate command line usage for such
> tasks.

GUIs are also ideal for frequently done and very complex tasks such as 
editing magazines, sound, music, images, video, film. And the people 
doing these tasks wanted to automate the repetitive bits, which led to 
AppleScript, COM, and VBA being developed in the late 1980s and 1990s. 
If you think that recording VBA macros is all people do, you need to dig 
deeper.

These systems don't follow the "Unix way" but despite this, or maybe 
even because of this, they have very powerful capabilities:

* Two way communication. Instead of pipes, you get events and/or a sort 
of remote procedure call. Two programs can talk back and forth, so one 
program can send an image to another for filtering or something, then 
get it back.
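As a rough sketch of that back-and-forth style, here are two "programs" 
talking over the standard library's XML-RPC; the invert filter, the pixel 
data, and the port number are all made up for illustration:

```python
# Sketch of two-way IPC: one program sends data to another for
# processing and gets the result back, instead of a one-way pipe.
# The "invert" filter, sample pixels, and port 8765 are illustrative.
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

def invert(pixels):
    # Pretend image filter: invert 8-bit pixel values.
    return [255 - p for p in pixels]

server = SimpleXMLRPCServer(("localhost", 8765), logRequests=False)
server.register_function(invert)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "client" program sends its image out and gets the filtered copy back.
remote = ServerProxy("http://localhost:8765")
print(remote.invert([0, 128, 255]))   # -> [255, 127, 0]
```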

* Complex data types. Because they're based on clipboard mechanisms 
rather than byte streams, the sender can ask "I have an image, I can 
send it in PNG, JPEG, or TIFF, which would you like?" You don't have to 
muck around with files in /tmp.
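The negotiation itself is simple; a toy version in Python might look like 
this (all the class and format names are invented for the example, not 
any real clipboard API):

```python
# Sketch of clipboard-style type negotiation: the sender advertises
# the formats it can produce, the receiver picks the one it prefers.
# All names here are made up for illustration.
class ImageSender:
    def offered_formats(self):
        return ["image/png", "image/jpeg", "image/tiff"]

    def render(self, fmt):
        return f"<image data as {fmt}>"

class Receiver:
    preferred = ["image/tiff", "image/png"]   # in order of preference

    def accept(self, sender):
        offers = sender.offered_formats()
        for fmt in self.preferred:
            if fmt in offers:
                return sender.render(fmt)
        raise ValueError("no common format")

print(Receiver().accept(ImageSender()))   # -> <image data as image/tiff>
```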

The new Microsoft PowerShell has typed data at the command line level, 
which AFAIK no Unix shell can do yet.
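The difference is that PowerShell stages pass structured objects down the 
pipe, so nothing downstream has to re-parse text. A rough Python analogue, 
with an invented process list standing in for `Get-Process`:

```python
# Rough analogue of a typed pipeline: stages pass structured records
# (here, dicts) rather than lines of text, so no downstream stage has
# to re-parse ad-hoc output. The process list is invented sample data.
procs = [
    {"name": "editor", "cpu": 12.5},
    {"name": "daemon", "cpu": 0.3},
    {"name": "render", "cpu": 87.0},
]

# Similar in spirit to: Get-Process | Where-Object { $_.CPU -gt 10 } | Sort-Object CPU
busy = sorted((p for p in procs if p["cpu"] > 10), key=lambda p: p["cpu"])
print([p["name"] for p in busy])   # -> ['editor', 'render']
```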

* Dynamic linking. It's normal in the Mac/Windows worlds for apps to 
have a dynamic linking API defined (per app, like Photoshop plugins, or 
standardized, eg COM). One program can directly call routines in 
another. People write components to work with existing applications.
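Even from a script you can reach into another binary this way. A minimal 
sketch with ctypes, assuming a typical glibc-based Linux system where the 
C math library is findable as "m":

```python
# Sketch of calling a routine that lives in another binary via dynamic
# linking. Assumes a glibc-based Linux system where find_library("m")
# resolves to the C math library (e.g. libm.so.6).
import ctypes
import ctypes.util

libm = ctypes.CDLL(ctypes.util.find_library("m"))
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]

print(libm.sqrt(2.0))   # -> 1.4142135623730951
```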

This obviously has downsides - eg the ActiveX disaster in web pages - 
but it wasn't necessarily a dumb idea at the time. Xerox PARC were 
experimenting with sending LISP code between programs in the 1970s.

These various forms of non-Unixy IPC have been around for a couple of 
decades now. And they are a major reason why people stay with Mac/MS 
Windows rather than switching. It's not enough to write a new word 
processor that can open MS Word files; it would have to fit seamlessly 
into all these complex AppleScript/COM/VBA pipelines.

So imagine a financial trading firm with a system of Excel spreadsheets, 
databases, and report generators; handling and generating megabytes of 
financial data each day. (To anyone with a formal computer science 
education it will look like a horrible spaghetti coding mess; but it 
works.) In walks some Linux enthusiast who says "Hey! Look, I can pipe 
ASCII text from one program to another! Isn't that great?"

	Hugh Fisher
