
Re: CLI People a Dying Breed? Adoption Pains Self-inflicted

  • Subject: Re: CLI People a Dying Breed? Adoption Pains Self-inflicted
  • From: Rex Ballard <rex.ballard@xxxxxxxxx>
  • Date: Tue, 29 Jan 2008 15:59:40 -0800 (PST)
  • Bytes: 15397
  • Complaints-to: groups-abuse@xxxxxxxxxx
  • Injection-info: h11g2000prf.googlegroups.com; posting-host=67.80.109.118; posting-account=-EkKmgkAAAAxynpkobsxB1sKy9YeqcqI
  • Newsgroups: comp.os.linux.advocacy
  • Organization: http://groups.google.com
  • References: <2669587.rTVeSmf00P@xxxxxxxxxxxxxxx>
  • User-agent: G2/1.0
  • Xref: ellandroad.demon.co.uk comp.os.linux.advocacy:600171
On Jan 28, 12:23 pm, Roy Schestowitz <newsgro...@xxxxxxxxxxxxxxx>
wrote:

 SWM, Shell User, Seeks Soul Mate for GUI-Free LTR

> ,----[ Quote ]
> | "There is a sad truth to the world today," wrote the anonymous poster of the
> | ad. "I am part of a dying breed of people known as 'shell users.' We are an
> | old-fashioned bunch, preferring the warm glow of a green screen full of text
> | over the cold blockiness of a graphical interface.... The whole 'Microsoft
> | Windows' fad will fade away sooner or later, but in the interim, our kind is
> | facing extinction.
> `----
> http://www.ecommercetimes.com/story/must-read/61394.html

The shell is far from dead, but it is no longer the primary tool for
accessing a UNIX or Linux system.  In the early 1980s, Unix users
often used an ASCII terminal or a PC with a Terminal Emulator to
access a UNIX system, usually through an RS-232 connection.  Typically
the resolution was 80 columns by 25 lines.  In the 1970s, many UNIX
users used a teletype terminal sometimes via a 300 baud modem, which
is why so many of the most popular unix and linux commands have short
names like ls to list the contents of a directory, or cd to change
directories.  Later shells like the BSD C Shell supported aliases so
that users could use more generic terms like "list" and "chdir" to be
more like MS-DOS.
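The alias mechanism can be sketched in modern bash syntax (the alias
names are just the illustrative ones mentioned above):

```shell
# Aliases map friendlier names onto the terse classic commands.
# (bash syntax; the BSD C Shell would write:  alias list ls)
shopt -s expand_aliases    # scripts need this; interactive shells expand by default
alias list='ls'
alias chdir='cd'
list /tmp                  # expands to: ls /tmp
chdir /tmp                 # expands to: cd /tmp
```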

By the late 1980s, many UNIX workstations were stand-alone, and others
supported high resolution graphics terminals such as the DEC VT-300
and Tektronix 4010 series terminals.  X11 was developed to provide a
way to write standard applications that could be interfaced to any of
these terminals, or to graphics drivers running within the existing
UNIX system.  This made GUI based applications much easier to develop,
and much more popular.  By the early 1990s, you could do most of the same
kinds of things with UNIX that you could do with Windows.  You could start
applications and open files using GUI interface commands rather than
shell commands.

But even today, the shell remains a powerful tool for those who are
willing to learn it.  The shell itself is very simple and only takes
about 20 hours to learn.  The big advantage, and the challenge, of the
shell is that it is possible to run almost any command in UNIX,
including launching graphical applications.  There are thousands of
applications or "filters" that have been specially designed to work
efficiently with the shell, taking input through standard input, and
sending output to standard output, allowing a shell programmer to
create complex "pipelines" that can perform very complex tasks with
just a few lines of source code.
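The kind of pipeline described above can be sketched in a few lines;
each stage is a standard filter reading standard input and writing
standard output (the sample text is made up):

```shell
# Count word frequencies in a stream of text: a classic filter pipeline.
printf 'apple banana apple cherry apple banana\n' |
  tr ' ' '\n' |   # split into one word per line
  sort |          # group identical words together
  uniq -c |       # count each group
  sort -rn |      # most frequent first
  head -3         # keep the top three
```

Each filter knows nothing about the others; the shell supplies the
plumbing.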

The main advantage of shell programming is its flexibility.  Because
the shell programs are interpreted in real time, it's very easy to
test a pipeline and then save it into a file so that you can run it as
a script.  Once you have a script, you can start it automatically
using cron jobs, or you can invoke it from a GUI by clicking a file,
or using a python script that launches the shell script.
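A sketch of that workflow (the file name and the cron schedule are
only examples): a tested pipeline is saved to a file, marked
executable, and can then be run by hand, from cron, or from a GUI
launcher:

```shell
# Save a tested pipeline as a reusable script (path is illustrative).
cat > /tmp/top-words.sh <<'EOF'
#!/bin/sh
# Report the ten most frequent words on standard input.
tr -s ' ' '\n' | sort | uniq -c | sort -rn | head -10
EOF
chmod +x /tmp/top-words.sh

echo "to be or not to be" | /tmp/top-words.sh

# A crontab entry could then run it unattended, for example:
#   0 2 * * *  /tmp/top-words.sh < /var/log/messages > /tmp/report.txt
```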

Many GUI applications for UNIX also integrate with shells and shell
scripts.  Often these scripts are used for tasks such as accessing a
database, transforming the results with filters into input for a
charting tool or spreadsheet, and then displaying that chart or
spreadsheet.

Learning shell commands and the scripting languages such as sed, awk,
and perl, can make it very easy to quickly create solutions to custom
problems.  Often a script that would take days to write in a compiled
language such as C, C++, or Java can be done in a few hours of shell
programming.
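A small illustration of that speed difference, using awk on made-up
data: summing a column is one line of script, versus a full program
with parsing and I/O boilerplate in C or Java:

```shell
# Sum the second column of whitespace-delimited records.
printf 'alice 10\nbob 25\ncarol 7\n' |
  awk '{ total += $2 } END { print total }'    # prints 42
```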

Modern Linux and Unix workstations, desktops, and laptops don't
require the use of the shell.  Mac OS X is a good example of a
system that never requires a user to use the shell for typical
applications and uses, but has a shell interface available for those
who know the fine art.

In UNIX terminology, those who understood shell programming and the
various scripting languages and how to write filters were often called
"Wizards" because they could produce rapid results so quickly that it
almost seemed like magic or sorcery.  The secret was source-ery :-D

> Speeding Up Free Software Adoption: External and Internal Routes to Success
> ,----[ Quote ]
> | Striking a balance between mindsets might be a factor on which the success of
> | Free software is hinged. Put simply, a struggle against
> | so-called 'pragmatism' may have been one of the greatest barriers to wider
> | adoption of Free software.
> `----
> http://itmanagement.earthweb.com/osrc/article.php/3724136

Much of the success of Linux is that Linus and Linux distributors have
taken a very pragmatic approach to Open Source vs Proprietary
software.  The Linux kernel itself is released under the GPL, and
core libraries such as glibc under the LGPL, open source licenses
that are intended to prevent undocumented forking of the core
framework.

At the same time, the use of Linux "Modules" to allow Linux to call
proprietary binary-only drivers has made it possible for vendors to
give Linux the full capabilities of the chipset or the peripheral
without having to expose their best tricks and secrets to their
competitors.

The LGPL allows a shared library to act as a "bridge" between
proprietary software, such as commercial packages like Adobe Flash or
Oracle, and the GPL Linux kernel, so that the commercial software
vendors don't have to publish their code in source code form.

The result has been that many commercial software vendors have begun
to offer Open Source frameworks of their own.  IBM's Eclipse, for
example provides a framework that can be used as a "graphical shell"
in much the way one would use the command-line shells such as bash or
ksh, yet plug-ins can be proprietary and commercialized.  WebSphere
Studio, Rational Application Developer and various other commercial
applications are actually plug-ins to Eclipse, which can either use
an existing compatible version of Eclipse, or a copy that is included
with the software package if Eclipse has not already been installed.

The result is that there are thousands of applications written to the
Linux and Eclipse frameworks, including both proprietary and OSS
solutions, some of which are very competitive.

The result has been a whole new genre of applications which are built
on multiplatform OSS frameworks, and can be used with either Linux or
Windows or Macs (or other versions of UNIX for that matter).

In most cases, all that's required is a version of platform
independent java and a glibc compatible interface library such as
cygwin for Windows.  Solaris, UnixWare, FreeBSD, Linux, most other
versions of BSD, and most SysV based versions of UNIX already have
full support for glibc or a compatible interface library.

> Related:
> When will we hear the end of computer quacks?

> ,----[ Quote ]
> | So why beat the dead skunk again? Check it out: Don Norman discovered
> | command line interfaces! And he's about to take his discovery to the
> | press! Yes, he thinks this is an original discovery all his own.
> | [...]
> | I can't wait until Microsoft invents apt-get so he can fawn over it next...
> `----
> http://penguinpetes.com/b2evo/index.php?title=when_will_we_hear_the_e...

It really is amusing that Don Norman has just "discovered" that
command lines and efficient search tools are really valuable.

Perhaps that's why UNIX had so many powerful and versatile search
tools such as grep, sed, awk, and perl, as well as parser generators
such as bison and yacc to help you create your own custom parsers for
your own custom search engines.

Many have sought the "holy grail" of "plain English" languages, dating
all the way back to BASIC and COBOL.  Yet a true "plain English"
command line interpreter won't work.  The biggest problem is that most
spoken languages have ambiguous syntax and semantics.

If you ask a computer to interpret an English phrase such as "the
spirit is willing but the flesh is weak" into tokens that can be
translated into another language, such as Russian, and then try to
translate that back to English you get "The Wine is strong but the
Meat is spoiled".  It can get much worse.

Lawyers make an entire profession out of creating language which seems
to mean one thing, but can later be interpreted very differently by
the author of the contract.  You think you are agreeing to a minor
clause that should be easy to comply with, and discover, only after
it's too late, that you misinterpreted the contract, and are now in
violation, and that your obligation is 100 times what you originally
expected.  A common example is when a doctor asks you to sign an
agreement that you will pay any expenses not covered by the insurance
company, with verbal assurances that this only means the co-pay, only
to find out that the doctor is not in your network, the insurance
company won't pay anything, and you are now liable for thousands of
dollars in medical bills.

From this type of language, we expect to try to tell a computer
exactly what we want it to do, and wonder why we don't get the result
we want.  Most text search engines ignore insignificant "stop words"
such as: the, at, if, he, she, it, a, of, and so on.  The result is
that the user types in what looks like English, but it turns into a
pretty generic set of search keywords.

More advanced search engines, such as those used for the Dow Jones
News Retrieval Service, have real operators such as proximity ("words
near each other"), all-words matching, and wild cards.

UNIX has a very powerful query language which is extremely efficient,
and often remarkably accurate.  This syntax, known as "regular
expressions," is implemented slightly differently on Linux and BSD
than in AT&T based UNIX, but essentially supports wild cards,
multiple patterns on the same line, and combinations of different
operators.  Tools that use these regular expressions include grep,
sed, awk, perl, and vi, among many others.
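A brief sketch of that shared syntax (GNU grep shown here; as noted
above, the dialects differ slightly between implementations):

```shell
# Anchors: ^ matches the start of a line, $ the end.
printf 'error: disk full\nok\nerror: no route\n' |
  grep '^error:'               # only the lines beginning with "error:"

# Extended syntax (-E): ? makes the preceding character optional.
printf 'color colour\n' |
  grep -Eo 'colou?r'           # matches both spellings
```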

For many years, UNIX did not use relational databases, because it was
more efficient to use these search tools on standard sequential text
files.  The text files often had delimiters or a syntax that could
also be used to generate index files that could be used to more
quickly locate the desired target, which could then be viewed and/or
edited using a text editor such as vi.  In fact, the ability of vi to
invoke these filters and search tools made it very easy to scan files
and get a result set which could be viewed using the familiar editor
interface.  Emacs was another popular interface, and provided
additional capabilities such as the ability to open multiple windows,
and to extract from one window/buffer and put the output into another
buffer or working file, which could be viewed in another window.
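The flat-file-as-database pattern can be sketched with a
colon-delimited file (the file and its fields are invented for the
example):

```shell
# A delimited text file standing in for a database table.
cat > /tmp/users.txt <<'EOF'
rex:ballard:admin
roy:schestowitz:editor
ann:example:guest
EOF

# "Find the name of every admin" becomes an awk one-liner:
awk -F: '$3 == "admin" { print $1 }' /tmp/users.txt    # prints rex

# grep -n doubles as a crude index: the line number locates the record.
grep -n ':editor$' /tmp/users.txt
```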

Even today, many search engines and many web sites are based on
scripting tools such as perl, which are called as apache modules.
Even the web interface itself is a script command generated by the
form being used.

The big advantage of command line interfaces such as shells, which
allow multiple commands to be combined into a simple pipeline (or a
not so simple pipeline), is that one can get better control over all
of the options and option combinations available for each of the
different components.  Typically, a GUI might be able to handle 10-20
control flags as radio buttons or checkboxes, but if you have several
commands in a pipeline, there could be 100-200 flags and options
available which might give you better control over the functions you
are trying to achieve.  A simple script can set most of the commands
for the desired effect, and expose only those flags or options which
need to be set by the user.  If the user wants to customize the
script, exposing additional flags, or changing the settings of those
flags, they can do so without having to change the GUI interface
significantly.
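A minimal sketch of such a wrapper (the script name and the preset
options are illustrative): most flags are fixed inside the script, and
only one is exposed to the user:

```shell
cat > /tmp/report.sh <<'EOF'
#!/bin/sh
# Usage: report.sh [-r]    (-r reverses the final ordering)
# The pipeline's other flags are preset here; the user only ever sees -r.
FINAL_SORT="sort -rn"
[ "$1" = "-r" ] && FINAL_SORT="sort -n"
sort | uniq -c | $FINAL_SORT
EOF
chmod +x /tmp/report.sh

printf 'b\na\nb\n' | /tmp/report.sh        # most frequent line first
printf 'b\na\nb\n' | /tmp/report.sh -r     # least frequent line first
```

Customizing the script means editing one line of shell, not rebuilding
a GUI.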

Ideally, in Linux, we combine the best of both worlds.  We can combine
a GUI interface either via the Web Interface or via Python, Tcl, Ruby,
or Java, with scripts and filters, to generate customized solutions
with user friendly interfaces.  This makes development of even
relatively complex applications, involving complex and frequently
shifting data, relatively easy and quick compared to monolithic GUI
interfaces where the entire application must be compiled from source
code and libraries.

> Death of the command line

> ,----[ Quote ]
> | It's hard for me to imagine using an OS without a strong command line.
> | Even Microsoft has recognized the for that with their Monad Shell
> | (though they are at least temporarily removing that from Vista). Linux
> | of course has its Bash shell, Mac OS X has Terminal (which now defaults
> | to Bash) - everybody knows you need a shell.
> `----
> http://aplawrence.com/Unixart/command_line_death.html

Microsoft has never had a strong command line engine.  They have tried
to give lip service to command line interfaces with .com and .bat
files, as well as WSH and VBScript.  The problem is that so many of
the critical functions and applications are so tightly tied to the
GUI, that all of these scripting tools have been extremely limited.
Even Services for Unix is extremely limited in its ability to combine
existing Microsoft capabilities to perform common maintenance and
support functions.  Cygwin provides the ability to create many
scripted applications that run on Windows, but these scripts can't be
used to automate such things as recycling of services, backups, and
registry management.

Linux and Unix on the other hand, are designed so that all critical
management functions can be scripted, which means that it's much
easier to automate most of the management.  This is one of the reasons
why Linux and Unix servers are so popular in IT centers and have such
low TCO.
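As a sketch of that scriptability (the paths and schedule are
examples, not recommendations): a tiny backup job needs no GUI and can
run unattended from cron:

```shell
# A tiny unattended backup job, runnable by hand or from cron.
BACKUP_DIR=/tmp/demo-backups
mkdir -p "$BACKUP_DIR"
echo "sample data" > /tmp/demo-src.txt
STAMP=$(date +%Y%m%d)
tar -czf "$BACKUP_DIR/demo-$STAMP.tar.gz" -C /tmp demo-src.txt
ls "$BACKUP_DIR"

# The corresponding crontab line (runs nightly at 01:30):
#   30 1 * * *  tar -czf /backups/etc-backup.tar.gz -C / etc
```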

Rex Ballard
http://www.open4success.org

