conceptual overview
The system's command interpreter begins operation at a very high level of
abstraction. At this level, in unix, everything has a full pathname in the
filesystem namespace, including the system's programs, subroutine
libraries, disk drives, disk drive partitions, sockets, RAM, your tty and
so on. The shell is for executing commands on filenames. It has powerful
text macro variable processing and filename "globbing" (wildcards). Shell
command I/O redirection is also strikingly expressive at the file level,
particularly given a unix environment. Processing of the *contents* of
files, however, is inconvenient and inefficient in the shell itself. C,
Awk, sed, dcon, and myriad other utilities and programming languages are
much better for bits, bytes, strings, numbers and other things files may
contain. The user access controls that are associated with the phrase "a
unix shell" in a networking context are implemented by the system's login
and getty programs, which invoke the user's actual shell program upon
successful login.
A shell can invoke an instance of itself just as it can invoke any other program. The invoked program is a "child" of the invoking program, and inherits a "local" copy of the parent's environment. That is, changes the child makes to its environment do not affect the parent. The result is that parent-child programs form a tree of inherited and modified environments, and when an ash running as a child instance changes, e.g., its $PATH, the parent's copy is not affected.
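A quick experiment makes the inheritance rule concrete. The parenthesized list below runs in a subshell, i.e. a child instance:

```shell
# The child gets a copy of the parent's environment;
# its changes are discarded when it exits.
GREETING=hello
export GREETING
( GREETING=goodbye; echo "child sees:  $GREETING" )
echo "parent sees: $GREETING"
```

The child prints goodbye, but the parent still sees hello afterwards.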
There's another unix watch-phrase that pertains to the overall emphasis on modularity evident in what the shell does and does not provide; "Implement mechanism, not policy.". The shell is designed and documented to provide certain functionality, but to let the user decide what that functionality is for. Anything that doesn't have to be "hard-wired in" isn't, and the docs concentrate on actual behaviors, rather than particular uses. The shell puts the burden of expressing what you want to do on you, rather than making costly assumptions about your intentions.
The shell is confusing. Anything that flexible is. It is also interactive, however, so it's easy to try things. Consider this document to be a guide for further experimentation.
The shell sets the initial value of the "positional parameters" (available as the contents of the internal variables 1, 2, ... etc.) from the arguments remaining after any argument used as the name of a file of commands has been removed.
The flags (other than -c) are set by preceding them with - and cleared by preceding them with +; see the set built-in command for a list of flags. If the -i flag is not explicitly specified, the -s flag is set, and the standard input and output of the shell are connected to terminals, then the -i flag will be set. If no value is specified for the -j flag, then the -j flag will be set if the -i flag is set.
When the shell is invoked with the -c option, it is good practice to include the -i flag if the command was entered interactively by a user.
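For example (assuming ash is installed as sh, or that sh behaves like it), a one-off command string is run with -c:

```shell
# -c runs the single string that follows as a command in a fresh shell:
sh -c 'echo "one-off command, PID $$"'
# If the string came from an interactive user, invoke it as: sh -ic '...'
# (left as a comment here; an interactive shell may try to use the terminal)
```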
initialization files
If ash is invoked as a name that begins with -, for example if ash
is invoked via a symlink to ash named "-ash", this instance of ash is
assumed to be a login shell, and the files /etc/profile and
.profile are interpreted (run) if they exist.
If the parent's environment variable SHINIT is set on entry to ash, the commands in $SHINIT are run. $SHINIT is not examined if this instance is a login shell, or if ash is running a shell procedure. (ash is considered to be running a shell procedure if neither the -s nor the -c options are set.)
When a shell function is executed, all of the shell positional parameters (except 0, the program name, which remains unchanged) are set to the parameters (arguments) to the shell function. The variables which are explicitly placed in the environment of the command (by placing assignments to them before the function name) are made local to the function and are set to the values given. Then the command (list) given in the function definition is executed. The positional parameters are restored to their original values when the command completes. That is, the parent of the command gets its original command-line arguments back.
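A minimal sketch of the save-and-restore behavior:

```shell
set -- alpha beta              # this shell's positional parameters
show() { echo "inside:  $# args: $*"; }
show one two three             # inside show, $1..$3 are one two three
echo "outside: $# args: $*"    # alpha beta come back after the call
```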
Shell built-ins are executed internally to the shell, without spawning a new process or environment. Built-ins may be I/O redirected as usual.
When a normal program is executed, the shell runs the program, passing the parameters and the environment to the program. If the program is a shell procedure, the shell will interpret the program in a subshell. The shell will reinitialize itself in this case, so that the effect will be as if a new shell had been invoked to handle the shell procedure, except that the location of commands located in the parent shell will be remembered by the child. If the program is a file beginning with #!, the remainder of the first line specifies an interpreter for the program. The shell (or the operating system) will exec the interpreter in this case. The arguments to the interpreter will consist of any arguments given on the first line of the program, followed by the name of the program, followed by the arguments passed to the program.
I/O redirection < > << >> <& >& <&- >&-
Programs keep track of open files internally with small integers called
file descriptors. The default file descriptors of standard input,
standard output and standard error of a unix command are always 0, 1 and
2. By default all three are connected to the terminal the command is
invoked from, but the shell can redirect them, if the command actually
uses them. Many commands don't use all three. Input/output redirections
can be intermixed with the words in a simple command and can be placed
following any of the other commands or command lists. The file reconnected
must follow the redirection operator, but otherwise any order is possible,
i.e. the redirection may precede the command or the variable assignments.
When redirection occurs, the shell saves the old values of the file
descriptors and restores them when the command completes.
< opens the file as input to the command. > opens the file for overwriting ("clobbering"). >> opens the file for appending. <&digit and >&digit make the input or output a duplicate of the file descriptor numbered by the digit. If a minus sign is used in place of a digit, the standard input or standard output is closed.
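A short round trip through these operators (a sketch; the scratch path under /tmp is an assumption):

```shell
demo=/tmp/redir_demo.$$                # scratch file; /tmp assumed writable
echo first  >  "$demo"                 # > creates or clobbers
echo second >> "$demo"                 # >> appends
wc -l < "$demo"                        # < supplies the file on standard input
ls /no/such/dir 2> /dev/null || true   # 2> redirects descriptor 2 (stderr)
rm -f "$demo"
```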
The << token redirection takes input from what is called
a "here document". As the shell encounters << redirections,
it collects them. The next time it encounters an unescaped newline, it
reads the documents in turn. The token string following the <<
specifies the contents of the line that terminates the document. If none
of the quoting methods
('', "", or \)
are used to enter the
token, then the document is treated like a word inside double quotes:
$ and backquote are expanded and backslash can be used to escape
these and to continue long lines. The token cannot contain any variable or
command substitutions, and its length (after quoting) must be in the range
of 1 to 79 characters. If <<- is used in place of
<<, then leading tabs are deleted from the lines of the
document. (This is to allow you to indent shell procedures containing
here documents in typical coding style.) As an interactive example,
command << TOKEN
will send all lines typed after the above to standard input of "command"
until a line consisting of nothing but "TOKEN" is typed. Here documents
are a bit confusing, but extremely convenient once understood. See the
cLIeNUX DSFHed script for an example of how here-documents allow any
normally interactive program like ed to be controlled in a script.
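As a non-interactive sketch of both quoting behaviors (the token name END is arbitrary):

```shell
# Quoted token: the document is passed through literally.
cat << 'END'
$HOME is not expanded here
END

# Unquoted token: $name is expanded inside the document.
name=world
cat << END
hello, $name
END
```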
Any of the preceding redirection operators may be preceded by a single digit specifying the file descriptor to be redirected. There cannot be any white space between the digit and the redirection operator.
command sequence (program) flow control
A "list" is a sequence of zero or more commands separated by newlines, semicolons, or ampersands (&), and optionally terminated by one of these three characters. The commands in a list are executed in the order they are written. If a command is followed by an ampersand, the shell starts the command and immediately proceeds onto the next command; otherwise it waits for the command to terminate before proceeding to the next one. Ending a command line with & is a special case of the above, causing the command to start in the background, in effect giving you your prompt back immediately in an interactive shell.
&& and || are conditional operators. && executes the first command, and then executes the second command iff (if and only if) the exit status of the first command is zero. || is similar, but executes the second command iff the exit status of the first command is nonzero. && and || both have the same priority.
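For instance:

```shell
true  && echo "ran: the first command succeeded"
false || echo "ran: the first command failed"
# Chained, the pair acts as a compact conditional:
false && echo "skipped" || echo "fallback taken"
```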
The | operator feeds the standard output of the first command
into the standard input of the second command. This is commonly called a
"pipe". The exit status of the | operator is the exit status of the
second command. | has a higher priority than || or
&&. Pipes feed into commands, whereas other redirection
operators feed into files. Pipes may be chained. One popular chain is
cat file(s) | sort | uniq | less
Each section of a pipe is a full simple command.
An if command looks like
if list then list
[ elif list then list ] ...
[ else list ] fi
A while command looks like
while list do list done
The two lists are executed repeatedly while the exit status of the first
list is zero. The until command is similar, but has the word until in
place of while, and repeats until the exit status of the first list is
zero.
The for command looks like
for variable in word... do list done
The words are expanded, and then the list is executed repeatedly with the variable set to each word in turn. The list of words for loops over is often filenames generated by a wildcard pattern. Numerically counted loops are better generated with while or until constructs, or with for and a separate "count" command if one is available. For-loops can be made to traverse subdirectories by using ` find .` or similar command expansions to generate the wordlist. In ash do and done may be replaced with { and }.
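Both styles in one sketch; the external expr utility (assumed available) does the counting, since the shell itself does no arithmetic:

```shell
# Loop over an explicit word list:
for fruit in apple banana cherry
do      echo "fruit: $fruit"
done

# A numerically counted loop via while:
i=0
while [ "$i" -lt 3 ]
do      echo "i = $i"
        i=$(expr "$i" + 1)
done
```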
The break and continue commands look like
break [ num ]
continue [ num ]
Break terminates the num innermost for, until or while loops. Continue continues with the next iteration of the num'th innermost loop.
The case command looks like
case word in patterna) list ;; patternb) list ;; patternc) list ;; ... esac
The pattern can actually be one or more patterns (see Patterns below), separated by | characters. A "*" pattern can serve as a "default" case if it occurs as the last pattern.
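A small sketch, with a multi-pattern branch and a default:

```shell
answer=yes
case "$answer" in
y|yes)  echo "affirmative" ;;
n|no)   echo "negative" ;;
*)      echo "unrecognized: $answer" ;;   # "*" as the last pattern = default
esac
```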
Commands may be grouped by writing either
(list)
or
{ list; }
The first of these executes the commands in a subshell, which will not pass changes to its environment variables back to the parent shell's environment. The output of a () or {} group of commands may be redirected as a single entity.
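Both groupings in one sketch (the scratch path under /tmp is an assumption):

```shell
# (list) runs in a subshell; the cd is invisible to the invoking shell:
( cd /tmp && pwd )
pwd                              # still the original directory

# { list; } runs in the current shell; the group redirects as one unit:
out=/tmp/group_demo.$$
{ echo one; echo two; } > "$out"
wc -l < "$out"                   # both lines landed in one file
rm -f "$out"
```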
subroutines
A function (subroutine) definition looks like
function_name () command
A function definition is an executable statement; when executed it installs a function named function_name and returns an exit status of zero. The command is normally a list enclosed between { and }. The "alias" feature of some shells is basically redundant to functions.
Variables may be declared to be local to a function by using a local command. This should appear as the first statement of a function, and looks like
local [ variable | - ] ...
When a variable is made local, it inherits the initial value and exported and readonly flags from the variable with the same name in the surrounding scope, if there is one. Otherwise, the variable is initially unset. Ash uses dynamic scoping, so that if you make the variable x local to function f, which then calls function g, references to the variable x made inside g will refer to the variable x declared inside f, not to the global variable named x.
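A sketch of the f-calls-g case described above:

```shell
x=global
g() { echo "g sees x=$x"; }
f() {
        local x
        x=from_f
        g               # dynamic scoping: g sees f's local x
}
f
echo "after f, x=$x"    # the global x is untouched
```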
The only special parameter that can be made local is -. Making - local causes any shell options that are changed via the set command inside the function to be restored to their original values when the function returns.
The return command looks like
return [ exitstatus ]
It terminates the currently executing function. exitstatus is an optional integer. Unix tradition is that 0 means that no error or other curiosity occurred. The significance of non-zero positive or negative numbers varies. Success is 0. This makes sense with the usual implication of 0 meaning "false" if you consider a shell return code to mean "error equals...", i.e. a return of 0 means "error equals false." The most recent return code passed to the shell is available in the $? internal variable. Example... test 4 -eq 6 ; echo $?
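A sketch of the convention; expr (assumed available) does the arithmetic:

```shell
is_even() {
        if [ "$(expr "$1" % 2)" -eq 0 ]
        then    return 0        # "error equals false": the number is even
        else    return 1
        fi
}
if is_even 4; then echo "4 is even"; fi
if is_even 7; then echo "7 is even"; else echo "7 is odd"; fi
```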
command lookup
When locating a command, ash first looks to see if it has a shell
function by that name. Then, if PATH does not contain an entry for
%builtin, it looks for a built-in command by that name.
Finally, it searches each entry in PATH in turn for the command.
The value of the PATH variable should be a series of entries separated by colons. Each entry consists of a directory name, or a directory name followed by a flag beginning with a percent sign. The current directory should be indicated by an empty directory name. If no percent sign is present, then the entry causes ash to search for the command in the specified directory. If the flag is %builtin then the list of shell built-in commands is searched. If the flag is %func then the directory is searched for a file which is read as input to ash. This file should define a function whose name is the name of the command being searched for.
Command names containing a slash are simply executed without performing any of the above searches.
command environment
The environment of a command is a set of name/value pairs. When the
shell is invoked, it reads these names and values, sets the shell
variables with these names to the corresponding values, and marks
the variables as exported. The export
command can be used to mark additional variables as exported.
The environment of a command is constructed by constructing name/value pairs from all the exported shell variables, and then modifying this set by the assignments which precede the command, if any.
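For example, an assignment written before the command name lands only in that command's environment (GREETING is an arbitrary example name):

```shell
unset GREETING                 # make sure it starts out unset here
GREETING=hi sh -c 'echo "child sees GREETING=$GREETING"'
echo "parent sees GREETING=${GREETING-<unset>}"
```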
variable substitution
Shell variables are text "macros", which are text strings that represent
other text strings. The use of all-caps for variable names is merely a
convention; capital letters are not innately different than other
characters. For a variable BLAH, $BLAH represents the text value
of BLAH. Try a few trivial experiments with VAR="whatever", echo and set
to clarify this, e.g. echo PATH and echo $PATH. The meaning of the string
resulting from the expansion of $BLAH is dependent on further
interpretation. Numbers and lists like $PATH are just text strings
as far as expansion is concerned.
Ash accepts two syntaxes for command substitution: `list` and $(list).
Either of these may be included in a word. During the command
substitution process, the command (syntactically a list) will be
executed and anything that the command writes to the standard output will
be captured by the shell. The final newline (if any) of the output will
be deleted; the rest of the output will be substituted for the command in
the word. In other words, the output of the enclosed command becomes
arguments to the enclosing command. One very useful example is using
"find" to generate the list of words for a for-loop.
for F in `find . `
will pass all the filenames in a directory and its subdirectories to the
variable F in turn, allowing the for loop to traverse an entire
directory `recursively'.
word splitting
When the value of a variable or the output of a command is substituted,
the resulting text is subject to word splitting, unless the dollar sign
introducing the variable or backquotes containing the text were enclosed
in double quotes. In addition, $@ is subject to a special type of
splitting, even in the presence of double quotes.
If a word is the expression following the word "case" in a case construct, the file name which follows a redirection symbol, or an assignment to the environment of a command, then the word cannot be split into multiple words. In these cases, only variable substitution and command substitution are performed.
ash uses two different splitting algorithms. The normal approach, which is intended for splitting text separated by whitespace (spaces, tabs and new-lines), is used if the first character of the shell variable IFS (Input Field Separator) is a space. Otherwise an alternative experimental algorithm, which is useful for splitting (possibly empty) fields separated by a separator character, is used.
When performing splitting, the shell scans the replacement text looking for a character (when IFS does not begin with a space) or a sequence of characters (when IFS does begin with a space), deletes the character or sequence of characters, and splits the word into two strings at that point. When IFS begins with a space, the shell deletes either of the strings if they are null. As a special case, if the word containing the replacement text is the null string, the word is deleted.
The variable $@ is special in two ways. First, splitting takes place between the positional parameters, even if the text is enclosed in double quotes. Second, if the word containing the replacement text is the null string and there are no positional parameters, then the word is deleted. The result of these rules is that "$@" is equivalent to "$1" "$2" ... "$n", where n is the number of positional parameters. (Note that this differs from the System V shell. The System V documentation claims that "$@" behaves this way; in fact on the System V shell "$@" is equivalent to "" when there are no positional parameters.)
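A sketch that makes the three quoting outcomes visible:

```shell
set -- "one word" two        # two positional parameters, one with a space
count() { echo "$#"; }
count "$@"                   # prints 2: "$@" keeps each parameter whole
count $@                     # prints 3: unquoted, "one word" is split
count "$*"                   # prints 1: "$*" joins everything into one word
```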
filename generation
"Globbing" is performed after word splitting is complete. Each word is
viewed as a series of patterns, separated by slashes. The process of
expansion replaces the word with the names of all existing files whose
names can be formed by replacing each pattern with a string that matches
the specified pattern. There are two restrictions on this: first, a
pattern cannot match a string containing a slash, and second, a pattern
cannot match a string starting with a period unless the first character of
the pattern is a period. * matches all but the dotfiles in the current
directory. .*/??? matches every 3-letter filename in a dotted
subdirectory of the current directory.
A glob pattern consists of normal characters, which match themselves, and meta-characters ("wildcards"). The meta-characters are !!, *, ?, and [. These characters lose their special meanings if they are quoted. When command or variable substitution is performed and the dollar sign or back quotes are not double quoted, the value of the variable or the output of the command is scanned for these characters and they are turned into meta-characters. Glob patterns are similar to ed regular expressions, but simpler. For example, a glob * is equivalent to a .* regular expression. Globbing is simpler because it only deals with filenames.
Two exclamation points at the beginning of a pattern function as a "not" operator, causing the pattern to match any string that the remainder of the pattern does not match. Other occurrences of exclamation points in a pattern match exclamation points. Two exclamation points are required rather than one to decrease the incompatibility with the System V shell (which does not treat exclamation points specially); this negation syntax is ash-specific.
An asterisk * matches any string of characters. A question mark matches any single character. A left bracket [ introduces a character list. The end of the character list is indicated by a ]; if the ] is missing then the [ matches a [ rather than introducing a character list. A character list matches any of the characters between the square brackets. A range of characters may be specified using a minus sign, e.g, [a-z]. The character list may be complemented by making an exclamation point the first character of the character list.
To include a ] in a character class, make it the first character listed (after the !, if any). To include a minus sign, make it the first or last character listed.
By convention, the name ``/u/user'' refers to the home directory of the specified user. There are good reasons why this feature should be supported by the file system (using a feature such as symbolic links) rather than by the shell, but ash is capable of performing this mapping if the file system doesn't. If the mapping is done by ash, setting the -f flag will turn it off. Ash silently discards nul (byte = zero, see ASCII) characters. Any other character will be handled correctly by ash, including characters with the high order bit set.
Globbing may be turned off when ash is invoked with the -f switch. If a word fails to match any files and the -z flag is not set, then the word will be left unchanged (except that the meta-characters will be considered normal characters). If the -z flag is set, then the word is only left unchanged if none of the patterns contain a character that can match anything besides itself. Otherwise the -z flag forces the word to be replaced with the names of the files that it matches, even if there are zero names.
If the operating system that ash is running on supports job control, ash will allow you to use it. In this case, typing the suspend character (typically ^Z) while running a command will return you to the shell and will make the suspended command the current job. You can then continue the job in the background by typing bg, or you can continue it in the foreground by typing fg.
By tradition, an exit status of zero means that a command has succeeded and a nonzero exit status indicates that the command failed. This is better than no convention at all, but in practice it is extremely useful to allow commands that succeed to use the exit status to return information to the caller. A variety of better conventions have been proposed, but none of them has met with universal approval. The convention used by ash is:
0        Success.
1        Alternate success.
2        Failure.
129-...  Command terminated by a signal.
The "alternate success" return is used by commands to indicate various conditions which are not errors but which can, with a little imagination, be conceived of as less successful than plain success. For example, test returns 1 when the tested condition is false and getopts returns 1 when there are no more options. Because this convention is not used universally, the -e option of ash causes the shell to exit when a command returns 1, even though that contradicts the convention described here.
When a command is terminated by a signal, the shell uses 128 plus the signal number as the exit code for the command.
bg
[ job ] ...
Continue the specified jobs (or the current job if no jobs are given)
in the background.
bltin
command arg...
Execute the specified built-in command. (This is useful when you have a
shell function with the same name as a built-in command.)
cd
[ directory ]
Switch to the specified directory (default $HOME). If an entry for
CDPATH appears in the environment of the cd command or the shell variable
CDPATH is set and the directory name does not begin with a slash, then the
directories listed in CDPATH will be searched for the specified directory.
The format of CDPATH is the same as that of PATH. In an interactive
shell, the cd command will print out the name of the directory that it
actually switched to if this is different from the name that the user
gave. These may be different either because the CDPATH mechanism was used
or because a symbolic link was crossed.
. file
The commands in the specified file are read and executed by the shell.
A path search is not done to find the file because the directories in
PATH generally contain files that are intended to be executed, not read.
eval string...
The strings are parsed as shell commands and executed.
exec [ command arg... ]
Unless "command" is omitted, the shell process is replaced with the
specified program (which must be a real program, not a shell built-in or
function). Any redirections on the exec command are marked as permanent,
so that they are not undone when the exec command finishes. If the
command is not found, the exec command causes the shell to exit.
exit [ exitstatus ]
Terminate the shell process. If "exitstatus" is given it is used as the
exit status of the shell; otherwise the exit status of the preceding
command is used.
export name...
The specified names are exported so that they will appear in the environment
of subsequent commands. The only way to un-export a variable is to unset it.
Ash
allows the value of a variable to be set at the same time it is exported
by writing
export name=value
With no arguments the export command lists the names of all exported variables.
fg [ job ]
Move the specified job or the current job to the foreground.
getopts optstring var
The System V "getopts" command.
hash [ -rv ] [ command... ]
The shell maintains a hash table which remembers the locations of
commands. With no arguments whatsoever, the hash command prints
out the contents of this table. Entries which have not been looked
at since the last
cd
command are marked with an asterisk; it is possible for these entries
to be invalid.
With arguments, the hash command removes the specified commands from
the hash table (unless they are functions) and then locates them.
With the
-v
option,
hash
prints the locations of the commands as it finds them.
The
-r
option causes the
hash
command to delete all the entries in the hash table except for
functions.
jobid [ job ]
Print the process id's of the processes in the job. If the job argument
is omitted, use the current job.
jobs
This command lists out all the background processes which are children
of the current shell process.
lc [ function-name ]
The function name is defined to execute the last command entered.
If the function name is omitted, the last command executed is
executed again. This command only works if the
-i
flag is set. This is all the command history processing ash has.
pwd
Print the current directory. The built-in command may differ from the
program of the same name because the built-in command remembers what
the current directory is rather than recomputing it each time. This
makes it faster. However, if the current directory is renamed, the
built-in version of pwd will continue to print the old name for the
directory. (This may change in cLIeNUX ash.)
read [ -p prompt ] [ -e ] variable...
The prompt is printed if the
-p
option is specified and the standard input is a terminal. Then a
line is read from the standard input. The trailing newline is deleted
from the line and the line is split as described
in the section on word splitting above, and the pieces are assigned to
the variables in order. If there are more pieces than variables, the
remaining pieces (along with the characters in IFS that separated them)
are assigned to the last variable. If there are more variables than
pieces, the remaining variables are assigned the null string.
The -e option causes any backslashes in the input to be treated specially. If a backslash is followed by a newline, the backslash and the newline will be deleted. If a backslash is followed by any other character, the backslash will be deleted and the following character will be treated as though it were not in IFS, even if it is.
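A non-interactive sketch; the input line comes from a here document rather than the terminal:

```shell
read first rest << 'END'
alpha beta gamma
END
echo "first=$first"          # alpha
echo "rest=$rest"            # beta gamma
```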
readonly name...
The specified names are marked as read only, so that they cannot be
subsequently modified or unset. Ash allows the value of a variable
to be set at the same time it is marked read only by writing
readonly name=value
With no arguments the readonly command lists the names of all read only variables.
set [ { -options | +options | -- } ] arg...
The set command performs three different functions. With no arguments,
it lists the values of all shell variables. If options are given, it
sets the specified option flags, or clears them if the option flags are
introduced with a + rather than a -. Only the first argument to set can
contain options. The possible options are:
-e Causes the shell to exit when a command terminates with a nonzero exit status, except when the exit status of the command is explicitly tested. The exit status of a command is considered to be explicitly tested if the command is used to control an if, elif, while, or until; or if the command is the left hand operand of an ``&&'' or ``||'' operator.
-f Turn off file name generation.
-I Cause the shell to ignore end of file conditions.
(This doesn't apply when the shell is reading a script sourced using the
``.'' command.) The shell will in fact exit if it gets 50 eof's in a
row.
-i Make the shell interactive. This causes the shell to
prompt for input, to trap interrupts, to ignore quit and terminate signals,
and to return to the main command loop rather than exiting on error.
-j Turns on Berkeley job control, on systems that support it.
When the shell starts up, the -j flag is set by default if the -i
flag is set.
-n Causes the shell to read commands but not execute them.
(This is marginally useful for checking the syntax of scripts.)
-s If this flag is set when the shell starts up, the shell
reads commands from its standard input. The shell doesn't examine the
value of this flag any other time.
-x If this flag is set, the shell will print out each
command before executing it.
-z If this flag is set, the file name generation process
may generate zero files. If it is not set, then a pattern which does
not match any files will be replaced by a quoted version of the pattern.
The third use of the set command is to set the values of the shell's positional parameters to the specified args. To change the positional parameters without changing any options, use ``--'' as the first argument to set. If no args are present, the set command will leave the value of the positional parameters unchanged, so to set the positional parameters to a set of values that may be empty, execute the command
shift $#
first to clear out the old values of the positional parameters.
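Putting set, shift, and the clearing idiom together:

```shell
set -- alpha beta gamma      # replace the positional parameters outright
echo "$# params: $*"
shift                        # drop $1
echo "$# params: $*"
shift $#                     # the idiom above: clear them all
echo "$# params left"
```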
setvar variable value
Assigns value to variable. (In general it is better to write
variable=value
rather than using setvar. Setvar is intended to be used in functions
that assign values to variables whose names are passed as parameters.)
shift [ n ]
Shift the positional parameters n times.
A shift sets the value of $1 to the value of $2, the value of $2 to
the value of $3, and so on, decreasing the value of $# by one.
If there are zero positional parameters, shifting doesn't do anything.
test [ arg ] check arg
Ash conditional execution constructs like if, while and until make their decision
of what to do based on the return code of the previous command. That value is kept
in the $? variable. Normal commands usually return a zero if they succeeded,
and some error code if they failed. In computers generally, 0 means "false", so a
unix return value of 0 can be interpreted as error=false, or success. In other
words, a return value is an error value. The ash test builtin provides
returncodes for a variety of handy tests and comparisons, solely to provide input to
an ash conditional. "check" in the format shown above is similar to a command
option. For example,
test -f file_bla
returns 0 if file_bla is an existing regular file. That's kinda backwards. Intuitively, I would expect that to return a non-zero value, but I guess it means "error looking for file_bla = false". To make sense of test return values, think of them as "error in assuming condition = true/false".
Checks can be text-oriented or numerical, and can be for one argument or two. This table is from the source code for the test builtin, with guesses. UNOP means the check takes one argument; BINOP means it takes two.
-r   UNOP   file is readable
-w   UNOP   file is writable
-x   UNOP   file is executable
-f   UNOP   file exists and is a regular file
-d   UNOP   file is a directory
-c   UNOP   file is a character device
-b   UNOP   file is a block device
-p   UNOP   file is a FIFO, a pipe
-u   UNOP   file is SetUID
-g   UNOP   file is SetGID
-k   UNOP   file has the sticky bit set
-s   UNOP   file exists and is not empty
-t   UNOP   file is a tty
-z   UNOP   string is empty
-n   UNOP   string is non-empty
-U   UNOP   return UID of file's owner
-G   UNOP   return GID of file's owner
-L   UNOP   file is a symlink
-S   UNOP   file is a socket
=    BINOP  strings are identical
!=   BINOP  strings differ
-eq  BINOP  integers are equal
-ne  BINOP  integers are not equal
-ge  BINOP  integerA >= integerB
-gt  BINOP  integerA > integerB
-le  BINOP  integerA <= integerB
-lt  BINOP  integerA < integerB
-nt  BINOP  fileA is newer than fileB
-ot  BINOP  fileA is older than fileB
-ef  BINOP  files are the same (same device and inode)
!    BUNOP  Boolean unary NOT
-a   BBINOP Boolean AND
-o   BBINOP Boolean OR
(    PAREN  parentheses
)    PAREN  parentheses
That's a rich set of tests and modifiers. Start with test -f and echo $?. These differ somewhat from shell to shell, so the best docs are experimentation.
trap [action] signal ...
Cause the shell to parse and execute
"action"
when any of the specified signals are received.
The signals are specified by signal number.
"Action" may be null or omitted;
the former causes the specified signal to be ignored and the latter
causes the default action to be taken.
When the shell forks off a subshell, it resets trapped (but not ignored)
signals to the default action.
The trap command has no effect on signals that were ignored on entry
to the shell.
umask [mask]
Set the value of umask to the specified octal value. If the argument is
omitted, the umask value is printed. umask determines the initial
permissions of files created by the shell.
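For instance, with a mask of 022 (an assumed, typical value), new files lose write permission for group and other:

```shell
#!/bin/sh
umask 022
f=/tmp/umask_demo.$$        # hypothetical file name
: > "$f"                    # create an empty file: mode 666 & ~022 = 644
ls -l "$f"                  # shows -rw-r--r--
rm -f "$f"
umask                       # with no argument, prints the current mask
```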
unset name ...
The specified variables and functions are unset and unexported.
If a given name corresponds to both a variable and a function, both the
variable and the function are unset.
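A sketch of the variable-and-function case (the name greet is made up). Note that some other shells separate the two (unset -f for functions); this reflects the behavior described here:

```shell
#!/bin/sh
greet() { echo "I am a function"; }
greet="I am a variable"
echo "$greet"            # prints: I am a variable
unset greet              # ash removes both the variable and the function
echo "${greet:-gone}"    # prints: gone
```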
wait [job]
Wait for the specified job to complete and return the exit status of the
last process in the job. If the argument is omitted, wait for all jobs
to complete and return an exit status of zero.
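For example, $! holds the process ID of the most recent background job:

```shell
#!/bin/sh
sleep 1 &
wait $!          # block until the background sleep finishes
echo $?          # 0: sleep succeeded
( exit 3 ) &
wait $!
echo $?          # 3: the exit status of the last process in the job
```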
cd() {
	if bltin cd "$@"
	then	if test -f .enter
		then	. .enter
		else	return 0
		fi
	fi
}
This function causes the file ``.enter'' to be read when you enter a directory, if it exists. The bltin command is used to access the real cd command. The ``return 0'' ensures that the function will return an exit status of zero if it successfully changes to a directory that does not contain a ``.enter'' file. Redefining existing commands is not always a good idea, but this example shows that you can do it if you want to.
The suspend function distributed with ash looks like
# Copyright (C) 1989 by Kenneth Almquist.  All rights reserved.
# This file is part of ash, which is distributed under the terms
# specified by the Ash General Public License.

suspend() {
	local -
	set +j
	kill -TSTP 0
}
This turns off job control and then sends a stop signal to the current process group, which suspends the shell. (When job control is turned on, the shell ignores the TSTP signal.) Job control will be turned back on when the function returns because ``-'' is local to the function. As an example of what not to do, consider an earlier version of suspend:
suspend() {
	suspend_flag=$-
	set +j
	kill -TSTP 0
	set -$suspend_flag
}
There are two problems with this. First, suspend_flag is a global variable rather than a local one, which will cause problems in the (unlikely) event that the user is using that variable for some other purpose. Second, consider what happens if the shell receives an interrupt signal after it executes the first set command but before it executes the second one. The interrupt signal will abort the shell function, so the second set command will never be executed and job control will be left off. The first version of suspend avoids this problem by turning job control off only in a local copy of the shell options; the local copy is discarded when the function terminates, no matter how it terminates.
Shell variables can be used to provide $-prefixed abbreviations for filenames which you type frequently, including commands.
When writing shell procedures, try not to make assumptions about what is imported from the environment. Explicitly unset or initialize all variables your program uses, rather than assuming they will be unset. If you use cd, it is a good idea to unset CDPATH.
People sometimes use ``<&-'' or ``>&-'' to provide no input to a command or to discard the output of a command. A better way to do this is to redirect the input or output of the command to /dev/null.
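That is, prefer the null device over closing the descriptor outright:

```shell
#!/bin/sh
# discard output: every write to /dev/null quietly succeeds
echo "noise" > /dev/null 2>&1
# provide no input: reads from /dev/null return end-of-file immediately
cat < /dev/null
echo $?          # 0: cat saw EOF and exited cleanly
# by contrast, "cat <&-" leaves cat with no file descriptor 0 at all,
# and many programs treat that as an error
```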
Word splitting and file name generation are performed by default, and you have to explicitly use double quotes (") to suppress them. This is backwards, but you can learn to live with it. Just get in the habit of writing double quotes around variable and command substitutions, and omit them only when you really want word splitting and file name generation. If you want word splitting but not file name generation, use the -f option.
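A small demonstration, using a made-up scratch directory:

```shell
#!/bin/sh
dir=/tmp/glob_demo.$$            # hypothetical directory
mkdir "$dir" && cd "$dir" || exit 1
touch a.txt b.txt
v="*.txt extra"
echo "$v"        # quoted: the literal string   *.txt extra
echo $v          # split, then globbed:         a.txt b.txt extra
set -f           # turn off file name generation
echo $v          # split only:                  *.txt extra
set +f
cd / && rm -rf "$dir"
```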
cat <<-!
	Line 1: $(line)
	Line 2: $(line)
!
Unsetting a function which is currently being executed may cause strange behavior.
The shell syntax allows a here document to be terminated by an end of file as well as by a line containing the terminator word which follows the <<. What this means is that if you mistype the terminator line, the shell will silently swallow up the rest of your shell script and stick it in the here document.
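For example (the terminator word EOF is arbitrary):

```shell
#!/bin/sh
cat <<EOF
first line
EOF
echo "after the here document"
# Had the terminator line above been mistyped (say "EOFF"), the shell
# would silently read everything to the end of the file into the here
# document, including the echo line.
```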
pwd should be a true built-in; hopefully it will be by the time you read this. There is other process-specific information that a shell already has and could provide to the user cheaply, but ash doesn't.