Contents
File descriptors
  Redirection
  Implementing pipe
Linux tar command examples (tar == archive, gzip == compression)
  1) Create a tar archive of a subdirectory
  2) List the contents of a tar archive
  3) Extracting a tar archive
  4) Linux tar command with gzip - Creating a compressed archive
  5) Creating a compressed archive of the current directory
  6) Creating an archive in a different directory
Linux gzip - How to work with compressed files
  The Unix/Linux gzip command
  The Linux gunzip command
  The Linux file compress utilities (zcat, zmore, zgrep)
WC command
  Here Document
Shell Script Commands
  UNIX Commands Review
  Command-Line Arguments
  shift Command
  Special Parameters *, @, #, 0, $, ?, _, !, -
  Uppercase or Lowercase Text for Easy Testing
  Check the Return Code
Using getopts to Parse Command-Line Arguments
Find Command
  Finding files that contain text (find + grep)
  Power file searching with find and grep
  Acting on files you find (find + exec)
  Running the ls command on files you find
  Find and delete
  Case-insensitive file searching
  Find files by modification time
grep command
  Searching for a text string in one file
  Searching for a string in multiple files
  Case-insensitive file searching with the Unix grep command
  Reversing the meaning of a grep search
  Using grep in a Unix/Linux command pipeline
  How you can find all the Java processes running on your system
  Find all the sub-directories in the current directory
  Search for multiple patterns at one time
  Searching for regular expressions (regex patterns) with grep
  Display only filenames with a grep search
  Related Unix/Linux grep commands and tutorials - the strings command
Scripting
  Variables
  Using the test shell builtin
  File Tests
  Integer Tests
  String Tests
  String tests with pattern matching
  The difference between direct assignment and using let
SED
Deleting characters from a string or a filename
dirname and basename
The difference between pwd and the PWD environment variable
Database connection and retrieving values
Regular expressions
Reading variables from files
Example of logging
Scripts to use for an Oracle database
How to test sftp in a Linux script
Granting permissions recursively
File descriptors
In simple terms, when you open a file, the operating system creates an entry to represent that file and stores information about the opened file. So if 100 files are open in your OS, there will be 100 entries in the OS (somewhere in the kernel). These entries are represented by integers like (...100, 101, 102...). This entry number is the file descriptor. It is just an integer that uniquely identifies an open file in the operating system. If your process opens 10 files, its process table will have 10 entries for file descriptors.
To the kernel, all open files are referred to by file descriptors. A file descriptor is a non-negative number. When we open an existing file or create a new file, the kernel returns a file descriptor to the process. The kernel maintains a table of all open file descriptors that are in use. When we want to read or write a file, we identify the file with the file descriptor that was returned by the open() or creat() function call, and use it as an argument to either read() or write().
By convention, UNIX system shells associate file descriptor 0 with the standard input of a process, file descriptor 1 with standard output, and file descriptor 2 with standard error.
File descriptors range from 0 to OPEN_MAX.
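These conventions are easy to see from the shell; a minimal sketch:

```shell
# fd 1 (stdout) and fd 2 (stderr) are separate streams:
echo "to standard output"          # written to file descriptor 1
echo "to standard error" >&2       # written to file descriptor 2

# fd 0 (stdin) is what read consumes:
echo "hello" | { read line; echo "read from stdin: $line"; }
```

Redirecting fd 2 to /dev/null while capturing fd 1 shows that the two lines really travel on different descriptors.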
Detailed explanation:
File Descriptors: One of the first things a UNIX programmer learns is that every running program starts with three files already opened: standard input, standard output, and standard error.
This raises the question of what an open file represents. The value returned by an open call is termed a file descriptor and is essentially an index into an array of open files kept by the kernel.
Figure 1.3. Abstraction
File descriptors associate the abstraction provided by device-drivers with a file interface provided to a user.
File descriptors are an index into a file-descriptor table stored by the kernel. The kernel creates a file descriptor in response to an open call and associates the file descriptor with some abstraction of an underlying file-like object, whether that is an actual hardware device, a file-system, or something else entirely. Consequently, a process's read or write calls that reference that file descriptor are routed to the correct place by the kernel to ultimately do something useful.
In short, the file descriptor is the gateway into the kernel's abstractions of underlying hardware. An overall view of the abstraction for physical devices is shown in Figure 1.3, “Abstraction”.
Starting at the lowest level, the operating system requires a programmer to create a device-driver to be able to communicate with a hardware device. This
device-driver is written to an API provided by the kernel just like in Example 1.2, “Abstraction in include/linux/virtio.h”; the device-driver will provide a
range of functions which are called by the kernel in response to various requirements. In the simplified example above, we can see the drivers provide a
read and write function that will be called in response to the analogous operations on the file-descriptor. The device-driver knows how to convert these
generic requests into specific requests or commands for a particular device.
To provide the abstraction to user-space, the kernel provides a file-interface via what is generically termed a device layer. Physical devices on the host are
represented by a file in a special file-system such as /dev. In UNIX-like systems, so called device-nodes have what are termed a major and a minor number
which allows the kernel to associate particular nodes with their underlying driver. These can be identified via ls as illustrated in Example 1.3, “Example of
major and minor numbers”.
This brings us to the file-descriptor, which is the handle user-space uses to talk to the underlying device. In a broad-sense, what happens when a file is
opened is that the kernel is using the path information to map the file-descriptor with something that provides an appropriate read and write, etc. API.
When this open is for a device (/dev/sr0 above), the major and minor number of the opened device-node provides the information the kernel needs to find
the correct device-driver and complete the mapping. The kernel will then know how to route further calls such as read to the underlying functions provided
by the device-driver.
A non-device file operates similarly, although there are more layers in-between. The abstraction here is the mount-point; mounting a file-system has the
dual purpose of setting up a mapping so the file-system knows the underlying device that provides the storage and the kernel knows that files opened
under that mount-point should be directed to the file-system driver. Like device-drivers, file-systems are written to a particular generic file-system API
provided by the kernel.
There are indeed many other layers that complicate the picture in real life. For example, the kernel will go to great efforts to cache as much data from disks as possible in otherwise-free memory; this provides many speed advantages. It will also try to organise device access in the most efficient ways possible; for example, trying to order disk access so that data stored physically close together is retrieved together, even if the requests did not arrive in that order. Further, many devices are of a more generic class, such as USB or SCSI devices, which provide their own abstraction layers to write to. Thus, rather than writing directly to devices, file-systems will go through these many layers. Understanding the kernel is to understand how these many APIs interrelate and coexist.
The Shell
The shell is the gateway to interacting with the operating system. Be it bash, zsh, csh or any of the many other shells, they all fundamentally have only one
major task — to allow you to execute programs (you will begin to understand how the shell actually does this when we talk about some of the internals of
the operating system later).
But shells do much more than allow you to simply execute a program. They have powerful abilities to redirect files, allow you to execute multiple programs
simultaneously and script complete programs. These all come back to the everything is a file idiom.
Redirection
Often we do not want the standard file descriptors mentioned in the section called “File Descriptors” to point to their default places. For example, you may wish to capture all the output of a program into a file on disk or, alternatively, have it read its commands from a file you prepared earlier. Another useful task is to pass the output of one program to the input of another. Together with the operating system, the shell facilitates all this and more.
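In the shell, these redirections look like this; a minimal sketch (the file names are illustrative):

```shell
# Capture all output of a program into a file on disk:
echo "hello" > out.txt             # fd 1 now points at out.txt

# Have a program read its input from a file prepared earlier:
wc -l < out.txt                    # fd 0 now points at out.txt

# Standard error can be redirected separately, or merged with stdout
# (ls fails here on purpose; || true keeps the script going):
ls /nonexistent 2> err.txt || true         # fd 2 points at err.txt
ls /nonexistent > both.txt 2>&1 || true    # both streams go to one file
```

The 2>&1 form reads as "make fd 2 point wherever fd 1 currently points", which is why it must come after the > redirection.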
Implementing pipe
The implementation of ls | more is just another example of the power of abstraction. What fundamentally happens here is that instead of associating the
file-descriptor for the standard-output with some sort of underlying device (such as the console, for output to the terminal), the descriptor is pointed to an
in-memory buffer provided by the kernel commonly termed a pipe. The trick here is that another process can associate its standard input with the other-
side of this same buffer and effectively consume the output of the other process. This is illustrated in Figure 1.4, “A pipe in action”
The pipe is an in-memory buffer that connects two processes together. File descriptors point to the pipe object, which buffers data sent to it (via a write) to be drained (via a read).
Writes to the pipe are stored by the kernel until a corresponding read from the other side drains the buffer. This is a very powerful concept and is one of the fundamental forms of inter-process communication, or IPC, in UNIX-like operating systems. The pipe allows more than just a data transfer; it can act as a signaling channel. If a process reads an empty pipe, it will by default block, or be put into hibernation, until there is some data available. Thus two processes may use a pipe to communicate that some action has been taken just by writing a byte of data; rather than the actual data being important, the mere presence of any data in the pipe can signal a message. Say, for example, one process requests that another print a file - something that will take some time. The two processes may set up a pipe between themselves where the requesting process does a read on the empty pipe; being empty, that call blocks and the process does not continue. Once the print is done, the other process can write a message into the pipe, which effectively wakes up the requesting process and signals that the work is done.
Allowing processes to pass data between each other like this gives rise to another common UNIX idiom: small tools, each doing one particular thing well. Chaining these small tools gives a flexibility that a single monolithic tool often cannot.
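The blocking behaviour described above is easy to demonstrate with a named pipe (a pipe that has a name in the file-system, created by mkfifo); a minimal sketch, with an illustrative pipe name:

```shell
PIPE=/tmp/demo.$$                   # illustrative pipe name
mkfifo "$PIPE"                      # create the named pipe

# The reader blocks on the empty pipe until data arrives:
( read msg < "$PIPE"; echo "woke up: $msg" ) &

echo "done" > "$PIPE"               # this write wakes the reader
wait                                # reap the background reader
rm "$PIPE"
```

The background reader sits blocked in read until the single write arrives; the presence of the data, not its content, is the signal.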
Linux tar command examples (tar == archive, gzip == compression)
These days the Linux tar command is more often used to create compressed archives that can easily be moved around, from disk to disk, or computer to
computer. One user may archive a large collection of files, and another user may extract those files, with both of them using the tar command.
To create an archive of a subdirectory, use a command like this:
tar cvf MyProject.20090816.tar MyProject
where MyProject.20090816.tar is the name of the archive (file) you are creating, and MyProject is the name of your subdirectory. It's common to name an uncompressed archive with the .tar file extension.
The general syntax of the tar command when creating an archive looks like this:
tar [flags] archive-file-name files-to-archive
To list all the files in a tar archive, use the t flag:
tar tvf MyProject.20090816.tar
This lists all the files in the archive, but does not extract them.
To list all the files in a compressed archive, add the z flag:
tar tzvf my-archive.tgz
That same command also works on a file that was tar'd and gzip'd in two separate steps (as indicated by the .tar.gz file extension):
tar tzvf my-archive.tar.gz
I almost always list the contents of an unknown archive before I extract the contents. I think this is always good practice, especially when you're logged in
as the root user.
For compressed archives the tar extract command looks like this:
tar xzvf my-archive.tar.gz
or this:
tar xzvf my-archive.tgz
But these days it's more common to create a gzip'd tar archive with one tar command, like this:
tar czvf MyProject.20090816.tgz MyProject
As you can see, I added the 'z' flag there (which means "compress this archive with gzip"), and I changed the extension of the archive to .tgz, which is the
common file extension for files that have been tar'd and gzip'd in one step.
In this tar example, the '.' at the end of the command (for example, tar czvf archive.tgz .) is how you refer to the current directory.
As you can see, you just add a path before the name of your tar archive (for example, tar czvf /tmp/archive.tgz .) to specify which directory the archive should be created in.
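Putting these pieces together, a typical create / list / extract round trip looks like this (the directory, file, and archive names are illustrative):

```shell
mkdir -p MyProject
echo "some data" > MyProject/notes.txt

tar czvf MyProject.tgz MyProject    # create a compressed archive
tar tzvf MyProject.tgz              # list the contents without extracting

rm -rf MyProject                    # simulate moving to another machine
tar xzvf MyProject.tgz              # extract the archive again
cat MyProject/notes.txt
```

Listing before extracting (the tzvf step) is the habit recommended above, so you know what an unknown archive will write to your disk.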
In this tutorial I take a quick look at the gzip and gunzip file compression utilities, along with their companion tools you may not have known about: zcat,
zgrep, and zmore.
The Unix/Linux gzip command
You can compress a file with the Unix/Linux gzip command. For instance, if I run an ls -l command on an uncompressed Apache access log file named
access.log, I get this output:
-rw-r--r-- 1 al al 22733255 Aug 12 2008 access.log
Note that the size of this file is 22,733,255 bytes. Now, if we compress the file using gzip, like this:
gzip access.log
we end up creating a new, compressed file named access.log.gz. Here's what that file looks like:
-rw-r--r-- 1 al al 2009249 Aug 12 2008 access.log.gz
Notice that the file has been compressed from 22,733,255 bytes down to just 2,009,249 bytes. That's a huge savings in file size, roughly 10 to 1(!).
There's one important thing to note about gzip: The old file, access.log, has been replaced by this new compressed file, access.log.gz. This might freak you
out a little the first time you use this command, but very quickly you get used to it. (If for some reason you don't trust gzip when you first try it, feel free to
make a backup copy of your original file.)
The Linux gunzip command
To restore (decompress) the file, use the gunzip command:
gunzip access.log.gz
Running that command restores our original file, as you can see in this output:
-rw-r--r-- 1 al al 22733255 Aug 12 2008 access.log
The Linux file compress utilities (zcat, zmore, zgrep)
For instance, instead of using cat to display the entire contents of the file, you use zcat to work on the gzip'd file instead, like this:
zcat access.log.gz
(Of course that output will go on for a long time; the file holds roughly 22 MB of uncompressed text.)
You can also scroll through the file one page at a time with zmore:
zmore access.log.gz
And finally, you can grep through the compressed file with zgrep:
zgrep '/java/index.html' access.log.gz
There are also two other commands, zcmp and zdiff, that let you compare compressed files, but I personally haven't had the need for them. However, as
you can imagine, they work like this:
zcmp file1.gz file2.gz
or
zdiff file1.gz file2.gz
As a quick summary, just remember that you don't have to uncompress files to work on them, you can use the following z-utilities to work on the
compressed files instead:
zcat
zmore
zgrep
zcmp
zdiff
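A quick round trip through these tools (the log content is illustrative):

```shell
echo "GET /java/index.html" > access.log
gzip access.log                     # replaces access.log with access.log.gz

zcat access.log.gz                  # view the contents without uncompressing
zgrep "java" access.log.gz          # grep the compressed file directly

gunzip access.log.gz                # restore the original access.log
```

Notice that at no point did we need an uncompressed copy on disk to inspect the data; that is the whole point of the z-utilities.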
WC command
If we have a file du1L that contains 5 lines like this
unu
doi
trei
patru
cinci
then we can count the number of lines in several ways:
[mihail@oc8168772081 TEST_INFOACADEMY]$ cat du1L
unu
doi
trei
patru
cinci
[mihail@oc8168772081 TEST_INFOACADEMY]$ wc -l du1L
5 du1L
[mihail@oc8168772081 TEST_INFOACADEMY]$ wc -l <du1L
5
[mihail@oc8168772081 TEST_INFOACADEMY]$ cat du1L | wc -l    # piping the output of cat into the input of wc
5
Here Document
A here document is used to redirect input into an interactive shell script or program. We can run an interactive program within a shell script without user
action by supplying the required input for the interactive program, or interactive shell script. This is why it is called a here document: the required input is
here, as opposed to somewhere else.
This is the syntax for a here document:
program_name <<LABEL
Program_Input_1
Program_Input_2
Program_Input_3
Program_Input_#
LABEL
Example:
/usr/local/bin/My_program << EOF
Randy
Robin
Rusty
Jim
EOF
Notice in the here document that there are no leading spaces in the program-input lines between the << EOF and the closing EOF label. If a space is added to the input, the here document may fail. The input that is supplied must be exactly the data that the program is expecting, and many programs will fail if spaces are added to the input.
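The same pattern works with any program that reads standard input. A runnable example using sort, which reads the names between the labels as its input:

```shell
# Feed three lines of input to sort via a here document:
sort <<EOF
Randy
Jim
Robin
EOF
```

sort receives the three names on its standard input, exactly as if a user had typed them, and prints them in sorted order (Jim, Randy, Robin).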
Most of the commands shown in Table 1-3 are used at some point in this book, depending on the task we are working on in each chapter.
COMMAND     DESCRIPTION
( )         Runs the enclosed command in a sub-shell
(( ))       Evaluates and assigns a value to a variable, and does math in the shell
$(( ))      Evaluates the enclosed expression and substitutes the result
[ ]         Same as the test command
< >         Used for string comparison
$( )        Command substitution
`command`   Command substitution (older backquote form)
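A short sketch of the last few rows of the table:

```shell
SUB=$(echo hello)           # $( )  - command substitution
OLD=`echo hello`            # `command` - older backquote form, same result
SUM=$(( 2 + 3 ))            # $(( )) - evaluates the enclosed expression
( cd / )                    # ( ) - runs in a sub-shell, so this cd
                            #       does not change the current directory
echo "$SUB $OLD $SUM"
```

The $( ) form is generally preferred over backquotes because it nests cleanly; both substitute the command's output into the line.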
Command-Line Arguments
The command-line arguments $1, $2, $3,…$9 are positional parameters, with $0 pointing to the actual command, program, shell script, or function and $1,
$2, $3, …$9 as the arguments to the command.
The positional parameters $1, $2, and so on in a function are for the function's own use and may not be in the environment of the shell script that is calling the function. The region where a variable is known, in a function or in a shell script, is called the scope of the variable.
shift Command
The shift command is used to move positional parameters to the left; for example, shift causes $2 to become $1. We can also add a number to the shift
command to move the positions more than one position; for example, shift 3 causes $4 to move to the $1 position.
Sometimes we encounter situations where we have an unknown or varying number of arguments passed to a shell script or function, $1, $2, $3…. (also
known as positional parameters). Using the shift command is a good way of processing each positional parameter in the order they are listed.
To further explain the shift command, we will show how to process an unknown number of arguments passed to the shell script shown in Listing 1-2. Try to
follow through this example shell script structure. This script is using the shift command to process an unknown number of command-line arguments, or
positional parameters. In this script we will refer to these as tokens.
#!/usr/bin/sh
#
# SCRIPT: shifting.sh
# PLATFORM: Not platform dependent
#
# PURPOSE: This script is used to process all of the tokens which
# are pointed to by the command-line arguments, $1, $2, $3, etc…
#
# REV. LIST:
#
TOTAL=0                         # Initialize the token counter
while [ $# -gt 0 ]              # Loop until all tokens are consumed
do
      TOTAL=`expr $TOTAL + 1`   # A little math in the
                                # shell script, a running
                                # total of tokens processed.
      TOKEN=$1                  # We always point to the $1 argument
      # process each $TOKEN here
      shift                     # Grab the next token, i.e. $2 becomes $1
done
Positional Parameters
The arguments on the command line are available to a shell program as numbered parameters. The first argument is $1, the second is $2, and so on.
There are special parameters that allow accessing all the command-line arguments at once. $* and $@ both will act the same unless they are enclosed in
double quotes, “ ”.
Special Parameter Definitions
We can rewrite the shell script shown in Listing 1-2 to process an unknown number of command-line arguments with either the $* or $@ special
parameters, as shown in Listing 1-3.
#!/usr/bin/sh
#
# SCRIPT: shifting.sh
# AUTHOR: Randy Michael
# DATE: 12-31-2007
# REV: 1.1.A
# PLATFORM: Not platform dependent
# PURPOSE: This script is used to process all of the tokens which
# Are pointed to by the command-line arguments, $1, $2, $3, etc… -
#
# REV LIST:
# Start a for loop
for TOKEN in $*
do
      echo "$TOKEN"             # process each $TOKEN here
done
Listing 1-3 Example using the special parameter $*
We could have also used the $@ special parameter just as easily. As we see in the preceding code segment, the use of the $@ or $* is an alternative
solution to the same problem, and it was less code to write. Either technique accomplishes the same task.
So, the first two special parameters, $* and $@, expand to the value of all the positional parameters combined. $# expands to the number of positional
parameters. $0 contains the path to the currently running script or to the shell itself if no script is being executed.
$$ contains the process identification number (PID) of the current process, $? is set to the exit code of the last-executed command, and $_ is set to the last
argument to that command. $! contains the PID of the last command executed in the background, and $- is set to the option flags currently in effect.
The Bourne shell could only address up to nine positional parameters. If a script used $10, it would be interpreted as $1 followed by a zero. To be able to
run old scripts, bash maintains that behavior. To access positional parameters greater than 9, the number must be enclosed in braces: ${15}.
The arguments passed to a script can be accessed via their positions: $0, $1, $2, and so on. The command shift N moves the positional parameters down by
N positions. If you run shift (the default value of N is 1), then $1 is discarded, $2 becomes $1, $3 becomes $2, and so on: they are all shifted by one
position ($0, the script name, is unchanged). There are some very clever and simple uses of shift to iterate through a list of parameters of unknown length.
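A quick sketch of shift in action:

```shell
set -- one two three   # simulate three positional parameters
echo $#                # prints: 3
shift                  # discard $1; the others move down one position
echo "$1 $2"           # prints: two three
echo $#                # prints: 2
```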
We often need to test text strings like filenames, variables, file text, and so on, for comparison. It can sometimes vary so widely that it is easier to
uppercase or lowercase the text for ease of comparison. The tr and typeset commands can be used to uppercase and lowercase text. This makes testing for
things like variable input a breeze. Following are some examples of using the tr command:
Variable values:
    Expected input:  TRUE
    Real input:      TRUE
    Possible input:  true, TRUE, True, and so on
Upcasing:
Downcasing:
NOTE The single quotes are required around the square brackets.
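The tr commands behind the Upcasing and Downcasing labels can be sketched like this (the variable name and value are just examples):

```shell
VARIABLE="True"
UPCASED=$(echo $VARIABLE | tr '[a-z]' '[A-Z]')    # upcasing
DOWNCASED=$(echo $VARIABLE | tr '[A-Z]' '[a-z]')  # downcasing
echo $UPCASED     # prints: TRUE
echo $DOWNCASED   # prints: true
```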
No matter what the user input is, we will always have the stable input of TRUE, if uppercased, and true, if lowercased. This reduces our code testing and
also helps the readability of the script.
We can also use typeset to control the attributes of a variable in the shell. In the previous example we are using the variable VARIABLE. We can set the
attribute to always translate all of the characters to uppercase or lowercase. To set the case attribute of the variable VARIABLE to always translate
characters assigned to it to uppercase, we use
typeset -u VARIABLE
The -u switch to the typeset command is used for uppercase. After we set the attribute of the variable VARIABLE, using the typeset command, anytime we
assign text characters to VARIABLE they are automatically translated to uppercase characters.
Example:
typeset -u VARIABLE
VARIABLE="True"
echo $VARIABLE
TRUE
To set the case attribute of the variable VARIABLE to always translate characters to lowercase, we use
typeset -l VARIABLE
Example:
typeset -l VARIABLE
VARIABLE="True"
echo $VARIABLE
true
As an example, we want to check if the /usr/local/bin directory exists. Each of these blocks of code accomplishes the exact same thing:
test -d /usr/local/bin
if [ "$?" -eq 0 ] # Check the return code
then              # The return code is zero
      echo "Directory exists"
fi
or
if test -d /usr/local/bin
then              # The return code is zero
      echo "Directory exists"
fi
or
if [ -d /usr/local/bin ]
then              # The return code is zero
      echo "Directory exists"
fi
Notice that only the first example checks the return code explicitly, using $?. The other examples use the control structure's built-in test. The built-in tests
do the same job of processing the return code, but they hide this step. All three of the previous examples give the exact same result; choosing between them
is just a matter of personal preference and readability.
Using getopts to Parse Command-Line Arguments
The getopts command is built into the shell. It retrieves valid command-line options specified by a single character preceded by a - (minus sign) or + (plus
sign). To specify that a command switch requires an argument to the switch, it is followed by a : (colon). If the switch does not require any argument, the :
should be omitted. All of the options put together are called the OptionString, and this is followed by some variable name. The argument for each
switch is stored in a variable called OPTARG. If the entire OptionString is preceded by a : (colon), any unmatched switch option causes a ? to be loaded into
the variable. The form of the command follows:
getopts OptionString VARIABLE [ Argument ... ]
The easiest way to explain this is with an example. For a script we need seconds, minutes, hours, days, and a process to monitor. For each one of these we
want to supply an argument — that is, -s5 -m10 -pmy_backup. In this we are specifying 5 seconds, 10 minutes, and the process is my_backup. Notice that
there does not have to be a space between the switch and the argument, and they can be entered in any order. This is what makes getopts so great! The
command line to set up our example looks like this:
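A sketch of that loop, reconstructed from the description that follows (the variable and function names, such as TIMED and usage, are assumptions):

```shell
# Simulate the example arguments; in a real script these come from the command line
set -- -s5 -m10 -pmy_backup

usage() { echo "usage: $0 -s seconds -m minutes -h hours -d days -p process"; }

while getopts ":s:m:h:d:p:" TIMED 2>/dev/null
do
      case $TIMED in
      s) SECS=$OPTARG ;;        # -s5  -> SECS=5
      m) MINUTES=$OPTARG ;;     # -m10 -> MINUTES=10
      h) HOURS=$OPTARG ;;
      d) DAYS=$OPTARG ;;
      p) PROCESS=$OPTARG ;;     # -pmy_backup -> PROCESS=my_backup
      \?) usage                 # unknown switch: TIMED is set to ?
          exit 1 ;;
      esac
done
echo "$SECS $MINUTES $PROCESS"  # prints: 5 10 my_backup
```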
There are a few things to note in the getopts command. The getopts command needs to be part of a while loop with a case statement within the loop for
this example. On each option we specified, s, m, h, d, and p, we added a : (colon) after each switch. This tells getopts that an argument is required. The :
(colon) before the OptionString list tells getopts that if an unspecified option is given, to set the TIMED variable to the ? character. This allows us to call the
usage function and exit with a return code of 1. The first thing to be careful of is that getopts does not care what arguments it receives, so we have to take
action if we want to exit. The last thing to note is that the first line of the while loop has output redirection of standard error (file descriptor 2) to the bit
bucket. Anytime an unexpected argument is encountered, getopts sends a message to standard error (file descriptor 2). Because we expect this to happen,
we can just ignore the messages and discard them to /dev/null. We will study getopts a lot in this book.
Find Command
The Linux find command is very powerful. It can search the entire filesystem to find files and directories according to the search criteria you specify.
Besides using the find command to locate files, you can also use it to execute other Linux commands (grep, mv, rm, etc.) on the files and directories you
find, which makes find extremely powerful.
On a related note, don’t forget the locate command. It keeps a database on your Unix/Linux system to help find files very fast:
locate command
--------------
locate tomcat.sh # search the entire filesystem for 'tomcat.sh' (uses the locate database)
The remaining sections on this page describe more fully the commands just shown.
This first Linux find example searches through the root filesystem ("/") for the file named Chapter1. If it finds the file, it prints the location to the screen.
find / -name Chapter1 -type f -print
On Linux and modern Unix systems you no longer need the -print option at the end of the find command, so you can issue it like this:
find / -name Chapter1 -type f
The -type f option here tells the find command to return only files. If you don't use it, the find command will return files, directories, and other things
like named pipes and device files that match the name pattern you specify. If you don't care about that, just leave the -type f option off your command.
This next find command searches through only the /usr and /home directories for any file named Chapter1.txt:
find /usr /home -name Chapter1.txt -type f
To search in the current directory — and all subdirectories — just use the . character to reference the current directory in your find commands, like this:
find . -name Chapter1 -type f
This next example searches through the /usr directory for all files that begin with the letters Chapter, followed by anything else. The filename can end with
any other combination of characters. It will match filenames such as Chapter, Chapter1, Chapter1.bad, Chapter-in-life, etc.:
find /usr -name "Chapter*" -type f
This next command searches through the /usr/local directory for files that end with the extension .html. These file locations are then printed to the
screen:
find /usr/local -name "*.html" -type f
Every option you just saw for finding files can also be used on directories. Just replace the -f option with a -d option. For instance, to find all directories
named build under the current directory, use this command:
find . -type d -name build
To find all files that don't match a filename pattern, use the -not argument of the find command, like this:
find . -type f -not -name "*.html"
That generates a list of all files beneath the current directory whose filename DOES NOT end in .html, so it matches files like *.txt,*.jpg, and so on.
(The {} \; characters at the end are required any time you want to exec a command on the files that are found. I find it helpful to think of them as a
placeholder for each file that is found.)
This next example is similar, but here I use the -i argument to the grep command, telling it to ignore the case of the string, so it will find files
that contain string, String, STRING, etc.:
find . -type f -name "*.java" -exec grep -il string {} \;
This is a special way of mixing the Linux find and grep commands together to search every file in every subdirectory of my current location. It searches for
the string "foo" in every file below the current directory, in a case-insensitive manner. This find/grep command can be broken down like this:
"." means "look in the current directory"
-type f means "look in files only"
-exec grep -il foo means "search for the string 'foo' in a case-insensitive manner, and return the matching line and
filename when a match is found"
{} \; is a little bizarre syntax that you need to add to the end of your find command whenever you add the -exec option.
I try to think of it as a placeholder for the filenames the find command finds.
Acting on files you find (find + exec)
This command searches through the /usr/local directory for files that end with the extension .html. When these files are found, their permission is
changed to mode 644 (rw-r--r--).
find /usr/local -name "*.html" -type f -exec chmod 644 {} \;
This find command searches through the htdocs and cgi-bin directories for files that end with the extension .cgi. When these files are found, their
permission is changed to mode 755 (rwxr-xr-x). This example shows that the find command can easily search through multiple sub-directories (htdocs, cgi-
bin) at one time:
find htdocs cgi-bin -name "*.cgi" -type f -exec chmod 755 {} \;
That's nice, but what if I want to see the last modification time of these files, or their filesize? No problem, I just add the ls -ld command to my find
command, like this:
find . -name "*.pl" -exec ls -ld {} \;
The "-l" flag of the ls command tells ls to give me a "long listing" of each file, while the -d flag is extremely useful in this case; it tells ls to give me the same
output for a directory. Normally if you use the ls command on a directory, ls will list the contents of the directory, but if you use the -d option, you'll get one
line of information, as shown above.
Here's how to find all files beneath the current directory that begin with the letters 'Foo' and delete them.
find . -type f -name "Foo*" -exec rm {} \;
This one is even more dangerous. It finds all directories named CVS, and deletes them and their contents. Just like the previous command, be very careful
with this command, it is dangerous(!), and not recommended for newbies, or if you don't have a backup.
find . -type d -name CVS -exec rm -r {} \;
The syntax to find multiple filename extensions with one command looks like this:
find . -type f \( -name "*.c" -o -name "*.sh" \)
Just keep adding more "-o" (or) options for each filename extension.
To find files modified within the last seven days, use the -mtime option with an argument of -7. To limit the output to just files, add the -type f option as
shown earlier:
find . -mtime -7 -type f
If you’re just looking for a file by name, and you want to be able to find that file even faster than you can with the find command, take a look at the Linux
locate command. The locate command keeps filenames in a database, and can find them very fast.
For more details on the find command, check out our online version of the find man page.
grep command
The name grep comes from the ed command g/re/p, for "global regular expression print", but you can think of the grep command as a "search" command for
Unix and Linux systems: It's used to search for text strings and regular expressions within one or more files.
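A simple example of the kind being discussed (the file and its contents are invented for illustration):

```shell
# create a small sample file (an assumption, just for the demo)
printf 'fred flintstone\nbarney rubble\n' > sample.txt
grep 'fred' sample.txt   # prints the matching line: fred flintstone
```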
In a simple example like this, the quotes around the string fred aren't necessary, but they are needed if you're searching for a string that contains spaces,
and will also be needed when you get into using regular expressions (search patterns).
The '*' wildcard matches all files in the current directory, and the grep output from this command will show both (a) the matching filename and (b) all lines
in all files that contain the string 'joe'.
As a quick note, instead of searching all files with the "*" wildcard, you can also use grep to search all files in the current directory that end in the file
extension .txt, like this:
grep 'joe' *.txt
This grep search example matches the string "score", whether it is uppercase (SCORE), lowercase (score), or any mix of the two (Score, SCore, etc.).
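That case-insensitive search might look like this (the sample lines are invented for illustration):

```shell
# build a small sample file; grep -i then matches every variant of "score"
printf 'Four score and seven years ago\nThe SCORE was tied\nfinal Score\n' > score.txt
grep -i score score.txt   # prints all three lines
```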
Here's how you can find all the Java processes running on your system using the ps and grep commands in a Unix pipeline:
ps auxwww | grep -i java
In this example I've piped the output of the ps auxwww command into my grep command. The grep command only prints the lines that have the string
"java" in them; all other lines from the ps command are not printed.
One way to find all the sub-directories in the current directory is to mix the Linux ls and grep commands together in a pipe, like this:
ls -al | grep '^d'
Here I'm using grep to list only those lines where the first character in the line is the letter d.
Using the Linux grep command to search for multiple patterns at one time (egrep)
You can use a different version of the grep command to search for multiple patterns at one time. To do this, just use the egrep command
instead of grep, like this:
egrep 'score|nation|liberty|equal' gettysburg-address.txt
This Unix egrep command searches the file named gettysburg-address.txt for the four strings shown (score, nation, liberty, and equal). It returns any lines
from the file that contain any of those words.
I should also note that "egrep" stands for "extended grep", and as you can see, it lets you do things like searching for multiple patterns at one time.
Searching for regular expressions (regex patterns) with grep
Of course the Linux grep command is much more powerful than this, and can handle very powerful regular expressions (regex patterns). In a simple
example, suppose you want to search for the strings "Foo" or "Goo" in all files in the current directory. That grep command would be:
grep '[FG]oo' *
If you want to search for a sequence of three integers with grep you might use a command like this:
grep '[0-9][0-9][0-9]' *
This next grep command searches for all occurrences of the text string fred within the /etc/passwd file, but also requires that the "f" in the name "fred" be
in the first column of each record (that's what the caret character tells grep). Using this more-advanced search, a user named "alfred" would not be
matched, because the letter "a" will be in the first column:
grep '^fred' /etc/passwd
Regular expressions can get much, much more complicated (and powerful) than this, so I'll just leave it here for now.
If you're looking through a lot of files for a pattern, and you just want to find the names of the files that contain your pattern (or "patterns", as shown with
egrep) -- but don't want to see each individual grep pattern match -- just add the -l (lowercase letter L) to your grep command, like this:
grep -l StartInterval *.plist
This command doesn't show every line in every file that contains the string "StartInterval"; it just shows the names of all the files that contain this string,
like this:
com.apple.atrun.plist
com.apple.backupd-auto.plist
com.apple.dashboard.advisory.fetch.plist
com.apple.locationd.plist
org.amavis.amavisd_cleanup.plist
Of course you can also combine grep command arguments, so if you didn't happen to know how to capitalize "StartInterval" in that previous example, you
could just add the -i argument to ignore case, like this:
grep -il startinterval *.plist
and that would have worked just fine as well, returning the same results as the previous grep command example.
To show the line numbers of the files that match your grep command, just add the -n option, like this:
grep -n we gettysburg-address.txt
Searching my sample gettysburg-address.txt file, I get the following output from this command:
9:Now we are engaged in a great civil war,
22:that we should do this.
24:But in a larger sense we can not dedicate -
25:we can not consecrate -
26:we can not hallow this ground.
29:have consecrated it far above our poor power
33:what we say here,
43:we take increased devotion to that cause
46:that we here highly resolve that these dead
grep before/after - Showing lines before or after your grep pattern match
After a recent comment, I just learned that you can display lines before or after your grep pattern match, which is also very cool. To display five lines before
the phrase "the living" in my sample document, use the -B argument, like this:
grep -B 5 "the living" gettysburg-address.txt
Similarly, to show the five lines after that same search phrase, use the -A argument with your Unix grep command, like this:
grep -A 5 "the living" gettysburg-address.txt
Of course you can use any number after the -A and -B options, I'm just using the number five here as an example.
There are at least two other commands related to grep that you should at least be aware of. The fgrep command stands for "fast grep", or "fixed strings",
depending on who you talk to. The egrep command stands for "extended grep", and lets you use even more powerful regular expressions.
And as I mentioned in the previous section, Mac OS X systems have the mdfind command. As a practical matter, I use plain old grep 99% of the time.
Scripting
Variables
A variable is a parameter denoted by a name; a name is a word containing only letters, numbers, or underscores and beginning with a letter or an
underscore.
Many variables are set by the shell itself, including three you have already seen: HOME, PWD, and PATH. With only two minor exceptions, auto_resume and
histchars, the names of all the variables set by the shell are spelled entirely in uppercase letters.
The words entered after the command are its arguments. These are words separated by whitespace (one or more spaces or tabs). If the whitespace is
escaped or quoted, it no longer separates words but becomes part of the word.
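The three command lines described next might look like this (a reconstruction based on the description):

```shell
echo '2   3'     # single quotes: the spaces between 2 and 3 are quoted
echo now\ then   # the space after now is escaped by a backslash
echo "a b"       # a space quoted with double quotes
```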
In the first line, the spaces between 2 and 3 are quoted because they are surrounded by single quotation marks. In the second, the space after now is
escaped by a backslash, which is the shell’s escape character.
In the final line, a space is quoted with double quotes.
In the second command, the first argument is an option. Traditionally, options to Unix commands are a single letter preceded by a hyphen, sometimes
followed by an argument. The GNU commands found in Linux distributions often accept long options as well. These are words preceded by a double
hyphen. For example, most GNU utilities have an option called --version that prints the version:
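For example (the exact output varies by system and version):

```shell
grep --version | head -n 1   # e.g. "grep (GNU grep) 3.x" on a GNU system
```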
An internal command in all modern shells, echo prints its arguments with a single space between them to the standard output stream, followed by a
newline:
$ echo The quick brown fox
The default newline can be suppressed in one of two ways, depending on the shell:
$ echo -n No newline
No newline$ echo "No newline\c"
No newline$
The BSD variety of echo accepted the option -n, which suppressed the newline. AT&T’s version used an escape sequence, \c, to do the same thing.
bash has the -e option to activate escape sequences such as \c but by default uses -n to prevent a newline from being printed. (The escape sequences
recognized by echo -e are the same as those described in the next section, with the addition of \c).
Add -e to the echo command if you want the escape sequences to be recognized.
If you limit the use of echo to situations where there cannot be a conflict, that is, where you are sure the arguments do not begin with -n and do not
contain escape sequences, you will be fairly safe. For everything else (or if you’re not sure), use printf.
As you know, it is almost obligatory to begin with a hello world script and we will not disappoint as far as this is concerned. We will begin by creating a new
script $HOME/bin/hello1.sh. The contents of the file should read as in the following screenshot:
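A reconstruction of those contents, matching the line-by-line description below (the snippet also creates and runs the file so it can be tried anywhere):

```shell
# Recreate $HOME/bin/hello1.sh with the three lines described below
cat > hello1.sh <<'EOF'
#!/bin/bash
echo "Hello World"
exit 0
EOF
bash hello1.sh   # prints: Hello World
```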
I am hoping that you haven't struggled with this too much; it is just three lines after all. I encourage you to run through the examples as you read to really
help you instill the information with a good hands-on practice.
#!/bin/bash: Normally, this is always the first line of the script and is known as the shebang. A comment in a shell script starts with the # symbol, so the
shebang looks like a comment, but the system still uses this line: it tells the system which interpreter to use to execute the script. We use bash for
shell scripts, and we may use PHP or Perl for other scripts, as required. If we do not add this line, the commands will be run within the current shell, which
may cause issues if we run another shell.
echo "Hello World": The echo command is a shell built-in and is used to write to standard output, STDOUT, which defaults to the
screen. The information to print is enclosed in double quotes; there will be more on quotes later.
exit 0: The exit command is a shell built-in and is used to leave or exit the script. The exit code is supplied as an integer argument. A value of anything
other than 0 indicates some type of error in the script's execution.
With the script saved in a directory listed in our PATH, it still will not execute as a standalone script. We will have to assign execute permissions to the
file, as needed. For a simple test, we can run the file directly with bash. The following command shows you how to do this:
$ bash $HOME/bin/hello1.sh
We should be rewarded with the Hello World text being displayed back on our screens. This is not a long-term solution, as we need to have the script in the
$HOME/bin directory, specifically, to make the running of the script easy from any location without typing the full path. We need to add in the execute
permissions as shown in the following code:
$ chmod +x $HOME/bin/hello1.sh
We should now be able to run the script simply by typing its name, hello1.sh, from any location.
This script is simple but we still have to know how to make use of the exit codes from scripts and other applications. The command-line list that we
generated earlier while creating the $HOME/bin directory, is a good example of how we can use the exit code:
$ command1 || command2
In the preceding example, command2 is executed only if command1 fails in some way. To be specific, command2 will run if command1 exits with a status code
other than 0.
Conversely, with an AND list, $ command1 && command2, we will only execute command2 if command1 succeeds and issues an exit code of 0.
To read the exit code from our script explicitly, we can view the $? variable, as shown in the following example:
$ hello1.sh
$ echo $?
The expected output is 0, as this is what we have added to the last line of the file and there is precious little else that can go wrong causing us to fail in
reaching that line.
Referring back to the command hierarchy within this chapter, we can use type to determine the location and type of the hello1.sh command:
$ type hello1.sh    #To determine the type and path
$ type -a hello1.sh #To print all commands found if the name is NOT unique
$ type -t hello1.sh #To print the simple type of the command
${var#Pattern} Remove from $var the shortest part of $Pattern that matches the front end of $var.
${var##Pattern} Remove from $var the longest part of $Pattern that matches the front end of $var.
Example:
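The difference between the two forms, using a sample path as the variable value:

```shell
var=/usr/local/bin/hello1.sh
echo "${var#*/}"    # prints: usr/local/bin/hello1.sh (shortest */ match removed)
echo "${var##*/}"   # prints: hello1.sh (longest */ match removed: just the file name)
```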
It is probably time for us to pull over to the side of the scripting highway and look a little more at the test command. This is both a shell built-in and a file
executable in its own right. The shell built-in is used unless we specify the full path to the file.
When the test command is run without any expressions to evaluate, then the test will return false. So, if we run the test as shown in the following
command:
$ test
The exit status will be 1, even though no error output is shown. The test command always returns either true or false, with an exit status of 0 or 1,
respectively. The basic syntax of test is:
test EXPRESSION
test ! EXPRESSION
If we need to include multiple expressions, these can be ANDed or ORed together using the -a and -o options, respectively:
We can also write a shorthand version, replacing the word test with square brackets surrounding the expression (the spaces inside the brackets are
required), as shown in the following example:
[ EXPRESSION ]
Testing strings
We can test for the equality or inequality of two strings. For example, one of the ways to test whether the current user is root is the following command:
[ "$USER" = root ]
Equally, we could test for a non-root account with the following two methods:
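Presumably these are the negated test and the != operator (a sample value stands in for $USER so the sketch runs anywhere):

```shell
user=alice                               # sample value for illustration
[ ! "$user" = root ] && echo "not root"  # method 1: negate the equality test
[ "$user" != root ] && echo "not root"   # method 2: use the inequality operator
```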
We can also test for zero values and non-zero values of strings. We saw this in an earlier example in this chapter.
To test if a string has a value, we could use the -n option. We can check to see if the current connection is made via SSH by checking for the existence of a
variable in the user's environment. We show this using test and square brackets in the following two examples (the double quotes matter: without them, an
unset variable would make the test succeed):
test -n "$SSH_TTY"
[ -n "$SSH_TTY" ]
If this is true, then the connection is made with SSH; if it is false, then the connection is not via SSH.
As we saw earlier, testing for a zero string value is useful when deciding if a variable is set:
test -z "$1"
[ -z "$1" ]
A true result for this query means that no input parameters have been supplied to the script.
Testing integers
As well as testing string values, bash scripts can test for integer values and whole numbers. One way of testing the input of a script is to count the
number of positional parameters and test that the number is above 0:
test $# -gt 0
[ $# -gt 0 ]
In relation to positional parameters, the variable $# represents the number of parameters passed to the script. To test equality of integer
values, the -eq option is used, not the = symbol.
Testing file types
While testing for values we can test for the existence of a file or file type. For example, we may only want to delete a file if it is a symbolic link. I use this
while compiling a kernel. The /usr/src/linux directory should be a symbolic link to the latest kernel source code. If I download a newer version before
compiling the new kernel, I need to delete the existing link and create a new link. Just in case someone has created the /usr/src/linux directory, we
can test it as a link before removing it:
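A sketch of that pattern, using an invented link name so it is safe to run anywhere (for the kernel case, substitute /usr/src/linux):

```shell
ln -sf /tmp/real_dir demo_link   # demo_link is a hypothetical symbolic link
if [ -h demo_link ] ; then       # only remove it if it really is a symlink
    rm demo_link
fi
```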
The -h option tests whether the file is a symbolic link. Other options (such as -f for a regular file and -d for a directory) are covered in the File Tests
section below. More options do exist, so delve into the man pages as you need. We will use different options throughout the book, giving you practical and
useful examples.
File Tests
Several operators test the state of a file. A file’s existence can be tested with -e (or the nonstandard -a). The type of file can be checked with -f for a regular
file, -d for a directory, and -h or -L for a symbolic link. Other operators test for special types of files and for which permission bits are set.
Integer Tests
Comparisons between integers use the -eq, -ne, -gt, -lt, -ge, and -le operators.
The equality of integers is tested with -eq:
$ test 1 -eq 1
$ echo $?
0
$ [ 2 -eq 1 ]
$ echo $?
1
$ [ 2 -ne 1 ]
$ echo $?
0
The remaining operators test greater than, less than, greater than or equal to, and less than or equal to.
String Tests
Strings are concatenations of zero or more characters and can include any character except NUL (ASCII 0). They can be tested for equality or inequality, for
being a nonempty or null string (with -n and -z), and, in bash, for alphabetical ordering. The = operator tests for equality, in other words, whether they are
identical; != tests for inequality. bash also accepts == for equality, but there is no reason to use this nonstandard operator.
The -z and -n operators return successfully if their arguments are empty or nonempty, respectively:
$ [ -z "" ]
$ echo $?
0
$ test -n ""
$ echo $?
1
The greater-than and less-than symbols are used in bash to compare the lexical positions of strings and must be escaped to prevent them from being
interpreted as redirection operators:
$ str1=abc
$ str2=def
$ test "$str1" \< "$str2"
$ echo $?
0
$ test "$str1" \> "$str2"
$ echo $?
1
The previous tests can be combined in a single call to test with the -a (logical AND) and -o (logical OR) operators:
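For example, with sample values:

```shell
test 2 -gt 1 -a abc = abc && echo "both true"          # AND: both expressions succeed
test 2 -lt 1 -o abc = abc && echo "at least one true"  # OR: the second expression succeeds
```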
test is usually used in combination with if or the conditional operators && and ||.
Like test, [[ ... ]] evaluates an expression. Unlike test, it is not a builtin command. It is part of the shell grammar and not subject to the same parsing as a
builtin command. Parameters are expanded, but word splitting and file name expansion are not performed on words between [[ and ]].
It supports all the same operators as test, with some enhancements and additions. It is, however, nonstandard, so it is better not to use it when test could
perform the same function.
Example
String tests with pattern matching
Test if argument starts with a letter
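One way to do this with [[ and pattern matching (the variable and its value are samples standing in for a script argument):

```shell
arg=hello123   # stands in for "$1" in a real script
if [[ $arg == [a-zA-Z]* ]]; then   # unquoted right side: pattern, not literal string
    echo "starts with a letter"
fi
```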
The BRE \([a-z]\)\([a-z]\)[a-z]\2\1 matches five-letter palindromes such as radar: r is captured in the first group, a in the second, [a-z] matches d, and
then \2 and \1 match the remembered a and r.
We get the output:
[mihail@oc8168772081 TEST_INFOACADEMY]$ ./myparams unu doi trei
My name is myparams - I was called as ./myparams
I was called with 3 parameters.
Parameter number 1 is: unu
Parameter number 2 is: doi
Parameter number 3 is: trei
With the single-bracket test, if [ "$a" \< "$b" ], you need to escape the < operator.
The == comparison operator behaves differently within a double-brackets test than within single brackets.
# Find out if our backup job failed or not and notify on screen
[ $? -eq 0 ] && echo "Backup done!" || echo "Backup failed"
SED
[mihail@oc8168772081 ~]$ echo Sunday | sed 's/day/night/'
Sunnight
If you want to switch two words around, you can remember two patterns and change the order around:
[mihail@oc8168772081 ~]$ echo day one |sed 's/\([a-z]*\) \([a-z]*\)/\2 \1/'
one day
Sometimes you want to search for a pattern and add some characters, like parenthesis, around or near the pattern you found.
[mihail@oc8168772081 ~]$ echo day | sed 's/[a-z]*/(&)/'
(day)
If you want to strip the numbers off the end of a line, you can keep only the first remembered pattern. The "\1" is the first remembered pattern, and the
"\2" is the second remembered pattern. Sed has up to nine remembered patterns. A pattern is remembered by enclosing it in escaped parentheses, \( and \).
[mihail@oc8168772081 ~]$ echo day1234 | sed 's/\([a-z]*\).*/\1/'
day
If you want to reverse the first three characters on a line, you can use
[mihail@oc8168772081 ~]$ echo alex |sed 's/^\(.\)\(.\)\(.\)/\3\2\1/'
elax
[mihail@oc8168772081 ~]$ echo oty alex | sed 's/oty alex/oa/'            # replace two words with two letters
oa
[mihail@oc8168772081 ~]$ echo oty alex nijdyag | sed 's/oty alex/oa/'    # the same, but the rest of the line remains
oa nijdyag
[mihail@oc8168772081 ~]$ echo oty alex nijdyag | sed 's/oty alex .*/oa/' # removing the rest of the words too
oa
There are two levels of interpretation here: the shell, and sed.
In the shell, everything between single quotes is interpreted literally, except for single quotes themselves. You can effectively have a single quote between
single quotes by writing '\'' (close single quote, one literal single quote, open single quote).
Sed uses basic regular expressions. In a BRE, in order to have them treated literally, the characters $.*[\^ need to be quoted by preceding them by a
backslash, except inside character sets ([…]). Letters, digits and (){}+?| must not be quoted (you can get away with quoting some of these in some
implementations). The sequences \(, \), \n, and in some implementations \{, \}, \+, \?, \| and other backslash+alphanumerics have special
meanings. You can get away with not quoting $^ in some positions in some implementations.
Furthermore, you need a backslash before / if it is to appear in the regex outside of bracket expressions. You can choose an alternative character as the
delimiter by writing, e.g., s~/dir~/replacement~ or \~/dir~p; you'll need a backslash before the delimiter if you want to include it in the BRE. If
you choose a character that has a special meaning in a BRE and you want to include it literally, you'll need three backslashes; I do not recommend this, as it
may behave differently in some implementations.
• & and \ need to be quoted by preceding them by a backslash, as do the delimiter (usually /) and newlines.
• \ followed by a digit has a special meaning. \ followed by a letter has a special meaning (special characters) in some implementations, and \
followed by some other character means \c or c depending on the implementation.
• With single quotes around the argument (sed 's/…/…/'), use '\'' to put a single quote in the replacement text.
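The '\'' trick in action, putting an apostrophe into the replacement text:

```shell
# '\'' = close the single-quoted string, one escaped literal quote, reopen it
echo day | sed 's/day/don'\''t/'
# prints: don't
```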
If the regex or replacement text comes from a shell variable, remember that the quoting rules above apply to the variable's expanded value as well; the shell does not escape it for you.
If I try to run this in sh (dash here), it'll fail because of the parentheses, which need to be escaped.
The problem you're experiencing isn't due to shell interpolating and escapes - it's because you're attempting to use extended
regular expression syntax without passing sed the -r or --regexp-extended option.
Change your sed line from
sed 's/(127\.0\.1\.1)\s/\1/' [some file]
to
sed -r 's/(127\.0\.1\.1)\s/\1/' [some file]
By default sed uses basic regular expressions (think grep style), which would require the following syntax:
sed 's/\(127\.0\.1\.1\)[ \t]/\1/' [some file]
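The two syntaxes side by side (the -E flag is the modern synonym of GNU sed's -r and is also accepted by BSD sed):

```shell
line='127.0.0.1 localhost'

# BRE: group with escaped parentheses
echo "$line" | sed 's/\([0-9.]*\) .*/\1/'

# ERE: plain parentheses
echo "$line" | sed -E 's/([0-9.]*) .*/\1/'

# both print: 127.0.0.1
```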
===============
If you try
[mihail@oc8168772081 TEST_INFOACADEMY]$ echo ${pwd##*/}
you get an empty result (it doesn't work): lowercase pwd is an unset shell variable, so there is nothing to strip. pwd is the command; the shell variable that holds the current directory is PWD.
[mihail@oc8168772081 TEST_INFOACADEMY]$
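The fix is uppercase PWD, the shell variable that tracks the current directory; ${PWD##*/} strips everything up to the last slash:

```shell
cd /tmp
echo "${PWD##*/}"   # strips everything up to the last '/': prints tmp
basename "$PWD"     # same result with an external command
```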
Database connection and retrieving values
When connecting with a wallet, credentials are kept in the wallet on the client side. Once stored, you can connect to the database using sqlplus /@connect_string
Create an Oracle Wallet
-create
If you schedule cron jobs through the oracle user, keep the privileges as they are. Please note that any user with read permission on these files can log in to the database.
So it's like your house key, which you want to keep safely with you.
The next step is to add the database credential to the wallet. Before that, create a tnsnames entry you will use to access the database
AMIT_TEST11R2 =
(DESCRIPTION =
(ADDRESS = (PROTOCOL = TCP)(HOST = db11g)(PORT = 1521))
(CONNECT_DATA =
(SERVER = DEDICATED)
(SERVICE_NAME = test11r2)
)
)
#!/bin/sh
VALUE=`sqlplus $DB_USERNAME/$PASSWORD@//$HOST_NAME:$DB_PORT/$DB_SID <<END
set pagesize 0 feedback off verify off heading off echo off
SELECT ID FROM TEST_USERS WHERE USER_NAME='$SAMPLE_USER';
exit;
END`
if [ -z "$VALUE" ]; then
echo "No rows returned from database"
exit 0
else
echo $VALUE
fi
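The heredoc-capture plumbing itself can be tried without a database; here cat is a hypothetical stand-in that simply echoes back what sqlplus would print as query output:

```shell
#!/bin/sh
# cat stands in for sqlplus: it echoes the heredoc back, the way sqlplus
# would print query results, so the capture-and-check logic is testable.
VALUE=`cat <<END
42
END`
if [ -z "$VALUE" ]; then
    echo "No rows returned from database"
else
    echo "$VALUE"
fi
```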
or with a wallet
#!/bin/sh
export TNS_ADMIN=/home/css/BEX_GW/wallet
VALUE=`sqlplus /@NLS_USER <<END
set pagesize 0 feedback off verify off heading off echo off
SELECT ID FROM TEST_USERS WHERE USER_NAME='$SAMPLE_USER';
exit;
END`
if [ -z "$VALUE" ]; then
echo "No rows returned from database" ------ must be manually entered
exit 0
else
echo $VALUE
fi
SQLPLUS hr@(DESCRIPTION=
(ADDRESS=(PROTOCOL=tcp)(HOST=sales-server)(PORT=1521) )
(CONNECT_DATA=
(SERVICE_NAME=sales.us.acme.com) ) )
=======================
The answer shown above works fine if you are trying to retrieve a single value from sqlplus. If you have a sqlplus script that returns multiple columns, you
could read them into shell variables like this:
sqlplus / @myscript.sql | read var1 var2 var3
This will read 3 columns into var1, var2, and var3. Make sure that if you do this, you don't have blank or null values coming back, otherwise the "read"
command will skip over the blanks/nulls and get the variable assignments out of sync. Note also that piping into "read" only preserves the variables in shells
like ksh or zsh; in bash the "read" runs in a subshell and the values are lost when it exits.
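A minimal bash demonstration of the difference, with a here-string as the bash-friendly alternative:

```shell
#!/bin/bash
# In bash, read on the right side of a pipe runs in a subshell: var1 stays empty
echo "a b c" | read var1 var2 var3
echo "after pipe: '$var1'"

# A here-string keeps read in the current shell, so var1 survives
read var1 var2 var3 <<< "a b c"
echo "after here-string: '$var1'"
```

The first echo prints an empty value, the second prints 'a'.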
Another variation: if you are retrieving multiple rows as well as columns in your sqlplus script, and want to loop over the rows:
sqlplus / @myscript.sql | while read var1 var2 var3
do
<more shell stuff here>
done
A specific example:
in emps.sql:
exit;
In test.sh:
Regular expressions
We have anchors , character sets and modifiers
There are also two types of regular expressions: the "Basic" regular expression, and the "extended" regular expression. A few utilities like awk and egrep
use the extended expression.
Anchors ^ and $
The regular expression "^A" will match all lines that start with a capital A. The expression "A$" will match all lines that end with the capital A.
If the anchor characters are not used at the proper end of the pattern, then they no longer act as anchors. That is, the "^" is only an anchor if it is the first
character in a regular expression. The "$" is only an anchor if it is the last character. The expression "$1" does not have an anchor. Neither is "1^".
If you need to match a "^" at the beginning of the line, or a "$" at the end of a line, you must escape the special characters with a backslash. Here is a
summary:
Pattern Matches
^A "A" at the beginning of a line
A$ "A" at the end of a line
\^ a literal "^" anywhere on the line
\$ a literal "$" anywhere on the line
Character sets. The simplest character set is a single character. The regular expression "the" contains three character sets: "t", "h" and "e". It will match any line
with the string "the" inside it. This would also match the word "other". To prevent this, put spaces before and after the pattern: " the ".
Use this pattern with grep to print every address in your incoming mail box:
grep '^From: ' /usr/spool/mail/$USER
Some characters have a special meaning in regular expressions. If you want to search for such a character, escape it with a backslash.
The character "." is one of those special meta-characters. By itself it will match any character except the end-of-line character. The pattern that will match a
line with a single character is
^.$
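A quick check with grep:

```shell
printf 'a\nbb\n\n' | grep '^.$'
# prints only: a  (the two-character line and the empty line do not match)
```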
Regular Expression Matches
[] The characters "[]"
[0] The character "0"
[0-9] Any number
[^0-9] Any character other than a number
[-0-9] Any number or a "-"
[0-9-] Any number or a "-"
[^-0-9] Any character except a number or a "-"
[]0-9] Any number or a "]"
[0-9]] Any number followed by a "]"
[0-9-z] Any number, or any character between "9" and "z".
[0-9\-a\]] Any number, or a "-", a "a", or a "]"
The special character "*" matches zero or more copies. That is, the regular expression "0*" matches zero or more zeros, while the expression "[0-9]*"
matches zero or more numbers.
. (dot) matches any character.
There is a special pattern you can use to specify the minimum and maximum number of repeats. This is done by putting those two numbers between "\{" and
"\}". The backslashes deserve a special discussion. Normally a backslash turns off the special meaning for a character. A period is matched by a "\." and an
asterisk is matched by a "\*".
If a backslash is placed before a "<," ">," "{," "}," "(," ")," or before a digit, the backslash turns on a special meaning.
This was done because these special functions were added late in the life of regular expressions. Changing the meaning of "{" would have broken old
expressions.
The regular expression to match 4, 5, 6, 7 or 8 lower case letters is
[a-z]\{4,8\}
Any numbers between 0 and 255 can be used.
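For instance, the [a-z]\{4,8\} range above, checked with grep's -x flag (whole-line match):

```shell
# -x makes grep match the entire line against the pattern
printf 'abc\nhello\nencyclopedia\n' | grep -x '[a-z]\{4,8\}'
# prints: hello  (abc is too short, encyclopedia too long)
```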
Searching for a word isn't quite as simple as it at first appears. The string "the" will match the word "other". You can put spaces before and after the letters
and use this regular expression: " the ". However, this does not match words at the beginning or end of the line. And it does not match the case where there is
a punctuation mark after the word.
The characters "\<" and "\>" are similar to the "^" and "$" anchors, in that they don't occupy the position of a character. They anchor the expression
to match only on a word boundary. The pattern to search for the word "the" would be "\<[tT]he\>". The character before the "t" must be either a new
line character or anything except a letter, number, or underscore. The character after the "e" must likewise be a character other than a number, letter, or
underscore, or it can be the end-of-line character.
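Note that \< and \> are GNU extensions (GNU grep/sed); a quick check:

```shell
printf 'the\nother\nthen\n' | grep '\<the\>'
# prints: the  (neither "other" nor "then" matches on a word boundary)
```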
Another pattern that requires a special mechanism is searching for repeated words. The expression "[a-z][a-z]" will match any two lower case letters. If you
wanted to search for lines that had two adjoining identical letters, the above pattern wouldn't help. You need a way of remembering what you found, and
seeing if the same pattern occurred again. You can mark part of a pattern using "\(" and "\)". You can recall the remembered pattern with "\" followed by a
single digit. Therefore, to search for two identical letters, use "\([a-z]\)\1". You can have 9 different remembered patterns. Each occurrence of "\(" starts a
new pattern. The regular expression that would match a 5 letter palindrome, (e.g. "radar"), would be
\([a-z]\)\([a-z]\)[a-z]\2\1
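The palindrome pattern tried with grep (-x matches the whole line):

```shell
printf 'radar\nlevel\nhello\n' | grep -x '\([a-z]\)\([a-z]\)[a-z]\2\1'
# prints: radar and level, but not hello
```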
=====================================
#!/bin/bash
. propFile #source the file in order to have the variable pathTest1 available
Ex2
The script test_var.sh contains the single line "echo Color is $COLOR". Without "source", the variable COLOR defined in the current shell is not seen by the script
(which is started in a subshell).
[mihail@oc8168772081 scripturi]$ vim test_var.sh
[mihail@oc8168772081 scripturi]$ COLOR=BLUE
[mihail@oc8168772081 scripturi]$ chmod +x test_var.sh
[mihail@oc8168772081 scripturi]$ ./test_var.sh
Color is
[mihail@oc8168772081 scripturi]$ source ./test_var.sh
Color is BLUE
[mihail@oc8168772081 scripturi]$
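The same experiment as a self-contained sketch, using a throwaway script created with mktemp:

```shell
#!/bin/bash
# Single quotes keep $COLOR unexpanded when writing the script file
script=$(mktemp)
echo 'echo Color is $COLOR' > "$script"
chmod +x "$script"

COLOR=BLUE        # not exported, so a child process cannot see it
bash "$script"    # subshell:      prints "Color is"
source "$script"  # current shell: prints "Color is BLUE"
rm -f "$script"
```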
Example of logging
We have a log() function in a script logging.sh
#!/bin/bash
LOG_LEVEL_INFO=3
LOG_LEVEL_WARNING=2
LOG_LEVEL=$LOG_LEVEL_INFO
LOG_LABEL_INFO=INFO
LOG_LABEL_WARNING=WARNING
LOG_TIMESTAMP="%Y.%m.%d %H:%M.%S"
function log() {
LEVEL=$1
LABEL="$2"
shift 2
[[ $LOG_LEVEL -ge $LEVEL ]] && echo "Logged the text [`date +\"$LOG_TIMESTAMP\"` ] $LABEL : $*"|tee -a $Log
VALUE=returned2
return 0
}
function log_info() {
log $LOG_LEVEL_INFO "$LOG_LABEL_INFO" $*
}
function log_warning() {
log $LOG_LEVEL_WARNING "$LOG_LABEL_WARNING" $*
}
=================================================
The script that tests the function (the script tests several things)
#!/bin/bash
. logging.sh
var="int sit dev"
for f in $var
do
echo $f
done
P=/home/mihail/TEST_INFOACADEMY/mihai
Log=/home/mihail/TEST_INFOACADEMY/log.test
TST1=/home/mihail/TEST_INFOACADEMY/TEST1
if [ ! -d $TST1 ];then
echo "dir does not exist"
else
echo "dir exists" && cp /home/mihail/TEST_INFOACADEMY/testawk1 $TST1
fi
log_info "this should be logged" ;
echo $LOG_LEVEL_INFO
echo VALUE is $VALUE
param=""
[[ -z $param ]] && echo "param is empty"
log_warning "WARNING is WORKING"
echo finished
OBJECT_NAME OBJECT_TYPE
------------------------- -------------------
EMP_DEPT CLUSTER
EMP TABLE
DEPT TABLE
EMP_DEPT_INDEX INDEX
PUBLIC_EMP SYNONYM
EMP_MGR VIEW
When you create a view or a synonym, the view or synonym is based on its underlying base object. The ALL_DEPENDENCIES,
USER_DEPENDENCIES, and DBA_DEPENDENCIES data dictionary views can be used to reveal the dependencies for a view. The
ALL_SYNONYMS, USER_SYNONYMS, and DBA_SYNONYMS data dictionary views can be used to list the base object of a synonym. For
example, the following query lists the base objects for the synonyms created by user jward:
if [[ $1 == dev ]];then
echo "you chose dev"
elif [[ $1 == sit ]];then
echo "you chose sit"
echo -n "Continue ? Y/N "
read answer
if [[ $answer == Y ]]; then
(shift; "progname" $* ) | grep $1
else
exit 0
fi
else
echo "you did not choose correctly"
fi
echo -n "enter the environment and press Enter "
read env
if [[ $env == dev ]];then
echo "you chose dev"
elif [[ $env == sit ]];then
echo "you chose sit"
else
echo "you did not choose well"
fi
=====================================================================================
Give permissions recursively
chmod -R <permissionsettings> <dirname>
chmod -R 755 will set these permissions on every file and folder in the tree. But why on earth would you want to? It might make sense for the directories,
but why set the execute bit on all the files?
I suspect what you really want to do is set the directories to 755 and either leave the files alone or set them to 644. For this, you can use the find
command. For example:
find <dirname> -type d -exec chmod 755 {} \;
find <dirname> -type f -exec chmod 644 {} \;
chmod 644 {} \; specifies the command that will be executed by find for each file. {} is replaced by the file path, and the semicolon denotes the end of
the command (escaped, otherwise it would be interpreted by the shell instead of find).
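The find-plus-chmod split can be checked on a throwaway tree (stat -c '%a' prints the octal mode on GNU stat; on BSD/macOS the equivalent is stat -f '%Lp'):

```shell
#!/bin/bash
top=$(mktemp -d)
mkdir -p "$top/sub"
touch "$top/sub/file.txt"

# directories -> 755, regular files -> 644
find "$top" -type d -exec chmod 755 {} \;
find "$top" -type f -exec chmod 644 {} \;

stat -c '%a' "$top/sub"           # 755
stat -c '%a' "$top/sub/file.txt"  # 644
rm -rf "$top"
```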
========================================================================================================
Commands combined
sudo apt-get update && sudo apt-get install pyrenamer
A || B Run B if A failed
==============================================================================================================================
========================================================================================================
NETWORKING
Assuming that it's a TCP (rather than UDP) port that you're trying to use:
1. On the server itself, use netstat -an to check to see which ports are listening
2. From outside, just telnet host port to see if the connection is refused, accepted, or times out
On Windows 7 or Vista, by default 'telnet' is not recognized as an internal or external command, operable program or batch file. To fix this, just enable it:
Click Start, Control Panel, Programs, then Turn Windows Features on or off. In the list, scroll down, select Telnet Client and click OK.
No process is listening.
This is by far the most common reason for the message. First ensure that you are trying to connect to the correct system. If you are, then to determine whether
this is the problem, run one of the following on the remote system (1), e.g. if you are expecting a process to be listening on port 22222:
netstat -tnlp | grep :22222
or
ss -tnlp | grep :22222
If nothing is listening then the above will produce no output. If you see some output then confirm that it's what you expect then see the firewall section
below.
If you don't have access to the remote system and want to confirm the problem before reporting it to the relevant administrators you can use tcpdump
(wireshark or similar).
When a connection is attempted to an IP:port where nothing is listening, the response from the remote system to the initial SYN packet is a packet with the
flags RST,ACK set. This closes the connection and causes the Connection Refused message e.g.
$ sudo tcpdump -n host 192.0.2.1 and port 22222
tcpdump: verbose output suppressed, use -v or -vv for full protocol decode
listening on enp14s0, link-type EN10MB (Ethernet), capture size 262144 bytes
12:31:27.013976 IP 192.0.2.2.34390 > 192.0.2.1.22222: Flags [S], seq 1207858804, win 29200, options [mss
1460,sackOK,TS val 15306344 ecr 0,nop,wscale 7], length 0
12:31:27.020162 IP 192.0.2.1.22222 > 192.0.2.2.34390: Flags [R.], seq 0, ack 1207858805, win 0, length 0
If the port is blocked by a firewall and the firewall has been configured to respond with icmp-port-unreachable this will also cause a connection
refused message. Again you can see this with tcpdump (or similar)
$ sudo tcpdump -n icmp
tcpdump: verbose output suppressed, use -v or -vv for full protocol decode
listening on enp14s0, link-type EN10MB (Ethernet), capture size 262144 bytes
13:03:24.149897 IP 192.0.2.1 > 192.0.2.2: ICMP 192.0.2.1 tcp port 22222 unreachable, length 68
Note that this also tells us where the blocking firewall is.
So now you know what's causing the Connection refused message you should take appropriate action e.g. contact the firewall administrator or investigate
the reason for the process not listening.
(1) Other tools are likely available.
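When telnet is unavailable, bash's /dev/tcp pseudo-device gives a quick port probe (a bash-only feature; the host and port below are placeholders). A refused connection makes the redirection fail, so the else branch fires for a closed port:

```shell
#!/bin/bash
host=127.0.0.1
port=9   # the "discard" port, which almost never has a listener
if (exec 3<>"/dev/tcp/$host/$port") 2>/dev/null; then
    echo "port $port open"
else
    echo "port $port closed or filtered"
fi
```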