Changing the Timezone in Linux
1) Change to the directory /usr/share/zoneinfo, where you can find the list of timezone regions.
2) Move your existing timezone file out of the way:
mv /etc/localtime /etc/localtime-old
3) Link the timezone you need to /etc/localtime:
ln -sf /usr/share/zoneinfo/America/Los_Angeles /etc/localtime
4) Update the system time:
/usr/bin/rdate -s time-a.nist.gov
5) Set the ZONE entry in the /etc/sysconfig/clock file (e.g. "America/Los_Angeles").
6) Sync the hardware clock: /sbin/hwclock --systohc
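Putting the steps together, a consolidated sketch (run as root; the zone name is only an example, pick yours from /usr/share/zoneinfo):
mv /etc/localtime /etc/localtime-old
ln -sf /usr/share/zoneinfo/America/Los_Angeles /etc/localtime
/usr/bin/rdate -s time-a.nist.gov      # sync the system clock from a time server
vi /etc/sysconfig/clock                # set ZONE="America/Los_Angeles"
/sbin/hwclock --systohc                # write the system time to the hardware clock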
Wednesday, 8 October 2008
Visual block in VI
Do you need to perform multiple line edits in a single stretch?
I came across VISUAL BLOCK mode in the vi editor.
Ctrl+v puts you into visual block mode.
Select multiple lines with the movement keys.
Shift+i puts you into insert mode.
Enter the text to add at the beginning of the block.
Press Esc twice to apply the change to all selected lines.
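For example, to comment out five lines of a shell script (the line count is just an illustration):
Ctrl+v   (start the visual block on the first line)
4j       (extend the block down four more lines)
Shift+i  (enter insert mode for the block)
#        (type the text to prepend)
Esc      (the # appears at the start of all five lines)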
Thursday, 25 September 2008
PERL script for extracting log file based on time
The code below extracts entries from a huge log file based on the time frame passed in.
#!/usr/bin/perl
use strict;

my $file;
my $times = 0;
my $needed_response_time = 0;
my $start;
my $end;

sub Usage
{
    my ($msg) = $_[0];
    if ($msg)
    {
        print STDERR "$msg\n";
    }
    print STDERR << "EOHELP";
Usage: split_access_log_by_time
options:
    -t :: time interval for which the log is needed
    -g
EOHELP
    exit 1;
}
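If you only need the lines between two timestamps and not the full script, awk's range pattern does the job on its own (the timestamps and file name below are only illustrative):
awk '/09\/Sep\/2008:03:57:10/,/09\/Sep\/2008:03:57:12/' access.log > access_window.log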
PERL script to combine two csv files
The script below combines two CSV files based on their first column. It assumes both files have project names in the first column, and takes a third file listing the project names to report on.
#!/usr/bin/perl -w
$FILE1 = $ARGV[0];
$FILE2 = $ARGV[1];
$PROJECTFILE = $ARGV[2];
my %File1Map = ();
my %File2Map = ();

# Read the first CSV: key = project name (first column), value = the rest of the line.
open(FILE1) or die("Could not open $FILE1 file.");
foreach $line (<FILE1>) {
    # ($Project,$Attachments,$Announcements,$Documents,$Discussions,$Users,$ArtifactTypeCount,$Alm) = split(',',$line);
    ($Project,$values) = split(',',$line,2);
    # my $tempval="$Attachments,$Announcements,$Documents,$Discussions,$Users,$ArtifactTypeCount,$Alm";
    my $tempval = $values;
    # print "$Project,$values";
    $tempval =~ s/\r|\n//g;
    $File1Map{"$Project"} = "$tempval";
}
close(FILE1);

# Read the second CSV: key = project name, value = its count column.
open(FILE2) or die("Could not open $FILE2 file.");
foreach $subline (<FILE2>) {
    ($proj,$izcount) = split(',',$subline);
    $File2Map{"$proj"} = "$izcount";
}
close(FILE2);

# For every project listed in the project file, print the merged row.
open(PROJECTFILE) or die("Could not open $PROJECTFILE file.");
foreach $projname (<PROJECTFILE>) {
    ($name,$dummy) = split('\n',$projname);
    my $existingValue = $File1Map{"$name"};
    my $newValue = $File2Map{"$name"};
    if (! defined $existingValue)
    {
        $existingValue = ",,,,,,,,";
    }
    if (! defined $newValue)
    {
        $newValue = "\n";
    }
    print "" . $name . "," . $existingValue . "," . $newValue;
}
close(PROJECTFILE);
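A usage sketch (the script and file names are illustrative; the merged CSV goes to stdout):
perl combine_csv.pl project_stats.csv issue_counts.csv project_names.txt > combined.csv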
Thursday, 18 September 2008
AWK IF ELSE block
nawk -F"," ' BEGIN{OFS=","} {if (NR==1) {
print "XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXx"
} else {
print $1,$2,$3,$4,"S",1
print $1,$2,$3,$5,"S",2
print $1,$2,$3,$6,"D",1
print $1,$2,$3,$7,"D",2
}}' in.txt
Extract the total number of request hits
Below is the command to calculate the number of times each project was accessed via its URL. The information is retrieved from the log file where all requests are logged.
cut -d ' ' -f 7 ssl_request_log |grep '/sf/'|cut -d '/' -f 3|sort|uniq -c
cut -d ' ' -f 7 ssl_request_log |grep '/sf/'|sort|uniq -c
Sample log file
[31/Aug/2008:05:50:08 +0200] 0.0.0.0 SSLv3 RC4-MD5 "GET /asf-help/RoboHelp_CSH.js HTTP/1.1" 7677
[31/Aug/2008:05:50:08 +0200] 0.0.0.0 SSLv3 RC4-MD5 "GET /wsf-images/masthead/logo.gif HTTP/1.1" 2430
[31/Aug/2008:05:50:09 +0200] 0.0.0.0 SSLv3 RC4-MD5 "GET /fsf-images/masthead/dropdown.gif HTTP/1.1" 49
[31/Aug/2008:05:50:09 +0200] 0.0.0.0 SSLv3 RC4-MD5 "GET /nsf-images/masthead/help.gif HTTP/1.1" 402
Wednesday, 10 September 2008
Script to copy contents of file between two specific line patterns
Here is a script to copy the contents of a file between two line patterns.
For example:
To copy the contents of a file between a particular start date and end date.
if [ $# -ne 3 ]; then
echo 1>&2 "Usage: $0 file_to_be_parsed starting_pattern ending_pattern"
exit 127
fi
startline=`grep -ni -m1 "$2" "$1" | awk -F ":" '{print $1}'`
endline=`grep -ni "$3" "$1" | tail -n1 | awk -F ":" '{print $1}'`
totallineno=`expr $endline - $startline + 1`
echo "Extracting contents between" $2 " and " $3
echo "No of lines found: " $totallineno
echo "Copying contents to temp file: " temp_file
echo "`head -n$endline $1 | tail -n $totallineno`" > temp_file
echo "Done copying."
Convert your DVD to other suitable formats
Need to convert your DVD?
Below is the command to do it.
for i in 3 4 5;   # loop over the DVD chapters
do
mencoder dvd://1 -chapter $i-$i -o chapter$i.avi -oac mp3lame -ovc lavc -vf scale=320:240 -lavcopts vmax_b_frames=0:acodec=mp3:abitrate=128:vbitrate=256 -ofps 30;
done
Looping this way makes the job easy across all the chapters.
AWK to calculate sum, min, max of particular field in txt
Recently, I came across a situation where I had to calculate the sum, min, and max of a particular field using awk.
Below is the snippet I've used to calculate the field values.
awk 'BEGIN{
FS=OFS=",";
max = -999999999; min = 9999999999;
print "----------------------------"
}
{
sub(" Total requests: ","",$2);
val = int($2);
lines+=1;sum+=val;
if( val > max ) max = val;
if ( val < min ) min = val;
}
END{
print "| Total No of Lines: "lines " Sum: " sum " Average: " sum/lines " Max: " max " Min: " min
}' with_out.txt
Text File Contents for parsing:
09/Sep/2008:03:57:09, Total requests: 36, New requests: 34
09/Sep/2008:03:57:10, Total requests: 61, New requests: 58
09/Sep/2008:03:57:11, Total requests: 40, New requests: 38
09/Sep/2008:03:57:12, Total requests: 101, New requests: 97
09/Sep/2008:03:57:13, Total requests: 33, New requests: 35
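Running the snippet over the sample file above should print something like:
----------------------------
| Total No of Lines: 5 Sum: 271 Average: 54.2 Max: 101 Min: 33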
Thursday, 28 August 2008
AWK Samples
Here are some of the useful awk program collections!
awk '{print $2,$1}' filename
awk '$1 > $2 {print $1,$2,$1-$2}' filename
awk '$1 > $2' filename
awk '$1 > $2{print}' filename
awk '$1 > $2{print $0}' filename
awk '$1=="foo"{print $2}' filename
awk '/foo.*bar/{print $1,$3}' filename
awk '$2~/foo/{print $3,$1}' filename
awk '$2!~/foo/{print $3,$1}' filename
awk '/foo/,/bar/' filename
awk 'BEGIN{print"fee"} $1=="foo"{print"fi"}
END{print"fo fum"}' filename
awk '{print $1,$NF }' filename
awk '{print NR,$0 }' filename
awk -F: '{print $1,$3 }' /etc/passwd
awk '{$10=""; print }' filename
awk '{for(i=1;i<=NF;i++) print $i }' filename
awk '{printf("%s %03d %02d %.15g\n",$1,$2,$3,$3/$2); }' filename
awk '{for(i=NF;i > 0;i--) printf("%s",$i); printf("\n"); }' filename
Replace particular field of line using awk
I was trying to replace a particular field of a line and found a suggestion for it.
echo 'count ****** 50' | gawk '{$2="\"5AB0\"";print}'
The above gawk command replaces the ****** with "5AB0".
Wednesday, 20 August 2008
To delete a Windows partition from Ubuntu
1) Command to see the partitions:
sudo fdisk -l
2) Mount the Windows partition:
sudo mount /dev/sda1 /mnt/
3) Look at the files and verify the data:
ls -rlt
4) Then look for the available filesystem types (type mkfs. and press Tab to list them):
sudo mkfs.
5) Create the new file system:
sudo mkfs.ext3 /dev/sda1
6) Finally, mount the newly created partition:
sudo mount /dev/sda1 /mnt/
To update the file system table:
sudo vim /etc/fstab
To remove Windows from the boot menu:
cd /boot/grub/
sudo vim menu.lst
sudo grub-install /dev/sda
For reference:
prakash@prakash-laptop:/boot/grub$ cat /etc/fstab
# /etc/fstab: static file system information.
#
#
proc /proc proc defaults 0 0
# /dev/sda3
UUID=38683652-4d42-4885-a18a-aa1fede0ca9d / ext3 defaults,errors=remount-ro 0 1
# /dev/sda1
UUID=C2C8BE9CC8BE8DE3 /media/sda1 ntfs defaults,nls=utf8,umask=007,gid=46 0 1
# /dev/sda5
#UUID=F0F6AC07F6ABCC62 /media/sda5 ntfs defaults,nls=utf8,umask=007,gid=46 0 1
# /dev/sda4
UUID=cc9c91ad-8b78-417f-9e49-beee5eeb1486 none swap sw 0 0
/dev/scd0 /media/cdrom0 udf,iso9660 user,noauto 0 0
# /dev/sda5
/dev/sda5 /home ext3 defaults,errors=remount-ro 0 1
# /dev/sda1
/dev/sda1 /data ext3 defaults,errors=remount-ro 0 1
Wednesday, 6 August 2008
How to combine avi files together
Hi,
I was trying to combine two AVI files and finally got a good output. Here are the commands I've used to do it.
1) First rename the files so they are numbered sequentially, then concatenate them:
cat v1.avi v2.avi > vtest.avi
2) Then invoke mencoder to get the audio and video synchronization.
mencoder -forceidx -oac copy -ovc copy vtest.avi -o vtest_final.avi
Monday, 14 July 2008
Cowon d2 avi file support
I bought a new Cowon D2 portable media player and it sounds great.
But it can't play AVI files downloaded from the internet, so I need to convert them with the command below:
mencoder Kallai_Mattum_Kandal.avi -o Kallai_Mattum_Kandal_Cowon.avi -ovc lavc -vf scale=320:240 -oac mp3lame -lavcopts vmax_b_frames=0:acodec=mp3:abitrate=128:vbitrate=256
To convert several files in one go, loop over the arguments:
for i in $*
do
mencoder $i -o $i"_Cowon.avi" -ovc lavc -vf scale=320:240 -oac mp3lame -lavcopts vmax_b_frames=0:acodec=mp3:abitrate=128:vbitrate=256 -mc 30
done
The above makes the player play the files with good video and sound.
Thursday, 3 July 2008
Create master and slave subversion repository setup using svnsync
I've learned one more svn command, svnsync, which I had never tried before. I thought sharing this would be useful for svn buddies.
Basically this command replicates an svn repository, i.e. one master to any number of slave repositories.
How to (a consolidated sketch of the commands follows after the steps):
1. Create the slave repository. Follow the steps below:
a. Create a new repository.
b. Then you need to alter some scripts in the newly created repository in order for it to accept this setup.
c. Get into the hooks directory of the newly created repository and rename pre-revprop-change.tmpl to pre-revprop-change.
d. Give pre-revprop-change executable permission.
e. Then edit pre-revprop-change and add exit 0 at the beginning of the file.
2. Execute the commands below to point the newly created slave at the master repository:
1. svnsync init slaverepositoryurl masterrepositoryurl
The above is a one-time setup command, i.e. it registers the master repository as the source for the slave.
2. svnsync sync slaverepositoryurl
The above command is invoked whenever you need to sync the slave with the master repository.
Similarly you can have any number of slaves for the master repository.
I've also heard that this is not the right way to use it in production, but anyhow I'm posting this as a starting point.
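A consolidated sketch of the steps, assuming local file:// URLs and illustrative repository paths:
svnadmin create /var/svn/slave-repo                    # 1a. create the slave repository
cd /var/svn/slave-repo/hooks                           # 1c. its hooks directory
mv pre-revprop-change.tmpl pre-revprop-change          #     rename the template
chmod +x pre-revprop-change                            # 1d. make it executable
# 1e. simplest edit: have the hook exit 0 so svnsync may change revision properties
printf '#!/bin/sh\nexit 0\n' > pre-revprop-change
# 2. point the slave at the master once, then sync whenever needed
svnsync init file:///var/svn/slave-repo file:///var/svn/master-repo
svnsync sync file:///var/svn/slave-repo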
Monday, 2 June 2008
VIM properties
set shiftwidth=4
set expandtab
set tabstop=4
Put the above snippet in your ~/.vimrc and reload it with :source ~/.vimrc.
Wednesday, 28 May 2008
Making vi load your default settings
If you want your vi editor to load with default settings, i.e. with line numbers displayed and the tab stop width set:
Here is the way to do it...
Create a file ~/.exrc with the settings:
set nu
set tabstop=3
Then every time you open vi, these settings will be applied.
Thursday, 22 May 2008
More about MySQL
MySQL has two main storage engines:
1) InnoDB
2) MyISAM
The difference between the two is that the former supports transactions, foreign-key constraints, etc., while the latter does not. This makes MyISAM tables relatively faster for retrieval compared to InnoDB, but constraint checks and transactional safety are not available with it.
Related to performance engineering:
How do you find the total number of rows processed when executing a query, and other information?
Use EXPLAIN, e.g. EXPLAIN SELECT * FROM table;
Related to mysqldump:
How do you see the CREATE TABLE script of an existing table?
Use SHOW CREATE TABLE <table_name>;
Related to processes:
How do you find the process running in MySQL when you execute a query? (e.g. executing a query from a Java client will create a process in the background.)
Use SHOW PROCESSLIST;
Wednesday, 21 May 2008
SSH key generation
We end up re-entering the password every time we connect to a remote machine using ssh.
Here is the way to avoid re-entering the password each time: generate a key pair and copy the public key to the remote machine only once.
ssh-keygen -t dsa
Generating public/private dsa key pair.
Enter file in which to save the key (/home/localuser/.ssh/id_dsa):
Enter passphrase (empty for no passphrase):
Enter same passphrase again:
Your identification has been saved in /home/localuser/.ssh/id_dsa.
Your public key has been saved in /home/localuser/.ssh/id_dsa.pub.
The key fingerprint is:
93:58:20:56:72:d7:bd:14:86:9f:42:aa:82:3d:f8:e5 localuser@mybox.home.com
scp ~/.ssh/id_dsa.pub username@remotemachine:.ssh/authorized_keys
login to remote machine
chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys
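Note that the scp above overwrites any existing authorized_keys on the remote machine; if the file may already exist, appending is safer (a small sketch):
cat ~/.ssh/id_dsa.pub | ssh username@remotemachine 'mkdir -p ~/.ssh && cat >> ~/.ssh/authorized_keys'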
Friday, 16 May 2008
Power of Shell Script
Scripts make the job much easier!
Below is a sample script I wrote to retrieve the count of files modified on each modification date in my home directory. The script also gives the sum of bytes of the files modified on the corresponding dates.
prakash@prakash-laptop:~$ for filedate in `ls -l | awk '{print $6}' | sort | uniq -c | awk '{print $2}'`;do ls -l | grep $filedate | awk 'BEGIN{print "----------------------------"} {count+=1;sum+=$5} END{print $6" | " count " | " sum }' ; done
Sample Output
----------------------------
Date | Count | Sum of bytes
----------------------------
2007-10-30 | 1 | 26
----------------------------
2007-11-21 | 1 | 4096
----------------------------
2008-02-06 | 1 | 332341
----------------------------
2008-02-25 | 2 | 8192
----------------------------
2008-03-03 | 1 | 999
----------------------------
2008-03-04 | 1 | 219
----------------------------
2008-03-05 | 1 | 3402
----------------------------
2008-03-06 | 2 | 739209
----------------------------
2008-03-11 | 4 | 6648
----------------------------
2008-03-14 | 10 | 50918
----------------------------
2008-03-15 | 4 | 106573
----------------------------
2008-03-16 | 1 | 10635
Really, scripting makes your job much easier.
Thursday, 15 May 2008
Ignoring Unix signals
Here is an interesting small C program I came across to ignore Unix signals! Basically I'm a Java programmer, so this small C program impressed me a lot :)
The program below ignores the Ctrl+C signal.
test.c
#include <stdio.h>
#include <signal.h>
int main()
{
signal(SIGINT, SIG_IGN);
while (1)
printf("You can't kill me with SIGINT anymore, dude\n");
return 0;
}
gcc test.c -o testout
./testout
Running the above program completely ignores Ctrl+C and keeps running without aborting. Ctrl+C sends the SIGINT signal, and we ignore it in the program.
Tuesday, 13 May 2008
UNIX SIGNALS
UNIX SIGNALS are used to notify a process whenever an event has occurred.
Whenever you press Ctrl+C on a running process, or run kill on a running process id, a signal is sent to the process to notify it of the event.
There are about 64 signals, and they can be viewed using kill -l.
When you press Ctrl+C, SIGINT is sent to the running process.
When you press Ctrl+Z, SIGTSTP is sent to the running process.
When you press Ctrl+\, SIGQUIT is sent to the running process.
Think of what happens when someone presses Ctrl+C when your program is half way through: you end up with resources that were never cleaned up, because you don't have a handler for SIGINT in the program.
You can write a handler for Ctrl+C using trap, as follows, to get control. You cannot install handlers for SIGKILL and SIGSTOP; these two signals cannot be caught, while the others can be.
test.sh
#!/bin/bash
trap "echo 'This is trap executing for 0!'; exit 0" 0
trap "echo 'This is trap executing for SIGINT!'; exit 1" SIGINT
trap "echo 'This is trap executing for SIGKILL!'; exit 1" SIGKILL
trap "echo 'This is trap executing for SIGTERM!'; exit 1" SIGTERM
trap "echo 'This is trap executing for SIGABRT!'; exit 1" SIGABRT
echo "We are running the script. Press Ctrl-C to cause trap to execute!"
read
We can also send signals with kill using the pid.
For example:
kill -15 pid
The above command sends the SIGTERM signal to the pid before terminating it, so you can do your clean-up logic by handling the signal in your program.
Using kill -9 pid
sends the SIGKILL signal, which cannot be caught by the program; this is similar to SIGSTOP, which cannot be caught either.
So it's advisable to use SIGTERM to kill the process first and only then SIGKILL, since we can catch SIGTERM for our clean-up while SIGKILL cannot be caught.
The recommended order for killing a process id is:
kill_pid () {
PID=$1
RETVAL=0
for signal in "TERM" "INT" "HUP" "KILL"; do
kill -$signal $PID
RETVAL=$?
[ $RETVAL -eq 0 ] && break
echo "warning: kill failed: pid=$PID, signal=$signal" >&2
sleep 1
done
return $RETVAL
}
kill_pid 1234
How to find processor 32 bit or 64 bit
Sometimes you may need to know whether your processor is 32-bit or 64-bit, for example when installing packages.
Here is how you can find that information:
getconf LONG_BIT
uname -m
(x86_64 GNU/Linux indicates you have a 64-bit CPU. If you see i386/i486/i586/i686, then it is a 32-bit CPU.)
Friday, 4 April 2008
Facing problem with X11
I've faced some problems with X11 forwarding, so I analysed the X process using ps -ef | grep X.
I found that the X process was started with the argument -nolisten tcp. This prevents TCP connections to X11, so make sure it is disabled in gdm.conf.
I came to know there is an entry in /etc/X11/gdm/gdm.conf which prevents the TCP connection.
After altering it to DisallowTCP=false, restart the X server.
To restart the X server, press Ctrl+Alt+Backspace or restart the system.
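A minimal sketch of the change, assuming the gdm.conf path above (the sed edit keeps a .bak backup):
grep -n 'DisallowTCP' /etc/X11/gdm/gdm.conf                                   # check the current setting
sudo sed -i.bak 's/^DisallowTCP=.*/DisallowTCP=false/' /etc/X11/gdm/gdm.conf
# then restart the X server for the change to take effect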
Tuesday, 1 April 2008
To find opened ports in linux
After a long search I found a good way to discover the open ports on a machine.
It's nmap & netstat.
Syntax
netstat -na | grep 6000
nmap -p 1-65535 localhost
The above command displays the ports open on the machine localhost.
Thursday, 27 March 2008
Mastering VI
D | To delete the remaining characters from the cursor to the end of the current line. |
dd | To delete the entire line and copy it to the clipboard. |
$ | Move the cursor to the end of the current line. |
C-g | To display the filename and number of lines at the bottom of the terminal. |
r | To replace the single character under the cursor. |
R | To replace multiple characters, starting under the cursor, until Escape is pressed. |
dw | To delete from the cursor to the start of the next word. |
Mastering Bash
C-a | Move the cursor to the beginning of the input line. |
C-d | Same as [DEL] (this is the Emacs equivalent). |
C-e | Move the cursor to the end of the input line. |
C-k | Kill, or "cut," all text on the input line, from the character the cursor is underneath to the end of the line. |
C-l | Clear the terminal screen. |
C-u | Kill the entire input line. |
C-y | Yank, or "paste," the text that was last killed. Text is inserted at the point where the cursor is. |
C-_ | Undo the last thing typed on this command line. |
[←] | Move the cursor to the left one character. |
[→] | Move the cursor to the right one character. |
To resize image for thumbnail in linux
Need a thumbnail? Here is the utility I used.
First install ImageMagick:
sudo apt-get install imagemagick
Use convert command to make your thumbnail.
Here 200 is the 200px width
convert -thumbnail 200 DSC01536.JPG DSC01536_thumb.JPG
Here x200 is the 200px height
convert -thumbnail x200 DSC01536.JPG DSC01536_thumb.JPG
Ctrl+r to search the command history in the terminal
"This is used to search already executed commands in the Linux terminal."
We waste time retyping commands for small argument changes, and it is not worth it. You can use Ctrl+r to search the already executed commands.
USAGE:
Press Ctrl+r and enter text to match an already executed command. Matching results will be displayed, so you can reuse them.
Wednesday, 19 March 2008
To alter the MAC address in Windows and Linux
I heard that some internet broadband connections cannot be used except from a specific system, as they are bound to that system's MAC address.
Hmmm, the above raised a question: how do you use that connection on another laptop, etc.?
Got a suggestion: why not change your machine or laptop's MAC address to match the originally bound MAC address?
To change the MAC address in Windows:
Go to,
Settings > Network Connections > local area connection > General Tab > configure > Advanced > Locally Administered Address > value.
Give the value: the MAC address of the bound machine.
To change the MAC address in Linux:
ifconfig eth0 down
ifconfig eth0 hw ether 00:00:00:AB:AC:AA
ifconfig eth0 up
This will use the MAC address temporarily for browsing!
Tuesday, 11 March 2008
About Zenity, a linux command.
Need GTK+ dialogs from a command or shell script? Zenity is the right command. It belongs to a similar family as Xdisplay, display, cDisplay.
Try it, it's awesome. I don't know whether it's built in on Red Hat, but it's there in Ubuntu.
Some quick cracks on that.
To get a calendar:
zenity --calendar
To get input from the user:
zenity --entry
To get Yes or No:
zenity --question --text "Do you need to proceed reading?"
Use the output of the above command as the input of another command using echo $?.
echo $? returns the exit status of the previous command.
answer=`zenity --question --text "You appear to have an AMD64 architecture. Do you want to install the 64-bit version of Songbird?"; echo $?`
if [ $answer = 0 ] ; then
....
....
....
fi
I'd missed this command a lot, or else I would have used it in most of my scripts to get user input :)
Friday, 7 March 2008
GNU Screen && nohup
These commands are useful to keep processes running even after closing the ssh session.
screen - when you need to run the process in the background with the terminal alive for further work or monitoring.
nohup - when you need to run the process in the background without control over the process or a monitor.
Commands for screen
To start the new screen
screen
To detach from the screen with out exiting the session
Ctrl+a, then d
To reattach to the default first screen
screen -r
If more than one screen listed then select the appropriate screen using
screen -x {pid/name}
Command for nohup
nohup command
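For example, to keep a long job running after logout (the script name is illustrative), redirect its output and put it in the background:
nohup ./long_job.sh > long_job.out 2>&1 &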
Tuesday, 4 March 2008
Hibernate it's not "Hibernate in Action"
I was a little bit curious to find out why my Windows OS doesn't consume much power while my laptop is idle, and why it's not the same with Ubuntu.
The reason turned out to be "Hibernate". Yes, there is a Hibernate function in Windows XP which makes the battery last longer. I then used the Hibernate function in Ubuntu too, and it worked as expected!
VPN configuration in Ubuntu
Recently I configured a VPN for the first time, and here are some commands related to it.
Install VPNC
sudo apt-get install vpnc
cd /etc/vpnc/
sudo ls /etc/vpnc
Configure VPNC
sudo vim /etc/vpnc/example.conf
sudo mv /etc/vpnc/example.conf /etc/vpnc/default.conf
sudo vpnc
sudo vim /etc/vpnc/default.conf
To start VPNC
sudo vpnc
To Disconnect VPNC
sudo vpnc-disconnect
ps ax|grep network
Restart Network
sudo /etc/init.d/networking restart
ping yahoo.com
Tuesday, 26 February 2008
To split and analyze the log files.
Whenever the log file is large, try splitting it and viewing the pieces!
Calculate the length of the log file:
wc -l log-file.log
Then split the file based on the total number of lines.
For example, consider the number of lines to be 1000; we can then split it into files of 100 lines each using the command below.
split -l100 log-file.log log-file.log.
This will result in multiple files of 100 lines each. The file names will have aa, ab, ... appended to them.
To rejoin the files, use the command below:
$ cat log-file.log.* > log-file.joined.log
What for /dev/null?
Do you know why they redirect to /dev/null?
/dev/null, or the null device, is a special file that discards all data written to it.
The null device is typically used for disposing of unwanted output streams of a process. It's equivalent to saying "don't bother about the results".
ex:
ls -l *.txt > name.txt 2>/dev/null
Monday, 25 February 2008
Consecutive commands in linux
Adding && between the commands works!
Command1 && Command2
Sometimes we paste a command (from vim, say) that needs a directory which doesn't exist yet. It is easier to prepend the mkdir to the command than to remove the command and paste it again after creating the directory.
mkdir DIST && command
Debug Java Application or Java Web Application using Eclipse
Steps to configure a Java application with Eclipse
1. Append the parameters below to the JVM_ARGUMENTS of the application:
-Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=n,address=5050
2. Configure your Eclipse Debugger.
Go to project source.
Place Break Points in Project source where ever needed.
Go to Project > Run > Open Debug Dialog
Create a new Remote Java Application configuration with the corresponding parameters:
Host - the application host IP
Port - the port specified in the JVM_ARGUMENTS of the application.
Click Debug
3. Access your application.
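For instance, launching the application from the shell with those arguments (the jar name is illustrative):
java -Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=n,address=5050 -jar myapp.jar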
Friday, 22 February 2008
Basic X11 Forwarding Over SSH
X11Forwarding needs to be enabled on the sshd server. Do this by making the following edit:
File: /etc/ssh/sshd_config (set X11Forwarding yes)
After you make these changes, you will need to restart sshd so the changes will be accepted:
/etc/init.d/sshd restart
Don't forget to log out and log in to the server for this change to take effect.
Note: one reason for receiving the error messages
xterm Xt error: Can't open display: your_client_name:0.0
may be that X11Forwarding is not enabled on the server.
Running single apps
$ ssh -X <remote_server>
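For example, to start a single remote application over the forwarded connection (host and user are illustrative):
ssh -X user@remote_server xclock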
Thursday, 21 February 2008
A shell script I needed to compile and run Java files.
#Let's start building the classpath with existing jars & libs.
THE_CLASSPATH=
for i in `ls *.jar`
do
THE_CLASSPATH=${THE_CLASSPATH}:${i}
done
# Let's compile all the java file.
for i in `find . -name '*.java'`
do
echo "Compiling $i"
javac -cp $THE_CLASSPATH $i
done
# invoke the main function.
java -cp .:$THE_CLASSPATH $CLASSNAME$
unset THE_CLASSPATH
To set classpath in linux
A simple tip; thought it would be useful. Add the directory containing your .class files (here the current directory) to the classpath:
CLASSPATH=.:$CLASSPATH
export CLASSPATH
Wednesday, 20 February 2008
To open multiple files in vi
vi -o file1 file2
To switch between the split windows use
Ctrl+w
To quit all: :qa!
To save all: :wa!
To save & quit all: :wqa!
Tuesday, 19 February 2008
Diff and Patch
I recently came across two more useful Linux commands, diff and patch. :)
DIFF:
Display the differences between two files, or each corresponding file in two directories.
SYNTAX
diff [options] from-file to-file > patch-file-name
PATCH
SYNTAX
patch -p0 < new-patch
patch -p1 < new-patch
Levels in the Patch Command (-p0 or -p1?):
The -p option strips off directory levels from the paths in the patchfile. For example, if you have a patchfile with a header such as:
--- old/modules/pcitable Mon Sep 27 11:03:56 1999
+++ new/modules/pcitable Tue Dec 19 20:05:41 2000
Using a -p0 will expect, from your current working directory, to find a subdirectory called "new", then "modules" below that, then the "pcitable" file below that.
Using a -p1 will strip off the 1st level from the path and will expect to find (from your current working directory) a directory called "modules", then a file called "pcitable". Patch will ignore the "new" directory mentioned in the header of the patchfile.
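A small worked example using the directory names from the header above:
diff -urN old/ new/ > changes.patch          # recursive unified diff between the two trees
cd old
patch -p1 < ../changes.patch                 # -p1 strips the leading "old/"/"new/" component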
For more Info:
man diff/patch
To Reset Firefox MasterPassword
Firefox:
Enter "chrome://pippki/content/resetpassword.xul" in the Location Bar, press Enter, then click "Reset"
Enter "chrome://pippki/content/resetpassword.xul" in the Location Bar, press Enter, then click "Reset"