Wednesday, 25 November 2009
Yesterday, I was trying to find out whether any daemon thread was running within the JBoss server, doing a scheduled job.
I tried ps -efw but it didn't work, as it's an internal thread. I got a suggestion that dumping the JBoss process would print the names of the daemon threads, and it worked great :)
# Find the running JBoss server process id.
ps -efww | grep 'java' # choose your JBoss server pid.
# Send the thread dump signal.
kill -3 $pid
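The dump itself goes to the JVM's standard output, so with a typical JBoss setup it ends up wherever the server console output is redirected. A minimal way to pull the daemon thread names out of it (the log path below is only an assumption -- use whatever file your server's stdout goes to):
# Hypothetical console log path; adjust for your installation.
grep 'daemon' /path/to/jboss/console.log | less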
Thursday, 5 November 2009
Load Test: Lucene 2.4 VS Lucene 2.9
The test below builds a 5-million-document index and measures how long FieldCache takes to load the field values; run it against each Lucene version to compare the load times.
import java.io.File;
import junit.framework.TestCase;
import org.apache.lucene.analysis.SimpleAnalyzer;
import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;
import org.apache.lucene.index.IndexReader;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.search.FieldCache;
import org.apache.lucene.store.Directory;
import org.apache.lucene.store.FSDirectory;

public class ContrivedFCTest extends TestCase {
    public void testLoadTime() throws Exception {
        Directory dir = FSDirectory.getDirectory(System.getProperty("java.io.tmpdir") + File.separator + "test");
        IndexWriter writer = new IndexWriter(dir, new SimpleAnalyzer(), true, IndexWriter.MaxFieldLength.LIMITED);
        writer.setMergeFactor(37);
        writer.setUseCompoundFile(false);
        // Index 5 million tiny documents with a single untokenized field.
        for (int i = 0; i < 5000000; i++) {
            Document doc = new Document();
            doc.add(new Field("field", "String" + i, Field.Store.NO, Field.Index.NOT_ANALYZED));
            writer.addDocument(doc);
        }
        writer.close();
        // Time how long FieldCache takes to load the field values for the whole index.
        IndexReader reader = IndexReader.open(dir);
        long start = System.currentTimeMillis();
        FieldCache.DEFAULT.getStrings(reader, "field");
        long end = System.currentTimeMillis();
        System.out.println("load time: " + (end - start)/1000.0f + "s");
    }
}
Tuesday, 3 November 2009
List ports used by java process
Below is the command to list all the ports opened by a Java process:
sudo netstat -tulpn |grep java
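If you already know the pid of the process you care about, you can filter on the "PID/Program name" column instead (replace $pid with the actual process id):
sudo netstat -tulpn | grep "$pid/java"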
Monday, 26 October 2009
Python socket programming
I was in a situation where I needed to make a particular port listen.
The code below helped me do that simply.
import socket
#create an INET, STREAMing socket
serversocket = socket.socket(
socket.AF_INET, socket.SOCK_STREAM)
#bind the socket to a public host,
# and a well-known port
serversocket.bind((socket.gethostname(), 80))
#become a server socket
serversocket.listen(5)
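A quick way to check that the socket is actually listening is to probe the port from another shell (this assumes netcat is installed and the server above is still running; binding port 80 needs root, so you may prefer a higher port):
nc -z -v $(hostname) 80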
Thursday, 24 September 2009
To touch folder contents recursively
Sometimes you may need to alter the access time of files and folders. You can do it with a recursive touch driven by find.
find . | xargs touch
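If any of the file names contain spaces, the null-separated variant is safer:
find . -print0 | xargs -0 touch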
psql output to file
Below is the command to redirect psql output to a file.
psql> \o /tmp/file.txt
psql> select * from table;
The query output now goes to the custom file instead of the screen.
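When you're done, \o with no file name switches the output back to the terminal:
psql> \o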
Friday, 18 September 2009
Removing control characters - python
You want a regex to remove control characters (below chr(32) and above chr(126)) from strings, i.e.
line = re.sub(r"[^a-z0-9-';.]", " ", line) # replace all chars NOT a-z, 0-9, [-';.] with " "
1. What is the best way to include all the required chars rather than list them all within the r"" ?
You have to list either the chars you want, as you have done, or the ones you don't want. You could use
r'[\x00-\x1f\x7f-\xff]' or
r'[^\x20-\x7e]'
line = re.sub(r'[\x00-\x1f\x7f-\xff]', "", "test ^A ^B testing this again")
Interesting tip about tcpdump
I was searching for a tool like Wireshark on Windows and came to know about this. It helped me dump the SOAP requests.
Here is the simple command.
sudo tcpdump -XX -s0 -i lo -w k "tcp port 80"
-XX: Print packets in hex and ASCII
-s0: Print whole packets, not only first 68 chars.
-i: The interface to monitor.
"tcp port 80": Filter expression.
-w: Output file.
This dumps all the requests and responses passing through tcp port 80.
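To look at the capture later, read the file back through tcpdump (k is simply the file name passed to -w above):
sudo tcpdump -XX -r k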
Wednesday, 2 September 2009
Apache Module mod_deflate
If you are looking to tune Apache for performance, this should help you out.
The mod_deflate module provides the DEFLATE output filter, which allows output from your server to be compressed before being sent to the client over the network.
This is a simple sample configuration for the impatient.
Compress only a few types
AddOutputFilterByType DEFLATE text/html text/plain text/xml
The following configuration, while resulting in more compressed content, is also much more complicated. Do not use this unless you fully understand all the configuration details.
Compress everything except images
# Insert filter
SetOutputFilter DEFLATE
# Netscape 4.x has some problems...
BrowserMatch ^Mozilla/4 gzip-only-text/html
# Netscape 4.06-4.08 have some more problems
BrowserMatch ^Mozilla/4\.0[678] no-gzip
# MSIE masquerades as Netscape, but it is fine
# BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
# NOTE: Due to a bug in mod_setenvif up to Apache 2.0.48
# the above regex won't work. You can use the following
# workaround to get the desired effect:
BrowserMatch \bMSI[E] !no-gzip !gzip-only-text/html
# Don't compress images
SetEnvIfNoCase Request_URI \
\.(?:gif|jpe?g|png)$ no-gzip dont-vary
# Make sure proxies don't deliver the wrong content
Header append Vary User-Agent env=!dont-vary
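A quick way to check that responses are actually being compressed (this assumes curl is available and the server is reachable on localhost; adjust the URL for your setup):
curl -s -H "Accept-Encoding: gzip" -D - -o /dev/null http://localhost/ | grep -i content-encoding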
Thursday, 25 June 2009
Capture the results of the queries executed in the mysql client in a file (no copy and paste)
One way is to copy the mysql client screen and paste it into a text file. The other way is to use tee.
mysql>
mysql> tee mysql-1.txt
Logging to file 'mysql-1.txt'
mysql> show databases;
+----------------+
| Database |
+----------------+
| test_db |
| db_3 |
| db_4 |
| db_5 |
+----------------+
4 rows in set (0.00 sec)
mysql> exit
Bye
[thiru@cu065 tmp]$ less mysql-1.txt
mysql> show databases;
+----------------+
| Database |
+----------------+
| test_db |
| db_3 |
| db_4 |
| db_5 |
+----------------+
4 rows in set (0.00 sec)
mysql> exit
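To stop logging to the file without leaving the client, use notee:
mysql> notee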
Monday, 22 June 2009
Sending mail through command line client.
Some days back I was in a situation where I had to test continuous spamming of mails to my mailbox.
This command did the work for me:
for i in `seq 1 35`; do echo "test" | mail -s test forum@email.com; done
Wednesday, 3 June 2009
Apache <IfDefine> Directive
This is used to process a section of httpd.conf based on a parameter passed when the server is started.
The directives within an <IfDefine test> ... </IfDefine> section are only processed if the "test" parameter is defined. If it is not, everything between the start and end markers is ignored.
Example:
httpd -D test -k start
Starting httpd with the parameter "test" will process the section below; otherwise it is skipped.
<IfDefine test>
...
...
</IfDefine>
This is useful when you want to load modules only for specific startup arguments.
Monday, 1 June 2009
How to Telnet to a Web Server - HTTP Requests through Telnet
Interesting, today I came to know there is a way to talk to a web server and see its responses using telnet.
Normally we don't see what is sent between the web server and the client. Here is how we can see what's actually exchanged, through telnet.
To connect to the web server - open a command line and type the command
telnet host port
telnet localhost 8080
You'll then be connected to that web server, and you can enter any HTTP command you want, such as GET or HEAD.
If you want to request a web page from the web server, type the HTTP request as follows.
GET pageName HTTP/1.0
HEAD pageName HTTP/1.0
Hit enter twice. Then you will get a response.
You'll get a 404 status if the page is not found, 301 if the page has moved permanently, and 401 if you are not authorized to access the page. More HTTP status codes can be found here.
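An illustrative session (the host, port, page and response shown here are only an example of what to expect, not output from a real server):
telnet localhost 8080
GET /index.html HTTP/1.0
(press Enter twice)
HTTP/1.1 200 OK
Content-Type: text/html
...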
Thursday, 30 April 2009
Setup YUM installer repo
To use the YUM installer you need to register the repo.
Create a file "/etc/yum.repos.d/centos.repo" and paste the below content.
[base]
name=CentOS-$releasever - Base
mirrorlist=http://mirrorlist.centos.org/?release=$releasever&arch=$basearch&repo=os
#baseurl=http://mirror.centos.org/centos/$releasever/os/$basearch/
baseurl=http://mirror.centos.org/centos/5/os/i386/
gpgcheck=0
gpgkey=http://mirror.centos.org/centos/RPM-GPG-KEY-centos4
protect=1
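After saving the file, a quick sanity check that yum picks up the new repo:
yum repolist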
Thursday, 23 April 2009
Linux maximum processes per user
Recently I came to know there is a maximum number of processes allocated for each user. Below is the command to find that.
ulimit -u
Reference:
http://tldp.org/LDP/solrhe/Securing-Optimizing-Linux-RH-Edition-v1.3/x4733.html
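If the limit needs to be raised, the per-user value can usually be set in /etc/security/limits.conf (the user name and numbers below are just placeholders):
someuser   soft   nproc   4096
someuser   hard   nproc   8192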
Friday, 17 April 2009
Removing invalid xml characters
Recently I came across one of the good ways to remove invalid XML characters. Below is the snippet.
/**
 * Returns the input stripped of invalid XML characters.
 *
 * See http://www.w3.org/TR/2000/REC-xml-20001006#NT-Char for the valid XML
 * character list.
 */
public String removeInvalidXmlCharacters(String input)
{
    if (input == null) {
        return input;
    }
    char c;
    StringBuffer sb = new StringBuffer();
    for (int i = 0; i < input.length(); i++)
    {
        c = input.charAt(i);
        // Remove ZeroWidthSpace.
        if (c == '\u200b') {
            continue;
        }
        // Note: a Java char never exceeds 0xFFFF, so the >= 0x10000 test never matches;
        // supplementary characters arrive as surrogate pairs and are dropped here.
        if ((c == 0x9) || (c == 0xA) || (c == 0xD)
                || ((c >= 0x20) && (c <= 0xD7FF))
                || ((c >= 0xE000) && (c <= 0xFFFD))
                || ((c >= 0x10000) && (c <= 0x10FFFF))
           ) {
            sb.append(c);
        }
    }
    return sb.toString();
}
Thursday, 16 April 2009
AWK to print the last-but-one column
You can use the command below to print the second-to-last column of each line; here it is piped through sort and uniq to count the distinct values.
awk '{print $(NF-1)}' access.log | sort | uniq | wc -l
Wednesday, 8 April 2009
Ubuntu youtube no sound in firefox
I was continuously facing a problem with sound when playing YouTube videos in Firefox. YouTube in turn uses the gstreamer library to render the sound.
I finally got sound working after installing the restricted packages!
Run "sudo apt-get install ubuntu-restricted-extras"
Play a YouTube video and check for sound.
Friday, 3 April 2009
Subversion stress testing.
Today, for one of my friends, I explored a new tool used for stress testing Subversion.
You can find the script here.
Basically the script is for performing multiple operations on the repository to test the load.
Tuesday, 31 March 2009
Find and Grep
find and grep are very useful commands for development.
When you want to grep a content for the matching files, below commands will help you out.
In linux:
find /tmp -name 'page.xml' -print0 | xargs -0 grep "view_link=\"ReportView" | awk '{print $1}' | wc -l
In solaris:
find /tmp -name 'page.xml' -exec grep "view_link=\"ReportView" {} \; | wc -l
In the commands above, I'm finding page.xml files inside the /tmp dir and then searching the matching files for the content "view_link=\"ReportView".
Monday, 30 March 2009
STOP and START a running process.
Send signals to the running process.
# Sending the STOP signal to the process will halt it in its current state.
sudo kill -STOP $pid
# To confirm the process is stopped for a moment:
sudo strace -p $pid # this will leave traces of "STOPPED".
# Send the CONT signal to resume the process so it continues its work.
sudo kill -CONT $pid
In order to make a process idle (to simulate a deadlock situation), I've used the above technique to stop the process and test its status.
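Another quick way to confirm the process is stopped is to look at its state flag in ps (a "T" in the STAT column means stopped):
ps -o pid,stat,cmd -p $pid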
Monday, 23 March 2009
To print shared library dependencies
To know what libraries are used by a program:
>>ldd /usr/bin/svn | grep db
>>libdb-4.7.so => /lib/libdb-4.7.so (0x00154000)
>>libdb-4.3.so => /lib/libdb-4.3.so (0x00cb0000)
Friday, 20 March 2009
Enable pgsql logging
Some days back I was in a situation where I had to find out when table rows were being inserted and deleted by an unknown thread.
I got an idea from one of my colleagues and enabled pgsql logging. It worked great :)
To enable logging
# Edit the pgsql configuration file
/var/lib/pgsql/data/postgresql.conf
# Under section # ERROR REPORTING AND LOGGING
# Edit the logging options and set your preferences.
...
log_directory = 'pg_log'
log_filename = 'postgresql-%Y-%m-%d_%H%M%S.log'
log_line_prefix = '%u %h %t'
log_statement = 'all'
log_hostname = on
...
# Go to pgsql logging dir and tail for the logfile.
cd /var/lib/pgsql/data/pg_log/
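# The exact file name depends on log_filename above; something like:
tail -f postgresql-*.log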
Thursday, 26 February 2009
Recursive shell commands
I was searching for a way to make commands work recursively (nested command substitution).
Below is the way to do it.
MIGRATION_CONFIG_FILE="$(dirname $(dirname $(readlink $0)))"/$CONFIGURATION_FILE
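A hypothetical illustration of how the nesting evaluates (the paths are made up):
# If $0 is /opt/app/bin/migrate.sh, a symlink pointing at /opt/app/releases/1.2/migrate.sh:
#   readlink $0                         -> /opt/app/releases/1.2/migrate.sh
#   dirname $(readlink $0)              -> /opt/app/releases/1.2
#   dirname $(dirname $(readlink $0))   -> /opt/app/releases
# so MIGRATION_CONFIG_FILE ends up as /opt/app/releases/$CONFIGURATION_FILE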
Monday, 16 February 2009
Splitting a patch file
Sometimes a patch file needs to be split into individual files for easier handling.
Below is the command to do that; it cuts the patch file at each line matching the pattern "Index:" (csplit writes the pieces out as xx00, xx01, ... by default).
csplit artf21810.diff /Index:/ {*}
Friday, 13 February 2009
Script to delete folder in a loop
#!/bin/sh
i=0
while [ $i -le 40 ]
do
folder="psr-pub-$i"
#chown apache:apache $folder -R
rm -rf $folder
i=`expr $i + 1`
done
Shell script to connect to postgres in a loop
i=211
while [ $i -le 250 ]
do
folder="psr-pub-$i"
echo "Starting processing $folder."
./svnmapping.sh $folder
echo "Done with $folder"
i=`expr $i + 1`
done
# svnmapping.sh (the script invoked once per folder by the loop above):
projectName=$1
userId="user1308"
DBConnect="psql -d psrdb -U psruser --tuples-only --single-line"
date=`date +"%F %T"`
projId=`echo "select id from project where path='projects.$projectName'" | $DBConnect`
projId=`echo $projId`
parent_folder_id=`echo "select id as parent_folderid from folder where path = 'scm' and project_id='$projId'" | $DBConnect`
parent_folder_id=`echo $parent_folder_id`
nextscm_id_query="select max(substring(id from (char_length('reps')+1))) from scm_repository"
primaryId_query="select id from objecttype where application='Scm' and name='Repository'"
secondaryId_query="select id from objecttype where application='Scm' and name='Commit'"
nextscm_id=`echo $nextscm_id_query | $DBConnect`
nextscmId=`echo $nextscm_id`
nextscmId=reps`expr $nextscmId + 1`
primaryId=`echo $primaryId_query | $DBConnect`
primaryId=`echo $primaryId`
secondaryId=`echo $secondaryId_query | $DBConnect`
secondaryId=`echo $secondaryId`
insertscm="insert into scm_repository values('$nextscmId', 'exsy1002', '/svnroot/$projectName', 'f')"
insertfolder="insert into folder values ('$nextscmId', 'scm.$projectName', '$projectName', '$projectName', '100', '$date+05:30', '$date+05:30','f', '$primaryId', '$secondaryId', '$userId', '$userId', '$parent_folder_id', '$projId', '0')"
scm=`echo $insertscm | $DBConnect`
fol=`echo $insertfolder | $DBConnect`
echo $scm" "$fol
Thursday, 5 February 2009
How to set alias name for network machines
A simple thing, but it avoids repetitive daily work.
To set an alias name for a frequently used machine:
#Go to /etc/hosts
vi /etc/hosts
#Make an entry for your machine
#IPAddress Hostname Alias
127.0.0.1 connectmachine.com connect
#Restart your network
/etc/rc.d/init.d/network restart
Once it's done, do a ping to check it:
ping connect