Category Archives: Linux

Get the total MySQL data and index usage by storage engine

SELECT IFNULL(B.engine,'Total') "Storage Engine",
       CONCAT(LPAD(REPLACE(FORMAT(B.DSize/POWER(1024,pw),3),',',''),17,' '),' ',
              SUBSTR(' KMGTP',pw+1,1),'B') "Data Size",
       CONCAT(LPAD(REPLACE(FORMAT(B.ISize/POWER(1024,pw),3),',',''),17,' '),' ',
              SUBSTR(' KMGTP',pw+1,1),'B') "Index Size",
       CONCAT(LPAD(REPLACE(FORMAT(B.TSize/POWER(1024,pw),3),',',''),17,' '),' ',
              SUBSTR(' KMGTP',pw+1,1),'B') "Table Size"
FROM (SELECT engine,
             SUM(data_length) DSize,
             SUM(index_length) ISize,
             SUM(data_length+index_length) TSize
      FROM information_schema.tables
      WHERE table_schema NOT IN ('mysql','information_schema','performance_schema')
        AND engine IS NOT NULL
      GROUP BY engine WITH ROLLUP) B,
     (SELECT 3 pw) A
ORDER BY TSize;
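The (SELECT 3 pw) subquery at the end controls the unit: the byte counts are divided by POWER(1024,pw) and SUBSTR(' KMGTP',pw+1,1) picks the matching suffix, so pw=3 reports gigabytes. Change it to (SELECT 2 pw) to get the report in megabytes, or (SELECT 4 pw) for terabytes.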

How to test DNS server speed on Linux

Unless you configure it manually, your Linux system will use the DNS service offered by your ISP or organization. If you are not satisfied with the default DNS service, you can consider switching to a public DNS service such as Google Public DNS or OpenDNS. Before switching to a different DNS server, you probably want to compare the speed of the available DNS servers and find the best one for you.

On Linux, there is an open-source DNS benchmark tool called namebench which can help you find the best DNS servers to use. namebench can run in command-line mode as well as in GUI mode. In this post, I will describe how to test and compare DNS server speed by using namebench.

To install and launch namebench on Ubuntu or Debian:

$ sudo apt-get install python-tk
$ wget http://namebench.googlecode.com/files/namebench-1.3.1-source.tgz
$ tar xvfz namebench-1.3.1-source.tgz
$ cd namebench-1.3.1
$ ./namebench.py

To install and launch namebench on CentOS, RHEL or Fedora:

$ sudo yum install tkinter
$ wget http://namebench.googlecode.com/files/namebench-1.3.1-source.tgz
$ tar xvfz namebench-1.3.1-source.tgz
$ cd namebench-1.3.1
$ ./namebench.py

If you don’t have X11 installed, namebench will automatically proceed in command-line mode and start evaluating the available DNS servers right away. If you have X11 installed, namebench will launch a graphical interface instead; click on the “Start Benchmark” button to start DNS benchmarking. namebench will test your local DNS servers as well as public and regional DNS servers.
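If you want specific resolvers included in the comparison, namebench also accepts nameserver IP addresses on the command line (this is from memory of its usage text, so double-check with ./namebench.py --help on your copy), for example:

$ ./namebench.py 8.8.8.8 208.67.222.222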

When run in command-line mode, namebench prints a benchmarking summary (e.g., the recommended DNS configuration) in the terminal upon completion, as follows.

Recommended configuration (fastest + nearest):
----------------------------------------------
nameserver 208.67.220.220  # OpenDNS  
nameserver 151.198.0.39    # Verizon Home5 US  
nameserver 71.250.0.12     # SYS-71.250.0.12  

********************************************************************************
In this test, OpenDNS is 5.1%: Faster 
********************************************************************************

- Saving report to /tmp/namebench_2013-04-30_2337.html
- Saving detailed results to /tmp/namebench_2013-04-30_2337.csv

When run in GUI mode, namebench will show the DNS speed test results in a web browser window. A typical DNS benchmarking report generated by namebench looks like the following.

[Report charts: DNS speed comparison summary; mean/minimum DNS response time; DNS response time distribution]

To compare DNS servers, namebench performs DNS lookups on hostnames found in your web browser history or on the Alexa top 10,000 domain names. The DNS lookup time measurements generated by namebench are stored in /tmp/namebench_*.csv, so you can do any custom analysis you want.
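The CSV file makes custom analysis easy, but if you just want to double-check the recommended servers by hand, dig reports the query time of every lookup it makes. The loop below is a plain shell sketch (not part of the original post; the server IPs and hostname are only examples) that averages a few lookups per server:

for ns in 8.8.8.8 208.67.220.220; do
  total=0
  for i in 1 2 3 4 5; do
    # dig prints a ";; Query time: N msec" line for each answer
    t=$(dig @$ns www.google.com +tries=1 +time=2 | awk '/Query time/ {print $4}')
    total=$((total + ${t:-0}))
  done
  echo "$ns: average $((total / 5)) ms over 5 lookups"
done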

Source: http://xmodulo.com/2013/05/how-to-test-dns-server-speed-on-linux.html

Get MySQL table sizes

SHOW TABLE STATUS FROM db_name LIKE 'tbl_name';
*************************** 1. row ***************************
 Name: tbl_name
 Engine: MyISAM
 Version: 10
 Row_format: Dynamic
 Rows: 5262588
 Avg_row_length: 154
 Data_length: 40680708
 Max_data_length: 281474976710655
 Index_length: 22840320
 Data_free: 0
 Auto_increment: NULL
 Create_time: 2013-06-13 15:08:11
 Update_time: 2013-06-20 14:27:43
 Check_time: 2013-06-20 13:16:16
 Collation: utf8_general_ci
 Checksum: NULL
 Create_options:
 Comment:
 1 row in set (0.00 sec)

Get table sizes in MB for a whole database:

SELECT table_name AS "Tables",
 round(((data_length + index_length) / 1024 / 1024), 2) "Size in MB"
 FROM information_schema.TABLES
 WHERE table_schema = "db_name";
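A similar query against information_schema shows the total size per database instead of per table, which is handy for spotting which schema is using the most space:

SELECT table_schema AS "Database",
 round(SUM(data_length + index_length) / 1024 / 1024, 2) "Size in MB"
 FROM information_schema.TABLES
 GROUP BY table_schema;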

Hide Computer, Home, Network, Trash and Mounted volumes icons on Ubuntu 12.04

gsettings set org.gnome.desktop.background show-desktop-icons true
gsettings set org.gnome.nautilus.desktop computer-icon-visible false
gsettings set org.gnome.nautilus.desktop home-icon-visible false
gsettings set org.gnome.nautilus.desktop network-icon-visible false
gsettings set org.gnome.nautilus.desktop trash-icon-visible false
gsettings set org.gnome.nautilus.desktop volumes-visible false
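To bring any of these icons back, run the same command with true instead of false. You can also check the current value of a key with gsettings get, for example:

gsettings get org.gnome.nautilus.desktop trash-icon-visible
gsettings set org.gnome.nautilus.desktop trash-icon-visible true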

IPTables: Prevent crawlers and add to blacklist

# Chain that logs an offender and adds it to the "badlist" recent list
iptables -N badguys
iptables -A badguys -p tcp --dport 80 -m recent --set --name badlist -j LOG --log-prefix "Adding to the badlist "
# Chain that inspects all incoming web traffic (everything but loopback)
iptables -N block
iptables -I INPUT ! -i lo -p tcp --dport 80 -j block
# Drop new connections from sources seen on the badlist within the last 600 seconds
iptables -I block -p tcp --syn --dport 80 -m recent --name badlist --update --seconds 600 -j DROP
# Never blacklist sources present on the "goodguys" whitelist
iptables -I badguys -p tcp --dport 80 -m recent --name goodguys --update --rsource -j RETURN
# Requests whose payload matches the crawler signature (replace WHATEVER) go to the badguys chain
iptables -A block -p tcp --dport 80 -m string --algo bm --string "WHATEVER" -j badguys
# Tolerate up to 10 matching requests per second before blacklisting
iptables -I badguys -m limit --limit 10/s --limit-burst 10 -j RETURN
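The recent match keeps its lists under /proc, so you can see who has ended up on the blacklist and remove entries by hand. On current kernels the files live in /proc/net/xt_recent/ (older kernels use /proc/net/ipt_recent/), and the IP below is only an example:

# show the current contents of the badlist
cat /proc/net/xt_recent/badlist
# remove a single address from the badlist
echo -66.249.73.23 > /proc/net/xt_recent/badlist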

cURL: Print Headers Only

juancho@juancho-laptop:~$ curl -I -L http://www.lanacion.com
 HTTP/1.1 301 Moved Permanently
 Content-Length: 150
 Content-Type: text/html; charset=UTF-8
 Location: http://www.lanacion.com.ar/
 Server: Microsoft-IIS/7.5
 X-ORI: w9
 X-Powered-By: ASP.NET
 Date: Thu, 28 Feb 2013 21:01:46 GMT
 Connection: close
HTTP/1.1 200 OK
 Cache-Control: public, must-revalidate, max-age=900
 Content-Type: text/html
 Last-Modified: Thu, 28 Feb 2013 20:56:02 GMT
 Vary: Accept-Encoding
 Response-Flush: No
 X-ORI: w1
 X-Powered-By: ASP.NET
 Date: Thu, 28 Feb 2013 21:02:12 GMT
 X-Varnish: 3357640651 3357411539
 Age: 368
 Via: 1.1 varnish
 Connection: keep-alive
 X-CAMINO: UA_BOTS
 X-Cache: HIT 1653
 X-EPV: v8
  • -I : curl sends a HEAD request and prints only the server’s HTTP response headers, instead of the page data.
  • -L : if the web server responds that the requested page has moved to a new location (a redirect), curl’s default behaviour is not to request the page at the new location; it simply shows the 3xx response. This switch instructs curl to make another request for the page at the new location whenever the web server returns a 3xx HTTP code.
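Some servers answer the HEAD request sent by -I differently from a normal GET. In that case you can still look at only the headers by dumping them from a regular GET and discarding the body, for example:

$ curl -s -L -D - -o /dev/null http://www.lanacion.com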

Using the Perl MySQLTuner script

MySQLTuner is a script written in Perl that allows you to review a MySQL installation quickly and make adjustments to increase performance and stability. The current configuration variables and status data are retrieved and presented in a brief format, along with some basic performance suggestions. The one-liner below feeds the script straight from GitHub into perl, using bash process substitution and the GET command from libwww-perl.

juancho@juancho-laptop:~$ perl <(GET https://raw.github.com/rackerhacker/MySQLTuner-perl/master/mysqltuner.pl)

>> MySQLTuner 1.2.0 - Major Hayden
 >> Bug reports, feature requests, and downloads at http://mysqltuner.com/
 >> Run with '--help' for additional options and output filtering
 Please enter your MySQL administrative login: root
 Please enter your MySQL administrative password:
-------- General Statistics --------------------------------------------------
 [--] Skipped version check for MySQLTuner script
 [OK] Currently running supported MySQL version 5.5.29-0ubuntu0.12.04.1
 [OK] Operating on 64-bit architecture
-------- Storage Engine Statistics -------------------------------------------
 [--] Status: +Archive -BDB -Federated +InnoDB -ISAM -NDBCluster
 [--] Data in MyISAM tables: 1G (Tables: 763)
 [--] Data in InnoDB tables: 113M (Tables: 78)
 [--] Data in PERFORMANCE_SCHEMA tables: 0B (Tables: 17)
 [!!] Total fragmented tables: 86
-------- Security Recommendations -------------------------------------------
 [OK] All database users have passwords assigned
-------- Performance Metrics -------------------------------------------------
 [--] Up for: 16h 50m 51s (10K q [0.171 qps], 852 conn, TX: 15M, RX: 2M)
 [--] Reads / Writes: 98% / 2%
 [--] Total buffers: 192.0M global + 2.7M per thread (151 max threads)
 [OK] Maximum possible memory usage: 597.8M (10% of installed RAM)
 [OK] Slow queries: 0% (0/10K)
 [OK] Highest usage of available connections: 8% (13/151)
 [OK] Key buffer size / total MyISAM indexes: 16.0M/572.6M
 [OK] Key buffer hit rate: 99.9% (1M cached / 971 reads)
 [OK] Query cache efficiency: 44.4% (3K cached / 8K selects)
 [OK] Query cache prunes per day: 0
 [OK] Sorts requiring temporary tables: 0% (0 temp sorts / 1K sorts)
 [OK] Temporary tables created on disk: 21% (159 on disk / 734 total)
 [OK] Thread cache hit rate: 98% (15 created / 852 connections)
 [!!] Table cache hit rate: 9% (400 open / 4K opened)
 [OK] Open file limit used: 76% (786/1K)
 [OK] Table locks acquired immediately: 100% (7K immediate / 7K locks)
 [OK] InnoDB data size / buffer pool: 113.8M/128.0M
-------- Recommendations -----------------------------------------------------
 General recommendations:
 Run OPTIMIZE TABLE to defragment tables for better performance
 MySQL started within last 24 hours - recommendations may be inaccurate
 Enable the slow query log to troubleshoot bad queries
 Increase table_cache gradually to avoid file descriptor limits
 Variables to adjust:
 table_cache (> 400)
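Acting on that last suggestion means editing the MySQL configuration and restarting the server. Note that on MySQL 5.5 the variable MySQLTuner calls table_cache is named table_open_cache; the value below is only an example, to be raised gradually as the tool recommends:

# /etc/mysql/my.cnf
[mysqld]
table_open_cache = 800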

Print top IP accesses from Apache domlogs

# Count the domlogs, then report today's top client IPs (with reverse DNS)
# for the busiest ones, logging everything to $log.
log="/root/domlogs_list"; dir="/usr/local/apache/domlogs"; > $log
num=$(find $dir -type f -name "*_log" | wc -l)
echo "There are $num log files."
read -p "How many do you want outputted?: " howmany
read -p "How many IPs per domain would you like listed? " perdomain
echo $(date) | tee -a $log
# Rank the domlogs by number of requests made today and keep the top $howmany
find $dir -maxdepth 1 -mindepth 1 -type f ! -name "*_log" \
    -exec bash -c "grep -cH "$(date +%d)\/$(date +%b)" {}" -print \; |
  sort -t\: -k 2 -rnu |
  sed 's/\(.*\)\:\([[:digit:]]\+\)/\2 \1/' |
  head -n $howmany |
  awk '{print $2}' |
  while read domain; do
    echo "----------------------------------------------------------------" | tee -a $log
    echo $domain | tee -a $log
    echo -e "# of requests\t\tIP address\t\tDomain name (if available)" | tee -a $log
    # For each selected domlog, list today's top $perdomain client IPs with reverse DNS
    grep "$(date +%d)\/$(date +%b)" $domain | awk '{print $1}' | sort | uniq -c | sort -rn |
      head -n $perdomain | awk '{print $1 " " $2}' |
      while read num ip; do
        rev=$(dig -x $ip +short)
        echo -e "$num\t\t$ip\t\t$rev" | tee -a $log
      done
  done
There are XX log files.
 How many do you want outputted?: 1
 How many IPs per domain would you like listed? 1
 Tue Feb 26 10:42:21 ART 2013
 ----------------------------------------------------------------
 /usr/local/apache/domlogs/xxx.com
 # of requests IP address Domain name (if available)
 8505 66.249.73.23 crawl-66-249-73-23.googlebot.com.
 971 65.55.24.214 msnbot-65-55-24-214.search.msn.com.
 605 201.231.198.85 85-198-231-201.fibertel.com.ar.
 513 65.55.52.117 msnbot-65-55-52-117.search.msn.com.
 503 157.55.32.144 msnbot-157-55-32-144.search.msn.com.
 492 157.55.36.50 msnbot-157-55-36-50.search.msn.com.
 484 157.55.32.95 msnbot-157-55-32-95.search.msn.com.