Today, I encountered problems while fetching a large MySQL table. Simply put, the server was running out of memory due to the large table size.
I knew you can fetch a subset of data from a MySQL table using LIMIT. The standard `LIMIT N` returns the first N rows of your table.
However, there is a second way of calling LIMIT: `LIMIT S, N`, where S is the zero-based offset (the number of rows to skip) and N is the number of rows to be returned. Thus, you can easily plug it into a for loop to fetch a large MySQL table in, for example, 1-million-row batches.
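For illustration, assuming a hypothetical table `large_table_name`, the first two 1-million-row batches correspond to these queries:

#skip 0 rows, return the next million (rows 1-1000000)
select * from large_table_name limit 0, 1000000;
#skip the first million, return the next million (rows 1000001-2000000)
select * from large_table_name limit 1000000, 1000000;

Wrapped in a small shell script, the whole export looks like this: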
step=1000000
t=large_table_name
user=mysql_username
pass=mysql_pass
db=database_name
#get row count
c=`mysql -u$user -p$pass $db -N -A -e"select count(*) from $t"`
echo `date` $t "with $c rows";
#write table header (tab-separated column names)
header=`mysql -u$user -p$pass $db -N -A -e"show columns from $t" | cut -f1 | paste -s -`
echo "$header" | gzip > $t.txt.gz
#fetch the table in batches; LIMIT offsets are zero-based, so start at 0
for i in `seq 0 $step $((c-1))`; do
cmd="select * from $t limit $i, $step"
echo -en "r$i / $c"
mysql -u$user -p$pass $db -A -N -e"$cmd" | gzip >> $t.txt.gz
done
echo ""
Hope someone will find it useful :)