Moving to KDE – is it worth it?

I’m an Ubuntu enthusiast. However, since GNOME became the default desktop in Ubuntu, I’ve been experiencing stability issues. I don’t mind rebooting my laptop from time to time, but my workstation is a different story – it often runs for many weeks without a reboot.

After many discussions with my friend, I decided to give KDE a try. I had experimented with KDE years ago and found it not straightforward to use. But apparently since version 5 it’s possible to customise KDE’s look & feel to nearly whatever you like. And I have to admit, I was hooked after just a few hours. First of all, it’s very stable, quite lightweight and very practical. It’s also pretty – that doesn’t matter much for productivity, but it’s a nice add-on. I fell in love with the drop-down terminal. Setting everything up so the migration from GNOME was smooth took me a few hours the first time. But it paid off rather quickly, as I’m way more productive than before. That’s more or less how my screen looks.

If you want to try it, I’d recommend KDE Neon instead of Kubuntu, as Neon is developed by the KDE Community, so it’s the purest KDE experience you can get. Below you can find a list of widgets, applications and customisations which made my life easier (again, big thanks to Maciek for helping with the migration!).


  • (Add widgets)
    • System Load Viewer [set compact view]
    • Global Menu
  • (Add widgets > Get new widgets > Download new plasma widgets)
    • Event Calendar (replaces the standard clock & calendar)
  • from GitHub

Applications (installed through Discover)

  • thermal monitor
  • redshift + redshift control
  • latte
  • dropbox
  • netspeed widget


  • yakuake (drop-down terminal activated with F12)
  • workrave
# install the drop-down terminal, break reminder & process viewer
sudo apt install yakuake workrave htop
# start the drop-down terminal and the latte dock
yakuake &
latte-dock &

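Yakuake and latte won’t come back by themselves after a reboot. A minimal sketch of autostarting them by copying their launchers into KDE’s autostart directory (the .desktop file names are assumptions – check /usr/share/applications/ on your system):

```shell
# Sketch: autostart yakuake and latte by dropping their .desktop launchers
# into KDE's autostart directory (launcher names are assumptions).
autostart_dir="${HOME}/.config/autostart"
mkdir -p "$autostart_dir"
for app in org.kde.yakuake org.kde.latte-dock; do
    desktop="/usr/share/applications/${app}.desktop"
    if [ -f "$desktop" ]; then
        cp "$desktop" "$autostart_dir/"
    fi
done
```

The same can be done from the GUI via System Settings > Startup and Shutdown > Autostart.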

I’ll try to keep it up-to-date. Hope someone finds it useful. I already did, while installing Neon on the second and third machine 😉

Investigate & reduce the size of Drupal sqlite3 database

Today, while performing a regular Drupal update and backup, I realised the Drupal sqlite3 database sites/default/files/.ht.sqlite is over 440 MB! I found it peculiar, as our website isn’t storing that much information, and the size has grown significantly since I last looked it up a couple of months ago. I decided to investigate what’s eating up so much DB space.

Investigate what’s eating up space within your sqlite3 db

There is a super useful program called sqlite3_analyzer. It analyses your database file and reports what’s actually taking up your disk space. You can download it from here (download the precompiled sqlite3-tools). Note, under Linux you’ll likely need to install 32-bit libraries, e.g. under Ubuntu/Debian execute

sudo apt install libc6-i386 lib32stdc++6 lib32gcc1 lib32ncurses5 lib32z1  

Once you have the program, simply execute sqlite3_analyzer DB_NAME | less and it will produce a detailed report about your DB space consumption. For me it looked like this:

Can you spot how much space the actual data takes? Yes, only 4.7% (20k pages). And what takes up most of the space? The freelist.

Quick googling taught me that the freelist is simply empty space left behind after deletes or data moving. You may ask, why isn’t it cleaned up later? You see, having the entire database with all tables in one file is very handy, but troublesome. Every time a given table is edited, the space that is freed isn’t reused immediately, but rather marked as freelist. Those regions get cleaned up only when the vacuum command is issued. This should happen automatically from time to time if auto vacuum is enabled. I couldn’t figure out why it isn’t working by default with Drupal…
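You can inspect both settings yourself with sqlite3 pragmas (a sketch; .ht.sqlite is the Drupal DB path from this post, adjust to your setup):

```shell
# auto_vacuum: 0 = disabled, 1 = full, 2 = incremental
sqlite3 .ht.sqlite "PRAGMA auto_vacuum;"
# number of pages currently sitting unused on the freelist
sqlite3 .ht.sqlite "PRAGMA freelist_count;"
```

A large freelist_count together with auto_vacuum = 0 is exactly the situation described above.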

Reduce the size of sqlite3 DB file

Nevertheless, I decided to perform vacuum manually. Of course I backed up the DB first, just in case (you should always do that!). But sqlite3 .ht.sqlite vacuum returned Error: no such collation sequence: NOCASE_UTF8. At this point, I thought maybe a simple DB dump and recovery would solve my problem – after all, that’s more or less what happens under the hood when you perform vacuum.

sqlite3 .ht.sqlite.bck .dump > db.sql
sqlite3 .ht.sqlite < db.sql

The DB recovered from the dump was indeed smaller (16 MB), but it was missing some tables (sqlite3 .ht.sqlite .tables). Interestingly, when I investigated the schema of the missing tables (sqlite3 .ht.sqlite.bck .schema block_content), I realised that all of them contain NOCASE_UTF8 in the table schema. I found that really peculiar! After further googling and rather lengthy reading, I realised NOCASE_UTF8 is invalid in sqlite3, but it can be replaced simply with NOCASE.

Replace DB schema directly on sqlite3 db

In a brave (and, I initially thought, stupid) attempt, I decided to just replace the wrong statements directly in the DB file using sed (sed 's/NOCASE_UTF8/NOCASE/g' .ht.sqlite.bck > .ht.sqlite). As expected, the database file got corrupted. This is because table locations are stored as byte offsets within the same file, so truncating some text from the DB file isn’t the wisest idea. Then I decided to replace NOCASE_UTF8 while keeping the statement the same length, padding the replacement with white spaces. To my surprise it worked & allowed me to reduce the size of the DB from 440 to 30 MB 🙂

sed 's/NOCASE_UTF8/NOCASE     /g' .ht.sqlite.bck > .ht.sqlite
sqlite3 .ht.sqlite vacuum
ls -l .ht.sqlite*
-rw-rw-r--  1 lpryszcz www-data  32638976 Feb 28 13:57 .ht.sqlite
-rw-rw-r-- 1 lpryszcz www-data 451850240 Feb 28 13:45 .ht.sqlite.bck
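The padding is essential because sqlite addresses its pages by byte offset, so the file must keep exactly the same length. A standalone sanity check of the trick (on temp files, not the real DB):

```shell
# demonstrate that the padded replacement preserves the byte length
before=$(mktemp); after=$(mktemp)
printf 'COLLATE NOCASE_UTF8' > "$before"
sed 's/NOCASE_UTF8/NOCASE     /g' "$before" > "$after"
# both files must report the identical size (19 bytes here)
wc -c "$before" "$after"
```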

Finally, to make sure no data is missing between the old and the new, reduced DB, you can use sqldiff .ht.sqlite .ht.sqlite.bck. It’ll report all SQL commands that would transform one DB into the other, and nothing if the DBs contain identical information.

Hopefully replacing NOCASE_UTF8 with NOCASE will allow auto vacuum to proceed as expected on the Drupal DB in the future!

Why did I switch back to Raspbian on my RPi server?

Frankly, while I’m a super happy user of Ubuntu on desktops and servers, I’ve been quite unsatisfied with Ubuntu for Raspberry Pi. Mostly because it ruined a few of my SD cards. I realised that writing file-access times to the card was one of the key factors here. Unfortunately, Ubuntu for RPi doesn’t mount the filesystem with noatime… while the latest Raspbian has it by default.
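For reference, noatime is set per mount point in /etc/fstab; a sketch of what the root-partition entry looks like with it enabled (the device name and ext4 type are assumptions – match them to your own fstab):

```
# /etc/fstab - root partition with noatime, so reads don't trigger writes
/dev/mmcblk0p2  /  ext4  defaults,noatime  0  1
```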

What I like a lot about Raspbian Stretch Lite is that, besides natively supporting all Raspberry Pi features, it’s also cross-platform compatible – it works super well on both RPi2 and RPi3.

And yes, this blog, among a few other things, is served from an RPi2 🙂

Changing the MAC address of a USB LAN adapter permanently

It’s been a long time since the last post… But the time came when I faced a serious problem trying to change the MAC address of my USB LAN adapter.

As recommended by numerous pages found by googling change MAC address Linux, I tried ifconfig eth0 hw ether NEWMAC and macchanger. Both changed the MAC of my device (as seen in ifconfig output), yet after plugging in the LAN cable, the MAC was automatically restored to the permanent one.

At first, I thought it was the fault of NetworkManager, so I stopped it. But the problem persisted. After some tinkering, I realised the MAC can also be pinned in NetworkManager itself by adding two lines to /etc/NetworkManager/NetworkManager.conf (a connection-defaults section; replace the placeholder with your desired MAC):

[connection]
ethernet.cloned-mac-address=XX:XX:XX:XX:XX:XX
and restarting NetworkManager

sudo service network-manager restart

Note, when I changed the MAC in NetworkManager using the GUI, the permanent MAC was also restored upon LAN cable connection.

Hope this helps someone having similar problem with USB LAN adapter.

VLC subtitles downloading (VLSub) doesn’t work / hangs

For some weeks already, I’ve been annoyed by the VLSub extension of VLC not working. It simply hangs while downloading subtitles. Apparently, this is associated with changes in remote access. Today, I found a simple solution for this issue:

  • Update VLC:
    sudo add-apt-repository ppa:videolan/master-daily
    sudo apt-get update && sudo apt-get upgrade -y
  • Replace HTTP/1.1 with HTTP/1.0 in /usr/lib/vlc/lua/extensions/VLSub.luac:
    sudo sed -i.bak 's|HTTP/1.1|HTTP/1.0|g' /usr/lib/vlc/lua/extensions/VLSub.luac
  • Restart VLC and enjoy!

Solution found on VideoLan forum.
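A note on the sed command: since the pattern HTTP/1.1 itself contains slashes, you need a substitution delimiter other than / (e.g. |; a backslash is not a valid delimiter). A quick check of the substitution on a plain string:

```shell
# sed with | as delimiter, because the pattern contains slashes
printf 'GET / HTTP/1.1' | sed 's|HTTP/1.1|HTTP/1.0|g'
# prints: GET / HTTP/1.0
```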

Copy data from Android phone with broken screen

The screen of your phone broke and you want to retrieve your contacts / files… Quite a typical story. Getting your photos / files is trivial: plugging the phone into a computer and copying the necessary files is enough.
The situation with getting your contacts out (if you happened not to sync them with Google) is slightly more complicated. Here is what I did in the case of a Samsung S4 mini with a broken screen. Note, the digitizer (touch screen) worked, but USB debugging was OFF. Also, the S4 mini has no video output. If your phone happens to have HDMI or MHL, just get the cable and plug it into your TV / monitor 😉

  1. Enable USB debugging
     This is the hard part and can be done only manually! And it’s quite complicated with a broken screen. You need to repeat 3 steps until you reach what you want: make a screenshot (HOME + POWER buttons on the S4 mini), see what’s on the screen (navigate to Phone storage > Pictures > Screenshots on your computer), perform some action and repeat… This is extremely tedious, but it worked for me.
     In Android 4.4, which my phone had, you need to enter Settings > About, scroll down and press `Build Number` many times (~7 should work). This will enable `Developer options` in Settings. Enter it, tick `USB Debugging` and press OK (here I needed to rotate the screen, as the right side of my digitizer didn’t work…).
     I recommend clicking `Revoke access` and OK, as my computer couldn’t connect until I pressed it.
     Then unplug the phone and plug it in again. A new dialog asking for permission for your computer to access the phone will appear on the screen. Tick `Always allow access` and OK. From now on, access through ADB is possible. You can check it with:

     # install ADB
     sudo apt-get install android-tools-adb
     # connect
     adb devices
     adb shell

     Note, if your digitizer works only partially (my case), it’s useful to enable autorotation first.

  2. Screencast Android to your computer monitor

     # install sevensquare dependencies
     sudo apt-get install qt4-qmake libqt4-dev libqtcore4 libqtgui4
     git clone
     cd sevensquare
     # for Ubuntu 16.04 replace the 5th line of Makefile with
     	(cd build && qmake-qt4 -o Makefile ../)
     # compile
     make
     # and run
     build/seven-square &

     Now you should see the mobile screen and be able to interact with it using your mouse & keyboard. Now exporting contacts should be trivial, right?

I tried to dump the userdata partition, but on the original system version there is no root access, and getting it would erase the data…

Let me know if there is any simpler solution!

Kindle touch with ssh to wi-fi enabled

Lately, I’ve spent quite a lot of time playing with my Kindle. The motivation was mostly its sluggishness – after 1.5 years, my Kindle became very slow and unresponsive… I decided to factory reset it. And while doing that, I checked whether a jailbreak of the newer firmware is possible (when I last checked, over a year ago, it wasn’t).
To my big joy, I found that these days it’s fairly easy to jailbreak any Kindle by downgrading its firmware.
But this isn’t the focus of this post. Here I’d like to share my experience with enabling SSH access to my Kindle. Why would anyone do that? The Kindle is an ARM-powered computer with Linux installed. So with SSH access, you can set up your Kindle to do lots of useful stuff. But I’ll write about that in the next posts 😉
Here is how to proceed:

  1. Jailbreak your Kindle
  2. Install KUAL & Mobileread Package Installer (MrPI)
  3. Install USBNetwork Hack
  4. Create a new user account on the Kindle for SSH access

     # get dev name from udev
     dmesg | grep usb0
     # bind
     sudo ifconfig enp0s20u8
     # telnet
     # mount root with write access
     mntroot rw
     # create new user
     mkdir -p /home
     adduser USER
     # make it root by changing its USERID to 0 in /etc/passwd

  5. Start sshd and enable port 22 for SSH temporarily
     This is only to check that SSH access is possible. So far we haven’t made any serious changes 😉

     /mnt/us/usbnet/sbin/sshd -f /mnt/us/usbnet/etc/sshd_config
     iptables -A INPUT -p tcp --dport 22 -j ACCEPT

     Now try to log in to your Kindle over SSH. You can find its IP by executing ifconfig. Proceed only if SSH works for you.

  6. Enable port 22 for SSH permanently

     # add the line below to /etc/sysconfig/iptables to enable SSH access
     -A INPUT -p tcp --dport 22 -j ACCEPT

  7. Add sshd to upstart
     Create a new file /etc/upstart/sshd.conf:

     # ssh - OpenBSD Secure Shell server
     # The OpenSSH server provides secure shell access to the system.
     env LOGFILE=/tmp/ssh.log
     description     "OpenSSH server"
     start on dbus_ready
     stop on stopping dbus
     respawn limit 2 5
     umask 022
     pre-start script
         test -x /usr/sbin/sshd || { stop; exit 0; }
         test -e /etc/ssh/sshd_not_to_be_run && { stop; exit 0; }
         test -c /dev/null || { stop; exit 0; }
     end script
     script
         # if you used to set SSHD_OPTS in /etc/default/ssh, you can change the
         # 'exec' line here instead
         echo `date` "Starting sshd..." >> $LOGFILE 2>&1
         /mnt/us/usbnet/sbin/sshd -f /mnt/us/usbnet/etc/sshd_config >> $LOGFILE 2>&1
     end script

  8. Disable auto updates

     mv /etc/uks /etc/uks.disabled

Now SSH should work after Kindle reboot 🙂


Create Windows USB stick under Ubuntu

Today, I needed to create a Windows 10 USB key in order to install it on a laptop. I found it not so straightforward under Ubuntu… But I quickly found a simple solution, WinUSB.

# install WinUSB
sudo add-apt-repository ppa:nilarimogard/webupd8 && sudo apt update && sudo apt install winusb

# without USB formatting
sudo winusb --install Win10.iso /dev/sdd

# with USB formatting - this didn't work for me, due to a boot loader installation failure
sudo winusb --format Win10.iso /dev/sdd

Source: webupd8.

Mount USB drive in RPi2

By default, the RPi2 limits the power supply of its USB ports to 0.6A, while most USB drives require more than that. This is why most USB drives cannot be used with an RPi2 without an external power supply. Luckily, given that you have a decent power supply (e.g. I have a 2A charger), you can increase the USB port max current by editing `/boot/config.txt`.

IMPORTANT!!! The RPi2 allows max 1.2A through all 4 USB ports combined, so don’t even think about plugging in two USB drives!!! This is because the maximum power into the RPi2 is limited to 2A by the fuse (F1), so if one of your USB devices draws 1A, that leaves only 1A for the RPi + GPIO + the remaining USB devices.

First, check if your drive can be detected without tweaking

sudo blkid

If you see only mmcblk0 entries, this means that your drive is not recognised.

/dev/mmcblk0p1: ...
/dev/mmcblk0p2: ...
/dev/mmcblk0: PTUUID="d2fd971c" PTTYPE="dos"

Now enable the higher USB current by adding the following line to /boot/config.txt:

max_usb_current=1

Reboot and check if your USB drive is visible after restart:

sudo reboot
sudo blkid

Now, you should see new entry:

/dev/mmcblk0p1: ...
/dev/mmcblk0p2: ...
/dev/mmcblk0: PTUUID="d2fd971c" PTTYPE="dos"
/dev/sda1: LABEL="turo"...

And you should be able to mount it:

sudo mkdir -p /media/turo
sudo mount /dev/sda1 /media/turo
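To have the drive mounted automatically after every reboot, you can add it to /etc/fstab (a sketch; the LABEL comes from the blkid output above, while the ext4 type and the options are assumptions to adjust):

```
# /etc/fstab - mount the USB drive at boot; nofail keeps boot going if absent
LABEL=turo  /media/turo  ext4  defaults,noatime,nofail  0  2
```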

Inspired by and