Command-line sharing for Harmattan

I use IRC and I want to be able to share photos there easily. For the n900 I had implemented a sharing plugin and it worked nicely. When I got the n950 I of course wanted to do the same, but it turned out to be a difficult task.

I started to implement webupload and SSO plugins but I never got them to work. The biggest showstopper was the lack of documentation for the SSO part. Finally Mika Suonpää pointed me to Share UI plugins and now, only a few days later, I have the first version working on the n950 :)

For some reason I can’t get my icons to show up; they are always drawn as a red square. Any hints about that are most welcome, as are testing and feedback on the plugin. The plugin settings are in Settings -> Applications -> Command-line Share; from there you need to enable the plugin and set the command to be run. After that the sharing plugin is visible in Gallery -> Share.
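
Under the hood the idea is simply to run the user-configured command with the selected file as an argument. Below is a minimal standalone sketch of that idea with QProcess; the settings keys, the %f placeholder and the default scp command are illustrative assumptions of mine, not necessarily what the plugin itself uses.

  // Minimal sketch: run a user-configured command with the shared file as
  // an argument. The settings keys, the %f placeholder and the default
  // command are illustrative assumptions only.
  #include <QCoreApplication>
  #include <QProcess>
  #include <QSettings>
  #include <QStringList>
  #include <QDebug>

  int main(int argc, char *argv[])
  {
      QCoreApplication app(argc, argv);

      // The real plugin reads the command from its settings page.
      QSettings settings("cmdshare", "cmdshare");
      QString command = settings.value("command",
              "scp %f user@example.com:public_html/").toString();

      // First argument: path of the file picked in Gallery -> Share.
      QString file = (argc > 1) ? QString::fromLocal8Bit(argv[1]) : QString();
      command.replace("%f", file);

      qDebug() << "Running:" << command;
      return QProcess::execute("/bin/sh", QStringList() << "-c" << command);
  }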

The source code can be found here and the corresponding forum thread here.

MeeGo 1.2 ARMv7 chroot (beta)

I’ve always liked the Scratchbox approach to cross-compiling. Run ./configure && make and you have an ARM binary; there’s no need to explicitly tell configure that we are cross-compiling, nor to fix badly behaving build scripts.

MeeGo doesn’t provide an SDK for the (ARM) platform itself. There’s an SDK for building Qt applications and there’s a QEMU image for emulating the ARM device environment. For building the lower level components (Qt itself, GStreamer, etc.) you are expected to use OBS, the Open Build Service. OBS is a very good build infrastructure tool, especially as you can link your own OBS instance to upstream instances like MeeGo’s or openSUSE’s and have your OBS build only your own components or modified upstream components.

But OBS is a bit of an overkill when you are developing your own component that you don’t want to be part of anything bigger yet. The OBS client-side tool, osc, allows you to build components locally in a chroot, but you still need an OBS account, and those aren’t automatically available to everyone, not even in the community OBS.

I took the chroot created by osc build and modified it a bit with help from stskeeps @ #meego-arm. The resulting chroot is capable of building ARMv7 hard-float binaries without OBS or an OBS account. It includes only a minimal set of dependencies to keep the download small (it’s still 162MB), and the project-specific dependencies can be installed normally with zypper from the standard MeeGo repositories.

The benefit of using a chroot over a QEMU image is speed: many of the tools that take the time during a build are installed as native x86 binaries. These include e.g. bash, the compilers and bzip2. Running these as emulated ARM binaries (or natively on ARM hardware) would be much, much slower.

I’ve used this approach with my own Qt + GStreamer project and it has been working well, but that’s only a small use case, so there can still be all kinds of issues. I use the same account inside the chroot as outside and bind mount my $HOME into the chroot, so I can build projects under my $HOME.

Current known issues:

  • Zypper doesn’t find noarch packages (this is an upstream bug, not related to my chroot).
  • Tested only on Debian Squeeze. At some point there was a linker issue on Ubuntu and I don’t know if that still exists.
  • My instructions mention libqt4-devel although the package name is libqt-devel.

If you want to try it out, it’s available via BitTorrent at http://tuomas.kulve.fi/tmp/torrent/ (temporary location). After extracting the tarball, see the readme file in the root directory.

Pleco Phase01 completed

I started playing with microcontrollers in 2005, and if not right at the start then at least very soon I decided to aim for some sort of remote-controlled Linux device with a controllable camera and digital wireless communication. Now, six years later, I have completed the first phase :)

A couple of photos of the earlier devices are shown on the project page.

After several planning iterations and code rewrites I ended up using Qt both on the remote-controlled Gumstix and in the GUI controller. I decided that trying to optimize everything, from memory and CPU consumption to network bandwidth, just isn’t worth the time spent implementing it. The most CPU-intensive task is encoding the video to H.263, and that’s done on the DSP. I’m running MeeGo on the Gumstix and it provides e.g. the GStreamer plugins for the DSP.
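
As an aside, something along the following lines is roughly how DSP-encoded video could be pushed over UDP with GStreamer. This is only a sketch: the element names (in particular dsph263enc from the gst-dsp plugins), the caps and the addresses are assumptions, not the actual Pleco pipeline.

  // Rough sketch of streaming DSP-encoded H.263 over UDP with GStreamer
  // (0.10 era). Element names, caps and addresses are assumptions only.
  #include <gst/gst.h>

  int main(int argc, char *argv[])
  {
      gst_init(&argc, &argv);

      GError *error = NULL;
      GstElement *pipeline = gst_parse_launch(
          "v4l2src ! video/x-raw-yuv,width=320,height=240 "
          "! dsph263enc ! rtph263pay ! udpsink host=192.168.1.10 port=5000",
          &error);
      if (!pipeline) {
          g_printerr("Failed to create pipeline: %s\n", error->message);
          g_error_free(error);
          return 1;
      }

      gst_element_set_state(pipeline, GST_STATE_PLAYING);
      g_main_loop_run(g_main_loop_new(NULL, FALSE));
      return 0;
  }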

Using the Qt framework with a simple self-made protocol over UDP, I got the Phase01 code implemented quite quickly compared to my previous efforts. The protocol allows low-priority packets (like the periodic statistics and the video stream) to be lost and guarantees delivery of high-priority packets (control commands etc.). Also, only the latest control command of each type is retransmitted, i.e. an old packet is not retransmitted if a newer overriding command has already been given.
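
The core of that retransmission idea fits in a few lines of Qt. The sketch below is a simplification with hypothetical names (CommandSender, ackReceived etc.), not the actual Pleco code: low-priority packets are sent once and forgotten, while high-priority commands are kept in a per-type map and retransmitted until acknowledged, so a newer command of the same type simply replaces the older one.

  // Simplified sketch of the priority idea, with hypothetical names and
  // without the real packet framing.
  #include <QObject>
  #include <QUdpSocket>
  #include <QHostAddress>
  #include <QMap>
  #include <QByteArray>

  class CommandSender : public QObject
  {
  public:
      CommandSender(const QHostAddress &peer, quint16 port)
          : m_peer(peer), m_port(port)
      {
          startTimer(200); // retransmit unacknowledged commands every 200 ms
      }

      // Low priority (stats, video): send once, losses are acceptable.
      void sendLowPriority(const QByteArray &payload)
      {
          m_socket.writeDatagram(payload, m_peer, m_port);
      }

      // High priority (control commands): remember only the latest per type.
      void sendCommand(quint8 type, const QByteArray &payload)
      {
          m_pending[type] = payload; // overrides any older command of this type
          m_socket.writeDatagram(payload, m_peer, m_port);
      }

      // Called when the slave acknowledges a command type.
      void ackReceived(quint8 type)
      {
          m_pending.remove(type);
      }

  protected:
      void timerEvent(QTimerEvent *)
      {
          foreach (const QByteArray &payload, m_pending)
              m_socket.writeDatagram(payload, m_peer, m_port);
      }

  private:
      QUdpSocket m_socket;
      QHostAddress m_peer;
      quint16 m_port;
      QMap<quint8, QByteArray> m_pending;
  };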

The controller GUI shows the state the slave sends, like motor speeds, WLAN signal strength, CPU load average and some protocol statistics such as round-trip time and the number of retransmissions.

Currently the motors are controlled with the a, s, d and w keys in 10% steps, and the camera is controlled by dragging the mouse with the left button pressed over the video window.
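
Roughly, the key and mouse handling could look like the sketch below; the widget and the setMotorSpeed()/setCameraPosition() helpers are hypothetical names for illustration, not the actual controller code.

  // Sketch of the controls: w/s adjust speed and a/d adjust turning in 10%
  // steps, dragging with the left button turns the camera.
  #include <QWidget>
  #include <QKeyEvent>
  #include <QMouseEvent>
  #include <QDebug>

  class ControlWidget : public QWidget
  {
  public:
      ControlWidget() : m_speed(0), m_turn(0) {}

  protected:
      void keyPressEvent(QKeyEvent *event)
      {
          switch (event->key()) {
          case Qt::Key_W: m_speed = qMin(m_speed + 10,  100); break; // forward
          case Qt::Key_S: m_speed = qMax(m_speed - 10, -100); break; // reverse
          case Qt::Key_A: m_turn  = qMax(m_turn  - 10, -100); break; // left
          case Qt::Key_D: m_turn  = qMin(m_turn  + 10,  100); break; // right
          default: QWidget::keyPressEvent(event); return;
          }
          setMotorSpeed(m_speed, m_turn);
      }

      void mouseMoveEvent(QMouseEvent *event)
      {
          // Dragging with the left button over the video window turns the camera.
          if (event->buttons() & Qt::LeftButton)
              setCameraPosition(event->pos().x(), event->pos().y());
      }

  private:
      // Stubs; the real code would send high-priority commands to the slave.
      void setMotorSpeed(int speed, int turn)
      {
          qDebug() << "motor speed" << speed << "turn" << turn;
      }
      void setCameraPosition(int x, int y)
      {
          qDebug() << "camera" << x << y;
      }

      int m_speed;
      int m_turn;
  };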

Here’s a video (direct link) of Phase01. You need an HTML5 video capable browser with Ogg Theora/Vorbis codecs.