Pleco Phase 03: Finally drivable

I wrote about Phase 02 in September 2013 and listed goals for Phase 03.

Most of the goals are now implemented and the car is finally a joy to drive remotely.

I attached a wide-angle lens to the existing camera and that made a huge difference. The camera crops a bit off the 180° angle, but clearly the wider the better. I also made my first 3D print to attach the camera to the servos better.

The Tegra 3 based Ouya was replaced with a Tegra K1 based Jetson TK1. That made it easy to stream low latency H264 video over the network. Due to USB 2.0 and network limitations, the best quality video stream that can be enabled from the controller application is 800×600 at 2 Mbps.
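
A similar stream can be put together stand-alone with gst-launch. This is only a sketch in the same spirit, not the actual pipeline the applications use; it assumes the Jetson's gst-omx H264 encoder and uses placeholder addresses:

# Sender on the Jetson: 800x600 from the UVC camera, hardware H264
# encoding at 2 Mbps, RTP over UDP
gst-launch-1.0 -e v4l2src device=/dev/video0 \
    ! 'video/x-raw, width=800, height=600, framerate=30/1' \
    ! videoconvert ! omxh264enc bitrate=2000000 \
    ! rtph264pay config-interval=1 pt=96 \
    ! udpsink host=192.168.1.2 port=5000

# Receiver on the controller side
gst-launch-1.0 udpsrc port=5000 \
    caps='application/x-rtp, media=video, clock-rate=90000, encoding-name=H264, payload=96' \
    ! rtph264depay ! avdec_h264 ! autovideosink sync=false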

The Microsoft LifeCam decreases its FPS in low light conditions, so I added some V4L2 controls. Brightness is set to the minimum to get the maximum FPS. In addition, the controller application now has manual focus and manual zoom.
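
The same settings are easy to experiment with from the command line using v4l2-ctl. This is a sketch with typical UVC control names; the exact names and ranges vary per camera, so check --list-ctrls first:

v4l2-ctl -d /dev/video0 --list-ctrls                  # show each control and its range
v4l2-ctl -d /dev/video0 --set-ctrl=brightness=30      # minimum value varies per camera
v4l2-ctl -d /dev/video0 --set-ctrl=focus_auto=0       # disable autofocus...
v4l2-ctl -d /dev/video0 --set-ctrl=focus_absolute=10  # ...and set focus manually
v4l2-ctl -d /dev/video0 --set-ctrl=zoom_absolute=1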

The latency is low enough to drive over long distances. In a local wired network, the application latency (from the controller application to the slave application and back) is 1-2 ms. Over my local WiFi it increases to 4 ms. In the video above the car was connected to an LTE network and the communication was routed over a distance of 800 km (the relay is in Stockholm and I live near Helsinki). The latency was around 60-80 ms and it did not introduce any noticeable delay. A friend of mine even drove the car from Gold Coast in Australia, almost 15,000 km away! The latency was around 450 ms, and while the delay was obvious, he was still able to drive.

I also measured the actual visual delay, i.e. how long it takes for the controller application to show what the camera sees (“photon to display”). I did that by taping a LED to the camera and using an external microcontroller with two light sensors. One sensor was taped to the LED on the camera and the other to the monitor showing the controller application. I then measured the time difference between the two sensors.

The visual latency was about 105 ms with 1 ms network latency. The camera is supposed to run at 30 FPS, so it can introduce at most 33 ms of latency, and my monitor is 60 Hz, so it can add up to 16 ms. On average those two account for only about 25 ms (half of their maxima), so something else still introduced roughly 80 ms. That 80 ms is probably the sum of getting the video stream from the USB camera to the Jetson, encoding it, sending it, buffering a frame, decoding it and finally showing it. While it might be possible to get the latency down further, it is already low enough for remote driving within a 1000 km range :)

The project will never end and there are already some goals for Phase 04:

  • GPS
  • AHRS based car orientation visualisation
  • 60 FPS (stereo?) camera
  • Control the webcam based on driver’s head orientation
  • Improved gamepad and other controls

Pleco Phase02 completed

It’s been two years already since I posted about Phase01 being completed. It was a simple tracked vehicle using cheap DC motors, and the hull was built from Meccano parts:

Pleco Phase01

For Phase02 I decided to go with a ready-made car and bought an RC Rock Crawler. It’s about four times the size of the Phase01 vehicle:

Pleco Phase02 open

Pleco Phase02

The software is roughly the same as it was for Phase01. I’ve fixed a lot of bugs, simplified some things and added a bunch of new features. The slave has a sonar next to the webcam that measures the distance to whatever the camera is pointing at. The slave also measures the current consumption and the battery voltage. All details are shown in the GUI.

The hardware, on the other hand, has changed a lot. Instead of handling the servos directly from Linux, there’s now a separate self-designed Cortex-M4 based microcontroller board driving the PWM signals. In the future the microcontroller can update the PWM signals in real time based on other sensors, without having to worry about latency issues in Linux.

Linux is now running on a Tegra3 based Ouya gaming device. The Ouya is not as convenient for robotics as the Gumstix was, but personally I think Tegra has better Linux support than the OMAPs. And since there is now a separate control board for motors and sensors, it’s enough for the Ouya to have just a USB connection to the control board.

The Ouya needs 12 volts, the motors need 5 volts, and I’m expecting to need 3.3 volts as well, so I now have three switching regulators connected to the battery for the needed voltage levels. There’s also a USB hub because of the USB based control board and the webcam, plus lots of cables, so I’m not quite able to fit everything nicely inside the car, but almost.

Controlling an RC car with a keyboard is inconvenient, so I’ve added support for gamepads. I’m currently using a Bluetooth based gamepad from Logitech:

Logitech F710 gamepad

When driving in a local WiFi network, there’s no noticeable latency at all. Still, it’s not really possible to drive based on the camera alone: the viewing angle is too narrow, turning the camera is cumbersome, and the video quality needs some tweaking.

Now that I’ve made some nice progress with the project, I have clear goals for Phase03:

  1. Disassemble the webcam to get the size and weight down and to be able to attach fisheye lenses to it.
  2. 3D print a frame for the electronics and the webcam.
  3. Tweak the video quality and camera controls so that it would finally be possible to drive based on the video stream.
  4. Control the webcam based on driver’s head orientation?

So there are plenty of interesting things to learn :)

Oh, and I’ve moved the code to GitHub.

Debian on Ouya

Ouya

Ouya is a $99 Tegra3 based gaming console without a display.

My goal is to use Ouya in my robotics project for video encoding but first I want to have Debian running there with hardware accelerated X.Org, OpenGL ES and video.

I now have Debian running, but with a few bigger issues that I’ll describe below. If you want to try it out, you can find my instructions for creating the rootfs on GitHub. Following them might void the warranty, even though the kernel is booted from RAM and the rootfs mounted from a USB stick, so the Ouya software itself should remain intact.
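
The gist of creating such a rootfs is a debootstrap run; something like this (a sketch with placeholder paths, see the GitHub instructions for the real steps):

# On the PC: build a Debian armhf rootfs onto the mounted USB stick
# (qemu-debootstrap runs the second stage under QEMU user emulation)
sudo qemu-debootstrap --arch=armhf wheezy /mnt/usbstick http://ftp.debian.org/debian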

The Ouya is easily bricked permanently, so you need to be careful not to flash anything there, and you’ll be doing it at your own risk.

About the issues:

HDMI

2013-09-05: HDMI now works with multiple resolutions and is not so picky about having the cable connected already at boot.

The Ouya display kernel driver is hardcoded to use HDMI as the primary display at 1920×1080. That works relatively well as long as you keep your monitor/TV always on; I couldn’t get anything but a black screen if I e.g. switched my monitor to another input and back.

I tried to enable the normal behaviour in the display driver and to disable the LVDS in xorg.conf, but the TV remains black and doesn’t seem to change the resolution. Maybe there is still something hardcoded in the display driver, or maybe the driver is trying to enable the non-existent internal panel and gets confused.

Audio

2013-09-11: Got audio working with a simple asound.conf. I also ran /etc/init.d/alsa-utils stop, but I’m unsure whether that affected anything.

aplay -l lists HDMI as the first device and tegrawm8903 as the second. If I make tegrawm8903 the default, audio plays but only from the left speaker and at half speed.

I don’t know if this is related to the HDMI problems above or if I just need the correct ALSA mixer configuration.
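
For reference, the usual shape of such a fix is a minimal /etc/asound.conf that routes the default PCM through the plug plugin, which converts sample rate and channel layout (and would explain the half-speed, one-channel symptom). A sketch of that idea, not necessarily the exact file from the update above:

# /etc/asound.conf: make the tegrawm8903 card (card 1) the default and
# let the plug plugin handle rate/channel conversion
pcm.!default {
    type plug
    slave.pcm "hw:1,0"
}
ctl.!default {
    type hw
    card 1
}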

Video

2013-09-05: Totem has a configurable video sink: gconftool-2 -s /system/gstreamer/0.10/default/videosink nvxvimagesink --type=string

Hardware accelerated video decoding and rendering is supported through GStreamer. Currently videos play well with playbin2 as long as audio is disabled with audio-sink=fakesink and the correct video sink is selected with video-sink=nvxvimagesink.
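
From the command line that translates to something like this (GStreamer 0.10 syntax, placeholder path):

gst-launch-0.10 playbin2 uri=file:///home/user/test.mp4 \
    audio-sink=fakesink video-sink=nvxvimagesink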

Maybe GStreamer’s autovideosink should be patched to prefer nvxvimagesink over xvimagesink, to make nvxvimagesink the default in applications using playbin2.

Power

2013-09-05: CPUFreq with the ondemand governor is working and now enabled in the kernel by default.

CPU core hot-plugging works, but switching to the low power core seems to trigger a kernel bug (arch/arm/mach-tegra/pm.c:509). The same bug seems to happen with the ondemand CPUFreq governor. I haven’t really tried to get those working yet, but both work on the Nexus 7, so maybe it’s possible to get them working on the Ouya as well.
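
Both are poked through the standard sysfs interfaces, so testing them needs nothing Ouya specific:

# Select the CPUFreq governor for a core
echo ondemand > /sys/devices/system/cpu/cpu0/cpufreq/scaling_governor
# Hot-unplug and re-plug a secondary core
echo 0 > /sys/devices/system/cpu/cpu1/online
echo 1 > /sys/devices/system/cpu/cpu1/online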

If you want to test Debian on Ouya, you can find the needed blobs from my /tmp but I might set up something more appropriate later.

Pleco Phase01 completed

I started playing with microcontrollers in 2005 and, if not right from the start, at least very quickly decided to aim for some sort of remote controlled Linux device with a controllable camera and digital wireless communication. Now, 6 years later, I have completed the first phase :)

A couple of photos of the earlier devices are shown on the project page.

After several planning iterations and code rewrites I ended up using Qt both on the remote controlled Gumstix and in the GUI controller. I decided that trying to optimize everything from memory and CPU consumption to network bandwidth just isn’t worth the implementation time. The most CPU intensive task is encoding the video to H263, and that’s done on the DSP. I’m running MeeGo on the Gumstix and it provides e.g. the GStreamer plugins for the DSP.
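
As an illustration of what the DSP plugins enable, a stand-alone encoding pipeline can be tested with gst-launch. This is only a sketch assuming the gst-dsp element names and a placeholder address; the actual application builds its pipeline in code:

gst-launch-0.10 v4l2src ! video/x-raw-yuv,width=320,height=240,framerate=15/1 \
    ! dsph263enc ! rtph263pay ! udpsink host=192.168.0.2 port=5000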

Using the Qt framework with a simple self-made protocol over UDP, I got the Phase01 code implemented quite quickly compared to my previous efforts. The protocol allows low priority packets (like periodic statistics and the video stream) to be lost while guaranteeing the delivery of high priority packets (control commands etc.). Also, only the latest control command of each type is retransmitted, i.e. an old packet is not retransmitted if a newer overriding command has already been given.

The controller GUI shows the state the slave sends, like motor speeds, WLAN signal strength, CPU load average, and protocol statistics such as round-trip time and the number of retransmissions.

Currently the motors are controlled using the a, s, d and w keys in 10% steps, and the camera is controlled by dragging the mouse with the left button pressed on top of the video window.

Here’s a video (direct link) of Phase01. You need an HTML5 video capable browser with Ogg Theora/Vorbis codecs.

ALIP on n8x0

ARM and Movial announced the second stable release of the ARM Linux Internet Platform (ALIP) generic repository. ALIP got other updates as well; see the blog post about them in Movial’s Sandbox or the actual release notes.

The Kaze project for n8x0 devices was updated to use the generic-2 branch as well. There are no pre-built images provided, but ALIP is relatively easy to compile if one is already familiar with Scratchbox. The ALIP rootfs works with the Maemo kernel and initfs, and it can be booted nicely from an MMC/SD card, so there’s no need to remove Maemo from the device just to test ALIP.


Kaze on n8x0

ALIP requires newer Scratchbox (SB) components (and specific toolchains), but the newer SB components should work just fine with Maemo targets and Maemo toolchains. Alternatively, the newer SB can be installed into a different directory from tarballs and used concurrently with the Maemo SB.

Follow the From scratch instructions but replace alip-project with kaze-project and don’t pass -c beagleboard. You should pass -c multimedia to include the 3rd party gst-ffmpeg (provided by me, actually) in the build.

If you want to include the WebKit engine and the Midori UI, add “midori” to the components file.

Unfortunately the X driver for OMAPs (xf86-video-omapfb) in the stable branch of the omap-repository has a bug concerning n8x0 devices, and you should use its master branch if you want to test video playback. The easiest way to switch this component to the master branch is to clone the n8x0 configuration repository and change the branch before running the matrix install:


git clone git://linux.onarm.com/git/n8x0/config/n8x0.git
vi n8x0/suite/n8x0-recommended
# Add the branch: Component("xf86-video-omapfb", branch="master")
matrix install -c multimedia

After the install you should include the binary DSP tasks from the device (they are proprietary and cannot be distributed). Use the helper script (get_nokia_binaries.sh) in src/platform-n8x0, which fetches them from the device over ssh, and reinstall the component before creating the rootfs image:


matrix install-only -c multimedia platform-n8x0

Lots of things are still broken:

  • Power button tries to suspend, which fails and does nothing.
  • WLAN encryption keys are not stored successfully.
  • WPA doesn’t work (WEP and unencrypted do work).
  • Midori should be started after networking.
  • There’s no ssh client (only dbclient, as it’s dropbear).
  • Power management.
  • Default XFCE theme doesn’t look cool.
  • Etc.

But I believe that with some work, Kaze on n8x0 will become decent enough for everyday use and will provide up-to-date components long after Nokia has dropped n8x0 support.

If you have any questions, visit #alip @ freenode.

PS. If you want an open source media engine with a D-Bus API, check out Octopus. It’s a work in progress but handles basic audio and video playback on the n8x0 just fine :)

Linux on ARM

ARM (and Movial) has published a new site that provides Open Source components, middleware and utilities used to build a Linux Mobile software stack on ARM.

All components (applications, libraries, etc.) are in Git repositories. The build tool is called Matrix. Matrix clones all components under one directory and compiles them with a single command. With another command you get a JFFS2 image, although that’s not as simple as it should be.

ARM would like to get all contributions directly upstream instead of providing large code dumps, and states that developers are encouraged to participate in the discussion forums and developer communities of the respective components used on the site. That’s why there are no new mailing lists or forums for the platform. There is #matrixhelp (#matrix was taken) on irc.ipv6.oftc.net for Matrix related issues, though. Developing the components is convenient if you are familiar with Git: it’s easy to test whether your patch works and send it to the upstream project.

One of the supported hardware platforms is the n8x0, which is nice as it’s commonly available. The downside is its closed source nature. There are two projects, example-project and Kaze, that have the n8x0 configured as one target platform. Kaze has the XFCE desktop instead of the Matchbox desktop that the example-project uses.

Kaze boots, but most features still need work. WLAN works without encryption, but WEP and WPA encryption need to be fixed. ALSA works with the ALSA plugins through the DSP, but the closed source DSP tasks need to be copied to the build system. Kaze has normal X.Org instead of Xomap, so there’s no XV extension, only stubs.

2.6.25 and BT working on Gumstix

I finally got Bluetooth working on my Gumstix running 2.6.25 with my Linux setup. My patches are against a two-week-old kernel but hopefully apply to the current HEAD too. The patches also include my kernel config: gumstix-verdex-bt.config.

I had to configure the BT hardware using pxaregs and /proc/gpio in my startup scripts (copied from some OE image):


echo -n "Starting 32kHz clock..."
/usr/sbin/pxaregs OSCC_OON 1
while /usr/sbin/pxaregs OSCC_OOK | tail -n 1 | grep -q -v 1;do
echo -n '.'
sleep 1
done
echo "Settled"

/sbin/modprobe gumstix_bluetooth
/sbin/modprobe proc_gpio

echo "AF3 out" > /proc/gpio/GPIO9

echo "AF1 in" > /proc/gpio/GPIO42
echo "AF2 out" > /proc/gpio/GPIO43
echo "AF1 in" > /proc/gpio/GPIO44
echo "AF2 out" > /proc/gpio/GPIO45

I also had to patch my hciattach from bluez-utils 3.29. Those patches are here.

UBIFS and Gumstix

I wanted to try something other than JFFS2 as the rootfs and decided to go with UBIFS. Thanks to Git’s brilliance I had no trouble pulling the UBIFS kernel patches from their tree into mine.

It seems that UBIFS hasn’t had many NOR flash users before me, and it needed some fixes. Artem Bityutskiy was extremely helpful in fixing the deficiencies and helping me out. After a few debugging rounds I now have a UBIFS root on my Gumstix.

I created the UBIFS image with the following commands:


# -m 1: minimum I/O unit is 1 byte (NOR flash); -e: logical eraseblock
# size; -c: maximum number of eraseblocks, which caps the filesystem size
sudo mkfs.ubifs --compr=zlib -r /tmp/rootfs -m 1 -e 130944 -c 120 -o ubifs.img
# -p: physical eraseblock size of the flash
ubinize -o ubi.img -m 1 -p 128KiB -v ubinize.cfg

With this ubinize.cfg:


[ubifs]
mode=ubi
image=ubifs.img
vol_id=0
vol_size=13MiB
vol_type=dynamic
vol_name=rootfs
vol_alignment=1
vol_flags=autoresize

Note that I have reserved 2MiB for the kernel partition.

I had to add one extra parameter to the kernel arguments to specify which MTD partition I wanted to use:

console=ttyS0,115200n8 root=ubi0:rootfs rootfstype=ubifs reboot=cold,hard ubi.mtd=1

Now my Gumstix boots with a simplified kernel to a BusyBox shell in roughly 4.2 seconds (counted from the bootm command in U-Boot).

2.6.25 running on Gumstix

I had to add a new category after all: gumstix. The word embedded has too broad a meaning.

Updating the kernel wasn’t such a big deal after all, even for a kernel n00b like me. I cloned the vanilla tree from kernel.org with git and applied the generic (and one Bluetooth) Gumstix patches from OE: arch-config.patch, board-init.patch, header.patch, mach-types-fix.patch, modular-init-bluetooth.patch, tsc2003-config.diff, tsc2003.c, and uImage-in-own-partition.patch. Most of them applied fine; I just had to apply the simple Makefile patches manually and change the pxa_init_irq call to pxa27x_init_irq.
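
Roughly the procedure, with the patches collected into a local directory (here called gumstix-patches; the tsc2003.c destination is illustrative):

git clone git://git.kernel.org/pub/scm/linux/kernel/git/torvalds/linux-2.6.git
cd linux-2.6
for p in arch-config.patch board-init.patch header.patch \
         mach-types-fix.patch modular-init-bluetooth.patch \
         tsc2003-config.diff uImage-in-own-partition.patch; do
    patch -p1 < ../gumstix-patches/$p
done
# tsc2003.c is a plain source file rather than a patch, so it is copied
# into the tree instead of applied
cp ../gumstix-patches/tsc2003.c drivers/i2c/chips/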

I don’t know if it really works, but at least it booted:


root@gumstix-custom-verdex:~$ uname -a
Linux gumstix-custom-verdex 2.6.25-rc8-00151-gdc41023 #2 Fri Apr 4 23:43:25 EEST 2008 armv5tel unknown

Gumstix

It’s time to create a new category in my projects blog: embedded.

I got interested in microcontrollers some years ago and wrote something simple for a 16F88 in assembler. Then I connected it to an old iPAQ using a serial connection. But the iPAQ isn’t a very good development platform, so now I decided to buy a Gumstix.

Gumstix Verdex

It’s a 400MHz verdex mainboard with 64M of RAM and 16M of flash. And Bluetooth. The mainboard is the smaller board, upside down in the picture. The bigger board is an expansion board including 3x RS232, a power plug, a USB mini-B connector and a bunch of GPIO lines.

The bootloader is U-Boot and the Linux distribution is based on OpenEmbedded. U-Boot is nice as I’m somewhat familiar with it, but I’m going to replace OE with something else.

I just booted it up to see how it looks inside:


cat /proc/cpuinfo
Processor : XScale-PXA270 rev 7 (v5l)
BogoMIPS : 415.33
Features : swp half thumb fastmult edsp iwmmxt
CPU implementer : 0x69
CPU architecture: 5TE
CPU variant : 0x0
CPU part : 0x411
CPU revision : 7
Hardware : The Gumstix Platform

Maybe the first thing to do would be to configure the kernel to support only the features present in my hardware setup.