Pleco Phase 03: Finally drivable

I wrote about Phase 02 in September 2013 and listed goals for Phase 03.

Most of those goals are now implemented, and it is finally a joy to drive the car remotely.

I attached a wide-angle lens to the existing camera and that made a huge difference. The camera crops a bit of the lens's 180° field of view, but clearly the wider the better. I also made my first 3D print to attach the camera to the servos more securely.

The Tegra 3 based Ouya was replaced with a Tegra K1 based Jetson TK1. That made it easy to stream low-latency H264 video over the network. Due to USB 2.0 and network limitations, the best quality video stream that can be enabled from the controller application is 800×600 at 2 Mbps.
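
Just to illustrate the shape of such a stream, here is a plain GStreamer sketch. This is not the project's actual pipeline: it uses the software x264 encoder rather than the Tegra's hardware encoder, and the receiver address is a placeholder.

# Low-latency 800×600 H.264 over RTP/UDP (software encoder shown for illustration;
# 192.168.1.10 stands in for the controller machine's address)
gst-launch-1.0 v4l2src device=/dev/video0 ! \
  video/x-raw,width=800,height=600,framerate=30/1 ! videoconvert ! \
  x264enc tune=zerolatency bitrate=2000 ! rtph264pay ! \
  udpsink host=192.168.1.10 port=5000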

The Microsoft LifeCam drops its frame rate in low-light conditions, so I added some V4L2 controls. The brightness is set to the minimum to get the maximum FPS. In addition to that, the controller application now has manual focus and manual zoom.
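
The same controls can be experimented with from the command line using v4l2-ctl. The control names below are the common UVC ones and the values are just examples; v4l2-ctl -l lists what a given camera actually exposes.

v4l2-ctl -d /dev/video0 --set-ctrl=brightness=30       # set to the camera's minimum for max FPS
v4l2-ctl -d /dev/video0 --set-ctrl=focus_auto=0        # disable autofocus
v4l2-ctl -d /dev/video0 --set-ctrl=focus_absolute=0
v4l2-ctl -d /dev/video0 --set-ctrl=zoom_absolute=100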

The latency is low enough to drive over long distances. In a local wired network, the application round-trip latency (from the controller application to the slave application and back to the controller application) is 1-2 ms. Over my local WiFi it increases to 4 ms. In the video above the car was connected to an LTE network and the communication was routed over a distance of 800 km (the relay is in Stockholm and I live near Helsinki). The latency was around 60-80 ms and it did not introduce any noticeable delay. A friend of mine even drove the car from Gold Coast (that is in Australia, almost 15000 km away!). The latency was around 450 ms, and while the delay was obvious, he was still able to drive it.

I also measured the actual visual delay, i.e. how long it takes for the controller application to show what the camera sees (“photon to display”). I did that by taping an LED to the camera and using an external microcontroller with two light sensors. One sensor was taped to the LED on the camera and the other to the monitor showing the controller application. I then measured the time difference between the two sensors.

The visual latency was about 105 ms with 1 ms network latency. The camera is supposed to run at 30 FPS, which can introduce at most 33 ms of latency, and my monitor is 60 Hz, so it can introduce 16 ms. On average, something else still introduced about 80 ms of latency. That 80 ms may be the sum of getting the video stream from the USB camera to the Jetson, encoding it, sending it, buffering a frame, decoding it and then finally showing it. While it might be possible to get the latency down further, it is already low enough for remote driving within a 1000 km range :)

The project will never end and there are already some goals for Phase 04:

  • GPS
  • AHRS based car orientation visualisation
  • 60 FPS (stereo?) camera
  • Control the webcam based on driver’s head orientation
  • Improved gamepad etc. controls

XBMC for Tegra with full HW acceleration

XBMC running at Ultra HD resolution on Jetson TK1.

Thanks to Markus Tavenrath there is finally a fully accelerated Kodi (previously known as XBMC) for Tegra K1 based devices like the Jetson TK1. Some of the Kodi patches are already upstreamed and the rest will hopefully follow soon. Jetson supports X.Org very well, and things like XRandR based TV refresh rate changes work perfectly.
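
The refresh rate switch itself boils down to a single XRandR call; the output name below is an example, the real one is shown by xrandr -q.

# Switch the TV to 24 Hz for 24p movie playback
xrandr --output HDMI-0 --mode 1920x1080 --rate 24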

I have a simple lab power supply (Mastech DF17132) and I made some quick power measurements with it. I only guesstimated the typical reading instead of taking multiple measurements and calculating averages, so the numbers should be close to the truth but not scientifically accurate. I replaced the power supply that came with the Jetson with the lab supply, so the consumption of the Jetson’s external power supply is not included in the measurements.

The ondemand CPU frequency governor does not provide high enough CPU clocks for a smooth Kodi UI and video playback, so all the Kodi tests have been run with all four CPU cores forced online and with the performance governor.
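
In practice this means poking the standard sysfs knobs as root, something like the following (cpu0 usually cannot be offlined, so it may lack the online file):

for cpu in /sys/devices/system/cpu/cpu[0-3]; do
    [ -e $cpu/online ] && echo 1 > $cpu/online
    echo performance > $cpu/cpufreq/scaling_governor
done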

My USB hub, USB keyboard and USB mouse consume about 0.84 W. The very noisy fan takes about 0.72 W. That is about 1.5 W that is included in the numbers below but maybe should not be, as everybody uses different USB devices and nobody would use that fan in an HTPC setup.

In the full screen video test cases the TV’s refresh rate was changed to 24 Hz; when the Kodi UI is visible, the refresh rate is 60 Hz on the 1080p TV and 30 Hz on the 2160p TV (an HDMI 1.4 limitation).

1920×1080 Sony TV

Test case                           Power consumption (W)
SLiM login                          3.36
XFCE desktop, idle, ondemand        3.36
XFCE desktop, idle, performance     3.72
Kodi main menu                      5.40
Kodi full screen video 1080p24      5.16
Kodi full screen video 1080p60      6.00
Kodi full screen video 2160p24      5.40

In the table below, the resolution used in all cases is Ultra HD, i.e. 2160p or 3840×2160. The term “4K” is misleading, as people usually mean Ultra HD rather than, for example, 4096×2160.

3840×2160 LG TV

Test case                                         Power consumption (W)
XFCE desktop, idle, ondemand                      3.72
XFCE desktop, idle, performance                   4.08
Kodi main menu                                    5.52
Kodi full screen video 1080p24 (max GPU clocks)   6.96
Kodi full screen video 2160p24 (max GPU clocks)   6.96

I think decoding 1080p24 at 5.16 W (or about 3.7 W without the USB peripherals and the fan) is pretty good! Add two watts and you get 2160p.

After watching different trailers over and over, I would definitely like to see movies produced at higher frame rates and distributed at higher bitrates instead of bumping the resolution from Full HD to Ultra HD. That 1080p60 looked awesome.

The biggest complaint I have about the Jetson is the fan. The noise is way too loud even for development use, not to mention for using the Jetson as an HTPC in the living room. There is some discussion about using passive coolers on the NVIDIA forums.

For installation instructions and other tips see the Installing Kodi wiki page.

Ubuntu on Ouya

[Update 2014-09-20]: The latest patches from Markus for the Jetson TK1 also work on Tegra3!

I was testing Markus Tavenrath’s EGL branch of XBMC on my Ouya and noticed that OMX_UseEGLImage segfaults on Debian but not on Ubuntu. The segfault happens inside NVIDIA’s OMX binary, so there is not much to debug. With Ubuntu 13.10 I had rendering issues. Then I tried Debian Jessie, only to discover it is already using X.Org Video ABI 15.

So here is the summary of my discoveries:

  • Debian Wheezy: Segfault in OMX_UseEGLImage with XBMC.
  • Debian Jessie: X.Org Video ABI 15, no driver.
  • Ubuntu 12.04: Old.
  • Ubuntu 12.10: Old.
  • Ubuntu 13.04: No obvious issues.
  • Ubuntu 13.10: X.Org Video ABI 14, rendering problems.
  • Ubuntu 14.04: X.Org Video ABI 15, no driver.
HW accelerated video and GLES2.

I noticed that Totem has moved to GStreamer 1.0, so it no longer uses NVIDIA’s GStreamer plugins. I changed Parole to use nvxvimagesink instead of the de facto xvimagesink and it works well. I’ve uploaded the binary to my tmp.
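
A quick way to check that the NVIDIA sink itself works, independent of any player, is to feed it a test pattern (assuming NVIDIA’s GStreamer 0.10 plugins are installed):

gst-launch-0.10 videotestsrc ! nvxvimagesink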

PulseAudio seemed to cause slow playback and extra CPU consumption, so I just removed it.

My instructions for creating an Ubuntu Raring rootfs for Ouya can be found on GitHub. Do leave a comment if anything in the instructions is broken!

Debian on Ouya: All Systems Go

I finally got all the major subsystems working and Debian is now usable on Ouya. I prefer Debian but Ubuntu should work similarly.

Working video and 3D

In the above screenshot I’m playing 1080p H264 video while running glmark2-es, an OpenGL ES benchmark. The CPUFreq ondemand governor doesn’t even raise the CPU frequency from the minimum, as everything is properly accelerated. I do have to admit that the 1080p video isn’t running smoothly while the benchmark is running at the same time.

A disclaimer before telling more: trying this out might void the warranty, and flashing anything to Ouya may easily brick it forever. So don’t try it unless you are willing to buy a new one.

That said, the idea is not to touch Ouya’s Android at all. The kernel is booted from RAM and Debian runs from an SD card or a USB stick.

The instructions for debootstrapping your own Debian Wheezy rootfs are in the GitHub repository, and the needed binaries are in a temporary location that will probably change at some point.
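
The core of those instructions is a standard two-stage debootstrap; the sketch below shows the general shape (the target directory and the mirror are examples, see the repository for the exact steps):

# First stage on the build host, second stage under QEMU user emulation
sudo debootstrap --arch=armhf --foreign wheezy rootfs http://ftp.fi.debian.org/debian
sudo cp /usr/bin/qemu-arm-static rootfs/usr/bin/
sudo chroot rootfs /debootstrap/debootstrap --second-stage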

WiFi is working as well but I’m not sure if the firmware binaries are redistributable, so you need to copy them from the Android rootfs after booting to Debian.

One of my main goals for this whole exercise is video encoding, and I’m happy to say that H264 encoding from a USB webcam works nicely with hardware acceleration as well.
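
With NVIDIA’s GStreamer 0.10 plugins, the kind of pipeline I mean looks roughly like the one below. Treat it as a sketch: the hardware encoder element name (nv_omx_h264enc here) varies between NVIDIA’s releases, and the capture resolution is just an example.

# Hardware H.264 encoding from a USB webcam to a file
gst-launch-0.10 v4l2src device=/dev/video0 ! \
  'video/x-raw-yuv,width=640,height=480,framerate=30/1' ! \
  nv_omx_h264enc ! matroskamux ! filesink location=webcam.mkv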

I’ve also noticed some issues:

  • hciconfig doesn’t show BT devices; I didn’t debug much further.
  • mpg123 seems to play but nothing is heard. Maybe it tries to pass MP3 onwards instead of PCM? GStreamer provides acceleration for MP3 and AAC, though.
  • Video acceleration works only with GStreamer and nvxvimagesink. MPlayer can’t even use the XV video output correctly.
  • Tegra3 supports only OpenGL ES 2.0 with EGL, so there’s no full OpenGL with GLX.
  • xrandr seems to be able to switch the resolution, but after the switch there was window decoration corruption.

Despite the deficiencies, I think Tegra3 is quite well supported in the X.Org world and many things work well on this $99 device.

If you test my instructions, let me know how it goes and what you think about it! :)

Pleco Phase02 completed

It’s been two years already since I posted about Phase01 being completed. It was a simple tracked vehicle using cheap DC motors, with a hull built from Meccano parts:

Pleco Phase01

For Phase02 I decided to go with a ready-made car and bought an RC rock crawler. It’s about four times the size of the Phase01 vehicle:

Pleco Phase02 open

Pleco Phase02

The software is roughly the same as it was for Phase01. I’ve fixed a lot of bugs, simplified some things and added a bunch of new features. The slave has a sonar next to the webcam that measures the distance to whatever the camera is pointing at. The slave also measures the current consumption and the battery voltage level. All details are shown in the GUI.

The hardware, on the other hand, has changed a lot. Instead of handling the servos directly from Linux, there’s now a separate self-designed Cortex-M4 based microcontroller board for driving the PWM signals. In the future the microcontroller can update the PWM signals in real time based on other sensors, without having to worry about latency issues in Linux.

Linux is now running on a Tegra3 based Ouya gaming device. Ouya is not as convenient for robotics as the Gumstix was, but personally I think Tegra has better Linux support than the OMAPs. And since there is now a separate control board for motors and sensors, it’s enough for the Ouya to have only a USB connection to the control board.

Ouya needs 12 volts while the motors need 5 volts, and I’m expecting to need 3.3 volts as well, so I now have three switching regulators connected to the battery for the needed voltage levels. There’s also a USB hub because of the USB based control board and the webcam, plus lots of cables. So I’m not quite able to fit everything nicely inside the car, but almost.

Controlling an RC car with a keyboard is inconvenient, so I’ve added support for gamepads. I’m currently using a Bluetooth based gamepad from Logitech:

Logitech F710 gamepad

When driving in a local WiFi network, there’s no noticeable latency at all. It’s still not really possible to drive based on the camera alone: the viewing angle is too narrow, turning the camera is cumbersome and the video quality needs some tweaking.

Now that I’ve made some nice progress with the project, I have clear goals for Phase03:

  1. Disassemble the webcam to get its size and weight down and to be able to attach fish-eye lenses to it.
  2. 3D print a frame for the electronics and the webcam.
  3. Tweak the video quality and camera controls so that it would finally be possible to drive based on the video stream.
  4. Control the webcam based on the driver’s head orientation?

So there are plenty of interesting things to learn :)

Oh, and I’ve moved the code to GitHub.

Debian on Ouya

Ouya

Ouya is a $99 Tegra3 based gaming console without a display.

My goal is to use Ouya in my robotics project for video encoding, but first I want to have Debian running there with hardware accelerated X.Org, OpenGL ES and video.

I now have Debian running, but with a few bigger issues that I’ll describe shortly. If you want to try it out, you can find my instructions for creating the rootfs on GitHub. That might void the warranty, even though the kernel is booted from RAM and the rootfs mounted from a USB stick, so the Ouya software should remain intact.

Ouya can easily be bricked permanently, so you need to be careful not to flash anything to it, and you’ll be doing this at your own risk.

About the issues:

HDMI

2013-09-05: HDMI now works with multiple resolutions and is not so picky about having the cable connected already at boot.

The Ouya display kernel driver is hardcoded to use HDMI as the primary display with 1920×1080 resolution. That works relatively well as long as you keep your monitor/TV always on. I couldn’t get anything but a black screen if I e.g. switched my monitor to another input and back.

I tried to enable the normal behaviour in the display driver and by disabling the LVDS in xorg.conf, but the TV remains black and the resolution doesn’t seem to change. Maybe there is still something hardcoded in the display driver, or maybe the driver is trying to enable the non-existent internal panel and gets confused.

Audio

2013-09-11: Got audio working with a simple asound.conf. I also ran /etc/init.d/alsa-utils stop, but I’m unsure if that affected anything.
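
For reference, the whole trick in the asound.conf is picking the right default card. A minimal sketch, assuming the desired card is number 0 (check the numbering with aplay -l):

# Write a minimal ALSA config selecting card 0 as the default
cat > /etc/asound.conf <<'EOF'
pcm.!default { type hw card 0 }
ctl.!default { type hw card 0 }
EOF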

aplay -l lists HDMI as the first device and tegrawm8903 as the second. If I make tegrawm8903 the default, the audio plays but only from the left speaker and at half speed.

I don’t know if this is related to the HDMI problems above or if I just need the correct ALSA mixer configuration.

Video

2013-09-05: Totem has a configurable video sink: gconftool-2 -s /system/gstreamer/0.10/default/videosink nvxvimagesink --type=string

Hardware accelerated video decoding and rendering is supported through GStreamer. Currently videos play well with playbin2 as long as the audio is disabled with audio-sink=fakesink and the correct video sink is selected with video-sink=nvxvimagesink.
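
For example, from the command line (the file URI is a placeholder):

gst-launch-0.10 playbin2 uri=file:///home/user/video.mp4 \
  audio-sink=fakesink video-sink=nvxvimagesink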

Maybe GStreamer’s autovideosink should be patched to use nvxvimagesink instead of xvimagesink, to make nvxvimagesink the default in applications using playbin2.

Power

2013-09-05: CPUFreq with the ondemand governor is working and now enabled in the kernel by default.

CPU core hot-plugging works, but switching to the low power core seems to lead to a kernel bug (arch/arm/mach-tegra/pm.c:509). The same bug seems to happen with the ondemand CPUFreq governor. I haven’t really tried to get those working yet, but both of them work on the Nexus 7, so maybe it’s possible to get them working on Ouya as well.

If you want to test Debian on Ouya, you can find the needed blobs in my /tmp, but I might set up something more appropriate later.

MeeGo 1.2 ARMv7 chroot (beta)

I’ve always liked the Scratchbox approach to cross-compiling. Run ./configure && make and you have an ARM binary; no need to explicitly tell configure that we are cross-compiling, nor to fix badly behaving build scripts.

MeeGo doesn’t provide an SDK for the (ARM) platform itself. There’s an SDK for building Qt applications and there’s a QEMU image for emulating the ARM device environment. For building the lower level components (Qt itself, GStreamer, etc.) you are expected to use OBS. OBS is a very good build infrastructure tool, especially as you can link your own OBS to upstream OBS instances like MeeGo’s or OpenSUSE’s and have your OBS build only your own components or modified upstream components.

But OBS is a bit of an overkill when you are developing your own component that you don’t want to be part of anything bigger yet. The OBS client side tool, osc, allows you to build components locally in a chroot, but you still need an OBS account, and those aren’t automatically available for everyone, not even in the community OBS.

I took the chroot created by osc build and modified it a bit with help from stskeeps @ #meego-arm. The resulting chroot is capable of building ARMv7 hard-float binaries without OBS or an OBS account. It includes a minimal set of dependencies to keep the download small (it’s still 162MB), and project specific dependencies can be installed normally with zypper from the standard MeeGo repositories.

The benefit of using a chroot over a QEMU image is speed: many of the tools that take time during a build are actually installed as native x86 binaries. These include e.g. bash, the compilers and bzip2. Running these as emulated ARM binaries (or natively on ARM hardware) would be much, much slower.

I’ve used this approach with my own Qt + GStreamer project and it has been working well, but that’s only a small use case, so there can still be all kinds of issues. I use the same account inside the chroot and outside, and bind mount my $HOME into the chroot so I can build projects under my $HOME.
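
Setting that up is just a bind mount and a chroot. A sketch, assuming the tarball was extracted to /opt/meego-chroot (see the readme in the tarball for the real steps):

# Bind mount $HOME into the chroot and enter it as the same user
sudo mount --bind "$HOME" /opt/meego-chroot"$HOME"
sudo chroot /opt/meego-chroot su - "$USER"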

Current known issues:

  • Zypper doesn’t find noarch packages (this is an upstream bug, not related to my chroot).
  • Tested only on Debian Squeeze. At some point there was a linker issue on Ubuntu and I don’t know if that still exists.
  • My instructions mention libqt4-devel although the package name is libqt-devel.

If you want to try it out, it’s available via BitTorrent at http://tuomas.kulve.fi/tmp/torrent/ (temporary location). After extracting the tarball, see the readme file in the root directory.

Pleco Phase01 completed

I started playing with microcontrollers in 2005 and, if not at the very start, at least very quickly I decided to aim for some sort of remote controlled Linux device with a controllable camera and digital wireless communication. Now, 6 years later, I have completed my first phase :)

A couple of photos of the earlier devices are shown on the project page.

After several planning iterations and code rewrites I ended up using Qt both on the remote controlled Gumstix and in the GUI controller. I decided that trying to optimize everything, from memory and CPU consumption to network bandwidth, just isn’t worth the implementation time. The most CPU intensive task is the video encoding to H263, and that’s done in the DSP. I’m running MeeGo on the Gumstix and it provides e.g. the GStreamer plugins for the DSP.

Using the Qt framework with a simple self-made protocol over UDP, I got the Phase01 code implemented quite quickly compared to my previous efforts. The protocol allows low priority packets (like periodic statistics and the video stream) to be lost while guaranteeing the delivery of high priority packets (control commands etc.). Also, only the latest control command of each type is retransmitted, i.e. an old packet is not retransmitted if a newer overriding command has already been given.

The controller GUI shows the state the slave sends, like motor speeds, WLAN signal strength, CPU load average and some protocol statistics such as the round trip time and the number of retransmissions.

Currently the motors are controlled using the a, s, d and w keys in 10% steps, and the camera is controlled by dragging the mouse with the left button pressed on top of the video window.

Here’s a video (direct link) of the Phase01. You need an HTML5 video capable browser with Ogg Theora/Vorbis codecs.

Learning OpenGL ES 2.0

I got (again) interested in OpenGL ES 2.0 and bought the OpenGL® ES 2.0 Programming Guide.

The examples from the book can be downloaded from http://www.opengles-book.com/. They are meant to be compiled with Microsoft Visual Studio.

I made a small patch that adds initial support for Linux/X11. It doesn’t support texture loading (because I misplaced my PNG loader somewhere), the WinCreate() is copy-paste code I don’t fully understand and the WinLoop() is missing some functionality. But at least the Chapter 1 Hello Triangle compiles and runs in Fremantle on a Beagle Board :)

Compile and install the libes in a normal autotools manner. Then you can compile the examples by exporting the PKG_CONFIG_PATH and running gcc:


# Adjust PKG_CONFIG_PATH to wherever libes was installed (the prefix below is an example)
export PKG_CONFIG_PATH=/usr/local/lib/pkgconfig
gcc `pkg-config libes --cflags --libs` Hello_Triangle.c -o Hello_Triangle

As usual, patches are welcome :)

ALIP on n8x0

ARM and Movial announced the second stable release of the ARM Linux Internet Platform (ALIP) generic repository. ALIP got other updates as well; see the blog post about them in Movial’s Sandbox or the actual release notes.

The Kaze project for n8x0 devices was updated to use the generic-2 branch as well. There are no prebuilt images provided, but ALIP is relatively easy to compile if one is already familiar with Scratchbox. The ALIP rootfs works with the Maemo kernel and initfs, and it can be booted nicely from an MMC/SD card, so there’s no need to remove Maemo from the device just to test ALIP.


Kaze on n8x0

ALIP requires newer SB components (and specific toolchains), but the newer SB components should work just fine with Maemo targets and Maemo toolchains. Alternatively, the newer SB can be installed from tarballs into a different directory and used concurrently with the Maemo SB.

Follow the From scratch instructions but replace alip-project with kaze-project and don’t pass -c beagleboard. You should pass -c multimedia to include the 3rd party gst-ffmpeg (provided by me, actually) in the build.

If you want to include the WebKit engine and the Midori UI, add “midori” to the components file.

Unfortunately the X driver for OMAPs (xf86-video-omapfb) in the stable branch of the omap repository has a bug concerning n8x0 devices, so you should use its master branch if you want to test video playback. The easiest way to switch this component to the master branch is to clone the n8x0 configuration repository and change the branch before running the matrix install:


git clone git://linux.onarm.com/git/n8x0/config/n8x0.git
vi n8x0/suite/n8x0-recommended
# Add the branch: Component("xf86-video-omapfb", branch="master")
matrix install -c multimedia

After the install you should include the binary DSP tasks from the device (they are proprietary and cannot be distributed). Use the helper script (get_nokia_binaries.sh) in src/platform-n8x0, which fetches them from the device over ssh, and reinstall the component before creating the rootfs image:


matrix install-only -c multimedia platform-n8x0

Lots of things are still broken:

  • The power button tries to suspend, which fails and does nothing.
  • WLAN encryption keys are not stored successfully.
  • WPA doesn’t work (WEP and unencrypted do work).
  • Midori should be started only after networking is up.
  • There’s no ssh client (but there is dbclient, as it’s Dropbear).
  • Power management.
  • The default XFCE theme doesn’t look cool.
  • Etc.

But I believe that with some work, Kaze on n8x0 will become decent enough for everyday use and will provide up-to-date components long after Nokia has dropped n8x0 support.

If you have any questions, visit #alip @ freenode.

PS. If you want an open source media engine with a D-Bus API, check out Octopus. It’s a work in progress but handles basic audio and video playback on the n8x0 just fine :)