Sunday, December 31, 2017

Year 2017 overview: Duke's Choice Award and more... moving on to 2018

Picture taken by Neo
Hi everyone,
  the year 2017 ends soon... so we would like to give a brief summary of what we have achieved. Before we move on, we would like to thank all our fans and supporters; it was an amazing year for the Robo4J project! 

The first great achievement was winning the Duke's Choice Award 2017 for Java ecosystem innovation. The Robo4J Team feels very honored to hold this award! 

The next milestone was the well-received presentation at JavaOne 2017: "From Concept to Robotic Overlord with Robo4J". We again felt very honored by the large audience, and we were very happy to answer the many questions that came up after the session. 

A few weeks after JavaOne we held our next presentation, at Devoxx Belgium 2017: "Much Robots, Very Java so IoT". Although our time slot was much shorter than at JavaOne, the audience was big and amazing. We were grateful again! Hopefully next time, with a bigger time slot, we can build some cool IoT stuff on stage. :)

Although it was a very busy time due to our day jobs and the preparations for all the conferences, we have moved the Robo4J project forward:

1. Robo4J has native JSON support for HTTP socket communication
    Native JSON support makes it possible to talk to Robo4J systems simply by defining traditional Java classes; the developer doesn't need to care much about anything else. Such classes are automatically serialized and deserialized behind the scenes. This feature allows Robo4J to be connected to any other system that communicates using JSON, while the developer keeps the advantages of a strongly typed language. It also makes socket codec definitions very simple. See the example below... more is coming in 2018:
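
As an illustration only (the class name and fields below are made up, and the way such a class is registered with a particular codec/unit is not shown here), a "traditional Java class" used as a message type is really just a plain bean whose fields map to JSON properties:

public class MoveCommand {
    private String direction; // e.g. "forward", "back", "left", "right"
    private int speed;        // e.g. a speed value in percent

    // no-arg constructor so an instance can be created during deserialization
    public MoveCommand() {
    }

    public MoveCommand(String direction, int speed) {
        this.direction = direction;
        this.speed = speed;
    }

    public String getDirection() { return direction; }
    public void setDirection(String direction) { this.direction = direction; }

    public int getSpeed() { return speed; }
    public void setSpeed(int speed) { this.speed = speed; }
}

A JSON payload such as {"direction":"forward","speed":50} would then be mapped to and from instances of this class by the framework.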



2. Robo4J SNAPSHOT build alpha-3.0.0 available for everyone! 
A few days ago we published the Robo4J SNAPSHOT release. It allows every developer to simply import the Robo4J binaries directly into a project using the Maven or Gradle build system. We have published a simple Gradle example that initiates a basic Robo4J system: github project

And that's basically all for this year! 
For next year the Robo4J team is already "cooking" some surprises, not only around Java Mission Control and Flight Recorder (thanks to Marcus, Robo4J has had native support for them since the very beginning) but also some very cool features related to sockets and network communication... And that is still not all ;) 

Stay tuned and Happy New Year 2018!
Robo4J Team!

Monday, September 4, 2017

Where to Meet Robo4J in 2017 - a Conference List

Hi all!

Sorry about the radio silence. We've been busy prepping for the upcoming conferences!

Here is a list of conferences where there will be a Robo4J talk in the next few months:

Date Conference Location
2017-09-14 NetBeans Days München 2017 Munich / Germany
2017-09-30 JavaOne 4 Kids 2017 San Francisco / USA
2017-10-03 JavaOne 2017 San Francisco / USA
2017-11-06 Devoxx 2017 Antwerp / Belgium

We are looking forward to seeing you there!

/The Robo4J team

Sunday, July 9, 2017

Magnetometer Calibration

Since it is a fair bit of a pain for me to get pictures into these blog posts using my go-to blogging tool, I've posted an article on my personal blog about how to calibrate magnetometers in Robo4J:

http://hirt.se/blog/?p=796

Hope this helps!

/M

Saturday, May 20, 2017

Be ready and prepare Raspberry Pi for Java 9

   Java 9 is coming soon with many new features. Aside from the continuing discussion about the impact of Project Jigsaw, it's time to get prepared.
This article helps you prepare your Raspberry Pi environment with multiple installed JDKs.
The whole process is quite straightforward: 

The first step is to download the latest available JDK 9 build. 
In this article we use the most recent early-access build: jdk-9+170


You can download the JDK directly onto your Raspberry Pi, or download it to your local machine and then upload it to the Raspberry Pi using the scp command:
$ scp ./jdk-9-ea+170_linux-arm32-vfp-hflt_bin.tar.gz pi@<raspberrypi_ip>:/home/pi

The next step is to log in to your Raspberry Pi:
$ ssh pi@<raspberrypi_ip>

To extract the downloaded JDK archive, execute the following command:
$ sudo tar zxvf jdk-9-ea+170_linux-arm32-vfp-hflt_bin.tar.gz -C /opt

The command extracts jdk-9 into the /opt folder.

Now link java and javac to the newly installed JDK:
$ sudo update-alternatives --install /usr/bin/javac javac /opt/jdk-9/bin/javac 1
$ sudo update-alternatives --install /usr/bin/java java /opt/jdk-9/bin/java 1

Using the following commands, select jdk-9, which is listed under number 1:
$ sudo update-alternatives --config javac
$ sudo update-alternatives --config java

Finally, verify the current Java version:
$ java -version
java version "9-ea"
Java(TM) SE Runtime Environment (build 9-ea+170)
Java HotSpot(TM) Client VM (build 9-ea+170, mixed mode)

$ javac -version
javac 9-ea
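
If you also want to check from inside a running program which JDK is actually active, a tiny class like the following will do (this is plain standard Java, nothing Robo4J specific):

public class VersionCheck {
    public static void main(String[] args) {
        // prints e.g. "9-ea" and the name of the VM currently in use
        System.out.println(System.getProperty("java.version"));
        System.out.println(System.getProperty("java.vm.name"));
    }
}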


Congratulations, your Raspberry Pi is now running Java 9 and you can switch between JDKs.

Monday, April 17, 2017

Robo4J and the Java Flight Recorder (Laser Scanner Calibration)

The Java Flight Recorder (JFR) is a powerful feature in the Oracle JDK. It allows you to record information about the Java runtime, and the application running in the Java runtime. When designing robots, it can be an invaluable tool. Robo4J has some integration with the Java Flight Recorder out of the box.

In this blog, I will use the JFR integration to simplify the calibration necessary to make use of the LaserScanner class.

Calibrating Servos

Step one is to calibrate whatever servos are involved in doing a scan. I have a pan servo and a tilt servo. Only the pan servo is actually involved in the LaserScanner class - it provides 2D scans. That said, I need for the tilt servo to be calibrated properly too, if nothing else to ensure that I get level scans. To calibrate the servos, see my Servo Calibration blog.

First Configuration

The second step is to configure reasonable defaults for the configuration file for the LaserScannerExample ($ROBO4J_HOME/robo4j-units-rpi/src/examples/resources/lidarexample.xml). The most important part is to copy the values from your calibrated pan servo in the first step to the laserscanner.servo entry. Also, ensure that you set the minimum acquisition time depending on the version of the Lidar Lite you have. The v1 requires at least 20 ms. The v2 and later should be okay with 2ms. For the rest of the settings, leave the defaults for now.

Doing Flight Recordings

The next step is to run a few scans using the LaserScannerExample and look at the results.
First source the environment variables:
source scripts/rpi/environment.sh
Next run the LaserScannerExample with the flight recorder enabled. We will connect over JMX, so you need to open the appropriate ports, for example:
sudo java -XX:+UnlockCommercialFeatures -XX:+FlightRecorder -Dcom.sun.management.jmxremote.port=7091 -Dcom.sun.management.jmxremote.rmi.port=7091 -Dcom.sun.management.jmxremote.authenticate=false -Dcom.sun.management.jmxremote.ssl=false -Djava.rmi.server.hostname=coffewlan -cp $ROBO4J_PATH com.robo4j.units.rpi.lidarlite.LaserScannerExample
Next, get the robo4j-tools repo for Robo4J, and follow the instructions to start Java Mission Control (JMC) with the Robo4J plug-in.

Create a connection to your RaspberryPi running the example. You can connect with the JMX console to see that the connection works.


Next create a recording from Java Mission Control by right-clicking on your connection and selecting "Start Flight Recording". You can use any recording template, just make sure to enable the Robo4J specific events on the last page of the wizard.




Click finish to start the recording. Once done, it will be downloaded to JMC automatically. Now, since you are using the Robo4J plug-in, the scans will be visualized for you. Open the Robo4J tab group, and select a scan to take a look at it.

If you suspect your servo is travelling too fast or too slow (multiple sample points at the end with the same reading(s), weird artifacts at the end of a scan), adjust the angularSpeed. If you feel the laser does not get enough time to acquire, increase the minimum acquisition time. Also, since we move the servo whilst acquiring, you may need to compensate for the angular speed. Since I was lazy and didn't have time to spend looking for a perfect physical model, there is a trim parameter to compensate (to make the left-to-right and right-to-left scans align).

Below is a picture of a couple of individual scans on my robot without trim (simply shift- or control-click a few scans to render them simultaneously):

And the picture below is with trim set to 5.5:

Simply keep doing recordings and bisect your way to a proper trim value for your particular setup.

Summary

This blog showed that Robo4J has built-in support for the Java Flight Recorder, and that this support is quite useful for calibrating the laser range finder (the LaserScanner class). It is also quite useful for trying out different algorithms for feature extraction and mapping using pre-existing data, but that is for another night and another blog.

Servo Calibration in Robo4J

In this blog I will show how to do simple servo calibration in Robo4J. This is, for example, important to get the laser rangefinder to work properly.

First of all we will need to configure two servo units. In our case we will use an Adafruit 16CH 12-bit PWM generator, but any RoboUnit taking a float value (normally normalized, between -1 and 1) can be used.

First sync your robo4j git repo and build everything:
git pull
./gradlew build
Next set up a classpath variable you can use. Normally just running the environment.sh script for the RaspberryPi will be fine. If not, edit the script to fit your setup.
source scripts/rpi/environment.sh
First off, copy the settings file for the calibration test. It is located in $ROBO4J_HOME/robo4j-units-rpi/src/examples/resources/calibration.xml.
For example:
cp $ROBO4J_HOME/robo4j-units-rpi/src/examples/resources/calibration.xml /tmp
Next edit the file to suit your setup. If you are about to calibrate your laser scanner, make sure that your pan servo is represented in the file. Also set all trim values to 0, and any inverted values to false. Next run the little calibration utility:
sudo java -cp $ROBO4J_PATH com.robo4j.units.rpi.pwm.CalibrationUtility /tmp/calibration.xml
You should now see something like this:

State before start:
RoboSystem state Uninitialized
================================================
  pan Initialized
  tilt Initialized

State after start:
RoboSystem state Started
================================================
  pan Started
  tilt Started


Type the servo to control and how much to move the servo, between -1 and 1. For example:
pan -1.0
Type q and enter to quit!

This is pretty self-explanatory. Simply refer to the servo by name, and then set the value for it. If the servo name is “pan” in the configuration file, “pan -1.0” followed by enter will move the servo to one extreme.

For pan servos for the laser rangefinder, -1 should be full left and 1 full right. Ensure that this is the case; if not, change the inverted value for the servo configuration in the settings file. Next set the servo to 0. If the servo is off center when set to 0, carefully unscrew the servo arm without changing the position of the servo. Then re-attach the arm as close to centered as possible.

For the final tuning, use the trim value in the servo configuration to ensure that the servo is properly centered. If you are going to use the servo for panning a laser, also make a note of how many degrees of arm rotation a move from the center (0) to an extreme (-1 or 1) corresponds to.
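
That number is worth writing down because it lets you convert between angles and the normalized servo input later on. A small sketch of the arithmetic (the 45.0f below is just an assumed measurement for your particular servo, not a Robo4J constant):

public final class PanMath {
    // measured degrees of arm rotation between input 0 (center) and input 1 (extreme)
    private static final float DEGREES_PER_UNIT = 45.0f;

    // converts a desired pan angle in degrees into the normalized [-1, 1] servo input
    public static float toServoInput(float angleDegrees) {
        float input = angleDegrees / DEGREES_PER_UNIT;
        return Math.max(-1.0f, Math.min(1.0f, input));
    }
}

With 45 degrees per unit, a pan of +30 degrees becomes an input of 30 / 45 ≈ 0.67, which is the kind of value you would then send to the "pan" unit.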

That's it! When you are coding/setting up your robot, simply use the settings in your calibration settings file.

Summary

This blog showed how to go about calibrating a servo in Robo4J using the very simple CalibrationUtility. This is an especially important step when the servo will be used together with the LaserScanner, something I will talk about in a future post.

Monday, April 10, 2017

5 Things I discovered about IoT at Javaland



Last month I presented the Robo4J framework at the InnovationLab during the Javaland conference (28-30.03.2017). It was really an amazing experience, not only because of the nice conference venue but also because of all the things I noticed during hours of conversations. Robo4J had a small stand there and it was possible to show a couple of examples live. 

The first thing I realised pretty quickly, and appreciated, was the selection of the examples I had taken with me. I packed a Lego Mindstorms Education set with Lego sensors and motors, Raspberry Pis with external batteries, and some LCDs and sensors (all from Adafruit). The Lego is made for kids and pretty sturdy, and Adafruit provides really good quality that is robust enough too. Everything worked perfectly over the whole conference without any problems.
The second important thing was my DNS/DHCP server, configured on one of my Raspberry Pis, which provided a stable WiFi network. It was crucial, because without it none of the examples would have fully worked. 

The third thing I appreciated was the interest of the attendees. Most people who saw all the unconnected hardware elements lying on the table asked about them. Maybe they didn't believe it could work. Sure, Java and hardware is not an easy combination; it takes some effort to make it work. I had disconnected all the components because I wanted to show how easy it is to build a working system using the Robo4J framework. We then spent some time connecting all the pieces together and writing some simple Robo4J applications. In the end we ran the applications and they worked. Based on the feedback I received, my impression was that people were surprised. They were impressed by how easy it can be to use Java and hardware together. 

The fourth thing I noticed was that when people saw what Robo4J can do for IoT development, it turned into a discussion about its usage. I warned everyone that Robo4J is currently still in an alpha version. Even so, there are already a lot of use cases where people could employ Robo4J. Many attendees asked how to set up the Lego Mindstorms set they had bought for their kids. They had never used Java on it because it was not easy to set it all up. As a consequence of those discussions we have prepared two helpful blog posts.
The fifth thing I discovered is that people try hard to use the Raspberry Pi together with Python. All the pre-prepared examples are very easy for users to run, but sooner or later they struggle with how to continue the development. I met a couple of people from robotics startups, and they told me that they would highly appreciate the possibility to employ Java. Aside from static typing, Java has a bigger and more active community, and in my opinion there are many more useful libraries available. Java also gives you better control over your application runtime, and much more.

Summary
Javaland was extremely helpful for Robo4J. We got really nice feedback on the framework itself. There is still much work to be done before we release the first version. We are looking forward to implementing your feedback in Robo4J.


Miro, for the Robo4J Team

Monday, April 3, 2017

How to Prepare Lego EV3 for Robo4J (install leJOS)

First download the LeJOS related files:

1. Download leJOS_EV3_0.9.1-beta.tar.gz
2. Download Java for Lego Mindstorm EV3 ejdk-8-fcs-b132-linux-arm-sflt-03_mar_2014.tar.gz

Next install LeJOS on the SD card:

1. According to the manual, you should use an SD card bigger than 2GB
2. Format the SD card (on Mac you can use SDFormatter)
3. Unpack leJOS_EV3_0.9.1-beta.tar.gz
4. Go to the unpacked folder
5. Copy the lejosimage.zip file to the SD Card
6. Unpack the lejosimage.zip file
7. Move all the unpacked content to the root of the SD card
8. Copy ejdk-8-fcs-b132-linux-arm-sflt-03_mar_2014.tar.gz to the root of the SD card
9. Put the SD card back into the EV3 Lego brick
10. Start the Lego brick

We recommend setting up WiFi on your Lego brick. Such a setup allows you to use an ssh client to connect to the brick and the scp utility to upload files. The default password is not set (any character sequence will be accepted).

Getting Started with Robo4J

This blog post will show you how to quickly get started with Robo4J and will explain some key concepts.

Robo4J contains some key core “modules”. We will use the word “module” loosely here, as they are not quite yet JDK 9 modules. These modules are all defined as their own Gradle projects and have their own compile time dependencies expressed in Gradle.

When you build your own robot, you will normally have a dependency on the robo4j-core and the units defined for the platform you are using for your robot. For example, robo4j-units-rpi for the Raspberry Pi, or robo4j-units-lego for a Lego EV3 robot.


In this blog post I will use the provided Raspberry Pi LCD example to show how to set up and run Robo4J on the Raspberry Pi. Miro will post a Lego example later.

The modules are:
  • robo4j-math: Commonly used constructs such as points, scans (for laser range scans), feature extraction and similar. Most modules depend on this module.
  • robo4j-hw-*: Platform specific modules with easy to use abstractions for a certain hardware platform. For example robo4j-hw-rpi, which contains easy to use abstractions for common off-the-shelf hardware from Adafruit and Sparkfun.
  • robo4j-core: Defines the core high level concepts of the Robo4J framework, such as RoboContext, RoboUnit and RoboReference.
  • robo4j-units-*: Platform specific modules which define high level RoboUnits that can be included into your projects. These modules basically tie the hardware specific modules into the Robo4J core framework, providing ready to use units that can simply be configured, either in XML or code.


Note that the robo4j-hw-* modules can be used without buying into the rest of the framework. If you simply want to bootstrap the use of the hardware you’ve purchased, that is certainly possible. Hopefully you will find the rest of the framework useful enough though.

The first thing you want to do is pull the robo4j core repo:
git clone https://github.com/Robo4J/robo4j.git
Once git is done pulling the source, you will want to install the modules into the local maven repository:
gradlew install
Now you are ready to pull the source for the LCD example:
git clone https://github.com/Robo4J/robo4j-rpi-lcd-example.git
We will finally build it all into a runnable fat jar:
gradlew :fatJar
Now, to run the example, simply run the jar:
java -jar build\libs\robo4j-rpi-lcd-example-alpha-0.3.jar
If you are not on a Raspberry Pi with an Adafruit LCD shield connected to it, you can still run the example on any hardware by asking the LCD factory for a Swing mockup of the Adafruit LCD:
java -Dcom.robo4j.hw.rpi.i2c.adafruitlcd.mock=true -jar build\libs\robo4j-rpi-lcd-example-alpha-0.3.jar
(This particular example can actually be run without using Robo4J core at all; see my previous post for a robo4j-hw-rpi version of the example.)

Now, let’s see how the example is actually set up. There are two example launchers in the project. One sets up Robo4J using an XML configuration file, and the other sets everything up in Java. Both are valid methods, and sometimes you may want to mix them; this is supported by the builder. I will use the XML version in this blog.

In this example there are mainly two units that we need to configure. A button unit and an LCD unit. They are both actually on the same hardware address (Adafruit uses an MCP23017 port extender to talk to both the buttons and the LCD using the same I2C address). However, it is much nicer to treat them as two logical units when wiring things up in the software, and this is exactly how Robo4J treats them.

Here is the most relevant part of the XML-file:
Note that the AdafruitLcdUnit and the AdafruitButtonUnit are provided for us, and they are simply configured for the example. Also note that they reside on the same hardware address, as expected. No code is changed or added for configuring these units. The controller unit is however ours, and it defines what should happen when a button is pressed.

The following snippet of source shows how to instantiate it all (focusing on the most relevant parts of the code):
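
The original snippet was published as an image, so here is a rough sketch of what the launcher does, based on the description that follows. The package names, the exact RoboBuilder/RoboReference method signatures, the configuration file name and the LcdMessage constructor are assumptions for illustration; check the example repository for the real source.

// package and class names below are assumptions; adjust to your Robo4J version
import com.robo4j.core.RoboBuilder;
import com.robo4j.core.RoboContext;
import com.robo4j.core.RoboReference;
import com.robo4j.units.rpi.lcd.LcdMessage;

public class LcdExampleMain {
    public static void main(String[] args) throws Exception {
        RoboBuilder builder = new RoboBuilder();
        // add the XML configuration describing the lcd, button and controller units
        builder.add(LcdExampleMain.class.getClassLoader()
                .getResourceAsStream("robo4jUnits.xml")); // file name assumed
        RoboContext system = builder.build();

        // starting the context starts the individual RoboUnits
        system.start();

        // greet the user by looking up the unit named "lcd" and sending it a message
        RoboReference<LcdMessage> lcd = system.getReference("lcd");
        lcd.sendMessage(new LcdMessage("Robo4J: Hello!"));
    }
}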

We get ourselves a RoboBuilder, we add the configuration file as an input stream to the builder, and that’s it in terms of configuration. The builder will be used to instantiate the individual RoboUnits in the context of the overarching RoboContext. The RoboContext can be thought of as a reference to a “robot” (a collection of RoboUnits with (normally) a shared life cycle). It is currently a local reference (only in the local Java runtime), but once we get the network services up, you will also be able to look up remote RoboContexts.

Once we have the RoboContext set up, we start it, which will in turn start the individual RoboUnits. After that is done, we also provide a little start message to the LCD by getting a reference to the RoboUnit named “lcd” and sending it an initial LcdMessage.
The controller (which is not an off-the-shelf unit) runs little demos on the LCD, and allows the user to navigate between the demos using the up and down buttons. Simply go here for the source.

I will simply note that the most important part of building a custom unit is to override the onMessage method, and that this unit is somewhat different from most units, as it actually synchronizes on internal state and will skip any button presses that arrive whilst a demo is running.
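
To make that concrete, here is a bare-bones sketch of the pattern. This is not the actual example controller; the RoboUnit constructor shape, the message type and the helper method are assumptions for illustration.

// imports of RoboUnit/RoboContext omitted; package names depend on your Robo4J version
public class DemoController extends RoboUnit<String> {
    private volatile boolean demoRunning;

    public DemoController(RoboContext context, String id) {
        super(String.class, context, id); // constructor shape is an assumption
    }

    @Override
    public void onMessage(String buttonEvent) {
        synchronized (this) {
            if (demoRunning) {
                // ignore button presses while a demo is still running
                return;
            }
            demoRunning = true;
        }
        try {
            runSelectedDemo(buttonEvent); // hypothetical helper: pick and run a demo
        } finally {
            demoRunning = false;
        }
    }

    private void runSelectedDemo(String buttonEvent) {
        // navigate up/down between the little demos and run the selected one
    }
}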

Summary

This was a brief introduction to what the Robo4J framework currently looks like. Note that we have changes lined up for the core which are not yet implemented, such as changing the threading model/behaviour for individual RoboUnits using annotations. Expect things to change before we reach 1.0. Also note that we currently have a very limited number of units implemented. This will also change before we release 1.0. We will not release a first version of the APIs until we have implemented a few more robots in the framework.

One of the robots being migrated to Robo4J right now is Coff-E.

Hope this helps, and please note that this is all changing from day to day. Things will continue to change until, at least, after this summer. If you would like to add your own units, hardware abstractions or similar, feel free to contact us at info@robo4j.io. We’d be happy to get more help!

:)

/Marcus

Sunday, January 15, 2017

Getting Started with robo4j-hw-rpi

This blog will provide step-by-step instructions for how to quickly get started with the robo4j-hw-rpi module. We will use the Adafruit LCD as an example, but any of the supported hardware is used in pretty much the same way. Also, if you want support for certain 1-wire protocol sensors, like the Dallas DS18B20 temperature sensor or the AM2302/DHT11/DHT22 temperature and humidity sensors, you can find a library for them on my private blog (for now).

Installing the Software

First of all, the required software must be installed on the Raspberry Pi:
  1. Install PI4J.
    curl -s get.pi4j.com | sudo bash
  2. Set up the Raspberry for i2c communication.
  3. Install the Oracle JDK (if not already installed).
    sudo apt-get install oracle-java8-jdk
  4. Ensure that the Oracle JDK is the one in use (use java -version to verify).
    If not, change it so that it is. There can be a massive performance difference.
  5. Clone the robo4j repo.
    git clone https://github.com/Robo4J/robo4j.git
  6. Run gradlew build in the robo4j root directory.
  7. Install Eclipse (optional, but good if you want to play around with the examples a bit).
    sudo apt-get install eclipse

Setting up Eclipse (Optional)

For an easy way to play around with the examples, we can set up Eclipse on our Raspberry Pi.
  1. Install Egit.
    Inside of Eclipse, go to Help | Install New Software…
    Add a new update site with the following URL:
    http://download.eclipse.org/egit/updates-3.7.1/
  2. Install the Gradle plug-in.
    http://download.eclipse.org/buildship/updates/e42/releases/2.x/
  3. Go to Robo4J on GitHub and clone the repo using Egit from within Eclipse.
  4. Import the projects into Eclipse.
Note that the Eclipse version currently installed by Raspbian will be a rather old version - Eclipse 3.8.0. This unfortunately means that it will not support JDK 8. This does not matter, as the robo4j-hw-rpi and robo4j-math projects only require Java SE 1.7.
Normally you would probably like to develop in a more modern version of Eclipse (or some other IDE) and just use the command line to build robo4j.

Running the Demos

There are two ways of running the demos. One through building the robo4j.jar file with Gradle, and one through Eclipse.
Here is how you would, for example, run the LCD demo using the robo4j.jar built by gradle:
sudo java -classpath robo4j-hw-rpi/build/libs/robo4j-hw-rpi-alpha-0.2.jar:/opt/pi4j/lib/* com.robo4j.hw.rpi.i2c.adafruitlcd.Demo
If you are playing around with the code in Eclipse, do the following:
  1. With Eclipse started, open a command line and change directory to the robo4j/robo4j-hw-rpi/bin folder.
  2. Search for the demo/example corresponding to the hardware set up you want to test in the examples folder in the eclipse project.
    For example, if you have an Adafruit LCD shield, you would use the com.robo4j.hw.rpi.i2c.adafruitlcd.Demo class.
  3. Run the demo from the command line.
    Do not forget to add the PI4J classes to the classpath, for example:
    sudo java -classpath .:/opt/pi4j/lib/* com.robo4j.hw.rpi.i2c.adafruitlcd.Demo
Running the LCD shield demo would show you something similar to this:

Have fun!

Robo4J Raspberry Pi Support

There are now two new modules in Robo4J:
  • robo4j-math
    Some basic math definitions and functions that may prove useful when building robots. This module provides common data definitions like points and lines, as well as feature extraction and other functionality useful when building robots. We’re currently moving Coff-E to Robo4J, which means this module will grow quite a bit during the upcoming months.
  • robo4j-hw-rpi
    Provides abstractions for commonly used off-the-shelf hardware. Does not have dependencies on any other module but math, and can be used as easy to consume, stand-alone, Java hardware abstractions.
Now, if you are getting started with the Raspberry Pi and hardware, robo4j-hw-rpi provides a quick way to get started. It provides ready-to-run examples for some of the most commonly used chips and protocols out there. robo4j-hw-rpi can be used as a set of simple, stand-alone Java abstractions for accessing your hardware. It does not require you to buy into the rest of the framework, though we hope that using Robo4J fully will be the natural next step once we get further along with our implementation.
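
To give a feel for what "stand-alone Java abstractions" means in practice, here is a rough sketch of driving the Adafruit LCD directly from the hardware module. The factory and method names below are assumptions for illustration; see the com.robo4j.hw.rpi.i2c.adafruitlcd.Demo class referenced in the post above for the actual API.

// class, factory and method names are assumptions; consult the Demo class for the real API
import com.robo4j.hw.rpi.i2c.adafruitlcd.AdafruitLcd;
import com.robo4j.hw.rpi.i2c.adafruitlcd.LcdFactory;

public class HelloLcd {
    public static void main(String[] args) throws Exception {
        // talks to the LCD shield over I2C, without using robo4j-core at all
        AdafruitLcd lcd = LcdFactory.createLCD();
        lcd.setText("Hello from\nrobo4j-hw-rpi!");
        Thread.sleep(5000);
        lcd.clear();
    }
}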

If your particular piece of hardware is missing, please let us know. Or why not join our effort and add a nice Java abstraction for your favorite piece of hardware to the library?

Saturday, January 14, 2017

Robo4J on GitHub

The Robo4J team has now created an organisation on GitHub.

We will continuously publish all updates to the Robo4J core framework, and all the other related modules, to the repositories in this new location.

Over time we plan on publishing sample code, ranging from little snippets showing how to use a certain piece of hardware, all the way up to a working autonomous robotic vehicle.

Enjoy!

Robo4J is entering the Year 2017!

  The Robo4J Team wants to wish you all a beautiful start and a happy New Year 2017. 
This year will be very important for the Robo4J framework. A lot of amazing things are already in the queue, waiting for the right time to be published. This way we will keep the Java robotics and IoT community fully informed about the continual progress.

 The first Robo4J announcement of 2017 is about the core development team. The team currently consists of two brilliant engineers, pure Java enthusiasts with many years of experience in the JVM field: 

Miro Wengner

twitter: @miragemiko
email: miro@robo4j.io

Marcus Hirt

twitter: @hirt
email: marcus@robo4j.io

Happy New Year 2017! 
