The HR-OS1s are shipping and I couldn't be happier! We spent the last two days finalizing the first 10 kits, but they're finally out the door. We'll have another 20 kits out this week, then the last 15 out sometime next week. I'm so excited to see how people use these robots.
I've spent the last week or so feverishly working to flesh out the HR-OS1 Documentation. I'm using ScreenFlow to create some cool videos that mix screencasting with video of the hardware setup. You can see them here, here, and here. I need to work on the audio - I'm using a Samson mic, but I really think things would benefit from a USB headset. Regardless, ScreenFlow makes my workflow super easy - it even exports straight to YouTube, which is stellar. I started out using VNC to record other OSes, but ended up getting an HDMI capture card (the Game Capture HD), which makes things look a lot better and more responsive. It adds another layer to my workflow (ScreenFlow doesn't natively record from the capture card, so I have to record in another program and import it manually), but it's really not that bad (as long as I remember to hit 'record').
This week has been all about the HR-OS1. We're all working really hard to get the first 50 kits out, so we've been doing an insane amount of work on top of our normal fare.
A couple weeks after the Raspberry Pi 2 came out, we decided to support the Raspberry Pi environment for the HR-OS1. I've been doing a lot of testing on the Raspberry Pi (and finding some bugs in the HR-OS1 Framework along the way). At first we were having some issues with the WiFi dongles we had gotten - our bulk shipment was a version 2 and we had tested on a version 1. But Wade was able to hunt down the correct drivers and get them all working. So far, everything else is working swimmingly. It's great to see the robot running on the Pi.
I've been working in the RME (Robot Motion Editor), a tool that lets you pose the robot and play back sequences. The video shows the robot playing some poses that I created, inspired by some of the original DARwIn-OP's poses. They're really, really rough - the final ones will be a lot better (as you can see, the sit/walk poses don't have proper keyframes in between them, so they're a bit wonky).
I've also been working with the Node.js code that Daniel created. My goal is to have the robot serve up a mobile-friendly page that lets you control it. Ideally you'll be able to use the accelerometer in your phone to control it as well - because why not?
The Pixy camera is currently non-functional, but I really hope to do some projects with it soon.
Next week's going to be a lot more HR-OS1. In fact, until we ship it (and for a while after), I imagine it will be consuming quite a bit of my time. But I'm not complaining - it's an awesome little platform.
Last week I finished some preliminary tests with the LIDAR Lite from Pulsed Light. The LIDAR Lite is a laser-based distance sensor that can measure distances from 0 to 40 m with 2.5 cm accuracy. And it's only $89.00! That combination of performance and cost makes it an amazing little sensor for robotics.
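In case you're curious what talking to the sensor looks like, here's a minimal Arduino sketch of the LIDAR Lite's basic I2C read, based on the register map as I remember it (default address 0x62, trigger a reading through register 0x00, read the 2-byte distance from 0x8F) - double-check the datasheet before trusting my register values:

```cpp
#include <Wire.h>

const uint8_t LIDAR_ADDR = 0x62;    // default I2C address

void setup() {
  Wire.begin();
  Serial.begin(115200);
}

uint16_t readLidarCm() {
  Wire.beginTransmission(LIDAR_ADDR);
  Wire.write(0x00);                 // control register
  Wire.write(0x04);                 // trigger an acquisition
  Wire.endTransmission();
  delay(20);                        // give the sensor time to measure

  Wire.beginTransmission(LIDAR_ADDR);
  Wire.write(0x8F);                 // distance register (0x0F with the auto-increment bit set)
  Wire.endTransmission();
  Wire.requestFrom(LIDAR_ADDR, (uint8_t)2);
  return (Wire.read() << 8) | Wire.read();  // distance in centimeters
}

void loop() {
  Serial.println(readLidarCm());
  delay(100);
}
```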
This demo works well with both AX and MX series DYNAMIXEL servos (that is, with both the PhantomX Turret and the WidowX/ScorpionX Turret). The biggest difference is that the MX servos have a full 360° scanning range, while the AX servos only have 300°. For the video I used a ScorpionX Turret, but it's really overkill - for a mobile robot I'd use the PhantomX Turret. Having the physical construction all done for me, along with great servos with positional feedback, made the whole project a breeze.
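The range difference falls straight out of the position registers: AX servos map 0-1023 across 300°, while MX servos map 0-4095 across the full 360°. The conversion is one line each (these are the standard DYNAMIXEL register ranges, nothing specific to this project):

```cpp
// Convert raw DYNAMIXEL position register values to degrees.
float axToDegrees(uint16_t raw) { return raw * (300.0 / 1023.0); }  // AX: 0-1023 over 300 deg
float mxToDegrees(uint16_t raw) { return raw * (360.0 / 4095.0); }  // MX: 0-4095 over 360 deg
```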
I've been really interested in working with Chrome Apps for basic demos. I think HTML is a great way to create quick and easy user interfaces, and the Canvas element makes it easy to create custom elements like the room scan. Working with the Chrome serial interface can be a little wonky - I had to spend a lot of time sorting the serial input to get it to line up correctly. But once I did, getting everything up and running was really fast. I really like being able to make changes on the fly and not have to worry about restructuring my entire GUI. Chrome Apps certainly aren't the best solution for every problem out there, but for proofs of concept and demos, they're pretty great.
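For anyone fighting the same thing: the "sorting the serial input" problem is really frame alignment. Bytes arrive in arbitrary chunks, so the reader has to hunt for the packet header before it can trust the fields that follow (the packet itself is the 7-byte one I describe below). The app does this in JavaScript against the Chrome serial API; here's the same idea as a rough Arduino-style C++ sketch, where the header value, byte order, and checksum scheme are all my stand-in assumptions:

```cpp
const uint8_t HEADER = 0xFF;             // assumed value for both header bytes

bool readScanPacket(Stream &in, uint16_t &position, uint16_t &distance) {
  while (in.available() >= 7) {          // a whole 7-byte packet could be waiting
    if (in.read() != HEADER) continue;   // hunt byte-by-byte for the first header byte
    if (in.peek() != HEADER) continue;   // false start - resume hunting
    in.read();                           // consume the second header byte
    uint8_t body[5];
    in.readBytes((char *)body, 5);
    uint8_t sum = ~(body[0] + body[1] + body[2] + body[3]) & 0xFF;
    if (sum != body[4]) continue;        // misaligned or corrupted - keep hunting
    position = (body[0] << 8) | body[1]; // assumed high-byte-first order
    distance = (body[2] << 8) | body[3];
    return true;
  }
  return false;                          // no complete, valid packet yet
}
```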
I started the project by having the turret auto-scan and then send a custom 7-byte packet (2-byte header / 2 bytes of position / 2 bytes of distance / checksum). Eventually I figured out I needed a good way to start/stop the turret and set the speed, so I turned to the ArbotiX Commander library - the same one we use in our other robots. That made it really easy to send commands out from the Chrome App. In the future, I'd really like to upgrade the Commander library to have a built-in data-return packet.
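Building that packet on the ArbotiX side is only a few lines. Again a hedged sketch - header value, byte order, and checksum scheme are my assumptions (the inverted-sum checksum just mirrors the DYNAMIXEL convention), matching the parsing sketch above:

```cpp
// Build and send one 7-byte scan packet: header / position / distance / checksum.
void sendScanPacket(uint16_t position, uint16_t distance) {
  uint8_t p[7];
  p[0] = 0xFF;                                 // 2-byte header (assumed value)
  p[1] = 0xFF;
  p[2] = position >> 8;                        // servo position, high byte first
  p[3] = position & 0xFF;
  p[4] = distance >> 8;                        // LIDAR Lite distance in cm
  p[5] = distance & 0xFF;
  p[6] = ~(p[2] + p[3] + p[4] + p[5]) & 0xFF;  // inverted-sum checksum (assumed scheme)
  Serial.write(p, 7);
}
```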
This was very much a proof of concept, so I'm not looking to make many improvements to it anytime soon (I'd like to do a little code cleanup and maintenance, but that's about it). But here are some ideas I've thought about working on:
Using slip rings to allow the LIDAR Lite to spin continuously instead of scanning back and forth.
Alternatively, building a battery and wireless transceiver into the system to allow the entire rig to spin (though this has limited usefulness for mobile robots).
Storing scan data for a robot to report back to the computer later. The MX servos have 4096 discrete positions, each reading being a 2-byte value, so a full-resolution scan takes 8K - maxing out the Arduino Uno's 2K of SRAM and even the ArbotiX-M's 4K. You can lose some resolution and drop it down enough to get a scan or two (there's a rough sketch of that idea after this list), but really an SD card or other storage is necessary. Of course, there are always chips like the Teensy 3.1 with more memory, or you could integrate a Raspberry Pi or other SBC to pull the data off on each scan.
Figuring out the maximum scan speed and where the bottlenecks are.
Investigating a low-cost alternative. I think standard hobby servos plus a little interpolation (or just speed measurement/prediction) could be used to make a perimeter scanner (also sketched after this list). The scan would be limited to 180 degrees and the resolution wouldn't be great, but it could work well for simple mobile robots.
Looking at even lower-cost systems, like a DC motor with a cheap encoder.
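To put numbers on the storage idea above: a full-resolution MX scan is 4096 positions × 2 bytes = 8192 bytes, more than the Uno's 2K of SRAM and the ArbotiX-M's 4K before the sketch allocates anything else. Here's a rough sketch of the resolution-dropping workaround - the 4:1 binning is an illustrative choice, not something from the project:

```cpp
// Bin the 4096 raw MX positions down to 1024 samples (~0.35 deg apart),
// costing 2048 bytes of SRAM - too tight on an Uno, workable on an ArbotiX-M.
const uint16_t SAMPLES_PER_SCAN = 1024;
uint16_t scanBuffer[SAMPLES_PER_SCAN];

void recordSample(uint16_t rawPosition, uint16_t distanceCm) {
  scanBuffer[rawPosition / 4] = distanceCm;    // 4096 positions -> 1024 bins
}
```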
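And for the hobby servo idea: standard analog servos give no position feedback, so the angle has to be predicted from the commanded sweep rather than measured. A rough sketch of that prediction - the pin, sweep time, and LIDAR read are all placeholder assumptions (the I2C read is the same one from the earlier snippet):

```cpp
#include <Servo.h>
#include <Wire.h>

Servo panServo;
const int SERVO_PIN = 9;               // illustrative pin choice
const unsigned long SWEEP_MS = 2000;   // assumed time for the full 0-180 deg sweep

uint16_t readLidarCm() {               // same LIDAR Lite I2C read as earlier
  Wire.beginTransmission(0x62);
  Wire.write(0x00);
  Wire.write(0x04);                    // trigger an acquisition
  Wire.endTransmission();
  delay(20);
  Wire.beginTransmission(0x62);
  Wire.write(0x8F);                    // 2-byte distance register
  Wire.endTransmission();
  Wire.requestFrom(0x62, 2);
  return (Wire.read() << 8) | Wire.read();
}

void setup() {
  Wire.begin();
  Serial.begin(115200);
  panServo.attach(SERVO_PIN);
  panServo.write(0);
  delay(1000);                         // let the servo reach its start position
  panServo.write(180);                 // command the full sweep
  unsigned long start = millis();
  while (millis() - start < SWEEP_MS) {
    // Predict the angle from elapsed time - only as good as the guess at
    // the servo's real speed, which is the whole limitation of the approach.
    float angle = 180.0 * (millis() - start) / SWEEP_MS;
    Serial.print(angle);
    Serial.print(',');
    Serial.println(readLidarCm());
  }
}

void loop() {}
```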
A few of my friends had a blind gift exchange for the holidays, with a $20 limit. I waited until pretty much the last minute to come up with my gift, so Amazon was pretty much out of the question. I was racking my brain over what to get my friend, so I went to Marshalls to try and find something. Eventually I came across a couple of growlers for sale. Now, my friend works at a brewery, so just getting him a growler would be pretty silly. But seeing the growler reminded me of a project he had asked me about a couple of years ago. He wanted to know if he could make growler lamps to put around the brewery. We talked about a couple of different ways to accomplish it, but we never did anything with the idea. So when I saw the growler, I decided to buy one and make him a growler lamp.
Not too much to report this weekend. While helping out a friend I've been playing with the Teensy 3.1 to drive NeoPixels. The Teensy is really a pleasure to work with for prototyping. It's small, has interrupts on every pin, 3 serial ports (not including USB/programming), and a crazy assortment of other features I haven't even played with.
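For reference, the kind of test sketch I've been throwing at it is just the stock Adafruit library - pin and pixel count here are placeholders, not from my friend's actual project:

```cpp
#include <Adafruit_NeoPixel.h>

const int PIXEL_PIN = 2;               // placeholder pin
const int NUM_PIXELS = 16;             // placeholder strip length
Adafruit_NeoPixel strip(NUM_PIXELS, PIXEL_PIN, NEO_GRB + NEO_KHZ800);

void setup() {
  strip.begin();
  strip.show();                        // start with every pixel off
}

void loop() {
  // Chase a single green pixel down the strip.
  for (int i = 0; i < NUM_PIXELS; i++) {
    strip.setPixelColor(i, strip.Color(0, 255, 0));             // light this pixel
    strip.setPixelColor((i + NUM_PIXELS - 1) % NUM_PIXELS, 0);  // douse the previous one
    strip.show();
    delay(50);
  }
}
```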
I backed the Teensy 3.0 Kickstarter 2 years ago, and I was super impressed with the unit I got. And I have to say that all of the improvements that came with the Teensy 3.1 are pretty stellar. The support that the creator, Paul Stoffregen, and the community give is just as amazing. I know KurtE is working on a nice carrier board - I need to check up on that.
So what's up for this week? I'm going to start some work on new RobotGeek sensors, and I'll be working on some demos for the Robot Arms to go with the recently released Arm Link software - the videos are coming, I promise!
Over the next week I'll be rolling out all the links and announcements for the Arm Link software! I've been developing this software for Trossen Robotics for quite some time now, and I'm really happy that it's finally ready for users. You can find all the source code and releases here.
So what is Arm Link? Arm Link is an open source Java/Processing application that allows you to control the InterbotiX/RobotGeek robot arms from your computer. The software sends serial packets to the arms with X/Y/Z positions that the arms then move to using their internal Inverse Kinematics engine. For more information, check out the Getting Started Guide I wrote.
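To give a flavor of what the arm receives (sketched in Arduino-style C++ to match the other snippets on this blog, even though the app itself is Java/Processing): each setpoint goes out as 16-bit values split into high/low byte pairs plus a checksum. The byte layout below is purely illustrative - the real packet spec is in the Getting Started Guide:

```cpp
// Illustrative only - not the actual Arm Link packet format.
void sendSetpoint(Stream &port, uint16_t x, uint16_t y, uint16_t z) {
  uint8_t p[8];
  p[0] = 0xFF;                         // header byte (assumed)
  p[1] = x >> 8;  p[2] = x & 0xFF;     // X, high byte first
  p[3] = y >> 8;  p[4] = y & 0xFF;     // Y
  p[5] = z >> 8;  p[6] = z & 0xFF;     // Z
  p[7] = ~(p[1] + p[2] + p[3] + p[4] + p[5] + p[6]) & 0xFF;  // checksum (assumed scheme)
  port.write(p, 8);
}
```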
A brand new video is on the way, but until then, here's the old pre-release.