09 Aug 13

Going to Small Sat

Tomorrow begins the Small Sat conference in Logan, Utah. It is 6 days immersed in space science, technology, policy, and inspiration. The format looks to be a new presentation every 15 minutes! There are several people, businesses, and presentations that have my attention. I’m hoping to meet Zac Manchester from the KickSat project, hear from Interorbital about their summer’s work, have a few laughs again with Fred of Team FredNet, look for mission control software best practices for Part-Time Scientists, and talk with other ion engine researchers. With good cell reception, I hope to tweet cool facts during the conference (@iondragonfly).

09 Aug 13

KickSat Sprite CGP

How’s that for a cryptic title? The KickSat project will deploy 200 satellites, called “Sprites”, each the size of a business card, into low Earth orbit in December 2013. Along with Don Smith and my colleagues at Part-Time Scientists, I am writing software for one of these Sprites. The most interesting challenge is communications. At roughly 1 byte per second, and with no battery to power a radio through a solar outage, getting a lot of data out of the communication stream is difficult. We humans are pursuing the classic route of performing trade studies to select a communication protocol. Meanwhile, the Cartesian Genetic Programming (CGP) machinery grinds away looking for an even better protocol. I’ve reached out to the creator of CGP, Julian Miller, who has also brought in Simon Harding to assist. I cannot overstate the value of their input on tuning parameters and setting up the problem. Final results will be posted here in early September. In the meantime, here is the structure of the CGP system.

The Sprite has a 3-axis magnetometer, a 3-axis gyro, a temperature sensor, and a small persistent memory. Counting just the sensor readings themselves, we have 7 telemetry points to send to the ground. Adding values for system uptime, reboot count, min/max readings, sensor noise, and sensor correlations raises this count further.

The generic structure is to reduce the resolution to 8 bits per telemetry point and to send points in either a round-robin or random sequence, with an indicator telling us which data point follows. Pairs of telemetry values are grouped under a single ID and sent together. Success is defined by recovering the sensor data from both complete and corrupted packets. With the amount of redundancy built into the Sprite’s low-level protocol, we aren’t really concerned with byte-level corruption so much as with packet truncation due to loss of power.
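Here is a minimal Python sketch of that baseline framing idea. The pair-ID table and telemetry point names are made up for illustration; they are not the flight definitions.

    # Baseline (pre-CGP) framing: two 8-bit telemetry points share one ID byte
    # that tells the ground station which pair of points follows.
    PAIR_TABLE = {
        0x01: ("mag_x", "mag_y"),
        0x02: ("mag_z", "gyro_x"),
        0x03: ("gyro_y", "gyro_z"),
        0x04: ("temperature", "uptime"),
    }

    def pack_pair(pair_id, value_a, value_b):
        """Build a 3-byte frame: [pair ID, point A, point B], 8 bits per point."""
        return bytes([pair_id, value_a & 0xFF, value_b & 0xFF])

    def unpack_pair(frame):
        """Recover named values from a frame using the pair ID."""
        pair_id, a, b = frame[0], frame[1], frame[2]
        name_a, name_b = PAIR_TABLE[pair_id]
        return {name_a: a, name_b: b}

    # Example: two magnetometer readings already reduced to 8 bits each.
    frame = pack_pair(0x01, 142, 57)
    print(unpack_pair(frame))   # {'mag_x': 142, 'mag_y': 57}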

Each “world” consists of one CGP individual that encodes a packet and another individual that decodes a packet. The encoder is presented with 16 separate inputs, 1 for each bit in 2x 8-bit sensor readings. The encoder produces 16 separate outputs, each interpreted as a binary value (>0 -> 1, <= 0 -> 0). These constitute the packet that will be sent.

A packet made from the encoder output is presented to the decoder. The decoder takes 16 inputs, each a tri-state bit: 1, 0, or -1, with -1 meaning “unknown”, as can happen with truncated packets from the satellite. The decoder’s 16 outputs are also tri-state bits (>0 -> 1, == 0 -> 0, < 0 -> -1 unknown) and are mapped back to 2x 8-bit sensor readings.
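To make the interfaces concrete, here is a rough Python sketch of how the raw encoder and decoder outputs get interpreted. It assumes a generic CGP evaluator that maps a list of 16 numbers to 16 numbers (not shown here), and all helper names are mine.

    def readings_to_bits(a, b):
        """Split two 8-bit sensor readings into 16 encoder inputs, MSB first."""
        word = (a << 8) | b
        return [(word >> (15 - i)) & 1 for i in range(16)]

    def encoder_outputs_to_packet(raw_outputs):
        """Encoder interpretation: >0 -> 1, <= 0 -> 0."""
        return [1 if x > 0 else 0 for x in raw_outputs]

    def decoder_outputs_to_bits(raw_outputs):
        """Decoder interpretation: >0 -> 1, == 0 -> 0, < 0 -> -1 (unknown)."""
        return [1 if x > 0 else (0 if x == 0 else -1) for x in raw_outputs]

    def bits_to_readings(bits):
        """Rebuild the two 8-bit readings, treating unknown (-1) bits as 0."""
        word = 0
        for b in bits:
            word = (word << 1) | (1 if b == 1 else 0)
        return (word >> 8) & 0xFF, word & 0xFF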

The high-order bits of each 8-bit sensor reading are the most valuable. In a corrupted packet, the reward for getting the highest-order bit of a sensor reading correct is 128, the second is 64, and so on down. In a normal packet, the rewards are 3x the corrupted reward schedule. This does successfully lead to perfection being achieved on bits 0 and 8 first (the MSBs of the two sensor readings) and to normal packets working before truncated packets (though only slightly). If all test cases are handled perfectly, then the total size of the encoder and decoder is used as the score, driving the solution smaller.
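A sketch of that reward schedule in Python, with the same 128-down-to-1 weights and the 3x bonus for untruncated packets (the function name is mine):

    def bit_reward(decoded_bits, true_bits, truncated):
        """Score 16 decoded tri-state bits against the 16 true bits."""
        scale = 1 if truncated else 3
        total = 0
        for i, (got, want) in enumerate(zip(decoded_bits, true_bits)):
            weight = 128 >> (i % 8)   # 128 for bits 0 and 8, halving down to 1
            if got == want:
                total += scale * weight
        return total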

100 random 8-bit pairs are used for each generation in each world. Validation is done with 1000 random pairs, and there is a strong correlation between generation test scores and validation scores. Testing the full set of 65,536 pairs has not been worth it for validation. Each encoder and decoder pair is given a new set of random test cases with each generation. Each random pair is presented normally (input -> encoder -> packet -> decoder -> reward) as well as with truncation (input -> encoder -> packet -> set last 8 bits to -1 -> decoder -> reward).
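Putting the pieces together, one generation’s evaluation might look like the following sketch, where encode() and decode() wrap the two CGP individuals and the helpers come from the earlier sketches:

    import random

    def evaluate_world(encode, decode, cases=100):
        total = 0
        for _ in range(cases):
            a, b = random.randrange(256), random.randrange(256)
            true_bits = readings_to_bits(a, b)

            # Normal path: input -> encoder -> packet -> decoder -> reward
            packet = encoder_outputs_to_packet(encode(true_bits))
            total += bit_reward(decoder_outputs_to_bits(decode(packet)),
                                true_bits, truncated=False)

            # Truncated path: the last 8 packet bits are lost, presented as -1
            cut = packet[:8] + [-1] * 8
            total += bit_reward(decoder_outputs_to_bits(decode(cut)),
                                true_bits, truncated=True)
        return total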

Though the encoder and decoder are separate individuals, the final reward score is assigned to both, causing them to advance or fail together. This is based on my past experience with CGP protocol simulation, which showed very, very, very slow progress when ranking each separately.

The population structure is 1+7. That is, the 1 best individual (or a child that is not worse than the parent) is promoted and 7 mutant children are created. There is a 1+7 population for encoders and a 1+7 population for decoders.
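For reference, one 1+7 step looks roughly like this sketch, where mutate() returns a modified copy of an individual and fitness() is the shared reward described above:

    def one_plus_seven_step(parent, fitness, mutate):
        children = [mutate(parent) for _ in range(7)]
        best, best_score = parent, fitness(parent)
        for child in children:
            # A child that is not worse than the parent replaces it, letting
            # neutral drift carry the search across fitness plateaus.
            score = fitness(child)
            if score >= best_score:
                best, best_score = child, score
        return best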

09 Aug 13

CGP, Compressed Sensing, and Data Collection

We’re doing vacuum chamber tests of the FRETS1 satellite’s ion engine and building an embedded data collection system. Naturally, I want every nuance of every signal trace at several million samples per second on lots of channels. Even after being talked down to 4 channels, 2 MS/s per channel, and 8-bit samples (about 8 MB/s of raw data), it’s still a lot for an embedded system that must be the size of an Arduino in the prototyping stage and even smaller on the satellite.

Lots and lots of searches later, it is clear we aren’t going to have that system. We’re going to have one better. To start with, the incoming data can be fed into a flexible triggering system and threshold events counted in small time slices. The data storage and sampling requirements are 3-6 orders of magnitude lower with this approach, removing large amounts of add-on SRAM and engineering headache. The downside is that the data is essentially compressed in the hardware before it gets to the collection CPU. Now it gets interesting. Such data can be converted using the Compressed Sensing GPU library I presented at NVIDIA’s GTC this year. Further, the recovered signal can be modeled with an equation using Cartesian Genetic Programming (CGP). The equation is fit to the compressed data, to the recovered data from the Compressed Sensing library, and to physical truths such as continuity.
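Here is a rough sketch of the “count threshold events per time slice” idea. The threshold, slice length, and function name are placeholders, not the real controller code.

    def count_events(samples, threshold, slice_len):
        """samples: raw ADC values for one channel; returns one count per slice."""
        counts = []
        for start in range(0, len(samples), slice_len):
            window = samples[start:start + slice_len]
            crossings = sum(1 for prev, cur in zip(window, window[1:])
                            if prev < threshold <= cur)   # rising-edge crossings
            counts.append(crossings)
        return counts

    # At 2 MS/s, a 1 ms slice collapses 2,000 raw samples into a single count,
    # which is the kind of reduction that removes the add-on SRAM.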

Admittedly, that’s a lot of math for an embedded controller to be doing. However, we can build basic engine controls using just the compressed data. The engine controller will then have even better data over time as the CGP and Compressed Sensing solutions converge. And that is what I mean by a “better” data collection system: it has the series of fallbacks needed for a space mission and still works just fine on the bench.

18 Jul 12

Thinking with laser cutters

I finally made my first part using a laser cutter. There’s a period on that sentence because the most exciting event happened 30 minutes later, after the second and third parts came out perfectly. The background: the ion engine needs a test stand for use inside a vacuum chamber. I decided to make this stand out of clear plastic so all internal parts (especially any unwanted sparking) could still be observed. A laser cutter at the nearby TechShop was the obvious choice. After some fussing around with 2D drawing programs, I was thankfully nudged into using Inventor to make real 3D parts and to confirm how they fit. When cutting day finally came, the process was flawless and the parts came out so easily that I had 90 minutes left in my laser reservation.

There was a class running in the woodshop, which I needed in order to make holders for the vacuum chamber’s thick-walled glass cylinder. Looking at the woodshop, looking at the laser, looking at the woodshop, looking at the laser – I had a “2001” moment! It was suddenly clear that the 1/4″ acrylic sitting right there was perfect for cradling the glass cylinder, and a few very simple pieces of 1″ dowel would give all the 3D-ness that was actually needed. 10 minutes of drafting later, parts were coming out. 15 minutes of wood cuts and drilling later, it was done. That “2001” moment when the mental barriers fell really emphasizes the importance of putting yourself into new situations.

Here are the parts for the support and chamber partially assembled.  Enjoy!

30 Dec 11

Evolving Custom Communication Protocols

I once again spoke at 28C3 in Berlin, Germany for Part-Time Scientists. Today, the topic was Evolving Custom Communication Protocols. We used Cartesian Genetic Programming to create the finite state machines of a transceiver, including all packet flow control logic. This only works when there is at least some trivial adversary, such as Outer Space for our moon mission. The technique works even better when an attacker is included, given full access to all data, and coevolved with the transceiver. For our runs, transceivers evolved with and without an attacker both handle a validation case equally well, but the transceivers evolved with an attacker are accelerating their evolution and will soon outperform their less-stressed cousins.
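For flavor, the coevolution loop amounts to something like this sketch: the transceiver population is scored on packets it delivers despite the attacker, and the attacker population on packets it disrupts. All names here are illustrative; the real code is in the archive linked below.

    def coevolve_step(transceivers, attackers, run_link, select):
        t_scores = {t: 0 for t in transceivers}
        a_scores = {a: 0 for a in attackers}
        for t in transceivers:
            for a in attackers:
                delivered, disrupted = run_link(t, a)   # simulate one session
                t_scores[t] += delivered                # transceiver wants delivery
                a_scores[a] += disrupted                # attacker wants disruption
        return select(t_scores), select(a_scores)       # promote next generations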

Code, data, and the presentation PDF/PPT are online here:

http://dl.dropbox.com/u/5001972/EvolvingCustomCommunicationProtocols.zip

The YouTube video is here:

http://www.youtube.com/watch?v=zoOOpiMJQ8s


29 Dec 11

Algorithm Invention with Cartesian Genetic Programming

A couple of days ago, I had the honor of presenting at 28C3 in Berlin, Germany for the Part-Time Scientists! The topic was “Automatic Algorithm Invention” and featured my foray into Cartesian Genetic Programming. The topic is actually quite exciting and much easier to get started with than you might think! To prove it, I’ve posted the presentation materials and code:

http://dl.dropbox.com/u/5001972/AutomaticAlgorithmInvention.zip

There is also a YouTube video of the presentation:

http://www.youtube.com/watch?v=tQvFZVlM2Gk

For me, the heart of the technique is the addition of “state” to the input and output variables.  Chromosomes each get private state and share common state.  They are evolved to cooperate and share information via this state.
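A small sketch of that idea, with made-up names, assuming each evolved program is called with its inputs plus both state vectors concatenated:

    def run_chromosome(program, inputs, private_state, common_state, n_outputs):
        # The evolved program sees its inputs, its own private state, and the
        # state shared with every other chromosome...
        raw = program(inputs + private_state + common_state)
        outputs = raw[:n_outputs]
        # ...and writes new values for both kinds of state, so chromosomes can
        # cooperate by leaving information for each other in the common state.
        new_private = raw[n_outputs:n_outputs + len(private_state)]
        new_common = raw[n_outputs + len(private_state):]
        return outputs, new_private, new_common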

Thanks to everyone who turned out, and I look forward to hearing about your adventures with evolving algorithms!

04 Oct 11

CompSci says optimism works

I’ve been doing some reading for the upcoming ai-class.com course and have some philosophy to share as a result.

Searches are a mainstay of optimization algorithms, and a good search needs a good estimator. At every step of the way, the algorithm knows the real cost to get to that point and needs estimates of the cost of each possible next step. But, in general, what policy suits an estimation algorithm best: optimism or pessimism? (Note: I ruled out “realistic” as it implies we already know the answer to the problem.)

An optimistic estimator underestimates the true cost of a solution. The final, optimal solution will always cost more than the optimistic estimate. If we, as humans, were to give optimistic estimates, we’d always be wrong and underfunding our endeavors. Sounds bad, right? It turns out, though, that this optimistic estimate always leads us to the optimal, lowest real cost. Why? Because with an unattainably low estimate, an algorithm keeps searching. Eventually, it finds the lowest cost, concludes that all remaining possibilities cost more, and “settles” for the optimal point.

A pessimistic estimator overestimates the real cost. When we humans overestimate and then find a solution that costs less, we’re heroes. When an algorithm follows a pessimistic estimator, it stops when it finds a solution that costs less than the estimate. This may or may not be the actual optimal solution. Dangerously, pessimism teaches us that pessimism works: it trains us to double our estimates and to only trust those who overestimate. It rarely shows us an optimal solution, and we never really find out what we’re missing.
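To see the difference in miniature, here is a toy A* example of my own (not from the course): a direct route to the goal costs 3, a two-step route costs 2, and only the optimistic (never overestimating) heuristic finds the cheaper one.

    import heapq

    GRAPH = {                       # node -> [(neighbor, step cost), ...]
        "S": [("A", 1), ("G", 3)],
        "A": [("G", 1)],
        "G": [],
    }

    def a_star(heuristic, start="S", goal="G"):
        frontier = [(heuristic[start], 0, start)]
        best_g = {start: 0}
        while frontier:
            f, g, node = heapq.heappop(frontier)
            if node == goal:
                return g            # cost of the path the search settles on
            for nxt, step in GRAPH[node]:
                new_g = g + step
                if new_g < best_g.get(nxt, float("inf")):
                    best_g[nxt] = new_g
                    heapq.heappush(frontier, (new_g + heuristic[nxt], new_g, nxt))

    optimistic  = {"S": 0, "A": 0, "G": 0}    # never overestimates remaining cost
    pessimistic = {"S": 0, "A": 5, "G": 0}    # overestimates the A -> G leg

    print(a_star(optimistic))    # 2: keeps searching and finds S -> A -> G
    print(a_star(pessimistic))   # 3: stops early on the direct S -> G route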

So, what are we to do? Both options are usually wrong, though the pessimistic one might actually find a high-cost solution that matches its estimate, and being correct sure feels good. It’s no wonder people question the results of optimization algorithms – the algorithms are just trying to find the best answer while the people are just trying to be right.