The embedded/mobile software market is growing faster than the rest of the industry. From my point of view this is just the tip of the iceberg, because we are at the beginning of the sensoric revolution.

Historically all data got created in the "middle" and we used integration technologies to distribute it. Web 1.0 turned out to be the preferred mechanism to get this done, and we changed the content/data in the middle through the web (e.g. online banking). With Web 2.0 something new is happening: the data/content no longer gets created in the middle. It gets created at the edges of the web and flows into the "middle" (e.g. mobile phones with cameras).

The next wave is going to be the sensoric revolution, where lots of data/content will become available, but it is not going to be produced by human beings anymore. Instead, devices like homes, cars, phones, and toasters will produce data/content. Because they can. Because by now all of these devices contain a million (not quite) sensors to measure what the h... is going on around them. A car drives over a bridge and its sensors detect wheel-spin and a temperature below zero. The car then makes this data/content available, and other cars heading in that direction can be warned that the bridge is probably slippery. The VP Engineering at IONA Technologies is on his mountain bike going downhill and flips over. The accelerometer in his iPhone detects the stunt (and waits for one minute to see if there is any more movement :)) and notifies a selected group of people (not me) to check on him.
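The mountain-bike scenario is essentially threshold-based fall detection: look for an acceleration spike, then wait a minute for signs of movement before alerting anyone. A minimal sketch of that logic in Python; the thresholds and the idea of one magnitude sample per second are my own illustrative assumptions, not any real device API:

```python
# Toy fall detector: impact spike followed by a quiet minute -> alarm.
# All thresholds are made-up illustrative values.
IMPACT_G = 3.0    # a spike above ~3g suggests a crash or fall
STILL_G = 0.2     # readings staying near 1g (gravity only) suggest no movement
WAIT_SAMPLES = 60 # grace period: one reading per second for a minute

def detect_fall(samples, wait_samples=WAIT_SAMPLES):
    """samples: acceleration magnitudes in g, one reading per second."""
    samples = list(samples)
    for i, g in enumerate(samples):
        if g < IMPACT_G:
            continue
        # Impact detected: inspect the next minute of readings.
        window = samples[i + 1 : i + 1 + wait_samples]
        if len(window) == wait_samples and all(
            abs(w - 1.0) < STILL_G for w in window
        ):
            return True  # no movement after the impact -> notify someone
    return False
```

A real phone would of course read a 3-axis accelerometer and filter noise; this only captures the "spike, then silence" shape of the idea.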

The sensoric revolution is coming. How do we integrate with these sensoric devices?
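The car-to-car warning earlier hints at one possible answer: devices publish their sensor readings to topics, and interested devices subscribe. A toy in-memory publish/subscribe sketch; the broker class, topic names, and reading fields are all invented for illustration:

```python
from collections import defaultdict

class Broker:
    """Minimal topic-based pub/sub broker (illustrative, in-memory only)."""

    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, reading):
        # Deliver the reading to every subscriber of this topic.
        for callback in self.subscribers[topic]:
            callback(reading)

broker = Broker()
warnings = []
# A following car subscribes to hazard reports for the stretch of road ahead.
broker.subscribe("road/A1/bridge-7", warnings.append)
# The first car's sensors publish what they just measured on the bridge.
broker.publish("road/A1/bridge-7",
               {"wheel_spin": True, "temp_c": -2, "hazard": "ice"})
```

In practice this is the territory of messaging protocols like MQTT, which were designed for exactly this kind of lightweight device-to-device data flow.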

Check out this cool iPhone/Chumby integration.