In light of the recent “AI hype” I’ve decided to refresh my knowledge of AI principles.
My university diploma, issued by a country that no longer exists – the USSR – lists a funny specialty: “Control Systems of Autonomous Flying Objects” (a.k.a. “rocket science”). So, as you can imagine, I am no stranger to the subject.
Anyway, I wanted to discover what has changed in this field over the last 30 years. To be honest, not that much, as far as I can tell.
The main changes are that processor power has increased significantly and components have been miniaturized. 130 years ago Friedrich Engels formulated the universal principle of the “transition of quantity into quality”. So if quantity (of transistors on a chip, in this case) changes, it should lead us to new qualities, capabilities, and properties of the system.
And indeed: in the 1970s a handful of Rosenblatt-style perceptrons in the USSR Post’s letter-sorting machines were able to recognize the zip codes on pretty much every letter circulating in the USSR. And now we have full face recognition on our mobile phones. The images being recognized are more complex, but the principles stay the same, and they are quite far from the word “Intellect” in the “AI” abbreviation. These are just variations on the basics behind Kalman, matched, and adaptive filters.
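Since Kalman filtering came up: here is a minimal sketch of a scalar (single-value) Kalman filter, the kind of thing one might use to smooth a noisy range reading. The `Kalman1D` name and the noise constants are my own illustrative choices, not taken from any library:

```cpp
#include <cstdio>

// Minimal scalar Kalman filter. All constants are illustrative,
// not tuned for any real sensor.
struct Kalman1D {
    double x = 0.0;   // state estimate
    double p = 1.0;   // estimate variance
    double q = 0.001; // process noise variance (assumed)
    double r = 0.1;   // measurement noise variance (assumed)

    double update(double z) {
        p += q;                   // predict: uncertainty grows over time
        double k = p / (p + r);   // Kalman gain: trust in the new measurement
        x += k * (z - x);         // correct the estimate toward the measurement
        p *= (1.0 - k);           // uncertainty shrinks after the correction
        return x;
    }
};

// Usage: feed noisy readings in, get a smoothed estimate back.
// Kalman1D kf;
// double smoothed = kf.update(10.3);
```

The whole “filter” is four lines of arithmetic; the sophistication is in choosing `q` and `r` for a real sensor.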
Anyway, back to the subject. Refreshing the knowledge by just reading books on statistics and probability theory is a bit boring, so I need some practical task to solve.
And that is where I got the idea: to create a robot with a CPU and a bunch of sensors on board, to see what we can do with it in these microcontrollers-everywhere days.
After some Web exploration (1 hour in reality) I found a MakeBlock robot construction set that matches my expectations: programmable, constructible, and equipped with a lot of sensors. In particular this one, the mBot Ranger + Add-on Pack, the latter just for spare parts and the micro-servo it contains:
A nice small thing, quite basic in its original incarnation; indeed, it took 30 minutes to assemble it, set up its control program on my mobile phone (it has a Bluetooth interface), and run it in “Obstacle Avoidance Mode”.
Here are literally its first steps, and so the inevitable first crash 🙂
That default “Obstacle Avoidance” program is way too basic. It just uses the ultrasonic sensor (the two “eyes” in front) and the two motors in drive forward/reverse modes. But on board the device we also have a gyroscope/accelerometer and a “line following” sensor that could be used to detect a breakaway from the floor, etc.
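For illustration, here is a hedged sketch of what a slightly smarter decision step could look like once the extra sensors are in play. The `decide` function, the `Action` names, and the centimeter thresholds are my own assumptions, not MakeBlock’s actual firmware:

```cpp
// Hypothetical decision logic for an improved obstacle-avoidance loop.
// The Action names and the distance thresholds are assumptions for
// illustration, not MakeBlock's firmware.
enum Action { FORWARD, TURN, REVERSE, STOP };

Action decide(double distance_cm, bool floor_lost) {
    if (floor_lost)         return STOP;    // line sensor sees no floor: table edge?
    if (distance_cm < 10.0) return REVERSE; // too close to escape by turning
    if (distance_cm < 30.0) return TURN;    // obstacle ahead, steer away
    return FORWARD;                         // clear path
}
```

The point is that the “floor lost” check from the line-following sensor takes priority over everything the ultrasonic sensor reports.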
The final goal is to have something smart, with personality, and green (it will have solar panels to charge the batteries). Not the best PRD I have ever seen, but definitely a less boring one 🙂
Here is the device’s specification, of sorts:
|Component|Specification|
|---|---|
|Main board|Arduino Mega 2560: ATmega2560 processor @ 16 MHz, 256 KB flash memory, 8 KB SRAM, 4 KB EEPROM|
Side note: I was surprised that the board has a gyroscope, and by how minuscule it is. I do remember what the gyro-stabilized platform looked like on the SS-18 “Satan”. That was really THE DEVICE, with mechanical gyros and a bunch of pure-gold wires. Those spinning disks needed 5 minutes to reach operating speed. The gyro keeps the coordinate system of the start point, so trajectory and position can be computed by integrating acceleration.
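That last point, position by integrating acceleration, is simple enough to sketch: one integration turns acceleration into velocity, a second turns velocity into position. This is classic dead reckoning; the names and the Euler-integration scheme here are illustrative, and a real IMU pipeline would also have to subtract gravity and fight drift:

```cpp
// One-dimensional dead reckoning: Euler-integrate acceleration twice
// to get position relative to the start point. Real IMU code must
// also subtract gravity and correct for drift; this sketch ignores both.
struct DeadReckon {
    double v = 0.0; // velocity, m/s
    double x = 0.0; // position, m, in the start-point frame
    void step(double a, double dt) {
        v += a * dt; // acceleration -> velocity
        x += v * dt; // velocity -> position
    }
};
```

Because every error in the accelerometer is integrated twice, position error grows quadratically with time, which is exactly why those mechanical gyros had to be so precise.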
Speaking of gyros, there is an interesting question: relative to what, exactly, does a gyroscope keep its position? If you try to answer that, you will dig all the way from cosmology down to the Higgs boson…
It is going to be an interesting project, so stay tuned.
Next step: to make a better “scout mode” – that obstacle avoidance mode plus gathering statistics about room geometry. Unfortunately I do not have that much time for it, just 30–60 minutes per day, as I am busy with Sciter and the new html-notepad thing.
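The core of that “gathering statistics about room geometry” idea can be sketched in a few lines: each ultrasonic echo, combined with the current heading from the gyro, maps to a point in the coordinate frame of the start position. The function name and frame conventions below are my own assumptions, just to show the geometry:

```cpp
#include <cmath>

// Map one ultrasonic echo into the start-point coordinate frame:
// the robot at (rx, ry), facing heading_rad, saw an obstacle at
// range_cm. Names and conventions are illustrative assumptions.
struct Point { double x, y; };

Point echo_to_point(double rx, double ry, double heading_rad, double range_cm) {
    return { rx + range_cm * std::cos(heading_rad),
             ry + range_cm * std::sin(heading_rad) };
}
```

Accumulating these points while the robot wanders around would give a rough point cloud of the room’s walls and obstacles.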