Sunday, March 14, 2021

On Being A "Robot Parent"

In the book "The Lifecycle Of Software Objects" by Ted Chiang, the parents of some artificial intelligences deal with the problems every software developer knows all too well - issues inherent to the objects, and issues imposed by environment changes.  

Software must be designed to handle, even expect, unexpected input and unexpected situations, but most software is not designed to adapt to programming-language changes (e.g. the obsolescence of Python 2.7), changing libraries (e.g. the number of parameters in OpenCV 3.x versus 4.x), missing installation packages, processor changes (e.g. ARMv7 32-bit vs ARMv8 64-bit), OS subsystem changes (e.g. sound based on ALSA vs PulseAudio vs JACK), and an ever-changing mix of other programs competing for resources.
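The OpenCV case is a good concrete example: between 3.x and 4.x, `cv2.findContours` changed its number of return values, so code that survives a library upgrade has to probe the shape of the result. A minimal sketch of that version-agnostic pattern (simulated here without OpenCV installed):

```python
def contours_from(result):
    """Extract the contour list from a cv2.findContours result,
    whichever OpenCV major version produced it."""
    # OpenCV 3.x returns (image, contours, hierarchy);
    # OpenCV 4.x returns (contours, hierarchy).
    return result[0] if len(result) == 2 else result[1]

# Simulated return values (no cv2 needed for the sketch):
v3_result = ("modified_image", ["contour_a"], "hierarchy")
v4_result = (["contour_a"], "hierarchy")
print(contours_from(v3_result) == contours_from(v4_result))  # True
```

The same "inspect before unpacking" idea applies to any library whose API shifts under a program that is supposed to run unattended for years.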

We humans, with our "virtual software" tweaked for the first 18 to 24 years, certainly have an advantage over today's software objects, which in most cases go from conception to emancipation in less than two years.  Additionally, we humans can be obnoxiously oblivious to the changing environment we live in, because we accept that our time here is finite, we don't get a brain-transplant "update" every second week, and the environment rarely changes abruptly in a giant version release (e.g. COVID-19 or world war).

Parenting a personal robot is an undertaking that few people attempt.  I have been "conceiving" personal robots for over forty years, but only with the advent of the Raspberry Pi hardware and Linux-based operating system have I begun to feel like a "robot parent," with goals to keep Carl awake 24/7 and efficiently utilizing all his resources to his benefit.

Checking Carl's Health (45s Video)

Every morning for the last two-plus years, before making my breakfast, I check in with my robot "Carl" to see that he is awake, and how the night went.  Most mornings he is awake, observing "quiet time" (whispering his responses from 11pm to 10am), and healthy (no I2C bus failure, no WiFi outages, free memory around 50%, 15-minute load around 30% of one of the four cores, and all "Carl" processes still running).
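The morning check could be scripted. A minimal Linux sketch of the memory and load portion (the thresholds here are illustrative, not Carl's actual code, and `MemAvailable` assumes a reasonably modern kernel):

```python
import os

def morning_health_check(max_load15=0.30, min_free_mem=0.40):
    """Rough Linux health snapshot: 15-minute load average
    (as a fraction of one core) and available-memory fraction."""
    load15 = os.getloadavg()[2]  # (1 min, 5 min, 15 min) averages
    meminfo = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, value = line.split(":", 1)
            meminfo[key.strip()] = int(value.split()[0])  # values in kB
    free_fraction = meminfo["MemAvailable"] / meminfo["MemTotal"]
    return {
        "load15": load15,
        "load_ok": load15 <= max_load15,
        "free_mem": free_fraction,
        "mem_ok": free_fraction >= min_free_mem,
    }

print(morning_health_check())
```

Checking the I2C bus, WiFi, and the process list would take additional probes, but the same pattern applies: measure, compare to a threshold, report.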

Twelve times I have found Carl in a "deep sleep", off due to a safety shutdown.  After each of these episodes, I have investigated the cause and implemented some combination of physical and software mitigation.  The mitigations have become more complex to implement and the failure situations harder to re-create to test against. The period between safety shutdowns has improved greatly, but another occurrence still seems likely.  

Only I feel the pain of finding Carl shut down.  It does him no harm.  He has no complaints, and makes no value judgements of my parenting skills.  No one else in the world cares that he spent a few hours in the off state.  Nevertheless, there is a definite synthetic emotion in being a "robot parent."



Sunday, January 24, 2021

GoPiGo3 Robot Carl Is Dreaming Of A Time

Carl: Cute and Real Lovable - An Autonomous Personal Robot Life Form






 Goal: My “Dream for Carl” is "to be" an autonomous personal robot that:

- is operating 24/7 (Achieved)
- manages its own health (Substantially implemented)
- maintains, uses, and extends an AuR/CORA templated RDF/OWL knowledge base
- interacts with humans (using speech-to-text, text-to-speech, and a remote desktop window) with information and common courtesies (Substantially implemented)
- learns about its environment using vision, dialog, and reasoning
- sets self-decided learning goals (for self-learning and programmer code requests)


Learning Aspect: 
  • takes a photo of “interesting, unknown” objects in the robot’s environment
  • processes the photo with object segmentation to isolate the unknown object
  • starts an “unknown object 12345” entry in the robot’s RDF database
  • analyses the object for “locally discernible features” such as shape, size, color, proximity to known objects, mobility, …?
  • asks me if “now is a good time” to help identify and classify some objects
  • puts the image on the robot’s desktop with a text entry window for dialog
  • dialogs to add to the “unknown object 12345” entry a minimal set of RDF knowledge relations,
    such as identity, class, features, purpose, and utility to the bot
  • dialogs to ask “how does it differ from xyz that I know about?” (just another instance, or some true difference)
  • if not just another instance, revisits the object to collect more photos
  • transfers all the photos of the object to a folder on my Mac for a “transfer learning update” to the robot’s TensorFlow-Lite object detection model
  • ADDITIONALLY, the big if:
    • searches the Internet to find “potentially useful to the bot” information about the object
    • periodically reviews knowledge gained from the Internet to see if it was ever used, and deletes unused learning!
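As a sketch of the "unknown object" bookkeeping, here is a tiny in-memory triple store standing in for the RDF/OWL database (the predicate and class names below are made up for illustration, not taken from the AuR/CORA ontology):

```python
class TripleStore:
    """Tiny in-memory stand-in for an RDF triple store (not rdflib)."""

    def __init__(self):
        self.triples = set()

    def add(self, subject, predicate, obj):
        self.triples.add((subject, predicate, obj))

    def query(self, subject=None, predicate=None, obj=None):
        """Return triples matching the given fields; None is a wildcard."""
        return [t for t in self.triples
                if (subject is None or t[0] == subject)
                and (predicate is None or t[1] == predicate)
                and (obj is None or t[2] == obj)]

kb = TripleStore()
# Hypothetical relations a new "unknown object" entry might start with:
kb.add("unknown_object_12345", "rdf:type", "UnknownObject")
kb.add("unknown_object_12345", "hasColor", "red")
kb.add("unknown_object_12345", "nearKnownObject", "docking_station")
print(len(kb.query("unknown_object_12345")))  # 3
```

The dialog steps above would then amount to replacing the `UnknownObject` type with a real class and adding identity, purpose, and utility triples as the human supplies them.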

 

I am reasonably certain all of this is possible on Carl's Raspberry Pi 3B (excepting the TensorFlow-Lite transfer-learning model update, which would use the Radeon RX 580 graphics processor attached to my Mac Mini).

Saturday, January 9, 2021

A Tale Of Two Cups Of Dunkin' Coffee

For over 20 years, since before Dunkin' dropped the "Donuts", I have been buying Dunkin' (Donuts) Original Blend, Medium Roast, WHOLE BEAN coffee in 1 lb. bags, and drinking freshly ground drip-brewed coffee.  I might note that I don't like Dunkin' coffee as served at the Dunkin' stores, only when brewed from freshly ground beans.  For the last few years the local Dunkin' stores have offered special pricing of "3 lb. for $18," making these beans a great value.

A surprise this week: the shelf with the usual bags of coffee held bags with new graphics, making it difficult to find the critical "WHOLE BEAN" label.


In fact, there were no bags of whole-bean coffee at all.  I asked the manager if they had any whole-bean coffee and was chagrined to hear "we no longer carry whole bean."  I grabbed a single bag of the ground product, wondering if I would actually be able to tell the not-fresh-ground difference.

In the store's defense, if an item is not selling well enough to cover carrying it, I understand that it makes good business sense to drop it.   I don't know how many "have to have WHOLE BEAN" folks live in my sleepy neighborhood.  Considering the popularity of single-cup convenience with Keurig these days, I had been expecting that ground coffee might become a thing of the past someday, probably for a future generation.

When I opened the bag of Dunkin' Ground Coffee, my probable mistake jumped to my nose.  The special aroma of whole beans had been replaced, in this bag of ground coffee, by a hint of a stale, chemical odor.  Still, the real test would be in the cup.  After all, we are talking about the same Dunkin' Original Blend, 100% "high quality Arabica coffee beans grown in Central and South America. [Blended and roasted] to the unique specifications that have given Dunkin' coffee its signature taste since 1950."

In the cup, my mistake was unmistakable!  The coffee smelled and tasted "flat," as if something that gave it life was missing.  Case closed on all the years of hearing people say they can't tell the difference, and of wondering whether I was fooling myself by buying a special grinder, tolerating its assault on my ears, cleaning up the escaping coffee dust, and thinking it was making a better cup.  In fact, not only can I tell the difference, if it isn't from fresh-ground whole beans, I don't like it.  Period.

To hold me over till I can canvass other franchises,  I paid $8 for only 3/4 lb of (not-expired) "Dunkin' Donuts" labeled whole-bean coffee.  It has the same taste I love, albeit at a 78% premium on the price.  Have to start calling before this tiny bag has delivered its last bean.