
GeckoSystems' Mobile Robots Conference to Demo CompoundedSensorArray™ Sensor Fusion Technology

CONYERS, Ga., Oct. 22, 2009 -- GeckoSystems Intl. Corp. (PINKSHEETS: GCKO) announced today that attendees at its first annual "Mobile Robots in Motion" conference, November 4-5, 2009, will be able to observe and discuss GeckoSystems' sensor-fusion-based vision system, the CompoundedSensorArray™. GeckoSystems is a dynamic leader in the emerging mobile robotics industry, revolutionizing the development and use of mobile robots with "Mobile Robot Solutions for Safety, Security and Service™."

"Traditional video-centric machine vision is very expensive in dollars, power consumed, and time required to update. Taking a cue from compound insect eyes in nature, we invented the CompoundedSensorArray (CSA), which 'creates' two hundred and sixty facets (like a compound insect eye) by scanning ten infrared (IR) range finders over twenty-six positions every 1-1.5 seconds. We fuse this 180-degree field of view (FOV) with sonar range finders to give the CSA greater range and reliability for locating stationary and/or dynamic obstacles quickly and at low cost, both in dollars and time," observed Martin Spencer, President/CEO, GeckoSystems.

The CSA system also includes a high-speed stepping motor, its control electronics, and the software to correctly assemble the facets' ranging data into an abstracted image for use by the company's automatic self-navigation software, GeckoNav™.
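The release does not disclose how the facet data are actually assembled, but the scan it describes (ten IR range finders stepped through twenty-six positions, then fused with sonar) could be organized along these lines. This is a minimal sketch; the function names, the position-major interleaving order, and the nearest-reading fusion rule are all assumptions, not GeckoSystems' implementation:

```python
NUM_SENSORS = 10      # IR range finders on the scanning head
NUM_POSITIONS = 26    # stepper positions per sweep (10 x 26 = 260 facets)
FOV_DEG = 180.0       # field of view covered by one sweep

def facet_bearings():
    """Bearing (degrees) of each of the 260 facets across the 180-degree FOV."""
    total = NUM_SENSORS * NUM_POSITIONS
    step = FOV_DEG / (total - 1)
    return [i * step for i in range(total)]

def assemble_facets(ir_sweep, sonar_ranges):
    """Fuse one IR sweep with per-facet sonar ranges (hypothetical layout).

    ir_sweep: ir_sweep[s][p] = range (meters) from sensor s at stepper position p.
    sonar_ranges: one coarse sonar range per facet (longer range, wider beam).
    Returns one fused range per facet: the nearer of the two readings, so an
    obstacle detected by either sensor type is preserved.
    """
    facets = []
    for p in range(NUM_POSITIONS):      # assumed position-major facet order
        for s in range(NUM_SENSORS):
            facets.append(ir_sweep[s][p])
    return [min(ir, so) for ir, so in zip(facets, sonar_ranges)]
```

Taking the minimum of the two readings is one simple way a short-range, narrow-beam sensor and a long-range, wide-beam sensor can back each other up; a production system would likely weight the sources by confidence instead.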

"The amount of data that the CompoundedSensorArray provides is far greater than what can reasonably be collected with fixed sensors, and at much lower cost than the scanning laser range-finding systems frequently used," stated Mark Peele, Vice President, Research and Development, GeckoSystems.

"This low-cost sensor fusion system not only provides the critical, actionable situation-awareness information sufficient for our AI navigation software, GeckoNav, but does so at a lower cost in both dollars and power. This reduces the CareBot's cost to the end user while maintaining a high level of utility and increasing ROI for our stockholders," concluded Spencer.

According to Wikipedia: "Sensor fusion is the combining of sensory data or data derived from sensory data from disparate sources such that the resulting information is in some sense better than would be possible when these sources were used individually."

Every day we use "sensor fusion" in our routine activities. For example, we smell smoke and then look for a grayish cloud to determine the source of the smoke, its proximity, and the consequent degree of danger to us. In noisy crowds, when we talk with someone, we use lip reading to help us understand what we don't hear clearly. Humans use sensor fusion every day to make choices based on data that is interdependent or incomplete, rather than relying on only one of our five senses. The better the sensor fusion, the better the choices and the more "actionable" the "situation awareness" is.

GeckoSystems employs proprietary sensor fusion technologies not only in its flagship automatic self-navigation software, GeckoNav™, but also in GeckoTrak™, the GeckoSPIO™, and GeckoOrient™. GeckoTrak uses advanced sensor fusion to merge machine vision, passive infrared, and sonar to identify and/or locate the person of interest, such that GeckoTrak can automatically inform GeckoNav of the designated person's whereabouts for continuous proximate monitoring. The GeckoSPIO, a sensor/power input/output mobile robot controller board, enables faster, more graceful self-navigation through loose crowds of moving people, as in airport, bus, and train terminals, shopping centers, and other public areas. GeckoOrient automatically and intelligently merges sensor data from odometry (dead reckoning), a solid-state compass, and accelerometer-based gyroscopes (IMUs) for enhanced orientation accuracy while errand running, patrolling, or following a designated person.
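GeckoSystems does not publish how GeckoOrient weighs its sources, but a common textbook way to merge a drifting gyroscope with an absolute (if noisy) compass is a complementary filter: integrate the gyro for short-term accuracy and nudge the result toward the compass to cancel long-term drift. The sketch below is a generic illustration of that technique under assumed units, not the company's algorithm:

```python
def fuse_heading(prev_heading, gyro_rate, compass_heading, dt, alpha=0.98):
    """One complementary-filter step; all headings in degrees, dt in seconds.

    alpha weights the gyro-integrated path; (1 - alpha) weights the compass.
    A high alpha trusts the gyro from moment to moment while still letting
    the compass slowly pull out accumulated drift.
    """
    predicted = (prev_heading + gyro_rate * dt) % 360.0
    # Correct toward the compass along the shortest angular distance,
    # so a 350-degree-vs-10-degree disagreement reads as 20 degrees, not 340.
    error = ((compass_heading - predicted + 180.0) % 360.0) - 180.0
    return (predicted + (1.0 - alpha) * error) % 360.0
```

In practice odometry would feed the same loop as a third source, and a Kalman filter is the heavier-weight alternative when sensor noise statistics are known.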

Like automobiles, mobile robots are made from steel, aluminum, plastic, and electronics, but they run ten to twenty times as much software. The CareBot has an aluminum frame, a plastic shroud, two independently driven wheels, multiple sensor systems, microprocessors, and several onboard computers connected in a local area network (LAN). The microprocessors directly interact with the sensor systems and transmit data to the onboard computers. The onboard computers each run independent, highly specialized cooperative/subsumptive artificial intelligence (AI) software programs, GeckoSavants™, which interact to complete tasks in a timely, intelligent, and common-sense manner. GeckoNav™, GeckoChat™, and GeckoTrak™ are primary GeckoSavants. GeckoNav is responsible for maneuvering, avoiding dynamic and/or static obstacles, seeking waypoints, and patrolling. GeckoChat is responsible for interaction with the care-receiver, such as answering questions, assisting with daily routines and reminders, and responding to other verbal commands. GeckoTrak, which is mostly transparent to the user, enables the CareBot to maintain proximity to the care-receiver using sensor fusion. The CareBot is an internet appliance that is accessible for remote video/audio monitoring and telepresence.
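The cooperative/subsumptive arrangement described above can be illustrated, in deliberately simplified form, by priority-ordered behaviors in which a higher layer's output subsumes the layers below it. The behavior names and sensor fields here are hypothetical stand-ins for illustration, not GeckoSavant internals:

```python
# Each behavior returns a command, or None to defer to lower layers.
def avoid_obstacle(sensors):
    return "turn_away" if sensors["nearest_m"] < 0.5 else None

def follow_person(sensors):
    return "track_person" if sensors["person_visible"] else None

def patrol(sensors):
    return "next_waypoint"   # lowest layer always has a default task

LAYERS = [avoid_obstacle, follow_person, patrol]  # highest priority first

def arbitrate(sensors):
    """Subsumption-style arbitration: the first layer with an opinion wins."""
    for behavior in LAYERS:
        command = behavior(sensors)
        if command is not None:
            return command
```

The point of the layering is that safety-critical reflexes (obstacle avoidance) can preempt goal-directed behaviors (person following, patrolling) without those behaviors needing to know about each other.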

As predicted in a recent Forbes article (http://www.forbes.com/2009/09/17/robots-health-care-technology-breakthroughs-telehealth.html), near-term in-home evaluation trials of GeckoSystems' first product, the CareBot™, were recently announced, owing to its sufficiency and cost-effective robustness. The conference will enable industry observers to witness the CareBot and judge for themselves how close the company's first product is to market and to consumer acceptance.

Journalists are encouraged to contact Mr. Spencer regarding GeckoSystems' progress and possible attendance at the company's upcoming invitation-only "Mobile Robots in Motion" conference. Journalists and other interested parties may request an invitation on the company's website or by calling 678-413-1640.
