Cheng Guo


Final Project Milestones & Deliverables

Milestone #1: Controlling the AIBO

The first step of the project is to build an interface that allows the user to control the AIBO remotely. To accomplish this, I will study the component of James Young’s AIBO Surrogate [1] code that handles remote control of the AIBO. I will then make minor adjustments to James’s code so that the user can control the AIBO with the keyboard instead of the mouse. This modification is necessary because users will control the AIBO from UMPCs (using hotkeys or joysticks) rather than from desktop computers (using a mouse). A short code sketch of such a keyboard control loop follows the deliverables below.

  • Deliverables:

1) A remote-control interface for walking the AIBO

2) The interface will contain a live streaming-video area that displays the AIBO’s vision

3) The interface will allow the user to control the AIBO with the keyboard (please refer to Milestone #6)
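
A rough sketch of what this keyboard control loop could look like, written in C++ for a POSIX terminal: WASD keypresses are mapped to walk-velocity commands and sent over TCP. The line-based "walk dx dy da" protocol, the port, and the AIBO’s address are illustrative assumptions; the robot-side server that would translate these commands into actual walking, as James’s code does, is not shown.

```cpp
// Minimal keyboard tele-operation client (POSIX). Each keypress maps to a
// walk-velocity command line sent to a (hypothetical) robot-side server.
#include <cstdio>
#include <string>
#include <termios.h>
#include <unistd.h>
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>

int main() {
    // Connect to the assumed robot-side command server.
    int sock = socket(AF_INET, SOCK_STREAM, 0);
    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_port = htons(10050);                        // assumed port
    inet_pton(AF_INET, "192.168.0.10", &addr.sin_addr);  // assumed AIBO address
    if (connect(sock, (sockaddr*)&addr, sizeof(addr)) != 0) {
        perror("connect");
        return 1;
    }

    // Raw terminal mode so single keypresses arrive without Enter.
    termios oldt{}, newt{};
    tcgetattr(STDIN_FILENO, &oldt);
    newt = oldt;
    newt.c_lflag &= ~(ICANON | ECHO);
    tcsetattr(STDIN_FILENO, TCSANOW, &newt);

    char c;
    while (read(STDIN_FILENO, &c, 1) == 1 && c != 'q') {  // 'q' quits
        std::string cmd;
        switch (c) {  // "walk <fwd mm/s> <side mm/s> <turn rad/s>"
            case 'w': cmd = "walk 150 0 0\n";  break;
            case 's': cmd = "walk -100 0 0\n"; break;
            case 'a': cmd = "walk 0 0 0.5\n";  break;
            case 'd': cmd = "walk 0 0 -0.5\n"; break;
            case ' ': cmd = "walk 0 0 0\n";    break;  // stop
            default:  continue;
        }
        send(sock, cmd.data(), cmd.size(), 0);
    }

    tcsetattr(STDIN_FILENO, TCSANOW, &oldt);  // restore the terminal
    close(sock);
    return 0;
}
```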

Milestone #2: Adding streaming audio support

The second step is to add streaming audio support to the user interface. Since the AIBO is used for home monitoring, it is important that users can hear what is happening around the AIBO. A sketch of the receiving side follows the deliverable below.

  • Deliverables:

1) Users will be able to hear live sound streamed from the AIBO’s microphones.
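
The receiving side could look like the sketch below: raw 16-bit PCM samples arrive over TCP and are played through PortAudio’s blocking-write API. The sample rate, channel count, port, and address are assumptions that would have to match whatever format the AIBO-side sender (not shown) actually streams.

```cpp
// Receives raw 16-bit PCM audio over TCP and plays it with PortAudio.
#include <portaudio.h>
#include <cstdio>
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>

int main() {
    const int kRate = 16000, kChannels = 2, kFrames = 256;  // assumed format

    int sock = socket(AF_INET, SOCK_STREAM, 0);
    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_port = htons(10051);                        // assumed audio port
    inet_pton(AF_INET, "192.168.0.10", &addr.sin_addr);  // assumed AIBO address
    if (connect(sock, (sockaddr*)&addr, sizeof(addr)) != 0) {
        perror("connect");
        return 1;
    }

    Pa_Initialize();
    PaStream* out = nullptr;
    Pa_OpenDefaultStream(&out, 0, kChannels, paInt16, kRate, kFrames, nullptr, nullptr);
    Pa_StartStream(out);

    short buf[kFrames * kChannels];
    for (;;) {
        // Fill exactly one buffer of samples from the socket.
        ssize_t want = sizeof(buf), got = 0;
        while (got < want) {
            ssize_t n = recv(sock, reinterpret_cast<char*>(buf) + got, want - got, 0);
            if (n <= 0) goto done;  // connection closed or error
            got += n;
        }
        Pa_WriteStream(out, buf, kFrames);  // blocking playback
    }
done:
    Pa_StopStream(out);
    Pa_CloseStream(out);
    Pa_Terminate();
    close(sock);
    return 0;
}
```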

Milestone #3: Overlaying icons on the streaming video screen

It is important to give users feedback about the current state of the AIBO. For example, the user should know the AIBO’s current walking speed, and should also be aware of its battery life and Wi-Fi signal strength. A sketch of the overlay step appears after the deliverables below.

  • Deliverables:

1) Design graphic icons for battery life, signal strength, and walking speed.

2) Overlay these icons on the streaming video screen.

3) Have the AIBO update these icons according to its current state.
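
A minimal sketch of the overlay step, using OpenCV: alpha-blend small icon images into a corner of each video frame and draw a text readout of the walking speed. The icon file names and the status values shown are placeholders; in the real interface they would be driven by the state the AIBO reports.

```cpp
// Overlays status icons and a speed readout onto a video frame with OpenCV.
#include <opencv2/opencv.hpp>

// Blend a BGRA icon onto `frame` at (x, y) using the icon's alpha channel.
// Assumes a 4-channel icon that fits entirely inside the frame.
void blendIcon(cv::Mat& frame, const cv::Mat& icon, int x, int y) {
    for (int r = 0; r < icon.rows; ++r)
        for (int c = 0; c < icon.cols; ++c) {
            cv::Vec4b px = icon.at<cv::Vec4b>(r, c);
            double a = px[3] / 255.0;
            cv::Vec3b& dst = frame.at<cv::Vec3b>(y + r, x + c);
            for (int ch = 0; ch < 3; ++ch)
                dst[ch] = cv::saturate_cast<uchar>(a * px[ch] + (1 - a) * dst[ch]);
        }
}

int main() {
    cv::Mat frame   = cv::imread("frame.png");  // stand-in for a live video frame
    cv::Mat battery = cv::imread("battery.png", cv::IMREAD_UNCHANGED);  // BGRA icons
    cv::Mat wifi    = cv::imread("wifi.png",    cv::IMREAD_UNCHANGED);
    if (frame.empty() || battery.empty() || wifi.empty()) return 1;

    blendIcon(frame, battery, 10, 10);
    blendIcon(frame, wifi, 10 + battery.cols + 10, 10);
    cv::putText(frame, "speed: 150 mm/s", cv::Point(10, frame.rows - 10),
                cv::FONT_HERSHEY_SIMPLEX, 0.6, cv::Scalar(255, 255, 255), 2);

    cv::imshow("AIBO view", frame);
    cv::waitKey(0);
    return 0;
}
```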

Milestone #4: Detecting visual changes in the domestic environment

The AIBO should be able to notice environmental changes while patrolling the home. For instance, it should be able to recognize the difference between locked and unlocked doors. Following Dr. Jeffrey Boyd’s suggestion, I will study Dr. David Lowe’s research on keypoint matching [2] and apply this technique to the project, so that the AIBO can visually compare different states of the same object and draw practical conclusions about the domestic environment. (Note that large components of this milestone will be completed as a course project for Dr. Boyd’s course, Image Analysis and Computer Vision, CPSC 635.) A sketch of the matching step follows the deliverables below.

  • Deliverables:

1) Allow the AIBO to patrol certain predefined locations around the home.

2) Have the AIBO take pictures of critical locations (such as doors and windows) and compare them with preloaded pictures in a database.

3) Notify the user if a difference between the pictures of a critical location is found (e.g., alert the user when an unlocked door is detected).
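
As a sketch of the comparison step, the code below matches SIFT keypoints between a stored reference picture and a new patrol picture using Lowe’s ratio test, and raises an alert when too few matches survive. It assumes OpenCV 4.4 or later (where SIFT lives in the main module); the image names and the alert threshold are placeholders to be tuned on real data.

```cpp
// Flags a visual change by keypoint matching (SIFT + Lowe's ratio test).
#include <opencv2/opencv.hpp>
#include <cstdio>
#include <vector>

int main() {
    cv::Mat ref = cv::imread("door_locked.png",  cv::IMREAD_GRAYSCALE);  // stored picture
    cv::Mat cur = cv::imread("door_current.png", cv::IMREAD_GRAYSCALE);  // patrol picture
    if (ref.empty() || cur.empty()) return 1;

    cv::Ptr<cv::SIFT> sift = cv::SIFT::create();  // OpenCV >= 4.4
    std::vector<cv::KeyPoint> kpRef, kpCur;
    cv::Mat descRef, descCur;
    sift->detectAndCompute(ref, cv::noArray(), kpRef, descRef);
    sift->detectAndCompute(cur, cv::noArray(), kpCur, descCur);

    // Lowe's ratio test: keep a match only if its best distance is clearly
    // smaller than the second-best distance.
    cv::BFMatcher matcher(cv::NORM_L2);
    std::vector<std::vector<cv::DMatch>> knn;
    matcher.knnMatch(descRef, descCur, knn, 2);
    int good = 0;
    for (const auto& m : knn)
        if (m.size() == 2 && m[0].distance < 0.7f * m[1].distance)
            ++good;

    // Few surviving matches suggest the scene changed (e.g., door now open).
    double score = kpRef.empty() ? 0.0 : double(good) / kpRef.size();
    std::printf("good matches: %d (%.0f%% of reference keypoints)\n", good, 100 * score);
    if (score < 0.2)  // assumed threshold
        std::printf("ALERT: location looks different from its stored state\n");
    return 0;
}
```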

Milestone #5: Detecting the location of unusual sounds

In addition to visual changes, the AIBO should be able to detect unusual sounds. When the AIBO detects an unusual sound, it will provide the user with a rough direction of where the sound was generated. By supplying visual cues, such as arrow icons on top of the streaming video, the interface will guide the user in walking the AIBO toward that location. A sketch of the direction estimate follows the deliverables below.

  • Deliverables:

1) Detect the rough direction of an unusual sound

2) Provide visual feedback to the user and guide her in walking the AIBO to that location
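
One plausible way to get a rough direction is to estimate the time difference of arrival between the AIBO’s two microphones. The sketch below brute-forces the cross-correlation lag between the stereo channels and converts it to a bearing; the sample rate and microphone spacing are assumptions, and the toy click signals stand in for real audio buffers.

```cpp
// Estimates a sound's bearing from the inter-microphone delay (TDOA),
// found by brute-force cross-correlation of the two channels.
#include <cmath>
#include <cstdio>
#include <vector>

// Returns the lag (in samples) of R relative to L with maximum correlation.
int bestLag(const std::vector<float>& L, const std::vector<float>& R, int maxLag) {
    int best = 0;
    double bestScore = -1e30;
    for (int lag = -maxLag; lag <= maxLag; ++lag) {
        double s = 0;
        for (size_t i = 0; i < L.size(); ++i) {
            int j = int(i) + lag;
            if (j >= 0 && j < int(R.size())) s += L[i] * R[j];
        }
        if (s > bestScore) { bestScore = s; best = lag; }
    }
    return best;
}

int main() {
    const double fs = 16000.0;  // assumed sample rate (Hz)
    const double d  = 0.08;     // assumed microphone spacing (m)
    const double c  = 343.0;    // speed of sound (m/s)

    // Toy signals: the click reaches the right microphone 3 samples later,
    // so the source is closer to the left microphone.
    std::vector<float> left(256, 0.0f), right(256, 0.0f);
    left[100]  = 1.0f;
    right[103] = 1.0f;

    int maxLag = int(std::ceil(d / c * fs)) + 1;  // physical limit on the delay
    int lag = bestLag(left, right, maxLag);

    double x = (c * lag / fs) / d;  // fraction of the maximum possible delay
    if (x >  1.0) x =  1.0;
    if (x < -1.0) x = -1.0;
    double bearing = std::asin(x) * 180.0 / std::acos(-1.0);  // 0 = straight ahead
    std::printf("lag = %d samples, bearing ~ %.0f degrees\n", lag, bearing);
    return 0;
}
```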

Milestone #6: Porting the existing program to a UMPC

To allow the user to control the AIBO from any location, I will port the project from a desktop computer onto a UMPC (Ultra-Mobile PC [3]). Modifications will be needed so that the program adapts to and works with the UMPC’s input devices; a sketch of one way to isolate these changes follows the deliverables below.

  • Deliverables:

1) Port the existing program to a UMPC

2) Modify the existing controls so that they fit the UMPC’s input devices
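
One way to keep this porting work small is to isolate input handling behind a mapping table, as in the sketch below: the logical commands stay the same, and only the table changes between the desktop keyboard and the UMPC’s hardware buttons. The key codes are illustrative.

```cpp
// An input-mapping layer: physical key codes map to logical robot commands,
// so moving to the UMPC only means swapping the table, not the control code.
#include <cstdio>
#include <map>

enum class Command { Forward, Backward, TurnLeft, TurnRight, Stop };
using KeyMap = std::map<int, Command>;

KeyMap desktopMap() {
    return {{'w', Command::Forward},  {'s', Command::Backward},
            {'a', Command::TurnLeft}, {'d', Command::TurnRight},
            {' ', Command::Stop}};
}

KeyMap umpcMap() {
    // Illustrative scan codes for a UMPC's directional hardware buttons.
    return {{0x26, Command::Forward},  {0x28, Command::Backward},
            {0x25, Command::TurnLeft}, {0x27, Command::TurnRight},
            {0x0D, Command::Stop}};
}

void dispatch(Command cmd) {
    // In the real program this would send the walk command to the AIBO.
    static const char* names[] = {"forward", "backward", "turn-left", "turn-right", "stop"};
    std::printf("command: %s\n", names[static_cast<int>(cmd)]);
}

int main() {
    KeyMap keys = umpcMap();  // swap in desktopMap() on a desktop build
    int pressed = 0x26;       // pretend the UMPC's "up" button was pressed
    auto it = keys.find(pressed);
    if (it != keys.end()) dispatch(it->second);
    return 0;
}
```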

References

1. AIBO Aware. http://pages.cpsc.ucalgary.ca/~jyoung/pmwiki.php?n=Projects.AIBOAware

2. Lowe, D. Keypoint Detector. http://www.cs.ubc.ca/~lowe/keypoints/

3. UMPC. http://www.microsoft.com/windowsxp/umpc/default.mspx

Assignment 2 Proposal

As part of the final project deliverables, I am planning to learn how to write and run programs on the Sony AIBO. This assignment will consist of two major parts. The first part will be to learn the AIBO’s programming interface, Tekkotsu [1]. Tekkotsu is a third-party API developed at Carnegie Mellon University; it allows programmers to control the Sony AIBO without worrying about low-level coding.
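
To make this concrete, the sketch below shows roughly what a minimal Tekkotsu behavior looks like, modeled on the walking examples in the Tekkotsu tutorial: the behavior registers a WalkMC motion command with the motion manager and sets a target velocity. Header paths and method names vary between Tekkotsu releases, so everything here should be checked against the version actually installed.

```cpp
// A minimal Tekkotsu behavior that walks the AIBO forward, in the style of
// the Tekkotsu tutorial's walking examples. Check headers and signatures
// against the Tekkotsu release in use.
#include "Behaviors/BehaviorBase.h"
#include "Motion/MotionManager.h"
#include "Motion/WalkMC.h"
#include "Motion/MMAccessor.h"
#include "Shared/SharedObject.h"

class WalkForwardBehavior : public BehaviorBase {
public:
    WalkForwardBehavior()
        : BehaviorBase("WalkForwardBehavior"),
          walk_id(MotionManager::invalid_MC_ID) {}

    virtual void DoStart() {
        BehaviorBase::DoStart();
        // Hand a walk motion command to the global motion manager...
        walk_id = motman->addPersistentMotion(SharedObject<WalkMC>());
        // ...and ask it to walk forward (fwd mm/s, side mm/s, turn rad/s).
        MMAccessor<WalkMC> walk(walk_id);
        walk->setTargetVelocity(120, 0, 0);
    }

    virtual void DoStop() {
        motman->removeMotion(walk_id);  // stop walking and release the command
        walk_id = MotionManager::invalid_MC_ID;
        BehaviorBase::DoStop();
    }

private:
    MotionManager::MC_ID walk_id;
};
```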

The second part of this assignment will be learning and modifying James Young’s AIBO Aware code. James wrote a program that allows users to remotely control the Sony AIBO within the Community Bar. I want to modify his code so that it can run as a stand-alone program, independent of the Community Bar.

Deliverables:

1) Modify James’s code to create a stand-alone program for remotely controlling the Sony AIBO

2) Add keyboard support for controlling the AIBO

Reference

1. Tekkotsu. http://tekkotsu.no-ip.org/

Seminar Proposal

Over the past few years, several robots have been introduced into the consumer market, from domestic service robots like the Roomba and Scooba [1] to intelligent, toy-like robots such as the Sony AIBO. As robots enter everyday life, interaction between humans and robots becomes an unavoidable problem that designers have to face and solve. How can we communicate with robots efficiently? How can humans and robots understand each other’s needs and thoughts? How can we treat robots as social artifacts and grant them social meaning? These are among the many problems that robot designers have to tackle.

In my project, I will face many of the problems mentioned above. Thus, for the seminar, I propose to explore topics related to human-robot interaction and UBICOMP, such as understanding the influence of design on human-robot interaction, analyzing the problems associated with interface design for controlling robots, and learning to apply emotional design principles to robots.

Hopefully, by learning and understanding these related topics, I will be able to apply my knowledge of UBICOMP to human-robot interaction and thoroughly understand the potential problems associated with my project.

Reference

1. Scooba. http://www.irobot.com/sp.cfm?pageid=128

Research Proposal (Revised)

Active Home Monitoring System

Due to the limitations of our senses and physical presence, we can only detect and respond to the environment immediately around us; we are naturally limited in our ability to sense and interact remotely. Although we can use digital sensors to extend our senses to remote places, we have to confine ourselves to a specific environment (such as a security monitoring room) to monitor the results. Moreover, beyond letting us read and interpret this feedback, such a remote sensing system does not allow us to physically respond to remote events.

Weiser envisioned that, in the ubiquitous computing era, people will be surrounded by hundreds of “invisible” computers that aid them [1]. We may consider the premise that robots can also be members of this “sea of computers”. Robots, while arguably not ubiquitous, can give humans awareness of remote environments and, at the same time, the ability to move and physically interact with remote objects, which other computers cannot. By utilizing the power of current handheld devices, we may interact with robots from anywhere.

For this project, I am planning to use the Sony AIBO robot dog, combined with handheld devices like Ultra-Mobile PCs (UMPCs [2]), to build a remote monitoring and awareness system for a home environment. The AIBO is an intelligent robot that can move through its physical space and respond to visual, audio and haptic input. Remote users equipped with a mobile platform such as a UMPC will be able to monitor their home environments through the AIBO’s sensors. For instance, if the AIBO detects an unusual sound at home while the users are working at their offices, they will be able to look through the AIBO’s vision system and get a sense of what is happening at home. People will therefore be able to extend their visual and auditory awareness to a remote place through a robotic interface.

The AIBO also has the ability to move and dynamically explore the home environment to discover “unusual” events; for example, it can patrol around the doors and windows to make sure that they are always locked. If anything "looks" different from its previous state, the users will get active, live feedback from the AIBO. Moreover, users will be able to communicate with and control the AIBO to trigger certain events. For example, users may direct the AIBO to interact with actuators and sensors, such as Phidgets, to complete physical tasks, or to gather more information from devices like thermometers and motion sensors, enriching the AIBO’s physical, visual and auditory abilities.

In conclusion, by combining the AIBO and a UMPC, I plan to design and implement an active home monitoring system with which users can be aware of their home conditions remotely. Also, by observing the interaction between humans and robots, I hope to explore and discover ways to tighten the relationship between them, so that robots can be granted more social meaning, such as trust and reliability [3].

References

1. Weiser, M. (1991) The Computer for the 21st Century. Scientific American, 265(3), 94-104, September.

2. Ultra-Mobile PC, http://www.microsoft.com/windowsxp/umpc/default.mspx

3. Norman, D. (2004) People, Places, and Things. In Emotional Design: Why We Love (or Hate) Everyday Things, Chapter 5 (pp. 135-160). New York: Basic Books.