Cheng Guo
CPSC70181.ChengGuo History
Changed lines 1-3 from:
to:
%rfloat% [[CPSC70181/Home| Attach:logo-70181.png]]
!!!Phidget Video
Changed lines 5-7 from:
!! Seminar Readings
to:
!!! Seminar Readings
Added lines 1-3:
!!Phidget Video
[[http://pages.cpsc.ucalgary.ca/~cheguo/70181/pvid.wmv | '+Click here to watch+']]
Changed lines 3-4 from:
[[http://pages.cpsc.ucalgary.ca/~cheguo/70181/NormanChp6_7.pdf | '''Click here to download''']]
to:
[[http://pages.cpsc.ucalgary.ca/~cheguo/70181/NormanChp6_7.pdf | '+Click here to download+']]
Changed lines 3-4 from:
[[http://pages.cpsc.ucalgary.ca/~cheguo/70181/NormanChp6_7.pdf | Click here to download]]
to:
[[http://pages.cpsc.ucalgary.ca/~cheguo/70181/NormanChp6_7.pdf | '''Click here to download''']]
Changed lines 3-6 from:
* Emotional Design - Chapter 7: The Future of Robots
to:
[[http://pages.cpsc.ucalgary.ca/~cheguo/70181/NormanChp6_7.pdf | Click here to download]]
Added lines 1-6:
!! Seminar Readings
* [[http://pages.cpsc.ucalgary.ca/~cheguo/70181/Roomba.pdf | Service Robots in the Domestic Environment: A Study of the Roomba Vacuum in the Home]]
* Emotional Design - Chapter 6: Emotional Machines
* Emotional Design - Chapter 7: The Future of Robots
Changed lines 106-107 from:
!!Research Proposal
to:
!!Research Proposal (Revised)
Changed lines 111-113 from:
Due to the limitation of our sensory and physical presence, we can only detect and respond to the environment around us. So we are naturally limited in our ability for remote sensing and interaction. Although, we can use digital sensors to extend our sensory to remote places, we have to confine ourselves to a specific environment (such as, security monitoring room) to monitor the results. Besides of reading and interpreting these feedbacks, this remote sensory system does not allow users to physically response to remote events.
Weiser envisioned that people will be surrounded with hundreds of “invisible” computers to aid them in the ubiquitous computing era [1]. We may consider the premise that robots can also be members among this “sea of computers”. Robots, while arguably not ubiquitous, can support humans with awareness to remote environments and at the same time with the ability to move and physically interact with remote physical objects, which other computer cannot. By utilizing the power of current handheld devices, we may grant robots with ubiquity so that humans can interact with them anywhere.
to:
Due to the limitations of our senses and physical presence, we can only detect and respond to the environment immediately around us. So we are naturally limited in our ability for remote sensing and interaction. Although we can use digital sensors to extend our senses to remote places, we have to confine ourselves to a specific environment (such as a security monitoring room) to monitor the results. And beyond reading and interpreting this feedback, such a remote sensory system does not allow users to physically respond to remote events.
Weiser envisioned that people would be surrounded by hundreds of “invisible” computers to aid them in the ubiquitous computing era [1]. We may consider the premise that robots can also be members of this “sea of computers”. Robots, while arguably not ubiquitous, can support humans with awareness of remote environments and, at the same time, with the ability to move and physically interact with remote physical objects, which other computers cannot. By utilizing the power of current handheld devices, we may interact with robots anywhere.
Changed line 115 from:
For this project, I am planning to use the Sony AIBO robot dog combined with handheld devices like Ultra Mobile PCs (UMPCs [2]) to build a remote monitoring and awareness system for a home environment. AIBO is an intelligent robot which has the capabilities of moving in the physical space and responding to visual, audio and haptic inputs. A faraway user equipped with a mobile platform such as a UMPC will be allowed to monitor his home environment through the AIBO’s sensors. For instance, if there is an unusual sound detected by AIBO at home while the user is working at their office, the user will be able to look through the AIBO vision system and get a sense of what is happening at home. Therefore, people will be able to extend their visual and auditory awareness to a remote place through a robotic interface. Also, AIBO has the ability to move and dynamically explore the home environment to discover “unusual” events, such as patrolling around the doors and windows to make sure that they are always locked. So the users will get active and live feedbacks from the AIBO. Moreover, users will have the ability to communicate and control the AIBO to trigger certain events. For example, users may control the AIBO to interact with actuators and sensors, like Phidgets, to complete certain physical tasks or to sense more information from devices like thermometers and motion sensors, to enrich the AIBO’s physical, visual and auditory abilities.
to:
For this project, I am planning to use the Sony AIBO robot dog combined with handheld devices like Ultra Mobile PCs (UMPCs [2]) to build a remote monitoring and awareness system for a home environment. AIBO is an intelligent robot which has the capabilities of moving in its physical space and responding to visual, audio and haptic inputs. Faraway users equipped with a mobile platform such as a UMPC will be able to monitor their home environments through the AIBO’s sensors. For instance, if there is an unusual sound detected by AIBO at home while the users are working at their offices, the users will be able to look through the AIBO vision system and get a sense of what is happening at home. Therefore, people will be able to extend their visual and auditory awareness to a remote place through a robotic interface. Also, AIBO has the ability to move and dynamically explore the home environment to discover “unusual” events; for example, the AIBO can patrol around the doors and windows to make sure that they are always locked. If anything "looks" different from its previous state, the users will get active and live feedback from the AIBO. Moreover, users will have the ability to communicate and control the AIBO to trigger certain events. For example, users may control the AIBO to interact with actuators and sensors, like Phidgets, to complete certain physical tasks or to sense more information from devices like thermometers and motion sensors, to enrich the AIBO’s physical, visual and auditory abilities.
Changed lines 117-120 from:
In conclusion, by combining AIBO and UMPC, I plan to design and implement an active home monitoring system where users can be aware of their home conditions remotely. Also, by observing the interaction between humans and robots, I hope to explore and discover ways to tighten the relationship between them, so that robots can be granted with more social meanings such as trust and reliability [3].
to:
In conclusion, by combining the AIBO and a UMPC, I plan to design and implement an active home monitoring system where users can be aware of their home conditions remotely. Also, by observing the interaction between humans and robots, I hope to explore and discover ways to tighten the relationship between them, so that robots can be granted more social meaning, such as trust and reliability [3].
Changed lines 38-39 from:
The AIBO should be able to notice environmental changes while patrolling around the home. For instance, it should be able to recognize the difference between locked and unlocked doors. Following Dr. Jeffery Boyd suggestion, I will study Dr. David Lowe’s research on key point matching [2]. Then, I will apply this technique to the project, so that the AIBO can visually compare the different state of the same object and draw practical conclusions regarding the domestic environment (please note that large components of this milestone will be performed as course project for Dr. Jeffery Boyd course Image Analysis and Computer Vision CPSC 635.
to:
The AIBO should be able to notice environmental changes while patrolling around the home. For instance, it should be able to recognize the difference between locked and unlocked doors. Following Dr. Jeffery Boyd's suggestion, I will study Dr. David Lowe's research on key point matching [2]. Then, I will apply this technique to the project, so that the AIBO can visually compare the different states of the same object and draw practical conclusions regarding the domestic environment (please note that large components of this milestone will be performed as a course project for Dr. Jeffery Boyd's course, Image Analysis and Computer Vision, CPSC 635).
Changed lines 6-7 from:
The first step of the project is to build an interface that allows the user to control the AIBO remotely. In order to accomplish this, I will study the component of James Young’s AIBO Surrogate [1] code used for remotely controlling the AIBO. Then, I will make some minor adjustments to James’s code so that the user can use keyboard instead of mouse for controlling. This modification is necessary because users will use UMPCs to control the AIBO rather than desktop computers. Thus, they will have to use joy-sticks and hotkeys to control the AIBO.
to:
The first step of the project is to build an interface that allows the user to control the AIBO remotely. In order to accomplish this, I will study the component of James Young’s AIBO Surrogate [1] code used for remotely controlling the AIBO. Then, I will make some minor adjustments to James’s code so that the user can use the keyboard instead of the mouse for control. This modification is necessary because users will control the AIBO from UMPCs (hotkeys or joy-sticks) rather than desktop computers (mouse).
Added line 3:
Changed lines 83-84 from:
Deliverables:
to:
'''Deliverables''':
Changed lines 7-8 from:
*Deliverables:
to:
*'''Deliverables''':
Changed lines 19-20 from:
*Deliverables:
to:
*'''Deliverables''':
Changed lines 27-28 from:
*Deliverables:
to:
*'''Deliverables''':
Added line 30:
Added line 32:
Changed lines 35-36 from:
Milestone #4: Detecting visual changes around the domestic environment
to:
'''Milestone #4: Detecting visual changes around the domestic environment'''
Changed lines 39-40 from:
Deliverables:
to:
*'''Deliverables''':
Added line 42:
Added line 44:
Changed lines 47-48 from:
Milestone #5: Detecting the location of unusual sound
to:
'''Milestone #5: Detecting the location of unusual sound'''
Changed lines 51-52 from:
Deliverables:
to:
*'''Deliverables''':
Added line 54:
Changed lines 57-58 from:
Milestone #6: Porting the existing program to an UMPC
to:
'''Milestone #6: Porting the existing program to a UMPC'''
Changed lines 61-62 from:
Deliverables:
to:
*'''Deliverables''':
Added line 64:
Changed lines 67-68 from:
Reference
to:
'''Reference'''
Added line 10:
Added line 12:
Changed lines 19-20 from:
Deliverables:
to:
*Deliverables:
Changed lines 23-24 from:
Milestone #3: Overlaying Icons on the streaming video screen
to:
'''Milestone #3: Overlaying Icons on the streaming video screen'''
Changed lines 27-28 from:
Deliverables:
to:
*Deliverables:
Changed lines 7-8 from:
Deliverables:
to:
*Deliverables:
Deleted line 9:
Deleted line 10:
Changed lines 3-4 from:
Milestone #1: Controlling the AIBO
to:
'''Milestone #1: Controlling the AIBO'''
Added line 10:
Added line 12:
Changed lines 15-16 from:
Milestone #2: Adding steaming audio support
to:
'''Milestone #2: Adding streaming audio support'''
Changed lines 1-6 from:
!!Assignment 2 Proposal
As part of the final project deliverables, I am planning to learn how to write and run programs on the Sony AIBO. This assignment will consist of two major parts. The first part will be to learn the AIBO’s programming interface – Tekkotsu [1]. Tekkotsu is a 3rd party API developed by the Carnegie Mellon University. It allows programmers to control the Sony AIBO without worrying about the low-level coding.
The second part of this assignment will be learning and modifying James Young’s code of AIBO Aware. James wrote a program which allows the users to remotely control the Sony AIBO in the community bar. I want to use his code and modify it so that it can run separately without the community bar.
to:
!! Final Project Milestones & Deliverables
Milestone #1: Controlling the AIBO
The first step of the project is to build an interface that allows the user to control the AIBO remotely. In order to accomplish this, I will study the component of James Young’s AIBO Surrogate [1] code used for remotely controlling the AIBO. Then, I will make some minor adjustments to James’s code so that the user can use keyboard instead of mouse for controlling. This modification is necessary because users will use UMPCs to control the AIBO rather than desktop computers. Thus, they will have to use joy-sticks and hotkeys to control the AIBO.
Changed lines 9-12 from:
2) Add keyboard support for controlling the AIBO
to:
1) A remote controlling interface for walking the AIBO
2) The interface will contain a live streaming video area which displays the AIBO’s vision
3) The interface will allow the user to use the keyboard to control the AIBO (Please refer to Milestone #6; a minimal sketch of the key mapping follows below)
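To make the key mapping concrete, here is a minimal Python sketch of how hotkeys could be translated into walk commands sent to the AIBO over Wi-Fi. The host address, port, and "WALK" text command are illustrative assumptions only; the real control path goes through James Young's Surrogate code and Tekkotsu, whose protocol differs.
[@
import socket

AIBO_HOST = "192.168.0.10"  # assumed address of the AIBO on the home Wi-Fi
AIBO_PORT = 10001           # assumed control port (not the real Tekkotsu port)

# Each hotkey maps to a (forward velocity, turn velocity) pair.
KEY_BINDINGS = {
    "w": (1.0, 0.0),   # walk forward
    "s": (-1.0, 0.0),  # walk backward
    "a": (0.0, 1.0),   # turn left
    "d": (0.0, -1.0),  # turn right
    "x": (0.0, 0.0),   # stop
}

def send_walk_command(sock, key):
    """Translate a pressed hotkey into a hypothetical textual walk command."""
    if key in KEY_BINDINGS:
        forward, turn = KEY_BINDINGS[key]
        sock.sendall(f"WALK {forward} {turn}\n".encode())

if __name__ == "__main__":
    with socket.create_connection((AIBO_HOST, AIBO_PORT)) as sock:
        print("w/s/a/d to steer, x to stop, q to quit")
        key = input("> ").strip().lower()
        while key != "q":
            send_walk_command(sock, key)
            key = input("> ").strip().lower()
@]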
Milestone #2: Adding streaming audio support
The second step is to add streaming audio support to the user interface. Since the AIBO is used for home monitoring, it is important to allow the users to hear what is happening around the AIBO.
Deliverables:
1) Users will be able to hear streaming sound through the AIBO’s microphones (a receiving sketch follows below).
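As a rough illustration of this deliverable, the sketch below receives raw microphone audio over UDP and saves a short clip for playback. The port number and the audio format (16-bit mono PCM at 16 kHz) are assumptions for illustration; the actual stream format depends on how the AIBO's microphones are exposed by Tekkotsu.
[@
import socket
import wave

LISTEN_PORT = 10002   # assumed audio port
SAMPLE_RATE = 16000   # assumed sample rate (16-bit mono PCM)
SECONDS = 5

def record_clip(path):
    """Collect a few seconds of the assumed raw PCM stream and save it."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", LISTEN_PORT))
    frames = bytearray()
    # 2 bytes per sample, mono: gather SECONDS worth of audio.
    while len(frames) < SAMPLE_RATE * 2 * SECONDS:
        packet, _ = sock.recvfrom(4096)
        frames.extend(packet)
    with wave.open(path, "wb") as out:
        out.setnchannels(1)
        out.setsampwidth(2)
        out.setframerate(SAMPLE_RATE)
        out.writeframes(bytes(frames))

record_clip("aibo_mic.wav")
@]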
Milestone #3: Overlaying Icons on the streaming video screen
It is important to give users feedback about the current state of the AIBO. For example, the user should know the current walking speed of the AIBO. Also, the user should be aware of the AIBO’s battery life and Wi-Fi signal strength.
Deliverables:
1) Design graphic icons for battery life, signal strength and walking speed.
2) Overlay these icons on the streaming video screen.
3) Let the AIBO update these icons according to its current state (an overlay sketch follows below).
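A minimal sketch of the overlay idea using OpenCV drawing calls: status values are stamped onto each video frame before it is shown. The battery, Wi-Fi, and speed numbers are hard-coded placeholders here; in the real interface they would be refreshed from the AIBO's status reports.
[@
import cv2

def overlay_status(frame, battery_pct, wifi_pct, speed):
    """Draw simple battery / Wi-Fi / speed indicators onto a video frame."""
    font = cv2.FONT_HERSHEY_SIMPLEX
    # Battery bar: white outline plus a green fill proportional to charge.
    cv2.rectangle(frame, (10, 10), (110, 30), (255, 255, 255), 1)
    cv2.rectangle(frame, (10, 10), (10 + int(battery_pct), 30), (0, 255, 0), -1)
    cv2.putText(frame, f"bat {battery_pct:.0f}%", (120, 27), font, 0.5, (255, 255, 255), 1)
    cv2.putText(frame, f"wifi {wifi_pct:.0f}%", (10, 55), font, 0.5, (255, 255, 255), 1)
    cv2.putText(frame, f"speed {speed:.1f}", (10, 80), font, 0.5, (255, 255, 255), 1)
    return frame

# Example: overlay onto one frame grabbed from any OpenCV video source
# (a local webcam stands in for the AIBO video stream here).
cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    cv2.imshow("AIBO view", overlay_status(frame, 76, 88, 0.4))
    cv2.waitKey(0)
cap.release()
@]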
Milestone #4: Detecting visual changes around the domestic environment
The AIBO should be able to notice environmental changes while patrolling around the home. For instance, it should be able to recognize the difference between locked and unlocked doors. Following Dr. Jeffery Boyd suggestion, I will study Dr. David Lowe’s research on key point matching [2]. Then, I will apply this technique to the project, so that the AIBO can visually compare the different state of the same object and draw practical conclusions regarding the domestic environment (please note that large components of this milestone will be performed as course project for Dr. Jeffery Boyd course Image Analysis and Computer Vision CPSC 635.
Deliverables:
1) Allow the AIBO to patrol certain predefined locations around the home.
2) Let the AIBO take pictures of critical locations (such as doors and windows) and compare them with preloaded pictures in the database
3) Notify the user if a difference between the pictures of critical locations is found (e.g., when an unlocked door is detected, notify the user; a matching sketch follows below)
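The sketch below shows the comparison idea using the SIFT keypoints from Lowe's work as implemented in OpenCV, with his ratio test to keep only distinctive matches. The match-count threshold is an illustrative guess and would need calibration against real reference photos of each door and window.
[@
import cv2

def looks_unchanged(reference_path, current_path, min_matches=40):
    """Compare a stored reference photo with a fresh one via SIFT matching."""
    ref = cv2.imread(reference_path, cv2.IMREAD_GRAYSCALE)
    cur = cv2.imread(current_path, cv2.IMREAD_GRAYSCALE)
    sift = cv2.SIFT_create()
    _, des_ref = sift.detectAndCompute(ref, None)
    _, des_cur = sift.detectAndCompute(cur, None)
    matcher = cv2.BFMatcher()
    # Lowe's ratio test: keep a match only if it is clearly better than
    # the second-best candidate.
    good = [m for m, n in matcher.knnMatch(des_ref, des_cur, k=2)
            if m.distance < 0.75 * n.distance]
    return len(good) >= min_matches

# Hypothetical file names for a door photographed in its locked state.
if not looks_unchanged("door_locked.jpg", "door_now.jpg"):
    print("Door state may have changed -- notify the user")
@]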
Milestone #5: Detecting the location of unusual sound
The AIBO should be able to detect visual changes as well as unusual sounds. If an unusual sound is detected, the AIBO will provide the user with a rough direction of where the sound was generated. By supplying visual cues such as arrow icons on top of the streaming video, the AIBO will guide the user in walking it to that location.
Deliverables:
1) Detect the rough location of an unusual sound
2) Provide visual feedback to the user and guide her to walk the AIBO to that location (a direction-estimation sketch follows below)
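One plausible way to get a rough direction is the inter-microphone time delay, estimated by cross-correlating the two ear channels. In the sketch below, the sample rate and microphone spacing are placeholder values rather than AIBO specifications.
[@
import numpy as np

SAMPLE_RATE = 16000      # assumed sample rate
MIC_SPACING = 0.10       # assumed distance between the ear mics, metres
SPEED_OF_SOUND = 343.0   # m/s

def sound_angle(left, right):
    """Return the bearing of a sound in degrees (0 = straight ahead)."""
    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)   # inter-ear delay in samples
    delay = lag / SAMPLE_RATE
    # Clamp so measurement noise cannot push arcsin out of range.
    ratio = np.clip(delay * SPEED_OF_SOUND / MIC_SPACING, -1.0, 1.0)
    return float(np.degrees(np.arcsin(ratio)))

# Synthetic check: the same click arrives 4 samples later at the right ear,
# so the source is to the left and the bearing comes out negative
# (about -59 degrees with these assumed values).
click = np.random.randn(256)
left = np.concatenate([click, np.zeros(4)])
right = np.concatenate([np.zeros(4), click])
print(f"estimated bearing: {sound_angle(left, right):.1f} degrees")
@]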
Milestone #6: Porting the existing program to a UMPC
In order to allow the user to control the AIBO from any location, I will port the project from a desktop computer onto a UMPC (Ultra Mobile PC [3]). Modifications will be needed so that the program adapts to and works with the input devices on the UMPC.
Deliverables:
1) Porting the existing program to a UMPC
2) Modifying the existing controls so that they fit the UMPC’s input devices (an input-abstraction sketch follows below)
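The port mainly swaps input devices, so the sketch below illustrates one way to structure it: both front ends (desktop keyboard, UMPC joystick) are reduced to a common drive event that the rest of the program consumes. The bindings and axis conventions are illustrative assumptions.
[@
from dataclasses import dataclass

@dataclass
class DriveEvent:
    forward: float   # -1.0 .. 1.0
    turn: float      # -1.0 .. 1.0

def from_keyboard(key):
    """Desktop front end: WASD keys (assumed bindings)."""
    table = {"w": (1, 0), "s": (-1, 0), "a": (0, 1), "d": (0, -1)}
    f, t = table.get(key, (0, 0))
    return DriveEvent(float(f), float(t))

def from_joystick(x_axis, y_axis):
    """UMPC front end: analogue stick axes in -1..1 (assumed convention)."""
    return DriveEvent(forward=-y_axis, turn=-x_axis)

# Both front ends produce the same event type for the AIBO controller:
print(from_keyboard("w"))
print(from_joystick(x_axis=0.2, y_axis=-0.9))
@]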
Added lines 60-80:
1. AIBO Aware. http://pages.cpsc.ucalgary.ca/~jyoung/pmwiki.php?n=Projects.AIBOAware
2. Lowe, D. Keypoint Detector. http://www.cs.ubc.ca/~lowe/keypoints/
3. UMPC. http://www.microsoft.com/windowsxp/umpc/default.mspx
!!Assignment 2 Proposal
As part of the final project deliverables, I am planning to learn how to write and run programs on the Sony AIBO. This assignment will consist of two major parts. The first part will be to learn the AIBO’s programming interface – Tekkotsu [1]. Tekkotsu is a third-party API developed at Carnegie Mellon University. It allows programmers to control the Sony AIBO without worrying about low-level coding.
The second part of this assignment will be learning and modifying James Young’s AIBO Aware code. James wrote a program which allows users to remotely control the Sony AIBO from the Community Bar. I want to use his code and modify it so that it can run separately, without the Community Bar.
Deliverables:
1) Modify James’ code to create a stand-alone program for remotely controlling the Sony AIBO
2) Add keyboard support for controlling the AIBO
Reference
Changed lines 9-13 from:
1)Modify James’ code to create an stand alone program for remotely controlling the Sony AIBO
2)Add keyboard support for controlling the AIBO
to:
1) Modify James’ code to create a stand-alone program for remotely controlling the Sony AIBO
2) Add keyboard support for controlling the AIBO
Reference
Changed lines 9-12 from:
1) Modify James’ code to create an stand alone program for remotely controlling the Sony AIBO
2) Add keyboard support for controlling the AIBO
2)
to:
1)Modify James’ code to create an stand alone program for remotely controlling the Sony AIBO
2)Add keyboard support for controlling the AIBO
Added lines 1-15:
!!Assignment 2 Proposal
As part of the final project deliverables, I am planning to learn how to write and run programs on the Sony AIBO. This assignment will consist of two major parts. The first part will be to learn the AIBO’s programming interface – Tekkotsu [1]. Tekkotsu is a 3rd party API developed by the Carnegie Mellon University. It allows programmers to control the Sony AIBO without worrying about the low-level coding.
The second part of this assignment will be learning and modifying James Young’s code of AIBO Aware. James wrote a program which allows the users to remotely control the Sony AIBO in the community bar. I want to use his code and modify it so that it can run separately without the community bar.
Deliverables:
1) Modify James’ code to create a stand-alone program for remotely controlling the Sony AIBO
2) Add keyboard support for controlling the AIBO
1. Tekkotsu. http://tekkotsu.no-ip.org/
Changed lines 7-8 from:
Hopefully, by learning and understanding these relative topics, I will be able to apply my knowledge of UBICOMP to human-robot interaction, and understand the problems associated with my project thoroughly.
to:
Hopefully, by learning and understanding these related topics, I will be able to apply my knowledge of UBICOMP to human-robot interaction, and thoroughly understand the potential problems associated with my project.
Changed lines 7-8 from:
Hopefully, by learning and understanding these relative topics, I will be able to apply my knowledge of UBICOMP to human-robot interaction, and understand the problems associated with robots thoroughly.
to:
Hopefully, by learning and understanding these relative topics, I will be able to apply my knowledge of UBICOMP to human-robot interaction, and understand the problems associated with my project thoroughly.
Changed lines 7-8 from:
Hopefully, by exploring these relative topics, I will be able to apply my knowledge of UBICOMP to human-robot interaction, and understand the problems associated with robots thoroughly.
to:
Hopefully, by learning and understanding these relative topics, I will be able to apply my knowledge of UBICOMP to human-robot interaction, and understand the problems associated with robots thoroughly.
Changed lines 5-6 from:
For the final project, I plan to implement a system, which allows users to remotely control the Sony AIBO to monitor domestic environments. I will face many problems that I’ve mentioned in the previous paragraph. Thus, for the seminar, I propose to explore the topics that are related to human-robot interaction and UBICOMP, such as, understanding the influence of design on human-robot interaction, analysis the problems associated with interface design for controlling robots, and learn to apply emotional design principles to robots.
to:
For my project, I will face many of the problems that I’ve mentioned above. Thus, for the seminar, I propose to explore topics related to human-robot interaction and UBICOMP, such as understanding the influence of design on human-robot interaction, analyzing the problems associated with interface design for controlling robots, and learning to apply emotional design principles to robots.
Changed lines 5-8 from:
For the final project, I plan to implement a system, which allows users to remotely control the Sony AIBO to monitor domestic environments. I will face many problems that I’ve mentioned in the previous paragraph. Thus, for the seminar, I propose to explore the topics that are related to human-robot interaction and UBICOMP, such as, understanding the influence of design on human-robot interaction, analysis the problems associated with interface design for controlling robots, and how to apply emotional design principles to robots.
Hopefully, by exploring these relative topics, I will be able to apply my knowledge of UBICOMP to human-robot interaction.
to:
For the final project, I plan to implement a system, which allows users to remotely control the Sony AIBO to monitor domestic environments. I will face many problems that I’ve mentioned in the previous paragraph. Thus, for the seminar, I propose to explore the topics that are related to human-robot interaction and UBICOMP, such as, understanding the influence of design on human-robot interaction, analysis the problems associated with interface design for controlling robots, and learn to apply emotional design principles to robots.
Hopefully, by exploring these relative topics, I will be able to apply my knowledge of UBICOMP to human-robot interaction, and understand the problems associated with robots thoroughly.
Changed lines 3-4 from:
Over the past few years, several robots have been introduced into the consumer market. There are dedicated service robots like the Roomba and Scooba[1], or intelligent and toy-like robots such as the Sony AIBO. As more and more robots being introduced into our daily lives, interactions between humans and robots will become an unavoidable problem that designers have to face and to solve. How can we communicate with robots so that we understand each other’s needs and thoughts? How can we treat robots as social artifact and grant them with social meaning? There are many problems that robot designers have to tackle.
to:
Over the past few years, several robots have been introduced into the consumer market. There are domestic service robots like the Roomba and Scooba [1], or intelligent and toy-like robots such as the Sony AIBO. As robots enter everyday life, interaction between humans and robots becomes an unavoidable problem that designers have to face and solve. How can we communicate with robots efficiently? How can humans and robots understand each other’s needs and thoughts? How can we treat robots as social artifacts and grant them social meaning? There are many problems that robot designers have to tackle.
Added line 12:
Added lines 1-11:
!!Seminar Proposal
Over the past few years, several robots have been introduced into the consumer market. There are dedicated service robots like the Roomba and Scooba[1], or intelligent and toy-like robots such as the Sony AIBO. As more and more robots being introduced into our daily lives, interactions between humans and robots will become an unavoidable problem that designers have to face and to solve. How can we communicate with robots so that we understand each other’s needs and thoughts? How can we treat robots as social artifact and grant them with social meaning? There are many problems that robot designers have to tackle.
For the final project, I plan to implement a system, which allows users to remotely control the Sony AIBO to monitor domestic environments. I will face many problems that I’ve mentioned in the previous paragraph. Thus, for the seminar, I propose to explore the topics that are related to human-robot interaction and UBICOMP, such as, understanding the influence of design on human-robot interaction, analysis the problems associated with interface design for controlling robots, and how to apply emotional design principles to robots.
Hopefully, by exploring these relative topics, I will be able to apply my knowledge of UBICOMP to human-robot interaction.
1. Scooba. http://www.irobot.com/sp.cfm?pageid=128
Added lines 1-22:
!!Research Proposal
!!!Active Home Monitoring System
Due to the limitation of our sensory and physical presence, we can only detect and respond to the environment around us. So we are naturally limited in our ability for remote sensing and interaction. Although, we can use digital sensors to extend our sensory to remote places, we have to confine ourselves to a specific environment (such as, security monitoring room) to monitor the results. Besides of reading and interpreting these feedbacks, this remote sensory system does not allow users to physically response to remote events.
Weiser envisioned that people will be surrounded with hundreds of “invisible” computers to aid them in the ubiquitous computing era [1]. We may consider the premise that robots can also be members among this “sea of computers”. Robots, while arguably not ubiquitous, can support humans with awareness to remote environments and at the same time with the ability to move and physically interact with remote physical objects, which other computer cannot. By utilizing the power of current handheld devices, we may grant robots with ubiquity so that humans can interact with them anywhere.
For this project, I am planning to use the Sony AIBO robot dog combined with handheld devices like Ultra Mobile PCs (UMPCs [2]) to build a remote monitoring and awareness system for a home environment. AIBO is an intelligent robot which has the capabilities of moving in the physical space and responding to visual, audio and haptic inputs. A faraway user equipped with a mobile platform such as a UMPC will be allowed to monitor his home environment through the AIBO’s sensors. For instance, if there is an unusual sound detected by AIBO at home while the user is working at their office, the user will be able to look through the AIBO vision system and get a sense of what is happening at home. Therefore, people will be able to extend their visual and auditory awareness to a remote place through a robotic interface. Also, AIBO has the ability to move and dynamically explore the home environment to discover “unusual” events, such as patrolling around the doors and windows to make sure that they are always locked. So the users will get active and live feedbacks from the AIBO. Moreover, users will have the ability to communicate and control the AIBO to trigger certain events. For example, users may control the AIBO to interact with actuators and sensors, like Phidgets, to complete certain physical tasks or to sense more information from devices like thermometers and motion sensors, to enrich the AIBO’s physical, visual and auditory abilities.
In conclusion, by combining AIBO and UMPC, I plan to design and implement an active home monitoring system where users can be aware of their home conditions remotely. Also, by observing the interaction between humans and robots, I hope to explore and discover ways to tighten the relationship between them, so that robots can be granted with more social meanings such as trust and reliability [3].
Reference
1. Weiser, M. (1991) The computer for the 21st Century. Scientific American. 94-110, September.
2. Ultra-Mobile PC, http://www.microsoft.com/windowsxp/umpc/default.mspx
3. Norman, D. (2004) Chapter 5: People, Places and Things, Why we love (or hate) everyday things (pp 135-160). New York: Basic Books.