WO2005123210A2 - Figurine using wireless communication to harness external computing power - Google Patents
- Publication number
- WO2005123210A2 (PCT/US2005/019933)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- figurine
- data
- computer
- translation
- output
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H3/00—Dolls
- A63H3/003—Dolls specially adapted for a particular function not connected with dolls
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H3/00—Dolls
- A63H3/28—Arrangements of sound-producing means in dolls; Means in dolls for producing sounds
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H30/00—Remote-control arrangements specially adapted for toys, e.g. for toy vehicles
- A63H30/02—Electrical arrangements
- A63H30/04—Electrical arrangements using wireless transmission
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H2200/00—Computerized interactive toys, e.g. dolls
Definitions
- the invention relates to figurines such as stuffed animals, teddy bears, dolls, toy robots, action figures, and the like, and more particularly, to figurines that include electronics.
- the term "figurine” refers to a doll, a teddy bear, a stuffed animal, a toy robot, a toy statue, an action figure, and the like.
- Figurines are commonly used by children to pass the time and facilitate imaginative thought.
- more advanced computerized figurines have been developed. These more advanced figurines, for example, may incorporate electronics that allow the figurine to interact with the child.
- the invention is directed to a system including a figurine that utilizes wireless communication to harness computing power of an external computer.
- applications that require intensive processing power can be seemingly executed by the figurine, with the intensive processing actually being performed by the external computer.
- the figurine may capture input and wirelessly transfer the input to an external computer, which processes the input.
- the external computer returns output to the figurine, which presents the output to a child.
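The capture/transfer/process/present round trip described above can be sketched in a few lines. This is an illustrative sketch only: the "wireless link" is modeled as a plain function call, and `external_computer_process` is an assumed stand-in for the heavy processing done on the external computer, not the patent's disclosed implementation.

```python
# Sketch of the figurine/external-computer round trip. The wireless link
# is modeled as a function call; external_computer_process is a stand-in
# (assumption) for the processing performed on the external computer.

def external_computer_process(input_data: bytes) -> bytes:
    """External computer side: receives input, generates output."""
    text = input_data.decode("utf-8")
    return ("You said: " + text).encode("utf-8")

def figurine_interaction(captured: bytes, send) -> str:
    """Figurine side: forward the raw input, present whatever comes back."""
    response = send(captured)        # wireless transfer and reply
    return response.decode("utf-8")  # handed to the figurine's output device

print(figurine_interaction(b"hello", external_computer_process))
# -> You said: hello
```

The point of the split is visible even in the stub: the figurine-side function does no parsing or generation of its own.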
- Speech recognition applications, speech interpretation applications, image processing applications, voice recognition applications, and language translation applications are some examples of applications that typically require intensive processing power and large amounts of memory.
- the invention contemplates a figurine that utilizes wireless communication to harness computing power of an external computer in order to facilitate the presentation of speech recognition applications, speech interpretation applications, image processing applications, voice recognition applications, and language translation applications through the figurine.
- the internal electronics of the figurine can be greatly simplified.
- the need for intensive processing power and a large amount of memory in the figurine can be avoided.
- the need to protect powerful processors and memory from misuse by a child handling the figurine can also be avoided.
- battery life in the figurine may be extended by using the techniques described herein.
- the invention provides a system comprising a figurine that captures input from a user and wirelessly communicates the input.
- the input can be image data, for example, or audio data such as speech data.
- the system also includes a computer that receives the speech data from the figurine, generates a response to the speech data, and wirelessly communicates the response to the figurine. The figurine then outputs the response to the user.
- the invention provides a system comprising a figurine that captures speech data from a user and wirelessly communicates the speech data.
- the system also includes a computer that receives the speech data from the figurine, generates a translation of the speech data, and wirelessly communicates the translation to the figurine.
- the figurine outputs the translation to the user.
- the invention provides a system comprising a figurine that captures image data from a user and wirelessly communicates the image data, wherein the image data includes one or more words or phrases.
- the system also includes a computer that receives the image data from the figurine, generates a translation of the words or phrases, and wirelessly communicates the translation to the figurine.
- the figurine outputs the translation to the user.
- the invention provides a system comprising a figurine that captures image data from a user and wirelessly communicates the image data, wherein the image data includes one or more words or phrases.
- the system also includes a computer that receives the image data from the figurine, generates audio data corresponding to the words or phrases, and wirelessly communicates the audio data to the figurine.
- the figurine outputs the audio data to the user.
- the invention provides an interactive toy figurine comprising a data capture device and a wireless transmitter/receiver to wirelessly transfer data captured by the data capture device and receive output associated with the data captured by the data capture device.
- the data capture device may be an image capture device to capture image data, such as a camera deployed in one or both of the eyes of the toy figurine, or elsewhere.
- a method comprises capturing speech data from a user at a figurine, and wirelessly communicating the speech data to an external computer. The method also comprises receiving from the external computer a response to the speech data, and outputting the response to the user from the figurine.
- a method comprises capturing speech data from a user at a figurine, and wirelessly communicating the speech data to an external computer. The method also comprises receiving from the external computer a translation of the speech data, and outputting the translation to the user from the figurine.
- a method comprises capturing image data with a figurine, and wirelessly communicating the image data to an external computer.
- the image data includes one or more words or phrases.
- the method also comprises receiving from the external computer a translation of the words or phrases, and outputting the translation from the figurine.
- a method comprises capturing image data with a figurine, and wirelessly communicating the image data to an external computer.
- the image data includes one or more words or phrases.
- the method also comprises receiving from the external computer audio data corresponding to the words or phrases, and outputting the audio data from the figurine.
- a system comprises a figurine that captures input and wirelessly communicates the input.
- the system also includes a computer that receives the input from the figurine, generates output based on the input, and wirelessly communicates the output to the figurine.
- the figurine presents the output to a user.
- a system comprises a figurine communicatively coupled to a computer, which is in-turn communicatively coupled to a server via a network.
- the figurine provides input to the computer and receives output from the computer.
- the computer can receive software updates from the server such that functionality of the figurine can be changed or expanded via computer software upgrades.
- upgrades may also be loaded on the computer via a conventional disk or other storage medium, in which case, communication with the server would not be necessary.
- a system comprises a figurine communicatively coupled to a computer.
- the system includes one or more system compatible objects that the figurine can interact with, harnessing the power of the computer.
- the compatible objects may include indicia identifiable by the figurine, which can ensure that the software on the computer can provide useful interaction between the figurine and the object.
- a system comprises a figurine, a computer, and a parent unit.
- the parent unit may comprise a software module on the computer, or a separate hardware device.
- the parent unit allows parents to exert parental control over the functionality of the figurine by interacting with software modules on the computer that control operation and interactive features of the figurine.
- the parent unit may also function as a baby monitor, e.g., a smart baby monitor that can generate an alarm if a baby in proximity to the figurine ceases to breathe, or has other detectable problems.
- FIG. 1 is a conceptual diagram illustrating a figurine wirelessly communicating with a computer.
- FIGS. 2 and 3 are block diagrams of a figurine wirelessly communicating with a computer.
- FIGS. 4-6 are flow diagrams according to embodiments of the invention, illustrating application of the invention to translation of spoken or written messages.
- FIG. 7 is a conceptual diagram illustrating a figurine wirelessly communicating with a computer via a wireless hub.
- FIG. 8 is a conceptual diagram illustrating a figurine wirelessly communicating with a computer through a network.
- FIG. 9 is a conceptual diagram illustrating a figurine wirelessly communicating with a computer and a compatible object.
- FIG. 10 is a conceptual diagram illustrating a figurine wirelessly communicating with a computer, with a parents' unit.
- FIG. 11 is a conceptual diagram illustrating a system in which a server communicates with clients that wirelessly communicate with figurines.
- the invention is directed to a system including a figurine that utilizes wireless communication to harness computing power of an external computer.
- a figurine that utilizes wireless communication to harness computing power of an external computer.
- certain applications that require intensive processing power can be seemingly executed by the figurine, with the intensive processing actually being performed external to the figurine in another computer.
- the figurine may capture audio data, video data or both, and wirelessly transfer the captured data to the external computer, either directly or via a network.
- Audio data includes, but is not limited to, speech data, voice data and music data.
- the external computer receives the data as input, processes the data, generates output based on the input data, and transfers the output to the figurine. The output can then be presented to a child as though the figurine processed and generated the output directly.
- FIG. 1 is a diagram illustrating a system 10 according to an embodiment of the invention.
- System 10 generally includes a figurine 12 such as a doll, teddy bear, stuffed animal, toy robot, toy statue, action figure or the like.
- System 10 also includes an external computer 14 such as a personal computer (PC), Macintosh, workstation, laptop, notebook, palm computer, or any other computer external to figurine 12.
- Figurine 12 and external computer 14 communicate either directly or indirectly via one or more wireless communication links 16. In some cases, external computer 14 may be networked to one or more wireless hubs or other devices that facilitate wireless communication.
- Figurine 12 harnesses the computing power of external computer 14 in order to facilitate execution of processor-intensive and/or memory-intensive applications. A child can interact with figurine 12. Accordingly, figurine 12 can facilitate learning and provide instruction and guidance to the child. As figurine 12 harnesses the computing power of external computer 14 to execute these applications, the computing power and memory needed in figurine 12 can be significantly reduced. Accordingly, the need to protect processors and/or memory from misuse by a child handling figurine 12 can also be reduced. Also, the power used by figurine 12 can be reduced, prolonging battery life within figurine 12.
- figurine 12 may present a speech recognition application to the child, e.g., a program that teaches the child the meanings of one or more words or phrases. In that case, the child may speak to figurine 12, which captures the speech and wirelessly communicates the captured speech to external computer 14. External computer 14 parses the speech and generates one or more meanings, which are communicated back to figurine 12. Figurine 12 can then output the meanings to the child in any number of ways.
- a child can utter the word "travel" to figurine 12, which captures the utterance and wirelessly communicates the captured speech to external computer 14.
- External computer 14 parses the captured speech and generates one or more definitions, which are communicated back to figurine 12.
- Figurine 12 can output the definition by, for example, responding "the word 'travel' means to go on a trip.”
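Once speech recognition has produced a word, the "travel" exchange above amounts to a dictionary lookup on the external computer. A minimal sketch, with a hypothetical two-entry table standing in for the computer's stored lexicon:

```python
# Toy definition lookup performed on the external computer once speech
# recognition has identified the spoken word. DEFINITIONS is an
# illustrative stand-in for a full dictionary.

DEFINITIONS = {
    "travel": "the word 'travel' means to go on a trip",
    "triangle": "a triangle is a shape that has three sides",
}

def define(word: str) -> str:
    """Return the meaning that is sent back to the figurine for output."""
    return DEFINITIONS.get(word.lower(), "I don't know that word yet")

print(define("travel"))
# -> the word 'travel' means to go on a trip
```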
- figurine 12 may be capable of holding intelligent conversation with the child by harnessing the computing power of external computer 14. In that case, the child may speak to figurine 12, which captures the speech and wirelessly communicates the captured speech to external computer 14.
- External computer 14 parses the speech and generates one or more responses, which are communicated back to figurine 12.
- Figurine 12 can then output the responses to the child.
- figurine 12 may respond with an intelligent answer.
- Software executing on external computer 14 may adapt over time to the questions posed by the child, and may also be upgradeable. Upgrades to software on external computer 14, for example, may cause figurine 12 to appear to grow intellectually with the child.
- a child can utter the words "I love you” to figurine 12, which captures the utterance and wirelessly communicates the captured speech to external computer 14.
- External computer 14 parses the child's message and generates one or more responses, which are communicated back to figurine 12.
- Figurine 12 can output the response by, for example, responding "I love you, too.”
- the child can utter the question "What is a triangle?" to figurine 12, which can respond "A triangle is a shape that has three sides.”
- figurine 12 may help the child with reading.
- one or more image capture devices such as digital cameras, may be located in figurine 12, such as in one or more eyes 18 of figurine 12.
- the child may present a page of a book to figurine 12 by directing the eyes 18 of figurine 12 toward the page and pressing a button (not shown).
- a captured image of the page can be wirelessly communicated to external computer 14.
- External computer 14 parses the page and identifies the words printed on the page.
- External computer 14 then communicates back to figurine 12, so that figurine 12 can output the words written on the page.
- figurine 12 may appear to be reading to the child.
- a display (not shown) may also be incorporated into figurine 12 to present the child with the captured words being read by figurine 12. Therefore, figurine 12 may aid in the learning process of the child by helping to teach the child to read.
- figurine 12 may facilitate translation of words spoken by the child.
- the child may speak to figurine 12, which captures the speech and wirelessly communicates the captured speech to external computer 14.
- External computer 14 parses the speech and identifies a translation of the words or phrases spoken by the child.
- External computer 14 then communicates back to figurine 12, so that figurine 12 can output the translations to the child.
- figurine 12 serves as an interpreter.
- a child can utter the words "Thank you” to figurine 12, which captures the utterance and wirelessly communicates the captured speech to external computer 14.
- External computer 14 parses the utterance and identifies a translation of the phrase.
- External computer 14 then communicates the translation back to figurine 12, and figurine 12 can output the translations by, for example, responding with "'Gracias' means 'thank you' in Spanish.”
- figurine 12 may facilitate translation of written words.
- one or more image capture devices such as digital cameras, may be located in the eyes 18 of figurine 12.
- the child may present words or phrases to figurine 12 by directing the eyes 18 of figurine 12 toward the words or phrases.
- the child may press a button (not shown) on figurine 12 to capture the words or phrases being "viewed" by the figurine.
- a captured image of the words or phrases can be wirelessly communicated to external computer 14.
- External computer 14 parses the words or phrases, and identifies a translation of the words or phrases.
- External computer 14 then communicates back to figurine 12, so that figurine 12 can output the translation.
- a display (not shown) may also be incorporated into figurine 12 to present the child with the captured words being translated.
- the display, for example, may be located anywhere on figurine 12, but is preferably located on the back of figurine 12 so that the words can be viewed by the child as eyes 18 of figurine 12 are directed away from the child towards a page to be read.
- figurine 12 harnesses the computing power of external computer 14 to perform image processing unrelated to words and phrases.
- One or more image capture devices such as digital cameras, may be located in the eyes 18 of figurine 12, and can capture image data to be processed by external computer 14.
- Image processing can include recognition of faces, objects, colors, numbers, places, activities, and the like.
- figurine 12 can seem to recognize the person or persons interacting with figurine 12.
- Figurine 12 can use the recognition in its interaction by, for instance, calling a child by name.
- figurine 12 can seem to recognize objects and attributes of objects such as shape, type or quantity.
- figurine 12 can teach a child to recognize shapes, count objects, become familiar with colors, and the like.
- figurine 12 harnesses the computing power of external computer 14 to perform voice recognition in order to identify the speaker.
- Figurine 12 can use the voice recognition in its interaction by, for instance, calling a child by name.
- figurine 12 can seem to recognize the speaker.
- voice recognition applications would be used along with speech recognition applications.
- Voice recognition applications refer to applications that identify who is talking and may allow for programmed figurine interaction only with those persons associated with a recognized voice.
- Speech recognition applications refer to applications that recognize what is being said and may be generally used with any voice.
- the invention may utilize both speech and voice applications together in order to determine what is being said and who is speaking. This can improve interaction with figurine 12 such that figurine 12 may only respond to the child for which it is programmed to respond.
- With voice recognition, a child could say "sing me a song” or "tell me a story” and the figurine may select a song or story from a library and respond to the recognized voice, as directed. Responses from the figurine to others, however, may be limited or prohibited if the requesting voice is not recognized.
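The gating just described can be sketched minimally, assuming the voice-recognition stage yields a speaker identifier and the speech-recognition stage yields the request text. The speaker IDs and the song/story library here are illustrative assumptions, both stages being stubbed:

```python
# Combining voice recognition (who is speaking) with speech recognition
# (what was said): the figurine responds only to programmed voices.
# KNOWN_SPEAKERS and LIBRARY are illustrative stand-ins.

from typing import Optional

KNOWN_SPEAKERS = {"child_1"}
LIBRARY = {
    "sing me a song": "Twinkle, twinkle, little star...",
    "tell me a story": "Once upon a time...",
}

def respond(speaker_id: str, utterance: str) -> Optional[str]:
    if speaker_id not in KNOWN_SPEAKERS:
        return None  # responses to unrecognized voices are prohibited
    return LIBRARY.get(utterance.lower())

print(respond("child_1", "Sing me a song"))   # -> Twinkle, twinkle, little star...
print(respond("stranger", "sing me a song"))  # -> None
```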
- Interaction between a user and figurine 12 can be proactive as well as reactive.
- external computer 14 can cause figurine 12 to take action that is not in response to action by a user.
- figurine 12 may serve as an alarm clock, telling a child that it is time to get out of bed.
- Figurine 12 may also proactively remind a user of the day's appointments, birthdays of friends or relatives, and the like.
- first output may be provided, which is responsive to input to the figurine, and computer 14 can be programmed to proactively cause figurine 12 to output second output to a user, e.g., an alarm or reminder.
- FIG. 2 is a functional block diagram of system 10 including a figurine 12 and an external computer 14. Again, figurine 12 uses wireless communication in order to harness the processing power of external computer 14. In this manner, complex applications can be performed by computer 14, yet presented by figurine 12 to a user.
- Figurine 12 includes one or more input devices 22 to capture input from a user, e.g., a child.
- Figurine 12 also includes one or more output devices 23 to present output to the user.
- Input device 22 may comprise, for example, a sound-detecting transducer such as a microphone, or an image capture device, such as a digital camera.
- a button or other actuator may be disposed on figurine 12 to turn on the microphone or to cause the digital camera to take a picture.
- Output device 23 may comprise a sound-generating transducer such as a speaker, or possibly a display screen.
- Sounds or images detected by input device 22 may be processed locally by local central processing unit (CPU) 24 in order to facilitate communication of the data to external computer 14.
- local CPU 24 may package the captured input for transmission to external computer 14.
- Local CPU 24 may also control transmitter/receiver 26 to cause transmission of data indicative of the sounds or images detected by input device 22.
- Local CPU 24, for example, may comprise a relatively simple controller implemented as an application-specific integrated circuit (ASIC). If images are captured by figurine 12, local CPU 24 may compress the image file to simplify wireless transfer of the image file. In any case, transmitter/receiver 26 transfers data collected by figurine 12 so that the data can be processed external to figurine 12.
- the wireless communication between transmitter/receiver 26 of figurine 12 and transmitter/receiver 27 of external computer 14 may conform to any of a wide variety of wireless communication protocols. Examples include, but are not limited to a wireless networking standard such as one of the IEEE 802.11 standards, a standard according to the Bluetooth Special Interest Group, or the like.
- the IEEE 802.11 standards include, for example, the original 802.11 standard having data transfer rates of 1-2 Megabits per second (Mbps) in a 2.4-2.483 Gigahertz (GHz) frequency band; the IEEE 802.11b standard (sometimes referred to as 802.11 wireless fidelity or 802.11 Wi-Fi) that utilizes binary phase shift keying (BPSK) for 1.0 Mbps transmission and quadrature phase shift keying (QPSK) for 2.0, 5.5 and 11.0 Mbps transmission; the IEEE 802.11g standard that utilizes orthogonal frequency division multiplexing (OFDM) in the 2.4 GHz frequency band to provide data transmission at rates up to 54 Mbps; and the IEEE 802.11a standard that utilizes OFDM in a 5 GHz frequency band to provide data transmission at rates up to 54 Mbps.
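A back-of-the-envelope calculation shows why the local CPU might compress captured images before sending them at these data rates. The image size and 20:1 compression ratio below are illustrative assumptions, not figures from the patent:

```python
# Transfer time for a captured image over the wireless link at a given
# data rate. The VGA image size and 20:1 compression ratio are assumptions.

def transfer_seconds(size_bytes: int, rate_mbps: float) -> float:
    return size_bytes * 8 / (rate_mbps * 1_000_000)

raw = 640 * 480 * 3      # ~0.9 MB uncompressed 24-bit VGA image
compressed = raw // 20   # assumed 20:1 compression

print(round(transfer_seconds(raw, 1.0), 2))          # at 1 Mbps (original 802.11)
print(round(transfer_seconds(compressed, 11.0), 3))  # at 11 Mbps (802.11b)
```

At the original 802.11 rate the raw image takes several seconds to send, while the compressed image at 802.11b rates moves in a small fraction of a second, which is why compression in local CPU 24 matters for responsiveness.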
- Transmitter/receiver 27 of external computer 14 receives data sent by transmitter/receiver 26 of figurine 12.
- Remote CPU 28 performs extensive processing on the data to generate output.
- remote CPU 28 may comprise a general purpose microprocessor that executes software to generate the output. The output is then transmitted back to figurine 12. Output device 23 of figurine 12 can then present the output to the user.
- the local electronics of figurine 12 can be greatly simplified.
- the need for intensive processing power and a large amount of memory in figurine 12 can be avoided.
- Battery power in figurine 12 can also be extended by performing processing tasks externally in computer 14.
- software upgrades may be easily implemented for execution by remote CPU 28 without requiring upgrade of the components of figurine 12.
- figurine 12 may present a speech recognition application to the child, e.g., a program that teaches the child the meanings of one or more words or phrases. In that case, the child may speak to figurine 12, and input device 22 can capture the speech.
- Local CPU 24 packages the speech and causes transmitter/receiver 26 to wirelessly communicate the captured speech to external computer 14.
- Remote CPU 28 parses the speech and generates one or more meanings, which are communicated back to figurine 12 by transmitter/receiver 27.
- Output device 23 of figurine 12 can then output the meanings to the child.
- figurine 12 may be capable of holding intelligent conversation with the child by harnessing the remote CPU 28 of external computer 14.
- the child may speak to figurine 12, and input device 22 can capture the speech.
- Local CPU 24 packages the speech and causes transmitter/receiver 26 to wirelessly communicate the captured speech to external computer 14.
- Remote CPU 28 parses the speech and generates one or more responses, which are communicated back to figurine 12 by transmitter/receiver 27.
- Output device 23 of figurine 12 can then output the responses to the child.
- figurine 12 may respond with an intelligent answer.
- figurine 12 may help the child with reading.
- input device 22 in the form of an image capture device may capture images of a page.
- Local CPU 24 packages the image and causes transmitter/receiver 26 to wirelessly communicate the captured image to external computer 14.
- Local CPU 24 may also compress the image prior to transmission.
- remote CPU 28 parses the image and generates one or more meanings, which are communicated back to figurine 12 by transmitter/receiver 27.
- remote CPU 28 may perform character recognition on the image in order to identify characters, and may then decipher the meaning of the identified characters using one or more dictionaries stored in memory and accessible by remote CPU 28.
- Output device 23 of figurine 12 can then output the meanings to the child.
- figurine 12 may appear to be reading to the child.
- Output device 23 may include speakers for verbal output and possibly a display to present the child with the captured words being read by figurine 12.
- figurine 12 may facilitate translation of words spoken by the child. In that case, the child may speak to figurine 12, and input device 22 can capture the speech.
- Local CPU 24 packages the speech and causes transmitter/receiver 26 to wirelessly communicate the captured speech to external computer 14.
- Remote CPU 28 parses the speech and identifies a translation of the words or phrases spoken by the child, which are communicated back to figurine 12 by transmitter/receiver 27.
- Output device 23 of figurine 12 can then output the translation to the child.
- figurine 12 serves as an interpreter.
- figurine 12 harnesses the computing power of external computer 14 to perform other types of image processing.
- One or more image capture devices can capture image data to be processed by external computer 14.
- Image processing can include recognition of faces, objects, colors, numbers, places, activities, and the like.
- figurine 12 can seem to recognize the person or persons interacting with figurine 12.
- Figurine 12 can use the recognition in its interaction by, for instance, calling a child by name.
- figurine 12 can seem to recognize objects and attributes of objects such as shape, type or quantity.
- figurine 12 can teach a child to recognize shapes, count objects, become familiar with colors, and the like.
- Interaction between a user and figurine 12 can be proactive as well as reactive.
- external computer 14 can cause figurine 12 to take action that is not in response to action by a user.
- figurine 12 may serve as an alarm clock, telling a child that it is time to get out of bed.
- Figurine 12 may also proactively remind a user of the day's appointments, birthdays of friends or relatives, and the like.
- figurine 12 may remind a user to take medication. Any such alarms or reminders may be standard audio tones, music, or possibly programmed or recorded audio of a familiar voice, making figurine 12 speak with a pleasant tone to the user when providing reminders.
- a parent's voice may be recorded such that figurine 12 speaks with such recordings.
- Voice emulation software may also be used by computer 14 so that figurine 12 speaks new words or phrases in a voice that emulates that of the parents.
- FIG. 3 is a more detailed block diagram of system 30 illustrating application of the invention to one of the example applications described above, in particular, translation of written words.
- System 30 may correspond to system 10 (FIGS. 1 and 2).
- image capture device 33 may capture images of a page.
- image capture device 33 may comprise a digital camera located in the eyes of figurine 32 so that when a child directs the eyes of figurine 32 toward a page and presses an actuator, the image of the page is captured.
- the actuator for example, may be disposed on the back of figurine 32 so that when the eyes of figurine 32 are directed toward a page, the actuator is easily accessible.
- the image capture device and actuator may be deployed in other locations on figurine 32.
- Local CPU 34 packages the image and causes transmitter/receiver 36 to wirelessly communicate the captured image to external computer 31.
- Remote CPU 38 parses the image and generates a translation, which is communicated back to figurine 32 by transmitter/receiver 37.
- Remote CPU 38 may invoke software modules 39, 40 to specifically perform optical character recognition 39 and translation 40. Once the image has been translated and the translation has been communicated back to figurine 32, output device 35 of figurine 32 can then output the translation.
- Optical character recognition module 39 may recognize English, and translator module 40 may translate from English to Spanish. Any other languages, however, could also be supported.
- optical character recognition may be performed locally at figurine 32, with the more processor-intensive translation being performed by external computer 31.
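The two-stage pipeline of FIG. 3 — optical character recognition followed by translation — can be sketched with both stages stubbed. The trivial "OCR" and the phrase table are assumptions for illustration; a real system would use an OCR engine and a translation engine:

```python
# FIG. 3 pipeline sketch: recognize characters in a captured image, then
# translate the recognized text. Both stages are illustrative stand-ins.

def recognize_characters(image: bytes) -> str:
    """Stub OCR: pretend the image decodes directly to its embedded text."""
    return image.decode("utf-8")

ENGLISH_TO_SPANISH = {"thank you": "gracias", "hello": "hola"}

def translate(text: str) -> str:
    """Stub translator: phrase-table lookup, passing unknown text through."""
    return ENGLISH_TO_SPANISH.get(text.lower(), text)

page_image = b"Thank you"
print(translate(recognize_characters(page_image)))  # -> gracias
```

The clean seam between the two functions is what makes the split in the text possible: OCR could run on the figurine's local CPU while translation stays on the external computer.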
- Other exemplary applications can be supported in a manner similar to that depicted in FIG. 3.
- image capture device 33 may capture one or more images of a face.
- Local CPU 34 packages the image and causes transmitter/receiver 36 to wirelessly communicate the captured image to external computer 31.
- Remote CPU 38 executes face recognition software modules to identify the face in the image. Once the face has been identified, remote CPU 38 may then incorporate that identity into the output of figurine 32 by, for example, referring to the user by name. Voice recognition could also be used to cause figurine 32 to refer to the user by name.
- remote CPU 38 can execute shape recognition software modules, color recognition software modules, object recognition software modules, and quantification software modules. Remote CPU 38 can use such software modules to help a user recognize shapes, count objects, become familiar with colors, and the like. Although a user perceives all action occurring through figurine 32, processor-intensive image processing is actually being performed remotely by external computer 31.
- An object recognition module for example, may be designed to recognize currency (such as coins) and allow the figurine to teach a child how to accurately count change.
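The coin-counting interaction might look like the following, assuming the object-recognition module reports a list of coin labels (the names and values here are for U.S. coins, as an illustration):

```python
# Totaling recognized coins so the figurine can help a child count change.
# recognized_coins is assumed to come from the object-recognition module
# running on the external computer.

COIN_VALUES = {"penny": 1, "nickel": 5, "dime": 10, "quarter": 25}

def count_change(recognized_coins: list) -> int:
    """Total value, in cents, of the coins the vision module reported."""
    return sum(COIN_VALUES[coin] for coin in recognized_coins)

print(count_change(["quarter", "dime", "penny"]))  # -> 36
```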
- the various modules and components described herein may be implemented in hardware, software, firmware, or any combination thereof. The invention is not limited to any particular hardware or software implementation. If implemented in software, the modules may be stored on a computer-readable medium such as memory or a non-volatile storage medium.
- FIGS. 4-6 are flow diagrams according to some embodiments of the invention, illustrating application of the invention to one of the example applications described above, in particular, translation of spoken or written messages. In the technique shown in FIG. 4,
- figurine 12 performs speech capture (41), and then transmits speech data to external computer 14 (42).
- External computer 14 receives the speech data (43) and may perform speech recognition (44) to identify the spoken words or phrases.
- External computer 14 performs translation with respect to the identified spoken words or phrases (45) and transmits the translation back to figurine 12 (46).
- Figurine 12 receives the translation (47) and outputs the translation to the user.
- Figurine 12 may drive an output device such as a display screen, thereby providing a written output.
- a user may find it more desirable, however, to have figurine 12 drive a speaker in figurine 12, thereby providing an audible output, such as a synthesized speech recitation of the translation.
- figurine 12 acts as a translator of spoken words or phrases, invoking external computer 14 to reduce the local processing at figurine 12.
- computer 14 may also perform voice recognition so that figurine 12 only responds to recognized voices, or responds differently, e.g., by identifying different persons, in response to computer 14 recognizing different voices.
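The offloading pattern of FIG. 4 can be sketched in code. This is purely a minimal illustration of the described division of labor, not an implementation from the patent: the class names, the byte-encoded "speech data," and the tiny phrase table all stand in for real speech recognition and translation software running on external computer 14.

```python
# Hypothetical sketch of the FIG. 4 flow: the figurine captures speech,
# offloads recognition and translation to the external computer over a
# wireless link (abstracted away here), and speaks the returned result.
PHRASE_TABLE = {"hello": "hola", "good night": "buenas noches"}  # illustrative stand-in

class ExternalComputer:
    def handle_speech(self, speech_data: bytes) -> str:
        words = self.recognize(speech_data)   # step (44): speech recognition
        return self.translate(words)          # step (45): translation

    def recognize(self, speech_data: bytes) -> str:
        # Stand-in for a real recognizer: the "audio" here is just encoded text.
        return speech_data.decode("utf-8")

    def translate(self, words: str) -> str:
        return PHRASE_TABLE.get(words.lower(), words)

class Figurine:
    def __init__(self, computer: ExternalComputer):
        self.computer = computer              # wireless link abstracted as a reference

    def on_speech_captured(self, speech_data: bytes) -> str:
        translation = self.computer.handle_speech(speech_data)  # steps (42)-(46)
        return self.speak(translation)        # audible output to the user

    def speak(self, text: str) -> str:
        return text                           # stand-in for speech synthesis

figurine = Figurine(ExternalComputer())
print(figurine.on_speech_captured(b"hello"))  # -> hola
```

The point of the sketch is that the `Figurine` class contains no recognition or translation logic at all; everything processor-intensive lives in `ExternalComputer`.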
- figurine 12 captures an image (51), and then transmits the image to external computer 14 (52).
- External computer 14 receives the image (53) and decodes the image (54), e.g., by performing optical character recognition.
- External computer 14 then translates the characters to generate a translation (55).
- External computer 14 transmits the translation back to figurine 12 (56).
- Figurine 12 receives the translation (57) and outputs the translation to the user (58).
- figurine 12 acts as a translator of written words or phrases, invoking external computer 14 to reduce the local processing at figurine 12.
- the translation may be output in audio, video, or both.
- figurine 12 captures an image that includes written words or phrases (61), and then transmits the images to external computer 14 (62).
- External computer 14 receives the image (63) and decodes the image (64), e.g., by performing optical character recognition to identify written words or phrases.
- External computer 14 then generates an audio signal (65) as a function of the identified words or phrases.
- External computer 14 transmits the audio signal back to figurine 12 (66).
- Figurine 12 receives the audio signal (67) and outputs the audio to the user. In this manner, figurine 12 appears to be reading the written words or phrases, invoking external computer 14 to reduce the local processing at figurine 12.
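The FIG. 6 read-aloud flow can be sketched the same way. This is an illustrative stub only: the `ocr` and `synthesize_audio` functions are hypothetical placeholders for the optical character recognition and audio-generation software the patent attributes to external computer 14.

```python
# Hypothetical sketch of the FIG. 6 flow: the figurine photographs a page,
# the external computer recognizes the characters and synthesizes an audio
# signal, and the figurine plays the audio back.
def ocr(image: dict) -> str:
    # Stand-in for optical character recognition (step 64); the "image"
    # carries its own text so the round trip can be demonstrated.
    return image["text"]

def synthesize_audio(words: str) -> list:
    # Stand-in for text-to-speech (step 65): one "sample" token per word.
    return [f"<audio:{w}>" for w in words.split()]

def external_computer_handle_image(image: dict) -> list:
    return synthesize_audio(ocr(image))       # steps (63)-(65)

def figurine_read_aloud(image: dict) -> list:
    audio = external_computer_handle_image(image)  # steps (62) and (66)
    return audio                              # step (67): play through the speaker

page = {"text": "once upon a time"}
print(figurine_read_aloud(page))
```

As in the speech case, the figurine only captures input and plays output; every decoding step happens remotely.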
- FIG. 7 is a block diagram of a system 70, similar to system 10.
- figurine 72 wirelessly communicates with external computer 74 via a wireless hub 75.
- wireless hub 75 communicates wirelessly with figurine 72, and is coupled to external computer 74.
- FIG. 8 illustrates another system 80, similar to system 10, in which figurine 82 wirelessly communicates to take advantage of computing power of external computer 84.
- figurine 82 wirelessly communicates with external computer 84 via a wireless hub 85 that couples to external computer 84 via network 86.
- wireless hub 85 communicates wirelessly with figurine 82, and is coupled to external computer 84 via network 86.
- Network 86 may comprise a small local area network (LAN), a wide area network, or even a global network such as the Internet.
- Communication between hub 85 and external computer 84 may be, but need not be, wireless.
- the wireless capabilities of figurine 82 allow for communication with external computer 84, thereby allowing figurine 82 to make use of the processing capabilities of external computer 84.
- When a figurine is configured to communicate with a global network such as the Internet, as depicted in FIGS. 8 and 11, the figurine can serve as an input-output device for interaction with the network and other stations or servers coupled to the network.
- figurine 82 reports information obtained from one or more network servers (not shown). For example, a child could ask figurine 82, "What is the weather forecast for today?" The request is relayed to external computer 84, which accesses a server via network 86 that can provide the local forecast. Upon retrieving the local forecast, external computer 84 supplies that information to figurine 82, which answers the child's question.
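The weather-forecast exchange above can be sketched as a simple relay. Everything here is a hypothetical stub: the `weather_server` function stands in for a server reached via network 86, and a real implementation would issue an actual network request from external computer 84.

```python
# Hypothetical sketch of the networked question-answer path: the figurine
# relays a question to the external computer, which queries a network
# server (stubbed locally here) and returns an answer for the figurine to speak.
def weather_server(location: str) -> str:
    forecasts = {"default": "sunny, high of 75"}   # stub server data
    return forecasts.get(location, forecasts["default"])

def external_computer_answer(question: str) -> str:
    # Crude keyword routing; a real system would use speech/intent recognition.
    if "weather" in question.lower():
        return f"Today's forecast is {weather_server('default')}."
    return "I don't know yet."

def figurine_ask(question: str) -> str:
    return external_computer_answer(question)      # relayed over the wireless link

print(figurine_ask("What is the weather forecast for today?"))
```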
- FIG. 9 is a diagram illustrating a system 90 according to an additional embodiment of the invention.
- System 90 includes a figurine 92 and an external computer 94, which communicate either directly or indirectly via one or more wireless communication links.
- system 90 includes a compatible object 95, embodied in FIG. 9 as a book.
- Compatible object 95 can be an object of any type, but in a typical implementation, compatible object 95 is an accessory for figurine 92.
- Compatible object 95 includes a wireless identifier, by which a detector in figurine 92 can detect the presence of compatible object 95.
- a wireless identifier is a radio frequency identification (RFID) tag 96.
- RFID tag 96 may be hidden in compatible object 95 and not readily observable to a user.
- An RFID tag reader 98 in figurine 92 detects and reads RFID tag 96. Bar codes or other indicia might also be used, in which case reader 98 would facilitate the reading of such indicia.
- RFID tag 96 is a wireless electronic device that communicates with RFID tag reader 98.
- RFID tag 96 may include an integrated circuit (not shown) and a coil (not shown).
- the coil may act as a source of power, as a receiving antenna, and as a transmitting antenna.
- the coil may be coupled to a capacitor that stores power when the tag is interrogated, in order to drive the integrated circuit.
- the integrated circuit may include wireless communications components and memory.
- RFID tag reader 98 may include an antenna and a transceiver.
- RFID tag reader 98 may "interrogate" RFID tag 96 by directing an electromagnetic (i.e., radio) signal to RFID tag 96.
- RFID tag 96 may include, but need not include, an independent power source.
- RFID tag 96 receives power from the interrogating signal from RFID tag reader 98.
- RFID tag 96 may perform certain operations, which may include transmitting data stored in the memory of the RFID tag 96 to RFID tag reader 98.
- the transmitted data may include an identification of compatible object 95.
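The interrogation sequence just described can be sketched as follows. The classes are illustrative models only, not a real RFID stack; actual tags and readers follow air-interface standards (e.g., ISO/IEC 14443 for proximity tags), and the "power from the interrogating signal" is modeled here as a simple flag.

```python
# Hypothetical sketch of the RFID exchange: reader 98 directs a signal at
# tag 96, the tag powers up from that signal, and replies with the stored
# identification of compatible object 95.
class RfidTag:
    def __init__(self, object_id: str):
        self.memory = {"object_id": object_id}  # identification of the compatible object
        self.powered = False

    def interrogate(self) -> dict:
        self.powered = True                     # power drawn from the reader's signal
        return dict(self.memory)                # transmit stored data back to the reader

class RfidReader:
    def scan(self, tag: RfidTag) -> str:
        response = tag.interrogate()            # the directed electromagnetic signal
        return response["object_id"]

book_tag = RfidTag("storybook-vol-1")
reader = RfidReader()
print(reader.scan(book_tag))                    # -> storybook-vol-1
```

The returned identifier is what the figurine forwards to the external computer so that output can be tailored to the detected object.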
- figurine 92 may exert better control over those objects that will be used to interact with figurine 92, and can help ensure that a child will not become frustrated, e.g., if figurine 92 were used with an incompatible book or object.
- When RFID tag reader 98 identifies RFID tag 96, external computer 94 becomes aware of compatible object 95 proximate to figurine 92.
- External computer 94 can use the identity of compatible object 95 to communicate with a user more effectively.
- The exemplary compatible object 95 in FIG. 9 is a book. When external computer 94 learns the identity of the book, external computer 94 can generate output appropriate for that book.
- Figurine 92 may, for example, direct the attention of a child to illustrations shown in the book, and explain how the illustrations pertain to the story.
- FIG. 10 is a diagram illustrating a system 100 according to another embodiment of the invention.
- System 100 includes a figurine 102 and an external computer 104, which communicate either directly or indirectly via one or more wireless communication links.
- system 100 includes a parents' unit 106.
- Parents' unit 106 may comprise any device, including, but not limited to, a television, a computer, a telephone, a speaker, a video monitor and the like.
- Parents' unit 106 may communicate with external computer 104 in any fashion, such as by an electrical connection, by an optical link, or by radio frequency.
- System 100 is configured to serve as a child monitoring system.
- a parent can deploy figurine 102 proximate to a child so that figurine 102 can capture video information or audio information or both about that child.
- Figurine 102 transmits the captured information to external computer 104.
- External computer 104 in turn sends information to parents' unit 106.
- the parent may also communicate in real time to the child through figurine 102, e.g., by speaking into a microphone of parents' unit 106.
- Parents' unit 106 may be a separate unit, or may be implemented as a software module that executes directly on external computer 104.
- external computer 104 simply relays captured information to parents' unit 106.
- captured audio and video data showing the child's location, condition and activity may be relayed to parents' unit 106.
- external computer 104 can also process the captured audio and video data and provide useful information to parents' unit 106.
- external computer 104 can process audio data captured via figurine 102 and determine whether the child is crying, sleeping, breathing abnormally, and the like.
- External computer 104 can also process video data captured via figurine 102 and determine whether the child is awake or has gotten out of bed or the like.
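The kind of analysis described above might be sketched as follows. The energy-threshold heuristic is purely illustrative, chosen only to show where processing happens; real infant-monitoring software on external computer 104 would use far more robust signal processing than this.

```python
# Hypothetical sketch of the monitoring analysis: the external computer
# inspects audio samples captured by the figurine and reports a coarse
# status to the parents' unit. The threshold heuristic is illustrative only.
def classify_audio(samples: list, crying_threshold: float = 0.5) -> str:
    if not samples:
        return "quiet"
    energy = sum(s * s for s in samples) / len(samples)  # mean signal energy
    return "crying" if energy > crying_threshold else "sleeping"

def relay_to_parents_unit(samples: list) -> str:
    # External computer 104 sends the derived status, not raw audio,
    # to parents' unit 106.
    return f"child status: {classify_audio(samples)}"

print(relay_to_parents_unit([0.9, -0.8, 1.0]))     # loud samples -> crying
print(relay_to_parents_unit([0.05, -0.02, 0.01]))  # quiet samples -> sleeping
```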
- FIG. 11 is a diagram illustrating a system 110 according to another embodiment of the invention.
- System 110 is a server-client system in which a server 112 supplies one or more functionalities to one or more client figurine-computer systems 114, 116.
- Server 112 manages a database 113 that stores software that can provide figurines with one or more functionalities.
- Client figurine-computer systems 114, 116 download one or more functionalities from server 112 via a network 118.
- Network 118 may comprise any network, including a global network such as the Internet.
- Examples of functionalities include, but are not limited to, the functionalities described herein.
- the owner of client figurine-computer system 114 may desire that her child's figurine 122 be capable of helping teach her child about numbers, letters, basic shapes and basic colors, and of reciting stories suitable for a child four years of age.
- the owner of client figurine-computer system 114 downloads software for such functionalities from server 112 via network 118.
- the software is stored locally at external computer 120.
- the owner of client figurine-computer system 116 may desire that his child's figurine 124 be capable of reading a book, helping teach his child to speak and write in English and Spanish, and playing games appropriate for a child six years of age.
- each parent can customize his or her child's figurine for the child's age, needs or desires. As the child develops, the parent can obtain more advanced functionality. Further, as new functionalities are developed and added to database 113, the parents can download the new functionalities. As a result, the figurines seem to "grow" with the children, and can be enabled to perform new or more sophisticated functions. Because the new and more advanced functionality can be executed in external computer 120, the need to upgrade figurine 122 may be avoided, which can be important to a child that has become emotionally attached to figurine 122.
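The FIG. 11 download model can be sketched as a registry of installable modules. This is a minimal illustration under assumed names: `SERVER_DATABASE` stands in for database 113, the plain Python functions stand in for downloadable software, and a real system would transfer modules over network 118 rather than copy dictionary entries.

```python
# Hypothetical sketch of the download model: the server's database maps
# functionality names to software, and a client external computer installs
# a chosen subset locally so its figurine can invoke them later.
SERVER_DATABASE = {
    "count_shapes": lambda shapes: f"I see {len(shapes)} shapes!",
    "name_color": lambda color: f"That color is {color}.",
}

class ClientComputer:
    def __init__(self):
        self.installed = {}                   # software stored locally after download

    def download(self, server_db: dict, names: list):
        for name in names:                    # transfer selected functionalities
            self.installed[name] = server_db[name]

    def run(self, name: str, arg):
        # Invoked when the figurine relays a request over the wireless link.
        return self.installed[name](arg)

computer = ClientComputer()
computer.download(SERVER_DATABASE, ["count_shapes"])
print(computer.run("count_shapes", ["circle", "square"]))  # -> I see 2 shapes!
```

Because new entries added to the server database become downloadable without touching the figurine itself, this structure mirrors how the figurine can appear to "grow" with the child.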
- Although the figures may depict a single figurine with a single external computer, the invention encompasses embodiments in which a single external computer interacts with two or more figurines.
- a parent with two children can, for example, give a different figurine to each child, and each figurine can communicate wirelessly with the same or different external computers.
- Each child will perceive that each figurine operates independently of the other.
- each figurine may be separately empowered with functionality appropriate for each child.
- the invention may offer one or more advantages.
- a child's toy can be very versatile, capable of a wide range of functionality.
- the functionality can be customized to the child, and can change as the child develops.
- the invention supports an interesting and adaptable system that can help a child learn a wide range of subjects, making interaction with the figurine not only fun, but educational as well.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
MXPA06014212A MXPA06014212A (en) | 2004-06-08 | 2005-06-07 | Figurine using wireless communication to harness external computing power. |
BRPI0511898-0A BRPI0511898A (en) | 2004-06-08 | 2005-06-07 | system, interactive toy figure and method |
EP05758005A EP1765478A2 (en) | 2004-06-08 | 2005-06-07 | Figurine using wireless communication to harness external computing power |
JP2007527640A JP2008506510A (en) | 2004-06-08 | 2005-06-07 | Figurine using external computing power using wireless communication |
CA002569731A CA2569731A1 (en) | 2004-06-08 | 2005-06-07 | Figurine using wireless communication to harness external computing power |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US57810104P | 2004-06-08 | 2004-06-08 | |
US60/578,101 | 2004-06-08 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2005123210A2 true WO2005123210A2 (en) | 2005-12-29 |
WO2005123210A3 WO2005123210A3 (en) | 2008-02-14 |
Family
ID=35510282
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2005/019933 WO2005123210A2 (en) | 2004-06-08 | 2005-06-07 | Figurine using wireless communication to harness external computing power |
Country Status (8)
Country | Link |
---|---|
US (1) | US20060234602A1 (en) |
EP (1) | EP1765478A2 (en) |
JP (1) | JP2008506510A (en) |
CN (1) | CN101193684A (en) |
BR (1) | BRPI0511898A (en) |
CA (1) | CA2569731A1 (en) |
MX (1) | MXPA06014212A (en) |
WO (1) | WO2005123210A2 (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008096884A (en) * | 2006-10-16 | 2008-04-24 | Hitachi Software Eng Co Ltd | Communication system for learning foreign language |
JP2009034806A (en) * | 2007-07-31 | 2009-02-19 | Ind Technol Res Inst | Structure for applying radio recognizing technique to electronic robot operation sequential control |
EP2777786A3 (en) * | 2013-03-15 | 2014-12-10 | Disney Enterprises, Inc. | Managing virtual content based on information associated with toy objects |
US9011194B2 (en) | 2013-03-15 | 2015-04-21 | Disney Enterprises, Inc. | Managing virtual content based on information associated with toy objects |
GB2532141A (en) * | 2014-11-04 | 2016-05-11 | Mooredoll Inc | Method and device of community interaction with toy as the center |
US9610500B2 (en) | 2013-03-15 | 2017-04-04 | Disney Enterprise, Inc. | Managing virtual content based on information associated with toy objects |
Families Citing this family (44)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070073436A1 (en) * | 2005-09-26 | 2007-03-29 | Sham John C | Robot with audio and video capabilities for displaying advertisements |
US7551523B2 (en) * | 2007-02-08 | 2009-06-23 | Isaac Larian | Animated character alarm clock |
US8894461B2 (en) * | 2008-10-20 | 2014-11-25 | Eyecue Vision Technologies Ltd. | System and method for interactive toys based on recognition and tracking of pre-programmed accessories |
US7868762B2 (en) * | 2007-12-12 | 2011-01-11 | Nokia Corporation | Wireless association |
US20090197504A1 (en) * | 2008-02-06 | 2009-08-06 | Weistech Technology Co., Ltd. | Doll with communication function |
WO2009149112A1 (en) * | 2008-06-03 | 2009-12-10 | Tweedletech, Llc | An intelligent game system for putting intelligence into board and tabletop games including miniatures |
US9712359B2 (en) * | 2009-04-30 | 2017-07-18 | Humana Inc. | System and method for communication using ambient communication devices |
US20100325781A1 (en) * | 2009-06-24 | 2010-12-30 | David Lopes | Pouch pets networking |
US9421475B2 (en) | 2009-11-25 | 2016-08-23 | Hallmark Cards Incorporated | Context-based interactive plush toy |
US8568189B2 (en) * | 2009-11-25 | 2013-10-29 | Hallmark Cards, Incorporated | Context-based interactive plush toy |
US20110230116A1 (en) * | 2010-03-19 | 2011-09-22 | Jeremiah William Balik | Bluetooth speaker embed toyetic |
US8414347B2 (en) | 2010-12-23 | 2013-04-09 | Lcaip, Llc | Smart stuffed animal with air flow ventilation system |
US9089782B2 (en) * | 2010-12-23 | 2015-07-28 | Lcaip, Llc. | Smart stuffed toy with air flow ventilation system |
US8801490B2 (en) | 2010-12-23 | 2014-08-12 | Lcaip, Llc | Smart stuffed toy with air flow ventilation system |
US20120185254A1 (en) * | 2011-01-18 | 2012-07-19 | Biehler William A | Interactive figurine in a communications system incorporating selective content delivery |
US20120190453A1 (en) * | 2011-01-25 | 2012-07-26 | Bossa Nova Robotics Ip, Inc. | System and method for online-offline interactive experience |
JP5844288B2 (en) * | 2011-02-01 | 2016-01-13 | パナソニック インテレクチュアル プロパティ コーポレーション オブアメリカPanasonic Intellectual Property Corporation of America | Function expansion device, function expansion method, function expansion program, and integrated circuit |
US9126122B2 (en) * | 2011-05-17 | 2015-09-08 | Zugworks, Inc | Doll companion integrating child self-directed execution of applications with cell phone communication, education, entertainment, alert and monitoring systems |
US20130078886A1 (en) * | 2011-09-28 | 2013-03-28 | Helena Wisniewski | Interactive Toy with Object Recognition |
GB2496169B (en) * | 2011-11-04 | 2014-03-12 | Commotion Ltd | Toy |
US9492762B2 (en) | 2012-05-08 | 2016-11-15 | Funfare, Llc | Sensor configuration for toy |
US9565402B2 (en) * | 2012-10-30 | 2017-02-07 | Baby-Tech Innovations, Inc. | Video camera device and method to monitor a child in a vehicle |
US11020680B2 (en) * | 2012-11-15 | 2021-06-01 | Shana Lee McCart-Pollak | System and method for providing a toy operable for receiving and selectively vocalizing various electronic communications from authorized parties, and for providing a configurable platform independent interactive infrastructure for facilitating optimal utilization thereof |
US20140349547A1 (en) * | 2012-12-08 | 2014-11-27 | Retail Authority LLC | Wirelessly controlled action figures |
US20140162230A1 (en) * | 2012-12-12 | 2014-06-12 | Aram Akopian | Exercise demonstration devices and systems |
US20140256214A1 (en) * | 2013-03-11 | 2014-09-11 | Raja Ramamoorthy | Multi Function Toy with Embedded Wireless Hardware |
KR101504699B1 (en) * | 2013-04-09 | 2015-03-20 | 얄리주식회사 | Phonetic conversation method and device using wired and wiress communication |
US20140329433A1 (en) * | 2013-05-06 | 2014-11-06 | Israel Carrero | Toy Stuffed Animal with Remote Video and Audio Capability |
KR101458460B1 (en) * | 2013-05-27 | 2014-11-12 | 주식회사 매직에듀 | 3-dimentional character and album system using the same |
US9406240B2 (en) * | 2013-10-11 | 2016-08-02 | Dynepic Inc. | Interactive educational system |
JP6174543B2 (en) * | 2014-03-07 | 2017-08-02 | 摩豆科技有限公司 | Doll control method and interactive doll operation method by application, and apparatus for doll control and operation |
US20150290548A1 (en) * | 2014-04-09 | 2015-10-15 | Mark Meyers | Toy messaging system |
KR101623167B1 (en) * | 2014-05-16 | 2016-05-24 | 수상에스티(주) | Monitoring system for baby |
US9833725B2 (en) * | 2014-06-16 | 2017-12-05 | Dynepic, Inc. | Interactive cloud-based toy |
CN106459416B (en) | 2014-06-23 | 2020-05-12 | 信越化学工业株式会社 | Organopolysiloxane crosslinked product, process for producing the same, antifogging agent, and silicone composition for solvent-free release paper |
US9931572B2 (en) | 2014-09-15 | 2018-04-03 | Future of Play Global Limited | Systems and methods for interactive communication between an object and a smart device |
JP5866539B1 (en) * | 2014-11-21 | 2016-02-17 | パナソニックIpマネジメント株式会社 | Communication system and sound source reproduction method in communication system |
US10616310B2 (en) | 2015-06-15 | 2020-04-07 | Dynepic, Inc. | Interactive friend linked cloud-based toy |
US10405745B2 (en) | 2015-09-27 | 2019-09-10 | Gnana Haranth | Human socializable entity for improving digital health care delivery |
JP6680125B2 (en) * | 2016-07-25 | 2020-04-15 | トヨタ自動車株式会社 | Robot and voice interaction method |
US20180158458A1 (en) * | 2016-10-21 | 2018-06-07 | Shenetics, Inc. | Conversational voice interface of connected devices, including toys, cars, avionics, mobile, iot and home appliances |
US10783799B1 (en) * | 2016-12-17 | 2020-09-22 | Sproutel, Inc. | System, apparatus, and method for educating and reducing stress for patients with illness or trauma using an interactive location-aware toy and a distributed sensor network |
US10441879B2 (en) * | 2017-03-29 | 2019-10-15 | Disney Enterprises, Inc. | Registration of wireless encounters between wireless devices |
KR102295836B1 (en) * | 2020-11-20 | 2021-08-31 | 오로라월드 주식회사 | Apparatus And System for Growth Type Smart Toy |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7261612B1 (en) * | 1999-08-30 | 2007-08-28 | Digimarc Corporation | Methods and systems for read-aloud books |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2516425Y2 (en) * | 1990-12-11 | 1996-11-06 | 株式会社タカラ | Operating device |
EP0606790B1 (en) * | 1992-12-08 | 2000-03-22 | Steven Lebensfeld | Toy having subject specific,word/phrase selectable, message delivering doll or action figure |
US6947571B1 (en) * | 1999-05-19 | 2005-09-20 | Digimarc Corporation | Cell phones with optical capabilities, and related applications |
US5945656A (en) * | 1997-05-27 | 1999-08-31 | Lemelson; Jerome H. | Apparatus and method for stand-alone scanning and audio generation from printed material |
US6159101A (en) * | 1997-07-24 | 2000-12-12 | Tiger Electronics, Ltd. | Interactive toy products |
US6554679B1 (en) * | 1999-01-29 | 2003-04-29 | Playmates Toys, Inc. | Interactive virtual character doll |
US6227931B1 (en) * | 1999-07-02 | 2001-05-08 | Judith Ann Shackelford | Electronic interactive play environment for toy characters |
US6719604B2 (en) * | 2000-01-04 | 2004-04-13 | Thinking Technology, Inc. | Interactive dress-up toy |
US6773344B1 (en) * | 2000-03-16 | 2004-08-10 | Creator Ltd. | Methods and apparatus for integration of interactive toys with interactive television and cellular communication systems |
US6443796B1 (en) * | 2000-06-19 | 2002-09-03 | Judith Ann Shackelford | Smart blocks |
US7008288B2 (en) * | 2001-07-26 | 2006-03-07 | Eastman Kodak Company | Intelligent toy with internet connection capability |
CN102982298B (en) * | 2002-09-26 | 2016-12-07 | Ip解决方案株式会社 | Follow the tracks of device, information input/output unit, optical pickup device and information recording/playing back device |
US7248170B2 (en) * | 2003-01-22 | 2007-07-24 | Deome Dennis E | Interactive personal security system |
-
2005
- 2005-06-07 CA CA002569731A patent/CA2569731A1/en not_active Abandoned
- 2005-06-07 JP JP2007527640A patent/JP2008506510A/en active Pending
- 2005-06-07 WO PCT/US2005/019933 patent/WO2005123210A2/en active Application Filing
- 2005-06-07 CN CNA2005800267810A patent/CN101193684A/en active Pending
- 2005-06-07 US US11/146,907 patent/US20060234602A1/en not_active Abandoned
- 2005-06-07 EP EP05758005A patent/EP1765478A2/en not_active Withdrawn
- 2005-06-07 MX MXPA06014212A patent/MXPA06014212A/en not_active Application Discontinuation
- 2005-06-07 BR BRPI0511898-0A patent/BRPI0511898A/en not_active IP Right Cessation
Also Published As
Publication number | Publication date |
---|---|
CA2569731A1 (en) | 2005-12-29 |
BRPI0511898A (en) | 2008-01-15 |
WO2005123210A3 (en) | 2008-02-14 |
EP1765478A2 (en) | 2007-03-28 |
MXPA06014212A (en) | 2007-03-12 |
US20060234602A1 (en) | 2006-10-19 |
CN101193684A (en) | 2008-06-04 |
JP2008506510A (en) | 2008-03-06 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A2 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A2 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
DPEN | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed from 20040101) | ||
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: PA/a/2006/014212 Country of ref document: MX |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2569731 Country of ref document: CA Ref document number: 2007527640 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWW | Wipo information: withdrawn in national office |
Country of ref document: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2005758005 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 200580026781.0 Country of ref document: CN |
|
WWP | Wipo information: published in national office |
Ref document number: 2005758005 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: PI0511898 Country of ref document: BR |