US20140038489A1 - Interactive plush toy - Google Patents
Interactive plush toy
- Publication number
- US20140038489A1 (U.S. application Ser. No. 13/567,490)
- Authority
- US
- United States
- Prior art keywords
- toy
- query
- macro
- plush toy
- action
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H3/00—Dolls
- A63H3/28—Arrangements of sound-producing means in dolls; Means in dolls for producing sounds
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H30/00—Remote-control arrangements specially adapted for toys, e.g. for toy vehicles
- A63H30/02—Electrical arrangements
- A63H30/04—Electrical arrangements using wireless transmission
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63H—TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
- A63H2200/00—Computerized interactive toys, e.g. dolls
Abstract
A system is presented for controlling an interactive electronic plush toy using a macro computer program created with a mobile application on a mobile electronic device. The plush toy is provided with an electronic wireless network interface for communicating wirelessly with the mobile electronic device, such as a smartphone, via a network. The macro provides computer programming for instructing the plush toy to perform an Internet search query, receive results of the query, and perform predetermined actions based on the query results. The actions may include producing sound or motion to make the toy seem life-like.
Description
- The present application relates to the field of electronically-controlled toys. More particularly, the described embodiments relate to a plush toy that is controlled by a smartphone application or “app.” The plush toy may connect wirelessly to the Internet and may be programmed to perform actions based on information obtained from the Internet.
- One embodiment of the present invention provides a plush toy having built-in wireless communication features. A smartphone app may wirelessly control the plush toy's movement, sounds, or functions by assigning a predetermined macro computer program to the toy, or by creating a series of instructions for the toy to follow. In one aspect, a user may create a macro program that instructs the toy to query a resource, receive results, apply a conditional rule to the result, and perform one or more actions indicated by the rule. The toy may have moving parts, may play sounds, may play pre-recorded or real-time messages, may have text-to-speech features, and may play back custom voice recordings. The toy may have integrated sensors and respond to ambient factors such as movement, sounds, or light. The toy may be programmed to perform functions based on the time of day. The smartphone app may define tasks such as pulling weather information, financial information, or social media information from the Internet.
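The query tasks described above (pulling weather, financial, or social media information from the Internet) amount to dispatching a macro's query to a chosen information resource. The following Python fragment is purely illustrative; the resource names, function signatures, and canned return values are assumptions of this sketch and do not appear in the application:

```python
# Hypothetical sketch of dispatching a macro's query to an information
# resource. Resource names and return values are invented placeholders.

RESOURCES = {
    "weather": lambda query: 72,            # would query a weather service
    "finance": lambda query: 13.37,         # would query financial information
    "social": lambda query: "2 new posts",  # would query social media
}

def run_query(resource, query):
    """Dispatch a macro's query string to the chosen information resource."""
    return RESOURCES[resource](query)

print(run_query("weather", "current temperature in Fahrenheit"))
```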
- FIG. 1 is a schematic diagram depicting a system for controlling an interactive plush toy.
- FIG. 2 is a flow chart showing steps in a method for creating a macro.
- FIG. 3 is a flow chart showing the steps for creating an exemplary macro.
- FIG. 4 is a flow chart detailing steps in a method for executing a macro.
- FIG. 5 is a schematic diagram depicting the primary elements of an interactive plush toy.
- FIG. 6 is a diagram showing elements of a smartphone app used to program a macro for an interactive plush toy.
FIG. 1 shows a system for controlling an interactive plush toy. The system includes a plush toy 100, a handheld mobile communication device such as a smartphone 150, a network 160 such as the Internet, and information resources 170, 180, and 190. The toy 100 is depicted in FIG. 1 as a schematic of the circuitry inside a plush toy 100. Although toy 100 is described as a “plush toy” in relation to the various embodiments, toy 100 may also be a doll, an action figure, a toy vehicle, or other plaything. Device 150 is described as a “smartphone,” but could also be a tablet computer, a music player, a digital watch, or any other mobile device with computing power able to execute a mobile application.

Smartphone 150 and plush
toy 100 have wireless interfaces allowing direct wireless communication between plush toy 100 and smartphone 150. Plush toy 100 and smartphone 150 are also each able to connect wirelessly to a remote network 160 such as the Internet. Wireless interfaces may include Bluetooth, Wi-Fi, cellular, or other wireless data communication interfaces. For example, smartphone 150 may connect to plush toy 100 via Bluetooth communication and connect to network 160 via cellular communication.
Remote information resources 170, 180, and 190 are accessible to smartphone 150 and plush toy 100 through network 160. In the exemplary embodiment, weather information 170 may include temperature, precipitation, and pollen count information. Weather information 170 could also include weather forecasts, severe weather information, driving conditions, UV index, or other weather-related information. News information 180 may include national news headlines, local news, and stock news. News information 180 could also include category-specific news such as technology or political news. Social media 190 may include social networks and email, but may also include news sources, blogs, Internet forums, podcasts, video, etc. Other information resources accessible to smartphone 150 and plush toy 100 could include sports, horoscope, travel, shopping, and other types of information resources available via a network 160.

Smartphone 150 may include a plush
toy control app 152 for controlling and programming plush toy 100. As explained further in relation to FIGS. 2-6, plush toy app 152 allows a user to create a macro for plush toy 100 by defining a search query, defining conditional instructions indicating when to perform the query, defining conditional rules differentiating the results of the query, and defining actions to perform based on the query results. After plush toy app 152 finishes programming a macro, the macro is sent to the plush toy 100 and stored in memory 135 as macro 130. Plush toy 100 may execute the macro 130 by performing the steps of executing a query, receiving the query results, comparing the query results to the conditional rules, and performing actions provided in the conditional rules, based on the query results. There are numerous ways to implement plush toy app 152 that would be known to one skilled in the art.

Plush toy 100 has numerous interactive features to make
plush toy 100 seem life-like. The features may include movement capabilities such as movement 115, which includes moving the head, arms, legs, eyes, and mouth of plush toy 100. Movement capabilities may be implemented electronically or mechanically, and may be software- or hardware-controlled. In one embodiment, plush toy 100 could contain movement actuators using gears, pulleys, etc. connected to one or more servo motors or stepper motors controlled by processor 145. Other implementations of movement in a plush toy 100 will be known to one skilled in the art.

Plush toy 100 has a
digital processor 145 in data communication with a tangible, non-transitory memory 135. In one embodiment, memory 135 includes both non-volatile memory, such as magnetic-based memory or flash memory, and volatile but faster-access memory, such as RAM. Processor 145 executes computer instructions and may control movement, sensors, and sound production within plush toy 100. Processor 145 may be a general purpose processor such as those processors based on the ARM processing architecture by ARM Holdings PLC (Cambridge, UK). Memory 135 may contain macros 130 with instructions for accessing information over a network such as the Internet, querying databases or other information sources, monitoring sensors 105 within the plush toy 100, and instructing the plush toy 100 to produce movement and generate sounds. Memory 135 may also contain general programming 132 that controls the basic features of the plush toy 100. The general programming may, for instance, instruct the plush toy 100 to make a particular noise when a pressure sensor in an arm of the plush toy 100 senses pressure. The general programming 132 also instructs the plush toy 100 when and how to communicate over wireless interface 155 to the network 160, and how to download macros 130 from the network 160. In one embodiment, the macros 130 contain instructions that are interpreted and implemented on the processor 145 by the general instructions 132. The general instructions 132 and macros 130 may be loaded from the non-volatile portions of the memory 135 into the volatile but faster RAM portion of the memory 135 in order to speed up execution of the macro 130 and instructions 132.
Sensors 105 are integrated within plush toy 100 and may be software- or hardware-controlled and monitored. In the preferred embodiment, sensors 105 monitor various conditions internal and external to plush toy 100. Sensors 105 may include one or more of a motion sensor, light sensor, clock, sound sensor, accelerometer, pressure sensor, or other types of known sensors. Sensors 105 may operate continuously, or may be activated on demand by computer instructions within plush toy 100 or instructions received from outside of plush toy 100. In some embodiments, a sensor 105 is provided that monitors the wireless interface 155 for a trigger signal. This allows a Bluetooth connection or a WiFi connection to receive signals that trigger actions in the plush toy 100.

Movement features 115 of
plush toy 100 allow for physical movement of the body, head, arms, legs, eyes, and other parts of plush toy 100, and may include light-up features of plush toy 100. Movement 115 may be controlled by macros 130 or as part of the general instructions 132. Sound generating features 125 include pre-recorded sound, text-to-speech features, playback of live audio, or other types of sound. Sound generating features 125 preferably are coordinated with movement features 115, for example to synchronize the mouth of plush toy 100 with speech sounds.
FIG. 2 presents a method for creating a macro for use with plush toy 100 of FIG. 1. FIG. 3 presents a specific implementation of the method of FIG. 2, in which a macro is programmed to access weather information via a network such as the Internet, and then perform actions based on the retrieved information. The macro is preferably created at the smartphone 150 using plush toy app 152.

In the
first step 210 of FIG. 2, a user of a smartphone 150 uses plush toy app 152 to create a macro. Step 210 includes choosing an information resource and a query for that information resource. The information resource may be one or more of information resources 170, 180, and 190 of FIG. 1, but could also include other types of resources accessible over a network 160. Step 210 may include choosing a database, webpage, URL, or other web resource to query, and preferably includes a search term or search parameter.

In
step 220, the user chooses an execute trigger for the macro. The execute trigger is a conditional statement indicating when to execute the steps of the macro. The execute trigger condition may be a specified time of day, a sound cue, a motion cue, a light cue, or other external condition determined by a sensor 105 on the plush toy. Alternatively, the trigger could be the powering on or powering off of the plush toy 100. The trigger may be recurring or intermittent. The user may choose in step 220 to program the macro to execute an unlimited number of times, or may choose to program the macro to run only a specified number of times, after which the macro will not run.

In
step 230, the user of the plush toy app 152 defines conditional rules indicating how to differentiate the query results returned from the information resource as a result of the query. The plush toy app 152 may have built-in or predetermined instructions for step 230, or the user may be given flexible choices for differentiating the query results. In the preferred embodiment, at step 230 the macro is programmed to evaluate the query results using conditional rules, then determine actions for the plush toy 100 based on an applied rule. The rules may include being above or below a certain threshold, may be Boolean yes/no conditions, may be based on predetermined keywords, or other programmable rules. It is possible to create macros that contain no conditionals in step 230.

In
step 240, the macro is programmed to define actions to perform based on the differentiated query results. In one embodiment, an action is performed for every query result. In an alternate embodiment, an action may be performed for some query results, while no actions are performed for other query results. The actions to be performed may include movement, sound projection, text-to-speech, illuminating lights, or other actions. The actions to be performed may include executing a second macro and performing the steps of the second macro.

In
step 250, the user of plush toy app 152 chooses a return trigger indicating when to perform the actions defined in step 240. The return trigger may be a specified time of day, a sound cue, a motion cue, a light cue, or other external condition. In one embodiment the plush toy 100 waits until a return trigger condition is met before executing the action defined in step 240. In another embodiment, no return trigger is defined in step 250 and the plush toy 100 may perform the actions defined in step 240 immediately without waiting for a return trigger condition. The programming of the macro is finished when step 250 is complete. In step 260, the macro programming instructions are transferred wirelessly from the plush toy app 152 and saved in the memory 135 of plush toy 100 as a macro 130. The wireless transfer may be made locally through a connection such as a Bluetooth connection, or may be made over a remote network such as the Internet. If the wireless transfer of the macro 130 is made via the Internet, the smartphone 150 does not need to be near the plush toy 100 to send macro instructions.
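Steps 210 through 250 amount to assembling a small data record, which step 260 then transfers to the toy. The following Python sketch illustrates one possible shape for such a record; all field names and example values are assumptions of this sketch, not terms from the application:

```python
# Hypothetical sketch of a macro record assembled in steps 210-250 of FIG. 2.
# Field names and example values are illustrative assumptions.

def create_macro(resource, query, execute_trigger, rules, return_trigger=None,
                 max_runs=None):
    """Bundle the user's choices from steps 210-250 into one record."""
    return {
        "resource": resource,                # step 210: information resource
        "query": query,                      # step 210: search term or parameter
        "execute_trigger": execute_trigger,  # step 220: when to run the query
        "rules": rules,                      # steps 230-240: rules -> actions
        "return_trigger": return_trigger,    # step 250: when to act (optional)
        "max_runs": max_runs,                # step 220: optional run limit
    }

macro = create_macro(
    resource="weather information resource",
    query="current temperature (Fahrenheit) for the chosen location",
    execute_trigger="daily at 07:00",
    rules=[("TEMP < 55", ["speak", "move mouth"]),
           ("TEMP >= 70", ["speak"])],
    return_trigger="motion sensed",
)
print(macro["execute_trigger"])
```

In step 260, a record like this would be serialized and sent over the wireless connection to be stored as macro 130.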
FIG. 3 is an exemplary macro, such as macro 130 of FIG. 1, which may be programmed at a plush toy app 152. The function of the macro 130 is to perform an Internet query of a weather information resource to determine the outside temperature at 7 AM. The results of the query are numeric, and are differentiated by particular numeric thresholds. If the temperature query result returns a number below a certain threshold, the macro 130 instructs the plush toy 100 to perform a first set of actions; if the results are above a particular threshold, the macro 130 instructs the plush toy 100 to perform a second set of actions. If the numeric result is between the two thresholds, the plush toy 100 takes no action. The macro 130 instructs the plush toy 100 to perform the chosen action after a motion sensor is triggered.

Turning specifically to the steps in
FIG. 3, in step 310 the macro 130 is programmed to query an information resource 170 for weather information. At step 312, the macro defines the query for the weather information resource 170, asking the resource to return the current temperature (in degrees Fahrenheit) for a particular geographic location. At step 314, a trigger is defined that causes the macro to perform this query at 7:00 AM every day. Other triggers (not shown) could be defined, for example, to only execute the macro 130 on Mondays, or to only execute the macro 130 a particular number of times, then discontinue executing the macro. The macro 130 could also be programmed to execute more than one time per day, or only in response to a predetermined external stimulus detected by a sensor 105, such as a voice command.

In
step 320, the macro 130 is programmed to receive the results of the query, after which conditional rules are applied. In the exemplary embodiment of FIG. 3, the results of the query are numeric and are represented in FIG. 3 by the variable “TEMP.” These results are differentiated in three steps. In step 332 the query result is tested against a first conditional statement. Step 332 determines whether the numeric value of “TEMP” is less than 55 degrees. If step 332 returns a negative, the result is tested at step 334, which determines whether “TEMP” is less than 70 degrees. If step 334 returns a negative, then it is known that “TEMP” is greater than or equal to 70 degrees, as indicated in FIG. 3 by box 336. In the example of FIG. 3, all numerical (e.g., integer) query results will follow one of the paths at steps 332, 334, or 336.

Once the differentiating conditional options of
steps 332, 334, and 336 have been applied, the macro determines the actions to perform. If the condition of step 332 is satisfied, the method proceeds to step 342. The macro programmer may choose one or more actions and sounds to perform for the particular query result. In the exemplary embodiment of FIG. 3, the user programs the macro 130 at step 342 to cause the plush toy 100 to move the mouth, eyes, and arms, and to produce the sound “Remember your jacket!” through either text-to-speech functions, pre-recorded sound, or by recording a custom voice sound. At step 344, the user programs the macro 130 to take no action when the condition at step 334 is met. At step 346, the user programs the plush toy 100 to move the mouth and eyes and produce the sound “You won't need a coat today!”

After the actions to be performed are defined at
steps 342, 344, and 346, the user of plush toy app 152 may define a conditional statement indicating when to perform the actions. In the example of FIG. 3, the user chooses to make the plush toy 100 wait until motion is sensed by the plush toy before performing one of the actions. By programming step 348, the user of plush toy app 152 will decrease the likelihood that the plush toy 100 performs the actions when no person is around. The programming at step 348 will cause the plush toy 100 to wait for motion to be sensed, and if no motion is sensed, plush toy 100 will continue to monitor using its motion sensors. In one embodiment, the repeating step of waiting for motion to be sensed in step 348 can time out after a predetermined time period, such as two hours. At step 350, the user programs the plush toy 100 to perform the determined action of step 342, 344, or 346 once the condition of step 348 is satisfied.
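The three-way differentiation of steps 332, 334, and 336 and the corresponding actions 342, 344, and 346 can be sketched as a threshold function. The 55-degree and 70-degree thresholds and the spoken phrases follow the description of FIG. 3; the function name and structure are assumptions of this sketch:

```python
# Sketch of the conditional differentiation in FIG. 3 (steps 332, 334, 336).
# Thresholds and phrases follow the figure description; names are assumptions.

def differentiate(temp):
    """Return the phrase to speak for a 7:00 AM temperature query result."""
    if temp < 55:                        # step 332 -> action 342
        return "Remember your jacket!"
    elif temp < 70:                      # step 334 -> step 344: no action
        return None
    else:                                # box 336 -> action 346
        return "You won't need a coat today!"

print(differentiate(40))   # Remember your jacket!
print(differentiate(60))   # None
print(differentiate(85))   # You won't need a coat today!
```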
FIG. 4 shows a method for executing a macro in a system such as the system shown in FIG. 1, having a plush toy 100 with a memory 135 and a processor 145. In the system, the plush toy 100 is in wireless data communication with a network 160 and may access information resources such as information resources 170, 180, and 190 over network 160. The first step 400 is for the plush toy 100 to receive the macro 130 that was programmed by the mobile device 150 and store the macro 130 in memory 135. In step 410, macro programming 130 then begins executing by instructing processor 145 to perform an Internet query of an information resource over network 160 using the defined search query. In step 420, the results of the query are received at the processor 145. In step 430, the processor 145 compares the query results to predetermined results conditionals specified in the macro programming 130. Based on the results conditionals, the processor 145 determines actions to perform at the plush toy 100 at step 440. In step 450, the plush toy 100 waits for one or more events determined by macro 130 indicating when to perform the actions. At step 460, the processor 145 determines that the conditionals have been satisfied, and at step 470 the plush toy 100 performs the actions. As explained above, it is possible for no return trigger to be defined in the macro programming 130, in which case the actions will be performed in step 470 immediately after determining the actions to be performed in step 440, thereby skipping steps 450 and 460.
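The execution method of FIG. 4 can be sketched as follows. The function and parameter names are assumptions of this sketch; fetch stands in for the query of steps 410-420, rules for the conditionals of steps 430-440, and wait_for for the return-trigger monitoring of steps 450-460:

```python
# Hypothetical sketch of the macro execution method of FIG. 4 (steps 410-470).
# None of these names come from the application; they stand in for the toy's
# query, results conditionals, and return-trigger monitoring.

def execute_macro(fetch, rules, wait_for=None):
    result = fetch()                     # steps 410-420: perform query, receive result
    actions = []
    for condition, rule_actions in rules:   # step 430: compare result to conditionals
        if condition(result):
            actions = rule_actions          # step 440: determine actions to perform
            break
    if wait_for is not None:
        wait_for()                       # steps 450-460: await the return trigger
    return actions                       # step 470: actions the toy would perform

acts = execute_macro(
    fetch=lambda: 72,                    # stand-in for a 7:00 AM temperature query
    rules=[(lambda t: t < 55, ["say: Remember your jacket!"]),
           (lambda t: t >= 70, ["say: You won't need a coat today!"])],
)
print(acts)
```

When no return trigger is supplied (wait_for is None), the sketch acts immediately after step 440, mirroring the skipping of steps 450 and 460 described above.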
FIG. 5 shows an exemplary plush toy to be used with the methods and system described herein. Plush toy 500 comprises a processor 510 that controls the operations of plush toy 500. A wireless interface 550 connects plush toy 500 to a network such as network 160 and a smartphone such as smartphone 150 of FIG. 1. Wireless interface 550 may be one or more of Bluetooth, Wi-Fi, cellular GSM, cellular CDMA, or other wireless data communication interfaces.

A
memory 511 is operably connected to processor 510, and stores macros 512 and general instructions 513 for the operation of the plush toy 500. In the preferred embodiment, a macro is created on a mobile electronic device and sent via a wireless signal to be stored in the memory 511 of plush toy 500. The macros 512 may be macros programmed according to FIG. 3. Multiple macros 512 may be programmed and transmitted to the plush toy 500 via wireless interface 550. It is contemplated that plush toy 500 would run more than one macro at a time. Allowing the toy 500 to perform multiple actions at various times in response to a number of different stimuli will make the toy 500 more interactive and lifelike.
Processor 510 is responsible for controlling motion and producing sound in plush toy 500 according to the general instructions 513 stored in memory 511. Processor 510 also receives input from sensors 530 and uses the sensor data in connection with macros 512. Sensors 530 of the plush toy 500 may include one or more of a motion sensor, light sensor, clock, sound sensor, accelerometer, or other types of known sensors. Plush toy 500 may have a speaker and amplifier 524 for generating sound. Text-to-speech functions 540 may be used to produce sound. As is true with the other capabilities of the plush toy 500, text-to-speech functionality 540 may be implemented in software as part of the general instructions 513, or may be implemented as hardware, such as in a special purpose processor designed to convert text data to audible speech. Pre-recorded sound may be stored in memory 511 and be projected through speaker 524.
Plush toy 500 has motion actuators including arm actuators 520, leg actuators 521, eye actuator 522, and mouth actuator 523, which produce movement and make plush toy 500 appear life-like. Motion actuators of plush toy 500 may include small, rotating motors connected to gears, pulleys, cams, or levers. Servo or stepper motors connected to the processor 510 may drive motion in actuators 520-523. Processor 510 may control mouth actuator 523 and sound generator 540 to synchronize speech sounds with mouth motion of plush toy 500. The movement features of plush toy 500 may be implemented in a number of different ways that are known to one skilled in the art.
FIG. 6 shows an exemplary embodiment of a smartphone or mobile application for creating a macro program to be used with a toy such as plush toy 500. A user interface 600 for a plush toy app 601 has features for programming a macro and sending the macro 512 via a wireless connection to the plush toy 500. In the preferred embodiment, the smartphone has a touch-sensitive screen for interacting with user interface 600, making plush toy app 601 very easy to use. The user interface 600 shows an edit window 610 containing various modifiable fields. In a preferred embodiment, the plush toy app 601 has predefined macro templates to simplify macro creation. Temperature macro 620 is an example of such a template. A user is able to edit certain functional elements of the macro 620 without having to understand the underlying computer programming.

In one embodiment, macro templates are created to ease the creation of common macro types. For example,
the macro 620 of FIG. 6 could be created based upon a weather macro template that simplifies the creation of a macro for the plush toy that is based on the weather conditions in a location. In this example, the macro queries the outdoor temperature at a location. The macro 620 allows a user to edit the search query data field 621. The search query data field 621 defines the instructions that are used to search weather information such as weather information resource 170 of FIG. 1 over a wireless network 160. Search data field 621 is shown searching for the temperature in degrees Fahrenheit, but the data field 621 could also search other weather attributes such as precipitation forecasts and pollen count. This field could be edited by clicking on edit search button 611 in the edit macro 610 portion of user interface 600. The application would then allow a user to change one or more of the variables in the search field 621, such as the desired weather condition or the relevant geographic location. By choosing edit run settings button 612, the user may change the conditions under which the macro 620 will perform its functions. Run settings field 622 indicates that the macro will be executed when the clock within plush toy 500 registers 7:00 AM. The run settings field 622 could support other programming instructions. For example, the macro 620 could be programmed to run at different times on different days, or to run in response to a voice command, a motion detection, a light detection, or other type of stimulus internal or external to plush toy 500. In the preferred embodiment, the macro programming 620 causes processor 510 to monitor one or more sensors 530 (including the clock) to determine when to run a macro program 620. Edit results options button 613 allows a user to change the results options field 623 for macro 620. In the embodiment of FIG. 6, the search query results are differentiated by numerical temperature.
In a first case, the results option field 623 indicates that if the temperature is below 55 degrees the macro 620 will cause the processor 510 to perform a first set of actions. If the temperature is above 70 degrees, the processor 510 will perform another set of actions. In an alternate scenario, the macro 620 could be programmed to search for other information, for example, precipitation information. The results options 623 would be separated by different variables in that case. For example, the search query may query a weather information resource and return results indicating a percent chance of precipitation. In this case the results option 623 could instruct to perform a particular action if the chance of rain is below twenty percent, and perform another action if the chance of precipitation is fifty percent or higher.

Edit sounds
button 614 and edit movement button 615 allow a user to program the actions that the plush toy 500 will perform after receiving results of a query. In the exemplary embodiment, the data field 624 shows that the plush toy 500 will use text-to-speech features to play back the phrase “you won't need a coat today.” The data field 624 specifically shows the action to take when the weather query results are above 70 degrees. The plush toy app 601 would also allow the user to choose sounds when the weather query results indicate that the temperature is below 55 degrees. Movement instructions are given in data field 625. In the example shown, when the temperature is above 70 degrees, the processor 510 of plush toy 500 will cause the arm actuators 520 to wave, and will cause the mouth actuator 523 to “lip synch” along with the text-to-speech sounds projected by speaker and amplifier 524. Plush toy app 601 would also allow the user to program motions for the plush toy 500 to perform when the results show that the temperature is below 55 degrees.

Edit
return options button 616 allows the user to edit the conditions under which plush toy 500 will perform the sound 624 and movement 625 actions. In the example of FIG. 6, return options field 626 instructs the plush toy 500 to signal sensors 530 of FIG. 5 to activate a motion sensor and wait until the motion sensor senses a movement before returning the results and performing the specified actions. Other return conditions could be based on a different sensor 530.

When the programming for
macro 620 is complete, the plush toy app 601 will wirelessly send the macro program to plush toy 500 when the user selects the “send to my toy” button 680. The plush toy app 601 may send the macro directly to plush toy 500 via a wireless interface using a local wireless signal, for example through a Bluetooth connection. Plush toy app 601 may also route the macro programming 620 through a network such as the Internet. In this case, plush toy 500 would receive the wireless signals via a wireless Internet connection. When plush toy app 601 uses the Internet to communicate with plush toy 500, a smartphone does not need to be in close proximity to plush toy 500 to create and send the macro program 620.

It is possible to use the
app 601 to program multiple plush toys 500. In this case, the send to my toy button 680 would require the user to identify which toy is being programmed by this macro. In a multi-toy environment, it is possible to program interactions between toys 500. These interactions usually require that one toy 500 perform an action that is sensed by the other toy 500. For instance, a first toy could be programmed as described above, to query the outdoor temperature at 7:00 AM, wait for movement in the room, and then speak the words “You won't need a coat today.” The other toy could also be programmed so that at 7:00 AM it will query the predicted temperature for tomorrow, and may discover that the high tomorrow will be 92 degrees. The second toy would wait for the first toy to state its line (such as by waiting for a sound sensor to hear the words, or waiting until the second toy senses movement, then senses sound, and then waits 5 seconds). When this occurs, the second toy then states “and tomorrow looks like a hot one. It will be over ninety degrees tomorrow.” To improve interaction between toys 500, the toys could include the ability to trigger one another, such as through unique sounds, or even wireless digital connections operating between the wireless interfaces 550 of each toy 500. Macros could be programmed to initiate upon receipt of a signal from a companion toy 500, cause an action to be performed, and then send a return signal to the companion toy 500 to trigger an additional macro at the companion toy. By combining this type of interaction with the ability to query external data sources, complicated and informative interactions between toys could be developed.

The many features and advantages of the invention are apparent from the above description. Numerous modifications and variations will readily occur to those skilled in the art. For example, the plush toy app could be implemented as a control panel attached to the plush toy.
Since such modifications are possible, the invention is not to be limited to the exact construction and operation illustrated and described. Rather, the present invention should be limited only by the following claims.
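The macro described in the specification has four parts: an external query, a conditional rule applied to the query result, and an action chosen by that rule, optionally gated by a trigger. As a rough illustration only, the coat example above might be sketched as follows; every identifier here (make_coat_macro, get_temperature, the 60-degree threshold) is hypothetical and does not appear in the patent.

```python
# Illustrative sketch of the query / conditional-rule / action macro from the
# specification. All names and the threshold value are hypothetical.

def make_coat_macro(get_temperature):
    """Build a macro: query an external resource, apply a threshold rule,
    and return the spoken phrase (the 'action') the rule selects."""
    def macro():
        temp = get_temperature()          # query of an external information resource
        if temp >= 60:                    # conditional rule: numeric threshold compare
            return "You won't need a coat today."
        return "Bundle up, it's cold outside."
    return macro

# Stand-in for the weather query used in the example.
macro = make_coat_macro(lambda: 72)
print(macro())  # prints: You won't need a coat today.
```

In an actual toy, the stand-in temperature lambda would be replaced by a network query over the toy's wireless interface, and the returned phrase would be rendered by the sound generator.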
Claims (24)
1. A system for controlling an electronic toy, comprising:
a) a mobile electronic device having a device processor, a device wireless communication interface in data communication with the electronic toy over a wireless data connection, a tangible, non-transitory device memory, and a mobile application residing on the device memory, the mobile application including macro instructions transmitted wirelessly to the toy over the wireless data connection; and
b) the electronic toy having
i) a toy processor,
ii) a tangible, non-transitory toy memory,
iii) a toy wireless communication interface in data communication with the mobile electronic device over the wireless data connection,
iv) a processor-controlled apparatus selected from a set comprising a sound generator and a motion actuator, and
v) the macro instructions received from the mobile electronic device via the wireless data connection and stored in the toy memory, the macro instructions configured to cause the toy to
(1) perform a query of an external information resource over the toy wireless communication interface,
(2) receive a query result,
(3) apply a conditional rule to the query result, and
(4) cause the processor-controlled apparatus to perform an action indicated by the conditional rule as applied to the query result.
2. The system of claim 1 , wherein the information resource is a weather information resource and the query includes a request for at least one of temperature, precipitation, UV index, and pollen count information.
3. The system of claim 1 , wherein the performed action is generating sound on the sound generator located within the toy, the sound being at least one of recorded sound and text-to-speech sound.
4. The system of claim 1 , wherein the action performed is activating the motion actuator of the toy.
5. (canceled)
6. (canceled)
7. The system of claim 1 , wherein the query result is a numeric value and the conditional rule differentiates the query result by comparing the numeric value to a threshold value.
8. The system of claim 1 , wherein the toy further comprises a sensor, and the macro instructions include instructions to monitor the sensor and send the query after the sensor makes a detection.
9. The system of claim 8 , wherein the sensor is one of a motion sensor, a sound sensor, a light sensor, a voice sensor, a clock, and an accelerometer.
10. The system of claim 1 , wherein the toy further comprises a sensor, and the macro instructions include instructions to monitor the sensor after receiving the query result and perform the action after the sensor makes a detection.
11. A method for controlling a processor-controlled interactive toy via a mobile device, the mobile device having a processor, a wireless network interface, a user interface, and a tangible, non-transitory memory, the method comprising:
a) creating a macro, at the mobile device, by selecting, through the user interface,
i) a search query for performing a query of an information source on a remote network,
ii) a conditional rule to apply to a result of the query,
iii) a first action for the toy to perform, the action including at least one of generating sound at a sound generator of the toy and activating a motion actuator of the toy; and
b) transmitting the macro from the mobile device to the toy via the wireless network interface of the mobile device to be stored in a tangible, non-transient memory of the toy;
wherein the conditional rule provides instructions for the toy to perform the first action if the query result satisfies a first condition.
12. (canceled)
13. The method of claim 11 , wherein the conditional rule provides instructions to perform a second action different from the first action if the query result does not satisfy the first condition.
14. The method of claim 11 , wherein step a) further comprises selecting a trigger condition on which to initiate the query of the information source.
15. The method of claim 14 , wherein step a) further comprises selecting a trigger condition on which to initiate performing the action.
16. The method of claim 11 , wherein the information source is one of a weather information source, a news information source, and a social media information source.
17. A method for executing a macro by an electronic toy having a processor, a non-transitory memory, and a wireless interface connected to a wireless network, the method comprising:
a) receiving macro instructions over the wireless network;
b) storing the macro instructions in the memory of the electronic toy;
c) executing the macro instructions after step b), the macro instructions including instructions to:
i) query a remote information resource via the wireless interface,
ii) receive a result of the query,
iii) apply a conditional rule to the query result, the conditional rule providing instructions to perform an action if the query result meets a first condition; and
d) performing the action in response to executing the macro instructions;
wherein the action is one of generating sound at a sound generator of the toy and activating a motion actuator of the electronic toy.
18. The method of claim 17 , wherein the macro instructions further include instructions to query the remote information resource after a trigger event.
19. The method of claim 17 , wherein the query result is a numeric value and the conditional rule includes comparing the numeric value to a threshold value.
20. The method of claim 17 , wherein the motion actuator is one of a leg actuator, an arm actuator, an eye actuator, and a mouth actuator.
21. The method of claim 17 , wherein the information resource is one of a weather resource, a news resource, and a social media resource.
22. The system of claim 1 , wherein the conditional rule is a rule to differentiate the query result using a Boolean yes/no condition.
23. The system of claim 1 , wherein the conditional rule is a rule to differentiate the query result based on predetermined keywords.
24. The method of claim 11 , further comprising the steps of:
c) storing the macro in the memory of the toy;
d) performing the query of the information source on the remote network via a wireless interface within the toy;
e) applying the conditional rule to the result of the query; and
f) performing the first action, by the toy.
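Claim 17 recites a receive, store, then execute sequence on the toy side: query the remote resource, apply the conditional rule, and perform the action only if the rule is satisfied. The sketch below is a hypothetical rendering for illustration; the claims do not specify any wire format, data structure, or API, and all names here are invented.

```python
# Hypothetical toy-side execution of a received macro, following the
# receive -> store -> query -> apply rule -> act sequence of claim 17.

stored_macros = []          # stand-in for the toy's non-transitory memory
actions_performed = []      # records what the sound generator would say

def receive_macro(macro):
    """Steps a)-b): receive the macro instructions and store them."""
    stored_macros.append(macro)

def execute_stored_macros(query_resource):
    """Steps c)-d): query, apply the conditional rule, perform the action."""
    for macro in stored_macros:
        result = query_resource(macro["query"])        # i)-ii) query and result
        if macro["rule"](result):                      # iii) conditional rule
            actions_performed.append(macro["action"])  # d) perform the action

# Example: the "hot one" macro from the description, with a canned resource.
receive_macro({
    "query": "high_temp_tomorrow",
    "rule": lambda t: t > 90,                  # threshold comparison (claim 19)
    "action": "say: It will be over ninety degrees tomorrow.",
})
execute_stored_macros(lambda q: 92)            # canned query result of 92 degrees
print(actions_performed)
```

Here the canned lambda stands in for the wireless query of the remote information resource, and appending to a list stands in for driving the sound generator or motion actuator.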
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/567,490 US20140038489A1 (en) | 2012-08-06 | 2012-08-06 | Interactive plush toy |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/567,490 US20140038489A1 (en) | 2012-08-06 | 2012-08-06 | Interactive plush toy |
Publications (1)
Publication Number | Publication Date |
---|---|
US20140038489A1 true US20140038489A1 (en) | 2014-02-06 |
Family
ID=50025938
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/567,490 Abandoned US20140038489A1 (en) | 2012-08-06 | 2012-08-06 | Interactive plush toy |
Country Status (1)
Country | Link |
---|---|
US (1) | US20140038489A1 (en) |
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020077028A1 (en) * | 2000-12-15 | 2002-06-20 | Yamaha Corporation | Electronic toy and control method therefor |
US20040186623A1 (en) * | 2001-05-25 | 2004-09-23 | Mike Dooley | Toy robot programming |
Cited By (40)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9079113B2 (en) * | 2012-01-06 | 2015-07-14 | J. T. Labs Limited | Interactive personal robotic apparatus |
US20130178982A1 (en) * | 2012-01-06 | 2013-07-11 | Tit Shing Wong | Interactive personal robotic apparatus |
US8997697B1 (en) * | 2012-07-09 | 2015-04-07 | Perry L. Dailey | Agricultural security assembly |
US20140134918A1 (en) * | 2012-11-14 | 2014-05-15 | Hysonic. Co., Ltd | Smart toy driving system for mobile terminal |
US8939814B2 (en) * | 2012-11-14 | 2015-01-27 | Hysonic. Co., Ltd. | Smart toy driving system for mobile terminal |
US9821237B2 (en) * | 2013-04-20 | 2017-11-21 | Tito Vasquez | Character-based electronic device, system, and method of using the same |
US20140315468A1 (en) * | 2013-04-20 | 2014-10-23 | Tito Vasquez | Character-Based Electronic Device, System, and Method of Using the Same |
EP2915567A1 (en) * | 2014-03-07 | 2015-09-09 | Mooredoll Inc. | Method and device for controlling doll with app and operating the interactive doll |
US20150251102A1 (en) * | 2014-03-07 | 2015-09-10 | Mooredoll Inc. | Method and device for controlling doll with app and operating the interactive doll |
WO2015179466A1 (en) * | 2014-05-23 | 2015-11-26 | Bluniz Creative Technology Corporation | Remote interactive media |
WO2015195554A1 (en) * | 2014-06-16 | 2015-12-23 | Watry Krissa | Interactive cloud-based toy |
US9833725B2 (en) | 2014-06-16 | 2017-12-05 | Dynepic, Inc. | Interactive cloud-based toy |
US20160077788A1 (en) * | 2014-09-15 | 2016-03-17 | Conduct Industrial Ltd. | Systems and Methods for Interactive Communication Between an Object and a Smart Device |
US9931572B2 (en) * | 2014-09-15 | 2018-04-03 | Future of Play Global Limited | Systems and methods for interactive communication between an object and a smart device |
US20160206962A1 (en) * | 2014-09-15 | 2016-07-21 | Future of Play Global Limited | Systems and Methods for Interactive Communication Between an Object and a Smart Device |
CN105641936A (en) * | 2014-09-15 | 2016-06-08 | 未来游戏全球有限公司 | Systems and methods for interactive communication between object and smart device |
TWI559966B (en) * | 2014-11-04 | 2016-12-01 | Mooredoll Inc | Method and device of community interaction with toy as the center |
US20160121229A1 (en) * | 2014-11-04 | 2016-05-05 | Mooredoll Inc. | Method and device of community interaction with toy as the center |
US20160158659A1 (en) * | 2014-12-07 | 2016-06-09 | Pecoto Inc. | Computing based interactive animatronic device |
US20160171909A1 (en) * | 2014-12-15 | 2016-06-16 | Myriad Sensors, Inc. | Wireless multi-sensor device and software system for measuring physical parameters |
WO2016143905A1 (en) * | 2015-03-12 | 2016-09-15 | 株式会社博報堂 | Control system, model, and control method |
US10616310B2 (en) | 2015-06-15 | 2020-04-07 | Dynepic, Inc. | Interactive friend linked cloud-based toy |
CN106325065A (en) * | 2015-06-26 | 2017-01-11 | 北京贝虎机器人技术有限公司 | Robot interactive behavior control method, device and robot |
CN106325228A (en) * | 2015-06-26 | 2017-01-11 | 北京贝虎机器人技术有限公司 | Method and device for generating control data of robot |
US10405745B2 (en) * | 2015-09-27 | 2019-09-10 | Gnana Haranth | Human socializable entity for improving digital health care delivery |
US9914062B1 (en) * | 2016-09-12 | 2018-03-13 | Laura Jiencke | Wirelessly communicative cuddly toy |
US20180117276A1 (en) * | 2016-11-03 | 2018-05-03 | Jesal Kantawala | Childrens guided breathing and mediation, mindfulness, and yoga toy |
US10518183B2 (en) | 2017-10-27 | 2019-12-31 | Ramseen E. Evazians | Light-up toy with motion sensing capabilities |
US20190206215A1 (en) * | 2018-01-03 | 2019-07-04 | Sebourn Ferguson | Proximity Induced Messaging Device |
US11202967B2 (en) * | 2018-02-26 | 2021-12-21 | Jun Taek KANG | Toy and user-customized toy system |
US11657308B2 (en) * | 2018-07-02 | 2023-05-23 | Sap Se | Rule scenario framework for defining rules for operating on data objects |
US11094311B2 (en) | 2019-05-14 | 2021-08-17 | Sony Corporation | Speech synthesizing devices and methods for mimicking voices of public figures |
US11364442B2 (en) * | 2019-05-23 | 2022-06-21 | Disney Enterprises, Inc. | Interactive device |
US11141669B2 (en) * | 2019-06-05 | 2021-10-12 | Sony Corporation | Speech synthesizing dolls for mimicking voices of parents and guardians of children |
US20210077770A1 (en) * | 2019-09-18 | 2021-03-18 | Lisa L. Parisien | Hypnotherapy system utilizing an interactive doll and method of hypnotherapy for children |
US11577044B2 (en) * | 2019-09-18 | 2023-02-14 | Lisa Parisien | Hypnotherapy system utilizing an interactive doll and method of hypnotherapy for children |
WO2022040483A1 (en) * | 2020-08-19 | 2022-02-24 | Huge Play Inc. | Interactive, animatronic game/device partner and method for using same |
US11745105B2 (en) | 2020-08-19 | 2023-09-05 | Huge Play Inc. | Interactive animatronic game/device partner and method for using same |
CN114570034A (en) * | 2020-11-30 | 2022-06-03 | 株式会社多美 | Toy system |
US20220339549A1 (en) * | 2021-04-23 | 2022-10-27 | Ann Johnson | Wirelessly Coupled Stuffed Toy with Integrated Speaker |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20140038489A1 (en) | Interactive plush toy | |
KR102306624B1 (en) | Persistent companion device configuration and deployment platform | |
US11148296B2 (en) | Engaging in human-based social interaction for performing tasks using a persistent companion device | |
JP6902683B2 (en) | Virtual robot interaction methods, devices, storage media and electronic devices | |
US20170206064A1 (en) | Persistent companion device configuration and deployment platform | |
RU2690071C2 (en) | Methods and systems for managing robot dialogs | |
US10391636B2 (en) | Apparatus and methods for providing a persistent companion device | |
WO2016011159A1 (en) | Apparatus and methods for providing a persistent companion device | |
US20150298315A1 (en) | Methods and systems to facilitate child development through therapeutic robotics | |
US10076632B2 (en) | Sensory feedback system with active learning | |
JP6728319B2 (en) | Service providing method and system using a plurality of wake words in an artificial intelligence device | |
CN104506586A (en) | Intelligent earphone system capable of regulating volume by gesture and regulation method | |
JP2020038709A (en) | Continuous conversation function with artificial intelligence device | |
US10275222B2 (en) | Technologies for physical programming | |
WO2016206645A1 (en) | Method and apparatus for loading control data into machine device | |
US20200257954A1 (en) | Techniques for generating digital personas | |
WO2018183812A1 (en) | Persistent companion device configuration and deployment platform | |
US20220137917A1 (en) | Method and system for assigning unique voice for electronic device | |
WO2020153146A1 (en) | Information processing device and information processing method | |
Chang et al. | Intelligent Voice Assistant Extended Through Voice Relay System | |
Benassi | User Notification Interface Using Internet of Things Devices | |
Silverstein | IoT Control Device with Simplified Interface | |
Rodriguez | Location Finding of Wireless Beacons | |
Von Dehsen | Camera Lens with Display Mode | |
Thai et al. | Using Multimedia with Firmware 2.0 Systems |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: BBY SOLUTIONS, INC., MINNESOTA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SHARMA, ANSHUMAN;MCGINNIS, PATRICK;COATE, TODD;AND OTHERS;REEL/FRAME:028730/0918 Effective date: 20120802 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |