US20080318673A1 - Gaming object with biofeedback sensor for interacting with a gaming application and methods for use therewith

Gaming object with biofeedback sensor for interacting with a gaming application and methods for use therewith

Info

Publication number
US20080318673A1
Authority
US
United States
Prior art keywords
data
gaming
user
game
gaming object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/131,605
Inventor
Ahmadreza (Reza) Rofougaran
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Avago Technologies International Sales Pte Ltd
Original Assignee
Broadcom Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Broadcom Corp filed Critical Broadcom Corp
Priority to US12/131,605
Assigned to BROADCOM CORPORATION reassignment BROADCOM CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: ROFOUGARAN, AHMADREZA REZA
Publication of US20080318673A1
Assigned to BROADCOM CORPORATION reassignment BROADCOM CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: SESHADRI, NAMBIRAJAN, KARAOGUZ, JEYHAN, IBRAHIM, BRIMA B., ROFOUGARAN, MARYAM, WALLEY, JOHN
Assigned to BANK OF AMERICA, N.A., AS COLLATERAL AGENT reassignment BANK OF AMERICA, N.A., AS COLLATERAL AGENT PATENT SECURITY AGREEMENT Assignors: BROADCOM CORPORATION
Assigned to AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD. reassignment AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BROADCOM CORPORATION
Assigned to BROADCOM CORPORATION reassignment BROADCOM CORPORATION TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS Assignors: BANK OF AMERICA, N.A., AS COLLATERAL AGENT

Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 - Controlling game characters or game objects based on the game progress
    • A63F13/57 - Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A63F13/573 - Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game using trajectories of game objects, e.g. of a golf ball according to the point of impact
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 - Input arrangements for video game devices
    • A63F13/21 - Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/211 - Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 - Input arrangements for video game devices
    • A63F13/21 - Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/212 - Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 - Input arrangements for video game devices
    • A63F13/21 - Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/213 - Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 - Input arrangements for video game devices
    • A63F13/23 - Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console
    • A63F13/235 - Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console using a wireless connection, e.g. infrared or piconet
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02 - Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/06 - Systems determining position data of a target
    • G01S13/42 - Simultaneous measurement of distance and other co-ordinates
    • G01S13/426 - Scanning radar, e.g. 3D radar
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/87 - Combinations of radar systems, e.g. primary radar and secondary radar
    • G01S13/878 - Combination of several spaced transmitters or receivers of known location for determining the position of a transponder or a reflector
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/411 - Identification of targets based on measurements of radar reflectivity
    • G01S7/412 - Identification of targets based on measurements of radar reflectivity based on a comparison between measured values and known or stored values
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 - Controlling game characters or game objects based on the game progress
    • A63F13/57 - Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 - Special adaptations for executing a specific game genre or game mode
    • A63F13/825 - Fostering virtual characters
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1012 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals involving biosensors worn by the player, e.g. for measuring heart beat, limb activity
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1025 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals details of the interface with the game device, e.g. USB version detection
    • A63F2300/1031 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals details of the interface with the game device, e.g. USB version detection using a wireless connection, e.g. Bluetooth, infrared connections
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/50 - Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers
    • A63F2300/55 - Details of game data or player data management
    • A63F2300/5546 - Details of game data or player data management using player registration data, e.g. identification, account, preferences, game history
    • A63F2300/5553 - Details of game data or player data management using player registration data, e.g. identification, account, preferences, game history user representation in the game field, e.g. avatar
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/003 - Bistatic radar systems; Multistatic radar systems
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/66 - Radar-tracking systems; Analogous systems
    • G01S13/72 - Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
    • G01S13/723 - Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar by using numerical data
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 - Head tracking input arrangements
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/045 - Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using resistive elements, e.g. a single continuous surface or two parallel surfaces put in contact

Definitions

  • This invention relates generally to gaming systems and more particularly to game controllers used for interacting with a game console and an associated display.
  • Home gaming systems typically include a game controller that includes one or more buttons or a joy stick that allows a user to provide input to a game console that runs one or more games.
  • the game console is coupled to a display device such as a television set to provide audio and video output from the game.
  • an IR device, in IR communication systems, includes a transmitter, a light emitting diode (LED), a receiver, and a silicon photodiode.
  • the transmitter modulates a signal, which drives the LED to emit infrared radiation which is focused by a lens into a narrow beam.
  • the receiver, via the silicon photodiode, receives the narrow-beam infrared radiation and converts it into an electric signal.
  • IR communications are used in video games to detect the direction in which a game controller is pointed.
  • an IR sensor is placed near the game display, where the IR sensor detects the IR signal transmitted by the game controller. If the game controller is too far away, too close, or angled away from the IR sensor, the IR communication will fail.
  • Further advances in video gaming include placing three accelerometers in the game controller to detect motion by way of acceleration.
  • the motion data is transmitted to the game console via a Bluetooth wireless link.
  • the Bluetooth wireless link may also transmit the IR direction data to the game console and/or convey other data between the game controller and the game console.
  • the IR communication has a limited area in which a player can be for the IR communication to work properly.
  • the accelerometer only measures acceleration such that true one-to-one detection of motion is not achieved.
  • the gaming motion is limited to a handful of directions (e.g., horizontal, vertical, and a few diagonal directions).
  • FIG. 1 is a schematic block diagram of an overhead view of an embodiment of a gaming system in accordance with an embodiment of the present invention.
  • FIG. 2 is a schematic block diagram of a side view of an embodiment of a gaming system in accordance with an embodiment of the present invention.
  • FIG. 3 is a schematic block diagram of an overhead view of another embodiment of a gaming system in accordance with an embodiment of the present invention.
  • FIG. 4 is a block diagram representation of a gaming system in accordance with an embodiment of the present invention.
  • FIG. 5 is a diagram of an example of the collection of image data by the gaming object 259 in accordance with an embodiment of the present invention.
  • FIG. 6 is a diagram of an example of positioning and/or motioning of a game controller to interact with an item on the display of a game console in accordance with an embodiment of the present invention.
  • FIG. 7 is a diagram of an example of positioning and/or motioning of a game controller to interact with an item on the display of a game console in accordance with another embodiment of the present invention.
  • FIG. 8 is a diagram of a method for processing a position and/or motion based selection in accordance with an embodiment of the present invention.
  • FIG. 9 is a diagram of a method for processing a position and/or motion based gaming action in accordance with an embodiment of the present invention.
  • FIGS. 10-12 are diagrams of an embodiment of a coordinate system of a gaming system in accordance with an embodiment of the present invention.
  • FIGS. 13-15 are diagrams of another embodiment of a coordinate system of a gaming system in accordance with an embodiment of the present invention.
  • FIG. 16 is a diagram of a method for determining position and/or motion tracking in accordance with an embodiment of the present invention.
  • FIG. 17 is a diagram of another method for determining position and/or motion tracking in accordance with an embodiment of the present invention.
  • FIG. 18 is a diagram of another method for determining position and/or motion tracking in accordance with an embodiment of the present invention.
  • FIG. 19 is a diagram of another embodiment of a coordinate system of a gaming system in accordance with an embodiment of the present invention.
  • FIG. 20 is a schematic block diagram of an embodiment of an RFID reader and an RFID tag in accordance with an embodiment of the present invention.
  • FIG. 21 is a schematic block diagram of a user's hand grasping a gaming object with a capacitive sensor in a first manner in accordance with an embodiment of the present invention.
  • FIG. 22 is a schematic block diagram of a user's hand grasping a gaming object with a capacitive sensor in a second manner in accordance with an embodiment of the present invention.
  • FIG. 23 is a flowchart representation of a method in accordance with an embodiment of the present invention.
  • FIG. 1 is a schematic block diagram of an overhead view of an embodiment of a gaming system that includes a game console and a gaming object.
  • a video display 98 is shown that can be coupled to game console 100 to display video generated by game console 100 in conjunction with the set-up and playing of the game and to provide other user interface functions of game console 100 .
  • game console 100 can include its own integrated video display that displays, either directly or via projection, video content in association with any of the functions described in conjunction with video display 98 .
  • the gaming system has an associated physical area in which the game console and the gaming object are located.
  • the physical area may be a room, portion of a room, and/or any other space where the gaming object and game console are proximally co-located (e.g., airport terminal, at a gaming center, on an airplane, etc.).
  • the physical area includes desk 92 , chair 94 and couch 96 .
  • the gaming object 110 may be a wireless game controller and/or any object used or worn by the player to facilitate play of a video game.
  • the gaming object 110 may include a simulated sword, a simulated gun, a paddle, racquet, bat, or other sporting good, a helmet, a vest, a hat, shoes, socks, pants, shorts, gloves, or other element of a costume or article of clothing, a guitar, baton, keyboard, or other music related item, etc.
  • the gaming object 110 may represent or resemble another object from the game, may be coupled to an object that is worn or otherwise coupled to a user or be as simple as a standard box, pod or other object that is held by the user.
  • the functionality of gaming object 110 can be included in a multi-function device such as a mobile telephone, personal digital assistant, or other personal electronic device that performs other non-gaming functions.
  • the game console 100 optionally determines the orientation of the gaming object 110 within the physical area using one or more orientation sensors.
  • the game console 100 can further track the motion of the gaming object using one or more motion tracking techniques to facilitate video game play.
  • the game console may determine an initial orientation and/or position of the gaming object 110 within a tolerance (e.g., within a meter and/or within 1-5 degrees) at an update rate (e.g., once every second or once every few seconds) and track the motion or changes in the orientation within a motion tracking tolerance (e.g., within a few millimeters) at a tracking update rate (e.g., once every 10-100 milliseconds) based on motion data and/or orientation data generated in response to the actions of a user.
  • the gaming object 110 can be an object that can include a joystick, touch pad, touch screen, wheel, one or more buttons and/or other sensor, actuator or other user interface device that generates other user data in response to the actions of a user.
  • the gaming object 110 and gaming console 100 communicate via wireless transceivers over a wireless communication link that will be described in greater detail in conjunction with FIG. 4 .
  • Game console 100 generates display data for display on a display device such as video display 98 . While shown as a home game console 100 , gaming object 110 can optionally communicate with other game devices such as an arcade game, a game server that is connected to a local area network, a communication network or public data network such as the Internet, or other game devices.
  • gaming object 110 may optionally communicate with a base station or access point that transfers communications to and from the gaming object 110 via a local area network, a communication network or public data network such as the Internet.
  • the video display 98 displays one or more interactive items in the set-up or execution of at least one game or otherwise in association with a gaming application executed by the game console 100 .
  • These interactive items are interactive in response to the orientation data generated based on one or more orientation sensors and other interaction data. For instance, during the initiation of a game, one or more menu items can be displayed on the video display 98 for selection by the user via pointing the gaming object at the menu item and selecting the menu item by the press of a button.
  • the user can “shoot” at an interactive item on the video display 98 , such as a clay pigeon displayed in conjunction with a skeet shooting game, by pointing the gaming object 110 at the clay pigeon and pressing a button or trigger to initiate a shot.
  • an interactive item on the video display 98 such as a clay pigeon displayed in conjunction with a skeet shooting game
  • the gaming object 110 includes a sensor that generates user data in the form of biofeedback data in response to an action of a user.
  • the gaming object sends an RF signal that indicates the biofeedback data to a game device such as game console 100.
  • the game device executes a gaming application that is based on the biofeedback data. In this fashion, the set-up, user authentication, and/or operation of a game can be adjusted based on the biofeedback data.
  • FIG. 2 is a schematic block diagram of a side view of an embodiment of a gaming system of FIG. 1 .
  • a user 106 is represented schematically as holding a particular gaming object 110 in his or her hand or hands.
  • User data 102 and orientation data 105 are generated by the gaming object 110 and communicated via a wireless communication path 104 to the game console 100 .
  • the user data 102 and orientation data 105 can include user selections, commands, position data indicating the position, orientation, and/or motion of the gaming object 110 , or other user data that is generated based on the actions of the user in conjunction with the playing and set-up of a particular game and/or the user's other interactions with the game console 100 .
  • FIG. 3 is a schematic block diagram of an overhead view of another embodiment of a gaming system that includes a game console, a plurality of players and a plurality of gaming objects.
  • game console 100 communicates with both gaming object 110 and gaming object 110 ′ and receives corresponding user data and orientation data, such as user data 102 and orientation data 105 , from each gaming object.
  • game console 100 operates on a separate frequency for each device; however, time division multiplexing, carrier sense multiple access with collision avoidance (CSMA/CA), or other multiple access techniques can likewise be used.
  • FIG. 4 is a block diagram representation of a gaming system in accordance with an embodiment of the present invention.
  • a gaming system is shown that includes game console 100 and gaming object 110 .
  • Gaming object 110 includes an actuator 114 for generating user data, such as user data 102 in response to the actions of a user, such as user 106 .
  • Actuator 114 can include a microphone, button, joystick, wheel, keypad, keyboard, a resistive strip, touch pad or touch screen, and/or a motion sensor (such as an on-chip gyroscope or accelerometer or other position or motion sensing device) along with other driver circuitry for generating user data 102 based on the actions of the user 106 .
  • the actuator 114 includes a capacitive or resistive sensor, such as a resistive or capacitive touchpad or touch screen.
  • the resistive or capacitive sensor can be operable to generate user data 102 that includes an audio output command, such as to change a volume setting or to select, enable or disable background music or other audio effects, or an audio input command that enables or disables voice commands or sets an input level or an input device.
  • the capacitive sensor can generate set-up commands, gaming data, preferences data, product registration data, and/or authentication data or other user data 102 in response to the actions of a user, such as user 106 .
  • the actuator 114 includes a resistive sensor, capacitive sensor, microphone, optical sensor or other sensor that generates user data 102 that includes biofeedback data that can be used by game console 100 to adjust a game parameter of the gaming application based on the biofeedback data. For instance, in an adventure game, an excitement level of the game can be reduced in response to biofeedback indicating a heart rate or level of perspiration that is too high or increasing too rapidly. In another embodiment, the game can sense the fear of a user via biofeedback that indicates a high heart rate or level of perspiration.
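  • As an illustration of how such biofeedback data might drive an adjustment, the following Python sketch scales an excitement parameter down when the reported heart rate is high or rising quickly; the function name, thresholds, and scaling factors are assumptions for this example, not values from the patent.

```python
# Hypothetical sketch: ease off game intensity when biofeedback shows a heart rate
# that is too high or climbing too fast, as in the adventure-game example above.

def adjust_excitement(excitement: float, heart_rate_bpm: float,
                      prev_heart_rate_bpm: float, dt_s: float) -> float:
    """Return an updated excitement level clamped to [0.0, 1.0]."""
    rise = (heart_rate_bpm - prev_heart_rate_bpm) / dt_s   # bpm per second
    if heart_rate_bpm > 140.0 or rise > 5.0:               # too high or rising too fast
        excitement *= 0.8                                   # reduce the excitement level
    elif heart_rate_bpm < 90.0 and rise <= 0.0:
        excitement *= 1.05                                  # gently ramp back up
    return max(0.0, min(1.0, excitement))

# Example: heart rate jumped from 120 to 150 bpm over five seconds.
print(adjust_excitement(0.9, 150.0, 120.0, 5.0))            # -> 0.72
```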
  • Optional orientation sensor 112 can include a photosensor that generates the orientation data 105 based on an optical signal from a video display such as a video display integrated in game console 100 or separate video display 98 . In this fashion, the optical signal can be used to generate orientation data 105 that represents the orientation of the gaming object 110 .
  • orientation sensor 112 includes a plurality of sensors such as motion sensors, RF tags or other sensors that generate orientation data that indicates the orientation of the gaming object 110 based on the relative positions of the plurality of sensors.
  • Transceiver 120 sends data, such as user data 102 and orientation data 105 , to transceiver 130 of game console 100 via RF signals 108 .
  • gaming object 110 optionally receives RF signals 108 from game console 100 that contain other gaming data such as control data, optional display data for display on a touch screen or other display screen incorporated in gaming object 110 .
  • Gaming object 110 optionally contains a processor 122 ′, memory 124 ′ and bus 125 ′.
  • processor 122 ′ can execute one or more applications to perform the operation of a smart gaming controller, to facilitate the generation and transmission of user data 102 and orientation data 105 , to perform other gaming operations and to optionally perform non-gaming functions and applications.
  • Transceiver 120 can communicate with transceiver 130 via a wireless telephony protocol operating in a short range or low power mode, via a Bluetooth standard interface, via an 802.11 or other wireless local area network protocol, or via another wireless protocol.
  • transceiver 120 is coupled to receive an RF signal 108 initiated by game console 100 , such as a 60 GHz RF signal or other RF signal.
  • transceiver 120 converts energy from the RF signal 108 into a power signal for powering the transceiver 120 or all or some portion of the gaming object 110 .
  • with the gaming object 110 deriving power, in whole or in part, based on RF signal 108 , gaming object 110 can optionally be portable, small and light.
  • Transceiver 120 conveys the user data 102 and orientation data 105 back to the game console 100 by backscattering the RF signal 108 based on user data 102 and orientation data 105 .
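  • The backscatter exchange can be pictured as the gaming-object transceiver switching between reflecting and absorbing the console's RF signal 108 once per data bit. The sketch below is a conceptual model only; the bit-level encoding is an assumption, not the patent's modulation scheme.

```python
# Conceptual sketch of backscatter signaling: the gaming object does not generate its
# own carrier; it toggles its antenna load so each bit of user/orientation data is
# either reflected (1) or absorbed (0), and the reader reassembles the bytes.

from typing import Iterator, List

def backscatter_bits(payload: bytes) -> Iterator[int]:
    """Yield the reflection state (1 = reflect, 0 = absorb) for each payload bit."""
    for byte in payload:
        for bit in range(7, -1, -1):          # most significant bit first
            yield (byte >> bit) & 1

def reader_recover(reflections: List[int]) -> bytes:
    """Reassemble bytes from the observed sequence of reflection states."""
    out = bytearray()
    for i in range(0, len(reflections) - 7, 8):
        value = 0
        for bit in reflections[i:i + 8]:
            value = (value << 1) | bit
        out.append(value)
    return bytes(out)

assert reader_recover(list(backscatter_bits(b"orientation"))) == b"orientation"
```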
  • Game console 100 includes an interface module 132 for coupling to the gaming object 110 .
  • interface module 132 includes transceiver 130 that communicates with transceiver 120 either directly or via a network.
  • Game console 100 further includes a memory 124 and processor 122 that are coupled to interface module 132 via a bus 125 .
  • processor 122 executes one or more routines such as an operating system, utilities, and one or more applications such as video game applications or other gaming applications that produce video information that is converted to display signal 128 via driver 126 .
  • Processors 122 and 122 ′ can each include a dedicated or shared processing device.
  • a processing device may be a microprocessor, micro-controller, digital signal processor, microcomputer, central processing unit, field programmable gate array, programmable logic device, state machine, logic circuitry, analog circuitry, digital circuitry, and/or any device that manipulates signals (analog and/or digital) based on operational instructions.
  • the memories 124 and 124 ′ can each be a single memory device or a plurality of memory devices.
  • Such a memory device may be a read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, and/or any device that stores digital information.
  • when processors 122 or 122 ′ implement one or more of their functions via a state machine, analog circuitry, digital circuitry, and/or logic circuitry, the memory storing the corresponding operational instructions is embedded with the circuitry comprising the state machine, analog circuitry, digital circuitry, and/or logic circuitry. While particular bus architectures are shown, alternative bus architectures, including architectures having two or more buses or direct connectivity between the various modules of game console 100 and gaming object 110 , can likewise be employed within the broad scope of the present invention.
  • the game console 100 can generate display data for display on a display device that contains at least one interactive item that is interactive in response to the orientation data 105 and interaction data included in user data 102 .
  • an optic sensor in a gaming object 110 that simulates a gun can generate optical feedback to determine if the “gun” is pointed at a particular object, such as a clay pigeon, that is displayed on the screen.
  • when interaction data is generated, such as by the user 106 simulating the pull of a trigger, while the gaming object 110 is pointed at the interactive item, the interaction can result.
  • in the example of the clay pigeon discussed above, the clay pigeon can be shown to be broken by the simulated "shot" from the simulated gun.
  • game console 100 can display an interactive menu having menu items that are selectable by pointing the gaming object 110 at the menu item and generating interaction data that indicates the user's intent to select the item.
  • Game console 100 optionally includes a recognition module 128 that operates to recognize one or more patterns in biofeedback data, such as voice data, image data or other biofeedback data and to generate a recognition signal in response thereto.
  • recognition module 128 can perform pattern recognition on the voice data to recognize and/or authenticate the speaker as corresponding to a particular registered user.
  • the gaming object 110 can prompt the user to generate voice samples used to train a speaker recognition routine included in recognition module 128 .
  • the recognition module 128 can respond by generating a recognition signal to processor 122 that indicates this correspondence.
  • recognition module 128 can perform pattern recognition on the voice data to recognize voice commands such as game commands, set-up commands, other commands or other biofeedback via either speaker dependent speech recognition that operates based on training data received from gaming object 110 or via speaker independent speech recognition.
  • the recognition module 128 can respond by generating a recognition signal to processor 122 that indicates this correspondence for use by the gaming application. In this fashion, a user can issue voice commands such as “jump”, “stop”, “go back”, “commence firing” or any other commands used in conjunction with a gaming application.
  • a shout generated by the user can be recognized as a shout and used by a game to alert characters of the game, to modify an anger or fear level or other emotional state of the user's character in the game or other characters in the game, or to modify one or more other game parameters.
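  • A minimal sketch of how recognized speech might be consumed by the gaming application is shown below; the command table, function names, and the alertness adjustment are hypothetical and only illustrate the mapping from a recognition signal to a game action.

```python
# Hypothetical mapping from a recognized phrase (or shout) to a game action.
from typing import Optional

GAME_COMMANDS = {"jump": "JUMP", "stop": "STOP", "go back": "MOVE_BACK",
                 "commence firing": "FIRE"}

def handle_voice_event(phrase: str, is_shout: bool, character_state: dict) -> Optional[str]:
    if is_shout:
        # A shout can alert nearby characters and raise a fear/anger parameter.
        character_state["alertness"] = min(1.0, character_state.get("alertness", 0.0) + 0.3)
        return "ALERT_NEARBY_CHARACTERS"
    return GAME_COMMANDS.get(phrase.lower())      # None if the phrase is not a command

state = {}
print(handle_voice_event("Commence firing", False, state))   # -> FIRE
print(handle_voice_event("", True, state))                    # -> ALERT_NEARBY_CHARACTERS
```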
  • recognition module 128 can perform pattern recognition on the image data to recognize and/or authenticate the image as corresponding to a particular registered user.
  • the gaming object 110 can prompt the user to generate image samples used to train an image recognition routine included in recognition module 128 .
  • the recognition module 128 can respond by generating a recognition signal to processor 122 that indicates this correspondence.
  • recognition module 128 can perform pattern recognition on the image data to recognize portions of the image as corresponding to an emotional response of a particular registered user.
  • analysis of the user's expression based on the eyes, mouth, eyebrows, brow or other features can be used to determine if the user is fatigued, afraid, angry, sad, disappointed, excited, etc.
  • the recognition of one or more of these emotional states by recognition module 128 can be used to generate a corresponding recognition signal that can be used by processing module 122 to modify the emotional state of the user's character in the game or other characters in the game, or to modify one or more other game parameters.
  • FIG. 5 is a diagram of an example of the collection of image data by a gaming object 259 , such as gaming object 110 , in accordance with an embodiment of the present invention.
  • image sensor 382 can be a charge coupled device (CCD) or other image sensor that generates image data in the form of either a still image or video.
  • image sensor 382 operates by capturing an image that can be used to generate user data 102 in the form of image data.
  • while the image 380 is shown to correspond to a head shot of a user, such as user 106 , the image can correspond to a fingerprint, palm print, face, retina or other portion of a user's body.
  • this image data can be used as biofeedback data for authentication, modifying game parameters or for other purposes in conjunction with the set-up or execution of one or more gaming applications.
  • FIG. 6 is a diagram of an example of positioning and/or motioning of a game controller to select an item on the display of a game console.
  • a game controller 260 , such as gaming object 110 , and the game console utilize tracking of the orientation of the controller to provide a selection of a menu item displayed on a video display associated with game console 100 .
  • Gaming object 260 can be a geometric solid, such as a handheld device, that can be positioned and oriented in three dimensional space.
  • the gaming object 260 can have three dimensional coordinates (x, y, z) and be oriented along roll, pitch and yaw axes based on, for instance, up/down and side-to-side motion, rotation, tilt and translation and rotation about other axes.
  • gaming object 260 includes an orientation sensor 112 , such as optical sensor 136 , that generates orientation data 105 when the orientation of gaming object 260 corresponds to an orientation in alignment with the menu item.
  • the light emitted by “item 2” in the menu is received by the optical sensor and used to generate orientation data 105 .
  • the game console 100 can highlight the menu item when the orientation of the gaming object 260 corresponds to an orientation aligned with the menu item.
  • the “item 2” is highlighted when the gaming object is pointed at this menu item. This provides visual feedback to a user of gaming object 260 of hat item the gaming object 260 is pointed at. If the user indicates his or her selection of the highlighted item, via an actuator 114 (such as by the click of a button), game console 100 can respond by performing the function associated with this menu item in conjunction with the particular menu displayed.
  • the orientation data is preprocessed in the optical sensor 136 or processing module 122 ′ based on an image generated therefrom to generate orientation data 105 .
  • orientation data 105 corresponding to the image or other optical output is sent to game console 100 for processing by processing module 122 to determine whether the orientation data corresponds to any of the menu items being displayed, based on timing of the signal in correspondence to the timing of the displayed image, or based on recognition of an image or portion of an image corresponding to the displayed item or a portion thereof.
  • the interactive item displayed on display 98 can alternatively be a graphics item displayed in conjunction with a game.
  • an interaction is generated, such as the breaking of the clay pigeon, when the orientation of the gaming object 260 corresponds to an orientation in alignment with the graphics item on display 98 .
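  • One simple way to resolve which displayed item the gaming object is aligned with, once its orientation has been mapped to display coordinates, is a point-in-rectangle test over the items currently on screen. The sketch below assumes that mapping already exists; the data layout and names are illustrative, not taken from the patent.

```python
# Hypothetical hit test: given the display coordinates that the gaming object's
# orientation maps to, return the interactive item (menu entry or graphics item)
# that should be highlighted or acted upon.
from typing import List, Optional, Tuple

Item = Tuple[str, int, int, int, int]          # (name, x_min, y_min, x_max, y_max)

def aligned_item(pointer_xy: Tuple[int, int], items: List[Item]) -> Optional[str]:
    x, y = pointer_xy
    for name, x0, y0, x1, y1 in items:
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

menu = [("item 1", 100, 100, 500, 150),
        ("item 2", 100, 160, 500, 210),
        ("item 3", 100, 220, 500, 270)]
print(aligned_item((300, 180), menu))           # -> "item 2", which would be highlighted
```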
  • FIG. 7 is a diagram of an example of positioning and/or motioning of a game controller to interact with an item on the display of a game console in accordance with another embodiment of the present invention.
  • gaming object 261 , such as gaming object 110 , includes sensing tags 140 for use in generating orientation data 105 that indicates the orientation of the gaming object 261 .
  • the relative position of the sensing tags 140 in three-dimensional space can be used to determine the orientation of the gaming object 261 .
  • the positioning of the sensing tags can be determined within a positioning tolerance (e.g., within a meter) at a positioning update rate (e.g., once every second or once every few seconds), and the motion of the sensing tags 140 can be tracked within a motion tracking tolerance (e.g., within a few millimeters) at a motion tracking update rate (e.g., once every 10-100 milliseconds) within a position and motion tracking area that is in range of game console 100 .
  • each of the sensing tags 140 is implemented via an RF tag.
  • the game console 100 sends one or more RF signals 108 on a continuous basis and reads the orientation data 105 generated by each of the sensing tags 140 periodically (e.g., once every 10-100 milliseconds) to update the positioning of sensing tags 140 .
  • the game console 100 sends one or more RF signals 108 periodically (e.g., once every 10-100 milliseconds) and reads the orientation data 105 generated by each of the sensing tags 140 only when required to update the orientation of game object 261 .
  • the sensing tags 140 can be RF tags that contain motion sensors or other position sensors and the game object 261 itself reads the position of each of the sensing tags 140 and generates orientation data 105 that is compiled and sent to the game console 100 .
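  • As one possible (not patent-specified) way to turn the tracked positions of several sensing tags into orientation data, the sketch below derives a forward pointing vector and an up vector from three tag positions rigidly mounted on the gaming object.

```python
# Hedged sketch: estimate the gaming object's orientation from three sensing-tag
# positions (tip and rear lie along the object's axis; side is offset from that axis).
import numpy as np

def orientation_from_tags(tip: np.ndarray, rear: np.ndarray, side: np.ndarray):
    forward = tip - rear
    forward = forward / np.linalg.norm(forward)            # pointing direction
    lateral = side - rear
    lateral = lateral - np.dot(lateral, forward) * forward # remove the axial component
    lateral = lateral / np.linalg.norm(lateral)
    up = np.cross(forward, lateral)                        # completes the frame
    return forward, up

fwd, up = orientation_from_tags(np.array([0.0, 0.0, 1.0]),
                                np.array([0.0, 0.0, 0.0]),
                                np.array([0.1, 0.0, 0.5]))
print(fwd, up)                                             # -> [0 0 1] and [0 1 0]
```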
  • FIG. 8 is a diagram of a method for processing a position and/or motion based selection that begins by placing the controller and/or gaming console in a menu selection mode as shown in step 330 .
  • the controller is set up to process a menu selection as opposed to a gaming function.
  • the method continues by establishing the gaming object 110 's current position and orientation with respect to an initial position in a display area as shown in step 332 . For example, regardless of the current position and orientation (assuming it is in range), the gaming object 110 's current position and orientation is processed to correspond to a particular location on the menu display.
  • the method proceeds by highlighting the menu item corresponding to the initial position (e.g., a start menu button) as shown in step 334 .
  • the method then continues by tracking the motion of the gaming object and mapping the motion to coordinates of the menu display area (e.g., in an embodiment, the mapping of the motion will be limited to somewhere within the menu display area) as shown in steps 336 and 338 .
  • the method continues by determining whether the motion has moved to another item in the menu list as shown in step 340 . If yes, the method proceeds by highlighting the new item as shown in step 342 .
  • the method then proceeds by determining whether a selection of the highlighted item is received as shown in step 344 . If not, the process continues by tracking the motion in step 336 . If a selection is received, the process continues by processing the menu selection as shown in step 346 . This may be done in a conventional manner.
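  • The FIG. 8 flow can be summarized by the loop below; the helper callbacks (track_motion, map_to_menu, highlight, selection_received, process_selection) are assumed interfaces used only to illustrate the sequence of steps 334-346.

```python
# Sketch of the menu-selection flow of FIG. 8 under assumed helper functions.
def menu_selection_loop(menu_items, track_motion, map_to_menu,
                        highlight, selection_received, process_selection):
    current = menu_items[0]                       # step 334: highlight the initial item
    highlight(current)
    while True:
        motion = track_motion()                   # step 336: track gaming-object motion
        item = map_to_menu(motion)                # step 338: map motion to the menu area
        if item is not None and item != current:  # step 340: moved to another item?
            current = item
            highlight(current)                    # step 342: highlight the new item
        if selection_received():                  # step 344: selection received?
            return process_selection(current)     # step 346: process the menu selection
```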
  • FIG. 9 is a diagram of a method for processing a position and/or motion based gaming action that begins by placing the gaming object (e.g., a controller) and/or game console in a gaming mode as shown in step 350 . The method continues by establishing the gaming object's current position and orientation with respect to an initial position in a gaming display area as shown in step 352 . For example, if the game being played is a shooting arcade game and the gaming object is functioning as a gun, this step determines the initial aiming of the gun.
  • the method continues by determining whether the position and orientation of the gaming object is within the gaming display area as shown in step 354 . If yes, the method continues by providing a display icon corresponding to the position and orientation as shown in step 356 .
  • the icon may be cross hairs of a gun to correspond to the aiming of the video game gun.
  • the method continues by tracking the motion of the gaming object and mapping the motion to the gaming display area as shown in steps 358 and 360 .
  • the method continues by determining whether an action has been received as shown in step 362 . For example, has the trigger of the gun been pulled? If not, the process repeats as shown. If yes, the process continues by processing the action as shown in step 364 .
  • the processing may include mapping the shooting of the gun in accordance with the aiming of the gun.
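  • For the aiming example, the display icon (e.g., the gun's cross hairs) can be placed by intersecting the gaming object's pointing ray with the plane of the display; the geometry below is an assumed model used only to illustrate steps 354-356, not a construction taken from the patent.

```python
# Hypothetical sketch: map the gaming object's position and forward vector to a
# cross-hair location on the display plane z = display_z, or None if the aim falls
# outside the gaming display area.
import numpy as np

def crosshair_on_display(position: np.ndarray, forward: np.ndarray,
                         display_z: float, width: float, height: float):
    if abs(forward[2]) < 1e-9:
        return None                               # pointing parallel to the screen
    t = (display_z - position[2]) / forward[2]
    if t <= 0:
        return None                               # pointing away from the screen
    hit = position + t * forward
    x, y = float(hit[0]), float(hit[1])
    if 0.0 <= x <= width and 0.0 <= y <= height:
        return (x, y)                             # step 356: draw the icon here
    return None                                   # step 354: outside the display area
```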
  • FIGS. 10-12 are diagrams of an embodiment of a coordinate system of a localized physical area that may be used for a gaming system.
  • an xyz origin is selected to be somewhere in the localized physical area and each point being tracked and/or used for positioning on the player and/or on the gaming object 110 is determined based on its Cartesian coordinates (e.g., x1, y1, z1).
  • the new position of the tracking and/or positioning points are determined in Cartesian coordinates with respect to the origin.
  • the positions of the sensing tags 140 can be used to determine an orientation of the gaming object 110 .
  • FIGS. 13-15 are diagrams of another embodiment of a coordinate system of a localized physical area that may be used for a gaming system.
  • an origin is selected to be somewhere in the localized physical area and each point being tracked, such as the position of each sensing tag 140 or other position used for determining the positioning or orientation of the gaming object 110 , is determined based on its vector, or spherical, coordinates (ρ, φ, θ), which are defined as: ρ ≥ 0 is the distance from the origin to a given point P; 0 ≤ φ ≤ 180° is the angle between the positive z-axis and the line formed between the origin and P; 0 ≤ θ ≤ 360° is the angle between the positive x-axis and the line from the origin to P projected onto the xy-plane.
  • φ is referred to as the zenith, colatitude or polar angle, while θ is referred to as the azimuth.
  • To plot a point from its spherical coordinates, go ρ units from the origin along the positive z-axis, rotate φ about the y-axis in the direction of the positive x-axis and rotate θ about the z-axis in the direction of the positive y-axis.
  • the new position of the tracking and/or positioning points are determined in vector, or spherical, coordinates with respect to the origin that can be used to determine not only the position of the gaming object 110 but its orientation as well.
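  • For reference, the conversion between the spherical convention described above and Cartesian coordinates is shown below; this is the standard relationship rather than a formula stated in the patent.

```python
# Standard spherical <-> Cartesian conversion with phi the zenith angle from +z and
# theta the azimuth from +x, matching the convention described above.
import math

def spherical_to_cartesian(rho: float, phi_deg: float, theta_deg: float):
    phi, theta = math.radians(phi_deg), math.radians(theta_deg)
    return (rho * math.sin(phi) * math.cos(theta),
            rho * math.sin(phi) * math.sin(theta),
            rho * math.cos(phi))

def cartesian_to_spherical(x: float, y: float, z: float):
    rho = math.sqrt(x * x + y * y + z * z)
    phi = math.degrees(math.acos(z / rho)) if rho else 0.0
    theta = math.degrees(math.atan2(y, x)) % 360.0
    return rho, phi, theta

print(cartesian_to_spherical(*spherical_to_cartesian(2.0, 60.0, 45.0)))  # ~ (2.0, 60.0, 45.0)
```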
  • while FIGS. 10-15 illustrate two types of coordinate systems, other three-dimensional coordinate systems may be used for tracking motion and/or establishing position and orientation within a gaming system.
  • FIG. 16 is a diagram of another method for determining position and/or motion tracking that begins in step 300 by determining a reference point within a coordinate system (e.g., the vector coordinate system of FIGS. 9-11 ).
  • the reference point may be the origin or any other point within the localized physical area.
  • the reference point can be the location of the game console 100 , the location of the game object 110 at a particular time, such as a set-up time, or the location of one of a plurality of sensing tags 140 ; however, other reference points can likewise be used.
  • the method continues in one or more branches.
  • a vector with respect to the reference point is determined to indicate the initial position of the gaming object 110 and/or the sensing tags 140 based on the reference point as shown in step 302 .
  • This branch continues by updating the positions to track the motion and/or orientation of gaming object 110 based on orientation data 105 as shown in step 304 .
  • the other branch includes determining a vector with respect to the reference point for the gaming object 110 to establish its initial position as shown in step 306 .
  • This branch continues by updating the gaming object 110 's position to track the gaming object's motion using orientation data as shown in step 308 .
  • the motion of the player and/or gaming object may be tracked at a rate based on the video game being played and the expected speed of motion. Further note that a tracking rate of 10 milliseconds provides 0.1 mm accuracy in motion tracking.
  • FIG. 17 is a diagram of another method for determining position and/or motion tracking that begins in step 310 by determining the coordinates of the sensing tags' positions in the physical area. The method then continues by determining the coordinates of a gaming object's initial position as shown in step 312 . The method then proceeds by updating the coordinates of the sensing tags' positions in the physical area to track the gaming object's orientation as shown in step 314 . The method also continues by updating the coordinates of the gaming object's position to track its motion as shown in step 316 .
  • FIG. 18 is a diagram of another method for determining position and/or motion tracking that begins in step 320 by determining a reference point within the physical area in which the gaming object lies and/or in which the game system lies. The method then proceeds by determining a vector for the sensing tags' initial positions with respect to a reference point of a coordinate system (e.g., one of the systems shown in FIGS. 12-14 ) as shown in step 322 .
  • the method then continues by determining a vector of a gaming object 110 's initial position as shown in step 324 .
  • the method then proceeds by updating the vector of the sensing tag's position in the physical area to track the gaming object's orientation as shown in step 326 .
  • the method also continues by updating the vector of the gaming object's position to track its motion as shown in step 328 .
  • FIG. 19 is a diagram of another embodiment of a coordinate system of a gaming system that is an extension of the coordinate systems discussed above.
  • the coordinate system includes a positioning coordinate grid and a motion tracking grid, where the motion tracking grid is of a finer resolution than the positioning coordinate grid.
  • the player or gaming object 110 's position within the physical area can have a first tolerance (e.g., within a meter) and the motion tracking of the player and/or the gaming object has a second tolerance (e.g., within a few millimeters).
  • the position of the player and/or gaming object can be updated infrequently in comparison to the updating of the motion (e.g., the position can be updated once every second or so while the motion may be updated once every 10 milliseconds).
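  • A two-rate update loop of the kind described above might look like the following sketch; the reader callbacks and the exact periods are assumptions chosen to match the example tolerances (coarse position about once a second, fine motion deltas every 10 ms).

```python
# Minimal sketch of a coarse-positioning / fine-motion-tracking loop.
import time
import numpy as np

def tracking_loop(read_coarse_position, read_fine_motion, update_game,
                  coarse_period_s=1.0, fine_period_s=0.01):
    """read_coarse_position() -> np.ndarray position; read_fine_motion() -> np.ndarray delta."""
    position = read_coarse_position()               # meter-scale positioning grid
    last_coarse = time.monotonic()
    while True:
        position = position + read_fine_motion()    # millimeter-scale motion-tracking grid
        if time.monotonic() - last_coarse >= coarse_period_s:
            position = read_coarse_position()        # periodically re-anchor the position
            last_coarse = time.monotonic()
        update_game(position)
        time.sleep(fine_period_s)
```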
  • FIG. 20 is a schematic block diagram of an embodiment of an RFID reader and an RFID tag.
  • RFID reader 205 represents a particular implementation of transceiver 130 and RFID tag 235 represents a particular implementation of transceiver 120 .
  • the RFID tag 235 can be used in an implementation of sensing tags 140 in communication with RFID reader 205 incorporated in game console 100 .
  • RFID reader 205 includes a protocol processing module 40 , an encoding module 42 , an RF front-end 46 , a digitization module 48 , a predecoding module 50 and a decoding module 52 , all of which together form components of the RFID reader 205 .
  • RFID reader 205 optionally includes a digital-to-analog converter (DAC) 44 .
  • the protocol processing module 40 is operably coupled to prepare data for encoding in accordance with a particular RFID standardized protocol.
  • the protocol processing module 40 is programmed with multiple RFID standardized protocols to enable the RFID reader 205 to communicate with any RFID tag, regardless of the particular protocol associated with the tag.
  • the protocol processing module 40 operates to program filters and other components of the encoding module 42 , decoding module 52 , pre-decoding module 50 and RF front end 46 in accordance with the particular RFID standardized protocol of the tag(s) currently communicating with the RFID reader 205 .
  • however, if a plurality of RFID tags 235 each operate in accordance with a single protocol, this flexibility can be omitted.
  • In operation, once the particular RFID standardized protocol has been selected for communication with one or more RFID tags, such as RFID tag 235, the protocol processing module 40 generates and provides digital data to be communicated to the RFID tag 235 to the encoding module 42 for encoding in accordance with the selected RFID standardized protocol.
  • This digital data can include commands to power up the RFID tag 235 , to read user data or other commands or data used by the RFID tag in association with its operation.
  • the RFID protocols may include one or more line encoding schemes, such as Manchester encoding, FM0 encoding, FM1 encoding, etc.
  • the encoded data is provided to the decoding module 52 , which recaptures data, such as user data 102 and/or orientation data 105 therefrom in accordance with the particular encoding scheme of the selected RFID protocol.
  • the protocol processing module 40 processes the recovered data to identify the object(s) associated with the RFID tag(s) and/or provides the recovered data to the processing module 122 for further processing.
  • the processing module 40 may be a single processing device or a plurality of processing devices.
  • a processing device may be a microprocessor, micro-controller, digital signal processor, microcomputer, central processing unit, field programmable gate array, programmable logic device, state machine, logic circuitry, analog circuitry, digital circuitry, and/or any device that manipulates signals (analog and/or digital) based on hard coding of the circuitry and/or operational instructions.
  • the processing module may have an associated memory element, which may be a single memory device, a plurality of memory devices, and/or embedded circuitry of the processing module.
  • Such a memory device may be a read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, cache memory, and/or any device that stores digital information.
  • the processing module 40 implements one or more of its functions via a state machine, analog circuitry, digital circuitry, and/or logic circuitry
  • the memory element storing the corresponding operational instructions may be embedded within, or external to, the circuitry comprising the state machine, analog circuitry, digital circuitry, and/or logic circuitry.
  • RFID tag 235 includes a power generating circuit 240, an oscillation module 244, a processing module 246, an oscillation calibration module 248, a comparator 250, an envelope detection module 252, a capacitor C1, and a transistor T1.
  • the oscillation module 244 , the processing module 246 , the oscillation calibration module 248 , the comparator 250 , and the envelope detection module 252 may be a single processing device or a plurality of processing devices.
  • Such a processing device may be a microprocessor, micro-controller, digital signal processor, microcomputer, central processing unit, field programmable gate array, programmable logic device, state machine, logic circuitry, analog circuitry, digital circuitry, and/or any device that manipulates signals (analog and/or digital) based on hard coding of the circuitry and/or operational instructions.
  • One or more of the modules 244 , 246 , 248 , 250 , 252 may have an associated memory element, which may be a single memory device, a plurality of memory devices, and/or embedded circuitry of the module.
  • Such a memory device may be a read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, cache memory, and/or any device that stores digital information.
  • the modules 244 , 246 , 248 , 250 , 252 implement one or more of their functions via a state machine, analog circuitry, digital circuitry, and/or logic circuitry
  • the memory element storing the corresponding operational instructions may be embedded within, or external to, the circuitry comprising the state machine, analog circuitry, digital circuitry, and/or logic circuitry.
  • In operation, the power generating circuit 240 generates a supply voltage (VDD) from a radio frequency (RF) signal that is received via antenna 254.
  • the power generating circuit 240 stores the supply voltage VDD in capacitor C1 and provides it to modules 244, 246, 248, 250, 252.
  • the envelope detection module 252 determines an envelope of the RF signal, which includes a DC component corresponding to the supply voltage VDD.
  • the RF signal is an amplitude modulation signal, where the envelope of the RF signal includes transmitted data.
  • the envelope detection module 252 provides an envelope signal to the comparator 250 .
  • the comparator 250 compares the envelope signal with a threshold to produce a stream of recovered data.
  • the oscillation module 244 which may be a ring oscillator, crystal oscillator, or timing circuit, generates one or more clock signals that have a rate corresponding to the rate of the RF signal in accordance with an oscillation feedback signal. For instance, if the RF signal is a 900 MHz signal, the rate of the clock signals will be n*900 MHz, where “n” is equal to or greater than 1.
  • the oscillation calibration module 248 produces the oscillation feedback signal from a clock signal of the one or more clock signals and the stream of recovered data. In general, the oscillation calibration module 248 compares the rate of the clock signal with the rate of the stream of recovered data. Based on this comparison, the oscillation calibration module 248 generates the oscillation feedback to indicate to the oscillation module 244 to maintain the current rate, speed up the current rate, or slow down the current rate.
  • the processing module 246 receives the stream of recovered data and a clock signal of the one or more clock signals.
  • the processing module 246 interprets the stream of recovered data to determine a command or commands contained therein.
  • the command may be to store data, update data, reply with stored data, verify command compliance, read user data, an acknowledgement, etc. If the command(s) requires a response, the processing module 246 provides a signal to the transistor T1 at a rate corresponding to the RF signal.
  • the signal toggles transistor T1 on and off to generate an RF response signal that is transmitted via the antenna.
  • in this way, the RFID tag 235 communicates using back-scattering RF communication. Note that the resistor R1 functions to decouple the power generating circuit 240 from the received RF signals and the transmitted RF signals.
  • the RFID tag 235 may further include a current reference (not shown) that provides one or more reference, or bias, currents to the oscillation module 244 , the oscillation calibration module 248 , the envelope detection module 252 , and the comparator 250 .
  • the bias current may be adjusted to provide a desired level of biasing for each of the modules 244 , 248 , 250 , and 252 .
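  • As a rough illustration of the data-recovery path just described, the following Python sketch plays the role of the envelope detection module 252 and comparator 250 in software: sampled amplitudes of the amplitude-modulated RF signal are low-pass filtered and then sliced against a threshold to produce the stream of recovered data. The smoothing constant and threshold are arbitrary example values, not taken from the patent.

      def recover_data(samples, threshold, smoothing=0.2):
          """Recover a bit stream from amplitude-modulated signal samples."""
          envelope = 0.0
          bits = []
          for sample in samples:
              # Envelope detection: track the signal magnitude with a simple filter.
              envelope += smoothing * (abs(sample) - envelope)
              # Comparator: slice the envelope against the threshold.
              bits.append(1 if envelope > threshold else 0)
          return bits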
  • FIG. 21 is a schematic block diagram of a user's hand grasping a gaming object with a capacitive sensor in a first manner in accordance with an embodiment of the present invention.
  • a gaming object 372, such as gaming object 110, is shown that includes a sensor 370. When the user grasps the gaming object 372, the hand 99 comes in contact with the sensor 370.
  • the sensor 370 can be a capacitive sensor that includes a layer that can store an electrical charge. When a user touches the sensor, a portion of the charge is transferred to the user, reducing the charge in the capacitive layer.
  • the sensor 370 includes a driver that measures differences in charge from end to end of the strip to determine the amount and location of the touch, which can be output as user data, such as user data 102.
  • the sensor 370 can alternatively be a resistive sensor, such as a four- or five-wire touch pad or strip, or other resistive sensor.
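  • A minimal sketch of the end-to-end charge measurement described above is shown below in Python; the function name, charge values and scaling are purely illustrative assumptions, not details from the patent.

      def touch_position(charge_end_a, charge_end_b):
          """Estimate the touch location as a fraction of the strip length from end A."""
          total = charge_end_a + charge_end_b
          if total == 0:
              return None  # no charge transferred, so no touch is detected
          # More charge is drawn through the end nearer the touch, so the share
          # measured at end B approximates the distance of the touch from end A.
          return charge_end_b / total

      # Example: a touch one quarter of the way along the strip from end A
      # draws roughly three times as much charge at end A as at end B.
      assert abs(touch_position(0.75, 0.25) - 0.25) < 1e-9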
  • the sensor 370 can isolate biofeedback data such as a user's heart rate, a level of perspiration, or other biometric data that can be included in user data 102 .
  • the sensor 370 generates user data 102 that includes biofeedback data that can be used by game console 100 to adjust a game parameter of the gaming application based on the biofeedback data. For instance, in an adventure game, an excitement level of the game can be reduced in response to biofeedback indicating a heart rate or level of perspiration that is too high or increasing too rapidly.
  • the game can sense the fear of a user via biofeedback that indicates a high heart rate or level of perspiration.
  • biofeedback can indicate a level of fatigue of the user based on heart rate or perspiration levels and the game can take action to taunt the player in a light-hearted way or otherwise adjust the level of difficulty of the game based on the user's fatigue.
  • the user data 102 generated by the sensor 370 can indicate the manner in which the user grasps the gaming object, in terms of the level of tightness, the position of the hand on the gaming object 372, etc. Each of these parameters can be included in user data 102 and the game console 100 can adjust one or more game parameters in response.
  • the gaming object 372 may be used to simulate a tennis racquet in a user's hand.
  • the game may attribute more power to the user's serve if the gaming object is held near one end, signifying greater simulated racquet extension during the serve.
  • a greater probability of a "miss-hit" shot can be attributed based on the user data 102.
  • a bunt by a user may require the user to shift his or her hand position on the gaming controller to simulate “choking-up” on the simulated bat.
  • FIG. 22 is a schematic block diagram of a user's hand grasping a gaming object with a capacitive sensor in a second manner in accordance with an embodiment of the present invention.
  • the user's hand 99 is in a different position on the gaming object 372 covering more of the sensor 370 .
  • this change in the manner in which the gaming object 372 is grasped can be indicated via user data 102 and used to adjust one or more parameters of a game.
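  • A minimal sketch of how such grip information might map to game parameters follows, assuming a normalized grip position (0.0 near the butt of the simulated racquet, 1.0 near its far end) and a normalized tightness; the function name and constants are hypothetical illustrations of the tennis-serve example above.

      def serve_parameters(grip_position, grip_tightness):
          """Map grip data from user data 102 to simulated tennis-serve parameters."""
          # Holding the gaming object near one end signifies greater racquet
          # extension, so the serve is credited with more power...
          power = 0.5 + 0.5 * grip_position
          # ...but also a greater probability of a "miss-hit" shot.
          mis_hit_probability = 0.05 + 0.25 * grip_position
          if grip_tightness < 0.2:
              mis_hit_probability += 0.1  # a very loose grasp adds further risk
          return power, min(mis_hit_probability, 1.0)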
  • FIG. 23 is a flowchart representation of a method in accordance with an embodiment of the present invention. In particular a method is presented for use in conjunction with one or more functions and features presented in conjunction with FIGS. 1-22 .
  • biofeedback data is generated in response to an action of a user.
  • an RF signal is sent to a game device, wherein the RF signal indicates the biofeedback data.
  • a gaming application is generated based on the biofeedback data.
  • the biofeedback data includes image data corresponding to an image of the user.
  • the gaming application can recognize the user based on the image data.
  • the biofeedback data can include voice data generated by the user that is used to recognize the user based on the voice data and/or to recognize a game command based on the voice data.
  • the biofeedback data can indicate a heart rate of the user and/or a perspiration level of the user.
  • the gaming application can adjust a game parameter based on the biofeedback data.
  • the biofeedback data can indicate a manner in which the user grasps the gaming object.
  • the terms "substantially" and "approximately" provide an industry-accepted tolerance for their corresponding terms and/or relativity between items. Such an industry-accepted tolerance ranges from less than one percent to fifty percent and corresponds to, but is not limited to, component values, integrated circuit process variations, temperature variations, rise and fall times, and/or thermal noise. Such relativity between items ranges from a difference of a few percent to magnitude differences.
  • the term(s) "coupled to" and/or "coupling" includes direct coupling between items and/or indirect coupling between items via an intervening item (e.g., an item includes, but is not limited to, a component, an element, a circuit, and/or a module) where, for indirect coupling, the intervening item does not modify the information of a signal but may adjust its current level, voltage level, and/or power level.
  • inferred coupling (i.e., where one element is coupled to another element by inference) includes direct and indirect coupling between two items in the same manner as "coupled to".
  • the term "operable to" indicates that an item includes one or more of power connections, input(s), output(s), etc., to perform one or more of its corresponding functions and may further include inferred coupling to one or more other items.
  • the term "associated with" includes direct and/or indirect coupling of separate items and/or one item being embedded within another item.
  • the term "compares favorably" indicates that a comparison between two or more items, signals, etc., provides a desired relationship. For example, when the desired relationship is that signal 1 has a greater magnitude than signal 2, a favorable comparison may be achieved when the magnitude of signal 1 is greater than that of signal 2 or when the magnitude of signal 2 is less than that of signal 1.
  • while the transistors in the above described figure(s) is/are shown as field effect transistors (FETs), as one of ordinary skill in the art will appreciate, the transistors may be implemented using any type of transistor structure including, but not limited to, bipolar, metal oxide semiconductor field effect transistors (MOSFET), N-well transistors, P-well transistors, enhancement mode, depletion mode, and zero voltage threshold (VT) transistors.

Abstract

A gaming object includes a sensor that generates biofeedback data in response to an action of a user. A transceiver is coupled to send an RF signal that indicates the biofeedback data to a game device. The game device executes a gaming application that is based on the biofeedback data.

Description

    CROSS REFERENCE TO RELATED PATENTS
  • This application claims priority under 35 USC §119(e) to a provisionally filed patent application having the title VIDEO GAMING SYSTEM WITH POSITION AND MOTION TRACKING, a filing date of Jun. 22, 2007, and an application number of 60/936,724.
  • BACKGROUND OF THE INVENTION
  • 1. Technical Field of the Invention
  • This invention relates generally to gaming systems and more particularly to game controllers used for interacting with a game console and an associated display.
  • 2. Description of Related Art
  • Home gaming systems typically include a game controller that includes one or more buttons or a joy stick that allows a user to provide input to a game console that runs one or more games. The game console is coupled to a display device such as a television set to provide audio and video output from the game.
  • In IR communication systems, an IR device includes a transmitter, a light emitting diode, a receiver, and a silicon photo diode. In operation, the transmitter modulates a signal, which drives the LED to emit infrared radiation which is focused by a lens into a narrow beam. The receiver, via the silicon photo diode, receives the narrow beam infrared radiation and converts it into an electric signal.
  • IR communications are used in video games to detect the direction in which a game controller is pointed. As an example, an IR sensor is placed near the game display, where the IR sensor detects the IR signal transmitted by the game controller. If the game controller is too far away, too close, or angled away from the IR sensor, the IR communication will fail.
  • Further advances in video gaming include three accelerometers in the game controller to detect motion by way of acceleration. The motion data is transmitted to the game console via a Bluetooth wireless link. The Bluetooth wireless link may also transmit the IR direction data to the game console and/or convey other data between the game controller and the game console.
  • While the above technologies allow video gaming to include motion sensing, they do so with limitations. As mentioned, the IR communication has a limited area in which a player can be for the IR communication to work properly. Further, the accelerometer only measures acceleration such that true one-to-one detection of motion is not achieved. Thus, the gaming motion is limited to a handful of directions (e.g., horizontal, vertical, and a few diagonal directions).
  • BRIEF SUMMARY OF THE INVENTION
  • The present invention is directed to apparatus and methods of operation that are further described in the following Brief Description of the Drawings, the Detailed Description of the Invention, and the claims. Other features and advantages of the present invention will become apparent from the following detailed description of the invention made with reference to the accompanying drawings.
  • BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWING(S)
  • FIG. 1 is a schematic block diagram of an overhead view of an embodiment of a gaming system in accordance with an embodiment of the present invention;
  • FIG. 2 is a schematic block diagram of a side view of an embodiment of a gaming system in accordance with an embodiment of the present invention;
  • FIG. 3 is a schematic block diagram of an overhead view of another embodiment of a gaming system in accordance with an embodiment of the present invention;
  • FIG. 4 is a block diagram representation of a gaming system in accordance with an embodiment of the present invention.
  • FIG. 5 is a diagram of an example of the collection of image data by the gaming object 259 in accordance with an embodiment of the present invention;
  • FIG. 6 is a diagram of an example of positioning and/or motioning of a game controller to interact with an item on the display of a game console in accordance with an embodiment of the present invention;
  • FIG. 7 is a diagram of an example of positioning and/or motioning of a game controller to interact with an item on the display of a game console in accordance with another embodiment of the present invention;
  • FIG. 8 is a diagram of a method for processing a position and/or motion based selection in accordance with an embodiment of the present invention;
  • FIG. 9 is a diagram of a method for processing a position and/or motion based gaming action in accordance with an embodiment of the present invention;
  • FIGS. 10-12 are diagrams of an embodiment of a coordinate system of a gaming system in accordance with an embodiment of the present invention;
  • FIGS. 13-15 are diagrams of another embodiment of a coordinate system of a gaming system in accordance with an embodiment of the present invention;
  • FIG. 16 is a diagram of a method for determining position and/or motion tracking in accordance with an embodiment of the present invention;
  • FIG. 17 is a diagram of another method for determining position and/or motion tracking in accordance with an embodiment of the present invention;
  • FIG. 18 is a diagram of another method for determining position and/or motion tracking in accordance with an embodiment of the present invention;
  • FIG. 19 is a diagram of another embodiment of a coordinate system of a gaming system in accordance with an embodiment of the present invention;
  • FIG. 20 is a schematic block diagram of an embodiment of an RFID reader and an RFID tag in accordance with an embodiment of the present invention;
  • FIG. 21 is a schematic block diagram of a user's hand grasping a gaming object with a capacitive sensor in a first manner in accordance with an embodiment of the present invention;
  • FIG. 22 is a schematic block diagram of a user's hand grasping a gaming object with a capacitive sensor in a second manner in accordance with an embodiment of the present invention; and
  • FIG. 23 is a flowchart representation of a method in accordance with an embodiment of the present invention.
  • DETAILED DESCRIPTION OF THE INVENTION
  • FIG. 1 is a schematic block diagram of an overhead view of an embodiment of a gaming system that includes a game console and a gaming object. A video display 98 is shown that can be coupled to game console 100 to display video generated by game console 100 in conjunction with the set-up and playing of the game and to provide other user interface functions of game console 100. It should also be noted that game console 100 can include its own integrated video display that displays, either directly or via projection, video content in association with any of the functions described in conjunction with video display 98.
  • The gaming system has an associated physical area in which the game console and the gaming object are located. The physical area may be a room, portion of a room, and/or any other space where the gaming object and game console are proximally co-located (e.g., airport terminal, at a gaming center, on an airplane, etc.). In the example shown the physical area includes desk 92, chair 94 and couch 96.
  • In an embodiment of the present invention, the gaming object 110 may be a wireless game controller and/or any object used or worn by the player to facilitate play of a video game. For example, the gaming object 110 may include a simulated sword, a simulated gun, a paddle, racquet, bat, or other sporting good, a helmet, a vest, a hat, shoes, socks, pants, shorts, gloves, or other element of a costume or article of clothing, a guitar, baton, keyboard, or other music related item, etc. It should be noted that the gaming object 110 may represent or resemble another object from the game, may be coupled to an object that is worn or otherwise coupled to a user or be as simple as a standard box, pod or other object that is held by the user. Further, the functionality of gaming object 110 can be included in a multi-function device such as a mobile telephone, personal digital assistant, or other personal electronic device that performs other non-gaming functions.
  • In this system, the game console 100 optionally determines the orientation of the gaming object 110 within the physical area using one or more orientation sensors. In addition, the game console 100 can further track the motion of the gaming object using one or more motion tracking techniques to facilitate video game play. In this embodiment, the game console may determine an initial orientation and/or position of the gaming object 110 within a tolerance (e.g., within a meter and/or within 1-5 degrees) at an update rate (e.g., once every second or once every few seconds) and tracks the motion or changes in the orientation within a motion tracking tolerance (e.g., within a few millimeters) at a tracking update rate (e.g., once every 10-100 milliseconds) based on motion data and/or orientation data generated in response to the actions of a user.
  • In addition, the gaming object 110 can be an object that can include a joystick, touch pad, touch screen, wheel, one or more buttons and/or other sensor, actuator or other user interface device that generates other user data in response to the actions of a user. In operation, the gaming object 110 and gaming console 100 communicate via wireless transceivers over a wireless communication link that will be described in greater detail in conjunction with FIG. 4. Game console 100 generates display data for display on a display device such as video display 98. While shown as a home game console 100, gaming object 110 can optionally communicate with other game devices such as an arcade game, a game server that is connected to a local area network, a communication network or public data network such as the Internet, or other game devices. Further, while the communication between gaming object 110 and game console 100 is shown as direct communication, the gaming object may optionally communicate with a base station or access point that transfers communications to and from the game console 100 via a local area network, a communication network or public data network such as the Internet.
  • In an embodiment of the present invention, the video display 98 displays one or more interactive items in the set-up or execution of at least one game or otherwise in association with a gaming application executed by the game console 100. These interactive items are interactive in response to the orientation data generated based on one or more orientation sensors and other interaction data. For instance, during the initiation of a game, one or more menu items can be displayed on the video display 98 for selection by the user via pointing the gaming object at the menu item and selecting the menu item by the press of a button. In another example, the user can “shoot” at an interactive item on the video display 98, such as a clay pigeon displayed in conjunction with a skeet shooting game, by pointing the gaming object 110 at the clay pigeon and pressing a button or trigger to initiate a shot.
  • Further, the gaming object 110 includes a sensor that generates user data in the form of biofeedback data in response to an action of a user. The gaming object sends an RF signal that indicates the biofeedback data to a game device such as game console 100. The game device executes a gaming application that is based on the biofeedback data. In this fashion, the set-up, user authentication, and/or operation of a game can be adjusted based on the biofeedback data.
  • Further details including many optional functions and features are described in conjunction with FIGS. 2-23 that follow.
  • FIG. 2 is a schematic block diagram of a side view of an embodiment of a gaming system of FIG. 1. In particular, a user 106 is represented schematically as holding a particular gaming object 110 in his or her hand or hands. User data 102 and orientation data 105 are generated by the gaming object 110 and communicated via a wireless communication path 104 with the game console 100. The user data 102 and orientation data 105 can include user selections, commands, position data indicating the position, orientation, and/or motion of the gaming object 110 or other user data that is generated based on the actions of the user in conjunction with the playing and set-up of a particular game, and/or the user's other interactions with the game console 100.
  • FIG. 3 is a schematic block diagram of an overhead view of another embodiment of a gaming system that includes a game console, a plurality of players and a plurality of gaming objects. In this instance, game console 100 communicates with both gaming object 110 and gaming object 110′ and receives corresponding user data and orientation data, such as user data 102 and orientation data 105, from each gaming object. In an embodiment of the present invention, game console 100 operates on a separate frequency for each device; however, time division multiplexing, carrier sense multiple access collision avoidance (CSMA/CA) or other multiple access techniques can likewise be used.
  • FIG. 4 is a block diagram representation of a gaming system in accordance with an embodiment of the present invention. In particular, a gaming system is shown that includes game console 100 and gaming object 110. Gaming object 110 includes an actuator 114 for generating user data, such as user data 102 in response to the actions of a user, such as user 106.
  • Actuator 114 can include a microphone, button, joy stick, wheel, keypad, keyboard, a resistive strip, touch pad or touch screen, and/or a motion sensor (such as an on-chip gyrator or accelerometer or other position or motion sensing device) along with other driver circuitry for generating user data 102 based on the actions of the user 106.
  • In an embodiment of the present invention, the actuator 114 includes a capacitive or resistive sensor, such as a resistive or capacitive touchpad or touch screen. By touching the touchpad or touch screen, particularly in response to soft keys or other visual cues displayed by a touch screen or other display, the resistive or capacitive sensor can be operable to generate user data 102 that includes an audio output command, such as to change a volume setting, to select, enable or disable background music or other audio effects; or an audio input command that enables or disables voice commands, sets an input level or selects an input device. In a similar fashion, the capacitive sensor can generate set-up commands, gaming data, preferences data, product registration data, and/or authentication data or other user data 102 in response to the actions of a user, such as user 106.
  • In an embodiment of the present invention, the actuator 114 includes a resistive sensor, capacitive sensor, microphone, optical sensor or other sensor that generates user data 102 that includes biofeedback data that can be used by game console 100 to adjust a game parameter of the gaming application based on the biofeedback data. For instance, in an adventure game, an excitement level of the game can be reduced in response to biofeedback indicating a heart rate or level of perspiration that is too high or increasing too rapidly. In another embodiment, the game can sense the fear of a user via biofeedback that indicates a high heart rate or level of perspiration. In a sports game, biofeedback can indicate a level of fatigue of the user based on heart rate or perspiration levels and the game can take action to taunt the player in a light-hearted way or otherwise adjust the level of difficulty of the game based on the user's fatigue.
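  • A minimal sketch of this kind of adjustment is shown below, assuming a heart rate in beats per minute and a perspiration level normalized to the range 0 to 1; the baseline, thresholds and step sizes are illustrative values only and are not taken from the patent.

      RESTING_HEART_RATE_BPM = 70   # assumed baseline for the example

      def adjust_excitement(excitement, heart_rate_bpm, perspiration):
          """Back the game's excitement level off when biofeedback shows stress."""
          stress = max(heart_rate_bpm - RESTING_HEART_RATE_BPM, 0) / 60.0 + perspiration
          if stress > 1.0:
              # Heart rate or perspiration too high: reduce the excitement level.
              return max(excitement - 0.1, 0.0)
          return min(excitement + 0.02, 1.0)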
  • Optional orientation sensor 112 can include a photosensor that generates the orientation data 105 based on an optical signal from a video display such as a video display integrated in game console 100 or separate video display 98. In this fashion, the optical signal can be used to generate orientation data 105 that represents the orientation of the gaming object 110. In a further embodiment, orientation sensor 112 includes a plurality of sensors, such as motion sensors, RF tags or other sensors, that generate orientation data that indicates the orientation of the gaming object 110 based on the relative positions of the plurality of sensors.
  • Transceiver 120 sends data, such as user data 102 and orientation data 105 to transceiver 130 of game console 100 via RF signals 108. In addition, gaming object 110 optionally receives RF signals 108 from game console 100 that contain other gaming data such as control data, optional display data for display on a touch screen or other display screen incorporated in gaming object 110.
  • Gaming object 110 optionally contains a processor 122′, memory 124′ and bus 125′. When included, processor 122′ can execute one or more applications to perform the operation of a smart gaming controller, to facilitate the generation and transmission of user data 102 and orientation data 105, to perform other gaming operations and to optionally perform non-gaming functions and applications. Transceiver 120 can communicate with transceiver 130 via a wireless telephony protocol operating in a short range or low power mode, via a Bluetooth standard interface, via a 802.11 or other wireless local area network protocol, or via another wireless protocol.
  • In another embodiment, transceiver 120 is coupled to receive an RF signal 108 initiated by game console 100, such as a 60 GHz RF signal or other RF signal. In a similar fashion to a passive RFID tag, transceiver 120 converts energy from the RF signal 108 into a power signal for powering the transceiver 120 or all or some portion of the gaming object 110. By the gaming object 110 deriving power, in whole or in part, based on RF signal 108, gaming object 110 can optionally be portable, small and light. Transceiver 120 conveys the user data 102 and orientation data 105 back to the game console 100 by backscattering the RF signal 108 based on user data 102 and orientation data 105.
  • Game console 100 includes an interface module 132 for coupling to the gaming object 110. In particular, interface module 132 includes transceiver 130 that communicates with transceiver 120 either directly or via a network. Game console 100 further includes a memory 124 and processor 122 that are coupled to interface module 132 via a bus 125. In operation, processor 122 executes one or more routines such as an operating system, utilities, and one or more applications such as video game applications or other gaming applications that produce video information that is converted to display signal 128 via driver 126.
  • Processors 122 and 122′ can each include a dedicated or shared processing device. Such a processing device may be a microprocessor, micro-controller, digital signal processor, microcomputer, central processing unit, field programmable gate array, programmable logic device, state machine, logic circuitry, analog circuitry, digital circuitry, and/or any device that manipulates signals (analog and/or digital) based on operational instructions. The memories 124 and 124′ can each be a single memory device or a plurality of memory devices. Such a memory device may be a read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, and/or any device that stores digital information. Note that when the processors 122 or 122′ implement one or more of their functions via a state machine, analog circuitry, digital circuitry, and/or logic circuitry, the memory storing the corresponding operational instructions is embedded with the circuitry comprising the state machine, analog circuitry, digital circuitry, and/or logic circuitry. While particular bus architectures are shown, alternative bus architectures including architectures having two or more buses or direct connectivity between the various modules of game console 100 and gaming object 110, can likewise be employed within the broad scope of the present invention.
  • As discussed in conjunction with FIG. 1 the game console 100 can generate display data for display on a display device that contains at least one interactive item that is interactive in response to the orientation data 105 and interaction data included in user data 102. For instance, an optic sensor in a gaming object 110 that simulates a gun can generate optical feedback to determine if the “gun” is pointed at a particular object, such as a clay pigeon, that is displayed on the screen. If interaction data is generated, such as by the user 106 simulating the pull of a trigger, when the gaming object 110 is pointed at the interactive item, the interaction can result. In the case of the clay pigeon discussed above, the clay pigeon can be shown to be broken by the simulated “shot” from the simulated gun. In a similar fashion, game console can display an interactive menu having menu items that are selectable by pointing the gaming object 110 at the menu item and generating interaction that indicates the user's intent to select the item.
  • Game console 100 optionally includes a recognition module 128 that operates to recognize one or more patterns in biofeedback data, such as voice data, image data or other biofeedback data and to generate a recognition signal in response thereto. For example, when biofeedback data includes voice data, recognition module 128 can perform pattern recognition on the voice data to recognize and/or authenticate the speaker as corresponding to a particular registered user. In operation, the gaming object 110 can prompt the user to generate voice samples used to train a speaker recognition routine included in recognition module 128. When the voice signals received as part of user data 102 are recognized as part of an authentication routine as corresponding to a particular user, the recognition module 128 can respond by generating a recognition signal to processor 122 that indicates this correspondence.
  • In a further example, when biofeedback data includes voice data, recognition module 128 can perform pattern recognition on the voice data to recognize voice commands such as game commands, set-up commands, other commands or other biofeedback via either speaker dependent speech recognition that operates based on training data received from gaming object 110 or via speaker independent speech recognition. When the voice signals received as part of user data 102 are recognized as corresponding to a particular command, the recognition module 128 can respond by generating a recognition signal to processor 122 that indicates this correspondence for use by the gaming application. In this fashion, a user can issue voice commands such as “jump”, “stop”, “go back”, “commence firing” or any other commands used in conjunction with a gaming application. Further, a shout generated by the user can be recognized as a shout and used by a game to alert characters of the game, to modify an anger or fear level or other emotional state of the user's character in the game or other characters in the game, or to modify one or other game parameters.
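  • As a sketch of how such a recognition signal might be consumed by the gaming application, the snippet below dispatches recognized command strings to hypothetical handler methods on a game object; the speech recognition itself is assumed to be performed by recognition module 128 and is not reproduced here.

      GAME_COMMANDS = {
          "jump": lambda game: game.player.jump(),
          "stop": lambda game: game.player.stop(),
          "go back": lambda game: game.player.go_back(),
          "commence firing": lambda game: game.player.commence_firing(),
      }

      def handle_recognition_signal(game, recognized_text, is_shout=False):
          """Apply a recognition signal from recognition module 128 to the game."""
          if is_shout:
              # A shout can alert in-game characters or modify an emotional state.
              game.alert_characters()
              return
          handler = GAME_COMMANDS.get(recognized_text.lower())
          if handler is not None:
              handler(game)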
  • In another example, when biofeedback data includes image data corresponding to a portion of the user such as the face, fingerprint, palm print, retina or other portion of the user, recognition module 128 can perform pattern recognition on the image data to recognize and/or authenticate the image as corresponding to a particular registered user. In operation, the gaming object 110 can prompt the user to generate image samples used to train an image recognition routine included in recognition module 128. When the image data received as part of user data 102 are recognized as part of an authentication routine as corresponding to a particular user, the recognition module 128 can respond by generating a recognition signal to processor 122 that indicates this correspondence.
  • In yet another example, when biofeedback data includes image data corresponding to a portion of the user, such as the face, recognition module 128 can perform pattern recognition on the image data to recognize portions of the image as corresponding to an emotional response of a particular registered user. In particular, analysis of the user's expression based on the eyes, mouth, eyebrows, brow or other features can be used to determine if the user is fatigued, afraid, angry, sad, disappointed, excited, etc. The recognition of one or more of these emotional states by recognition module 128 can be used to generate a corresponding recognition signal that can be used by processing module 122 to modify the emotional state of the user's character in the game or other characters in the game, or to modify one or more other game parameters.
  • FIG. 5 is a diagram of an example of the collection of image data by the gaming object 110 in accordance with an embodiment of the present invention. In particular, a gaming object 259, such as gaming object 110, is shown that includes an image sensor 382. Image sensor 382 can be a charge coupled device (CCD) or other image sensor that generates image data in the form of either a still image or video. As shown, image sensor 382 operates by capturing an image that can be used to generate user data 102 in the form of image data. While the image 380 is shown to correspond to a head shot of a user, such as user 106, the image can correspond to a fingerprint, palm print, face, retina or other portion of a user's body.
  • As discussed in conjunction with FIG. 4, this image data can be used as biofeedback data for authentication, modifying game parameters or for other purposes in conjunction with the set-up or execution of one or more gaming applications.
  • FIG. 6 is a diagram of an example of positioning and/or motioning of a game controller to select an item on the display of a game console. In an embodiment, a game controller 260, such as gaming object 110, and the game console utilize tracking of the orientation of the controller to provide a selection of a menu item displayed on a video display associated with game console 100. Gaming object 260 can be a geometric solid, such as a handheld device, that can be positioned and oriented in three dimensional space. In operation, the gaming object 260 can have three dimensional coordinates (x, y, z) and be oriented along roll, pitch and yaw axes based on, for instance, up/down and side-to-side motion, rotation, tilt and translation and rotation about other axes.
  • In this embodiment, gaming object 260 includes an orientation sensor 112, such as optical sensor 136, that generates orientation data 105 when the orientation of gaming object 260 corresponds to an orientation in alignment with the menu item. In this case, the light emitted by "item 2" in the menu is received by the optical sensor and used to generate orientation data 105. In response, the game console 100 can highlight the menu item when the orientation of the gaming object 260 corresponds to an orientation aligned with the menu item.
  • In the example shown, the "item 2" is highlighted when the gaming object is pointed at this menu item. This provides visual feedback to a user of gaming object 260 of what item the gaming object 260 is pointed at. If the user indicates his or her selection of the highlighted item, via an actuator 114 (such as by the click of a button), game console 100 can respond by performing the function associated with this menu item in conjunction with the particular menu displayed.
  • In an embodiment of the present invention, the orientation data is preprocessed in the optical sensor 136 or processing module 122′ based on an image generated therefrom to generate orientation data 105. In the alternative, orientation data 105 corresponding to the image or other optical output is sent to game console 100 for processing by processing module 122 to determine whether the orientation data corresponds to any of the menu items being displayed, based on timing of the signal in correspondence to the timing of the displayed image, or based on a recognition of an image or portion of an image corresponding to the displayed item or a portion thereof.
  • While presented in conjunction with the selection of a menu item, in concert with the clay pigeon/gun example previously presented, the interactive item displayed on display 98 can alternatively be a graphics item displayed in conjunction with a game. In this embodiment, an interaction is generated, such as the breaking of the clay pigeon, when the orientation of the gaming object 260 corresponds to an orientation in alignment with the graphics item on display 98.
  • FIG. 7 is a diagram of an example of positioning and/or motioning of a game controller to interact with an item on the display of a game console in accordance with another embodiment of the present invention. In this embodiment, gaming object 261, such as gaming object 110, is implemented with sensing tags 140 for use in generating orientation data 105 that indicates the orientation of the gaming object 261. In particular, the relative position of the sensing tags 140 in three-dimensional space can be used to determine the orientation of the gaming object 261.
  • In this embodiment, the positioning of the sensing tags can be determined within a positioning tolerance (e.g., within a meter) at a positioning update rate (e.g., once every second or once every few seconds), and the motion of the sensing tags 140 can be tracked within a motion tracking tolerance (e.g., within a few millimeters) at a motion tracking update rate (e.g., once every 10-100 milliseconds) within a position and motion tracking area that is in range of game console 100.
  • In an embodiment of the present invention, each of the sensing tags 140 is implemented via an RF tag. In this mode of operation, the game console 100 sends one or more RF signals 108 on a continuous basis and reads the orientation data 105 generated by each of the sensing tags 140 periodically (e.g., once every 10-100 milliseconds) to update the positioning of sensing tags 140. In another mode of operation, the game console 100 generates and sends one or more RF signals 108 periodically (e.g., once every 10-100 milliseconds) and reads the orientation data 105 generated by each of the sensing tags 140 only when required to update the orientation of game object 261. In a further mode of operation, the sensing tags 140 can be RF tags that contain motion sensors or other position sensors and the game object 261 itself reads the position of each of the sensing tags 140 and generates orientation data 105 that is compiled and sent to the game console 100.
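  • A sketch of the periodic interrogation described above is shown below, using an assumed reader interface and a period taken from the 10-100 millisecond range given in the text; the read call and its return value are hypothetical.

      import time

      TRACKING_PERIOD_S = 0.05  # 50 ms, within the 10-100 ms range above

      def poll_sensing_tags(reader, tag_ids, updates):
          """Read orientation data from each sensing tag 140 at the tracking rate."""
          for _ in range(updates):
              # reader.read() is a hypothetical call returning one tag's position sample.
              orientation_data = {tag_id: reader.read(tag_id) for tag_id in tag_ids}
              yield orientation_data
              time.sleep(TRACKING_PERIOD_S)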
  • FIG. 8 is a diagram of a method for processing a position and/or motion based selection that begins by placing the controller and/or gaming console in a menu selection mode as shown in step 330. In this mode, the controller is set up to process a menu selection as opposed to a gaming function. The method continues by establishing the gaming object 110's current position and orientation with respect to an initial position in a display area as shown in step 332. For example, regardless of the current position and orientation (assuming it is in range), the gaming object 110's current position and orientation is processed to correspond to a particular location on the menu display.
  • The method proceeds by highlighting the menu item corresponding to the initial position (e.g., a start menu button) as shown in step 334. The method then continues by tracking the motion of the gaming object and mapping the motion to coordinates of the menu display area (e.g., in an embodiment, the mapping of the motion will be limited to somewhere within the menu display area) as shown in steps 336 and 338. The method continues by determining whether the motion has moved to another item in the menu list as shown in step 340. If yes, the method proceeds by highlighting the new item as shown in step 342.
  • The method then proceeds by determining whether a selection of the highlighted item is received as shown in step 344. If not, the process continues by tracking the motion in step 336. If a selection is received, the process continues by processing the menu selection as shown in step 346. This may be done in a conventional manner.
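  • The selection loop of FIG. 8 can be summarized in the following Python sketch; the controller and menu objects and their methods are hypothetical stand-ins for steps 330-346, not an interface defined by the patent.

      def run_menu_selection(controller, menu):
          highlighted = menu.initial_item()            # steps 330-334
          highlighted.highlight()
          while True:
              motion = controller.track_motion()       # step 336
              item = menu.item_at(menu.clamp(motion))  # step 338: map motion into the menu area
              if item is not highlighted:              # step 340
                  highlighted.unhighlight()
                  highlighted = item
                  highlighted.highlight()              # step 342
              if controller.selection_pressed():       # step 344
                  return menu.process(highlighted)     # step 346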
  • FIG. 9 is a diagram of a method for processing a position and/or motion based gaming action that begins by placing the gaming object (e.g., a controller) and/or game console in a gaming mode as shown in step 350. The method continues by establishing the gaming object's current position and orientation with respect to an initial position in a gaming display area as shown in step 352. For example, if the game being played is a shooting arcade game and the gaming object is functioning as a gun, this step determines the initial aiming of the gun.
  • The method continues by determining whether the position and orientation of the gaming object is within the gaming display area as shown in step 354. If yes, the method continues by providing a display icon corresponding to the position and orientation as shown in step 356. For example, the icon may be cross hairs of a gun to correspond to the aiming of the video game gun. The method continues by tracking the motion of the gaming object and mapping the motion to the gaming display area as shown in steps 358 and 360.
  • The method continues by determining whether an action has been received as shown in step 362. For example, has the trigger of the gun been pulled? If not, the process repeats as shown. If yes, the process continues by processing the action as shown in step 364. For example, the processing may include mapping the shooting of the gun in accordance with the aiming of the gun.
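  • A corresponding sketch of the gaming-mode loop of FIG. 9 follows, again with hypothetical object interfaces standing in for steps 350-364.

      def run_gaming_action(gaming_object, display):
          pose = gaming_object.initial_pose()              # steps 350-352
          while True:
              if display.contains(pose):                   # step 354
                  display.draw_icon(pose)                  # step 356, e.g. gun cross hairs
              motion = gaming_object.track_motion()        # step 358
              pose = display.map_to_display_area(motion)   # step 360
              if gaming_object.action_received():          # step 362, e.g. trigger pulled
                  display.process_action(pose)             # step 364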
  • FIGS. 10-12 are diagrams of an embodiment of a coordinate system of a localized physical area that may be used for a gaming system. In these figures an xyz origin is selected to be somewhere in the localized physical area and each point being tracked and/or used for positioning on the player and/or on the gaming object 110 is determined based on its Cartesian coordinates (e.g., x1, y1, z1). As the player and/or gaming object moves, the new position of the tracking and/or positioning points are determined in Cartesian coordinates with respect to the origin. As discussed in conjunction with FIG. 9, the positions of the sensing tags 140 can be used to determine an orientation of the gaming object 110.
  • FIGS. 13-15 are diagrams of another embodiment of a coordinate system of a localized physical area that may be used for a gaming system. In these figures an origin is selected to be somewhere in the localized physical area and each point being tracked, such as the position of each sensing tag 140 or other position used for determining the positioning or orientation of the gaming object 110, is determined based on its vector, or spherical, coordinates (ρ, φ, θ), which are defined as follows: ρ≧0 is the distance from the origin to a given point P; 0≦φ≦180° is the angle between the positive z-axis and the line formed between the origin and P; and 0≦θ≦360° is the angle between the positive x-axis and the line from the origin to P projected onto the xy-plane. φ is referred to as the zenith, colatitude or polar angle, while θ is referred to as the azimuth. φ and θ lose significance when ρ=0, and θ loses significance when sin(φ)=0 (at φ=0 and φ=180°). To plot a point from its spherical coordinates, go ρ units from the origin along the positive z-axis, rotate φ about the y-axis in the direction of the positive x-axis, and rotate θ about the z-axis in the direction of the positive y-axis. As the sensing tags and/or gaming object 110 move, the new positions of the tracking and/or positioning points are determined in vector, or spherical, coordinates with respect to the origin, which can be used to determine not only the position of the gaming object 110 but its orientation as well.
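  • For reference, the standard conversion from the spherical coordinates defined above to Cartesian coordinates is shown below; this is a textbook identity rather than anything specific to the gaming system.

      import math

      def spherical_to_cartesian(rho, phi_deg, theta_deg):
          """Convert (rho, phi, theta) as defined above into (x, y, z)."""
          phi = math.radians(phi_deg)      # zenith angle from the positive z-axis
          theta = math.radians(theta_deg)  # azimuth from the positive x-axis
          x = rho * math.sin(phi) * math.cos(theta)
          y = rho * math.sin(phi) * math.sin(theta)
          z = rho * math.cos(phi)
          return x, y, z

      # A point 2 units up the positive z-axis: phi = 0, so x = y = 0 and z = 2.
      assert spherical_to_cartesian(2.0, 0.0, 0.0) == (0.0, 0.0, 2.0)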
  • While FIGS. 10-15 illustrate two types of coordinate systems, other three-dimensional coordinate systems may be used for tracking motion and/or establishing position and orientation within a gaming system.
  • FIG. 16 is a diagram of another method for determining position and/or motion tracking that begins in step 300 by determining a reference point within a coordinate system (e.g., the vector coordinate system of FIGS. 9-11). The reference point may be the origin or any other point within the localized physical area. In particular, the reference point can be the location of the game console 100, the location of the game object 110 at a particular time, such as a set-up time, the location of one of a plurality of sensing tags 140, however, other reference points can likewise be used.
  • The method continues in one or more branches. Along one branch, a vector with respect to the reference point is determined to indicate the initial position of the gaming object 110 and/or the sensing tags 140 based on the reference point as shown in step 302. This branch continues by updating the positions to track the motion and/or orientation of gaming object 110 based on orientation data 105 as shown in step 304.
  • The other branch includes determining a vector with respect to the reference point for the gaming object 110 to establish its initial position as shown in step 306. This branch continues by updating the gaming object 110's position to track the gaming object's motion using orientation data as shown in step 308. Note that the tracking of the motion of the player and/or gaming object may be done at a rate based on the video game being played and the expected speed of motion. Further note that a tracking rate of 10 milliseconds provides 0.1 mm accuracy in motion tracking.
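  • One way to picture steps 302-308 is sketched below: the positions of two sensing tags, each expressed as a vector from the chosen reference point, yield both a position for the gaming object and a crude orientation vector. The two-tag arrangement and the helper name are assumptions made for illustration only.

      def object_pose(tag_front, tag_rear):
          """Derive a position and unit orientation vector from two tag positions."""
          position = tuple((f + r) / 2.0 for f, r in zip(tag_front, tag_rear))
          direction = [f - r for f, r in zip(tag_front, tag_rear)]
          length = sum(c * c for c in direction) ** 0.5 or 1.0
          orientation = tuple(c / length for c in direction)
          return position, orientation

      # Tags half a meter apart along the x-axis, one meter above the reference point.
      print(object_pose((0.5, 0.0, 1.0), (0.0, 0.0, 1.0)))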
  • FIG. 17 is a diagram of another method for determining position and/or motion tracking that begins in step 310 by determining the coordinates of the sensing tags' positions in the physical area. The method then continues by determining the coordinates of a gaming object's initial position as shown in step 312. The method then proceeds by updating the coordinates of the sensing tags' positions in the physical area to track the gaming object's orientation as shown in step 314. The method also continues by updating the coordinates of a gaming object's position to track its motion as shown in step 316.
  • FIG. 18 is a diagram of another method for determining position and/or motion tracking that begins in step 320 by determining a reference point within the physical area in which the gaming object lies and/or in which the game system lies. The method then proceeds by determining a vector for the sensing tag's initial position with respect to a reference point of a coordinate system (e.g., one of the systems shown in FIGS. 12-14) as shown in step 322. As an example, if the physical area is a room, a point in the room is selected as the origin and the coordinate system is applied to at least some of the room.
  • The method then continues by determining a vector of a gaming object 110's initial position as shown in step 324. The method then proceeds by updating the vector of the sensing tag's position in the physical area to track the gaming object's orientation as shown in step 326. The method also continues by updating the vector of the gaming object's position to track its motion as shown in step 328.
  • FIG. 19 is a diagram of another embodiment of a coordinate system of a gaming system that is an extension of the coordinate systems discussed above. In this embodiment, the coordinate system includes a positioning coordinate grid and a motion tracking grid, where the motion tracking grid is of a finer resolution than the positioning coordinate grid. In general, the player or gaming object 110's position within the physical area can have a first tolerance (e.g., within a meter) and the motion tracking of the player and/or the gaming object has a second tolerance (e.g., within a few millimeters). As such, the position of the player and/or gaming object can be updated infrequently in comparison to the updating of the motion (e.g., the position can be updated once every second or so while the motion may be updated once every 10 milliseconds).
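  • A sketch of this two-rate scheme follows, using the one-second position updates and ten-millisecond motion updates mentioned above; the tracker interface is a hypothetical stand-in.

      POSITION_PERIOD_S = 1.0   # coarse positioning grid, roughly one-meter tolerance
      MOTION_PERIOD_S = 0.010   # fine motion-tracking grid, millimeter-scale tolerance

      def tracking_loop(tracker, steps):
          steps_per_position_update = int(POSITION_PERIOD_S / MOTION_PERIOD_S)  # 100
          position = tracker.coarse_position()
          for step in range(steps):
              motion = tracker.fine_motion()              # updated every 10 ms
              tracker.apply(position, motion)
              if (step + 1) % steps_per_position_update == 0:
                  position = tracker.coarse_position()    # refreshed about once a second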
  • FIG. 20 is a schematic block diagram of an embodiment of an RFID reader and an RFID tag. In particular, RFID reader 205 represents a particular implementation of transceiver 130 and RFID tag 235 represents a particular implementation of transceiver 120. In addition, RFID tag 235 can be used in an implementation of sensing tags 140 in communication with RFID reader 205 incorporated in game console 100. As shown, RFID reader 205 includes a protocol processing module 40, an encoding module 42, an RF front-end 46, a digitization module 48, a predecoding module 50 and a decoding module 52, all of which together form components of the RFID reader 205. RFID reader 205 optionally includes a digital-to-analog converter (DAC) 44.
  • The protocol processing module 40 is operably coupled to prepare data for encoding in accordance with a particular RFID standardized protocol. In an exemplary embodiment, the protocol processing module 40 is programmed with multiple RFID standardized protocols to enable the RFID reader 205 to communicate with any RFID tag, regardless of the particular protocol associated with the tag. In this embodiment, the protocol processing module 40 operates to program filters and other components of the encoding module 42, decoding module 52, pre-decoding module 50 and RF front end 46 in accordance with the particular RFID standardized protocol of the tag(s) currently communicating with the RFID reader 205. However, if a plurality of RFID tags 235 each operate in accordance with a single protocol, this flexibility can be omitted.
  • In operation, once the particular RFID standardized protocol has been selected for communication with one or more RFID tags, such as RFID tag 235, the protocol processing module 40 generates and provides digital data to be communicated to the RFID tag 235 to the encoding module 42 for encoding in accordance with the selected RFID standardized protocol. This digital data can include commands to power up the RFID tag 235, to read user data or other commands or data used by the RFID tag in association with its operation. By way of example, but not limitation, the RFID protocols may include one or more line encoding schemes, such as Manchester encoding, FM0 encoding, FM1 encoding, etc. Thereafter, in the embodiment shown, the digitally encoded data is provided to the digital-to-analog converter 44 which converts the digitally encoded data into an analog signal. The RF front-end 46 modulates the analog signal to produce an RF signal at a particular carrier frequency that is transmitted via antenna 60 to one or more RFID tags, such as RFID tag 235.
  • The RF front-end 46 further includes transmit blocking capabilities such that the energy of the transmitted RF signal does not substantially interfere with the receiving of a back-scattered or other RF signal received from one or more RFID tags via the antenna 60. Upon receiving an RF signal from one or more RFID tags, the RF front-end 46 converts the received RF signal into a baseband signal. The digitization module 48, which may be a limiting module or an analog-to-digital converter, converts the received baseband signal into a digital signal. The predecoding module 50 converts the digital signal into an encoded signal in accordance with the particular RFID protocol being utilized. The encoded data is provided to the decoding module 52, which recaptures data, such as user data 102 and/or orientation data 105 therefrom in accordance with the particular encoding scheme of the selected RFID protocol. The protocol processing module 40 processes the recovered data to identify the object(s) associated with the RFID tag(s) and/or provides the recovered data to the processing module 122 for further processing.
  • The processing module 40 may be a single processing device or a plurality of processing devices. Such a processing device may be a microprocessor, micro-controller, digital signal processor, microcomputer, central processing unit, field programmable gate array, programmable logic device, state machine, logic circuitry, analog circuitry, digital circuitry, and/or any device that manipulates signals (analog and/or digital) based on hard coding of the circuitry and/or operational instructions. The processing module may have an associated memory element, which may be a single memory device, a plurality of memory devices, and/or embedded circuitry of the processing module. Such a memory device may be a read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, cache memory, and/or any device that stores digital information. Note that when the processing module 40 implements one or more of its functions via a state machine, analog circuitry, digital circuitry, and/or logic circuitry, the memory element storing the corresponding operational instructions may be embedded within, or external to, the circuitry comprising the state machine, analog circuitry, digital circuitry, and/or logic circuitry.
• RFID tag 235 includes a power generating circuit 240, an oscillation module 244, a processing module 246, an oscillation calibration module 248, a comparator 250, an envelope detection module 252, a capacitor C1, and a transistor T1. The oscillation module 244, the processing module 246, the oscillation calibration module 248, the comparator 250, and the envelope detection module 252 may be a single processing device or a plurality of processing devices. Such a processing device may be a microprocessor, micro-controller, digital signal processor, microcomputer, central processing unit, field programmable gate array, programmable logic device, state machine, logic circuitry, analog circuitry, digital circuitry, and/or any device that manipulates signals (analog and/or digital) based on hard coding of the circuitry and/or operational instructions. One or more of the modules 244, 246, 248, 250, 252 may have an associated memory element, which may be a single memory device, a plurality of memory devices, and/or embedded circuitry of the module. Such a memory device may be a read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, cache memory, and/or any device that stores digital information. Note that when the modules 244, 246, 248, 250, 252 implement one or more of their functions via a state machine, analog circuitry, digital circuitry, and/or logic circuitry, the memory element storing the corresponding operational instructions may be embedded within, or external to, the circuitry comprising the state machine, analog circuitry, digital circuitry, and/or logic circuitry.
  • In operation, the power generating circuit 240 generates a supply voltage (VDD) from a radio frequency (RF) signal that is received via antenna 254. The power generating circuit 240 stores the supply voltage VDD in capacitor C1 and provides it to modules 244, 246, 248, 250, 252.
  • When the supply voltage VDD is present, the envelope detection module 252 determines an envelope of the RF signal, which includes a DC component corresponding to the supply voltage VDD. In one embodiment, the RF signal is an amplitude modulation signal, where the envelope of the RF signal includes transmitted data. The envelope detection module 252 provides an envelope signal to the comparator 250. The comparator 250 compares the envelope signal with a threshold to produce a stream of recovered data.
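• A hedged sketch of this envelope-detection and comparator path is shown below; the one-pole low-pass filter, the mean-based threshold, and the on-off-keyed test signal are illustrative assumptions rather than the circuit of FIG. 20.

```python
# Minimal sketch: take the magnitude of an amplitude-modulated sample stream,
# low-pass it to approximate the envelope, and slice against a threshold to
# produce the stream of recovered data bits.

import numpy as np

def recover_bits(rf_samples, samples_per_bit, alpha=0.05):
    envelope = np.zeros(len(rf_samples), dtype=float)
    level = 0.0
    for i, s in enumerate(np.abs(rf_samples)):
        # one-pole low-pass filter as a simple envelope detector
        level = (1 - alpha) * level + alpha * s
        envelope[i] = level
    threshold = envelope.mean()            # comparator threshold
    sliced = envelope > threshold
    # take one decision per bit period, at the middle of the period
    mids = np.arange(samples_per_bit // 2, len(sliced), samples_per_bit)
    return sliced[mids].astype(int)

# Example: an on-off-keyed carrier carrying the bits 1, 0, 1
spb, t = 200, np.arange(600)
carrier = np.sin(2 * np.pi * 0.1 * t)
amplitude = np.repeat([1.0, 0.2, 1.0], spb)
print(recover_bits(carrier * amplitude, spb))   # -> [1 0 1]
```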
  • The oscillation module 244, which may be a ring oscillator, crystal oscillator, or timing circuit, generates one or more clock signals that have a rate corresponding to the rate of the RF signal in accordance with an oscillation feedback signal. For instance, if the RF signal is a 900 MHz signal, the rate of the clock signals will be n*900 MHz, where “n” is equal to or greater than 1.
  • The oscillation calibration module 248 produces the oscillation feedback signal from a clock signal of the one or more clock signals and the stream of recovered data. In general, the oscillation calibration module 248 compares the rate of the clock signal with the rate of the stream of recovered data. Based on this comparison, the oscillation calibration module 248 generates the oscillation feedback to indicate to the oscillation module 244 to maintain the current rate, speed up the current rate, or slow down the current rate.
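• The calibration decision can be pictured with the following sketch; the tolerance and the example rates are assumptions used only for illustration.

```python
# Hedged sketch of the comparison described above: check the locally generated
# clock rate against the rate implied by the recovered data stream and return
# a correction direction for the oscillation module.

def oscillation_feedback(clock_rate_hz, recovered_data_rate_hz,
                         expected_cycles_per_bit, tolerance=0.01):
    """Return 'maintain', 'speed_up', or 'slow_down' for the oscillator."""
    expected_clock = recovered_data_rate_hz * expected_cycles_per_bit
    error = (clock_rate_hz - expected_clock) / expected_clock
    if abs(error) <= tolerance:
        return "maintain"
    return "slow_down" if error > 0 else "speed_up"

# e.g., a 910 MHz measured clock against data recovered at 64 kbps with
# 900e6/64e3 clock cycles expected per bit (illustrative numbers):
print(oscillation_feedback(910e6, 64e3, 900e6 / 64e3))   # -> 'slow_down'
```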
• The processing module 246 receives the stream of recovered data and a clock signal of the one or more clock signals. The processing module 246 interprets the stream of recovered data to determine a command or commands contained therein. The command may be to store data, update data, reply with stored data, verify command compliance, read user data, an acknowledgement, etc. If the command(s) requires a response, the processing module 246 provides a signal to the transistor T1 at a rate corresponding to the RF signal. The signal toggles transistor T1 on and off to generate an RF response signal that is transmitted via the antenna. In one embodiment, the RFID tag 235 utilizes back-scattering RF communication. Note that the resistor R1 functions to decouple the power generating circuit 240 from the received RF signals and the transmitted RF signals.
  • The RFID tag 235 may further include a current reference (not shown) that provides one or more reference, or bias, currents to the oscillation module 244, the oscillation calibration module 248, the envelope detection module 252, and the comparator 250. The bias current may be adjusted to provide a desired level of biasing for each of the modules 244, 248, 250, and 252.
• FIG. 21 is a schematic block diagram of a user's hand grasping a gaming object with a capacitive sensor in a first manner in accordance with an embodiment of the present invention. In this embodiment, gaming object 372, such as gaming object 110, includes a resistive or capacitive sensor 370 shown as a sensing strip. When a user grasps the gaming object 372 in his or her hand 99, the hand 99 comes in contact with the sensor 370.
• In an embodiment of the present invention, the sensor 370 can be a capacitive sensor that includes a layer that can store an electrical charge. When a user touches the sensor, a portion of the charge is transferred to the user, reducing the charge in the capacitive layer. The sensor 370 includes a driver that measures differences in charge from end to end of the strip to determine the amount and location of the touch, which can be output as user data, such as user data 102. In another embodiment, the sensor 370 can be a resistive sensor, such as a four- or five-wire touch pad or strip, or other resistive sensor.
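• A minimal sketch of how a strip sensor read out at both ends might report touch location and strength is given below; the two-ended readout and the ratio computation are assumptions for illustration, not the actual driver of sensor 370.

```python
# Minimal sketch, assuming a strip read out at both ends: when a finger draws
# charge from the capacitive layer, the share of current seen at each end
# indicates where along the strip the touch occurred, and the total indicates
# how strongly or broadly the strip is touched. Names are illustrative.

def read_touch(current_end_a, current_end_b, strip_length_mm=100.0):
    total = current_end_a + current_end_b
    if total <= 0.0:
        return None                       # no touch detected
    # the touch position follows the ratio of the two end currents
    position_mm = strip_length_mm * current_end_b / total
    return {"position_mm": position_mm, "touch_strength": total}

print(read_touch(0.6, 0.2))   # touch nearer end A
print(read_touch(0.1, 0.7))   # touch nearer end B
```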
• The sensor 370 can isolate biofeedback data such as a user's heart rate, a level of perspiration, or other biometric data that can be included in user data 102. In an embodiment of the present invention, the sensor 370 generates user data 102 that includes biofeedback data that can be used by game console 100 to adjust a game parameter of the gaming application based on the biofeedback data. For instance, in an adventure game, an excitement level of the game can be reduced in response to biofeedback indicating a heart rate or level of perspiration that is too high or increasing too rapidly. In another embodiment, the game can sense the fear of a user via biofeedback that indicates a high heart rate or level of perspiration. In a sports game, biofeedback can indicate a level of fatigue of the user based on heart rate or perspiration levels, and the game can take action to taunt the player in a light-hearted way or otherwise adjust the level of difficulty of the game based on the user's fatigue.
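• A hedged sketch of this kind of biofeedback-driven adjustment follows; the thresholds and the excitement scale are illustrative assumptions, not values from the specification.

```python
# Illustrative sketch of adjusting a game parameter from biofeedback, as in the
# adventure-game example above. Thresholds and the 0..1 'excitement_level'
# scale are assumptions made for the example.

def adjust_excitement(excitement_level, heart_rate_bpm, perspiration_level,
                      max_heart_rate_bpm=150, max_perspiration=0.8):
    """Reduce the excitement level when biofeedback indicates over-arousal."""
    if heart_rate_bpm > max_heart_rate_bpm or perspiration_level > max_perspiration:
        return max(0.0, excitement_level - 0.2)   # calm the game down
    return min(1.0, excitement_level + 0.05)      # otherwise ramp up slowly

level = 0.7
level = adjust_excitement(level, heart_rate_bpm=162, perspiration_level=0.4)
print(level)   # roughly 0.5 -> game reduces pacing, enemies, jump scares, etc.
```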
• In a further embodiment of the present invention, the user data 102 generated by the sensor 370 can indicate the manner in which the user grasps the gaming object, in terms of the level of tightness, the position of the hand on the gaming object 372, etc. Each of these parameters can be included in user data 102, and the game console 100 can adjust one or more game parameters in response.
• For instance, in a tennis game, the gaming object 372 may be used to simulate a tennis racquet in a user's hand. The game may attribute more power to the user's serve if the gaming object is held near one end, signifying greater simulated racquet extension during the serve. However, if the hand position is not shifted to a more normal position near the center of the gaming object for a ground stroke shot, a greater probability of a "miss-hit" shot can be attributed based on the user data 102. Similarly, in a baseball game, a bunt by a user may require the user to shift his or her hand position on the gaming controller to simulate "choking up" on the simulated bat.
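• The tennis example might map to game parameters roughly as sketched below; the grip-position scale, thresholds, and probabilities are illustrative assumptions rather than values from the specification.

```python
# Illustrative sketch of mapping grip position from user data 102 to game
# parameters in the tennis example: a grip near the end of the gaming object
# adds serve power, while keeping that grip for a ground stroke raises the
# chance of a "miss-hit".

def tennis_shot_parameters(grip_position, shot_type):
    """grip_position: 0.0 = end of object (long grip), 1.0 = center grip."""
    params = {"power_bonus": 0.0, "miss_hit_probability": 0.05}
    if shot_type == "serve" and grip_position < 0.25:
        params["power_bonus"] = 0.3            # extra simulated racquet extension
    if shot_type == "ground_stroke" and grip_position < 0.25:
        params["miss_hit_probability"] = 0.35  # grip not shifted back to center
    return params

print(tennis_shot_parameters(0.1, "serve"))
print(tennis_shot_parameters(0.1, "ground_stroke"))
```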
  • It should be noted that these examples are merely illustrative of the many possible applications of the use of user data 102 generated in the context of a game.
• FIG. 22 is a schematic block diagram of a user's hand grasping a gaming object with a capacitive sensor in a second manner in accordance with an embodiment of the present invention. As compared with FIG. 21, the user's hand 99 is in a different position on the gaming object 372, covering more of the sensor 370. As discussed in conjunction with FIG. 21, this change in the manner in which the gaming object 372 is grasped can be indicated via user data 102 and used to adjust one or more parameters of a game.
• FIG. 23 is a flowchart representation of a method in accordance with an embodiment of the present invention. In particular, a method is presented for use in conjunction with one or more functions and features presented in conjunction with FIGS. 1-22. In step 400, biofeedback data is generated in response to an action of a user. In step 402, an RF signal is sent to a game device, wherein the RF signal indicates the biofeedback data. In step 404, a gaming application is executed based on the biofeedback data.
• In an embodiment of the present invention, the biofeedback data includes image data corresponding to an image of the user. The gaming application can recognize the user based on the image data. The biofeedback data can also include voice data generated by the user that is used to recognize the user based on the voice data and/or to recognize a game command based on the voice data.
  • In an embodiment of the present invention, the biofeedback data can indicate a heart rate of the user and/or a perspiration level of the user. The gaming application can adjust a game parameter based on the biofeedback data. Further, the biofeedback data can indicate a manner in which the user grasps the gaming object.
• As may be used herein, the terms "substantially" and "approximately" provide an industry-accepted tolerance for the corresponding term and/or relativity between items. Such an industry-accepted tolerance ranges from less than one percent to fifty percent and corresponds to, but is not limited to, component values, integrated circuit process variations, temperature variations, rise and fall times, and/or thermal noise. Such relativity between items ranges from a difference of a few percent to magnitude differences. As may also be used herein, the terms "coupled to" and/or "coupling" include direct coupling between items and/or indirect coupling between items via an intervening item (e.g., an item includes, but is not limited to, a component, an element, a circuit, and/or a module) where, for indirect coupling, the intervening item does not modify the information of a signal but may adjust its current level, voltage level, and/or power level. As may further be used herein, inferred coupling (i.e., where one element is coupled to another element by inference) includes direct and indirect coupling between two items in the same manner as "coupled to". As may even further be used herein, the term "operable to" indicates that an item includes one or more of power connections, input(s), output(s), etc., to perform one or more of its corresponding functions and may further include inferred coupling to one or more other items. As may still further be used herein, the term "associated with" includes direct and/or indirect coupling of separate items and/or one item being embedded within another item. As may be used herein, the term "compares favorably" indicates that a comparison between two or more items, signals, etc., provides a desired relationship. For example, when the desired relationship is that signal 1 has a greater magnitude than signal 2, a favorable comparison may be achieved when the magnitude of signal 1 is greater than that of signal 2 or when the magnitude of signal 2 is less than that of signal 1.
  • While the transistors in the above described figure(s) is/are shown as field effect transistors (FETs), as one of ordinary skill in the art will appreciate, the transistors may be implemented using any type of transistor structure including, but not limited to, bipolar, metal oxide semiconductor field effect transistors (MOSFET), N-well transistors, P-well transistors, enhancement mode, depletion mode, and zero voltage threshold (VT) transistors.
  • The present invention has also been described above with the aid of method steps illustrating the performance of specified functions and relationships thereof. The boundaries and sequence of these functional building blocks and method steps have been arbitrarily defined herein for convenience of description. Alternate boundaries and sequences can be defined so long as the specified functions and relationships are appropriately performed. Any such alternate boundaries or sequences are thus within the scope and spirit of the claimed invention.
  • The present invention has been described above with the aid of functional building blocks illustrating the performance of certain significant functions. The boundaries of these functional building blocks have been arbitrarily defined for convenience of description. Alternate boundaries could be defined as long as the certain significant functions are appropriately performed. Similarly, flow diagram blocks may also have been arbitrarily defined herein to illustrate certain significant functionality. To the extent used, the flow diagram block boundaries and sequence could have been defined otherwise and still perform the certain significant functionality. Such alternate definitions of both functional building blocks and flow diagram blocks and sequences are thus within the scope and spirit of the claimed invention. One of average skill in the art will also recognize that the functional building blocks, and other illustrative blocks, modules and components herein, can be implemented as illustrated or by discrete components, application specific integrated circuits, processors executing appropriate software and the like or any combination thereof.

Claims (20)

1. A gaming object comprising:
a sensor that generates biofeedback data in response to an action of a user; and
a transceiver coupled to send an RF signal to a game device, wherein the RF signal indicates the biofeedback data;
wherein the game device executes a gaming application that is based on the biofeedback data.
2. The gaming object of claim 1 wherein the sensor includes an image sensor and the biofeedback data includes image data corresponding to an image of the user.
3. The gaming object of claim 2 wherein the game device recognizes the user based on the image data.
4. The gaming object of claim 1 wherein the sensor includes a microphone and the biofeedback data includes voice data generated by the user.
5. The gaming object of claim 4 wherein the game device recognizes the user based on the voice data.
6. The gaming object of claim 4 wherein the game device recognizes a game command based on the voice data.
7. The gaming object of claim 1 wherein the sensor includes a heart rate sensor and the biofeedback data indicates a heart rate of the user.
8. The gaming object of claim 1 wherein the sensor includes a perspiration sensor and the biofeedback data indicates a perspiration level of the user.
9. The gaming object of claim 1 wherein the game device adjusts a game parameter of the gaming application based on the biofeedback data.
10. The gaming object of claim 1 wherein the sensor generates biofeedback data that indicates a manner in which a user grasps the gaming object.
11. A method for use in a gaming system, the method comprising:
generating biofeedback data in response to an action of a user;
sending an RF signal to a game device, wherein the RF signal indicates the biofeedback data; and
executing a gaming application based on the biofeedback data.
12. The method of claim 11 wherein the biofeedback data includes image data corresponding to an image of the user.
13. The method of claim 12 wherein the gaming application recognizes the user based on the image data.
14. The method of claim 11 wherein the biofeedback data includes voice data generated by the user.
15. The method of claim 14 wherein the gaming application recognizes the user based on the voice data.
16. The method of claim 14 wherein the gaming application recognizes a game command based on the voice data.
17. The method of claim 11 wherein the biofeedback data indicates a heart rate of the user.
18. The method of claim 11 wherein the biofeedback data indicates a perspiration level of the user.
19. The method of claim 11 wherein the gaming application adjusts a game parameter based on the biofeedback data.
20. The method of claim 11 wherein the biofeedback data indicates a manner in which the user grasps the gaming object.
US12/131,605 2007-06-22 2008-06-02 Gaming object with biofeedback sensor for interacting with a gaming application and methods for use therewith Abandoned US20080318673A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/131,605 US20080318673A1 (en) 2007-06-22 2008-06-02 Gaming object with biofeedback sensor for interacting with a gaming application and methods for use therewith

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US93672407P 2007-06-22 2007-06-22
US12/131,605 US20080318673A1 (en) 2007-06-22 2008-06-02 Gaming object with biofeedback sensor for interacting with a gaming application and methods for use therewith

Publications (1)

Publication Number Publication Date
US20080318673A1 true US20080318673A1 (en) 2008-12-25

Family

ID=40135930

Family Applications (26)

Application Number Title Priority Date Filing Date
US12/125,154 Abandoned US20090017910A1 (en) 2007-01-31 2008-05-22 Position and motion tracking of an object
US12/128,797 Abandoned US20080318689A1 (en) 2007-06-22 2008-05-29 Local positioning system and video game applications thereof
US12/128,810 Expired - Fee Related US8031121B2 (en) 2007-06-22 2008-05-29 Apparatus for position detection using multiple antennas
US12/128,785 Expired - Fee Related US7973702B2 (en) 2007-06-22 2008-05-29 Apparatus for position detection using multiple HCF transmissions
US12/131,605 Abandoned US20080318673A1 (en) 2007-06-22 2008-06-02 Gaming object with biofeedback sensor for interacting with a gaming application and methods for use therewith
US12/131,579 Active 2029-08-10 US8160640B2 (en) 2007-06-22 2008-06-02 Multi-mode mobile communication device with motion sensor and methods for use therewith
US12/131,480 Abandoned US20080318680A1 (en) 2007-06-22 2008-06-02 Gaming object and gaming console that communicate user data via backscattering and methods for use therewith
US12/131,331 Active 2034-07-30 US9523767B2 (en) 2007-06-22 2008-06-02 Game console and gaming object with motion prediction modeling and methods for use therewith
US12/131,522 Active 2032-10-21 US9547080B2 (en) 2007-06-22 2008-06-02 Gaming object with orientation sensor for interacting with a display and methods for use therewith
US12/131,550 Abandoned US20080318625A1 (en) 2007-06-22 2008-06-02 Mobile communication device with gaming mode and methods for use therewith
US12/135,332 Abandoned US20080316324A1 (en) 2007-06-22 2008-06-09 Position detection and/or movement tracking via image capture and processing
US12/135,341 Active 2029-06-27 US7952962B2 (en) 2007-06-22 2008-06-09 Directional microphone or microphones for position determination
US12/137,143 Active 2033-04-24 US9417320B2 (en) 2007-06-22 2008-06-11 Game device that generates a display with a simulated body image and methods for use therewith
US12/136,939 Abandoned US20080318682A1 (en) 2007-06-22 2008-06-11 Dual transceiver gaming console interface and methods for use therewith
US12/137,747 Active 2031-09-01 US8628417B2 (en) 2007-06-22 2008-06-12 Game device with wireless position measurement and methods for use therewith
US12/142,702 Abandoned US20080318595A1 (en) 2007-06-22 2008-06-19 Position location system using multiple position location techniques
US12/142,032 Active 2030-09-22 US8062133B2 (en) 2007-06-22 2008-06-19 Positioning within a video gaming environment using RF signals
US12/142,064 Abandoned US20080318683A1 (en) 2007-06-22 2008-06-19 RFID based positioning system
US12/142,733 Abandoned US20080318684A1 (en) 2007-06-22 2008-06-19 Position location system using multiple position location techniques
US13/223,121 Expired - Fee Related US8289212B2 (en) 2007-06-22 2011-08-31 Apparatus for position detection using multiple antennas
US13/361,333 Active US8311579B2 (en) 2007-06-22 2012-01-30 Multi-mode mobile communication device with motion sensor and methods for use therewith
US13/592,804 Abandoned US20120315991A1 (en) 2007-06-22 2012-08-23 Apparatus position detection using multiple antennas
US13/627,360 Active US8676257B2 (en) 2007-06-22 2012-09-26 Multi-mode mobile communication device with motion sensor and methods for use therewith
US15/346,254 Active US9943760B2 (en) 2007-06-22 2016-11-08 Game console and gaming object with motion prediction modeling and methods for use therewith
US15/346,418 Active 2028-07-07 US10549195B2 (en) 2007-06-22 2016-11-08 Gaming object with orientation sensor for interacting with a display and methods for use therewith
US16/730,166 Active 2029-03-21 US11426660B2 (en) 2007-06-22 2019-12-30 Gaming object with orientation sensor for interacting with a display and methods for use therewith

Family Applications Before (4)

Application Number Title Priority Date Filing Date
US12/125,154 Abandoned US20090017910A1 (en) 2007-01-31 2008-05-22 Position and motion tracking of an object
US12/128,797 Abandoned US20080318689A1 (en) 2007-06-22 2008-05-29 Local positioning system and video game applications thereof
US12/128,810 Expired - Fee Related US8031121B2 (en) 2007-06-22 2008-05-29 Apparatus for position detection using multiple antennas
US12/128,785 Expired - Fee Related US7973702B2 (en) 2007-06-22 2008-05-29 Apparatus for position detection using multiple HCF transmissions

Family Applications After (21)

Application Number Title Priority Date Filing Date
US12/131,579 Active 2029-08-10 US8160640B2 (en) 2007-06-22 2008-06-02 Multi-mode mobile communication device with motion sensor and methods for use therewith
US12/131,480 Abandoned US20080318680A1 (en) 2007-06-22 2008-06-02 Gaming object and gaming console that communicate user data via backscattering and methods for use therewith
US12/131,331 Active 2034-07-30 US9523767B2 (en) 2007-06-22 2008-06-02 Game console and gaming object with motion prediction modeling and methods for use therewith
US12/131,522 Active 2032-10-21 US9547080B2 (en) 2007-06-22 2008-06-02 Gaming object with orientation sensor for interacting with a display and methods for use therewith
US12/131,550 Abandoned US20080318625A1 (en) 2007-06-22 2008-06-02 Mobile communication device with gaming mode and methods for use therewith
US12/135,332 Abandoned US20080316324A1 (en) 2007-06-22 2008-06-09 Position detection and/or movement tracking via image capture and processing
US12/135,341 Active 2029-06-27 US7952962B2 (en) 2007-06-22 2008-06-09 Directional microphone or microphones for position determination
US12/137,143 Active 2033-04-24 US9417320B2 (en) 2007-06-22 2008-06-11 Game device that generates a display with a simulated body image and methods for use therewith
US12/136,939 Abandoned US20080318682A1 (en) 2007-06-22 2008-06-11 Dual transceiver gaming console interface and methods for use therewith
US12/137,747 Active 2031-09-01 US8628417B2 (en) 2007-06-22 2008-06-12 Game device with wireless position measurement and methods for use therewith
US12/142,702 Abandoned US20080318595A1 (en) 2007-06-22 2008-06-19 Position location system using multiple position location techniques
US12/142,032 Active 2030-09-22 US8062133B2 (en) 2007-06-22 2008-06-19 Positioning within a video gaming environment using RF signals
US12/142,064 Abandoned US20080318683A1 (en) 2007-06-22 2008-06-19 RFID based positioning system
US12/142,733 Abandoned US20080318684A1 (en) 2007-06-22 2008-06-19 Position location system using multiple position location techniques
US13/223,121 Expired - Fee Related US8289212B2 (en) 2007-06-22 2011-08-31 Apparatus for position detection using multiple antennas
US13/361,333 Active US8311579B2 (en) 2007-06-22 2012-01-30 Multi-mode mobile communication device with motion sensor and methods for use therewith
US13/592,804 Abandoned US20120315991A1 (en) 2007-06-22 2012-08-23 Apparatus position detection using multiple antennas
US13/627,360 Active US8676257B2 (en) 2007-06-22 2012-09-26 Multi-mode mobile communication device with motion sensor and methods for use therewith
US15/346,254 Active US9943760B2 (en) 2007-06-22 2016-11-08 Game console and gaming object with motion prediction modeling and methods for use therewith
US15/346,418 Active 2028-07-07 US10549195B2 (en) 2007-06-22 2016-11-08 Gaming object with orientation sensor for interacting with a display and methods for use therewith
US16/730,166 Active 2029-03-21 US11426660B2 (en) 2007-06-22 2019-12-30 Gaming object with orientation sensor for interacting with a display and methods for use therewith

Country Status (1)

Country Link
US (26) US20090017910A1 (en)

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090233710A1 (en) * 2007-03-12 2009-09-17 Roberts Thomas J Feedback gaming peripheral
US20100105478A1 (en) * 2008-10-18 2010-04-29 Hallaian Stephen C Mind-control toys and methods of interaction therewith
US20100332842A1 (en) * 2009-06-30 2010-12-30 Yahoo! Inc. Determining a mood of a user based on biometric characteristic(s) of the user in an online system
US20110028218A1 (en) * 2009-08-03 2011-02-03 Realta Entertainment Group Systems and Methods for Wireless Connectivity of a Musical Instrument
US20110063208A1 (en) * 2008-05-09 2011-03-17 Koninklijke Philips Electronics N.V. Method and system for conveying an emotion
US20110293144A1 (en) * 2009-02-02 2011-12-01 Agency For Science, Technology And Research Method and System for Rendering an Entertainment Animation
US20140243093A1 (en) * 2013-02-28 2014-08-28 Steelseries Aps Method and apparatus for monitoring and calibrating performances of gamers
US8835736B2 (en) 2007-02-20 2014-09-16 Ubisoft Entertainment Instrument game system and method
US8907193B2 (en) 2007-02-20 2014-12-09 Ubisoft Entertainment Instrument game system and method
US20150019153A1 (en) * 2013-07-12 2015-01-15 Facebook, Inc. Calibration of Grab Detection
US20150040149A1 (en) * 2012-10-14 2015-02-05 Ari M. Frank Reducing transmissions of measurements of affective response by identifying actions that imply emotional response
US8986090B2 (en) 2008-11-21 2015-03-24 Ubisoft Entertainment Interactive guitar game designed for learning to play the guitar
US20160144278A1 (en) * 2010-06-07 2016-05-26 Affectiva, Inc. Affect usage within a gaming context
US9669297B1 (en) * 2013-09-18 2017-06-06 Aftershock Services, Inc. Using biometrics to alter game content
US20180082151A1 (en) * 2008-08-22 2018-03-22 International Business Machines Corporation System and method for virtual world biometric analytics through the use of a multimodal biometric analytic wallet
WO2018150162A1 (en) * 2017-02-14 2018-08-23 Sony Interactive Entertainment Europe Limited Sensing apparatus and method
US10427042B2 (en) * 2009-07-10 2019-10-01 Valve Corporation Player biofeedback for dynamically controlling a video game state
US20210394068A1 (en) * 2020-06-23 2021-12-23 Nintendo Co., Ltd. Computer-readable non-transitory storage medium having instructions stored therein, game apparatus, game system, and game processing method
US11253781B2 (en) 2009-07-10 2022-02-22 Valve Corporation Player biofeedback for dynamically controlling a video game state
US20220116721A1 (en) * 2013-10-09 2022-04-14 Voyetra Turtle Beach, Inc. Audio Alerts In A Wireless Device

Families Citing this family (510)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8915859B1 (en) * 2004-09-28 2014-12-23 Impact Sports Technologies, Inc. Monitoring device, system and method for a multi-player interactive game
US8835616B2 (en) * 2004-12-07 2014-09-16 Lanthiopep B.V. Methods for the production and secretion of modified peptides
EP2296079A3 (en) * 2005-10-26 2011-04-13 Sony Computer Entertainment Inc. System and method for interfacing with a computer program
US7636645B1 (en) 2007-06-18 2009-12-22 Ailive Inc. Self-contained inertial navigation system for interactive control using movable controllers
US9405372B2 (en) 2006-07-14 2016-08-02 Ailive, Inc. Self-contained inertial navigation system for interactive control using movable controllers
US7702608B1 (en) 2006-07-14 2010-04-20 Ailive, Inc. Generating motion recognizers for arbitrary motions for video games and tuning the motion recognizers to the end user
KR101299682B1 (en) * 2006-10-16 2013-08-22 삼성전자주식회사 Universal input device
US8344949B2 (en) * 2008-03-31 2013-01-01 Golba Llc Wireless positioning approach using time-delay of signals with a known transmission pattern
US8462109B2 (en) 2007-01-05 2013-06-11 Invensense, Inc. Controlling and accessing content using motion processing on mobile devices
US8952832B2 (en) 2008-01-18 2015-02-10 Invensense, Inc. Interfacing application programs and motion sensors of a device
US7934423B2 (en) 2007-12-10 2011-05-03 Invensense, Inc. Vertically integrated 3-axis MEMS angular accelerometer with integrated electronics
US8250921B2 (en) 2007-07-06 2012-08-28 Invensense, Inc. Integrated motion processing unit (MPU) with MEMS inertial sensing and embedded digital electronics
US8139945B1 (en) 2007-01-20 2012-03-20 Centrak, Inc. Methods and systems for synchronized infrared real time location
US7636697B1 (en) 2007-01-29 2009-12-22 Ailive Inc. Method and system for rapid evaluation of logical expressions
US20090017910A1 (en) * 2007-06-22 2009-01-15 Broadcom Corporation Position and motion tracking of an object
US8284822B2 (en) * 2007-02-27 2012-10-09 Broadcom Corporation Method and system for utilizing direct digital frequency synthesis to process signals in multi-band applications
US20080205545A1 (en) * 2007-02-28 2008-08-28 Ahmadreza Rofougaran Method and System for Using a Phase Locked Loop for Upconversion in a Wideband Crystalless Polar Transmitter
US7826550B2 (en) * 2007-02-28 2010-11-02 Broadcom Corp. Method and system for a high-precision frequency generator using a direct digital frequency synthesizer for transmitters and receivers
US20080205550A1 (en) * 2007-02-28 2008-08-28 Ahmadreza Rofougaran Method and System for Using a Phase Locked Loop for Upconversion in a Wideband Polar Transmitter
US8116387B2 (en) * 2007-03-01 2012-02-14 Broadcom Corporation Method and system for a digital polar transmitter
US7894830B2 (en) * 2007-04-28 2011-02-22 Broadcom Corporation Motion adaptive wireless local area network, wireless communications device and integrated circuits for use therewith
US8064923B2 (en) * 2007-04-28 2011-11-22 Broadcom Corporation Wireless communications device and integrated circuits with global positioning and method for use therewith
JP4438825B2 (en) * 2007-05-29 2010-03-24 ソニー株式会社 Arrival angle estimation system, communication apparatus, and communication system
US20080299906A1 (en) * 2007-06-04 2008-12-04 Topway Electrical Appliance Company Emulating playing apparatus of simulating games
US7912449B2 (en) * 2007-06-14 2011-03-22 Broadcom Corporation Method and system for 60 GHz location determination and coordination of WLAN/WPAN/GPS multimode devices
US8238832B1 (en) * 2007-08-28 2012-08-07 Marvell International Ltd. Antenna optimum beam forming for multiple protocol coexistence on a wireless device
US8591430B2 (en) 2007-09-14 2013-11-26 Corventis, Inc. Adherent device for respiratory monitoring
US20090076345A1 (en) 2007-09-14 2009-03-19 Corventis, Inc. Adherent Device with Multiple Physiological Sensors
WO2009036306A1 (en) 2007-09-14 2009-03-19 Corventis, Inc. Adherent cardiac monitor with advanced sensing capabilities
US9186089B2 (en) 2007-09-14 2015-11-17 Medtronic Monitoring, Inc. Injectable physiological monitoring system
WO2009036316A1 (en) 2007-09-14 2009-03-19 Corventis, Inc. Energy management, tracking and security for adherent patient monitor
EP2195102A1 (en) * 2007-09-14 2010-06-16 Christian Züger A system for capturing tennis match data
WO2009036348A1 (en) 2007-09-14 2009-03-19 Corventis, Inc. Medical device automatic start-up upon contact to patient tissue
US8790257B2 (en) 2007-09-14 2014-07-29 Corventis, Inc. Multi-sensor patient monitor to detect impending cardiac decompensation
US8727881B2 (en) * 2007-09-25 2014-05-20 Wms Gaming Inc. Accessing wagering game services by aiming handheld device at external device
KR101187909B1 (en) * 2007-10-04 2012-10-05 삼성테크윈 주식회사 Surveillance camera system
JP5116424B2 (en) * 2007-10-09 2013-01-09 任天堂株式会社 GAME PROGRAM AND GAME DEVICE
US8206325B1 (en) 2007-10-12 2012-06-26 Biosensics, L.L.C. Ambulatory system for measuring and monitoring physical activity and risk of falling and for automatic fall detection
US7842875B2 (en) * 2007-10-19 2010-11-30 Sony Computer Entertainment America Inc. Scheme for providing audio effects for a musical instrument and for controlling images with same
US20090221368A1 (en) * 2007-11-28 2009-09-03 Ailive Inc., Method and system for creating a shared game space for a networked game
JP5411425B2 (en) * 2007-12-25 2014-02-12 任天堂株式会社 GAME PROGRAM, GAME DEVICE, GAME SYSTEM, AND GAME PROCESSING METHOD
US9020780B2 (en) * 2007-12-31 2015-04-28 The Nielsen Company (Us), Llc Motion detector module
US8604365B2 (en) * 2008-01-03 2013-12-10 Qualcomm Incorporated Ultrasonic digitizer and host
US9007178B2 (en) 2008-02-14 2015-04-14 Intermec Ip Corp. Utilization of motion and spatial identification in RFID systems
US9047522B1 (en) * 2008-02-14 2015-06-02 Intermec Ip Corp. Utilization of motion and spatial identification in mobile RFID interrogator
US8994504B1 (en) 2008-02-14 2015-03-31 Intermec Ip Corp. Utilization of motion and spatial identification in mobile RFID interrogator
JP5405500B2 (en) 2008-03-12 2014-02-05 コーヴェンティス,インク. Predicting cardiac decompensation based on cardiac rhythm
JP5039950B2 (en) * 2008-03-21 2012-10-03 インターナショナル・ビジネス・マシーンズ・コーポレーション Object movement control system, object movement control method, server, and computer program
US20090258703A1 (en) * 2008-04-09 2009-10-15 Aaron Philip Brunstetter Motion Assessment Using a Game Controller
US8412317B2 (en) 2008-04-18 2013-04-02 Corventis, Inc. Method and apparatus to measure bioelectric impedance of patient tissue
US8508219B2 (en) * 2008-04-30 2013-08-13 National Institute Of Advanced Industrial Science And Technology Object state detection apparatus and method
US8120354B2 (en) * 2008-05-01 2012-02-21 Broadband Discovery Systems, Inc. Self-calibrating magnetic field monitor
US8242888B2 (en) 2008-06-05 2012-08-14 Keystone Technology Solutions, Llc Systems and methods to determine motion parameters using RFID tags
US8461966B2 (en) 2008-06-05 2013-06-11 Micron Technology, Inc. Systems and methods to determine kinematical parameters using RFID tags
US8830062B2 (en) * 2008-06-05 2014-09-09 Micron Technology, Inc. Systems and methods to use radar in RFID systems
US9844730B1 (en) * 2008-06-16 2017-12-19 Disney Enterprises, Inc. Method and apparatus for an interactive dancing video game
US8483623B2 (en) * 2008-06-19 2013-07-09 Broadcom Corporation Method and system for frequency-shift based PCB-to-PCB communications
GB2461578A (en) 2008-07-04 2010-01-06 Bombardier Transp Gmbh Transferring electric energy to a vehicle
GB2461577A (en) 2008-07-04 2010-01-06 Bombardier Transp Gmbh System and method for transferring electric energy to a vehicle
US8655622B2 (en) * 2008-07-05 2014-02-18 Ailive, Inc. Method and apparatus for interpreting orientation invariant motion
GB2463692A (en) 2008-09-19 2010-03-24 Bombardier Transp Gmbh An arrangement for providing a vehicle with electric energy
GB2463693A (en) 2008-09-19 2010-03-24 Bombardier Transp Gmbh A system for transferring electric energy to a vehicle
CN102171726B (en) * 2008-10-01 2014-06-18 索尼电脑娱乐公司 Information processing device, information processing method, program, and information storage medium
CN101726738B (en) * 2008-10-30 2012-12-26 日电(中国)有限公司 Multi-target positioning system and multiple-access control method based on power control
US7855683B2 (en) * 2008-11-04 2010-12-21 At&T Intellectual Property I, L.P. Methods and apparatuses for GPS coordinates extrapolation when GPS signals are not available
US20100122278A1 (en) * 2008-11-13 2010-05-13 Alfred Xueliang Xin Method and an automated direction following system
EP2192696B1 (en) * 2008-11-28 2014-12-31 Sequans Communications Wireless communications method and system with spatial multiplexing using dually polarized antennas and corresponding receiver
US8588805B2 (en) * 2008-12-13 2013-11-19 Broadcom Corporation Receiver utilizing multiple radiation patterns to determine angular position
JP2010152493A (en) * 2008-12-24 2010-07-08 Sony Corp Input device, control apparatus, and control method for the input device
US20100177076A1 (en) * 2009-01-13 2010-07-15 Metrologic Instruments, Inc. Edge-lit electronic-ink display device for use in indoor and outdoor environments
US8457013B2 (en) 2009-01-13 2013-06-04 Metrologic Instruments, Inc. Wireless dual-function network device dynamically switching and reconfiguring from a wireless network router state of operation into a wireless network coordinator state of operation in a wireless communication network
US20100177749A1 (en) * 2009-01-13 2010-07-15 Metrologic Instruments, Inc. Methods of and apparatus for programming and managing diverse network components, including electronic-ink based display devices, in a mesh-type wireless communication network
KR101742583B1 (en) * 2009-01-27 2017-06-01 엑스와이지 인터랙티브 테크놀로지스 아이엔씨. A method and apparatus for ranging finding, orienting, and/or positioning of single and/or multiple devices
US9652030B2 (en) 2009-01-30 2017-05-16 Microsoft Technology Licensing, Llc Navigation of a virtual plane using a zone of restriction for canceling noise
US8254964B2 (en) * 2009-02-23 2012-08-28 Sony Ericsson Mobile Communications Ab Method and arrangement relating to location based services for a communication device
US8311506B2 (en) * 2009-02-26 2012-11-13 Broadcom Corporation RFID receiver front end with phase cancellation and methods for use therewith
KR100999711B1 (en) 2009-03-09 2010-12-08 광주과학기술원 Apparatus for real-time calibrating in the collaboration system and method using the same
JP5287385B2 (en) * 2009-03-13 2013-09-11 オムロン株式会社 Measuring device
US8725156B2 (en) * 2009-04-02 2014-05-13 Honeywell International Inc. Methods for supporting mobile nodes in industrial control and automation systems and other systems and related apparatus
JP2010245796A (en) 2009-04-06 2010-10-28 Sony Corp Video display and method, video display system, and program
US20120121128A1 (en) * 2009-04-20 2012-05-17 Bent 360: Medialab Inc. Object tracking system
US8953029B2 (en) * 2009-05-08 2015-02-10 Sony Computer Entertainment America Llc Portable device interaction via motion sensitive controller
US8417264B1 (en) * 2009-05-14 2013-04-09 Spring Spectrum L.P. Method and apparatus for determining location of a mobile station based on locations of multiple nearby mobile stations
KR100979623B1 (en) * 2009-05-27 2010-09-01 서울대학교산학협력단 Positioning system and method based on radio communication appratus comprising multiple antenna
US20100304931A1 (en) * 2009-05-27 2010-12-02 Stumpf John F Motion capture system
US9383823B2 (en) * 2009-05-29 2016-07-05 Microsoft Technology Licensing, Llc Combining gestures beyond skeletal
CN101898042B (en) * 2009-05-31 2012-07-18 鸿富锦精密工业(深圳)有限公司 Game controller and control method thereof
US8676659B1 (en) * 2009-07-23 2014-03-18 Bank Of America Corporation Methods and apparatuses for facilitating financial transactions using gamer tag information
US20110025464A1 (en) * 2009-07-30 2011-02-03 Awarepoint Corporation Antenna Diversity For Wireless Tracking System And Method
KR20110012584A (en) * 2009-07-31 2011-02-09 삼성전자주식회사 Apparatus and method for estimating position by ultrasonic signal
CN102022979A (en) * 2009-09-21 2011-04-20 鸿富锦精密工业(深圳)有限公司 Three-dimensional optical sensing system
US8581773B1 (en) * 2009-10-15 2013-11-12 The Boeing Company Dual frequency transmitter
WO2011050283A2 (en) 2009-10-22 2011-04-28 Corventis, Inc. Remote detection and monitoring of functional chronotropic incompetence
JP5326989B2 (en) * 2009-10-26 2013-10-30 セイコーエプソン株式会社 Optical position detection device and display device with position detection function
JP2011090604A (en) * 2009-10-26 2011-05-06 Seiko Epson Corp Optical position detection apparatus and display device with position detection function
JP5493702B2 (en) * 2009-10-26 2014-05-14 セイコーエプソン株式会社 Projection display with position detection function
JP2011099994A (en) 2009-11-06 2011-05-19 Seiko Epson Corp Projection display device with position detecting function
US8535133B2 (en) * 2009-11-16 2013-09-17 Broadcom Corporation Video game with controller sensing player inappropriate activity
US8429269B2 (en) * 2009-12-09 2013-04-23 Sony Computer Entertainment Inc. Server-side rendering
US9451897B2 (en) 2009-12-14 2016-09-27 Medtronic Monitoring, Inc. Body adherent patch with electronics for physiologic monitoring
US20110148884A1 (en) * 2009-12-17 2011-06-23 Charles Timberlake Zeleny System and method for determining motion of a subject
US8497902B2 (en) * 2009-12-18 2013-07-30 Sony Computer Entertainment Inc. System for locating a display device using a camera on a portable device and a sensor on a gaming console and method thereof
CN102109594B (en) * 2009-12-28 2014-04-30 深圳富泰宏精密工业有限公司 System and method for sensing and notifying voice
US9019149B2 (en) 2010-01-05 2015-04-28 The Invention Science Fund I, Llc Method and apparatus for measuring the motion of a person
US8884813B2 (en) * 2010-01-05 2014-11-11 The Invention Science Fund I, Llc Surveillance of stress conditions of persons using micro-impulse radar
US9024814B2 (en) 2010-01-05 2015-05-05 The Invention Science Fund I, Llc Tracking identities of persons using micro-impulse radar
US20110166937A1 (en) * 2010-01-05 2011-07-07 Searete Llc Media output with micro-impulse radar feedback of physiological response
US20110166940A1 (en) * 2010-01-05 2011-07-07 Searete Llc Micro-impulse radar detection of a human demographic and delivery of targeted media content
US9069067B2 (en) * 2010-09-17 2015-06-30 The Invention Science Fund I, Llc Control of an electronic apparatus using micro-impulse radar
WO2011090886A2 (en) * 2010-01-25 2011-07-28 Rambus Inc. Directional beam steering system and method to detect location and motion
US9104238B2 (en) * 2010-02-12 2015-08-11 Broadcom Corporation Systems and methods for providing enhanced motion detection
EP2540065B1 (en) 2010-02-23 2017-01-18 Telefonaktiebolaget LM Ericsson (publ) Communication performance guidance in a user terminal
US8884741B2 (en) 2010-02-24 2014-11-11 Sportvision, Inc. Tracking system
US8979665B1 (en) 2010-03-22 2015-03-17 Bijan Najafi Providing motion feedback based on user center of mass
US8523667B2 (en) * 2010-03-29 2013-09-03 Microsoft Corporation Parental control settings based on body dimensions
US8965498B2 (en) 2010-04-05 2015-02-24 Corventis, Inc. Method and apparatus for personalized physiologic parameters
US8711571B2 (en) * 2010-04-09 2014-04-29 Shenzhen Netcom Electronics Co., Ltd. Portable multimedia player
US20110275434A1 (en) * 2010-05-04 2011-11-10 Mediatek Inc. Methods for controlling a process of a game and electronic devices utilizing the same
JP5700758B2 (en) * 2010-05-19 2015-04-15 任天堂株式会社 GAME DEVICE, GAME PROGRAM, GAME SYSTEM, AND GAME PROCESSING METHOD
US8428394B2 (en) 2010-05-25 2013-04-23 Marcus KRIETER System and method for resolving spatial orientation using intelligent optical selectivity
US20110298887A1 (en) * 2010-06-02 2011-12-08 Maglaque Chad L Apparatus Using an Accelerometer to Capture Photographic Images
US8593331B2 (en) * 2010-06-16 2013-11-26 Qualcomm Incorported RF ranging-assisted local motion sensing
US8537847B2 (en) * 2010-06-22 2013-09-17 Sony Corporation Digital clock with internet connectivity and multiple resting orientations
US8174934B2 (en) * 2010-07-28 2012-05-08 Empire Technology Development Llc Sound direction detection
US9167975B1 (en) * 2010-07-28 2015-10-27 Impact Sports Technologies, Inc. Motion resistant device to monitor heart rate in ambulatory patients
US20120212374A1 (en) * 2010-08-17 2012-08-23 Qualcomm Incorporated Method and apparatus for rf-based ranging with multiple antennas
FI122328B (en) * 2010-08-18 2011-12-15 Sauli Hepo-Oja Active localization system
US8613666B2 (en) * 2010-08-31 2013-12-24 Microsoft Corporation User selection and navigation based on looped motions
US20120064841A1 (en) * 2010-09-10 2012-03-15 Husted Paul J Configuring antenna arrays of mobile wireless devices using motion sensors
US20120063270A1 (en) * 2010-09-10 2012-03-15 Pawcatuck, Connecticut Methods and Apparatus for Event Detection and Localization Using a Plurality of Smartphones
US8391334B1 (en) * 2010-09-27 2013-03-05 L-3 Communications Corp Communications reliability in a hub-spoke communications system
US10416276B2 (en) 2010-11-12 2019-09-17 Position Imaging, Inc. Position tracking system and method using radio signals and inertial sensing
US11175375B2 (en) 2010-11-12 2021-11-16 Position Imaging, Inc. Position tracking system and method using radio signals and inertial sensing
KR101339431B1 (en) * 2010-11-19 2013-12-09 도시바삼성스토리지테크놀러지코리아 주식회사 Game controller, game machine, and game system employ the game controller
JP5241807B2 (en) * 2010-12-02 2013-07-17 株式会社コナミデジタルエンタテインメント GAME DEVICE, GAME DEVICE CONTROL METHOD, AND PROGRAM
US8319682B2 (en) * 2011-01-06 2012-11-27 The Boeing Company Method and apparatus for examining an object using electromagnetic millimeter-wave signal illumination
US8753275B2 (en) * 2011-01-13 2014-06-17 BioSensics LLC Intelligent device to monitor and remind patients with footwear, walking aids, braces, or orthotics
EP3312629A3 (en) 2011-02-21 2018-06-13 Transrobotics, Inc. System and method for sensing an object's dimensions
US9925464B2 (en) 2011-03-08 2018-03-27 Nintendo Co., Ltd. Computer-readable storage medium, information processing system, and information processing method for displaying an image on a display device using attitude data of a display device
US9539511B2 (en) 2011-03-08 2017-01-10 Nintendo Co., Ltd. Computer-readable storage medium, information processing system, and information processing method for operating objects in a virtual world based on orientation data related to an orientation of a device
US9561443B2 (en) 2011-03-08 2017-02-07 Nintendo Co., Ltd. Computer-readable storage medium, information processing system, and information processing method
EP2497547B1 (en) 2011-03-08 2018-06-27 Nintendo Co., Ltd. Information processing program, information processing apparatus, information processing system, and information processing method
EP2497543A3 (en) 2011-03-08 2012-10-03 Nintendo Co., Ltd. Information processing program, information processing system, and information processing method
JP5792971B2 (en) 2011-03-08 2015-10-14 任天堂株式会社 Information processing system, information processing program, and information processing method
US9159293B2 (en) * 2011-03-16 2015-10-13 Kyocera Corporation Electronic device, control method, and storage medium storing control program
GB201105587D0 (en) * 2011-04-01 2011-05-18 Elliptic Laboratories As User interfaces for electronic devices
US9880604B2 (en) 2011-04-20 2018-01-30 Microsoft Technology Licensing, Llc Energy efficient location detection
US9103899B2 (en) 2011-04-29 2015-08-11 The Invention Science Fund I, Llc Adaptive control of a personal electronic device responsive to a micro-impulse radar
US8884809B2 (en) * 2011-04-29 2014-11-11 The Invention Science Fund I, Llc Personal electronic device providing enhanced user environmental awareness
US9151834B2 (en) 2011-04-29 2015-10-06 The Invention Science Fund I, Llc Network and personal electronic devices operatively coupled to micro-impulse radars
US9000973B2 (en) * 2011-04-29 2015-04-07 The Invention Science Fund I, Llc Personal electronic device with a micro-impulse radar
US20120282987A1 (en) * 2011-05-06 2012-11-08 Roger Romero Artificial touch device for electronic touch screens
JP5937792B2 (en) * 2011-06-03 2016-06-22 任天堂株式会社 GAME PROGRAM, GAME DEVICE, GAME SYSTEM, AND GAME PROCESSING METHOD
JP5869236B2 (en) 2011-06-03 2016-02-24 任天堂株式会社 Information processing program, information processing apparatus, information processing system, and information processing method
US8890684B2 (en) 2011-06-17 2014-11-18 Checkpoint Systems, Inc. Background object sensor
RU2455676C2 (en) * 2011-07-04 2012-07-10 Общество с ограниченной ответственностью "ТРИДИВИ" Method of controlling device using gestures and 3d sensor for realising said method
US9084001B2 (en) 2011-07-18 2015-07-14 At&T Intellectual Property I, Lp Method and apparatus for multi-experience metadata translation of media content with metadata
US8943396B2 (en) 2011-07-18 2015-01-27 At&T Intellectual Property I, Lp Method and apparatus for multi-experience adaptation of media content
KR101893601B1 (en) * 2011-07-22 2018-08-31 삼성전자 주식회사 Input apparatus of display apparatus, display system and control method thereof
WO2013020105A2 (en) * 2011-08-04 2013-02-07 Rambus Inc. Low-cost tracking system
US9237362B2 (en) * 2011-08-11 2016-01-12 At&T Intellectual Property I, Lp Method and apparatus for multi-experience translation of media content with sensor sharing
US8942412B2 (en) 2011-08-11 2015-01-27 At&T Intellectual Property I, Lp Method and apparatus for controlling multi-experience translation of media content
US10585472B2 (en) 2011-08-12 2020-03-10 Sony Interactive Entertainment Inc. Wireless head mounted display with differential rendering and sound localization
US10209771B2 (en) * 2016-09-30 2019-02-19 Sony Interactive Entertainment Inc. Predictive RF beamforming for head mounted display
US9503838B2 (en) * 2011-08-29 2016-11-22 Electronics And Telecommunications Research Institute Method and system for communicating between devices
US20130050499A1 (en) * 2011-08-30 2013-02-28 Qualcomm Incorporated Indirect tracking
US20140253737A1 (en) * 2011-09-07 2014-09-11 Yitzchak Kempinski System and method of tracking an object in an image captured by a moving device
KR101398709B1 (en) * 2011-09-09 2014-05-28 주식회사 팬택 Terminal apparatus and method for supporting multi interface using user motion
US20130095875A1 (en) * 2011-09-30 2013-04-18 Rami Reuven Antenna selection based on orientation, and related apparatuses, antenna units, methods, and distributed antenna systems
US9449342B2 (en) 2011-10-27 2016-09-20 Ebay Inc. System and method for visualization of items in an environment using augmented reality
WO2013067526A1 (en) * 2011-11-04 2013-05-10 Remote TelePointer, LLC Method and system for user interface for interactive devices using a mobile device
US20130113649A1 (en) * 2011-11-09 2013-05-09 Marquette Trishaun Detection of an asymmetric object
US9945940B2 (en) 2011-11-10 2018-04-17 Position Imaging, Inc. Systems and methods of wireless position tracking
US9933509B2 (en) 2011-11-10 2018-04-03 Position Imaging, Inc. System for tracking an object using pulsed frequency hopping
US10165228B2 (en) * 2011-12-22 2018-12-25 Mis Security, Llc Sensor event assessor training and integration
US8902936B2 (en) 2011-12-22 2014-12-02 Cory J. Stephanson Sensor event assessor input/output controller
US8874162B2 (en) 2011-12-23 2014-10-28 Microsoft Corporation Mobile device safe driving
US9420432B2 (en) 2011-12-23 2016-08-16 Microsoft Technology Licensing, Llc Mobile devices control
US20130305354A1 (en) 2011-12-23 2013-11-14 Microsoft Corporation Restricted execution modes
US9467834B2 (en) 2011-12-23 2016-10-11 Microsoft Technology Licensing, Llc Mobile device emergency service
US9325752B2 (en) 2011-12-23 2016-04-26 Microsoft Technology Licensing, Llc Private interaction hubs
US9710982B2 (en) 2011-12-23 2017-07-18 Microsoft Technology Licensing, Llc Hub key service
JP2013153405A (en) * 2011-12-28 2013-08-08 Panasonic Corp Av apparatus and initial setting method thereof
US9129489B2 (en) * 2012-01-13 2015-09-08 Gtech Canada Ulc Remote gaming method where venue's system suggests different games to remote player using a mobile gaming device
US9558619B2 (en) 2012-01-13 2017-01-31 Igt Canada Solutions Ulc Systems and methods for carrying out an uninterrupted game with temporary inactivation
US9084932B2 (en) 2012-01-13 2015-07-21 Gtech Canada Ulc Automated discovery of gaming preferences
US9295908B2 (en) 2012-01-13 2016-03-29 Igt Canada Solutions Ulc Systems and methods for remote gaming using game recommender
US9558625B2 (en) 2012-01-13 2017-01-31 Igt Canada Solutions Ulc Systems and methods for recommending games to anonymous players using distributed storage
US9536378B2 (en) 2012-01-13 2017-01-03 Igt Canada Solutions Ulc Systems and methods for recommending games to registered players using distributed storage
US9159189B2 (en) * 2012-01-13 2015-10-13 Gtech Canada Ulc Mobile gaming device carrying out uninterrupted game despite communications link disruption
US9269222B2 (en) * 2012-01-13 2016-02-23 Igt Canada Solutions Ulc Remote gaming system using separate terminal to set up remote play with a gaming terminal
US9123200B2 (en) * 2012-01-13 2015-09-01 Gtech Canada Ulc Remote gaming using game recommender system and generic mobile gaming device
US9208641B2 (en) * 2012-01-13 2015-12-08 Igt Canada Solutions Ulc Remote gaming method allowing temporary inactivation without terminating playing session due to game inactivity
US9011240B2 (en) * 2012-01-13 2015-04-21 Spielo International Canada Ulc Remote gaming system allowing adjustment of original 3D images for a mobile gaming device
US10142689B2 (en) * 2012-01-17 2018-11-27 Sony Interactive Entertainment Inc. Server, terminal, information processing method, information processing program, and computer-readable recording medium storing information processing programs
US9088309B2 (en) * 2012-02-17 2015-07-21 Sony Corporation Antenna tunning arrangement and method
EP2820899A4 (en) * 2012-02-29 2015-06-03 Intel Corp Location discrepancy corrections based on community corrections and trajectory detection
EP2832012A1 (en) 2012-03-30 2015-02-04 Corning Optical Communications LLC Reducing location-dependent interference in distributed antenna systems operating in multiple-input, multiple-output (mimo) configuration, and related components, systems, and methods
US20130275873A1 (en) 2012-04-13 2013-10-17 Qualcomm Incorporated Systems and methods for displaying a user interface
US9326689B2 (en) 2012-05-08 2016-05-03 Siemens Medical Solutions Usa, Inc. Thermally tagged motion tracking for medical treatment
US11871901B2 (en) 2012-05-20 2024-01-16 Cilag Gmbh International Method for situational awareness for surgical network or surgical network connected device capable of adjusting function based on a sensed situation or usage
TW201349032A (en) * 2012-05-23 2013-12-01 Tritan Technology Inc An anti-optical-noise pointer positioning system
US20130321245A1 (en) * 2012-06-04 2013-12-05 Fluor Technologies Corporation Mobile device for monitoring and controlling facility systems
US9213092B2 (en) * 2012-06-12 2015-12-15 Tyco Fire & Security Gmbh Systems and methods for detecting a change in position of an object
US9782669B1 (en) 2012-06-14 2017-10-10 Position Imaging, Inc. RF tracking with active sensory feedback
US10269182B2 (en) 2012-06-14 2019-04-23 Position Imaging, Inc. RF tracking with active sensory feedback
US20140009384A1 (en) * 2012-07-04 2014-01-09 3Divi Methods and systems for determining location of handheld device within 3d environment
US20140028500A1 (en) * 2012-07-30 2014-01-30 Yu-Ming Liu Positioning System
WO2014020921A1 (en) * 2012-07-31 2014-02-06 独立行政法人科学技術振興機構 Device for estimating placement of physical objects
US9519344B1 (en) 2012-08-14 2016-12-13 Position Imaging, Inc. User input system for immersive interaction
US10180490B1 (en) 2012-08-24 2019-01-15 Position Imaging, Inc. Radio frequency communication system
US9230076B2 (en) 2012-08-30 2016-01-05 Microsoft Technology Licensing, Llc Mobile device child share
US9754442B2 (en) 2012-09-18 2017-09-05 Igt Canada Solutions Ulc 3D enhanced gaming machine with foreground and background game surfaces
US9454879B2 (en) 2012-09-18 2016-09-27 Igt Canada Solutions Ulc Enhancements to game components in gaming systems
US20140080638A1 (en) * 2012-09-19 2014-03-20 Board Of Regents, The University Of Texas System Systems and methods for providing training and instruction to a football kicker
EP2903703A1 (en) * 2012-10-04 2015-08-12 Disney Enterprises, Inc. Interactive objects for immersive environment
US9405011B2 (en) 2012-10-05 2016-08-02 Hand Held Products, Inc. Navigation system configured to integrate motion sensing device inputs
US9002641B2 (en) 2012-10-05 2015-04-07 Hand Held Products, Inc. Navigation system configured to integrate motion sensing device inputs
EP2904420A4 (en) 2012-10-05 2016-05-25 Transrobotics Inc Systems and methods for high resolution distance sensing and applications
US10234539B2 (en) 2012-12-15 2019-03-19 Position Imaging, Inc. Cycling reference multiplexing receiver system
DE102012224321B4 (en) * 2012-12-21 2022-12-15 Applejack 199 L.P. Measuring device for detecting a hitting movement of a racket, training device and method for training a hitting movement
CA2838129A1 (en) 2012-12-28 2014-06-28 Spielo International Canada Ulc Stacks of game components in a 3D enhanced gaming machine
US9451394B2 (en) 2012-12-31 2016-09-20 Elwha Llc Cost-effective mobile connectivity protocols
US9876762B2 (en) 2012-12-31 2018-01-23 Elwha Llc Cost-effective mobile connectivity protocols
US8965288B2 (en) 2012-12-31 2015-02-24 Elwha Llc Cost-effective mobile connectivity protocols
US9980114B2 (en) 2013-03-15 2018-05-22 Elwha Llc Systems and methods for communication management
US9832628B2 (en) 2012-12-31 2017-11-28 Elwha, Llc Cost-effective mobile connectivity protocols
US9635605B2 (en) 2013-03-15 2017-04-25 Elwha Llc Protocols for facilitating broader access in wireless communications
US9781664B2 (en) 2012-12-31 2017-10-03 Elwha Llc Cost-effective mobile connectivity protocols
US9713013B2 (en) 2013-03-15 2017-07-18 Elwha Llc Protocols for providing wireless communications connectivity maps
US9119068B1 (en) * 2013-01-09 2015-08-25 Trend Micro Inc. Authentication using geographic location and physical gestures
US10856108B2 (en) 2013-01-18 2020-12-01 Position Imaging, Inc. System and method of locating a radio frequency (RF) tracking device using a calibration routine
US9482741B1 (en) 2013-01-18 2016-11-01 Position Imaging, Inc. System and method of locating a radio frequency (RF) tracking device using a calibration routine
JP2014153663A (en) * 2013-02-13 2014-08-25 Sony Corp Voice recognition device, voice recognition method and program
CN104303133A (en) * 2013-03-12 2015-01-21 施政 System and method for interactive board
JP6127602B2 (en) * 2013-03-13 2017-05-17 沖電気工業株式会社 State recognition device, state recognition method, and computer program
US9807582B2 (en) 2013-03-15 2017-10-31 Elwha Llc Protocols for facilitating broader access in wireless communications
US9693214B2 (en) 2013-03-15 2017-06-27 Elwha Llc Protocols for facilitating broader access in wireless communications
US9706060B2 (en) 2013-03-15 2017-07-11 Elwha Llc Protocols for facilitating broader access in wireless communications
US9706382B2 (en) 2013-03-15 2017-07-11 Elwha Llc Protocols for allocating communication services cost in wireless communications
US9813887B2 (en) 2013-03-15 2017-11-07 Elwha Llc Protocols for facilitating broader access in wireless communications responsive to charge authorization statuses
US9781554B2 (en) 2013-03-15 2017-10-03 Elwha Llc Protocols for facilitating third party authorization for a rooted communication device in wireless communications
US9866706B2 (en) 2013-03-15 2018-01-09 Elwha Llc Protocols for facilitating broader access in wireless communications
US9843917B2 (en) 2013-03-15 2017-12-12 Elwha, Llc Protocols for facilitating charge-authorized connectivity in wireless communications
US9596584B2 (en) 2013-03-15 2017-03-14 Elwha Llc Protocols for facilitating broader access in wireless communications by conditionally authorizing a charge to an account of a third party
ITMI20130495A1 (en) * 2013-03-29 2014-09-30 Atlas Copco Blm Srl ELECTRONIC CONTROL AND CONTROL DEVICE FOR SENSORS
WO2014165862A1 (en) * 2013-04-05 2014-10-09 Ladd Mark J Systems and methods for sensor-based mobile gaming
US9311789B1 (en) 2013-04-09 2016-04-12 BioSensics LLC Systems and methods for sensorimotor rehabilitation
US9945939B1 (en) 2013-05-06 2018-04-17 Lokdon Llc Method for determining a location of an emitter
FR3006477B1 (en) * 2013-05-29 2016-09-30 Blinksight DEVICE AND METHOD FOR DETECTING THE HANDLING OF AT LEAST ONE OBJECT
CA3192820A1 (en) * 2013-06-04 2014-12-11 Isolynx, Llc Object tracking system optimization and tools
US9782670B2 (en) * 2014-04-25 2017-10-10 Ubisoft Entertainment Computer program, method, and system for enabling an interactive event among a plurality of persons
US9998866B2 (en) 2013-06-14 2018-06-12 Microsoft Technology Licensing, Llc Detecting geo-fence events using varying confidence levels
US9820231B2 (en) 2013-06-14 2017-11-14 Microsoft Technology Licensing, Llc Coalescing geo-fence events
US9891337B2 (en) * 2013-07-15 2018-02-13 SeeScan, Inc. Utility locator transmitter devices, systems, and methods with dockable apparatus
US9128552B2 (en) * 2013-07-17 2015-09-08 Lenovo (Singapore) Pte. Ltd. Organizing display data on a multiuser display
US9223340B2 (en) 2013-08-14 2015-12-29 Lenovo (Singapore) Pte. Ltd. Organizing display data on a multiuser display
US9826439B2 (en) 2013-09-30 2017-11-21 Elwha Llc Mobile device sharing facilitation methods and systems operable in network equipment
US9813891B2 (en) 2013-09-30 2017-11-07 Elwha Llc Mobile device sharing facilitation methods and systems featuring a subset-specific source identification
US9838536B2 (en) 2013-09-30 2017-12-05 Elwha, Llc Mobile device sharing facilitation methods and systems
US9740875B2 (en) 2013-09-30 2017-08-22 Elwha Llc Mobile device sharing facilitation methods and systems featuring exclusive data presentation
US9774728B2 (en) 2013-09-30 2017-09-26 Elwha Llc Mobile device sharing facilitation methods and systems in a context of plural communication records
US9805208B2 (en) 2013-09-30 2017-10-31 Elwha Llc Mobile device sharing facilitation methods and systems with recipient-dependent inclusion of a data selection
US9753131B2 (en) * 2013-10-09 2017-09-05 Massachusetts Institute Of Technology Motion tracking via body radio reflections
EP3055708A2 (en) * 2013-10-09 2016-08-17 Massachusetts Institute Of Technology Motion tracking via body radio reflections
US9616343B2 (en) * 2013-11-18 2017-04-11 Gaming Support B.V. Hybrid gaming platform
US10033945B2 (en) * 2013-12-12 2018-07-24 Flir Systems Ab Orientation-adapted image remote inspection systems and methods
US10634761B2 (en) 2013-12-13 2020-04-28 Position Imaging, Inc. Tracking system with mobile reader
EP3087407A2 (en) 2013-12-27 2016-11-02 Massachusetts Institute of Technology Localization with non-synchronous emission and multipath transmission
US9933247B2 (en) 2014-01-13 2018-04-03 The Boeing Company Mandrel configuration monitoring system
US9497728B2 (en) 2014-01-17 2016-11-15 Position Imaging, Inc. Wireless relay station for radio frequency-based tracking system
US10200819B2 (en) * 2014-02-06 2019-02-05 Position Imaging, Inc. Virtual reality and augmented reality functionality for mobile devices
US9437002B2 (en) 2014-09-25 2016-09-06 Elwha Llc Systems and methods for a dual modality sensor system
US9739883B2 (en) 2014-05-16 2017-08-22 Elwha Llc Systems and methods for ultrasonic velocity and acceleration detection
US9618618B2 (en) 2014-03-10 2017-04-11 Elwha Llc Systems and methods for ultrasonic position and motion detection
US20150260823A1 (en) * 2014-03-11 2015-09-17 Crestron Electronics, Inc. Method of enclosing and powering a bluetooth emitter
JP2015196091A (en) * 2014-04-02 2015-11-09 アップルジャック 199 エル.ピー. Sensor-based gaming system for avatar to represent player in virtual environment
US10871566B2 (en) * 2014-04-09 2020-12-22 Thomas Danaher Harvey Methods and system to assist search and interception of lost objects
US9995824B2 (en) * 2014-04-09 2018-06-12 Thomas Danaher Harvey Methods and system to assist search for lost and submerged objects
WO2016003526A2 (en) * 2014-04-18 2016-01-07 Massachusetts Institute Of Technology Indoor localization of a multi-antenna receiver
CN106659428B (en) 2014-04-28 2020-10-16 麻省理工学院 Vital signs monitoring by radio reflection
US10436888B2 (en) * 2014-05-30 2019-10-08 Texas Tech University System Hybrid FMCW-interferometry radar for positioning and monitoring and methods of using same
US9824524B2 (en) 2014-05-30 2017-11-21 Igt Canada Solutions Ulc Three dimensional enhancements to game components in gaming systems
US10347073B2 (en) 2014-05-30 2019-07-09 Igt Canada Solutions Ulc Systems and methods for three dimensional games in gaming systems
US9575560B2 (en) * 2014-06-03 2017-02-21 Google Inc. Radar-based gesture-recognition through a wearable device
US20160014390A1 (en) * 2014-07-08 2016-01-14 Apple Inc. Electronic Devices With Connector Alignment Assistance
US10234952B2 (en) * 2014-07-18 2019-03-19 Maxim Integrated Products, Inc. Wearable device for using human body as input mechanism
US9525472B2 (en) 2014-07-30 2016-12-20 Corning Incorporated Reducing location-dependent destructive interference in distributed antenna systems (DASS) operating in multiple-input, multiple-output (MIMO) configuration, and related components, systems, and methods
US9811164B2 (en) * 2014-08-07 2017-11-07 Google Inc. Radar-based gesture sensing and data transmission
US9921660B2 (en) 2014-08-07 2018-03-20 Google Llc Radar-based gesture recognition
US9588625B2 (en) 2014-08-15 2017-03-07 Google Inc. Interactive textiles
US10268321B2 (en) 2014-08-15 2019-04-23 Google Llc Interactive textiles within hard objects
US11169988B2 (en) 2014-08-22 2021-11-09 Google Llc Radar recognition-aided search
US9778749B2 (en) 2014-08-22 2017-10-03 Google Inc. Occluded gesture recognition
US20160073087A1 (en) * 2014-09-10 2016-03-10 Lenovo (Singapore) Pte. Ltd. Augmenting a digital image with distance data derived based on acoustic range information
CN105637786B (en) * 2014-09-22 2018-01-19 宇宙网络股份有限公司 Data medium and data carrier system
US9993723B2 (en) * 2014-09-25 2018-06-12 Intel Corporation Techniques for low power monitoring of sports game play
US9600080B2 (en) * 2014-10-02 2017-03-21 Google Inc. Non-line-of-sight radar-based gesture recognition
CN107003381A (en) 2014-10-07 2017-08-01 Xyz 互动技术公司 For the apparatus and method for orienting and positioning
US9797979B2 (en) 2014-10-08 2017-10-24 Symbol Technologies, Llc System for and method of estimating bearings of radio frequency identification (RFID) tags that return RFID receive signals whose power is below a predetermined threshold
GB2531378B (en) * 2014-10-10 2019-05-08 Zwipe As Power harvesting
US11504192B2 (en) 2014-10-30 2022-11-22 Cilag Gmbh International Method of hub communication with surgical instrument systems
US9423669B2 (en) 2014-11-04 2016-08-23 Qualcomm Incorporated Method and apparatus for camera autofocus based on Wi-Fi ranging technique
US10869175B2 (en) * 2014-11-04 2020-12-15 Nathan Schumacher System and method for generating a three-dimensional model using flowable probes
US9715010B2 (en) * 2014-11-28 2017-07-25 Htc Corporation Apparatus and method for detection
US10609475B2 (en) 2014-12-05 2020-03-31 Stages Llc Active noise control and customized audio system
US11327711B2 (en) 2014-12-05 2022-05-10 Microsoft Technology Licensing, Llc External visual interactions for speech-based devices
US9729267B2 (en) 2014-12-11 2017-08-08 Corning Optical Communications Wireless Ltd Multiplexing two separate optical links with the same wavelength using asymmetric combining and splitting
CN107111362B (en) * 2014-12-19 2020-02-07 Abb公司 Automatic configuration system for operator console
US10275801B2 (en) * 2014-12-19 2019-04-30 Ca, Inc. Adapting user terminal advertisements responsive to measured user behavior
CN104461009B (en) * 2014-12-22 2018-01-09 百度在线网络技术(北京)有限公司 The measuring method and smart machine of object
US10009715B2 (en) * 2015-01-06 2018-06-26 Microsoft Technology Licensing, Llc Geographic information for wireless networks
US10642560B2 (en) 2015-02-13 2020-05-05 Position Imaging, Inc. Accurate geographic tracking of mobile devices
US10324474B2 (en) 2015-02-13 2019-06-18 Position Imaging, Inc. Spatial diversity for relative position tracking
US11132004B2 (en) 2015-02-13 2021-09-28 Position Imaging, Inc. Spatial diversity for relative position tracking
EP3062142B1 (en) 2015-02-26 2018-10-03 Nokia Technologies OY Apparatus for a near-eye display
US10016162B1 (en) 2015-03-23 2018-07-10 Google Llc In-ear health monitoring
US9983747B2 (en) 2015-03-26 2018-05-29 Google Llc Two-layer interactive textiles
US11501244B1 (en) 2015-04-06 2022-11-15 Position Imaging, Inc. Package tracking systems and methods
US11416805B1 (en) 2015-04-06 2022-08-16 Position Imaging, Inc. Light-based guidance for package tracking systems
US10853757B1 (en) 2015-04-06 2020-12-01 Position Imaging, Inc. Video for real-time confirmation in package tracking systems
US10148918B1 (en) 2015-04-06 2018-12-04 Position Imaging, Inc. Modular shelving systems for package tracking
US9848780B1 (en) 2015-04-08 2017-12-26 Google Inc. Assessing cardiovascular function using an optical sensor
RU2583450C1 (en) * 2015-04-14 2016-05-10 Игорь Александрович Маренков Method of locating ground source of radio-frequency of satellite communication system
KR102229658B1 (en) 2015-04-30 2021-03-17 구글 엘엘씨 Type-agnostic rf signal representations
CN107430444B (en) 2015-04-30 2020-03-03 谷歌有限责任公司 RF-based micro-motion tracking for gesture tracking and recognition
CN107430443B (en) 2015-04-30 2020-07-10 谷歌有限责任公司 Gesture recognition based on wide field radar
US10080528B2 (en) 2015-05-19 2018-09-25 Google Llc Optical central venous pressure measurement
US9693592B2 (en) 2015-05-27 2017-07-04 Google Inc. Attaching electronic components to interactive textiles
US10088908B1 (en) 2015-05-27 2018-10-02 Google Llc Gesture detection and interactions
US10251046B2 (en) * 2015-06-01 2019-04-02 Huawei Technologies Co., Ltd. System and method for efficient link discovery in wireless networks
US10376195B1 (en) 2015-06-04 2019-08-13 Google Llc Automated nursing assessment
US9995823B2 (en) 2015-07-31 2018-06-12 Elwha Llc Systems and methods for utilizing compressed sensing in an entertainment system
US10444256B2 (en) * 2015-08-07 2019-10-15 Structural Health Data Systems Device and system for relative motion sensing
US10542222B2 (en) 2015-08-31 2020-01-21 Daniel Arnold Multiview body camera system with environmental sensors and alert features
US10257434B2 (en) 2015-08-31 2019-04-09 Daniel Arnold Multiview body camera system with environmental sensors and alert features
US10817065B1 (en) 2015-10-06 2020-10-27 Google Llc Gesture recognition using multiple antenna
CN108139758A (en) * 2015-10-09 2018-06-08 深圳市大疆创新科技有限公司 Apparatus of transport positioning based on significant characteristics
US9929794B2 (en) * 2015-10-15 2018-03-27 Honeywell International Inc. Long term evolution (LTE) air to ground communication enhancements associated with uplink synchronization
WO2017079484A1 (en) 2015-11-04 2017-05-11 Google Inc. Connectors for connecting electronics embedded in garments to external devices
US10555343B2 (en) * 2015-11-30 2020-02-04 Sony Corporation Dynamic back-off time based on channel utilization statistics
US10434396B2 (en) * 2015-11-30 2019-10-08 James Shaunak Divine Protective headgear with display and methods for use therewith
EP3176766B1 (en) * 2015-12-03 2019-07-17 Sony Mobile Communications, Inc. Remote controlling a plurality of controllable devices
EP3383071B1 (en) * 2015-12-28 2021-02-03 Huawei Technologies Co., Ltd. Floor positioning method, device and system
US20170255254A1 (en) * 2016-03-02 2017-09-07 Htc Corporation Tracker device of virtual reality system
US10444323B2 (en) 2016-03-08 2019-10-15 Position Imaging, Inc. Expandable, decentralized position tracking systems and methods
US10297576B2 (en) 2016-04-18 2019-05-21 Skyworks Solutions, Inc. Reduced form factor radio frequency system-in-package
US10062670B2 (en) * 2016-04-18 2018-08-28 Skyworks Solutions, Inc. Radio frequency system-in-package with stacked clocking crystal
US9918386B2 (en) 2016-04-18 2018-03-13 Skyworks Solutions, Inc. Surface mount device stacking for reduced form factor
US10269769B2 (en) 2016-04-18 2019-04-23 Skyworks Solutions, Inc. System in package with vertically arranged radio frequency componentry
US10492302B2 (en) 2016-05-03 2019-11-26 Google Llc Connecting an electronic component to an interactive textile
US20170017874A1 (en) * 2016-05-06 2017-01-19 Qualcomm Incorporated Radio frequency identification (rfid) reader with frequency adjustment of continuous radio frequency (rf) wave
US10175781B2 (en) 2016-05-16 2019-01-08 Google Llc Interactive object with multiple electronics modules
US10285456B2 (en) 2016-05-16 2019-05-14 Google Llc Interactive fabric
WO2017200279A1 (en) * 2016-05-17 2017-11-23 Samsung Electronics Co., Ltd. Method and apparatus for facilitating interaction with virtual reality equipment
US11577159B2 (en) 2016-05-26 2023-02-14 Electronic Scripting Products Inc. Realistic virtual/augmented/mixed reality viewing and interactions
US10388027B2 (en) * 2016-06-01 2019-08-20 Kyocera Corporation Detection method, display apparatus, and detection system
US11436553B2 (en) 2016-09-08 2022-09-06 Position Imaging, Inc. System and method of object tracking using weight confirmation
US10252812B2 (en) 2016-09-28 2019-04-09 General Electric Company System and method for controlling fuel flow to a gas turbine engine based on motion sensor data
US10445925B2 (en) * 2016-09-30 2019-10-15 Sony Interactive Entertainment Inc. Using a portable device and a head-mounted display to view a shared virtual reality space
US9980075B1 (en) 2016-11-18 2018-05-22 Stages Llc Audio source spatialization relative to orientation sensor and output
US10945080B2 (en) 2016-11-18 2021-03-09 Stages Llc Audio analysis and processing system
CN107066121A (en) * 2016-11-30 2017-08-18 黄文超 A kind of magnetic suspension mouse with RFID inductor matrixes
US10579150B2 (en) 2016-12-05 2020-03-03 Google Llc Concurrent detection of absolute distance and relative movement for sensing action gestures
US10573291B2 (en) 2016-12-09 2020-02-25 The Research Foundation For The State University Of New York Acoustic metamaterial
US10455364B2 (en) 2016-12-12 2019-10-22 Position Imaging, Inc. System and method of personalized navigation inside a business enterprise
US10634503B2 (en) 2016-12-12 2020-04-28 Position Imaging, Inc. System and method of personalized navigation inside a business enterprise
US10634506B2 (en) 2016-12-12 2020-04-28 Position Imaging, Inc. System and method of personalized navigation inside a business enterprise
US10302020B2 (en) 2016-12-12 2019-05-28 General Electric Company System and method for controlling a fuel flow to a gas turbine engine
US9773330B1 (en) 2016-12-29 2017-09-26 BioMech Sensor LLC Systems and methods for real-time data quantification, acquisition, analysis, and feedback
US10973439B2 (en) 2016-12-29 2021-04-13 BioMech Sensor LLC Systems and methods for real-time data quantification, acquisition, analysis, and feedback
US11318350B2 (en) 2016-12-29 2022-05-03 BioMech Sensor LLC Systems and methods for real-time data quantification, acquisition, analysis, and feedback
US10650552B2 (en) 2016-12-29 2020-05-12 Magic Leap, Inc. Systems and methods for augmented reality
TWI692935B (en) 2016-12-29 2020-05-01 美商天工方案公司 Front end systems and related devices, integrated circuits, modules, and methods
US10352962B2 (en) * 2016-12-29 2019-07-16 BioMech Sensor LLC Systems and methods for real-time data quantification, acquisition, analysis and feedback
EP3343267B1 (en) 2016-12-30 2024-01-24 Magic Leap, Inc. Polychromatic light out-coupling apparatus, near-eye displays comprising the same, and method of out-coupling polychromatic light
US11120392B2 (en) 2017-01-06 2021-09-14 Position Imaging, Inc. System and method of calibrating a directional light source relative to a camera's field of view
US10146300B2 (en) * 2017-01-25 2018-12-04 Lenovo Enterprise Solutions (Singapore) Pte. Ltd. Emitting a visual indicator from the position of an object in a simulated reality emulation
KR101839522B1 (en) * 2017-02-21 2018-03-16 주식회사 와이젯 Wireless transceiver system using beam tracking
CN107016347A (en) * 2017-03-09 2017-08-04 腾讯科技(深圳)有限公司 A kind of body-sensing action identification method, device and system
US10515924B2 (en) 2017-03-10 2019-12-24 Skyworks Solutions, Inc. Radio frequency modules
US20190050060A1 (en) * 2017-03-10 2019-02-14 Awearable Apparel Inc. Methods, systems, and media for providing input based on accelerometer input
RU2729705C2 (en) * 2017-04-21 2020-08-11 Зенимакс Медиа Инк. Motion compensation systems and methods based on player input
EP3616387A1 (en) 2017-04-24 2020-03-04 Carnegie Mellon University Virtual sensor system
US10754005B2 (en) 2017-05-31 2020-08-25 Google Llc Radar modulation for radar sensing using a wireless communication chipset
US10782390B2 (en) 2017-05-31 2020-09-22 Google Llc Full-duplex operation for radar sensing using wireless communication chipset
US10644397B2 (en) * 2017-06-30 2020-05-05 Intel Corporation Methods, apparatus and systems for motion predictive beamforming
US10578870B2 (en) 2017-07-26 2020-03-03 Magic Leap, Inc. Exit pupil expander
US10726218B2 (en) 2017-07-27 2020-07-28 Symbol Technologies, Llc Method and apparatus for radio frequency identification (RFID) tag bearing estimation
US10989803B1 (en) 2017-08-21 2021-04-27 Massachusetts Institute Of Technology Security protocol for motion tracking systems
JP7384416B2 (en) * 2017-10-13 2023-11-21 タクチュアル ラブズ シーオー. Minimal drive of transmitter to increase hover detection
US10747303B2 (en) * 2017-10-13 2020-08-18 Tactual Labs Co. Backscatter hover detection
US11911045B2 (en) 2017-10-30 2024-02-27 Cilag Gmbh International Method for operating a powered articulating multi-clip applier
US11317919B2 (en) 2017-10-30 2022-05-03 Cilag Gmbh International Clip applier comprising a clip crimping system
US11406390B2 (en) 2017-10-30 2022-08-09 Cilag Gmbh International Clip applier comprising interchangeable clip reloads
US11129636B2 (en) 2017-10-30 2021-09-28 Cilag Gmbh International Surgical instruments comprising an articulation drive that provides for high articulation angles
US11510741B2 (en) 2017-10-30 2022-11-29 Cilag Gmbh International Method for producing a surgical instrument comprising a smart electrical system
US11564756B2 (en) 2017-10-30 2023-01-31 Cilag Gmbh International Method of hub communication with surgical instrument systems
US11801098B2 (en) 2017-10-30 2023-10-31 Cilag Gmbh International Method of hub communication with surgical instrument systems
US11291510B2 (en) 2017-10-30 2022-04-05 Cilag Gmbh International Method of hub communication with surgical instrument systems
US11280937B2 (en) 2017-12-10 2022-03-22 Magic Leap, Inc. Anti-reflective coatings on optical waveguides
KR20200100720A (en) 2017-12-20 2020-08-26 매직 립, 인코포레이티드 Insert for augmented reality viewing device
US11109866B2 (en) 2017-12-28 2021-09-07 Cilag Gmbh International Method for circular stapler control algorithm adjustment based on situational awareness
US11166772B2 (en) 2017-12-28 2021-11-09 Cilag Gmbh International Surgical hub coordination of control and communication of operating room devices
US11324557B2 (en) 2017-12-28 2022-05-10 Cilag Gmbh International Surgical instrument with a sensing array
US11602393B2 (en) 2017-12-28 2023-03-14 Cilag Gmbh International Surgical evacuation sensing and generator control
US11666331B2 (en) 2017-12-28 2023-06-06 Cilag Gmbh International Systems for detecting proximity of surgical end effector to cancerous tissue
US10892995B2 (en) 2017-12-28 2021-01-12 Ethicon Llc Surgical network determination of prioritization of communication, interaction, or processing based on system or device needs
US11571234B2 (en) 2017-12-28 2023-02-07 Cilag Gmbh International Temperature control of ultrasonic end effector and control system therefor
US11857152B2 (en) 2017-12-28 2024-01-02 Cilag Gmbh International Surgical hub spatial awareness to determine devices in operating theater
US20190201146A1 (en) 2017-12-28 2019-07-04 Ethicon Llc Safety systems for smart powered surgical stapling
US11659023B2 (en) 2017-12-28 2023-05-23 Cilag Gmbh International Method of hub communication
US11311306B2 (en) 2017-12-28 2022-04-26 Cilag Gmbh International Surgical systems for detecting end effector tissue distribution irregularities
US11937769B2 (en) 2017-12-28 2024-03-26 Cilag Gmbh International Method of hub communication, processing, storage and display
US11559308B2 (en) 2017-12-28 2023-01-24 Cilag Gmbh International Method for smart energy device infrastructure
US11818052B2 (en) 2017-12-28 2023-11-14 Cilag Gmbh International Surgical network determination of prioritization of communication, interaction, or processing based on system or device needs
US11364075B2 (en) 2017-12-28 2022-06-21 Cilag Gmbh International Radio frequency energy device for delivering combined electrical signals
US11864728B2 (en) 2017-12-28 2024-01-09 Cilag Gmbh International Characterization of tissue irregularities through the use of mono-chromatic light refractivity
US11464559B2 (en) 2017-12-28 2022-10-11 Cilag Gmbh International Estimating state of ultrasonic end effector and control system therefor
US11432885B2 (en) 2017-12-28 2022-09-06 Cilag Gmbh International Sensing arrangements for robot-assisted surgical platforms
US11896322B2 (en) 2017-12-28 2024-02-13 Cilag Gmbh International Sensing the patient position and contact utilizing the mono-polar return pad electrode to provide situational awareness to the hub
US11132462B2 (en) 2017-12-28 2021-09-28 Cilag Gmbh International Data stripping method to interrogate patient records and create anonymized record
US11786245B2 (en) 2017-12-28 2023-10-17 Cilag Gmbh International Surgical systems with prioritized data transmission capabilities
US20190201039A1 (en) 2017-12-28 2019-07-04 Ethicon Llc Situational awareness of electrosurgical systems
US11903601B2 (en) 2017-12-28 2024-02-20 Cilag Gmbh International Surgical instrument comprising a plurality of drive systems
US11832899B2 (en) 2017-12-28 2023-12-05 Cilag Gmbh International Surgical systems with autonomously adjustable control programs
US20190200981A1 (en) 2017-12-28 2019-07-04 Ethicon Llc Method of compressing tissue within a stapling device and simultaneously displaying the location of the tissue within the jaws
US11424027B2 (en) 2017-12-28 2022-08-23 Cilag Gmbh International Method for operating surgical instrument systems
US11389164B2 (en) 2017-12-28 2022-07-19 Cilag Gmbh International Method of using reinforced flexible circuits with multiple sensors to optimize performance of radio frequency devices
US11559307B2 (en) 2017-12-28 2023-01-24 Cilag Gmbh International Method of robotic hub communication, detection, and control
US11308075B2 (en) 2017-12-28 2022-04-19 Cilag Gmbh International Surgical network, instrument, and cloud responses based on validation of received dataset and authentication of its source and integrity
US11844579B2 (en) 2017-12-28 2023-12-19 Cilag Gmbh International Adjustments based on airborne particle properties
US11540855B2 (en) 2017-12-28 2023-01-03 Cilag Gmbh International Controlling activation of an ultrasonic surgical instrument according to the presence of tissue
US11589888B2 (en) 2017-12-28 2023-02-28 Cilag Gmbh International Method for controlling smart energy devices
US11419667B2 (en) 2017-12-28 2022-08-23 Cilag Gmbh International Ultrasonic energy device which varies pressure applied by clamp arm to provide threshold control pressure at a cut progression location
US11464535B2 (en) 2017-12-28 2022-10-11 Cilag Gmbh International Detection of end effector emersion in liquid
US11896443B2 (en) 2017-12-28 2024-02-13 Cilag Gmbh International Control of a surgical system through a surgical barrier
US11786251B2 (en) 2017-12-28 2023-10-17 Cilag Gmbh International Method for adaptive control schemes for surgical network control and interaction
US11832840B2 (en) 2017-12-28 2023-12-05 Cilag Gmbh International Surgical instrument having a flexible circuit
US11423007B2 (en) 2017-12-28 2022-08-23 Cilag Gmbh International Adjustment of device control programs based on stratified contextual data in addition to the data
US11678881B2 (en) * 2017-12-28 2023-06-20 Cilag Gmbh International Spatial awareness of surgical hubs in operating rooms
US11410259B2 (en) 2017-12-28 2022-08-09 Cilag Gmbh International Adaptive control program updates for surgical devices
US11744604B2 (en) 2017-12-28 2023-09-05 Cilag Gmbh International Surgical instrument with a hardware-only control circuit
US11633237B2 (en) 2017-12-28 2023-04-25 Cilag Gmbh International Usage and technique analysis of surgeon / staff performance against a baseline to optimize device utilization and performance for both current and future procedures
US11202570B2 (en) 2017-12-28 2021-12-21 Cilag Gmbh International Communication hub and storage device for storing parameters and status of a surgical device to be shared with cloud based analytics systems
US20190201042A1 (en) 2017-12-28 2019-07-04 Ethicon Llc Determining the state of an ultrasonic electromechanical system according to frequency shift
US11026751B2 (en) 2017-12-28 2021-06-08 Cilag Gmbh International Display of alignment of staple cartridge to prior linear staple line
US10758310B2 (en) 2017-12-28 2020-09-01 Ethicon Llc Wireless pairing of a surgical device with another device within a sterile surgical field based on the usage and situational awareness of devices
US11576677B2 (en) 2017-12-28 2023-02-14 Cilag Gmbh International Method of hub communication, processing, display, and cloud analytics
US20190201113A1 (en) 2017-12-28 2019-07-04 Ethicon Llc Controls for robot-assisted surgical platforms
US11529187B2 (en) 2017-12-28 2022-12-20 Cilag Gmbh International Surgical evacuation sensor arrangements
US11446052B2 (en) 2017-12-28 2022-09-20 Cilag Gmbh International Variation of radio frequency and ultrasonic power level in cooperation with varying clamp arm pressure to achieve predefined heat flux or power applied to tissue
GB201802850D0 (en) 2018-02-22 2018-04-11 Sintef Tto As Positioning sound sources
US11839396B2 (en) 2018-03-08 2023-12-12 Cilag Gmbh International Fine dissection mode for tissue classification
US11259830B2 (en) 2018-03-08 2022-03-01 Cilag Gmbh International Methods for controlling temperature in ultrasonic device
US11701162B2 (en) 2018-03-08 2023-07-18 Cilag Gmbh International Smart blade application for reusable and disposable devices
US10755676B2 (en) 2018-03-15 2020-08-25 Magic Leap, Inc. Image correction due to deformation of components of a viewing device
CN113870457A (en) * 2018-03-22 2021-12-31 创新先进技术有限公司 Timing system, method, device and equipment for competitive sports
US11589865B2 (en) 2018-03-28 2023-02-28 Cilag Gmbh International Methods for controlling a powered surgical stapler that has separate rotary closure and firing systems
US11471156B2 (en) 2018-03-28 2022-10-18 Cilag Gmbh International Surgical stapling devices with improved rotary driven closure systems
US11090047B2 (en) 2018-03-28 2021-08-17 Cilag Gmbh International Surgical instrument comprising an adaptive control system
US10772511B2 (en) * 2018-05-16 2020-09-15 Qualcomm Incorporated Motion sensor using cross coupling
EP3803488A4 (en) 2018-05-30 2021-07-28 Magic Leap, Inc. Compact variable focus configurations
JP7319303B2 (en) 2018-05-31 2023-08-01 マジック リープ, インコーポレイテッド Radar head pose localization
CN112400157A (en) 2018-06-05 2021-02-23 奇跃公司 Homography transformation matrix based temperature calibration of viewing systems
WO2019237099A1 (en) 2018-06-08 2019-12-12 Magic Leap, Inc. Augmented reality viewer with automated surface selection placement and content orientation placement
US11579441B2 (en) 2018-07-02 2023-02-14 Magic Leap, Inc. Pixel intensity modulation using modifying gain values
US11856479B2 (en) 2018-07-03 2023-12-26 Magic Leap, Inc. Systems and methods for virtual and augmented reality along a route with markers
US11510027B2 (en) 2018-07-03 2022-11-22 Magic Leap, Inc. Systems and methods for virtual and augmented reality
CN108717180B (en) * 2018-07-05 2021-09-17 南京航空航天大学 Networking radar power distribution method based on Stackelberg game
US11598651B2 (en) 2018-07-24 2023-03-07 Magic Leap, Inc. Temperature dependent calibration of movement detection devices
US11624929B2 (en) 2018-07-24 2023-04-11 Magic Leap, Inc. Viewing device with dust seal integration
WO2020028834A1 (en) 2018-08-02 2020-02-06 Magic Leap, Inc. A viewing system with interpupillary distance compensation based on head motion
WO2020028191A1 (en) 2018-08-03 2020-02-06 Magic Leap, Inc. Unfused pose-based drift correction of a fused pose of a totem in a user interaction system
US11003205B2 (en) 2019-02-04 2021-05-11 Sigmasense, Llc. Receive analog to digital circuit of a low voltage drive circuit data communication system
US10499363B1 (en) * 2018-09-18 2019-12-03 Qualcomm Incorporated Methods and apparatus for improved accuracy and positioning estimates
CN113424197A (en) 2018-09-21 2021-09-21 定位成像有限公司 Machine learning assisted self-improving object recognition system and method
JP7201379B2 (en) * 2018-10-02 2023-01-10 東芝テック株式会社 RFID tag reader
US11580316B2 (en) * 2018-11-08 2023-02-14 Avery Dennison Retail Information Services Llc Interacting RFID tags
WO2020102412A1 (en) 2018-11-16 2020-05-22 Magic Leap, Inc. Image size triggered clarification to maintain image sharpness
US20200168045A1 (en) 2018-11-28 2020-05-28 Igt Dynamic game flow modification in electronic wagering games
CN109557512B (en) * 2018-12-06 2020-08-04 航天南湖电子信息技术股份有限公司 Radar receiver with high sensitivity and high dynamic range
EP3668197B1 (en) * 2018-12-12 2021-11-03 Rohde & Schwarz GmbH & Co. KG Method and radio for setting the transmission power of a radio transmission
WO2020146861A1 (en) 2019-01-11 2020-07-16 Position Imaging, Inc. Computer-vision-based object tracking and guidance module
CN113518961A (en) 2019-02-06 2021-10-19 奇跃公司 Targeted intent based clock speed determination and adjustment to limit total heat generated by multiple processors
US10977808B2 (en) * 2019-02-18 2021-04-13 Raytheon Company Three-frame difference target acquisition and tracking using overlapping target images
US11357503B2 (en) 2019-02-19 2022-06-14 Cilag Gmbh International Staple cartridge retainers with frangible retention features and methods of using same
US11751872B2 (en) 2019-02-19 2023-09-12 Cilag Gmbh International Insertable deactivator element for surgical stapler lockouts
US11369377B2 (en) 2019-02-19 2022-06-28 Cilag Gmbh International Surgical stapling assembly with cartridge based retainer configured to unlock a firing lockout
US11259807B2 (en) 2019-02-19 2022-03-01 Cilag Gmbh International Staple cartridges with cam surfaces configured to engage primary and secondary portions of a lockout of a surgical stapling device
US11317915B2 (en) 2019-02-19 2022-05-03 Cilag Gmbh International Universal cartridge based key feature that unlocks multiple lockout arrangements in different surgical staplers
US11762623B2 (en) 2019-03-12 2023-09-19 Magic Leap, Inc. Registration of local content between first and second augmented reality viewers
EP3719532B1 (en) 2019-04-04 2022-12-28 Transrobotics, Inc. Technologies for acting based on object tracking
US11445232B2 (en) 2019-05-01 2022-09-13 Magic Leap, Inc. Content provisioning system and method
EP3928181A1 (en) 2019-06-17 2021-12-29 Google LLC Mobile device-based radar system for applying different power modes to a multi-mode interface
USD964564S1 (en) 2019-06-25 2022-09-20 Cilag Gmbh International Surgical staple cartridge retainer with a closure system authentication key
USD950728S1 (en) 2019-06-25 2022-05-03 Cilag Gmbh International Surgical staple cartridge
USD952144S1 (en) 2019-06-25 2022-05-17 Cilag Gmbh International Surgical staple cartridge retainer with firing system authentication key
JP2022542363A (en) 2019-07-26 2022-10-03 マジック リープ, インコーポレイテッド Systems and methods for augmented reality
GB2586059B (en) * 2019-08-01 2023-06-07 Sony Interactive Entertainment Inc System and method for generating user inputs for a video game
US10973062B2 (en) * 2019-08-26 2021-04-06 International Business Machines Corporation Method for extracting environment information leveraging directional communication
KR20210034270A (en) 2019-09-20 2021-03-30 삼성전자주식회사 Electronic device for determinig path of line of sight(los) and method for the same
WO2021097323A1 (en) 2019-11-15 2021-05-20 Magic Leap, Inc. A viewing system for use in a surgical environment
KR20210069479A (en) 2019-12-03 2021-06-11 삼성전자주식회사 Electronic device and operating method for identifying location information of device
US11860439B1 (en) 2020-05-06 2024-01-02 Apple Inc. Head-mounted electronic device with alignment sensors
CN112418200B (en) * 2021-01-25 2021-04-02 成都点泽智能科技有限公司 Object detection method and device based on thermal imaging and server
CN113129328B (en) * 2021-04-22 2022-05-17 中国电子科技集团公司第二十九研究所 Target hotspot area fine analysis method
US11640725B2 (en) 2021-05-28 2023-05-02 Sportsbox.ai Inc. Quantitative, biomechanical-based analysis with outcomes and context
GB2608186A (en) * 2021-06-25 2022-12-28 Thermoteknix Systems Ltd Augmented reality system
US11920521B2 (en) 2022-02-07 2024-03-05 General Electric Company Turboshaft load control using feedforward and feedback control

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020030094A1 (en) * 1997-02-10 2002-03-14 Daniel Curry Arrangement for and method of establishing a logical relationship among peripherals in a wireless local area network
US20030195040A1 (en) * 2002-04-10 2003-10-16 Breving Joel S. Video game system and game controller
US20070139512A1 (en) * 2004-04-07 2007-06-21 Matsushita Electric Industrial Co., Ltd. Communication terminal and communication method

Family Cites Families (237)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2397746A (en) * 1942-05-23 1946-04-02 Hazeltine Corp Wave-signal direction finder
US3430243A (en) * 1966-04-04 1969-02-25 Robley D Evans Method of and apparatus for determining the distance and/or angles between objects with the aid of radiant energy
US3816830A (en) * 1970-11-27 1974-06-11 Hazeltine Corp Cylindrical array antenna
US3789410A (en) * 1972-01-07 1974-01-29 Us Navy Passive ranging technique
US4041494A (en) * 1975-11-10 1977-08-09 The United States Of America As Represented By The Secretary Of The Department Of Transportation Distance measuring method and apparatus
US4309703A (en) * 1979-12-28 1982-01-05 International Business Machines Corporation Segmented chirp waveform implemented radar system
US5248884A (en) * 1983-10-11 1993-09-28 The Secretary Of State For Defence In Her Britannic Majesty's Government Of The United Kingdom Of Great Britain And Northern Ireland Infrared detectors
US4639900A (en) * 1984-02-22 1987-01-27 U.S. Philips Corporation Method and a system for monitoring a sea area
US4807183A (en) * 1985-09-27 1989-02-21 Carnegie-Mellon University Programmable interconnection chip for computer system functional modules
WO1988004060A2 (en) * 1986-11-27 1988-06-02 Starpeak Computers Limited Locating system
US5027433A (en) * 1988-04-04 1991-06-25 Hm Electronics, Inc. Remote infrared transceiver and method of using same
US5214615A (en) * 1990-02-26 1993-05-25 Will Bauer Three-dimensional displacement of a body with computer interface
US5229764A (en) * 1991-06-20 1993-07-20 Matchett Noel D Continuous biometric authentication matrix
US5138322A (en) * 1991-08-20 1992-08-11 Matrix Engineering, Inc. Method and apparatus for radar measurement of ball in play
US7098891B1 (en) * 1992-09-18 2006-08-29 Pryor Timothy R Method for providing human input to a computer
US5502683A (en) * 1993-04-20 1996-03-26 International Business Machines Corporation Dual ported memory with word line access control
WO1994026075A1 (en) * 1993-05-03 1994-11-10 The University Of British Columbia Tracking platform system
JP2552427B2 (en) * 1993-12-28 1996-11-13 コナミ株式会社 TV play system
US5574479A (en) * 1994-01-07 1996-11-12 Selectech, Ltd. Optical system for determining the roll orientation of a remote unit relative to a base unit
CA2141144A1 (en) * 1994-03-31 1995-10-01 Joseph Desimone Electronic game utilizing bio-signals
US5412619A (en) * 1994-04-14 1995-05-02 Bauer; Will Three-dimensional displacement of a body with computer interface
US6539336B1 (en) * 1996-12-12 2003-03-25 Phatrat Technologies, Inc. Sport monitoring system for determining airtime, speed, power absorbed and other factors such as drop distance
US8280682B2 (en) * 2000-12-15 2012-10-02 Tvipr, Llc Device for monitoring movement of shipped goods
US5943427A (en) * 1995-04-21 1999-08-24 Creative Technology Ltd. Method and apparatus for three dimensional audio spatialization
US6418324B1 (en) * 1995-06-01 2002-07-09 Padcom, Incorporated Apparatus and method for transparent wireless communication between a remote device and host system
US5528557A (en) * 1995-08-07 1996-06-18 Northrop Grumman Corporation Acoustic emission source location by reverse ray tracing
US5742840A (en) * 1995-08-16 1998-04-21 Microunity Systems Engineering, Inc. General purpose, multiple precision parallel operation, programmable media processor
WO1997009638A2 (en) * 1995-09-07 1997-03-13 Siemens Aktiengesellschaft Rangefinder
US5754948A (en) * 1995-12-29 1998-05-19 University Of North Carolina At Charlotte Millimeter-wave wireless interconnection of electronic components
US6278418B1 (en) * 1995-12-29 2001-08-21 Kabushiki Kaisha Sega Enterprises Three-dimensional imaging system, game device, method for same and recording medium
US6396041B1 (en) * 1998-08-21 2002-05-28 Curtis A. Vock Teaching and gaming golf feedback system and methods
US5700204A (en) * 1996-06-17 1997-12-23 Teder; Rein S. Projectile motion parameter determination device using successive approximation and high measurement angle speed sensor
US6400374B2 (en) * 1996-09-18 2002-06-04 Eyematic Interfaces, Inc. Video superposition system and method
US5786912A (en) * 1996-12-27 1998-07-28 Lucent Technologies Inc. Waveguide-based, fabricless switch for telecommunication system and telecommunication infrastructure employing the same
US6182203B1 (en) * 1997-01-24 2001-01-30 Texas Instruments Incorporated Microprocessor
ATE258000T1 (en) * 1997-02-13 2004-01-15 Nokia Corp METHOD AND DEVICE FOR DIRECTED RADIO TRANSMISSION
US6070269A (en) * 1997-07-25 2000-06-06 Medialab Services S.A. Data-suit for real-time computer animation and virtual reality applications
US6142876A (en) * 1997-08-22 2000-11-07 Cumbers; Blake Player tracking and identification system
US20020036617A1 (en) * 1998-08-21 2002-03-28 Timothy R. Pryor Novel man machine interfaces and applications
US6162123A (en) * 1997-11-25 2000-12-19 Woolston; Thomas G. Interactive electronic sword game
US5884104A (en) * 1997-11-26 1999-03-16 Eastman Kodak Company Compact camera flash unit
US6421453B1 (en) * 1998-05-15 2002-07-16 International Business Machines Corporation Apparatus and methods for user recognition employing behavioral passwords
US6438622B1 (en) * 1998-11-17 2002-08-20 Intel Corporation Multiprocessor system including a docking system
FR2786899B1 (en) * 1998-12-03 2006-09-29 Jean Bonnard MOVEMENT INDICATOR FOR SOFTWARE
US7102616B1 (en) * 1999-03-05 2006-09-05 Microsoft Corporation Remote control device with pointing capacity
US7933295B2 (en) * 1999-04-13 2011-04-26 Broadcom Corporation Cable modem with voice processing capability
US7015950B1 (en) * 1999-05-11 2006-03-21 Pryor Timothy R Picture taking method and apparatus
US6343315B1 (en) * 1999-05-12 2002-01-29 Lodgenet Entertainment Corporation Entertainment/Information system having disparate interactive devices
US6653971B1 (en) * 1999-05-14 2003-11-25 David L. Guice Airborne biota monitoring and control system
US6500070B1 (en) * 1999-05-28 2002-12-31 Nintendo Co., Ltd. Combined game system of portable and video game machines
US6972734B1 (en) * 1999-06-11 2005-12-06 Canon Kabushiki Kaisha Mixed reality apparatus and mixed reality presentation method
US6177903B1 (en) * 1999-06-14 2001-01-23 Time Domain Corporation System and method for intrusion detection using a time domain radar array
US7592944B2 (en) * 1999-06-14 2009-09-22 Time Domain Corporation System and method for intrusion detection using a time domain radar array
JP4278071B2 (en) * 1999-06-17 2009-06-10 株式会社バンダイナムコゲームス Image generation system and information storage medium
US6545661B1 (en) * 1999-06-21 2003-04-08 Midway Amusement Games, Llc Video game system having a control unit with an accelerometer for controlling a video game
JP2001104636A (en) * 1999-10-04 2001-04-17 Shinsedai Kk Cenesthesic ball game device
US7030860B1 (en) * 1999-10-08 2006-04-18 Synaptics Incorporated Flexible transparent touch sensing system for electronic devices
US6735708B2 (en) * 1999-10-08 2004-05-11 Dell Usa, L.P. Apparatus and method for a combination personal digital assistant and network portable device
US8956228B2 (en) * 1999-12-03 2015-02-17 Nike, Inc. Game pod
US7010634B2 (en) * 1999-12-23 2006-03-07 Intel Corporation Notebook computer with independently functional, dockable core computer
US7445550B2 (en) * 2000-02-22 2008-11-04 Creative Kingdoms, Llc Magical wand and interactive play experience
US6315667B1 (en) * 2000-03-28 2001-11-13 Robert Steinhart System for remote control of a model airplane
JP4020567B2 (en) * 2000-05-15 2007-12-12 株式会社コナミデジタルエンタテインメント Game machine and game environment setting network system thereof
US20020049806A1 (en) * 2000-05-16 2002-04-25 Scott Gatz Parental control system for use in connection with account-based internet access server
DE60130836T2 (en) * 2000-06-12 2008-07-17 Broadcom Corp., Irvine Architecture and method for context switching
JP2002052243A (en) * 2000-08-11 2002-02-19 Konami Co Ltd Competition type video game
US7918808B2 (en) * 2000-09-20 2011-04-05 Simmons John C Assistive clothing
KR100364368B1 (en) * 2000-10-18 2002-12-12 엘지전자 주식회사 Private Network Using Bluetooth and Communication Method Using the Network
US7454166B2 (en) * 2003-04-25 2008-11-18 Xm Satellite Radio Inc. System and method for providing recording and playback of digital media content
JP2002171245A (en) * 2000-12-05 2002-06-14 Sony Corp Method for synthesizing retransmitted data and device for synthesizing retransmitted data
US6735663B2 (en) * 2000-12-18 2004-05-11 Dell Products L.P. Combination personal data assistant and personal computing device
EP1216899A1 (en) * 2000-12-22 2002-06-26 Ford Global Technologies, Inc. Communication system for use with a vehicle
JP2002199500A (en) * 2000-12-25 2002-07-12 Sony Corp Virtual sound image localizing processor, virtual sound image localization processing method and recording medium
US6801974B1 (en) * 2001-01-26 2004-10-05 Dell Products L.P. Method of filtering events in a combinational computing device
JP3722279B2 (en) * 2001-01-26 2005-11-30 日本電気株式会社 Optical transceiver module
US6816925B2 (en) * 2001-01-26 2004-11-09 Dell Products L.P. Combination personal data assistant and personal computing device with master slave input output
US7197584B2 (en) * 2001-01-26 2007-03-27 Dell Products L.P. Removable personal digital assistant in a dual personal computer/personal digital assistant computer architecture
US7369667B2 (en) * 2001-02-14 2008-05-06 Sony Corporation Acoustic image localization signal processing device
US7131907B2 (en) * 2001-02-22 2006-11-07 Kabushiki Kaisha Sega System and method for superimposing an image on another image in a video game
US20020183038A1 (en) * 2001-05-31 2002-12-05 Palm, Inc. System and method for crediting an account associated with a network access node
US7082285B2 (en) * 2001-03-23 2006-07-25 Broadcom Corporation Reduced instruction set baseband controller
US6540607B2 (en) * 2001-04-26 2003-04-01 Midway Games West Video game position and orientation detection system
US7065326B2 (en) * 2001-05-02 2006-06-20 Trex Enterprises Corporation Millimeter wave communications system with a high performance modulator circuit
US6587699B2 (en) * 2001-05-02 2003-07-01 Trex Enterprises Corporation Narrow beamwidth communication link with alignment camera
EP1428108B1 (en) * 2001-05-14 2013-02-13 Koninklijke Philips Electronics N.V. Device for interacting with real-time streams of content
US6563940B2 (en) * 2001-05-16 2003-05-13 New Jersey Institute Of Technology Unauthorized user prevention device and method
SE523407C2 (en) * 2001-05-18 2004-04-13 Jan G Faeger Device for determining the position and / or orientation of a creature in relation to an environment and use of such a device
US20030172380A1 (en) * 2001-06-05 2003-09-11 Dan Kikinis Audio command and response for IPGs
US20030001882A1 (en) * 2001-06-29 2003-01-02 Macer Peter J. Portable entertainment machines
DE10136981A1 (en) * 2001-07-30 2003-02-27 Daimler Chrysler Ag Method and device for determining a stationary and / or moving object
AU2002324969A1 (en) * 2001-09-12 2003-03-24 Pillar Vision Corporation Trajectory detection and feedback system
US6760387B2 (en) * 2001-09-21 2004-07-06 Time Domain Corp. Impulse radio receiver and method for finding angular offset of an impulse radio transmitter
US7054423B2 (en) * 2001-09-24 2006-05-30 Nebiker Robert M Multi-media communication downloading
US6937182B2 (en) * 2001-09-28 2005-08-30 Trex Enterprises Corp. Millimeter wave imaging system
US7257093B1 (en) * 2001-10-10 2007-08-14 Sandia Corporation Localized radio frequency communication using asynchronous transfer mode protocol
US6987988B2 (en) * 2001-10-22 2006-01-17 Waxess, Inc. Cordless and wireless telephone docking station with land line interface and switching mode
US7444393B2 (en) * 2001-10-30 2008-10-28 Keicy K. Chung Read-only storage device having network interface, a system including the device, and a method of distributing files over a network
US20050282633A1 (en) * 2001-11-13 2005-12-22 Frederic Nicolas Movement-sensing apparatus for software
US20030112585A1 (en) * 2001-12-13 2003-06-19 Silvester Kelan Craig Multiprocessor notebook computer with a tablet PC conversion capability
US6712692B2 (en) * 2002-01-03 2004-03-30 International Business Machines Corporation Using existing videogames for physical training and rehabilitation
JP3914771B2 (en) * 2002-01-09 2007-05-16 株式会社日立製作所 Packet communication apparatus and packet data transfer control method
GB0203621D0 (en) * 2002-02-15 2002-04-03 Bae Systems Defence Sysytems L Emitter location system
WO2003071813A2 (en) * 2002-02-19 2003-08-28 Zyray Wireless, Inc. Method and apparatus optimizing a radio link
US6990320B2 (en) * 2002-02-26 2006-01-24 Motorola, Inc. Dynamic reallocation of processing resources for redundant functionality
KR20050000369A (en) * 2002-03-12 2005-01-03 메나키, 엘엘씨 Motion tracking system and method
KR100449102B1 (en) * 2002-03-19 2004-09-18 삼성전자주식회사 System on chip processor for multimedia
US20030211888A1 (en) * 2002-05-13 2003-11-13 Interactive Telegames, Llc Method and apparatus using insertably-removable auxiliary devices to play games over a communications link
US7085536B2 (en) * 2002-05-23 2006-08-01 Intel Corporation Method and apparatus for dynamically resolving radio frequency interference problems in a system
US7043588B2 (en) * 2002-05-24 2006-05-09 Dell Products L.P. Information handling system featuring multi-processor capability with processor located in docking station
US20050035955A1 (en) * 2002-06-06 2005-02-17 Carter Dale J. Method of determining orientation and manner of holding a mobile telephone
US7146014B2 (en) * 2002-06-11 2006-12-05 Intel Corporation MEMS directional sensor system
US7159099B2 (en) * 2002-06-28 2007-01-02 Motorola, Inc. Streaming vector processor with reconfigurable interconnection switch
US7161579B2 (en) * 2002-07-18 2007-01-09 Sony Computer Entertainment Inc. Hand-held computer interactive device
US7646372B2 (en) * 2003-09-15 2010-01-12 Sony Computer Entertainment Inc. Methods and systems for enabling direction detection when interfacing with a computer program
US7623115B2 (en) * 2002-07-27 2009-11-24 Sony Computer Entertainment Inc. Method and apparatus for light input device
US8073157B2 (en) * 2003-08-27 2011-12-06 Sony Computer Entertainment Inc. Methods and apparatus for targeted sound detection and characterization
US8947347B2 (en) * 2003-08-27 2015-02-03 Sony Computer Entertainment Inc. Controlling actions in a video game unit
US7782297B2 (en) * 2002-07-27 2010-08-24 Sony Computer Entertainment America Inc. Method and apparatus for use in determining an activity level of a user in relation to a system
US7918733B2 (en) * 2002-07-27 2011-04-05 Sony Computer Entertainment America Inc. Multi-input game control mixer
US7803050B2 (en) * 2002-07-27 2010-09-28 Sony Computer Entertainment Inc. Tracking device with sound emitter for use in obtaining information for controlling game program execution
US7854655B2 (en) * 2002-07-27 2010-12-21 Sony Computer Entertainment America Inc. Obtaining input for controlling execution of a game program
US7627139B2 (en) * 2002-07-27 2009-12-01 Sony Computer Entertainment Inc. Computer image and audio processing of intensity and input devices for interfacing with a computer program
DE10240497A1 (en) * 2002-09-03 2004-03-11 Robert Bosch Gmbh Radar measuring device and method for operating a radar measuring device
US20040054776A1 (en) * 2002-09-16 2004-03-18 Finisar Corporation Network expert analysis process
US7343524B2 (en) * 2002-09-16 2008-03-11 Finisar Corporation Network analysis omniscent loop state machine
US20040062308A1 (en) * 2002-09-27 2004-04-01 Kamosa Gregg Mark System and method for accelerating video data processing
US7200061B2 (en) * 2002-11-08 2007-04-03 Hitachi, Ltd. Sense amplifier for semiconductor memory device
US20040117442A1 (en) * 2002-12-10 2004-06-17 Thielen Kurt R. Handheld portable wireless digital content player
US20040123113A1 (en) * 2002-12-18 2004-06-24 Svein Mathiassen Portable or embedded access and input devices and methods for giving access to access limited devices, apparatuses, appliances, systems or networks
US20050032582A1 (en) * 2002-12-19 2005-02-10 Satayan Mahajan Method and apparatus for determining orientation and position of a moveable object
US7339608B2 (en) * 2003-01-03 2008-03-04 Vtech Telecommunications Limited Wireless motion sensor using infrared illuminator and camera integrated with wireless telephone
JP3875196B2 (en) * 2003-02-10 2007-01-31 株式会社東芝 Service providing device, service receiving device, service providing program, service receiving program, proximity wireless communication device, service providing method, and service receiving method
US7009561B2 (en) * 2003-03-11 2006-03-07 Menache, Llp Radio frequency motion tracking system and method
EP1880478B1 (en) * 2003-03-12 2013-01-16 International Business Machines Corporation Method and apparatus for converting optical signals to radio channels
US7324784B2 (en) * 2003-03-12 2008-01-29 Nec Corporation Transmission beam control method, adaptive antenna transmitter/receiver apparatus and radio base station
AU2003901463A0 (en) * 2003-03-31 2003-04-17 Qx Corporation Pty Ltd A method and device for multipath mitigation in positioning systems using clustered positioning signals
US7391888B2 (en) * 2003-05-30 2008-06-24 Microsoft Corporation Head pose assessment methods and systems
US20050009604A1 (en) * 2003-07-11 2005-01-13 Hsien-Ta Huang Monotone voice activation device
US20050014468A1 (en) * 2003-07-18 2005-01-20 Juha Salokannel Scalable bluetooth multi-mode radio module
US7385549B2 (en) * 2003-08-12 2008-06-10 Trex Enterprises Corp Millimeter wave portal imaging system
US7415244B2 (en) * 2003-08-12 2008-08-19 Trex Enterprises Corp. Multi-channel millimeter wave imaging system
US7432846B2 (en) * 2003-08-12 2008-10-07 Trex Enterprises Corp. Millimeter wave imaging system
JP4497093B2 (en) * 2003-08-28 2010-07-07 株式会社日立製作所 Semiconductor device and manufacturing method thereof
US7441154B2 (en) * 2003-09-12 2008-10-21 Finisar Corporation Network analysis tool
US7874917B2 (en) * 2003-09-15 2011-01-25 Sony Computer Entertainment Inc. Methods and systems for enabling depth and direction detection when interfacing with a computer program
US20050076161A1 (en) * 2003-10-03 2005-04-07 Amro Albanna Input system and method
US20050124307A1 (en) * 2003-12-08 2005-06-09 Xytrans, Inc. Low cost broadband wireless communication system
US20050132420A1 (en) * 2003-12-11 2005-06-16 Quadrock Communications, Inc System and method for interaction with television content
WO2005069272A1 (en) * 2003-12-15 2005-07-28 France Telecom Method for synthesizing acoustic spatialization
US20050185364A1 (en) * 2004-01-05 2005-08-25 Jory Bell Docking station for mobile computing device
US7663689B2 (en) * 2004-01-16 2010-02-16 Sony Computer Entertainment Inc. Method and apparatus for optimizing capture device settings through depth information
TWI286036B (en) * 2004-02-10 2007-08-21 Realtek Semiconductor Corp Method for selecting a channel in a wireless network
US7148836B2 (en) * 2004-03-05 2006-12-12 The Regents Of The University Of California Obstacle penetrating dynamic radar imaging system
US9178953B2 (en) * 2004-03-18 2015-11-03 Nokia Technologies Oy Position-based context awareness for mobile terminal device
US20050245204A1 (en) * 2004-05-03 2005-11-03 Vance Scott L Impedance matching circuit for a mobile communication device
JP3866735B2 (en) * 2004-05-10 2007-01-10 株式会社東芝 Multifunction mobile communication terminal
US7671916B2 (en) * 2004-06-04 2010-03-02 Electronic Arts Inc. Motion sensor using dual camera inputs
US8027165B2 (en) * 2004-07-08 2011-09-27 Sandisk Technologies Inc. Portable memory devices with removable caps that effect operation of the devices when attached
US8016667B2 (en) * 2004-07-22 2011-09-13 Igt Remote gaming eligibility system and method using RFID tags
US8109858B2 (en) * 2004-07-28 2012-02-07 William G Redmann Device and method for exercise prescription, detection of successful performance, and provision of reward therefore
US7242359B2 (en) * 2004-08-18 2007-07-10 Microsoft Corporation Parallel loop antennas for a mobile electronic device
KR100890060B1 (en) * 2004-08-27 2009-03-25 삼성전자주식회사 System and Method for Controlling Congestion of Group Call Response Message On Access Channel
US7590589B2 (en) * 2004-09-10 2009-09-15 Hoffberg Steven M Game theoretic prioritization scheme for mobile ad hoc networks permitting hierarchal deference
EP1810173B1 (en) * 2004-09-22 2014-04-23 Xyratex Technology Limited System and method for configuring memory devices for use in a network
EP1646112A1 (en) * 2004-10-11 2006-04-12 Sony Deutschland GmbH Directivity control for short range wireless mobile communication systems
US20060085675A1 (en) * 2004-10-12 2006-04-20 Andrew Popell One-touch backup system
US6967612B1 (en) * 2004-10-22 2005-11-22 Gorman John D System and method for standoff detection of human carried explosives
US7683883B2 (en) * 2004-11-02 2010-03-23 Pierre Touma 3D mouse and game controller based on spherical coordinates system and system for use
US7672196B1 (en) * 2004-11-16 2010-03-02 Nihon University Sound source localizing apparatus and method
US6965340B1 (en) * 2004-11-24 2005-11-15 Agilent Technologies, Inc. System and method for security inspection using microwave imaging
US20060148568A1 (en) * 2004-12-30 2006-07-06 Motorola, Inc. Device and method for wirelessly accessing game media
US7852317B2 (en) * 2005-01-12 2010-12-14 Thinkoptics, Inc. Handheld device for handheld vision based absolute pointing system
WO2006081395A2 (en) * 2005-01-26 2006-08-03 Bentley Kinetics, Inc. Method and system for athletic motion analysis and instruction
US20060189386A1 (en) * 2005-01-28 2006-08-24 Outland Research, L.L.C. Device, system and method for outdoor computer gaming
US20070055949A1 (en) * 2005-01-29 2007-03-08 Nicholas Thomas Methods and apparatus for RFID interface control
US7330702B2 (en) * 2005-01-31 2008-02-12 Taiwan Semiconductor Manufacturing Co., Ltd. Method and apparatus for inter-chip wireless communication
EP1688847B1 (en) * 2005-02-03 2011-05-04 Texas Instruments Incorporated Die-to-die interconnect interface and protocol for stacked semiconductor dies
US7502965B2 (en) * 2005-02-07 2009-03-10 Broadcom Corporation Computer chip set having on board wireless interfaces to support test operations
US7489870B2 (en) * 2005-10-31 2009-02-10 Searete Llc Optical antenna with optical reference
US20060203758A1 (en) * 2005-03-11 2006-09-14 Samsung Electronics Co., Ltd. Mobile terminal for relaying multimedia data to an external display device
US20060211494A1 (en) * 2005-03-18 2006-09-21 Helfer Lisa M Gaming terminal with player-customization of display functions
US7343177B2 (en) * 2005-05-03 2008-03-11 Broadcom Corporation Modular ear-piece/microphone (headset) operable to service voice activated commands
US20060256090A1 (en) * 2005-05-12 2006-11-16 Apple Computer, Inc. Mechanical overlay
US7733285B2 (en) * 2005-05-18 2010-06-08 Qualcomm Incorporated Integrated, closely spaced, high isolation, printed dipoles
US8116401B2 (en) * 2005-05-26 2012-02-14 Broadcom Corporation Method and system for digital spur cancellation
US8001353B2 (en) * 2005-06-10 2011-08-16 Hewlett-Packard Development Company, L.P. Apparatus and method for configuring memory blocks
US7218143B1 (en) * 2005-06-14 2007-05-15 Xilinx, Inc. Integrated circuit having fast interconnect paths between memory elements and carry logic
KR101257848B1 (en) * 2005-07-13 2013-04-24 삼성전자주식회사 Data storing apparatus comprising complex memory and method of operating the same
GB0515796D0 (en) * 2005-07-30 2005-09-07 Mccarthy Peter A motion capture and identification device
US7927216B2 (en) * 2005-09-15 2011-04-19 Nintendo Co., Ltd. Video game system with wireless modular handheld controller
JP4262726B2 (en) * 2005-08-24 2009-05-13 任天堂株式会社 Game controller and game system
US8308563B2 (en) * 2005-08-30 2012-11-13 Nintendo Co., Ltd. Game system and storage medium having game program stored thereon
JP4471910B2 (en) * 2005-09-14 2010-06-02 任天堂株式会社 Virtual positioning program
US8471812B2 (en) * 2005-09-23 2013-06-25 Jesse C. Bunch Pointing and identification device
US8142287B2 (en) * 2005-10-11 2012-03-27 Zeemote Technology Inc. Universal controller for toys and games
JP4859433B2 (en) * 2005-10-12 2012-01-25 任天堂株式会社 Position detection system and position detection program
US7715432B2 (en) * 2005-11-14 2010-05-11 Broadcom Corporation Primary protocol stack having a secondary protocol stack entry point
US8180363B2 (en) * 2005-11-15 2012-05-15 Sony Computer Entertainment Inc. Communication apparatus preventing communication interference
US7613482B2 (en) * 2005-12-08 2009-11-03 Accton Technology Corporation Method and system for steering antenna beam
US7170440B1 (en) * 2005-12-10 2007-01-30 Landray Technology, Inc. Linear FM radar
US20070135243A1 (en) * 2005-12-12 2007-06-14 Larue Michael B Active sports tracker and method
TWI286484B (en) * 2005-12-16 2007-09-11 Pixart Imaging Inc Device for tracking the motion of an object and object for reflecting infrared light
US7602301B1 (en) * 2006-01-09 2009-10-13 Applied Technology Holdings, Inc. Apparatus, systems, and methods for gathering and processing biometric and biomechanical data
US20070224944A1 (en) * 2006-03-10 2007-09-27 Hsiang Chen Portable device having changeable operating modes
US7714780B2 (en) * 2006-03-10 2010-05-11 Broadcom Corporation Beamforming RF circuit and applications thereof
JP4151982B2 (en) * 2006-03-10 2008-09-17 任天堂株式会社 Motion discrimination device and motion discrimination program
US7899394B2 (en) * 2006-03-16 2011-03-01 Broadcom Corporation RFID system with RF bus
US7423587B2 (en) * 2006-04-02 2008-09-09 Rolf Mueller Method for frequency-driven generation of a multiresolution decomposition of the input to wave-based sensing arrays
US8176230B2 (en) * 2006-04-07 2012-05-08 Kingston Technology Corporation Wireless flash memory card expansion system
US7539533B2 (en) * 2006-05-16 2009-05-26 Bao Tran Mesh network monitoring appliance
US20070268481A1 (en) * 2006-05-17 2007-11-22 Ramesh Raskar System and method for measuring scene reflectance using optical sensors
KR100753041B1 (en) * 2006-05-29 2007-08-30 삼성전자주식회사 Mobile terminal with a virtual mode dial and method for operating thereof
JP4208898B2 (en) * 2006-06-09 2009-01-14 株式会社ソニー・コンピュータエンタテインメント Object tracking device and object tracking method
US7816747B2 (en) * 2006-07-06 2010-10-19 International Business Machines Corporation Detector for detecting electromagnetic waves
US20080028118A1 (en) * 2006-07-31 2008-01-31 Craig Peter Sayers Portable dock for a portable computing system
US20080070682A1 (en) * 2006-08-15 2008-03-20 Nintendo Of America Inc. Systems and methods for providing educational games for use by young children, and digital storage mediums for storing the educational games thereon
US7860467B2 (en) * 2006-08-29 2010-12-28 Broadcom Corporation Power control for a dual mode transmitter
US20080070516A1 (en) * 2006-09-15 2008-03-20 Plantronics, Inc. Audio data streaming with auto switching between wireless headset and speakers
US20080076406A1 (en) * 2006-09-22 2008-03-27 Vanu, Inc. Wireless Backhaul
US20080136775A1 (en) * 2006-12-08 2008-06-12 Conant Carson V Virtual input device for computing
US8340057B2 (en) * 2006-12-22 2012-12-25 Canon Kabushiki Kaisha Automated wireless access to peripheral devices
WO2008088870A1 (en) * 2007-01-19 2008-07-24 Progressive Gaming International Corporation Table monitoring identification system, wager tagging and felt coordinate mapping
US9486703B2 (en) * 2007-01-31 2016-11-08 Broadcom Corporation Mobile communication device with game application for use in conjunction with a remote mobile communication device and methods for use therewith
US20090011832A1 (en) * 2007-01-31 2009-01-08 Broadcom Corporation Mobile communication device with game application for display on a remote monitor and methods for use therewith
US20090017910A1 (en) * 2007-06-22 2009-01-15 Broadcom Corporation Position and motion tracking of an object
FI121980B (en) * 2007-02-16 2011-06-30 Voyantic Oy Method for characterizing a radio link
US20080244466A1 (en) * 2007-03-26 2008-10-02 Timothy James Orsley System and method for interfacing with information on a display screen
US7647071B2 (en) * 2007-03-29 2010-01-12 Broadcom Corporation Communication devices with integrated gyrators and methods for use therewith
US20080242414A1 (en) * 2007-03-29 2008-10-02 Broadcom Corporation, A California Corporation Game devices with integrated gyrators and methods for use therewith
JP2008271023A (en) * 2007-04-18 2008-11-06 Univ Of Electro-Communications Antenna system
US10504317B2 (en) * 2007-04-30 2019-12-10 Cfph, Llc Game with player actuated control structure
US8209540B2 (en) * 2007-06-28 2012-06-26 Apple Inc. Incremental secure backup and restore of user settings and data
WO2009062153A1 (en) * 2007-11-09 2009-05-14 Wms Gaming Inc. Interaction with 3d space in a gaming system
US7895365B2 (en) * 2008-02-06 2011-02-22 Broadcom Corporation File storage for a computing device with handheld and extended computing units
CN103258184B (en) * 2008-02-27 2017-04-12 索尼计算机娱乐美国有限责任公司 Methods for capturing depth data of a scene and applying computer actions
US7671802B2 (en) * 2008-03-17 2010-03-02 Disney Enterprises, Inc. Active player tracking
CN102804668A (en) * 2009-06-19 2012-11-28 捷讯研究有限公司 Uplink Transmissions For Type 2 Relay

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020030094A1 (en) * 1997-02-10 2002-03-14 Daniel Curry Arrangement for and method of establishing a logical relationship among peripherals in a wireless local area network
US20030195040A1 (en) * 2002-04-10 2003-10-16 Breving Joel S. Video game system and game controller
US20070139512A1 (en) * 2004-04-07 2007-06-21 Matsushita Electric Industrial Co., Ltd. Communication terminal and communication method

Cited By (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8835736B2 (en) 2007-02-20 2014-09-16 Ubisoft Entertainment Instrument game system and method
US9132348B2 (en) 2007-02-20 2015-09-15 Ubisoft Entertainment Instrument game system and method
US8907193B2 (en) 2007-02-20 2014-12-09 Ubisoft Entertainment Instrument game system and method
US20090233710A1 (en) * 2007-03-12 2009-09-17 Roberts Thomas J Feedback gaming peripheral
US8952888B2 (en) * 2008-05-09 2015-02-10 Koninklijke Philips N.V. Method and system for conveying an emotion
US20110063208A1 (en) * 2008-05-09 2011-03-17 Koninklijke Philips Electronics N.V. Method and system for conveying an emotion
US20180096227A1 (en) * 2008-08-22 2018-04-05 International Business Machines Corporation System and method for virtual world biometric analytics through the use of a multimodal biometric analytic wallet
US11170083B2 (en) * 2008-08-22 2021-11-09 International Business Machines Corporation System and method for virtual world biometric analytics through the use of a multimodal biometric analytic wallet
US11269979B2 (en) * 2008-08-22 2022-03-08 International Business Machines Corporation System and method for virtual world biometric analytics through the use of a multimodal biometric analytic wallet
US10679749B2 (en) 2008-08-22 2020-06-09 International Business Machines Corporation System and method for virtual world biometric analytics through the use of a multimodal biometric analytic wallet
US11080377B2 (en) * 2008-08-22 2021-08-03 International Business Machines Corporation System and method for virtual world biometric analytics through the use of a multimodal biometric analytic wallet
US20180096228A1 (en) * 2008-08-22 2018-04-05 International Business Machines Corporation System and method for virtual world biometric analytics through the use of a multimodal biometric analytic wallet
US20180082151A1 (en) * 2008-08-22 2018-03-22 International Business Machines Corporation System and method for virtual world biometric analytics through the use of a multimodal biometric analytic wallet
US8157609B2 (en) 2008-10-18 2012-04-17 Mattel, Inc. Mind-control toys and methods of interaction therewith
US20100105478A1 (en) * 2008-10-18 2010-04-29 Hallaian Stephen C Mind-control toys and methods of interaction therewith
US8986090B2 (en) 2008-11-21 2015-03-24 Ubisoft Entertainment Interactive guitar game designed for learning to play the guitar
US9120016B2 (en) 2008-11-21 2015-09-01 Ubisoft Entertainment Interactive guitar game designed for learning to play the guitar
US20110293144A1 (en) * 2009-02-02 2011-12-01 Agency For Science, Technology And Research Method and System for Rendering an Entertainment Animation
US20100332842A1 (en) * 2009-06-30 2010-12-30 Yahoo! Inc. Determining a mood of a user based on biometric characteristic(s) of the user in an online system
US10981054B2 (en) 2009-07-10 2021-04-20 Valve Corporation Player biofeedback for dynamically controlling a video game state
US11253781B2 (en) 2009-07-10 2022-02-22 Valve Corporation Player biofeedback for dynamically controlling a video game state
US10427042B2 (en) * 2009-07-10 2019-10-01 Valve Corporation Player biofeedback for dynamically controlling a video game state
US20110028218A1 (en) * 2009-08-03 2011-02-03 Realta Entertainment Group Systems and Methods for Wireless Connectivity of a Musical Instrument
US20160144278A1 (en) * 2010-06-07 2016-05-26 Affectiva, Inc. Affect usage within a gaming context
US10843078B2 (en) * 2010-06-07 2020-11-24 Affectiva, Inc. Affect usage within a gaming context
US9292887B2 (en) * 2012-10-14 2016-03-22 Ari M Frank Reducing transmissions of measurements of affective response by identifying actions that imply emotional response
US20150040149A1 (en) * 2012-10-14 2015-02-05 Ari M. Frank Reducing transmissions of measurements of affective response by identifying actions that imply emotional response
US9480911B2 (en) * 2013-02-28 2016-11-01 Steelseries Aps Method and apparatus for monitoring and calibrating performances of gamers
US9889374B2 (en) 2013-02-28 2018-02-13 Steelseries Aps Method and apparatus for monitoring and calibrating performances of gamers
US20140243093A1 (en) * 2013-02-28 2014-08-28 Steelseries Aps Method and apparatus for monitoring and calibrating performances of gamers
US10413813B2 (en) 2013-02-28 2019-09-17 Steelseries Aps Method and apparatus for monitoring and calibrating performances of gamers
US11007427B2 (en) 2013-02-28 2021-05-18 Steelseries Aps Method and apparatus for monitoring and calibrating performances of gamers
US10742798B1 (en) 2013-07-12 2020-08-11 Facebook, Inc. Calibration of grab detection
US9372103B2 (en) * 2013-07-12 2016-06-21 Facebook, Inc. Calibration of grab detection
US20150019153A1 (en) * 2013-07-12 2015-01-15 Facebook, Inc. Calibration of Grab Detection
US10413827B1 (en) 2013-09-18 2019-09-17 Electronic Arts Inc. Using biometrics to alter game content
US9669297B1 (en) * 2013-09-18 2017-06-06 Aftershock Services, Inc. Using biometrics to alter game content
US20220116721A1 (en) * 2013-10-09 2022-04-14 Voyetra Turtle Beach, Inc. Audio Alerts In A Wireless Device
US20220345837A1 (en) * 2013-10-09 2022-10-27 Voyetra Turtle Beach, Inc. Audio Device
US10933309B2 (en) 2017-02-14 2021-03-02 Sony Interactive Entertainment Europe Limited Sensing apparatus and method
WO2018150162A1 (en) * 2017-02-14 2018-08-23 Sony Interactive Entertainment Europe Limited Sensing apparatus and method
US20210394068A1 (en) * 2020-06-23 2021-12-23 Nintendo Co., Ltd. Computer-readable non-transitory storage medium having instructions stored therein, game apparatus, game system, and game processing method
US11498004B2 (en) * 2020-06-23 2022-11-15 Nintendo Co., Ltd. Computer-readable non-transitory storage medium having instructions stored therein, game apparatus, game system, and game processing method

Also Published As

Publication number Publication date
US20170232346A1 (en) 2017-08-17
US20080318684A1 (en) 2008-12-25
US20090017910A1 (en) 2009-01-15
US20120315991A1 (en) 2012-12-13
US20080318689A1 (en) 2008-12-25
US20170232345A1 (en) 2017-08-17
US8289212B2 (en) 2012-10-16
US20090273559A1 (en) 2009-11-05
US8031121B2 (en) 2011-10-04
US20090258706A1 (en) 2009-10-15
US20080316085A1 (en) 2008-12-25
US20080318683A1 (en) 2008-12-25
US7952962B2 (en) 2011-05-31
US8311579B2 (en) 2012-11-13
US20080318626A1 (en) 2008-12-25
US9943760B2 (en) 2018-04-17
US8628417B2 (en) 2014-01-14
US20080316863A1 (en) 2008-12-25
US9547080B2 (en) 2017-01-17
US20200129861A1 (en) 2020-04-30
US20080318682A1 (en) 2008-12-25
US8676257B2 (en) 2014-03-18
US20080316324A1 (en) 2008-12-25
US20130023290A1 (en) 2013-01-24
US11426660B2 (en) 2022-08-30
US20080318681A1 (en) 2008-12-25
US20080318595A1 (en) 2008-12-25
US20080318691A1 (en) 2008-12-25
US20110312421A1 (en) 2011-12-22
US20120129606A1 (en) 2012-05-24
US9417320B2 (en) 2016-08-16
US20080318675A1 (en) 2008-12-25
US20080316103A1 (en) 2008-12-25
US20080318625A1 (en) 2008-12-25
US20080318680A1 (en) 2008-12-25
US7973702B2 (en) 2011-07-05
US8062133B2 (en) 2011-11-22
US8160640B2 (en) 2012-04-17
US9523767B2 (en) 2016-12-20
US10549195B2 (en) 2020-02-04

Similar Documents

Publication Publication Date Title
US11426660B2 (en) Gaming object with orientation sensor for interacting with a display and methods for use therewith
US11157080B2 (en) Detection device, detection method, control device, and control method
US9925460B2 (en) Systems and methods for control device including a movement detector
US9486703B2 (en) Mobile communication device with game application for use in conjunction with a remote mobile communication device and methods for use therewith
US7519537B2 (en) Method and apparatus for a verbo-manual gesture interface
US9400548B2 (en) Gesture personalization and profile roaming
US20080174550A1 (en) Motion-Input Device For a Computing Terminal and Method of its Operation
US20100306715A1 (en) Gestures Beyond Skeletal
US20090143143A1 (en) Gaming object and game console with millimeter wave interface and methods for use therewith
US20220129081A1 (en) Controller and method for gesture recognition and a gesture recognition device
JP2020065656A (en) Program, method, and information processing device
JP2021058482A (en) Game method using controllers
KR20150101520A (en) Motion interface device and method for controlling character using the same

Legal Events

Date Code Title Description
AS Assignment

Owner name: BROADCOM CORPORATION, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ROFOUGARAN, AHMADREZA REZA;REEL/FRAME:021054/0931

Effective date: 20080529

AS Assignment

Owner name: BROADCOM CORPORATION, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ROFOUGARAN, MARYAM;SESHADRI, NAMBIRAJAN;IBRAHIM, BRIMA B.;AND OTHERS;SIGNING DATES FROM 20110526 TO 20110621;REEL/FRAME:027082/0199

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: BANK OF AMERICA, N.A., AS COLLATERAL AGENT, NORTH CAROLINA

Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:037806/0001

Effective date: 20160201

AS Assignment

Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD., SINGAPORE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:041706/0001

Effective date: 20170120

AS Assignment

Owner name: BROADCOM CORPORATION, CALIFORNIA

Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:BANK OF AMERICA, N.A., AS COLLATERAL AGENT;REEL/FRAME:041712/0001

Effective date: 20170119