US20080316324A1 - Position detection and/or movement tracking via image capture and processing - Google Patents

Position detection and/or movement tracking via image capture and processing

Info

Publication number
US20080316324A1
US20080316324A1 (application US 12/135,332)
Authority
US
United States
Prior art keywords
digital
gaming
game console
processing module
digital cameras
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/135,332
Inventor
Ahmadreza (Reza) Rofougaran
Maryam Rofougaran
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Avago Technologies International Sales Pte Ltd
Original Assignee
Broadcom Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Broadcom Corp filed Critical Broadcom Corp
Priority to US12/135,332 priority Critical patent/US20080316324A1/en
Assigned to BROADCOM CORPORATION reassignment BROADCOM CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: ROFOUGARAN, AHMADREZA REZA, ROFOUGARAN, MARYAM
Publication of US20080316324A1 publication Critical patent/US20080316324A1/en
Assigned to BROADCOM CORPORATION reassignment BROADCOM CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: KARAOGUZ, JEYHAN, SESHADRI, NAMBIRAJAN, IBRAHIM, BRIMA B., WALLEY, JOHN
Assigned to BANK OF AMERICA, N.A., AS COLLATERAL AGENT reassignment BANK OF AMERICA, N.A., AS COLLATERAL AGENT PATENT SECURITY AGREEMENT Assignors: BROADCOM CORPORATION
Assigned to AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD. reassignment AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BROADCOM CORPORATION
Assigned to BROADCOM CORPORATION reassignment BROADCOM CORPORATION TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS Assignors: BANK OF AMERICA, N.A., AS COLLATERAL AGENT
Legal status: Abandoned

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/57 Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A63F13/573 Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game using trajectories of game objects, e.g. of a golf ball according to the point of impact
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/211 Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/212 Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/213 Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/23 Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console
    • A63F13/235 Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console using a wireless connection, e.g. infrared or piconet
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02 Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/06 Systems determining position data of a target
    • G01S13/42 Simultaneous measurement of distance and other co-ordinates
    • G01S13/426 Scanning radar, e.g. 3D radar
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/87 Combinations of radar systems, e.g. primary radar and secondary radar
    • G01S13/878 Combination of several spaced transmitters or receivers of known location for determining the position of a transponder or a reflector
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/411 Identification of targets based on measurements of radar reflectivity
    • G01S7/412 Identification of targets based on measurements of radar reflectivity based on a comparison between measured values and known or stored values
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/57 Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/825 Fostering virtual characters
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1012 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals involving biosensors worn by the player, e.g. for measuring heart beat, limb activity
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1025 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals details of the interface with the game device, e.g. USB version detection
    • A63F2300/1031 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals details of the interface with the game device, e.g. USB version detection using a wireless connection, e.g. Bluetooth, infrared connections
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/50 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by details of game servers
    • A63F2300/55 Details of game data or player data management
    • A63F2300/5546 Details of game data or player data management using player registration data, e.g. identification, account, preferences, game history
    • A63F2300/5553 Details of game data or player data management using player registration data, e.g. identification, account, preferences, game history user representation in the game field, e.g. avatar
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/003 Bistatic radar systems; Multistatic radar systems
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/66 Radar-tracking systems; Analogous systems
    • G01S13/72 Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
    • G01S13/723 Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar by using numerical data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 Head tracking input arrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/045 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using resistive elements, e.g. a single continuous surface or two parallel surfaces put in contact

Definitions

  • the invention relates generally to position and tracking systems; and, more particularly, it relates to such systems that employ captured digital images to determine the position of, or track the movement of, an object.
  • Radio frequency (RF) wireless communication systems may operate in accordance with one or more standards including, but not limited to, RFID, IEEE 802.11, Bluetooth, advanced mobile phone services (AMPS), digital AMPS, global system for mobile communications (GSM), code division multiple access (CDMA), local multi-point distribution systems (LMDS), multi-channel-multi-point distribution systems (MMDS), and/or variations thereof.
  • Abbreviations used herein include RF (radio frequency), GSM (global system for mobile communications), CDMA (code division multiple access), LMDS (local multi-point distribution systems), MMDS (multi-channel-multi-point distribution systems), and IrDA (Infrared Data Association).
  • a wireless communication device such as a cellular telephone, two-way radio, personal digital assistant (PDA), personal computer (PC), laptop computer, home entertainment equipment, RFID reader, RFID tag, et cetera communicates directly or indirectly with other wireless communication devices.
  • For direct communications (also known as point-to-point communications), the participating wireless communication devices tune their receivers and transmitters to the same channel or channels (e.g., one of the plurality of radio frequency (RF) carriers of the wireless communication system) and communicate over that channel(s).
  • For indirect wireless communications, each wireless communication device communicates directly with an associated base station (e.g., for cellular services) and/or an associated access point (e.g., for an in-home or in-building wireless network) via an assigned channel.
  • the associated base stations and/or associated access points communicate with each other directly, via a system controller, via the public switched telephone network, via the Internet, and/or via some other wide area network.
  • For each RF wireless communication device to participate in wireless communications, it includes a built-in radio transceiver (i.e., receiver and transmitter) or is coupled to an associated radio transceiver (e.g., a station for in-home and/or in-building wireless communication networks, RF modem, etc.).
  • the receiver is coupled to the antenna and includes a low noise amplifier, one or more intermediate frequency stages, a filtering stage, and a data recovery stage.
  • the low noise amplifier receives inbound RF signals via the antenna and amplifies them.
  • the one or more intermediate frequency stages mix the amplified RF signals with one or more local oscillations to convert the amplified RF signal into baseband signals or intermediate frequency (IF) signals.
  • the filtering stage filters the baseband signals or the IF signals to attenuate unwanted out of band signals to produce filtered signals.
  • the data recovery stage recovers raw data from the filtered signals in accordance with the particular wireless communication standard.
  • the transmitter includes a data modulation stage, one or more intermediate frequency stages, and a power amplifier.
  • the data modulation stage converts raw data into baseband signals in accordance with a particular wireless communication standard.
  • the one or more intermediate frequency stages mix the baseband signals with one or more local oscillations to produce RF signals.
  • the power amplifier amplifies the RF signals prior to transmission via an antenna.
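The receive chain just described (low noise amplifier, mixing stage, filtering stage, data recovery) is not given in code by the patent; the following is a minimal numpy sketch of the idea, with all frequencies, gains, and filter choices being illustrative assumptions.

```python
# Hypothetical model of the receiver chain described above: amplify the
# inbound RF signal, mix it with a local oscillation down to baseband,
# filter out the high-frequency image, and recover the raw data.
import numpy as np

fs = 1_000_000                     # sample rate in Hz (assumed)
f_rf = 100_000                     # RF carrier frequency in Hz (assumed)
f_bb = 1_000                       # baseband data tone in Hz (assumed)
t = np.arange(0, 0.01, 1 / fs)

baseband = np.cos(2 * np.pi * f_bb * t)           # raw data as a tone
rf_in = baseband * np.cos(2 * np.pi * f_rf * t)   # inbound RF signal

amplified = 10.0 * rf_in                          # low noise amplifier
lo = np.cos(2 * np.pi * f_rf * t)                 # local oscillation
mixed = amplified * lo        # product = 5*baseband + image at 2*f_rf

# filtering stage: a simple moving average attenuates the 2*f_rf image
kernel = np.ones(64) / 64
filtered = np.convolve(mixed, kernel, mode="same")

recovered = filtered / 5.0                        # data recovery stage
print("correlation with original baseband:",
      np.corrcoef(recovered[500:-500], baseband[500:-500])[0, 1])
```

The transmitter path runs the same stages in the opposite direction: the data modulation stage produces baseband signals, the intermediate frequency stages mix them up with local oscillations, and the power amplifier drives the antenna.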
  • radio transceivers are implemented in one or more integrated circuits (ICs), which are inter-coupled via traces on a printed circuit board (PCB).
  • the radio transceivers operate within licensed or unlicensed frequency spectrums. For example, wireless local area network (WLAN) transceivers typically operate within the unlicensed Industrial, Scientific, and Medical (ISM) frequency spectrum.
  • in IR communication systems, an IR device includes a transmitter, a light emitting diode (LED), a receiver, and a silicon photo diode.
  • the transmitter modulates a signal, which drives the LED to emit infrared radiation which is focused by a lens into a narrow beam.
  • the receiver via the silicon photo diode, receives the narrow beam infrared radiation and converts it into an electric signal.
  • IR communications are used in video games to detect the direction in which a game controller is pointed.
  • an IR sensor is placed near the game display, where the IR sensor detects the IR signal transmitted by the game controller. If the game controller is too far away, too close, or angled away from the IR sensor, the IR communication will fail.
  • Further advances in video gaming include three accelerometers in the game controller to detect motion by way of acceleration.
  • the motion data is transmitted to the game console via a Bluetooth wireless link.
  • the Bluetooth wireless link may also transmit the IR direction data to the game console and/or convey other data between the game controller and the game console.
  • the IR communication has only a limited area in which a player can be located for the IR communication to work properly.
  • the accelerometer only measures acceleration, such that true one-to-one detection of motion is not achieved (see the sketch below).
  • the gaming motion is limited to a handful of directions (e.g., horizontal, vertical, and a few diagonal directions).
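None of the following appears in the patent; it is a small sketch of why acceleration-only sensing falls short of one-to-one motion detection. A modest, constant accelerometer bias, integrated twice to obtain position, grows quadratically into a large position error (the bias value and sample period are assumptions).

```python
# Hypothetical illustration: double-integrating a biased accelerometer
# reading turns a tiny error into meters of position drift.
import numpy as np

dt = 0.01                        # 10 ms sample period (assumed)
t = np.arange(0, 10, dt)         # ten seconds of samples
true_accel = np.zeros_like(t)    # the controller is actually held still
bias = 0.02                      # 0.02 m/s^2 constant bias (assumed)

measured = true_accel + bias
velocity = np.cumsum(measured) * dt      # first integration
position = np.cumsum(velocity) * dt      # second integration

print(f"apparent drift after 10 s: {position[-1]:.2f} m")   # about 1 m
```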
  • FIG. 1 is a diagram of an embodiment of an apparatus that performs position determination and/or movement tracking via image capture and processing.
  • FIG. 2 is a diagram of an alternative embodiment of an apparatus that performs position determination and/or movement tracking via image capture and processing.
  • FIG. 3 is a diagram of an embodiment showing a means by which position of a point, object, etc. may be determined using multiple directional vectors extending from multiple known locations, respectively, to that point, object, etc.
  • FIG. 4 is a diagram of an embodiment showing the relationship between an object point and various image planes that have performed image capture of the object point.
  • FIG. 5 is a diagram of an embodiment showing the relationship between multiple object points and various image planes that have performed image capture of the multiple object points.
  • FIG. 6 is a diagram of an embodiment showing an image sensor and the association of physical pixels and the image pixels generated therefrom.
  • FIG. 7A and FIG. 7B are diagrams of an embodiment of an apparatus that employs directional vectors associated with captured images, at least some of which depict an object, to determine position of the object.
  • FIG. 8A and FIG. 8B are diagrams of an embodiment of an apparatus that employs directional vectors associated with images, which depict a number of objects, to determine the position of a device that has captured the images.
  • FIG. 9 is a schematic block diagram of an overhead view of an embodiment of a gaming system.
  • FIG. 10 is a schematic block diagram of a side view of an embodiment of a gaming system.
  • FIG. 11 is a diagram illustrating an embodiment of a gaming system including multiple digital cameras for capturing images to undergo processing in a game module that is wire-coupled to the multiple digital cameras, for position detection and/or movement tracking.
  • FIG. 12 is a diagram illustrating an alternative embodiment of a gaming system including multiple digital cameras for capturing images to undergo processing in a game module that is wirelessly coupled to at least some of the multiple digital cameras, for position detection and/or movement tracking.
  • FIG. 13 is a schematic block diagram of a side view of another embodiment of a gaming system.
  • FIG. 14 is a schematic block diagram of an overhead view of another embodiment of a gaming system.
  • FIG. 15, FIG. 16, and FIG. 17 are diagrams of an embodiment of a coordinate system of a gaming system.
  • FIG. 18, FIG. 19, and FIG. 20 are diagrams of another embodiment of a coordinate system of a gaming system.
  • FIG. 21 is a diagram of a method for determining position and/or motion tracking.
  • FIG. 22 is a diagram of another method for determining position and/or motion tracking.
  • FIG. 23, FIG. 24, and FIG. 25 are diagrams of another embodiment of a coordinate system of a gaming system.
  • FIG. 26, FIG. 27, and FIG. 28 are diagrams of another embodiment of a coordinate system of a gaming system.
  • FIG. 29 is a diagram of another method for determining position and/or motion tracking.
  • FIG. 30 is a diagram of another method for determining position and/or motion tracking.
  • FIG. 31 is a diagram of another method for determining position and/or motion tracking.
  • FIG. 32 is a diagram of another method for determining position and/or motion tracking.
  • FIG. 33 is a diagram of another embodiment of a coordinate system of a gaming system.
  • FIG. 34 is a diagram of a method for determining motion.
  • FIG. 35 is a diagram of an example of reference points on a player and/or gaming object.
  • FIG. 36 , FIG. 37 , and FIG. 38 are diagrams of examples of motion patterns.
  • FIG. 39 is a diagram of an example of motion estimation.
  • FIG. 40 and FIG. 41 are diagrams of examples of reference points on a player to determine player's physical measurements.
  • FIG. 42 is a diagram of an example of mapping a player to an image.
  • FIG. 43 is a diagram of another method for determining motion.
  • FIG. 44 is a schematic block diagram of an embodiment of a gaming object and/or game console.
  • FIG. 45, FIG. 46, and FIG. 47 are diagrams of various embodiments of methods for determining position and/or motion tracking.
  • FIG. 48 is a diagram of an embodiment of a method for determining a distance based on captured digital images.
  • FIG. 1 is a diagram of an embodiment of an apparatus that performs position determination and/or movement tracking via image capture and processing.
  • the apparatus includes a number of digital cameras that generate digital images. An object is depicted within at least some of the digital images.
  • a processing module is coupled to receive the digital images. The processing module processes the digital images to identify characteristics of the object as depicted within at least some of the digital images. Based on the identified characteristics, the processing module determines position of the object with respect to locations of at least some of the digital cameras.
  • the processing module identifies directional vectors based on the identified characteristics of the object. These directional vectors may be viewed as extending from known locations (e.g., locations of the digital cameras, point of reference within the digital cameras, etc.) to the object.
  • a digital camera includes an electronic image sensor.
  • a digital image sensor, when mounted on a surface of an integrated circuit and implemented to perform image capture directly, may also be viewed as an alternative embodiment of a digital camera.
  • the specifications of such digital image sensors are oftentimes defined in terms of the number of physical pixels within the digital image sensor, which corresponds to the number of image pixels that a picture captured by the image sensor will have. For example, as the processes by which digital cameras are manufactured continue to improve, the number of mega-pixels that a digital image sensor includes continues to increase. Generally, the digital image sensors within digital cameras have a million or more physical pixels (i.e., one or more mega-pixels).
  • a reference point within a digital camera may serve as a point from which a directional vector is defined.
  • a camera center of projection of the digital camera is a point to which all points in the image can be traced back.
  • the focal distance of the digital camera may also correspond to the camera center of projection of the digital camera.
  • a directional vector may be defined as extending from such a reference point within the digital camera to a physical pixel that has captured a particular portion of an object of interest.
  • an image pixel of interest within a digital image corresponds to a physical pixel of the digital image sensor of the digital camera.
  • a directional vector may be defined as extending from that reference point within the digital camera to that physical pixel.
  • an object may include one or more sensing tags thereon to assist in the identification of the characteristics of the object depicted within at least some of the digital images.
  • sensing tags include a particular type of material (e.g., metal, etc.), an RFID tag, a material having particular properties (e.g., a light reflective material, a light absorbent material, etc.), a specific RGB [red, green, blue] color or combination of colors, a particular pattern, etc.
  • the object whose characteristics are identified may have a predetermined size.
  • a player/user may employ a gaming object when playing a game, and the size of such a gaming object may be known beforehand.
  • the actual/physical size of the object may be associated with the identified ‘image size’ as depicted within the digital image.
  • the relationship between these two (e.g., the image size and the predetermined size) may then be employed when determining the position of the object.
  • Each of the digital cameras has a corresponding field of view in which it can perform image capture. Again, the object is depicted within at least some of the fields of view of at least some of the digital cameras. When the object is not within any field of view of any digital camera, then at least some of the digital cameras can be adjusted (e.g., such as using an actuator coupled to or integrated with a digital camera) so that the object may be visible within at least one of the fields of view of at least one of the cameras.
  • any of the digital cameras may be adjusted.
  • a digital camera may have auto-focus capability in which the focal distance of the digital camera is adjusted to provide a maximum clarity image of the object of interest.
  • the image capture rate of any digital camera may be adjusted based on a number of factors including a predetermined setting within the processing module, a user-selected setting within the processing module, a movement history of the object, a current movement of the object, and an expected future movement of the object.
  • the movement of the object may also be determined by merely updating the position of the object as a function of time.
  • the processing module may determine a first position of the object during a first time, and the processing module may then determine a second position of the object during a second time.
  • the movement of the object may be estimated by comparing the first determined position and the second determined position.
  • the rate of the movement of the object may be determined by also considering the times associated with each of the first determined position and the second determined position.
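As a concrete reading of the preceding bullets, here is a minimal sketch (not from the patent; the names are illustrative) of estimating movement and its rate from two timestamped position fixes:

```python
# Hypothetical helper: displacement and speed from two position fixes.
from dataclasses import dataclass

@dataclass
class Fix:
    x: float
    y: float
    z: float
    t: float   # position in meters, time in seconds

def movement(first: Fix, second: Fix):
    """Return the displacement vector (m) and speed (m/s) between fixes."""
    dx = second.x - first.x
    dy = second.y - first.y
    dz = second.z - first.z
    dt = second.t - first.t
    speed = (dx * dx + dy * dy + dz * dz) ** 0.5 / dt
    return (dx, dy, dz), speed

disp, speed = movement(Fix(0.0, 0.0, 0.0, 0.0), Fix(0.3, 0.4, 0.0, 0.1))
print(disp, speed)   # (0.3, 0.4, 0.0) and 5.0 m/s
```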
  • the digital cameras may be ‘smart’ digital cameras in some embodiments that include means by which the configuration of the digital camera may be determined and communicated back to the processing module. Certain information such as focal length of the digital camera, the image capture setting of the digital camera (e.g., for digital cameras that can capture images having different numbers of pixels), physical orientation, physical location, etc. may be determined by such a smart digital camera, communicated back to the processing module, and then the processing module can consider this higher level of information when employing the identified characteristics of the object to determine the position of the object.
  • wireless communication may also be employed between the various components of such an apparatus without departing from the scope and spirit of the invention.
  • FIG. 2 is a diagram of an alternative embodiment of an apparatus that performs position determination and/or movement tracking via image capture and processing. This embodiment is somewhat analogous to the previous embodiment, with at least one difference being that the digital cameras are wirelessly coupled to the processing module. It is also noted that at least one digital camera may be integrated into the processing module.
  • the wireless means by which communication is supported may be varied, and it may be supported using any desired radio frequency (RF) communication standard including any that operates in accordance with one or more standards including, but not limited to, RFID, IEEE 802.11, Bluetooth, advanced mobile phone services (AMPS), digital AMPS, global system for mobile communications (GSM), code division multiple access (CDMA), local multi-point distribution systems (LMDS), multi-channel-multi-point distribution systems (MMDS), and/or variations thereof.
  • At least one of the digital cameras includes a first radio frequency (RF) transceiver
  • the processing module includes a second RF transceiver. Based on an RF signal transmitted between the first RF transceiver and the second RF transceiver, the processing module can then determine a distance between the processing module and the digital camera from which the RF signal was transmitted. By using a transmission time at which the RF signal is transmitted from a first device, and a receive time at which the RF signal is received by a second device, and also knowing the speed/velocity at which the RF signal travels, then the distance between the first device and the second device may be determined.
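A minimal sketch of that time-of-flight computation (not from the patent; it assumes the two transceivers share a synchronized clock):

```python
# Hypothetical time-of-flight ranging between two RF transceivers.
C = 299_792_458.0   # propagation speed of an RF signal in free space (m/s)

def distance_m(transmit_time_s: float, receive_time_s: float) -> float:
    """Distance between transmitter and receiver from time of flight."""
    return C * (receive_time_s - transmit_time_s)

# a flight time of 10 nanoseconds corresponds to roughly 3 meters
print(distance_m(0.0, 10e-9))   # ~2.998 m
```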
  • FIG. 3 is a diagram of an embodiment showing a means by which position of a point, object, etc. may be determined using multiple directional vectors extending from multiple known locations, respectively, to that point, object, etc.
  • This diagram depicts 3D space in a right handed, Cartesian coordinate system (e.g., shown as having axes xyz). Clearly, the principles described with respect to this diagram are applicable to any other 3D coordinate system as well.
  • this position may be mapped to a virtual 3D coordinate system.
  • the upper right hand corner of the diagram depicts a virtual 3D space in a right handed, Cartesian coordinate system (e.g., shown as having axes x′y′z′).
  • FIG. 4 is a diagram of an embodiment showing the relationship between an object point and various image planes that have performed image capture of the object point.
  • This diagram shows two separate image planes, as corresponding to two separate digital cameras, that capture digital images of an object from different perspectives or fields of view.
  • the image plane of a digital camera may be considered as corresponding to the digital image sensor component of the digital camera.
  • a digital image sensor may be a complementary metal-oxide-semiconductor (CMOS) device or a charge coupled device (CCD).
  • various parameters generally are employed to define a digital image sensor, including an image sensor type (e.g., 1/4″, 1/3.6″, etc.), a width and height (typically provided in millimeters), a total number of physical pixels (e.g., X megapixels, where X is a number such as 3, 6, 8.1, etc.), a number of physical pixels along each of the width and height of the digital image sensor (e.g., y×z, where y and z are integer numbers), a diagonal size (again, typically provided in millimeters) that corresponds to the normal lens focal length, the focal length factor, etc.
  • each physical pixel of a digital image sensor captures information (e.g., color, intensity, etc.) of a portion of the field of view of the digital camera, and this information is employed to generate an image pixel of a digital image. Therefore, in the digital image context, there can be viewed as being a one to one relationship between each physical pixel of a digital image sensor and each image pixel of an image generated from information captured by the digital image sensor.
  • a directional vector extends from a reference point of a digital camera 1 (DC 1 ) through the image plane of DC 1 to a point on the object of interest.
  • a directional vector (DV 1 ) also extends from this DC 1 reference point through the image plane of DC 1 (which corresponds to the digital image sensor of DC 1 ).
  • This camera reference point may be a camera center of projection for DC 1 based on its current configuration (e.g., focus, etc.). Alternatively, another camera reference point may be employed (e.g., focal point, predetermined point within the camera, etc.) without departing from the scope and spirit of the invention.
  • another directional vector extends from a reference point of a DC 2 through the image plane of DC 2 to the same point on the object of interest. If the locations of DC 1 and DC 2 are known, and if the directional vectors extending from the respective points of reference of each of DC 1 and DC 2 are known, then the principles of triangulation may be employed to determine the location of the object point on the object of interest.
  • an image 1 height is the height of the object as depicted in a digital image captured by DC 1
  • an image 2 height is the height of the object as depicted in a digital image captured by DC 2 .
  • a first ratio of the actual height to the image 1 height may be made, and a second ratio of the actual height to the image 2 height may be made.
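Under an idealized pinhole-camera model (an assumption; the patent does not spell out the formula), each such ratio yields a range from the camera to the object:

```python
# Hypothetical pinhole-model range from the actual/image height ratio.
def range_from_height(actual_height_m: float,
                      image_height_px: float,
                      focal_length_mm: float,
                      pixel_pitch_mm: float) -> float:
    """Distance (m) at which an object of known actual height projects
    to the measured image height; the focal length and pixel pitch are
    assumed, known camera parameters."""
    image_height_mm = image_height_px * pixel_pitch_mm
    return actual_height_m * focal_length_mm / image_height_mm

# a 1.8 m tall player spanning 600 px on a 4.0 mm lens with 3 um pixels
print(range_from_height(1.8, 600.0, 4.0, 0.003))   # 4.0 m
```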
  • FIG. 5 is a diagram of an embodiment showing the relationship between multiple object points and various image planes that have performed image capture of the multiple object points. This diagram has some similarities to the previous embodiment, in that a directional vector extends from a reference point of a digital camera through the image plane of the digital camera to a point on the object of interest.
  • the object of this embodiment includes a number of sensing tags thereon.
  • These sensing tags can be portions of the object having a particular color, a light reflective material, a light absorbent material, an infrared light source, etc.
  • the sensing tags have some associated characteristic that is identifiable on the object.
  • the object in this diagram also has different types of sensing tags (e.g., of type 1 , type 2 , etc.).
  • This use of different types of sensing tags on an object may be employed to assist in determining the position and orientation of the object (sometimes referred to as ‘pose’ in the image processing context), since different sides, areas, etc. of the object may be better distinguished from one another. For example, when considering an object such as a cube, it may be determined whether the cube is right side up (or upside down) with reference to a desired convention of which side of the cube is deemed to be ‘up’.
  • first directional vectors associated with type 1 sensing tags extend from a reference point of a digital camera through the image plane of the digital camera to two separate points on the object that have type 1 sensing tags.
  • Second directional vectors associated with type 2 sensing tags extend from the reference point of the digital camera through the image plane of the digital camera to two separate points on the object that have type 2 sensing tags.
  • FIG. 6 is a diagram of an embodiment showing an image sensor and the association of physical pixels and the image pixels generated therefrom.
  • a digital image sensor is the element that captures information (e.g., color, intensity, contrast, etc.) of a field of view of the digital camera.
  • Each individual physical pixel of the digital image sensor captures a small portion of the field of view of the digital camera. For example, if the digital image sensor includes one million physical pixels, then each individual physical pixel of the digital image sensor captures information of one-millionth of the field of view of the digital camera. If the digital image sensor includes X megapixels, then each individual physical pixel of the digital image sensor captures information of (1/(X×10^6))th of the field of view of the digital camera.
  • each of these discrete pieces of information is used to form a digital image corresponding to what is seen in the field of view of the digital camera.
  • a directional vector extends from a reference point of a digital camera to one of the physical pixels of the digital image sensor. For example, when a particular image pixel of a digital image is identified, then the corresponding physical pixel that captured information used to generate that image pixel can be determined. Such a directional vector can then be determined. This directional vector may be the directional vector generated from this digital camera to a particular point on the object of interest.
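A minimal sketch of that pixel-to-vector mapping under an assumed ideal pinhole model (the patent does not prescribe a particular camera model; the focal length and pixel pitch here are illustrative parameters):

```python
# Hypothetical mapping from a physical pixel to a directional vector
# anchored at the camera's center of projection.
import numpy as np

def pixel_direction(px: float, py: float,
                    width: int, height: int,
                    focal_mm: float, pitch_mm: float) -> np.ndarray:
    """Unit vector, in camera coordinates, through physical pixel (px, py)."""
    x_mm = (px - width / 2.0) * pitch_mm    # offset from sensor center
    y_mm = (py - height / 2.0) * pitch_mm
    v = np.array([x_mm, y_mm, focal_mm])    # +z points out of the lens
    return v / np.linalg.norm(v)

# direction through pixel (1000, 400) of a 1600x1200 sensor
print(pixel_direction(1000, 400, 1600, 1200, 4.0, 0.003))
```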
  • FIG. 7A and FIG. 7B are diagrams of an embodiment of an apparatus, shown from two separate perspectives, that employs directional vectors associated with captured images, at least some of which depict an object, to determine position of the object.
  • Referring to FIG. 7A, which is viewed in the xy plane of a 3D space having an xyz coordinate system, the principles of triangulation may be employed when determining the position of an object that is depicted in digital images captured by multiple digital cameras.
  • a projection of a first directional vector (DV 1 proj.) from a first digital camera (DC 1 ) extends from the first digital camera to the object.
  • a projection of a second directional vector (DV 2 proj.) from a second digital camera (DC 2 ) extends from the second digital camera to the object.
  • Additional directional vectors, associated with additional digital cameras, may also be employed.
  • the directional vectors then undergo processing in a processing module to determine the intersection of the various directional vectors. The intersection of these directional vectors is the location of the object (see the sketch below).
  • this diagram is viewed in the xz plane of a 3D space having an xyz coordinate system.
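The patent does not give an algorithm for intersecting the directional vectors; one standard choice (an assumption here) is the least-squares point closest to all rays, which tolerates the small measurement errors that keep real rays from meeting exactly:

```python
# Hypothetical least-squares intersection of rays from known cameras.
import numpy as np

def intersect_rays(origins, directions):
    """Point minimizing the summed squared distance to every ray, where
    each ray starts at origins[i] and points along directions[i]."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        d = np.asarray(d, float) / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)   # projector orthogonal to the ray
        A += P
        b += P @ np.asarray(o, float)
    return np.linalg.solve(A, b)

# two cameras a meter apart, both sighting the point (0.5, 1.0, 2.0)
target = np.array([0.5, 1.0, 2.0])
cams = [np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])]
dirs = [target - c for c in cams]
print(intersect_rays(cams, dirs))   # ~[0.5, 1.0, 2.0]
```

The same routine applies to the reversed problem of FIG. 8A and FIG. 8B, where rays from known object positions locate the camera-bearing device.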
  • FIG. 8A and FIG. 8B are diagrams of an embodiment of an apparatus, shown from two separate perspectives, respectively, that employs directional vectors associated with images, which depict a number of objects, to determine the position of a device that has captured the images.
  • the principles of using triangulation may be employed when determining position of a device that includes multiple digital cameras (e.g., a first digital camera (DC 1 ), a second digital camera (DC 2 ), etc.) that capture digital images that depict various known objects (e.g., a first object (object 1 ), a second object (object 2 ), etc.).
  • a projection of a first directional vector (DV 1 proj.) from a first object (object 1 ) extends to the first digital camera (DC 1 ).
  • a projection of a second directional vector (DV 2 proj.) extends from a second object (object 2 ) to a second digital camera (DC 2 ).
  • Additional directional vectors, associated with additional objects, may also be employed.
  • the orientations of the directional vectors undergo processing in a processing module to determine their intersection. The intersection of these directional vectors is the location of the device that includes the multiple digital cameras.
  • this diagram is viewed in the xz plane of a 3D space having an xyz coordinate system.
  • FIG. 9 is a schematic block diagram of an overhead view of an embodiment of a gaming system that includes a game console and a gaming object.
  • the gaming system has an associated physical area in which the game console and the gaming object are located.
  • the physical area may be a room, portion of a room, and/or any other space where the gaming object and game console are proximally co-located (e.g., airport terminal, on a bus, on an airplane, etc.).
  • the gaming object may be a wireless game controller and/or any object used or worn by the player to facilitate play of a video game.
  • the gaming object may be a simulated sword, a simulated gun, a helmet, a vest, a hat, shoes, socks, pants, shorts, gloves, etc.
  • the game console determines the positioning of the gaming object within the physical area using one or more position determination techniques as subsequently discussed. Once the gaming object's position is determined, the game console tracks the motion of the gaming object using one or more motion tracking techniques as subsequently discussed to facilitate video game play. In this embodiment, the game console may determine the positioning of the gaming object within a positioning tolerance (e.g., within a meter) at a positioning update rate (e.g., once every second or once every few seconds) and track the motion within a motion tracking tolerance (e.g., within a few millimeters) at a motion tracking update rate (e.g., once every 10-100 milliseconds); a sketch of such a two-rate loop follows.
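A minimal sketch of that two-rate scheme (not from the patent; the callback names and periods are illustrative assumptions):

```python
# Hypothetical game-console loop: coarse positioning about once per
# second, fine motion tracking every few tens of milliseconds.
import time

POSITIONING_PERIOD_S = 1.0    # positioning update rate (assumed)
TRACKING_PERIOD_S = 0.05      # motion tracking update rate (assumed)

def game_loop(determine_position, track_motion, duration_s=3.0):
    start = time.monotonic()
    last_fix = start
    while time.monotonic() - start < duration_s:
        now = time.monotonic()
        if now - last_fix >= POSITIONING_PERIOD_S:
            determine_position()   # coarse fix, e.g. within a meter
            last_fix = now
        track_motion()             # fine update, e.g. within millimeters
        time.sleep(TRACKING_PERIOD_S)

game_loop(lambda: print("position fix"), lambda: None)
```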
  • FIG. 10 is a schematic block diagram of a side view of an embodiment of a gaming system of FIG. 9 to illustrate that the positioning and motion tracking are done in three-dimensional space.
  • the gaming system provides accurate motion tracking of the gaming object, which may be used to map the player's movements to a graphics image for true interactive video game play.
  • FIG. 11 is a diagram illustrating an embodiment of a gaming system including multiple digital cameras for capturing images to undergo processing in a game module that is wire-coupled to the multiple digital cameras, for position detection and/or movement tracking.
  • a physical gaming environment (at least a portion of which may be represented within a virtual gaming environment) includes a number of digital cameras arranged at various locations therein to effectuate the image capture of a player and/or gaming object associated with the player. There may be some instances where the player has no gaming object (e.g., when simulating boxing), and the bodily position and/or movement of the player are those elements being monitored and/or tracked.
  • Each digital camera has a corresponding field of view in which it can perform image capture.
  • an entirety of the physical gaming environment can be visually captured by digital images generated by the digital cameras.
  • multiple views of a single object within the physical gaming environment can be obtained.
  • the game console is operable to perform processing of digital images captured by the digital cameras to identify characteristics of an object depicted within at least some of the digital images. Based on the identified object characteristics, the game console is operable to determine position of the object with respect to the digital cameras.
  • certain initialization processes can be performed in which the player and/or gaming object remains motionless.
  • the digital cameras then may perform image capture of the motionless player and/or gaming object for calibration purposes.
  • a size (e.g., height, width, etc.) of the player and/or gaming object may be determined during such calibration, and the size of other objects within the physical gaming environment may be estimated based on their size relative to that of a known object.
  • various means of performing digital image processing may be employed, including pattern recognition in which a predetermined pattern (e.g., corresponding to a particular shape) is compared to patterns detected within one of the digital images captured by one of the digital cameras.
  • a particular shape may have more than one pattern corresponding thereto (e.g., a pattern 1 of a person-related-shape corresponding to a taller/slender person vs. a pattern 2 of a person-related-shape corresponding to a shorter/bulky person, etc.).
  • even if a pattern detected within a digital image is not an expected pattern and cannot be associated with a predetermined pattern being searched for within the digital image, the detected pattern can be added to a memory that stores a number of patterns/shapes that may be detected within digital images.
  • Another means of performing digital image processing may include searching for a particular color (e.g., as associated with a player, gaming object, etc.) within a digital image captured by a digital camera. For example, a player may wear a particular colored clothing article, and when processing the digital image captured by a digital camera, the color associated with that known-colored clothing article is sought.
  • Other means of performing digital image processing may be employed, including searching for reflections off of reflective material that covers the player and/or gaming object.
  • This digital image processing may involve searching for pixels or groups of pixels within a digital image above a certain intensity threshold (which may be predetermined or adaptively set for each digital image). When the intensity is above that threshold, then that pixel (or group of pixels) can be associated with the reflective material covering the player and/or gaming object; a minimal sketch follows.
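The following is not from the patent; the adaptive threshold rule and the synthetic frame are assumptions for illustration:

```python
# Hypothetical detection of a reflective sensing tag by intensity
# thresholding, with the threshold set adaptively from each image.
import numpy as np

def find_tag(gray_image: np.ndarray):
    """Return the centroid (row, col) of bright pixels, or None."""
    threshold = gray_image.mean() + 5.0 * gray_image.std()
    bright = np.argwhere(gray_image > threshold)
    if bright.size == 0:
        return None                  # tag not visible in this image
    return bright.mean(axis=0)       # centroid of the bright region

frame = np.random.default_rng(0).normal(80.0, 10.0, (480, 640))
frame[200:210, 300:310] = 255.0      # simulated reflection off the tag
print(find_tag(frame))               # ~(204.5, 304.5)
```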
  • Additional variations of the physical gaming environment may be employed such as providing special lighting to enhance the reflecting of light off of reflective material covering at least a portion of the player and/or gaming object.
  • an appropriate backdrop could also be employed to provide a higher degree of contrast between the player and/or gaming object and the rest of the physical gaming environment.
  • Certain operational parameters of the digital cameras may also be adjusted by a user/player or in real time by control signals provided by the game console.
  • the image capture rate employed by the digital cameras may be adjusted based on any number of considerations including a predetermined setting within the game console, a player-selected setting within the game console (e.g., as selected by the player via a user interface), a type of game being played, a movement history of the player and/or gaming object, a current or expected movement of the player and/or gaming object, etc.
  • any one of the digital cameras may include an integrated actuator to perform real-time re-positioning of the digital camera to effectuate better image capture of the player and/or gaming object within the physical gaming environment.
  • the camera may be mounted on an actuator that can perform such re-positioning of the digital camera.
  • a player/user can perform re-positioning of any digital camera as well.
  • the digital cameras are all wire-coupled to the game console.
  • Any desired wire-based communication protocol (e.g., Ethernet) may be employed to effectuate communication between the digital cameras and the game console, to communicate digital images from the digital cameras to the game console and command signals (if necessary) from the game console to the digital cameras.
  • FIG. 12 is a diagram illustrating an alternative embodiment of a gaming system including multiple digital cameras for capturing images to undergo processing in a game module that is wirelessly coupled to the multiple digital cameras, for position detection and/or movement tracking.
  • This embodiment is somewhat analogous to the previous embodiment, with at least one difference being that at least some of the digital cameras and the game console each include wireless communication capability to effectuate wireless communication there between.
  • at least one of the digital cameras is wire-coupled to the game console.
  • each of the wirelessly coupled digital cameras and the game console either includes an integrated wireless transceiver or is coupled to a wireless transceiver to effectuate communication between those digital cameras and the game console.
  • a digital camera may be integrated into the game console as well without departing from the scope and spirit of the invention.
  • This wireless communication can be supported using any number of desired wireless protocols including Code Division Multiple Access (CDMA) signaling, Time Division Multiple Access (TDMA) signaling, Frequency Division Multiple Access (FDMA) signaling, or some other desired wireless standard, protocol, or proprietary means of communication.
  • the wireless communication can be supported using any desired radio frequency (RF) communication standard including any that operates in accordance with one or more standards including, but not limited to, RFID, IEEE 802.11, Bluetooth, advanced mobile phone services (AMPS), digital AMPS, global system for mobile communications (GSM), code division multiple access (CDMA), local multi-point distribution systems (LMDS), multi-channel-multi-point distribution systems (MMDS), and/or variations thereof.
  • FIG. 13 is a schematic block diagram of a side view of another embodiment of a gaming system that includes multiple gaming objects, the player, and a game console.
  • the gaming objects include one or more sensing tags (e.g., metal, RFID tag, light reflective material, light absorbent material, a specific RGB [red, green, blue] color, etc.).
  • the gaming objects may include a game controller, a helmet, a shirt, pants, gloves, and socks, each of which includes one or more sensing tags. In this manner, the sensing tags facilitate the determining of position and/or facilitate motion tracking as will be subsequently discussed.
  • FIG. 14 is a schematic block diagram of an overhead view of another embodiment of a gaming system that includes a game console, a plurality of players and a plurality of gaming objects.
  • the positioning and motion tracking of each of the gaming objects are determined by the game console and/or the one or more peripheral sensors.
  • FIG. 15, FIG. 16, and FIG. 17 are diagrams of an embodiment of a coordinate system of a localized physical area that may be used for a gaming system.
  • an xyz origin is selected to be somewhere in the localized physical area and each point being tracked and/or used for positioning on the player and/or on the gaming object is determined based on its Cartesian coordinates (e.g., x1, y1, z1).
  • as the player and/or gaming object moves, the new positions of the tracking and/or positioning points are determined in Cartesian coordinates with respect to the origin.
  • FIG. 18, FIG. 19, and FIG. 20 are diagrams of another embodiment of a coordinate system of a localized physical area that may be used for a gaming system.
  • an origin is selected to be somewhere in the localized physical area and each point being tracked and/or used for positioning on the player and/or on the gaming object is determined based on its vector, or spherical, coordinates (ρ, θ, φ), which are defined as: ρ ≥ 0 is the distance from the origin to a given point P; 0 ≤ θ ≤ 180° is the angle between the positive z-axis and the line formed between the origin and P; and 0 ≤ φ < 360° is the angle between the positive x-axis and the line from the origin to P projected onto the xy-plane.
  • θ is referred to as the zenith, colatitude, or polar angle, while φ is referred to as the azimuth.
  • To plot a point from its spherical coordinates, go ρ units from the origin along the positive z-axis, rotate θ about the y-axis in the direction of the positive x-axis, and rotate φ about the z-axis in the direction of the positive y-axis. As the player and/or gaming object moves, the new positions of the tracking and/or positioning points are determined in vector, or spherical, coordinates with respect to the origin.
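  • By way of illustration only, the following Python sketch converts such spherical coordinates into the Cartesian coordinates of FIGS. 15-17; the function name and example values are illustrative assumptions, not part of the disclosed system.

```python
import math

def spherical_to_cartesian(rho, theta_deg, phi_deg):
    """Convert spherical coordinates (rho, theta, phi) to Cartesian (x, y, z);
    theta is the zenith (polar) angle from the positive z-axis and phi is the
    azimuth from the positive x-axis in the xy-plane."""
    theta = math.radians(theta_deg)
    phi = math.radians(phi_deg)
    x = rho * math.sin(theta) * math.cos(phi)
    y = rho * math.sin(theta) * math.sin(phi)
    z = rho * math.cos(theta)
    return (x, y, z)

# Example: a tracked point 2 m from the origin, 45 degrees off the z-axis,
# 30 degrees around from the x-axis.
print(spherical_to_cartesian(2.0, 45.0, 30.0))  # ~(1.225, 0.707, 1.414)
```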
  • While FIGS. 15-20 illustrate two types of coordinate systems, any three-dimensional coordinate system may be used for tracking motion and/or establishing position within a gaming system.
  • FIG. 21 is a diagram of a method for determining position and/or motion tracking that begins by determining the environment parameters (e.g., determining the properties of the localized physical area such as height, width, depth, objects in the physical area, etc.). The method then continues by mapping the environment parameters to a coordinate system (e.g., the Cartesian coordinate system of FIGS. 15-17). The method continues in one or more branches. Along one branch, the initial coordinates of the player are determined using one or more of a plurality of position determining techniques as described herein. This branch continues by updating the player's position to track the player's motion using one or more of a plurality of motion tracking techniques as described herein.
  • the other branch includes determining the coordinates of the gaming object's initial position using one or more of a plurality of position determining techniques as described herein. This branch continues by updating the gaming object's position to track the gaming object's motion using one or more of a plurality of motion tracking techniques as described herein. Note that the motion of the player and/or gaming object may be tracked at a rate based on the video game being played and the expected speed of motion. Further note that a tracking rate of 10 milliseconds provides 0.1 mm accuracy in motion tracking.
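  • Offered only as a hedged illustration of the FIG. 21 flow, the Python sketch below shows the two branches as a loop that periodically refreshes the player's and the gaming object's coordinates; the grid object and its locate method are hypothetical stand-ins for whichever position determining technique is employed.

```python
import time

def track_positions(grid, player, gaming_object, steps, interval_s=0.01):
    """Repeatedly update the player's and the gaming object's coordinates
    within a mapped coordinate grid to track their motion over time."""
    history = []
    for _ in range(steps):
        player_pos = grid.locate(player)           # player branch update
        object_pos = grid.locate(gaming_object)    # gaming-object branch update
        history.append((player_pos, object_pos))
        time.sleep(interval_s)                     # e.g., a 10 ms tracking rate
    return history
```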
  • FIG. 22 is a diagram of another method for determining position and/or motion tracking that begins by determining a reference point within a coordinate system (e.g., the vector coordinate system of FIGS. 18-20).
  • the reference point may be the origin or any other point within the localized physical area.
  • the method continues in one or more branches.
  • a vector with respect to the reference point is determined to indicate the player's initial position, which may be done by using one or more of a plurality of position determining techniques as described herein.
  • This branch continues by updating the player's position to track the player's motion using one or more of a plurality of motion tracking techniques as described herein.
  • the other branch includes determining a vector with respect to the reference point for the gaming object to establish its initial position, which may be done by using one or more of a plurality of position determining techniques as described herein.
  • This branch continues by updating the gaming object's position to track the gaming object's motion using one or more of a plurality of motion tracking techniques as described herein.
  • the motion of the player and/or gaming object may be tracked at a rate based on the video game being played and the expected speed of motion. Further note that a tracking rate of 10 milliseconds provides 0.1 mm accuracy in motion tracking.
  • FIG. 23, FIG. 24, and FIG. 25 are diagrams of another embodiment of a coordinate system of a localized physical area that may be used for a gaming system.
  • an xyz origin is selected to be somewhere in the localized physical area and the initial position of a point being tracked on the player and/or gaming object is determined based on its Cartesian coordinates (e.g., x1, y1, z1).
  • as the player and/or gaming object moves, the new positions of the tracking and/or positioning points are determined in Cartesian coordinates with respect to the preceding location (e.g., Δx, Δy, Δz), as illustrated in the sketch below.
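  • A minimal Python sketch of such relative (delta-based) position updating follows; the function name and coordinate values are invented purely for illustration.

```python
def apply_delta(previous, delta):
    """Update a tracked point from its preceding location using a relative
    displacement (delta_x, delta_y, delta_z)."""
    return tuple(p + d for p, d in zip(previous, delta))

# Example: a point at (1.00, 2.00, 0.50) m that moved by (0.02, -0.01, 0.00) m.
print(apply_delta((1.00, 2.00, 0.50), (0.02, -0.01, 0.00)))  # ~(1.02, 1.99, 0.5)
```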
  • the positioning and motion tracking of the player may be done with reference to the position of the gaming object, such that the gaming object's position is determined with reference to the origin and/or its previous position and the position of the player is determined with reference to the gaming object's position.
  • the reverse could be used as well.
  • both position and motion of the gaming object and the player may be referenced to a personal item of the player, such as a cell phone.
  • FIG. 26, FIG. 27, and FIG. 28 are diagrams of another embodiment of a coordinate system of a localized physical area that may be used for a gaming system.
  • an origin is selected to be somewhere in the localized physical area and the initial position of a point being tracked on the player and/or gaming object is determined based on its vector, or spherical, coordinates (e.g., ρ1, θ1, φ1).
  • as the player and/or gaming object moves, the new positions of the tracking and/or positioning points are determined as a vector, or in spherical coordinates, with respect to the preceding location (e.g., ΔV, or Δρ, Δθ, Δφ).
  • the positioning and motion tracking of the player may be done with reference to the position of the gaming object, such that the gaming object's position is determined with reference to the origin and/or its previous position and the position of the player is determined with reference to the gaming object's position.
  • the reverse could be used as well.
  • both position and motion of the gaming object and the player may be referenced to a personal item of the player, such as a cell phone.
  • FIG. 29 is a diagram of another method for determining position and/or motion tracking that begins by determining environment parameters of the physical area in which the gaming object lies and/or in which the game system lies.
  • the environmental parameters include, but are not limited to, height, width, and depth of the localized physical area, objects in the physical area, differing materials in the physical area, multiple path effects, interferers, etc.
  • the method then proceeds by mapping the environment parameters to a coordinate system (e.g., one of the systems shown in FIGS. 15-17).
  • as an example, if the physical area is a room, a point in the room is selected as the origin and the coordinate system is applied to at least some of the room, including objects in the room (e.g., a couch, a chair, etc.).
  • the method then proceeds by determining the coordinates of the player's, or players', position in the physical area.
  • the method then continues by determining the coordinates of a gaming object's initial position.
  • the positioning of the gaming object may be used to determine the position of the player(s) if the gaming object is something worn by the player or is in close proximity to the player.
  • the initial position of the player may be used to determine the initial position of the gaming object.
  • one or more of the plurality of positioning techniques described herein may be used to determine the position of the player and/or of the gaming object.
  • the method then proceeds by updating the coordinates of the player's, or players', position in the physical area to track the player's motion.
  • the method also continues by updating the coordinates of a gaming object's position to track its motion.
  • the motion of the gaming object may be used to determine the motion of the player(s) if the gaming object is something worn by the player or is in close proximity to the player.
  • the motion of the player may be used to determine the motion of the gaming object.
  • one or more of the plurality of motion tracking techniques described herein may be used to track the motion of the player and/or of the gaming object.
  • FIG. 30 is a diagram of another method for determining position and/or motion tracking that begins by determining a reference point within the physical area in which the gaming object lies and/or in which the game system lies. The method then proceeds by determining a vector for a player's initial position with respect to a reference point of a coordinate system (e.g., one of the systems shown in FIGS. 18-20). As an example, if the physical area is a room, a point in the room is selected as the origin and the coordinate system is applied to at least some of the room.
  • the method then continues by determining a vector of a gaming object's initial position.
  • the positioning of the gaming object may be used to determine the position of the player(s) if the gaming object is something worn by the player or is in close proximity to the player.
  • the initial position of the player may be used to determine the initial position of the gaming object.
  • one or more of the plurality of positioning techniques described herein may be used to determine the position of the player and/or of the gaming object.
  • the method then proceeds by updating the vector of the player's, or players', position in the physical area to track the player's motion.
  • the method also continues by updating the vector of the gaming object's position to track its motion.
  • the motion of the gaming object may be used to determine the motion of the player(s) if the gaming object is something worn by the player or is in close proximity to the player.
  • the motion of the player may be used to determine the motion of the gaming object.
  • one or more of the plurality of motion tracking techniques described herein may be used to track the motion of the player and/or of the gaming object.
  • FIG. 31 is a diagram of another method for determining position and/or motion tracking that begins by determining environment parameters of the physical area in which the gaming object lies and/or in which the game system lies.
  • the environmental parameters include, but are not limited to, height, width, and depth of the localized physical area, objects in the physical area, differing materials in the physical area, multiple path effects, interferers, etc.
  • the method then proceeds by mapping the environment parameters to a coordinate system (e.g., one of the systems shown in FIGS. 23-25).
  • as an example, if the physical area is a room, a point in the room is selected as the origin and the coordinate system is applied to at least some of the room, including objects in the room (e.g., a couch, a chair, etc.).
  • the method then proceeds by determining the coordinates of the gaming object's initial position in the physical area.
  • the method then continues by determining the coordinates of the player's initial position with respect to the gaming object's initial position. Note that one or more of the plurality of positioning techniques described herein may be used to determine the position of the player and/or of the gaming object.
  • the method then proceeds by updating the coordinates of the gaming object's position in the physical area to track its motion.
  • the method also continues by updating the coordinates of the player's position to track the player's motion with respect to the gaming object. Note that one or more of the plurality of motion tracking techniques described herein may be used to track the motion of the player and/or of the gaming object.
  • FIG. 32 is a diagram of another method for determining position and/or motion tracking that begins by determining a reference point within the physical area in which the gaming object lies and/or in which the game system lies. The method then proceeds by determining a vector for a gaming object's initial position with respect to a reference point of a coordinate system (e.g., one of the systems shown in FIGS. 26-28). As an example, if the physical area is a room, a point in the room is selected as the origin and the coordinate system is applied to at least some of the room.
  • the method then continues by determining a vector of the player's initial position with respect to the gaming object's initial position. Note that one or more of the plurality of positioning techniques described herein may be used to determine the position of the player and/or of the gaming object.
  • the method then proceeds by updating the vector of the gaming object's position in the physical area to track its motion.
  • the method also continues by updating the vector of the player's position with respect to the gaming object's motion to track the player's motion. Note that one or more of the plurality of motion tracking techniques described herein may be used to track the motion of the player and/or of the gaming object.
  • FIG. 33 is a diagram of another embodiment of a coordinate system of a gaming system that is an extension of the coordinate systems discussed above.
  • the coordinate system includes a positioning coordinate grid and a motion tracking grid, where the motion tracking grid is of a finer resolution than the positioning coordinate grid.
  • the player's or gaming object's position within the physical area can have a first tolerance (e.g., within a meter), while the motion tracking of the player and/or the gaming object has a second, finer tolerance (e.g., within a few millimeters).
  • the position of the player and/or gaming object can be updated infrequently in comparison to the updating of the motion (e.g., the position can be updated once every second or so while the motion may be updated once every 10 milliseconds).
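  • As a hedged illustration of this two-resolution approach of FIG. 33, the Python sketch below refreshes a coarse position roughly once per second while sampling fine motion every 10 milliseconds; the callables read_position and read_motion are assumed placeholders for whichever positioning and motion tracking techniques are employed.

```python
import time

POSITION_PERIOD_S = 1.0   # coarse positioning grid (e.g., tolerance within a meter)
MOTION_PERIOD_S = 0.01    # fine motion tracking grid (e.g., millimeter tolerance)

def dual_rate_tracking(read_position, read_motion, duration_s=2.0):
    """Update the coarse position infrequently and the fine motion frequently."""
    end = time.monotonic() + duration_s
    next_position_update = time.monotonic()
    samples = []
    while time.monotonic() < end:
        if time.monotonic() >= next_position_update:
            samples.append(("position", read_position()))  # ~1 Hz update
            next_position_update += POSITION_PERIOD_S
        samples.append(("motion", read_motion()))          # ~100 Hz update
        time.sleep(MOTION_PERIOD_S)
    return samples
```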
  • FIG. 34 is a diagram of a method for determining motion of a gaming object and/or a player that begins by determining an initial position of the player and/or gaming object using one or more of the positioning techniques described herein. The method continues by determining motion reference points for the player and/or for the gaming object as shown in FIG. 35 .
  • the reference points may be sensors on the player and/or on the gaming object, may be particular body parts (e.g., nose, elbow, knee, etc.), particular points on the gaming object, and/or a combination thereof.
  • the number of reference points and the location thereof may be dependent on the video game, on the player's physical characteristics, on the player's skill level, on the desired motion tracking resolution, and/or on the motion tracking technique being used.
  • the method continues by determining initial motion coordinates for each reference point using one or more of the position determining techniques and/or motion tracking techniques described herein.
  • the method continues by establishing one or more data rates for the reference points based on the location of the reference point, motion patterns (e.g., in a video bowling game, the player will have particular motions for bowling), previous motion (e.g., halfway through bowling a ball, it is known where the next motion is likely to be), and/or human bio-mechanics (e.g., arms and legs bend in a certain manner).
  • the reference point of a hand may have a faster data rate than a reference point on the head, since the hand will most likely be moving faster.
  • the method continues by obtaining motion tracking data (e.g., distances, vectors, distance changes, vector changes, etc.) for the reference points at intervals of the one or more data rates.
  • the method continues by determining motion of the reference points based on the motion tracking data at intervals of the one or more data rates.
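  • One possible scheduling of such per-reference-point data rates is sketched below in Python; the body parts and interval counts mirror the example of FIG. 39 but are otherwise assumptions made for illustration.

```python
# Hypothetical data rates: sample each reference point once every N intervals,
# reflecting that hands typically move faster than the torso or head.
SAMPLE_EVERY_N_INTERVALS = {"hand": 3, "leg": 4, "torso": 5, "head": 6}

def points_due(interval_index):
    """Return the reference points whose data rate makes them due for a
    sample at this interval."""
    return [point for point, n in SAMPLE_EVERY_N_INTERVALS.items()
            if interval_index % n == 0]

# Example: at interval 12, the hand (12 % 3 == 0) and leg (12 % 4 == 0) are
# sampled, while the torso and head positions are estimated instead.
print(points_due(12))  # ['hand', 'leg']
```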
  • FIG. 36, FIG. 37, FIG. 38, and FIG. 39 are diagrams of examples of motion patterns in accordance with human bio-mechanics.
  • a head can move up/down, it can tilt, it can rotate, and/or a combination thereof.
  • head motion can be anticipated based on current play of the game. For example, during an approach shot, the head will be relatively steady with respect to tilting and rotating, and may move up or down along with the body.
  • FIG. 37 shows the motion patterns of an arm (or leg) in accordance with human bio-mechanics.
  • the arm (or leg) may contract or extend, go up or down, move side to side, rotate, or a combination thereof.
  • an arm (or leg) motion can be anticipated based on the current play of the game.
  • the arm (or leg) may be broken down into smaller body parts (e.g., upper arm, elbow, forearm, wrist, hand, fingers). Further note that the gaming object's motion will be similar to that of the body part it is associated with.
  • FIG. 38 illustrates the likely motions of a torso, which can move up/down, side to side, front to back, and/or a combination thereof.
  • torso motion can be anticipated based on current play of the game. As such, based on the human bio-mechanical limitations and ranges of motion along with the video game being played, the motion of the player and/or the associated gaming object may be anticipated, which facilitates better motion tracking.
  • FIG. 39 is a diagram of an example of motion estimation for the head, right arm, left arm, torso, right leg, and left leg of a video game player.
  • the arms will move the most often and over the most distance, followed by the legs, torso, and head.
  • the interval rate may be 10 milliseconds, which provides a 1 mm resolution for an object moving at 200 miles per hour.
  • the body parts are not anticipated to move at or near 200 mph.
  • each body part may include one or more reference points. Since the arms are anticipated to move the most and/or over the greatest distances, the reference point(s) associated with the arms are sampled once every third interval (e.g., intervals 1, 4, 7).
  • for the intervening intervals (e.g., intervals 2 and 3), the motion of the reference points is estimated based on the samples of intervals 1 and 4 (and possibly more samples at other intervals), the motion pattern of the arm, human bio-mechanics, and/or a combination thereof.
  • the estimation may be a linear estimation, a most likely estimation, and/or any other mathematical technique for estimating data points between two or more samples. A similar estimation is made for intervals 5 and 6.
  • the legs have a data rate of sampling once every four intervals (e.g., intervals 1, 5, 9, etc.).
  • the motion data for the intervening intervals is estimated in a similar manner as the motion data of the arms was estimated.
  • the torso has a data rate of sampling once every five intervals (e.g., intervals 1, 6, 11, etc.).
  • the head has a data rate of sampling once every six intervals (e.g., intervals 1, 7, 13, etc.). Note that the initial sampling does not need to be done during the same interval for all of the reference points.
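  • For the intervening (unsampled) intervals, a linear estimation such as the hedged Python sketch below may be used; the sample values are invented purely for illustration.

```python
def interpolate_linear(t0, p0, t1, p1, t):
    """Linearly estimate a reference point's position at interval t, given
    samples p0 at interval t0 and p1 at interval t1 (with t0 <= t <= t1)."""
    alpha = (t - t0) / (t1 - t0)
    return tuple(a + alpha * (b - a) for a, b in zip(p0, p1))

# Example: an arm reference point sampled at intervals 1 and 4; estimate its
# position at the unsampled intervals 2 and 3.
p1, p4 = (0.10, 0.50, 1.20), (0.16, 0.47, 1.26)
print(interpolate_linear(1, p1, 4, p4, 2))  # ~(0.12, 0.49, 1.22)
print(interpolate_linear(1, p1, 4, p4, 3))  # ~(0.14, 0.48, 1.24)
```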
  • FIG. 40 and FIG. 41 are diagrams of examples of reference points on a player to determine player's physical measurements.
  • the positions of these reference points may be used to determine the physical attributes of the player (e.g., height, width, arm length, leg length, shoe size, etc.).
  • FIG. 42 is a diagram of an example of mapping a player to an image of the video game.
  • the image displayed in the video game corresponds to the player such that, as the player moves, the image moves the same way.
  • the image may be a stored image of the actual player, a celebrity player (e.g., a professional athlete), a default image, and/or a user-created image.
  • the mapping involves estimating motion of the non-reference points of the player based on the reference points of the player.
  • the mapping involves equating the reference points on the player to the same points on the image. The same may be done for the gaming object.
  • FIG. 43 is a diagram of another method for determining motion that begins by obtaining coordinates for the reference points of the player and/or gaming object. The method continues by determining the player's dimensions and/or determining the dimensions of the gaming object. The method continues by mapping the reference points of the player to corresponding points of a video image based on the player's dimensions. This step may also include mapping the reference points of the gaming object (e.g., a sword) to the corresponding image of the gaming object based on the gaming object's dimensions.
  • the method continues by determining coordinates of other non-referenced body parts and/or parts of the gaming object based on the coordinates of the reference points. This may be done by a linear interpolation, by a most likely motion algorithm, by a look-up table, and/or any other method for estimating data points from surrounding data points.
  • the method continues by tracking motion of the reference points and predicting motion of the non-referenced body parts and/or parts of the gaming object based on the motion of the reference points. This may also be done by a linear interpolation, by a most likely motion algorithm, by a look-up table, and/or any other method for estimating data points from surrounding data points.
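  • One simple instance of such estimation, assuming a non-referenced point lies on the segment between two reference points, is sketched below in Python; the body-part names and coordinates are illustrative assumptions.

```python
def estimate_between(point_a, point_b, fraction):
    """Estimate a non-referenced body part lying between two reference points
    (e.g., a forearm point between wrist and elbow) by linear interpolation;
    fraction gives its relative position along the segment from point_a."""
    return tuple(a + fraction * (b - a) for a, b in zip(point_a, point_b))

# Example: a point halfway between wrist and elbow reference points.
wrist, elbow = (0.9, 1.1, 0.4), (0.7, 1.3, 0.5)
print(estimate_between(wrist, elbow, 0.5))  # ~(0.8, 1.2, 0.45)
```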
  • FIG. 44 is a schematic block diagram of an embodiment of a gaming object and/or game console that includes a physical layer (PHY) integrated circuit (IC) and a medium access control (MAC) layer processing module.
  • the PHY IC includes a position and/or motion tracking RF section, a controller interface RF section, and a baseband processing module.
  • the MAC processing module and the baseband processing module may be a single processing device or a plurality of processing devices.
  • Such a processing device may be a microprocessor, micro-controller, digital signal processor, microcomputer, central processing unit, field programmable gate array, programmable logic device, state machine, logic circuitry, analog circuitry, digital circuitry, and/or any device that manipulates signals (analog and/or digital) based on hard coding of the circuitry and/or operational instructions.
  • the processing module may have an associated memory and/or memory element, which may be a single memory device, a plurality of memory devices, and/or embedded circuitry of the processing module.
  • Such a memory device may be a read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, cache memory, and/or any device that stores digital information.
  • the processing module implements one or more of its functions via a state machine, analog circuitry, digital circuitry, and/or logic circuitry
  • the memory and/or memory element storing the corresponding operational instructions may be embedded within, or external to, the circuitry comprising the state machine, analog circuitry, digital circuitry, and/or logic circuitry.
  • the memory element stores, and the processing module executes, hard coded and/or operational instructions corresponding to at least some of the steps and/or functions illustrated in the various Figures depicted and described herein.
  • the MAC processing module triggers position and/or tracking data collection, formatting of the data, processing of the data, and/or controlling position and/or tracking data communications and/or controller communications.
  • the position and/or tracking RF section may include circuitry to transmit one or more beamformed RF signals, RF signals for 3D antenna reception, RFID communications, and/or any other RF transmission and/or reception discussed herein.
  • the game console may use a standardized protocol, a proprietary protocol, and/or a combination thereof to provide the communication between the gaming object and the console.
  • the communication protocol may borrow unused bandwidth from a standardized protocol to facilitate the gaming communication (e.g., utilize unused BW of a WLAN, cell phone, etc.).
  • FIG. 45, FIG. 46, and FIG. 47 are diagrams of various embodiments of methods for determining position and/or motion tracking.
  • the method operates by capturing digital images using multiple digital cameras. The method then performs processing of the digital images to identify characteristics of an object that is depicted within at least some of the digital images. The method then operates by determining position of the object based on the identified characteristics. This determined position is with respect to the locations of at least some of the multiple digital cameras.
  • the method can continue by mapping this determined position to a virtual 3D (three-dimensional) coordinate system.
  • the method operates by capturing digital images using multiple digital cameras. The method then performs processing of the digital images to identify characteristics of an object that is depicted within at least some of the digital images. Once these characteristics of the object are identified, the method operates by generating directional vectors based on the identified characteristics. These directional vectors may be viewed as extending from locations of at least some of the multiple digital cameras to a position of the object.
  • the method then operates by determining position of the object based on the directional vectors. Again, this determined position is with respect to the locations of at least some of the multiple digital cameras as indicated by an intersection of at least some of the directional vectors.
  • the method can continue by mapping this determined position to a 3D (three-dimensional) coordinate system.
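  • A hedged numerical sketch of this intersection step follows; since real directional vectors rarely intersect exactly when measurement noise is present, the Python example returns the midpoint of the closest approach between two rays (the camera locations and sighting directions are invented values).

```python
import numpy as np

def triangulate(origin1, dir1, origin2, dir2):
    """Estimate the 3D point indicated by two directional vectors extending
    from two known camera locations: find the closest point on each ray and
    return their midpoint."""
    d1 = dir1 / np.linalg.norm(dir1)
    d2 = dir2 / np.linalg.norm(dir2)
    w0 = origin1 - origin2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b                 # approaches 0 for parallel rays
    s = (b * e - c * d) / denom           # parameter along ray 1
    t = (a * e - b * d) / denom           # parameter along ray 2
    return (origin1 + s * d1 + origin2 + t * d2) / 2.0

# Example: two cameras 2 m apart along x, both sighting a point near (1, 1, 2).
cam1, cam2 = np.array([0.0, 0.0, 0.0]), np.array([2.0, 0.0, 0.0])
print(triangulate(cam1, np.array([1.0, 1.0, 2.0]),
                  cam2, np.array([-1.0, 1.0, 2.0])))  # ~[1. 1. 2.]
```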
  • the method operates by capturing digital images using multiple digital cameras. The method then performs processing of the digital images to identify at least one sensing tag that is depicted within at least some of the digital images.
  • the sensing tag can be any of a variety of sensing tags, including a light reflective material, a light absorbent material, an infrared source (e.g., when at least one of the digital cameras is infrared sensitive), a color, and/or any other desired type of sensing tag.
  • the sensing tag may be associated with an entirety of an object depicted within at least some of the digital images. As also described herein, the sensing tag may be associated with only a portion of an object depicted within at least some of the digital images (e.g., a corner of an object, a body part of a player, etc.).
  • the method operates by generating directional vectors based on the identified sensing tag. These directional vectors may be viewed as extending from locations of at least some of the multiple digital cameras to a position of the sensing tag.
  • the method then operates by determining position of the sensing tag based on the directional vectors. Again, this determined position is with respect to the locations of at least some of the multiple digital cameras as indicated by an intersection of at least some of the directional vectors.
  • the method can continue by mapping this determined position to a virtual 3D (three-dimensional) coordinate system.
  • FIG. 48 is a diagram of an embodiment of a method for determining a distance based on captured digital images.
  • the method operates by capturing digital images using multiple digital cameras. The method then performs processing of the digital images, using pattern recognition, to identify an object depicted within at least some of the digital images.
  • a size of the identified object is predetermined (e.g., such as a predetermined size of a gaming object, a known object, etc.).
  • the method operates to determine an image size of the identified object (e.g., a size of the object as depicted within at least one of the digital images). Once an image size of an object depicted within a digital image is known, and also when an actual size of the object is known, then the method can associate the known/predetermined size with the image size. This way, a scaling factor can be determined between objects depicted within the digital image and the actual size of objects within the physical environment that includes the object.
  • the method then operates by determining a distance within the physical environment using the image size of the object and the predetermined size of the object (e.g., based on the scaling factor).
  • note that this determined distance is with respect to the locations of at least some of the multiple digital cameras.
  • the method can continue by mapping this determined distance within a virtual 3D (three-dimensional) coordinate system.
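  • Under a simple pinhole-camera assumption, the distance determination reduces to distance = focal length × actual size / image size, as in the hedged Python sketch below (all parameter names and values are invented for illustration).

```python
def distance_from_size(focal_length_mm, actual_size_mm, image_size_px, pixel_pitch_mm):
    """Pinhole-camera estimate: distance = f * actual_size / image_size, with
    the image size converted from pixels to millimeters on the sensor."""
    image_size_mm = image_size_px * pixel_pitch_mm
    return focal_length_mm * actual_size_mm / image_size_mm

# Example: a gaming object known to be 200 mm tall that appears 150 px tall on
# a sensor with 0.005 mm pixels behind a 6 mm lens.
print(distance_from_size(6.0, 200.0, 150, 0.005))  # 1600.0 mm, i.e., 1.6 m
```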
  • such modules may each be a single processing device or a plurality of processing devices.
  • a processing device may be a microprocessor, micro-controller, digital signal processor, microcomputer, central processing unit, field programmable gate array, programmable logic device, state machine, logic circuitry, analog circuitry, digital circuitry, and/or any device that manipulates signals (analog and/or digital) based on operational instructions.
  • the operational instructions may be stored in a memory.
  • the memory may be a single memory device or a plurality of memory devices.
  • Such a memory device may be a read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, and/or any device that stores digital information. It is also noted that when the processing module implements one or more of its functions via a state machine, analog circuitry, digital circuitry, and/or logic circuitry, the memory storing the corresponding operational instructions is embedded within the circuitry comprising the state machine, analog circuitry, digital circuitry, and/or logic circuitry. In such an embodiment, a memory stores, and a processing module coupled thereto executes, operational instructions corresponding to at least some of the steps and/or functions illustrated and/or described herein.

Abstract

Position detection and/or movement tracking via image capture and processing. Digital cameras perform image capture of one or more objects within a particular region (e.g., a physical gaming environment). A game module or processing module processes the images captured by the digital cameras to identify a position of and/or track movement of objects (e.g., a player, a gaming object, a game controller, etc.). Various digital image processing techniques may be employed including pattern recognition of objects, color recognition/distinction, intensity recognition/distinction, relative size comparison, etc. to identify objects and/or track their movement. The coupling between the digital cameras and the game module or processing module may be wired, wireless, or a combination thereof. If wireless, any number of different signaling means may be employed including Code Division Multiple Access (CDMA) signaling, Time Division Multiple Access (TDMA) signaling, or Frequency Division Multiple Access (FDMA) signaling.

Description

    CROSS REFERENCE TO RELATED PATENTS/PATENT APPLICATIONS Provisional Priority Claims
  • The present U.S. Utility Patent Application claims priority pursuant to 35 U.S.C. §119(e) to the following U.S. Provisional Patent Application which is hereby incorporated herein by reference in its entirety and made part of the present U.S. Utility Patent Application for all purposes:
  • 1. U.S. Provisional Application Ser. No. 60/936,724, entitled “Position and motion tracking of an object,” (Attorney Docket No. BP6471), filed Jun. 22, 2007, pending.
  • BACKGROUND OF THE INVENTION
  • 1. Technical Field of the Invention
  • The invention relates generally to position and tracking systems; and, more particularly, it relates to such systems that employ captured digital images to determine position of or track movement of an object.
  • 2. Description of Related Art
  • Communication systems are known to support wireless and wire lined communications between wireless and/or wire lined communication devices. Such communication systems range from national and/or international cellular telephone systems to the Internet to point-to-point in-home wireless networks to radio frequency identification (RFID) systems. Each type of communication system is constructed, and hence operates, in accordance with one or more communication standards. For instance, radio frequency (RF) wireless communication systems may operate in accordance with one or more standards including, but not limited to, RFID, IEEE 802.11, Bluetooth, advanced mobile phone services (AMPS), digital AMPS, global system for mobile communications (GSM), code division multiple access (CDMA), local multi-point distribution systems (LMDS), multi-channel-multi-point distribution systems (MMDS), and/or variations thereof. As another example, infrared (IR) communication systems may operate in accordance with one or more standards including, but not limited to, IrDA (Infrared Data Association).
  • Depending on the type of RF wireless communication system, a wireless communication device, such as a cellular telephone, two-way radio, personal digital assistant (PDA), personal computer (PC), laptop computer, home entertainment equipment, RFID reader, RFID tag, et cetera, communicates directly or indirectly with other wireless communication devices. For direct communications (also known as point-to-point communications), the participating wireless communication devices tune their receivers and transmitters to the same channel or channels (e.g., one of the plurality of radio frequency (RF) carriers of the wireless communication system) and communicate over that channel(s). For indirect wireless communications, each wireless communication device communicates directly with an associated base station (e.g., for cellular services) and/or an associated access point (e.g., for an in-home or in-building wireless network) via an assigned channel. To complete a communication connection between the wireless communication devices, the associated base stations and/or associated access points communicate with each other directly, via a system controller, via the public switched telephone network, via the Internet, and/or via some other wide area network.
  • For each RF wireless communication device to participate in wireless communications, it includes a built-in radio transceiver (i.e., receiver and transmitter) or is coupled to an associated radio transceiver (e.g., a station for in-home and/or in-building wireless communication networks, RF modem, etc.). As is known, the receiver is coupled to the antenna and includes a low noise amplifier, one or more intermediate frequency stages, a filtering stage, and a data recovery stage. The low noise amplifier receives inbound RF signals via the antenna and amplifies them. The one or more intermediate frequency stages mix the amplified RF signals with one or more local oscillations to convert the amplified RF signal into baseband signals or intermediate frequency (IF) signals. The filtering stage filters the baseband signals or the IF signals to attenuate unwanted out of band signals to produce filtered signals. The data recovery stage recovers raw data from the filtered signals in accordance with the particular wireless communication standard.
  • As is also known, the transmitter includes a data modulation stage, one or more intermediate frequency stages, and a power amplifier. The data modulation stage converts raw data into baseband signals in accordance with a particular wireless communication standard. The one or more intermediate frequency stages mix the baseband signals with one or more local oscillations to produce RF signals. The power amplifier amplifies the RF signals prior to transmission via an antenna.
  • In most applications, radio transceivers are implemented in one or more integrated circuits (ICs), which are inter-coupled via traces on a printed circuit board (PCB). The radio transceivers operate within licensed or unlicensed frequency spectrums. For example, wireless local area network (WLAN) transceivers communicate data within the unlicensed Industrial, Scientific, and Medical (ISM) frequency spectrum of 900 MHz, 2.4 GHz, and 5 GHz. While the ISM frequency spectrum is unlicensed there are restrictions on power, modulation techniques, and antenna gain.
  • In IR communication systems, an IR device includes a transmitter, a light emitting diode, a receiver, and a silicon photo diode. In operation, the transmitter modulates a signal, which drives the LED to emit infrared radiation which is focused by a lens into a narrow beam. The receiver, via the silicon photo diode, receives the narrow beam infrared radiation and converts it into an electric signal.
  • IR communications are used in video games to detect the direction in which a game controller is pointed. As an example, an IR sensor is placed near the game display, where the IR sensor detects the IR signal transmitted by the game controller. If the game controller is too far away, too close, or angled away from the IR sensor, the IR communication will fail.
  • Further advances in video gaming include three accelerometers in the game controller to detect motion by way of acceleration. The motion data is transmitted to the game console via a Bluetooth wireless link. The Bluetooth wireless link may also transmit the IR direction data to the game console and/or convey other data between the game controller and the game console.
  • While the above technologies allow video gaming to include motion sensing, they do so with limitations. As mentioned, the IR communication has a limited area in which a player can be for the IR communication to work properly. Further, the accelerometers only measure acceleration, such that true one-to-one detection of motion is not achieved. Thus, the gaming motion is limited to a handful of directions (e.g., horizontal, vertical, and a few diagonal directions).
  • Therefore, a need exists for motion tracking and positioning determination for video gaming and other applications that overcome the above limitations.
  • BRIEF SUMMARY OF THE INVENTION
  • The present invention is directed to apparatus and methods of operation that are further described in the following Brief Description of the Several Views of the Drawings, the Detailed Description of the Invention, and the claims. Other features and advantages of the present invention will become apparent from the following detailed description of the invention made with reference to the accompanying drawings.
  • BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
  • FIG. 1 is a diagram of an embodiment of an apparatus that performs position determination and/or movement tracking via image capture and processing.
  • FIG. 2 is a diagram of an alternative embodiment of an apparatus that performs position determination and/or movement tracking via image capture and processing.
  • FIG. 3 is a diagram of an embodiment showing a means by which position of a point, object, etc. may be determined using multiple directional vectors extending from multiple known locations, respectively, to that point, object, etc.
  • FIG. 4 is a diagram of an embodiment showing the relationship between an object point and various image planes that have performed image capture of the object point.
  • FIG. 5 is a diagram of an embodiment showing the relationship between multiple object points and various image planes that have performed image capture of the multiple object points.
  • FIG. 6 is a diagram of an embodiment showing an image sensor and the association of physical pixels and the image pixels generated there from.
  • FIG. 7A and FIG. 7B are diagrams of an embodiment of an apparatus that employs directional vectors associated with captured images, at least some of which depict an object, to determine position of the object.
  • FIG. 8A and FIG. 8B are diagrams of an embodiment of an apparatus that employs directional vectors, associated with images that depict a number of objects, to determine position of a device that has captured the images.
  • FIG. 9 is a schematic block diagram of an overhead view of an embodiment of a gaming system.
  • FIG. 10 is a schematic block diagram of a side view of an embodiment of a gaming system.
  • FIG. 11 is a diagram illustrating an embodiment of a gaming system including multiple digital cameras for capturing images to undergo processing in a game module, which is wire-coupled to the multiple digital cameras, for position detection and/or movement tracking.
  • FIG. 12 is a diagram illustrating an alternative embodiment of a gaming system including multiple digital cameras for capturing images to undergo processing in a game module, which is wirelessly coupled to at least some of the multiple digital cameras, for position detection and/or movement tracking.
  • FIG. 13 is a schematic block diagram of a side view of another embodiment of a gaming system.
  • FIG. 14 is a schematic block diagram of an overhead view of another embodiment of a gaming system.
  • FIG. 15, FIG. 16, and FIG. 17 are diagrams of an embodiment of a coordinate system of a gaming system.
  • FIG. 18, FIG. 19, and FIG. 20 are diagrams of another embodiment of a coordinate system of a gaming system.
  • FIG. 21 is a diagram of a method for determining position and/or motion tracking.
  • FIG. 22 is a diagram of another method for determining position and/or motion tracking.
  • FIG. 23, FIG. 24, and FIG. 25 are diagrams of another embodiment of a coordinate system of a gaming system.
  • FIG. 26, FIG. 27, and FIG. 28 are diagrams of another embodiment of a coordinate system of a gaming system.
  • FIG. 29 is a diagram of another method for determining position and/or motion tracking.
  • FIG. 30 is a diagram of another method for determining position and/or motion tracking.
  • FIG. 31 is a diagram of another method for determining position and/or motion tracking.
  • FIG. 32 is a diagram of another method for determining position and/or motion tracking.
  • FIG. 33 is a diagram of another embodiment of a coordinate system of a gaming system.
  • FIG. 34 is a diagram of a method for determining motion.
  • FIG. 35 is a diagram of an example of reference points on a player and/or gaming object.
  • FIG. 36, FIG. 37, and FIG. 38 are diagrams of examples of motion patterns.
  • FIG. 39 is a diagram of an example of motion estimation.
  • FIG. 40 and FIG. 41 are diagrams of examples of reference points on a player to determine player's physical measurements.
  • FIG. 42 is a diagram of an example of mapping a player to an image.
  • FIG. 43 is a diagram of another method for determining motion.
  • FIG. 44 is a schematic block diagram of an embodiment of a gaming object and/or game console.
  • FIG. 45, FIG. 46, and FIG. 47 are diagrams of various embodiments of methods for determining position and/or motion tracking.
  • FIG. 48 is a diagram of an embodiment of a method for determining a distance based on captured digital images.
  • DETAILED DESCRIPTION OF THE INVENTION
  • FIG. 1 is a diagram of an embodiment of an apparatus that performs position determination and/or movement tracking via image capture and processing. The apparatus includes a number of digital cameras that generate digital images. An object is depicted within at least some of the digital images. A processing module is coupled to receive the digital images. The processing module processes the digital images to identify characteristics of the object as depicted within at least some of the digital images. Based on the identified characteristics, the processing module determines position of the object with respect to locations of at least some of the digital cameras.
  • In one embodiment, the processing module identifies directional vectors based on the identified characteristics of the object. These directional vectors may be viewed as extending from known locations (e.g., locations of the digital cameras, points of reference within the digital cameras, etc.) to the object. In the context of using a digital camera, a digital camera includes an electronic image sensor. A digital image sensor, when mounted on a surface of an integrated circuit and implemented for performing image capture directly, may also be viewed as an alternative embodiment of a digital camera.
  • The specifications of such digital image sensors are oftentimes defined in terms of the number of physical pixels within the digital image sensor, which corresponds to the number of image pixels that a picture captured by the image sensor will have. For example, as the processes by which digital cameras are manufactured continue to improve, the number of mega-pixels that a digital image sensor includes continues to increase. Generally, the digital image sensors within digital cameras have more than a million physical pixels (e.g., one or more mega-pixels).
  • A reference point within a digital camera may serve as a point from which a directional vector is defined. As one example, when an image is captured by a digital camera, a camera center of projection of the digital camera is a point to which all points in the image can be traced back to. The focal distance of the digital camera may also correspond to the camera center of projection of the digital camera. A directional vector may be defined as extending from such a reference point within the digital camera to a physical pixel that has captured a particular portion of an object of interest. In other words, an image pixel of interest within a digital image corresponds to a physical pixel of the digital image sensor of the digital camera. A directional vector may be defined as extending from that reference point within the digital camera to that physical pixel.
  • Any of a variety of means may be performed to identify the characteristics of the object depicted within at least some of the digital images, including any of a variety of pattern recognition processes. Moreover, an object may include one or more sensing tags thereon to assist in the identification of the characteristics of the object depicted within at least some of the digital images.
  • Some examples of sensing tags include a particular type of material (e.g., metal, etc.), an RFID tag, a material having particular properties (e.g., a light reflective material, a light absorbent material, etc.), a specific RGB [red, green, blue] color or combination of colors, a particular pattern, etc. By discerning and distinguishing different sensing tags that may be placed on different parts of the object, the relative position of those parts of the object may be determined. This may be performed in addition to the overall position of the object that may be determined by identifying the entire object.
  • In addition, the object whose characteristics are identified may have a predetermined size. In some of the embodiments depicted herein, a player/user may employ a gaming object when playing a game, and the size of such a gaming object may be known beforehand. When an object having a predetermined size is identified in a digital image, then the actual/physical size of the object may be associated with the identified ‘image size’ as depicted within the digital image. The relationship between these two (e.g., image size and predetermined size) may be employed to determine a scaling factor for that digital image. With this information, a distance between two objects depicted within the digital image may be determined.
  • Moreover, it is noted that once the position of the object is known, then that position may be mapped to a virtual 3D (three-dimensional) coordinate system. This may be employed within a variety of systems including a gaming system such as is described herein.
  • Each of the digital cameras has a corresponding field of view in which it can perform image capture. Again, the object is depicted within at least some of the fields of view of at least some of the digital cameras. When the object is not within any field of view of any digital camera, then at least some of the digital cameras can be adjusted (e.g., such as using an actuator coupled to or integrated with a digital camera) so that the object may be visible within at least one of the fields of view of at least one of the cameras.
  • It is also noted that the configuration of any of the digital cameras may be adjusted. For example, a digital camera may have auto-focus capability in which the focal distance of the digital camera is adjusted to provide a maximum clarity image of the object of interest. Moreover, the image capture rate of any digital camera may be adjusted based on a number of factors including a predetermined setting within the processing module, a user-selected setting within the processing module, a movement history of the object, a current movement of the object, and an expected future movement of the object.
  • It is noted that, while position determination is described herein with respect to an object, the movement of the object may also be determined by merely updating the position of the object as a function of time. For example, the processing module may determine a first position of the object during a first time, and the processing module may then determine a second position of the object during a second time. The movement of the object may be estimated by comparing the first determined position and the second determined position. The rate of the movement of the object may be determined by also considering the times associated with each of the first determined position and the second determined position.
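  • That is, the movement rate follows directly from two determined positions and their time stamps, as in this minimal Python sketch (the positions and times are invented for illustration):

```python
def estimate_velocity(pos1, t1, pos2, t2):
    """Estimate the rate of movement (per axis) from a first determined
    position at time t1 and a second determined position at time t2."""
    return tuple((b - a) / (t2 - t1) for a, b in zip(pos1, pos2))

# Example: the object moved 0.05 m along x in 10 ms, i.e., 5 m/s along x.
print(estimate_velocity((0.00, 0.0, 0.0), 0.00, (0.05, 0.0, 0.0), 0.01))
```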
  • It is also noted that the digital cameras may be ‘smart’ digital cameras in some embodiments that include means by which the configuration of the digital camera may be determined and communicated back to the processing module. Certain information such as focal length of the digital camera, the image capture setting of the digital camera (e.g., for digital cameras that can capture images having different numbers of pixels), physical orientation, physical location, etc. may be determined by such a smart digital camera, communicated back to the processing module, and then the processing module can consider this higher level of information when employing the identified characteristics of the object to determine the position of the object.
  • Moreover, it is noted that while wire-coupling between the digital cameras and the processing module is illustrated in this embodiment, wireless communication may also be employed between the various components of such an apparatus without departing from the scope and spirit of the invention.
  • FIG. 2 is a diagram of an alternative embodiment of an apparatus that performs position determination and/or movement tracking via image capture and processing. This embodiment is somewhat analogous to the previous embodiment, with at least one difference being that the digital cameras are wirelessly coupled to the processing module. It is also noted that at least one digital camera may be integrated into the processing module.
  • The wireless means by which communication is supported may be varied, and it may be supported using any desired radio frequency (RF) communication standard including any that operates in accordance with one or more standards including, but not limited to, RFID, IEEE 802.11, Bluetooth, advanced mobile phone services (AMPS), digital AMPS, global system for mobile communications (GSM), code division multiple access (CDMA), local multi-point distribution systems (LMDS), multi-channel-multi-point distribution systems (MMDS), and/or variations thereof.
  • Moreover, when RF communication is employed within such an apparatus, at least one of the digital cameras includes a first radio frequency (RF) transceiver, and the processing module includes a second RF transceiver. Based on an RF signal transmitted between the first RF transceiver and the second RF transceiver, the processing module can then determine a distance between the processing module and the digital camera from which the RF signal was transmitted. By using a transmission time at which the RF signal is transmitted from a first device, a receive time at which the RF signal is received by a second device, and the known speed/velocity at which the RF signal travels, the distance between the first device and the second device may be determined.
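  • A minimal Python sketch of this time-of-flight relationship follows; it assumes, for illustration, synchronized transmit and receive clocks and uses the speed of light as the RF propagation speed.

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def rf_distance(transmit_time_s, receive_time_s):
    """One-way time-of-flight ranging: distance = c * (t_receive - t_transmit),
    assuming the two devices share a synchronized time base."""
    return SPEED_OF_LIGHT_M_S * (receive_time_s - transmit_time_s)

# Example: a 10 ns flight time corresponds to roughly 3 m of separation.
print(rf_distance(0.0, 10e-9))  # ~2.998 m
```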
  • FIG. 3 is a diagram of an embodiment showing a means by which position of a point, object, etc. may be determined using multiple directional vectors extending from multiple known locations, respectively, to that point, object, etc. This diagram depicts 3D space in a right handed, Cartesian coordinate system (e.g., shown as having axes xyz). Clearly, the principles described with respect to this diagram are applicable to any other 3D coordinate system as well.
  • When at least two locations are known, and when directional vectors extending from each of those two locations are known, then if those directional vectors intersect, the location of the intersection may be determined using triangulation. If additional locations are known, and if additional directional vectors extending from those additional known locations are also known, then a greater certainty of an intersection between the various directional vectors may be had.
  • It is noted that once the position associated with the intersection of these directional vectors is known, then this position (or location) may be mapped to a virtual 3D coordinate system. The upper right hand corner of the diagram depicts a virtual 3D space in a right handed, Cartesian coordinate system (e.g., shown as having axes x′y′z′).
  • FIG. 4 is a diagram of an embodiment showing the relationship between an object point and various image planes that have performed image capture of the object point. This diagram shows two separate image planes, corresponding to two separate digital cameras, that capture digital images of an object from different perspectives or fields of view. The image plane of a digital camera may be considered as corresponding to the digital image sensor component of the digital camera. For example, a digital image sensor may be a complementary metal-oxide-semiconductor (CMOS) device or a charge coupled device (CCD). As is known, various parameters generally are employed to define a digital image sensor, including an image sensor type (e.g., ¼″, 1/3.6″, etc.), a width and height (typically provided in millimeters), a total number of physical pixels (e.g., X megapixels, where X is a number such as 3, 6, 8.1, etc.), a number of physical pixels along each of the width and height of the digital image sensor (e.g., y×z, where y and z are integer numbers), a diagonal size (again, typically provided in millimeters) that corresponds to the normal lens focal length, the focal length factor, etc. The general trend in digital image sensor development over the years is to pack more and more physical pixels into a digital image sensor while also trying to reduce the overall size of the digital image sensor. In any case, each physical pixel of a digital image sensor captures information (e.g., color, intensity, etc.) of a portion of the field of view of the digital camera, and this information is employed to generate an image pixel of a digital image. Therefore, in the digital image context, there can be viewed as being a one-to-one relationship between each physical pixel of a digital image sensor and each image pixel of an image generated from information captured by the digital image sensor.
• In this diagram, a directional vector (DV1) extends from a reference point of a first digital camera (DC1), through the image plane of DC1 (which corresponds to the digital image sensor of DC1), to a point on the object of interest. This camera reference point may be a camera center of projection for DC1 based on its current configuration (e.g., focus, etc.). Alternatively, another camera reference point may be employed (e.g., focal point, predetermined point within the camera, etc.) without departing from the scope and spirit of the invention.
• Analogously, for a second digital camera (DC2), another directional vector extends from a reference point of DC2 through the image plane of DC2 to the same point on the object of interest. If the locations of DC1 and DC2 are known, and if the directional vectors extending from the respective reference points of DC1 and DC2 are known, then the principles of triangulation may be employed to determine the location of the object point on the object of interest.
• As can also be seen in this diagram, there is a relationship between the physical dimensions of the object and the corresponding images of that object as depicted in the digital images captured by DC1 and DC2. For one example, when considering the actual height of the object, an image 1 height is the height of the object as depicted in a digital image captured by DC1, and an image 2 height is the height of the object as depicted in a digital image captured by DC2. These two image heights need not be the same (e.g., the object may be closer to one of the digital cameras than the other, the focus of one of the digital cameras may be different than the other, etc.). It is noted that if the actual height of the object is known, then a first ratio of the actual height to the image 1 height may be formed, and a second ratio of the actual height to the image 2 height may be formed. By knowing the actual size of something depicted within a digital image, and by knowing the configuration of the digital camera (e.g., focus, etc.), a distance between the digital camera and the object may be determined, as sketched below.
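• Under a simple pinhole-camera model, this distance determination reduces to a ratio computation. The sketch below is illustrative only; the focal length and pixel pitch values are assumed for the example and are not taken from this disclosure.

```python
def distance_from_known_height_m(actual_height_m, image_height_px,
                                 focal_length_mm, pixel_pitch_mm):
    """Pinhole relation: distance = f * H / h, where h is the image
    height expressed in sensor units (pixels x pixel pitch)."""
    image_height_mm = image_height_px * pixel_pitch_mm
    return actual_height_m * focal_length_mm / image_height_mm

# A 1.8 m tall player imaged 200 px tall on a sensor with 0.005 mm
# pixels, through a 6 mm lens, is roughly 10.8 m from the camera.
print(distance_from_known_height_m(1.8, 200, 6.0, 0.005))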
  • FIG. 5 is a diagram of an embodiment showing the relationship between multiple object points and various image planes that have performed image capture of the multiple object points. This diagram has some similarities to the previous embodiment, in that a directional vector extends from a reference point of a digital camera through the image plane of the digital camera to a point on the object of interest.
  • However, the object of this embodiment includes a number of sensing tags thereon. These sensing tags can be portions of the object having a particular color, a light reflective material, a light absorbent material, an infrared light source, etc. Generally, the sensing tags have some associated characteristic that is identifiable on the object.
• The object in this diagram also has different types of sensing tags (e.g., of type 1, type 2, etc.). This use of different types of sensing tags on an object may be employed to assist in determining the position and orientation of the object (sometimes referred to as 'pose' in the image processing context), since different sides, areas, etc. of the object may be better distinguished from one another. For example, when considering an object such as a cube, it may be determined whether the cube is right side up (or upside down) with reference to a desired convention of which side of the cube is deemed to be 'up'.
  • In this embodiment, first directional vectors associated with type 1 sensing tags extend from a reference point of a digital camera through the image plane of the digital camera to two separate points on the object that have type 1 sensing tags. Second directional vectors associated with type 2 sensing tags extend from the reference point of the digital camera through the image plane of the digital camera to two separate points on the object that have type 2 sensing tags.
• FIG. 6 is a diagram of an embodiment showing an image sensor and the association of physical pixels and the image pixels generated therefrom. Within a digital camera, a digital image sensor is the element that captures information (e.g., color, intensity, contrast, etc.) of a field of view of the digital camera. Each individual physical pixel of the digital image sensor captures a small portion of the field of view of the digital camera. For example, if the digital image sensor includes one million physical pixels, then each individual physical pixel of the digital image sensor captures information of one-millionth of the field of view of the digital camera. More generally, if the digital image sensor includes X megapixels, then each individual physical pixel of the digital image sensor captures information of (1/(X×10^6))th of the field of view of the digital camera.
• Together, these discrete pieces of information, as captured by the physical pixels, are used to form a digital image corresponding to what is seen in the field of view of the digital camera.
• A directional vector extends from a reference point of a digital camera to one of the physical pixels of the digital image sensor. For example, when a particular image pixel of a digital image is identified, the corresponding physical pixel that captured the information used to generate that image pixel can be determined, and the corresponding directional vector can then be determined. This directional vector may serve as the directional vector generated from this digital camera to a particular point on the object of interest, as sketched below.
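• A minimal sketch of deriving such a directional vector from an identified pixel follows. It assumes a pinhole camera looking along +z with its principal point at the sensor center, and focal lengths expressed in pixel units; all names and conventions here are illustrative, not specified by this disclosure.

```python
import numpy as np

def pixel_to_direction(px, py, width_px, height_px, fx_px, fy_px):
    """Unit directional vector from the camera reference point
    through physical pixel (px, py), under a pinhole model."""
    cx, cy = width_px / 2.0, height_px / 2.0   # assumed principal point
    d = np.array([(px - cx) / fx_px, (py - cy) / fy_px, 1.0])
    return d / np.linalg.norm(d)

# A pixel at the sensor center maps to the optical axis.
print(pixel_to_direction(320, 240, 640, 480, 800.0, 800.0))  # -> [0. 0. 1.]
```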
  • FIG. 7A and FIG. 7B are diagrams of an embodiment of an apparatus, shown from two separate perspectives, that employs directional vectors associated with captured images, at least some of which depict an object, to determine position of the object.
• Referring to the perspective of FIG. 7A, which is viewed in the xy plane of a 3D space having an xyz coordinate system, the principles of triangulation may be employed when determining the position of an object that is depicted in digital images captured by multiple digital cameras. For example, a projection of a first directional vector (DV1 proj.) extends from a first digital camera (DC1) to the object. A projection of a second directional vector (DV2 proj.) extends from a second digital camera (DC2) to the object. Additional directional vectors, associated with additional digital cameras, may also be employed. The directional vectors then undergo processing in a processing module to determine the intersection of the various directional vectors; the intersection of these directional vectors is the location of the object (see the sketch following this paragraph).
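• When more than two directional vectors are available, their intersection may be estimated in a least-squares sense, generalizing the two-ray sketch given earlier. The formulation below (one possible choice, not mandated by this disclosure) finds the point minimizing the summed squared perpendicular distance to all rays:

```python
import numpy as np

def intersect_rays(camera_points, directions):
    """Least-squares 'intersection' of several rays: the point that
    minimizes the summed squared perpendicular distance to all rays."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(camera_points, directions):
        d = np.asarray(d, float)
        d /= np.linalg.norm(d)
        M = np.eye(3) - np.outer(d, d)  # projector orthogonal to ray direction
        A += M
        b += M @ np.asarray(p, float)
    return np.linalg.solve(A, b)        # singular if all rays are parallel

# Two cameras at known locations, both sighting the point (1, 1, 1).
print(intersect_rays([(0, 0, 0), (2, 0, 0)],
                     [(1, 1, 1), (-1, 1, 1)]))  # -> approx [1. 1. 1.]
```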
• Referring to the perspective of FIG. 7B, this diagram is viewed in the xz plane of a 3D space having an xyz coordinate system.
• FIG. 8A and FIG. 8B are diagrams of an embodiment of an apparatus, shown from two separate perspectives, respectively, that employs directional vectors associated with images that depict a number of objects to determine the position of a device that has captured the images.
  • Referring to the embodiment of FIG. 8A, which is viewed in the xy plane of a 3D space having an xyz coordinate system, the principles of using triangulation may be employed when determining position of a device that includes multiple digital cameras (e.g., a first digital camera (DC1), a second digital camera (DC2), etc.) that capture digital images that depict various known objects (e.g., a first object (object 1), a second object (object 2), etc.).
• The principles of triangulation are employed in this embodiment, but in reverse relative to the previous embodiment. The orientation of each digital camera of the device, when capturing a digital image of a known object, is determined.
• For example, a projection of a first directional vector (DV1 proj.) extends from a first object (object 1) to the first digital camera (DC1). A projection of a second directional vector (DV2 proj.) extends from a second object (object 2) to a second digital camera (DC2). Additional directional vectors, associated with additional objects, may also be employed. The directional vector orientations undergo processing in a processing module to determine their intersection. The intersection of these directional vectors is the location of the device that includes the multiple digital cameras.
  • Referring to the embodiment of FIG. 8B, this diagram is viewed in the xz plane of a 3D space having an xyz coordinate system.
• FIG. 9 is a schematic block diagram of an overhead view of an embodiment of a gaming system that includes a game console and a gaming object. The gaming system has an associated physical area in which the game console and the gaming object are located. The physical area may be a room, a portion of a room, and/or any other space where the gaming object and game console are proximally co-located (e.g., an airport terminal, on a bus, on an airplane, etc.).
  • The gaming object may be a wireless game controller and/or any object used or worn by the player to facilitate play of a video game. For example, the gaming object may be a simulated sword, a simulated gun, a helmet, a vest, a hat, shoes, socks, pants, shorts, gloves, etc.
• In this system, the game console determines the positioning of the gaming object within the physical area using one or more position determination techniques as subsequently discussed. Once the gaming object's position is determined, the game console tracks the motion of the gaming object using one or more motion tracking techniques as subsequently discussed to facilitate video game play. In this embodiment, the game console may determine the positioning of the gaming object within a positioning tolerance (e.g., within a meter) at a positioning update rate (e.g., once every second or once every few seconds) and track the motion within a motion tracking tolerance (e.g., within a few millimeters) at a motion tracking update rate (e.g., once every 10-100 milliseconds).
  • FIG. 10 is a schematic block diagram of a side view of an embodiment of a gaming system of FIG. 9 to illustrate that the positioning and motion tracking are done in three-dimensional space. As such, the gaming system provides accurate motion tracking of the gaming object, which may be used to map the player's movements to a graphics image for true interactive video game play.
  • FIG. 11 is a diagram illustrating an embodiment of a gaming system including multiple digital cameras for capturing images to undergo processing in a game module, that is wire-coupled to the multiple digital cameras, for position detection and/or movement tracking. A physical gaming environment (at least a portion of which may be represented within a virtual gaming environment) includes a number of digital cameras arranged at various locations therein to effectuate the image capture of a player and/or gaming object associated with the player. There may be some instances where the player has no gaming object (e.g., when simulating boxing), and the bodily position and/or movement of the player are those elements being monitored and/or tracked.
• Each digital camera has a corresponding field of view in which it can perform image capture. By appropriately placing the digital cameras throughout various locations in an area, the entirety of the physical gaming environment can be visually captured by the digital images generated by the digital cameras. Where the fields of view of more than one digital camera cross, multiple views of a single object within the physical gaming environment can be obtained. The game module (or another processing module) may then process the digital images captured by the digital cameras to make estimates of a position of an object within the physical gaming environment. Also, by comparing various digital images taken at different times (e.g., digital image 1 taken at time 1, digital image 2 taken at time 2 = time 1 + Δt), movement of the object within the physical gaming environment may be estimated, as sketched below.
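• A minimal sketch of such a time-differenced movement estimate follows; the update interval Δt is simply whatever the image capture rate provides, and the values shown are illustrative.

```python
def estimate_velocity_m_per_s(pos_t1, pos_t2, delta_t_s):
    """Finite-difference velocity of an object from its estimated
    positions in digital images captured delta_t_s seconds apart."""
    return tuple((b - a) / delta_t_s for a, b in zip(pos_t1, pos_t2))

# Object moved 0.05 m along x between frames 0.1 s apart -> 0.5 m/s.
print(estimate_velocity_m_per_s((1.0, 2.0, 0.5), (1.05, 2.0, 0.5), 0.1))
```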
  • The game console is operable to perform processing of digital images captured by the digital cameras to identify characteristics of an object depicted within at least some of the digital images. Based on the identified object characteristics, the game console is operable to determine position of the object with respect to the digital cameras.
  • Moreover, it is noted that, in this embodiment as well as other embodiments, certain initialization processes can be performed in which the player and/or gaming object remains motionless. The digital cameras then may perform image capture of the motionless player and/or gaming object for calibration purposes. In addition, if a size (e.g., height, width, etc.) of the player and/or gaming object is known and provided to the game console (e.g., by being entered via a user interface by the player, or by being estimated by the game console), then the size of other objects within the physical gaming environment may be estimated based on their relatively proportional size to a known object.
• Also, various means of performing digital image processing may be employed, including pattern recognition, in which a predetermined pattern (e.g., corresponding to a particular shape) is compared to patterns detected within one of the digital images captured by one of the digital cameras. It is noted that a particular shape may have more than one pattern corresponding thereto (e.g., a pattern 1 of a person-related shape corresponding to a taller/slender person vs. a pattern 2 of a person-related shape corresponding to a shorter/bulky person, etc.). Also, it is noted that even when a pattern detected within a digital image is not an expected pattern and cannot be associated with a predetermined pattern being searched for within the digital image, the detected pattern can be added to a memory that stores a number of patterns/shapes that may be detected within digital images.
• Another means of performing digital image processing may include searching for a particular color (e.g., as associated with a player, gaming object, etc.) within a digital image captured by a digital camera. For example, a player may wear a particular colored clothing article, and when processing the digital image captured by a digital camera, the color associated with that known-colored clothing article is searched for.
• Other means of performing digital image processing may be employed, including searching for reflections off of reflective material that covers the player and/or gaming object. This digital image processing may involve searching for pixels or groups of pixels within a digital image with intensity above a certain threshold (which may be predetermined or adaptively set for each digital image). When the intensity is above that threshold, that pixel (or group of pixels) can be identified as being associated with the reflective material covering the player and/or gaming object (see the sketch following this paragraph). Additional variations of the physical gaming environment may be employed, such as providing special lighting to enhance the reflection of light off of reflective material covering at least a portion of the player and/or gaming object. Moreover, an appropriate backdrop could also be employed to provide a higher degree of contrast between the player and/or gaming object and the rest of the physical gaming environment.
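• A minimal sketch of such intensity thresholding follows; the mean-plus-three-standard-deviations adaptive rule is an illustrative choice, not one specified by this disclosure.

```python
import numpy as np

def reflective_pixel_mask(gray_image, threshold=None):
    """Boolean mask of pixels bright enough to be attributed to
    reflective material covering the player and/or gaming object."""
    if threshold is None:
        # Adaptively set per digital image: an illustrative rule.
        threshold = gray_image.mean() + 3.0 * gray_image.std()
    return gray_image > threshold

# Mostly dim frame with one bright (reflective) pixel.
frame = np.full((480, 640), 20.0)
frame[100, 200] = 250.0
print(np.argwhere(reflective_pixel_mask(frame)))  # -> [[100 200]]
```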
• Certain operational parameters of the digital cameras may also be adjusted by a user/player or in real time by control signals provided by the game console. For example, the image capture rate employed by the digital cameras may be adjusted based on any number of considerations, including a predetermined setting within the game console, a player-selected setting within the game console (e.g., as selected by the player via a user interface), a type of game being played, a movement history of the player and/or gaming object, a current or expected movement of the player and/or gaming object, etc. Also, any one of the digital cameras may include an integrated actuator to perform real-time re-positioning of the digital camera to effectuate better image capture of the player and/or gaming object within the physical gaming environment. Alternatively, a camera may be mounted on an actuator that can perform such re-positioning of the digital camera. Clearly, a player/user can perform re-positioning of any digital camera as well.
  • As can be seen in this embodiment, the digital cameras are all wire-coupled to the game console. Any desired wire-based communication protocol (e.g., Ethernet) may be employed to effectuate communication between the digital cameras and the game console to communicate digital images from the digital cameras to the game console and command signals (if necessary) from the game console to the digital cameras.
  • FIG. 12 is a diagram illustrating an alternative embodiment of a gaming system including multiple digital cameras for capturing images to undergo processing in a game module, that is wirelessly coupled to the multiple digital cameras, for position detection and/or movement tracking.
• This embodiment is somewhat analogous to the previous embodiment, with at least one difference being that at least some of the digital cameras and the game console each include wireless communication capability to effectuate wireless communication there between. In this embodiment, at least one of the digital cameras may nonetheless remain wire-coupled to the game console. For example, each of some of the digital cameras, as well as the game console, either includes an integrated wireless transceiver or is coupled to a wireless transceiver to effectuate communication between those digital cameras and the game console. In addition, a digital camera may be integrated into the game console as well without departing from the scope and spirit of the invention.
  • This wireless communication can be supported using any number of desired wireless protocols including Code Division Multiple Access (CDMA) signaling, Time Division Multiple Access (TDMA) signaling, Frequency Division Multiple Access (FDMA) signaling, or some other desired wireless standard, protocol, or proprietary means of communication.
  • In addition, the wireless communication can be supported using any desired radio frequency (RF) communication standard including any that operates in accordance with one or more standards including, but not limited to, RFID, IEEE 802.11, Bluetooth, advanced mobile phone services (AMPS), digital AMPS, global system for mobile communications (GSM), code division multiple access (CDMA), local multi-point distribution systems (LMDS), multi-channel-multi-point distribution systems (MMDS), and/or variations thereof.
  • FIG. 13 is a schematic block diagram of a side view of another embodiment of a gaming system that includes multiple gaming objects, the player, and a game console. In this embodiment, the gaming objects include one or more sensing tags (e.g., metal, RFID tag, light reflective material, light absorbent material, a specific RGB [red, green, blue] color, etc.). For example, the gaming objects may include a game controller, a helmet, a shirt, pants, gloves, and socks, each of which includes one or more sensing tags. In this manner, the sensing tags facilitate the determining of position and/or facilitate motion tracking as will be subsequently discussed.
  • FIG. 14 is a schematic block diagram of an overhead view of another embodiment of a gaming system that includes a game console, a plurality of players and a plurality of gaming objects. In this instance, the positioning and motion tracking of each of the gaming objects (and hence the player) are determined by the game console and/or the one or more peripheral sensors.
• FIG. 15, FIG. 16, and FIG. 17 are diagrams of an embodiment of a coordinate system of a localized physical area that may be used for a gaming system. In these diagrams, an xyz origin is selected to be somewhere in the localized physical area, and each point being tracked and/or used for positioning on the player and/or on the gaming object is determined based on its Cartesian coordinates (e.g., x1, y1, z1). As the player and/or gaming object moves, the new positions of the tracking and/or positioning points are determined in Cartesian coordinates with respect to the origin.
• FIG. 18, FIG. 19, and FIG. 20 are diagrams of another embodiment of a coordinate system of a localized physical area that may be used for a gaming system. In these diagrams, an origin is selected to be somewhere in the localized physical area, and each point being tracked and/or used for positioning on the player and/or on the gaming object is determined based on its vector, or spherical, coordinates (ρ, φ, θ), which are defined as follows: ρ ≥ 0 is the distance from the origin to a given point P; 0° ≤ φ ≤ 180° is the angle between the positive z-axis and the line formed between the origin and P; and 0° ≤ θ < 360° is the angle between the positive x-axis and the line from the origin to the projection of P onto the xy-plane. φ is referred to as the zenith, colatitude, or polar angle, while θ is referred to as the azimuth. φ and θ lose significance when ρ = 0, and θ loses significance when sin(φ) = 0 (at φ = 0° and φ = 180°). To plot a point from its spherical coordinates, go ρ units from the origin along the positive z-axis, rotate by φ about the y-axis in the direction of the positive x-axis, and rotate by θ about the z-axis in the direction of the positive y-axis. As the player and/or gaming object moves, the new positions of the tracking and/or positioning points are determined in vector, or spherical, coordinates with respect to the origin (a sketch of the corresponding coordinate conversion follows).
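• A short sketch of converting such spherical coordinates to Cartesian coordinates, under exactly the conventions just described (φ measured from the positive z-axis, θ from the positive x-axis within the xy-plane):

```python
import math

def spherical_to_cartesian(rho, phi_deg, theta_deg):
    """(rho, phi, theta) -> (x, y, z): x = rho sin(phi) cos(theta),
    y = rho sin(phi) sin(theta), z = rho cos(phi)."""
    phi, theta = math.radians(phi_deg), math.radians(theta_deg)
    return (rho * math.sin(phi) * math.cos(theta),
            rho * math.sin(phi) * math.sin(theta),
            rho * math.cos(phi))

# A point 2 m from the origin, 90 degrees down from +z, along +x.
print(spherical_to_cartesian(2.0, 90.0, 0.0))  # -> approx (2.0, 0.0, 0.0)
```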
  • While FIGS. 15-20 illustrate two types of coordinate system, any three-dimensional coordinate system may be used for tracking motion and/or establishing position within a gaming system.
  • FIG. 21 is a diagram of a method for determining position and/or motion tracking that begins by determining the environment parameters (e.g., determining the properties of the localized physical area such as height, width, depth, objects in the physical area, etc.). The method then continues by mapping the environment parameters to a coordinate system (e.g., Cartesian coordinate system of FIGS. 15-17). The method continues in one or more branches. Along one branch, the initial coordinates of the player are determined using one or more of a plurality of position determining techniques as described herein. This branch continues by updating the player's position to track the player's motion using one or more of a plurality of motion tracking techniques as described herein.
• The other branch includes determining the coordinates of the gaming object's initial position using one or more of a plurality of position determining techniques as described herein. This branch continues by updating the gaming object's position to track the gaming object's motion using one or more of a plurality of motion tracking techniques as described herein. Note that the motion of the player and/or gaming object may be tracked at a rate based on the video game being played and the expected speed of motion. Further note that a tracking rate of 10 milliseconds provides 0.1 mm accuracy in motion tracking.
  • FIG. 22 is a diagram of another method for determining position and/or motion tracking that begins by determining a reference point within a coordinate system (e.g., the vector coordinate system of FIGS. 18-20). The reference point may be the origin or any other point within the localized physical area. The method continues in one or more branches. Along one branch, a vector with respect to the reference point is determined to indicate the player's initial position, which may be done by using one or more of a plurality of position determining techniques as described herein. This branch continues by updating the player's position to track the player's motion using one or more of a plurality of motion tracking techniques as described herein.
• The other branch includes determining a vector with respect to the reference point for the gaming object to establish its initial position, which may be done by using one or more of a plurality of position determining techniques as described herein. This branch continues by updating the gaming object's position to track the gaming object's motion using one or more of a plurality of motion tracking techniques as described herein. Note that the motion of the player and/or gaming object may be tracked at a rate based on the video game being played and the expected speed of motion. Further note that a tracking rate of 10 milliseconds provides 0.1 mm accuracy in motion tracking.
• FIG. 23, FIG. 24, and FIG. 25 are diagrams of another embodiment of a coordinate system of a localized physical area that may be used for a gaming system. In these diagrams, an xyz origin is selected to be somewhere in the localized physical area, and the initial position of a point being tracked on the player and/or gaming object is determined based on its Cartesian coordinates (e.g., x1, y1, z1). As the player and/or gaming object moves, the new positions of the tracking and/or positioning points are determined in Cartesian coordinates with respect to the preceding location (e.g., Δx, Δy, Δz), as sketched below.
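• A minimal sketch of this delta-based tracking, accumulating (Δx, Δy, Δz) updates onto the preceding location:

```python
def track_from_deltas(initial_xyz, deltas):
    """Positions of a tracking point given its initial Cartesian
    coordinates and a sequence of (dx, dy, dz) updates, each taken
    with respect to the preceding location."""
    path = [tuple(initial_xyz)]
    for dx, dy, dz in deltas:
        x, y, z = path[-1]
        path.append((x + dx, y + dy, z + dz))
    return path

# A point starting at (1, 1, 0), nudged twice along x.
print(track_from_deltas((1.0, 1.0, 0.0), [(0.1, 0, 0), (0.1, 0, 0)]))
```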
• As another example, the positioning and motion tracking of the player may be done with reference to the position of the gaming object, such that the gaming object's position is determined with reference to the origin and/or its previous position, and the position of the player is determined with reference to the gaming object's position. The reverse could be used as well. Further, both the position and motion of the gaming object and the player may be referenced to a personal item of the player, such as a cell phone.
• FIG. 26, FIG. 27, and FIG. 28 are diagrams of another embodiment of a coordinate system of a localized physical area that may be used for a gaming system. In these diagrams, an origin is selected to be somewhere in the localized physical area, and the initial position of a point being tracked on the player and/or gaming object is determined based on its vector, or spherical, coordinates (e.g., ρ1, φ1, θ1). As the player and/or gaming object moves, the new positions of the tracking and/or positioning points are determined as a vector, or in spherical coordinates, with respect to the preceding location (e.g., ΔV, or Δρ, Δφ, Δθ).
• As another example, the positioning and motion tracking of the player may be done with reference to the position of the gaming object, such that the gaming object's position is determined with reference to the origin and/or its previous position, and the position of the player is determined with reference to the gaming object's position. The reverse could be used as well. Further, both the position and motion of the gaming object and the player may be referenced to a personal item of the player, such as a cell phone.
• FIG. 29 is a diagram of another method for determining position and/or motion tracking that begins by determining environment parameters of the physical area in which the gaming object lies and/or in which the game system lies. The environmental parameters include, but are not limited to, the height, width, and depth of the localized physical area, objects in the physical area, differing materials in the physical area, multiple path effects, interferers, etc.
  • The method then proceeds by mapping the environment parameters to a coordinate system (e.g., one of the systems shown in FIGS. 15-17). As an example, if the physical area is a room, a point in the room is selected as the origin and the coordinate system is applied to at least some of the room. In addition, objects in the room (e.g., a couch, a chair, etc.) are mapped to the coordinate system based on their physical location in the room.
• The method then proceeds by determining the coordinates of the player's, or players', position in the physical area. The method then continues by determining the coordinates of a gaming object's initial position. Note that the positioning of the gaming object may be used to determine the position of the player(s) if the gaming object is something worn by the player or is in close proximity to the player. Alternatively, the initial position of the player may be used to determine the initial position of the gaming object. Note that one or more of the plurality of positioning techniques described herein may be used to determine the position of the player and/or of the gaming object.
• The method then proceeds by updating the coordinates of the player's, or players', position in the physical area to track the player's motion. The method also continues by updating the coordinates of the gaming object's position to track its motion. Note that the motion of the gaming object may be used to determine the motion of the player(s) if the gaming object is something worn by the player or is in close proximity to the player. Alternatively, the motion of the player may be used to determine the motion of the gaming object. Note that one or more of the plurality of motion tracking techniques described herein may be used to track the motion of the player and/or of the gaming object.
• FIG. 30 is a diagram of another method for determining position and/or motion tracking that begins by determining a reference point within the physical area in which the gaming object lies and/or in which the game system lies. The method then proceeds by determining a vector for a player's initial position with respect to a reference point of a coordinate system (e.g., one of the systems shown in FIGS. 18-20). As an example, if the physical area is a room, a point in the room is selected as the origin and the coordinate system is applied to at least some of the room.
• The method then continues by determining a vector of a gaming object's initial position. Note that the positioning of the gaming object may be used to determine the position of the player(s) if the gaming object is something worn by the player or is in close proximity to the player. Alternatively, the initial position of the player may be used to determine the initial position of the gaming object. Note that one or more of the plurality of positioning techniques described herein may be used to determine the position of the player and/or of the gaming object.
• The method then proceeds by updating the vector of the player's, or players', position in the physical area to track the player's motion. The method also continues by updating the vector of the gaming object's position to track its motion. Note that the motion of the gaming object may be used to determine the motion of the player(s) if the gaming object is something worn by the player or is in close proximity to the player. Alternatively, the motion of the player may be used to determine the motion of the gaming object. Note that one or more of the plurality of motion tracking techniques described herein may be used to track the motion of the player and/or of the gaming object.
• FIG. 31 is a diagram of another method for determining position and/or motion tracking that begins by determining environment parameters of the physical area in which the gaming object lies and/or in which the game system lies. The environmental parameters include, but are not limited to, the height, width, and depth of the localized physical area, objects in the physical area, differing materials in the physical area, multiple path effects, interferers, etc.
  • The method then proceeds by mapping the environment parameters to a coordinate system (e.g., one of the systems shown in FIGS. 23-25). As an example, if the physical area is a room, a point in the room is selected as the origin and the coordinate system is applied to at least some of the room. In addition, objects in the room (e.g., a couch, a chair, etc.) are mapped to the coordinate system based on their physical location in the room.
  • The method then proceeds by determining the coordinates of the gaming object's initial position in the physical area. The method then continues by determining the coordinates of the player's initial position with respect to the gaming object's initial position. Note that one or more of the plurality of positioning techniques described herein may be used to determine the position of the player and/or of the gaming object.
• The method then proceeds by updating the coordinates of the gaming object's position in the physical area to track its motion. The method also continues by updating the coordinates of the player's position to track the player's motion with respect to the gaming object. Note that one or more of the plurality of motion tracking techniques described herein may be used to track the motion of the player and/or of the gaming object.
• FIG. 32 is a diagram of another method for determining position and/or motion tracking that begins by determining a reference point within the physical area in which the gaming object lies and/or in which the game system lies. The method then proceeds by determining a vector for a gaming object's initial position with respect to a reference point of a coordinate system (e.g., one of the systems shown in FIGS. 26-28). As an example, if the physical area is a room, a point in the room is selected as the origin and the coordinate system is applied to at least some of the room.
  • The method then continues by determining a vector of the player's initial position with respect to the gaming object's initial position. Note that one or more of the plurality of positioning techniques described herein may be used to determine the position of the player and/or of the gaming object.
• The method then proceeds by updating the vector of the gaming object's position in the physical area to track its motion. The method also continues by updating the vector of the player's position with respect to the gaming object's motion to track the player's motion. Note that one or more of the plurality of motion tracking techniques described herein may be used to track the motion of the player and/or of the gaming object.
• FIG. 33 is a diagram of another embodiment of a coordinate system of a gaming system that is an extension of the coordinate systems discussed above. In this embodiment, the coordinate system includes a positioning coordinate grid and a motion tracking grid, where the motion tracking grid is of a finer resolution than the positioning coordinate grid. In general, the player's or gaming object's position within the physical area can have a first tolerance (e.g., within a meter), while the motion tracking of the player and/or the gaming object has a second tolerance (e.g., within a few millimeters). As such, the position of the player and/or gaming object can be updated infrequently in comparison to the updating of the motion (e.g., the position can be updated once every second or so while the motion may be updated once every 10 milliseconds), as illustrated in the sketch below.
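• As an illustration of the two grid resolutions, the sketch below quantizes the same coordinate onto a coarse positioning grid and a fine motion tracking grid; the 1 m and 5 mm resolutions are example tolerances in the spirit of those mentioned above, not values fixed by this disclosure.

```python
def snap_to_grid(value_m, resolution_m):
    """Quantize a coordinate to the nearest grid cell center."""
    return round(value_m / resolution_m) * resolution_m

position_m = 3.417
print(snap_to_grid(position_m, 1.0))    # coarse positioning grid -> 3.0
print(snap_to_grid(position_m, 0.005))  # fine motion tracking grid -> 3.415
```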
  • FIG. 34 is a diagram of a method for determining motion of a gaming object and/or a player that begins by determining an initial position of the player and/or gaming object using one or more of the positioning techniques described herein. The method continues by determining motion reference points for the player and/or for the gaming object as shown in FIG. 35. The reference points may be sensors on the player and/or on the gaming object, may be particular body parts (e.g., nose, elbow, knee, etc.), particular points on the gaming object, and/or a combination thereof. The number of reference points and the location thereof may be dependent on the video game, on the player's physical characteristics, on the player's skill level, on the desired motion tracking resolution, and/or on the motion tracking technique being used.
• The method continues by determining initial motion coordinates for each reference point using one or more of the position determining techniques and/or motion tracking techniques described herein. The method continues by establishing one or more data rates for the reference points based on the location of the reference point, motion patterns (e.g., in a video bowling game, the player will have particular motions for bowling), previous motion (e.g., halfway through bowling a ball, where the next motion is likely to be is known), and/or human bio-mechanics (e.g., arms and legs bend in a certain manner). For example, the reference point of a hand may have a faster data rate than a reference point on the head, since the hand will most likely be moving faster and more often than the head.
• The method continues by obtaining motion tracking data (e.g., distances, vectors, distance changes, vector changes, etc.) for the reference points at intervals of the one or more data rates. The method continues by determining motion of the reference points based on the motion tracking data at intervals of the one or more data rates.
  • FIG. 36, FIG. 37, FIG. 38, and FIG. 39 are diagrams of examples of motion patterns in accordance with human bio-mechanics. As shown in FIG. 36, a head can move up/down, it can tilt, it can rotate, and/or a combination thereof. For a given video game, head motion can be anticipated based on current play of the game. For example, during an approach shot, the head will be relatively steady with respect to tilting and rotating, and may move up or down along with the body.
• FIG. 37 shows the motion patterns of an arm (or leg) in accordance with human bio-mechanics. As shown, the arm (or leg) may contract or extend, go up or down, move side to side, rotate, or a combination thereof. For a given video game, an arm (or leg) motion can be anticipated based on the current play of the game. Note that the arm (or leg) may be broken down into smaller body parts (e.g., upper arm, elbow, forearm, wrist, hand, fingers). Further note that the gaming object's motion will be similar to that of the body part it is associated with.
• FIG. 38 illustrates the likely motions of a torso, which can move up/down, side to side, front to back, and/or in a combination thereof. For a given video game, torso motion can be anticipated based on current play of the game. As such, based on the human bio-mechanical limitations and ranges of motion, along with the video game being played, the motion of the player and/or the associated gaming object may be anticipated, which facilitates better motion tracking.
  • FIG. 39 is a diagram of an example of motion estimation for the head, right arm, left arm, torso, right leg, and left leg of a video game player. In this game, it is anticipated that the arms will move the most often and over the most distance, followed by the legs, torso, and head. In this example the interval rate may be 10 milliseconds, which provides a 1 mm resolution for an object moving at 200 miles per hour. In this example, the body parts are not anticipated to move at or near 200 mph.
• At interval 1, at least some of the reference points on the corresponding body parts are sampled. Note that each body part may include one or more reference points. Since the arms are anticipated to move the most and/or over the greatest distances, the reference point(s) associated with the arms are sampled once every third interval (e.g., intervals 1, 4, 7). For intervals 2 and 3, the motion of the reference points is estimated based on the samples of intervals 1 and 4 (and possibly more samples at different intervals), the motion pattern of the arm, human bio-mechanics, and/or a combination thereof. The estimation may be a linear estimation, a most likely estimation, and/or any other mathematical technique for estimating data points between two or more samples (see the sketch following this paragraph). A similar estimation is made for intervals 5 and 6.
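• A minimal sketch of such a linear estimation for the intervening intervals, e.g., estimating intervals 2 and 3 from the samples taken at intervals 1 and 4 (the positions shown are illustrative):

```python
def estimate_between_samples(sample_a, sample_b, n_between):
    """Linearly estimated positions for the n_between intervals
    that fall between two sampled reference-point positions."""
    steps = n_between + 1
    return [tuple(a + (b - a) * k / steps for a, b in zip(sample_a, sample_b))
            for k in range(1, steps)]

# Arm reference point sampled at intervals 1 and 4; estimate 2 and 3.
print(estimate_between_samples((0.0, 1.0, 0.5), (0.3, 1.0, 0.5), 2))
# -> [(0.1, 1.0, 0.5), (0.2, 1.0, 0.5)]
```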
• The legs have a data rate of sampling once every four intervals (e.g., intervals 1, 5, 9, etc.). The motion data for the intervening intervals is estimated in a similar manner as the motion data of the arms. The torso has a data rate of sampling once every five intervals (e.g., intervals 1, 6, 11, etc.). The head has a data rate of sampling once every six intervals (e.g., intervals 1, 7, 13, etc.). Note that the initial sampling does not need to be done during the same interval for all of the reference points.
  • FIG. 40 and FIG. 41 are diagrams of examples of reference points on a player to determine player's physical measurements. In this example, once the positioning of the reference points is determined, their positioning may be used to determine the physical attributes of the player (e.g., height, width, arm length, leg length, shoe size, etc.).
• FIG. 42 is a diagram of an example of mapping a player to an image of the video game. In this embodiment, the image displayed in the video game corresponds to the player such that, as the player moves, the image moves the same way. The image may be a stored image of the actual player, a celebrity player (e.g., a professional athlete), a default image, and/or a user created image. The mapping involves estimating motion of the non-reference points of the player based on the reference points of the player. In addition, the mapping involves equating the reference points on the player to the same points on the image. The same may be done for the gaming object.
  • FIG. 43 is a diagram of another method for determining motion that begins by obtaining coordinates for the reference points of the player and/or gaming object. The method continues by determining the player's dimensions and/or determining the dimensions of the gaming object. The method continues by mapping the reference points of the player to corresponding points of a video image based on the player's dimensions. This step may also include mapping the reference points of the gaming object (e.g., a sword) to the corresponding image of the gaming object based on the gaming object's dimensions.
• The method continues by determining coordinates of other non-referenced body parts and/or parts of the gaming object based on the coordinates of the reference points. This may be done by a linear interpolation, by a most likely motion algorithm, by a look up table, and/or by any other method for estimating data points from surrounding data points. The method continues by tracking motion of the reference points and predicting motion of the non-referenced body parts and/or parts of the gaming object based on the motion of the reference points. This may also be done by a linear interpolation, by a most likely motion algorithm, by a look up table, and/or by any other method for estimating data points from surrounding data points, as sketched below.
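• For example, a non-referenced point lying between two reference points might be estimated by linear interpolation, as in this sketch; real body motion may instead call for the most-likely-motion or look-up-table approaches mentioned above, and the body-part names and coordinates are illustrative.

```python
def estimate_nonreference_point(ref_a, ref_b, fraction):
    """Estimate a non-referenced point (e.g., mid-forearm) a given
    fraction of the way between two reference points (e.g., elbow
    at fraction 0.0, wrist at fraction 1.0)."""
    return tuple(a + (b - a) * fraction for a, b in zip(ref_a, ref_b))

# Mid-forearm estimated halfway between elbow and wrist positions.
print(estimate_nonreference_point((0.0, 1.4, 0.2), (0.3, 1.1, 0.2), 0.5))
```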
• FIG. 44 is a schematic block diagram of an embodiment of a gaming object and/or game console that includes a physical layer (PHY) integrated circuit (IC) and a medium access control (MAC) layer processing module. The PHY IC includes a position and/or motion tracking RF section, a controller interface RF section, and a baseband processing module. Like any processing module disclosed herein, the MAC processing module and the baseband processing module may be a single processing device or a plurality of processing devices. Such a processing device may be a microprocessor, micro-controller, digital signal processor, microcomputer, central processing unit, field programmable gate array, programmable logic device, state machine, logic circuitry, analog circuitry, digital circuitry, and/or any device that manipulates signals (analog and/or digital) based on hard coding of the circuitry and/or operational instructions. The processing module may have an associated memory and/or memory element, which may be a single memory device, a plurality of memory devices, and/or embedded circuitry of the processing module. Such a memory device may be a read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, cache memory, and/or any device that stores digital information. Note that when the processing module implements one or more of its functions via a state machine, analog circuitry, digital circuitry, and/or logic circuitry, the memory and/or memory element storing the corresponding operational instructions may be embedded within, or external to, the circuitry comprising the state machine, analog circuitry, digital circuitry, and/or logic circuitry. Further note that the memory element stores, and the processing module executes, hard coded and/or operational instructions corresponding to at least some of the steps and/or functions illustrated in the various Figures depicted and described herein.
  • The MAC processing module triggers position and/or tracking data collection, formatting of the data, processing of the data, and/or controlling position and/or tracking data communications and/or controller communications. The position and/or tracking RF section may include circuitry to transmit one or more beamformed RF signals, RF signals for 3D antenna reception, RFID communications, and/or any other RF transmission and/or reception discussed herein.
  • The game console may use a standardized protocol, a proprietary protocol, and/or a combination thereof to provide the communication between the gaming object and the console. Note that the communication protocol may borrow unused bandwidth from a standardized protocol to facilitate the gaming communication (e.g., utilize unused BW of a WLAN, cell phone, etc.).
  • FIG. 45, FIG. 46, and FIG. 47 are diagrams of various embodiments of methods for determining position and/or motion tracking.
  • Referring to the method of FIG. 45, the method operates by capturing digital images using multiple digital cameras. The method then performs processing of the digital images to identify characteristics of an object that is depicted within at least some of the digital images. The method then operates by determining position of the object based on the identified characteristics. This determined position is with respect to the locations of at least some of the multiple digital cameras.
  • Once the position of the object is known, the method can continue by mapping this determined position to a virtual 3D (three-dimensional) coordinate system.
  • Referring to the method of FIG. 46, the method operates by capturing digital images using multiple digital cameras. The method then performs processing of the digital images to identify characteristics of an object that is depicted within at least some of the digital images. Once these characteristics of the object are identified, the method operates by generating directional vectors based on the identified characteristics. These directional vectors may be viewed as extending from locations of at least some of the multiple digital cameras to a position of the object.
  • The method then operates by determining position of the object based on the directional vectors. Again, this determined position is with respect to the locations of at least some of the multiple digital cameras as indicated by an intersection of at least some of the directional vectors.
  • Once the position of the object is known, the method can continue by mapping this determined position to a 3D (three-dimensional) coordinate system.
• Referring to the method of FIG. 47, the method operates by capturing digital images using multiple digital cameras. The method then performs processing of the digital images to identify at least one sensing tag that is depicted within at least some of the digital images. The sensing tag can be any of a variety of sensing tags, including a light reflective material, a light absorbent material, an infrared source (e.g., when at least one of the digital cameras is infrared sensitive), a color, and/or any other desired type of sensing tag. The sensing tag may be associated with the entirety of an object depicted within at least some of the digital images. As also described herein, the sensing tag may be associated with only a portion of an object depicted within at least some of the digital images (e.g., a corner of an object, a body part of a player, etc.).
  • Once the sensing tag is identified within at least some of the digital images, the method operates by generating directional vectors based on the identified sensing tag. These directional vectors may be viewed as extending from locations of at least some of the multiple digital cameras to a position of the sensing tag.
  • The method then operates by determining position of the sensing tag based on the directional vectors. Again, this determined position is with respect to the locations of at least some of the multiple digital cameras as indicated by an intersection of at least some of the directional vectors.
  • Once the position of the object is known, the method can continue by mapping this determined position to a virtual 3D (three-dimensional) coordinate system.
  • FIG. 48 is a diagram of an embodiment of a method for determining a distance based on captured digital images.
  • Referring to the method of FIG. 48, the method operates by capturing digital images using multiple digital cameras. The method then performs processing of the digital images, using pattern recognition, to identify an object depicted within at least some of the digital images. A size of the identified object is predetermined (e.g., such as a predetermined size of a gaming object, a known object, etc.).
• In accordance with processing the digital images, the method operates to determine an image size of the identified object (e.g., a size of the object as depicted within at least one of the digital images). Once the image size of an object depicted within a digital image is known, and when the actual size of the object is also known, the method can associate the known/predetermined size with the image size. This way, a scaling factor can be determined between sizes depicted within the digital image and actual sizes within the physical environment that includes the object.
• The method then operates by determining a distance within the physical environment using the image size of the object and the predetermined size of the object (e.g., based on the scaling factor), as sketched below.
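• A minimal sketch of this scaling-factor computation; the function names, sizes, and pixel counts are illustrative, and the factor is only valid for distances at roughly the same depth as the identified object.

```python
def scale_factor_m_per_px(predetermined_size_m, image_size_px):
    """Meters per image pixel near the identified object's depth."""
    return predetermined_size_m / image_size_px

# A gaming object known to be 0.40 m tall appears 80 px tall, so
# image distances at a similar depth scale by 5 mm per pixel.
factor = scale_factor_m_per_px(0.40, 80)
print(factor * 120)  # a 120 px separation corresponds to ~0.6 m
```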
• This determined distance is with respect to the locations of at least some of the multiple digital cameras. Once a distance, as depicted within at least one digital image, is known, the method can continue by mapping this determined distance within a virtual 3D (three-dimensional) coordinate system.
• It is noted that the various modules (e.g., processing modules, baseband processing modules, MAC processing modules, game consoles, etc.) described herein may be a single processing device or a plurality of processing devices. Such a processing device may be a microprocessor, micro-controller, digital signal processor, microcomputer, central processing unit, field programmable gate array, programmable logic device, state machine, logic circuitry, analog circuitry, digital circuitry, and/or any device that manipulates signals (analog and/or digital) based on operational instructions. The operational instructions may be stored in a memory. The memory may be a single memory device or a plurality of memory devices. Such a memory device may be a read-only memory, random access memory, volatile memory, non-volatile memory, static memory, dynamic memory, flash memory, and/or any device that stores digital information. It is also noted that when the processing module implements one or more of its functions via a state machine, analog circuitry, digital circuitry, and/or logic circuitry, the memory storing the corresponding operational instructions is embedded within the circuitry comprising the state machine, analog circuitry, digital circuitry, and/or logic circuitry. In such an embodiment, a memory stores, and a processing module coupled thereto executes, operational instructions corresponding to at least some of the steps and/or functions illustrated and/or described herein.
  • The present invention has also been described above with the aid of method steps illustrating the performance of specified functions and relationships thereof. The boundaries and sequence of these functional building blocks and method steps have been arbitrarily defined herein for convenience of description. Alternate boundaries and sequences can be defined so long as the specified functions and relationships are appropriately performed. Any such alternate boundaries or sequences are thus within the scope and spirit of the claimed invention.
  • The present invention has been described above with the aid of functional building blocks illustrating the performance of certain significant functions. The boundaries of these functional building blocks have been arbitrarily defined for convenience of description. Alternate boundaries could be defined as long as the certain significant functions are appropriately performed. Similarly, flow diagram blocks may also have been arbitrarily defined herein to illustrate certain significant functionality. To the extent used, the flow diagram block boundaries and sequence could have been defined otherwise and still perform the certain significant functionality. Such alternate definitions of both functional building blocks and flow diagram blocks and sequences are thus within the scope and spirit of the claimed invention.
  • One of average skill in the art will also recognize that the functional building blocks, and other illustrative blocks, modules and components herein, can be implemented as illustrated or by discrete components, application specific integrated circuits, processors executing appropriate software and the like or any combination thereof.
  • Moreover, although described in detail for purposes of clarity and understanding by way of the aforementioned embodiments, the present invention is not limited to such embodiments. It will be obvious to one of average skill in the art that various changes and modifications may be practiced within the spirit and scope of the invention, as limited only by the scope of the appended claims.

Claims (32)

1. An apparatus, comprising:
a plurality of digital cameras that generates a plurality of digital images, wherein an object is depicted within at least some of the plurality of digital images; and
a processing module coupled to:
receive the plurality of digital images;
identify characteristics of the object within the at least some of the plurality of digital images to produce identified object characteristics; and
determine position of the object with respect to the plurality of digital cameras based on the identified object characteristics.
2. The apparatus of claim 1, wherein:
the identified object characteristics include a plurality of directional vectors extending from at least some of the plurality of digital cameras to the object; and
the processing module determines the position of the object with respect to the plurality of digital cameras based on the plurality of directional vectors.
3. The apparatus of claim 2, wherein:
one of the plurality of directional vectors extends from a reference point of a digital image sensor of one digital camera to a physical pixel within the digital image sensor that corresponds to an image pixel of one digital image captured by the one digital camera.
4. The apparatus of claim 1, wherein:
the processing module employs a pattern recognition process to identify the characteristics of the object.
5. The apparatus of claim 1, wherein:
the object includes a sensing tag; and
at least one of the identified object characteristics is the sensing tag.
6. The apparatus of claim 5, wherein:
the sensing tag is at least one of:
a light reflective material;
a light absorbent material;
an infrared source such that at least one of the plurality of digital cameras is infrared sensitive; and
a color.
7. The apparatus of claim 1, wherein:
the object has a predetermined size;
the processing module employs a pattern recognition process to identify the object within the at least some of the plurality of digital images;
the identified object has an image size; and
based on the predetermined size and the image size, the processing module determines a distance between the object and the processing module or at least one of the plurality of digital cameras.
8. The apparatus of claim 1, wherein:
the processing module maps the position of the object within a virtual three-dimensional coordinate system.
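Mapping the determined position into a virtual three-dimensional coordinate system is a rigid transform from the camera frame into the game-world frame; a minimal sketch assuming the rotation and translation are known from calibration (hypothetical names and values):

```python
import numpy as np

def map_to_virtual(p_camera, R_world_cam, t_world_cam):
    """Rotate then translate a camera-frame position into the virtual frame."""
    return R_world_cam @ np.asarray(p_camera, dtype=float) + t_world_cam

# Identity rotation, virtual origin 1 m below the camera:
print(map_to_virtual([0.0, 0.0, 2.0], np.eye(3), np.array([0.0, -1.0, 0.0])))
# -> [ 0. -1.  2.]
```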
9. The apparatus of claim 1, wherein:
a field of view of one camera of the plurality of digital cameras is adjusted based on the position of the object.
10. The apparatus of claim 1, wherein:
an image capture rate of one of the plurality of digital cameras is adjusted based on at least one of:
a predetermined setting within the processing module;
a user-selected setting within the processing module;
a movement history of the object;
a current movement of the object; and
an expected future movement of the object.
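One simple policy consistent with claim 10 ties the image capture rate to the object's current or expected speed, clamped between a base and a maximum rate; the rates and gain below are hypothetical:

```python
def capture_rate_hz(speed_mps, base_hz=30.0, max_hz=120.0, hz_per_mps=30.0):
    """Raise the frame rate as the tracked object moves faster, up to a cap."""
    return min(max_hz, base_hz + hz_per_mps * max(speed_mps, 0.0))

print(capture_rate_hz(0.0))  # slow scene  -> 30.0 Hz
print(capture_rate_hz(2.0))  # fast swing  -> 90.0 Hz
print(capture_rate_hz(9.9))  # clamped     -> 120.0 Hz
```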
11. The apparatus of claim 1, wherein:
the processing module determines the position of the object during a first time;
the processing module determines at least one additional position of the object during a second time; and
the processing module estimates movement of the object by comparing the determined position and the at least one additional determined position.
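The movement estimate of claim 11 is a finite difference between the two timed position fixes; a minimal sketch with hypothetical values:

```python
import numpy as np

def estimate_velocity(p_first, t_first, p_second, t_second):
    """Finite-difference velocity between two timed position determinations."""
    dt = t_second - t_first
    return (np.asarray(p_second, float) - np.asarray(p_first, float)) / dt

# Object moved 0.1 m along x between fixes 50 ms apart:
print(estimate_velocity((0.0, 0.0, 2.0), 0.00, (0.1, 0.0, 2.0), 0.05))
# -> [2. 0. 0.]  i.e. 2 m/s along x
```

The same difference taken over more than two fixes also yields a movement history and, by extrapolation, an expected future movement.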
12. The apparatus of claim 1, wherein:
the object includes a first radio frequency (RF) transceiver;
the processing module includes a second RF transceiver; and
based on an RF signal transmitted from the first RF transceiver to the second RF transceiver, the processing module determines a distance between the processing module and the object.
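The RF ranging of claims 12 and 13 can be realized, for example, by two-way time-of-flight: the distance is half the round-trip time, less the responder's turnaround delay, times the speed of light. A hypothetical sketch:

```python
SPEED_OF_LIGHT_MPS = 299_792_458.0

def rf_distance_m(round_trip_s, turnaround_s=0.0):
    """Two-way ranging: half the net time of flight times c."""
    return SPEED_OF_LIGHT_MPS * (round_trip_s - turnaround_s) / 2.0

# 40 ns round trip with a 20 ns responder turnaround -> 10 ns each way:
print(rf_distance_m(40e-9, 20e-9))  # -> approx. 3.0 (metres)
```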
13. The apparatus of claim 1, wherein:
one of the plurality of digital cameras includes a first radio frequency (RF) transceiver;
the processing module includes a second RF transceiver; and
based on an RF signal transmitted from the first RF transceiver to the second RF transceiver, the processing module determines a distance between the processing module and the one digital camera.
14. The apparatus of claim 1, wherein:
a plurality of integrated circuits is distributed throughout a region in which the object is located; and
one of the plurality of digital cameras is a digital image sensor implemented on a surface of one of the plurality of integrated circuits.
15. An apparatus, comprising:
a gaming object for use within a gaming environment;
a plurality of digital cameras that generates a plurality of digital images, wherein the gaming object is depicted within at least some of the plurality of digital images; and
a game console coupled to:
receive the plurality of digital images;
identify characteristics of the gaming object within the at least some of the plurality of digital images to produce identified object characteristics; and
determine position of the gaming object within the gaming environment with respect to the plurality of digital cameras based on the identified object characteristics.
16. The apparatus of claim 15, wherein:
the gaming object is associated with a player located within the gaming environment; and
the game console determines position of the player based on the position of the gaming object.
17. The apparatus of claim 15, wherein:
the identified object characteristics include a plurality of directional vectors extending from at least some of the plurality of digital cameras to the gaming object; and
the game console determines the position of the gaming object with respect to the plurality of digital cameras based on the plurality of directional vectors.
18. The apparatus of claim 17, wherein:
one of the plurality of directional vectors extends from a reference point of a digital image sensor of one digital camera to a physical pixel within the digital image sensor that corresponds to an image pixel of one digital image captured by the one digital camera.
19. The apparatus of claim 15, wherein:
the game console employs a pattern recognition process to identify the characteristics of the gaming object.
20. The apparatus of claim 15, wherein:
the gaming object includes a sensing tag; and
at least one of the identified object characteristics is the sensing tag.
21. The apparatus of claim 20, wherein:
the sensing tag is at least one of:
a light reflective material;
a light absorbent material;
an infrared source such that at least one of the plurality of digital cameras is infrared sensitive; and
a color.
22. The apparatus of claim 15, wherein:
the gaming object has a predetermined size;
the game console employs a pattern recognition process to identify the gaming object within the at least some of the plurality of digital images;
the identified gaming object has an image size; and
based on the predetermined size and the image size, the game console determines a distance between the gaming object and the game console or at least one of the plurality of digital cameras.
23. The apparatus of claim 15, wherein:
the game console maps the position of the gaming object within a virtual three-dimensional coordinate system.
24. The apparatus of claim 15, wherein:
an image capture rate of one of the plurality of digital cameras is adjusted based on at least one of:
a predetermined setting within the game console;
a player-selected setting within the game console;
a movement history of the gaming object;
a current movement of the gaming object; and
an expected future movement of the gaming object.
25. The apparatus of claim 15, wherein:
the position is a first position;
the game console determines the first position during a first time;
the game console determines a second position of the gaming object during a second time; and
the game console estimates movement of the gaming object by comparing the first position and the second position.
26. An apparatus, comprising:
a plurality of digital cameras, associated with a gaming object, that generates a plurality of digital images such that a plurality of predetermined references is depicted within at least some of the plurality of digital images; and
a game console coupled to:
receive the plurality of digital images;
identify characteristics of at least some of the plurality of predetermined references to produce identified characteristics; and
determine position of the gaming object with respect to the plurality of predetermined references based on the identified characteristics.
27. The apparatus of claim 26, wherein:
the identified characteristics include a plurality of directional vectors extending from at least some of the plurality of digital cameras to the at least some of the plurality of predetermined references; and
the game console determines the position of the gaming object with respect to the plurality of digital cameras based on the plurality of directional vectors.
28. The apparatus of claim 27, wherein:
one of the plurality of directional vectors extends from a physical pixel within a digital image sensor of one digital camera, that corresponds to an image pixel of one digital image captured by the one digital camera, to a reference point of the digital image sensor.
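Claims 26 to 28 invert the geometry of claims 1 to 3: the cameras ride on the gaming object and sight fixed predetermined references, so each sighted reference defines a ray running back toward the object. Assuming the reference positions and the object's orientation are known (so observed directions can be expressed in the world frame), the same least-squares ray intersection recovers the object's position; a self-contained, hypothetical sketch:

```python
import numpy as np

def nearest_point_to_rays(origins, directions):
    """Least-squares point nearest to all rays origin_i + t * direction_i."""
    A, b = np.zeros((3, 3)), np.zeros(3)
    for o, d in zip(origins, directions):
        P = np.eye(3) - np.outer(d, d)  # requires unit-length d
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

# Known world positions of two predetermined references:
refs = [np.array([0.0, 0.0, 0.0]), np.array([2.0, 0.0, 0.0])]
# Unit directions in which the object's cameras sight them, reversed so
# each ray runs from the reference back toward the object:
seen = [np.array([-1.0, 0.0, -2.0]), np.array([1.0, 0.0, -2.0])]
rays = [-(d / np.linalg.norm(d)) for d in seen]
print(nearest_point_to_rays(refs, rays))  # -> approx. [1. 0. 2.]
```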
29. The apparatus of claim 26, wherein:
the gaming object is associated with a player located within a gaming environment; and
the game console determines position of the player based on the position of the gaming object.
30. The apparatus of claim 26, wherein:
the game console employs a pattern recognition process to identify the characteristics of at least some of the plurality of predetermined references.
31. The apparatus of claim 26, wherein:
the game console maps the position of the gaming object within a virtual three-dimensional coordinate system.
32. The apparatus of claim 26, wherein:
the position is a first position;
the game console determines the first position during a first time;
the game console determines a second position of the gaming object during a second time; and
the game console estimates movement of the gaming object by comparing the first position and the second position.
US12/135,332 2007-06-22 2008-06-09 Position detection and/or movement tracking via image capture and processing Abandoned US20080316324A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/135,332 US20080316324A1 (en) 2007-06-22 2008-06-09 Position detection and/or movement tracking via image capture and processing

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US93672407P 2007-06-22 2007-06-22
US12/135,332 US20080316324A1 (en) 2007-06-22 2008-06-09 Position detection and/or movement tracking via image capture and processing

Publications (1)

Publication Number Publication Date
US20080316324A1 true US20080316324A1 (en) 2008-12-25

Family

ID=40135930

Family Applications (26)

Application Number Title Priority Date Filing Date
US12/125,154 Abandoned US20090017910A1 (en) 2007-01-31 2008-05-22 Position and motion tracking of an object
US12/128,810 Expired - Fee Related US8031121B2 (en) 2007-06-22 2008-05-29 Apparatus for position detection using multiple antennas
US12/128,785 Expired - Fee Related US7973702B2 (en) 2007-06-22 2008-05-29 Apparatus for position detection using multiple HCF transmissions
US12/128,797 Abandoned US20080318689A1 (en) 2007-06-22 2008-05-29 Local positioning system and video game applications thereof
US12/131,550 Abandoned US20080318625A1 (en) 2007-06-22 2008-06-02 Mobile communication device with gaming mode and methods for use therewith
US12/131,579 Active 2029-08-10 US8160640B2 (en) 2007-06-22 2008-06-02 Multi-mode mobile communication device with motion sensor and methods for use therewith
US12/131,480 Abandoned US20080318680A1 (en) 2007-06-22 2008-06-02 Gaming object and gaming console that communicate user data via backscattering and methods for use therewith
US12/131,605 Abandoned US20080318673A1 (en) 2007-06-22 2008-06-02 Gaming object with biofeedback sensor for interacting with a gaming application and methods for use therewith
US12/131,331 Active 2034-07-30 US9523767B2 (en) 2007-06-22 2008-06-02 Game console and gaming object with motion prediction modeling and methods for use therewith
US12/131,522 Active 2032-10-21 US9547080B2 (en) 2007-06-22 2008-06-02 Gaming object with orientation sensor for interacting with a display and methods for use therewith
US12/135,332 Abandoned US20080316324A1 (en) 2007-06-22 2008-06-09 Position detection and/or movement tracking via image capture and processing
US12/135,341 Active 2029-06-27 US7952962B2 (en) 2007-06-22 2008-06-09 Directional microphone or microphones for position determination
US12/137,143 Active 2033-04-24 US9417320B2 (en) 2007-06-22 2008-06-11 Game device that generates a display with a simulated body image and methods for use therewith
US12/136,939 Abandoned US20080318682A1 (en) 2007-06-22 2008-06-11 Dual transceiver gaming console interface and methods for use therewith
US12/137,747 Active 2031-09-01 US8628417B2 (en) 2007-06-22 2008-06-12 Game device with wireless position measurement and methods for use therewith
US12/142,702 Abandoned US20080318595A1 (en) 2007-06-22 2008-06-19 Position location system using multiple position location techniques
US12/142,064 Abandoned US20080318683A1 (en) 2007-06-22 2008-06-19 RFID based positioning system
US12/142,032 Active 2030-09-22 US8062133B2 (en) 2007-06-22 2008-06-19 Positioning within a video gaming environment using RF signals
US12/142,733 Abandoned US20080318684A1 (en) 2007-06-22 2008-06-19 Position location system using multiple position location techniques
US13/223,121 Expired - Fee Related US8289212B2 (en) 2007-06-22 2011-08-31 Apparatus for position detection using multiple antennas
US13/361,333 Active US8311579B2 (en) 2007-06-22 2012-01-30 Multi-mode mobile communication device with motion sensor and methods for use therewith
US13/592,804 Abandoned US20120315991A1 (en) 2007-06-22 2012-08-23 Apparatus position detection using multiple antennas
US13/627,360 Active US8676257B2 (en) 2007-06-22 2012-09-26 Multi-mode mobile communication device with motion sensor and methods for use therewith
US15/346,418 Active 2028-07-07 US10549195B2 (en) 2007-06-22 2016-11-08 Gaming object with orientation sensor for interacting with a display and methods for use therewith
US15/346,254 Active US9943760B2 (en) 2007-06-22 2016-11-08 Game console and gaming object with motion prediction modeling and methods for use therewith
US16/730,166 Active 2029-03-21 US11426660B2 (en) 2007-06-22 2019-12-30 Gaming object with orientation sensor for interacting with a display and methods for use therewith

Family Applications Before (10)

Application Number Title Priority Date Filing Date
US12/125,154 Abandoned US20090017910A1 (en) 2007-01-31 2008-05-22 Position and motion tracking of an object
US12/128,810 Expired - Fee Related US8031121B2 (en) 2007-06-22 2008-05-29 Apparatus for position detection using multiple antennas
US12/128,785 Expired - Fee Related US7973702B2 (en) 2007-06-22 2008-05-29 Apparatus for position detection using multiple HCF transmissions
US12/128,797 Abandoned US20080318689A1 (en) 2007-06-22 2008-05-29 Local positioning system and video game applications thereof
US12/131,550 Abandoned US20080318625A1 (en) 2007-06-22 2008-06-02 Mobile communication device with gaming mode and methods for use therewith
US12/131,579 Active 2029-08-10 US8160640B2 (en) 2007-06-22 2008-06-02 Multi-mode mobile communication device with motion sensor and methods for use therewith
US12/131,480 Abandoned US20080318680A1 (en) 2007-06-22 2008-06-02 Gaming object and gaming console that communicate user data via backscattering and methods for use therewith
US12/131,605 Abandoned US20080318673A1 (en) 2007-06-22 2008-06-02 Gaming object with biofeedback sensor for interacting with a gaming application and methods for use therewith
US12/131,331 Active 2034-07-30 US9523767B2 (en) 2007-06-22 2008-06-02 Game console and gaming object with motion prediction modeling and methods for use therewith
US12/131,522 Active 2032-10-21 US9547080B2 (en) 2007-06-22 2008-06-02 Gaming object with orientation sensor for interacting with a display and methods for use therewith

Family Applications After (15)

Application Number Title Priority Date Filing Date
US12/135,341 Active 2029-06-27 US7952962B2 (en) 2007-06-22 2008-06-09 Directional microphone or microphones for position determination
US12/137,143 Active 2033-04-24 US9417320B2 (en) 2007-06-22 2008-06-11 Game device that generates a display with a simulated body image and methods for use therewith
US12/136,939 Abandoned US20080318682A1 (en) 2007-06-22 2008-06-11 Dual transceiver gaming console interface and methods for use therewith
US12/137,747 Active 2031-09-01 US8628417B2 (en) 2007-06-22 2008-06-12 Game device with wireless position measurement and methods for use therewith
US12/142,702 Abandoned US20080318595A1 (en) 2007-06-22 2008-06-19 Position location system using multiple position location techniques
US12/142,064 Abandoned US20080318683A1 (en) 2007-06-22 2008-06-19 RFID based positioning system
US12/142,032 Active 2030-09-22 US8062133B2 (en) 2007-06-22 2008-06-19 Positioning within a video gaming environment using RF signals
US12/142,733 Abandoned US20080318684A1 (en) 2007-06-22 2008-06-19 Position location system using multiple position location techniques
US13/223,121 Expired - Fee Related US8289212B2 (en) 2007-06-22 2011-08-31 Apparatus for position detection using multiple antennas
US13/361,333 Active US8311579B2 (en) 2007-06-22 2012-01-30 Multi-mode mobile communication device with motion sensor and methods for use therewith
US13/592,804 Abandoned US20120315991A1 (en) 2007-06-22 2012-08-23 Apparatus position detection using multiple antennas
US13/627,360 Active US8676257B2 (en) 2007-06-22 2012-09-26 Multi-mode mobile communication device with motion sensor and methods for use therewith
US15/346,418 Active 2028-07-07 US10549195B2 (en) 2007-06-22 2016-11-08 Gaming object with orientation sensor for interacting with a display and methods for use therewith
US15/346,254 Active US9943760B2 (en) 2007-06-22 2016-11-08 Game console and gaming object with motion prediction modeling and methods for use therewith
US16/730,166 Active 2029-03-21 US11426660B2 (en) 2007-06-22 2019-12-30 Gaming object with orientation sensor for interacting with a display and methods for use therewith

Country Status (1)

Country Link
US (26) US20090017910A1 (en)

Cited By (56)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080316863A1 (en) * 2007-06-22 2008-12-25 Broadcom Corporation Directional microphone or microphones for position determination
US20090241034A1 (en) * 2008-03-21 2009-09-24 Kazuaki Ishizaki Object movement control system, object movement control method, server and computer program
US20090258703A1 (en) * 2008-04-09 2009-10-15 Aaron Philip Brunstetter Motion Assessment Using a Game Controller
US20100232650A1 (en) * 2009-03-13 2010-09-16 Omron Corporation Measurement apparatus
KR100999711B1 (en) 2009-03-09 2010-12-08 광주과학기술원 Apparatus for real-time calibrating in the collaboration system and method using the same
US20100321473A1 (en) * 2007-10-04 2010-12-23 Samsung Techwin Co., Ltd. Surveillance camera system
US20110096322A1 (en) * 2009-10-26 2011-04-28 Seiko Epson Corporation Optical position detection device and display device with position detection function
US20110096031A1 (en) * 2009-10-26 2011-04-28 Seiko Epson Corporation Position detecting function-added projection display apparatus
US20110096032A1 (en) * 2009-10-26 2011-04-28 Seiko Epson Corporation Optical position detecting device and display device with position detecting function
US20110216946A1 (en) * 2008-10-01 2011-09-08 Sony Computer Entertainment, Inc. Information processing device, information processing method, program, and information storage medium
US20110237324A1 (en) * 2010-03-29 2011-09-29 Microsoft Corporation Parental control settings based on body dimensions
WO2012030453A1 (en) * 2010-08-31 2012-03-08 Microsoft Corporation User selection and navigation based on looped motions
US20130010071A1 (en) * 2011-07-04 2013-01-10 3Divi Methods and systems for mapping pointing device on depth map
US20130050499A1 (en) * 2011-08-30 2013-02-28 Qualcomm Incorporated Indirect tracking
WO2013035096A3 (en) * 2011-09-07 2013-07-18 Umoove Limited System and method of tracking an object in an image captured by a moving device
US20140009384A1 (en) * 2012-07-04 2014-01-09 3Divi Methods and systems for determining location of handheld device within 3d environment
US8714749B2 (en) 2009-11-06 2014-05-06 Seiko Epson Corporation Projection display device with position detection function
US20150209664A1 (en) * 2012-10-04 2015-07-30 Disney Enterprises, Inc. Making physical objects appear to be moving from the physical world into the virtual world
US20150221135A1 (en) * 2014-02-06 2015-08-06 Position Imaging, Inc. Virtual reality and augmented reality functionality for mobile devices
US20160014390A1 (en) * 2014-07-08 2016-01-14 Apple Inc. Electronic Devices With Connector Alignment Assistance
US9423669B2 (en) 2014-11-04 2016-08-23 Qualcomm Incorporated Method and apparatus for camera autofocus based on Wi-Fi ranging technique
US20170123425A1 (en) * 2015-10-09 2017-05-04 SZ DJI Technology Co., Ltd Salient feature based vehicle positioning
US9741135B2 (en) * 2014-12-22 2017-08-22 Baidu Online Network Technology (Beijing) Co., Ltd. Method for measuring object and smart device
US9782669B1 (en) 2012-06-14 2017-10-10 Position Imaging, Inc. RF tracking with active sensory feedback
US9933509B2 (en) 2011-11-10 2018-04-03 Position Imaging, Inc. System for tracking an object using pulsed frequency hopping
US9945940B2 (en) 2011-11-10 2018-04-17 Position Imaging, Inc. Systems and methods of wireless position tracking
US9961503B2 (en) 2014-01-17 2018-05-01 Position Imaging, Inc. Wireless relay station for radio frequency-based tracking system
US10001833B2 (en) 2012-08-14 2018-06-19 Position Imaging, Inc. User input system for immersive interaction
US10148918B1 (en) 2015-04-06 2018-12-04 Position Imaging, Inc. Modular shelving systems for package tracking
US10180490B1 (en) 2012-08-24 2019-01-15 Position Imaging, Inc. Radio frequency communication system
US10234539B2 (en) 2012-12-15 2019-03-19 Position Imaging, Inc. Cycling reference multiplexing receiver system
US10237698B2 (en) 2013-01-18 2019-03-19 Position Imaging, Inc. System and method of locating a radio frequency (RF) tracking device using a calibration routine
US10269182B2 (en) 2012-06-14 2019-04-23 Position Imaging, Inc. RF tracking with active sensory feedback
US10324474B2 (en) 2015-02-13 2019-06-18 Position Imaging, Inc. Spatial diversity for relative position tracking
US10388027B2 (en) * 2016-06-01 2019-08-20 Kyocera Corporation Detection method, display apparatus, and detection system
US10416276B2 (en) 2010-11-12 2019-09-17 Position Imaging, Inc. Position tracking system and method using radio signals and inertial sensing
US10444323B2 (en) 2016-03-08 2019-10-15 Position Imaging, Inc. Expandable, decentralized position tracking systems and methods
US10455364B2 (en) 2016-12-12 2019-10-22 Position Imaging, Inc. System and method of personalized navigation inside a business enterprise
US10634503B2 (en) 2016-12-12 2020-04-28 Position Imaging, Inc. System and method of personalized navigation inside a business enterprise
US10634762B2 (en) 2013-12-13 2020-04-28 Position Imaging, Inc. Tracking system with mobile reader
US10634506B2 (en) 2016-12-12 2020-04-28 Position Imaging, Inc. System and method of personalized navigation inside a business enterprise
US10642560B2 (en) 2015-02-13 2020-05-05 Position Imaging, Inc. Accurate geographic tracking of mobile devices
US10856108B2 (en) 2013-01-18 2020-12-01 Position Imaging, Inc. System and method of locating a radio frequency (RF) tracking device using a calibration routine
US10853757B1 (en) 2015-04-06 2020-12-01 Position Imaging, Inc. Video for real-time confirmation in package tracking systems
US10869175B2 (en) * 2014-11-04 2020-12-15 Nathan Schumacher System and method for generating a three-dimensional model using flowable probes
CN112418200A (en) * 2021-01-25 2021-02-26 成都点泽智能科技有限公司 Object detection method and device based on thermal imaging and server
US11089232B2 (en) 2019-01-11 2021-08-10 Position Imaging, Inc. Computer-vision-based object tracking and guidance module
US11120392B2 (en) 2017-01-06 2021-09-14 Position Imaging, Inc. System and method of calibrating a directional light source relative to a camera's field of view
US11132004B2 (en) 2015-02-13 2021-09-28 Position Imaging, Inc. Spatial diversity for relative position tracking
US11175375B2 (en) 2010-11-12 2021-11-16 Position Imaging, Inc. Position tracking system and method using radio signals and inertial sensing
US20210394068A1 (en) * 2020-06-23 2021-12-23 Nintendo Co., Ltd. Computer-readable non-transitory storage medium having instructions stored therein, game apparatus, game system, and game processing method
US11361536B2 (en) 2018-09-21 2022-06-14 Position Imaging, Inc. Machine-learning-assisted self-improving object-identification system and method
US11416805B1 (en) 2015-04-06 2022-08-16 Position Imaging, Inc. Light-based guidance for package tracking systems
US11436553B2 (en) 2016-09-08 2022-09-06 Position Imaging, Inc. System and method of object tracking using weight confirmation
US11501244B1 (en) 2015-04-06 2022-11-15 Position Imaging, Inc. Package tracking systems and methods
US11961279B2 (en) 2022-06-13 2024-04-16 Position Imaging, Inc. Machine-learning-assisted self-improving object-identification system and method

Families Citing this family (475)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8915859B1 (en) * 2004-09-28 2014-12-23 Impact Sports Technologies, Inc. Monitoring device, system and method for a multi-player interactive game
DK1819816T3 (en) * 2004-12-07 2009-01-26 Applied Nanosystems Bv Methods for preparing and secreting modified peptides
EP1967942A1 (en) * 2005-10-26 2008-09-10 Sony Computer Entertainment America, Inc. System and method for interfacing and computer program
US7702608B1 (en) 2006-07-14 2010-04-20 Ailive, Inc. Generating motion recognizers for arbitrary motions for video games and tuning the motion recognizers to the end user
US7636645B1 (en) 2007-06-18 2009-12-22 Ailive Inc. Self-contained inertial navigation system for interactive control using movable controllers
US9405372B2 (en) 2006-07-14 2016-08-02 Ailive, Inc. Self-contained inertial navigation system for interactive control using movable controllers
KR101299682B1 (en) * 2006-10-16 2013-08-22 삼성전자주식회사 Universal input device
US8344949B2 (en) * 2008-03-31 2013-01-01 Golba Llc Wireless positioning approach using time-delay of signals with a known transmission pattern
US8952832B2 (en) 2008-01-18 2015-02-10 Invensense, Inc. Interfacing application programs and motion sensors of a device
US8250921B2 (en) 2007-07-06 2012-08-28 Invensense, Inc. Integrated motion processing unit (MPU) with MEMS inertial sensing and embedded digital electronics
US7934423B2 (en) 2007-12-10 2011-05-03 Invensense, Inc. Vertically integrated 3-axis MEMS angular accelerometer with integrated electronics
US8462109B2 (en) 2007-01-05 2013-06-11 Invensense, Inc. Controlling and accessing content using motion processing on mobile devices
US8139945B1 (en) 2007-01-20 2012-03-20 Centrak, Inc. Methods and systems for synchronized infrared real time location
US7636697B1 (en) 2007-01-29 2009-12-22 Ailive Inc. Method and system for rapid evaluation of logical expressions
US20080200224A1 (en) 2007-02-20 2008-08-21 Gametank Inc. Instrument Game System and Method
US8907193B2 (en) 2007-02-20 2014-12-09 Ubisoft Entertainment Instrument game system and method
US8284822B2 (en) * 2007-02-27 2012-10-09 Broadcom Corporation Method and system for utilizing direct digital frequency synthesis to process signals in multi-band applications
US7826550B2 (en) * 2007-02-28 2010-11-02 Broadcom Corp. Method and system for a high-precision frequency generator using a direct digital frequency synthesizer for transmitters and receivers
US20080205545A1 (en) * 2007-02-28 2008-08-28 Ahmadreza Rofougaran Method and System for Using a Phase Locked Loop for Upconversion in a Wideband Crystalless Polar Transmitter
US20080205550A1 (en) * 2007-02-28 2008-08-28 Ahmadreza Rofougaran Method and System for Using a Phase Locked Loop for Upconversion in a Wideband Polar Transmitter
US8116387B2 (en) * 2007-03-01 2012-02-14 Broadcom Corporation Method and system for a digital polar transmitter
US20090233710A1 (en) * 2007-03-12 2009-09-17 Roberts Thomas J Feedback gaming peripheral
US7894830B2 (en) * 2007-04-28 2011-02-22 Broadcom Corporation Motion adaptive wireless local area network, wireless communications device and integrated circuits for use therewith
US8064923B2 (en) * 2007-04-28 2011-11-22 Broadcom Corporation Wireless communications device and integrated circuits with global positioning and method for use therewith
JP4438825B2 (en) * 2007-05-29 2010-03-24 ソニー株式会社 Arrival angle estimation system, communication apparatus, and communication system
US20080299906A1 (en) * 2007-06-04 2008-12-04 Topway Electrical Appliance Company Emulating playing apparatus of simulating games
US7912449B2 (en) * 2007-06-14 2011-03-22 Broadcom Corporation Method and system for 60 GHz location determination and coordination of WLAN/WPAN/GPS multimode devices
US8238832B1 (en) * 2007-08-28 2012-08-07 Marvell International Ltd. Antenna optimum beam forming for multiple protocol coexistence on a wireless device
US9186089B2 (en) 2007-09-14 2015-11-17 Medtronic Monitoring, Inc. Injectable physiological monitoring system
US8460189B2 (en) 2007-09-14 2013-06-11 Corventis, Inc. Adherent cardiac monitor with advanced sensing capabilities
US8790257B2 (en) 2007-09-14 2014-07-29 Corventis, Inc. Multi-sensor patient monitor to detect impending cardiac decompensation
WO2009036348A1 (en) 2007-09-14 2009-03-19 Corventis, Inc. Medical device automatic start-up upon contact to patient tissue
WO2009033298A1 (en) * 2007-09-14 2009-03-19 Zueger Christian A system for capturing tennis match data
US20090076343A1 (en) 2007-09-14 2009-03-19 Corventis, Inc. Energy Management for Adherent Patient Monitor
US20090076345A1 (en) 2007-09-14 2009-03-19 Corventis, Inc. Adherent Device with Multiple Physiological Sensors
US8591430B2 (en) 2007-09-14 2013-11-26 Corventis, Inc. Adherent device for respiratory monitoring
WO2009042190A1 (en) * 2007-09-25 2009-04-02 Wms Gaming Inc. Accessing wagering game services by aiming handheld device at external device
JP5116424B2 (en) * 2007-10-09 2013-01-09 任天堂株式会社 GAME PROGRAM AND GAME DEVICE
US8206325B1 (en) 2007-10-12 2012-06-26 Biosensics, L.L.C. Ambulatory system for measuring and monitoring physical activity and risk of falling and for automatic fall detection
WO2009052032A1 (en) * 2007-10-19 2009-04-23 Sony Computer Entertainment America Inc. Scheme for providing audio effects for a musical instrument and for controlling images with same
US20090221368A1 (en) * 2007-11-28 2009-09-03 Ailive Inc., Method and system for creating a shared game space for a networked game
JP5411425B2 (en) * 2007-12-25 2014-02-12 任天堂株式会社 GAME PROGRAM, GAME DEVICE, GAME SYSTEM, AND GAME PROCESSING METHOD
US9020780B2 (en) * 2007-12-31 2015-04-28 The Nielsen Company (Us), Llc Motion detector module
CN101971122B (en) * 2008-01-03 2013-06-12 埃波斯开发有限公司 Ultrasonic digitizer and host
US9007178B2 (en) 2008-02-14 2015-04-14 Intermec Ip Corp. Utilization of motion and spatial identification in RFID systems
US8994504B1 (en) 2008-02-14 2015-03-31 Intermec Ip Corp. Utilization of motion and spatial identification in mobile RFID interrogator
US9047522B1 (en) * 2008-02-14 2015-06-02 Intermec Ip Corp. Utilization of motion and spatial identification in mobile RFID interrogator
EP2257216B1 (en) 2008-03-12 2021-04-28 Medtronic Monitoring, Inc. Heart failure decompensation prediction based on cardiac rhythm
US8412317B2 (en) 2008-04-18 2013-04-02 Corventis, Inc. Method and apparatus to measure bioelectric impedance of patient tissue
JP5115991B2 (en) * 2008-04-30 2013-01-09 独立行政法人産業技術総合研究所 Object state detection apparatus and method
US8120354B2 (en) * 2008-05-01 2012-02-21 Broadband Discovery Systems, Inc. Self-calibrating magnetic field monitor
CN102016759A (en) * 2008-05-09 2011-04-13 皇家飞利浦电子股份有限公司 Method and system for conveying an emotion
US8242888B2 (en) 2008-06-05 2012-08-14 Keystone Technology Solutions, Llc Systems and methods to determine motion parameters using RFID tags
US8461966B2 (en) 2008-06-05 2013-06-11 Micron Technology, Inc. Systems and methods to determine kinematical parameters using RFID tags
US8830062B2 (en) 2008-06-05 2014-09-09 Micron Technology, Inc. Systems and methods to use radar in RFID systems
US9844730B1 (en) * 2008-06-16 2017-12-19 Disney Enterprises, Inc. Method and apparatus for an interactive dancing video game
US8483623B2 (en) * 2008-06-19 2013-07-09 Broadcom Corporation Method and system for frequency-shift based PCB-to-PCB communications
GB2461577A (en) 2008-07-04 2010-01-06 Bombardier Transp Gmbh System and method for transferring electric energy to a vehicle
GB2461578A (en) 2008-07-04 2010-01-06 Bombardier Transp Gmbh Transferring electric energy to a vehicle
US8655622B2 (en) * 2008-07-05 2014-02-18 Ailive, Inc. Method and apparatus for interpreting orientation invariant motion
US10679749B2 (en) * 2008-08-22 2020-06-09 International Business Machines Corporation System and method for virtual world biometric analytics through the use of a multimodal biometric analytic wallet
GB2463693A (en) 2008-09-19 2010-03-24 Bombardier Transp Gmbh A system for transferring electric energy to a vehicle
GB2463692A (en) 2008-09-19 2010-03-24 Bombardier Transp Gmbh An arrangement for providing a vehicle with electric energy
US8157609B2 (en) * 2008-10-18 2012-04-17 Mattel, Inc. Mind-control toys and methods of interaction therewith
CN101726738B (en) * 2008-10-30 2012-12-26 日电(中国)有限公司 Multi-target positioning system and multiple-access control method based on power control
US7855683B2 (en) * 2008-11-04 2010-12-21 At&T Intellectual Property I, L.P. Methods and apparatuses for GPS coordinates extrapolation when GPS signals are not available
US20100122278A1 (en) * 2008-11-13 2010-05-13 Alfred Xueliang Xin Method and an automated direction following system
US9120016B2 (en) 2008-11-21 2015-09-01 Ubisoft Entertainment Interactive guitar game designed for learning to play the guitar
EP2192696B1 (en) * 2008-11-28 2014-12-31 Sequans Communications Wireless communications method and system with spatial multiplexing using dually polarized antennas and corresponding receiver
US8085199B2 (en) * 2008-12-13 2011-12-27 Broadcom Corporation Receiver including a matrix module to determine angular position
JP2010152493A (en) * 2008-12-24 2010-07-08 Sony Corp Input device, control apparatus, and control method for the input device
US20100177749A1 (en) * 2009-01-13 2010-07-15 Metrologic Instruments, Inc. Methods of and apparatus for programming and managing diverse network components, including electronic-ink based display devices, in a mesh-type wireless communication network
US8457013B2 (en) 2009-01-13 2013-06-04 Metrologic Instruments, Inc. Wireless dual-function network device dynamically switching and reconfiguring from a wireless network router state of operation into a wireless network coordinator state of operation in a wireless communication network
US20100177076A1 (en) * 2009-01-13 2010-07-15 Metrologic Instruments, Inc. Edge-lit electronic-ink display device for use in indoor and outdoor environments
EP2391905B1 (en) 2009-01-27 2019-11-20 Xyz Interactive Technologies Inc. A method and apparatus for ranging finding, orienting, and/or positioning of single and/or multiple devices
US9652030B2 (en) * 2009-01-30 2017-05-16 Microsoft Technology Licensing, Llc Navigation of a virtual plane using a zone of restriction for canceling noise
WO2010087778A1 (en) * 2009-02-02 2010-08-05 Agency For Science, Technology And Research Method and system for rendering an entertainment animation
US8254964B2 (en) * 2009-02-23 2012-08-28 Sony Ericsson Mobile Communications Ab Method and arrangement relating to location based services for a communication device
US8311506B2 (en) * 2009-02-26 2012-11-13 Broadcom Corporation RFID receiver front end with phase cancellation and methods for use therewith
US8725156B2 (en) * 2009-04-02 2014-05-13 Honeywell International Inc. Methods for supporting mobile nodes in industrial control and automation systems and other systems and related apparatus
JP2010245796A (en) 2009-04-06 2010-10-28 Sony Corp Video display and method, video display system, and program
US20120121128A1 (en) * 2009-04-20 2012-05-17 Bent 360: Medialab Inc. Object tracking system
US8953029B2 (en) * 2009-05-08 2015-02-10 Sony Computer Entertainment America Llc Portable device interaction via motion sensitive controller
US8417264B1 (en) * 2009-05-14 2013-04-09 Sprint Spectrum L.P. Method and apparatus for determining location of a mobile station based on locations of multiple nearby mobile stations
US20100304931A1 (en) * 2009-05-27 2010-12-02 Stumpf John F Motion capture system
KR100979623B1 (en) * 2009-05-27 2010-09-01 서울대학교산학협력단 Positioning system and method based on radio communication apparatus comprising multiple antenna
US9383823B2 (en) * 2009-05-29 2016-07-05 Microsoft Technology Licensing, Llc Combining gestures beyond skeletal
CN101898042B (en) * 2009-05-31 2012-07-18 鸿富锦精密工业(深圳)有限公司 Game controller and control method thereof
US20100332842A1 (en) * 2009-06-30 2010-12-30 Yahoo! Inc. Determining a mood of a user based on biometric characteristic(s) of the user in an online system
US9511289B2 (en) 2009-07-10 2016-12-06 Valve Corporation Player biofeedback for dynamically controlling a video game state
US11253781B2 (en) 2009-07-10 2022-02-22 Valve Corporation Player biofeedback for dynamically controlling a video game state
US8676659B1 (en) * 2009-07-23 2014-03-18 Bank Of America Corporation Methods and apparatuses for facilitating financial transactions using gamer tag information
US20110025464A1 (en) * 2009-07-30 2011-02-03 Awarepoint Corporation Antenna Diversity For Wireless Tracking System And Method
KR20110012584A (en) * 2009-07-31 2011-02-09 삼성전자주식회사 Apparatus and method for estimating position by ultrasonic signal
US20110028218A1 (en) * 2009-08-03 2011-02-03 Realta Entertainment Group Systems and Methods for Wireless Connectivity of a Musical Instrument
CN102022979A (en) * 2009-09-21 2011-04-20 鸿富锦精密工业(深圳)有限公司 Three-dimensional optical sensing system
US8581773B1 (en) * 2009-10-15 2013-11-12 The Boeing Company Dual frequency transmitter
WO2011050283A2 (en) 2009-10-22 2011-04-28 Corventis, Inc. Remote detection and monitoring of functional chronotropic incompetence
US8535133B2 (en) * 2009-11-16 2013-09-17 Broadcom Corporation Video game with controller sensing player inappropriate activity
US8429269B2 (en) * 2009-12-09 2013-04-23 Sony Computer Entertainment Inc. Server-side rendering
US9451897B2 (en) 2009-12-14 2016-09-27 Medtronic Monitoring, Inc. Body adherent patch with electronics for physiologic monitoring
US20110148884A1 (en) * 2009-12-17 2011-06-23 Charles Timberlake Zeleny System and method for determining motion of a subject
US8497902B2 (en) * 2009-12-18 2013-07-30 Sony Computer Entertainment Inc. System for locating a display device using a camera on a portable device and a sensor on a gaming console and method thereof
CN102109594B (en) * 2009-12-28 2014-04-30 深圳富泰宏精密工业有限公司 System and method for sensing and notifying voice
US9019149B2 (en) 2010-01-05 2015-04-28 The Invention Science Fund I, Llc Method and apparatus for measuring the motion of a person
US20110166937A1 (en) * 2010-01-05 2011-07-07 Searete Llc Media output with micro-impulse radar feedback of physiological response
US8884813B2 (en) * 2010-01-05 2014-11-11 The Invention Science Fund I, Llc Surveillance of stress conditions of persons using micro-impulse radar
US9069067B2 (en) * 2010-09-17 2015-06-30 The Invention Science Fund I, Llc Control of an electronic apparatus using micro-impulse radar
US20110166940A1 (en) * 2010-01-05 2011-07-07 Searete Llc Micro-impulse radar detection of a human demographic and delivery of targeted media content
US9024814B2 (en) 2010-01-05 2015-05-05 The Invention Science Fund I, Llc Tracking identities of persons using micro-impulse radar
US8795082B2 (en) * 2010-01-25 2014-08-05 Rambus Inc. Directional beam steering system and method to detect location and motion
US9104238B2 (en) * 2010-02-12 2015-08-11 Broadcom Corporation Systems and methods for providing enhanced motion detection
EP2540065B1 (en) 2010-02-23 2017-01-18 Telefonaktiebolaget LM Ericsson (publ) Communication performance guidance in a user terminal
US8884741B2 (en) * 2010-02-24 2014-11-11 Sportvision, Inc. Tracking system
US8979665B1 (en) 2010-03-22 2015-03-17 Bijan Najafi Providing motion feedback based on user center of mass
US8965498B2 (en) 2010-04-05 2015-02-24 Corventis, Inc. Method and apparatus for personalized physiologic parameters
WO2011124054A1 (en) * 2010-04-09 2011-10-13 深圳市江波龙电子有限公司 Portable multimedia player
US20110275434A1 (en) * 2010-05-04 2011-11-10 Mediatek Inc. Methods for controlling a process of a game and electronic devices utilizing the same
JP5700758B2 (en) * 2010-05-19 2015-04-15 任天堂株式会社 GAME DEVICE, GAME PROGRAM, GAME SYSTEM, AND GAME PROCESSING METHOD
US8428394B2 (en) 2010-05-25 2013-04-23 Marcus KRIETER System and method for resolving spatial orientation using intelligent optical selectivity
US20110298887A1 (en) * 2010-06-02 2011-12-08 Maglaque Chad L Apparatus Using an Accelerometer to Capture Photographic Images
US10843078B2 (en) * 2010-06-07 2020-11-24 Affectiva, Inc. Affect usage within a gaming context
US8593331B2 (en) * 2010-06-16 2013-11-26 Qualcomm Incorporated RF ranging-assisted local motion sensing
US8537847B2 (en) * 2010-06-22 2013-09-17 Sony Corporation Digital clock with internet connectivity and multiple resting orientations
US8174934B2 (en) * 2010-07-28 2012-05-08 Empire Technology Development Llc Sound direction detection
US9167975B1 (en) * 2010-07-28 2015-10-27 Impact Sports Technologies, Inc. Motion resistant device to monitor heart rate in ambulatory patients
US20120212374A1 (en) * 2010-08-17 2012-08-23 Qualcomm Incorporated Method and apparatus for rf-based ranging with multiple antennas
FI122328B (en) * 2010-08-18 2011-12-15 Sauli Hepo-Oja Active localization system
US20120064841A1 (en) * 2010-09-10 2012-03-15 Husted Paul J Configuring antenna arrays of mobile wireless devices using motion sensors
US20120063270A1 (en) * 2010-09-10 2012-03-15 Pawcatuck, Connecticut Methods and Apparatus for Event Detection and Localization Using a Plurality of Smartphones
US8391334B1 (en) * 2010-09-27 2013-03-05 L-3 Communications Corp Communications reliability in a hub-spoke communications system
KR101339431B1 (en) * 2010-11-19 2013-12-09 도시바삼성스토리지테크놀러지코리아 주식회사 Game controller, game machine, and game system employing the game controller
JP5241807B2 (en) * 2010-12-02 2013-07-17 株式会社コナミデジタルエンタテインメント GAME DEVICE, GAME DEVICE CONTROL METHOD, AND PROGRAM
US8319682B2 (en) * 2011-01-06 2012-11-27 The Boeing Company Method and apparatus for examining an object using electromagnetic millimeter-wave signal illumination
US8753275B2 (en) * 2011-01-13 2014-06-17 BioSensics LLC Intelligent device to monitor and remind patients with footwear, walking aids, braces, or orthotics
EP3312629A3 (en) 2011-02-21 2018-06-13 Transrobotics, Inc. System and method for sensing an object's dimensions
EP2497547B1 (en) 2011-03-08 2018-06-27 Nintendo Co., Ltd. Information processing program, information processing apparatus, information processing system, and information processing method
EP2497545B1 (en) 2011-03-08 2019-08-07 Nintendo Co., Ltd. Information processing program, information processing system, and information processing method
EP2497543A3 (en) 2011-03-08 2012-10-03 Nintendo Co., Ltd. Information processing program, information processing system, and information processing method
US9561443B2 (en) 2011-03-08 2017-02-07 Nintendo Co., Ltd. Computer-readable storage medium, information processing system, and information processing method
JP5792971B2 (en) 2011-03-08 2015-10-14 任天堂株式会社 Information processing system, information processing program, and information processing method
US9925464B2 (en) 2011-03-08 2018-03-27 Nintendo Co., Ltd. Computer-readable storage medium, information processing system, and information processing method for displaying an image on a display device using attitude data of a display device
US9159293B2 (en) * 2011-03-16 2015-10-13 Kyocera Corporation Electronic device, control method, and storage medium storing control program
GB201105587D0 (en) * 2011-04-01 2011-05-18 Elliptic Laboratories As User interfaces for electronic devices
US9880604B2 (en) 2011-04-20 2018-01-30 Microsoft Technology Licensing, Llc Energy efficient location detection
US9103899B2 (en) 2011-04-29 2015-08-11 The Invention Science Fund I, Llc Adaptive control of a personal electronic device responsive to a micro-impulse radar
US9151834B2 (en) 2011-04-29 2015-10-06 The Invention Science Fund I, Llc Network and personal electronic devices operatively coupled to micro-impulse radars
US9000973B2 (en) * 2011-04-29 2015-04-07 The Invention Science Fund I, Llc Personal electronic device with a micro-impulse radar
US8884809B2 (en) * 2011-04-29 2014-11-11 The Invention Science Fund I, Llc Personal electronic device providing enhanced user environmental awareness
US20120282987A1 (en) * 2011-05-06 2012-11-08 Roger Romero Artificial touch device for electronic touch screens
JP5937792B2 (en) * 2011-06-03 2016-06-22 任天堂株式会社 GAME PROGRAM, GAME DEVICE, GAME SYSTEM, AND GAME PROCESSING METHOD
JP5869236B2 (en) 2011-06-03 2016-02-24 任天堂株式会社 Information processing program, information processing apparatus, information processing system, and information processing method
US8890684B2 (en) * 2011-06-17 2014-11-18 Checkpoint Systems, Inc. Background object sensor
US8943396B2 (en) 2011-07-18 2015-01-27 At&T Intellectual Property I, Lp Method and apparatus for multi-experience adaptation of media content
US9084001B2 (en) 2011-07-18 2015-07-14 At&T Intellectual Property I, Lp Method and apparatus for multi-experience metadata translation of media content with metadata
KR101893601B1 (en) * 2011-07-22 2018-08-31 삼성전자 주식회사 Input apparatus of display apparatus, display system and control method thereof
US9316731B2 (en) * 2011-08-04 2016-04-19 Lattice Semiconductor Corporation Low-cost tracking system
US8942412B2 (en) 2011-08-11 2015-01-27 At&T Intellectual Property I, Lp Method and apparatus for controlling multi-experience translation of media content
US9237362B2 (en) 2011-08-11 2016-01-12 At&T Intellectual Property I, Lp Method and apparatus for multi-experience translation of media content with sensor sharing
US10585472B2 (en) 2011-08-12 2020-03-10 Sony Interactive Entertainment Inc. Wireless head mounted display with differential rendering and sound localization
US10209771B2 (en) * 2016-09-30 2019-02-19 Sony Interactive Entertainment Inc. Predictive RF beamforming for head mounted display
KR20130024823A (en) * 2011-08-29 2013-03-08 한국전자통신연구원 Method for communication between devices and system for the same
KR101398709B1 (en) * 2011-09-09 2014-05-28 주식회사 팬택 Terminal apparatus and method for supporting multi interface using user motion
US20130095875A1 (en) * 2011-09-30 2013-04-18 Rami Reuven Antenna selection based on orientation, and related apparatuses, antenna units, methods, and distributed antenna systems
US9449342B2 (en) 2011-10-27 2016-09-20 Ebay Inc. System and method for visualization of items in an environment using augmented reality
WO2013067526A1 (en) 2011-11-04 2013-05-10 Remote TelePointer, LLC Method and system for user interface for interactive devices using a mobile device
WO2013112223A2 (en) * 2011-11-09 2013-08-01 Marquette Trishaun Detection of an asymmetric object
US8902936B2 (en) 2011-12-22 2014-12-02 Cory J. Stephanson Sensor event assessor input/output controller
US10165228B2 (en) * 2011-12-22 2018-12-25 Mis Security, Llc Sensor event assessor training and integration
US9325752B2 (en) 2011-12-23 2016-04-26 Microsoft Technology Licensing, Llc Private interaction hubs
US9467834B2 (en) 2011-12-23 2016-10-11 Microsoft Technology Licensing, Llc Mobile device emergency service
US8874162B2 (en) 2011-12-23 2014-10-28 Microsoft Corporation Mobile device safe driving
US9363250B2 (en) 2011-12-23 2016-06-07 Microsoft Technology Licensing, Llc Hub coordination service
US9420432B2 (en) 2011-12-23 2016-08-16 Microsoft Technology Licensing, Llc Mobile devices control
US20130305354A1 (en) 2011-12-23 2013-11-14 Microsoft Corporation Restricted execution modes
JP2013153405A (en) * 2011-12-28 2013-08-08 Panasonic Corp Av apparatus and initial setting method thereof
US9558625B2 (en) 2012-01-13 2017-01-31 Igt Canada Solutions Ulc Systems and methods for recommending games to anonymous players using distributed storage
US9269222B2 (en) * 2012-01-13 2016-02-23 Igt Canada Solutions Ulc Remote gaming system using separate terminal to set up remote play with a gaming terminal
US9129489B2 (en) * 2012-01-13 2015-09-08 Gtech Canada Ulc Remote gaming method where venue's system suggests different games to remote player using a mobile gaming device
US9295908B2 (en) 2012-01-13 2016-03-29 Igt Canada Solutions Ulc Systems and methods for remote gaming using game recommender
US9011240B2 (en) * 2012-01-13 2015-04-21 Spielo International Canada Ulc Remote gaming system allowing adjustment of original 3D images for a mobile gaming device
US9536378B2 (en) 2012-01-13 2017-01-03 Igt Canada Solutions Ulc Systems and methods for recommending games to registered players using distributed storage
US9079098B2 (en) 2012-01-13 2015-07-14 Gtech Canada Ulc Automated discovery of gaming preferences
US9558620B2 (en) 2012-01-13 2017-01-31 Igt Canada Solutions Ulc Systems and methods for multi-player remote gaming
US9123200B2 (en) * 2012-01-13 2015-09-01 Gtech Canada Ulc Remote gaming using game recommender system and generic mobile gaming device
US9208641B2 (en) * 2012-01-13 2015-12-08 Igt Canada Solutions Ulc Remote gaming method allowing temporary inactivation without terminating playing session due to game inactivity
US9159189B2 (en) * 2012-01-13 2015-10-13 Gtech Canada Ulc Mobile gaming device carrying out uninterrupted game despite communications link disruption
JP5847852B2 (en) * 2012-01-17 2016-01-27 株式会社ソニー・コンピュータエンタテインメント Server, information processing method, information processing program, and computer-readable recording medium storing information processing program
US9088309B2 (en) * 2012-02-17 2015-07-21 Sony Corporation Antenna tuning arrangement and method
WO2013130058A1 (en) * 2012-02-29 2013-09-06 Intel Corporation Location discrepancy corrections based on community corrections and trajectory detection
WO2013148986A1 (en) 2012-03-30 2013-10-03 Corning Cable Systems Llc Reducing location-dependent interference in distributed antenna systems operating in multiple-input, multiple-output (mimo) configuration, and related components, systems, and methods
US10107887B2 (en) 2012-04-13 2018-10-23 Qualcomm Incorporated Systems and methods for displaying a user interface
US9326689B2 (en) 2012-05-08 2016-05-03 Siemens Medical Solutions Usa, Inc. Thermally tagged motion tracking for medical treatment
US11871901B2 (en) 2012-05-20 2024-01-16 Cilag Gmbh International Method for situational awareness for surgical network or surgical network connected device capable of adjusting function based on a sensed situation or usage
TW201349032A (en) * 2012-05-23 2013-12-01 Tritan Technology Inc An anti-optical-noise pointer positioning system
US20130321245A1 (en) * 2012-06-04 2013-12-05 Fluor Technologies Corporation Mobile device for monitoring and controlling facility systems
US9213092B2 (en) * 2012-06-12 2015-12-15 Tyco Fire & Security Gmbh Systems and methods for detecting a change in position of an object
US20140028500A1 (en) * 2012-07-30 2014-01-30 Yu-Ming Liu Positioning System
US9307335B2 (en) * 2012-07-31 2016-04-05 Japan Science And Technology Agency Device for estimating placement of physical objects
US9230076B2 (en) 2012-08-30 2016-01-05 Microsoft Technology Licensing, Llc Mobile device child share
US9754442B2 (en) 2012-09-18 2017-09-05 Igt Canada Solutions Ulc 3D enhanced gaming machine with foreground and background game surfaces
US9454879B2 (en) 2012-09-18 2016-09-27 Igt Canada Solutions Ulc Enhancements to game components in gaming systems
US20140080638A1 (en) * 2012-09-19 2014-03-20 Board Of Regents, The University Of Texas System Systems and methods for providing training and instruction to a football kicker
JP6273662B2 (en) 2012-10-05 2018-02-07 トランスロボティックス,インク. System and method for high resolution distance sensing and application thereof
US9002641B2 (en) 2012-10-05 2015-04-07 Hand Held Products, Inc. Navigation system configured to integrate motion sensing device inputs
US9405011B2 (en) 2012-10-05 2016-08-02 Hand Held Products, Inc. Navigation system configured to integrate motion sensing device inputs
US9477993B2 (en) * 2012-10-14 2016-10-25 Ari M Frank Training a predictor of emotional response based on explicit voting on content and eye tracking to verify attention
DE102012224321B4 (en) * 2012-12-21 2022-12-15 Applejack 199 L.P. Measuring device for detecting a hitting movement of a racket, training device and method for training a hitting movement
CA2861244A1 (en) 2012-12-28 2014-06-28 Gtech Canada Ulc Imitating real-world physics in a 3d enhanced gaming machine
US9876762B2 (en) 2012-12-31 2018-01-23 Elwha Llc Cost-effective mobile connectivity protocols
US8965288B2 (en) 2012-12-31 2015-02-24 Elwha Llc Cost-effective mobile connectivity protocols
US9713013B2 (en) 2013-03-15 2017-07-18 Elwha Llc Protocols for providing wireless communications connectivity maps
US9635605B2 (en) 2013-03-15 2017-04-25 Elwha Llc Protocols for facilitating broader access in wireless communications
US9781664B2 (en) 2012-12-31 2017-10-03 Elwha Llc Cost-effective mobile connectivity protocols
US9832628B2 (en) 2012-12-31 2017-11-28 Elwha, Llc Cost-effective mobile connectivity protocols
US9980114B2 (en) 2013-03-15 2018-05-22 Elwha Llc Systems and methods for communication management
US9451394B2 (en) 2012-12-31 2016-09-20 Elwha Llc Cost-effective mobile connectivity protocols
US9119068B1 (en) * 2013-01-09 2015-08-25 Trend Micro Inc. Authentication using geographic location and physical gestures
JP2014153663A (en) * 2013-02-13 2014-08-25 Sony Corp Voice recognition device, voice recognition method and program
US9480911B2 (en) * 2013-02-28 2016-11-01 Steelseries Aps Method and apparatus for monitoring and calibrating performances of gamers
WO2014139092A1 (en) * 2013-03-12 2014-09-18 Zheng Shi System and method for interactive board
JP6127602B2 (en) * 2013-03-13 2017-05-17 沖電気工業株式会社 State recognition device, state recognition method, and computer program
US9693214B2 (en) 2013-03-15 2017-06-27 Elwha Llc Protocols for facilitating broader access in wireless communications
US9596584B2 (en) 2013-03-15 2017-03-14 Elwha Llc Protocols for facilitating broader access in wireless communications by conditionally authorizing a charge to an account of a third party
US9866706B2 (en) 2013-03-15 2018-01-09 Elwha Llc Protocols for facilitating broader access in wireless communications
US9781554B2 (en) 2013-03-15 2017-10-03 Elwha Llc Protocols for facilitating third party authorization for a rooted communication device in wireless communications
US9706060B2 (en) 2013-03-15 2017-07-11 Elwha Llc Protocols for facilitating broader access in wireless communications
US9807582B2 (en) 2013-03-15 2017-10-31 Elwha Llc Protocols for facilitating broader access in wireless communications
US9843917B2 (en) 2013-03-15 2017-12-12 Elwha, Llc Protocols for facilitating charge-authorized connectivity in wireless communications
US9813887B2 (en) 2013-03-15 2017-11-07 Elwha Llc Protocols for facilitating broader access in wireless communications responsive to charge authorization statuses
US9706382B2 (en) 2013-03-15 2017-07-11 Elwha Llc Protocols for allocating communication services cost in wireless communications
ITMI20130495A1 (en) * 2013-03-29 2014-09-30 Atlas Copco Blm Srl ELECTRONIC CONTROL AND CONTROL DEVICE FOR SENSORS
US20140302919A1 (en) 2013-04-05 2014-10-09 Mark J. Ladd Systems and methods for sensor-based mobile gaming
US9311789B1 (en) 2013-04-09 2016-04-12 BioSensics LLC Systems and methods for sensorimotor rehabilitation
US9871184B2 (en) 2013-05-06 2018-01-16 Lokdon Llc System and a method for emitting an ultrasonic signal
FR3006477B1 (en) * 2013-05-29 2016-09-30 Blinksight DEVICE AND METHOD FOR DETECTING THE HANDLING OF AT LEAST ONE OBJECT
NZ754204A (en) * 2013-06-04 2019-11-29 Isolynx Llc Object tracking system optimization and tools
US9782670B2 (en) * 2014-04-25 2017-10-10 Ubisoft Entertainment Computer program, method, and system for enabling an interactive event among a plurality of persons
US9820231B2 (en) 2013-06-14 2017-11-14 Microsoft Technology Licensing, Llc Coalescing geo-fence events
US9998866B2 (en) 2013-06-14 2018-06-12 Microsoft Technology Licensing, Llc Detecting geo-fence events using varying confidence levels
US9372103B2 (en) * 2013-07-12 2016-06-21 Facebook, Inc. Calibration of grab detection
WO2015009761A2 (en) * 2013-07-15 2015-01-22 SeeScan, Inc. Utility locator transmitter devices, systems, and methods with dockable apparatus
US9128552B2 (en) * 2013-07-17 2015-09-08 Lenovo (Singapore) Pte. Ltd. Organizing display data on a multiuser display
US9223340B2 (en) 2013-08-14 2015-12-29 Lenovo (Singapore) Pte. Ltd. Organizing display data on a multiuser display
US9669297B1 (en) * 2013-09-18 2017-06-06 Aftershock Services, Inc. Using biometrics to alter game content
US9813891B2 (en) 2013-09-30 2017-11-07 Elwha Llc Mobile device sharing facilitation methods and systems featuring a subset-specific source identification
US9805208B2 (en) 2013-09-30 2017-10-31 Elwha Llc Mobile device sharing facilitation methods and systems with recipient-dependent inclusion of a data selection
US9826439B2 (en) 2013-09-30 2017-11-21 Elwha Llc Mobile device sharing facilitation methods and systems operable in network equipment
US9838536B2 (en) 2013-09-30 2017-12-05 Elwha, Llc Mobile device sharing facilitation methods and systems
US9774728B2 (en) 2013-09-30 2017-09-26 Elwha Llc Mobile device sharing facilitation methods and systems in a context of plural communication records
US9740875B2 (en) 2013-09-30 2017-08-22 Elwha Llc Mobile device sharing facilitation methods and systems featuring exclusive data presentation
US9753131B2 (en) * 2013-10-09 2017-09-05 Massachusetts Institute Of Technology Motion tracking via body radio reflections
EP3055708A2 (en) * 2013-10-09 2016-08-17 Massachusetts Institute Of Technology Motion tracking via body radio reflections
US10063982B2 (en) * 2013-10-09 2018-08-28 Voyetra Turtle Beach, Inc. Method and system for a game headset with audio alerts based on audio track analysis
US9616343B2 (en) * 2013-11-18 2017-04-11 Gaming Support B.V. Hybrid gaming platform
US10033945B2 (en) * 2013-12-12 2018-07-24 Flir Systems Ab Orientation-adapted image remote inspection systems and methods
KR20160105441A (en) 2013-12-27 2016-09-06 Massachusetts Institute Of Technology Localization with non-synchronous emission and multipath transmission
US9933247B2 (en) 2014-01-13 2018-04-03 The Boeing Company Mandrel configuration monitoring system
US9739883B2 (en) 2014-05-16 2017-08-22 Elwha Llc Systems and methods for ultrasonic velocity and acceleration detection
US9437002B2 (en) 2014-09-25 2016-09-06 Elwha Llc Systems and methods for a dual modality sensor system
US9618618B2 (en) 2014-03-10 2017-04-11 Elwha Llc Systems and methods for ultrasonic position and motion detection
US20150260823A1 (en) * 2014-03-11 2015-09-17 Crestron Electronics, Inc. Method of enclosing and powering a bluetooth emitter
JP2015196091A (en) * 2014-04-02 2015-11-09 Applejack 199 L.P. Sensor-based gaming system for avatar to represent player in virtual environment
US10871566B2 (en) * 2014-04-09 2020-12-22 Thomas Danaher Harvey Methods and system to assist search and interception of lost objects
US9995824B2 (en) * 2014-04-09 2018-06-12 Thomas Danaher Harvey Methods and system to assist search for lost and submerged objects
US9885774B2 (en) * 2014-04-18 2018-02-06 Massachusetts Institute Of Technology Indoor localization of a multi-antenna receiver
EP3136961A4 (en) 2014-04-28 2018-03-14 Massachusetts Institute Of Technology Vital signs monitoring via radio reflections
WO2015184406A1 (en) * 2014-05-30 2015-12-03 Texas Tech University System Hybrid FMCW-interferometry radar for positioning and monitoring and methods of using the same
US9824524B2 (en) 2014-05-30 2017-11-21 Igt Canada Solutions Ulc Three dimensional enhancements to game components in gaming systems
US10347073B2 (en) 2014-05-30 2019-07-09 Igt Canada Solutions Ulc Systems and methods for three dimensional games in gaming systems
US9575560B2 (en) 2014-06-03 2017-02-21 Google Inc. Radar-based gesture-recognition through a wearable device
US10234952B2 (en) * 2014-07-18 2019-03-19 Maxim Integrated Products, Inc. Wearable device for using human body as input mechanism
US9525472B2 (en) 2014-07-30 2016-12-20 Corning Incorporated Reducing location-dependent destructive interference in distributed antenna systems (DASS) operating in multiple-input, multiple-output (MIMO) configuration, and related components, systems, and methods
US9811164B2 (en) 2014-08-07 2017-11-07 Google Inc. Radar-based gesture sensing and data transmission
US9921660B2 (en) 2014-08-07 2018-03-20 Google Llc Radar-based gesture recognition
US10268321B2 (en) 2014-08-15 2019-04-23 Google Llc Interactive textiles within hard objects
US9588625B2 (en) 2014-08-15 2017-03-07 Google Inc. Interactive textiles
US9778749B2 (en) 2014-08-22 2017-10-03 Google Inc. Occluded gesture recognition
US11169988B2 (en) 2014-08-22 2021-11-09 Google Llc Radar recognition-aided search
US20160073087A1 (en) * 2014-09-10 2016-03-10 Lenovo (Singapore) Pte. Ltd. Augmenting a digital image with distance data derived based on acoustic range information
JP5938142B1 (en) * 2014-09-22 2016-06-22 Cosmonet Co., Ltd. Data carrier and data carrier system
US9993723B2 (en) * 2014-09-25 2018-06-12 Intel Corporation Techniques for low power monitoring of sports game play
US9600080B2 (en) * 2014-10-02 2017-03-21 Google Inc. Non-line-of-sight radar-based gesture recognition
CA2963578A1 (en) 2014-10-07 2016-04-14 Xyz Interactive Technologies Inc. Device and method for orientation and positioning
US9797979B2 (en) 2014-10-08 2017-10-24 Symbol Technologies, Llc System for and method of estimating bearings of radio frequency identification (RFID) tags that return RFID receive signals whose power is below a predetermined threshold
GB2531378B (en) * 2014-10-10 2019-05-08 Zwipe As Power harvesting
US11504192B2 (en) 2014-10-30 2022-11-22 Cilag Gmbh International Method of hub communication with surgical instrument systems
US9715010B2 (en) * 2014-11-28 2017-07-25 Htc Corporation Apparatus and method for detection
US10609475B2 (en) 2014-12-05 2020-03-31 Stages Llc Active noise control and customized audio system
US11327711B2 (en) 2014-12-05 2022-05-10 Microsoft Technology Licensing, Llc External visual interactions for speech-based devices
US9729267B2 (en) 2014-12-11 2017-08-08 Corning Optical Communications Wireless Ltd Multiplexing two separate optical links with the same wavelength using asymmetric combining and splitting
EP3234752B1 (en) * 2014-12-19 2019-02-20 Abb Ab Automatic configuration system for an operator console
US10275801B2 (en) * 2014-12-19 2019-04-30 Ca, Inc. Adapting user terminal advertisements responsive to measured user behavior
US10009715B2 (en) * 2015-01-06 2018-06-26 Microsoft Technology Licensing, Llc Geographic information for wireless networks
EP3062142B1 (en) 2015-02-26 2018-10-03 Nokia Technologies OY Apparatus for a near-eye display
US10016162B1 (en) 2015-03-23 2018-07-10 Google Llc In-ear health monitoring
US9983747B2 (en) 2015-03-26 2018-05-29 Google Llc Two-layer interactive textiles
US9848780B1 (en) 2015-04-08 2017-12-26 Google Inc. Assessing cardiovascular function using an optical sensor
RU2583450C1 (en) * 2015-04-14 2016-05-10 Igor Aleksandrovich Marenkov Method of locating a ground-based radio-frequency source of a satellite communication system
EP3289434A1 (en) 2015-04-30 2018-03-07 Google LLC Wide-field radar-based gesture recognition
EP3289433A1 (en) 2015-04-30 2018-03-07 Google LLC Type-agnostic rf signal representations
KR102328589B1 (en) 2015-04-30 2021-11-17 Google LLC RF-based micro-motion tracking for gesture tracking and recognition
US10080528B2 (en) 2015-05-19 2018-09-25 Google Llc Optical central venous pressure measurement
US10088908B1 (en) 2015-05-27 2018-10-02 Google Llc Gesture detection and interactions
US9693592B2 (en) 2015-05-27 2017-07-04 Google Inc. Attaching electronic components to interactive textiles
US10251046B2 (en) * 2015-06-01 2019-04-02 Huawei Technologies Co., Ltd. System and method for efficient link discovery in wireless networks
US10376195B1 (en) 2015-06-04 2019-08-13 Google Llc Automated nursing assessment
US9995823B2 (en) 2015-07-31 2018-06-12 Elwha Llc Systems and methods for utilizing compressed sensing in an entertainment system
US10444256B2 (en) * 2015-08-07 2019-10-15 Structural Health Data Systems Device and system for relative motion sensing
WO2017040724A1 (en) * 2015-08-31 2017-03-09 Daniel Arnold Multiview body camera system with environmental sensors and alert features
US10542222B2 (en) 2015-08-31 2020-01-21 Daniel Arnold Multiview body camera system with environmental sensors and alert features
US10817065B1 (en) 2015-10-06 2020-10-27 Google Llc Gesture recognition using multiple antenna
US9929794B2 (en) * 2015-10-15 2018-03-27 Honeywell International Inc. Long term evolution (LTE) air to ground communication enhancements associated with uplink synchronization
US9837760B2 (en) 2015-11-04 2017-12-05 Google Inc. Connectors for connecting electronics embedded in garments to external devices
US10434396B2 (en) 2015-11-30 2019-10-08 James Shaunak Divine Protective headgear with display and methods for use therewith
WO2017092785A1 (en) * 2015-11-30 2017-06-08 Sony Mobile Communications Inc. Dynamic back-off time based on channel utilization statistics
EP3176766B1 (en) * 2015-12-03 2019-07-17 Sony Mobile Communications, Inc. Remote controlling a plurality of controllable devices
WO2017113054A1 (en) 2015-12-28 2017-07-06 Huawei Technologies Co., Ltd. Floor positioning method, device and system
US20170255254A1 (en) * 2016-03-02 2017-09-07 Htc Corporation Tracker device of virtual reality system
US10362678B2 (en) 2016-04-18 2019-07-23 Skyworks Solutions, Inc. Crystal packaging with conductive pillars
US10297576B2 (en) 2016-04-18 2019-05-21 Skyworks Solutions, Inc. Reduced form factor radio frequency system-in-package
US10269769B2 (en) 2016-04-18 2019-04-23 Skyworks Solutions, Inc. System in package with vertically arranged radio frequency componentry
US10062670B2 (en) 2016-04-18 2018-08-28 Skyworks Solutions, Inc. Radio frequency system-in-package with stacked clocking crystal
WO2017192167A1 (en) 2016-05-03 2017-11-09 Google Llc Connecting an electronic component to an interactive textile
US20170017874A1 (en) * 2016-05-06 2017-01-19 Qualcomm Incorporated Radio frequency identification (rfid) reader with frequency adjustment of continuous radio frequency (rf) wave
US10285456B2 (en) 2016-05-16 2019-05-14 Google Llc Interactive fabric
US10175781B2 (en) 2016-05-16 2019-01-08 Google Llc Interactive object with multiple electronics modules
EP3400580B1 (en) * 2016-05-17 2022-08-31 Samsung Electronics Co., Ltd. Method and apparatus for facilitating interaction with virtual reality equipment
US11577159B2 (en) 2016-05-26 2023-02-14 Electronic Scripting Products Inc. Realistic virtual/augmented/mixed reality viewing and interactions
US10252812B2 (en) 2016-09-28 2019-04-09 General Electric Company System and method for controlling fuel flow to a gas turbine engine based on motion sensor data
US10445925B2 (en) * 2016-09-30 2019-10-15 Sony Interactive Entertainment Inc. Using a portable device and a head-mounted display to view a shared virtual reality space
US10945080B2 (en) 2016-11-18 2021-03-09 Stages Llc Audio analysis and processing system
US9980075B1 (en) 2016-11-18 2018-05-22 Stages Llc Audio source spatialization relative to orientation sensor and output
CN107066121A (en) * 2016-11-30 2017-08-18 Huang Wenchao Magnetic levitation mouse with RFID inductor matrix
US10579150B2 (en) 2016-12-05 2020-03-03 Google Llc Concurrent detection of absolute distance and relative movement for sensing action gestures
US10573291B2 (en) 2016-12-09 2020-02-25 The Research Foundation For The State University Of New York Acoustic metamaterial
US10302020B2 (en) 2016-12-12 2019-05-28 General Electric Company System and method for controlling a fuel flow to a gas turbine engine
US9773330B1 (en) * 2016-12-29 2017-09-26 BioMech Sensor LLC Systems and methods for real-time data quantification, acquisition, analysis, and feedback
TWI692935B (en) 2016-12-29 2020-05-01 美商天工方案公司 Front end systems and related devices, integrated circuits, modules, and methods
US10650552B2 (en) 2016-12-29 2020-05-12 Magic Leap, Inc. Systems and methods for augmented reality
US11318350B2 (en) 2016-12-29 2022-05-03 BioMech Sensor LLC Systems and methods for real-time data quantification, acquisition, analysis, and feedback
US10973439B2 (en) 2016-12-29 2021-04-13 BioMech Sensor LLC Systems and methods for real-time data quantification, acquisition, analysis, and feedback
US10352962B2 (en) * 2016-12-29 2019-07-16 BioMech Sensor LLC Systems and methods for real-time data quantification, acquisition, analysis and feedback
EP4300160A2 (en) 2016-12-30 2024-01-03 Magic Leap, Inc. Polychromatic light out-coupling apparatus, near-eye displays comprising the same, and method of out-coupling polychromatic light
US10146300B2 (en) * 2017-01-25 2018-12-04 Lenovo Enterprise Solutions (Singapore) Pte. Ltd. Emitting a visual indicator from the position of an object in a simulated reality emulation
GB2562452B (en) 2017-02-14 2020-11-04 Sony Interactive Entertainment Europe Ltd Sensing apparatus and method
KR101839522B1 (en) * 2017-02-21 2018-03-16 WiJet Co., Ltd. Wireless transceiver system using beam tracking
CN107016347A (en) * 2017-03-09 2017-08-04 Tencent Technology (Shenzhen) Co., Ltd. Motion-sensing action recognition method, device and system
US10515924B2 (en) 2017-03-10 2019-12-24 Skyworks Solutions, Inc. Radio frequency modules
US20190050060A1 (en) * 2017-03-10 2019-02-14 Awearable Apparel Inc. Methods, systems, and media for providing input based on accelerometer input
GB2590034B (en) 2017-04-21 2021-12-22 Zenimax Media Inc Systems and methods for player input motion compensation by anticipating motion vectors and/or caching repetitive motion vectors
US10436615B2 (en) 2017-04-24 2019-10-08 Carnegie Mellon University Virtual sensor system
US10754005B2 (en) 2017-05-31 2020-08-25 Google Llc Radar modulation for radar sensing using a wireless communication chipset
US10782390B2 (en) 2017-05-31 2020-09-22 Google Llc Full-duplex operation for radar sensing using wireless communication chipset
US10644397B2 (en) * 2017-06-30 2020-05-05 Intel Corporation Methods, apparatus and systems for motion predictive beamforming
US10578870B2 (en) 2017-07-26 2020-03-03 Magic Leap, Inc. Exit pupil expander
US10726218B2 (en) 2017-07-27 2020-07-28 Symbol Technologies, Llc Method and apparatus for radio frequency identification (RFID) tag bearing estimation
US10989803B1 (en) 2017-08-21 2021-04-27 Massachusetts Institute Of Technology Security protocol for motion tracking systems
US10747303B2 (en) * 2017-10-13 2020-08-18 Tactual Labs Co. Backscatter hover detection
JP7384416B2 (en) * 2017-10-13 2023-11-21 Tactual Labs Co. Minimal drive of transmitter to increase hover detection
US11026687B2 (en) 2017-10-30 2021-06-08 Cilag Gmbh International Clip applier comprising clip advancing systems
US11911045B2 (en) 2017-10-30 2024-02-27 Cilag Gmbh International Method for operating a powered articulating multi-clip applier
US11510741B2 (en) 2017-10-30 2022-11-29 Cilag Gmbh International Method for producing a surgical instrument comprising a smart electrical system
US11801098B2 (en) 2017-10-30 2023-10-31 Cilag Gmbh International Method of hub communication with surgical instrument systems
US11564756B2 (en) 2017-10-30 2023-01-31 Cilag Gmbh International Method of hub communication with surgical instrument systems
US11317919B2 (en) 2017-10-30 2022-05-03 Cilag Gmbh International Clip applier comprising a clip crimping system
US11291510B2 (en) 2017-10-30 2022-04-05 Cilag Gmbh International Method of hub communication with surgical instrument systems
US11129636B2 (en) 2017-10-30 2021-09-28 Cilag Gmbh International Surgical instruments comprising an articulation drive that provides for high articulation angles
CN111448497B (en) 2017-12-10 2023-08-04 Magic Leap, Inc. Antireflective coating on optical waveguides
CN115826240A (en) 2017-12-20 2023-03-21 Magic Leap, Inc. Insert for augmented reality viewing apparatus
US11410259B2 (en) 2017-12-28 2022-08-09 Cilag Gmbh International Adaptive control program updates for surgical devices
US11786245B2 (en) 2017-12-28 2023-10-17 Cilag Gmbh International Surgical systems with prioritized data transmission capabilities
US11424027B2 (en) 2017-12-28 2022-08-23 Cilag Gmbh International Method for operating surgical instrument systems
US11446052B2 (en) 2017-12-28 2022-09-20 Cilag Gmbh International Variation of radio frequency and ultrasonic power level in cooperation with varying clamp arm pressure to achieve predefined heat flux or power applied to tissue
US11832840B2 (en) 2017-12-28 2023-12-05 Cilag Gmbh International Surgical instrument having a flexible circuit
US11419667B2 (en) 2017-12-28 2022-08-23 Cilag Gmbh International Ultrasonic energy device which varies pressure applied by clamp arm to provide threshold control pressure at a cut progression location
US20190201146A1 (en) 2017-12-28 2019-07-04 Ethicon Llc Safety systems for smart powered surgical stapling
US11589888B2 (en) 2017-12-28 2023-02-28 Cilag Gmbh International Method for controlling smart energy devices
US11324557B2 (en) 2017-12-28 2022-05-10 Cilag Gmbh International Surgical instrument with a sensing array
US11571234B2 (en) 2017-12-28 2023-02-07 Cilag Gmbh International Temperature control of ultrasonic end effector and control system therefor
US11576677B2 (en) 2017-12-28 2023-02-14 Cilag Gmbh International Method of hub communication, processing, display, and cloud analytics
US11602393B2 (en) 2017-12-28 2023-03-14 Cilag Gmbh International Surgical evacuation sensing and generator control
US11857152B2 (en) 2017-12-28 2024-01-02 Cilag Gmbh International Surgical hub spatial awareness to determine devices in operating theater
US11364075B2 (en) 2017-12-28 2022-06-21 Cilag Gmbh International Radio frequency energy device for delivering combined electrical signals
US11464559B2 (en) 2017-12-28 2022-10-11 Cilag Gmbh International Estimating state of ultrasonic end effector and control system therefor
US11896322B2 (en) 2017-12-28 2024-02-13 Cilag Gmbh International Sensing the patient position and contact utilizing the mono-polar return pad electrode to provide situational awareness to the hub
US11666331B2 (en) 2017-12-28 2023-06-06 Cilag Gmbh International Systems for detecting proximity of surgical end effector to cancerous tissue
US20190200981A1 (en) 2017-12-28 2019-07-04 Ethicon Llc Method of compressing tissue within a stapling device and simultaneously displaying the location of the tissue within the jaws
US11937769B2 (en) 2017-12-28 2024-03-26 Cilag Gmbh International Method of hub communication, processing, storage and display
US11529187B2 (en) 2017-12-28 2022-12-20 Cilag Gmbh International Surgical evacuation sensor arrangements
US11678881B2 (en) * 2017-12-28 2023-06-20 Cilag Gmbh International Spatial awareness of surgical hubs in operating rooms
US11026751B2 (en) 2017-12-28 2021-06-08 Cilag Gmbh International Display of alignment of staple cartridge to prior linear staple line
US11633237B2 (en) 2017-12-28 2023-04-25 Cilag Gmbh International Usage and technique analysis of surgeon/staff performance against a baseline to optimize device utilization and performance for both current and future procedures
US20190201039A1 (en) 2017-12-28 2019-07-04 Ethicon Llc Situational awareness of electrosurgical systems
US20190201139A1 (en) 2017-12-28 2019-07-04 Ethicon Llc Communication arrangements for robot-assisted surgical platforms
US11818052B2 (en) 2017-12-28 2023-11-14 Cilag Gmbh International Surgical network determination of prioritization of communication, interaction, or processing based on system or device needs
US11659023B2 (en) 2017-12-28 2023-05-23 Cilag Gmbh International Method of hub communication
US11179175B2 (en) 2017-12-28 2021-11-23 Cilag Gmbh International Controlling an ultrasonic surgical instrument according to tissue location
US11132462B2 (en) 2017-12-28 2021-09-28 Cilag Gmbh International Data stripping method to interrogate patient records and create anonymized record
US11559308B2 (en) 2017-12-28 2023-01-24 Cilag Gmbh International Method for smart energy device infrastructure
US11786251B2 (en) 2017-12-28 2023-10-17 Cilag Gmbh International Method for adaptive control schemes for surgical network control and interaction
US11166772B2 (en) 2017-12-28 2021-11-09 Cilag Gmbh International Surgical hub coordination of control and communication of operating room devices
US11202570B2 (en) 2017-12-28 2021-12-21 Cilag Gmbh International Communication hub and storage device for storing parameters and status of a surgical device to be shared with cloud based analytics systems
US11903601B2 (en) 2017-12-28 2024-02-20 Cilag Gmbh International Surgical instrument comprising a plurality of drive systems
US11308075B2 (en) 2017-12-28 2022-04-19 Cilag Gmbh International Surgical network, instrument, and cloud responses based on validation of received dataset and authentication of its source and integrity
US11559307B2 (en) 2017-12-28 2023-01-24 Cilag Gmbh International Method of robotic hub communication, detection, and control
US11896443B2 (en) 2017-12-28 2024-02-13 Cilag Gmbh International Control of a surgical system through a surgical barrier
US11423007B2 (en) 2017-12-28 2022-08-23 Cilag Gmbh International Adjustment of device control programs based on stratified contextual data in addition to the data
US10892995B2 (en) 2017-12-28 2021-01-12 Ethicon Llc Surgical network determination of prioritization of communication, interaction, or processing based on system or device needs
US10758310B2 (en) 2017-12-28 2020-09-01 Ethicon Llc Wireless pairing of a surgical device with another device within a sterile surgical field based on the usage and situational awareness of devices
US11432885B2 (en) 2017-12-28 2022-09-06 Cilag Gmbh International Sensing arrangements for robot-assisted surgical platforms
US11311306B2 (en) 2017-12-28 2022-04-26 Cilag Gmbh International Surgical systems for detecting end effector tissue distribution irregularities
US11540855B2 (en) 2017-12-28 2023-01-03 Cilag Gmbh International Controlling activation of an ultrasonic surgical instrument according to the presence of tissue
US11109866B2 (en) 2017-12-28 2021-09-07 Cilag Gmbh International Method for circular stapler control algorithm adjustment based on situational awareness
US11832899B2 (en) 2017-12-28 2023-12-05 Cilag Gmbh International Surgical systems with autonomously adjustable control programs
US11744604B2 (en) 2017-12-28 2023-09-05 Cilag Gmbh International Surgical instrument with a hardware-only control circuit
US11464535B2 (en) 2017-12-28 2022-10-11 Cilag Gmbh International Detection of end effector emersion in liquid
US11864728B2 (en) 2017-12-28 2024-01-09 Cilag Gmbh International Characterization of tissue irregularities through the use of mono-chromatic light refractivity
US11612444B2 (en) 2017-12-28 2023-03-28 Cilag Gmbh International Adjustment of a surgical device function based on situational awareness
US11389164B2 (en) 2017-12-28 2022-07-19 Cilag Gmbh International Method of using reinforced flexible circuits with multiple sensors to optimize performance of radio frequency devices
GB201802850D0 (en) 2018-02-22 2018-04-11 Sintef Tto As Positioning sound sources
US11389188B2 (en) 2018-03-08 2022-07-19 Cilag Gmbh International Start temperature of blade
US11589915B2 (en) 2018-03-08 2023-02-28 Cilag Gmbh International In-the-jaw classifier based on a model
US11259830B2 (en) 2018-03-08 2022-03-01 Cilag Gmbh International Methods for controlling temperature in ultrasonic device
US10755676B2 (en) 2018-03-15 2020-08-25 Magic Leap, Inc. Image correction due to deformation of components of a viewing device
CN113870457A (en) * 2018-03-22 2021-12-31 Advanced New Technologies Co., Ltd. Timing system, method, device and equipment for competitive sports
US11471156B2 (en) 2018-03-28 2022-10-18 Cilag Gmbh International Surgical stapling devices with improved rotary driven closure systems
US11259806B2 (en) 2018-03-28 2022-03-01 Cilag Gmbh International Surgical stapling devices with features for blocking advancement of a camming assembly of an incompatible cartridge installed therein
US11090047B2 (en) 2018-03-28 2021-08-17 Cilag Gmbh International Surgical instrument comprising an adaptive control system
US10772511B2 (en) * 2018-05-16 2020-09-15 Qualcomm Incorporated Motion sensor using cross coupling
WO2019232282A1 (en) 2018-05-30 2019-12-05 Magic Leap, Inc. Compact variable focus configurations
EP3803450A4 (en) 2018-05-31 2021-08-18 Magic Leap, Inc. Radar head pose localization
EP3804306B1 (en) 2018-06-05 2023-12-27 Magic Leap, Inc. Homography transformation matrices based temperature calibration of a viewing system
JP7421505B2 (en) 2018-06-08 2024-01-24 Magic Leap, Inc. Augmented reality viewer with automated surface selection and content orientation placement
WO2020010097A1 (en) 2018-07-02 2020-01-09 Magic Leap, Inc. Pixel intensity modulation using modifying gain values
US11510027B2 (en) 2018-07-03 2022-11-22 Magic Leap, Inc. Systems and methods for virtual and augmented reality
US11856479B2 (en) 2018-07-03 2023-12-26 Magic Leap, Inc. Systems and methods for virtual and augmented reality along a route with markers
CN108717180B (en) * 2018-07-05 2021-09-17 Nanjing University of Aeronautics and Astronautics Networked radar power allocation method based on Stackelberg game
JP7426982B2 (en) 2018-07-24 2024-02-02 Magic Leap, Inc. Temperature-dependent calibration of movement sensing devices
US11624929B2 (en) 2018-07-24 2023-04-11 Magic Leap, Inc. Viewing device with dust seal integration
US11112862B2 (en) 2018-08-02 2021-09-07 Magic Leap, Inc. Viewing system with interpupillary distance compensation based on head motion
JP7438188B2 (en) 2018-08-03 2024-02-26 Magic Leap, Inc. Unfused pose-based drift correction of fused poses of totems in user interaction systems
US11003205B2 (en) * 2019-02-04 2021-05-11 Sigmasense, Llc. Receive analog to digital circuit of a low voltage drive circuit data communication system
US10499363B1 (en) * 2018-09-18 2019-12-03 Qualcomm Incorporated Methods and apparatus for improved accuracy and positioning estimates
JP7201379B2 (en) * 2018-10-02 2023-01-10 Toshiba Tec Corporation RFID tag reader
US11580316B2 (en) * 2018-11-08 2023-02-14 Avery Dennison Retail Information Services Llc Interacting RFID tags
EP3881279A4 (en) 2018-11-16 2022-08-17 Magic Leap, Inc. Image size triggered clarification to maintain image sharpness
US20200168045A1 (en) 2018-11-28 2020-05-28 Igt Dynamic game flow modification in electronic wagering games
CN109557512B (en) * 2018-12-06 2020-08-04 Aerospace Nanhu Electronic Information Technology Co., Ltd. Radar receiver with high sensitivity and high dynamic range
EP3668197B1 (en) * 2018-12-12 2021-11-03 Rohde & Schwarz GmbH & Co. KG Method and radio for setting the transmission power of a radio transmission
EP3921720A4 (en) 2019-02-06 2022-06-29 Magic Leap, Inc. Target intent-based clock speed determination and adjustment to limit total heat generated by multiple processors
US10977808B2 (en) * 2019-02-18 2021-04-13 Raytheon Company Three-frame difference target acquisition and tracking using overlapping target images
US11291444B2 (en) 2019-02-19 2022-04-05 Cilag Gmbh International Surgical stapling assembly with cartridge based retainer configured to unlock a closure lockout
US11317915B2 (en) 2019-02-19 2022-05-03 Cilag Gmbh International Universal cartridge based key feature that unlocks multiple lockout arrangements in different surgical staplers
US11357503B2 (en) 2019-02-19 2022-06-14 Cilag Gmbh International Staple cartridge retainers with frangible retention features and methods of using same
US11369377B2 (en) 2019-02-19 2022-06-28 Cilag Gmbh International Surgical stapling assembly with cartridge based retainer configured to unlock a firing lockout
US11464511B2 (en) 2019-02-19 2022-10-11 Cilag Gmbh International Surgical staple cartridges with movable authentication key arrangements
JP2022523852A (en) 2019-03-12 2022-04-26 Magic Leap, Inc. Aligning local content between first and second augmented reality viewers
EP3719532B1 (en) 2019-04-04 2022-12-28 Transrobotics, Inc. Technologies for acting based on object tracking
JP2022530900A (en) * 2019-05-01 2022-07-04 Magic Leap, Inc. Content provisioning system and method
KR20210132132A (en) 2019-06-17 2021-11-03 Google LLC Mobile device-based radar system for applying different power modes to multi-mode interface
USD950728S1 (en) 2019-06-25 2022-05-03 Cilag Gmbh International Surgical staple cartridge
USD952144S1 (en) 2019-06-25 2022-05-17 Cilag Gmbh International Surgical staple cartridge retainer with firing system authentication key
USD964564S1 (en) 2019-06-25 2022-09-20 Cilag Gmbh International Surgical staple cartridge retainer with a closure system authentication key
JP2022542363A (en) 2019-07-26 2022-10-03 Magic Leap, Inc. Systems and methods for augmented reality
GB2586059B (en) * 2019-08-01 2023-06-07 Sony Interactive Entertainment Inc System and method for generating user inputs for a video game
US10973062B2 (en) * 2019-08-26 2021-04-06 International Business Machines Corporation Method for extracting environment information leveraging directional communication
KR20210034270A (en) 2019-09-20 2021-03-30 Samsung Electronics Co., Ltd. Electronic device for determining path of line of sight (LOS) and method for the same
WO2021097323A1 (en) 2019-11-15 2021-05-20 Magic Leap, Inc. A viewing system for use in a surgical environment
KR20210069479A (en) 2019-12-03 2021-06-11 Samsung Electronics Co., Ltd. Electronic device and operating method for identifying location information of device
US11860439B1 (en) 2020-05-06 2024-01-02 Apple Inc. Head-mounted electronic device with alignment sensors
CN113129328B (en) * 2021-04-22 2022-05-17 The 29th Research Institute of China Electronics Technology Group Corporation Fine analysis method for target hotspot areas
US11615648B2 (en) 2021-05-28 2023-03-28 Sportsbox.ai Inc. Practice drill-related features using quantitative, biomechanical-based analysis
GB2608186A (en) * 2021-06-25 2022-12-28 Thermoteknix Systems Ltd Augmented reality system
US11920521B2 (en) 2022-02-07 2024-03-05 General Electric Company Turboshaft load control using feedforward and feedback control


Family Cites Families (237)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2397746A (en) * 1942-05-23 1946-04-02 Hazeltine Corp Wave-signal direction finder
US3430243A (en) * 1966-04-04 1969-02-25 Robley D Evans Method of and apparatus for determining the distance and/or angles between objects with the aid of radiant energy
US3816830A (en) * 1970-11-27 1974-06-11 Hazeltine Corp Cylindrical array antenna
US3789410A (en) * 1972-01-07 1974-01-29 Us Navy Passive ranging technique
US4041494A (en) * 1975-11-10 1977-08-09 The United States Of America As Represented By The Secretary Of The Department Of Transportation Distance measuring method and apparatus
US4309703A (en) * 1979-12-28 1982-01-05 International Business Machines Corporation Segmented chirp waveform implemented radar system
US5248884A (en) * 1983-10-11 1993-09-28 The Secretary Of State For Defence In Her Britannic Majesty's Government Of The United Kingdom Of Great Britain And Northern Ireland Infrared detectors
US4639900A (en) * 1984-02-22 1987-01-27 U.S. Philips Corporation Method and a system for monitoring a sea area
US4807183A (en) * 1985-09-27 1989-02-21 Carnegie-Mellon University Programmable interconnection chip for computer system functional modules
EP0344153B1 (en) * 1986-11-27 1995-04-05 David Fenton Fenner Remote control systems
US5027433A (en) * 1988-04-04 1991-06-25 Hm Electronics, Inc. Remote infrared transceiver and method of using same
US5214615A (en) * 1990-02-26 1993-05-25 Will Bauer Three-dimensional displacement of a body with computer interface
US5229764A (en) * 1991-06-20 1993-07-20 Matchett Noel D Continuous biometric authentication matrix
US5138322A (en) * 1991-08-20 1992-08-11 Matrix Engineering, Inc. Method and apparatus for radar measurement of ball in play
US7098891B1 (en) * 1992-09-18 2006-08-29 Pryor Timothy R Method for providing human input to a computer
US5502683A (en) * 1993-04-20 1996-03-26 International Business Machines Corporation Dual ported memory with word line access control
AU6792194A (en) * 1993-05-03 1994-11-21 University Of British Columbia, The Tracking platform system
JP2552427B2 (en) * 1993-12-28 1996-11-13 Konami Corporation TV play system
US5574479A (en) * 1994-01-07 1996-11-12 Selectech, Ltd. Optical system for determining the roll orientation of a remote unit relative to a base unit
CA2141144A1 (en) * 1994-03-31 1995-10-01 Joseph Desimone Electronic game utilizing bio-signals
US5412619A (en) * 1994-04-14 1995-05-02 Bauer; Will Three-dimensional displacement of a body with computer interface
US8280682B2 (en) * 2000-12-15 2012-10-02 Tvipr, Llc Device for monitoring movement of shipped goods
US6539336B1 (en) * 1996-12-12 2003-03-25 Phatrat Technologies, Inc. Sport monitoring system for determining airtime, speed, power absorbed and other factors such as drop distance
US5943427A (en) * 1995-04-21 1999-08-24 Creative Technology Ltd. Method and apparatus for three dimensional audio spatialization
US6418324B1 (en) * 1995-06-01 2002-07-09 Padcom, Incorporated Apparatus and method for transparent wireless communication between a remote device and host system
US5528557A (en) * 1995-08-07 1996-06-18 Northrop Grumman Corporation Acoustic emission source location by reverse ray tracing
US5742840A (en) * 1995-08-16 1998-04-21 Microunity Systems Engineering, Inc. General purpose, multiple precision parallel operation, programmable media processor
DE59601957D1 (en) * 1995-09-07 1999-06-24 Siemens Ag Device for distance measurement
US5754948A (en) * 1995-12-29 1998-05-19 University Of North Carolina At Charlotte Millimeter-wave wireless interconnection of electronic components
US6278418B1 (en) * 1995-12-29 2001-08-21 Kabushiki Kaisha Sega Enterprises Three-dimensional imaging system, game device, method for same and recording medium
US6396041B1 (en) * 1998-08-21 2002-05-28 Curtis A. Vock Teaching and gaming golf feedback system and methods
US5700204A (en) * 1996-06-17 1997-12-23 Teder; Rein S. Projectile motion parameter determination device using successive approximation and high measurement angle speed sensor
US6400374B2 (en) * 1996-09-18 2002-06-04 Eyematic Interfaces, Inc. Video superposition system and method
US5786912A (en) * 1996-12-27 1998-07-28 Lucent Technologies Inc. Waveguide-based, fabricless switch for telecommunication system and telecommunication infrastructure employing the same
US6182203B1 (en) * 1997-01-24 2001-01-30 Texas Instruments Incorporated Microprocessor
US6814293B2 (en) * 1997-02-10 2004-11-09 Symbol Technologies, Inc. Arrangement for and method of establishing a logical relationship among peripherals in a wireless local area network
DE69727245T2 (en) * 1997-02-13 2004-11-18 Nokia Corp. Method and device for directed radio transmission
US6070269A (en) * 1997-07-25 2000-06-06 Medialab Services S.A. Data-suit for real-time computer animation and virtual reality applications
US20020036617A1 (en) * 1998-08-21 2002-03-28 Timothy R. Pryor Novel man machine interfaces and applications
US6142876A (en) * 1997-08-22 2000-11-07 Cumbers; Blake Player tracking and identification system
US6162123A (en) * 1997-11-25 2000-12-19 Woolston; Thomas G. Interactive electronic sword game
US5884104A (en) * 1997-11-26 1999-03-16 Eastman Kodak Company Compact camera flash unit
US6421453B1 (en) * 1998-05-15 2002-07-16 International Business Machines Corporation Apparatus and methods for user recognition employing behavioral passwords
US6438622B1 (en) * 1998-11-17 2002-08-20 Intel Corporation Multiprocessor system including a docking system
FR2786899B1 (en) * 1998-12-03 2006-09-29 Jean Bonnard Movement indicator for software
US7102616B1 (en) * 1999-03-05 2006-09-05 Microsoft Corporation Remote control device with pointing capacity
US7933295B2 (en) * 1999-04-13 2011-04-26 Broadcom Corporation Cable modem with voice processing capability
US7015950B1 (en) * 1999-05-11 2006-03-21 Pryor Timothy R Picture taking method and apparatus
US6343315B1 (en) * 1999-05-12 2002-01-29 Lodgenet Entertainment Corporation Entertainment/Information system having disparate interactive devices
US6653971B1 (en) * 1999-05-14 2003-11-25 David L. Guice Airborne biota monitoring and control system
US6500070B1 (en) * 1999-05-28 2002-12-31 Nintendo Co., Ltd. Combined game system of portable and video game machines
US6972734B1 (en) * 1999-06-11 2005-12-06 Canon Kabushiki Kaisha Mixed reality apparatus and mixed reality presentation method
US7592944B2 (en) * 1999-06-14 2009-09-22 Time Domain Corporation System and method for intrusion detection using a time domain radar array
US6177903B1 (en) * 1999-06-14 2001-01-23 Time Domain Corporation System and method for intrusion detection using a time domain radar array
JP4278071B2 (en) * 1999-06-17 2009-06-10 Bandai Namco Games Inc. Image generation system and information storage medium
JP2001104636A (en) * 1999-10-04 2001-04-17 Shinsedai Kk Cenesthesic ball game device
US6735708B2 (en) * 1999-10-08 2004-05-11 Dell Usa, L.P. Apparatus and method for a combination personal digital assistant and network portable device
US7030860B1 (en) * 1999-10-08 2006-04-18 Synaptics Incorporated Flexible transparent touch sensing system for electronic devices
US8956228B2 (en) * 1999-12-03 2015-02-17 Nike, Inc. Game pod
US7010634B2 (en) * 1999-12-23 2006-03-07 Intel Corporation Notebook computer with independently functional, dockable core computer
US7445550B2 (en) * 2000-02-22 2008-11-04 Creative Kingdoms, Llc Magical wand and interactive play experience
US6315667B1 (en) * 2000-03-28 2001-11-13 Robert Steinhart System for remote control of a model airplane
JP4020567B2 (en) * 2000-05-15 2007-12-12 Konami Digital Entertainment Co., Ltd. Game machine and game environment setting network system thereof
US20020049806A1 (en) * 2000-05-16 2002-04-25 Scott Gatz Parental control system for use in connection with account-based internet access server
EP1168158B1 (en) * 2000-06-12 2007-10-10 Broadcom Corporation Context switch architecture and system
JP2002052243A (en) * 2000-08-11 2002-02-19 Konami Co Ltd Competition type video game
US7918808B2 (en) * 2000-09-20 2011-04-05 Simmons John C Assistive clothing
KR100364368B1 (en) * 2000-10-18 2002-12-12 LG Electronics Inc. Private Network Using Bluetooth and Communication Method Using the Network
JP2002171245A (en) * 2000-12-05 2002-06-14 Sony Corp Method for synthesizing retransmitted data and device for synthesizing retransmitted data
US6735663B2 (en) * 2000-12-18 2004-05-11 Dell Products L.P. Combination personal data assistant and personal computing device
EP1216899A1 (en) * 2000-12-22 2002-06-26 Ford Global Technologies, Inc. Communication system for use with a vehicle
JP2002199500A (en) * 2000-12-25 2002-07-12 Sony Corp Virtual sound image localizing processor, virtual sound image localization processing method and recording medium
US6801974B1 (en) * 2001-01-26 2004-10-05 Dell Products L.P. Method of filtering events in a combinational computing device
US7197584B2 (en) * 2001-01-26 2007-03-27 Dell Products L.P. Removable personal digital assistant in a dual personal computer/personal digital assistant computer architecture
US6816925B2 (en) * 2001-01-26 2004-11-09 Dell Products L.P. Combination personal data assistant and personal computing device with master slave input output
JP3722279B2 (en) * 2001-01-26 2005-11-30 NEC Corporation Optical transceiver module
EP1274279B1 (en) * 2001-02-14 2014-06-18 Sony Corporation Sound image localization signal processor
US7131907B2 (en) * 2001-02-22 2006-11-07 Kabushiki Kaisha Sega System and method for superimposing an image on another image in a video game
US20020183038A1 (en) * 2001-05-31 2002-12-05 Palm, Inc. System and method for crediting an account associated with a network access node
US7082285B2 (en) * 2001-03-23 2006-07-25 Broadcom Corporation Reduced instruction set baseband controller
US6540607B2 (en) * 2001-04-26 2003-04-01 Midway Games West Video game position and orientation detection system
US6587699B2 (en) * 2001-05-02 2003-07-01 Trex Enterprises Corporation Narrow beamwidth communication link with alignment camera
US7065326B2 (en) * 2001-05-02 2006-06-20 Trex Enterprises Corporation Millimeter wave communications system with a high performance modulator circuit
KR100987650B1 (en) * 2001-05-14 2010-10-13 Koninklijke Philips Electronics N.V. Device for interacting with real-time streams of content
US6563940B2 (en) * 2001-05-16 2003-05-13 New Jersey Institute Of Technology Unauthorized user prevention device and method
SE523407C2 (en) * 2001-05-18 2004-04-13 Jan G Faeger Device for determining the position and/or orientation of a creature in relation to an environment and use of such a device
US20030172380A1 (en) * 2001-06-05 2003-09-11 Dan Kikinis Audio command and response for IPGs
US20030001882A1 (en) * 2001-06-29 2003-01-02 Macer Peter J. Portable entertainment machines
DE10136981A1 (en) * 2001-07-30 2003-02-27 Daimler Chrysler Ag Method and device for determining a stationary and/or moving object
US7094164B2 (en) * 2001-09-12 2006-08-22 Pillar Vision Corporation Trajectory detection and feedback system
US6760387B2 (en) * 2001-09-21 2004-07-06 Time Domain Corp. Impulse radio receiver and method for finding angular offset of an impulse radio transmitter
US7054423B2 (en) * 2001-09-24 2006-05-30 Nebiker Robert M Multi-media communication downloading
US6937182B2 (en) * 2001-09-28 2005-08-30 Trex Enterprises Corp. Millimeter wave imaging system
US7257093B1 (en) * 2001-10-10 2007-08-14 Sandia Corporation Localized radio frequency communication using asynchronous transfer mode protocol
US6987988B2 (en) * 2001-10-22 2006-01-17 Waxess, Inc. Cordless and wireless telephone docking station with land line interface and switching mode
US7444393B2 (en) * 2001-10-30 2008-10-28 Keicy K. Chung Read-only storage device having network interface, a system including the device, and a method of distributing files over a network
US20050282633A1 (en) * 2001-11-13 2005-12-22 Frederic Nicolas Movement-sensing apparatus for software
US20030112585A1 (en) * 2001-12-13 2003-06-19 Silvester Kelan Craig Multiprocessor notebook computer with a tablet PC conversion capability
US6712692B2 (en) * 2002-01-03 2004-03-30 International Business Machines Corporation Using existing videogames for physical training and rehabilitation
JP3914771B2 (en) * 2002-01-09 2007-05-16 Hitachi, Ltd. Packet communication apparatus and packet data transfer control method
GB0203621D0 (en) * 2002-02-15 2002-04-03 Bae Systems Defence Systems L Emitter location system
WO2003071813A2 (en) * 2002-02-19 2003-08-28 Zyray Wireless, Inc. Method and apparatus optimizing a radio link
US6990320B2 (en) * 2002-02-26 2006-01-24 Motorola, Inc. Dynamic reallocation of processing resources for redundant functionality
AU2003220185B2 (en) * 2002-03-12 2007-05-10 Menache, Llc Motion tracking system and method
KR100449102B1 (en) * 2002-03-19 2004-09-18 Samsung Electronics Co., Ltd. System on chip processor for multimedia
US20030195040A1 (en) * 2002-04-10 2003-10-16 Breving Joel S. Video game system and game controller
US20030211888A1 (en) * 2002-05-13 2003-11-13 Interactive Telegames, Llc Method and apparatus using insertably-removable auxiliary devices to play games over a communications link
US7085536B2 (en) * 2002-05-23 2006-08-01 Intel Corporation Method and apparatus for dynamically resolving radio frequency interference problems in a system
US7043588B2 (en) * 2002-05-24 2006-05-09 Dell Products L.P. Information handling system featuring multi-processor capability with processor located in docking station
US20050035955A1 (en) * 2002-06-06 2005-02-17 Carter Dale J. Method of determining orientation and manner of holding a mobile telephone
US7146014B2 (en) * 2002-06-11 2006-12-05 Intel Corporation MEMS directional sensor system
US7159099B2 (en) * 2002-06-28 2007-01-02 Motorola, Inc. Streaming vector processor with reconfigurable interconnection switch
US7161579B2 (en) * 2002-07-18 2007-01-09 Sony Computer Entertainment Inc. Hand-held computer interactive device
US7646372B2 (en) * 2003-09-15 2010-01-12 Sony Computer Entertainment Inc. Methods and systems for enabling direction detection when interfacing with a computer program
US8073157B2 (en) * 2003-08-27 2011-12-06 Sony Computer Entertainment Inc. Methods and apparatus for targeted sound detection and characterization
US7623115B2 (en) * 2002-07-27 2009-11-24 Sony Computer Entertainment Inc. Method and apparatus for light input device
US8947347B2 (en) * 2003-08-27 2015-02-03 Sony Computer Entertainment Inc. Controlling actions in a video game unit
US7782297B2 (en) * 2002-07-27 2010-08-24 Sony Computer Entertainment America Inc. Method and apparatus for use in determining an activity level of a user in relation to a system
US7627139B2 (en) * 2002-07-27 2009-12-01 Sony Computer Entertainment Inc. Computer image and audio processing of intensity and input devices for interfacing with a computer program
US7918733B2 (en) * 2002-07-27 2011-04-05 Sony Computer Entertainment America Inc. Multi-input game control mixer
US7854655B2 (en) * 2002-07-27 2010-12-21 Sony Computer Entertainment America Inc. Obtaining input for controlling execution of a game program
US7803050B2 (en) * 2002-07-27 2010-09-28 Sony Computer Entertainment Inc. Tracking device with sound emitter for use in obtaining information for controlling game program execution
DE10240497A1 (en) * 2002-09-03 2004-03-11 Robert Bosch Gmbh Radar measuring device and method for operating a radar measuring device
US7343524B2 (en) * 2002-09-16 2008-03-11 Finisar Corporation Network analysis omniscent loop state machine
US20040054776A1 (en) * 2002-09-16 2004-03-18 Finisar Corporation Network expert analysis process
US20040062308A1 (en) * 2002-09-27 2004-04-01 Kamosa Gregg Mark System and method for accelerating video data processing
US7200061B2 (en) * 2002-11-08 2007-04-03 Hitachi, Ltd. Sense amplifier for semiconductor memory device
US20040117442A1 (en) * 2002-12-10 2004-06-17 Thielen Kurt R. Handheld portable wireless digital content player
US20040123113A1 (en) * 2002-12-18 2004-06-24 Svein Mathiassen Portable or embedded access and input devices and methods for giving access to access limited devices, apparatuses, appliances, systems or networks
AU2003297389A1 (en) * 2002-12-19 2004-07-14 Fortescue Corporation Method and apparatus for determining orientation and position of a moveable object
US7339608B2 (en) * 2003-01-03 2008-03-04 Vtech Telecommunications Limited Wireless motion sensor using infrared illuminator and camera integrated with wireless telephone
JP3875196B2 (en) * 2003-02-10 2007-01-31 Toshiba Corporation Service providing device, service receiving device, service providing program, service receiving program, proximity wireless communication device, service providing method, and service receiving method
US7009561B2 (en) * 2003-03-11 2006-03-07 Menache, Llp Radio frequency motion tracking system and method
EP1880478B1 (en) * 2003-03-12 2013-01-16 International Business Machines Corporation Method and apparatus for converting optical signals to radio channels
KR100739087B1 (en) * 2003-03-12 2007-07-13 NEC Corporation Transmission beam control method, adaptive antenna transmitter/receiver apparatus and radio base station
AU2003901463A0 (en) * 2003-03-31 2003-04-17 Qx Corporation Pty Ltd A method and device for multipath mitigation in positioning systems using clustered positioning signals
CA2523480C (en) * 2003-04-25 2014-05-27 Xm Satellite Radio Inc. System and method for providing recording and playback of digital media content
US7391888B2 (en) * 2003-05-30 2008-06-24 Microsoft Corporation Head pose assessment methods and systems
US20050009604A1 (en) * 2003-07-11 2005-01-13 Hsien-Ta Huang Monotone voice activation device
US20050014468A1 (en) * 2003-07-18 2005-01-20 Juha Salokannel Scalable bluetooth multi-mode radio module
US7432846B2 (en) * 2003-08-12 2008-10-07 Trex Enterprises Corp. Millimeter wave imaging system
US7415244B2 (en) * 2003-08-12 2008-08-19 Trex Enterprises Corp. Multi-channel millimeter wave imaging system
US7385549B2 (en) * 2003-08-12 2008-06-10 Trex Enterprises Corp Millimeter wave portal imaging system
WO2005024949A1 (en) * 2003-08-28 2005-03-17 Hitachi, Ltd. Semiconductor device and its manufacturing method
US7441154B2 (en) * 2003-09-12 2008-10-21 Finisar Corporation Network analysis tool
US20050076161A1 (en) * 2003-10-03 2005-04-07 Amro Albanna Input system and method
US20050124307A1 (en) * 2003-12-08 2005-06-09 Xytrans, Inc. Low cost broadband wireless communication system
US20050132420A1 (en) * 2003-12-11 2005-06-16 Quadrock Communications, Inc System and method for interaction with television content
EP1695335A1 (en) * 2003-12-15 2006-08-30 France Telecom Method for synthesizing acoustic spatialization
US20050185364A1 (en) * 2004-01-05 2005-08-25 Jory Bell Docking station for mobile computing device
US7663689B2 (en) * 2004-01-16 2010-02-16 Sony Computer Entertainment Inc. Method and apparatus for optimizing capture device settings through depth information
TWI286036B (en) * 2004-02-10 2007-08-21 Realtek Semiconductor Corp Method for selecting a channel in a wireless network
US7148836B2 (en) * 2004-03-05 2006-12-12 The Regents Of The University Of California Obstacle penetrating dynamic radar imaging system
US9178953B2 (en) * 2004-03-18 2015-11-03 Nokia Technologies Oy Position-based context awareness for mobile terminal device
JP2005323340A (en) * 2004-04-07 2005-11-17 Matsushita Electric Ind Co Ltd Communication terminal and communication method
US20050245204A1 (en) * 2004-05-03 2005-11-03 Vance Scott L Impedance matching circuit for a mobile communication device
JP3866735B2 (en) * 2004-05-10 2007-01-10 Toshiba Corporation Multifunction mobile communication terminal
US8027165B2 (en) * 2004-07-08 2011-09-27 Sandisk Technologies Inc. Portable memory devices with removable caps that effect operation of the devices when attached
US8016667B2 (en) * 2004-07-22 2011-09-13 Igt Remote gaming eligibility system and method using RFID tags
US8109858B2 (en) * 2004-07-28 2012-02-07 William G Redmann Device and method for exercise prescription, detection of successful performance, and provision of reward therefore
US7242359B2 (en) * 2004-08-18 2007-07-10 Microsoft Corporation Parallel loop antennas for a mobile electronic device
KR100890060B1 (en) * 2004-08-27 2009-03-25 Samsung Electronics Co., Ltd. System and Method for Controlling Congestion of Group Call Response Message On Access Channel
US7590589B2 (en) * 2004-09-10 2009-09-15 Hoffberg Steven M Game theoretic prioritization scheme for mobile ad hoc networks permitting hierarchal deference
WO2006036811A2 (en) * 2004-09-22 2006-04-06 Xyratex Technnology Limited System and method for configuring memory devices for use in a network
EP1646112A1 (en) * 2004-10-11 2006-04-12 Sony Deutschland GmbH Directivity control for short range wireless mobile communication systems
US20060085675A1 (en) * 2004-10-12 2006-04-20 Andrew Popell One-touch backup system
US6967612B1 (en) * 2004-10-22 2005-11-22 Gorman John D System and method for standoff detection of human carried explosives
US7683883B2 (en) * 2004-11-02 2010-03-23 Pierre Touma 3D mouse and game controller based on spherical coordinates system and system for use
WO2006054599A1 (en) * 2004-11-16 2006-05-26 Nihon University Sound source direction judging device and method
US6965340B1 (en) * 2004-11-24 2005-11-15 Agilent Technologies, Inc. System and method for security inspection using microwave imaging
US20060148568A1 (en) * 2004-12-30 2006-07-06 Motorola, Inc. Device and method for wirelessly accessing game media
US7852317B2 (en) * 2005-01-12 2010-12-14 Thinkoptics, Inc. Handheld device for handheld vision based absolute pointing system
EP1846115A4 (en) * 2005-01-26 2012-04-25 Bentley Kinetics Inc Method and system for athletic motion analysis and instruction
US20060189386A1 (en) * 2005-01-28 2006-08-24 Outland Research, L.L.C. Device, system and method for outdoor computer gaming
US20070055949A1 (en) * 2005-01-29 2007-03-08 Nicholas Thomas Methods and apparatus for rfid interface control
US7330702B2 (en) * 2005-01-31 2008-02-12 Taiwan Semiconductor Manufacturing Co., Ltd. Method and apparatus for inter-chip wireless communication
EP1688847B1 (en) * 2005-02-03 2011-05-04 Texas Instruments Incorporated Die-to-die interconnect interface and protocol for stacked semiconductor dies
US7502965B2 (en) * 2005-02-07 2009-03-10 Broadcom Corporation Computer chip set having on board wireless interfaces to support test operations
US7489870B2 (en) * 2005-10-31 2009-02-10 Searete Llc Optical antenna with optical reference
US20060203758A1 (en) * 2005-03-11 2006-09-14 Samsung Electronics Co., Ltd. Mobile terminal for relaying multimedia data to an external display device
US20060211494A1 (en) * 2005-03-18 2006-09-21 Helfer Lisa M Gaming terminal with player-customization of display functions
US7343177B2 (en) * 2005-05-03 2008-03-11 Broadcom Corporation Modular ear-piece/microphone (headset) operable to service voice activated commands
US20060256090A1 (en) * 2005-05-12 2006-11-16 Apple Computer, Inc. Mechanical overlay
US7733285B2 (en) * 2005-05-18 2010-06-08 Qualcomm Incorporated Integrated, closely spaced, high isolation, printed dipoles
US8116401B2 (en) * 2005-05-26 2012-02-14 Broadcom Corporation Method and system for digital spur cancellation
US8001353B2 (en) * 2005-06-10 2011-08-16 Hewlett-Packard Development Company, L.P. Apparatus and method for configuring memory blocks
US7218143B1 (en) * 2005-06-14 2007-05-15 Xilinx, Inc. Integrated circuit having fast interconnect paths between memory elements and carry logic
KR101257848B1 (en) * 2005-07-13 2013-04-24 Samsung Electronics Co., Ltd. Data storing apparatus comprising complex memory and method of operating the same
GB0515796D0 (en) * 2005-07-30 2005-09-07 Mccarthy Peter A motion capture and identification device
US7927216B2 (en) * 2005-09-15 2011-04-19 Nintendo Co., Ltd. Video game system with wireless modular handheld controller
JP4262726B2 (en) * 2005-08-24 2009-05-13 Nintendo Co., Ltd. Game controller and game system
US8308563B2 (en) * 2005-08-30 2012-11-13 Nintendo Co., Ltd. Game system and storage medium having game program stored thereon
JP4471910B2 (en) * 2005-09-14 2010-06-02 Nintendo Co., Ltd. Virtual positioning program
US8471812B2 (en) * 2005-09-23 2013-06-25 Jesse C. Bunch Pointing and identification device
US8142287B2 (en) * 2005-10-11 2012-03-27 Zeemote Technology Inc. Universal controller for toys and games
JP4859433B2 (en) * 2005-10-12 2012-01-25 Nintendo Co., Ltd. Position detection system and position detection program
US7715432B2 (en) * 2005-11-14 2010-05-11 Broadcom Corporation Primary protocol stack having a secondary protocol stack entry point
US8180363B2 (en) * 2005-11-15 2012-05-15 Sony Computer Entertainment Inc. Communication apparatus preventing communication interference
US7613482B2 (en) * 2005-12-08 2009-11-03 Accton Technology Corporation Method and system for steering antenna beam
US7170440B1 (en) * 2005-12-10 2007-01-30 Landray Technology, Inc. Linear FM radar
US20070135243A1 (en) * 2005-12-12 2007-06-14 Larue Michael B Active sports tracker and method
TWI286484B (en) * 2005-12-16 2007-09-11 Pixart Imaging Inc Device for tracking the motion of an object and object for reflecting infrared light
US7978081B2 (en) * 2006-01-09 2011-07-12 Applied Technology Holdings, Inc. Apparatus, systems, and methods for communicating biometric and biomechanical information
US20070224944A1 (en) * 2006-03-10 2007-09-27 Hsiang Chen Portable device having changeable operating modes
US7714780B2 (en) * 2006-03-10 2010-05-11 Broadcom Corporation Beamforming RF circuit and applications thereof
JP4151982B2 (en) * 2006-03-10 2008-09-17 Nintendo Co., Ltd. Motion discrimination device and motion discrimination program
US7899394B2 (en) * 2006-03-16 2011-03-01 Broadcom Corporation RFID system with RF bus
US7423587B2 (en) * 2006-04-02 2008-09-09 Rolf Mueller Method for frequency-driven generation of a multiresolution decomposition of the input to wave-based sensing arrays
US8176230B2 (en) * 2006-04-07 2012-05-08 Kingston Technology Corporation Wireless flash memory card expansion system
US7539533B2 (en) * 2006-05-16 2009-05-26 Bao Tran Mesh network monitoring appliance
US20070268481A1 (en) * 2006-05-17 2007-11-22 Ramesh Raskar System and method for measuring scene reflectance using optical sensors
KR100753041B1 (en) * 2006-05-29 2007-08-30 Samsung Electronics Co., Ltd. Mobile terminal with a virtual mode dial and method for operating thereof
JP4208898B2 (en) * 2006-06-09 2009-01-14 Sony Computer Entertainment Inc. Object tracking device and object tracking method
US7816747B2 (en) * 2006-07-06 2010-10-19 International Business Machines Corporation Detector for detecting electromagnetic waves
US20080028118A1 (en) * 2006-07-31 2008-01-31 Craig Peter Sayers Portable dock for a portable computing system
US20080070682A1 (en) * 2006-08-15 2008-03-20 Nintendo Of America Inc. Systems and methods for providing educational games for use by young children, and digital storage mediums for storing the educational games thereon
US7860467B2 (en) * 2006-08-29 2010-12-28 Broadcom Corporation Power control for a dual mode transmitter
US20080070516A1 (en) * 2006-09-15 2008-03-20 Plantronics, Inc. Audio data streaming with auto switching between wireless headset and speakers
US20080076406A1 (en) * 2006-09-22 2008-03-27 Vanu, Inc. Wireless Backhaul
US20080136775A1 (en) * 2006-12-08 2008-06-12 Conant Carson V Virtual input device for computing
US8340057B2 (en) * 2006-12-22 2012-12-25 Canon Kabushiki Kaisha Automated wireless access to peripheral devices
WO2008088870A1 (en) * 2007-01-19 2008-07-24 Progressive Gaming International Corporation Table monitoring identification system, wager tagging and felt coordinate mapping
US20090017910A1 (en) * 2007-06-22 2009-01-15 Broadcom Corporation Position and motion tracking of an object
US9486703B2 (en) * 2007-01-31 2016-11-08 Broadcom Corporation Mobile communication device with game application for use in conjunction with a remote mobile communication device and methods for use therewith
US20090011832A1 (en) * 2007-01-31 2009-01-08 Broadcom Corporation Mobile communication device with game application for display on a remote monitor and methods for use therewith
FI121980B (en) * 2007-02-16 2011-06-30 Voyantic Oy Method for characterizing a radio link
US20080244466A1 (en) * 2007-03-26 2008-10-02 Timothy James Orsley System and method for interfacing with information on a display screen
US20080242414A1 (en) * 2007-03-29 2008-10-02 Broadcom Corporation, A California Corporation Game devices with integrated gyrators and methods for use therewith
US7647071B2 (en) * 2007-03-29 2010-01-12 Broadcom Corporation Communication devices with integrated gyrators and methods for use therewith
JP2008271023A (en) * 2007-04-18 2008-11-06 Univ Of Electro-Communications Antenna system
US10504317B2 (en) * 2007-04-30 2019-12-10 Cfph, Llc Game with player actuated control structure
US8209540B2 (en) * 2007-06-28 2012-06-26 Apple Inc. Incremental secure backup and restore of user settings and data
WO2009062153A1 (en) * 2007-11-09 2009-05-14 Wms Gaming Inc. Interaction with 3d space in a gaming system
US7895365B2 (en) * 2008-02-06 2011-02-22 Broadcom Corporation File storage for a computing device with handheld and extended computing units
CN102016877B (en) * 2008-02-27 2014-12-10 Sony Computer Entertainment America LLC Methods for capturing depth data of a scene and applying computer actions
US7671802B2 (en) * 2008-03-17 2010-03-02 Disney Enterprises, Inc. Active player tracking
EP2443779B1 (en) * 2009-06-19 2020-08-05 BlackBerry Limited Uplink transmissions for type 2 relay

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6545661B1 (en) * 1999-06-21 2003-04-08 Midway Amusement Games, Llc Video game system having a control unit with an accelerometer for controlling a video game
US20070060336A1 (en) * 2003-09-15 2007-03-15 Sony Computer Entertainment Inc. Methods and systems for enabling depth and direction detection when interfacing with a computer program
US20050270368A1 (en) * 2004-06-04 2005-12-08 Electronic Arts Inc. Motion sensor using dual camera inputs

Cited By (82)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080316863A1 (en) * 2007-06-22 2008-12-25 Broadcom Corporation Directional microphone or microphones for position determination
US7952962B2 (en) * 2007-06-22 2011-05-31 Broadcom Corporation Directional microphone or microphones for position determination
US20100321473A1 (en) * 2007-10-04 2010-12-23 Samsung Techwin Co., Ltd. Surveillance camera system
US8508595B2 (en) * 2007-10-04 2013-08-13 Samsung Techwin Co., Ltd. Surveillance camera system for controlling cameras using position and orientation of the cameras and position information of a detected object
US8271587B2 (en) * 2008-03-21 2012-09-18 International Business Machines Corporation Object movement control system, object movement control method, server and computer program
US20090241034A1 (en) * 2008-03-21 2009-09-24 Kazuaki Ishizaki Object movement control system, object movement control method, server and computer program
US20090258703A1 (en) * 2008-04-09 2009-10-15 Aaron Philip Brunstetter Motion Assessment Using a Game Controller
US8724849B2 (en) * 2008-10-01 2014-05-13 Sony Corporation Information processing device, information processing method, program, and information storage medium
US20110216946A1 (en) * 2008-10-01 2011-09-08 Sony Computer Entertainment, Inc. Information processing device, information processing method, program, and information storage medium
KR100999711B1 (en) 2009-03-09 2010-12-08 Gwangju Institute of Science and Technology Apparatus for real-time calibrating in the collaboration system and method using the same
US20100232650A1 (en) * 2009-03-13 2010-09-16 Omron Corporation Measurement apparatus
US8917900B2 (en) * 2009-03-13 2014-12-23 Omron Corporation Measurement apparatus
US9141235B2 (en) 2009-10-26 2015-09-22 Seiko Epson Corporation Optical position detecting device and display device with position detecting function
US9098137B2 (en) 2009-10-26 2015-08-04 Seiko Epson Corporation Position detecting function-added projection display apparatus
US20110096322A1 (en) * 2009-10-26 2011-04-28 Seiko Epson Corporation Optical position detection device and display device with position detection function
US20110096031A1 (en) * 2009-10-26 2011-04-28 Seiko Epson Corporation Position detecting function-added projection display apparatus
US20110096032A1 (en) * 2009-10-26 2011-04-28 Seiko Epson Corporation Optical position detecting device and display device with position detecting function
US8542350B2 (en) * 2009-10-26 2013-09-24 Seiko Epson Corporation Optical position detection device and display device with position detection function
US8714749B2 (en) 2009-11-06 2014-05-06 Seiko Epson Corporation Projection display device with position detection function
US8523667B2 (en) * 2010-03-29 2013-09-03 Microsoft Corporation Parental control settings based on body dimensions
US20110237324A1 (en) * 2010-03-29 2011-09-29 Microsoft Corporation Parental control settings based on body dimensions
WO2012030453A1 (en) * 2010-08-31 2012-03-08 Microsoft Corporation User selection and navigation based on looped motions
US10416276B2 (en) 2010-11-12 2019-09-17 Position Imaging, Inc. Position tracking system and method using radio signals and inertial sensing
US11175375B2 (en) 2010-11-12 2021-11-16 Position Imaging, Inc. Position tracking system and method using radio signals and inertial sensing
US20130010071A1 (en) * 2011-07-04 2013-01-10 3Divi Methods and systems for mapping pointing device on depth map
US20130050499A1 (en) * 2011-08-30 2013-02-28 Qualcomm Incorporated Indirect tracking
WO2013035096A3 (en) * 2011-09-07 2013-07-18 Umoove Limited System and method of tracking an object in an image captured by a moving device
US10605904B2 (en) 2011-11-10 2020-03-31 Position Imaging, Inc. Systems and methods of wireless position tracking
US9933509B2 (en) 2011-11-10 2018-04-03 Position Imaging, Inc. System for tracking an object using pulsed frequency hopping
US9945940B2 (en) 2011-11-10 2018-04-17 Position Imaging, Inc. Systems and methods of wireless position tracking
US9782669B1 (en) 2012-06-14 2017-10-10 Position Imaging, Inc. RF tracking with active sensory feedback
US10269182B2 (en) 2012-06-14 2019-04-23 Position Imaging, Inc. RF tracking with active sensory feedback
US20140009384A1 (en) * 2012-07-04 2014-01-09 3Divi Methods and systems for determining location of handheld device within 3d environment
US10001833B2 (en) 2012-08-14 2018-06-19 Position Imaging, Inc. User input system for immersive interaction
US10180490B1 (en) 2012-08-24 2019-01-15 Position Imaging, Inc. Radio frequency communication system
US10534067B2 (en) 2012-08-24 2020-01-14 Position Imaging, Inc. Radio frequency communication system
US10338192B2 (en) 2012-08-24 2019-07-02 Position Imaging, Inc. Radio frequency communication system
US20150209664A1 (en) * 2012-10-04 2015-07-30 Disney Enterprises, Inc. Making physical objects appear to be moving from the physical world into the virtual world
US9690373B2 (en) * 2012-10-04 2017-06-27 Disney Enterprises, Inc. Making physical objects appear to be moving from the physical world into the virtual world
US10234539B2 (en) 2012-12-15 2019-03-19 Position Imaging, Inc. Cycling reference multiplexing receiver system
US10856108B2 (en) 2013-01-18 2020-12-01 Position Imaging, Inc. System and method of locating a radio frequency (RF) tracking device using a calibration routine
US10237698B2 (en) 2013-01-18 2019-03-19 Position Imaging, Inc. System and method of locating a radio frequency (RF) tracking device using a calibration routine
US11226395B2 (en) 2013-12-13 2022-01-18 Position Imaging, Inc. Tracking system with mobile reader
US10634762B2 (en) 2013-12-13 2020-04-28 Position Imaging, Inc. Tracking system with mobile reader
US10634761B2 (en) 2013-12-13 2020-04-28 Position Imaging, Inc. Tracking system with mobile reader
US10257654B2 (en) 2014-01-17 2019-04-09 Position Imaging, Inc. Wireless relay station for radio frequency-based tracking system
US9961503B2 (en) 2014-01-17 2018-05-01 Position Imaging, Inc. Wireless relay station for radio frequency-based tracking system
US10623898B2 (en) 2014-01-17 2020-04-14 Position Imaging, Inc. Wireless relay station for radio frequency-based tracking system
US10200819B2 (en) * 2014-02-06 2019-02-05 Position Imaging, Inc. Virtual reality and augmented reality functionality for mobile devices
US10631131B2 (en) 2014-02-06 2020-04-21 Position Imaging, Inc. Virtual reality and augmented reality functionality for mobile devices
US20150221135A1 (en) * 2014-02-06 2015-08-06 Position Imaging, Inc. Virtual reality and augmented reality functionality for mobile devices
US20160014390A1 (en) * 2014-07-08 2016-01-14 Apple Inc. Electronic Devices With Connector Alignment Assistance
US10869175B2 (en) * 2014-11-04 2020-12-15 Nathan Schumacher System and method for generating a three-dimensional model using flowable probes
US9423669B2 (en) 2014-11-04 2016-08-23 Qualcomm Incorporated Method and apparatus for camera autofocus based on Wi-Fi ranging technique
US9741135B2 (en) * 2014-12-22 2017-08-22 Baidu Online Network Technology (Beijing) Co., Ltd. Method for measuring object and smart device
US10642560B2 (en) 2015-02-13 2020-05-05 Position Imaging, Inc. Accurate geographic tracking of mobile devices
US11132004B2 (en) 2015-02-13 2021-09-28 Position Imaging, Inc. Spatial diversity for relative position tracking
US10324474B2 (en) 2015-02-13 2019-06-18 Position Imaging, Inc. Spatial diversity for relative position tracking
US11416805B1 (en) 2015-04-06 2022-08-16 Position Imaging, Inc. Light-based guidance for package tracking systems
US10148918B1 (en) 2015-04-06 2018-12-04 Position Imaging, Inc. Modular shelving systems for package tracking
US10853757B1 (en) 2015-04-06 2020-12-01 Position Imaging, Inc. Video for real-time confirmation in package tracking systems
US11057590B2 (en) 2015-04-06 2021-07-06 Position Imaging, Inc. Modular shelving systems for package tracking
US11501244B1 (en) 2015-04-06 2022-11-15 Position Imaging, Inc. Package tracking systems and methods
US20170123425A1 (en) * 2015-10-09 2017-05-04 SZ DJI Technology Co., Ltd. Salient feature based vehicle positioning
US10599149B2 (en) * 2015-10-09 2020-03-24 SZ DJI Technology Co., Ltd. Salient feature based vehicle positioning
US10444323B2 (en) 2016-03-08 2019-10-15 Position Imaging, Inc. Expandable, decentralized position tracking systems and methods
US10388027B2 (en) * 2016-06-01 2019-08-20 Kyocera Corporation Detection method, display apparatus, and detection system
US11436553B2 (en) 2016-09-08 2022-09-06 Position Imaging, Inc. System and method of object tracking using weight confirmation
US11774249B2 (en) 2016-12-12 2023-10-03 Position Imaging, Inc. System and method of personalized navigation inside a business enterprise
US10634506B2 (en) 2016-12-12 2020-04-28 Position Imaging, Inc. System and method of personalized navigation inside a business enterprise
US10455364B2 (en) 2016-12-12 2019-10-22 Position Imaging, Inc. System and method of personalized navigation inside a business enterprise
US11506501B2 (en) 2016-12-12 2022-11-22 Position Imaging, Inc. System and method of personalized navigation inside a business enterprise
US11022443B2 (en) 2016-12-12 2021-06-01 Position Imaging, Inc. System and method of personalized navigation inside a business enterprise
US10634503B2 (en) 2016-12-12 2020-04-28 Position Imaging, Inc. System and method of personalized navigation inside a business enterprise
US11120392B2 (en) 2017-01-06 2021-09-14 Position Imaging, Inc. System and method of calibrating a directional light source relative to a camera's field of view
US11361536B2 (en) 2018-09-21 2022-06-14 Position Imaging, Inc. Machine-learning-assisted self-improving object-identification system and method
US11637962B2 (en) 2019-01-11 2023-04-25 Position Imaging, Inc. Computer-vision-based object tracking and guidance module
US11089232B2 (en) 2019-01-11 2021-08-10 Position Imaging, Inc. Computer-vision-based object tracking and guidance module
US11498004B2 (en) * 2020-06-23 2022-11-15 Nintendo Co., Ltd. Computer-readable non-transitory storage medium having instructions stored therein, game apparatus, game system, and game processing method
US20210394068A1 (en) * 2020-06-23 2021-12-23 Nintendo Co., Ltd. Computer-readable non-transitory storage medium having instructions stored therein, game apparatus, game system, and game processing method
CN112418200A (en) * 2021-01-25 2021-02-26 Chengdu Dianze Intelligent Technology Co., Ltd. Object detection method and device based on thermal imaging and server
US11961279B2 (en) 2022-06-13 2024-04-16 Position Imaging, Inc. Machine-learning-assisted self-improving object-identification system and method

Also Published As

Publication number Publication date
US20080316103A1 (en) 2008-12-25
US20120315991A1 (en) 2012-12-13
US20080318673A1 (en) 2008-12-25
US20080318681A1 (en) 2008-12-25
US9523767B2 (en) 2016-12-20
US20080316863A1 (en) 2008-12-25
US8311579B2 (en) 2012-11-13
US8062133B2 (en) 2011-11-22
US20080318684A1 (en) 2008-12-25
US20120129606A1 (en) 2012-05-24
US20130023290A1 (en) 2013-01-24
US9417320B2 (en) 2016-08-16
US7952962B2 (en) 2011-05-31
US8160640B2 (en) 2012-04-17
US20080318683A1 (en) 2008-12-25
US7973702B2 (en) 2011-07-05
US20170232345A1 (en) 2017-08-17
US8628417B2 (en) 2014-01-14
US20080316085A1 (en) 2008-12-25
US20080318626A1 (en) 2008-12-25
US9943760B2 (en) 2018-04-17
US20080318595A1 (en) 2008-12-25
US20080318680A1 (en) 2008-12-25
US9547080B2 (en) 2017-01-17
US20170232346A1 (en) 2017-08-17
US20080318675A1 (en) 2008-12-25
US20090258706A1 (en) 2009-10-15
US20080318625A1 (en) 2008-12-25
US8676257B2 (en) 2014-03-18
US10549195B2 (en) 2020-02-04
US20080318682A1 (en) 2008-12-25
US8289212B2 (en) 2012-10-16
US20110312421A1 (en) 2011-12-22
US20200129861A1 (en) 2020-04-30
US8031121B2 (en) 2011-10-04
US20080318689A1 (en) 2008-12-25
US20080318691A1 (en) 2008-12-25
US11426660B2 (en) 2022-08-30
US20090017910A1 (en) 2009-01-15
US20090273559A1 (en) 2009-11-05

Similar Documents

Publication Title
US20080316324A1 (en) Position detection and/or movement tracking via image capture and processing
US11397264B2 (en) Tracking system
Li et al. A VLC smartphone camera based indoor positioning system
Wang et al. RF-kinect: A wearable RFID-based approach towards 3D body movement tracking
US8696458B2 (en) Motion tracking system and method using camera and non-camera sensors
TWI274295B (en) Method and apparatus for real time motion capture
CN111353355B (en) Motion tracking system and method
CN103619090A (en) System and method of automatic stage lighting positioning and tracking based on micro inertial sensor
EP1761798A1 (en) Wireless location and identification system and method
CN109668545A (en) Localization method, locator and positioning system for head-mounted display apparatus
US11443486B2 (en) Mobile 3D body scanning methods and apparatus
CN110169045A (en) Information processing equipment, information processing method and information processing system
GB2566923A (en) Motion tracking
Bostanci et al. Tracking methods for augmented reality
KR100777600B1 (en) A method and system for motion capture using relative coordinates
CN116660863A (en) Device and method for positioning user on omnidirectional motion platform
CN114690899A (en) Positioning device and method based on wearable equipment

Legal Events

Date Code Title Description
AS Assignment

Owner name: BROADCOM CORPORATION, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ROFOUGARAN, AHMADREZA REZA;ROFOUGARAN, MARYAM;REEL/FRAME:021156/0006

Effective date: 20080527

AS Assignment

Owner name: BROADCOM CORPORATION, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SESHADRI, NAMBIRAJAN;IBRAHIM, BRIMA B.;WALLEY, JOHN;AND OTHERS;SIGNING DATES FROM 20110526 TO 20110623;REEL/FRAME:027063/0793

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: BANK OF AMERICA, N.A., AS COLLATERAL AGENT, NORTH CAROLINA

Free format text: PATENT SECURITY AGREEMENT;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:037806/0001

Effective date: 20160201

AS Assignment

Owner name: AVAGO TECHNOLOGIES GENERAL IP (SINGAPORE) PTE. LTD., SINGAPORE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BROADCOM CORPORATION;REEL/FRAME:041706/0001

Effective date: 20170120

AS Assignment

Owner name: BROADCOM CORPORATION, CALIFORNIA

Free format text: TERMINATION AND RELEASE OF SECURITY INTEREST IN PATENTS;ASSIGNOR:BANK OF AMERICA, N.A., AS COLLATERAL AGENT;REEL/FRAME:041712/0001

Effective date: 20170119