US20140003195A1 - Coordinating devices to estimate distance, relative position, and device attitude


Info

Publication number
US20140003195A1
Authority
US
United States
Prior art keywords
transmission
environment
active device
location
spatial model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/815,920
Inventor
Stanislav Vonog
Tara Lemmey
Maxim Bykov
Nikolay Surin
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Net Power and Light Inc
Original Assignee
Net Power and Light Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US13/815,920 priority Critical patent/US20140003195A1/en
Application filed by Net Power and Light Inc filed Critical Net Power and Light Inc
Assigned to NET POWER AND LIGHT, INC. reassignment NET POWER AND LIGHT, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BYKOV, MAXIM, SURIN, NIKOLAY, LEMMEY, TARA, VONOG, STANISLAV
Publication of US20140003195A1 publication Critical patent/US20140003195A1/en
Assigned to ALSOP LOUIE CAPITAL I, L.P., PENINSULA TECHNOLOGY VENTURES, L.P., PENINSULA VENTURE PRINCIPALS, L.P. reassignment ALSOP LOUIE CAPITAL I, L.P. SECURITY INTEREST Assignors: NET POWER AND LIGHT, INC.
Assigned to ALSOP LOUIE CAPITAL 1, L.P., BROWN, JOHN SEELY, PENINSULA TECHNOLOGY VENTURES, L.P., LOW, LAWRENCE B., SHIN, JEANNIE, PENINSULA VENTURE PRINCIPALS, L.P., ORRICK INVESTMENTS 2010, LLC, ORRICK INVESTMENTS 2011, LLC, WANG, TA-HUI TY, TWB INVESTMENT PARTNERSHIP II, LP, SINGTEL INNOV8 PTE. LTD., THE BOARD OF TRUSTEES OF THE LELAND STANFORD JUNIOR UNIVERSITY (DAPER I) reassignment ALSOP LOUIE CAPITAL 1, L.P. SECURITY INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: NET POWER & LIGHT, INC.
Assigned to ALSOP LOUIE CAPITAL 1, L.P., BROWN, JOHN SEELY, LOW, LAWRENCE B., ORRICK INVESTMENTS 2010, LLC, ORRICK INVESTMENTS 2011, LLC, THE BOARD OF TRUSTEES OF THE LELAND STANFORD UNIVERSITY (DAPER I), PENINSULA TECHNOLOGY VENTURES, L.P., PENSULA VENTURE PRINCIPALS, L.P., SHIN, JEANNIE, WANG, TA-HUITY, TWB INVESTMENT PARTNERSHIP II, LP, SINGTEL INNOVS PTE. LTD. reassignment ALSOP LOUIE CAPITAL 1, L.P. SECURITY INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: NET POWER AND LIGHT, INC.
Assigned to ALSOP LOUIE CAPITAL 1, L.P., LOW, LAWRENCE B., ORRICK INVESTMENTS 2010, LLC, ORRICK INVESTMENTS 2011, LLC, PENINSULA TECHNOLOGY VENTURES, L.P., PENINSULA VENTURE PRINCIPALS, L.P., SHIN, JEANNIE, WANG, TA-HUITY, TWB INVESTMENT PARTNERSHIP II, LP, SINGTEL INNOV8 PTE. LTD., BROWN, JOHN SEELY, THE BOARD OF TRUSTEES OF THE LELAND STANFORD JUNIOR UNIVERSITY (DAPER I) reassignment ALSOP LOUIE CAPITAL 1, L.P. CORRECTIVE ASSIGNMENT TO CORRECT THE ASSIGNEE'S NAME PREVIOUSLY RECORDED AT REEL: 037333 FRAME: 0680. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT. Assignors: NET POWER AND LIGHT, INC.
Assigned to PENINSULA TECHNOLOGY VENTURES, L.P., ALSOP LOUIE CAPITAL 1, L.P., SINGTEL INNOV8 PTE. LTD., SHINE, JEANNIE, LOW, LAWRENCE B., ORRICK INVESTMENTS 2011, LLC, BROWN, JOHN SEELY, TWB INVESTMENT PARTNERSHIP II, LP, THE BOARD OF TRUSTEES OF THE LELAND STANFORD JUNIOR UNIVERSITY (DAPER I), WANG, TA-HUI TY, PENINSULA VENTURE PRINCIPALS, L.P., ORRICK INVESTMENTS 2010, LLC reassignment PENINSULA TECHNOLOGY VENTURES, L.P. SECURITY INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: NET POWER AND LIGHT, INC.
Priority to US15/052,393 priority patent/US9929798B2/en
Assigned to NET POWER & LIGHT, INC. reassignment NET POWER & LIGHT, INC. RELEASE BY SECURED PARTY (SEE DOCUMENT FOR DETAILS). Assignors: NET POWER & LIGHT, INC.
Assigned to NET POWER & LIGHT, INC. reassignment NET POWER & LIGHT, INC. NOTE AND WARRANT CONVERSION AGREEMENT Assignors: ALSOP LOUIE CAPITAL 1, L.P., PENINSULA TECHNOLOGY VENTURES, L.P., PENINSULA VENTURE PRINCIPALS, L.P.
Assigned to NET POWER & LIGHT, INC. reassignment NET POWER & LIGHT, INC. NOTE AND WARRANT CONVERSION AGREEMENT Assignors: ALSOP LOUIE CAPITAL 1, L.P., BROWN, JOHN SEELY, LOW, LAWRENCE B., ORRICK INVESTMENTS 2010, LLC, ORRICK INVESTMENTS 2011, LLC, PENINSULA TECHNOLOGY VENTURES, L.P., PENINSULA VENTURE PRINCIPALS, L.P., SHIN, JEANNIE, SINGTEL INNOV8 PTE. LTD., THE BOARD OF TRUSTEES OF THE LELAND STANFORD JUNIOR UNIVERSITY (DAPER I), TWB INVESTMENT PARTNERSHIP II, LP, WANG, TA-HUI TY
Assigned to NET POWER & LIGHT, INC. reassignment NET POWER & LIGHT, INC. NOTE AND WARRANT CONVERSION AGREEMENT Assignors: ALSOP LOUIE CAPITAL 1, L.P., BROWN, JOHN SEELY, LOW, LAWRENCE B., ORRICK INVESTMENTS 2010, LLC, ORRICK INVESTMENTS 2011, LLC, PENINSULA TECHNOLOGY VENTURES, L.P., PENINSULA VENTURE PRINCIPALS, L.P., SHIN, JEANNIE, SINGTEL INNOV8 PTE. LTD., THE BOARD OF TRUSTEES OF THE LELAND STANFORD JUNIOR UNIVERSITY (DAPER I), TWB INVESTMENT PARTNERSHIP II, LP, WANG, TA-HUI TY
Assigned to NET POWER & LIGHT, INC. reassignment NET POWER & LIGHT, INC. RELEASE BY SECURED PARTY (SEE DOCUMENT FOR DETAILS). Assignors: NET POWER & LIGHT, INC.
Assigned to NET POWER & LIGHT, INC. reassignment NET POWER & LIGHT, INC. RELEASE BY SECURED PARTY (SEE DOCUMENT FOR DETAILS). Assignors: NET POWER & LIGHT, INC.

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04B - TRANSMISSION
    • H04B7/00 - Radio transmission systems, i.e. using radiation field
    • H04B7/24 - Radio transmission systems, i.e. using radiation field for communication between two or more posts
    • H04B7/26 - Radio transmission systems, i.e. using radiation field for communication between two or more posts at least one of which is mobile
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00 - Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/02 - Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
    • G01S5/0247 - Determining attitude
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00 - Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/02 - Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
    • G01S5/0284 - Relative positioning
    • G01S5/0289 - Relative positioning of multiple transceivers, e.g. in ad hoc networks
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00 - Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/18 - Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using ultrasonic, sonic, or infrasonic waves
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00 - Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/18 - Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using ultrasonic, sonic, or infrasonic waves
    • G01S5/186 - Determination of attitude

Definitions

  • the present invention relates to methods and apparatus for coordinating a plurality of devices, and more specifically, a method and apparatus for coordinating a plurality of devices to estimate or find various distances, relative positions, device attitudes, and other absolute and/or relative kinematic data.
  • the present invention contemplates a variety of improved methods and apparatus for coordinating multiple devices to estimate or calculate various distances, relative positions, device attitudes, and other absolute and/or relative kinematic data.
  • the present teaching provides a paradigm enabling various inventions such as:
  • Certain aspects of this invention contemplate that two or more devices contain or connect to components capable of transmitting data between the devices (e.g., speakers and microphones, signal transmitters and receivers, etc.).
  • Any suitable devices may be used in any combination, such as mobile phones, tablets, computers, and televisions (possibly connected via set-top boxes, game systems, DVD players, etc.), and any suitable means may be used for transmitting data (e.g., via a Wifi, cellular, Ethernet network or other means of transmission).
  • FIG. 1 is a diagram of a multiple device coordination platform for coordinating a plurality of devices to estimate or calculate various distances, relative positions, device attitudes, and other absolute and/or relative kinematic data, according to an embodiment of this invention.
  • FIG. 2 is a diagram of a multiple device coordination platform, according to various embodiments of this invention.
  • FIG. 3A is a flowchart of a process for configuring a multiple device coordination platform, for estimating the distance and other absolute and/or relative kinematic data between two devices, according to an embodiment of this invention.
  • FIG. 3B is a diagram of two devices according to one embodiment of this invention.
  • FIG. 3C is a diagram of two devices sensing a surrounding environment according to one embodiment of this invention.
  • FIG. 4A is a flowchart of a process for configuring a multiple device coordination platform, for estimating relative positions and other absolute and/or relative kinematic data of devices, according to an embodiment of this invention.
  • FIG. 4B is a diagram of multiple devices according to one embodiment of this invention.
  • FIG. 4C is a diagram of two devices sensing a surrounding environment according to one embodiment of this invention.
  • FIG. 5 is a flowchart of a process for configuring a multiple device coordination platform, for estimating, recording and tracking the relative motions and other absolute and/or relative kinematic data of one or more moving devices, according to an embodiment of this invention.
  • FIG. 6A is a flowchart of a process for configuring a multiple device coordination platform, for estimating, recording and tracking the change in attitudes and other absolute and/or relative kinematic data of one or more moving devices, according to an embodiment of this invention.
  • FIG. 6B illustrates the orientation and attitude of a device according to one embodiment of this invention.
  • FIG. 7 is a flowchart of a process for configuring a multiple device coordination platform, for recognizing gestures and other absolute and/or relative kinematic data performed with handheld devices, according to an embodiment of this invention.
  • FIG. 8 is a flowchart of a process for configuring a multiple device coordination platform, for changing the user interface of a device based on the proximity and other absolute and/or relative kinematic data of another device, according to an embodiment of this invention.
  • FIG. 9 is a diagram of a computer system that can be used to implement various embodiments.
  • FIG. 10 is a diagram of a chip set that can be used to implement an embodiment.
  • a method and apparatus for configuring a multiple device coordination platform for coordinating multiple devices to estimate or calculate various distances, relative positions, device attitudes, and other absolute and/or relative kinematic data is described.
  • numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the invention. It is apparent, however, that the embodiments may be practiced without these specific details or with an equivalent arrangement. In other instances, well-known structures and devices are shown in block diagram form in order to avoid unnecessarily obscuring the embodiments.
  • FIG. 1 is a diagram of a multiple device coordination platform for coordinating a plurality of devices, according to an embodiment.
  • the multiple device coordination platform 103 is capable of estimating or calculating various distances, relative positions, device attitudes, and other absolute and/or relative kinematic data of a plurality of devices according to an embodiment.
  • data are transmitted by requesting devices 101 a - 101 n , which have a means of transmitting data between devices, to the multiple device coordination platform 103 .
  • data are sensed by the devices 101 a - 101 n by employing a sensor means to gather data associated with a surrounding environment in which the devices exist, and the raw sensory data gathered by the devices 101 a - 101 n are transmitted to the multiple device coordination platform 103 .
  • the devices 101 a - 101 n may be mobile phones, Personal Digital Assistants (PDAs), Smartphones, tablets, laptops, desktops, televisions (possibly connected via set-top boxes, game systems, DVD players, etc.) and the like.
  • the means of sensing the data may include a proximity sensor, a camera, a microphone, and the like.
  • the sensor is a proximity sensor capable of detecting nearby objects.
  • the proximity sensor may be, for example, an electromagnetic radiation sensor, a capacitive sensor, a photoelectric sensor, an inductive sensor, or the like.
  • the sensor is a camera capable of receiving visible or non-visible light.
  • the sensor is a microphone capable of receiving audio signal in the form of acoustical waves.
  • the means of transmitting the data is typically via a Wifi 111 , cellular 117 , or Ethernet network 113 , but could include other means of transmission.
  • a device 101 may contain or connect to transmission components associated with the device for transmitting and receiving a variety of nonintrusive signals (i.e., nonintrusive to participants using the device).
  • a device 101 may contain or connect to speakers and microphones associated with the device for transmitting and receiving audio signals.
  • the device 101 may contain or connect to radio frequency (RF) transmitters and receivers associated with the device for transmitting and receiving RF signals.
  • the device 101 may contain or connect to infrared (IR) transmitters and receivers associated with the device for transmitting and receiving IR signals.
  • a participant using the device 101 may estimate or calculate various distances, relative positions, and device attitudes and other absolute and/or relative kinematic data based on the signal (e.g., audio, optical, etc.) received from other devices.
  • the device 101 may be actively broadcasting a signal to other devices and receiving in response a return signal in order to estimate or calculate various distances, relative positions, and device attitudes and other absolute and/or relative kinematic data.
  • a device 101 may be passively sensing the environment in which it exists to receive the signal to assist in the estimation or calculation of various distances, relative positions, and device attitudes and other absolute and/or relative kinematic data.
  • a plurality of devices, acting together, transmit cumulative raw sensory data about what they are sensing to the multiple device coordination platform 103 .
  • a device 101 a may transmit an image of an environment sensed through a built-in camera of the device 101 a . Using this image alone may be insufficient. However, the image, combined with other sensory data received from the devices 101 b - 101 n , such as audio data received from microphones, enables the multiple device coordination platform 103 to estimate or calculate accurately the various distances, relative positions, device attitudes, orientation, and/or other relative kinematic data (velocities, angular velocities, accelerations) about the devices 101 a - 101 n.
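Combining an image-derived estimate with audio-derived estimates, as described above, can be done by inverse-variance weighting of the independent measurements. The sketch below is illustrative only; the function name and example variance figures are assumptions, not part of this disclosure.

```python
def fuse_estimates(estimates):
    """Fuse independent (value, variance) measurements of the same quantity
    by inverse-variance weighting; returns the fused value and its variance."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    fused = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    return fused, 1.0 / total

# Hypothetical example: a camera-based distance of 2.0 m (variance 0.25)
# fused with an audio-based distance of 2.4 m (variance 0.05).
distance, variance = fuse_estimates([(2.0, 0.25), (2.4, 0.05)])
```

The fused value is pulled toward the lower-variance measurement, which matches the observation above that the image alone may be insufficient but is useful in combination with other sensory data.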
  • various types of processing may be applied to the raw sensory data gathered by the devices 101 a - 101 n .
  • image recognition algorithms may be applied to raw still images or video.
  • noise filter algorithms may be applied to raw audio data.
  • the processing of the raw sensory data may be performed locally by the devices 101 a - 101 n themselves.
  • the processing of the raw sensory data may be performed by the multiple device coordination platform 103 .
  • a person having ordinary skill in the art will recognize the optimal configuration based on available computing power, bandwidth constraints, storage capacity, or other considerations.
  • the multiple device coordination platform 103 may utilize the calculated various distances, relative positions, device attitudes, orientation, and/or other relative kinematic data (velocities, angular velocities, accelerations) about the devices 101 a - 101 n for constructing a real-time spatial model of the devices 101 a - 101 n used by participants at a particular venue.
  • the real-time spatial model may be used, for example, to track the movement of the devices, to apply gesture recognition, to alter the content displayed on devices used by the participants (such as rendering special effects on the display of a device used by a participant), and the like.
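At its simplest, such a real-time spatial model might keep a timestamped position history per device, from which motion can be derived for tracking and gesture recognition. The class below is a hypothetical sketch of that idea, not the disclosed implementation; all names are illustrative.

```python
class SpatialModel:
    """Minimal spatial model: timestamped positions per device, with a
    finite-difference velocity estimate over the two most recent samples."""

    def __init__(self):
        self.history = {}  # device_id -> list of (time, (x, y, z))

    def update(self, device_id, t, position):
        self.history.setdefault(device_id, []).append((t, position))

    def velocity(self, device_id):
        samples = self.history.get(device_id, [])
        if len(samples) < 2:
            return None  # not enough data to estimate motion
        (t0, p0), (t1, p1) = samples[-2], samples[-1]
        dt = t1 - t0
        return tuple((b - a) / dt for a, b in zip(p0, p1))
```

Movement tracking and gesture recognition would then operate on the stored trajectories, e.g. by matching velocity sequences against gesture templates.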
  • the data received by the devices via active broadcasting and via passive sensing may be utilized in parallel by the multiple device coordination platform 103 to estimate or calculate the various distances, relative positions, device attitudes, and other absolute and/or relative kinematic data.
  • the multiple device coordination platform 103 may operate in connection with a device 101 by way of a communication network.
  • the device 101 may include various computer hardware, including the multiple device coordination platform 103 .
  • the multiple device coordination platform 103 may be external to the devices 101 .
  • the multiple device coordination platform 103 can retrieve the data gathered by the device 101 on behalf of the respective subscribing participant.
  • devices 101 a - 101 n may include cellular phones, BLUETOOTH-enabled devices, Wifi-enabled devices, satellite phones, smart phones, wireless phones, or any other suitable mobile devices, such as personal digital assistants (PDAs), pocket personal computers, tablets, customized hardware, etc., which have the ability to transmit data between devices.
  • devices 101 a - 101 n may include a computer (e.g., desktop computer, laptop, web appliance, etc., which also have the ability to transmit data between devices).
  • one or more networks are provided to handle various communication sessions, including voice as well as non-voice communications.
  • Networks 111 - 117 may be any suitable wired and/or wireless network.
  • telephony network 117 may include a circuit-switched network, such as the public switched telephone network (PSTN), an integrated services digital network (ISDN), a private branch exchange (PBX), or other like network.
  • Wireless network 111 may employ various technologies including, for example, code division multiple access (CDMA), enhanced data rates for global evolution (EDGE), general packet radio service (GPRS), mobile ad hoc network (MANET), global system for mobile communications (GSM), Internet protocol multimedia subsystem (IMS), universal mobile telecommunications system (UMTS), etc., as well as any other suitable wireless medium, e.g., microwave access (WiMAX), wireless fidelity (Wifi), long term evolution (LTE), satellite, and the like.
  • data network 113 may be any local area network (LAN), metropolitan area network (MAN), wide area network (WAN), the Internet, or any other suitable packet-switched network, such as a commercially owned, proprietary packet-switched network, such as a proprietary cable or fiber-optic network.
  • FIG. 2 is a diagram of a multiple device coordination platform 103 , according to an exemplary embodiment.
  • the multiple device coordination platform 103 includes various executable modules for performing one or more computing, data processing and network based instructions that in combination enable devices 101 a - 101 n to estimate or calculate various distances, relative positions, device attitudes, and other absolute and/or relative kinematic data.
  • the modules encompassing the multiple device coordination platform 103 can be implemented in hardware, firmware, software, or a combination thereof.
  • a device which may be a mobile device (e.g., mobile device 101 a of FIG. 1 ), or a computer, includes a device coordination module 201 that is configured to estimate or calculate various distances, relative positions, device attitudes, and other absolute and/or relative kinematic data of devices.
  • the device may also include a distance calculation module 203 configured to measure or estimate the absolute and/or relative distance between devices. It is noted that these communication sessions can be established over a circuit-switched network, a packet-switched network, or a combination thereof.
  • a communication interface 213 can be appropriately configured depending on the transport systems and networks.
  • the communication interface 213 may provide a means of collaborative data exchange and communication between the device and the multiple device coordination platform 103 ( FIG. 1 ).
  • the device may also feature the relative position determination module 205 for estimating or calculating relative positions of devices; the attitude determination module 207 for estimating, recording and tracking the change in attitudes of one or more moving devices; gesture recognition module 209 for recognizing gestures performed with a handheld device; and interface reaction module 211 for changing the user interface of the device based on the proximity of the other device.
  • FIGS. 3A , 4 A, 5 , 6 A, 7 and 8 are flowcharts of a system, method, and computer program product according to an embodiment. It will be understood that each block or step of the flowcharts, and combinations of blocks in the flowcharts, may be implemented by various means, such as hardware, firmware, and/or software including one or more computer program instructions. For example, one or more of the procedures described above may be embodied by computer program instructions. In this regard, the computer program instructions, which embody the procedures described above, may be stored by a memory device of a mobile terminal, server, or other computing device and executed by a built-in processor in the computing device.
  • any such computer program instructions may be loaded onto a computer or other programmable apparatus (i.e., hardware) to produce a machine, such that the instructions which execute on the computer or other programmable apparatus create means for implementing the functions specified in the flowchart block(s) or step(s).
  • These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means, which implement the function specified in the flowchart block(s) or step(s).
  • the computer program instructions may also be loaded onto a computer or other programmable apparatus to trigger a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart block(s) or step(s).
  • blocks or steps of the flowcharts support combinations of means for performing the specified functions, combinations of steps for performing the specified functions and program instruction means for performing the specified functions. It will also be understood that one or more blocks or steps of the flowcharts, and combinations of blocks or steps in the flowcharts, may be implemented by special purpose hardware-based computer systems which perform the specified functions or steps, or combinations of special purpose hardware and computer instructions.
  • FIG. 3A is a flowchart of a process 301 (steps 303 , 305 , 307 , 309 and 311 ) for configuring a multiple device coordination platform, for measuring the distance and other absolute and/or relative kinematic data between two devices, for example, 321 and 323 (of FIG. 3B ), according to an embodiment.
  • the process is described with respect to FIGS. 3A-3B .
  • the process is described with respect to an audio signal.
  • the process may be implemented with other signals (e.g., non-intrusive optical signals, etc.) in other embodiments.
  • in steps 303 and 305 of process 301 , one device (e.g., 321 or 323 ) broadcasts an audio signal using a speaker 325 b to estimate or calculate the distance and other absolute and/or relative kinematic data between the two devices (e.g., 321 and 323 ). When the two devices are close enough, this audio signal is picked up by the microphone 325 a of the other device.
  • the absolute and/or relative distance, and other absolute and/or relative kinematic data between the two devices can be measured with reasonable accuracy based on the amplitude of the received signal.
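Under a free-field assumption, received sound-pressure amplitude falls off roughly as 1/r, so a distance estimate can be read off from the ratio of a calibrated reference amplitude to the received amplitude. The function below is an illustrative sketch of such an amplitude-based measurement; the name and calibration scheme are assumptions, not the patent's implementation.

```python
def distance_from_amplitude(a_received, a_ref, r_ref=1.0):
    """Estimate source distance assuming free-field 1/r pressure-amplitude
    falloff, calibrated by a reference amplitude a_ref measured at r_ref."""
    if a_received <= 0:
        raise ValueError("received amplitude must be positive")
    return r_ref * (a_ref / a_received)
```

Halving the received amplitude doubles the estimated distance; in practice reflections and occlusion degrade this, consistent with the "reasonable accuracy" qualification above.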
  • FIG. 3B is a diagram of two devices 321 and 323 according to one further teaching.
  • the devices 321 , 323 contain or are connected to components 325 capable of broadcasting and receiving the audio signal in the embodiment discussed above.
  • the process 301 may be performed without step 303 .
  • the process 301 starts with a device, for example, 327 (of FIG. 3C ), sensing, or detecting a signal from another device, such as 329 (of FIG. 3C ), as indicated in step 305 .
  • the signal may be passively present with respect to the device 329 without the device 329 broadcasting the signal to the device 327 .
  • One example of the signal is an image being sensed, or gathered, by a proximity sensor 331 of the device 327 sensing a nearby existence of the device 329 .
  • similarly, a signal (e.g., a sensed image) of the device 327 may be sensed by the device 329 , as indicated by step 305 .
  • the absolute and/or relative distance, and other absolute and/or relative kinematic data between the two devices 327 and 329 can be measured with reasonable accuracy based on the signal sensed by the device 327 (and/or 329 ).
  • FIG. 3C is a diagram of the two devices 327 and 329 according to one further teaching of the embodiment, where each device contains or is connected to a sensor 331 (e.g., a proximity sensor) capable of gathering raw sensory data.
  • FIG. 4A is a flowchart of a process 401 (steps 403 , 405 , 407 , 409 and 411 ) for configuring a multiple device coordination platform, for estimating or calculating relative positions and other absolute and/or relative kinematic data of devices, for example, 421 , 423 , 425 and 427 (of FIG. 4B ), according to an embodiment.
  • the processes are described with respect to FIGS. 4A-4B . Again, it is noted that the steps of the process may be performed in any suitable order, as well as combined or separated in any suitable manner. Further, according to the embodiment, the process is described with respect to an audio signal.
  • FIG. 4B is a diagram of multiple devices 421 , 423 , 425 and 427 according to one further teaching.
  • in steps 403 and 405 of process 401 , multiple devices (e.g., 421 , 423 , 425 and 427 ) broadcast audio signals using speakers 429 b to estimate or calculate the relative positions and other absolute and/or relative kinematic data between those devices. When those devices are close enough, these audio signals will be picked up by the microphones 429 a of the other devices. Per steps 407 and 409 , the relative positions and other absolute and/or relative kinematic data between those devices can be measured with reasonable accuracy based on the amplitudes of the received signals.
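Once pairwise distances have been estimated from the received amplitudes, relative positions can be recovered by trilateration. The sketch below linearizes the circle equations for a device at unknown coordinates and three devices at known positions; it is an illustrative assumption about how such a step might be realized, not the disclosed implementation.

```python
def trilaterate_2d(anchors, dists):
    """Locate a device in the plane from its distances to three devices at
    known positions, by linearizing the three circle equations."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = dists
    # Subtracting the first circle equation from the other two yields a
    # 2x2 linear system in the unknown coordinates (x, y).
    a11, a12 = 2.0 * (x2 - x1), 2.0 * (y2 - y1)
    a21, a22 = 2.0 * (x3 - x1), 2.0 * (y3 - y1)
    b1 = d1 ** 2 - d2 ** 2 + x2 ** 2 - x1 ** 2 + y2 ** 2 - y1 ** 2
    b2 = d1 ** 2 - d3 ** 2 + x3 ** 2 - x1 ** 2 + y3 ** 2 - y1 ** 2
    det = a11 * a22 - a12 * a21
    if det == 0:
        raise ValueError("anchor positions are collinear")
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)
```

With more than three devices, the same linearization yields an overdetermined system that can be solved by least squares for added robustness.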
  • the process 401 starts with a device, for example, 433 (of FIG. 4C ), sensing, or detecting, passively signals from other devices, such as 435 , 437 , and 439 (of FIG. 4C ), as indicated in step 405 .
  • the signals may be passively present with respect to the devices 435 - 439 without being broadcast by those devices.
  • One example of the signals is images sensed, or gathered, by a proximity sensor 431 of the device 433 sensing the nearby existence of the devices 435 - 439 .
  • each of the devices 433 - 439 senses signals of the others using the proximity sensor 431 to estimate or calculate the relative positions and other absolute and/or relative kinematic data between the devices (e.g., 433 , 435 , 437 and 439 ).
  • Per steps 407 and 409 the absolute and/or relative distance, and other absolute and/or relative kinematic data between the devices 435 , 437 , and 439 can be measured with reasonable accuracy based on the difference in signal received by the multiple devices.
  • FIG. 4C is a diagram of the devices 433 , 435 , 437 , and 439 containing or connecting to a sensor 431 (e.g., a proximity sensor) capable of gathering raw sensory data according to one further teaching of the embodiment.
  • FIG. 5 is a flowchart of a process 501, comprising steps 503, 505, 507, 509 and 511, for configuring a multiple device coordination platform for estimating, recording and tracking the relative motion, and other absolute and/or relative kinematic data, of the moving device, according to an embodiment.
  • FIG. 6A is a flowchart of a process 601, comprising steps 603, 605, 607, 609 and 611, for configuring a multiple device coordination platform for estimating, recording and tracking the change in attitude, and other absolute and/or relative kinematic data, of the moving device, according to an embodiment.
  • FIG. 6B illustrates the orientation 621 and the attitude of the device according to one further teaching. It should be noted that “attitude” in this context refers to the rotation of the device around its three dimensional axes.
  • The processes are described with respect to FIGS. 5 and 6A-6B. According to the embodiment, the processes are described with respect to an audio signal. However, the processes may be implemented in other embodiments with respect to other types of signals (e.g., non-intrusive optical signals, etc.). Again, it is noted that the steps of each process (FIG. 5 and FIG. 6A respectively) may be performed in any suitable order, as well as combined or separated in any suitable manner.
  • The relative motion, change in attitude, and other absolute and/or relative kinematic data of the moving device can be recorded using the same general technique of comparing the audio signals detected by multiple receivers and tracking the changes in those signals over time.
  • In step 503 of process 501 and step 603 of process 601, the moving device broadcasts the audio signal using speakers; when the moving device and the other devices are close enough, this audio signal is picked up by the microphones of the other devices to estimate or calculate the relative motion and the change in attitude of the moving device.
  • The relative motion, the change in attitude, and other absolute and/or relative kinematic data of the moving device can be measured with reasonable accuracy based on the amplitude of the received signal.
  • Per steps 507-509 and 607-609, those signals, and other absolute and/or relative kinematic data, can be recorded and tracked over time.
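The recording-and-tracking step might be sketched as follows; the class name and the finite-difference velocity estimate are assumptions for illustration, not the platform's actual implementation.

```python
class MotionTracker:
    """Record successive position estimates of a moving device (e.g., from
    repeated amplitude-based measurements) and derive kinematic data."""

    def __init__(self):
        self.history = []  # list of (timestamp, (x, y)) samples

    def record(self, t, position):
        """Append one timestamped position estimate."""
        self.history.append((t, position))

    def velocity(self):
        """Finite-difference velocity from the last two samples; (0, 0)
        until at least two samples have been recorded."""
        if len(self.history) < 2:
            return (0.0, 0.0)
        (t0, (x0, y0)), (t1, (x1, y1)) = self.history[-2:]
        dt = t1 - t0
        return ((x1 - x0) / dt, (y1 - y0) / dt)
```

A production tracker would smooth the noisy per-sample estimates (e.g., with a Kalman filter) rather than differencing raw samples.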
  • FIG. 7 is a flowchart of a process 701, comprising steps 703, 705 and 707, for configuring a multiple device coordination platform for recognizing gestures, and other absolute and/or relative kinematic data, performed with a handheld device, according to an embodiment.
  • The process is described with respect to FIG. 7, and it is noted that the steps of the process may be performed in any suitable order, as well as combined or separated in any suitable manner.
  • In step 703, relative motions, changes in attitude and other absolute and/or relative kinematic data of handheld devices are recorded and tracked by the coordination platform.
  • The recording and tracking may be implemented with respect to audio signals, for example, being transmitted between the devices.
  • The recording and tracking may be implemented with respect to other types of signals, such as IR signals.
  • A gesture recognition algorithm can be applied to recognize gestures made with those handheld devices. Such gestures can include, for example, pointing, panning, flicking, etc.
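A very simple gesture recognition algorithm over such tracked data might threshold displacement and speed; the thresholds, gesture names, and 1-D trace format below are illustrative assumptions, not values from the disclosure.

```python
def classify_gesture(samples):
    """Classify a 1-D motion trace, a list of (t, x) samples, as 'flick',
    'pan', or 'hold', using simple displacement and speed thresholds."""
    (t0, x0), (t1, x1) = samples[0], samples[-1]
    duration = t1 - t0
    displacement = abs(x1 - x0)
    if displacement < 0.05:          # barely moved: treat as a hold
        return "hold"
    speed = displacement / duration  # fast motion reads as a flick
    return "flick" if speed > 1.0 else "pan"
```

Real gesture recognizers typically match whole trajectories (e.g., with dynamic time warping or a trained classifier) rather than two endpoints, but the endpoint heuristic conveys the idea.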
  • FIG. 8 is a flowchart of a process 801, comprising steps 803, 805, 807, 809, 811 and 813, for configuring a multiple device coordination platform for changing the user interface of a device based on the proximity, and other absolute and/or relative kinematic data, of a second device, according to an embodiment.
  • The process is described with respect to FIG. 8, and it is noted that the steps of the process may be performed in any suitable order, as well as combined or separated in any suitable manner.
  • In steps 803 and 805 of process 801, one device broadcasts an audio signal using a speaker to estimate or calculate the distance, and other absolute and/or relative kinematic data, between two devices.
  • This audio signal is picked up by the microphone of the other device.
  • The absolute and/or relative distance, and other absolute and/or relative kinematic data, between the two devices can be measured with reasonable accuracy based on the amplitude of the received signal.
  • Per step 811, when two devices can measure each other's relative distance and other absolute and/or relative kinematic data, the user interfaces of the devices can react based on the distance and other absolute and/or relative kinematic data. For example, consider an app that allows the sharing of photos between two devices. As one device is moved closer to another, the shared photo from one device can become larger on the screen of the other device.
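The photo-sharing example can be sketched as a mapping from measured separation to a display scale factor; the distance bounds and scale range below are illustrative assumptions, not values from the disclosure.

```python
def photo_scale(distance, min_d=0.2, max_d=2.0):
    """Map the measured device separation (meters) to a display scale
    factor in [0.5, 2.0]: closer devices show the shared photo larger."""
    d = max(min_d, min(max_d, distance))  # clamp to the working range
    # Linear interpolation: scale 2.0 at min_d, down to 0.5 at max_d.
    frac = (max_d - d) / (max_d - min_d)
    return 0.5 + 1.5 * frac
```

A UI layer would call this on every new distance estimate and animate the photo toward the returned scale, so jitter in the ranging shows up as smooth resizing rather than flicker.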
  • FIG. 9 is a diagram of a computer system that can be used to implement various embodiments.
  • The computer system 900 includes a bus 901 or other communication mechanism for communicating information and one or more processors (of which one is shown) 903 coupled to the bus 901 for processing information.
  • The computer system 900 also includes main memory 905, such as a random access memory (RAM) or other dynamic storage device, coupled to the bus 901 for storing information and instructions to be executed by the processor 903.
  • Main memory 905 can also be used for storing temporary variables or other intermediate information during execution of instructions by the processor 903.
  • The computer system 900 may further include a read only memory (ROM) 907 or other static storage device coupled to the bus 901 for storing static information and instructions for the processor 903.
  • A storage device 909, such as a magnetic disk or optical disk, is coupled to the bus 901 for persistently storing information and instructions.
  • The computer system 900 may be coupled via the bus 901 to a display 911, such as a cathode ray tube (CRT), liquid crystal display, active matrix display, or plasma display, for displaying information to a participant using a computer.
  • An input device 913 is coupled to the bus 901 for communicating information and command selections to the processor 903 .
  • A cursor control 915, such as a mouse, a trackball, or cursor direction keys, is coupled to the bus 901 for communicating direction information and command selections to the processor 903 and for adjusting cursor movement on the display 911.
  • The processes described herein are performed by the computer system 900, in response to the processor 903 executing an arrangement of instructions contained in main memory 905.
  • Such instructions can be read into main memory 905 from another computer-readable medium, such as the storage device 909.
  • Execution of the arrangement of instructions contained in main memory 905 causes the processor 903 to perform the process steps described herein.
  • Processors in a multi-processing arrangement may also be employed to execute the instructions contained in main memory 905.
  • Hard-wired circuitry may be used in place of or in combination with software instructions to implement the embodiments of the invention.
  • Embodiments of the invention are not limited to any specific combination of hardware circuitry and software.
  • The computer system 900 also includes a communication interface 917 coupled to bus 901.
  • The communication interface 917 provides a two-way data communication coupling to a network link 919 connected to a local network 921.
  • The communication interface 917 may be a digital subscriber line (DSL) card or modem, an integrated services digital network (ISDN) card, a cable modem, a telephone modem, or any other communication interface to provide a data communication connection to a corresponding type of communication line.
  • The communication interface 917 may be a local area network (LAN) card (e.g., for Ethernet™ or an Asynchronous Transfer Mode (ATM) network) to provide a data communication connection to a compatible LAN.
  • Wireless links can also be implemented.
  • The communication interface 917 sends and receives electrical, electromagnetic, or optical signals that carry digital data streams representing various types of information.
  • The communication interface 917 can include peripheral interface devices, such as a Universal Serial Bus (USB) interface, a PCMCIA (Personal Computer Memory Card International Association) interface, etc.
  • The network link 919 typically provides data communication through one or more networks to other data devices.
  • The network link 919 may provide a connection through local network 921 to a host computer 923, which has connectivity to a network 925 (e.g., a wide area network (WAN) or the global packet data communication network now commonly referred to as the “Internet”) or to data equipment operated by a service provider.
  • The local network 921 and the network 925 both use electrical, electromagnetic, or optical signals to convey information and instructions.
  • The signals through the various networks, and the signals on the network link 919 and through the communication interface 917, which communicate digital data with the computer system 900, are exemplary forms of carrier waves bearing the information and instructions.
  • The computer system 900 can send messages and receive data, including program code, through the network(s), the network link 919, and the communication interface 917.
  • A server (not shown) might transmit requested code belonging to an application program for implementing an embodiment of the invention through the network 925, the local network 921 and the communication interface 917.
  • The processor 903 may execute the transmitted code while being received and/or store the code in the storage device 909, or other non-volatile storage, for later execution. In this manner, the computer system 900 may obtain application code in the form of a carrier wave.
  • Non-volatile media include, for example, optical or magnetic disks, such as the storage device 909 .
  • Volatile media include dynamic memory, such as main memory 905 .
  • Transmission media include coaxial cables, copper wire and fiber optics, including the wires that comprise the bus 901 . Transmission media can also take the form of acoustic, optical, or electromagnetic waves, such as those generated during radio frequency (RF) and infrared (IR) data communications.
  • Computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, any other magnetic medium, a CD-ROM, CDRW, DVD, any other optical medium, punch cards, paper tape, optical mark sheets, any other physical medium with patterns of holes or other optically recognizable indicia, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip or cartridge, a carrier wave, or any other medium from which a computer can read.
  • The instructions for carrying out at least part of the embodiments of the invention may initially be borne on a magnetic disk of a remote computer.
  • The remote computer loads the instructions into main memory and sends the instructions over a telephone line using a modem.
  • A modem of a local computer system receives the data on the telephone line and uses an infrared transmitter to convert the data to an infrared signal and transmit the infrared signal to a portable computing device, such as a personal digital assistant (PDA) or a laptop.
  • An infrared detector on the portable computing device receives the information and instructions borne by the infrared signal and places the data on a bus.
  • The bus conveys the data to main memory, from which a processor retrieves and executes the instructions.
  • The instructions received by main memory can optionally be stored on the storage device either before or after execution by the processor.
  • FIG. 10 illustrates a chip set or chip 1000 upon which an embodiment of the invention may be implemented.
  • Chip set 1000 is programmed to configure a multiple device coordination platform for coordinating multiple devices to estimate or calculate the distance, relative position, and device attitude as described herein and includes, for instance, the processor and memory components described with respect to FIGS. 1-2 incorporated in one or more physical packages (e.g., chips).
  • A physical package includes an arrangement of one or more materials, components, and/or wires on a structural assembly (e.g., a baseboard) to provide one or more characteristics such as physical strength, conservation of size, and/or limitation of electrical interaction.
  • The chip set 1000 can be implemented in a single chip.
  • Chip set or chip 1000 can be implemented as a single “system on a chip.” It is further contemplated that in certain embodiments a separate ASIC would not be used, for example, and that all relevant functions as disclosed herein would be performed by a processor or processors. Chip set or chip 1000, or a portion thereof, constitutes a means for performing one or more steps of configuring a multiple device coordination platform to coordinate multiple devices to estimate or calculate distance, relative position, and device attitude.
  • The chip set or chip 1000 includes a communication mechanism such as a bus 1001 for passing information among the components of the chip set 1000.
  • A processor 1003 has connectivity to the bus 1001 to execute instructions and process information stored in, for example, a memory 1005.
  • The processor 1003 may include one or more processing cores with each core configured to perform independently.
  • A multi-core processor enables multiprocessing within a single physical package. Examples of a multi-core processor include two, four, eight, or greater numbers of processing cores.
  • The processor 1003 may include one or more microprocessors configured in tandem via the bus 1001 to enable independent execution of instructions, pipelining, and multithreading.
  • The processor 1003 may also be accompanied by one or more specialized components to perform certain processing functions and tasks, such as one or more digital signal processors (DSP) 1007, or one or more application-specific integrated circuits (ASIC) 1009.
  • A DSP 1007 typically is configured to process real-world signals (e.g., sound) in real time independently of the processor 1003.
  • An ASIC 1009 can be configured to perform specialized functions not easily performed by a more general purpose processor.
  • Other specialized components to aid in performing the inventive functions described herein may include one or more field programmable gate arrays (FPGA) (not shown), one or more controllers (not shown), or one or more other special-purpose computer chips.
  • The chip set or chip 1000 includes merely one or more processors and some software and/or firmware supporting and/or relating to and/or for the one or more processors.
  • The processor 1003 and accompanying components have connectivity to the memory 1005 via the bus 1001.
  • The memory 1005 includes both dynamic memory (e.g., RAM, magnetic disk, writable optical disk, etc.) and static memory (e.g., ROM, CD-ROM, etc.) for storing executable instructions that when executed perform the inventive steps described herein to configure a multiple device coordination platform to coordinate multiple devices to estimate or calculate distance, relative position, and device attitude.
  • The memory 1005 also stores the data associated with or generated by the execution of the inventive steps.
  • The above-described functions may be carried out in many ways. For example, any suitable means for carrying out each of the functions described above may be employed to carry out embodiments of the invention. In one embodiment, all or a portion of the elements generally operate under control of a computer program product.
  • The computer program product for performing the methods of embodiments of the invention includes a computer-readable storage medium, such as the non-volatile storage medium, and computer-readable program code portions, such as a series of computer instructions, embodied in the computer-readable storage medium.
  • The words “comprise,” “comprising,” and the like are to be construed in an inclusive sense (that is to say, in the sense of “including, but not limited to”), as opposed to an exclusive or exhaustive sense.
  • The terms “connected,” “coupled,” or any variant thereof mean any connection or coupling, either direct or indirect, between two or more elements. Such a coupling or connection between the elements can be physical, logical, or a combination thereof.
  • The words “herein,” “above,” “below,” and words of similar import, when used in this application, refer to this application as a whole and not to any particular portions of this application.
  • Words in the above Detailed Description using the singular or plural number may also include the plural or singular number, respectively.
  • The word “or,” in reference to a list of two or more items, covers all of the following interpretations of the word: any of the items in the list, all of the items in the list, and any combination of the items in the list.

Abstract

The present invention contemplates a variety of improved techniques including methods and apparatus for coordinating a plurality of devices, and more specifically, a method and apparatus for coordinating a plurality of devices to estimate or calculate various distances, relative positions, device attitudes, and other absolute and/or relative kinematic data.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims the benefit of U.S. Provisional Application No. 61/665,857, entitled “COORDINATING DEVICES TO ESTIMATE DISTANCE, RELATIVE POSITION, AND DEVICE ATTITUDE”, filed Jun. 28, 2012, which is hereby incorporated by reference.
  • BACKGROUND OF INVENTION
  • 1. Field of Invention
  • The present invention relates to methods and apparatus for coordinating a plurality of devices, and more specifically, a method and apparatus for coordinating a plurality of devices to estimate or find various distances, relative positions, device attitudes, and other absolute and/or relative kinematic data.
  • 2. Summary of the Invention
  • This summary is provided to introduce in a simplified form certain concepts that are further described in the Detailed Description below. It is not intended to identify essential features of the claimed subject matter or to limit the scope of the claimed subject matter.
  • The present invention contemplates a variety of improved methods and apparatus for coordinating multiple devices to estimate or calculate various distances, relative positions, device attitudes, and other absolute and/or relative kinematic data. The present teaching provides a paradigm enabling various inventions such as:
  • (1) A means of estimating or calculating the distance between two devices;
  • (2) A means of estimating or calculating relative positions of devices;
  • (3) A means of estimating or calculating the attitude of a handheld device relative to other devices;
  • (4) A means of recognizing gestures performed with a handheld device; and
  • (5) A technique of changing the user interface of a device based on the proximity of a second device.
  • Certain aspects of this invention contemplate that two or more devices contain or connect to components capable of transmitting data between the devices (e.g., speakers and microphones, signal transmitters and receivers, etc.). Any suitable devices may be used in any combination, such as mobile phones, tablets, computers, and televisions (possibly connected via set-top boxes, game systems, DVD players, etc.), and any suitable means may be used for transmitting data (e.g., via a Wifi, cellular, or Ethernet network, or other means of transmission).
  • Other aspects of the technique will be apparent from the accompanying figures and detailed description.
  • BRIEF DESCRIPTION OF DRAWINGS
  • One or more embodiments of the present invention are illustrated by way of example and not limitation in the figures of the accompanying drawings, in which like references indicate similar elements.
  • These and other objects, features and characteristics of the present invention will become more apparent to those skilled in the art from a study of the following detailed description in conjunction with the appended claims and drawings, all of which form a part of this specification. In the drawings:
  • FIG. 1 is a diagram of a multiple device coordination platform for coordinating a plurality of devices to estimate or calculate various distances, relative positions, device attitudes, and other absolute and/or relative kinematic data, according to an embodiment of this invention;
  • FIG. 2 is a diagram of a multiple device coordination platform, according to various embodiments of this invention;
  • FIG. 3A is a flowchart of a process for configuring a multiple device coordination platform, for estimating the distance and other absolute and/or relative kinematic data between two devices, according to an embodiment of this invention;
  • FIG. 3B is a diagram of two devices according to one embodiment of this invention;
  • FIG. 3C is a diagram of two devices sensing a surrounding environment according to one embodiment of this invention;
  • FIG. 4A is a flowchart of a process for configuring a multiple device coordination platform, for estimating relative positions and other absolute and/or relative kinematic data of devices, according to an embodiment of this invention;
  • FIG. 4B is a diagram of multiple devices according to one embodiment of this invention;
  • FIG. 4C is a diagram of two devices sensing a surrounding environment according to one embodiment of this invention;
  • FIG. 5 is a flowchart of a process for configuring a multiple device coordination platform, for estimating, recording and tracking the relative motions and other absolute and/or relative kinematic data of one or more moving devices, according to an embodiment of this invention;
  • FIG. 6A is a flowchart of a process for configuring a multiple device coordination platform, for estimating, recording and tracking the change in attitudes and other absolute and/or relative kinematic data of one or more moving devices, according to an embodiment of this invention;
  • FIG. 6B illustrates the orientation and attitude of a device according to one embodiment of this invention;
  • FIG. 7 is a flowchart of a process for configuring a multiple device coordination platform, for recognizing gestures and other absolute and/or relative kinematic data performed with handheld devices, according to an embodiment of this invention;
  • FIG. 8 is a flowchart of a process for configuring a multiple device coordination platform, for changing the user interface of the device based on the proximity and other absolute and/or relative kinematic data of the other device, according to an embodiment of this invention;
  • FIG. 9 is a diagram of a computer system that can be used to implement various embodiments; and
  • FIG. 10 is a diagram of a chip set that can be used to implement an embodiment.
  • DETAILED DESCRIPTION OF THE INVENTION
  • Embodiments will now be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all embodiments are shown. Indeed, the invention may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will satisfy applicable legal requirements. Like reference numerals refer to like elements throughout.
  • A method and apparatus for configuring a multiple device coordination platform for coordinating multiple devices to estimate or calculate various distances, relative positions, device attitudes, and other absolute and/or relative kinematic data is described. In the following description, for the purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the invention. It is apparent, however, that the embodiments may be practiced without these specific details or with an equivalent arrangement. In other instances, well-known structures and devices are shown in block diagram form in order to avoid unnecessarily obscuring the embodiments.
  • Although various embodiments are described with respect to a terminal device, it is contemplated that these embodiments have applicability to any device capable of communicating over a network.
  • FIG. 1 is a diagram of a multiple device coordination platform for coordinating a plurality of devices, according to an embodiment. The multiple device coordination platform 103 is capable of estimating or calculating various distances, relative positions, device attitudes, and other absolute and/or relative kinematic data of a plurality of devices. In some embodiments, data are transmitted by requesting devices 101 a-101 n, which have a means of transmitting data between devices, to the multiple device coordination platform 103. In some embodiments, data are sensed by the devices 101 a-101 n by employing a sensor means to gather data associated with a surrounding environment in which the devices exist, and the raw sensory data gathered by the devices 101 a-101 n are transmitted to the multiple device coordination platform 103.
  • The devices 101 a-101 n may be mobile phones, Personal Digital Assistants (PDAs), Smartphones, tablets, laptops, desktops, televisions (possibly connected via set-top boxes, game systems, DVD players, etc.) and the like. The means of sensing the data may include a proximity sensor, a camera, a microphone, and the like. In one example, the sensor is a proximity sensor capable of detecting nearby objects. The proximity sensor may be, for example, an electromagnetic radiation sensor, a capacitive sensor, a photoelectric sensor, an inductive sensor, or the like. In another example, the sensor is a camera capable of receiving visible or non-visible light. In yet another example, the sensor is a microphone capable of receiving an audio signal in the form of acoustical waves. The means of transmitting the data is typically via a Wifi 111, cellular 117, or Ethernet network 113, but could include other means of transmission.
  • The approach of system 110, according to certain embodiments, enables any device 101 a-101 n (e.g., a mobile phone, a laptop, etc.) to be configured to estimate or calculate various distances, relative positions, and device attitudes. By way of this approach, a device 101 may contain or connect to transmission components associated with the device for transmitting and receiving a variety of nonintrusive signals (i.e., nonintrusive to participants using the device). In one example, a device 101 may contain or connect to speakers and microphones associated with the device for transmitting and receiving audio signals. In another example, the device 101 may contain or connect to radio frequency (RF) transmitters and receivers associated with the device for transmitting and receiving RF signals. In yet another example, the device 101 may contain or connect to infrared (IR) transmitters and receivers associated with the device for transmitting and receiving IR signals.
  • Further, a participant using the device 101 may estimate or calculate various distances, relative positions, and device attitudes and other absolute and/or relative kinematic data based on the signal (e.g., audio, optical, etc.) received from other devices. In some instances, the device 101 may be actively broadcasting a signal to other devices and receiving in response a return signal in order to estimate or calculate various distances, relative positions, and device attitudes and other absolute and/or relative kinematic data. In other instances, a device 101 may be passively sensing the environment in which it exists to receive the signal to assist in the estimation or calculation of various distances, relative positions, and device attitudes and other absolute and/or relative kinematic data. In particular, a plurality of devices, acting together, transmit cumulative raw sensory data about what they are sensing to the multiple device coordination platform 103. For example, a device 101 a may transmit an image of an environment sensed through a built-in camera of the device 101 a. Using this image alone may be insufficient. However, the image, combined with other sensory data received from the devices 101 b-101 n, such as audio data received from microphones, enables the multiple device coordination platform 103 to estimate or calculate accurately the various distances, relative positions, device attitudes, orientation, and/or other relative kinematic data (velocities, angular velocities, accelerations) about the devices 101 a-101 n.
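One conventional way to combine independent estimates from several sensors (e.g., a camera-derived distance and a microphone-derived distance) is inverse-variance weighting; this sketch is illustrative only and not part of the original disclosure.

```python
def fuse_estimates(estimates):
    """Fuse independent measurements of one quantity, each given as a
    (value, variance) pair, by inverse-variance weighting. Returns the
    fused (value, variance); more certain sensors get more weight."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    fused = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    return fused, 1.0 / total
```

Note that the fused variance is always smaller than the best single-sensor variance, which is why combining an image with audio data can outperform either alone.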
  • In some embodiments, various types of processing may be applied to the raw sensory data gathered by the devices 101 a-101 n. In one example, image recognition algorithms may be applied to raw still images or video. In another example, noise filter algorithms may be applied to raw audio data. In some embodiments, the processing of the raw sensory data may be performed by the devices 101 a-101 n themselves. In other embodiments, the processing of the raw sensory data may be performed by the multiple device coordination platform 103. A person having ordinary skill in the art will recognize the optimal configuration based on available computing power, bandwidth constraints, storage capacity, or other considerations.
  • In some embodiments, the multiple device coordination platform 103 may utilize the calculated various distances, relative positions, device attitudes, orientation, and/or other relative kinematic data (velocities, angular velocities, accelerations) about the devices 101 a-101 n for constructing a real-time spatial model of the devices 101 a-101 n used by participants at a particular venue. The real-time spatial model may be used, for example, to track the movement of the devices, to apply gesture recognition, to alter the content displayed on devices used by the participants (such as rendering special effects on the display of a device used by a participant), and the like.
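A minimal real-time spatial model might keep the latest position and attitude per device; the class below is an assumption for illustration, not the platform's actual implementation.

```python
import time

class SpatialModel:
    """Minimal real-time spatial model: the latest estimated position and
    attitude for each device id, plus a simple proximity query."""

    def __init__(self):
        self.devices = {}

    def update(self, device_id, position, attitude, timestamp=None):
        """Store the newest estimate: position (x, y, z) in meters and
        attitude (roll, pitch, yaw) in radians."""
        self.devices[device_id] = {
            "position": position,
            "attitude": attitude,
            "timestamp": timestamp if timestamp is not None else time.time(),
        }

    def nearest_pair(self):
        """Return the two device ids with the smallest separation, or None
        if fewer than two devices are known."""
        ids = list(self.devices)
        best = None
        for i, a in enumerate(ids):
            for b in ids[i + 1:]:
                pa = self.devices[a]["position"]
                pb = self.devices[b]["position"]
                d2 = sum((u - v) ** 2 for u, v in zip(pa, pb))
                if best is None or d2 < best[0]:
                    best = (d2, a, b)
        return (best[1], best[2]) if best else None
```

A query such as `nearest_pair()` is the kind of primitive the distance-reactive user interface of FIG. 8 could be built on.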
  • In some embodiments, the data received by the devices via active broadcasting and via passive sensing, as discussed above, may be utilized in parallel by the multiple device coordination platform 103 to estimate or calculate the various distances, relative positions, device attitudes, and other absolute and/or relative kinematic data.
  • The multiple device coordination platform 103 may operate in connection with a device 101 by way of a communication network. The device 101 may include various computer hardware, including the multiple device coordination platform 103. Alternatively, the multiple device coordination platform 103 may be external to the devices 101. The multiple device coordination platform 103, among other capabilities, can retrieve the data gathered by the device 101 on behalf of the respective subscribing participant.
  • In certain embodiments, devices 101 a-101 n may include cellular phones, BLUETOOTH-enabled devices, Wifi-enabled devices, satellite phones, smart phones, wireless phones, or any other suitable mobile devices, such as personal digital assistants (PDAs), pocket personal computers, tablets, customized hardware, etc., that have the ability to transmit data between devices. In addition, devices 101 a-101 n may include computers (e.g., desktop computers, laptops, web appliances, etc.) that also have the ability to transmit data between devices.
  • In system 110, according to certain embodiments, one or more networks, such as data network 113, service provider network 115, telephony network 117, and/or wireless network 111, are provided to handle various communication sessions, including voice as well as non-voice communications. Networks 111-117 may be any suitable wired and/or wireless networks. For example, telephony network 117 may include a circuit-switched network, such as the public switched telephone network (PSTN), an integrated services digital network (ISDN), a private branch exchange (PBX), or other like network.
  • Wireless network 111 may employ various technologies including, for example, code division multiple access (CDMA), enhanced data rates for global evolution (EDGE), general packet radio service (GPRS), mobile ad hoc network (MANET), global system for mobile communications (GSM), Internet protocol multimedia subsystem (IMS), universal mobile telecommunications system (UMTS), etc., as well as any other suitable wireless medium, e.g., microwave access (WiMAX), wireless fidelity (Wifi), long term evolution (LTE), satellite, and the like. Meanwhile, data network 113 may be any local area network (LAN), metropolitan area network (MAN), wide area network (WAN), the Internet, or any other suitable packet-switched network, such as a commercially owned, proprietary packet-switched network, e.g., a proprietary cable or fiber-optic network.
  • FIG. 2 is a diagram of a multiple device coordination platform 103, according to an exemplary embodiment. The multiple device coordination platform 103 includes various executable modules for performing one or more computing, data processing, and network-based instructions that in combination enable devices 101 a-101 n to estimate or calculate various distances, relative positions, device attitudes, and other absolute and/or relative kinematic data. Also, it is noted that the modules of the multiple device coordination platform 103 can be implemented in hardware, firmware, software, or a combination thereof.
  • In FIG. 2, a device, which may be a mobile device (e.g., mobile device 101 a of FIG. 1) or a computer, includes a device coordination module 201 that is configured to estimate or calculate various distances, relative positions, device attitudes, and other absolute and/or relative kinematic data of devices. The device may also include a distance calculation module 203 configured to measure or estimate the absolute/relative distance between devices. It is noted that communication sessions can be established over a circuit-switched network, a packet-switched network, or a combination thereof. Thus, a communication interface 213 can be appropriately configured depending on the transport systems and networks. Furthermore, the communication interface 213 may provide a means of collaborative data exchange and communication between the device and the multiple device coordination platform 103 (FIG. 1).
  • In one embodiment, the device may also feature the relative position determination module 205 for estimating or calculating relative positions of devices; the attitude determination module 207 for estimating, recording, and tracking the change in attitudes of one or more moving devices; the gesture recognition module 209 for recognizing gestures performed with a handheld device; and the interface reaction module 211 for changing the user interface of the device based on the proximity of another device.
  • FIGS. 3A, 4A, 5, 6A, 7 and 8 are flowcharts of a system, method, and computer program product according to an embodiment. It will be understood that each block or step of the flowcharts, and combinations of blocks in the flowcharts, may be implemented by various means, such as hardware, firmware, and/or software including one or more computer program instructions. For example, one or more of the procedures described above may be embodied by computer program instructions. In this regard, the computer program instructions, which embody the procedures described above, may be stored by a memory device of a mobile terminal, server, or other computing device and executed by a built-in processor in the computing device. As will be appreciated, any such computer program instructions may be loaded onto a computer or other programmable apparatus (i.e., hardware) to produce a machine, such that the instructions which execute on the computer or other programmable apparatus create means for implementing the functions specified in the flowchart block(s) or step(s). These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart block(s) or step(s). The computer program instructions may also be loaded onto a computer or other programmable apparatus to trigger a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart block(s) or step(s).
  • Accordingly, blocks or steps of the flowcharts support combinations of means for performing the specified functions, combinations of steps for performing the specified functions and program instruction means for performing the specified functions. It will also be understood that one or more blocks or steps of the flowcharts, and combinations of blocks or steps in the flowcharts, may be implemented by special purpose hardware-based computer systems which perform the specified functions or steps, or combinations of special purpose hardware and computer instructions.
  • Distance
  • In this regard, FIG. 3A is a flowchart of a process 301, comprising steps 303, 305, 307, 309 and 311, for configuring a multiple device coordination platform to measure the distance, and other absolute and/or relative kinematic data, between two devices, for example, 321 and 323 (of FIG. 3B), according to an embodiment. For the purpose of illustration, the process is described with respect to FIGS. 3A-3B. Further, according to the embodiment, the process is described with respect to an audio signal. However, the process may be implemented with other signals (e.g., non-intrusive optical signals, etc.) in other embodiments. It is noted that the steps of the process may be performed in any suitable order, as well as combined or separated in any suitable manner. In steps 303 and 305 of process 301, one device (e.g., 321 or 323) broadcasts an audio signal using a speaker 325 b to estimate or calculate the distance, and other absolute and/or relative kinematic data, between the two devices (e.g., 321 and 323). When the two devices are close enough, this audio signal is picked up by the microphone 325 a of the other device. Per steps 307 and 309, the absolute and/or relative distance, and other absolute and/or relative kinematic data, between the two devices can be measured with reasonable accuracy based on the amplitude of the received signal. FIG. 3B is a diagram of two devices 321 and 323 according to one further teaching. The devices 321, 323 contain or are connected to components 325 capable of broadcasting and receiving the audio signal in the embodiment discussed above.
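A minimal sketch of the amplitude-based ranging of steps 307 and 309, assuming idealized free-field propagation in which received amplitude falls off inversely with distance. The calibration value (amplitude at 1 m) is an assumed parameter, not something the specification provides.

```python
def estimate_distance(received_amplitude, reference_amplitude_1m=1.0):
    """Estimate device separation (meters) from received signal amplitude,
    assuming amplitude decays as 1/r from a calibrated 1 m reference."""
    if received_amplitude <= 0:
        raise ValueError("no signal received")
    return reference_amplitude_1m / received_amplitude

print(estimate_distance(0.5))   # 2.0: half the reference amplitude -> twice the distance
print(estimate_distance(0.25))  # 4.0
```

Real acoustics (reflections, occlusion, speaker directivity) would of course complicate this mapping; the inverse-law model is the simplest hedge on the technique described.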
  • In another embodiment, the process 301 may be performed without step 303. The process 301, according to this embodiment, starts with a device, for example, 327 (of FIG. 3C), sensing, or detecting, a signal from another device, such as 329 (of FIG. 3C), as indicated in step 305. The signal may be passively present with respect to the device 329 without the device 329 broadcasting the signal to the device 327. One example of the signal is an image sensed, or gathered, by a proximity sensor 331 of the device 327 sensing the nearby existence of the device 329. In the same way, a signal (e.g., a sensed image) of the device 327 may be sensed by the device 329, as indicated by step 305. Per steps 307 and 309, the absolute and/or relative distance, and other absolute and/or relative kinematic data, between the two devices 327 and 329 can be measured with reasonable accuracy based on the signal sensed by the device 327 (and/or 329). FIG. 3C is a diagram of the two devices 327 and 329 according to one further teaching of the embodiment, where each device contains or is connected to a sensor 331 (e.g., a proximity sensor) capable of gathering raw sensory data.
  • Relative Position and Attitude
  • FIG. 4A is a flowchart of a process 401, comprising steps 403, 405, 407, 409 and 411, for configuring a multiple device coordination platform to estimate or calculate relative positions, and other absolute and/or relative kinematic data, of devices, for example, 421, 423, 425 and 427 (of FIG. 4B), according to an embodiment. For the purpose of illustration, the process is described with respect to FIGS. 4A-4B. Again, it is noted that the steps of the process may be performed in any suitable order, as well as combined or separated in any suitable manner. Further, according to the embodiment, the process is described with respect to an audio signal. However, the process may be implemented in other embodiments with respect to other types of signals (e.g., non-intrusive optical signals, etc.). In certain embodiments, in situations where more than two devices are present in the local environment, relative positions of these devices can be estimated or calculated based on the differences in signal received by multiple devices. FIG. 4B is a diagram of multiple devices 421, 423, 425 and 427 according to one further teaching. In steps 403 and 405 of process 401, multiple devices (e.g., 421, 423, 425 and 427) broadcast audio signals using speakers 429 b to estimate or calculate the relative positions, and other absolute and/or relative kinematic data, between those devices. When those devices are close enough, each audio signal will be picked up by the microphones 429 a of the other devices. Per steps 407 and 409, the relative positions, and other absolute and/or relative kinematic data, between those devices can be measured with reasonable accuracy based on the amplitudes of the received signals.
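Once per-device distance estimates exist, relative position can be recovered geometrically. The following sketch assumes ideal 2-D trilateration against three receivers with known coordinates, which is one plausible realization of steps 407-409 rather than the specific method of the specification; it solves the linear system obtained by subtracting the circle equations pairwise.

```python
def trilaterate(anchors, distances):
    """Solve for the (x, y) of a broadcasting device given three receiver
    positions (anchors) and the estimated distance to each."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = distances
    # Subtracting the circle equations eliminates x^2 + y^2, leaving a
    # 2x2 linear system a*x + b*y = c.
    a1, b1 = 2 * (x1 - x3), 2 * (y1 - y3)
    c1 = d3**2 - d1**2 + x1**2 - x3**2 + y1**2 - y3**2
    a2, b2 = 2 * (x2 - x3), 2 * (y2 - y3)
    c2 = d3**2 - d2**2 + x2**2 - x3**2 + y2**2 - y3**2
    det = a1 * b2 - a2 * b1  # zero if the anchors are collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# A device at (1, 1) heard by receivers at three assumed positions:
anchors = [(0.0, 0.0), (2.0, 0.0), (0.0, 2.0)]
dists = [2 ** 0.5, 2 ** 0.5, 2 ** 0.5]
print(trilaterate(anchors, dists))  # (1.0, 1.0)
```

With noisy amplitude-derived distances, a least-squares variant over more than three receivers would be the natural extension.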
  • In another embodiment, the process 401 starts with a device, for example, 433 (of FIG. 4C), passively sensing, or detecting, signals from other devices, such as 435, 437, and 439 (of FIG. 4C), as indicated in step 405. The signals may be passively present with respect to the devices 435-439 without being broadcast by those devices. One example of the signals is images sensed, or gathered, by a proximity sensor 431 of the device 433 sensing the nearby existence of the devices 435-439. In the same way as the device 433, each of the devices 435-439 senses signals of one another using the proximity sensor 431 to estimate or calculate the relative positions, and other absolute and/or relative kinematic data, between the devices (e.g., 433, 435, 437 and 439). Per steps 407 and 409, the absolute and/or relative distances, and other absolute and/or relative kinematic data, between the devices 433, 435, 437, and 439 can be measured with reasonable accuracy based on the differences in signal received by the multiple devices. FIG. 4C is a diagram of the devices 433, 435, 437, and 439, each containing or connected to a sensor 431 (e.g., a proximity sensor) capable of gathering raw sensory data, according to one further teaching of the embodiment.
  • FIG. 5 is a flowchart of a process 501, comprising steps 503, 505, 507, 509 and 511, for configuring a multiple device coordination platform to estimate, record, and track the relative motion, and other absolute and/or relative kinematic data, of a moving device, according to an embodiment. FIG. 6A is a flowchart of a process 601, comprising steps 603, 605, 607, 609 and 611, for configuring a multiple device coordination platform to estimate, record, and track the change in attitude, and other absolute and/or relative kinematic data, of a moving device, according to an embodiment. FIG. 6B illustrates the orientation 621 and the attitude of the device according to one further teaching. It should be noted that "attitude" in this context refers to the rotation of the device about its three axes in three-dimensional space.
  • For the purpose of illustration, the processes are described with respect to FIGS. 5 and 6A-6B. According to the embodiment, the processes are described with respect to an audio signal. However, the processes may be implemented in other embodiments with respect to other types of signals (e.g., non-intrusive optical signals, etc.). Again, it is noted that the steps of each process (FIG. 5 and FIG. 6A, respectively) may be performed in any suitable order, as well as combined or separated in any suitable manner. In certain embodiments, in the case of a moving device, such as a handheld smart phone, the relative motion, change in attitude, and other absolute and/or relative kinematic data of the moving device can be recorded using the same general technique of comparing the audio signals detected by multiple receivers and tracking the changes in those signals over time. In step 503 of process 501, and step 603 of process 601, the moving device broadcasts the audio signal using its speakers. When the moving device and the other devices are close enough, this audio signal is picked up by the microphones of the other devices to estimate or calculate the relative motion and the change in attitude of the moving device. Per steps 505-509 and 605-609, the relative motion, the change in attitude, and other absolute and/or relative kinematic data of the moving device can be measured with reasonable accuracy based on the amplitude of the received signal. In steps 507-509 and 607-609, those signals, and other absolute and/or relative kinematic data, can be recorded and tracked over time.
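As a hypothetical illustration of recording and tracking over time (steps 507-509 and 607-609), the sketch below derives per-interval relative velocities from a series of timestamped distance estimates; the timestamps and readings are invented, and a real tracker would smooth rather than difference raw estimates.

```python
def track_relative_velocity(samples):
    """samples: list of (time_s, distance_m) pairs, ordered by time.
    Returns the relative velocity (m/s) over each consecutive interval."""
    velocities = []
    for (t0, d0), (t1, d1) in zip(samples, samples[1:]):
        # Finite difference of distance over time between samples.
        velocities.append((d1 - d0) / (t1 - t0))
    return velocities

# A device closing from 2 m to 1 m over one second of tracking:
readings = [(0.0, 2.0), (0.5, 1.5), (1.0, 1.0)]
print(track_relative_velocity(readings))  # [-1.0, -1.0]: closing at 1 m/s
```

The same differencing idea, applied per receiver, is what lets changes in attitude (rotation about the device's axes) be inferred from how the signal at each microphone shifts over time.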
  • Gesture Recognition
  • FIG. 7 is a flowchart of a process 701, comprising steps 703, 705 and 707, for configuring a multiple device coordination platform to recognize gestures performed with a handheld device based on absolute and/or relative kinematic data, according to an embodiment. For the purpose of illustration, the process is described with respect to FIG. 7, and it is noted that the steps of the process may be performed in any suitable order, as well as combined or separated in any suitable manner. In step 703, relative motions, changes in attitude, and other absolute and/or relative kinematic data of handheld devices are recorded and tracked by the coordination platform. The recording and tracking may be implemented with respect to audio signals, for example, being transmitted between the devices. In other embodiments, the recording and tracking may be implemented with respect to other types of signals, such as IR signals. In step 705, because the relative motions, changes in attitude, and other absolute and/or relative kinematic data of handheld devices can be recorded and tracked, a gesture recognition algorithm can be applied to recognize gestures made with those handheld devices. Such gestures can include, for example, pointing, panning, flicking, etc.
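A deliberately simple sketch of the gesture classification in step 705: labeling a tracked motion trace a "flick" when its sampled speed briefly exceeds a threshold, and a "pan" otherwise. The threshold, labels, and trace values are illustrative assumptions, not the algorithm of the specification.

```python
def classify_gesture(speeds, flick_threshold=2.0):
    """speeds: sampled device speeds (m/s) from the tracked kinematic data.
    A flick is modeled as a brief spike above the threshold; slower,
    sustained motion is treated as a pan."""
    if max(speeds) >= flick_threshold:
        return "flick"
    return "pan"

print(classify_gesture([0.1, 0.3, 2.5, 0.2]))  # flick
print(classify_gesture([0.4, 0.5, 0.4, 0.5]))  # pan
```

A production recognizer would operate on full motion-plus-attitude traces (e.g., with template matching or a trained classifier) rather than a single speed threshold.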
  • User Interface Change Based on Proximity
  • FIG. 8 is a flowchart of a process 801, comprising steps 803, 805, 807, 809, 811 and 813, for configuring a multiple device coordination platform to change the user interface of a device based on the proximity, and other absolute and/or relative kinematic data, of a second device, according to an embodiment. For the purpose of illustration, the process is described with respect to FIG. 8, and it is noted that the steps of the process may be performed in any suitable order, as well as combined or separated in any suitable manner. In steps 803 and 805 of process 801, one device broadcasts an audio signal using a speaker to estimate or calculate the distance, and other absolute and/or relative kinematic data, between two devices. When the two devices are close enough, this audio signal is picked up by the microphone of the other device. Per steps 807 and 809, the absolute and/or relative distance, and other absolute and/or relative kinematic data, between the two devices can be measured with reasonable accuracy based on the amplitude of the received signal.
  • In step 811, when two devices can measure each other's relative distance and other absolute and/or relative kinematic data, the user interfaces of the devices can react based on the distance and other absolute and/or relative kinematic data. For example, consider an app that allows the sharing of photos between two devices. As one device is moved closer to another, the shared photo from one device can become larger on the screen of the other device.
  • FIG. 9 is a diagram of a computer system that can be used to implement various embodiments. The computer system 900 includes a bus 901 or other communication mechanism for communicating information and one or more processors (of which one is shown) 903 coupled to the bus 901 for processing information. The computer system 900 also includes main memory 905, such as a random access memory (RAM) or other dynamic storage device, coupled to the bus 901 for storing information and instructions to be executed by the processor 903. Main memory 905 can also be used for storing temporary variables or other intermediate information during execution of instructions by the processor 903. The computer system 900 may further include a read only memory (ROM) 907 or other static storage device coupled to the bus 901 for storing static information and instructions for the processor 903. A storage device 909, such as a magnetic disk or optical disk, is coupled to the bus 901 for persistently storing information and instructions.
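The photo-sharing reaction of step 811 can be sketched, under assumed parameters, as a mapping from measured device separation to a display scale factor that grows as the devices move closer. The reference distance and clamping range below are invented for illustration.

```python
def photo_scale(distance_m, reference_m=1.0, min_scale=0.25, max_scale=4.0):
    """Map measured device separation to a display scale for a shared photo:
    at the reference distance the photo renders at 1.0x; moving closer
    enlarges it, moving away shrinks it, clamped to a sensible range."""
    scale = reference_m / max(distance_m, 1e-6)  # guard against zero distance
    return max(min_scale, min(max_scale, scale))

print(photo_scale(2.0))   # 0.5: farther away -> smaller photo
print(photo_scale(0.5))   # 2.0: closer -> larger photo
```

The UI would re-evaluate this mapping each time the distance estimate from steps 807-809 updates.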
  • The computer system 900 may be coupled via the bus 901 to a display 911, such as a cathode ray tube (CRT), liquid crystal display, active matrix display, or plasma display, for displaying information to a participant using a computer. An input device 913, such as a keyboard including alphanumeric and other keys, is coupled to the bus 901 for communicating information and command selections to the processor 903. Another type of input device is a cursor control 915, such as a mouse, a trackball, or cursor direction keys, for communicating direction information and command selections to the processor 903 and for adjusting cursor movement on the display 911.
  • According to an embodiment of the invention, the processes described herein are performed by the computer system 900, in response to the processor 903 executing an arrangement of instructions contained in main memory 905. Such instructions can be read into main memory 905 from another computer-readable medium, such as the storage device 909. Execution of the arrangement of instructions contained in main memory 905 causes the processor 903 to perform the process steps described herein. One or more processors in a multi-processing arrangement may also be employed to execute the instructions contained in main memory 905. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions to implement the embodiment of the invention. Thus, embodiments of the invention are not limited to any specific combination of hardware circuitry and software.
  • The computer system 900 also includes a communication interface 917 coupled to bus 901. The communication interface 917 provides a two-way data communication coupling to a network link 919 connected to a local network 921. For example, the communication interface 917 may be a digital subscriber line (DSL) card or modem, an integrated services digital network (ISDN) card, a cable modem, a telephone modem, or any other communication interface to provide a data communication connection to a corresponding type of communication line. As another example, communication interface 917 may be a local area network (LAN) card (e.g. for Ethernet™ or an Asynchronous Transfer Mode (ATM) network) to provide a data communication connection to a compatible LAN. Wireless links can also be implemented. In any such implementation, communication interface 917 sends and receives electrical, electromagnetic, or optical signals that carry digital data streams representing various types of information. Further, the communication interface 917 can include peripheral interface devices, such as a Universal Serial Bus (USB) interface, a PCMCIA (Personal Computer Memory Card International Association) interface, etc.
  • The network link 919 typically provides data communication through one or more networks to other data devices. For example, the network link 919 may provide a connection through local network 921 to a host computer 923, which has connectivity to a network 925 (e.g. a wide area network (WAN) or the global packet data communication network now commonly referred to as the “Internet”) or to data equipment operated by a service provider. The local network 921 and the network 925 both use electrical, electromagnetic, or optical signals to convey information and instructions. The signals through the various networks, and the signals on the network link 919 and through the communication interface 917, which communicate digital data with the computer system 900, are exemplary forms of carrier waves bearing the information and instructions.
  • The computer system 900 can send messages and receive data, including program code, through the network(s), the network link 919, and the communication interface 917. In the Internet example, a server (not shown) might transmit requested code belonging to an application program for implementing an embodiment of the invention through the network 925, the local network 921 and the communication interface 917. The processor 903 may execute the transmitted code as it is received and/or store the code in the storage device 909, or other non-volatile storage, for later execution. In this manner, the computer system 900 may obtain application code in the form of a carrier wave.
  • The term "computer-readable medium" as used herein refers to any medium that participates in providing instructions to the processor 903 for execution. Such a medium may take many forms, including but not limited to computer-readable storage media (i.e., non-transitory media, such as non-volatile media and volatile media) and transmission media. Non-volatile media include, for example, optical or magnetic disks, such as the storage device 909. Volatile media include dynamic memory, such as main memory 905. Transmission media include coaxial cables, copper wire and fiber optics, including the wires that comprise the bus 901. Transmission media can also take the form of acoustic, optical, or electromagnetic waves, such as those generated during radio frequency (RF) and infrared (IR) data communications. Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, hard disk, magnetic tape, any other magnetic medium, a CD-ROM, CDRW, DVD, any other optical medium, punch cards, paper tape, optical mark sheets, any other physical medium with patterns of holes or other optically recognizable indicia, a RAM, a PROM, an EPROM, a FLASH-EPROM, any other memory chip or cartridge, a carrier wave, or any other medium from which a computer can read.
  • Various forms of computer-readable media may be involved in providing instructions to a processor for execution. For example, the instructions for carrying out at least part of the embodiments of the invention may initially be borne on a magnetic disk of a remote computer. In such a scenario, the remote computer loads the instructions into main memory and sends the instructions over a telephone line using a modem. A modem of a local computer system receives the data on the telephone line and uses an infrared transmitter to convert the data to an infrared signal and transmit the infrared signal to a portable computing device, such as a personal digital assistant (PDA) or a laptop. An infrared detector on the portable computing device receives the information and instructions borne by the infrared signal and places the data on a bus. The bus conveys the data to main memory, from which a processor retrieves and executes the instructions. The instructions received by main memory can optionally be stored on storage device either before or after execution by processor.
  • FIG. 10 illustrates a chip set or chip 1000 upon which an embodiment of the invention may be implemented. Chip set 1000 is programmed to configure a multiple device coordination platform for coordinating multiple devices to estimate or calculate the distance, relative position, and device attitude as described herein and includes, for instance, the processor and memory components described with respect to FIG. 1-2 incorporated in one or more physical packages (e.g., chips). By way of example, a physical package includes an arrangement of one or more materials, components, and/or wires on a structural assembly (e.g., a baseboard) to provide one or more characteristics such as physical strength, conservation of size, and/or limitation of electrical interaction. It is contemplated that in certain embodiments the chip set 1000 can be implemented in a single chip. It is further contemplated that in certain embodiments the chip set or chip 1000 can be implemented as a single "system on a chip." It is further contemplated that in certain embodiments a separate ASIC would not be used, for example, and that all relevant functions as disclosed herein would be performed by a processor or processors. Chip set or chip 1000, or a portion thereof, constitutes a means for performing one or more steps of configuring a multiple device coordination platform to coordinate multiple devices to estimate or calculate the distance, relative position, and device attitude.
  • In one embodiment, the chip set or chip 1000 includes a communication mechanism such as a bus 1001 for passing information among the components of the chip set 1000. A processor 1003 has connectivity to the bus 1001 to execute instructions and process information stored in, for example, a memory 1005. The processor 1003 may include one or more processing cores with each core configured to perform independently. A multi-core processor enables multiprocessing within a single physical package. Examples of a multi-core processor include two, four, eight, or greater numbers of processing cores. Alternatively or in addition, the processor 1003 may include one or more microprocessors configured in tandem via the bus 1001 to enable independent execution of instructions, pipelining, and multithreading. The processor 1003 may also be accompanied with one or more specialized components to perform certain processing functions and tasks such as one or more digital signal processors (DSP) 1007, or one or more application-specific integrated circuits (ASIC) 1009. A DSP 1007 typically is configured to process real-world signals (e.g., sound) in real time independently of the processor 1003. Similarly, an ASIC 1009 can be configured to perform specialized functions not easily performed by a more general purpose processor. Other specialized components to aid in performing the inventive functions described herein may include one or more field programmable gate arrays (FPGA) (not shown), one or more controllers (not shown), or one or more other special-purpose computer chips.
  • In one embodiment, the chip set or chip 1000 includes merely one or more processors and some software and/or firmware supporting and/or relating to and/or for the one or more processors.
  • The processor 1003 and accompanying components have connectivity to the memory 1005 via the bus 1001. The memory 1005 includes both dynamic memory (e.g., RAM, magnetic disk, writable optical disk, etc.) and static memory (e.g., ROM, CD-ROM, etc.) for storing executable instructions that when executed perform the inventive steps described herein to configure a multiple device coordination platform to coordinate multiple devices to estimate or calculate the distance, relative position, and device attitude. The memory 1005 also stores the data associated with or generated by the execution of the inventive steps.
  • While certain exemplary embodiments and implementations have been described herein, other embodiments and modifications will be apparent from this description. Accordingly, the invention is not limited to such embodiments, but rather extends to the broader scope of the presented claims and various obvious modifications and equivalent arrangements.
  • The above described functions may be carried out in many ways. For example, any suitable means for carrying out each of the functions described above may be employed to carry out embodiments of the invention. In one embodiment, all or a portion of the elements generally operate under control of a computer program product. The computer program product for performing the methods of embodiments of the invention includes a computer-readable storage medium, such as the non-volatile storage medium, and computer-readable program code portions, such as a series of computer instructions, embodied in the computer-readable storage medium.
  • Many modifications and other embodiments of the inventions set forth herein will come to mind to one skilled in the art to which these inventions pertain having the benefit of the teachings presented in the foregoing descriptions and the associated drawings. Therefore, it is to be understood that the embodiments of the invention are not to be limited to the specific embodiments disclosed and that modifications and other embodiments are intended to be included within the scope of the appended claims. Moreover, although the foregoing descriptions and the associated drawings describe exemplary embodiments in the context of certain exemplary combinations of elements and/or functions, it should be appreciated that different combinations of elements and/or functions may be provided by alternative embodiments without departing from the scope of the appended claims. In this regard, for example, different combinations of elements and/or functions than those explicitly described above are also contemplated as may be set forth in some of the appended claims. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.
  • CONCLUSION
  • Unless the context clearly requires otherwise, throughout the description and the claims, the words “comprise,” “comprising,” and the like are to be construed in an inclusive sense (i.e., to say, in the sense of “including, but not limited to”), as opposed to an exclusive or exhaustive sense. As used herein, the terms “connected,” “coupled,” or any variant thereof means any connection or coupling, either direct or indirect, between two or more elements. Such a coupling or connection between the elements can be physical, logical, or a combination thereof. Additionally, the words “herein,” “above,” “below,” and words of similar import, when used in this application, refer to this application as a whole and not to any particular portions of this application. Where the context permits, words in the above Detailed Description using the singular or plural number may also include the plural or singular number respectively. The word “or,” in reference to a list of two or more items, covers all of the following interpretations of the word: any of the items in the list, all of the items in the list, and any combination of the items in the list.
  • The above Detailed Description of examples of the invention is not intended to be exhaustive or to limit the invention to the precise form disclosed above. While specific examples for the invention are described above for illustrative purposes, various equivalent modifications are possible within the scope of the invention, as those skilled in the relevant art will recognize. While processes or blocks are presented in a given order in this application, alternative implementations may perform routines having steps performed in a different order, or employ systems having blocks in a different order. Some processes or blocks may be deleted, moved, added, subdivided, combined, and/or modified to provide alternatives or subcombinations. Also, while processes or blocks are at times shown as being performed in series, these processes or blocks may instead be performed or implemented in parallel, or may be performed at different times. Further, any specific numbers noted herein are only examples. It is understood that alternative implementations may employ differing values or ranges.
  • The various illustrations and teachings provided herein can also be applied to systems other than the system described above. The elements and acts of the various examples described above can be combined to provide further implementations of the invention.
  • Any patents and applications and other references noted above, including any that may be listed in accompanying filing papers, are incorporated herein by reference. Aspects of the invention can be modified, if necessary, to employ the systems, functions, and concepts included in such references to provide further implementations of the invention.
  • These and other changes can be made to the invention in light of the above Detailed Description. While the above description describes certain examples of the invention, and describes the best mode contemplated, no matter how detailed the above appears in text, the invention can be practiced in many ways. Details of the system may vary considerably in its specific implementation, while still being encompassed by the invention disclosed herein. As noted above, particular terminology used when describing certain features or aspects of the invention should not be taken to imply that the terminology is being redefined herein to be restricted to any specific characteristics, features, or aspects of the invention with which that terminology is associated. In general, the terms used in the following claims should not be construed to limit the invention to the specific examples disclosed in the specification, unless the above Detailed Description section explicitly defines such terms. Accordingly, the actual scope of the invention encompasses not only the disclosed examples, but also all equivalent ways of practicing or implementing the invention under the claims.
  • While certain aspects of the invention are presented below in certain claim forms, the applicant contemplates the various aspects of the invention in any number of claim forms. For example, while only one aspect of the invention is recited as a means-plus-function claim under 35 U.S.C. §112, sixth paragraph, other aspects may likewise be embodied as a means-plus-function claim, or in other forms, such as being embodied in a computer-readable medium. (Any claims intended to be treated under 35 U.S.C. §112, ¶6 will begin with the words “means for.”) Accordingly, the applicant reserves the right to add additional claims after filing the application to pursue such additional claim forms for other aspects of the invention.

Claims (21)

What is claimed is:
1-51. (canceled)
52. A method of determining a spatial model of an environment by processing information gathered by a plurality of active devices located within the environment, the method comprising:
transmitting a first transmission from an active device into the environment, wherein the first transmission contains a first transmission characteristic associated with the active device and the location of the first transmission within the environment;
receiving a second transmission from another active device located within the environment, wherein the second transmission includes a second transmission characteristic associated with the other active device and the location of the second transmission within the environment;
processing the first transmission characteristic and the second transmission characteristic to determine a first portion of a spatial model of the environment with respect to the location of the first transmission;
processing the first transmission characteristic and the second transmission characteristic to determine a second portion of the spatial model of the environment with respect to the location of the second transmission; and
processing the first portion of the spatial model and the second portion of the spatial model to determine a third portion of the spatial model indicative of spatial locations of at least some of the plurality of active devices within the environment.
53. The method of claim 52, wherein the first transmission comprises an audio signal transmission.
54. The method of claim 53, wherein the first transmission characteristic comprises an amplitude of the audio signal transmission.
55. The method of claim 52, wherein the second transmission comprises an audio signal transmission.
56. The method of claim 55, wherein the second transmission characteristic comprises an amplitude of the audio signal transmission.
57. The method of claim 52, wherein processing the first portion of the spatial model and the second portion of the spatial model to determine the third portion of the spatial model further comprises processing other data gathered from the environment about the location of the first transmission and about the location of the second transmission.
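Claims 53 through 56 recite audio signal transmissions whose amplitude serves as the transmission characteristic. The claims do not specify how an amplitude maps to a distance; a minimal sketch of one conventional approach is the free-field inverse-distance law, in which received amplitude falls off as 1/r. The function name, reference amplitude, and reference distance below are illustrative assumptions, not part of the patent.

```python
def estimate_distance(received_amplitude, reference_amplitude, reference_distance=1.0):
    """Estimate range (in the units of reference_distance) from a received
    audio amplitude, assuming free-field spherical spreading (amplitude ~ 1/r).
    reference_amplitude is the amplitude measured at reference_distance during
    a hypothetical calibration step."""
    if received_amplitude <= 0 or reference_amplitude <= 0:
        raise ValueError("amplitudes must be positive")
    return reference_distance * reference_amplitude / received_amplitude

# A tone calibrated to amplitude 1.0 at 1 m, received at amplitude 0.25,
# suggests the transmitting device is roughly 4 m away.
print(estimate_distance(0.25, 1.0))  # 4.0
```

In practice, reflections, occlusion, and microphone gain differences distort this model, which is one reason the claims combine characteristics from multiple transmissions rather than relying on a single one-way measurement.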
58. A method of processing data from a plurality of devices located proximate to each other within an environment to determine a spatial model indicative of the positions of the plurality of devices within the environment, the method comprising:
receiving a first transmission from a first active device, wherein the first transmission includes a first transmission characteristic associated with a location of the first transmission and data associated with the environment about the first transmission location;
receiving a second transmission from a second active device, wherein the second transmission includes a second transmission characteristic associated with a location of the second transmission and data associated with the environment about the second transmission location;
processing the first transmission characteristic with respect to the location of the second transmission to generate a first portion of a spatial model indicative of at least the relative distance between the first active device and the second active device;
processing the second transmission characteristic with respect to the location of the first transmission to generate a second portion of the spatial model associated with at least the relative distance between the first active device and the second active device; and
analyzing the first portion of the spatial model and the second portion of the spatial model to determine a third portion of the spatial model indicative of the proximity of the first active device and second active device to each other within the environment.
59. The method of claim 58, wherein the first transmission comprises a nonintrusive signal.
60. The method of claim 58, further comprising coordinating the transmission of the first transmission and the second transmission.
61. The method of claim 58, wherein the data associated with the environment about the location of the first transmission comprises data received from a plurality of sensors configured to sense the environment.
62. The method of claim 61, wherein at least one of the plurality of sensors comprises an audio signal sensor.
63. The method of claim 58, wherein the first transmission characteristic is associated with the second transmission.
64. The method of claim 58, wherein analyzing the first portion of the spatial model comprises determining a relative attitude between the first active device and the second active device.
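Claim 58 builds two independent portions of the spatial model (device A ranging device B, and device B ranging device A) and then analyzes both to determine proximity. The claims leave the fusion step open; a minimal sketch is to average the two one-way range estimates and compare against a proximity threshold. The function name, threshold value, and dictionary layout are illustrative assumptions only.

```python
def fuse_range_estimates(d_ab, d_ba, threshold=2.0):
    """Combine two independent one-way range estimates in metres
    (A measuring B, and B measuring A) into a single fused distance,
    here by simple averaging. The threshold (hypothetical) classifies
    whether the two devices are proximate within the environment."""
    fused = 0.5 * (d_ab + d_ba)
    return {"distance": fused, "proximate": fused <= threshold}

# Two slightly disagreeing one-way estimates fuse to 4.0 m,
# outside the 2.0 m proximity threshold.
print(fuse_range_estimates(3.8, 4.2))
```

Averaging the bidirectional estimates tends to cancel device-specific biases (e.g., mismatched speaker output levels), which is a plausible motivation for having each active device both transmit and receive.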
65. A computer-readable storage medium storing code configured to direct one or more processors associated with one or more computer systems to process data from a plurality of devices located proximate to each other within an environment to determine a spatial model indicative of the positions of the plurality of devices within the environment, the computer-readable storage medium comprising:
code for receiving a first transmission from a first active device, wherein the first transmission includes a first transmission characteristic associated with the location of the first transmission and data associated with the environment about the location of the first transmission;
code for receiving a second transmission from a second active device, wherein the second transmission includes a second transmission characteristic associated with the location of the second transmission and data associated with the environment about the location of the second transmission;
code for processing the first transmission characteristic with respect to the location of the second transmission to determine a first portion of a spatial model indicative of at least the relative distance between the first active device and the second active device;
code for processing the second transmission characteristic with respect to the location of the first transmission to determine a second portion of the spatial model associated with at least the relative distance between the first active device and the second active device; and
code for analyzing the first portion of the spatial model and the second portion of the spatial model to determine a third portion of the spatial model indicative of the proximity of the first active device and second active device to each other within the environment.
66. The computer-readable storage medium of claim 65, wherein the first transmission comprises a nonintrusive signal.
67. The computer-readable storage medium of claim 65, further comprising code for coordinating the transmission of the first transmission and the second transmission.
68. The computer-readable storage medium of claim 65, wherein the data associated with the environment about the first transmission location comprises data received from a plurality of sensors configured to sense the environment.
69. The computer-readable storage medium of claim 68, wherein at least one of the plurality of sensors comprises an audio signal sensor.
70. The computer-readable storage medium of claim 65, wherein the first transmission characteristic is associated with the second transmission.
71. The computer-readable storage medium of claim 65, wherein code for analyzing the first portion of the spatial model comprises code for determining a relative attitude between the first active device and the second active device.
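Claims 64 and 71 recite determining a relative attitude between the two active devices. The claims do not commit to a technique, but one conventional ingredient is a bearing estimate from the time-difference-of-arrival (TDOA) of an audio transmission at a two-microphone pair, using the far-field approximation. The constants and function name below are illustrative, not from the patent.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C (assumed)

def bearing_from_tdoa(delta_t, mic_spacing):
    """Far-field bearing (radians, relative to the broadside direction of a
    two-microphone pair separated by mic_spacing metres) of a sound source,
    from the arrival-time difference delta_t in seconds."""
    s = SPEED_OF_SOUND * delta_t / mic_spacing
    s = max(-1.0, min(1.0, s))  # clamp against measurement noise
    return math.asin(s)

# Simultaneous arrival at both microphones implies the source is
# broadside to the pair: bearing 0 rad.
print(bearing_from_tdoa(0.0, 0.1))  # 0.0
```

Two such bearings measured in opposite directions (each device locating the other) constrain the relative orientation of the devices, which is one hedged reading of how the "relative attitude" of claims 64 and 71 could be derived from the claimed transmission characteristics.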
US13/815,920 2012-06-28 2013-03-15 Coordinating devices to estimate distance, relative position, and device attitude Abandoned US20140003195A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US13/815,920 US20140003195A1 (en) 2012-06-28 2013-03-15 Coordinating devices to estimate distance, relative position, and device attitude
US15/052,393 US9929798B2 (en) 2012-06-28 2016-02-24 Coordinating devices to estimate distance, relative position, and device attitude

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201261665857P 2012-06-28 2012-06-28
US13/815,920 US20140003195A1 (en) 2012-06-28 2013-03-15 Coordinating devices to estimate distance, relative position, and device attitude

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/052,393 Continuation US9929798B2 (en) 2012-06-28 2016-02-24 Coordinating devices to estimate distance, relative position, and device attitude

Publications (1)

Publication Number Publication Date
US20140003195A1 true US20140003195A1 (en) 2014-01-02

Family

ID=49778028

Family Applications (4)

Application Number Title Priority Date Filing Date
US13/815,920 Abandoned US20140003195A1 (en) 2012-06-28 2013-03-15 Coordinating devices to estimate distance, relative position, and device attitude
US13/815,918 Abandoned US20140004797A1 (en) 2012-06-28 2013-03-15 Coordinating devices to estimate distance, relative position, and device attitude
US15/052,393 Active US9929798B2 (en) 2012-06-28 2016-02-24 Coordinating devices to estimate distance, relative position, and device attitude
US15/140,319 Active US9729227B2 (en) 2012-06-28 2016-04-27 Coordinating devices to estimate distance, relative position, and device attitude

Family Applications After (3)

Application Number Title Priority Date Filing Date
US13/815,918 Abandoned US20140004797A1 (en) 2012-06-28 2013-03-15 Coordinating devices to estimate distance, relative position, and device attitude
US15/052,393 Active US9929798B2 (en) 2012-06-28 2016-02-24 Coordinating devices to estimate distance, relative position, and device attitude
US15/140,319 Active US9729227B2 (en) 2012-06-28 2016-04-27 Coordinating devices to estimate distance, relative position, and device attitude

Country Status (1)

Country Link
US (4) US20140003195A1 (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150180986A1 (en) * 2013-12-20 2015-06-25 International Business Machines Corporation Providing a Sensor Composite Service Based on Operational and Spatial Constraints
US20170041740A1 (en) * 2015-08-04 2017-02-09 Kabushiki Kaisha Toshiba Radio-apparatus-disposition estimation device, estimation method, and non-transitory computer readable medium
US9729227B2 (en) 2012-06-28 2017-08-08 Wickr Inc. Coordinating devices to estimate distance, relative position, and device attitude
US20180150129A1 (en) * 2016-11-30 2018-05-31 Marcus Allen Thomas Systems and methods for adaptive user interface dynamics based on proximity profiling
JP2018094765A (en) * 2016-12-09 2018-06-21 京セラドキュメントソリューションズ株式会社 Device arrangement presentation system
US10575276B1 (en) * 2019-05-21 2020-02-25 At&T Intellectual Property I, L.P. User equipment localization through time series search
CN113679379A (en) * 2021-07-14 2021-11-23 深圳大学 Human body posture estimation method, device, equipment, system and medium based on sound waves
US20220200711A1 (en) * 2020-12-22 2022-06-23 Kabushiki Kaisha Toshiba Electronic apparatus, electronic system, and method
US20220326331A1 (en) * 2021-04-09 2022-10-13 LouStat Technologies, LLC Systems and Methods for Enhancing Location of Game in the Field

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9058096B2 (en) * 2013-10-21 2015-06-16 Google Inc. Methods and systems for indicating application data use and providing data according to permissions
DE102017006849A1 (en) * 2017-07-21 2019-01-24 Lukas HEINDL Method for determining the relative positions of at least two mobile terminals relative to one another
US20190107987A1 (en) * 2017-10-10 2019-04-11 Cisco Technology, Inc. Automated configuration of multiple collaboration endpoints
US10733698B2 (en) * 2018-04-06 2020-08-04 Groundspeak, Inc. System and method for rendering perspective adjusted views of a virtual object in a real world environment
CN109143162A (en) * 2018-09-30 2019-01-04 成都精位科技有限公司 Vehicle attitude calculation method and device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090067291A1 (en) * 1998-10-02 2009-03-12 Beepcard Inc. Computer communications using acoustic signals
US20090190441A1 (en) * 2008-01-29 2009-07-30 Nec (China) Co., Ltd. Autonomous ultrasonic indoor tracking system
US20110267924A1 (en) * 2010-04-28 2011-11-03 Pavel Horsky Acoustic distance measurement system having cross talk immunity

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1266336A4 (en) * 2000-03-21 2004-12-22 Ted R Rittmaster System and process for distribution of information on a communication network
US8416957B2 (en) * 2008-12-04 2013-04-09 Honda Motor Co., Ltd. Audio source detection system
TWI396862B (en) * 2009-12-04 2013-05-21 Teco Elec & Machinery Co Ltd Method, computer readable storage medium and system for localizing acoustic source
WO2012027597A2 (en) * 2010-08-27 2012-03-01 Intel Corporation Capture and recall of home entertainment system session
US9316717B2 (en) * 2010-11-24 2016-04-19 Samsung Electronics Co., Ltd. Position determination of devices using stereo audio
US9674661B2 (en) 2011-10-21 2017-06-06 Microsoft Technology Licensing, Llc Device-to-device relative localization
US20140003195A1 (en) 2012-06-28 2014-01-02 Net Power And Light, Inc. Coordinating devices to estimate distance, relative position, and device attitude

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090067291A1 (en) * 1998-10-02 2009-03-12 Beepcard Inc. Computer communications using acoustic signals
US20090190441A1 (en) * 2008-01-29 2009-07-30 Nec (China) Co., Ltd. Autonomous ultrasonic indoor tracking system
US20110267924A1 (en) * 2010-04-28 2011-11-03 Pavel Horsky Acoustic distance measurement system having cross talk immunity

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9729227B2 (en) 2012-06-28 2017-08-08 Wickr Inc. Coordinating devices to estimate distance, relative position, and device attitude
US20150180986A1 (en) * 2013-12-20 2015-06-25 International Business Machines Corporation Providing a Sensor Composite Service Based on Operational and Spatial Constraints
US9882994B2 (en) 2013-12-20 2018-01-30 International Business Machines Corporation Providing a sensor composite service based on operational and spatial constraints
US9654570B2 (en) * 2013-12-20 2017-05-16 International Business Machines Corporation Providing a sensor composite service based on operational and spatial constraints
US10182306B2 (en) * 2015-08-04 2019-01-15 Kabushiki Kaisha Toshiba Device and method for determining disposition of a plurality of radio apparatuses
JP2017032469A (en) * 2015-08-04 2017-02-09 株式会社東芝 Radio apparatus arrangement estimation device, radio apparatus arrangement estimation method, and radio apparatus arrangement estimation program
US20170041740A1 (en) * 2015-08-04 2017-02-09 Kabushiki Kaisha Toshiba Radio-apparatus-disposition estimation device, estimation method, and non-transitory computer readable medium
US20180150129A1 (en) * 2016-11-30 2018-05-31 Marcus Allen Thomas Systems and methods for adaptive user interface dynamics based on proximity profiling
US11353948B2 (en) * 2016-11-30 2022-06-07 Q Technologies, Inc. Systems and methods for adaptive user interface dynamics based on proximity profiling
JP2018094765A (en) * 2016-12-09 2018-06-21 京セラドキュメントソリューションズ株式会社 Device arrangement presentation system
US10575276B1 (en) * 2019-05-21 2020-02-25 At&T Intellectual Property I, L.P. User equipment localization through time series search
US10805901B1 (en) * 2019-05-21 2020-10-13 At&T Intellectual Property I, L.P. User equipment localization through time series search
US20220200711A1 (en) * 2020-12-22 2022-06-23 Kabushiki Kaisha Toshiba Electronic apparatus, electronic system, and method
US20220326331A1 (en) * 2021-04-09 2022-10-13 LouStat Technologies, LLC Systems and Methods for Enhancing Location of Game in the Field
US11774540B2 (en) * 2021-04-09 2023-10-03 LouStat Technologies, LLC Systems and methods for enhancing location of game in the field
CN113679379A (en) * 2021-07-14 2021-11-23 深圳大学 Human body posture estimation method, device, equipment, system and medium based on sound waves

Also Published As

Publication number Publication date
US20160170008A1 (en) 2016-06-16
US9929798B2 (en) 2018-03-27
US20140004797A1 (en) 2014-01-02
US20160241329A1 (en) 2016-08-18
US9729227B2 (en) 2017-08-08

Similar Documents

Publication Publication Date Title
US9929798B2 (en) Coordinating devices to estimate distance, relative position, and device attitude
US10209778B2 (en) Motion pattern classification and gesture recognition
US9460609B2 (en) Method and apparatus for preventing losing electronic devices
US20130142384A1 (en) Enhanced navigation through multi-sensor positioning
US20130252637A1 (en) Mobile communication terminal and method of recommending application or content
US9097537B2 (en) Electronic device and method for displaying position information of set device
US20130242106A1 (en) Multicamera for crowdsourced video services with augmented reality guiding system
US20150169780A1 (en) Method and apparatus for utilizing sensor data for auto bookmarking of information
CN106028276A (en) Information recommendation method and system
CN103996121B (en) Search matching method and search matching terminal
US20160370462A1 (en) Method and apparatus for providing time-of-flight calculations using distributed light sources
CN108958634A (en) Express delivery information acquisition method, device, mobile terminal and storage medium
US10192332B2 (en) Display control method and information processing apparatus
KR20140043489A (en) Usage recommendation for mobile device
CN113163383B (en) Communication connection control method, communication connection control device, computer equipment and readable storage medium
CN110189364B (en) Method and device for generating information, and target tracking method and device
Zhao et al. Trine: Cloud-edge-device cooperated real-time video analysis for household applications
CN111310595A (en) Method and apparatus for generating information
US9913117B2 (en) Electronic device and method for exchanging information using the same
CN105981357A (en) Systems and methods for contextual caller identification
KR20130072959A (en) Method and apparatus for providing of information
US20200296278A1 (en) Systems and methods of detecting and identifying an object
US20150082346A1 (en) System for Selective and Intelligent Zooming Function in a Crowd Sourcing Generated Media Stream
CN110286358A (en) A kind of indoor orientation method, equipment and computer readable storage medium
US10034138B2 (en) Companion device location within enclosed spaces

Legal Events

Date Code Title Description
AS Assignment

Owner name: NET POWER AND LIGHT, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:VONOG, STANISLAV;LEMMEY, TARA;SURIN, NIKOLAY;AND OTHERS;SIGNING DATES FROM 20130627 TO 20130629;REEL/FRAME:030719/0555

AS Assignment

Owner name: PENINSULA TECHNOLOGY VENTURES, L.P., CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:033086/0001

Effective date: 20140603

Owner name: PENINSULA VENTURE PRINCIPALS, L.P., CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:033086/0001

Effective date: 20140603

Owner name: ALSOP LOUIE CAPITAL I, L.P., CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:033086/0001

Effective date: 20140603

AS Assignment

Owner name: BROWN, JOHN SEELY, CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER & LIGHT, INC.;REEL/FRAME:036740/0252

Effective date: 20150930

Owner name: WANG, TA-HUI TY, CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER & LIGHT, INC.;REEL/FRAME:036740/0252

Effective date: 20150930

Owner name: ORRICK INVESTMENTS 2011, LLC, CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER & LIGHT, INC.;REEL/FRAME:036740/0252

Effective date: 20150930

Owner name: TWB INVESTMENT PARTNERSHIP II, LP, WASHINGTON

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER & LIGHT, INC.;REEL/FRAME:036740/0252

Effective date: 20150930

Owner name: SINGTEL INNOV8 PTE. LTD., SINGAPORE

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER & LIGHT, INC.;REEL/FRAME:036740/0252

Effective date: 20150930

Owner name: THE BOARD OF TRUSTEES OF THE LELAND STANFORD JUNIO

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER & LIGHT, INC.;REEL/FRAME:036740/0252

Effective date: 20150930

Owner name: ORRICK INVESTMENTS 2010, LLC, CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER & LIGHT, INC.;REEL/FRAME:036740/0252

Effective date: 20150930

Owner name: SHIN, JEANNIE, CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER & LIGHT, INC.;REEL/FRAME:036740/0252

Effective date: 20150930

Owner name: LOW, LAWRENCE B., CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER & LIGHT, INC.;REEL/FRAME:036740/0252

Effective date: 20150930

Owner name: ALSOP LOUIE CAPITAL 1, L.P., CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER & LIGHT, INC.;REEL/FRAME:036740/0252

Effective date: 20150930

Owner name: PENINSULA TECHNOLOGY VENTURES, L.P., CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER & LIGHT, INC.;REEL/FRAME:036740/0252

Effective date: 20150930

Owner name: PENINSULA VENTURE PRINCIPALS, L.P., CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER & LIGHT, INC.;REEL/FRAME:036740/0252

Effective date: 20150930

AS Assignment

Owner name: PENSULA VENTURE PRINCIPALS, L.P., CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:037333/0680

Effective date: 20151218

Owner name: TWB INVESTMENT PARTNERSHIP II, LP, WASHINGTON

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:037333/0680

Effective date: 20151218

Owner name: BROWN, JOHN SEELY, CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:037333/0680

Effective date: 20151218

Owner name: ALSOP LOUIE CAPITAL 1, L.P., CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:037333/0680

Effective date: 20151218

Owner name: SHIN, JEANNIE, CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:037333/0680

Effective date: 20151218

Owner name: SINGTEL INNOVS PTE. LTD., SINGAPORE

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:037333/0680

Effective date: 20151218

Owner name: THE BOARD OF TRUSTEES OF THE LELAND STANFORD UNIVE

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:037333/0680

Effective date: 20151218

Owner name: ORRICK INVESTMENTS 2011, LLC, CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:037333/0680

Effective date: 20151218

Owner name: WANG, TA-HUITY, CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:037333/0680

Effective date: 20151218

Owner name: ORRICK INVESTMENTS 2010, LLC, CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:037333/0680

Effective date: 20151218

Owner name: LOW, LAWRENCE B., CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:037333/0680

Effective date: 20151218

Owner name: PENINSULA TECHNOLOGY VENTURES, L.P., CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:037333/0680

Effective date: 20151218

AS Assignment

Owner name: PENINSULA TECHNOLOGY VENTURES, L.P., CALIFORNIA

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE ASSIGNEE'S NAME PREVIOUSLY RECORDED AT REEL: 037333 FRAME: 0680. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:037394/0150

Effective date: 20151218

Owner name: LOW, LAWRENCE B., CALIFORNIA

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE ASSIGNEE'S NAME PREVIOUSLY RECORDED AT REEL: 037333 FRAME: 0680. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:037394/0150

Effective date: 20151218

Owner name: ALSOP LOUIE CAPITAL 1, L.P., CALIFORNIA

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE ASSIGNEE'S NAME PREVIOUSLY RECORDED AT REEL: 037333 FRAME: 0680. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:037394/0150

Effective date: 20151218

Owner name: ORRICK INVESTMENTS 2011, LLC, CALIFORNIA

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE ASSIGNEE'S NAME PREVIOUSLY RECORDED AT REEL: 037333 FRAME: 0680. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:037394/0150

Effective date: 20151218

Owner name: THE BOARD OF TRUSTEES OF THE LELAND STANFORD JUNIO

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE ASSIGNEE'S NAME PREVIOUSLY RECORDED AT REEL: 037333 FRAME: 0680. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:037394/0150

Effective date: 20151218

Owner name: TWB INVESTMENT PARTNERSHIP II, LP, WASHINGTON

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE ASSIGNEE'S NAME PREVIOUSLY RECORDED AT REEL: 037333 FRAME: 0680. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:037394/0150

Effective date: 20151218

Owner name: WANG, TA-HUITY, CALIFORNIA

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE ASSIGNEE'S NAME PREVIOUSLY RECORDED AT REEL: 037333 FRAME: 0680. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:037394/0150

Effective date: 20151218

Owner name: BROWN, JOHN SEELY, CALIFORNIA

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE ASSIGNEE'S NAME PREVIOUSLY RECORDED AT REEL: 037333 FRAME: 0680. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:037394/0150

Effective date: 20151218

Owner name: SINGTEL INNOV8 PTE. LTD., SINGAPORE

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE ASSIGNEE'S NAME PREVIOUSLY RECORDED AT REEL: 037333 FRAME: 0680. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:037394/0150

Effective date: 20151218

Owner name: SHIN, JEANNIE, CALIFORNIA

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE ASSIGNEE'S NAME PREVIOUSLY RECORDED AT REEL: 037333 FRAME: 0680. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:037394/0150

Effective date: 20151218

Owner name: ORRICK INVESTMENTS 2010, LLC, CALIFORNIA

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE ASSIGNEE'S NAME PREVIOUSLY RECORDED AT REEL: 037333 FRAME: 0680. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:037394/0150

Effective date: 20151218

Owner name: PENINSULA VENTURE PRINCIPALS, L.P., CALIFORNIA

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE ASSIGNEE'S NAME PREVIOUSLY RECORDED AT REEL: 037333 FRAME: 0680. ASSIGNOR(S) HEREBY CONFIRMS THE ASSIGNMENT;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:037394/0150

Effective date: 20151218

AS Assignment

Owner name: SINGTEL INNOV8 PTE. LTD., SINGAPORE

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:037786/0090

Effective date: 20160219

Owner name: ORRICK INVESTMENTS 2010, LLC, CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:037786/0090

Effective date: 20160219

Owner name: PENINSULA VENTURE PRINCIPALS, L.P., CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:037786/0090

Effective date: 20160219

Owner name: LOW, LAWRENCE B., CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:037786/0090

Effective date: 20160219

Owner name: THE BOARD OF TRUSTEES OF THE LELAND STANFORD JUNIO

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:037786/0090

Effective date: 20160219

Owner name: ALSOP LOUIE CAPITAL 1, L.P., CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:037786/0090

Effective date: 20160219

Owner name: BROWN, JOHN SEELY, CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:037786/0090

Effective date: 20160219

Owner name: ORRICK INVESTMENTS 2011, LLC, CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:037786/0090

Effective date: 20160219

Owner name: TWB INVESTMENT PARTNERSHIP II, LP, WASHINGTON

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:037786/0090

Effective date: 20160219

Owner name: WANG, TA-HUI TY, CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:037786/0090

Effective date: 20160219

Owner name: PENINSULA TECHNOLOGY VENTURES, L.P., CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:037786/0090

Effective date: 20160219

Owner name: SHIN, JEANNIE, CALIFORNIA

Free format text: SECURITY INTEREST;ASSIGNOR:NET POWER AND LIGHT, INC.;REEL/FRAME:037786/0090

Effective date: 20160219

AS Assignment

Owner name: NET POWER & LIGHT, INC., CALIFORNIA

Free format text: NOTE AND WARRANT CONVERSION AGREEMENT;ASSIGNORS:PENINSULA TECHNOLOGY VENTURES, L.P.;PENINSULA VENTURE PRINCIPALS, L.P.;ALSOP LOUIE CAPITAL 1, L.P.;REEL/FRAME:038543/0839

Effective date: 20160427

Owner name: NET POWER & LIGHT, INC., CALIFORNIA

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:NET POWER & LIGHT, INC.;REEL/FRAME:038543/0831

Effective date: 20160427

AS Assignment

Owner name: NET POWER & LIGHT, INC., CALIFORNIA

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:NET POWER & LIGHT, INC.;REEL/FRAME:038398/0958

Effective date: 20160427

Owner name: NET POWER & LIGHT, INC., CALIFORNIA

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:NET POWER & LIGHT, INC.;REEL/FRAME:038398/0946

Effective date: 20160427

Owner name: NET POWER & LIGHT, INC., CALIFORNIA

Free format text: NOTE AND WARRANT CONVERSION AGREEMENT;ASSIGNORS:PENINSULA TECHNOLOGY VENTURES, L.P.;PENINSULA VENTURE PRINCIPALS, L.P.;ALSOP LOUIE CAPITAL 1, L.P.;AND OTHERS;REEL/FRAME:038543/0942

Effective date: 20160427

Owner name: NET POWER & LIGHT, INC., CALIFORNIA

Free format text: NOTE AND WARRANT CONVERSION AGREEMENT;ASSIGNORS:PENINSULA TECHNOLOGY VENTURES, L.P.;PENINSULA VENTURE PRINCIPALS, L.P.;ALSOP LOUIE CAPITAL 1, L.P.;AND OTHERS;REEL/FRAME:038543/0966

Effective date: 20160427

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION