WO2009066998A3 - Apparatus and method for multiple-touch spatial sensors - Google Patents

Apparatus and method for multiple-touch spatial sensors

Info

Publication number
WO2009066998A3
WO2009066998A3 (PCT application PCT/MY2008/000164)
Authority
WO
WIPO (PCT)
Prior art keywords
spatial
image data
touch
dimensional
cameras
Prior art date
Application number
PCT/MY2008/000164
Other languages
French (fr)
Other versions
WO2009066998A2 (en)
Inventor
Hock Woon Hon
Shern Shiou Tan
Original Assignee
Mimos Berhad
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mimos Berhad
Publication of WO2009066998A2
Publication of WO2009066998A3

Links

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/0304 - Detection arrangements using opto-electronic means
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/70 - Determining position or orientation of objects or cameras
    • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30 - Image reproducers
    • H04N 13/366 - Image reproducers using viewer tracking
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 - Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/048 - Indexing scheme relating to G06F3/048
    • G06F 2203/04808 - Several contacts: gestures triggering a specific function, e.g. scrolling, zooming, right-click, when the user establishes several contacts with the surface simultaneously; e.g. using several fingers or a combination of fingers and pen

Abstract

The present invention relates to an apparatus and a method for multiple-touch three-dimensional contactless control for spatial sensing. The apparatus comprises two cameras (101, 102) having spatial sensors to capture object positions in the form of images, a register for registering the spatial directions as sensed by the spatial sensors, a data processing unit, and a computer for computing object point derivation and blob analysis. The method for multiple-touch three-dimensional contactless control for spatial sensing comprises the steps of: positioning first and second cameras (101, 102) and capturing image data of an object (107) with the cameras (101, 102); passing the captured image data of the object (107) through background and foreground segmentation using an image processing function; determining the spatial position of the object (107) in each set of image data and deriving a three-dimensional spatial position of the object point; and processing the captured image data through blob analysis.
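The abstract outlines a two-camera pipeline: background/foreground segmentation, blob analysis to locate the object in each view, and derivation of a three-dimensional point from the pair of image positions. The patent does not prescribe a particular implementation; the following is a minimal sketch of that kind of pipeline, assuming OpenCV, pre-captured background frames for each camera, and 3x4 projection matrices P1 and P2 obtained from a prior stereo calibration. All function and variable names are illustrative and are not taken from the patent.

```python
# Illustrative sketch only: two-camera segmentation, blob analysis,
# and triangulation to a 3D point, in the spirit of the abstract above.
import cv2
import numpy as np

def segment_foreground(frame, background, thresh=30):
    """Background/foreground segmentation by simple frame differencing."""
    diff = cv2.absdiff(frame, background)
    gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY)
    return mask

def blob_centroids(mask, min_area=50):
    """Blob analysis: centroids of connected components above a minimum area."""
    n, _, stats, centroids = cv2.connectedComponentsWithStats(mask)
    return [tuple(centroids[i]) for i in range(1, n)       # label 0 is background
            if stats[i, cv2.CC_STAT_AREA] >= min_area]

def triangulate(P1, P2, pt1, pt2):
    """Derive a 3D spatial position from one matched point pair,
    given 3x4 projection matrices P1, P2 from camera calibration."""
    X = cv2.triangulatePoints(P1, P2,
                              np.float32(pt1).reshape(2, 1),
                              np.float32(pt2).reshape(2, 1))
    X /= X[3]                # homogeneous -> Euclidean
    return X[:3].ravel()     # (x, y, z)

# Usage (assuming calibrated cameras and stored background frames):
# mask1 = segment_foreground(frame_cam1, background_cam1)
# mask2 = segment_foreground(frame_cam2, background_cam2)
# pts1, pts2 = blob_centroids(mask1), blob_centroids(mask2)
# if pts1 and pts2:
#     point_3d = triangulate(P1, P2, pts1[0], pts2[0])
```

In a multiple-touch setting, each blob centroid in one view would first have to be matched to its counterpart in the other view (for example via epipolar constraints) before triangulation; the sketch above simply pairs the first blob detected by each camera.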
PCT/MY2008/000164 2007-11-23 2008-11-24 Apparatus and method for multiple-touch spatial sensors WO2009066998A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
MYPI20072085 2007-11-23
MYPI20072085A MY147059A (en) 2007-11-23 2007-11-23 Apparatus and method for multiple-touch spatial sensors

Publications (2)

Publication Number Publication Date
WO2009066998A2 (en) 2009-05-28
WO2009066998A3 (en) 2009-10-15

Family

ID=40668031

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/MY2008/000164 WO2009066998A2 (en) 2007-11-23 2008-11-24 Apparatus and method for multiple-touch spatial sensors

Country Status (2)

Country Link
MY (1) MY147059A (en)
WO (1) WO2009066998A2 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2947348B1 (en) * 2009-06-25 2011-08-19 Immersion DEVICE FOR HANDLING AND VISUALIZING A VIRTUAL OBJECT
CN107945172A (en) * 2017-12-08 2018-04-20 博众精工科技股份有限公司 A kind of character detection method and system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0913790A1 (en) * 1997-10-29 1999-05-06 Takenaka Corporation Hand pointing apparatus
US20040108990A1 (en) * 2001-01-08 2004-06-10 Klony Lieberman Data input device
KR20070061153A (en) * 2005-12-08 2007-06-13 한국전자통신연구원 3d input apparatus by hand tracking using multiple cameras and its method
US7277599B2 (en) * 2002-09-23 2007-10-02 Regents Of The University Of Minnesota System and method for three-dimensional video imaging using a single camera

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0913790A1 (en) * 1997-10-29 1999-05-06 Takenaka Corporation Hand pointing apparatus
US20040108990A1 (en) * 2001-01-08 2004-06-10 Klony Lieberman Data input device
US7277599B2 (en) * 2002-09-23 2007-10-02 Regents Of The University Of Minnesota System and method for three-dimensional video imaging using a single camera
KR20070061153A (en) * 2005-12-08 2007-06-13 한국전자통신연구원 3d input apparatus by hand tracking using multiple cameras and its method

Also Published As

Publication number Publication date
WO2009066998A2 (en) 2009-05-28
MY147059A (en) 2012-10-15

Similar Documents

Publication Publication Date Title
TWI708216B (en) Method and system for calibrating vision system in environment
CN104626169B (en) Robot part grabbing method based on vision and mechanical comprehensive positioning
WO2007025300A8 (en) Capturing and processing facial motion data
WO2008123466A1 (en) Image processing device, control program, computer-readable recording medium, electronic device, and image processing device control method
CN104423569A (en) Pointing position detecting device, method and computer readable recording medium
WO2017077925A1 (en) Method and system for estimating three-dimensional pose of sensor
JP2010539557A5 (en)
GB201119501D0 (en) An apparatus, method and system
US9303982B1 (en) Determining object depth information using image data
WO2008090608A1 (en) Image reading device, image reading program, and image reading method
TW201120681A (en) Method and system for operating electric apparatus
WO2006015236A3 (en) Audio-visual three-dimensional input/output
JP2008116373A5 (en)
WO2009053848A3 (en) Methods and processes for detecting a mark on a playing surface and for tracking an object
WO2017215351A1 (en) Method and apparatus for adjusting recognition range of photographing apparatus
CN103795935B (en) A kind of camera shooting type multi-target orientation method and device based on image rectification
WO2008123462A1 (en) Image processing device, control program, computer-readable recording medium, electronic device, and image processing device control method
CN112258574A (en) Method and device for marking pose information and computer readable storage medium
CN107657642B (en) A kind of automation scaling method carrying out projected keyboard using external camera
WO2009125132A3 (en) Method for determining a three-dimensional representation of an object using a sequence of cross-section images, computer program product, and corresponding method for analyzing an object and imaging system
TW201741938A (en) Action sensing method and device
CN104376323B (en) A kind of method and device for determining target range
JP2008309595A (en) Object recognizing device and program used for it
WO2009066998A3 (en) Apparatus and method for multiple-touch spatial sensors
TWI520110B (en) 3d visual detection system and method for determining if an object enters a zone on demand

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08852074

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 08852074

Country of ref document: EP

Kind code of ref document: A2