USRE43084E1 - Method and apparatus for inputting information including coordinate data - Google Patents


Info

Publication number
USRE43084E1
Authority
US
United States
Prior art keywords
display panel
display
predetermined object
cpu
plane
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime, expires
Application number
US12/722,345
Inventor
Susumu Fujioka
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Smart Technologies ULC
Original Assignee
Smart Technologies ULC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Smart Technologies ULC filed Critical Smart Technologies ULC
Priority to US12/722,345 priority Critical patent/USRE43084E1/en
Priority to US13/345,044 priority patent/US20120327031A1/en
Application granted granted Critical
Publication of USRE43084E1 publication Critical patent/USRE43084E1/en
Assigned to MORGAN STANLEY SENIOR FUNDING, INC. reassignment MORGAN STANLEY SENIOR FUNDING, INC. SECURITY AGREEMENT Assignors: SMART TECHNOLOGIES INC., SMART TECHNOLOGIES ULC
Assigned to MORGAN STANLEY SENIOR FUNDING INC. reassignment MORGAN STANLEY SENIOR FUNDING INC. SECURITY AGREEMENT Assignors: SMART TECHNOLOGIES INC., SMART TECHNOLOGIES ULC
Assigned to SMART TECHNOLOGIES ULC, SMART TECHNOLOGIES INC. reassignment SMART TECHNOLOGIES ULC RELEASE OF TERM LOAN SECURITY INTEREST Assignors: MORGAN STANLEY SENIOR FUNDING, INC.
Assigned to SMART TECHNOLOGIES ULC, SMART TECHNOLOGIES INC. reassignment SMART TECHNOLOGIES ULC RELEASE OF ABL SECURITY INTEREST Assignors: MORGAN STANLEY SENIOR FUNDING, INC.
Assigned to SMART TECHNOLOGIES INC., SMART TECHNOLOGIES ULC reassignment SMART TECHNOLOGIES INC. RELEASE BY SECURED PARTY (SEE DOCUMENT FOR DETAILS). Assignors: MORGAN STANLEY SENIOR FUNDING, INC.
Assigned to SMART TECHNOLOGIES ULC, SMART TECHNOLOGIES INC. reassignment SMART TECHNOLOGIES ULC RELEASE BY SECURED PARTY (SEE DOCUMENT FOR DETAILS). Assignors: MORGAN STANLEY SENIOR FUNDING, INC.
Adjusted expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F 3/0428 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by sensing at the edges of the touch surface the interruption of optical paths, e.g. an illumination plane, parallel to the touch surface which may be virtual

Definitions

  • the present invention relates to a method and apparatus for inputting information including coordinate data. More particularly, the present invention relates to a method and apparatus for inputting information including coordinate data of a location of a coordinate input member, such as a pen, a human finger, etc., on an image displayed on a relatively large screen.
  • presentation systems, electronic copy boards, and electronic blackboard systems are often provided with a relatively large screen display device, such as a plasma display panel, a rear projection display, etc.
  • Certain types of presentation systems also provide a touch input device disposed in front of the screen for inputting information related to the image displayed on the screen.
  • such a touch input device is also referred to as an electronic tablet, an electronic pen, etc.
  • a touch input device detects and inputs the touching motion and the coordinates of the touched location.
  • the touch input device repetitively detects and inputs a plurality of coordinates as a locus of the drawn line.
  • Japanese Laid-Open Patent Publication No. 11-85376 describes a touch input apparatus provided with light reflecting devices disposed around a display screen, light beam scanning devices, and light detectors.
  • each light reflecting device has the characteristic of reflecting incident light back toward a direction close to that of the incident light.
  • scanning light beams emitted by the light beam scanning devices are reflected by the light reflecting devices, and then received by the light detectors.
  • when a coordinate input member, such as a pen, a user's finger, etc., touches the surface of the screen at a location, the coordinate input member interrupts the path of the scanning light beams, and thereby the light detectors can detect the touched location as a missing portion of the scanning light beams.
  • the scanning light beams are desired to be thin and to scan on a plane close enough to the screen.
  • when the surface of the screen is contorted, however, the contorted surface may interfere with the transmission of the scanning light beams, and consequently a coordinate input operation might be impaired.
  • for example, a double-click operation might not be properly detected, freehand lines and characters might be erroneously detected, and so forth.
  • Japanese Laid-Open Patent Publication No. 61-196317 describes a touch input apparatus provided with a plurality of television cameras.
  • the plurality of television cameras detect three-dimensional coordinates of a moving object, such as a pen, used as a coordinate input member. Because the apparatus detects three-dimensional coordinates, the television cameras need to capture images of the moving object at a relatively high frame rate.
  • a touch input apparatus provided with an electromagnetic tablet and an electromagnetic stylus is also known.
  • a location of the stylus is detected based on electromagnetic induction between the tablet and the stylus. Therefore, the distance between the tablet and the stylus tends to be limited to a rather short range, for example, eight millimeters; otherwise, a large stylus or a battery-powered stylus must be used.
  • an object of the present invention is to provide a novel method and apparatus that can input information including coordinate data even when the surface of a display screen is contorted to a certain extent and without using a light scanning device.
  • Another object of the present invention is to provide a novel method and apparatus that can input information including coordinate data using a plurality of coordinate input members, such as a pen, a human finger, a stick, a rod, a chalk, etc.
  • Another object of the present invention is to provide a novel method and apparatus that can input information including coordinate data with a plurality of background devices, such as a chalkboard, a whiteboard, etc., in addition to a display device, such as a plasma display panel, a rear projection display.
  • the present invention provides a method, computer readable medium, and apparatus for inputting information including coordinate data that include extracting a predetermined object from an image including the predetermined object above a plane, detecting a motion of the predetermined object while the predetermined object is within a predetermined distance from the plane, and determining whether to input predetermined information based on the detected motion.
  • FIG. 1 is a schematic view illustrating a coordinate data input system as an example configured according to the present invention
  • FIG. 2 is an exemplary block diagram of a control apparatus of the coordinate data input system of FIG. 1 ;
  • FIG. 3 is a diagram illustrating a method for obtaining coordinates where a coordinate input member contacts a display panel
  • FIG. 4 is a magnified view of the wide-angle lens and the CMOS image sensor of FIG. 3 ;
  • FIG. 5 is a diagram illustrating a tilt of the surface of the display panel to the CMOS image sensor
  • FIG. 6 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation in the coordinate data input system of FIG. 1 as an example configured according to the present invention
  • FIG. 7 is a diagram illustrating an image captured by the first electronic camera of FIG. 1 ;
  • FIG. 8 is a diagram illustrating an image captured by the first electronic camera when an input pen distorts the surface of a display panel
  • FIG. 9 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation as another example configured according to the present invention.
  • FIG. 10 is a diagram illustrating an image captured by the first electronic camera when an input pen tilts to the surface of a display panel
  • FIG. 11 is a diagram illustrating an image having an axially symmetric pen captured by the first electronic camera
  • FIG. 12 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation as another example configured according to the present invention.
  • FIG. 13 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation as another example configured according to the present invention.
  • FIG. 14 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation as another example configured according to the present invention.
  • FIG. 15 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation as another example configured according to the present invention.
  • FIG. 16A is a diagram illustrating an image captured by the first electronic camera and an output limitation of the image
  • FIG. 16B is a diagram illustrating an image captured by the first electronic camera and a displaced output limitation of the image
  • FIG. 17 is a schematic view illustrating a coordinate data input system as another example configured according to the present invention.
  • FIG. 18 is an exemplary block diagram of a control apparatus of the coordinate data input system of FIG. 17 configured according to the present invention.
  • FIG. 19 is a diagram illustrating an analog signal waveform output from a linear sensor camera
  • FIG. 20 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation in the coordinate data input system of FIG. 17 as an example configured according to the present invention
  • FIG. 21 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation in the coordinate data input system of FIG. 17 as another example configured according to the present invention
  • FIG. 22 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation in the coordinate data input system of FIG. 17 as another example configured according to the present invention
  • FIG. 23 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation in the coordinate data input system of FIG. 1 as another example configured according to the present invention
  • FIG. 24 is a diagram illustrating an image captured by the first electronic camera in the coordinate data input system of FIG. 1 ;
  • FIG. 25 is an exemplary network system including the coordinate data input systems of FIG. 1 and FIG. 17 .
  • FIG. 1 is a schematic view illustrating a coordinate data input system 1 S as an example configured according to the present invention.
  • the coordinate data input system 1 S includes a coordinate data input apparatus 1 and a control apparatus 2 .
  • the coordinate data input apparatus 1 includes a first electronic camera 10 , a second electronic camera 11 , and a display panel 12 .
  • the display panel 12 displays an image with, for example, a 48 by 36 inch screen (diagonally 60 inches) and 1024 by 768-pixel resolution, which is referred to as an XGA screen.
  • a plasma display panel, a rear projection display, etc. may be used as the display panel 12 .
  • Each of the first electronic camera 10 and the second electronic camera 11 implements a two-dimensional imaging device with a resolution sufficient to enable operations such as selecting an item in a menu window, drawing freehand lines, letters, etc.
  • such a two-dimensional imaging device is also referred to as an area sensor.
  • the two-dimensional imaging device preferably has variable output frame rate capability.
  • the two-dimensional imaging device also preferably has a random access capability that allows any imaging cell therein randomly accessed to obtain an image signal from the cell. Such a random access capability is sometimes also referred to as random addressability.
  • a complementary metal oxide semiconductor (CMOS) sensor, for example, may be used as the two-dimensional imaging device.
  • the electronic camera 10 also includes a wide-angle lens 50, which covers an angle of around 90 degrees or wider, and an analog to digital converter.
  • the electronic camera 11 also includes a wide-angle lens 52, which covers an angle of around 90 degrees or wider, and an analog to digital converter.
  • the first electronic camera 10 is disposed at an upper corner of the display panel 12 such that the optical axis of the wide-angle lens 50 forms an angle of approximately 45 degrees with the horizontal edge of the display panel 12.
  • the second electronic camera 11 is disposed at the other upper corner of the display panel 12 such that the optical axis of the wide-angle lens 52 forms an angle of approximately 45 degrees with the horizontal edge of the display panel 12.
  • each of the electronic cameras 10 and 11 is disposed such that its optical axis is approximately parallel to the display screen surface of the display panel 12.
  • thus, each of the electronic cameras 10 and 11 can capture the whole display screen surface of the display panel 12.
  • Each of the captured images is converted into digital data, and the digital image data is then transmitted to the control apparatus 2 .
  • FIG. 2 is an exemplary block diagram of the control apparatus 2 of the coordinate data input system 1 S of FIG. 1 .
  • the control apparatus 2 includes a central processing unit (CPU) 20, a main memory 21, a clock generator 22, a bus controller 23, a read only memory (ROM) 24, a peripheral component interconnect (PCI) bridge 25, a cache memory 26, a hard disk 27, a hard disk controller (HD controller) 28, a display controller 29, a first image processing circuit 30, and a second image processing circuit 31.
  • the control apparatus 2 also includes a local area network controller (LAN controller) 32, a LAN interface 33, a floppy disk controller (FD controller) 34, an FD drive 35, a compact disc read only memory controller (CD-ROM controller) 36, a CD-ROM drive 37, a keyboard controller 38, a mouse interface 39, a real time clock generator (RTC generator) 40, a CPU bus 41, a PCI bus 42, an internal X bus 43, a keyboard 44, and a mouse 45.
  • the CPU 20 executes a boot program, a basic input and output control system (BIOS) program stored in the ROM 24 , an operating system (OS), application programs, etc.
  • the main memory 21 may be structured by, e.g., a dynamic random access memory (DRAM), and is utilized as a work memory for the CPU 20 .
  • the clock generator 22 may be structured by, for example, a crystal oscillator and a frequency divider, and supplies a generated clock signal to the CPU 20 , the bus controller 23 , etc., to operate those devices at the clock speed.
  • the bus controller 23 controls data transmission between the CPU bus 41 and the internal X bus 43 .
  • the ROM 24 stores a boot program, which is executed immediately after the coordinate data input system 1S is turned on, device control programs for controlling the devices included in the system 1S, etc.
  • the PCI bridge 25 is disposed between the CPU bus 41 and the PCI bus 42 and transmits data between the PCI bus 42 and devices connected to the CPU bus 41, such as the CPU 20, through the use of the cache memory 26.
  • the cache memory 26 may be configured by, for example, a DRAM.
  • the hard disk 27 stores system software, such as an operating system, a plurality of application programs, and various data for multiple users of the coordinate data input system 1S.
  • the hard disk (HD) controller 28 implements a standard interface, such as an integrated device electronics (IDE) interface, and transmits data between the PCI bus 42 and the hard disk 27 at a relatively high data transmission rate.
  • the display controller 29 converts digital letter/character data and graphic data into an analog video signal, and controls the display panel 12 of the coordinate data input apparatus 1 so as to display an image of the letters/characters and graphics thereon according to the analog video signal.
  • the first image processing circuit 30 receives digital image data output from the first electronic camera 10 through a digital interface, such as an RS-422 interface.
  • the first image processing circuit 30 then executes an object extraction process, an object shape recognition process, a motion vector detection process, etc. Further, the first image processing circuit 30 supplies the first electronic camera 10 with a clock signal and an image transfer pulse via the above-described digital interface.
  • the second image processing circuit 31 receives digital image data output from the second electronic camera 11, also through a digital interface, such as an RS-422 interface.
  • the second image processing circuit 31 is configured with substantially the same hardware as the first image processing circuit 30, and operates substantially the same as the first image processing circuit 30. That is, the second image processing circuit 31 also executes an object extraction process, an object shape recognition process, and a motion vector detection process, and supplies a clock signal and an image transfer pulse to the second electronic camera 11 as well.
  • the clock signal and the image transfer pulse supplied to the first electronic camera 10 and those signals supplied to the second electronic camera 11 are maintained in synchronization.
  • the LAN controller 32 controls communications between the control apparatus 2 and external devices connected to a local area network, such as an Ethernet, via the LAN interface 33 according to the protocol of the network.
  • for example, the IEEE 802.3 standard may be used.
  • the FD controller 34 transmits data between the PCI bus 42 and the FD drive 35 .
  • the FD drive 35 reads and writes a floppy disk therein.
  • the CD-ROM controller 36 transmits data between the PCI bus 42 and the CD-ROM drive 37 .
  • the CD-ROM drive 37 reads a CD-ROM disc therein and sends the read data to the CD-ROM controller 36 .
  • the CD-ROM controller 36 and the CD-ROM drive 37 may be connected with an IDE interface.
  • the keyboard controller 38 converts serial key input signals generated at the keyboard 44 into parallel data.
  • the mouse interface 39 is provided with a mouse port to be connected with the mouse 45 and controlled by mouse driver software or a mouse control program.
  • the coordinate data input apparatus 1 functions as a data input device, and therefore the keyboard 44 and the mouse 45 may be omitted from the coordinate data input system 1S in normal operation, except during maintenance of the coordinate data input system 1S.
  • the RTC generator 40 generates and supplies calendar data, such as day, hour, minute, etc., and is battery backed up.
  • FIG. 3 is a diagram illustrating a method for obtaining coordinates where a coordinate input member contacts or comes close to the display panel 12 .
  • the first electronic camera 10 includes the wide-angle lens 50 and a CMOS image sensor 51
  • the second electronic camera 11 also includes the wide-angle lens 52 and a CMOS image sensor 53 .
  • the first and second electronic cameras 10 and 11 are disposed such that the optical axes of the wide-angle lenses 50 and 52 , i.e., the optical axes of incident lights to the cameras, are parallel to the display surface of the display panel 12 . Further, the first and second electronic cameras 10 and 11 are disposed such that each of the angles of view of the electronic cameras 10 and 11 covers substantially a whole area where the coordinate input member can come close and touch the display panel 12 .
  • the symbol L denotes a distance between the wide-angle lens 50 and the wide-angle lens 52
  • the symbol X-Line denotes a line connecting the wide-angle lens 50 and the wide-angle lens 52 .
  • the symbol A(x, y) denotes a point A where a coordinate input member comes close to or touches the display panel 12 and the coordinates (x, y) thereof.
  • the point A(x, y) is referred to as a contacting point.
  • θ1 denotes the angle that the line X-Line forms with a line connecting the wide-angle lens 50 and the contacting point A(x, y)
  • θ2 denotes the angle that the line X-Line forms with a line connecting the wide-angle lens 52 and the contacting point A(x, y).
  • FIG. 4 is a magnified view of the wide-angle lens 50 and the CMOS image sensor 51 of FIG. 3 .
  • the symbol f denotes a distance between the wide-angle lens 50 and the CMOS image sensor 51 .
  • the symbol Q denotes a point at which the optical axis of the wide-angle lens 50 intersects the CMOS image sensor 51 .
  • the point Q is referred to as an optical axis crossing point.
  • the symbol P denotes a point where an image of the contacting point A(x, y) is formed on the CMOS image sensor 51 .
  • the point P is referred to as the projected point P of the contacting point A(x, y).
  • the symbol h denotes a distance between the point P and the point Q.
  • the symbol α denotes the angle that the optical axis of the wide-angle lens 50 forms with the line X-Line
  • the symbol β denotes the angle that the optical axis of the wide-angle lens 50 forms with a line connecting the contacting point A(x, y) and the point P.
  • the angle α and the distance f are constant values, because they are determined by the mutual mounting locations of the wide-angle lens 50 and the CMOS image sensor 51 and by the mounting angle of the wide-angle lens 50 to the line X-Line, fixed at the manufacturing plant. Therefore, when the distance h is given, the angle θ1 is solved, as sketched below. Regarding the second electronic camera 11, similar equations hold, and thus the angle θ2 is solved.
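  • as an illustration of the relation just described (the patent's equations (1) and (2) are not reproduced in this excerpt), the angle θ1 can be recovered from the measured distance h, the fixed lens-to-sensor distance f, and the fixed mounting angle α roughly as follows; the sign convention and the function name are assumptions, not the patent's implementation.

```python
import math

def angle_from_sensor(h: float, f: float, alpha: float) -> float:
    """Recover the viewing angle theta1 (radians) of the contacting point.

    h     -- distance on the CMOS image sensor between the optical axis
             crossing point Q and the projected point P (same units as f)
    f     -- distance between the wide-angle lens and the image sensor
    alpha -- fixed angle between the lens optical axis and the line X-Line

    Assumes theta1 = alpha - beta with tan(beta) = h / f; the actual sign
    depends on which side of the optical axis the point P falls.
    """
    beta = math.atan2(h, f)
    return alpha - beta
```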
  • Each of the CMOS image sensors 51 and 53 has a two-dimensional array or a matrix of imaging picture elements (pixels) or imaging cells.
  • the CMOS image sensors 51 and 53 are disposed such that a side having the larger number of imaging cells is parallel to the surface of the display panel 12 .
  • a coordinate axis along the direction having the larger number of imaging cells is represented by Ycamera axis.
  • a coordinate axis along the direction having a smaller number of imaging cells, i.e., the direction perpendicular to the Ycamera axis is represented by Xcamera axis.
  • the line that the surface of the display panel 12 forms on the CMOS image sensors 51 and 53 is hereinafter referred to as “a formed line of the surface of the display panel 12,” “a projected line of the surface of the display panel 12,” or simply “the surface of the display panel 12.”
  • FIG. 5 is a diagram illustrating a tilt of the surface of the display panel 12 to the CMOS image sensors 51 and 53 .
  • the tilting angle φ of the projected line of the surface of the display panel 12 to the Ycamera axis is obtained as follows.
  • points A(x1c, y1c), B(x2c, y2c), and C(x3c, y3c) are arbitrary points on the projected line of the surface of the display panel 12.
  • a tilted coordinate system, which is tilted at the angle φ to the original coordinate system (Xcamera, Ycamera), may also be conveniently utilized to obtain a location of a coordinate input member and a motion vector thereof.
  • the tilted coordinate system is related to the original coordinate system by a rotation through the angle φ, as sketched below.
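  • a minimal sketch of one way to estimate the tilt angle from sampled points on the projected line and to rotate coordinates into the tilted system is given below; the two-point arctangent (rather than, say, a least-squares fit over all three points A, B, and C) and the rotation direction are assumptions.

```python
import math

def tilt_angle(p1, p2):
    """Estimate the tilt angle (radians) of the projected panel line
    relative to the Ycamera axis from two points (xc, yc) on that line."""
    dx = p2[0] - p1[0]   # change along Xcamera (perpendicular to the panel)
    dy = p2[1] - p1[1]   # change along Ycamera (along the panel surface)
    return math.atan2(dx, dy)

def to_tilted(point, phi):
    """Rotate a point from (Xcamera, Ycamera) into the tilted coordinate system."""
    x, y = point
    return (x * math.cos(phi) - y * math.sin(phi),
            x * math.sin(phi) + y * math.cos(phi))
```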
  • FIG. 6 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation in the coordinate data input system 1 S of FIG. 1 as an example configured according to the present invention.
  • the CMOS image sensor 51 may not always have an appropriate image aspect ratio, i.e., ratio of the number of imaging cells in one direction to that in the other direction.
  • therefore, the first electronic camera 10 allows outputting the image signal captured by the CMOS image sensor 51 only within a predetermined range from the surface of the display panel 12 in the direction perpendicular to the surface.
  • the first electronic camera 10 outputs digital image data in the predetermined range to the first image processing circuit 30 in the control apparatus 2 .
  • the second electronic camera 11 outputs digital image data in a predetermined range to the second image processing circuit 31 .
  • in step S101, the first image processing circuit 30 extracts contours of an object as a coordinate input member from frame image data received from the first electronic camera 10.
  • the first image processing circuit 30 first determines gradients of image density among the pixels by differentiation, and then extracts contours based on a direction and magnitude of the gradients of image density.
  • the method described in Japanese Patent Publication No. 8-16931 may also be applied for extracting an object as a coordinate input member from frame image data.
  • in step S102, the first image processing circuit 30 measures plural distances between the object and the projected line of the surface of the display panel 12 on the CMOS image sensor 51.
  • the first image processing circuit 30 counts pixels included between a point on the contours of the extracted object and a point on the projected line of the surface of the display panel 12 on the CMOS image sensor 51 .
  • An image forming reduction ratio on the CMOS image sensor 51 is fixed and a pixel pitch of the CMOS image sensor 51 (i.e., the interval between imaging cells) is known. As a result, the number of pixels between two points determines a distance between the two points.
  • for measuring the plural distances between the object and the surface of the display panel 12, the first image processing circuit 30 therefore counts pixels for each of the plural distances between the contours of the extracted object and the projected line of the surface of the display panel 12.
  • in step S103, the first image processing circuit 30 extracts the least number of pixels among the plural numbers of pixels counted for measuring the plural distances in step S102.
  • the symbol Nmin denotes this least number of pixels. Consequently, the distance corresponding to the minimum value Nmin corresponds to the nearest point of the object to the surface of the display panel 12.
  • the first image processing circuit 30 then determines whether the minimum value Nmin is smaller than a predetermined number M0.
  • when the minimum value Nmin is smaller than the predetermined number M0, i.e., YES in step S103, the process proceeds to step S104, and when the minimum value Nmin is not smaller than the predetermined number M0, i.e., NO in step S103, the process returns to step S101. A minimal numeric sketch of this distance test follows.
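  • steps S102 and S103 can be pictured with the small numeric sketch below, which derives the minimum pixel count and the corresponding distance from per-column counts; the array layout, helper names, and values are hypothetical, not taken from the patent.

```python
def min_distance_pixels(contour_rows, panel_rows, pixel_pitch_mm):
    """contour_rows[i] / panel_rows[i]: row index (Xcamera direction) of the
    object contour and of the projected panel line in sensor column i.
    Returns (Nmin, distance) for the nearest contour point, or (None, None)."""
    counts = [abs(c - p) for c, p in zip(contour_rows, panel_rows) if c is not None]
    if not counts:
        return None, None
    n_min = min(counts)
    return n_min, n_min * pixel_pitch_mm

# Example: motion vectors are calculated (step S104) only when Nmin < M0.
M0 = 20
n_min, dist = min_distance_pixels([55, 52, 48], [60, 60, 60], 0.01)
start_tracing = n_min is not None and n_min < M0
```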
  • in step S104, the first image processing circuit 30 calculates motion vectors regarding predetermined plural points on the extracted contours of the object, including the nearest point to the display panel 12. For this calculation, the first image processing circuit 30 uses the identical frame image data used for extracting the contours and the next following frame image data received from the first electronic camera 10.
  • the first image processing circuit 30 obtains optical flows, i.e., velocity vectors, by calculating a rate of temporal change of a pixel image density.
  • the first image processing circuit 30 also obtains a rate of spatial change of image densities of pixels in the vicinity of the pixel used for calculating the rate of temporal change of the pixel image density.
  • the motion vectors are expressed in the coordinate system (Xcamera, Ycamera), which is associated with the projected line of the surface of the display panel 12 on the CMOS image sensor 51 (i.e., Ycamera) and the coordinate perpendicular to the surface of the display panel 12 (i.e., Xcamera), as sketched below.
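  • a compact sketch of a gradient-based motion vector estimate of the kind described above (temporal density change combined with nearby spatial density changes) is shown below; it uses a Lucas-Kanade style least-squares solution, which is only one possible realization and is not asserted to be the patent's circuit.

```python
import numpy as np

def motion_vectors(prev, curr, points, win=3):
    """Estimate (Vx, Vy) at given (row, col) locations from two consecutive
    frames via the optical-flow constraint Ix*Vx + Iy*Vy + It = 0, solved by
    least squares over a small window. Points must lie at least `win` pixels
    from the image border."""
    prev = prev.astype(np.float32)
    curr = curr.astype(np.float32)
    Ix = np.gradient(prev, axis=0)   # spatial density gradient along Xcamera (rows)
    Iy = np.gradient(prev, axis=1)   # spatial density gradient along Ycamera (cols)
    It = curr - prev                 # temporal change of pixel image density
    vectors = []
    for r, c in points:
        window = (slice(r - win, r + win + 1), slice(c - win, c + win + 1))
        A = np.stack([Ix[window].ravel(), Iy[window].ravel()], axis=1)
        b = -It[window].ravel()
        v, *_ = np.linalg.lstsq(A, b, rcond=None)
        vectors.append((float(v[0]), float(v[1])))   # (Vx, Vy)
    return vectors
```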
  • FIG. 7 is a diagram illustrating an image captured by the first electronic camera 10 .
  • a thick line illustrates the projected line of the surface of the display panel 12 on the CMOS image sensors 51 .
  • the display panel 12 includes a display area and a frame around the circumference of, and at approximately the same level as, the display screen surface. Therefore, the surface of the display panel 12 can also be the surface of the frame.
  • the alternate long and short dash line is drawn at a predetermined distance from the projected line of the surface of the display panel 12 .
  • the predetermined distance corresponds to the predetermined number M 0 of pixels at the step S 103 of FIG. 6 , and the region limited by the predetermined distance is denoted by REGION FOR OBTAINING MOTION VECTORS.
  • the linked plural lines illustrate a pen, i.e., the contours of the object extracted at step S101 of FIG. 6.
  • the nearest point of the pen to the display panel 12, which is marked by the black dot at the tip of the pen in FIG. 7, is in the REGION FOR OBTAINING MOTION VECTORS. Accordingly, the calculation of motion vectors executed at step S104 of FIG. 6 yields a motion vector and its components Vx and Vy, as illustrated in FIG. 7 for the nearest point (black dot) of the pen.
  • in step S105, the CPU 20 stores the motion vector components along the Xcamera direction of the calculated vectors, such as the component Vx illustrated in FIG. 7, in the main memory 21.
  • the CPU 20 stores those vector components from each obtained frame image data in succession.
  • the successively stored motion vector data is also referred to as trace data.
  • in step S106, the CPU 20 determines whether the extracted object, such as the pen in FIG. 7, has made an attempt to input coordinates on the display panel 12 based on the trace data of motion vectors. A determining method is further described later.
  • when the object has made an attempt to input coordinates, i.e., YES in step S107, the process proceeds to step S108, and when the object has not made an attempt to input coordinates, i.e., NO in step S107, the process branches to step S109.
  • in step S108, the CPU 20 measures the distance h between the optical axis crossing point Q and a projected point P of a contacting point A(x, y) of the object.
  • when the extracted object is physically soft, such as a human finger, the object may contact the display panel over an area rather than at a point.
  • in such a case, the contacting point A(x, y) can be replaced with the center of the contacting area.
  • in addition, the term contacting point A(x, y) applies not only to a state in which the object contacts the display panel 12, but also to a state in which the object is adjacent to the display panel 12.
  • a range from the optical axis crossing point Q to an end of the CMOS image sensor 51 contains a fixed number (denoted by N1) of pixels, which depends only upon the relative locations at which the wide-angle lens 50 and the CMOS image sensor 51 are disposed.
  • a range from the point P to the same end of the CMOS image sensor 51 contains a variable number of pixels (denoted by N2), which varies depending upon the location of the contacting point A(x, y) of the object. Therefore, the range between the point Q and the point P contains N1 - N2 pixels, and the distance h is obtained as the product of the pixel pitch of the CMOS image sensor 51 and the number N1 - N2.
  • in step S110, the CPU 20 solves the angle θ1 by using equations (1) and (2), with the known quantities f and α and the measured distance h. Regarding image data received from the second electronic camera 11, the CPU 20 solves the angle θ2 in a similar manner.
  • in step S111, the CPU 20 solves the coordinates x and y of the object on the display panel 12 by using equations (3) and (4), with the known quantity L and the solved angles θ1 and θ2 (see the triangulation sketch below).
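  • the triangulation behind equations (3) and (4) is not reproduced in this excerpt; the sketch below assumes the common form for two cameras whose lenses sit a distance L apart on the X-Line, with θ1 and θ2 measured from that line, which matches the geometry of FIG. 3 but should be treated as an assumption.

```python
import math

def triangulate(theta1: float, theta2: float, L: float):
    """Assumed form of equations (3) and (4): the wide-angle lenses 50 and 52
    sit at (0, 0) and (L, 0) on the X-Line, and the contacting point A(x, y)
    is seen from them at angles theta1 and theta2 (radians) to that line."""
    t1, t2 = math.tan(theta1), math.tan(theta2)
    x = L * t2 / (t1 + t2)
    y = x * t1                # equivalently L * t1 * t2 / (t1 + t2)
    return x, y
```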
  • in step S109, the CPU 20 determines whether the object is still within the predetermined region above the display panel 12 using the trace data of motion vector components Vx of the object.
  • when the object is still within the predetermined region, i.e., YES in step S109, the process returns to step S104 to obtain motion vectors again, and when the object is out of the predetermined region, i.e., NO in step S109, the process returns to step S101.
  • the angles θ1 and θ2 may also be solved by the first image processing circuit 30 and the second image processing circuit 31, respectively, and then the obtained angles θ1 and θ2 are transferred to the CPU 20 to solve the coordinates x and y.
  • the CPU 20 may also execute the above-described contour extracting operation in step S 101 , the distance measuring operation in step S 102 , the least number extracting and comparing operation in steps S 103 and S 104 in place of the first image processing circuit 30 .
  • the hard disk 27 may initially store program codes, and the program codes are loaded into the main memory 21 for execution each time the system 1S is booted up.
  • when the coordinate data input system 1S is in a writing input mode or a drawing input mode, the CPU 20 generates display data according to the obtained plural sets of coordinates x and y of the object, i.e., the locus data of the object, and sends the generated display data to the display controller 29.
  • the display controller 29 displays an image corresponding to the locus of the object on the display panel 12 of the coordinate data input apparatus 1 .
  • FIG. 8 is a diagram illustrating an image captured by the first electronic camera 10 when an input pen distorts the surface of a display panel.
  • the tip of the pen is out of the frame due to the distortion or warp in the surface of the display panel caused by the pressure of the pen stroke.
  • Intersections of the surface of the display panel 12 and the contours of the pen are denoted by point A and point B.
  • in such a case, the middle point between the points A and B may be presumed to be, or treated as substantially equivalent to, the nearest point of the pen, in the same way as a nearest point in the literal sense, such as the black dot at the tip of the pen illustrated in FIG. 7.
  • FIG. 9 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation as another example configured according to the present invention. The method is also executed on the coordinate data input system 1 S of FIG. 1 .
  • in step S201, the first image processing circuit 30 or the CPU 20 extracts contours of an object as a coordinate input member from frame image data received from the first electronic camera 10.
  • in step S202, the first image processing circuit 30 or the CPU 20 first extracts geometrical features of the shape of the extracted contours of the object.
  • for example, the first image processing circuit 30 or the CPU 20 determines the position of the barycenter of the contours of the object, then measures distances from the barycenter to plural points on the extracted contours in all radial directions, like the spokes of a wheel. Then, the CPU 20 extracts geometrical features of the contour shape of the object based on the relations between each direction and the respective distance.
  • Japanese Laid-Open Patent Publication No. 8-315152 may also be referred to for executing the above-stated feature extraction method.
  • the CPU 20 compares the extracted geometrical features of the contour shape of the object with features of cataloged shapes of potential coordinate input members one after the other.
  • the shapes of potential coordinate input members may be stored in the ROM 24 or the hard disk 27 in advance.
  • the axis of the coordinate input member may tilt in any direction at various tilting angles. Therefore, the CPU 20 may rotate the contour shape of the object by predetermined angles to compare it with the cataloged shapes.
  • FIG. 10 is a diagram illustrating an image captured by the first electronic camera 10 when an input pen as a coordinate input member tilts to the surface of a display panel 12 .
  • the pen tilts to the surface of the display panel 12 at an angle AR as illustrated. Therefore, when the CPU 20 inversely rotates, i.e., rotates counterclockwise, the contour shape of the object by the angle AR, the contour shape readily coincides with one of the cataloged shapes.
  • alternatively, the shapes of potential coordinate input members may be rotated by plural angles in advance, and the rotated shapes may be stored in the ROM 24 or the hard disk 27.
  • in that case, the real-time rotating operation of the contour shape is not needed; consequently, execution time for the coordinate data inputting operation is further reduced.
  • FIG. 11 is a diagram illustrating an image having an axially symmetric pen captured by the first electronic camera 10 .
  • various sorts of potential coordinate input members, such as a pen, a magic marker, a stick, a rod, etc., have axial symmetry. Therefore, the CPU 20 may analyze whether the captured object has axial symmetry, and when the captured object has axial symmetry, the CPU 20 can simply presume the captured object to be a coordinate input member.
  • the axial symmetry may be determined based on distances from the barycenter to plural points on the extracted contours.
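  • a minimal sketch of the kind of radial-distance signature and symmetry check described above follows; the sampling scheme, bin count, and tolerance are illustrative assumptions rather than the patent's method.

```python
import math

def radial_signature(contour, n_bins=36):
    """Distances from the barycenter of a contour (list of (x, y) points),
    sampled over n_bins radial directions like the spokes of a wheel."""
    cx = sum(p[0] for p in contour) / len(contour)
    cy = sum(p[1] for p in contour) / len(contour)
    bins = [0.0] * n_bins
    for x, y in contour:
        ang = math.atan2(y - cy, x - cx) % (2 * math.pi)
        i = int(ang / (2 * math.pi) * n_bins) % n_bins
        bins[i] = max(bins[i], math.hypot(x - cx, y - cy))
    return bins

def looks_axially_symmetric(signature, tol=0.15):
    """Crude symmetry test: radial distances mirrored about one axis should
    roughly match for an axially symmetric object such as a pen or rod."""
    n = len(signature)
    diffs = [abs(signature[i] - signature[(n - i) % n]) for i in range(n)]
    scale = max(signature) or 1.0
    return sum(diffs) / (n * scale) < tol
```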
  • in step S203, the CPU 20 determines whether the extracted contour shape of the object coincides with one of the cataloged shapes of potential coordinate input members by determining methods including those stated above.
  • when the contour shape coincides with one of the cataloged shapes, i.e., YES in step S203, the process proceeds to step S204, and when the contour shape does not coincide with any of the cataloged shapes, i.e., NO in step S203, the process returns to step S201.
  • in step S204, the first image processing circuit 30 or the CPU 20 measures plural distances between points on the contours of the extracted object and points on the projected line of the surface of the display panel 12.
  • the first image processing circuit 30 or the CPU 20 counts pixels included between a point on the contours of the extracted object and a point on the projected line of the surface of the display panel 12 with respect to each of the plural distances. A distance between two points is obtained as the product of the pixel pitch of the CMOS image sensor 51 and the number of pixels between the points.
  • in step S205, the first image processing circuit 30 or the CPU 20 extracts the least number of pixels, denoted by Nmin, among the plural numbers of pixels counted in step S204, and determines whether the minimum value Nmin is smaller than a predetermined number M0.
  • when the minimum value Nmin is smaller than the predetermined number M0, i.e., YES in step S205, the process proceeds to step S206, and when the minimum value Nmin is not smaller than the predetermined number M0, i.e., NO in step S205, the process returns to step S201.
  • in step S206, the first image processing circuit 30 or the CPU 20 calculates motion vectors (Vx, Vy) regarding predetermined plural points on the extracted contours of the object, including the nearest point to the display panel 12.
  • the component Vx is a vector component along the Xcamera axis, i.e., a direction perpendicular to the projected line of the surface of the display panel 12
  • the component Vy is a vector component along the Ycamera axis, i.e., a direction along the surface of the display panel 12 .
  • for this calculation, the first image processing circuit 30 or the CPU 20 uses two consecutive frames and utilizes the optical flow method stated above.
  • in step S207, the CPU 20 successively stores the motion vector components along the Xcamera direction (i.e., Vx) of the calculated motion vectors of successive frames in the main memory 21 as trace data.
  • in step S208, the CPU 20 determines whether the extracted object has made an attempt to input coordinates on the display panel 12 based on the trace data of motion vectors.
  • when the object has made an attempt to input coordinates, i.e., YES in step S209, the process branches to step S211, and when the object has not made an attempt, i.e., NO in step S209, the process proceeds to step S210.
  • in step S210, the CPU 20 determines whether the object is within a predetermined region above the display panel 12 using the trace data of motion vector components Vx of the object.
  • when the object is within the predetermined region, i.e., YES in step S210, the process returns to step S206 to obtain new motion vectors again, and when the object is out of the predetermined region, i.e., NO in step S210, the process returns to step S201.
  • in step S211, the first image processing circuit 30 or the CPU 20 measures a distance h on the CMOS image sensor 51 between the optical axis crossing point Q and a projected point P of a contacting point A(x, y).
  • in step S212, with reference to FIG. 4, the CPU 20 solves the angle θ1 by using equations (1) and (2), with the known quantities f and α and the measured distance h. Regarding image data received from the second electronic camera 11, the CPU 20 solves the angle θ2 in a similar manner.
  • in step S213, referring to FIG. 3, the CPU 20 solves the coordinates x and y of the object on the display panel 12 by using equations (3) and (4), with the known quantity L and the solved angles θ1 and θ2.
  • in this way, the CPU 20 only inputs coordinates of an object that coincides with one of the cataloged shapes of potential coordinate input members. Accordingly, the coordinate data input system 1S can prevent erroneous or unintentional inputting operations, e.g., inputting coordinates of an operator's arm, head, etc.
  • FIG. 12 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation as another example configured according to the present invention. This example is applied for, e.g., inputting a pointing or clicking operation for an icon, an item in a menu, etc., being displayed on the display panel 12 . The method is also executed on the coordinate data input system 1 S of FIG. 1 .
  • in step S301, the first image processing circuit 30 or the CPU 20 extracts contours of an object as a coordinate input member from frame image data received from the first electronic camera 10.
  • in step S302, the first image processing circuit 30 or the CPU 20 determines whether the contour shape of the object is regarded as a coordinate input member.
  • when the contour shape of the object is regarded as a coordinate input member, i.e., YES in step S302, the process proceeds to step S303, and when the contour shape of the object is not regarded as a coordinate input member, i.e., NO in step S302, the process returns to step S301.
  • in step S303, the first image processing circuit 30 or the CPU 20 measures plural distances between points on the contours of the extracted object and points on the projected line of the surface of the display panel 12.
  • the first image processing circuit 30 or the CPU 20 counts pixels included between a point on the contours of the extracted object and a point on the projected line of the surface of the display panel 12 regarding each of the distances.
  • a distance between two points is obtained as the product of the pixel pitch of the CMOS image sensor 51 and the number of pixels between the points.
  • in step S304, the first image processing circuit 30 or the CPU 20 extracts the least number of pixels Nmin among the plural numbers of pixels counted in step S303, and determines whether the minimum value Nmin is smaller than a predetermined number M0.
  • when the minimum value Nmin is smaller than the predetermined number M0, i.e., YES in step S304, the process proceeds to step S305, and when the minimum value Nmin is not smaller than the predetermined number M0, i.e., NO in step S304, the process returns to step S301.
  • in step S305, the first image processing circuit 30 or the CPU 20 calculates motion vectors (Vx, Vy) regarding predetermined plural points on the extracted contours of the object, including the nearest point to the display panel 12.
  • the component Vx is a vector component along the Xcamera axis, i.e., a direction perpendicular to the projected line of the surface of the display panel 12
  • the component Vy is a vector component along the Ycamera axis, i.e., a direction along the surface of the display panel 12 .
  • the first image processing circuit 30 or the CPU 20 uses two consecutive frames of image data and utilizes the optical flow method stated above.
  • in step S306, the CPU 20 successively stores the motion vector components along the Xcamera direction, i.e., the components Vx, of plural frames in the main memory 21 as trace data.
  • in step S307, the CPU 20 determines whether the moving direction of the extracted object has been reversed from an advancing motion toward the display panel 12 to a leaving motion from the panel 12, based on the trace data of motion vectors.
  • when the moving direction has been reversed, i.e., YES in step S307, the process branches to step S309, and when the moving direction has not been reversed, i.e., NO in step S307, the process proceeds to step S308.
  • in step S308, the first image processing circuit 30 or the CPU 20 determines whether the object is within a predetermined region above the display panel 12 using the trace data of motion vector components Vx of the object.
  • when the object is within the predetermined region, i.e., YES in step S308, the process returns to step S305 to obtain new motion vectors again, and when the object is out of the predetermined region, i.e., NO in step S308, the process returns to step S301.
  • in step S309, the first image processing circuit 30 or the CPU 20 measures a distance h on the CMOS image sensor 51 between the optical axis crossing point Q and a projected point P of a contacting point A(x, y) of the object.
  • as the projected point P, for example, the starting point of the motion vector centered among the plural motion vectors whose direction has been reversed is used.
  • in step S310, referring to FIG. 4, the CPU 20 solves the angle θ1 by using equations (1) and (2), with the known quantities f and α and the measured distance h. Regarding image data received from the second electronic camera 11, the CPU 20 solves the angle θ2 in a similar manner.
  • in step S311, referring to FIG. 3, the CPU 20 solves the coordinates x and y of the object on the display panel 12 by using equations (3) and (4), with the known quantity L and the solved angles θ1 and θ2.
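  • one plausible way to detect the direction reversal tested in step S307 from the stored trace of Vx components is sketched below; the sign convention (positive Vx meaning motion toward the panel) and the helper name are assumptions.

```python
def direction_reversed(vx_trace, eps=0.0):
    """True when the trace of Vx components shows the object first advancing
    toward the display panel (Vx > 0) and then leaving it (Vx < 0)."""
    advanced = False
    for vx in vx_trace:
        if vx > eps:
            advanced = True
        elif advanced and vx < -eps:
            return True      # reversal: treat as a pointing/clicking input
    return False

# Example trace: approach, hover, then withdraw -> detected as a click.
print(direction_reversed([0.8, 0.5, 0.1, -0.4]))  # True
```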
  • FIG. 13 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation as another example configured according to the present invention. This example is applied to, for example, inputting information while a coordinate input member is staying at the surface of the display panel 12 . The method is also executed on the coordinate data input system 1 S of FIG. 1 .
  • in step S401, the first image processing circuit 30 or the CPU 20 extracts contours of an object as a coordinate input member from frame image data received from the first electronic camera 10.
  • in step S402, the first image processing circuit 30 or the CPU 20 determines whether the contour shape of the object is regarded as a coordinate input member.
  • when the contour shape of the object is regarded as a coordinate input member, i.e., YES in step S402, the process proceeds to step S403, and when the contour shape of the object is not regarded as a coordinate input member, i.e., NO in step S402, the process returns to step S401.
  • in step S403, the first image processing circuit 30 or the CPU 20 measures plural distances between points on the contours of the extracted object and points on the projected line of the surface of the display panel 12.
  • the first image processing circuit 30 or the CPU 20 counts pixels included between a point on the contours of the extracted object and a point on the projected line of the surface of the display panel 12 for each of the distances. A distance between two points is obtained as the product of the pixel pitch of the CMOS image sensor 51 and the number of pixels between the points.
  • in step S404, the first image processing circuit 30 or the CPU 20 extracts the least number of pixels Nmin among the plural numbers of pixels counted in step S403, and determines whether the minimum value Nmin is smaller than a predetermined number M0.
  • when the minimum value Nmin is smaller than the predetermined number M0, i.e., YES in step S404, the process proceeds to step S405, and when the minimum value Nmin is not smaller than the predetermined number M0, i.e., NO in step S404, the process returns to step S401.
  • in step S405, the first image processing circuit 30 or the CPU 20 calculates motion vectors (Vx, Vy) regarding predetermined plural points on the extracted contours of the object, including the nearest point to the display panel 12.
  • Vx is a vector component along the Xcamera axis, i.e., a direction perpendicular to the projected line of the surface of the display panel 12
  • Vy is a vector component along the Ycamera axis, i.e., a direction along the surface of the display panel 12 .
  • the first image processing circuit 30 or the CPU 20 uses two consecutive frames and utilizes the optical flow method stated above.
  • in step S406, the CPU 20 successively stores the motion vector components along the Xcamera direction of the calculated vectors, i.e., the components Vx, in the main memory 21 as trace data.
  • in step S407, the CPU 20 determines whether the vector component Vx, which is perpendicular to the plane of the display panel 12, has become zero after an advancing motion toward the display panel 12.
  • when the component Vx has become zero, i.e., YES in step S407, the process branches to step S409, and when the component Vx has not become zero yet, i.e., NO in step S407, the process proceeds to step S408.
  • in step S408, the CPU 20 determines whether the object is located within a predetermined region above the display panel 12 using the trace data of motion vector components Vx of the object.
  • when the object is within the predetermined region, i.e., YES in step S408, the process returns to step S405 to obtain new motion vectors again, and when the object is out of the predetermined region, i.e., NO in step S408, the process returns to step S401.
  • in step S409, the CPU 20 determines that a coordinate inputting operation has been started, and transitions the coordinate data input system 1S into a coordinate input state.
  • in step S410, the first image processing circuit 30 or the CPU 20 measures a distance h between the optical axis crossing point Q and the projected point P of a contacting point A(x, y) of the object on the CMOS image sensor 51.
  • in step S411, referring to FIG. 4, the CPU 20 solves the angle θ1 by using equations (1) and (2), with the known quantities f and α and the measured distance h. Regarding image data received from the second electronic camera 11, the CPU 20 solves the angle θ2 in a similar manner.
  • in step S412, referring to FIG. 3, the CPU 20 solves the coordinates x and y of the object on the display panel 12 by using equations (3) and (4), with the known quantity L and the solved angles θ1 and θ2.
  • in step S413, the CPU 20 determines whether the motion vector component Vy at the point P has changed while the other motion vector component Vx remains zero. In other words, the CPU 20 determines whether the object has moved in any direction along the surface of the display panel 12.
  • when the object has moved along the surface, i.e., YES in step S413, the process returns to step S410 to obtain the coordinates x and y of the object at the moved location.
  • when the object has not moved, i.e., NO in step S413, the process proceeds to step S414.
  • the CPU 20 may also examine the motion vector component Vy under the condition that the other component Vx is a positive value, which represents a direction approaching the display panel 12, in addition to the above-described condition that the component Vx is zero.
  • in step S414, the CPU 20 determines whether the motion vector component Vx regarding the point P has become a negative value, which represents a direction leaving the display panel 12.
  • if YES in step S414, the process proceeds to step S415, and if NO, the process returns to step S410.
  • in step S415, the CPU 20 determines that the coordinate inputting operation has been completed, and terminates the coordinate input state of the coordinate data input system 1S. The overall pen-down, move, and pen-up behavior of steps S407 through S415 is sketched below.
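  • the behavior of steps S407 through S415 can be condensed into a small state machine like the one below; the state names, the exact zero test on Vx, and the callback are illustrative assumptions, not the patent's implementation.

```python
from enum import Enum, auto

class InputState(Enum):
    IDLE = auto()       # object approaching or outside the coordinate input state
    INPUTTING = auto()  # coordinate input state (steps S409 through S414)

def update_state(state, vx, vy, emit_point):
    """Advance the assumed state machine by one frame of motion data;
    emit_point() stands in for solving and recording coordinates (S410-S412)."""
    if state is InputState.IDLE:
        if vx == 0:               # advance stopped at the panel: S407 -> S409
            emit_point()
            return InputState.INPUTTING
    else:
        if vx < 0:                # leaving the panel: S414 -> S415, input ends
            return InputState.IDLE
        if vy != 0 or vx == 0:    # staying at or moving along the surface: S413 -> S410
            emit_point()
    return state

# Example: approach, touch, drag along the surface, then lift.
state = InputState.IDLE
for vx, vy in [(0.5, 0.0), (0.0, 0.0), (0.0, 0.3), (-0.2, 0.0)]:
    state = update_state(state, vx, vy, lambda: None)
```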
  • the CPU 20 can generate display data according to the coordinate data obtained during the above-described coordinate input state, and transmit the generated display data to the display controller 29 to display an image of the input data on the display panel 12.
  • FIG. 14 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation as another example configured according to the present invention. These operational steps are also executed on the coordinate data input system 1 S of FIG. 1 .
  • in this example, the frame rate output from each of the first and second electronic cameras 10 and 11 varies depending on the distance of a coordinate input member from the display panel 12.
  • the frame rate may be expressed as the number of frames per second.
  • when the coordinate input member comes close to the display panel 12, the frame rate output from each of the CMOS image sensors 51 and 53 is increased to obtain the motion of the coordinate input member in further detail.
  • otherwise, the output frame rate is decreased to reduce the loads on the other devices in the coordinate data input system 1S, such as the first image processing circuit 30, the second image processing circuit 31, the CPU 20, etc.
  • the frame rate of each of the first and second electronic cameras 10 and 11, i.e., the frame rate of each of the CMOS image sensors 51 and 53, can be varied as necessary between at least two frame rates, one referred to as a high frame rate and the other referred to as a low frame rate.
  • a data size per unit time input to the first image processing circuit 30 and the second image processing circuit 31 varies depending on the frame rate of the image data.
  • the low frame rate is initially selected as a default frame rate.
  • in step S501, the first image processing circuit 30 or the CPU 20 extracts contours of an object as a coordinate input member from frame image data received from the first electronic camera 10.
  • in step S502, the first image processing circuit 30 or the CPU 20 determines whether the contour shape of the object is regarded as a coordinate input member.
  • when the contour shape of the object is regarded as a coordinate input member, i.e., YES in step S502, the process proceeds to step S503, and when the contour shape of the object is not regarded as a coordinate input member, i.e., NO in step S502, the process returns to step S501.
  • in step S503, the first image processing circuit 30 or the CPU 20 measures plural distances between points on the contours of the extracted object and points on the projected line of the surface of the display panel 12.
  • the first image processing circuit 30 or the CPU 20 counts pixels included between a point on the contours of the extracted object and a point on the projected line of the surface of the display panel 12 regarding each of the distances.
  • a distance between two points is obtained as the product of the pixel pitch of the CMOS image sensor 51 and the number of pixels between the points.
  • in step S504, the first image processing circuit 30 or the CPU 20 extracts the least number of pixels Nmin among the plural numbers of pixels counted in step S503, and determines whether the minimum value Nmin is smaller than a first predetermined number M1.
  • when the minimum value Nmin is smaller than the first predetermined number M1, i.e., YES in step S504, the process proceeds to step S505, and when the minimum value Nmin is not smaller than the first predetermined number M1, i.e., NO in step S504, the process returns to step S501.
  • the first predetermined number M1 used in step S504 is larger than a second predetermined number M0, which is used in the following steps for starting the tracing of vector data.
  • in step S505, the first image processing circuit 30 sends a command to the first electronic camera 10 requesting an increase of the output frame rate of the CMOS image sensor 51.
  • a command for switching the frame rate i.e., from the low frame rate to the high frame rate or from the high frame rate to the low frame rate, is transmitted through a cable that also carries image data.
  • the first electronic camera 10 controls the CMOS image sensor 51 to increase the output frame rate thereof.
  • the charge time of each of photoelectric conversion devices, i.e., the imaging cells, in the CMOS image sensor 51 may be decreased.
  • step S 506 the CPU 20 determines whether the object is in a second predetermined distance from the display panel 12 to start a tracing operation of motion vectors of the object. In other words, the CPU 20 determines if the minimum value Nmin is smaller than the second predetermined number M 0 , which corresponds to the second predetermined distance, and if YES, the process proceeds to step S 507 , and if No, the process branches to step S 508 .
  • step S 507 the CPU 20 traces the motion of the object and generates coordinate data of the object according to the traced motion vectors.
  • the second predetermined number M 0 is smaller than the first predetermined number M 1 ; therefore, the spatial range for tracing motion vectors of the object is smaller than the spatial range for outputting image data with the high frame rate from the CMOS image sensor 51 .
  • step S 508 the first image processing circuit 30 determines whether the minimum value Nmin is still smaller than the first predetermined number M 1 , i.e., the object is still in the range of the first predetermined number M 1 .
  • the process returns to step S 506 , and when the minimum value Nmin is no longer smaller than the first predetermined number M 1 , i.e., NO in step S 508 , the process proceeds to step S 509 .
  • step S 509 the first image processing circuit 30 sends a command to the first electronic camera 10 to request decreasing the output frame rate of the CMOS image sensor 51 , and then the process returns to the step S 501 .
  • the first electronic camera 10 controls the CMOS image sensor 51 to decrease again the output frame rate thereof.
  • the second electronic camera 11 and the second image processing circuit 31 operate substantially the same as the first electronic camera 10 and the first image processing circuit 30 operate.
  • the first electronic camera 10 and the second electronic camera 11 operate at the low frame rate, and output a relatively small quantity of image data to the other devices. Consequently, power consumption of the coordinate data input system 1 S is decreased.
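The frame-rate control of FIG. 14 can be summarized as a small state machine around the two pixel-count thresholds M1 and M0. The following Python sketch mirrors steps S501 through S509 under assumed interfaces: the camera object, the contour helpers, and the numeric threshold values are illustrative stand-ins rather than the patent's implementation.

```python
# Minimal sketch of the FIG. 14 frame-rate switching logic (steps S501-S509).
# M1 and M0 are the first and second predetermined pixel counts, with M1 > M0.
# The camera object and the contour helper callables are hypothetical stand-ins.

M1 = 120   # pixel-count threshold for switching to the high frame rate (assumed value)
M0 = 40    # pixel-count threshold for starting motion-vector tracing (assumed value)

def min_pixel_count(contour_points, panel_line_x):
    """Count pixels between each contour point and the projected panel line and
    return the smallest count (Nmin).  The physical distance would be Nmin
    multiplied by the sensor's pixel pitch."""
    return min(abs(x - panel_line_x) for (x, _) in contour_points)

def run_frame_rate_control(camera, extract_contours, looks_like_input_member,
                           trace_motion, panel_line_x):
    camera.set_frame_rate("low")                        # default: low frame rate
    while True:
        frame = camera.read_frame()
        contours = extract_contours(frame)               # step S501
        if not looks_like_input_member(contours):        # step S502
            continue
        nmin = min_pixel_count(contours, panel_line_x)   # step S503
        if nmin >= M1:                                   # step S504: object still far away
            continue
        camera.set_frame_rate("high")                    # step S505: object is close
        while True:
            frame = camera.read_frame()
            contours = extract_contours(frame)
            nmin = min_pixel_count(contours, panel_line_x)
            if nmin < M0:                                # step S506: close enough to trace
                trace_motion(frame, contours)            # step S507
            elif nmin >= M1:                             # step S508: object left the region
                camera.set_frame_rate("low")             # step S509
                break
```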
  • FIG. 15 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation as another example configured according to the present invention. These operational steps are also executed on the coordinate data input system 1 S of FIG. 1 .
  • an image area output from each of the CMOS image sensors 51 and 53 varies depending upon a distance of a coordinate input member from the display panel 12 .
  • the output image area is limited within a predetermined distance from a coordinate input member depending on a location of the coordinate input member.
  • the image data size included in a frame is also decreased, which reduces the load on devices such as the first image processing circuit 30, the second image processing circuit 31, the CPU 20, etc. That is, the power consumption of the coordinate data input system 1 S is also decreased.
  • each of the CMOS image sensors 51 and 53 can be randomly accessed by pixel, i.e., the pixels in the CMOS image sensors 51 and 53 can be randomly addressed to output the image signal thereof.
  • This random accessibility enables the above-stated output image area limitation.
  • the output image area is set, as a default image area, to cover a region bounded by the whole horizontal span of the display panel 12 and a predetermined altitude range above the display panel 12.
  • step S 601 the first image processing circuit 30 or the CPU 20 extracts contours of an object as a coordinate input member from frame image data received from the first electronic camera 10 .
  • step S 602 the first image processing circuit 30 or the CPU 20 determines whether the contour shape of the object is regarded as a coordinate input member.
  • the contour shape of the object is regarded as a coordinate input member, i.e., YES in step S 602
  • the process proceeds to step S 603
  • the contour shape of the object is not regarded as a coordinate input member, i.e., NO in step S 602
  • the process returns to step S 601 .
  • step S 603 the first image processing circuit 30 or the CPU 20 measures plural distances between points on the contours of the extracted object and points on the projected line of the surface of the display panel 12 .
  • the first image processing circuit 30 or the CPU 20 counts pixels included between a point on the contours of the extracted object and a point on the projected line of the surface of the display panel 12 for each of the measured distances.
  • a distance between two points is obtained as the product of the pixel pitch of the CMOS image sensor 51 and the number of pixels between the two points.
  • step S 604 the first image processing circuit 30 or the CPU 20 extracts the least number of pixels Nmin among the plural numbers of pixels counted in step S 603 , and determines whether the minimum value Nmin is smaller than a predetermined number K.
  • the minimum value Nmin is smaller than the predetermined number K, i.e., YES in step S 604
  • the process proceeds to step S 605
  • the minimum value Nmin is not smaller than the predetermined number K, i.e., NO in step S 604
  • the process returns to step S 601 .
  • FIG. 16A is a diagram illustrating an image captured by the first electronic camera 10 and an output limitation of the image.
  • the symbol K denotes a predetermined distance
  • the symbol ym denotes a coordinate of the illustrated coordinate input member from an end of the CMOS image sensor 51 in the Ycamera axis direction.
  • the first image processing circuit 30 first calculates the distance ym of the object from an end of the CMOS image sensor 51 . After that, the first image processing circuit 30 sends a command to the first electronic camera 10 to limit the output image area of the CMOS image sensor 51 in a relatively small area. Referring back to FIG. 16A , the limited area corresponds to an inside area enclosed by a predetermined distance ⁇ from the coordinate input member for both sides in the Ycamera axis direction.
  • Such a command for limiting the output image area is transmitted through a common cable that carries image data.
  • the first electronic camera 10 controls the CMOS image sensor 51 so as to limit the output image area thereof.
  • FIG. 16B is a diagram illustrating an image captured by the first electronic camera 10 and a displaced output limitation of the image.
  • the symbol ym denotes an original location of a coordinate input member and the symbol ym 1 denotes a displaced location thereof.
  • the symbol LL denotes a displacement of the coordinate input member from the original location ym to the displaced location ym 1 .
  • the limiting range ⁇ of the output image also follows to the new location ym 1 .
  • step S 606 the first image processing circuit 30 determines whether the object has moved in the Ycamera axis direction.
  • the process proceeds to step S 607 , and if NO in step S 606 , the process skips the step S 607 and jumps to step S 608 .
  • step S 607 the first image processing circuit 30 sends a command to the first electronic camera 10 to limit the output image area of the CMOS image sensor 51 in the distance ⁇ around the moved location ym 1 of the object as illustrated in FIG. 16B .
  • the first electronic camera 10 carries on sending images limited in an area corresponding to the distance ⁇ around the object to the first image processing circuit 30 .
  • step S 608 the CPU 20 determines whether the object is within a predetermined distance from the display panel 12 to start a tracing operation of motion vectors of the object. In other words, the CPU 20 determines if the minimum value Nmin is smaller than the predetermined number M 0 , which corresponds to the predetermined distance, and if YES in step S 608 , the process proceeds to step S 609 , and if No in step S 608 , the process branches to step S 610 .
  • step S 609 the CPU 20 traces motion vectors of the object, and inputs coordinate data of the object according to traced motion vectors.
  • step S 610 the CPU 20 determines whether the object is still within the predetermined altitude K above the display panel 12 for outputting image data limited in the range 2 ⁇ .
  • the process returns to step S 608 , and when the object is no longer within the predetermined altitude K, i.e., NO in step S 610 , the process proceeds to step S 611 .
  • step S 611 the first image processing circuit 30 sends a command to the first electronic camera 10 to expand the output image area of the CMOS image sensor 51 to cover the whole area of the display panel 12 , and then the process returns to the step S 601 .
  • when the first electronic camera 10 receives the command, the first electronic camera 10 controls the CMOS image sensor 51 to expand the output image area to cover the whole area of the display panel 12, so as to be in the same state as when the coordinate data input system 1 S is turned on.
  • the second electronic camera 11 and the second image processing circuit 31 operate substantially the same as the first electronic camera 10 and the first image processing circuit 30 operate.
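The output-area limitation of FIG. 15 and FIGS. 16A and 16B can be sketched in the same style: a window of fixed half-width is kept centered on the object's Ycamera position and follows the object as it moves. In the sketch below, DELTA stands for the predetermined limiting distance (denoted by a symbol in the figures), K for the predetermined altitude threshold, and M0 for the tracing threshold; the camera interface and the numeric values are assumptions for illustration.

```python
# Sketch of the FIG. 15 / FIG. 16 output-image-area limitation (steps S601-S611).
# K is the predetermined altitude threshold in pixels, DELTA the half-width of
# the limited output window, and M0 the threshold for tracing motion vectors.

K = 100       # assumed value
DELTA = 80    # assumed value
M0 = 40       # assumed value

def limit_window(ym, sensor_width, delta=DELTA):
    """Clamp the window [ym - delta, ym + delta] to the sensor extent."""
    return max(0, ym - delta), min(sensor_width - 1, ym + delta)

def run_output_area_control(camera, extract_contours, looks_like_input_member,
                            trace_motion, panel_line_x, sensor_width):
    camera.set_output_area(None)                         # default: full image area
    while True:
        frame = camera.read_frame()
        contours = extract_contours(frame)               # step S601
        if not looks_like_input_member(contours):        # step S602
            continue
        nmin, ym = min((abs(x - panel_line_x), y) for (x, y) in contours)  # S603-S604
        if nmin >= K:
            continue
        camera.set_output_area(limit_window(ym, sensor_width))   # step S605
        while True:
            frame = camera.read_frame()
            contours = extract_contours(frame)
            nmin, new_ym = min((abs(x - panel_line_x), y) for (x, y) in contours)
            if new_ym != ym:                             # steps S606-S607: object moved
                ym = new_ym
                camera.set_output_area(limit_window(ym, sensor_width))
            if nmin < M0:                                # steps S608-S609: trace motion
                trace_motion(frame, contours)
            elif nmin >= K:                              # steps S610-S611: expand again
                camera.set_output_area(None)
                break
```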
  • the CMOS image sensors 51 and 53 are desirably provided with about 2000 imaging cells (pixels) in one direction.
  • the following examples according to the present invention are configured to further reduce costs of a coordinate data input system.
  • FIG. 17 is a schematic view illustrating a coordinate data input system 60 S as another example configured according to the present invention.
  • the coordinate data input system 60 S includes a coordinate data input apparatus 60 and a control apparatus 61 .
  • the coordinate data input apparatus 60 includes a first linear sensor camera 70 , a second linear sensor camera 71 , an area sensor camera 72 , a display panel 73 , and a frame 74 .
  • the linear sensor camera may also be referred to as a line sensor camera, a one-dimensional sensor camera, a 1-D camera, etc.
  • the area sensor camera may also be referred to as a video camera, a two-dimensional camera, a two-dimensional electronic camera, a 2-D camera, a digital still camera, etc.
  • each of the first linear sensor camera 70 and the second linear sensor camera 71 includes a wide-angle lens, which covers 90 degrees or more, and a charge coupled device (CCD) linear image sensor.
  • the first linear sensor camera 70 and the second linear sensor camera 71 output image data as analog signals.
  • the CCD linear image sensor is provided with, for example, 2000 pixel imaging cells, i.e., photoelectric converters, such as photodiodes.
  • the first linear sensor camera 70 and the second linear sensor camera 71 have sufficient image resolution to repetitively read an image of an XGA screen display in the direction along the array of the imaging cells.
  • the two linear sensor cameras are disposed with an appropriate crossing angle of their optical axes, and therefore enable inputting various information including two-dimensional coordinates, such as information on a selecting operation of an item in a menu window, a drawing operation of free hand lines and letters, etc.
  • the area sensor camera 72 includes a wide-angle lens, which covers 90 degrees or more, a two-dimensional CMOS image sensor, and an analog to digital converter.
  • the two-dimensional CMOS image sensor has a sufficient number of imaging cells and a sufficient output frame rate to enable recognizing the motion of a coordinate input member.
  • as the two-dimensional CMOS image sensor, for example, a sensor having 640 by 480 imaging cells, which is referred to as VGA resolution, may be used.
  • the area sensor camera 72 outputs image data as a digital signal, the data being converted by the embedded analog to digital converter.
  • each of the first linear sensor camera 70, the second linear sensor camera 71, and the area sensor camera 72 includes a smaller number of imaging pixels compared to the two-dimensional image sensor used in the coordinate data input system 1 S of FIG. 1 . Consequently, those cameras 70, 71 and 72 can output frame images at a higher frame rate compared to the two-dimensional image sensor used in the coordinate data input system 1 S of FIG. 1 .
  • the first linear sensor camera 70 and the area sensor camera 72 are disposed at an upper left corner of the display panel 73, such that the optical axis of each of the wide-angle lenses forms an angle of approximately 45 degrees with a horizontal edge of the display panel 73.
  • the second linear sensor camera 71 is disposed at an upper right corner of the display panel 73 , such that the optical axis of the wide-angle lens forms an angle of approximately 45 degrees with a horizontal edge of the display panel 73 .
  • the optical axis of each of the cameras 70, 71 and 72 is disposed approximately parallel to the display surface of the display panel 73.
  • each of the cameras 70, 71 and 72 can capture the whole display screen area of the display panel 73, and transmit the captured image data to the control apparatus 61.
  • the display panel 73 displays an image with, for example, a 48 by 36 inch screen and 1024 by 768-pixel resolution.
  • a plasma display panel, a rear projection liquid crystal display, a rear projection CRT display, etc. may be used as the display panel 73 .
  • the frame 74 is preferably structured with a material having a low optical reflection coefficient on its surface, such as black-painted or black-plated metal, black resin, etc.
  • the frame 74 is mounted on the left side, the bottom, and the right side circumferences of the display panel 73 .
  • the frame 74 is disposed protruding above the surface of the display panel 73 .
  • the dimensional amount of the protrusion may be equal to or greater than the extent covered by the angle of view of the first linear sensor camera 70 and the second linear sensor camera 71 in the direction perpendicular to the surface of the display panel 73.
  • the first linear sensor camera 70 and the second linear sensor camera 71 capture the frame 74 and output image data thereof, i.e., black image data, respectively.
  • FIG. 18 is an exemplary block diagram of the control apparatus 61 of the coordinate data input system 60 S of FIG. 17 configured according to the present invention.
  • the control apparatus 61 includes a central processing unit (CPU) 20 , a main memory 21 , a clock generator 22 , a bus controller 23 , a read only memory (ROM) 24 , a peripheral component interconnect (PCI) bridge 25 , a cache memory 26 , a hard disk 27 , a hard disk (HD) controller 28 , a display controller 29 , a first image processing circuit 90 , a second image processing circuit 91 , and a third image processing circuit 92 .
  • the control apparatus 61 also includes a local area network (LAN) controller 32 , a LAN interface 33 , a floppy disk (FD) controller 34 , a FD drive 35 , a compact disc read only memory (CD-ROM) controller 36 , a CD-ROM drive 37 , a keyboard controller 38 , a mouse interface 39 , a real time clock (RTC) generator 40 , a CPU bus 41 , a PCI bus 42 , an internal X bus 43 , a keyboard 44 , and a mouse 45 .
  • in FIG. 18 , the elements that are substantially the same as those in FIG. 2 are denoted by the same reference numerals. Therefore, a description of the same elements in FIG. 18 as in FIG. 2 is not provided here to avoid redundancy.
  • the first image processing circuit 90 receives digital image data output from the area sensor camera 72 through a digital interface, such as an RS-422 interface.
  • the first image processing circuit 90 then executes an object extraction process, an object shape recognition process, an object motion vector determining process, etc.
  • the second image processing circuit 91 includes an analog to digital converting circuit, and receives the analog image signal output from the first linear sensor camera 70 via a coaxial cable. Then, the second image processing circuit 91 detects a linear (one-dimensional) location of an object based on the received image signal. Further, the second image processing circuit 91 supplies the first linear sensor camera 70 with a clock signal and an image transfer pulse via the above-described digital interface.
  • the clock signal and the image transfer pulse supplied to the first linear sensor camera 70 and those supplied to the second linear sensor camera 71 are maintained in synchronization.
  • FIG. 19 is a diagram illustrating an analog signal waveform output from the first linear sensor camera 70 or the second linear sensor camera 71 .
  • the analog signal waveform in FIG. 19 has been observed with an oscilloscope, and the horizontal axis represents time and the vertical axis represents a voltage.
  • the horizontal axis also corresponds to a direction of the aligned imaging cells.
  • the PEDESTAL LEVEL of the waveform corresponds to an output voltage of a captured image of the black frame 74 .
  • a positive pulse in the waveform corresponds to a captured image of a coordinate input member having a relatively high optical reflection coefficient, e.g., white, red, gray, etc. Lighting fixtures and/or sunlight flooding in from windows irradiate both the black frame 74 and a coordinate input member; however, the black frame 74 reflects little light while the coordinate input member reflects more light, and thereby the linear CCD image sensors in the linear sensor cameras 70 and 71 generate such a waveform having a pulse thereupon.
  • the height of the pulse is proportional to the optical reflection coefficient of the coordinate input member. Further, the height and width of the pulse are affected by the size of the coordinate input member and the distance thereof from the first linear sensor camera 70 and the second linear sensor camera 71 . For example, when the coordinate input member is thin and located far from the first linear sensor camera 70 and the second linear sensor camera 71 , the pulse on the output voltage waveform generally becomes narrow and short.
  • the height and width of the pulse are also affected by the location of the coordinate input member in the direction perpendicular to the surface of the display panel 73 .
  • a pulse appears with a maximum height and width.
  • the height and width of the pulse become thinner and shorter. If the coordinate input member is out of the angle of view of the first linear sensor camera 70 and the second linear sensor camera 71 , the pulse disappears.
  • the alternate long and short dash line denoted by THRESHOLD LEVEL represents a threshold voltage used for discriminating or slicing a pulse portion of the waveform signal.
  • the threshold level may be determined based on an experiment. Further, the threshold level may be readjusted according to illumination of the room in which the coordinate data input system 60 S is installed for use.
  • the second image processing circuit 91 detects a peak of a pulse in an image signal output from the CCD linear image sensor of the first linear sensor camera 70 as a location P that corresponds to the contact point A(x, y) of a coordinate input member, when the pulse exceeds the threshold level. After that, the second image processing circuit 91 measures a distance h between the optical axis crossing point Q of the first linear sensor camera 70 and the projected point P of the contacting point of the coordinate input member on the CCD linear image sensor.
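The pulse discrimination illustrated in FIG. 19 amounts to locating the peak of the digitized line-scan signal, checking it against the threshold level, and measuring its offset from the point where the optical axis crosses the sensor. A minimal sketch follows; the threshold value, pixel pitch, and axis-crossing index are placeholder assumptions.

```python
# Sketch of the FIG. 19 pulse detection on a digitized linear-CCD line scan.
# `samples` is one line of digitized sensor output (one value per imaging cell).

PIXEL_PITCH_MM = 0.007     # assumed pixel pitch of the linear CCD (mm)
THRESHOLD = 0.8            # assumed threshold voltage above the pedestal level

def find_pulse_offset(samples, axis_index, threshold=THRESHOLD):
    """Return the distance h (in mm) between the optical-axis crossing point Q
    and the projected point P of the coordinate input member, or None if no
    pulse exceeds the threshold."""
    peak_index, peak_value = max(enumerate(samples), key=lambda iv: iv[1])
    if peak_value < threshold:
        return None                                     # no coordinate input member seen
    return (peak_index - axis_index) * PIXEL_PITCH_MM   # signed offset h

# Example: a synthetic line scan with a pulse near cell 1200 of a 2000-cell sensor.
if __name__ == "__main__":
    line = [0.1] * 2000                                 # pedestal level
    for i in range(1195, 1206):
        line[i] = 1.2                                   # reflected pulse
    print(find_pulse_offset(line, axis_index=1000))     # -> positive offset in mm
```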
  • FIG. 20 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation in the coordinate data input system 60 S of FIG. 17 as an example configured according to the present invention.
  • step S 701 the first image processing circuit 90 or the CPU 20 extracts contours of an object as a coordinate input member from the frame image data received from the area sensor camera 72 .
  • the first image processing circuit 90 measures plural distances between points on the contours of the extracted object and points on the projected line of the surface of the display panel 73 .
  • the first image processing circuit 90 or the CPU 20 counts pixels included between a point on the contours of the extracted object and a point on the projected line of the surface of the display panel 73 for each of the measuring distances.
  • a pixel pitch of the CMOS image sensor is known, and therefore the number of pixels between two points determines the distance between the two points.
  • step S 703 the first image processing circuit 90 or the CPU 20 extracts the least number of pixels, which is denoted by Nmin, among the plural numbers of pixels counted in step S 702 , and determines whether the minimum value Nmin is smaller than a predetermined number M 0 .
  • the minimum value Nmin is smaller than the predetermined number M 0 , i.e., YES in step S 703
  • the process proceeds to step S 704
  • the minimum value Nmin is not smaller than the predetermined number M 0 , i.e., NO in step S 703 , the process returns to step S 701 .
  • step S 704 the first image processing circuit 90 or the CPU 20 calculates motion vectors regarding predetermined plural points on the extracted contours of the object, including the nearest point to the display panel 73 , which corresponds to the minimum value Nmin.
  • the first image processing circuit 90 or the CPU 20 uses the identical frame image data used for extracting the contours and the next following frame image data received from the area sensor camera 72 .
  • the first image processing circuit 90 or the CPU 20 first obtains optical flows, i.e., velocity vectors by calculating a rate of temporal change of a pixel image density and a rate of spatial change of image density of pixels surrounding the pixel used for calculating the temporal change.
  • the motion vectors are expressed in the coordinate system (Xcamera, Ycamera), in which the Ycamera axis corresponds to the projected line of the surface of the display panel 73 focused on the CMOS area sensor and the Xcamera axis corresponds to the direction perpendicular to the display panel 73 .
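The velocity estimation described for step S704 follows the classical gradient-based optical-flow constraint: the temporal change of image density at a pixel is explained by the spatial density gradient moving with velocity (Vx, Vy). The sketch below solves that constraint by least squares over a small neighborhood (a Lucas-Kanade style solver); the patent text does not commit to this particular solver, so it should be read only as one plausible realization.

```python
# Sketch of a gradient-based optical-flow estimate for the motion-vector step.
# frame0 and frame1 are consecutive grayscale frames as 2-D NumPy arrays;
# (row, col) is the contour point of interest; `win` is the neighborhood radius.
import numpy as np

def flow_at_point(frame0, frame1, row, col, win=3):
    """Estimate (Vx, Vy) at one pixel by least squares over a small window,
    using spatial gradients Ix, Iy and the temporal difference It."""
    f0 = frame0.astype(np.float64)
    f1 = frame1.astype(np.float64)
    iy, ix = np.gradient(f0)            # spatial gradients (row dir., column dir.)
    it = f1 - f0                        # temporal change
    r0, r1 = row - win, row + win + 1
    c0, c1 = col - win, col + win + 1
    a = np.stack([ix[r0:r1, c0:c1].ravel(), iy[r0:r1, c0:c1].ravel()], axis=1)
    b = -it[r0:r1, c0:c1].ravel()
    (vx, vy), *_ = np.linalg.lstsq(a, b, rcond=None)
    return vx, vy                       # vx: motion along columns, vy: along rows

if __name__ == "__main__":
    base = np.tile(np.arange(32, dtype=np.float64), (32, 1))   # horizontal ramp
    shifted = np.roll(base, 1, axis=1)                          # moved 1 pixel right
    print(flow_at_point(base, shifted, 16, 16))                 # roughly (1.0, 0.0)
```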
  • step S 705 the CPU 20 stores the Xcamera-direction components of the calculated motion vectors, such as Vx, in the main memory 21 .
  • the CPU 20 stores those components obtained from each frame image data in succession.
  • the successively stored data is referred to as trace data of the motion vectors.
  • step S 706 the CPU 20 determines whether the extracted object has made an attempt to input coordinates on the display panel 73 based on the trace data. As a determining method, the method illustrated in FIG. 12 may be used. When the object has made an attempt to input coordinates, i.e., YES in step S 707 , the process proceeds to step S 708 , and when the object has not made an attempt to input coordinates, i.e., No in step S 707 , the process branches to step S 710 .
  • step S 708 referring to FIG. 4 , the second image processing circuit 91 or the CPU 20 measures a distance h between the optical axis crossing point Q and a projected point P of a contacting point A(x, y) of the object according to the image data received from the first linear sensor camera 70 .
  • the third image processing circuit 92 or the CPU 20 measures a distance h between the optical axis crossing point Q and a projected point P of a contacting point A(x, y) of the object according to the image data received from the second linear sensor camera 71 .
  • step S 709 the second image processing circuit 91 or the CPU 20 solves the angle ⁇ 1 by using the equations (1) and (2), with known quantities f and ⁇ , and the measured distance h.
  • the third image processing circuit 92 or the CPU 20 solves the angle ⁇ 2 in a similar manner.
  • step S 711 referring to FIG. 3 , the CPU 20 solves the coordinates x and y of the object on the display panel 73 by using the equations (3) and (4), with known quantities L, and the solved angles ⁇ 1 and ⁇ 2 .
  • step S 710 the CPU 20 determines whether the object is within the predetermined region above the display panel 73 using the trace data of motion vector components Vx of the object. In other words, the CPU 20 determines whether the minimum value Nmin among plural distances is still smaller than the predetermined number M 0 .
  • the process returns to step S 704 to obtain motion vectors again.
  • the process returns to step S 701 .
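Equations (1) through (4) referenced in steps S709 and S711 are not reproduced in this passage, and the angle symbols do not render here. The sketch below therefore assumes the usual geometry for two cameras mounted at the top corners of the panel a distance L apart, each with its optical axis at roughly 45 degrees to the top edge: the measured offset h on each sensor gives a ray angle through the focal length f, and the two rays are intersected to give (x, y). It illustrates the kind of computation performed, not the patent's exact equations.

```python
# Hedged sketch of two-camera triangulation for the coordinate input point.
# Assumed geometry: cameras at the two top corners of the display, separated by L,
# optical axes at 45 degrees to the top edge, pinhole model with focal length f.
import math

def ray_angle(h, f, mount_angle_deg=45.0):
    """Angle of the sight ray measured from the top edge of the panel.
    h is the signed offset of the projected point P from the optical-axis
    crossing point Q on the sensor; f is the focal length (same units)."""
    return math.radians(mount_angle_deg) + math.atan2(h, f)

def intersect(beta1, beta2, L):
    """Intersect the rays from the left camera (angle beta1) and the right
    camera (angle beta2) to get panel coordinates (x, y); the origin is the
    left camera, x runs along the top edge, y runs down the panel."""
    t1, t2 = math.tan(beta1), math.tan(beta2)
    x = L * t2 / (t1 + t2)
    y = x * t1
    return x, y

if __name__ == "__main__":
    L, f = 1219.2, 4.0            # 48-inch top edge in mm and a 4 mm lens (assumed values)
    h1, h2 = 0.3, -0.2            # measured sensor offsets for the two cameras (assumed)
    b1, b2 = ray_angle(h1, f), ray_angle(h2, f)
    print(intersect(b1, b2, L))
```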
  • FIG. 21 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation in the coordinate data input system 60 S of FIG. 17 as another example configured according to the present invention.
  • step S 801 the first image processing circuit 90 or the CPU 20 extracts contours of an object as a coordinate input member from the frame image data received from the area sensor camera 72 .
  • step S 802 the first image processing circuit 90 or the CPU 20 first extracts features of the shape of the extracted contours of the object. For extracting features of the shape, the first image processing circuit 90 or the CPU 20 determines the position of the barycenter of the contours of the object, then measures distances from the barycenter to plural points on the extracted contours for all radial directions like the spokes of a wheel. After that, the CPU 20 characterizes the contour shape of the object based on relations between each direction and the respective distance.
  • the first image processing circuit 90 or the CPU 20 then compares the characterized contour shape of the object with cataloged shapes of potential coordinate input members.
  • the shapes of potential coordinate input members may be stored in the ROM 24 or the hard disk 27 in advance.
  • the axis of the coordinate input member may tilt in any direction with various tilting angles. Therefore, the first image processing circuit 90 or the CPU 20 may compare the contour shape of the object after being rotated at various angles with the cataloged shapes.
  • the shapes of potential coordinate input members may be rotated at plural angles in advance, and the rotated shapes are stored in the ROM 24 or the hard disk 27 .
  • in that case, a real-time rotating operation of the contour shape is not needed, and consequently execution time is saved.
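The barycenter-and-radial-distance characterization and the comparison against pre-rotated cataloged shapes described above can be sketched as follows; the number of sampled directions, the tolerance, and the catalog format are illustrative assumptions.

```python
# Sketch of the contour-shape characterization and catalog matching (step S802).
# A contour is a list of (x, y) points; its signature is the list of distances
# from the barycenter sampled over evenly spaced radial directions.
import math

N_DIRECTIONS = 36          # sample every 10 degrees (assumed)

def radial_signature(contour):
    cx = sum(p[0] for p in contour) / len(contour)      # barycenter
    cy = sum(p[1] for p in contour) / len(contour)
    buckets = [0.0] * N_DIRECTIONS
    for (x, y) in contour:
        ang = math.atan2(y - cy, x - cx) % (2 * math.pi)
        k = int(ang / (2 * math.pi) * N_DIRECTIONS) % N_DIRECTIONS
        buckets[k] = max(buckets[k], math.hypot(x - cx, y - cy))
    return buckets

def matches_catalog(contour, catalog, tolerance=3.0):
    """Compare the signature against cataloged signatures; rotating a shape by
    one direction step corresponds to cyclically shifting its signature, so the
    pre-rotated catalog entries reduce to cyclic shifts here."""
    sig = radial_signature(contour)
    for name, ref in catalog.items():
        for shift in range(N_DIRECTIONS):                # try all rotations
            rotated = ref[shift:] + ref[:shift]
            if max(abs(a - b) for a, b in zip(sig, rotated)) < tolerance:
                return name
    return None
```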
  • step S 803 the first image processing circuit 90 or the CPU 20 determines whether the contour shape of the object coincides with one of the cataloged shapes of potential coordinate input members.
  • the process proceeds to step S 804 , and when the identified contour shape does not coincide with any of the cataloged shapes, i.e., NO in step S 803 , the process returns to step S 801 .
  • step S 804 the first image processing circuit 90 or the CPU 20 measures plural distances between points on the contours of the extracted object and points on the projected line of the surface of the display panel 73 .
  • the first image processing circuit 90 or the CPU 20 counts pixels included between a point on the contours of the extracted object and a point on the projected line of the surface of the display panel 73 for each of the measured distances.
  • step S 805 the first image processing circuit 90 or the CPU 20 extracts the least number of pixels, i.e., Nmin, among the plural numbers of pixels counted in step S 804 , and determines whether the minimum value Nmin is smaller than a predetermined number M 0 .
  • the minimum value Nmin is smaller than the predetermined number M 0 , i.e., YES in step S 805
  • the process proceeds to step S 806
  • the minimum value Nmin is not smaller than the predetermined number M 0 , i.e., NO in step S 805 , the process returns to step S 801 .
  • step S 806 the first image processing circuit 90 or the CPU 20 calculates motion vectors regarding predetermined plural points on the extracted contours of the object including the nearest point to the display panel 73 by using the identical frame image data used for extracting the contours and the next following frame image data received from the area sensor camera 72 .
  • the first image processing circuit 90 or the CPU 20 first obtains optical flows, i.e., velocity vectors by calculating a rate of temporal change of a pixel image density and a rate of spatial change of image density of pixels surrounding the pixel used for calculating the temporal change.
  • the motion vectors are expressed with the coordinate system Xcamera, Ycamera.
  • step S 807 the CPU 20 stores motion vector components along the direction Xcamera of the calculated vectors, such as Vx, in the main memory 21 .
  • the CPU 20 stores those components obtained from each frame image data in succession as trace data of the motion vectors.
  • step S 808 the CPU 20 determines whether the extracted object has made an attempt to input coordinates on the display panel 73 based on the trace data.
  • the method of FIG. 12 may be used.
  • the process branches to step S 811 , and when the object has not made any attempt to input coordinates, i.e., No in step S 809 , the process proceeds to step S 810 .
  • step S 810 the CPU 20 determines whether the object is within a predetermined region above the display panel 73 using the trace data of motion vector components Vx of the object.
  • the process returns to step S 806 to obtain motion vectors again, and when the object is out of the predetermined region, i.e., NO in step S 810 , the process returns to step S 801 .
  • step S 811 the second image processing circuit 91 or the CPU 20 measures a distance h between the optical axis crossing point Q and a projected point P of a contacting point A(x, y) of the object according to the image data received from the first linear sensor camera 70 .
  • the third image processing circuit 92 or the CPU 20 measures a distance h between the optical axis crossing point Q and a projected point P of a contacting point A(x, y) of the object according to the image data received from the second linear sensor camera 71 .
  • step S 812 the second image processing circuit 91 or the CPU 20 solves the angle ⁇ 1 by using the equations (1) and (2), with known quantities f and ⁇ , and the measured distance h.
  • the third image processing circuit 92 or the CPU 20 solves the angle ⁇ 2 in a similar manner.
  • step S 813 referring to FIG. 3 , the CPU 20 solves the coordinates x and y of the object on the display panel 73 by using the equations (3) and (4), with known quantities L, and the solved angles ⁇ 1 and ⁇ 2 .
  • FIG. 22 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation in the coordinate data input system 60 S of FIG. 17 as another example configured according to the present invention.
  • in this example, the first linear sensor camera 70 and the second linear sensor camera 71 output image data only while a coordinate input member is present near the display panel 73 , to reduce the load on the other devices in the coordinate data input system 60 S.
  • step S 901 the first image processing circuit 90 or the CPU 20 determines whether a coordinate input member has entered a predetermined region above the display panel 73 for tracing motion vectors thereof.
  • a coordinate input member has entered the predetermined region, i.e., YES in step S 901
  • the process proceeds to step S 902 , and when a coordinate input member has not entered yet, i.e., NO in step S 901 , the process stays at step S 901 .
  • step S 902 the second image processing circuit 91 sends a command to the first linear sensor camera 70 to start imaging operation.
  • the third image processing circuit 92 sends a command to the second linear sensor camera 71 to start an imaging operation.
  • Those commands are transmitted via digital interfaces.
  • the first linear sensor camera 70 starts an imaging operation and sends the taken image data to the second image processing circuit 91 .
  • the second linear sensor camera 71 also starts an imaging operation and sends the taken image data to the third image processing circuit 92 .
  • step S 903 the second image processing circuit 91 and the third image processing circuit 92 trace the coordinate input member and input coordinates of the coordinate input member on the display panel 73 , respectively.
  • step S 904 the first image processing circuit 90 or the CPU 20 determines whether the coordinate input member is out of the predetermined region for tracing motion vectors thereof.
  • the process proceeds to step S 905 , and when the coordinate input member is still in the predetermined region, i.e., NO in step S 904 , the process returns to step S 903 .
  • step S 905 the second image processing circuit 91 sends a command to the first linear sensor camera 70 to halt the imaging operation.
  • the third image processing circuit 92 sends a command to the second linear sensor camera 71 to halt the imaging operation. According to the commands, the first linear sensor camera 70 and the second linear sensor camera 71 halt the imaging operation, respectively.
  • the predetermined region above the display panel 73 is commonly used for both starting imaging operations and tracing motion vectors.
  • a predetermined region for starting imaging operations by the first linear sensor camera 70 and the second linear sensor camera 71 may be greater than a predetermined region for tracing motion vectors of a coordinate input member.
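The behavior of FIG. 22 is essentially a two-state loop: the area sensor camera watches for an object entering the tracing region, the two linear sensor cameras are started while it remains there, and they are halted when it leaves. A minimal sketch with assumed camera and helper interfaces:

```python
# Sketch of the FIG. 22 gating of the two linear sensor cameras (steps S901-S905).
# `in_tracing_region()` would be derived from the area sensor camera's image,
# and `trace_and_input_coordinates()` from the two linear sensor cameras.

def run_linear_camera_gating(in_tracing_region, linear_cam_1, linear_cam_2,
                             trace_and_input_coordinates):
    while True:
        if not in_tracing_region():            # step S901: wait for an object
            continue
        linear_cam_1.start_imaging()           # step S902
        linear_cam_2.start_imaging()
        while in_tracing_region():             # steps S903-S904
            trace_and_input_coordinates()
        linear_cam_1.halt_imaging()            # step S905
        linear_cam_2.halt_imaging()
```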
  • FIG. 23 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation in the coordinate data input system 1 S of FIG. 1 , as another example configured according to the present invention.
  • in this example, when a coordinate input member is within a first predetermined region above a display device, the location of the coordinate input member is input as coordinates.
  • the input coordinates are used, for example, to move a cursor, draw a line, etc.
  • when the coordinate input member is within a second predetermined region, farther from the display device, the coordinate input member is used, for example, to move a cursor, input a gesture command, etc.
  • step S 1001 the first image processing circuit 30 or the CPU 20 extracts contours of an object as a coordinate input member from the frame image data received from the first electronic camera 10 .
  • step S 1002 the first image processing circuit 30 or the CPU 20 measures plural distances between points on the contours of the extracted object and points on the projected line of the surface of the display panel 12 .
  • the first image processing circuit 30 or the CPU 20 counts pixels included between a point on the contours of the extracted object and a point on the projected line of the surface of the display panel 12 for each of the measuring distances. The number of pixels between two points determines the distance between the two points.
  • step S 1003 the first image processing circuit 30 or the CPU 20 extracts the least number of pixels, i.e., Nmin, among the plural numbers of pixels counted in step S 1002 . Then, the first image processing circuit 30 or the CPU 20 determines whether the minimum value Nmin is larger than a first predetermined number M 1 and equal to or smaller than a second predetermined number M 2 .
  • FIG. 24 is a diagram illustrating an image captured by the first electronic camera 10 in the coordinate data input system 1 S of FIG. 1 .
  • the rectangular region enclosed with a line corresponding to the first predetermined number M 1 , the projected line of surface of the display panel 12 , and the normals thereof is denoted by REGION 1 .
  • the rectangular region enclosed with the line corresponding to the first predetermined number M 1 , a line corresponding to the second predetermined number M 2 , and the normals of the projected line of the surface of the display panel 12 is denoted by REGION 2 .
  • the REGION 1 is assigned for tracing motion vectors of the coordinate input member, and the REGION 2 is assigned for moving a cursor, inputting a gesture command, etc.
  • a pen as a coordinate input member is illustrated in the REGION 2 in FIG. 24 .
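The two-region test of FIG. 24 reduces to comparing the minimum pixel count Nmin against the thresholds M1 and M2, as in steps S1003 and S1008 described below. A small sketch of that dispatch, with assumed threshold values:

```python
# Sketch of the REGION 1 / REGION 2 classification of FIG. 23 / FIG. 24.
# Nmin is the smallest pixel count between the object contour and the
# projected line of the display surface; M1 < M2.

M1 = 40     # boundary of REGION 1 in pixels (assumed value)
M2 = 160    # boundary of REGION 2 in pixels (assumed value)

def classify_region(nmin):
    if nmin <= M1:
        return "REGION1"        # trace motion vectors, input coordinates (step S1008 YES)
    if nmin <= M2:
        return "REGION2"        # move a cursor, accept gesture commands (step S1003 YES)
    return "OUTSIDE"            # ignore; return to step S1001

if __name__ == "__main__":
    for n in (10, 90, 300):
        print(n, classify_region(n))
```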
  • step S 1003 the first image processing circuit 30 determines whether the coordinate input member is in the REGION 2 .
  • the process proceeds to step S 1004 , and when the result is false, i.e., NO in step S 1003 , the process branches to step S 1008 .
  • step S 1004 the first image processing circuit 30 measures a distance h between the optical axis crossing point Q and a projected point P of a contacting point A(x, y) of the object according to the image data received from the first electronic camera 10 .
  • the second image processing circuit 31 measures a distance h between the optical axis crossing point Q and a projected point P of a contacting point A(x, y) of the object according to the image data received from the second electronic camera 11 .
  • step S 1005 the first image processing circuit 30 solves angle ⁇ 1 by using the equations (1) and (2), with known quantities f and ⁇ , and the measured distance h.
  • the second image processing circuit 31 solves angle ⁇ 2 in a similar manner.
  • step S 1006 referring to FIG. 3 , the CPU 20 solves the coordinates x and y of the object on the display panel 12 by using the equations (3) and (4), with known quantities L, and the solved angles ⁇ 1 and ⁇ 2 .
  • step S 1007 the CPU 20 generates display data of a cursor at a location according to the obtained coordinates x and y of the object, and sends the generated display data to the display controller 29 .
  • the CPU 20 may also send a cursor command to display a cursor at the location.
  • the display controller 29 can display a cursor at the location where the coordinate input member exists on the display panel 12 .
  • the process returns to step S 1001 .
  • the displayed cursor follows the coordinate input member.
  • step S 1008 the first image processing circuit 30 determines whether the minimum value Nmin is equal to or smaller than the first predetermined number M 1 . That is to say, the first image processing circuit 30 determines whether the coordinate input member is in the REGION 1 .
  • the process proceeds to step S 1009 , and when the result is false, i.e., NO in step S 1008 , the process returns to step S 1001 .
  • step S 1009 the first image processing circuit 30 calculates motion vectors regarding predetermined plural points on the extracted contours of the object including the nearest point to the display panel 12 by using the identical frame image data used for extracting the contours and the next following frame image data received from the first electronic camera 10 . After that, the CPU 20 determines whether the extracted object has made an attempt to input coordinates on the display panel 12 based on the trace data of the calculated motion vectors.
  • the first image processing circuit 30 measures a distance h between the optical axis crossing point Q and a projected point P of a contacting point A(x, y) of the object according to the image data received from the first electronic camera 10 .
  • the second image processing circuit 31 measures a distance h between the optical axis crossing point Q and a projected point P of a contacting point A(x, y) of the object according to the image data received from the second electronic camera 11 .
  • the first image processing circuit 30 solves angle ⁇ 1 by using the equations (1) and (2), with known quantities f and ⁇ , and the measured distance h.
  • the second image processing circuit 31 solves angle ⁇ 2 in a similar manner.
  • the CPU 20 solves the coordinates x and y of the object on the display panel 12 by using the equations (3) and (4), with known quantities L, and the solved angles ⁇ 1 and ⁇ 2 .
  • the CPU 20 solves the coordinates x and y of the object on the display panel 12 for every frame image input.
  • the CPU 20 may also solve coordinates x and y for every plural frames of images.
  • the obtained coordinates x and y on the display panel 12 in the REGION 2 are used for moving a cursor.
  • the obtained coordinates x and y may also be used for another use, such as inputting a gesture command.
  • the CPU 20 may store plural sets of coordinate data, i.e., trace data of coordinate data including time stamps thereof. Then, the CPU 20 analyzes the trace data of coordinate data, and tests whether the trace data coincides with one of a plurality of defined command loci, which may be stored in the hard disk 27 in advance.
  • Japanese Laid-Open Patent Publication No. 5-197810 describes a matching method.
  • the method first obtains a set of a temporal combination and a spatial combination of motion vectors extracted from input images.
  • the method verifies the obtained set of temporal combination and spatial combination with patterns in a command pattern dictionary provided in advance.
  • the method identifies the input command as a specific one in the command pattern dictionary.
  • as an example of gesture commands, when an operator strokes a pen downward within a predetermined range of velocity in the REGION 2 above the display panel 12 , the CPU 20 may recognize the stroke as a scroll command.
  • when the CPU 20 recognizes the stroke as a scroll command, the CPU 20 scrolls the image displayed on the display panel 12 downward by a predetermined length, for example, the same length as the input stroke.
  • a gesture command or coordinate data may be distinguished according to a figure of the coordinate input member. For example, when a human hand or finger draws a figure on the display panel 12 , the coordinate data input system 1 S may recognize the motion as a gesture command, and when a symmetrical object, such as a pen, draws, the system 1 S may input coordinates of the symmetrical object.
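Recognizing the downward-stroke scroll gesture described above can be sketched as a test on the time-stamped trace data: the stroke must move predominantly downward and its average speed must fall within a predetermined range. The dominance ratio and the velocity bounds in the sketch are assumptions.

```python
# Sketch of recognizing the downward-stroke scroll gesture from trace data.
# `trace` is a list of (t, x, y) samples of the coordinate input member in REGION 2;
# y grows downward on the display panel.

V_MIN, V_MAX = 100.0, 1000.0     # allowed average speed range in pixels/s (assumed)

def is_scroll_stroke(trace):
    if len(trace) < 2:
        return False
    (t0, x0, y0), (t1, x1, y1) = trace[0], trace[-1]
    dt = t1 - t0
    if dt <= 0:
        return False
    dx, dy = x1 - x0, y1 - y0
    if dy <= 0 or abs(dx) > 0.3 * dy:          # must move mostly downward
        return False
    speed = dy / dt
    return V_MIN <= speed <= V_MAX             # within the predetermined velocity range

if __name__ == "__main__":
    stroke = [(0.0, 500, 100), (0.1, 502, 140), (0.2, 503, 185)]
    print(is_scroll_stroke(stroke))            # True: downward stroke at about 425 px/s
```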
  • FIG. 25 is an exemplary network system 200 including the coordinate data input systems of FIG. 1 and FIG. 17 .
  • the network system 200 includes a public switched telephone network (PSTN) 210 and a local area network 220 .
  • Three coordinate data input systems 1 SA, 1 SB and 60 SB are connected to the LAN 220 via the LAN interface 33 of FIG. 2 and FIG. 18 .
  • a server 230 is also connected to the LAN 220 .
  • a coordinate data input system 1 SC is connected to the PSTN 210 via the LAN interface 33 and a PSTN adaptor.
  • the coordinate data input systems 1 SA, 1 SB and 1 SC are substantially the same as the coordinate data input system of FIG. 1
  • the coordinate data input system 60 SB is substantially the same as the coordinate data input system of FIG. 17 .
  • each of the coordinate data input systems 1 SA, 1 SB, 1 SC and 60 SB transmits detected coordinate data of a coordinate input member and related information, such as a gesture command, accompanying control signals according to a transmission control protocol to the other coordinate data input systems via the PSTN 210 and the LAN 220 .
  • each of the coordinate data input systems 1 SA, 1 SB, 1 SC and 60 SB displays images on the display panel 12 of FIG. 1 or 73 of FIG. 17 according to the detected coordinate data and the related information sent from the other coordinate data input systems via the PSTN 210 and the LAN 220 in addition to according to coordinate data detected by itself.
  • all the coordinate data input systems 1 SA, 1 SB, 1 SC and 60 SB can share identical information and display an identical image on the display panel 12 or 73 .
  • people in different places can input information including coordinate data to a coordinate data input system implemented in each of the different places, and watch substantially the same image on the each display panel.
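The sharing described above, in which each system transmits its detected coordinate data and related information to its peers and displays what it receives, can be sketched with ordinary TCP sockets. The message format, port number, and peer list below are assumptions; the description only states that a transmission control protocol is used.

```python
# Sketch of sharing coordinate data between coordinate data input systems over TCP.
# The JSON message format, port number, and peer addresses are assumptions.
import json
import socket

PORT = 5000                                     # assumed port

def send_coordinates(peers, x, y, info=None):
    """Send one detected coordinate (and optional related information, such as
    a gesture command) to every peer system."""
    msg = (json.dumps({"x": x, "y": y, "info": info}) + "\n").encode()
    for host in peers:
        with socket.create_connection((host, PORT), timeout=2.0) as s:
            s.sendall(msg)

def serve_once(display_callback, bind_host=""):
    """Accept one connection and hand each received message to the display side."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((bind_host, PORT))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn, conn.makefile() as f:
            for line in f:
                display_callback(json.loads(line))   # draw at the received (x, y)
```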
  • the server 230 stores programs to be executed by the CPU 20 of FIG. 2 and FIG. 18 , the first image processing circuit 30 of FIG. 2 , the second image processing circuit 31 of FIG. 2 , the first image processing circuit 90 of FIG. 18 , the second image processing circuit 91 of FIG. 18 , the third image processing circuit 92 of FIG. 18 , etc.
  • computer readable medium means a non-transitory hardware implementation, such as the afore-mentioned hard disk 27, CD-ROM drive 37, ROM 24, main memory 21, etc.
  • the novel method and apparatus according to the present invention can input information including coordinate data without using a light scanning device even when the surface of a display screen is contorted to a certain extent.
  • novel method and apparatus can input information including coordinate data using a plurality of coordinate input members, such as a pen, a human finger, a stick, etc.
  • novel method and apparatus can input information including coordinate data with a plurality of background devices, such as a chalkboard, a whiteboard, etc., in addition to a display device, such as a plasma display panel, a rear projection display.

Abstract

A method, computer readable medium, and apparatus for inputting information, including coordinate data, includes: extracting a predetermined object from an image including the predetermined object above a plane; detecting motion of the predetermined object while the predetermined object is within a predetermined distance from the plane; and then determining whether to input predetermined information.

Description

CROSS-REFERENCE TO RELATED APPLICATION
This document is a continuation of U.S. application Ser. No. 09/698,031 filed on Oct. 30, 2000, and is based on Japanese patent application No. 11-309412 filed in the Japanese Patent Office on Oct. 29, 1999, the entire contents of each of which are incorporated herein by reference.
BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to a method and apparatus for inputting information including coordinate data. More particularly, the present invention relates to a method and apparatus for inputting information including coordinate data of a location of a coordinate input member, such as a pen, a human finger, etc., on an image displayed on a relatively large screen.
2. Discussion of the Background
Lately, presentation systems, electronic copy boards, or electronic blackboard systems provided with a relatively large screen display device, such as a plasma display panel, a rear projection display, etc., are coming into wide use. Certain types of presentation systems also provide a touch input device disposed in front of a screen for inputting information related to the image displayed on the screen. Such a touch input device is also referred to as an electronic tablet, an electronic pen, etc.
As to such a presentation system, for example, when a user of the system touches an icon on a display screen, a touch input device detects and inputs the touching motion and the coordinates of the touched location. Similarly, when the user draws a line, the touch input device repetitively detects and inputs a plurality of coordinates as a locus of the drawn line.
As an example, Japanese Laid-Open Patent Publication No. 11-85376 describes a touch input apparatus provided with light reflecting devices disposed around a display screen, light beam scanning devices, and light detectors. The light reflecting device has a characteristic to reflect incident light toward a direction close to the incident light. During an operation of the apparatus, scanning light beams emitted by the light beam scanning devices are reflected by the light reflecting devices, and then received by the light detectors. When a coordinate input member, such as a pen, a user's finger, etc., touches the surface of the screen at a location, the coordinate input member interrupts the path of the scanning light beams, and thereby the light detector is able to detect the touched location as a missing of the scanning light beams at the touched location.
In this apparatus, when a certain location-detecting accuracy in a direction perpendicular to the screen is required, the scanning light beams are desired to be thin and to scan on a plane close enough to the screen. Meanwhile, when the surface of the screen is contorted, the contorted surface may interfere with the transmission of the scanning light beams, and consequently a coordinate input operation might be impaired. As a result, for example, a double-click operation might not be properly detected, free hand drawing lines and characters might be erroneously detected, and so forth.
As another example, Japanese Laid-Open Patent Publication No. 61-196317 describes a touch input apparatus provided with a plurality of television cameras. In the apparatus, the plurality of television cameras detect three-dimensional coordinates of a moving object, such as a pen, as a coordinate input member. Because the apparatus detects three-dimensional coordinates, the plurality of television cameras desirably capture images of the moving object at a relatively high frame rate.
As a further example, a touch input apparatus provided with an electromagnetic tablet and an electromagnetic stylus is known. In this apparatus, a location of the stylus is detected based on electromagnetic induction between the tablet and the stylus. Therefore, the distance between the tablet and the stylus tends to be limited to a rather short distance, for example, eight millimeters; otherwise a large stylus or a battery-powered stylus is used.
SUMMARY OF THE INVENTION
The present invention has been made in view of the above-discussed and other problems and to overcome the above-discussed and other problems associated with the background methods and apparatus. Accordingly, an object of the present invention is to provide a novel method and apparatus that can input information including coordinate data even when the surface of a display screen is contorted to a certain extent and without using a light scanning device.
Another object of the present invention is to provide a novel method and apparatus that can input information including coordinate data using a plurality of coordinate input members, such as a pen, a human finger, a stick, a rod, a chalk, etc.
Another object of the present invention is to provide a novel method and apparatus that can input information including coordinate data with a plurality of background devices, such as a chalkboard, a whiteboard, etc., in addition to a display device, such as a plasma display panel, a rear projection display.
To achieve these and other objects, the present invention provides a method, computer readable medium and apparatus for inputting information including coordinate data that include extracting a predetermined object from an image including the predetermined object above a plane, detecting a motion of the predetermined object while the predetermined object is in a predetermined distance from the plane, and determining to input predetermined information.
BRIEF DESCRIPTION OF THE DRAWINGS
A more complete appreciation of the present invention and many of the attendant advantages thereof will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings, wherein:
FIG. 1 is a schematic view illustrating a coordinate data input system as an example configured according to the present invention;
FIG. 2 is an exemplary block diagram of a control apparatus of the coordinate data input system of FIG. 1;
FIG. 3 is a diagram illustrating a method for obtaining coordinates where a coordinate input member contacts a display panel;
FIG. 4 is a magnified view of the wide-angle lens and the CMOS image sensor of FIG. 3;
FIG. 5 is a diagram illustrating a tilt of the surface of the display panel to the CMOS image sensor;
FIG. 6 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation in the coordinate data input system of FIG. 1 as an example configured according to the present invention;
FIG. 7 is a diagram illustrating an image captured by the first electronic camera of FIG. 1;
FIG. 8 is a diagram illustrating an image captured by the first electronic camera when an input pen distorts the surface of a display panel;
FIG. 9 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation as another example configured according to the present invention;
FIG. 10 is a diagram illustrating an image captured by the first electronic camera when an input pen tilts to the surface of a display panel;
FIG. 11 is a diagram illustrating an image having an axial symmetry pen captured by the first electronic camera;
FIG. 12 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation as another example configured according to the present invention;
FIG. 13 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation as another example configured according to the present invention;
FIG. 14 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation as another example configured according to the present invention;
FIG. 15 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation as another example configured according to the present invention;
FIG. 16A is a diagram illustrating an image captured by the first electronic camera and an output limitation of the image;
FIG. 16B is a diagram illustrating an image captured by the first electronic camera and a displaced output limitation of the image;
FIG. 17 is a schematic view illustrating a coordinate data input system as another example configured according to the present invention;
FIG. 18 is an exemplary block diagram of a control apparatus of the coordinate data input system of FIG. 17 configured according to the present invention;
FIG. 19 is a diagram illustrating an analog signal waveform output from a linear sensor camera;
FIG. 20 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation in the coordinate data input system of FIG. 17 as an example configured according to the present invention;
FIG. 21 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation in the coordinate data input system of FIG. 17 as another example configured according to the present invention;
FIG. 22 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation in the coordinate data input system of FIG. 17 as another example configured according to the present invention;
FIG. 23 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation in the coordinate data input system of FIG. 1 as another example configured according to the present invention;
FIG. 24 is a diagram illustrating an image captured by the first electronic camera in the coordinate data input system of FIG. 1; and
FIG. 25 is an exemplary network system including the coordinate data input systems of FIG. 1 and FIG. 17.
DESCRIPTION OF THE PREFERRED EMBODIMENTS
Referring now to the drawings, wherein like reference numerals designate identical or corresponding parts throughout the several views, and more particularly to FIG. 1 thereof, FIG. 1 is a schematic view illustrating a coordinate data input system 1S as an example configured according to the present invention. The coordinate data input system 1S includes a coordinate data input apparatus 1 and a control apparatus 2. The coordinate data input apparatus 1 includes a first electronic camera 10, a second electronic camera 11, and a display panel 12.
The display panel 12 displays an image with, for example, a 48 by 36 inch screen (diagonally 60 inches) and 1024 by 768-pixel resolution, which is referred to as an XGA screen. For example, a plasma display panel, a rear projection display, etc., may be used as the display panel 12. Each of the first electronic camera 10 and the second electronic camera 11 implements a two-dimensional imaging device with a resolution that enables operations such as selecting an item in a menu window, drawing free-hand lines and letters, etc. A two-dimensional imaging device is also referred to as an area sensor.
The two-dimensional imaging device preferably has a variable output frame rate capability. The two-dimensional imaging device also preferably has a random access capability that allows any imaging cell therein to be randomly accessed to obtain an image signal from the cell. Such a random access capability is sometimes also referred to as random addressability. As an example of such a random access two-dimensional imaging device, a complementary metal oxide semiconductor (CMOS) sensor may be utilized.
The electronic camera 10 also includes a wide-angle lens 50, which covers an angle of around 90 degrees or wider, and an analog to digital converter. Likewise, the electronic camera 11 also includes a wide-angle lens 52, which covers an angle of around 90 degrees or wider, and an analog to digital converter. The first electronic camera 10 is disposed at an upper corner of the display panel 12 such that an optical axis of the wide-angle lens 50 forms an angle of approximately 45 degrees with the horizontal edge of the display panel 12. The second electronic camera 11 is disposed at the other upper corner of the display panel 12 such that the optical axis of the wide-angle lens 52 forms an angle of approximately 45 degrees with the horizontal edge of the display panel 12.
Further, the optical axis of each of the electronic cameras 10 and 11 is disposed approximately parallel to a display screen surface of the display panel 12. Thus, each of the electronic cameras 10 and 11 can capture the whole display screen surface of the display panel 12. Each of the captured images is converted into digital data, and the digital image data is then transmitted to the control apparatus 2.
FIG. 2 is an exemplary block diagram of the control apparatus 2 of the coordinate data input system 1S of FIG. 1. Referring to FIG. 2, the control apparatus 2 includes a central processing unit (CPU) 20, a main memory 21, a clock generator 22, a bus controller 23, a read only memory (ROM) 24, a peripheral component interconnect (PCI) bridge 25, a cache memory 26, a hard disk 27, a hard disk controller (HD controller) 28, a display controller 29, a first image processing circuit 30, and a second image processing circuit 31.
The control apparatus 2 also includes a local area network controller (LAN controller) 32, a LAN interface 33, a floppy disk controller (FD controller) 34, an FD drive 35, a compact disc read only memory controller (CD-ROM controller) 36, a CD-ROM drive 37, a keyboard controller 38, a mouse interface 39, a real time clock generator (RTC generator) 40, a CPU bus 41, a PCI bus 42, an internal X bus 43, a keyboard 44, and a mouse 45.
The CPU 20 executes a boot program, a basic input and output control system (BIOS) program stored in the ROM 24, an operating system (OS), application programs, etc. The main memory 21 may be structured by, e.g., a dynamic random access memory (DRAM), and is utilized as a work memory for the CPU 20. The clock generator 22 may be structured by, for example, a crystal oscillator and a frequency divider, and supplies a generated clock signal to the CPU 20, the bus controller 23, etc., to operate those devices at the clock speed.
The bus controller 23 controls data transmission between the CPU bus 41 and the internal X bus 43. The ROM 24 stores a boot program, which is executed immediately after the coordinate data input system 1S is turned on, device control programs for controlling the devices included in the system 1S, etc. The PCI bridge 25 is disposed between the CPU bus 41 and the PCI bus 42 and transmits data between the PCI bus 42 and devices connected to the CPU bus 41, such as the CPU 20, through the use of the cache memory 26. The cache memory 26 may be configured by, for example, a DRAM.
The hard disk 27 stores system software such as an operating system, a plurality of application programs, and various data for multiple users of the coordinate data input system 1S. The hard disk (HD) controller 28 implements a standard interface, such as an integrated device electronics (IDE) interface, and transmits data between the PCI bus 42 and the hard disk 27 at a relatively high data transmission rate.
The display controller 29 converts digital letter/character data and graphic data into an analog video signal, and controls the display panel 12 of the coordinate data input apparatus 1 so as to display an image of the letters/characters and graphics thereon according to the analog video signal.
The first image processing circuit 30 receives digital image data output from the first electronic camera 10 through a digital interface, such as an RS-422 interface. The first image processing circuit 30 then executes an object extraction process, an object shape recognition process, a motion vector detection process, etc. Further, the first image processing circuit 30 supplies the first electronic camera 10 with a clock signal and an image transfer pulse via the above-described digital interface.
Similarly, the second image processing circuit 31 receives digital image data output from the second electronic camera 11, also through a digital interface such as an RS-422 interface. The second image processing circuit 31 is configured with substantially the same hardware as the first image processing circuit 30 and operates in substantially the same manner. That is, the second image processing circuit 31 also executes an object extraction process, an object shape recognition process, and a motion vector detection process, and supplies a clock signal and an image transfer pulse to the second electronic camera 11 as well.
In addition, the clock signal and the image transfer pulse supplied to the first electronic camera 10 and those signals supplied to the second electronic camera 11 are maintained in synchronization.
The LAN controller 32 controls communications between the control apparatus 2 and external devices connected to a local area network, such as Ethernet, via the LAN interface 33 according to the protocol of the network. As an example of an interface protocol, the Institute of Electrical and Electronics Engineers (IEEE) 802.3 standard may be used.
The FD controller 34 transmits data between the PCI bus 42 and the FD drive 35. The FD drive 35 reads from and writes to a floppy disk loaded therein. The CD-ROM controller 36 transmits data between the PCI bus 42 and the CD-ROM drive 37. The CD-ROM drive 37 reads a CD-ROM disc loaded therein and sends the read data to the CD-ROM controller 36. The CD-ROM controller 36 and the CD-ROM drive 37 may be connected with an IDE interface.
The keyboard controller 38 converts serial key input signals generated at the keyboard 44 into parallel data. The mouse interface 39 is provided with a mouse port to be connected with the mouse 45 and is controlled by mouse driver software or a mouse control program. In this example, the coordinate data input apparatus 1 functions as a data input device, and therefore the keyboard 44 and the mouse 45 may be omitted from the coordinate data input system 1S in normal operations, except during maintenance operations for the coordinate data input system 1S. The RTC generator 40 generates and supplies calendar data, such as day, hour, minute, etc., and is backed up by a battery.
Now, a method for determining a location where a coordinate input member has touched or come close to the image display surface of the display panel 12 is described. FIG. 3 is a diagram illustrating a method for obtaining coordinates where a coordinate input member contacts or comes close to the display panel 12. Referring to FIG. 3, the first electronic camera 10 includes the wide-angle lens 50 and a CMOS image sensor 51, and the second electronic camera 11 also includes the wide-angle lens 52 and a CMOS image sensor 53.
As stated above, the first and second electronic cameras 10 and 11 are disposed such that the optical axes of the wide-angle lenses 50 and 52, i.e., the optical axes of the lights incident on the cameras, are parallel to the display surface of the display panel 12. Further, the first and second electronic cameras 10 and 11 are disposed such that the angle of view of each of the electronic cameras 10 and 11 covers substantially the whole area where the coordinate input member can come close to and touch the display panel 12.
In FIG. 3, the symbol L denotes a distance between the wide-angle lens 50 and the wide-angle lens 52, and the symbol X-line denotes a line connecting the wide-angle lens 50 and the wide-angle lens 52. The symbol A(x, y) denotes a point A where a coordinate input member comes close to or touches the display panel 12 and the coordinates (x, y) thereof. The point A(x, y) is referred to as a contacting point. Further, the symbol β1 denotes an angle which the X-line forms with a line connecting the wide-angle lens 50 and the contacting point A(x, y), and the symbol β2 denotes an angle which the X-line forms with a line connecting the wide-angle lens 52 and the contacting point A(x, y).
FIG. 4 is a magnified view of the wide-angle lens 50 and the CMOS image sensor 51 of FIG. 3. Referring to FIG. 4, the symbol f denotes a distance between the wide-angle lens 50 and the CMOS image sensor 51. The symbol Q denotes a point at which the optical axis of the wide-angle lens 50 intersects the CMOS image sensor 51. The point Q is referred to as an optical axis crossing point.
The symbol P denotes a point where an image of the contacting point A(x, y) is formed on the CMOS image sensor 51. The point P is referred to as a projected point P of the contacting point A(x, y). The symbol h denotes a distance between the point P and the point Q. The symbol α denotes an angle which the optical axis of the wide-angle lens 50 forms with the X-line, and the symbol θ denotes an angle which the optical axis of the wide-angle lens 50 forms with a line connecting the contacting point A(x, y) and the point P.
Referring to FIG. 3 and FIG. 4, the following equations hold;
θ=arctan(h/f)   (1)
β1=α−θ   (2)
Here, the angle α and the distance f are constant values, because these values are determined by the mutual mounting positions of the wide-angle lens 50 and the CMOS image sensor 51 and by the mounting angle of the wide-angle lens 50 to the X-line at a manufacturing plant. Therefore, when the distance h is given, the angle β1 is solved. Regarding the second electronic camera 11, similar equations hold, and thus the angle β2 is solved.
After the angle β1 and the angle β2 are obtained, the coordinates of the contacting point A(x, y) are calculated by the following equations based on the principle of triangulation;
x=L×tan β2/(tan β1+tan β2)   (3)
y=x×tan β1   (4)
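For illustration only, the computation expressed by equations (1) through (4) may be sketched in Python as follows. The function name, variable names, and numerical values are hypothetical, and the sign convention used for equation (2) follows the reconstruction above; this is a sketch under stated assumptions, not the apparatus's actual implementation.

    import math

    def contact_point(h1, h2, f, alpha, L):
        # h1, h2: distances between the optical axis crossing point Q and the
        #         projected point P on the first and second sensors (same unit as f)
        # f:      distance between each wide-angle lens and its image sensor
        # alpha:  angle between each optical axis and the X-line, in radians
        # L:      distance between the two wide-angle lenses
        theta1 = math.atan(h1 / f)                                     # equation (1)
        theta2 = math.atan(h2 / f)
        beta1 = alpha - theta1                                         # equation (2)
        beta2 = alpha - theta2
        x = L * math.tan(beta2) / (math.tan(beta1) + math.tan(beta2))  # equation (3)
        y = x * math.tan(beta1)                                        # equation (4)
        return x, y

    # Illustrative values: lenses 1.2 m apart, f = 5 mm, optical axes at 45 degrees.
    print(contact_point(h1=0.0008, h2=0.0015, f=0.005, alpha=math.radians(45), L=1.2))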
Next, a relation between the CMOS image sensor 51 and an image of the edges of the display panel 12 formed on the CMOS image sensor 51 is described. Each of the CMOS image sensors 51 and 53 has a two-dimensional array or matrix of imaging picture elements (pixels) or imaging cells. When the number of imaging cells in one direction differs from the number of imaging cells in the other direction, the CMOS image sensors 51 and 53 are disposed such that the side having the larger number of imaging cells is parallel to the surface of the display panel 12.
Regarding the CMOS image sensors 51 and 53, the coordinate axis along the direction having the larger number of imaging cells is represented by the Ycamera axis. The coordinate axis along the direction having the smaller number of imaging cells, i.e., the direction perpendicular to the Ycamera axis, is represented by the Xcamera axis. Thus, images of the edges or margins of the display panel 12 that are formed on the CMOS image sensors 51 and 53 become a line parallel to the Ycamera axis and perpendicular to the Xcamera axis. A projection of the surface of the display panel 12 is formed as substantially the same line on the CMOS image sensors 51 and 53. Accordingly, such a line formed on the CMOS image sensors 51 and 53 is hereinafter referred to as "a formed line of the surface of the display panel 12," "a projected line of the surface of the display panel 12," or simply "the surface of the display panel 12."
FIG. 5 is a diagram illustrating a tilt of the surface of the display panel 12 with respect to the CMOS image sensors 51 and 53. Referring to FIG. 5, when the surface of the display panel 12 is not parallel to the Ycamera axis, i.e., is tilted as illustrated, the tilt angle δ with respect to the Ycamera axis is obtained as follows.
Suppose points A(x1c, y1c), B(x2c, y2c), and C(x3c, y3c) are arbitrary points on the projected line of the surface of the display panel 12. The angle between the Ycamera axis and a line connecting each point to the origin of the coordinate system is given as follows;
δ1=arctan(x1c/y1c)   (5)
δ2=arctan(x2c/y2c)   (6)
δ3=arctan(x3c/y3c)   (7)
After that, the tilted angle δ is obtained as an average value of those angles;
δ=(δ1+δ2+δ3)/3   (8)
When the surface of the display panel 12 is tilted with respect to the Ycamera axis, a tilted coordinate system, which is tilted by the angle δ with respect to the original coordinate system (Xcamera, Ycamera), may also be conveniently utilized to obtain a location of a coordinate input member and a motion vector thereof. The tilted coordinate system corresponds to a rotation of the original coordinate system by the angle δ. When the surface of the display panel 12 tilts clockwise, the tilted coordinate system is obtained by rotating counterclockwise, and vice versa. The relations between the original coordinate system (Xcamera, Ycamera) and the tilted coordinate system, which is denoted by (X1camera, Y1camera), are the following:
X1camera=Xcamera×cos δ+Ycamera×sin δ  (9)
Y1camera=Ycamera×cos δ−Xcamera×sin δ  (10)
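A minimal Python sketch of the tilt estimation and coordinate rotation of equations (5) through (10); the sample points and pixel coordinates are purely illustrative assumptions.

    import math

    def tilt_angle(points_on_panel_line):
        # Equations (5) through (8): average, over the sample points on the projected
        # panel line, the angle between the Ycamera axis and the line to the origin.
        angles = [math.atan(xc / yc) for (xc, yc) in points_on_panel_line]
        return sum(angles) / len(angles)

    def to_tilted(xc, yc, delta):
        # Equations (9) and (10): express a sensor coordinate in the tilted system.
        x1 = xc * math.cos(delta) + yc * math.sin(delta)
        y1 = yc * math.cos(delta) - xc * math.sin(delta)
        return x1, y1

    delta = tilt_angle([(3.0, 100.0), (6.1, 200.0), (9.2, 300.0)])
    print(math.degrees(delta), to_tilted(10.0, 150.0, delta))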
When the surface of the display panel 12 does not tilt with respect to the Ycamera axis, e.g., as a result of an adjusting operation on the electronic cameras 10 and 11 at a production factory or of an installation and maintenance operation at a customer office, those coordinate conversions are not always needed.
FIG. 6 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation in the coordinate data input system 1S of FIG. 1 as an example configured according to the present invention. Incidentally, the CMOS image sensor 51 may not always have an appropriate image aspect ratio, i.e., ratio of the number of imaging cells in one direction to that in the other direction. In such a case, the first electronic camera 10 can limit the output of the image signal captured by the CMOS image sensor 51 to a predetermined range from the surface of the display panel 12 in the direction perpendicular to the surface. In other words, the first electronic camera 10 outputs digital image data in the predetermined range to the first image processing circuit 30 in the control apparatus 2. Likewise, the second electronic camera 11 outputs digital image data in a predetermined range to the second image processing circuit 31.
With reference to FIG. 6, in step S101, the first image processing circuit 30 extracts contours of an object as a coordinate input member from frame image data received from the first electronic camera 10. As an example of extraction methods of contours of an object, the first image processing circuit 30 first determines gradients of image density among the pixels by differentiation, and then extracts contours based on a direction and magnitude of the gradients of image density. Further, the method described in Japanese Patent Publication No. 8-16931 may also be applied for extracting an object as a coordinate input member from frame image data.
In step S102, the first image processing circuit 30 measures plural distances between the object and the projected line of the surface of the display panel 12 on the CMOS image sensor 51. For measuring a distance, the first image processing circuit 30 counts pixels included between a point on the contours of the extracted object and a point on the projected line of the surface of the display panel 12 on the CMOS image sensor 51. An image forming reduction ratio on the CMOS image sensor 51 is fixed and a pixel pitch of the CMOS image sensor 51 (i.e., the interval between imaging cells) is known. As a result, the number of pixels between two points determines a distance between the two points.
For measuring plural distances between the object and the surface of the display panel 12, the first image processing circuit 30 counts pixels as regards plural distances between the contours of the extracted object and the projected line of the surface of the display panel 12.
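A minimal sketch of this pixel-count measurement, assuming the contour points are given as (Xcamera, Ycamera) pixel indices and assuming an illustrative pixel pitch; none of these names or values come from the patent itself.

    PIXEL_PITCH = 0.005  # millimetres per imaging cell; an assumed, illustrative value

    def panel_distances(contour_points, panel_x):
        # panel_x: Xcamera pixel index of the projected line of the panel surface.
        # Returns the pixel count for each contour point and the minimum count Nmin.
        counts = [abs(xc - panel_x) for (xc, yc) in contour_points]
        return counts, min(counts)

    counts, n_min = panel_distances([(40, 120), (37, 121), (35, 122)], panel_x=30)
    nearest_distance = n_min * PIXEL_PITCH  # product of pixel pitch and pixel count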
In step S103, the first image processing circuit 30 extracts the least number of pixels among the plural numbers of pixels counted for measuring plural distances in step S102. The symbol Nmin denotes the least number of pixels among the plural numbers of pixels. Consequently, the distance corresponding to the minimum value Nmin corresponds to the nearest point of the object to the surface of the display panel 12. The first image processing circuit 30 then determines whether the minimum value Nmin is smaller than a predetermined number M0. When the minimum value Nmin is smaller than the predetermined number M0, i.e., YES in step S103, the process proceeds to step S104, and when the minimum value Nmin is not smaller than the predetermined number M0, i.e., NO in step S103, the process returns to step S101.
In step S104, the first image processing circuit 30 calculates motion vectors regarding predetermined plural points on the extracted contours of the object including the nearest point to the display panel 12. For this calculation, the first image processing circuit 30 uses the identical frame image data used for extracting the contours and the next following frame image data received from the first electronic camera 10.
In this example, the first image processing circuit 30 obtains optical flows, i.e., velocity vectors, by calculating a rate of temporal change of a pixel image density. The first image processing circuit 30 also obtains a rate of spatial change of image densities of pixels in the vicinity of the pixel used for calculating the rate of temporal change of the pixel image density. The motion vectors are expressed on the coordinate system (Xcamera, Ycamera), whose Ycamera axis is associated with the projected line of the surface of the display panel 12 on the CMOS image sensor 51 and whose Xcamera axis is perpendicular to the surface of the display panel 12.
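One common way to realize such a gradient-based estimate is a least-squares (Lucas-Kanade-style) solution over a small window, sketched below with NumPy. The window size, the axis mapping, and the function name are assumptions made only for illustration, not the circuit's actual implementation.

    import numpy as np

    def motion_vector(prev_frame, curr_frame, x, y, w=3):
        # Estimate (Vx, Vy) at pixel (x, y) by relating the temporal change of image
        # density to the spatial change of densities in a (2w+1) x (2w+1) window.
        win = np.s_[y - w:y + w + 1, x - w:x + w + 1]
        gy, gx = np.gradient(curr_frame.astype(float))   # spatial density gradients
        it = (curr_frame.astype(float) - prev_frame.astype(float))[win].ravel()
        a = np.stack([gx[win].ravel(), gy[win].ravel()], axis=1)
        (vx, vy), *_ = np.linalg.lstsq(a, -it, rcond=None)
        return vx, vy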
FIG. 7 is a diagram illustrating an image captured by the first electronic camera 10. With reference to FIG. 7, a thick line illustrates the projected line of the surface of the display panel 12 on the CMOS image sensor 51. The display panel 12 includes a display area and a frame around the circumference of, and at approximately the same level as, the display screen surface. Therefore, the surface of the display panel 12 can also be the surface of the frame. The alternate long and short dash line is drawn at a predetermined distance from the projected line of the surface of the display panel 12. The predetermined distance corresponds to the predetermined number M0 of pixels in step S103 of FIG. 6, and the region limited by the predetermined distance is denoted by REGION FOR OBTAINING MOTION VECTORS. The linked plural lines illustrate a pen as the contours of the object extracted in step S101 of FIG. 6.
In this example, the nearest point of the pen to the display panel 12, which is marked by the black dot at the tip of the pen in FIG. 7, is in the REGION FOR OBTAINING MOTION VECTORS. Accordingly, the calculation of motion vectors executed in step S104 of FIG. 6 results in a motion vector with components Vx and Vy, as illustrated in FIG. 7, regarding the nearest point (black dot) of the pen.
Referring back to FIG. 6, in step S105, the CPU 20 stores motion vector components along the direction Xcamera of the calculated vectors, such as the component Vx illustrated in FIG. 7, in the main memory 21. The CPU 20 stores those vector components from each obtained frame image data in succession. The successively stored motion vector data is also referred to as trace data.
In step S106, the CPU 20 determines whether the extracted object, such as the pen in FIG. 7, has made an attempt to input coordinates on the display panel 12 based on the trace data of motion vectors. A determining method is further described later. When the extracted object has made an attempt to input coordinates, i.e., YES in step S107, the process proceeds to step S108, and when the object has not made an attempt to input coordinates, i.e., No in step S107, the process branches to step S109.
In step S108, the CPU 20 measures the distance h between the optical axis crossing point Q and a projected point P of a contacting point A(x, y) of the object. When the extracted object is physically soft, such as a human finger, the extracted object may contact over an area rather than at a point. In such a case, the contacting point A(x, y) can be replaced with the center of the contacting area. In addition, as stated earlier, the term contacting point A(x, y) applies not only to a state in which the object contacts the display panel 12, but also to a state in which the object is adjacent to the display panel 12.
A range from the optical axis crossing point Q to an end of the CMOS image sensor 51 contains a fixed number (denoted by N1) of pixels, which depends only upon the relative locations at which the wide-angle lens 50 and the CMOS image sensor 51 are disposed.
On the other hand, a range from the point P to the end of the CMOS image sensor 51 contains a variable number of pixels (denoted by N2), which varies depending upon the location of the contacting point A(x, y) of the object. Therefore, the range between the point Q and the point P contains |N1−N2| pixels, and the distance between the point Q and the point P in the direction Ycamera, i.e., the distance h, is determined as |N1−N2|×(the pitch of the pixels).
Referring back again to FIG. 6, in step S110, the CPU 20 solves the angle β1 by using the equations (1) and (2), with known quantities f and α, and the measured distance h. As regards image data received from the second electronic camera 11, the CPU 20 solves the angle β2 in a similar manner.
In step S111, the CPU 20 solves the coordinates x and y of the object on the display panel 12 by using the equations (3) and (4), with known quantities L, and the solved angles β1 and β2.
In step S109, the CPU 20 determines whether the object is still within the predetermined region above the display panel 12 using the trace data of motion vector components Vx of the object. When the object is in the predetermined region, i.e., YES in step S109, the process returns to step S104 to obtain motion vectors again, and when the object is out of the predetermined region, i.e., NO in step S109, the process returns to step S101.
As stated above, for solving β1, β2, x and y by using equations (1), (2), (3) and (4), the calculating operations are executed by the CPU 20. However, the angles β1 and β2 may also be solved by the first image processing circuit 30 and the second image processing circuit 31, respectively, and the obtained β1 and β2 are then transferred to the CPU 20 to solve the coordinates x and y.
In addition, the CPU 20 may also execute the above-described contour extracting operation in step S101, the distance measuring operation in step S102, and the least number extracting and comparing operation in steps S103 and S104 in place of the first image processing circuit 30. When the CPU 20 executes these operations, the hard disk 27 may initially store the program codes, and the program codes are loaded into the main memory 21 for execution each time the system 1S is booted up.
When the coordinate data input system 1S is in a writing input mode or a drawing input mode, the CPU 20 generates display data according to the obtained plural sets of coordinates x and y of the object, i.e., the locus data of the object, and sends the generated display data to the display controller 29. Thus, the display controller 29 displays an image corresponding to the locus of the object on the display panel 12 of the coordinate data input apparatus 1.
A certain type of display panel, such as a rear projection display, has a relatively elastic surface, such as a plastic sheet screen. FIG. 8 is a diagram illustrating an image captured by the first electronic camera 10 when an input pen distorts the surface of a display panel. In the captured image, the tip of the pen is out of the frame due to the distortion or warp in the surface of the display panel caused by the pressure of the pen stroke. Intersections of the surface of the display panel 12 and the contours of the pen are denoted by point A and point B.
Accordingly, when the method of FIG. 6 is applied to a display panel having such a relatively elastic surface, the middle point between the points A and B may be presumed to be, or treated as substantially equivalent to, the nearest point of the pen, in place of the literal nearest point such as the black dot at the tip of the pen illustrated in FIG. 7.
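A short sketch of that substitution; the point names follow FIG. 8, and the helper function itself is hypothetical.

    def presumed_nearest_point(point_a, point_b):
        # Midpoint of the two intersections A and B of the panel-surface line and the
        # pen contours, used in place of the occluded pen tip on an elastic screen.
        (ax, ay), (bx, by) = point_a, point_b
        return ((ax + bx) / 2.0, (ay + by) / 2.0)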
FIG. 9 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation as another example configured according to the present invention. The method is also executed on the coordinate data input system 1S of FIG. 1.
With reference to FIG. 9, in step S201, the first image processing circuit 30 or the CPU 20 extracts contours of an object as a coordinate input member from frame image data received from the first electronic camera 10.
In step S202, the first image processing circuit 30 or the CPU 20 first extracts geometrical features of the shape of the extracted contours of the object. For extracting geometrical features, the first image processing circuit 30 or the CPU 20 determines the position of the barycenter of the contours of the object, then measures distances from the barycenter to plural points on the extracted contours for all radial directions like the spokes of a wheel. Then, the CPU 20 extracts geometrical features of the contour shape of the object based on relations between each direction and the respective distance. Japanese Laid-Open Patent Publication No. 8-315152 may also be referred to for executing the above-stated feature extraction method.
After that, the CPU 20 compares the extracted geometrical features of the contour shape of the object with features of cataloged shapes of potential coordinate input members one after the other. The shapes of potential coordinate input members may be stored in the ROM 24 or the hard disk 27 in advance.
When the operator of the coordinate data input system 1S points to an item on a menu or an icon, or draws a line, etc., with a coordinate input member, the axis of the coordinate input member may tilt in any direction with various tilting angles. Therefore, the CPU 20 may rotate the contour shape of the object for predetermined angles to compare with the cataloged shapes.
FIG. 10 is a diagram illustrating an image captured by the first electronic camera 10 when an input pen as a coordinate input member tilts to the surface of a display panel 12. In this case, the pen tilts to the surface of the display panel 12 at an angle AR as illustrated. Therefore, when the CPU 20 inversely rotates, i.e., rotates counterclockwise, the contour shape of the object at the angle AR, the contour shape easily coincides with one of the cataloged shapes.
Instead of such a rotating operation of the contour shape, the shapes of potential coordinate input members may be rotated at plural angles in advance, and the rotated shapes stored in the ROM 24 or the hard disk 27. Thus, the real-time rotating operation of the contour shape is not needed; consequently, execution time for the coordinate data inputting operation is further saved.
FIG. 11 is a diagram illustrating an image having an axially symmetric pen captured by the first electronic camera 10. As in the illustrated example, various sorts of potential coordinate input members, such as a pen, a magic marker, a stick, a rod, etc., have axial symmetry. Therefore, the CPU 20 may analyze whether the captured object has axial symmetry, and when the captured object has axial symmetry, the CPU 20 can simply presume the captured object to be a coordinate input member.
By this method, not all the cataloged shapes of potential coordinate input members are required to be stored in the ROM 24 or the hard disk 27; therefore storage capacity thereof is saved. As an example, the axial symmetry may be determined based on distances from the barycenter to plural points on the extracted contours.
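The sketch below illustrates, under stated assumptions, one way to form the radial-distance features and to test axial symmetry; the number of directions, the tolerance, and the symmetry criterion are illustrative choices, not the patent's prescription.

    import numpy as np

    def radial_signature(contour, n_dirs=36):
        # Distances from the barycenter of the contour to contour points, sampled
        # over n_dirs radial directions like the spokes of a wheel.
        pts = np.asarray(contour, dtype=float)
        cx, cy = pts.mean(axis=0)
        angles = np.arctan2(pts[:, 1] - cy, pts[:, 0] - cx)
        radii = np.hypot(pts[:, 0] - cx, pts[:, 1] - cy)
        bins = ((angles + np.pi) / (2 * np.pi) * n_dirs).astype(int) % n_dirs
        signature = np.zeros(n_dirs)
        for b, r in zip(bins, radii):
            signature[b] = max(signature[b], r)
        return signature

    def looks_axially_symmetric(signature, tol=0.15):
        # An axially symmetric contour has some mirror axis about which the radial
        # signature reads the same forwards and backwards.
        s = np.asarray(signature, dtype=float)
        best = min(np.mean(np.abs(np.roll(s[::-1], k) - s)) for k in range(len(s)))
        return best < tol * (np.mean(s) + 1e-9)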
Referring back to FIG. 9, in step S203, the CPU 20 determines whether the feature-extracted contour shape of the object coincides with one of the cataloged shapes of potential coordinate input members by determining methods including the above-stated methods. When the feature-extracted contour shape coincides with one of the cataloged shapes, i.e., YES in step S203, the process proceeds to step S204, and when the contour shape does not coincide with any of the cataloged shapes, i.e., NO in step S203, the process returns to step S201.
In step S204, the first image processing circuit 30 or the CPU 20 measures plural distances between points on the contours of the extracted object and points on the projected line of the surface of the display panel 12. For measuring those distances, the first image processing circuit 30 or the CPU 20 counts pixels included between a point on the contours of the extracted object and a point on the projected line of the surface of the display panel 12 with respect to each of the plural distances. A distance between two points is obtained as the product of the pixel pitch of the CMOS image sensor 51 and the number of pixels between the points.
In step S205, the first image processing circuit 30 or the CPU 20 extracts the least number of pixels, which is denoted by Nmin, among the plural numbers of pixels counted in step S204, and determines whether the minimum value Nmin is smaller than a predetermined number M0. When the minimum value Nmin is smaller than the predetermined number M0, i.e., YES in step S205, the process proceeds to step S206, and when the minimum value Nmin is not smaller than the predetermined number M0, i.e., NO in step S205, the process returns to step S201.
In step S206, the first image processing circuit 30 or the CPU 20 calculates motion vectors (Vx, Vy) regarding predetermined plural points on the extracted contours of the object including the nearest point to the display panel 12. The component Vx is a vector component along the Xcamera axis, i.e., a direction perpendicular to the projected line of the surface of the display panel 12, and the component Vy is a vector component along the Ycamera axis, i.e., a direction along the surface of the display panel 12. For calculating the motion vectors, the first image processing circuit 30 or the CPU 20 uses two consecutive frames and utilizes the optical flow method stated above.
In step S207, the CPU 20 successively stores motion vector components along the direction of Xcamera (i.e., Vx) of the calculated motion vectors of frames in the main memory 21 as trace data. In step S208, the CPU 20 determines whether the extracted object has made an attempt to input coordinates on the display panel 12 based on the trace data of motion vectors. When the object has made an attempt to input coordinates, i.e., YES in step S209, the process branches to step S211, and when the object has not made an attempt, i.e., No in step S209, the process proceeds to step S210.
In step S210, the CPU 20 determines whether the object is within a predetermined region above the display panel 12 using the trace data of motion vector components Vx of the object. When the object is in the predetermined region, i.e., YES in step S210, the process returns to step S206 to obtain new motion vectors again, and when the object is out of the predetermined region, i.e., NO in step S210, the process returns to step S201.
In step S211, the first image processing circuit 30 or the CPU 20 measures a distance h on the CMOS image sensor 51 between the optical axis crossing point Q and a projected point P of a contacting point A(x, y). In step S212, with reference to FIG. 4, the CPU 20 solves the angle β1 by using the equations (1) and (2), with known quantities f and α, and the measured distance h. As regards image data received from the second electronic camera 11, the CPU 20 solves the angle β2 in a similar manner.
In step S213, referring to FIG. 3, the CPU 20 solves the coordinates x and y of the object on the display panel 12 by using the equations (3) and (4), with known quantities L, and the solved angles β1 and β2.
As described, the CPU 20 only inputs coordinates of an object that coincides with one of cataloged shapes of potential coordinate input members. Accordingly, the coordinate data input system 1S can prevent an erroneous or unintentional inputting operation, e.g., inputting coordinates of an operator's arm, head, etc.
FIG. 12 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation as another example configured according to the present invention. This example is applied for, e.g., inputting a pointing or clicking operation for an icon, an item in a menu, etc., being displayed on the display panel 12. The method is also executed on the coordinate data input system 1S of FIG. 1.
With reference to FIG. 12, in step S301, the first image processing circuit 30 or the CPU 20 extracts contours of an object as a coordinate input member from frame image data received from the first electronic camera 10. In step S302, the first image processing circuit 30 or the CPU 20 determines whether the contour shape of the object is regarded as a coordinate input member. When the contour shape of the object is regarded as a coordinate input member, i.e., YES in step S302, the process proceeds to step S303, and when the contour shape of the object is not regarded as a coordinate input member, i.e., NO in step S302, the process returns to step S301.
In step S303, the first image processing circuit 30 or the CPU 20 measures plural distances between points on the contours of the extracted object and points on the projected line of the surface of the display panel 12. For measuring those distances, the first image processing circuit 30 or the CPU 20 counts pixels included between a point on the contours of the extracted object and a point on the projected line of the surface of the display panel 12 regarding each of the distances. A distance between two points is obtained as the product of the pixel pitch of the CMOS image sensor 51 and the number of pixels between the points.
In step S304, the first image processing circuit 30 or the CPU 20 extracts the least number of pixels Nmin among the plural numbers of pixels counted in step S303, and determines whether the minimum value Nmin is smaller than a predetermined number M0. When the minimum value Nmin is smaller than the predetermined number M0, i.e., YES in step S304, the process proceeds to step S305, and when the minimum value Nmin is not smaller than the predetermined number M0, i.e., NO in step S304, the process returns to step S301.
In step S305, the first image processing circuit 30 or the CPU 20 calculates motion vectors (Vx, Vy) regarding predetermined plural points on the extracted contours of the object including the nearest point to the display panel 12. The component Vx is a vector component along the Xcamera axis, i.e., a direction perpendicular to the projected line of the surface of the display panel 12, and the component Vy is a vector component along the Ycamera axis, i.e., a direction along the surface of the display panel 12. For calculating the motion vectors, the first image processing circuit 30 or the CPU 20 uses two consecutive frames of image data and utilizes the optical flow method stated above.
In step S306, the CPU 20 successively stores motion vector components along the direction Xcamera, i.e., component Vx, of plural frames in the main memory 21 as trace data.
In step S307, the CPU 20 determines whether a moving direction of the extracted object has been reversed from an advancing motion toward the display panel 12 to a leaving motion from the panel 12 based on the trace data of motion vectors. When the moving direction of the extracted object has been reversed, i.e., YES in step S307, the process branches to step S309, and when the moving direction has not reversed, i.e., No in step S307, the process proceeds to step S308.
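One simple way to make this determination from the stored trace data is to look for a sign change in the Vx components. In the sketch below, the sign convention (positive toward the display panel 12, negative away from it) is an assumption consistent with the later description of FIG. 13.

    def has_reversed(vx_trace):
        # True once the traced Xcamera components show the object approaching the
        # panel (positive values) and then leaving it (a negative value).
        approached = False
        for vx in vx_trace:
            if vx > 0:
                approached = True
            elif vx < 0 and approached:
                return True
        return False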
In step S308, the first image processing circuit 30 or the CPU 20 determines whether the object is within a predetermined region above the display panel 12 using the trace data of motion vector components Vx of the object. When the object is in the predetermined region, i.e., YES in step S308, the process returns to step S305 to obtain new motion vectors again, and when the object is out of the predetermined region, i.e., NO in step S308, the process returns to step S301.
In step S309, the first image processing circuit 30 or the CPU 20 measures a distance h on the CMOS image sensor 51 between the optical axis crossing point Q and a projected point P of a contacting point A(x, y) of the object. As the projected point P, for example, the starting point of the centermost one of the plural motion vectors whose direction has been reversed is used.
In step S310, referring to FIG. 4, the CPU 20 solves the angle β1 by using the equations (1) and (2), with known quantities f and α, and the measured distance h. As regards image data received from the second electronic camera 11, the CPU 20 solves the angle β2 in a similar manner.
In step S311, referring to FIG. 3, the CPU 20 solves the coordinates x and y of the object on the display panel 12 by using the equations (3) and (4), with known quantities L, and the solved angles β1 and β2.
FIG. 13 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation as another example configured according to the present invention. This example is applied to, for example, inputting information while a coordinate input member is staying at the surface of the display panel 12. The method is also executed on the coordinate data input system 1S of FIG. 1.
Referring to FIG. 13, in step S401, the first image processing circuit 30 or the CPU 20 extracts contours of an object as a coordinate input member from frame image data received from the first electronic camera 10. In step S402, the first image processing circuit 30 or the CPU 20 determines whether the contour shape of the object is regarded as a coordinate input member. When the contour shape of the object is regarded as a coordinate input member, i.e., YES in step S402, the process proceeds to step S403, and when the contour shape of the object is not regarded as a coordinate input member, i.e., NO in step S402, the process returns to step S401.
In step S403, the first image processing circuit 30 or the CPU 20 measures plural distances between points on the contours of the extracted object and points on the projected line of the surface of the display panel 12. For measuring those distances, the first image processing circuit 30 or the CPU 20 counts pixels included between a point on the contours of the extracted object and a point on the projected line of the surface of the display panel 12 for each of the distances. A distance between two points is obtained as the product of the pixel pitch of the CMOS image sensor 51 and the number of pixels between the points.
In step S404, the first image processing circuit 30 or the CPU 20 extracts the least number of pixels Nmin among the plural numbers of pixels counted in step S403, and determines whether the minimum value Nmin is smaller than a predetermined number M0. When the minimum value Nmin is smaller than the predetermined number M0, i.e., YES in step S404, the process proceeds to step S405, and when the minimum value Nmin is not smaller than the predetermined number M0, i.e., NO in step S404, the process returns to step S401.
In step S405, the first image processing circuit 30 or the CPU 20 calculates motion vectors (Vx, Vy) regarding predetermined plural points on the extracted contours of the object including the nearest point to the display panel 12. Vx is a vector component along the Xcamera axis, i.e., a direction perpendicular to the projected line of the surface of the display panel 12, and Vy is a vector component along the Ycamera axis, i.e., a direction along the surface of the display panel 12. For calculating the motion vectors, the first image processing circuit 30 or the CPU 20 uses two consecutive frames and utilizes the optical flow method stated above.
In step S406, the CPU 20 successively stores motion vector components along the direction Xcamera of the calculated vectors, i.e., the component Vx, in the main memory 21 as trace data.
In step S407, the CPU 20 determines whether the vector component Vx, which is perpendicular to the plane of the display panel 12, has become substantially zero after an advancing motion toward the display panel 12. When the component Vx of the motion vector has become practically zero, i.e., YES in step S407, the process branches to step S409, and when the component Vx has not become zero yet, i.e., NO in step S407, the process proceeds to step S408.
In step S408, the CPU 20 determines whether the object is located within a predetermined region above the display panel 12 using the trace data of motion vector components Vx of the object. When the object is located in the predetermined region, i.e., YES in step S408, the process returns to step S405 to obtain new motion vectors again, and when the object is out of the predetermined region, i.e., NO in step S408, the process returns to step S401.
In step S409, the CPU 20 determines that a coordinate inputting operation has been started, and transitions the state of the coordinate data input system 1S to a coordinate input state. In step S410, the first image processing circuit 30 or the CPU 20 measures a distance h between the optical axis crossing point Q and the projected point P of a contacting point A(x, y) of the object on the CMOS image sensor 51.
In step S411, referring to FIG. 4, the CPU 20 solves the angle β1 by using the equations (1) and (2), with known quantities f and α, and the measured distance h. As regards image data received from the second electronic camera 11, the CPU 20 solves the angle β2 in a similar manner. In step S412, referring to FIG. 3, the CPU 20 solves the coordinates x and y of the object on the display panel 12 by using the equations (3) and (4), with known quantities L, and the solved angles β1 and β2.
In step S413, the CPU 20 determines whether the motion vector component Vy at the point P has changed while the other motion vector component Vx is zero. In other words, the CPU 20 determines whether the object has moved in any direction along the surface of the display panel 12. When the motion vector component Vy has changed while the other motion vector component Vx is zero, i.e., YES in step S413, the process returns to step S410 to obtain the coordinates x and y of the object at the moved location. When the motion vector component Vy has not changed, i.e., NO in step S413, the process proceeds to step S414.
Further, the CPU 20 may also evaluate the motion vector component Vy under a condition that the other component Vx is a positive value, which represents a direction approaching the display panel 12, in addition to the above-described condition that the component Vx is zero.
In step S414, the CPU 20 determines whether the motion vector component Vx regarding the point P has become a negative value, which represents a direction leaving from the display panel 12. When the motion vector component Vx has become a negative value, i.e., YES in step S414, the process proceeds to step S415, and if NO, the process returns to step S410. In step S415, the CPU 20 determines that the coordinate inputting operation has been completed, and terminates the coordinate input state of the coordinate data input system 1S.
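Taken together, steps S407 through S415 behave like a small state machine. The sketch below is one way to express it, with an assumed tolerance eps and hypothetical per-frame sample tuples; it is not the patent's own pseudocode.

    def trace_contact(samples, eps=1e-3):
        # samples: per-frame tuples (vx, vy, x, y) for the object near the panel.
        # Yields the coordinates collected between contact start (Vx reaching zero
        # after an advance) and contact end (Vx turning negative, i.e., leaving).
        in_contact = False
        for vx, vy, x, y in samples:
            if not in_contact:
                if abs(vx) < eps:
                    in_contact = True
                    yield (x, y)
            elif vx < -eps:
                return
            else:
                yield (x, y)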
Thus, the CPU 20 can generate display data according to the coordinate data obtained during the above-described coordinate input state, and transmit the generated display data to the display controller 29 to display an image of the input data on the display panel 12.
FIG. 14 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation as another example configured according to the present invention. These operational steps are also executed on the coordinate data input system 1S of FIG. 1. In this example, a frame rate output from each of the first and second electronic cameras 10 and 11 varies depending on a distance of a coordinate input member from the display panel 12. The frame rate may be expressed as the number of frames per one second.
When a coordinate input member is within a predetermined distance, the frame rate output from each of the CMOS image sensors 51 and 53 is increased to obtain the motion of the coordinate input member further in detail. When the coordinate input member is out of the predetermined distance, the output frame rate is decreased to reduce loads of the other devices in the coordinate data input system 1S, such as the first image processing circuit 30, the second image processing circuit 31, the CPU 20, etc.
The frame rate of each of the first and second electronic cameras 10 and 11, i.e., the frame rate of each of the CMOS image sensors 51 and 53, is capable of being varied as necessary between at least two frame rates, one referred to as a high frame rate and the other referred to as a low frame rate. The data size per unit time input to the first image processing circuit 30 and the second image processing circuit 31 varies depending on the frame rate of the image data. When the coordinate data input system 1S is powered on, the low frame rate is initially selected as a default frame rate.
Referring now to FIG. 14, in step S501, the first image processing circuit 30 or the CPU 20 extracts contours of an object as a coordinate input member from frame image data received from the first electronic camera 10. In step S502, the first image processing circuit 30 or the CPU 20 determines whether the contour shape of the object is regarded as a coordinate input member. When the contour shape of the object is regarded as a coordinate input member, i.e., YES in step S502, the process proceeds to step S503, and when the contour shape of the object is not regarded as a coordinate input member, i.e., NO in step S502, the process returns to step S501.
In step S503, the first image processing circuit 30 or the CPU 20 measures plural distances between points on the contours of the extracted object and points on the projected line of the surface of the display panel 12. For measuring those distances, the first image processing circuit 30 or the CPU 20 counts pixels included between a point on the contours of the extracted object and a point on the projected line of the surface of the display panel 12 regarding each of the distances. A distance between two points is obtained as the product of the pixel pitch of the CMOS image sensor 51 and the number of pixels between the points.
In step S504, the first image processing circuit 30 or the CPU 20 extracts the least number of pixels Nmin among the plural numbers of pixels counted in step S503, and determines whether the minimum value Nmin is smaller than a first predetermined number M1. When the minimum value Nmin is smaller than the first predetermined number M1, i.e., YES in step S504, the process proceeds to step S505, and when the minimum value Nmin is not smaller than the first predetermined number M1, i.e., NO in step S504, the process returns to step S501.
The first predetermined number M1 in step S504 is larger than a second predetermined number M0, which is used in the following steps for starting the tracing of vector data.
In step S505, the first image processing circuit 30 sends a command to the first electronic camera 10 to request increasing the output frame rate of the CMOS image sensor 51. Such a command for switching the frame rate, i.e., from the low frame rate to the high frame rate or from the high frame rate to the low frame rate, is transmitted through a cable that also carries image data. When the first electronic camera 10 receives the command, the first electronic camera 10 controls the CMOS image sensor 51 to increase the output frame rate thereof. As an example for increasing the output frame rate of the CMOS image sensor 51, the charge time of each of photoelectric conversion devices, i.e., the imaging cells, in the CMOS image sensor 51 may be decreased.
In step S506, the CPU 20 determines whether the object is in a second predetermined distance from the display panel 12 to start a tracing operation of motion vectors of the object. In other words, the CPU 20 determines if the minimum value Nmin is smaller than the second predetermined number M0, which corresponds to the second predetermined distance, and if YES, the process proceeds to step S507, and if No, the process branches to step S508.
In step S507, the CPU 20 traces the motion of the object and generates coordinate data of the object according to the traced motion vectors. As stated earlier, the second predetermined number M0 is smaller than the first predetermined number M1; therefore, the spatial range for tracing motion vectors of the object is smaller than the spatial range for outputting image data with the high frame rate from the CMOS image sensor 51.
In step S508, the first image processing circuit 30 determines whether the minimum value Nmin is still smaller than the first predetermined number M1, i.e., the object is still in the range of the first predetermined number M1. When the minimum value Nmin is still smaller than the first predetermined number M1, i.e., YES in step S508, the process returns to step S506, and when the minimum value Nmin is no longer smaller than the first predetermined number M1, i.e., NO in step S508, the process proceeds to step S509.
In step S509, the first image processing circuit 30 sends a command to the first electronic camera 10 to request decreasing the output frame rate of the CMOS image sensor 51, and then the process returns to the step S501. Receiving the command, the first electronic camera 10 controls the CMOS image sensor 51 to decrease again the output frame rate thereof.
In the above-described operational steps, the second electronic camera 11 and the second image processing circuit 31 operate substantially the same as the first electronic camera 10 and the first image processing circuit 30 operate.
In this example, while the coordinate input member is at a distant place from the display panel 12, the first electronic camera 10 and the second electronic camera 11 operate at the low frame rate and output a relatively small quantity of image data to the other devices. Consequently, power consumption of the coordinate data input system 1S is decreased.
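The two thresholds give a simple hysteresis between the low and high frame rates, sketched below with illustrative numbers; the rates and threshold values are assumptions, not figures from the patent.

    LOW_RATE_FPS, HIGH_RATE_FPS = 15, 60   # illustrative frame rates
    M1, M0 = 80, 30                        # pixel-count thresholds, with M1 > M0

    def select_frame_rate(n_min):
        # Nmin below M1: request the high frame rate to capture motion in detail.
        # Otherwise: fall back to the low frame rate to lighten downstream load.
        return HIGH_RATE_FPS if n_min < M1 else LOW_RATE_FPS

    def should_trace_motion(n_min):
        # Motion vectors are traced only in the narrower band defined by M0.
        return n_min < M0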
FIG. 15 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation as another example configured according to the present invention. These operational steps are also executed on the coordinate data input system 1S of FIG. 1. In this example, the image area output from each of the CMOS image sensors 51 and 53 varies depending upon a distance of a coordinate input member from the display panel 12. In other words, the output image area is limited to within a predetermined distance from a coordinate input member depending on the location of the coordinate input member. When the output image area is limited to a small area, the image data size included in a frame also decreases, and consequently the decreased data size reduces the load on devices such as the first image processing circuit 30, the second image processing circuit 31, the CPU 20, etc. That is, the power consumption of the coordinate data input system 1S is also decreased.
The pixels in each of the CMOS image sensors 51 and 53 can be randomly accessed pixel by pixel, i.e., the pixels in the CMOS image sensors 51 and 53 can be randomly addressed to output the image signals thereof. This random accessibility enables the above-stated output image area limitation. When the coordinate data input system 1S is powered on, the output image area is set to cover a region surrounded by the whole horizontal span of the display panel 12 and a predetermined altitude range above the display panel 12 as a default image area.
Referring now to FIG. 15, in step S601, the first image processing circuit 30 or the CPU 20 extracts contours of an object as a coordinate input member from frame image data received from the first electronic camera 10. In step S602, the first image processing circuit 30 or the CPU 20 determines whether the contour shape of the object is regarded as a coordinate input member. When the contour shape of the object is regarded as a coordinate input member, i.e., YES in step S602, the process proceeds to step S603, and when the contour shape of the object is not regarded as a coordinate input member, i.e., NO in step S602, the process returns to step S601.
In step S603, the first image processing circuit 30 or the CPU 20 measures plural distances between points on the contours of the extracted object and points on the projected line of the surface of the display panel 12. For measuring those distances, the first image processing circuit 30 or the CPU 20 counts pixels included between a point on the contours of the extracted object and a point on the projected line of the surface of the display panel 12 for each of the distances. A distance between two points is obtained as the product of the pixel pitch of the CMOS image sensor 51 and the number of pixels between the two points.
In step S604, the first image processing circuit 30 or the CPU 20 extracts the least number of pixels Nmin among the plural numbers of pixels counted in step S603, and determines whether the minimum value Nmin is smaller than a predetermined number K. When the minimum value Nmin is smaller than the predetermined number K, i.e., YES in step S604, the process proceeds to step S605, and when the minimum value Nmin is not smaller than the predetermined number K, i.e., NO in step S604, the process returns to step S601.
FIG. 16A is a diagram illustrating an image captured by the first electronic camera 10 and an output limitation of the image. In FIG. 16A, the symbol K denotes a predetermined distance, and the symbol ym denotes a coordinate of the illustrated coordinate input member from an end of the CMOS image sensor 51 in the Ycamera axis direction.
Referring back to FIG. 15, in step S605, the first image processing circuit 30 first calculates the distance ym of the object from an end of the CMOS image sensor 51. After that, the first image processing circuit 30 sends a command to the first electronic camera 10 to limit the output image area of the CMOS image sensor 51 to a relatively small area. Referring back to FIG. 16A, the limited area corresponds to the area within a predetermined distance λ from the coordinate input member on both sides in the Ycamera axis direction.
Such a command for limiting the output image area is transmitted through a common cable that carries image data. When the first electronic camera 10 receives the command, the first electronic camera 10 controls the CMOS image sensor 51 so as to limit the output image area thereof.
FIG. 16B is a diagram illustrating an image captured by the first electronic camera 10 and a displaced output limitation of the image. In FIG. 16B, the symbol ym denotes an original location of a coordinate input member and the symbol ym1 denotes a displaced location thereof. The symbol LL denotes a displacement of the coordinate input member from the original location ym to the displaced location ym1. As illustrated, when the coordinate input member moves from the original location ym to the location ym1, the limiting range λ of the output image also follows the coordinate input member to the new location ym1.
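A sketch of the output-area limitation along the Ycamera axis; clamping to the sensor bounds and the numeric values are illustrative assumptions.

    def output_window(ym, lam, sensor_cells):
        # Band of +/- lambda imaging cells around the object location ym, clamped to
        # the sensor; it follows the object as ym changes (FIG. 16A and FIG. 16B).
        start = max(0, ym - lam)
        end = min(sensor_cells, ym + lam)
        return start, end

    print(output_window(ym=700, lam=120, sensor_cells=2000))   # prints (580, 820)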
Referring back to FIG. 15, in step S606, the first image processing circuit 30 determines whether the object has moved in the Ycamera axis direction. When the object has moved, i.e., YES in step S606, the process proceeds to step S607, and if NO in step S606, the process skips step S607 and proceeds to step S608.
In step S607, the first image processing circuit 30 sends a command to the first electronic camera 10 to limit the output image area of the CMOS image sensor 51 to the range λ around the moved location ym1 of the object as illustrated in FIG. 16B. Thus, as long as the object stays under the predetermined altitude K above the display panel 12, the first electronic camera 10 continues to send images limited to the area within the distance λ around the object to the first image processing circuit 30.
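A minimal sketch of the window-limiting logic of steps S605 through S607 follows; the row-window representation, the sensor size, and the value of λ are assumptions made for illustration and do not reflect the camera's actual command interface.

```python
# Sketch of steps S605-S607: keep only the sensor cells within +/- lambda of the
# detected object position ym along the Ycamera axis, and move the window when
# the object moves. Names and values are illustrative assumptions.
SENSOR_CELLS = 2000        # assumed number of cells along the Ycamera axis
LAMBDA = 100               # assumed half-width of the limited output area, in cells

def limited_window(ym, half_width=LAMBDA, cells=SENSOR_CELLS):
    """Return the (start, end) cell range the camera is asked to output."""
    start = max(0, ym - half_width)
    end = min(cells, ym + half_width)
    return start, end

window = limited_window(ym=850)          # step S605: initial limitation around ym
print(window)                            # (750, 950)

ym1 = 910                                # object moved along Ycamera (FIG. 16B)
window = limited_window(ym1)             # step S607: window follows the object
print(window)                            # (810, 1010)
```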
In step S608, the CPU 20 determines whether the object is within a predetermined distance from the display panel 12 at which a tracing operation of motion vectors of the object starts. In other words, the CPU 20 determines whether the minimum value Nmin is smaller than the predetermined number M0, which corresponds to the predetermined distance. If YES in step S608, the process proceeds to step S609, and if NO in step S608, the process branches to step S610.
In step S609, the CPU 20 traces motion vectors of the object, and inputs coordinate data of the object according to traced motion vectors.
In step S610, the CPU 20 determines whether the object is still within the predetermined altitude K above the display panel 12 for outputting image data limited in the range 2λ. When the object is within the predetermined altitude K, i.e., YES in step S610, the process returns to step S608, and when the object is no longer within the predetermined altitude K, i.e., NO in step S610, the process proceeds to step S611.
In step S611, the first image processing circuit 30 sends a command to the first electronic camera 10 to expand the output image area of the CMOS image sensor 51 to cover the whole area of the display panel 12, and then the process returns to step S601. When the first electronic camera 10 receives the command, the first electronic camera 10 controls the CMOS image sensor 51 to expand the output image area to cover the whole area of the display panel 12, i.e., to the same state as when the coordinate data input system 1S is turned on.
In the above-described operational steps, the second electronic camera 11 and the second image processing circuit 31 operate in substantially the same manner as the first electronic camera 10 and the first image processing circuit 30.
Present-day large screen display devices on the market, such as plasma display panels (PDPs) and rear projection displays, generally have a 40-inch to 70-inch screen with 1024-pixel by 768-pixel resolution, which is known as an XGA screen. To take full advantage of such display performance in a coordinate data input system, image sensors such as the CMOS image sensors 51 and 53 preferably have about 2000 imaging cells (pixels) in one direction, roughly enough to resolve the positions along the two far edges of an XGA screen (1024 + 768 ≈ 1800) as seen from a corner. Against this background, the following examples according to the present invention are configured to further reduce the cost of a coordinate data input system.
FIG. 17 is a schematic view illustrating a coordinate data input system 60S as another example configured according to the present invention. The coordinate data input system 60S includes a coordinate data input apparatus 60 and a control apparatus 61. The coordinate data input apparatus 60 includes a first linear sensor camera 70, a second linear sensor camera 71, an area sensor camera 72, a display panel 73, and a frame 74.
The linear sensor camera may also be referred to as a line sensor camera, a one-dimensional sensor camera, a 1-D camera, etc., and the area sensor camera may also be referred to as a video camera, a two-dimensional camera, a two-dimensional electronic camera, a 2-D camera, a digital still camera, etc.
Each of the first linear sensor camera 70 and the second linear sensor camera 71 includes a wide-angle lens, which covers 90 degrees or more, and a charge coupled device (CCD) linear image sensor. The first linear sensor camera 70 and the second linear sensor camera 71 output image data as analog signals. The CCD linear image sensor is provided with, for example, 2000 imaging cells (pixels), i.e., photoelectric converters such as photodiodes. Thus, the first linear sensor camera 70 and the second linear sensor camera 71 have sufficient resolution to repetitively read an image of an XGA screen display along the direction of the array of imaging cells.
Further, the two linear sensor cameras are disposed such that their optical axes cross at an appropriate angle, which enables inputting various information including two-dimensional coordinates, such as information on a selecting operation of an item in a menu window, a drawing operation of freehand lines and letters, etc.
The area sensor camera 72 includes a wide-angle lens, which covers 90 degrees or more, a two-dimensional CMOS image sensor, and an analog to digital converter. The two-dimensional CMOS image sensor has enough imaging cells and a sufficient output frame rate to recognize the motion of a coordinate input member. As the two-dimensional CMOS image sensor, for example, a sensor having 640 by 480 imaging cells, which is referred to as VGA resolution, may be used. The area sensor camera 72 outputs image data as a digital signal, the data being converted by the embedded analog to digital converter.
Each of the first linear sensor camera 70, the second linear sensor camera 71, and the area sensor camera 72 includes a smaller number of imaging pixels compared to the two-dimensional image sensors used in the coordinate data input system 1S of FIG. 1. Consequently, these cameras 70, 71 and 72 can output frame images at a higher frame rate than the two-dimensional image sensors used in the coordinate data input system 1S of FIG. 1.
The first linear sensor camera 70 and the area sensor camera 72 are both disposed at the upper left corner of the display panel 73, such that the optical axis of each wide-angle lens forms an angle of approximately 45 degrees with a horizontal edge of the display panel 73. The second linear sensor camera 71 is disposed at the upper right corner of the display panel 73, such that the optical axis of its wide-angle lens forms an angle of approximately 45 degrees with a horizontal edge of the display panel 73. Further, the optical axis of each of the cameras 70, 71 and 72 is disposed approximately parallel to the display surface of the display panel 73. Thus, each of the cameras 70, 71 and 72 can capture the whole display screen area of the display panel 73 and transmit the captured image data to the control apparatus 61.
The display panel 73 displays an image with, for example, a 48 by 36 inch screen and 1024 by 768-pixel resolution. For example, a plasma display panel, a rear projection liquid crystal display, a rear projection CRT display, etc., may be used as the display panel 73.
The frame 74 preferably has a surface of a material with a low optical reflection coefficient, such as black-painted or black-plated metal or black resin. The frame 74 is mounted along the left, bottom, and right circumferences of the display panel 73. In the direction perpendicular to the surface of the display panel 73, the frame 74 protrudes above the surface of the display panel 73. The protrusion may be large enough to cover the angle of view of the first linear sensor camera 70 and the second linear sensor camera 71 in the direction perpendicular to the surface of the display panel 73.
Accordingly, when no coordinate input member exists in the vicinity of the surface of the display panel 73, the first linear sensor camera 70 and the second linear sensor camera 71 capture the frame 74 and output image data thereof, i.e., black image data, respectively.
FIG. 18 is an exemplary block diagram of the control apparatus 61 of the coordinate data input system 60S of FIG. 17 configured according to the present invention. Referring to FIG. 18, the control apparatus 61 includes a central processing unit (CPU) 20, a main memory 21, a clock generator 22, a bus controller 23, a read only memory (ROM) 24, a peripheral component interconnect (PCI) bridge 25, a cache memory 26, a hard disk 27, a hard disk (HD) controller 28, a display controller 29, a first image processing circuit 90, a second image processing circuit 91, and a third image processing circuit 92.
The control apparatus 61 also includes a local area network (LAN) controller 32, a LAN interface 33, a floppy disk (FD) controller 34, a FD drive 35, a compact disc read only memory (CD-ROM) controller 36, a CD-ROM drive 37, a keyboard controller 38, a mouse interface 39, a real time clock (RTC) generator 40, a CPU bus 41, a PCI bus 42, an internal X bus 43, a keyboard 44, and a mouse 45.
In FIG. 18, the elements that are substantially the same as those in FIG. 2 are denoted by the same reference numerals. Therefore, a description of the same elements in FIG. 18 as in FIG. 2 is not provided here to avoid redundancy.
Referring to FIG. 18, the first image processing circuit 90 receives digital image data output from the area sensor camera 72 through a digital interface, such as an RS-422 interface. The first image processing circuit 90 then executes an object extraction process, an object shape recognition process, an object motion vector determining process, etc.
The second image processing circuit 91 includes an analog to digital converting circuit, and receives the analog image signal output from the first linear sensor camera 70 via a coaxial cable. Then, the second image processing circuit 91 detects a linear (one-dimensional) location of an object based on the received image signal. Further, the second image processing circuit 91 supplies the first linear sensor camera 70 with a clock signal and an image transfer pulse via the above-described digital interface.
The third image processing circuit 92 is configured with substantially the same hardware as the second image processing circuit 91, and operates substantially the same as the second image processing circuit 91 operates. That is, the third image processing circuit 92 includes an analog to digital converting circuit, and receives the analog image signal output from the second linear sensor camera 71 via a coaxial cable. Then, the third image processing circuit 92 detects a linear location of the object based on the image signal received from the second linear sensor camera 71. The third image processing circuit 92 also supplies the second linear sensor camera 71 with a clock signal and an image transfer pulse via a digital interface, such as an RS-422 interface.
In addition, the clock signal and the image transfer pulse supplied to the first linear sensor camera 70 and those supplied to the second linear sensor camera 71 are maintained in synchronization.
FIG. 19 is a diagram illustrating an analog signal waveform output from the first linear sensor camera 70 or the second linear sensor camera 71. The analog signal waveform in FIG. 19 has been observed with an oscilloscope; the horizontal axis represents time and the vertical axis represents voltage. Since the first linear sensor camera 70 and the second linear sensor camera 71 each have one-dimensionally aligned imaging cells, the horizontal axis also corresponds to the direction along the aligned imaging cells.
The PEDESTAL LEVEL of the waveform corresponds to the output voltage of a captured image of the black frame 74. A positive pulse in the waveform corresponds to a captured image of a coordinate input member having a relatively high optical reflection coefficient, e.g., white, red, gray, etc. Lighting fixtures and/or sunlight entering from windows illuminate both the black frame 74 and a coordinate input member; however, the black frame 74 reflects little light while the coordinate input member reflects more, so the linear CCD image sensors in the linear sensor cameras 70 and 71 generate a waveform having such a pulse.
The height of the pulse is proportional to the optical reflection coefficient of the coordinate input member. Further, the height and width of the pulse are affected by the size of the coordinate input member and its distance from the first linear sensor camera 70 and the second linear sensor camera 71. For example, when the coordinate input member is thin and located far from the first linear sensor camera 70 and the second linear sensor camera 71, the pulse on the output voltage waveform generally becomes narrow and short.
Furthermore, the height and width of the pulse are affected by the location of the coordinate input member in the direction perpendicular to the surface of the display panel 73. For example, when the coordinate input member is contacting the display panel 73, the pulse appears with its maximum height and width. As the coordinate input member moves away from the display panel 73, the pulse becomes narrower and shorter. If the coordinate input member is out of the angle of view of the first linear sensor camera 70 and the second linear sensor camera 71, the pulse disappears.
The alternate long and short dash line denoted by THRESHOLD LEVEL represents a threshold voltage used for discriminating or slicing a pulse portion of the waveform signal. When a pulse portion of the signal is above the threshold level, the location of the peak of the pulse along the time axis is utilized for identifying the location of the coordinate input member on the display panel 73.
As described above, the height and width of the pulse are affected by various factors; therefore, the threshold level may be determined experimentally. Further, the threshold level may be readjusted according to the illumination of the room in which the coordinate data input system 60S is installed for use.
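The threshold-and-peak operation described above can be illustrated on a digitized line of linear-sensor data as follows; the sample values and the threshold are assumed for illustration.

```python
# Sketch of slicing a pulse above THRESHOLD LEVEL and locating its peak in a
# digitized linear-sensor line. The samples and threshold are illustrative.
def find_pulse_peak(samples, threshold):
    """Return the index of the highest sample above the threshold, or None if no
    sample rises above the threshold (only the black frame is seen)."""
    above = [i for i, v in enumerate(samples) if v > threshold]
    if not above:
        return None
    return max(above, key=lambda i: samples[i])   # peak position along the cell array

line = [0.1] * 100                                # pedestal level from the black frame 74
for i, v in zip(range(40, 46), (0.3, 0.7, 1.0, 0.8, 0.4, 0.2)):
    line[i] = v                                   # reflection pulse from an input member
print(find_pulse_peak(line, threshold=0.25))      # -> 42, the projected point P
```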
Referring back to FIG. 18, the second image processing circuit 91 detects a peak of a pulse in an image signal output from the CCD linear image sensor of the first linear sensor camera 70 as a location P that corresponds to a contact point A(x, y) of a coordinate input member, when the pulse exceeds the threshold level. After that, the second image processing circuit 91 measures the distance h between the optical axis crossing point Q of the first linear sensor camera 70 and the projected point P of the contact point of the coordinate input member on the CCD linear image sensor.
The above-stated points P and Q and the distance h substantially correspond to the symbols shown in FIG. 3 and FIG. 4. Therefore, the aforesaid equations (1), (2), (3) and (4) also hold, where the distance f between the CCD linear image sensor and the wide-angle lens is known. Likewise, the angle α, which the optical axis of the first linear sensor camera 70 forms with the X-line or a horizontal edge of the display panel 73, is known. Accordingly, the angle β1, which is formed by the X-line and a line connecting the wide-angle lens and the touching point A(x, y) of the coordinate input member, is obtained.
Similarly, the third image processing circuit 92 detects a peak of a pulse in an image signal output from the CCD linear image sensor of the second linear sensor camera 71 as a projected point P of the contact point of the coordinate input member. Then, the third image processing circuit 92 measures the distance h between the optical axis crossing point Q of the second linear sensor camera 71 and the detected point P on the CCD linear image sensor. Accordingly, the angle β2 is also obtained. In addition, the distance L between the wide-angle lenses of the first linear sensor camera 70 and the second linear sensor camera 71 is known. Finally, the contact point A(x, y) of the coordinate input member is solved for.
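Equations (1) through (4) are referenced above but not reproduced in this passage. The sketch below therefore assumes the conventional corner-camera triangulation geometry for such systems: the offset h of the projected point P from the optical axis crossing point Q, together with the lens-to-sensor distance f and the mounting angle α, gives each sight-line angle, and the two sight lines from lenses separated by L are intersected. The exact signs and forms, as well as all numeric values, are assumptions rather than the patent's own equations.

```python
import math

# Hedged sketch of the triangulation outlined above, assuming the usual
# geometry for corner-mounted cameras: beta = alpha - arctan(h / f), and the
# contact point is the intersection of two sight lines from lenses a distance
# L apart along the top edge. All constants are illustrative.
f = 10.0                       # assumed lens-to-sensor distance (same units as h)
alpha = math.radians(45.0)     # mounting angle of each optical axis to the top edge
L = 1220.0                     # assumed distance between the two wide-angle lenses

def sight_angle(h, f=f, alpha=alpha):
    """Angle beta between the top edge and the line from the lens to A(x, y)."""
    return alpha - math.atan2(h, f)

def intersect(beta1, beta2, L=L):
    """Intersection of the two sight lines, with x measured from the left lens
    along the top edge and y measured downward into the panel."""
    t1, t2 = math.tan(beta1), math.tan(beta2)
    x = L * t2 / (t1 + t2)
    y = x * t1
    return x, y

beta1 = sight_angle(h=2.0)     # from the first (left) linear sensor camera
beta2 = sight_angle(h=-1.0)    # from the second (right) linear sensor camera
print(intersect(beta1, beta2))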
FIG. 20 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation in the coordinate data input system 60S of FIG. 17 as an example configured according to the present invention.
First, as necessary, the area sensor camera 72 limits the image area in the direction perpendicular to the display panel 73 so that only image data within a predetermined distance from the display panel 73 is output to the first image processing circuit 90. In other words, the area sensor camera 72 clips an upper and/or lower portion of the analog image signal output from its CMOS area sensor. Then, the area sensor camera 72 converts the analog image signal of the remaining portion into digital data, and sends out the digital image data as frame image data to the first image processing circuit 90.
With reference to FIG. 20, in step S701, the first image processing circuit 90 or the CPU 20 extracts contours of an object as a coordinate input member from the frame image data received from the area sensor camera 72.
In step S702, the first image processing circuit 90 measures plural distances between points on the contours of the extracted object and points on the projected line of the surface of the display panel 73. For measuring those distances, the first image processing circuit 90 or the CPU 20 counts the pixels lying between a point on the contours of the extracted object and a point on the projected line of the surface of the display panel 73 for each measured distance. The pixel pitch of the CMOS image sensor is known, and therefore the number of pixels between two points determines the distance between the two points.
In step S703, the first image processing circuit 90 or the CPU 20 extracts the least number of pixels, which is denoted by Nmin, among the plural numbers of pixels counted in step S702, and determines whether the minimum value Nmin is smaller than a predetermined number M0. When the minimum value Nmin is smaller than the predetermined number M0, i.e., YES in step S703, the process proceeds to step S704, and when the minimum value Nmin is not smaller than the predetermined number M0, i.e., NO in step S703, the process returns to step S701.
In step S704, the first image processing circuit 90 or the CPU 20 calculates motion vectors regarding predetermined plural points on the extracted contours of the object, including the point nearest to the display panel 73, which corresponds to the minimum value Nmin. For the calculation, the first image processing circuit 90 or the CPU 20 uses the same frame image data used for extracting the contours and the next frame image data received from the area sensor camera 72.
In this example, for calculating motion vectors, the first image processing circuit 90 or the CPU 20 first obtains optical flows, i.e., velocity vectors, by calculating the rate of temporal change of a pixel's image density and the rates of spatial change of the image density of the pixels surrounding the pixel used for calculating the temporal change. The motion vectors are expressed in the coordinate system (Xcamera, Ycamera), in which the Ycamera axis corresponds to the line of the surface of the display panel 73 imaged on the CMOS area sensor and the Xcamera axis is perpendicular to the display panel 73.
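The optical-flow computation just described can be illustrated with a small gradient-based (Lucas-Kanade style) estimate: the temporal change of pixel density is combined with the spatial changes in a surrounding window and solved by least squares. This sketch is only one common way to obtain such velocity vectors; the window size and the toy frames are illustrative, not the circuit's actual implementation.

```python
import numpy as np

# Illustrative gradient-based (Lucas-Kanade style) estimate of a velocity vector
# at one point, using the spatial gradients in a small window and the temporal
# difference between two consecutive frames. The window size is an assumption.
def velocity_at(prev, curr, x, y, win=2):
    """Estimate (Vx, Vy) in image axes, in pixels per frame, at pixel (y, x)."""
    gy, gx = np.gradient(prev.astype(float))          # spatial density changes
    gt = curr.astype(float) - prev.astype(float)      # temporal density change
    sl = (slice(y - win, y + win + 1), slice(x - win, x + win + 1))
    A = np.stack([gx[sl].ravel(), gy[sl].ravel()], axis=1)
    b = -gt[sl].ravel()
    v, *_ = np.linalg.lstsq(A, b, rcond=None)         # least-squares flow in the window
    return v                                          # [Vx, Vy]

prev = np.zeros((32, 32)); prev[10:20, 10:20] = 1.0   # toy object in frame t
curr = np.zeros((32, 32)); curr[11:21, 10:20] = 1.0   # moved one pixel down in frame t+1
print(velocity_at(prev, curr, x=15, y=19))            # roughly [0, 1]
```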
In step S705, the CPU 20 stores the Xcamera direction components of the calculated motion vectors, such as Vx, in the main memory 21. The CPU 20 stores the components obtained from each frame image data in succession. The successively stored data is referred to as trace data of the motion vectors.
In step S706, the CPU 20 determines whether the extracted object has made an attempt to input coordinates on the display panel 73 based on the trace data. As a determining method, the method illustrated in FIG. 12 may be used. When the object has made an attempt to input coordinates, i.e., YES in step S707, the process proceeds to step S708, and when the object has not made an attempt to input coordinates, i.e., NO in step S707, the process branches to step S710.
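The decision itself follows the method of FIG. 12, which is not reproduced in this passage. The sketch below therefore shows only the bookkeeping of steps S705 and S706, accumulating the Vx components frame by frame, together with one plausible, purely illustrative touch-attempt test (an approach toward the panel followed by a near stop).

```python
from collections import deque

# Sketch of the Vx trace bookkeeping of steps S705-S706. The attempt test is an
# assumed heuristic (approach then stop); the actual decision in the
# specification follows the method of FIG. 12, which is not shown here.
TRACE_LEN = 8                   # assumed number of frames kept in the trace
vx_trace = deque(maxlen=TRACE_LEN)

def record_vx(vx):
    """Step S705: append the Xcamera component of the latest motion vector."""
    vx_trace.append(vx)

def looks_like_input_attempt(trace, approach=-0.5, stop=0.1):
    """Illustrative stand-in for step S706: the object moved toward the panel
    (strongly negative Vx) and then nearly stopped."""
    return len(trace) >= 2 and min(trace) < approach and abs(trace[-1]) < stop

for vx in (-1.2, -1.0, -0.8, -0.1, 0.0):
    record_vx(vx)
print(looks_like_input_attempt(vx_trace))    # True for this illustrative trace
```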
In step S708, referring to FIG. 4, the second image processing circuit 91 or the CPU 20 measures a distance h between the optical axis crossing point Q and a projected point P of a contacting point A(x, y) of the object according to the image data received from the first linear sensor camera 70. Similarly, the third image processing circuit 92 or the CPU 20 measures a distance h between the optical axis crossing point Q and a projected point P of a contacting point A(x, y) of the object according to the image data received from the second linear sensor camera 71.
In step S709, the second image processing circuit 91 or the CPU 20 solves the angle β1 by using the equations (1) and (2), with known quantities f and α, and the measured distance h. As regards image data received from the second linear sensor camera 71, the third image processing circuit 92 or the CPU 20 solves the angle β2 in a similar manner.
In step S711, referring to FIG. 3, the CPU 20 solves the coordinates x and y of the object on the display panel 73 by using the equations (3) and (4), with known quantities L, and the solved angles β1 and β2.
In step S710, the CPU 20 determines whether the object is within the predetermined region above the display panel 73 using the trace data of motion vector components Vx of the object. In other words, the CPU 20 determines whether the minimum value Nmin among plural distances is still smaller than the predetermined number M0. When the object is in the predetermined region, i.e., YES in step S710, the process returns to step S704 to obtain motion vectors again. When the object is out of the predetermined region, i.e., NO in step S710, the process returns to step S701.
FIG. 21 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation in the coordinate data input system 60S of FIG. 17 as another example configured according to the present invention.
With reference to FIG. 21, in step S801, the first image processing circuit 90 or the CPU 20 extracts contours of an object as a coordinate input member from the frame image data received from the area sensor camera 72.
In step S802, the first image processing circuit 90 or the CPU 20 first extracts features of the shape of the extracted contours of the object. For extracting features of the shape, the first image processing circuit 90 or the CPU 20 determines the position of the barycenter of the contours of the object, then measures distances from the barycenter to plural points on the extracted contours for all radial directions like the spokes of a wheel. After that, the CPU 20 characterizes the contour shape of the object based on relations between each direction and the respective distance.
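One way to realize the "spokes of a wheel" measurement described above is to bin contour points by their angle around the barycenter and keep the distance per bin; the sketch below does this with an assumed number of angular bins.

```python
import math

# Sketch of the shape feature of step S802: distances from the barycenter of the
# contour to contour points, grouped by radial direction. The number of angular
# bins is an illustrative assumption.
def radial_signature(contour, bins=36):
    """contour: list of (x, y) points. Returns one max distance per angular bin."""
    cx = sum(p[0] for p in contour) / len(contour)    # barycenter of the contour points
    cy = sum(p[1] for p in contour) / len(contour)
    sig = [0.0] * bins
    for x, y in contour:
        ang = math.atan2(y - cy, x - cx) % (2 * math.pi)
        b = int(ang / (2 * math.pi) * bins) % bins
        sig[b] = max(sig[b], math.hypot(x - cx, y - cy))
    return sig

square = [(x, 0) for x in range(10)] + [(9, y) for y in range(10)] \
       + [(x, 9) for x in range(10)] + [(0, y) for y in range(10)]
print(radial_signature(square)[:6])
```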
After that, the first image processing circuit 90 or the CPU 20 compares the characterized contour shape of the object with cataloged shapes of potential coordinate input members. The shapes of potential coordinate input members may be stored in the ROM 24 or the hard disk 27 in advance.
When an operator of the coordinate data input system 60S points to an item in a menu or an icon, draws a line, etc., using a coordinate input member, the axis of the coordinate input member may tilt in any direction at various angles. Therefore, the first image processing circuit 90 or the CPU 20 may compare the contour shape of the object, after rotating it by various angles, with the cataloged shapes.
Instead of rotating the contour shape, the shapes of potential coordinate input members may be rotated by plural angles in advance, and the rotated shapes stored in the ROM 24 or the hard disk 27. Thus, the real-time rotation of the contour shape is not needed, and consequently execution time is saved.
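The precomputed-rotation variant can be sketched as below; representing each cataloged shape by its radial signature and treating rotations as circular shifts of that signature is an illustrative simplification, and the tolerance value is an assumption.

```python
# Sketch of matching an observed signature against cataloged shapes whose
# rotated versions were computed in advance. Catalog contents and the
# tolerance are illustrative assumptions.
def rotations(signature):
    """All circular shifts of a radial signature, i.e., the shape at all angles."""
    n = len(signature)
    return [signature[i:] + signature[:i] for i in range(n)]

# Precompute the rotated versions of every cataloged shape once, in advance.
CATALOG = {"pen": rotations([3.0, 3.2, 9.5, 3.1, 3.0, 3.1])}

def matches_catalog(signature, catalog=CATALOG, tol=5.0):
    """True if the signature is close to any pre-rotated cataloged signature."""
    return any(
        sum(abs(a - b) for a, b in zip(signature, rotated)) < tol
        for rotated_list in catalog.values()
        for rotated in rotated_list
    )

observed = [3.1, 3.0, 3.1, 3.0, 3.2, 9.4]           # the same shape seen rotated
print(matches_catalog(observed))                     # True
```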
In step S803, the first image processing circuit 90 or the CPU 20 determines whether the contour shape of the object coincides with one of the cataloged shapes of potential coordinate input members. When the identified contour shape coincides with one of the cataloged shapes, i.e., YES in step S803, the process proceeds to step S804, and when the identified contour shape does not coincide with any of the cataloged shapes, i.e., NO in step S803, the process returns to step S801.
In step S804, the first image processing circuit 90 or the CPU 20 measures plural distances between points on the contours of the extracted object and points on the projected line of the surface of the display panel 73. For measuring those distances, the first image processing circuit 90 or the CPU 20 counts the pixels lying between a point on the contours of the extracted object and a point on the projected line of the surface of the display panel 73 for each measured distance.
In step S805, the first image processing circuit 90 or the CPU 20 extracts the least number of pixels, i.e., Nmin, among the plural numbers of pixels counted in step S804, and determines whether the minimum value Nmin is smaller than a predetermined number M0. When the minimum value Nmin is smaller than the predetermined number M0, i.e., YES in step S805, the process proceeds to step S806, and when the minimum value Nmin is not smaller than the predetermined number M0, i.e., NO in step S805, the process returns to step S801.
In step S806, the first image processing circuit 90 or the CPU 20 calculates motion vectors regarding predetermined plural points on the extracted contours of the object, including the point nearest to the display panel 73, by using the same frame image data used for extracting the contours and the next frame image data received from the area sensor camera 72.
In this example, for calculating motion vectors, the first image processing circuit 90 or the CPU 20 first obtains optical flows, i.e., velocity vectors by calculating a rate of temporal change of a pixel image density and a rate of spatial change of image density of pixels surrounding the pixel used for calculating the temporal change. The motion vectors are expressed with the coordinate system Xcamera, Ycamera.
In step S807, the CPU 20 stores motion vector components along the direction Xcamera of the calculated vectors, such as Vx, in the main memory 21. The CPU 20 stores those components obtained from each frame image data in succession as trace data of the motion vectors.
In step S808, the CPU 20 determines whether the extracted object has made an attempt to input coordinates on the display panel 73 based on the trace data. As a determining method, the method of FIG. 12 may be used. When the object has made an attempt to input coordinates, i.e., YES in step S809, the process branches to step S811, and when the object has not made any attempt to input coordinates, i.e., NO in step S809, the process proceeds to step S810.
In step S810, the CPU 20 determines whether the object is within a predetermined region above the display panel 73 using the trace data of motion vector components Vx of the object. When the object is in the predetermined region, i.e., YES in step S810, the process returns to step S806 to obtain motion vectors again, and when the object is out of the predetermined region, i.e., NO in step S810, the process returns to step S801.
In step S811, the second image processing circuit 91 or the CPU 20 measures a distance h between the optical axis crossing point Q and a projected point P of a contacting point A(x, y) of the object according to the image data received from the first linear sensor camera 70. Similarly, the third image processing circuit 92 or the CPU 20 measures a distance h between the optical axis crossing point Q and a projected point P of a contacting point A(x, y) of the object according to the image data received from the second linear sensor camera 71.
In step S812, the second image processing circuit 91 or the CPU 20 solves the angle β1 by using the equations (1) and (2), with known quantities f and α, and the measured distance h. As regards image data received from the second linear sensor camera 71, the third image processing circuit 92 or the CPU 20 solves the angle β2 in a similar manner.
In step S813, referring to FIG. 3, the CPU 20 solves the coordinates x and y of the object on the display panel 73 by using the equations (3) and (4), with known quantities L, and the solved angles β1 and β2.
FIG. 22 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation in the coordinate data input system 60S of FIG. 17 as another example configured according to the present invention.
In this example, the first linear sensor camera 70 and the second linear sensor camera 71 output image data only when a coordinate input member is in proximity to the display panel 73, which reduces the load on the other devices in the coordinate data input system 60S.
Referring to FIG. 22, in step S901, the first image processing circuit 90 or the CPU 20 determines whether a coordinate input member has entered a predetermined region above the display panel 73 for tracing motion vectors thereof. When a coordinate input member has entered the predetermined region, i.e., YES in step S901, the process proceeds to step S902, and when a coordinate input member has not entered yet, i.e., NO in step S901, the process stays at step S901.
In step S902, the second image processing circuit 91 sends a command to the first linear sensor camera 70 to start an imaging operation. Likewise, the third image processing circuit 92 sends a command to the second linear sensor camera 71 to start an imaging operation. Those commands are transmitted via the digital interfaces. According to the commands, the first linear sensor camera 70 starts an imaging operation and sends the captured image data to the second image processing circuit 91. The second linear sensor camera 71 also starts an imaging operation and sends the captured image data to the third image processing circuit 92.
In step S903, the second image processing circuit 91 and the third image processing circuit 92 trace the coordinate input member and input coordinates of the coordinate input member on the display panel 73, respectively.
In step S904, the first image processing circuit 90 or the CPU 20 determines whether the coordinate input member is out of the predetermined region for tracing motion vectors thereof. When the coordinate input member is out of the predetermined region, i.e., YES in step S904, the process proceeds to step S905, and when the coordinate input member is still in the predetermined region, i.e., NO in step S904, the process returns to step S903.
In step S905, the second image processing circuit 91 sends a command to the first linear sensor camera 70 to halt the imaging operation. Likewise, the third image processing circuit 92 sends a command to the second linear sensor camera 71 to halt the imaging operation. According to the commands, the first linear sensor camera 70 and the second linear sensor camera 71 halt the imaging operation, respectively.
In the above example, the predetermined region above the display panel 73 is commonly used for both starting imaging operations and tracing motion vectors. However, a predetermined region for starting imaging operations by the first linear sensor camera 70 and the second linear sensor camera 71 may be greater than a predetermined region for tracing motion vectors of a coordinate input member.
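One way to realize the variant just mentioned, with a larger region for starting and halting the linear sensor cameras than for tracing motion vectors, is sketched below; the two thresholds and the on/off bookkeeping are illustrative assumptions rather than values from the specification.

```python
# Sketch of the variant described above: the linear sensor cameras are started
# when the object enters a larger region and halted when it leaves it, while
# motion-vector tracing uses a smaller region. Thresholds are illustrative.
START_REGION = 120   # assumed height (pixels above the panel) for starting the cameras
TRACE_REGION = 60    # assumed smaller height for tracing motion vectors

class LinearCameraGate:
    def __init__(self):
        self.imaging = False

    def update(self, n_min):
        """n_min: smallest pixel count between the object and the panel line."""
        if not self.imaging and n_min <= START_REGION:
            self.imaging = True       # step S902: command the cameras to start
        elif self.imaging and n_min > START_REGION:
            self.imaging = False      # step S905: command the cameras to halt
        trace = self.imaging and n_min <= TRACE_REGION
        return self.imaging, trace

gate = LinearCameraGate()
for n in (200, 110, 50, 130, 300):
    print(gate.update(n))
```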
FIG. 23 is a flowchart illustrating operational steps for practicing a coordinate data inputting operation in the coordinate data input system 1S of FIG. 1 as another example configured according to the present invention. In this example, when a coordinate input member is within a first predetermined region above a display device, the location of the coordinate input member is input as coordinates. Thus, the coordinate input member, for example, moves a cursor, draws a line, etc. Further, when the coordinate input member is within a second predetermined region above the display device, the coordinate input member, for example, moves a cursor, inputs a gesture command, etc.
With reference to FIG. 23, in step S1001, the first image processing circuit 30 or the CPU 20 extracts contours of an object as a coordinate input member from the frame image data received from the first electronic camera 10.
In step S1002, the first image processing circuit 30 or the CPU 20 measures plural distances between points on the contours of the extracted object and points on the projected line of the surface of the display panel 12. For measuring those distances, the first image processing circuit 30 or the CPU 20 counts the pixels lying between a point on the contours of the extracted object and a point on the projected line of the surface of the display panel 12 for each measured distance. The number of pixels between two points determines the distance between the two points.
In step S1003, the first image processing circuit 30 or the CPU 20 extracts the least number of pixels, i.e., Nmin, among the plural numbers of pixels counted in step S1002. Then, the first image processing circuit 30 or the CPU 20 determines whether the minimum value Nmin is larger than a first predetermined number M1 and equal to or smaller than a second predetermined number M2.
FIG. 24 is a diagram illustrating an image captured by the first electronic camera 10 in the coordinate data input system 1S of FIG. 1. Referring to FIG. 24, the rectangular region enclosed by the line corresponding to the first predetermined number M1, the projected line of the surface of the display panel 12, and the normals thereof is denoted REGION 1. Likewise, the rectangular region enclosed by the line corresponding to the first predetermined number M1, the line corresponding to the second predetermined number M2, and the normals of the projected line of the surface of the display panel 12 is denoted REGION 2.
The REGION 1 is assigned for tracing motion vectors of the coordinate input member, and the REGION 2 is assigned for moving a cursor, inputting a gesture command, etc. For example, a pen as a coordinate input member is illustrated in the REGION 2 in FIG. 24.
Referring back to FIG. 23, in step S1003 the first image processing circuit 30 thus determines whether the coordinate input member is in the REGION 2. When the result of the determination is true, i.e., YES in step S1003, the process proceeds to step S1004, and when the result is false, i.e., NO in step S1003, the process branches to step S1008.
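The region tests of steps S1003 and S1008 can be summarized as follows, where M1 and M2 are the predetermined pixel counts bounding REGION 1 and REGION 2 in FIG. 24; the numeric values are illustrative.

```python
# Sketch of the region classification of steps S1003 and S1008, using the two
# predetermined pixel counts M1 and M2 of FIG. 24 (values are illustrative).
M1 = 30    # boundary of REGION 1 (motion-vector tracing and coordinate input)
M2 = 120   # boundary of REGION 2 (cursor movement, gesture commands)

def classify(n_min, m1=M1, m2=M2):
    """Map the smallest contour-to-panel pixel count to a region label."""
    if n_min <= m1:
        return "REGION 1"          # step S1008: trace motion vectors, input coordinates
    if n_min <= m2:
        return "REGION 2"          # step S1003: move a cursor, accept gesture commands
    return "OUTSIDE"               # above M2: ignored

for n in (10, 75, 200):
    print(n, classify(n))
```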
In step S1004, the first image processing circuit 30 measures a distance h between the optical axis crossing point Q and a projected point P of a contacting point A(x, y) of the object according to the image data received from the first electronic camera 10. Similarly, the second image processing circuit 31 measures a distance h between the optical axis crossing point Q and a projected point P of a contacting point A(x, y) of the object according to the image data received from the second electronic camera 11.
In step S1005, the first image processing circuit 30 solves angle β1 by using the equations (1) and (2), with known quantities f and α, and the measured distance h. As regards image data received from second electronic camera 11, the second image processing circuit 31 solves angle β2 in a similar manner.
In step S1006, referring to FIG. 3, the CPU 20 solves the coordinates x and y of the object on the display panel 12 by using the equations (3) and (4), with known quantities L, and the solved angles β1 and β2.
In step S1007, the CPU 20 generates display data of a cursor at a location according to the obtained coordinates x and y of the object, and sends the generated display data to the display controller 29. The CPU 20 may also send a cursor command to display a cursor at the location. Thus, the display controller 29 can display a cursor at the location where the coordinate input member exists on the display panel 12. After that, the process returns to step S1001. Thus, as long as the coordinate input member moves in the REGION 2, the displayed cursor follows the coordinate input member.
In step S1008, the first image processing circuit 30 determines whether the minimum value Nmin is equal to or smaller than the first predetermined number M1. That is to say, the first image processing circuit 30 determines whether the coordinate input member is in the REGION 1. When the result of the determination is true, i.e., YES in step S1008, the process proceeds to step S1009, and when the result is false, i.e., NO in step S1008, the process returns to step S1001.
In step S1009, the first image processing circuit 30 calculates motion vectors regarding predetermined plural points on the extracted contours of the object including the nearest point to the display panel 12 by using the identical frame image data used for extracting the contours and the next following frame image data received from the first electronic camera 10. After that, the CPU 20 determines whether the extracted object has made an attempt to input coordinates on the display panel 12 based on the trace data of the calculated motion vectors.
When the CPU 20 determines that the object has made an attempt to input coordinates, the first image processing circuit 30 measures a distance h between the optical axis crossing point Q and a projected point P of a contacting point A(x, y) of the object according to the image data received from the first electronic camera 10. Similarly, the second image processing circuit 31 measures a distance h between the optical axis crossing point Q and a projected point P of a contacting point A(x, y) of the object according to the image data received from the second electronic camera 11.
Then, the first image processing circuit 30 solves angle β1 by using the equations (1) and (2), with known quantities f and α, and the measured distance h. As regards image data received from the second electronic camera 11, the second image processing circuit 31 solves angle β2 in a similar manner.
After that, referring to FIG. 3, the CPU 20 solves the coordinates x and y of the object on the display panel 12 by using the equations (3) and (4), with known quantities L, and the solved angles β1 and β2.
In the above-described example, the CPU 20 solves the coordinates x and y of the object on the display panel 12 for every input frame image. However, the CPU 20 may instead solve the coordinates x and y once every plural frames of images.
In addition, in the above-described example, the coordinates x and y obtained on the display panel 12 in the REGION 2 are used for moving a cursor. However, the obtained coordinates x and y may also be used for other purposes, such as inputting a gesture command. For inputting a gesture command, the CPU 20 may store plural sets of coordinate data, i.e., trace data of coordinate data including time stamps thereof. Then, the CPU 20 analyzes the trace data of coordinate data and tests whether the trace data coincides with one of a plurality of defined command loci, which may be stored in the hard disk 27 in advance.
As an example, Japanese Laid-Open Patent Publication No. 5-197810 describes a matching method. The method first obtains a set of temporal and spatial combinations of motion vectors extracted from input images. The method then verifies the obtained set of temporal and spatial combinations against patterns in a command pattern dictionary provided in advance. Thus, the method identifies the input command as a specific one in the command pattern dictionary.
As an example of gesture commands, when an operator strokes a pen downward at a predetermined range of velocity in the REGION 2 above the display panel 12, the CPU 20 may recognize the stroke as a scroll command. When the CPU 20 recognizes the stroke as a scroll command, the CPU 20 scrolls the image displayed on the display panel 12 downward by a predetermined length, for example, the same length as the input stroke.
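As a purely illustrative sketch of such a gesture, the function below inspects a trace of (x, y, timestamp) samples collected in the REGION 2 and reports a downward scroll when the stroke is predominantly vertical and its speed falls within an assumed velocity range; none of the thresholds are taken from the specification.

```python
# Illustrative sketch of recognizing a downward stroke as a scroll command from
# a trace of (x, y, timestamp) samples in REGION 2. All thresholds are assumptions.
def detect_scroll(trace, v_min=100.0, v_max=1000.0):
    """trace: list of (x, y, t) with y increasing downward and t in seconds."""
    if len(trace) < 2:
        return None
    (x0, y0, t0), (x1, y1, t1) = trace[0], trace[-1]
    dx, dy, dt = x1 - x0, y1 - y0, t1 - t0
    if dt <= 0 or abs(dx) > abs(dy):
        return None                      # not a predominantly vertical stroke
    speed = dy / dt                      # pixels per second, downward positive
    if dy > 0 and v_min <= speed <= v_max:
        return ("scroll_down", dy)       # scroll by the same length as the stroke
    return None

stroke = [(400, 100, 0.0), (402, 180, 0.2), (405, 300, 0.5)]
print(detect_scroll(stroke))             # ('scroll_down', 200)
```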
Further, whether a gesture command or coordinate data is being input may be distinguished according to the shape of the coordinate input member. For example, when a human hand or finger draws a figure on the display panel 12, the coordinate data input system 1S may recognize the motion as a gesture command, and when a symmetrical object, such as a pen, draws, the system 1S may input coordinates of the symmetrical object.
FIG. 25 is an exemplary network system 200 including the coordinate data input systems of FIG. 1 and FIG. 17. Referring to FIG. 25, the network system 200 includes a public switched telephone network (PSTN) 210 and a local area network 220. Three coordinate data input systems 1SA, 1SB and 60SB are connected to the LAN 220 via the LAN interface 33 of FIG. 2 and FIG. 18. A server 230 is also connected to the LAN 220. A coordinate data input system 1SC is connected to the PSTN 210 via the LAN interface 33 and a PSTN adaptor. The coordinate data input systems 1SA, 1SB and 1SC are substantially the same as the coordinate data input system of FIG. 1, and the coordinate data input system 60SB is substantially the same as the coordinate data input system of FIG. 17.
In the network system 200, each of the coordinate data input systems 1SA, 1SB, 1SC and 60SB transmits detected coordinate data of a coordinate input member and related information, such as a gesture command, together with accompanying control signals, to the other coordinate data input systems via the PSTN 210 and the LAN 220 according to a transmission control protocol.
Further, each of the coordinate data input systems 1SA, 1SB, 1SC and 60SB displays images on the display panel 12 of FIG. 1 or 73 of FIG. 17 according to the detected coordinate data and the related information sent from the other coordinate data input systems via the PSTN 210 and the LAN 220, in addition to the coordinate data detected by the system itself.
Therefore, all the coordinate data input systems 1SA, 1SB, 1SC and 60SB can share identical information and display an identical image on the display panel 12 or 73. In other words, people in different places can input information including coordinate data to a coordinate data input system implemented in each of the different places, and watch substantially the same image on each display panel.
The server 230 stores programs to be executed by the CPU 20 of FIG. 2 and FIG. 18, the first image processing circuit 30 of FIG. 2, the second image processing circuit 31 of FIG. 2, the first image processing circuit 90 of FIG. 18, the second image processing circuit 91 of FIG. 18, the third image processing circuit 92 of FIG. 18, etc.
When a manufacturer of the coordinate data input systems revises a program of the systems, the manufacturer stores the revised program in the server 230 and informs users of the systems of the new program revision. Then, the users of the coordinate data input systems can download the revised program into the hard disk 27 of FIG. 2 and FIG. 18, and thus the programs for the CPU and the image processing circuits of each system are updated. When the updating operations for the coordinate data input systems connected to the PSTN 210 and the LAN 220 are completed, all users of the systems can share, for example, the latest functions of the system. As used herein, “computer readable medium” means a non-transitory hardware implementation, such as the afore-mentioned hard disk 27, a CD-ROM, the ROM 24, the main memory 21, etc.
As described above, the novel method and apparatus according to the present invention can input information including coordinate data without using a light scanning device even when the surface of a display screen is contorted to a certain extent.
Further, the novel method and apparatus according to the present invention can input information including coordinate data using a plurality of coordinate input members, such as a pen, a human finger, a stick, etc.
Furthermore, the novel method and apparatus according to the present invention can input information including coordinate data with a plurality of background devices, such as a chalkboard, a whiteboard, etc., in addition to a display device, such as a plasma display panel or a rear projection display.
Numerous modifications and variations of the present invention are possible in light of the above teachings. For example, features described for certain embodiments may be combined with other embodiments described herein. It is therefore to be understood that within the scope of the appended claims, the invention may be practiced otherwise than as specifically described herein.

Claims (12)

1. A method for inputting information including coordinate data, comprising:
providing at least two cameras at respective corners of a display;
extracting, based on outputs from the at least two cameras, a predetermined object from an image including the predetermined object above a plane of the display and a plane of the display;
determining whether the predetermined object is within a predetermined distance from the plane of the display;
detecting, based on outputs from the at least two cameras, a position of the predetermined object while the predetermined object is determined to be within a predetermined distance from the plane;
calculating angles of views of each of the at least two cameras to the detected position; and
calculating coordinates of the predetermined object on the display panel utilizing the calculated angles.
2. A method for inputting information including coordinate data according to claim 1, wherein the at least two cameras are in opposite corners of the display.
3. A device for inputting information including coordinate data, comprising:
at least two cameras at respective corners of a display;
an object extracting device configured to extract a predetermined object from an image including the predetermined object above a plane of the display and a plane of the display, and to determine whether the predetermined object is within a predetermined distance from the plane of the display;
a detector device configured to detect a position of the predetermined object while the predetermined object is within a predetermined distance from the plane; and
a controller configured to calculate angles of views of each of the at least two cameras to the detected position and to calculate coordinates of the predetermined object on the display panel utilizing the calculated angles.
4. A device for inputting information including coordinate data according to claim 3, wherein the at least two cameras are in opposite corners of the display.
5. A device for inputting information including coordinate data, comprising:
at least two imaging means at respective corners of a display;
means for extracting, based on outputs from the at least two imaging means, a predetermined object from an image including the predetermined object above a plane of the display and a plane of the display, and for determining whether the predetermined object is within a predetermined distance from the plane of the display;
means for detecting, based on outputs from the at least two imaging means, a position of the predetermined object while the predetermined object is within a predetermined distance from the plane;
means for calculating angles of view of each of the at least two imaging means and for calculating coordinates of the predetermined object on the display panel utilizing the calculated angles.
6. A device for inputting information including coordinate data according to claim 5, wherein the at least two imaging means are in opposite corners of the display.
7. Apparatus usable with at least one processing structure for inputting information including coordinate data, comprising:
a display device having at least two cameras at respective corners thereof; and
at least one non-transitory computer readable medium having program code configured to cause the at least one processing structure to:
(i) extract, based on outputs from the at least two cameras, a predetermined object from an image including the predetermined object above a plane of the display device and a plane of the display device;
(ii) determine whether the predetermined object is within a predetermined distance from the plane of the display device;
(iii) detect, based on outputs from the at least two cameras, a position of the predetermined object while the predetermined object is determined to be within a predetermined distance from the plane;
(iv) calculate angles of views of each of the at least two cameras to the detected position; and
(v) calculate coordinates of the predetermined object on the display device utilizing the calculated angles.
8. Apparatus usable with at least one processing structure for inputting information including coordinate data according to claim 7, wherein the at least two cameras are disposed at opposite corners of the display device.
9. Apparatus usable with at least one processing structure for inputting information including coordinate data, comprising:
a display having at least two cameras at respective corners thereof; and
at least one non-transitory computer readable medium configured to cause the at least one processing structure to:
(i) extract a predetermined object from an image including the predetermined object above a plane of the display and a plane of the display, and to determine whether the predetermined object is within a predetermined distance from the plane of the display;
(ii) detect a position of the predetermined object while the predetermined object is within a predetermined distance from the plane; and
(iii) calculate angles of views of each of the at least two cameras to the detected position and to calculate coordinates of the predetermined object on the display panel utilizing the calculated angles.
10. Apparatus usable with at least one processing structure for inputting information including coordinate data according to claim 9, wherein the at least two cameras are disposed at opposite corners of the display.
11. Apparatus usable with at least one processing structure for inputting information including coordinate data, comprising:
a display panel having at least two imaging devices at respective corners thereof; and
at least one non-transitory computer readable medium configured to cause the at least one processing structure to:
(i) extract, based on outputs from the at least two imaging devices, a predetermined object from an image including the predetermined object above a plane of the display panel and a plane of the display panel, and for determining whether the predetermined object is within a predetermined distance from the plane of the display panel;
(ii) detect, based on outputs from the at least two imaging devices, a position of the predetermined object while the predetermined object is within a predetermined distance from the plane; and
(iii) calculate angles of view of each of the at least two imaging devices, and calculate coordinates of the predetermined object on the display panel utilizing the calculated angles.
12. Apparatus usable with at least one processing structure for inputting information including coordinate data according to claim 11, wherein the at least two imaging devices are disposed at opposite corners of the display panel.
US12/722,345 1999-10-29 2010-03-11 Method and apparatus for inputting information including coordinate data Expired - Lifetime USRE43084E1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US12/722,345 USRE43084E1 (en) 1999-10-29 2010-03-11 Method and apparatus for inputting information including coordinate data
US13/345,044 US20120327031A1 (en) 1999-10-29 2012-01-06 Method and apparatus for inputting information including coordinate data

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
JP11-309412 1999-10-29
JP30941299A JP4052498B2 (en) 1999-10-29 1999-10-29 Coordinate input apparatus and method
US09/698,031 US6674424B1 (en) 1999-10-29 2000-10-30 Method and apparatus for inputting information including coordinate data
US10/717,456 US7342574B1 (en) 1999-10-29 2003-11-21 Method and apparatus for inputting information including coordinate data
US12/722,345 USRE43084E1 (en) 1999-10-29 2010-03-11 Method and apparatus for inputting information including coordinate data

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
US10/717,456 Continuation US7342574B1 (en) 1999-10-29 2003-11-21 Method and apparatus for inputting information including coordinate data
US10/717,456 Reissue US7342574B1 (en) 1999-10-29 2003-11-21 Method and apparatus for inputting information including coordinate data

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US13/345,044 Continuation US20120327031A1 (en) 1999-10-29 2012-01-06 Method and apparatus for inputting information including coordinate data

Publications (1)

Publication Number Publication Date
USRE43084E1 true USRE43084E1 (en) 2012-01-10

Family

ID=17992706

Family Applications (4)

Application Number Title Priority Date Filing Date
US09/698,031 Expired - Lifetime US6674424B1 (en) 1999-10-29 2000-10-30 Method and apparatus for inputting information including coordinate data
US10/717,456 Ceased US7342574B1 (en) 1999-10-29 2003-11-21 Method and apparatus for inputting information including coordinate data
US12/722,345 Expired - Lifetime USRE43084E1 (en) 1999-10-29 2010-03-11 Method and apparatus for inputting information including coordinate data
US13/345,044 Abandoned US20120327031A1 (en) 1999-10-29 2012-01-06 Method and apparatus for inputting information including coordinate data

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US09/698,031 Expired - Lifetime US6674424B1 (en) 1999-10-29 2000-10-30 Method and apparatus for inputting information including coordinate data
US10/717,456 Ceased US7342574B1 (en) 1999-10-29 2003-11-21 Method and apparatus for inputting information including coordinate data

Family Applications After (1)

Application Number Title Priority Date Filing Date
US13/345,044 Abandoned US20120327031A1 (en) 1999-10-29 2012-01-06 Method and apparatus for inputting information including coordinate data

Country Status (2)

Country Link
US (4) US6674424B1 (en)
JP (1) JP4052498B2 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090020342A1 (en) * 2007-07-18 2009-01-22 Smart Technologies Inc. Touch Panel And Interactive Input System Incorporating The Same
US20110266074A1 (en) * 2010-04-29 2011-11-03 Au Optronics Corporation Camera based touch system
US20110316814A1 (en) * 2010-06-28 2011-12-29 Ming-Tsan Kao Optical distance determination device, optical touch monitoring system and method for measuring distance of a touch point on an optical touch panel
US20120212454A1 (en) * 2011-02-18 2012-08-23 Seiko Epson Corporation Optical position detecting device and display system provided with input function
US20120287083A1 (en) * 2011-05-12 2012-11-15 Yu-Yen Chen Optical touch control device and optical touch control system
US20130179811A1 (en) * 2012-01-05 2013-07-11 Visteon Global Technologies, Inc. Projection dynamic icon knobs

Families Citing this family (94)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6424335B1 (en) * 1998-09-02 2002-07-23 Fujitsu Limited Notebook computer with detachable infrared multi-mode input device
JP3905670B2 (en) * 1999-09-10 2007-04-18 株式会社リコー Coordinate input detection apparatus, information storage medium, and coordinate input detection method
JP4052498B2 (en) 1999-10-29 2008-02-27 株式会社リコー Coordinate input apparatus and method
JP2001184161A (en) * 1999-12-27 2001-07-06 Ricoh Co Ltd Method and device for inputting information, writing input device, method for managing written data, method for controlling display, portable electronic writing device, and recording medium
DE60140909D1 (en) * 2000-07-05 2010-02-04 Smart Technologies Ulc Method for a camera-based touch system
US6803906B1 (en) 2000-07-05 2004-10-12 Smart Technologies, Inc. Passive touch system and method of detecting user input
US7095401B2 (en) * 2000-11-02 2006-08-22 Siemens Corporate Research, Inc. System and method for gesture interface
JP4383864B2 (en) * 2001-09-05 2009-12-16 徹 大田 Device with character input function
US7015401B2 (en) * 2001-11-23 2006-03-21 Aiptek International, Inc. Image processing system with handwriting input function and the method for forming the same
US20040001144A1 (en) * 2002-06-27 2004-01-01 Mccharles Randy Synchronization of camera images in camera-based touch system to enhance position determination of fast moving objects
US6954197B2 (en) * 2002-11-15 2005-10-11 Smart Technologies Inc. Size/scale and orientation determination of a pointer in a camera-based touch system
JP4746653B2 (en) * 2003-02-03 2011-08-10 辰巳電子工業株式会社 Automatic photo creation device
US8508508B2 (en) 2003-02-14 2013-08-13 Next Holdings Limited Touch screen signal processing with single-point calibration
US8456447B2 (en) 2003-02-14 2013-06-04 Next Holdings Limited Touch screen signal processing
US7629967B2 (en) * 2003-02-14 2009-12-08 Next Holdings Limited Touch screen signal processing
US7532206B2 (en) 2003-03-11 2009-05-12 Smart Technologies Ulc System and method for differentiating between pointers used to contact touch surface
US6947032B2 (en) * 2003-03-11 2005-09-20 Smart Technologies Inc. Touch system and method for determining pointer contacts on a touch surface
US7256772B2 (en) 2003-04-08 2007-08-14 Smart Technologies, Inc. Auto-aligning touch system and method
US7411575B2 (en) 2003-09-16 2008-08-12 Smart Technologies Ulc Gesture recognition method and touch system incorporating the same
US7274356B2 (en) 2003-10-09 2007-09-25 Smart Technologies Inc. Apparatus for determining the location of a pointer within a region of interest
JP2005123858A (en) * 2003-10-16 2005-05-12 Mega Chips Corp Camera control device
US7355593B2 (en) * 2004-01-02 2008-04-08 Smart Technologies, Inc. Pointer tracking across multiple overlapping coordinate input sub-regions defining a generally contiguous input region
US7232986B2 (en) * 2004-02-17 2007-06-19 Smart Technologies Inc. Apparatus for detecting a pointer within a region of interest
US7460110B2 (en) 2004-04-29 2008-12-02 Smart Technologies Ulc Dual mode touch system
US7492357B2 (en) * 2004-05-05 2009-02-17 Smart Technologies Ulc Apparatus and method for detecting a pointer relative to a touch surface
US7538759B2 (en) 2004-05-07 2009-05-26 Next Holdings Limited Touch panel display system with illumination and detection provided from a single edge
US8120596B2 (en) 2004-05-21 2012-02-21 Smart Technologies Ulc Tiled touch system
US7593593B2 (en) 2004-06-16 2009-09-22 Microsoft Corporation Method and system for reducing effects of undesired signals in an infrared imaging system
US7724242B2 (en) 2004-08-06 2010-05-25 Touchtable, Inc. Touch driven method and apparatus to integrate and display multiple image layers forming alternate depictions of same subject matter
US7728821B2 (en) 2004-08-06 2010-06-01 Touchtable, Inc. Touch detecting interactive display
US7719523B2 (en) 2004-08-06 2010-05-18 Touchtable, Inc. Bounding box gesture recognition on a touch detecting interactive display
WO2006090386A2 (en) * 2005-02-24 2006-08-31 Vkb Inc. A virtual keyboard device
KR100677457B1 (en) * 2005-05-24 2007-02-02 엘지전자 주식회사 Menu input apparatus and method for a terminal using a camera
US7911444B2 (en) 2005-08-31 2011-03-22 Microsoft Corporation Input method for surface of interactive display
US7630002B2 (en) * 2007-01-05 2009-12-08 Microsoft Corporation Specular reflection reduction using multiple cameras
US20070165007A1 (en) * 2006-01-13 2007-07-19 Gerald Morrison Interactive input system
US20070205994A1 (en) * 2006-03-02 2007-09-06 Taco Van Ieperen Touch system and method for interacting with the same
US7948450B2 (en) * 2006-11-09 2011-05-24 D3 Led, Llc Apparatus and method for allowing display modules to communicate information about themselves to other display modules in the same display panel
KR20080044017A (en) * 2006-11-15 2008-05-20 삼성전자주식회사 Touch screen
JP4838694B2 (en) * 2006-11-28 2011-12-14 富士フイルム株式会社 Electronic handwriting input device
US9442607B2 (en) 2006-12-04 2016-09-13 Smart Technologies Inc. Interactive input system and method
US8212857B2 (en) * 2007-01-26 2012-07-03 Microsoft Corporation Alternating light sources to reduce specular reflection
WO2008128096A2 (en) * 2007-04-11 2008-10-23 Next Holdings, Inc. Touch screen system with hover and click input methods
US8094137B2 (en) 2007-07-23 2012-01-10 Smart Technologies Ulc System and method of detecting contact on a display
US8432377B2 (en) * 2007-08-30 2013-04-30 Next Holdings Limited Optical touchscreen with improved illumination
US8384693B2 (en) 2007-08-30 2013-02-26 Next Holdings Limited Low profile touch panel systems
US20090213093A1 (en) * 2008-01-07 2009-08-27 Next Holdings Limited Optical position sensor using retroreflection
US20090207144A1 (en) * 2008-01-07 2009-08-20 Next Holdings Limited Position Sensing System With Edge Positioning Enhancement
US8405636B2 (en) * 2008-01-07 2013-03-26 Next Holdings Limited Optical position sensing system and optical position sensor assembly
US8902193B2 (en) * 2008-05-09 2014-12-02 Smart Technologies Ulc Interactive input system and bezel therefor
US20090278794A1 (en) * 2008-05-09 2009-11-12 Smart Technologies Ulc Interactive Input System With Controlled Lighting
US20090277697A1 (en) * 2008-05-09 2009-11-12 Smart Technologies Ulc Interactive Input System And Pen Tool Therefor
TW201007254A (en) * 2008-08-04 2010-02-16 Pixart Imaging Inc Image-sensing module and image-sensing system
US20100079385A1 (en) * 2008-09-29 2010-04-01 Smart Technologies Ulc Method for calibrating an interactive input system and interactive input system executing the calibration method
EP2353069B1 (en) * 2008-10-02 2013-07-03 Next Holdings Limited Stereo optical sensors for resolving multi-touch in a touch detection system
US8405647B2 (en) * 2008-10-16 2013-03-26 Nec Display Solutions, Ltd. Image information detecting device
US8339378B2 (en) * 2008-11-05 2012-12-25 Smart Technologies Ulc Interactive input system with multi-angle reflector
US20100229090A1 (en) * 2009-03-05 2010-09-09 Next Holdings Limited Systems and Methods for Interacting With Touch Displays Using Single-Touch and Multi-Touch Gestures
TWI452488B (en) * 2009-05-18 2014-09-11 Pixart Imaging Inc Controlling method applied to a sensing system
TWI399676B (en) 2009-06-30 2013-06-21 Pixart Imaging Inc Object detection calibration system of an optical touch screen and method thereof
US8368668B2 (en) * 2009-06-30 2013-02-05 Pixart Imaging Inc. Displacement detection system of an optical touch panel and method thereof
US8860693B2 (en) * 2009-07-08 2014-10-14 Apple Inc. Image processing for camera based motion tracking
US8692768B2 (en) 2009-07-10 2014-04-08 Smart Technologies Ulc Interactive input system
TWI394072B (en) * 2009-08-21 2013-04-21 Largan Precision Co Ltd Apparatus for detecting a touching position on a flat display and a method thereof
US20110095977A1 (en) * 2009-10-23 2011-04-28 Smart Technologies Ulc Interactive input system incorporating multi-angle reflecting structure
CN102713794A (en) * 2009-11-24 2012-10-03 奈克斯特控股公司 Methods and apparatus for gesture recognition mode control
US20110199387A1 (en) * 2009-11-24 2011-08-18 John David Newton Activating Features on an Imaging Device Based on Manipulations
WO2011069148A1 (en) * 2009-12-04 2011-06-09 Next Holdings Limited Methods and systems for position detection using an interactive volume
US8427443B2 (en) 2009-12-30 2013-04-23 Hong Kong Applied Science And Technology Research Institute Co. Ltd. Coordinate locating method, coordinate locating device, and display apparatus comprising the coordinate locating device
US20110234542A1 (en) * 2010-03-26 2011-09-29 Paul Marson Methods and Systems Utilizing Multiple Wavelengths for Position Detection
TWI433004B (en) * 2010-05-14 2014-04-01 Alcor Micro Corp Method for determining touch points on touch panel and system thereof
TW201201079A (en) * 2010-06-23 2012-01-01 Pixart Imaging Inc Optical touch monitor
TWI421753B (en) * 2010-08-12 2014-01-01 Lite On Semiconductor Corp Calibration method, detection device and optical touch panel for optical touch panel
TWI441060B (en) 2011-04-14 2014-06-11 Pixart Imaging Inc Image processing method for optical touch system
US8928589B2 (en) * 2011-04-20 2015-01-06 Qualcomm Incorporated Virtual keyboards and methods of providing the same
US8937588B2 (en) * 2011-06-15 2015-01-20 Smart Technologies Ulc Interactive input system and method of operating the same
CN103019391A (en) * 2011-09-22 2013-04-03 纬创资通股份有限公司 Input device and method using captured keyboard image as instruction input foundation
TW201342158A (en) * 2012-04-03 2013-10-16 Wistron Corp Optical touch sensing apparatus
TW201342137A (en) * 2012-04-10 2013-10-16 Pixart Imaging Inc Optical operation system
TWI470511B (en) * 2012-06-06 2015-01-21 Wistron Corp Dual-mode input apparatus
US9507462B2 (en) * 2012-06-13 2016-11-29 Hong Kong Applied Science and Technology Research Institute Company Limited Multi-dimensional image detection apparatus
KR101346374B1 (en) 2012-06-25 2013-12-31 주식회사 아하정보통신 Apparatus for detecting coordinates of electronic black board
TWI479393B (en) * 2012-11-21 2015-04-01 Wistron Corp Switching methods, optical touch devices using the same, and computer products thereof
TWI482068B (en) * 2012-11-21 2015-04-21 Wistron Corp Optical touch devices and operation method thereof
US9645734B2 (en) * 2012-11-21 2017-05-09 Wistron Corp. Optical touch devices and operation method thereof
JP6102330B2 (en) * 2013-02-22 2017-03-29 船井電機株式会社 projector
TWI498792B (en) * 2013-08-06 2015-09-01 Wistron Corp Optical touch system and touch and display system
US9875019B2 (en) * 2013-12-26 2018-01-23 Visteon Global Technologies, Inc. Indicating a transition from gesture based inputs to touch surfaces
JP6711817B2 (en) 2015-08-20 2020-06-17 キヤノン株式会社 Information processing apparatus, control method thereof, program, and storage medium
KR20230011485A (en) 2015-09-28 2023-01-20 애플 인크. Electronic device display with extended active area
WO2019094003A1 (en) * 2017-11-08 2019-05-16 Hewlett-Packard Development Company, L.P. Determining locations of electro-optical pens
CN109840038A (en) * 2018-06-12 2019-06-04 柯梦天 Utilize the electronic whiteboard of 3D image positioning touch technology
US11307045B2 (en) * 2019-12-19 2022-04-19 Lenovo (Singapore) Pte. Ltd. Method and system to determine navigation actions based on instructions from a directional dialogue
CN113934323B (en) * 2021-10-19 2023-12-29 河北师达教育科技有限公司 Multi-point display method and device based on intelligent blackboard and terminal equipment

Citations (249)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4107522A (en) 1975-11-11 1978-08-15 Erwin Sick Gesellschaft Mit Beschrankter Haftung Optik-Elektronik Rotary beam light curtain
US4144449A (en) 1977-07-08 1979-03-13 Sperry Rand Corporation Position detection apparatus
US4247767A (en) 1978-04-05 1981-01-27 Her Majesty The Queen In Right Of Canada, As Represented By The Minister Of National Defence Touch sensitive computer input device
JPS57211637A (en) 1981-06-23 1982-12-25 Kokusai Electric Co Ltd Optical coordinate input device
US4507557A (en) 1983-04-01 1985-03-26 Siemens Corporate Research & Support, Inc. Non-contact X,Y digitizer using two dynamic ram imagers
US4558313A (en) 1981-12-31 1985-12-10 International Business Machines Corporation Indicator to data processing interface
JPS61196317A (en) 1985-02-27 1986-08-30 Nippon Telegr & Teleph Corp <Ntt> Information input system
JPS61260322A (en) 1985-05-10 1986-11-18 ザ・レイトラム・コ−ポレ−シヨン Positioning system
JPS61196317U (en) 1985-05-29 1986-12-08
US4672364A (en) 1984-06-18 1987-06-09 Carroll Touch Inc Touch input device having power profiling
US4737631A (en) 1985-05-17 1988-04-12 Alps Electric Co., Ltd. Filter of photoelectric touch panel with integral spherical protrusion lens
US4742221A (en) 1985-05-17 1988-05-03 Alps Electric Co., Ltd. Optical coordinate position input device
US4746770A (en) 1987-02-17 1988-05-24 Sensor Frame Incorporated Method and apparatus for isolating and manipulating graphic objects on computer video monitor
US4762990A (en) 1985-10-21 1988-08-09 International Business Machines Corporation Data processing input interface determining position of object
US4782328A (en) 1986-10-02 1988-11-01 Product Development Services, Incorporated Ambient-light-responsive touch screen data input method and system
GB2204126A (en) 1987-04-28 1988-11-02 Wells Gardner Electronics Optical position determining apparatus
US4818826A (en) 1986-09-19 1989-04-04 Alps Electric Co., Ltd. Coordinate input apparatus including a detection circuit to determine proper stylus position
US4822145A (en) 1986-05-14 1989-04-18 Massachusetts Institute Of Technology Method and apparatus utilizing waveguide and polarized light for display of dynamic images
US4831455A (en) 1986-02-21 1989-05-16 Canon Kabushiki Kaisha Picture reading apparatus
US4868912A (en) 1986-11-26 1989-09-19 Digital Electronics Infrared touch panel
EP0347725A2 (en) 1988-06-22 1989-12-27 Wacom Company, Ltd. Electronic blackboard and accessories such as writing tools
US4980547A (en) 1985-05-24 1990-12-25 Wells-Gardner Electronics Corp. Light distribution and detection apparatus
JPH0354618A (en) 1989-07-22 1991-03-08 Fujitsu Ltd Optical position indicator
US5025314A (en) 1990-07-30 1991-06-18 Xerox Corporation Apparatus allowing remote interactive use of a plurality of writing surfaces
JPH0354618B2 (en) 1983-12-29 1991-08-20
US5097516A (en) 1991-02-28 1992-03-17 At&T Bell Laboratories Technique for illuminating a surface with a gradient intensity line of light to achieve enhanced two-dimensional imaging
US5109435A (en) 1988-08-08 1992-04-28 Hughes Aircraft Company Segmentation method for use against moving objects
US5130794A (en) 1990-03-29 1992-07-14 Ritchey Kurtis J Panoramic display system
US5140647A (en) 1989-12-18 1992-08-18 Hitachi, Ltd. Image joining method and system
US5162618A (en) 1990-11-16 1992-11-10 Exzec, Inc. Acoustic touch position sensor with first order lamb wave reflective arrays
US5168531A (en) 1991-06-27 1992-12-01 Digital Equipment Corporation Real-time recognition of pointing information from video
JPH04350715A (en) 1991-05-28 1992-12-04 Matsushita Electric Ind Co Ltd Input device
JPH04355815A (en) 1991-06-03 1992-12-09 Pfu Ltd Touch screen
US5196835A (en) 1988-09-30 1993-03-23 International Business Machines Corporation Laser touch panel reflective surface aberration cancelling
JPH05181605A (en) 1991-12-27 1993-07-23 Seiko Instr Inc Projection display device with coordinate reading function and its display screen and display
JPH05189137A (en) 1992-01-16 1993-07-30 Sumitomo Heavy Ind Ltd Command input device for computer
JPH05197810A (en) 1992-01-20 1993-08-06 Nippon Telegr & Teleph Corp <Ntt> Command input processing method by image
US5239373A (en) 1990-12-26 1993-08-24 Xerox Corporation Video computational shared drawing space
US5317140A (en) * 1992-11-24 1994-05-31 Dunthorn David I Diffusion-assisted position location particularly for visual pen detection
US5359155A (en) 1993-03-25 1994-10-25 Tiger Scientific Corp. Illumination apparatus for a digitizer tablet
US5374971A (en) 1993-03-12 1994-12-20 Picturetel Corporation Two-view video camera stand and support method
JPH07110733A (en) 1993-10-13 1995-04-25 The Nippon Signal Co., Ltd. Input device
US5414413A (en) 1988-06-14 1995-05-09 Sony Corporation Touch panel apparatus
EP0657841A1 (en) 1993-12-07 1995-06-14 AT&T Corp. Sensing stylus position using single 1-D image sensor
JPH07230352A (en) 1993-09-16 1995-08-29 Hitachi Ltd Touch position detecting device and touch instruction processor
US5448263A (en) 1991-10-21 1995-09-05 Smart Technologies Inc. Interactive display system
JPH07110733B2 (en) 1988-05-10 1995-11-29 三菱重工業株式会社 Brake device deterioration determination device
US5483261A (en) 1992-02-14 1996-01-09 Itu Research, Inc. Graphical input controller and method with rear screen image detection
US5483603A (en) 1992-10-22 1996-01-09 Advanced Interconnection Technology System and method for automatic optical inspection
US5490655A (en) 1993-09-16 1996-02-13 Monger Mounts, Inc. Video/data projector and monitor ceiling/wall mount
JPH0816931B2 (en) 1987-02-06 1996-02-21 富士通株式会社 Contour extraction method
US5502568A (en) 1993-03-23 1996-03-26 Wacom Co., Ltd. Optical position detecting unit, optical coordinate input unit and optical position detecting method employing a pattern having a sequence of 1's and 0's
JPH08108689A (en) 1994-05-31 1996-04-30 Nippon Typewriter Co Ltd Electronic blackboard
US5525764A (en) 1994-06-09 1996-06-11 Junkins; John L. Laser scanning graphic input system
US5528263A (en) 1994-06-15 1996-06-18 Daniel M. Platzker Interactive projected video image display system
US5528290A (en) 1994-09-09 1996-06-18 Xerox Corporation Device for transcribing images on a board using a camera based board scanner
US5537107A (en) 1991-01-29 1996-07-16 Sony Corporation Remote control unit for video apparatus
US5554828A (en) 1995-01-03 1996-09-10 Texas Instruments Inc. Integration of pen-based capability into a field emission device system
JPH08240407A (en) 1995-03-02 1996-09-17 Matsushita Electric Ind Co Ltd Position detecting input device
JPH08315152A (en) 1995-05-22 1996-11-29 Sony Corp Image recognition device
US5581637A (en) 1994-12-09 1996-12-03 Xerox Corporation System for registering component image tiles in a camera-based scanner device transcribing scene images
US5581276A (en) 1992-09-08 1996-12-03 Kabushiki Kaisha Toshiba 3D human interface apparatus using motion recognition based on dynamic image processing
US5594469A (en) 1995-02-21 1997-01-14 Mitsubishi Electric Information Technology Center America Inc. Hand gesture machine control system
US5594502A (en) 1993-01-20 1997-01-14 Elmo Company, Limited Image reproduction apparatus
EP0762319A2 (en) 1995-08-24 1997-03-12 Symbios Logic Inc. Graphical input apparatus and method
US5617312A (en) * 1993-11-19 1997-04-01 Hitachi, Ltd. Computer system that enters control information by means of video camera
JPH0991094A (en) 1995-09-21 1997-04-04 Sekisui Chem Co Ltd Coordinate detector for touch panel
US5638092A (en) 1994-12-20 1997-06-10 Eng; Tommy K. Cursor control system
JPH09224111A (en) 1996-02-16 1997-08-26 Hitachi Denshi Ltd Electronic blackboard
US5670755A (en) 1994-04-21 1997-09-23 Samsung Display Devices Co., Ltd. Information input apparatus having functions of both touch panel and digitizer, and driving method thereof
US5686942A (en) 1994-12-01 1997-11-11 National Semiconductor Corporation Remote computer input system which detects point source on operator
JPH09319501A (en) 1996-05-29 1997-12-12 Fujitsu Ltd Coordinate detector
WO1998007112A2 (en) 1996-08-13 1998-02-19 Lsi Logic Corporation Data input apparatus and method
US5729704A (en) 1993-07-21 1998-03-17 Xerox Corporation User-directed method for operating on an object-based model data structure through a second contextual image
EP0829798A2 (en) 1996-09-12 1998-03-18 Digital Equipment Corporation Image-based touchscreen
US5734375A (en) 1995-06-07 1998-03-31 Compaq Computer Corporation Keyboard-compatible optical determination of object's position
US5737740A (en) 1994-06-27 1998-04-07 Numonics Apparatus and method for processing electronic documents
US5736686A (en) 1995-03-01 1998-04-07 Gtco Corporation Illumination apparatus for a digitizer tablet with improved light panel
JPH10105324A (en) 1996-09-09 1998-04-24 Motorola Inc Intuitive gesture system graphical user interface
US5764223A (en) 1995-06-07 1998-06-09 International Business Machines Corporation Touch-screen input device using the monitor as a light source operating at an intermediate frequency
US5771039A (en) 1994-06-06 1998-06-23 Ditzik; Richard J. Direct view display device integration techniques
US5790910A (en) 1997-08-04 1998-08-04 Peerless Industries, Inc. Camera mounting apparatus
US5801704A (en) 1994-08-22 1998-09-01 Hitachi, Ltd. Three-dimensional input device with displayed legend and shape-changing cursor
US5819201A (en) 1996-09-13 1998-10-06 Magellan Dis, Inc. Navigation system with vehicle service information
US5818424A (en) 1995-10-19 1998-10-06 International Business Machines Corporation Rod shaped device and data acquisition apparatus for determining the position and orientation of an object in space
US5818421A (en) 1994-12-21 1998-10-06 Hitachi, Ltd. Input interface apparatus for large screen display
US5825352A (en) 1996-01-04 1998-10-20 Logitech, Inc. Multiple fingers contact sensing method for emulating mouse buttons and mouse operations on a touch sensor pad
US5831602A (en) 1996-01-11 1998-11-03 Canon Kabushiki Kaisha Information processing apparatus, method and computer program product
DE19810452A1 (en) 1997-06-13 1998-12-17 Wacom Co Ltd Optical coordinate digitiser
WO1999008897A1 (en) 1997-08-18 1999-02-25 The Texas A & M University System Centralised control system in a police vehicle
JPH1151644A (en) 1997-08-05 1999-02-26 Honda Motor Co Ltd Distance measuring instrument for vehicle
JPH1164026A (en) 1997-08-12 1999-03-05 Fujitsu Ten Ltd Navigation system
JPH1185376A (en) 1997-09-02 1999-03-30 Fujitsu Ltd Information display device with optical position detecting device
JPH11110116A (en) 1997-08-07 1999-04-23 Fujitsu Ltd Optical position detection device
WO1999021122A1 (en) 1997-10-22 1999-04-29 Ascent Technology, Inc. Voice-output reading system with gesture-based navigation
US5911004A (en) 1995-05-08 1999-06-08 Ricoh Company, Ltd. Image processing apparatus for discriminating image characteristics using image signal information obtained in an image scanning operation
WO1999028812A1 (en) 1997-12-04 1999-06-10 Northern Telecom Limited Intelligent touch display
US5914709A (en) 1997-03-14 1999-06-22 Poa Sana, Llc User input device for a computer system
US5920342A (en) 1994-09-16 1999-07-06 Kabushiki Kaisha Toshiba Image input apparatus for capturing images of multiple resolutions
WO1999040562A1 (en) 1998-02-09 1999-08-12 Joseph Lev Video camera computer touch screen system
US5943783A (en) 1992-09-04 1999-08-31 Balco, Incorporated Method and apparatus for determining the alignment of motor vehicle wheels
US5963199A (en) 1996-02-09 1999-10-05 Kabushiki Kaisha Sega Enterprises Image processing systems and data input devices therefor
US5982352A (en) 1992-09-18 1999-11-09 Pryor; Timothy R. Method for providing human input to a computer
US5988645A (en) 1994-04-08 1999-11-23 Downing; Dennis L. Moving object monitoring system
US6002808A (en) 1996-07-26 1999-12-14 Mitsubishi Electric Information Technology Center America, Inc. Hand gesture control system
US6008798A (en) 1995-06-07 1999-12-28 Compaq Computer Corporation Method of determining an object's position and associated apparatus
US6031531A (en) 1998-04-06 2000-02-29 International Business Machines Corporation Method and system in a graphical user interface for facilitating cursor object movement for physically challenged computer users
JP2000105671A (en) 1998-05-11 2000-04-11 Ricoh Co Ltd Coordinate input and detecting device, and electronic blackboard system
US6061177A (en) 1996-12-19 2000-05-09 Fujimoto; Kenneth Noboru Integrated computer display and graphical input apparatus and method
JP2000132340A (en) 1998-06-09 2000-05-12 Ricoh Co Ltd Coordinate input/detecting device and electronic blackboard system
US6075905A (en) 1996-07-17 2000-06-13 Sarnoff Corporation Method and apparatus for mosaic image construction
US6104387A (en) 1997-05-14 2000-08-15 Virtual Ink Corporation Transcription system
US6118433A (en) 1992-01-30 2000-09-12 Jenkin; Michael Large-scale, touch-sensitive video display
US6122865A (en) 1997-03-13 2000-09-26 Steelcase Development Inc. Workspace display
US6128003A (en) 1996-12-20 2000-10-03 Hitachi, Ltd. Hand gesture recognition system and method
US6141000A (en) 1991-10-21 2000-10-31 Smart Technologies Inc. Projection display system with touch sensing on screen, computer assisted alignment correction and network conferencing
US6147678A (en) 1998-12-09 2000-11-14 Lucent Technologies Inc. Video hand image-three-dimensional computer interface with multiple degrees of freedom
US6153836A (en) 1997-04-02 2000-11-28 Goszyk; Kurt A. Adjustable area coordinate position data-capture system
US6161066A (en) 1997-08-18 2000-12-12 The Texas A&M University System Advanced law enforcement and response technology
US6179426B1 (en) 1999-03-03 2001-01-30 3M Innovative Properties Company Integrated front projection system
US6188388B1 (en) 1993-12-28 2001-02-13 Hitachi, Ltd. Information presentation apparatus and information display apparatus
US6191773B1 (en) 1995-04-28 2001-02-20 Matsushita Electric Industrial Co., Ltd. Interface apparatus
JP2001075735A (en) 1999-09-06 2001-03-23 Canon Inc Coordinate input device, its method and computer readable memory
US6208330B1 (en) 1997-03-07 2001-03-27 Canon Kabushiki Kaisha Coordinate input apparatus and its control method
US6226035B1 (en) 1998-03-04 2001-05-01 Cyclo Vision Technologies, Inc. Adjustable imaging system with wide angle capability
US6229529B1 (en) 1997-07-11 2001-05-08 Ricoh Company, Ltd. Write point detecting circuit to detect multiple write points
US6252989B1 (en) 1997-01-07 2001-06-26 Board Of The Regents, The University Of Texas System Foveated image coding system and method for image bandwidth reduction
US6256033B1 (en) 1997-10-15 2001-07-03 Electric Planet Method and apparatus for real-time gesture recognition
US6262718B1 (en) 1994-01-19 2001-07-17 International Business Machines Corporation Touch-sensitive display apparatus
US20010019325A1 (en) 2000-03-06 2001-09-06 Ricoh Company, Ltd. Optical coordinate input/detection device with optical-unit positioning error correcting function
US20010022579A1 (en) 2000-03-16 2001-09-20 Ricoh Company, Ltd. Apparatus for inputting coordinates
US20010026268A1 (en) 2000-03-31 2001-10-04 Ricoh Company, Ltd. Coordinate input and detection device and information display and input apparatus
JP2001282456A (en) 2000-04-03 2001-10-12 Japan Science & Technology Corp Man-machine interface system
JP2001282457A (en) 2000-03-31 2001-10-12 Ricoh Co Ltd Coordinate input system, control method of coordinate input system and computer readable recording medium in which a program to make computer execute the method is recorded
US20010033274A1 (en) 1997-11-17 2001-10-25 Joon-Suan Ong Method and apparatus for erasing previously entered data
US6323846B1 (en) 1998-01-26 2001-11-27 University Of Delaware Method and apparatus for integrating manual input
US6328270B1 (en) 1999-11-12 2001-12-11 Elbex Video Ltd. Swivel joint with cable passage for a television camera or a case
US6335724B1 (en) 1999-01-29 2002-01-01 Ricoh Company, Ltd. Method and device for inputting coordinate-position and a display board system
CA2412878A1 (en) 2000-07-05 2002-01-10 Smart Technologies Inc. Camera-based touch system
US6339748B1 (en) 1997-11-11 2002-01-15 Seiko Epson Corporation Coordinate input system and display apparatus
WO2002007073A2 (en) 2000-07-13 2002-01-24 Koninklijke Philips Electronics N.V. Pointing direction calibration in camera-based system applications
US6353434B1 (en) 1998-09-08 2002-03-05 Gunze Limited Input coordinate transformation apparatus for converting coordinates input from a coordinate input device into coordinates in a display coordinate system for displaying images on a display
US6359612B1 (en) 1998-09-30 2002-03-19 Siemens Aktiengesellschaft Imaging system for displaying image information that has been acquired by means of a medical diagnostic imaging device
US20020036617A1 (en) 1998-08-21 2002-03-28 Timothy R. Pryor Novel man machine interfaces and applications
WO2002027461A1 (en) 2000-09-11 2002-04-04 Njoelstad Tormod Drawing, writing and pointing device
US20020050979A1 (en) 2000-08-24 2002-05-02 Sun Microsystems, Inc Interpolating sample values from known triangle vertex values
US20020067922A1 (en) 2000-12-02 2002-06-06 Harris Thomas H.S. Operator supported remote camera positioning and control system
US20020080123A1 (en) 2000-12-26 2002-06-27 International Business Machines Corporation Method for touchscreen data input
US6414673B1 (en) 1998-11-10 2002-07-02 Tidenet, Inc. Transmitter pen location system
US6414671B1 (en) 1992-06-08 2002-07-02 Synaptics Incorporated Object position detector with edge motion feature and gesture recognition
JP2002236547A (en) 2001-02-07 2002-08-23 Ricoh Co Ltd Information input system
US20020145595A1 (en) 2001-03-26 2002-10-10 Mitsuru Satoh Information input/output apparatus, information input/output control method, and computer product
US20020163530A1 (en) 2001-04-05 2002-11-07 Fujitsu Limited Of Kawasaki, Japan Image merging apparatus
US6496122B2 (en) 1998-06-26 2002-12-17 Sharp Laboratories Of America, Inc. Image display and remote control system capable of displaying two distinct images
US6497608B2 (en) 2001-02-09 2002-12-24 Sampo Technology Corp. Toy car camera system and rear vision mirrors
US6498602B1 (en) 1999-11-11 2002-12-24 Newcom, Inc. Optical digitizer with function to recognize kinds of pointing instruments
US6507339B1 (en) 1999-08-23 2003-01-14 Ricoh Company, Ltd. Coordinate inputting/detecting system and a calibration method therefor
US6512838B1 (en) 1999-09-22 2003-01-28 Canesta, Inc. Methods for enhancing performance and data acquired from three-dimensional image systems
US20030025951A1 (en) 2001-07-27 2003-02-06 Pollard Stephen Bernard Paper-to-computer interfaces
US6518600B1 (en) 2000-11-17 2003-02-11 General Electric Company Dual encapsulation for an LED
US6517266B2 (en) 2001-05-15 2003-02-11 Xerox Corporation Systems and methods for hand-held printing on a surface or medium
US6522830B2 (en) 1993-11-30 2003-02-18 Canon Kabushiki Kaisha Image pickup apparatus
US6529189B1 (en) 2000-02-08 2003-03-04 International Business Machines Corporation Touch screen stylus with IR-coupled selection buttons
US20030046401A1 (en) 2000-10-16 2003-03-06 Abbott Kenneth H. Dynamically determining appropriate computer user interfaces
US20030043116A1 (en) 2001-06-01 2003-03-06 Gerald Morrison Calibrating camera offsets to facilitate object Position determination using triangulation
US6530664B2 (en) 1999-03-03 2003-03-11 3M Innovative Properties Company Integrated front projection system with enhanced dry erase screen configuration
US20030063073A1 (en) 2001-10-03 2003-04-03 Geaghan Bernard O. Touch panel system and method for distinguishing multiple touch inputs
US6545669B1 (en) 1999-03-26 2003-04-08 Husam Kinawi Object-drag continuity between discontinuous touch-screens
US20030071858A1 (en) 2001-09-28 2003-04-17 Hiroshi Morohoshi Information input and output system, method, storage medium, and carrier wave
US6559813B1 (en) 1998-07-01 2003-05-06 Deluca Michael Selective real image obstruction in a virtual reality display apparatus and method
US20030085871A1 (en) * 2001-10-09 2003-05-08 E-Business Information Technology Coordinate input device working with at least display screen and desk-top surface as the pointing areas thereof
US6563491B1 (en) 1999-09-10 2003-05-13 Ricoh Company, Ltd. Coordinate input apparatus and the recording medium thereof
US6567078B2 (en) 2000-01-25 2003-05-20 Xiroku Inc. Handwriting communication system and handwriting input device used therein
US6567121B1 (en) 1996-10-25 2003-05-20 Canon Kabushiki Kaisha Camera control system, camera server, camera client, control method, and storage medium
US20030095112A1 (en) 2001-11-22 2003-05-22 International Business Machines Corporation Information processing apparatus, program and coordinate input method
US6570612B1 (en) 1998-09-21 2003-05-27 Bank One, Na, As Administrative Agent System and method for color normalization of board images
JP2003158597A (en) 2001-11-21 2003-05-30 Mitsubishi Rayon Co Ltd Image display device provided with screen used for handwriting image entry face
US6577299B1 (en) 1998-08-18 2003-06-10 Digital Ink, Inc. Electronic portable pen apparatus and method
US6587099B2 (en) 2000-02-18 2003-07-01 Ricoh Company, Ltd. Coordinate input/detection device detecting installation position of light-receiving device used for detecting coordinates
US6594023B1 (en) 1999-09-10 2003-07-15 Ricoh Company, Ltd. Coordinate inputting/detecting apparatus, method and computer program product designed to precisely recognize a designating state of a designating device designating a position
US6597348B1 (en) 1998-12-28 2003-07-22 Semiconductor Energy Laboratory Co., Ltd. Information-processing device
US20030142880A1 (en) 2002-01-29 2003-07-31 Manabu Hyodo Image processing method, image processing apparatus, and electronic camera
US20030151562A1 (en) 2002-02-08 2003-08-14 Kulas Charles J. Computer display system using multiple screens
US20030151532A1 (en) 2002-02-13 2003-08-14 Hsin-Shu Chen Calibration of resistor ladder using difference measurement and parallel resistive correction
US6626718B2 (en) 2000-10-03 2003-09-30 Canon Kabushiki Kaisha Apparatus for manufacturing electron source, method for manufacturing electron source, and method for manufacturing image-forming apparatus
US6630922B2 (en) 1997-08-29 2003-10-07 Xerox Corporation Handedness detection for a physical manipulatory grammar
US6633328B1 (en) 1999-01-05 2003-10-14 Steris Corporation Surgical lighting system with integrated digital video camera
US6650822B1 (en) 1996-10-29 2003-11-18 Xeotion Corp. Optical device utilizing optical waveguides and mechanical light-switches
CA2493236A1 (en) 2002-06-10 2003-12-18 Steven Montellese Apparatus and method for inputting data
US6674424B1 (en) 1999-10-29 2004-01-06 Ricoh Company, Ltd. Method and apparatus for inputting information including coordinate data
US6683584B2 (en) 1993-10-22 2004-01-27 Kopin Corporation Camera display system
US20040021633A1 (en) 2002-04-06 2004-02-05 Rajkowski Janusz Wiktor Symbol encoding apparatus and method
US6690357B1 (en) 1998-10-07 2004-02-10 Intel Corporation Input device using scanning sensors
US6690363B2 (en) 2000-06-19 2004-02-10 Next Holdings Limited Touch panel display system
US6690397B1 (en) 2000-06-05 2004-02-10 Advanced Neuromodulation Systems, Inc. System for regional data association and presentation and method for the same
US20040031779A1 (en) 2002-05-17 2004-02-19 Cahill Steven P. Method and system for calibrating a laser processing system and laser marking system utilizing same
US20040046749A1 (en) 1996-10-15 2004-03-11 Nikon Corporation Image recording and replay apparatus
US6710770B2 (en) 2000-02-11 2004-03-23 Canesta, Inc. Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device
US6736321B2 (en) 1995-12-18 2004-05-18 Metrologic Instruments, Inc. Planar laser illumination and imaging (PLIIM) system employing wavefront control methods for reducing the power of speckle-pattern noise digital images acquired by said system
US6741250B1 (en) 2001-02-09 2004-05-25 Be Here Corporation Method and system for generation of multiple viewpoints into a scene viewed by motionless cameras and for presentation of a view path
US20040108990A1 (en) 2001-01-08 2004-06-10 Klony Lieberman Data input device
US6756910B2 (en) 2001-02-27 2004-06-29 Optex Co., Ltd. Sensor for automatic doors
US20040149892A1 (en) 2003-01-30 2004-08-05 Akitt Trevor M. Illuminated bezel and touch system incorporating the same
US20040150630A1 (en) 2001-08-29 2004-08-05 Microsoft Corporation Manual controlled scrolling
US6774889B1 (en) 2000-10-24 2004-08-10 Microsoft Corporation System and method for transforming an ordinary computer monitor screen into a touch screen
EP1450243A2 (en) 2003-02-19 2004-08-25 Agilent Technologies Inc Electronic device having an image-based data input system
US20040169639A1 (en) 2003-02-28 2004-09-02 Pate Michael A. Visible pointer tracking with separately detectable pointer tracking signal
US20040178993A1 (en) 2003-03-11 2004-09-16 Morrison Gerald D. Touch system and method for determining pointer contacts on a touch surface
US20040179001A1 (en) 2003-03-11 2004-09-16 Morrison Gerald D. System and method for differentiating between pointers used to contact touch surface
US20040189720A1 (en) 2003-03-25 2004-09-30 Wilson Andrew D. Architecture for controlling a computer using hand gestures
US6803906B1 (en) 2000-07-05 2004-10-12 Smart Technologies, Inc. Passive touch system and method of detecting user input
US20040252091A1 (en) 2003-06-14 2004-12-16 Massachusetts Institute Of Technology Input device based on frustrated total internal reflection
US6864882B2 (en) 2000-05-24 2005-03-08 Next Holdings Limited Protected touch panel display system
US20050052427A1 (en) 2003-09-10 2005-03-10 Wu Michael Chi Hung Hand gesture interaction with touch surface
US20050057524A1 (en) 2003-09-16 2005-03-17 Hill Douglas B. Gesture recognition method and touch system incorporating the same
US20050083308A1 (en) 2003-10-16 2005-04-21 Homer Steven S. Display for an electronic device
US6911972B2 (en) 2001-04-04 2005-06-28 Matsushita Electric Industrial Co., Ltd. User interface device
US20050151733A1 (en) 2004-01-09 2005-07-14 Microsoft Corporation Multi-chart geometry images
US6933981B1 (en) 1999-06-25 2005-08-23 Kabushiki Kaisha Toshiba Electronic apparatus and electronic system provided with the same
US20050190162A1 (en) 2003-02-14 2005-09-01 Next Holdings, Limited Touch screen signal processing
US6954197B2 (en) 2002-11-15 2005-10-11 Smart Technologies Inc. Size/scale and orientation determination of a pointer in a camera-based touch system
WO2005106775A1 (en) 2004-05-05 2005-11-10 Smart Technologies Inc. Apparatus and method for detecting a pointer relative to a touch surface
US20050248540A1 (en) 2004-05-07 2005-11-10 Next Holdings, Limited Touch panel display system with illumination and detection provided from a single edge
US6972753B1 (en) 1998-10-02 2005-12-06 Semiconductor Energy Laboratory Co., Ltd. Touch panel, display device provided with touch panel and electronic equipment provided with display device
US20050276448A1 (en) 2000-07-07 2005-12-15 Pryor Timothy R Multi-functional control and entertainment systems
US7007236B2 (en) 2001-09-14 2006-02-28 Accenture Global Services Gmbh Lab window collaboration
US7030861B1 (en) 2001-02-10 2006-04-18 Wayne Carl Westerman System and method for packing multi-touch gestures onto a hand
US20060158437A1 (en) 2005-01-20 2006-07-20 Blythe Michael M Display device
US7084868B2 (en) 2000-04-26 2006-08-01 University Of Louisville Research Foundation, Inc. System and method for 3-D digital reconstruction of an oral cavity from a sequence of 2-D images
US7098392B2 (en) 1996-07-10 2006-08-29 Sitrick David H Electronic image visualization system and communication methodologies
US20060202953A1 (en) 1997-08-22 2006-09-14 Pryor Timothy R Novel man machine interfaces and applications
US20060227120A1 (en) 2005-03-28 2006-10-12 Adam Eikman Photonic touch screen apparatus and method of use
US7121470B2 (en) 2002-01-11 2006-10-17 Hand Held Products, Inc. Transaction terminal having elongated finger recess
US20060274067A1 (en) 2001-09-14 2006-12-07 Hideo Hidai Image processing apparatus, display apparatus with touch panel, image processing method and computer program
WO2007003196A2 (en) 2005-07-05 2007-01-11 O-Pen Aps A touch pad system
US20070019103A1 (en) 2005-07-25 2007-01-25 Vkb Inc. Optical apparatus for virtual interface projection and sensing
US7184030B2 (en) 2002-06-27 2007-02-27 Smart Technologies Inc. Synchronization of cameras in camera-based touch system to enhance position determination of fast moving objects
US7187489B2 (en) 1999-10-05 2007-03-06 Idc, Llc Photonic MEMS and structures
US7190496B2 (en) 2003-07-24 2007-03-13 Zebra Imaging, Inc. Enhanced environment visualization using holographic stereograms
US20070075648A1 (en) 2005-10-03 2007-04-05 Blythe Michael M Reflecting light
US20070116333A1 (en) 2005-11-18 2007-05-24 Dempski Kelly L Detection of multiple targets on a plane of interest
US20070126755A1 (en) 2002-06-19 2007-06-07 Microsoft Corporation System and Method for Whiteboard and Audio Capture
WO2007064804A1 (en) 2005-12-02 2007-06-07 General Electric Company Electroform, methods of making electroforms, and products made from electroforms
US7232986B2 (en) 2004-02-17 2007-06-19 Smart Technologies Inc. Apparatus for detecting a pointer within a region of interest
US20070139932A1 (en) 2005-12-20 2007-06-21 Industrial Technology Research Institute Light source package structure
US7274356B2 (en) 2003-10-09 2007-09-25 Smart Technologies Inc. Apparatus for determining the location of a pointer within a region of interest
US20080062149A1 (en) 2003-05-19 2008-03-13 Baruch Itzhak Optical coordinate input device comprising few elements
US7355593B2 (en) 2004-01-02 2008-04-08 Smart Technologies, Inc. Pointer tracking across multiple overlapping coordinate input sub-regions defining a generally contiguous input region
US20080129707A1 (en) 2004-07-27 2008-06-05 Pryor Timothy R Method and apparatus employing multi-functional controls and displays

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0816931A (en) 1994-06-28 1996-01-19 Tec Corp Order data processor

Patent Citations (287)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4107522A (en) 1975-11-11 1978-08-15 Erwin Sick Gesellschaft Mit Beschrankter Haftung Optik-Elektronik Rotary beam light curtain
US4144449A (en) 1977-07-08 1979-03-13 Sperry Rand Corporation Position detection apparatus
US4247767A (en) 1978-04-05 1981-01-27 Her Majesty The Queen In Right Of Canada, As Represented By The Minister Of National Defence Touch sensitive computer input device
JPS57211637A (en) 1981-06-23 1982-12-25 Kokusai Electric Co Ltd Optical coordinate input device
US4558313A (en) 1981-12-31 1985-12-10 International Business Machines Corporation Indicator to data processing interface
US4507557A (en) 1983-04-01 1985-03-26 Siemens Corporate Research & Support, Inc. Non-contact X,Y digitizer using two dynamic ram imagers
JPH0354618B2 (en) 1983-12-29 1991-08-20
US4672364A (en) 1984-06-18 1987-06-09 Carroll Touch Inc Touch input device having power profiling
JPS61196317A (en) 1985-02-27 1986-08-30 Nippon Telegr & Teleph Corp <Ntt> Information input system
JPS61260322A (en) 1985-05-10 1986-11-18 ザ・レイトラム・コ−ポレ−シヨン Positioning system
US4737631A (en) 1985-05-17 1988-04-12 Alps Electric Co., Ltd. Filter of photoelectric touch panel with integral spherical protrusion lens
US4742221A (en) 1985-05-17 1988-05-03 Alps Electric Co., Ltd. Optical coordinate position input device
US4980547A (en) 1985-05-24 1990-12-25 Wells-Gardner Electronics Corp. Light distribution and detection apparatus
JPS61196317U (en) 1985-05-29 1986-12-08
US4762990A (en) 1985-10-21 1988-08-09 International Business Machines Corporation Data processing input interface determining position of object
US4831455A (en) 1986-02-21 1989-05-16 Canon Kabushiki Kaisha Picture reading apparatus
US4822145A (en) 1986-05-14 1989-04-18 Massachusetts Institute Of Technology Method and apparatus utilizing waveguide and polarized light for display of dynamic images
US4818826A (en) 1986-09-19 1989-04-04 Alps Electric Co., Ltd. Coordinate input apparatus including a detection circuit to determine proper stylus position
US4782328A (en) 1986-10-02 1988-11-01 Product Development Services, Incorporated Ambient-light-responsive touch screen data input method and system
US4868912A (en) 1986-11-26 1989-09-19 Digital Electronics Infrared touch panel
JPH0816931B2 (en) 1987-02-06 1996-02-21 富士通株式会社 Contour extraction method
US4746770A (en) 1987-02-17 1988-05-24 Sensor Frame Incorporated Method and apparatus for isolating and manipulating graphic objects on computer video monitor
EP0279652A2 (en) 1987-02-17 1988-08-24 Sensor Frame Incorporated Method and apparatus for isolating and manipulating graphic objects on computer video monitor
GB2204126A (en) 1987-04-28 1988-11-02 Wells Gardner Electronics Optical position determining apparatus
US4820050A (en) 1987-04-28 1989-04-11 Wells-Gardner Electronics Corporation Solid-state optical position determining apparatus
JPH07110733B2 (en) 1988-05-10 1995-11-29 三菱重工業株式会社 Brake device deterioration determination device
US5414413A (en) 1988-06-14 1995-05-09 Sony Corporation Touch panel apparatus
EP0347725A2 (en) 1988-06-22 1989-12-27 Wacom Company, Ltd. Electronic blackboard and accessories such as writing tools
US5109435A (en) 1988-08-08 1992-04-28 Hughes Aircraft Company Segmentation method for use against moving objects
US5196835A (en) 1988-09-30 1993-03-23 International Business Machines Corporation Laser touch panel reflective surface aberration cancelling
JPH0354618A (en) 1989-07-22 1991-03-08 Fujitsu Ltd Optical position indicator
US5140647A (en) 1989-12-18 1992-08-18 Hitachi, Ltd. Image joining method and system
US5130794A (en) 1990-03-29 1992-07-14 Ritchey Kurtis J Panoramic display system
US5025314A (en) 1990-07-30 1991-06-18 Xerox Corporation Apparatus allowing remote interactive use of a plurality of writing surfaces
US5162618A (en) 1990-11-16 1992-11-10 Exzec, Inc. Acoustic touch position sensor with first order lamb wave reflective arrays
US5239373A (en) 1990-12-26 1993-08-24 Xerox Corporation Video computational shared drawing space
US5537107A (en) 1991-01-29 1996-07-16 Sony Corporation Remote control unit for video apparatus
US5097516A (en) 1991-02-28 1992-03-17 At&T Bell Laboratories Technique for illuminating a surface with a gradient intensity line of light to achieve enhanced two-dimensional imaging
JPH04350715A (en) 1991-05-28 1992-12-04 Matsushita Electric Ind Co Ltd Input device
JPH04355815A (en) 1991-06-03 1992-12-09 Pfu Ltd Touch screen
US5168531A (en) 1991-06-27 1992-12-01 Digital Equipment Corporation Real-time recognition of pointing information from video
US6337681B1 (en) 1991-10-21 2002-01-08 Smart Technologies Inc. Projection display system with pressure sensing at screen, and computer assisted alignment implemented by applying pressure at displayed calibration marks
US6141000A (en) 1991-10-21 2000-10-31 Smart Technologies Inc. Projection display system with touch sensing on screen, computer assisted alignment correction and network conferencing
US6747636B2 (en) 1991-10-21 2004-06-08 Smart Technologies, Inc. Projection display and system with pressure sensing at screen, and computer assisted alignment implemented by applying pressure at displayed calibration marks
US5448263A (en) 1991-10-21 1995-09-05 Smart Technologies Inc. Interactive display system
JPH05181605A (en) 1991-12-27 1993-07-23 Seiko Instr Inc Projection display device with coordinate reading function and its display screen and display
JPH05189137A (en) 1992-01-16 1993-07-30 Sumitomo Heavy Ind Ltd Command input device for computer
JPH05197810A (en) 1992-01-20 1993-08-06 Nippon Telegr & Teleph Corp <Ntt> Command input processing method by image
US6118433A (en) 1992-01-30 2000-09-12 Jenkin; Michael Large-scale, touch-sensitive video display
US5483261A (en) 1992-02-14 1996-01-09 Itu Research, Inc. Graphical input controller and method with rear screen image detection
US20040178997A1 (en) 1992-06-08 2004-09-16 Synaptics, Inc., A California Corporation Object position detector with edge motion feature and gesture recognition
US6414671B1 (en) 1992-06-08 2002-07-02 Synaptics Incorporated Object position detector with edge motion feature and gesture recognition
US5943783A (en) 1992-09-04 1999-08-31 Balco, Incorporated Method and apparatus for determining the alignment of motor vehicle wheels
US5581276A (en) 1992-09-08 1996-12-03 Kabushiki Kaisha Toshiba 3D human interface apparatus using motion recognition based on dynamic image processing
US5982352A (en) 1992-09-18 1999-11-09 Pryor; Timothy R. Method for providing human input to a computer
US5483603A (en) 1992-10-22 1996-01-09 Advanced Interconnection Technology System and method for automatic optical inspection
US5317140A (en) * 1992-11-24 1994-05-31 Dunthorn David I Diffusion-assisted position location particularly for visual pen detection
US5594502A (en) 1993-01-20 1997-01-14 Elmo Company, Limited Image reproduction apparatus
US5374971A (en) 1993-03-12 1994-12-20 Picturetel Corporation Two-view video camera stand and support method
US5502568A (en) 1993-03-23 1996-03-26 Wacom Co., Ltd. Optical position detecting unit, optical coordinate input unit and optical position detecting method employing a pattern having a sequence of 1's and 0's
US5359155A (en) 1993-03-25 1994-10-25 Tiger Scientific Corp. Illumination apparatus for a digitizer tablet
US5729704A (en) 1993-07-21 1998-03-17 Xerox Corporation User-directed method for operating on an object-based model data structure through a second contextual image
JPH07230352A (en) 1993-09-16 1995-08-29 Hitachi Ltd Touch position detecting device and touch instruction processor
US5490655A (en) 1993-09-16 1996-02-13 Monger Mounts, Inc. Video/data projector and monitor ceiling/wall mount
JPH07110733A (en) 1993-10-13 1995-04-25 The Nippon Signal Co., Ltd. Input device
US6683584B2 (en) 1993-10-22 2004-01-27 Kopin Corporation Camera display system
US5617312A (en) * 1993-11-19 1997-04-01 Hitachi, Ltd. Computer system that enters control information by means of video camera
US6522830B2 (en) 1993-11-30 2003-02-18 Canon Kabushiki Kaisha Image pickup apparatus
US5484966A (en) 1993-12-07 1996-01-16 At&T Corp. Sensing stylus position using single 1-D image sensor
EP0657841A1 (en) 1993-12-07 1995-06-14 AT&T Corp. Sensing stylus position using single 1-D image sensor
US6188388B1 (en) 1993-12-28 2001-02-13 Hitachi, Ltd. Information presentation apparatus and information display apparatus
US6262718B1 (en) 1994-01-19 2001-07-17 International Business Machines Corporation Touch-sensitive display apparatus
US5988645A (en) 1994-04-08 1999-11-23 Downing; Dennis L. Moving object monitoring system
US5670755A (en) 1994-04-21 1997-09-23 Samsung Display Devices Co., Ltd. Information input apparatus having functions of both touch panel and digitizer, and driving method thereof
JPH08108689A (en) 1994-05-31 1996-04-30 Nippon Typewriter Co Ltd Electronic blackboard
US5771039A (en) 1994-06-06 1998-06-23 Ditzik; Richard J. Direct view display device integration techniques
US5525764A (en) 1994-06-09 1996-06-11 Junkins; John L. Laser scanning graphic input system
US5528263A (en) 1994-06-15 1996-06-18 Daniel M. Platzker Interactive projected video image display system
US5737740A (en) 1994-06-27 1998-04-07 Numonics Apparatus and method for processing electronic documents
US5801704A (en) 1994-08-22 1998-09-01 Hitachi, Ltd. Three-dimensional input device with displayed legend and shape-changing cursor
US5528290A (en) 1994-09-09 1996-06-18 Xerox Corporation Device for transcribing images on a board using a camera based board scanner
US5920342A (en) 1994-09-16 1999-07-06 Kabushiki Kaisha Toshiba Image input apparatus for capturing images of multiple resolutions
US5686942A (en) 1994-12-01 1997-11-11 National Semiconductor Corporation Remote computer input system which detects point source on operator
US5581637A (en) 1994-12-09 1996-12-03 Xerox Corporation System for registering component image tiles in a camera-based scanner device transcribing scene images
US5638092A (en) 1994-12-20 1997-06-10 Eng; Tommy K. Cursor control system
US5818421A (en) 1994-12-21 1998-10-06 Hitachi, Ltd. Input interface apparatus for large screen display
US5554828A (en) 1995-01-03 1996-09-10 Texas Instruments Inc. Integration of pen-based capability into a field emission device system
US5594469A (en) 1995-02-21 1997-01-14 Mitsubishi Electric Information Technology Center America Inc. Hand gesture machine control system
US5736686A (en) 1995-03-01 1998-04-07 Gtco Corporation Illumination apparatus for a digitizer tablet with improved light panel
JPH08240407A (en) 1995-03-02 1996-09-17 Matsushita Electric Ind Co Ltd Position detecting input device
US6191773B1 (en) 1995-04-28 2001-02-20 Matsushita Electric Industrial Co., Ltd. Interface apparatus
US5911004A (en) 1995-05-08 1999-06-08 Ricoh Company, Ltd. Image processing apparatus for discriminating image characteristics using image signal information obtained in an image scanning operation
JPH08315152A (en) 1995-05-22 1996-11-29 Sony Corp Image recognition device
US6008798A (en) 1995-06-07 1999-12-28 Compaq Computer Corporation Method of determining an object's position and associated apparatus
US5764223A (en) 1995-06-07 1998-06-09 International Business Machines Corporation Touch-screen input device using the monitor as a light source operating at an intermediate frequency
US5734375A (en) 1995-06-07 1998-03-31 Compaq Computer Corporation Keyboard-compatible optical determination of object's position
EP0762319A2 (en) 1995-08-24 1997-03-12 Symbios Logic Inc. Graphical input apparatus and method
JPH0991094A (en) 1995-09-21 1997-04-04 Sekisui Chem Co Ltd Coordinate detector for touch panel
US5818424A (en) 1995-10-19 1998-10-06 International Business Machines Corporation Rod shaped device and data acquisition apparatus for determining the position and orientation of an object in space
US6736321B2 (en) 1995-12-18 2004-05-18 Metrologic Instruments, Inc. Planar laser illumination and imaging (PLIIM) system employing wavefront control methods for reducing the power of speckle-pattern noise digital images acquired by said system
US5825352A (en) 1996-01-04 1998-10-20 Logitech, Inc. Multiple fingers contact sensing method for emulating mouse buttons and mouse operations on a touch sensor pad
US5831602A (en) 1996-01-11 1998-11-03 Canon Kabushiki Kaisha Information processing apparatus, method and computer program product
US5963199A (en) 1996-02-09 1999-10-05 Kabushiki Kaisha Sega Enterprises Image processing systems and data input devices therefor
JPH09224111A (en) 1996-02-16 1997-08-26 Hitachi Denshi Ltd Electronic blackboard
JPH09319501A (en) 1996-05-29 1997-12-12 Fujitsu Ltd Coordinate detector
US7098392B2 (en) 1996-07-10 2006-08-29 Sitrick David H Electronic image visualization system and communication methodologies
US6075905A (en) 1996-07-17 2000-06-13 Sarnoff Corporation Method and apparatus for mosaic image construction
US6002808A (en) 1996-07-26 1999-12-14 Mitsubishi Electric Information Technology Center America, Inc. Hand gesture control system
US6208329B1 (en) 1996-08-13 2001-03-27 Lsi Logic Corporation Supplemental mouse button emulation system, method and apparatus for a coordinate based data input device
WO1998007112A2 (en) 1996-08-13 1998-02-19 Lsi Logic Corporation Data input apparatus and method
JPH10105324A (en) 1996-09-09 1998-04-24 Motorola Inc Intuitive gesture system graphical user interface
US5745116A (en) 1996-09-09 1998-04-28 Motorola, Inc. Intuitive gesture-based graphical user interface
US5936615A (en) 1996-09-12 1999-08-10 Digital Equipment Corporation Image-based touchscreen
EP0829798A2 (en) 1996-09-12 1998-03-18 Digital Equipment Corporation Image-based touchscreen
US5819201A (en) 1996-09-13 1998-10-06 Magellan Dis, Inc. Navigation system with vehicle service information
US20040046749A1 (en) 1996-10-15 2004-03-11 Nikon Corporation Image recording and replay apparatus
US6567121B1 (en) 1996-10-25 2003-05-20 Canon Kabushiki Kaisha Camera control system, camera server, camera client, control method, and storage medium
US6650822B1 (en) 1996-10-29 2003-11-18 Xeotion Corp. Optical device utilizing optical waveguides and mechanical light-switches
US6061177A (en) 1996-12-19 2000-05-09 Fujimoto; Kenneth Noboru Integrated computer display and graphical input apparatus and method
US6128003A (en) 1996-12-20 2000-10-03 Hitachi, Ltd. Hand gesture recognition system and method
US6252989B1 (en) 1997-01-07 2001-06-26 Board Of The Regents, The University Of Texas System Foveated image coding system and method for image bandwidth reduction
US6208330B1 (en) 1997-03-07 2001-03-27 Canon Kabushiki Kaisha Coordinate input apparatus and its control method
US6760999B2 (en) 1997-03-13 2004-07-13 Steelcase Development Corporation Workspace display
US6122865A (en) 1997-03-13 2000-09-26 Steelcase Development Inc. Workspace display
US6209266B1 (en) 1997-03-13 2001-04-03 Steelcase Development Inc. Workspace display
US6427389B1 (en) 1997-03-13 2002-08-06 Steelcase Development Corporation Workspace display
US5914709A (en) 1997-03-14 1999-06-22 Poa Sana, Llc User input device for a computer system
US6153836A (en) 1997-04-02 2000-11-28 Goszyk; Kurt A. Adjustable area coordinate position data-capture system
US6104387A (en) 1997-05-14 2000-08-15 Virtual Ink Corporation Transcription system
DE19810452A1 (en) 1997-06-13 1998-12-17 Wacom Co Ltd Optical coordinate digitiser
US6100538A (en) 1997-06-13 2000-08-08 Kabushikikaisha Wacom Optical digitizer and display means for providing display of indicated position
US6229529B1 (en) 1997-07-11 2001-05-08 Ricoh Company, Ltd. Write point detecting circuit to detect multiple write points
US5790910A (en) 1997-08-04 1998-08-04 Peerless Industries, Inc. Camera mounting apparatus
JPH1151644A (en) 1997-08-05 1999-02-26 Honda Motor Co Ltd Distance measuring instrument for vehicle
JPH11110116A (en) 1997-08-07 1999-04-23 Fujitsu Ltd Optical position detection device
JPH1164026A (en) 1997-08-12 1999-03-05 Fujitsu Ten Ltd Navigation system
WO1999008897A1 (en) 1997-08-18 1999-02-25 The Texas A & M University System Centralised control system in a police vehicle
US6161066A (en) 1997-08-18 2000-12-12 The Texas A&M University System Advanced law enforcement and response technology
US20060202953A1 (en) 1997-08-22 2006-09-14 Pryor Timothy R Novel man machine interfaces and applications
US6630922B2 (en) 1997-08-29 2003-10-07 Xerox Corporation Handedness detection for a physical manipulatory grammar
JPH1185376A (en) 1997-09-02 1999-03-30 Fujitsu Ltd Information display device with optical position detecting device
US6256033B1 (en) 1997-10-15 2001-07-03 Electric Planet Method and apparatus for real-time gesture recognition
WO1999021122A1 (en) 1997-10-22 1999-04-29 Ascent Technology, Inc. Voice-output reading system with gesture-based navigation
US6339748B1 (en) 1997-11-11 2002-01-15 Seiko Epson Corporation Coordinate input system and display apparatus
US20010033274A1 (en) 1997-11-17 2001-10-25 Joon-Suan Ong Method and apparatus for erasing previously entered data
US6310610B1 (en) 1997-12-04 2001-10-30 Nortel Networks Limited Intelligent touch display
WO1999028812A1 (en) 1997-12-04 1999-06-10 Northern Telecom Limited Intelligent touch display
US6323846B1 (en) 1998-01-26 2001-11-27 University Of Delaware Method and apparatus for integrating manual input
WO1999040562A1 (en) 1998-02-09 1999-08-12 Joseph Lev Video camera computer touch screen system
US6226035B1 (en) 1998-03-04 2001-05-01 Cyclo Vision Technologies, Inc. Adjustable imaging system with wide angle capability
US6031531A (en) 1998-04-06 2000-02-29 International Business Machines Corporation Method and system in a graphical user interface for facilitating cursor object movement for physically challenged computer users
US6608619B2 (en) 1998-05-11 2003-08-19 Ricoh Company, Ltd. Coordinate position inputting/detecting device, a method for inputting/detecting the coordinate position, and a display board system
US6429856B1 (en) 1998-05-11 2002-08-06 Ricoh Company, Ltd. Coordinate position inputting/detecting device, a method for inputting/detecting the coordinate position, and a display board system
JP2000105671A (en) 1998-05-11 2000-04-11 Ricoh Co Ltd Coordinate input and detecting device, and electronic blackboard system
US6421042B1 (en) 1998-06-09 2002-07-16 Ricoh Company, Ltd. Coordinate position inputting/detecting device, a method for inputting/detecting the coordinate position, and a display board system
US20030001825A1 (en) 1998-06-09 2003-01-02 Katsuyuki Omura Coordinate position inputting/detecting device, a method for inputting/detecting the coordinate position, and a display board system
JP2000132340A (en) 1998-06-09 2000-05-12 Ricoh Co Ltd Coordinate input/detecting device and electronic blackboard system
US6760009B2 (en) 1998-06-09 2004-07-06 Ricoh Company, Ltd. Coordinate position inputting/detecting device, a method for inputting/detecting the coordinate position, and a display board system
US6496122B2 (en) 1998-06-26 2002-12-17 Sharp Laboratories Of America, Inc. Image display and remote control system capable of displaying two distinct images
US6559813B1 (en) 1998-07-01 2003-05-06 Deluca Michael Selective real image obstruction in a virtual reality display apparatus and method
US6577299B1 (en) 1998-08-18 2003-06-10 Digital Ink, Inc. Electronic portable pen apparatus and method
US20020036617A1 (en) 1998-08-21 2002-03-28 Timothy R. Pryor Novel man machine interfaces and applications
US6353434B1 (en) 1998-09-08 2002-03-05 Gunze Limited Input coordinate transformation apparatus for converting coordinates input from a coordinate input device into coordinates in a display coordinate system for displaying images on a display
US6570612B1 (en) 1998-09-21 2003-05-27 Bank One, NA, as Administrative Agent System and method for color normalization of board images
US6359612B1 (en) 1998-09-30 2002-03-19 Siemens Aktiengesellschaft Imaging system for displaying image information that has been acquired by means of a medical diagnostic imaging device
US6972753B1 (en) 1998-10-02 2005-12-06 Semiconductor Energy Laboratory Co., Ltd. Touch panel, display device provided with touch panel and electronic equipment provided with display device
US6690357B1 (en) 1998-10-07 2004-02-10 Intel Corporation Input device using scanning sensors
US6414673B1 (en) 1998-11-10 2002-07-02 Tidenet, Inc. Transmitter pen location system
US6147678A (en) 1998-12-09 2000-11-14 Lucent Technologies Inc. Video hand image-three-dimensional computer interface with multiple degrees of freedom
US6597348B1 (en) 1998-12-28 2003-07-22 Semiconductor Energy Laboratory Co., Ltd. Information-processing device
US6633328B1 (en) 1999-01-05 2003-10-14 Steris Corporation Surgical lighting system with integrated digital video camera
US6335724B1 (en) 1999-01-29 2002-01-01 Ricoh Company, Ltd. Method and device for inputting coordinate-position and a display board system
US6530664B2 (en) 1999-03-03 2003-03-11 3M Innovative Properties Company Integrated front projection system with enhanced dry erase screen configuration
US6179426B1 (en) 1999-03-03 2001-01-30 3M Innovative Properties Company Integrated front projection system
US6545669B1 (en) 1999-03-26 2003-04-08 Husam Kinawi Object-drag continuity between discontinuous touch-screens
US6933981B1 (en) 1999-06-25 2005-08-23 Kabushiki Kaisha Toshiba Electronic apparatus and electronic system provided with the same
US6507339B1 (en) 1999-08-23 2003-01-14 Ricoh Company, Ltd. Coordinate inputting/detecting system and a calibration method therefor
JP2001075735A (en) 1999-09-06 2001-03-23 Canon Inc Coordinate input device, its method and computer readable memory
US6563491B1 (en) 1999-09-10 2003-05-13 Ricoh Company, Ltd. Coordinate input apparatus and the recording medium thereof
US6594023B1 (en) 1999-09-10 2003-07-15 Ricoh Company, Ltd. Coordinate inputting/detecting apparatus, method and computer program product designed to precisely recognize a designating state of a designating device designating a position
US6512838B1 (en) 1999-09-22 2003-01-28 Canesta, Inc. Methods for enhancing performance and data acquired from three-dimensional image systems
US7187489B2 (en) 1999-10-05 2007-03-06 Idc, Llc Photonic MEMS and structures
US6674424B1 (en) 1999-10-29 2004-01-06 Ricoh Company, Ltd. Method and apparatus for inputting information including coordinate data
US6498602B1 (en) 1999-11-11 2002-12-24 Newcom, Inc. Optical digitizer with function to recognize kinds of pointing instruments
US6328270B1 (en) 1999-11-12 2001-12-11 Elbex Video Ltd. Swivel joint with cable passage for a television camera or a case
US6567078B2 (en) 2000-01-25 2003-05-20 Xiroku Inc. Handwriting communication system and handwriting input device used therein
US6529189B1 (en) 2000-02-08 2003-03-04 International Business Machines Corporation Touch screen stylus with IR-coupled selection buttons
US6710770B2 (en) 2000-02-11 2004-03-23 Canesta, Inc. Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device
US6587099B2 (en) 2000-02-18 2003-07-01 Ricoh Company, Ltd. Coordinate input/detection device detecting installation position of light-receiving device used for detecting coordinates
US20010019325A1 (en) 2000-03-06 2001-09-06 Ricoh Company, Ltd. Optical coordinate input/detection device with optical-unit positioning error correcting function
US20010022579A1 (en) 2000-03-16 2001-09-20 Ricoh Company, Ltd. Apparatus for inputting coordinates
JP2001282457A (en) 2000-03-31 2001-10-12 Ricoh Co Ltd Coordinate input system, control method of coordinate input system and computer readable recording medium in which a program to make computer execute the method is recorded
US20010026268A1 (en) 2000-03-31 2001-10-04 Ricoh Company, Ltd. Coordiante input and detection device and information display and input apparatus
JP2001282456A (en) 2000-04-03 2001-10-12 Japan Science & Technology Corp Man-machine interface system
US7084868B2 (en) 2000-04-26 2006-08-01 University Of Louisville Research Foundation, Inc. System and method for 3-D digital reconstruction of an oral cavity from a sequence of 2-D images
US6864882B2 (en) 2000-05-24 2005-03-08 Next Holdings Limited Protected touch panel display system
US6690397B1 (en) 2000-06-05 2004-02-10 Advanced Neuromodulation Systems, Inc. System for regional data association and presentation and method for the same
US6690363B2 (en) 2000-06-19 2004-02-10 Next Holdings Limited Touch panel display system
EP1297488B1 (en) 2000-07-05 2006-11-15 Smart Technologies Inc. Camera-based touch system
WO2002003316A1 (en) 2000-07-05 2002-01-10 Smart Technologies Inc. Camera-based touch system
US20070075982A1 (en) 2000-07-05 2007-04-05 Smart Technologies, Inc. Passive Touch System And Method Of Detecting User Input
US6803906B1 (en) 2000-07-05 2004-10-12 Smart Technologies, Inc. Passive touch system and method of detecting user input
CA2412878A1 (en) 2000-07-05 2002-01-10 Smart Technologies Inc. Camera-based touch system
US7692625B2 (en) 2000-07-05 2010-04-06 Smart Technologies Ulc Camera-based touch system
US7236162B2 (en) 2000-07-05 2007-06-26 Smart Technologies, Inc. Passive touch system and method of detecting user input
US20050276448A1 (en) 2000-07-07 2005-12-15 Pryor Timothy R Multi-functional control and entertainment systems
US6531999B1 (en) 2000-07-13 2003-03-11 Koninklijke Philips Electronics N.V. Pointing direction calibration in video conferencing and other camera-based system applications
WO2002007073A2 (en) 2000-07-13 2002-01-24 Koninklijke Philips Electronics N.V. Pointing direction calibration in camera-based system applications
US20020050979A1 (en) 2000-08-24 2002-05-02 Sun Microsystems, Inc Interpolating sample values from known triangle vertex values
WO2002027461A1 (en) 2000-09-11 2002-04-04 Njoelstad Tormod Drawing, writing and pointing device
US6626718B2 (en) 2000-10-03 2003-09-30 Canon Kabushiki Kaisha Apparatus for manufacturing electron source, method for manufacturing electron source, and method for manufacturing image-forming apparatus
US20030046401A1 (en) 2000-10-16 2003-03-06 Abbott Kenneth H. Dynamically determining appropriate computer user interfaces
US6774889B1 (en) 2000-10-24 2004-08-10 Microsoft Corporation System and method for transforming an ordinary computer monitor screen into a touch screen
US6518600B1 (en) 2000-11-17 2003-02-11 General Electric Company Dual encapsulation for an LED
US20020067922A1 (en) 2000-12-02 2002-06-06 Harris Thomas H.S. Operator supported remote camera positioning and control system
US20020080123A1 (en) 2000-12-26 2002-06-27 International Business Machines Corporation Method for touchscreen data input
US20040108990A1 (en) 2001-01-08 2004-06-10 Klony Lieberman Data input device
JP2002236547A (en) 2001-02-07 2002-08-23 Ricoh Co Ltd Information input system
US6741250B1 (en) 2001-02-09 2004-05-25 Be Here Corporation Method and system for generation of multiple viewpoints into a scene viewed by motionless cameras and for presentation of a view path
US6497608B2 (en) 2001-02-09 2002-12-24 Sampo Technology Corp. Toy car camera system and rear vision mirrors
US7030861B1 (en) 2001-02-10 2006-04-18 Wayne Carl Westerman System and method for packing multi-touch gestures onto a hand
US6756910B2 (en) 2001-02-27 2004-06-29 Optex Co., Ltd. Sensor for automatic doors
US20020145595A1 (en) 2001-03-26 2002-10-10 Mitsuru Satoh Information input/output apparatus, information input/output control method, and computer product
US7176904B2 (en) 2001-03-26 2007-02-13 Ricoh Company, Limited Information input/output apparatus, information input/output control method, and computer product
US6911972B2 (en) 2001-04-04 2005-06-28 Matsushita Electric Industrial Co., Ltd. User interface device
US20020163530A1 (en) 2001-04-05 2002-11-07 Fujitsu Limited Of Kawasaki, Japan Image merging apparatus
US6517266B2 (en) 2001-05-15 2003-02-11 Xerox Corporation Systems and methods for hand-held printing on a surface or medium
US20030043116A1 (en) 2001-06-01 2003-03-06 Gerald Morrison Calibrating camera offsets to facilitate object position determination using triangulation
US6919880B2 (en) 2001-06-01 2005-07-19 Smart Technologies Inc. Calibrating camera offsets to facilitate object position determination using triangulation
US20030025951A1 (en) 2001-07-27 2003-02-06 Pollard Stephen Bernard Paper-to-computer interfaces
US20040150630A1 (en) 2001-08-29 2004-08-05 Microsoft Corporation Manual controlled scrolling
US20060274067A1 (en) 2001-09-14 2006-12-07 Hideo Hidai Image processing apparatus, display apparatus with touch panel, image processing method and computer program
US7007236B2 (en) 2001-09-14 2006-02-28 Accenture Global Services Gmbh Lab window collaboration
US20030071858A1 (en) 2001-09-28 2003-04-17 Hiroshi Morohoshi Information input and output system, method, storage medium, and carrier wave
JP2003173237A (en) 2001-09-28 2003-06-20 Ricoh Co Ltd Information input-output system, program and storage medium
US20030063073A1 (en) 2001-10-03 2003-04-03 Geaghan Bernard O. Touch panel system and method for distinguishing multiple touch inputs
US7414617B2 (en) 2001-10-09 2008-08-19 Eit Co., Ltd. Coordinate input device working with at least display screen and desk-top surface as the pointing areas thereof
US7202860B2 (en) 2001-10-09 2007-04-10 Eit Co., Ltd. Coordinate input device working with at least display screen and desk-top surface as the pointing areas thereof
US20030085871A1 (en) * 2001-10-09 2003-05-08 E-Business Information Technology Coordinate input device working with at least display screen and desk-top surface as the pointing areas thereof
JP2003158597A (en) 2001-11-21 2003-05-30 Mitsubishi Rayon Co Ltd Image display device provided with screen used for handwriting image entry face
JP2003167669A (en) 2001-11-22 2003-06-13 Internatl Business Mach Corp (IBM) Information processor, program, and coordinate input method
US20030095112A1 (en) 2001-11-22 2003-05-22 International Business Machines Corporation Information processing apparatus, program and coordinate input method
US7121470B2 (en) 2002-01-11 2006-10-17 Hand Held Products, Inc. Transaction terminal having elongated finger recess
US20030142880A1 (en) 2002-01-29 2003-07-31 Manabu Hyodo Image processing method, image processing apparatus, and electronic camera
US20030151562A1 (en) 2002-02-08 2003-08-14 Kulas Charles J. Computer display system using multiple screens
US20030151532A1 (en) 2002-02-13 2003-08-14 Hsin-Shu Chen Calibration of resistor ladder using difference measurement and parallel resistive correction
US20040021633A1 (en) 2002-04-06 2004-02-05 Rajkowski Janusz Wiktor Symbol encoding apparatus and method
US20040031779A1 (en) 2002-05-17 2004-02-19 Cahill Steven P. Method and system for calibrating a laser processing system and laser marking system utilizing same
US7015418B2 (en) 2002-05-17 2006-03-21 Gsi Group Corporation Method and system for calibrating a laser processing system and laser marking system utilizing same
CA2493236A1 (en) 2002-06-10 2003-12-18 Steven Montellese Apparatus and method for inputting data
WO2003105074A2 (en) 2002-06-10 2003-12-18 Steven Montellese Apparatus and method for inputting data
US20070126755A1 (en) 2002-06-19 2007-06-07 Microsoft Corporation System and Method for Whiteboard and Audio Capture
US7184030B2 (en) 2002-06-27 2007-02-27 Smart Technologies Inc. Synchronization of cameras in camera-based touch system to enhance position determination of fast moving objects
US6954197B2 (en) 2002-11-15 2005-10-11 Smart Technologies Inc. Size/scale and orientation determination of a pointer in a camera-based touch system
US7619617B2 (en) 2002-11-15 2009-11-17 Smart Technologies Ulc Size/scale and orientation determination of a pointer in a camera-based touch system
US20060022962A1 (en) 2002-11-15 2006-02-02 Gerald Morrison Size/scale and orientation determination of a pointer in a camera-based touch system
US6972401B2 (en) 2003-01-30 2005-12-06 Smart Technologies Inc. Illuminated bezel and touch system incorporating the same
US20040149892A1 (en) 2003-01-30 2004-08-05 Akitt Trevor M. Illuminated bezel and touch system incorporating the same
US20050190162A1 (en) 2003-02-14 2005-09-01 Next Holdings, Limited Touch screen signal processing
EP1450243A2 (en) 2003-02-19 2004-08-25 Agilent Technologies Inc Electronic device having an image-based data input system
US20040169639A1 (en) 2003-02-28 2004-09-02 Pate Michael A. Visible pointer tracking with separately detectable pointer tracking signal
US20040178993A1 (en) 2003-03-11 2004-09-16 Morrison Gerald D. Touch system and method for determining pointer contacts on a touch surface
US20040179001A1 (en) 2003-03-11 2004-09-16 Morrison Gerald D. System and method for differentiating between pointers used to contact touch surface
US6947032B2 (en) 2003-03-11 2005-09-20 Smart Technologies Inc. Touch system and method for determining pointer contacts on a touch surface
US20040189720A1 (en) 2003-03-25 2004-09-30 Wilson Andrew D. Architecture for controlling a computer using hand gestures
US20080062149A1 (en) 2003-05-19 2008-03-13 Baruch Itzhak Optical coordinate input device comprising few elements
US20040252091A1 (en) 2003-06-14 2004-12-16 Massachusetts Institute Of Technology Input device based on frustrated total internal reflection
US7190496B2 (en) 2003-07-24 2007-03-13 Zebra Imaging, Inc. Enhanced environment visualization using holographic stereograms
US20050052427A1 (en) 2003-09-10 2005-03-10 Wu Michael Chi Hung Hand gesture interaction with touch surface
US20050057524A1 (en) 2003-09-16 2005-03-17 Hill Douglas B. Gesture recognition method and touch system incorporating the same
US7274356B2 (en) 2003-10-09 2007-09-25 Smart Technologies Inc. Apparatus for determining the location of a pointer within a region of interest
US20070236454A1 (en) 2003-10-09 2007-10-11 Smart Technologies, Inc. Apparatus For Determining The Location Of A Pointer Within A Region Of Interest
US20050083308A1 (en) 2003-10-16 2005-04-21 Homer Steven S. Display for an electronic device
US7355593B2 (en) 2004-01-02 2008-04-08 Smart Technologies, Inc. Pointer tracking across multiple overlapping coordinate input sub-regions defining a generally contiguous input region
US20050151733A1 (en) 2004-01-09 2005-07-14 Microsoft Corporation Multi-chart geometry images
US7232986B2 (en) 2004-02-17 2007-06-19 Smart Technologies Inc. Apparatus for detecting a pointer within a region of interest
WO2005106775A1 (en) 2004-05-05 2005-11-10 Smart Technologies Inc. Apparatus and method for detecting a pointer relative to a touch surface
US20050248540A1 (en) 2004-05-07 2005-11-10 Next Holdings, Limited Touch panel display system with illumination and detection provided from a single edge
US20080129707A1 (en) 2004-07-27 2008-06-05 Pryor Timothy R Method and apparatus employing multi-functional controls and displays
US20060158437A1 (en) 2005-01-20 2006-07-20 Blythe Michael M Display device
US20060227120A1 (en) 2005-03-28 2006-10-12 Adam Eikman Photonic touch screen apparatus and method of use
WO2007003196A2 (en) 2005-07-05 2007-01-11 O-Pen Aps A touch pad system
US20070019103A1 (en) 2005-07-25 2007-01-25 Vkb Inc. Optical apparatus for virtual interface projection and sensing
US20070075648A1 (en) 2005-10-03 2007-04-05 Blythe Michael M Reflecting light
US20070116333A1 (en) 2005-11-18 2007-05-24 Dempski Kelly L Detection of multiple targets on a plane of interest
WO2007064804A1 (en) 2005-12-02 2007-06-07 General Electric Company Electroform, methods of making electroforms, and products made from electroforms
US20070139932A1 (en) 2005-12-20 2007-06-21 Industrial Technology Research Institute Light source package structure

Non-Patent Citations (45)

* Cited by examiner, † Cited by third party
Title
European Search Report for EP 02 25 3594 dated Dec. 14, 2005 (3 pages).
European Search Report for EP 04 25 1392 dated Jan. 11, 2007 (2 pages).
European Search Report for EP 06 01 9268 dated Nov. 9, 2006 (4 pages).
European Search Report for EP 06 01 9269 dated Nov. 9, 2006 (4 pages).
Förstner, Wolfgang, "On Estimating Rotations", Festschrift für Prof. Dr. -Ing. Heinrich Ebner Zum 60. Geburtstag, Herausg.: C. Heipke und H. Mayer, Lehrstuhl für Photogrammetrie und Fernerkundung, TU München, 1999, 12 pages. (http://www.ipb.uni-bonn.de/papers/#1999).
Funk, Bud K., CCD's in optical panels deliver high resolution, Electronic Design, Sep. 27, 1980, pp. 139-143.
Hartley, R. and Zisserman, A., "Multiple View Geometry in Computer Vision", Cambridge University Press, First published 2000, Reprinted (with corrections) 2001, pp. 70-73, 92-93, and 98-99.
International Search Report and Written Opinion for PCT/CA2004/001759 mailed Feb. 21, 2005 (7 Pages).
International Search Report and Written Opinion for PCT/CA2009/000773 mailed Aug. 12, 2009 (11 Pages).
International Search Report for PCT/CA01/00980 mailed Oct. 22, 2001 (3 Pages).
Jul. 5, 2010 Office Action, with English translation, for Japanese Patent Application No. 2005-000268 (6 pages).
Kanatani, K., "Camera Calibration", Geometric Computation for Machine Vision, Oxford Engineering Science Series, vol. 37, 1993, pp. 56-63.
May 12, 2009 Office Action for Canadian Patent Application No. 2,412,878 (4 pages).
NASA Small Business Innovation Research Program: Composite List of Projects 1983-1989, Aug. 1990.
Overview page for IntuiFace by IntuiLab, Copyright 2008.
Partial European Search Report for EP 03 25 7166 dated May 19, 2006 (4 pages).
Press Release, "IntuiLab introduces IntuiFace, An interactive table and its application platform" Nov. 30, 2007.
Tappert, C.C., et al., "On-Line Handwriting Recognition - A Survey", Proceedings of the International Conference on Pattern Recognition (ICPR), Rome, Nov. 14-17, 1988, IEEE Computer Society Press, Washington, US, vol. 2, Conf. 9, Nov. 14, 1988, pp. 1123-1132.
Touch Panel, vol. 1 No. 1 (2005).
Touch Panel, vol. 1 No. 10 (2006).
Touch Panel, vol. 1 No. 2 (2005).
Touch Panel, vol. 1 No. 3 (2006).
Touch Panel, vol. 1 No. 4 (2006).
Touch Panel, vol. 1 No. 5 (2006).
Touch Panel, vol. 1 No. 6 (2006).
Touch Panel, vol. 1 No. 7 (2006).
Touch Panel, vol. 1 No. 8 (2006).
Touch Panel, vol. 1 No. 9 (2006).
Touch Panel, vol. 2 No. 1 (2006).
Touch Panel, vol. 2 No. 2 (2007).
Touch Panel, vol. 2 No. 3 (2007).
Touch Panel, vol. 2 No. 4 (2007).
Touch Panel, vol. 2 No. 5 (2007).
Touch Panel, vol. 2 No. 6 (2007).
Touch Panel, vol. 2 No. 7-8 (2008).
Touch Panel, vol. 2 No. 9-10 (2008).
Touch Panel, vol. 3 No. 1-2 (2008).
Touch Panel, vol. 3 No. 3-4 (2008).
Touch Panel, vol. 3 No. 5-6 (2009).
Touch Panel, vol. 3 No. 7-8 (2009).
Touch Panel, vol. 3 No. 9 (2009).
Touch Panel, vol. 4 No. 2-3 (2009).
Villamor et al., "Touch Gesture Reference Guide", Apr. 15, 2010.
Wang, F., et al., "Stereo camera calibration without absolute world coordinate information", SPIE, vol. 2620, pp. 655-662, Jun. 14, 1995.
Wrobel, B., "minimum Solutions for Orientation", Calibration and Orientation of Cameras in Computer Vision, Springer Series in Information Sciences, vol. 34, 2001, pp. 28-33.

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090020342A1 (en) * 2007-07-18 2009-01-22 Smart Technologies Inc. Touch Panel And Interactive Input System Incorporating The Same
US8400407B2 (en) * 2007-07-18 2013-03-19 Smart Technologies Ulc Touch panel and interactive input system incorporating the same
US20110266074A1 (en) * 2010-04-29 2011-11-03 Au Optronics Corporation Camera based touch system
US8338725B2 (en) * 2010-04-29 2012-12-25 Au Optronics Corporation Camera based touch system
US20110316814A1 (en) * 2010-06-28 2011-12-29 Ming-Tsan Kao Optical distance determination device, optical touch monitoring system and method for measuring distance of a touch point on an optical touch panel
US9116578B2 (en) * 2010-06-28 2015-08-25 Pixart Imaging Inc. Optical distance determination device, optical touch monitoring system and method for measuring distance of a touch point on an optical touch panel
US20120212454A1 (en) * 2011-02-18 2012-08-23 Seiko Epson Corporation Optical position detecting device and display system provided with input function
US20120287083A1 (en) * 2011-05-12 2012-11-15 Yu-Yen Chen Optical touch control device and optical touch control system
US8537139B2 (en) * 2011-05-12 2013-09-17 Wistron Corporation Optical touch control device and optical touch control system
US20130179811A1 (en) * 2012-01-05 2013-07-11 Visteon Global Technologies, Inc. Projection dynamic icon knobs

Also Published As

Publication number Publication date
US20120327031A1 (en) 2012-12-27
US7342574B1 (en) 2008-03-11
US6674424B1 (en) 2004-01-06
JP2001125745A (en) 2001-05-11
JP4052498B2 (en) 2008-02-27

Similar Documents

Publication Publication Date Title
USRE43084E1 (en) Method and apparatus for inputting information including coordinate data
US11868543B1 (en) Gesture keyboard method and apparatus
AU603643B2 (en) Method and apparatus for isolating and manipulating graphic objects on computer video monitor
US8681124B2 (en) Method and system for recognition of user gesture interaction with passive surface video displays
US8339378B2 (en) Interactive input system with multi-angle reflector
JP4822643B2 (en) Computer presentation system and method with optical tracking of a wireless pointer
US6823481B2 (en) Optical coordinate input/detection device with optical-unit positioning error correcting function
CN1322329B (en) Input device using scanning sensors
US7486274B2 (en) Method for stabilizing and precisely locating pointers generated by handheld direct pointing devices
US20030004678A1 (en) System and method for providing a mobile input device
US20030165048A1 (en) Enhanced light-generated interface for use with electronic devices
US20100103099A1 (en) Pointing device using camera and outputting mark
US20030226968A1 (en) Apparatus and method for inputting data
CN107407959B (en) Manipulation of three-dimensional images based on gestures
CN102436327B (en) Screen input system and implementation method thereof
EP1889143A2 (en) Bounding box gesture recognition on a touch detecting interactive display
TW201441896A (en) Method of determining touch gesture and touch control system
WO2003079179A1 (en) Motion mouse system
EP1100040A2 (en) Optical digitizer using curved mirror
JP4335468B2 (en) Information input / output system, information control method, program, and recording medium
US20180032142A1 (en) Information processing apparatus, control method thereof, and storage medium
JP2018063555A (en) Information processing device, information processing method, and program
Matsubara et al. Touch detection method for non-display surface using multiple shadows of finger
KR100962511B1 (en) Electronic pen mouse and operating method thereof
CN105278760A (en) Optical touch-control system

Legal Events

Date Code Title Description
AS Assignment

Owner name: MORGAN STANLEY SENIOR FUNDING INC., NEW YORK

Free format text: SECURITY AGREEMENT;ASSIGNORS:SMART TECHNOLOGIES ULC;SMART TECHNOLOGIES INC.;REEL/FRAME:030935/0848

Effective date: 20130731

Owner name: MORGAN STANLEY SENIOR FUNDING, INC., NEW YORK

Free format text: SECURITY AGREEMENT;ASSIGNORS:SMART TECHNOLOGIES ULC;SMART TECHNOLOGIES INC.;REEL/FRAME:030935/0879

Effective date: 20130731

FPAY Fee payment

Year of fee payment: 8

AS Assignment

Owner name: SMART TECHNOLOGIES INC., CANADA

Free format text: RELEASE OF ABL SECURITY INTEREST;ASSIGNOR:MORGAN STANLEY SENIOR FUNDING, INC.;REEL/FRAME:040711/0956

Effective date: 20161003

Owner name: SMART TECHNOLOGIES INC., CANADA

Free format text: RELEASE OF TERM LOAN SECURITY INTEREST;ASSIGNOR:MORGAN STANLEY SENIOR FUNDING, INC.;REEL/FRAME:040713/0123

Effective date: 20161003

Owner name: SMART TECHNOLOGIES ULC, CANADA

Free format text: RELEASE OF ABL SECURITY INTEREST;ASSIGNOR:MORGAN STANLEY SENIOR FUNDING, INC.;REEL/FRAME:040711/0956

Effective date: 20161003

Owner name: SMART TECHNOLOGIES ULC, CANADA

Free format text: RELEASE OF TERM LOAN SECURITY INTEREST;ASSIGNOR:MORGAN STANLEY SENIOR FUNDING, INC.;REEL/FRAME:040713/0123

Effective date: 20161003

AS Assignment

Owner name: SMART TECHNOLOGIES ULC, CANADA

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:MORGAN STANLEY SENIOR FUNDING, INC.;REEL/FRAME:040798/0077

Effective date: 20161003

Owner name: SMART TECHNOLOGIES INC., CANADA

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:MORGAN STANLEY SENIOR FUNDING, INC.;REEL/FRAME:040798/0077

Effective date: 20161003

Owner name: SMART TECHNOLOGIES INC., CANADA

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:MORGAN STANLEY SENIOR FUNDING, INC.;REEL/FRAME:040819/0306

Effective date: 20161003

Owner name: SMART TECHNOLOGIES ULC, CANADA

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:MORGAN STANLEY SENIOR FUNDING, INC.;REEL/FRAME:040819/0306

Effective date: 20161003

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 12