US20080068352A1 - Apparatus for detecting a pointer within a region of interest - Google Patents
- Publication number: US20080068352A1
- Authority: US (United States)
- Prior art keywords: light, interest, region, imaging device, light source
- Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0421—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B5/00—Optical elements other than lenses
- G02B5/12—Reflex reflectors
- G02B5/122—Reflex reflectors cube corner, trihedral or triple reflector type
- G02B5/124—Reflex reflectors cube corner, trihedral or triple reflector type plural reflecting elements forming part of a unitary plate or sheet
Definitions
- an apparatus for detecting a pointer contact on a generally rectangular touch surface comprising:
- processing circuitry receiving and processing images acquired by said imaging devices to detect the existence of a pointer in said images and to determine the location of said pointer relative to said touch surface;
- illumination sources surrounding said touch surface and projecting light in a specified frequency range across said touch surface thereby to provide backlighting for said imaging devices, wherein said color imaging devices are sensitive to ambient light to capture color images and are sensitive to the light projected by said illumination sources to capture monochrome images.
- an apparatus for detecting a pointer within a region of interest comprising:
- at least two monochrome imaging devices having overlapping fields of view looking generally across said region of interest;
- processing circuitry receiving and processing images acquired by said imaging devices to detect the existence of a pointer in said images and to determine the location of said pointer relative to said region of interest;
- at least one illumination source projecting light across said region of interest; and
- at least one filter changing the frequency band of light in a cycle thereby to enable said imaging devices to capture images looking across said region of interest in different lighting conditions.
- the present invention provides advantages in that in one embodiment, backlight illumination is provided across the touch surface in an effective and cost efficient manner.
- the present invention provides further advantages in that since images looking across the region of interest can be acquired at different frequency bands of light, in addition to determining the location of the pointer, increased pointer attribute information can be easily obtained.
- FIG. 1 is a schematic diagram of an apparatus for detecting a pointer within a region of interest;
- FIG. 2 is a front elevation view of a touch screen forming part of the apparatus of FIG. 1 ;
- FIG. 3 is another front elevation view of the touch screen of FIG. 2 ;
- FIG. 4 is a schematic diagram of a digital camera forming part of the touch screen of FIG. 2 ;
- FIG. 5 is a schematic diagram of a master controller forming part of the apparatus of FIG. 1 ;
- FIG. 6 is a front elevational view of an alternative embodiment of the touch screen;
- FIG. 7 is a front elevational view of yet another embodiment of the touch screen.
- FIG. 8 is a graph showing the light sensitivity of digital cameras used in the touch screen of FIG. 7 .
- apparatus 50 is a camera-based touch system similar to that disclosed in International PCT Application Serial No. WO 02/03316, assigned to SMART Technologies Inc., assignee of the present invention, the content of which is incorporated herein by reference.
- touch system 50 includes a touch screen 52 coupled to a digital signal processor (DSP) based master controller 54 .
- Master controller 54 is also coupled to a computer 56 .
- Computer 56 executes one or more application programs and generates computer-generated image output that is presented on the touch screen 52 .
- the touch screen 52, master controller 54 and computer 56 form a closed loop so that pointer contacts made on the touch screen 52 can be recorded as writing or drawing or used to control execution of an application program executed by the computer 56.
- FIGS. 2 and 3 better illustrate the touch screen 52 .
- Touch screen 52 in the present embodiment includes a high-resolution display device such as a plasma display 58 , the front surface of which defines a touch surface 60 .
- the touch surface 60 is bordered by a bezel or frame 62 coupled to the display device.
- Corner pieces 68 that house DSP-based CMOS digital cameras 70 are located at each corner of the bezel 62 .
- Each digital camera 70 is mounted within its respective corner piece 68 so that its field of view encompasses and looks generally across the entire plane of the touch surface 60 .
- An infrared light source 72 is associated with and positioned adjacent each digital camera 70.
- Each light source 72 includes an array of infrared (IR) light emitting diodes (LEDs). The light emitting diodes project infrared lighting across the touch surface 60 .
- Polarizers 74 are provided in front of the digital cameras 70 and the infrared light sources 72 .
- the polarizers 74 at diagonally opposite corners of the touch surface 60 have opposite polarization orientations.
- the polarizers 74 at the top and bottom left corners of the touch surface 60 have a vertical orientation and the polarizers 74 at the top and bottom right corners of the touch surface 60 have a horizontal orientation.
- the polarizers 74 minimize the light from the diagonally opposite infrared light sources 72 that is seen by the digital cameras 70, i.e. they block the diagonally opposite infrared light sources 72 from the cameras' fields of view, thereby avoiding digital camera photo-saturation and other effects that reduce the effectiveness of the digital cameras 70.
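The blocking effect of the crossed polarizers can be sketched with Malus's law. The function below is an illustrative model only, not part of the patent; the vertical (90°) and horizontal (0°) orientations follow the corner arrangement described above.

```python
import math

def transmitted_fraction(light_angle_deg: float, axis_angle_deg: float) -> float:
    """Malus's law: the fraction of linearly polarized light passing a
    linear polarizer is cos^2(theta), where theta is the angle between
    the light's polarization and the polarizer's transmission axis."""
    theta = math.radians(axis_angle_deg - light_angle_deg)
    return math.cos(theta) ** 2

# A vertically polarized light source (90 deg) seen through:
aligned = transmitted_fraction(90, 90)  # the polarizer at its associated camera
crossed = transmitted_fraction(90, 0)   # the polarizer at the diagonal corner

print(aligned)             # 1.0
print(round(crossed, 12))  # 0.0 -- the diagonally opposite source is effectively invisible
```

Light scattered off a pointer loses this polarization, which is why the pointer itself remains visible through either filter.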
- the digital camera 70 includes a two-dimensional CMOS image sensor and associated lens assembly 80 , a first-in-first-out (FIFO) buffer 82 coupled to the image sensor and lens assembly 80 by a data bus and a digital signal processor (DSP) 84 coupled to the FIFO 82 by a data bus and to the image sensor and lens assembly 80 by a control bus.
- a boot EPROM 86 and a power supply subsystem 88 are also included.
- the CMOS camera image sensor is configured for a 20×640 pixel subarray that can be operated to capture image frames at high frame rates in excess of 200 frames per second since arbitrary pixel rows can be selected. Also, since the pixel rows can be arbitrarily selected, the pixel subarray can be exposed for a greater duration for a given digital camera frame rate allowing for good operation in dark rooms as well as well-lit rooms.
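The frame-rate benefit of reading only a small subarray can be roughed out from pixel throughput alone. The pixel clock and full-frame size below are hypothetical values chosen for illustration; real sensors also spend time on readout and blanking.

```python
def fps_bound(pixel_clock_hz: float, rows: int, cols: int) -> float:
    """Rough frame-rate ceiling from pixel throughput alone:
    fps <= pixel_clock / pixels_per_frame (readout and blanking
    overheads are ignored for simplicity)."""
    return pixel_clock_hz / (rows * cols)

# With a hypothetical 12.8 MHz pixel clock, reading only the 20 x 640
# subarray instead of a full 480 x 640 frame raises the ceiling 24-fold,
# which is how high frame rates and longer exposures become possible.
print(fps_bound(12.8e6, 20, 640))   # 1000.0
print(fps_bound(12.8e6, 480, 640))  # ~41.7
```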
- the DSP 84 provides control information to the image sensor and lens assembly 80 via the control bus.
- the control information allows the DSP 84 to control parameters of the image sensor and lens assembly 80 such as exposure, gain, array configuration, reset and initialization.
- the DSP 84 also provides clock signals to the image sensor and lens assembly 80 to control the frame rate of the image sensor and lens assembly 80 .
- An infrared pass filter 89 is provided on the image sensor and lens assembly 80 to blind the digital camera 70 to frequencies of light outside the infrared range.
- Master controller 54 is better illustrated in FIG. 5 and includes a DSP 90 , a boot EPROM 92 , a serial line driver 94 and a power supply subsystem 95 .
- the DSP 90 communicates with the DSPs 84 of the digital cameras 70 over a data bus via a serial port 96 and communicates with the computer 56 over a data bus via a serial port 98 and the serial line driver 94 .
- the master controller 54 and each digital camera 70 follow a communication protocol that enables bidirectional communications via a common serial cable similar to a universal serial bus (USB). Communications between the master controller 54 and the digital cameras 70 are performed as background processes in response to interrupts.
- the infrared light source 72 associated with each digital camera 70 generates infrared light that is projected across the touch surface 60 covering an area at least as large as the field of view of the associated digital camera.
- the polarizers 74 at opposite diagonal corners of the touch surface 60 inhibit the infrared light source 72 diagonally opposite each digital camera 70 from blinding that digital camera due to the different polarization orientations of the polarizers 74 .
- Infrared light that impinges on a polarizer 74 and is polarized differently from the polarizer's orientation is blocked. In this manner, the digital camera 70 behind each polarizer 74 in effect does not see the infrared light source 72 at the diagonally opposite corner.
- Each digital camera 70 acquires images looking across the touch surface 60 within the field of view of its image sensor and lens assembly 80 at a desired frame rate and processes each acquired image to determine if a pointer is in the acquired image.
- the pointer is illuminated by the light projected by the infrared light sources 72 .
- Light reflecting off of the pointer typically does not maintain its polarization and therefore is visible to the digital cameras 70 . Therefore, the illuminated pointer appears as a high-contrast bright region interrupting a dark band in each captured image allowing the existence of the pointer in the captured images to be readily detected.
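The detection idea above, a high-contrast bright region interrupting a dark band, can be sketched on a single scan line. This is a simplified illustration rather than the patent's actual processing; real processing works on the 2-D frame, and in occlusion mode the comparison is simply inverted (a dark run in a bright band).

```python
def find_pointer(scan_line, threshold):
    """Return the centre pixel of the widest run of values above
    threshold in a scan line, or None if no bright run exists."""
    best_start, best_len = None, 0
    start = None
    for i, v in enumerate(scan_line + [threshold]):  # sentinel closes the last run
        if v > threshold and start is None:
            start = i
        elif v <= threshold and start is not None:
            if i - start > best_len:
                best_start, best_len = start, i - start
            start = None
    if best_start is None:
        return None
    return best_start + best_len // 2

dark_band = [10] * 12
frame = dark_band[:4] + [200, 230, 210] + dark_band[7:]  # bright pointer at pixels 4-6
print(find_pointer(frame, 100))  # 5
```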
- Pointer information packets (PIPs) including pointer characteristics, status and/or diagnostic information are then generated by the digital cameras 70 and the PIPs are queued for transmission to the master controller 54.
- the master controller 54 polls the digital cameras 70 for PIPs. If the PIPs include pointer characteristic information, the master controller 54 triangulates pointer characteristics in the PIPs to determine the position of the pointer relative to the touch surface 60 in Cartesian rectangular coordinates. The master controller 54 in turn transmits calculated pointer position data, status and/or diagnostic information to the computer 56. In this manner, the pointer position data transmitted to the computer 56 can be recorded as writing or drawing or can be used to control execution of an application program executed by the computer 56. The computer 56 also updates the computer-generated image output conveyed to the plasma display 58 so that information presented on the touch surface 60 reflects the pointer activity.
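The triangulation step above can be sketched as intersecting two rays cast from known camera positions. The coordinate and angle conventions below are assumptions for illustration; the patent does not fix them.

```python
import math

def triangulate(cam0, cam1, angle0, angle1):
    """Intersect two rays cast from camera positions at the given angles
    (radians, measured from the positive x axis) to recover the pointer's
    (x, y) position."""
    x0, y0 = cam0
    x1, y1 = cam1
    # Ray i: (x, y) = cam_i + t_i * (cos a_i, sin a_i). Solve for t0 via
    # the 2-D cross product of the direction vectors.
    d0 = (math.cos(angle0), math.sin(angle0))
    d1 = (math.cos(angle1), math.sin(angle1))
    denom = d0[0] * d1[1] - d0[1] * d1[0]
    if abs(denom) < 1e-12:
        raise ValueError("rays are parallel; no unique intersection")
    t0 = ((x1 - x0) * d1[1] - (y1 - y0) * d1[0]) / denom
    return (x0 + t0 * d0[0], y0 + t0 * d0[1])

# Cameras at the top-left and top-right corners of a hypothetical
# 100 x 80 surface (y increasing downward); pointer at (50, 40).
pt = triangulate((0, 0), (100, 0),
                 math.atan2(40, 50), math.atan2(40, -50))
print(round(pt[0], 6), round(pt[1], 6))  # 50.0 40.0
```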
- infrared light sources 72 and polarizers 74 at the corners of the touch surface 60 inhibit light sources in the fields of view of the digital cameras from blinding the digital cameras.
- Touch screen 152 is similar to that of the previous embodiment but in this case the bezel 162 is designed to allow the touch screen 152 to operate in an occlusion mode.
- the bezel 162 in this embodiment includes elongate retro-reflectors 164 bordering the sides of the touch surface 160.
- the retro-reflectors 164 have retro-reflecting surfaces 166 lying in planes that are generally normal to the plane of the touch surface 160 .
- the retro-reflectors 164 are designed to maintain polarization of light impinging thereon.
- corner cube retro-reflectors such as those manufactured by Reflexite Corporation and sold under the name Reflexite™ AP1000 that preserve polarization are used.
- when infrared light generated by the infrared light sources 172 travels across the touch surface and impinges on one or more retro-reflectors 164, the retro-reflectors 164 in turn reflect the infrared light back in the opposite direction while maintaining the polarization of the infrared light. Since the infrared light sources 172 are mounted adjacent the digital cameras 170, infrared light reflected by the retro-reflectors 164 is aimed back towards the digital cameras 170. As a result, each digital camera 170 sees a bright band of illumination within its field of view.
- the digital cameras 170 see bright bands of illumination.
- the pointer occludes the infrared illumination and therefore appears as a high-contrast dark region interrupting a bright band of illumination in each captured image allowing the existence of the pointer in the captured images to be readily detected.
- the embodiments of the touch screen described above show digital cameras, infrared light sources and polarizers at each corner of the touch screen.
- Those of skill in the art will appreciate that only two imaging devices having overlapping fields of view are required.
- the infrared light sources need not be positioned adjacent the digital cameras.
- other types of filters may be used to inhibit the digital cameras from being blinded by a light source within their fields of view.
- any filter type device that blocks light projected by a light source within the field of view of the digital camera based on a characteristic (i.e. polarization, frequency etc.) of the projected light may be used.
- although each light source is described as including an array of IR LEDs, those of skill in the art will appreciate that other light source configurations to provide light illumination across the touch surface can be used.
- although the touch system 50 has been described as including a plasma display 58 to present images on the touch surface, those of skill in the art will appreciate that this is not required.
- the touch screen may be a rear or front projection display device or virtually any surface on which a computer generated image is projected.
- the touch system 50 may be a writeboard where images are not projected on the touch surface.
- although the touch system 50 is described as including a master controller 54 separate from the digital cameras, if desired one of the digital cameras can be conditioned to function as both a camera and the master controller and poll the other digital cameras for PIPs.
- the digital camera functioning as the master controller may include a faster DSP 84 than the remaining digital cameras.
- touch screen 252 includes a high-resolution display device such as a plasma display 258 , the front surface of which defines a touch surface 260 .
- the touch surface 260 is bordered by an illuminated bezel or frame 262 coupled to the display device.
- Illuminated bezel 262 is of the type disclosed in U.S. patent application Ser. No. 10/354,168 to Akitt et al., assigned to SMART Technologies Inc., assignee of the present invention, the content of which is incorporated by reference.
- Illuminated bezel 262 includes elongate side frame assemblies 264 that are coupled to the sides of the plasma display 258 .
- Each side frame assembly 264 accommodates a generally continuous infrared illumination source 266 .
- the ends of the side frame assemblies 264 are joined by corner pieces 268 that house DSP-based CMOS digital cameras 270 .
- Each digital camera 270 is mounted within its respective corner piece 268 so that its field of view encompasses and looks generally across the entire touch surface 260 .
- Each infrared illumination source 266 includes an array of IR LEDs (not shown) that project light onto a diffuser (not shown).
- the diffuser in turn diffuses and expands the infrared light emitted by the IR LEDs so that adequate infrared backlighting is projected across the touch surface 260.
- the illuminated bezels 262 appear as generally continuous bright bands of illumination to the digital cameras 270.
- the image sensors used in the digital cameras 270 are color CMOS image sensors and do not include IR pass filters.
- FIG. 8 shows the light sensitivity of one of the image sensors. As can be seen, the sensitivity of the image sensor to red, green and blue light is localized around the appropriate frequencies. However, for light in the infrared range, i.e. about 850 nm, the color filters of the image sensors become transparent, making the sensitivity of all of the pixels of the image sensors basically equal. This characteristic of the image sensor allows the touch screen to be operated in a number of modes depending on ambient light levels, as will now be described.
- the illuminated bezels 262 are switched off allowing color images to be acquired by the digital cameras 270 .
- acquired color information is used to enhance pointer recognition and scene understanding.
- the foreground object, i.e. the pointer, is the object of interest.
- the illuminated bezels 262 are switched on.
- the touch screen 252 operates in an occlusion mode as described previously.
- Pointer data is developed from images captured by the image sensors and processed in the manner discussed above.
- although touch screen 252 has been described as using infrared illumination to provide backlighting, those of skill in the art will appreciate that light in a frequency range other than infrared may be used provided the image sensors in the digital cameras have sufficient quantum efficiency at that frequency range to capture images.
- infrared illumination can be multiplexed with ambient light to enable the digital cameras 270 to capture different types of images.
- the illuminated bezels 262 can be strobed so that one or more images are captured by the digital cameras 270 in ambient light conditions and then in infrared backlighting conditions. The strobing may be achieved by switching the illuminated bezels 262 on and off and relying on ambient light levels in the off condition.
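The strobing scheme amounts to tagging each captured frame with the lighting condition it was acquired under. A minimal sketch, assuming a simple alternating on/off cycle (the actual duty cycle is not specified):

```python
def frame_schedule(n_frames, period=2):
    """Label each frame by lighting condition when the illuminated bezel
    is strobed: bezel on (IR backlight, effectively monochrome images)
    alternating with bezel off (ambient light, colour images)."""
    return ["ir_backlight" if i % period == 0 else "ambient"
            for i in range(n_frames)]

print(frame_schedule(4))  # ['ir_backlight', 'ambient', 'ir_backlight', 'ambient']
```

Frames tagged `ir_backlight` would feed the occlusion-mode pointer detection, while `ambient` frames supply the color information used for pointer recognition and scene understanding.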
- the illumination source may include a white light source and a light filter in the form of a wheel that is rotatable in front of the light source.
- the wheel may include alternating infrared and clear sections. When a clear section is presented in front of the light source, white light is projected across the touch surface and when an infrared section is presented in front of the light source, infrared light is projected across the touch surface.
- the wheel may include infrared, blue, red and green sections arranged about the wheel. Depending on the section of the wheel positioned in front of the light source, light in a different frequency band is projected across the touch surface allowing one or more images to be captured during each type of illumination.
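The wheel-to-frequency-band mapping above can be sketched as a lookup by wheel angle. Equal-sized sections and this particular ordering are assumptions for illustration; the patent specifies neither.

```python
def active_section(angle_deg, sections=("infrared", "blue", "red", "green")):
    """Return which filter section of a rotating wheel sits in front of
    the light source at a given wheel angle, assuming equal-sized
    sections laid out in order around the wheel."""
    span = 360.0 / len(sections)
    return sections[int((angle_deg % 360.0) // span)]

print(active_section(0))    # 'infrared'
print(active_section(100))  # 'blue'
```

The two-section infrared/clear wheel described earlier is the same idea with `sections=("infrared", "clear")`; one or more images would be captured while each section is in front of the source.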
- color wheels may be disposed in front of the digital cameras rather than adjacent the light source.
Abstract
An apparatus for detecting a pointer within a region of interest includes at least one pair of imaging devices. The imaging devices have overlapping fields of view encompassing the region of interest. At least one light source provides illumination across the region of interest and is within the field of view of at least one of the imaging devices. A filter is associated with the at least one imaging device whose field of view sees the light source. The filter blocks light projected by the light source to inhibit the imaging device from being blinded by the projected light.
Description
- The present invention relates generally to interactive systems and in particular to an apparatus for detecting a pointer within a region of interest.
- Touch systems are well known in the art and typically include a touch screen having a touch surface on which contacts are made using a pointer such as for example a pen tool, finger or other suitable object. Pointer contacts with the touch surface are detected and are used to generate output pointer position data representing areas of the touch surface where pointer contacts are made.
- International PCT Application No. PCT/CA01/00980 filed on Jul. 5, 2001 and published under number WO 02/03316 on Jan. 10, 2002, assigned to SMART Technologies Inc., assignee of the present invention, discloses a passive camera-based touch system. The camera-based touch system comprises a touch screen that includes a touch surface on which a computer-generated image is presented. A rectangular bezel or frame surrounds the touch surface and supports digital cameras at its corners. The digital cameras have overlapping fields of view that encompass and look across the touch surface. The digital cameras acquire images of the touch surface from different locations and generate image data. The image data is processed by digital signal processors to determine if a pointer exists in the captured image data. When it is determined that a pointer exists in the captured image data, the digital signal processors convey pointer characteristic data to a master controller, which in turn processes the pointer characteristic data to determine the location of the pointer in (x,y) coordinates relative to the touch surface using triangulation. The pointer location data is conveyed to a computer executing one or more application programs. The computer uses the pointer location data to update the computer-generated image that is presented on the touch surface. Pointer contacts on the touch surface can therefore be recorded as writing or drawing or used to control execution of an application program executed by the computer.
- Although this camera-based touch system works extremely well, it has been found that when the digital camera frame rates are high, in less favorable light conditions, the ability to determine the existence of a pointer in the captured image data is diminished. As a result, there exists a need to improve the lighting environment for the digital cameras to ensure high resolution irrespective of ambient lighting conditions.
- U.S. patent application Ser. No. 10/354,168 to Akitt et al. entitled “Illuminated Bezel And Touch System Incorporating The Same”, assigned to SMART Technologies Inc., assignee of the present invention, discloses an illuminated bezel for use in the above-described camera-based touch system. The illuminated bezel projects infrared backlighting across the touch surface that is visible to the digital cameras. As a result, when no pointer is positioned within the fields of view of the digital cameras, the digital cameras see bright bands of illumination as a result of the projected backlighting. When a pointer is positioned within the fields of view of the digital cameras, the pointer occludes the backlight illumination. Therefore, in each captured image the pointer appears as a high-contrast dark region interrupting the bright band of illumination allowing the existence of the pointer in the captured image to be readily detected.
- Although the illuminated bezel works very well, because the illuminated bezel completely surrounds the touch surface and makes use of an array of infrared light emitting diodes mounted on a printed circuit board that is disposed behind a diffuser, manufacturing costs are significant especially in cases where the illuminated bezel surrounds large touch surfaces. As will be appreciated, lower cost backlight illumination for touch systems of this nature is desired.
- Also, although the existence of the pointer in captured images can be readily detected, currently the use of monochrome digital cameras to capture images increases costs and provides limited information concerning attributes of the pointer used to contact the touch system.
- It is therefore an object of the present invention to provide a novel apparatus for detecting a pointer within a region of interest.
- Accordingly, in one aspect of the present invention, there is provided an apparatus for detecting a pointer within a region of interest comprising:
- at least one pair of imaging devices, said imaging devices having overlapping fields of view encompassing said region of interest;
- at least one light source providing illumination across said region of interest and being within the field of view of at least one of said imaging devices; and
- a filter associated with the at least one imaging device whose field of view sees said light source, said filter blocking light projected by said light source to inhibit said imaging device from being blinded by said projected light.
- In one embodiment, the filter blocks light having a characteristic different from a characteristic assigned to the at least one imaging device. The characteristic may be one of polarization and frequency. The apparatus may include a light source associated with each imaging device, with each light source being in the field of view of the non-associated imaging device. Light projected by each light source is visible to its associated imaging device but is blocked by the filter associated with the non-associated imaging device.
- The region of interest may overlie a touch surface on which pointer contacts are made, with imaging devices and associated light sources being provided adjacent each corner of the touch surface.
- According to another aspect of the present invention there is provided an apparatus for detecting a pointer within a region of interest comprising:
- at least one pair of imaging devices, said imaging devices having overlapping fields of view looking generally across said region of interest;
- a light source associated with each imaging device, each said light source providing illumination across said region of interest and being in the field of view of the non-associated imaging device; and
- a filter device associated with each imaging device so that substantially only light projected by the light source associated therewith is received by said associated imaging device.
- According to still yet another aspect of the present invention there is provided an apparatus for detecting a pointer within a region of interest comprising:
- an imaging device adjacent at least two corners of said region of interest, the imaging devices having overlapping fields of view looking generally across said region of interest, said imaging devices being configured to capture light having a particular characteristic; and
- a light source associated with each imaging device, each said light source projecting light across said region of interest having a characteristic of the type capturable by said associated imaging device.
- According to still yet another aspect of the present invention there is provided an apparatus for detecting a pointer within a region of interest comprising:
- at least two color imaging devices having overlapping fields of view looking generally across said region of interest;
- processing circuitry receiving and processing images acquired by said imaging devices to detect the existence of a pointer in said images and to determine the location of said pointer relative to said region of interest; and
- at least one illumination source projecting light in a specified frequency range across said region of interest thereby to provide lighting for said imaging devices, wherein said color imaging devices are sensitive to ambient light to capture color images and are sensitive to the light projected by said at least one illumination source to capture monochrome images.
- According to still yet another aspect of the present invention there is provided an apparatus for detecting a pointer contact on a generally rectangular touch surface comprising:
- a color imaging device at each corner of said touch surface and having a field of view looking generally across said touch surface;
- processing circuitry receiving and processing images acquired by said imaging devices to detect the existence of a pointer in said images and to determine the location of said pointer relative to said region of interest; and
- illumination sources surrounding said touch surface and projecting light in a specified frequency range across said touch surface thereby to provide backlighting for said imaging devices, wherein said color imaging devices are sensitive to ambient light to capture color images and are sensitive to the light projected by said illumination sources to capture monochrome images.
- According to still yet another aspect of the present invention there is provided an apparatus for detecting a pointer within a region of interest comprising:
- at least two monochrome imaging devices having overlapping fields of view looking generally across said region of interest;
- processing circuitry receiving and processing images acquired by said imaging devices to detect the existence of a pointer in said images and to determine the location of said pointer relative to said region of interest; and
- at least one illumination source projecting light across said region of interest; and
- at least one filter changing the frequency band of light in a cycle thereby to enable said imaging devices to capture images looking across said region of interest in different lighting conditions.
- The present invention provides advantages in that, in one embodiment, backlight illumination is provided across the touch surface in an effective and cost-efficient manner. The present invention provides further advantages in that, since images looking across the region of interest can be acquired at different frequency bands of light, in addition to determining the location of the pointer, increased pointer attribute information can be easily obtained.
- Embodiments of the present invention will now be described more fully with reference to the accompanying drawings in which:
-
FIG. 1 is a schematic diagram of an apparatus for detecting a pointer within a region of interest; -
FIG. 2 is a front elevation view of a touch screen forming part of the apparatus of FIG. 1; -
FIG. 3 is another front elevation view of the touch screen of FIG. 2; -
FIG. 4 is a schematic diagram of a digital camera forming part of the touch screen of FIG. 2; -
FIG. 5 is a schematic diagram of a master controller forming part of the apparatus of FIG. 1; -
FIG. 6 is a front elevational view of an alternative embodiment of the touch screen; -
FIG. 7 is a front elevational view of yet another embodiment of the touch screen; and -
FIG. 8 is a graph showing the light sensitivity of digital cameras used in the touch screen of FIG. 7. - Turning now to FIGS. 1 to 3, an apparatus for detecting a pointer within a region of interest in accordance with the present invention is shown and is generally identified by
reference numeral 50. In this embodiment, apparatus 50 is a camera-based touch system similar to that disclosed in International PCT Application Serial No. WO 02/03316, assigned to SMART Technologies Inc., assignee of the present invention, the content of which is incorporated herein by reference. As can be seen, touch system 50 includes a touch screen 52 coupled to a digital signal processor (DSP) based master controller 54. Master controller 54 is also coupled to a computer 56. Computer 56 executes one or more application programs and generates computer-generated image output that is presented on the touch screen 52. The touch screen 52, master controller 54 and computer 56 form a closed loop so that pointer contacts made on the touch screen 52 can be recorded as writing or drawing or used to control execution of an application program executed by the computer 56. -
FIGS. 2 and 3 better illustrate the touch screen 52. Touch screen 52 in the present embodiment includes a high-resolution display device such as a plasma display 58, the front surface of which defines a touch surface 60. The touch surface 60 is bordered by a bezel or frame 62 coupled to the display device. Corner pieces 68 that house DSP-based CMOS digital cameras 70 are located at each corner of the bezel 62. Each digital camera 70 is mounted within its respective corner piece 68 so that its field of view encompasses and looks generally across the entire plane of the touch surface 60. - An infrared
light source 72 is associated with and positioned adjacent each digital camera 70. Each light source 72 includes an array of infrared (IR) light emitting diodes (LEDs). The light emitting diodes project infrared lighting across the touch surface 60. -
Polarizers 74 are provided in front of the digital cameras 70 and the infrared light sources 72. The polarizers 74 at opposite corners of the touch surface 60 have opposite polarization orientations. For example, in this embodiment, the polarizers 74 at the top and bottom left corners of the touch surface 60 have a vertical orientation and the polarizers 74 at the top and bottom right corners of the touch surface 60 have a horizontal orientation. In this manner, the polarizers 74 minimize the light projected by the diagonally opposite infrared light sources 72 that is seen by the digital cameras 70, i.e. they block the diagonally opposite infrared light sources 72 from their fields of view, thereby avoiding digital camera photo-saturation and other effects that reduce the effectiveness of the digital cameras 70. - One of the
digital cameras 70 within a corner piece 68 is shown in FIG. 4. As can be seen, the digital camera 70 includes a two-dimensional CMOS image sensor and associated lens assembly 80, a first-in-first-out (FIFO) buffer 82 coupled to the image sensor and lens assembly 80 by a data bus and a digital signal processor (DSP) 84 coupled to the FIFO 82 by a data bus and to the image sensor and lens assembly 80 by a control bus. A boot EPROM 86 and a power supply subsystem 88 are also included. In the present embodiment, the CMOS camera image sensor is configured for a 20×640 pixel subarray that can be operated to capture image frames at high frame rates in excess of 200 frames per second since arbitrary pixel rows can be selected. Also, since the pixel rows can be arbitrarily selected, the pixel subarray can be exposed for a greater duration for a given digital camera frame rate, allowing for good operation in dark rooms as well as well-lit rooms. - The
DSP 84 provides control information to the image sensor and lens assembly 80 via the control bus. The control information allows the DSP 84 to control parameters of the image sensor and lens assembly 80 such as exposure, gain, array configuration, reset and initialization. The DSP 84 also provides clock signals to the image sensor and lens assembly 80 to control the frame rate of the image sensor and lens assembly 80. - An
infrared pass filter 89 is provided on the image sensor and lens assembly 80 to blind the digital camera 70 to frequencies of light outside the infrared range. -
Master controller 54 is better illustrated in FIG. 5 and includes a DSP 90, a boot EPROM 92, a serial line driver 94 and a power supply subsystem 95. The DSP 90 communicates with the DSPs 84 of the digital cameras 70 over a data bus via a serial port 96 and communicates with the computer 56 over a data bus via a serial port 98 and the serial line driver 94. - The
master controller 54 and each digital camera 70 follow a communication protocol that enables bidirectional communications via a common serial cable similar to a universal serial bus (USB). Communications between the master controller 54 and the digital cameras 70 are performed as background processes in response to interrupts. - The operation of the
touch system 50 will now be described. To provide appropriate lighting for the digital cameras 70, the infrared light source 72 associated with each digital camera 70 generates infrared light that is projected across the touch surface 60 covering an area at least as large as the field of view of the associated digital camera. - As mentioned previously, the
polarizers 74 at opposite diagonal corners of the touch surface 60 inhibit the infrared light source 72 diagonally opposite each digital camera 70 from blinding that digital camera due to the different polarization orientations of the polarizers 74. Infrared light impinging on a polarizer 74 that is polarized in a manner different from the polarization orientation of the polarizer is blocked. In this manner, the digital camera 70 behind each polarizer 74 in effect does not see the infrared light source 72 at the diagonally opposite corner. - Each
digital camera 70 acquires images looking across the touch surface 60 within the field of view of its image sensor and lens assembly 80 at a desired frame rate and processes each acquired image to determine if a pointer is in the acquired image. When a pointer is positioned within the fields of view of the digital cameras 70, the pointer is illuminated by the light projected by the infrared light sources 72. Light reflecting off of the pointer typically does not maintain its polarization and therefore is visible to the digital cameras 70. Therefore, the illuminated pointer appears as a high-contrast bright region interrupting a dark band in each captured image, allowing the existence of the pointer in the captured images to be readily detected. - If a pointer is in the acquired image, the image is further processed to determine characteristics of the pointer contacting or hovering above the
touch surface 60. Pointer information packets (PIPs) including pointer characteristics, status and/or diagnostic information are then generated by the digital cameras 70 and the PIPs are queued for transmission to the master controller 54. - The
master controller 54 polls the digital cameras 70 for PIPs. If the PIPs include pointer characteristic information, the master controller 54 triangulates pointer characteristics in the PIPs to determine the position of the pointer relative to the touch surface 60 in Cartesian rectangular coordinates. The master controller 54 in turn transmits calculated pointer position data, status and/or diagnostic information to the computer 56. In this manner, the pointer position data transmitted to the computer 56 can be recorded as writing or drawing or can be used to control execution of an application program executed by the computer 56. The computer 56 also updates the computer-generated image output conveyed to the plasma display 58 so that information presented on the touch surface 60 reflects the pointer activity. - Specifics concerning the processing of acquired images and the triangulation of pointer characteristics in PIPs are described in U.S. patent application Ser. No. 10/294,917 to Morrison et al., assigned to SMART Technologies Inc., assignee of the present invention, the content of which is incorporated herein by reference. Accordingly, specifics will not be described further herein.
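The triangulation step can be sketched as intersecting two rays, one from each digital camera, in the plane of the touch surface. The camera positions, angles and coordinate convention below are hypothetical illustrations; the actual processing is that of the incorporated Morrison et al. application:

```python
import math

def triangulate(cam0, cam1, angle0, angle1):
    """Intersect two rays, one from each camera position, to find the
    pointer location in Cartesian coordinates.

    cam0, cam1     -- (x, y) camera positions
    angle0, angle1 -- ray angles in radians, measured from the positive x-axis
    """
    x0, y0 = cam0
    x1, y1 = cam1
    # Unit direction vectors of the two rays.
    d0 = (math.cos(angle0), math.sin(angle0))
    d1 = (math.cos(angle1), math.sin(angle1))
    # Solve cam0 + t*d0 = cam1 + s*d1 for t via the 2x2 determinant.
    denom = d0[0] * d1[1] - d0[1] * d1[0]
    if abs(denom) < 1e-9:
        raise ValueError("rays are parallel; pointer position undefined")
    t = ((x1 - x0) * d1[1] - (y1 - y0) * d1[0]) / denom
    return (x0 + t * d0[0], y0 + t * d0[1])

# Cameras at the bottom-left and bottom-right corners of a hypothetical
# 100 x 75 surface, each seeing the pointer 45 degrees into the surface:
print(triangulate((0, 0), (100, 0), math.radians(45), math.radians(135)))
# -> approximately (50.0, 50.0)
```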
- As will be appreciated, the use of infrared
light sources 72 and polarizers 74 at the corners of the touch surface 60 inhibits light sources in the fields of view of the digital cameras from blinding the digital cameras. - Turning now to
FIG. 6, another embodiment of a touch screen is shown and is generally identified by reference numeral 152. Touch screen 152 is similar to that of the previous embodiment but in this case the bezel 162 is designed to allow the touch screen 152 to operate in an occlusion mode. As can be seen, bezel 162, in this embodiment, includes elongate retro-reflectors 164 bordering the sides of the touch surface 160. The retro-reflectors 164 have retro-reflecting surfaces 166 lying in planes that are generally normal to the plane of the touch surface 160. - The retro-
reflectors 164 are designed to maintain polarization of light impinging thereon. In the present embodiment, corner cube retro-reflectors such as those manufactured by Reflexite Corporation and sold under the name Reflexite™ AP1000 that preserve polarization are used. - In this embodiment, when infrared light generated by the infrared
light sources 172 travels across the touch surface and impinges on one or more retro-reflectors 164, the retro-reflectors 164 in turn reflect the infrared light back in the opposite direction while maintaining the polarization of the infrared light. Since the infrared light sources 172 are mounted adjacent the digital cameras 170, infrared light reflected by the retro-reflectors 164 is aimed back towards the digital cameras 170. As a result, each digital camera 170 sees a bright band of illumination within its field of view. - During image acquisition, when no pointer is positioned within the fields of view of the
digital cameras 170, the digital cameras 170 see bright bands of illumination. When a pointer is positioned within the fields of view of the digital cameras 170, the pointer occludes the infrared illumination and therefore appears as a high-contrast dark region interrupting a bright band of illumination in each captured image, allowing the existence of the pointer in the captured images to be readily detected. - The embodiments of the touch screen described above show digital cameras, infrared light sources and polarizers at each corner of the touch screen. Those of skill in the art will appreciate that only two imaging devices having overlapping fields of view are required. Also, the infrared light sources need not be positioned adjacent the digital cameras. In addition, other types of filters may be used to inhibit the digital cameras from being blinded by a light source within their fields of view. Basically, any filter type device that blocks light projected by a light source within the field of view of the digital camera based on a characteristic (i.e. polarization, frequency, etc.) of the projected light may be used.
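In the occlusion mode just described, locating the pointer along one image row reduces to finding the dark interruption in the bright band. A minimal sketch, assuming 8-bit pixel values and a hypothetical threshold (the actual image processing is that of the incorporated Morrison et al. application):

```python
def find_occlusion_column(scanline, threshold=100):
    """Occlusion mode: the retro-reflective bezel appears as a bright band,
    and the pointer shows up as a dark gap. Returns the gap's centre column,
    or None when no pointer is present. Pixel values are 0-255."""
    dark = [i for i, v in enumerate(scanline) if v <= threshold]
    if not dark:
        return None
    # Centre of the dark run approximates the pointer's column.
    return sum(dark) / len(dark)

# A bright band of 240s interrupted by three dark pixels at columns 20-22:
row = [240] * 20 + [15, 10, 12] + [240] * 20
print(find_occlusion_column(row))  # 21.0
```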
- In addition, although each light source is described as including an array of IR LEDs, those of skill in the art will appreciate that other light source configurations that provide illumination across the touch surface can be used.
- Although the
touch system 50 has been described as including a plasma display 58 to present images on the touch surface, those of skill in the art will appreciate that this is not required. The touch screen may be a rear or front projection display device or virtually any surface on which a computer generated image is projected. Alternatively, the touch system 50 may be a writeboard where images are not projected on the touch surface. - Also, although the
touch system 50 is described as including a master controller 54 separate from the digital cameras, if desired one of the digital cameras can be conditioned to function as both a camera and the master controller and poll the other digital cameras for PIPs. In this case, the digital camera functioning as the master controller may include a faster DSP 84 than the remaining digital cameras. - Turning now to
FIG. 7, yet another embodiment of a touch screen is shown and is generally identified by reference numeral 252. In this embodiment, touch screen 252 includes a high-resolution display device such as a plasma display 258, the front surface of which defines a touch surface 260. The touch surface 260 is bordered by an illuminated bezel or frame 262 coupled to the display device. Illuminated bezel 262 is of the type disclosed in U.S. patent application Ser. No. 10/354,168 to Akitt et al., assigned to SMART Technologies Inc., assignee of the present invention, the content of which is incorporated by reference. Illuminated bezel 262 includes elongate side frame assemblies 264 that are coupled to the sides of the plasma display 258. Each side frame assembly 264 accommodates a generally continuous infrared illumination source 266. The ends of the side frame assemblies 264 are joined by corner pieces 268 that house DSP-based CMOS digital cameras 270. Each digital camera 270 is mounted within its respective corner piece 268 so that its field of view encompasses and looks generally across the entire touch surface 260. - Each
illuminated bezel 262 includes an array of IR LEDs (not shown) that project light onto a diffuser (not shown). The diffuser in turn diffuses and expands the infrared light emitted by the IR LEDs so that adequate infrared backlighting is projected across the touch surface 260. As a result, the illuminated bezels 262 appear as generally continuous bright bands of illumination to the digital cameras 270. - Rather than using monochrome digital cameras capturing infrared images, in this embodiment, the image sensors used in the digital cameras 270 are color CMOS image sensors and do not include IR pass filters.
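Because the colour filters become transparent near 850 nm, the red, green and blue samples of an infrared-backlit frame read nearly the same value, so a monochrome intensity can be recovered by averaging the channels. A toy illustration with hypothetical pixel values:

```python
def to_monochrome(rgb_pixel):
    """Under ~850 nm infrared illumination the colour filters pass the
    light almost equally, so R, G and B read nearly the same value and
    averaging them yields a monochrome intensity."""
    r, g, b = rgb_pixel
    return (r + g + b) / 3.0

# A colour pixel captured under infrared backlighting (values hypothetical):
print(to_monochrome((180, 178, 182)))  # 180.0
```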
FIG. 8 shows the light sensitivity of one of the image sensors. As can be seen, the sensitivity of the image sensor to red, green and blue light is localized around the appropriate frequencies. However, for light in the infrared range, i.e. about 850 nm, the color filters of the image sensors become transparent, making the sensitivity of all of the pixels of the image sensors basically equal. This characteristic of the image sensor allows the touch screen to be operated in a number of modes depending on ambient light levels, as will now be described. - For example, in one mode of operation when the ambient light level is sufficiently high, the illuminated
bezels 262 are switched off, allowing color images to be acquired by the digital cameras 270. During image processing, in addition to determining the pointer position in the manner described previously, acquired color information is used to enhance pointer recognition and scene understanding. - As will be appreciated, when an image including a pointer is captured, the foreground object, i.e. the pointer, is the object of interest. During image processing, it is desired to separate the foreground object from the background. Since the optical properties of the foreground object and background are different for different wavelengths of light, the foreground object is more easily detected in some light frequencies than others. For example, if the background is predominantly blue, then a foreground object such as a finger will have higher luminosity when looking through red or green filters since the blue filter does not permit blue light to pass. This effectively segments the foreground object from the background. In general, the luminosity differences between the foreground object and the background are exploited at different frequencies.
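The channel-selection idea above can be sketched as picking the colour channel with the largest foreground/background luminosity difference; the RGB values below are hypothetical:

```python
def best_contrast_channel(foreground_rgb, background_rgb):
    """Pick the colour channel with the largest luminosity difference
    between the foreground object (the pointer) and the background."""
    channels = ("red", "green", "blue")
    diffs = [abs(f - b) for f, b in zip(foreground_rgb, background_rgb)]
    return channels[diffs.index(max(diffs))]

# A skin-toned finger against a predominantly blue background: the red
# channel separates the two far better than the blue channel does.
print(best_contrast_channel((200, 140, 120), (30, 60, 200)))  # red
```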
- When the ambient light level drops below a threshold level, the illuminated
bezels 262 are switched on. In this case, the touch screen 252 operates in an occlusion mode as described previously. Pointer data is developed from images captured by the image sensors and processed in the manner discussed above. - Although the
touch screen 252 has been described as using infrared illumination to provide backlighting, those of skill in the art will appreciate that light in a frequency range other than infrared may be used, provided the image sensors in the digital cameras have sufficient quantum efficiency in that frequency range to capture images. - Rather than exclusively using ambient light when the ambient light level is sufficiently high and infrared illumination when the ambient light level is low, infrared illumination can be multiplexed with ambient light to enable the digital cameras 270 to capture different types of images. For example, the illuminated
bezels 262 can be strobed so that one or more images are captured by the digital cameras 270 in ambient light conditions and then in infrared backlighting conditions. The strobing may be achieved by switching the illuminated bezels 262 on and off and relying on ambient light levels in the off condition. - Alternatively, rather than using colour image sensors, monochrome sensors may be used in conjunction with an illumination source that provides lighting across the touch surface that changes frequency bands, allowing one or more images to be captured by the digital cameras in the different frequency bands. For example, the illumination source may include a white light source and a light filter in the form of a wheel that is rotatable in front of the light source. The wheel may include alternating infrared and clear sections. When a clear section is presented in front of the light source, white light is projected across the touch surface, and when an infrared section is presented in front of the light source, infrared light is projected across the touch surface.
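The strobed multiplexing can be sketched as a repeating capture schedule that labels each frame with its lighting condition; the simple alternation below is a hypothetical choice, as the specification does not fix the timing or duty cycle:

```python
def frame_schedule(n_frames, period=2):
    """Label each captured frame by lighting condition when the illuminated
    bezels are strobed: even slots use ambient light (bezels off), odd slots
    use infrared backlighting (bezels on)."""
    return ["ambient" if i % period == 0 else "infrared"
            for i in range(n_frames)]

print(frame_schedule(4))  # ['ambient', 'infrared', 'ambient', 'infrared']
```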
- Other light filters can of course be used with the wheel. For example, the wheel may include infrared, blue, red and green sections arranged about the wheel. Depending on the section of the wheel positioned in front of the light source, light in a different frequency band is projected across the touch surface allowing one or more images to be captured during each type of illumination. Of course, those of skill in the art will appreciate that colour wheels may be disposed in front of the digital cameras rather than adjacent the light source.
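With the four-section wheel described above, successive exposures cycle through the frequency bands; a sketch assuming one exposure per wheel section:

```python
import itertools

# Sections arranged about the filter wheel, per the described variation.
WHEEL_SECTIONS = ["infrared", "blue", "red", "green"]

def wheel_cycle(n_exposures):
    """Frequency band projected across the touch surface for each successive
    exposure as the wheel advances one section per exposure."""
    sections = itertools.cycle(WHEEL_SECTIONS)
    return [next(sections) for _ in range(n_exposures)]

print(wheel_cycle(6))  # ['infrared', 'blue', 'red', 'green', 'infrared', 'blue']
```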
- Although embodiments of the present invention have been described, those of skill in the art will appreciate that variations and modifications may be made without departing from the spirit and scope thereof as defined by the appended claims.
Claims (45)
1-44. (canceled)
45. An apparatus for detecting a pointer within a region of interest comprising:
at least one pair of imaging devices, said imaging devices having overlapping fields of view encompassing said region of interest;
at least one light source providing illumination across said region of interest and being within the field of view of at least one of said imaging devices; and
an optical filter associated with the at least one imaging device whose field of view sees said light source, said filter blocking light projected by said light source to inhibit said imaging device from being blinded by said projected light.
46. An apparatus according to claim 45 wherein said filter blocks light having a characteristic different from a characteristic assigned to the at least one imaging device.
47. An apparatus according to claim 46 wherein said characteristic includes at least one of polarization and frequency.
48. An apparatus according to claim 46 including a light source associated with each imaging device, each light source being in the field of view of the non-associated imaging device, light projected by each light source being visible to said associated imaging device and being blocked by a filter associated with the non-associated imaging device.
49. An apparatus according to claim 48 wherein said characteristic includes at least one of polarization and frequency.
50. An apparatus according to claim 49 wherein the light source associated with one imaging device projects illumination having a first polarization orientation and wherein the light source associated with the other imaging device projects illumination having a second polarization orientation.
51. An apparatus according to claim 50 wherein the first and second polarization orientations are vertical and horizontal polarization orientations.
52. An apparatus according to claim 46 wherein said region of interest overlies a touch surface on which pointer contacts are made.
53. An apparatus according to claim 52 wherein said touch surface and region of interest are rectangular.
54. An apparatus according to claim 53 including a light source associated with each imaging device, each light source being in the field of view of the non-associated imaging device, light projected by each light source being visible to said associated imaging device and being blocked by an optical filter associated with the non-associated imaging device.
55. An apparatus according to claim 54 wherein said characteristic includes at least one of polarization and frequency.
56. An apparatus according to claim 55 wherein the light source associated with one imaging device projects illumination having a first polarization orientation and wherein the light source associated with the other imaging device projects illumination having a second polarization orientation.
57. An apparatus according to claim 56 wherein the first and second polarization orientations are vertical and horizontal polarization orientations.
58. An apparatus for detecting a pointer within a region of interest comprising:
at least one pair of imaging devices, said imaging devices having overlapping fields of view looking generally across said region of interest;
a light source associated with each imaging device, each said light source providing illumination across said region of interest and being in the field of view of the non-associated imaging device;
an optical filter device associated with each imaging device so that substantially only light projected by the light source associated therewith is received by said associated imaging device to avoid the imaging device from being blinded by other light; and
a filter device associated with each light source to alter a characteristic of projected light such that the projected light is unable to pass through the filter device associated with the non-associated imaging device.
59. An apparatus according to claim 58 wherein said filter devices are polarizers.
60. An apparatus according to claim 59 wherein each light source is an infrared light source.
61. An apparatus according to claim 60 wherein each infrared light source includes at least one infrared light emitting diode (IR LED).
62. An apparatus according to claim 58 further comprising:
retro-reflective elements bordering said region of interest, said retro-reflective elements returning light impinging thereon in the direction of impingement without altering the polarization thereof.
63. An apparatus according to claim 58 wherein said region of interest overlies a touch surface on which pointer contacts are made.
64. An apparatus according to claim 63 wherein said touch surface and region of interest are rectangular.
65. An apparatus according to claim 64 wherein said filter devices are polarizers.
66. An apparatus according to claim 65 including an imaging device and associated light source at each corner of said region of interest, diagonally opposite imaging devices being aimed generally at one another.
67. An apparatus according to claim 66 wherein one of the diagonally opposite polarizers has a vertical orientation and wherein the other of the diagonally opposite polarizers has a horizontal orientation.
68. An apparatus for detecting a pointer within a region of interest comprising:
an imaging device adjacent at least two corners of said region of interest, the imaging devices having overlapping fields of view looking generally across said region of interest from different viewpoints, each imaging device having a different optical filter associated therewith so that each imaging device substantially only captures light having a particular characteristic thereby to avoid being blinded by light not having said particular characteristic; and
a light source associated with each imaging device, each said light source projecting light across said region of interest having a particular characteristic such that the projected light only passes through the optical filter of said associated imaging device.
69. An apparatus according to claim 68 wherein each light source is an infrared light source.
70. An apparatus according to claim 69 wherein each infrared light source includes at least one infrared light emitting diode (IR LED).
71. An apparatus according to claim 68 wherein said region of interest overlies a touch surface on which pointer contacts are made.
72. An apparatus according to claim 71 wherein said touch surface and region of interest are rectangular.
73. An apparatus according to claim 72 wherein said imaging devices are configured to capture light having different polarizations.
74. An apparatus according to claim 73 wherein said different polarizations are vertical and horizontal.
75. An apparatus for detecting a pointer within a region of interest comprising:
at least two color imaging devices having overlapping fields of view looking generally across said region of interest;
processing circuitry receiving and processing images acquired by said imaging devices to determine the location of said pointer relative to said region of interest; and
at least one illumination source projecting light in a specified frequency range across said region of interest thereby to provide lighting for said imaging devices, wherein said color imaging devices are sensitive to ambient light to capture color images and are sensitive to the light projected by said at least one illumination source to capture monochrome images.
76. An apparatus according to claim 75 wherein said illumination source is operated to project light when ambient light levels fall below a threshold level.
77. An apparatus according to claim 76 wherein said illumination source projects light in the infrared range.
78. An apparatus according to claim 75 wherein said illumination source projects light in the infrared range.
79. An apparatus according to claim 78 wherein said region of interest overlies a touch surface.
80. An apparatus according to claim 79 wherein said illumination source is operated to project light when ambient light levels fall below a threshold level.
81. An apparatus according to claim 75 wherein said region of interest overlies a touch surface.
82. An apparatus for detecting a pointer contact on a generally rectangular touch surface comprising:
a color imaging device at each corner of said touch surface and having a field of view looking generally across said touch surface;
processing circuitry receiving and processing images acquired by said imaging devices to determine the location of said pointer relative to said touch surface; and
illumination sources surrounding said touch surface and projecting light in a specified frequency range across said touch surface thereby to provide backlighting for said imaging devices, wherein said color imaging devices are sensitive to ambient light to capture color images and are sensitive to the light projected by said illumination sources to capture monochrome images.
83. An apparatus according to claim 82 wherein said illumination sources are operated to project light when ambient light levels fall below a threshold level.
84. An apparatus according to claim 83 wherein said illumination sources project light in the infrared range.
85. An apparatus for detecting a pointer within a region of interest comprising:
at least two monochrome imaging devices having overlapping fields of view looking generally across said region of interest;
processing circuitry receiving and processing images acquired by said imaging devices to determine the location of said pointer relative to said region of interest;
at least one illumination source projecting light across said region of interest; and
at least one filter changing the frequency band of light in a cycle thereby to enable said imaging devices to capture images looking across said region of interest in different lighting conditions.
86. An apparatus according to claim 85 wherein said illumination source projects light of different frequencies across said region of interest in a repeating cycle.
87. An apparatus according to claim 86 wherein said illumination source projects infrared, red, blue, and green light in a cycle across said region of interest.
88. An apparatus for detecting a pointer within a region of interest comprising:
at least one pair of imaging devices, said imaging devices having overlapping fields of view looking generally across said region of interest;
a light source associated with each imaging device, each said light source providing illumination across said region of interest and being in the field of view of the non-associated imaging device; and
a different optical filter device associated with each imaging device so that substantially only light projected by the light source associated therewith is received by said associated imaging device, thereby preventing the imaging device from being blinded by other light.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/764,723 US20080068352A1 (en) | 2004-02-17 | 2007-06-18 | Apparatus for detecting a pointer within a region of interest |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/778,534 US7232986B2 (en) | 2004-02-17 | 2004-02-17 | Apparatus for detecting a pointer within a region of interest |
US11/764,723 US20080068352A1 (en) | 2004-02-17 | 2007-06-18 | Apparatus for detecting a pointer within a region of interest |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/778,534 Continuation US7232986B2 (en) | 2004-02-17 | 2004-02-17 | Apparatus for detecting a pointer within a region of interest |
Publications (1)
Publication Number | Publication Date |
---|---|
US20080068352A1 true US20080068352A1 (en) | 2008-03-20 |
Family
ID=34838199
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/778,534 Active 2024-08-28 US7232986B2 (en) | 2004-02-17 | 2004-02-17 | Apparatus for detecting a pointer within a region of interest |
US11/764,723 Abandoned US20080068352A1 (en) | 2004-02-17 | 2007-06-18 | Apparatus for detecting a pointer within a region of interest |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/778,534 Active 2024-08-28 US7232986B2 (en) | 2004-02-17 | 2004-02-17 | Apparatus for detecting a pointer within a region of interest |
Country Status (1)
Country | Link |
---|---|
US (2) | US7232986B2 (en) |
Cited By (50)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050259084A1 (en) * | 2004-05-21 | 2005-11-24 | Popovich David G | Tiled touch system |
US20070236454A1 (en) * | 2003-10-09 | 2007-10-11 | Smart Technologies, Inc. | Apparatus For Determining The Location Of A Pointer Within A Region Of Interest |
US20080129700A1 (en) * | 2006-12-04 | 2008-06-05 | Smart Technologies Inc. | Interactive input system and method |
US20080284733A1 (en) * | 2004-01-02 | 2008-11-20 | Smart Technologies Inc. | Pointer tracking across multiple overlapping coordinate input sub-regions defining a generally contiguous input region |
US20090020342A1 (en) * | 2007-07-18 | 2009-01-22 | Smart Technologies Inc. | Touch Panel And Interactive Input System Incorporating The Same |
US20090027357A1 (en) * | 2007-07-23 | 2009-01-29 | Smart Technologies, Inc. | System and method of detecting contact on a display |
US20090058832A1 (en) * | 2007-08-30 | 2009-03-05 | John Newton | Low Profile Touch Panel Systems |
US20090122027A1 (en) * | 2004-05-07 | 2009-05-14 | John Newton | Touch Panel Display System with Illumination and Detection Provided from a Single Edge |
US20090146973A1 (en) * | 2004-04-29 | 2009-06-11 | Smart Technologies Ulc | Dual mode touch systems |
US20090146972A1 (en) * | 2004-05-05 | 2009-06-11 | Smart Technologies Ulc | Apparatus and method for detecting a pointer relative to a touch surface |
US20090160801A1 (en) * | 2003-03-11 | 2009-06-25 | Smart Technologies Ulc | System and method for differentiating between pointers used to contact touch surface |
US20090277697A1 (en) * | 2008-05-09 | 2009-11-12 | Smart Technologies Ulc | Interactive Input System And Pen Tool Therefor |
US20090278794A1 (en) * | 2008-05-09 | 2009-11-12 | Smart Technologies Ulc | Interactive Input System With Controlled Lighting |
US20090278795A1 (en) * | 2008-05-09 | 2009-11-12 | Smart Technologies Ulc | Interactive Input System And Illumination Assembly Therefor |
US20090277694A1 (en) * | 2008-05-09 | 2009-11-12 | Smart Technologies Ulc | Interactive Input System And Bezel Therefor |
US20100060613A1 (en) * | 2002-11-15 | 2010-03-11 | Smart Technologies Ulc | Size/scale orientation determination of a pointer in a camera-based touch system |
US20100265202A1 (en) * | 2000-07-05 | 2010-10-21 | Smart Technologies Ulc | Passive touch system and method of detecting user input |
US20100265217A1 (en) * | 2009-04-21 | 2010-10-21 | Hon Hai Precision Industry Co., Ltd. | Optical touch system with display screen |
US20110007001A1 (en) * | 2009-07-09 | 2011-01-13 | Waltop International Corporation | Dual Mode Input Device |
US20110032215A1 (en) * | 2009-06-15 | 2011-02-10 | Smart Technologies Ulc | Interactive input system and components therefor |
US20110095989A1 (en) * | 2009-10-23 | 2011-04-28 | Smart Technologies Ulc | Interactive input system and bezel therefor |
US20110109565A1 (en) * | 2010-02-04 | 2011-05-12 | Hong Kong Applied Science And Technology Research Institute Co. Ltd. | Coordinate locating method, coordinate locating device, and display apparatus comprising the coordinate locating device |
US20110116105A1 (en) * | 2010-02-04 | 2011-05-19 | Hong Kong Applied Science and Technology Research Institute Company Limited | Coordinate locating method and apparatus |
CN102096526A (en) * | 2009-12-15 | 2011-06-15 | 乐金显示有限公司 | Optical sensing unit, display module and display device using the same |
CN102103441A (en) * | 2009-12-17 | 2011-06-22 | 乐金显示有限公司 | Method for detecting touch and optical touch sensing system |
CN102109933A (en) * | 2009-12-24 | 2011-06-29 | 乐金显示有限公司 | Assembly having display panel and optical sensing frame and display system using the same |
KR20110075723A (en) * | 2009-12-28 | 2011-07-06 | 엘지디스플레이 주식회사 | Compensation method for touch sensitiveness of display device including touch assembly |
US20110175849A1 (en) * | 2010-01-18 | 2011-07-21 | Acer Incorporated | Optical touch display device and method |
US20110199337A1 (en) * | 2010-02-12 | 2011-08-18 | Qisda Corporation | Object-detecting system and method by use of non-coincident fields of light |
US20110234638A1 (en) * | 2003-09-16 | 2011-09-29 | Smart Technologies Ulc | Gesture recognition method and touch system incorporating the same |
US20110261016A1 (en) * | 2010-04-23 | 2011-10-27 | Sunplus Innovation Technology Inc. | Optical touch screen system and method for recognizing a relative distance of objects |
US20110298708A1 (en) * | 2010-06-07 | 2011-12-08 | Microsoft Corporation | Virtual Touch Interface |
US20110304535A1 (en) * | 2010-06-15 | 2011-12-15 | Canon Kabushiki Kaisha | Coordinate input apparatus |
US8115753B2 (en) | 2007-04-11 | 2012-02-14 | Next Holdings Limited | Touch screen system with hover and click input methods |
KR101123932B1 (en) | 2009-09-24 | 2012-03-23 | 에이서 인코포레이티드 | Optical touch system and method |
US20120075254A1 (en) * | 2008-01-07 | 2012-03-29 | Simon James Bridger | Touch System Having An Uninterrupted Light Source |
KR20120048389A (en) * | 2010-11-05 | 2012-05-15 | 엘지디스플레이 주식회사 | Method for detecting touch |
US8289299B2 (en) | 2003-02-14 | 2012-10-16 | Next Holdings Limited | Touch screen signal processing |
US8339378B2 (en) | 2008-11-05 | 2012-12-25 | Smart Technologies Ulc | Interactive input system with multi-angle reflector |
US20130021299A1 (en) * | 2011-07-18 | 2013-01-24 | Pixart Imaging Inc. | Optical touch panel assembly and light sensor thereof |
US8405636B2 (en) | 2008-01-07 | 2013-03-26 | Next Holdings Limited | Optical position sensing system and optical position sensor assembly |
CN103049109A (en) * | 2012-12-20 | 2013-04-17 | 广州视睿电子科技有限公司 | Stylus and touch point identification method |
US8432377B2 (en) | 2007-08-30 | 2013-04-30 | Next Holdings Limited | Optical touchscreen with improved illumination |
US8456447B2 (en) | 2003-02-14 | 2013-06-04 | Next Holdings Limited | Touch screen signal processing |
US8508508B2 (en) | 2003-02-14 | 2013-08-13 | Next Holdings Limited | Touch screen signal processing with single-point calibration |
US20130222237A1 (en) * | 2010-11-12 | 2013-08-29 | 3M Innovative Properties Company | Interactive polarization-preserving projection display |
KR101308477B1 (en) | 2009-12-17 | 2013-09-16 | 엘지디스플레이 주식회사 | Method for Detecting Touch and Display Device Using the Same |
US20130265283A1 (en) * | 2012-04-10 | 2013-10-10 | Pixart Imaging Inc. | Optical operation system |
KR101352264B1 (en) | 2008-12-18 | 2014-01-17 | 엘지디스플레이 주식회사 | Apparatus and method for sensing muliti-touch |
WO2018043805A1 (en) * | 2016-08-29 | 2018-03-08 | 한신대학교 산학협력단 | Pointing apparatus using three-dimensional virtual button |
Families Citing this family (121)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4052498B2 (en) | 1999-10-29 | 2008-02-27 | 株式会社リコー | Coordinate input apparatus and method |
JP2001184161A (en) | 1999-12-27 | 2001-07-06 | Ricoh Co Ltd | Method and device for inputting information, writing input device, method for managing written data, method for controlling display, portable electronic writing device, and recording medium |
US9164654B2 (en) * | 2002-12-10 | 2015-10-20 | Neonode Inc. | User interface for mobile computer unit |
US8674966B2 (en) | 2001-11-02 | 2014-03-18 | Neonode Inc. | ASIC controller for light-based touch screen |
US9213443B2 (en) | 2009-02-15 | 2015-12-15 | Neonode Inc. | Optical touch screen systems using reflected light |
US9052777B2 (en) | 2001-11-02 | 2015-06-09 | Neonode Inc. | Optical elements with alternating reflective lens facets |
US8339379B2 (en) * | 2004-04-29 | 2012-12-25 | Neonode Inc. | Light-based touch screen |
US9778794B2 (en) | 2001-11-02 | 2017-10-03 | Neonode Inc. | Light-based touch screen |
US8416217B1 (en) | 2002-11-04 | 2013-04-09 | Neonode Inc. | Light-based finger gesture user interface |
US20120274765A1 (en) * | 2003-10-09 | 2012-11-01 | Smart Technologies Ulc | Apparatus for determining the location of a pointer within a region of interest |
US7359564B2 (en) * | 2004-10-29 | 2008-04-15 | Microsoft Corporation | Method and system for cancellation of ambient light using light frequency |
US7864159B2 (en) * | 2005-01-12 | 2011-01-04 | Thinkoptics, Inc. | Handheld vision based absolute pointing system |
US20070165007A1 (en) * | 2006-01-13 | 2007-07-19 | Gerald Morrison | Interactive input system |
JP5138175B2 (en) * | 2006-04-12 | 2013-02-06 | 任天堂株式会社 | Character input program, character input device, character input system, and character input method |
US20080052750A1 (en) * | 2006-08-28 | 2008-02-28 | Anders Grunnet-Jepsen | Direct-point on-demand information exchanges |
US8913003B2 (en) | 2006-07-17 | 2014-12-16 | Thinkoptics, Inc. | Free-space multi-dimensional absolute pointer using a projection marker system |
US7890863B2 (en) | 2006-10-04 | 2011-02-15 | Immersion Corporation | Haptic effects with proximity sensing |
US8638317B2 (en) * | 2007-03-16 | 2014-01-28 | Japan Display West Inc. | Display apparatus and method for controlling the same |
US20080255840A1 (en) * | 2007-04-16 | 2008-10-16 | Microsoft Corporation | Video Nametags |
US9176598B2 (en) | 2007-05-08 | 2015-11-03 | Thinkoptics, Inc. | Free-space multi-dimensional absolute pointer with improved performance |
US8526632B2 (en) * | 2007-06-28 | 2013-09-03 | Microsoft Corporation | Microphone array for a camera speakerphone |
US8330787B2 (en) * | 2007-06-29 | 2012-12-11 | Microsoft Corporation | Capture device movement compensation for speaker indexing |
US8165416B2 (en) * | 2007-06-29 | 2012-04-24 | Microsoft Corporation | Automatic gain and exposure control using region of interest detection |
EP2165248A4 (en) * | 2007-07-06 | 2011-11-23 | Neonode Inc | Scanning of a touch screen |
US8102377B2 (en) * | 2007-09-14 | 2012-01-24 | Smart Technologies Ulc | Portable interactive media presentation system |
US20090213093A1 (en) * | 2008-01-07 | 2009-08-27 | Next Holdings Limited | Optical position sensor using retroreflection |
ES2905909T3 (en) * | 2008-01-14 | 2022-04-12 | Avery Dennison Corp | Retroreflector for use in touch screen applications and position sensing systems |
US8248691B2 (en) * | 2008-05-30 | 2012-08-21 | Avery Dennison Corporation | Infrared light transmission film |
US8890842B2 (en) | 2008-06-13 | 2014-11-18 | Steelcase Inc. | Eraser for use with optical interactive surface |
US20110074738A1 (en) * | 2008-06-18 | 2011-03-31 | Beijing Irtouch Systems Co., Ltd. | Touch Detection Sensing Apparatus |
US20100079409A1 (en) * | 2008-09-29 | 2010-04-01 | Smart Technologies Ulc | Touch panel for an interactive input system, and interactive input system incorporating the touch panel |
US8810522B2 (en) * | 2008-09-29 | 2014-08-19 | Smart Technologies Ulc | Method for selecting and manipulating a graphical object in an interactive input system, and interactive input system executing the method |
US20100083109A1 (en) * | 2008-09-29 | 2010-04-01 | Smart Technologies Ulc | Method for handling interactions with multiple users of an interactive input system, and interactive input system executing the method |
US8305363B2 (en) * | 2008-10-10 | 2012-11-06 | Pixart Imaging | Sensing system and locating method thereof |
US8269158B2 (en) * | 2008-10-10 | 2012-09-18 | Pixart Imaging Inc. | Sensing system and method for obtaining position of pointer thereof |
KR100910024B1 (en) | 2008-10-13 | 2009-07-30 | 호감테크놀로지(주) | Camera type touch-screen utilizing linear infrared emitter |
US20100201812A1 (en) * | 2009-02-11 | 2010-08-12 | Smart Technologies Ulc | Active display feedback in interactive input systems |
US8643628B1 (en) | 2012-10-14 | 2014-02-04 | Neonode Inc. | Light-based proximity detection system and user interface |
US8775023B2 (en) | 2009-02-15 | 2014-07-08 | Neonode Inc. | Light-based touch controls on a steering wheel and dashboard |
US8250482B2 (en) | 2009-06-03 | 2012-08-21 | Smart Technologies Ulc | Linking and managing mathematical objects |
CA2707783A1 (en) | 2009-06-17 | 2010-12-17 | Smart Technologies Ulc | Interactive input system and arm assembly therefor |
US8416206B2 (en) * | 2009-07-08 | 2013-04-09 | Smart Technologies Ulc | Method for manipulating a graphic widget in a three-dimensional environment displayed on a touch panel of an interactive input system |
US8692768B2 (en) | 2009-07-10 | 2014-04-08 | Smart Technologies Ulc | Interactive input system |
JP2011048811A (en) * | 2009-07-31 | 2011-03-10 | Seiko Epson Corp | Optical position detection apparatus and display device having position detection function |
CA2772424A1 (en) * | 2009-09-01 | 2011-03-10 | Smart Technologies Ulc | Interactive input system with improved signal-to-noise ratio (snr) and image capture method |
KR20120095852A (en) * | 2009-10-15 | 2012-08-29 | 스마트 테크놀러지스 유엘씨 | Method and apparatus for drawing and erasing calligraphic ink objects on a display surface |
TWI424339B (en) * | 2009-11-04 | 2014-01-21 | Coretronic Corp | Optical touch apparatus and driving method |
CN102053757B (en) * | 2009-11-05 | 2012-12-19 | 上海精研电子科技有限公司 | Infrared touch screen device and multipoint positioning method thereof |
US8446392B2 (en) * | 2009-11-16 | 2013-05-21 | Smart Technologies Ulc | Method for determining the location of a pointer in a pointer input region, and interactive input system executing the method |
US8384694B2 (en) * | 2009-11-17 | 2013-02-26 | Microsoft Corporation | Infrared vision with liquid crystal display device |
KR20110062824A (en) * | 2009-12-04 | 2011-06-10 | 삼성전기주식회사 | Apparatus for detecting coordinates of an event within interest region, display device, security device and electronic blackboard including the same |
RU2573763C2 (en) | 2009-12-11 | 2016-01-27 | Авери Деннисон Корпорейшн | Position reading systems for use in sensor displays and prismatic film used in them |
CN102109930A (en) * | 2009-12-29 | 2011-06-29 | 鸿富锦精密工业(深圳)有限公司 | Touch display device |
US8502789B2 (en) * | 2010-01-11 | 2013-08-06 | Smart Technologies Ulc | Method for handling user input in an interactive input system, and interactive input system executing the method |
EP2524319A1 (en) * | 2010-01-13 | 2012-11-21 | SMART Technologies ULC | Method for handling and transferring data in an interactive input system, and interactive input system executing the method |
US20110169736A1 (en) * | 2010-01-13 | 2011-07-14 | Smart Technologies Ulc | Interactive input system and tool tray therefor |
US8624835B2 (en) * | 2010-01-13 | 2014-01-07 | Smart Technologies Ulc | Interactive input system and illumination system therefor |
US20110170253A1 (en) * | 2010-01-13 | 2011-07-14 | Smart Technologies Ulc | Housing assembly for imaging assembly and fabrication method therefor |
EP2524285B1 (en) * | 2010-01-14 | 2022-06-29 | SMART Technologies ULC | Interactive system with successively activated illumination sources |
US20110176082A1 (en) * | 2010-01-18 | 2011-07-21 | Matthew Allard | Mounting Members For Touch Sensitive Displays |
WO2011094855A1 (en) | 2010-02-05 | 2011-08-11 | Smart Technologies Ulc | Interactive input system displaying an e-book graphic object and method of manipulating a e-book graphic object |
WO2011098654A1 (en) * | 2010-02-09 | 2011-08-18 | Multitouch Oy | Interactive display |
US20110239114A1 (en) * | 2010-03-24 | 2011-09-29 | David Robbins Falkenburg | Apparatus and Method for Unified Experience Across Different Devices |
US9383864B2 (en) | 2010-03-31 | 2016-07-05 | Smart Technologies Ulc | Illumination structure for an interactive input system |
US9189086B2 (en) * | 2010-04-01 | 2015-11-17 | Smart Technologies Ulc | Interactive input system and information input method therefor |
US8872772B2 (en) * | 2010-04-01 | 2014-10-28 | Smart Technologies Ulc | Interactive input system and pen tool therefor |
US20110241987A1 (en) * | 2010-04-01 | 2011-10-06 | Smart Technologies Ulc | Interactive input system and information input method therefor |
KR20130073902A (en) | 2010-04-26 | 2013-07-03 | 스마트 테크놀러지스 유엘씨 | Method for handling objects representing annotations on an interactive input system and interactive input system executing the method |
KR100993602B1 (en) | 2010-05-28 | 2010-11-10 | 호감테크놀로지(주) | Camera type touch-screen utilizing light guide media of camera part and method for operating the same |
KR101706778B1 (en) | 2010-07-19 | 2017-02-27 | 엘지이노텍 주식회사 | Optical touch screen |
TWI585655B (en) | 2010-08-05 | 2017-06-01 | 友達光電股份有限公司 | Optical plate structure for a touch panel, and touch display panel and touch liquid crystal display panel including the same |
EP2447811B1 (en) * | 2010-11-02 | 2019-12-18 | LG Display Co., Ltd. | Infrared sensor module, touch sensing method thereof, and auto calibration method applied to the same |
CA2819551C (en) | 2010-12-01 | 2017-10-10 | Smart Technologies Ulc | Multi-touch input system with re-direction of radiation |
CN102063228B (en) * | 2010-12-14 | 2013-08-28 | 鸿富锦精密工业(深圳)有限公司 | Optical sensing system and touch screen applying same |
KR101080318B1 (en) | 2010-12-30 | 2011-12-08 | 주식회사 아이카이스트 | Touch screen using led tube |
US9261987B2 (en) | 2011-01-12 | 2016-02-16 | Smart Technologies Ulc | Method of supporting multiple selections and interactive input system employing same |
US8619027B2 (en) | 2011-02-15 | 2013-12-31 | Smart Technologies Ulc | Interactive input system and tool tray therefor |
US8860688B2 (en) | 2011-03-02 | 2014-10-14 | Smart Technologies Ulc | 3D interactive input system and method |
TW201239710A (en) | 2011-03-29 | 2012-10-01 | Genius Electronic Optical Co Ltd | Optical touch system |
US9262011B2 (en) | 2011-03-30 | 2016-02-16 | Smart Technologies Ulc | Interactive input system and method |
US8600107B2 (en) | 2011-03-31 | 2013-12-03 | Smart Technologies Ulc | Interactive input system and method |
WO2012129670A1 (en) | 2011-03-31 | 2012-10-04 | Smart Technologies Ulc | Manipulating graphical objects in a multi-touch interactive system |
US8740395B2 (en) | 2011-04-01 | 2014-06-03 | Smart Technologies Ulc | Projection unit and method of controlling a first light source and a second light source |
US8937588B2 (en) | 2011-06-15 | 2015-01-20 | Smart Technologies Ulc | Interactive input system and method of operating the same |
US9442602B2 (en) | 2011-06-15 | 2016-09-13 | Smart Technologies Ulc | Interactive input system and method |
US8982100B2 (en) | 2011-08-31 | 2015-03-17 | Smart Technologies Ulc | Interactive input system and panel therefor |
US9292109B2 (en) | 2011-09-22 | 2016-03-22 | Smart Technologies Ulc | Interactive input system and pen tool therefor |
US9274615B2 (en) | 2011-11-11 | 2016-03-01 | Pixart Imaging Inc. | Interactive input system and method |
WO2013104061A1 (en) | 2012-01-11 | 2013-07-18 | Smart Technologies Ulc | Calibration of an interactive light curtain |
WO2013104062A1 (en) | 2012-01-11 | 2013-07-18 | Smart Technologies Ulc | Interactive input system and method |
US9323322B2 (en) | 2012-02-02 | 2016-04-26 | Smart Technologies Ulc | Interactive input system and method of detecting objects |
CA2866919C (en) | 2012-03-30 | 2018-08-21 | Smart Technologies Ulc | Method for generally continuously calibrating an interactive input system |
US9323367B2 (en) | 2012-06-22 | 2016-04-26 | Smart Technologies Ulc | Automatic annotation de-emphasis |
US10585530B2 (en) | 2014-09-23 | 2020-03-10 | Neonode Inc. | Optical proximity sensor |
US10324565B2 (en) | 2013-05-30 | 2019-06-18 | Neonode Inc. | Optical proximity sensor |
US9164625B2 (en) | 2012-10-14 | 2015-10-20 | Neonode Inc. | Proximity sensor for determining two-dimensional coordinates of a proximal object |
US9921661B2 (en) | 2012-10-14 | 2018-03-20 | Neonode Inc. | Optical proximity sensor and associated user interface |
US9741184B2 (en) | 2012-10-14 | 2017-08-22 | Neonode Inc. | Door handle with optical proximity sensors |
US10282034B2 (en) | 2012-10-14 | 2019-05-07 | Neonode Inc. | Touch sensitive curved and flexible displays |
US9544723B2 (en) | 2012-10-15 | 2017-01-10 | Smart Technologies Ulc | System and method to display content on an interactive display surface |
US9292129B2 (en) | 2012-10-30 | 2016-03-22 | Smart Technologies Ulc | Interactive input system and method therefor |
US9092093B2 (en) | 2012-11-27 | 2015-07-28 | Neonode Inc. | Steering wheel user interface |
TW201426463A (en) * | 2012-12-26 | 2014-07-01 | Pixart Imaging Inc | Optical touch system |
CN103914185A (en) * | 2013-01-07 | 2014-07-09 | 原相科技股份有限公司 | Optical touch system |
US20140210734A1 (en) | 2013-01-29 | 2014-07-31 | Smart Technologies Ulc | Method for conducting a collaborative event and system employing same |
US9542040B2 (en) | 2013-03-15 | 2017-01-10 | Smart Technologies Ulc | Method for detection and rejection of pointer contacts in interactive input systems |
US9471957B2 (en) | 2014-03-28 | 2016-10-18 | Smart Technologies Ulc | Method for partitioning, managing and displaying a collaboration space and interactive input system employing same |
US9600101B2 (en) | 2014-03-31 | 2017-03-21 | Smart Technologies Ulc | Interactive input system, interactive board therefor and methods |
CA2886483C (en) | 2014-03-31 | 2023-01-10 | Smart Technologies Ulc | Dynamically determining workspace bounds during a collaboration session |
CA2886485C (en) | 2014-03-31 | 2023-01-17 | Smart Technologies Ulc | Method for tracking displays during a collaboration session and interactive board employing same |
TWI518575B (en) * | 2014-05-15 | 2016-01-21 | 廣達電腦股份有限公司 | Optical touch module |
US9872178B2 (en) | 2014-08-25 | 2018-01-16 | Smart Technologies Ulc | System and method for authentication in distributed computing environments |
TWI582672B (en) * | 2015-01-20 | 2017-05-11 | 緯創資通股份有限公司 | An optical touch device and touch detecting method using the same |
JP6623812B2 (en) * | 2016-02-17 | 2019-12-25 | セイコーエプソン株式会社 | Position detecting device and contrast adjusting method thereof |
US10013631B2 (en) | 2016-08-26 | 2018-07-03 | Smart Technologies Ulc | Collaboration system with raster-to-vector image conversion |
WO2019147612A1 (en) | 2018-01-25 | 2019-08-01 | Neonode Inc. | Polar coordinate sensor |
US10951859B2 (en) | 2018-05-30 | 2021-03-16 | Microsoft Technology Licensing, Llc | Videoconferencing device and method |
CN109032431A (en) * | 2018-07-13 | 2018-12-18 | 业成科技(成都)有限公司 | It can define the optical touch control apparatus and its method of nib color |
WO2020112585A1 (en) | 2018-11-28 | 2020-06-04 | Neonode Inc. | Motorist user interface sensor |
US11842014B2 (en) | 2019-12-31 | 2023-12-12 | Neonode Inc. | Contactless touch input system |
KR20230074269A (en) | 2020-09-30 | 2023-05-26 | 네오노드, 인크. | optical touch sensor |
Citations (100)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4144449A (en) * | 1977-07-08 | 1979-03-13 | Sperry Rand Corporation | Position detection apparatus |
US4247767A (en) * | 1978-04-05 | 1981-01-27 | Her Majesty The Queen In Right Of Canada, As Represented By The Minister Of National Defence | Touch sensitive computer input device |
US4507557A (en) * | 1983-04-01 | 1985-03-26 | Siemens Corporate Research & Support, Inc. | Non-contact X,Y digitizer using two dynamic ram imagers |
US4672364A (en) * | 1984-06-18 | 1987-06-09 | Carroll Touch Inc | Touch input device having power profiling |
US4672990A (en) * | 1985-10-11 | 1987-06-16 | Robillard Fred W | System for freeze protection of pipes |
US4737631A (en) * | 1985-05-17 | 1988-04-12 | Alps Electric Co., Ltd. | Filter of photoelectric touch panel with integral spherical protrusion lens |
US4742221A (en) * | 1985-05-17 | 1988-05-03 | Alps Electric Co., Ltd. | Optical coordinate position input device |
US4746770A (en) * | 1987-02-17 | 1988-05-24 | Sensor Frame Incorporated | Method and apparatus for isolating and manipulating graphic objects on computer video monitor |
US4818826A (en) * | 1986-09-19 | 1989-04-04 | Alps Electric Co., Ltd. | Coordinate input apparatus including a detection circuit to determine proper stylus position |
US4820050A (en) * | 1987-04-28 | 1989-04-11 | Wells-Gardner Electronics Corporation | Solid-state optical position determining apparatus |
US4822145A (en) * | 1986-05-14 | 1989-04-18 | Massachusetts Institute Of Technology | Method and apparatus utilizing waveguide and polarized light for display of dynamic images |
US4831455A (en) * | 1986-02-21 | 1989-05-16 | Canon Kabushiki Kaisha | Picture reading apparatus |
US5025314A (en) * | 1990-07-30 | 1991-06-18 | Xerox Corporation | Apparatus allowing remote interactive use of a plurality of writing surfaces |
US5097516A (en) * | 1991-02-28 | 1992-03-17 | At&T Bell Laboratories | Technique for illuminating a surface with a gradient intensity line of light to achieve enhanced two-dimensional imaging |
US5109435A (en) * | 1988-08-08 | 1992-04-28 | Hughes Aircraft Company | Segmentation method for use against moving objects |
US5196835A (en) * | 1988-09-30 | 1993-03-23 | International Business Machines Corporation | Laser touch panel reflective surface aberration cancelling |
US5317140A (en) * | 1992-11-24 | 1994-05-31 | Dunthorn David I | Diffusion-assisted position location particularly for visual pen detection |
US5414413A (en) * | 1988-06-14 | 1995-05-09 | Sony Corporation | Touch panel apparatus |
US5483261A (en) * | 1992-02-14 | 1996-01-09 | Itu Research, Inc. | Graphical input controller and method with rear screen image detection |
US5483603A (en) * | 1992-10-22 | 1996-01-09 | Advanced Interconnection Technology | System and method for automatic optical inspection |
US5484966A (en) * | 1993-12-07 | 1996-01-16 | At&T Corp. | Sensing stylus position using single 1-D image sensor |
US5490655A (en) * | 1993-09-16 | 1996-02-13 | Monger Mounts, Inc. | Video/data projector and monitor ceiling/wall mount |
Family Cites Families (47)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3870834A (en) * | 1973-06-11 | 1975-03-11 | Yeaple Corp | Personal stereophonic speaker system |
US4061877A (en) * | 1976-10-04 | 1977-12-06 | Shaymar, Inc. | Speaker system |
US4075438A (en) * | 1976-12-02 | 1978-02-21 | Gary Kappel | Stereo speaker system |
JPS5936295B2 (en) | 1981-06-23 | 1984-09-03 | 株式会社日立国際電気 | Optical coordinate input device |
US4558313A (en) * | 1981-12-31 | 1985-12-10 | International Business Machines Corporation | Indicator to data processing interface |
JPS60226337A (en) * | 1984-04-22 | 1985-11-11 | Pioneer Electronic Corp | Car-mounted speaker unit |
JPH0728456B2 (en) * | 1984-11-30 | 1995-03-29 | パイオニア株式会社 | Audio equipment |
US4782328A (en) * | 1986-10-02 | 1988-11-01 | Product Development Services, Incorporated | Ambient-light-responsive touch screen data input method and system |
US5880411A (en) * | 1992-06-08 | 1999-03-09 | Synaptics, Incorporated | Object position detector with edge motion feature and gesture recognition |
JP3244798B2 (en) * | 1992-09-08 | 2002-01-07 | 株式会社東芝 | Moving image processing device |
US5982352A (en) * | 1992-09-18 | 1999-11-09 | Pryor; Timothy R. | Method for providing human input to a computer |
US5754664A (en) * | 1993-09-09 | 1998-05-19 | Prince Corporation | Vehicle audio system |
GB2286100A (en) * | 1994-01-19 | 1995-08-02 | Ibm | Touch-sensitive display apparatus |
US5577733A (en) * | 1994-04-08 | 1996-11-26 | Downing; Dennis L. | Targeting system |
JPH08240407A (en) | 1995-03-02 | 1996-09-17 | Matsushita Electric Ind Co Ltd | Position detecting input device |
US5786810A (en) * | 1995-06-07 | 1998-07-28 | Compaq Computer Corporation | Method of determining an object's position and associated apparatus |
JPH0991094A (en) | 1995-09-21 | 1997-04-04 | Sekisui Chem Co Ltd | Coordinate detector for touch panel |
US5825352A (en) * | 1996-01-04 | 1998-10-20 | Logitech, Inc. | Multiple fingers contact sensing method for emulating mouse buttons and mouse operations on a touch sensor pad |
TW394879B (en) * | 1996-02-09 | 2000-06-21 | Sega Enterprises Kk | Graphics processing system and its data input device |
JP3807779B2 (en) | 1996-05-29 | 2006-08-09 | 富士通株式会社 | Coordinate detection device |
US6850621B2 (en) * | 1996-06-21 | 2005-02-01 | Yamaha Corporation | Three-dimensional sound reproducing apparatus and a three-dimensional sound reproduction method |
US6002808A (en) * | 1996-07-26 | 1999-12-14 | Mitsubishi Electric Information Technology Center America, Inc. | Hand gesture control system |
US5936615A (en) * | 1996-09-12 | 1999-08-10 | Digital Equipment Corporation | Image-based touchscreen |
JP3063639B2 (en) * | 1996-09-26 | 2000-07-12 | ヤマハ株式会社 | Speaker device |
GB9622773D0 (en) * | 1996-11-01 | 1997-01-08 | Central Research Lab Ltd | Stereo sound expander |
JPH10224888A (en) * | 1997-02-06 | 1998-08-21 | Pioneer Electron Corp | On-vehicle speaker system |
JP3624070B2 (en) * | 1997-03-07 | 2005-02-23 | キヤノン株式会社 | Coordinate input device and control method thereof |
JP3876942B2 (en) * | 1997-06-13 | 2007-02-07 | 株式会社ワコム | Optical digitizer |
US6161066A (en) * | 1997-08-18 | 2000-12-12 | The Texas A&M University System | Advanced law enforcement and response technology |
US6072494A (en) * | 1997-10-15 | 2000-06-06 | Electric Planet, Inc. | Method and apparatus for real-time gesture recognition |
TW449709B (en) * | 1997-11-17 | 2001-08-11 | Hewlett Packard Co | A method for distinguishing a contact input |
US6310610B1 (en) * | 1997-12-04 | 2001-10-30 | Nortel Networks Limited | Intelligent touch display |
EP2256605B1 (en) * | 1998-01-26 | 2017-12-06 | Apple Inc. | Method and apparatus for integrating manual input |
AU2439399A (en) | 1998-02-09 | 1999-08-23 | Haim Azaria | Video camera computer touch screen system |
JP2000105671A (en) * | 1998-05-11 | 2000-04-11 | Ricoh Co Ltd | Coordinate input and detecting device, and electronic blackboard system |
TW459192B (en) * | 1999-06-25 | 2001-10-11 | Toshiba Corp | Electronic apparatus and electronic system provided with the same |
JP3905670B2 (en) * | 1999-09-10 | 2007-04-18 | 株式会社リコー | Coordinate input detection apparatus, information storage medium, and coordinate input detection method |
JP2001112572A (en) * | 1999-10-20 | 2001-04-24 | Pioneer Electronic Corp | Seat with speaker and acoustic system |
JP3819654B2 (en) * | 1999-11-11 | 2006-09-13 | 株式会社シロク | Optical digitizer with indicator identification function |
JP3934846B2 (en) * | 2000-03-06 | 2007-06-20 | 株式会社リコー | Coordinate input / detection device, electronic blackboard system, light receiving element positional deviation correction method, and storage medium |
JP2001265516A (en) * | 2000-03-16 | 2001-09-28 | Ricoh Co Ltd | Coordinate input device |
JP2001282445A (en) * | 2000-03-31 | 2001-10-12 | Ricoh Co Ltd | Coordinate input/detecting device and information display input device |
JP3793014B2 (en) * | 2000-10-03 | 2006-07-05 | キヤノン株式会社 | Electron source manufacturing apparatus, electron source manufacturing method, and image forming apparatus manufacturing method |
US6774889B1 (en) * | 2000-10-24 | 2004-08-10 | Microsoft Corporation | System and method for transforming an ordinary computer monitor screen into a touch screen |
US6972401B2 (en) * | 2003-01-30 | 2005-12-06 | Smart Technologies Inc. | Illuminated bezel and touch system incorporating the same |
US7665041B2 (en) * | 2003-03-25 | 2010-02-16 | Microsoft Corporation | Architecture for controlling a computer using hand gestures |
US20050052427A1 (en) * | 2003-09-10 | 2005-03-10 | Wu Michael Chi Hung | Hand gesture interaction with touch surface |
Application events:
- 2004-02-17: US application US10/778,534 filed, granted as US7232986B2 (status: Active)
- 2007-06-18: US application US11/764,723 filed, published as US20080068352A1 (status: Abandoned)
Patent Citations (101)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4144449A (en) * | 1977-07-08 | 1979-03-13 | Sperry Rand Corporation | Position detection apparatus |
US4247767A (en) * | 1978-04-05 | 1981-01-27 | Her Majesty The Queen In Right Of Canada, As Represented By The Minister Of National Defence | Touch sensitive computer input device |
US4507557A (en) * | 1983-04-01 | 1985-03-26 | Siemens Corporate Research & Support, Inc. | Non-contact X,Y digitizer using two dynamic ram imagers |
US4672364A (en) * | 1984-06-18 | 1987-06-09 | Carroll Touch Inc | Touch input device having power profiling |
US4742221A (en) * | 1985-05-17 | 1988-05-03 | Alps Electric Co., Ltd. | Optical coordinate position input device |
US4737631A (en) * | 1985-05-17 | 1988-04-12 | Alps Electric Co., Ltd. | Filter of photoelectric touch panel with integral spherical protrusion lens |
US4672990A (en) * | 1985-10-11 | 1987-06-16 | Robillard Fred W | System for freeze protection of pipes |
US4831455A (en) * | 1986-02-21 | 1989-05-16 | Canon Kabushiki Kaisha | Picture reading apparatus |
US4822145A (en) * | 1986-05-14 | 1989-04-18 | Massachusetts Institute Of Technology | Method and apparatus utilizing waveguide and polarized light for display of dynamic images |
US4818826A (en) * | 1986-09-19 | 1989-04-04 | Alps Electric Co., Ltd. | Coordinate input apparatus including a detection circuit to determine proper stylus position |
US4746770A (en) * | 1987-02-17 | 1988-05-24 | Sensor Frame Incorporated | Method and apparatus for isolating and manipulating graphic objects on computer video monitor |
US4820050A (en) * | 1987-04-28 | 1989-04-11 | Wells-Gardner Electronics Corporation | Solid-state optical position determining apparatus |
US5414413A (en) * | 1988-06-14 | 1995-05-09 | Sony Corporation | Touch panel apparatus |
US5109435A (en) * | 1988-08-08 | 1992-04-28 | Hughes Aircraft Company | Segmentation method for use against moving objects |
US5196835A (en) * | 1988-09-30 | 1993-03-23 | International Business Machines Corporation | Laser touch panel reflective surface aberration cancelling |
US5025314A (en) * | 1990-07-30 | 1991-06-18 | Xerox Corporation | Apparatus allowing remote interactive use of a plurality of writing surfaces |
US5097516A (en) * | 1991-02-28 | 1992-03-17 | At&T Bell Laboratories | Technique for illuminating a surface with a gradient intensity line of light to achieve enhanced two-dimensional imaging |
US6337681B1 (en) * | 1991-10-21 | 2002-01-08 | Smart Technologies Inc. | Projection display system with pressure sensing at screen, and computer assisted alignment implemented by applying pressure at displayed calibration marks |
US5483261A (en) * | 1992-02-14 | 1996-01-09 | Itu Research, Inc. | Graphical input controller and method with rear screen image detection |
US5483603A (en) * | 1992-10-22 | 1996-01-09 | Advanced Interconnection Technology | System and method for automatic optical inspection |
US5317140A (en) * | 1992-11-24 | 1994-05-31 | Dunthorn David I | Diffusion-assisted position location particularly for visual pen detection |
US5594502A (en) * | 1993-01-20 | 1997-01-14 | Elmo Company, Limited | Image reproduction apparatus |
US5502568A (en) * | 1993-03-23 | 1996-03-26 | Wacom Co., Ltd. | Optical position detecting unit, optical coordinate input unit and optical position detecting method employing a pattern having a sequence of 1's and 0's |
US5729704A (en) * | 1993-07-21 | 1998-03-17 | Xerox Corporation | User-directed method for operating on an object-based model data structure through a second contextual image |
US5490655A (en) * | 1993-09-16 | 1996-02-13 | Monger Mounts, Inc. | Video/data projector and monitor ceiling/wall mount |
US6683584B2 (en) * | 1993-10-22 | 2004-01-27 | Kopin Corporation | Camera display system |
US5617312A (en) * | 1993-11-19 | 1997-04-01 | Hitachi, Ltd. | Computer system that enters control information by means of video camera |
US6522830B2 (en) * | 1993-11-30 | 2003-02-18 | Canon Kabushiki Kaisha | Image pickup apparatus |
US5484966A (en) * | 1993-12-07 | 1996-01-16 | At&T Corp. | Sensing stylus position using single 1-D image sensor |
US6188388B1 (en) * | 1993-12-28 | 2001-02-13 | Hitachi, Ltd. | Information presentation apparatus and information display apparatus |
US5771039A (en) * | 1994-06-06 | 1998-06-23 | Ditzik; Richard J. | Direct view display device integration techniques |
US5525764A (en) * | 1994-06-09 | 1996-06-11 | Junkins; John L. | Laser scanning graphic input system |
US5737740A (en) * | 1994-06-27 | 1998-04-07 | Numonics | Apparatus and method for processing electronic documents |
US5528290A (en) * | 1994-09-09 | 1996-06-18 | Xerox Corporation | Device for transcribing images on a board using a camera based board scanner |
US5638092A (en) * | 1994-12-20 | 1997-06-10 | Eng; Tommy K. | Cursor control system |
US5594469A (en) * | 1995-02-21 | 1997-01-14 | Mitsubishi Electric Information Technology Center America Inc. | Hand gesture machine control system |
US5736686A (en) * | 1995-03-01 | 1998-04-07 | Gtco Corporation | Illumination apparatus for a digitizer tablet with improved light panel |
US6191773B1 (en) * | 1995-04-28 | 2001-02-20 | Matsushita Electric Industrial Co., Ltd. | Interface apparatus |
US5911004A (en) * | 1995-05-08 | 1999-06-08 | Ricoh Company, Ltd. | Image processing apparatus for discriminating image characteristics using image signal information obtained in an image scanning operation |
US5734375A (en) * | 1995-06-07 | 1998-03-31 | Compaq Computer Corporation | Keyboard-compatible optical determination of object's position |
US5764223A (en) * | 1995-06-07 | 1998-06-09 | International Business Machines Corporation | Touch-screen input device using the monitor as a light source operating at an intermediate frequency |
US6736321B2 (en) * | 1995-12-18 | 2004-05-18 | Metrologic Instruments, Inc. | Planar laser illumination and imaging (PLIIM) system employing wavefront control methods for reducing the power of speckle-pattern noise digital images acquired by said system |
US6075905A (en) * | 1996-07-17 | 2000-06-13 | Sarnoff Corporation | Method and apparatus for mosaic image construction |
US6208329B1 (en) * | 1996-08-13 | 2001-03-27 | Lsi Logic Corporation | Supplemental mouse button emulation system, method and apparatus for a coordinate based data input device |
US20040046749A1 (en) * | 1996-10-15 | 2004-03-11 | Nikon Corporation | Image recording and replay apparatus |
US6567121B1 (en) * | 1996-10-25 | 2003-05-20 | Canon Kabushiki Kaisha | Camera control system, camera server, camera client, control method, and storage medium |
US6061177A (en) * | 1996-12-19 | 2000-05-09 | Fujimoto; Kenneth Noboru | Integrated computer display and graphical input apparatus and method |
US6252989B1 (en) * | 1997-01-07 | 2001-06-26 | Board Of The Regents, The University Of Texas System | Foveated image coding system and method for image bandwidth reduction |
US6209266B1 (en) * | 1997-03-13 | 2001-04-03 | Steelcase Development Inc. | Workspace display |
US5914709A (en) * | 1997-03-14 | 1999-06-22 | Poa Sana, Llc | User input device for a computer system |
US6229529B1 (en) * | 1997-07-11 | 2001-05-08 | Ricoh Company, Ltd. | Write point detecting circuit to detect multiple write points |
US6339748B1 (en) * | 1997-11-11 | 2002-01-15 | Seiko Epson Corporation | Coordinate input system and display apparatus |
US6226035B1 (en) * | 1998-03-04 | 2001-05-01 | Cyclo Vision Technologies, Inc. | Adjustable imaging system with wide angle capability |
US6031531A (en) * | 1998-04-06 | 2000-02-29 | International Business Machines Corporation | Method and system in a graphical user interface for facilitating cursor object movement for physically challenged computer users |
US20030001825A1 (en) * | 1998-06-09 | 2003-01-02 | Katsuyuki Omura | Coordinate position inputting/detecting device, a method for inputting/detecting the coordinate position, and a display board system |
US6559813B1 (en) * | 1998-07-01 | 2003-05-06 | Deluca Michael | Selective real image obstruction in a virtual reality display apparatus and method |
US20020036617A1 (en) * | 1998-08-21 | 2002-03-28 | Timothy R. Pryor | Novel man machine interfaces and applications |
US6353434B1 (en) * | 1998-09-08 | 2002-03-05 | Gunze Limited | Input coordinate transformation apparatus for converting coordinates input from a coordinate input device into coordinates in a display coordinate system for displaying images on a display |
US6570612B1 (en) * | 1998-09-21 | 2003-05-27 | Bank One, Na, As Administrative Agent | System and method for color normalization of board images |
US6359612B1 (en) * | 1998-09-30 | 2002-03-19 | Siemens Aktiengesellschaft | Imaging system for displaying image information that has been acquired by means of a medical diagnostic imaging device |
US6690357B1 (en) * | 1998-10-07 | 2004-02-10 | Intel Corporation | Input device using scanning sensors |
US6335724B1 (en) * | 1999-01-29 | 2002-01-01 | Ricoh Company, Ltd. | Method and device for inputting coordinate-position and a display board system |
US6179426B1 (en) * | 1999-03-03 | 2001-01-30 | 3M Innovative Properties Company | Integrated front projection system |
US6530664B2 (en) * | 1999-03-03 | 2003-03-11 | 3M Innovative Properties Company | Integrated front projection system with enhanced dry erase screen configuration |
US6545669B1 (en) * | 1999-03-26 | 2003-04-08 | Husam Kinawi | Object-drag continuity between discontinuous touch-screens |
US6507339B1 (en) * | 1999-08-23 | 2003-01-14 | Ricoh Company, Ltd. | Coordinate inputting/detecting system and a calibration method therefor |
US6563491B1 (en) * | 1999-09-10 | 2003-05-13 | Ricoh Company, Ltd. | Coordinate input apparatus and the recording medium thereof |
US6512838B1 (en) * | 1999-09-22 | 2003-01-28 | Canesta, Inc. | Methods for enhancing performance and data acquired from three-dimensional image systems |
US7187489B2 (en) * | 1999-10-05 | 2007-03-06 | Idc, Llc | Photonic MEMS and structures |
US6674424B1 (en) * | 1999-10-29 | 2004-01-06 | Ricoh Company, Ltd. | Method and apparatus for inputting information including coordinate data |
US6567078B2 (en) * | 2000-01-25 | 2003-05-20 | Xiroku Inc. | Handwriting communication system and handwriting input device used therein |
US6710770B2 (en) * | 2000-02-11 | 2004-03-23 | Canesta, Inc. | Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device |
US6864882B2 (en) * | 2000-05-24 | 2005-03-08 | Next Holdings Limited | Protected touch panel display system |
US6690397B1 (en) * | 2000-06-05 | 2004-02-10 | Advanced Neuromodulation Systems, Inc. | System for regional data association and presentation and method for the same |
US6690363B2 (en) * | 2000-06-19 | 2004-02-10 | Next Holdings Limited | Touch panel display system |
US7692625B2 (en) * | 2000-07-05 | 2010-04-06 | Smart Technologies Ulc | Camera-based touch system |
US20070075982A1 (en) * | 2000-07-05 | 2007-04-05 | Smart Technologies, Inc. | Passive Touch System And Method Of Detecting User Input |
US6531999B1 (en) * | 2000-07-13 | 2003-03-11 | Koninklijke Philips Electronics N.V. | Pointing direction calibration in video conferencing and other camera-based system applications |
US20020050979A1 (en) * | 2000-08-24 | 2002-05-02 | Sun Microsystems, Inc | Interpolating sample values from known triangle vertex values |
US6741250B1 (en) * | 2001-02-09 | 2004-05-25 | Be Here Corporation | Method and system for generation of multiple viewpoints into a scene viewed by motionless cameras and for presentation of a view path |
US7030861B1 (en) * | 2001-02-10 | 2006-04-18 | Wayne Carl Westerman | System and method for packing multi-touch gestures onto a hand |
US7176904B2 (en) * | 2001-03-26 | 2007-02-13 | Ricoh Company, Limited | Information input/output apparatus, information input/output control method, and computer product |
US6517266B2 (en) * | 2001-05-15 | 2003-02-11 | Xerox Corporation | Systems and methods for hand-held printing on a surface or medium |
US6919880B2 (en) * | 2001-06-01 | 2005-07-19 | Smart Technologies Inc. | Calibrating camera offsets to facilitate object position determination using triangulation |
US20030025951A1 (en) * | 2001-07-27 | 2003-02-06 | Pollard Stephen Bernard | Paper-to-computer interfaces |
US7007236B2 (en) * | 2001-09-14 | 2006-02-28 | Accenture Global Services Gmbh | Lab window collaboration |
US20030063073A1 (en) * | 2001-10-03 | 2003-04-03 | Geaghan Bernard O. | Touch panel system and method for distinguishing multiple touch inputs |
US20030085871A1 (en) * | 2001-10-09 | 2003-05-08 | E-Business Information Technology | Coordinate input device working with at least display screen and desk-top surface as the pointing areas thereof |
US7202860B2 (en) * | 2001-10-09 | 2007-04-10 | Eit Co., Ltd. | Coordinate input device working with at least display screen and desk-top surface as the pointing areas thereof |
US20040021633A1 (en) * | 2002-04-06 | 2004-02-05 | Rajkowski Janusz Wiktor | Symbol encoding apparatus and method |
US20060022962A1 (en) * | 2002-11-15 | 2006-02-02 | Gerald Morrison | Size/scale and orientation determination of a pointer in a camera-based touch system |
US6947032B2 (en) * | 2003-03-11 | 2005-09-20 | Smart Technologies Inc. | Touch system and method for determining pointer contacts on a touch surface |
US7532206B2 (en) * | 2003-03-11 | 2009-05-12 | Smart Technologies Ulc | System and method for differentiating between pointers used to contact touch surface |
US20080062149A1 (en) * | 2003-05-19 | 2008-03-13 | Baruch Itzhak | Optical coordinate input device comprising few elements |
US7187030B2 (en) * | 2003-06-16 | 2007-03-06 | Samsung Electronics Co., Ltd. | SONOS memory device |
US7190496B2 (en) * | 2003-07-24 | 2007-03-13 | Zebra Imaging, Inc. | Enhanced environment visualization using holographic stereograms |
US20050083308A1 (en) * | 2003-10-16 | 2005-04-21 | Homer Steven S. | Display for an electronic device |
US7355593B2 (en) * | 2004-01-02 | 2008-04-08 | Smart Technologies, Inc. | Pointer tracking across multiple overlapping coordinate input sub-regions defining a generally contiguous input region |
US20070019103A1 (en) * | 2005-07-25 | 2007-01-25 | Vkb Inc. | Optical apparatus for virtual interface projection and sensing |
US20070075648A1 (en) * | 2005-10-03 | 2007-04-05 | Blythe Michael M | Reflecting light |
US20070116333A1 (en) * | 2005-11-18 | 2007-05-24 | Dempski Kelly L | Detection of multiple targets on a plane of interest |
Cited By (85)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8055022B2 (en) | 2000-07-05 | 2011-11-08 | Smart Technologies Ulc | Passive touch system and method of detecting user input |
US8203535B2 (en) | 2000-07-05 | 2012-06-19 | Smart Technologies Ulc | Passive touch system and method of detecting user input |
US8378986B2 (en) | 2000-07-05 | 2013-02-19 | Smart Technologies Ulc | Passive touch system and method of detecting user input |
US20100265202A1 (en) * | 2000-07-05 | 2010-10-21 | Smart Technologies Ulc | Passive touch system and method of detecting user input |
US20100060613A1 (en) * | 2002-11-15 | 2010-03-11 | Smart Technologies Ulc | Size/scale orientation determination of a pointer in a camera-based touch system |
US8228304B2 (en) | 2002-11-15 | 2012-07-24 | Smart Technologies Ulc | Size/scale orientation determination of a pointer in a camera-based touch system |
US8456447B2 (en) | 2003-02-14 | 2013-06-04 | Next Holdings Limited | Touch screen signal processing |
US8508508B2 (en) | 2003-02-14 | 2013-08-13 | Next Holdings Limited | Touch screen signal processing with single-point calibration |
US8289299B2 (en) | 2003-02-14 | 2012-10-16 | Next Holdings Limited | Touch screen signal processing |
US8466885B2 (en) | 2003-02-14 | 2013-06-18 | Next Holdings Limited | Touch screen signal processing |
US8456451B2 (en) | 2003-03-11 | 2013-06-04 | Smart Technologies Ulc | System and method for differentiating between pointers used to contact touch surface |
US20090160801A1 (en) * | 2003-03-11 | 2009-06-25 | Smart Technologies Ulc | System and method for differentiating between pointers used to contact touch surface |
US20110234638A1 (en) * | 2003-09-16 | 2011-09-29 | Smart Technologies Ulc | Gesture recognition method and touch system incorporating the same |
US8325134B2 (en) | 2003-09-16 | 2012-12-04 | Smart Technologies Ulc | Gesture recognition method and touch system incorporating the same |
US20070236454A1 (en) * | 2003-10-09 | 2007-10-11 | Smart Technologies, Inc. | Apparatus For Determining The Location Of A Pointer Within A Region Of Interest |
US8456418B2 (en) | 2003-10-09 | 2013-06-04 | Smart Technologies Ulc | Apparatus for determining the location of a pointer within a region of interest |
US8576172B2 (en) | 2004-01-02 | 2013-11-05 | Smart Technologies Ulc | Pointer tracking across multiple overlapping coordinate input sub-regions defining a generally contiguous input region |
US8089462B2 (en) | 2004-01-02 | 2012-01-03 | Smart Technologies Ulc | Pointer tracking across multiple overlapping coordinate input sub-regions defining a generally contiguous input region |
US20080284733A1 (en) * | 2004-01-02 | 2008-11-20 | Smart Technologies Inc. | Pointer tracking across multiple overlapping coordinate input sub-regions defining a generally contiguous input region |
US8274496B2 (en) | 2004-04-29 | 2012-09-25 | Smart Technologies Ulc | Dual mode touch systems |
US20090146973A1 (en) * | 2004-04-29 | 2009-06-11 | Smart Technologies Ulc | Dual mode touch systems |
US20090146972A1 (en) * | 2004-05-05 | 2009-06-11 | Smart Technologies Ulc | Apparatus and method for detecting a pointer relative to a touch surface |
US20090122027A1 (en) * | 2004-05-07 | 2009-05-14 | John Newton | Touch Panel Display System with Illumination and Detection Provided from a Single Edge |
US8149221B2 (en) | 2004-05-07 | 2012-04-03 | Next Holdings Limited | Touch panel display system with illumination and detection provided from a single edge |
US20050259084A1 (en) * | 2004-05-21 | 2005-11-24 | Popovich David G | Tiled touch system |
US8120596B2 (en) | 2004-05-21 | 2012-02-21 | Smart Technologies Ulc | Tiled touch system |
US20080129700A1 (en) * | 2006-12-04 | 2008-06-05 | Smart Technologies Inc. | Interactive input system and method |
US9442607B2 (en) | 2006-12-04 | 2016-09-13 | Smart Technologies Inc. | Interactive input system and method |
US8115753B2 (en) | 2007-04-11 | 2012-02-14 | Next Holdings Limited | Touch screen system with hover and click input methods |
US20090020342A1 (en) * | 2007-07-18 | 2009-01-22 | Smart Technologies Inc. | Touch Panel And Interactive Input System Incorporating The Same |
US8400407B2 (en) * | 2007-07-18 | 2013-03-19 | Smart Technologies Ulc | Touch panel and interactive input system incorporating the same |
US20090027357A1 (en) * | 2007-07-23 | 2009-01-29 | Smart Technologies, Inc. | System and method of detecting contact on a display |
US8094137B2 (en) | 2007-07-23 | 2012-01-10 | Smart Technologies Ulc | System and method of detecting contact on a display |
US8384693B2 (en) | 2007-08-30 | 2013-02-26 | Next Holdings Limited | Low profile touch panel systems |
US8432377B2 (en) | 2007-08-30 | 2013-04-30 | Next Holdings Limited | Optical touchscreen with improved illumination |
US20090058832A1 (en) * | 2007-08-30 | 2009-03-05 | John Newton | Low Profile Touch Panel Systems |
US20120075254A1 (en) * | 2008-01-07 | 2012-03-29 | Simon James Bridger | Touch System Having An Uninterrupted Light Source |
US8405636B2 (en) | 2008-01-07 | 2013-03-26 | Next Holdings Limited | Optical position sensing system and optical position sensor assembly |
US8405637B2 (en) | 2008-01-07 | 2013-03-26 | Next Holdings Limited | Optical position sensing system and optical position sensor assembly with convex imaging window |
US20090277697A1 (en) * | 2008-05-09 | 2009-11-12 | Smart Technologies Ulc | Interactive Input System And Pen Tool Therefor |
US8902193B2 (en) | 2008-05-09 | 2014-12-02 | Smart Technologies Ulc | Interactive input system and bezel therefor |
US20090278794A1 (en) * | 2008-05-09 | 2009-11-12 | Smart Technologies Ulc | Interactive Input System With Controlled Lighting |
US20090278795A1 (en) * | 2008-05-09 | 2009-11-12 | Smart Technologies Ulc | Interactive Input System And Illumination Assembly Therefor |
US20090277694A1 (en) * | 2008-05-09 | 2009-11-12 | Smart Technologies Ulc | Interactive Input System And Bezel Therefor |
US8339378B2 (en) | 2008-11-05 | 2012-12-25 | Smart Technologies Ulc | Interactive input system with multi-angle reflector |
KR101352264B1 (en) | 2008-12-18 | 2014-01-17 | LG Display Co., Ltd. | Apparatus and method for sensing multi-touch |
US8558818B1 (en) * | 2009-04-21 | 2013-10-15 | Hon Hai Precision Industry Co., Ltd. | Optical touch system with display screen |
US8525815B2 (en) * | 2009-04-21 | 2013-09-03 | Hon Hai Precision Industry Co., Ltd. | Optical touch system with display screen |
US20100265217A1 (en) * | 2009-04-21 | 2010-10-21 | Hon Hai Precision Industry Co., Ltd. | Optical touch system with display screen |
US20110032215A1 (en) * | 2009-06-15 | 2011-02-10 | Smart Technologies Ulc | Interactive input system and components therefor |
US20110007001A1 (en) * | 2009-07-09 | 2011-01-13 | Waltop International Corporation | Dual Mode Input Device |
KR101123932B1 (en) | 2009-09-24 | 2012-03-23 | Acer Incorporated | Optical touch system and method |
US20110095989A1 (en) * | 2009-10-23 | 2011-04-28 | Smart Technologies Ulc | Interactive input system and bezel therefor |
US20110141062A1 (en) * | 2009-12-15 | 2011-06-16 | Byung-Chun Yu | Optical sensing unit, display module and display device using the same |
CN102096526A (en) * | 2009-12-15 | 2011-06-15 | 乐金显示有限公司 | Optical sensing unit, display module and display device using the same |
US8659578B2 (en) | 2009-12-15 | 2014-02-25 | Lg Display Co., Ltd. | Optical sensing unit, display module and display device using the same |
US8803846B2 (en) * | 2009-12-17 | 2014-08-12 | Lg Display Co., Ltd. | Method for detecting touch and optical touch sensing system |
KR101308477B1 (en) | 2009-12-17 | 2013-09-16 | LG Display Co., Ltd. | Method for Detecting Touch and Display Device Using the Same |
CN102103441A (en) * | 2009-12-17 | 2011-06-22 | 乐金显示有限公司 | Method for detecting touch and optical touch sensing system |
US20110148820A1 (en) * | 2009-12-17 | 2011-06-23 | Shi-Cheol Song | Method for detecting touch and optical touch sensing system |
US20110157050A1 (en) * | 2009-12-24 | 2011-06-30 | Hyung-Uk Jang | Assembly having display panel and optical sensing frame and display system using the same |
US8970554B2 (en) * | 2009-12-24 | 2015-03-03 | Lg Display Co., Ltd. | Assembly having display panel and optical sensing frame and display system using the same |
CN102109933A (en) * | 2009-12-24 | 2011-06-29 | 乐金显示有限公司 | Assembly having display panel and optical sensing frame and display system using the same |
KR20110075723A (en) * | 2009-12-28 | 2011-07-06 | LG Display Co., Ltd. | Compensation method for touch sensitiveness of display device including touch assembly |
KR101658146B1 (en) * | 2009-12-28 | 2016-09-20 | LG Display Co., Ltd. | Compensation Method for Touch Sensitiveness of Display Device Including Touch Assembly |
US20110175849A1 (en) * | 2010-01-18 | 2011-07-21 | Acer Incorporated | Optical touch display device and method |
US8711125B2 (en) * | 2010-02-04 | 2014-04-29 | Hong Kong Applied Science And Technology Research Institute Co. Ltd. | Coordinate locating method and apparatus |
US20110116105A1 (en) * | 2010-02-04 | 2011-05-19 | Hong Kong Applied Science and Technology Research Institute Company Limited | Coordinate locating method and apparatus |
US8937612B2 (en) | 2010-02-04 | 2015-01-20 | Hong Kong Applied Science And Technology Research Institute Co. Ltd. | Coordinate locating method, coordinate locating device, and display apparatus comprising the coordinate locating device |
US20110109565A1 (en) * | 2010-02-04 | 2011-05-12 | Hong Kong Applied Science And Technology Research Institute Co. Ltd. | Coordinate locating method, coordinate locating device, and display apparatus comprising the coordinate locating device |
US20110199337A1 (en) * | 2010-02-12 | 2011-08-18 | Qisda Corporation | Object-detecting system and method by use of non-coincident fields of light |
US20110261016A1 (en) * | 2010-04-23 | 2011-10-27 | Sunplus Innovation Technology Inc. | Optical touch screen system and method for recognizing a relative distance of objects |
US20110298708A1 (en) * | 2010-06-07 | 2011-12-08 | Microsoft Corporation | Virtual Touch Interface |
US20110304535A1 (en) * | 2010-06-15 | 2011-12-15 | Canon Kabushiki Kaisha | Coordinate input apparatus |
US9063618B2 (en) * | 2010-06-15 | 2015-06-23 | Canon Kabushiki Kaisha | Coordinate input apparatus |
KR101726629B1 (en) | 2010-11-05 | 2017-04-13 | LG Display Co., Ltd. | Method For Detecting Touch |
KR20120048389A (en) * | 2010-11-05 | 2012-05-15 | LG Display Co., Ltd. | Method for detecting touch |
US20130222237A1 (en) * | 2010-11-12 | 2013-08-29 | 3M Innovative Properties Company | Interactive polarization-preserving projection display |
US9454241B2 (en) * | 2010-11-12 | 2016-09-27 | 3M Innovative Properties Company | Interactive polarization-preserving projection display |
US9218091B2 (en) * | 2011-07-18 | 2015-12-22 | Pixart Imaging Inc. | Optical touch panel assembly and light sensor thereof |
US20130021299A1 (en) * | 2011-07-18 | 2013-01-24 | Pixart Imaging Inc. | Optical touch panel assembly and light sensor thereof |
US20130265283A1 (en) * | 2012-04-10 | 2013-10-10 | Pixart Imaging Inc. | Optical operation system |
CN103049109A (en) * | 2012-12-20 | 2013-04-17 | 广州视睿电子科技有限公司 | Stylus and touch point identification method |
WO2018043805A1 (en) * | 2016-08-29 | 2018-03-08 | Hanshin University Industry-Academic Cooperation Foundation | Pointing apparatus using three-dimensional virtual button |
KR101926819B1 (en) * | 2016-08-29 | 2018-12-07 | Hanshin University Industry-Academic Cooperation Foundation | Pointing device using three dimensional virtual button |
Also Published As
Publication number | Publication date |
---|---|
US20050178953A1 (en) | 2005-08-18 |
US7232986B2 (en) | 2007-06-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7232986B2 (en) | Apparatus for detecting a pointer within a region of interest | |
US8035612B2 (en) | Self-contained interactive video display system | |
US8035614B2 (en) | Interactive video window | |
US6972401B2 (en) | Illuminated bezel and touch system incorporating the same | |
EP2026170B1 (en) | Position detecting device | |
US7460110B2 (en) | Dual mode touch system | |
KR101258587B1 (en) | Self-Contained Interactive Video Display System | |
KR101247095B1 (en) | Uniform illumination of interactive display panel | |
US8102377B2 (en) | Portable interactive media presentation system | |
US20100201812A1 (en) | Active display feedback in interactive input systems | |
KR20120058594A (en) | Interactive input system with improved signal-to-noise ratio (snr) and image capture method | |
JP2010534367A (en) | Touch screen based on leaky total internal reflection | |
US20120249480A1 (en) | Interactive input system incorporating multi-angle reflecting structure | |
US20160301900A1 (en) | Touch screen rear projection display | |
US20110095989A1 (en) | Interactive input system and bezel therefor | |
WO2014049331A1 (en) | Touch sensing systems | |
CN103348306B (en) | Interactive display device and method for the same | |
KR101002072B1 (en) | Apparatus for touching a projection of images on an infrared screen | |
CN102646003B (en) | Sensing system | |
CN102141859B (en) | Optical touch display device and method | |
TWI788120B (en) | Non-contact elevator control system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |