WO2013108031A2 - Touch sensitive image display devices - Google Patents

Touch sensitive image display devices

Info

Publication number
WO2013108031A2
WO2013108031A2 PCT/GB2013/050103 GB2013050103W
Authority
WO
WIPO (PCT)
Prior art keywords
touch
light
image
sheet
display device
Prior art date
Application number
PCT/GB2013/050103
Other languages
French (fr)
Other versions
WO2013108031A3 (en)
Inventor
Euan Christopher Smith
Gareth John Mccaughan
Adrian James Cable
Paul Richard Routley
Raul Benet Ballester
Original Assignee
Light Blue Optics Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from GBGB1200963.5A external-priority patent/GB201200963D0/en
Priority claimed from GBGB1201009.6A external-priority patent/GB201201009D0/en
Priority claimed from GBGB1205274.2A external-priority patent/GB201205274D0/en
Application filed by Light Blue Optics Limited filed Critical Light Blue Optics Limited
Priority to GB1413670.9A priority Critical patent/GB2513498A/en
Publication of WO2013108031A2 publication Critical patent/WO2013108031A2/en
Publication of WO2013108031A3 publication Critical patent/WO2013108031A3/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • G06F3/0426Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected tracking fingers with respect to a virtual keyboard projected or printed on the surface

Definitions

  • This invention relates to touch sensitive image display devices of the type which project a sheet of light adjacent the displayed image, and to touch sensing systems. Some embodiments of the invention relate to techniques for improved identification of touch objects and/or pens, and in embodiments to techniques for distinguishing between different touch objects and/or pens. Other embodiments of the invention relate particularly to improved techniques for generating the sheet of light.
  • a touch sensitive image display device comprising: an image projector to project a displayed image onto a surface; a touch sensor optical system to project light defining a touch sheet above said displayed image; a camera directed to capture a touch sense image from a region including at least a portion of said touch sheet, said touch sense image comprising light scattered from said touch sheet by an object approaching said displayed image; and a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a location of said object relative to said displayed image; wherein said object is provided with a reflective element, in embodiments a substantially retroreflective element, to reflect light from said touch sheet.
  • Employing a (retro)reflector for example a corner cube reflector, facilitates operation of the device by providing a bright, easily detected object in the captured touch sense image. Furthermore an object of this type may be distinguished from another object, such as a finger, on the basis of the differential brightness of the response.
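The brightness-based discrimination described above can be illustrated with a short sketch. This is not the patent's implementation; the threshold values and the `classify_blob` helper are illustrative assumptions, chosen only to show how a retroreflective return (near sensor saturation) might be separated from the weaker scatter of a finger in an 8-bit touch sense image.

```python
# Hypothetical 8-bit brightness bands: a retroreflector returns far more
# of the touch-sheet light than skin scatter does, so a retroreflective
# tip and a finger occupy separable brightness ranges (values assumed).
FINGER_MIN, FINGER_MAX = 30, 120
RETRO_MIN = 200

def classify_blob(blob_pixels):
    """Classify a detected blob as 'retro', 'finger' or 'unknown'
    from its peak brightness in the touch sense image."""
    peak = max(blob_pixels)
    if peak >= RETRO_MIN:
        return "retro"
    if FINGER_MIN <= peak <= FINGER_MAX:
        return "finger"
    return "unknown"

print(classify_blob([40, 250, 230]))  # a retroreflective pen tip
print(classify_blob([35, 80, 60]))    # a bare finger
```

In practice the bands would be calibrated per device, since scattered intensity also varies with distance from the camera.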
  • the retroreflective element may comprise retroreflective tape (for example, available from 3M Inc).
  • the retroreflective element need not be precisely retroreflective - provided that the light is reflected back along approximately the same path the arrangement will function effectively. Thus, for example, up to 10 degrees (or more) deviation from exact retroreflection may be tolerable.
  • a scattering surface for example comprising white paint or tape, is provided behind the retroreflective element to scatter the small amounts of light which pass through the retroreflective element back towards the camera. It has been found important in practice to provide a diffuser over the retroreflective element, in particular where the camera is looking down on the touch sheet at an acute angle.
  • the touch sheet projection system is arranged to project a touch sheet so that it extends in a substantially planar fashion (whether or not comprising a continuous sheet of light) just above the display surface.
  • a suitable spread of diffused light is greater than +/-10°, +/-15° or +/-20°.
  • the object detected by the touch sensing system comprises a pen.
  • the system is configured to detect both one or more pens and one or more user fingers.
  • the retroreflective element may be mounted on the tip of the pen.
  • the diffuser is preferably anisotropic, that is diffusing light more in a vertical direction perpendicular to the plane of the display surface than in a horizontal direction within the lateral thickness of the touch sheet.
  • where the retroreflective element and diffuser are mounted on a pen, the diffuser is substantially one-dimensional, preferentially spreading the reflected light in a direction aligned with a longitudinal axis of the pen.
  • the diffuser may provide a small spread perpendicular to the main spreading direction to provide some tolerance for the user holding the pen at a slight angle rather than perpendicular to the display surface.
  • the diffuser may provide a spread of less than 5° for example 2-3°.
  • a plurality of objects is provided each having an optically distinguishable response to light from the touch sheet, and the device may then further comprise a system to distinguish between these responses to distinguish between the objects.
  • the touch sense camera itself may distinguish between different objects, for example by distinguishing between different "IR colours", that is different absorption/reflection responses over a portion of the IR spectrum.
  • an IR barcode or some other distinguishing feature may be provided on a pen or other object, arranging the distinguishing marks so that they are seen when illuminated by a relatively thin sheet of IR illumination.
  • a pattern for example along the length of a pen, may produce different distinguishable responses from the pattern on another object or objects as the pen is inserted into and through the touch sheet.
  • the pen or other object may be provided with an IR- responsive phosphor, for example of the type which is pumped by ambient illumination and stimulated by IR.
  • a pen/object is provided with a polariser, preferably (but not essentially) in combination with the aforementioned retroreflector to polarise the reflected light.
  • the polariser is a circular polariser so that the response is substantially insensitive to orientation, but in principle a linear or elliptical polariser may alternatively be employed.
  • a circular polariser may be implemented, for example, by a quarter wave plate at the wavelength of the light defining the touch sheet.
  • a circular polariser comprises a quarter wave plate and a linear polariser but where, say, the touch sheet is already polarised for example because it is generated by a linearly polarised laser, only a quarter wave plate need be employed.
  • a circular polariser enables the labelled object to be distinguished from another unlabelled object such as a finger.
  • at least two objects are provided, one with a left circular polariser the other with a right circular polariser.
  • the distinguishing system may then comprise left and right polarisation sensitive sensors.
  • three sensors for example photodiodes, are provided one with a left polariser, one with a right polariser and one unpolarised, facilitating distinguishing between a left polarised object, a right polarised object and an unpolarised object such as a finger.
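The three-photodiode arrangement above admits a simple decision rule. The following sketch is an illustration only (the `ratio` threshold and the normalisation by the unpolarised reference are choices of this example, not taken from the patent): a return that strongly favours one circularly polarising sensor is attributed to the matching pen, while a balanced return is attributed to an unpolarised object such as a finger.

```python
def classify_polarisation(left, right, unpol, ratio=2.0):
    """Classify a touch object from three photodiode readings: one behind
    a left circular polariser, one behind a right circular polariser, and
    one unpolarised (used as a reference). `ratio` is an assumed
    discrimination threshold.

    A circularly polarised return passes the matching polariser and is
    largely blocked by the opposite one; an unpolarised return (e.g. from
    a finger) illuminates both roughly equally."""
    if unpol <= 0:
        return "none"           # no detectable return at all
    l, r = left / unpol, right / unpol
    if l > ratio * r:
        return "left-pen"
    if r > ratio * l:
        return "right-pen"
    return "finger"

print(classify_polarisation(0.9, 0.1, 1.0))  # left-pen
print(classify_polarisation(0.1, 0.8, 1.0))  # right-pen
print(classify_polarisation(0.5, 0.5, 1.0))  # finger
```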
  • the touch sheet may be arranged to provide pulses or a pulse train of light, or may be modulated at a high frequency (in combination, for example, with a phase-locked loop detection system).
  • the time-of-flight detection system need not be used to locate the precise location of a labelled object since this is performed using the existing touch detection system; instead the time-of-flight may be used to link a polarisation-labelled pen with a known (detected) touch location. Since preferred embodiments of the system include touch position tracking, and since simultaneous initiation of touch events is relatively rare, it is not essential to employ time-of-flight or other techniques to distinguish between differently labelled pens, but it may be advantageous in some circumstances.
  • Embodiments of the above described techniques enable pens or other objects to be labelled with notional colours and the system may then be configured to output a pen label identification (colour) signal in association with a position for the object (pen).
  • embodiments of the system may be used, for example, to provide a multicolour drawing or writing facility for an interactive whiteboard.
  • Embodiments of the system may be employed to provide a "passive" pen or other object with one or more user controls such as left-click and right-click buttons. (Here a "passive" object is one which lacks an electrical power source).
  • the object has a user-controllable reflective element such that, under user control of the object, a property of light reflected from the touch sheet is controllable.
  • the signal processor may then be configured to detect (a change in) this said property to identify operation of the user control and to output corresponding user control data in response.
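Detecting operation of such a user control amounts to watching a reflected-light property for a change. A minimal sketch, assuming the property has been reduced to a per-frame scalar (e.g. normalised blob brightness) and using hysteresis thresholds invented for this example, might look like:

```python
def detect_control_events(samples, hi=0.7, lo=0.3):
    """Turn a stream of per-frame reflected-property measurements into
    button events using a simple hysteresis threshold. `hi` and `lo`
    are assumed values; hysteresis avoids chatter near the threshold."""
    events, pressed = [], False
    for frame, value in enumerate(samples):
        if not pressed and value >= hi:
            pressed = True
            events.append((frame, "press"))
        elif pressed and value <= lo:
            pressed = False
            events.append((frame, "release"))
    return events

# Brightness rises while the user exposes the bright region, then falls:
stream = [0.1, 0.2, 0.8, 0.9, 0.85, 0.2, 0.1]
print(detect_control_events(stream))  # [(2, 'press'), (5, 'release')]
```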
  • the user-controllable reflective element may simply be a region on the pen or other object by means of which the user is able to selectively alter the response of the object to the illuminating touch sheet.
  • regions may have different brightnesses (light or dark spots) or polarisation characteristics and the user may simply cover one or more of these with a finger or change the orientation of the object/pen so that one or other is visible to the touch camera.
  • different sides of a pen nib or different ends of a pen may have a different IR colour or response: in this case the user rotates or flips the pen to operate the user control.
  • the pen or other object comprises a user control, in particular a mechanical control, for the user-controllable reflective element operable to selectively alter a reflected light response of the object to light from the touch sheet.
  • the user control may comprise two controls or a three-way control to selectively display (expose or cover) left circular polarised and right circular polarised light-polarising regions of the object.
  • the touch sensing system may be configured to "see" in the visible during normal use of the touch sensitive display device. More generally, as previously mentioned, pens may be labelled with IR-distinguishable labels.
  • the invention provides a touch sensitive image display device, the device comprising: an image projector to project a displayed image onto a surface; a touch sensor optical system to project non-visible light defining a touch sheet above said displayed image; a touch camera directed to capture a touch sense image from a region including at least a portion of said touch sheet, said touch sense image comprising light scattered from said touch sheet by an object approaching said displayed image; and a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a location of said object relative to said displayed image; wherein said touch camera, or another said camera of said display aligned with said touch camera, is configured to capture an object image of one or more said objects; and wherein said signal processor is configured to use said captured object image to distinguish between multiple said objects.
  • a visible (400nm-700nm) light sensitive camera is provided alongside (or mechanically connected to) the IR-sensitive camera, with a separate lens, but aligned to the IR camera such that the images from the two cameras can be aligned with one another.
  • a visually distinguishable feature of this touch object may then be used to label the object.
  • differently coloured objects or pens may be differently labelled or labelled with their respective "visual" colours.
  • the IR sensitive and visible light sensitive cameras share a portion of their optical path, typically a front end lens and optionally other portions of the optics, and a proportion of the light for the IR sensor is tapped off and provided to the visible light sensor.
  • the touch camera may be provided with a spatially patterned wavelength-selective filter so that some portions of the image sensor see visible light and other portions see the non-visible light, typically IR light, scattered from the touch sheet.
  • a filter is a chequerboard pattern type filter similar to a Bayer filter.
  • an anti- aliasing filter in combination with the spatially patterned filter to mitigate the effects of loss of resolution, broadly speaking by blurring small features.
  • an anti-aliasing filter may be implemented using two layers of birefringent material.
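The spatially patterned filter can be illustrated with a toy demosaic step. Assuming, purely for illustration, a chequerboard in which pixels of even row+column parity see visible light and the remaining pixels see IR (the real filter geometry is not specified at this level of detail), splitting a sensor frame into its two channels might look like:

```python
def split_chequerboard(frame):
    """Separate a sensor frame captured through an assumed chequerboard
    IR/visible filter: even-parity pixels (row+col even) see visible
    light, odd-parity pixels see IR. Missing samples are left as None;
    a real system would interpolate (demosaic) them, optionally after
    optical anti-aliasing as described above."""
    h, w = len(frame), len(frame[0])
    visible = [[None] * w for _ in range(h)]
    ir = [[None] * w for _ in range(h)]
    for r in range(h):
        for c in range(w):
            if (r + c) % 2 == 0:
                visible[r][c] = frame[r][c]
            else:
                ir[r][c] = frame[r][c]
    return visible, ir

vis, ir = split_chequerboard([[1, 2], [3, 4]])
print(vis)  # [[1, None], [None, 4]]
print(ir)   # [[None, 2], [3, None]]
```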
  • blanking intervals between the display of different colour planes in a multicolour image projector, for example of the digital micromirror type, may be used to capture light scattered from the touch sheet, and the illumination of the projector itself may be employed to capture an image of an object in one or more visible light colours.
  • the blanking period may alternatively be used for separate red, green and/or blue illumination of an object, particularly if this is brief, and so forth.
  • each of these approaches provides a mechanism whereby, say, a red pen may be employed, in effect, to write in red on the display and so forth. Further, since an additional image of the object or objects is available, not restricted to the intersection of the object with the touch sheet, this information may be employed to track one or more of the objects in three dimensions, for example to provide a gesture interpretation or other facility.
  • the invention also provides methods for distinguishing between touch objects along similar lines to those described above.
  • the invention still further provides a non-transitory data carrier carrying processor control code and/or data to implement such methods in either software, or software- defined hardware, or a combination of the two.
  • a touch sensor optical system to generate light defining a touch sheet for a touch sensitive image display device, the optical system comprising: at least one light source; a first 1D optical spreading device illuminated by said at least one light source to spread light from said light source in one dimension to generate a first fan of light; and a second 1D optical spreading device illuminated by said first fan of light to spread light from said first fan of light to generate a second fan of light; wherein said first fan of light provides an extended light source, extended in said one dimension, for said second fan of light; and wherein at least some locations within said second fan of light receive illumination from a plurality of different directions.
  • embodiments employing two concatenated stages of optical spreading provide a number of advantages: because there is a broad, extended light source for the second spreading device, in effect multiple fans are overlapped within the sheet of light, thus providing illumination from more than one direction and reducing the risk of one object/finger shadowing another within the touch plane. Furthermore, embodiments of this approach facilitate achieving improved coverage over a rectangular surface because, in effect, different parts of the surface are illuminated by fans originating from different parts of the second optical spreading device, thus facilitating coverage across the "near" edge of the display surface.
  • the increased extent of the light source helps in achieving eye safety and, more particularly, enables the laser power to be increased whilst remaining eye safe, thus improving the overall signal-to-noise ratio of the touch sensing system. This in turn facilitates coverage over a large display area.
  • the light constituting the sheet may, for example, diverge away from the light source or converge away from the light source (for example if some focussing power is added).
  • a converging configuration can be helpful in increasing the power density of the sheet of laser light within the sheet with increasing distance from the emitter, for example to partially or wholly compensate for a reducing power density as the light fans out.
  • compensation may be applied independently of whether or not multiple fans of light are used to generate the sheet.
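The motivation for a converging sheet can be seen from a simple geometric model of a diverging fan, in which the projected power spreads over an arc that grows linearly with distance. The model below is a simplification assumed for illustration (uniform fan, constant sheet thickness), not a description of the actual optics:

```python
import math

def fan_power_density(total_power_mw, fan_angle_deg, distance_m,
                      sheet_thickness_mm):
    """Approximate power density (mW/mm^2) in a diverging fan of light:
    the power spreads over an arc of length distance * fan_angle (in
    radians), times the sheet thickness. Simplified geometric model."""
    arc_mm = distance_m * 1000 * math.radians(fan_angle_deg)
    return total_power_mw / (arc_mm * sheet_thickness_mm)

# Density falls as 1/distance, which a slightly converging sheet can
# partially or wholly compensate:
d_near = fan_power_density(100, 90, 0.5, 1.0)
d_far = fan_power_density(100, 90, 1.0, 1.0)
print(round(d_near / d_far, 2))  # 2.0
```

Doubling the distance halves the power density; adding focusing power counteracts exactly this falloff.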
  • one or more light sources and first spreading devices illuminate a plurality of the second stage spreading devices to generate a plurality of overlapping second fans of light.
  • these fans overlap at least along the majority of their edges within the display area because the intensity profile of the edge of a fan can exhibit artefacts which may otherwise appear as spurious object-detection events.
  • the second stage spreading devices may be located at intervals along one edge of the display area/sheet of light, optionally pointing in different directions (where "pointing" here refers to the direction of an optical axis, which is generally perpendicular to a line or plane in which the spreading device extends).
  • multiple separate light sources are used to generate the plurality of overlapping fans.
  • a single light source illuminating multiple first stage optical spreading devices each with one or more second stage spreading devices may be employed.
  • a single light source and first stage spreading device is employed, and the first stage spreading device may then be configured to provide a multi-peaked intensity distribution to approximate to or mimic the use of multiple separate sources.
  • a single laser may be employed to effectively provide three light sources each generating a respective overlapping fan. This can be useful in achieving eye safety.
  • the multi-peaked intensity distribution may be achieved, for example, by a suitably shaped lens (surface) profile and/or by employing a holographic optical element as a spreading device.
  • the second spreading device comprises a lenticular lens array, for example in the form of a film.
  • a typical lenticular array has (one-dimensional) lenslets with a width of less than 1 mm or 0.5 mm; the focal length may be less than ten times the width, for example between two and six times the width.
  • Each lenslet may take the form of an approximation to a cylindrical lens (although one surface of the array is typically flat).
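As a rough illustration of these dimensions, a paraxial estimate of the fan angle produced by one lenslet under collimated illumination is twice atan((width/2)/focal length). This is a simplified model assumed for this sketch, not a figure from the patent:

```python
import math

def lenslet_fan_angle_deg(width_mm, focal_length_mm):
    """Approximate full fan angle produced by one cylindrical lenslet
    illuminated by collimated light: rays converge at the focal line and
    then diverge across the lenslet aperture, giving a half-angle of
    atan((w/2)/f). Paraxial estimate, an assumption of this sketch."""
    half = math.atan((width_mm / 2) / focal_length_mm)
    return 2 * math.degrees(half)

# A 0.5 mm lenslet with a focal length of 2x its width (1 mm), at the
# low end of the 2x-6x range suggested in the text:
print(round(lenslet_fan_angle_deg(0.5, 1.0), 1))  # 28.1
```

Shorter focal lengths (relative to lenslet width) give wider fans, which is why the focal length is kept to a small multiple of the width.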
  • the touch sensor optical system employs a laser light source followed by a collimation system to illuminate the first spreading device with a spatial extent of at least 1 cm.
  • a lenticular array may also be employed for the first spreading device, but alternatively a cylindrical lens or some other 1D spreader lens profile may be employed, for example a profile with one flat surface and a second sinusoidally ridged surface. Where the surface has multiple ridges or peaks, preferably the laser illumination covers more than one of these ridges or peaks so that there is some averaging over non-uniformities in the surface profile.
  • a spreader lens element may have dimensions, for example a width between adjacent ridges, of greater than 10x, 50x or 100x corresponding dimensions of the lenticular array.
  • a "macroscopic" lens or surface of this type may, optionally, also be employed for the first optical spreading device. Additionally or alternatively one or both of the first and second optical spreading devices may incorporate or consist of a holographic optical element.
  • the light source comprises a stripe-emitter laser diode (sometimes referred to as a broad area laser diode).
  • Such a laser has an output beam which has a high beam divergence in the short-direction of the (rectangular) output face and a lower beam divergence in the long-direction of the output face.
  • the long-direction of the output face of the laser is parallel to the sheet of light because the beam is more easily collimated in the vertical or short-direction to provide a substantially flat (or slightly diverging or converging) sheet of light.
  • the invention further provides a touch sensitive image display device, the device comprising: an image projector to project a displayed image onto a surface; a touch sensor optical system to project light defining a touch sheet above said displayed image; a camera directed to capture a touch sense image from a region including at least a portion of said touch sheet, said touch sense image comprising light scattered from said touch sheet by an object approaching said displayed image; and a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a location of said object relative to said displayed image; wherein said touch sensor optical system comprises a stripe-emitter laser aligned such that a long direction of a stripe-output face of the laser is parallel to said touch sheet.
  • the above described touch sensor optical system is incorporated into a touch sensitive image display device comprising a projector to project a displayed image at an acute angle onto a surface, typically generally in front of the device, using the above described optical system to project the light defining the touch sheet just above the displayed image.
  • this sheet of light is non-visible, for example in the infra red.
  • a camera is directed, also at an acute angle, to capture light scattered from the sheet by an object/finger interacting with the display. In embodiments the camera is co-located with the image projector and may share some or the majority of the projection optics.
  • a signal processor is employed to process the image from the camera to identify the locations of one or more fingers/objects touching the image, for use in interacting with the displayed image.
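A minimal version of such processing, in the spirit of the "crude peak locator" mentioned in connection with Figures 3a to 3e, can be sketched as follows. The thresholding, 4-connected flood fill and intensity-weighted centroid are implementation choices of this example, not details taken from the patent:

```python
def find_touch_centroids(image, threshold):
    """Locate touch events in a touch sense image (a 2D list of
    brightness values): threshold the image, group above-threshold
    pixels into 4-connected blobs by flood fill, and return the
    intensity-weighted centroid (row, col) of each blob."""
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    centroids = []
    for r in range(h):
        for c in range(w):
            if image[r][c] >= threshold and not seen[r][c]:
                stack, blob = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    blob.append((y, x, image[y][x]))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w and
                                not seen[ny][nx] and
                                image[ny][nx] >= threshold):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                total = sum(v for _, _, v in blob)
                cy = sum(y * v for y, _, v in blob) / total
                cx = sum(x * v for _, x, v in blob) / total
                centroids.append((cy, cx))
    return centroids

img = [[0, 0, 0, 0],
       [0, 9, 9, 0],
       [0, 9, 9, 0],
       [0, 0, 0, 0]]
print(find_touch_centroids(img, 5))  # [(1.5, 1.5)]
```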
  • the invention provides a method of touch sensing in a touch sensitive image display device, the method comprising: projecting a displayed image onto a surface; projecting light defining a touch sheet above said displayed image; capturing a touch sense image from a region including at least a portion of said touch sheet, said touch sense image comprising light scattered from said touch sheet by an object approaching said displayed image; and processing said touch sense image to identify a location of said object relative to said displayed image; wherein said projecting of said light defining said touch sheet comprises projecting a plurality of overlapping fans of light above said displayed image.
  • the overlapping fans of light comprise fans projecting in at least two different directions, and overlapping at least along an edge of a fan.
  • these fans overlap within the thickness of the sheet of light to define a single continuous light sheet.
  • the fans of light may be projected from different locations along the edge of the display area/light sheet and/or may point in different directions, with the aim of achieving optimal coverage of the display area.
  • first and second stage optical spreading devices are employed, though this is not essential.
  • the overlapping fans of light may be provided using a common light source for each of a plurality of optical spreading devices, or employing a separate light source or laser for each spreading device.
  • the invention provides a touch sensitive image display device, the device comprising: an image projector to project a displayed image onto a surface; a touch sensor optical system to project light defining a touch sheet above said displayed image; a camera directed to capture a touch sense image from a region including at least a portion of said touch sheet, said touch sense image comprising light scattered from said touch sheet by an object approaching said displayed image; and a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a location of said object relative to said displayed image; wherein said light defining said touch sheet comprises a plurality of overlapping fans of light.
  • the touch sensor optical system comprises at least one light source and a plurality of optical spreading devices illuminated by the light source each projecting a respective fan of light, the fans of light overlapping within the light sheet.
  • the optical spreading devices may be at different locations and/or oriented to direct (optical axis) of the fans in different directions.
  • the touch sensor optical system may comprise a stripe-emitter laser aligned with a long direction of a stripe-output face of the laser parallel to the touch sheet.
  • embodiments of the system comprise first and second stage optical spreading devices for reduced shadowing, and improved coverage and eye safety.
  • the light defining the touch sheet may be a plane or fan of light formed by a beam spreader.
  • the touch sheet may comprise beams defining a set of stripes or a comb.
  • one or more scanned and/or interlaced light beams may be employed to define the touch sheet.
  • the invention also provides a touch sensitive image display device, the device comprising: an image projector to project a displayed image onto a surface; a touch sensor optical system to project light defining a touch sheet above said displayed image; a camera directed to capture a touch sense image from a region including at least a portion of said touch sheet, said touch sense image comprising light scattered from said touch sheet by an object approaching said displayed image; and a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a location of said object relative to said displayed image; wherein said touch sensor optical system is configured to project a spatially and/or temporally discontinuous light structure to define said touch sheet.
  • the touch sensor optical system may include a mechanical scanner, such as a spinning polygonal mirror, to sweep one or more collimated beams over the touch area.
  • a mechanical scanner such as a spinning polygonal mirror
  • an optical element, for example a diffractive optical element, may be employed to fan the beam out into a sequence of stripes or a comb to define a touch sensing sheet.
  • Use of such a comb can assist in detecting the touch object in high ambient light conditions, and the spatial frequency of these beams, as perceived by the camera, can provide additional information on the distance of the touch object, to improve position detection accuracy.
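The distance cue from the comb's spatial frequency can be illustrated with a small-angle model: beams at a fixed angular pitch fan out, so their lateral spacing grows linearly with distance from the comb source, and the measured spacing can be inverted to estimate distance. Both functions below are assumptions of this sketch rather than the patent's method:

```python
import math

def stripe_spacing_mm(distance_m, comb_pitch_deg):
    """Lateral spacing between adjacent comb beams at a given distance
    from the comb source: the beams fan out at a fixed angular pitch,
    so spacing grows linearly with distance (small-angle model)."""
    return distance_m * 1000 * math.radians(comb_pitch_deg)

def estimate_distance_m(measured_spacing_mm, comb_pitch_deg):
    """Invert the model: estimate the touch object's distance from the
    stripe spacing observed in the touch camera image."""
    return measured_spacing_mm / (1000 * math.radians(comb_pitch_deg))

# Round trip: an object at 0.8 m with a 1 degree comb pitch
spacing = stripe_spacing_mm(0.8, 1.0)
print(round(estimate_distance_m(spacing, 1.0), 3))  # 0.8
```

In a real system the camera's perspective would also scale the observed spacing, so the mapping would be calibrated rather than computed from geometry alone.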
  • it is preferable to employ a substantially single-mode laser in the touch sheet projector.
  • Embodiments of each of the above described aspects of the invention may be used in a range of touch-sensing display applications. For example, some of the techniques we describe may be used to detect the position(s) of one or more objects in mid-air. However, embodiments of the invention are particularly useful for large area touch coverage, for example in interactive whiteboard or similar applications.
  • Embodiments of each of the above described aspects of the invention are not limited to use with any particular type of projection technology.
  • the techniques of the invention may also be applied to other forms of projection technology including, but not limited to, digital micromirror-based projectors such as projectors based on DLP™ (Digital Light Processing) technology from Texas Instruments, Inc.

BRIEF DESCRIPTION OF THE DRAWINGS
  • Figures 1a and 1b show, respectively, a vertical cross section view through an example touch sensitive image display device suitable for implementing embodiments of the invention, and details of a sheet of light-based touch sensing system for the device;
  • Figures 2a and 2b show, respectively, a holographic image projection system for use with the device of Figure 1, and a functional block diagram of the device of Figure 1;
  • Figures 3a to 3e show, respectively, an embodiment of a touch sensitive image display device according to an aspect of the invention, use of a crude peak locator to find finger centroids, and the resulting finger locations;
  • Figures 4a and 4b show, respectively, a plan view and a side view of an interactive whiteboard incorporating a touch sensitive image display suitable for implementing an embodiment of the invention;
  • Figures 5a to 5e show, respectively, a touch sensitive image display system incorporating polarizing pen/object identification according to an embodiment of the invention, a photosensor module for the system of Figure 5a, a polarizing pen for use in the system, details of a retroreflecting stack for the polarizing pen, and details of processing for the system;
  • Figures 6a to 6c show, respectively, example captured touch sense images illustrating shadowing, the use of multiple IR sources to reduce the likelihood of shadowed events, and a schematic illustration of artefacts at the edge of a fan of light;
  • Figures 7a and 7b show, respectively, an example IR fan generator for the apparatus of Figure 4, and a development of the IR fan generator incorporating a second optical spreader;
  • Figures 8a and 8b show cross-sections through, respectively, a lenticular form and a spreader lens for use as optical spreaders in the touch sensor optical system of Figure 7;
  • Figure 9 shows, schematically, a stripe-emitter laser which may be employed in embodiments of the invention;
  • Figures 10a to 10c show an embodiment of a touch event capture system using a comb generator in, respectively, plan and side view; and an example of a comb-of-light touch sheet configured to focus towards the rear of the image area;
  • Figures 11a and 11b show representations of a touch-camera view for, respectively, a plane-of-light touch sheet and a comb-of-light touch sheet; and Figures 12a and 12b show representations of a spatial frequency structure of light scattered from a comb-of-light touch sheet by, respectively, a touch object close to the comb source and a touch object further from the comb source.
  • Figures 1a and 1b show an example touch sensitive holographic image projection device 100 comprising a holographic image projection module 200 and a touch sensing system 250, 258, 260 in a housing 102.
  • a proximity sensor 104 may be employed to selectively power-up the device on detection of proximity of a user to the device.
  • a holographic image projector is merely described by way of example; the techniques we describe herein may be employed with any type of image projection system.
  • the holographic image projection module 200 is configured to project downwards and outwards onto a flat surface such as a tabletop. This entails projecting at an acute angle onto the display surface (the angle between a line joining the centre of the output of the projection optics and the middle of the displayed image and a line in a plane of the displayed image is less than 90°).
  • A holographic image projector is particularly suited to this application because it can provide a wide throw angle, long depth of field, and substantial distortion correction without significant loss of brightness/efficiency. Boundaries of the light forming the displayed image 150 are indicated by lines 150a,b.
  • the touch sensing system 250, 258, 260 comprises an infrared laser illumination system (IR line generator) 250 configured to project a sheet of infrared light 256 just above, for example ~1mm above, the surface of the displayed image 150 (although in principle the displayed image could be distant from the touch sensing surface).
  • the laser illumination system 250 may comprise an IR LED or laser 252, preferably collimated, then expanded in one direction by light sheet optics 254, which may comprise a negative or cylindrical lens.
  • light sheet optics 254 may include a 45 degree mirror adjacent the base of the housing 102 to fold the optical path to facilitate locating the plane of light just above the displayed image.
  • a CMOS imaging sensor (touch camera) 260, provided with an IR-pass lens 258, captures light scattered by touching the displayed image 150 with an object such as a finger, through the sheet of infrared light 256.
  • the boundaries of the CMOS imaging sensor field of view are indicated by lines 257, 257a,b.
  • the touch camera 260 provides an output to touch detect signal processing circuitry as described further later.
  • Example holographic image projection system: Figure 2a shows an example holographic image projection system architecture 200 in which the SLM may advantageously be employed.
  • the architecture of Figure 2 uses dual SLM modulation - low resolution phase modulation and higher resolution amplitude (intensity) modulation. This can provide substantial improvements in image quality, power consumption and physical size.
  • the primary gain of holographic projection over imaging is one of energy efficiency.
  • the low spatial frequencies of an image can be rendered holographically to maintain efficiency and the high- frequency components can be rendered with an intensity-modulating imaging panel, placed in a plane conjugate to the hologram SLM.
  • diffracted light from the hologram SLM device (SLM1 ) is used to illuminate the imaging SLM device (SLM2).
  • the hologram SLM is preferably a fast multi-phase device, for example a pixellated MEMS-based piston actuator device.
  • SLM1 is a pixellated MEMS-based piston actuator SLM as described above, to display a hologram - for example a 160 × 160 pixel device with physically small lateral dimensions, e.g. <5mm or <1mm.
  • L1 , L2 and L3 are collimation lenses (optional, depending upon the laser output) for respective Red, Green and Blue lasers.
  • M1, M2 and M3 are dichroic mirrors, implemented as a prism assembly.
  • M4 is a beam turning mirror.
  • SLM2 is an imaging SLM and has a resolution at least equal to the target image resolution (e.g. 854 ⁇ 480); it may comprise a LCOS (liquid crystal on silicon) or DMD (Digital Micromirror Device) panel.
  • LCOS liquid crystal on silicon
  • DMD Digital Micromirror Device
  • Diffraction optics 210 comprises lenses LD1 and LD2, forms an intermediate image plane on the surface of SLM2, and has effective focal length f such that fλ/Δ covers the active area of imaging SLM2 (where λ is the illumination wavelength and Δ the hologram pixel pitch).
  • optics 210 perform a spatial Fourier transform to form a far field illumination pattern in the Fourier plane, which illuminates SLM2.
  • PBS2 (Polarising Beam Splitter 2) transmits incident light to SLM2, and reflects emergent light into the relay optics 212 (liquid crystal SLM2 rotates the polarisation by 90 degrees).
  • PBS2 preferably has a clear aperture at least as large as the active area of SLM2.
  • Relay optics 212 relay light to the diffuser D1.
  • M5 is a beam turning mirror
  • D1 is a diffuser to reduce speckle.
  • Projection optics 214 project the object formed on D1 by the relay optics 212, and preferably provide a large throw angle, for example >90°, for angled projection down onto a table top (the design is simplified by the relatively low scatter from the diffuser).
  • the different colours are time-multiplexed and the sizes of the replayed images are scaled to match one another, for example by padding a target image for display with zeros (the field size of the displayed image depends upon the pixel size of the SLM, not on the number of pixels in the hologram).
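The per-colour scaling of the replayed images by zero-padding can be sketched as follows. This is an illustrative sketch only: the function name, the nearest-neighbour resampling and the reference wavelength are assumptions rather than details of the arrangement described above, with the replay field extent taken as proportional to wavelength for a fixed SLM pixel pitch.

```python
import numpy as np

def pad_target_for_wavelength(target, wavelength, ref_wavelength=450e-9):
    """Scale a target image within the hologram replay field so displayed
    sizes match across colours, by zero-padding (a sketch; the replay field
    extent is taken as proportional to wavelength for a fixed pixel pitch)."""
    h, w = target.shape
    scale = ref_wavelength / wavelength   # longer wavelength -> larger field,
    nh = max(1, int(round(h * scale)))    # so the image occupies a smaller fraction
    nw = max(1, int(round(w * scale)))
    # crude nearest-neighbour resample of the target to the scaled size
    rows = (np.arange(nh) * h / nh).astype(int)
    cols = (np.arange(nw) * w / nw).astype(int)
    small = target[np.ix_(rows, cols)]
    # centre the scaled image in a zero-padded frame of the original grid size
    out = np.zeros_like(target)
    r0, c0 = (h - nh) // 2, (w - nw) // 2
    out[r0:r0 + nh, c0:c0 + nw] = small
    return out
```

For a red channel (650nm) relative to a 450nm reference, the target is shrunk to roughly 69% of the grid and surrounded by zeros.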
  • a system controller and hologram data processor 202 inputs image data and provides low spatial frequency hologram data 204 to SLM1 and higher spatial frequency intensity modulation data 206 to SLM2.
  • the controller also provides laser light intensity control data 208 to each of the three lasers.
  • for details of a suitable hologram calculation procedure reference may be made to WO2010/007404 (hereby incorporated by reference).
  • a system controller 110 is coupled to a touch sensing module 112 from which it receives data defining one or more touched locations on the display area, either in rectangular or in distorted coordinates (in the latter case the system controller may perform keystone distortion compensation).
  • the touch sensing module 112 in embodiments comprises a CMOS sensor driver and touch-detect processing circuitry.
  • the system controller 110 is also coupled to an input/output module 114 which provides a plurality of external interfaces, in particular for buttons, LEDs, optionally a USB and/or Bluetooth (RTM) interface, and a bi-directional wireless communication interface, for example using WiFi (RTM).
  • the wireless interface may be employed to download data for display either in the form of images or in the form of hologram data.
  • this data may include price data for price updates, and the interface may provide a backhaul link for placing orders, handshaking to enable payment and the like.
  • Non-volatile memory 116, for example Flash RAM, is provided to store data for display, including hologram data, as well as distortion compensation data, and touch sensing control data (identifying regions and associated actions/links).
  • Non-volatile memory 116 is coupled to the system controller and to the I/O module 114, as well as to an optional image-to-hologram engine 118 as previously described (also coupled to system controller 110), and to an optical module controller 120 for controlling the optics shown in figure 2a.
  • the image-to-hologram engine is optional as the device may receive hologram data for display from an external source.
  • the optical module controller 120 receives hologram data for display and drives the hologram display SLM, as well as controlling the laser output powers in order to compensate for brightness variations caused by varying coverage of the display area by the displayed image (for more details see, for example, our WO2008/075096).
  • the laser power(s) is(are) controlled dependent on the "coverage" of the image, with coverage defined as the sum of the image pixel values, each preferably raised to a power gamma (where gamma is typically 2.2).
  • the laser power is inversely dependent on (but not necessarily inversely proportional to) the coverage; in preferred embodiments a lookup table is employed to apply a programmable transfer function between coverage and laser power.
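The coverage calculation and lookup-table transfer function described above may be sketched as follows; the normalisation, table size and example table values are illustrative assumptions, not part of the arrangement described.

```python
import numpy as np

def laser_power(image, lut, gamma=2.2):
    """Coverage-based laser power control (a sketch). Coverage is the sum of
    image pixel values raised to the power gamma, here normalised to [0, 1];
    a lookup table applies a programmable transfer function to give the
    (inverse-like) mapping from coverage to laser power."""
    norm = image.astype(float) / 255.0
    coverage = np.sum(norm ** gamma) / norm.size
    idx = min(int(coverage * (len(lut) - 1)), len(lut) - 1)
    return lut[idx]

# illustrative inverse-like transfer function: low coverage -> high power
lut = [1.0 - 0.9 * i / 15 for i in range(16)]
```

A mostly-black image (low coverage) then drives the laser at full power, while a full-white image is dimmed to a tenth of the power.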
  • Preferred embodiments of the device also include a power management system 122 to control battery charging, monitor power consumption, invoke a sleep mode and the like.
  • the system controller controls loading of the image/hologram data into the non-volatile memory, where necessary conversion of image data to hologram data, and loading of the hologram data into the optical module and control of the laser intensities.
  • the system controller also performs distortion compensation and controls which image to display when and how the device responds to different "key" presses and includes software to keep track of a state of the device.
  • the controller is also configured to transition between states (images) on detection of touch events with coordinates in the correct range, a detected touch triggering an event such as a display of another image and hence a transition to another state.
  • the system controller 110 also, in embodiments, manages price updates of displayed menu items, and optionally payment, and the like.
  • FIG. 3a shows an embodiment of a touch sensitive image display device 300 according to an aspect of the invention.
  • the system comprises an infra red laser and optics 250 to generate a plane of light 256 viewed by a touch sense camera 258, 260 as previously described, the camera capturing the scattered light from one or more fingers 301 or other objects interacting with the plane of light.
  • the system also includes an image projector 118, for example a holographic image projector, also as previously described, to project an image typically generally in front of the device, in embodiments generally downwards at an acute angle to a display surface.
  • a controller 320 controls the IR laser on and off, controls the acquisition of images by camera 260 and controls projector 118.
  • images are captured with the IR laser on and off in alternate frames and touch detection is then performed on the difference of these frames to subtract out any ambient infra red.
  • the image capture optics 258 preferably also include a notch filter at the laser wavelength, which may be around 780-800 nm. Because of laser diode process variations and change of wavelength with temperature this notch may be relatively wide, for example of order 20 nm, and thus it is desirable to suppress ambient IR.
  • subtraction is performed by module 302 which, in embodiments, is implemented in hardware (an FPGA).
  • module 302 also performs binning of the camera pixels, for example down to approximately 80 by 50 pixels. This helps reduce the subsequent processing power/memory requirements and is described in more detail later. However such binning is optional, depending upon the processing power available, and even where processing power/memory is limited there are other options, as described further later. Following the binning and subtraction the captured image data is loaded into a buffer 304 for subsequent processing to identify the position of a finger or, in a multi-touch system, fingers.
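The subtraction and binning steps above may be sketched as follows; in practice these run in hardware such as an FPGA, and the bin size and clipping of negative differences here are illustrative assumptions.

```python
import numpy as np

def ambient_subtract_and_bin(frame_on, frame_off, bin_y=8, bin_x=8):
    """Subtract the laser-off frame from the laser-on frame to remove ambient
    IR, clip negative values, then sum blocks of pixels to reduce the image
    size (e.g. a 640x400 sensor binned 8x8 gives 80x50), as described for
    module 302 in the text."""
    diff = frame_on.astype(int) - frame_off.astype(int)
    diff = np.clip(diff, 0, None)
    h, w = diff.shape
    h, w = h - h % bin_y, w - w % bin_x   # trim to a whole number of bins
    binned = (diff[:h, :w]
              .reshape(h // bin_y, bin_y, w // bin_x, bin_x)
              .sum(axis=(1, 3)))
    return binned
```

Note the subtraction assumes a camera gamma of substantially unity, so that the difference is taken on a linear signal.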
  • because the camera 260 is directed down towards the plane of light at an angle, it can be desirable to provide a greater exposure time for portions of the captured image further from the device than for those nearer the device. This can be achieved, for example, with a rolling shutter device, under control of controller 320 setting appropriate camera registers.
  • differencing alternate frames may not be necessary (for example, where 'finger shape' is detected). However where subtraction takes place the camera should have a gamma of substantially unity so that subtraction is performed with a linear signal.
  • module 306 performs thresholding on a captured image and, in embodiments, this is also employed for image clipping or cropping to define a touch sensitive region. Optionally some image scaling may also be performed in this module. Then a crude peak locator 308 is applied to the thresholded image to identify, approximately, regions in which a finger/object is potentially present.
  • Figure 3b illustrates an example of such a coarse (decimated) grid.
  • the spots indicate the first estimation of the centre-of-mass.
  • a centroid locator 310 (centre of mass algorithm) is applied to the original (unthresholded) image in buffer 304 at each located peak, to determine a respective candidate finger/object location.
  • Figure 3c shows the results of the fine-grid position estimation, the spots indicating the finger locations found.
  • the system then applies distortion correction 312 to compensate for keystone distortion of the captured touch sense image and also, optionally, any distortion such as barrel distortion, from the lens of imaging optics 258.
  • the optical axis of camera 260 is directed downwards at an angle of approximately 70° to the plane of the image and thus the keystone distortion is relatively small, but still significant enough for distortion correction to be desirable.
  • the thresholding may be position-sensitive (at a higher level for nearer image parts); alternatively position-sensitive scaling may be applied to the image in buffer 304 and a substantially uniform threshold may be applied.
  • the procedure finds a connected region of the captured image by identifying the brightest block within a region (or a block with greater than a threshold brightness), and then locates the next brightest block, and so forth, preferably up to a distance limit (to avoid accidentally performing a flood fill). Centroid location is then performed on a connected region.
  • the pixel brightness/intensity values are not squared before the centroid location, to reduce the sensitivity of this technique to noise, interference and the like (which can cause movement of a detected centroid location by more than one pixel).
  • a simple centre-of-mass calculation is sufficient for the purpose of finding a centroid in a given ROI (region of interest) R(x,y), and may be estimated thus: x_c = Σ_x Σ_y x·Rⁿ(x,y) / Σ_x Σ_y Rⁿ(x,y), and correspondingly for y_c, where n is the order of the CoM calculation, and X and Y are the sizes of the ROI over which the sums run.
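A sketch of such an n-th order centre-of-mass calculation over an ROI; the function name and use of NumPy are illustrative, not part of the described implementation.

```python
import numpy as np

def centroid(roi, n=1):
    """n-th order centre-of-mass of a region of interest R(x, y): pixel
    intensities are raised to the power n, then the intensity-weighted
    mean x and y coordinates are returned (None for an empty/dark ROI)."""
    r = roi.astype(float) ** n
    total = r.sum()
    if total == 0:
        return None
    ys, xs = np.indices(roi.shape)
    return (xs * r).sum() / total, (ys * r).sum() / total
```

With n = 1 this is the plain intensity-weighted mean; higher n weights the calculation towards the brightest pixels of the ROI.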
  • the distortion correction module 312 performs a distortion correction using a polynomial to map between the touch sense camera space and the displayed image space:
  • x′ = x Cx yᵀ
  • y′ = x Cy yᵀ
  • where Cx and Cy represent polynomial coefficients in matrix form, and x and y are the vectorised powers of x and y respectively (e.g. x = (1, x, x², …)).
  • Cx and Cy are determined such that we can assign a projected space grid location (i.e. memory location) by evaluation of the polynomial.
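Evaluation of this polynomial mapping may be sketched as follows; the polynomial order and the identity-like coefficient matrices used in the usage example are illustrative assumptions, as real coefficients would come from calibration.

```python
import numpy as np

def distortion_correct(x, y, Cx, Cy, order=3):
    """Map a touch-camera coordinate (x, y) into displayed-image space via
    the bivariate polynomials x' = x·Cx·yT and y' = x·Cy·yT, where the
    vectors are the powers (1, x, x^2, ...) and (1, y, y^2, ...), and Cx,
    Cy are (order+1)x(order+1) calibrated coefficient matrices."""
    xv = np.array([x ** i for i in range(order + 1)])
    yv = np.array([y ** i for i in range(order + 1)])
    return xv @ Cx @ yv, xv @ Cy @ yv
```

For example, setting Cx[1,0] = 1 and Cy[0,1] = 1 (all other coefficients zero) yields the identity mapping x′ = x, y′ = y.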
  • the corrected positions are provided to a module 314 which tracks finger/object positions and decodes actions, in particular to identify finger up/down or present/absent events.
  • this module also provides some position hysteresis, for example implemented using a digital filter, to reduce position jitter.
  • in a single touch system module 314 need only decode a finger up/finger down state, but in a multi-touch system this module also allocates identifiers to the fingers/objects in the captured images and tracks the identified fingers/objects.
  • the field of view of the touch sense camera system is larger than the displayed image. To improve robustness of the touch sensing system touch events outside the displayed image area (which may be determined by calibration) may be rejected (for example, using appropriate entries in a threshold table of threshold module 306 to clip the crude peak locator outside the image area).
  • FIG. 4a shows a plan view of an interactive whiteboard touch sensitive image display device 400.
  • Figure 4b shows a side view of the device.
  • the device comprises IR fan sources 402, 404, 406, each providing a respective light fan 402a, 404a, 406a spanning approximately 120° (for example) and together defining a single, continuous sheet of light just above display area 410.
  • the fans overlap on display area 410, central regions of the display area being covered by three fans and more peripheral regions by two fans or just one fan. This is economical as shadowing is most likely in the central region of the display area.
  • Typical dimensions of the display area 410 may be of order 1 m by 2m.
  • the side view of the system illustrates a combined projector 420 and touch image capture camera 422 either aligned side-by-side or sharing at least an output portion of the projection optics.
  • the optical path between the projector/camera and display area is folded by a mirror 424.
  • the sheet of light generated by fans 402a, 404a, 406a is preferably close to the display area, for example less than 1 cm or 0.5cm above the display area.
  • the camera and projector 422, 420 are supported on a support 450 and may project light from a distance of up to around 0.5m from the display area.
  • Referring to Figure 5a, this shows a system of the type illustrated in Figure 4 including a subsystem to distinguish between at least three different objects: two pens with respective left circular polarising and right circular polarising tips, and an unpolarised object such as a finger.
  • the pens are provided with retroreflective tips as described further below.
  • Like elements to those of Figure 4 are indicated by like reference numerals.
  • in this multi-pen touch sensitive display device 500, associated with the infrared touch sheet generation module 402, 404, 406 (which preferably, but not essentially, provides a touch sheet defined by overlapping fans of light) is a photosensor module 502. As illustrated in Figure 5b this comprises three photodiodes 504, 506, 508, one provided with a left-circular polariser (a quarter wave plate and linear polariser), one having a clear window, and one provided with a right-circular polariser.
  • a pen 510 as illustrated in Figure 5c has a pen nib 510a provided with a retroreflecting stack of the type illustrated in Figure 5d.
  • the retroreflective stack is shown in Figure 5d and comprises (in order going outwards towards the pen surface) a scattering layer 520, for example of white tape, a retroreflector layer 522, for example comprising retroreflective tape, a linear polariser layer 524 (optional, depending on whether the IR sheet is linearly polarised), a quarter wave plate 526, and a diffuser 528.
  • This stack is rolled around the pen nib 510a.
  • light in the IR sheet is either left or right circularly polarised by the pen according to the pen construction (or optionally, elliptically polarised) and this circularly polarised light is retroreflected back towards the photosensor module 502.
  • the diffuser, preferably substantially one-dimensional and aligned along the longitudinal axis of pen 510, helps to ensure that both the touch sensing camera 422 and the photosensor module 502 each receive the retroreflected light.
  • with an unpolarised retroreflection the polarised photodiodes each receive about half the light and the unpolarised photodiode receives the full light intensity, whereas with a polarised retroreflection the unpolarised photodiode and one of the polarised photodiodes receive the full light intensity and the other polarised photodiode receives zero light intensity.
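The resulting classification logic for the three photodiode signals may be sketched as follows; the threshold ratio and the labels returned are illustrative assumptions rather than details of the described embodiment.

```python
def identify_pen(left, clear, right, ratio=0.75):
    """Classify a retroreflecting object from the three photodiode signals
    (left-circular-polarised, clear and right-circular-polarised filters).
    With an unpolarised return both polarised diodes see about half the
    clear-channel signal; a polarised return lights one polarised diode
    fully and the other hardly at all."""
    if clear <= 0:
        return "none"            # no retroreflected light detected
    if left > ratio * clear and right < (1 - ratio) * clear:
        return "left-pen"
    if right > ratio * clear and left < (1 - ratio) * clear:
        return "right-pen"
    return "unpolarised"         # e.g. a finger: both diodes near half intensity
```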
  • time of flight, in combination with pulsed light emission from the IR sheet module, may be employed to match a pen/finger to a detected object location, using relatively approximate time-of-flight position detection.
  • the checkerboard spatial filter 530 illustrated in Figure 5a may be provided for the touch capture camera 422 in an alternative embodiment in which the touch camera captures a visible image of an object in addition to an IR image where the object intersects the touch sheet. This visible image may then be used, for example, to determine the colour of a pen and label the data output from the system accordingly, for example to associate a detected touch position with a (pen) colour.
  • Figure 5e illustrates the signal processing to implement the system of Figures 5a-5d; the photodiodes of module 502 provide an input to a pen identification module 504, optionally incorporating time of flight detection using a timing signal from controller 506. Module 504 in turn provides pen/finger identification data to the touch position output module 314.
  • the plane or fan of light is preferably invisible, for example in the infrared, but this is not essential - ultraviolet or visible light may alternatively be used. Although in general the plane or fan of light will be adjacent to the displayed image, this is also not essential and, in principle, the projected image could be at some distance beyond the touch sensing surface.
  • the light defining the touch sheet need not be light defining a continuous plane - instead structured light such as a comb or fan of individual beams and/or one or more scanned light beams, may be employed to define the touch sheet.
  • the object may be a more sophisticated passive object such as a passive pen incorporating user controls which change the appearance of the object, for example by moving an aperture.
  • This may be employed to hide one or another spot or to change a spot count on the object, or to change a number of lines or line slope or orientation, or to modify the object appearance in some other way.
  • the change is a change in the polarisation response of the object.
  • a user control on the object may comprise one or more buttons mechanically modifying an aspect of the visual appearance of the object or pen to implement one or more user buttons. Operation of these virtual 'user buttons' may be detected by the second sensing system and then provided as an output from the system for use in any desirable manner.
  • a passive pen of this type provides left-click and right-click buttons, so that the pen can send back one of three "signals":
  • pressing a "left” or a “right” button reveals a left-circular or right- circular retroreflecting strip which is detected by the system.
  • a three-photodiode configuration is employed (with clear, left-polarising and right-polarising filters respectively, as described above) and a part of the pen, for example the nib, has three regions: retroreflect, retroreflect+left-polarise (RR+LP) and retroreflect+right-polarise (RR+RP) respectively.
  • the two buttons on the pen, when pressed, move the aperture backwards or forwards to cover the retroreflect region and expose the RR+LP or RR+RP region respectively, to signify left click and right click. Difference signals from the photodiodes then inform the touch subsystem which of the three pen states (touch only, touch+left-click or touch+right-click) is active. This system is also compatible with a finger (which looks the same as the pen with no button pressed).
  • the pen may reveal, for example two different patterned regions which are distinguished.
  • alternatively, different regions on the object (e.g. pen) may have different IR-distinguishable characteristics, for example different brightnesses (light or dark spots) or polarisation characteristics.
  • the user may then cover one or more of these with a finger or change the orientation of the object/pen so that one or other region is visible to the touch sheet camera.
  • different sides of a pen nib or different ends of a pen may have a different IR colour or response: in this case the user rotates or flips the pen to operate the user control.
  • we next describe an IR laser illumination system to produce a plane or sheet of light for use with a device of the type described above.
  • light fan touch is a technique where a sheet of light is generated just above a surface.
  • when an object, for example a finger, touches the surface, light from the light sheet will scatter off the object.
  • a camera is positioned to capture this light with a suitable image processing system to process the captured image and register a touch event.
  • the techniques employed to generate the sheet of light are important for system performance in various ways.
  • FIG. 6a shows an example captured touch sense image illustrating shadowing.
  • Finger image 352 creates a shadow 352a for IR source 250 which partially obscures the image of a second finger 354, and a finger/object in position 356 cannot be detected because this region lies entirely within the shadow.
  • the arrows on the shadow borders indicate the direction of light propagation, diverging in straight lines away from the IR source in a fan shape.
  • Figure 6b illustrates, conceptually, that when multiple IR sources 250a,b,c are employed the risk of shadowing is much reduced.
  • Figure 6c schematically illustrates a pattern of light intensity of the type which can appear at the edge of a fan of light, depending upon how the fan is generated. This can generate artefacts in the captured touch image which may be mistaken for a finger or other object. Furthermore, to cover a large area a significant laser power may be needed, for example of order 100s of milliwatts, and there is a need to ensure that this is eye-safe.
  • FIG 7a shows, schematically, an IR fan generator for the apparatus of Figure 4.
  • a laser diode 600 provides an output into a collimator 602 which provides a partially or substantially completely collimated beam to a first optical spreader 604.
  • Spreader 604 converts the initial, narrow source to an extended source, providing a fan of light output.
  • three such fan generators may be employed to provide a single, combined sheet of light for touch sensing, the edges of the fans overlapping to smooth the intensity profile shown in Figure 6c.
  • the spreader 604 may be modified, for example using a holographic optical element, to provide a gradual intensity fall-off towards the edge of a fan.
  • Figure 7b illustrates a development of the system of Figure 7a in which light from the extended source provided by the fan from spreader 604 is provided to a second optical spreader 606 to provide an extended fan, in effect a set of overlapping fans originating from extended source 604.
  • Such an arrangement may be employed for each of the fan generators 402, 404, 406 of Figure 4 or one or all of these may be replaced by the arrangement of Figure 7b to provide, in effect, multiple overlapping fans.
  • in embodiments optical spreader 604 is configured to define an intensity profile with multiple laterally-displaced peaks mimicking multiple individual sources. The arrangement then becomes more similar to that of Figure 6b.
  • This intensity profile may be generated, for example, by an appropriately shaped optical (lens) surface or by employing a holographic optical element as the spreader 604.
  • although optical spreader 606 is illustrated as being straight, optionally it may be curved to provide fans directed over a range of different directions.
  • Each of the optical spreaders described above may either comprise, for example, a lenticular film or a spreader lens. These are illustrated schematically in Figures 8a and 8b respectively; they have broadly similar shapes although typically the spreader lens has characteristic dimensions (ridge spacings) of order 50× or 100× those of the lenticular film. (The figure illustrates a cross-section; the skilled person will appreciate that these cross-sections are "extruded" in the third dimension to, broadly speaking, approximate a set of cylindrical lenses).
  • the first optical spreader is either a lenticular array or some other 1D optical element such as a cylindrical lens;
  • the second optical spreader is a lenticular array.
  • illumination of the spreader covers more than one peak, as illustrated in Figure 8b, to reduce non-uniformity in the spread light due to non-uniformity of the illumination and/or optical surface.
  • an extended IR fan illumination source helps eye safety because if the light is imaged, it will be imaged over a large area of the retina as compared with a point IR source. Further, as previously described, use of an extended source results in soft rather than hard-edged shadows and can potentially remove the umbra of a shadow completely if the size of the source is sufficiently large.
  • Figure 9 shows a schematic view of a stripe-emitter laser which may be employed as laser diode 600 in the arrangements of Figure 7.
  • in the vertical direction 802 the output beam from the laser has a large divergence (because of the small aperture size) but distance 802 may be small enough for the beam to be substantially single-mode in this direction.
  • in the horizontal direction 804 the beam divergence is less (though still relatively large), but the light output results from the excitation of multiple different spatial modes.
  • the laser powers desirable for effective coverage of a large area such as an interactive whiteboard can most easily be met using a multi-mode laser, but it is not possible to produce a good collimated beam from the multi-mode laser.
  • a multi-mode laser of a suitable power level for example greater than 10 mW, 100 mW or 300 mW, is available in the form of a stripe-emitter laser.
  • a stripe multi-mode laser may be employed provided that the long axis 804 of the stripe output aperture is aligned parallel to a plane of the sheet of light.
  • the beam can then be collimated in the "vertical" axis to define a suitable light sheet, whilst spreading out in the lateral direction (which is generally parallel to the display surface).
  • the laser beam entering the spreading optics should be of a size matched to the spreading optics; this may be achieved with a collimation lens at a suitable distance in front of the laser, as schematically illustrated in Figure 7.
  • with the stripe-emitter multi-mode solid state diode laser previously described, there may be a substantial difference in the divergence in the two axes of polarisation.
  • the direction of minimum divergence of the laser source is also the direction of the extended laser emission source.
  • free space propagation after the collimation lens can bring together the beam sizes in the two axes.
  • a preferred optical configuration for generating an IR fan employs a collimated laser beam illuminating one or more spreading elements (lenticular array(s) or other 1D optical element(s)). If the illuminating beam is not perpendicular to the plane of the spreading element, the resulting line generated is not straight - this effect may be termed 'smile'.
  • because the touch surface is defined by the shape of the IR illumination, it is desirable to ensure the spreading element is perpendicular to the illuminating beam if the touch surface is flat.
  • touch technology employing light sheet touch detection may be configured for operation on a non-planar surface, more particularly by deliberate tilting of the spreading element(s) with respect to the illuminating beam.
  • the location of a touch event may be detected using a spatially and/or temporally structured touch light sheet.
  • Such approaches can help improve touch-location accuracy as distance from the sensor (camera) increases and/or reduce the accuracy constraints on wide angle input optics for the camera.
  • Use of structured light can also help to improve signal-to-noise ratio vis-a-vis ambient light, which again otherwise tends to reduce with increasing distance from the touch sheet light source.
  • These approaches may be employed either for user interface systems aligned to a surface or to detect touch events in mid-air (such as gestures), in each case detecting objects that intersect with light in the touch sheet.
  • a comb-of-light approach can be preferable to a scanning system because it may be smaller and employ fewer components, and in particular no moving parts.
  • a comb generator module 1010 comprises a light source 1012 with an optional collimation lens 1014, followed by a comb generator optic 1016.
  • the image sensor 258, 260 detects light 1020 scattered from a touch object 1022 such as a finger back into the sensor, with a field of view indicated by cone 1024.
  • DOE: diffractive optical element
  • the DOEs may (but need not) take the form of a substantially one-dimensional grating, with a pitch calculated to produce a divergence angle for the comb as appropriate to illuminate the chosen touch detection area.
  • Data for a computer-generated DOE can be calculated using conventional hologram generation approaches, for example direct binary search or Gerchberg-Saxton.
  • a grating feature size of around 0.5um may be used with a light source wavelength of 905nm.
  • the comb is depicted as having substantially collimated prongs.
  • a focussed comb can also permit a greater density of prongs per unit area (to increase detection accuracy) while retaining a low mark to space ratio (to increase comb power density and hence ambient light resilience).
  • An example of a focussed comb is shown in Figure 10c.
  • Such a comb configuration may be achieved by adjusting the focus of the collimation lens, or by encoding focussing power into the DOE design.
  • the number of prongs generated by the comb is preferably chosen so that at least one prong, and preferably more for accuracy, always intersects with a touch object (e.g. a finger) of a specified minimum diameter (e.g. 4mm), over the entire defined touch area.
  • This provides two mechanisms for separating light from the comb scattered by a touch event from ambient light: i) The specific spatial frequency component corresponding to scatter from the comb can be selectively extracted from the image, providing resilience to other ambient light events, ii) Intensity or other information between the "prongs" of the comb can be used to infer the ambient light level in the vicinity of the touch object, and hence compensate for it by, for example, adjusting detection thresholds appropriately.
  • Figure 11 shows example representations of image sensor data captured for a plane-of-light touch sheet (Figure 11a), compared with a comb-of-light touch sheet (Figure 11b). These illustrate a finger 1100, patches of ambient (IR) light 1110 in/from the environment, and light 1120 scattered from the plane/comb (touch sheet). Figure 11b additionally illustrates that intensity data from regions 1130 between the prongs can be used to extract the ambient illumination level. Further, the spatial frequency 1140 of the prongs can be used to filter touch events from ambient background.
  • Figure 12a shows a touch event detected close to the comb light source and Figure 12b a touch event detected further from the comb light source.
  • in Figure 12a the high spatial frequency of scatter from the comb indicates that the touch event is close to the comb light source; in Figure 12b the lower spatial frequency of scatter indicates that the touch event is further from the comb light source.
  • the plane or fan of light is preferably invisible, for example in the infrared, but this is not essential - ultraviolet or visible light may alternatively be used. Although in general the plane or fan of light will be adjacent to the displayed image, this is also not essential and, in principle, the projected image could be at some distance beyond the touch sensing surface. The skilled person will appreciate that whilst a relatively thin, flat sheet of light is desirable this is not essential and some tilting and/or divergence or spreading of the beam may be acceptable with some loss of precision. Alternatively some convergence of the beam towards the far edge of the display area may be helpful in at least partially compensating for the reduction in brightness of the touch sensor illumination as the light fans out. Further, in embodiments the light defining the touch sheet need not be light defining a continuous plane - instead structured light such as a comb or fan of individual beams, and/or one or more scanned light beams, may be employed to define the touch sheet.
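The bullet points above state that the DOE pitch is calculated to produce the desired comb divergence, with a grating feature size of around 0.5um at a source wavelength of 905nm. As an illustrative sketch only (the pitch of a binary grating is here assumed to be twice the feature size, which the description does not state), the standard grating equation gives the first-order diffraction angle:

```python
import math

def diffraction_angle_deg(wavelength_m, pitch_m, order=1):
    """First-order diffraction angle from the grating equation
    sin(theta) = m * lambda / d, assuming normal incidence."""
    s = order * wavelength_m / pitch_m
    if abs(s) > 1:
        raise ValueError("this diffraction order is evanescent for the given pitch")
    return math.degrees(math.asin(s))

# 905 nm source; binary grating with 0.5 um features, i.e. an assumed ~1.0 um pitch
theta = diffraction_angle_deg(905e-9, 1.0e-6)
print(f"first-order half-angle: {theta:.1f} degrees")
```

With these assumed numbers the first order is steered to roughly 65 degrees, consistent with a single DOE producing a wide comb of prongs.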
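The description above also states that the number of comb prongs is preferably chosen so that at least one prong always intersects a touch object of a specified minimum diameter (e.g. 4mm) over the entire touch area. A minimal geometric sizing sketch follows; the fan angle and maximum range used are hypothetical example values, not taken from the description:

```python
import math

def min_prong_count(fan_angle_deg, max_range_m, object_diameter_m,
                    prongs_per_object=1):
    """Minimum number of comb prongs so that at least `prongs_per_object`
    prongs intersect an object of the given diameter anywhere in the fan.
    The worst case is at maximum range, where the lateral gap between
    adjacent prongs is largest."""
    # Lateral prong spacing at max range must not exceed
    # object_diameter / prongs_per_object.
    max_spacing = object_diameter_m / prongs_per_object
    # Lateral extent of the fan at maximum range (flat-screen approximation).
    arc = 2 * max_range_m * math.tan(math.radians(fan_angle_deg / 2))
    return math.ceil(arc / max_spacing) + 1

# Hypothetical example: 90 degree fan, 1 m reach, 4 mm minimum object diameter
n = min_prong_count(90, 1.0, 4e-3)
print(n)
```

For greater location accuracy, `prongs_per_object` can be raised so that several prongs intersect the object, at the cost of a proportionally denser comb.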
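The two ambient-rejection mechanisms described in the bullets above - extracting the comb's spatial frequency component and inferring the ambient level from intensity between the prongs - can be sketched on a single synthetic image row as follows. This is an illustrative simplification: the function name, the FFT-bin selection and the percentile-based ambient estimate are assumptions, not the patented method.

```python
import numpy as np

def analyse_scanline(row, prong_period_px):
    """Separate comb scatter from ambient light along one image row.
    The comb component is the spectral amplitude at the comb's expected
    spatial frequency; the ambient level is inferred from samples that
    lie between the prongs (here, a low percentile of the row)."""
    n = len(row)
    spectrum = np.abs(np.fft.rfft(row - row.mean()))
    comb_bin = round(n / prong_period_px)   # expected comb frequency bin
    comb_amplitude = spectrum[comb_bin]
    ambient = np.percentile(row, 10)        # between-prong intensity level
    return comb_amplitude, ambient

# Synthetic 256-pixel row: narrow comb prongs of period 8 px on a uniform
# ambient offset of 20 counts
x = np.arange(256)
row = 20.0 + 100.0 * (np.sin(2 * np.pi * x / 8) > 0.9)
amp, amb = analyse_scanline(row, prong_period_px=8)
```

The recovered ambient level can then be used to adjust detection thresholds, while a strong amplitude at the comb bin flags genuine scatter from the touch sheet rather than an ambient light patch.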

Abstract

We describe a touch sensitive image display device, the device comprising: an image projector to project a displayed image onto a surface; a touch sensor optical system to project light defining a touch sheet above said displayed image; a camera directed to capture a touch sense image from a region including at least a portion of said touch sheet, said touch sense image comprising light scattered from said touch sheet by an object approaching said displayed image; and a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a location of said object relative to said displayed image; wherein said object is provided with a retroreflective element to reflect light from said touch sheet.

Description

TOUCH SENSITIVE IMAGE DISPLAY DEVICES
FIELD OF THE INVENTION This invention relates to touch sensitive image display devices of the type which project a sheet of light adjacent the displayed image, and to touch sensing systems. Some embodiments of the invention relate to techniques for improved identification of touch objects and/or pens, and in embodiments to techniques for distinguishing between different touch objects and/or pens. Other embodiments of the invention relate particularly to improved techniques for generating the sheet of light.
BACKGROUND TO THE INVENTION
Background prior art relating to touch sensing systems employing a plane or sheet of light can be found in US6,281,878 (Montellese), and in various later patents of Lumio/VKB Inc, such as US7,305,368, as well as in similar patents held by Canesta Inc, for example US6,710,770. Broadly speaking these systems project a fan-shaped plane of infrared (IR) light just above a displayed image and use a camera to detect the light scattered from this plane by a finger or other object reaching through to approach or touch the displayed image.
Further background prior art can be found in: WO01/93006; US6650318; US7305368; US7084857; US7268692; US7417681; US7242388 (US2007/222760); US2007/019103; WO01/93006; WO01/93182; WO2008/038275; US2006/187199; US6,614,422; US6,710,770 (US2002021287); US7,593,593; US7599561; US7519223; US7394459; US6611921; USD595785; US6,690,357; US6,377,238; US5767842; WO2006/108443; WO2008/146098; US6,367,933 (WO00/21282); WO02/101443; US6,491,400; US7,379,619; US2004/0095315; US6281878; US6031519; GB2,343,023A; US4384201; DE 41 21 180A; and US2006/244720.
We have previously described techniques for improved touch sensitive holographic displays, for example in our earlier patent applications: WO2010/073024; WO2010/073045; and WO2010/073047. The inventors have continued to develop and advance touch sensing techniques suitable for use with these and other image display systems. In particular we will describe techniques which facilitate object detection and in embodiments enable, for example, differently labelled ("coloured") pens to be identified and distinguished. We will also describe improved techniques for generating the plane or sheet of light in such systems to address problems which can arise, including the shadowing of one finger or object by another in the direction of light propagation.
SUMMARY OF THE INVENTION
According to the invention there is therefore provided a touch sensitive image display device, the device comprising: an image projector to project a displayed image onto a surface; a touch sensor optical system to project light defining a touch sheet above said displayed image; a camera directed to capture a touch sense image from a region including at least a portion of said touch sheet, said touch sense image comprising light scattered from said touch sheet by an object approaching said displayed image; and a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a location of said object relative to said displayed image; wherein said object is provided with a reflective element, in embodiments a substantially retroreflective element, to reflect light from said touch sheet.
Employing a (retro)reflector, for example a corner cube reflector, facilitates operation of the device by providing a bright, easily detected object in the captured touch sense image. Furthermore an object of this type may be distinguished from another object, such as a finger, on the basis of the differential brightness of the response.
In embodiments using a retroreflective element the retroreflective element may comprise retroreflective tape (for example, available from 3M Inc). The retroreflective element need not be precisely retroreflective - provided that the light is reflected back along approximately the same path the arrangement will function effectively. Thus, for example, up to 10 degrees (or more) deviation from exact retroreflection may be tolerable. Preferably a scattering surface, for example comprising white paint or tape, is provided behind the retroreflective element to scatter the small amounts of light which pass through the retroreflective element back towards the camera. It has been found important in practice to provide a diffuser over the retroreflective element, in particular where the camera is looking down on the touch sheet at an acute angle. This is because in general the touch sheet projection system is arranged to project a touch sheet so that this extends in a substantially planar fashion (whether or not comprising a continuous sheet of light) just above the display surface. Thus providing a diffuser over the retroreflective element helps to scatter light back towards the camera. A suitable spread of diffused light is greater than +/-10°, 15° or 20°.
In some preferred implementations the object detected by the touch sensing system comprises a pen. In preferred implementations the system is configured to detect both one or more pens and one or more user fingers. Where the object comprises a pen the retroreflective element may be mounted on the tip of the pen. The diffuser is preferably anisotropic, that is, diffusing light more in a vertical direction perpendicular to the plane of the display surface than in a horizontal direction within the lateral thickness of the touch sheet. Where the retroreflective element and diffuser are mounted on a pen the diffuser may be substantially one-dimensional, preferentially spreading the reflected light in a direction aligned with a longitudinal axis of the pen. Nonetheless it is desirable for the diffuser to provide a small spread perpendicular to the main spreading direction to provide some tolerance for the user holding the pen at a slight angle rather than perpendicular to the display surface. Thus along this perpendicular axis the diffuser may provide a spread of less than 5°, for example 2-3°.
In embodiments a plurality of objects is provided, each having an optically distinguishable response to light from the touch sheet, and the device may then further comprise a system to distinguish between these responses to distinguish between the objects. In principle the touch sense camera itself may distinguish between different objects, for example by distinguishing between different "IR colours", that is different absorption/reflection responses over a portion of the IR spectrum. Alternatively an IR barcode or some other distinguishing feature may be provided on a pen or other object, arranging the distinguishing marks so that they are seen when illuminated by a relatively thin sheet of IR illumination. In a still further approach a pattern, for example along the length of a pen, may produce different distinguishable responses from the pattern on another object or objects as the pen is inserted into and through the touch sheet. In a still further approach the pen or other object may be provided with an IR-responsive phosphor, for example of the type which is pumped by ambient illumination and stimulated by IR. The skilled person will recognise that, in principle, a combination of these and the other approaches described below may be employed.
In a preferred implementation, however, a pen/object is provided with a polariser, preferably (but not essentially) in combination with the aforementioned retroreflector to polarise the reflected light. Preferably the polariser is a circular polariser so that the response is substantially insensitive to orientation, but in principle a linear or elliptical polariser may alternatively be employed. A circular polariser may be implemented, for example, by a quarter wave plate at the wavelength of the light defining the touch sheet. In general a circular polariser comprises a quarter wave plate and a linear polariser but where, say, the touch sheet is already polarised for example because it is generated by a linearly polarised laser, only a quarter wave plate need be employed.
The use of a circular polariser enables the labelled object to be distinguished from another, unlabelled object such as a finger. In preferred embodiments, however, at least two objects are provided, one with a left circular polariser, the other with a right circular polariser. The distinguishing system may then comprise left and right polarisation sensitive sensors. In a preferred embodiment three sensors, for example photodiodes, are provided, one with a left polariser, one with a right polariser and one unpolarised, facilitating distinguishing between a left polarised object, a right polarised object and an unpolarised object such as a finger. (The relative responses of Left, Unpolarised and Right polarised sensors are then ½, 1, ½ for an unpolarised finger and 1, 1, 0 (or vice versa) for a L/R polarised object). Optionally this approach may be extended to distinguish between multiple different objects by labelling the objects with different elliptical polarisations and discriminating between these. The skilled person will appreciate that although embodiments may employ three separate sensors each with a different polarisation sensitivity, optionally a single sensor, for example an imaging sensor, may have different spatial regions with different polarisation sensitivities and operate along similar lines. Where multiple objects may be present simultaneously the system may include a time-of-flight detection module for distinguishing between multiple simultaneous touches of the objects. With such an arrangement the touch sheet may be arranged to provide pulses or a pulse train of light, or may be modulated at a high frequency (in combination, for example, with a phase-locked loop detection system).
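The relative sensor responses quoted above (½, 1, ½ for an unpolarised finger; 1, 1, 0 or 0, 1, 1 for a left/right polarised pen) suggest a simple discrimination rule. The following is an illustrative sketch only; the function name and tolerance value are assumptions, not part of the description:

```python
def classify_touch(left, unpol, right, tol=0.2):
    """Classify a touch from three photodiode readings taken behind
    left-circular, unpolarised and right-circular analysers, normalised
    to the unpolarised channel. Expected patterns (from the description):
      finger (unpolarised scatter): L ~ 0.5, R ~ 0.5
      left-polarised pen:           L ~ 1.0, R ~ 0.0
      right-polarised pen:          L ~ 0.0, R ~ 1.0
    """
    if unpol <= 0:
        return "none"
    l, r = left / unpol, right / unpol
    if abs(l - 0.5) < tol and abs(r - 0.5) < tol:
        return "finger"
    if l > 1 - tol and r < tol:
        return "left_pen"
    if r > 1 - tol and l < tol:
        return "right_pen"
    return "unknown"

print(classify_touch(0.48, 1.0, 0.52))
```

In practice the thresholds would be set from measured sensor responses, since real analysers and retroreflectors only approximate the ideal ½/1/0 ratios.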
The time-of-flight detection system need not be used to locate the precise location of a labelled object since this is performed using the existing touch detection system; instead the time-of-flight may be used to link a polarisation-labelled pen with a known (detected) touch location. Since preferred embodiments of the system include touch position tracking, and since simultaneous initiation of touch events is relatively rare, it is not essential to employ time-of-flight or other techniques to distinguish between differently labelled pens, but it may be advantageous in some circumstances.
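Linking a polarisation-labelled pen to a touch location already found by the camera-based detector might then reduce to matching ranges, along the following illustrative lines (the function interface and tolerance are assumptions for the sketch, not taken from the description):

```python
def link_pen_to_touch(tof_range_m, touch_ranges_m, tolerance_m=0.05):
    """Associate a polarisation-labelled pen, whose range from the light
    source is known from a time-of-flight measurement, with one of the
    touch locations already found by the camera-based touch detector.
    Returns the index of the matching touch, or None if nothing is
    within tolerance."""
    best, best_err = None, tolerance_m
    for idx, r in enumerate(touch_ranges_m):
        err = abs(r - tof_range_m)
        if err < best_err:
            best, best_err = idx, err
    return best

# Two simultaneous touches at ranges 0.30 m and 0.62 m; the pen's
# time-of-flight return indicates a range of 0.61 m
match = link_pen_to_touch(0.61, [0.30, 0.62])
```

This reflects the point made above: the time-of-flight channel only needs enough range resolution to disambiguate simultaneous touches, not to localise the pen precisely.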
Embodiments of the above described techniques enable pens or other objects to be labelled with notional colours and the system may then be configured to output a pen label identification (colour) signal in association with a position for the object (pen). In this way embodiments of the system may be used, for example, to provide a multicolour drawing or writing facility for an interactive whiteboard. Embodiments of the system may be employed to provide a "passive" pen or other object with one or more user controls such as left-click and right-click buttons. (Here a "passive" object is one which lacks an electrical power source).
Thus in embodiments the object has a user-controllable reflective element such that, under user control of the object, a property of light reflected from the touch sheet is controllable. The signal processor may then be configured to detect (a change in) this property to identify operation of the user control and to output corresponding user control data in response. In principle the user-controllable reflective element may simply be a region on the pen or other object whose response to the illuminating touch sheet the user is able to selectively alter. Thus, for example, regions may have different brightnesses (light or dark spots) or polarisation characteristics, and the user may simply cover one or more of these with a finger or change the orientation of the object/pen so that one or other is visible to the touch camera. For example, different sides of a pen nib or different ends of a pen may have a different IR colour or response: in this case the user rotates or flips the pen to operate the user control.
More preferably however the pen or other object comprises a user control, in particular a mechanical control, for the user-controllable reflective element operable to selectively alter a reflected light response of the object to light from the touch sheet. In more sophisticated embodiments the user control may comprise two controls or a three-way control to selectively display (expose or cover) left circular polarised and right circular polarised light-polarising regions of the object.
In an alternative approach which also enables multicolour writing on an interactive whiteboard, the touch sensing system may be configured to "see" in the visible during normal use of the touch sensitive display device. More generally, as previously mentioned, pens may be labelled with IR-distinguishable labels.
Thus in another aspect the invention provides a touch sensitive image display device, the device comprising: an image projector to project a displayed image onto a surface; a touch sensor optical system to project non-visible light defining a touch sheet above said displayed image; a touch camera directed to capture a touch sense image from a region including at least a portion of said touch sheet, said touch sense image comprising light scattered from said touch sheet by an object approaching said displayed image; and a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a location of said object relative to said displayed image; wherein said touch camera, or another said camera of said display aligned with said touch camera, is configured to capture an object image of one or more said objects; and wherein said signal processor is configured to use said captured object image to distinguish between multiple said objects. In one implementation, where the touch sensing system is provided with a camera which sees in the visible, a visible (400nm-700nm) light sensitive camera is provided alongside (or mechanically connected to) the IR-sensitive camera, with a separate lens, but aligned to the IR camera such that the images from the two cameras can be aligned with one another. This then enables the detected touch position in one image to be matched with a captured visible image of the touch object, and a visually distinguishable feature of this touch object may then be used to label the object. In this way, for example, differently coloured objects or pens may be differently labelled or labelled with their respective "visual" colours.
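Aligning the images from the two cameras so that a detected touch position can be matched to the visible image is commonly done with a planar homography, which is adequate here because the touch sheet is approximately planar. The sketch below assumes a pre-calibrated 3x3 matrix H; this is one plausible implementation, not necessarily that used in the described device:

```python
import numpy as np

def map_touch_to_visible(touch_xy, H):
    """Map a touch location found in the IR camera image into the
    visible camera image using a pre-calibrated 3x3 homography H.
    Valid when both cameras view the (approximately planar) touch
    sheet, e.g. with H estimated from corresponding calibration points."""
    x, y = touch_xy
    p = H @ np.array([x, y, 1.0])       # homogeneous transform
    return p[0] / p[2], p[1] / p[2]     # perspective divide

# With an identity homography the coordinates pass through unchanged;
# a real H would encode the offset and perspective between the cameras.
u, v = map_touch_to_visible((120.0, 45.0), np.eye(3))
```

The visible-image pixel at (u, v) can then be sampled to read off the pen's colour and attach it as a label to the tracked touch.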
In a variant of this approach the IR sensitive and visible light sensitive cameras share a portion of their optical path, typically a front end lens and optionally other portions of the optics, and a proportion of the light for the IR sensor is tapped off and provided to the visible light sensor. In a still further approach the touch camera may be provided with a spatially patterned wavelength-selective filter so that some portions of the image sensor see visible light and other portions see the non-visible light, typically IR light, scattered from the touch sheet. One example of such a filter is a chequerboard pattern type filter similar to a Bayer filter. This approach is less preferable because there is potentially a loss of both sensitivity and resolution, but when such a spatially patterned wavelength selective filter is employed it can be preferable to also include an anti-aliasing filter in combination with the spatially patterned filter to mitigate the effects of loss of resolution, broadly speaking by blurring small features. (Such an anti-aliasing filter may be implemented using two layers of birefringent material). In a still further approach, blanking intervals between the display of different colour planes in a multicolour image projector, for example of the digital micromirror type, may be used to capture light scattered from the touch sheet, and the illumination of the projector itself may be employed to capture an image of an object in one or more visible light colours. The skilled person will appreciate that there are other variants along these lines, for example the blanking period may alternatively be used for separate red, green and/or blue illumination of an object, particularly if this is brief, and so forth.
As the skilled person will appreciate, each of these approaches provides a mechanism whereby, say, a red pen may be employed, in effect, to write in red on the display and so forth. Further, since an additional image of the object or objects is available, not restricted to the intersection of the object with the touch sheet, this information may be employed to track one or more of the objects in three dimensions, for example to provide a gesture interpretation or other facility.
The invention also provides methods for distinguishing between touch objects along similar lines to those described above. The invention still further provides a non-transitory data carrier carrying processor control code and/or data to implement such methods in either software, or software-defined hardware, or a combination of the two.
IR laser illumination system/touch sheet
According to a further aspect of the invention there is therefore provided a touch sensor optical system to generate light defining a touch sheet for a touch sensitive image display device, the optical system comprising: at least one light source; a first 1D optical spreading device illuminated by said at least one light source to spread light from said light source in one dimension to generate a first fan of light; and a second 1D optical spreading device illuminated by said first fan of light to spread light from said first fan of light to generate a second fan of light; wherein said first fan of light provides an extended light source, extended in said one dimension, for said second fan of light; and wherein at least some locations within said second fan of light receive illumination from a plurality of different directions.
In embodiments, employing two concatenated stages of optical spreading provides a number of advantages: because there is a broad, extended light source for the second spreading device, in effect multiple fans are overlapped within the sheet of light, thus providing illumination from more than one direction and reducing the risk of one object/finger shadowing another within the touch plane. Furthermore embodiments of this approach facilitate achieving improved coverage over a rectangular surface because, in effect, different parts of the surface are illuminated by fans originating from different parts of the second optical spreading device, thus facilitating coverage across the "near" edge of the display surface. Furthermore the increased extent of the light source, typically a laser light source, helps in achieving eye safety and, more particularly, enables the laser power to be increased whilst remaining eye safe, thus improving the overall signal-to-noise ratio of the touch sensing system. This in turn facilitates coverage over a large display area.
The skilled person will appreciate that although we refer to a light defining a touch sheet, this does not necessarily define a plane of light - instead the light constituting the sheet may, for example, diverge away from the light source or converge away from the light source (for example if some focussing power is added). A converging configuration can be helpful in increasing the power density of the sheet of laser light within the sheet with increasing distance from the emitter, for example to partially or wholly compensate for a reducing power density as the light fans out. The skilled person will appreciate that such compensation may be applied independently of whether or not multiple fans of light are used to generate the sheet.
In embodiments of the system one or more light sources and first spreading devices illuminate a plurality of the second stage spreading devices to generate a plurality of overlapping second fans of light. In preferred implementations these fans overlap at least along the majority of their edges within the display area because the intensity profile of the edge of a fan can exhibit artefacts which may otherwise appear as spurious object-detection events. In embodiments the second stage spreading devices may be located at intervals along one edge of the display area/sheet of light, optionally pointing in different directions (where "pointing" here refers to the direction of an optical axis, which is generally perpendicular to a line or plane in which the spreading device extends).
In some arrangements, described later, multiple separate light sources, each with a first and second stage spreading device, are used to generate the plurality of overlapping fans. Alternatively a single light source illuminating multiple first stage optical spreading devices, each with one or more second stage spreading devices, may be employed. However in another arrangement a single light source and first stage spreading device is employed, and the first stage spreading device may then be configured to provide a multi-peaked intensity distribution to approximate to or mimic the use of multiple separate sources. In this way a single laser may be employed to effectively provide three light sources, each generating a respective overlapping fan. This can be useful in achieving eye safety. The multi-peaked intensity distribution may be achieved, for example, by a suitably shaped lens (surface) profile and/or by employing a holographic optical element as a spreading device.
In some preferred implementations the second spreading device comprises a lenticular lens array, for example in the form of a film. A typical lenticular array has (one-dimensional) lenslets with a width of less than 1mm or 0.5mm; the focal length may be less than ten times the width, for example between two and six times the width. Each lenslet may take the form of an approximation to a cylindrical lens (although one surface of the array is typically flat). In embodiments the touch sensor optical system employs a laser light source followed by a collimation system to illuminate the first spreading device with a spatial extent of at least 1cm.
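For a lenslet of width w and focal length f under collimated illumination, the rays filling the lenslet aperture converge through the focus and then diverge with half-angle approximately atan(w/2f), so the figures quoted above (width below 0.5mm, focal length two to six times the width) imply per-lenslet fan angles of very roughly 10-28 degrees. A minimal sketch of this relationship:

```python
import math

def lenticular_fan_angle_deg(lenslet_width_m, focal_length_m):
    """Full fan angle produced by one lenslet of a lenticular array
    under collimated illumination: half-angle is atan(w / 2f)."""
    return 2 * math.degrees(math.atan(lenslet_width_m / (2 * focal_length_m)))

# 0.5 mm lenslets with focal length 2x the width (within the quoted 2x-6x range)
angle = lenticular_fan_angle_deg(0.5e-3, 1.0e-3)
print(f"{angle:.1f} degrees per lenslet")
```

Because every illuminated lenslet emits such a fan, illuminating the array over an extended width (at least 1cm, as above) superposes many overlapping fans, which is what yields the multi-directional illumination and shadowing resistance described earlier.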
A lenticular array may also be employed for the first spreading device, but alternatively a cylindrical lens or some other 1D spreader lens profile may be employed, for example a profile with one flat surface and a second sinusoidally ridged surface. Where the surface has multiple ridges or peaks preferably the laser illumination covers more than one of these ridges or peaks so that there is some averaging for non-uniformities in the surface profile. Such a spreader lens element may have dimensions, for example a width between adjacent ridges, of greater than 10x, 50x or 100x corresponding dimensions of the lenticular array. A "macroscopic" lens or surface of this type may, optionally, also be employed for the first optical spreading device. Additionally or alternatively one or both of the first and second optical spreading devices may incorporate or consist of a holographic optical element.
In some embodiments the light source comprises a stripe-emitter laser diode (sometimes referred to as a broad area laser diode). Such a laser has an output beam which has a high beam divergence in the short-direction of the (rectangular) output face and a lower beam divergence in the long-direction of the output face. Counter-intuitively, where such a laser is employed it is preferably aligned so that the long-direction of the output face of the laser is parallel to the sheet of light, because the beam is more easily collimated in the vertical or short-direction to provide a substantially flat (or slightly diverging or converging) sheet of light.
Thus the invention further provides a touch sensitive image display device, the device comprising: an image projector to project a displayed image onto a surface; a touch sensor optical system to project light defining a touch sheet above said displayed image; a camera directed to capture a touch sense image from a region including at least a portion of said touch sheet, said touch sense image comprising light scattered from said touch sheet by an object approaching said displayed image; and a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a location of said object relative to said displayed image; wherein said touch sensor optical system comprises a stripe-emitter laser aligned such that a long direction of a stripe-output face of the laser is parallel to said touch sheet.
Preferably the above described touch sensor optical system is incorporated into a touch sensitive image display device comprising a projector to project a displayed image at an acute angle onto a surface, typically generally in front of the device, using the above described optical system to project the light defining the touch sheet just above the displayed image. Preferably this sheet of light is non-visible, for example in the infrared. A camera is directed, also at an acute angle, to capture light scattered from the sheet by an object/finger interacting with the display. In embodiments the camera is co-located with the image projector and may share some or the majority of the projection optics. A signal processor is employed to process the image from the camera to identify the locations of one or more fingers/objects touching the image, for use in interacting with the displayed image.
In a related aspect the invention provides a method of touch sensing in a touch sensitive image display device, the method comprising: projecting a displayed image onto a surface; projecting a light defining a touch sheet above said displayed image; capturing a touch sense image from a region including at least a portion of said touch sheet, said touch sense image comprising light scattered from said touch sheet by an object approaching said displayed image; and processing said touch sense image to identify a location of said object relative to said displayed image; wherein said projecting of said light defining said touch sheet comprises projecting a plurality of overlapping fans of light above said displayed image.
Preferably, as previously described, the overlapping fans of light comprise fans projecting in at least two different directions, and overlapping at least along an edge of a fan. In embodiments these fans overlap within the thickness of the sheet of light to define a single continuous light sheet. The fans of light may be projected from different locations along the edge of the display area/light sheet and/or may point in different directions, with the aim of achieving optimal coverage of the display area.
In preferred embodiments first and second stage optical spreading devices are employed, though this is not essential. For example the overlapping fans of light may be provided using a common light source for each of a plurality of optical spreading devices, or employing a separate light source or laser for each spreading device.
In a further related aspect the invention provides a touch sensitive image display device, the device comprising: an image projector to project a displayed image onto a surface; a touch sensor optical system to project light defining a touch sheet above said displayed image; a camera directed to capture a touch sense image from a region including at least a portion of said touch sheet, said touch sense image comprising light scattered from said touch sheet by an object approaching said displayed image; and a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a location of said object relative to said displayed image; wherein said light defining said touch sheet comprises a plurality of overlapping fans of light. Preferably the touch sensor optical system comprises at least one light source and a plurality of optical spreading devices illuminated by the light source, each projecting a respective fan of light, the fans of light overlapping within the light sheet. Again the optical spreading devices may be at different locations and/or oriented to direct the optical axes of the fans in different directions. Again in embodiments the touch sensor optical system may comprise a stripe-emitter laser aligned with a long direction of a stripe-output face of the laser parallel to the touch sheet. As previously described, embodiments of the system comprise first and second stage optical spreading devices for reduced shadowing, and improved coverage and eye safety. In some embodiments the light defining the touch sheet may be a plane or fan of light formed by a beam spreader. In other embodiments however, the touch sheet may comprise beams defining a set of stripes or a comb. In still other embodiments one or more scanned and/or interlaced light beams may be employed to define the touch sheet.
Thus the invention also provides a touch sensitive image display device, the device comprising: an image projector to project a displayed image onto a surface; a touch sensor optical system to project light defining a touch sheet above said displayed image; a camera directed to capture a touch sense image from a region including at least a portion of said touch sheet, said touch sense image comprising light scattered from said touch sheet by an object approaching said displayed image; and a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a location of said object relative to said displayed image; wherein said touch sensor optical system is configured to project a spatially and/or temporally discontinuous light structure to define said touch sheet.
The touch sensor optical system may include a mechanical scanner, such as a spinning polygonal mirror, to sweep one or more collimated beams over the touch area. Additionally or alternatively an optical element, for example a diffractive optical element, may be employed to fan the beam out into a sequence of stripes or a comb to define a touch sensing sheet. Use of such a comb can assist in detecting the touch object in high ambient light conditions, and the spatial frequency of these beams, as perceived by the camera, can provide additional information on the distance of the touch object, to improve position detection accuracy. In such embodiments it is preferable to employ a substantially single-mode laser in the touch sheet projector.
Embodiments of each of the above described aspects of the invention may be used in a range of touch-sensing display applications. For example, some of the techniques we describe may be used to detect the position(s) of one or more objects in mid-air. However embodiments of the invention are particularly useful for large area touch coverage, for example in interactive whiteboard or similar applications.
Embodiments of each of the above described aspects of the invention are not limited to use with any particular type of projection technology. Thus although we will describe later an example of a holographic image projector, the techniques of the invention may also be applied to other forms of projection technology including, but not limited to, digital micromirror-based projectors such as projectors based on DLP™ (Digital Light Processing) technology from Texas Instruments, Inc.

BRIEF DESCRIPTION OF THE DRAWINGS
These and other aspects of the invention will now be further described, by way of example only, with reference to the accompanying figures in which:
Figures 1a and 1b show, respectively, a vertical cross section view through an example touch sensitive image display device suitable for implementing embodiments of the invention, and details of a sheet of light-based touch sensing system for the device; Figures 2a and 2b show, respectively, a holographic image projection system for use with the device of Figure 1, and a functional block diagram of the device of Figure 1;
Figures 3a to 3e show, respectively, an embodiment of a touch sensitive image display device according to an aspect of the invention, use of a crude peak locator to find finger centroids, and the resulting finger locations;
Figures 4a and 4b show, respectively, a plan view and a side view of an interactive whiteboard incorporating a touch sensitive image display suitable for implementing an embodiment of the invention;
Figures 5a to 5e show, respectively, a touch sensitive image display system incorporating polarizing pen/object identification according to an embodiment of the invention, a photosensor module for the system of Figure 5a, a polarizing pen for use in the system, details of a retroreflecting stack for the polarizing pen, and details of processing for the system;
Figures 6a to 6c show, respectively, example captured touch sense images illustrating shadowing, the use of multiple IR sources to reduce the likelihood of shadowed events, and a schematic illustration of artefacts at the edge of a fan of light;
Figures 7a and 7b show, respectively, an example IR fan generator for the apparatus of Figure 4, and a development of the IR fan generator incorporating a second optical spreader; Figures 8a and 8b show cross-sections through, respectively, a lenticular form and a spreader lens for use as optical spreaders in the touch sensor optical system of Figure 7; Figure 9 shows, schematically, a stripe-emitter laser which may be employed in embodiments of the invention;
Figures 10a to 10c show an embodiment of a touch event capture system using a comb generator in, respectively, plan and side view; and an example of a comb-of-light touch sheet configured to focus towards the rear of the image area;
Figures 11a and 11b show representations of a touch-camera view for, respectively, a plane-of-light touch sheet and a comb-of-light touch sheet; and Figures 12a and 12b show representations of a spatial frequency structure of light scattered from a comb-of-light touch sheet by, respectively, a touch object close to the comb source and a touch object further from the comb source.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
Figures 1a and 1b show an example touch sensitive holographic image projection device 100 comprising a holographic image projection module 200 and a touch sensing system 250, 258, 260 in a housing 102. A proximity sensor 104 may be employed to selectively power-up the device on detection of proximity of a user to the device.
A holographic image projector is merely described by way of example; the techniques we describe herein may be employed with any type of image projection system.
The holographic image projection module 200 is configured to project downwards and outwards onto a flat surface such as a tabletop. This entails projecting at an acute angle onto the display surface (the angle between a line joining the centre of the output of the projection optics and the middle of the displayed image and a line in a plane of the displayed image is less than 90°). We sometimes refer to projection onto a horizontal surface, conveniently but not essentially non-orthogonally, as "table down projection". A holographic image projector is particularly suited to this application because it can provide a wide throw angle, long depth of field, and substantial distortion correction without significant loss of brightness/efficiency. Boundaries of the light forming the displayed image 150 are indicated by lines 150a, b.
The touch sensing system 250, 258, 260 comprises an infrared laser illumination system (IR line generator) 250 configured to project a sheet of infrared light 256 just above, for example ~1 mm above, the surface of the displayed image 150 (although in principle the displayed image could be distant from the touch sensing surface). The laser illumination system 250 may comprise an IR LED or laser 252, preferably collimated, then expanded in one direction by light sheet optics 254, which may comprise a negative or cylindrical lens. Optionally light sheet optics 254 may include a 45 degree mirror adjacent the base of the housing 102 to fold the optical path to facilitate locating the plane of light just above the displayed image.
A CMOS imaging sensor (touch camera) 260, provided with an IR-pass lens 258, captures light scattered by touching the displayed image 150 with an object such as a finger, through the sheet of infrared light 256. The boundaries of the CMOS imaging sensor field of view are indicated by lines 257, 257a,b. The touch camera 260 provides an output to touch detect signal processing circuitry as described further later.
Example holographic image projection system

Figure 2a shows an example holographic image projection system architecture 200 in which the SLM may advantageously be employed. The architecture of Figure 2 uses dual SLM modulation - low resolution phase modulation and higher resolution amplitude (intensity) modulation. This can provide substantial improvements in image quality, power consumption and physical size. The primary gain of holographic projection over imaging is one of energy efficiency. Thus the low spatial frequencies of an image can be rendered holographically to maintain efficiency and the high-frequency components can be rendered with an intensity-modulating imaging panel, placed in a plane conjugate to the hologram SLM. Effectively, diffracted light from the hologram SLM device (SLM1) is used to illuminate the imaging SLM device (SLM2). Because the high-frequency components contain relatively little energy, the light blocked by the imaging SLM does not significantly decrease the efficiency of the system, unlike in a conventional imaging system. The hologram SLM is preferably a fast multi-phase device, for example a pixellated MEMS-based piston actuator device.
In Figure 2a:
• SLM1 is a pixellated MEMS-based piston actuator SLM as described above, to display a hologram - for example a 160 × 160 pixel device with physically small lateral dimensions, e.g. <5 mm or <1 mm.
• L1 , L2 and L3 are collimation lenses (optional, depending upon the laser output) for respective Red, Green and Blue lasers.
• M1, M2 and M3 are dichroic mirrors implemented as a prism assembly.
• M4 is a turning beam mirror.
• SLM2 is an imaging SLM and has a resolution at least equal to the target image resolution (e.g. 854 × 480); it may comprise an LCOS (liquid crystal on silicon) or DMD (Digital Micromirror Device) panel.
• Diffraction optics 210 comprises lenses LD1 and LD2, forms an intermediate image plane on the surface of SLM2, and has effective focal length $f$ such that $f\lambda/\Delta$ (where $\lambda$ is the laser wavelength and $\Delta$ is the pixel pitch of SLM1) covers the active area of imaging SLM2. Thus optics 210 perform a spatial Fourier transform to form a far field illumination pattern in the Fourier plane, which illuminates SLM2.
• PBS2 (Polarising Beam Splitter 2) transmits incident light to SLM2, and reflects emergent light into the relay optics 212 (liquid crystal SLM2 rotates the polarisation by 90 degrees). PBS2 preferably has a clear aperture at least as large as the active area of SLM2.
• Relay optics 212 relay light to the diffuser D1 .
• M5 is a beam turning mirror.
• D1 is a diffuser to reduce speckle.
• Projection optics 214 project the object formed on D1 by the relay optics 212, and preferably provide a large throw angle, for example >90°, for angled projection down onto a table top (the design is simplified by the relatively low étendue from the diffuser).
The different colours are time-multiplexed and the sizes of the replayed images are scaled to match one another, for example by padding a target image for display with zeros (the field size of the displayed image depends upon the pixel size of the SLM not on the number of pixels in the hologram).
A system controller and hologram data processor 202, implemented in software and/or dedicated hardware, inputs image data and provides low spatial frequency hologram data 204 to SLM1 and higher spatial frequency intensity modulation data 206 to SLM2. The controller also provides laser light intensity control data 208 to each of the three lasers. For details of an example hologram calculation procedure reference may be made to WO2010/007404 (hereby incorporated by reference).
Control system
Referring now to Figure 2b, this shows a block diagram of the device 100 of Figure 1. A system controller 110 is coupled to a touch sensing module 112 from which it receives data defining one or more touched locations on the display area, either in rectangular or in distorted coordinates (in the latter case the system controller may perform keystone distortion compensation). The touch sensing module 112 in embodiments comprises a CMOS sensor driver and touch-detect processing circuitry. The system controller 110 is also coupled to an input/output module 114 which provides a plurality of external interfaces, in particular for buttons, LEDs, optionally a USB and/or Bluetooth (RTM) interface, and a bi-directional wireless communication interface, for example using WiFi (RTM). In embodiments the wireless interface may be employed to download data for display either in the form of images or in the form of hologram data. In an ordering/payment system this data may include price data for price updates, and the interface may provide a backhaul link for placing orders, handshaking to enable payment and the like. Non-volatile memory 116, for example Flash RAM, is provided to store data for display, including hologram data, as well as distortion compensation data, and touch sensing control data (identifying regions and associated actions/links). Non-volatile memory 116 is coupled to the system controller and to the I/O module 114, as well as to an optional image-to-hologram engine 118 as previously described (also coupled to system controller 110), and to an optical module controller 120 for controlling the optics shown in Figure 2a. (The image-to-hologram engine is optional as the device may receive hologram data for display from an external source.)
In embodiments the optical module controller 120 receives hologram data for display and drives the hologram display SLM, as well as controlling the laser output powers in order to compensate for brightness variations caused by varying coverage of the display area by the displayed image (for more details see, for example, our WO2008/075096). In embodiments the laser power(s) is(are) controlled dependent on the "coverage" of the image, with coverage defined as the sum of the image pixel values, preferably raised to a power of gamma (where gamma is typically 2.2). The laser power is inversely dependent on (but not necessarily inversely proportional to) the coverage; in preferred embodiments a lookup table is employed to apply a programmable transfer function between coverage and laser power. The hologram data stored in the non-volatile memory, optionally received by interface 114, therefore in embodiments comprises data defining a power level for one or each of the lasers together with each hologram to be displayed; the hologram data may define a plurality of temporal holographic subframes for a displayed image. Preferred embodiments of the device also include a power management system 122 to control battery charging, monitor power consumption, invoke a sleep mode and the like.
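The coverage-dependent laser control described above can be sketched as follows. This is an illustrative reconstruction, not the actual optical module firmware: the function names, the normalisation, and the lookup-table values are all assumptions; only the shape of the scheme (coverage = sum of pixel values raised to gamma, power looked up from a decreasing programmable table) follows the text.

```python
# Hypothetical sketch of coverage-based laser power control.
# Pixel values are assumed normalised to 0..1; GAMMA follows the text.

GAMMA = 2.2

def coverage(pixels):
    """Coverage: sum of pixel values raised to the power gamma."""
    return sum(p ** GAMMA for p in pixels)

def laser_power(pixels, lut, max_coverage):
    """Look up a drive level from a programmable table.

    lut is monotonically decreasing, so power falls as coverage rises
    (inversely dependent, but not necessarily inversely proportional).
    """
    c = min(coverage(pixels) / max_coverage, 1.0)
    index = min(int(c * (len(lut) - 1) + 0.5), len(lut) - 1)
    return lut[index]

# A dim image (low coverage) drives the laser harder than a bright one:
lut = [1.0, 0.8, 0.65, 0.5, 0.4, 0.3, 0.25, 0.2]
assert laser_power([0.1] * 100, lut, max_coverage=100.0) > \
       laser_power([0.9] * 100, lut, max_coverage=100.0)
```

The lookup table plays the role of the programmable transfer function; at calibration time its entries could be chosen to hold perceived brightness approximately constant.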
In operation the system controller controls loading of the image/hologram data into the non-volatile memory, where necessary conversion of image data to hologram data, and loading of the hologram data into the optical module and control of the laser intensities. The system controller also performs distortion compensation and controls which image to display when and how the device responds to different "key" presses and includes software to keep track of a state of the device. The controller is also configured to transition between states (images) on detection of touch events with coordinates in the correct range, a detected touch triggering an event such as a display of another image and hence a transition to another state. The system controller 1 10 also, in embodiments, manages price updates of displayed menu items, and optionally payment, and the like.
Touch Sensing Systems
Referring now to Figure 3a, this shows an embodiment of a touch sensitive image display device 300 according to an aspect of the invention. The system comprises an infra red laser and optics 250 to generate a plane of light 256 viewed by a touch sense camera 258, 260 as previously described, the camera capturing the scattered light from one or more fingers 301 or other objects interacting with the plane of light. The system also includes an image projector 118, for example a holographic image projector, also as previously described, to project an image typically generally in front of the device, in embodiments generally downwards at an acute angle to a display surface.
In the arrangement of Figure 3a a controller 320 controls the IR laser on and off, controls the acquisition of images by camera 260 and controls projector 118. In the illustrated example images are captured with the IR laser on and off in alternate frames and touch detection is then performed on the difference of these frames to subtract out any ambient infra red. The image capture optics 258 preferably also include a notch filter at the laser wavelength, which may be around 780-800 nm. Because of laser diode process variations and change of wavelength with temperature this notch may be relatively wide, for example of order 20 nm, and thus it is desirable to suppress ambient IR. In the embodiment of Figure 3a subtraction is performed by module 302 which, in embodiments, is implemented in hardware (an FPGA).
In embodiments module 302 also performs binning of the camera pixels, for example down to approximately 80 by 50 pixels. This helps reduce the subsequent processing power/memory requirements and is described in more detail later. However such binning is optional, depending upon the processing power available, and even where processing power/memory is limited there are other options, as described further later. Following the binning and subtraction the captured image data is loaded into a buffer 304 for subsequent processing to identify the position of a finger or, in a multi-touch system, fingers.
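The alternate-frame subtraction and pixel binning performed by module 302 might look like the following sketch, written in pure Python for clarity (the text says a real implementation lives in an FPGA); the frame contents and bin size here are illustrative assumptions:

```python
# Hypothetical sketch of ambient-IR subtraction and pixel binning.

def subtract_frames(laser_on, laser_off):
    """Difference of alternate frames removes static ambient IR."""
    return [[max(on - off, 0) for on, off in zip(r_on, r_off)]
            for r_on, r_off in zip(laser_on, laser_off)]

def bin_pixels(frame, bx, by):
    """Sum bx-by blocks of pixels to cut resolution (e.g. down to ~80 x 50)."""
    h, w = len(frame), len(frame[0])
    return [[sum(frame[y + dy][x + dx]
                 for dy in range(by) for dx in range(bx))
             for x in range(0, w - bx + 1, bx)]
            for y in range(0, h - by + 1, by)]

# A 4x4 frame with a bright scatter spot plus uniform ambient light;
# differencing removes the ambient level, binning halves the resolution:
on_frame = [[10, 10, 10, 10],
            [10, 90, 90, 10],
            [10, 90, 90, 10],
            [10, 10, 10, 10]]
off_frame = [[10] * 4 for _ in range(4)]
binned = bin_pixels(subtract_frames(on_frame, off_frame), 2, 2)
```

Binning by summation (rather than averaging) also preserves signal level ahead of thresholding, at the cost of spatial resolution.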
Because the camera 260 is directed down towards the plane of light at an angle it can be desirable to provide a greater exposure time for portions of the captured image further from the device than for those nearer the device. This can be achieved, for example, with a rolling shutter device, under control of controller 320 setting appropriate camera registers.
Depending upon the processing of the captured touch sense images and/or the brightness of the laser illumination system, differencing alternate frames may not be necessary (for example, where 'finger shape' is detected). However where subtraction takes place the camera should have a gamma of substantially unity so that subtraction is performed with a linear signal.
Various different techniques for locating candidate finger/object touch positions will be described. In the illustrated example, however, an approach is employed which detects intensity peaks in the image and then employs a centroid finder to locate candidate finger positions. In embodiments this is performed in software. Processor control code and/or data to implement the aforementioned FPGA and/or software modules shown in Figure 3 (and also to implement the modules described later with reference to Figure 5) may be provided on a disk 318 or another physical storage medium.
Thus in embodiments module 306 performs thresholding on a captured image and, in embodiments, this is also employed for image clipping or cropping to define a touch sensitive region. Optionally some image scaling may also be performed in this module. Then a crude peak locator 308 is applied to the thresholded image to identify, approximately, regions in which a finger/object is potentially present.
Figure 3b illustrates an example of such a coarse (decimated) grid. In the Figure the spots indicate the first estimation of the centre-of-mass. We then take a 32x20 (say) grid around each of these. This is preferably used in conjunction with a differential approach to minimize noise, i.e. one frame laser on, next laser off.
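The coarse first pass can be sketched as below. This is a hedged illustration of the idea, not the device's actual algorithm: scan the decimated grid, keep cells above a threshold that are also local maxima among their neighbours, and treat each as a candidate region for the later fine centroid pass. The threshold value and neighbourhood test are assumptions.

```python
# Hypothetical sketch of a crude peak locator on a coarse (decimated) grid.

def crude_peaks(binned, threshold):
    """Return (row, col) of coarse-grid cells brighter than threshold,
    keeping only local maxima among their 8 neighbours."""
    peaks = []
    rows, cols = len(binned), len(binned[0])
    for r in range(rows):
        for c in range(cols):
            v = binned[r][c]
            if v <= threshold:
                continue
            neighbours = [binned[rr][cc]
                          for rr in range(max(r - 1, 0), min(r + 2, rows))
                          for cc in range(max(c - 1, 0), min(c + 2, cols))
                          if (rr, cc) != (r, c)]
            if all(v >= nv for nv in neighbours):
                peaks.append((r, c))
    return peaks

# Two bright cells on a coarse grid give two candidate finger regions:
grid = [[0, 0, 0, 0],
        [0, 9, 0, 0],
        [0, 0, 0, 7],
        [0, 0, 0, 0]]
assert crude_peaks(grid, threshold=5) == [(1, 1), (2, 3)]
```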
A centroid locator 310 (centre of mass algorithm) is applied to the original (unthresholded) image in buffer 304 at each located peak, to determine a respective candidate finger/object location. Figure 3c shows the results of the fine-grid position estimation, the spots indicating the finger locations found.
The system then applies distortion correction 312 to compensate for keystone distortion of the captured touch sense image and also, optionally, any distortion, such as barrel distortion, from the lens of imaging optics 258. In one embodiment the optical axis of camera 260 is directed downwards at an angle of approximately 70° to the plane of the image and thus the keystone distortion is relatively small, but still significant enough for distortion correction to be desirable. Because nearer parts of a captured touch sense image may be brighter than further parts, the thresholding may be position sensitive (set at a higher level for nearer image parts); alternatively position-sensitive scaling may be applied to the image in buffer 304 and a substantially uniform threshold may be applied.
In one embodiment of the crude peak locator 308 the procedure finds a connected region of the captured image by identifying the brightest block within a region (or a block with greater than a threshold brightness), and then locates the next brightest block, and so forth, preferably up to a distance limit (to avoid accidentally performing a flood fill). Centroid location is then performed on a connected region. In embodiments the pixel brightness/intensity values are not squared before the centroid location, to reduce the sensitivity of this technique to noise, interference and the like (which can cause movement of a detected centroid location by more than one pixel). A simple centre-of-mass calculation is sufficient for the purpose of finding a centroid in a given ROI (region of interest), and the centroid $(\bar{x}, \bar{y})$ may be estimated thus:

$$\bar{x} = \frac{\displaystyle\sum_{y_s=0}^{Y-1}\sum_{x_s=0}^{X-1} x_s\, I^n(x_s, y_s)}{\displaystyle\sum_{y_s=0}^{Y-1}\sum_{x_s=0}^{X-1} I^n(x_s, y_s)} \qquad \bar{y} = \frac{\displaystyle\sum_{y_s=0}^{Y-1}\sum_{x_s=0}^{X-1} y_s\, I^n(x_s, y_s)}{\displaystyle\sum_{y_s=0}^{Y-1}\sum_{x_s=0}^{X-1} I^n(x_s, y_s)}$$

where $n$ is the order of the CoM calculation, $I$ is the pixel intensity, and $X$ and $Y$ are the sizes of the ROI.
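The centre-of-mass estimate above translates directly into code; a minimal pure-Python sketch (assuming the ROI is given as a list of rows of intensity values, with n = 1 leaving the intensities unsquared as the text prefers):

```python
# Sketch of an order-n centre-of-mass centroid over an ROI.

def centroid(roi, n=1):
    """Return (x_bar, y_bar): each coordinate weighted by intensity**n."""
    num_x = num_y = denom = 0.0
    for ys, row in enumerate(roi):
        for xs, intensity in enumerate(row):
            w = intensity ** n
            num_x += xs * w
            num_y += ys * w
            denom += w
    return num_x / denom, num_y / denom

# A single bright pixel at (x=2, y=1) dominates the estimate:
roi = [[0, 0, 0, 0],
       [0, 0, 8, 0],
       [0, 0, 0, 0]]
assert centroid(roi) == (2.0, 1.0)
```

Raising n above 1 weights the estimate towards the brightest pixels, which is exactly the sensitivity to noise the text chooses to avoid by not squaring.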
In embodiments the distortion correction module 312 performs a distortion correction using a polynomial to map between the touch sense camera space and the displayed image space. Say the transformed coordinates from camera space $(x, y)$ into projected space $(x', y')$ are related by the bivariate polynomials $x' = \mathbf{x} C_x \mathbf{y}^T$ and $y' = \mathbf{x} C_y \mathbf{y}^T$, where $C_x$ and $C_y$ represent polynomial coefficients in matrix form, and $\mathbf{x}$ and $\mathbf{y}$ are the vectorised powers of $x$ and $y$ respectively. Then we may design $C_x$ and $C_y$ such that we can assign a projected space grid location (i.e. memory location) by evaluation of the polynomial:

[equation image not reproduced in this extraction]

where $X$ is the number of grid locations in the x-direction in projector space, and $\lfloor\cdot\rfloor$ is the floor operator. The polynomial evaluation may be implemented, say, in Chebyshev form for better precision performance; the coefficients may be assigned at calibration. Further background can be found in our published PCT application WO2010/073024.
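The bivariate polynomial mapping can be sketched as follows. This is an illustrative reconstruction under stated assumptions: the coefficient matrices here are made up (real ones come from calibration), the polynomials are evaluated in plain power form rather than Chebyshev form, and the grid location is assumed to be a row-major index from the floored projected coordinates.

```python
# Hypothetical sketch of camera-space to projector-space mapping
# via bivariate polynomials x' = x C_x y^T and y' = x C_y y^T.
import math

def powers(v, order):
    """Row vector of powers [1, v, v**2, ...]."""
    return [v ** k for k in range(order + 1)]

def bivariate(x, y, coeffs):
    """Evaluate sum over i,j of coeffs[i][j] * x**i * y**j."""
    xs = powers(x, len(coeffs) - 1)
    ys = powers(y, len(coeffs[0]) - 1)
    return sum(coeffs[i][j] * xs[i] * ys[j]
               for i in range(len(coeffs)) for j in range(len(coeffs[0])))

def grid_location(x, y, cx, cy, grid_width):
    """Map a camera-space point to a projector-space memory location
    (assumed row-major: floor(y') * grid_width + floor(x'))."""
    xp = bivariate(x, y, cx)
    yp = bivariate(x, y, cy)
    return math.floor(yp) * grid_width + math.floor(xp)

# Identity-like mapping with a small keystone-style cross term:
cx = [[0.0, 0.0], [1.0, 0.1]]   # x' = x + 0.1*x*y
cy = [[0.0, 1.0], [0.0, 0.0]]   # y' = y
assert grid_location(3.0, 2.0, cx, cy, grid_width=80) == 163
```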
Once a set of candidate finger positions has been identified, these are passed to a module 314 which tracks finger/object positions and decodes actions, in particular to identify finger up/down or present/absent events. In embodiments this module also provides some position hysteresis, for example implemented using a digital filter, to reduce position jitter. In a single touch system module 314 need only decode a finger up/finger down state, but in a multi-touch system this module also allocates identifiers to the fingers/objects in the captured images and tracks the identified fingers/objects. In general the field of view of the touch sense camera system is larger than the displayed image. To improve robustness of the touch sensing system touch events outside the displayed image area (which may be determined by calibration) may be rejected (for example, using appropriate entries in a threshold table of threshold module 306 to clip the crude peak locator outside the image area).
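A digital filter providing the position hysteresis mentioned above might be as simple as a first-order IIR (exponential) smoother per tracked finger; the sketch below is one plausible form, with the smoothing constant an assumption rather than a value from the text:

```python
# Hypothetical position-jitter filter: a first-order IIR smoother that
# moves the reported position only a fraction alpha towards each raw
# measurement.

def make_smoother(alpha=0.3):
    """Return a stateful filter function for one tracked finger/object."""
    state = {}
    def smooth(raw):
        if not state:
            state['pos'] = raw          # first measurement passes through
        else:
            px, py = state['pos']
            rx, ry = raw
            state['pos'] = (px + alpha * (rx - px), py + alpha * (ry - py))
        return state['pos']
    return smooth

smooth = make_smoother(alpha=0.5)
assert smooth((10.0, 10.0)) == (10.0, 10.0)
assert smooth((12.0, 10.0)) == (11.0, 10.0)   # moves half-way to the jump
```

A smaller alpha damps jitter more strongly but makes the reported position lag a fast-moving finger; a tracker would hold one such filter per allocated object identifier.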
Object/pen identification techniques
We will now describe embodiments of various techniques for use with a touch sensitive display device, for example of the general type described above. The skilled person will appreciate that the techniques we will describe may be employed with any type of image projection system, not just the example holographic image projection system of Figure 2. Thus referring to first Figure 4a, this shows a plan view of an interactive whiteboard touch sensitive image display device 400. Figure 4b shows a side view of the device.
As illustrated there are three IR fan sources 402, 404, 406, each providing a respective light fan 402a, 404a, 406a spanning approximately 120° (for example) and together defining a single, continuous sheet of light just above display area 410. The fans overlap on display area 410, central regions of the display area being covered by three fans and more peripheral regions by two fans or just one fan. This is economical as shadowing is most likely in the central region of the display area. Typical dimensions of the display area 410 may be of order 1 m by 2 m. The side view of the system illustrates a combined projector 420 and touch image capture camera 422 either aligned side-by-side or sharing at least an output portion of the projection optics. As illustrated, in embodiments the optical path between the projector/camera and display area is folded by a mirror 424. The sheet of light generated by fans 402a, 404a, 406a is preferably close to the display area, for example less than 1 cm or 0.5 cm above the display area. However the camera and projector 422, 420 are supported on a support 450 and may project light from a distance of up to around 0.5 m from the display area.
Referring now to Figure 5a, this shows a system of the type illustrated in Figure 4 including a system to distinguish between at least three different objects: two pens with respective left circular polarising and right circular polarising tips, and an unpolarised object such as a finger. The pens are provided with retroreflective tips as described further below. Like elements to those of Figure 4 are indicated by like reference numerals.
Thus in this multi-pen touch sensitive display device 500, associated with the infrared touch sheet generation module 402, 404, 406 (which preferably, but not essentially, provides a touch sheet defined by overlapping fans of light) is a photosensor module 502. As illustrated in Figure 5b this comprises three photodiodes 504, 506, 508: one provided with a left-circular polariser (a quarter wave plate and a linear polariser), one having a clear window, and one provided with a right-circular polariser.
A pen 510 as illustrated in Figure 5c has a pen nib 510a provided with a retroreflecting stack of the type illustrated in Figure 5d. The retroreflective stack comprises (in order, going outwards towards the pen surface) a scattering layer 520, for example of white tape, a retroreflector layer 522, for example comprising retroreflective tape, a linear polariser layer 524 (optional, depending on whether the IR sheet is linearly polarised), a quarter wave plate 526, and a diffuser 528. This stack is rolled around the pen nib 510a.
In operation light in the IR sheet is either left or right circularly polarised by the pen according to the pen construction (or optionally, elliptically polarised) and this circularly polarised light is retroreflected back towards the photosensor module 502. The diffuser, preferably substantially one dimensional and aligned along the longitudinal axis of pen 510, helps to ensure that both the touch sensing camera 422 and the photosensor module 502 each receive the retroreflected light. For an unpolarised object the polarised photodiodes each receive about half the light and the unpolarised photodiode receives the full light intensity, whereas with a polarised retroreflection the unpolarised photodiode and one of the polarised photodiodes receive the full light intensity and the other polarised photodiode zero light intensity. This enables left and right circular polarised pens and fingers to be distinguished. Optionally, as previously mentioned, in a multi-touch device time of flight in combination with pulsed light emission from the IR sheet module may be employed to match a pen/finger to a detected object location, using relatively approximate time of flight position detection.
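The three-channel discrimination just described can be sketched as a simple ratio test. This is a hedged illustration, not the device's actual decision logic: the channel normalisation, the contrast threshold, and all names are assumptions; only the signatures (polarised pen lights one polarised channel and starves the other, a finger splits light roughly equally) come from the text.

```python
# Hypothetical classifier for the three photodiode channels
# (left-circular polarised, clear window, right-circular polarised).

def classify(left, clear, right, contrast=0.5):
    """Classify a touch from photodiode intensities normalised to the
    clear channel."""
    if clear <= 0:
        return 'none'            # no retroreflected/scattered light at all
    l, r = left / clear, right / clear
    if l - r > contrast:
        return 'left-pen'        # left channel lit, right channel starved
    if r - l > contrast:
        return 'right-pen'       # right channel lit, left channel starved
    return 'finger'              # roughly equal split: unpolarised object

assert classify(1.0, 1.0, 0.0) == 'left-pen'
assert classify(0.0, 1.0, 1.0) == 'right-pen'
assert classify(0.5, 1.0, 0.5) == 'finger'
```

In practice the contrast threshold would be set from the measured extinction ratio of the polarisers and the ambient light level.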
The checkerboard spatial filter 530 illustrated in Figure 5a may be provided for the touch capture camera 422 in an alternative embodiment in which the touch camera captures a visible image of an object in addition to an IR image where the object intersects the touch sheet. This visible image may then be used, for example, to determine the colour of a pen and label the data output from the system accordingly, for example to associate a detected touch position with a (pen) colour.
Figure 5e illustrates the signal processing to implement the system of Figures 5a-5d: the photodiodes of module 502 provide an input to a pen identification module 504, optionally incorporating time of flight detection using a timing signal from controller 506. Module 504 in turn provides pen/finger identification data to the touch position output module 314.

It will be appreciated that for the touch sensing system to work a user need not actually touch the displayed image. The plane or fan of light is preferably invisible, for example in the infrared, but this is not essential - ultraviolet or visible light may alternatively be used. Although in general the plane or fan of light will be adjacent to the displayed image, this is also not essential and, in principle, the projected image could be at some distance beyond the touch sensing surface. The skilled person will appreciate that whilst a relatively thin, flat sheet of light is desirable this is not essential and some tilting and/or divergence or spreading of the beam may be acceptable with some loss of precision. Alternatively some convergence of the beam towards the far edge of the display area may be helpful in at least partially compensating for the reduction in brightness of the touch sensor illumination as the light fans out. Further, in embodiments the light defining the touch sheet need not be light defining a continuous plane - instead structured light such as a comb or fan of individual beams and/or one or more scanned light beams may be employed to define the touch sheet.
Passive pen
Additionally or alternatively, as previously mentioned, the object may be a more sophisticated passive object such as a passive pen incorporating user controls which change the appearance of the object by, for example, moving an aperture. This may be employed to hide one or another spot, to change a spot count on the object, to change a number of lines or a line slope or orientation, or to modify the object's appearance in some other way. In some preferred implementations the change is a change in the polarisation response of the object.
Thus a user control on the object may comprise one or more buttons mechanically modifying an aspect of the visual appearance of the object or pen to implement one or more user buttons. Operation of these virtual 'user buttons' may be detected by the second sensing system and then provided as an output from the system for use in any desirable manner.
For example, in one embodiment, a passive pen of this type provides left-click and right-click buttons, so that the pen can send back one of three "signals":
• Touching the board
• Touching the board and left-button pressed
• Touching the board and right-button pressed
In this embodiment, pressing a "left" or a "right" button reveals a left-circular or right-circular retroreflecting strip which is detected by the system. To implement this a three-photodiode configuration is employed (with clear, left-polarising and right-polarising filters respectively, as described above) and a part of the pen, for example the nib, has three regions: retroreflect, retroreflect+left-polarise (RR+LP) and retroreflect+right-polarise (RR+RP) respectively. There is a mechanical aperture over the nib which covers two of those three regions (by default leaving only the retroreflect region exposed). The two buttons on the pen, when pressed, move the aperture backwards or forwards to cover the retroreflect region and expose the RR+LP or RR+RP region respectively, to signify left click and right click. Difference signals from the photodiodes then inform the touch subsystem which of the three pen states (touch only, touch+left-click or touch+right-click) is active. This system is also compatible with a finger (which looks the same as the pen with no button pressed).
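The difference-signal logic for the three pen states can be sketched as follows. This is a hypothetical illustration only, assuming photodiode readings normalised to the clear channel; the function name, threshold value and normalisation are assumptions, not the implementation described here:

```python
def classify_pen_state(clear, left, right, threshold=0.5):
    """Classify a touch from three photodiode readings (clear,
    left-circular-polarising and right-circular-polarising filters).

    Hypothetical sketch: readings are assumed proportional to the
    retroreflected light passing each filter."""
    if clear <= 0:
        return "no touch"
    # Fraction of the retroreflected light passing each polarising filter.
    l_ratio = left / clear
    r_ratio = right / clear
    if l_ratio - r_ratio > threshold:
        return "touch + left-click"   # RR+LP region exposed
    if r_ratio - l_ratio > threshold:
        return "touch + right-click"  # RR+RP region exposed
    return "touch only"               # plain retroreflect region, or a finger
```

A finger, which scatters light without a polarisation preference, gives roughly equal left and right ratios and so classifies as "touch only", consistent with the compatibility noted above.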
In an alternative embodiment, instead of two different retroreflecting regions being revealed depending on the button press, the pen may reveal, for example, two different patterned regions which are distinguished. In a still further approach different regions on the object (e.g. pen) may have different IR-distinguishable regions, for example different brightnesses (light or dark spots) or polarisation characteristics. The user may then cover one or more of these with a finger or change the orientation of the object/pen so that one or other region is visible to the touch sheet camera. For example, different sides of a pen nib or different ends of a pen may have a different IR colour or response: in this case the user rotates or flips the pen to operate the user control.
IR laser illumination system
We will now describe some preferred embodiments of an IR laser illumination system to produce a plane or sheet of light for use with a device of the type described above. As previously described, light fan touch is a technique where a sheet of light is generated just above a surface. When an object, for example a finger, touches the surface, light from the light sheet will scatter off the object. A camera is positioned to capture this light, with a suitable image processing system to process the captured image and register a touch event. The techniques employed to generate the sheet of light are important for system performance in various ways.
Thus referring to Figure 6a, this shows an example captured touch sense image illustrating shadowing. Finger image 352 creates a shadow 352a for IR source 250 which partially obscures the image of a second finger 354, and a finger/object in position 356 cannot be detected because this region lies entirely within the shadow. (The arrows on the shadow borders indicate the direction of light propagation, diverging in straight lines away from the IR source in a fan shape). Figure 6b illustrates, conceptually, that when multiple IR sources 250a,b,c are employed the risk of shadowing is much reduced.
Figure 6c schematically illustrates a pattern of light intensity of the type which can appear at the edge of a fan of light, depending upon how the fan is generated. This can generate artefacts in the captured touch image which may be mistaken for a finger or other object. Furthermore, to cover a large area a significant laser power may be needed, for example of order 100s of milliwatts, and there is a need to ensure that this is eye-safe.
Figure 7a shows, schematically, an IR fan generator for the apparatus of Figure 4. In this example a laser diode 600 provides an output into a collimator 602 which provides a partially or substantially completely collimated beam to a first optical spreader 604. Spreader 604 converts the initial, narrow source to an extended source, providing a fan of light output. As illustrated in Figure 4, three such fan generators may be employed to provide a single, combined sheet of light for touch sensing, the edges of the fans overlapping to smooth the intensity profile shown in Figure 6c. Optionally the spreader 604 may be modified, for example using a holographic optical element, to provide a gradual intensity fall-off towards the edge of a fan.
Figure 7b illustrates a development of the system of Figure 7a in which light from the extended source provided by the fan from spreader 604 is provided to a second optical spreader 606 to provide an extended fan, in effect a set of overlapping fans originating from extended source 604. Such an arrangement may be employed for each of the fan generators 402, 404, 406 of Figure 4 or one or all of these may be replaced by the arrangement of Figure 7b to provide, in effect, multiple overlapping fans.
In a modification to the arrangement of Figure 7b optical spreader 604 is configured to define an intensity profile, with multiple laterally-displaced peaks mimicking multiple individual sources. The arrangement then becomes more similar to that of Figure 5. This intensity profile may be generated, for example, by an appropriately shaped optical (lens) surface or by employing a holographic optical element as the spreader 604.
The skilled person will also appreciate that although in Figure 7b optical spreader 606 is illustrated as being straight, optionally this may be curved to provide fans directed over a range of different directions. Each of the optical spreaders described above may either comprise, for example, a lenticular film or a spreader lens. These are illustrated schematically in Figures 8a and 8b respectively; they have broadly similar shapes although typically the spreader lens has characteristic dimensions (ridge spacings) of order 50x or 100x those of the lenticular film. (The figure illustrates a cross-section; the skilled person will appreciate that these cross-sections are "extruded" in the third dimension to, broadly speaking, approximate a set of cylindrical lenses).
In some preferred embodiments the first optical spreader is either a lenticular array or some other 1D optical element such as a cylindrical lens, and the second optical spreader is a lenticular array. However where a spreading lens is employed it is preferable that illumination of the spreader covers more than one peak, as illustrated in Figure 8b, to reduce non-uniformity in the spread light due to non-uniformity of the illumination and/or optical surface. An approach as described above, in particular with two successive optical spreaders, enables the beam to have substantial width when it is spread from the second element. For example at the second element the beam may have a width of up to around 20cm. Use of an extended IR fan illumination source helps eye safety because if the light is imaged, it will be imaged over a large area of the retina as compared with a point IR source. Further, as previously described, use of an extended source results in soft rather than hard-edged shadows and can potentially remove the umbra of a shadow completely if the size of the source is sufficiently large.
Figure 9 shows a schematic view of a stripe-emitter laser which may be employed as laser diode 600 in the arrangements of Figure 7. In the short, vertical direction 802 the output beam from the laser has a large divergence (because of the small aperture size) but distance 802 may be small enough for the beam to be substantially single-mode in this direction. By contrast in the long, horizontal direction 804 the beam divergence is less (though still relatively large), but the light output results from the excitation of multiple different spatial modes.
The laser powers desirable for effective coverage of a large area such as an interactive whiteboard can most easily be met using a multi-mode laser, but it is not possible to produce a well-collimated beam from a multi-mode laser. However a multi-mode laser of a suitable power level, for example greater than 10 mW, 100 mW or 300 mW, is available in the form of a stripe-emitter laser.
In embodiments, therefore, a stripe multi-mode laser may be employed provided that the long axis 804 of the stripe output aperture is aligned parallel to a plane of the sheet of light. The beam can then be collimated in the "vertical" axis to define a suitable light sheet, whilst spreading out in the lateral direction (which is generally parallel to the display surface).
The laser beam entering the spreading optics should be of a size matched to the spreading optics. For a single-mode solid state laser diode this can be achieved by employing a collimation lens at a suitable distance in front of the laser, as schematically illustrated in Figure 7. For the stripe-emitter multi-mode solid state diode laser, as previously described, there may be a substantial difference in the divergence in the two axes. However the direction of minimum divergence of the laser source is also the direction of the extended laser emission source. Thus free-space propagation after the collimation lens can bring the beam sizes in the two axes together.
In a further refinement, which may be used in the above described aspects/embodiments of the invention or independently thereof, the tilt of the spreading lens may be adjusted. As described above, a preferred optical configuration for generating an IR fan employs a collimated laser beam illuminating one or more spreading elements (lenticular array(s) or other 1D optical element(s)). If the illuminating beam is not perpendicular to the plane of the spreading element, the resulting line generated is not straight - this effect may be termed 'smile'. As the touch surface is defined by the shape of the IR illumination, it is desirable to ensure the spreading element is perpendicular to the illuminating beam if the touch surface is flat. In a variant of this, touch technology employing light sheet touch detection may be configured for operation on a non-planar surface, more particularly by deliberately tilting the spreading element(s) with respect to the illuminating beam.
Use of structured touch-sheet light
In some approaches the location of a touch event may be detected using a spatially and/or temporally structured touch light sheet. Such approaches can help improve touch-location accuracy as distance from the sensor (camera) increases and/or reduce the accuracy constraints on wide angle input optics for the camera. Use of structured light can also help to improve signal-to-noise ratio vis-a-vis ambient light, which again otherwise tends to reduce with increasing distance from the touch sheet light source. These approaches may be employed either for user interface systems aligned to a surface or to detect touch events in mid-air (such as gestures), in each case detecting objects that intersect with light in the touch sheet. A comb-of-light approach can be preferable to a scanning system because it may be smaller and employ fewer components, and in particular no moving parts.
Thus, referring to Figure 10, this illustrates an example embodiment of such a system 1000. A comb generator module 1010 comprises a light source 1012 with an optional collimation lens 1014, followed by a comb generator optic 1016. As before, the image sensor 258, 260 detects light 1020 scattered from a touch object 1022 such as a finger back into the sensor, with a field of view indicated by cone 1024.
There are a number of methods which can be used to generate the comb, including placing a mask (transparency) containing the comb structure in front of the laser source. Preferably, however, a diffractive optical element (DOE) is placed in front of a collimated beam to generate the comb. This results in high efficiency and the DOE may be produced in high volume and at very low cost through embossing on plastic, etching onto glass, or other approaches.
The DOEs may (but need not) take the form of a substantially one-dimensional grating, with a pitch calculated to produce a divergence angle for the comb appropriate to illuminate the chosen touch detection area. Data for a computer-generated DOE can be calculated using conventional hologram generation approaches, for example direct binary search or Gerchberg-Saxton. For example, to obtain a diffraction (divergence) angle from the comb of 130 degrees, which is appropriate for obtaining a relatively large touch area at a relatively short distance from the comb generator module (desirable for a reasonably compact implementation), a grating feature size of around 0.5um may be used with a light source wavelength of 905nm.
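The quoted figures can be checked against the first-order grating equation d·sin θ = λ, taking θ as the half-angle of the diffraction fan. A sketch under the assumption of a binary grating whose feature size is half the pitch:

```python
import math

def grating_pitch_nm(wavelength_nm, fan_angle_deg):
    # First-order grating equation: d * sin(theta) = lambda, where theta
    # is the half-angle of the full diffraction fan.
    theta = math.radians(fan_angle_deg / 2)
    return wavelength_nm / math.sin(theta)

pitch = grating_pitch_nm(905, 130)  # roughly 1000 nm pitch
feature_size = pitch / 2            # binary grating: roughly 0.5 um features
```

For a 905nm source and a 130 degree fan this gives a pitch of about 1um, i.e. a feature size of about 0.5um, consistent with the figure quoted above.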
Using a DOE for the purpose of generating a comb, rather than approaches which involve blocking light, is advantageous because significantly higher optical power densities in the prongs of the comb can be achieved due to the minimal loss inherent in light generation by diffraction. This can further improve signal-to-noise ratio, thus improving accuracy and ambient light resilience. In Figure 10a, the comb is depicted as having substantially collimated prongs. However, it can be advantageous, for example to enable increased optical power density, for the comb to be convergent, that is, focussed towards the rear region of the touch area. A focussed comb can also permit a greater density of prongs per unit area (to increase detection accuracy) while retaining a low mark to space ratio (to increase comb power density and hence ambient light resilience).
An example of a focussed comb is shown in Figure 10c. Such a comb configuration may be achieved by adjusting the focus of the collimation lens, or by encoding focussing power into the DOE design.
Regardless of whether the prongs within the comb are focussed or collimated, the number of prongs generated by the comb (and/or the pitch of the comb) is preferably chosen so that at least one prong, and preferably more for accuracy, always intersects with a touch object (e.g. a finger) of a specified minimum diameter (e.g. 4mm), over the entire defined touch area. This is controllable by appropriate design of the DOE. For a touch sheet comprising a plane of light, sensitivity to ambient light increases as the distance from the source of the plane of light increases, due to the progressive decrease in the intensity of light scattered into the sensor by the touch object. As a result, it can become progressively more difficult to differentiate between light from the light source (at e.g. 905nm) scattered by the touch object and light of a similar wavelength in the environment. Use of a comb-of-light allows this problem to be ameliorated because, as the distance from the light source increases and this problem becomes more acute, the separation between the "prongs" of the comb also increases.
This provides two mechanisms for separating light from the comb scattered by a touch event from ambient light: i) The specific spatial frequency component corresponding to scatter from the comb can be selectively extracted from the image, providing resilience to other ambient light events, ii) Intensity or other information between the "prongs" of the comb can be used to infer the ambient light level in the vicinity of the touch object, and hence compensate for it by, for example, adjusting detection thresholds appropriately.
Figure 11 shows example representations of image sensor data captured for a plane-of-light touch sheet (Figure 11a), compared with a comb-of-light touch sheet (Figure 11b). These illustrate a finger 1100, patches of ambient (IR) light 1110 in/from the environment, and light 1120 scattered from the plane/comb (touch sheet). Figure 11b additionally illustrates that intensity data from regions 1130 between the prongs can be used to extract the ambient illumination level. Further, the spatial frequency 1140 of the prongs can be used to filter touch events from ambient background.
Further, as linear distance of touch objects from the sensor increases, increased distortion due to the camera optics tends to reduce the number of pixels illuminated by scatter from the touch object and hence the accuracy of detected touch events decreases. With a plane of light touch sheet, light scattered by touch objects close to the sensor has a similar spatial structure to light scattered by touch objects further away from the sensor. By contrast, with the arrangement of Figure 10, as the comb diverges from the light source the separation of prongs from each other increases. This information can be used to augment the spatial location of detected touch events to improve accuracy, through additional extraction of spatial frequency information, as illustrated in Figure 12. Thus Figure 12a shows a touch event close to the comb light source and Figure 12b a touch event detected further from the light comb source. In Figure 12a the high spatial frequency of scatter from the comb indicates that the touch event is close to the light comb source; in Figure 12b the lower spatial frequency of scatter from the comb indicates that the touch event is further from the light comb source.
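The distance inference from prong spacing amounts to simple divergence geometry; a sketch assuming equally spaced prongs diverging from a point source (the function name and parameters are illustrative):

```python
import math

def distance_from_prong_spacing(observed_spacing_mm, fan_angle_deg, num_prongs):
    # Prongs diverging from a point source are separated by a fixed angular
    # pitch, so the linear spacing observed at the touch object grows
    # linearly with distance from the comb generator.
    angular_pitch_rad = math.radians(fan_angle_deg) / (num_prongs - 1)
    return observed_spacing_mm / angular_pitch_rad
```

A higher observed spatial frequency (smaller prong spacing) thus maps to a touch event nearer the comb source, consistent with the behaviour illustrated in Figures 12a and 12b.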
The techniques we have described are particularly useful for implementing an interactive whiteboard although they also have advantages in smaller scale touch sensitive displays. No doubt many other effective alternatives will occur to the skilled person. It will be understood that the invention is not limited to the described embodiments and encompasses modifications apparent to those skilled in the art lying within the spirit and scope of the claims appended hereto.


CLAIMS:
1. A touch sensitive image display device, the device comprising:
an image projector to project a displayed image onto a surface;
a touch sensor optical system to project light defining a touch sheet above said displayed image;
a camera directed to capture a touch sense image from a region including at least a portion of said touch sheet, said touch sense image comprising light scattered from said touch sheet by an object approaching said displayed image; and
a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a location of said object relative to said displayed image;
wherein said object is provided with a reflective element to reflect light from said touch sheet.
2. A touch sensitive image display device as claimed in claim 1 further comprising a diffuser over said reflective element.
3. A touch sensitive image display device as claimed in claim 2 wherein said object comprises a pen with a reflective tip.
4. A touch sensitive image display device as claimed in claim 3 wherein said diffuser is an anisotropic diffuser configured to preferentially spread said reflected light in a direction aligned along an axis of said pen.
5. A touch sensitive image display device as claimed in any preceding claim further comprising a scattering surface behind said reflective element.
6. A touch sensitive image display device as claimed in any preceding claim comprising a plurality of said objects each having an optically distinguishable response to said light from said touch sheet; and further comprising a system to distinguish said responses to distinguish between said objects.
7. A touch sensitive image display device as claimed in claim 6 wherein at least one of said objects is provided with a circular polariser to circularly polarise said reflected light, and wherein said system to distinguish said responses to distinguish between said objects comprises at least one optical sensor configured to selectively sense circularly polarised light.
8. A touch sensitive image display device as claimed in claim 7 comprising first and second said objects provided with respective left and right circular polarisers, and wherein said system to distinguish said responses to distinguish between said objects comprises three optical sensors to sense, respectively, left circular polarised light, right circular polarised light, and unpolarised light.
9. A touch sensitive image display device as claimed in any one of claims 6 to 8 wherein said system to distinguish said responses to distinguish between said objects includes a time-of-flight detection system for distinguishing between multiple said objects intersecting said touch sheet concurrently.
10. A touch sensitive image display device as claimed in any preceding claim wherein said reflective element is a substantially retroreflective element.
11. A touch sensitive image display device as claimed in any preceding claim wherein the object has a user-controllable reflective element such that, under user control of said object, a property of light reflected from the touch sheet is controllable; and wherein said signal processor is configured to detect said property to identify use of said user control and to output user control data in response.
12. A touch sensitive image display device as claimed in claim 11 further comprising said object, and wherein said object comprises a user control for said user-controllable reflective element operable to selectively alter a reflected light response of said object to light from said touch sheet.
13. A touch sensitive image display device as claimed in claim 12 wherein said reflected light response of said object to light from said touch sheet comprises a change in polarisation of said reflected light responsive to operation of said user control.
14. A touch sensitive image display device as claimed in claim 12 or 13, wherein said user control comprises two controls or a three-way control to selectively display left circular polarised and right circular polarised light-polarising regions of the object.
15. A touch sensitive image display device as claimed in claim 12, 13 or 14 wherein said object is a passive object and wherein said user control comprises a mechanical control.
16. A touch sensitive image display device as claimed in any one of claims 11 to 15 wherein said user-controllable reflective element is able to selectively provide reflected light from said touch sheet comprising, by user selection, circularly polarised light.
17. A touch sensitive image display device, the device comprising:
an image projector to project a displayed image onto a surface;
a touch sensor optical system to project non-visible light defining a touch sheet above said displayed image;
a touch camera directed to capture a touch sense image from a region including at least a portion of said touch sheet, said touch sense image comprising light scattered from said touch sheet by an object approaching said displayed image; and a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a location of said object relative to said displayed image;
wherein said touch camera, or another said camera of said display aligned with said touch camera, is configured to capture an object image of one or more said objects; and
wherein said signal processor is configured to use said captured object image to distinguish between multiple said objects.
18. A touch sensor optical system to generate a light defining a touch sheet for a touch sensitive image display device, the optical system comprising:
at least one light source;
a first 1D optical spreading device illuminated by said at least one light source to spread light from said light source in one dimension to generate a first fan of light; and a second 1D optical spreading device illuminated by said first fan of light to spread light from said first fan of light to generate a second fan of light;
wherein said first fan of light provides an extended light source, extended in said one dimension, for said second fan of light; and wherein at least some locations within said second fan of light receive illumination from a plurality of different directions.
19. A touch sensor optical system as claimed in claim 18 comprising a plurality of said second 1D optical spreading devices, each illuminated by said first fan of light, to generate a plurality of overlapping said second fans of light.
20. A touch sensor optical system as claimed in claim 18 or 19 wherein said first 1D optical spreading device is configured to provide a multi-peaked intensity distribution of said spread light in said one dimension.
21. A touch sensor optical system as claimed in claim 18, 19 or 20 wherein a said second 1D optical spreading device comprises a lenticular lens array.
22. A touch sensor optical system as claimed in any one of claims 18 to 21 wherein said first 1D optical spreading device comprises a lens surface having at least two peaks illuminated by said light source.
23. A touch sensor optical system as claimed in any one of claims 18 to 22 wherein said at least one light source comprises a laser light source followed by a collimation system to illuminate said first 1D optical spreading device, and wherein a spatial extent of said first fan of light on said second 1D optical spreading device is at least 1 cm.
24. A touch sensor optical system as claimed in any one of claims 18 to 23 wherein said at least one light source comprises a stripe-emitter laser aligned such that a long direction of a stripe-output face of the laser is parallel to said touch sheet; optionally omitting said first 1D optical spreading device.
25. A touch sensitive image display device, the device comprising:
an image projector to project a displayed image onto a surface;
a touch sensor optical system as claimed in any one of claims 18 to 24 to project a light defining a touch sheet above said displayed image; a camera directed to capture a touch sense image from a region including at least a portion of said touch sheet, said touch sense image comprising light scattered from said touch sheet by an object approaching said displayed image; and preferably a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a location of said object relative to said displayed image.
26. A method of touch sensing in a touch sensitive image display device, the method comprising:
projecting a displayed image onto a surface;
projecting a light defining a touch sheet above said displayed image;
capturing a touch sense image from a region including at least a portion of said touch sheet, said touch sense image comprising light scattered from said touch sheet by an object approaching said displayed image; and
processing said touch sense image to identify a location of said object relative to said displayed image;
wherein said projecting of said light defining said touch sheet comprises projecting a plurality of overlapping fans of light above said displayed image.
27. A method as claimed in claim 26 wherein said projecting of said overlapping fans of light comprises projecting fans of light in at least two different directions.
28. A method as claimed in claim 27 wherein said projecting of said fans of light from at least two different directions comprises illuminating a first optical spreading device from a light source to generate a first fan of light, and illuminating a second optical spreading device with an extended source comprising said first fan of light to project said light defining said touch sheet.
29. A method as claimed in claim 26, 27 or 28 wherein said projecting comprises projecting said plurality of overlapping fans of light from a plurality of respective optical spreading devices such that each of said overlapping fans of light has a different spatial coverage of said touch sheet and wherein an edge of one said fan of light overlaps another said fan of light within said touch sheet.
30. A method as claimed in claim 29 wherein said projecting of said fans of light from at least two different directions comprises illuminating each of said plurality of optical spreading devices from a respective said light source.
31 . A method as claimed in any one of claims 26 to 30 wherein said projecting of said overlapping fans of light comprises projecting fans of light from at least two different locations laterally spaced apart along an edge of said touch sheet.
32. A method as claimed in any one of claims 26 to 31 wherein said overlapping fans of light overlap within a thickness of said touch sheet to define a single continuous said touch sheet.
33. A method as claimed in any of claims 26 to 32 wherein said projecting of said light defining said touch sheet comprises using a stripe-emitter laser as a light source, aligning said laser such that a long direction of a stripe-output face of the laser is parallel to said touch sheet, and collimating an optical output from said laser in a direction perpendicular to a plane of said touch sheet.
34. A touch sensitive image display device, the device comprising:
an image projector to project a displayed image onto a surface;
a touch sensor optical system to project light defining a touch sheet above said displayed image;
a camera directed to capture a touch sense image from a region including at least a portion of said touch sheet, said touch sense image comprising light scattered from said touch sheet by an object approaching said displayed image; and
a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a location of said object relative to said displayed image;
wherein said light defining said touch sheet comprises a plurality of overlapping fans of light.
35. A touch sensitive image display device as claimed in claim 34 wherein said touch sensor optical system comprises:
at least one light source; and
a plurality of optical spreading devices illuminated by said light source; and wherein each of said optical spreading devices is positioned and directed to project a respective fan of light, and wherein said projected overlapping fans of light overlap within said touch sheet.
36. A touch sensitive image display device as claimed in claim 35 wherein said optical spreading devices point in different directions.
37. A touch sensitive image display device as claimed in claim 35 or 36 wherein said optical spreading devices are at locations laterally spaced apart along an edge of said touch sheet.
38. A touch sensitive image display device as claimed in any one of claims 34 to 37 wherein said touch sensor optical system comprises a stripe-emitter laser aligned such that a long direction of a stripe-output face of the laser is parallel to said touch sheet.
39. A touch sensitive image display device as claimed in any of claims 34 to 38 wherein said touch sensor optical system comprises:
a first 1D optical spreading device illuminated by said at least one light source to spread light from said light source in one dimension to generate a first fan of light; and
a second 1D optical spreading device illuminated by said first fan of light to spread light from said first fan of light to generate a second fan of light.
40. A touch sensitive image display device, the device comprising:
an image projector to project a displayed image onto a surface;
a touch sensor optical system to project light defining a touch sheet above said displayed image;
a camera directed to capture a touch sense image from a region including at least a portion of said touch sheet, said touch sense image comprising light scattered from said touch sheet by an object or pen approaching said displayed image; and
a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a location of said object relative to said displayed image; wherein said touch sensor optical system comprises a stripe-emitter laser aligned such that a long direction of a stripe-output face of the laser is parallel to said touch sheet.
41. A touch sensitive image display device, the device comprising:
an image projector to project a displayed image onto a surface;
a touch sensor optical system to project light defining a touch sheet above said displayed image;
a camera directed to capture a touch sense image from a region including at least a portion of said touch sheet, said touch sense image comprising light scattered from said touch sheet by an object approaching said displayed image; and
a signal processor coupled to said camera, to process a said touch sense image from said camera to identify a location of said object relative to said displayed image;
wherein said touch sensor optical system is configured to project a spatially and/or temporally discontinuous light structure to define said touch sheet.
42. A touch sensitive image display device as claimed in claim 41 wherein said touch sensor optical system is configured to project a comb of light to define said touch sheet.
43. A touch sensitive image display device as claimed in claim 41 or 42 wherein said touch sensor optical system is configured to scan one or more light beams to define said light sheet.
PCT/GB2013/050103 2012-01-20 2013-01-17 Touch sensitive image display devices WO2013108031A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1413670.9A GB2513498A (en) 2012-01-20 2013-01-17 Touch sensitive image display devices

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
GBGB1200963.5A GB201200963D0 (en) 2012-01-20 2012-01-20 Touch sensing systems
GB1201009.6 2012-01-20
GBGB1201009.6A GB201201009D0 (en) 2012-01-20 2012-01-20 Touch sensing systems
GB1200963.5 2012-01-20
GB1205274.2 2012-03-26
GBGB1205274.2A GB201205274D0 (en) 2012-03-26 2012-03-26 Touch sensitive image display devices

Publications (2)

Publication Number Publication Date
WO2013108031A2 true WO2013108031A2 (en) 2013-07-25
WO2013108031A3 WO2013108031A3 (en) 2013-09-19

Family

ID=47599126

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2013/050103 WO2013108031A2 (en) 2012-01-20 2013-01-17 Touch sensitive image display devices

Country Status (2)

Country Link
GB (1) GB2513498A (en)
WO (1) WO2013108031A2 (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015016864A1 (en) * 2013-07-31 2015-02-05 Hewlett-Packard Development Company, L.P. System with projector unit and computer
GB2522248A (en) * 2014-01-20 2015-07-22 Promethean Ltd Interactive system
GB2523077A (en) * 2013-12-23 2015-08-19 Light Blue Optics Ltd Touch sensing systems
GB2526525A (en) * 2014-04-17 2015-12-02 Light Blue Optics Inc Touch sensing systems
WO2016018232A1 (en) * 2014-07-28 2016-02-04 Hewlett-Packard Development Company, L.P. Image background removal using multi-touch surface input
WO2017196591A1 (en) * 2016-05-09 2017-11-16 Microsoft Technology Licensing, Llc Multipath signal removal in time-of-flight camera apparatus
EP3281095A4 (en) * 2015-04-08 2018-05-30 Ricoh Company, Ltd. Information processing apparatus, information input system, information processing method, and computer program product
US10281997B2 (en) 2014-09-30 2019-05-07 Hewlett-Packard Development Company, L.P. Identification of an object on a touch-sensitive surface
US10318023B2 (en) 2014-08-05 2019-06-11 Hewlett-Packard Development Company, L.P. Determining a position of an input object
TWI696052B (en) * 2014-08-26 2020-06-11 英商萬佳雷射有限公司 Apparatus and methods for performing laser ablation on a substrate
WO2022173353A1 (en) * 2021-02-09 2022-08-18 Flatfrog Laboratories Ab An interaction system
US11893189B2 (en) 2020-02-10 2024-02-06 Flatfrog Laboratories Ab Touch-sensing apparatus

Citations (36)

Publication number Priority date Publication date Assignee Title
US4384201A (en) 1978-04-24 1983-05-17 Carroll Manufacturing Corporation Three-dimensional protective interlock apparatus
DE4121180A1 (en) 1991-06-27 1993-01-07 Bosch Gmbh Robert Finger input type interactive screen display system for road vehicle navigation - has panel screen with matrix of sensing elements that can be of infrared or ultrasonic proximity devices or can be touch foil contacts
US5767842A (en) 1992-02-07 1998-06-16 International Business Machines Corporation Method and device for optical input of commands or data
US6031519A (en) 1997-12-30 2000-02-29 O'brien; Wayne P. Holographic direct manipulation interface
WO2000021282A1 (en) 1998-10-02 2000-04-13 Macronix International Co., Ltd. Method and apparatus for preventing keystone distortion
GB2343023A (en) 1998-10-21 2000-04-26 Global Si Consultants Limited Apparatus for order control
US6281878B1 (en) 1994-11-01 2001-08-28 Stephen V. R. Montellese Apparatus and method for inputing data
WO2001093182A1 (en) 2000-05-29 2001-12-06 Vkb Inc. Virtual data entry device and method for input of alphanumeric and other data
WO2001093006A1 (en) 2000-05-29 2001-12-06 Vkb Inc. Data input device
US20020021287A1 (en) 2000-02-11 2002-02-21 Canesta, Inc. Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device
US6377238B1 (en) 1993-04-28 2002-04-23 Mcpheters Robert Douglas Holographic control arrangement
US6491400B1 (en) 2000-10-24 2002-12-10 Eastman Kodak Company Correcting for keystone distortion in a digital image displayed by a digital projector
WO2002101443A2 (en) 2001-06-12 2002-12-19 Silicon Optix Inc. System and method for correcting keystone distortion
US6611921B2 (en) 2001-09-07 2003-08-26 Microsoft Corporation Input device with two input signal generating means having a power state where one input means is powered down and the other input means is cycled between a powered up state and a powered down state
US6614422B1 (en) 1999-11-04 2003-09-02 Canesta, Inc. Method and apparatus for entering data using a virtual input device
US6650318B1 (en) 2000-10-13 2003-11-18 Vkb Inc. Data input device
US6690357B1 (en) 1998-10-07 2004-02-10 Intel Corporation Input device using scanning sensors
US20040095315A1 (en) 2002-11-12 2004-05-20 Steve Montellese Virtual holographic input method and device
US20060187199A1 (en) 2005-02-24 2006-08-24 Vkb Inc. System and method for projection
WO2006108443A1 (en) 2005-04-13 2006-10-19 Sensitive Object Method for determining the location of impacts by acoustic imaging
US20060244720A1 (en) 2005-04-29 2006-11-02 Tracy James L Collapsible projection assembly
US20070019103A1 (en) 2005-07-25 2007-01-25 Vkb Inc. Optical apparatus for virtual interface projection and sensing
US7242388B2 (en) 2001-01-08 2007-07-10 Vkb Inc. Data input device
US7268692B1 (en) 2007-02-01 2007-09-11 Lumio Inc. Apparatus and method for monitoring hand propinquity to plural adjacent item locations
WO2008038275A2 (en) 2006-09-28 2008-04-03 Lumio Inc. Optical touch panel
US7379619B2 (en) 2005-03-09 2008-05-27 Texas Instruments Incorporated System and method for two-dimensional keystone correction for aerial imaging
WO2008075096A1 (en) 2006-12-21 2008-06-26 Light Blue Optics Ltd Holographic image display systems
US7394459B2 (en) 2004-04-29 2008-07-01 Microsoft Corporation Interaction between objects and a virtual environment display
US7417681B2 (en) 2002-06-26 2008-08-26 Vkb Inc. Multifunctional integrated image sensor and application to virtual interface technology
WO2008146098A1 (en) 2007-05-28 2008-12-04 Sensitive Object Method for determining the position of an excitation on a surface and device for implementing such a method
US7519223B2 (en) 2004-06-28 2009-04-14 Microsoft Corporation Recognizing gestures and using gestures for interacting with software applications
USD595785S1 (en) 2007-11-09 2009-07-07 Igt Standalone, multi-player gaming table apparatus with an electronic display
US7593593B2 (en) 2004-06-16 2009-09-22 Microsoft Corporation Method and system for reducing effects of undesired signals in an infrared imaging system
US7599561B2 (en) 2006-02-28 2009-10-06 Microsoft Corporation Compact interactive tabletop with projection-vision
WO2010007404A2 (en) 2008-07-16 2010-01-21 Light Blue Optics Limited Holographic image display systems
WO2010073047A1 (en) 2008-12-24 2010-07-01 Light Blue Optics Limited Touch sensitive image display device

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
US2004205A (en) * 1932-01-09 1935-06-11 Mosinee Paper Mills Company Smelting furnace for black liquor
US7623115B2 (en) * 2002-07-27 2009-11-24 Sony Computer Entertainment Inc. Method and apparatus for light input device
US6917033B2 (en) * 2002-10-15 2005-07-12 International Business Machines Corporation Passive touch-sensitive optical marker
US20040140988A1 (en) * 2003-01-21 2004-07-22 David Kim Computing system and device having interactive projected display
US7911444B2 (en) * 2005-08-31 2011-03-22 Microsoft Corporation Input method for surface of interactive display

Patent Citations (43)

Publication number Priority date Publication date Assignee Title
US4384201A (en) 1978-04-24 1983-05-17 Carroll Manufacturing Corporation Three-dimensional protective interlock apparatus
DE4121180A1 (en) 1991-06-27 1993-01-07 Bosch Gmbh Robert Finger input type interactive screen display system for road vehicle navigation - has panel screen with matrix of sensing elements that can be of infrared or ultrasonic proximity devices or can be touch foil contacts
US5767842A (en) 1992-02-07 1998-06-16 International Business Machines Corporation Method and device for optical input of commands or data
US6377238B1 (en) 1993-04-28 2002-04-23 Mcpheters Robert Douglas Holographic control arrangement
US6281878B1 (en) 1994-11-01 2001-08-28 Stephen V. R. Montellese Apparatus and method for inputing data
US6031519A (en) 1997-12-30 2000-02-29 O'brien; Wayne P. Holographic direct manipulation interface
WO2000021282A1 (en) 1998-10-02 2000-04-13 Macronix International Co., Ltd. Method and apparatus for preventing keystone distortion
US6367933B1 (en) 1998-10-02 2002-04-09 Macronix International Co., Ltd. Method and apparatus for preventing keystone distortion
US6690357B1 (en) 1998-10-07 2004-02-10 Intel Corporation Input device using scanning sensors
GB2343023A (en) 1998-10-21 2000-04-26 Global Si Consultants Limited Apparatus for order control
US6614422B1 (en) 1999-11-04 2003-09-02 Canesta, Inc. Method and apparatus for entering data using a virtual input device
US6710770B2 (en) 2000-02-11 2004-03-23 Canesta, Inc. Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device
US20020021287A1 (en) 2000-02-11 2002-02-21 Canesta, Inc. Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device
US7084857B2 (en) 2000-05-29 2006-08-01 Vkb Inc. Virtual data entry device and method for input of alphanumeric and other data
WO2001093006A1 (en) 2000-05-29 2001-12-06 Vkb Inc. Data input device
US7305368B2 (en) 2000-05-29 2007-12-04 Vkb Inc. Virtual data entry device and method for input of alphanumeric and other data
WO2001093182A1 (en) 2000-05-29 2001-12-06 Vkb Inc. Virtual data entry device and method for input of alphanumeric and other data
US6650318B1 (en) 2000-10-13 2003-11-18 Vkb Inc. Data input device
US6491400B1 (en) 2000-10-24 2002-12-10 Eastman Kodak Company Correcting for keystone distortion in a digital image displayed by a digital projector
US20070222760A1 (en) 2001-01-08 2007-09-27 Vkb Inc. Data input device
US7242388B2 (en) 2001-01-08 2007-07-10 Vkb Inc. Data input device
WO2002101443A2 (en) 2001-06-12 2002-12-19 Silicon Optix Inc. System and method for correcting keystone distortion
US6611921B2 (en) 2001-09-07 2003-08-26 Microsoft Corporation Input device with two input signal generating means having a power state where one input means is powered down and the other input means is cycled between a powered up state and a powered down state
US7417681B2 (en) 2002-06-26 2008-08-26 Vkb Inc. Multifunctional integrated image sensor and application to virtual interface technology
US20040095315A1 (en) 2002-11-12 2004-05-20 Steve Montellese Virtual holographic input method and device
US7394459B2 (en) 2004-04-29 2008-07-01 Microsoft Corporation Interaction between objects and a virtual environment display
US7593593B2 (en) 2004-06-16 2009-09-22 Microsoft Corporation Method and system for reducing effects of undesired signals in an infrared imaging system
US7519223B2 (en) 2004-06-28 2009-04-14 Microsoft Corporation Recognizing gestures and using gestures for interacting with software applications
US20060187199A1 (en) 2005-02-24 2006-08-24 Vkb Inc. System and method for projection
US7379619B2 (en) 2005-03-09 2008-05-27 Texas Instruments Incorporated System and method for two-dimensional keystone correction for aerial imaging
WO2006108443A1 (en) 2005-04-13 2006-10-19 Sensitive Object Method for determining the location of impacts by acoustic imaging
US20060244720A1 (en) 2005-04-29 2006-11-02 Tracy James L Collapsible projection assembly
US20070019103A1 (en) 2005-07-25 2007-01-25 Vkb Inc. Optical apparatus for virtual interface projection and sensing
US7599561B2 (en) 2006-02-28 2009-10-06 Microsoft Corporation Compact interactive tabletop with projection-vision
WO2008038275A2 (en) 2006-09-28 2008-04-03 Lumio Inc. Optical touch panel
WO2008075096A1 (en) 2006-12-21 2008-06-26 Light Blue Optics Ltd Holographic image display systems
US7268692B1 (en) 2007-02-01 2007-09-11 Lumio Inc. Apparatus and method for monitoring hand propinquity to plural adjacent item locations
WO2008146098A1 (en) 2007-05-28 2008-12-04 Sensitive Object Method for determining the position of an excitation on a surface and device for implementing such a method
USD595785S1 (en) 2007-11-09 2009-07-07 Igt Standalone, multi-player gaming table apparatus with an electronic display
WO2010007404A2 (en) 2008-07-16 2010-01-21 Light Blue Optics Limited Holographic image display systems
WO2010073047A1 (en) 2008-12-24 2010-07-01 Light Blue Optics Limited Touch sensitive image display device
WO2010073024A1 (en) 2008-12-24 2010-07-01 Light Blue Optics Ltd Touch sensitive holographic displays
WO2010073045A2 (en) 2008-12-24 2010-07-01 Light Blue Optics Ltd Display device

Cited By (19)

Publication number Priority date Publication date Assignee Title
WO2015016864A1 (en) * 2013-07-31 2015-02-05 Hewlett-Packard Development Company, L.P. System with projector unit and computer
EP3028113A1 (en) * 2013-07-31 2016-06-08 Hewlett-Packard Development Company, L.P. System with projector unit and computer
EP3028113A4 (en) * 2013-07-31 2017-04-05 Hewlett-Packard Development Company, L.P. System with projector unit and computer
GB2523077A (en) * 2013-12-23 2015-08-19 Light Blue Optics Ltd Touch sensing systems
US9886105B2 (en) 2013-12-23 2018-02-06 Promethean Limited Touch sensing systems
GB2522248A (en) * 2014-01-20 2015-07-22 Promethean Ltd Interactive system
WO2015107225A3 (en) * 2014-01-20 2015-09-11 Promethean Limited Interactive system
GB2526525A (en) * 2014-04-17 2015-12-02 Light Blue Optics Inc Touch sensing systems
WO2016018232A1 (en) * 2014-07-28 2016-02-04 Hewlett-Packard Development Company, L.P. Image background removal using multi-touch surface input
US10656810B2 (en) 2014-07-28 2020-05-19 Hewlett-Packard Development Company, L.P. Image background removal using multi-touch surface input
US10318023B2 (en) 2014-08-05 2019-06-11 Hewlett-Packard Development Company, L.P. Determining a position of an input object
TWI696052B (en) * 2014-08-26 2020-06-11 英商萬佳雷射有限公司 Apparatus and methods for performing laser ablation on a substrate
US10281997B2 (en) 2014-09-30 2019-05-07 Hewlett-Packard Development Company, L.P. Identification of an object on a touch-sensitive surface
EP3281095A4 (en) * 2015-04-08 2018-05-30 Ricoh Company, Ltd. Information processing apparatus, information input system, information processing method, and computer program product
US10302768B2 (en) 2016-05-09 2019-05-28 Microsoft Technology Licensing, Llc Multipath signal removal in time-of-flight camera apparatus
US10234561B2 (en) 2016-05-09 2019-03-19 Microsoft Technology Licensing, Llc Specular reflection removal in time-of-flight camera apparatus
WO2017196591A1 (en) * 2016-05-09 2017-11-16 Microsoft Technology Licensing, Llc Multipath signal removal in time-of-flight camera apparatus
US11893189B2 (en) 2020-02-10 2024-02-06 Flatfrog Laboratories Ab Touch-sensing apparatus
WO2022173353A1 (en) * 2021-02-09 2022-08-18 Flatfrog Laboratories Ab An interaction system

Also Published As

Publication number Publication date
GB201413670D0 (en) 2014-09-17
GB2513498A (en) 2014-10-29
WO2013108031A3 (en) 2013-09-19

Similar Documents

Publication Publication Date Title
WO2013108031A2 (en) Touch sensitive image display devices
US9292109B2 (en) Interactive input system and pen tool therefor
CN102591488B (en) The input equipment improved and the method be associated
US8941620B2 (en) System and method for a virtual multi-touch mouse and stylus apparatus
US8682030B2 (en) Interactive display
Hodges et al. ThinSight: versatile multi-touch sensing for thin form-factor displays
US8681124B2 (en) Method and system for recognition of user gesture interaction with passive surface video displays
US10437391B2 (en) Optical touch sensing for displays and other applications
CN101971123A (en) Interactive surface computer with switchable diffuser
WO2013108032A1 (en) Touch sensitive image display devices
US20100295821A1 (en) Optical touch panel
WO2013144599A2 (en) Touch sensing systems
CA2493236A1 (en) Apparatus and method for inputting data
CN102016713A (en) Projection of images onto tangible user interfaces
KR20110005737A (en) Interactive input system with optical bezel
US20120249480A1 (en) Interactive input system incorporating multi-angle reflecting structure
JP6721875B2 (en) Non-contact input device
KR20110123257A (en) Touch pointers disambiguation by active display feedback
CN109146945B (en) Display panel and display device
JP6187067B2 (en) Coordinate detection system, information processing apparatus, program, storage medium, and coordinate detection method
US20150248189A1 (en) Touch Sensing Systems
US20140247249A1 (en) Touch Sensitive Display Devices
GB2523077A (en) Touch sensing systems
US9285894B1 (en) Multi-path reduction for optical time-of-flight
JP2019074933A (en) Non-contact input device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13700949

Country of ref document: EP

Kind code of ref document: A2

ENP Entry into the national phase

Ref document number: 1413670

Country of ref document: GB

Kind code of ref document: A

Free format text: PCT FILING DATE = 20130117

WWE Wipo information: entry into national phase

Ref document number: 1413670.9

Country of ref document: GB

122 Ep: pct application non-entry in european phase

Ref document number: 13700949

Country of ref document: EP

Kind code of ref document: A2