WO2013172768A2 - Input system - Google Patents

Input system

Info

Publication number
WO2013172768A2
WO2013172768A2 (application PCT/SE2013/050519)
Authority
WO
WIPO (PCT)
Prior art keywords
input
vehicle
detector
projector
virtual touchscreen
Prior art date
Application number
PCT/SE2013/050519
Other languages
French (fr)
Other versions
WO2013172768A3 (en)
Inventor
Malte ROTHHÄMEL
Original Assignee
Scania Cv Ab
Priority date
Filing date
Publication date
Application filed by Scania Cv Ab filed Critical Scania Cv Ab
Priority to CN201380025231.1A priority Critical patent/CN104508598A/en
Priority to BR112014028380A priority patent/BR112014028380A2/en
Priority to EP13790166.6A priority patent/EP2850506A4/en
Priority to RU2014150517A priority patent/RU2014150517A/en
Publication of WO2013172768A2 publication Critical patent/WO2013172768A2/en
Publication of WO2013172768A3 publication Critical patent/WO2013172768A3/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Arrangement of adaptations of instruments
    • B60K35/10
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K37/00Dashboards
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • B60K2360/1438
    • B60K2360/21
    • B60K2360/334


Abstract

An input system (2) for a vehicle, comprising a projector system (4) and a detector system (6), which are adapted so as to communicate with a control unit (8). The projector system (4) is adapted so as to generate a virtual touchscreen (10) comprising an adjustable input menu adapted for inputting control instructions for controlling one or a plurality of systems for the vehicle, wherein the projector system (4) is adapted so as to project the virtual touchscreen on a display surface in or at the vehicle. The input system (2) is adapted so as to be in at least two modes, a sleep mode and a use mode, wherein the input system (2) is adapted so as to: A - detect a first activation measure by means of said detector system (6), B - initiate the use mode, C - select and define a display surface for the virtual touchscreen (10), D - generate and project the virtual touchscreen on the selected display surface by means of said projector system (4), E - receive, by means of said detector system (6), input control instructions in the form of input activities via said projected input menu and generate an input signal (12) in dependence thereon, F - control one or a plurality of systems in dependence upon said input signal (12).

Description

Title
Input System
Technical field of the invention
The present invention concerns an input system, and a method associated with such an input system, according to the preambles to the independent claims.
The invention concerns in particular an input system and method for a vehicle, which system and method facilitate the control of systems and functions on the vehicle.

Background of the invention
In a modern goods vehicle there is a plurality of systems or functions that the driver, or a passenger, desires to be able to use from the driver's seat, from the passenger seat or from other locations in and around the vehicle, such as from the bed or outside the vehicle. Examples of such systems or functions include lighting, heating systems, radio, TV and multimedia as well as, for example, the adjustment of the pneumatic suspension for the goods vehicle, for example in connection with the hitching of a trailer.
This desire is currently fulfilled by arranging buttons and controls in a number of locations in and in connection with the vehicle to enable the systems and the functions to be used even if the user is not sitting in the driver's seat. For example, the interior lighting can be controlled from a panel near the bed. Another example is a remote control for adjusting the pneumatic suspension, which control is accessible at the passenger seat so that it can be taken out when an inspection occurs. Such a remote control is often connected by means of a cable. There are numerous examples of input interfaces for use in vehicles, and a number of different systems will be discussed briefly below.
US 7,248,151 concerns a virtual keyboard for a vehicle, which keyboard is used in connection with controlling various functions for the vehicle, such as unlocking the vehicle. A keyboard is projected, for example, on a side window of the vehicle. A detector and a processor are arranged so as to detect gestures and identify the gestures that are performed on the keyboard. The detector can, for example, consist of a camera or an infrared detector.
US 7,050,606 concerns a system for detecting and identifying hand gestures that are particularly suitable for controlling various functions for a vehicle. These functions pertain to, for example, heating, air conditioning, lighting, CD/radio settings, etc.
US 2011/0286676 describes systems and related methods intended for vehicles in order to detect and identify gestures in three dimensions. According to one of the methods, the step is included of receiving one or a plurality of unprocessed frames of image data from a sensor, processing and combining a plurality of frames in order to identify body parts of the user in the vehicle and calculate the position of the end of the hand of the user, determining whether the hand has performed a dynamic or a static gesture, receiving a command that corresponds to a number of stored gestures, and performing the command.
Reference is also made here to the article "Natural and Intuitive Hand Gestures: A Substitute for Traditional Vehicle Control" by A. Riener and M. Rossbory, which was published in AutomotiveUI '11, November 29-December 2, 2011, Salzburg, Austria, Adjunct Proceedings. The article describes a test in which a driver gives instructions by means of gestures to control a number of functions. In the test, much of the technology that is used in video games is utilized, and in particular Microsoft Kinect, which is used for the Xbox game console. In the testing, hand gestures performed when the hand is kept in proximity to the gearshift were identified. A detector in the form of a camera was mounted in the ceiling and adjusted so as to capture hand motions in the area around the gearshift.
Microsoft Kinect is thus a system that is adapted primarily for the game industry, and entails that sensors are arranged in connection with a display, wherein the sensors comprise a camera and a 3D sensor that can, with the help of special software, detect the motions of a user in three dimensions and identify, among other things, the face of the user. The Kinect system is thus used in the Xbox game console and elsewhere. The sensor for three dimensions consists of an infrared laser projector combined with a monochrome CMOS sensor, which can detect video data in three dimensions under daylight conditions. The sensing distance for the 3D sensor can be adjusted, and the software can automatically calibrate the sensor depending upon which game is being played, and upon the physical surroundings of the player, so that furniture and other obstacles can be taken into account. The system enables advanced motion recognition and can track the movements of two active players simultaneously, whereupon motion analysis can occur by evaluating motion steps from up to two joints per player.
The Kinect system sensor outputs a video signal with a frame rate of 30 Hz. The video streams with the RGB signal use an 8-bit VGA resolution (640 × 480 pixels) with a color filter, while the monochrome video streams for 3D effects have a VGA resolution (640 × 480 pixels) of 11 bits, which offers sensitivity at 2,048 levels. The Kinect sensor functions at distances within the range of 1.2 to 3.5 meters when it is used in conjunction with the software for the Xbox, but can be given an expanded range of from 0.7 to 6.0 meters. The sensor has a horizontal detection angle of 57° and a vertical detection angle of 43°. The sensor can also be pivoted 27° in the horizontal direction.
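As a rough numerical illustration (not part of the patent itself), the cited Kinect figures can be checked and combined; all names below are illustrative constants, and the field-width formula is a simple pinhole approximation:

```python
import math

# Published Kinect v1 figures cited above (illustrative constants).
RGB_RES = (640, 480)            # 8-bit VGA color stream
DEPTH_BITS = 11                 # monochrome depth stream bit depth
FRAME_RATE_HZ = 30
H_FOV_DEG, V_FOV_DEG = 57.0, 43.0
RANGE_M = (1.2, 3.5)            # default range with the Xbox software
EXTENDED_RANGE_M = (0.7, 6.0)

# 11 bits give 2**11 = 2048 distinguishable depth levels, as stated.
depth_levels = 2 ** DEPTH_BITS

def field_width_m(distance_m: float, fov_deg: float) -> float:
    """Approximate width of the sensed area at a given distance."""
    return 2 * distance_m * math.tan(math.radians(fov_deg / 2))

print(depth_levels)                              # 2048
print(round(field_width_m(3.5, H_FOV_DEG), 2))   # ≈ 3.8 m wide at the 3.5 m default limit
```

This suggests why a ceiling-mounted unit can cover most of a cab interior: at typical in-cab distances the horizontal coverage already spans several meters.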
To enable the driver of a vehicle to utilize the many different systems and functions that are currently found in a modern goods vehicle, numerous buttons and controls are arranged in the vehicle, not only in connection with the driver's seat but, for example, by the bed and in the form of remote controls. These buttons and controls often require separate cable runs, which makes their installation complicated and entails high costs. In addition, it can sometimes be difficult for the driver of the vehicle to know where the buttons for a given function are located.

The object of the present invention is to provide an improved input interface that is both more user-friendly and offers cost savings for the vehicle manufacturer.
Summary of the invention
The foregoing objects are achieved by the invention defined in the independent claims. Preferred embodiments are defined by the dependent claims.
A projector system and a detector system that is adapted so as to detect the body motions of a person are combined by means of the input system according to the invention.
In implementing the input system according to the invention, miniature projectors and camera systems are used that are also used in connection with, among other things, video games, such as Microsoft Kinect, which is used in the Xbox game console (as described above). According to the invention, a virtual touchscreen is projected onto a desired surface in or outside the goods vehicle and used to input control instructions for systems and functions in the vehicle. The system is arranged, for example, in the ceiling of the goods vehicle so that the virtual screen can be projected in any conceivable position.
The touchscreen can, for example, be projected on the wall next to the bed, on the mattress, on the floor or on walls. If one is outside the vehicle, the virtual touchscreen can be projected on a plate held in the hand or, for example, on the inside of the door or on the step.
It is thus possible to replace a plurality of buttons in the goods vehicle for, for example, remote control of the pneumatic suspension, lighting buttons and the control unit for the heating system. It is also possible to define, by oneself, which systems and functions are to be controllable by means of the virtual touchscreen.
Through the application of the present invention, a system is achieved that is more user-friendly, in part because access for controlling the various vehicle systems is improved. It also offers cost savings for the vehicle manufacturer because, for example, fewer cable runs are required and fewer buttons and control units are needed.
Brief description of the drawing
Figure 1 is a block diagram that schematically illustrates the present invention.
Figure 2 shows a flow diagram that illustrates the present invention.

Detailed description of preferred embodiments of the invention
The present invention will now be described with reference to the accompanying drawings, and first with reference to the block diagram in Figure 1. The present invention concerns an input system 2 for a vehicle, preferably a goods vehicle, a bus or a motor home, but also for cars. The input system comprises a projector system 4 and a detector system 6, which are adapted so as to communicate with a control unit 8.
The projector system 4 is adapted so as to generate a virtual touchscreen 10 comprising an adjustable input menu adapted so as to input control instructions to control one or a plurality of systems for the vehicle, which projector system 4 is adapted so as to project the virtual touchscreen on a display surface in or at the vehicle. The projector system 4 is preferably arranged in the ceiling of the vehicle cab.
The projector system 4 comprises at least one of an image generator, for example a cathode ray tube (CRT), a digital light processing unit, and an optical projecting unit that includes optical projection lenses for projecting a generated image. The projector system can also consist of a laser device that "draws" the virtual touchscreen.
In the embodiment shown in the figure, the projector system consists of three units that are disposed in various positions in and/or around the vehicle to enable projection in as many locations as possible. A system with only one unit is also possible, in which case the unit can, for example, be arranged for projection near the bed.
The detector system 6 is also preferably arranged in the ceiling of the vehicle cab and can suitably be arranged in connection with the projector system. In the embodiment shown in the figure, the detector system comprises three units, each of which comprises at least one optical data-gathering unit, such as a camera or an infrared detector. The detector system preferably also comprises a sound detector adapted so as to detect activation measures and input activities in the form of sound, for example from voices or tapping sounds.
In connection with the application of the invention, the input system 2 is adapted so as to be in at least two modes, a standby mode and a use mode. The input system is of course also in an entirely passive mode when the system is turned off.
The input system 2 is thus adapted so as to:
A - detect a first activation measure by means of said detector system 6 and, if such a first activation measure is detected, to
B - initiate the use mode.
In its use mode, the system has been opened for the user to choose where the virtual touchscreen is to be projected. The user accomplishes this by
C - selecting and defining a display surface for the virtual touchscreen 10.
According to one embodiment, the system waits for the user to confirm that the touchscreen is to be projected, which the user does by performing a second activation measure. The system according to this embodiment is adapted so as to
- detect a second activation measure by means of said detector system 6.
The input system is then ready to receive control instructions via the virtual touchscreen that is projected on the selected display surface. This occurs via the steps of:
D - generating and projecting the virtual touchscreen, using said projector system, on the selected display surface,
E - receiving, using said detector system 6, input control instructions in the form of input activities via said projected input menu and generating an input signal 12 in dependence thereon,
F - controlling one or a plurality of systems in dependence upon said input signal 12.
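Steps A-F above can be sketched as a small two-mode state machine. Everything in this sketch is an illustrative assumption, not an API from the patent: the class names, the event strings, and the stub projector, detector and control unit exist only to make the sequence runnable.

```python
from enum import Enum, auto

class Mode(Enum):
    STANDBY = auto()
    USE = auto()        # use mode: touchscreen projected, inputs accepted

class InputSystem:
    """Minimal sketch of steps A-F; all names are illustrative."""
    def __init__(self, projector, detector, control_unit):
        self.projector = projector
        self.detector = detector
        self.control_unit = control_unit
        self.mode = Mode.STANDBY
        self.surface = None

    def on_event(self, event, payload=None):
        if self.mode is Mode.STANDBY and event == "first_activation":   # step A
            self.mode = Mode.USE                                        # step B
        elif self.mode is Mode.USE and event == "select_surface":       # step C
            self.surface = payload
            self.projector.project(self.surface)                        # step D
        elif self.mode is Mode.USE and event == "input_activity":       # step E
            signal = self.detector.interpret(payload)                   # input signal
            self.control_unit.control(signal)                           # step F

# Tiny stand-ins so the sketch runs on its own.
class _Stub:
    def __init__(self):
        self.log = []
    def project(self, surface):
        self.log.append(("project", surface))
    def interpret(self, activity):
        return f"signal({activity})"
    def control(self, signal):
        self.log.append(("control", signal))

system = InputSystem(_Stub(), _Stub(), _Stub())
system.on_event("first_activation")
system.on_event("select_surface", "wall by the bed")
system.on_event("input_activity", "press lighting")
```

In a real implementation the final `control` call would translate into messages on the vehicle bus, which, as noted above, the vehicle then processes in the normal way.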
In the figure, the control of the systems and functions for the vehicle is indicated by a double arrow from the control unit 8. The control unit is preferably connected to the vehicle bus system, and the control signals that it generates are processed by the vehicle in the normal way, and consequently need not be described further here.
Examples of systems or functions that can suitably be controlled by means of the input system include the lighting, heating system, radio, TV and multimedia as well as, for example, settings of the pneumatic suspension for the goods vehicle, for example in connection with the hitching of a trailer.
The first and, in applicable cases, second activation measure comprise one or a plurality of: a predetermined pattern of motion (a gesture), a finger snap, or a tap. The first activation measure can, for example, consist of tapping two times to initiate the use mode and then, according to one embodiment, of causing the system to display the virtual touchscreen by identifying the selected display surface by distinctly pointing with one finger, which then constitutes the second activation measure.
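The double-tap-then-point example can be sketched as a small event filter. The event tuples, the one-second tap window, and the function name are assumptions made for illustration only:

```python
def activation_stage(events, tap_window_s=1.0):
    """Given (timestamp_s, kind) events, return the stage reached:
    0 = nothing detected, 1 = use mode (two taps within the window),
    2 = display surface confirmed (a point gesture after the taps)."""
    taps = [t for t, kind in events if kind == "tap"]
    # First activation measure: two taps close together in time.
    two_taps = any(b - a <= tap_window_s for a, b in zip(taps, taps[1:]))
    if not two_taps:
        return 0
    pair_end = next(b for a, b in zip(taps, taps[1:]) if b - a <= tap_window_s)
    # Second activation measure: a distinct pointing gesture afterwards.
    pointed = any(kind == "point" and t > pair_end for t, kind in events)
    return 2 if pointed else 1
```

For example, `[(0.0, "tap"), (0.3, "tap"), (1.0, "point")]` reaches stage 2, while two taps separated by more than the window leave the system in standby.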
In use mode, the input menu is displayed on the virtual touchscreen. For example, the start menu, i.e. the first input menu display, can be dependent upon where the virtual touchscreen is projected. If, for example, the touchscreen is to be projected near the bed, a menu for adjusting the lighting and the radio will be displayed.
The input menu preferably comprises a plurality of input menus arranged in a hierarchical system, wherein activation of a function occurs by means of a specified input activity. An input activity consists of, for example, pressing a predefined area on the input menu by keeping the hand/finger in the input area for at least a predetermined time (on the order of parts of a second up to several seconds).
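As a sketch of how the hierarchical menu system, the location-dependent start menu, and the dwell-time input activity could fit together (all menu names, locations, and the dwell value are illustrative assumptions, not taken from the patent):

```python
# Hierarchical input menus keyed by where the touchscreen is projected;
# leaves are identifiers of vehicle functions to control.
MENUS = {
    "near_bed": {
        "lighting": {"dim": "light_dim", "off": "light_off"},
        "radio": {"on": "radio_on", "off": "radio_off"},
    },
    "rear_exterior": {
        "suspension": {"raise": "axle_raise", "lower": "axle_lower"},
    },
}

DWELL_TIME_S = 0.5  # "on the order of parts of a second up to several seconds"

def resolve(menu, path):
    """Walk the hierarchical menu along a sequence of selected input areas."""
    node = menu
    for key in path:
        node = node[key]
    return node  # a submenu (dict) or a function identifier (str)

def accept_press(area, held_for_s, dwell=DWELL_TIME_S):
    """An input activity counts only if the hand/finger stays in the area long enough."""
    return area if held_for_s >= dwell else None
```

Projecting near the bed would thus start at `MENUS["near_bed"]`, and each accepted press descends one level until a function identifier is reached.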
In order to make it known that an input activity has been accepted by the system, the system is adapted so as to generate a predetermined acknowledgement that consists of one or a plurality of changes in the color or shape of an input area (the button), and the generation of an acoustic signal.
The present invention further comprises a method in connection with an input system for a vehicle, wherein the input system comprises a projector system and a detector system, which are arranged so as to communicate with a control unit. The method will now be described briefly with reference to the flow diagram in Figure 2. Reference is also made to relevant parts of the foregoing description of the input system. A projector system is thus adapted so as to generate a virtual touchscreen comprising an adjustable input menu adapted for inputting control instructions for controlling one or a plurality of systems for the vehicle, whereupon the projector system is adapted so as to project the virtual touchscreen on a display surface in or at the vehicle.
The input system is adapted so as to be in at least two modes, a sleep mode and a use mode, wherein the method comprises the steps of:
A - detecting a first activation measure,
B - initiating the use mode,
C - selecting and defining a display surface for the virtual touchscreen,
D - generating and projecting the virtual touchscreen by means of said projector system on the selected display surface,
E - receiving, by means of said detector system, input control instructions in the form of input activities via said projected input menu and generating an input signal in dependence thereon,
F - controlling one or a plurality of systems in dependence upon said input signal (12).
According to one embodiment, the method comprises detecting a second activation measure by means of said detector system, and performing step D, and the subsequent steps, only when said second activation measure has been detected. This optional step is indicated by broken lines in Figure 2.
The first and, where applicable, second activation measures comprise, for example, one or more of a predetermined pattern of movement (a gesture) and a tap.
When the virtual touchscreen is activated and an input menu is displayed, various functions can be activated by generating various input activities that comprise touching a predefined input area by keeping a hand/finger in the input area for at least a
predetermined time. The input menu preferably comprises a plurality of input menus arranged in a hierarchical system, wherein the activation of a function is achieved by means of a specified input activity. An input activity is accepted by means of a predetermined acknowledgement that consists of one or a plurality of changes in the color or shape of the input area (the button), and the generation of an acoustic signal.
According to one application of the invention, the projector system and the detector system are disposed so that the virtual touchscreen is displayed in such a way that the operator can stand outside the vehicle and adjust, for example, the pneumatic suspension of the rear axle.
In an additional variant of the input system according to the invention, it is possible to add functions for the system to control, and to influence the appearance of the screen based on the functionality desired.
In connection with the projector system receiving instructions as to where the virtual touchscreen is to be projected, the projection can be adapted so that the touchscreen assumes a desired appearance. For example, consideration can be given to the circumstance that the display surface is not perpendicular to the projection axis, which would otherwise distort the touchscreen. This can be achieved automatically by means of a correcting function.
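One way such a correcting function could work is keystone pre-correction: if the display surface is tilted relative to the projection axis, the far side of the image stretches, so the projector pre-shrinks that side before projecting. The simplified 1-D model below is an illustrative assumption, not the patent's implementation:

```python
import math

def keystone_prewarp(width, tilt_deg):
    """Simplified 1-D keystone pre-correction (illustrative sketch).

    If the display surface is tilted by tilt_deg about a vertical axis, the
    receding edge of the projected image is stretched by roughly 1/cos(tilt).
    Pre-shrinking that edge by cos(tilt) makes the image on the surface come
    out approximately rectangular.
    """
    scale = math.cos(math.radians(tilt_deg))
    near_edge = width           # edge closest to the projector: unchanged
    far_edge = width * scale    # receding edge: pre-shrunk
    return near_edge, far_edge
```

A full implementation would apply a perspective (homography) warp over the whole image rather than a per-edge scale, but the principle is the same.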
The present invention is not limited to the preferred embodiments described above.
Various alternatives, modifications and equivalents can be used. The foregoing embodiments are consequently not to be viewed as limiting the protective scope of the invention as defined in the accompanying claims.

Claims

1. An input system (2) for a vehicle, comprising a projector system (4) and a detector system (6), which are adapted so as to communicate with a control unit (8), said projector system (4) being adapted so as to generate a virtual touchscreen (10) comprising an adjustable input menu adapted for inputting control instructions for controlling one or a plurality of systems for the vehicle, wherein the projector system (4) is adapted so as to project the virtual touchscreen on a display surface in or at the vehicle, c h a r a c t e r i z e d i n t h a t the input system (2) is adapted so as to be in at least two modes, a sleep mode and a use mode, wherein the input system (2) is adapted so as to: A - detect a first activation measure by means of said detector system (6),
B - initiate the use mode,
C - select and define a display surface for the virtual touchscreen (10),
D - generate and project the virtual touchscreen on the selected display surface by means of said projector system (4),
E - receive, by means of said detector system (6), input control instructions in the form of input activities via said projected input menu and generate an input signal (12) in dependence thereon,
F - control one or a plurality of systems in dependence upon said input signal (12).
2. The input system according to claim 1, wherein the input system is adapted so as to detect a second activation measure by means of said detector system (6), and to perform step D, and the subsequent steps, only when said second activation measure has been detected.
3. The input system according to claim 1 or 2, wherein said first and, in applicable cases, second activation measure comprise one or more of a predetermined pattern of movement (a gesture), and a tap.
4. The input system according to any of claims 1-3, wherein said input activities comprise touching a predefined input area by keeping a hand/finger in the input area for at least a predetermined time.
5. The input system according to any of claims 1-4, wherein said input menu comprises a plurality of input menus arranged in a hierarchical system, and wherein activation of a function is achieved by means of a specified input activity.
6. The input system according to any of claims 1-5, wherein an input activity is accepted by means of a predetermined acknowledgement that consists of one or a plurality of changes in the color or shape of the input area (the button), and the generation of an acoustic signal.
7. The input system according to any of the preceding claims, wherein said detector system (6) comprises at least one optical data-gathering unit, for example a camera or infrared detector.
8. The input system according to any of the preceding claims, wherein said projector system (4) comprises at least one of an image generator, for example a cathode ray tube (CRT), a digital light-processing unit, and an optical projection unit that includes optical projection lenses for projecting the generated image.
9. The input system according to any of the preceding claims, wherein said projector system (4) is arranged in the ceiling of the vehicle cab.
10. The input system according to any of the preceding claims, wherein said detector system (6) is arranged in the ceiling of the vehicle cab.
11. A method in connection with an input system for a vehicle, wherein the input system comprises a projector system and a detector system, which are arranged so as to communicate with a control unit, said projector system being arranged so as to generate a virtual touchscreen comprising an adjustable input menu adapted for inputting control instructions for controlling one or a plurality of systems for the vehicle, wherein the
projector system is adapted so as to project the virtual touchscreen on a display surface in or at the vehicle,
c h a r a c t e r i z e d i n t h a t the input system is adapted so as to be in at least two modes, a sleep mode and a use mode, wherein the method comprises the steps of:
A - detecting a first activation measure,
B - initiating the use mode,
C - selecting and defining a display surface for the virtual touchscreen,
D - generating and projecting the virtual touchscreen on the selected display surface by means of said projector system,
E - receiving, by means of said detector system, input control instructions in the form of input activities via said projected input menu and generating an input signal in dependence thereon,
F - controlling one or a plurality of systems in dependence upon said input signal (12).
12. The method according to claim 11, wherein the method comprises detecting a second activation measure by means of said detector system, and performing step D, and the subsequent steps, only when said second activation measure has been detected.
13. The method according to claim 11 or 12, wherein said first and, in applicable cases, second activation measure comprise one or more of a predetermined pattern of movement (a gesture) and a tap.
14. The method according to any of claims 11-13, wherein said input activities comprise pressing on a predefined input area by keeping a hand/finger in the input area for at least a predetermined time.
15. The method according to any of claims 11-14, wherein said input menu comprises a plurality of input menus arranged in a hierarchical system, and wherein activation of a function is achieved by means of a specified input activity.
16. The method according to any of claims 11-15, wherein an input activity is accepted by means of a predetermined acknowledgement that consists of one or a plurality of changes in the color or shape of the input area (the button), and the generation of an acoustic signal.
PCT/SE2013/050519 2012-05-14 2013-05-08 Input system WO2013172768A2 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN201380025231.1A CN104508598A (en) 2012-05-14 2013-05-08 A projected virtual input system for a vehicle
BR112014028380A BR112014028380A2 (en) 2012-05-14 2013-05-08 input system ".
EP13790166.6A EP2850506A4 (en) 2012-05-14 2013-05-08 A projected virtual input system for a vehicle
RU2014150517A RU2014150517A (en) 2012-05-14 2013-05-08 INPUT SYSTEM

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SE1250488-2 2012-05-14
SE1250488A SE537730C2 (en) 2012-05-14 2012-05-14 Projected virtual vehicle entry system

Publications (2)

Publication Number Publication Date
WO2013172768A2 true WO2013172768A2 (en) 2013-11-21
WO2013172768A3 WO2013172768A3 (en) 2014-03-20

Family

ID=49584416

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SE2013/050519 WO2013172768A2 (en) 2012-05-14 2013-05-08 Input system

Country Status (6)

Country Link
EP (1) EP2850506A4 (en)
CN (1) CN104508598A (en)
BR (1) BR112014028380A2 (en)
RU (1) RU2014150517A (en)
SE (1) SE537730C2 (en)
WO (1) WO2013172768A2 (en)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
IT201700091628A1 (en) * 2017-08-08 2019-02-08 Automotive Lighting Italia Spa Virtual man-machine interface system and corresponding virtual man-machine interface procedure for a vehicle.
IT201800003722A1 (en) * 2018-03-19 2019-09-19 Candy Spa APPLIANCE WITH USER INTERFACE
DE102019200632B4 (en) * 2019-01-18 2021-11-25 Audi Ag Control system with portable interface unit and motor vehicle with the control system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004044664A1 (en) 2002-11-06 2004-05-27 Julius Lin Virtual workstation
US7050606B2 (en) 1999-08-10 2006-05-23 Cybernet Systems Corporation Tracking and gesture recognition system particularly suited to vehicular control applications
US7248151B2 (en) 2005-01-05 2007-07-24 General Motors Corporation Virtual keypad for vehicle entry control
US20110286676A1 (en) 2010-05-20 2011-11-24 Edge3 Technologies Llc Systems and related methods for three dimensional gesture recognition in vehicles

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060072009A1 (en) * 2004-10-01 2006-04-06 International Business Machines Corporation Flexible interaction-based computer interfacing using visible artifacts
US20060158616A1 (en) * 2005-01-15 2006-07-20 International Business Machines Corporation Apparatus and method for interacting with a subject in an environment
DE102005059449A1 (en) * 2005-12-13 2007-06-14 GM Global Technology Operations, Inc., Detroit Control system for controlling functions, has display device for graphical display of virtual control elements assigned to functions on assigned display surface in vehicle, and detection device for detecting control data
JP4942814B2 (en) * 2007-06-05 2012-05-30 三菱電機株式会社 Vehicle control device


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
A. RIENER; M. ROSSBORY: "Natural and Intuitive Hand Gestures: A Substitute for Traditional Vehicle Control", AUTOMOTIVEUI '11, 29 November 2011 (2011-11-29)
See also references of EP2850506A4

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU2618921C2 (en) * 2014-05-22 2017-05-12 Сяоми Инк. Method and device for touch input control
US9671911B2 (en) 2014-05-22 2017-06-06 Xiaomi Inc. Touch input control method and device
DE102014226546A1 (en) * 2014-12-19 2016-06-23 Robert Bosch Gmbh Method for operating an input device, input device, motor vehicle
US11144153B2 (en) 2017-12-07 2021-10-12 Elliptic Laboratories As User interface with acoustic proximity and position sensing arrangements
DE102018216662A1 (en) * 2018-09-27 2020-04-02 Continental Automotive Gmbh Dashboard layout, procedure and use
DE102020201235A1 (en) 2020-01-31 2021-08-05 Ford Global Technologies, Llc Method and system for controlling motor vehicle functions

Also Published As

Publication number Publication date
WO2013172768A3 (en) 2014-03-20
SE537730C2 (en) 2015-10-06
EP2850506A2 (en) 2015-03-25
CN104508598A (en) 2015-04-08
SE1250488A1 (en) 2013-11-15
RU2014150517A (en) 2016-07-10
BR112014028380A2 (en) 2017-06-27
EP2850506A4 (en) 2016-09-28

Similar Documents

Publication Publication Date Title
WO2013172768A2 (en) Input system
US9037354B2 (en) Controlling vehicle entertainment systems responsive to sensed passenger gestures
CN110626237B (en) Automatically regulated central control platform with arm detects
JP5261554B2 (en) Human-machine interface for vehicles based on fingertip pointing and gestures
KR101416378B1 (en) A display apparatus capable of moving image and the method thereof
US8085243B2 (en) Input device and its method
CN102398600B (en) Augmented reality is used to control system and the method thereof of in-vehicle apparatus
US20120131518A1 (en) Apparatus and method for selecting item using movement of object
US11006257B2 (en) Systems and methods for locating mobile devices within a vehicle
US20060066507A1 (en) Display apparatus, and method for controlling the same
US10618773B2 (en) Elevator operation control device and method using monitor
CN104755308A (en) Motor vehicle control interface with gesture recognition
EP1393591A2 (en) Automatically adjusting audio system
JP2007045169A (en) Information processor for vehicle
JP2011039600A (en) Device and method for supporting parking
JP2018055614A (en) Gesture operation system, and gesture operation method and program
US20130176218A1 (en) Pointing Device, Operating Method Thereof and Relative Multimedia Interactive System
US20210064147A1 (en) Gesture recognition using a mobile device
US20190258245A1 (en) Vehicle remote operation device, vehicle remote operation system and vehicle remote operation method
US11354862B2 (en) Contextually significant 3-dimensional model
JP2016029532A (en) User interface
US20220073089A1 (en) Operating system with portable interface unit, and motor vehicle having the operating system
US11630628B2 (en) Display system
CN113727156B (en) Multi-freedom-degree vehicle-mounted video system adjusting method and device based on light rays and sight lines
KR20140141285A (en) Device for Passenger tracking in a car

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13790166

Country of ref document: EP

Kind code of ref document: A2

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
WWE Wipo information: entry into national phase

Ref document number: 2013790166

Country of ref document: EP


ENP Entry into the national phase

Ref document number: 2014150517

Country of ref document: RU

Kind code of ref document: A

REG Reference to national code

Ref country code: BR

Ref legal event code: B01A

Ref document number: 112014028380

Country of ref document: BR

ENP Entry into the national phase

Ref document number: 112014028380

Country of ref document: BR

Kind code of ref document: A2

Effective date: 20141114