WO2013172768A2 - Input system - Google Patents
Input system
- Publication number
- WO2013172768A2 (PCT/SE2013/050519)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- input
- vehicle
- detector
- projector
- virtual touchscreen
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Arrangement of adaptations of instruments
-
- B60K35/10—
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K37/00—Dashboards
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
-
- B60K2360/1438—
-
- B60K2360/21—
-
- B60K2360/334—
Definitions
- The present invention concerns an input system, and a method associated with such an input system, according to the preambles of the independent claims.
- In particular, the invention concerns an input system and method for a vehicle, which system and method facilitate the control of systems and functions on the vehicle.
- In a modern goods vehicle there is a plurality of systems or functions that the driver, or a passenger, may wish to use from the driver's seat, from the passenger seat or from other locations in and around the vehicle, such as from the bed or from outside the vehicle.
- Such systems or functions include lighting, heating, radio, TV and multimedia as well as, for example, adjustment of the pneumatic suspension of the goods vehicle, for example in connection with the hitching of a trailer.
- These needs are currently met by arranging buttons and controls in a number of locations in and around the vehicle, so that the systems and functions can be used even if the user is not sitting in the driver's seat.
- For example, the interior lighting can be controlled from a panel near the bed.
- There may also be a remote control for adjusting the pneumatic suspension, accessible at the passenger seat so that it can be taken out when an inspection occurs.
- Such a remote control is often connected by means of a cable.
- US 7,248,151 concerns a virtual keyboard for a vehicle that is used in connection with controlling various functions of the vehicle, such as unlocking it.
- The keyboard is projected, for example, on a side window of the vehicle.
- A detector and a processor are arranged so as to detect and identify the gestures that are performed on the keyboard.
- The detector can, for example, consist of a camera or an infrared detector.
- US 7,050,606 concerns a system for detecting and identifying hand gestures that is particularly suitable for controlling various functions of a vehicle, for example heating, air conditioning, lighting and CD/radio settings.
- US 2011/0286676 describes systems and related methods intended for vehicles in order to detect and identify gestures in three dimensions.
- The method includes the steps of receiving one or more unprocessed frames of image data from a sensor; processing and combining several frames in order to identify body parts of the user in the vehicle and calculate the position of the end of the user's hand; determining whether the hand has performed a dynamic or a static gesture; retrieving a command that corresponds to one of a number of stored gestures; and performing the command.
- Microsoft Kinect is a system adapted primarily for the games industry, in which sensors comprising a camera and a 3D sensor are arranged in connection with a display and can, with the help of special software, detect a user's motions in three dimensions and identify, among other things, the user's face.
- The Kinect system is used in, among other things, the Xbox game console.
- The 3D sensor consists of an infrared laser projector combined with a monochrome CMOS sensor, which can detect video data in three dimensions under daylight conditions.
- The sensing distance of the 3D sensor can be adjusted, and the software can automatically calibrate the sensor depending upon which game is being played and upon the physical surroundings of the player, so that furniture and other obstacles can be taken into account.
- The system enables advanced motion recognition and can track the movements of two active players simultaneously, whereupon motion analysis can occur by evaluating motion steps from up to two joints per player.
- The Kinect sensor outputs a video signal with a frame rate of 30 Hz.
- The RGB video stream uses 8-bit VGA resolution (640 × 480 pixels) with a color filter, while the monochrome video stream used for 3D sensing has 11-bit VGA resolution (640 × 480 pixels), which offers sensitivity at 2,048 levels.
- The Kinect sensor functions at distances in the range of 1.2 to 3.5 meters when used in conjunction with the Xbox software, but can be given an expanded range of 0.7 to 6.0 meters.
- The sensor has a horizontal detection angle of 57° and a vertical detection angle of 43°, and can additionally be pivoted 27° in the horizontal direction.
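The detection angles above determine how large an area the sensor covers at a given distance, and hence how large a display surface can fit inside its view. As a rough illustration (not part of the patent), the covered width and height follow from simple trigonometry:

```python
import math

def coverage(distance_m, h_fov_deg=57.0, v_fov_deg=43.0):
    """Width and height (meters) of the area covered by a sensor with the
    given horizontal/vertical viewing angles at the given distance."""
    width = 2.0 * distance_m * math.tan(math.radians(h_fov_deg) / 2.0)
    height = 2.0 * distance_m * math.tan(math.radians(v_fov_deg) / 2.0)
    return width, height

# Even at the 1.2 m near limit, the field of view is over a meter wide.
w, h = coverage(1.2)
```

At 1.2 m this gives roughly 1.3 m by 0.9 m of coverage, already larger than a touchscreen projected, for example, near the bed.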
- Buttons and controls are arranged in the vehicle, not only in connection with the driver's seat but, for example, by the bed and in the form of remote controls. These buttons and controls often require separate cable runs, which makes their installation complicated and entails high costs. In addition, it can sometimes be difficult for the driver of the vehicle to know where the buttons for a given function are located.
- The object of the present invention is to provide an improved input interface that is both more user-friendly and offers cost savings for the vehicle manufacturer.
- In the input system according to the invention, a projector system is combined with a detector system that is adapted so as to detect the body motions of a person.
- Miniature projectors and camera systems of the kind also used in connection with, among other things, video games are employed, such as in Microsoft Kinect, which is used in the Xbox game console (as described above).
- A virtual touchscreen is projected onto a desired surface in or outside the goods vehicle and used to input control instructions for systems and functions in the vehicle.
- The system is arranged, for example, in the ceiling of the goods vehicle so that the virtual touchscreen can be projected in any conceivable position.
- The touchscreen can, for example, be projected on the wall next to the bed, on the mattress, on the floor or on the walls. Outside the vehicle, the virtual touchscreen can be projected on a plate held in the hand or, for example, on the inside of the door or on the step.
- The system can replace many buttons in the goods vehicle, for example the remote control for the pneumatic suspension, the lighting buttons and the control unit for the heating system. It is also possible for the user to define which systems and functions are to be controllable by means of the virtual touchscreen.
- A system is thus achieved that is more user-friendly, in part because access for controlling the various vehicle systems is improved. It also offers cost savings for the vehicle manufacturer because, for example, fewer cable runs are required and fewer buttons and control units are needed.
- Figure 1 is a block diagram that schematically illustrates the present invention.
- Figure 2 shows a flow diagram that illustrates the present invention.

Detailed description of preferred embodiments of the invention
- The present invention concerns an input system 2 for a vehicle, preferably a goods vehicle, a bus or a motor home, but also a car.
- The input system comprises a projector system 4 and a detector system 6, which are adapted so as to communicate with a control unit 8.
- The projector system 4 is adapted so as to generate a virtual touchscreen 10 comprising an adjustable input menu adapted so as to input control instructions to control one or a plurality of systems of the vehicle, and is adapted so as to project the virtual touchscreen on a display surface in or at the vehicle.
- The projector system 4 is preferably arranged in the ceiling of the vehicle cab.
- The projector system 4 comprises at least one of an image generator, for example a cathode ray tube (CRT) or a digital light processing unit, and an optical projecting unit that includes optical projection lenses for projecting the generated image.
- The projector system can also consist of a laser device that "draws" the virtual touchscreen.
- In one embodiment, the projector system consists of three units that are disposed in various positions in and/or around the vehicle to enable projection in as many locations as possible.
- A system with only one unit is also possible, in which case the unit can, for example, be arranged for projection near the bed.
- The detector system 6 is also preferably arranged in the ceiling of the vehicle cab and can suitably be arranged in connection with the projector system.
- In one embodiment, the detector system comprises three units, each of which comprises at least one optical data-gathering unit, such as a camera or an infrared detector.
- The detector system preferably also comprises a sound detector adapted so as to detect activation measures and input activities in the form of sound, for example from voices or tapping sounds.
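The patent does not specify how tapping sounds are recognized. As a hypothetical sketch, a double tap could be identified from the timestamps of sound peaks reported by the sound detector; the gap thresholds below are illustrative assumptions:

```python
def is_double_tap(peak_times, min_gap=0.05, max_gap=0.6):
    """Return True if two detected sound peaks (timestamps in seconds,
    sorted ascending) fall within a plausible double-tap interval:
    far enough apart to reject sensor bounce, close enough to reject
    two unrelated taps."""
    return any(min_gap <= later - earlier <= max_gap
               for earlier, later in zip(peak_times, peak_times[1:]))
```

A recognized double tap would then be reported as the first activation measure described below.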
- The input system 2 is adapted so as to be in at least two modes: a standby mode and a use mode.
- The input system is, of course, also entirely passive when the system is turned off.
- The input system 2 is thus adapted so as to:
- a - detect a first activation measure by means of said detector system 6 and, if such a first activation measure is detected, to
- The system then waits for the user to confirm that the touchscreen is to be projected, which the user does by performing a second activation measure.
- The system according to this embodiment is adapted so as to
- The input system is then ready to receive control instructions via the virtual touchscreen that is projected on the selected display surface. This occurs via the steps of:
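The mode handling described above can be summarized as a small state machine. The sketch below is illustrative only; the state and event names are assumptions, not terms from the patent:

```python
class InputSystemModes:
    """Minimal sketch of the standby/use mode logic described above."""

    def __init__(self):
        self.mode = "standby"  # system on, waiting for an activation measure

    def on_event(self, event):
        if self.mode == "standby" and event == "first_activation":
            # e.g. a double tap detected by the detector system
            self.mode = "awaiting_confirmation"
        elif self.mode == "awaiting_confirmation" and event == "second_activation":
            # e.g. pointing out the display surface; the touchscreen is projected
            self.mode = "use"
        elif event == "power_off":
            self.mode = "passive"
        return self.mode
```

In the "use" state the system would then accept control instructions via the projected touchscreen.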
- Control of the systems and functions of the vehicle is indicated by a double arrow from the control unit 8.
- The control unit is preferably connected to the vehicle bus system, and the control signals that it generates are processed by the vehicle in the normal way, and consequently need not be described further here.
- Examples of systems or functions that can suitably be controlled by means of the input system include the lighting, the heating system, radio, TV and multimedia as well as, for example, settings of the pneumatic suspension of the goods vehicle, for example in connection with the hitching of a trailer.
- The first and, where applicable, second activation measures comprise one or a plurality of: a predetermined pattern of motion (a gesture), a finger snap, or a tap.
- The first activation measure can, for example, consist of tapping twice to initiate the use mode and then, according to one embodiment, of causing the system to display the virtual touchscreen by identifying the selected display surface by distinctly pointing with one finger, which then constitutes the second activation measure.
- The input menu is displayed on the virtual touchscreen.
- The start menu, i.e. the first input menu displayed, can depend upon where the virtual touchscreen is projected. If, for example, the touchscreen is projected near the bed, a menu for adjusting the lighting and the radio is displayed.
- The input menu preferably comprises a plurality of input menus arranged in a hierarchical system, wherein activation of a function occurs by means of a specified input activity.
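A hierarchical menu system with a location-dependent start menu could be represented as nested tables. The surfaces, menu entries and functions below are illustrative assumptions rather than content from the patent:

```python
# Hypothetical menu tree: each top-level entry opens a submenu of functions.
MENU_TREE = {
    "lighting": ["interior on/off", "dim level"],
    "radio": ["volume", "station"],
    "suspension": ["raise rear axle", "lower rear axle"],
    "heating": ["cab temperature", "timer"],
}

# The start menu depends on where the touchscreen is projected.
START_MENUS = {
    "bed": ["lighting", "radio"],
    "outside": ["suspension", "lighting"],
}

def start_menu(surface):
    """Top-level entries shown first on the given display surface;
    fall back to the full menu for unknown surfaces."""
    return START_MENUS.get(surface, list(MENU_TREE))

def open_entry(entry):
    """Descend one level in the hierarchy."""
    return MENU_TREE[entry]
```

Selecting "suspension" on a touchscreen projected outside the vehicle would then reveal the raise/lower functions one level down.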
- An input activity consists of, for example, pressing a predefined area on the input menu by keeping the hand/finger in the input area for at least a predetermined time (on the order of fractions of a second up to several seconds).
- The system is adapted so as to generate a predetermined acknowledgement that consists of one or a plurality of: a change in the color or shape of an input area (the button), and the generation of an acoustic signal.
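The press-and-hold rule and its acknowledgement can be sketched as a simple dwell-time check. The 0.5 s threshold and the acknowledgement action names are assumptions for illustration:

```python
def evaluate_press(area, enter_time, leave_time, dwell_s=0.5):
    """Register a press only if the hand/finger stayed in the input area
    for at least dwell_s seconds; on success, return the activated area
    together with the acknowledgement actions to perform."""
    if leave_time - enter_time >= dwell_s:
        return {"activated": area,
                "acknowledgement": ["change_button_color", "acoustic_signal"]}
    return None  # too brief: e.g. a hand merely passing over the area
```

The dwell requirement prevents a hand sweeping across the projection surface from triggering functions accidentally.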
- The present invention further comprises a method in connection with an input system for a vehicle, wherein the input system comprises a projector system and a detector system, which are arranged so as to communicate with a control unit.
- The method will now be described briefly with reference to the flow diagram in Figure 2. Reference is also made to the relevant parts of the foregoing description of the input system.
- A projector system is thus adapted so as to generate a virtual touchscreen comprising an adjustable input menu adapted for inputting control instructions for controlling one or a plurality of systems of the vehicle, whereupon the projector system is adapted so as to project the virtual touchscreen on a display surface in or at the vehicle.
- The input system is adapted so as to be in at least two modes, a standby mode and a use mode, wherein the method comprises the steps of:
- The method comprises detecting a second activation measure by means of said detector system, and performing step D, and the subsequent steps, only when said second activation measure has been detected.
- This optional step is indicated by broken lines in Figure 2.
- The first and, where applicable, second activation measures comprise, for example, one or a plurality of predetermined patterns of movement (a gesture), or a tap.
- Various functions can be activated by generating various input activities that comprise touching a predefined input area by keeping a hand/finger in the input area for at least a predetermined time.
- The input menu preferably comprises a plurality of input menus arranged in a hierarchical system, wherein the activation of a function is achieved by means of a specified input activity.
- An input activity is acknowledged by means of a predetermined acknowledgement that consists of one or a plurality of changes in the color or shape of the input area (the button), and the generation of an acoustic signal.
- The projector system and the detector system must be disposed so that the virtual touchscreen is displayed in such a way that the operator can stand outside the vehicle and adjust, for example, the pneumatic suspension for the rear axle.
- The projection can be adapted so that the touchscreen assumes a desired appearance. For example, consideration can be given to the
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201380025231.1A CN104508598A (en) | 2012-05-14 | 2013-05-08 | A projected virtual input system for a vehicle |
BR112014028380A BR112014028380A2 (en) | 2012-05-14 | 2013-05-08 | Input system |
EP13790166.6A EP2850506A4 (en) | 2012-05-14 | 2013-05-08 | A projected virtual input system for a vehicle |
RU2014150517A RU2014150517A (en) | 2012-05-14 | 2013-05-08 | INPUT SYSTEM |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SE1250488-2 | 2012-05-14 | ||
SE1250488A SE537730C2 (en) | 2012-05-14 | 2012-05-14 | Projected virtual vehicle entry system |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2013172768A2 true WO2013172768A2 (en) | 2013-11-21 |
WO2013172768A3 WO2013172768A3 (en) | 2014-03-20 |
Family
ID=49584416
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/SE2013/050519 WO2013172768A2 (en) | 2012-05-14 | 2013-05-08 | Input system |
Country Status (6)
Country | Link |
---|---|
EP (1) | EP2850506A4 (en) |
CN (1) | CN104508598A (en) |
BR (1) | BR112014028380A2 (en) |
RU (1) | RU2014150517A (en) |
SE (1) | SE537730C2 (en) |
WO (1) | WO2013172768A2 (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102014226546A1 (en) * | 2014-12-19 | 2016-06-23 | Robert Bosch Gmbh | Method for operating an input device, input device, motor vehicle |
RU2618921C2 (en) * | 2014-05-22 | 2017-05-12 | Сяоми Инк. | Method and device for touch input control |
DE102018216662A1 (en) * | 2018-09-27 | 2020-04-02 | Continental Automotive Gmbh | Dashboard layout, procedure and use |
DE102020201235A1 (en) | 2020-01-31 | 2021-08-05 | Ford Global Technologies, Llc | Method and system for controlling motor vehicle functions |
US11144153B2 (en) | 2017-12-07 | 2021-10-12 | Elliptic Laboratories As | User interface with acoustic proximity and position sensing arrangements |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
IT201700091628A1 (en) * | 2017-08-08 | 2019-02-08 | Automotive Lighting Italia Spa | Virtual man-machine interface system and corresponding virtual man-machine interface procedure for a vehicle. |
IT201800003722A1 (en) * | 2018-03-19 | 2019-09-19 | Candy Spa | APPLIANCE WITH USER INTERFACE |
DE102019200632B4 (en) * | 2019-01-18 | 2021-11-25 | Audi Ag | Control system with portable interface unit and motor vehicle with the control system |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2004044664A1 (en) | 2002-11-06 | 2004-05-27 | Julius Lin | Virtual workstation |
US7050606B2 (en) | 1999-08-10 | 2006-05-23 | Cybernet Systems Corporation | Tracking and gesture recognition system particularly suited to vehicular control applications |
US7248151B2 (en) | 2005-01-05 | 2007-07-24 | General Motors Corporation | Virtual keypad for vehicle entry control |
US20110286676A1 (en) | 2010-05-20 | 2011-11-24 | Edge3 Technologies Llc | Systems and related methods for three dimensional gesture recognition in vehicles |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060072009A1 (en) * | 2004-10-01 | 2006-04-06 | International Business Machines Corporation | Flexible interaction-based computer interfacing using visible artifacts |
US20060158616A1 (en) * | 2005-01-15 | 2006-07-20 | International Business Machines Corporation | Apparatus and method for interacting with a subject in an environment |
DE102005059449A1 (en) * | 2005-12-13 | 2007-06-14 | GM Global Technology Operations, Inc., Detroit | Control system for controlling functions, has display device for graphical display of virtual control elements assigned to functions on assigned display surface in vehicle, and detection device for detecting control data |
JP4942814B2 (en) * | 2007-06-05 | 2012-05-30 | 三菱電機株式会社 | Vehicle control device |
2012
- 2012-05-14 SE SE1250488A patent/SE537730C2/en unknown
2013
- 2013-05-08 EP EP13790166.6A patent/EP2850506A4/en not_active Withdrawn
- 2013-05-08 WO PCT/SE2013/050519 patent/WO2013172768A2/en active Application Filing
- 2013-05-08 CN CN201380025231.1A patent/CN104508598A/en active Pending
- 2013-05-08 BR BR112014028380A patent/BR112014028380A2/en not_active Application Discontinuation
- 2013-05-08 RU RU2014150517A patent/RU2014150517A/en unknown
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7050606B2 (en) | 1999-08-10 | 2006-05-23 | Cybernet Systems Corporation | Tracking and gesture recognition system particularly suited to vehicular control applications |
WO2004044664A1 (en) | 2002-11-06 | 2004-05-27 | Julius Lin | Virtual workstation |
US7248151B2 (en) | 2005-01-05 | 2007-07-24 | General Motors Corporation | Virtual keypad for vehicle entry control |
US20110286676A1 (en) | 2010-05-20 | 2011-11-24 | Edge3 Technologies Llc | Systems and related methods for three dimensional gesture recognition in vehicles |
Non-Patent Citations (2)
Title |
---|
A. RIENER; M. ROSSBORY: "Natural and Intuitive Hand Gestures: A Substitute for Traditional Vehicle Control", AUTOMOTIVEUI '11, 29 November 2011 (2011-11-29) |
See also references of EP2850506A4 |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
RU2618921C2 (en) * | 2014-05-22 | 2017-05-12 | Сяоми Инк. | Method and device for touch input control |
US9671911B2 (en) | 2014-05-22 | 2017-06-06 | Xiaomi Inc. | Touch input control method and device |
DE102014226546A1 (en) * | 2014-12-19 | 2016-06-23 | Robert Bosch Gmbh | Method for operating an input device, input device, motor vehicle |
US11144153B2 (en) | 2017-12-07 | 2021-10-12 | Elliptic Laboratories As | User interface with acoustic proximity and position sensing arrangements |
DE102018216662A1 (en) * | 2018-09-27 | 2020-04-02 | Continental Automotive Gmbh | Dashboard layout, procedure and use |
DE102020201235A1 (en) | 2020-01-31 | 2021-08-05 | Ford Global Technologies, Llc | Method and system for controlling motor vehicle functions |
Also Published As
Publication number | Publication date |
---|---|
WO2013172768A3 (en) | 2014-03-20 |
SE537730C2 (en) | 2015-10-06 |
EP2850506A2 (en) | 2015-03-25 |
CN104508598A (en) | 2015-04-08 |
SE1250488A1 (en) | 2013-11-15 |
RU2014150517A (en) | 2016-07-10 |
BR112014028380A2 (en) | 2017-06-27 |
EP2850506A4 (en) | 2016-09-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2013172768A2 (en) | Input system | |
US9037354B2 (en) | Controlling vehicle entertainment systems responsive to sensed passenger gestures | |
CN110626237B (en) | Automatically regulated central control platform with arm detects | |
JP5261554B2 (en) | Human-machine interface for vehicles based on fingertip pointing and gestures | |
KR101416378B1 (en) | A display apparatus capable of moving image and the method thereof | |
US8085243B2 (en) | Input device and its method | |
CN102398600B (en) | Augmented reality is used to control system and the method thereof of in-vehicle apparatus | |
US20120131518A1 (en) | Apparatus and method for selecting item using movement of object | |
US11006257B2 (en) | Systems and methods for locating mobile devices within a vehicle | |
US20060066507A1 (en) | Display apparatus, and method for controlling the same | |
US10618773B2 (en) | Elevator operation control device and method using monitor | |
CN104755308A (en) | Motor vehicle control interface with gesture recognition | |
EP1393591A2 (en) | Automatically adjusting audio system | |
JP2007045169A (en) | Information processor for vehicle | |
JP2011039600A (en) | Device and method for supporting parking | |
JP2018055614A (en) | Gesture operation system, and gesture operation method and program | |
US20130176218A1 (en) | Pointing Device, Operating Method Thereof and Relative Multimedia Interactive System | |
US20210064147A1 (en) | Gesture recognition using a mobile device | |
US20190258245A1 (en) | Vehicle remote operation device, vehicle remote operation system and vehicle remote operation method | |
US11354862B2 (en) | Contextually significant 3-dimensional model | |
JP2016029532A (en) | User interface | |
US20220073089A1 (en) | Operating system with portable interface unit, and motor vehicle having the operating system | |
US11630628B2 (en) | Display system | |
CN113727156B (en) | Multi-freedom-degree vehicle-mounted video system adjusting method and device based on light rays and sight lines | |
KR20140141285A (en) | Device for Passenger tracking in a car |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13790166 Country of ref document: EP Kind code of ref document: A2 |
|
DPE1 | Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101) | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2013790166 Country of ref document: EP |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13790166 Country of ref document: EP Kind code of ref document: A2 |
|
ENP | Entry into the national phase |
Ref document number: 2014150517 Country of ref document: RU Kind code of ref document: A |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112014028380 Country of ref document: BR |
|
ENP | Entry into the national phase |
Ref document number: 112014028380 Country of ref document: BR Kind code of ref document: A2 Effective date: 20141114 |