US20150279116A1 - Image processing system, image processing method and program, and device


Info

Publication number
US20150279116A1
Authority
US
United States
Prior art keywords
image
unit
surveillance
image processing
server
Prior art date
Legal status
Abandoned
Application number
US14/673,929
Inventor
Shoji Yachida
Current Assignee
NEC Corp
Original Assignee
NEC Corp
Priority date
Filing date
Publication date
Application filed by NEC Corp
Assigned to NEC Corporation. Assignor: Shoji Yachida
Publication of US20150279116A1
Priority to US16/297,486 (US20190206108A1)
Priority to US16/297,475 (US11100691B2)
Priority to US17/375,751 (US11798211B2)
Priority to US18/202,009 (US20230298238A1)
Priority to US18/371,296 (US20240013458A1)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/60 Editing figures and text; Combining figures or text
    • G06K9/00671
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/103 Static body considered as a whole, e.g. static pedestrian or occupant recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172 Classification, e.g. identification
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/01 Detecting movement of traffic to be counted or controlled
    • G08G1/017 Detecting movement of traffic to be counted or controlled identifying vehicles
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/00127 Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
    • H04N1/00132 Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture in a digital photofinishing system, i.e. a system where digital photographic images undergo typical photofinishing processing, e.g. printing ordering
    • H04N1/00169 Digital image input
    • H04N1/00177 Digital image input from a user terminal, e.g. personal computer
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • H04N7/185 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source from a mobile camera, e.g. for remote control
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/60 Type of objects
    • G06V20/62 Text, e.g. of license plates, overlay texts or captions on TV images
    • G06V20/625 License plates
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 Detection; Localisation; Normalisation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L9/00 Cryptographic mechanisms or cryptographic arrangements for secret or secure communications; Network security protocols

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Alarm Systems (AREA)
  • Image Analysis (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Image Processing (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)

Abstract

The disclosed embodiments include computer-implemented devices, systems, and methods that support image processing services. In an embodiment, a server may match a partial image received from a terminal with a stored candidate image in memory and, upon identification of a match, transmit information identifying the matched candidate image to the terminal.

Description

  • This application is based upon and claims the benefit of priority from Japanese Patent Application No. 2014-070719, filed on Mar. 31, 2014, the disclosure of which is incorporated herein in its entirety by reference.
  • BACKGROUND
  • 1. Technical Field
  • The present disclosure generally relates to an image processing technology.
  • 2. Description of the Related Art
  • In recent years, surveillance cameras have been set up in many locations for crime prevention and for checking the situation upon occurrence of a crime. However, blind spots often occur when only fixed surveillance cameras are used. In some instances, a wearable surveillance camera system may supplement a fixed surveillance camera to improve visibility and reduce the occurrence of blind spots.
  • For example, in some systems, a direction and/or position of a wearable surveillance camera may be controlled flexibly according to the wearer's movement. Further, in certain systems, a surveillance camera may apply a high-compression encoding scheme to captured image data to reduce the amount of image data transmitted to a center server.
  • For example, surveillance cameras may apply an image compression scheme, such as the MPEG2 (Moving Picture Experts Group) scheme or the H.264 scheme, to captured image data, which may be collected via a wired network (e.g., Ethernet) and/or a wireless network. Using some high-compression encoding schemes, however, the encoded image data may include compression artifacts or the like, and it may be impossible to obtain the clear image data required to support services.
  • SUMMARY OF THE INVENTION
  • To address the problems described above, one aspect of the present invention is an image processing system including a terminal including: an image accumulation unit that stores an image and identification information on the image or a partial image of the image; a transmission unit that transmits the identification information and the partial image to a server; and an output unit that outputs additional information based on identification information received from the server, and a server including: an image matching unit that matches the partial image received from the terminal with an accumulated candidate image; and a transmission unit that transmits the identification information corresponding to the matched image to the terminal.
  • Another aspect of the present invention is an image processing system including a terminal including: an image accumulation unit that stores an image and identification information on the image or a partial image of the image; and a transmission unit that transmits the identification information of the image and the partial image of the image to a server and transmits the image identified by identification information received from the server to the server, and a server including: an image matching unit that matches the partial image received from the terminal with an accumulated candidate image; and a transmission unit that transmits the identification information corresponding to the matched image to the terminal.
  • Another aspect of the present invention is an image processing device including an output unit that outputs an image of an object region when a value indicating a size of the object region of an image taken by an imaging unit is greater than a threshold and that otherwise outputs an image of a predetermined region including the object region.
  • According to the present invention, an image processing system, an image processing method and program, and a device that can appropriately perform service support are provided.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 illustrates an exemplary configuration of an image processing system, consistent with the disclosed embodiments;
  • FIG. 2 illustrates an exemplary configuration of an image processing unit, consistent with the disclosed embodiments;
  • FIG. 3 depicts a flowchart of an exemplary process, consistent with the disclosed embodiments;
  • FIG. 4 depicts a flowchart of an exemplary process, consistent with the disclosed embodiments;
  • FIG. 5 illustrates an exemplary configuration of an image processing system, consistent with the disclosed embodiments;
  • FIG. 6 depicts a flowchart of an exemplary process, consistent with the disclosed embodiments;
  • FIG. 7 illustrates an exemplary configuration of an image processing system consistent with the disclosed embodiments;
  • FIG. 8 illustrates an example of image display, consistent with the disclosed embodiments;
  • FIG. 9 depicts a flowchart of an exemplary process, consistent with the disclosed embodiments;
  • FIG. 10 illustrates an exemplary configuration of an image processing system, consistent with the disclosed embodiments;
  • FIG. 11 illustrates an exemplary configuration of an image processing system, consistent with the disclosed embodiments;
  • FIG. 12 illustrates an exemplary configuration of an image processing system, consistent with the disclosed embodiments;
  • FIG. 13 illustrates an exemplary configuration of an image processing system, consistent with the disclosed embodiments;
  • FIG. 14 illustrates an exemplary configuration of an image processing system, consistent with the disclosed embodiments;
  • FIG. 15 illustrates an exemplary configuration of an image processing system, consistent with the disclosed embodiments;
  • FIG. 16 depicts a flowchart of an exemplary image cutout method, consistent with the disclosed embodiments;
  • FIG. 17 illustrates an exemplary configuration of an image processing system, consistent with the disclosed embodiments;
  • FIG. 18 depicts a flowchart of an exemplary process, consistent with the disclosed embodiments;
  • FIG. 19 illustrates an exemplary configuration of an image processing system, consistent with the disclosed embodiments;
  • FIG. 20 depicts a flowchart of an exemplary process, consistent with the disclosed embodiments;
  • FIG. 21 illustrates an exemplary configuration of a terminal, consistent with the disclosed embodiments; and
  • FIG. 22 depicts a flowchart of an exemplary image cutout method, consistent with the disclosed embodiments.
  • DETAILED DESCRIPTION
  • In the following, exemplary embodiments are described using the drawings. Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts.
  • Further, the following description makes reference to units constituting a device, apparatus, or the like. Consistent with the disclosed embodiments, and without limitation, one or more of the units may be realized by hardware, such as a logical circuit. In other instances, and without limitation, one or more of the units may be realized by a computer control unit, storage units such as a memory, a program including processor- and/or computer-implementable instructions loaded into the memory, a hard disk that stores the program, an interface for network connection, and so on. In further aspects, and without limitation, one or more of the units may be realized by an arbitrary combination of hardware and software (e.g., processor- and/or computer-implementable instructions stored on a tangible, non-transitory computer-readable medium).
  • Additionally, in this application, the use of the singular includes the plural unless specifically stated otherwise. In this application, the use of “or” means “and/or” unless stated otherwise. Furthermore, the use of the term “including,” as well as other forms such as “includes” and “included,” is not limiting. In addition, terms such as “element” or “component” encompass both elements and components comprising one unit, and elements and components that comprise more than one subunit, unless specifically stated otherwise.
  • Embodiment 1
  • FIG. 1 is a diagram illustrating one example of the whole image processing system configuration, in accordance with a first exemplary embodiment.
  • The image processing system includes a first surveillance information terminal unit 1 and a first surveillance server unit 2. The first surveillance information terminal unit 1 and the first surveillance server unit 2 may be connected with each other through a wired or wireless communication channel C1.
  • As illustrated in FIG. 1, the first surveillance information terminal unit 1 may include a camera unit 11, an image processing A unit 12, an image accumulation unit 13, and a transmission/reception A unit 14.
  • In some aspects, the camera unit 11 captures image data A11 and inputs it to the image processing A unit 12.
  • In an instance where an object is included in image data A11, the image processing A unit 12 may detect an object region including the object on the basis of a predetermined detection parameter. The predetermined detection parameter may be a size of a region sufficient to detect a person's face, a size of a detection window for detecting objects, and/or a parameter for the SIFT (Scale-Invariant Feature Transform) algorithm. In some aspects, the image processing A unit 12 may process image data A11 to extract or "cut out" an image of the object region. The cut-out image may, for example, represent partial image data A13. The object detected by the image processing A unit 12 may include, for example, a person's face, a bag, a car, and so on.
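  • Purely as a hedged illustration (the disclosure does not prescribe a detector), the detect-and-cutout step could be realized with OpenCV's stock Haar-cascade face detector as sketched below; the padding factor, minimum size, and cascade file are illustrative assumptions, not part of the disclosed embodiments.

```python
# Minimal sketch of the image processing A unit's detect-and-cutout step.
# Assumes OpenCV; the cascade file and padding factor are example choices.
import cv2

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def cut_out_object_regions(image_a11, padding=0.25, min_size=(48, 48)):
    """Detect faces in image data A11 and return padded partial images (A13)."""
    gray = cv2.cvtColor(image_a11, cv2.COLOR_BGR2GRAY)
    detections = face_detector.detectMultiScale(
        gray, scaleFactor=1.1, minNeighbors=5, minSize=min_size)
    h_img, w_img = gray.shape
    partial_images = []
    for (x, y, w, h) in detections:
        # Cut out a region slightly larger than the detected object region.
        dx, dy = int(w * padding), int(h * padding)
        x0, y0 = max(x - dx, 0), max(y - dy, 0)
        x1, y1 = min(x + w + dx, w_img), min(y + h + dy, h_img)
        partial_images.append(image_a11[y0:y1, x0:x1].copy())
    return partial_images
```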
  • The image processing A unit 12 may output the partial image data A13 and an image identifier (e.g., an image ID), such as a frame number input from the camera unit 11, to the transmission/reception unit 14.
  • In certain aspects, the image processing A unit 12 records image data A11 in the image accumulation unit 13 together with the image ID. By way of example, a combination of image data A11 and the image ID may represent at least a portion of recording data A12, and the image ID may be a frame number.
  • In some embodiments, image data A11 may include a plurality of objects displayed within corresponding regions on a screen. In certain instances, the image ID may correspond to an identifier (e.g., an ID) assigned to or associated with each of the object regions. Additionally or alternatively, the image ID may correspond to positional information (e.g., coordinate information) of at least one of the objects, a size of at least one of the objects, and so on.
  • In an embodiment, and based on a relationship between a recording capacity and/or a recording time of the image accumulation unit 13, the image processing A unit 12 may perform image compression of recording data A12 using a dynamic image compression scheme. The dynamic image compression scheme may include, for example, an encoding scheme using inter-frame differences, including, but not limited to, ISO/IEC 13818-2 (the MPEG-2 encoding scheme) and ISO/IEC 14496-10 (the MPEG-4 AVC scheme).
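  • As a rough, hypothetical illustration of the capacity/recording-time trade-off that might drive this choice, consider the back-of-the-envelope check below; all numbers are invented for the example and are not taken from the disclosure.

```python
# Hypothetical check: can the accumulation unit hold the desired recording
# time at a given encoder bitrate? All figures are illustrative assumptions.
capacity_bytes = 64 * 1024**3          # assumed 64 GiB accumulation unit
bitrate_bps    = 4 * 1000**2           # assumed 4 Mbit/s MPEG-4 AVC stream
record_seconds = capacity_bytes * 8 / bitrate_bps
print(f"~{record_seconds / 3600:.1f} hours of recording")  # ~38.2 hours
```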
  • In some aspects, the image processing A unit 12 may receive a request for image data from the first surveillance server unit 2. In response to the received request, the image processing A unit 12 may transmit the requested image data to the first surveillance server unit 2 through the transmission/reception A unit 14, as described below.
  • In some aspects, the transmission/reception A unit 14 may transmit partial image data A13 and the image ID, which are output from the image processing A unit 12, to the first surveillance server unit 2. The transmission/reception A unit 14 may additionally transmit the image data to the first surveillance server unit 2.
  • The first surveillance server unit 2 may include a transmission/reception B unit 21, an image processing B unit 22, a database unit 23 and an image display unit 24.
  • The transmission/reception B unit 21 may, for example, receive partial image data A13 and the image ID from the first surveillance information terminal unit 1. In certain aspects, the image processing B unit 22 may obtain the partial image data A13 and/or the image ID from the transmission/reception B unit 21.
  • In some instances, the image processing B unit 22 matches at least a portion of the partial image data A13 against image data stored in the database unit 23. In an embodiment, the image processing B unit 22 may select the image matching method in accordance with one or more characteristics of an object to be matched.
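  • The disclosure leaves the concrete matching algorithm open; as one hedged possibility, a feature-based matcher such as ORB could be used, as sketched below. The distance cutoff and match-count threshold are invented parameters.

```python
# One possible instantiation of the image processing B unit's matcher,
# using ORB features; thresholds are assumed, not from the disclosure.
import cv2

orb = cv2.ORB_create()
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def matches_candidate(partial_a13, candidate, min_good_matches=25):
    """Return True if partial image data A13 matches a candidate image."""
    _, des1 = orb.detectAndCompute(partial_a13, None)
    _, des2 = orb.detectAndCompute(candidate, None)
    if des1 is None or des2 is None:
        return False  # no usable features in one of the images
    matches = matcher.match(des1, des2)
    # Keep only reasonably close descriptor pairs (assumed distance cutoff).
    good = [m for m in matches if m.distance < 40]
    return len(good) >= min_good_matches
```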
  • In an instance where the image processing B unit 22 matches the partial image data A13 to image data included in the database unit 23, the image processing B unit 22 requests image data from the first surveillance information terminal unit 1 through the transmission/reception B unit 21. In one aspect, the image processing B unit 22 may transmit an image ID corresponding to the matched image data stored in the database unit 23 to the first surveillance information terminal unit 1. In other aspects, the image processing B unit 22 may transmit the matched image data in addition to the corresponding image ID.
  • Further, when image data is recorded in the image accumulation unit 13 in a non-compressed manner or in an intra-frame compression scheme, the image processing B unit 22 may transmit corresponding frame image data to the first surveillance information terminal unit 1. In an instance where the image data is recorded in the image accumulation unit 13 in a dynamic image compression scheme, the image processing B unit 22 may transmit an image stream including frame image data, which may include the image ID, to the first surveillance information terminal unit 1. The intra-frame compression scheme may include, for example, ISO/IEC 10918-1 (the JPEG compression scheme), and the image stream transmitted in the above-mentioned dynamic image compression scheme may include a GOP (Group of Pictures) unit in the MPEG-2 encoding scheme.
  • In an embodiment, and using the matching processes outlined above, the image processing B unit 22 may identify an object (i.e., a matched object) within the image data received by the first surveillance server unit 2 that matches a corresponding object within the image data stored in the database unit 23. The image processing B unit 22 may transmit the image ID corresponding to the matched image data stored in the database unit 23 to the first surveillance information terminal unit 1 to request the corresponding image data. The transmission/reception A unit 14 may transmit the image data corresponding to the received image ID to the first surveillance server unit 2.
  • The image processing B unit 22 may, in some aspects, superimpose information related to the matched object over the received image data, which may be presented to a user by the image display unit 24.
  • For example, if the matched object image includes a person's face, the image processing B unit 22 performs superimposition display of a border at a position corresponding to a region that includes the matched face, as illustrated in FIG. 8. The disclosed embodiments are, however, not limited to processes that superimpose a border on a portion of the matched image, and in further embodiments, the image processing B unit 22 may superimpose an image ID or the like on a portion of the matched image, either alone or in addition to the border.
  • FIGS. 2, 3, and 4 illustrate an exemplary operation of the image processing A unit 12, in accordance with disclosed embodiments. For example, as illustrated in FIG. 2, the image processing A unit 12 may include an image IO (Input/Output) unit 121, a CPU (Central Processing Unit) unit 122, a tangible, non-transitory memory unit 123, and an external IO unit 124. In some aspects, memory unit 123 may store instructions that, when executed by CPU unit 122, cause CPU unit 122 to perform the exemplary processes illustrated in the flowcharts of FIG. 3 and FIG. 4.
  • By way of example, FIG. 3 illustrates an exemplary process performed by the image processing A unit 12 when operating in a normal surveillance state. In certain aspects, the CPU unit 122 may initialize each parameter at the time of power activation (e.g., in step S1) and may accept an input of an image from the camera unit 11 (e.g., in step S2).
  • In step S3, the CPU unit 122 may determine whether the image IO unit 121 failed to input image data to the CPU unit 122. When there is no image input from the image IO unit 121 (step S3; YES), the CPU unit 122 may finish processing, and the exemplary process may be complete. When there is an image input from the image IO unit 121 (step S3; NO), the CPU unit 122 counts the number of frames (e.g., a FrameNum) in step S4 and reads a parameter for object detection from the memory unit 123 in step S5. The CPU unit 122 may detect an object from the input image data on the basis of the read parameter (e.g., in step S6).
  • In step S7, the CPU unit 122 determines whether an object has been detected within the input image data. In an instance where the CPU unit 122 detects the object (step S7; YES), the CPU unit 122 extracts coordinate position information (e.g., ObjectCoord) in the image of the detected object and an image data region (e.g., ObjectXsize, ObjectYsize) that includes the detected object (e.g., in step S8). The CPU unit 122 may also assign a unique object detection number (e.g., an ObjectNum) to the detected object (e.g., in step S9).
  • In certain aspects, the CPU unit 122 may link the object detection number, the coordinate position information, and the image data region (e.g., partial image data A13), and may store them using a table. The CPU unit 122 may also calculate an image ID based on the object detection number and the frame count (e.g., in step S10). In some instances, in step S10, the CPU unit 122 may establish the frame count as the image ID to reduce processing.
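  • One simple, hypothetical way to combine the frame count and the object detection number into a single image ID is bit packing, sketched below; the field width is an assumption, not something the disclosure specifies.

```python
# Hypothetical image-ID encoding: pack FrameNum and ObjectNum into one
# integer. The field width is an illustrative assumption.
OBJECT_BITS = 8  # allows up to 256 detections per frame

def make_image_id(frame_num: int, object_num: int) -> int:
    return (frame_num << OBJECT_BITS) | (object_num & ((1 << OBJECT_BITS) - 1))

def split_image_id(image_id: int) -> tuple[int, int]:
    return image_id >> OBJECT_BITS, image_id & ((1 << OBJECT_BITS) - 1)

assert split_image_id(make_image_id(1024, 3)) == (1024, 3)
```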
  • The CPU unit 122 may output the image ID and partial image data A13 to the transmission/reception A unit 14 (e.g., in step S11). In some aspects, the exemplary process may pass back to step S6, and the CPU unit 122 may detect an additional object within the image data, as described above. The transmission/reception A unit 14 may, in step S11, transmit the image ID and partial image data A13 to the first surveillance server unit 2.
  • In some embodiments, the CPU unit 122 may repeat the detection operation over the whole image data region. When the CPU unit 122 detects no additional objects (e.g., step S7; NO), the image data detection operation is complete. The CPU unit 122 may perform image encoding with the image ID attached in the frame (e.g., in step S12) and may record image data in the image accumulation unit 13 via the external IO unit 124 (e.g., in step S13).
  • As illustrated in FIG. 4, in an instance where there is a request with the image ID for image data from the first surveillance server unit 2, the CPU unit 122 analyzes the image ID and extracts the frame number (e.g., in step S14). The CPU unit 122 may read accumulated image data including image data of the frame number from the image accumulation unit 13 (e.g., in step S15), and may output the accumulated image data to the transmission/reception A unit 14 (e.g., in step S16). In some aspects, the transmission/reception A unit 14 may transmit the image data to the first surveillance server unit 2.
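  • Under the same assumptions as the ID-packing sketch above, the terminal-side handler for such a request (steps S14 to S16) might look like the following; the accumulation-unit and transmitter interfaces are hypothetical stand-ins, not APIs from the disclosure.

```python
# Hedged sketch of steps S14-S16: resolve a requested image ID to a recorded
# frame and hand the data to the transmission/reception A unit.
OBJECT_BITS = 8  # must agree with the ID-packing sketch above

def handle_image_request(image_id, image_accumulation, transmitter):
    frame_num = image_id >> OBJECT_BITS                 # step S14: extract frame number
    stream = image_accumulation.read_stream(frame_num)  # step S15: read accumulated data
    transmitter.send_to_server(image_id, stream)        # step S16: output for transmission
```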
  • Through these operations, the image processing system can maintain the image quality required for performing image matching while reducing the size of image data transmitted and received between the first surveillance information terminal unit 1 and the first surveillance server unit 2. After the image matching, a surveillance agent of the first surveillance server unit 2 may review the whole image of the matching result.
  • Next, the object detection operation in the image processing A unit 12 of FIG. 1 is described in detail using FIG. 16. For example, the image processing system may supervise a person as a surveillance target, and if the surveillance target were a wanted person, the image processing system may be configured to perform face matching against the wanted person's photographs. Thus, in some aspects, the size of a detected face image may correspond to a size required for matching with the wanted person's photographs. In other aspects, the size of a detected face image may exceed a predetermined threshold value.
  • For example, as illustrated in FIG. 16, in an instance where an inter-eye distance W1 of a first person's face is greater than a predefined pixel number TH1 and less than a predefined pixel number TH2, the image processing A unit 12 cuts out an image data region (1) of a size larger than a region outlining the first person's face. In this case, there is a high possibility that the size of the face image is sufficient to facilitate face matching using any of the exemplary processes described above.
  • In other aspects, as illustrated in FIG. 16, in an instance where an inter-eye distance W2 of a second person's face is less than the predefined pixel number TH1, the image processing A unit 12 cuts out an image data region (2) of a size larger than a region outlined by the entire body of the second person. Since there is a high possibility that the size of the second person's face image is not larger than the predetermined threshold, face matching may be performed using an image of the entire person.
  • In some embodiments, although the size of the first person's face may be greater than the predefined pixel number TH1 and less than TH2, a possibility exists that the image processing A unit 12 may be unable to acquire the first person's facial features (e.g., because the first person wears glasses or a mask). In certain aspects, the image processing A unit 12 may cut out an image data region larger in size than the region (1) outlined by the first person's face (e.g., a region outlining the entire body of the first person).
  • As another example, the target object for surveillance within the image data (e.g., a surveillance object) may correspond to a wanted vehicle. In an instance where the surveillance object is assumed to be a wanted vehicle, the image processing A unit 12 may determine whether a license plate is detectable, and further, whether the license plate is readable. For example, when the image processing A unit 12 determines that a width W3 of the detected license plate is greater than the predefined pixel number TH1 and less than the predefined pixel number TH2, a high possibility exists that the license plate is readable, and the image processing A unit 12 may cut out a region corresponding to the license plate as an image data region (3).
  • Alternatively, in an instance where the image processing A unit 12 determines that a width W4 of a license plate is less than the predefined pixel number TH1, the image processing A unit 12 cuts out an image data region (4) of a larger size than the outline of the entire car and sends the image data region (4) to a surveillance server.
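  • The region-selection rule of FIG. 16 reduces to a small threshold comparison, restated below in a hedged sketch covering both the face and license-plate cases; the concrete TH1/TH2 values and the behavior above TH2 are assumptions, since the disclosure leaves them open.

```python
# Threshold rule from FIG. 16 (a hedged restatement, not verbatim source):
# if the measured feature (inter-eye distance or plate width) lies between
# TH1 and TH2 pixels, send only the feature region; below TH1, send a
# region covering the whole person or vehicle instead.
def choose_cutout(feature_width_px, feature_region, whole_object_region,
                  th1=32, th2=256):  # TH1/TH2 values are assumptions
    if th1 < feature_width_px < th2:
        return feature_region        # e.g., regions (1) and (3) in FIG. 16
    # Below TH1 (and, as an assumption, at or above TH2), fall back to
    # the wider region so the server can still attempt matching.
    return whole_object_region       # e.g., regions (2) and (4) in FIG. 16
```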
  • In certain aspects, and using the exemplary image cutouts described above, the image processing A unit 12 may reduce the transmission data size by not transmitting unneeded image regions to the first surveillance server unit 2.
  • According to the disclosed exemplary embodiments, the image processing system may maintain an image quality required for performing image matching, while reducing the data size transmitted and received between the first surveillance information terminal unit 1 and the first surveillance server unit 2. Further, after the image matching, a surveillance agent of the first surveillance server unit 2 may monitor the whole image corresponding to the matching result. Therefore, in certain aspects, the image processing system may appropriately perform service support.
  • In some embodiments, the image processing system, and its components, may perform the exemplary processes described above to provide surveillance services. The disclosed embodiments are, however, not limited to surveillance services, and in other embodiments, the image processing system may perform one or more of the processes described above to provide customer services, maintenance inspection services, and other similar services.
  • In some aspects, the surveillance information terminal unit and the surveillance server unit may be referred to as a terminal and a server, respectively. The terminal may, for instance, be associated with a surveillance agent, a serving person, and so on. In certain aspects, the terminal may include a mobile or handheld computing device (e.g., carried by a hand or other instrument of the person), and additionally or alternatively, a computing device integrated into a person's clothing or garments (e.g., a wearable computing device or a computing device in communication with either a smart textile or an electronic fabric).
  • In further aspects, the terminal may correspond to a "scouter." The scouter may include a wearable computer that combines an eye-line camera and an optical head-mounted display (e.g., a glasses-type head-mounted display) that presents a half-transparent image on part or all of a person's field of view. For example, the scouter may be provided with a computing device configured to perform one or more of the data transmitting and output processes described above. When the scouter includes a dedicated glass frame with a camera or audio apparatus (e.g., a microphone and earphones), the scouter can support not only the surveillance service but also on-site services, such as a maintenance inspection service and an assembly service, using any of the disclosed exemplary embodiments and processes.
  • Embodiment 2
  • FIG. 5 is a diagram of a whole image processing system configuration, in accordance with a second exemplary embodiment. The image processing system of FIG. 5 may include a second surveillance information terminal unit 3 and a second surveillance server unit 4. Additionally, the image processing system may include the first surveillance server unit 2, as described above.
  • The second surveillance information terminal unit 3 and the second surveillance server unit 4 (and additionally or alternatively, the first surveillance server unit 2) may be connected with each other through a wired or wireless communication channel C1, as described below in reference to FIG. 5.
  • In an embodiment, the second surveillance information terminal unit 3 includes a camera unit 11, an image processing A unit 12, an image accumulation unit 13, a transmission/reception A unit 14, and an image display unit 15. In certain aspects, the camera unit 11, image processing A unit 12, image accumulation unit 13, and transmission/reception A unit 14 of FIG. 5 are similar in functionality to the corresponding elements of the first surveillance information terminal unit 1 described above in reference to FIG. 1.
  • For example, and as described above, camera unit 11 may capture image data A11, and may provide the image data A11 as input to the image processing A unit 12. Using any of the exemplary processes outlined above, the image processing A unit 12 may extract or “cut out” an object detected from image data A11 on the basis of a predetermined detection parameter. The image processing A unit 12 may establish an image ID for the image data A11 (e.g., based on the frame number input from the camera unit 11), and may transmit the image ID and partial image data A13 (e.g., which includes image data corresponding to the cut-out object) to the transmission/reception A unit 14 using any of the exemplary processes outlined above.
  • In some aspects, and as described above, an image ID consistent with the disclosed embodiments may identify a plurality of objects displayed within corresponding regions on a screen. In some instances, the image ID may correspond to an identifier (e.g., an ID) assigned to or associated with each of the object regions. Additionally or alternatively, the image ID may correspond to positional information (e.g., coordinate information) of at least one of the objects, a size of at least one of the objects, and so on.
  • The image processing A unit 12 may, in certain aspects, record image data A11 and the image ID in the image accumulation unit 13 using any of the exemplary processes outlined above.
  • The transmission/reception A unit 14 transmits partial image data A13 and the image ID, which are output from the image processing A unit 12, to the second surveillance server unit 4.
  • In an embodiment, the second surveillance server unit 4 may include a transmission/reception B unit 21, an image processing B unit 22, and a database unit 23. In some instances, the transmission/reception B unit 21, image processing B unit 22, and database unit 23 of FIG. 5 are similar in functionality to corresponding elements of the first surveillance server unit 2 described above in reference to FIG. 1.
  • As described above, the transmission/reception B unit 21 may input partial image data A13 and the image ID to the image processing B unit 22. Further, in certain aspects, the image processing B unit 22 may perform a matching operation between partial image data A13 and image data stored in the database unit 23 using any of the exemplary processes outlined above.
  • For example, and as described above, if partial image data A13 matches an image stored within the database unit 23, the image processing B unit 22 may transmit the image ID corresponding to the matched image to the second surveillance information terminal unit 3 through the transmission/reception B unit 21.
  • Further, the image processing A unit 12 of the second surveillance information terminal unit 3 may read out, from the image accumulation unit 13, image data A14 corresponding to the image ID received from the second surveillance server unit 4. The image processing A unit 12 may superimpose information related to the matching result of the object image over the image data A14. The image processing A unit 12 may display the image data A14, superimposed with the information on the matching result of the object image, on the image display unit 15, as described below in reference to FIG. 6.
  • FIG. 6 illustrates an exemplary process for superimposing information of matching image data, consistent with the disclosed exemplary embodiments. In some aspects, the image processing A unit 12 may determine the frame number, coordinate position information, and image data region based on the image ID received from the second surveillance server unit 4 (e.g., step S21 of FIG. 6). The image processing A unit 12 may access the image accumulation unit 13 and read an image stream of the frame number (e.g., step S22 of FIG. 6). In certain aspects, the image processing A unit 12 may decode a target frame of the read image stream, and may superimpose information related to the matching result of the object image over an image of the decoded frame on a region indicated by the coordinate position information (e.g., step S23 of FIG. 6). An example of the superimposition display is similar to Embodiment 1.
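  • Step S23's superimposition could be realized, for example, with OpenCV drawing primitives as in the hedged sketch below; the colors, line thickness, and label placement are illustrative choices, not requirements of the disclosure.

```python
# Hedged sketch of step S23: draw a border (and optionally the ID) over the
# decoded frame at the region given by the coordinate position information.
import cv2

def superimpose_match(frame, x, y, w, h, label=None):
    # Red border around the matched object region.
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 0, 255), 2)
    if label is not None:
        # Optional ID text just above the border.
        cv2.putText(frame, str(label), (x, max(y - 8, 0)),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 0, 255), 2)
    return frame
```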
  • In some aspects, the image processing system described herein can secure images having a quality sufficient to perform image matching, while reducing the size and/or amount of the image data transmitted and received between the second surveillance information terminal unit 3 and the second surveillance server unit 4. Further, since a surveillance agent may confirm a matching result image in the second surveillance information terminal unit 3 in an instance where the second surveillance server unit 4 performs image matching, the image processing system may facilitate the surveillance agent's careful survey of the matching result object.
  • Further, in some aspects, the second surveillance information terminal unit 3 of FIG. 5 may be coupled to the first surveillance server unit 2 of FIG. 1 across communication channel C1. In certain aspects, the surveillance agent may use the first surveillance server unit 2 (e.g., a server) to perform surveillance using any of the exemplary processes outlined above. Thus, the disclosed exemplary embodiments may enable the image processing system to perform appropriate service support.
  • Embodiment 3
  • FIG. 7 illustrates a configuration of an exemplary image processing system, in accordance with a third embodiment. The image processing system of FIG. 7 includes a third surveillance information terminal unit 5 and the second surveillance server unit 4.
  • As described below in reference to FIG. 7, the third surveillance information terminal unit 5 and the second surveillance server unit 4 may be connected with each other through a wired or wireless communication channel C1. Further, although not depicted in FIG. 7, the exemplary image processing system may also couple a first surveillance server unit 2 (e.g., as described above in reference to FIG. 1) to the third surveillance information terminal unit 5 across communication channel C1, either alone or in conjunction with the second surveillance server unit 4.
  • As illustrated in FIG. 7, the third surveillance information terminal unit 5 may include a camera unit 11, an image processing A unit 12, an image accumulation unit 13, a transmission/reception A unit 14, an image display unit 15, and an input unit 16. In certain aspects, the camera unit 11, image processing A unit 12, image accumulation unit 13, transmission/reception A unit 14, and image display unit 15 of FIG. 7 are similar in functionality to the corresponding elements of the surveillance information terminal units 1 and 3 described above in reference to FIGS. 1 and 5.
  • As described above, the camera unit 11 may be configured to capture image data A11 and to provide the captured image data A11 as an input to the image processing A unit 12. In certain aspects, the image processing A unit 12 may be configured to extract or "cut out" an object detected within image data A11 on the basis of a preset detection parameter using any of the exemplary processes described above. The transmission/reception A unit 14 may transmit the image ID and the cut-out object image data A13 to the second surveillance server unit 4.
  • In some aspects, the image processing A unit 12 may be configured to store image data A11 in the image accumulation unit 13 together with the image ID. Further, using any of the exemplary processes described above, the image processing A unit 12 may superimpose an indication of the detected object over an image captured by the camera unit 11 to generate a camera input image, and the image display unit 15 may present the camera input image having the superimposed indication to a user. By way of example, the image processing A unit 12 may superimpose a border on a detected face region, and the image display unit 15 may present an image that includes the detected face region and the superimposed border, as depicted in FIG. 8. As illustrated in FIG. 8, an ID identifying the person may be superimposed and displayed near the border of the face region. In some instances, unique numbers (e.g., (1) and (2) in FIG. 8) may be superimposed and displayed near the border of the face region.
  • In an embodiment, the input unit 16 may include an information input unit installed as, for example, a touch sensor on a display screen, a cursor key, or a button that allows an operator to directly input the ID of the person into the third surveillance information terminal unit 5.
  • In an embodiment, an object in a captured image may represent a person. By way of example, when a person who is performing a suspicious action appears in an image, a surveillance agent who is associated with the third surveillance information terminal unit 5 may provide, to the input unit 16, information indicating an object region, such as the person's ID. In certain instances, the surveillance agent may tap the region on the screen that includes the person, and/or may input the unique number corresponding to the person. Further, in some instances, the image display unit 15 may present an image in which a border indicating the person's ID is superimposed on the person's face region (e.g., as illustrated in FIG. 8). The surveillance agent may select the presented border by tapping the region corresponding to the presented border, and the input unit 16 may input the person's ID indicated by the selected border.
  • When the ID is input (e.g., via the input unit 16), the third surveillance information terminal unit 5 may transmit the ID to the second surveillance server unit 4 such that the ID can be distinguished from other IDs. In certain instances, the input ID may be distinguished by a flag that the third surveillance information terminal unit 5 may add to the input ID. The second surveillance server unit 4 determines whether a face image of a person identified by the transmitted ID corresponds to a face image stored in the database unit 23. In a case where the face image of the person identified by the ID is not stored, the face image of the person may be additionally stored in the database unit 23 as a person requiring special attention.
  • FIG. 9 illustrates an exemplary process performed by the image processing A unit 12 of the third surveillance information terminal unit 5. In certain aspects, as described above in reference to the exemplary configuration of FIG. 2, the image processing A unit 12 may include an image IO unit 121, a CPU unit 122, a tangible, non-transitory memory unit 123, and an external IO unit 124. By way of example, the CPU unit 122 may be configured to perform the exemplary processes of FIG. 9 to determine an image ID associated with input image data received through the image IO unit 121 (e.g., as captured by the camera unit 11), and to assign the determined image ID to the input image data (e.g., step S10 of FIG. 3).
  • Referring to FIG. 9, the CPU unit 122 may be further configured to superimpose an indication of a detected object on an input image and to present the input image having the superimposed indication on the image display unit 15 (e.g., step S31 of FIG. 9). In an instance where a noted object (e.g., an object watched by a surveillance agent, such as a wanted criminal) exists in an image, an operator of the third surveillance information terminal unit 5 may input or select an ID of the object using the input unit 16. The CPU unit 122 may determine whether the input unit 16 receives an input or selection of the ID from the operator (e.g., step S32 of FIG. 9).
  • If the CPU unit 122 were to determine that the input unit 16 received the operator's input or selection of the ID (e.g., step S32; YES), the CPU unit 122 may attach, to the input or selected ID, a noted-object flag indicating that the object was designated through the input unit 16, and may calculate the image ID (e.g., step S33 of FIG. 9). The exemplary processes of FIG. 9 are then complete, and the CPU unit 122 may output the calculated image ID, as described above in reference to step S11 of FIG. 3. If, however, the CPU unit 122 were to determine that the operator provided neither an input nor a selection to the input unit 16, the exemplary process passes back to FIG. 3, and the CPU unit 122 calculates the image ID as described above (e.g., in step S10 of FIG. 3).
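  • Continuing the hypothetical ID encoding from Embodiment 1, the noted-object flag of step S33 could be carried as a single extra bit in the image ID; the bit position below is an assumption for illustration only.

```python
# Hedged sketch of step S33: mark an operator-designated ("noted") object
# by setting a flag bit in the image ID. The bit position is an assumption.
NOTED_FLAG = 1 << 31

def mark_noted(image_id: int) -> int:
    return image_id | NOTED_FLAG

def is_noted(image_id: int) -> bool:
    return bool(image_id & NOTED_FLAG)
```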
  • The second surveillance server unit 4 may determine whether an image of an object identified by the image ID matches an image stored in the database unit 23. In a case where the image of the object identified by the ID is not stored, the image may be additionally stored in the database unit 23 as a target requiring special attention.
  • Using the exemplary embodiments described above, the third surveillance information terminal unit 5 can acquire a matching result by selecting a target to be surveyed. Further, when the third surveillance information terminal unit 5 is connected with the first surveillance server unit 2, a surveillance agent can also perform surveillance on the server side. Thus, in some embodiments, the image processing system can appropriately perform service support.
  • Embodiment 4
  • FIG. 10 illustrates an exemplary configuration of an image processing system, in accordance with a fourth embodiment. The image processing system of FIG. 10 includes a fourth surveillance information terminal unit 7 and the second surveillance server unit 4 connected with each other through a wired or wireless communication channel C1. Further, although not depicted in FIG. 10, the exemplary image processing system may also couple a first surveillance server unit 2 (e.g., as described above in reference to FIG. 1) to the fourth surveillance information terminal unit 7 across communication channel C1, either alone or in conjunction with the second surveillance server unit 4.
  • The fourth surveillance information terminal unit 7 may include a camera unit 11, an image processing A unit 12, an image accumulation unit 13, a transmission/reception A unit 14, an image display unit 15, and an image processing C unit 17. In certain aspects, the camera unit 11, image processing A unit 12, image accumulation unit 13, transmission/reception A unit 14, and image display unit 15 of FIG. 10 are similar in functionality to the corresponding elements of the surveillance information terminal units 1, 3, and 5 described above in reference to FIGS. 1, 5, and 7.
  • In some instances, an object for surveillance within a corresponding surveillance region may be identified beforehand, and the second surveillance server unit 4 may be configured to transmit, to the fourth surveillance information terminal unit 7, image data or an image feature for identifying the image that includes the identified object. In some instances, the image feature may include interest points, regions of interest, edges, corners, blobs, ridges, and the like. The identified object may include, but is not limited to, a wanted criminal, a lethal weapon, a loyal customer, etc. In some aspects, the identified object may represent a target object, and the target object may be designated by a surveillance agent or similar individual and stored in the database unit 23.
  • In certain aspects, the fourth surveillance information terminal unit 7 may receive the image data that includes the target object and/or the image feature from the second surveillance server unit 4, and may store (or temporarily store) the image data and the image feature in the image processing C unit 17.
  • The image processing C unit 17 may be configured to perform image matching using object image region data detected by the image processing A unit 12 in conjunction with the stored image data or the stored image feature. In an instance where the image processing C unit 17 identifies an image of the target object as a result of the image matching processes, the image of the target object, or information to distinguish the image from other objects, may be presented on the image display unit 15.
  • In certain aspects, as described above in reference to the exemplary configuration of FIG. 2, the image processing C unit 17 may include an image IO unit 121, a CPU unit 122, a tangible, non-transitory memory unit 123, and an external IO unit 124. Therefore, if the resources of the CPU unit 122 in FIG. 2 are available, the image processing A unit 12 and the image processing C unit 17 may be configured to perform any of the exemplary processes described above using the same CPU unit (e.g., the CPU unit 122).
  • The exemplary image processing system of FIG. 10 may, in some instances, enable a surveillance agent or similar individual to perform surveillance of an object requiring special attention. Thus, the image processing system can perform and support appropriate services.
  • Embodiment 5
  • FIG. 11 illustrates an exemplary configuration of an image processing system, in accordance with a fifth embodiment. The image processing system of FIG. 11 includes a fifth surveillance information terminal unit 8 and a second surveillance server unit 4 connected with each other through a wired or wireless communication channel C1. Further, although not depicted in FIG. 11, the exemplary image processing system may also couple a first surveillance server unit 2 (e.g., as described above in reference to FIG. 1) to the fifth surveillance information terminal unit 8 across communication channel C1, either alone or in conjunction with the second surveillance server unit 4.
  • Referring to FIG. 11, the fifth surveillance information terminal unit 8 includes a camera unit 11, an image processing A unit 12, an image accumulation unit 13, a transmission/reception A unit 14, an image display unit 15, an input unit 16, and an image processing C unit 17. In certain aspects, the camera unit 11, image processing A unit 12, image accumulation unit 13, transmission/reception A unit 14, image display unit 15, input unit 16, and image processing C unit 17 of FIG. 11 are similar in functionality to the corresponding elements of the surveillance information terminal units 1, 3, 5, and 7 described above in reference to FIGS. 1, 5, 7, and 10.
  • In some aspects, the input unit 16 may include, but is not limited to, a touch sensor, a cursor key, or a button that allows an operator to directly input the ID, as described above. In an instance where a person performing a suspicious action appears in an image, the fifth surveillance information terminal unit 8 may receive the person's ID as an input from an associated surveillance agent via the input unit 16. The fifth surveillance information terminal unit 8 may, for example, transmit the ID to the second surveillance server unit 4 such that the ID can be distinguished from other IDs. In certain instances, the input ID may be distinguished by a flag that the fifth surveillance information terminal unit 8 may add to the input ID.
  • Further, in some instances, the second surveillance server unit 4 determines whether a face image of a person identified by the transmitted ID is a face image stored in the database unit 23. In an instance where the face image is not stored, the face image of the person may be additionally stored in the database unit 23 as a person requiring special attention.
  • In an instance where a target object is identified beforehand, the second surveillance server unit 4 may transmit image data or an image feature for identifying the image to the fifth surveillance information terminal unit 8. In certain aspects, the second surveillance server unit 4 may transmit the image data or the image feature to the fifth surveillance information terminal unit 8 upon detection of the target object or at a time proximate to the detection of the target object. Further, upon receipt of the image data or the image feature, the fifth surveillance information terminal unit 8 may store (e.g., permanently or temporarily) the received information in the image processing C unit 17.
  • The image processing C unit 17 may, in some aspects, perform image matching on the basis of the object image region data detected in the image processing A unit 12 and the stored image data or the stored image feature. In an instance where the image of the target object is identified as a result of the image matching process, the target object image and/or information to distinguish the target object image from other objects may be presented on the image display unit 15.
  • Because the image processing system in the exemplary embodiments described above includes both the input unit 16 and the image processing C unit 17, the image processing system may designate a noted target on both the terminal and the server. Using the disclosed exemplary embodiments, a surveillance agent or similar individual may perform surveillance of a noted object more carefully, and the image processing system may perform and support appropriate services.
  • Embodiment 6
  • FIG. 12 illustrates an exemplary configuration of an image processing system, in accordance with a sixth embodiment. The image processing system includes multiple second surveillance information terminal units (e.g., units 3 and 3′) connected to a first surveillance server unit 2. Further, although described in terms of two second surveillance information terminal units 3 and 3′, the disclosed embodiments are not limited to such exemplary numbers of terminal units, and in further embodiments, the image processing system may include any additional or alternate number of second surveillance information terminal units (e.g., three or more).
  • As illustrated in FIG. 12, the second surveillance information terminal units 3 and 3′ and the first surveillance server unit 2 (or the second surveillance server unit 4) are connected with each other through a wired or wireless communication channel C1. Since the configuration and operation of the second surveillance information terminal units 3 and 3′ are described above in reference to FIG. 5, only the differences in operation are described in detail.
  • In an embodiment, partial image data A13 of an object detected in the second surveillance information terminal unit 3 may be input to the first surveillance server unit 2 through the network N1. In the first surveillance server unit 2, the image processing B unit 22 may be configured to perform image matching between partial image data A13 and the image data or the image feature stored within the database unit 23. In a case where the partial image data A13 matches the stored image data, the first surveillance server unit 2 may be configured to transmit the stored image data (e.g., as stored within the database unit 23) to the second surveillance information terminal unit 3′ as matching image data.
  • The second surveillance information terminal unit 3′ may be configured to present the matching image data on the image display unit 15. In some aspects, the exemplary processes described above may facilitate sharing of image data detected by another surveillance information terminal among the terminals 3 and 3′.
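  • A minimal server-side sketch of this sharing flow may read as follows. The in-memory database, the caller-supplied matcher function, and the TerminalProxy class are hypothetical stand-ins introduced for illustration only.

    from dataclasses import dataclass, field

    @dataclass
    class TerminalProxy:
        """Hypothetical stand-in for a surveillance information terminal;
        send() models delivery over communication channel C1 for display
        on the image display unit 15."""
        terminal_id: str
        inbox: list = field(default_factory=list)

        def send(self, image_data):
            self.inbox.append(image_data)

    def share_matching_image(partial_image_a13, database, terminals,
                             sender_id, matcher):
        """Match partial image data A13 against stored image data and, on
        a match, forward the stored image to every other terminal."""
        for stored_image in database:
            if matcher(partial_image_a13, stored_image):
                for terminal in terminals:
                    if terminal.terminal_id != sender_id:
                        terminal.send(stored_image)  # matching image data
                return stored_image
        return None
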
  • In some aspects, image processing systems consistent with the disclosed embodiments may display image data stored in the database unit 23 to a surveillance agent who uses the second surveillance information terminal unit 3′. By way of example, when another surveillance agent finds a suspicious object and/or a suspicious person, the disclosed embodiments facilitate the sharing of corresponding matching image data among other surveillance agents. Thus, the image processing system may perform and support appropriate services.
  • Embodiment 7
  • FIG. 13 illustrates an exemplary configuration of an image processing system, in accordance with a seventh embodiment. The image processing system includes multiple third surveillance information terminal units (e.g., units 5 and 5′) connected to a first surveillance server unit 2. Further, although described in terms of two third surveillance information terminal units 5 and 5′, the disclosed embodiments are not limited to such exemplary numbers of terminal units, and in further embodiments, the image processing system may include any additional or alternate number of third surveillance information terminal units (e.g., three or more).
  • As illustrated in FIG. 13, the third surveillance information terminal units 5 and 5′ and the first surveillance server unit 2 (or the second surveillance server unit 4) are connected with each other through wire or wireless communication channel C1. As for the third surveillance information terminal units 5 and 5′, because the configuration and operation are described above, only operations and/or functionality that differ are described in detail. Further, although not depicted in FIG. 13, the exemplary image processing system may also couple a second surveillance server unit 4 (e.g., as described above in reference to FIG. 5) to the third surveillance information terminal units, either alone or in conjunction with the first surveillance server unit 2.
  • As described above, for example, a surveillance agent who is associated with the third surveillance information terminal unit 5 may input, into the input unit 16, the object ID of an interesting object in image data subjected to object detection. In some aspects, the object ID and partial image data A13 may be input to the first surveillance server unit 2 through the network N1. In the first surveillance server unit 2, the image processing B unit 22 may be configured to perform image matching between partial image data A13 and image data or an image feature identifying an image stored within the database unit 23.
  • In an instance where partial image data A13 matches the stored image data, the first surveillance server unit 2 may transmit the stored image data to the third surveillance information terminal unit 5′ as matching image data.
  • In some aspects, the third surveillance information terminal unit 5′ may present the matching image data to an operator using the image display unit 15. Using the disclosed embodiments, image data detected by another surveillance information terminal may be shared among the terminals.
  • In certain aspects, image processing systems consistent with the disclosed embodiments may be configured to display the image data stored in the database unit 23 for a surveillance agent that uses the third surveillance information terminal unit 5′. By way of example, when another surveillance agent finds a suspicious object and/or a suspicious person, the disclosed embodiments may share information among other surveillance agents, and the image processing system may perform and support appropriate services.
  • Embodiment 8
  • FIG. 14 illustrates an exemplary configuration of an image processing system, in accordance with an eighth embodiment. The image processing system of FIG. 14 includes multiple fourth surveillance information terminal units (e.g., units 7 and 7′) and a first surveillance server unit 2. Further, although described in terms of two fourth surveillance information terminal units 7 and 7′, the disclosed embodiments are not limited to such exemplary numbers of terminal units, and in further embodiments, the image processing system may include any additional or alternate number of fourth surveillance information terminal units (e.g., three or more).
  • As illustrated in FIG. 14, the fourth surveillance information terminal units 7 and 7′ and the first surveillance server unit 2 (or the second surveillance server unit 4) are connected with each other through wire or wireless communication channel C1. As for the fourth surveillance information terminal units 7 and 7′, because the configuration and operation are described above, only operations and/or functionality that differ are described in detail. Further, although not depicted in FIG. 14, the exemplary image processing system may also couple a second surveillance server unit 4 (e.g., as described above in reference to FIG. 5) to the fourth surveillance information terminal units, either alone or in conjunction with the first surveillance server unit 2.
  • In an embodiment, image data or an image feature for identifying the image associated with a target object may be input from the first surveillance server unit 2 into the fourth surveillance information terminal unit 7, and the received image data or image feature may be stored in the image processing C unit 17. The image processing C unit 17 may be configured to perform image matching on the basis of object image region data detected in the image processing A unit 12 and the stored image data or the stored image feature. Further, in some aspects, the fourth surveillance information terminal unit 7′ may be configured to perform one or more of the exemplary processes performed by the fourth surveillance information terminal unit 7.
  • Partial image data A13 of an object detected by the fourth surveillance information terminal unit 7 (and additionally or alternatively, by fourth surveillance information terminal unit 7′) may be transmitted to the first surveillance server unit 2 through the network N1. The first surveillance server unit 2 may transmit the image data to the other terminals, i.e., the fourth surveillance information terminal units 7′ and 7. The fourth surveillance information terminal units 7 and 7′ may display matching image data received from the other terminal on the image display unit 15. In this manner, image data detected by one surveillance information terminal may be shared among the terminals.
  • The disclosed embodiments may, for example, facilitate processes that share an image of a surveillance object among surveillance information terminal units. By way of example, when another surveillance agent finds a suspicious object and/or a suspicious person, the disclosed embodiments may share information with terminal units associated with other surveillance agents. Thus, the image processing system may perform and support appropriate services.
  • Embodiment 9
  • FIG. 15 illustrates an exemplary configuration of an image processing system, in accordance with a ninth embodiment. The image processing system of FIG. 15 includes multiple fifth surveillance information terminal units (e.g., units 8 and 8′) connected to a second surveillance server unit 4. Further, although described in terms of two fifth surveillance information terminal units 8 and 8′, the disclosed embodiments are not limited to such exemplary numbers of terminal units, and in further embodiments, the image processing system may include any additional or alternate number of fifth surveillance information terminal units (e.g., three or more).
  • The fifth surveillance information terminal units 8 and 8′ and the second surveillance server unit 4 are connected with each other through wire or wireless communication channel C1. As for the fifth surveillance information terminal units 8 and 8′, because the configuration and operation are described above, only operations and/or functionalities that differ are described in detail. Further, although not depicted in FIG. 15, the exemplary image processing system may also couple a first surveillance server unit 2 (e.g., as described above in reference to FIG. 5) to the fifth surveillance information terminal units, either alone or in conjunction with the second surveillance server unit 4.
  • In an embodiment, a surveillance agent who is associated with the fifth surveillance information terminal unit 8 may input, into the input unit 16, an ID of an object which the surveillance agent wants to monitor. The fifth surveillance information terminal unit 8 may transmit the ID to the second surveillance server unit 4. The second surveillance server unit 4 may determine whether an image of an object identified by the transmitted ID matches image data stored in the database unit 23.
  • In certain aspects, when image data of the object identified by the transmitted ID is not included within the stored image data (e.g., within the database unit 23), the image data of the object identified by the transmitted ID may be transmitted to the fifth surveillance information terminal unit 8′ as matching image data. The fifth surveillance information terminal unit 8′ may be configured to present the matching image data on the image display unit 15.
  • In a case where a target object to be detected is specified in advance, the second surveillance server unit 4 may transmit image data or an image feature for identifying the image to the fifth surveillance information terminal units 8 and/or 8′. In certain aspects, the second surveillance server unit 4 may transmit the image data or the image feature to the fifth surveillance information terminal units 8 and/or 8′ upon detection of the target object or at a time proximate to the detection of the target object. Further, upon receipt of the image feature by the fifth surveillance information terminal units 8 and/or 8′, the image feature may be stored in the image processing C unit 17.
  • In some aspects, the image processing C unit 17 may compare the matching image data with the image data or the image feature of the target object, and, in a case where they match, may display the matching image data on the image display unit 15.
  • The disclosed embodiments may, for example, facilitate processes that share an image of a surveillance object among surveillance information terminal units. By way of example, when another surveillance agent finds a suspicious object and/or a suspicious person, the disclosed embodiments may share information with terminal units associated with other surveillance agents. Thus, the image processing system may perform and support appropriate services.
  • Embodiment 10
  • FIG. 17 illustrates an exemplary configuration of an image processing system, in accordance with a tenth embodiment. The image processing system of FIG. 17 includes a terminal unit 10, which may include an image accumulation unit 101, a transmission unit 102 and an output unit 103, and a server 20, which may include an image matching unit 201 and a transmission unit 202.
  • In one aspect, the image accumulation unit 101 stores an image and an ID (identification information) of the image, or of a partial image that includes a portion of that image, in association with each other. The image may be acquired from an imaging device, such as a camera. The terminal 10 may assign a unique ID to each acquired image (and additionally or alternatively, each partial image) and record each acquired image (and additionally or alternatively, each partial image) in the image accumulation unit 101. In other aspects, the terminal 10 may acquire an image associated with a previously assigned ID, and may record the image in the image accumulation unit 101.
  • By way of example, a partial image may refer to an image region "cut out" or extracted from an image. The partial image includes, for example, a region in which an object appears (e.g., an object region). As described above, the terminal 10 may include an image processing unit capable of performing object detection functionality. Further, in some instances, the ID assigned to an image may correspond to a frame number of the image. In instances where multiple objects are included on a screen, an ID may be assigned to every partial image, and the ID may include object position information (e.g., coordinate information), the object size, etc. The transmission unit 102 may transmit the ID and the partial image to the server 20.
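  • The association between IDs, partial images, and whole images may be pictured with the following non-limiting Python sketch; the record fields and the sequential ID assignment are assumptions made for illustration only.

    from dataclasses import dataclass
    from typing import Dict, Tuple

    @dataclass
    class PartialImageRecord:
        """One accumulated entry: a partial image plus identification
        information such as frame number, position, and size."""
        frame_number: int            # the ID may correspond to a frame number
        position: Tuple[int, int]    # object position (coordinate information)
        size: Tuple[int, int]        # object size (width, height)
        partial_image: bytes         # cut-out object region
        whole_image: bytes           # whole image the region was cut from

    class ImageAccumulationUnit:
        """Minimal sketch of the image accumulation unit 101."""
        def __init__(self) -> None:
            self._records: Dict[int, PartialImageRecord] = {}
            self._next_id = 0

        def add(self, record: PartialImageRecord) -> int:
            record_id = self._next_id   # assign a unique ID per image
            self._records[record_id] = record
            self._next_id += 1
            return record_id

        def get(self, record_id: int) -> PartialImageRecord:
            return self._records[record_id]
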
  • In an embodiment, the output unit 103 outputs additional information based on the ID received from the server 20. The output may, for example, include an image display output to a display unit and/or an output to an external device. The additional information may include information identifying a result of a matching process performed by the server 20. For example, the additional information may include information to distinguish an image (or partial image) associated with the ID received from the server 20 from another image (or another partial image). For example, when an object corresponds to a person's face, the additional information may identify a frame superimposed and displayed on the detected face region, as depicted in FIG. 8. As illustrated in FIG. 8, the ID to identify the person may be superimposed and displayed near the frame of the face region. In some aspects, the ID may itself represent additional information, e.g., a partial image in which the face region appears.
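  • The superimposed display of FIG. 8 may be approximated with a short OpenCV sketch; the box format, colors, and font choices below are illustrative assumptions rather than features of the disclosed embodiments.

    import cv2

    def draw_additional_info(image, face_box, person_id):
        """Superimpose a frame on a detected face region and present the
        ID near the frame, in the manner depicted in FIG. 8.

        face_box: (x, y, w, h) of the face region in pixel coordinates."""
        x, y, w, h = face_box
        # Frame superimposed on the detected face region.
        cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)
        # ID displayed near the frame of the face region.
        cv2.putText(image, str(person_id), (x, max(y - 8, 0)),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)
        return image
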
  • The image matching unit 201 may match a partial image received from the terminal 10 with a candidate image. In some instances, the candidate image may be stored in a database disposed within the server 20 or at a network location accessible to the server 20. The transmission unit 202 may transmit the ID corresponding to the matched image to the terminal 10.
  • FIG. 18 illustrates an exemplary process performed by the image processing system in accordance with the disclosed embodiments. As described above, the transmission unit 102 of the terminal 10 may transmit the ID and a partial image to the server 20 (e.g., step S101 of FIG. 18). The image matching unit 201 of the server 20 may, in some aspects, match the partial image received from the terminal 10 with a candidate image (e.g., step S102 of FIG. 18). If the image matching unit 201 determines that the partial image matches the candidate image (e.g., step S102; Yes), the ID corresponding to the matched image may be transmitted to the terminal 10 (e.g., step S103 of FIG. 18). The output unit 103 of the terminal 10 may output additional information based on the ID received from the server 20 (e.g., step S104 of FIG. 18). Since the terminal 10 accumulates the whole image, it is possible to understand the relationship between the matching result of the partial image and the whole image including the partial image. Therefore, for example, the output unit 103 may superimpose and display a partial image (e.g., a face image) and the whole image (such as the entire person image and a background image) on a display unit, as illustrated in FIG. 8.
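  • The FIG. 18 exchange, viewed from the terminal side, may be summarized with the following sketch; every callable passed in is a hypothetical stand-in for a terminal or server unit, not an interface defined by the disclosed embodiments.

    def terminal_matching_round(accumulation, detect, send, receive, display):
        """Sketch of steps S101-S104 from the terminal's perspective.
        detect()  -> (record_id, partial_image)  # object detection result
        send(record_id, partial_image)           # transmission unit 102, S101
        receive() -> matched ID or None          # server's reply after S102/S103
        display(whole_image, partial_image)      # output unit 103, S104
        """
        record_id, partial = detect()
        send(record_id, partial)                  # S101
        matched_id = receive()                    # server performs S102/S103
        if matched_id is not None:
            whole = accumulation.get(matched_id)  # terminal holds the whole image
            display(whole, partial)               # superimposed display per FIG. 8
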
  • In some aspects, since an image which a terminal transmits to a server may be a partial image, the image processing system can reduce the communication load between servers and terminals. Further, in additional embodiments, the image matching unit does not have to be installed in the terminal, and a terminal holder (e.g., an operator and/or a surveillance agent) may confirm the result of image matching performed in the server. Therefore, image processing systems consistent with the disclosed embodiments may perform and support surveillance services, customer services, and other services.
  • Embodiment 11
  • FIG. 19 illustrates an exemplary configuration of an image processing system, in accordance with an eleventh embodiment. The image processing system of FIG. 19 may include a terminal 10, which includes an image accumulation unit 101 and a transmission unit 104, and a server 20, which includes an image matching unit 201 and a transmission unit 202. The operation and functionality of the image accumulation unit 101, the image matching unit 201, and the transmission unit 202 are comparable to those of the similarly numbered units described above, and additional explanation is omitted. As described above, the server 20 may transmit the matched image, the image ID that indicates the matched image, and/or the additional information to the terminal 10. Further, when a partial image is transmitted to the server 20 and the server 20 performs the image matching, a surveillance agent or similar individual associated with the server 20 may desire to watch and/or monitor the circumstance indicated not only by the partial image, but also by the whole image including the partial image. In some aspects, the transmission unit 104 may function in a manner similar to the transmission unit 102 described above. For instance, the transmission unit 104 may initially transmit a partial image to the server 20, and, afterward, transmit to the server 20 the whole image including a partial image identified by the ID received from the server 20.
  • FIG. 20 illustrates an exemplary process performed by the image processing system, in accordance with the disclosed embodiments.
  • As described above, the transmission unit 104 of the terminal 10 may transmit the ID and a partial image to the server 20 (e.g., step S101 of FIG. 20). The image matching unit 201 of the server 20 may, in some aspects, match the partial image received from the terminal 10 with a candidate image (e.g., step S102 of FIG. 20). If the image matching unit 201 determines that the partial image matches the candidate image (e.g., step S102; Yes), the ID corresponding to the matched image may be transmitted to the terminal 10 (e.g., step S103 of FIG. 20). The transmission unit 104 may then transmit to the server 20 the whole image including the partial image identified by the ID received from the server 20 (e.g., step S105 of FIG. 20).
  • In some aspects, when the server 20 matches the partial image with a candidate image, the server 20 may transmit, to the terminal 10, a request for the whole image including the partial image.
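  • Under the same hypothetical interfaces as the previous sketch, the FIG. 20 variant may be illustrated with the whole-image upload of step S105 added; again, all callables and the record structure are assumptions for illustration only.

    def terminal_round_with_upload(accumulation, detect, send_partial,
                                   receive, send_whole):
        """Sketch of steps S101-S103 plus S105: after the server reports a
        match, the terminal also uploads the whole image containing the
        matched partial image (transmission unit 104)."""
        record_id, partial = detect()
        send_partial(record_id, partial)          # S101
        matched_id = receive()                    # S102/S103 on the server
        if matched_id is not None:
            record = accumulation.get(matched_id)
            send_whole(matched_id, record.whole_image)  # S105
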
  • Embodiment 12
  • FIG. 21 illustrates an exemplary image processing device 100, in accordance with a twelfth embodiment.
  • In one aspect, an output unit 1001 of the image processing device 100 may be configured to: (i) output an image of an object region when a value indicative of a size of the object region in an image captured by an imaging unit is greater than a threshold value; and (ii) output an image of a predetermined region including the object region when the value is not greater than the threshold value. The output may, for example, represent an image display output provided, by the output unit 1001, to a display unit, an external device, or another device or unit capable of presenting visual representations of data.
  • In an embodiment, the image processing device 100 may include an imaging unit (e.g., a camera), and a person may utilize the image processing device 100 to capture an image of an environment surrounding the imaging unit. The image processing device 100 may also include an object detection unit, which may be configured to detect a region of an object from the image captured by the imaging unit. By way of example, the object may include, but is not limited to, a person, a car, a commodity, any portion thereof, letters presented on their surfaces, and a seal affixed to them. Here, it is presumed that the object is a person's face.
  • Thus, in some aspects, the size of the face image detected by the object detection unit may correspond to a size required for matching with a wanted person's photograph, and so on. In other aspects, the size of a detected face image may exceed a predetermined threshold value. For example, as illustrated in FIG. 16, in an instance where a first person's inter-eye distance W1, which is a value indicating the object region size, is greater than pixel number TH1, which is a threshold, the object detection unit cuts out a face image (1).
  • In other aspects, as illustrated in FIG. 16, in an instance where a second person's inter-eye distance W2 is less than pixel number TH1, the object detection unit cuts out an image data region outlining the second person's entire body, which is a predetermined region including the face image.
  • In some embodiments, although the facial size of the first person may be greater than predefined pixel number TH1 and less than TH2, a possibility exists that the object detection unit may be unable to acquire the first person's facial features (e.g., because the first person wears glasses or a mask). In certain aspects, the object detection unit may cut out an image data region larger in size than the region (1) outlined by the first person's face (e.g., a region outlining the entire body of the first person).
  • As another example, the object for surveillance within the image data (e.g., a surveillance object) may correspond to a wanted vehicle. In an instance where the object detection unit determines that a width W3 of the detected license plate is greater than predefined pixel number TH1, the object detection unit cuts out the region of the license plate as an image data region (3). Alternatively, in an instance where the object detection unit determines that a width W4 of a license plate is less than predefined pixel number TH1, the object detection unit cuts out an image data region (4) of the entire car outline or a size larger than the entire car outline. The output unit 1001 outputs the image cut out in this way.
  • As another example, depicted in FIG. 22, the object may include a logo of a commodity. The logo may correspond to a brand, a trademark, a sign, a character, and/or a figure that can distinguish the commodity from other commodities. As depicted in FIG. 22, when a width D2 of the detected logo is greater than a predefined pixel number, the object detection unit may be configured to cut out or extract a region of the logo as an image data region. Alternatively, when a width D2 of the logo is less than a predefined pixel number, the object detection unit may be configured to cut out or extract an image data region of the entire commodity outline, or a size larger than the entire commodity outline. The output unit 1001 outputs the image cut out to a display unit and/or an external device.
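  • The region-selection rule shared by the face, license-plate, and logo examples may be condensed into a single non-limiting sketch; the parameter names and the occlusion flag modeling the glasses-or-mask caveat are assumptions introduced for illustration.

    def select_cutout(size_value, object_box, larger_box, th1, th2=None,
                      features_unavailable=False):
        """Return the region to output: the object region itself when its
        size measure (e.g., inter-eye distance W1/W2, plate width W3/W4,
        or logo width) exceeds threshold TH1, otherwise a predetermined
        larger region (whole person, entire car outline, or entire
        commodity outline). Boxes are (x, y, w, h) tuples."""
        if size_value > th1:
            # Even a large-enough face may lack usable facial features
            # (e.g., glasses or a mask); fall back to the larger region.
            if th2 is not None and size_value < th2 and features_unavailable:
                return larger_box
            return object_box   # e.g., face image (1) or plate region (3)
        return larger_box        # e.g., whole body or image data region (4)
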
  • In certain aspects, image processing systems consistent with the disclosed embodiments may prevent an output of a useless image region. Therefore, the image processing system can perform image processing in a more efficient manner, and the image processing system may provide and support appropriate services.
  • Additional Exemplary Embodiments
  • In one embodiment, an image processing system includes a terminal. The terminal may, in some instances, include an image accumulation unit that stores an image and identification information on the image or a partial image of the image. The terminal may also include a transmission unit that transmits the identification information and the partial image to a server, and an output unit that outputs additional information based on identification information received from the server. Further, the image processing system may include a server having an image matching unit that matches the partial image received from the terminal with an accumulated candidate image, and a transmission unit that transmits the identification information corresponding to said partial image to the terminal.
  • In another embodiment, the above-mentioned output unit in the image processing system may be an image display unit that displays an image. The output unit may, in some instances, superimpose and display information on a matching result on the image as the additional information.
  • In another embodiment, an image processing system includes a terminal. The terminal may, in some instances, include an image accumulation unit that stores an image and identification information on the image or a partial image of the image. The terminal may also include a transmission unit that transmits the identification information of the image and the partial image of the image to a server and transmits the image identified by identification information received from the server to the server. Further, the image processing system may include a server having an image matching unit that matches the partial image received from the terminal with an accumulated candidate image, and a transmission unit that transmits the identification information corresponding to said partial image to the terminal.
  • In some aspects, the above-mentioned partial image may include an object detected from an image, and the above-mentioned identification information may be position information of the object or time information on acquisition of the image.
  • In some aspects, the terminal of the above-mentioned image processing system may include an input unit that designates a region of the detected object. The image processing system may store a partial image in a database if the partial image including the object region is not matched with the candidate image.
  • In some aspects, the above-mentioned terminal may include an input unit that designates a region of the detected object. Further, a partial image including the object region designated by the input unit may be matched with the candidate image more preferentially than other partial images.
  • In some aspects, the transmission unit of the above-mentioned server may transmit an image designated from the candidate image to the above-mentioned terminal, and the above-mentioned terminal may include an image processing unit that matches the image and the partial image when the designated image is received.
  • In some aspects, when a plurality of the above-mentioned terminals exist and the partial image transmitted from at least one of the terminals to the above-mentioned server is matched in the image matching unit, the server may transmit the identification information corresponding to said partial image to a different terminal from the terminal.
  • In another embodiment, an image processing device may include an output unit that outputs an object region when a value indicating a size of the object region of an image taken by an imaging unit is greater than a threshold, and, otherwise, outputs an image of a predetermined region including the object region.
  • In some aspects, the object region may be a face region of a person and the predetermined region may be an image including a whole of the person.
  • In some aspects, the object region may be a license plate of a car and the predetermined region may be an image including a whole of the car.
  • In another embodiment, an image processing method may include storing an image and identification information on the image or a partial image of the image in a terminal, transmitting from the terminal the identification information and the partial image to a server and outputting additional information based on identification information received from the server.
  • The image processing method may further include matching by the server the partial image received from the terminal with an accumulated candidate image and transmitting the identification information corresponding to the matched image to the terminal.
  • In some aspects, the image processing method may further include outputting by the terminal the additional information to an image display unit that displays an image, and superimposing and displaying by the terminal information on a matching result on the image as the additional information.
  • In another embodiment, an image processing method may include storing an image and identification information on the image or a partial image of the image in a terminal, transmitting from the terminal the identification information of the image and the partial image of the image to a server and transmitting from the terminal to the server the image identified by identification information received from the server, and matching by the server the partial image received from the terminal with an accumulated candidate image and transmitting the identification information corresponding to the matched image to the terminal.
  • In some aspects, the partial image may include an object detected from an image, and the identification information may be position information of the object or time information on acquisition of the image.
  • In some aspects, the image processing method may further include accepting by the terminal the designation of an object region through an input unit that designates a region of the detected object, and, when a partial image including the designated object region is not matched with the candidate image, storing the partial image in a database in the terminal.
  • In some aspects, the above-mentioned terminal may accept the designation of an object region through an input unit that designates a region of the detected object. Further, a partial image including the designated object region may be matched with the candidate image more preferentially than other partial images.
  • In some aspects, the image processing method may further include transmitting by the server an image designated from the candidate image to the terminal, and when the designated image is received, matching by the terminal the image and the partial image.
  • In some aspects, the image processing method may further include, when a plurality of the terminals exist and the partial image transmitted from at least one of the terminals to the server is matched, transmitting by the server the identification information corresponding to the matched image to a different terminal from the terminal.
  • In another embodiment, an image processing method may include outputting an object region when a value indicating a size of the object region of an image taken by an imaging unit is greater than a threshold, and, otherwise, outputting an image of a predetermined region including the object region.
  • In some aspects, the object region may be a face region of a person and the predetermined region may be an image including a whole of the person.
  • In some aspects, the object region may be a license plate of a car and the predetermined region may be an image including a whole of the car.
  • In another embodiment, an image processing program may cause a computer to execute the above-mentioned image processing method.

Claims (12)

What is claimed is:
1. A terminal comprising:
a memory storing instructions; and
at least one processor coupled to the memory, the at least one processor being operative with the instructions in order to:
store an image and identification information associated with at least one of the image or a portion of the image to the memory, the image portion corresponding to a partial image;
transmit the identification information and the at least one image or partial image to a server;
obtain, from the server, additional information indicative of a match between the at least one transmitted image or partial image and at least one candidate image or partial image; and
generate an electronic command to output the additional information.
2. The terminal of claim 1, wherein the at least one processor is further operative with the instructions to:
generate a command to present the image; and
generate a command to superimpose and present the at least one candidate image or partial image over a portion of the image.
3. The terminal of claim 1, wherein:
the partial image includes an object detected from an image; and
the identification information comprises position information associated with the object or temporal information identifying a time at which the terminal obtained the image.
4. The terminal according to claim 3, wherein the at least one processor is further operative with the instructions to:
designate a region of the detected object; and
store the partial image in the memory in an instance when the partial image includes the designated object region and is not matched with the at least one candidate image or partial image.
5. A server comprising:
a memory storing instructions; and
at least one processor coupled to the memory, the at least one processor being operative with the instructions in order to:
match at least a portion of a first image received from a terminal with a candidate image, the first image portion corresponding to a partial image, and the candidate image being stored in the memory;
transmit identification information corresponding to the partial image to the terminal;
generate an electronic command to present a second image corresponding to the identification information, the second image being received from the terminal; and
generate an electronic command to superimpose and present information indicative of the matched candidate image over the presented second image.
6. The server of claim 5, wherein:
the partial image includes an object detected from an image; and
the identification information comprises position information associated with the object or temporal information identifying a time at which the server obtained the first image.
7. An image processing device, comprising:
a memory storing instructions; and
at least one processor coupled to the memory, the at least one processor being operative with the instructions in order to:
identify a first portion of a captured image, the first captured image portion comprising a second captured image portion, the second captured image portion including an object; and
generate an electronic command to output data corresponding to the first captured image portion.
8. The image processing device according to claim 7, wherein the first captured image portion corresponds to the second captured image portion when a value indicative of a size of the second captured image portion is greater than a threshold value.
9. The image processing device according to claim 7, wherein the object is a face of a person and the first captured image portion comprises an image including a whole of the person.
10. The image processing device according to claim 7, wherein the object is a license plate of a car and the first captured image portion comprises an image including a whole of the car.
11. An image processing system including
a first terminal, comprising:
a first memory storing first instructions; and
at least one first processor coupled to the first memory, the at least one first processor being operative with the first instructions in order to:
store an image and identification information associated with at least one of the image or a portion of the image to the first memory, the image portion corresponding to a partial image;
transmit the identification information and the at least one image or partial image to a server; and
output additional information indicative of a match between the at least one transmitted image or partial image and at least one candidate image or partial image; and
a server including:
a second memory storing second instructions; and
at least one second processor coupled to the second memory, the at least one second processor being operative with the second instructions in order to:
match at least the partial image received from the first terminal with a portion of a stored candidate image; and
transmit the identification information corresponding to the partial image to the first terminal.
12. The image processing system of claim 11, wherein:
the system comprises one or more second terminals; and
when the server matches at least the received partial image with the stored candidate image, the at least one second processor is further operative with the second instructions to transmit the identification information corresponding to the partial image to at least one of the second terminals.
US14/673,929 2014-03-31 2015-03-31 Image processing system, image processing method and program, and device Abandoned US20150279116A1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
US16/297,486 US20190206108A1 (en) 2014-03-31 2019-03-08 Image processing system, image processing method and program, and device
US16/297,475 US11100691B2 (en) 2014-03-31 2019-03-08 Image processing system, image processing method and program, and device
US17/375,751 US11798211B2 (en) 2014-03-31 2021-07-14 Image processing system, image processing method and program, and device
US18/202,009 US20230298238A1 (en) 2014-03-31 2023-05-25 Image processing system, image processing method and program, and device
US18/371,296 US20240013458A1 (en) 2014-03-31 2023-09-21 Image processing system, image processing method and program, and device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014070719 2014-03-31
JP2014-070719 2014-03-31

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US16/297,475 Continuation US11100691B2 (en) 2014-03-31 2019-03-08 Image processing system, image processing method and program, and device
US16/297,486 Continuation US20190206108A1 (en) 2014-03-31 2019-03-08 Image processing system, image processing method and program, and device

Publications (1)

Publication Number Publication Date
US20150279116A1 true US20150279116A1 (en) 2015-10-01

Family

ID=54191150

Family Applications (6)

Application Number Title Priority Date Filing Date
US14/673,929 Abandoned US20150279116A1 (en) 2014-03-31 2015-03-31 Image processing system, image processing method and program, and device
US16/297,486 Abandoned US20190206108A1 (en) 2014-03-31 2019-03-08 Image processing system, image processing method and program, and device
US16/297,475 Active US11100691B2 (en) 2014-03-31 2019-03-08 Image processing system, image processing method and program, and device
US17/375,751 Active 2035-04-19 US11798211B2 (en) 2014-03-31 2021-07-14 Image processing system, image processing method and program, and device
US18/202,009 Pending US20230298238A1 (en) 2014-03-31 2023-05-25 Image processing system, image processing method and program, and device
US18/371,296 Pending US20240013458A1 (en) 2014-03-31 2023-09-21 Image processing system, image processing method and program, and device

Family Applications After (5)

Application Number Title Priority Date Filing Date
US16/297,486 Abandoned US20190206108A1 (en) 2014-03-31 2019-03-08 Image processing system, image processing method and program, and device
US16/297,475 Active US11100691B2 (en) 2014-03-31 2019-03-08 Image processing system, image processing method and program, and device
US17/375,751 Active 2035-04-19 US11798211B2 (en) 2014-03-31 2021-07-14 Image processing system, image processing method and program, and device
US18/202,009 Pending US20230298238A1 (en) 2014-03-31 2023-05-25 Image processing system, image processing method and program, and device
US18/371,296 Pending US20240013458A1 (en) 2014-03-31 2023-09-21 Image processing system, image processing method and program, and device

Country Status (3)

Country Link
US (6) US20150279116A1 (en)
JP (4) JP6627750B2 (en)
WO (1) WO2015151449A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017084289A (en) * 2015-10-30 2017-05-18 キヤノン株式会社 Display controller, display control method and program
JP7054331B2 (en) * 2017-10-27 2022-04-13 ホーチキ株式会社 Room occupancy monitoring system
JP7142443B2 (en) * 2018-03-02 2022-09-27 グローリー株式会社 Image authentication system, image authentication method and image authentication program
CN110324528A (en) * 2018-03-28 2019-10-11 富泰华工业(深圳)有限公司 Photographic device, image processing system and method
JP7272626B2 (en) * 2019-01-09 2023-05-12 i-PRO株式会社 Verification system, verification method and camera device

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002183180A (en) 2000-12-16 2002-06-28 Kentaro Hayashi Information gathering and retrieval system
JP2003046911A (en) * 2001-05-22 2003-02-14 Matsushita Electric Ind Co Ltd Monitoring recording system and method therefor
US20020175997A1 (en) 2001-05-22 2002-11-28 Matsushita Electric Industrial Co., Ltd. Surveillance recording device and method
JP2004128615A (en) * 2002-09-30 2004-04-22 Toshiba Corp Person monitoring system
JP2004280376A (en) 2003-03-14 2004-10-07 Japan Science & Technology Agency Method and system for recognition of subject's behavior
JP4622301B2 (en) 2004-05-07 2011-02-02 オムロン株式会社 Surveillance system and surveillance camera
JP4104577B2 (en) 2004-05-21 2008-06-18 三菱電機株式会社 Image transmission apparatus, image transmission method, transmission system, and video surveillance system
JP2006148842A (en) 2004-10-20 2006-06-08 Daimei Kk Wearable monitor camera system
JP4645180B2 (en) 2004-12-03 2011-03-09 コニカミノルタホールディングス株式会社 Video display device
JP2007158421A (en) * 2005-11-30 2007-06-21 Matsushita Electric Ind Co Ltd Monitoring camera system and face image tracing recording method
US8599267B2 (en) 2006-03-15 2013-12-03 Omron Corporation Tracking device, tracking method, tracking device control program, and computer-readable recording medium
JP5082724B2 (en) 2007-09-28 2012-11-28 オムロン株式会社 Image processing apparatus and method, and program
JP2011114580A (en) 2009-11-26 2011-06-09 Panasonic Corp Multiple cameras monitoring system, mobile terminal apparatus, center apparatus, and method for monitoring multiple cameras
JP5552946B2 (en) 2010-07-30 2014-07-16 株式会社リコー Face image sample collection device, face image sample collection method, program
JP2012252569A (en) 2011-06-03 2012-12-20 Nec Corp Information processing device, control method for information processing device and control program thereof, information processing system and information processing method
US8517394B2 (en) 2011-08-01 2013-08-27 Cnh America Llc Adjustable suspension system for a work vehicle
JPWO2013145530A1 (en) * 2012-03-28 2015-12-10 日本電気株式会社 Analysis system
JP5942258B2 (en) * 2012-06-12 2016-06-29 パナソニックIpマネジメント株式会社 Video display system, video synthesis re-encoding device, video display device, video display method, and video synthesis re-encoding program
US10034049B1 (en) * 2012-07-18 2018-07-24 Google Llc Audience attendance monitoring through facial recognition
WO2014017398A1 (en) 2012-07-24 2014-01-30 日本電気株式会社 Attendance management device, data processing method therfor, and program

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060193509A1 (en) * 2005-02-25 2006-08-31 Microsoft Corporation Stereo-based image processing
US20110096922A1 (en) * 2009-10-23 2011-04-28 Canon Kabushiki Kaisha Image processing apparatus and image processing method
US20120307091A1 (en) * 2011-06-03 2012-12-06 Panasonic Corporation Imaging apparatus and imaging system
US20150146922A1 (en) * 2012-06-29 2015-05-28 Secom Co., Ltd. Target detection device and target detection method
US20150012840A1 (en) * 2013-07-02 2015-01-08 International Business Machines Corporation Identification and Sharing of Selections within Streaming Content

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10949657B2 (en) * 2016-11-22 2021-03-16 Panasonic Intellectual Property Management Co., Ltd. Person's behavior monitoring device and person's behavior monitoring system
US11270113B2 (en) * 2017-04-21 2022-03-08 Hewlett-Packard Development Company, L.P. Object detections for virtual reality
US11861902B2 (en) 2017-04-21 2024-01-02 Hewlett-Packard Development Company, L.P. Object detections for virtual reality
EP3477536A1 (en) * 2017-10-28 2019-05-01 Leapsy International Ltd. Wearable device capable of recognizing human face and license plate
EP3477535A1 (en) * 2017-10-28 2019-05-01 Leapsy International Ltd. Method of recognizing human face and license plate utilizing wearable device
WO2020193671A1 (en) * 2019-03-27 2020-10-01 Schölly Fiberoptic GmbH Method for putting a camera control unit (ccu) into operation
US11010599B2 (en) 2019-05-01 2021-05-18 EMC IP Holding Company LLC Facial recognition for multi-stream video using high probability group and facial network of related persons
US11334746B2 (en) * 2019-05-01 2022-05-17 EMC IP Holding Company LLC Facial recognition for multi-stream video using high probability group
WO2021059139A1 (en) * 2019-09-27 2021-04-01 Ricoh Company, Ltd. Apparatus, image processing system, communication system, method for setting, image processing method, and recording medium
US20220019759A1 (en) * 2020-07-16 2022-01-20 Goodrich Corporation Helicopter search light and method for detection and tracking of anomalous or suspicious behaviour
US11861895B2 (en) * 2020-07-16 2024-01-02 Goodrich Corporation Helicopter search light and method for detection and tracking of anomalous or suspicious behaviour

Also Published As

Publication number Publication date
JP2022048147A (en) 2022-03-25
US20240013458A1 (en) 2024-01-11
JP7001086B2 (en) 2022-01-19
JP6627750B2 (en) 2020-01-08
US20190206107A1 (en) 2019-07-04
JPWO2015151449A1 (en) 2017-04-13
US20210343057A1 (en) 2021-11-04
WO2015151449A1 (en) 2015-10-08
JP2024019238A (en) 2024-02-08
US20190206108A1 (en) 2019-07-04
US11798211B2 (en) 2023-10-24
US20230298238A1 (en) 2023-09-21
JP2020017301A (en) 2020-01-30
US11100691B2 (en) 2021-08-24

Similar Documents

Publication Publication Date Title
US11798211B2 (en) Image processing system, image processing method and program, and device
CN108271021B (en) Gaze sensing based block level update rate control
US10277832B2 (en) Image processing method and image processing system
TW201722136A (en) Security system and method
US9305331B2 (en) Image processor and image combination method thereof
JP2008035095A (en) Monitoring apparatus, monitoring system, monitoring method and program
JP2010136032A (en) Video monitoring system
JP2014022970A (en) Image transmission device, image transmission method, image transmission program, image recognition authentication system, and image reception device
CN108141568B (en) OSD information generation camera, synthesis terminal device and sharing system
US11120838B2 (en) Information processing apparatus, control method, and program
CN105578129A (en) Multipath multi-image video splicing device
US10863113B2 (en) Image processing apparatus, image processing method, and storage medium
JP2007019671A (en) Image communication system and image processing program
US9773143B2 (en) Image processing apparatus, image processing method, and image processing system
JP6437217B2 (en) Image output device, image management system, image processing method, and program
CN203894772U (en) Mass face detecting and identifying system
KR20140022670A (en) An apparatus for transmitting simplified motion information excluding background images and displaying the information by utilizing avatar and the methods thereof
US10783365B2 (en) Image processing device and image processing system
JP2005269473A (en) Monitoring system
JP2019009615A (en) Monitoring camera device, monitoring video distribution method, and monitoring system
JP2012181328A (en) Advertisement distribution system, advertisement distribution device, advertisement distribution method, and program
JP2012212235A (en) Object detection system, object detection method, and program
JP2019113978A (en) Movement line analysis system, control method therefor, and program
KR102009924B1 (en) System for monitoring image and operating method thereof
JP2016220148A (en) Control apparatus, control method, and system

Legal Events

Date Code Title Description
AS Assignment

Owner name: NEC CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:YACHIDA, SHOJI;REEL/FRAME:035295/0659

Effective date: 20150320

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION