USRE44527E1 - Abnormality detection and surveillance system - Google Patents

Abnormality detection and surveillance system

Info

Publication number
USRE44527E1
USRE44527E1 (application US 13/361,438)
Authority
US
United States
Prior art keywords
movements
video
camera
individual
surveillance system
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Lifetime
Application number
US13/361,438
Inventor
David G. Aviv
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Prophet Productions LLC
Original Assignee
Prophet Productions LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
Application filed by Prophet Productions LLC
Priority to US 13/361,438
Application granted
Publication of USRE44527E1
Anticipated expiration
Legal status: Expired - Lifetime (current)


Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 13/00: Burglar, theft or intruder alarms
    • G08B 13/18: Actuation by interference with heat, light, or radiation of shorter wavelength; actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B 13/189: using passive radiation detection systems
    • G08B 13/194: using image scanning and comparing systems
    • G08B 13/196: using television cameras
    • G08B 13/19602: Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • G08B 13/19608: Tracking movement of a target, e.g. by detecting an object predefined as a target, using target direction and/or velocity to predict its new position
    • G08B 13/19613: Recognition of a predetermined image pattern or behaviour pattern indicating theft or intrusion
    • G08B 13/19639: Details of the system layout
    • G08B 13/19641: Multiple cameras having overlapping views on a single scene
    • G08B 13/19643: Multiple cameras having overlapping views on a single scene wherein the cameras play different roles, e.g. different resolution, different camera type, master-slave camera
    • G08B 13/19697: Arrangements wherein non-video detectors generate an alarm themselves
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00: Details of television systems
    • H04N 5/76: Television signal recording
    • H04N 7/00: Television systems
    • H04N 7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N 7/188: Capturing isolated or intermittent images triggered by the occurrence of a predetermined event, e.g. an object reaching a predetermined position

Definitions

  • the purpose of the secondary camera 20 is to provide a detailed video signal of the individual having assumed criminal intent and also to improve false positive and false negative performance. This information is recorded by the video recorder 24 and displayed on a monitor 22. An alarm bell or light (not shown), or both, may be provided and activated by an output signal from the controller 18 to summon a supervisor to immediately view the pertinent video images showing the apparent crime in progress and assess its accuracy.
  • a VCR 26 is operating continuously (using a 6 hour loop-tape, for example).
  • the VCR 26 is being controlled by the VCR controller 28 .
  • All the “real-time” images directly from the picture input means 10 are immediately recorded and stored for at least 6 hours, for example.
  • a signal from the controller 18 is sent to the VCR controller 28 changing the mode of recording from tape looping mode to non-looping mode.
  • the tape will not re-loop and will therefore retain the perhaps vital recorded video information of the surveyed site, including the crime itself, and the events leading up to the crime.
  • the video signal may also be transmitted to a VCR located elsewhere; for example, at a law enforcement facility and, simultaneously to other secure locations of the Court and its associated offices.
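In software terms, the loop-tape arrangement described above behaves like a ring buffer that stops discarding old frames once an alarm fires. The following sketch is purely illustrative (the class and its names are not from the patent) and assumes frames arrive one at a time from the picture input means:

```python
from collections import deque

class LoopRecorder:
    """Loop-tape emulation: keep only the last N frames, then stop looping on alarm."""

    def __init__(self, max_frames):
        self.loop = deque(maxlen=max_frames)   # looping mode: oldest frames are overwritten
        self.alarmed = False
        self.kept = []                         # frames retained after the alarm

    def record(self, frame):
        if self.alarmed:
            self.kept.append(frame)            # non-looping mode: keep everything
        else:
            self.loop.append(frame)            # looping mode, e.g. a 6-hour window

    def alarm(self):
        """Switch from looping to non-looping so the lead-up to the crime is preserved."""
        self.alarmed = True
        self.kept = list(self.loop)            # freeze the pre-alarm footage
```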
  • each sampled frame of video is “segmented” into parts relating to the objects detected therein.
  • the video signal derived from the vidicon or CCD/TV camera is analyzed by an image raster analyzer. Although this process causes slight signal delays, it is accomplished nearly in real time.
  • a high resolution camera may not be required or otherwise used.
  • the resolution provided by a relatively simple and low cost camera may be sufficient.
  • the length of frame intervals between analyzed frames may vary. For example, in a high risk area, every frame from the CCD/TV camera may be analyzed continuously to ensure that the maximum amount of information is recorded prior to and during a crime. In a low risk area, it may be preferred to sample perhaps every 10 frames from each camera, sequentially.
  • the system would activate an alert mode wherein the system becomes “concerned and curious” about the suspicious actions and the sampling rate is increased to perhaps every 5 frames or even every frame.
  • the entire system may be activated wherein both audio and video system begin to sample the environment for sufficient information to determine the intent of the actions.
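The varying sampling interval can be summarized as a small policy function. The sketch below follows the intervals given above (every 10th, every 5th, or every frame); the risk and alert encodings are assumptions added for illustration:

```python
def sampling_interval(risk, alert_level):
    """Number of camera frames between analysed samples.

    risk: "high" or "low" for the policed area.
    alert_level: 0 = normal, 1 = suspicious ("concerned and curious"), 2 = alarm.
    """
    if risk == "high" or alert_level >= 2:
        return 1            # analyse every frame
    if alert_level == 1:
        return 5            # alert mode: every 5th frame
    return 10               # quiet, low-risk area: every 10th frame
```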
  • In FIG. 2, several frames of a particular camera output are shown to illustrate the segmentation process performed in accordance with the invention.
  • the system begins to sample at frame K and determines that there are four objects (previously determined to be people, as described below), A-D located within a particular zone being policed. Since nothing unusual is determined from the initial analysis, the system does not warrant an “alert” status. People A, B, and D are moving according to normal, non-criminal intent, as could be observed.
  • a crime likelihood is indicated when frames K+10 through K+13 are analyzed by the differencing process. And if the movement of the body parts indicate velocity, acceleration and “jerkiness” that compare positively with the stored digital signals depicting movements of known criminal physical assaults, it is likely that a crime is in progress here.
  • An alarm is generated the instant any of the above conditions is established.
  • This alarm condition will result in sending police or guards to the crime site, activating the high resolution CCD/TV camera to record the face of the person committing the assault, and automatically activating a loudspeaker that plays a recorded announcement warning the perpetrator of the seriousness of the actions now being undertaken and demanding that he cease the criminal act. After dark, a strong light will be turned on automatically.
  • the automated responses will be actuated the instant an alarm condition is determined by the processor. Furthermore, an alarm signal is sent to the police station, and the same video signal of the event is transmitted to a court appointed data collection office, to the Public Defender's office and the District Attorney's Office.
  • Files of physical criminal acts which involve movements of body parts such as hands, arms, elbows, shoulders, head, torso, legs, and feet can be reviewed to ascertain this pattern.
  • a priority can be set by experiments and simulations of physical criminal acts gathered from “dramas” enacted by professional actors; data gathered from experienced muggers who have been caught by the police, as well as from victims who have reported details of their experiences, will help the actors perform accurately.
  • Video of the motions involved in these simulated acts can be stored in digitized form, and files prepared for the signature motion of each of the body parts involved in the simulated physical criminal acts.
  • the above described Abnormality Detection System includes an RF-ID (Radio Frequency Identification) tag or card to assist in the detection and tracking of individuals within the field of view of a camera.
  • Such cards or tags could be used by authorized individuals to respond when queried by the RF interrogator.
  • the response signal of the tag has a propagation pattern which is adequately registered with the video sensor.
  • the card or tag, when sensed in video, would be assumed friendly and authorized. This information would simplify the segmentation process.
  • each RF-ID card will be turned ON when a positive response to an interrogation signal is established.
  • the light will appear on the computer generated grid (also on the screen of the monitor) and the intersection of tracks clearly indicated, followed by their physical interaction. But also noted will be the intersection between the tagged and the untagged individuals. In all of such cases, the segmentation process will be simpler.
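One way the RF-ID responses could simplify segmentation is to mark every visual track that coincides with a positive tag response as authorized, leaving full analysis for the remaining tracks and for interactions between tagged and untagged individuals. The sketch below is a hypothetical illustration; the registration radius and data formats are assumed:

```python
import math

def label_tracks(tracks, tag_positions, radius=80.0):
    """Mark each visual track as 'authorized' if an RF-ID response registers near it.

    tracks: dict of track id -> (x, y) current position on the image/grid.
    tag_positions: list of (x, y) positions where interrogated tags responded.
    radius: assumed registration tolerance between the RF pattern and the video grid.
    """
    labels = {}
    for tid, (x, y) in tracks.items():
        tagged = any(math.hypot(x - tx, y - ty) <= radius
                     for tx, ty in tag_positions)
        labels[tid] = "authorized" if tagged else "unknown"
    return labels
```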
  • the applications of the present invention include banks, ATMs, hotels, schools, residence halls and dormitories, office and residential buildings, hospitals, sidewalks, street crossings, parks, containers and container loading areas, shipping piers, train stations, truck loading stations, airport passenger and freight facilities, bus stations, subway stations, theaters, concert halls, sport arenas, libraries, churches, museums, stores, shopping malls, restaurants, convenience stores, bars, coffee shops, gasoline stations, highway rest stops, tunnels, bridges, gateways, sections of highways, toll booths, warehouses and depots, factories and assembly rooms, and law enforcement facilities including jails. Any location or facility, civilian or military, requiring security would be a likely application.
  • a tiny CCD/TV camera hidden in the ceiling or the rearview mirror of the car, and focused through a pinhole lens on the driver's seat, may be connected to the video processor to record the face of the driver.
  • the camera is triggered by the automatic word recognition processor that will identify the well known expressions commonly used by the car-jacker.
  • the video picture will be recorded and then transmitted via cellular phone in the car. Without a phone, the short video recording of the face of the car-jacker will be held until the car is found by the police, but now with the evidence (the picture of the car-jacker) in hand.
  • the security personnel manning the monitors are alerted only to video images which show suspicious actions (criminal activities) within a prescribed observation zone.
  • the security personnel are therefore used to assess the accuracy of the detected crime and determine the necessary actions for an appropriate response.
  • by using computers to effectively filter out all normal and noncriminal video signals from observation areas, fewer security personnel are required to survey and “secure” a greater overall area (including a greater number of observation areas, i.e., cameras).
  • a battery operated portable version of the video system would automatically identify known objects in its field of view and a speech synthesizer would “say” the object. For example, “chair”, “table”, etc. would indicate the presence of a chair and a table.
  • At least two and perhaps three cameras are used simultaneously to cover the area. Should one camera sense a first level of criminal action, the other two could be manipulated to provide a three dimensional perspective coverage of the action.
  • the three dimensional image of a physical interaction in the policed area would allow observation of a greater number of details associated with the steps: accost, threat, assault, response and post response.
  • the conversion process from the two dimensional image to the three dimensional image is achieved by use of the known Radon transform.
  • both video and acoustic information is sampled and analyzed.
  • the acoustic information is sampled and analyzed in a similar manner to the sampling and analyzing of the above-described video information.
  • the audio information is sampled and analyzed in a manner shown in FIG. 4 , and is based on prior art.
  • ASR: Automatic Speech Recognition
  • a conventional automatic word recognition system including an input microphone system 40 , an analysis subsystem 42 , a template subsystem 44 , a pattern comparator 46 , and a post-processor and decision logic subsystem 48 .
  • the acoustic/audio policing system will begin sampling all (or a selected portion) of nearby acoustic signals.
  • the acoustic signals will include voices and background noise.
  • the background noise signals are generally known and predictable, and may therefore be easily filtered out using conventional filtering techniques.
  • the expected noise signals include unfamiliar speech, automotive-related sounds, honking, sirens, and the sound of wind and/or rain.
  • the microphone input system 40 picks up the acoustic signals, immediately filters out the predictable background noise signals, and amplifies the remaining recognizable acoustic signals.
  • the filtered acoustic signals are analyzed in the analysis subsystem 42 which processes the signals by means of digital and spectral analysis techniques.
  • the output of the analysis subsystem is compared in the pattern comparator subsystem 46 with selected predetermined words stored in the template memory 44.
  • the post processing and decision logic subsystem 48 generates an alarm signal, as described below.
  • the templates 44 include perhaps about 100 brief and easily recognizable terse expressions, some of which are single words, and are commonly used by those intent on a criminal act.
  • Some examples of commonly used word phrases spoken by a criminal to a victim prior to a mugging include: “Give me your money”, “This is a stick-up”, “Give me your wallet and you won't get hurt” . . . etc.
  • commonly used replies from a typical victim during such a mugging may also be stored as template words, such as “help”, and certain sounds such as shrieks, screams and groans, etc.
  • the output of the word recognition system shown in FIG. 4 is used as a trigger signal to activate a sound recorder, or a camera used elsewhere in the invention, as described below.
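Once the recognizer produces text, the template comparison can be reduced to phrase spotting against the stored expressions. The sketch below uses a few of the phrases quoted above; the matching rule and the trigger callbacks are assumptions, not the patent's method:

```python
# A few of the template expressions mentioned above; a real system would store ~100.
TEMPLATE_PHRASES = [
    "give me your money",
    "this is a stick-up",
    "give me your wallet",
    "help",
]

def spot_phrases(transcript):
    """Return the template phrases found in a recognised utterance."""
    text = transcript.lower()
    return [phrase for phrase in TEMPLATE_PHRASES if phrase in text]

def on_utterance(transcript, trigger_camera, trigger_recorder):
    """Trigger the sound recorder and the camera when a template phrase is heard."""
    hits = spot_phrases(transcript)
    if hits:
        trigger_recorder()
        trigger_camera()
    return hits
```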
  • the preferred microphone used in the microphone input subsystem 40 is a shot-gun microphone, such as those commercially available from the Sennheiser Company of Frankfurt, Germany. These microphones have a supercardioid propagation pattern. However, the gain of the pattern may be too small for high traffic areas and may therefore require more than one microphone in an array configuration to adequately focus and track in these areas. The propagation pattern of the microphone system enables better focusing on a moving sound source (e.g., a person walking and talking).
  • a conventional directional microphone may also be used in place of a shot-gun type microphone, such as those made by the Sony Corporation of Tokyo, Japan. Such directional microphones will achieve similar gain to the shot-gun type microphones, but with a smaller physical structure.
  • a feedback loop circuit (not specifically shown) originating in the post processing subsystem 48 will direct the microphone system to track a particular dynamic source of sound within the area surveyed by video cameras.
  • An override signal from the video portion of the present invention will activate and direct the microphone system towards the direction of the field of view of the camera.
  • the video system will control the audio recording system towards the scene of interest.
  • the audio system will direct appropriate video cameras to visually cover and record the apparent source of the sound.
  • HMM: hidden Markov model
  • ANN: artificial neural network
  • the HMM applies a probabilistic statistical procedure in recognizing words.
  • an estimate is made of the means and covariance of the probabilistic model of each word, e.g., those words which are considered likely to be uttered in an interaction.
  • the various ways in which any given word is pronounced permit the spectral parameters of the word to be an effective descriptor of the model.
  • the steps involved in recognizing an input of an unknown word consists of computing the likelihood that the word was generated by each of the models developed during the training. The word is considered as “recognized” when its model gives the highest score.
  • the evaluation of conditional probabilities of one particular unit followed by the same or another word unit is also part of the computation.
  • the HMM system employed with the present invention allows both the audio and video systems to operate quickly and use HMM probability statistics to predict future movements or words based on an early recognition of initial movements and word stems.
  • the HMM system may be equally employed in the video recognition system. For example, if a person's arm quickly moves above his head, the HMM system may determine that there is a high probability that the arm will quickly come down, perhaps indicating a criminal intent.
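As a concrete, simplified view of the HMM scoring step described above, the sketch below evaluates the forward-algorithm likelihood of an observation sequence under each trained word model and reports the best-scoring word. Real recognizers use continuous (e.g., Gaussian mixture) observation models and numerical scaling; the discrete symbols and all names here are assumptions made for brevity:

```python
import numpy as np

def forward_log_likelihood(obs, start_p, trans_p, emit_p):
    """Log-likelihood of a discrete observation sequence under one word HMM.

    obs: sequence of observation symbol indices.
    start_p: (S,) initial state probabilities.
    trans_p: (S, S) state transition probabilities.
    emit_p: (S, V) emission probabilities over the symbol vocabulary.
    """
    alpha = start_p * emit_p[:, obs[0]]            # forward variables at t = 0
    for o in obs[1:]:
        alpha = (alpha @ trans_p) * emit_p[:, o]   # standard forward recursion
    return float(np.log(alpha.sum() + 1e-300))

def recognize(obs, word_models):
    """Score the sequence against every word model; the highest score wins."""
    scores = {word: forward_log_likelihood(obs, *params)
              for word, params in word_models.items()}
    return max(scores, key=scores.get), scores
```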

Abstract

A surveillance system having at least one primary video camera for translating real images of a zone into electronic video signals at a first level of resolution. The system includes means for sampling movements of an individual or individuals located within the zone from the video signal output from at least one video camera. Video signals of sampled movements of the individual are electronically compared with known characteristics of movements which are indicative of individuals having a criminal intent. The level of criminal intent of the individual or individuals is then determined and an appropriate alarm signal is produced.

Description

CROSS-REFERENCE TO PATENT APPLICATION
This application is a continuation of U.S. patent application Ser. No. 12/466,350, filed May 14, 2009, now U.S. Pat. No. Re. 43,147, which is a Reissue of U.S. patent application Ser. No. 08/367,712, filed Jan. 3, 1995, now U.S. Pat. No. 5,666,157, each of which is incorporated herein by reference in its entirety.
FIELD OF THE INVENTION
This invention generally relates to surveillance systems, and more particularly, to trainable surveillance systems which detect and respond to specific abnormal video and audio input signals.
BACKGROUND OF THE INVENTION
Today's surveillance systems vary in complexity, efficiency and accuracy. Earlier surveillance systems used several closed circuit cameras, each connected to a dedicated monitor. This type of system works sufficiently well for low-coverage sites, i.e., areas requiring up to perhaps six cameras. In such a system, a single person could scan the six monitors, in “real” time, and effectively monitor the entire (albeit small) protected area, offering a relatively high level of readiness to respond to an abnormal act or situation observed within the protected area. In this simplest of surveillance systems, it is left to the discretion of security personnel to determine, first, if there is any abnormal event in progress within the protected area, second, the level of concern placed on that particular event, and third, what actions should be taken in response to the particular event. The reliability of the entire system depends on the alertness and efficiency of the worker observing the monitors.
Many surveillance systems, however, require the use of a greater number of cameras (e.g., more than six) to police a larger area, such as at least every room located within a large museum. To adequately ensure reliable and complete surveillance within the protected area, either more personnel must be employed to constantly watch the additionally required monitors (one per camera), or fewer monitors may be used on a simple rotation schedule wherein one monitor sequentially displays the output images of several cameras, displaying the images of each camera for perhaps a few seconds. In another prior art surveillance system (referred to as the “QUAD” system), four cameras are connected to a single monitor whose screen continuously and simultaneously displays the four different images. In a “quaded quad” prior art surveillance system, sixteen cameras are linked to a single monitor whose screen now displays, continuously and simultaneously, all sixteen different images. These improvements allow fewer personnel to adequately supervise the monitors covering the larger protected area.
These improvements, however, still require the constant attention of at least one person. The above described multiple-image/single screen systems suffered from poor resolution and complex viewing. The reliability of the entire system is still dependent on the alertness and efficiency of the security personnel watching the monitors. The personnel watching the monitors are still burdened with identifying an abnormal act or condition shown on one of the monitors, determining which camera, and which corresponding zone of the protected area is recording the abnormal event, determining the level of concern placed on the particular event, and finally, determining the appropriate actions that must be taken to respond to the particular event.
Eventually, it was recognized that human personnel could not reliably monitor the “real-time” images from one or several cameras for long “watch” periods of time. It is natural for any person to become bored while performing a monotonous task, such as staring at one or several monitors continuously, waiting for something unusual or abnormal to occur, something which may never occur.
As discussed above, it is the human link which lowers the overall reliability of the entire surveillance system. U.S. Pat. No. 4,737,847 issued to Araki et al. discloses an improved abnormality surveillance system wherein motion sensors are positioned within a protected area to first determine the presence of an object of interest, such as an intruder. In the system disclosed by U.S. Pat. No. 4,737,847, zones having prescribed “warning levels” are defined within the protected area. The zone in which an object or person is detected, the zones to which it moves, and the length of time the detected object or person remains in a particular zone determine whether the object or person entering the zone should be considered an abnormal event or a threat.
The surveillance system disclosed in U.S. Pat. No. 4,737,847 does remove some of the monitoring responsibility otherwise placed on human personnel; however, such a system can only determine an intruder's “intent” by his presence relative to particular zones. The actual movements and sounds of the intruder are not measured or observed. A skilled criminal could easily determine the warning levels of obvious zones within a protected area and act accordingly, spending little time in zones having a high warning level, for example.
It is therefore an object of the present invention to provide a surveillance system which overcomes the problems of the prior art.
It is another object of the invention to provide such a surveillance system wherein a potentially abnormal event is determined by a computer prior to summoning a human supervisor.
It is another object of the invention to provide a surveillance system which compares specific measured movements of a particular person or persons with a trainable, predetermined set of “typical” movements to determine the level and type of a criminal or mischievous event.
It is another object of this invention to provide a surveillance system which transmits the data from various sensors to a location where it can be recorded for evidentiary purposes. It is another object of this invention to provide such a surveillance system which is operational day and night.
It is another object of this invention to provide a surveillance system which can cull out real-time events which indicate criminal intent using a weapon, by resolving the low temperature of the weapon relative to the higher body temperature and by recognizing the stances taken by the person with the weapon.
It is yet another object of this invention to provide a surveillance system which eliminates or reduces the number of TV monitors and guards presently required to identify abnormal events, as this system will perform this function in near real time.
INCORPORATED BY REFERENCE
The content of the following references is hereby incorporated by reference.
1. Motz, L. and L. Bergstein, “Zoom Lens Systems”, Journal of the Optical Society of America, three papers in Vol. 52, 1992.
2. Aviv, D. G., “Sensor Software Assessment of Advanced Earth Resources Satellite Systems”, ARC Inc. Report #70-80-A, pp. 2-107 through 2-119; NASA contract NAS-1-16366.
3. Shio, A. and J. Sklansky, “Segmentation of People in Motion”, Proc. of IEEE Workshop on Visual Motion, Princeton, N.J., October 1991.
4. Agarwal, R. and J. Sklansky, “Estimating Optical Flow from Clustered Trajectory Velocity Time”.
5. Suzuki, S. and J. Sklansky, “Extracting Non-Rigid Moving Objects by Temporal Edges”, IEEE Transactions of Pattern Recognition, 1992.
6. Rabiner, L. and Biing-Hwang Juang, “Fundamentals of Speech Recognition”, Prentice Hall, 1993, pp. 434-495.
7. Waibel, A. and Kai-Fu Lee, Eds., “Readings in Speech Recognition”, Morgan Kaufmann, 1990, pp. 267-296.
8. Rabiner, L., “Application of Voice Processing to Telecommunication”, Proc. IEEE, Vol. 82, No. 2, February 1994.
SUMMARY OF THE INVENTION
A preferred embodiment of the herein disclosed invention involves a surveillance system having at least one primary video camera for translating real images of a zone into electronic video signals at a first level of resolution and means for sampling movements within the zone from the video camera output. These elements are combined with means for electronically comparing the sampled movements with known characteristics of movements which are indicative of individuals engaged in criminal activity and means for determining the level of such criminal activity. Associated therewith are means for activating at least one secondary sensor and associated recording device having a second higher level of resolution, said activating means being in response to determining a predetermined level of criminal activity.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a schematic block diagram of the video, analysis, control, alarm and recording subsystems of an embodiment of this invention;
FIG. 2A illustrates a frame K of a video camera's output of a particular environment, according to the invention, showing four representative objects (people) A, B, C, and D, wherein objects A, B and D are moving in a direction indicated with arrows, and object C is not moving;
FIG. 2B illustrates a frame K+5 of the video camera's output, according to the invention, showing objects A, B, and D are stationary, and object C is moving;
FIG. 2C illustrates a frame K+10 of the video camera's output, according to the invention, showing the current locations of objects A, B, C, D, and E;
FIG. 2D illustrates a frame K+11 of the video camera's output, according to the invention, showing object B next to object C, and object E moving to the right;
FIG. 2E illustrates a frame K+12 of the video camera's output, according to the invention, showing a potential crime taking place between objects B and C;
FIG. 2F illustrates a frame K+13 of the video camera's output, according to the invention, showing objects B and C interacting;
FIG. 2G illustrates a frame K+15 of the video camera's output, according to the invention, showing object C moving to the right and object B following;
FIG. 2H illustrates a frame K+16 of the video camera's output, according to the invention, showing object C moving away from a stationary object B;
FIG. 2I illustrates a frame K+17 of the video camera's output, according to the invention, showing object B moving towards object C;
FIG. 3A illustrates a frame of a video camera's output, according to the invention, showing a “two on one” interaction of objects (people) A, B, and C;
FIG. 3B illustrates a later frame of the video camera's output of FIG. 3A, according to the invention, showing objects A and C moving towards object B;
FIG. 3C illustrates a later frame of the video camera's output of FIG. 3B, according to the invention, showing objects A and C moving in close proximity to object B;
FIG. 3D illustrates a later frame of the video camera's output of FIG. 3C, according to the invention, showing objects A and C quickly moving away from object B;
FIG. 4 is a schematic block diagram of a conventional word recognition system which may be employed in the invention.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
Referring to FIG. 1, the picture input means 10 may be any conventional electronic picture pickup device operational within the infrared or visual spectrum (or both), including a vidicon or a CCD/TV camera of moderate resolution, e.g., a camera about 1½ inches in length and about 1 inch in diameter, weighing about 3 ounces, including, for particular deployments, a zoom lens attachment. This device is intended to operate continuously and translate the field-of-view (“real”) images within a first observation area into conventional video electronic signals.
Alternatively, a high rate camera/recorder, up to 300 frames/sec (similar to those made by NAC Visual Systems of Woodland Hills, Calif., SONY and others) may be used as the picture input means 10. This would enable the detection of even the very rapid movement of body parts that are indicative of criminal intent, and their recording, as hereinbelow described. The more commonly used camera operates at 30 frames per second and cannot capture such quick body movement with sufficient resolution.
Picture input means 10, instead of operating continuously, may be activated by an “alert” signal from the processor of the low resolution camera or from the audio/word recognition processor when sensing a suspicious event.
Picture input means 10 contains a preprocessor which normalizes a wide range of illumination levels, especially for outside observation. The preprocessor emulates a vertebrate's retina, which has an efficient and accurate normalization process. One such preprocessor (VLSI retina chip) is fabricated by the Carver Mead Laboratory of the California Institute of Technology in Pasadena, Calif. Use of this particular preprocessor chip will increase the automated vision capability of this invention whenever variation of light intensity and light reflection may otherwise weaken the picture resolution.
The signals from the picture input means 10 are converted into digitized signals and then sent to the picture processing means 12. The processor means controlling each group of cameras will be governed by an artificial intelligence system, based on dynamic pattern recognition principles, as further described below. Picture processing means 12 includes an image raster analyzer which effectively segments each image to isolate each pair of people. The image raster analyzer subsystem of picture processing means 12 segments each sampled image to identify and isolate each pair of objects (or people), and each “two on one” group of three people separately.
The “two on one” grouping represents a common mugging situation in which two individuals approach a victim, one from in front of the victim and the other from behind. The forward mugger tells the potential victim that if he does not give up his money (or watch, ring, etc.), the second mugger will shoot, stab or otherwise harm him. The group of three people will thus be considered a potential crime in progress and will therefore be segmented and analyzed in the picture processing means 12.
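A minimal illustration of how segmented objects might be grouped into pairs and "two on one" triples by mutual proximity is sketched below; the distance threshold, input format, and function names are assumptions rather than anything prescribed by the patent:

```python
from itertools import combinations
import math

def group_interactions(centroids, near=120.0):
    """Group detected people into pairs and triples of mutually close individuals.

    centroids: dict mapping an object id to its (x, y) image position.
    near: assumed maximum pixel distance at which two people are treated as interacting.
    """
    def close(a, b):
        (x1, y1), (x2, y2) = centroids[a], centroids[b]
        return math.hypot(x1 - x2, y1 - y2) <= near

    pairs = [p for p in combinations(centroids, 2) if close(*p)]
    triples = [t for t in combinations(centroids, 3)
               if all(close(a, b) for a, b in combinations(t, 2))]
    return pairs, triples   # each group is analysed separately downstream
```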
With respect to a zoom lens system useful as an element in the picture input means 10, the essentials of the zoom lens subsystem are described in three papers written by L. Motz and L. Bergstein, in an article titled “Zoom Lens Systems” in the Journal of Optical Society of America, Vol. 52, April, 1992. This article is hereby incorporated by reference.
The essence of the zoom system is to vary the focal length such that an object being observed will be focused and magnified at its image plane. In an automatic version of the zoom system, once an object is in the camera's field-of-view (FOV), the lens moves to focus the object onto the camera's image plane. An error signal, which is used to correct the focus at the image plane, is generated by dividing the CCD array into two halves and measuring the difference between the signals in each half until the object is at the center. Dividing the CCD array into more than two segments, say four quadrants, is a way to achieve automatic centering, as is the case with mono-pulse radar. Regardless of the number of segments, the error signal is used to generate the desired tracking of the object.
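The quadrant-difference error signal can be illustrated with a toy computation: intensities are summed over halves of the sensor and the imbalance drives the centering correction, analogous to mono-pulse radar. All names and scaling below are assumptions, not the patent's design:

```python
import numpy as np

def centering_error(image):
    """Compute horizontal/vertical error signals from a 2x2 split of the sensor.

    A positive x_error means the bright object sits to the right of centre,
    so the tracking loop should steer right; similarly for y_error.
    """
    h, w = image.shape[:2]
    top, bottom = image[: h // 2], image[h // 2 :]
    left, right = image[:, : w // 2], image[:, w // 2 :]
    total = float(image.sum()) or 1.0            # avoid division by zero on a dark frame
    x_error = (float(right.sum()) - float(left.sum())) / total
    y_error = (float(bottom.sum()) - float(top.sum())) / total
    return x_error, y_error                      # fed to the zoom/tracking servo loop
```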
In a wide field-of-view (WFOV) operation, there may be more than one object; thus special attention is given to the design of the zoom system and its associated software and firmware control. Assuming three objects, as in the “two on one” potential mugging threat described above, and that the three persons are all in one plane, one can program a shifting from one object to the next, from one face to another face, in a prescribed sequential order. Moreover, as the objects move within the WFOV they will be automatically tracked in azimuth and elevation. In principle, the zoom would focus on the nearest object, assuming that the amount of light on each object is the same, so that the prescribed sequence starting from the closest object will proceed to the remaining objects from, for example, right to left.
However, when the three objects are located in different planes, but still within the camera's WFOV, the zoom, with input from the segmentation subsystem of the picture analysis means 12 will focus on the object closest to the right hand side of the image plane, and then proceed to move the focus to the left, focusing on the next object and on the next sequentially.
In all of the above cases, the automatic zoom can more naturally choose to home-in on the person with the brightest emission or reflection, and then proceed to the next brightness and so forth. This would be a form of an intensity/time selection multiplex zoom system.
The relative positioning of the input camera with respect to the area under surveillance will affect the accuracy with which the image raster analyzer segments each image. In this preferred embodiment, it is beneficial for the input camera to view the area under surveillance from a point located directly above, e.g., with the input camera mounted high on a wall, a utility tower, or a traffic light support tower. The height of the input camera is preferably sufficient to minimize occlusion between the input camera and the movement of the individuals under surveillance.
Once the objects within each sampled video frame are segmented (i.e., detected and isolated), an analysis is made of the detailed movements of each object located within each particular segment of each image, and their relative movements with respect to the other objects.
Each image frame segment, once digitized, is stored in a frame by frame memory storage of picture processing means 12. Each frame from the picture input means 10 is subtracted from a previous frame already stored in processing means 12 using any conventional differencing process. The differencing process involving multiple differencing steps takes place in the processing section 12. The resulting difference signal (outputted from the differencing sub-section of means 12) of each image indicates all the changes that have occurred from one frame to the next. These changes include any movements of the individuals located within the segment and any movements of their limbs, e.g., arms.
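A rough illustration of the differencing step follows, assuming the OpenCV and NumPy libraries; the patent does not prescribe any particular implementation, and all names are illustrative:

```python
import cv2
import numpy as np

def frame_difference(prev_frame, curr_frame, threshold=25):
    """Return a binary mask of pixels that changed between two sampled frames."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(curr_gray, prev_gray)               # per-pixel change
    _, mask = cv2.threshold(diff, threshold, 255, cv2.THRESH_BINARY)
    return mask

def changed_fraction(mask):
    """Fraction of the segment that moved; a crude activity measure."""
    return float(np.count_nonzero(mask)) / mask.size
```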
Referring to FIG. 3, a collection of differencing signals for each moved object over subsequent sampled frames of images (called a “track”) allows a determination of the type, speed and direction (vector) of each motion involved; further processing will extract acceleration, i.e., the rate of change of velocity, and the change in acceleration with respect to time (called “jerkiness”), and correlate these with stored signatures of known physical criminal acts. For example, subsequent differencing signals may reveal that an individual's arm is moving to a high position, such as the upper limit of that arm's motion (i.e., above his head), at a fast speed. This particular movement could be perceived, as described below, as a hostile movement associated with possible criminal activity requiring the expert analysis of security personnel.
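To make the velocity, acceleration, and "jerkiness" computation concrete, a finite-difference sketch over a track of centroid positions follows; the sampling interval, units, and names are assumptions:

```python
import numpy as np

def track_kinematics(positions, dt):
    """Estimate speed, acceleration, and jerk magnitudes from a track of (x, y) centroids.

    positions: array of shape (N, 2), one centroid per sampled frame.
    dt: time between sampled frames in seconds (e.g. 5/30 s when sampling
        every 5th frame of a 30 frame/s camera).
    """
    p = np.asarray(positions, dtype=float)
    v = np.gradient(p, dt, axis=0)          # velocity (pixels per second)
    a = np.gradient(v, dt, axis=0)          # acceleration
    j = np.gradient(a, dt, axis=0)          # "jerkiness": change in acceleration over time
    return (np.linalg.norm(v, axis=1),
            np.linalg.norm(a, axis=1),
            np.linalg.norm(j, axis=1))
```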
The intersection of two tracks indicates the intersection of two moved objects. The intersecting objects, in this case, could be merely the two hands of two people greeting each other, or, depending on other characteristics, as described below, the intersecting objects could be interpreted as the fist of an assailant contacting the face of a victim in a less friendly greeting. In any event, the intersection of two tracks immediately requires further analysis and/or the summoning of security personnel. But the generation of an alarm, via light and sound devices located, for example, on a monitor, will turn a guard's attention only to that monitor; hence the labor savings. In general, however, a friendly interaction between individuals is a much slower physical process than is a physical assault vis-a-vis the body parts of the individuals involved. Hence, friendly interactions may be easily distinguished from hostile physical acts using current low pass and high pass filters, and current pattern recognition techniques based on experimental reference data.
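One way such an intersection test and speed-based distinction could look is sketched below; the distance and speed thresholds, the pixel units, and the two-way “friendly”/“hostile” labeling are assumptions for illustration only.

```python
import numpy as np

def classify_contact(track_a: np.ndarray, track_b: np.ndarray, dt: float,
                     distance_thresh: float = 20.0,
                     speed_thresh: float = 200.0):
    """Flag frames where two tracks intersect (come very close) and label the
    contact by closing speed: fast contact suggests an assault, slow a greeting."""
    n = min(len(track_a), len(track_b))
    gaps = np.linalg.norm(track_a[:n] - track_b[:n], axis=1)   # separation per frame
    closing_speed = np.abs(np.diff(gaps)) / dt                 # units per second
    for i in range(1, n):
        if gaps[i] < distance_thresh:
            return "hostile" if closing_speed[i - 1] > speed_thresh else "friendly"
    return None   # tracks never intersected
```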
When a large number of sensors (called a sensor suite) is distributed over a large number of facilities, for example, a number of ATMs (automatic teller machines) associated with particular bank branches in a particular state or states and all operated under a single bank network control, then only one monitor is required.
A commercially available software tool may enhance object-movement analysis between frames (called optical flow computation). With optical flow computation, specific (usually bright) reflective elements, called farkles, emitted from the clothing and/or the body parts of an individual in one frame are subtracted from a previous frame. The bright portions will inherently provide sharper detail and therefore will yield more accurate data regarding the velocities of the relative moving objects. Additional computation, as described below, will provide data regarding the acceleration and even the change in acceleration, or “jerkiness”, of each moving part sampled.
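The patent does not name a specific tool; as one hedged example, a dense optical-flow field between consecutive frames could be obtained with OpenCV's Farneback method, as sketched below, with the brightest regions then followed through that field. The helper names and parameter values are illustrative assumptions.

```python
import cv2
import numpy as np

def dense_flow(prev_gray: np.ndarray, next_gray: np.ndarray) -> np.ndarray:
    """Return per-pixel (dx, dy) motion vectors between two grayscale frames."""
    # Positional arguments: pyramid scale 0.5, 3 levels, window 15, 3 iterations,
    # polynomial neighborhood 5, sigma 1.2, no flags.
    return cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)

def brightest_points(frame_gray: np.ndarray, count: int = 10) -> np.ndarray:
    """Row/column indices of the brightest pixels ("farkles") to follow over time."""
    flat = np.argsort(frame_gray, axis=None)[-count:]
    return np.column_stack(np.unravel_index(flat, frame_gray.shape))
```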
The physical motions of the individuals involved in an interaction will be detected by first determining the edges of each person imaged. The movements of the body parts will then be observed by noting the movements of the edges of the body parts of the individuals involved in the interaction. The differencing process will enable the determination of the velocity, acceleration, and rate of change of acceleration of those body parts.
The now processed signal is sent to comparison means 14, which compares selected frames of the video signals from the picture input means 10 with “signature” video signals stored in memory 16. The signature signals are representative of various positions and movements of the body parts of an individual having various levels of criminal intent. The method for obtaining the data base of these signature video signals in accordance with another aspect of the invention is described in greater detail below.
If a comparison is made positive with one or more of the signature video signals, an output “alert” signal is sent from the comparison means 14 to a controller 18. The controller 18 controls the operation of a secondary, high resolution picture input means (video camera) 20 and a conventional monitor 22 and video recorder 24. The field of view of the secondary camera 20 is preferably, at most, the same as the field of view of the primary camera 10, surveying a second observation area. The recorder 24 may be located at the site and/or at both a law enforcement facility (not shown) and simultaneously at a court office or legal facility to prevent loss of incriminating information due to tampering.
The purpose of the secondary camera 20 is to provide a detailed video signal of the individual having assumed criminal intent and also to improve false positive and false negative performance. This information is recorded by the video recorder 24 and displayed on a monitor 22. An alarm bell or light (not shown) or both may be provided and activated by an output signal from the controller 18 to summon a supervisor to immediately view the pertinent video images showing the apparent crime in progress and assess its accuracy.
In still another embodiment of the invention, a VCR 26 operates continuously (using a 6 hour loop-tape, for example). The VCR 26 is controlled by the VCR controller 28. All the “real-time” images directly from the picture input means 10 are immediately recorded and stored for at least 6 hours, for example. Should it be determined that a crime is in progress, a signal from the controller 18 is sent to the VCR controller 28, changing the mode of recording from tape looping mode to non-looping mode. Once the VCR 26 is changed to a non-looping mode, the tape will not re-loop and will therefore retain the perhaps vital recorded video information of the surveyed site, including the crime itself, and the events leading up to the crime.
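The loop-tape behaves like a fixed-size circular buffer that stops discarding old material once the alarm arrives; a small sketch of that behavior, with a hypothetical LoopRecorder class standing in for the VCR 26 and its controller 28, is given below.

```python
from collections import deque

class LoopRecorder:
    """Keeps only the most recent frames (the loop tape) until an alarm switches
    recording to non-looping mode, after which nothing further is discarded."""

    def __init__(self, max_frames: int):
        self._loop = deque(maxlen=max_frames)   # oldest frames fall off automatically
        self._retained = []                     # frames kept permanently after the alarm
        self.looping = True

    def record(self, frame):
        if self.looping:
            self._loop.append(frame)
        else:
            self._retained.append(frame)

    def alarm(self):
        """Switch to non-looping mode, preserving the pre-crime footage on the loop."""
        self._retained = list(self._loop)
        self.looping = False
```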
When the non-looping mode is initiated, the video signal may also be transmitted to a VCR located elsewhere; for example, at a law enforcement facility and, simultaneously to other secure locations of the Court and its associated offices.
Prior to the video signals being compared with the “signature” signals stored in memory, each sampled frame of video is “segmented” into parts relating to the objects detected therein. To segment a video signal, the video signal derived from the vidicon or CCD/TV camera is analyzed by an image raster analyzer. Although this process causes slight signal delays, it is accomplished nearly in real time.
At certain sites, or in certain situations, a high resolution camera may not be required or otherwise used. For example, the resolution provided by a relatively simple and low cost camera may be sufficient. Depending on the level of security for the particular location being surveyed, and the time of day, the length of frame intervals between analyzed frames may vary. For example, in a high risk area, every frame from the CCD/TV camera may be analyzed continuously to ensure that the maximum amount of information is recorded prior to and during a crime. In a low risk area, it may be preferred to sample perhaps every 10 frames from each camera, sequentially.
If, during such a sampling, it is determined that an abnormal or suspicious event is occurring, such as two people moving very close to each other, then the system would activate an alert mode wherein the system becomes “concerned and curious” about the suspicious actions and the sampling rate is increased to perhaps every 5 frames or even every frame. As described in greater detail below, depending on the type of system employed (i.e., video only, audio only or both), during such an alert mode, the entire system may be activated wherein both the audio and video systems begin to sample the environment for sufficient information to determine the intent of the actions.
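The risk- and alert-dependent sampling interval can be expressed as a tiny policy function; the specific intervals below simply echo the examples in the text (every 10th frame, every 5th frame, every frame) and are not fixed by the patent.

```python
def frames_between_samples(high_risk_area: bool, alert_mode: bool) -> int:
    """Return how many frames to advance between analyzed frames."""
    if high_risk_area:
        return 1      # analyze every frame continuously in a high risk area
    if alert_mode:
        return 5      # suspicious activity: tighten sampling to every 5th frame
    return 10         # routine low risk surveillance: every 10th frame
```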
Referring to FIG. 2, several frames of a particular camera output are shown to illustrate the segmentation process performed in accordance with the invention. The system begins to sample at frame K and determines that there are four objects (previously determined to be people, as described below), A-D located within a particular zone being policed. Since nothing unusual is determined from the initial analysis, the system does not warrant an “alert” status. People A, B, and D are moving according to normal, non-criminal intent, as could be observed.
A crime likelihood is indicated when frames K+10 through K+13 are analyzed by the differencing process. If the movements of the body parts indicate velocity, acceleration and “jerkiness” that compare positively with the stored digital signals depicting movements of known criminal physical assaults, it is likely that a crime is in progress.
Additionally, if a high velocity of departure is indicated when person C moves away from person B, as indicated in frames K+15 through K+17, a larger level of confidence is attained in deciding that a physical criminal act has taken place or is about to.
An alarm is generated the instant any of the above conditions is established. This alarm condition will result in sending Police or Guards to the crime site, activating the high resolution CCD/TV camera to record the face of the person committing the assault, and automatically activating a loudspeaker playing a recorded announcement warning the perpetrator of the seriousness of the actions now being undertaken and demanding that he cease the criminal act. After dark a strong light will be turned on automatically. The automated responses will be actuated the instant an alarm condition is determined by the processor. Furthermore, an alarm signal is sent to the police station, and the same video signal of the event is transmitted to a court appointed data collection office, to the Public Defender's office and the District Attorney's Office.
As described above, it is necessary to compare the resulting signature of the body-part motion involved in a physical criminal act, expressed by specific motion characteristics (i.e., velocity, acceleration, change of acceleration), with a set of signature files of physical criminal acts in which body-part motions are equally involved. This comparison is commonly referred to as pattern matching and is part of the pattern recognition process.
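A minimal sketch of such a pattern-matching step, assuming the observed and stored signatures are reduced to fixed-length feature vectors of velocity, acceleration, and change-of-acceleration values, might look like the following; the distance measure and tolerance are illustrative choices, not the patent's.

```python
import numpy as np

def matches_signature(observed: np.ndarray,
                      signature_files: list,
                      tolerance: float = 0.2) -> bool:
    """Return True if the observed motion signature is close enough to any
    stored signature of a known physical criminal act."""
    for signature in signature_files:
        # Relative Euclidean distance between the observed and stored signature.
        distance = np.linalg.norm(observed - signature) / (np.linalg.norm(signature) + 1e-9)
        if distance < tolerance:
            return True
    return False
```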
Files of physical criminal acts, which involve body part movements such as hands, arms, elbows, shoulders, head, torso, legs, and feet, can be reviewed to ascertain this pattern. In addition, a reference data base can be built from experiments and simulations of physical criminal acts gathered from “dramas” enacted by professional actors; data gathered from experienced muggers who have been caught by the police, as well as from victims who have reported details of their experiences, will help the actors perform accurately. Video of the motions involved in these simulated acts can be stored in digitized form and files prepared for the signature motion of each of the body parts involved in the simulated physical criminal acts.
In another embodiment, the above described Abnormality Detection System includes an RF-ID (Radio Frequency Identification) tag or card to assist in the detection and tracking of individuals within the field of view of a camera. Such cards or tags could be used by authorized individuals to respond when queried by the RF interrogator. The response signal of the tag has a propagation pattern which is adequately registered with the video sensor. The card or tag, when sensed in video, would be assumed friendly and authorized. This information would simplify the segmentation process.
A light connected to each RF-ID card will be turned ON when a positive response to an interrogation signal is established. The light will appear on the computer generated grid (also on the screen of the monitor), and the intersection of tracks will be clearly indicated, followed by their physical interaction. Also noted will be the intersection between tagged and untagged individuals. In all of such cases, the segmentation process will be simpler.
There are many manufacturers of RF-ID cards and interrogators; three major ones are the David Sarnoff Research Center of Princeton, N.J., AMTECH of Dallas, Tex. and MICRON Technology of Boise, Id.
The applications of the present invention include banks, ATMs, hotels, schools, residence halls and dormitories, office and residential buildings, hospitals, sidewalks, street crossings, parks, containers and container loading areas, shipping piers, train stations, truck loading stations, airport passenger and freight facilities, bus stations, subway stations, theaters, concert halls, sport arenas, libraries, churches, museums, stores, shopping malls, restaurants, convenience stores, bars, coffee shops, gasoline stations, highway rest stops, tunnels, bridges, gateways, sections of highways, toll booths, warehouses and depots, factories and assembly rooms, and law enforcement facilities including jails. Any location or facility, civilian or military, requiring security would be a likely application.
Further applications of this invention are in moving platforms: automobiles, trucks, buses, subway cars, train cars, both freight and passenger, boats, ships (passenger and freight), tankers, service and construction vehicles, on and off-road, containers and their carriers, and airplanes, and also in equivalent military and sensitive mobile platforms.
As a deterrent to car-jacking, a tiny CCD/TV camera hidden in the ceiling or the rearview mirror of the car, and focused through a pinhole lens on the driver's seat, may be connected to the video processor to record the face of the driver. The camera is triggered by the automatic word recognition processor, which will identify the well known expressions commonly used by the car-jacker. The video picture will be recorded and then transmitted via the cellular phone in the car. Without a phone, the short video recording of the face of the car-jacker will be held until the car is found by the police, but now with the evidence (the picture of the car-jacker) in hand.
In this present surveillance system, the security personnel manning the monitors are alerted only to video images which show suspicious actions (criminal activities) within a prescribed observation zone. The security personnel are therefore used to assess the accuracy of the crime determination and decide the necessary actions for an appropriate response. By using computers to effectively filter out all normal and noncriminal video signals from observation areas, fewer security personnel are required to survey and “secure” a greater overall area (including a greater number of observation areas, i.e., cameras).
It is also contemplated that the present system could be applied to help blind people “see”. A battery operated portable version of the video system would automatically identify known objects in its field of view, and a speech synthesizer would “say” the object. For example, “chair”, “table”, etc. would indicate the presence of a chair and a table.
Depending on the area to be policed, it is preferable that at least two and perhaps three cameras (or video sensors) are used simultaneously to cover the area. Should one camera sense a first level of criminal action, the other two could be manipulated to provide a three dimensional perspective coverage of the action. The three dimensional image of a physical interaction in the policed area would allow observation of a greater number of details associated with the steps: accost, threat, assault, response and post response. The conversion process from the two dimensional image to the three dimensional image is achieved by use of the known Radon transform.
In the extended operation phase of the invention, the accumulation of more details of the physical variation of the movement characteristics of physical threats and assaults against a victim, together with speaker-independent (male or female, of different age groups) and dialect-independent words and terse sentences, with corresponding responses, will enable automatic recognition of a criminal assault without the need of a guard, unless required by statutes and other external requirements.
In another embodiment of the present invention, both video and acoustic information is sampled and analyzed. The acoustic information is sampled and analyzed in a similar manner to the sampling and analyzing of the above-described video information. The audio information is sampled and analyzed in a manner shown in FIG. 4, and is based on prior art.
The employment of the audio speech band, with its associated Automatic Speech Recognition (ASR) system, will not only reduce the false alarm rate resulting from the video analysis, but can also be used to trigger the video and other sensors if the sound threat predates the observed threat.
Referring to FIG. 4, a conventional automatic word recognition system is shown, including an input microphone system 40, an analysis subsystem 42, a template subsystem 44, a pattern comparator 46, and a post-processor and decision logic subsystem 48.
In operation, upon activation, the acoustic/audio policing system will begin sampling all (or a selected portion) of nearby acoustic signals. The acoustic signals will include voices and background noise. The background noise signals are generally known and predictable, and may therefore be easily filtered out using conventional filtering techniques. Among the expected noise signals are unfamiliar speech, automotive-related sounds, honking, sirens, and the sound of wind and/or rain.
The microphone input system 40 picks up the acoustic signals, immediately filters out the predictable background noise signals, and amplifies the remaining recognizable acoustic signals. The filtered acoustic signals are analyzed in the analysis subsystem 42, which processes the signals by means of digital and spectral analysis techniques. The output of the analysis subsystem is compared in the pattern comparator subsystem 46 with selected predetermined words stored in the template memory 44. The post processing and decision logic subsystem 48 generates an alarm signal, as described below.
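As one hedged illustration of the analysis subsystem 42, the sketch below windows a microphone buffer and keeps only the magnitude spectrum of the speech band; the band limits and FFT-based approach are common-practice assumptions rather than details specified by the patent.

```python
import numpy as np

def speech_band_spectrum(samples: np.ndarray, sample_rate: int) -> np.ndarray:
    """Magnitude spectrum of a microphone buffer, restricted to the speech band."""
    windowed = samples * np.hanning(len(samples))          # reduce spectral leakage
    spectrum = np.fft.rfft(windowed)
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    speech_band = (freqs >= 300) & (freqs <= 3400)         # roughly the telephone band
    return np.abs(spectrum[speech_band])
```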
The templates 44 include perhaps about 100 brief and easily recognizable terse expressions, some of which are single words, and are commonly used by those intent on a criminal act. Some examples of commonly used word phrases spoken by a criminal to a victim prior to a mugging, for example, include: “Give me your money”, “This is a stick-up”, “Give me your wallet and you won't get hurt” . . . etc. Furthermore, commonly used replies from a typical victim during such a mugging may also be stored as template words, such as “help”, and certain sounds such as shrieks, screams and groans, etc.
The specific word templates, with which the input acoustic sounds are compared, must be chosen carefully, taking into account the particular accents and slang of the language spoken in the region of concern. Hence, a statistical averaging of the spectral content of each word must be used.
The output of the word recognition system shown in FIG. 4 is used as a trigger signal to activate a sound recorder, or a camera used elsewhere in the invention, as described below.
The preferred microphone used in the microphone input subsystem 40 is a shot-gun microphone, such as those commercially available from the Sennheiser Company of Frankfurt, Germany. These microphones have a supercardioid propagation pattern. However, the gain of the pattern may be too small for high traffic areas and may therefore require more than one microphone in an array configuration to adequately focus and track in these areas. The propagation pattern of the microphone system enables better focusing on a moving sound source (e.g., a person walking and talking). A conventional directional microphone may also be used in place of a shot-gun type microphone, such as those made by the Sony Corporation of Tokyo, Japan. Such directional microphones will achieve similar gain to the shot-gun type microphones, but with a smaller physical structure.
A feedback loop circuit (not specifically shown) originating in the post processing subsystem 48 will direct the microphone system to track a particular dynamic source of sound within the area surveyed by video cameras.
An override signal from the video portion of the present invention will activate and direct the microphone system towards the direction of the field of view of the camera. In other words, should the video system detect a potential crime in progress, the video system will control the audio recording system towards the scene of interest. Likewise, should the audio system detect words of an aggressive nature, as described above, the audio system will direct appropriate video cameras to visually cover and record the apparent source of the sound.
A number of companies have developed very accurate and efficient, speaker independent word recognition systems based on a hidden Markov model (HMM) in combination with an artificial neural network (ANN). These companies include IBM of Armonk, N.Y., AT&T Bell Laboratories, Kurzweil of Cambridge, Mass. and Lernout and Hauspie of Belgium.
Put briefly, the HMM applies a probabilistic statistical procedure to recognizing words. In the training steps, an estimate is made of the means and covariances of the probabilistic model of each word, e.g., those words which are considered likely to be uttered in an interaction. The various ways in which any given word is pronounced permit the spectral parameters of the word to be an effective descriptor of the model. The steps involved in recognizing an unknown input word consist of computing the likelihood that the word was generated by each of the models developed during the training. The word is considered “recognized” when its model gives the highest score. Finally, since words are composed of word units, the evaluation of the conditional probability of one particular unit being followed by the same or another word unit is also part of the computation.
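A compact sketch of this scoring step, using the standard HMM forward algorithm over discrete observation symbols, is given below; the discrete-symbol simplification and the structure of the hypothetical word_models dictionary are assumptions made for brevity.

```python
import numpy as np

def forward_log_likelihood(obs, start, trans, emit):
    """Log-likelihood that one word's HMM (start, trans, emit) generated obs."""
    alpha = start * emit[:, obs[0]]                 # initialize with the first symbol
    for symbol in obs[1:]:
        alpha = (alpha @ trans) * emit[:, symbol]   # propagate, then weight by emission
    return np.log(alpha.sum() + 1e-300)

def recognize(obs, word_models):
    """Return the template word whose model scores the observation highest."""
    return max(word_models,
               key=lambda word: forward_log_likelihood(obs, *word_models[word]))
```

Here word_models would map each stored template word to its trained (start, trans, emit) arrays, so the recognized word is simply the best-scoring model, as described above.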
The resulting list of potential words is considerably shorter than the entire list of all spoken words of the English language. Therefore, the HMM system employed with the present invention allows both the audio and video systems to operate quickly and use HMM probability statistics to predict future movements or words based on an early recognition of initial movements and word stems.
The HMM system may be equally employed in the video recognition system. For example, if a person's arm quickly moves above his head, the HMM system may determine that there is a high probability that the arm will quickly come down, perhaps indicating a criminal intent.
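Carrying the same idea over to motion, the one-step prediction of the next motion state from the current belief and a trained transition matrix can be expressed as follows; the state labels mentioned in the comment are hypothetical examples.

```python
import numpy as np

def predict_next_state(current_belief: np.ndarray, trans: np.ndarray) -> np.ndarray:
    """One-step HMM prediction: probability of each motion state at the next sample.
    For example, a state such as 'arm raised above head' may assign high
    probability to 'arm coming down fast', flagging possible criminal intent."""
    return current_belief @ trans
```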
While certain embodiments of the invention have been described for illustrative purposes, it is to be understood that there may be various other modifications and embodiments within the scope of the invention as defined by the following claims.

Claims (5)

What is claimed:
1. A surveillance system, comprising:
a) a video camera for translating real images of an area into electronic video signals;
b) means for sampling movements of an individual located within the area from said electronic video signals of said video camera;
c) means for electronically comparing said sampled movements with predetermined characteristics of movements;
d) means for predicting future movements of said individual based on said electronic comparing means of said sampled movements; and
e) means for generating a signal responsive to predetermined predicted future movements.
2. The surveillance system in accordance with claim 1, wherein said signal generating means activates a video signal recorder for recording said video signals from said camera.
3. The surveillance system in accordance with claim 1, wherein said signal generating means activates a microphone for receiving audible information of said individual located in said area.
4. The surveillance system in accordance with claim 1, wherein said signal generating means activates at least one high resolution camera.
5. A surveillance system, comprising:
a video camera capable of generating electronic video signals based on real images of an area viewed by the video camera, the electronic video signals comprising a first resolution, wherein the video camera is further capable of varying a focal length of the video camera in response to a video signal of the at least one individual;
a movement sampler capable of sampling movements of at least one individual in the generated electronic video signals;
a movement comparer capable of comparing sampled movements of the at least one individual with predetermined movement characteristics and filtering out background noise;
a future movement predictor capable of predicting future movements of the at least one individual based on the compared sampled movements of the at least one individual with the predetermined movement characteristics; and
an alert signal generator capable of generating an alert signal responsive to predicted future movements.
US13/361,438 1995-01-03 2012-01-30 Abnormality detection and surveillance system Expired - Lifetime USRE44527E1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/361,438 USRE44527E1 (en) 1995-01-03 2012-01-30 Abnormality detection and surveillance system

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US08/367,712 US5666157A (en) 1995-01-03 1995-01-03 Abnormality detection and surveillance system
US12/466,350 USRE43147E1 (en) 1995-01-03 2009-05-14 Abnormality detection and surveillance system
US13/361,438 USRE44527E1 (en) 1995-01-03 2012-01-30 Abnormality detection and surveillance system

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US08/367,712 Reissue US5666157A (en) 1995-01-03 1995-01-03 Abnormality detection and surveillance system

Publications (1)

Publication Number Publication Date
USRE44527E1 true USRE44527E1 (en) 2013-10-08

Family

ID=45508283

Family Applications (3)

Application Number Title Priority Date Filing Date
US08/367,712 Ceased US5666157A (en) 1995-01-03 1995-01-03 Abnormality detection and surveillance system
US12/466,350 Expired - Lifetime USRE43147E1 (en) 1995-01-03 2009-05-14 Abnormality detection and surveillance system
US13/361,438 Expired - Lifetime USRE44527E1 (en) 1995-01-03 2012-01-30 Abnormality detection and surveillance system

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US08/367,712 Ceased US5666157A (en) 1995-01-03 1995-01-03 Abnormality detection and surveillance system
US12/466,350 Expired - Lifetime USRE43147E1 (en) 1995-01-03 2009-05-14 Abnormality detection and surveillance system

Country Status (3)

Country Link
US (3) US5666157A (en)
IL (1) IL116647A (en)
WO (1) WO1997042764A1 (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9710708B1 (en) * 2014-03-24 2017-07-18 Vecna Technologies, Inc. Method and apparatus for autonomously recognizing at least one object in an image
US20170275134A1 (en) * 2014-11-26 2017-09-28 Otis Elevator Company Elevator security and control system based on passenger movement
US20180046864A1 (en) * 2016-08-10 2018-02-15 Vivint, Inc. Sonic sensing
US9984154B2 (en) 2015-05-01 2018-05-29 Morpho Detection, Llc Systems and methods for analyzing time series data based on event transitions
US20190361471A1 (en) * 2018-05-22 2019-11-28 Capital One Services, Llc Preventing image or video capture of input data provided to a transaction device
US10510239B1 (en) 2018-06-14 2019-12-17 Honeywell International Inc. Systems and methods for managing alert notifications from a secured area
US10733457B1 (en) 2019-03-11 2020-08-04 Wipro Limited Method and system for predicting in real-time one or more potential threats in video surveillance
US11386211B2 (en) 2018-12-19 2022-07-12 Capital One Services, Llc Obfuscation of input data provided to a transaction device
US11750639B2 (en) 2021-04-05 2023-09-05 Bank Of America Corporation ATM-based anomaly and security threat detection

Families Citing this family (297)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5910854A (en) 1993-02-26 1999-06-08 Donnelly Corporation Electrochromic polymeric solid films, manufacturing electrochromic devices using such solid films, and processes for making such solid films and devices
JPH08214201A (en) * 1994-11-28 1996-08-20 Canon Inc Image pickup device
US5666157A (en) 1995-01-03 1997-09-09 Arc Incorporated Abnormality detection and surveillance system
US6028626A (en) * 1995-01-03 2000-02-22 Arc Incorporated Abnormality detection and surveillance system
US6891563B2 (en) 1996-05-22 2005-05-10 Donnelly Corporation Vehicular vision system
US7542035B2 (en) * 1995-11-15 2009-06-02 Ford Oxaal Method for interactively viewing full-surround image data and apparatus therefor
DE19601005A1 (en) * 1996-01-15 1997-07-17 Bosch Gmbh Robert Process for the detection of moving objects in successive images
US5969755A (en) * 1996-02-05 1999-10-19 Texas Instruments Incorporated Motion based event detection system and method
KR100213055B1 (en) * 1996-07-27 1999-08-02 윤종용 Recording media saving type recording method of a supervisory system
US5953055A (en) * 1996-08-08 1999-09-14 Ncr Corporation System and method for detecting and analyzing a queue
WO1998011494A1 (en) * 1996-09-16 1998-03-19 Advanced Research Solutions, Llc Data correlation and analysis tool
JPH10150656A (en) * 1996-09-20 1998-06-02 Hitachi Ltd Image processor and trespasser monitor device
US6031573A (en) * 1996-10-31 2000-02-29 Sensormatic Electronics Corporation Intelligent video information management system performing multiple functions in parallel
US5974235A (en) * 1996-10-31 1999-10-26 Sensormatic Electronics Corporation Apparatus having flexible capabilities for analysis of video information
US5917958A (en) * 1996-10-31 1999-06-29 Sensormatic Electronics Corporation Distributed video data base with remote searching for image data features
EP1458187A3 (en) * 1996-10-31 2004-11-10 Sensormatic Electronics Corporation Intelligent video information management system
US5875304A (en) * 1996-10-31 1999-02-23 Sensormatic Electronics Corporation User-settable features of an intelligent video information management system
US5884042A (en) * 1996-10-31 1999-03-16 Sensormatic Electronics Corporation Data identification in an intelligent video information management system
US6035341A (en) * 1996-10-31 2000-03-07 Sensormatic Electronics Corporation Multimedia data analysis in intelligent video information management system
AU7947501A (en) * 1996-10-31 2002-01-03 Sensormatic Electronics Corporation Intelligent video information management system
US5875305A (en) * 1996-10-31 1999-02-23 Sensormatic Electronics Corporation Video information management system which provides intelligent responses to video data content features
US5886738A (en) * 1996-11-21 1999-03-23 Detection Dynamics Inc. Apparatus within a street lamp for remote surveillance
US6462775B1 (en) 1996-11-21 2002-10-08 Detection Dynamics, Inc. Apparatus within a street lamp for remote surveillance having directional antenna
US5973732A (en) * 1997-02-19 1999-10-26 Guthrie; Thomas C. Object tracking system for monitoring a controlled space
US5943140A (en) 1997-03-14 1999-08-24 Monroe; David Method and apparatus for sending and receiving facsimile transmissions over a non-telephonic transmission system
JP3812985B2 (en) * 1997-04-04 2006-08-23 富士通株式会社 Automatic monitoring device
US6173284B1 (en) * 1997-05-20 2001-01-09 University Of Charlotte City Of Charlotte Systems, methods and computer program products for automatically monitoring police records for a crime profile
US6172605B1 (en) * 1997-07-02 2001-01-09 Matsushita Electric Industrial Co., Ltd. Remote monitoring system and method
US6097429A (en) * 1997-08-01 2000-08-01 Esco Electronics Corporation Site control unit for video security system
US6069655A (en) * 1997-08-01 2000-05-30 Wells Fargo Alarm Services, Inc. Advanced video security system
US6091771A (en) * 1997-08-01 2000-07-18 Wells Fargo Alarm Services, Inc. Workstation for video security system
US6326613B1 (en) 1998-01-07 2001-12-04 Donnelly Corporation Vehicle interior mirror assembly adapted for containing a rain sensor
US6124886A (en) 1997-08-25 2000-09-26 Donnelly Corporation Modular rearview mirror assembly
US6172613B1 (en) 1998-02-18 2001-01-09 Donnelly Corporation Rearview mirror assembly incorporating vehicle information display
US8294975B2 (en) 1997-08-25 2012-10-23 Donnelly Corporation Automotive rearview mirror assembly
WO1999021145A1 (en) * 1997-10-20 1999-04-29 Industrial Research Limited An improved surveillance system
WO1999027488A1 (en) * 1997-11-25 1999-06-03 Currency Systems International Commercial currency destruction
US5940118A (en) * 1997-12-22 1999-08-17 Nortel Networks Corporation System and method for steering directional microphones
US8288711B2 (en) 1998-01-07 2012-10-16 Donnelly Corporation Interior rearview mirror system with forwardly-viewing camera and a control
US6278377B1 (en) 1999-08-25 2001-08-21 Donnelly Corporation Indicator for vehicle accessory
US6445287B1 (en) 2000-02-28 2002-09-03 Donnelly Corporation Tire inflation assistance monitoring system
AU2223999A (en) 1998-01-12 1999-07-26 David Monroe Apparatus for capturing, converting and transmitting a visual image signal via adigital transmission system
AU2223799A (en) * 1998-01-12 1999-07-26 David A. Monroe Apparatus and method for selection of circuit in multi-circuit communications device
US6636748B2 (en) * 1998-01-12 2003-10-21 David A. Monroe Method and apparatus for image capture, compression and transmission of a visual image over telephone or radio transmission system
US7184074B1 (en) * 1998-01-20 2007-02-27 Rolf Jansen Tractor/trailer back up kit
KR100457506B1 (en) 1998-02-25 2005-01-17 삼성전자주식회사 Monitoring system and method thereof using pc having screen acquisition board
US6420975B1 (en) 1999-08-25 2002-07-16 Donnelly Corporation Interior rearview mirror sound processing system
US6477464B2 (en) 2000-03-09 2002-11-05 Donnelly Corporation Complete mirror-based global-positioning system (GPS) navigation solution
US6329925B1 (en) 1999-11-24 2001-12-11 Donnelly Corporation Rearview mirror assembly with added feature modular display
US6693517B2 (en) 2000-04-21 2004-02-17 Donnelly Corporation Vehicle mirror assembly communicating wirelessly with vehicle accessories and occupants
GB2337146B (en) * 1998-05-08 2000-07-19 Primary Image Limited Method and apparatus for detecting motion across a surveillance area
DE19827835B4 (en) * 1998-06-23 2012-01-19 Robert Bosch Gmbh Image transmission method and apparatus
US6853302B2 (en) * 2001-10-10 2005-02-08 David A. Monroe Networked personal security system
US20040068583A1 (en) * 2002-10-08 2004-04-08 Monroe David A. Enhanced apparatus and method for collecting, distributing and archiving high resolution images
US7634662B2 (en) * 2002-11-21 2009-12-15 Monroe David A Method for incorporating facial recognition technology in a multimedia surveillance system
US7576770B2 (en) * 2003-02-11 2009-08-18 Raymond Metzger System for a plurality of video cameras disposed on a common network
US20030202101A1 (en) * 2002-04-29 2003-10-30 Monroe David A. Method for accessing and controlling a remote camera in a networked system with multiple user support capability and integration to other sensor systems
US7428002B2 (en) * 2002-06-05 2008-09-23 Monroe David A Emergency telephone with integrated surveillance system connectivity
US7131136B2 (en) * 2002-07-10 2006-10-31 E-Watch, Inc. Comprehensive multi-media surveillance and response system for aircraft, operations centers, airports and other commercial transports, centers and terminals
US20030025599A1 (en) * 2001-05-11 2003-02-06 Monroe David A. Method and apparatus for collecting, sending, archiving and retrieving motion video and still images and notification of detected events
US7228429B2 (en) * 2001-09-21 2007-06-05 E-Watch Multimedia network appliances for security and surveillance applications
US7023913B1 (en) * 2000-06-14 2006-04-04 Monroe David A Digital security multimedia sensor
US20030061325A1 (en) * 2001-09-21 2003-03-27 Monroe David A. Method and apparatus for interconnectivity between legacy security systems and networked multimedia security surveillance system
US20020170064A1 (en) * 2001-05-11 2002-11-14 Monroe David A. Portable, wireless monitoring and control station for use in connection with a multi-media surveillance system having enhanced notification functions
US7057647B1 (en) * 2000-06-14 2006-06-06 E-Watch, Inc. Dual-mode camera system for day/night or variable zoom operation
US20020097322A1 (en) * 2000-11-29 2002-07-25 Monroe David A. Multiple video display configurations and remote control of multiple video signals transmitted to a monitoring station over a network
US7197228B1 (en) * 1998-08-28 2007-03-27 Monroe David A Multifunction remote control system for audio and video recording, capture, transmission and playback of full motion and still images
US6970183B1 (en) * 2000-06-14 2005-11-29 E-Watch, Inc. Multimedia surveillance and monitoring system including network configuration
US20080201505A1 (en) * 2003-01-08 2008-08-21 Monroe David A Multimedia data collection device for a host with a single available input port
US20030067542A1 (en) 2000-10-13 2003-04-10 Monroe David A. Apparatus for and method of collecting and distributing event data to strategic security personnel and response vehicles
US6628835B1 (en) * 1998-08-31 2003-09-30 Texas Instruments Incorporated Method and system for defining and recognizing complex events in a video sequence
US7229012B1 (en) 1998-10-09 2007-06-12 Diebold, Incorporated System and method for capturing and searching image data associated with transactions
US7533806B1 (en) 1998-10-09 2009-05-19 Diebold, Incorporated Reading of image data bearing record for comparison with stored user image in authorizing automated banking machine access
US7900823B1 (en) 1998-10-09 2011-03-08 Diebold, Incorporated Banking system controlled by data bearing records
US7389914B1 (en) 1998-10-09 2008-06-24 Diebold, Incorporated Method of capturing and communicating correlated data of check transaction at card reading automated banking machine
US7147147B1 (en) 2005-07-18 2006-12-12 Diebold, Incorporated System and method for capturing and searching image data associated with transactions
US6583813B1 (en) 1998-10-09 2003-06-24 Diebold, Incorporated System and method for capturing and searching image data associated with transactions
US7583290B2 (en) * 1998-10-09 2009-09-01 Diebold, Incorporated Cash dispensing automated banking machine with improved fraud detection capabilities
US20020007510A1 (en) * 1998-10-29 2002-01-24 Mann W. Stephen G. Smart bathroom fixtures and systems
US6515586B1 (en) 1998-12-18 2003-02-04 Intel Corporation Tactile tracking systems and methods
US6518881B2 (en) * 1999-02-25 2003-02-11 David A. Monroe Digital communication system for law enforcement use
US6545601B1 (en) 1999-02-25 2003-04-08 David A. Monroe Ground based security surveillance system for aircraft and other commercial vehicles
US6333759B1 (en) 1999-03-16 2001-12-25 Joseph J. Mazzilli 360 ° automobile video camera system
US6285297B1 (en) * 1999-05-03 2001-09-04 Jay H. Ball Determining the availability of parking spaces
WO2000068908A1 (en) * 1999-05-07 2000-11-16 Safety Adherence Technology (Pty) Ltd Surveillance system
GB9918248D0 (en) 1999-08-04 1999-10-06 Matra Bae Dynamics Uk Ltd Improvements in and relating to surveillance systems
US6424370B1 (en) * 1999-10-08 2002-07-23 Texas Instruments Incorporated Motion based event detection system and method
US6954859B1 (en) * 1999-10-08 2005-10-11 Axcess, Inc. Networked digital security system and methods
US6421080B1 (en) * 1999-11-05 2002-07-16 Image Vault Llc Digital surveillance system with pre-event recording
US6401066B1 (en) 1999-11-09 2002-06-04 West Teleservices Holding Company Automated third party verification system
US7206746B1 (en) 1999-11-09 2007-04-17 West Corporation Third party verification system
US7130800B1 (en) 2001-09-20 2006-10-31 West Corporation Third party verification system
US6461872B1 (en) * 1999-11-17 2002-10-08 General Electric Company Poly(1,4-ethylene-2-piperazone) composition, method for production of a poly(1,4-ethylene-2-piperazone) composition, TCE-detecting method and sensor
US7042492B2 (en) * 1999-12-10 2006-05-09 The Stanley Works Automatic door assembly with video imaging device
US6707486B1 (en) * 1999-12-15 2004-03-16 Advanced Technology Video, Inc. Directional motion estimator
US6819353B2 (en) 1999-12-23 2004-11-16 Wespot Ab Multiple backgrounds
SE519700C2 (en) * 1999-12-23 2003-04-01 Wespot Ab Image Data Processing
US7479980B2 (en) * 1999-12-23 2009-01-20 Wespot Technologies Ab Monitoring system
US6774905B2 (en) 1999-12-23 2004-08-10 Wespot Ab Image data processing
US6940998B2 (en) 2000-02-04 2005-09-06 Cernium, Inc. System for automated screening of security cameras
DE20102477U1 (en) * 2000-02-22 2001-05-03 Wincor Nixdorf Gmbh & Co Kg Device for protecting self-service machines against manipulation
US7370983B2 (en) 2000-03-02 2008-05-13 Donnelly Corporation Interior mirror assembly with display
US7480149B2 (en) 2004-08-18 2009-01-20 Donnelly Corporation Accessory module for vehicle
US7167796B2 (en) 2000-03-09 2007-01-23 Donnelly Corporation Vehicle navigation system for use with a telematics system
WO2001064481A2 (en) 2000-03-02 2001-09-07 Donnelly Corporation Video mirror systems incorporating an accessory module
US7855755B2 (en) 2005-11-01 2010-12-21 Donnelly Corporation Interior rearview mirror assembly with display
US20060063752A1 (en) * 2000-03-14 2006-03-23 Boehringer Ingelheim Pharma Gmbh & Co. Kg Bicyclic heterocycles, pharmaceutical compositions containing them, their use, and processes for preparing them
US6396408B2 (en) 2000-03-31 2002-05-28 Donnelly Corporation Digital electrochromic circuit with a vehicle network
US6671389B1 (en) 2000-04-27 2003-12-30 International Business Machines Corporation Method and system for detecting digital camera failure
US6646676B1 (en) 2000-05-17 2003-11-11 Mitsubishi Electric Research Laboratories, Inc. Networked surveillance and control system
US9892606B2 (en) 2001-11-15 2018-02-13 Avigilon Fortress Corporation Video surveillance system employing video primitives
US7868912B2 (en) * 2000-10-24 2011-01-11 Objectvideo, Inc. Video surveillance system employing video primitives
US8711217B2 (en) 2000-10-24 2014-04-29 Objectvideo, Inc. Video surveillance system employing video primitives
US8564661B2 (en) 2000-10-24 2013-10-22 Objectvideo, Inc. Video analytic rule detection system and method
US20070107029A1 (en) * 2000-11-17 2007-05-10 E-Watch Inc. Multiple Video Display Configurations & Bandwidth Conservation Scheme for Transmitting Video Over a Network
US7839926B1 (en) 2000-11-17 2010-11-23 Metzger Raymond R Bandwidth management and control
US7698450B2 (en) * 2000-11-17 2010-04-13 Monroe David A Method and apparatus for distributing digitized streaming video over a network
DE60220379T2 (en) 2001-01-23 2008-01-24 Donnelly Corp., Holland IMPROVED VEHICLE LIGHTING SYSTEM
US7255451B2 (en) 2002-09-20 2007-08-14 Donnelly Corporation Electro-optic mirror cell
US7581859B2 (en) 2005-09-14 2009-09-01 Donnelly Corp. Display device for exterior rearview mirror
US8180643B1 (en) 2001-02-15 2012-05-15 West Corporation Script compliance using speech recognition and compilation and transmission of voice and text records to clients
US7966187B1 (en) 2001-02-15 2011-06-21 West Corporation Script compliance and quality assurance using speech recognition
US7739115B1 (en) 2001-02-15 2010-06-15 West Corporation Script compliance and agent feedback
US7191133B1 (en) 2001-02-15 2007-03-13 West Corporation Script compliance using speech recognition
US7664641B1 (en) 2001-02-15 2010-02-16 West Corporation Script compliance and quality assurance based on speech recognition and duration of interaction
CN1308866C (en) 2001-03-19 2007-04-04 迪布尔特有限公司 Automated banking machine processing system and method
US7424175B2 (en) 2001-03-23 2008-09-09 Objectvideo, Inc. Video segmentation using statistical pixel modeling
JP2004532475A (en) * 2001-05-15 2004-10-21 サイコジェニックス・インコーポレーテッド Systems and methods for monitoring behavioral information engineering
US20030040925A1 (en) * 2001-08-22 2003-02-27 Koninklijke Philips Electronics N.V. Vision-based method and apparatus for detecting fraudulent events in a retail environment
US7728870B2 (en) * 2001-09-06 2010-06-01 Nice Systems Ltd Advanced quality management and recording solutions for walk-in environments
US6559769B2 (en) * 2001-10-01 2003-05-06 Eric Anthony Early warning real-time security system
US20030095180A1 (en) * 2001-11-21 2003-05-22 Montgomery Dennis L. Method and system for size adaptation and storage minimization source noise correction, and source watermarking of digital data frames
US6873261B2 (en) * 2001-12-07 2005-03-29 Eric Anthony Early warning near-real-time security system
US6819758B2 (en) 2001-12-21 2004-11-16 West Corporation Method, system, and computer-readable media for performing speech recognition of indicator tones
US6845215B1 (en) 2002-01-09 2005-01-18 James Greenwold Body-carryable, digital storage medium, audio/video recording assembly
US6824281B2 (en) 2002-01-31 2004-11-30 Donnelly Corporation Vehicle accessory module
US20040075547A1 (en) * 2002-02-12 2004-04-22 Vojtech George L Commandable covert surveillance system
US6862343B1 (en) 2002-03-27 2005-03-01 West Corporation Methods, apparatus, scripts, and computer readable media for facilitating secure capture of sensitive data for a voice-based transaction conducted over a telecommunications network
US6804331B1 (en) 2002-03-27 2004-10-12 West Corporation Method, apparatus, and computer readable media for minimizing the risk of fraudulent receipt of telephone calls
CN100369487C (en) * 2002-04-25 2008-02-13 松下电器产业株式会社 Object detection device, object detection server, and object detection method
US7860222B1 (en) 2003-11-24 2010-12-28 Securus Technologies, Inc. Systems and methods for acquiring, accessing, and analyzing investigative information
US6918674B2 (en) 2002-05-03 2005-07-19 Donnelly Corporation Vehicle rearview mirror system
US6937702B1 (en) 2002-05-28 2005-08-30 West Corporation Method, apparatus, and computer readable media for minimizing the risk of fraudulent access to call center resources
DE10225023A1 (en) * 2002-06-06 2004-01-15 Diehl Munitionssysteme Gmbh & Co. Kg Monitoring methods in a means of transportation
EP1514246A4 (en) 2002-06-06 2008-04-16 Donnelly Corp Interior rearview mirror system with compass
US7329013B2 (en) 2002-06-06 2008-02-12 Donnelly Corporation Interior rearview mirror system with compass
US7403967B1 (en) 2002-06-18 2008-07-22 West Corporation Methods, apparatus, and computer readable media for confirmation and verification of shipping address data associated with a transaction
US6873256B2 (en) 2002-06-21 2005-03-29 Dorothy Lemelson Intelligent building alarm
US7190809B2 (en) * 2002-06-28 2007-03-13 Koninklijke Philips Electronics N.V. Enhanced background model employing object classification for improved background-foreground segmentation
US6778085B2 (en) * 2002-07-08 2004-08-17 James Otis Faulkner Security system and method with realtime imagery
EP1537550A2 (en) * 2002-07-15 2005-06-08 Magna B.S.P. Ltd. Method and apparatus for implementing multipurpose monitoring system
EP1543509A4 (en) * 2002-09-02 2008-09-24 Samsung Electronics Co Ltd Optical information storage medium and method of and apparatus for recording and/or reproducing information on and/or from the optical information storage medium
WO2004026633A2 (en) 2002-09-20 2004-04-01 Donnelly Corporation Mirror reflective element assembly
WO2004103772A2 (en) 2003-05-19 2004-12-02 Donnelly Corporation Mirror assembly for vehicle
US7310177B2 (en) 2002-09-20 2007-12-18 Donnelly Corporation Electro-optic reflective element assembly
WO2004045215A1 (en) * 2002-11-12 2004-05-27 Intellivid Corporation Method and system for tracking and behavioral monitoring of multiple objects moving throuch multiple fields-of-view
US7221775B2 (en) * 2002-11-12 2007-05-22 Intellivid Corporation Method and apparatus for computerized image background analysis
US7634334B2 (en) * 2002-11-22 2009-12-15 Monroe David A Record and playback system for aircraft
US7781172B2 (en) 2003-11-21 2010-08-24 Kimberly-Clark Worldwide, Inc. Method for extending the dynamic detection range of assay devices
AU2003296850A1 (en) * 2002-12-03 2004-06-23 3Rd Millenium Solutions, Ltd. Surveillance system with identification correlation
US6791603B2 (en) 2002-12-03 2004-09-14 Sensormatic Electronics Corporation Event driven video tracking system
US7643168B2 (en) * 2003-01-03 2010-01-05 Monroe David A Apparatus for capturing, converting and transmitting a visual image signal via a digital transmission system
JP4568009B2 (en) * 2003-04-22 2010-10-27 パナソニック株式会社 Monitoring device with camera cooperation
US20100002070A1 (en) 2004-04-30 2010-01-07 Grandeye Ltd. Method and System of Simultaneously Displaying Multiple Views for Video Surveillance
US20050007453A1 (en) * 2003-05-02 2005-01-13 Yavuz Ahiska Method and system of simultaneously displaying multiple views for video surveillance
US7529424B2 (en) * 2003-05-02 2009-05-05 Grandeye, Ltd. Correction of optical distortion by image processing
US7528881B2 (en) * 2003-05-02 2009-05-05 Grandeye, Ltd. Multiple object processing in wide-angle video camera
US7450165B2 (en) * 2003-05-02 2008-11-11 Grandeye, Ltd. Multiple-view processing in wide-angle video camera
US20040223054A1 (en) * 2003-05-06 2004-11-11 Rotholtz Ben Aaron Multi-purpose video surveillance
US20050028215A1 (en) * 2003-06-03 2005-02-03 Yavuz Ahiska Network camera supporting multiple IP addresses
US7286157B2 (en) * 2003-09-11 2007-10-23 Intellivid Corporation Computerized method and apparatus for determining field-of-view relationships among multiple image sensors
US7446924B2 (en) 2003-10-02 2008-11-04 Donnelly Corporation Mirror reflective element assembly including electronic component
US20050075836A1 (en) * 2003-10-03 2005-04-07 Jason Arthur Taylor Forensic person tracking method and apparatus
US7346187B2 (en) * 2003-10-10 2008-03-18 Intellivid Corporation Method of counting objects in a monitored environment and apparatus for the same
US7280673B2 (en) * 2003-10-10 2007-10-09 Intellivid Corporation System and method for searching for changes in surveillance video
US7308341B2 (en) 2003-10-14 2007-12-11 Donnelly Corporation Vehicle communication system
US7127083B2 (en) * 2003-11-17 2006-10-24 Vidient Systems, Inc. Video surveillance system with object detection and probability scoring based on object class
US7148912B2 (en) * 2003-11-17 2006-12-12 Vidient Systems, Inc. Video surveillance system in which trajectory hypothesis spawning allows for trajectory splitting and/or merging
US7088846B2 (en) * 2003-11-17 2006-08-08 Vidient Systems, Inc. Video surveillance system that detects predefined behaviors based on predetermined patterns of movement through zones
US7136507B2 (en) * 2003-11-17 2006-11-14 Vidient Systems, Inc. Video surveillance system with rule-based reasoning and multiple-hypothesis scoring
AU2004233453B2 (en) * 2003-12-03 2011-02-17 Envysion, Inc. Recording a sequence of images
GB2410391B (en) * 2003-12-03 2009-05-13 Safehouse Internat Inc Processing an output signal to generate an exemplar image from a period of activity
NZ536913A (en) * 2003-12-03 2006-09-29 Safehouse Internat Inc Displaying graphical output representing the topographical relationship of detectors and their alert status
US7664292B2 (en) * 2003-12-03 2010-02-16 Safehouse International, Inc. Monitoring an output from a camera
US20050200486A1 (en) * 2004-03-11 2005-09-15 Greer Richard S. Patient visual monitoring system
US7893985B1 (en) 2004-03-15 2011-02-22 Grandeye Ltd. Wide angle electronic camera with improved peripheral vision
US8427538B2 (en) * 2004-04-30 2013-04-23 Oncam Grandeye Multiple view and multiple object processing in wide-angle video camera
US7366359B1 (en) * 2004-07-08 2008-04-29 Grandeye, Ltd. Image processing of regions in a wide angle video camera
WO2006040687A2 (en) 2004-07-19 2006-04-20 Grandeye, Ltd. Automatically expanding the zoom capability of a wide-angle video camera
US20060062478A1 (en) * 2004-08-16 2006-03-23 Grandeye, Ltd., Region-sensitive compression of digital video
US8860780B1 (en) 2004-09-27 2014-10-14 Grandeye, Ltd. Automatic pivoting in a wide-angle video camera
US9141615B1 (en) 2004-11-12 2015-09-22 Grandeye, Ltd. Interactive media server
AR048477A1 (en) * 2004-11-19 2006-05-03 Alusud Argentina S R L PICO VERTEDOR OF THE TYPE EMPLOYED IN BOTTLES CONTAINERS OF LIQUID SUBSTANCES WITH VARIABLE VISCOSITY DEGREE
EP1827908B1 (en) 2004-12-15 2015-04-29 Magna Electronics Inc. An accessory module system for a vehicle window
US7894531B1 (en) 2005-02-15 2011-02-22 Grandeye Ltd. Method of compression for wide angle digital video
JP4345692B2 (en) * 2005-02-28 2009-10-14 ソニー株式会社 Information processing system, information processing apparatus and method, and program
JP4702598B2 (en) * 2005-03-15 2011-06-15 オムロン株式会社 Monitoring system, monitoring apparatus and method, recording medium, and program
AU2006338248B2 (en) * 2005-03-25 2011-01-20 Sensormatic Electronics, LLC Intelligent camera selection and object tracking
US7339607B2 (en) * 2005-03-25 2008-03-04 Yongyouth Damabhorn Security camera and monitor system activated by motion sensor and body heat sensor for homes or offices
US8139896B1 (en) * 2005-03-28 2012-03-20 Grandeye, Ltd. Tracking moving objects accurately on a wide-angle video
US20060236375A1 (en) 2005-04-15 2006-10-19 Tarik Hammadou Method and system for configurable security and surveillance systems
US7626749B2 (en) 2005-05-16 2009-12-01 Donnelly Corporation Vehicle mirror assembly with indicia at reflective element
US20070090972A1 (en) * 2005-06-10 2007-04-26 Monroe David A Airborne digital video recorder
US7822224B2 (en) 2005-06-22 2010-10-26 Cernium Corporation Terrain map summary elements
US7944468B2 (en) * 2005-07-05 2011-05-17 Northrop Grumman Systems Corporation Automated asymmetric threat detection using backward tracking and behavioral analysis
US9036028B2 (en) 2005-09-02 2015-05-19 Sensormatic Electronics, LLC Object tracking and alerts
US8508607B2 (en) * 2005-09-06 2013-08-13 Its-7 Method and system for a programmable camera for configurable security and surveillance systems
JP5061444B2 (en) * 2005-09-20 2012-10-31 ソニー株式会社 Imaging apparatus and imaging method
US8723951B2 (en) * 2005-11-23 2014-05-13 Grandeye, Ltd. Interactive wide-angle video server
JP4442571B2 (en) * 2006-02-10 2010-03-31 ソニー株式会社 Imaging apparatus and control method thereof
US20090260075A1 (en) * 2006-03-28 2009-10-15 Richard Gedge Subject identification
EP2005748B1 (en) * 2006-04-13 2013-07-10 Curtin University Of Technology Virtual observer
CN101443789B (en) 2006-04-17 2011-12-28 实物视频影像公司 video segmentation using statistical pixel modeling
ES2569411T3 (en) 2006-05-19 2016-05-10 The Queen's Medical Center Motion tracking system for adaptive real-time imaging and spectroscopy
US7671728B2 (en) 2006-06-02 2010-03-02 Sensormatic Electronics, LLC Systems and methods for distributed monitoring of remote sites
US7825792B2 (en) * 2006-06-02 2010-11-02 Sensormatic Electronics Llc Systems and methods for distributed monitoring of remote sites
ES2320826B1 (en) * 2006-12-27 2010-02-25 Universidad Nacional De Educacion A Distancia (Uned) PROCEDURE TO DESCRIBE THE GEOMETRIC BEHAVIOR OF HUMANS ON A SCENE CATCHED BY AN ARTIFICIAL VISION SYSTEM, BASED ON A MODEL OF BLOCKS AND, SPECIAL, ORIENTED TO THE VIDEO-SURVEILLANCE TASK.
US8295542B2 (en) 2007-01-12 2012-10-23 International Business Machines Corporation Adjusting a consumer experience based on a 3D captured image stream of a consumer response
US8269834B2 (en) 2007-01-12 2012-09-18 International Business Machines Corporation Warning a user about adverse behaviors of others within an environment based on a 3D captured image stream
US8588464B2 (en) * 2007-01-12 2013-11-19 International Business Machines Corporation Assisting a vision-impaired user with navigation based on a 3D captured image stream
US7667596B2 (en) * 2007-02-16 2010-02-23 Panasonic Corporation Method and system for scoring surveillance system footage
JP5121258B2 (en) * 2007-03-06 2013-01-16 株式会社東芝 Suspicious behavior detection system and method
US9135807B2 (en) * 2007-03-14 2015-09-15 Seth Cirker Mobile wireless device with location-dependent capability
US20100019927A1 (en) * 2007-03-14 2010-01-28 Seth Cirker Privacy ensuring mobile awareness system
US8749343B2 (en) * 2007-03-14 2014-06-10 Seth Cirker Selectively enabled threat based information system
US7595815B2 (en) * 2007-05-08 2009-09-29 Kd Secure, Llc Apparatus, methods, and systems for intelligent security and safety
US20080303902A1 (en) * 2007-06-09 2008-12-11 Sensomatic Electronics Corporation System and method for integrating video analytics and data analytics/mining
US8123419B2 (en) 2007-09-21 2012-02-28 Seth Cirker Privacy ensuring covert camera
US8013738B2 (en) * 2007-10-04 2011-09-06 Kd Secure, Llc Hierarchical storage manager (HSM) for intelligent storage of large volumes of data
WO2009045218A1 (en) * 2007-10-04 2009-04-09 Donovan John J A video surveillance, storage, and alerting system having network management, hierarchical data storage, video tip processing, and vehicle plate analysis
US20090213123A1 (en) * 2007-12-08 2009-08-27 Dennis Allard Crow Method of using skeletal animation data to ascertain risk in a surveillance system
ES2334617B1 (en) * 2008-02-13 2011-02-10 Jose Juan Blanch Puig SYSTEM AND PROCEDURE FOR MONITORING THE ACTIVITY OF A PERSON IN AN ENCLOSURE, AND SENSOR TO DETECT A PERSON IN A PREDETERMINED AREA.
US8154418B2 (en) 2008-03-31 2012-04-10 Magna Mirrors Of America, Inc. Interior rearview mirror system
US9487144B2 (en) 2008-10-16 2016-11-08 Magna Mirrors Of America, Inc. Interior mirror assembly with display
US8570374B2 (en) 2008-11-13 2013-10-29 Magna Electronics Inc. Camera for vehicle
EP2211319B1 (en) * 2009-01-27 2012-10-03 Research In Motion Limited A method and handheld electronic device for detecting and providing notification of a device drop
US8044818B2 (en) * 2009-01-27 2011-10-25 Research In Motion Limited Method and handheld electronic device for detecting and providing notification of a device drop
US8571261B2 (en) * 2009-04-22 2013-10-29 Checkvideo Llc System and method for motion detection in a surveillance video
EP2276007A1 (en) * 2009-07-17 2011-01-19 Nederlandse Organisatie voor toegepast-natuurwetenschappelijk onderzoek TNO Method and system for remotely guarding an area by means of cameras and microphones.
US9544379B2 (en) 2009-08-03 2017-01-10 Wolfram K. Gauglitz Systems and methods for event networking and media sharing
US10574614B2 (en) 2009-08-03 2020-02-25 Picpocket Labs, Inc. Geofencing of obvious geographic locations and events
US20120229633A1 (en) * 2011-03-10 2012-09-13 Thomas J Boucino Security System and Method for Network Equipment Rack Having Camera
KR101799443B1 (en) * 2011-05-02 2017-11-20 삼성전자주식회사 Method for surveying watching of video content, Broadcasting receiving apparatus and Server thereof
US9606209B2 (en) 2011-08-26 2017-03-28 Kineticor, Inc. Methods, systems, and devices for intra-scan motion correction
US9286516B2 (en) 2011-10-20 2016-03-15 Xerox Corporation Method and systems of classifying a vehicle using motion vectors
US10289917B1 (en) * 2013-11-12 2019-05-14 Kuna Systems Corporation Sensor to characterize the behavior of a visitor or a notable event
US9717461B2 (en) 2013-01-24 2017-08-01 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US10327708B2 (en) 2013-01-24 2019-06-25 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US9305365B2 (en) 2013-01-24 2016-04-05 Kineticor, Inc. Systems, devices, and methods for tracking moving targets
WO2014120734A1 (en) 2013-02-01 2014-08-07 Kineticor, Inc. Motion tracking system for real time adaptive motion compensation in biomedical imaging
US20140218515A1 (en) * 2013-02-04 2014-08-07 Systems Engineering Technologies Corporation Immediate action system
US9171213B2 (en) 2013-03-15 2015-10-27 Xerox Corporation Two-dimensional and three-dimensional sliding window-based methods and systems for detecting vehicles
US8971581B2 (en) 2013-03-15 2015-03-03 Xerox Corporation Methods and system for automated in-field hierarchical training of a vehicle detection system
JPWO2014174796A1 (en) * 2013-04-23 2017-02-23 日本電気株式会社 Information processing system, information processing method, and program
WO2014174798A1 (en) 2013-04-23 2014-10-30 日本電気株式会社 Information processing system, information processing method and storage medium
US20150185731A1 (en) * 2013-12-26 2015-07-02 Hyundai Motor Company Work-in-process inspection system using motion detection, and method thereof
PL406971A1 (en) 2014-01-28 2015-08-03 Politechnika Poznańska Method for analyzing of people behaviour in the intelligent monitoring system and the intelligent system of monitoring
US9195669B2 (en) * 2014-02-26 2015-11-24 Iboss, Inc. Detecting and managing abnormal data behavior
CN106572810A (en) 2014-03-24 2017-04-19 凯内蒂科尔股份有限公司 Systems, methods, and devices for removing prospective motion correction from medical imaging scans
WO2016014718A1 (en) 2014-07-23 2016-01-28 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US20160110791A1 (en) 2014-10-15 2016-04-21 Toshiba Global Commerce Solutions Holdings Corporation Method, computer program product, and system for providing a sensor-based environment
US9704266B2 (en) 2014-12-11 2017-07-11 Rdi, Llc Non-contacting monitor for bridges and civil structures
US10062411B2 (en) 2014-12-11 2018-08-28 Jeffrey R. Hay Apparatus and method for visualizing periodic motions in mechanical components
KR102297389B1 (en) 2015-01-05 2021-09-02 픽포켓, 아이엔시. Use of a dynamic geofence to control media sharing and aggregation associated with a mobile target
US10043146B2 (en) * 2015-02-12 2018-08-07 Wipro Limited Method and device for estimating efficiency of an employee of an organization
US9943247B2 (en) 2015-07-28 2018-04-17 The University Of Hawai'i Systems, devices, and methods for detecting false movements for motion correction during a medical imaging scan
JP6708385B2 (en) * 2015-09-25 2020-06-10 キヤノン株式会社 Discriminator creating device, discriminator creating method, and program
WO2017091479A1 (en) 2015-11-23 2017-06-01 Kineticor, Inc. Systems, devices, and methods for tracking and compensating for patient motion during a medical imaging scan
US10325625B2 (en) * 2015-12-04 2019-06-18 Amazon Technologies, Inc. Motion detection for A/V recording and communication devices
US10139281B2 (en) * 2015-12-04 2018-11-27 Amazon Technologies, Inc. Motion detection for A/V recording and communication devices
US20190208168A1 (en) * 2016-01-29 2019-07-04 John K. Collings, III Limited Access Community Surveillance System
US10408625B2 (en) 2016-04-19 2019-09-10 Toyota Motor Engineering & Manufacturing North America, Inc. Vision-assist systems with room scanners to detect and notify users of out-of-order room states
US10810441B2 (en) 2016-08-16 2020-10-20 Motorola Solutions, Inc. Systems and methods for identifying hierarchical structures of members of a crowd
US11545013B2 (en) * 2016-10-26 2023-01-03 A9.Com, Inc. Customizable intrusion zones for audio/video recording and communication devices
US10891839B2 (en) 2016-10-26 2021-01-12 Amazon Technologies, Inc. Customizable intrusion zones associated with security systems
US10410064B2 (en) 2016-11-11 2019-09-10 Nio Usa, Inc. System for tracking and identifying vehicles and pedestrians
US10708547B2 (en) 2016-11-11 2020-07-07 Nio Usa, Inc. Using vehicle sensor data to monitor environmental and geologic conditions
US10694357B2 (en) 2016-11-11 2020-06-23 Nio Usa, Inc. Using vehicle sensor data to monitor pedestrian health
US10455353B2 (en) 2016-12-22 2019-10-22 Motorola Solutions, Inc. Device, method, and system for electronically detecting an out-of-boundary condition for a criminal organization
US10477343B2 (en) 2016-12-22 2019-11-12 Motorola Solutions, Inc. Device, method, and system for maintaining geofences associated with criminal organizations
US10691950B2 (en) 2017-03-10 2020-06-23 Turing Video, Inc. Activity recognition method and system
US11475671B2 (en) 2017-05-26 2022-10-18 Turing Video Multiple robots assisted surveillance system
EP3673469A1 (en) 2017-12-04 2020-07-01 Siemens Mobility GmbH Automated detection of an emergency situation of one or more persons
US11734688B2 (en) * 2018-06-29 2023-08-22 Amazon Technologies, Inc. System to determine group association between users
US11423551B1 (en) 2018-10-17 2022-08-23 Rdi Technologies, Inc. Enhanced presentation methods for visualizing motion of physical structures and machinery
US10824935B2 (en) * 2018-10-17 2020-11-03 Mitsubishi Electric Research Laboratories, Inc. System and method for detecting anomalies in video using a similarity function trained by machine learning
US11151387B2 (en) * 2019-04-05 2021-10-19 Toyota Motor Engineering & Manufacturing North America, Inc. Camera system to detect unusual activities
US11402487B2 (en) * 2019-04-18 2022-08-02 GM Global Technology Operations LLC Joint radon transform association
US11373103B2 (en) * 2019-05-28 2022-06-28 Accenture Global Solutions Limited Artificial intelligence based system and method for predicting and preventing illicit behavior
US11132562B2 (en) 2019-06-19 2021-09-28 Toyota Motor Engineering & Manufacturing North America, Inc. Camera system to detect unusual circumstances and activities while driving
WO2021076523A1 (en) * 2019-10-13 2021-04-22 eConnect, Inc. Intelligent visual human behavior prediction
US11587384B1 (en) 2019-12-13 2023-02-21 Amazon Technologies, Inc. Group determination and association
US11373317B1 (en) 2020-01-24 2022-06-28 Rdi Technologies, Inc. Measuring the speed of rotation or reciprocation of a mechanical component using one or more cameras
US10943441B1 (en) 2020-06-05 2021-03-09 Bank Of America Corporation Image processing system and method for detecting errors in an ATM terminal
US11282213B1 (en) 2020-06-24 2022-03-22 Rdi Technologies, Inc. Enhanced analysis techniques using composite frequency spectrum data
US11322182B1 (en) 2020-09-28 2022-05-03 Rdi Technologies, Inc. Enhanced visualization techniques using reconstructed time waveforms
CN113486777A (en) * 2021-07-02 2021-10-08 北京一维大成科技有限公司 Behavior analysis method and device for target object, electronic equipment and storage medium

Citations (55)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4337482A (en) 1979-10-17 1982-06-29 Coutta John M Surveillance system
JPS58109373A (en) 1981-12-24 1983-06-29 フジテツク株式会社 Detector for abnormality in elevator cage
EP0183106A2 (en) 1984-11-10 1986-06-04 Matsushita Electric Works, Ltd. Visual image sensor system
JPS61260391A (en) 1985-05-14 1986-11-18 三菱電機株式会社 Monitor/controller
JPS62136988A (en) 1985-12-10 1987-06-19 Matsushita Electric Works Ltd Invasion monitoring device
US4692806A (en) 1985-07-25 1987-09-08 Rca Corporation Image-data reduction technique
JPS62222390A (en) 1986-03-24 1987-09-30 松下電工株式会社 Image recognition type wide range monitoring system
US4737847A (en) 1985-10-11 1988-04-12 Matsushita Electric Works, Ltd. Abnormality supervising system
JPH01244598A (en) 1988-03-25 1989-09-28 Toshiba Corp Picture supervisory equipment
JPH01251195A (en) 1988-03-31 1989-10-06 Toshiba Corp Monitor
DE3832353A1 (en) 1988-09-23 1990-04-05 Nolde Sylvia Expanding the function of monitoring devices in shops and department stores
JPH02151996A (en) 1988-12-03 1990-06-11 Toshiba Corp Room entry control system
GB2239369A (en) 1989-11-09 1991-06-26 Marconi Gec Ltd Image tracking
JPH0410099A (en) 1990-04-27 1992-01-14 Toshiba Corp Detector for person acting suspiciously
US5091780A (en) 1990-05-09 1992-02-25 Carnegie-Mellon University A trainable security system method for the same
JPH0460880A (en) 1990-06-29 1992-02-26 Shimizu Corp Moving body discrimination and analysis controlling system
US5097328A (en) 1990-10-16 1992-03-17 Boyette Robert B Apparatus and a method for sensing events from a remote location
US5099322A (en) 1990-02-27 1992-03-24 Texas Instruments Incorporated Scene change detection system and method
US5126577A (en) 1991-06-27 1992-06-30 Electro-Optical Industries, Inc. Infrared target plate handling apparatus with improved thermal control
JPH04257190A (en) 1991-02-08 1992-09-11 Toshiba Corp Moving body tracking and display device
JPH04273689A (en) 1991-02-28 1992-09-29 Hitachi Ltd Monitoring device
JPH0512578A (en) 1991-06-28 1993-01-22 Mitsubishi Electric Corp Invasion monitoring device
JPH0514892A (en) 1991-06-28 1993-01-22 Toshiba Corp Image monitor device
JPH0546771A (en) 1991-08-16 1993-02-26 Toshiba Corp Motive object detector
EP0529196A2 (en) 1991-08-29 1993-03-03 Pioneer Electronic Corporation Picture image monitoring system
WO1993005488A1 (en) 1991-09-12 1993-03-18 Electronic Data Systems Corporation Image analyser
JPH05143737A (en) 1991-11-22 1993-06-11 Ohkura Electric Co Ltd Method and device for identification with motion vector
US5243418A (en) 1990-11-27 1993-09-07 Kabushiki Kaisha Toshiba Display monitoring system for detecting and tracking an intruder in a monitor area
GB2265516A (en) 1992-03-24 1993-09-29 Sony Broadcast & Communication Motion analysis of moving images
EP0564858A2 (en) 1992-04-06 1993-10-13 Siemens Aktiengesellschaft Method for resolving clusters of moving segments
US5283644A (en) 1991-12-11 1994-02-01 Ibaraki Security Systems Co., Ltd. Crime prevention monitor system
JPH0628449A (en) 1992-07-08 1994-02-04 Matsushita Electric Ind Co Ltd Image synthesizing device
US5289275A (en) 1991-07-12 1994-02-22 Hochiki Kabushiki Kaisha Surveillance monitor system using image processing for monitoring fires and thefts
JPH06119564A (en) 1992-10-02 1994-04-28 Toshiba Corp Suspicious person detecting system
JPH06117836A (en) 1992-08-21 1994-04-28 Matsushita Electric Ind Co Ltd Image processing apparatus, controller of air conditioner, and applied equipment using the apparatus
JPH06251159A (en) 1993-03-01 1994-09-09 Nippon Telegr & Teleph Corp <Ntt> Operation recognizing device
JPH06266840A (en) 1993-03-11 1994-09-22 Hitachi Ltd Status detector for moving object
GB2277845A (en) 1993-05-03 1994-11-09 Philips Electronics Nv Monitoring system
EP0624858A1 (en) 1993-05-14 1994-11-17 Michael Josef Lantschner Method and device for detecting irregular movements of a person
US5387768A (en) 1993-09-27 1995-02-07 Otis Elevator Company Elevator passenger detector and door control system which masks portions of a hall image to determine motion and count passengers
US5396284A (en) 1993-08-20 1995-03-07 Burle Technologies, Inc. Motion detection system
US5396252A (en) 1993-09-30 1995-03-07 United Technologies Corporation Multiple target discrimination
US5416711A (en) 1993-10-18 1995-05-16 Grumman Aerospace Corporation Infra-red sensor system for intelligent vehicle highway systems
US5473364A (en) 1994-06-03 1995-12-05 David Sarnoff Research Center, Inc. Video technique for indicating moving objects from a movable platform
US5512942A (en) 1992-10-29 1996-04-30 Fujikura Ltd. Anomaly surveillance device
US5519669A (en) 1993-08-19 1996-05-21 At&T Corp. Acoustically monitored site surveillance and security system for ATM machines and other facilities
US5546072A (en) 1994-07-22 1996-08-13 Irw Inc. Alert locator
US5554983A (en) 1992-04-24 1996-09-10 Hitachi, Ltd. Object recognition system and abnormality detection system using image processing
US5555512A (en) 1993-08-19 1996-09-10 Matsushita Electric Industrial Co., Ltd. Picture processing apparatus for processing infrared pictures obtained with an infrared ray sensor and applied apparatus utilizing the picture processing apparatus
US5666157A (en) 1995-01-03 1997-09-09 Arc Incorporated Abnormality detection and surveillance system
US5747719A (en) 1997-01-21 1998-05-05 Bottesch; H. Werner Armed terrorist immobilization (ATI) system
US5809161A (en) 1992-03-20 1998-09-15 Commonwealth Scientific And Industrial Research Organisation Vehicle monitoring system
US6028626A (en) 1995-01-03 2000-02-22 Arc Incorporated Abnormality detection and surveillance system
US6050369A (en) 1994-10-07 2000-04-18 Toc Holding Company Of New York, Inc. Elevator shaftway intrusion device using optical imaging processing
US6167143A (en) 1993-05-03 2000-12-26 U.S. Philips Corporation Monitoring system

Patent Citations (60)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4337482A (en) 1979-10-17 1982-06-29 Coutta John M Surveillance system
JPS58109373A (en) 1981-12-24 1983-06-29 フジテツク株式会社 Detector for abnormality in elevator cage
EP0183106A2 (en) 1984-11-10 1986-06-04 Matsushita Electric Works, Ltd. Visual image sensor system
JPS61260391A (en) 1985-05-14 1986-11-18 三菱電機株式会社 Monitor/controller
US4839631A (en) 1985-05-14 1989-06-13 Mitsubishi Denki Kabushiki Kaisha Monitor control apparatus
US4692806A (en) 1985-07-25 1987-09-08 Rca Corporation Image-data reduction technique
US4737847A (en) 1985-10-11 1988-04-12 Matsushita Electric Works, Ltd. Abnormality supervising system
JPS62136988A (en) 1985-12-10 1987-06-19 Matsushita Electric Works Ltd Invasion monitoring device
JPS62222390A (en) 1986-03-24 1987-09-30 松下電工株式会社 Image recognition type wide range monitoring system
JPH01244598A (en) 1988-03-25 1989-09-28 Toshiba Corp Picture supervisory equipment
JPH01251195A (en) 1988-03-31 1989-10-06 Toshiba Corp Monitor
DE3832353A1 (en) 1988-09-23 1990-04-05 Nolde Sylvia Expanding the function of monitoring devices in shops and department stores
JPH02151996A (en) 1988-12-03 1990-06-11 Toshiba Corp Room entry control system
GB2239369A (en) 1989-11-09 1991-06-26 Marconi Gec Ltd Image tracking
US5099322A (en) 1990-02-27 1992-03-24 Texas Instruments Incorporated Scene change detection system and method
JPH0410099A (en) 1990-04-27 1992-01-14 Toshiba Corp Detector for person acting suspiciously
US5091780A (en) 1990-05-09 1992-02-25 Carnegie-Mellon University A trainable security system method for the same
JPH0460880A (en) 1990-06-29 1992-02-26 Shimizu Corp Moving body discrimination and analysis controlling system
US5097328A (en) 1990-10-16 1992-03-17 Boyette Robert B Apparatus and a method for sensing events from a remote location
US5243418A (en) 1990-11-27 1993-09-07 Kabushiki Kaisha Toshiba Display monitoring system for detecting and tracking an intruder in a monitor area
JPH04257190A (en) 1991-02-08 1992-09-11 Toshiba Corp Moving body tracking and display device
JPH04273689A (en) 1991-02-28 1992-09-29 Hitachi Ltd Monitoring device
US5126577A (en) 1991-06-27 1992-06-30 Electro-Optical Industries, Inc. Infrared target plate handling apparatus with improved thermal control
JPH0512578A (en) 1991-06-28 1993-01-22 Mitsubishi Electric Corp Invasion monitoring device
JPH0514892A (en) 1991-06-28 1993-01-22 Toshiba Corp Image monitor device
US5289275A (en) 1991-07-12 1994-02-22 Hochiki Kabushiki Kaisha Surveillance monitor system using image processing for monitoring fires and thefts
JPH0546771A (en) 1991-08-16 1993-02-26 Toshiba Corp Motive object detector
EP0529196A2 (en) 1991-08-29 1993-03-03 Pioneer Electronic Corporation Picture image monitoring system
WO1993005488A1 (en) 1991-09-12 1993-03-18 Electronic Data Systems Corporation Image analyser
JPH05143737A (en) 1991-11-22 1993-06-11 Ohkura Electric Co Ltd Method and device for identification with motion vector
US5283644A (en) 1991-12-11 1994-02-01 Ibaraki Security Systems Co., Ltd. Crime prevention monitor system
US5809161A (en) 1992-03-20 1998-09-15 Commonwealth Scientific And Industrial Research Organisation Vehicle monitoring system
GB2265516A (en) 1992-03-24 1993-09-29 Sony Broadcast & Communication Motion analysis of moving images
EP0564858A2 (en) 1992-04-06 1993-10-13 Siemens Aktiengesellschaft Method for resolving clusters of moving segments
US5554983A (en) 1992-04-24 1996-09-10 Hitachi, Ltd. Object recognition system and abnormality detection system using image processing
JPH0628449A (en) 1992-07-08 1994-02-04 Matsushita Electric Ind Co Ltd Image synthesizing device
JPH06117836A (en) 1992-08-21 1994-04-28 Matsushita Electric Ind Co Ltd Image processing apparatus, controller of air conditioner, and applied equipment using the apparatus
JPH06119564A (en) 1992-10-02 1994-04-28 Toshiba Corp Suspicious person detecting system
US5512942A (en) 1992-10-29 1996-04-30 Fujikura Ltd. Anomaly surveillance device
JPH06251159A (en) 1993-03-01 1994-09-09 Nippon Telegr & Teleph Corp <Ntt> Operation recognizing device
JPH06266840A (en) 1993-03-11 1994-09-22 Hitachi Ltd Status detector for moving object
DE4314483A1 (en) 1993-05-03 1994-11-10 Philips Patentverwaltung Surveillance system
US6167143A (en) 1993-05-03 2000-12-26 U.S. Philips Corporation Monitoring system
GB2277845A (en) 1993-05-03 1994-11-09 Philips Electronics Nv Monitoring system
EP0624858A1 (en) 1993-05-14 1994-11-17 Michael Josef Lantschner Method and device for detecting irregular movements of a person
US5519669A (en) 1993-08-19 1996-05-21 At&T Corp. Acoustically monitored site surveillance and security system for ATM machines and other facilities
US5555512A (en) 1993-08-19 1996-09-10 Matsushita Electric Industrial Co., Ltd. Picture processing apparatus for processing infrared pictures obtained with an infrared ray sensor and applied apparatus utilizing the picture processing apparatus
US5712830A (en) 1993-08-19 1998-01-27 Lucent Technologies Inc. Acoustically monitored shopper traffic surveillance and security system for shopping malls and retail space
US5396284A (en) 1993-08-20 1995-03-07 Burle Technologies, Inc. Motion detection system
US5387768A (en) 1993-09-27 1995-02-07 Otis Elevator Company Elevator passenger detector and door control system which masks portions of a hall image to determine motion and count passengers
US5396252A (en) 1993-09-30 1995-03-07 United Technologies Corporation Multiple target discrimination
US5416711A (en) 1993-10-18 1995-05-16 Grumman Aerospace Corporation Infra-red sensor system for intelligent vehicle highway systems
US5473364A (en) 1994-06-03 1995-12-05 David Sarnoff Research Center, Inc. Video technique for indicating moving objects from a movable platform
US5546072A (en) 1994-07-22 1996-08-13 Irw Inc. Alert locator
US6050369A (en) 1994-10-07 2000-04-18 Toc Holding Company Of New York, Inc. Elevator shaftway intrusion device using optical imaging processing
US5666157A (en) 1995-01-03 1997-09-09 Arc Incorporated Abnormality detection and surveillance system
IL116647A (en) 1995-01-03 1999-03-12 Aviv David G Abnormality detection and surveillance system
US6028626A (en) 1995-01-03 2000-02-22 Arc Incorporated Abnormality detection and surveillance system
USRE42690E1 (en) 1995-01-03 2011-09-13 Prophet Productions, Llc Abnormality detection and surveillance system
US5747719A (en) 1997-01-21 1998-05-05 Bottesch; H. Werner Armed terrorist immobilization (ATI) system

Non-Patent Citations (135)

* Cited by examiner, † Cited by third party
Title
"Intelligent Scene Monitoring Drives Security, Surveillance." Signal, Jul. 1995, pp. 29-32.
Agarwal et al. "Estimating Optical Flow from Clustered Trajectory Velocity Time" Pattern Recognition, 1992. vol. I. Conference A: Computer Vision and Applications, Proceedings., 11th IAPR International Conference on Aug. 30-Sep. 3, 1992, pp. 215-219.
Aggarwal et al., "Human Motion Analysis: A Review," Computer Vision and Image Understanding, Mar. 1999, vol. 73, No. 3, pp. 428-440.
Aggarwal et al., "Human Motion Analysis: A Review," Proceedings of the IEEE Nonrigid and Articulated Motion Workshop, pp. 90-102 (1997).
Allmen "Image Sequence Description Using Spatiotemporal Flow Curves: Toward Motion-Based Recognition," Thesis submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy Computer Sceinces, University of Wisconsin, Madison, 1991, 153 pages.
Allmen et al., "Computing Spatiotemporal Relations for Dynamic Perceptual Organization." Computer Sciences Department Technical Report 1130, University of Wisconsin-Madison, Dec. 1992, 33 pages.
Arai, Proceedings of the 11th Annual Conference of the Robotics, Society of Japan, No. 1, pp. 347-348 (Nov. 1993).
Ardayfio, David D., Fundamentals of Robotics, Marcel Dekker Inc., (1987).
Aviv "New on-board data processing approach to achieve large compaction," SPIE, 1979, vol. 180, pp. 48-55.
Aviv, D.G. "On Achieving Safer Streets," Library of Congress, TXU 545 919, Nov. 23, 1983, 7 pages.
Aviv, D.G. "The 'Public Eye' Security System," Library of Congress, TXU 551 435, Jan. 11, 1993, 13 pages.
Bergstein et al. "Four-Component Optically Compensated Varifocal System," Journal of the Optical Society of America, Apr. 1962, vol. 52, No. 4, pp. 376-388.
Bergstein et al. "Three-Component Optically Compensated Varifocal System," Journal of the Optical Society of America, Apr. 1962, vol. 52, No. 4, pp. 363-375.
Bergstein et al. "Two-Component Optically Compensated Varifocal System," Journal of the Optical Society of America, Apr. 1962, vol. 52, No. 4, pp. 353-362.
Bilbao et al., "Analysis Procedure of Perimeter Protection Systems: The TDCI Vector," International Carnahan Conference on Security Technology, 1989, pp. 229-235.
Black et al., "Estimating Multiple Independent Motions in Segmented Images using Parametric Models with Local Deformations," Proceedings of the 1994 IEEE Workshop on Motion of Non-Rigid and Articulated Objects (1994). 8 pages.
Bobick, et al., "A State-based Technique for the Summarization and Recognition of Gesture," Fifth Int'l Conf. on Computer Vision, Cambridge, MA, pp. 382-388 (Jun. 1995).
Bolle et al., "Method for Security Monitoring of Undesirable Behavior," IBM Technical Disclosure Bulletin, vol. 37, No. 11 (Nov. 1994). 2 pages.
Bouthemy et al., "Motion Segmentation and Qualitative Dynamic Scene Analysis from an Image Sequence," International Journal of Computer Vision, vol. 10, No. 2, pp. 157-182 (1993).
Brock-Gunn et al., "Using Colour Templates for Target Identification and Tracking," British Machine Vision Conference (1992). 10 pages.
Buker, A neural vision system for robotics applications, Automation, 45, (Oct. 1997). 8 pages.
Buxton et al. "Behavioral descriptions from image sequences," In Proceedings of Workshop on Integration of Natural and Vision Processing Language, 1994, 7 pages.
Buxton et al., "Visual Surveillance in a Dynamic and Uncertain World," Artificial Intelligence vol. 78, pp. 431-359 (1995).
Byrne et al., "Using Shape and Intensity to Track Non-Rigid Objects," University of Leeds School of Computer Studies Research Report Series, Report 94.14, May 1994, 11 pages.
Campbell, et al., "Recognition of Human Body Motion Using Phase Space Constraints," Fifth Int'l Conf. on Computer Vision, Cambridge, MA, pp. 624-630 (Jun. 1995).
Campbell, et al., "Using Phase Space Constraints to Represent Human Body Motion," Int'l Workshop on Automatic Face- and Gesture-Recognition, Zurich, pp. 338-343 (1995).
Carvalho et al., "Real Time Automatic Inspection Under Adverse Conditions," SPIE Optical Systems in Adverse Environments, vol. 1399, 1990, pp. 130-136.
Cedras et al. "Motion-based recognition: a survey," Image and Vision Computing, Mar. 1995, vol. 13, No. 2, pp. 129-155.
Chleq et al. "Realtime Image Sequence Interpretation for Video-Surveillance Applications," International Conference on Image Processing, 1996 Proceedings Sep. 16-19, 1996, pp. 801-804.
Custance et al. "Evaluating Scene Monitoring Systems: a Discussion Paper," Security Technology, 199. Crime Countermeasures, Proceedings. Institute of Electrical and Electronics Engineers 1992 International Carnahan Conference on Oct. 14-16, 1992, 7 pages.
Custance et al., "Image Surveillance Systems: Some Novel Design Features," IEEE International Carnahan Conference of Security Technology (1990). 5 pages.
Dance et al., "Interpretation of Dynamic Interaction in Image Sequences," Proceedings of the Artificial Intelligence in Defence Workshop (1995). 15 pages.
Davies et al., "Crowd Monitoring Using Image Processing," IEE Electronic and Communications Engineering Journal, vol. 7, No. 1 pp. 37-47 (Feb. 1995).
Defendants ADT Security Services, Inc., Bosch Security Systems, Inc., Mango DSP, Inc., Pelco, Inc. and Siemens Industry, Inc.'s Invalidity Contentions for U.S. Patent No. RE42,690, Jul. 18, 2012, 59 pages, United States District Court for the Eastern District of Texas Case No. 6:11-cv-00494-LED.
Devereux et al., "A Method for Evaluating Video Motion Detection," Institute of Electrical and Electronics Engineers 29th Annual International Carnahan Conference on Security Technology (1995). 3 pages.
Electrical Review, vol. 79, No. 10, pp. 72-76 (Oct. 1994).
Ellis et al. "Model-Based Vision for Automatic Alarm Interpretation," IEEE AES Systems Magazine, Mar. 1991, pp. 14-20.
Ellis et al. "Model-Based Vision for Automatic Alarm Interpretation," Security Technology, 1990. Crime Countermeasures, Proceedings, IEEE 1990 International Carnahan Conference on Oct. 10-12, 1990, pp. 62-67.
Ellis et al., "A Knowledge-Based Approach to Automatic Alarm Interpretation Using Computer Vision on Image Sequences," Proceedings of International Carnahan Conference on Security Technology (1989). 8 pages.
Flinchbaugh, "Robust Video Motion Detection and Event Recognition," Proceedings of DARPA Image Understanding Workshop (May 1997). 4 pages.
Freer et al. "Moving Object Surveillance and Analysis for Camera Based Security Systems," Security Technology, 1995. Proceedings. Institute of Electrical and Electronics Engineers 29th Annual 1995 International Carnahan Conference on Oct. 18-20, 1995, pp. 67-71.
Freer et al., "Automatic Recognition of Suspicious Activity for Camera Based Security Systems," European Convention on Security and Detection, May 1995, Conference Pub. No. 408, pp. 54-58.
Gavrila et al. "Towards 3-D model-based tracking and recognition of human movement: a multi-view approach," International Workshop on Face and Gesture Recognition, Zurich, 1995, 6 pages.
Gibbins et al. "Detecting Suspicious Background Changes in Video Surveillance of Busy Scenes," Workshop on Applications of Computer Vision, 1996. Dec. 2-4, 1996, 5 pages.
Gould et al., "Detection and Representation of Events in Motion Trajectories," Advances in Image Processing and Analysis, Chapter 14, pp. 393-426 (1992).
Hennebert et al., "Detection of Small and Slow Moving Objects Observed by a Mobile Camera," Theory and Applications of Image Analysis II (1995). pp. 253-266.
Horner "Amethyst: an Enhanced Detection System Intelligently Combining Video Detection and Non-Video Detection Systems," Security Technology, 1995. Proceedings. Institute of Electrical and Electronics Engineers 29th Annual 1995 International Carnahan Conference on Oct. 18-20, 1995, pp. 59-66.
Hosie et al., "Towards Detecting Patterns of Human Behaviour from Image Sequences," Proceedings of the Artificial Intelligence in Defence Workshop (1995). pp. 109-122.
Hötter et al., "Detection of Moving Objects in Natural Scenes by a Stochastic Multi-Feature Analysis of Video Sequences," In Proceedings of Institute of Electrical and Electronics Engineers 29th Annual 1995 International Carnahan Conference on Security Technology (Oct. 1995). pp. 47-52.
Howarth et al. "Selective attention in dynamic vision," Proceedings of the Thirteenth IJCAI Conference, 1993, 7 pages.
Howarth et al., "Analogical Representation of Space and Time," Image and Vision Computing, vol. 10., No. 7 (Sep. 1992). pp. 467-478.
Howell et al., "Video Time Radiation Analysis Program (VTRAP)-Requirements and Preliminary Design Document," (Sep. 1994). 14 pages.
Huttenlocher et al., "Tracking Non-Rigid Objects in Complex Scenes," Proceedings of Fourth International Conference on Computer Vision (May 1993). pp. 93-101.
International Search Report for International (PCT) Patent Application No. PCT/US1996/08674, dated Sep. 17, 1996.
Intille, et al., "Closed-World Tracking," Fifth Int'l Conf. on Computer Vision, Cambridge, MA, pp. 672-678 (Jun. 1995).
Irani et al., "Detecting and Tracking Multiple Moving Objects Using Temporal Integration," ECCV (1992). 6 pages.
Irani et al., "Motion Analysis for Image Enhancement: Resolution, Occlusion, and Transparency," Journal of Visual Communication and Image Representation, vol. 4, No. 4 pp. 324-335 (1993).
Ju et al., "Cardboard People: A Parameterized Model of Articulated Image Motions," Proceedings of the Second International Conference on Automatic Face and Gesture Recognition (1996). 7 pages.
Kaneta et al., "Image Processing Method for Intruder Detection Around Power Line Towers," IAPR Workshop on Machine Vision Applications, Dec. 1992. pp. 353-356.
Kawashima et al., "Qualitative Image Analysis of Group Behaviour," Proceedings of IEEE Computer Society Conference on Computer Vision and Pattern Recognition (Jun. 1994). pp. 690-693.
Klima et al. "Simple Motion Detection Methods in TV Image for Security Purposes," Security Technology, Proceedings. Institute of Electrical and Electronics Engineers 1993 International Carnahan Conference on Oct. 13-15, 1993, pp. 41-43.
Klima et al., "Motion Detection and Target Tracking in TV Image for Security Purposes," Proceedings of Institute of Electrical and Electronics Engineers 28th Annual International Carnahan Conference on Security Technology (1994). pp. 43-44.
Koga, "A Video Surveillance System Using 3-D Features on Real-Space Coordinates," Proceedings of the 1993 IEICE (the Institute of Electronics, Information and Communication Engineers) Fall Conference, Autumn Pt 6 p. 6.333 (Sep. 1993).
Kollnig et al., "3D Pose Estimation by Fitting Image Gradients Directly to Polyhedral Models," Proceedings of Fifth International Conference on Computer Vision (1995). pp. 569-574.
Kozlow, "David: Advanced Developments for the Next Generation of Video Intrusion Detection," ICCST (1989). pp. 145-147.
Leon et al. "Data fusion for the production of high quality pictures in automatic visual inspection," Automatisierungstechnik, Oct. 1997, pp. 480-489.
Leung et al., "First Sight: A Human Body Outline Labeling System," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 17, No. 4, Apr. 1995, pp. 359-377.
Mackintosh "Sentinel A Technology For Practical Automatic Monitoring Of Busy And Complex Scenes," Security Technology, 1992 Crime Countermeasures, Proceedings. Institute of Electrical and Electronics Engineers 1992 International Carnahan Conference on Oct. 14-16, 1992, pp. 190-193.
MacLean et al., "Recovery of Egomotion and Segmentation of Independent Object Motion Using the EM Algorithm," British Machine Vision Conference, vol. 1 (1994). 11 pages.
Makarov et al., "Intrusion Detection Using Extraction of Moving Edges," Proceedings of the 12th IAPR International Conference on Pattern Recognition (1994). pp. 804-807.
Matas et al., "Constraining Visual Expectations Using a Grammar of Scene Events," Proceedings of the Sixth International Conference on Artificial Intelligence and Information-Control System of Robots (1995). 12 pages.
Matter, "Video Motion Detection for Physical Security Applications," (1990). 14 pages.
McLauchlan et al., "Coarse Image Motion for Saccade Control," British Machine Vision Conference (1992). 12 pages.
Mecocci et al., "Image Sequence Analysis for Counting in Real Time People Getting In and Out of a Bus," Signal Processing, vol. 35, pp. 105-116 (1994).
Munno et al., "Automatic Video Image Moving Target Detection for Wide Area Surveillance," Institute of Electrical and Electronics Engineers 1993 International Carnahan Conference on Security Technology (1993). pp. 47-57.
Murino et al., "Visual Surveillance by Depth from Focus," 20th International Conference on Industrial Electronics (Sep. 1994). pp. 998-1003.
Murray et al., "Active Exploration of Dynamic and Static Scenes," in Real-Time Computer Vision (1994). 20 pages.
Notice of Allowance for U.S. Appl. No. 08/367,712, mailed Dec. 24, 1996.
Notice of Allowance for U.S. Appl. No. 08/898,470, mailed Mar. 1, 1999.
Notice of Allowance for U.S. Appl. No. 12/230,490, mailed Jan. 31, 2013 5 pages.
Notice of Allowance for U.S. Appl. No. 12/466,340, mailed Jul. 18, 2011.
Notice of Allowance for U.S. Appl. No. 12/466,350 mailed Nov. 30, 2011.
Odobez et al., "Detection of Multiple Moving Objects Using Multiscale MRF with Camera Motion Compensation," 1st ICIP (Nov. 1994). 5 pages.
Official Action for U.S. Appl. No. 08/367,712, mailed Jul. 24, 1996.
Official Action for U.S. Appl. No. 08/898,470, mailed Oct. 1, 1998.
Official Action for U.S. Appl. No. 12/230,490, mailed Apr. 25, 2012 6 pages.
Official Action for U.S. Appl. No. 12/230,490, mailed Nov. 16, 2012 5 pages.
Official Action for U.S. Appl. No. 12/466,340, mailed Apr. 18, 2011.
Official Action for U.S. Appl. No. 12/466,340, mailed Aug. 30, 2010.
Official Action for U.S. Appl. No. 12/466,340, mailed Mar. 11, 2010.
Official Action for U.S. Appl. No. 12/466,340, mailed Nov. 8, 2010.
Official Action for U.S. Appl. No. 12/466,350 mailed Dec. 22, 2010.
Official Action for U.S. Appl. No. 12/466,350 mailed Mar. 15, 2010.
Pearce et al., "Rulegraphs for Graph Matching in Pattern Recognition," Pattern Recognition, vol. 27, No. 9, pp. 1231-1247 (1994).
Pentland et al., "Recovery of Nonrigid Motion and Structure," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 13, No. 7 (Jul. 1991). pp. 730-742.
Polana et al., "Low Level Recognition of Human Motion (Or How to get Your Man Without Finding his Body Parts)," Proceedings of IEEE Computer Society Workshop on Motion of Non-Rigid and Articulated Objects (1994). 6 pages.
Proceedings of JSPE (The Japan Society for Precision Engineering) Spring Conference, No. 3, pp. 891-892 (Jul. 1993).
Rabiner "Applications of Voice Processing to Telecommunications," Proceedings of the IEEE, Feb. 1994, vol. 82, No. 2, pp. 199-228.
Rabiner "The Role of Voice Processing in Telecommunications," 2nd IEEE Workshop on Interactive Voice Technology for Telecommunications Applications (IVTTA94) Sep. 1994, 8 pages.
Rabiner et al. "Fundamental of Speech Recognition," Prentice Hall International, Inc., Apr. 12, 1993, pp. 434-495.
Rabiner et al., "A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition," Readings in Speech Recognition (1990). pp. 267-296.
Rabiner et al., Fundamentals of Speech Recognition, Chapters 8-9 (1993). 64 pages.
Rangarajan et al., "Matching Motion Trajectories Using Scale-Space," Pattern Recognition, vol. 26, No. 4, pp. 595-610 (1993).
Retz-Schmidt, "A Replai of Soccer: Recognizing Intentions in the Domain of Soccer Games," European Conference on Artificial Intelligence (Aug. 1988). pp. 455-457.
Richardson, "A Sequential Detection Approach to Target Tracking," a thesis submitted to the Department of Electrical Engineering for the degree of Master of Science, Queen's University, Kingston, Ontario, Canada (Jun. 1992). 177 pages.
Robinson "Neural Network Solutions Provide Facial Recognition," Signal, Feb. 1991, vol. 45, No. 6, pp. 73-76.
Rodger et al., "Video Motion Detection Systems: A Review for the Nineties," ("Rodger"), Proceedings of Institute of Electrical and Electronics Engineers 28th Annual International Carnahan Conference on Security Technology (1994). pp. 92-97.
Rohr "Towards Model-Based Recognition of Human Movements in Image Sequences," CVGIP: Image Understanding, Jan. 1994, vol. 59, No. 1, pp. 94-115.
Sakuma, "Detecting intruders using the method of Inter Frame Difference," 1990 Autumn National Convention Record, the Institute of Electronics, Information and Communication Engineers, pt. 6 (Sep. 1990).
Sakuma, "Detecting intruders using the method of Inter Frame Difference," ITEJ (The Institute of Television Engineers of Japan) Technical Report, vol. 14 No. 49 (IPCV90 27-30/AIPS90 50-53), pp. 1-6 (Sep. 1990).
Seki et al., "A Highly Reliable Intruder Monitoring System that uses Movement Information," Mitsubishi Denki Giho, vol. 67, No. 7 (1993). 16 pages with English Translation.
Shimonaga, "Automatic Supervisory of Intrusion with ITV," Proceedings of the 1992 ITE (the Institute of Television Engineers) Annual Convention, pp. 351-352 (Jul. 1992). (English abstract).
Shio et al. "Segmentation of People in Motion," Visual Motion, 1991., Proceedings of the IEEE Workshop on Oct. 7-9, 1991, pp. 325-332.
Smith, "Asset-2: Real-Time Motion Segmentation and Shape Tracking," IEEE Fifth International Conference on Computer Vision (Jun. 20, 1995). pp. 237-244.
Starner et al., "Real-Time American Sign Language Recognition from Video Using Hidden Markov Models," M.I.T. Media Laboratory Perceptual Computing Section Technical Report No. 375(1995). 7 pages.
Stiller, Computer-Age, Digital Color Camera Compro from Pearl Agency (1996). 5 pages, German only.
Stubbington, "Intelligent Scene Monitoring; Technical Aspects and Practical Experience," Security Technology, Carnahan Conference, Oct. 18-20, 1995, pp. 364-375.
Suzuki et al. "Extracting Non-Rigid Moving Objects by Temporal Edges," Pattern Recognition, 1992. vol. I. Conference A: Computer Vision and Applications, Proceedings., 11th IAPR International Conference on Aug. 30-Sep. 3, 1992, pp. 69-73.
Takano et al., "Intruder Detection System by Image Processing," Institute of Electrical and Electronics Engineers 28th Annual 1994 International Carnahan Conference on Security Technology (1994). pp. 31-33.
Takatoo et al. "Detection of Objects Including Persons Using Image Processing," Pattern Recognition, 1996, Proceedings of the 13th International Conference on Aug. 25-29, 1996, pp. 466-472.
Technical Research Report of Shimizu Corporation, vol. 60, pp. 123-131 (Oct. 1994). Japanese only.
Toal et al., "Spatio-temporal Reasoning within a Traffic Surveillance System," Proceedings of the Second European Conference on Computer Vision (1992). pp. 884-892.
Tsuge et al., "Accident Vehicle Automatic Detection System by Image Processing Technology," Vehicle Navigation & Information Systems Conference Proceedings (1994). pp. 45-50.
U.S. Appl. No. 12/466,350, filed May 14, 2009, Aviv.
U.S. Appl. No. 13/230,490, filed Sep. 12, 2011, Aviv.
Waibel et al. "Readings in Speech Recognition," Morgan Kaufmann, May 15, 1990, pp. 267-296.
Wilson, et al., "Configuration States for the Representation and Recognition of Gesture," Int'l Workshop on Automatic Face- and Gesture-Recognition, Zurich, pp. 129-134 (1995).
Wollert, Get the picture, Image Processing Industrial Market Report, Elektronik (Feb. 1997). 12 pages German Only.
Wren et al., "Pfinder: Real-Time Tracking of the Human Body," 1996 Proceedings of the 2nd International Conference on Automatic Face and Gesture Recognition, (1996). pp. 51-56.
Wren et al., "Pfinder: Real-Time Tracking of the Human Body," IEEE Transactions on Pattern Analysis and Machine Intelligence (1997). pp. 780-785.
Xiao et al. "Eleview: An Active Elevator Monitoring Vision System," MVA 96, IAPR Workshop on Machine Vision Applications, in Tokyo, Japan, Nov. 12, 1996, pp. 253-256.
Yamato et al., "Recognizing Human Action in Time-Sequential Images using Hidden Markov Model," Proceedings of IEEE Computer Society Conference on Computer Vision (Jun. 1992). pp. 379-385.
Yamato et al., "Recognizing Human Action in Time-Sequential Images using Hidden Markov Model," The Transactions of the Institute of Electronics, Information and Communication Engineers, vol. 76, No. 12, Dec. 1993, pp. 2556-2563, Japanese Only.
Yeh et al. "A Vision System for Safe Robot Operation," 1988 IEEE International Conference on Robotics and Automation, Apr. 24-29, 1988, pp. 1461-1465.
Yoshikawa et al., "Development of Video Surveillance System using Motion Information," Mitsubishi Electric Corp. (1991). 5 pages Japanese Only.

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9710708B1 (en) * 2014-03-24 2017-07-18 Vecna Technologies, Inc. Method and apparatus for autonomously recognizing at least one object in an image
US10696517B2 (en) * 2014-11-26 2020-06-30 Otis Elevator Company Elevator security and control system based on passenger movement
US20170275134A1 (en) * 2014-11-26 2017-09-28 Otis Elevator Company Elevator security and control system based on passenger movement
US9984154B2 (en) 2015-05-01 2018-05-29 Morpho Detection, Llc Systems and methods for analyzing time series data based on event transitions
US10839009B2 (en) 2015-05-01 2020-11-17 Smiths Detection Inc. Systems and methods for analyzing time series data based on event transitions
US20180046864A1 (en) * 2016-08-10 2018-02-15 Vivint, Inc. Sonic sensing
US11354907B1 (en) 2016-08-10 2022-06-07 Vivint, Inc. Sonic sensing
US10579879B2 (en) * 2016-08-10 2020-03-03 Vivint, Inc. Sonic sensing
US20190361471A1 (en) * 2018-05-22 2019-11-28 Capital One Services, Llc Preventing image or video capture of input data provided to a transaction device
US10877499B2 (en) * 2018-05-22 2020-12-29 Capital One Services, Llc Preventing image or video capture of input data provided to a transaction device
US20210116950A1 (en) * 2018-05-22 2021-04-22 Capital One Services, Llc Preventing image or video capture of input data provided to a transaction device
US11747837B2 (en) * 2018-05-22 2023-09-05 Capital One Services, Llc Preventing image or video capture of input data provided to a transaction device
US20200126396A1 (en) * 2018-06-14 2020-04-23 Honeywell International Inc. Systems and methods for managing alert notifications from a secured area
US10984650B2 (en) * 2018-06-14 2021-04-20 Honeywell International Inc. Systems and methods for managing alert notifications from a secured area
US10510239B1 (en) 2018-06-14 2019-12-17 Honeywell International Inc. Systems and methods for managing alert notifications from a secured area
US11386211B2 (en) 2018-12-19 2022-07-12 Capital One Services, Llc Obfuscation of input data provided to a transaction device
US11868491B2 (en) 2018-12-19 2024-01-09 Capital One Services, Llc Obfuscation of input data provided to a transaction device
US10733457B1 (en) 2019-03-11 2020-08-04 Wipro Limited Method and system for predicting in real-time one or more potential threats in video surveillance
US11750639B2 (en) 2021-04-05 2023-09-05 Bank Of America Corporation ATM-based anomaly and security threat detection

Also Published As

Publication number Publication date
WO1997042764A1 (en) 1997-11-13
IL116647A0 (en) 1996-05-14
IL116647A (en) 1999-03-12
US5666157A (en) 1997-09-09
USRE43147E1 (en) 2012-01-31

Similar Documents

Publication Publication Date Title
USRE44527E1 (en) Abnormality detection and surveillance system
USRE44225E1 (en) Abnormality detection and surveillance system
Elharrouss et al. A review of video surveillance systems
US7683929B2 (en) System and method for video content analysis-based detection, surveillance and alarm management
US20040240542A1 (en) Method and apparatus for video frame sequence-based object tracking
Crocco et al. Audio surveillance: A systematic review
US20210076010A1 (en) System and method for gate monitoring during departure or arrival of an autonomous vehicle
US7542588B2 (en) System and method for assuring high resolution imaging of distinctive characteristics of a moving object
US9412142B2 (en) Intelligent observation and identification database system
KR101644443B1 (en) Warning method and system using prompt situation information data
US7535353B2 (en) Surveillance system and surveillance method
US20080309761A1 (en) Video surveillance system and method with combined video and audio recognition
US20070035622A1 (en) Method and apparatus for video surveillance
JP2000244897A (en) State recognition system and state recognition display generation method
JPH10285581A (en) Automatic monitoring device
CN105100700A (en) Target tracking device using handover between cameras and method thereof
WO2021095351A1 (en) Monitoring device, monitoring method, and program
US20190208168A1 (en) Limited Access Community Surveillance System
CN113920660B (en) Safety monitoring method and system suitable for safety storage equipment
Park et al. Sound learning–based event detection for acoustic surveillance sensors
Ho et al. Public space behavior modeling with video and sensor analytics
Flammini et al. Challenges and emerging paradigms for augmented surveillance
US20240062636A1 (en) System and method for identifying a weapon, and tracking and generating an alert in response to the weapon identification
KR20160086536A (en) Warning method and system using prompt situation information data
Izquierdo-Fuente et al. A human classification system for a video-acoustic detection platform