US20070022456A1 - Method and apparatus for surveillance using an image server - Google Patents

Method and apparatus for surveillance using an image server

Info

Publication number
US20070022456A1
Authority
US
United States
Prior art keywords
frame
image
differential
frames
sequence
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/535,461
Inventor
Daniel Esbensen
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chartoleaux KG LLC
Original Assignee
Touch Technologies Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Touch Technologies Inc
Priority to US11/535,461
Publication of US20070022456A1
Priority to US11/762,047
Assigned to TT VISUAL TECHNOLOGY GROUP, LLC. Assignment of assignors interest (see document for details). Assignors: TOUCH TECHNOLOGIES, INC.
Assigned to TOUCH TECHNOLOGIES, INC. Assignment of assignors interest (see document for details). Assignors: ESBENSEN, DANIEL

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19665Details related to the storage of video surveillance data
    • G08B13/19667Details related to data compression, encryption or encoding, e.g. resolution modes for reducing data volume to lower transmission bandwidth or memory requirements
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19654Details concerning communication with a camera
    • G08B13/19656Network used to communicate with a camera, e.g. WAN, LAN, Internet
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources

Definitions

  • the present invention is in the field of electronic circuits and camera systems. More particularly, the present invention is directed to a system for surveillance using digital images and image servers.
  • Typical building surveillance systems today capture analog video signals from one or more video cameras and transmit those signals to a security panel for viewing by security personnel. Deployment of such systems over a large area and making the video images available over a network can be problematic because of the large bandwidth requirements of the video signal. Monitoring of multiple analog cameras is also difficult; for example, a human viewer's attention may not be on the security panel or directed to the correct camera image at the time an incident occurs. And, in general, the number of cameras a human can effectively monitor is limited. While techniques for motion detection in surveillance systems are known, the complexity and expense of incorporating these techniques into analog systems has limited the use of motion detection in many video surveillance systems.
  • One group of cameras and camera servers for these applications is marketed under the brand name Axis. However, these installations are generally limited to a single camera or a few cameras and do not have the ability to be deployed as a flexible and fully functional surveillance system. Standard Axis technology also generally relies on full-frame updating and has only limited ability to reduce the bandwidth of images.
  • a number of techniques are known for compressing digital video information.
  • Well known techniques for digital video include hardware assisted techniques such as MPEG, DVI, Motion JPEG, and software-only techniques such as QuickTime, Video for Windows, RealVideo, or AVI.
  • Some of these techniques include mechanisms for processing and transmitting delta frame information, wherein delta frames encode information about pixels that have changed between one frame and another.
  • these compression techniques for the most part are concerned with the quality of reproduction of real-time video images and have not been optimized for use in surveillance systems or for use in systems that do not contain custom video playback software or hardware.
  • What is needed is a flexible surveillance system that can capture image data from a number of digital cameras and make that data available to viewers in a variety of different ways.
  • a surveillance system with a basic architecture that is scalable, allowing for efficient installation, coordination, and control of one to a few to thousands of individual cameras and one to a few to thousands of individual clients.
  • an integrated system for digital surveillance that at every step of image processing optimizes images for easy storage, analysis, transmission, and presentation in a surveillance system.
  • Specific embodiments of the present invention address a number of problems associated with a digital camera surveillance system, such as control and coordination of digital cameras that may be widely deployed, analyzing data from multiple cameras, making data available in such a way that it can be efficiently transmitted over a network and can be easily displayed to potentially a large number of users, and displaying and controlling image data by existing client software such as a browser.
  • these problems are addressed by providing a flexible and scalable surveillance system and method; the method and system according to the present invention can work effectively in small installations with just a few cameras and only one viewer as well as in installations including thousands of cameras, widely dispersed, allowing for selectable viewing by many viewers.
  • the invention consists of the following functional elements:
  • an FG comprises a PC equipped with one or more off-the-shelf video capture boards, with each video capture board connected to a camera.
  • the PC is programmed according to the invention to control the video capture functions and to perform low-level logic processing.
  • FG low-level logic processing generally includes one or more of the following: short-term storage of full images, computing of differential images, computing differential scores for a current image, filtering of gradual ambient light changes, and adjusting of camera characteristics.
  • FGs have a communication interface to send full frames and differential frames to a coordinator.
  • Coordinators for receiving full frames, differential frames, and possibly other data from FGs, storing this data, and for adding a higher level of image processing.
  • Coordinators generally include logic for one or more of the following: detecting and storing an incident from one or more FGs, resolving incidents from multiple FGs into an incident sequence; image recognition; logging and cataloging incidents according to a rules-based engine; generating alarms to security personnel or a server, etc.
  • a coordinator may also include an interface for sending control signals to the FG to control basic FG functions such as frequency of capture, focus, contrast, and, for moveable FGs, positioning.
  • a Camera Server for providing an interface to one or more client viewers.
  • a server handles image presentation and may include logic allowing a client to pan and zoom the view of an image.
  • a server includes logic to provide an intelligent interface to a client viewer including launching windows in the client viewer when incidents are detected and updating open windows with differential frames and full frames.
  • a server may also include an interface for receiving commands back from a client and forwarding those commands to a coordinator when appropriate.
  • a server also provides a possibly high capacity connection to the Internet, allowing potentially thousands of viewers to view the same image.
  • clients for displaying images delivered by the server.
  • clients may also receive commands from a user and forward results of those commands back to a server.
  • clients may be familiar, off-the-shelf, browser applications, such as Netscape Navigator or Internet Explorer, or clients may be proprietary applications. According to the present invention, where desired in a particular installation, both off-the-shelf and proprietary clients can simultaneously access image data.
  • these elements perform separable tasks appropriate to that element to allow for a flexible and scalable surveillance system.
  • the flexible system according to the invention allows various data and image processing tasks to be easily incorporated into specific systems depending on application.
  • cameras and FGs can employ digital signature key technology or other technology to verify that an image was not altered after it was initially captured.
  • FIG. 1 is a diagram of an illustrative embodiment of the invention using representative hardware elements as it might be deployed in a moderately sized business or academic setting.
  • FIG. 2 is a diagram of an alternative embodiment of the invention using representative hardware elements as it might be deployed at a single location, such as a single moderately sized building.
  • FIG. 3 is an illustrative functional diagram of an embodiment of the invention.
  • FIG. 1 shows an illustrative specific embodiment of a surveillance system according to the invention.
  • a surveillance system consists of a number of frame grabbers (FGs) 10 , each of which include one or more digital cameras 12 and controller 14 .
  • FGs are in communication with coordinator 20 , which may coordinate one to many FGs.
  • Coordinator 20 typically will include frame and incident storage 22 and may include rules storage 26 .
  • Coordinators 20 communicate with server 30 , which will typically include server image storage 32 and client interface 34 .
  • Interface 34 communicates with one or more viewing clients 40 - 42 .
  • Clients 40 - 42 may be standard, off-the-shelf client software allowing display of images and running on an appropriate computing device, such as a PC or workstation, web-capable television, etc.
  • Browser clients include Netscape Communicator and Internet Explorer.
  • One or more of clients 40 - 42 may also be proprietary client programs and may include specialized hardware, such as panel 42, which may be a security surveillance panel or a kiosk display.
  • Connections 50 are shown in FIG. 1 to illustrate a functional data pathway between components.
  • such pathways can be network connections, backplane bus connections, wireless connections, IC interconnects, or any other data channel appropriate for a particular embodiment hardware configuration of the invention.
  • the elements shown in FIG. 1 may be embodied in physically separable electronic devices, or alternatively, the elements may be embodied into a small number of more integrated physical devices.
  • FGs 10 may be constructed as a single electronic unit, with the camera and controller component sharing some of the same logic circuits.
  • some or all of coordinators 20 may exist as processes on the same computer that holds server 30 .
  • server 30 or coordinator 20 may be physically constructed of a number of closely cooperating computer hardware devices.
  • FIG. 1 can be understood as an illustration of functional elements of the invention with functions performed on different arrangements of hardware components as appropriate to a particular installation.
  • FIG. 2 shows an alternative illustrative embodiment of a surveillance system according to the invention using a single computer 100 as a hardware platform.
  • frame grabbers include one or more digital cameras 12 and video capture boards 13 .
  • Other functions of controller 14 are performed by logic running on computer 100 .
  • Capture Boards 13 are distributed in bus slots in the computer and communicate with the camera either through a direct line or via wireless transceivers.
  • Coordinator 20 exists as logic routines running on computer 100 , using storage of the computer for frame and incident storage and any rule storage.
  • Server 30 is also a process running on computer 100 .
  • a client process 40 may also in some embodiments run on computer 100 to allow local viewing of captured images.
  • computer 100 will also have an image server 30 for remote client viewing.
  • Interprocess communication in computer 100 allows for data exchange and in some cases data sharing between the various functional elements.
  • FG 10 includes an off-the-shelf or custom camera 12 capable of cooperating with other hardware to produce a digitally encoded image array.
  • many different types and brands of such cameras are available. Some of these cameras include a microphone and wireless transmission capability between the camera and the capture circuitry.
  • a currently available off-the-shelf Remington™ brand audio/video sender/receiver combination allows for image/audio capture at low lux and wireless transmission to a capture board.
  • Many such cameras employ either well-known CCD or CMOS technology to capture a digital image. It is expected that an even wider range of such cameras will be available in the future, with greater capabilities that will make them particularly suitable for use in some embodiments of the present invention.
  • Camera hardware often includes “steady cam” technology that performs some corrections for vibrations of the camera.
  • camera 12 will generally be non-moving (i.e. fixed) and will be located to capture a view of interest.
  • camera 12 can be fitted with a wide-angle or “fish eye” lens to allow it to capture a large area.
  • software in the FG or in the coordinator or in the server is used to remove distortion caused by the lens and to flatten the captured image for viewing.
  • an FG captures an image of a larger area than will typically be displayed at one time at a client.
  • Logic routines in either the FG, the coordinator, or the server allow a client viewer to pan and zoom a view of the captured image.
  • Some capabilities of FG cameras that are either presently available or are anticipated soon to be available are the ability to capture frames of 15 million pixels, allowing for greater zooming capability; the ability to operate at very low light levels; the ability to capture infrared radiation or other non-visible electromagnetic radiation; and the ability to capture synched audio data.
  • Camera hardware often includes “steady cam” technology that performs some corrections for vibrations of the camera.
  • CCD-type video digital cameras generally produce an analog video scan signal, which must be converted to digital form for digital processing or storage.
  • cameras will increasingly become available that produce a byte-stream or bit-stream description of the detected image.
  • Controller 14 includes capture circuitry and low-level processing and control logic immediately associated with the camera to allow for an efficient and flexible overall system.
  • controller 14 may be a PC-type microcontroller with an off-the-shelf, programmable, video capture board.
  • controller 14 may include or be comprised of custom designed logic circuits. Controller 14 captures, and for a short time stores, sequential still images from the camera in the form of digital data. For analog output cameras, controller 14 converts the analog signal to digital.
  • video capture boards receive a video signal and convert it to still digital frames at a selectable frame rate.
  • the still digital frames are encoded as full-color images and the capture board may perform some low-level color and brightness correction of the received signal.
  • the capture board delivers, on demand, digital captured frames.
  • the image delivered is compressed and converted to an encoded format such as GIF or JPEG while in others only 24-bit color is possible.
  • Off-the-shelf video capture board brands include Videum and ATI.
  • Digital encoding of images can take many different forms.
  • One well-known, uncompressed form for full-color digital images is a two dimensional array of numerical pixel values, with each pixel holding three 8-bit numerical values, one value indicating Red intensity, one Green, and one Blue.
  • each pixel requires 24 bits of data and can represent one of 2^24 (16 million) different colors, and an uncompressed, 24-bit image with an image size of 640×800 pixels requires 1.5 Mbytes of storage.
  • Other encoding schemes are known, such as schemes that use fewer bits for each color value, and schemes that use different numbers of bits for different colors.
  • One well known technique for image compression can be generally referred to as the table/substitution technique.
  • the total number of colors actually displayed in a single image is reduced from 16 million to a smaller number, such as 256.
  • a palette or table is created by analyzing the original 16 million color image and selecting 256 of those colors for display. Those selected 256 colors are then stored in a 256 entry indexed palette and the index number (in one known method, an 8-bit number) for a color is substituted as the pixel value for that color.
  • the substituted pixel image and the palette are then used to represent the image, reducing a 1.5 Mbyte image to closer to 0.5 Mbytes.
  • compression techniques are used to further reduce the storage needed for an encoded image.
  • certain table values are reserved for predefined colors. Pure black and pure white are commonly reserved colors. In some schemes, a value is also reserved for a transparent pixel.
  • the FG controller processes the captured image and determines whether to send to the camera coordinator a full frame, a computed differential frame, or no frame. This determination may be based on the amount of change between the captured image and a previously transmitted image, the elapsed time since the previous transmission, the number of differential frames sent since the previous full frame, or other criteria.
  • a frame grabber also transmits differential scores that indicate an amount of change in the current frame from the previous frame.
  • Basic processing according to the invention involves a reference frame and a current frame, which are generally images of the same size and same encoding.
  • the current frame is the frame newly captured by the capture board.
  • the reference frame is a frame held in memory at the controller to which the current frame will be compared.
  • a differential score is determined for a current frame by determining which pixels in the current frame have a different value from the corresponding pixels in the reference frame.
  • a number of variations in computing and expressing differential scores are possible.
  • a raw differential percentage score may be computed by comparing each bit in the current frame to the corresponding bit in the reference frame. If there is any difference in value, that pixel is considered a changed pixel. The ratio of the sum of all changed pixels to the total number of pixels in the image is the raw differential percentage score.
  • a differential percentage score may also be computed using threshold logic routines to filter out differences between pixels that are not of interest, such as when a change is of minor intensity, or only affects a very small area.
  • Threshold algorithms can be defined in a variety of ways appropriate to the particular overall image conditions and applications. Thresholds can be defined that are different for different colors, such that a change in a green or red value, for example, is more likely to cause a pixel to be counted as different than a change in a blue value.
  • the controller may analyze an image by dividing the image into cells of roughly 5×5 pixels and determining the number of 5×5 cells in the current image that have changed compared to the reference image and generating a differential score from this comparison.
  • threshold logic can compare a current frame to more than one previous frame in order to determine whether captured values are “flickering” while the actual image before the camera is unchanged. Such flickering is common in low light situations.
  • the controller may compute more than one type of differential score for an image.
  • a differential score may be used by the controller to determine whether or not to transmit a frame according to the controller's rule set and whether or not the controller believes an incident has occurred.
  • One or more differential scores may be transmitted along with frames transmitted by the controller to the coordinator.
  • a differential frame is constructed of the same size as the captured image and a reference image, using the same or a similar file format.
  • pixel values from the current frame that are identical to or within tolerances of the reference frame are set to a value indicating transparency and pixels that have changed retain the value from the newly captured image. This allows for a high compression of the differential frame and for easy updating at a client viewer by superimposing a series of differential frames over a displayed full frame, as discussed below.
  • the controller can perform a still image compression routine. In many image formats, this compression routine is built into the format.
  • a controller can operate with only two full frames in memory, a current frame and a reference frame, and a buffer for holding differential frames.
  • the controller may retain additional image files to maintain a history of image processing for retransmission purposes or for more involved threshold and image analysis.
  • the process of computing one or more differential scores and constructing a differential frame may be combined such that as the controller scans the pixels in the captured image frame, it computes differential scores and builds a differential frame.
  • From time to time, and whenever requested by a coordinator, a controller will transmit a current full frame. Among other functions, this allows the coordinator to catch up pixels that changed so gradually over time that they never registered as differential pixels. In one embodiment, a full frame is sent every 10 images.
  • the controller logic may perform a number of other image processing functions as known in the art, such as converting the captured visual image into a different format.
  • One format that may be advantageously used is the well-known GIF format for encoding and compressing digital images.
  • Other suitable formats include PNG, JPEG, etc.
  • the camera is motionless. This allows for simpler control and processing logic and for easier detection of incidents and computation of differentials. Any panning or zooming for viewing the image is accomplished not by the camera itself, but by logic functions closer to the client viewer, as described below.
  • the invention may include moving or moveable cameras.
  • when a camera is moving, techniques that take into account movement of the camera are used to compute the differential, or computing of the differential can be suspended during camera movement and full captured images can be transmitted.
  • Coordinator 20 receives frame data and possibly other control data from one or more FGs.
  • frame data is in the form of still images, including full (update) frames and differential frames and may include differential scores from some or all frames.
  • Control data may include data indicating that the FG detected a differential. It may also include data indicating the current position or focus depth of a moveable FG, an FG identifier, and a time signal. Transmission of frames to the coordinator can take place according to one or more of the following: at expiration of a time interval since the last transmission, upon detection of a difference at a controller, at the request of the coordinator.
  • Coordinator 20 in one embodiment also may send commands to the FGs to control aspects of frame acquisition or transmission.
  • commands may include a request to resend a frame, change camera characteristics such as brightness or contrast, send a full frame, set the frequency for frame transmission, establish rules regarding when frames should be transmitted, establish a tolerance level for determining if a differential frame should be transmitted, etc.
  • the coordinator will include an interface allowing a user to program certain features of the controller, such as indicating regions of the visual field that should be processed differently.
  • the coordinator might be able to receive user commands allowing a user to indicate that pixel changes in certain regions, such as windows or doorways, are not of interest during certain hours.
  • a coordinator is primarily responsible for determining if an incident occurred.
  • the coordinator accomplishes this using a rules-based engine or similar logical process that may take into account time of day, day of the week, nature of the pixel change detected, etc., as illustrated in the sketch below.
  • the coordinator takes into account differential scores transmitted by the FG.
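A minimal sketch of such a rules-based incident determination is given below. This Python fragment is purely illustrative: the Rule fields, the thresholds, and the is_incident helper are assumptions made for this sketch, not structures taken from the patent.

```python
# Hypothetical rules-based incident check a coordinator might run.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Rule:
    camera_id: str
    start_hour: int   # hour of day at which this rule becomes active
    end_hour: int     # hour of day at which this rule stops applying
    days: set         # days of week the rule covers (0=Monday .. 6=Sunday)
    min_score: float  # minimum differential score that counts as an incident

def is_incident(camera_id: str, score: float, when: datetime, rules: list) -> bool:
    """Return True if any active rule for this camera is exceeded by the score."""
    for rule in rules:
        if rule.camera_id != camera_id:
            continue
        if when.weekday() not in rule.days:
            continue
        if not (rule.start_hour <= when.hour < rule.end_hour):
            continue
        if score >= rule.min_score:
            return True
    return False

# Example: ignore small changes at camera "lobby-1" during business hours,
# but treat any change above 2% as an incident overnight.
rules = [
    Rule("lobby-1", 8, 18, {0, 1, 2, 3, 4}, min_score=0.15),
    Rule("lobby-1", 0, 8, {0, 1, 2, 3, 4, 5, 6}, min_score=0.02),
]
print(is_incident("lobby-1", 0.05, datetime(2024, 1, 9, 3, 30), rules))  # True
```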
  • the coordinator also provides the principal incident and history database for its connected cameras and includes the ability to play back stored incidents. In further embodiments, the coordinator additionally has the ability to connect multiple incidents, triggered at multiple cameras, into an incident sequence.
  • the coordinator has positional and view information about each camera and information about overlapping regions of cameras.
  • a coordinator will generally include a large amount of longer term, non-volatile storage, such as large disk drives or removable storage technologies such as tape, or write-once or read/write CDs or DVDs.
  • In embodiments with a large number of cameras, a coordinator will be a workhorse machine accomplishing much of the computation-intensive processing needed by the system. As a result, a coordinator in such a system may be constructed of a number of cooperating computers or a multi-CPU computer system. A coordinator handles the principal time-stamping function for frames or incidents.
  • the coordinator includes a management interface to a management station 26 , which may be local to the coordinator or may connect remotely.
  • the management interface allows a user to perform various management functions, such as setting time parameters for whether incidents from particular cameras will be of interest, establishing other rules definitions, generating alerts and exception reports regarding cameras that have not reported in for a while, installing new software, and performing other maintenance functions.
  • the management station reports on its interactions with the camera server, such as which cameras have been accessed and how frequently, and it receives commands from the camera server.
  • the coordinator can also perform advanced image processing tasks such as image recognition or tracking a person or object identified in an image or determining that an object is coming toward or moving away from the camera.
  • the camera coordinator sends commands to the camera server regarding detected incidents or changes of an image that allow the server to intelligently control the view of connected clients by changing the view of images displayed at the clients or by creating new windows and directing images to those new windows.
  • a principal function of image server 30 is image delivery to client software for presentation to an observer.
  • the server can force a client to create new windows and can direct incidents to different windows.
  • the server employs push technology, wherein the server can periodically deliver a differential image to the browser.
  • the server's delivery of full images and differentials allows a client viewer to display a pseudo real time representation of the image seen by the camera by overlaying the differential images on the existing displayed image, with a minimum of processing and a minimum of transmission between the server and the client.
  • a server includes cache storage and may keep a current full frame in memory for all active attached cameras so that the server can transmit a full frame on demand when a client requests it.
  • the image server typically includes software with the ability to perform pan and zoom functions of an image.
  • a server has the necessary logic to talk to Java code, or similar code, running in the client. This allows a server to determine if it should send a new image, such as to a newly connected client. The newly connected client will receive a full frame and enough differential frames to get synchronized with the current view.
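The frame-delivery behavior described in the preceding bullets (caching a full frame plus subsequent differential frames, synchronizing a newly connected client, and overlaying differentials at the client) can be sketched as follows. All names here (FrameCache, sync_new_client, apply_differential) and the reserved TRANSPARENT value are illustrative assumptions, not terminology from the patent.

```python
# Hypothetical sketch of server-side caching and client-side overlay of differentials.
TRANSPARENT = -1  # assumed reserved pixel value meaning "unchanged"

class FrameCache:
    def __init__(self, full_frame):
        self.full_frame = full_frame   # last full frame received for this camera
        self.differentials = []        # differential frames received since then

    def add_differential(self, diff_frame):
        self.differentials.append(diff_frame)

    def set_full_frame(self, full_frame):
        self.full_frame = full_frame
        self.differentials = []        # older differentials are obsolete after a full frame

def sync_new_client(cache: FrameCache):
    """Frames the server would push to a client that just connected."""
    return [cache.full_frame] + list(cache.differentials)

def apply_differential(displayed, diff_frame):
    """Client-side overlay: copy only the changed (non-transparent) pixels."""
    for y, row in enumerate(diff_frame):
        for x, value in enumerate(row):
            if value != TRANSPARENT:
                displayed[y][x] = value
    return displayed
```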
  • Prior art systems for transmitting moving images, such as MPEG or the I-see-you-you-see-me system, suffer from utilizing a more complex encoding scheme requiring more specialized hardware and software interfaces on both the capturing end and the receiving end.
  • the present invention benefits from the wide distribution of image viewing systems using simple static image coding in compression formats.
  • a surveillance system may be built with a low-speed/low-bandwidth connection between the FGs and the coordinator, a high-speed/high-bandwidth connection between the coordinators and the server, and low-speed connections with individual clients.
  • client viewers 40 can include off-the-shelf PC's running off-the-shelf internet browsers.
  • the browsers will be JAVA-enabled to allow image server 30 to switch views or create new windows.
  • Clients can also include custom surveillance consoles, such as console 42. These consoles can coexist in a networked environment with other viewers.
  • FIG. 3 is an illustrative block diagram showing exemplary frame handling according to a specific embodiment of the invention. It will be clear to those of skill in the art that many variations in image and frame handling within the scope of the invention are possible.

Abstract

Methods and apparatus for an image server surveillance system provide for control and coordination of cameras that may be widely deployed, analyzing data from multiple cameras, making data available in such a way that it can be efficiently transmitted over a network and can be easily displayed to potentially a large number of users, and displaying and controlling image data by existing client software.

Description

    CROSS-REFERENCES TO RELATED APPLICATIONS
  • This application is a divisional of patent application Ser. No. 09/482,181, filed 12 Jan. 2000, now U.S. patent ______, which claims priority from U.S. provisional application Ser. No. 60/131,990, filed Apr. 30, 1999, each of which is incorporated herein by reference.
  • BACKGROUND OF THE INVENTION
  • The present invention is in the field of electronic circuits and camera systems. More particularly, the present invention is directed to a system for surveillance using digital images and image servers.
  • Many types of camera surveillance systems are known. Typical building surveillance systems today capture analog video signals from one or more video cameras and transmit those signals to a security panel for viewing by security personnel. Deployment of such systems over a large area and making the video images available over a network can be problematic because of the large bandwidth requirements of the video signal. Monitoring of multiple analog cameras is also difficult; for example, a human viewer's attention may not be on the security panel or directed to the correct camera image at the time an incident occurs. And, in general, the number of cameras a human can effectively monitor is limited. While techniques for motion detection in surveillance systems are known, the complexity and expense of incorporating these techniques into analog systems has limited the use of motion detection in many video surveillance systems.
  • Another problem that arises in analog surveillance systems is storage and playback technology of analog video data. Typical security cameras, at a retail store for example, employ videotape technology wherein full-motion video is continuously recorded, without regard to whether an incident of interest has occurred. Video tapes are retrieved and played back on the rare occasions when an incident occurs. A major problem with such systems is that the videotapes are often recorded at the slowest speed, giving the poorest image quality, and are repeatedly rerecorded. As a result, playback image quality is often very poor and when an incident does occur, investigators cannot get a clear enough image of individuals involved in the incident to make an identification. In response to this problem, the Federal Bureau of Investigation has established a laboratory program whose primary function is to help law enforcement personnel enhance poor quality images from video surveillance systems in order to aid in investigations.
  • It is known to make digitized video images available over the web for presentation by a web browser. Generally, such systems periodically update a full-frame captured still image from a camera using a push (controlled at the server side) or a pull (controlled at the client side) technology. Such systems have had a limited deployment to make images of such things as ski slope weather conditions, elephant houses at a zoo, or children at a day care center, available over the web using a standard web browser. In some applications, such as the day care center, access to the image is password protected so that only authorized viewers can receive the images.
  • One group of cameras and camera servers for these applications is marketed under the brand name Axis. However, these installations are generally limited to a single camera or a few cameras and do not have the ability to be deployed as a flexible and fully functional surveillance system. Standard Axis technology also generally relies on full-frame updating and has only limited ability to reduce the bandwidth of images.
  • A number of techniques are known for compressing digital video information. Well known techniques for digital video include hardware assisted techniques such as MPEG, DVI, Motion JPEG, and software-only techniques such as QuickTime, Video for Windows, RealVideo, or AVI. Some of these techniques include mechanisms for processing and transmitting delta frame information, wherein delta frames encode information about pixels that have changed between one frame and another. However, these compression techniques for the most part are concerned with the quality of reproduction of real-time video images and have not been optimized for use in surveillance systems or for use in systems that do not contain custom video playback software or hardware.
  • What is needed is a flexible surveillance system that can capture image data from a number of digital cameras and make that data available to viewers in a variety of different ways. In some applications, what is further needed is a surveillance system with a basic architecture that is scalable, allowing for efficient installation, coordination, and control of one to a few to thousands of individual cameras and one to a few to thousands of individual clients. Additionally, what is needed is an integrated system for digital surveillance that at every step of image processing optimizes images for easy storage, analysis, transmission, and presentation in a surveillance system.
  • SUMMARY OF THE INVENTION
  • Specific embodiments of the present invention address a number of problems associated with a digital camera surveillance system, such as control and coordination of digital cameras that may be widely deployed, analyzing data from multiple cameras, making data available in such a way that it can be efficiently transmitted over a network and can be easily displayed to potentially a large number of users, and displaying and controlling image data by existing client software such as a browser. According to the invention, these problems are addressed by providing a flexible and scalable surveillance system and method; the method and system according to the present invention can work effectively in small installations with just a few cameras and only one viewer as well as in installations including thousands of cameras, widely dispersed, allowing for selectable viewing by many viewers.
  • In a specific embodiment, the invention consists of the following functional elements:
  • (1) Multiple Frame Grabbers (FGs) that include one or more cameras, digital image capture circuitry, and low-level logic routines. In one embodiment, an FG comprises a PC equipped with one or more off-the-shelf video capture boards, with each video capture board connected to a camera. The PC is programmed according to the invention to control the video capture functions and to perform low-level logic processing. FG low-level logic processing generally includes one or more of the following: short-term storage of full images, computing of differential images, computing differential scores for a current image, filtering of gradual ambient light changes, and adjusting of camera characteristics. FGs have a communication interface to send full frames and differential frames to a coordinator.
  • (2) One or more Camera Coordinators for receiving full frames, differential frames, and possibly other data from FGs, storing this data, and for adding a higher level of image processing. Coordinators generally include logic for one or more of the following: detecting and storing an incident from one or more FGs, resolving incidents from multiple FGs into an incident sequence; image recognition; logging and cataloging incidents according to a rules-based engine; generating alarms to security personnel or a server, etc. A coordinator may also include an interface for sending control signals to the FG to control basic FG functions such as frequency of capture, focus, contrast, and, for moveable FGs, positioning.
  • (3) A Camera Server for providing an interface to one or more client viewers. A server handles image presentation and may include logic allowing a client to pan and zoom the view of an image. A server includes logic to provide an intelligent interface to a client viewer including launching windows in the client viewer when incidents are detected and updating open windows with differential frames and full frames. A server may also include an interface for receiving commands back from a client and forwarding those commands to a coordinator when appropriate. In some embodiments, a server also provides a possibly high capacity connection to the Internet, allowing potentially thousands of viewers to view the same image.
  • (4) One or more clients for displaying images delivered by the server. In some applications, clients may also receive commands from a user and forward results of those commands back to a server. In various embodiments of the invention, clients may be familiar, off-the-shelf, browser applications, such as Netscape Navigator or Internet Explorer, or clients may be proprietary applications. According to the present invention, where desired in a particular installation, both off-the-shelf and proprietary clients can simultaneously access image data.
  • According to the invention, these elements perform separable tasks appropriate to that element to allow for a flexible and scalable surveillance system. The flexible system according to the invention allows various data and image processing tasks to be easily incorporated into specific systems depending on application. In security surveillance systems where later authentication of a recorded digital image is important, for example, cameras and FGs can employ digital signature key technology or other technology to verify that an image was not altered after it was initially captured.
  • A further understanding of the invention can be had from the detailed discussion of specific embodiments below. For purposes of clarity, this discussion refers to digital devices and concepts in terms of specific examples. However, the method and apparatus of the present invention may operate with a wide variety of types of digital devices. It is therefore not intended that the invention be limited except as provided in the attached claims. Furthermore, it is well known in the art that logic systems can include a wide variety of different components and different functions in a modular fashion. Different embodiments of a system can include different mixtures of elements and functions and may group various functions as parts of various elements. For purposes of clarity, the invention is described in terms of systems that include many different innovative components and innovative combinations of components. No inference should be taken to limit the invention to combinations containing all of the innovative components listed in any illustrative embodiment in the specification, and the invention should not be limited except as provided in independent embodiments described in the attached claims.
  • The invention will be better understood with reference to the following drawings and detailed description.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a diagram of an illustrative embodiment of the invention using representative hardware elements as it might be deployed in a moderately sized business or academic setting.
  • FIG. 2 is a diagram of an alternative embodiment of the invention using representative hardware elements as it might be deployed at a single location, such as a single moderately sized building.
  • FIG. 3 is an illustrative functional diagram of an embodiment of the invention.
  • DESCRIPTION OF THE SPECIFIC EMBODIMENTS Overview of Two Typical Embodiments
  • Example Embodiment for a Large Campus
  • FIG. 1 shows an illustrative specific embodiment of a surveillance system according to the invention. Such a system consists of a number of frame grabbers (FGs) 10, each of which include one or more digital cameras 12 and controller 14. FGs are in communication with coordinator 20, which may coordinate one to many FGs. Coordinator 20 typically will include frame and incident storage 22 and may include rules storage 26. Coordinators 20 communicate with server 30, which will typically include server image storage 32 and client interface 34. Interface 34 communicates with one or more viewing clients 40-42. Clients 40-42 may be standard, off-the-shelf client software allowing display of images and running on an appropriate computing device, such as a PC or workstation, web-capable television, etc. Well-known, currently available, browser clients include Netscape Communicator and Internet Explorer. One or more of clients 40-42 may also be proprietary client programs and may include specialized hardware, such as panel 42, which may be a security surveillance panel or a kiosk display.
  • Connections 50 are shown in FIG. 1 to illustrate a functional data pathway between components. As is known in the art, such pathways can be network connections, backplane bus connections, wireless connections, IC interconnects, or any other data channel appropriate for a particular embodiment hardware configuration of the invention. According to the invention, the elements shown in FIG. 1 may be embodied in physically separable electronic devices, or alternatively, the elements may be embodied into a small number of more integrated physical devices. FGs 10, for example, may be constructed as a single electronic unit, with the camera and controller component sharing some of the same logic circuits. In some installations, some or all of coordinators 20 may exist as processes on the same computer that holds server 30. Conversely, as is known in the art, server 30 or coordinator 20 may be physically constructed of a number of closely cooperating computer hardware devices. Thus, FIG. 1 can be understood as an illustration of functional elements of the invention with functions performed on different arrangements of hardware components as appropriate to a particular installation.
  • Example Embodiment for a Small Site Installation
  • FIG. 2 shows an alternative illustrative embodiment of a surveillance system according to the invention using a single computer 100 as a hardware platform. In this system, frame grabbers include one or more digital cameras 12 and video capture boards 13. Other functions of controller 14 are performed by logic running on computer 100.
  • Capture Boards 13 are distributed in bus slots in the computer and communicate with the camera either through a direct line or via wireless transceivers. Coordinator 20 exists as logic routines running on computer 100, using storage of the computer for frame and incident storage and any rule storage. Server 30 is also a process running on computer 100.
  • A client process 40 may also in some embodiments run on computer 100 to allow local viewing of captured images. Typically computer 100 will also have an image server 30 for remote client viewing. Interprocess communication in computer 100 allows for data exchange and in some cases data sharing between the various functional elements.
  • Components of a Surveillance System
  • Frame Grabber Camera
  • FG 10 includes an off-the-shelf or custom camera 12 capable of cooperating with other hardware to produce a digitally encoded image array. Many different types and brands of such cameras are available. Some of these cameras include a microphone and wireless transmission capability between the camera and the capture circuitry. For example, a currently available off-the-shelf Remington™ brand audio/video sender/receiver combination allows for image/audio capture at low lux and wireless transmission to a capture board. Many such cameras employ either well-known CCD or CMOS technology to capture a digital image. It is expected that an even wider range of such cameras will be available in the future, with greater capabilities that will make them particularly suitable for use in some embodiments of the present invention. Camera hardware often includes “steady cam” technology that performs some corrections for vibrations of the camera.
  • In one desirable embodiment, camera 12 will generally be non-moving (i.e. fixed) and will be located to capture a view of interest. As is known in the art, camera 12 can be fitted with a wide-angle or “fish eye” lens to allow it to capture a large area. In such a case, software in the FG or in the coordinator or in the server is used to remove distortion caused by the lens and to flatten the captured image for viewing.
  • In some embodiments, an FG captures an image of a larger area than will typically be displayed at one time at a client. Logic routines in either the FG, the coordinator, or the server allow a client viewer to pan and zoom a view of the captured image.
  • Some capabilities of FG cameras that are either presently available or are anticipated soon to be available are the ability to capture frames of 15 million pixels, allowing for greater zooming capability; the ability to operate at very low light levels; the ability to capture infrared radiation or other non-visible electromagnetic radiation; and the ability to capture synched audio data. Camera hardware often includes “steady cam” technology that performs some corrections for vibrations of the camera.
  • As is known in the art, CCD-type video digital cameras generally produce an analog video scan signal, which must be converted to digital form for digital processing or storage. However, it is expected that cameras will increasingly become available that produce a byte-stream or bit-stream description of the detected image.
  • Frame Grabber Controller
  • Image Capture and Standard Image Encoding
  • Controller 14 includes capture circuitry and low-level processing and control logic immediately associated with the camera to allow for an efficient and flexible overall system. In one embodiment, controller 14 may be a PC-type microcontroller with an off-the-shelf, programmable, video capture board. In an alternative embodiment, controller 14 may include or be comprised of custom designed logic circuits. Controller 14 captures, and for a short time stores, sequential still images from the camera in the form of digital data. For analog output cameras, controller 14 converts the analog signal to digital.
  • As is known in the art, video capture boards receive a video signal and convert it to still digital frames at a selectable frame rate. Generally, the still digital frames are encoded as full-color images and the capture board may perform some low-level color and brightness correction of the received signal. The capture board delivers, on demand, digital captured frames. In some capture boards, the image delivered is compressed and converted to an encoded format such as GIF or JPEG while in others only 24-bit color is possible. Off-the-shelf video capture board brands include Videum and ATI.
  • Digital encoding of images can take many different forms. One well-known, uncompressed form for full-color digital images is a two dimensional array of numerical pixel values, with each pixel holding three 8-bit numerical values, one value indicating Red intensity, one Green, and one Blue. Thus, each pixel requires 24 bits of data and can represent one of 2^24 (16 million) different colors, and an uncompressed, 24-bit image with an image size of 640×800 pixels requires 1.5 Mbytes of storage. Other encoding schemes are known, such as schemes that use fewer bits for each color value, and schemes that use different numbers of bits for different colors.
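As a quick check of the arithmetic in the preceding paragraph, the following short fragment reproduces the storage and color-count figures:

```python
# One uncompressed 640x800 frame at 24 bits (3 bytes) per pixel.
width, height, bytes_per_pixel = 640, 800, 3
frame_bytes = width * height * bytes_per_pixel
print(frame_bytes)   # 1536000 bytes, i.e. roughly 1.5 Mbytes
print(2 ** 24)       # 16777216 distinct 24-bit colors, i.e. about 16 million
```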
  • One well known technique for image compression can be generally referred to as the table/substitution technique. In this technique, the total number of colors actually displayed in a single image is reduced from 16 million to a smaller number, such as 256. A palette or table is created by analyzing the original 16 million color image and selecting 256 of those colors for display. Those selected 256 colors are then stored in a 256 entry indexed palette and the index number (in one known method, an 8-bit number) for a color is substituted as the pixel value for that color. The substituted pixel image and the palette are then used to represent the image, reducing a 1.5 Mbyte image to closer to 0.5 Mbytes. In many known encoding formats, compression techniques are used to further reduce the storage needed for an encoded image. In some table/substitution schemes, certain table values are reserved for predefined colors. Pure black and pure white are commonly reserved colors. In some schemes, a value is also reserved for a transparent pixel.
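The table/substitution technique described above can be illustrated with a short sketch. The palette-selection strategy here (most frequent colors, nearest-match substitution) is only one simple possibility offered for illustration; real encoders such as GIF use more sophisticated quantization, and the helper names are not from the patent.

```python
# Minimal sketch of palette (table/substitution) encoding.
from collections import Counter

def build_palette(pixels, size=256):
    """pixels: iterable of (r, g, b) tuples; returns up to `size` most common colors."""
    counts = Counter(pixels)
    return [color for color, _ in counts.most_common(size)]

def nearest_index(color, palette):
    """Index of the palette entry closest to `color` (simple squared-distance match)."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(color, c))
    return min(range(len(palette)), key=lambda i: dist(palette[i]))

def quantize(image, palette):
    """Replace every 24-bit (r, g, b) pixel with an 8-bit palette index."""
    return [[nearest_index(px, palette) for px in row] for px_row in [] or image for row in [px_row]]

# A 24-bit pixel takes 3 bytes; a palette index takes 1 byte, so the substituted
# image is roughly one third the size, plus the 256 x 3 byte palette itself.
```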
  • Processing Captured Frames
  • Once an image is captured, the FG controller processes the captured image and determines whether to send to the camera coordinator a full frame, a computed differential frame, or no frame. This determination may be based on the amount of change between the captured image and a previously transmitted image, the elapsed time since the previous transmission, the number of differential frames sent since the previous full frame, or other criteria. In one embodiment, a frame grabber also transmits differential scores that indicate an amount of change in the current frame from the previous frame.
  • A number of variations in the processing of captured images to generate differential frames are possible according to the invention, and processing steps can take place in various orders or in parallel. For ease of understanding, the following description is provided of an exemplary specific embodiment processing.
  • Basic processing according to the invention involves a reference frame and a current frame, which are generally images of the same size and same encoding. The current frame is the frame newly captured by the capture board. The reference frame is a frame held in memory at the controller to which the current frame will be compared.
  • Computing Differential Scores
  • A differential score is determined for a current frame by determining which pixels in the current frame have a different value from the corresponding pixels in the reference frame. A number of variations in computing and expressing differential scores are possible. A raw differential percentage score may be computed by comparing each bit in the current frame to the corresponding bit in the reference frame. If there is any difference in value, that pixel is considered a changed pixel. The ratio of the sum of all changed pixels to the total number of pixels in the image is the raw differential percentage score.
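A minimal sketch of the raw differential percentage score, assuming frames are represented as equally sized 2D arrays of pixel values (the function name and representation are assumptions for illustration):

```python
# Fraction of pixels whose value differs at all between reference and current frames.
def raw_differential_score(reference, current):
    changed = 0
    total = 0
    for ref_row, cur_row in zip(reference, current):
        for ref_px, cur_px in zip(ref_row, cur_row):
            total += 1
            if ref_px != cur_px:   # any difference in value marks the pixel changed
                changed += 1
    return changed / total

reference = [[0, 0, 0], [0, 0, 0]]
current   = [[0, 9, 0], [0, 0, 9]]
print(raw_differential_score(reference, current))   # 2 of 6 pixels changed -> 0.333...
```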
  • A differential percentage score may also be computed using threshold logic routines to filter out differences between pixels that are not of interest, such as when a change is of minor intensity, or only affects a very small area. Threshold algorithms can be defined in a variety of ways appropriate to the particular overall image conditions and applications. Thresholds can be defined that are different for different colors, such that a change in a green or red value, for example, is more likely to cause a pixel to be counted as different than a change in a blue value.
  • In a further embodiment, the controller may analyze an image by dividing the image into cells of roughly 5×5 pixels and determining the number of 5×5 cells in the current image that have changed compared to the reference image and generating a differential score from this comparison.
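The cell-based variant might look like the following sketch. The 5×5 cell size comes from the paragraph above, while the "any changed pixel marks the cell changed" criterion and the function name are assumptions made for illustration:

```python
# Cell-based differential score: fraction of ~5x5 cells containing any changed pixel.
def cell_differential_score(reference, current, cell=5):
    height, width = len(reference), len(reference[0])
    changed_cells = 0
    total_cells = 0
    for top in range(0, height, cell):
        for left in range(0, width, cell):
            total_cells += 1
            cell_changed = any(
                reference[y][x] != current[y][x]
                for y in range(top, min(top + cell, height))
                for x in range(left, min(left + cell, width))
            )
            if cell_changed:
                changed_cells += 1
    return changed_cells / total_cells
```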
  • In a further embodiment, threshold logic can compare a current frame to more than one previous frame in order to determine whether captured values are “flickering” while the actual image before the camera is unchanged. Such flickering is common in low light situations.
  • The controller may compute more than one type of differential score for an image. A differential score may be used by the controller to determine whether or not to transmit a frame according to the controller's rule set and whether or not the controller believes an incident has occurred. One or more differential scores may be transmitted along with frames transmitted by the controller to the coordinator.
  • Creating Differential Frames
  • A differential frame is constructed of the same size as the captured image and a reference image, using the same or a similar file format. In the differential frame, pixel values from the current frame that are identical to or within tolerances of the reference frame are set to a value indicating transparency and pixels that have changed retain the value from the newly captured image. This allows for a high compression of the differential frame and for easy updating at a client viewer by superimposing a series of differential frames over a displayed full frame, as discussed below. Once the differential frame has been constructed, the controller can perform a still image compression routine. In many image formats, this compression routine is built into the format.
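  • A minimal sketch of differential frame construction, using an RGBA output in which zero alpha marks transparency; a format-specific reserved palette index, as in GIF or PNG, would serve the same purpose:

```python
import numpy as np

def make_differential_frame(current_rgb, reference_rgb, tolerance=8):
    """Build a differential frame: pixels within tolerance of the reference
    become fully transparent, changed pixels keep their captured value."""
    diff = np.abs(current_rgb.astype(np.int32) - reference_rgb.astype(np.int32))
    changed = np.any(diff > tolerance, axis=-1)
    out = np.zeros(current_rgb.shape[:2] + (4,), dtype=np.uint8)
    out[changed, :3] = current_rgb[changed]
    out[changed, 3] = 255        # opaque where the image changed
    return out                   # alpha == 0 everywhere the image is unchanged
```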
  • Other Controller Operation
  • As can be seen from above, in a simple and compact embodiment, a controller can operate with only two full frames in memory, a current frame and a reference frame, and a buffer for holding differential frames. As discussed above, in an alternative embodiment, the controller may retain additional image files to maintain a history of image processing for retransmission purposes or for more involved threshold and image analysis.
  • The process of computing one or more differential scores and constructing a differential frame may be combined such that as the controller scans the pixels in the captured image frame, it computes differential scores and builds a differential frame.
  • From time to time, and whenever requested by a coordinator, a controller will transmit a current full frame. Among other functions, this allows the coordinator to catch up on pixels that changed so gradually over time that they never registered as differential pixels. In one embodiment, a full frame is sent every 10 images.
  • The controller logic may perform a number of other image processing functions as known in the art, such as converting the captured visual image into a different format. One format that may be advantageously used is the well-known GIF format for encoding and compressing digital images. Other suitable formats include PNG, JPEG, etc.
  • Moving Camera
  • In a preferred embodiment of the present invention, the camera is motionless. This allows for simpler control and processing logic and for easier detection of incidents and computation of differentials. Any panning or zooming for viewing the image is accomplished not by the camera itself, but by logic functions closer to the client viewer, as described below.
  • In an alternative embodiment, the invention may include moving or moveable cameras. When a camera is moving, either techniques that take camera movement into account are used to compute the differential, or computation of the differential is suspended during camera movement and full captured images are transmitted.
  • Camera Coordinator
  • Coordinator 20 receives frame data and possibly other control data from one or more FGs. According to one embodiment of the invention, frame data is in the form of still images, including full (update) frames and differential frames, and may include differential scores for some or all frames. Control data may include data indicating that the FG detected a differential. It may also include data indicating the current position or focus depth of a moveable FG, an FG identifier, and a time signal. Transmission of frames to the coordinator can take place according to one or more of the following: at expiration of a time interval since the last transmission, upon detection of a difference at a controller, or at the request of the coordinator.
  • Coordinator 20 in one embodiment also may send commands to the FGs to control aspects of frame acquisition or transmission. Such commands may include requests to resend a frame, change camera characteristics such as brightness or contrast, send a full frame, set the frequency for frame transmission, establish rules regarding when frames should be transmitted, establish a tolerance level for determining whether a differential frame should be transmitted, etc.
  • In a particular embodiment, the coordinator will include an interface allowing a user to program certain features of the controller, such as indicating regions of the visual field that should be processed differently. For example, the coordinator might be able to receive user commands allowing a user to indicate that pixel changes in certain regions, such as windows or doorways, are not of interest during certain hours.
  • In one embodiment, a coordinator is primarily responsible for determining if an incident occurred. The coordinator accomplishes this using a rules-based engine or similar logical process that may take into account time of day, day of the week, nature of the pixel change detected, etc. In determining that an incident has occurred, the coordinator takes into account differential scores transmitted by the FG.
  • In one embodiment, the coordinator also provides the principal incident and history database for its connected cameras and includes the ability to play back stored incidents. In further embodiments, the coordinator additionally has the ability to connect multiple incidents, triggered at multiple cameras, into an incident sequence. The coordinator has positional and view information about each camera and information about overlapping regions of cameras. A coordinator will generally include a large amount of longer-term, non-volatile storage, such as large disk drives or removable storage technologies such as tape, write-once or read/write CDs, or DVDs.
  • In embodiments with a large number of cameras, a coordinator will be a workhorse machine accomplishing much of the computation-intensive processing needed by the system. As a result, a coordinator in such a system may be constructed of a number of cooperating computers or a multi-CPU computer system. A coordinator handles the principal time-stamping function for frames or incidents.
  • The coordinator includes a management interface to a management station 26, which may be local to the coordinator or may connect remotely. The management interface allows a user to perform various management functions, such as setting time parameters for whether incidents from particular cameras will be of interest, establishing other rule definitions, generating alerts and exception reports regarding cameras that have not reported in for a while, installing new software, and performing other maintenance functions. The management station reports on its interactions with the camera server, such as which cameras have been accessed and how frequently, and it receives commands from the camera server.
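  • A toy rules-based check along these lines is sketched below; the thresholds, the business-hours rule, and the notion of “quiet” cameras are illustrative assumptions:

```python
from datetime import datetime

def incident_detected(diff_score, camera_id, now=None,
                      night_threshold=0.05, day_threshold=0.20,
                      quiet_cameras=()):
    """Toy rules-based incident check (all values are assumptions): a
    differential score that is modest during business hours may still be
    an incident at night, and some cameras can be ignored entirely."""
    now = now or datetime.now()
    if camera_id in quiet_cameras:
        return False
    business_hours = now.weekday() < 5 and 8 <= now.hour < 18
    threshold = day_threshold if business_hours else night_threshold
    return diff_score >= threshold
```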
  • The coordinator can also perform advanced image processing tasks such as image recognition or tracking a person or object identified in an image or determining that an object is coming toward or moving away from the camera. The camera coordinator sends commands to the camera server regarding detected incidents or changes of an image that allow the server to intelligently control the view of connected clients by changing the view of images displayed at the clients or by creating new windows and directing images to those new windows.
  • Image Server
  • A principal function of image server 30 is image delivery to client software for presentation to an observer. In a particular embodiment, the server can force a client to create new windows and can direct incidents to different windows. In a preferred embodiment, the server employs push technology, wherein the server can periodically deliver a differential image to the browser. The server's delivery of full images and differentials allows a client viewer to display a pseudo real time representation of the image seen by the camera by overlaying the differential images on the existing displayed image, with a minimum of processing and a minimum of transmission between the server and the client.
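  • A minimal sketch of the client-side overlay step, assuming differential frames arrive in the RGBA form sketched earlier (unchanged pixels carry zero alpha):

```python
import numpy as np

def apply_differential(displayed_rgb, differential_rgba):
    """Client-side update: copy only the opaque pixels of a differential
    frame onto the currently displayed image, leaving transparent pixels
    (unchanged areas) untouched."""
    opaque = differential_rgba[..., 3] > 0
    displayed_rgb[opaque] = differential_rgba[opaque, :3]
    return displayed_rgb
```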
  • A server includes cache storage and may keep a current full frame in memory for all active attached cameras so that the server can transmit a full frame on demand when a client requests it. The image server typically includes software with the ability to perform pan and zoom functions on an image.
  • In one embodiment, a server has the necessary logic to talk to Java code, or similar code, running in the client. This allows a server to determine if it should send a new image, such as to a newly connected client. The newly connected client will receive a full frame and enough differential frames to get synchronized with the current view.
  • Prior art systems for transmitting moving images, such as MPEG or the I-see-you/you-see-me system, suffer from utilizing a more complex encoding scheme requiring more specialized hardware and software interfaces on both the capture end and the receiving end. The present invention benefits from the wide distribution of image viewing systems using simple static image coding and compression formats.
  • The server also receives commands from the client that it passes on to the coordinator, such as requests for information, initiation of image streams, and termination of the last image stream.
  • According to the invention, a surveillance system may be built with a low-speed/low-bandwidth connection between the FGs and the coordinator, a high-speed/high-bandwidth connection between the coordinators and the server, and low-speed connections with individual clients.
  • Client Viewer
  • In one embodiment, client viewers 40 can include off-the-shelf PCs running off-the-shelf internet browsers. Preferably, the browsers will be JAVA-enabled to allow image server 30 to switch views or create new windows.
  • Clients can also include custom surveillance consoles, such as 42. These consoles can coexist in a networked environment with other viewers.
  • FIG. 3 is an illustrative block diagram showing exemplary frame handling according to a specific embodiment of the invention. It will be clear to those of skill in the art that many variations in image and frame handling within the scope of the invention are possible.

Claims (20)

1. A method for surveillance comprising:
capturing a plurality of still frames;
generating, from said plurality of still frames, a sequence of digital image arrays comprising a full frame and a plurality of differential frames;
transmitting said sequence to a camera coordinator;
determining, using said sequence, whether an incident is associated with one or more frames in said sequence;
transmitting said sequence to an image server;
storing said sequence at said image server; and
providing said sequence to one or more clients for viewing by a user.
2. The method according to claim 1 wherein said sequence stored at said image server is stored in a format designed for still image display on a client browser.
3. The method according to claim 1 wherein said sequence stored at said image server is stored in a format allowing for a pixel to be encoded as a transparent pixel.
4. The method according to claim 1 wherein said sequence stored at said image server comprises a full frame and one or more subsequent differential frames wherein pixels in a differential frame with values within a threshold of corresponding pixels in a preceding frame are set to transparent.
5. The method according to claim 1 wherein said generating creates a sequence of full and differential frames in a format designed for still image display on a client browser and allowing for a pixel to be encoded as a transparent pixel.
6. The method according to claim 5 wherein said sequence is transmitted to said camera coordinator, stored at said camera coordinator, transmitted to said image server, stored at said image server, and viewed by a client all using an image encoding format for still image display on a client browser and allowing for a pixel to be encoded as a transparent pixel.
7. The method according to claim 2 wherein said format is the PNG format.
8. The method according to claim 2 wherein said format is the GIF format.
9. The method according to claim 1 wherein said deriving comprises computing a percentage value for a differential frame indicating a calculated percentage change between said differential frame and a preceding frame.
10. The method according to claim 1 wherein said determining comprises comparing a single still frame to a preceding frame.
11. The method according to claim 1 wherein said deriving includes computing a percentage value for a differential frame indicating a calculated percentage change between said differential frame and a preceding frame.
12. The method according to claim 1 wherein said clients comprise off-the-shelf internet browser software.
13. The method according to claim 1 further comprising:
storing said sequence at said camera coordinator.
14. The method according to claim 1 wherein said storing comprises storage of sequences for which incidents were detected for later transmission as requested by an image server.
15. The method according to claim 1 wherein said image server includes a network interface with a high bandwidth capacity allowing for multiple simultaneous client connections.
16. A method for surveillance comprising:
capturing a plurality of still frames as arrays of digital data;
designating a frame in said plurality as a full frame;
for a frame subsequent to said full frame, computing a differential frame wherein a pixel in said differential frame that is within a threshold of a geometrically corresponding pixel in a preceding frame is set to transparent;
for a frame subsequent to said full frame, computing a percentage difference indicating a degree of change of pixels from a preceding frame;
transmitting a full frame, one or more differential frames, and one or more computed percentages to a camera coordinator;
determining that an incident has occurred using rules-based logic to analyze data received from said frame grabber;
storing frame data, image data, and incident data;
transmitting frame data to an image server; and
presenting frame data by said image server to one or more clients for viewing by one or more users.
17. A method for capturing, analyzing, and presenting image data from one or more digital image capture devices comprising:
capturing a plurality of digital image frames;
producing a plurality of sequences, said sequences comprising a full frame followed by one or more differential frames wherein pixels in said differential frames are set to transparent when they have a value within a threshold of a value of corresponding pixels in a preceding frame;
determining whether an incident is associated with one or more frames;
storing said plurality of sequences; and
presenting one or more sequences to a client viewer in response to a viewer's request or when an incident is associated with a sequence.
18. The method according to claim 17 wherein said determining comprises computing a percentage of pixels that have changed in one frame from one or more preceding frames.
19. The method according to claim 17 wherein said sequence stored at said image server is stored in a format designed for still image display on a client browser.
20. The method according to claim 17 wherein said storing comprises storage of sequences for which incidents were detected for later transmission as requested by an image server.
US11/535,461 1999-04-30 2006-09-26 Method and apparatus for surveillance using an image server Abandoned US20070022456A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US11/535,461 US20070022456A1 (en) 1999-04-30 2006-09-26 Method and apparatus for surveillance using an image server
US11/762,047 US20080036863A1 (en) 1999-04-30 2007-06-12 Method and apparatus for surveillance using an image server

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US13199099P 1999-04-30 1999-04-30
US09/482,181 US7124427B1 (en) 1999-04-30 2000-01-12 Method and apparatus for surveillance using an image server
US11/535,461 US20070022456A1 (en) 1999-04-30 2006-09-26 Method and apparatus for surveillance using an image server

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US09/482,181 Division US7124427B1 (en) 1999-04-30 2000-01-12 Method and apparatus for surveillance using an image server

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US11/762,047 Continuation US20080036863A1 (en) 1999-04-30 2007-06-12 Method and apparatus for surveillance using an image server

Publications (1)

Publication Number Publication Date
US20070022456A1 true US20070022456A1 (en) 2007-01-25

Family

ID=26829980

Family Applications (3)

Application Number Title Priority Date Filing Date
US09/482,181 Expired - Lifetime US7124427B1 (en) 1999-04-30 2000-01-12 Method and apparatus for surveillance using an image server
US11/535,461 Abandoned US20070022456A1 (en) 1999-04-30 2006-09-26 Method and apparatus for surveillance using an image server
US11/762,047 Abandoned US20080036863A1 (en) 1999-04-30 2007-06-12 Method and apparatus for surveillance using an image server

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US09/482,181 Expired - Lifetime US7124427B1 (en) 1999-04-30 2000-01-12 Method and apparatus for surveillance using an image server

Family Applications After (1)

Application Number Title Priority Date Filing Date
US11/762,047 Abandoned US20080036863A1 (en) 1999-04-30 2007-06-12 Method and apparatus for surveillance using an image server

Country Status (5)

Country Link
US (3) US7124427B1 (en)
AU (1) AU6887900A (en)
DE (1) DE10084543T1 (en)
GB (1) GB2363936B (en)
WO (1) WO2000072573A2 (en)


Families Citing this family (64)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6720990B1 (en) 1998-12-28 2004-04-13 Walker Digital, Llc Internet surveillance system and method
US7124427B1 (en) * 1999-04-30 2006-10-17 Touch Technologies, Inc. Method and apparatus for surveillance using an image server
US7522186B2 (en) * 2000-03-07 2009-04-21 L-3 Communications Corporation Method and apparatus for providing immersive surveillance
AUPQ684600A0 (en) * 2000-04-11 2000-05-11 Safehouse International Limited An object monitoring system
GB2364154B (en) * 2000-06-27 2002-07-17 Adrian Michael Godwin Building status network
US7627665B2 (en) * 2000-09-28 2009-12-01 Barker Geoffrey T System and method for providing configurable security monitoring utilizing an integrated information system
US8392552B2 (en) 2000-09-28 2013-03-05 Vig Acquisitions Ltd., L.L.C. System and method for providing configurable security monitoring utilizing an integrated information system
EP1323014A2 (en) * 2000-09-28 2003-07-02 Vigilos, Inc. Method and process for configuring a premises for monitoring
SE522121C2 (en) * 2000-10-04 2004-01-13 Axis Ab Method and apparatus for digital processing of frequently updated images from a camera
US20020171734A1 (en) * 2001-05-16 2002-11-21 Hiroshi Arakawa Remote monitoring system
US7480715B1 (en) 2002-01-25 2009-01-20 Vig Acquisitions Ltd., L.L.C. System and method for performing a predictive threat assessment based on risk factors
CA2390621C (en) * 2002-06-13 2012-12-11 Silent Witness Enterprises Ltd. Internet video surveillance camera system and method
GB2389978A (en) * 2002-06-17 2003-12-24 Raymond Joseph Lambert Event-triggered security monitoring apparatus
US20040028137A1 (en) * 2002-06-19 2004-02-12 Jeremy Wyn-Harris Motion detection camera
EP1401205B1 (en) * 2002-09-05 2012-04-25 Alcatel Lucent Monitoring support server
US10499091B2 (en) 2002-09-17 2019-12-03 Kinya Washino High-quality, reduced data rate streaming video production and monitoring system
US20050039211A1 (en) * 2002-09-17 2005-02-17 Kinya Washino High-quality, reduced data rate streaming video production and monitoring system
FI115277B (en) * 2002-12-12 2005-03-31 Plenware Group Oy Arrangement of motion observation in mobile station
AU2003900137A0 (en) * 2003-01-14 2003-01-30 Canon Kabushiki Kaisha Process and format for reliable storage of data
US7421727B2 (en) * 2003-02-14 2008-09-02 Canon Kabushiki Kaisha Motion detecting system, motion detecting method, motion detecting apparatus, and program for implementing the method
US7292723B2 (en) * 2003-02-26 2007-11-06 Walker Digital, Llc System for image analysis in a network that is structured with multiple layers and differentially weighted neurons
US20040186813A1 (en) * 2003-02-26 2004-09-23 Tedesco Daniel E. Image analysis method and apparatus in a network that is structured with multiple layers and differentially weighted neurons
CA2529903A1 (en) * 2003-06-19 2004-12-29 Sarnoff Corporation Method and apparatus for providing a scalable multi-camera distributed video processing and visualization surveillance system
US7259778B2 (en) 2003-07-01 2007-08-21 L-3 Communications Corporation Method and apparatus for placing sensors using 3D models
US7474852B1 (en) * 2004-02-12 2009-01-06 Multidyne Electronics Inc. System for communication of video, audio, data, control or other signals over fiber
WO2005117441A1 (en) * 2004-05-26 2005-12-08 Incorporated Administrative Agency, National Agricultural And Bio-Oriented Research Organization Autonomous operation control system
US20060029127A1 (en) * 2004-08-05 2006-02-09 Vicon Industries Inc. Controlling the distribution of different frames per second of a video stream to various recipients
DE102004044673B4 (en) * 2004-09-08 2007-04-05 Siemens Ag Method for monitoring at least one installation component of a technical installation
US7183907B2 (en) 2004-10-20 2007-02-27 Honeywell International, Inc. Central station monitoring with real-time status and control
WO2006049501A2 (en) * 2004-11-01 2006-05-11 Ultrawaves Design Holding B.V. Flexible surveillance network system
US7339607B2 (en) * 2005-03-25 2008-03-04 Yongyouth Damabhorn Security camera and monitor system activated by motion sensor and body heat sensor for homes or offices
NL1028743C1 (en) * 2005-04-12 2005-05-17 Internova Holding Bvba Motion detection method for building security systems, by comparing measured distance between moving object and optical device with reference value
JP4947936B2 (en) 2005-08-11 2012-06-06 ソニー株式会社 Monitoring system and management device
JP4926601B2 (en) * 2005-10-28 2012-05-09 キヤノン株式会社 Video distribution system, client terminal and control method thereof
IL172289A (en) * 2005-11-30 2011-07-31 Rafael Advanced Defense Sys Limited bandwidth surveillance system and method with rotation among monitors
CA2570425A1 (en) * 2005-12-06 2007-06-06 March Networks Corporation System and method for automatic camera health monitoring
JP4810420B2 (en) * 2006-02-24 2011-11-09 キヤノン株式会社 Image processing apparatus, image processing method, server, control method therefor, program, and storage medium
US8330967B2 (en) * 2006-06-26 2012-12-11 International Business Machines Corporation Controlling the print quality levels of images printed from images captured by tracked image recording devices
EP1879384B1 (en) * 2006-07-13 2009-05-13 Axis AB Improved pre-alarm video buffer
JP2008072447A (en) * 2006-09-14 2008-03-27 Fujitsu Ltd Image distribution system, image distribution program, image distribution method
US20080129822A1 (en) * 2006-11-07 2008-06-05 Glenn Daniel Clapp Optimized video data transfer
US20080122932A1 (en) * 2006-11-28 2008-05-29 George Aaron Kibbie Remote video monitoring systems utilizing outbound limited communication protocols
US7675549B1 (en) * 2006-12-08 2010-03-09 Itt Manufacturing Enterprises, Inc. Imaging architecture for region and time of interest collection and dissemination
US20080143831A1 (en) * 2006-12-15 2008-06-19 Daniel David Bowen Systems and methods for user notification in a multi-use environment
JP2008182431A (en) * 2007-01-24 2008-08-07 Nec Corp Video image and voice distribution system, and information processor
US20090265747A1 (en) * 2008-03-17 2009-10-22 Canada Anv Systems Inc. Systems and methods for providing web based self serviced video monitoring and security features for systems comprising ip video terminals and servers
US8027468B2 (en) * 2008-04-08 2011-09-27 Honeywell International Inc. Method and system for camera sensor fingerprinting
US9786164B2 (en) 2008-05-23 2017-10-10 Leverage Information Systems, Inc. Automated camera response in a surveillance architecture
US20090315883A1 (en) * 2008-06-19 2009-12-24 3M Innovative Properties Company Autostereoscopic display with pixelated luminaire
US20100245665A1 (en) * 2009-03-31 2010-09-30 Acuity Systems Inc Hybrid digital matrix
US8319833B2 (en) 2009-06-23 2012-11-27 Sentrus, Inc. Video surveillance system
US9019349B2 (en) * 2009-07-31 2015-04-28 Naturalpoint, Inc. Automated collective camera calibration for motion capture
US20110037864A1 (en) * 2009-08-17 2011-02-17 Microseven Systems, LLC Method and apparatus for live capture image
US9338515B2 (en) 2009-09-03 2016-05-10 At&T Intellectual Property I, L.P. Real-time and secured picture/video upload via a content delivery network
US9785898B2 (en) * 2011-06-20 2017-10-10 Hi-Tech Solutions Ltd. System and method for identifying retail products and determining retail product arrangements
US20160019427A1 (en) * 2013-03-11 2016-01-21 Michael Scott Martin Video surveillence system for detecting firearms
US10402661B2 (en) * 2013-07-22 2019-09-03 Opengate Development, Llc Shape/object recognition using still/scan/moving image optical digital media processing
US9243527B2 (en) * 2013-08-29 2016-01-26 Ford Global Technologies, Llc System and method for reducing friction in engines
US9521377B2 (en) * 2013-10-08 2016-12-13 Sercomm Corporation Motion detection method and device using the same
US9723273B2 (en) * 2014-04-16 2017-08-01 Vivint, Inc. Camera with a lens connector
US11209410B2 (en) * 2014-06-10 2021-12-28 Logan Instruments Corporation Dissolution tester assembly with integrated imaging system
US10609326B2 (en) * 2016-10-21 2020-03-31 TEKVOX, Inc. Self-contained video security system
US20190347915A1 (en) * 2018-05-11 2019-11-14 Ching-Ming Lai Large-scale Video Monitoring and Recording System
US11288537B2 (en) 2019-02-08 2022-03-29 Honeywell International Inc. Image forensics using non-standard pixels


Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH02162890A (en) 1988-12-15 1990-06-22 Mitsubishi Electric Corp Motion detector
JP2865442B2 (en) * 1991-04-10 1999-03-08 株式会社東芝 Method of extracting change area of surveillance image
JPH07307944A (en) 1994-05-10 1995-11-21 Fujitsu General Ltd Monitor image transmission system
JP3679182B2 (en) * 1996-01-31 2005-08-03 三菱電機株式会社 Monitoring image processing device
CA2301858C (en) 1997-08-25 2007-02-20 Digital Security Controls Ltd. Controllable still frame video transmission system

Patent Citations (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4145715A (en) * 1976-12-22 1979-03-20 Electronic Management Support, Inc. Surveillance system
US4249207A (en) * 1979-02-20 1981-02-03 Computing Devices Company Perimeter surveillance system
US4308559A (en) * 1979-05-14 1981-12-29 Peter Schiff Switching apparatus for closed circuit television monitoring systems
US4408224A (en) * 1980-05-09 1983-10-04 Hajime Industries Ltd. Surveillance method and apparatus
US4630110A (en) * 1984-02-15 1986-12-16 Supervision Control Systems, Inc. Surveillance system
US4943854A (en) * 1985-06-26 1990-07-24 Chuo Electronics Co., Ltd. Video surveillance system for selectively selecting processing and displaying the outputs of a plurality of TV cameras
US4928175A (en) * 1986-04-11 1990-05-22 Henrik Haggren Method for the three-dimensional surveillance of the object space
US4814869A (en) * 1987-04-27 1989-03-21 Oliver Jr Robert C Video surveillance system
US4922339A (en) * 1988-03-31 1990-05-01 Stout Video Systems Means and method for visual surveillance and documentation
US5105183A (en) * 1989-04-27 1992-04-14 Digital Equipment Corporation System for displaying video from a plurality of sources on a display
US5097328A (en) * 1990-10-16 1992-03-17 Boyette Robert B Apparatus and a method for sensing events from a remote location
US5202661A (en) * 1991-04-18 1993-04-13 The United States Of America As Represented By The Secretary Of The Navy Method and system for fusing data from fixed and mobile security sensors
US5384588A (en) * 1991-05-13 1995-01-24 Telerobotics International, Inc. System for omindirectional image viewing at a remote location without the transmission of control signals to select viewing parameters
US5237408A (en) * 1991-08-02 1993-08-17 Presearch Incorporated Retrofitting digital video surveillance system
US6226031B1 (en) * 1992-02-19 2001-05-01 Netergy Networks, Inc. Video communication/monitoring apparatus and method therefor
US5471239A (en) * 1992-03-26 1995-11-28 Solid State Logic Limited Detecting scene changes
US5353061A (en) * 1992-10-08 1994-10-04 International Business Machines Corporation System and method for frame-differencing video compression/decompression using perceptually-constant information and image analysis
US5406324A (en) * 1992-10-30 1995-04-11 Roth; Alexander Surveillance system for transmitting images via a radio transmitter
US5657076A (en) * 1993-01-12 1997-08-12 Tapp; Hollis M. Security and surveillance system
US5491511A (en) * 1994-02-04 1996-02-13 Odle; James A. Multimedia capture and audit system for a video surveillance network
US5606364A (en) * 1994-03-30 1997-02-25 Samsung Aerospace Industries, Ltd. Surveillance system for processing a plurality of signals with a single processor
US5473364A (en) * 1994-06-03 1995-12-05 David Sarnoff Research Center, Inc. Video technique for indicating moving objects from a movable platform
US5517236A (en) * 1994-06-22 1996-05-14 Philips Electronics North America Corporation Video surveillance system
US6166763A (en) * 1994-07-26 2000-12-26 Ultrak, Inc. Video security system
US5751345A (en) * 1995-02-10 1998-05-12 Dozier Financial Corporation Image retention and information security system
US5751346A (en) * 1995-02-10 1998-05-12 Dozier Financial Corporation Image retention and information security system
US5982418A (en) * 1996-04-22 1999-11-09 Sensormatic Electronics Corporation Distributed video data storage in video surveillance system
US5806005A (en) * 1996-05-10 1998-09-08 Ricoh Company, Ltd. Wireless image transfer from a digital still video camera to a networked computer
US5953055A (en) * 1996-08-08 1999-09-14 Ncr Corporation System and method for detecting and analyzing a queue
US6011547A (en) * 1996-10-22 2000-01-04 Fuji Photo Film Co., Ltd. Method and apparatus for reproducing image from data obtained by digital camera and digital camera used therefor
US5956081A (en) * 1996-10-23 1999-09-21 Katz; Barry Surveillance system having graphic video integration controller and full motion video switcher
US6182127B1 (en) * 1997-02-12 2001-01-30 Digital Paper, Llc Network image view server using efficent client-server tilting and caching architecture
US6130707A (en) * 1997-04-14 2000-10-10 Philips Electronics N.A. Corp. Video motion detector with global insensitivity
US6078756A (en) * 1997-04-30 2000-06-20 Eastman Kodak Company Photographic and data transmission system for capturing images and magnetic data
US6166729A (en) * 1997-05-07 2000-12-26 Broadcloud Communications, Inc. Remote digital image viewing system and method
US6018774A (en) * 1997-07-03 2000-01-25 Yobaby Productions, Llc Method and system for creating messages including image information
US6085152A (en) * 1997-09-19 2000-07-04 Cambridge Management Advanced Systems Corporation Apparatus and method for monitoring and reporting weather conditions
US6076111A (en) * 1997-10-24 2000-06-13 Pictra, Inc. Methods and apparatuses for transferring data between data processing systems which transfer a representation of the data before transferring the data
US6058428A (en) * 1997-12-05 2000-05-02 Pictra, Inc. Method and apparatus for transferring digital images on a network
US6144772A (en) * 1998-01-29 2000-11-07 Canon Kabushiki Kaisha Variable compression encoding of digitized images
US6167469A (en) * 1998-05-18 2000-12-26 Agilent Technologies, Inc. Digital camera having display device for displaying graphical representation of user input and method for transporting the selected digital images thereof
US6271752B1 (en) * 1998-10-02 2001-08-07 Lucent Technologies, Inc. Intelligent multi-access system
US6023241A (en) * 1998-11-13 2000-02-08 Intel Corporation Digital multimedia navigation player/recorder
US7124427B1 (en) * 1999-04-30 2006-10-17 Touch Technologies, Inc. Method and apparatus for surveillance using an image server
US6476858B1 (en) * 1999-08-12 2002-11-05 Innovation Institute Video monitoring and security system
US6421080B1 (en) * 1999-11-05 2002-07-16 Image Vault Llc Digital surveillance system with pre-event recording
US6411209B1 (en) * 2000-12-06 2002-06-25 Koninklijke Philips Electronics N.V. Method and apparatus to select the best video frame to transmit to a remote station for CCTV based residential security monitoring
US6441734B1 (en) * 2000-12-12 2002-08-27 Koninklijke Philips Electronics N.V. Intruder detection through trajectory analysis in monitoring and surveillance systems

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040008257A1 (en) * 2002-07-11 2004-01-15 Jung-Hwan Kim Monitoring service process using communication network
US7460148B1 (en) * 2003-02-19 2008-12-02 Rockwell Collins, Inc. Near real-time dissemination of surveillance video
US20120081231A1 (en) * 2005-08-23 2012-04-05 Ronald Paul Harwood Method and system of controlling media devices configured to output signals to surrounding area
US9071911B2 (en) * 2005-08-23 2015-06-30 Ronald Paul Harwood Method and system of controlling media devices configured to output signals to surrounding area
FR2944932A1 (en) * 2009-04-27 2010-10-29 Scutum Geographical zone representing information broadcasting method for Internet network, involves receiving image by technology platform, and connecting terminal to web server of technology platform
JP2016171520A (en) * 2015-03-13 2016-09-23 富士通株式会社 Image display system, controller, control program and control method

Also Published As

Publication number Publication date
US20080036863A1 (en) 2008-02-14
WO2000072573A2 (en) 2000-11-30
GB2363936B (en) 2003-09-10
GB2363936A (en) 2002-01-09
AU6887900A (en) 2000-12-12
WO2000072573A3 (en) 2001-02-22
DE10084543T1 (en) 2002-07-25
GB0126514D0 (en) 2002-01-02
US7124427B1 (en) 2006-10-17

Similar Documents

Publication Publication Date Title
US7124427B1 (en) Method and apparatus for surveillance using an image server
US7952609B2 (en) Networked digital security system and methods
EP0839430B1 (en) Video compression system
US7732771B2 (en) Monitoring apparatus
US20040075738A1 (en) Spherical surveillance system architecture
US5581297A (en) Low power video security monitoring system
DE69928622T2 (en) Monitoring Network Camera System
US20100097464A1 (en) Network video surveillance system and recorder
US20080212685A1 (en) System for the Capture of Evidentiary Multimedia Data, Live/Delayed Off-Load to Secure Archival Storage and Managed Streaming Distribution
CN100446568C (en) Video monitoring equipment and device
CA2381960A1 (en) System and method for digital video management
US20070035623A1 (en) Directed attention digital video recordation
EP1855482A2 (en) Video surveillance with satellite communication access
JPH11284987A (en) Image supervisory system
JP2004056473A (en) Monitoring controller
US20060001741A1 (en) Realtime video display method of mixed signals
US10440310B1 (en) Systems and methods for increasing the persistence of forensically relevant video information on space limited storage media
JPH11205781A (en) Image pickup and recording device
KR101016243B1 (en) System of monitoring and analysis for digital video
CN100515036C (en) Intelligent image process closed circuit TV camera device and its operation method
JP2001145091A (en) Image transmission system
AU672756B2 (en) Low power video security monitoring system
WO2003052711A1 (en) Method and device for identifying motion
KR20040088417A (en) The system of cctv having function of devide at six multiple a screen and a picture transmission of wireless using dvr system
JPH0798792A (en) Monitoring tv telephone set

Legal Events

Date Code Title Description
AS Assignment

Owner name: TT VISUAL TECHNOLOGY GROUP, LLC, DELAWARE

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TOUCH TECHNOLOGIES, INC.;REEL/FRAME:019856/0328

Effective date: 20070717

AS Assignment

Owner name: TOUCH TECHNOLOGIES, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ESBENSEN, DANIEL;REEL/FRAME:020472/0345

Effective date: 20000112

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION