WO2002075687A2 - Automatic system for monitoring person requiring care and his/her caretaker - Google Patents

Automatic system for monitoring person requiring care and his/her caretaker

Info

Publication number
WO2002075687A2
WO2002075687A2 (PCT/IB2002/000547)
Authority
WO
WIPO (PCT)
Prior art keywords
signal
person
alarm
detecting
controller
Prior art date
Application number
PCT/IB2002/000547
Other languages
French (fr)
Other versions
WO2002075687A3 (en)
Inventor
Srinivas Gutta
Eric Cohen-Solal
Miroslav Trajkovic
Original Assignee
Koninklijke Philips Electronics N.V.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips Electronics N.V.
Priority to AT02712177T (ATE296473T1)
Priority to DE60204292T (DE60204292T2)
Priority to JP2002574620A (JP2004531800A)
Priority to EP02712177A (EP1371042B1)
Publication of WO2002075687A2
Publication of WO2002075687A3

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 - Television systems
    • H04N7/18 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 - Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 - Alarms for ensuring the safety of persons
    • G08B21/04 - Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
    • G08B21/0407 - Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons based on behaviour analysis
    • G08B21/0423 - Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons based on behaviour analysis detecting deviation from an expected pattern of behaviour or schedule
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/72 - Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235 - Details of waveform analysis
    • A61B5/7264 - Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 - Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 - Alarms for ensuring the safety of persons
    • G08B21/04 - Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
    • G08B21/0438 - Sensor means for detecting
    • G08B21/0453 - Sensor means for detecting worn on the body to detect health condition by physiological monitoring, e.g. electrocardiogram, temperature, breathing
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 - Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 - Alarms for ensuring the safety of persons
    • G08B21/04 - Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
    • G08B21/0438 - Sensor means for detecting
    • G08B21/0476 - Cameras to detect unsafe condition, e.g. video cameras
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/67 - ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2503/00 - Evaluating a particular growth phase or type of persons or animals
    • A61B2503/06 - Children, e.g. for attention deficit diagnosis
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/0002 - Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B5/0004 - Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by the type of physiological signal transmitted
    • A61B5/0013 - Medical image data
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 - Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 - Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/113 - Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb occurring during breathing
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 - Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/167 - Personality evaluation
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/72 - Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235 - Details of waveform analysis
    • A61B5/7264 - Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267 - Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device

Definitions

  • the output vector may be any suitable mental state classification.
  • the valence/intensity emotional state typology suggested in US Patent No. 5,987,415 may be used.
  • the following tables summarize the Big Five, which is an evolutionary outgrowth of the Myers-Briggs typology.
  • the mental state/health status classifier 290 outputs a state vector, with a number of degrees of freedom, that corresponds to the models of personality and mental state chosen by the designer.
  • the mental state/health status classifier 290 may cumulate instantaneous data over a period of time in modeling personality, since this is a persistent state.
  • the mental state will have more volatile elements.
  • the event/class processor 207 is a classifier that combines state information from multiple classifiers to generate an environment/occupant state signal indicating the current status of the system's environment, including the occupants, particularly the monitored person.
  • the event/class processor may also generate event signals (interrupt signals) to ensure an instant response when certain events are recognized by the classifiers, for example, events that may coincide with an emergency condition.
  • the recognition of events may require state information from multiple classifiers, so the event/class processor 207 combines state data from multiple classifiers to generate a combined state signal and a combined event signal.
  • the environment/state signal may include an indication of all the possible event classes the various classifiers are capable of identifying or only those surpassing a threshold level of confidence.
  • the output generator 415 receives the mood/personality state vector and parsed reply data from the mental state/health status classifier 290 and input parser 410 respectively.
  • the response generator 415 also receives the environment/occupant state signal and events signal from the event/class processor 207.
  • the output generator 415 selects a type of response corresponding to the mental state, the environment/occupant state, and the events signal from an internal database and generates an alarm output if required.
  • the output generator may be programmed to select an output template that solicits further data from an occupant through user interface, such as the terminal 116 (Fig. 1). For example, if the various classifier output components indicate low confidence levels, the system could generate speech through the speaker 114 asking for information about the current state of the occupied space.
  • the video input 255 signal is applied to the video image classifier 240.
  • the video image classifier 240 is programmed to recognize a variety of different image and video-sequence classes in the video input 255 signal. For example, it may be programmed to distinguish between a person sitting up and lying down; between a person sitting still and one moving agitatedly or leaving an area; etc. A probability for each of these classes may be generated and output as a signal. Alternatively, a single, most-probable class may be generated and output as a signal. This signal is applied to the event/class processor 207, which combines this data with other class data to generate an environment/occupant state signal.
  • if the event/class processor 207 receives an indication from the video image classifier 240 that something sudden and important has occurred, for example, the occupant has gotten up and left the room, the event/class processor 207 will generate an event signal. If the mental state/health status classifier 290 receives a signal from the video image classifier 240 indicating the occupant is moving in a fashion consistent with being agitated, that mental state/health status classifier 290 may combine this information with other classifier signals to generate a mood/personality state vector indicating an emotional state of heightened anxiety. For example, the audio classifier 210 may be contemporaneously indicating that the speaker's voice is more highly pitched than usual and the input parser 410 may indicate that the word count of the most recent utterances is low.
  • the event/class processor 207 and the mental state/health status classifier 290 may be provided with a data storage capability and means for determining the current occupant so that corresponding histories can be stored for different occupants. Identification of occupants, as mentioned above, may be by face recognition by means of the video image classifier 240 or by voice signature. It may also be confirmed by radio frequency identification (RFID) token, smart card, or a simple user interface that permits the occupant to identify him/herself with a biometric indicator such as a thumbprint or simply a PIN code. In this way, both the mental state/health status classifier 290 and the event/class processor 207 may each correlate historical data with particular occupants and employ it in identifying and signaling trends to the output generator 415.
  • the event/class processor 207 receives class information from the audio classifier 210 and other classifiers and attempts to identify these with a metaclass it is trained to recognize. That is, it combines classes of states to define an overarching state that is consistent with that multiple of states.
  • the architecture described herein is not the only way to implement the various features of the invention and the event/class processor 207 could simply be omitted and its functions taken over by the output generator 415.
  • One advantage of separating the functions, however, is that the event class processor 207 may employ a different type of classifier than the one used by the output generator 415.
  • the output generator 415 could use a rule-based template matcher while the event/class processor 207 could use a trained neural network-type classifier.
  • network-type classifiers, such as neural network and Bayesian network classifiers, are difficult to train when they have a large number of possible output states.
  • the video image classifier 240 process may contain the ability to control the cameras (represented by video input 255) that receive video information.
  • the video image classifier 240 may contain a process that regularly attempts to distinguish objects in the room that may or may not be individuals and zoom in on various features of those individuals. For example, every time the video image classifier identifies a new individual, it may attempt to identify where the face is in the visual field and regularly zoom in on the face of each individual that has been identified in the field of view, in order to obtain facial expression information which can be used for identifying the individual or for identifying the mood of the individual.
  • an audio signal may be filtered by a bandpass filter set for detection of loud crashing sounds and a detector that sets a time-latch output when the filter output is above a certain level.
  • a video luminance signal may be low-pass filtered and, when its energy goes beyond a certain level, it also sets a time-latch. If both latched signals go positive (loud sound and great activity in temporal proximity), the system may generate an alarm.
  • Alarm signals may include simply some kind of notification of an alarm status.
  • alarms should be as informative as possible within the specified design criteria.
  • an alarm signal may contain audio and/or video data preceding and following the event(s) that triggered the alarm status. These could be recorded by the output generator 415 and transmitted by email, streamed through a cell-phone connection or wireless convergence device with video capability, or some other means.
  • Symbolic representations of the most significant state classes that gave rise to the meta-classification of the alarm condition may also be transmitted. For example, a symbol indicating "loud noise" and/or unrecognized occupant may be transmitted to, say, a text pager used by a responsible party.
  • an arbitrary number of signals may be buffered continuously as illustrated by step S10. If an alarm condition is indicated, at step S15 it is determined if the particular alarm condition had been previously overridden. If it had, buffering of signals is resumed and no further action is taken. If the alarm condition had not been overridden, a message is generated at step S20 and the buffered signals 1..N attached at step S30. The alarm message is then transmitted, for example by email, in step S40 and an optional live feed generated at step S50 if appropriate. The live feed may be made available at a URL included in an email transmission or as a portion of a signal in a message transmitted by an automated telephone call to a digital video telephone.
  • the buffered signal may be no more than a time sequence indicating the status of one or more sensors over time.
  • the buffered signals need not be signals that caused the indication of an alarm condition.
  • a video camera may be trained on a person's bed.
  • the alarm may be generated by a mechanical sensor (such as a chest strap) that detects breathing.
  • the video signal buffered up till the moment of the detection of the person's cessation of breathing may be the signal that is transmitted as part of the alarm message.
  • the length of the buffer may be as desired.
  • Each alarm may be a unique event, but each may also be generated by the same persistent condition, for example a failure of an infant or child to breathe for a period of time. It is desirable for a given alarm to be acknowledged so that a new alarm condition, arising from different circumstances, is not confused as the existing alarm currently being attended to.
  • One way to handle this is to assign a signature to each alarm based on a vector of the components that gave rise to the alarm condition. The recognition of the same alarm condition would give rise to another vector which may be compared to a table of existing alarms (at step S15) to see if the new alarm had already been overridden; a minimal sketch of this scheme appears after this list.
  • the components may be quantized to ensure that small differences in vector components are not identified as different alarms, or a low-sensitivity comparison may be used to achieve the same effect.
  • Alarm signals may be transmitted by any of the following means:
  1. Automatic telephone call with synthetic voice providing symbolic indication of alarm condition (pre-recorded phrases or synthetic speech) and/or buffered audio and/or live audio fed from the monitored space.
  2. Wireless appliance with video, which may include the above plus recorded and/or live data plus text messages providing the same information.
  3. E-mail message, which may contain links to a URL with live or recorded data or may have an embedded MIME attachment providing still or moving images.
  • Example 1: An infant's crib is placed against a wall with one or more cameras 135, 136 aimed at a side of the crib where a caretaker would ordinarily stand to view the infant.
  • a microphone 112 is placed in a position to pick up sounds near the crib.
  • the controller 100 receives live video and audio signals from the camera and microphone and filters them through classifiers 240 and 210, respectively.
  • the controller 100 is programmed to recognize the caretaker's face and produces a signal indicating that a face is present and a reliability estimate indicating how well the face matches expectation.
  • the controller 100 may be programmed to recognize other faces as well, such as relatives of the baby and children.
  • the controller 100 is further programmed to recognize the sound of crying and produce a signal indicating that crying is present.
  • the controller 100 is programmed to recognize the following events and produce corresponding signals: normal and abnormal body habitus of the infant; facial expression of the infant indicating mood such as crying, content, playing, distressed; moving quickly or slowly; the number of individuals present; presence of new objects in the room and their "blob" sizes ("blob" is a term of art characterizing any closed connected shape that an image processor can define in a video image); and mood of the recognized face of the caretaker.
  • the infant cries and the caretaker fails to respond.
  • the infant's mood is detected by the audio and video signals received.
  • a synthetic voice calls to the caretaker via the speaker 114 requesting assistance for the infant.
  • the alarm signal includes a text message, buffered video and buffered audio from a time prior to the alarm event.
  • the alarm signal is sent by email.
  • Example 2: The system configuration of Example 1 is included in the present example.
  • Additional cameras and microphones are located at various locations in a house.
  • the baby's presence on a couch is detected.
  • the caretaker is recognized on the couch.
  • the sound of the television is detected.
  • the body position of the adult caretaker is detected in a sleeping position.
  • the system generates synthetic speech requesting the caretaker to wake up and attend to the infant (the programming of the controller 100 being a reflection of a concern that the infant is not in a safe place to be left unattended).
  • an alarm is generated.
  • the alarm contains still images from the video (from a time prior to the alarm) of the sleeping adult and infant and live audio and video of the room in which the infant and adult are located.
  • Example 3: The physical configuration (including programming) of Example 2 is incorporated in the present example.
  • the infant and crib are replaced by a sick adult, and the caretaker is a nurse.
  • the supervisor is an agency that supplied the nurse.
  • the alarm condition detected by the system is the lack of movement of any articles or persons in the house.
  • the recognition of occupants, facial recognition, body habitus and body positions, audio, etc. are producing indications of a presence of an adult in a living room, but the signals are of low reliability and there is no indication of activity (movement of the occupant).
  • the system is programmed to generate an alarm if none of a set of expected events, such as appearance of a recognized face at the bedside of the sick person, movement plus a recognized body position of a person in expected locations in the house, etc., occurs within a predefined interval.
  • the alarm signal contains a live video and audio feed which is controllable by the supervisor (who can switch views, etc. and communicate with the caretaker).
  • Example 4: The system of Example 3 is incorporated in the present example.
  • a babysitter is taking care of multiple children.
  • the system is programmed to recognize the babysitter's face and the faces of the children along with various characterizing features of these people such as body shape, movement patterns, etc.
  • the system is programmed to generate an alarm, and transmit it to a wireless handheld device, when any of the following occurs: (1) the number of occupants detected in the house (the system detects simultaneously at multiple cameras a particular number of individuals) is greater than a predefined number (presumably the babysitter plus the children); (2) the system detects an unrecognized face (i.e., it receives a good image of a face, but cannot produce a reliable match with a known template); (3) doors are left open for extended periods of time (as indicated by security system status); or (4) movement of persons in the house is excessive, potentially indicating roughhousing or abuse.
  • although in the examples above the visage, location, and activities of both caretaker and person requiring care were monitored, it is possible to monitor only one or the other rather than both.
  • the notion of "occupants," recognized or not, may be applied to pets as well as to human occupants. Admittedly, the state of the art may not be at a point where a pet's facial features can be distinguished from those of other animals, but blob-recognition is certainly at a point where it can distinguish between a poodle and a cat, or between other kinds of animals and/or animate objects such as robots.
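A minimal sketch of the alarm-signature scheme described above follows; the quantization step and component names are assumptions, as the patent leaves the granularity to the designer.

```python
# Quantize an alarm's component vector into a hashable signature and
# compare it against signatures of alarms already overridden (step S15).
QUANT_STEP = 0.25  # assumed coarse bin size so near-identical conditions match

def signature(components: dict) -> tuple:
    """Quantize the alarm's component vector into a hashable signature."""
    return tuple(sorted((name, round(value / QUANT_STEP))
                        for name, value in components.items()))

overridden: set = set()

def should_alert(components: dict) -> bool:
    """True unless an equivalent alarm was already overridden."""
    return signature(components) not in overridden

def override(components: dict) -> None:
    overridden.add(signature(components))

cond = {"no_breathing": 0.9, "no_motion": 0.8}
assert should_alert(cond)
override(cond)
# A slightly different vector falls into the same quantization bins,
# so it is recognized as the same, already-overridden alarm.
assert not should_alert({"no_breathing": 0.95, "no_motion": 0.78})
```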

Abstract

A monitoring system for an infant, child, invalid, or other person requiring care uses computer vision and hearing, and inputs of other modalities, to analyze the status of a caretaker and/or cared-for person and their environment. The conditions are classified as normal or alarm conditions, and an informative alarm signal is generated which may include records of the vision, audio, and other inputs. The system also has the ability to solicit responses from the occupants to stimulate a classifiable input to reduce ambiguity in its state signal.

Description

Automatic system for monitoring person requiring care and his/her caretaker
The present invention relates to devices that generate an alarm signal when conditions surrounding an invalid, infant, or child, or other person or a caretaker thereof, warrant it.
Remote security monitoring systems in which a video camera is trained on a subject or area of concern and observed by a trained observer are known in the art. Also infant or child monitors that transmit audio to a portable receiver are available in the market. These devices, however, require constant attention in order to provide protection to the subject or area of concern, such as an infant or child.
Automated infant or child monitors have been proposed which, for example, monitor an infant's sleeping position to help prevent sudden infant death syndrome (SIDS). One approach, suggested in US Patent No. 5,864,291, uses a breathing-sensing strap around the infant's torso to detect breathing. Another (US Patent No. 5,638,824) suggests using an ultrasonic sensor, and US Patent No. 5,914,660 suggests position sensors for the same purpose. The automation in these types of monitors, however, provides little help for babies that are distressed for reasons other than a failure to breathe or sleep in an expected position. Also, the alarm signal may contain false positives and is of little help in diagnosing the cause of distress.
A monitored person's physical and emotional state may be determined by a computer for medical diagnostic purposes. For example, US Patent No. 5,617,855, hereby incorporated by reference as if fully set forth herein, describes a system that classifies characteristics of the face and voice along with electroencephalogram and other diagnostic data to help make diagnoses. The device is aimed at the fields of psychiatry and neurology. This and other such devices, however, are not designed for monitoring persons in their normal environments. In still another application area, machines automatically detect an occupant's presence or specific features of the occupant for purposes of machine-authorization and authentication or convenience. To that end, some prior art systems employ biometric sensing, proximity detectors, radio frequency identification tags, or other devices. EP 0716402B1 describes a method of detecting the number of people entering a train or other space using infrared sensors and fuzzy inference techniques. When the number of people is outside desired limits or unbalanced, the system can generate notices to that effect which may be linked to devices to correct the condition. UK 2027312 A describes a method of detecting the movement of fish using IR cameras generating a standard video signal.
US 4524243 describes a system in which a user is required to activate a switch at specified intervals. Failure to do so results in the generation of an inactivity alarm.
US 5905436 discloses a system in which the failure of various sensors in a home to be triggered results in the generation of a signal at a central monitoring station indicating such. The disclosure is directed at the supervision of an elderly person living at home.
UK 2179186A describes a system in which, if movement is not detected at a predetermined time, an alarm is triggered. A warning is given so that the user can reset the switch.
US 6002994 discloses a system in which transmitters, placed at strategic locations in a house, are triggered whenever a person is present at the location of a sensor triggering the transmitter. Also, the system employs other inputs attached to devices and appliances that the user is expected to use. The system is trained to recognize normal patterns of use. The transmitters transmit a signal to a central monitor if the normal pattern is not followed.
In this reference, physiological measurements may include the user's blood pressure, heart rate, body temperature, body weight and blood glucose level. Non-physiological measurements may include room temperature, ammonia from spilled urine, methane from spoiling food, a presence of smoke, frequency of electrical usage, frequency of water usage, temperature of water flowing from a tap, the user's movement within the selected environment as indicated by motion sensors, and use of appliances including a toilet, telephone, stove, microwave oven, toaster, oven, refrigerator, freezer, dishwasher, bath, shower, garbage disposal means, clothes washer, clothes drier, mail box, door and vehicle.
Machine identification of faces is a well-developed technology. In GB 2343945A, for a system for photographing or recognizing a face, a controller identifies moving faces in a scene and tracks them to permit image capture sufficient to identify the face or distinctive features thereof. For example, the system could sound an alarm upon recognizing a pulled-down cap or face mask in a jewelry store security system. There remains a need in the present art for a system for monitoring persons requiring supervision that is more robust, capable of responding to more subtle cues, and able to provide more informative reports to supervisors.
Briefly, an alarm system monitors conditions of a person requiring care, others attending to that person, and the environment of that person. The alarm system generates an informative alarm signal or message containing information about these factors to help the message recipient understand what is going on. In an embodiment, the alarm signal is a live video and/or audio feed from a camera trained on the person requiring care. In another embodiment, the alarm signal is a symbolic set of data relating to the status and the condition that generated the alarm, for example, the message "Person requiring care not attended to in 3 hrs," "No movement detected in 20 min.," "Stopped breathing," or "Caretaker absent." In still other embodiments, the system generates responses to stimulate action, such as a response from a caretaker that is on the premises. The alarm signal may be transmitted by phone line, the Internet, or a wireless channel. The field of artificial intelligence and robotics has given rise to technology that enables machines to make sufficient sense of their surroundings to recognize predefined conditions, navigate vehicles, and identify objects and their orientations, for example. Components of systems called autonomous observers have been made in the lab which allow a machine to follow an observer in an area and track escape routes. Other applications of similar technology include video tracking systems that follow a speaker giving a presentation and respond to the speaker's gesture commands. In embodiments of the present invention, the technology of image and video recognition, audio recognition, and other inputs may be applied to infer the condition of a monitored person requiring care.
Artificial intelligence (AI) principles are used by a classification engine receiving video, audio, and/or other inputs to model a current situation. When conditions are classified as calling for attention (distress event), video, audio, and other data that may be buffered up to the distress event, as well as live data, may be transmitted to a monitor along with an indication of the class to which the recognized event belongs. For example, the audio signal generated by a crying person requiring care may be classified as a "person-crying" condition either alone or in concert with the classification of other data such as video data of the crying person requiring care. Condition classes for a suitable monitor system may include events such as:
1. trigger by a breathing sensor, motion sensor, or audio sensor as in prior art devices,
2. delay in response time of a nanny or au pair or other care-giver given to a person requiring care,
3. movement (crawling) of the person requiring care into prohibited areas of a room,
4. sudden movement consistent with falling, running, normal walking, crawling, etc.,
5. lack of normal movement, or rapid movement such as an infant or child being picked up at a time other than a previously defined time,
6. presence of the person requiring care or other individuals in a space and their number,
7. consistency of the clothing, facial features, etc. of the occupants of a space throughout a period of time,
8. loud sounds, normal sounds, and unusual sounds, based upon signature of sound,
9. location of sound source,
10. occupancy of unauthorized spaces,
11. occupancy patterns, for example whether care-giver is spending unusual amounts of time away from person requiring care or care-giver is spending unusual amounts of time in a particular space,
12. patterns consistent with damage to the monitoring system,
13. voice signatures of unauthorized occupants or unrecognized voice signatures,
14. body habitus and physiognomy of occupants,
15. status of security system in the space,
16. unrecognized objects in occupied spaces or recognized objects being moved or found in unexpected locations,
17. temperature, humidity, sound levels, or other ambient variables out of normal range,
18. failure to detect face of a care-giver over crib for a specified interval, and
19. presence of an unrecognized face or body pattern.
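In software, such a catalogue of condition classes reduces naturally to an enumeration. The following minimal Python sketch labels a few of the classes above; the identifier names are invented for illustration and do not come from the patent.

```python
from enum import Enum, auto

class ConditionClass(Enum):
    """Illustrative labels for a few of the condition classes listed above."""
    SENSOR_TRIGGER = auto()            # 1: breathing/motion/audio sensor trip
    CAREGIVER_RESPONSE_DELAY = auto()  # 2: slow response to the person requiring care
    PROHIBITED_AREA_ENTRY = auto()     # 3: movement into a prohibited area
    SUDDEN_MOVEMENT = auto()           # 4: falling, running, etc.
    UNUSUAL_SOUND = auto()             # 8: based on the signature of the sound
    UNAUTHORIZED_OCCUPANCY = auto()    # 10: occupancy of unauthorized spaces
    SYSTEM_TAMPER = auto()             # 12: damage to the monitoring system
    UNRECOGNIZED_FACE = auto()         # 19: unrecognized face or body pattern
```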
The event that triggers an alarm condition may be a simple one, such as the tripping of a prior art sensor that monitors breathing or crying, or a more complex one that integrates multiple inputs into a network to make decisions as to the alarm status. Such network devices may include classifiers in the form of neural networks, Bayesian networks, and other techniques for machine-recognition of physical objects and behaviors. The art in this field is rich and rapidly growing, and it is expected that improved means for implementing the current invention will continually become available. Preferably, the classification engine is trainable so that it does not need to rely solely on predefined template patterns for pattern-matching. The system may be provided with the ability to generate a simulated dialogue to provide for assistance in training, such as asking an occupant to select from among a number of condition classes present in a monitored space at a time when the occupant can observe the monitored space.
In an embodiment, the present invention may also employ simulated dialogue with a machine-generated persona such as disclosed in the following references, each of which is incorporated in its entirety as if fully set forth herein:
- US Patent Serial No. 09/699,606 for Environment-Responsive User interface/Entertainment Device That Simulates Personal Interaction;
- US Patent Serial No. 09/686,831 for Virtual Creature Displayed on a Television; and
- US Patent Serial No. 09/699,577 for User interface Entertainment Device That Simulates Personal Interaction and Responds to Occupant's Mental State and/or Personality.
The persona may, in response to a particular condition (ambiguous classification of extant conditions or just on a random or interval time basis) request information from occupants about present circumstances. The feedback received may be used by the classification engine to further infer the extant conditions and/or relayed to a responsible party along with other information about circumstances.
The above applications also discuss the topic of classifying a rich array of inputs to make decisions about occupants.
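As a rough illustration of that solicit-and-refine loop, the Python sketch below asks the occupant a clarifying question whenever no condition class is confidently classified and folds the yes/no reply back into the class probabilities. The threshold, question templates, and evidence weights are invented for illustration; the patent does not specify a concrete mechanism.

```python
AMBIGUITY_THRESHOLD = 0.6  # assumed; the patent does not fix a confidence level

QUESTIONS = {
    "person-crying": "Is someone in the room crying right now?",
    "loud-noise": "Was there a loud noise a moment ago?",
}

def solicit_feedback(class_probs, ask=input):
    """When no class is confident, query the occupant via the persona and
    treat the yes/no reply as strong evidence for or against the top class."""
    label, confidence = max(class_probs.items(), key=lambda kv: kv[1])
    if confidence >= AMBIGUITY_THRESHOLD:
        return class_probs
    reply = ask(QUESTIONS.get(label, f"Is the condition '{label}' present? "))
    class_probs[label] = 0.95 if reply.strip().lower().startswith("y") else 0.05
    total = sum(class_probs.values())
    return {k: v / total for k, v in class_probs.items()}

# Example: neither class is confident, occupant confirms crying.
probs = solicit_feedback({"person-crying": 0.4, "loud-noise": 0.35},
                         ask=lambda q: "yes")
# person-crying is boosted to roughly 0.73 after renormalization
```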
The invention will be described in connection with certain preferred embodiments, with reference to the following illustrative figures so that it may be more fully understood. With reference to the figures, it is stressed that the particulars shown are by way of example and for purposes of illustrative discussion of the preferred embodiments of the present invention only, and are presented in the cause of providing what is believed to be the most useful and readily understood description of the principles and conceptual aspects of the invention. In this regard, no attempt is made to show structural details of the invention in more detail than is necessary for a fundamental understanding of the invention, the description taken with the drawings making apparent to those skilled in the art how the several forms of the invention may be embodied in practice.
Fig. 1 is a schematic representation of a hardware system capable of supporting a monitor system according to an embodiment of the invention.
Fig. 2 is a high level flow diagram illustrating how inputs of various modalities may be filtered to generate an alarm signal consistent with several embodiments of the invention. Fig. 3 is a functional diagram of a software system for implementing a monitor system according to an embodiment of the invention.
Fig. 4 is a flow chart illustrating the generation of an alarm signal according to an embodiment of the invention.
Referring to Fig. 1, in a hardware apparatus for implementing an embodiment of the invention, a programmable controller 100 receives input from various sources, for example, a connected image processor 305 connected to cameras 135 and 136, microphone 112, and sensors 141. Sensors 141 may include alarm sensors such as breathing monitors or other SIDS prevention detectors or any other type of sensor such as temperature sensors, position sensors, security switches, proximity sensors, electrical load sensors, ambient light sensors, etc. Data for updating the controller's 100 software or providing other required data, such as templates for modeling its environment, may be gathered through local or wide area or Internet networks symbolized by the cloud at 110. The controller may output audio signals (e.g., synthetic speech or speech from a remote speaker) through a speaker 114 or a device of any other modality. For programming and requesting occupant input, a terminal 116 may be provided.
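The wiring of Fig. 1 can be summarized in code. The sketch below is a rough stand-in, assuming simple callables in place of the image processor 305, microphone 112, sensors 141, and speaker 114; it shows only the shape of the controller's I/O, not any implementation from the patent.

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class MonitorController:
    """Rough stand-in for controller 100 and its attached devices."""
    cameras: List[Callable[[], bytes]] = field(default_factory=list)  # 135, 136 via image processor 305
    microphone: Callable[[], bytes] = lambda: b""                     # microphone 112
    sensors: List[Callable[[], float]] = field(default_factory=list)  # breathing, temperature, proximity, ...
    speak: Callable[[str], None] = print                              # speaker 114 (synthetic speech out)

    def poll_inputs(self) -> dict:
        """Gather one snapshot of every input modality for classification."""
        return {
            "video": [grab() for grab in self.cameras],
            "audio": self.microphone(),
            "sensors": [read() for read in self.sensors],
        }
```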
Fig. 2 illustrates how information gathered by the controller 100 of Fig. 1 may be used to identify particular conditions and generate an alarm responsive to those conditions. Inputs of various modalities 500 such as video data, audio data, environmental conditions such as temperature, sound level, security system status, etc. are applied to a trained classifier 510 to discriminate and classify distinguishable features of a monitored environment. For example, the classifier 510 may be trained to discriminate faces and to classify them as belonging to one of a recognized set or not belonging to any member of the recognized set. For another example, the classifier 510 may be trained to classify sudden noises like breaking glass or falling objects. Still other examples are training it to recognize the emotional status and health of the monitored person by facial expression, physiognomy, body habitus, behavior, etc. from data in a video signal. Each classification of events/status may then be combined and further classified as an alarm condition. For example, the classifier may be trained to identify a loud sound followed by an unrecognized face as an alarm condition.
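The closing example, a loud sound followed by an unrecognized face, can be made concrete as a small temporal-fusion rule. The sketch below is one possible reading; the 30-second window is an assumption, since the text does not quantify "followed by".

```python
from collections import deque
import time

WINDOW_SECONDS = 30.0  # assumed temporal proximity; not fixed by the patent

class AlarmFuser:
    """Escalates to an alarm when an unrecognized face follows a loud sound."""
    def __init__(self):
        self.events = deque()  # (timestamp, label) pairs from feature classifiers

    def observe(self, label: str, timestamp: float = None) -> bool:
        """Record one classified feature event; return True on alarm."""
        now = time.time() if timestamp is None else timestamp
        self.events.append((now, label))
        # Drop events that have aged out of the fusion window.
        while self.events and now - self.events[0][0] > WINDOW_SECONDS:
            self.events.popleft()
        labels = [l for _, l in self.events]
        return "loud_sound" in labels and labels[-1] == "unrecognized_face"

fuser = AlarmFuser()
fuser.observe("loud_sound", timestamp=0.0)
assert fuser.observe("unrecognized_face", timestamp=10.0)  # within window -> alarm
```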
The technologies for training such classifiers as 510 are well developed and growing fast. Such classifiers may be trained explicitly using rules to form, for example, a Bayesian classifier. Alternatively, they may be trained using examples, as for training a neural net. Since the subject of how different kinds of classifiers are designed and trained is not the focus of the present invention, except as discussed herein, and because the technology for designing and training such classifiers is well-developed and highly varied, the particulars are not discussed in detail presently. Some interface for programming and/or training the classifier 510 is indicated at 530. The end goal of the classifier 510 is to output status or alarm information to an alarm output 520. Both 530 and 520 may be networked terminals, cell phone devices, PDAs, or any suitable UI device.
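To illustrate the training-by-example route, the toy below fits a logistic model to a handful of labeled feature vectors standing in for lower-level classifier outputs. It is a sketch of the general technique, not a reproduction of classifier 510; the features and labels are invented.

```python
import numpy as np

# Hypothetical features: [sound_level, face_recognized, motion_level];
# label 1.0 marks an alarm condition in the training examples.
X = np.array([[0.9, 0.0, 0.8],   # loud sound, unknown face, much motion -> alarm
              [0.2, 1.0, 0.3],   # quiet, known face                     -> normal
              [0.8, 1.0, 0.9],   # loud but known face                   -> normal
              [0.7, 0.0, 0.1]])  # fairly loud, unknown face             -> alarm
y = np.array([1.0, 0.0, 0.0, 1.0])

rng = np.random.default_rng(0)
w, b = rng.normal(size=3), 0.0
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(2000):            # plain gradient descent on cross-entropy loss
    p = sigmoid(X @ w + b)
    grad = p - y                 # derivative of the loss w.r.t. the logit
    w -= 0.5 * (X.T @ grad) / len(y)
    b -= 0.5 * grad.mean()

print(sigmoid(X @ w + b).round(2))  # approaches [1, 0, 0, 1] on the training set
```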
Referring now to Fig. 3, a functional diagram of an event driven architecture that may be used to monitor an occupied zone separates the object illustrated by the single "black box" of classifier 510, into multiple objects whose outputs are combined to classify alarm conditions. Audio input 245, video input 255, and other user interface devices (not shown) generate signals that are applied to respective classifiers 210, 240. The audio input 245, which may be received by a microphone (not shown separately) or a directional audio detector (not shown separately) which indicates both the sound and its direction, or any other suitable audio transducer, may be applied to an audio classifier 210. The latter data form a real-time signal, which the audio classifier 210 classifies by suitable digital or analog means or a combination thereof. The audio classifier 210 then generates a current state information signal which it applies to both a mental state/health status classifier 290 and an event/class processor 207.
To illustrate, the signal generated by the audio classifier may be a vector that includes the following components.
1. Identity of speaker,
2. Number of speakers,
3. Type of sound (crashing, bumping, periodic, tapping, etc.),
4. Sound intensity level,
5. Duration and time of day of the distinguished sound,
6. Quality of speech (whispering, yelling, rapid, etc.),
7. Quality of voice (masculine, feminine, child, etc.), and
8. Identified event (switching of a light, snoring, tinny sound of a radio or TV, vacuum cleaner, etc.).
Each instance of a discrete sound event and/or state may be combined with a time stamp indicating the time it began and, if it has ended, the time it ended; the combined vector signal is then applied to the event/class processor 207.
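A minimal sketch of such a time-stamped vector, with illustrative (not prescribed) field names:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AudioStateVector:
    """One observation from the audio classifier (components 1-8 above)."""
    speaker_identity: Optional[str]  # None when the speaker is unrecognized
    speaker_count: int
    sound_type: str                  # "crashing", "bumping", "periodic", ...
    intensity_level: float           # e.g., in dB
    start_time: float                # time stamp: when the sound began
    end_time: Optional[float]        # None while the sound is still ongoing
    speech_quality: Optional[str]    # "whispering", "yelling", "rapid", ...
    voice_quality: Optional[str]     # "masculine", "feminine", "child", ...
    identified_event: Optional[str]  # "light_switch", "snoring", "tv", ...
```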
A video image classifier 240 receives video input 255, classifies image data, and generates state information signals which are applied to the mental state/health status classifier 290 and the event/class processor 207. The video image classifier 240 may be programmed to identify certain events such as gestures, rapid movement, number of occupants in its field of view, etc. Like the audio classifier 210, its output may be a vector which, for illustrative purposes, includes the following components.
1. Number of occupants,
2. Identity of occupants (including unrecognized), which may be derived from body, facial features, movement, etc.,
3. Body position/gesture of each occupant (e.g., standing, seated, drinking, eating, etc.),
4. Sizes of transient objects in scene,
5. Nature of transient objects in scene (e.g., television, dinner plate, laundry basket, etc.),
6. Rapidity of movement of the image center of occupants as an indication of running or chaos, and
7. Change in camera angle, etc.
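The video state vector might be represented analogously; again, the field names and types are illustrative assumptions:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Occupant:
    identity: Optional[str]  # None when no template matches (component 2)
    posture: str             # "standing", "seated", "lying", ... (component 3)
    centroid_speed: float    # image-plane speed of the occupant's blob center

@dataclass
class VideoStateVector:
    occupants: List[Occupant]            # components 1-3
    transient_object_sizes: List[float]  # component 4 (blob areas in pixels)
    transient_object_labels: List[str]   # component 5
    max_centroid_speed: float            # component 6 (running/chaos cue)
    camera_angle_changed: bool           # component 7
    timestamp: float
```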
Video processing techniques from various fields such as authentication, gesture control of machines, etc. may be employed in the current system according to the particular aims of the system designer.
Other input devices with associated classifiers 235 apply their output signals to the event/class processor 207. The other UI classifiers 235 may include instrumentation monitoring the environment such as ambient light level, time of day, temperature of the room, security status of a building, etc.
Text data may be obtained from a speech-to-text converter 215 which receives the audio input 245 and converts it to text. When obtained from audio, the text may be time-stamped by the speech-to-text converter 215. The speech-to-text converter 215 parses the text using grammatical or structural rules, such as those used in new or prior-art conversation simulators, those used in natural language search engines, or other suitable means. The result of this parsing is the extraction of words or utterance features that the mental state/health status classifier 290 may recognize. Parsing may be done using rule-based template matching, as in conversation simulators, or using more sophisticated natural language methods. Words indicative of mood may then be sent to the mental state/health status classifier 290 for classification of the mood of the speaker.
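A toy sketch of this extraction step, assuming a small hand-made mood lexicon (the text prescribes no particular vocabulary or matching rules):

```python
import re
import time

# Hypothetical mood lexicon; the text does not enumerate one.
MOOD_WORDS = {
    "great": "enthusiasm", "wonderful": "enthusiasm", "best": "enthusiasm",
    "tired": "fatigue", "exhausted": "fatigue",
    "fine": "neutral", "whatever": "boredom",
}

def extract_mood_features(utterance: str) -> dict:
    """Time-stamp the utterance, count its words, and extract mood-laden
    words for the mental state/health status classifier 290."""
    words = re.findall(r"[a-z']+", utterance.lower())
    return {
        "timestamp": time.time(),
        "word_count": len(words),  # low counts can suggest boredom (see below)
        "mood_words": [(w, MOOD_WORDS[w]) for w in words if w in MOOD_WORDS],
    }
```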
The mental state/health status classifier 290 receives signals from the various classifiers and processes these to generate a mood/personality state signal. The mental state/health status classifier 290 may be a trained neural network, a Bayesian network, a simple rule-based system, or any other type of classifier capable of taking many different inputs and predicting a probability of the occupant being in a given emotional state and having a given personality. Various personality and mood typologies may be used, running from simple to complex. An example of a set of rules for classifying an occupant as bored is:
- low sentence/phrase word count (the occupant's sentences contain few words) (input parser 410 signal indicating response word count),
- a low incidence of words suggesting enthusiasm, such as superlatives (input parser 410 signal indicating adjectives),
- a quiet, flat tone in the voice (audio classifier 210 signal indicating modulation/inflection intensity),
- a lack of physical movement (video image classifier 240 signal),
- little movement of the head or body,
- sighing sounds, etc.,
- looking at a watch, and
- lack of eye contact with objects such as a television or book in the scene.
Each of these may be classified by the corresponding classifier. The color of the occupant's clothes, the pitch of the occupant's voice, the number of times the occupant enters and leaves a single scene, the way the occupant gestures, etc. all may provide clues to the occupant's emotional state and/or personality. The output vector may be any suitable mental state classification. For example, the valence/intensity emotional state typology suggested in US Patent No. 5,987,415 may be used. The following tables summarize the Big Five, which is an evolutionary outgrowth of the Myers-Briggs typology. There are many academic papers on the subject of modeling emotional states and personalities, and many of these address the issues of machine classification based on voice, facial expression, body posture, and many other machine inputs. Even the weather, which may be obtained using an agent over the Internet or via instruments measuring basic weather data such as daily sunshine, may be used to infer mental/emotional state.
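Before turning to those tables, the boredom rule set above lends itself to a compact sketch; the voting scheme and every threshold below are invented for illustration:

```python
def classify_bored(word_count, superlative_rate, voice_modulation,
                   movement_level, sigh_rate, watch_glances, eye_contact):
    """Each cue from the bullet list above casts one vote; a majority of
    'bored' votes yields the classification. All thresholds are invented."""
    votes = [
        word_count < 4,           # short sentences/phrases
        superlative_rate < 0.02,  # few enthusiastic adjectives
        voice_modulation < 0.3,   # quiet, flat tone
        movement_level < 0.2,     # little physical movement
        sigh_rate > 0.5,          # sighing sounds per minute
        watch_glances > 2,        # looking at a watch
        eye_contact < 0.1,        # little eye contact with TV/book
    ]
    return sum(votes) >= 4
```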
The Six Facets of Negative Emotionality (adapted from Costa & McCrae, 1992) with Anchors for the Two Extremes of the Continuum
[Table reproduced as an image in the original.]
The Six Facets of Extraversion (adapted from Costa & McCrae, 1992) with Anchors for the Two Extremes of the Continuum
[Table reproduced as an image in the original.]
The Six Facets of Openness (adapted from Costa & McCrae, 1992) with Anchors for the Two Extremes of the Continuum
[Table reproduced as an image in the original.]
The Six Facets of Agreeableness (adapted from Costa & McCrae, 1992) with Anchors for the Two Extremes of the Continuum

Six Facets of Agreeableness | CHALLENGER | ADAPTER (A+)
Trust | Cynical; skeptical | Sees others as honest & well-intentioned
Straightforwardness | Guarded; stretches truth | Straightforward, frank
Altruism | Reluctant to get involved | Willing to help others
Compliance | Aggressive; competitive | Yields under conflict; defers
Modesty | Feels superior to others | Self-effacing; humble
Tender-Mindedness | Hardheaded; rational | Tender-minded; easily moved
The Six Facets of Conscientiousness (adapted from Costa & McCrae, 1992) with Anchors for the Two Extremes of the Continuum
[Table reproduced as an image in the original.]
The mental state/health status classifier 290 outputs a state vector, with a number of degrees of freedom, that corresponds to the models of personality and mental state chosen by the designer. The mental state/health status classifier 290 may cumulate instantaneous data over a period of time in modeling personality, since this is a persistent state. The mental state will have more volatile elements.
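One simple way to realize this split between persistent and volatile state is a pair of exponential moving averages with very different time constants; the smoothing factors here are assumptions:

```python
import numpy as np

class StateAccumulator:
    """Two exponential moving averages over instantaneous classifier vectors:
    a slow one for the persistent personality estimate and a fast one for the
    volatile mental state. The smoothing factors are assumptions."""

    def __init__(self, dims, alpha_personality=0.001, alpha_mood=0.3):
        self.personality = np.zeros(dims)
        self.mood = np.zeros(dims)
        self.a_p = alpha_personality
        self.a_m = alpha_mood

    def update(self, instant_vector):
        v = np.asarray(instant_vector, dtype=float)
        self.personality = (1 - self.a_p) * self.personality + self.a_p * v
        self.mood = (1 - self.a_m) * self.mood + self.a_m * v
        return self.personality, self.mood
```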
The event/class processor 207 is a classifier that combines state information from multiple classifiers to generate an environment/occupant state signal indicating the current status of the system's environment, including the occupants, particularly the monitored person. The event/class processor may also generate event signals (interrupt signals) to ensure an instant response when certain events are recognized by the classifiers, for example, events that may coincide with an emergency condition. The recognition of events may require state information from multiple classifiers, so the event/class processor 207 combines state data from multiple classifiers to generate a combined state signal and a combined event signal. The environment/state signal may include an indication of all the possible event classes the various classifiers are capable of identifying, or only those surpassing a threshold level of confidence. The output generator 415 receives the mood/personality state vector and parsed reply data from the mental state/health status classifier 290 and the input parser 410, respectively. The output generator 415 also receives the environment/occupant state signal and event signal from the event/class processor 207. The output generator 415 selects a type of response corresponding to the mental state, the environment/occupant state, and the event signal from an internal database and generates an alarm output if required. Alternatively, the output generator may be programmed to select an output template that solicits further data from an occupant through a user interface, such as the terminal 116 (Fig. 1). For example, if the various classifier output components indicate low confidence levels, the system could generate speech through the speaker 114 asking for information about the current state of the occupied space. For example, "Is anyone there?" could be generated if no clear presence of an adult can be detected. The system then uses its other input devices, such as video input 255, to decrease ambiguity in its status and event signals. Note that these features may be implemented through a conversation simulator, as described in US Patent Ser. Nos. 09/699,606, 09/686,831, and 09/699,577, which may be built into the system to operate as a machine assistant.
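A minimal sketch of the combining-and-soliciting behavior, assuming each classifier reports per-class confidences; the thresholds are illustrative:

```python
def combine_states(class_probs, event_threshold=0.9, report_threshold=0.5):
    """Fold per-class confidence values from all classifiers into a combined
    state signal, a list of event (interrupt) classes, and an ambiguity flag
    telling the output generator 415 to solicit more data (e.g., generate
    "Is anyone there?"). Both thresholds are invented for illustration."""
    state = {c: p for c, p in class_probs.items() if p >= report_threshold}
    events = [c for c, p in class_probs.items() if p >= event_threshold]
    ambiguous = not state  # nothing passed the confidence threshold
    return state, events, ambiguous
```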
Tracing the data flow beginning with the video input 255, the video input 255 signal is applied to the video image classifier 240. The video image classifier 240 is programmed to recognize a variety of different image and video-sequence classes in the video input 255 signal. For example, it may be programmed to distinguish between a person sitting up and lying down; between a person sitting still and one moving agitatedly or leaving an area; etc. A probability for each of these classes may be generated and output as a signal. Alternatively, a single, most-probable class may be generated and output as a signal. This signal is applied to the event/class processor 207, which combines this data with other class data to generate an environment/occupant state signal. If the event/class processor 207 receives an indication from the video image classifier 240 that something sudden and important has occurred, for example, the occupant has gotten up and left the room, the event/class processor 207 will generate an event signal. If the mental state/health status classifier 290 receives a signal from the video image classifier 240 indicating the occupant is moving in a fashion consistent with being agitated, the mental state/health status classifier 290 may combine this information with other classifier signals to generate a mood/personality state vector indicating an emotional state of heightened anxiety. For example, the audio classifier 210 may be contemporaneously indicating that the speaker's voice is more highly pitched than usual, and the input parser 410 may indicate that the word count of the most recent utterances is low.
Note that, to allow the system to determine whether a current class or state represents a change from a previous time, the event/class processor 207 and the mental state/health status classifier 290 may be provided with a data storage capability and means for determining the current occupant, so that corresponding histories can be stored for different occupants. Identification of occupants, as mentioned above, may be by face recognition by means of the video image classifier 240 or by voice signature. It may also be confirmed by radio frequency identification (RFID) token, smart card, or a simple user interface that permits the occupant to identify him/herself with a biometric indicator such as a thumbprint or simply a PIN code. In this way, both the mental state/health status classifier 290 and the event/class processor 207 may each correlate historical data with particular occupants and employ it in identifying and signaling trends to the output generator 415.
The event/class processor 207 receives class information from the audio classifier 210 and other classifiers and attempts to identify these with a metaclass it is trained to recognize. That is, it combines classes of states to define an overarching state that is consistent with that multiple of states. Of course, the architecture described herein is not the only way to implement the various features of the invention; the event/class processor 207 could simply be omitted and its functions taken over by the output generator 415. One advantage of separating the functions, however, is that the event/class processor 207 may employ a different type of classifier than the one used by the output generator 415. For example, the output generator 415 could use a rule-based template matcher while the event/class processor 207 could use a trained neural network-type classifier. This allocation of functions may be more suitable since the number of outputs of the output generator 415 may be much higher than the number of classes the event/class processor 207 (or the other classifiers) is trained to recognize. This follows from the fact that network-type classifiers (such as neural network and Bayesian network classifiers) are difficult to train when they have a large number of possible output states.
The video image classifier 240 process may include the ability to control the cameras (represented by video input 255) that receive video information. The video image classifier 240 may contain a process that regularly attempts to distinguish objects in the room that may or may not be individuals and to zoom in on various features of those individuals. For example, every time the video image classifier identifies a new individual, it may attempt to identify where the face is in the visual field and regularly zoom in on the face of each individual that has been identified in the field of view, in order to obtain facial expression information which can be used for identifying the individual or for identifying the mood of the individual.
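A sketch of the locate-and-zoom step, assuming OpenCV's stock face detector (the text names no particular detection method); digital cropping stands in for an optical zoom:

```python
import cv2

# OpenCV's stock Haar cascade is an assumption; the text names no library.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def zoom_on_faces(frame, margin=0.25):
    """Locate faces in a frame and return cropped, digitally 'zoomed' face
    regions for identity or facial-expression analysis."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    crops = []
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.1, 5):
        m = int(margin * w)                    # widen the crop a little
        x0, y0 = max(x - m, 0), max(y - m, 0)
        crops.append(frame[y0:y + h + m, x0:x + w + m])
    return crops
```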
We note that the invention may be designed without the use of artificial intelligence (AI) technology as described above, although the robustness of AI technology makes its use highly desirable. For example, an audio signal may be filtered by a bandpass filter set for detection of loud crashing sounds, and a detector may set a time-latch output when the filter output is above a certain level. Concurrently, a video luminance signal may be low-pass filtered, and when its energy goes beyond a certain level, it also sets a time-latch. If both latched signals go positive (loud sound and great activity in temporal proximity), the system may generate an alarm.
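A sketch of this non-AI variant, assuming SciPy filters; the band edges and thresholds are invented:

```python
import numpy as np
from scipy.signal import butter, lfilter

def crash_band_energy(audio, fs, lo=1000.0, hi=4000.0):
    """Band-pass the audio toward crash-like frequencies (band edges are
    assumptions) and return its mean energy."""
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return float(np.mean(lfilter(b, a, audio) ** 2))

def activity_energy(luminance_series, width=5):
    """Low-pass the per-frame luminance signal with a moving average and
    return its variance as a crude activity measure."""
    smooth = np.convolve(luminance_series, np.ones(width) / width, mode="valid")
    return float(np.var(smooth))

def latched_alarm(audio, fs, luminance_series,
                  sound_thresh=0.01, activity_thresh=5.0):
    """Alarm when both 'latches' are set: a loud crash-band sound and high
    visual activity in temporal proximity. The thresholds are invented."""
    return (crash_band_energy(audio, fs) > sound_thresh and
            activity_energy(luminance_series) > activity_thresh)
```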
An alarm signal may be no more than some kind of notification of an alarm status. Preferably, however, alarms should be as informative as possible within the specified design criteria. For example, an alarm signal may contain audio and/or video data preceding and following the event(s) that triggered the alarm status. These could be recorded by the output generator 415 and transmitted by email, streamed through a cell-phone connection or wireless convergence device with video capability, or by some other means. Symbolic representations of the most significant state classes that gave rise to the meta-classification of the alarm condition may also be transmitted. For example, a symbol indicating "loud noise" and/or "unrecognized occupant" may be transmitted to, say, a text pager used by a responsible party.
Referring now to Fig. 4, an arbitrary number of signals may be buffered continuously, as illustrated by step S10. If an alarm condition is indicated, it is determined at step S15 whether the particular alarm condition had been previously overridden. If it had, buffering of signals is resumed and no further action is taken. If the alarm condition had not been overridden, a message is generated at step S20 and the buffered signals 1..N are attached at step S30. The alarm message is then transmitted, for example by email, in step S40, and an optional live feed is generated at step S50 if appropriate. The live feed may be made available at a URL included in an email transmission or as a portion of a signal in a message transmitted by an automated telephone call to a digital video telephone.
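The S15-S50 flow might be sketched as follows; the Alarm type and the injected transport callables are assumptions:

```python
from dataclasses import dataclass

@dataclass
class Alarm:
    signature: tuple  # quantized component vector identifying the condition
    description: str

def handle_alarm(alarm, overridden_signatures, buffers, send_email,
                 start_live_feed=None):
    """Steps S15-S50 of Fig. 4. `buffers` holds objects exposing snapshot()
    (see the ring-buffer sketch below); `send_email` and `start_live_feed`
    are injected callables, since the transport is not fixed here."""
    if alarm.signature in overridden_signatures:       # S15: overridden before?
        return
    message = "Alarm: " + alarm.description            # S20: generate message
    attachments = [buf.snapshot() for buf in buffers]  # S30: attach signals 1..N
    send_email(message, attachments)                   # S40: transmit, e.g., email
    if start_live_feed is not None:                    # S50: optional live feed
        start_live_feed()
```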
The buffered signal may be no more than a time sequence indicating the status of one or more sensors over time. The buffered signals need not be the signals that caused the indication of an alarm condition. For example, in an embodiment of the invention, a video camera may be trained on a person's bed. The alarm may be generated by a mechanical sensor (such as a chest strap) that detects breathing. The video signal buffered up to the moment the cessation of the person's breathing is detected may be the signal transmitted as part of the alarm message. The length of the buffer may be as desired.
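A ring buffer is a natural fit for such pre-event capture; a minimal sketch, with illustrative sizes:

```python
from collections import deque

class SignalBuffer:
    """Fixed-length ring buffer holding the most recent samples of any monitor
    signal, so an alarm message can include data from just before (and after)
    the triggering event. The sizes are illustrative."""

    def __init__(self, seconds, rate_hz):
        self.samples = deque(maxlen=int(seconds * rate_hz))

    def push(self, sample):
        self.samples.append(sample)  # the oldest sample drops off automatically

    def snapshot(self):
        return list(self.samples)    # copy to attach to an alarm message

# Example: keep the last 30 s of 10 fps video preceding a breathing alarm.
video_buffer = SignalBuffer(seconds=30, rate_hz=10)
```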
Each alarm may be a unique event, but each may also be generated by the same persistent condition, for example a failure of an infant or child to breathe for a period of time. It is desirable for a given alarm to be acknowledged so that a new alarm condition, arising from different circumstances, is not confused with the existing alarm currently being attended to. One way to handle this is to assign a signature to each alarm based on a vector of the components that gave rise to the alarm condition. The recognition of the same alarm condition would give rise to another vector, which may be compared to a table of existing alarms (at step S15) to see if the new alarm had already been overridden. The components may be quantized to insure against small differences in vector components being identified as different, or a low-sensitivity comparison may be used to achieve the same effect.
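A sketch of this signature-and-override check, assuming a fixed quantization step:

```python
import numpy as np

overridden_signatures = set()

def alarm_signature(components, step=0.25):
    """Quantize the vector of class components that raised the alarm so that
    near-identical recurrences map to the same signature (the low-sensitivity
    comparison described above). The step size is an assumption."""
    q = np.round(np.asarray(components, dtype=float) / step).astype(int)
    return tuple(q.tolist())

def should_suppress(components):
    """Step S15: suppress the alarm if an equivalent one was overridden."""
    return alarm_signature(components) in overridden_signatures
```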
Alarm signals may be transmitted by any of the following means:
1. Automatic telephone call with a synthetic voice providing a symbolic indication of the alarm condition (pre-recorded phrases or synthetic speech) and/or buffered audio and/or live audio fed from the monitored space.
2. Wireless appliance with video, which may include the above plus recorded and/or live data plus text messages providing the same information.
3. E-mail message, which may contain links to a URL with live or recorded data or may have an embedded MIME attachment providing still or moving images.
4. Broadcast: radio message, audio message, display on a wired console, etc.
The following are several example applications and use scenarios.
Example 1:
An infant's crib is placed against a wall with one or more cameras 135, 136 aimed at a side of the crib where a caretaker would ordinarily stand to view the infant. A microphone 112 is placed in a position to pick up sounds near the crib. The controller 100 receives live video and audio signals from the camera and microphone and filters them through classifiers 240 and 210, respectively. The controller 100 is programmed to recognize the caretaker's face and produces a signal indicating that a face is present and a reliability estimate indicating how well the face matches expectation. The controller 100 may be programmed to recognize other faces as well, such as relatives of the baby and children. The controller 100 is further programmed to recognize the sound of crying and produce a signal indicating that crying is present. In addition, the controller 100 is programmed to recognize the following events and produce corresponding signals: normal and abnormal body habitus of the infant; facial expression of the infant indicating a mood such as crying, content, playing, or distressed; moving quickly or slowly; the number of individuals present; the presence of new objects in the room and their "blob" sizes ("blob" is a term of art characterizing any closed connected shape that an image processor can define in a video image); and the mood of the recognized face of the caretaker.
In the above example, the following events may occur. The infant cries and the caretaker fails to respond. The infant's mood is detected from the audio and video signals received. A synthetic voice calls to the caretaker via the speaker 114 requesting assistance for the infant. When the elapsed time from the recognition of the infant's distress reaches a specified interval, an alarm is triggered. The alarm signal includes a text message, buffered video, and buffered audio from a time prior to the alarm event. The alarm signal is sent by email.
Example 2:
The system configuration of Example 1 is included in the present example.
Additional cameras and microphones are located at various locations in a house. The baby's presence on a couch is detected. The caretaker is recognized on the couch. The sound of the television is detected. The body position of the adult caretaker is detected in a sleeping position. The system generates synthetic speech requesting the caretaker to wake up and attend to the infant (the programming of the controller 100 being a reflection of a concern that the infant is not in a safe place to be unattended). After a predefined interval during which continued sleep (or illness or death) is detected, an alarm is generated. The alarm contains still images from the video (from a time prior to the alarm) of the sleeping adult and infant, and live audio and video of the room in which the infant and adult are located.
Example 3:
The physical configuration (including programming) of Example 2 is incorporated in the present example. In this example, the infant and crib are replaced by a sick adult, and the caretaker is an adult. The supervisor is an agency that supplied the nurse. The alarm condition detected by the system is the lack of movement of any articles or persons in the house. The recognition of occupants, facial recognition, body habitus and body positions, audio, etc. produce indications of the presence of an adult in a living room, but the signals have low reliability and there is no indication of activity (movement of the occupant). The system is programmed to generate an alarm if none of a set of expected events, such as the appearance of a recognized face at the bedside of the sick person, or movement plus a recognized body position of a person in expected locations in the house, occurs for a predefined interval. The alarm signal contains a live video and audio feed which is controllable by the supervisor (who can switch views, etc. and communicate with the caretaker).
Example 4:
The system of Example 3 is incorporated in the present example. A babysitter is taking care of multiple children. The system is programmed to recognize the babysitter's face and the faces of the children, along with various characterizing features of these people such as body shape, movement patterns, etc. The system is programmed to generate an alarm, and transmit it to a wireless handheld device, when any of the following occurs: (1) the number of occupants detected in the house (the system detecting simultaneously, at multiple cameras, a particular number of individuals) is greater than a predefined number (presumably the babysitter plus the children); (2) the system detects an unrecognized face (i.e., it receives a good image of a face, but cannot produce a reliable match with a known template); (3) doors are left open for extended periods of time (as indicated by security system status); or (4) movement of persons in the house is excessive, potentially indicating roughhousing or abuse.
Although in the embodiments described the visage, location, and activities of both the caretaker and the person requiring care were monitored, it is possible to monitor only one or the other rather than both. Also, the notion of "occupants," recognized or not, may be applied to pets as well as to human occupants. Although, admittedly, the state of the art may not be at a point where a pet's facial features can be distinguished from those of other animals, blob recognition is certainly at a point where it can distinguish between a poodle and a cat, or between other kinds of animals and/or animate objects such as robots.
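The four trigger conditions of Example 4 might be combined in a single predicate like the following; the numeric limits are assumptions:

```python
def babysitter_alarm(occupant_count, max_expected, unrecognized_face,
                     door_open_s, movement_level,
                     door_limit_s=120.0, movement_limit=0.8):
    """The four trigger conditions of Example 4 as one predicate; the numeric
    limits are invented for illustration."""
    return (occupant_count > max_expected        # (1) too many occupants
            or unrecognized_face                 # (2) good face image, no match
            or door_open_s > door_limit_s        # (3) door open too long
            or movement_level > movement_limit)  # (4) excessive movement
```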
While the present invention has been explained in the context of the preferred embodiments described above, it is to be understood that various changes may be made to those embodiments, and various equivalents may be substituted, without departing from the spirit or scope of the invention, as will be apparent to persons skilled in the relevant art.

CLAIMS:
1. A device for monitoring a first person requiring supervision, comprising:
- a controller (100) programmed to receive at least one monitor signal from an environment monitor (135, 305, 141, 112) located in a monitored zone;
- said controller being programmed to classify at least one alarm condition threatening to said first person responsively to said environment monitor to produce class data; and
- said controller being programmed to generate an alarm signal responsively to said class data, said alarm signal including at least a portion of said monitor signal at least one of immediately prior to or immediately after an incidence of said alarm condition.
2. A device as in claim 1, wherein said at least one monitor signal includes at least one of a still image, video, and audio data.
3. A device as in claim 1, wherein said controller is programmed to recognize faces and said alarm condition is responsive to one of a recognition of a face or a failure to recognize a face.
4. A device as in claim 3, wherein said controller is programmed to solicit an action by an occupant, said monitor signal being responsive to said action by said occupant.
5. A device as in claim 1, wherein said controller is programmed to solicit an action by an occupant, said monitor signal being responsive to said action by said occupant.
6. A device as in claim 1, wherein said controller is programmed to recognize a speaker's voice, said alarm signal being responsive to one of a recognition of said speaker's voice and a failure to recognize said speaker's voice.
7. A device as in claim 1, wherein said at least one monitor signal includes a signal from a detector (141) configured to detect a lapse in breathing by said person.
8. A device as in claim 1, wherein said alarm signal includes at least a portion of said monitor signal immediately prior to and immediately after an incidence of said alarm condition.
9. A device as in claim 1, wherein said alarm signal includes at least one of an audio signal, a text data signal, and a video signal.
10. A monitoring system for monitoring the environment of a person requiring supervision, comprising:
- a controller (100) connected to receive at least one signal from at least one sensor (135, 305, 141, 112);
- said at least one sensor generating first and second signals responsive to a first state of a caretaker of said person and a second state of said person, respectively;
- said controller being programmed to generate a first alarm signal when said first state is outside a first specified range and to generate a second alarm signal when said second state is outside a second specified range.
11. A monitoring system as in claim 10, wherein said first alarm signal includes a sample of at least one of said first and second signals.
12. A monitoring system as in claim 10, wherein said controller is programmed to generate a message to solicit an action by said caretaker when said first state is outside said first specified range.
13. A method of monitoring a person requiring supervision, comprising the steps of:
- generating a first signal indicative of a status of a person or said person's environment;
- detecting an event requiring the attention of a remote supervisor;
- transmitting at least a portion of said first signal to said remote supervisor responsively to a result of said step of detecting.
14. A method as in claim 13, wherein said step of transmitting includes transmitting an electromagnetic signal including at least one of audio, video, and text data.
15. A method as in claim 13, wherein said person is an infant and said step of detecting includes detecting a lapse of breathing of said infant.
16. A method as in claim 13, wherein said step of detecting includes detecting at least one of an audio signal and video signal and classifying a predefined pattern in said at least one of an audio signal and a video signal.
17. A method as in claim 13, wherein said step of detecting includes detecting behavior of a person other than said person being monitored and in said person's environment.
18. A method as in claim 13, wherein said step of detecting includes at least one of recognizing a face of said person or another, classifying a body habitus of said person, classifying a physiognomy of said person, detecting a speed of movement of said person or another, detecting a number of persons in an occupied zone, and recognizing a voice signature, said steps of recognizing, classifying, and detecting being automatic machine processes.
19. A method as in claim 13, wherein said step of detecting includes detecting a failure of said person or another to move, speak, or generate any other detectable activity.
PCT/IB2002/000547 2001-03-15 2002-02-21 Automatic system for monitoring person requiring care and his/her caretaker automatic system for monitoring person requiring care and his/her caretaker WO2002075687A2 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
AT02712177T ATE296473T1 (en) 2001-03-15 2002-02-21 AUTOMATIC MONITORING SYSTEM FOR A PATIENT AND HIS NURSE
DE60204292T DE60204292T2 (en) 2001-03-15 2002-02-21 AUTOMATIC MONITORING SYSTEM FOR A PATIENT AND HOSPITAL
JP2002574620A JP2004531800A (en) 2001-03-15 2002-02-21 Automated system for monitoring persons requiring monitoring and their caretakers
EP02712177A EP1371042B1 (en) 2001-03-15 2002-02-21 Automatic system for monitoring person requiring care and his/her caretaker automatic system for monitoring person requiring care and his/her caretaker

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US09/808,848 2001-03-15
US09/808,848 US6968294B2 (en) 2001-03-15 2001-03-15 Automatic system for monitoring person requiring care and his/her caretaker

Publications (2)

Publication Number Publication Date
WO2002075687A2 true WO2002075687A2 (en) 2002-09-26
WO2002075687A3 WO2002075687A3 (en) 2003-05-30

Family

ID=25199922

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2002/000547 WO2002075687A2 (en) 2001-03-15 2002-02-21 Automatic system for monitoring person requiring care and his/her caretaker automatic system for monitoring person requiring care and his/her caretaker

Country Status (7)

Country Link
US (1) US6968294B2 (en)
EP (1) EP1371042B1 (en)
JP (1) JP2004531800A (en)
KR (1) KR20030001504A (en)
AT (1) ATE296473T1 (en)
DE (1) DE60204292T2 (en)
WO (1) WO2002075687A2 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004030532A1 (en) 2002-10-03 2004-04-15 The University Of Queensland Method and apparatus for assessing psychiatric or physical disorders
JP2005033608A (en) * 2003-07-08 2005-02-03 Victor Co Of Japan Ltd Video monitor device with e-mail transmitting function
WO2006023097A1 (en) * 2004-08-04 2006-03-02 Kimberco, Inc. Computer-automated system and method of assessing the orientation, awareness and responses of a person with reduced capacity
EP1883342A2 (en) * 2005-05-03 2008-02-06 Aware Technologies, Inc. Method and system for wearable vital signs and physiology, activity, and environmental monitoring
CH710525A1 (en) * 2014-12-19 2016-06-30 Bkw Energie Ag Device for detecting an environment and for interaction with a user.
EP3805980A1 (en) * 2019-10-11 2021-04-14 Kepler Vision Technologies B.V. A system to notify a request for help by detecting an intent to press a button, said system using artificial intelligence
EP3848916A1 (en) * 2020-01-09 2021-07-14 Kxkjm Device for detecting hazardous behaviour of at least one person, associated detection method and detection network

Families Citing this family (288)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8823512B2 (en) 1997-01-09 2014-09-02 Donald Spector Sensor with remote communications capability
US8542087B1 (en) * 1997-01-09 2013-09-24 Donald Spector Combined audio/video monitor and light box assembly
US6175752B1 (en) 1998-04-30 2001-01-16 Therasense, Inc. Analyte monitoring device and methods of use
US8688188B2 (en) 1998-04-30 2014-04-01 Abbott Diabetes Care Inc. Analyte monitoring device and methods of use
US8974386B2 (en) 1998-04-30 2015-03-10 Abbott Diabetes Care Inc. Analyte monitoring device and methods of use
US8465425B2 (en) 1998-04-30 2013-06-18 Abbott Diabetes Care Inc. Analyte monitoring device and methods of use
US9066695B2 (en) 1998-04-30 2015-06-30 Abbott Diabetes Care Inc. Analyte monitoring device and methods of use
US8480580B2 (en) 1998-04-30 2013-07-09 Abbott Diabetes Care Inc. Analyte monitoring device and methods of use
US6949816B2 (en) 2003-04-21 2005-09-27 Motorola, Inc. Semiconductor component having first surface area for electrically coupling to a semiconductor chip and second surface area for electrically coupling to a substrate, and method of manufacturing same
US8346337B2 (en) 1998-04-30 2013-01-01 Abbott Diabetes Care Inc. Analyte monitoring device and methods of use
US6560471B1 (en) 2001-01-02 2003-05-06 Therasense, Inc. Analyte monitoring device and methods of use
EP1397068A2 (en) 2001-04-02 2004-03-17 Therasense, Inc. Blood glucose tracking apparatus and methods
US7381184B2 (en) 2002-11-05 2008-06-03 Abbott Diabetes Care Inc. Sensor inserter assembly
US20040116102A1 (en) * 2002-12-17 2004-06-17 International Business Machines Corporation Heuristics for behavior based life support services
AU2003303597A1 (en) 2002-12-31 2004-07-29 Therasense, Inc. Continuous glucose monitoring system and methods of use
US7475059B2 (en) 2003-02-13 2009-01-06 Sap Ag Adapting business objects for searches and searching adapted business objects
US20040186813A1 (en) * 2003-02-26 2004-09-23 Tedesco Daniel E. Image analysis method and apparatus in a network that is structured with multiple layers and differentially weighted neurons
US7292723B2 (en) * 2003-02-26 2007-11-06 Walker Digital, Llc System for image analysis in a network that is structured with multiple layers and differentially weighted neurons
US7587287B2 (en) 2003-04-04 2009-09-08 Abbott Diabetes Care Inc. Method and system for transferring analyte test data
JP3779696B2 (en) * 2003-04-28 2006-05-31 株式会社東芝 System including electronic device and mobile device, and service providing method
US8066639B2 (en) 2003-06-10 2011-11-29 Abbott Diabetes Care Inc. Glucose measuring device for use in personal area network
US20190357827A1 (en) 2003-08-01 2019-11-28 Dexcom, Inc. Analyte sensor
JP3829829B2 (en) * 2003-08-06 2006-10-04 コニカミノルタホールディングス株式会社 Control device, program, and control method
US8775443B2 (en) * 2003-08-07 2014-07-08 Sap Ag Ranking of business objects for search engines
US7920906B2 (en) 2005-03-10 2011-04-05 Dexcom, Inc. System and methods for processing analyte sensor data for sensor calibration
US7154389B2 (en) * 2003-10-30 2006-12-26 Cosco Management, Inc. Monitor for sensing and transmitting sounds in a baby's vicinity
US7299082B2 (en) 2003-10-31 2007-11-20 Abbott Diabetes Care, Inc. Method of calibrating an analyte-measurement device, and associated methods, devices and systems
USD914881S1 (en) 2003-11-05 2021-03-30 Abbott Diabetes Care Inc. Analyte sensor electronic mount
US8589174B2 (en) * 2003-12-16 2013-11-19 Adventium Enterprises Activity monitoring
EP1718198A4 (en) 2004-02-17 2008-06-04 Therasense Inc Method and system for providing data communication in continuous glucose monitoring and management system
US20060008058A1 (en) * 2004-04-29 2006-01-12 Jui-Lin Dai Remote wellness monitoring and communication
US7248171B2 (en) 2004-05-17 2007-07-24 Mishelevich David J RFID systems for automatically triggering and delivering stimuli
CA2572455C (en) 2004-06-04 2014-10-28 Therasense, Inc. Diabetes care host-client architecture and data management system
US20060020192A1 (en) 2004-07-13 2006-01-26 Dexcom, Inc. Transcutaneous analyte sensor
US9779750B2 (en) * 2004-07-30 2017-10-03 Invention Science Fund I, Llc Cue-aware privacy filter for participants in persistent communications
US9704502B2 (en) 2004-07-30 2017-07-11 Invention Science Fund I, Llc Cue-aware privacy filter for participants in persistent communications
US9240188B2 (en) 2004-09-16 2016-01-19 Lena Foundation System and method for expressive language, developmental disorder, and emotion assessment
US10223934B2 (en) 2004-09-16 2019-03-05 Lena Foundation Systems and methods for expressive language, developmental disorder, and emotion assessment, and contextual feedback
JP4794846B2 (en) * 2004-10-27 2011-10-19 キヤノン株式会社 Estimation apparatus and estimation method
US9743862B2 (en) 2011-03-31 2017-08-29 Abbott Diabetes Care Inc. Systems and methods for transcutaneously implanting medical devices
US9788771B2 (en) 2006-10-23 2017-10-17 Abbott Diabetes Care Inc. Variable speed sensor insertion devices and methods of use
US8571624B2 (en) 2004-12-29 2013-10-29 Abbott Diabetes Care Inc. Method and apparatus for mounting a data transmission device in a communication system
US7731657B2 (en) 2005-08-30 2010-06-08 Abbott Diabetes Care Inc. Analyte sensor introducer and methods of use
US9572534B2 (en) 2010-06-29 2017-02-21 Abbott Diabetes Care Inc. Devices, systems and methods for on-skin or on-body mounting of medical devices
US9259175B2 (en) 2006-10-23 2016-02-16 Abbott Diabetes Care, Inc. Flexible patch for fluid delivery and monitoring body analytes
US10226207B2 (en) 2004-12-29 2019-03-12 Abbott Diabetes Care Inc. Sensor inserter having introducer
US20090105569A1 (en) 2006-04-28 2009-04-23 Abbott Diabetes Care, Inc. Introducer Assembly and Methods of Use
US8512243B2 (en) 2005-09-30 2013-08-20 Abbott Diabetes Care Inc. Integrated introducer and transmitter assembly and methods of use
US8613703B2 (en) 2007-05-31 2013-12-24 Abbott Diabetes Care Inc. Insertion devices and methods
US8333714B2 (en) 2006-09-10 2012-12-18 Abbott Diabetes Care Inc. Method and system for providing an integrated analyte sensor insertion device and data processing unit
US7697967B2 (en) 2005-12-28 2010-04-13 Abbott Diabetes Care Inc. Method and apparatus for providing analyte sensor insertion
US9398882B2 (en) 2005-09-30 2016-07-26 Abbott Diabetes Care Inc. Method and apparatus for providing analyte sensor and data processing device
US9636450B2 (en) 2007-02-19 2017-05-02 Udo Hoss Pump system modular components for delivering medication and analyte sensing at seperate insertion sites
US7883464B2 (en) 2005-09-30 2011-02-08 Abbott Diabetes Care Inc. Integrated transmitter unit and sensor introducer mechanism and methods of use
US8029441B2 (en) 2006-02-28 2011-10-04 Abbott Diabetes Care Inc. Analyte sensor transmitter unit configuration for a data monitoring and management system
US8112240B2 (en) 2005-04-29 2012-02-07 Abbott Diabetes Care Inc. Method and apparatus for providing leak detection in data monitoring and management systems
US7827011B2 (en) * 2005-05-03 2010-11-02 Aware, Inc. Method and system for real-time signal classification
US20060260624A1 (en) * 2005-05-17 2006-11-23 Battelle Memorial Institute Method, program, and system for automatic profiling of entities
KR100719240B1 (en) * 2005-05-25 2007-05-17 (주)실비토스 Method for providing resident management service, system for using resident management service, and computer readable medium processing the method
CN102440785A (en) 2005-08-31 2012-05-09 弗吉尼亚大学专利基金委员会 Sensor signal processing method and sensor signal processing device
US7277823B2 (en) * 2005-09-26 2007-10-02 Lockheed Martin Corporation Method and system of monitoring and prognostics
US8688804B2 (en) * 2005-09-26 2014-04-01 Samsung Electronics Co., Ltd. Apparatus and method for transmitting sound information in web-based control system
US9521968B2 (en) 2005-09-30 2016-12-20 Abbott Diabetes Care Inc. Analyte sensor retention mechanism and methods of use
US8880138B2 (en) 2005-09-30 2014-11-04 Abbott Diabetes Care Inc. Device for channeling fluid and methods of use
US7766829B2 (en) 2005-11-04 2010-08-03 Abbott Diabetes Care Inc. Method and system for providing basal profile modification in analyte monitoring and management systems
US7761310B2 (en) * 2005-12-09 2010-07-20 Samarion, Inc. Methods and systems for monitoring quality and performance at a healthcare facility
US20070132597A1 (en) * 2005-12-09 2007-06-14 Valence Broadband, Inc. Methods and systems for monitoring patient support exiting and initiating response
US7786874B2 (en) 2005-12-09 2010-08-31 Samarion, Inc. Methods for refining patient, staff and visitor profiles used in monitoring quality and performance at a healthcare facility
US20080021731A1 (en) * 2005-12-09 2008-01-24 Valence Broadband, Inc. Methods and systems for monitoring patient support exiting and initiating response
US7911348B2 (en) * 2005-12-09 2011-03-22 Bee Cave, LLC. Methods for refining patient, staff and visitor profiles used in monitoring quality and performance at a healthcare facility
CA2636034A1 (en) 2005-12-28 2007-10-25 Abbott Diabetes Care Inc. Medical device insertion
US11298058B2 (en) 2005-12-28 2022-04-12 Abbott Diabetes Care Inc. Method and apparatus for providing analyte sensor insertion
US7736310B2 (en) 2006-01-30 2010-06-15 Abbott Diabetes Care Inc. On-body medical device securement
US7885698B2 (en) 2006-02-28 2011-02-08 Abbott Diabetes Care Inc. Method and system for providing continuous calibration of implantable analyte sensors
US7826879B2 (en) 2006-02-28 2010-11-02 Abbott Diabetes Care Inc. Analyte sensors and methods of use
US8224415B2 (en) 2009-01-29 2012-07-17 Abbott Diabetes Care Inc. Method and device for providing offset model based calibration for analyte sensor
US8583205B2 (en) 2008-03-28 2013-11-12 Abbott Diabetes Care Inc. Analyte sensor calibration management
US7620438B2 (en) 2006-03-31 2009-11-17 Abbott Diabetes Care Inc. Method and system for powering an electronic device
US9339217B2 (en) 2011-11-25 2016-05-17 Abbott Diabetes Care Inc. Analyte monitoring system and methods of use
US8346335B2 (en) 2008-03-28 2013-01-01 Abbott Diabetes Care Inc. Analyte sensor calibration management
US9392969B2 (en) 2008-08-31 2016-07-19 Abbott Diabetes Care Inc. Closed loop control and signal attenuation detection
US8226891B2 (en) 2006-03-31 2012-07-24 Abbott Diabetes Care Inc. Analyte monitoring devices and methods therefor
US7801582B2 (en) 2006-03-31 2010-09-21 Abbott Diabetes Care Inc. Analyte monitoring and management system and methods therefor
US7618369B2 (en) 2006-10-02 2009-11-17 Abbott Diabetes Care Inc. Method and system for dynamically updating calibration parameters for an analyte sensor
US9675290B2 (en) 2012-10-30 2017-06-13 Abbott Diabetes Care Inc. Sensitivity calibration of in vivo sensors used to measure analyte concentration
US8374668B1 (en) 2007-10-23 2013-02-12 Abbott Diabetes Care Inc. Analyte sensor with lag compensation
US8219173B2 (en) 2008-09-30 2012-07-10 Abbott Diabetes Care Inc. Optimizing analyte sensor calibration
US8140312B2 (en) 2007-05-14 2012-03-20 Abbott Diabetes Care Inc. Method and system for determining analyte levels
US8473022B2 (en) 2008-01-31 2013-06-25 Abbott Diabetes Care Inc. Analyte sensor with time lag compensation
US7653425B2 (en) 2006-08-09 2010-01-26 Abbott Diabetes Care Inc. Method and system for providing calibration of an analyte sensor in an analyte monitoring system
US7630748B2 (en) 2006-10-25 2009-12-08 Abbott Diabetes Care Inc. Method and system for providing analyte monitoring
US7920907B2 (en) 2006-06-07 2011-04-05 Abbott Diabetes Care Inc. Analyte monitoring system and method
US20080033752A1 (en) * 2006-08-04 2008-02-07 Valence Broadband, Inc. Methods and systems for monitoring staff/patient contacts and ratios
KR100778116B1 (en) * 2006-10-02 2007-11-21 삼성전자주식회사 Device for correcting motion vector and method thereof
US7733233B2 (en) * 2006-10-24 2010-06-08 Kimberly-Clark Worldwide, Inc. Methods and systems for monitoring position and movement of human beings
US8135548B2 (en) 2006-10-26 2012-03-13 Abbott Diabetes Care Inc. Method, system and computer program product for real-time detection of sensitivity decline in analyte sensors
US20080150730A1 (en) * 2006-12-20 2008-06-26 Cheung-Hwa Hsu Infant remote monitoring system
US8121857B2 (en) 2007-02-15 2012-02-21 Abbott Diabetes Care Inc. Device and method for automatic data acquisition and/or detection
US20080199894A1 (en) 2007-02-15 2008-08-21 Abbott Diabetes Care, Inc. Device and method for automatic data acquisition and/or detection
US8732188B2 (en) 2007-02-18 2014-05-20 Abbott Diabetes Care Inc. Method and system for providing contextual based medication dosage determination
US8930203B2 (en) 2007-02-18 2015-01-06 Abbott Diabetes Care Inc. Multi-function analyte test device and methods therefor
US8123686B2 (en) 2007-03-01 2012-02-28 Abbott Diabetes Care Inc. Method and apparatus for providing rolling data in communication systems
WO2008128208A1 (en) * 2007-04-12 2008-10-23 Magneto Inertial Sensing Technology, Inc. Infant sid monitor based on accelerometer
EP2146625B1 (en) 2007-04-14 2019-08-14 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in medical communication system
US9204827B2 (en) 2007-04-14 2015-12-08 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in medical communication system
EP2146627B1 (en) 2007-04-14 2020-07-29 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in medical communication system
EP2137637A4 (en) 2007-04-14 2012-06-20 Abbott Diabetes Care Inc Method and apparatus for providing data processing and control in medical communication system
CA2683721C (en) 2007-04-14 2017-05-23 Abbott Diabetes Care Inc. Method and apparatus for providing dynamic multi-stage signal amplification in a medical device
ES2784736T3 (en) 2007-04-14 2020-09-30 Abbott Diabetes Care Inc Procedure and apparatus for providing data processing and control in a medical communication system
US7928850B2 (en) 2007-05-08 2011-04-19 Abbott Diabetes Care Inc. Analyte monitoring system and methods
US8461985B2 (en) 2007-05-08 2013-06-11 Abbott Diabetes Care Inc. Analyte monitoring system and methods
US7595815B2 (en) * 2007-05-08 2009-09-29 Kd Secure, Llc Apparatus, methods, and systems for intelligent security and safety
US8665091B2 (en) 2007-05-08 2014-03-04 Abbott Diabetes Care Inc. Method and device for determining elapsed sensor life
US8456301B2 (en) 2007-05-08 2013-06-04 Abbott Diabetes Care Inc. Analyte monitoring system and methods
US9125548B2 (en) 2007-05-14 2015-09-08 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US10002233B2 (en) 2007-05-14 2018-06-19 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US8103471B2 (en) 2007-05-14 2012-01-24 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US7996158B2 (en) 2007-05-14 2011-08-09 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US8444560B2 (en) 2007-05-14 2013-05-21 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US8260558B2 (en) 2007-05-14 2012-09-04 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US8560038B2 (en) 2007-05-14 2013-10-15 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US20080282988A1 (en) * 2007-05-14 2008-11-20 Carl Bloksberg Pet entertainment system
US8239166B2 (en) 2007-05-14 2012-08-07 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US8600681B2 (en) 2007-05-14 2013-12-03 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US20080294012A1 (en) 2007-05-22 2008-11-27 Kurtz Andrew F Monitoring physiological conditions
US20080294018A1 (en) * 2007-05-22 2008-11-27 Kurtz Andrew F Privacy management for well-being monitoring
US8831299B2 (en) 2007-05-22 2014-09-09 Intellectual Ventures Fund 83 Llc Capturing data for individual physiological monitoring
US8038615B2 (en) * 2007-05-22 2011-10-18 Eastman Kodak Company Inferring wellness from physiological conditions data
US8038614B2 (en) * 2007-05-22 2011-10-18 Eastman Kodak Company Establishing baseline data for physiological monitoring system
US7972266B2 (en) * 2007-05-22 2011-07-05 Eastman Kodak Company Image data normalization for a monitoring system
CN103251414B (en) 2007-06-21 2017-05-24 雅培糖尿病护理公司 Device for detecting analyte level
JP5680960B2 (en) 2007-06-21 2015-03-04 アボット ダイアベティス ケア インコーポレイテッドAbbott Diabetes Care Inc. Health care device and method
US8160900B2 (en) 2007-06-29 2012-04-17 Abbott Diabetes Care Inc. Analyte monitoring and management device and method to analyze the frequency of user interaction with the device
US8834366B2 (en) 2007-07-31 2014-09-16 Abbott Diabetes Care Inc. Method and apparatus for providing analyte sensor calibration
US7768386B2 (en) 2007-07-31 2010-08-03 Abbott Diabetes Care Inc. Method and apparatus for providing data processing and control in a medical communication system
US20090044334A1 (en) * 2007-08-13 2009-02-19 Valence Broadband, Inc. Automatically adjusting patient platform support height in response to patient related events
US20090044332A1 (en) * 2007-08-13 2009-02-19 Valence Broadband, Inc. Height adjustable patient support platforms
US8013738B2 (en) * 2007-10-04 2011-09-06 Kd Secure, Llc Hierarchical storage manager (HSM) for intelligent storage of large volumes of data
WO2009045218A1 (en) * 2007-10-04 2009-04-09 Donovan John J A video surveillance, storage, and alerting system having network management, hierarchical data storage, video tip processing, and vehicle plate analysis
US8377031B2 (en) 2007-10-23 2013-02-19 Abbott Diabetes Care Inc. Closed loop control system with safety parameters and methods
US8216138B1 (en) 2007-10-23 2012-07-10 Abbott Diabetes Care Inc. Correlation of alternative site blood and interstitial fluid glucose concentrations to venous glucose concentration
US8409093B2 (en) 2007-10-23 2013-04-02 Abbott Diabetes Care Inc. Assessing measures of glycemic variability
WO2009061936A1 (en) 2007-11-06 2009-05-14 Three H Corporation Method and system for safety monitoring
US7987069B2 (en) 2007-11-12 2011-07-26 Bee Cave, Llc Monitoring patient support exiting and initiating response
US20090287120A1 (en) 2007-12-18 2009-11-19 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Circulatory monitoring systems and methods
US8636670B2 (en) 2008-05-13 2014-01-28 The Invention Science Fund I, Llc Circulatory monitoring systems and methods
US9717896B2 (en) 2007-12-18 2017-08-01 Gearbox, Llc Treatment indications informed by a priori implant information
US20090164239A1 (en) 2007-12-19 2009-06-25 Abbott Diabetes Care, Inc. Dynamic Display Of Glucose Information
US20090189771A1 (en) * 2008-01-28 2009-07-30 Chia-Lun Liu Smart care system
US7978085B1 (en) 2008-02-29 2011-07-12 University Of South Florida Human and physical asset movement pattern analyzer
JP5079549B2 (en) * 2008-03-03 2012-11-21 パナソニック株式会社 Imaging device and imaging device body
WO2009126942A2 (en) 2008-04-10 2009-10-15 Abbott Diabetes Care Inc. Method and system for sterilizing an analyte sensor
US8591410B2 (en) 2008-05-30 2013-11-26 Abbott Diabetes Care Inc. Method and apparatus for providing glycemic control
US8924159B2 (en) 2008-05-30 2014-12-30 Abbott Diabetes Care Inc. Method and apparatus for providing glycemic control
US7826382B2 (en) 2008-05-30 2010-11-02 Abbott Diabetes Care Inc. Close proximity communication device and methods
WO2010009172A1 (en) 2008-07-14 2010-01-21 Abbott Diabetes Care Inc. Closed loop control system interface and methods
US8094009B2 (en) 2008-08-27 2012-01-10 The Invention Science Fund I, Llc Health-related signaling via wearable items
US8125331B2 (en) 2008-08-27 2012-02-28 The Invention Science Fund I, Llc Health-related signaling via wearable items
US8130095B2 (en) 2008-08-27 2012-03-06 The Invention Science Fund I, Llc Health-related signaling via wearable items
US8284046B2 (en) 2008-08-27 2012-10-09 The Invention Science Fund I, Llc Health-related signaling via wearable items
US8734422B2 (en) 2008-08-31 2014-05-27 Abbott Diabetes Care Inc. Closed loop control with improved alarm functions
US20100057040A1 (en) 2008-08-31 2010-03-04 Abbott Diabetes Care, Inc. Robust Closed Loop Control And Methods
US8622988B2 (en) 2008-08-31 2014-01-07 Abbott Diabetes Care Inc. Variable rate closed loop control and methods
US9943644B2 (en) 2008-08-31 2018-04-17 Abbott Diabetes Care Inc. Closed loop control with reference measurement and methods thereof
US8986208B2 (en) 2008-09-30 2015-03-24 Abbott Diabetes Care Inc. Analyte sensor sensitivity attenuation mitigation
US9326707B2 (en) 2008-11-10 2016-05-03 Abbott Diabetes Care Inc. Alarm characterization for analyte monitoring devices and systems
US8749392B2 (en) 2008-12-30 2014-06-10 Oneevent Technologies, Inc. Evacuation system
WO2010084495A1 (en) * 2009-01-23 2010-07-29 Aser Rich Limited. Remote babysitting monitor
US8103456B2 (en) 2009-01-29 2012-01-24 Abbott Diabetes Care Inc. Method and device for early signal attenuation detection using blood glucose measurements
US20100198034A1 (en) 2009-02-03 2010-08-05 Abbott Diabetes Care Inc. Compact On-Body Physiological Monitoring Devices and Methods Thereof
US9799205B2 (en) 2013-07-15 2017-10-24 Oneevent Technologies, Inc. Owner controlled evacuation system with notification and route guidance provided by a user device
US8077027B2 (en) * 2009-02-27 2011-12-13 Tyco Safety Products Canada Ltd. System and method for analyzing faulty event transmissions
US8081083B2 (en) * 2009-03-06 2011-12-20 Telehealth Sensors Llc Mattress or chair sensor envelope with an antenna
US20100225488A1 (en) * 2009-03-06 2010-09-09 Telehealth Sensors, Llc Patient Monitoring System Using an Active Mattress or Chair System
US20100228516A1 (en) * 2009-03-06 2010-09-09 Telehealth Sensors, Llc Electronic mattress or chair sensor for patient monitoring
WO2010121084A1 (en) 2009-04-15 2010-10-21 Abbott Diabetes Care Inc. Analyte monitoring system having an alert
US9226701B2 (en) 2009-04-28 2016-01-05 Abbott Diabetes Care Inc. Error detection in critical repeating data in a wireless sensor system
WO2010127187A1 (en) 2009-04-29 2010-11-04 Abbott Diabetes Care Inc. Method and system for providing data communication in continuous glucose monitoring and management system
EP2425209A4 (en) 2009-04-29 2013-01-09 Abbott Diabetes Care Inc Method and system for providing real time analyte sensor calibration with retrospective backfill
WO2010138856A1 (en) 2009-05-29 2010-12-02 Abbott Diabetes Care Inc. Medical device antenna systems having external antenna configurations
US8613892B2 (en) 2009-06-30 2013-12-24 Abbott Diabetes Care Inc. Analyte meter with a moveable head and methods of using the same
US20110172550A1 (en) 2009-07-21 2011-07-14 Michael Scott Martin Uspa: systems and methods for ems device communication interface
WO2011127459A1 (en) 2010-04-09 2011-10-13 Zoll Medical Corporation Systems and methods for ems device communications interface
ES2776474T3 (en) 2009-07-23 2020-07-30 Abbott Diabetes Care Inc Continuous analyte measurement system
WO2011014851A1 (en) 2009-07-31 2011-02-03 Abbott Diabetes Care Inc. Method and apparatus for providing analyte monitoring system calibration accuracy
WO2011026148A1 (en) 2009-08-31 2011-03-03 Abbott Diabetes Care Inc. Analyte monitoring system and methods for managing power and noise
EP3988470B1 (en) 2009-08-31 2023-06-28 Abbott Diabetes Care Inc. Displays for a medical device
CA2765712A1 (en) 2009-08-31 2011-03-03 Abbott Diabetes Care Inc. Medical devices and methods
WO2011026147A1 (en) 2009-08-31 2011-03-03 Abbott Diabetes Care Inc. Analyte signal processing device and methods
US9320461B2 (en) 2009-09-29 2016-04-26 Abbott Diabetes Care Inc. Method and apparatus for providing notification function in analyte monitoring systems
WO2011041531A1 (en) 2009-09-30 2011-04-07 Abbott Diabetes Care Inc. Interconnect for on-body analyte monitoring device
US8185181B2 (en) 2009-10-30 2012-05-22 Abbott Diabetes Care Inc. Method and apparatus for detecting false hypoglycemic conditions
US20110105919A1 (en) * 2009-10-30 2011-05-05 Mohammed Naji Medical device
USD924406S1 (en) 2010-02-01 2021-07-06 Abbott Diabetes Care Inc. Analyte sensor inserter
WO2011112753A1 (en) 2010-03-10 2011-09-15 Abbott Diabetes Care Inc. Systems, devices and methods for managing glucose levels
CA3135001A1 (en) 2010-03-24 2011-09-29 Abbott Diabetes Care Inc. Medical device inserters and processes of inserting and using medical devices
US8635046B2 (en) 2010-06-23 2014-01-21 Abbott Diabetes Care Inc. Method and system for evaluating analyte sensor response characteristics
US11064921B2 (en) 2010-06-29 2021-07-20 Abbott Diabetes Care Inc. Devices, systems and methods for on-skin or on-body mounting of medical devices
US10092229B2 (en) 2010-06-29 2018-10-09 Abbott Diabetes Care Inc. Calibration of analyte measurement system
US8620625B2 (en) 2010-07-30 2013-12-31 Hill-Rom Services, Inc. Above bed sensor
EP2619724A2 (en) 2010-09-23 2013-07-31 Stryker Corporation Video monitoring system
WO2012048168A2 (en) 2010-10-07 2012-04-12 Abbott Diabetes Care Inc. Analyte monitoring devices and methods
US20120116252A1 (en) * 2010-10-13 2012-05-10 The Regents Of The University Of Colorado, A Body Corporate Systems and methods for detecting body orientation or posture
US8907287B2 (en) 2010-12-01 2014-12-09 Hill-Rom Services, Inc. Patient monitoring system
US9165334B2 (en) * 2010-12-28 2015-10-20 Pet Check Technology Llc Pet and people care management system
CA3177983A1 (en) 2011-02-28 2012-11-15 Abbott Diabetes Care Inc. Devices, systems, and methods associated with analyte monitoring devices and devices incorporating the same
US10136845B2 (en) 2011-02-28 2018-11-27 Abbott Diabetes Care Inc. Devices, systems, and methods associated with analyte monitoring devices and devices incorporating the same
WO2012142502A2 (en) 2011-04-15 2012-10-18 Dexcom Inc. Advanced analyte sensor calibration and error detection
US20130031001A1 (en) * 2011-07-26 2013-01-31 Stephen Patrick Frechette Method and System for the Location-Based Discovery and Validated Payment of a Service Provider
US8599009B2 (en) 2011-08-16 2013-12-03 Elwha Llc Systematic distillation of status data relating to regimen compliance
US9622691B2 (en) 2011-10-31 2017-04-18 Abbott Diabetes Care Inc. Model based variable risk false glucose threshold alarm prevention mechanism
WO2013066873A1 (en) 2011-10-31 2013-05-10 Abbott Diabetes Care Inc. Electronic devices having integrated reset systems and methods thereof
JP6443802B2 (en) 2011-11-07 2018-12-26 Abbott Diabetes Care Inc. Analyte monitoring apparatus and method
JP2015502598A (en) * 2011-11-14 2015-01-22 University of Technology, Sydney Personal monitoring
US8710993B2 (en) 2011-11-23 2014-04-29 Abbott Diabetes Care Inc. Mitigating single point failure of devices in an analyte monitoring system and methods thereof
US9317656B2 (en) 2011-11-23 2016-04-19 Abbott Diabetes Care Inc. Compatibility mechanisms for devices in a continuous analyte monitoring system and methods thereof
EP4344633A2 (en) 2011-12-11 2024-04-03 Abbott Diabetes Care, Inc. Analyte sensor methods
US9084058B2 (en) * 2011-12-29 2015-07-14 Sonos, Inc. Sound field calibration using listener localization
US9295390B2 (en) 2012-03-02 2016-03-29 Hill-Rom Services, Inc. Facial recognition based monitoring systems and methods
TWI474315B (en) * 2012-05-25 2015-02-21 National Taiwan Normal University Infant cry analysis method and system
JP6093124B2 (en) * 2012-07-26 2017-03-08 HybridMom Co., Ltd. Supervisory system
EP2890297B1 (en) 2012-08-30 2018-04-11 Abbott Diabetes Care, Inc. Dropout detection in continuous analyte monitoring data during data excursions
US9968306B2 (en) 2012-09-17 2018-05-15 Abbott Diabetes Care Inc. Methods and apparatuses for providing adverse condition notification with enhanced wireless communication range in analyte monitoring systems
WO2014052136A1 (en) 2012-09-26 2014-04-03 Abbott Diabetes Care Inc. Method and apparatus for improving lag correction during in vivo measurement of analyte concentration with analyte concentration variability and range data
WO2014052802A2 (en) 2012-09-28 2014-04-03 Zoll Medical Corporation Systems and methods for three-dimensional interaction monitoring in an EMS environment
JP6134925B2 (en) * 2013-02-27 2017-05-31 The Doshisha Home support system
USD732558S1 (en) 2013-03-11 2015-06-23 Arris Technology, Inc. Display screen with graphical user interface
US10076285B2 (en) 2013-03-15 2018-09-18 Abbott Diabetes Care Inc. Sensor fault detection using analyte sensor data pattern comparison
US9474475B1 (en) 2013-03-15 2016-10-25 Abbott Diabetes Care Inc. Multi-rate analyte sensor data collection with sample rate configurable signal processing
US10433773B1 (en) 2013-03-15 2019-10-08 Abbott Diabetes Care Inc. Noise rejection methods and apparatus for sparsely sampled analyte sensor data
KR101331195B1 (en) 2013-05-24 2013-11-19 Ido Link Co., Ltd. Privacy-protecting monitoring system
US20150105608A1 (en) * 2013-10-14 2015-04-16 Rest Devices, Inc. Infant Sleeping Aid and Infant-Bed Accessory
US20150154880A1 (en) * 2013-12-02 2015-06-04 Aetna Inc. Healthcare management with a support network
US11229382B2 (en) 2013-12-31 2022-01-25 Abbott Diabetes Care Inc. Self-powered analyte sensor and devices using the same
US10067341B1 (en) * 2014-02-04 2018-09-04 Intelligent Technologies International, Inc. Enhanced heads-up display system
US20170185748A1 (en) 2014-03-30 2017-06-29 Abbott Diabetes Care Inc. Method and Apparatus for Determining Meal Start and Peak Events in Analyte Monitoring Systems
US20150278481A1 (en) * 2014-03-31 2015-10-01 Elwha Llc Quantified-Self Machines and Circuits Reflexively Related to Big-Data Analytics Systems and Associated Food-and-Nutrition Machines and Circuits
US10318123B2 (en) 2014-03-31 2019-06-11 Elwha Llc Quantified-self machines, circuits and interfaces reflexively related to food fabricator machines and circuits
US10127361B2 (en) 2014-03-31 2018-11-13 Elwha Llc Quantified-self machines and circuits reflexively related to kiosk systems and associated food-and-nutrition machines and circuits
US9311804B2 (en) 2014-04-11 2016-04-12 Hill-Rom Services, Inc. Patient-need prediction system
US9485267B2 (en) * 2014-06-02 2016-11-01 Bastille Networks, Inc. Anomalous behavior detection using radio frequency fingerprints and access credentials
US9642340B2 (en) 2014-07-16 2017-05-09 Elwha Llc Remote pet monitoring systems and methods
US20160287073A1 (en) * 2015-04-05 2016-10-06 Smilables Inc. Infant monitoring hub
US9558642B2 (en) * 2015-04-21 2017-01-31 Vivint, Inc. Sleep state monitoring
US10213139B2 (en) 2015-05-14 2019-02-26 Abbott Diabetes Care Inc. Systems, devices, and methods for assembling an applicator and sensor control device
WO2016183493A1 (en) 2015-05-14 2016-11-17 Abbott Diabetes Care Inc. Compact medical device inserters and related systems and methods
US10789939B2 (en) 2015-06-25 2020-09-29 The University Of Chicago Wearable word counter
US10134424B2 (en) 2015-06-25 2018-11-20 VersaMe, Inc. Wearable word counter
US10959648B2 (en) 2015-06-25 2021-03-30 The University Of Chicago Wearable word counter
EP3319518A4 (en) 2015-07-10 2019-03-13 Abbott Diabetes Care Inc. System, device and method of dynamic glucose profile response to physiological parameters
PT3355761T (en) * 2015-09-30 2021-05-27 Centro Studi S.r.l. Emotional/behavioural/psychological state estimation system
US10115029B1 (en) * 2015-10-13 2018-10-30 Ambarella, Inc. Automobile video camera for the detection of children, people or pets left in a vehicle
US9899035B2 (en) * 2015-11-04 2018-02-20 Ioannis Kakadiaris Systems for and methods of intelligent acoustic monitoring
US9640057B1 (en) 2015-11-23 2017-05-02 MedHab, LLC Personal fall detection system and method
US10185766B2 (en) * 2016-01-15 2019-01-22 Google Llc Systems and methods for monitoring objects and their states by using acoustic signals
US10209690B2 (en) 2016-01-15 2019-02-19 Google Llc Systems and methods for provisioning devices using acoustic signals
US9858789B2 (en) 2016-01-19 2018-01-02 Vivint, Inc. Occupancy-targeted baby monitoring
JP6812772B2 (en) * 2016-12-09 2021-01-13 Fuji Xerox Co., Ltd. Monitoring equipment and programs
CN115444410A (en) 2017-01-23 2022-12-09 Abbott Diabetes Care Inc. Applicator and assembly for inserting an in vivo analyte sensor
US20180246964A1 (en) * 2017-02-28 2018-08-30 Lighthouse AI, Inc. Speech interface for vision-based monitoring system
CN106710146B (en) * 2017-03-17 2023-02-03 Huaiyin Institute of Technology Multifunctional prompting device for infant sleep
US11596330B2 (en) 2017-03-21 2023-03-07 Abbott Diabetes Care Inc. Methods, devices and system for providing diabetic condition diagnosis and therapy
US20180322253A1 (en) 2017-05-05 2018-11-08 International Business Machines Corporation Sensor Based Monitoring
EP3422255B1 (en) * 2017-06-30 2023-03-15 Axis AB Method and system for training a neural network to classify objects or events
US11567726B2 (en) * 2017-07-21 2023-01-31 Google Llc Methods, systems, and media for providing information relating to detected events
US20190057189A1 (en) * 2017-08-17 2019-02-21 Innovative World Solutions, LLC Alert and Response Integration System, Device, and Process
US11331022B2 (en) 2017-10-24 2022-05-17 Dexcom, Inc. Pre-connected analyte sensors
US11382540B2 (en) 2017-10-24 2022-07-12 Dexcom, Inc. Pre-connected analyte sensors
JP6878260B2 (en) 2017-11-30 2021-05-26 Paramount Bed Co., Ltd. Abnormality determination device and program
US10529357B2 (en) 2017-12-07 2020-01-07 Lena Foundation Systems and methods for automatic determination of infant cry and discrimination of cry from fussiness
GB2571125A (en) * 2018-02-19 2019-08-21 Chestnut Mobile Ltd Infant monitor apparatus
CN108536996B (en) * 2018-03-01 2020-07-03 Beijing Xiaomi Mobile Software Co., Ltd. Automatic sleep-soothing method, device, storage medium and intelligent baby crib
CN108335458A (en) * 2018-03-05 2018-07-27 Li Mengxing Home smart monitoring system for watching over people and monitoring method thereof
US20200314207A1 (en) * 2019-03-26 2020-10-01 Abiomed, Inc. Dynamically Adjustable Frame Rate from Medical Device Controller
WO2020202444A1 (en) * 2019-04-01 2020-10-08 First Ascent Co., Ltd. Physical condition detection system
USD1002852S1 (en) 2019-06-06 2023-10-24 Abbott Diabetes Care Inc. Analyte sensor device
EP4031940A1 (en) 2019-09-18 2022-07-27 Johnson Controls Tyco IP Holdings LLP Building systems for improving temperature, pressure and humidity compliance
US11852505B2 (en) 2019-09-18 2023-12-26 Johnson Controls Tyco IP Holdings LLP Critical environment monitoring system
EP3838127A1 (en) * 2019-12-18 2021-06-23 Koninklijke Philips N.V. Device, system and method for monitoring of a subject
EP4076148A1 (en) * 2019-12-18 2022-10-26 Koninklijke Philips N.V. A system and method for alerting a caregiver based on the state of a person in need
CN111145476A (en) * 2019-12-26 2020-05-12 Xingluo Intelligent Technology Co., Ltd. Anti-falling control method for children
US11501501B1 (en) 2020-06-26 2022-11-15 Gresham Smith Biometric feedback system
US11897334B2 (en) 2020-11-19 2024-02-13 Ford Global Technologies, Llc Vehicle having pet bowl communication
US20220194228A1 (en) * 2020-12-17 2022-06-23 Ford Global Technologies, Llc Vehicle having pet monitoring and related controls
USD999913S1 (en) 2020-12-21 2023-09-26 Abbott Diabetes Care Inc Analyte sensor inserter
EP4277525A1 (en) * 2021-01-18 2023-11-22 Sentercare Ltd. Monitoring persons in a room
US11904794B2 (en) 2021-01-28 2024-02-20 Ford Global Technologies, Llc Pet restraint system for a vehicle
US11932156B2 (en) 2021-05-17 2024-03-19 Ford Global Technologies, Llc Vehicle having sliding console

Family Cites Families (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3948250A (en) * 1974-09-16 1976-04-06 Becton, Dickinson And Company Physiological information display
US4196425A (en) * 1978-07-10 1980-04-01 David S. Weekly; Clyde E. Williams Patient activity monitoring system
DE2834079B1 (en) 1978-08-03 1979-10-31 Messerschmitt Boelkow Blohm Process for the automatic detection and evaluation of changes in image content and the related equipment
US4524243A (en) 1983-07-07 1985-06-18 Lifeline Systems, Inc. Personal alarm system
GB2179186A (en) 1985-07-29 1987-02-25 Lifeguard Systems Limited Activity monitoring apparatus
DE3830655A1 (en) 1988-09-09 1990-03-15 Hermann Janus Personal protection device having automatic emergency alarm
US5012522A (en) * 1988-12-08 1991-04-30 The United States Of America As Represented By The Secretary Of The Air Force Autonomous face recognition machine
US5576972A (en) * 1992-05-08 1996-11-19 Harrison; Dana C. Intelligent area monitoring system
US5479932A (en) * 1993-08-16 1996-01-02 Higgins; Joseph Infant health monitoring system
US5462051A (en) * 1994-08-31 1995-10-31 Colin Corporation Medical communication system
US6002994A (en) 1994-09-09 1999-12-14 Lane; Stephen S. Method of user monitoring of physiological and non-physiological measurements
JPH08161292A (en) 1994-12-09 1996-06-21 Matsushita Electric Ind Co Ltd Method and system for detecting congestion degree
US5692215A (en) * 1994-12-23 1997-11-25 Gerotech, Inc. System for generating periodic reports, generating trend analysis, and intervention in accordance with trend analysis from a detection subsystem for monitoring daily living activity
FR2743651B1 (en) 1996-01-12 1998-03-20 Somfy Method and installation for monitoring people in a house
US5905436A (en) 1996-10-24 1999-05-18 Gerontological Solutions, Inc. Situation-based monitoring system
US6062216A (en) * 1996-12-27 2000-05-16 Children's Medical Center Corporation Sleep apnea detector system
KR100454157B1 (en) * 1997-05-07 2004-10-26 Compumedics Sleep Pty. Ltd. Apparatus for controlling gas delivery to patient
US6190313B1 (en) * 1998-04-20 2001-02-20 Allen J. Hinkle Interactive health care system and method
US6049281A (en) * 1998-09-29 2000-04-11 Osterweil; Josef Method and apparatus for monitoring movements of an individual
US6160478A (en) * 1998-10-27 2000-12-12 Sarcos Lc Wireless health monitoring system
EP1071055B1 (en) * 1999-07-23 2004-12-22 Matsushita Electric Industrial Co., Ltd. Home monitoring system for health conditions
US6323761B1 (en) * 2000-06-03 2001-11-27 Sam Mog Son Vehicular security access system
US6553256B1 (en) * 2000-10-13 2003-04-22 Koninklijke Philips Electronics N.V. Method and apparatus for monitoring and treating sudden infant death syndrome

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5505199A (en) * 1994-12-01 1996-04-09 Kim; Bill H. Sudden infant death syndrome monitor
JPH0946634A (en) * 1995-07-26 1997-02-14 Sanyo Electric Co Ltd Monitoring image recording and reproducing device
US6064910A (en) * 1996-11-25 2000-05-16 Pacesetter Ab Respirator rate/respiration depth detector and device for monitoring respiratory activity employing same
JP2000076421A (en) * 1998-08-28 2000-03-14 Nec Corp Feeling analyzing system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
PATENT ABSTRACTS OF JAPAN vol. 1997, no. 06, 30 June 1997 (1997-06-30) & JP 09 046634 A (SANYO ELECTRIC CO LTD), 14 February 1997 (1997-02-14) *
PATENT ABSTRACTS OF JAPAN vol. 2000, no. 06, 22 September 2000 (2000-09-22) & JP 2000 076421 A (NEC CORP), 14 March 2000 (2000-03-14) *

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1545302A4 (en) * 2002-10-03 2008-12-17 Joachim Diederich Method and apparatus for assessing psychiatric or physical disorders
EP1545302A1 (en) * 2002-10-03 2005-06-29 Joachim Diederich Method and apparatus for assessing psychiatric or physical disorders
WO2004030532A1 (en) 2002-10-03 2004-04-15 The University Of Queensland Method and apparatus for assessing psychiatric or physical disorders
JP2005033608A (en) * 2003-07-08 2005-02-03 Victor Co Of Japan Ltd Video monitor device with e-mail transmitting function
US7562121B2 (en) 2004-08-04 2009-07-14 Kimberco, Inc. Computer-automated system and method of assessing the orientation, awareness and responses of a person with reduced capacity
WO2006023097A1 (en) * 2004-08-04 2006-03-02 Kimberco, Inc. Computer-automated system and method of assessing the orientation, awareness and responses of a person with reduced capacity
US7966378B2 (en) 2004-08-04 2011-06-21 Kimberco, Inc. Computer-automated system and method of assessing the orientation, awareness and responses of a person with reduced capacity
US8635282B2 (en) 2004-08-04 2014-01-21 Kimberco, Inc. Computer-automated system and method of assessing the orientation, awareness and responses of a person with reduced capacity
EP1883342A2 (en) * 2005-05-03 2008-02-06 Aware Technologies, Inc. Method and system for wearable vital signs and physiology, activity, and environmental monitoring
EP1883342A4 (en) * 2005-05-03 2009-11-11 Aware Technologies Inc Method and system for wearable vital signs and physiology, activity, and environmental monitoring
CH710525A1 (en) * 2014-12-19 2016-06-30 Bkw Energie Ag Device for detecting an environment and for interaction with a user.
EP3805980A1 (en) * 2019-10-11 2021-04-14 Kepler Vision Technologies B.V. A system to notify a request for help by detecting an intent to press a button, said system using artificial intelligence
EP3848916A1 (en) * 2020-01-09 2021-07-14 Kxkjm Device for detecting hazardous behaviour of at least one person, associated detection method and detection network
FR3106229A1 (en) * 2020-01-09 2021-07-16 Kxkjm Device for detecting risky behaviour of at least one person, and associated detection method and detection network

Also Published As

Publication number Publication date
EP1371042B1 (en) 2005-05-25
EP1371042A2 (en) 2003-12-17
ATE296473T1 (en) 2005-06-15
KR20030001504A (en) 2003-01-06
DE60204292D1 (en) 2005-06-30
JP2004531800A (en) 2004-10-14
US6968294B2 (en) 2005-11-22
US20020169583A1 (en) 2002-11-14
DE60204292T2 (en) 2006-03-16
WO2002075687A3 (en) 2003-05-30

Similar Documents

Publication Publication Date Title
EP1371042B1 (en) Automatic system for monitoring person requiring care and his/her caretaker
EP1371043B1 (en) Automatic system for monitoring independent person requiring occasional assistance
US11120559B2 (en) Computer vision based monitoring system and method
US10682097B2 (en) People monitoring and personal assistance system, in particular for elderly and people with special and cognitive needs
US11369321B2 (en) Monitoring and tracking system, method, article and device
Koshmak et al. Challenges and issues in multisensor fusion approach for fall detection
Kon et al. Evolution of smart homes for the elderly
US20020128746A1 (en) Apparatus, system and method for a remotely monitored and operated avatar
JP5133677B2 (en) Monitoring system
CN111882820B (en) Nursing system for special people
Copetti et al. Intelligent context-aware monitoring of hypertensive patients
Moncrieff et al. Multi-modal emotive computing in a smart house environment
WO2018201121A1 (en) Computer vision based monitoring system and method
KR102268456B1 (en) Real-time monitoring and care system
EP3992987A1 (en) System and method for continuously sharing behavioral states of a creature
KR102331335B1 (en) Vulnerable person care robot and its control method
Ilapakurthy A framework for smart homes for elderly people using LabVIEW
Crespo et al. State of the art report on automatic detection of activity in the home of elderly people for their remote assistance and contribution of cognitive computing.
WO2021122136A1 (en) Device, system and method for monitoring of a subject

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): JP KR

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR

WWE Wipo information: entry into national phase

Ref document number: 2002712177

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 1020027015382

Country of ref document: KR

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWP Wipo information: published in national office

Ref document number: 1020027015382

Country of ref document: KR

WWE Wipo information: entry into national phase

Ref document number: 2002574620

Country of ref document: JP

WWP Wipo information: published in national office

Ref document number: 2002712177

Country of ref document: EP

WWG Wipo information: grant in national office

Ref document number: 2002712177

Country of ref document: EP