US20040030531A1 - System and method for automated monitoring, recognizing, supporting, and responding to the behavior of an actor


Info

Publication number
US20040030531A1
Authority
US
United States
Prior art keywords
actor
subject matter
response
information
situation
Prior art date
Legal status
Abandoned
Application number
US10/341,335
Inventor
Christopher Miller
Wende Dewing
Karen Haigh
David Toms
Rand Whillock
Christopher Geib
Stephen Metz
Rose Richardson
Stephen Whitlow
John Allen
Lawrence King
John Phelps
Victor Riley
Peggy Wu
Current Assignee
Honeywell International Inc
Original Assignee
Honeywell International Inc
Priority date
Filing date
Publication date
Application filed by Honeywell International Inc
Priority to US10/341,335
Priority to PCT/US2003/009743
Priority to AU2003228403A
Assigned to HONEYWELL INTERNATIONAL INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: WU, PEGGY, DEWING, WENDE L., WHITLOW, STEPHEN D., ALLEN, JOHN A., GEIB, CHRISTOPHER W., RICHARDSON, ROSE MAE M., TOMS, DAVID C., PHELPS, JOHN A., HAIGH, KAREN Z., METZ, STEPHEN V., WHILLOCK, RAND P.
Publication of US20040030531A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/0205 Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0002 Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1112 Global tracking of patients, e.g. by using GPS
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1116 Determining posture transitions
    • A61B5/1117 Fall detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10 Complex mathematical operations
    • G06F17/18 Complex mathematical operations for evaluating statistical data, e.g. average values, frequency distributions, probability functions, regression analysis
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 Alarms for ensuring the safety of persons
    • G08B21/04 Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
    • G08B21/0407 Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons based on behaviour analysis
    • G08B21/0423 Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons based on behaviour analysis detecting deviation from an expected pattern of behaviour or schedule
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 Alarms for ensuring the safety of persons
    • G08B21/04 Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
    • G08B21/0438 Sensor means for detecting
    • G08B21/0469 Presence detectors to detect unsafe condition, e.g. infrared sensor, microphone
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 Alarms for ensuring the safety of persons
    • G08B21/04 Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
    • G08B21/0438 Sensor means for detecting
    • G08B21/0476 Cameras to detect unsafe condition, e.g. video cameras
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 Alarms for ensuring the safety of persons
    • G08B21/04 Alarms for ensuring the safety of persons responsive to non-activity, e.g. of elderly persons
    • G08B21/0438 Sensor means for detecting
    • G08B21/0484 Arrangements monitoring consumption of a utility or use of an appliance which consumes a utility to detect unsafe condition, e.g. metering of water, gas or electricity, use of taps, toilet flush, gas stove or electric kettle
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/67 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16Z INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS, NOT OTHERWISE PROVIDED FOR
    • G16Z99/00 Subject matter not provided for in other main groups of this subclass
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2503/00 Evaluating a particular growth phase or type of persons or animals
    • A61B2503/08 Elderly
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2560/00 Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
    • A61B2560/02 Operational features
    • A61B2560/0242 Operational features adapted to measure environmental factors, e.g. temperature, pollution
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/021 Measuring pressure in heart or blood vessels
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/024 Detecting, measuring or recording pulse rate or heart rate
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/145 Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue
    • A61B5/14532 Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue for measuring glucose, e.g. by tissue impedance measurement
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/40 Detecting, measuring or recording for evaluating the nervous system
    • A61B5/4076 Diagnosing or monitoring particular conditions of the nervous system
    • A61B5/4088 Diagnosing of monitoring cognitive diseases, e.g. Alzheimer, prion diseases or dementia
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/50 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for simulation or modelling of medical disorders

Definitions

  • the present invention relates to an automated system and method for providing assistance to individuals based, at least in part, upon monitored activities. More particularly, it relates to a system and method that intelligently monitors, recognizes, supports, and responds to activities of an individual in an environment such as an in-home, daily living environment.
  • The recognition by Kutzik that monitoring a person's daily living activities can provide useful information for subsequently assisting that person is clearly a step in the right direction.
  • an appropriate personal, in-home assistant system must not only receive sensor data, but must also integrate these individual functions and information sources to automatically develop an appropriate response plan and implement that plan, thereby greatly assisting the actor/user in their activities.
  • a trend analysis feature alluded to by Kutzik et al. may provide a separate person (i.e., caregiver) with data from which a possible course of action could be gleaned.
  • Kutzik et al. does not address the “technophobia” concerns (often associated with elderly individuals) that might otherwise impede complete interaction between the user and the system.
  • the inability of Kutzik, as well as other similar systems, to satisfy these constraints is not surprising, given that the requisite system architecture, ontology, and methodologies did not heretofore exist, and that such a system must overcome extensive technological and reasoning obstacles.
  • FIG. 1 is a block diagram illustrating the system of the present invention
  • FIG. 2 is a simplified, schematic diagram of an architectural configuration of the system of FIG. 1;
  • FIG. 3 is a schematic illustration of a preferred architectural configuration of the system of FIG. 1;
  • FIGS. 4-11 are schematic illustrations of alternative architectural configurations;
  • FIG. 12 is a block diagram of an alternative system in accordance with the present invention;
  • FIGS. 13A-13C provide an exemplary method of operation in accordance with the present invention in flow diagram form;
  • FIG. 14 is a schematic illustration of an architecture associated with the method of FIGS. 13A-13C;
  • FIGS. 14-21 are block diagrams of alternative system configurations in accordance with the present invention.
  • One preferred embodiment of an actor (or user or client) monitoring and responding system 20 in accordance with the present invention is shown in block form in FIG. 1.
  • the system 20 offers the potential to incorporate monitoring and support tools as a personal assistant.
  • the system 20 will support daily activities, facilitate remote interaction with family and caregivers, provide safety and security, and otherwise assist the user.
  • the system 20 includes one or more controllers 22 , a plurality of sensors 24 , and one or more effectors 26 .
  • the sensors 24 actively and/or passively monitor daily activities of an actor or user 28 or their environment (including other humans, animals, etc.). Information or data from the sensors 24 is signaled to the controller 22 .
  • the controller 22 processes the received information and, in conjunction with architecture features described below, assesses the actor's 28 actions or situation (or the actor's 28 environment 30 ), and performs a response planning task in which an appropriate response based upon the assessed situation is generated.
  • Based upon this selected response, the controller 22 signals the effector 26 that in turn carries out the planned response relative to the actor 28 or any other interested party (or caregiver), depending upon the particular situation.
  • the term “caregiver” encompasses any human other than the actor 28 that is in the actor's environment 30 or interacts with the actor 28 for any reason.
  • a “caregiver” in accordance with the present invention is not limited to a medical specialist (e.g., physician or nurse), but further includes any human such as a relative, neighbor, guest, etc.
  • the term “environment” encompasses a physical structure in which the actor 28 is located (permanently or periodically) as well as all things in that physical structure, such as lights, plumbing, ventilation, appliances, humans other than the actor 28 that at least periodically visit (e.g., caregiver as defined above and pets), etc.
  • the key component associated with the system 20 resides in the architecture provided with the controller 22 .
  • the sensors 24 and the effectors 26 can assume a wide variety of forms.
  • the sensors 24 are low cost, and are networked by the controller 22 .
  • the sensors 24 can include motion detectors, pressure pads, door latch sensors, panic buttons, toilet-flush sensors, microphones, cameras, fall-sensors, door sensors, heart rate monitor sensors, blood pressure monitor sensors, glucose monitor sensors, moisture sensors, light level sensors, telephone sensors, smoke/fire detectors, thermal sensors, water sensors, seismic sensors, etc.
  • one or more of the sensors 24 can be a sensor or actuator associated with a device or appliance used by the actor 28 , such as a stove, oven, television, telephone, security pad, medication dispenser, thermostat, etc., with the sensor or actuator providing data indicating that the device or appliance is being operated by the actor 28 (or someone else).
  • the sensors 24 can be non-intrusive or intrusive, active or passive, wired or wireless, physiological or physical. In short, the sensors 24 can include any type of sensor that provides information relating to activities or status of the actor 28 or the environment.
  • effectors 26 can also assume a wide variety of forms. Examples of applicable effectors 26 include computers, displays, telephones, pagers, speaker systems, lighting systems, fire sprinkler, door lock devices, pan/tilt/zoom controls on a camera, etc.
  • the effectors 26 can be placed directly within the actor's 28 environment, and/or can be remote from the actor 28 , for example providing information to other persons concerned with the actor's 28 daily activities (e.g., caregiver, family members, etc.).
  • the controller 22 is preferably a microprocessor-based device capable of storing and operating appropriate architectural components (or other modules), as described below.
  • the controller 22 can include discrete components that are linked to one another for appropriate interface.
  • a first controller component can be located at the actor's 28 home, whereas a second controller component can be located off-site.
  • an even greater number of controller components can be provided.
  • an entirety of the controller 22 can be located on-site or off-site, or can be worn on the body of the actor 28 .
  • Various hardware configurations for the controller 22 are described in greater detail elsewhere.
  • the ability of the system 20 of the present invention to provide integration of the various sensor data in conjunction with intelligent formulation of an appropriate response to a particular situation encountered by the actor 28 resides in the architecture provided with the controller 22 .
  • the architecture configuration for accomplishing these goals can reside in various iterations that are dependent upon a particular installation; a more complex application will entail a more complex architectural arrangement in terms of availability and integration of additional features.
  • the foregoing description includes exemplary hypothetical situations in conjunction with the methodology that the feature/configuration being described would employ to sense, analyze and/or address the hypothetical.
  • the examples provided are in no way limiting of the countless potential applications for the system 20 architecture, and the listed responses are in no way exhaustive.
  • the preferred system architecture entails four main categories of capability that can be described as fitting into a layered hierarchy. These include sensing 40 , situation assessment 42 , response planning 44 , and response execution 46 .
  • the sensing layer 40 coordinates signaled information from multiple sensor sources, preferably clustering multiple sensor reports into a single event.
  • In the situation assessment layer 42, based upon information provided via the sensing layer 40, an attempt is made to understand the current situation of the actor 28, whether it is describing the person or persons in the environment being monitored (e.g., the actor 28, caregivers, pets, postal workers, etc.) or physical properties of the environment (e.g., stove on/off, door opened/closed, vase fell in the kitchen, etc.).
  • the situation assessment layer 42 will preferably include a number of components or sub-layers, such as intent recognition for understanding what actors are trying to do, and response monitoring for adaptation.
  • Based upon the situation assessment information provided by the situation assessment layer 42, the response planning layer 44 generates an appropriate response plan, such as what to do or whom to talk to, how to present the devised response, and on what particular effector(s) 26 (FIG. 1) the response should be effected. Finally, the response execution layer 46 effectuates the response plan generated by the response planning layer 44.
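  • As a concrete illustration of this four-layer hierarchy (and not the patent's implementation), the following Python sketch traces a clustered sensor event through sensing, situation assessment, response planning, and response execution; all class, event, and plan names are hypothetical:

```python
# Minimal sketch of the layered hierarchy; names are illustrative only.
from dataclasses import dataclass
from typing import List

@dataclass
class SensorReport:
    source: str      # e.g., "kitchen_motion"
    value: str       # e.g., "motion_detected"
    timestamp: float

class SensingLayer:
    def cluster(self, reports: List[SensorReport]) -> str:
        # Cluster multiple sensor reports into a single event.
        sources = {r.source for r in reports}
        if {"kitchen_motion", "kitchen_pressure_mat"} <= sources:
            return "entered_kitchen"
        return "unknown_event"

class SituationAssessmentLayer:
    def assess(self, event: str) -> str:
        # Map a clustered event to a hypothesized situation.
        return {"entered_kitchen": "actor_preparing_meal"}.get(event, "normal")

class ResponsePlanningLayer:
    def plan(self, situation: str) -> str:
        # Choose what to do, whom to contact, and on which effector.
        if situation == "actor_preparing_meal":
            return "monitor_stove"
        return "no_action"

class ResponseExecutionLayer:
    def execute(self, plan: str) -> None:
        print(f"executing response plan: {plan}")

if __name__ == "__main__":
    reports = [SensorReport("kitchen_pressure_mat", "pressed", 0.0),
               SensorReport("kitchen_motion", "motion_detected", 0.4)]
    event = SensingLayer().cluster(reports)
    situation = SituationAssessmentLayer().assess(event)
    plan = ResponsePlanningLayer().plan(situation)
    ResponseExecutionLayer().execute(plan)
```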
  • the architecture associated with the system 20 has components that are agent-oriented such that the system 20 provides multiple independent computational threads, and encourages a task-centered model of computation.
  • the system 20 benefits from the natural byproduct of decoupled areas of computational responsibility.
  • the multi-threaded computational model enhances this decoupling by supporting a system that makes use of the different levels of granularity that a problem presents.
  • the agents can migrate from one computational platform (or layer) to another to balance loads.
  • the preferred system 20 provides an agent or agents responsible for various capabilities essential to good system performance available at several levels of computational responsibility, from device control to user task tracking.
  • the model of the preferred system 20 is expressed in an ontology and agent communication protocol (referenced generally at 48 in FIG. 2) that forms a common language to describe the domain.
  • This ontologically mediated inter-agent communication provides an additional benefit; it gives components the ability to discover services provided by other agents, often through the services of a matchmaker.
  • Discovery directly provides the opportunity for an independent agent to expand its range of knowledge without radically changing its control focus. As a result, discovery allows the overall system 20 to grow at run-time without adversely affecting functionality.
  • the preferred agent-oriented approach provides modularity, independence, distribution, discovery, and social convention.
  • the preferred agent architecture associated with the system 20 is defined as a federated set of agents that define agent interfaces.
  • a “system agent” or “agents” is defined as a software module that is designed around fulfilling a single task or goal, and provides at least one agent interface.
  • An individual system agent is intended to perform a single (possibly very high level) task. Examples of agent tasks include interacting with a user or caregiver, preventing fires in a kitchen, interfacing with a medication-monitoring device, monitoring long-term trends, learning, filtering sensor noise, device management (e.g., speech or video understanding, television operation, etc.), etc.
  • the system agent is the basic delivery and compositional unit of the system 20 architecture.
  • New agents can be installed in the system 20 to provide new functionality. While the system 20 will preferably have, at its core, a small set of agents that will be present in every installation of the system 20, the breakdown of system functionality into agents is designed to allow a flexible modularity to the system 20 construction. Choosing agents on the basis of provided functionality will allow the actor 28 (or a person responsible for his/her care) to customize the system 20 to provide only those functions they want to have, without requiring the adoption of functionality that they are not interested in. Although the preferred system 20 architecture has been described as being agent-based, other configurations capable of performing the situation assessment, response planning, and response plan implementation features described below are also acceptable.
  • Agent interfaces provide the inter-agent communication and interaction for the system 20 . Each agent must make available at least one agent interface. In contrast to the task-organized functionality provided by agents, the agent interfaces are designed to allow the agents to provide functionality to each other. They provide for and foster specific kinds of interactions between the agents by restricting the kinds of information that can be provided through each interface.
  • the system 20 provides three types of agent interfaces: a “Sensor agent interface”, an “Actuator agent interface”, and a “Reasoner agent interface” (hereinafter referred to as “SRA interfaces”).
  • a sensor agent interface answers questions about the current state of the world, such as “is the stove on/off?”, “has the user taken his/her medication for the day?”, “is the user in the house?”, etc.
  • These interfaces allow others to interact with the agent as though it is just a sensor.
  • An example of this kind of interface is a kitchen fire safety agent that allows other agents to know the state of the stove.
  • An actuator agent interface accepts requests for actions to change/modify the world, for example, including: turning the stove on/off, calling the user on the phone, flashing the lights, etc. These interfaces allow the agent to be used by others as a simple actuator. Preferably, the monitoring of an action to verify that it has been done would be carried out by the agent implementing the actuator agent interface rather than by the agent requesting the action. Finally, a reasoner agent interface answers questions about the future state of the world such as, for example, “will the user be home tonight?” or “can the user turn off the television?”, etc. These interfaces are designed to allow the agent to perform reasoning for other agents.
  • each agent will preferably have more than one interface and may even provide multiple interfaces of the same type.
  • a kitchen fire safety agent can provide a sensor agent interface for the state of the stove and a similar, but separate, agent interface for the toaster oven.
  • the kitchen fire safety agent preferably provides a sensor interface for indicating a current state of the stove, an actuator interface that allows changing of a stove temperature or activation/deactivation, and a reasoner agent interface that determines an expected future state of the stove.
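  • The sketch below illustrates, under assumed method names, how a single kitchen fire safety agent might expose all three SRA interface types; it is a minimal reading of the interface scheme described above, not a prescribed API:

```python
# Hypothetical sketch of the three SRA interface types on one agent.
class KitchenFireSafetyAgent:
    def __init__(self):
        self._stove_on = False

    # --- Sensor agent interface: current state of the world ---
    def query_stove_state(self) -> bool:
        return self._stove_on

    # --- Actuator agent interface: requests to change the world ---
    def set_stove(self, on: bool) -> bool:
        self._stove_on = on
        # The implementing agent (not the requester) verifies the action.
        return self._stove_on == on

    # --- Reasoner agent interface: questions about future state ---
    def predict_stove_state(self, minutes_ahead: int) -> bool:
        # Trivial model: assume an unattended stove is switched off
        # by the safety agent within 30 minutes.
        return self._stove_on and minutes_ahead < 30

agent = KitchenFireSafetyAgent()
agent.set_stove(True)
print(agent.query_stove_state())        # True
print(agent.predict_stove_state(60))    # False
```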
  • an agent when an agent is registered as part of the system 20 , it will register the agent interfaces that it makes available. Other agents that wish to make use of these interfaces can be informed of the availability and be reconfigured accordingly.
  • This preferred agent discovery process entails discovery of the software features and capabilities of available agents, and is not otherwise available with existing protocols such as Universal Plug and Play (“UPnP”).
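  • A minimal sketch of the registration and discovery idea follows, assuming a matchmaker that maps interface types to the agents providing them; the names and interface labels are invented for illustration:

```python
# Sketch of interface registration and run-time discovery.
from collections import defaultdict
from typing import List

class Matchmaker:
    def __init__(self):
        self._registry = defaultdict(list)  # interface kind -> provider agents

    def register(self, agent_name: str, interfaces: List[str]) -> None:
        # An agent registers the interfaces it makes available.
        for kind in interfaces:
            self._registry[kind].append(agent_name)

    def discover(self, kind: str) -> List[str]:
        # Lets an agent find providers of a service at run time,
        # so the system can grow without reconfiguration.
        return list(self._registry[kind])

mm = Matchmaker()
mm.register("kitchen_fire_safety",
            ["sensor:stove", "actuator:stove", "reasoner:stove"])
mm.register("medication_manager", ["sensor:medication_taken"])
print(mm.discover("sensor:stove"))  # ['kitchen_fire_safety']
```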
  • Reflection is the process of reasoning about and acting upon one's self. Reflection is present at both the individual and social levels of properly constructed agent systems. Reflection at the single agent level primarily means that the agent can reason about the importance of its goals and commitments in a dynamic environment; it is apparent in explicit models of goals, tasks, and execution state. An agent's ability to reason about goals and commitments in the context of an agent system is provided by a common, interchangeable task model.
  • One preferred embodiment of the system 20 architectural organization, including preferred layer-agent interrelationships, is provided in FIG. 3.
  • the framework illustrated in FIG. 3 includes multiple layers that correspond to the situation assessment layer 42 of FIG. 2, including “clustering”, “validating”, “situation assessment and response monitoring”, and “intent inference”.
  • FIG. 3 illustrates various agents within each layer and/or acting within several layers.
  • exemplary domain agents are provided (including “fire safety”, “home security”, and “medication management”). It will be understood that these are but a few examples of domain agents that can be used with the system 20 of the present invention.
  • the various layers identified in FIG. 3 provide a framework in which to describe an agent's capability, rather than a strict enforcement of code. Further, there are some agents that reside outside of this framework, notably because they are not part of the “reasoning chain” in quite the same way. These would include, for example, customization and configuration (that interacts with an actor to gather system set-up information), “machine learning” (described in greater detail below; generally refers to building models of the particular application environment and normal activities of the actor 28 that are used by caregivers of the system 20 to intervene or improve system accuracy and responsiveness), and a log manager (to mediate access to system databases). Further, devices (both sensors and actuators) reside in the device layer, communicating with a standard device communication protocol. The agents communicate within an agent infrastructure. In one preferred embodiment, one or more agents are provided that function as adaptors to translate device messages.
  • each of the agents shown in FIG. 3 provides all of the functionality related to the particular subject matter.
  • the domain agents described above can further include an “eating” agent that provides all of the functionality related to the actor's 28 eating habits, including, for example, monitoring what and when the user is eating, monitoring the freshness of food, creating menus and grocery lists, and raising alerts when necessary.
  • agent-components may communicate using whatever mechanism they choose, including the extremes of: (1) choosing to be one piece of undifferentiated code that requires no communication, or (2) using their own proprietary communication method, or (3) choosing to use the preferred system ontology in a communication protocol.
  • response planner layer agents need only maintain “ontological purity” in their communications with other agents. This same preferred feature holds true for agents that can reason over multiple layers in the reasoning architecture.
  • “Ontological purity” means that the ontology defines concepts that can be shared or inspected between agents, and those concepts exist within a level of the reasoning architecture. Concepts can be used within or across levels or layers, but preferably must be maintained across agents.
  • the particular infrastructure framework utilized for the system 20 agent architecture can assume a variety of forms, including FIPA-OS, AgentTool, Zeus, MadKit, OAA2, JAFMAS, JADE, DECAF, etc.
  • the layers and/or agents illustrated in the layered architecture of FIG. 3 preferably provide added “intelligence” to the system 20, and are described in greater detail below. It should be noted, however, that regardless of whether one or more of the features are included, the overall layered architecture configuration of the system 20 provides a heretofore unavailable platform for seamlessly associating each of these features in a manner that preferably facilitates complete monitoring, recognizing, supporting, and responding to the behavior of an actor in everyday life, it being understood that the present invention is not limited to facilitating all of these functions (e.g., supporting and responding to behavior are not mandatory features).
  • devices in the various layers preferably can directly write to the log.
  • Agents preferably go through the log manager that selectively returns only the requested information.
  • the system 20 architecture can be adapted such that non-agents can access and review information stored within the log manager (e.g., a doctor's office would represent a non-agent that could benefit by having access to the log manager).
  • the system 20 can be adapted such that non-agents are able to write data into the log manager, but on a mediated basis.
  • the “sensor adapter” agent is preferably adapted to read the log of sensor firings, compensate for any latencies in data transmission, and then forward the information into the agent architecture.
  • the “clustering” layer is provided to combine multiple sensory streams into a single event.
  • the sensors can include a pressure-mat sensor in the kitchen, a pressure-mat sensor in the hall, and a motion sensor in the kitchen.
  • the preferred “event” agent associated with the clustering layer can interpret a three-sensor sequence of these sensors as probably reporting on the same event, namely entering the kitchen.
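  • A simple time-window interpretation of the clustering layer is sketched below; the sensor names and five-second window are assumptions, and a real clustering agent would reason probabilistically rather than match an exact sequence:

```python
# Sketch: reports arriving close together that fit a known sequence
# are treated as a single event ("entering the kitchen").
from typing import List, Tuple

# Hypothetical signature: hall mat -> kitchen mat -> kitchen motion.
ENTER_KITCHEN = ("hall_pressure_mat", "kitchen_pressure_mat", "kitchen_motion")

def cluster(reports: List[Tuple[str, float]], window: float = 5.0) -> str:
    """reports: (sensor_name, timestamp) pairs sorted by time."""
    names = tuple(name for name, _ in reports)
    span = reports[-1][1] - reports[0][1] if reports else 0.0
    if names == ENTER_KITCHEN and span <= window:
        return "entered_kitchen"
    return "unclustered"

print(cluster([("hall_pressure_mat", 0.0),
               ("kitchen_pressure_mat", 1.2),
               ("kitchen_motion", 1.5)]))  # entered_kitchen
```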
  • the “situation assessment and response monitoring” layer aggregates evidence presented by the various sensors and agents to predict a most likely ramification of the current user situation.
  • the layering preferably includes monitoring the effects of a subsequently-implemented response plan. For example, a particular situation assessment may conclude that the actor 28 has fallen. The resulting response plan is to ask the actor 28 whether or not he/she is “okay”. If, under these circumstances, the actor 28 does not respond to the question, then the response monitoring layer can conclude that the detected fall is likely to be more serious.
  • the “client” agent and the “home” agent monitor and manage information relating to the actor and the actor's environment, respectively.
  • the client agent information preferably includes current and past information, such as location, activity and capabilities, as well as preferred interaction mechanisms.
  • the information can be predetermined (provided directly by the actor and/or caregiver), inferred (via situation assessment or intent recognition), and/or learned (machine learning). Where the particular environment includes multiple actors (e.g., a spouse), a separate client agent will preferably be provided for each actor.
  • the home agent information preferably includes environment lay-out, sensor configurations, and normal sensor patterns. Again, the information may be predetermined, inferred and/or learned.
  • a further preferred feature of the previously-described “domain” agents is a responsibility for all reasoning related to its functional area.
  • Each domain agent performs situation assessment, provides intent recognition libraries (described below), and creates initial response plans.
  • each domain agent is preferably adapted to decide whether, for a particular situation, to wait for additional information, explicitly gather more information, or interact with the actor and/or caregiver.
  • the domain agent further needs to decide what actor interaction/interface device(s) to preferably use, what modality to preferably use on selected devices, and, where appropriate, which person(s) to preferably contact in the event that outside assistance is determined necessary.
  • the domain agent preferably proposes an interaction based only on its specialized knowledge; in other words it proposes a “context-free” response.
  • the “intent inference” layer preferably includes an “intent recognition” agent that, in conjunction with intent recognition libraries, pools multiple sensed events and infers goals of the actor, or more simply, formulates “what is the actor trying to do”. For example, going into the kitchen, opening the refrigerator, and turning on the stove likely indicate that the actor is preparing a meal.
  • Alternative intent inference evaluations include inferring that the actor is leaving the house, going to bed, etc.
  • the preferred intent recognition agent (or intent inference layer) entails repeatedly generating a set of possible intended goals (or activities) by the actor for a particular observed event or action, with each “new” set of possible intended goals being based upon an extension of the observed sequence of actions with hypothesized unobserved actions consistent with the observed actions.
  • the library of plans that describe the behavior of the actor are provided by the “domain” agents.
  • the system 20 probabilistically infers intended goals pursuant to a methodology in which potentially abandoned goals are eliminated from consideration, as taught, for example, in U.S. Provisional Application Serial No. 60/351,300, filed Jan. 22, 2002, the teachings of which are incorporated herein by reference.
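  • The following sketch conveys the flavor of this probabilistic intent inference: goals whose plans can no longer be extended consistently with the observed actions are eliminated (the potentially abandoned goals), and probability mass is renormalized over the survivors. The plan library and uniform scoring are invented for illustration:

```python
# Sketch of plan-library intent inference with goal elimination.
from typing import Dict, List

PLAN_LIBRARY = {
    "prepare_meal": ["enter_kitchen", "open_refrigerator", "turn_on_stove"],
    "get_snack":    ["enter_kitchen", "open_refrigerator", "close_refrigerator"],
    "leave_house":  ["enter_hall", "open_front_door"],
}

def infer_intent(observed: List[str]) -> Dict[str, float]:
    # Keep only goals whose plan is still consistent with the observed
    # action sequence; inconsistent goals are treated as abandoned.
    scores = {}
    for goal, plan in PLAN_LIBRARY.items():
        if plan[:len(observed)] == observed:
            scores[goal] = 1.0          # uniform score, invented for the demo
    total = sum(scores.values()) or 1.0
    return {g: s / total for g, s in scores.items()}

print(infer_intent(["enter_kitchen", "open_refrigerator"]))
# {'prepare_meal': 0.5, 'get_snack': 0.5}
print(infer_intent(["enter_kitchen", "open_refrigerator", "turn_on_stove"]))
# {'prepare_meal': 1.0} -- 'get_snack' eliminated as abandoned
```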
  • the preferred intent inference layer improves the response planning capabilities of the system 20 because the response planner is able to “preemptively” respond.
  • the system 20 architecture can lock a door before a demented actor attempts to leave his/her home, provide next step-type suggestions to an actor experiencing difficulties with a particular activity or task, suppress certain warning alarms in response to a minor kitchen fire upon recognizing that the actor is quickly moving toward the kitchen, etc.
  • the preferred architecture of FIG. 3 further includes an “IDS” agent.
  • The “IDS” (Interaction Design System) agent processes sensor data to understand a particular situation, the needs and capabilities of the actor 28, and the available effectors that, as part of the Response Planning layer, are used to develop interaction plans. That is to say, the IDS agent provides information for developing a series of control actions designed to assist the actor through information presentation or adaptive automation behaviors.
  • the preferred IDS agent reasons about which user interaction/interface device to utilize for informing the actor of a particular plan.
  • the adaptive interaction generation feature promotes planned responses adapting, over time, to how the actor 28 (or others) responds to particular plan strategies. By further accounting for the urgency of a particular message, the preferred IDS agent dynamically responds to the current situation, and allows more flexible accommodation of the interaction/interface devices.
  • An additional feature preferably incorporated into the Situation Assessment and Response Monitoring layer is an inactivity monitoring feature.
  • the inactivity monitoring feature is preferably provided as part of the “machine learning” agent (described below) or as part of individual domain agents, and relates to an expected actor activity (e.g., the actor should wake up at 8 a.m., the actor should reach the bottom of the stairs within one minute of starting to descend) that does not occur.
  • the preferred system 20 architecture not only accounts for unexpected activities or events, but also for the failure of an expected activity to occur, with this failure being cause for alarm.
  • the inactivity monitoring function is primarily model based, and can include accumulated information such as a history of the actor's activities; a profile of the actor's environment; hardware-based sensor readings; information about the current state of the world (e.g., time of day); information about the caregiver's activities (where applicable); a prediction of the future actions of the actor and/or caregiver; predictions about the future state of the world; predetermined actor, caregiver and/or environment profiles; and predetermined actor and/or caregiver models, settings, or preferences.
  • the inactivity monitoring mechanism preferably can detect the unexpected inactivities that would otherwise go unnoticed by an activity only-based concept of monitoring. It does so by comparing the actor's current activities with his/her preset and/or expected patterns.
  • certain thresholds are implemented to allow for flexibility in the actor's schedule. However, there are certain recognizable patterns within the day, and within each activity. For example, if the actor is expected to rise from bed between 8 a.m. and 10 a.m., and no activity has been detected during this time, the system 20 can be adapted to raise an alarm notifying a designated caregiver(s). By way of further example, and at a different granularity, if the actor 28 is descending the stairs, and no motion is detected at the bottom of the staircase after a predetermined length of time, the system 20 can be adapted to raise an alarm. Therefore, the established threshold of the inactivity monitoring mechanism enables the system 20 to detect a greater range of unexpected behaviors and possibly dangerous situations.
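  • A threshold-based reading of the inactivity monitoring mechanism is sketched below; the activity window values are illustrative only:

```python
# Sketch: raise an alarm when an expected activity fails to occur
# inside its time window.
from typing import Optional

EXPECTED = {
    # activity: (earliest, latest) in hours since midnight
    "rise_from_bed": (8.0, 10.0),
}

def check_inactivity(activity: str, observed_at: Optional[float],
                     now: float) -> Optional[str]:
    earliest, latest = EXPECTED[activity]
    if observed_at is not None and earliest <= observed_at <= latest:
        return None                     # activity occurred as expected
    if now > latest and observed_at is None:
        return f"ALARM: '{activity}' not detected by {latest}h; notify caregiver"
    return None                         # still inside the window

print(check_inactivity("rise_from_bed", None, now=10.5))
```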
  • the preferred system 20 architecture further includes an Unexpected Activity/Inactivity Response feature in the form of a module or agent that determines if the actor 28 needs assistance by monitoring for signs of unusual activity or inactivity. Given the “normal” or expected behavior of the actor 28 or the actor's environment, unusual activity can trigger a response. For example, movement in the basement when the actor 28 is normally asleep could trigger an intruder alarm response.
  • the “response plan/exec” agent preferably includes a response coordination feature that coordinates the responses of the “domain” agents.
  • the response coordinator preferably merges or suppresses interactions or changes interaction modality, as appropriate, based upon context. For example, if the actor 28 has fallen (entailing an “alarm” response), the response coordinator can suppress a reminder to take medication. Multiple reminders to the actor 28 can be merged into one message. Multiple alert requests to different devices can be merged onto one device. To this end, merged messages will preferably be sorted by priority, where priority is defined by the domain agent, as well as by the type of message (e.g., an alarm is more important than an alert).
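  • The merge/suppress behavior of the response coordinator might be sketched as follows, with the message types and priority ordering assumed for illustration:

```python
# Sketch: merge same-type messages, sort by priority, and suppress
# low-priority interactions while an alarm is active.
PRIORITY = {"alarm": 3, "alert": 2, "reminder": 1}

def coordinate(requests):
    """requests: list of (msg_type, text) from domain agents."""
    alarm_active = any(t == "alarm" for t, _ in requests)
    if alarm_active:
        # e.g., a fall alarm suppresses a medication reminder
        requests = [(t, m) for t, m in requests if t != "reminder"]
    # Merge multiple messages of the same type into one message.
    merged = {}
    for t, m in requests:
        merged.setdefault(t, []).append(m)
    out = [(t, "; ".join(ms)) for t, ms in merged.items()]
    return sorted(out, key=lambda x: PRIORITY[x[0]], reverse=True)

print(coordinate([("reminder", "take medication"),
                  ("alarm", "possible fall detected"),
                  ("alert", "stove left on")]))
```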
  • the response plan/exec agent centralizes agent coordination, but alternatively the system 20 architecture can employ distributed modes.
  • the preferred centralized response coordination approach is feasible because all of the involved agents interact with a small sub-set of users through a small sub-set of devices. In other words, all activities involving communications with the outside world are strongly interrelated. Thus, while the agents are loosely coupled, their responses are not.
  • the “machine learning” agent provides a means for ongoing adaptation and improvement of system 20 responsiveness relative to the needs of the actor 28 .
  • the machine learning agent preferably entails a behavior model built over time for the actor 28 and/or the actor's environment.
  • the model is built by accumulating passive (or sensor-supplied) data and/or active (actor and/or caregiver entered) data in an appropriate database.
  • the data can be simply stored “as is”, or an evaluation(s) of the data can be performed for deriving event(s) and/or properties of event(s) as described, for example, in U.S. Provisional Patent Application Serial No. 60/834,899, filed May 30, 2002, the teachings of which are incorporated herein by reference.
  • the Response Planning layer will likely consider alternative plans or actions. Learning the previous success or failure of a chosen plan or action enables continuous improvement.
  • the system 20 can learn, for example, the most effective modality for a message; the most effective volume, repetition, or duration within a modality; and the actor's preferences regarding modality, intensity, etc.
  • the mechanism for learning can account for contextual conditions (e.g., audio messages are ineffective when the actor is in the kitchen).
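  • One simple way to realize this kind of learning is a per-context success-rate model, as sketched below; this is an illustrative stand-in, not the patent's learning method:

```python
# Sketch: learn the most effective interaction modality per context
# from recorded response outcomes.
from collections import defaultdict

class ModalityLearner:
    def __init__(self):
        # (modality, context) -> [successes, attempts]
        self.stats = defaultdict(lambda: [0, 0])

    def record(self, modality: str, context: str, succeeded: bool) -> None:
        s = self.stats[(modality, context)]
        s[1] += 1
        s[0] += int(succeeded)

    def best(self, context: str, candidates) -> str:
        def rate(m):
            ok, total = self.stats[(m, context)]
            return ok / total if total else 0.5   # prior for unseen pairs
        return max(candidates, key=rate)

learner = ModalityLearner()
learner.record("audio", "kitchen", False)   # audio ineffective in the kitchen
learner.record("display", "kitchen", True)
print(learner.best("kitchen", ["audio", "display"]))  # display
```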
  • the “customization” (or “configuration”) agent is preferably adapted to allow an installer of the system 20 to input configuration information about the actor, the caregiver (where relevant), other persons acting in the environment, as well as relevant information about the environment itself.
  • The layered architecture presented in FIG. 3 is but one example of an appropriate configuration useful with the system 20 of the present invention.
  • Other exemplary architectures are presented in FIGS. 4 - 11 .
  • the exemplary architecture of FIG. 5 incorporates a more “horizontal” cut of agent functionality whereby there is generally one agent per layer that performs all the tasks required for that layer.
  • all situation assessment is carried out by a single agent within the architecture of FIG. 5, whereas individual agents are provided for selected situations within the architecture of FIG. 3 (e.g., all medication management-related assessment occurs in the medication management agent).
  • FIGS. 4-11 include the term “CARE”, which refers to “client adaptive response environment”, and the term “HOME”, which refers to “home observation and monitoring environment”; both represent system components in accordance with the present invention.
  • a preferred feature of the system 20 is an ability to mediate and resolve multiple actuation requests.
  • the system 20 is preferably adapted to handle multiple conflicting requests made to an agent interface.
  • this functionality is performed at the level of individual actuator agent interfaces.
  • a central planning committee design can be instituted.
  • the central planning committee technique would require a blackboard-type architecture, and would require providing all information needed to make a global decision rather than a local one. Given these restrictions, it is preferred that each actuator agent interface be required to handle the multiple conflicting request issue on an individual basis.
  • a first problem associated with multiple conflicting requests relates to multiple priority messages.
  • each actuation request is provided with a priority “level” (e.g., on the scale of 1-5).
  • Each priority level represents an order of magnitude jump from the level below it. The effect of this is that all requests of the same priority level are of the same importance and can be shuffled or reordered. Requests of a high level preempt all lower priority requests.
  • this priority scheme does not include an “urgency” factor for the requests.
  • the requesting agent places a request for the specified action at a particular time with a given priority. If the actuator agent is unable to fulfill that request, the requesting agent is so-notified. The requesting agent is then free to raise the priority of the request or to consider other methods of achieving the goal. Thus, reasoning about the urgency of the action is left within the requesting agent, and all arbitration at the actuator level is performed on the basis of the priority of the request.
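  • A sketch of this priority-based arbitration at an actuator agent interface follows; the queue mechanics and API are assumptions, while the 1-5 priority levels and the rejection/notification behavior follow the scheme just described:

```python
# Sketch: higher-priority requests preempt all lower-priority ones;
# a rejected requester may resubmit at a higher priority.
import heapq

class ActuatorInterface:
    def __init__(self):
        self._queue = []   # min-heap keyed on negated priority
        self._seq = 0

    def request(self, action: str, priority: int) -> bool:
        if not 1 <= priority <= 5:
            return False            # requester is notified of rejection
        heapq.heappush(self._queue, (-priority, self._seq, action))
        self._seq += 1              # same-priority requests keep FIFO order
        return True

    def execute_next(self) -> None:
        if self._queue:
            _, _, action = heapq.heappop(self._queue)
            print(f"executing: {action}")

stove = ActuatorInterface()
stove.request("preheat oven", priority=2)
stove.request("turn stove off (fire risk)", priority=5)
stove.execute_next()   # the priority-5 safety request preempts
```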
  • An additional multiple request-related concern is one request interfering with the processing of (or “clobbering”) another request.
  • One of the traditional methods for handling this kind of problem is to allow the agents to pass portions of plans between themselves in order to explain the rationale for the action and to reach an agreement about the actions that need to be executed. This provides the information needed for the agents to resolve any conflicts between the actions of each of their plans. In a preferred embodiment, however, a limited form of this partial plan solution is provided.
  • In addition to a specific request from an agent, the requesting agent must specify the environment in which the request should be fulfilled. In artificial intelligence terminology, the conditions embodied by causal links between plan steps must be provided to the executing agent.
  • the preferred system 20 does this by specifying a list of sensor agent interface queries and their return values. In effect, this provides a list of predicates that must be true before the action is performed. If the specified conditions do not hold, then the system 20 cannot honor the request and will report that fact. Note that if an agent wants to ensure that some predicate, not provided by a sensor agent interface, holds during the execution of an action request, then it can provide the sensor agent interface necessary for the action. It should further be noted that in general, the “clobbering” concern is more relevant for actuator requests than reasoner or sensor agents, but these requirements are preferably placed in all three classes of agent interfaces.
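  • The causal-link idea can be sketched as a request guarded by a list of sensor agent interface queries and required return values, as below; the world model and query names are invented:

```python
# Sketch: the action runs only if every specified predicate still holds;
# otherwise the request cannot be honored and the requester is told so.
WORLD = {"stove_on": False, "actor_in_kitchen": False}

def sensor_query(name: str):
    return WORLD[name]

def guarded_request(action, preconditions) -> bool:
    """preconditions: list of (sensor_query_name, required_value)."""
    for query, required in preconditions:
        if sensor_query(query) != required:
            return False   # condition does not hold; report failure
    action()
    return True

ok = guarded_request(lambda: print("turning stove on"),
                     [("actor_in_kitchen", True)])
print(ok)  # False: the actor is not in the kitchen, so the request fails
```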
  • the sensor integration, situation assessment and response planning features of the system 20 architecture present distinct advancements over previous in-home monitoring systems, and allows the system 20 to provide automated monitoring, supporting and responding to the activities (or inactivities) of the actor 28 .
  • This infrastructure provides a basis for building automated learning techniques that could generate actor-specific information (e.g., medical conditions, schedules, sensor noise, actor interests) that in turn can be used to generate better responses (e.g., notify doctors, better reminders, reduce false alarms, suggest activities).
  • the situation assessment can be performed at a variety of levels of abstraction.
  • the system 20 can confer or assess a situation based upon stimulus-response, whereby a sensor directs an immediate response (e.g., modern security systems, motion-sensor path-lighting, or a heart rate monitor that raises an alarm if the heart rate drops precipitously).
  • the system 20 can “notice” and automatically control events before they actually occur, as opposed to the existing technique of simply responding to an event. This is preferably accomplished by providing the situation assessment layer with the ability to predict events based upon the potential ramifications of an existing situation, and then respond to this prediction.
  • the situation assessment layer is preferably adapted to notice that the stove is about to catch fire, and then act to turn the stove off; or turn the water heater off before the actor gets burned; etc.
  • the system 20 architecture is highly proactive in automatically responding to “events” (beyond responding to “alarm” situations); for example, automatically arming a security system upon determining that the actor has gone to bed, automatically locking the actor's home door upon determining that the actor has left the home, etc.
  • explicit reasoning modules for specific behaviors are incorporated into the system 20 architecture (e.g., a tracking algorithm that calculates the user's path based on motion-sensor events, or a video algorithm that recognizes faces), which can then project future states (e.g., turning on lights where the client is going, or locking the front door before the user wanders outside).
  • These modules may be a “library” of behavior recognition techniques, such as a set of functions that are explicitly designed to recognize one behavior (or a small number of behaviors).
  • the system 20 architecture can be adapted such that individual agents build customized techniques for recognizing/obtaining information subtleties that are not required by other agents (e.g., a general vision agent could be configured to recognize food going into the actor's 28 mouth; a medications agent would want to know whether an ingested pill was of a certain color and nothing more, thereby allowing the medication agent to more efficiently and effectively interact with the vision agent and implement the vision technique internally to the medication agent).
  • a “central” algorithm that weighs all likely current situations can be provided.
  • the system 20 preferably performs condition-based monitoring that uses data from hardware-based sensors in conjunction with other information from various sources.
  • the goals of condition-based monitoring are to provide greater accuracy for the assessment of the actor's current condition, include details with alarms raised, filter out unnecessary or inappropriate alarms, and also reduce the number of false alarms.
  • the information that could potentially be used to perform condition-based monitoring includes: a history of the actor's activities; a profile of the actor's 28 environment; hardware-based sensor readings; information about the current state of the world, including for example, the actor's location, the time of day, the day of week, planned activity calendar, and the number of people in the environment; information about the caregiver's activities; a prediction of the future actions of the actor or caregiver; a prediction of the future state of the world; user/caregiver/environmental patterns or profiles, actor/caregiver preferences; etc.
  • the system 20 can evaluate the current situation with more accuracy. Based upon the current condition of the environment and the recent history of actor 28 activities, the system 20 can initiate alarms and alerts in an appropriate manner, and assign an appropriate level of urgency. For example, the system 20 may reason that a possible fall sensor event (e.g., from a hardware-based sensor) that follows a shower event (e.g., from the history of the actor's activities) has a higher probability of the actor 28 suffering an injury-causing fall than a possible fall event that occurred on a dry, level surface (e.g., from the environment model). The system 20 can also reason that a toileting reminder may be inappropriate when there are guests in the actor's environment.
  • Such monitoring mechanisms can be used by an automated response planner to decide how to respond, including, for example, whether to actuate a device in the house (e.g., to turn on the lights), to raise an alarm/alert, to send a report, or to do nothing.
  • the information can also be included with each alarm to better aid the caregiver in assessing the actor's well-being.
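  • A toy scoring model illustrating condition-based assessment of a possible fall event is sketched below; the weights and thresholds are invented, and serve only to show how context (a recent shower, a dry level surface) changes the urgency of the raised alarm:

```python
# Sketch: combine a raw sensor event with context to score alarm urgency.
def assess_fall(recent_activities, surface):
    score = 0.5                          # base: a possible fall was sensed
    if "shower" in recent_activities:
        score += 0.3                     # wet surface: higher injury probability
    if surface == "dry_level":
        score -= 0.1                     # dry, level surface: lower probability
    if score >= 0.7:
        return "alarm: probable injury-causing fall; notify caregiver with details"
    if score >= 0.4:
        return "alert: possible fall; ask the actor if he/she is okay"
    return "log only"

print(assess_fall(recent_activities=["shower"], surface="bathroom_tile"))
print(assess_fall(recent_activities=[], surface="dry_level"))
```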
  • the preferred system 20 architecture preferably promotes sharing of inferred states (via the intent inference layer) across multiple sensors and performing second-order sensor processing. For example, a motion sensor may indicate movement in a particular room, whereas a GPS transponder carried on the actor's 28 person indicates that he/she is away from home.
  • the situation assessment layer preferably reasons that either a window has been left open or there is an intruder.
  • the system 20 architecture polls the relevant window sensor to determine whether the window is open or closed before initiating a final response plan.
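A minimal sketch of this second-order sensor processing, assuming simple boolean readings; the function name and hypothesis strings are illustrative only.

```python
# Reconcile a motion report with a GPS report that the actor is away, then
# poll a window sensor to discriminate between the remaining hypotheses
# ("window left open" vs. "intruder"). Illustrative only.
def assess_motion_while_away(motion_room: str, actor_home: bool,
                             window_open: bool) -> str:
    if actor_home:
        return "actor movement"  # no conflict between the two sensors
    # Actor away, yet something moved: gather one more observation
    # before committing to a final response plan.
    if window_open:
        return f"window left open in {motion_room}; wind-driven motion likely"
    return f"possible intruder in {motion_room}; raise a security alert"

print(assess_motion_while_away("kitchen", actor_home=False, window_open=True))
```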
  • the preferred agent layering architecture of the present invention not only allows third parties to incorporate new devices into the system 20 at any time, but also allows third parties to incorporate new reasoning modules into the system 20 at any time.
  • third party reasoning modules can use new or existing devices as sensing or actuating mechanisms, and may provide information to, or use information from, other reasoning modules.
  • a consolidated home ontology is provided that includes the terms of the language that devices and control services must use to communicate with one another.
  • newly added devices or agents can find other agents within the system 20 architecture that supply information the new device or agent is interested in.
  • the response planning and response execution layers associated with the system 20 architecture can assume a variety of forms, some of which initiate further passive monitoring, and others that entail active interaction with the actor.
  • the system 20 preferably incorporates smart modes or agents into the response planning layer.
  • the smart modes entail querying the actor as to his/her status (mental/physical/emotional), the response to which is then combined with other sensor data to make inferences and re-adjust the system behavior.
  • Some exemplary modes include “guest present”, “vacation”, “feeling sick”, and “wants quiet” (or mute).
  • the actor 28 may indicate that she is not feeling well when she wakes up.
  • the system 20 can then ask the actor 28 to indicate a few of her symptoms and can give the actor 28 an opportunity to specify needs (e.g., need to get some juice and chicken soup; need to make a doctor appointment; need to call caregiver; do nothing; etc.).
  • the system 20 uses this information to adjust its reasoning, activities, and notifications accordingly.
  • any notifications preferably include information about the actor 28 feeling ill.
  • if the system 20 has access to an appropriate database, it can match the actor's symptoms against the database, given that it knows that the actor 28 has, for example, started a new prescription the day before (and issue alerts based upon the match if required).
  • the system 20 preferably can reduce general activity reminders; cancel appointments; reduce notification thresholds for general activities like mobility, toileting, and eating; increase reminders to drink fluids; add facial tissues and cold medicine to the shopping list; etc.
  • the smart mode states will act as individual pieces of information in the reasoning steps that aggregate evidence from a specific situation, a world understanding, and the smart modes themselves. This acts as a dynamic system, supporting reasoning based on an actual situation rather than a predefined sequence.
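As one possible rendering of the “feeling sick” mode described above, the Python sketch below adjusts several behavior parameters at once; all field names and adjustment values are illustrative assumptions.

```python
# Hypothetical smart-mode adjustment: a "feeling sick" report shifts
# reminder frequencies, lowers the alerting threshold, and augments the
# shopping list, mirroring the adaptations described in the text.
from dataclasses import dataclass, field

@dataclass
class SystemBehavior:
    activity_reminders_per_day: int = 5
    fluid_reminders_per_day: int = 2
    notification_threshold: float = 0.8   # confidence needed before alerting
    shopping_list: list = field(default_factory=list)

def apply_feeling_sick_mode(b: SystemBehavior) -> SystemBehavior:
    b.activity_reminders_per_day = 1       # reduce general activity reminders
    b.fluid_reminders_per_day = 6          # increase reminders to drink fluids
    b.notification_threshold = 0.5         # notify caregivers sooner
    b.shopping_list += ["facial tissues", "cold medicine"]
    return b

b = apply_feeling_sick_mode(SystemBehavior())
print(b.notification_threshold, b.shopping_list)
```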
  • FIG. 14 provides a block diagram of one example of the system 20 incorporating smart mode information.
  • the smart mode can be an agent within the system 20 architecture, or could be within each of the domain agents.
  • the system 20 layered architecture can assume a variety of forms, and can include a variety of agents (or other modules) to effect the desired intelligent environmental automation system with situation awareness and decision-making capabilities, as exemplified by the methodology described with reference to the flow diagram of FIGS. 13A-13C.
  • the method of FIGS. 13A-13C is preferably performed in conjunction with the architecture of FIG. 14, it being understood that other architectural formats previously described are equally availing.
  • the layered, agent-based architecture of FIG. 14 is applied to an environment including multiple sensors and actuators (as identified in FIG. 14) for an actor living in a home.
  • the exemplary methodology of FIGS. 13A-13C relates to a scenario in which the actor 28 first receives a phone call and then leaves a teakettle unattended on the actor's stove, and assumes a number of situation-specific variables.
  • an installer uses the “configuration” agent (akin to the “customization” agent in FIG. 3) to input information about the actor, the actor's next-door neighbor, and the actor's home. This information includes capabilities, telephone numbers, relevant alerts, and home lay-out.
  • this configuration information is stored in the log via the database manager (or “DB Mgr”) agent.
  • an incoming telephone call is placed to the actor's home.
  • a signal from the telephone sensor goes through the “sensor adapter” agent that, at step 108, transfers it to the “phone interactions” agent.
  • the “phone interactions” agent needs to decide whether to filter the call. To this end, the two important factors are (a) who is calling, and (b) what the actor is doing.
  • the “phone interactions” agent polls, or otherwise receives information from, the “DB Mgr” agent regarding the status of the incoming telephone number.
  • the “DB Mgr” agent reports that the incoming phone number is the actor's next door neighbor and is thus “valid” at step 114 (as opposed to an unknown number that may be designated as “invalid”).
  • the “phone interactions” agent determines that the call will not be immediately filtered.
  • the “phone interactions” agent polls, or otherwise receives information from (e.g., a cached broadcast), the “client expert” agent (or “client” agent of FIG. 3) to determine what activity the actor is currently engaged in.
  • the “intent recognition” agent has been receiving broadcast sensor signals from the “sensor adaptor” agent and performing intent recognition processing of the information (referenced generally at step 119).
  • the “client expert” agent has been receiving, or subscribing to, resultant activity messages from the “intent recognition” agent (referenced generally at step 120).
  • the “intent recognition” agent informs the “phone interactions” agent that the actor is awake and in the kitchen where a telephone is located.
  • the “phone interactions” agent decides not to filter the incoming call (based upon the above-described analysis). As such, the “phone interactions” agent requests the “response coordinator” agent to enunciate the phone call at step 126. In response to this request, the “response coordinator” agent polls, or otherwise receives information from (e.g., broadcasted information), the “client expert” agent for the actor's capabilities at step 128. The “client expert” agent, in turn, reports a hearing difficulty (from information previously received via the “DB Mgr” agent as indicated at step 129) to the “phone interactions” agent at step 130. At step 132, the “response coordinator” agent determines that visual cues are needed, with additional lights.
  • the “response coordinator” agent prompts the “PhoneCtrl” agent to let the phone ring and flash lights at step 134.
  • the actor could be alerted in a variety of ways including messages on the television, flashing house lights, or announcing who the caller is via a speaker.
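A minimal sketch of this capability-driven cue selection, assuming a simple set of capability labels; the labels and cue names are hypothetical.

```python
# Choose how to enunciate an incoming call based on the actor's known
# capabilities, as in the hearing-difficulty example above.
def choose_phone_alert_cues(capabilities: set[str]) -> list[str]:
    cues = ["ring telephone"]
    if "hearing difficulty" in capabilities:
        cues += ["flash house lights", "show caller on television"]
    if "low vision" in capabilities:  # hypothetical additional capability
        cues += ["announce caller via speaker"]
    return cues

print(choose_phone_alert_cues({"hearing difficulty"}))
# -> ['ring telephone', 'flash house lights', 'show caller on television']
```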
  • the “response coordinator” agent recognizes that other devices or activities in the home may impede the actor's ability to hear the phone ring or the subsequent conversation if the house is too noisy. In light of this determination, the “response coordinator” agent, at step 138, decides to reduce other sounds in the home. For example, at step 140, the “response coordinator” agent prompts the “TV” agent to mute the television. The “TV” agent, in turn, utilizes an IR control signal (akin to a remote control) to mute the television at step 142.
  • an air quality sensor senses smoke near the stove in the kitchen (i.e., is “triggered”), and broadcasts this information to other interested agents, including the domain agent “fire”.
  • the domain agent “fire” polls the “intent recognition” agent as to whether the actor is likely to turn off the stove at step 146.
  • the “intent recognition” agent has received information from the “sensor adaptors” agent (similar to step 119 previously described, with this same step 119 being referenced generally in conjunction with step 146), and has determined that the actor has previously left the kitchen.
  • the “intent recognition” agent determines, at step 150, that the actor is not likely to turn off the stove immediately, and reports the same to the “fire” agent at step 152.
  • the “fire” agent, at step 154, then determines that a response plan must be generated.
  • the “fire” agent recognizes that the actor's stove is an older model and does not have a device agent or actuator that could be automatically de-activated, such that a different technique must be employed to turn off the stove.
  • the “fire” agent first determines that ventilation in the kitchen is needed. To implement this response, the “fire” agent, at step 160, requests the “response coordinator” agent to turn on the fans in the kitchen. The “response coordinator” agent, in turn, prompts the “HVAC” agent to activate the kitchen fans at step 162.
  • the “fire” agent, at step 164, recognizes that the current level of urgency is “low” (i.e., a burning fire has not yet occurred), so that contacting only the actor is appropriate (a higher level of urgency would implicate contacting others).
  • the “fire” agent first needs to select an appropriate device (or devices) for effectuating contact with the actor at step 166.
  • all communication devices in the home are appropriate, including the television, the phone, the bedside display, and the lights.
  • the television and the bedside display provide rich visual information, while the phone and the lights draw attention quickly.
  • the “fire” agent polls, or otherwise receives information from (e.g., a broadcasted message), the “client expert” agent to determine where the actor is and what the actor is doing at step 168. Simultaneously with the previous steps, the “client expert” agent has been subscribing to activity messages from the “intent recognition” agent, as previously described with respect to step 120 (it being noted that FIG. 13B generally references step 120 in conjunction with step 168).
  • Based on recent device use (i.e., the television remote and power to the television), the “intent recognition” agent reports to the “client expert” agent (e.g., the client expert has cached broadcasts of the actor's activity as determined by the “intent recognition” agent) that the actor is likely in the living room watching television.
  • the “client expert” agent reports this information to the “fire” agent at step 174.
  • the “fire” agent selects the television as the best interaction device, with the lights and the telephone indicated as also appropriate, and the bedside display eliminated. Pursuant to this determination, the “fire” agent requests the “response coordinator” agent to raise an alert to the actor via one of these prioritized devices at step 178.
  • the “response coordinator” agent reviews all other pending interaction requests to select the best overall interaction device. Seeing that there are no other pending interaction requests, the “response coordinator” selects the television as the interaction device for contacting the actor, and prompts the “television” agent to provide the message to the actor at step 182.
  • the “response coordinator” agent will preferably select the best combination of interaction devices for all of the pending requests. For example, the “response coordinator” agent can choose a different interaction device for each message, or decide to display/transmit the messages on more than one interaction device.
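By way of illustration, the sketch below shows one way the coordinator might arbitrate pending interaction requests; the request format, urgency values, and device rankings are assumptions.

```python
# Assign each pending interaction request the best device from its ranked
# list that is not already in use, handling the most urgent request first.
def assign_devices(pending_requests: list[dict]) -> dict[str, str]:
    in_use: set[str] = set()
    assignments: dict[str, str] = {}
    for request in sorted(pending_requests, key=lambda r: -r["urgency"]):
        for device in request["ranked_devices"]:
            if device not in in_use:
                assignments[request["id"]] = device
                in_use.add(device)
                break
    return assignments

pending = [
    {"id": "fire-alert", "urgency": 2,
     "ranked_devices": ["television", "lights", "telephone"]},
    {"id": "med-reminder", "urgency": 1,
     "ranked_devices": ["television", "bedside display"]},
]
print(assign_devices(pending))
# -> {'fire-alert': 'television', 'med-reminder': 'bedside display'}
```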
  • the “television” agent polls, or otherwise receives information from (e.g., a cached broadcast message from the “response coordinator” agent), the “client expert” agent as to the best way to present the message at step 184.
  • Prior to this request, the “machine learning” agent has recognized that the actor responds more frequently to visual cues, especially when text is combined with an image. This information has been previously reported to the “client expert” agent, generally represented at step 186.
  • the “client expert” agent informs the “television” agent to present a message on the television screen in the form of “[Actor's name], turn off the stove.”, along with an image of a stove and a burning pan at step 188.
  • the “television” agent prompts the television to display this message at step 189.
  • a wide variety of other message presentation formats could have been selected. For example, if the actor is blind (information gleaned from the “configuration” agent and/or the “machine learning” agent) or asleep (information provided by the “intent recognition” agent), a spoken message would have been more appropriate.
  • the “fire” agent continues to monitor what is happening in the home for combating the smoke/fire in the kitchen.
  • the “intent recognition” agent continues to monitor the intent of the actor and determines that the actor has not acknowledged the alert, and that there is no activity in the kitchen (via broadcasted information, or lack thereof, from sensors in the kitchen or at the television, or by polling those sensors). Once again, these determinations are based upon received broadcast sensor signals from the “sensor adaptor” agent as previously described with respect to step 119 (it being noted that reference is made to step 119 in conjunction with step 192).
  • the “intent recognition” agent generates a reduced confidence that the actor is actually watching television, and moreover the lack of activity in the kitchen means there are no pending high-confidence hypotheses.
  • the “client expert” agent receives broadcasted information from, or alternatively requests, the “intent recognition” agent regarding its most likely hypotheses and recognizes that the “intent recognition” agent does not know what the actor is doing. The “client expert” agent reports this to the “fire” agent.
  • the “fire” agent decides, based upon the above information, that the alert level must be escalated and re-issues the alert.
  • the “fire” agent requests the “response coordinator” to utilize both a high-intrusiveness device (lights preferred over the telephone) and an informational device (bedside webpad preferred over the television, because there is an ongoing request for the television message and the television message was found to not be effective).
  • the “response coordinator”, at step 204, recognizes that the lights and the bedside webpad do not conflict with one another, and prompts the “lights” agent and the “web” agent to raise the alert.
  • the “lights” agent flickers the home lights several times at step 206.
  • the “web” agent polls, or otherwise receives information from (e.g., a cached broadcast), the “client expert” agent as to what information to present and how to present it.
  • the “client expert” agent has previously been informed (via step 186 as previously described and generally referenced in FIG. 13C in conjunction with step 208) that the actor responds best to combined text with images, and reports the same to the “web” agent at step 209.
  • the “web” agent prompts the “bedside display” actuator to display the message: “[Actor's name], turn off the stove,” along with an image of a stove and smoking pan at step 210.
  • Before the actor gets to the stove, the “fire” agent prepares to further escalate the alert at step 212 (following previously-described step 190, in which the “fire” agent continues monitoring the kitchen). In particular, the “fire” agent polls the “DB Mgr” agent as to whom to send an alert at step 214. The “DB Mgr” agent informs the “fire” agent, at step 216, that the actor's next door neighbor is the appropriate person to contact. However, before the escalated alert plan is effectuated, the “intent recognition” agent is informed of activity in the kitchen, via for example motion sensor data, and infers from this information that the actor is responding to the fire at step 220.
  • the “intent recognition” agent is continuously receiving signaled information from the “sensor adaptor” agent as previously described with respect to step 119 (with step 119 being generally referenced in FIG. 13C in conjunction with step 220).
  • the “intent recognition” agent reports this change in status to the “fire” agent at step 222 (either directly or as part of a broadcasted message).
  • the “fire” agent, at step 224, does not send the escalated alert, but instead requests that the kitchen fans be deactivated (in a manner similar to that described above with respect to initiating ventilation).
  • if the “fire” agent determines that the smoke level in the kitchen subsequently increases, it would initiate the escalated alert sequence via the “response coordinator” agent as previously described.
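The escalation sequence of this scenario can be summarized, purely illustratively, as a small ladder of device sets; the ladder contents below are drawn from the scenario, but the code is a sketch rather than a definitive implementation.

```python
# Escalate an unacknowledged alert through increasingly intrusive device
# sets, and stand down once kitchen activity implies the actor is responding.
ESCALATION_LADDER = [
    ["television"],               # step 1: rich visual message
    ["lights", "bedside webpad"], # step 2: intrusive plus informational
    ["call neighbor"],            # step 3: contact a third party
]

def next_alert(level: int, actor_responding: bool) -> list[str]:
    if actor_responding:
        return []  # cancel the escalation (and, e.g., stop the fans)
    level = min(level, len(ESCALATION_LADDER) - 1)
    return ESCALATION_LADDER[level]

print(next_alert(1, actor_responding=False))  # -> ['lights', 'bedside webpad']
print(next_alert(2, actor_responding=True))   # -> []
```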
  • the controller 22 can be provided in multiple component forms.
  • the system 20 architecture combines information from a wide range of sensors and then performs higher level reasoning to determine if a response is needed.
  • FIG. 15 is an exemplary hardware/architecture for an alternative system 320 in accordance with the present invention that includes an in-home processor called the “home controller” 322 and a processor outside the home called the “remote server” 324 .
  • the home controller 322 has all of the hardware interfaces to talk to a wide range of devices.
  • the remote server 324 has more processor, memory, and communication resources to do higher level reasoning functions.
  • the home controller 322 preferably includes a number of different hardware interfaces to talk to a wide range of devices.
  • a client (or actor) interface communicates with devices that the actor uses to interact with the system. These devices could be as simple as a standard telephone or as complex as a web browser enabled device such as a PDA, “WebPad” available from Honeywell International, or other similar devices.
  • the home controller 322 preferably further includes a telephone interface so that the system 320 can call out in emergency situations.
  • the phone interface can be standard wired or cell based. If enabled to allow incoming calls, this telephone interface can also be used to access the system remotely, for example if a caregiver wanted to check on an actor's status from outside the home using a touch tone phone.
  • a preferred actuator interface in the home controller 322 talks to devices in the actor's environment that may be controlled by the system 320, such as thermostats, appliances, lights, alarms, etc.
  • the system 320 can use this interface to, for example, turn on a bathroom light when the actor gets up in the middle of the night to go to the bathroom, turn off a stove, control thermostat settings, etc.
  • Preferred sensor interface(s), such as wired or RF-based, take in information from a wide range of available sensors. These sensors include motion detectors, pressure mats and door sensors that can help the system determine an actor's location and activity level. This interface can also talk to more specialized sensors such as a flush sensor that detects when a client uses the bathroom. An important class of sensors that communicate without requiring hardwiring are wearable sensors such as panic button pendants or fall sensors. These sensors signal that the actor needs help immediately. Alternatively or in addition, a number of other sensors, as previously described, can also be implemented.
  • the home controller's 322 processor can do some sensor aggregation and reasoning.
  • This low-level reasoning is reactive type reasoning that ties actions closely to sensors. Some of this type of reasoning includes turning on lighting based on motion sensors and calling emergency medical personnel if a panic button is pushed. Most sensor data is passed on to the remote server for higher level reasoning.
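A rough sketch of such reactive rules, assuming string-keyed sensor events; the rule table is hypothetical.

```python
# Reactive, low-level reasoning: tie actions directly to sensor events,
# and forward anything unmatched to the remote server for higher-level
# situational reasoning.
REACTIVE_RULES = {
    "motion:hallway": "turn on hallway light",
    "panic_button":   "call emergency medical personnel",
}

def react(sensor_event: str) -> str:
    return REACTIVE_RULES.get(sensor_event, "forward to remote server")

print(react("panic_button"))  # -> call emergency medical personnel
print(react("flush_sensor"))  # -> forward to remote server
```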
  • the remote server 324 does situational reasoning to determine what is happening in the actor's environment. Situations include everyday activities like eating and sleeping as well as emergency situations, for example if the actor has not gotten out of bed by a certain time of day.
  • a preferred response planner in the remote server 324 plans a response to the situation if one is required. If the response uses an actuator, a message is preferably sent back to the home controller 322 and out to the device through the actuator interface. If a response requires interaction with the actor, a message is sent to the home controller 322 and routed out through the actor interface.
  • the remote server 324 preferably further contains a database of contact information for responses that require contacting someone.
  • This database includes names, phone numbers, and possible e-mail addresses of people to be contacted and the situations for which they should be contacted.
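One possible record shape for such a contact database is sketched below; the field names and sample values are assumptions.

```python
# Illustrative contact records: who to contact, how, and for which situations.
contacts = [
    {
        "name": "Next-door neighbor",
        "phone": "555-0100",
        "email": "neighbor@example.com",  # optional
        "contact_for": ["unattended stove", "possible fall"],
    },
]

def who_to_contact(situation: str) -> list[dict]:
    return [c for c in contacts if situation in c["contact_for"]]

print([c["name"] for c in who_to_contact("unattended stove")])
# -> ['Next-door neighbor']
```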
  • a single remote server 324 can support a large number of independent environment installations or a large number of individual living environments in an institutional setting.
  • the remote server 324 can provide other web-based services to an actor including, for example, online news services, communications, online shopping, entertainment, linking to other system 320 users as part of a web-based community, etc.
  • the remote server 324 provides remote access to the system 320 information.
  • caregivers and family members can check on the actor's status from any web-enabled device on the Internet.
  • This interface can also be used when a response plan calls for contacting a family member or caregiver and the actor's contact information says they should be contacted by e-mail.
  • Further interface scenarios preferably incorporated into the system 320 architecture/hardware include allowing information to be pushed or pulled by service providers (e.g., service providers are able to review medical history, repair persons are able to confirm particular brand and model numbers of appliances needing repair, etc.).
  • the communications between the home controller 322 and the remote server 324 can use regular phone lines, cell phones, or broadband for high information throughput. Generally speaking, lower bandwidth/throughput requires more processing power in the actor's environment.
  • Another alternative hardware architecture 360 configuration, shown in FIG. 16, has the same general functions, but puts all of the processing in the actor's environment. This requires either at least a second processor in the actor's environment to do the higher level reasoning or additional processing and memory resources in the controller at the actor's environment. Either way, the situation assessment and response planning functions are now performed inside the home. Notably, the situation assessment and response planning functions can be performed by separate controllers located in the actor's environment. Regardless, for this architecture, remote access can be accomplished, for example, either through a standard phone interface or by connecting the processor to the Internet.
  • FIG. 17 depicts yet another alternative configuration of a system 420 in accordance with the present invention in the form of a single, self-contained, easily configured, low-cost box.
  • the system 420 combines a small set of sensors and actuators in a single box with a telephone connection (or other mechanism for contacting the outside world).
  • the sensor suite can include a smoke detector, a carbon monoxide detector, a thermometer, and a microphone.
  • the actuator or effector suite can include a motion-activated light for path lighting, a speaker, and a dial-out connection.
  • a user installs the system 420 so the motion detector can sense the movement of people within the room, indicates what room the device is in, and plugs the device into wall power and phone lines.
  • the system 420 gathers sensor data and periodically relays that data to a server through a dial-up connection.
  • the server can store gathered data for later review by the actor or others such as caregivers.
  • the system 420 of FIG. 17 is also capable of on-site reasoning about crises (e.g., panic alert, environmental problems, etc.) and can call caregivers or a monitoring station to alert them of a problem.
  • FIG. 18 illustrates yet another, but similar, alternative system 430 configuration in which no “on-board” sensors are provided. Instead, external sensors interface with the system 430 via an RF link.
  • sensors can be provided that are adapted to perform local reasoning (e.g., a video camera that finds moving objects and provides corresponding coordinates and moving vectors).
  • FIGS. 19-21 illustrate other alternative configurations of systems 440, 450, and 460, respectively, in accordance with the present invention in a user-wearable form.
  • the system and related method of operation of the present invention can, unlike any other system previously considered, independently and intelligently monitor and recognize the behavior of an actor, and preferably further support and respond to that behavior.
  • the preferred system 20 installation includes a controller, sensing components/modules, supportive components/modules, and responsive components/modules.
  • the controller can be one or more processing devices that may be centralized in or out of an area of interest, or distributed in or out of the area of interest.
  • the controller device(s) serve to gather, store and process sensor data and perform the various reasoning algorithms required for determining actor status and needs, generating response plans and executing the response plans via the various actuators/effectors and interaction devices available for the actor, the actor's environment and/or caregivers.
  • the controller further includes data tracking, logging, and machine learning algorithms to detect trends and individual behavior patterns across collected data.
  • the sensing components/modules include one or more sensors deployed throughout the area of interest in conjunction with related modules for receiving and interpreting sensor data.
  • the supportive components/modules include one or more actuation and control devices in the area of interest.
  • one or more interaction devices available to the actor and/or the actor's caregiver are provided.
  • the system and method is preferably capable of using existing interaction devices such as telephones, televisions, pagers, and web-enabled computers.
  • the responsive components/modules include one or more sensors deployed throughout the area of interest, preferably along with actuation, control, and interaction devices.
  • The system and method of the present invention can provide a number of application-specific features, including (but not limited to) those set forth below:
  • Monitor heating systems, space heaters, fireplaces, chimneys, and appliances (especially stoves, ovens, toasters, grills, and microwaves) and provide alerts if an unusual situation occurs.
  • a panic-button-type device that is worn by the actor and can be used to summon help.
  • Resource guide of elderly-support services (e.g., dinner-delivery, in-home healthcare, or informational web pages).
  • IDS Leverage automated user interface generation capability
  • Task prompts or step-by-step instructions.
  • Monitor activities to detect signs of depression (e.g., sleep patterns, amount of overall activity, changes in appetite, changes in voice).
  • Monitor activities to detect signs of dementia onset or worsening (e.g., forgetting to do things the system or others have suggested (STM), forgetting appointments (LTM), sundowning (see wandering), hallucinations (see hallucinations)).
  • Monitor food degradation (e.g., if meat has been defrosted in the microwave and not cooked immediately, or if meat is out for longer than 2 hours at room temperature).
  • Monitor cooking progress/success (e.g., temperature and time in oven to determine whether food is cooked).
  • Task prompts or step-by-step instructions.
  • Omni-directional signal reception (e.g., no matter which way the actor has a selected remote control facing, it will control the proper device).
  • Task prompts and cues (keys light up in order as cue for entry sequence).
  • Door mat sensor and door sensor can indicate a potential exit by the actor (an outside door mat sensor and a door bell or acoustic sensor listening for a knock can confirm/disconfirm that the actor is not simply answering the door).
  • If keys are RF-tagged, confirm that the actor has the keys (if so, automatically lock the door; if not, actuating the door lock may depend on facial or voice recognition when the actor returns).
  • Wandering switch: if leaving on purpose, the actor actuates a switch at the door to indicate that he/she is leaving the house. If the switch is not actuated, the front gate locks to prevent departure and contain the wandering path within home territory. Notify the caregiver that the actor is outside if outdoor conditions are adverse.
  • The system may be able to detect the onset of depression and other user mental states. Changes in sleep patterns, eating patterns, activity level, and even vocal qualities can provide an indication that the actor is becoming depressed. If the actor is exhibiting declining trends in any or all of these parameters, the system can administer a brief assessment (such as the Geriatric Depression Scale coupled with the Mini Mental State Exam) via the phone, webpad, or television to confirm the presence of depression. Since social isolation is a common component of depression in the elderly, the system can also be adapted to intervene by sending a message to a neighbor or friend telling them it may be a good time to stop by for a visit. The system can also help by providing wider communications access for the actor, for example connecting the actor with their favorite chat room at the appropriate time.
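By way of illustration only, a declining activity trend of the kind described above might be detected as follows; the window length, drop threshold, and sample data are hypothetical.

```python
# Compare a recent window of daily activity-level samples against the
# preceding window; a sustained drop triggers a brief assessment.
def declining_trend(samples: list[float], days: int = 7,
                    drop_fraction: float = 0.25) -> bool:
    if len(samples) < 2 * days:
        return False  # not enough history to judge
    baseline = sum(samples[-2 * days:-days]) / days
    recent = sum(samples[-days:]) / days
    return recent < baseline * (1 - drop_fraction)

activity = [10, 11, 10, 9, 10, 11, 10,  # baseline week
            7, 6, 7, 6, 5, 6, 6]        # recent week
if declining_trend(activity):
    print("Administer Geriatric Depression Scale via phone, webpad, or TV")
```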
  • The system could ask what is wrong, then scan the house for signs of an intruder and reassure the actor that there is no one in the house.
  • The system can call a designated caregiver who will intervene to calm the actor.
  • The system can log the event.
  • A multi-sensory (“Snoezelen”) stimulation technique has been successful in calming children. Indications are that this technique is also effective in reducing agitation in those suffering dementia.
  • Applications include light therapy, essential oils, soft chairs, wind chimes, lava lamps, etc. While having a dedicated Snoezelen room may not be practical, applying these techniques in part in the room of the agitated actor might help reduce agitation until a caregiver can intervene.
  • the present invention provides a marked improvement over previous designs.
  • the system and method of the present invention incorporate a highly flexible architecture/agent construct that is capable of taking input from a wide variety of sensors in a wide variety of settings, mapping them to a set of events of interest, reasoning about desired responses to those events of interest, and then accomplishing those responses on a wide variety of effectors in a wide variety of settings.

Abstract

An automated system and method for monitoring and supporting an actor in an environment, such as a daily living environment. The system includes at least one sensor, at least one effector, and a controller adapted to provide monitoring, situation assessment, response planning, and plan execution functions. In one preferred embodiment, the controller provides a layered architecture allowing multiple modules to interact and perform the desired monitoring and support functions.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is related to, and is entitled to the benefit of, U.S. Provisional Patent Application Serial No. 60/351,300, filed Jan. 22, 2002; U.S. Provisional Patent Application Serial No. 60/368,307, filed Mar. 28, 2002; U.S. Provisional Patent Application Serial No. 60/384,899, filed May 30, 2002; and U.S. Provisional Patent Application Serial No. 60/384,519, filed May 29, 2002; U.S. patent application Ser. No. 10/286,398, filed on Nov. 1, 2002; U.S. Provisional Patent Application Serial No. 60/424,257, filed on Nov. 6, 2002; a U.S. non-provisional patent application filed on even date herewith, entitled “System and Method for Learning Patterns of Behavior and Operating a Monitoring and Response System Based Thereon”, having attorney docket number H0003384.02; a U.S. provisional patent application filed on even date herewith, entitled “System and Method for Automatically Generating an Alert Message with Supplemental Information”, having attorney docket number H0003365; the teachings of all of which are incorporated herein by reference.[0001]
  • BACKGROUND OF THE INVENTION
  • The present invention relates to an automated system and method for providing assistance to individuals based, at least in part, upon monitored activities. More particularly, it relates to a system and method that intelligently monitors, recognizes, supports, and responds to activities of an individual in an environment such as an in-home, daily living environment. [0002]
  • The evolution of technology has given rise to numerous, discrete devices adapted to make daily, in-home living more convenient. For example, companies are selling microwaves that connect to the Internet, and refrigerators with computer displays, to name but a few. Manufacturers have thus far concentrated on the devices themselves, and the network protocols necessary for them to communicate on an individual basis. Experience in other domains (e.g., avionics, oil refineries, surgical theaters, etc.) shows that such innovations will merely produce a collection of distributed devices with localized intelligence that are not integrated, and that may actually conflict with each other in their installation and operation. Further, these discrete products typically include highly advanced sensor technology, and thus are quite expensive. Taken as a whole, then, these technological advancements are ill-suited to provide coordinated, situation aware, universal support to an in-home resident on a cost-effective basis. [0003]
  • The above-described drawbacks associated with state-of-the-art home-related technology are highly problematic in that a distinct need exists for an integrated personal assistant system. One particular population demographic evidencing a clear desire for such a system is elderly individuals. Generally speaking, with advanced age, elderly individuals may experience difficulties in safely taking care of themselves. Apparently, a nursing home is often the only option, in spite of the financial and emotional strain placed on both the individual as well as his/her family. Similar concerns arise for a number of other population categories, such as persons with specific disease conditions (e.g., dementia, Alzheimer's, etc.), disabled people, children, teenagers, over-stressed single parents, hospitals (e.g., newborns, general patient care, patient location/wandering, etc.), low-security prisons, or persons on parole. Other types of persons that could benefit from varying degrees of in-home or institutional monitoring and assistance include the mentally disabled, depressed or suicidal individuals, recovering drug or alcohol addicts, etc. In fact, virtually anyone could benefit from a universal system adapted to provide general in-home monitoring, reminding, integration, and management of in-home automation devices (e.g., integration of home comfort devices, vacation planning, food ordering, etc.), etc. [0004]
  • Some efforts have been made to develop a daily living monitoring system based upon information obtained by one or more sensors disposed about the user's home. For example, U.S. Pat. No. 5,692,215 and U.S. Pat. No. 6,108,685, both to Kutzik et al., describe an in-home monitoring and data-reporting device geared to generate movement, toileting, and medication-taking data for the elderly. The Kutzik et al. system cannot independently determine appropriate actions based upon sensor data; instead, the data is simply forwarded onto a caregiver who must independently analyze the information, formulate a response and execute the response at a later point in time. The recognition by Kutzik that monitoring a person's daily living activities can provide useful information for subsequently assisting that person is clearly a step in the right direction. However, to be truly beneficial, an appropriate personal, in-home assistant system must not only receive sensor data, but also integrate these individual functions and information sources to automatically develop an appropriate response plan and implement the plan, thereby greatly assisting the actor/user in their activities. A trend analysis feature alluded to by Kutzik et al. may provide a separate person (i.e., caregiver) with data from which a possible course of action could be gleaned. However, the Kutzik et al. system itself does not provide any in-depth sensor information correlation or analysis, and cannot independently or immediately assess a particular situation being encountered by the user, let alone generate an automated, situation-appropriate response. Further, Kutzik et al. does not address the “technophobia” concerns (often associated with elderly individuals) that might otherwise impede complete interaction between the user and the system. The inability of Kutzik, as well as other similar systems, to satisfy these constraints is not surprising, given that the requisite system architecture, ontology, and methodologies did not heretofore exist, and that such a system must overcome extensive technology and reasoning obstacles. [0005]
  • Emerging sensing and automation technologies represent an exciting opportunity to develop a system to monitor and support an actor in an environment. Unfortunately, current techniques entail either discrete devices that are unable to interact with one another and/or cannot independently and automatically respond to the daily activities of an actor based upon sensor-provided information. Therefore, a need exists for a system and method for providing accurate situation assessment and appropriate, intelligent responsive plan generation and implementation based upon the sensed daily activities of an actor.[0006]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram illustrating the system of the present invention; [0007]
  • FIG. 2 is a simplified, schematic diagram of an architectural configuration of the system of FIG. 1; [0008]
  • FIG. 3 is a schematic illustration of a preferred architectural configuration of the system of FIG. 1; [0009]
  • FIGS. 4-11 are schematic illustrations of alternative architectural configurations; [0010]
  • FIG. 12 is a block diagram of an alternative system in accordance with the present invention; [0011]
  • FIGS. 13A-13C provide an exemplary method of operation in accordance with the present invention in flow diagram form; [0012]
  • FIG. 14 is a schematic illustration of an architecture associated with the method of FIGS. 13A-13C; and [0013]
  • FIGS. 15-21 are block diagrams of alternative system configurations in accordance with the present invention. [0014]
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • A. Hardware Overview [0015]
  • One preferred embodiment of an actor (or user or client) monitoring and responding system 20 in accordance with the present invention is shown in block form in FIG. 1. As a point of reference, the system 20 offers the potential to incorporate monitoring and support tools as a personal assistant. By providing intelligent, affordable, usable, and expandable integration of devices, the system 20 will support daily activities, facilitate remote interaction with family and caregivers, provide safety and security, and otherwise assist the user. [0016]
  • In most general terms, the system 20 includes one or more controllers 22, a plurality of sensors 24, and one or more effectors 26. As described in greater detail below, the sensors 24 actively and/or passively monitor daily activities of an actor or user 28 or their environment (including other humans, animals, etc.). Information or data from the sensors 24 is signaled to the controller 22. The controller 22 processes the received information and, in conjunction with architecture features described below, assesses the actor's 28 actions or situation (or the actor's 28 environment 30), and performs a response planning task in which an appropriate response based upon the assessed situation is generated. Based upon this selected response, the controller 22 signals the effector 26 that in turn carries out the planned response relative to the actor 28 or any other interested party (or caregiver) depending upon the particular situation. As used throughout the specification, the term “caregiver” encompasses any human other than the actor 28 that is in the actor's environment 30 or interacts with the actor 28 for any reason. Thus, a “caregiver” in accordance with the present invention is not limited to a medical specialist (e.g., physician or nurse), but further includes any human such as a relative, neighbor, guest, etc. Further, the term “environment” encompasses a physical structure in which the actor 28 is located (permanently or periodically) as well as all things in that physical structure, such as lights, plumbing, ventilation, appliances, humans other than the actor 28 that at least periodically visit (e.g., caregiver as defined above and pets), etc. [0017]
  • The key component associated with the system 20 resides in the architecture provided with the controller 22. As such, the sensors 24 and the effectors 26 can assume a wide variety of forms. Preferably, the sensors 24 are low cost, and are networked by the controller 22. For example, the sensors 24 can include motion detectors, pressure pads, door latch sensors, panic buttons, toilet-flush sensors, microphones, cameras, fall-sensors, door sensors, heart rate monitor sensors, blood pressure monitor sensors, glucose monitor sensors, moisture sensors, light level sensors, telephone sensors, smoke/fire detectors, thermal sensors, water sensors, seismic sensors, etc. In addition, one or more of the sensors 24 can be a sensor or actuator associated with a device or appliance used by the actor 28, such as a stove, oven, television, telephone, security pad, medication dispenser, thermostat, etc., with the sensor or actuator providing data indicating that the device or appliance is being operated by the actor 28 (or someone else). The sensors 24 can be non-intrusive or intrusive, active or passive, wired or wireless, physiological or physical. In short, the sensors 24 can include any type of sensor that provides information relating to activities or status of the actor 28 or the environment. [0018]
  • Similarly, the effectors 26 can also assume a wide variety of forms. Examples of applicable effectors 26 include computers, displays, telephones, pagers, speaker systems, lighting systems, fire sprinklers, door lock devices, pan/tilt/zoom controls on a camera, etc. The effectors 26 can be placed directly within the actor's 28 environment, and/or can be remote from the actor 28, for example providing information to other persons concerned with the actor's 28 daily activities (e.g., caregiver, family members, etc.). [0019]
  • The controller 22 is preferably a microprocessor-based device capable of storing and operating appropriate architectural components (or other modules), as described below. In this regard, the controller 22 can include discrete components that are linked to one another for appropriate interface. For example, a first controller component can be located at the actor's 28 home, whereas a second controller component can be located off-site. Alternatively, an even greater number of controller components can be provided. Conversely, an entirety of the controller 22 can be located on-site or off-site, or can be worn on the body of the actor 28. Various hardware configurations for the controller 22 are described in greater detail elsewhere. [0020]
  • B. Architecture and Related Functions [0021]
  • As previously described, the ability of the system 20 of the present invention to provide integration of the various sensor data in conjunction with intelligent formulation of an appropriate response to a particular situation encountered by the actor 28 resides in the architecture provided with the controller 22. The architecture configuration for accomplishing these goals can reside in various iterations that are dependent upon a particular installation; a more complex application will entail a more complex architectural arrangement in terms of availability and integration of additional features. Regardless, to best explain the various architecture and preferred features/configurations, the foregoing description includes exemplary hypothetical situations in conjunction with the methodology that the feature/configuration being described would employ to sense, analyze and/or address the hypothetical. The examples provided are in no way limiting of the countless potential applications for the system 20 architecture, and the listed responses are in no way exhaustive. [0022]
  • With the above in mind, and with reference to FIG. 2, the preferred system architecture entails four main categories of capability that can be described as fitting into a layered hierarchy. These include sensing 40, situation assessment 42, response planning 44, and response execution 46. In general terms, the sensing layer 40 coordinates signaled information from multiple sensor sources, preferably clustering multiple sensor reports into a single event. With respect to the situation assessment layer 42, based upon information provided via the sensing layer 40, an attempt is made to understand the current situation of the actor 28, whether it is describing the person or persons in the environment being monitored (e.g., the actor 28, caregivers, pets, postal workers, etc.), or physical properties of the environment (e.g., stove on/off, door opened/closed, vase fell in the kitchen, etc.). The situation assessment layer 42 will preferably include a number of components or sub-layers, such as intent recognition for understanding what actors are trying to do, and response monitoring for adaptation. Regardless, based upon the situation assessment information provided by the situation assessment layer 42, the response planning layer 44 generates an appropriate response plan, such as what to do or whom to talk to, how to present the devised response, and on what particular effector(s) 26 (FIG. 1) the response should be effected. Finally, the response execution layer 46 effectuates the response plan generated by the response planning layer 44. Each of these functions is described in greater detail below. [0023]
  • Within each of the layers 40-46 or across two or more of the layers 40-46, one or more computational components can be employed. In a preferred embodiment, the architecture associated with the system 20 has components that are agent-oriented such that the system 20 provides multiple independent computational threads, and encourages a task-centered model of computation. By encouraging a task-centered model of computation, the system 20 benefits from the natural byproduct of decoupled areas of computational responsibility. The multi-threaded computational model enhances this decoupling by supporting a system that makes use of the different levels of granularity that a problem presents. In one embodiment, the agents can migrate from one computational platform (or layer) to another to balance loads. Thus, the preferred system 20 provides an agent or agents responsible for various capabilities essential to good system performance available at several levels of computational responsibility, from device control to user task tracking. [0024]
  • The model of the preferred system 20 is expressed in an ontology and agent communication protocol (referenced generally at 48 in FIG. 2) that forms a common language to describe the domain. This ontologically mediated inter-agent communication provides an additional benefit; it gives components the ability to discover services provided by other agents, often through the services of a matchmaker. Discovery directly provides the opportunity for an independent agent to expand its range of knowledge without radically changing its control focus. As a result, discovery allows the overall system 20 to grow at run-time without adversely affecting functionality. Thus, the preferred agent-oriented approach provides modularity, independence, distribution, discovery, and social convention. [0025]
  • The preferred agent architecture associated with the system 20 is defined as a federated set of agents that define agent interfaces. In this regard, as used throughout the specification, a “system agent” or “agents” is defined as a software module that is designed around fulfilling a single task or goal, and provides at least one agent interface. An individual system agent is intended to perform a single (possibly very high level) task. Examples of the agent's task include interaction with a user or caregiver, preventing fires in a kitchen, interfacing with a medication-monitoring device, monitoring long-term trends, learning, filtering sensor noise, device management (e.g., speech or video understanding, television operation, etc.), etc. The system agent is the basic delivery and compositional unit of the system 20 architecture. As such, different software vendors can provide agents for installation in the system 20 to provide new functionality. While the system 20 will preferably have, at its core, a small set of agents that will be present in every installation of the system 20, the breakdown of system functionality into agents is designed to allow a flexible modularity to the system 20 construction. Choosing agents on the basis of provided functionality will allow the actor 28 (or a person responsible for his/her care) to customize the system 20 to provide only those functions they want to have without requiring the adoption of functionality that they are not interested in. Although the preferred system 20 architecture has been described as being agent-based, other configurations capable of performing the situation assessment, response planning, and response plan implementation features described below are also acceptable. [0026]
  • Agent interfaces provide the inter-agent communication and interaction for the system 20. Each agent must make available at least one agent interface. In contrast to the task-organized functionality provided by agents, the agent interfaces are designed to allow the agents to provide functionality to each other. They provide for and foster specific kinds of interactions between the agents by restricting the kinds of information that can be provided through each interface. [0027]
  • In a preferred embodiment, the system 20 provides three types of agent interfaces, including a “Sensor agent interface”, an “Actuator agent interface”, and a “Reasoner agent interface” (hereinafter referred to as “SRA interfaces”). A sensor agent interface answers questions about the current state of the world, such as “is the stove on/off?”, “has the user taken his/her medication for the day?”, “is the user in the house?”, etc. These interfaces allow others to interact with the agent as though it is just a sensor. An example of this kind of interface is a kitchen fire safety agent that allows other agents to know the state of the stove. An actuator agent interface accepts requests for actions to change/modify the world, for example, including: turning the stove on/off, calling the user on the phone, flashing the lights, etc. These interfaces allow the agent to be used by others as a simple actuator. Preferably, the monitoring of an action to verify that it has been done would be carried out by the agent implementing the actuator agent interface rather than by the agent requesting the action. Finally, a reasoner agent interface answers questions about the future state of the world such as, for example, “will the user be home tonight?” or “can the user turn off the television?”, etc. These interfaces are designed to allow the agent to perform reasoning for other agents. [0028]
  • In general, each agent will preferably have more than one interface and may even provide multiple interfaces of the same type. For example, a kitchen fire safety agent can provide a sensor agent interface for the state of the stove and a similar, but separate, agent interface for the toaster oven. Similarly, the kitchen fire safety agent preferably provides a sensor interface for indicating a current state of the stove, an actuator interface that allows changing of a stove temperature or activation/deactivation, and a reasoner agent interface that determines an expected future state of the stove. In a preferred embodiment, when an agent is registered as part of the system 20, it will register the agent interfaces that it makes available. Other agents that wish to make use of these interfaces can be informed of the availability and be reconfigured accordingly. This preferred agent discovery process entails discovery of software features and capabilities of available agents, and is not otherwise available with existing protocols, such as Universal Plug and Play (“UPnP”). [0029]
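The three SRA interface types lend themselves to an abstract-interface rendering. The Python sketch below is illustrative only; the interfaces are described abstractly above, so the class and method names here are assumptions, and the kitchen fire safety agent is simply the example agent from the text.

```python
# One agent may expose sensor, actuator, and reasoner interfaces at once,
# as in the kitchen fire safety example above.
from abc import ABC, abstractmethod

class SensorAgentInterface(ABC):
    @abstractmethod
    def current_state(self, question: str):
        """Answer a question about the current state of the world."""

class ActuatorAgentInterface(ABC):
    @abstractmethod
    def request_action(self, action: str) -> bool:
        """Accept a request to change the world; verify it was done."""

class ReasonerAgentInterface(ABC):
    @abstractmethod
    def predict(self, question: str):
        """Answer a question about the future state of the world."""

class KitchenFireSafetyAgent(SensorAgentInterface, ActuatorAgentInterface,
                             ReasonerAgentInterface):
    def __init__(self):
        self.stove_on = False
    def current_state(self, question):
        return self.stove_on if question == "stove on?" else None
    def request_action(self, action):
        self.stove_on = (action == "turn stove on")
        return True  # implementing agent verifies its own actions
    def predict(self, question):
        return False  # e.g., best guess for "will the stove be on tonight?"

agent = KitchenFireSafetyAgent()
agent.request_action("turn stove on")
print(agent.current_state("stove on?"))  # -> True
```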
  • One of the benefits associated with the preferred agent-oriented paradigm is reflection. Reflection is the process of reasoning about and acting upon one's self. Reflection is present at both the individual and social levels of properly constructed agent systems. Reflection at the single agent level primarily means that the agent can reason about the importance of its goals and commitments in a dynamic environment; it is apparent in explicit models of goals, tasks, and execution state. An agent's ability to reason about goals and commitments in the context of an agent system is provided by a common, interchangeable task model. [0030]
  • One preferred embodiment of the system 20 architectural organization, including preferred layer-agent interrelationships, is provided in FIG. 3. The framework illustrated in FIG. 3 includes multiple layers that correspond to the situation assessment layer 42 of FIG. 2, including “clustering”, “validating”, “situation assessment and response monitoring”, and “intent inference”. Further, FIG. 3 illustrates various agents within each layer and/or acting within several layers. In this regard, exemplary domain agents are provided (including “fire safety”, “home security”, and “medication management”). It will be understood that these are but a few examples of domain agents that can be used with the system 20 of the present invention. [0031]
  • The various layers identified in FIG. 3 provide a framework in which to describe an agent's capability, rather than a strict enforcement of code. Further, there are some agents that reside outside of this framework, notably because they are not part of the “reasoning chain” in quite the same way. These would include, for example, customization and configuration (that interacts with an actor to gather system set-up information), “machine learning” (described in greater detail below; generally refers to building models of the particular application environment and normal activities of the actor 28 that are used by caregivers of the system 20 to intervene or improve system accuracy and responsiveness), and a log manager (to mediate access to system databases). Further, devices (both sensors and actuators) reside in the device layer, communicating with a standard device communication protocol. The agents communicate within an agent infrastructure. In one preferred embodiment, one or more agents are provided that function as adaptors to translate device messages. [0032]
  • The agents associated with FIG. 3 are depicted as larger ovals according to functional groupings. In a preferred embodiment, each of the agents shown in FIG. 3 provides all of the functionality related to the particular subject matter. For example, the domain agents described above can further include an “eating” agent that provides all of the functionality related to the eating habits of the actor 28, including, for example, monitoring what and when the actor is eating, monitoring the freshness of food, creating menus and grocery lists, and raising alerts when necessary. [0033]
  • Communication between the agents of FIG. 3 is preferably performed through one or more of the three SRA interfaces previously described. Within an agent, agent-components may communicate using whatever mechanism they choose, ranging from (1) being one piece of undifferentiated code that requires no communication, to (2) using their own proprietary communication method, to (3) using the preferred system ontology in a communication protocol. [0034]
  • While it is unlikely that an agent or agent-component residing in the response planning layer will want or need access to an agent in the pattern matching layer (i.e., skipping layers), the preferred architecture does not restrict this information flow. In short, response planner layer agents need only maintain “ontological purity” in their communications with other agents. This same preferred feature holds true for agents that can reason over multiple layers in the reasoning architecture. “Ontological purity” means that the ontology defines concepts that can be shared or inspected between agents, and those concepts exist within a level of the reasoning architecture. Concepts can be used within or across levels or layers, but preferably must be maintained consistently across agents. [0035]
  • The particular infrastructure framework utilized for the agent system architecture of the system 20 can assume a variety of forms, including FIPA-OS, AgentTool, Zeus, MadKit, OAA2, JAFMAS, JADE, DECAF, etc. [0036]
  • C. Preferred Agent Features [0037]
  • Several of the layers and/or agents illustrated in the layered architecture of FIG. 3 preferably provide added “intelligence” to the system 20, and are described in greater detail below. It should be noted, however, that regardless of whether one or more of the features are included, the overall layered architecture configuration of the system 20 provides a heretofore unavailable platform for seamlessly associating each of these features in a manner that preferably facilitates complete monitoring, recognizing, supporting, and responding to the behavior of an actor in everyday life, it being understood that the present invention is not limited to facilitating all of these functions (e.g., supporting and responding to behavior are not mandatory features). [0038]
  • For example, devices in the various layers preferably can write directly to the log. Agents preferably go through the log manager, which selectively returns only the requested information. Alternatively, the system 20 architecture can be adapted such that non-agents can access and review information stored within the log manager (e.g., a doctor's office would represent a non-agent that could benefit by having access to the log manager). Along these same lines, the system 20 can be adapted such that non-agents are able to write data into the log manager, but on a mediated basis. [0039]
  • The “sensor adapter” agent is preferably adapted to read the log of sensor firings, compensate for any latencies in data transmission, and then forward the information into the agent architecture. [0040]
  • The “clustering” layer is provided to combine multiple sensory streams into a single event. For example, for a particular system 20 installation, the sensors can include a pressure-mat sensor in the kitchen, a pressure-mat sensor in the hall, and a motion sensor in the kitchen. The preferred “event” agent associated with the clustering layer can interpret a three-sensor sequence of these sensors as probably reporting on the same event, namely entering the kitchen. The “situation assessment and response monitoring” layer aggregates evidence presented by the various sensors and agents to predict a most likely ramification of the current user situation. In this regard, the layering preferably includes monitoring the effects of a subsequently-implemented response plan. For example, a particular situation assessment may conclude that the actor 28 has fallen. The resulting response plan is to ask the actor 28 whether or not he/she is “okay”. If, under these circumstances, the actor 28 does not respond to the question, then the response monitoring layer can conclude that the detected fall is likely to be more serious. [0041]
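As an illustration of the clustering idea described above, the following Python sketch (hypothetical names and thresholds; the patent does not specify an algorithm) groups sensor firings that occur close together in time into a single higher-level event:

```python
# Hypothetical sketch of the "clustering" layer -- not the patented algorithm.
from dataclasses import dataclass

@dataclass
class Firing:
    sensor: str
    t: float  # seconds since some epoch

def cluster(firings, window=5.0):
    """Group firings separated by no more than `window` seconds into one event."""
    events, current = [], []
    for f in sorted(firings, key=lambda f: f.t):
        if current and f.t - current[-1].t > window:
            events.append([c.sensor for c in current])
            current = []
        current.append(f)
    if current:
        events.append([c.sensor for c in current])
    return events

# Hall mat, kitchen mat, and kitchen motion within two seconds are most
# plausibly one event: the actor entering the kitchen.
print(cluster([Firing("hall_mat", 0.0), Firing("kitchen_mat", 1.2),
               Firing("kitchen_motion", 1.8)]))
# -> [['hall_mat', 'kitchen_mat', 'kitchen_motion']]
```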
  • The “client” agent and the “home” agent monitor and manage information relating to the actor and the actor's environment, respectively. The client agent information preferably includes current and past information, such as location, activity and capabilities, as well as preferred interaction mechanisms. The information can be predetermined (provided directly by the actor and/or caregiver), inferred (via situation assessment or intent recognition), and/or learned (machine learning). Where the particular environment includes multiple actors (e.g., a spouse), a separate client agent will preferably be provided for each actor. The home agent information preferably includes environment lay-out, sensor configurations, and normal sensor patterns. Again, the information may be predetermined, inferred and/or learned. [0042]
  • A further preferred feature of the previously-described “domain” agents is responsibility for all reasoning related to their respective functional areas. Each domain agent performs situation assessment, provides intent recognition libraries (described below), and creates initial response plans. With respect to the proposed response plan, each domain agent is preferably adapted to decide whether, for a particular situation, to wait for additional information, explicitly gather more information, or interact with the actor and/or caregiver. The domain agent further preferably decides what actor interaction/interface device(s) to use, what modality to use on selected devices, and, where appropriate, which person(s) to contact in the event that outside assistance is determined necessary. The domain agent preferably proposes an interaction based only on its specialized knowledge; in other words, it proposes a “context-free” response. [0043]
  • The “intent inference” layer preferably includes an “intent recognition” agent that, in conjunction with intent recognition libraries, pools multiple sensed events and infers goals of the actor, or more simply, formulates “what is the actor trying to do”. For example, going into the kitchen, opening the refrigerator, and turning on the stove likely indicate that the actor is preparing a meal. Alternative intent inference evaluations include inferring that the actor is leaving the house, going to bed, etc. In general terms, the preferred intent recognition agent (or intent inference layer) repeatedly generates a set of possible intended goals (or activities) of the actor for a particular observed event or action, with each “new” set of possible intended goals being based upon an extension of the observed sequence of actions with hypothesized unobserved actions consistent with the observed actions. The library of plans that describe the behavior of the actor (upon which the intent recognition is based) is provided by the “domain” agents. In a preferred embodiment, the system 20 probabilistically infers intended goals pursuant to a methodology in which potentially abandoned goals are eliminated from consideration, as taught, for example, in U.S. Provisional Application Serial No. 60/351,300, filed Jan. 22, 2002, the teachings of which are incorporated herein by reference. The preferred intent inference layer improves the response planning capabilities of the system 20 because the response planner is able to respond “preemptively”. For example, with intent inference capabilities, the system 20 architecture can lock a door before a demented actor attempts to leave his/her home, provide next step-type suggestions to an actor experiencing difficulties with a particular activity or task, suppress certain warning alarms in response to a minor kitchen fire upon recognizing that the actor is quickly moving toward the kitchen, etc. [0044]
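A greatly simplified sketch of the intent-inference idea follows. It is hypothetical and far simpler than the probabilistic method of the cited provisional application: it merely scores each plan in a domain-supplied library by how much of its action sequence has been observed, drops unsupported goals, and normalizes the remainder into a rough distribution:

```python
# Hypothetical, simplified sketch of intent inference -- not the method of
# the cited provisional application.
PLAN_LIBRARY = {  # plan libraries are supplied by the domain agents
    "prepare_meal": ["enter_kitchen", "open_fridge", "turn_on_stove"],
    "leave_house":  ["enter_hall", "take_keys", "open_front_door"],
    "go_to_bed":    ["enter_bedroom", "turn_off_lights"],
}

def infer_goals(observed):
    """Score each goal by the fraction of its plan observed, in order."""
    scores = {}
    for goal, plan in PLAN_LIBRARY.items():
        i = 0
        for act in observed:
            if i < len(plan) and act == plan[i]:
                i += 1
        if i:  # goals with no supporting evidence are dropped
            scores[goal] = i / len(plan)
    total = sum(scores.values())
    return {g: s / total for g, s in scores.items()}

print(infer_goals(["enter_kitchen", "open_fridge", "turn_on_stove"]))
# -> {'prepare_meal': 1.0}: the actor is most likely preparing a meal
```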
  • The preferred architecture of FIG. 3 further includes an “IDS” agent. This is in reference to an Interaction Design System agent that processes sensor data to understand a particular situation, the needs and capabilities of the actor 28, and the available effectors that, as part of the Response Planning layer, are used to develop interaction plans. That is to say, the IDS agent provides information for developing a series of control actions designed to assist the actor through information presentation or adaptive automation behaviors. Thus, the preferred IDS agent reasons about which user interaction/interface device to utilize for informing the actor of a particular plan. The adaptive interaction generation feature promotes planned responses adapting, over time, to how the actor 28 (or others) responds to particular plan strategies. By further accounting for the urgency of a particular message, the preferred IDS agent dynamically responds to the current situation, and allows more flexible accommodation of the interaction/interface devices. [0045]
  • An additional feature preferably incorporated into the Situation Assessment and Response Monitoring layer is an inactivity monitoring feature. The inactivity monitoring feature is preferably provided as part of the “machine learning” agent (described below) or as part of individual domain agents, and relates to an expected actor activity (e.g., the actor should wake up at 8 a.m., the actor should reach the bottom of the stairs within one minute of starting to descend) that does not occur. In other words, the preferred system 20 architecture accounts not only for unexpected activities or events, but also for the failure of an expected activity to occur, with this failure being cause for alarm. The inactivity monitoring function is primarily model based, and can include accumulated information such as a history of the actor's activities; a profile of the actor's environment; hardware-based sensor readings; information about the current state of the world (e.g., time of day); information about the caregiver's activities (where applicable); a prediction of the future actions of the actor and/or caregiver; predictions about the future state of the world; predetermined actor, caregiver, and/or environment profiles; and predetermined actor and/or caregiver models, settings, or preferences. The inactivity monitoring mechanism preferably can detect the unexpected inactivities that would otherwise go unnoticed by an activity-only concept of monitoring. It does so by comparing the actor's current activities with his/her preset and/or expected patterns. In a preferred embodiment, certain thresholds are implemented to allow for flexibility in the actor's schedule. However, there are certain recognizable patterns within the day, and within each activity. For example, if the actor is expected to rise from bed between 8 a.m. and 10 a.m., and no activity has been detected during this time, the system 20 can be adapted to raise an alarm notifying a designated caregiver(s). By way of further example, and at a different granularity, if the actor 28 is descending the stairs, and no motion is detected at the bottom of the staircase after a predetermined length of time, the system 20 can be adapted to raise an alarm. Therefore, the established thresholds of the inactivity monitoring mechanism enable the system 20 to detect a greater range of unexpected behaviors and possibly dangerous situations. [0046]
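The rise-from-bed example above can be rendered as a small sketch. The following Python fragment is hypothetical (the patent defines no data format); it checks whether an expected activity failed to occur within its threshold window and, if so, emits an alarm for the designated caregiver:

```python
# Hypothetical sketch of inactivity monitoring -- illustrative only.
EXPECTED = [
    # (activity, earliest, latest) in hours since midnight
    ("rise_from_bed", 8.0, 10.0),
]

def check_inactivity(last_activity_hour, now_hour):
    """`last_activity_hour` is None if no activity has been sensed today."""
    alarms = []
    for name, earliest, latest in EXPECTED:
        window_passed = now_hour > latest
        nothing_seen = last_activity_hour is None or last_activity_hour < earliest
        if window_passed and nothing_seen:
            alarms.append(f"ALARM: expected '{name}' between {earliest} and "
                          f"{latest} h; no activity detected -- notify caregiver")
    return alarms

print(check_inactivity(last_activity_hour=None, now_hour=10.5))
```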
  • In conjunction with the above-described inactivity monitoring feature, the preferred system 20 architecture further includes an Unexpected Activity/Inactivity Response feature in the form of a module or agent that determines if the actor 28 needs assistance by monitoring for signs of unusual activity or inactivity. Given the “normal” or expected behavior of the actor 28 or the actor's environment, unusual activity can trigger a response. For example, movement in the basement when the actor 28 is normally asleep could trigger an intruder alarm response. This augments the above-described inactivity monitoring feature by adding a learned or programmed model of the normal/usual activities, and includes, in addition to the above listed information, learned actor, caregiver, and/or environmental usual patterns; learned actor, caregiver, and/or environmental profiles; and learned actor and/or caregiver preferences. [0047]
  • The “response plan/exec” agent preferably includes a response coordination feature that coordinates the responses of the “domain” agents. The response coordinator preferably merges or suppresses interactions or changes interaction modality, as appropriate, based upon context. For example, if the actor 28 has fallen (entailing an “alarm” response), the response coordinator can suppress a reminder to take medication. Multiple reminders to the actor 28 can be merged into one message. Multiple alert requests to different devices can be merged onto one device. To this end, merged messages will preferably be sorted by priority, where priority is defined by the domain agent, as well as by the type of message (e.g., an alarm is more important than an alert). Preferably, the response plan/exec agent centralizes agent coordination, but alternatively the system 20 architecture can employ distributed modes. The preferred centralized response coordination approach, however, is feasible because all of the involved agents interact with a small sub-set of users through a small sub-set of devices. In other words, all activities involving communications with the outside world are strongly interrelated. Thus, while the agents are loosely coupled, their responses are not. [0048]
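The merging and suppression behavior of the response coordinator might look, in a deliberately minimal and hypothetical Python sketch, something like the following (message kinds, ranks, and the merge policy are all invented for illustration):

```python
# Hypothetical sketch of response coordination -- not the patented design.
from dataclasses import dataclass

@dataclass
class Interaction:
    kind: str      # "alarm" | "alert" | "reminder"
    text: str
    priority: int  # assigned by the originating domain agent

RANK = {"alarm": 2, "alert": 1, "reminder": 0}  # alarms outrank alerts, etc.

def coordinate(pending):
    # An active alarm (e.g., a detected fall) suppresses routine reminders.
    if any(i.kind == "alarm" for i in pending):
        pending = [i for i in pending if i.kind != "reminder"]
    reminders = [i for i in pending if i.kind == "reminder"]
    others = [i for i in pending if i.kind != "reminder"]
    if len(reminders) > 1:  # merge multiple reminders into a single message
        others.append(Interaction("reminder",
                                  "; ".join(r.text for r in reminders),
                                  max(r.priority for r in reminders)))
    else:
        others.extend(reminders)
    # Sort by message type first, then by domain-agent priority.
    return sorted(others, key=lambda i: (RANK[i.kind], i.priority), reverse=True)

queue = [Interaction("reminder", "take medication", 1),
         Interaction("alarm", "possible fall detected", 5)]
print([i.text for i in coordinate(queue)])  # reminder suppressed by the alarm
```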
  • The “machine learning” agent provides a means for ongoing adaptation and improvement of system 20 responsiveness relative to the needs of the actor 28. The machine learning agent preferably entails a behavior model built over time for the actor 28 and/or the actor's environment. In general terms, the model is built by accumulating passive (or sensor-supplied) data and/or active (actor- and/or caregiver-entered) data in an appropriate database. The data can be simply stored “as is”, or an evaluation(s) of the data can be performed for deriving event(s) and/or properties of event(s) as described, for example, in U.S. Provisional Patent Application Serial No. 60/834,899, filed May 30, 2002, the teachings of which are incorporated herein by reference. Regardless, other modules in the system 20 preferably can utilize the learned models to adapt or change their operation. For example, the Response Planning layer will likely consider alternative plans or actions. Learning the previous success or failure of a chosen plan or action enables continuous improvement. In the realm of actor interaction and where the machine learning agent (or similar module) is provided, the system 20 can learn, for example, the most effective modality for a message; the most effective volume, repetition, or duration within a modality; and the actor's preferences regarding modality, intensity, etc. Thus, the mechanism for learning can account for contextual conditions (e.g., audio messages are ineffective when the actor is in the kitchen). [0049]
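One way the contextual modality learning described above could be realized is sketched below; the statistics kept and the selection rule are assumptions made for illustration only:

```python
# Hypothetical sketch of learning the most effective interaction modality.
from collections import defaultdict

class ModalityLearner:
    def __init__(self):
        # (modality, context) -> attempt/acknowledgement counts
        self.stats = defaultdict(lambda: {"tried": 0, "acked": 0})

    def record(self, modality, context, acknowledged):
        s = self.stats[(modality, context)]
        s["tried"] += 1
        s["acked"] += int(acknowledged)

    def best(self, context, candidates):
        def rate(m):
            s = self.stats[(m, context)]
            return s["acked"] / s["tried"] if s["tried"] else 0.5  # unknown -> neutral
        return max(candidates, key=rate)

learner = ModalityLearner()
learner.record("audio", "kitchen", acknowledged=False)       # audio ineffective there
learner.record("text+image", "kitchen", acknowledged=True)
print(learner.best("kitchen", ["audio", "text+image"]))      # -> text+image
```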
  • Finally, the “customization” (or “configuration”) agent is preferably adapted to allow an installer of the system 20 to input configuration information about the actor, the caregiver (where relevant), and other persons acting in the environment, as well as relevant information about the environment itself. [0050]
  • D. Preferred Architecture Functioning [0051]
  • The layered architecture presented in FIG. 3 is but one example of an appropriate configuration useful with the system 20 of the present invention. Other exemplary architectures are presented in FIGS. 4-11. For example, the exemplary architecture of FIG. 5 incorporates a more “horizontal” cut of agent functionality whereby there is generally one agent per layer that performs all the tasks required for that layer. By way of comparison, all situation assessment is carried out by a single agent within the architecture of FIG. 5, whereas individual agents are provided for selected situations within the architecture of FIG. 3 (e.g., all medication management-related assessment occurs in the medication management agent). As a point of clarification, several of FIGS. 4-11 include the term “CARE”, which is in reference to “client adaptive response environment”, and the term “HOME”, which is in reference to “home observation and monitoring environment”, both of which represent system components in accordance with the present invention. [0052]
  • Regardless of the exact architectural configuration, a preferred feature of the system 20 is an ability to mediate and resolve multiple actuation requests. In particular, the system 20 is preferably adapted to handle multiple conflicting requests made to an agent interface. In one preferred embodiment, this functionality is performed at the level of individual actuator agent interfaces. Alternatively, a central planning committee design can be instituted. However, the central planning committee technique would require a blackboard-type architecture, and would require providing all information needed to make a global decision rather than a local one. Given these restrictions, it is preferred that each actuator agent interface be required to handle the multiple conflicting request issue on an individual basis. [0053]
  • A first problem associated with multiple conflicting requests relates to multiple priority messages. In a preferred embodiment, each actuation request is provided with a priority “level” (e.g., on a scale of 1-5). Each priority level represents an order of magnitude jump from the level below it. The effect of this is that all requests of the same priority level are of the same importance and can be shuffled or reordered, while requests of a higher level preempt all lower-priority requests. Preferably, this priority scheme does not include an “urgency” factor for the requests. With this model, the requesting agent places a request for the specified action at a particular time with a given priority. If the actuator agent is unable to fulfill that request, the requesting agent is so notified. The requesting agent is then free to raise the priority of the request or to consider other methods of achieving the goal. Thus, reasoning about the urgency of the action is left within the requesting agent, and all arbitration at the actuator level is performed on the basis of the priority of the request. [0054]
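The order-of-magnitude priority scheme can be sketched as follows; the five levels and their semantics come from the description above, while everything else (names, the explicit weighting) is an assumption for illustration:

```python
# Hypothetical sketch of actuator-level arbitration by request priority.
import heapq

class ActuatorQueue:
    def __init__(self):
        self._heap = []   # (-weight, sequence_no, action)
        self._seq = 0     # equal-priority requests stay in arrival order

    def request(self, action, level):
        assert 1 <= level <= 5, "five priority levels"
        weight = 10 ** level  # each level is an order of magnitude above the last
        heapq.heappush(self._heap, (-weight, self._seq, action))
        self._seq += 1

    def next_action(self):
        return heapq.heappop(self._heap)[2] if self._heap else None

q = ActuatorQueue()
q.request("dim_lights", level=1)
q.request("unlock_front_door", level=5)   # preempts all lower-priority requests
print(q.next_action())                    # -> unlock_front_door
```

Consistent with the description above, nothing in this sketch reasons about urgency; a requester whose request cannot be fulfilled is simply notified and may re-submit at a higher priority.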
  • An additional multiple request-related concern is one request interfering with the processing of (or “clobbering”) another request. One of the traditional methods for handling this kind of problem is to allow the agents to pass portions of plans between themselves in order to explain the rationale for the action and to reach an agreement about the actions that need to be executed. This provides the information needed for the agents to resolve any conflicts between the actions of each of their plans. In a preferred embodiment, however, a limited form of this partial plan solution is provided. In addition to a specific request from an agent, the requesting agent must specify the environment in which the request should be fulfilled. In artificial intelligence terminology, the conditions embodied by causal links between plan steps must be provided to the executing agent. The preferred system 20 does this by specifying a list of sensor agent interface queries and their return values. In effect, this provides a list of predicates that must be true before the action is performed. If the specified conditions do not hold, then the system 20 cannot honor the request and will report that fact. Note that if an agent wants to ensure that some predicate, not provided by a sensor agent interface, holds during the execution of an action request, then it can provide the sensor agent interface necessary for the action. It should further be noted that, in general, the “clobbering” concern is more relevant for actuator requests than for reasoner or sensor agents, but these requirements are preferably placed in all three classes of agent interfaces. [0055]
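The limited partial-plan mechanism (a request accompanied by sensor-interface queries and their required return values) might be sketched as follows; the function names and the sensor map are hypothetical:

```python
# Hypothetical sketch of precondition-checked actuation requests.
def execute_request(action, preconditions, sensor_query):
    """`preconditions` maps a sensor query to the value it must return;
    these are the causal-link conditions that must hold before acting."""
    for query, required in preconditions.items():
        if sensor_query(query) != required:
            return f"REJECTED: precondition '{query}' != {required!r}"
    action()
    return "OK"

sensors = {"kitchen_occupied": False, "stove_state": "on"}
print(execute_request(
    action=lambda: print("turning stove off"),
    preconditions={"kitchen_occupied": False},  # do not act while someone cooks
    sensor_query=sensors.get,
))
```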
  • The sensor integration, situation assessment, and response planning features of the system 20 architecture present distinct advancements over previous in-home monitoring systems, and allow the system 20 to provide automated monitoring, supporting, and responding to the activities (or inactivities) of the actor 28. This infrastructure provides a basis for building automated learning techniques that could generate actor-specific information (e.g., medical conditions, schedules, sensor noise, actor interests) that in turn can be used to generate better responses (e.g., notify doctors, better reminders, reduce false alarms, suggest activities). The situation assessment can be performed at a variety of levels of abstraction. For example, the system 20 can infer or assess a situation based upon stimulus-response, whereby a sensor directs an immediate response (e.g., modern security systems, motion-sensor path-lighting, or a heart rate monitor that raises an alarm if the heart rate drops precipitously). Preferably, the system 20 can “notice” and automatically control events before they actually occur, as opposed to the existing technique of simply responding to an event. This is preferably accomplished by providing the situation assessment layer with the ability to predict events based upon the potential ramifications of an existing situation, and then respond to this prediction. For example, the situation assessment layer is preferably adapted to notice that the stove is about to catch fire, and then act to turn the stove off; or turn the water heater off before the actor gets burned; etc. In addition to the above and in a preferred embodiment, the system 20 architecture is highly proactive in automatically responding to “events” (beyond responding to “alarm” situations); for example, automatically arming a security system upon determining that the actor has gone to bed, automatically locking the actor's home door upon determining that the actor has left the home, etc. [0056]
  • Preferably, explicit reasoning modules for specific behaviors are incorporated into the system 20 architecture (e.g., a tracking algorithm that calculates the user's path based on motion-sensor events), possibly projecting future states (e.g., turning on lights where the client is going, or locking the front door before the user wanders outside, or a video algorithm that recognizes faces). These modules may be a “library” of behavior recognition techniques, such as a set of functions that are explicitly designed to recognize one behavior (or a small number of behaviors). Alternatively, the system 20 architecture can be adapted such that individual agents build customized techniques for recognizing/obtaining information subtleties that are not required by other agents (e.g., a general vision agent could be configured to recognize food going into the mouth of the actor 28; a medications agent would want to know only whether an ingested pill was of a certain color and nothing more, thereby allowing the medication agent to more efficiently and effectively interact with the vision agent and implement the vision technique internally to the medication agent). Further, a “central” algorithm that weighs all likely current situations can be provided. [0057]
  • Additionally, the system 20 preferably performs condition-based monitoring that uses data from hardware-based sensors in conjunction with other information from various sources. The goals of condition-based monitoring are to provide greater accuracy for the assessment of the actor's current condition, include details with raised alarms, filter out unnecessary or inappropriate alarms, and also reduce the number of false alarms. The information that could potentially be used to perform condition-based monitoring includes: a history of the actor's activities; a profile of the environment of the actor 28; hardware-based sensor readings; information about the current state of the world, including, for example, the actor's location, the time of day, the day of week, planned activity calendar, and the number of people in the environment; information about the caregiver's activities; a prediction of the future actions of the actor or caregiver; a prediction of the future state of the world; user/caregiver/environmental patterns or profiles; actor/caregiver preferences; etc. [0058]
  • By including additional information about the actor's environment, the system 20 can evaluate the current situation with more accuracy. Based upon the current condition of the environment and the recent history of actor 28 activities, the system 20 can initiate alarms and alerts in an appropriate manner, and assign an appropriate level of urgency. For example, the system 20 may reason that a possible fall sensor event (e.g., from a hardware-based sensor) that follows a shower event (e.g., from the history of the actor's activities) has a higher probability of the actor 28 suffering an injury-causing fall than a possible fall event that occurred on a dry, level surface (e.g., from the environment model). The system 20 can also reason that a toileting reminder may be inappropriate when there are guests in the actor's environment. Such monitoring mechanisms can be used by an automated response planner to decide how to respond, including, for example, whether to actuate a device in the house (e.g., to turn on the lights), to raise an alarm/alert, to send a report, or to do nothing. The information can also be included with each alarm to better aid the caregiver in assessing the actor's well-being. Further, the preferred system 20 architecture promotes sharing of inferred states (via the intent inference layer) across multiple sensors and performing second-order sensor processing. For example, a motion sensor may indicate movement in a particular room, whereas a GPS transponder carried on the person of the actor 28 indicates that he/she is away from home. With this information, the situation assessment layer preferably reasons that either a window has been left open or there is an intruder. Based upon this second-order analysis, the system 20 architecture polls the relevant window sensor to determine whether the window is open or closed before initiating a final response plan. [0059]
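The motion-sensor/GPS example above amounts to a small piece of second-order reasoning, sketched here with hypothetical names:

```python
# Hypothetical sketch of the second-order sensor reasoning example.
def assess(motion_in_room, actor_away_per_gps, poll_window_sensor):
    if motion_in_room and actor_away_per_gps:
        # Conflicting first-order readings: disambiguate before responding.
        if poll_window_sensor() == "open":
            return "alert: window left open"
        return "alarm: possible intruder -- notify caregiver/security"
    return "no action"

print(assess(motion_in_room=True, actor_away_per_gps=True,
             poll_window_sensor=lambda: "closed"))  # -> possible intruder
```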
  • The preferred agent layering architecture of the present invention facilitates allowing third parties to incorporate not only new devices into the system 20 at any time, but also new reasoning modules. In this regard, third party reasoning modules can use new or existing devices as sensing or actuating mechanisms, and may provide information to, or use information from, other reasoning modules. To ensure that new devices and control services can coherently interact with existing devices in the particular system installation, a consolidated home ontology is provided that includes the terms of the language that devices and control services must use to communicate with one another. Thus, newly added devices or agents can find other agents within the system 20 architecture that supply information that the new device or agent is interested in. [0060]
  • As previously described, the response planning and response execution layers associated with the system 20 architecture can assume a variety of forms, some of which initiate further passive monitoring, and others that entail active interaction with the actor. In addition, the system 20 preferably incorporates smart modes or agents into the response planning layer. In general terms, the smart modes entail querying the actor as to his/her status (mental/physical/emotional), the response to which is then combined with other sensor data to make inferences and re-adjust the system behavior. Some exemplary modes include “guest present”, “vacation”, “feeling sick”, and “wants quiet” (or mute). For example, the actor 28 may indicate that she is not feeling well when she wakes up. The system 20 can then ask the actor 28 to indicate a few of her symptoms and can give the actor 28 an opportunity to specify needs (e.g., need to get some juice and chicken soup; need to make a doctor appointment; need to call caregiver; do nothing; etc.). The system 20 then uses this information to adjust its reasoning, activities, and notifications accordingly. Continuing the previous example, if the actor 28 later skips taking medications, any notifications preferably include information about the actor 28 feeling ill. If the system 20 has access to an appropriate database, it can match the actor's symptoms against the database given that it knows that the actor 28 has, for example, started a new prescription the day before (and issues alerts based upon the match if required). Further, the system 20 preferably can reduce general activity reminders; cancel appointments; reduce notification thresholds for general activities like mobility, toileting, and eating; increase reminders to drink fluids; add facial tissues and cold medicine to the shopping list; etc. Along these same lines, it is noted that the preferred system 20 reasons through multiple layers of refinement within the system. The smart mode states will act as individual pieces of information in the reasoning steps that aggregate evidence from a specific situation, a world understanding, and the smart modes themselves. This acts as a dynamic system, supporting reasoning based on an actual situation rather than a predefined sequence. FIG. 14 provides a block diagram of one example of the system 20 incorporating smart mode information. The smart mode can be an agent within the system 20 architecture, or could be within each of the domain agents. [0061]
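How a smart mode might feed back into the system's behavior can be sketched as follows; the mode names come from the description above, while the specific plan adjustments are illustrative assumptions:

```python
# Hypothetical sketch of smart-mode adjustments to system behavior.
def apply_mode(mode, plan):
    plan = dict(plan)  # do not mutate the caller's plan
    if mode == "feeling_sick":
        plan["activity_reminders"] = "reduced"
        plan["fluid_reminders"] = "increased"
        plan["shopping_list_additions"] = ["facial tissues", "cold medicine"]
        plan["notification_note"] = "actor reported feeling ill"
    elif mode == "guest_present":
        plan["toileting_reminders"] = "suppressed"
    elif mode == "wants_quiet":
        plan["audio_messages"] = "muted"
    return plan

print(apply_mode("feeling_sick", {"activity_reminders": "normal"}))
```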
  • E. Exemplary Method of Operation [0062]
  • As previously described, the system 20 layered architecture can assume a variety of forms, and can include a variety of agents (or other modules) to effect the desired intelligent environmental automation system with situation awareness and decision-making capabilities, as exemplified by the methodology described with reference to the flow diagram of FIGS. 13A-13C. As a point of reference, the method of FIGS. 13A-13C is preferably performed in conjunction with the architecture of FIG. 14, it being understood that the other architectural formats previously described are equally applicable. With this in mind, the layered, agent-based architecture of FIG. 14 is applied to an environment including multiple sensors and actuators (as identified in FIG. 14) for an actor living in a home. The exemplary methodology of FIGS. 13A-13C relates to a scenario in which the actor 28 first receives a phone call and then leaves a teakettle unattended on the actor's stove, and assumes a number of situation-specific variables. [0063]
  • Beginning at step 100, following installation of the system 20, an installer uses the “configuration” agent (akin to the “customization” agent in FIG. 3) to input information about the actor, the actor's next-door neighbor, and the actor's home. This information includes capabilities, telephone numbers, relevant alerts, and home lay-out. At step 102, this configuration information is stored in the log via the database manager (or “DB Mgr”) agent. [0064]
  • At step 104, an incoming telephone call is placed to the actor's home. At step 106, a signal from the telephone sensor (that includes a caller identification feature) goes through the “sensor adapter” agent that, at step 108, transfers it to the “phone interactions” agent. [0065]
  • At step 110, the “phone interactions” agent needs to decide whether to filter the call. To this end, the two important factors are (a) who is calling, and (b) what the actor is doing. With this in mind, at step 112, the “phone interactions” agent polls, or otherwise receives information from, the “DB Mgr” agent regarding the status of the incoming telephone number. The “DB Mgr” agent reports that the incoming phone number is the actor's next door neighbor and is thus “valid” at step 114 (as opposed to an unknown number that may be designated as “invalid”). Thus, at step 116, the “phone interactions” agent determines that the call will not be immediately filtered. [0066]
  • Because the “phone interactions” agent has determined that the phone call is from someone of interest to the actor, at step 118, the “phone interactions” agent polls, or otherwise receives information from (e.g., a cached broadcast), the “client expert” agent (or “client” agent of FIG. 3) to determine what activity the actor is currently engaged in. Simultaneous with steps 104-118, the “intent recognition” agent has been receiving broadcast sensor signals from the “sensor adaptor” agent and performing intent recognition processing of the information (referenced generally at step 119). Similarly, the “client expert” agent has been receiving, or subscribing to, resultant activity messages from the “intent recognition” agent (referenced generally at step 120). With this in mind, at step 122, the “intent recognition” agent informs the “phone interactions” agent that the actor is awake and in the kitchen where a telephone is located. [0067]
  • At step 124, the “phone interactions” agent decides not to filter the incoming call (based upon the above-described analysis). As such, the “phone interactions” agent requests the “response coordinator” agent to enunciate the phone call at step 126. In response to this request, the “response coordinator” agent polls, or otherwise receives information from (e.g., broadcasted information), the “client expert” agent for the actor's capabilities at step 128. The “client expert” agent, in turn, reports a hearing difficulty (from information previously received via the “DB Mgr” agent as indicated at step 129) to the “response coordinator” agent at step 130. At step 132, the “response coordinator” agent determines that visual cues are needed, with additional lights. [0068]
  • With all the above information in hand, and seeing no other requests for interactions and no current alarm state that might otherwise require phone call suppression, the “response coordinator” agent prompts the “PhoneCtrl” agent to let the phone ring and flash lights at step 134. It should be noted that a variety of other incoming call analyses and alerting functions could have been performed depending upon who the phone caller is, where the actor is located, and what the actor is doing. Based upon this information, the actor could be alerted in a variety of ways including messages on the television, flashing house lights, or announcing who the caller is via a speaker. [0069]
  • At step 136, the “response coordinator” agent recognizes that other devices or activities in the home may impede the actor's ability to hear the phone ring or the subsequent conversation if the house is too noisy. In light of this determination, the “response coordinator” agent, at step 138, decides to reduce other sounds in the home. For example, at step 140, the “response coordinator” agent prompts the “TV” agent to mute the television. The “TV” agent, in turn, utilizes an IR control signal (akin to a remote control) to mute the television at step 142. [0070]
  • At step 144, an air quality sensor senses smoke near the stove in the kitchen (i.e., is “triggered”), and broadcasts this information to other interested agents, including the domain agent “fire”. In response, the domain agent “fire” polls the “intent recognition” agent as to whether the actor is likely to turn off the stove at step 146. Simultaneous with previous steps, the “intent recognition” agent has received information from the “sensor adaptors” agent (similar to step 119 previously described, with this same step 119 being referenced generally in conjunction with step 146), and has determined that the actor has previously left the kitchen. With this in mind, the “intent recognition” agent determines, at step 150, that the actor is not likely to turn off the stove immediately, and reports the same to the “fire” agent at step 152. The “fire” agent, at step 154, then determines that a response plan must be generated. In this regard, at step 156, the “fire” agent recognizes that the actor's stove is an older model and does not have a device agent or actuator that could be automatically de-activated, such that a different technique must be employed to turn off the stove. [0071]
  • At step 158, the “fire” agent first determines that ventilation in the kitchen is needed. To implement this response, the “fire” agent, at step 160, requests the “response coordinator” agent to turn on the fans in the kitchen. The “response coordinator” agent, in turn, prompts the “HVAC” agent to activate the kitchen fans at step 162. [0072]
  • Simultaneous with the ventilation activation described above, the “fire” agent, at step 164, recognizes that the current level of urgency is “low” (i.e., a burning fire has not yet occurred), so that contacting only the actor is appropriate (a higher level of urgency would implicate contacting others). To implement this response plan, the “fire” agent first needs to select an appropriate device(s) for effectuating contact with the actor at step 166. In this regard, all communication devices in the home are appropriate, including the television, the phone, the bedside display, and the lights. The television and the bedside display provide rich visual information, while the phone and the lights draw attention quickly. In order to prioritize these devices, the “fire” agent polls, or otherwise receives information from (e.g., a broadcasted message), the “client expert” agent to determine where the actor is and what the actor is doing at step 168. Simultaneous with the previous steps, the “client expert” agent has been subscribing to activity messages from the “intent recognition” agent, as previously described with respect to step 120 (it being noted that FIG. 13B generally references step 120 in conjunction with step 168). Based on recent device use (i.e., the television remote and power to the television), the “intent recognition” agent reports to the “client expert” agent (e.g., the client expert has cached broadcasts of the actor's activity as determined by the “intent recognition” agent) that the actor is likely in the living room watching television. The “client expert” agent, in turn, reports this information to the “fire” agent at step 174. [0073]
  • With the above information in hand, at step 176 the “fire” agent selects the television as the best interaction device, with the lights and the telephone indicated as also appropriate, and the bedside display eliminated. Pursuant to this determination, the “fire” agent requests the “response coordinator” agent to raise an alert to the actor via one of these prioritized devices at step 178. At step 180, the “response coordinator” agent reviews all other pending interaction requests to select the best overall interaction device. Seeing that there are no other pending interaction requests, the “response coordinator” selects the television as the interaction device for contacting the actor, and prompts the “television” agent to provide the message to the actor at step 182. It should be noted that if other interaction requests are pending, the “response coordinator” agent will preferably select the best combination of interaction devices for all of the pending requests. For example, the “response coordinator” agent can choose a different interaction device for each message, or decide to display/transmit the messages on more than one interaction device. [0074]
  • Returning to the example, in response to the message, the “television” agent polls, or otherwise receives information from (e.g., a cached broadcast message from the “response coordinator” agent), the “client expert” agent as to the best way to present the message at step 184. Prior to this request, the “machine learning” agent has recognized that the actor responds more frequently to visual cues, especially when text is combined with an image. This information has been previously reported to the “client expert” agent, generally represented at step 186. With the learned information in hand, the “client expert” agent informs the “television” agent to present a message on the television screen in the form of “[Actor's name], turn off the stove.”, along with an image of a stove and a burning pan at step 188. The “television” agent prompts the television to display this message at step 189. It should be noted that a wide variety of other message presentation formats could have been selected. For example, if the actor is blind (information gleaned from the “configuration” agent and/or the “machine learning” agent) or asleep (information provided by the “intent recognition” agent), a spoken message would have been more appropriate. [0075]
  • At step 190, the “fire” agent continues to monitor what is happening in the home for combating the smoke/fire in the kitchen. At step 192, the “intent recognition” agent continues to monitor the intent of the actor and determines that the actor has not acknowledged the alert, and that there is no activity in the kitchen (via broadcasted information, or lack thereof, from sensors in the kitchen or at the television, or by polling those sensors). Once again, these determinations are based upon received broadcast sensor signals from the “sensor adaptor” agent as previously described with respect to step 119 (it being noted that reference is made to step 119 in conjunction with step 192). Thus, the “intent recognition” agent generates a reduced confidence that the actor is actually watching television, and moreover the lack of activity in the kitchen means there are no pending high-confidence hypotheses. At step 200, the “client expert” agent receives broadcasted information from, or alternatively queries, the “intent recognition” agent regarding its most likely hypotheses and recognizes that the “intent recognition” agent does not know what the actor is doing. The “client expert” agent reports this to the “fire” agent. [0076]
  • At step 202, the “fire” agent decides, based upon the above information, that the alert level must be escalated and re-issues the alert. In particular, the “fire” agent requests the “response coordinator” to utilize both a high intrusiveness device (lights preferred over the telephone), and an informational device (bedside webpad preferred over the television because there is an ongoing request for the television message, and the television message was found to not be effective). In response to this request, the “response coordinator” at step 204 recognizes that the lights and the bedside webpad do not conflict with one another, and prompts the “lights” agent and the “web” agent to raise the alert. [0077]
  • In response to this request, the “lights” agent flickers the home lights several times at step 206. Simultaneously, at step 208, the “web” agent polls, or otherwise receives information from (e.g., a cached broadcast), the “client expert” agent as to what information to present and how to present it. As previously described, the “client expert” agent has previously been informed (via step 186 as previously described and generally referenced in FIG. 13C in conjunction with step 208) that the actor responds best to combined text with images, and reports the same to the “web” agent at step 209. With this information in hand, the “web” agent prompts the “bedside display” actuator to display the message: “[Actor's name], turn off the stove,” along with an image of a stove and smoking pan at step 210. [0078]
  • Before the actor gets to the stove, the “fire” agent prepares to further escalate the alert at step 212 (following previously-described step 190 in which the “fire” agent continues monitoring the kitchen). In particular, the “fire” agent polls the “DB Mgr” agent as to whom to send an alert to at step 214. The “DB Mgr” agent informs, at step 216, the “fire” agent that the actor's next door neighbor is the appropriate person to contact. However, before the escalated alert plan is effectuated, the “intent recognition” agent is informed of activity in the kitchen, via, for example, motion sensor data, and infers from this information that the actor is responding to the fire at step 220. Once again, the “intent recognition” agent is continuously receiving signaled information from the “sensor adaptor” agent as previously described with respect to step 119 (with step 119 being generally referenced in FIG. 13C in conjunction with step 220). The “intent recognition” agent reports this change in status to the “fire” agent at step 222 (either directly or as part of a broadcasted message). In response, the “fire” agent, at step 224, does not send the escalated alert, but instead requests that the kitchen fans be deactivated (in a manner similar to that described above with respect to initiating ventilation). Finally, if the “fire” agent determines that the smoke level in the kitchen subsequently increases, the “fire” agent would initiate the escalated alert sequence via the “response coordinator” agent as previously described. [0079]
  • It will be recognized that the above scenario is but one example of how the methodology made available with the system 20 of the present invention can monitor, recognize, support, and respond to activities of the actor 28 in daily life. The “facts” associated with the above scenario can be vastly different from application to application, and a multitude of completely different daily encounters can be processed and acted upon in accordance with the present invention. [0080]
  • F. Alternative Controller Hardware Configurations [0081]
  • As previously described with respect to FIG. 1, the controller 22 can be provided in multiple component forms. In this regard, the system 20 architecture combines information from a wide range of sensors and then performs higher level reasoning to determine if a response is needed. FIG. 15 illustrates an exemplary hardware architecture for an alternative system 320 in accordance with the present invention that includes an in-home processor called the “home controller” 322 and a processor outside the home called the “remote server” 324. The home controller 322 has all of the hardware interfaces to talk to a wide range of devices. The remote server 324 has more processing, memory, and communication resources to perform higher level reasoning functions. [0082]
  • The home controller 322 preferably includes a number of different hardware interfaces to talk to a wide range of devices. A client (or actor) interface communicates with devices that the actor uses to interact with the system. These devices could be as simple as a standard telephone or as complex as a web-browser-enabled device such as a PDA, the “WebPad” available from Honeywell International, or other similar devices. [0083]
  • The home controller 322 preferably further includes a telephone interface so that the system 320 can call out in emergency situations. The phone interface can be standard wired or cell based. If enabled to allow incoming calls, this telephone interface can also be used to access the system remotely, for example if a caregiver wanted to check on an actor's status from outside the home using a touch tone phone. [0084]
  • A preferred actuator interface in the home controller 322 talks to devices in the actor's environment that may be controlled by the system 320, such as thermostats, appliances, lights, alarms, etc. The system 320 can use this interface to, for example, turn on a bathroom light when the actor gets up in the middle of the night to go to the bathroom, turn off a stove, control thermostat settings, etc. [0085]
  • Preferred sensor interface(s), such as wired or RF-based, take in information from a wide range of available sensors. These sensors include motion detectors, pressure mats, and door sensors that can help the system determine an actor's location and activity level. This interface can also talk to more specialized sensors such as a flush sensor that detects when a client uses the bathroom. An important class of sensors that communicate without requiring hardwiring are wearable sensors such as panic button pendants or fall sensors. These sensors signal that the actor needs help immediately. Alternatively or in addition, a number of other sensors, as previously described, can also be implemented. [0086]
  • In one preferred embodiment, the processor of the home controller 322 can do some sensor aggregation and reasoning. This low-level reasoning is reactive type reasoning that ties actions closely to sensors. Some of this type of reasoning includes turning on lighting based on motion sensors and calling emergency medical personnel if a panic button is pushed. Most sensor data is passed on to the remote server for higher level reasoning. [0087]
  • The remote server 324 does situational reasoning to determine what is happening in the actor's environment. Situations include everyday activities like eating and sleeping as well as emergency situations, for example if the actor has not gotten out of bed by a certain time of day. A preferred response planner in the remote server 324 then plans a response to the situation if one is required. If the response uses an actuator, a message is preferably sent back to the home controller 322 and out to the device through the actuator interface. If a response requires interaction with the actor, a message is sent to the home controller 322 and routed out through the actor interface. [0088]
  • The remote server 324 preferably further includes a database of contact information for responses that require contacting someone. This database includes names, phone numbers, and, possibly, e-mail addresses of people to be contacted and the situations for which they should be contacted. [0089]
  • A single remote server 324 can support a large number of independent environment installations or a large number of individual living environments in an institutional setting. The remote server 324 can provide other web-based services to an actor including, for example, online news services, communications, online shopping, entertainment, linking to other system 320 users as part of a web-based community, etc. [0090]
  • The remote server 324 provides remote access to the system 320 information. Using this preferably web-based interface, caregivers and family members can check on the actor's status from any web-enabled device on the Internet. This interface can also be used when a response plan calls for contacting a family member or caregiver and the actor's contact information says they should be contacted by e-mail. Further interface scenarios preferably incorporated into the system 320 architecture/hardware include allowing information to be pushed or pulled by service providers (e.g., service providers are able to review medical history, repair persons are able to confirm particular brand and model numbers of appliances needing repair, etc.). [0091]
  • The communications between the home controller 322 and the remote server 324 can use regular phone lines, cell phones, or broadband for high information throughput. Generally speaking, lower bandwidth/throughput requires more processing power in the actor's environment. [0092]
  • Another alternative hardware architecture 360 configuration, shown in FIG. 16, has the same general functions, but puts all of the processing in the actor's environment. This requires either at least a second processor in the actor's environment to do the higher level reasoning or additional processing and memory resources in the controller at the actor's environment. Either way, the situation assessment and response planning functions are now performed inside the home. Notably, the situation assessment and response planning functions can be performed by separate controllers located in the actor's environment. Regardless, for this architecture, remote access can be accomplished, for example, either through a standard phone interface or by connecting the processor to the Internet. [0093]
  • FIG. 17 depicts yet another alternative configuration of a system 420 in accordance with the present invention in the form of a single, self-contained, easily configured, low-cost box. The system 420 combines a small set of sensors and actuators in a single box with a telephone connection (or other mechanism for contacting the outside world). For example, the sensor suite can include a smoke detector, a carbon monoxide detector, a thermometer, and a microphone, while the actuator or effector suite can include a motion-activated light for path lighting, a speaker, and a dial-out connection. With this design, a user (not shown) installs the system 420 so the motion detector can sense the movement of people within the room, indicates what room the device is in, and plugs the device into wall power and phone lines. The system 420 gathers sensor data and periodically relays that data to a server through a dial-up connection. The server can store gathered data for later review by the actor or others such as caregivers. Preferably, the system 420 of FIG. 17 is also capable of on-site reasoning about crises (e.g., panic alert, environmental problems, etc.) and can call caregivers or a monitoring station to alert them of a problem. Thus, the system 420 of FIG. 17 can apply a control signal from the local site (e.g., by asking the actor if he/she is okay, turning on the path light, dialing for help, etc.), and can alter its own behavior based on learning, rather than relying on remote reasoning. FIG. 18 illustrates yet another, but similar, alternative system 430 configuration in which no “on-board” sensors are provided. Instead, external sensors interface with the system 430 via an RF link. With either of the systems 420, 430, sensors can be provided that are adapted to perform local reasoning (e.g., a video camera that finds moving objects and provides corresponding coordinates and movement vectors). [0094]
  • Finally, FIGS. 19-21 illustrate other alternative configurations of systems 440, 450, and 460, respectively, in accordance with the present invention in a user-wearable form. [0095]
  • G. Conclusion [0096]
  • In conclusion, the system and related method of operation of the present invention can, unlike any other system previously considered, independently and intelligently monitor and recognize the behavior of an actor, and preferably further support and respond to that behavior. The preferred system 20 installation includes a controller, sensing components/modules, supportive components/modules, and responsive components/modules. The controller can be one or more processing devices that may be centralized in or out of an area of interest, or distributed in or out of the area of interest. The controller device(s) serve to gather, store, and process sensor data and perform the various reasoning algorithms required for determining actor status and needs, generating response plans, and executing the response plans via the various actuator/effector and interaction devices available for the actor, the actor's environment, and/or caregivers. Preferably, the controller further includes data tracking, logging, and machine learning algorithms to detect trends and individual behavior patterns across collected data. The sensing components/modules include one or more sensors deployed throughout the area of interest in conjunction with related modules for receiving and interpreting sensor data. The supportive components/modules include one or more actuation and control devices in the area of interest. Further, one or more interaction devices, available to the actor and/or the actor's caregiver, are provided. To this end, the system and method is preferably capable of using existing interaction devices such as telephones, televisions, pagers, and web-enabled computers. The responsive components/modules include one or more sensors deployed throughout the area of interest, preferably along with actuation, control, and interaction devices. [0097]
  • The system and method of the present invention can provide a number of application-specific features, including (but not limited to) those set forth below: [0098]
  • Safety (Fires, Burns, Poisoning, etc.) [0099]
  • Monitor air quality. [0100]
  • Alert actor and caregiver about air quality changes if potential for danger is detected. [0101]
  • Alert Emergency Medical Services (EMS) and caregivers if critical air quality danger exists. [0102]
  • Automatically activate ventilation and air filters (air conditioners). [0103]
  • Automatically shut off the source of the problem (e.g., furnace, stove, heater). [0104]
  • Sensor(s) and locks placed on cabinets storing dangerous household chemicals. [0105]
  • Alerting system if unauthorized user opens cabinet. [0106]
  • Detect choking sounds, vomiting, or changes in actor's vital signs (such as respiration rate, pulse rate, blood pressure, high/low blood glucose, blood ketones, etc.). [0107]
  • Assess risk to actor and adjust system's sensitivity for detecting fires (e.g., if cigarette smoking is detected near an oxygen device, system would provide a warning). [0108]
  • Monitor heating system, space heaters, fireplaces, chimneys, and appliances (especially stove, oven, toasters, grills, microwaves) and provide alerts if unusual situation occurs. [0109]
  • Diagnostics of electrical wiring, smoke alarm battery, etc., and provide battery replacement reminders. [0110]
  • Provide exit path guidance with signs, lighting, auditory instructions, etc. [0111]
  • Contact caregivers if dangerous situation detected and emergency help if critical situation occurs. [0112]
  • A panic-button-type device that is worn by the actor and can be used to summon help. [0113]
  • Similar to the known "smart medicine cabinet", a smart chemical cabinet that carefully dispenses chemicals for cleaning, etc. [0114]
  • Medical Monitoring [0115]
  • Monitor bathroom use and combine with other activity information to infer conditions like dehydration, etc. [0116]
  • Communicate with smart medical devices to gather and analyze medical data and make overall health inferences. [0117]
  • Provide initial training, reminders, and/or step-by-step instructions on how to use medical devices. [0118]
  • Provide reminders for actor to use installed medical equipment. [0119]
  • Provide easy method for actors to enter medical information into system for trending and analysis. [0120]
  • Provide easy method that caregiver can enter medical and care information. [0121]
  • Provide caregiver task tracking capability to coordinate efforts of multiple caregivers. [0122]
  • Provide dedicated caregiver information exchange UI facility. [0123]
  • See the Eating, Medication, Safety, Mobility, Toileting, and multiple-caregiver coordination features for additional relevant technology opportunities. [0124]
  • Activity & Functional Assessments [0125]
  • Measurement of ADLs (activities of daily living). Incorporates most of the other functions (notably mobility), but also ability to do laundry, etc. [0126]
  • Visual observation of mobility in the environment. Control of camera to provide a view that would enable assessment of walking, transferring, shaking/reflexes, condition of skin/limbs/arms/legs, etc. [0127]
  • Facilitate administration of functional assessments like the Folstein Mini-Mental Status exam and various functional assessment tools used by interviewers. [0128]
  • Functional database for an actor. [0129]
  • Creative questioning and game playing to determine activity engagement, functional status, etc. [0130]
  • Taking measurements (weight, phone use (frequency), water use (to detect bathing), kitchen activities, walking (rate, gait, pause after standing), night activity). [0131]
  • Mobility [0132]
  • Obstacle detection (to warn actor). [0133]
  • Pathway lighting. [0134]
  • Exercise facilitation (regular exercise reduces risk of falling). [0135]
  • Increased monitoring sensitivity based on actor's medical conditions (e.g., if it is known that the actor has had a recent prescription change, increase system sensitivity for fall monitoring); a minimal sensitivity-adjustment sketch follows this list. [0136]
  • Increased monitoring sensitivity based on activities or environmental conditions (e.g., seemingly minor everyday stresses, such as postural change, eating a meal, or an acute illness may result in hypotension and therefore, increased risk of falling). [0137]
  • System initiated contacting of medical and/or family members upon a fall. [0138]
  • A panic-button-type device that is worn by the actor and can be used to summon help. [0139]
  • Detect number of people in home. [0140]
  • Track actor's motion, recognize gait, predict problems (obstacles, falls). Recognize changes over time. [0141]
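The sensitivity-adjustment sketch referenced above might look as follows; the risk factors and multipliers are illustrative assumptions, not values from the patent.

```python
def fall_monitor_sensitivity(base: float, recent_rx_change: bool,
                             postural_change: bool, acute_illness: bool) -> float:
    # Raise fall-monitoring sensitivity for known risk factors, e.g., a recent
    # prescription change or everyday stresses associated with hypotension.
    sensitivity = base
    if recent_rx_change:
        sensitivity *= 1.5
    if postural_change or acute_illness:
        sensitivity *= 1.25
    return min(sensitivity, 1.0)  # clamp to the maximum sensitivity

print(fall_monitor_sensitivity(0.5, recent_rx_change=True,
                               postural_change=True, acute_illness=False))
```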
  • Caregiver Burnout [0142]
  • Support remote monitoring of activities and behavior; monitor activity levels and environmental parameters. [0143]
  • View video images of actor. [0144]
  • Show trends (activity, appliance use, visitors, phone calls). [0145]
  • Support remote communication that serves as an equivalent surrogate for personal visits (reducing burden and isolation). [0146]
  • Coordinated to-do lists for caregivers. [0147]
  • Daily activity reminders to actor (to keep actor from calling the caregiver). [0148]
  • Daily activity instructions to actor. [0149]
  • Resource guide of elderly-support services (e.g., dinner-delivery, in-home healthcare, or informational web pages). [0150]
  • Customize information content/delivery to caregivers' concerns. [0151]
  • Support user-initiated customization of information and contact requests (e.g., call/page/email me if the care recipient does not get up by 8 a.m. on the day of an appointment). [0152]
  • Define information that is interesting to the caregivers (e.g. stovetop temperature, front door activity, etc.). [0153]
  • Automatic generation of a caregiver to-do list. [0154]
  • Facilitate caregiver support groups via the internet. [0155]
  • Provide Flexible Access [0156]
  • Leverage automated user interface generation capability (IDS) to deliver content to caregiver across multiple platforms and modalities (PC browser, PDA browser, WAP phone, phone). [0157]
  • Customize user interface presentations according to the actor's capabilities. [0158]
  • Learn user interface effectiveness to adapt presentations in accordance with the actor's preferences. [0159]
  • See also Dementia, medical monitoring. [0160]
  • Medication Management [0161]
  • Provide easy method by which actor, caregiver, or medical practitioner can add new medications. [0162]
  • Provide easy method that actor, caregiver or medical practitioner can enter medical information. [0163]
  • Provide preprogrammed database of drugs and their possible Adverse Drug Reactions (ADRs). [0164]
  • Provide reminders of time to take drugs, their dosage, and how they should be taken (e.g., with food?). [0165]
  • Provide an automated dispenser to track drugs taken and monitor time taken. [0166]
  • Alert actor and caregiver if new and current drugs will cause an ADR, if a new drug duplicates a current one, whether a new drug is necessary, if drug duration and dosage are abnormal, or if there is a better alternative drug (e.g., fewer side effects, less expensive); a minimal interaction-check sketch follows this list. [0167]
  • Alert caregiver and/or EMS if possible ADR has taken place. [0168]
  • Monitor on-site inventory of medication and automatically re-order, or issue a reminder to re-order, when appropriate. [0169]
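The interaction-check sketch referenced above shows the kind of duplication and ADR lookup a preprogrammed drug database could support; the ADR_PAIRS table and drug names are hypothetical examples.

```python
# Hypothetical ADR table; a real system would ship a preprogrammed drug database.
ADR_PAIRS = {frozenset({"warfarin", "aspirin"}): "increased bleeding risk"}

def check_new_drug(new_drug: str, current_drugs: set) -> list:
    alerts = []
    if new_drug in current_drugs:
        alerts.append(f"{new_drug} duplicates a current medication")
    for current in current_drugs:
        # Look up the unordered drug pair in the interaction table.
        reaction = ADR_PAIRS.get(frozenset({new_drug, current}))
        if reaction:
            alerts.append(f"possible ADR with {current}: {reaction}")
    return alerts

print(check_new_drug("aspirin", {"warfarin", "metformin"}))
```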
  • Cognitive Disorders (Dementia, Depression, etc.) [0170]
  • Task prompts or step-by-step instructions. [0171]
  • Query dialog to ease disorientation or loss of situation awareness (e.g., Actor: "Is someone else in the house?" System: "You are alone in the house."). [0172]
  • Monitor activities to detect signs of depression (e.g., sleep patterns, amount of overall activity, changes in appetite, changes in voice). [0173]
  • Administration of standardized instruments for depression assessment (GDS, CDES), or system communicates with caregiver to set up a healthcare professional to administer. [0174]
  • Monitor activities to detect signs of dementia onset or worsening (e.g., forgetting to do things system or others have suggested (STM), forgetting appointments (LTM), Sundowning (see wandering), Hallucinations (see hallucinations)). [0175]
  • Administration of standardized instruments for dementia assessment (RIL, Molloy et al., 1999), or system communicates with caregiver to set up a healthcare professional to administer. [0176]
  • Assess changes in actor's behavior such as those listed in Kolanowski, 1994 (e.g., aggressive psychomotor behavior such as hitting, kicking, pushing, scratching, assaultiveness). [0177]
  • Increased monitoring sensitivity and/or increased offloading of caregiver responsibilities based on actor's level of dementia (degradation in the care recipient is correlated with increased caregiver burden; cf. the Zarit or Montgomery caregiver burden assessment tools). [0178]
  • Education and training about stages of dementia, what to expect, how to handle behavior, resources available, how to reduce stress, etc. [0179]
  • Detect confusion. [0180]
  • Detect agitation. [0181]
  • Trend memory. [0182]
  • Trend toileting. [0183]
  • Eating [0184]
  • Track food for expiration dates and advise actor to dispose of food if too old (a minimal tracking sketch follows this list). [0185]
  • Store basic list of groceries and automatically order new products or add them to an automatic grocery list once the item is used. [0186]
  • Automatically generate shopping list based on meal planning/nutritional goals. [0187]
  • Track nutritional value of meals, and alert caregiver and actor if eating inappropriately. [0188]
  • Monitor food degradation (e.g., if meat has been defrosted in microwave and not cooked immediately, or if meat is out for longer than 2 hours at room temp). [0189]
  • Monitor cooking progress/success (e.g., temperature and time in oven to determine whether food is cooked). [0190]
  • Monitor storage conditions (fridge and freezer temperatures to ensure food is cold enough). [0191]
  • Track schedule of food delivery and alert caregiver/actor/care organization if food delivery does not arrive. [0192]
  • Allow caregiver remote access to actor's shopping list. [0193]
  • Allow for shopping online by the actor or caregiver to alleviate stress or time associated with shopping. [0194]
  • Alert caregiver or actor of store events, sales on merchandise (e.g., coupons, senior specials). [0195]
  • Monitor appliance use, alert and/or control unsafe conditions. [0196]
  • Learn what the actor prefers to eat, and present sample recipes/menus based upon these preferences and other factors such as complexity of preparation as compared to the actor's abilities, what food is available, actor's nutritional needs, etc. [0197]
  • Monitor appliance use. [0198]
  • Suggest menus in keeping with available food, with balanced diet, and within dietary and medication constraints. [0199]
  • Provide instructions on meal preparation. [0200]
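The tracking sketch referenced above shows expiration tracking feeding an automatic shopping list; the pantry contents and dates are invented for the example.

```python
from datetime import date

# Hypothetical pantry inventory mapping item names to expiration dates.
pantry = {"milk": date(2003, 1, 8), "bread": date(2003, 1, 20)}
shopping_list = []

def review_pantry(today: date) -> None:
    for item, expires in list(pantry.items()):
        if expires < today:
            print(f"advise actor to dispose of {item} (expired {expires})")
            del pantry[item]
            shopping_list.append(item)  # re-add expired staples automatically

review_pantry(date(2003, 1, 10))
print("shopping list:", shopping_list)
```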
  • Transportation [0201]
  • Allow for easy communication with transport services. [0202]
  • Facilitate access to transport schedules. [0203]
  • Alert actor or caregiver if transportation is a problem. [0204]
  • Provide information about local transportation resources. [0205]
  • Isolation [0206]
  • Provide regular interaction with the actor via means that are normally associated with guests, friends, family, etc. (e.g., phone calls and e-mails). [0207]
  • Provide social interaction such as “reading” to actor (i.e., playing books on tape). [0208]
  • Facilitate ways in which actor can continue to get social contact from external sources like video phone interaction with doctors, calling in a daily/weekly shopping list to a human, ordering supplies via phone rather than web, etc. [0209]
  • Create a system community in which all system users can interact with one another via the web, video gatherings, phone. [0210]
  • Show pictures from the familiar past to positively reinforce the actor and help with social isolation. [0211]
  • Instigate game playing with the actor. [0212]
  • Alert caregiver if the actor is alone for “too long”. [0213]
  • Provide “social” interactions between the system and the actor (e.g., ask social or friendly-type questions and reply to actor's response). [0214]
  • Facilitate on-line shopping. [0215]
  • Con detection (call filtering, door-to-door salespeople). [0216]
  • Managing Money [0217]
  • Electronic banking with automated bill payments and account balancing. [0218]
  • Formation of a bill to-do list to assist the caregiver who manages finances (e.g., the list might include vendors and amounts due along with funds availability information). [0219]
  • Scan phone communications for release of personal info that may indicate response to solicitation. [0220]
  • Monitor credit card bills and check payments for unusual expenditures. [0221]
  • Provide information about local financial management resources. [0222]
  • Checking account interlocks to prevent payments to unauthorized persons or organizations. [0223]
  • Visitor screening to deter door-to-door solicitors. [0224]
  • Support regular social contact to reduce sense of isolation, since isolation is a key reason elders talk to solicitors. [0225]
  • Toileting and Incontinence [0226]
  • Monitor toileting frequency. [0227]
  • Alerts to actor/caregivers. [0228]
  • Reports/Notifications/Reminders to elders/caregivers. [0229]
  • Provide reminders to use the bathroom. [0230]
  • Provide path lighting and obstacle detection for nighttime movement between bedroom and bathroom. [0231]
  • Increased monitoring sensitivity based on actor's medical conditions (e.g., if known that actor has reduced sensation, increase system sensitivity for urination outside bathroom and/or prompts to wear/change diapers). [0232]
  • Reminders and assistance with exercises. [0233]
  • Housework [0234]
  • Detect clutter to suggest clean up. [0235]
  • Detect air quality (look for molds, spores, bacteria). [0236]
  • Remind caregiver or actor to clean. [0237]
  • Detect smells on clothes. [0238]
  • Remind actor or caregiver of washing if not performed regularly. [0239]
  • Provide a washing schedule based on usage of clothes. [0240]
  • Provide information about local housekeeping resources. [0241]
  • Task prompts or step-by-step instructions. [0242]
  • Shopping Assistance [0243]
  • Allow caregiver remote access to actor's shopping list. [0244]
  • Allow for shopping online by the actor or caregiver to alleviate stress or time associated with shopping. [0245]
  • Maintain a schedule for when to go shopping. [0246]
  • Maintain a basic shopping list and track when supplies are low. [0247]
  • Facilitate the development of a shopping list. [0248]
  • Alert caregiver or actor of store events, sales on merchandise, etc. [0249]
  • Pressure Sores [0250]
  • Provide reminders to use bathroom. [0251]
  • Monitor for urine moisture. [0252]
  • Provide reminders to change clothing, wash clothing and sheets if moisture detected. [0253]
  • Monitor position and movement changes. [0254]
  • Provide reminders to change position and suggestions for new positions. [0255]
  • Using Equipment [0256]
  • Omni-directional signal reception (e.g., no matter which way the actor holds a selected remote control, it will control the proper device). [0257]
  • One-way ergonomic design (a hardware design that makes it clear there is only one way to hold the remote). [0258]
  • Task prompts and cues (keys light up in order as cue for entry sequence). [0259]
  • Voice command controls. [0260]
  • Alcohol Abuse [0261]
  • Fit alcoholic drinks with usage caps that monitor how often they are opened (as is done with certain drug monitoring) and record this information. [0262]
  • Provide sensors for cabinets. Since most people store their alcohol in one area, this can give a rough estimate of how often it is used. [0263]
  • Provide warning messages if actor has recently used alcohol and is about to take medication. [0264]
  • Provide warnings if consumption is approaching dangerous levels. [0265]
  • Send message via phone or e-mail to caregivers if alcohol misuse is detected (unconsciousness, falls, malnutrition). [0266]
  • Breath tests. [0267]
  • Wandering [0268]
  • Infer whether it is OK to leave the house (e.g., check actor's schedule before they leave the house). [0269]
  • Interact with actor before they exit to try to “snap them out of it”. [0270]
  • Contact caregiver in the event that the actor is suspected of wandering. [0271]
  • Door mat sensor and door sensor can indicate a potential exit by actor (outside door mat sensor and doorbell or acoustic sensor listening for a knock can confirm/disconfirm that the actor is not simply answering the door). [0272]
  • Check actor's schedule to see if exit is expected. [0273]
  • Check behavioral pattern to see if exit is expected or unusual (e.g., exiting at 3 a.m.); a minimal wandering-check sketch follows this list. [0274]
  • If system is sure they are wandering, stall them until a caregiver can arrive. [0275]
  • Inform actor if there is inclement weather; if actor leaves anyway contact caregiver. [0276]
  • If keys are RF-tagged, confirm that actor has keys (if so automatically lock the door; if not, may depend on facial or voice recognition when actor returns to actuate door lock). [0277]
  • Wandering switch—if leaving on purpose, actor actuates a switch at door to indicate leaving house. If not switched, front gate locks to prevent departure and contain wandering path within home territory. Notify caregiver that actor is outside if outdoor conditions are adverse. [0278]
  • Detect and report entering/leaving the house. [0279]
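The wandering-check sketch referenced above might combine a schedule lookup with a time-of-day pattern test; the schedule format, nighttime window, and return labels are illustrative assumptions.

```python
from datetime import datetime, time

def exit_is_expected(now: datetime, schedule: list) -> bool:
    # Check the actor's schedule to see if an exit is expected right now.
    return any(start <= now.time() <= end for start, end, _ in schedule)

def assess_exit(now: datetime, schedule: list,
                night_start: time = time(22, 0),
                night_end: time = time(6, 0)) -> str:
    if exit_is_expected(now, schedule):
        return "ok"
    t = now.time()
    if t >= night_start or t <= night_end:  # unusual pattern, e.g., 3 a.m.
        return "suspected_wandering"        # stall actor, contact caregiver
    return "unexpected_but_daytime"         # interact with actor first

schedule = [(time(9, 0), time(10, 0), "senior center")]
print(assess_exit(datetime(2003, 1, 10, 3, 0), schedule))
```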
  • Depression Detection and Intervention [0280]
  • By monitoring actor behavior, especially elder behavior, over time, the system may be able to detect the onset of depression and other user mental states. Changes in sleep patterns, eating patterns, activity level, and even vocal qualities can provide an indication that the actor is becoming depressed. If the actor is exhibiting declining trends in any or all of these parameters, the system can administer a brief assessment (such as the Geriatric Depression Scale coupled with the Mini Mental State Exam) via the phone, webpad, or television to confirm the presence of depression. Since social isolation is a common component of depression in the elderly, the system can also be adapted to intervene by sending a message to a neighbor or friend telling them it may be a good time to stop by for a visit. The system can also help by providing wider communications access for the actor, for example connecting the actor with their favorite chat room at the appropriate time. A minimal trend-screening sketch follows the symptom list below. [0281]
  • Some Symptoms of Depression In The Elderly: [0282]
  • Depressed or irritable mood. [0283]
  • Loss of interest or pleasure in daily activities. [0284]
  • Temper, agitation. [0285]
  • Change in appetite, usually a loss of appetite. [0286]
  • Change in weight. [0287]
  • Weight loss (unintentional). [0288]
  • Weight gain (unintentional). [0289]
  • Difficulty sleeping. [0290]
  • Daytime sleepiness. [0291]
  • Difficulty falling asleep or staying asleep (insomnia). [0292]
  • Fatigue (tiredness or weariness). [0293]
  • Difficulty concentrating. [0294]
  • Feelings of worthlessness or sadness. [0295]
  • Memory loss. [0296]
  • Abnormal thoughts, excessive or inappropriate guilt. [0297]
  • Abnormal thoughts about death. [0298]
  • Excessively irresponsible behavior pattern. [0299]
  • Thoughts about suicide. [0300]
  • Plans to commit suicide or actual suicide attempts. [0301]
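The trend-screening sketch referenced above compares recent weekly averages of daily parameters (sleep, meals, activity) against long-term baselines and triggers a brief assessment when any parameter declines; the threshold and data are invented for illustration.

```python
def declining(values: list, threshold: float = 0.8) -> bool:
    # Flag a parameter whose recent weekly average drops below
    # `threshold` times its long-term baseline.
    baseline = sum(values[:-7]) / max(len(values) - 7, 1)
    recent = sum(values[-7:]) / 7
    return recent < threshold * baseline

def screen(history: dict) -> bool:
    # `history` maps parameters (sleep hours, meals eaten, activity level)
    # to daily measurements; trigger an assessment if any parameter declines.
    return any(declining(v) for v in history.values() if len(v) > 7)

history = {"sleep_hours": [7.5] * 21 + [5.0] * 7, "daily_meals": [3.0] * 28}
if screen(history):
    print("administer brief assessment via phone, webpad, or television")
```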
  • Hallucinations & Delusions [0302]
  • Help actor understand that they are not in any danger, then call appropriate parties. [0303]
  • If system detects agitation, it could ask what is wrong, scan the house for signs of an intruder, and reassure the actor that there is no one in the house. [0304]
  • The system can then call a designated caregiver who will intervene to calm the actor. [0305]
  • System can log the event. [0306]
  • Application of Snoezelen Technique [0307]
  • A sensory stimulation technique that has been successful in calming children via multi-sensory stimulation. Indications are that this technique is effective in reducing agitation in those suffering from dementia. Applications include light therapy, essential oils, a soft chair, wind chimes, lava lamps, etc. While having a full Snoezelen room may not be practical, applying these techniques in part in the room of the agitated actor might help reduce agitation until a caregiver can intervene. [0308]
  • Usability [0309]
  • Operational modes (night/day, guests/alone, etc.). [0310]
  • Password-free elder interactions. [0311]
  • Function muting (e.g., turn off the toileting functions today). [0312]
  • Sensor muting (e.g., ignore sensor 3 today). [0313]
  • Better display screens (e.g., an easy-to-read security panel). [0314]
  • Suggest appropriate attire to the actor before the actor leaves the home. [0315]
  • Sleeping [0316]
  • Track sleeping habits. [0317]
  • Assess current sleeping habits against previous sleeping habits. [0318]
  • Assess sleeping habits based upon recommended sleep traits. [0319]
  • Identify sleep problems. [0320]
  • The present invention provides a marked improvement over previous designs. In particular, the system and method of the present invention incorporate a highly flexible architecture/agent construct that is capable of taking input from a wide variety of sensors in a wide variety of settings, mapping those inputs to a set of events of interest, reasoning about desired responses to those events of interest, and then accomplishing those responses on a wide variety of effectors in a wide variety of settings; a minimal event-mapping sketch follows. [0321]
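The event-mapping sketch referenced above normalizes raw sensor readings into a small vocabulary of "events of interest" that downstream reasoning could consume; the rule table and event names are invented for illustration.

```python
# Hypothetical mapping layer from raw readings to events of interest.
EVENT_RULES = [
    (lambda r: r["type"] == "motion" and r["value"], "occupant_moving"),
    (lambda r: r["type"] == "smoke" and r["value"], "smoke_detected"),
    (lambda r: r["type"] == "door" and r["value"] == "open", "door_opened"),
]

def to_events(readings: list) -> list:
    events = []
    for reading in readings:
        for rule, event in EVENT_RULES:
            if rule(reading):
                events.append(event)  # reading matched an event of interest
    return events

readings = [{"type": "smoke", "value": True}, {"type": "door", "value": "open"}]
print(to_events(readings))  # -> ['smoke_detected', 'door_opened']
```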

Claims (55)

What is claimed is:
1. An automated monitoring and support system for an actor in an environment, the system comprising:
a sensor;
an effector; and
a controller for receiving information from the sensor to monitor at least one of the actor and the environment and for controlling operation of the effector based upon the monitored information, the controller including:
a situation assessor for determining a current situation based upon the sensor information,
a response planner for automatically generating an appropriate current response plan to the current situation as determined by the situation assessor,
a plan executor for prompting the effector to execute the generated current response plan.
2. The system of claim 1, wherein the system includes a plurality of different sensors each providing sensor information to the controller, and further wherein the situation assessor is adapted to selectively evaluate all sensor information.
3. The system of claim 2, wherein the situation assessor is further adapted to aggregate information from at least two of the sensors to designate a single event as occurring.
4. The system of claim 3, wherein the response planner is further adapted to rely upon the designated single event in generating an appropriate response.
5. The system of claim 1, wherein the situation assessor is further adapted to determine the current situation based upon at least one item selected from the group consisting of an activity of the actor, an intended activity of the actor, an inactivity of the actor, the status of the actor, a future status of the actor, a status of the environment, and a future status of the environment.
6. The system of claim 5, wherein the situation assessor is further adapted to consider periods of user inactivity in determining the current situation.
7. The system of claim 1, wherein the response planner is further adapted to coordinate a plurality of possible response plans.
8. The system of claim 7, wherein the response planner is further adapted to prioritize the plurality of possible response plans.
9. The system of claim 1, wherein the system includes a plurality of effectors, and further wherein the response planner is further adapted to select one or more of the plurality of effectors to execute the current response plan.
10. The system of claim 1, wherein the response planner is further adapted to consider previous reactions of the actor to previous response plans in generating the current response plan.
11. The system of claim 1, wherein the situation assessor and the response planner are further adapted to monitor a response of the actor to the current response plan following execution by the plan executor.
12. The system of claim 1, wherein the system includes a plurality of effectors, and further wherein the plan executor is further adapted to control operations of each of the plurality of effectors.
13. The system of claim 12, wherein the plan executor is adapted to coordinate operation of the plurality of effectors.
14. The system of claim 1, wherein the controller further includes a machine learning device adapted to establish a behavioral model of at least one of the actor and the environment, and further wherein at least one of the situation assessor and the response planner is further adapted to utilize information from the machine learning device in determining the current situation and generating a current response plan, respectively.
15. The system of claim 1, wherein the system includes at least one sensor adapted to provide information relating to the actor and at least one sensor adapted to provide information related to the environment, and further wherein the situation assessor is further adapted to process the actor-related information and the environment-related information.
16. The system of claim 1, wherein the situation assessor, the response planner, and the plan executor are provided as a layered architecture.
17. The system of claim 16, wherein the controller further includes modules adapted to provide protocol constraints relating to designated subject matters, and further wherein the situation assessor, the response planner, and the plan executor define categories of capabilities made available to each of the modules by the controller.
18. The system of claim 17, wherein the controller further includes a first subject matter module and a second subject matter module each adapted to process information within at least one of the architecture layers.
19. The system of claim 18, wherein the first and second subject matter modules are communicatively linked.
20. The system of claim 18, wherein the first subject matter module is adapted to perform situation assessment operations related to a first domain subject matter and the second subject matter module is adapted to perform situation assessment operations relating to a second domain subject matter.
21. The system of claim 20, wherein the first and second subject matter modules are adapted to perform response planning operations relating to the first and second domain subject matters, respectively.
22. The system of claim 21, wherein the controller further includes a plan coordination module adapted to process response plans generated by the first and second subject matter modules.
23. The system of claim 21, wherein the controller further includes an intent recognition module adapted to determine an intent of the user, and further wherein the first and second subject matter modules utilize information generated by the intent recognition module in performing respective situation assessment and response planning operations.
24. The system of claim 21, wherein the controller is adapted to communicatively link a third, newly added domain subject matter module with the first and second subject matter modules.
25. The system of claim 19, wherein the first subject matter module is further adapted to perform response planning operations relating to a first subject matter utilizing information from the second subject matter module.
26. The system of claim 25, wherein the second subject matter module relates to actor information and the first subject matter module is adapted to process information relating to a subject matter selected from the group consisting of fire safety, home security, medication management, telephone interaction, eating, mobility, cognitive disorders, and toileting.
27. An automated monitoring and support system for an actor in an environment, the system comprising:
a sensor located in the environment;
an effector adapted to interface with at least one of the actor and the environment; and
a controller adapted to provide a layered architecture including a sensing layer for receiving information from the sensor, a situation assessment layer for determining a current situation of at least one of the actor and the environment based upon the sensor information, a response planning layer for automatically generating a response plan to the determined situation, and a plan execution layer for prompting the effector to execute the generated response plan, the controller further including:
a first domain subject matter module operating across at least the sensing, situation assessment, and response planning layers, the first domain subject matter module adapted to process information relating to a first subject matter and generate a response plan relating to the first subject matter.
28. The system of claim 27, wherein the first domain subject matter is selected from the group consisting of fire safety, home security, medication management, telephone interaction, eating, mobility, cognitive disorders, and toileting.
29. The system of claim 27, wherein the controller further includes a second domain subject matter module adapted to process information relating to a second subject matter and generate a response plan relating to the second subject matter.
30. The system of claim 29, wherein the controller further includes a plan coordination module for prioritizing response plans generated by the first and second domain subject matter modules.
31. The system of claim 27, wherein the controller further includes a second subject matter module adapted to process information relating to a second subject matter, and further wherein the first domain subject matter module utilizes information from the second subject matter module in performing situation assessment operations relating to the first subject matter.
32. The system of claim 27, wherein the controller further includes a second subject matter module adapted to process information relating to a second subject matter and further wherein the first domain subject matter utilizes information from the second subject matter module in performing response planning operations relating to the first subject matter.
33. The system of claim 27, wherein the controller further includes an intent recognition module adapted to recognize an intended activity of the actor based upon sensor information, and further wherein the first domain subject matter module utilizes information from the intent recognition module in performing situation assessment operations relating to the first subject matter.
34. The system of claim 27, wherein the controller further includes an intent recognition module adapted to recognize an intended activity of the actor based upon sensor information, and further wherein the first domain subject matter module utilizes information from the intent recognition module in performing response planning operations relating to the first subject matter.
35. The system of claim 27, wherein the controller further includes a behavioral database that establishes a behavioral model for at least one of the actor and the environment, and further wherein the first domain subject matter module utilizes information from the behavioral database in performing situation assessment operations relating to the first subject matter.
36. The system of claim 35, wherein the controller further includes a machine learning module adapted to generate the behavioral database.
37. The system of claim 27, wherein the controller further includes a behavioral database that establishes a behavioral model for at least one of the actor and the environment, and further wherein the first domain subject matter module utilizes information from the behavioral database in performing response planning operations relating to the first subject matter.
38. The system of claim 37, wherein the controller further includes a machine learning module adapted to generate the behavioral database.
39. The system of claim 27, wherein the system further includes a plurality of sensors, and further wherein the controller includes an event recognition module adapted to aggregate information from multiple ones of the sensors to designate a single event as occurring, and further wherein the first domain subject matter module utilizes information from the event recognition module in performing situation assessment operations relating to the first subject matter.
40. The system of claim 27, wherein the system further includes a plurality of effectors adapted to interface with the actor, and further wherein the controller includes a plan coordination module for implementing the response plan via at least two of the effectors.
41. An automated monitoring and support system for an actor in an environment, the system comprising:
a plurality of sensors adapted to sense information relating to the actor and the environment;
a plurality of effectors adapted to interface with the actor; and
a controller adapted to provide a layered architecture including a sensing layer for receiving information from the plurality of sensors, a situation assessment layer for determining a current situation of at least one of the actor and the environment based upon the sensor information, a response planning layer for automatically generating a response plan to the determined situation, and a plan executing layer for prompting at least one of the effectors to execute the generated response plan, the controller further including:
a first domain subject matter module operating across at least the sensing, situation assessment, and response planning layers, the first domain subject matter module adapted to process information relating to a first subject matter and generate a response plan relating to the first subject matter,
a second domain subject matter module operating across at least the sensing, situation assessment, and response planning layers, the second domain subject matter module adapted to process information and generate a response plan relating to a second subject matter,
a third domain subject matter module operating across at least the situation assessment layer and adapted to process information relating to the actor,
a fourth subject matter module operating across at least the situation assessment layer and adapted to process information relating to the environment,
a response execution module adapted to process response plans generated by the first and second domain subject matter modules.
42. The system of claim 41, further comprising a machine learning module communicatively linked to at least one of the first, second, third, and fourth modules.
43. The system of claim 41, further comprising an intent recognition module communicatively linked to at least one of the first, second, third, and fourth modules.
44. The system of claim 41, wherein the controller is adapted to load and communicatively link additional subject matter modules.
45. A method of automatically monitoring and supporting an actor in an environment, the method comprising:
receiving information from a plurality of sensors located in the environment;
automatically assessing a situation relating to at least one of the actor and the environment based upon information from the plurality of sensors;
automatically generating a response plan based upon the assessed situation; and
automatically executing the response plan by operating at least one of a plurality of effectors in the environment.
46. The method of claim 45, wherein automatically assessing a situation includes automatically assessing a situation of a plurality of subject matters.
47. The method of claim 45, further comprising:
providing a plurality of subject matter modules each adapted to assess a situation of a different subject matter.
48. The method of claim 47, wherein at least one of the plurality of subject matter modules utilizes information from at least another one of the plurality of subject matter modules in assessing a situation of the corresponding subject matter.
49. The method of claim 45, wherein automatically assessing a situation relating to the actor or the environment includes determining an intent of the actor.
50. The method of claim 45, wherein automatically generating a response plan includes generating a plurality of response plans relating to a plurality of different subject matters.
51. The method of claim 50, further comprising:
providing a plurality of subject matter modules each adapted to generate a response plan relating to a different subject matter.
52. The method of claim 50, further comprising:
evaluating each of the plurality of response plans; and
designating a primary response plan based upon the evaluation.
53. The method of claim 45, wherein generating a response plan includes referring to a machine learning database.
54. The method of claim 53, wherein generating a response plan further includes:
adapting the response plan to capabilities of the actor as provided by the machine learning database.
55. The method of claim 45, further comprising:
monitoring a response of the actor to the response plan following execution of the response plan.
US10/341,335 2002-03-28 2003-01-10 System and method for automated monitoring, recognizing, supporting, and responding to the behavior of an actor Abandoned US20040030531A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US10/341,335 US20040030531A1 (en) 2002-03-28 2003-01-10 System and method for automated monitoring, recognizing, supporting, and responding to the behavior of an actor
PCT/US2003/009743 WO2003083800A1 (en) 2002-03-28 2003-03-28 System and method for automated monitoring, recognizing, supporting, and responding to the behavior of an actor
AU2003228403A AU2003228403A1 (en) 2002-03-28 2003-03-28 System and method for automated monitoring, recognizing, supporting, and responding to the behavior of an actor

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US36830702P 2002-03-28 2002-03-28
US10/341,335 US20040030531A1 (en) 2002-03-28 2003-01-10 System and method for automated monitoring, recognizing, supporting, and responding to the behavior of an actor

Publications (1)

Publication Number Publication Date
US20040030531A1 true US20040030531A1 (en) 2004-02-12

Family

ID=28678126

Family Applications (1)

Application Number Title Priority Date Filing Date
US10/341,335 Abandoned US20040030531A1 (en) 2002-03-28 2003-01-10 System and method for automated monitoring, recognizing, supporting, and responding to the behavior of an actor

Country Status (3)

Country Link
US (1) US20040030531A1 (en)
AU (1) AU2003228403A1 (en)
WO (1) WO2003083800A1 (en)

Cited By (214)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030225474A1 (en) * 2002-05-31 2003-12-04 Gustavo Mata Specialization of active software agents in an automated manufacturing environment
US20040116102A1 (en) * 2002-12-17 2004-06-17 International Business Machines Corporation Heuristics for behavior based life support services
US20040163079A1 (en) * 2003-02-13 2004-08-19 Path Communications, Inc. Software behavior pattern recognition and analysis
US20040210626A1 (en) * 2003-04-17 2004-10-21 International Business Machines Corporation Method and system for administering devices in dependence upon user metric vectors
US20040249825A1 (en) * 2003-06-05 2004-12-09 International Business Machines Corporation Administering devices with dynamic action lists
US20050091384A1 (en) * 2003-10-23 2005-04-28 International Business Machines Corporation Administering devices including allowed action lists
US20050137465A1 (en) * 2003-12-23 2005-06-23 General Electric Company System and method for remote monitoring in home activity of persons living independently
US20050182306A1 (en) * 2004-02-17 2005-08-18 Therasense, Inc. Method and system for providing data communication in continuous glucose monitoring and management system
US20050285941A1 (en) * 2004-06-28 2005-12-29 Haigh Karen Z Monitoring devices
US20060041445A1 (en) * 2004-08-23 2006-02-23 Aaron Jeffrey A Electronic butler for providing application services to a user
US20060055543A1 (en) * 2004-09-10 2006-03-16 Meena Ganesh System and method for detecting unusual inactivity of a resident
US20060066448A1 (en) * 2004-08-04 2006-03-30 Kimberco, Inc. Computer-automated system and method of assessing the orientation, awareness and responses of a person with reduced capacity
US20060104488A1 (en) * 2004-11-12 2006-05-18 Bazakos Michael E Infrared face detection and recognition system
US20060102843A1 (en) * 2004-11-12 2006-05-18 Bazakos Michael E Infrared and visible fusion face recognition system
US20060123053A1 (en) * 2004-12-02 2006-06-08 Insignio Technologies, Inc. Personalized content processing and delivery system and media
US7091865B2 (en) 2004-02-04 2006-08-15 General Electric Company System and method for determining periods of interest in home of persons living independently
US20060187032A1 (en) * 2005-02-18 2006-08-24 Kunkel Daniel L Automated acquisition and notification system
US20060253745A1 (en) * 2001-09-25 2006-11-09 Path Reliability Inc. Application manager for monitoring and recovery of software based application processes
US20060285723A1 (en) * 2005-06-16 2006-12-21 Vassilios Morellas Object tracking system
US20060286620A1 (en) * 2005-06-18 2006-12-21 Karl Werner Glucose analysis instrument
US20070033261A1 (en) * 2003-05-16 2007-02-08 Matthias Wagner Personalized discovery of services
US7196614B2 (en) * 2003-07-11 2007-03-27 Carolan Joseph P Guidance system for rescue personnel
US20070078878A1 (en) * 2005-10-03 2007-04-05 Jason Knable Systems and methods for verbal communication from a speech impaired individual
US20070092245A1 (en) * 2005-10-20 2007-04-26 Honeywell International Inc. Face detection and tracking in a wide field of view
US20070161372A1 (en) * 2006-01-04 2007-07-12 Gary Rogalski Cordless phone system with integrated alarm & remote monitoring capability
US20070173978A1 (en) * 2006-01-04 2007-07-26 Gene Fein Controlling environmental conditions
US20070176760A1 (en) * 2006-01-18 2007-08-02 British Telecommunications Monitoring movement of an entity in an environment
US20070195703A1 (en) * 2006-02-22 2007-08-23 Living Independently Group Inc. System and method for monitoring a site using time gap analysis
US20070250561A1 (en) * 2003-04-17 2007-10-25 Bodin William K Method And System For Administering Devices With Multiple User Metric Spaces
US20080147358A1 (en) * 2005-01-19 2008-06-19 Ans, Inc. Detectors and techniques useful with automated acquisition and notification systems
US20080201206A1 (en) * 2007-02-01 2008-08-21 7 Billion People, Inc. Use of behavioral portraits in the conduct of E-commerce
US20080275582A1 (en) * 2004-11-19 2008-11-06 Nettles Steven C Scheduling AMHS pickup and delivery ahead of schedule
US20080281171A1 (en) * 2007-05-08 2008-11-13 Abbott Diabetes Care, Inc. Analyte monitoring system and methods
US20080278332A1 (en) * 2007-05-08 2008-11-13 Abbott Diabetes Care, Inc. Analyte monitoring system and methods
US20080294558A1 (en) * 2007-05-23 2008-11-27 Masahiro Shimanuki Portable electronic appliance, data processor, data communication system, computer program, data processing method
US20080319296A1 (en) * 2007-06-21 2008-12-25 Abbott Diabetes Care, Inc. Health monitor
US20090019061A1 (en) * 2004-02-20 2009-01-15 Insignio Technologies, Inc. Providing information to a user
US20090019457A1 (en) * 2003-07-02 2009-01-15 International Business Machines Corporation Administering Devices With Domain State Objects
US20090042246A1 (en) * 2004-12-07 2009-02-12 Gert Nikolaas Moll Methods For The Production And Secretion Of Modified Peptides
US20090054028A1 (en) * 2007-08-22 2009-02-26 Denning Jr Donald R Monitoring activities of daily living using radio frequency emissions
US20090089597A1 (en) * 2007-09-27 2009-04-02 Fuji Xerox Co., Ltd Information processing device, method of controlling the device, computer readable medium, and security system
US20090105571A1 (en) * 2006-06-30 2009-04-23 Abbott Diabetes Care, Inc. Method and System for Providing Data Communication in Data Management Systems
US20090105558A1 (en) * 2007-10-16 2009-04-23 Oakland University Portable autonomous multi-sensory intervention device
US20090146829A1 (en) * 2007-12-07 2009-06-11 Honeywell International Inc. Video-enabled rapid response system and method
ES2325345A1 (en) * 2005-09-20 2009-09-01 Universidade Da Coruña "on line" interactive system for the display of contents through a device, with capacity for the registration of biomedical activities and parameters, cognitive intervention, domotic control and telealarm of remote management. (Machine-translation by Google Translate, not legally binding)
US20090221368A1 (en) * 2007-11-28 2009-09-03 Ailive Inc., Method and system for creating a shared game space for a networked game
US20090284372A1 (en) * 2003-06-10 2009-11-19 Abbott Diabetes Care Inc. Glucose Measuring Device For Use In Personal Area Network
US20100004896A1 (en) * 2008-07-05 2010-01-07 Ailive Inc. Method and apparatus for interpreting orientation invariant motion
US20100076288A1 (en) * 2003-04-04 2010-03-25 Brian Edmond Connolly Method and System for Transferring Analyte Test Data
US20100076284A1 (en) * 2007-06-21 2010-03-25 Abbott Diabetes Care Inc. Health Management Devices and Methods
US20100113153A1 (en) * 2006-07-14 2010-05-06 Ailive, Inc. Self-Contained Inertial Navigation System for Interactive Control Using Movable Controllers
US20100225490A1 (en) * 2009-03-05 2010-09-09 Leuthardt Eric C Postural information system and method including central determining of subject advisory information based on subject status information and postural influencer status information
US20100228493A1 (en) * 2009-03-05 2010-09-09 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Postural information system and method including direction generation based on collection of subject advisory information
US20100225491A1 (en) * 2009-03-05 2010-09-09 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Postural information system and method
US20100225474A1 (en) * 2009-03-05 2010-09-09 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Postural information system and method
US20100228492A1 (en) * 2009-03-05 2010-09-09 Searete Llc, A Limited Liability Corporation Of State Of Delaware Postural information system and method including direction generation based on collection of subject advisory information
US20100225498A1 (en) * 2009-03-05 2010-09-09 Searete Llc, A Limited Liability Corporation Postural information system and method
US20100228494A1 (en) * 2009-03-05 2010-09-09 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Postural information system and method including determining subject advisory information based on prior determined subject advisory information
US20100228487A1 (en) * 2009-03-05 2010-09-09 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Postural information system and method
US20100228158A1 (en) * 2009-03-05 2010-09-09 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Postural information system and method including device level determining of subject advisory information based on subject status information and postural influencer status information
US20100228154A1 (en) * 2009-03-05 2010-09-09 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Postural information system and method including determining response to subject advisory information
US20100228488A1 (en) * 2009-03-05 2010-09-09 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Postural information system and method
US20100228495A1 (en) * 2009-03-05 2010-09-09 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Postural information system and method including determining subject advisory information based on prior determined subject advisory information
US20100271200A1 (en) * 2009-03-05 2010-10-28 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Postural information system and method including determining response to subject advisory information
US20100274220A1 (en) * 2005-11-04 2010-10-28 Abbott Diabetes Care Inc. Method and System for Providing Basal Profile Modification in Analyte Monitoring and Management Systems
US20100289644A1 (en) * 2009-05-18 2010-11-18 Alarm.Com Moving asset location tracking
US20100295674A1 (en) * 2009-05-21 2010-11-25 Silverplus, Inc. Integrated health management console
US20100302043A1 (en) * 2009-06-01 2010-12-02 The Curators Of The University Of Missouri Integrated sensor network methods and systems
US20110003577A1 (en) * 2006-01-04 2011-01-06 Vtech Telecommunications Limited Cordless phone system with integrated alarm & remote monitoring capability
US20110044333A1 (en) * 2008-05-30 2011-02-24 Abbott Diabetes Care Inc. Close Proximity Communication Device and Methods
US7899772B1 (en) 2006-07-14 2011-03-01 Ailive, Inc. Method and system for tuning motion recognizers by a user using a set of motion signals
US20110054282A1 (en) * 2009-08-31 2011-03-03 Abbott Diabetes Care Inc. Analyte Monitoring System and Methods for Managing Power and Noise
US20110060530A1 (en) * 2009-08-31 2011-03-10 Abbott Diabetes Care Inc. Analyte Signal Processing Device and Methods
US7917455B1 (en) 2007-01-29 2011-03-29 Ailive, Inc. Method and system for rapid evaluation of logical expressions
US20110093876A1 (en) * 2009-10-15 2011-04-21 At&T Intellectual Property I, L.P. System and Method to Monitor a Person in a Residence
US20110090085A1 (en) * 2009-10-15 2011-04-21 At & T Intellectual Property I, L.P. System and Method to Monitor a Person in a Residence
US20110291827A1 (en) * 2011-07-01 2011-12-01 Baldocchi Albert S Portable Monitor for Elderly/Infirm Individuals
US20120166162A1 (en) * 2009-09-04 2012-06-28 Siemens Aktiengesellschaft Device and method for generating a targeted realistic motion of particles along shortest paths with respect to arbitrary distance weightings for simulations of flows of people and objects
US8251821B1 (en) 2007-06-18 2012-08-28 Ailive, Inc. Method and system for interactive control using movable controllers
US20120304013A1 (en) * 2011-05-27 2012-11-29 International Business Machines Corporation Administering Event Pools For Relevant Event Analysis In A Distributed Processing System
WO2013006508A1 (en) * 2011-07-01 2013-01-10 Wsu Research Foundation Activity recognition in multi-entity environments
US8362904B2 (en) 2007-05-08 2013-01-29 Abbott Diabetes Care Inc. Analyte monitoring system and methods
US20130057702A1 (en) * 2010-07-06 2013-03-07 Lg Electronics Inc. Object recognition and tracking based apparatus and method
US20130100268A1 (en) * 2008-05-27 2013-04-25 University Health Network Emergency detection and response system and method
US8456301B2 (en) 2007-05-08 2013-06-04 Abbott Diabetes Care Inc. Analyte monitoring system and methods
US8461985B2 (en) 2007-05-08 2013-06-11 Abbott Diabetes Care Inc. Analyte monitoring system and methods
WO2013085379A2 (en) * 2011-12-09 2013-06-13 Universiti Putra Malaysia Livestock management and automation system using radio waves
US20130158368A1 (en) * 2000-06-16 2013-06-20 Bodymedia, Inc. System for monitoring and managing body weight and other physiological conditions including iterative and personalized planning, intervention and reporting capability
US20130218812A1 (en) * 2012-02-17 2013-08-22 Wavemarket, Inc. System and method for detecting medical anomalies using a mobile communication device
US8593109B2 (en) 2006-03-31 2013-11-26 Abbott Diabetes Care Inc. Method and system for powering an electronic device
US8597575B2 (en) 2006-03-31 2013-12-03 Abbott Diabetes Care Inc. Analyte monitoring devices and methods therefor
ES2435865R1 (en) * 2010-11-22 2014-01-31 Universidad De Zaragoza DEPENDENT PERSONNEL MONITORING SYSTEM
WO2014087040A1 (en) * 2012-12-03 2014-06-12 Menumat Oy Arrangement and method for nutrition and care services
WO2014087041A1 (en) * 2012-12-03 2014-06-12 Menumat Oy Arrangement and method for nutrition and remote care services
US20140220525A1 (en) * 2007-02-16 2014-08-07 Bodymedia, Inc. Managing educational content based on detected stress state and an individuals predicted type
US20150021465A1 (en) * 2013-07-16 2015-01-22 Leeo, Inc. Electronic device with environmental monitoring
US20150026111A1 (en) * 2013-07-22 2015-01-22 GreatCall, Inc. Method for engaging isolated individuals
US8943366B2 (en) 2012-08-09 2015-01-27 International Business Machines Corporation Administering checkpoints for incident analysis
US8954811B2 (en) 2012-08-06 2015-02-10 International Business Machines Corporation Administering incident pools for incident analysis
WO2015047166A1 (en) * 2013-09-25 2015-04-02 Phoniro Ab A telecare system and an electronic lock device for use therein, and an associated method for monitoring attendance to a telecare alarm event in a telecare system
US9024976B2 (en) 2009-03-05 2015-05-05 The Invention Science Fund I, Llc Postural information system and method
US9069536B2 (en) 2011-10-31 2015-06-30 Abbott Diabetes Care Inc. Electronic devices having integrated reset systems and methods thereof
US9086968B2 (en) 2013-09-11 2015-07-21 International Business Machines Corporation Checkpointing for delayed alert creation
US9088452B2 (en) 2009-04-29 2015-07-21 Abbott Diabetes Care Inc. Method and system for providing data communication in continuous glucose monitoring and management system
US20150205278A1 (en) * 2010-03-25 2015-07-23 David H.C. CHEN Systems and methods of property security
US9095290B2 (en) 2007-03-01 2015-08-04 Abbott Diabetes Care Inc. Method and apparatus for providing rolling data in communication systems
WO2015116773A1 (en) 2014-01-31 2015-08-06 Vivint, Inc. Progressive profiling in an automation system
US20150234886A1 (en) * 2012-09-06 2015-08-20 Beyond Verbal Communication Ltd System and method for selection of data according to measurement of physiological parameters
CN104950866A (en) * 2014-03-25 2015-09-30 株式会社日立高新技术 Failure cause classification apparatus
US20150294085A1 (en) * 2014-04-14 2015-10-15 Elwha LLC, a limited company of the State of Delaware Devices, systems, and methods for automated enhanced care rooms
US20150294086A1 (en) * 2014-04-14 2015-10-15 Elwha Llc Devices, systems, and methods for automated enhanced care rooms
US20150294067A1 (en) * 2014-04-14 2015-10-15 Elwha Llc Devices, systems, and methods for automated enhanced care rooms
US9170860B2 (en) 2013-07-26 2015-10-27 International Business Machines Corporation Parallel incident processing
US9178936B2 (en) 2011-10-18 2015-11-03 International Business Machines Corporation Selected alert delivery in a distributed processing system
US9226701B2 (en) 2009-04-28 2016-01-05 Abbott Diabetes Care Inc. Error detection in critical repeating data in a wireless sensor system
US9246865B2 (en) 2011-10-18 2016-01-26 International Business Machines Corporation Prioritized alert delivery in a distributed processing system
US20160033947A1 (en) * 2014-07-31 2016-02-04 Honeywell International Inc. Monitoring a building management system
US9256482B2 (en) 2013-08-23 2016-02-09 International Business Machines Corporation Determining whether to send an alert in a distributed processing system
US20160058428A1 (en) * 2014-09-03 2016-03-03 Earlysense Ltd. Menstrual state monitoring
US9286143B2 (en) 2011-06-22 2016-03-15 International Business Machines Corporation Flexible event data content management for relevant event and alert analysis within a distributed processing system
US9304590B2 (en) 2014-08-27 2016-04-05 Leen, Inc. Intuitive thermal user interface
US9311382B2 (en) 2012-12-14 2016-04-12 Apple Inc. Method and apparatus for personal characterization data collection using sensors
US20160117202A1 (en) * 2014-10-28 2016-04-28 Kamal Zamer Prioritizing software applications to manage alerts
US9332616B1 (en) 2014-12-30 2016-05-03 Google Inc. Path light feedback compensation
US9344381B2 (en) 2011-05-27 2016-05-17 International Business Machines Corporation Event management in a distributed processing system
US9348687B2 (en) 2014-01-07 2016-05-24 International Business Machines Corporation Determining a number of unique incidents in a plurality of incidents for incident processing in a distributed processing system
US9361184B2 (en) 2013-05-09 2016-06-07 International Business Machines Corporation Selecting during a system shutdown procedure, a restart incident checkpoint of an incident analyzer in a distributed processing system
US20160171866A1 (en) * 2013-04-22 2016-06-16 Domosafety Sa System and method for automated triggering and management of alarms
US9372477B2 (en) 2014-07-15 2016-06-21 Leeo, Inc. Selective electrical coupling based on environmental conditions
US9402155B2 (en) 2014-03-03 2016-07-26 Location Labs, Inc. System and method for indicating a state of a geographic area based on mobile device sensor measurements
US9408561B2 (en) 2012-04-27 2016-08-09 The Curators Of The University Of Missouri Activity analysis, fall detection and risk assessment systems and methods
US20160234243A1 (en) * 2015-02-06 2016-08-11 Honeywell International Inc. Technique for using infrastructure monitoring software to collect cyber-security risk data
WO2016133659A1 (en) * 2015-02-19 2016-08-25 Vivint, Inc. Methods and systems for automatically monitoring user activity
US9445451B2 (en) 2014-10-20 2016-09-13 Leeo, Inc. Communicating arbitrary attributes using a predefined characteristic
US9460262B2 (en) * 2011-06-17 2016-10-04 The Research Foundation Of State University Of New York Detecting and responding to sentinel events
US9532737B2 (en) 2011-02-28 2017-01-03 Abbott Diabetes Care Inc. Devices, systems, and methods associated with analyte monitoring devices and devices incorporating the same
US9569943B2 (en) 2014-12-30 2017-02-14 Google Inc. Alarm arming with open entry point
US9574914B2 (en) 2007-05-08 2017-02-21 Abbott Diabetes Care Inc. Method and device for determining elapsed sensor life
US9597016B2 (en) 2012-04-27 2017-03-21 The Curators Of The University Of Missouri Activity analysis, fall detection and risk assessment systems and methods
US9602337B2 (en) 2013-09-11 2017-03-21 International Business Machines Corporation Event and alert analysis in a distributed processing system
EP3118780A4 (en) * 2014-03-11 2017-03-22 Panasonic Intellectual Property Management Co., Ltd. Lifestyle behavior estimating device, and program
WO2017055317A1 (en) 2015-09-30 2017-04-06 Koninklijke Philips N.V. Assistance system for cognitively impaired persons
US20170100838A1 (en) * 2015-10-12 2017-04-13 The Boeing Company Dynamic Automation Work Zone Safety System
US9658902B2 (en) 2013-08-22 2017-05-23 Globalfoundries Inc. Adaptive clock throttling for event processing
US9685059B2 (en) 2015-02-12 2017-06-20 Google Inc. Devices and methods for providing heat-source alerts
US20170178001A1 (en) * 2015-12-21 2017-06-22 Glen J. Anderson Technologies for cognitive cuing based on knowledge and context
US20170231546A1 (en) * 2014-03-18 2017-08-17 J. Kimo Arbas System and method to detect alertness of machine operator
US9747769B2 (en) 2014-12-30 2017-08-29 Google Inc. Entry point opening sensor
US9778235B2 (en) 2013-07-17 2017-10-03 Leeo, Inc. Selective electrical coupling based on environmental conditions
US9800604B2 (en) 2015-05-06 2017-10-24 Honeywell International Inc. Apparatus and method for assigning cyber-security risk consequences in industrial process control environments
US9801013B2 (en) 2015-11-06 2017-10-24 Leeo, Inc. Electronic-device association based on location duration
US9865016B2 (en) 2014-09-08 2018-01-09 Leeo, Inc. Constrained environmental monitoring based on data privileges
JP2018005536A (en) * 2016-06-30 2018-01-11 東芝デジタルソリューションズ株式会社 Life data integration analysis system, life data integration analysis method and life data integration analysis program
US20180012474A1 (en) * 2016-07-07 2018-01-11 Wal-Mart Stores, Inc. Method and apparatus for monitoring person and home
WO2018073241A1 (en) 2016-10-20 2018-04-26 Philips Lighting Holding B.V. A system and method for monitoring activities of daily living of a person
US9962091B2 (en) 2002-12-31 2018-05-08 Abbott Diabetes Care Inc. Continuous glucose monitoring system and methods of use
US9968306B2 (en) 2012-09-17 2018-05-15 Abbott Diabetes Care Inc. Methods and apparatuses for providing adverse condition notification with enhanced wireless communication range in analyte monitoring systems
US9980669B2 (en) 2011-11-07 2018-05-29 Abbott Diabetes Care Inc. Analyte monitoring device and methods
US20180174671A1 (en) * 2016-12-15 2018-06-21 International Business Machines Corporation Cognitive adaptations for well-being management
US10021119B2 (en) 2015-02-06 2018-07-10 Honeywell International Inc. Apparatus and method for automatic handling of cyber-security risk events
US10021125B2 (en) 2015-02-06 2018-07-10 Honeywell International Inc. Infrastructure monitoring tool for collecting industrial process control and automation system risk data
US10026304B2 (en) 2014-10-20 2018-07-17 Leeo, Inc. Calibrating an environmental monitoring device
US10022499B2 (en) 2007-02-15 2018-07-17 Abbott Diabetes Care Inc. Device and method for automatic data acquisition and/or detection
US10028099B2 (en) 2012-10-08 2018-07-17 Location Labs, Inc. Bio-powered locator device
US10075474B2 (en) 2015-02-06 2018-09-11 Honeywell International Inc. Notification subsystem for generating consolidated, filtered, and relevant security risk-based notifications
US10075475B2 (en) 2015-02-06 2018-09-11 Honeywell International Inc. Apparatus and method for dynamic customization of cyber-security risk item rules
CN108877158A (en) * 2017-05-12 2018-11-23 波音公司 Modular safety monitoring and warning system and its application method
US10136816B2 (en) 2009-08-31 2018-11-27 Abbott Diabetes Care Inc. Medical devices and methods
US20180365965A1 (en) * 2017-05-25 2018-12-20 Robert Blatt Easily customizable inhabitant behavioral routines in a location monitoring and action system
US10168676B2 (en) 2014-04-29 2019-01-01 Cox Communications, Inc. Systems and methods for intelligent customization of an automation control service
US10206630B2 (en) 2015-08-28 2019-02-19 Foresite Healthcare, Llc Systems for automatic assessment of fall risk
US10298608B2 (en) 2015-02-11 2019-05-21 Honeywell International Inc. Apparatus and method for tying cyber-security risk analysis to common risk methodologies and risk levels
US10298411B2 (en) 2016-10-24 2019-05-21 Crestron Electronics, Inc. Building management system that determines building utilization
US10311713B2 (en) 2010-09-15 2019-06-04 Comcast Cable Communications, Llc Securing property
US20190192368A1 (en) * 2017-12-22 2019-06-27 Stryker Corporation Techniques For Notifying Persons Within A Vicinity Of A Patient Support Apparatus Of A Remote Control Function
US10339773B2 (en) 2014-12-30 2019-07-02 Google Llc Home security system with automatic context-sensitive transition to different modes
US20190216406A1 (en) * 2016-06-29 2019-07-18 Robert Polkowski Wearable device to assist cognitive dysfunction sufferer and method for operating the same
US10373464B2 (en) 2016-07-07 2019-08-06 Walmart Apollo, Llc Apparatus and method for updating partiality vectors based on monitoring of person and his or her home
US10382441B2 (en) 2016-10-13 2019-08-13 Honeywell International Inc. Cross security layer secure communication
US10430817B2 (en) 2016-04-15 2019-10-01 Walmart Apollo, Llc Partiality vector refinement systems and methods through sample probing
US10592959B2 (en) 2016-04-15 2020-03-17 Walmart Apollo, Llc Systems and methods for facilitating shopping in a physical retail facility
US10606222B2 (en) 2015-02-11 2020-03-31 International Business Machines Corporation Identifying home automation correlated events and creating portable recipes
WO2020069500A1 (en) * 2018-09-30 2020-04-02 Yibing Hu Smart health expert and manager
US10614504B2 (en) 2016-04-15 2020-04-07 Walmart Apollo, Llc Systems and methods for providing content-based product recommendations
US10634379B2 (en) 2017-09-28 2020-04-28 Honeywell International Inc. Actuators with condition tracking
US20200175846A1 (en) * 2015-03-27 2020-06-04 Google Llc Configuring a Smart Home Controller
US10705108B1 (en) 2019-02-05 2020-07-07 Honeywell International Inc. Sensing system for sensing stationary objects
US20200273555A1 (en) * 2019-02-25 2020-08-27 Boe Technology Group Co., Ltd. Intelligent reminding method, device and electronic apparatus
US10764079B2 (en) 2015-02-09 2020-09-01 Vivint, Inc. System and methods for correlating sleep data to security and/or automation system operations
US10805775B2 (en) 2015-11-06 2020-10-13 Jon Castor Electronic-device detection and activity association
US10939155B2 (en) 2013-11-19 2021-03-02 Comcast Cable Communications, Llc Premises automation control
US11006872B2 (en) 2009-02-03 2021-05-18 Abbott Diabetes Care Inc. Analyte sensor and apparatus for insertion of the sensor
US11030882B2 (en) * 2019-03-11 2021-06-08 Lenovo (Singapore) Pte. Ltd. Automated security subsystem activation
US11044114B2 (en) 2014-01-31 2021-06-22 Vivint, Inc. Rule-based graphical conversational user interface for security and automation system
US11223588B2 (en) * 2018-09-19 2022-01-11 International Business Machines Corporation Using sensor data to control message delivery
US11224777B2 (en) 2019-02-25 2022-01-18 Honeywell International Inc. Fire and smoke actuator with temperature-dependent operating speed
CN114155612A (en) * 2022-02-10 2022-03-08 深圳爱莫科技有限公司 Restaurant personnel non-standard behavior model training method, detection method and processing equipment
US11276181B2 (en) 2016-06-28 2022-03-15 Foresite Healthcare, Llc Systems and methods for use in detecting falls utilizing thermal sensing
US11342074B1 (en) * 2017-12-31 2022-05-24 Teletracking Technologies, Inc. Patient identification using passive sensors
US20220167893A1 (en) * 2020-12-02 2022-06-02 Koninklijke Philips N.V. Methods and systems for detecting indications of cognitive decline
US20220175598A1 (en) * 2019-03-05 2022-06-09 Fuji Corporation Assistance information management system
US11434686B2 (en) 2019-11-20 2022-09-06 Kingsway Enterprises (Uk) Limited Pressure monitor
US20220304603A1 (en) * 2019-06-17 2022-09-29 Happy Health, Inc. Wearable device operable to detect and/or manage user emotion
US11462091B2 (en) 2020-11-26 2022-10-04 Kingsway Enterprises (Uk) Limited Anti-ligature device
TWI815424B (en) * 2022-04-29 2023-09-11 優網通國際資訊股份有限公司 Residential safety monitoring system
US11762350B2 (en) 2021-03-22 2023-09-19 Honeywell International Inc. Methods and systems for detecting occupancy of a space
US11793936B2 (en) 2009-05-29 2023-10-24 Abbott Diabetes Care Inc. Medical device antenna systems having external antenna configurations
US11869328B2 (en) 2018-04-09 2024-01-09 State Farm Mutual Automobile Insurance Company Sensing peripheral heuristic evidence, reinforcement, and engagement system
US11864926B2 (en) 2015-08-28 2024-01-09 Foresite Healthcare, Llc Systems and methods for detecting attempted bed exit
US11894129B1 (en) 2019-07-03 2024-02-06 State Farm Mutual Automobile Insurance Company Senior living care coordination platforms
US11901071B2 (en) 2019-08-19 2024-02-13 State Farm Mutual Automobile Insurance Company Senior living engagement and care support platforms
US11928604B2 (en) * 2005-09-08 2024-03-12 Apple Inc. Method and apparatus for building an intelligent automated assistant
US11935651B2 (en) 2021-01-19 2024-03-19 State Farm Mutual Automobile Insurance Company Alert systems for senior living engagement and care support platforms
US11950936B2 (en) 2023-02-22 2024-04-09 Abbott Diabetes Care Inc. Methods and apparatuses for providing adverse condition notification with enhanced wireless communication range in analyte monitoring systems

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7524279B2 (en) * 2003-12-31 2009-04-28 Raphael Auphan Sleep and environment control method and system
US7580819B2 (en) 2005-11-01 2009-08-25 Raytheon Company Adaptive mission profiling
US7580818B2 (en) 2005-11-01 2009-08-25 Raytheon Company Mission profiling
US20070123754A1 (en) * 2005-11-29 2007-05-31 Cuddihy Paul E Non-encumbering, substantially continuous patient daily activity data measurement for indication of patient condition change for access by remote caregiver
US8046320B2 (en) 2007-05-17 2011-10-25 Raytheon Company Domain-independent architecture in a command and control system
WO2013056335A1 (en) * 2011-10-21 2013-04-25 University Health Network Emergency detection and response system and method
EP2184724A1 (en) * 2008-11-05 2010-05-12 Nederlandse Organisatie voor toegepast-natuurwetenschappelijk Onderzoek TNO A system for tracking a presence of persons in a building, a method and a computer program product
FR2943157A1 (en) * 2009-03-13 2010-09-17 Univ Joseph Fourier Grenoble I DEVICE FOR MAINTAINING HOMEOSTASIS IN A PERSON
JP5554399B2 (en) * 2010-03-25 2014-07-23 三菱電機株式会社 Data transmission device
FI122787B (en) * 2010-12-28 2012-07-13 Lano Group Oy Remote Monitoring System
KR20130101365A (en) * 2012-03-05 2013-09-13 삼성전자주식회사 Method for providing health care service using universal plug and play network and apparatus therefor
ITFI20120217A1 (en) * 2012-10-19 2014-04-20 Digicsoft S R L TELE-SURVEILLANCE APPARATUS FOR PREMISES AND PEOPLE.
DE102017129675B3 (en) 2017-12-12 2019-05-09 András Lelkes Smart home appliance
FR3085524B1 (en) * 2018-08-30 2021-02-19 Predical METHOD OF LEARNING THE WAY OF LIFE OF A PERSON AND DETECTION OF ANOMALIES

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5692215A (en) * 1994-12-23 1997-11-25 Gerotech, Inc. System for generating periodic reports, generating trend analysis, and intervention in accordance with trend analysis from a detection subsystem for monitoring daily living activity
AUPQ303799A0 (en) * 1999-09-24 1999-10-21 Right Hemisphere Pty Limited Safety cubicle

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4259548A (en) * 1979-11-14 1981-03-31 Gte Products Corporation Apparatus for monitoring and signalling system
US4803625A (en) * 1986-06-30 1989-02-07 Buddy Systems, Inc. Personal health monitor
US4952928A (en) * 1988-08-29 1990-08-28 B. I. Incorporated Adaptable electronic monitoring and identification system
US5086385A (en) * 1989-01-31 1992-02-04 Custom Command Systems Expandable home automation system
US5400246A (en) * 1989-05-09 1995-03-21 Ansan Industries, Ltd. Peripheral data acquisition, monitor, and adaptive control system via personal computer
US5228449A (en) * 1991-01-22 1993-07-20 Athanasios G. Christ System and method for detecting out-of-hospital cardiac emergencies and summoning emergency assistance
US5857110A (en) * 1991-03-19 1999-01-05 Hitachi, Ltd. Priority control with concurrent switching of priorities of vector processors, for plural priority circuits for memory modules shared by the vector processors
US5920477A (en) * 1991-12-23 1999-07-06 Hoffberg; Steven M. Human factored interface incorporating adaptive pattern recognition based controller apparatus
US5410471A (en) * 1992-02-24 1995-04-25 Toto, Ltd. Networked health care and monitoring system
US5441047A (en) * 1992-03-25 1995-08-15 David; Daniel Ambulatory patient health monitoring techniques utilizing interactive visual communication
US5596994A (en) * 1993-08-30 1997-01-28 Bro; William L. Automated and interactive behavioral and medical guidance system

Cited By (420)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130158368A1 (en) * 2000-06-16 2013-06-20 Bodymedia, Inc. System for monitoring and managing body weight and other physiological conditions including iterative and personalized planning, intervention and reporting capability
US20130158367A1 (en) * 2000-06-16 2013-06-20 Bodymedia, Inc. System for monitoring and managing body weight and other physiological conditions including iterative and personalized planning, intervention and reporting capability
US20060253745A1 (en) * 2001-09-25 2006-11-09 Path Reliability Inc. Application manager for monitoring and recovery of software based application processes
US7526685B2 (en) 2001-09-25 2009-04-28 Path Reliability, Inc. Application manager for monitoring and recovery of software based application processes
US20030225474A1 (en) * 2002-05-31 2003-12-04 Gustavo Mata Specialization of active software agents in an automated manufacturing environment
US20040116102A1 (en) * 2002-12-17 2004-06-17 International Business Machines Corporation Heuristics for behavior based life support services
US9962091B2 (en) 2002-12-31 2018-05-08 Abbott Diabetes Care Inc. Continuous glucose monitoring system and methods of use
US10750952B2 (en) 2002-12-31 2020-08-25 Abbott Diabetes Care Inc. Continuous glucose monitoring system and methods of use
US10039881B2 (en) 2002-12-31 2018-08-07 Abbott Diabetes Care Inc. Method and system for providing data communication in continuous glucose monitoring and management system
US20040163079A1 (en) * 2003-02-13 2004-08-19 Path Communications, Inc. Software behavior pattern recognition and analysis
US7269824B2 (en) * 2003-02-13 2007-09-11 Path Reliability, Inc. Software behavior pattern recognition and analysis
US20100309001A1 (en) * 2003-04-04 2010-12-09 Abbott Diabetes Care Inc. Method and System for Transferring Analyte Test Data
US20100076288A1 (en) * 2003-04-04 2010-03-25 Brian Edmond Connolly Method and System for Transferring Analyte Test Data
US8437966B2 (en) 2003-04-04 2013-05-07 Abbott Diabetes Care Inc. Method and system for transferring analyte test data
US8483974B2 (en) 2003-04-04 2013-07-09 Abbott Diabetes Care Inc. Method and system for transferring analyte test data
US8560250B2 (en) 2003-04-04 2013-10-15 Abbott Laboratories Method and system for transferring analyte test data
US8682598B2 (en) 2003-04-04 2014-03-25 Abbott Laboratories Method and system for transferring analyte test data
US20100121168A1 (en) * 2003-04-04 2010-05-13 Abbott Diabetes Care Inc. Method and System for Transferring Analyte Test Data
US8112499B2 (en) 2003-04-17 2012-02-07 International Business Machines Corporation Administering devices in dependence upon user metric vectors
US20070287893A1 (en) * 2003-04-17 2007-12-13 Bodin William K Method And System For Administering Devices In Dependence Upon User Metric Vectors
US8145743B2 (en) 2003-04-17 2012-03-27 International Business Machines Corporation Administering devices in dependence upon user metric vectors
US8180885B2 (en) 2003-04-17 2012-05-15 International Business Machines Corporation Method and system for administering devices with multiple user metric spaces
US20070250561A1 (en) * 2003-04-17 2007-10-25 Bodin William K Method And System For Administering Devices With Multiple User Metric Spaces
US20040210626A1 (en) * 2003-04-17 2004-10-21 International Business Machines Corporation Method and system for administering devices in dependence upon user metric vectors
US8086658B2 (en) * 2003-05-06 2011-12-27 Ntt Docomo, Inc. Personalized discovery of services
US20070033261A1 (en) * 2003-05-16 2007-02-08 Matthias Wagner Personalized discovery of services
US20070283266A1 (en) * 2003-06-05 2007-12-06 Bodin William K Administering Devices With Dynamic Action Lists
US20040249825A1 (en) * 2003-06-05 2004-12-09 International Business Machines Corporation Administering devices with dynamic action lists
US20090284372A1 (en) * 2003-06-10 2009-11-19 Abbott Diabetes Care Inc. Glucose Measuring Device For Use In Personal Area Network
US9730584B2 (en) 2003-06-10 2017-08-15 Abbott Diabetes Care Inc. Glucose measuring device for use in personal area network
US8512239B2 (en) * 2003-06-10 2013-08-20 Abbott Diabetes Care Inc. Glucose measuring device for use in personal area network
US8688818B2 (en) 2003-07-02 2014-04-01 International Business Machines Corporation Administering devices with domain state objects
US20090019457A1 (en) * 2003-07-02 2009-01-15 International Business Machines Corporation Administering Devices With Domain State Objects
US8112509B2 (en) 2003-07-02 2012-02-07 International Business Machines Corporation Administering devices with domain state objects
US7196614B2 (en) * 2003-07-11 2007-03-27 Carolan Joseph P Guidance system for rescue personnel
US7461143B2 (en) * 2003-10-23 2008-12-02 International Business Machines Corporation Administering devices including allowed action lists
US20090055533A1 (en) * 2003-10-23 2009-02-26 International Business Machines Corporation Administering Devices Including Allowed Action Lists
US20050091384A1 (en) * 2003-10-23 2005-04-28 International Business Machines Corporation Administering devices including allowed action lists
US7912953B2 (en) 2003-10-23 2011-03-22 International Business Machines Corporation Administering devices including allowed action lists
US20050137465A1 (en) * 2003-12-23 2005-06-23 General Electric Company System and method for remote monitoring in home activity of persons living independently
US7091865B2 (en) 2004-02-04 2006-08-15 General Electric Company System and method for determining periods of interest in home of persons living independently
US8771183B2 (en) 2004-02-17 2014-07-08 Abbott Diabetes Care Inc. Method and system for providing data communication in continuous glucose monitoring and management system
US20050182306A1 (en) * 2004-02-17 2005-08-18 Therasense, Inc. Method and system for providing data communication in continuous glucose monitoring and management system
US11366873B2 (en) 2004-02-20 2022-06-21 Insignio Technologies, Inc. Personalized content processing and delivery system and media
US20090019061A1 (en) * 2004-02-20 2009-01-15 Insignio Technologies, Inc. Providing information to a user
EP1916639A2 (en) 2004-06-28 2008-04-30 Honeywell International Inc. Monitoring devices
US20050285941A1 (en) * 2004-06-28 2005-12-29 Haigh Karen Z Monitoring devices
US7966378B2 (en) 2004-08-04 2011-06-21 Kimberco, Inc. Computer-automated system and method of assessing the orientation, awareness and responses of a person with reduced capacity
US20060066448A1 (en) * 2004-08-04 2006-03-30 Kimberco, Inc. Computer-automated system and method of assessing the orientation, awareness and responses of a person with reduced capacity
US8635282B2 (en) 2004-08-04 2014-01-21 Kimberco, Inc. Computer-automated system and method of assessing the orientation, awareness and responses of a person with reduced capacity
US7562121B2 (en) * 2004-08-04 2009-07-14 Kimberco, Inc. Computer-automated system and method of assessing the orientation, awareness and responses of a person with reduced capacity
US20060041445A1 (en) * 2004-08-23 2006-02-23 Aaron Jeffrey A Electronic butler for providing application services to a user
US7698244B2 (en) * 2004-08-23 2010-04-13 At&T Intellectual Property I, L.P. Electronic butler for providing application services to a user
US20060055543A1 (en) * 2004-09-10 2006-03-16 Meena Ganesh System and method for detecting unusual inactivity of a resident
US20060104488A1 (en) * 2004-11-12 2006-05-18 Bazakos Michael E Infrared face detection and recognition system
US20060102843A1 (en) * 2004-11-12 2006-05-18 Bazakos Michael E Infrared and visible fusion face recognition system
US7469060B2 (en) 2004-11-12 2008-12-23 Honeywell International Inc. Infrared face detection and recognition system
US7602942B2 (en) 2004-11-12 2009-10-13 Honeywell International Inc. Infrared and visible fusion face recognition system
US20080275582A1 (en) * 2004-11-19 2008-11-06 Nettles Steven C Scheduling AMHS pickup and delivery ahead of schedule
US10417298B2 (en) 2004-12-02 2019-09-17 Insignio Technologies, Inc. Personalized content processing and delivery system and media
US20060123053A1 (en) * 2004-12-02 2006-06-08 Insignio Technologies, Inc. Personalized content processing and delivery system and media
US20090042246A1 (en) * 2004-12-07 2009-02-12 Gert Nikolaas Moll Methods For The Production And Secretion Of Modified Peptides
US7827009B2 (en) 2005-01-19 2010-11-02 Ans, Inc. Detectors and techniques useful with automated acquisition and notification systems
US20110029285A1 (en) * 2005-01-19 2011-02-03 Kunkel Daniel L Detection of Objects or Other Materials in a Receptacle
US8150656B2 (en) 2005-01-19 2012-04-03 Ans, Inc. Detection of objects or other materials in a receptacle
US20080147358A1 (en) * 2005-01-19 2008-06-19 Ans, Inc. Detectors and techniques useful with automated acquisition and notification systems
US20060187032A1 (en) * 2005-02-18 2006-08-24 Kunkel Daniel L Automated acquisition and notification system
US7340379B2 (en) 2005-02-18 2008-03-04 Ans, Inc. Automated acquisition and notification system
US20060285723A1 (en) * 2005-06-16 2006-12-21 Vassilios Morellas Object tracking system
US7720257B2 (en) 2005-06-16 2010-05-18 Honeywell International Inc. Object tracking system
US20060286620A1 (en) * 2005-06-18 2006-12-21 Karl Werner Glucose analysis instrument
US7695677B2 (en) * 2005-06-18 2010-04-13 Roche Diagnostics Operations, Inc. Glucose analysis instrument
US11928604B2 (en) * 2005-09-08 2024-03-12 Apple Inc. Method and apparatus for building an intelligent automated assistant
ES2325345A1 (en) * 2005-09-20 2009-09-01 Universidade Da Coruña Online interactive system for displaying content through a device, with capability for recording biomedical activities and parameters, cognitive intervention, home automation control, and remotely managed telealarm. (Machine-translation by Google Translate, not legally binding)
US20070078878A1 (en) * 2005-10-03 2007-04-05 Jason Knable Systems and methods for verbal communication from a speech impaired individual
US8087936B2 (en) * 2005-10-03 2012-01-03 Jason Knable Systems and methods for verbal communication from a speech impaired individual
US20120202177A1 (en) * 2005-10-03 2012-08-09 Jason Knable Systems and methods for verbal communication from a speech impaired individual
US20070092245A1 (en) * 2005-10-20 2007-04-26 Honeywell International Inc. Face detection and tracking in a wide field of view
US7806604B2 (en) 2005-10-20 2010-10-05 Honeywell International Inc. Face detection and tracking in a wide field of view
US8585591B2 (en) 2005-11-04 2013-11-19 Abbott Diabetes Care Inc. Method and system for providing basal profile modification in analyte monitoring and management systems
US11538580B2 (en) 2005-11-04 2022-12-27 Abbott Diabetes Care Inc. Method and system for providing basal profile modification in analyte monitoring and management systems
US9323898B2 (en) 2005-11-04 2016-04-26 Abbott Diabetes Care Inc. Method and system for providing basal profile modification in analyte monitoring and management systems
US20100274220A1 (en) * 2005-11-04 2010-10-28 Abbott Diabetes Care Inc. Method and System for Providing Basal Profile Modification in Analyte Monitoring and Management Systems
US9669162B2 (en) 2005-11-04 2017-06-06 Abbott Diabetes Care Inc. Method and system for providing basal profile modification in analyte monitoring and management systems
US20110003577A1 (en) * 2006-01-04 2011-01-06 Vtech Telecommunications Limited Cordless phone system with integrated alarm & remote monitoring capability
US9154933B2 (en) 2006-01-04 2015-10-06 Vtech Telecommunications Limited Cordless phone system with integrated alarm and remote monitoring capability
US20070161372A1 (en) * 2006-01-04 2007-07-12 Gary Rogalski Cordless phone system with integrated alarm & remote monitoring capability
US20070173978A1 (en) * 2006-01-04 2007-07-26 Gene Fein Controlling environmental conditions
US8825043B2 (en) * 2006-01-04 2014-09-02 Vtech Telecommunications Limited Cordless phone system with integrated alarm and remote monitoring capability
US7764167B2 (en) 2006-01-18 2010-07-27 British Telecommunications Plc Monitoring movement of an entity in an environment
US20070176760A1 (en) * 2006-01-18 2007-08-02 British Telecommunications Monitoring movement of an entity in an environment
US20070195703A1 (en) * 2006-02-22 2007-08-23 Living Independently Group Inc. System and method for monitoring a site using time gap analysis
US9039975B2 (en) 2006-03-31 2015-05-26 Abbott Diabetes Care Inc. Analyte monitoring devices and methods therefor
US9380971B2 (en) 2006-03-31 2016-07-05 Abbott Diabetes Care Inc. Method and system for powering an electronic device
US9625413B2 (en) 2006-03-31 2017-04-18 Abbott Diabetes Care Inc. Analyte monitoring devices and methods therefor
US8593109B2 (en) 2006-03-31 2013-11-26 Abbott Diabetes Care Inc. Method and system for powering an electronic device
US8597575B2 (en) 2006-03-31 2013-12-03 Abbott Diabetes Care Inc. Analyte monitoring devices and methods therefor
US9743863B2 (en) 2006-03-31 2017-08-29 Abbott Diabetes Care Inc. Method and system for powering an electronic device
US8933664B2 (en) 2006-03-31 2015-01-13 Abbott Diabetes Care Inc. Method and system for powering an electronic device
US20090105571A1 (en) * 2006-06-30 2009-04-23 Abbott Diabetes Care, Inc. Method and System for Providing Data Communication in Data Management Systems
US8051024B1 (en) 2006-07-14 2011-11-01 Ailive, Inc. Example-based creation and tuning of motion recognizers for motion-controlled applications
US9405372B2 (en) 2006-07-14 2016-08-02 Ailive, Inc. Self-contained inertial navigation system for interactive control using movable controllers
US7899772B1 (en) 2006-07-14 2011-03-01 Ailive, Inc. Method and system for tuning motion recognizers by a user using a set of motion signals
US20100113153A1 (en) * 2006-07-14 2010-05-06 Ailive, Inc. Self-Contained Inertial Navigation System for Interactive Control Using Movable Controllers
US9261968B2 (en) 2006-07-14 2016-02-16 Ailive, Inc. Methods and systems for dynamic calibration of movable game controllers
US7917455B1 (en) 2007-01-29 2011-03-29 Ailive, Inc. Method and system for rapid evaluation of logical expressions
US20080201206A1 (en) * 2007-02-01 2008-08-21 7 Billion People, Inc. Use of behavioral portraits in the conduct of E-commerce
US9633367B2 (en) 2007-02-01 2017-04-25 Iii Holdings 4, Llc System for creating customized web content based on user behavioral portraits
US20080228819A1 (en) * 2007-02-01 2008-09-18 7 Billion People, Inc. Use of behavioral portraits in web site analysis
US9646322B2 (en) 2007-02-01 2017-05-09 Iii Holdings 4, Llc Use of behavioral portraits in web site analysis
US10445764B2 (en) 2007-02-01 2019-10-15 Iii Holdings 4, Llc Use of behavioral portraits in the conduct of e-commerce
US10296939B2 (en) 2007-02-01 2019-05-21 Iii Holdings 4, Llc Dynamic reconfiguration of web pages based on user behavioral portrait
US10726442B2 (en) 2007-02-01 2020-07-28 Iii Holdings 4, Llc Dynamic reconfiguration of web pages based on user behavioral portrait
US9785966B2 (en) 2007-02-01 2017-10-10 Iii Holdings 4, Llc Dynamic reconfiguration of web pages based on user behavioral portrait
US8719105B2 (en) 2007-02-01 2014-05-06 7 Billion People, Inc. Dynamic reconfiguration of web pages based on user behavioral portrait
US10617823B2 (en) 2007-02-15 2020-04-14 Abbott Diabetes Care Inc. Device and method for automatic data acquisition and/or detection
US10022499B2 (en) 2007-02-15 2018-07-17 Abbott Diabetes Care Inc. Device and method for automatic data acquisition and/or detection
US20140220525A1 (en) * 2007-02-16 2014-08-07 Bodymedia, Inc. Managing educational content based on detected stress state and an individual's predicted type
US20140222732A1 (en) * 2007-02-16 2014-08-07 Bodymedia, Inc. Managing educational content based on detected stress state
US20140310276A1 (en) * 2007-02-16 2014-10-16 Bodymedia, Inc. Home automation systems utilizing detected stress data of an individual
US20140310275A1 (en) * 2007-02-16 2014-10-16 Bodymedia, Inc. Home automation systems utilizing detected stress data of an individual
US20140310297A1 (en) * 2007-02-16 2014-10-16 Bodymedia, Inc. Home automation systems utilizing detected stress data of an individual and the individual's predicted type
US9801545B2 (en) 2007-03-01 2017-10-31 Abbott Diabetes Care Inc. Method and apparatus for providing rolling data in communication systems
US9095290B2 (en) 2007-03-01 2015-08-04 Abbott Diabetes Care Inc. Method and apparatus for providing rolling data in communication systems
US8362904B2 (en) 2007-05-08 2013-01-29 Abbott Diabetes Care Inc. Analyte monitoring system and methods
US10178954B2 (en) 2007-05-08 2019-01-15 Abbott Diabetes Care Inc. Analyte monitoring system and methods
US8461985B2 (en) 2007-05-08 2013-06-11 Abbott Diabetes Care Inc. Analyte monitoring system and methods
US9314198B2 (en) 2007-05-08 2016-04-19 Abbott Diabetes Care Inc. Analyte monitoring system and methods
US11696684B2 (en) 2007-05-08 2023-07-11 Abbott Diabetes Care Inc. Analyte monitoring system and methods
US10653317B2 (en) 2007-05-08 2020-05-19 Abbott Diabetes Care Inc. Analyte monitoring system and methods
US9949678B2 (en) 2007-05-08 2018-04-24 Abbott Diabetes Care Inc. Method and device for determining elapsed sensor life
US10952611B2 (en) 2007-05-08 2021-03-23 Abbott Diabetes Care Inc. Analyte monitoring system and methods
US20080278332A1 (en) * 2007-05-08 2008-11-13 Abbott Diabetes Care, Inc. Analyte monitoring system and methods
US8456301B2 (en) 2007-05-08 2013-06-04 Abbott Diabetes Care Inc. Analyte monitoring system and methods
US20080281171A1 (en) * 2007-05-08 2008-11-13 Abbott Diabetes Care, Inc. Analyte monitoring system and methods
US9649057B2 (en) 2007-05-08 2017-05-16 Abbott Diabetes Care Inc. Analyte monitoring system and methods
US9574914B2 (en) 2007-05-08 2017-02-21 Abbott Diabetes Care Inc. Method and device for determining elapsed sensor life
US9035767B2 (en) 2007-05-08 2015-05-19 Abbott Diabetes Care Inc. Analyte monitoring system and methods
US9177456B2 (en) 2007-05-08 2015-11-03 Abbott Diabetes Care Inc. Analyte monitoring system and methods
US9000929B2 (en) 2007-05-08 2015-04-07 Abbott Diabetes Care Inc. Analyte monitoring system and methods
US8593287B2 (en) 2007-05-08 2013-11-26 Abbott Diabetes Care Inc. Analyte monitoring system and methods
US20080294558A1 (en) * 2007-05-23 2008-11-27 Masahiro Shimanuki Portable electronic appliance, data processor, data communication system, computer program, data processing method
US8251821B1 (en) 2007-06-18 2012-08-28 Ailive, Inc. Method and system for interactive control using movable controllers
US8617069B2 (en) 2007-06-21 2013-12-31 Abbott Diabetes Care Inc. Health monitor
US20100076289A1 (en) * 2007-06-21 2010-03-25 Abbott Diabetes Care Inc. Health Monitor
US8597188B2 (en) 2007-06-21 2013-12-03 Abbott Diabetes Care Inc. Health management devices and methods
US20100076291A1 (en) * 2007-06-21 2010-03-25 Abbott Diabetes Care Inc. Health Monitor
US11276492B2 (en) 2007-06-21 2022-03-15 Abbott Diabetes Care Inc. Health management devices and methods
US20080319296A1 (en) * 2007-06-21 2008-12-25 Abbott Diabetes Care, Inc. Health monitor
US20100076284A1 (en) * 2007-06-21 2010-03-25 Abbott Diabetes Care Inc. Health Management Devices and Methods
US11264133B2 (en) 2007-06-21 2022-03-01 Abbott Diabetes Care Inc. Health management devices and methods
US20100076290A1 (en) * 2007-06-21 2010-03-25 Abbott Diabetes Care Inc. Health Monitor
US20100076280A1 (en) * 2007-06-21 2010-03-25 Abbott Diabetes Care Inc. Health Monitor
US8798573B2 (en) * 2007-08-22 2014-08-05 Intel-Ge Care Innovations Llc Monitoring activities of daily living using radio frequency emissions
US20090054028A1 (en) * 2007-08-22 2009-02-26 Denning Jr Donald R Monitoring activities of daily living using radio frequency emissions
US20140335796A1 (en) * 2007-08-22 2014-11-13 Intel-Ge Care Innovations Llc Monitoring activities of daily living using radio frequency emissions
US20090089597A1 (en) * 2007-09-27 2009-04-02 Fuji Xerox Co., Ltd Information processing device, method of controlling the device, computer readable medium, and security system
US8041962B2 (en) * 2007-09-27 2011-10-18 Fuji Xerox Co., Ltd. Information processing device, method of controlling the device, computer readable medium, and security system
US20090105558A1 (en) * 2007-10-16 2009-04-23 Oakland University Portable autonomous multi-sensory intervention device
US20090221368A1 (en) * 2007-11-28 2009-09-03 Ailive Inc., Method and system for creating a shared game space for a networked game
US20090146829A1 (en) * 2007-12-07 2009-06-11 Honeywell International Inc. Video-enabled rapid response system and method
US7786858B2 (en) 2007-12-07 2010-08-31 Honeywell International Inc. Video-enabled rapid response system and method
US20130100268A1 (en) * 2008-05-27 2013-04-25 University Health Network Emergency detection and response system and method
US8737259B2 (en) 2008-05-30 2014-05-27 Abbott Diabetes Care Inc. Close proximity communication device and methods
US20110044333A1 (en) * 2008-05-30 2011-02-24 Abbott Diabetes Care Inc. Close Proximity Communication Device and Methods
US9184875B2 (en) 2008-05-30 2015-11-10 Abbott Diabetes Care, Inc. Close proximity communication device and methods
US8509107B2 (en) 2008-05-30 2013-08-13 Abbott Diabetes Care Inc. Close proximity communication device and methods
US11770210B2 (en) 2008-05-30 2023-09-26 Abbott Diabetes Care Inc. Close proximity communication device and methods
US9831985B2 (en) 2008-05-30 2017-11-28 Abbott Diabetes Care Inc. Close proximity communication device and methods
US8655622B2 (en) 2008-07-05 2014-02-18 Ailive, Inc. Method and apparatus for interpreting orientation invariant motion
US20100004896A1 (en) * 2008-07-05 2010-01-07 Ailive Inc. Method and apparatus for interpreting orientation invariant motion
US11202591B2 (en) 2009-02-03 2021-12-21 Abbott Diabetes Care Inc. Analyte sensor and apparatus for insertion of the sensor
US11006871B2 (en) 2009-02-03 2021-05-18 Abbott Diabetes Care Inc. Analyte sensor and apparatus for insertion of the sensor
US11166656B2 (en) 2009-02-03 2021-11-09 Abbott Diabetes Care Inc. Analyte sensor and apparatus for insertion of the sensor
US11213229B2 (en) 2009-02-03 2022-01-04 Abbott Diabetes Care Inc. Analyte sensor and apparatus for insertion of the sensor
US11006872B2 (en) 2009-02-03 2021-05-18 Abbott Diabetes Care Inc. Analyte sensor and apparatus for insertion of the sensor
US11006870B2 (en) 2009-02-03 2021-05-18 Abbott Diabetes Care Inc. Analyte sensor and apparatus for insertion of the sensor
US20100225498A1 (en) * 2009-03-05 2010-09-09 Searete Llc, A Limited Liability Corporation Postural information system and method
US9024976B2 (en) 2009-03-05 2015-05-05 The Invention Science Fund I, Llc Postural information system and method
US20100228494A1 (en) * 2009-03-05 2010-09-09 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Postural information system and method including determining subject advisory information based on prior determined subject advisory information
US20100271200A1 (en) * 2009-03-05 2010-10-28 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Postural information system and method including determining response to subject advisory information
US20100228492A1 (en) * 2009-03-05 2010-09-09 Searete Llc, A Limited Liability Corporation Of State Of Delaware Postural information system and method including direction generation based on collection of subject advisory information
US20100225474A1 (en) * 2009-03-05 2010-09-09 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Postural information system and method
US20100228487A1 (en) * 2009-03-05 2010-09-09 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Postural information system and method
US20100225490A1 (en) * 2009-03-05 2010-09-09 Leuthardt Eric C Postural information system and method including central determining of subject advisory information based on subject status information and postural influencer status information
US20100228495A1 (en) * 2009-03-05 2010-09-09 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Postural information system and method including determining subject advisory information based on prior determined subject advisory information
US20100228488A1 (en) * 2009-03-05 2010-09-09 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Postural information system and method
US20100228493A1 (en) * 2009-03-05 2010-09-09 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Postural information system and method including direction generation based on collection of subject advisory information
US20100228158A1 (en) * 2009-03-05 2010-09-09 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Postural information system and method including device level determining of subject advisory information based on subject status information and postural influencer status information
US20100228154A1 (en) * 2009-03-05 2010-09-09 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Postural information system and method including determining response to subject advisory information
US20100225491A1 (en) * 2009-03-05 2010-09-09 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Postural information system and method
US9226701B2 (en) 2009-04-28 2016-01-05 Abbott Diabetes Care Inc. Error detection in critical repeating data in a wireless sensor system
US10172518B2 (en) 2009-04-29 2019-01-08 Abbott Diabetes Care Inc. Method and system for providing data communication in continuous glucose monitoring and management system
US9949639B2 (en) 2009-04-29 2018-04-24 Abbott Diabetes Care Inc. Method and system for providing data communication in continuous glucose monitoring and management system
US10617296B2 (en) 2009-04-29 2020-04-14 Abbott Diabetes Care Inc. Method and system for providing data communication in continuous glucose monitoring and management system
US9693688B2 (en) 2009-04-29 2017-07-04 Abbott Diabetes Care Inc. Method and system for providing data communication in continuous glucose monitoring and management system
US9088452B2 (en) 2009-04-29 2015-07-21 Abbott Diabetes Care Inc. Method and system for providing data communication in continuous glucose monitoring and management system
US10950106B2 (en) 2009-05-18 2021-03-16 Alarm.Com Incorporated Fixed property monitoring with moving asset location tracking
US8531294B2 (en) * 2009-05-18 2013-09-10 Alarm.Com Incorporated Moving asset location tracking
US9666047B2 (en) 2009-05-18 2017-05-30 Alarm.Com Incorporated Fixed property monitoring with moving asset location tracking
US9123229B2 (en) 2009-05-18 2015-09-01 Alarm.Com Incorporated Fixed property monitoring with moving asset location tracking
US10366588B2 (en) 2009-05-18 2019-07-30 Alarm.Com Incorporated Fixed property monitoring with moving asset location tracking
US11651669B2 (en) 2009-05-18 2023-05-16 Alarm.Com Incorporated Fixed property monitoring with moving asset location tracking
US20100289644A1 (en) * 2009-05-18 2010-11-18 Alarm.Com Moving asset location tracking
US20100295674A1 (en) * 2009-05-21 2010-11-25 Silverplus, Inc. Integrated health management console
US11872370B2 (en) 2009-05-29 2024-01-16 Abbott Diabetes Care Inc. Medical device antenna systems having external antenna configurations
US11793936B2 (en) 2009-05-29 2023-10-24 Abbott Diabetes Care Inc. Medical device antenna systems having external antenna configurations
US8890937B2 (en) 2009-06-01 2014-11-18 The Curators Of The University Of Missouri Anonymized video analysis methods and systems
US20100328436A1 (en) * 2009-06-01 2010-12-30 The Curators Of The University Of Missouri Anonymized video analysis methods and systems
US20100302043A1 (en) * 2009-06-01 2010-12-02 The Curators Of The University Of Missouri Integrated sensor network methods and systems
US10188295B2 (en) 2009-06-01 2019-01-29 The Curators Of The University Of Missouri Integrated sensor network methods and systems
US11147451B2 (en) 2009-06-01 2021-10-19 The Curators Of The University Of Missouri Integrated sensor network methods and systems
US8993331B2 (en) 2009-08-31 2015-03-31 Abbott Diabetes Care Inc. Analyte monitoring system and methods for managing power and noise
US20110060530A1 (en) * 2009-08-31 2011-03-10 Abbott Diabetes Care Inc. Analyte Signal Processing Device and Methods
US10492685B2 (en) 2009-08-31 2019-12-03 Abbott Diabetes Care Inc. Medical devices and methods
US20110054282A1 (en) * 2009-08-31 2011-03-03 Abbott Diabetes Care Inc. Analyte Monitoring System and Methods for Managing Power and Noise
US9968302B2 (en) 2009-08-31 2018-05-15 Abbott Diabetes Care Inc. Analyte signal processing device and methods
US11635332B2 (en) 2009-08-31 2023-04-25 Abbott Diabetes Care Inc. Analyte monitoring system and methods for managing power and noise
US11045147B2 (en) 2009-08-31 2021-06-29 Abbott Diabetes Care Inc. Analyte signal processing device and methods
US10136816B2 (en) 2009-08-31 2018-11-27 Abbott Diabetes Care Inc. Medical devices and methods
US9314195B2 (en) 2009-08-31 2016-04-19 Abbott Diabetes Care Inc. Analyte signal processing device and methods
US11150145B2 (en) 2009-08-31 2021-10-19 Abbott Diabetes Care Inc. Analyte monitoring system and methods for managing power and noise
US10429250B2 (en) 2009-08-31 2019-10-01 Abbott Diabetes Care, Inc. Analyte monitoring system and methods for managing power and noise
USD1010133S1 (en) 2009-08-31 2024-01-02 Abbott Diabetes Care Inc. Analyte sensor assembly
US9058570B2 (en) * 2009-09-04 2015-06-16 Siemens Aktiengesellschaft Device and method for generating a targeted realistic motion of particles along shortest paths with respect to arbitrary distance weightings for simulations of flows of people and objects
US20120166162A1 (en) * 2009-09-04 2012-06-28 Siemens Aktiengesellschaft Device and method for generating a targeted realistic motion of particles along shortest paths with respect to arbitrary distance weightings for simulations of flows of people and objects
US20110093876A1 (en) * 2009-10-15 2011-04-21 At&T Intellectual Property I, L.P. System and Method to Monitor a Person in a Residence
US8516514B2 (en) 2009-10-15 2013-08-20 At&T Intellectual Property I, L.P. System and method to monitor a person in a residence
US9788057B2 (en) 2009-10-15 2017-10-10 At&T Intellectual Property I, L.P. System and method to monitor a person in a residence using a video camera
US20110090085A1 (en) * 2009-10-15 2011-04-21 At & T Intellectual Property I, L.P. System and Method to Monitor a Person in a Residence
US9247292B2 (en) 2009-10-15 2016-01-26 At&T Intellectual Property I, L.P. System and method to monitor a person in a residence with use of a set-top box device
US8390462B2 (en) * 2009-10-15 2013-03-05 At&T Intellectual Property I, L.P. System and method to monitor a person in a residence with use of a set-top box device
US10088819B2 (en) * 2010-03-25 2018-10-02 David H. C. Chen Systems and methods of property security
US20150205278A1 (en) * 2010-03-25 2015-07-23 David H.C. CHEN Systems and methods of property security
US20130057702A1 (en) * 2010-07-06 2013-03-07 Lg Electronics Inc. Object recognition and tracking based apparatus and method
US11189161B2 (en) 2010-09-15 2021-11-30 Comcast Cable Communications, Llc Securing property
US10311713B2 (en) 2010-09-15 2019-06-04 Comcast Cable Communications, Llc Securing property
ES2435865R1 (en) * 2010-11-22 2014-01-31 Universidad De Zaragoza MONITORING SYSTEM FOR DEPENDENT PERSONS
US9532737B2 (en) 2011-02-28 2017-01-03 Abbott Diabetes Care Inc. Devices, systems, and methods associated with analyte monitoring devices and devices incorporating the same
US20120304013A1 (en) * 2011-05-27 2012-11-29 International Business Machines Corporation Administering Event Pools For Relevant Event Analysis In A Distributed Processing System
US9344381B2 (en) 2011-05-27 2016-05-17 International Business Machines Corporation Event management in a distributed processing system
US9201756B2 (en) 2011-05-27 2015-12-01 International Business Machines Corporation Administering event pools for relevant event analysis in a distributed processing system
US9213621B2 (en) * 2011-05-27 2015-12-15 International Business Machines Corporation Administering event pools for relevant event analysis in a distributed processing system
US9460262B2 (en) * 2011-06-17 2016-10-04 The Research Foundation Of State University Of New York Detecting and responding to sentinel events
US9419650B2 (en) 2011-06-22 2016-08-16 International Business Machines Corporation Flexible event data content management for relevant event and alert analysis within a distributed processing system
US9286143B2 (en) 2011-06-22 2016-03-15 International Business Machines Corporation Flexible event data content management for relevant event and alert analysis within a distributed processing system
US9460350B2 (en) 2011-07-01 2016-10-04 Washington State University Activity recognition in multi-entity environments
US20110291827A1 (en) * 2011-07-01 2011-12-01 Baldocchi Albert S Portable Monitor for Elderly/Infirm Individuals
WO2013006508A1 (en) * 2011-07-01 2013-01-10 Wsu Research Foundation Activity recognition in multi-entity environments
US8884751B2 (en) * 2011-07-01 2014-11-11 Albert S. Baldocchi Portable monitor for elderly/infirm individuals
US9178936B2 (en) 2011-10-18 2015-11-03 International Business Machines Corporation Selected alert delivery in a distributed processing system
US9178937B2 (en) 2011-10-18 2015-11-03 International Business Machines Corporation Selected alert delivery in a distributed processing system
US9246865B2 (en) 2011-10-18 2016-01-26 International Business Machines Corporation Prioritized alert delivery in a distributed processing system
US9069536B2 (en) 2011-10-31 2015-06-30 Abbott Diabetes Care Inc. Electronic devices having integrated reset systems and methods thereof
US9465420B2 (en) 2011-10-31 2016-10-11 Abbott Diabetes Care Inc. Electronic devices having integrated reset systems and methods thereof
US9980669B2 (en) 2011-11-07 2018-05-29 Abbott Diabetes Care Inc. Analyte monitoring device and methods
WO2013085379A2 (en) * 2011-12-09 2013-06-13 Universiti Putra Malaysia Livestock management and automation system using radio waves
WO2013085379A3 (en) * 2011-12-09 2013-08-08 Universiti Putra Malaysia Livestock management and automation system using radio waves
US10445464B2 (en) * 2012-02-17 2019-10-15 Location Labs, Inc. System and method for detecting medical anomalies using a mobile communication device
US20130218812A1 (en) * 2012-02-17 2013-08-22 Wavemarket, Inc. System and method for detecting medical anomalies using a mobile communication device
US9597016B2 (en) 2012-04-27 2017-03-21 The Curators Of The University Of Missouri Activity analysis, fall detection and risk assessment systems and methods
US9408561B2 (en) 2012-04-27 2016-08-09 The Curators Of The University Of Missouri Activity analysis, fall detection and risk assessment systems and methods
US10080513B2 (en) 2012-04-27 2018-09-25 The Curators Of The University Of Missouri Activity analysis, fall detection and risk assessment systems and methods
US8954811B2 (en) 2012-08-06 2015-02-10 International Business Machines Corporation Administering incident pools for incident analysis
US8943366B2 (en) 2012-08-09 2015-01-27 International Business Machines Corporation Administering checkpoints for incident analysis
US9892155B2 (en) * 2012-09-06 2018-02-13 Beyond Verbal Communication Ltd System and method for selection of data according to measurement of physiological parameters
US20150234886A1 (en) * 2012-09-06 2015-08-20 Beyond Verbal Communication Ltd System and method for selection of data according to measurement of physiological parameters
US9968306B2 (en) 2012-09-17 2018-05-15 Abbott Diabetes Care Inc. Methods and apparatuses for providing adverse condition notification with enhanced wireless communication range in analyte monitoring systems
US11612363B2 (en) 2012-09-17 2023-03-28 Abbott Diabetes Care Inc. Methods and apparatuses for providing adverse condition notification with enhanced wireless communication range in analyte monitoring systems
US10652697B2 (en) 2012-10-08 2020-05-12 Location Labs, Inc. Bio-powered locator device
US10028099B2 (en) 2012-10-08 2018-07-17 Location Labs, Inc. Bio-powered locator device
US10492031B2 (en) 2012-10-08 2019-11-26 Location Labs, Inc. Bio-powered locator device
WO2014087041A1 (en) * 2012-12-03 2014-06-12 Menumat Oy Arrangement and method for nutrition and remote care services
WO2014087040A1 (en) * 2012-12-03 2014-06-12 Menumat Oy Arrangement and method for nutrition and care services
US9311382B2 (en) 2012-12-14 2016-04-12 Apple Inc. Method and apparatus for personal characterization data collection using sensors
US20160171866A1 (en) * 2013-04-22 2016-06-16 Domosafety Sa System and method for automated triggering and management of alarms
US9361184B2 (en) 2013-05-09 2016-06-07 International Business Machines Corporation Selecting during a system shutdown procedure, a restart incident checkpoint of an incident analyzer in a distributed processing system
US9324227B2 (en) 2013-07-16 2016-04-26 Leeo, Inc. Electronic device with environmental monitoring
US20150021465A1 (en) * 2013-07-16 2015-01-22 Leeo, Inc. Electronic device with environmental monitoring
US9778235B2 (en) 2013-07-17 2017-10-03 Leeo, Inc. Selective electrical coupling based on environmental conditions
US20150026111A1 (en) * 2013-07-22 2015-01-22 GreatCall, Inc. Method for engaging isolated individuals
US9170860B2 (en) 2013-07-26 2015-10-27 International Business Machines Corporation Parallel incident processing
US9658902B2 (en) 2013-08-22 2017-05-23 Globalfoundries Inc. Adaptive clock throttling for event processing
US9256482B2 (en) 2013-08-23 2016-02-09 International Business Machines Corporation Determining whether to send an alert in a distributed processing system
US9086968B2 (en) 2013-09-11 2015-07-21 International Business Machines Corporation Checkpointing for delayed alert creation
US9602337B2 (en) 2013-09-11 2017-03-21 International Business Machines Corporation Event and alert analysis in a distributed processing system
US10171289B2 (en) 2013-09-11 2019-01-01 International Business Machines Corporation Event and alert analysis in a distributed processing system
WO2015047166A1 (en) * 2013-09-25 2015-04-02 Phoniro Ab A telecare system and an electronic lock device for use therein, and an associated method for monitoring attendance to a telecare alarm event in a telecare system
US11395030B2 (en) 2013-11-19 2022-07-19 Comcast Cable Communications, Llc Premises automation control
US10939155B2 (en) 2013-11-19 2021-03-02 Comcast Cable Communications, Llc Premises automation control
US9389943B2 (en) 2014-01-07 2016-07-12 International Business Machines Corporation Determining a number of unique incidents in a plurality of incidents for incident processing in a distributed processing system
US9348687B2 (en) 2014-01-07 2016-05-24 International Business Machines Corporation Determining a number of unique incidents in a plurality of incidents for incident processing in a distributed processing system
US11044114B2 (en) 2014-01-31 2021-06-22 Vivint, Inc. Rule-based graphical conversational user interface for security and automation system
US11029655B2 (en) 2014-01-31 2021-06-08 Vivint, Inc. Progressive profiling in an automation system
WO2015116773A1 (en) 2014-01-31 2015-08-06 Vivint, Inc. Progressive profiling in an automation system
US10564614B2 (en) 2014-01-31 2020-02-18 Vivint, Inc. Progressive profiling in an automation system
US9402155B2 (en) 2014-03-03 2016-07-26 Location Labs, Inc. System and method for indicating a state of a geographic area based on mobile device sensor measurements
EP3118780A4 (en) * 2014-03-11 2017-03-22 Panasonic Intellectual Property Management Co., Ltd. Lifestyle behavior estimating device and program
US20170231546A1 (en) * 2014-03-18 2017-08-17 J. Kimo Arbas System and method to detect alertness of machine operator
US10383563B2 (en) * 2014-03-18 2019-08-20 J. Kimo Arbas System and method to detect alertness of machine operator
US20150279129A1 (en) * 2014-03-25 2015-10-01 Hitachi High-Technologies Corporation Failure cause classification apparatus
US9384603B2 (en) * 2014-03-25 2016-07-05 Hitachi High-Technologies Corporation Failure cause classification apparatus
CN104950866A (en) * 2014-03-25 2015-09-30 Hitachi High-Technologies Corporation Failure cause classification apparatus
US20150294067A1 (en) * 2014-04-14 2015-10-15 Elwha Llc Devices, systems, and methods for automated enhanced care rooms
US20150294085A1 (en) * 2014-04-14 2015-10-15 Elwha LLC, a limited company of the State of Delaware Devices, systems, and methods for automated enhanced care rooms
US20150294086A1 (en) * 2014-04-14 2015-10-15 Elwha Llc Devices, systems, and methods for automated enhanced care rooms
US10656607B2 (en) 2014-04-29 2020-05-19 Cox Communications, Inc. Systems and methods for intelligent automation control services
US10168676B2 (en) 2014-04-29 2019-01-01 Cox Communications, Inc. Systems and methods for intelligent customization of an automation control service
US10983487B2 (en) 2014-04-29 2021-04-20 Cox Communications, Inc. Systems and methods for autonomous adaptation of an automation control service
US10331095B2 (en) 2014-04-29 2019-06-25 Cox Communications, Inc. Systems and methods for development of an automation control service
US9372477B2 (en) 2014-07-15 2016-06-21 Leeo, Inc. Selective electrical coupling based on environmental conditions
US11245747B2 (en) 2014-07-31 2022-02-08 Honeywell International Inc. Monitoring a building management system
US20160033947A1 (en) * 2014-07-31 2016-02-04 Honeywell International Inc. Monitoring a building management system
US10666717B2 (en) * 2014-07-31 2020-05-26 Honeywell International Inc. Monitoring a building management system
US20170324808A1 (en) * 2014-07-31 2017-11-09 Honeywell International Inc. Monitoring a building management system
US9729618B2 (en) * 2014-07-31 2017-08-08 Honeywell International Inc. Monitoring a building management system
US9304590B2 (en) 2014-08-27 2016-04-05 Leeo, Inc. Intuitive thermal user interface
US10575829B2 (en) * 2014-09-03 2020-03-03 Earlysense Ltd. Menstrual state monitoring
US20160058428A1 (en) * 2014-09-03 2016-03-03 Earlysense Ltd. Menstrual state monitoring
US9865016B2 (en) 2014-09-08 2018-01-09 Leeo, Inc. Constrained environmental monitoring based on data privileges
US10102566B2 (en) 2014-09-08 2018-10-16 Leeo, Inc. Alert-driven dynamic sensor-data sub-contracting
US10304123B2 (en) 2014-09-08 2019-05-28 Leeo, Inc. Environmental monitoring device with event-driven service
US10043211B2 (en) 2014-09-08 2018-08-07 Leeo, Inc. Identifying fault conditions in combinations of components
US10078865B2 (en) 2014-09-08 2018-09-18 Leeo, Inc. Sensor-data sub-contracting during environmental monitoring
US10026304B2 (en) 2014-10-20 2018-07-17 Leeo, Inc. Calibrating an environmental monitoring device
US9445451B2 (en) 2014-10-20 2016-09-13 Leeo, Inc. Communicating arbitrary attributes using a predefined characteristic
KR102066368B1 (en) 2014-10-28 2020-01-14 Ebay Inc. Prioritizing software applications to manage alerts
KR101907145B1 (en) * 2014-10-28 2018-10-11 Ebay Inc. Prioritizing software applications to manage alerts
KR20180112126A (en) * 2014-10-28 2018-10-11 Ebay Inc. Prioritizing software applications to manage alerts
KR20170078743A (en) * 2014-10-28 2017-07-07 Ebay Inc. Prioritizing software applications to manage alerts
US20160117202A1 (en) * 2014-10-28 2016-04-28 Kamal Zamer Prioritizing software applications to manage alerts
CN107077386A (en) * 2014-10-28 2017-08-18 Ebay Inc. Prioritizing software applications to manage alerts
WO2016069768A1 (en) * 2014-10-28 2016-05-06 Ebay Inc. Prioritizing software applications to manage alerts
US9668320B2 (en) 2014-12-30 2017-05-30 Google Inc. Path light feedback compensation
US10127785B2 (en) 2014-12-30 2018-11-13 Google Llc Entry point opening sensor
US10290191B2 (en) * 2014-12-30 2019-05-14 Google Llc Alarm arming with open entry point
US9747769B2 (en) 2014-12-30 2017-08-29 Google Inc. Entry point opening sensor
US9332616B1 (en) 2014-12-30 2016-05-03 Google Inc. Path light feedback compensation
US9569943B2 (en) 2014-12-30 2017-02-14 Google Inc. Alarm arming with open entry point
US9940798B2 (en) 2014-12-30 2018-04-10 Google Llc Alarm arming with open entry point
US10339773B2 (en) 2014-12-30 2019-07-02 Google Llc Home security system with automatic context-sensitive transition to different modes
US10075475B2 (en) 2015-02-06 2018-09-11 Honeywell International Inc. Apparatus and method for dynamic customization of cyber-security risk item rules
US10021125B2 (en) 2015-02-06 2018-07-10 Honeywell International Inc. Infrastructure monitoring tool for collecting industrial process control and automation system risk data
US20160234243A1 (en) * 2015-02-06 2016-08-11 Honeywell International Inc. Technique for using infrastructure monitoring software to collect cyber-security risk data
US10021119B2 (en) 2015-02-06 2018-07-10 Honeywell International Inc. Apparatus and method for automatic handling of cyber-security risk events
US10075474B2 (en) 2015-02-06 2018-09-11 Honeywell International Inc. Notification subsystem for generating consolidated, filtered, and relevant security risk-based notifications
US10686841B2 (en) 2015-02-06 2020-06-16 Honeywell International Inc. Apparatus and method for dynamic customization of cyber-security risk item rules
US10764079B2 (en) 2015-02-09 2020-09-01 Vivint, Inc. System and methods for correlating sleep data to security and/or automation system operations
US10606221B2 (en) 2015-02-11 2020-03-31 International Business Machines Corporation Identifying home automation correlated events and creating portable recipes
US10298608B2 (en) 2015-02-11 2019-05-21 Honeywell International Inc. Apparatus and method for tying cyber-security risk analysis to common risk methodologies and risk levels
US10606222B2 (en) 2015-02-11 2020-03-31 International Business Machines Corporation Identifying home automation correlated events and creating portable recipes
US9685059B2 (en) 2015-02-12 2017-06-20 Google Inc. Devices and methods for providing heat-source alerts
US10078949B2 (en) 2015-02-12 2018-09-18 Google Llc Systems, devices, and methods for providing heat-source alerts
WO2016133659A1 (en) * 2015-02-19 2016-08-25 Vivint, Inc. Methods and systems for automatically monitoring user activity
US10419235B2 (en) 2015-02-19 2019-09-17 Vivint, Inc. Methods and systems for automatically monitoring user activity
US9942056B2 (en) 2015-02-19 2018-04-10 Vivint, Inc. Methods and systems for automatically monitoring user activity
US20200175846A1 (en) * 2015-03-27 2020-06-04 Google Llc Configuring a Smart Home Controller
US9800604B2 (en) 2015-05-06 2017-10-24 Honeywell International Inc. Apparatus and method for assigning cyber-security risk consequences in industrial process control environments
US10835186B2 (en) 2015-08-28 2020-11-17 Foresite Healthcare, Llc Systems for automatic assessment of fall risk
US11819344B2 (en) 2015-08-28 2023-11-21 Foresite Healthcare, Llc Systems for automatic assessment of fall risk
US11864926B2 (en) 2015-08-28 2024-01-09 Foresite Healthcare, Llc Systems and methods for detecting attempted bed exit
US10206630B2 (en) 2015-08-28 2019-02-19 Foresite Healthcare, Llc Systems for automatic assessment of fall risk
WO2017055317A1 (en) 2015-09-30 2017-04-06 Koninklijke Philips N.V. Assistance system for cognitively impaired persons
US9981385B2 (en) * 2015-10-12 2018-05-29 The Boeing Company Dynamic automation work zone safety system
US20170100838A1 (en) * 2015-10-12 2017-04-13 The Boeing Company Dynamic Automation Work Zone Safety System
US10805775B2 (en) 2015-11-06 2020-10-13 Jon Castor Electronic-device detection and activity association
US9801013B2 (en) 2015-11-06 2017-10-24 Leeo, Inc. Electronic-device association based on location duration
US20170178001A1 (en) * 2015-12-21 2017-06-22 Glen J. Anderson Technologies for cognitive cuing based on knowledge and context
US10599980B2 (en) * 2015-12-21 2020-03-24 Intel Corporation Technologies for cognitive cuing based on knowledge and context
US10614504B2 (en) 2016-04-15 2020-04-07 Walmart Apollo, Llc Systems and methods for providing content-based product recommendations
US10592959B2 (en) 2016-04-15 2020-03-17 Walmart Apollo, Llc Systems and methods for facilitating shopping in a physical retail facility
US10430817B2 (en) 2016-04-15 2019-10-01 Walmart Apollo, Llc Partiality vector refinement systems and methods through sample probing
US11276181B2 (en) 2016-06-28 2022-03-15 Foresite Healthcare, Llc Systems and methods for use in detecting falls utilizing thermal sensing
US20190216406A1 (en) * 2016-06-29 2019-07-18 Robert Polkowski Wearable device to assist cognitive dysfunction sufferer and method for operating the same
JP7114215B2 (en) 2016-06-30 2022-08-08 Toshiba Corporation Life data integrated analysis system, life data integrated analysis method, and life data integrated analysis program
JP2018005536A (en) * 2016-06-30 2018-01-11 Toshiba Digital Solutions Corporation Life data integration analysis system, life data integration analysis method, and life data integration analysis program
US10373464B2 (en) 2016-07-07 2019-08-06 Walmart Apollo, Llc Apparatus and method for updating partiality vectors based on monitoring of person and his or her home
US10169971B2 (en) * 2016-07-07 2019-01-01 Walmart Apollo, Llc Method and apparatus for monitoring person and home
US10504352B2 (en) 2016-07-07 2019-12-10 Walmart Apollo, Llc Method and apparatus for monitoring person and home
US20180012474A1 (en) * 2016-07-07 2018-01-11 Wal-Mart Stores, Inc. Method and apparatus for monitoring person and home
US10382441B2 (en) 2016-10-13 2019-08-13 Honeywell International Inc. Cross security layer secure communication
WO2018073241A1 (en) 2016-10-20 2018-04-26 Philips Lighting Holding B.V. A system and method for monitoring activities of daily living of a person
CN109843173A (en) * 2016-10-20 2019-06-04 Signify Holding B.V. System and method for monitoring activities of daily living of a person
US10810855B2 (en) 2016-10-20 2020-10-20 Signify Holding B.V. System and method for monitoring activities of daily living of a person
US10298411B2 (en) 2016-10-24 2019-05-21 Crestron Electronics, Inc. Building management system that determines building utilization
US20180174671A1 (en) * 2016-12-15 2018-06-21 International Business Machines Corporation Cognitive adaptations for well-being management
US10845775B2 (en) 2017-05-12 2020-11-24 The Boeing Company Modular safety monitoring and warning system and methods for use thereof
US10649433B2 (en) 2017-05-12 2020-05-12 The Boeing Company Modular safety monitoring and warning system and methods for use thereof
US10409252B2 (en) * 2017-05-12 2019-09-10 The Boeing Company Modular safety monitoring and warning system and methods for use thereof
CN108877158A (en) * 2017-05-12 2018-11-23 The Boeing Company Modular safety monitoring and warning system and methods for use thereof
US20180365965A1 (en) * 2017-05-25 2018-12-20 Robert Blatt Easily customizable inhabitant behavioral routines in a location monitoring and action system
US11138861B2 (en) * 2017-05-25 2021-10-05 Robert Blatt Easily customizable inhabitant behavioral routines in a location monitoring and action system
US11466884B2 (en) * 2017-09-28 2022-10-11 Honeywell International Inc. Actuators with condition tracking
US10634379B2 (en) 2017-09-28 2020-04-28 Honeywell International Inc. Actuators with condition tracking
US10905611B2 (en) * 2017-12-22 2021-02-02 Stryker Corporation Techniques for notifying persons within a vicinity of a patient support apparatus of a remote control function
US20190192368A1 (en) * 2017-12-22 2019-06-27 Stryker Corporation Techniques For Notifying Persons Within A Vicinity Of A Patient Support Apparatus Of A Remote Control Function
US11342074B1 (en) * 2017-12-31 2022-05-24 Teletracking Technologies, Inc. Patient identification using passive sensors
US11887461B2 (en) 2018-04-09 2024-01-30 State Farm Mutual Automobile Insurance Company Sensing peripheral heuristic evidence, reinforcement, and engagement system
US11869328B2 (en) 2018-04-09 2024-01-09 State Farm Mutual Automobile Insurance Company Sensing peripheral heuristic evidence, reinforcement, and engagement system
US11223588B2 (en) * 2018-09-19 2022-01-11 International Business Machines Corporation Using sensor data to control message delivery
WO2020069500A1 (en) * 2018-09-30 2020-04-02 Yibing Hu Smart health expert and manager
US10705108B1 (en) 2019-02-05 2020-07-07 Honeywell International Inc. Sensing system for sensing stationary objects
US20200273555A1 (en) * 2019-02-25 2020-08-27 Boe Technology Group Co., Ltd. Intelligent reminding method, device and electronic apparatus
US11224777B2 (en) 2019-02-25 2022-01-18 Honeywell International Inc. Fire and smoke actuator with temperature-dependent operating speed
US20220175598A1 (en) * 2019-03-05 2022-06-09 Fuji Corporation Assistance information management system
US11030882B2 (en) * 2019-03-11 2021-06-08 Lenovo (Singapore) Pte. Ltd. Automated security subsystem activation
US20220304603A1 (en) * 2019-06-17 2022-09-29 Happy Health, Inc. Wearable device operable to detect and/or manage user emotion
US11894129B1 (en) 2019-07-03 2024-02-06 State Farm Mutual Automobile Insurance Company Senior living care coordination platforms
US11901071B2 (en) 2019-08-19 2024-02-13 State Farm Mutual Automobile Insurance Company Senior living engagement and care support platforms
US11908578B2 (en) 2019-08-19 2024-02-20 State Farm Mutual Automobile Insurance Company Senior living engagement and care support platforms
US11923086B2 (en) 2019-08-19 2024-03-05 State Farm Mutual Automobile Insurance Company Senior living engagement and care support platforms
US11923087B2 (en) 2019-08-19 2024-03-05 State Farm Mutual Automobile Insurance Company Senior living engagement and care support platforms
US11434686B2 (en) 2019-11-20 2022-09-06 Kingsway Enterprises (Uk) Limited Pressure monitor
US11462091B2 (en) 2020-11-26 2022-10-04 Kingsway Enterprises (Uk) Limited Anti-ligature device
US20220167893A1 (en) * 2020-12-02 2022-06-02 Koninklijke Philips N.V. Methods and systems for detecting indications of cognitive decline
US11935651B2 (en) 2021-01-19 2024-03-19 State Farm Mutual Automobile Insurance Company Alert systems for senior living engagement and care support platforms
US11762350B2 (en) 2021-03-22 2023-09-19 Honeywell International Inc. Methods and systems for detecting occupancy of a space
CN114155612A (en) * 2022-02-10 2022-03-08 Shenzhen Aimo Technology Co., Ltd. Training method, detection method, and processing device for a model of non-standard restaurant personnel behavior
TWI815424B (en) * 2022-04-29 2023-09-11 優網通國際資訊股份有限公司 Residential safety monitoring system
US11950936B2 (en) 2023-02-22 2024-04-09 Abbott Diabetes Care Inc. Methods and apparatuses for providing adverse condition notification with enhanced wireless communication range in analyte monitoring systems

Also Published As

Publication number Publication date
AU2003228403A1 (en) 2003-10-13
WO2003083800A1 (en) 2003-10-09

Similar Documents

Publication Title
US20040030531A1 (en) System and method for automated monitoring, recognizing, supporting, and responding to the behavior of an actor
US11869328B2 (en) Sensing peripheral heuristic evidence, reinforcement, and engagement system
US10311694B2 (en) System and method for adaptive indirect monitoring of subject for well-being in unattended setting
US10475141B2 (en) System and method for adaptive indirect monitoring of subject for well-being in unattended setting
US20190272725A1 (en) Pharmacovigilance systems and methods
JP6502502B2 (en) System and method for monitoring human daily activities
US20130150686A1 (en) Human Care Sentry System
Brownsell et al. Assistive technology and telecare: forging solutions for independent living
US11633103B1 (en) Automatic in-home senior care system augmented with internet of things technologies
US11540757B2 (en) Assessing the functional ability of a person to perform a task
WO2006094401A1 (en) Home device monitoring system and method for interacting with a primary user
JP2017168098A (en) Watching system and life support proposing system
EP1807816A1 (en) System and method for automatically including supplemental information in reminder messages
Haigh et al. An open agent architecture for assisting elder independence
WO2019070763A1 (en) Caregiver mediated machine learning training system
WO2016057564A1 (en) System and method for adaptive indirect monitoring of subject for well-being in unattended setting
Brownsell et al. Future systems for remote health care
WO2020075675A1 (en) Care system management method, management device and program
Chiridza et al. A Smart Home environment to support risk monitoring for the elderly living independently
Haigh et al. Agents for recognizing and responding to the behaviour of an elder
Adlam et al. Implementing monitoring and technological interventions in smart homes for people with dementia-case studies
Ali et al. Developing a Fall-Prevention System for Nursing Homes
JP6947064B2 (en) Watching device, watching method, and watching program
Bellavista et al. Challenges, opportunities and solutions for ubiquitous eldercare
US20240119820A1 (en) Sensing peripheral heuristic evidence, reinforcement, and engagement system

Legal Events

Code Title Description
AS Assignment

Owner name: HONEYWELL INTERNATIONAL INC., MINNESOTA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:DEWING, WENDE L.;HAIGH, KAREN Z.;TOMS, DAVID C.;AND OTHERS;REEL/FRAME:014427/0555;SIGNING DATES FROM 20030521 TO 20030605

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION