US20120078647A1 - Systems and methods for improved perinatal workflow - Google Patents

Systems and methods for improved perinatal workflow

Info

Publication number
US20120078647A1
US20120078647A1 (application US 12/970,573)
Authority
US
United States
Prior art keywords
patient
user
user interface
data
patients
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/970,573
Inventor
Tamara Grassle
Dianne Kessler
Charles Levecke
Leonard Evan Kahn
Afrazuddin Mohammad
John Baartz
Jeffrey Paul Czaplewski
Michael Jordan
Renee Vitullo
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
General Electric Co
Original Assignee
General Electric Co
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by General Electric Co
Priority to US 12/970,573 (published as US20120078647A1)
Assigned to GENERAL ELECTRIC COMPANY (assignment of assignors interest; see document for details). Assignors: JORDAN, MICHAEL; BAARTZ, JOHN; CZAPLEWSKI, JEFFREY PAUL; GRASSLE, TAMARA; KAHN, LEONARD EVAN; KESSLER, DIANNE; LEVECKE, CHARLES; MOHAMMAD, AFRAZUDDIN; VITULLO, RENEE
Publication of US20120078647A1
Priority to US 29/535,623 (USD792431S1)
Priority to US 29/606,622 (USD846579S1)
Priority to US 29/686,632 (USD933689S1)
Legal status: Abandoned

Classifications

    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H15/00: ICT specially adapted for medical reports, e.g. generation or transmission thereof
    • G16H40/63: ICT specially adapted for the management or administration of healthcare resources or facilities, or for the management or operation of medical equipment or devices; for the local operation of medical equipment or devices
    • G16H40/67: ICT specially adapted for the management or administration of healthcare resources or facilities, or for the management or operation of medical equipment or devices; for the remote operation of medical equipment or devices

Definitions

  • Perinatal systems involve a high degree of data granularity and a particular workflow for high acuity cases.
  • Patient condition can change very rapidly so large delays in getting information into an electronic system are not acceptable.
  • Certain examples provide systems and methods for improved perinatal workflow.
  • the system includes a memory to buffer live streaming data for one or more patients.
  • the system includes a user interface to display and receive input with respect to a list of one or more patients associated with a clinician; a control to facilitate user selection of one or more patients from the list; and live streaming data received from one or more monitors for one or more selected patients, wherein the user interface is to facilitate user selection of a patient for a more detailed patient view.
  • the system includes an alert for one or more selected patients to be triggered based on a defined criterion.
  • the system includes a processor to process data for output via the user interface and to process user input.
  • Certain examples provide a computer-implemented method for clinical patient monitoring.
  • the method includes displaying a list of one or more patients associated with a clinician and facilitating user selection of one or more patients from the list.
  • the method includes providing, via a user interface, live streaming data received from one or more monitors for one or more selected patients and providing, upon user selection of a patient via the user interface, a more detailed patient view for the selected patient.
  • the method includes generating an alert for one or more selected patients to be triggered based on a defined criterion.
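The summarized monitoring loop could be sketched as follows; the class name, buffer size, and heart-rate range are illustrative assumptions, not details from the application:

```python
from collections import deque

class SurveillanceList:
    """Buffers live streaming samples for selected patients and raises an
    alert when a defined criterion is met."""

    def __init__(self, alert_criterion, buffer_size=256):
        self.alert_criterion = alert_criterion  # callable(sample) -> bool
        self.buffer_size = buffer_size
        self.selected = {}                      # patient id -> sample buffer
        self.alerts = []

    def select(self, patient_id):
        """Add a patient from the clinician's list to the monitored set."""
        self.selected[patient_id] = deque(maxlen=self.buffer_size)

    def ingest(self, patient_id, sample):
        """Buffer one live sample and evaluate the alert criterion."""
        if patient_id in self.selected:
            self.selected[patient_id].append(sample)
            if self.alert_criterion(sample):
                self.alerts.append((patient_id, sample))

# Alert when a fetal heart rate leaves a nominal 110-160 bpm range
# (the range is illustrative, not from the patent):
monitor = SurveillanceList(alert_criterion=lambda bpm: not 110 <= bpm <= 160)
monitor.select("patient-1")
monitor.ingest("patient-1", 140)  # in range: buffered only
monitor.ingest("patient-1", 95)   # out of range: alert recorded
```

The bounded buffer stands in for the memory that "buffers live streaming data" in the summary; a real system would also carry timestamps and per-monitor channels.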
  • the clinical dock interaction display system includes a user interface to display and receive input with respect to a list of one or more patients associated with a clinician; a control to facilitate user selection of one or more patients from the list; and live streaming data received from one or more monitors for one or more selected patients.
  • the user interface is to facilitate user selection of a patient for a more detailed patient view.
  • the system also includes an alert for one or more selected patients to be triggered based on a defined criterion.
  • FIGS. 1-2 illustrate example dock interaction displays providing patient information to a user.
  • FIG. 3 depicts a flow chart for an example method for providing a clinical surveillance view of patient data to a user.
  • FIG. 4 illustrates an example viewer providing a time continuum and associated information for a monitored patient.
  • FIG. 5 depicts a flow chart for an example method for providing a clinical time continuum and associated real time data for a patient to a user.
  • FIG. 6 depicts a flow chart for an example method for voice recording, playback, and integration with a patient record.
  • FIG. 7 illustrates an example voice recording and review interface.
  • FIG. 8 illustrates a flow chart for an example method for smart clinical annotation of patient information in a clinical workflow.
  • FIG. 9 illustrates an example interface for annotation review.
  • FIG. 10 is a block diagram of an example processor system that can be used to implement the systems, apparatus and methods described herein.
  • At least one of the elements in at least one example is hereby expressly defined to include a tangible medium such as a memory, DVD, CD, Blu-ray, etc., storing the software and/or firmware.
  • Certain clinical areas, such as a high acuity perinatal department in a hospital, have a much higher clinician-to-patient ratio because the clinician should be documenting continuously.
  • A large delay is introduced because clinicians have gloved hands and are treating a patient; they must then wash their hands, get to a computer to enter information, etc. Thus, an hour or more may pass before documentation is entered. People forget information or chart it at the wrong time, which impacts legal liability and patient treatment.
  • Certain examples are agnostic to any enterprise system. Certain examples help support an improved real-time perinatal workflow with additional tools, information, and capabilities. Certain examples provide sharing of a clinical (e.g., perinatal) application with other clinical applications. Certain examples provide trend navigation through clinical (e.g., perinatal) data. Certain examples provide a multi-patient sidebar view of real time and/or stored patient data. Certain examples provide a view of a time continuum and associated data along with an ability to edit, annotate, report, and retrieve information associated with a patient's time continuum.
  • Healthcare professionals can review and document patient data in an application (e.g., a perinatal application) while monitoring various live streaming data of other patients using an overlaid user interface (UI) widget.
  • a healthcare professional can examine a detailed view of any of the other patients' data by selecting a patient and displaying the detailed information, while switching contexts in the underlying application to allow continuous documentation.
  • a Clinician's Surveillance View (CSV) docks or positions itself with respect to one of four sides of a monitor/display screen (e.g., against an edge of the display).
  • the CSV can be left to float in any section of the visual display.
  • the CSV overlays, in a non-intrusive way, an underlying application that is currently being used, which helps enable the health care professional to pursue other tasks while monitoring information in the CSV bar.
  • the CSV includes certain interactions and options to enable health care professionals to add and monitor patient information in real time.
  • health care professionals can click on one of the patients' live feeds to view additional information.
  • when a live feed is clicked to view additional information, other background applications bring that particular patient's information into focus, avoiding redundant data manipulation.
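One way to realize this focus-switching is a simple publish/subscribe bus that background applications register with; the names below are illustrative, since the application does not specify a mechanism:

```python
class PatientContextBus:
    """Minimal publish/subscribe sketch: when a live feed is clicked in
    the sidebar, subscribed background applications are told to bring
    that patient's information into focus."""

    def __init__(self):
        self._subscribers = []

    def subscribe(self, callback):
        """Register a background application's focus handler."""
        self._subscribers.append(callback)

    def focus_patient(self, patient_id):
        """Broadcast the newly selected patient to all subscribers."""
        for callback in self._subscribers:
            callback(patient_id)

bus = PatientContextBus()
focused = []
bus.subscribe(focused.append)   # e.g., a background charting application
bus.focus_patient("patient-7")  # user clicks that patient's live feed
```

In production this role is often played by a clinical context-management standard such as HL7 CCOW, which the application does not name but which solves the same problem.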
  • the CSV helps to prevent interruptions in usage of other systems involved in a clinical (e.g., perinatal) workflow, such as an enterprise wide system, while allowing clinicians to simultaneously (or at least substantially simultaneously, given some system delay) monitor live data for multiple patients.
  • the CSV can be used in a labor and delivery environment with real-time fetal monitoring, as well as in any high-risk care area.
  • the CSV can interoperate and function concurrently with enterprise wide application(s).
  • Enterprise wide applications are generally non-specific to a particular care area, and the CSV facilitates user access to functionality and rich clinical content targeted for high-risk care areas, while allowing the user to continue to leverage enterprise wide systems for comprehensive documentation.
  • continuous streams of live data can be embedded within an overlaid application.
  • FIG. 1 illustrates an example dock interaction display 100 or “sidebar” providing a patient list and related real time waveform information to a user.
  • the display 100 can be positioned anywhere on a display screen by a user, such as in a middle right hemisphere of a user's screen.
  • the display 100 can interact with and/or be apart from other application(s) executing on the user's computer.
  • a user can search for a patient 110 , add a patient 120 , drag and drop 130 patient(s) from a list 135 to be monitored, etc.
  • real time (or substantially real time including a system delay (e.g., processing, data retrieval from memory, communication, etc.)) data (e.g., fetal monitor, patient monitor, and/or other waveform data, etc.) 140 can be displayed for one or more selected patients via the display 100 . Additionally, an indicator or alert 150 regarding one or more patients can be provided via the display 100 .
  • FIG. 2 illustrates another example dock interaction display 200 .
  • the example display 200 depicts how the dock display 200 may look if docked or positioned in a top hemisphere of a user's display screen.
  • a user can be provided with electrocardiogram (EKG) and/or other live streaming waveform data for selected patient(s). Color-coded alerting can be provided. A user can select a patient in the sidebar 100 , 200 to see a more detailed patient view. Live active scrolling can be provided.
  • a voice-activated “mark” button can be provided in conjunction with the waveform data to allow a user to document in real-time (or substantially real time) through keyboard input, mouse selection, voice indication, foot pedal, etc., to make a mark and/or other annotation on the “live” waveform.
  • a mark can be automatically converted into an annotation.
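The mark-to-annotation conversion might look like the following sketch; the annotation record shape is a hypothetical, not a structure from the application:

```python
from dataclasses import dataclass

@dataclass
class Annotation:
    timestamp: float  # seconds into the live waveform
    text: str         # annotation body ("mark" if no note was given)
    source: str       # input method: keyboard, mouse, voice, foot pedal

def mark_to_annotation(waveform_time, input_method, note=""):
    """Convert a real-time 'mark' on the live waveform into a structured
    annotation tied to the waveform timestamp."""
    return Annotation(timestamp=waveform_time,
                      text=note or "mark",
                      source=input_method)

# A mark made via foot pedal 1.5 s into the live strip:
ann = mark_to_annotation(waveform_time=1.5, input_method="foot pedal")
```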
  • FIG. 3 depicts a flow chart for an example method 300 for providing a clinical surveillance view of patient data to a user.
  • a surveillance viewer is positioned on a user's display.
  • the viewer can automatically (e.g., based on a default position, user-specified preference, concurrent application(s) executing, workflow, etc.) be positioned and/or manually be positioned by a user on the display.
  • the viewer overlays, in a non-intrusive way, one or more underlying application(s) currently in use on the display.
  • a user can interact with other applications in a patient care workflow while monitoring information in the surveillance viewer.
  • one or more patients to be monitored are identified.
  • a user can search for a patient.
  • a user can provide a patient name and/or browse a list of available patient(s) to identify one or more patients to be monitored.
  • real time (e.g., including substantially real time) data for the patient is displayed to the user via the surveillance viewer.
  • fetal waveform, patient EKG, blood pressure, and/or other data can be displayed via a live feed to the user in the surveillance viewer.
  • the surveillance monitor can be used in a labor and delivery environment with real-time fetal monitoring, as well as in any other high-risk care area.
  • when a live feed (e.g., a fetal waveform) is "moused" or hovered over, or otherwise selected to view additional information, that particular information is retrieved for display and/or brought into focus for the user.
  • an indicator or alert can be marked via the surveillance viewer.
  • an indicator or mark can be provided for a patient, a data feed, etc., for display via the surveillance viewer.
  • certain examples enable a visualization of directly and indirectly acquired clinical content plotted over time and perpetually updated.
  • Features of the visualization include co-location of clinically relevant content collected from disparate sources, supplemented by a mechanism to initiate annotations (e.g., assessments and/or actions), through which a user can indicate, preserve, and/or visualize an intended association with the content that motivated the annotation.
  • certain examples provide for initiation of recording by an external recording device (e.g., audio, video, etc.), at a point of care, in order to reconstitute annotations for formal documentation at a later time.
  • Certain examples provide a continuously updating graph with time as the x-axis to plot direct observations. Indirect observations are rendered on the same timeline, presenting a visual indicator of source and, potentially, a summary of content.
  • a support engine processes observations to discover pattern(s) of potential correlation. Patterns are based on interpreting the values of collected observations and deducing where annotations may be appropriate, for example. These pattern discoveries are displayed on the graph as indicators of potential annotation opportunity(ies).
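As a toy illustration of such a support engine, the sketch below flags runs of low readings as potential annotation opportunities; the pattern rule and thresholds are invented for the example, not taken from the application:

```python
def discover_annotation_opportunities(observations, window=3, threshold=100):
    """Scan time-ordered (time, value) observations for a simple pattern:
    `window` consecutive values below `threshold`. Each discovered run is
    reported as a potential annotation opportunity with its time span."""
    opportunities = []
    run = []  # timestamps of the current below-threshold run
    for t, value in observations:
        if value < threshold:
            run.append(t)
            if len(run) == window:
                opportunities.append({"start": run[0], "end": run[-1],
                                      "reason": f"{window} readings < {threshold}"})
        else:
            run = []
    return opportunities

# Three consecutive low readings at t = 1..3 trigger one indicator:
obs = [(0, 120), (1, 95), (2, 90), (3, 85), (4, 130)]
hits = discover_annotation_opportunities(obs)
```

A real engine would interpret clinically meaningful patterns (e.g., decelerations relative to contractions) rather than a fixed threshold.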
  • Recorded annotations maintain an association to pattern sources, for example.
  • the association can indicate a target concern to which an annotation applies.
  • These concerns can be composites of specific discrete observations, associated with a range of time, and/or with a series of time ranges, for example.
  • users can initiate annotations and explicitly define their own observation dependency(ies) by selecting and highlighting a range of time that includes known content, or by explicitly multi-selecting specific content (e.g., CTRL-CLICK), for example.
  • associated clinical content can be exposed.
  • Proposed exposition can include a bubble web from the annotation whereby the associated content is encircled and a connection line is drawn, for example.
  • Associations based strictly on time range can highlight a relevant range on the graph, for example.
  • audio/visual recording can be initiated to invoke a record action.
  • the visual can then display the start of the record mode and its duration.
  • the completion of the record mode results in the preservation of the recording as a clinical observation and the creation of a proposed annotation (as noted above), with the recording as the associated content.
  • clinicians may be interested in reviewing correlated data as the data becomes available.
  • a clinician might not be aware that data is available or might have to search through multiple sources to correlate different data inputs (such as lab data, with vitals data, with fetal strip data, etc.). Because a clinician can see the data correlated on one screen from multiple external sources, the clinician can now more efficiently review the data, make inferences from that data, and document interactions associated with a group of data elements on a time continuum.
  • Certain examples provide a new level of usability for a live patient encounter and bedside documentation.
  • Data representation facilitates contextual interaction against a patient record at a specific point in time.
  • Centralized visualization of different sources of data is provided on a patient time-continuum in a single application.
  • the patient time continuum organization and display allows for correlation of other series/sources of data to provide further evidence of an event and its related annotation.
  • the time continuum display also enhances recognition of associative relationships by providing visual indicators.
  • flowsheets can be used to record content over time.
  • flowsheets are significantly more rigid in structure (having time bound to known intervals as columns), which consumes a significant amount of lateral real estate.
  • Certain examples allow for direct documentation upon a continuous waveform. Furthermore, other clinical observations over time, such as XDS document awareness, lab messages, etc., can be documented and displayed upon the continuous waveform. Associative relationships, as well as visual co-location, of the data can also be facilitated.
  • electronic fetal waveform capture with electronic annotations allows the content to be stored in a discoverable and searchable format. Recordings can be automatically and/or manually initiated, for example. In certain examples, hyperlinks can allow traversal of associated content through data association(s).
  • FIG. 4 illustrates an example viewer 400 providing a time continuum and associated information for a monitored patient.
  • the viewer 400 conveys indications of monitored waveform data 410 provided against a baseline 420 for a patient.
  • the waveform data 410 represents a portion of a time continuum 405 for the patient (e.g., an expanded portion compared to a compressed overall view of the time continuum 405 ).
  • the time continuum 405 can provide a window 430 (e.g., an eight hour window) in a current care cycle for the patient, while the waveform data 410 represents a current live feed or another selected subset of that data.
  • the waveform data window 410 can provide a certain time subset 440 determined automatically and/or specified by a user (e.g., looking back five minutes, ten minutes, twenty minutes, etc.).
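Selecting the look-back subset of the continuum could be as simple as the following sketch (times in seconds; the sample shape is an assumption for illustration):

```python
def window_subset(samples, now, lookback_minutes):
    """Return the portion of a time continuum inside a user-selected
    look-back window (e.g., the last five minutes), as the viewer's
    expanded waveform pane would display it."""
    cutoff = now - lookback_minutes * 60
    return [(t, v) for t, v in samples if t >= cutoff]

# A 10-minute continuum of (seconds, value) samples; show the last 5 min:
continuum = [(0, 1.0), (300, 1.2), (540, 1.1), (590, 1.3)]
recent = window_subset(continuum, now=600, lookback_minutes=5)
```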
  • One or more controls 450 allow a user to overlay data, mark data, insert annotations or notes, etc.
  • automatic (e.g., system or derived) annotations 460 as well as user-input annotations 470 can be shown and interacted with via the viewer 400 .
  • a user documents with respect to a point of care as the user is providing care and ties the documentation to the time continuum 405 .
  • This information can be provided (e.g., automatically and/or manually) into a flowsheet and/or other format to be viewed and/or correlated with other information (e.g., a vital signs graph, EKG waveform, lab results, etc.) so that the information all appears in the time continuum 405 .
  • Information can be tied together and provided in a visual format, so that, for example, a user can see a last lab result when a patient's EKG dropped. The user can open the lab result and superimpose a vital signs graph and see the information together for evaluation.
  • Information can be provided in real-time (including substantially in real time), not after the fact (like a longitudinal record).
  • the time continuum 405 can stretch from the current moment to the beginning of a cycle of care (e.g., the start of pregnancy).
  • the time continuum 405 may include two hours of testing, then have no data for two weeks, then include a hospital visit, etc.
  • a user can track everything done in the time continuum and can select and review individual items in more detail.
  • the time continuum 405 can be presented chronologically based on occurrence, chronologically based on time of documentation, etc.
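The two orderings mentioned above could be sketched as follows, with entry field names assumed for illustration:

```python
def order_continuum(entries, by="occurrence"):
    """Sort continuum entries chronologically either by when the event
    occurred or by when it was documented, the two presentations the
    viewer supports."""
    key = {"occurrence": lambda e: e["occurred"],
           "documentation": lambda e: e["documented"]}[by]
    return sorted(entries, key=key)

# A lab result that happened first but was charted last:
entries = [{"name": "lab result", "occurred": 10, "documented": 40},
           {"name": "vitals",     "occurred": 20, "documented": 30}]
by_event = order_continuum(entries)                      # occurrence order
by_chart = order_continuum(entries, by="documentation")  # charting order
```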
  • the viewer 400 can provide different ways to analyze a story and recreate what happened, for example.
  • data can be provided to a mobile device (e.g., a tablet computer, smart phone, laptop, netbook, personal digital assistant, etc.).
  • voice commands can be provided via a wired or wireless connection (e.g., Bluetooth, Wi-Fi, etc.) and a user can review information on the mobile, etc.
  • the mobile device can perform at least some of the processing for dictation, etc.
  • a user can document “on the fly” as he or she moves a patient from a waiting room to an operating room, for example.
  • Data can also be captured during transport from hospital to hospital and documentation maintained in transit to complete the patient record, for example.
  • a user may have other data not tied to the time continuum that he or she wants to see in the same space, but separate from the time continuum (e.g., a labor curve, vitals, growth curve, normal ranges, etc.).
  • the viewer 400 can provide a separate axis for such information (e.g., an axis showing four hours of data versus fifteen minutes of waveform data).
  • the viewer 400 provides a real-time (including substantially real time) push of updated information/content (e.g., lab results, etc.) that “pops up” or appears for a user to see as the user is documenting and treating a patient in real time.
  • the viewer 400 can be pre-configured to quickly provide information to a user that he or she can pop up in the time continuum 405 and then close to resume patient charting. For example, the user can view the popup data but does not have to pull the data item(s) into their time continuum.
  • the viewer 400 is provided as a “floating” window that is always on top and always available with other applications visible and accessible underneath.
  • a compressed view can be provided on top of the floating window to see trending over a long period of time without having to scroll through data, for example.
  • the time continuum 405 can be searched by keyword and navigated to a location or locations for a corresponding annotation (or annotations). In certain examples, a user can grab or otherwise select a tab and navigate forward and/or backward through available data.
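Keyword search over the continuum's annotations might be sketched as below; the (timestamp, text) shape is an assumption for illustration:

```python
def search_annotations(annotations, keyword):
    """Case-insensitive keyword search over (timestamp, text) annotations.
    Returns the timestamps of matches so the viewer can navigate to each
    corresponding location on the time continuum."""
    keyword = keyword.lower()
    return [t for t, text in annotations if keyword in text.lower()]

notes = [(100, "Oxytocin started"),
         (250, "Decelerations noted"),
         (400, "oxytocin rate increased")]
hits = search_annotations(notes, "oxytocin")  # [100, 400]
```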
  • the time continuum 405 and other information in the viewer 400 can include flags, indicators, and/or other pointers to data, events, and/or other information, for example.
  • FIG. 5 depicts a flow chart for an example method 500 for providing a clinical time continuum and associated real time data for a patient to a user.
  • a patient is selected for monitoring and viewing via a time continuum viewer.
  • a time continuum of data for the patient over a specified period is graphically represented via the viewer.
  • a real time or live portion (including substantially real time) of the time continuum data is displayed via the window. For example, the past ten minutes of the patient's EKG waveform are displayed in greater detail apart from the overall time continuum. The time continuum and the portion continue to update in real time (including substantially real time).
  • when a live feed (e.g., a fetal waveform) is "moused" or hovered over, or otherwise selected to view additional information, that particular information is retrieved for display and/or brought into focus for the user.
  • an indicator or alert can be marked via the surveillance viewer.
  • an indicator or mark can be provided for a patient, a data feed, etc., for display via the surveillance viewer.
  • Certain examples provide a voice and video recognition engine embedded in the software and/or hardware to capture audio and/or video content and identify pertinent data elements and events in recorded data. For example, captured audio can be parsed and spoken data matched to discrete electronic medical record (EMR) data elements, while also noting a time index in the recording for quick recall and playback.
  • audio and, optionally, video can be recorded, analyzed and parsed to identify clinical data elements to be stored in the patient's EMR. Recordings can run continuously, or can be started and stopped at the clinician's discretion using a voice command, or other physical toggle (e.g., foot pedal, keyboard, mouse, etc.). A time continuum can be updated with an indicator to show a time at which data capture was initiated.
  • a parsing and recognition system can identify discrete data elements in audio and/or video and classify the data elements using standard terminology (e.g., SNOMED, ICD-9, ICD-10, NANDA, etc.) and/or hospital provided terminology for storage in the patient's EMR.
  • each discrete set of elements is indexed based on time in the recording.
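Indexing parsed elements by their offset in the recording could look like this sketch (field names assumed), so that validation can jump straight to the relevant slice for replay:

```python
def index_elements(parsed_elements):
    """Index each discrete parsed element by its time offset (seconds)
    in the recording, mapping offset -> (field, value)."""
    return {e["offset_s"]: (e["field"], e["value"]) for e in parsed_elements}

# Two elements recognized at 12 s and 47 s into the recording:
parsed = [{"offset_s": 12, "field": "heart_rate", "value": "120"},
          {"offset_s": 47, "field": "dilatation", "value": "6 cm"}]
index = index_elements(parsed)
slice_to_replay = index[47]  # jump 47 s into the recording for review
```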
  • the recorded session of care, as well as the parsed data, are saved for later authentication and accuracy verification.
  • a clinician is presented with a user interface screen on a computer showing a list of discrete elements and corresponding values as interpreted or actions performed.
  • the user interface facilitates user authentication and verification of the parsed data, for example.
  • the user interface can display a confidence level, determined by an analysis engine, for the data presented.
  • each element and/or value includes a link to a specific time slice in the recording that is associated with the analyzed data. The clinician can use this link to quickly replay the relevant portion of the recording and see or hear the information again for verification. Access to the complete recorded session can be made available for context if requested.
  • the analysis engine and user interface with indexed replay option allow clinicians to provide patient care while the system (e.g., a perinatal system) records the pertinent clinical data to the EMR quickly and accurately.
  • certain examples facilitate faster and more accurate charting of patient data.
  • validation of the parsed data can be facilitated to provide more accurate patient records in a more efficient manner.
  • video recordings can also be analyzed and parsed to identify clinical data elements, with a user interface and workflow provided for quickly validating that information.
  • FIG. 6 depicts a flow chart for an example method 600 for voice recording, playback, and integration with a patient record.
  • a voice record is captured. For example, a voice record is captured via real time (including substantially real time) dictation.
  • the voice record is marked. For example, the voice record is automatically and/or manually marked with one or more time stamps, segment(s), keyword(s), etc.
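Attaching marks to a captured record might be sketched as follows; the record and mark shapes are assumptions for illustration:

```python
def mark_record(record, offset_s, kind, label):
    """Attach a mark (time stamp, segment boundary, or keyword) to a
    captured voice record at a given offset in seconds."""
    record.setdefault("marks", []).append(
        {"offset_s": offset_s, "kind": kind, "label": label})
    return record

# A captured dictation, marked with a keyword and a segment:
rec = {"patient": "patient-1", "audio": b"..."}
mark_record(rec, 30, "keyword", "contractions")
mark_record(rec, 95, "segment", "delivery note")
```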
  • the voice record is connected with one or more applications used by the user. For example, the voice record is inserted into a time continuum and/or patient record associated with a patient.
  • the voice record is translated (e.g., via a speech to text conversion).
  • a user can replay the stored voice record.
  • the user can replay an entire voice record, a marked field of the voice record, a selected section of the voice record, etc., to allow a clinician to replay and confirm/correct determined values from a speech to text translation of the voice record.
  • one or more values can be corrected/updated based on the reviewed recording.
  • voice recording and playback helps facilitate an improved workflow with applications, voice dictation capture, user review, reporting, etc.
  • FIG. 7 illustrates an example voice recording and review interface 700 .
  • audio data 710 can be parsed and made visible 720 for validation by a user. Audio can be made available for replay during a validation phase, for example. Voice data can be played back, translated into a note, etc. Video data can be similarly provided.
  • a summary of information can be provided by a clinician and/or patient and spoken without having to type into a computer.
  • dictation can be parsed into discrete values, and the discrete values 720 can be displayed and provided outbound to corresponding clinical documentation. For example, a nurse's comment “Heartrate 120 ” is translated and parsed to determine that the field is “heartrate” and the value for that field is “120”.
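The "Heartrate 120" example can be illustrated with a minimal parser; a real system would rely on the speech-recognition and parsing engine described above rather than a regular expression:

```python
import re

def parse_dictation(utterance):
    """Split a dictated phrase such as 'Heartrate 120' into a discrete
    field/value pair; returns None if no trailing number is found."""
    match = re.match(r"\s*([A-Za-z ]+?)\s+(\d+(?:\.\d+)?)\s*$", utterance)
    if not match:
        return None
    return {"field": match.group(1).strip().lower(),
            "value": match.group(2)}

result = parse_dictation("Heartrate 120")
# -> {'field': 'heartrate', 'value': '120'}
```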
  • a confidence index and/or status 730 can be provided to a user. The user can then replay 740 , approve 730 , change, and/or store a value, for example. The user can replay 740 an entire recording, a certain portion 715 (e.g., keyword or section), etc., of the voice data 710 , for example.
  • Voice playback can be started, stopped, paused, forwarded, reversed, etc., using one or more controls 750 - 751 , for example.
  • Voice charting helps a user receive patient history and details as a clinician enters a room with the patient.
  • a user can dictate exam results without paper or computer available (e.g., via mobile device).
  • a viewer and/or reporting tool can prompt the user for missing exam details and/or other items and provide visual indicators and/or alerts to the user, for example.
  • with mobile dictation and reporting, a clinician can visit multiple patients before stopping at the computer for further documentation and analysis.
  • an audio and/or visual notification can be provided to a user when lab or other results are ready. Audio and/or visual notification can also be used to provide reminders for patient care.
  • users are able to enter clinical annotations (e.g., documented observations, events, alerts, etc.) in a structured format by giving values to specified data items with a time context.
  • a system presents the user with a pre-defined set of items to annotate. Additionally, the system includes a capability to learn a clinical state of the patient, alter a user interface presented to the user, and modify workflows in the user experience (UX) appropriately.
  • the system and associated method(s) integrate both manually documented and acquired data to recognize a state or condition of the patient.
  • the state can represent a phase of care (e.g., pre-operative versus intra-operative phases in a surgical unit), some clinical progression (e.g., antepartum, labor, delivery, postpartum stages in a labor and delivery unit), and/or other patient status (e.g., a postpartum patient who has delivered a girl versus one who has delivered a boy).
  • the state can be a function of a single documented item including a certain expected value or being within some expected range, for example.
  • the state can be defined by a group of items including expected values.
  • the state can depend upon a chronological order of the recording of a group of variables, for example.
  • a patient can also occupy multiple states concurrently.
  • a set of rules can be configured to define the identified states.
  • a labor and delivery (L&D) patient may have four (4) states—antepartum, labor, delivery, and postpartum.
  • a system can inspect whether the patient has documented contraction intervals of five (5) minutes or less, has any recorded dilatation value less than ten (10) centimeters (cm), and is admitted to a bed in an L&D unit. If all conditions are true, then the system recognizes that the patient is in labor. Similarly, if a patient is in the postpartum state, the system can recognize that she has delivered a boy or a girl depending on the charted gender. At this point, she is assigned one state based upon the fact that she delivered and another state based upon the delivery and the baby's gender. Another example involves a patient who is in the delivery state at the same time she is in a C-section state (versus natural delivery state).
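A rules check like the labor-state example above might be sketched as follows. The dictionary-based patient record, its field names, and the thresholds are illustrative assumptions under the example's stated conditions.

```python
def infer_states(patient):
    """Infer a set of applicable states from a simple patient-record dict."""
    states = set()
    # Labor rule: contractions <= 5 minutes apart, any dilatation < 10 cm
    # recorded, and admitted to a bed in an L&D unit.
    if (patient.get("contraction_interval_min") is not None
            and patient["contraction_interval_min"] <= 5
            and any(d < 10 for d in patient.get("dilatation_cm", []))
            and patient.get("unit") == "L&D"):
        states.add("labor")
    # Postpartum rules: one state for having delivered, another for the
    # delivery plus the baby's charted gender. A patient can hold both.
    if patient.get("delivered"):
        states.add("postpartum")
        if patient.get("baby_gender") == "girl":
            states.add("postpartum/girl")
        elif patient.get("baby_gender") == "boy":
            states.add("postpartum/boy")
    return states

infer_states({"contraction_interval_min": 4, "dilatation_cm": [8], "unit": "L&D"})
# → {"labor"}
```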
  • the user experience is altered to streamline documentation and reduce errors.
  • the patient may appear differently in a roster once the system recognizes the labor state.
  • the system can also add or remove certain documentation capabilities based upon the state.
  • for example, once a patient has entered the post-partum/girl delivery state, the user no longer has an option to chart that the baby was circumcised. If a patient is in the natural delivery state, the user will be unable to chart any information related to performance of a C-section.
  • available annotation options can be provided, removed, limited, and/or otherwise guided based on patient state and other available patient data, for example.
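State-driven filtering of annotation options, as described in the bullets above, might be sketched as follows. The option and state names are hypothetical labels chosen to match the examples given, not identifiers from the disclosure.

```python
# Full set of charting options before any state-based filtering (hypothetical).
ALL_OPTIONS = {"chart_circumcision", "chart_c_section", "chart_contractions"}

# Options removed when the patient occupies a given state.
EXCLUDED_BY_STATE = {
    "postpartum/girl": {"chart_circumcision"},
    "natural_delivery": {"chart_c_section"},
}

def available_options(states):
    """Return the charting options still offered given the patient's states."""
    excluded = set()
    for state in states:
        excluded |= EXCLUDED_BY_STATE.get(state, set())
    return ALL_OPTIONS - excluded

available_options({"postpartum/girl"})  # circumcision charting is removed
```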
  • such systems and methods are used for clinical documentation, often in a high-paced environment in which clinicians are caring for multiple patients at the same time. Awareness of patient state allows a system to streamline workflows, presenting users with only pertinent options for use and documentation at given state(s). Thus, less time can be spent doing the work of documentation and more time spent attending to the patient. Additionally, by only allowing a user to record appropriate information, inconsistencies and errant conflicts in the record can be minimized or prevented.
  • Streamlining nurse workflow allows nurses and/or other healthcare practitioners to spend more time interacting with their patients and less time documenting on the computer. Additionally, error reduction lessens clinical risk and legal liability in the case of inaccurate information being stored in the patient's record.
  • inconsistencies can also be reported after documentation is complete. Post-hoc examination can identify specific data items that conflict with each other.
  • a rules engine can be applied to infer a patient state.
  • multiple states can be applied to a patient at the same time.
  • documented items can come from external data sources (e.g., external information systems via interface, fetal monitor, other devices, etc.).
  • a user can also explicitly specify a patient state to confirm, correct, or override an automated system determination of state.
  • the system can then provide intelligent documentation capabilities based upon the declared state.
  • FIG. 8 illustrates an example method 800 for smart clinical annotation of patient information in a clinical (e.g., perinatal) workflow.
  • one or more applicable patient states are identified. For example, one or more states indicative of patient condition, patient status, patient treatment, etc., are automatically identified based on stored patient data (e.g., EMR data, personal health record (PHR) data, radiology information system (RIS) data, picture archiving and communication system (PACS) data, etc.).
  • allowable annotations made by a user on a patient record are adjusted. For example, as a user is charting that a patient is eight (8) cm dilated, the type of available annotations is automatically adjusted to be more related to baby delivery than if the user were annotating that the patient is two (2) cm dilated.
  • values are suggested for remaining fields. For example, a user begins to input information into a field and values can then be suggested for one or more remaining fields based on the existing input and a historical data store of annotations.
  • data abnormalities or inconsistencies are identified. For example, data that does not make sense given other provided data is flagged. For instance, a user cannot chart about a circumcision when the baby is a girl. In certain examples, certain choices may not be provided to a user based on the other information available. Based on this "smart", more efficient charting, errors can be reduced or prevented. Clinical decision support and rules can be used to support such "smart" charting.
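A post-hoc consistency check over a completed record, as described above, might be sketched as follows. The rule list and record fields are illustrative assumptions built from the examples in the text.

```python
# Each entry pairs a predicate over the record with a human-readable message
# describing the conflict it detects (rules are illustrative).
CONFLICT_RULES = [
    (lambda r: r.get("baby_gender") == "girl"
               and "circumcision" in r.get("charted", []),
     "circumcision charted for a female infant"),
    (lambda r: "c_section" in r.get("charted", [])
               and r.get("delivery_type") == "natural",
     "C-section documentation on a natural delivery"),
]

def find_inconsistencies(record):
    """Return messages for every conflict rule the record violates."""
    return [msg for rule, msg in CONFLICT_RULES if rule(record)]

find_inconsistencies({"baby_gender": "girl", "charted": ["circumcision"]})
# → ["circumcision charted for a female infant"]
```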
  • values are calculated automatically based on annotation input.
  • the calculated values can form part of an annotation and/or can be approved by a user and placed into a patient record and/or report.
  • an annotation of a waveform can automatically trigger a waveform analysis that is pulled into an annotation.
  • a user can mark a fifteen (15) minute window and values can be calculated based on that marked window.
  • the automatically calculated values can be approved and dropped into a record.
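Calculating values over a user-marked window, as in the fifteen-minute example above, might be sketched as follows. The `(timestamp_seconds, value)` sample format and the chosen summary statistics are assumptions for the sketch.

```python
def window_stats(samples, start_s, end_s):
    """Summarize waveform samples inside a marked [start_s, end_s] window."""
    values = [v for t, v in samples if start_s <= t <= end_s]
    if not values:
        return None  # nothing marked in this window
    return {"min": min(values), "max": max(values),
            "mean": round(sum(values) / len(values), 1), "count": len(values)}

# Hypothetical heart-rate samples at 0, 5, 10, and 20 minutes.
samples = [(0, 120), (300, 130), (600, 125), (1200, 140)]
window_stats(samples, 0, 900)  # stats over a 15-minute (900 s) marked window
```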
  • an annotation can be selected and copied into another annotation, record, and/or report.
  • recent documentation can be selected and copied by a user into another annotation, patient record, report, etc.
  • FIG. 9 illustrates an example interface 900 for expanded annotation review.
  • the annotation review 900 includes clinical data 910 for a patient, clinician annotation(s) 920 , system or automatically generated/determined annotation(s) 930 , and one or more controls including a collapse/expand control 940 , a search control 950 , etc.
  • FIGS. 3, 5, 6, and 8 are flow diagrams representative of example machine readable instructions that may be executed to implement example systems and methods described herein, and/or portions of one or more of those systems (e.g., systems 100 and 1100) and methods.
  • the example processes of FIGS. 3, 5, 6, and 8 can be performed using a processor, a controller and/or any other suitable processing device.
  • the example processes of FIGS. 3, 5, 6, and 8 can be implemented using coded instructions (e.g., computer readable instructions) stored on a tangible computer readable medium such as a flash memory, a read-only memory (ROM), and/or a random-access memory (RAM).
  • the term tangible computer readable medium is expressly defined to include any type of computer readable storage and to exclude propagating signals. Additionally or alternatively, the example processes of FIGS. 3, 5, 6, and 8 can be implemented using coded instructions (e.g., computer readable instructions) stored on a non-transitory computer readable medium such as a flash memory, a read-only memory (ROM), a random-access memory (RAM), a cache, or any other storage media in which information is stored for any duration (e.g., for extended time periods, permanently, brief instances, for temporarily buffering, and/or for caching of the information). As used herein, the term non-transitory computer readable medium is expressly defined to include any type of computer readable medium and to exclude propagating signals.
  • the example processes of FIGS. 3, 5, 6, and 8 can be implemented using any combination(s) of application specific integrated circuit(s) (ASIC(s)), programmable logic device(s) (PLD(s)), field programmable logic device(s) (FPLD(s)), discrete logic, hardware, firmware, etc.
  • the example processes of FIGS. 3, 5, 6, and 8 can be implemented manually or as any combination(s) of any of the foregoing techniques, for example, any combination of firmware, software, discrete logic and/or hardware.
  • although the example processes of FIGS. 3, 5, 6, and 8 are described with reference to their respective flow diagrams, other methods of implementing those processes can be employed.
  • the order of execution of the blocks can be changed, and/or some of the blocks described can be changed, eliminated, sub-divided, or combined.
  • any or all of the example processes of FIGS. 3, 5, 6, and 8 can be performed sequentially and/or in parallel by, for example, separate processing threads, processors, devices, discrete logic, circuits, etc.
  • FIG. 10 is a block diagram of an example processor system 1010 that can be used to implement the systems, apparatus and methods described herein.
  • the processor system 1010 includes a processor 1012 that is coupled to an interconnection bus 1014 .
  • the processor 1012 can be any suitable processor, processing unit or microprocessor.
  • the system 1010 can be a multi-processor system and, thus, can include one or more additional processors that are identical or similar to the processor 1012 and that are communicatively coupled to the interconnection bus 1014 .
  • the processor 1012 of FIG. 10 is coupled to a chipset 1018 , which includes a memory controller 1020 and an input/output (I/O) controller 1022 .
  • a chipset typically provides I/O and memory management functions as well as a plurality of general purpose and/or special purpose registers, timers, etc. that are accessible or used by one or more processors coupled to the chipset 1018 .
  • the memory controller 1020 performs functions that enable the processor 1012 (or processors if there are multiple processors) to access a system memory 1024 and a mass storage memory 1025 .
  • the system memory 1024 may include any desired type of volatile and/or non-volatile memory such as, for example, static random access memory (SRAM), dynamic random access memory (DRAM), flash memory, read-only memory (ROM), etc.
  • the mass storage memory 1025 may include any desired type of mass storage device including hard disk drives, optical drives, tape storage devices, etc.
  • the I/O controller 1022 performs functions that enable the processor 1012 to communicate with peripheral input/output (I/O) devices 1026 and 1028 and a network interface 1030 via an I/O bus 1032 .
  • the I/O devices 1026 and 1028 may be any desired type of I/O device such as, for example, a keyboard, a video display or monitor, a mouse, etc.
  • the network interface 1030 may be, for example, an Ethernet device, an asynchronous transfer mode (ATM) device, an 802.11 device, a DSL modem, a cable modem, a cellular modem, etc. that enables the processor system 1010 to communicate with another processor system.
  • although the memory controller 1020 and the I/O controller 1022 are depicted in FIG. 10 as separate blocks within the chipset 1018, the functions performed by these blocks may be integrated within a single semiconductor circuit or may be implemented using two or more separate integrated circuits.
  • certain examples provide one or more floating windows or “always available” viewers providing streaming, real time, or “live” data to a user regarding one or more of his/her patients.
  • Data can include fetal and/or patient waveform data, patient time continuum, voice record, annotations, reports, etc.
  • the floating viewer can be combined, separated, etc., and positioned at any location on a user's display.
  • Certain examples provide rules-based limitations and/or assistance regarding annotations, reporting, charting, etc.
  • Certain examples provide speech to text conversion for review, playback, and inclusion in annotations, reports, charting, etc.
  • Certain examples contemplate methods, systems and computer program products on any machine-readable media to implement functionality described above. Certain examples can be implemented using an existing computer processor, or by a special purpose computer processor incorporated for this or another purpose or by a hardwired and/or firmware system, for example.
  • One or more of the components of the systems and/or steps of the methods described above may be implemented alone or in combination in hardware, firmware, and/or as a set of instructions in software, for example.
  • Certain examples can be provided as a set of instructions residing on a computer-readable medium, such as a memory, hard disk, DVD, or CD, for execution on a general purpose computer or other processing device.
  • Certain examples can omit one or more of the method steps and/or perform the steps in a different order than the order listed. For example, some steps/blocks may not be performed in certain examples. As a further example, certain steps may be performed in a different temporal order, including simultaneously, than listed above.
  • Certain examples include computer-readable media for carrying or having computer-executable instructions or data structures stored thereon.
  • Such computer-readable media can be any available media that may be accessed by a general purpose or special purpose computer or other machine with a processor.
  • Such computer-readable media can include RAM, ROM, PROM, EPROM, EEPROM, Flash, CD-ROM, DVD, Blu-ray, optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store desired program code in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer or other machine with a processor. Combinations of the above are also included within the scope of computer-readable media.
  • Computer-executable instructions comprise, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing machines to perform a certain function or group of functions.
  • Computer-executable instructions include routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types.
  • Computer-executable instructions, associated data structures, and program modules represent examples of program code for executing steps of certain methods and systems disclosed herein. The particular sequence of such executable instructions or associated data structures represent examples of corresponding acts for implementing the functions described in such steps.
  • Logical connections can include a local area network (LAN) and a wide area network (WAN) that are presented here by way of example and not limitation.
  • Such networking environments are commonplace in office-wide or enterprise-wide computer networks, intranets and the Internet and can use a wide variety of different communication protocols.
  • Those skilled in the art will appreciate that such network computing environments will typically encompass many types of computer system configurations, including personal computers, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, and the like.
  • Examples can also be practiced in distributed computing environments where tasks are performed by local and remote processing devices that are linked (either by hardwired links, wireless links, or by a combination of hardwired or wireless links) through a communications network.
  • program modules may be located in both local and remote memory storage devices.
  • An exemplary system for implementing the overall system or portions of embodiments of the invention might include a general purpose computing device in the form of a computer, including a processing unit, a system memory, and a system bus that couples various system components including the system memory to the processing unit.
  • the system memory may include read only memory (ROM) and random access memory (RAM).
  • the computer may also include a magnetic hard disk drive for reading from and writing to a magnetic hard disk, a magnetic disk drive for reading from or writing to a removable magnetic disk, and an optical disk drive for reading from or writing to a removable optical disk such as a CD ROM or other optical media.
  • the drives and their associated computer-readable media provide nonvolatile storage of computer-executable instructions, data structures, program modules and other data for the computer.

Abstract

Certain examples provide systems and methods for improved perinatal workflow. Certain examples provide a clinical dock interaction display system. The system includes a memory to buffer live streaming data for one or more patients. The system includes a user interface to display and receive input with respect to a list of one or more patients associated with a clinician; a control to facilitate user selection of one or more patients from the list; and live streaming data received from one or more monitors for one or more selected patients, wherein the user interface is to facilitate user selection of a patient for a more detailed patient view. The system includes an alert for one or more selected patients to be triggered based on a defined criterion. The system includes a processor to process data for output via the user interface and to process user input.

Description

    RELATED APPLICATIONS
  • The present application relates to and claims the benefit of priority from U.S. Provisional Patent Application No. 61/387,922, filed on Sep. 29, 2010, which is herein incorporated by reference in its entirety.
  • FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT
  • [Not Applicable]
  • MICROFICHE/COPYRIGHT REFERENCE
  • [Not Applicable]
  • BACKGROUND
  • Perinatal systems involve a high degree of data granularity and a particular workflow for high acuity cases. Patient condition can change very rapidly so large delays in getting information into an electronic system are not acceptable.
  • BRIEF SUMMARY
  • Certain examples provide systems and methods for improved perinatal workflow.
  • Certain examples provide a clinical dock interaction display system. The system includes a memory to buffer live streaming data for one or more patients. The system includes a user interface to display and receive input with respect to a list of one or more patients associated with a clinician; a control to facilitate user selection of one or more patients from the list; and live streaming data received from one or more monitors for one or more selected patients, wherein the user interface is to facilitate user selection of a patient for a more detailed patient view. The system includes an alert for one or more selected patients to be triggered based on a defined criterion. The system includes a processor to process data for output via the user interface and to process user input.
  • Certain examples provide a computer-implemented method for clinical patient monitoring. The method includes displaying a list of one or more patients associated with a clinician and facilitating user selection of one or more patients from the list. The method includes providing, via a user interface, live streaming data received from one or more monitors for one or more selected patients and providing, upon user selection of a patient via the user interface, a more detailed patient view for the selected patient. The method includes generating an alert for one or more selected patients to be triggered based on a defined criterion.
  • Certain examples provide a tangible computer readable storage medium including executable program instructions which, when executed by a computer processor, cause the computer to implement a clinical dock interaction display system. The clinical dock interaction display system includes a user interface to display and receive input with respect to a list of one or more patients associated with a clinician; a control to facilitate user selection of one or more patients from the list; and live streaming data received from one or more monitors for one or more selected patients. The user interface is to facilitate user selection of a patient for a more detailed patient view. The system also includes an alert for one or more selected patients to be triggered based on a defined criterion.
  • BRIEF DESCRIPTION OF SEVERAL VIEWS OF THE DRAWINGS
  • FIGS. 1-2 illustrate example dock interaction displays providing patient information to a user.
  • FIG. 3 depicts a flow chart for an example method for providing a clinical surveillance view of patient data to a user.
  • FIG. 4 illustrates an example viewer providing a time continuum and associated information for a monitored patient.
  • FIG. 5 depicts a flow chart for an example method for providing a clinical time continuum and associated real time data for a patient to a user.
  • FIG. 6 depicts a flow chart for an example method for voice recording, playback, and integration with a patient record.
  • FIG. 7 illustrates an example voice recording and review interface.
  • FIG. 8 illustrates a flow chart for an example method for smart clinical annotation of patient information in a clinical workflow.
  • FIG. 9 illustrates an example interface for annotation review.
  • FIG. 10 is a block diagram of an example processor system that can be used to implement the systems, apparatus and methods described herein.
  • The foregoing summary, as well as the following detailed description of certain embodiments of the present invention, will be better understood when read in conjunction with the appended drawings. For the purpose of illustrating the invention, certain embodiments are shown in the drawings. It should be understood, however, that the present invention is not limited to the arrangements and instrumentality shown in the attached drawings.
  • DETAILED DESCRIPTION OF CERTAIN EXAMPLES
  • Although the following discloses example methods, systems, articles of manufacture, and apparatus including, among other components, software executed on hardware, it should be noted that such methods and apparatus are merely illustrative and should not be considered as limiting. For example, it is contemplated that any or all of these hardware and software components could be embodied exclusively in hardware, exclusively in software, exclusively in firmware, or in any combination of hardware, software, and/or firmware. Accordingly, while the following describes example methods, systems, articles of manufacture, and apparatus, the examples provided are not the only way to implement such methods, systems, articles of manufacture, and apparatus.
  • When any of the appended claims are read to cover a purely software and/or firmware implementation, at least one of the elements in an at least one example is hereby expressly defined to include a tangible medium such as a memory, DVD, CD, Blu-ray, etc. storing the software and/or firmware.
  • Certain clinical areas, such as a high acuity perinatal department in a hospital, have a much higher ratio of clinician to patient because the clinician should be documenting continuously. Traditionally, large delay is introduced because clinicians have gloved hands and are treating a patient and must then wash their hands, get to a computer to enter information, etc. Thus, an hour or more may pass before documentation is entered. People forget information or chart it at the wrong time, and that impacts legal liability and patient treatment.
  • Additionally, hospitals are demanding enterprise systems, so department systems should have better interoperability with enterprise systems. Systems should help users be better and smarter in documentation. Certain examples are agnostic to any enterprise system. Certain examples help support an improved real-time perinatal workflow with additional tools, information, and capabilities. Certain examples provide sharing of a clinical (e.g., perinatal) application with other clinical applications. Certain examples provide trend navigation through clinical (e.g., perinatal) data. Certain examples provide a multi-patient sidebar view of real time and/or stored patient data. Certain examples provide a view of a time continuum and associated data along with an ability to edit, annotate, report, and retrieve information associated with a patient's time continuum.
  • Clinical Surveillance View
  • Healthcare professionals can review and document patient data in an application (e.g., a perinatal application) while monitoring various live streaming data of other patients using an overlaid user interface (UI) widget. A healthcare professional can examine a detailed view of any of the other patients' data by selecting a patient and displaying the detailed information, while switching contexts in the underlying application to allow continuous documentation.
  • In certain examples, upon activation, a Clinician's Surveillance View (CSV) docks or positions itself with respect to one of four sides of a monitor/display screen (e.g., against an edge of the display). When desired, the CSV can be left to float in any section of the visual display. The CSV overlays, in a non-intrusive way, an underlying application that is currently being used, which helps enable the health care professional to pursue other tasks while monitoring information in the CSV bar.
  • The CSV includes certain interactions and options to enable health care professionals to add and monitor patient information in real time. When necessary and/or desired, health care professionals can click on one of the patients' live feeds to view additional information. When a live feed is clicked to view additional information, other background applications bring that particular patient's information into focus to add redundancy in data manipulation.
  • Thus, the CSV helps to prevent interruptions in usage of other systems involved in a clinical (e.g., perinatal) workflow, such as an enterprise wide system, while allowing clinicians to simultaneously (or at least substantially simultaneously given some system delay) monitor live data for multiple patients. For example, the CSV can be used in a labor and delivery environment with real-time fetal monitoring, as well as in any high-risk care area.
  • In certain examples, the CSV can interoperate and function concurrently with enterprise wide application(s). Enterprise wide applications are generally non-specific to a particular care area, and the CSV facilitates user access to functionality and rich clinical content targeted for high-risk care areas, while allowing users to continue to leverage enterprise wide systems for comprehensive documentation.
  • In certain examples, continuous streams of live data can be embedded within an overlaid application.
  • FIG. 1 illustrates an example dock interaction display 100 or "sidebar" providing a patient list and related real time waveform information to a user. The display 100 can be positioned anywhere on a display screen by a user, such as in a middle right hemisphere of a user's screen. The display 100 can interact with and/or operate independently of other application(s) executing on the user's computer. As demonstrated in FIG. 1, a user can search for a patient 110, add a patient 120, drag and drop 130 patient(s) from a list 135 to be monitored, etc. Once a patient is added to the monitoring list, real time (or substantially real time including a system delay (e.g., processing, data retrieval from memory, communication, etc.)) data (e.g., fetal monitor, patient monitor, and/or other waveform data, etc.) 140 can be displayed for one or more selected patients via the display 100. Additionally, an indicator or alert 150 regarding one or more patients can be provided via the display 100.
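The sidebar's monitoring list and criterion-based alerting might be sketched as follows. The class name, the fetal-heart-rate bounds used as the alert criterion, and the patient identifiers are illustrative assumptions, not elements of the disclosed system.

```python
class SurveillanceSidebar:
    """Minimal sketch of a monitoring list with a defined alert criterion."""

    def __init__(self, fhr_low=110, fhr_high=160):
        self.monitored = []                       # patients added via search/drag-and-drop
        self.fhr_low, self.fhr_high = fhr_low, fhr_high

    def add_patient(self, patient_id):
        # Add a patient to the monitoring list (no duplicates).
        if patient_id not in self.monitored:
            self.monitored.append(patient_id)

    def check_alert(self, patient_id, latest_fhr):
        """Return True when a monitored patient's latest sample is out of range."""
        return (patient_id in self.monitored
                and not (self.fhr_low <= latest_fhr <= self.fhr_high))

bar = SurveillanceSidebar()
bar.add_patient("patient-1")
bar.check_alert("patient-1", 95)  # bradycardic sample triggers an alert
```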
  • FIG. 2 illustrates another example dock interaction display 200. The example display 200 depicts how the dock display 200 may look if docked or positioned in a top hemisphere of a user's display screen.
  • In certain examples, using the patient sidebar or interaction display 100, 200, a user can be provided with electrocardiogram (EKG) and/or other live streaming waveform data for selected patient(s). Color-coded alerting can be provided. A user can select a patient in the sidebar 100, 200 to see a more detailed patient view. Live active scrolling can be provided.
  • In certain examples, a voice-activated “mark” button can be provided in conjunction with the waveform data to allow a user to document in real-time (or substantially real time) through keyboard input, mouse selection, voice indication, foot pedal, etc., to make a mark and/or other annotation on the “live” waveform. In certain examples, a mark can be automatically converted into an annotation.
  • FIG. 3 depicts a flow chart for an example method 300 for providing a clinical surveillance view of patient data to a user. At 310, a surveillance viewer is positioned on a user's display. For example, the viewer can automatically (e.g., based on a default position, user-specified preference, concurrent application(s) executing, workflow, etc.) be positioned and/or manually be positioned by a user on the display. The viewer overlays, in a non-intrusive way, one or more underlying application(s) currently in use on the display. Thus, a user can interact with other applications in a patient care workflow while monitoring information in the surveillance viewer.
  • At 320, one or more patients to be monitored are identified. For example, a user can search for a patient. For example, a user can provide a patient name and/or browse a list of available patient(s) to identify one or more patients to be monitored. At 330, real time (e.g., including substantially real time) data for the patient is displayed to the user via the surveillance viewer. For example, fetal waveform, patient EKG, blood pressure, and/or other data can be displayed via a live feed to the user in the surveillance viewer. For example, the surveillance monitor can be used in a labor and delivery environment with real-time fetal monitoring, as well as in any other high-risk care area.
  • At 340, additional detail is provided upon selection of monitored data. For example, when a live feed (e.g., a fetal waveform) is clicked on, “moused” or hovered over, or otherwise selected to view additional information, that particular information is retrieved for display and/or brought into focus for the user.
  • At 350, an indicator or alert can be marked via the surveillance viewer. For example, an indicator or mark can be provided for a patient, a data feed, etc., for display via the surveillance viewer.
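The method-300 flow above (identify patients at 320, display live data at 330, mark indicators at 350) can be sketched as a minimal data model. This is an illustrative sketch only; the class and field names are assumptions, not the patented implementation.

```python
from dataclasses import dataclass, field

@dataclass
class MonitoredPatient:
    name: str
    samples: list = field(default_factory=list)   # live waveform samples (step 330)
    marks: list = field(default_factory=list)     # (sample_index, label) indicators (step 350)

class SurveillanceViewer:
    """Minimal model of the method-300 flow."""
    def __init__(self):
        self.patients = {}

    def add_patient(self, name):                  # step 320: identify a patient to monitor
        self.patients[name] = MonitoredPatient(name)

    def push_sample(self, name, value):           # step 330: receive a live feed sample
        self.patients[name].samples.append(value)

    def mark(self, name, label):                  # step 350: mark an indicator or alert
        p = self.patients[name]
        p.marks.append((len(p.samples) - 1, label))

viewer = SurveillanceViewer()
viewer.add_patient("Doe, Jane")
for bpm in (140, 142, 98):                        # simulated fetal heart rate feed
    viewer.push_sample("Doe, Jane", bpm)
viewer.mark("Doe, Jane", "deceleration")          # annotate the latest sample
```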
  • Clinical Time Continuum at Point of Care
  • Alternatively and/or in addition, certain examples enable a visualization of directly and indirectly acquired clinical content plotted over time and perpetually updated. Features of the visualization include co-location of clinically relevant content collected from disparate sources, supplemented by a mechanism to initiate annotations (e.g., assessments and/or actions), through which a user can indicate, preserve, and/or visualize an intended association with the content that motivated the annotation.
  • Additionally, certain examples provide for initiation of recording by an external recording device (e.g., audio, video, etc.), at a point of care, in order to reconstitute annotations for formal documentation at a later time.
  • Certain examples provide a continuously updating graph including time as the x axis to plot direct observations. Indirect observations are rendered on the same timeline, presenting a visual indicator of source and, potentially, a summary of content.
  • As observations are collected, a support engine processes observations to discover pattern(s) of potential correlation. Patterns are based on interpreting the values of collected observations and deducing where annotations may be appropriate, for example. These pattern discoveries are displayed on the graph as indicators of potential annotation opportunity(ies).
  • Recorded annotations maintain an association to pattern sources, for example. The association can indicate a target concern to which an annotation applies. These concerns can be composites of specific discrete observations, associated with a range of time, and/or with a series of time ranges, for example. Additionally, users can initiate annotations and explicitly define their own observation dependency(ies), by selecting and highlighting either a range of time which includes known content, or by explicitly multi-selecting specific content (e.g., CTRL-CLICK), for example.
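One way to picture the support engine and annotation associations described above is a scan over collected observations that flags spans of out-of-range values, producing proposed annotations that keep pointers back to their source observations. The thresholds, observation shapes, and range check below are illustrative assumptions, not the patented pattern logic.

```python
observations = [  # (time_in_minutes, source, value)
    (0, "fetal_monitor", 140),
    (1, "fetal_monitor", 138),
    (2, "fetal_monitor", 95),   # below the assumed expected band
    (3, "fetal_monitor", 92),
    (4, "fetal_monitor", 141),
]
LOW, HIGH = 110, 160  # assumed normal band for this sketch

def discover_patterns(obs):
    """Return proposed annotations, each maintaining an association
    (a list of source observation indices) to the motivating content."""
    proposals, run = [], []
    for i, (t, src, v) in enumerate(obs):
        if not (LOW <= v <= HIGH):
            run.append(i)             # extend the current out-of-range run
        elif run:
            proposals.append({"concern": "out_of_range", "sources": run})
            run = []
    if run:                           # close a run that reaches the end
        proposals.append({"concern": "out_of_range", "sources": run})
    return proposals

print(discover_patterns(observations))
```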
  • By setting context (e.g., via hover/mouse over, click/select, etc.) on a previously noted annotation, associated clinical content can be exposed. Proposed exposition can include a bubble web from the annotation whereby the associated content is encircled and a connection line is drawn, for example. Associations based strictly on time range can highlight a relevant range on the graph, for example.
  • In certain examples, audio/visual recording can be initiated to invoke a record action. The visual can then display the start of the record mode and its duration. The completion of the record mode results in the preservation of the recording as a clinical observation and the creation of a proposed annotation (as noted above), with the recording as the associated content.
  • Often, high-acuity clinical settings, such as labor and delivery, suffer from inefficiency and a lack of timely documentation. Care and safety of a patient is a top priority above clinical documentation of that care. However, timeliness of that documentation leads to increased accuracy and availability of data for review by other clinicians. For example, if a nurse is caring for a patient during delivery, he/she will be wearing gloves and other protective equipment. The nurse will be unable to document using a keyboard or mouse until he/she can remove the gloves. The nurse also will be unable to document in a system if a patient has immediate needs such as turning onto her side or starting a drip of intravenous (IV) fluids. Certain examples allow a user to tag a time continuum using an input device such as keyboard/mouse, voice command/control, etc.
  • Additionally, clinicians may be interested in reviewing correlated data as the data becomes available to a clinician. A clinician might not be aware that data is available or might have to search through multiple sources to correlate different data inputs (such as lab data, with vitals data, with fetal strip data, etc.). Because a clinician can see the data correlated on one screen from multiple external sources, the clinician can now more efficiently review the data, make inferences from that data, and document interactions associated with a group of data elements on a time continuum.
  • Certain examples provide a new level of usability for a live patient encounter and bedside documentation. Data representation facilitates contextual interaction against a patient record at a specific point in time. Centralized visualization of different sources of data is provided on a patient time-continuum in a single application. The patient time continuum organization and display allows for correlation of other series/sources of data to provide further evidence of an event and its related annotation. The time continuum display also enhances recognition of associative relationships by providing visual indicators.
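The co-location of data from disparate sources on a single time continuum, as described above, amounts to merging per-source event lists into one chronological stream. A minimal sketch, with illustrative source names and event shapes:

```python
# Each source contributes (minute_offset, source, summary) events.
lab_events   = [(30, "lab", "CBC resulted"), (95, "lab", "Type & screen")]
vital_events = [(10, "vitals", "BP 118/76"), (60, "vitals", "BP 122/80")]
strip_events = [(45, "fetal_strip", "deceleration noted")]

# Merge into a single time-ordered continuum for display on one screen.
continuum = sorted(lab_events + vital_events + strip_events)
for minute, source, summary in continuum:
    print(f"t+{minute:3d}m [{source}] {summary}")
```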
  • Alternatively, flowsheets can be used to record content over time. However, flowsheets are significantly more rigid in structure (having time bound to known intervals as columns), which consumes a significant amount of lateral real estate.
  • Certain examples allow for direct documentation upon a continuous waveform. Furthermore, other clinical observations over time, such as XDS document awareness, lab messages, etc., can be documented and displayed upon the continuous waveform. Associative relationships, as well as visual co-location, of the data can also be facilitated.
  • Rather than relying on paper printouts from fetal monitors that must be manually written on, electronic fetal waveform capture with electronic annotations allows the content to be stored in a discoverable and searchable format. Recordings can be automatically and/or manually initiated, for example. In certain examples, hyperlinks can allow traversal of associated content through data association(s).
  • FIG. 4 illustrates an example viewer 400 providing a time continuum and associated information for a monitored patient. The viewer 400 conveys indications of monitored waveform data 410 provided against a baseline 420 for a patient. The waveform data 410 represents a portion of a time continuum 405 for the patient (e.g., an expanded portion compared to a compressed overall view of the time continuum 405). As illustrated in FIG. 4 the time continuum 405 can provide a window 430 (e.g., an eight hour window) in a current care cycle for the patient, while the waveform data 410 represents a current live feed or another selected subset of that data. The waveform data window 410 can provide a certain time subset 440 determined automatically and/or specified by a user (e.g., looking back five minutes, ten minutes, twenty minutes, etc.). One or more controls 450 allow a user to overlay data, mark data, insert annotations or notes, etc. As represented in FIG. 4, automatic (e.g., system or derived) annotations 460 as well as user-input annotations 470 can be shown and interacted with via the viewer 400.
  • Using the viewer 400, a user documents with respect to a point of care as the user is providing care and ties the documentation to the time continuum 405. This information can be provided (e.g., automatically and/or manually) into a flowsheet and/or other format to be viewed and/or correlated with other information (e.g., a vital signs graph, EKG waveform, lab results, etc.) so that the information all appears in the time continuum 405. Information can be tied together and provided in a visual format, so that, for example, a user can see a last lab result when a patient's EKG dropped. The user can open the lab result and superimpose a vital signs graph and see the information together for evaluation. Information can be provided in real-time (including substantially in real time), not after the fact (like a longitudinal record).
  • In certain examples, the time continuum 405 can stretch from the current moment to the beginning of a cycle of care (e.g., the start of pregnancy). For example, the time continuum 405 may include two hours of test, then have no data for two weeks, then include a hospital visit, etc. In certain examples, a user can track everything done in the time continuum and can select and review individual items in more detail. The time continuum 405 can be presented chronologically based on occurrence, chronologically based on time of documentation, etc. The viewer 400 can provide different ways to analyze a story and recreate what happened, for example.
  • In certain examples, data can be provided to a mobile device (e.g., a tablet computer, smart phone, laptop, netbook, personal digital assistant, etc.). For example, voice commands can be provided via a wired or wireless connection (e.g., Bluetooth, Wi-Fi, etc.) and a user can review information on the mobile, etc. In certain examples, the mobile device can perform at least some of the processing for dictation, etc. Using the mobile device, a user can document “on the fly” as he or she moves a patient from a waiting room to an operating room, for example. Data can also be captured during transport from hospital to hospital and documentation maintained in transit to complete the patient record, for example.
  • In certain examples, a user may have other data not tied to the time continuum that he or she wants to see in the same space, but separate from the time continuum (e.g., a labor curve, vitals, growth curve, normal ranges, etc.). The viewer 400 can provide a separate axis for such information (e.g., an axis showing four hours of data versus fifteen minutes of waveform data).
  • In certain examples, the viewer 400 provides a real-time (including substantially real time) push of updated information/content (e.g., lab results, etc.) that “pops up” or appears for a user to see as the user is documenting and treating a patient in real time. In certain examples, the viewer 400 can be pre-configured to quickly provide information that a user can pop up in the time continuum 405 and then close to resume patient charting. For example, the user can view the popup data but does not have to pull the data item(s) into the time continuum.
  • In certain examples, the viewer 400 is provided as a “floating” window that is always on top and always available with other applications visible and accessible underneath. In certain examples, a compressed view can be provided on top of the floating window to see trending over a long period of time without having to scroll through data, for example.
  • In certain examples, the time continuum 405 can be searched by keyword and navigated to a location or locations for a corresponding annotation (or annotations). In certain examples, a user can grab or otherwise select a tab and navigate forward and/or backward through available data. The time continuum 405 and other information in the viewer 400 can include flags, indicators, and/or other pointers to data, events, and/or other information, for example.
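Keyword search over the time continuum can be sketched as a simple filter over time-stamped annotations, returning the timestamps to which the viewer could navigate. The annotation data below is illustrative.

```python
annotations = [  # (hour_on_continuum, annotation_text)
    (8.5,  "membranes ruptured spontaneously"),
    (10.2, "epidural placed"),
    (11.0, "late deceleration observed"),
    (12.4, "deceleration resolved after position change"),
]

def search(keyword, notes):
    """Return the continuum locations of annotations matching a keyword."""
    return [t for t, text in notes if keyword.lower() in text.lower()]

print(search("deceleration", annotations))  # -> [11.0, 12.4]
```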
  • FIG. 5 depicts a flow chart for an example method 500 for providing a clinical time continuum and associated real time data for a patient to a user. At 510, a patient is selected for monitoring and viewing via a time continuum viewer. At 520, a time continuum of data for the patient over a specified period is graphically represented via the viewer. At 530, a real time or live portion (including substantially real time) of the time continuum data is displayed via the window. For example, the past ten minutes of the patient's EKG waveform are displayed in greater detail apart from the overall time continuum. The time continuum and the portion continue to update in real time (including substantially real time).
  • At 540, additional detail is provided upon selection of monitored data. For example, when a live feed (e.g., a fetal waveform) is clicked on, “moused” or hovered over, or otherwise selected to view additional information, that particular information is retrieved for display and/or brought into focus for the user.
  • At 550, an indicator or alert can be marked via the surveillance viewer. For example, an indicator or mark can be provided for a patient, a data feed, etc., for display via the surveillance viewer.
  • Clinical Charting Using Voice and/or Video
  • Clinicians often need to record information quickly and store this data in a patient's electronic medical record (EMR) and/or other data store. By utilizing tools such as dictation voice recognition and video monitoring of patient care, clinical data entry into the EMR and/or other data store can be automated.
  • In certain examples, a clinical system (e.g., a perinatal system) can include a voice and video recognition engine embedded in the software and/or hardware to capture audio and/or video content and identify pertinent data elements and events in recorded data. For example, captured audio can be parsed and spoken data matched to discrete EMR data elements, while also noting a time index in the recording for quick recall and playback.
  • For example, as a clinician provides care for a patient, audio and, optionally, video can be recorded, analyzed and parsed to identify clinical data elements to be stored in the patient's EMR. Recordings can run continuously, or can be started and stopped at the clinician's discretion using a voice command, or other physical toggle (e.g., foot pedal, keyboard, mouse, etc.). A time continuum can be updated with an indicator to show a time at which data capture was initiated.
  • A parsing and recognition system can identify discrete data elements in audio and/or video and classify the data elements using standard terminology (e.g., SNOMED, ICD-9, ICD-10, NANDA, etc.) and/or hospital provided terminology for storage in the patient's EMR.
  • As elements are parsed, each discrete set of elements is indexed based on time in the recording. The recorded session of care, as well as the parsed data, are saved for later authentication and accuracy verification. After the audio and/or video data is analyzed, a clinician is presented with a user interface screen on a computer showing a list of discrete elements and corresponding values as interpreted or actions performed. The user interface facilitates user authentication and verification of the parsed data, for example. The user interface can display a confidence level, determined by an analysis engine, for the data presented. In certain examples, each element and/or value includes a link to a specific time slice in the recording that is associated with the analyzed data. The clinician can use this link to quickly replay the relevant portion of the recording and see or hear the information again for verification. Access to the complete recorded session can be made available for context if requested.
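The classification and time-indexing described above can be sketched as a lookup of parsed utterances against a terminology map, with each element carrying its offset into the recording for indexed replay. The phrases are illustrative and the codes are placeholders, not real SNOMED or ICD identifiers.

```python
TERMINOLOGY = {
    "head delivered": "CODE-001",       # placeholder codes, not real standard terminology
    "membranes ruptured": "CODE-002",
}

def classify(transcript_segments):
    """transcript_segments: list of (seconds_into_recording, text).
    Each result keeps a time index into the recording for replay."""
    results = []
    for offset, text in transcript_segments:
        code = TERMINOLOGY.get(text.lower())
        results.append({"time_index": offset, "text": text,
                        "code": code, "recognized": code is not None})
    return results

parsed = classify([(12.0, "Head delivered"), (40.5, "cord clamped")])
for item in parsed:
    print(item)
```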
  • The analysis engine and user interface with indexed replay option allow clinicians to provide patient care while the system (e.g., a perinatal system) records the pertinent clinical data to the EMR quickly and accurately.
  • Thus, while clinicians are busy taking care of their patients, it is very difficult to document at the same time care is being given. There are often critical events that are time sensitive, and it is important that the clinician can record the data at the time of the event, while simultaneously providing patient care. Dictating a quick comment or a “mark” of some type can assist a clinician (e.g., a nurse) to accurately document events as they occur (and/or shortly thereafter). For example, a nurse who has gloved hands and cannot touch a keyboard can dictate, “head delivered” in a perinatal application. Analysis of video can document actions performed such as “patient was moved to side position”. The nurse can then go back after the event and confirm and/or add to his/her documentation, for example.
  • Thus, certain examples facilitate faster and more accurate charting of patient data. Using dictation and dictation parsing, validation of the parsed data can be facilitated to provide more accurate patient records in a more efficient manner. Additionally, in some examples, video recordings can also be analyzed and parsed to identify clinical data elements. A user interface and workflow are provided for quickly validating that information.
  • FIG. 6 depicts a flow chart for an example method 600 for voice recording, playback, and integration with a patient record. At 610, a voice record is captured. For example, a voice record is captured via real time (including substantially real time) dictation. At 620, the voice record is marked. For example, the voice record is automatically and/or manually marked with one or more time stamps, segment(s), keyword(s), etc. At 630, the voice record is connected with one or more applications used by the user. For example, the voice record is inserted into a time continuum and/or patient record associated with a patient. At 640, the voice record is translated (e.g., via a speech to text conversion). At 650, a user can replay the stored voice record. For example, the user can replay an entire voice record, a marked field of the voice record, a selected section of the voice record, etc., to allow a clinician to replay and confirm/correct determined values from a speech to text translation of the voice record. At 660, one or more values can be corrected/updated based on the reviewed recording. Thus, voice recording and playback helps facilitate an improved workflow with applications, voice dictation capture, user review, reporting, etc.
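The method-600 objects can be sketched as a voice record carrying time-stamped, marked segments, each with a speech-to-text translation that a clinician can replay and correct after review (steps 620 through 660). The class and method names are illustrative assumptions.

```python
class VoiceRecord:
    def __init__(self):
        self.segments = []   # each: {"start": sec, "end": sec, "text": transcription}

    def add_segment(self, start, end, text):   # 610/620: capture and mark a segment
        self.segments.append({"start": start, "end": end, "text": text})

    def replay(self, index):                   # 650: span a player would seek to
        seg = self.segments[index]
        return (seg["start"], seg["end"])

    def correct(self, index, text):            # 660: update a value after review
        self.segments[index]["text"] = text

rec = VoiceRecord()
rec.add_segment(0.0, 2.1, "heartrate 120")
rec.add_segment(2.1, 4.0, "BP one eighteen over seventy")
rec.correct(1, "BP 118/70")                    # clinician fixes the translated value
```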
  • Certain examples provide clinicians with a more efficient mechanism to receive and record information while providing patient care. FIG. 7 illustrates an example voice recording and review interface 700. As shown in FIG. 7, audio data 710 can be parsed and made visible 720 for validation by a user. Audio can be made available for replay during a validation phase, for example. Voice data can be played back, translated into a note, etc. Video data can be similarly provided.
  • In certain examples, a summary of information can be provided by a clinician and/or patient and spoken without having to type into a computer.
  • In certain examples, dictation can be parsed into discrete values, and the discrete values 720 can be displayed and provided outbound to corresponding clinical documentation. For example, a nurse's comment “Heartrate 120” is translated and parsed to determine that the field is “heartrate” and the value for that field is “120”. In certain examples, after a voice recording has been mapped to fields and values, a confidence index and/or status 730 can be provided to a user. The user can then replay 740, approve 730, change, and/or store a value, for example. The user can replay 740 an entire recording, a certain portion 715 (e.g., keyword or section), etc., of the voice data 710, for example. Voice playback can be started, stopped, paused, forwarded, reversed, etc., using one or more controls 750-751, for example.
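The "Heartrate 120" example above can be sketched as a parse that splits a dictated phrase into a field name and a value, attaching a crude confidence based on whether the field is in a known vocabulary. The vocabulary and confidence values are illustrative assumptions.

```python
import re

KNOWN_FIELDS = {"heartrate", "dilatation", "temperature"}  # illustrative vocabulary

def parse_dictation(phrase):
    """Split 'Heartrate 120' into field='heartrate', value='120'."""
    m = re.match(r"([a-zA-Z ]+?)\s+([\d./]+)$", phrase.strip())
    if not m:
        return None
    field, value = m.group(1).strip().lower(), m.group(2)
    confidence = 0.9 if field in KNOWN_FIELDS else 0.4  # assumed confidence index
    return {"field": field, "value": value, "confidence": confidence}

print(parse_dictation("Heartrate 120"))
# -> {'field': 'heartrate', 'value': '120', 'confidence': 0.9}
```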
  • Voice charting helps a user receive patient history and details as a clinician enters a room with the patient. A user can dictate exam results without paper or computer available (e.g., via mobile device). A viewer and/or reporting tool can prompt the user for missing exam details and/or other items and provide visual indicators and/or alerts to the user, for example. Using mobile dictation and reporting, a clinician can visit multiple patients before stopping at the computer for further documentation and analysis.
  • In certain examples, an audio and/or visual notification can be provided to a user when lab or other results are ready. Audio and/or visual notification can also be used to provide reminders for patient care.
  • “Smart” Clinical Annotations
  • In certain examples, users are able to enter clinical annotations (e.g., documented observations, events, alerts, etc.) in a structured format by giving values to specified data items with a time context. In certain examples, a system presents the user with a pre-defined set of items to annotate. Additionally, the system includes a capability to learn a clinical state of the patient, alter a user interface presented to the user, and modify workflows in the user experience (UX) appropriately. The system and associated method(s) integrate both manually documented and acquired data to recognize a state or condition of the patient. The state can represent a phase of care (e.g., pre-operative versus intra-operative phases in a surgical unit), some clinical progression (e.g., antepartum, labor, delivery, postpartum stages in a labor and delivery unit), and/or other patient status (e.g., a postpartum patient who has delivered a girl versus one who has delivered a boy). As the patient's state changes, the nature of information presented to and recorded by the end user changes to reflect the current patient state.
  • Certain examples provide systems and methods to recognize patient state and adapt presented information accordingly. The state can be a function of a single documented item including a certain expected value or being within some expected range, for example. The state can be defined by a group of items including expected values. The state can depend upon a chronological order of the recording of a group of variables, for example. A patient can also occupy multiple states concurrently. A set of rules can be configured to define the identified states.
  • For example, a labor and delivery (L&D) patient may have four (4) states—antepartum, labor, delivery, and postpartum. A system can inspect whether the patient has documented contraction intervals of five (5) minutes or less, has any recorded dilatation value less than ten (10) centimeters (cm), and is admitted to a bed in an L&D unit. If all conditions are true, then the system recognizes that the patient is in labor. Similarly, if a patient is in the postpartum state, the system can recognize that she has delivered a boy or a girl depending on the charted gender. At this point, she is assigned one state based upon the fact that she delivered and another state based upon the delivery and the baby's gender. Another example involves a patient who is in the delivery state at the same time she is in a C-section state (versus natural delivery state).
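The labor-state rule in this example (contraction intervals of five minutes or less, a dilatation value under ten centimeters, and admission to an L&D bed) can be sketched as a rules function that also illustrates a patient occupying multiple states at once. The patient-record shape is an assumption for illustration.

```python
def infer_states(patient):
    """Infer zero or more concurrent states from charted and acquired data."""
    states = set()
    # Labor rule from the example: all three conditions must hold.
    if (patient["min_contraction_interval_min"] <= 5
            and patient["dilatation_cm"] < 10
            and patient["unit"] == "L&D"):
        states.add("labor")
    # Delivery assigns one state for the fact of delivery and another
    # combining delivery with the charted gender.
    if patient.get("delivered"):
        states.add("delivered")
        states.add("delivered_" + patient["baby_gender"])
    return states

patient = {"min_contraction_interval_min": 4, "dilatation_cm": 6,
           "unit": "L&D", "delivered": False}
print(infer_states(patient))  # -> {'labor'}
```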
  • In certain examples, the user experience is altered to streamline documentation and reduce errors. Continuing the previous example, the patient may appear differently in a roster once the system recognizes the labor state. The system can also add or remove certain documentation capabilities based upon the state. In the example, once a patient has entered the post-partum/girl delivery state, the user no longer has an option to chart that the baby was circumcised. If a patient is in the natural delivery state, the user will be unable to chart any information related to performance of a C-section, for example. Thus, available annotation options can be provided, removed, limited, and/or otherwise guided based on patient state and other available patient data, for example.
  • In certain examples, systems and methods are used for clinical documentation, often in a high-paced environment in which clinicians are caring for multiple patients at the same time. Awareness of patient state allows a system to streamline workflows, presenting users with only pertinent options for use and documentation at given state(s). Thus, less time can be spent doing the work of documentation and more time spent attending to the patient. Additionally, by only allowing a user to record appropriate information, inconsistencies and errant conflicts in the record can be minimized or prevented.
  • Streamlining nurse workflow allows nurses and/or other healthcare practitioners to spend more time interacting with their patients and less time documenting on the computer. Additionally, error reduction is provided by lessening clinical risk and legal liability in the case of inaccurate information being stored in the patient's record.
  • In certain examples, inconsistencies can also be reported after documentation is complete. Post-hoc examination can identify specific data items that conflict with each other.
  • In certain examples, a rules engine can be applied to infer a patient state. In certain examples, multiple states can be applied to a patient at the same time. In certain examples, sub-states and/or combination states (e.g., state delivered+state girl=state delivered girl) can be provided. In certain examples, documented items from external data sources (e.g., external information systems via interface, fetal monitor, other devices, etc.) can be examined by a system to determine state.
  • In certain examples, a user can also explicitly specify a patient state to confirm, correct, or override an automated system determination of state. The system can then provide intelligent documentation capabilities based upon the declared state.
  • FIG. 8 illustrates an example method 800 for smart clinical annotation of patient information in a clinical (e.g., perinatal) workflow. At 810, one or more applicable patient states are identified. For example, one or more states indicative of patient condition, patient status, patient treatment, etc., are automatically identified based on stored patient data (e.g., EMR data, personal health record (PHR) data, radiology information system (RIS) data, picture archiving and communication system (PACS) data, etc.).
  • At 820, based on patient state(s), allowable annotations made by a user on a patient record are adjusted. For example, as a user is charting that a patient is eight (8) cm dilated, the type of available annotations is automatically adjusted to be more related to baby delivery than if the user were annotating that the patient is two (2) cm dilated.
  • At 830, based on one or more completed fields, values are suggested for remaining fields. For example, a user begins to input information into a field and values can then be suggested for one or more remaining fields based on the existing input and a historical data store of annotations.
  • At 840, data abnormalities or inconsistencies are identified. For example, data that does not make sense given other provided data is flagged. For example, a user cannot chart about a circumcision when the baby is a girl. In certain examples, certain choices may not be provided to a user based on the other information available. Based on this “smart”, more efficient charting, errors can be reduced or prevented. Clinical decision support and rules can be used to support such “smart” charting.
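Steps 820 and 840 can be sketched together: the documentation options offered to the user are filtered by patient state, and charted entries that conflict with the known state are flagged. The option names and state labels below are illustrative assumptions.

```python
ALL_OPTIONS = {"circumcision", "apgar_score", "c_section_note", "feeding_note"}

def allowed_options(states):
    """Step 820: adjust allowable annotations based on patient state(s)."""
    options = set(ALL_OPTIONS)
    if "delivered_girl" in states:
        options.discard("circumcision")      # not chartable for a delivered girl
    if "natural_delivery" in states:
        options.discard("c_section_note")    # no C-section charting for natural delivery
    return options

def flag_inconsistencies(states, charted):
    """Step 840: flag charted entries that conflict with the patient state."""
    return charted - allowed_options(states)

states = {"delivered", "delivered_girl", "natural_delivery"}
print(allowed_options(states))
print(flag_inconsistencies(states, {"apgar_score", "circumcision"}))
```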
  • At 850, values are calculated automatically based on annotation input. The calculated values can form part of an annotation and/or can be approved by a user and placed into a patient record and/or report. For example, an annotation of a waveform can automatically trigger a waveform analysis that is pulled into an annotation. For example, a user can mark a fifteen (15) minute window and values can be calculated based on that marked window. The automatically calculated values can be approved and dropped into a record.
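Step 850's marked-window calculation can be sketched as summary statistics over waveform samples falling inside a user-marked window (here, mean and range of fetal heart rate over a fifteen-minute span). The sample data and summary fields are illustrative.

```python
# Synthetic (minute, bpm) samples standing in for a live waveform feed.
samples = [(t, 135 + (t % 7)) for t in range(0, 60)]

def window_summary(data, start_min, end_min):
    """Automatically calculate values for a user-marked window."""
    values = [v for t, v in data if start_min <= t < end_min]
    return {"mean_bpm": round(sum(values) / len(values), 1),
            "min_bpm": min(values),
            "max_bpm": max(values)}

summary = window_summary(samples, 15, 30)  # a fifteen-minute marked window
print(summary)  # values a user could approve and drop into the record
```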
  • At 860, an annotation can be selected and copied into another annotation, record, and/or report. For example, recent documentation can be selected and copied by a user into another annotation, patient record, report, etc.
  • FIG. 9 illustrates an example interface 900 for expanded annotation review. The annotation review 900 includes clinical data 910 for a patient, clinician annotation(s) 920, system or automatically generated/determined annotation(s) 930, and one or more controls including a collapse/expand control 940, a search control 950, etc.
  • FIGS. 3, 5, 6, and 8 are flow diagrams representative of example machine readable instructions that may be executed to implement example systems and methods described herein, and/or portions of one or more of those systems (e.g., systems 100 and 1100) and methods. The example processes of FIGS. 3, 5, 6, and 8 can be performed using a processor, a controller and/or any other suitable processing device. For example, the example processes of FIGS. 3, 5, 6, and 8 can be implemented using coded instructions (e.g., computer readable instructions) stored on a tangible computer readable medium such as a flash memory, a read-only memory (ROM), and/or a random-access memory (RAM). As used herein, the term tangible computer readable medium is expressly defined to include any type of computer readable storage and to exclude propagating signals. Additionally or alternatively, the example processes of FIGS. 3, 5, 6, and 8 can be implemented using coded instructions (e.g., computer readable instructions) stored on a non-transitory computer readable medium such as a flash memory, a read-only memory (ROM), a random-access memory (RAM), a cache, or any other storage media in which information is stored for any duration (e.g., for extended time periods, permanently, brief instances, for temporarily buffering, and/or for caching of the information). As used herein, the term non-transitory computer readable medium is expressly defined to include any type of computer readable medium and to exclude propagating signals.
  • Alternatively, some or all of the example processes of FIGS. 3, 5, 6, and 8 can be implemented using any combination(s) of application specific integrated circuit(s) (ASIC(s)), programmable logic device(s) (PLD(s)), field programmable logic device(s) (FPLD(s)), discrete logic, hardware, firmware, etc. Also, some or all of the example processes of FIGS. 3, 5, 6, and 8 can be implemented manually or as any combination(s) of any of the foregoing techniques, for example, any combination of firmware, software, discrete logic and/or hardware. Further, although the example processes of FIGS. 3, 5, 6, and 8 are described with reference to the flow diagrams of FIGS. 3, 5, 6, and 8, other methods of implementing the processes of FIGS. 3, 5, 6, and 8 can be employed. For example, the order of execution of the blocks can be changed, and/or some of the blocks described can be changed, eliminated, sub-divided, or combined. Additionally, any or all of the example processes of FIGS. 3, 5, 6, and 8 can be performed sequentially and/or in parallel by, for example, separate processing threads, processors, devices, discrete logic, circuits, etc.
  • FIG. 10 is a block diagram of an example processor system 1010 that can be used to implement the systems, apparatus and methods described herein. As shown in FIG. 10, the processor system 1010 includes a processor 1012 that is coupled to an interconnection bus 1014. The processor 1012 can be any suitable processor, processing unit or microprocessor. Although not shown in FIG. 10, the system 1010 can be a multi-processor system and, thus, can include one or more additional processors that are identical or similar to the processor 1012 and that are communicatively coupled to the interconnection bus 1014.
  • The processor 1012 of FIG. 10 is coupled to a chipset 1018, which includes a memory controller 1020 and an input/output (I/O) controller 1022. As is well known, a chipset typically provides I/O and memory management functions as well as a plurality of general purpose and/or special purpose registers, timers, etc. that are accessible or used by one or more processors coupled to the chipset 1018. The memory controller 1020 performs functions that enable the processor 1012 (or processors if there are multiple processors) to access a system memory 1024 and a mass storage memory 1025.
  • The system memory 1024 may include any desired type of volatile and/or non-volatile memory such as, for example, static random access memory (SRAM), dynamic random access memory (DRAM), flash memory, read-only memory (ROM), etc. The mass storage memory 1025 may include any desired type of mass storage device including hard disk drives, optical drives, tape storage devices, etc.
  • The I/O controller 1022 performs functions that enable the processor 1012 to communicate with peripheral input/output (I/O) devices 1026 and 1028 and a network interface 1030 via an I/O bus 1032. The I/O devices 1026 and 1028 may be any desired type of I/O device such as, for example, a keyboard, a video display or monitor, a mouse, etc. The network interface 1030 may be, for example, an Ethernet device, an asynchronous transfer mode (ATM) device, an 802.11 device, a DSL modem, a cable modem, a cellular modem, etc. that enables the processor system 1010 to communicate with another processor system.
  • While the memory controller 1020 and the I/O controller 1022 are depicted in FIG. 10 as separate blocks within the chipset 1018, the functions performed by these blocks may be integrated within a single semiconductor circuit or may be implemented using two or more separate integrated circuits.
  • Thus, certain examples provide one or more floating windows or “always available” viewers that provide streaming, real-time, or “live” data to a user regarding one or more of his/her patients. Data can include fetal and/or patient waveform data, patient time continuum, voice record, annotations, reports, etc. The floating viewers can be combined, separated, etc., and positioned at any location on a user's display. Certain examples provide rules-based limitations and/or assistance regarding annotations, reporting, charting, etc. Certain examples provide speech-to-text conversion for review, playback, and inclusion in annotations, reports, charting, etc.
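A minimal sketch of a rules-based alert evaluated over streaming values, in the spirit of the defined criterion described above (the criterion, threshold, and variable names are assumptions for illustration, not taken from the disclosure):

```python
# Hypothetical rule factory: an alert fires when a streamed value
# satisfies a user-defined criterion. All names/thresholds are illustrative.
def make_alert_rule(criterion):
    def check(value):
        return criterion(value)
    return check

# Example criterion: flag a low fetal heart rate reading.
low_fhr = make_alert_rule(lambda bpm: bpm < 110)

stream = [140, 138, 105, 142]  # stand-in for live streaming samples
alerts = [i for i, bpm in enumerate(stream) if low_fhr(bpm)]
print(alerts)  # indices of samples that triggered the alert
```

In practice the criterion would be configurable per patient rather than hard-coded, consistent with the rules-based assistance the paragraph describes.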
  • Certain examples contemplate methods, systems and computer program products on any machine-readable media to implement functionality described above. Certain examples can be implemented using an existing computer processor, or by a special purpose computer processor incorporated for this or another purpose or by a hardwired and/or firmware system, for example.
  • One or more of the components of the systems and/or steps of the methods described above may be implemented alone or in combination in hardware, firmware, and/or as a set of instructions in software, for example. Certain examples can be provided as a set of instructions residing on a computer-readable medium, such as a memory, hard disk, DVD, or CD, for execution on a general purpose computer or other processing device. Certain examples can omit one or more of the method steps and/or perform the steps in a different order than listed. For example, some steps/blocks may not be performed in certain examples. As a further example, certain steps may be performed in a different temporal order than listed above, including simultaneously.
  • Certain examples include computer-readable media for carrying or having computer-executable instructions or data structures stored thereon. Such computer-readable media can be any available media that may be accessed by a general purpose or special purpose computer or other machine with a processor. By way of example, such computer-readable media can include RAM, ROM, PROM, EPROM, EEPROM, Flash, CD-ROM, DVD, Blu-ray, optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to carry or store desired program code in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer or other machine with a processor. Combinations of the above are also included within the scope of computer-readable media. Computer-executable instructions comprise, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing machines to perform a certain function or group of functions.
  • Generally, computer-executable instructions include routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular abstract data types. Computer-executable instructions, associated data structures, and program modules represent examples of program code for executing steps of certain methods and systems disclosed herein. The particular sequence of such executable instructions or associated data structures represent examples of corresponding acts for implementing the functions described in such steps.
  • Certain examples can be practiced in a networked environment using logical connections to one or more remote computers having processors. Logical connections can include a local area network (LAN) and a wide area network (WAN) that are presented here by way of example and not limitation. Such networking environments are commonplace in office-wide or enterprise-wide computer networks, intranets and the Internet and can use a wide variety of different communication protocols. Those skilled in the art will appreciate that such network computing environments will typically encompass many types of computer system configurations, including personal computers, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, and the like. Examples can also be practiced in distributed computing environments where tasks are performed by local and remote processing devices that are linked (either by hardwired links, wireless links, or by a combination of hardwired or wireless links) through a communications network. In a distributed computing environment, program modules may be located in both local and remote memory storage devices.
  • An exemplary system for implementing the overall system or portions of embodiments of the invention might include a general purpose computing device in the form of a computer, including a processing unit, a system memory, and a system bus that couples various system components including the system memory to the processing unit. The system memory may include read-only memory (ROM) and random access memory (RAM). The computer may also include a magnetic hard disk drive for reading from and writing to a magnetic hard disk, a magnetic disk drive for reading from or writing to a removable magnetic disk, and an optical disk drive for reading from or writing to a removable optical disk such as a CD-ROM or other optical media. The drives and their associated computer-readable media provide nonvolatile storage of computer-executable instructions, data structures, program modules and other data for the computer.
  • While the invention has been described with reference to certain examples or embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the scope of the invention. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the invention without departing from its scope. Therefore, it is intended that the invention not be limited to the particular embodiment or example disclosed, but that the invention will include all embodiments falling within the scope of the description and appended claims.

Claims (21)

1. A clinical dock interaction display system comprising:
a memory to buffer live streaming data for one or more patients;
a user interface to display and receive input with respect to:
a list of one or more patients associated with a clinician;
a control to facilitate user selection of one or more patients from the list; and
live streaming data received from one or more monitors for one or more selected patients,
wherein the user interface is to facilitate user selection of a patient for a more detailed patient view;
an alert for one or more selected patients to be triggered based on a defined criterion; and
a processor to process data for output via the user interface and to process user input.
2. The system of claim 1, wherein the user interface further comprises a search component to facilitate a user's search for a patient based on a patient identifier.
3. The system of claim 1, wherein the user interface further comprises an input to allow a user to create a patient record.
4. The system of claim 1, wherein the live streaming data comprises electrocardiogram waveform data.
5. The system of claim 1, further comprising a dynamically updating numerical value to be calculated from the live streaming data, the numerical value to be displayed via the user interface.
6. The system of claim 5, wherein the user interface is to facilitate user confirmation of the numerical value and is to automatically insert the numerical value into a report.
7. The system of claim 1, wherein the user interface is to be dynamically sizable to be displayed in conjunction with other applications and data on a user display.
8. The system of claim 1, wherein the system comprises a supplemental display connectable to a user's primary display.
9. The system of claim 8, wherein the supplemental display comprises a mobile device display.
10. The system of claim 1, wherein the user interface further comprises an input to facilitate user marking of the live streaming data of a selected patient to make an annotation with respect to the live streaming data for the patient.
11. A computer-implemented method for clinical patient monitoring comprising:
displaying a list of one or more patients associated with a clinician;
facilitating user selection of one or more patients from the list;
providing, via a user interface, live streaming data received from one or more monitors for one or more selected patients;
providing, upon user selection of a patient via the user interface, a more detailed patient view for the selected patient; and
generating an alert for one or more selected patients to be triggered based on a defined criterion.
12. The method of claim 11, wherein the user interface further comprises a search component to facilitate a user's search for a patient based on a patient identifier.
13. The method of claim 11, wherein the user interface further comprises an input to allow a user to create a patient record.
14. The method of claim 11, wherein the live streaming data comprises electrocardiogram waveform data.
15. The method of claim 11, further comprising calculating a dynamically updating numerical value from the live streaming data, the numerical value to be displayed via the user interface.
16. The method of claim 15, wherein the user interface is to facilitate user confirmation of the numerical value and is to automatically insert the numerical value into a report.
17. The method of claim 11, wherein the user interface is to be dynamically sizable to be displayed in conjunction with other applications and data on a user display.
18. The method of claim 11, wherein the user interface further comprises an input to facilitate a user marking the live streaming data of a selected patient to make an annotation with respect to the live streaming data for the patient.
19. A tangible computer readable storage medium including executable program instructions which, when executed by a computer processor, cause the computer to implement a clinical dock interaction display system, the clinical dock interaction display system comprising:
a user interface to display and receive input with respect to:
a list of one or more patients associated with a clinician;
a control to facilitate user selection of one or more patients from the list; and
live streaming data received from one or more monitors for one or more selected patients,
wherein the user interface is to facilitate user selection of a patient for a more detailed patient view; and
an alert for one or more selected patients to be triggered based on a defined criterion.
20. The computer readable medium of claim 19, further comprising a dynamically updating numerical value to be calculated from the live streaming data, the numerical value to be displayed via the user interface.
21. The computer readable medium of claim 19, wherein the user interface further comprises an input to facilitate user marking of the live streaming data of a selected patient to make an annotation with respect to the live streaming data for the patient.
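Purely as an illustrative sketch of the kind of arrangement recited above — a memory buffering live streaming data per patient, a dynamically updating numerical value calculated from that data, and user annotations against the stream — the following uses hypothetical class and field names that do not appear in the disclosure:

```python
from collections import deque

class PatientDock:
    """Illustrative model only: buffers live samples per patient,
    recalculates a numerical value from the buffer, and records
    user annotations marking positions in the stream."""

    def __init__(self, buffer_size=8):
        self.buffers = {}      # patient id -> recent streamed samples
        self.annotations = {}  # patient id -> list of (sample index, note)
        self.buffer_size = buffer_size

    def ingest(self, patient_id, sample):
        # The deque acts as the buffering memory; old samples roll off.
        buf = self.buffers.setdefault(patient_id, deque(maxlen=self.buffer_size))
        buf.append(sample)

    def current_value(self, patient_id):
        # Dynamically updating numerical value derived from the buffer.
        buf = self.buffers[patient_id]
        return round(sum(buf) / len(buf), 1)

    def annotate(self, patient_id, note):
        # Mark the most recent sample with a user annotation.
        marks = self.annotations.setdefault(patient_id, [])
        marks.append((len(self.buffers[patient_id]) - 1, note))

dock = PatientDock()
for s in (142, 138, 140):
    dock.ingest("pt-1", s)
dock.annotate("pt-1", "accel noted")
print(dock.current_value("pt-1"))
```

A real implementation would of course source samples from monitors over a network rather than a literal list, and would bind alert criteria to each buffer.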
US12/970,573 2010-09-29 2010-12-16 Systems and methods for improved perinatal workflow Abandoned US20120078647A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US12/970,573 US20120078647A1 (en) 2010-09-29 2010-12-16 Systems and methods for improved perinatal workflow
US29/535,623 USD792431S1 (en) 2010-09-29 2015-08-07 Display screen or portion thereof with graphical user interface
US29/606,622 USD846579S1 (en) 2010-09-29 2017-06-06 Display screen or portion thereof with graphical user interface
US29/686,632 USD933689S1 (en) 2010-09-29 2019-04-05 Display screen or portion thereof with graphical user interface

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US38792210P 2010-09-29 2010-09-29
US12/970,573 US20120078647A1 (en) 2010-09-29 2010-12-16 Systems and methods for improved perinatal workflow

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US29/535,623 Continuation USD792431S1 (en) 2010-09-29 2015-08-07 Display screen or portion thereof with graphical user interface

Publications (1)

Publication Number Publication Date
US20120078647A1 true US20120078647A1 (en) 2012-03-29

Family

ID=45870084

Family Applications (5)

Application Number Title Priority Date Filing Date
US12/970,573 Abandoned US20120078647A1 (en) 2010-09-29 2010-12-16 Systems and methods for improved perinatal workflow
US12/970,563 Active 2034-06-14 US9292656B2 (en) 2010-09-29 2010-12-16 Systems and methods for improved perinatal workflow
US29/535,623 Active USD792431S1 (en) 2010-09-29 2015-08-07 Display screen or portion thereof with graphical user interface
US29/606,622 Active USD846579S1 (en) 2010-09-29 2017-06-06 Display screen or portion thereof with graphical user interface
US29/686,632 Active USD933689S1 (en) 2010-09-29 2019-04-05 Display screen or portion thereof with graphical user interface

Family Applications After (4)

Application Number Title Priority Date Filing Date
US12/970,563 Active 2034-06-14 US9292656B2 (en) 2010-09-29 2010-12-16 Systems and methods for improved perinatal workflow
US29/535,623 Active USD792431S1 (en) 2010-09-29 2015-08-07 Display screen or portion thereof with graphical user interface
US29/606,622 Active USD846579S1 (en) 2010-09-29 2017-06-06 Display screen or portion thereof with graphical user interface
US29/686,632 Active USD933689S1 (en) 2010-09-29 2019-04-05 Display screen or portion thereof with graphical user interface

Country Status (1)

Country Link
US (5) US20120078647A1 (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120278099A1 (en) * 2011-04-26 2012-11-01 Cerner Innovation, Inc. Monitoring, capturing, measuring and annotating physiological waveform data
CN103593721A (en) * 2012-08-13 2014-02-19 中国商用飞机有限责任公司 A method for monitoring a business flow based on complex event processing technology
WO2014144339A1 (en) * 2013-03-15 2014-09-18 Zoll Medical Corporation Patient monitor screen aggregation
US20150269317A1 (en) * 2014-03-18 2015-09-24 Cameron Marcum Methods and apparatus for generating and evaluating modified data structures
USD792431S1 (en) 2010-09-29 2017-07-18 General Electric Company Display screen or portion thereof with graphical user interface
US20170357765A1 (en) * 2016-06-13 2017-12-14 Medical Informatics Corporation User interface for configurably displaying real-time data for multiple patients
USD831038S1 (en) * 2015-11-25 2018-10-16 General Electric Company Display screen or portion thereof with transitional graphical user interface
USD839883S1 (en) * 2015-11-25 2019-02-05 General Electric Company Display screen or portion thereof with graphical user interface
US20200121188A1 (en) * 2017-06-16 2020-04-23 Koninklijke Philips N.V. Annotating fetal monitoring data
US10991135B2 (en) * 2015-08-11 2021-04-27 Masimo Corporation Medical monitoring analysis and replay including indicia responsive to light attenuated by body tissue
US11086499B2 (en) * 2018-03-23 2021-08-10 Nihon Kohden Corporation Portable information terminal, biological information management method, biological information management program and computer-readable storage medium
USD938961S1 (en) * 2019-08-14 2021-12-21 GE Precision Healthcare LLC Display screen with graphical user interface
US11553885B2 (en) * 2019-06-20 2023-01-17 Nihon Kohden Corporation Patient monitor with user input to rearrange patient display areas
US11961597B1 (en) * 2014-05-31 2024-04-16 Allscripts Software, Llc User interface detail optimizer

Families Citing this family (42)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140249850A1 (en) * 2013-03-01 2014-09-04 James Thomas Woodson Critical condition module
US20150066582A1 (en) * 2013-08-30 2015-03-05 Pipelinersales Corporation Methods, Systems, and Graphical User Interfaces for Customer Relationship Management
USD816678S1 (en) 2014-07-03 2018-05-01 Verizon Patent And Licensing Inc. Display panel or screen with graphical user interface
USD828364S1 (en) * 2014-07-03 2018-09-11 Verizon Patent And Licensing Inc. Display panel for a graphical user interface with flip notification
US10120529B2 (en) 2014-07-08 2018-11-06 Verizon Patent And Licensing Inc. Touch-activated and expandable visual navigation of a mobile device via a graphic selection element
USD832874S1 (en) * 2015-02-19 2018-11-06 Cerner Innovation, Inc. Display screen with graphical user interface
KR102368689B1 (en) * 2015-03-20 2022-03-02 삼성디스플레이 주식회사 Display module and method for controlling the same
USD836655S1 (en) * 2015-04-06 2018-12-25 Domo, Inc Display screen or portion thereof with a graphical user interface
US9467745B1 (en) 2015-04-06 2016-10-11 Domo, Inc. Viewer traffic visualization platform
USD769908S1 (en) 2015-08-07 2016-10-25 Domo, Inc. Display screen or portion thereof with a graphical user interface for analytics
USD823313S1 (en) * 2015-09-03 2018-07-17 Eaton Industries (France) S.A.S. Display screen or portion thereof with a graphical user interface
USD810119S1 (en) * 2015-10-29 2018-02-13 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
JP1572638S (en) * 2016-03-24 2017-03-27
USD886116S1 (en) 2016-04-14 2020-06-02 Markup Llc Display screen portion with graphical user interface
USD838278S1 (en) * 2016-09-29 2019-01-15 United Services Automobile Association (Usaa) Display screen or portion thereof with a payday forecast graphical user interface
USD829729S1 (en) * 2016-09-29 2018-10-02 United Services Automobile Association (Usaa) Display screen or portion thereof with graphical user interface
USD817988S1 (en) * 2016-11-17 2018-05-15 Trivver, Inc. Display screen or portion thereof with tab based graphical user interface
USD827659S1 (en) * 2016-12-19 2018-09-04 Illumina, Inc. Display screen or portion thereof with graphical user interface of a structural genetic variation indicator
USD827660S1 (en) * 2016-12-19 2018-09-04 Illumina, Inc. Display screen or portion thereof with graphical user interface of a structural genetic variation indicator
USD864223S1 (en) * 2017-03-16 2019-10-22 General Electric Company Display screen with graphical user interface
USD864224S1 (en) * 2017-03-16 2019-10-22 General Electric Company Display screen with graphical user interface
USD862491S1 (en) 2017-08-29 2019-10-08 General Electric Company Display screen or portion thereof with graphical user interface
USD852822S1 (en) * 2017-11-24 2019-07-02 Siemens Aktiengesellschaft Display screen with graphical user interface
USD892829S1 (en) * 2018-04-20 2020-08-11 Adp, Llc Display screen or a portion thereof with an animated graphical user interface
USD892149S1 (en) * 2018-04-20 2020-08-04 Adp, Llc Display screen or a portion thereof with an animated graphical user interface
USD892148S1 (en) * 2018-04-20 2020-08-04 Adp, Llc Display screen or a portion thereof with an animated graphical user interface
USD864221S1 (en) * 2018-08-21 2019-10-22 Google Llc Display screen with animated graphical user interface
USD948543S1 (en) * 2018-10-26 2022-04-12 Hvr Mso Llc Display screen or portion thereof with a graphical user interface
USD920343S1 (en) * 2019-01-09 2021-05-25 Bigfoot Biomedical, Inc. Display screen or portion thereof with graphical user interface associated with insulin delivery
USD913302S1 (en) * 2019-02-21 2021-03-16 Polestar Performance Ab Display screen or portion thereof with animated graphical user interface
USD914035S1 (en) * 2019-02-21 2021-03-23 Polestar Performance Ab Display screen or portion thereof with animated graphical user interface
USD905073S1 (en) * 2019-02-21 2020-12-15 Polestar Performance Ab Display screen or portion thereof with graphical user interface
USD913303S1 (en) * 2019-02-21 2021-03-16 Polestar Performance Ab Display screen or portion thereof with animated graphical user interface
USD914036S1 (en) * 2019-02-26 2021-03-23 Polestar Performance Ab Display screen or portion thereof with animated graphical user interface
USD916873S1 (en) 2019-06-19 2021-04-20 Stryker Corporation Display screen or portion thereof with graphical user interface
USD931299S1 (en) 2019-06-25 2021-09-21 Stryker Corporation Display screen or portion thereof with graphical user interface
USD919640S1 (en) * 2020-01-24 2021-05-18 Caterpillar Inc. Display screen with graphical user interface
USD974370S1 (en) 2020-04-03 2023-01-03 Markup Llc Display screen portion with graphical user interface
USD946025S1 (en) * 2020-10-19 2022-03-15 Splunk Inc. Display screen or portion thereof having a graphical user interface for monitoring information
USD946026S1 (en) * 2020-10-19 2022-03-15 Splunk Inc. Display screen or portion thereof having a graphical user interface for a metrics-based presentation of information
USD955409S1 (en) * 2020-12-15 2022-06-21 Cowbell Cyber, Inc. Display screen or portion thereof with a graphical user interface
USD945485S1 (en) * 2020-12-15 2022-03-08 Cowbell Cyber, Inc. Display screen or portion thereof with a graphical user interface

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5920317A (en) * 1996-06-11 1999-07-06 Vmi Technologies Incorporated System and method for storing and displaying ultrasound images
US20100145729A1 (en) * 2006-07-18 2010-06-10 Barry Katz Response scoring system for verbal behavior within a behavioral stream with a remote central processing system and associated handheld communicating devices
US20100235782A1 (en) * 2009-03-11 2010-09-16 Airstrip Development, L.P. Systems and Methods For Viewing Patient Data
US20110087501A1 (en) * 2009-10-08 2011-04-14 Digital Healthcare Systems, Inc. Systems and methods for managing at-home medical prevention, recovery, and maintenance
US20110246217A1 (en) * 2010-04-05 2011-10-06 MobiSante Inc. Sampling Patient Data

Family Cites Families (64)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5838313A (en) * 1995-11-20 1998-11-17 Siemens Corporate Research, Inc. Multimedia-based reporting system with recording and playback of dynamic annotation
USD428398S (en) * 1998-05-07 2000-07-18 Apple Computer, Inc. Menu design for a computer display screen
USD441761S1 (en) * 1999-08-12 2001-05-08 Hitachi, Ltd. Computer generated icon for a display screen
EP1384191B1 (en) * 2000-11-17 2006-08-30 Draeger Medical Systems, Inc. A system and method for annotating patient medical information
US8956292B2 (en) * 2005-03-02 2015-02-17 Spacelabs Healthcare Llc Trending display of patient wellness
USD573153S1 (en) * 2005-12-23 2008-07-15 Navio Systems, Inc. Graphical user interface embodied in a display panel
US20080208631A1 (en) * 2007-02-22 2008-08-28 General Electric Company Methods and systems for providing clinical documentation for a patient lifetime in a single interface
USD582923S1 (en) * 2007-09-17 2008-12-16 Sap Ag Display panel with a computer-generated icon
USD582929S1 (en) * 2007-09-17 2008-12-16 Sap Ag Display panel with a computer-generated icon
USD582930S1 (en) * 2007-09-17 2008-12-16 Sap Ag Display panel with a computer-generated icon
USD582926S1 (en) * 2007-09-17 2008-12-16 Sap Ag Display panel with a transitional computer-generated icon
USD582928S1 (en) * 2007-09-17 2008-12-16 Sap Ag Display panel with a computer-generated icon
WO2009066222A2 (en) * 2007-11-19 2009-05-28 Koninklijke Philips Electronics N.V. System for storing data of interventional procedure
US7805320B2 (en) 2008-01-10 2010-09-28 General Electric Company Methods and systems for navigating a large longitudinal dataset using a miniature representation in a flowsheet
US20100131293A1 (en) 2008-11-26 2010-05-27 General Electric Company Interactive multi-axis longitudinal health record systems and methods of use
US8154723B2 (en) * 2009-04-03 2012-04-10 Sharp Laboratories Of America, Inc. Method and systems for particle characterization using optical sensor output signal fluctuation
USD626133S1 (en) * 2010-02-04 2010-10-26 Microsoft Corporation User interface for a display screen
USD678302S1 (en) * 2010-05-26 2013-03-19 Covidien Lp Display screen with a transitional graphical user interface
US20120078647A1 (en) 2010-09-29 2012-03-29 General Electric Company Systems and methods for improved perinatal workflow
USD665397S1 (en) * 2010-10-04 2012-08-14 Microsoft Corporation Display screen with graphical user interface
USD670726S1 (en) * 2011-01-24 2012-11-13 Microsoft Corporation Display screen with animated graphical user interface
USD708199S1 (en) * 2011-10-06 2014-07-01 Tetra Laval Holdings & Finance S.A. Display screen with graphical user interface
USD664984S1 (en) * 2011-09-12 2012-08-07 Microsoft Corporation Display screen with animated user interface
USD735736S1 (en) * 2012-01-06 2015-08-04 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface
USD705251S1 (en) * 2012-02-09 2014-05-20 Microsoft Corporation Display screen with animated graphical user interface
EP2847659B1 (en) * 2012-05-09 2019-09-04 Apple Inc. Device, method, and graphical user interface for transitioning between display states in response to a gesture
USD689082S1 (en) * 2012-06-19 2013-09-03 Mark A. Stiffler Display screen or portion thereof with transitional graphical user interface
USD689085S1 (en) * 2012-07-10 2013-09-03 Microsoft Corporation Display screen with animated graphical user interface
USD689083S1 (en) * 2012-07-10 2013-09-03 Microsoft Corporation Display screen with animated graphical user interface
USD735752S1 (en) * 2012-08-14 2015-08-04 Samsung Electronics Co., Ltd. Electronic device with an animated graphical user interface
USD724603S1 (en) * 2012-10-10 2015-03-17 Citrix Systems, Inc. Display screen with animated user interface
US8887090B2 (en) * 2012-10-31 2014-11-11 General Electric Company Surfacing of detailed information via formlets
EP2741192A3 (en) * 2012-12-06 2016-09-14 Samsung Electronics Co., Ltd Display device for executing a plurality of applications and method for controlling the same
US20140206970A1 (en) * 2013-01-22 2014-07-24 Park Nicollet Institute Evaluation and display of glucose data
USD717328S1 (en) * 2013-03-05 2014-11-11 Xian Qian Lin Display screen or portion thereof with graphical user interface
USD773478S1 (en) * 2013-03-15 2016-12-06 Park Nicollet Institute Graphical data display screen with graphical user interface
USD737281S1 (en) * 2013-05-23 2015-08-25 Google Inc. Display panel or portion thereof with a changeable graphical user interface component
USD736788S1 (en) * 2013-05-24 2015-08-18 Google Inc. Display panel or portion thereof with a changeable graphical user interface component
USD736787S1 (en) * 2013-05-24 2015-08-18 Google Inc. Display panel or portion thereof with a changeable graphical user interface component
USD725140S1 (en) * 2013-06-28 2015-03-24 Microsoft Corporation Display screen with graphical user interface
USD729834S1 (en) * 2013-06-28 2015-05-19 Microsoft Corporation Display screen with graphical user interface
USD749107S1 (en) * 2013-09-24 2016-02-09 Yamaha Corporation Display screen with animated graphical user interface
USD752616S1 (en) * 2013-10-23 2016-03-29 Ares Trading S.A. Display screen with graphical user interface
USD756372S1 (en) * 2013-12-02 2016-05-17 Symantec Corporation Display screen with graphical user interface
USD756371S1 (en) * 2013-12-02 2016-05-17 Symantec Corporation Display screen with graphical user interface
USD759073S1 (en) * 2014-02-19 2016-06-14 Winklevoss Ip Llc Display screen portion with graphical user interface
USD757768S1 (en) * 2014-02-21 2016-05-31 Titus Inc. Display screen with graphical user interface
USD752092S1 (en) * 2014-03-28 2016-03-22 Microsoft Corporation Display screen with animated graphical user interface
USD771657S1 (en) * 2014-05-22 2016-11-15 Samsung Electronics Co., Ltd. Display screen or portion thereof with a graphical user interface
USD751581S1 (en) * 2014-05-22 2016-03-15 Samsung Electronics Co., Ltd. Display screen or portion thereof with a graphical user interface
USD757750S1 (en) * 2014-05-30 2016-05-31 Microsoft Corporation Display screen with animated graphical user interface
USD799534S1 (en) * 2014-06-24 2017-10-10 Robert Bosch Gmbh Display screen with graphical user interface
US9864797B2 (en) * 2014-10-09 2018-01-09 Splunk Inc. Defining a new search based on displayed graph lanes
JP1554194S (en) * 2015-02-25 2016-07-19
USD776136S1 (en) * 2015-03-02 2017-01-10 Envision Energy (Jiangsu) Co., Ltd. Display screen with a downtime analyzer graphical user interface
USD766956S1 (en) * 2015-04-28 2016-09-20 IncludeFitness, Inc. Display screen with an animated graphical user interface
USD775637S1 (en) * 2015-07-28 2017-01-03 Microsoft Corporation Display screen with animated graphical user interface
USD773487S1 (en) * 2015-08-31 2016-12-06 Practice Fusion, Inc. Display screen or portion thereof with animated graphical user interface
GB2552274A (en) * 2015-11-09 2018-01-17 Sky Cp Ltd Television user interface
USD817988S1 (en) * 2016-11-17 2018-05-15 Trivver, Inc. Display screen or portion thereof with tab based graphical user interface
USD810760S1 (en) * 2016-12-22 2018-02-20 Palantir Technologies, Inc. Display screen or portion thereof with transitional graphical user interface
USD831672S1 (en) * 2016-12-23 2018-10-23 Teletracking Technologies, Inc. Display screen with animated graphical user interface
EP3701853A4 (en) * 2017-10-27 2021-11-03 Shenzhen Mindray Bio-Medical Electronics Co., Ltd. Monitor, display method applied to monitor, display device, and storage medium
USD916823S1 (en) * 2019-07-19 2021-04-20 eCU Technology, LLC Display screen or portion thereof with graphical user interface

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5920317A (en) * 1996-06-11 1999-07-06 Vmi Technologies Incorporated System and method for storing and displaying ultrasound images
US20100145729A1 (en) * 2006-07-18 2010-06-10 Barry Katz Response scoring system for verbal behavior within a behavioral stream with a remote central processing system and associated handheld communicating devices
US20100235782A1 (en) * 2009-03-11 2010-09-16 Airstrip Development, L.P. Systems and Methods For Viewing Patient Data
US20110087501A1 (en) * 2009-10-08 2011-04-14 Digital Healthcare Systems, Inc. Systems and methods for managing at-home medical prevention, recovery, and maintenance
US20110246217A1 (en) * 2010-04-05 2011-10-06 MobiSante Inc. Sampling Patient Data

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USD792431S1 (en) 2010-09-29 2017-07-18 General Electric Company Display screen or portion thereof with graphical user interface
USD933689S1 (en) 2010-09-29 2021-10-19 General Electric Company Display screen or portion thereof with graphical user interface
USD846579S1 (en) 2010-09-29 2019-04-23 General Electric Company Display screen or portion thereof with graphical user interface
US20200357493A1 (en) * 2011-04-26 2020-11-12 Cerner Innovation, Inc. Monitoring, capturing, measuring and annotating physiological waveform data
US20120278099A1 (en) * 2011-04-26 2012-11-01 Cerner Innovation, Inc. Monitoring, capturing, measuring and annotating physiological waveform data
CN103593721A (en) * 2012-08-13 2014-02-19 Commercial Aircraft Corporation of China, Ltd. (中国商用飞机有限责任公司) A method for monitoring a business flow based on complex event processing technology
US11576576B2 (en) 2013-03-15 2023-02-14 Zoll Medical Corporation Patient monitor screen aggregation
US9788724B2 (en) 2013-03-15 2017-10-17 Zoll Medical Corporation Patient monitor screen aggregation
US11179037B2 (en) 2013-03-15 2021-11-23 Zoll Medical Corporation Patient monitor screen aggregation
CN105190631A (en) * 2013-03-15 2015-12-23 卓尔医学产品公司 Patient monitor screen aggregation
WO2014144339A1 (en) * 2013-03-15 2014-09-18 Zoll Medical Corporation Patient monitor screen aggregation
US10321824B2 (en) 2013-03-15 2019-06-18 Zoll Medical Corporation Patient monitor screen aggregation
US10722119B2 (en) 2013-03-15 2020-07-28 Zoll Medical Corporation Patient monitor screen aggregation
US20150269317A1 (en) * 2014-03-18 2015-09-24 Cameron Marcum Methods and apparatus for generating and evaluating modified data structures
US11961597B1 (en) * 2014-05-31 2024-04-16 Allscripts Software, Llc User interface detail optimizer
US10991135B2 (en) * 2015-08-11 2021-04-27 Masimo Corporation Medical monitoring analysis and replay including indicia responsive to light attenuated by body tissue
US11967009B2 (en) 2015-08-11 2024-04-23 Masimo Corporation Medical monitoring analysis and replay including indicia responsive to light attenuated by body tissue
US11605188B2 (en) 2015-08-11 2023-03-14 Masimo Corporation Medical monitoring analysis and replay including indicia responsive to light attenuated by body tissue
USD839883S1 (en) * 2015-11-25 2019-02-05 General Electric Company Display screen or portion thereof with graphical user interface
USD831038S1 (en) * 2015-11-25 2018-10-16 General Electric Company Display screen or portion thereof with transitional graphical user interface
US20170357765A1 (en) * 2016-06-13 2017-12-14 Medical Informatics Corporation User interface for configurably displaying real-time data for multiple patients
IL263590B1 (en) * 2016-06-13 2023-07-01 Medical Informatics Corp User interface for configurably displaying real-time data for multiple patients
CN109661648A (en) * 2016-06-13 2019-04-19 医疗信息公司 For showing the user interface of the real time data of multiple patients configurablely
US11763921B2 (en) * 2017-06-16 2023-09-19 Koninklijke Philips N.V. Annotating fetal monitoring data
US20200121188A1 (en) * 2017-06-16 2020-04-23 Koninklijke Philips N.V. Annotating fetal monitoring data
US11086499B2 (en) * 2018-03-23 2021-08-10 Nihon Kohden Corporation Portable information terminal, biological information management method, biological information management program and computer-readable storage medium
US11553885B2 (en) * 2019-06-20 2023-01-17 Nihon Kohden Corporation Patient monitor with user input to rearrange patient display areas
USD938961S1 (en) * 2019-08-14 2021-12-21 GE Precision Healthcare LLC Display screen with graphical user interface

Also Published As

Publication number Publication date
USD846579S1 (en) 2019-04-23
USD933689S1 (en) 2021-10-19
US9292656B2 (en) 2016-03-22
USD792431S1 (en) 2017-07-18
US20120075116A1 (en) 2012-03-29

Similar Documents

Publication Publication Date Title
US9292656B2 (en) Systems and methods for improved perinatal workflow
US11031129B2 (en) Systems, methods, user interfaces and analysis tools for supporting user-definable rules and smart rules and smart alerts notification engine
US20200357493A1 (en) Monitoring, capturing, measuring and annotating physiological waveform data
Leape Institute of Medicine medical error figures are not exaggerated
US8132104B2 (en) Multi-modal entry for electronic clinical documentation
US9865025B2 (en) Electronic health record system and method for patient encounter transcription and documentation
US11424025B2 (en) Systems and methods for medical device monitoring
US11830590B2 (en) Maintaining context of clinically relevant information when displayed
US11666288B2 (en) Systems and methods for graphical user interfaces for medical device trends
US8694337B2 (en) Display of patient-specific data
US8355924B2 (en) Patient activity coordinator
US8560335B2 (en) Viewing clinical activity details within a selected time period
US20210065889A1 (en) Systems and methods for graphical user interfaces for a supervisory application
EP2839429A1 (en) Systems and methods for displaying patient data
US9974506B2 (en) Associating coronary angiography image annotations with syntax scores for assessment of coronary artery disease
US20180292978A1 (en) Apparatus and method for presentation of medical data
US20210064224A1 (en) Systems and methods for graphical user interfaces for medical device trends
US8050946B2 (en) Clinical activity navigator
JP2007233850A (en) Medical treatment evaluation support device, medical treatment evaluation support system and medical treatment evaluation support program
US20120131436A1 (en) Automated report generation with links
US8589185B2 (en) Acknowledgement of previous results for medication administration
US20160055321A1 (en) Systems and methods for tooth charting
Chetta et al. Augmenting EHR interfaces for enhanced nurse communication and decision making
Eisenberg et al. The electronic health record as a healthcare management strategy and implications for obstetrics and gynecologic practice
US20210201240A1 (en) Dynamic Dash Flow For Tracking Key Performance Indicators Of Tasks Associated With Particular Medical Order Types

Legal Events

Date Code Title Description
AS Assignment

Owner name: GENERAL ELECTRIC COMPANY, NEW YORK

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GRASSLE, TAMARA;KESSLER, DIANNE;LEVECKE, CHARLES;AND OTHERS;SIGNING DATES FROM 20101214 TO 20101216;REEL/FRAME:025903/0378

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION