US20050216561A1 - System and method for a computer based cooperative work system - Google Patents

System and method for a computer based cooperative work system

Info

Publication number
US20050216561A1
Authority
US
United States
Prior art keywords
exemplar
events
agent
coefficients
cscw
Prior art date
Legal status
Abandoned
Application number
US11/135,276
Inventor
Stephen Boies
Yiming Ye
Current Assignee
International Business Machines Corp
Original Assignee
International Business Machines Corp
Priority date
Filing date
Publication date
Application filed by International Business Machines Corp filed Critical International Business Machines Corp
Priority to US11/135,276 priority Critical patent/US20050216561A1/en
Publication of US20050216561A1 publication Critical patent/US20050216561A1/en
Abandoned legal-status Critical Current

Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04L — TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 — Network arrangements or protocols for supporting network services or applications
    • H04L 67/01 — Protocols
    • H04L 67/12 — Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • H04L 67/125 — Protocols specially adapted for proprietary or special-purpose networking environments involving control of end-device applications over a network
    • H04L 67/50 — Network services
    • H04L 67/535 — Tracking the activity of the user

Definitions

  • FIG. 14 is an illustration for eigen pyramid construction used in the explanation for function block 1601 in FIG. 16 .
  • FIG. 15 is an illustration for an eigen-pyramid's data matrix change.
  • FIG. 16 is a flow chart illustrating the eigen pyramid model construction process.
  • Function block 1601 divides the device readings into k groups as illustrated in FIG. 14 .
  • $n_{acceptable}$ is the number of devices that can be handled by the above eigen space approach.
  • $N_{total} = M \times (N_{readings} + 1)$ is the total number of readings to be considered.
  • Our strategy is to first divide these $N_{total}$ readings uniformly into different groups such that each group can be handled by the above eigen space method.
  • $N_{total} = k(n_{acceptable} - 1) + u$, where $0 \le u \le n_{acceptable} - 1$.
  • Function block 1605 obtains the coefficients for each group. This is illustrated in mappings 1515, 1557 and 1575 of FIG. 15.
  • For each group we run the training data and detect its principal directions. We then collect the coefficient vector with respect to the principal directions for each training exemplar. The length of the coefficient vector is $N_{exemplar} \times N_{events}$.
  • Function block 1615 forms the next layer of the pyramid. This is illustrated in 1557 and 1565 of FIG. 15. Since the coefficients capture the differences in the training data, we take the coefficients of each exemplar as the input to the second level of the pyramid. By concatenating the coefficients of each group, we obtain a new “exemplar” column vector which has length $k \times N_{exemplar} \times N_{events}$ and which should be much smaller than the original length of the “exemplar” column vector. For every old exemplar, we get a new exemplar. Each new exemplar will be a column in the new matrix. After we put all the new exemplar together, we get a new data matrix that acts as the second layer of the pyramid.
  • In decision block 1617, a determination is made as to whether to continue the data abstraction process. If so, the process loops back to function block 1601; otherwise, the process ends.
  • The length of each training vector (exemplar) is $N_{total}$.
  • They are divided into k groups; each group generates $N_{exemplar} \times N_{events}$ coefficients.
  • FIG. 17 is a flow chart of an extraction process for extracting the signature for each divided group of the real measured data during the event perception process.
  • Function block 1701 collects data from the devices during the event perception process. Then, following the division used when the pyramid was built, function block 1705 divides the initial data into groups in the same way as the pyramid model generation process did when the corresponding layer was built.
  • Function block 1707 extracts coefficients for the data of each group with respect to the principal directions of the first layer of the pyramid formed during the training phase and, in function block 1715, the coefficients are concatenated to form the next layer.
  • a determination is made in decision block 1755 as to whether there are more layers in the pyramid. If so, the process loops back to function block 1705 ; otherwise, the event perception is output at output block 1775 .
  • FIG. 18 is a flow chart of a start negotiation process.
  • The user sends his negotiation request via the multi-modal interface to his agent.
  • The agent analyzes the request and determines, based on the input, which agents are to be contacted.
  • Decision block 1815 checks whether the total number of agents to be contacted is more than one. If so, the process goes to function block 1855 to start the multi-agent negotiation process; otherwise, the process goes to function block 1875 to start the single agent negotiation process.
  • FIG. 19 is a flow chart of a start agent process run during the single agent negotiation process.
  • Function block 1901 identifies the other agents to be contacted.
  • Function block 1905 identifies the topics to be negotiated, such as making a phone call or making an appointment.
  • Function block 1907 checks the knowledge data base 207 ( FIG. 2 ) so as to identify parameters of the topic. For example, for scheduling a time, there should be a start time, an end time, and the attendees of the meeting, etc.
  • Function block 1911 examines the knowledge data base to identify the set of acceptable choices. In decision block 1915 , a check is made to determine whether there is at least one choice left. If so, function block 1917 asks the negotiation module 205 ( FIG. 2 ) to construct a negotiation message and send the message.
  • In decision block 1951, a determination is made as to whether the agent has received a message from the responding agent indicating whether the request is approved. If not, the process loops back to decision block 1915; if so, the process exits. Returning now to decision block 1915, if there are no choices left, a conflict dialogue is started with the agent's own user in function block 1955 to determine a new request.
  • FIG. 20 is a flow chart of a responding agent process run during the negotiation process.
  • The responding agent receives the negotiation request from the starting agent in function block 2001.
  • The responding agent identifies the starting agent, the topic, and the parameters.
  • The responding agent checks the knowledge base to see whether there are any conflicts. If there are no conflicts, as determined in decision block 2015, the responding agent simply sends an “approved” message in function block 2017. Otherwise, the preference level is checked in function block 2019 to see whether there is any possibility of updating. For example, although at a certain time the user has a meeting with a colleague, since the starting agent works for the CEO of the company, the original appointment should be replaced by the new appointment. If, after checking the knowledge data base in decision block 2055, it is determined that the appointment can be updated, it is simply updated in function block 2077. Otherwise, a “not approved” message is sent to the starting agent in function block 2057.
  • FIG. 21 is a flow chart of a dialogue process for the start agent to handle conflict with its user.
  • Function block 2101 displays information to the user via a multi-modal device such as a screen or a voice channel; i.e., “request is not approved”. After receiving the message, it is the user who determines whether to negotiate directly with the other user in function block 2105. If the user decides to negotiate directly with the other user, as determined in decision block 2115, the agent system is bypassed in function block 2155. If not, in function block 2175, the user can propose an alternative negotiation request and send the request to the agent via the multi-modal interface.
  • FIG. 22 is a flow chart of the start agent during the group agent negotiation process.
  • In function block 2201, the group of agents to be contacted is identified.
  • Function block 2205 identifies the topic to be negotiated (e.g., phone, schedule time, etc.).
  • a check of the knowledge data base is made in function block 2207 to identify parameters of the topic.
  • Function block 2211 examines the knowledge data base to identify the set of acceptable choices.
  • A determination is made in decision block 2215 as to whether there are any choices left and, if so, in function block 2251, the agent constructs a negotiation message (which can be in XML form) and sends it to all of the responding agents.
  • The starting agent then receives all the messages from all the responding agents.
  • Decision block 2271 checks whether the request is approved by every responding agent. If so, the process goes to function block 2277 to inform all the other agents about the final negotiation results; if not, the process loops back to decision block 2215. Returning to decision block 2215, if there are no more choices left, a conflict dialogue is started with the agent's own user in function block 2255 to determine the new request, and the process loops back to function block 2201.
  • FIG. 23 is a flow chart of a privacy guarding process for the start agent during the group awareness process. The idea is to keep the user's privacy from being invaded; in other words, “I only let you know what I want you to know”.
  • The user sends an inquiry to his agent about the status of the other users, such as whether they are making a phone call, whether they are working right now, etc.
  • The agent identifies the number of users and the corresponding status parameters.
  • In decision block 2315, the agent selects one agent from the agent pool to be queried and sends the request.
  • When all the agents have been queried, the process goes to function block 2357; otherwise, in function block 2355, the starting agent receives the response from the queried agent, and the process loops back to decision block 2315.
  • The response from a queried agent can specify what information may be displayed and over which channel.
  • In function block 2357, a determination is made as to the display strategies for the different users and the corresponding channels; for example, how to display the corresponding message, what content to display, etc.
  • The starting agent keeps receiving messages from the other agents about the status of their users, and it keeps displaying the status of the other users.
  • FIG. 24 is a flow chart of the procedure for the responding agent during the privacy guarding process.
  • Function block 2401 receives the message from the starting agent on the status query.
  • Function block 2405 checks its knowledge data base and the value of the agent-event matrix. This matrix encodes the privacy concerns of the agent about its user.
  • Function block 2415 generates the list of events to be transmitted and the way of transmitting them.
  • Function block 2455 keeps performing the event perception task and sends messages to the start agent about the status of its user.
  • The agent-event matrix is used to determine what to send.
  • FIG. 25 graphically illustrates the agent-event matrix for a given agent, used to guard the privacy of the agent's user; a minimal sketch of filtering with such a matrix appears at the end of this list.
  • The rows represent different events.
  • The columns represent agents.
  • Element $ae_{ij}$ encodes whether event i may be sent to agent j.
  • FIG. 26 is a flow chart illustrating the awareness process for an agent's own user. The goal is to inform the agent's own user about who is monitoring him.
  • Function block 2601 accesses the knowledge data base to check which agents have requested information about its user.
  • Function block 2605 identifies the information the other agents requested, and which information was given to them.
  • Function block 2615 displays the corresponding information to its user, when asked, using a proper user interface.
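
As a minimal sketch of the privacy guarding described for FIGS. 24-26, the Python snippet below looks up, in a small agent-event matrix, which events one user's agent may report to a requesting agent. The event names, agent names, and the 0/1 encoding of $ae_{ij}$ are illustrative assumptions; the patent does not fix an encoding.

    import numpy as np

    EVENTS = ["typing", "on the phone", "in a meeting", "away"]   # rows
    AGENTS = ["agent-A", "agent-B"]                               # columns
    # ae[i, j] == 1 means event i may be reported to agent j.
    ae = np.array([[1, 0],
                   [1, 1],
                   [0, 1],
                   [1, 1]])

    def reportable_events(requesting_agent):
        """Events this user's agent is willing to reveal to the requester."""
        j = AGENTS.index(requesting_agent)
        return [event for i, event in enumerate(EVENTS) if ae[i, j]]

    print(reportable_events("agent-A"))   # ['typing', 'on the phone', 'away']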

Abstract

An agent mediated Computer Supported Cooperative Work (CSCW) system creates a sense of group work and at the same time keeps the privacy and maintains the security of each user. A multi-agent negotiation process is used in the system to reduce frictions among group members during geographically distributed team work. A markup language, such as XML (eXtensible Markup Language), is used in the system to encode communication messages. Event perception is an important task in an agent mediated CSCW system; the present system uses an eigen space to perform the event perception task. For the case where the number of devices is large, an eigen pyramid is constructed which can be used to discriminate different events.

Description

    BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention generally relates to computer supported collaborative work and, more particularly, to intelligently collaborating with computers in a network with one or more agents.
  • 2. Background Description
  • With ubiquitous connectivity on the horizon, collaborative computing promises to become one of this new century's core applications. People will be more and more involved in Computer Supported Cooperative Work (CSCW) because of the pressure from companies to improve their product-development and decision-making processes and because of the convenience brought by the information super-highway.
  • There are four modes conceptualized by CSCW researchers on how people work: synchronous mode, distributed synchronous mode, asynchronous mode, and distributed asynchronous mode. Synchronous mode refers to the situation where activities occur at the same time and in the same place. Distributed synchronous mode refers to the situation where activities occur at the same time but at different places. Asynchronous mode refers to the situation where activities occur at different times in the same place. Distributed asynchronous mode refers to the situation where activities occur at different times and places.
  • Many computer systems support simultaneous interaction by more than one user. However, most of them support multiuser interaction in a way that prohibits cooperation; that is, they give each user the illusion that the user is the only one using the system. To support and encourage cooperation, cooperative applications must allow users to be aware of the activities of others. The purpose of a cooperative multiuser interface is to establish and maintain a common context, allowing the activities or events associated with one user to be reflected on other users' screens. For example, Lotus® Sametime (http://www.lotus.com/sametime) is a family of real-time collaboration products which provides instant awareness, communication, and document sharing capabilities, bringing the flexibility and efficiency of real-time communication to the business world.
  • With awareness of coworkers, partners, or customers online, users can communicate in a variety of ways. However, directly reflecting all activities on other users' screens is not practical. The first reason is that it wastes communication bandwidth, especially when users are far apart and the amount of data to be transmitted, such as video data, is huge. The second reason is that many users may not like having their activities broadcast to all the other members of the team. The third reason is that each user is concentrating on his or her own work and does not have the energy or motivation to monitor every movement of other users.
  • Thus, it is critical for a CSCW interface to analyze the activities of a given user, detect that important events have occurred, and reflect only the necessary events to other agents.
  • Event perception will be even more important to CSCW in the pervasive computing world, where the dominance of the traditional PC as the primary computing resource is replaced by a large collection of devices with embedded computing. These intelligent, interconnected devices will be seamlessly embedded within our offices, constantly sensing and reacting to the environment. The information provided by these pervasive devices within an office environment will be very important in CSCW applications.
  • Autonomous agents are expected to be of great value to a CSCW system and a certain amount of future research on CSCW will be centered on multi-agent aspect of groupware. A multi-agent approach to CSCW can capture the dynamics of a team work and even re-shape its form and characteristics. The automation brought by CSCW agents will dramatically reduce certain types of frictional costs during team work. Furthermore, the intelligence of a multi-agent CSCW system will be able to keep the privacy of its user and the security of each user's local work.
  • PRIOR ART
  • Collaborative computing systems, sensing devices of various kinds that provide input to computer systems, and knowledge base or expert systems are generally known in the prior art. Some examples include the following:
  • U.S. Pat. No. 5,996,002 to Katsurabayashi et al. discloses a collaborative work support system that is performed on plural computers, each of which is assigned to an operator, and supports collaborative work in which the plural computers display common data and each operator operates the displayed common data through his or her own computer.
  • U.S. Pat. No. 5,948,057 to Berger et al. discloses a method for computer-supported matching of a number of data copies of a stored data file, stored in at least one computer, in the reintegration of a number of data copies that were changed during decoupled work phases by users of a shared work environment and thus exhibit inconsistencies. The reintegration is conducted so that the number of matchings is reduced on the basis of protocol data files.
  • U.S. Pat. No. 5,781,732 to Adams discloses a shared document framework for use by an application program that provides collaborative access to a shared document by means of a caucus service associated with the shared document. The caucus service receives messages from caucus members and broadcast transmits them to all caucus members in global order.
  • U.S. Pat. No. 5,708,853 to Sanemitsu discloses an integrated circuit (IC) card having a camera, a microphone and a modem for transmitting electrical signals from the camera and microphone to a telephone or communication line, or transmits signals received from the communication line to a terminal, such as a personal computer (PC).
  • U.S. Pat. No. 5,396,265 to Ulrich et al. discloses a tactile computer input device which simulates an object being designed. The input device is used with a computer aided design (CAD) system and allows a user to manually manipulate the input device as if it were the object under design.
  • U.S. Pat. No. 5,068,645 to Drumm discloses a device for controlling a cursor on a data terminal display screen. The device is in the form of a headset and includes an orientation sensor which provides an electrical signal related to the orientation of the device without it being adjacent to any fixed surface.
  • U.S. Pat. No. 5,418,889 to Ito discloses a knowledge base generating system that includes a knowledge base having a first knowledge base containing sets of causal relation knowledge describing cause and effect relations of events taking place within a target machine, and having a second knowledge base containing sets of membership knowledge describing a structure of members of the target machine, each event of the cause and effect relations having data to identify one of the members, so that the first and second knowledge bases have mutually retrievable data.
  • U.S. Pat. No. 5,353,384 to Yoshida discloses an expert system which includes a first knowledge base for storing detailed knowledge, a second knowledge base for storing compiled knowledge, an inference engine for solving a problem using the second knowledge base, and an analysis engine for extracting knowledge having high utilization from the first knowledge base and storing the extracted knowledge in the second knowledge base.
  • U.S. Pat. No. 5,295,067 to Cho et al. discloses a system for order planning that translates an order configuration into a bill of materials list. The system operates based on a first logical specification of relationships between models, optional features or device codes and required material components.
  • SUMMARY OF THE INVENTION
  • It is therefore an object of the present invention to provide a new event perception algorithm that can perceive events in the user's workspace.
  • It is another object of this invention to provide an architecture of agent mediated CSCW.
  • A further object of this invention is to provide an improved apparatus, system, and method for computer collaboration over a network.
  • According to the invention, there is provided an agent mediated CSCW system that can create a sense of group work and at the same time keep the privacy and maintain the security of each user. A multi-agent negotiation process is used in the system to reduce frictions among group members during geographically distributed team work. A markup language, such as XML (eXtensible Markup Language), is used in the system to encode communication messages. (See, for example, http://www.ibm.com/developer/xml.) Event perception is an important task in an agent mediated CSCW system; the present system uses an eigen space to perform the event perception task. For the case where the number of devices is large, an eigen pyramid is constructed which can be used to discriminate different events. For more information on the eigen space approach, see Numerical Recipes in C by William H. Press, Saul Teukolsky, William T. Vetterling, and Brian P. Flannery.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The foregoing and other objects, aspects and advantages will be better understood from the following detailed description of a preferred embodiment of the invention with reference to the drawings, in which:
  • FIG. 1 is a block diagram of a preferred system architecture of the invention;
  • FIG. 2 is a block diagram of a preferred single agent component of the invention;
  • FIG. 3 is a diagram illustrating the situation for the user within his or her working environment;
  • FIG. 4 is a flow chart on the data collection process according to the invention;
  • FIG. 5 is a flow chart of a data collection process for model building of a data matrix A;
  • FIG. 6 is a graphical illustration of the reading data at time 0;
  • FIG. 7 is a graphical illustration of the reading data for the ith event's jth exemplar;
  • FIG. 8 is a graphical illustration showing in more detail reading data for the ith event's jth exemplar;
  • FIG. 9 is a graphical illustration of the readings for all the exemplars of event i;
  • FIG. 10 is a graphical illustration of the data matrix;
  • FIG. 11 is a graphical illustration showing in more detail the data matrix;
  • FIG. 12 is a flow chart of a co-efficient generating process of the invention;
  • FIG. 13 is a flow chart of an event perception process without data abstraction;
  • FIG. 14 is a graphical illustration for eigen pyramid construction according to the invention;
  • FIG. 15 is a graphical illustration for eigen-pyramid's data matrix change;
  • FIG. 16 is a flow chart of the eigen pyramid model construction process;
  • FIG. 17 is a flow chart of an extraction process for extracting the signature of the real measured data during the event perception process;
  • FIG. 18 is a flow chart of a start negotiation process;
  • FIG. 19 is a flow chart of a start agent process run during the single agent negotiation process;
  • FIG. 20 is a flow chart of a respond agent process run during the negotiation process;
  • FIG. 21 is a flow chart of a dialogue process for the start agent to handle conflict with its user;
  • FIG. 22 is a flow chart of the start agent during the group agent negotiation process;
  • FIG. 23 is a flow chart of a privacy guarding process for privacy guarding for the start agent during the group awareness process;
  • FIG. 24 is a flow chart showing the procedure for the respond agent during the privacy guarding process;
  • FIG. 25 is a graphical illustration of the Agent-Event matrix for a given agent used to guard the privacy of the agent's user; and
  • FIG. 26 is a flow chart showing the awareness process for an agent's own user.
  • DETAILED DESCRIPTION OF A PREFERRED EMBODIMENT OF THE INVENTION
  • In this disclosure, we will present the system architecture of an agent mediated CSCW system and study in detail its event perception issues. Referring now to the drawings, and more particularly to FIG. 1, there is shown a block diagram of one preferred system architecture. Block 101 is the server, which handles the messages among agents. Block 105 is the network channel between the agent 115 and the server 101. 155 denotes the communication channels between the devices 165 and the agent 115; they can be in any form. For example, a device 165 may be a camera which, after it analyzes an image, can divide the results into several categories and send the message through a TCP/IP (Transmission Control Protocol/Internet Protocol) channel 155. As another example, a device 165 may be a keyboard which, after the user hits a key, generates a signal that can be grabbed and sent to the agent 115 via a TCP/IP channel 155. In general, the devices 165 are used by the system to perform event perception; a sketch of such a device-to-agent message follows this paragraph. They can be a camera, a keyboard, a sensitive touch screen, a weight sensor, a motion sensor, and many other devices that can sense the environment. The user 195 uses a multi-modal communication channel 175 for communicating with devices in the environment. Channel 175 is actually the sensing channel for the devices to sense the environment, the user, and various activities. The devices 165 keep sensing the environment and user 195 through multi-modal channels 175. The sensing results are categorized and sent to the agent 115. The agent 115 analyzes the data and keeps detecting what has happened. When other agents ask about the status of its user, after a negotiation process, the agent provides relevant information.
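
As a rough illustration of the device-to-agent channel 155, the following Python sketch sends one categorized reading to the agent over TCP/IP. The host, port, and JSON message format are illustrative assumptions; the patent does not specify a wire format.

    import json
    import socket

    AGENT_HOST = "localhost"   # hypothetical address of agent 115
    AGENT_PORT = 5115          # hypothetical port; not specified in the patent

    def send_reading(device_id, category):
        """Send one categorized sensor reading to the agent over TCP/IP."""
        message = json.dumps({"device": device_id, "category": category})
        with socket.create_connection((AGENT_HOST, AGENT_PORT)) as sock:
            sock.sendall(message.encode("utf-8") + b"\n")

    # Example: a camera (a device 165) reports that the current image
    # falls into category 3 of its image-analysis categories.
    # (Requires an agent listening on AGENT_HOST:AGENT_PORT.)
    send_reading("camera-165", 3)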
  • FIG. 2 is a preferred diagram of a single agent component. Negotiation module 205 is responsible for negotiations with another agent or agents; it parses the messages passed back and forth between itself and another agent. When it wants to pass information to other agents, it first sends the information to the server 101 (FIG. 1), and the server then passes the information to the other agent or agents. The communication between them can be through a TCP/IP channel. When another agent or agents send a message to the current agent, it also goes through the server: they first send the message to the server, and the server then transfers the message to the negotiation module 205 of the current agent. The negotiation module 205 also passes messages back and forth with the plan generation module 215. The plan generation module 215 generates plans for the agent to negotiate with other agents, or to transmit information through the multi-modal user interface module 257 so that it is finally received by the user. The plan generation module 215 consults the event perception module 255 and the knowledge data base 207 so as to generate plans. The knowledge data base 207 stores various data bases for the user, such as the user's day-to-day calendar, appointment schedule, and the like. The calendar can contain meeting schedules, teleconferencing schedules, telephone call schedules, and many others. It also has inference rules of one form or another used to generate plans. For example, if the user is going to have a meeting with John at 1:00 o'clock, then he should not have any other appointments with Mary or other people in his department at this time. But when the CEO of the company wants to meet him at this time, the knowledge data base should overwrite John's meeting. There are many ways to store data and relevant inference rules, as is well understood by those skilled in the art of expert systems; knowledge data base 207 can use any of them. Event perception module 255 perceives events and provides the results to plan generation module 215 when queried. Plan generation module 215 generates plans in various intelligent ways based on the content of event perception module 255 and knowledge data base 207.
  • FIG. 3 illustrates the situation for the user within his or her working environment. The user 195 is surrounded by various devices such as screen 301, keyboard 357, mouse 375, and other devices 305, 315 and 355. Some devices are only for output of information to the user, such as, say, devices 301 and 315. Some devices are only for input from the user or environment, such as, say, devices 355, 357 and 375. Other devices can be bi-directional. For example, the screen 301 could be a touch screen both displaying information to the user and receiving input from the user.
  • FIG. 4 is a flow chart on the data collection process. Function block 401 determines the data collection time intervals, such as one second intervals. This means that every second, the agent will collect the readings from all the devices. Function block 405 determines the total time for data collection, such as one minute. If, for example, the data collection time is one minute and the interval is one second, then the total number of readings collected is 60+1=61. The first reading is obtained at time instant 0. The second reading is obtained for the time period between time instant 0 and time instant 1. In general, the rth reading is obtained between the time instants r−1 and r.
  • Function block 415 determines the sensing categories for each available device. For example, a weight sensor within the environment can divide its readings into categories of 10 pounds each. Thus, if the total weight range that can be sensed is 100 pounds, then we can divide the sensed range into ten categories. If the average weight sensed between the time instants r−1 and r belongs to category c, we say that the rth reading belongs to category c. A camera can analyze images by dividing them into several categories based on the image analysis results. As for a keyboard, we can divide the categories like this. Suppose the keys on the keyboard are 0, 1, . . . , 9, a, . . . , z, A, . . . , Z. If within the time interval the user has done nothing, then the category is 0; if he typed “0”, then it is 1; if he typed “9”, then it is 10; if he typed “a”, then it is 11, etc. In general, suppose the keys on the keyboard are $k_1, \ldots, k_n$. Then, if the user typed key $k_i$ within the time period, the category is $i$. If the user touched two keys $k_i$ and $k_j$, then the category is $(i-1) \times n + j$. If the user touches r keys, $k_{i_1}, \ldots, k_{i_r}$, then the category is $(i_1-1)n^{r-1} + (i_2-1)n^{r-2} + \cdots + (i_{r-1}-1)n + i_r$, as computed in the sketch below.
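
The keyboard categorization just described is a base-n encoding of the touched key sequence. A minimal Python sketch (the function name and the category-0 convention for "no key" follow the text above) implements the formula via Horner's rule.

    def key_category(indices, n):
        """Category for touched keys k_{i_1}, ..., k_{i_r} on an n-key keyboard.

        indices holds the 1-based key positions i_1, ..., i_r touched during
        the sampling interval; an empty list (nothing typed) gives category 0.
        Implements (i_1-1)n^(r-1) + (i_2-1)n^(r-2) + ... + (i_{r-1}-1)n + i_r.
        """
        if not indices:
            return 0
        category = 0
        for i in indices[:-1]:
            category = (category + i - 1) * n
        return category + indices[-1]

    # One key k_3 on a 62-key keyboard gives category 3;
    # two keys k_2 then k_5 give (2-1)*62 + 5 = 67.
    assert key_category([3], 62) == 3
    assert key_category([2, 5], 62) == 67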
  • We can also group based on the commands the user types. For example, if the user typed “ls” in the Unix® operating system (OS) environment, we might categorize it as 1. In general, we have different ways of doing the categorization. Our goal here is to categorize the input in such a way that the categorization can be easily used by the eigen space method to perform the event perception task. Function block 455 is the module for collecting readings from all the devices at every time instant. At every time instant, data is collected for each device, until the time is used up. The details will be described in FIGS. 6 to 11.
  • FIG. 5 is a flow chart of a data collection process for model building of a data matrix A. In function block 501, the different events to be perceived are determined. Function block 505 collects exemplar within the training time, and then a determination is made in decision block 515 as to whether we need more data for training. If so, the process loops back to function block 505; otherwise, the process uses the collected exemplar to construct the sampling readings that form matrix A in function block 555. Here $r_1^{i,j}(0)$ is the reading of device 1 at time instant 0 for the ith event's jth exemplar. The column vector $[r^{i,j}(0)]$ contains the readings of all the devices at time instant 0.
  • To model events, we collect $N_{exemplar}$ exemplar for each event. Each exemplar collects readings from the devices within a time interval $[0,T]$. These readings are discretized into $N_{readings}+1$ readings at the time instants $0, \frac{1}{N_{readings}}T, \ldots, \frac{N_{readings}-1}{N_{readings}}T, T$. For device $h$ ($0 \le h \le M$), we denote its kth ($0 \le k \le N_{readings}$) reading for the jth ($1 \le j \le N_{exemplar}$) exemplar from the ith ($1 \le i \le N$) event as $r_h^{i,j}(k)$.
  • Let $[r^{i,j}(k)] = (r_1^{i,j}(k), \ldots, r_M^{i,j}(k))^T$ be the column vector of the kth readings of all the devices for the jth exemplar from the ith event. Let $[r^{i,j}]$ represent the column vector obtained by simply concatenating the $[r^{i,j}(k)]$ column vectors for all the k readings: $[r^{i,j}] = (r_1^{i,j}(0), \ldots, r_M^{i,j}(0), r_1^{i,j}(1), \ldots, r_M^{i,j}(1), \ldots, r_1^{i,j}(N_{readings}), \ldots, r_M^{i,j}(N_{readings}))^T$. Here $[r^{i,j}]$ gives the readings for the jth exemplar of the ith event. These are the readings of all the devices with respect to an exemplar in the model training phase. The length of the vector $[r^{i,j}]$ is $M \times (N_{readings}+1)$.
  • FIG. 6 is a graphical explanation of the reading of data at time 0 for the jth exemplar of the ith event. FIG. 7 illustrates the reading data for the ith event's jth exemplar. FIG. 8 illustrates in more detail the reading data for the ith event's jth exemplar.
  • The sampling readings matrix A for all the events and their associated exemplar can be created from the set of $[r^{i,j}]$ over all j and i: $A = ([r^{1,1}], \ldots, [r^{1,N_{exemplar}}], \ldots, [r^{N,1}], \ldots, [r^{N,N_{exemplar}}])$. The dimension of matrix A is $(N \times N_{exemplar})$ columns by $(M \times (N_{readings}+1))$ rows, where N is the number of events and M is the number of devices. The $N \times N_{exemplar}$ columns of matrix A give the readings for all the exemplar of all the events; this is the total number of training sets of data. The $M \times (N_{readings}+1)$ elements of each column give the readings from all the devices at all the discrete time instants for an exemplar. Thus, each column of matrix A refers to a given training set, and the elements of the column are the readings for this set (a small NumPy sketch of assembling A follows). Usually
    $M \times (N_{readings}+1) \gg N \times N_{exemplar}$.
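
The following NumPy sketch assembles a data matrix A with exactly these dimensions. The sizes and the random readings are placeholders standing in for real categorized device readings.

    import numpy as np

    M = 5            # number of devices (illustrative)
    N = 4            # number of events (illustrative)
    N_exemplar = 10  # exemplars collected per event (illustrative)
    N_readings = 60  # readings taken at N_readings + 1 time instants

    # readings[i][j] stands in for the column vector [r^{i,j}]: all M
    # devices' categorized readings at all N_readings + 1 instants,
    # concatenated into one vector.
    rng = np.random.default_rng(0)
    readings = [[rng.integers(0, 11, size=M * (N_readings + 1)).astype(float)
                 for _ in range(N_exemplar)] for _ in range(N)]

    # Each exemplar of each event becomes one column of A, so A has
    # M*(N_readings+1) rows and N*N_exemplar columns.
    A = np.column_stack([readings[i][j]
                         for i in range(N)
                         for j in range(N_exemplar)])
    assert A.shape == (M * (N_readings + 1), N * N_exemplar)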
  • FIG. 9 illustrates the readings for all the exemplar of event i. FIG. 10 illustrates the data matrix A. FIG. 11 illustrates a more detail the data matrix A.
  • Matrix A can be decomposed using singular value decomposition (SVD) as
    $A = U W V^T$,
    where $U = (U_1, \ldots, U_{N \times N_{exemplar}})$ is an orthogonal matrix of the same size as matrix A representing the principal component directions $U_i$ ($1 \le i \le N \times N_{exemplar}$) in the training set. These are the best directions for clearly distinguishing the training data. $W$ is a diagonal matrix with singular values $\lambda_1, \ldots, \lambda_{N \times N_{exemplar}}$, sorted in decreasing order along the diagonal. The virtue of these values is that they rank the dimensions of the space in terms of variation along the principal component directions, and this ranking is very often related to their importance. $V^T$ is a $(N \times N_{exemplar}) \times (N \times N_{exemplar})$ matrix that encodes the coefficients to be used in expanding each column of matrix A in terms of the principal component directions. In NumPy terms, the decomposition can be computed as follows.
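
Continuing the sketch above, the decomposition itself is one NumPy call; with the thin SVD, U keeps only the N×N_exemplar principal component directions and the singular values come back already sorted in decreasing order.

    # Thin SVD of the data matrix: A = U @ np.diag(w) @ Vt.
    U, w, Vt = np.linalg.svd(A, full_matrices=False)
    # U:  M*(N_readings+1) x N*N_exemplar -- principal component directions
    # w:  N*N_exemplar singular values, lambda_1 >= ... >= lambda_{N*N_exemplar}
    # Vt: N*N_exemplar x N*N_exemplar    -- expansion coefficients of A's columns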
  • FIG. 12 is a flow chart of a coefficient generating process. In function block 1201, exemplar are collected to form the data matrix A. In function block 1205, the eigen vectors are generated based on the exemplar matrix. Function block 1215 generates the coefficients for the corresponding data matrix. More particularly, the readings from the jth exemplar of the ith event, $[r^{i,j}]$, can be approximated according to the q singular values $\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_q$ as
    $[r^{i,j}] = \sum_{\ell=1}^{q} c_\ell^{i,j} U_\ell$,
    where $q = N \times N_{exemplar}$ and the $c_\ell^{i,j}$ are scalar values that can be calculated by taking the dot product of $[r^{i,j}]$ and $U_\ell$: $c_\ell^{i,j} = [r^{i,j}]^T U_\ell$. This is the process of projecting the reading vector $[r^{i,j}]$ onto the subspace spanned by the q basis vectors $U_1, \ldots, U_q$ with parameters $c_1^{i,j}, \ldots, c_q^{i,j}$. Thus, for a given i and j, we obtain a vector $C^{i,j} = (c_1^{i,j}, \ldots, c_q^{i,j})^T$ that gives the coefficients of the corresponding readings. For all the possible i and j, we get a coefficient matrix $C = (C^{1,1}, \ldots, C^{1,N_{exemplar}}, \ldots, C^{N,1}, \ldots, C^{N,N_{exemplar}})$.
  • Now, we transform matrix C into a matrix that represents the average coefficients for each event. For any event i, matrix C contains the coefficient vectors of all its exemplar: $(C_{i,1},\ldots,C_{i,N_{exemplar}})$. The average coefficient vector for these exemplar vectors is
    $\vec{C}_i=\frac{1}{N_{exemplar}}\sum_{j=1}^{N_{exemplar}} C_{i,j}$.
    The average coefficient matrix becomes $\bar{C}=(\vec{C}_1,\ldots,\vec{C}_N)$. Each column i of $\bar{C}$ is the average coefficient vector $\vec{C}_i=(\bar{c}^i_1,\ldots,\bar{c}^i_q)$ of event i. $\bar{C}$ is the model of events learned from the training phase and will be used in event perception.
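To make the training computation concrete, here is a brief NumPy sketch of projecting each exemplar column onto the top q principal directions and averaging per event. It is a sketch under the column layout assumed above; names such as n_events and n_exemplar are illustrative.

```python
import numpy as np

def build_event_models(A, U, q, n_events, n_exemplar):
    """Columns of A are ordered event-by-event as defined above:
    [r_{1,1}], ..., [r_{1,Ne}], ..., [r_{N,1}], ..., [r_{N,Ne}]."""
    C = U[:, :q].T @ A                      # q x (n_events * n_exemplar) coefficients
    C = C.reshape(q, n_events, n_exemplar)  # group coefficient columns by event
    return C.mean(axis=2)                   # q x n_events: one model vector per event
```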
  • FIG. 13 is a flow chart of an event perception process without data abstraction. In function block 1301, device readings are collected, and in function block 1305, coefficients are generated. These processes are quite similar to those of function blocks 1201 and 1205, respectively, in FIG. 12. Function block 1315 performs the event perception task. More particularly, the perception of events involves matching readings from all the devices in a real situation against the learned models of all the events. For the same event, readings in a real application may differ from those of its exemplar for various reasons, such as noise. However, they may share some commonalities or signatures. The eigen space approach to event perception assumes that these commonalities or signatures of a given event are captured by the coefficients of the readings along the principal component directions.
  • Suppose $R(t)=(R_1(t),\ldots,R_M(t))^T$ gives the readings from the M devices within the time period [0,T]. We discretize $[R(t)]$ into $N_{readings}+1$ readings at the time instants $0,\frac{1}{N_{readings}},\ldots,\frac{N_{readings}-1}{N_{readings}},1$. Let $[R(k)]$ denote the kth readings of all the devices at the kth time instant. By concatenating the readings from all the time instants, we obtain the column vector $[R]$, which gives the readings of all the devices at all the time instants.
  • By projecting this vector onto the principal component directions, we recover a vector of coefficients, $\vec{c}=(c_1,\ldots,c_q)$, that approximates the event to be perceived as a linear combination of the eigen event basis. Upon recovery of the real situation coefficient vector, the normalized distance $\Delta_i$ between $\vec{c}$ and the model coefficients $\vec{C}_i$ is used to perceive the observed event. Here
    $\Delta_i=\sum_{k=1}^{q}(c_k-\bar{c}^i_k)^2$.
    The event i with the smallest distance $\Delta_i$ is considered the best match for the observed event. The above is the process used when we need to distinguish which event has happened among several possible events.
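A corresponding perception step might look like the sketch below: project the live reading vector onto the same basis and pick the event whose average coefficient vector is nearest. Here C_bar is assumed to be the q-by-N model matrix from training, and the distance follows the squared form given above.

```python
import numpy as np

def perceive_event(r, U, C_bar, q):
    """r: concatenated live readings, laid out like a training exemplar."""
    c = U[:, :q].T @ r                                # coefficients of the observation
    deltas = ((C_bar - c[:, None]) ** 2).sum(axis=0)  # distance to each event model
    return int(np.argmin(deltas))                     # index of the best-matching event
```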
  • During the model formulation process, we can obtain the following. For any event i, matrix C contains the coefficient vectors of all its exemplar, $(C_{i,1},\ldots,C_{i,N_{exemplar}})$. The average coefficient vector for these exemplar vectors is
    $\vec{C}_i=\frac{1}{N_{exemplar}}\sum_{j=1}^{N_{exemplar}} C_{i,j}$.
    The average coefficient matrix becomes $\bar{C}=(\vec{C}_1,\ldots,\vec{C}_N)$. Each column i of $\bar{C}$ is the average coefficient vector $\vec{C}_i=(\bar{c}^i_1,\ldots,\bar{c}^i_q)$ of event i.
  • For event i and coefficient vector $C_{i,1}$, we can obtain an offset difference
    $\eta_{i,1}=\sum_{\ell=1}^{q}(\bar{c}^i_{\ell}-c^{\ell}_{i,1})^2$.
    Similarly, for $C_{i,2},\ldots,C_{i,N_{exemplar}}$, we obtain $\eta_{i,2},\ldots,\eta_{i,N_{exemplar}}$ as
    $\eta_{i,j}=\sum_{\ell=1}^{q}(\bar{c}^i_{\ell}-c^{\ell}_{i,j})^2$,
    where j ranges over $1,\ldots,N_{exemplar}$. Thus, we get $\eta_{i,1},\ldots,\eta_{i,N_{exemplar}}$. We reorder this list into increasing order, $\eta_{i,1}\le\ldots\le\eta_{i,N_{exemplar}}$. These are the errors of the training exemplar samples with respect to their average case. We define an acceptance threshold such as 0.95 (or another value based on experience).
  • During the event perception situation, we can collect data and calculate the coefficients $\vec{c}=(c_1,\ldots,c_q)$. We then calculate the difference
    $\eta=\sum_{\ell=1}^{q}(\bar{c}^i_{\ell}-c_{\ell})^2$.
    We then find the value of k such that
    $\eta_{i,k}\le\eta<\eta_{i,k+1}$.
    If
    $\frac{k}{N_{exemplar}}\le 0.95$,
    then we believe that event i has happened. Otherwise, we believe that event i has not happened. The above is the process used when we need to determine whether a given event has happened.
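For the single-event question, the same quantities support a quantile test like the following sketch. The 0.95 threshold is the example value from the text; etas_sorted is assumed to hold the training errors in increasing order, and the other names are assumptions.

```python
import numpy as np

def event_happened(c, c_bar_i, etas_sorted, threshold=0.95):
    """c: live coefficient vector; c_bar_i: model vector for event i."""
    eta = ((c_bar_i - c) ** 2).sum()                     # error against the model
    k = np.searchsorted(etas_sorted, eta, side="right")  # k with eta_{i,k} <= eta < eta_{i,k+1}
    return k / len(etas_sorted) <= threshold             # accept if within the quantile
```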
  • We can imagine that in the inter-connected world, a huge number of devices will be involved. Events such as what a person is doing can be perceived by considering only the devices within an office. However, events such as whether the people within a building are having a meeting must be perceived by considering all the devices within the building. In general, the bigger the scale of the events to be perceived, the more devices need to be considered. When the number of devices exceeds a certain threshold, the strategy above will not work because too much computational time is needed.
  • FIG. 14 is an illustration of eigen pyramid construction used in the explanation of function block 1601 in FIG. 16. FIG. 15 illustrates how the eigen pyramid's data matrix changes.
  • FIG. 16 is a flow chart illustrating the eigen pyramid model construction process. Function block 1601 divides the device readings into k groups as illustrated in FIG. 14. In order to perform event perception when the number of devices is huge, we propose a new strategy called the “pyramid eigen space”. Suppose nacceptable is the number of readings that can be handled by the above eigen space approach, and suppose Ntotal=M×(Nreadings+1) is the total number of readings to be considered. Our strategy is first to divide these Ntotal readings uniformly into groups such that each group can be handled by the above eigen space method. Suppose Ntotal=k(nacceptable−1)+u, where 0≤u<nacceptable−1. If u=0, then we can divide the readings into k groups where each group has nacceptable−1 members. Otherwise, we can divide the first k(nacceptable−1) readings into k groups where each group has nacceptable−1 members, and then distribute the remaining u readings into the first u groups. Thus, we have divided the readings into k groups, where each group has either nacceptable−1 or nacceptable members.
  • Function block 1605 obtains the coefficients for each group. This is illustrated by mappings 1515, 1557 and 1575 of FIG. 15. For each group, we run the training data and detect its principal directions. We then collect the coefficient vector with respect to the principal directions for each training exemplar. The length of the coefficient vector is Nexemplar×Nevents.
  • Function block 1615 forms the next layer of the pyramid. This is illustrated at 1557 and 1565 of FIG. 15. Since the coefficients capture the differences in the training data, we take the coefficients of each exemplar as the input to the second level of the pyramid. By concatenating the coefficients across the groups, we obtain a new “exemplar” column vector of length
    k×Nexemplar×Nevents,
    which should be much smaller than the original length of the “exemplar” column vector. For every old exemplar, we get a new exemplar, and each new exemplar becomes a column in the new matrix. After we put all the new exemplar together, we get a new data matrix that acts as the second layer of the pyramid.
  • In decision block 1617, a determination is made as to whether to continue the data abstraction process. If so, the process loops back to function block 1601; otherwise, the process ends.
  • At the first stage, the length of each training vector (exemplar) is Ntotal. During the process above, the readings are divided into k groups, and each group generates Nexemplar×Nevents coefficients. Thus, the total length of the second level of input is k×Nexemplar×Nevents, which is much less than Ntotal=k(nacceptable−1)+u.
  • If k×Nexemplar×Nevents>nacceptable, we take this new data as input and repeat the above data abstraction process to further reduce the amount of data.
  • If k×Nexemplar×Nevents is much less than nacceptable and a further eigen coefficient extraction is meaningless, then we take these k×Nexemplar×Nevents numbers as the final coefficients of the training exemplar.
  • If k×Nexemplar×Nevents is less than nacceptable and a further eigen coefficient extraction is meaningful, then we take another round of eigen coefficient extraction. These newly generated coefficients will be taken as the final coefficients of the exemplar of the training events.
  • After extracting the final coefficients of all the exemplar of all the events, the average of the final coefficients of all the exemplar with respect to a given event is taken as the model of the corresponding event. Just as in the single layer case, these models are generated and used to perform the event perception task.
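The pyramid construction can be summarized by a sketch like the one below: repeatedly split the long exemplar vectors into groups small enough for the plain eigen space method, extract per-group coefficients, and stack them into a shorter layer. The shapes and names (n_acceptable, q) are assumptions made for illustration.

```python
import numpy as np

def build_pyramid(data, n_acceptable, q):
    """data: (length, n_train) matrix, one column per training exemplar.
    Assumes q is much smaller than n_acceptable so each round shrinks the
    data. Returns the per-layer group bases (needed again at perception
    time) and the final, abstracted coefficient matrix."""
    bases = []
    while data.shape[0] > n_acceptable:
        n_groups = -(-data.shape[0] // n_acceptable)   # ceiling division
        groups = np.array_split(data, n_groups, axis=0)
        layer = [np.linalg.svd(g, full_matrices=False)[0][:, :q] for g in groups]
        # Concatenated per-group coefficients form the next, shorter layer.
        data = np.vstack([U.T @ g for U, g in zip(layer, groups)])
        bases.append(layer)
    return bases, data
```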
  • During the event perception phase, we first get readings from all the devices. FIG. 17 is a flow chart of the process for extracting the signature of each divided group of the real measured data during event perception. Function block 1701 collects data from the devices. Function block 1705 then divides the initial data into groups in the same way the data was divided when the corresponding layer of the pyramid was built. In function block 1707, we extract coefficients for the data of each group with respect to the principal directions of the corresponding layer of the pyramid formed during the training phase, and in function block 1715, the coefficients are concatenated to form the next layer. A determination is made in decision block 1755 as to whether there are more layers in the pyramid. If so, the process loops back to function block 1705; otherwise, the event perception is output at output block 1775.
  • FIG. 18 is a flow chart of the start negotiation process. In function block 1801, the user sends his negotiation request via the multi-modal interface to his agent. In function block 1805, the agent analyzes the request and determines, based on the input, which agents are to be contacted. Decision block 1815 checks whether the total number of agents to be contacted is more than one. If so, the process goes to function block 1855 to start the multi-agent negotiation process; otherwise, the process goes to function block 1875 to start the single agent negotiation process.
  • FIG. 19 is a flow chart of the start agent process run during the single agent negotiation process. Function block 1901 identifies the other agent to be contacted. Function block 1905 identifies the topic to be negotiated, such as making a phone call or making an appointment. Function block 1907 checks the knowledge data base 207 (FIG. 2) to identify the parameters of the topic. For example, for scheduling a time, there should be a start time, an end time, the attendees of the meeting, etc. Function block 1911 examines the knowledge data base to identify the set of acceptable choices. In decision block 1915, a check is made to determine whether there is at least one choice left. If so, function block 1917 asks the negotiation module 205 (FIG. 2) to construct a negotiation message and send the message. It can be in the form of XML (eXtensible Markup Language) or another protocol. Then, in decision block 1951, a determination is made as to whether the agent has received a message from the responding agent indicating whether the request is approved. If not, the process loops back to decision block 1915; if so, the process exits. Returning to decision block 1915, if there are no choices left, a conflict dialogue is started with the agent's own user in function block 1955 to determine a new request.
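The text leaves the message format open beyond noting XML as one option. Purely as an illustration, a start agent might build a scheduling request as follows; every element name here is hypothetical, not taken from the patent.

```python
import xml.etree.ElementTree as ET

def build_negotiation_message(topic, start_time, end_time, attendees):
    # Hypothetical XML layout for a negotiation request.
    msg = ET.Element("negotiation", topic=topic)
    ET.SubElement(msg, "start").text = start_time
    ET.SubElement(msg, "end").text = end_time
    for name in attendees:
        ET.SubElement(msg, "attendee").text = name
    return ET.tostring(msg, encoding="unicode")

# Example: build_negotiation_message("schedule", "09:00", "10:00", ["Alice", "Bob"])
```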
  • FIG. 20 is a flow chart of the responding agent process run during the negotiation process. The responding agent receives the negotiation request from the starting agent in function block 2001. In function block 2005, the responding agent identifies the starting agent, the topic, and the parameters. In function block 2011, the responding agent checks the knowledge base to see whether there are any conflicts. If there are no conflicts, as determined in decision block 2015, the responding agent simply sends an “approved” message in function block 2017. Otherwise, the preference level is checked in function block 2019 to see whether there is any possibility of updating. For example, although at a certain time the user has a meeting with a colleague, since the starting agent works for the CEO of the company, the original appointment should be replaced by the new appointment. If, after checking the knowledge data base in decision block 2055, it is determined that the appointment can be updated, it is simply updated in function block 2077. Otherwise, a “not approved” message is sent to the starting agent in function block 2057.
  • FIG. 21 is a flow chart of the dialogue process by which the start agent handles a conflict with its user. Function block 2101 displays information to the user via a multi-modal device such as a screen or a voice channel; i.e., “request is not approved”. After receiving the message, it is the user who determines whether to negotiate directly with the other user in function block 2105. If the user decides to negotiate directly with the other user, as determined in decision block 2115, the agent system is bypassed in function block 2155. If not, in function block 2175, the user can propose an alternative negotiation request and send it to the agent via the multi-modal interface.
  • FIG. 22 is a flow chart of the start agent during the group agent negotiation process. In function block 2201, the group of agents to be contacted is identified. According to the input of the user, function block 2205 identifies the topic to be negotiated (e.g., phone, schedule time, etc.). A check of the knowledge data base is made in function block 2207 to identify the parameters of the topic. Function block 2211 examines the knowledge data base to identify the set of acceptable choices. A determination is made in decision block 2215 as to whether there are any choices left and, if so, in function block 2251, the agent constructs a negotiation message (which can be in XML form) and sends the message to all of the responding agents. In function block 2255, the starting agent receives all the messages from all the responding agents. Decision block 2271 checks whether the request is approved by every responding agent. If so, the process goes to function block 2277 to inform all the other agents of the final negotiation results; if not, the process loops back to decision block 2215. Returning to decision block 2215, if there are no more choices left, then a conflict dialogue is started with the agent's own user in function block 2255 to determine a new request, and the process loops back to function block 2201.
  • FIG. 23 is a flow chart of the privacy guarding process for the start agent during the group awareness process. The idea is to guard the privacy of the agent's user so that it cannot be invaded; in other words, “I only let you know what I want you to know”. In function block 2301, the user sends an inquiry to his agent about the status of the other users, such as whether they are making a phone call, whether they are working right now, etc. In function block 2305, the agent identifies the number of users and the corresponding status parameters. In decision block 2315, the agent selects one agent from the agent pool to be queried and sends the request. If every agent has been queried, the process goes to function block 2357; otherwise, in function block 2355, the starting agent receives the response from the queried agent, and the process loops back to decision block 2315. The response from the queried agent can specify what information may be displayed and over which channel. In function block 2357, a determination is made as to the display strategies for the different users and the corresponding channels; for example, how to display the corresponding message, what content to display, etc. In function block 2375, the starting agent keeps receiving messages from the other agents about the status of their users, and it keeps displaying the status of the other users.
  • FIG. 24 is a flow chart of the procedure for the responding agent during the privacy guarding process. Function block 2401 receives the message from the starting agent containing the status query. Function block 2405 checks the agent's knowledge data base and the values of the agent-event matrix. This matrix encodes the privacy concerns of the agent about its user. Function block 2415 generates the list of events to be transmitted and the way of transmitting them. Function block 2455 keeps performing the event perception task and sends messages to the starting agent about the status of its user. The agent-event matrix is used to determine what to send.
  • FIG. 25 graphically illustrates the agent-event matrix for a given agent, used to guard the privacy of the agent's user. The rows represent different events, and the columns represent agents. The entry aeij represents the situation for event i to be sent to agent j. For each event, the agent divides the possible recipients into different categories; thus, each event is associated with a set of different values. If aeij=−1, then the responding agent will not give any information about event i to agent j. If aeij=0, then the responding agent will report exactly what it perceived. If aeij=t>0, then the responding agent will always transmit the value t to the starting agent.
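The matrix policy reduces to a small lookup at transmission time. A minimal sketch, assuming ae is indexable by event and agent (the names are illustrative):

```python
def filter_event_for_agent(ae, event_i, agent_j, perceived_value):
    t = ae[event_i][agent_j]
    if t == -1:
        return None              # reveal nothing about this event
    if t == 0:
        return perceived_value   # report exactly what was perceived
    return t                     # t > 0: always report the fixed value t
```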
  • FIG. 26 is a flow chart illustrating the awareness process for an agent's own user. The goal is to inform the agent's own user about who is monitoring him. Function block 2601 accesses the knowledge data base to check which agents requested information about the user. Function block 2605 identifies the information the other agents requested and which information was given to them. Function block 2615 displays the corresponding information to the user, when asked, using a proper user interface.
  • While the invention has been described in terms of a single preferred embodiment, those skilled in the art will recognize that the invention can be practiced with modification within the spirit and scope of the appended claims.

Claims (8)

1-15. (canceled)
16. A computer supported cooperative work (CSCW) method comprising the steps of:
dividing sensing devices associated with sensing the environment of a user into groups;
calculating coefficient vectors for each group to form a layer of an eigen space pyramid;
obtaining readings from the sensing devices according to the groups;
generating coefficients for data of each of the groups obtained by the readings; and
connecting all the coefficients together to form a next layer of the eigen space pyramid,
wherein the eigen space pyramid is used to perceive subsequent events from the sensing devices.
17. The CSCW method according to claim 16, wherein the eigen space pyramid perceives subsequent events by matching readings from the sensing devices to sense the environment against learned models of all events.
18. The CSCW method according to claim 16, wherein an average of the coefficient vectors is calculated by:
$\vec{C}_i=\frac{1}{N_{exemplar}}\sum_{j=1}^{N_{exemplar}} C_{i,j}$
19. The CSCW method according to claim 16, further comprising:
for each of the groups, running training data and detecting their principal directions;
collecting the coefficient vectors with respect to the principal directions for each training exemplar;
inputting the coefficient vectors of each training exemplar as an input to the next level of the eigen space pyramid.
20. The CSCW method according to claim 19, further comprising dividing each training vector into a group which generates Nexemplar×Nevents coefficients, wherein a total length for the second level of input will be k×Nexemplar×Nevents, which is less than Ntotal=k(nacceptable−1)+r, where r is a reading.
21. The CSCW method according to claim 20, wherein:
if k×Nexemplar×Nevents>nacceptable, new data is input and the process is repeated to reduce an amount of data;
if k×Nexemplar×Nevents is much less than nacceptable and a further eigen coefficient extraction is meaningless, then k×Nexemplar×Nevents are final coefficients of the training exemplar; and
if k×Nexemplar×Nevents is less than nacceptable and a further eigen coefficient extraction is meaningful, then another round of eigen coefficient extraction is performed and newly generated coefficients will be taken as final coefficients of the exemplar of the training events.
22. The CSCW method according to claim 21, wherein:
after extracting the final coefficients of all exemplar of all events, the average of the final coefficients of all the exemplar with respect to a given event are taken as the model of the corresponding event.
US11/135,276 2000-05-02 2005-05-24 System and method for a computer based cooperative work system Abandoned US20050216561A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US11/135,276 US20050216561A1 (en) 2000-05-02 2005-05-24 System and method for a computer based cooperative work system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US09/562,915 US6981019B1 (en) 2000-05-02 2000-05-02 System and method for a computer based cooperative work system
US11/135,276 US20050216561A1 (en) 2000-05-02 2005-05-24 System and method for a computer based cooperative work system

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US09/562,915 Division US6981019B1 (en) 2000-05-02 2000-05-02 System and method for a computer based cooperative work system

Publications (1)

Publication Number Publication Date
US20050216561A1 true US20050216561A1 (en) 2005-09-29

Family

ID=34991443

Family Applications (2)

Application Number Title Priority Date Filing Date
US09/562,915 Expired - Fee Related US6981019B1 (en) 2000-05-02 2000-05-02 System and method for a computer based cooperative work system
US11/135,276 Abandoned US20050216561A1 (en) 2000-05-02 2005-05-24 System and method for a computer based cooperative work system

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US09/562,915 Expired - Fee Related US6981019B1 (en) 2000-05-02 2000-05-02 System and method for a computer based cooperative work system

Country Status (1)

Country Link
US (2) US6981019B1 (en)

Families Citing this family (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB0108044D0 (en) * 2001-03-30 2001-05-23 British Telecomm Application synchronisation
US6651100B2 (en) * 2002-03-12 2003-11-18 Lexmark International, Inc. Automatic negotiation of an internet protocol address for a network connected device
WO2003084173A1 (en) * 2002-03-28 2003-10-09 British Telecommunications Public Limited Company Synchronisation in multi-modal interfaces
US7765175B2 (en) 2003-09-18 2010-07-27 Optimum Power Technology, L.P. Optimization expert system
US7089604B2 (en) * 2003-11-05 2006-08-15 Wright Glenn H Toilet support device and method
US7908325B1 (en) 2005-06-20 2011-03-15 Oracle America, Inc. System and method for event-based collaboration
US8789053B2 (en) * 2007-04-05 2014-07-22 Newton Howard Task execution and delegation by autonomous mobile agents based on intent knowledge base
WO2011002496A1 (en) * 2009-06-29 2011-01-06 Michael Domenic Forte Asynchronous motion enabled data transfer techniques for mobile devices
KR20110010906A (en) * 2009-07-27 2011-02-08 삼성전자주식회사 Apparatus and method for controlling of electronic machine using user interaction
US20120137230A1 (en) * 2010-06-23 2012-05-31 Michael Domenic Forte Motion enabled data transfer techniques
US8548740B2 (en) * 2010-10-07 2013-10-01 Honeywell International Inc. System and method for wavelet-based gait classification
DE102011001365A1 (en) * 2011-03-17 2012-09-20 Dr. Ing. H.C. F. Porsche Aktiengesellschaft Braking device and braking method for a motor vehicle
US10819759B2 (en) 2015-04-30 2020-10-27 At&T Intellectual Property I, L.P. Apparatus and method for managing events in a computer supported collaborative work environment
US9794306B2 (en) 2015-04-30 2017-10-17 At&T Intellectual Property I, L.P. Apparatus and method for providing a computer supported collaborative work environment

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5901244A (en) * 1996-06-18 1999-05-04 Matsushita Electric Industrial Co., Ltd. Feature extraction system and face image recognition system
US6308199B1 (en) * 1997-08-11 2001-10-23 Fuji Xerox Co., Ltd. Cooperative work support system for managing a window display
US6314178B1 (en) * 1997-04-11 2001-11-06 Walker Digital, Llc Method and apparatus for enabling interaction between callers with calls positioned in a queue
US6480885B1 (en) * 1998-09-15 2002-11-12 Michael Olivier Dynamically matching users for group communications based on a threshold degree of matching of sender and recipient predetermined acceptance criteria
US6526156B1 (en) * 1997-01-10 2003-02-25 Xerox Corporation Apparatus and method for identifying and tracking objects with view-based representations
US6535909B1 (en) * 1999-11-18 2003-03-18 Contigo Software, Inc. System and method for record and playback of collaborative Web browsing session
US6640241B1 (en) * 1999-07-19 2003-10-28 Groove Networks, Inc. Method and apparatus for activity-based collaboration by a computer system equipped with a communications manager

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5068645A (en) 1987-10-14 1991-11-26 Wang Laboratories, Inc. Computer input device using an orientation sensor
JP2596652B2 (en) 1990-04-30 1997-04-02 インターナショナル・ビジネス・マシーンズ・コーポレイション Knowledge base order processing method and system
US5396265A (en) 1990-09-17 1995-03-07 Massachusetts Institute Of Technology Three-dimensional tactile computer input device
JP3301040B2 (en) 1991-04-22 2002-07-15 株式会社日立製作所 Expert system
US5418889A (en) 1991-12-02 1995-05-23 Ricoh Company, Ltd. System for generating knowledge base in which sets of common causal relation knowledge are generated
WO1994028492A1 (en) * 1993-05-25 1994-12-08 Hitachi, Ltd. Distributed control system and method of configurating the system
JPH0935032A (en) 1995-07-24 1997-02-07 Mitsubishi Electric Corp Ic card and information equipment terminal
DE19607149A1 (en) 1996-02-26 1997-08-28 Siemens Ag Method for computer-aided comparison of several file copies of a stored file stored in at least one computer
US5781732A (en) 1996-06-20 1998-07-14 Object Technology Licensing Corp. Framework for constructing shared documents that can be collaboratively accessed by multiple users
JP3821170B2 (en) 1996-07-26 2006-09-13 富士ゼロックス株式会社 Method for managing collaborative work information and collaborative work support system
US5861883A (en) * 1997-05-13 1999-01-19 International Business Machines Corp. Method and system for portably enabling awareness, touring, and conferencing over the world-wide web using proxies and shared-state servers
US6233600B1 (en) * 1997-07-15 2001-05-15 Eroom Technology, Inc. Method and system for providing a networked collaborative work environment
US6342906B1 (en) * 1999-02-02 2002-01-29 International Business Machines Corporation Annotation layer for synchronous collaboration
US6446113B1 (en) * 1999-07-19 2002-09-03 Groove Networks, Inc. Method and apparatus for activity-based collaboration by a computer system equipped with a dynamics manager
US6505233B1 (en) * 1999-08-30 2003-01-07 Zaplet, Inc. Method for communicating information among a group of participants

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100341298C (en) * 2005-10-13 2007-10-03 华中科技大学 Expandable dynamic fault-tolerant method for cooperative system
US20070214460A1 (en) * 2005-10-27 2007-09-13 Institute For Information Industry Method and system for dynamic event matching
US7849084B2 (en) * 2005-10-27 2010-12-07 Institute For Information Industry Method and system for dynamic event matching
US20080140488A1 (en) * 2006-12-08 2008-06-12 Tolga Oral Event scheduling conflict management and resolution for unprocessed events in a collaborative computing environment
KR100803579B1 (en) 2006-12-29 2008-02-15 성균관대학교산학협력단 Quantitative estimating system for grouping of multiagent and method thereof
CN101984430A (en) * 2010-11-04 2011-03-09 中兴通讯股份有限公司 Multi-user collaborative graphic editing method and system for mobile terminal
WO2012058989A1 (en) * 2010-11-04 2012-05-10 中兴通讯股份有限公司 Method and system for multi-user collaborative graph editing on mobile terminal
CN112215326A (en) * 2019-07-10 2021-01-12 华为技术有限公司 Distributed AI system
WO2021056731A1 (en) * 2019-09-23 2021-04-01 平安科技(深圳)有限公司 Log data analysis-based behavior detection method, apparatus, device, and medium

Also Published As

Publication number Publication date
US6981019B1 (en) 2005-12-27

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION