CA2054631C - Look-ahead method and apparatus for predictive dialing using a neural network - Google Patents
Look-ahead method and apparatus for predictive dialing using a neural network
- Publication number
- CA2054631C
- Authority
- CA
- Canada
- Prior art keywords
- current
- call record
- call
- input parameters
- dial
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Links
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M3/00—Automatic or semi-automatic exchanges
- H04M3/42—Systems providing special services or facilities to subscribers
- H04M3/50—Centralised arrangements for answering calls; Centralised arrangements for recording messages for absent or busy subscribers; Centralised arrangements for recording messages
- H04M3/51—Centralised call answering arrangements requiring operator intervention, e.g. call or contact centers for telemarketing
- H04M3/523—Centralised call answering arrangements requiring operator intervention, e.g. call or contact centers for telemarketing with call distribution or queueing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/10—Interfaces, programming languages or software development kits, e.g. for simulating neural networks
- G06N3/105—Shells for specifying net layout
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M3/00—Automatic or semi-automatic exchanges
- H04M3/22—Arrangements for supervision, monitoring or testing
- H04M3/36—Statistical metering, e.g. recording occasions when traffic exceeds capacity of trunks
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M3/00—Automatic or semi-automatic exchanges
- H04M3/42—Systems providing special services or facilities to subscribers
- H04M3/50—Centralised arrangements for answering calls; Centralised arrangements for recording messages for absent or busy subscribers; Centralised arrangements for recording messages
- H04M3/51—Centralised call answering arrangements requiring operator intervention, e.g. call or contact centers for telemarketing
- H04M3/5158—Centralised call answering arrangements requiring operator intervention, e.g. call or contact centers for telemarketing in combination with automated outdialling systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04Q—SELECTING
- H04Q3/00—Selecting arrangements
- H04Q3/0016—Arrangements providing connection between exchanges
- H04Q3/002—Details
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04Q—SELECTING
- H04Q3/00—Selecting arrangements
- H04Q3/42—Circuit arrangements for indirect selecting controlled by common circuits, e.g. register controller, marker
- H04Q3/54—Circuit arrangements for indirect selecting controlled by common circuits, e.g. register controller, marker in which the logic circuitry controlling the exchange is centralised
- H04Q3/545—Circuit arrangements for indirect selecting controlled by common circuits, e.g. register controller, marker in which the logic circuitry controlling the exchange is centralised using a stored programme
- H04Q3/54575—Software application
- H04Q3/54591—Supervision, e.g. fault localisation, traffic measurements, avoiding errors, failure recovery, monitoring, statistical analysis
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10—TECHNICAL SUBJECTS COVERED BY FORMER USPC
- Y10S—TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10S379/00—Telephonic communications
- Y10S379/904—Auto-calling
Abstract
A predictive dialing system having a computer connected to a telephone switch stores a group of call records in its internal storage. Each call record contains a group of input parameters, including the date, the time, and one or more workload factors. Workload factors can indicate the number of pending calls, the number of available operators, the average idle time, the connection delay, the completion rate, and the nuisance call rate, among other things. In the preferred embodiment, each call record also contains a dial action, which indicates whether a call was initiated or not. These call records are analyzed by a neural network to determine a relationship between the input parameters and the dial action stored in each call record. This analysis is done as part of the training process for the neural network. After this relationship is determined, the computer system sends a current group of input parameters to the neural network, and, based on the analysis of the previous call records, the neural network determines whether a call should be initiated or not. The neural network bases its decision on the complex relationship it has learned from its training data -- perhaps several thousand call records spanning several days, months, or even years. The neural network is able to automatically adjust -- in a look-ahead, proactive manner -- for slow and fast periods of the day, week, month, and year.
Description
RO9-90-047 1 2054631
LOOK-AHEAD METHOD AND APPARATUS FOR PREDICTIVE
DIALING USING A NEURAL NETWORK
Field of the Invention This invention relates to the data processing field.
More particularly, this invention is a look-ahead method and apparatus for predictive dialing using a neural network.
Background of the Invention Communications in the 1990s is considerably more complex than it used to be. Back in the stone age, when one Neanderthal wanted to communicate with another Neanderthal, he walked over to the second Neanderthal and grunted a few sounds. Gradually, communication evolved into written messages that could be delivered, first by messenger and later by mail.
Eventually, the telephone was invented. The telephone allowed a person to communicate with another person simply and efficiently by picking up the receiver and dialing the telephone number of the person he wished to speak to.
Salespeople were on a similar evolutionary track. When a salesman wanted to sell something to another person, he went door to door and tried to convince whoever was there that they should buy what the salesman was selling. When this proved to be inefficient due to the high number of doors slammed in the salesman's face, the salesman began mailing letters, brochures, and other written promotional materials to prospective customers. This was also inefficient, since a very high percentage of these mailings were considered to be "junk mail" by the recipients. Only a small percentage of the mailings resulted in sales.
It didn't take long for salespeople to discover the telephone. A salesman could quickly and inexpensively call a prospective customer and explain what he was selling.
Since most calls ended quickly (with the potential customer expressing his lack of interest in a variety of ways and then hanging up), the bulk of the time was spent figuring out who was going to be called and trying to establish a connection with that person. The phone would often be busy or not answered, forcing the salesman to try again later and look for another prospective customer to call.
Salespeople began to realize that this approach was also inefficient. They discovered that computers could quickly perform much of the overhead involved with establishing connections with prospective customers. When a salesperson (now known as a "telemarketer") completed a call, he could instruct the computer to dial the next number from a list of numbers stored in the computer. This became known as outbound telemarketing.
Although very efficient, conventional outbound telemarketing still had problems. Much of the telemarketer's time was spent listening to busy signals or phones that weren't answered. In addition, telemarketers often grew weary of a high degree of rejection, and were reluctant to instruct the computer that they were ready to make another call. To solve these problems, predictive dialing was developed. In a typical predictive dialing arrangement, the potential customer is called by the computer. If someone answers the phone, the computer finds an available telemarketer and connects the call to this telemarketer.
While prior attempts in the predictive dialing field have been very good at the "dialing" part of predictive dialing, they have not been good at the "predicting" part.
Often, the computer makes and completes a call to a customer, only to discover that there isn't an operator available to take the call. This is known as a "nuisance call". The customer then is subjected to a recorded announcement, a ringing signal, dead silence, or a hang up.
The opposite problem of having operators sitting idle waiting for the computer to dial a customer also frequently occurs in prior attempts. This is known as "operator idle time".
U.S. Patent number 4,829,563 to Crockett et al. attempted to solve these problems of nuisance calls and operator idle time by dynamically adjusting the number of calls dialed based on short term comparisons of the weighted predicted number of calls versus the predicted number of operators, and based on periodic adjustment of a weighting factor. Crockett's "short term" comparisons are always "reactive" in nature -- changes are made only after nuisance calls or operator idle time rise to unacceptable levels.
Therefore, Crockett's "reactive dialing" approach falls short of solving the above-identified problems of nuisance calls and operator idle time.
Summary of the Invention It is a principal object of the invention to provide an efficient predictive dialing technique.
It is another object of the invention to provide a predictive dialing technique that maintains nuisance calls and operator idle time within acceptable levels.
It is another object of the invention to provide a predictive dialing technique able to look ahead and anticipate changes in calling patterns and adjust accordingly, before nuisance calls or operator idle time rise to unacceptable levels.
It is another object of the invention to use a neural network in a predictive dialing technique that is able to look ahead and anticipate changes in calling patterns and adjust accordingly, before nuisance calls or operator idle time reach unacceptable levels, based on what the neural network has learned.
These and other objects are accomplished by the look-ahead method and apparatus for predictive dialing using a neural network disclosed herein.
A predictive dialing system having a computer connected to a telephone switch stores a group of call records in its internal storage. Each call record contains a group of input parameters, including the date, the time, and one or more workload factor parameters. Workload factor parameters can indicate the number of pending calls, the number of available operators, the average idle time, the connection delay, the completion rate, the conversation length and the nuisance call rate, among other things. In the preferred embodiment, each call record also contains a dial action, which indicates whether a call was initiated or not.
These call records are analyzed by a neural network to determine a relationship between the input parameters and the dial action stored in each call record. This analysis is done as part of the training process for the neural network. After this relationship is determined, the computer system sends a current group of input parameters to the neural network, and, based on the analysis of the previous call records, the neural network determines whether a call should be initiated or not. The neural network bases its decision on the complex relationship it has learned from its training data -- perhaps several thousand call records spanning several days, months, or even years. The neural network is able to automatically adjust -- in a look-ahead, proactive manner -- for slow and fast periods of the day, week, month, and year.
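The patent gives no source code for the call records themselves, but the summary above maps naturally onto a record type that pairs the input parameters with the dial action. The following Python sketch is purely illustrative -- the field names, types, and example values are assumptions, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class CallRecord:
    # Input parameters (names and types are illustrative assumptions)
    date: str                  # e.g. "1991-10-28"
    time: str                  # e.g. "14:30"
    pending_calls: int         # workload factor: calls waiting to be dialed
    available_operators: int   # workload factor: idle operators
    avg_idle_time: float       # seconds of operator idle time
    connection_delay: float    # seconds from dial to answer
    completion_rate: float     # fraction of dialed calls completed
    nuisance_call_rate: float  # fraction of answered calls with no operator
    # Output: the dial action taken under these conditions
    dial_action: int           # 1 = a call was initiated, 0 = not

def to_training_pair(rec: CallRecord):
    """Split a call record into (inputs, target) for network training."""
    inputs = [rec.pending_calls, rec.available_operators, rec.avg_idle_time,
              rec.connection_delay, rec.completion_rate, rec.nuisance_call_rate]
    return inputs, rec.dial_action

rec = CallRecord("1991-10-28", "14:30", 12, 3, 4.5, 2.0, 0.55, 0.01, 1)
x, y = to_training_pair(rec)
```

A training set would simply be a list of such pairs, one per historical call record.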
Brief Description of the Drawing Fig. 1 shows a block diagram of the predictive dialing system of the invention.
Fig. 2 shows how a massively parallel hardware implemented neural network can be simulated on a serial Von Neumann based computer system.
Figs. 3A-3B show a conceptual framework of the computing environment of the invention.
Fig. 4 shows the neural network data structure of the invention.
Figs. 5-9 show the flowcharts of the neural network utility of the invention.
Figs. 10A-10B show examples of numeric training data used in the preferred and alternate embodiments of the invention.
Figs. 11-17 show screens displayed to a user creating, training, and running the predictive dialing neural network of the invention.
Description of the Preferred Embodiment Fig. 1 shows a block diagram of predictive dialing system 5 of the invention. Computer system 10 consists of main or central processing unit 11 connected to storage 12.
Storage 12 can be primary memory such as RAM and/or secondary memory such as magnetic or optical storage.
Processor 11 is connected to co-processor 13. Co-processor 13 may provide generic math calculation functions (a math co-processor) or specialized neural network hardware support functions (a neural network processor). Co-processor 13 is not necessary if CPU 11 has sufficient processing power to handle an intensive computational workload without unacceptable performance degradation. CPU 11 is also connected to user interface 14. User interface 14 allows developers and users to communicate with computer system 10, normally through a workstation or terminal.
In the preferred embodiment, computer system 10 is an IBM Application System/400 midrange computer, although any computer system could be used. Co-processor 13 is preferably a processor on the Application System/400 midrange computer, but could also be the math co-processor (such as an Intel 80387 math co-processor) found on personal computers, such as the IBM PS/2. In this case, CPU 11 and co-processor 13 would communicate with each other via IBM PC Support.
Computer system 10 is connected to telephone switch 17 over line 15 through telephony enabler 20. In the preferred embodiment, telephony enabler 20 is the IBM licensed program product CallPath/400, although other commercially available telephony enablers could be used. In the preferred embodiment, telephone switch 17 is a Teleos IRX9000, although any other switch capable of interfacing with a computer and supporting a predictive dialing application may be used. Switch 17 is able to establish connections between an external telephone, such as external telephone 19, with an operator telephone, such as operator telephone 18, in a conventional manner under the direction of computer system 10. Computer system 10 also communicates with a plurality of operator terminals, such as operator terminal 16, via user interface 14. Data associated with the calls made by predictive dialing system 5 (such as a script or information about the called party) may be displayed on operator terminal 16.
Fig. 2 shows how neural network (parallel) computers can be simulated on a Von Neumann (serial) processor system.
There are many different neural network models with different connection topologies and processing unit attributes. However, they can be generally classified as computing systems which are made of many (tens, hundreds, or thousands of) simple processing units 21 which are connected by adaptive (changeable) weights 22. In addition to processors and weights, a neural network model must have a learning mechanism 23, which operates by updating the weights after each training iteration.
A neural network model can be simulated on a digital computer by programs and data. Programs 26 simulate the processing functions performed by neural network processing units 21, and adaptive connection weights 22 are contained in data 27. Programs 28 are used to implement the learning or connection weight adaptation mechanism 23.
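As a minimal sketch of that simulation idea -- programs computing unit outputs from data holding the adaptive weights -- the following Python fragment is illustrative only; the function name and layer sizes are assumptions:

```python
import random

def simulate_layer(inputs, weights):
    """Each simulated processing unit computes a weighted sum of its
    inputs through its row of adaptive connection weights (the 'data'
    part of the simulation); this function is the 'program' part."""
    return [sum(w * x for w, x in zip(row, inputs)) for row in weights]

random.seed(0)
inputs = [1.0, 0.5]
# Three simulated units, each with one adaptive weight per input
weights = [[random.uniform(-1.0, 1.0) for _ in inputs] for _ in range(3)]
outputs = simulate_layer(inputs, weights)
```

A learning mechanism would then be a third piece of code that rewrites `weights` after each training iteration.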
Fig. 3A shows the conceptual layout of the neural network of this invention and how it relates to the predictive dialing application software. At the highest level is application programming interface 31 (API). API 31 is a formally specified interface which allows application developers lacking expert knowledge of neural networks to access and use the utility programs and data structure of neural network shell 32 in their application programs.
Neural network shell 32 consists of a set of utility programs 33 and a neural network data structure 50. Shell 32 provides the capability for easily and efficiently defining, creating, training, and running neural networks in applications on conventional computing systems.
Any neural network model, such as example models 35-38, can be supported by neural network shell 32 by defining a generic neural network data structure 50 which can be accessed by all of the utility programs in neural network shell 32. Each neural network model is mapped onto this generic neural network data structure, described in more detail in Fig. 4. Programs specific to each neural network model are called by neural network utility programs 33, as will be discussed later.
Fig. 3B shows how predictive dialing application program 41 becomes neural network application program 40 by interfacing with one or more of the neural network utility programs 45-48 in neural network shell 32. Utility programs 45-48 in turn interface with data structure 50. Data to be processed by neural network application program 40 (also referred to herein as "neural network") enters on input 42.
After the data is run through the neural network, the result is returned to application program 41 on line 44.
Application program 41 and utility programs 45-48 reside in suitably programmed CPU 11 and/or co-processor 13 (Fig. 1).
Data structure 50 resides in storage 12 and/or in internal storage of CPU 11 and/or co-processor 13.
Fig. 4 shows neural network data structure 50 of the invention. Data structure 50 provides a common framework which allows any neural network model to be defined for use in an application program. This common framework is accomplished by providing several of the fields in neural network data structure 50 for model specific parameters.
"AS/400 Neural Network Utility: User's Guide and Reference PRPQ P84189" (order number SC21-8202-0), pages 103-105, shows how the model specific fields of data structure 50 are used by the Back Propagation, ART, Self Organizing Feature Map, TSP, and BAM neural network models.
Data structure 50 consists of header portion 60 and body portion 90. Header portion 60 contains fields 61-79.
Fields 61 and 62 are pointers to other neural network data structures, if any. If neural networks are arranged in a linked list for serial processing of data, the first pointer would link to the previous network. This link can be used to obtain the outputs from the previous sub-net in the larger network. The second pointer would be a pointer to the next network. Depending on the collection of sub-networks, either or both of these links would be used in a complex (hybrid) network composed of several sub-networks.
Neural network data structures can be chained together to provide increased flexibility and function to the application program. Providing the capability of linking to two additional neural networks allows "super" networks made up of modules of networks.
Field 63 is an offset in bytes to the next free space in body portion 90. Field 64 is an offset in bytes to the end of the neural network data structure. Since body portion 90 is a variable length data area, fields 63 and 64 are needed to keep track of the size of the data structure and the next available free space in body portion 90.
Field 65 contains the name of the neural network. The name of the predictive dialing neural network, discussed in more detail later, will be entered into this field. The name of this network is NNPACER, and this name is placed in field 65 by the create neural network utility program, as will be discussed later.
Field 66 contains the name of the library where the neural network is located and is required in the preferred embodiment. In the AS/400, programs are stored in libraries. Libraries are similar to subdirectories in the personal computing environment. Field 66 would not be necessary in computing environments without libraries.
Field 67 contains the network version identifier. This information is used to prevent mismatches between neural network shell programs and neural network data structures.
As new versions or releases of software are developed, compatibility with existing networks is desirable. If any enhancements require changes to the fundamental network data structure, this field would allow detection of a software-to-data mismatch. The software could call a conversion routine to update the data structure format, or accept down-level data structures.
Field 79 contains the name of the neural network model or type. The neural network model name used in the preferred embodiment by the predictive dialing neural network is "*BKP" for Back Propagation.
Field 68 contains the current state of the network.
Possible states are INITIALIZE if the network is being created, TRAINING if the network is being trained, or LOCKED if the training is complete and ready to run.
Field 69 is an optional field for storing a model specific alphanumeric field, if desired. Field 70 keeps track of the elapsed network training time in seconds.
Fields 71-74 contain different types of parameters used differently by specific neural network models. Field 71 contains up to four network Boolean parameters. A Back Propagation neural network model, for example, uses two of these parameters for determining whether epoch update and random input is enabled or disabled. The network Boolean parameters are also known as network flags. Of course, field 71 (as well as other fields of data structure 50) could be made larger or smaller to accommodate fewer or greater than the number of parameters used in the preferred embodiment, if desired. Field 72 contains network size parameters. This field contains up to five model-specific network size integer parameters. Field 73 contains up to five model-specific network index integer parameters. Field 74 contains up to six model-specific network training real parameters, such as learn rate, momentum, epoch error, etc.
Field 75 keeps track of the number of training epochs (an epoch is an iteration through the complete set of training data) of the neural network. Field 76 contains an array of offsets in bytes to the start of each model-specific array in body portion 90. Field 77 contains an array of resolved pointers to the start of each model-specific array in body portion 90. Field 78 contains an array of parameters describing the type of data held in each array. For example, some neural models accept only binary inputs. In the preferred embodiment, if a parameter in field 78 contains a "1" then its corresponding array contains bitmapped data. If the parameter is a "2" then its corresponding array contains single precision floating point data (the default). If it is "3" then its corresponding array contains fixed point zoned decimal data. These parameters are used to make more efficient use of storage.
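The header-plus-body layout described above can be sketched in Python. This is a loose analogue, not the AS/400 layout: the dictionary keys, the subset of fields shown, and the packing convention are all assumptions for illustration:

```python
import struct

def create_network_struct(name, model):
    """A rough analogue of data structure 50: header fields plus a
    variable-length body of model-specific arrays located by byte
    offsets. Only a few header fields are shown."""
    return {
        "name": name,           # field 65
        "model": model,         # field 79
        "state": "INITIALIZE",  # field 68
        "epochs": 0,            # field 75
        "array_offsets": [],    # field 76: byte offset of each body array
        "body": bytearray(),    # body portion 90
    }

def append_array(net, data):
    """Append a single-precision float array (type "2", the default in
    field 78) to the body, recording its starting offset as field 76
    would; the current body length plays the role of field 63."""
    net["array_offsets"].append(len(net["body"]))
    net["body"] += struct.pack(f"{len(data)}f", *data)

net = create_network_struct("NNPACER", "*BKP")
append_array(net, [0.1, 0.2, 0.3])   # e.g. an activations array
append_array(net, [0.0] * 6)         # e.g. a weights array
```

Storing offsets rather than pointers is what lets the Teach and Run programs re-resolve the arrays (into field 77) each time the structure is loaded.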
The contents of body portion 90 of data structure 50 will now be discussed. Body portion 90 is a variable-length data area which contains a number (sixteen in the preferred embodiment) of model-specific arrays. Pages 103-105 of Attachment I show the arrays mapped to header portion 60 and body portion 90 for each of the exemplary neural network models. For example, the back propagation model maps eleven arrays to body portion 90: activations, weights, threshold, weight deltas, etc., as shown under the heading "Array Mapping" on page 103.
Data structure 50 is created by the Create Neural Network utility program, as will be discussed later (Figs.
7A-7B). The Teach and Run utility programs access the header information to initialize the pointers to the data area arrays. The data in the data area arrays in turn are used in the simulation of the neural network training and calculation processes.
Figs. 5-9 show the flowcharts of the invention, as performed by suitably programmed CPU 11 and/or co-processor 13. Fig. 5 shows an overview of the major steps in the neural network application program development process.
Block 110 asks if there is a new neural network model to be defined. If so, block 200 calls the Define Neural Network Model Subroutine (Fig. 6). If not, block 120 asks if the user wishes to create a neural network data structure. A
neural network data structure is created for each neural network. For example, one neural network data structure would be created for our predictive dialing neural network.
If block 120 is answered affirmatively, block 300 calls the Create Neural Network Data Structure Subroutine (Fig. 7).
If not, block 130 asks if the user wishes to train a neural network. A neural network needs to be trained with training data so that it can learn the relationship between input data and the desired output result, or extract relevant features from input data. If so, block 400 calls the Teach Neural Network Subroutine (Fig. 8). If not, block 140 asks if the user wants to run a neural network. If so, block 500 calls the Run Neural Network Model Subroutine (Fig. 9). If not, the program ends in block 190.
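The define/create/teach/run dispatch of Fig. 5 can be sketched as a simple routing table. The patent's utility programs are AS/400 programs; the function names and return values below are invented placeholders:

```python
# Stub subroutines standing in for the four utility programs.
def define_model(**kw):      return "defined"   # Fig. 6
def create_structure(**kw):  return "created"   # Fig. 7
def teach_network(**kw):     return "trained"   # Fig. 8
def run_network(**kw):       return "ran"       # Fig. 9

DISPATCH = {
    "define": define_model,
    "create": create_structure,
    "teach": teach_network,
    "run": run_network,
}

def shell(action, **kw):
    """Route the user's choice to the matching subroutine, as the
    yes/no questions of blocks 110-140 in Fig. 5 do."""
    return DISPATCH[action](**kw)
```

Each real subroutine would operate on a neural network data structure rather than return a string.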
Figs. 6A - 6D describe Define Neural Network Model Subroutine 200. For our predictive dialing neural network we want to define a Back Propagation neural network model.
Block 201 assigns a neural network model specific meaning to network string field 69, if desired. In our network, this field is not needed, so a null string is assigned. Block 202 assigns a neural network model specific meaning to Boolean parameters field 71. In our network, two Boolean parameters are assigned: Epoch update (Y/N) and Random Inputs (Y/N). Block 203 assigns a neural network model specific meaning to network size parameters field 72. In our network, five parameters are assigned: number of inputs, number of units in hidden layer 1, number of units in hidden layer 2, number of outputs, and number of processing units. Block 204 assigns a neural network model specific meaning to network index parameters field 73. In our network, the following parameters are assigned: first hidden unit 1, last hidden unit 1, first hidden unit 2, last hidden unit 2, and first output. Block 205 assigns a neural network model specific meaning to network training parameters field 74. In our network, the following parameters are assigned: learn rate, momentum, pattern error, epoch error, and tolerance. Block 206 assigns a neural network model specific meaning to network array offsets field 76. Since there are eleven data arrays to be defined in a Back Propagation neural network model, this field contains the byte offset to the first element of each of the eleven arrays located in body portion 90.
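The index parameters of field 73 follow from the size parameters of field 72 once a unit-numbering convention is fixed. The patent does not state the convention; the sketch below assumes units are numbered consecutively from zero (inputs, then hidden layer 1, hidden layer 2, outputs):

```python
def index_params(n_in, n_h1, n_h2, n_out):
    """Derive the network index parameters (field 73) from the size
    parameters (field 72), assuming consecutive 0-based numbering:
    inputs first, then hidden layer 1, hidden layer 2, and outputs."""
    first_h1 = n_in
    last_h1 = n_in + n_h1 - 1
    first_h2 = last_h1 + 1
    last_h2 = first_h2 + n_h2 - 1
    first_out = last_h2 + 1
    return first_h1, last_h1, first_h2, last_h2, first_out

# e.g. 6 inputs, 4 units in hidden layer 1, 3 in hidden layer 2, 1 output
params = index_params(6, 4, 3, 1)
```

Under this convention a 6-4-3-1 network would yield indices (6, 9, 10, 12, 13).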
Block 210 calls the Build Neural Network Model Create Program Subroutine of Fig. 6B. Referring now to Fig. 6B, subroutine 210 requires that model specific routines are built so that they can be executed later by the Create Neural Network Data Structure Subroutine (Fig. 7). Block 211 provides a simple routine to prompt the user for parameter information specific to the neural network and check for erroneous and inconsistent parameter values. For example, block 211 would provide a routine that would prepare a screen similar to Fig. 12. The screen in Fig. 12, among other things, prompts the user for information about the following parameters: Number of input units, number of hidden units L1, number of hidden units L2, and number of output units.
Block 212 provides a routine to initialize the generic neural network data structure with default parameter values to create the default neural network data structure for this neural network model. All neural network models have the same generic neural network data structure. Each individual neural network model has its own unique default data structure. Therefore, all neural networks application programs that use the same neural network model (such as Back Propagation) will input unique parameter values into the same default neural network data structure.
Block 213 saves the neural network model create program built in subroutine 210 by giving it a unique name and writing it to storage 12 (Fig. 1). In the preferred embodiment, this program can be written in any language desired which has the capability to access the data structure. Block 219 returns to block 230 of Fig. 6A.
Block 230 calls the Build Neural Network Model Teach Program Subroutine of Fig. 6C. Referring now to Fig. 6C, subroutine 230 requires that model specific routines are written so that they can be executed later by the Teach Neural Network Subroutine (Fig. 8). Block 231 provides a simple routine to initialize the network array pointers in field 77 of Fig. 4. Block 232 provides a routine for copying network size, index and training parameters (fields 72-74) into local variables. This is done to improve performance and programming reliability. Block 233 provides a routine to initialize the neural network. Block 233 initializes counters and variables used by the neural network teach program. If network status field 68 is "Initialize", block 233 also initializes data array values (connection weights) and changes the status from "Initialize" to "Training" in field 68.
Block 234 provides a routine to perform a single teach step for this neural network model. This routine provides a mechanism, highly dependent on the neural network model, used to adjust the values of the data in the data array of body 90 so that the network can learn the desired functions.
Those skilled in the art would take a neural network model description of its weight adjustment procedures and simply convert this description to a program, using a computer language of their choice, that accesses the data structure of the invention.
Block 235 provides a routine to be performed when the training epoch processing has been completed. This routine can vary in complexity from a simple clean up procedure such as resetting variables to a more complex adjustment of data array values, depending on the neural network model. Those skilled in the art would take a neural network model description of its unique end of epoch processing and simply convert this description to a program, using a computer language of their choice, that accesses the data structure of the invention.
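To illustrate what such a converted teach step might look like, here is a deliberately minimal gradient-descent update for a single sigmoid output unit. It is not the patent's Back Propagation teach program (which also propagates error through the hidden layers and reads its parameters from data structure 50); every name and constant here is an assumption:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def teach_step(weights, inputs, target, learn_rate=0.5):
    """One teach step for a single sigmoid unit: forward pass, error,
    then a gradient-descent adjustment of the connection weights."""
    out = sigmoid(sum(w * x for w, x in zip(weights, inputs)))
    delta = (target - out) * out * (1.0 - out)   # error times sigmoid slope
    new_w = [w + learn_rate * delta * x for w, x in zip(weights, inputs)]
    return new_w, out

w = [0.0, 0.0]
for _ in range(200):                 # repeated teach steps
    w, out = teach_step(w, [1.0, 1.0], target=1.0)
```

After repeated steps the unit's output is driven toward the target, which is the essence of what block 234 does inside each training epoch.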
Block 236 saves the neural network model teach program built in subroutine 230 by giving it a unique name and writing it to storage 12 (Fig. 1). Block 239 returns to block 250 of Fig. 6A.
Block 250 calls the Build Neural Network Model Run Program Subroutine of Fig. 6D. Referring now to Fig. 6D, subroutine 250 requires that model specific routines are written so that they can be executed later by the Run Neural Network Subroutine (Fig. 9). Block 251 provides a simple routine to initialize the network array pointers in field 77 of Fig. 4. Block 252 provides a routine for copying network size, index and training parameters (fields 72-74) into local variables. Block 253 provides a routine to pass input data through the neural network. Block 254 provides a routine to return the output result to the Run Neural Network Subroutine. Block 255 saves the neural network model run program built in subroutine 250 by giving it a unique name and writing it to storage 12 (Fig. 1). Block 259 returns to block 260 of Fig. 6A.
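The pass-through routine of block 253 amounts to a layer-by-layer forward pass. The sketch below is illustrative only -- the example weights are arbitrary, not trained values, and the sigmoid activation is an assumption consistent with the Back Propagation model:

```python
import math

def forward(inputs, layers):
    """Pass input data through the network layer by layer (block 253);
    each layer is a list of weight rows, one row per unit."""
    acts = inputs
    for weights in layers:
        acts = [1.0 / (1.0 + math.exp(-sum(w * a for w, a in zip(row, acts))))
                for row in weights]
    return acts

# Two inputs -> two hidden units -> one output (arbitrary example weights)
layers = [[[0.5, -0.5], [1.0, 1.0]],
          [[1.0, -1.0]]]
out = forward([1.0, 0.0], layers)
```

The returned activation list is what block 254 would hand back to the Run Neural Network Subroutine -- in the predictive dialing application, an output that the system interprets as dial or do not dial.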
Block 260 enters the name of the neural network model (such as "*BKP" for back propagation) and the names of the create, teach, and run programs for this model saved in blocks 213, 236, and 255 into a model definition file stored in storage 12. Block 270 returns to block 120 of Fig. 5.
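The model definition file described above can be pictured as a simple mapping from a model name to its three program names. The following Python sketch is purely illustrative: the text only names the "*BKP" model, and the program names used here are invented for the example.

```python
# Hypothetical sketch of the model definition file: each entry maps a
# model name to the names of its create, teach, and run programs.
# Program names here are invented; the text only specifies "*BKP".
MODEL_DEFINITION_FILE = {
    "*BKP": {"create": "BKPCRT", "teach": "BKPTCH", "run": "BKPRUN"},
}

def lookup_model(model_name):
    """Return the program names for a model, or None if the model
    was never defined (the check made in block 303)."""
    return MODEL_DEFINITION_FILE.get(model_name)
```

Looking up an undefined model returns nothing, which corresponds to the error path where the user is asked to reenter the model name.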
In the preferred embodiment, five neural network models are predefined for the convenience of the application developer or user. The predefined models are Back Propagation, Adaptive Resonance Theory, Self Organizing Feature Maps, Self Organizing TSP Networks, and Bidirectional Associative Memories. Therefore, these models do not have to be defined by the user using the Define Neural Network Model Subroutine. The predictive dialing application program of the invention uses the predefined Back Propagation model as its neural network model, although other models could also be used.
The remaining flowcharts will be discussed in conjunction with the predictive dialing neural network of the invention. The user creates this neural network by answering block 120 affirmatively in Fig. 5 and calling the Create Neural Network Data Structure Subroutine in block 300
(Fig. 7). Referring now to Fig. 7A, block 301 prompts the user for the name of the neural network and textual description information, as shown in Fig. 11. The user enters "NNPACER" as the name of the neural network and "Neural Network Pacer for Predictive Dialing" for the textual description. Block 302 prompts the user for the name of the neural network model. As shown in Fig. 11, the user enters "*BKP", an abbreviation for the Back Propagation neural network model. Block 303 checks to see if the model "*BKP" was defined in the model definition file in block 260 of Fig. 6A. If not, block 304 posts an error message and the user is asked to reenter the name of the neural network model in block 301. In our network, the model definition file contains the "*BKP" and block 330 calls the Run Model Create Program Subroutine for this model of Fig. 7B. The Model Create Program was prepared by the Build Model Create Program Subroutine of Fig. 6B, as has been discussed. The name of this program, along with the names of the Teach and Run programs for this model, are all contained in the model definition file.
Referring now to Fig. 7B, block 331 creates the default neural network data structure for this neural network model, by running the routine provided in block 212 of Fig. 6B.
Block 332 prompts the user for neural network specific parameters, as shown in Fig. 12. In the preferred embodiment, the user specifies 16 input units (one each for month, day, year, day of week, hour, minute, second, pending calls, available operators, average connect delay, average idle time, nuisance call rate, average completion rate, average conversation length, idle time delta and nuisance call delta), 35 hidden units and 1 output unit (call action). In the preferred embodiment, the number of hidden units is equal to 2 * (number of inputs + number of outputs) + 1. Block 333 checks to see if the user supplied parameters are acceptable. Note that the routine provided by block 211 in Fig. 6B to prompt the user for these parameters placed limits on the user's input, such as 1-1000 output units. If the user inputs a value outside of any of these ranges,
block 333 would be answered negatively, an error message would be posted in block 334, and the user would be asked to reenter the data in block 332. In addition, if the user inputs inconsistent parameter information, an error message would also be posted. In our case, the user supplied parameters are all acceptable, so block 335 fills in all user supplied parameters into the default data structure created by block 331. Block 336 performs calculations to fill in network index parameters field 73 and network array offsets field 76, based on the data now residing in the data structure. Block 337 initializes the Boolean parameters in field 71 (both to "N" in our example) and the training parameters in field 74 (to the values shown in Fig. 15 in our example). Block 338 allocates and initializes the data array fields located in body portion 90. In a back propagation neural network model, the following arrays would be allocated: activations, weights, threshold, weight deltas, threshold deltas, teach, error, delta, network input, weight derivative, and threshold derivative. These values are all initialized (as determined by the neural network model) in block 338. After block 338 is executed, the neural network data structure contains all the information needed to teach the neural network how to perform predictive dialing. The subroutine returns in block 339 to block 305 in Fig. 7A. Block 305 returns to block 130 in Fig. 5.
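The hidden-unit sizing rule and the array allocation of block 338 can be sketched as follows. The sizing formula comes directly from the text; the array dimensions are illustrative guesses (the text names the arrays but not their sizes), with per-weight arrays sized by the connection count and the rest by the unit count.

```python
def hidden_units(n_inputs, n_outputs):
    """Hidden-layer sizing rule stated in the text:
    2 * (number of inputs + number of outputs) + 1."""
    return 2 * (n_inputs + n_outputs) + 1

def create_bkp_arrays(n_in, n_hid, n_out):
    """Allocate zero-initialized data arrays for body portion 90.
    Array names come from the text; the sizes are assumptions."""
    n_units = n_in + n_hid + n_out
    n_weights = n_in * n_hid + n_hid * n_out
    per_weight = {"weights", "weight_deltas", "weight_derivative"}
    names = ["activations", "weights", "threshold", "weight_deltas",
             "threshold_deltas", "teach", "error", "delta",
             "network_input", "weight_derivative", "threshold_derivative"]
    return {n: [0.0] * (n_weights if n in per_weight else n_units)
            for n in names}
```

For the preferred embodiment's 16 inputs and 1 output, `hidden_units(16, 1)` yields the 35 hidden units specified by the user in Fig. 12.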
Note that once a neural network data structure has been created, it can be transported to another computer system to be taught and/or run. The other computer system can be of an entirely different architecture and run an entirely different operating system than the computer system that created the neural network data structure. This flexibility is possible since the data structure contains data that can be used universally among different computer systems.
Since our user wants to train his newly created neural network to perform predictive dialing, he answers block 130 affirmatively in Fig. 5, thereby calling the Teach Neural
Network Subroutine in block 400 (Fig. 8). Referring now to Fig. 8A, block 401 prompts the user for the name of the neural network and library as shown in Fig. 14. The user enters "NNPACER" as the name of the neural network, "BIGUS"
as the library name. Fig. 14 also gives the user the opportunity to enter the name of a custom interface program he can write to improve the usability of his particular neural network, if desired. In addition, the user is asked if he wants the training results to be logged or displayed, and (if a custom interface program exists) whether he wants the training data taken automatically from the data set or one step at a time from the user when he presses the enter key. Block 402 checks to see if the data structure specified in block 401 exists. If not, an error is posted and the user is returned to block 401. If so, block 403 prompts the user for the name of the data set where the training data is located. As shown in Fig. 13, the user enters "NNDATA" as the data set and "NNPACER" as the data set member where the training data is located.
Fig. 10A shows the initial training data used in the preferred embodiment. Initial training data can be generated manually, taking into account known and estimated conditions in a predictive dialing environment. For example, the first two records of training data indicate that calls after 4:00 PM on Fridays have a lower completion rate than calls at 10:30 AM on Wednesdays. Therefore, with all other workload factors being even, the neural network may learn that it should make a call at 4:00 PM on Friday, but shouldn't make the call at 10:30 AM on Wednesday, since the desired nuisance rate might be exceeded. The third record indicates that a call shouldn't be made because the average idle time is too low. The fourth record indicates that a call shouldn't be made because the average nuisance call rate is too high. The fifth record indicates that a call shouldn't be made because the number of calls pending is too high. The sixth record indicates that a call should be made because the number of available operators is sufficiently high.
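A training record of this kind holds the sixteen input parameters followed by the desired call action. The sketch below shows one such record in the style of the first record of Fig. 10A (Friday at 4:00 PM, call made); the specific date and workload values other than the two deltas mentioned in the text are invented for illustration.

```python
# The sixteen input parameters named in the text, plus the call action.
FIELDS = ["month", "day", "year", "day_of_week", "hour", "minute",
          "second", "pending_calls", "available_operators",
          "avg_connect_delay", "avg_idle_time", "nuisance_call_rate",
          "avg_completion_rate", "avg_conversation_length",
          "idle_time_delta", "nuisance_call_delta", "call_action"]

# Hypothetical record: Friday (day_of_week 5) at 4:00 PM, idle time
# delta -35 seconds and nuisance call delta 0.4% (as in the text's
# first record), desired call action 1 (make the call).
sample = [6, 7, 91, 5, 16, 0, 0, 12, 4, 2.5, 240, 0.2,
          0.55, 90, -35, 0.4, 1]
record = dict(zip(FIELDS, sample))
```

A training data set would then simply be a list of such records, one per observed dialing decision.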
Input parameters 811-814 make up date parameter 810.
Input parameters 821-823 make up time parameter 820. In the preferred embodiment, time parameter 820 takes into account the time zone of the called party. Input parameters 831-838 make up workload factor parameter 830. In an alternate embodiment shown in Fig. 10B, date parameter 810 consists of a single input parameter. Time parameter 820 consists of a single input parameter. Workload factor parameter 830 consists of a single input parameter. Workload factor parameter 830 could be selected to be whatever the application developer considers to be the most important parameter, such as idle time delta or nuisance call delta.
Output parameter 850 is not needed if only records where a call was made are stored.
Block 404 determines that the data set exists, so block 405 prompts the user for the name of the custom interface program, if any. If symbolic data is stored in the data set, a user specified custom interface program is needed to convert symbolic data (that humans understand) into numeric data (that neural networks understand). A custom interface program may also be used to normalize input data to give all data a range between 0 and 1, if desired. In our network, a custom interface program was specified in Fig. 13, and this program normalizes all data in a conventional manner for computational efficiency. Block 420 calls the Run Model Teach Program Subroutine for this model of Fig. 8B. The Model Teach Program was prepared by the Build Model Teach Program Subroutine of Fig. 6C, as has been discussed.
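The normalization performed by the custom interface program is not specified in the text; one conventional choice, shown here as an assumption, is min-max scaling of each raw input into the range 0 to 1.

```python
def normalize(value, lo, hi):
    """Min-max scale a raw input into [0, 1]. The actual custom
    interface program's method is unspecified; this is one
    conventional possibility."""
    return (value - lo) / (hi - lo)

# e.g. an hour-of-day input of 12 on a 0-24 scale maps to 0.5
half_day = normalize(12, 0, 24)
```

Each of the sixteen inputs would be scaled against its own plausible range (hours 0-24, percentages 0-100, and so on) before being presented to the network.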
Referring now to Fig. 8B, block 433 performs the initialization routine built by blocks 231, 232 and 233 of Fig. 6C. Block 421 checks to see if a custom interface program was specified. If so, block 422 gets the data from the custom interface program. Otherwise, block 423 gets the data directly from the data set. Block 424 performs one teach step by running the neural network model-dependent routine provided by block 234 of Fig. 6C. In our example, the values of the data in the data arrays in body 90 are adjusted to minimize the error between the desired and actual network outputs. Block 425 again checks for a custom interface program. If it exists, block 426 checks to see if the user wants the values of the data in the data structure to be displayed. If so, a custom screen generated by the custom interface program is displayed in block 427. An example custom screen is shown in Fig. 17. If no custom interface program exists but the user wants data displayed, a default screen is displayed in block 428. An example default screen is shown in Fig. 15.
Referring again to Fig. 8B, block 429 checks to see if the user wanted the data logged. If so, block 430 performs custom or default logging of data. In either event, block 434 checks to see if one epoch has been completed. An epoch is complete when all training data in the data set has been processed once. If not, control loops back to block 421 to get the next training data. If one epoch has been completed, block 435 performs the end of epoch processing routine built by block 235 in Fig. 6C. In our example, the end of epoch processing routine determines if the difference between the actual and desired output for our output unit (call action) for all training data is less than the specified tolerance (one of the training parameters in field 74). If so, it sets the network status in field 68 to "locked". When the status of the neural network is "locked",
the values of the data arrays are not permitted to change.
Block 431 then checks to see if the number of iterations specified by the user has been completed. Until this happens, block 431 is answered negatively and flow returns back to block 421 to perform another iteration through the training data. When the training period is complete, block 431 is answered positively. The subroutine returns in block 439 to block 407 of Fig. 8A. Block 407 returns to block 140 of Fig. 5.
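The epoch loop of blocks 421-435 can be sketched schematically as follows. This is a simplification under stated assumptions: the model-specific teach step is passed in as a function returning a per-record error, and the loop stops once the network locks (in the full flowchart, iterations continue but a locked network's arrays no longer change).

```python
def teach(records, teach_step, tolerance, max_iterations):
    """Run up to max_iterations training epochs. Returns True if the
    network reached "locked" status (all record errors within the
    specified tolerance), False otherwise. A simplified sketch of
    the loop in Fig. 8B; teach_step is a stand-in for the
    model-dependent weight-adjustment routine."""
    for _ in range(max_iterations):
        # one epoch: every training record processed once
        errors = [teach_step(inputs, desired) for inputs, desired in records]
        # end-of-epoch processing: lock once every error is in tolerance
        if max(errors) < tolerance:
            return True  # network status set to "locked"
    return False
```

With a teach step whose error shrinks each epoch, the network locks as soon as the worst-case error falls below the tolerance from field 74.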
Since our user wants to run his newly trained neural network to perform predictive dialing, he answers block 140 affirmatively in Fig. 5, thereby calling the Run Neural Network Subroutine in block 500 (Fig. 9). Alternatively, predictive dialing application program 41 (Fig. 3B) can call the Run Neural Network Subroutine directly, thereby bypassing Fig. 5.
Referring now to Fig. 9A, block 501 performs the initialization routine built by blocks 251 and 252 of Fig.
6D. Block 502 determines the name of the neural network.
Block 530 calls the Run Model Run Program Subroutine for this model of Fig. 9B. The Model Run Program was prepared by the Build Model Run Program Subroutine of Fig. 6D, as has been discussed.
Referring now to Fig. 9B, block 531 gets the date, time, day of week, number of pending calls, and number of available operators from the system. It then calculates the average connect delay, the average completion rate, the average idle time, the average nuisance call rate, and the average conversation length.
Although these averages can be calculated any number of ways, a preferred way is to keep a running count of the last 5 minutes of activity and determine the various averages over this time period. Block 533 calculates an idle time delta and a nuisance call delta. Idle time delta is the seconds per hour difference between a desired idle time (a variable entered into the computer system by the user) and the actual idle time. For example, if 205 seconds per hour is the desired idle time, and if the actual idle time is 240 seconds, the idle time delta would be -35 seconds (205-240=
The nuisance call delta is the desired percentage of nuisance calls minus the actual percentage of nuisance calls.
For example, if the desired nuisance call percentage is 0.6%
and the actual nuisance call percentage is 0.2%, the nuisance call delta is +0.4% (0.6% - 0.2% = 0.4%). The first record of Fig. 10A shows an idle time delta of -35 seconds and a nuisance call delta of 0.4%. The input data of blocks 531 and 533 are considered to be a "current call record".
The desired idle time and desired percentage of nuisance calls are design choices and can vary based on the particular application. A 300 second to 600 second idle time per hour (5-10 minutes) may be desirable to minimize operator fatigue yet also avoid operator boredom and low productivity. It is normally desirable to keep the nuisance call percentage as close to 0% as possible to minimize customer annoyance with being contacted by a computer when no operator is available.
The data used in blocks 531 and 533 is normally determined from information retrieved from telephony enabler 20. In the preferred embodiment, this information is retrieved from the CallPath/400 telephony enabler by using a series of commands supported by the CallPath/400 Application Programming Interface. This interface is described in more detail in IBM document GC21-9867, CallPath/400 Programmer's Reference. Some of the specific commands that can be used by those skilled in the art are Make_Call, Receive, Add_Party, and Disconnect. These commands return the information needed to determine the data used in blocks 531 and 533 in the form of the following events: Call_Alerting, Call_Connected, Call_Rejected, Disconnected (and associated timestamp information included with the above events). The Feature_Invoked event is also used in determining the status of operators or agents.
Block 535 runs all the input data contained in the current call record through the trained neural network.
When the neural network was trained, it determined a relationship between the input data contained in call records and a call action (make or don't make the call). Based on this relationship, the neural network looks at the input data in the current call record and, in the preferred embodiment, passes a numeric value between 0 and 1 to predictive dialing application program 41 via line 44 (Fig.
3B). The closer this numeric value is to 1, the more confident the neural network is that a call should be made.
Predictive dialing application program 41, in the preferred embodiment, uses a threshold value of 0.5, although this could be larger or smaller. Therefore, a numeric value of 0.5 or greater from the neural network indicates that a call should be made, while a numeric value less than 0.5 indicates that a call should not be made.
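The thresholding step reduces to a single comparison, sketched below with the preferred embodiment's value of 0.5.

```python
DIAL_THRESHOLD = 0.5  # the preferred embodiment's value; a design choice

def should_dial(network_output):
    """Map the network's 0-to-1 confidence onto a dial decision:
    0.5 or greater means make the call."""
    return network_output >= DIAL_THRESHOLD
```

Raising the threshold would make the dialer more conservative (fewer nuisance calls, more idle time); lowering it would do the opposite.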
Block 540 asks if the neural network indicated that a call should be made. If so, block 541 instructs the switch to make the call. In the preferred embodiment, this is done by informing telephony enabler 20 that a call should be made. Telephony enabler 20 handles the communications protocol with the switch necessary to make calls.
Block 542 saves the current call record in a temporary dataset for future analysis, as will be discussed later. In the preferred embodiment, block 542 appends the call action onto the call record and saves all call records, whether the call was made or not. An alternate embodiment is contemplated where the call action is not appended and only call records where a call was made are saved in block 542.
Block 545 checks to see if the application program wants to stop making calls. The application program may automatically stop making calls after a certain elapsed time, at a specific time of day, or if all the operators have gone home. If no such indication to stop making calls is received, flow of control loops back to block 531 where new input data is retrieved. If an indication to stop making calls is received, block 550 asks if the call records saved through various iterations of block 542 should be analyzed to see if the neural network needs further training. If analysis is not desirable, the subroutine returns in block 590 to block 190 in Fig. 5, where the program ends, or, alternatively, returns to predictive dialing application program 41 that called it for further processing.
If block 550 is answered affirmatively, Analyze Call Records Subroutine 600 of Fig. 9C is called. Referring now
to Fig. 9C, block 601 asks if there is a call record to process. If so, block 605 asks if the average idle time is greater than desired. If so, block 606 asks if a call was made. A call should have been made if the idle time is greater than desired, since operators are sitting around waiting for something to do. If block 606 indicates that a call was not made, the neural network made the "wrong"
decision in this case. Block 607 changes the Dial Action field in the call record from a "0" (indicating that a call wasn't made) to a "1" (indicating that a call was made).
This change is done to make the call record reflect the desired result so that the neural network can learn from it later. If block 606 indicates that a call was made, the neural network made the right decision. In either event, flow returns back to block 601 to look for another record to process.
If block 605 was answered negatively, block 615 asks if the average idle time is less than desired. If so, block 616 asks if a call was made. A call should not have been made if the idle time is less than desired, since operators are overworked. If block 616 indicates that a call was made, the neural network made the "wrong" decision in this case.
Block 617 changes the Dial Action field in the call record from a "1" (indicating that a call was made) to a "0"
(indicating that a call wasn't made). As before, this change is done to make the call record reflect the desired result so that the neural network can learn from it later.
If block 616 indicates that a call was not made, the neural network made the right decision. In either event, flow returns back to block 601 to look for another record to process.
If block 615 was answered negatively, block 625 asks if the average nuisance call rate is greater than desired. If so, block 626 asks if a call was made. A call should not have been made if the nuisance call rate is greater than desired, since it will be likely that there will be no operators available to take the call. If block 626 indicates that a call was made, the neural network made the "wrong" decision in this case. Block 627 changes the Dial Action field in the call record from a "1" (indicating that a call was made) to a "0" (indicating that a call wasn't made). As before, this change is done to make the call record reflect the desired result so that the neural network can learn from it later. If block 626 indicates that a call was not made, the neural network made the right decision.
In either event, flow returns back to block 601 to look for another record to process.
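The three relabeling rules of Fig. 9C can be sketched as one pass over the saved call records. Field names here are illustrative; setting the Dial Action field unconditionally is equivalent to flipping it only when it disagrees with the desired action, which simplifies the flowchart's branch-per-decision structure.

```python
def analyze_call_records(records, desired_idle, desired_nuisance):
    """Relabel records where the network made the 'wrong' decision
    (blocks 605-627) so that retraining reflects the desired result.
    Each record is a dict; dial_action 1 means a call was made."""
    for rec in records:
        if rec["avg_idle_time"] > desired_idle:
            rec["dial_action"] = 1   # operators waiting: should have called
        elif rec["avg_idle_time"] < desired_idle:
            rec["dial_action"] = 0   # operators overworked: no call
        elif rec["nuisance_call_rate"] > desired_nuisance:
            rec["dial_action"] = 0   # nuisance rate too high: no call
    return records
```

The corrected records are then appended to the training dataset, so the next training pass learns from the dialer's mistakes.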
When block 601 indicates that there are no more call records to process, the subroutine returns in block 650 to block 560 in Fig. 9B. Block 560 adds the call records (some of which may have been changed by subroutine 600) to the training dataset. The temporary dataset is then erased. By putting these records into the training dataset, the neural network can be retrained by restarting the flowchart of Fig.
5 and indicating that the network is to be trained. In this manner, the neural network can improve its learning process and make fewer and fewer mistakes in the future. After a few of these learning iterations, the neural network should be able to consistently stay within the desired idle rate and nuisance call percentage parameters and be able to look ahead and anticipate changes in calling patterns and adjust accordingly, before the nuisance call rate or operator idle time reach unacceptable levels.
While this invention has been described with respect to the preferred embodiment, it will be understood by those skilled in the art that various changes in detail may be made therein without departing from the spirit, scope and teaching of the invention. For example, the input parameters selected could be quite different from those in the preferred embodiment. Economic factors such as unemployment rate or gross national product may be added;
other factors such as current weather conditions may also be added. In addition, the desired idle time or the desired nuisance call percentage can be larger or smaller than the exemplary values shown herein. Although a neural network is used in the preferred embodiment, the relationship between a selected group of input parameters and the desired output can be determined through an expert system or other programming or logical circuitry. Accordingly, the invention herein disclosed is to be limited only as specified in the following claims.
LOOK-AHEAD METHOD AND APPARATUS FOR PREDICTIVE
DIALING USING A NEURAL NETWORK
Field of the Invention This invention relates to the data processing field.
More particularly, this invention is a look-ahead method and apparatus for predictive dialing using a neural network.
Background of the Invention Communications in the 1990s is considerably more complex than it used to be. Back in the stone age, when one Neanderthal wanted to communicate with another Neanderthal, he walked over to the second Neanderthal and grunted a few sounds. Gradually, communication evolved into written messages that could be delivered, first by messenger and later by mail.
Eventually, the telephone was invented. The telephone allowed a person to communicate with another person simply and efficiently by picking up the receiver and dialing the telephone number of the person he wished to speak to.
Salespeople were on a similar evolutionary track. When a salesman wanted to sell something to another person, he went door to door and tried to convince whoever was there that they should buy what the salesman was selling. When this proved to be inefficient due to the high number of doors slammed in the salesman's face, the salesman began mailing letters, brochures, and other written promotional materials to prospective customers. This was also inefficient, since a very high percentage of these mailings were considered to be "junk mail" by the recipients. Only a small percentage of the mailings resulted in sales.
It didn't take long for salespeople to discover the telephone. A salesman could quickly and inexpensively call a prospective customer and explain what he was selling.
Since most calls ended quickly (with the potential customer expressing his lack of interest in a variety of ways and then hanging up) the bulk of the time was spent figuring out who was going to be called and trying to establish a connection with that person. The phone would often be busy or not answered, forcing the salesman to try again later and look for another prospective customer to call.
Salespeople began to realize that this approach was also inefficient. They discovered that computers could quickly perform much of the overhead involved with establishing connections with prospective customers. When a salesperson (now known as a "telemarketer") completed a call, he could instruct the computer to dial the next number from a list of numbers stored in the computer. This became known as outbound telemarketing.
Although very efficient, conventional outbound telemarketing still had problems. Much of the telemarketer's time was spent listening to busy signals or phones that weren't answered. In addition, telemarketers often grew weary of a high degree of rejection, and were reluctant to instruct the computer that they were ready to make another call. To solve these problems, predictive dialing was developed. In a typical predictive dialing arrangement, the potential customer is called by the computer. If someone answers the phone, the computer finds an available telemarketer and connects the call to this telemarketer.
While prior attempts in the predictive dialing field have been very good at the "dialing" part of predictive dialing, they have not been good at the "predicting" part.
Often, the computer makes and completes a call to a customer, only to discover that there isn't an operator available to take the call. This is known as a "nuisance call". The customer then is subjected to a recorded announcement, a ringing signal, dead silence or a hang up.
The opposite problem of having operators sitting idle waiting for the computer to dial a customer also frequently occurs in prior attempts. This is known as "operator idle time".
U.S. Patent number 4,829,563 to Crockett et al attempted to solve these problems of nuisance calls and operator idle time by dynamically adjusting the number of calls dialed based on short term comparisons of the weighted predicted number of calls versus the predicted number of operators, and based on periodic adjustment of a weighting factor. Crockett's "short term" comparisons are always "reactive" in nature -- changes are made only after nuisance calls or operator idle time rise to unacceptable levels.
Therefore, Crockett's "reactive dialing" approach falls short of solving the above-identified problems of nuisance calls and operator idle time.
Summary of the Invention It is a principal object of the invention to provide an efficient predictive dialing technique.
It is another object of the invention to provide a predictive dialing technique that maintains nuisance calls and operator idle time within acceptable levels.
It is another object of the invention to provide a predictive dialing technique able to look ahead and anticipate changes in calling patterns and adjust accordingly, before nuisance calls or operator idle time rise to unacceptable levels.
It is another object of the invention to use a neural network in a predictive dialing technique that is able to look ahead and anticipate changes in calling patterns and adjust accordingly, before nuisance calls or operator idle time reach unacceptable levels, based on what the neural network has learned.
These and other objects are accomplished by the look-ahead method and apparatus for predictive dialing using a neural network disclosed herein.
A predictive dialing system having a computer connected to a telephone switch stores a group of call records in its internal storage. Each call record contains a group of input parameters, including the date, the time, and one of
more workload factor parameters. Workload factor parameters can indicate the number of pending calls, the number of available operators, the average idle time, the connection delay, the completion rate, the conversation length and the nuisance call rate, among other things. In the preferred embodiment, each call record also contains a dial action, which indicates whether a call was initiated or not.
These call records are analyzed by a neural network to determine a relationship between the input parameters and the dial action stored in each call record. This analysis is done as part of the training process for the neural network. After this relationship is determined, the computer system sends a current group of input parameters to the neural network, and, based on the analysis of the previous call records, the neural network determines whether a call should be initiated or not. The neural network bases its decision on the complex relationship it has learned from its training data -- perhaps several thousand call records spanning several days, months, or even years. The neural network is able to automatically adjust -- in a look ahead, proactive manner -- for slow and fast periods of the day, week, month, and year.
Brief Description of the Drawing Fig. 1 shows a block diagram of the predictive dialing system of the invention.
Fig. 2 shows how a massively parallel hardware implemented neural network can be simulated on a serial Von Neumann based computer system.
Figs. 3A-3B show a conceptual framework of the computing environment of the invention.
Fig. 4 shows the neural network data structure of the invention.
Figs. 5-9 show the flowcharts of the neural network utility of the invention.
Figs. 10A-10B show examples of numeric training data used in the preferred and alternate embodiments of the invention.
Figs. 11-17 show screens displayed to a user creating, training, and running the predictive dialing neural network of the invention.
Description of the Preferred Embodiment Fig. 1 shows a block diagram of predictive dialing system 5 of the invention. Computer system 10 consists of main or central processing unit 11 connected to storage 12.
Storage 12 can be primary memory such as RAM and/or secondary memory such as magnetic or optical storage.
Processor 11 is connected to co-processor 13. Co-processor 13 may provide generic math calculation functions (a math co-processor) or specialized neural network hardware support functions (a neural network processor). Co-processor 13 is not necessary if CPU 11 has sufficient processing power to
handle an intensive computational workload without unacceptable performance degradation. CPU 11 is also connected to user interface 14. User interface 14 allows developers and users to communicate with computer system 10, normally through a workstation or terminal.
In the preferred embodiment, computer system 10 is an IBM Application System/400 midrange computer, although any computer system could be used. Co-processor 13 is preferably a processor on the Application System/400 midrange computer, but could also be the math co-processor (such as an Intel 80387 math co-processor) found on personal computers, such as the IBM PS/2. In this case, CPU 11 and co-processor 13 would communicate with each other via IBM PC
Support.
Computer system 10 is connected to telephone switch 17 over line 15 through telephony enabler 20. In the preferred embodiment, telephony enabler 20 is the IBM licensed program product CallPath/400, although other commercially available telephony enablers could be used. In the preferred embodiment, telephone switch 17 is a Teleos IRX9000, although any other switch capable of interfacing with a computer and supporting a predictive dialing application may be used. Switch 17 is able to establish connections between an external telephone, such as external telephone 19, and an operator telephone, such as operator telephone 18, in a conventional manner under the direction of computer system 10. Computer system 10 also communicates with a plurality of operator terminals, such as operator terminal 16, via user interface 14. Data associated with the calls made by predictive dialing system 5 (such as a script or information about the called party) may be displayed on operator terminal 16.
Fig. 2 shows how neural network (parallel) computers can be simulated on a Von Neumann (serial) processor system.
There are many different neural network models with different connection topologies and processing unit attributes. However, they can be generally classified as computing systems which are made of many (tens, hundreds, or thousands of) simple processing units 21 which are connected by adaptive (changeable) weights 22. In addition to processors and weights, a neural network model must have a learning mechanism 23, which operates by updating the weights after each training iteration.
A neural network model can be simulated on a digital computer by programs and data. Programs 26 simulate the processing functions performed by neural network processing units 21, and adaptive connection weights 22 are contained in data 27. Programs 28 are used to implement the learning or connection weight adaptation mechanism 23.
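The division of labor just described (programs simulating the processing units, plain data holding the adaptive weights, and a separate learning routine updating that data) can be sketched as follows. This is only an illustrative sketch of serial simulation; the function names, network size, and learning rule here are invented for the example and are not taken from the patent.

```python
def run_units(inputs, weights, thresholds):
    """Simulate one layer of processing units 21 serially (programs 26).

    Each unit sums its weighted inputs and fires if the sum exceeds
    its threshold.
    """
    outputs = []
    for unit_weights, threshold in zip(weights, thresholds):
        activation = sum(w * x for w, x in zip(unit_weights, inputs))
        outputs.append(1.0 if activation > threshold else 0.0)
    return outputs


def learn(weights, inputs, errors, rate=0.1):
    """Adaptive-weight update (programs 28): adjust the connection
    weights (data 27) in place after a training pass."""
    for unit_weights, err in zip(weights, errors):
        for i, x in enumerate(inputs):
            unit_weights[i] += rate * err * x


# Weights are ordinary data: 2 units x 2 inputs.
weights = [[0.2, -0.4], [0.7, 0.1]]
outputs = run_units([1.0, 0.0], weights, thresholds=[0.0, 0.5])
learn(weights, [1.0, 0.0], errors=[0.5, -0.5])
```

The parallel hardware is thus reduced to loops over arrays, which is exactly what allows the shell described next to run any such model on a conventional serial computer.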
Fig. 3A shows the conceptual layout of the neural network of this invention and how it relates to the predictive dialing application software. At the highest level is application programming interface 31 (API). API 31 is a formally specified interface which allows application developers lacking expert knowledge of neural networks to access and use the utility programs and data structure of neural network shell 32 in their application programs.
Neural network shell 32 consists of a set of utility programs 33 and a neural network data structure 50. Shell 32 provides the capability for easily and efficiently defining, creating, training, and running neural networks in applications on conventional computing systems.
Any neural network model, such as example models 35-38, can be supported by neural network shell 32 by defining a generic neural network data structure 50 which can be accessed by all of the utility programs in neural network shell 32. Each neural network model is mapped onto this generic neural network data structure, described in more detail in Fig. 4. Programs specific to each neural network model are called by neural network utility programs 33, as will be discussed later.
RO9-90-047
Fig. 3B shows how predictive dialing application program 41 becomes neural network application program 40 by interfacing with one or more of the neural network utility programs 45-48 in neural network shell 32. Utility programs 45-48 in turn interface with data structure 50. Data to be processed by neural network application program 40 (also referred to herein as "neural network") enters on input 42.
After the data is run through the neural network, the result is returned to application program 41 on line 44.
Application program 41 and utility programs 46-48 reside in suitably programmed CPU 11 and/or co-processor 13 (Fig. 1).
Data structure 50 resides in storage 12 and/or in internal storage of CPU 11 and/or co-processor 13.
Fig. 4 shows neural network data structure 50 of the invention. Data structure 50 provides a common framework which allows any neural network model to be defined for use in an application program. This common framework is accomplished by providing several of the fields in neural network data structure 50 for model specific parameters.
"AS/400~ Neural Network Utility: User s Guide and Reference PRPQ P84189" (order number SC21-8202-0), pages 103-105, shows how the model specific fields of data structure 50 are used by the Back Propagation, ART, Self Organizing Feature Map, T~P, and BAM neural network models.
Data structure 50 consists of header portion 60 and body portion 90. Header portion 60 contains fields 61-79.
Fields 61 and 62 are pointers to other neural network data structures, if any. If neural networks are arranged in a linked list for serial processing of data, the first pointer would link to the previous network. This link can be used to obtain the outputs from the previous sub-net in the larger network. The second pointer would be a pointer to the next network. Depending on the collection of sub-networks, either or both of these links would be used in a complex (hybrid) network composed of several sub-networks.
Neural network data structures can be chained together to provide increased flexibility and function to the application program. Providing the capability of linking to two additional neural networks allows "super" networks made up of modules of networks.
Field 63 is an offset in bytes to the next free space in body portion 90. Field 64 is an offset in bytes to the end of the neural network data structure. Since body portion 90 is a variable length data area, fields 63 and 64 are needed to keep track of the size of the data structure and the next available free space in body portion 90.
Field 65 contains the name of the neural network. The name of the predictive dialing neural network, discussed in more detail later, will be entered into this field. The name of this network is NNPACER, and this name is placed in field 65 by the create neural network utility program, as will be discussed later.
Field 66 contains the name of the library where the neural network is located and is required in the preferred embodiment. In the AS/400, programs are stored in libraries. Libraries are similar to subdirectories in the personal computing environment. Field 66 would not be necessary in computing environments without libraries.
Field 67 contains the network version identifier. This information is used to prevent mismatches between neural network shell programs and neural network data structures.
As new versions or releases of software are developed, compatibility with existing networks is desirable. If any enhancements require changes to the fundamental network data structure, this field would allow detection of a software-to-data mismatch. The software could call a conversion routine to update the data structure format, or accept down-level data structures.
Field 79 contains the name of the neural network model or type. The neural network model name used in the preferred embodiment by the predictive dialing neural network is "*BKP" for Back Propagation.
Field 68 contains the current state of the network.
Possible states are INITIALIZE if the network is being created, TRAINING if the network is being trained, or LOCKED if the training is complete and ready to run.
Field 69 is an optional field for storing a model specific alphanumeric field, if desired. Field 70 keeps track of the elapsed network training time in seconds.
Fields 71-74 contain different types of parameters used differently by specific neural network models. Field 71 contains up to four network Boolean parameters. A Back Propagation neural network model, for example, uses two of these parameters for determining whether epoch update and random input is enabled or disabled. The network Boolean parameters are also known as network flags. Of course, field 71 (as well as other fields of data structure 50) could be made larger or smaller to accommodate fewer or more parameters than the number used in the preferred embodiment, if desired. Field 72 contains network size parameters. This field contains up to five model-specific network size integer parameters. Field 73 contains up to five model-specific network index integer parameters. Field 74 contains up to six model-specific network training real parameters, such as learn rate, momentum, epoch error, etc.
Field 75 keeps track of the number of training epochs (an epoch is an iteration through the complete set of training data) of the neural network. Field 76 contains an array of offsets in bytes to the start of each model-specific array in body portion 90. Field 77 contains an array of resolved pointers to the start of each model-specific array in body portion 90. Field 78 contains an array of parameters describing the type of data held in each array. For example, some neural models accept only binary inputs. In the preferred embodiment, if a parameter in field 78 contains a "1" then its corresponding array contains bitmapped data. If the parameter is a "2" then its corresponding array contains single precision floating point data (the default). If it is "3" then its corresponding array contains fixed point zoned decimal data. These parameters are used to make more efficient use of storage.
The contents of body portion 90 of data structure 50 will now be discussed. Body portion 90 is a variable-length data area which contains a number (sixteen in the preferred embodiment) of model-specific arrays. Pages 103-105 of Attachment I show the arrays mapped to header portion 60 and body portion 90 for each of the exemplary neural network models. For example, the back propagation model maps eleven arrays to body portion 90: activations, weights, threshold, weight deltas, etc., as shown under the heading "Array Mapping" on page 103.
Data structure 50 is created by the Create Neural Network utility program, as will be discussed later (Figs.
7A-7B). The Teach and Run utility programs access the header information to initialize the pointers to the data area arrays. The data in the data area arrays in turn are used in the simulation of the neural network training and calculation processes.
Figs. 5-9 show the flowcharts of the invention, as performed by suitably programmed CPU 11 and/or co-processor 13. Fig. 5 shows an overview of the major steps in the neural network application program development process.
Block 110 asks if there is a new neural network model to be defined. If so, block 200 calls the Define Neural Network Model Subroutine (Fig. 6). If not, block 120 asks if the user wishes to create a neural network data structure. A
neural network data structure is created for each neural network. For example, one neural network data structure would be created for our predictive dialing neural network.
If block 120 is answered affirmatively, block 300 calls the Create Neural Network Data Structure Subroutine (Fig. 7).
If not, block 130 asks if the user wishes to train a neural network. A neural network needs to be trained with training data so that it can learn the relationship between input data and the desired output result, or extract relevant features from input data. If so, block 400 calls the Teach Neural Network Subroutine (Fig. 8). If not, block 140 asks if the user wants to run a neural network. If so, block 500 calls the Run Neural Network Model Subroutine (Fig. 9). If not, the program ends in block 190.
Figs. 6A - 6D describe Define Neural Network Model Subroutine 200. For our predictive dialing neural network we want to define a Back Propagation neural network model.
Block 201 assigns a neural network model specific meaning to network string field 69, if desired. In our network, this field is not needed, so a null string is assigned. Block 202 assigns a neural network model specific meaning to Boolean parameters field 71. In our network, two Boolean parameters are assigned: Epoch update (Y/N) and Random Inputs (Y/N). Block 203 assigns a neural network model specific meaning to network size parameters field 72. In our network, five parameters are assigned: number of inputs, number of units in hidden layer 1, number of units in hidden layer 2, number of outputs, and number of processing units. Block 204 assigns a neural network model specific meaning to network index parameters field 73. In our network, the following parameters are assigned: first hidden unit 1, last hidden unit 1, first hidden unit 2, last hidden unit 2, and first output. Block 205 assigns a neural network model specific meaning to network training parameters field 74. In our network, the following parameters are assigned: learn rate, momentum, pattern error, epoch error, and tolerance. Block 206 assigns a neural network model specific meaning to network array offsets field 76. Since there are eleven data arrays to be defined in a Back Propagation neural network model, this field contains the byte offset to the first element of each of the eleven arrays located in body portion 90.
Block 210 calls the Build Neural Network Model Create Program Subroutine of Fig. 6B. Referring now to Fig. 6B, subroutine 210 requires that model specific routines are built so that they can be executed later by the Create Neural Network Data Structure Subroutine (Fig. 7). Block 211 provides a simple routine to prompt the user for parameter information specific to the neural network and check for erroneous and inconsistent parameter values. For example, block 211 would provide a routine that would prepare a screen similar to Fig. 12. The screen in Fig. 12, among other things, prompts the user for information about the following parameters: number of input units, number of hidden units L1, number of hidden units L2, and number of output units.
Block 212 provides a routine to initialize the generic neural network data structure with default parameter values to create the default neural network data structure for this neural network model. All neural network models have the same generic neural network data structure. Each individual neural network model has its own unique default data structure. Therefore, all neural network application programs that use the same neural network model (such as Back Propagation) will input unique parameter values into the same default neural network data structure.
Block 213 saves the neural network model create program built in subroutine 210 by giving it a unique name and writing it to storage 12 (Fig. 1). In the preferred embodiment, this program can be written in any language desired which has the capability to access the data structure. Block 219 returns to block 230 of Fig. 6A.
Block 230 calls the Build Neural Network Model Teach Program Subroutine of Fig. 6C. Referring now to Fig. 6C, subroutine 230 requires that model specific routines are written so that they can be executed later by the Teach Neural Network Subroutine (Fig. 8). Block 231 provides a simple routine to initialize the network array pointers in field 77 of Fig. 4. Block 232 provides a routine for copying network size, index and training parameters (fields 72-74) into local variables. This is done to improve performance and programming reliability. Block 233 provides a routine to initialize the neural network. Block 233 initializes counters and variables used by the neural network teach program. If network status field 68 is "Initialize", block 233 also initializes data array values (connection weights) and changes the status from "Initialize" to "Training" in field 68.
Block 234 provides a routine to perform a single teach step for this neural network model. This routine provides a mechanism, highly dependent on the neural network model, used to adjust the values of the data in the data array of body 90 so that the network can learn the desired functions.
Those skilled in the art would take a neural network model description of its weight adjustment procedures and simply convert this description to a program, using a computer language of their choice, that accesses the data structure of the invention.
Block 235 provides a routine to be performed when the training epoch processing has been completed. This routine can vary in complexity from a simple clean up procedure such as resetting variables to a more complex adjustment of data array values, depending on the neural network model. Those skilled in the art would take a neural network model description of its unique end of epoch processing and simply convert this description to a program, using a computer language of their choice, that accesses the data structure of the invention.
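As a concrete illustration of the kind of single teach step block 234 supplies for the Back Propagation model, the sketch below adjusts weights by gradient descent for a tiny two-input, two-hidden-unit, one-output network. The network shape, learning rate, and variable names are invented for this example; the actual routine operates on the arrays in body portion 90 at the sizes chosen for NNPACER.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def teach_step(x, target, w_hidden, w_out, rate=0.5):
    """One back propagation teach step: forward pass, then adjust the
    weights to reduce the error between desired and actual output.
    Returns the pattern error for this training record."""
    # forward pass through hidden layer and output unit
    h = [sigmoid(sum(w * xi for w, xi in zip(ws, x))) for ws in w_hidden]
    y = sigmoid(sum(w * hi for w, hi in zip(w_out, h)))
    # backward pass: output delta, then hidden deltas (using old w_out)
    d_out = (target - y) * y * (1.0 - y)
    d_hidden = [d_out * w_out[j] * h[j] * (1.0 - h[j]) for j in range(len(h))]
    # weight updates (the "weight deltas" arrays of the data structure)
    for j in range(len(w_out)):
        w_out[j] += rate * d_out * h[j]
    for j, ws in enumerate(w_hidden):
        for i in range(len(ws)):
            ws[i] += rate * d_hidden[j] * x[i]
    return abs(target - y)

w_hidden = [[0.1, -0.2], [0.3, 0.4]]   # 2 hidden units x 2 inputs
w_out = [0.5, -0.5]                    # 1 output unit
err_before = teach_step([1.0, 0.0], 1.0, w_hidden, w_out)
err_after = teach_step([1.0, 0.0], 1.0, w_hidden, w_out)
```

Repeating the step on the same training record shrinks the pattern error, which is precisely what the Teach Neural Network Subroutine iterates over the whole data set, epoch after epoch.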
Block 236 saves the neural network model teach program built in subroutine 230 by giving it a unique name and writing it to storage 12 (Fig. 1). Block 239 returns to block 250 of Fig. 6A.
Block 250 calls the Build Neural Network Model Run Program Subroutine of Fig. 6D. Referring now to Fig. 6D, subroutine 250 requires that model specific routines are written so that they can be executed later by the Run Neural Network Subroutine (Fig. 9). Block 251 provides a simple routine to initialize the network array pointers in field 77 of Fig. 4. Block 252 provides a routine for copying network size, index and training parameters (fields 72-74) into local variables. Block 253 provides a routine to pass input data through the neural network. Block 254 provides a routine to return the output result to the Run Neural Network Subroutine. Block 255 saves the neural network model run program built in subroutine 250 by giving it a unique name and writing it to storage 12 (Fig. 1). Block 259 returns to block 260 of Fig. 6A.
Block 260 enters the name of the neural network model (such as "*BKP" for back propagation) and the names of the create, teach, and run programs for this model saved in blocks 213, 236, and 255 into a model definition file stored in storage 12. Block 270 returns to block 120 of Fig. 5.
In the preferred embodiment, five neural network models are predefined for the convenience of the application developer or user. The predefined models are Back Propagation, Adaptive Resonance Theory, Self Organizing Feature Maps, Self Organizing TSP Networks, and Bidirectional Associative Memories. Therefore, these models do not have to be defined by the user using the Define Neural Network Model Subroutine. The predictive dialing application program of the invention uses the predefined Back Propagation model as its neural network model, although other models could also be used.
The remaining flowcharts will be discussed in conjunction with the predictive dialing neural network of the invention. The user creates this neural network by answering block 120 affirmatively in Fig. 5 and calling the Create Neural Network Data Structure Subroutine in block 300
(Fig. 7). Referring now to Fig. 7A, block 301 prompts the user for the name of the neural network and textual description information, as shown in Fig. 11. The user enters "NNPACER" as the name of the neural network and "Neural Network Pacer for Predictive Dialing" for the textual description. Block 302 prompts the user for the name of the neural network model. As shown in Fig. 11, the user enters "*BKP", an abbreviation for the Back Propagation neural network model. Block 303 checks to see if the model "*BKP" was defined in the model definition file in block 260 of Fig. 6A. If not, block 304 posts an error message and the user is asked to reenter the name of the neural network model in block 301. In our network, the model definition file contains the "*BKP" and block 330 calls the Run Model Create Program Subroutine for this model of Fig. 7B. The Model Create Program was prepared by the Build Model Create Program Subroutine of Fig. 6B, as has been discussed. The name of this program, along with the names of the Teach and Run programs for this model, are all contained in the model definition file.
Referring now to Fig. 7B, block 331 creates the default neural network data structure for this neural network model, by running the routine provided in block 212 of Fig. 6B.
Block 332 prompts the user for neural network specific parameters, as shown in Fig. 12. In the preferred embodiment, the user specifies 16 input units (one each for month, day, year, day of week, hour, minute, second, pending calls, available operators, average connect delay, average idle time, nuisance call rate, average completion rate, average conversation length, idle time delta and nuisance call delta), 35 hidden units and 1 output unit (call action). In the preferred embodiment, the number of hidden units is equal to 2 * (number of inputs + number of outputs) + 1. Block 333 checks to see if the user supplied parameters are acceptable. Note that the routine provided by block 211 in Fig. 6B to prompt the user for these parameters placed limits on the user's input, such as 1-1000 output units. If the user inputs a value outside of any of these ranges,
block 333 would be answered negatively, an error message would be posted in block 334, and the user would be asked to reenter the data in block 332. In addition, if the user inputs inconsistent parameter information, an error message would also be posted. In our case, the user supplied parameters are all acceptable, so block 335 fills in all user supplied parameters into the default data structure created by block 331. Block 336 performs calculations to fill in network index parameters field 73 and network array offsets field 76, based on the data now residing in the data structure. Block 337 initializes the Boolean parameters in field 71 (both to "N" in our example) and the training parameters in field 74 (to the values shown in Fig. 15 in our example). Block 338 allocates and initializes the data array fields located in body portion 90. In a back propagation neural network model, the following arrays would be allocated: activations, weights, threshold, weight deltas, threshold deltas, teach, error, delta, network input, weight derivative, and threshold derivative. These values are all initialized (as determined by the neural network model) in block 338. After block 338 is executed, the neural network data structure contains all the information needed to teach the neural network how to perform predictive dialing. The subroutine returns in block 339 to block 305 in Fig. 7A. Block 305 returns to block 130 in Fig. 5.
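The hidden layer sizing rule stated above can be checked arithmetically; for NNPACER's 16 input units and 1 output unit it yields the 35 hidden units the user specifies (the function name below is, of course, just for illustration):

```python
def hidden_units(n_inputs, n_outputs):
    """Preferred-embodiment sizing: 2 * (inputs + outputs) + 1."""
    return 2 * (n_inputs + n_outputs) + 1

# NNPACER: 16 input units, 1 output unit
n_hidden = hidden_units(16, 1)   # 2 * (16 + 1) + 1 = 35
```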
Note that once a neural network data structure has been created, it can be transported to another computer system to be taught and/or run. The other computer system can be of an entirely different architecture and run an entirely different operating system than the computer system that created the neural network data structure. This flexibility is possible since the data structure contains data that can be used universally among different computer systems.
Since our user wants to train his newly created neural network to perform predictive dialing, he answers block 130 affirmatively in Fig. 5, thereby calling the Teach Neural Network Subroutine in block 400 (Fig. 8). Referring now to Fig. 8A, block 401 prompts the user for the name of the neural network and library as shown in Fig. 14. The user enters "NNPACER" as the name of the neural network, "BIGUS"
as the library name. Fig. 14 also gives the user the opportunity to enter the name of a custom interface program he can write to improve the usability of his particular neural network, if desired. In addition, the user is asked if he wants the training results to be logged or displayed, and (if a custom interface program exists) whether he wants the training data taken automatically from the data set or one step at a time from the user when he presses the enter key. Block 402 sees if the data structure specified in block 401 exists. If not, an error is posted and the user is returned to block 401. If so, block 403 prompts the user for the name of the data set where the training data is located. As shown in Fig. 13, the user enters "NNDATA" as the data set and "NNPACER" as the data set member where the training data is located.
Fig. 10A shows the initial training data used in the preferred embodiment. Initial training data can be generated manually taking into account known and estimated conditions in a predictive dialing environment. For example, the first two records of training data indicate that calls after 4:00 PM on Fridays have a lower completion rate than calls at 10:30 AM on Wednesdays. Therefore, with all other workload factors being even, the neural network may learn that it should make a call at 4:00 PM on Friday, but shouldn't make the call at 10:30 AM on Wednesday, since the desired nuisance rate might be exceeded. The third record indicates that a call shouldn't be made because the average idle time is too low. The fourth record indicates that a call shouldn't be made because the average nuisance call rate is too high. The fifth record indicates that a call shouldn't be made because the number of calls pending is too high. The sixth record indicates that a call should be made because the number of available operators is sufficiently high.
Input parameters 811-814 make up date parameter 810.
Input parameters 821-823 make up time parameter 820. In the preferred embodiment, time parameter 820 takes into account the time zone of the called party. Input parameters 831-838 make up workload factor parameter 830. In an alternate embodiment shown in Fig. 10B, date parameter 810 consists of a single input parameter. Time parameter 820 consists of a single input parameter. Workload factor parameter 830 consists of a single input parameter. Workload factor parameter 830 could be selected to be whatever the application developer considers to be the most important parameter, such as idle time delta or nuisance call delta.
Output parameter 850 is not needed if only records where a call was made are stored.
Block 404 determines that the data set exists, so block 405 prompts the user for the name of the custom interface program, if any. If symbolic data is stored in the data set, a user specified custom interface program is needed to convert symbolic data (that humans understand) into numeric data (that neural networks understand). A custom interface program may also be used to normalize input data to give all data a range between 0 and 1, if desired. In our network, a custom interface program was specified in Fig. 13, and this program normalizes all data in a conventional manner for computational efficiency. Block 420 calls the Run Model Teach Program Subroutine for this model of Fig. 8B. The Model Teach Program was prepared by the Build Model Teach Program Subroutine of Fig. 6C, as has been discussed.
Referring now to Fig. 8B, block 433 performs the initialization routine built by blocks 231, 232 and 233 of Fig. 6C. Block 421 checks to see if a custom interface program was specified. If so, block 422 gets the data from the custom interface program. Otherwise, block 423 gets the data directly from the data set. Block 424 performs one teach step by running the neural network model-dependent routine provided by block 234 of Fig. 6C. In our example, the values of the data in the data arrays in body 90 are adjusted to minimize the error between the desired and actual network outputs. Block 425 again checks for a custom interface program. If it exists, block 426 checks to see if the user wants the values of the data in the data structure to be displayed. If so, a custom screen generated by the custom interface program is displayed in block 427. An example custom screen is shown in Fig. 17. If no custom interface program exists but the user wants data displayed, a default screen is displayed in block 428. An example default screen is shown in Fig. 15.
Referring again to Fig. 8B, block 429 checks to see if the user wanted the data logged. If so, block 430 performs custom or default logging of data. In either event, block 434 checks to see if one epoch has been completed. An epoch is complete when all training data in the data set has been processed once. If not, control loops back to block 421 to get the next training data. If one epoch has been completed, block 435 performs the end of epoch processing routine built by block 235 in Fig. 6C. In our example, the end of epoch processing routine determines if the difference between the actual and desired output for our output unit (call action) for all training data is less than the specified tolerance (one of the training parameters in field 74). If so, it sets the network status in field 68 to "locked". When the status of the neural network is "locked", the values of the data arrays are not permitted to change.
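The end-of-epoch test just described amounts to a simple comparison against the tolerance training parameter. A minimal sketch, with assumed names (the real routine reads field 74 and writes field 68 of data structure 50):

```python
def end_of_epoch(pattern_errors, tolerance, status):
    """If every training record's output error is within tolerance,
    lock the network; otherwise keep its current training status."""
    if all(err < tolerance for err in pattern_errors):
        return "LOCKED"     # field 68: weights may no longer change
    return status           # keep training

status = end_of_epoch([0.02, 0.04, 0.01], tolerance=0.05, status="TRAINING")
```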
Block 431 then checks to see if the number of iterations specified by the user has been completed. Until this happens, block 431 is answered negatively and flow returns back to block 421 to perform another iteration through the training data. When the training period is complete, block 431 is answered positively. The subroutine returns in block 439 to block 407 of Fig. 8A. Block 407 returns to block 140 of Fig. 5.
Since our user wants to run his newly trained neural network to perform predictive dialing, he answers block 140 affirmatively in Fig. 5, thereby calling the Run Neural Network Subroutine in block 500 (Fig. 9). Alternatively, predictive dialing application program 41 (Fig. 3B) can call the Run Neural Network Subroutine directly, thereby bypassing Fig. 5.
Referring now to Fig. 9A, block 501 performs the initialization routine built by blocks 251 and 252 of Fig.
6D. Block 502 determines the name of the neural network.
Block 530 calls the Run Model Run Program Subroutine for this model of Fig. 9B. The Model Run Program was prepared by the Build Model Run Program Subroutine of Fig. 6D, as has been discussed.
Referring now to Fig. 9B, block 531 gets the date, time, day of week, number of pending calls, and number of available operators from the system. It then calculates the average connect delay, the average completion rate, the average idle time, the average nuisance call rate, and the average conversation length.
Although these averages can be calculated any number of ways, a preferred way is to keep a running count of the last 5 minutes of activity and determine the various averages over this time period. Block 533 calculates an idle time delta and a nuisance call delta. Idle time delta is the seconds per hour difference between a desired idle time (a variable entered into the computer system by the user) and the actual idle time. For example, if 205 seconds per hour is the desired idle time, and if the actual idle time is 240 seconds, the idle time delta would be -35 seconds (205 - 240 = -35). The nuisance call delta is the desired percentage of nuisance calls minus the actual percentage of nuisance calls. For example, if the desired nuisance call percentage is 0.6% and the actual nuisance call percentage is 0.2%, the nuisance call delta is +0.4% (0.6% - 0.2% = 0.4%). The first record of Fig. 10A shows an idle time delta of -35 seconds and a nuisance call delta of 0.4%. The input data of blocks 531 and 533 are considered to be a "current call record".
The desired idle time and desired percentage of nuisance calls are design choices and can vary based on the particular application. A 300 second to 600 second idle time per hour (5-10 minutes) may be desirable to minimize operator fatigue yet also avoid operator boredom and low productivity. It is normally desirable to keep the nuisance call percentage as close to 0% as possible to minimize customer annoyance with being contacted by a computer when no operator is available.
The data used in blocks 531 and 533 is normally determined from information retrieved from telephony enabler 20. In the preferred embodiment, this information is retrieved from the CallPath/400 telephony enabler by using a series of commands supported by the CallPath/400 Application Programming Interface. This interface is described in more detail in IBM document GC21-9867, CallPath/400 Programmer's Reference. Some of the specific commands that can be used by those skilled in the art are Make_Call, Receive, Add_Party, and Disconnect. These commands return the information needed to determine the data used in blocks 531 and 533 in the form of the following events: Call_Alerting, Call_Connected, Call_Rejected, Disconnected (and associated timestamp information included with the above events). The Feature_Invoked event is also used in determining status of operators or agents.
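Since these events arrive with timestamps, the running 5-minute averages of block 531 can be kept by retaining only the samples inside the window. This is one possible way (the text notes the averages can be calculated any number of ways), with invented names:

```python
import collections
import time

class RollingAverage:
    """Average of timestamped samples within a sliding time window."""

    def __init__(self, window_seconds=300):   # 5-minute window
        self.window = window_seconds
        self.samples = collections.deque()    # (timestamp, value) pairs

    def add(self, value, now=None):
        now = time.time() if now is None else now
        self.samples.append((now, value))

    def average(self, now=None):
        now = time.time() if now is None else now
        # discard samples older than the window
        while self.samples and now - self.samples[0][0] > self.window:
            self.samples.popleft()
        if not self.samples:
            return 0.0
        return sum(v for _, v in self.samples) / len(self.samples)

# e.g. connect delays taken from Call_Connected event timestamps
connect_delay = RollingAverage()
connect_delay.add(4.0, now=0)
connect_delay.add(6.0, now=100)
connect_delay.add(8.0, now=400)   # by now=400, the first sample has aged out
```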
Block 535 runs all the input data contained in the current call record through the trained neural network.
When the neural network was trained, it determined a relationship between the input data contained in call records and a call action (make or don't make the call). Based on this relationship, the neural network looks at the input data in the current call record and, in the preferred embodiment, passes a numeric value between 0 and 1 to predictive dialing application program 41 via line 44 (Fig.
3B). The closer this numeric value is to 1, the more confident the neural network is that a call should be made.
Predictive dialing application program 41, in the preferred embodiment, uses a threshold value of 0.5, although this could be larger or smaller. Therefore, a numeric value of 0.5 or greater from the neural network indicates that a call should be made, while a numeric value less than 0.5 indicates that a call should not be made.
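The thresholding decision just described can be sketched as a one-line rule. The 0.5 default comes from the preferred embodiment; the function name is illustrative and not part of the patent:

```python
def should_make_call(network_output, threshold=0.5):
    """Map the neural network's 0-to-1 confidence value to a dial action.

    A value at or above the threshold means the network is confident
    enough that a call should be made; a lower value means it is not.
    """
    return network_output >= threshold
```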
Block 540 asks if the neural network indicated that a call should be made. If so, block 541 instructs the switch to make the call. In the preferred embodiment, this is done by informing telephony enabler 20 that a call should be made. Telephony enabler 20 handles the communications protocol with the switch necessary to make calls.
Block 542 saves the current call record in a temporary dataset for future analysis, as will be discussed later. In the preferred embodiment, block 542 appends the call action onto the call record and saves all call records, whether the call was made or not. An alternate embodiment is contemplated where the call action is not appended and only call records where a call was made are saved in block 542.
Block 545 checks to see if the application program wants to stop making calls. The application program may automatically stop making calls after a certain elapsed time, at a specific time of day, or if all the operators have gone home. If no such indication to stop making calls is received, flow of control loops back to block 531 where new input data is retrieved. If an indication to stop making calls is received, block 550 asks if the call records saved through various iterations of block 542 should be analyzed to see if the neural network needs further training. If analysis is not desirable, the subroutine returns in block 590 to block 190 in Fig. 5, where the program ends, or, alternatively, returns to predictive dialing application program 41 that called it for further processing.
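The control flow of blocks 531 through 545 can be sketched as a loop. All of the callables below are assumptions standing in for the switch interface, the trained network, and the stop condition; they are not APIs defined by the patent:

```python
def dialing_loop(get_call_record, network, place_call, stop_requested,
                 saved_records, threshold=0.5):
    """Sketch of the look-ahead dialing loop (blocks 531-545).

    get_call_record: builds the "current call record" (blocks 531/533).
    network:         trained model returning a confidence in [0, 1] (block 535).
    place_call:      instructs the switch to dial (block 541).
    stop_requested:  returns True when calling should stop (block 545).
    """
    while not stop_requested():
        record = get_call_record()              # blocks 531/533
        confidence = network(record)            # block 535
        make_call = confidence >= threshold     # block 540
        if make_call:
            place_call()                        # block 541
        # Block 542: append the call action and save the record
        # in a temporary dataset for later analysis.
        saved = dict(record)
        saved["dial_action"] = 1 if make_call else 0
        saved_records.append(saved)
    return saved_records
```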
If block 550 is answered affirmatively, Analyze Call Records Subroutine 600 of Fig. 9C is called. Referring now
to Fig. 9C, block 601 asks if there is a call record to process. If so, block 605 asks if the average idle time is greater than desired. If so, block 606 asks if a call was made. A call should have been made if the idle time is greater than desired, since operators are sitting around waiting for something to do. If block 606 indicates that a call was not made, the neural network made the "wrong"
decision in this case. Block 607 changes the Dial Action field in the call record from a "0" (indicating that a call wasn't made) to a "1" (indicating that a call was made).
This change is done to make the call record reflect the desired result so that the neural network can learn from it later. If block 606 indicates that a call was made, the neural network made the right decision. In either event, flow returns back to block 601 to look for another record to process.
If block 605 was answered negatively, block 615 asks if the average idle time is less than desired. If so, block 616 asks if a call was made. A call should not have been made if the idle time is less than desired, since operators are overworked. If block 616 indicates that a call was made, the neural network made the "wrong" decision in this case.
Block 617 changes the Dial Action field in the call record from a "1" (indicating that a call was made) to a "0"
(indicating that a call wasn't made). As before, this change is done to make the call record reflect the desired result so that the neural network can learn from it later.
If block 616 indicates that a call was not made, the neural network made the right decision. In either event, flow returns back to block 601 to look for another record to process.
If block 615 was answered negatively, block 625 asks if the average nuisance call rate is greater than desired. If so, block 626 asks if a call was made. A call should not have been made if the nuisance call rate is greater than desired, since it is likely that no operators will be available to take the call. If block 626 indicates that a call was made, the neural network made the "wrong" decision in this case. Block 627 changes the Dial Action field in the call record from a "1" (indicating that a call was made) to a "0" (indicating that a call wasn't made). As before, this change is done to make the call record reflect the desired result so that the neural network can learn from it later. If block 626 indicates that a call was not made, the neural network made the right decision.
In either event, flow returns back to block 601 to look for another record to process.
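The three branches of subroutine 600 (blocks 605, 615, and 625) amount to the following relabeling rule. The field names are illustrative assumptions; the branch ordering mirrors the flowchart, where block 615 is reached only if block 605 was answered negatively, and block 625 only if block 615 was:

```python
def relabel(record, desired_idle, desired_nuisance):
    """Correct the Dial Action field of a saved call record so it
    reflects the desired result (blocks 605-627 of Fig. 9C)."""
    corrected = dict(record)
    if record["avg_idle"] > desired_idle:
        # Blocks 605/607: operators were waiting around, so a call
        # should have been made.
        corrected["dial_action"] = 1
    elif record["avg_idle"] < desired_idle:
        # Blocks 615/617: operators were overworked, so a call
        # should not have been made.
        corrected["dial_action"] = 0
    elif record["avg_nuisance"] > desired_nuisance:
        # Blocks 625/627: the nuisance call rate was too high, so a
        # call should not have been made.
        corrected["dial_action"] = 0
    return corrected
```

Records whose dial action already matches the desired outcome pass through unchanged; only the "wrong" decisions are rewritten before retraining.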
When block 601 indicates that there are no more call records to process, the subroutine returns in block 650 to block 560 in Fig. 9B. Block 560 adds the call records (some of which may have been changed by subroutine 600) to the training dataset. The temporary dataset is then erased. By putting these records into the training dataset, the neural network can be retrained by restarting the flowchart of Fig.
5 and indicating that the network is to be trained. In this manner, the neural network can improve its learning process and make fewer and fewer mistakes in the future. After a few of these learning iterations, the neural network should be able to consistently stay within the desired idle rate and nuisance call percentage parameters and be able to look ahead and anticipate changes in calling patterns and adjust accordingly, before the nuisance call rate or operator idle time reach unacceptable levels.
While this invention has been described with respect to the preferred embodiment, it will be understood by those skilled in the art that various changes in detail may be made therein without departing from the spirit, scope and teaching of the invention. For example, the input parameters selected could be quite different from those in the preferred embodiment. Economic factors such as unemployment rate or gross national product may be added;
other factors such as current weather conditions may also be added. In addition, the desired idle time or the desired nuisance call percentage can be larger or smaller than the exemplary values shown herein. Although a neural network is used in the preferred embodiment, the relationship between a selected group of input parameters and the desired output can be determined through an expert system or other programming or logical circuitry. Accordingly, the invention herein disclosed is to be limited only as specified in the following claims.
Claims (35)
1. A look-ahead method for predictive dialing, comprising the steps of:
storing a first call record, said first call record comprising a first group of input parameters, said first group of input parameters comprising a first date, a first time of day, and a first workload factor;
storing a second call record, said second call record comprising a second group of input parameters, said second group of input parameters comprising a second date, a second time of day, and a second workload factor; and learning from said first call record and said second call record a relationship between said first and second groups of input parameters and a plurality of dial actions.
2. The method of claim 1, further comprising the step of creating a current call record containing a current date, a current time, and a current workload factor.
3. The method of claim 2, further comprising the step of determining whether a call should be initiated at said current time, said current date, and said current workload factor based on said relationship learned by said learning step.
4. The method of claim 3, further comprising the step of initiating a call if said relationship between said group of input parameters and said plurality of dial actions indicates that said call should be initiated at said current time, said current date, and said current workload factor.
5. The method of claim 4, further comprising the step of storing said current call record along with said first call record and said second call record.
6. A look-ahead method for predictive dialing, comprising the steps of:
storing a first call record, said first call record comprising a first group of input parameters and a first dial action, said input parameters comprising a first date, a first time of day, and a first workload factor;
storing a second call record, said second call record comprising a second group of input parameters and a second dial action, said input parameters comprising a second date, a second time of day, and a second workload factor; and learning from said first call record and said second call record a relationship between said first and second groups of input parameters and said first and second dial actions.
7. The method of claim 6, further comprising the step of creating a current call record containing a current date, a current time, and a current workload factor.
8. The method of claim 7, further comprising the step of determining whether a call should be initiated at said current time, said current date, and said current workload factor based on said relationship learned by said learning step.
9. The method of claim 8, further comprising the step of initiating a call if said relationship between said group of input parameters and said dial actions indicates that said call should be initiated at said current time, said current date, and said current workload factor.
10. The method of claim 8, further comprising the steps of:
appending a current dial action onto said current call record, said current dial action responsive to said determining step; and storing said current call record along with said first call record and said second call record.
11. A look-ahead method for predictive dialing using a neural network, comprising the steps of:
storing a first call record, said first call record comprising a first group of input parameters, said first group of input parameters comprising a first date, a first time of day, and a first workload factor;
storing a second call record, said second call record comprising a second group of input parameters, said second group of input parameters comprising a second date, a second time of day, and a second workload factor; and said neural network learning from said first call record and said second call record a relationship between said first and second groups of input parameters and a plurality of dial actions.
12. The method of claim 11, further comprising the step of creating a current call record containing a current date, a current time, and a current workload factor.
13. The method of claim 12, further comprising the step of said neural network determining whether a call should be initiated at said current time, said current date, and said current workload factor based on said relationship between said group of input parameters and said dial actions learned by said learning step from said stored plurality of call records.
14. The method of claim 13, further comprising the step of initiating a call if said relationship between said group of input parameters and said dial actions indicates that said call should be initiated at said current time, said current date, and said current workload factor.
15. The method of claim 13, further comprising the steps of:
appending a current dial action onto said current call record, said current dial action responsive to said determining step; and storing said current call record along with said first call record and said second call record.
16. The method of claim 15, further comprising the steps of:
analyzing said first, said second, and said current call records to determine if a correct dial action was made;
changing said dial action to said correct dial action if said correct dial action was not made.
17. The method of claim 16, further comprising the step of the neural network re-learning from said first, said second, and said current call records a relationship between said group of input parameters and said dial actions.
18. A computer system for predictive dialing, comprising:
means for storing a first call record, said first call record comprising a first group of input parameters and a first dial action, said input parameters comprising a first date, a first time of day, and a first workload factor;
means for storing a second call record, said second call record comprising a second group of input parameters and a second dial action, said input parameters comprising a second date, a second time of day, and a second workload factor; and means for learning from said first call record and said second call record a relationship between said first and second groups of input parameters and said first and second dial actions.
19. The computer system of claim 18, further comprising means for creating a current call record containing a current date, a current time, and a current workload factor.
20. The computer system of claim 19, further comprising means for determining whether a call should be initiated at said current time, said current date, and said current workload factor based on said relationship learned by said learning step.
21. The computer system of claim 20, further comprising means for initiating a call if said relationship between said group of input parameters and said dial action indicates that said call should be initiated at said current time, said current date, and said current workload factor.
22. The computer system of claim 20, further comprising:
means for appending a current dial action onto said current call record, said current dial action responsive to said determining step; and means for storing said current call record along with said first call record and said second call record.
23. The computer system of claim 22, further comprising:
means for analyzing said first, said second, and said current call records to determine if a correct dial action was made;
means for changing said dial action to said correct dial action if said correct dial action was not made.
24. The computer system of claim 23, further comprising the neural network re-learning from said first, said second, and said current call records a relationship between said group of input parameters and said dial actions.
25. A program product for predictive dialing, comprising;
a first call record, said first call record comprising a first group of input parameters and a first dial action, said input parameters comprising a first date, a first time of day, and a first workload factor;
a second call record, said second call record comprising a second group of input parameters and a second dial action, said input parameters comprising a second date, a second time of day, and a second workload factor; and means for learning from said first call record and said second call record a relationship between said first and second groups of input parameters and said first and second dial actions.
26. The program product of claim 25, further comprising means for creating a current call record containing a current date, a current time, and a current workload factor.
27. The program product of claim 26, further comprising means for determining whether a call should be initiated at said current time, said current date, and said current workload factor based on said relationship learned by said learning step.
28. The program product of claim 27, further comprising means for initiating a call if said relationship between said group of input parameters and said dial actions indicates that said call should be initiated at said current time, said current date, and said current workload factor.
29. The program product of claim 27, further comprising:
means for appending a current dial action onto said current call record, said current dial action responsive to said determining step; and means for storing said current call record along with said first call record and said second call record.
30. The program product of claim 29, further comprising:
means for analyzing said first, said second, and said current call records to determine if a correct dial action was made;
means for changing said dial action to said correct dial action if said correct dial action was not made.
31. The program product of claim 30, further comprising means for the neural network re-learning from said first, said second, and said current call records a relationship between said group of input parameters and said dial actions.
32. A predictive dialing system, comprising:
a computer system having a processor and storage;
a telephone switch connected to said computer;
said computer system further comprising:
means for storing a first call record, said first call record comprising a first group of input parameters and a first dial action, said input parameters comprising a first date, a first time of day, and a first workload factor;
means for storing a second call record, said second call record comprising a second group of input parameters and a second dial action, said input parameters comprising a second date, a second time of day, and a second workload factor; and means for learning from said first call record and said second call record a relationship between said first and second groups of input parameters and said first and second dial actions;
means for creating a current call record containing a current date, a current time, and a current workload factor;
means for determining whether a call should be initiated at said current time, said current date, and said current workload factor based on said relationship learned by said learning step;
and means for instructing said telephone switch to connect an operator telephone with an external telephone if said relationship between said group of input parameters and said dial actions indicates that said connection should be initiated at said current time, said current date, and said current workload factor.
33. The predictive dialing system of claim 32, further comprising:
means for appending a current dial action onto said current call record, said current dial action responsive to said determining step; and means for storing said current call record along with said first call record and said second call record.
34. A predictive dialing system, comprising:
a computer system, comprising:
a processor;
storage;
a neural network;
a telephone switch connected to said computer;
said neural network further comprising:
means for storing a first call record, said first call record comprising a first group of input parameters and a first dial action, said input parameters comprising a first date, a first time of day, and a first workload factor;
means for storing a second call record, said second call record comprising a second group of input parameters and a second dial action, said input parameters comprising a second date, a second time of day, and a second workload factor; and means for learning from said first call record and said second call record a relationship between said first and second groups of input parameters and said first and second dial actions, means for creating a current call record containing a current date, a current time, and a current workload factor;
means for determining whether a call should be initiated at said current time, said current date, and said current workload factor based on said relationship learned by said learning step;
and means for instructing said telephone switch to connect an operator telephone with an external telephone if said relationship between said group of input parameters and said dial actions indicates that said connection should be initiated at said current time, said current date, and said current workload factor.
35. The predictive dialing system of claim 34, further comprising:
means for appending a current dial action onto said current call record, said current dial action responsive to said determining step; and means for storing said current call record along with said first call record and said second call record.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US07/626,670 US5155763A (en) | 1990-12-11 | 1990-12-11 | Look ahead method and apparatus for predictive dialing using a neural network |
US626,670 | 1990-12-11 |
Publications (2)
Publication Number | Publication Date |
---|---|
CA2054631A1 CA2054631A1 (en) | 1992-06-12 |
CA2054631C true CA2054631C (en) | 1996-07-23 |
Family
ID=24511338
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA002054631A Expired - Fee Related CA2054631C (en) | 1990-12-11 | 1991-10-31 | Look-ahead method and apparatus for predictive dialing using a neural network |
Country Status (4)
Country | Link |
---|---|
US (1) | US5155763A (en) |
EP (1) | EP0493292A3 (en) |
JP (1) | JP2823100B2 (en) |
CA (1) | CA2054631C (en) |
Families Citing this family (131)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5327490A (en) * | 1991-02-19 | 1994-07-05 | Intervoice, Inc. | System and method for controlling call placement rate for telephone communication systems |
US5295184A (en) * | 1991-05-30 | 1994-03-15 | Davox Corporation | Dynamically adjustable call pacing system |
CA2147601C (en) * | 1992-10-21 | 1998-08-25 | Norman J. Donaghue, Jr. | Integrated intelligent call blending |
US5436963A (en) * | 1992-12-30 | 1995-07-25 | International Business Machines Corporation | Telephone answering method and apparatus |
US5343518A (en) * | 1993-01-14 | 1994-08-30 | Davox Corporation | System and method for controlling the dialing order of call record lists in an automated dialing system |
US5461699A (en) * | 1993-10-25 | 1995-10-24 | International Business Machines Corporation | Forecasting using a neural network and a statistical forecast |
US5436965A (en) * | 1993-11-16 | 1995-07-25 | Automated Systems And Programming, Inc. | Method and system for optimization of telephone contact campaigns |
US5444820A (en) * | 1993-12-09 | 1995-08-22 | Long Island Lighting Company | Adaptive system and method for predicting response times in a service environment |
US5561711A (en) * | 1994-03-09 | 1996-10-01 | Us West Technologies, Inc. | Predictive calling scheduling system and method |
US5592543A (en) * | 1994-06-01 | 1997-01-07 | Davox Corporation | Method and system for allocating agent resources to a telephone call campaign |
US5570419A (en) * | 1995-10-13 | 1996-10-29 | Intervoice Limited Partnership | System and method for an improved predictive dialer |
US5953393A (en) * | 1996-07-15 | 1999-09-14 | At&T Corp. | Personal telephone agent |
US5905789A (en) * | 1996-10-07 | 1999-05-18 | Northern Telecom Limited | Call-forwarding system using adaptive model of user behavior |
US6167117A (en) * | 1996-10-07 | 2000-12-26 | Nortel Networks Limited | Voice-dialing system using model of calling behavior |
US5917891A (en) * | 1996-10-07 | 1999-06-29 | Northern Telecom, Limited | Voice-dialing system using adaptive model of calling behavior |
US5912949A (en) * | 1996-11-05 | 1999-06-15 | Northern Telecom Limited | Voice-dialing system using both spoken names and initials in recognition |
JP3767954B2 (en) * | 1996-11-07 | 2006-04-19 | 富士通株式会社 | Demand forecasting device |
US6208713B1 (en) | 1996-12-05 | 2001-03-27 | Nortel Networks Limited | Method and apparatus for locating a desired record in a plurality of records in an input recognizing telephone directory |
US6005927A (en) * | 1996-12-16 | 1999-12-21 | Northern Telecom Limited | Telephone directory apparatus and method |
GB2321364A (en) * | 1997-01-21 | 1998-07-22 | Northern Telecom Ltd | Retraining neural network |
GB2321362A (en) * | 1997-01-21 | 1998-07-22 | Northern Telecom Ltd | Generic processing capability |
GB2321363A (en) * | 1997-01-21 | 1998-07-22 | Northern Telecom Ltd | Telecommunications |
US6021191A (en) * | 1997-03-05 | 2000-02-01 | At&T Corp. | Automatic routing technique |
US6192354B1 (en) | 1997-03-21 | 2001-02-20 | International Business Machines Corporation | Apparatus and method for optimizing the performance of computer tasks using multiple intelligent agents having varied degrees of domain knowledge |
TW504632B (en) | 1997-03-21 | 2002-10-01 | Ibm | Apparatus and method for optimizing the performance of computer tasks using intelligent agent with multiple program modules having varied degrees of domain knowledge |
US6085178A (en) * | 1997-03-21 | 2000-07-04 | International Business Machines Corporation | Apparatus and method for communicating between an intelligent agent and client computer process using disguised messages |
US6401080B1 (en) | 1997-03-21 | 2002-06-04 | International Business Machines Corporation | Intelligent agent with negotiation capability and method of negotiation therewith |
US6016342A (en) * | 1997-06-18 | 2000-01-18 | At&T Corp. | Telecommunications apparatus for initiating calls based on past calling patterns |
US6782087B1 (en) | 1997-09-19 | 2004-08-24 | Mci Communications Corporation | Desktop telephony application program for a call center agent |
US6192121B1 (en) | 1997-09-19 | 2001-02-20 | Mci Communications Corporation | Telephony server application program interface API |
US6490350B2 (en) | 1997-09-30 | 2002-12-03 | Mci Communications Corporation | Monitoring system for telephony resources in a call center |
US6084954A (en) * | 1997-09-30 | 2000-07-04 | Lucent Technologies Inc. | System and method for correlating incoming and outgoing telephone calls using predictive logic |
US6954529B2 (en) * | 1997-10-17 | 2005-10-11 | Debra Ann Marie Gill | Recordal of call results in a predictive dialing application |
GB2347583B (en) * | 1997-12-19 | 2003-03-12 | Blake Rice | Automated right-party contact telephone system |
JPH11239220A (en) * | 1998-02-24 | 1999-08-31 | Fujitsu Ltd | Exchange |
US6137862A (en) * | 1998-03-16 | 2000-10-24 | Mci Communications Corporation | Failover mechanism for computer/telephony integration monitoring server |
US6665271B1 (en) * | 1998-03-17 | 2003-12-16 | Transnexus, Llc | System for real-time prediction of quality for internet-based multimedia communications |
DE19923622A1 (en) * | 1998-08-31 | 2000-03-02 | Ralf Steiner | Neural network for computer controlled knowledge management has elements weighted according to characteristics of Hilbert space |
US7110526B1 (en) | 1998-10-14 | 2006-09-19 | Rockwell Electronic Commerce Technologies, Llc | Neural network for controlling calls in a telephone switch |
CA2256119C (en) | 1998-12-16 | 2002-02-12 | Ibm Canada Limited-Ibm Canada Limitee | Time slot based call pacing method and apparatus |
WO2000072256A2 (en) * | 1999-05-24 | 2000-11-30 | Ipcentury Ag | Neuronal network for computer-assisted knowledge management |
US7444407B2 (en) * | 2000-06-29 | 2008-10-28 | Transnexus, Inc. | Intelligent end user devices for clearinghouse services in an internet telephony system |
WO2001047232A2 (en) | 1999-12-22 | 2001-06-28 | Transnexus, Inc. | Secure enrollment of a device with a clearinghouse server for internet telephony system |
US6556671B1 (en) * | 2000-05-31 | 2003-04-29 | Genesys Telecommunications Laboratories, Inc. | Fuzzy-logic routing system for call routing with-in communication centers and in other telephony environments |
EP1319281B1 (en) * | 2000-09-11 | 2007-05-09 | TransNexus, Inc. | Clearinghouse server for internet telephony and multimedia communications |
US6754236B1 (en) * | 2000-10-03 | 2004-06-22 | Concerto Software, Inc. | System and method for dialing in a telephony system using a common channel signaling protocol in which the use of bearer channels is maximized |
US7047227B2 (en) * | 2000-12-22 | 2006-05-16 | Voxage, Ltd. | Interface between vendors and customers that uses intelligent agents |
US20020082912A1 (en) * | 2000-12-22 | 2002-06-27 | Leon Batachia | Transactions between vendors and customers using push/pull model |
US7525956B2 (en) | 2001-01-11 | 2009-04-28 | Transnexus, Inc. | Architectures for clearing and settlement services between internet telephony clearinghouses |
US20020172349A1 (en) * | 2001-05-17 | 2002-11-21 | Shea Phillip N. | Neural net-call progress tone detector |
US6757694B2 (en) * | 2001-10-03 | 2004-06-29 | International Business Machines Corporation | System and method for logically assigning unique names to devices in a storage system |
CA2403043A1 (en) * | 2002-09-10 | 2004-03-10 | Keith Gill | Multiline dialing apparatus with interchangeable subscriber service capability |
CA2407991A1 (en) * | 2002-10-15 | 2004-04-15 | Keith Gill | Expandable multiline dialing apparatus |
DE10322634A1 (en) * | 2003-05-20 | 2004-12-23 | TOS Team für Organisation und Systeme GmbH | Control method for a communication event, especially a telephone call, message transfer or data transmission, whereby communication initiator and target resource profiles are defined and used when generating a communication link |
US7072460B2 (en) * | 2003-05-27 | 2006-07-04 | Vtech Telecommunications Limited | System and method for retrieving telephone numbers |
US6929507B2 (en) * | 2003-12-30 | 2005-08-16 | Huang Liang Precision Enterprise Co., Ltd. | Coaxial connector structure |
WO2005089147A2 (en) * | 2004-03-11 | 2005-09-29 | Transnexus, Inc. | Method and system for routing calls over a packet switched computer network |
US8238329B2 (en) | 2005-12-13 | 2012-08-07 | Transnexus, Inc. | Method and system for securely authorizing VoIP interconnections between anonymous peers of VoIP networks |
WO2006065789A2 (en) * | 2004-12-13 | 2006-06-22 | Transnexus, Inc. | Method and system for securely authorizing anonymous peers of voip networks |
WO2007027427A2 (en) * | 2005-08-29 | 2007-03-08 | Wms Gaming Inc. | On-the-fly encryption on a gaming machine |
US9787841B2 (en) | 2008-01-28 | 2017-10-10 | Afiniti Europe Technologies Limited | Techniques for hybrid behavioral pairing in a contact center system |
US20090232294A1 (en) * | 2008-01-28 | 2009-09-17 | Qiaobing Xie | Skipping a caller in queue for a call routing center |
US9654641B1 (en) | 2008-01-28 | 2017-05-16 | Afiniti International Holdings, Ltd. | Systems and methods for routing callers to an agent in a contact center |
US8824658B2 (en) | 2008-11-06 | 2014-09-02 | Satmap International Holdings Limited | Selective mapping of callers in a call center routing system |
US8903079B2 (en) | 2008-01-28 | 2014-12-02 | Satmap International Holdings Limited | Routing callers from a set of callers based on caller data |
US9781269B2 (en) | 2008-01-28 | 2017-10-03 | Afiniti Europe Technologies Limited | Techniques for hybrid behavioral pairing in a contact center system |
US9692898B1 (en) | 2008-01-28 | 2017-06-27 | Afiniti Europe Technologies Limited | Techniques for benchmarking paring strategies in a contact center system |
US10708431B2 (en) | 2008-01-28 | 2020-07-07 | Afiniti Europe Technologies Limited | Techniques for hybrid behavioral pairing in a contact center system |
US9712676B1 (en) | 2008-01-28 | 2017-07-18 | Afiniti Europe Technologies Limited | Techniques for benchmarking pairing strategies in a contact center system |
US10708430B2 (en) | 2008-01-28 | 2020-07-07 | Afiniti Europe Technologies Limited | Techniques for benchmarking pairing strategies in a contact center system |
US8781100B2 (en) | 2008-01-28 | 2014-07-15 | Satmap International Holdings Limited | Probability multiplier process for call center routing |
US9774740B2 (en) | 2008-01-28 | 2017-09-26 | Afiniti Europe Technologies Limited | Techniques for benchmarking pairing strategies in a contact center system |
US8879715B2 (en) | 2012-03-26 | 2014-11-04 | Satmap International Holdings Limited | Call mapping systems and methods using variance algorithm (VA) and/or distribution compensation |
US10567586B2 (en) | 2008-11-06 | 2020-02-18 | Afiniti Europe Technologies Limited | Pooling callers for matching to agents based on pattern matching algorithms |
US8670548B2 (en) | 2008-01-28 | 2014-03-11 | Satmap International Holdings Limited | Jumping callers held in queue for a call center routing system |
US10750023B2 (en) | 2008-01-28 | 2020-08-18 | Afiniti Europe Technologies Limited | Techniques for hybrid behavioral pairing in a contact center system |
US9300802B1 (en) | 2008-01-28 | 2016-03-29 | Satmap International Holdings Limited | Techniques for behavioral pairing in a contact center system |
US9712679B2 (en) | 2008-01-28 | 2017-07-18 | Afiniti International Holdings, Ltd. | Systems and methods for routing callers to an agent in a contact center |
US8718271B2 (en) | 2008-01-28 | 2014-05-06 | Satmap International Holdings Limited | Call routing methods and systems based on multiple variable standardized scoring |
US20100020959A1 (en) * | 2008-07-28 | 2010-01-28 | The Resource Group International Ltd | Routing callers to agents based on personality data of agents |
US8644490B2 (en) | 2008-08-29 | 2014-02-04 | Satmap International Holdings Limited | Shadow queue for callers in a performance/pattern matching based call routing system |
US8781106B2 (en) | 2008-08-29 | 2014-07-15 | Satmap International Holdings Limited | Agent satisfaction data for call routing based on pattern matching algorithm |
CN102301688B (en) * | 2008-11-06 | 2014-05-21 | 资源集团国际有限公司 | Systems And Methods In A Call Center Routing System |
US8472611B2 (en) | 2008-11-06 | 2013-06-25 | The Resource Group International Ltd. | Balancing multiple computer models in a call center routing system |
USRE48412E1 (en) | 2008-11-06 | 2021-01-26 | Afiniti, Ltd. | Balancing multiple computer models in a call center routing system |
US8634542B2 (en) * | 2008-12-09 | 2014-01-21 | Satmap International Holdings Limited | Separate pattern matching algorithms and computer models based on available caller data |
US8295471B2 (en) | 2009-01-16 | 2012-10-23 | The Resource Group International | Selective mapping of callers in a call-center routing system based on individual agent settings |
US10594870B2 (en) | 2009-01-21 | 2020-03-17 | Truaxis, Llc | System and method for matching a savings opportunity using census data |
US10504126B2 (en) * | 2009-01-21 | 2019-12-10 | Truaxis, Llc | System and method of obtaining merchant sales information for marketing or sales teams |
US8724797B2 (en) | 2010-08-26 | 2014-05-13 | Satmap International Holdings Limited | Estimating agent performance in a call routing center system |
US8699694B2 (en) | 2010-08-26 | 2014-04-15 | Satmap International Holdings Limited | Precalculated caller-agent pairs for a call center routing system |
US8750488B2 (en) | 2010-08-31 | 2014-06-10 | Satmap International Holdings Limited | Predicted call time as routing variable in a call routing center system |
US9025757B2 (en) | 2012-03-26 | 2015-05-05 | Satmap International Holdings Limited | Call mapping systems and methods using bayesian mean regression (BMR) |
US8792630B2 (en) | 2012-09-24 | 2014-07-29 | Satmap International Holdings Limited | Use of abstracted data in pattern matching system |
CN113095662B (en) | 2015-12-01 | 2024-03-19 | 阿菲尼帝有限公司 | Techniques for case distribution |
US10142473B1 (en) | 2016-06-08 | 2018-11-27 | Afiniti Europe Technologies Limited | Techniques for benchmarking performance in a contact center system |
US9692899B1 (en) | 2016-08-30 | 2017-06-27 | Afiniti Europe Technologies Limited | Techniques for benchmarking pairing strategies in a contact center system |
US9723144B1 (en) | 2016-09-20 | 2017-08-01 | Noble Systems Corporation | Utilizing predictive models to improve predictive dialer pacing capabilities |
US9888121B1 (en) | 2016-12-13 | 2018-02-06 | Afiniti Europe Technologies Limited | Techniques for behavioral pairing model evaluation in a contact center system |
US9955013B1 (en) | 2016-12-30 | 2018-04-24 | Afiniti Europe Technologies Limited | Techniques for L3 pairing in a contact center system |
US11831808B2 (en) | 2016-12-30 | 2023-11-28 | Afiniti, Ltd. | Contact center system |
US10320984B2 (en) | 2016-12-30 | 2019-06-11 | Afiniti Europe Technologies Limited | Techniques for L3 pairing in a contact center system |
US10326882B2 (en) | 2016-12-30 | 2019-06-18 | Afiniti Europe Technologies Limited | Techniques for workforce management in a contact center system |
US10257354B2 (en) | 2016-12-30 | 2019-04-09 | Afiniti Europe Technologies Limited | Techniques for L3 pairing in a contact center system |
US10762417B2 (en) * | 2017-02-10 | 2020-09-01 | Synaptics Incorporated | Efficient connectionist temporal classification for binary classification |
US11100932B2 (en) * | 2017-02-10 | 2021-08-24 | Synaptics Incorporated | Robust start-end point detection algorithm using neural network |
US11080600B2 (en) * | 2017-02-10 | 2021-08-03 | Synaptics Incorporated | Recurrent neural network based acoustic event classification using complement rule |
US11087213B2 (en) * | 2017-02-10 | 2021-08-10 | Synaptics Incorporated | Binary and multi-class classification systems and methods using one spike connectionist temporal classification |
US10762891B2 (en) * | 2017-02-10 | 2020-09-01 | Synaptics Incorporated | Binary and multi-class classification systems and methods using connectionist temporal classification |
US11853884B2 (en) * | 2017-02-10 | 2023-12-26 | Synaptics Incorporated | Many or one detection classification systems and methods |
US10135986B1 (en) | 2017-02-21 | 2018-11-20 | Afiniti International Holdings, Ltd. | Techniques for behavioral pairing model evaluation in a contact center system |
US10762427B2 (en) * | 2017-03-01 | 2020-09-01 | Synaptics Incorporated | Connectionist temporal classification using segmented labeled sequence data |
US10970658B2 (en) | 2017-04-05 | 2021-04-06 | Afiniti, Ltd. | Techniques for behavioral pairing in a dispatch center system |
US9930180B1 (en) | 2017-04-28 | 2018-03-27 | Afiniti, Ltd. | Techniques for behavioral pairing in a contact center system |
US10122860B1 (en) | 2017-07-10 | 2018-11-06 | Afiniti Europe Technologies Limited | Techniques for estimating expected performance in a task assignment system |
US10509669B2 (en) | 2017-11-08 | 2019-12-17 | Afiniti Europe Technologies Limited | Techniques for benchmarking pairing strategies in a task assignment system |
US10110746B1 (en) | 2017-11-08 | 2018-10-23 | Afiniti Europe Technologies Limited | Techniques for benchmarking pairing strategies in a task assignment system |
US11399096B2 (en) | 2017-11-29 | 2022-07-26 | Afiniti, Ltd. | Techniques for data matching in a contact center system |
US10509671B2 (en) | 2017-12-11 | 2019-12-17 | Afiniti Europe Technologies Limited | Techniques for behavioral pairing in a task assignment system |
US10623565B2 (en) | 2018-02-09 | 2020-04-14 | Afiniti Europe Technologies Limited | Techniques for behavioral pairing in a contact center system |
US11250359B2 (en) | 2018-05-30 | 2022-02-15 | Afiniti, Ltd. | Techniques for workforce management in a task assignment system |
US10496438B1 (en) | 2018-09-28 | 2019-12-03 | Afiniti, Ltd. | Techniques for adapting behavioral pairing to runtime conditions in a task assignment system |
US10867263B2 (en) | 2018-12-04 | 2020-12-15 | Afiniti, Ltd. | Techniques for behavioral pairing in a multistage task assignment system |
US11144344B2 (en) | 2019-01-17 | 2021-10-12 | Afiniti, Ltd. | Techniques for behavioral pairing in a task assignment system |
US10757261B1 (en) | 2019-08-12 | 2020-08-25 | Afiniti, Ltd. | Techniques for pairing contacts and agents in a contact center system |
US11445062B2 (en) | 2019-08-26 | 2022-09-13 | Afiniti, Ltd. | Techniques for behavioral pairing in a task assignment system |
US10757262B1 (en) | 2019-09-19 | 2020-08-25 | Afiniti, Ltd. | Techniques for decisioning behavioral pairing in a task assignment system |
CN111325382B (en) * | 2020-01-23 | 2022-06-28 | 北京百度网讯科技有限公司 | Method and device for predicting free parking space of parking lot, electronic equipment and storage medium |
WO2021158436A1 (en) | 2020-02-03 | 2021-08-12 | Afiniti, Ltd. | Techniques for behavioral pairing in a task assignment system |
CN115244513A (en) | 2020-02-04 | 2022-10-25 | 阿菲尼帝有限公司 | Techniques for error handling in a task distribution system with an external pairing system |
US11050886B1 (en) | 2020-02-05 | 2021-06-29 | Afiniti, Ltd. | Techniques for sharing control of assigning tasks between an external pairing system and a task assignment system with an internal pairing system |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS63104596A (en) * | 1986-10-21 | 1988-05-10 | Canon Inc | Key telephone system |
US4858120A (en) * | 1987-03-18 | 1989-08-15 | International Telesystems Corp. | System for regulating arrivals of customers to servers |
US4881261A (en) * | 1988-06-29 | 1989-11-14 | Rockwell International Corporation | Method for predictive pacing of calls in a calling system |
JPH0211051A (en) * | 1988-06-29 | 1990-01-16 | Mitsubishi Electric Corp | Telephone facility with learning function |
US4933964A (en) * | 1989-07-25 | 1990-06-12 | International Telesystems Corporation | Pacing of telephone calls for call origination management systems |
- 1990
  - 1990-12-11 US US07/626,670 patent/US5155763A/en not_active Expired - Fee Related
- 1991
  - 1991-10-31 CA CA002054631A patent/CA2054631C/en not_active Expired - Fee Related
  - 1991-11-15 EP EP19910480171 patent/EP0493292A3/en not_active Withdrawn
  - 1991-12-10 JP JP3326014A patent/JP2823100B2/en not_active Expired - Fee Related
Also Published As
Publication number | Publication date |
---|---|
EP0493292A2 (en) | 1992-07-01 |
EP0493292A3 (en) | 1993-08-18 |
US5155763A (en) | 1992-10-13 |
CA2054631A1 (en) | 1992-06-12 |
JP2823100B2 (en) | 1998-11-11 |
JPH05308406A (en) | 1993-11-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CA2054631C (en) | Look-ahead method and apparatus for predictive dialing using a neural network | |
EP0962088B1 (en) | Call-forwarding system using adaptive model of user behavior | |
US7177880B2 (en) | Method of creating and displaying relationship chains between users of a computerized network | |
Conrath | Communications environment and its relationship to organizational structure | |
US5617511A (en) | Neural network shell for application programs | |
US5715371A (en) | Personal computer-based intelligent networks | |
Mozer et al. | Predicting subscriber dissatisfaction and improving retention in the wireless telecommunications industry | |
Musa | The operational profile | |
EP0443976A2 (en) | Neural network shell for application programs | |
US20020029154A1 (en) | Mechanism and method for dynamic question handling through an electronic interface | |
Corkill et al. | Unifying Data-Directed and Goal-Directed Control: An Example and Experiments. | |
Horvitz et al. | Bayesphone: Precomputation of context-sensitive policies for inquiry and action in mobile devices | |
Handschin et al. | Object-oriented software engineering for transmission planning in open access schemes | |
GB2366421A (en) | Ordering relational database operations according to referential integrity constraints | |
EP0405876B1 (en) | Apparatus and method for computer-aided decision making | |
Dror et al. | Modeling Uncertainty: an examination of stochastic theory, methods, and applications | |
Baldwin et al. | Inference and learning in fuzzy Bayesian networks | |
Davis | Rapid prototyping using executable requirements specifications | |
Kursawe | Evolution strategies: Simple models of natural processes? | |
JPH06502751A (en) | Systems and processes for specifying personalized telecommunications services | |
Cesta et al. | O-Oscar: A flexible object-oriented architecture for schedule management in space applications | |
CN110287304A (en) | Question and answer information processing method, device and computer equipment | |
Taxén | A Strategy for Organisational Knowledge Evolution | |
CN114218355A (en) | Call control method, device, equipment and storage medium | |
Karam | Effectiveness assessment of the METANET demonstration |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| EEER | Examination request | |
| MKLA | Lapsed | |