WO1997042615A1 - Method and apparatus for computer-based educational testing - Google Patents

Method and apparatus for computer-based educational testing

Info

Publication number
WO1997042615A1
WO1997042615A1 (PCT/US1997/008566)
Authority
WO
WIPO (PCT)
Prior art keywords
test
taker
test result
computer
testing
Prior art date
Application number
PCT/US1997/008566
Other languages
French (fr)
Inventor
Jay S. Walker
Bruce Schneier
James A. Jorasch
Original Assignee
Walker Asset Management Limited Partnership
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
First worldwide family litigation filed litigation Critical https://patents.darts-ip.com/?family=24596414&utm_source=google_patent&utm_medium=platform_link&utm_campaign=public_patent_search&patent=WO1997042615(A1) "Global patent litigation dataset" by Darts-ip is licensed under a Creative Commons Attribution 4.0 International License.
Application filed by Walker Asset Management Limited Partnership filed Critical Walker Asset Management Limited Partnership
Priority to AU31352/97A priority Critical patent/AU3135297A/en
Publication of WO1997042615A1 publication Critical patent/WO1997042615A1/en

Classifications

    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B7/00 - Electrically-operated teaching apparatus or devices working with questions and answers
    • G09B7/02 - Electrically-operated teaching apparatus or devices working with questions and answers of the type wherein the student is expected to construct an answer to the question which is presented or wherein the machine gives an answer to the question presented by a student

Definitions

  • the present invention relates generally to methods and apparatuses for computer-based education. More particularly, the invention relates to computer-based assessment of an individual's educational performance relative to selected comparative norms.
  • Software which blends both fun and learning is often referred to as "edutainment" software.
  • Popular edutainment programs include "Mathblaster," "Where in the World is Carmen Sandiego," and "Word Munchers." These edutainment programs (as well as certain other educational programs generally) present players with a series of increasingly difficult tests or puzzles, wherein players must correctly solve the present round before they are allowed to continue to the next round of play.
  • In the above-mentioned edutainment software, each player exists in isolation from all other players.
  • This capability permits the software to self-adjust to the skill strengths and weaknesses of individual players when measured against predetermined (e.g., programmed into the software at the time of manufacture) and absolute (e.g., percentage of questions answered correctly) norms.
  • a drawback of the existing software is that it cannot take into account relative norms (e.g., comparisons among a group of children currently taking the test), because of the lack of an on-line database of scores.
  • Another category of educational software runs on networked mainframe-and-terminal systems for large-scale, simultaneous testing of groups of students at fixed locations.
  • these centralized systems require a human being (e.g., a parent, teacher, or other proctor) to monitor individual users of the software to prevent cheating.
  • the software could subsequently retain scores in a local database which allows a teacher to recognize the best students and to ensure that all students have used the software to achieve certain minimum requirement levels.
  • Such systems are well-suited for formal testing, but are ill-suited for home-based educational testing because of their hardwired, inflexible configurations and because of the continuous human monitoring requirement.
  • Yet another category of software is used for computerized evaluation of standardized tests taken by school children using paper forms. Groups of students simultaneously record their answers to paper-based, multiple-choice questions by blackening the ovals on Scantron forms which are optically scanned for grading by software running on computers. A standard paper report is then generated and distributed to each student. These reports, such as the Connecticut Mastery Test administered in the Connecticut primary and secondary schools, or the more widely known SATs, measure the student's comparative performance against all others in his school district as well as the state. Although these testing systems provide teachers and parents with an evaluative, reliable comparison as to how the student is performing vs. his peers, they suffer from other shortcomings that make them ill-suited to home-based educational testing.
  • testing methodology (reading printed questions and blackening ovals) is cumbersome and rigid.
  • the methodology does not allow the setting of local performance criteria apart from standardized (but relative) norms programmed into the central mainframe computer.
  • the entire test must be taken and submitted to the central computer for grading before any results become available.
  • a teacher or central authority might want to distribute tests in electronic format for students to take on their home computers. Requiring a simultaneous start time and/or deadline completion for all tested students is necessary to deter cheating in the form of test-taker substitution or a later test-taker obtaining an earlier test-taker's answers to the same test.
  • current software does not allow such a configuration.
  • An object of the invention is to provide a system for computer-implemented education whereby a home user of educational computer software can be evaluated against a wide variety of selected comparative norms. Another object of the invention is to reliably associate a test-taker and his reported test results to a recipient thereof. Another object of the invention is to facilitate automatic monitoring of an unproctored test-taker's test results against objective criteria established by a local supervisor of the test-taker. Another object of the invention is to implement a feedback mechanism whereby a user's future interactions with the educational software can be automatically modified in response to his ongoing or historical performance.
  • Another object of the invention is to facilitate the non-simultaneous testing of test-takers in a manner that deters one test-taker from providing his test results to another test-taker.
  • Another object of the invention is to allow the simultaneous testing of remotely located test-takers in the absence of a system of internetworked and synchronized testing computers.
  • testing software is incorporated into "edutainment" games for reporting a player's overall score, along with other subcategories of information (e.g., subject areas within the test material, time taken to answer questions, etc.), as an x-digit numeric test result (e.g., 2340-9908-0011) or "score code."
  • the test result may also include information as to whether or not the software had been altered during the testing procedure.
  • the score is scrambled to prevent unauthorized access thereto.
  • the system could use a variety of cryptographic protocols to make it difficult to forge and to prevent the user from misrepresenting his result to a database located at the central computer.
  • the player could use a touch-tone phone to call an 800 number to reach a central computer and register his score with an interactive voice response unit.
  • registration may be mandatory or optional.
  • the software may require registration after a certain number of games by denying access until a confirmation code was entered.
  • a live operator might register the player into a database containing important comparative factors such as age, grade, school, address, parent's name, password, etc. This "setup" phase could also be accomplished by mail.
  • the score registration process would be automated.
  • An interactive system would prompt the player step-by-step through the registration process, beginning with the player entering his personal ID number and then his score code.
  • the central computer would decrypt the score code and save the decrypted information in the central database for comparative evaluation against an appropriate test benchmark.
  • Such a comparison would typically be performed according to demographic or geographic norms determined by the test-taker's registration information.
  • the test benchmark could even include the user's own past test results. After making the appropriate comparisons, it would generate a customized "performance indication" to be given to the player.
  • the performance indication could also include a confirmation code to be entered into the test-taker's computer as evidence that he had registered his test results. Prior to acceptance, the testing computer could check the confirmation code to make sure it was valid. Based on the performance indication and/or confirmation code, the software could adjust future interactions with the test-taker based on his past performance. This "reflexive" feature would be based on how comparable children answered the same questions, as opposed to being preprogrammed at the time of manufacture or distribution of the software. For example, if a child is performing significantly below his peers in a particular category, the reflexive feature of the software could direct the software to focus future test questions in that category.
  • Access to this reflexive feature could be restricted by a password held only by the child's parent.
  • a detailed report could be generated by the central computer and mailed to the parent, allowing the parent to carefully review his child's comparative scores. This report could also contain specialized exercises and remedial program recommendations based on the child's results.
  • each test could be composed of different questions selected, in part, based on past performance.
  • the questions could also be randomized, either in order or content, to deter cheating.
  • a test-taker would call an 800 number to register his score code, which could include the serial number of the testing software and/or student ID.
  • the testing computers could be configured for simultaneous testing based on receipt of a starting authorization. It was mentioned previously that the returned performance indication would provide statistical information and/or future test customization to an end user, such as a parent or teacher.
  • the same performance indication could also include a reward attainment message for certifying the test-taker's achievement of a reward threshold, which had been previously registered by a parent, teacher, or other end user at the central computer.
  • the end user could subsequently interrogate the testing computer (or even the central computer itself) to determine if the test-taker had been issued the reward attainment message and to provide a reward in accordance therewith.
  • testing software is located on the testing computer, and answers to pre-programmed questions are scored locally before the test-taker manually transmits the results to the central computer using a voice-based telephonic data input device.
  • the testing computer could connect to the central computer and transmit the score code via an on-line (e.g., Internet, World Wide Web, America Online, etc.) connection.
  • the testing computer could even receive the test questions directly from the central computer rather than pre-storing them locally. This would allow the test administrator great flexibility in modifying and controlling test question distribution and access.
  • the order or content of subsequent test questions could also be interactively customized for the needs of the individual test-taker in accordance with his ongoing or historical performance.
  • the testing computer could transmit ungraded test answers, for grading and performance evaluation at the central computer.
  • remotely-located test- takers can compare their results with other remotely located test-takers. Comparisons can be done on an ongoing basis with no need for an on-line connection, and scoring can be done locally with no need to send all answers to a centralized scoring authority. Alternatively, an on-line connection could be used for even greater flexibility in test question distribution and control.
  • test-taker's score is securely associated with the test-taker's identity, which deters cheating as well as allowing the test results to be certified to a party (e.g., a parent) that did not necessarily witness the test-taking.
  • the parent would not need to monitor the results since they cannot be falsely registered.
  • the parent could even establish his own local criteria for evaluating his child's performance, with the monitoring and evaluation to be performed automatically by the same central computer that certifies the test results.
  • the returned performance indication could also include commands to the testing software to customize future functionality based on past performance.
  • the educational software is not only for testing, but may be used for tutoring or workbook purposes.
  • Another form of cheating, that of test-taker substitution, is also deterred by the inclusion of corroborative data into the encoded test result transmitted from the testing computer to the central computer.
  • corroborative data could include a biometric identifier of the test-taker or of a witness to the testing (e.g., a proctor), other forms of witness identifier entered via a keypad or other input device, or a location datum from a GPS receiver.
  • the corroborative data deters cheating by providing independent evidence of who took the test, who witnessed it, or where it was taken.
  • Figure 1 illustrates the basic components of a system for computer-based educational testing.
  • Figure 2 illustrates one embodiment of a device for administering a test at a test-taker's home.
  • Figure 3 illustrates one embodiment of a central computer for receiving test results from a testing device, performing a comparative evaluation with selected norms, and returning a performance indication for the test-taker.
  • Figure 4 illustrates an exemplary test administration procedure.
  • Figure 5 illustrates an exemplary test result encoding procedure.
  • Figure 6 illustrates an exemplary test result transmission procedure.
  • Figure 7 illustrates an exemplary decoding procedure at the central computer.
  • Figure 8 illustrates an exemplary performance assessment procedure.
  • Figures 9(a) and 9(b) illustrate exemplary procedures for providing the performance indication to the user.
  • Figures 10(a) and 10(b) illustrate an exemplary reward mechanism from the perspectives of (a) the parent and (b) the central computer, respectively.
  • Figures 11(a) and 11(b) illustrate exemplary content of (a) the score code and (b) the performance indication, respectively.
  • Figure 12 illustrates an alternative embodiment of the invention wherein the test results are encoded using a cryptographic hashing operation.
  • Figure 13 illustrates an exemplary process for decoding the encoded test result corresponding to Figure 12.
  • Figure 14 illustrates an exemplary process for test question generation and distribution in a home testing application.
  • Figure 15 illustrates an exemplary process for beginning a testing session in a home testing application.
  • Figure 16 illustrates an exemplary process for test result encoding in a home testing application.
  • Figure 17 illustrates an exemplary process for transmitting encoded test results to the central computer in a home testing application.
  • Figure 18 illustrates an exemplary process for decoding and scoring a received encoded test result at the central computer in a home testing application.
  • Figure 19 illustrates an exemplary process for transmitting the scored test results to a teacher in a home testing application.
  • the system includes a home testing computer for transmitting the student's test results to a central computer which derives a performance assessment of the test-taker.
  • the performance assessment can be standardized or customized, as well as relative or absolute. Further, the transmitted test results are configured to reliably associate the student with his test results, using encoding and/or other user identification techniques, to deter fraud.
  • the system allows a parentally-controlled reward system such that children who reach specified objectives can claim an award that parents are confident was fairly and honestly earned without the parent being required to supervise the game play. Fraud, and the need for proctoring, is also deterred during multiple student testing via an option for "simultaneous testing" of geographically dispersed test-takers.
  • certain ancillary elements used in conjunction with the educational testing device are well understood to those skilled in the art and are not shown in order not to obscure the present invention.
  • the design and construction of clocks, computer memories, and software or hardware cryptographic algorithms are well known to those skilled in the art and will not be described in detail herein.
  • a computer-based educational testing system comprising a testing computer 200 and a central computer 300.
  • a test-taker at testing computer 200 would typically exchange information with the central computer 300 using a telephone 120 connected to a public switched network 110 provided by a local or regional telephone operating company ("TELCO").
  • the educational testing system facilitates the administration of a test, the sending of the test results 1110 to the central computer 300, the verification of those results, the generation of a performance indication 1150, and the transmission of that information to an end user.
  • the end user could be the test-taker himself, his parent, his teacher, a college admissions office, or any other party having an interest in the performance of the test-taker.
  • the performance indication 1150 is shown as being returned over the public switched telephone network 110 to an end user at telephone 120 for entry into the testing computer 200.
  • the testing device 200 could be a conventional personal computer 210 having a CPU 220, an input device 225 (e.g., a keyboard or mouse), one or more communications ports 230, a clock 240, a display driver 245, a display (e.g., a video monitor) 250, RAM 260, ROM 265, and a data storage device 270.
  • the storage device could either be fixed media (e.g., a hard disk) or a drive capable of reading removable media (e.g., a floppy disk or CD-ROM).
  • the storage device may be used to store the educational software 272, the score database 274, and message or audit databases 276.
  • the score database 274 contains the performance indication 1150 received from the central computer 300.
  • the audit database 276 contains audit data produced by the educational software 272, or received from the central computer 300, such as a cumulative record of all tests taken by the test-taker.
  • one or more of the CPU 220, the clock 240, the RAM 260, the ROM 265, or the data storage device 270 can be located within a secure perimeter 280.
  • Secure perimeter 280 may include physical, electronic, or a combination of physical and electronic features to resist tampering.
  • physical features could include encapsulation
  • electronic features could include a silicon firewall
  • combination features could include self-zeroizing, or otherwise volatile, RAM 260 or ROM 265 which electrically modifies its contents upon detection of tampering.
  • Such tampering might include physically stressing the device or attempting to change the clock rate by modifying its power source (not shown in the figures).
  • secure perimeter 280 could be merely tamper-evident.
  • As will be appreciated by those skilled in the art, a great variety of tamper-resistant/tamper-evident techniques can be deployed, and will not be enumerated in detail herein. Therefore, as a matter of convenience, terms such as "tamper resistant", "tamper evident", or "secure" shall be understood to refer to any of the aforementioned or other security measures throughout this discussion.
  • the testing computer 200 may optionally be connected, via communications port 230, to a communication device (e.g., a modem, a network card, or a transmitter) to allow direct communications with the central computer 300.
  • the testing device 200 may also include a biometric reader 280 such as a fingerprint reader or retinal scanner.
  • the central computer 300 includes a central processor 310, cryptographic processor 320, ROM 330, RAM 340, clock 350, and data storage device 360.
  • the cryptographic processor 320 might be a specially secured, dedicated processor separate from the central processor 310.
  • the cryptographic processor 320 could be integrated in the central processor 310.
  • any conventional personal computer, computer workstation, or mainframe computer having sufficient memory and processing capabilities may be used as the central computer 300.
  • the central computer 300 is connected to an interactive voice response unit (IVRU) 370 for receiving test results 1100 (see Figures 1 and 11(a)) from the test-taker via touch-tone signals transmitted over phone network 120.
  • This process will be described in greater detail below with respect to Figure 6.
  • IVRUs are well known in the art (see, e.g., Jerry Fitzgerald, "Business Data Communications - Basic Concepts, Security & Design," 4th ed., John Wiley & Sons, 1993) and need not be described in detail here.
  • the central computer 300 receives the encoded test result from the test-taker, decodes it using central processor 310 and/or cryptographic processor 320, and compares the decoded test results against test benchmarks stored in software database 385 or test result database 390. The central computer 300 then generates a performance indication 1150 (see Figure 1) containing its evaluation and returns the performance indication 1150 to the end user. Because the central computer 300 might be simultaneously performing these operations for a multitude of test-takers, it must be capable of high volume transaction processing and of performing a significant number of mathematical calculations in processing data inquiries. Thus a relatively powerful microprocessor with a wide data bus would be a suitable central processor.
  • the cryptographic processor 320 supports verification of the encoded test result 1100 received by the central computer, as well as encoding of the performance indication 1150, subsequently returned to the end user. Any suitable microprocessor may be used for the cryptographic processor 320. For example, in its 16 MHz configuration, the Motorola MC68HC16's fast 16-bit multiply-and-accumulate instruction requires less than one second to perform a 512-bit RSA private key operation. Other commercially available specialized cryptographic processors include VLSI Technology's 33MHz 6868 or Semaphore Communications' 40 MHz Roadrunner284. Alternatively, the cryptographic processor 320 may also be configured as part of central processor 310.
  • Data storage device 360 reads either fixed storage media within the central computer 300, or removable storage media.
  • test-taker database 380 maintains data on the test-taker and/or end user, including names, personal ID numbers, phone numbers, private key information, e-mail addresses, physical addresses, software owned, etc.
  • Test result database 390 maintains data on all test results sent to the central computer 300.
  • test results will be discussed in greater detail below with respect to Figure 11(a) , but might include various ID numbers (e.g., test-taker, testing device and testing software) , and an indication of answers to the test questions (e.g., questions answered correctly/incorrectly, or slowest/fastest) .
  • test-taker sits down at the testing computer 200 and, at step 420, requests a testing session from the educational software 272.
  • This software is loaded into the RAM 260 of the testing device 200 to be executed by its CPU 220.
  • the testing software instructions appear on the video monitor 250 requesting that the test-taker enter his personal ID number for identification.
  • the testing process begins, with test questions being displayed on the video monitor 250. The test-taker supplies answers through the keyboard or mouse 225, and continues until the testing process is complete. Questions can include true/false, multiple choice, fill- in-the-blank, or any other machine-scorable format.
  • the software 272 scores the test by comparing the answers provided by the test-taker to the correct answers stored within the educational software 272.
  • the test results are incorporated into a numeric score code, which includes not only the number of correct/incorrect answers, but also an indication of which answers were incorrect.
  • the score code might indicate that three answers were wrong, and that questions number six, twelve, and seventeen were missed.
  • the encoded test result 1100 might also include data on the amount of time required to answer each question, as well as identifying information as to the testing computer 200, software 272, and test-taker.
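To make the score code concrete, here is a minimal sketch of one way such a numeric code might be packed. The field layout, widths, and function name are illustrative assumptions; the text specifies only that the result is an x-digit numeric code carrying counts, missed-question numbers, timing, and identifiers.

```python
# Hypothetical packing of a numeric score code; every field width is assumed.
def make_score_code(device_id: int, correct: int, wrong_questions: list[int],
                    elapsed_seconds: int) -> str:
    """Pack test results into a fixed-width digit string suitable for
    reading aloud or keying into a touch-tone phone."""
    wrong = "".join(f"{q:02d}" for q in sorted(wrong_questions))
    return (f"{device_id:04d}{correct:03d}"
            f"{len(wrong_questions):02d}{wrong}{elapsed_seconds:04d}")

# 17 correct answers; questions 6, 12, and 17 missed; 482 seconds; device 731:
print(make_score_code(731, 17, [6, 12, 17], 482))  # "0731017030612170482"
```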
  • Figure 5 illustrates an exemplary embodiment of a process for encoding the test results at the home testing computer 200.
  • the process begins, at step 510 (same as step 460 of Figure 4), with incorporating the test results into a numeric score code 1100.
  • the test results should be encoded using a secret algorithm
  • a suitable cryptographic algorithm is used for encoding the test result.
  • where the cryptographic operation constitutes encryption, RSA could be used for public key (asymmetric) encryption.
  • the keys could be arbitrary, or they could be based on the testing computer ID number or test-taker's personal ID number.
  • encryption in the testing computer's private key is particularly appropriate if authenticity is required
  • encryption in the central computer's public key is appropriate if confidentiality is desired. If both authenticity and confidentiality are desired, a double encryption could use both the testing computer's private key and the central computer's public key.
  • alternatively, a secret key (e.g., DES) algorithm could be used for symmetric encryption.
  • asymmetric or symmetric cryptographic techniques are described herein, those skilled in the art will appreciate that many other cryptographic techniques can also be used, as will be described below in the section entitled “Alternative Embodiments of the Invention.” These encoding protocols can be implemented in software or hardware, and can be made more secure by including the algorithm and keys in secure perimeter 280.
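To illustrate the double-encryption option just described, the following sketch uses RSA from the Python cryptography package. The key sizes, padding schemes, and variable names are assumptions for illustration; the patent does not prescribe them.

```python
# Sketch: sign with the testing computer's private key (authenticity),
# then encrypt with the central computer's public key (confidentiality).
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

testing_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
central_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

score_code = b"2340-9908-0011"  # the x-digit score code from the example above

signature = testing_key.sign(
    score_code,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256())

ciphertext = central_key.public_key().encrypt(
    score_code,
    padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                 algorithm=hashes.SHA256(), label=None))

# Central-computer side: decrypt, then verify the signature.
recovered = central_key.decrypt(
    ciphertext,
    padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                 algorithm=hashes.SHA256(), label=None))
testing_key.public_key().verify(
    signature, recovered,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256())  # raises InvalidSignature if tampered with
```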
  • the encrypted test result 1100 is displayed on the monitor 250 of the testing computer 200, at step 540, along with an 800 number to call, at step 550.
  • test-taker may be given an option whether or not to register his score. If he believes that he has performed poorly, he may decide not to register.
  • the test results could be displayed on the video monitor 250 only in encoded form, which would be unintelligible to the test- taker. Thus, a good result would be indistinguishable from a bad result because neither would appear in plaintext form. This would help discourage fraud in the form of continuous retaking of a test until a good result was achieved.
  • test-taker decides not to register, at step 620, he begins another testing session or exits from the software. If the test-taker decides to register, at step 630, he calls the 800 number displayed on the monitor 250 and, in step 640, connects with the IVRU 370 of the central computer 300. At step 650, if the test-taker has previously registered himself with the central computer 300, he continues with test result transmission at step 680. Alternatively, if the test-taker has not yet registered himself with the central computer 300, the call is transferred at step 655 to a live operator to whom the test-taker provides registration information 660. As shown in block 665, such information might include the test-taker's name, phone number, address, age, school, grade level, end user to receive the performance indication (e.g., a parent or the test-taker himself), educational software ownership, etc. At step 670, a unique personal ID number is then assigned to the test-taker.
  • the central computer creates a database record for this information in the test-taker database 380, which is indexed by personal ID number.
  • the test-taker is then fully registered with the central computer 300 and is ready to transmit the encoded test result 1100.
  • the testing computer 200 displays the encoded test result 1100 to the test-taker, who subsequently manually telephones the central computer 300 and uses his touch-tone keypad to transmit his personal ID number and the encoded test result 1100 to the central computer 300 in response to voice prompts from the IVRU 370.
  • the encoded test result 1100 is sent to the cryptographic processor 320.
  • the central computer 300 looks up the personal ID number in the test-taker database 380 and retrieves the appropriate key to decrypt the encoded test result 1100. If asymmetric cryptography is used, this would be the public key corresponding to the test-taker's private key; if symmetric cryptography is used, this would be the same secret key used for the encryption at the testing computer 200.
  • Upon receipt of the decryption key, at step 730, the cryptographic processor 320 decrypts the encoded test result 1100 at step 740, thereby verifying that the score code was produced by the personal ID provided therein. The decrypted score information is then stored within test result database 390 at step 750.
  • the central computer 300 takes the test result and initiates a search for comparative scores.
  • Within the numeric code of the test result is the software ID number, identifying the software used for that test session.
  • the central computer 300 sorts the test result database 390 by software ID number, retrieving only those records that pertain to the software package 272. These records could be further narrowed, for example, by restricting them to those records generated by test-takers of the same age or grade level.
  • the test-taker's test result is then compared against the performance of an appropriate reference group, or some other test benchmark, to generate a simple statistical comparison such as percentile rank.
  • these comparative data are incorporated into a numeric performance indication 1150, which will be given to the test-taker for input into the software package 272 that generated the test result (see Figure 1).
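The percentile-rank comparison mentioned above can be illustrated with a short sketch; the reference-group scores and the function name are invented for the example.

```python
# Percentile rank of a test-taker's score within a reference group
# (illustrative data; ties count as "at or below").
def percentile_rank(score: float, reference_scores: list[float]) -> float:
    if not reference_scores:
        raise ValueError("empty reference group")
    at_or_below = sum(1 for s in reference_scores if s <= score)
    return 100.0 * at_or_below / len(reference_scores)

peers = [61, 74, 78, 80, 85, 88, 90, 93]   # e.g., same software ID and grade
print(percentile_rank(85, peers))           # 62.5
```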
  • the central computer 300 can generate a control message that directs the testing computer's software package 272 to focus on that particular weakness during subsequent testing sessions by modifying the testing computer's operation in response to the control message. For example, if the test-taker is consistently missing fractions questions during math tests, the testing computer 200 could be directed to tutor the test-taker in that subject or ask more questions of that type on the next math test.
  • Each software package could have hundreds of methods to focus future tests, each method tied to a control message stored in a software database 385 of the central computer 300 and in message database 276 of testing computer 200. In the above example, focusing on fractions might be expressed as control message "324" of the software database record for the math software.
  • control messages are included along with the evaluative data in the performance indication 1150 before it is provided to the test-taker (or other end user) at the end of the test result transmission call.
  • the test-taker enters the performance indication into the educational software 272 that generated the test result, triggering the software 272 to display the evaluative data generated by the central computer 300.
  • the performance indication 1150 could use cryptographic techniques, similar to those used for the test results, to ensure authenticity and/or integrity.
  • FIG 9(a) there is shown an exemplary embodiment of a process for providing the performance indication 1150 to the test-taker.
  • the test- taker enters the performance indication 1150 into the testing computer 200, which verifies the performance indication 1150 to ensure that it is valid.
  • the testing computer 200 displays basic statistics as to the relative performance of the test-taker.
  • the amount of information contained within the performance indication is limited because it is transmitted between the test-taker (or his parent) and the central computer 300 in audio form over the telephone 120 of Figure 1; limiting factors include the amount of data the test-taker is willing to enter and the rate at which the test-taker can listen to and enter the data.
  • other transmission methods, such as the use of Dual-Tone Multi-Frequency ("DTMF") or other modulated tones, can significantly increase the amount of information transferred without requiring the test-taker to manually enter the data.
  • an acoustic coupler/demodulator associated with the testing computer 200 would convert the audio DTMF tones into a digital bitstream without any human intervention.
  • modulated tones could also be used for the initial transmission of the encoded test result 1100 from the testing computer 200 to the central computer 300.
  • the central computer 300 could also generate a detailed comparison report to be mailed to an end user (perhaps the test-taker himself), as shown in Figure 9(b).
  • the test-taker would make such a request via, or during transmission of, the encoded score code 1100 sent to the central computer 300 using the IVRU 370.
  • These more detailed test results could indicate how well the test-taker has performed based on his total score, and/or individual questions, based on an appropriate geographic or demographic test benchmark. It was mentioned previously that the performance indication 1150 may be provided to the test-taker for input into the testing computer 200.
  • the performance indication 1150 may be provided to an end user other than the test-taker himself. Either way, the performance indication 1150 represents a certification of the test result, by the central computer, to an end user monitoring the test-taker's performance. Thus, the performance indication 1150 may incorporate a reward attainment message, specifying the test-taker's achievement of a specified performance level, that can be subsequently accessed by the end user at some time after the testing session.
  • the reward system consists of two stages: specifying a reward level (in steps 1010 - 1060) and checking the child's performance (in steps 1070 - 1090) .
  • the test-taker's parent (or teacher) is prompted (at step 1020) by the central computer's IVRU 370 to enter his ID number (at step 1030), the software identifier (at step 1040), and a reward threshold to be monitored by the central computer 300 (at step 1050).
  • the reward threshold could be in addition to, or part of, the test benchmark described previously in the context of test result evaluation.
  • the central computer 300 checks the score code saved in the test-taker database 380 against the specified reward threshold. If appropriate, at step 1085, a reward attainment message is included in the performance indication 1150 provided by the IVRU 370 to the test-taker over the phone.
  • the test-taker then enters the performance indication 1150 into the testing computer 200 where it can be accessed at a later time by the test-taker's parent.
  • the parent could himself call the central computer 300 to directly receive the performance indication 1150 including the reward attainment message.
  • the parent could then use the reward attainment message to provide a designated reward to the test-taker.
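A minimal sketch of the reward-threshold check described above might look like the following; the record fields and threshold semantics are assumptions made for illustration.

```python
# Hypothetical reward-threshold check against registered score records.
def reward_attained(score_records: list[dict], software_id: str,
                    threshold: int) -> bool:
    """True if any registered score for this software meets the parent's
    previously registered reward threshold."""
    return any(r["score"] >= threshold
               for r in score_records
               if r["software_id"] == software_id)

records = [{"software_id": "math-04", "score": 82},
           {"software_id": "math-04", "score": 91}]
print(reward_attained(records, "math-04", 90))  # True -> add reward message
```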
  • if the central computer 300 can successfully decrypt the encrypted test result 1100, it is known to be authentic.
  • the cryptographic technique of "one-way functions" may be used to ensure test result integrity.
  • a one-way function is one that outputs a unique representation of an input such that a given output is likely only to have come from its corresponding input, and such that the input cannot be readily deduced from the output.
  • the term one-way function includes hashes, message authentication codes (MACs -- keyed one-way functions), cyclic redundancy checks (CRCs), and other techniques well known to those skilled in the art. See, for example, Bruce Schneier, "Applied Cryptography," Wiley, 1996.
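As a concrete instance of a keyed one-way function, the sketch below computes an HMAC over the cleartext score code; the key, encoding format, and separator are illustrative assumptions, not part of the patent.

```python
# Keyed one-way function (HMAC-SHA256) over the cleartext score code.
import hashlib
import hmac

secret_key = b"key-shared-with-central-computer"   # illustrative key
score_code = "2340-9908-0011"

tag = hmac.new(secret_key, score_code.encode(), hashlib.sha256).hexdigest()
encoded_test_result = f"{score_code}:{tag}"   # cleartext part + one-way part
print(encoded_test_result)
```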
  • At step 1210 (the same as step 510 of Figure 5), the test results are incorporated into a numeric score code. At step 1220, CPU 220 then uses a software hashing algorithm incorporated in software 272 to hash the score code.
  • the hashing algorithm could be stored in RAM 260 or ROM 265, or it could be hardwired in a special dedicated cryptoprocessor (e.g., a test-taker's cryptographic token) separate from the CPU 220.
  • the result is an encoded test result 1100 comprising the (cleartext) score code and a (ciphertext) one-way function representative of at least a portion of the score code.
  • This encoded test result 1100 is displayed on video monitor 250, at step 1230, along with an 800 number for the test-taker to call to register the hash, at step 1240. Notice that, in contrast to the encryption embodiment in which the test result could be displayed only in encoded form, the hashed test result must also be made available to the central computer 300 in cleartext form.
  • the hashing can be performed in conjunction with encryption.
  • the test result is first encrypted prior to hashing.
  • test result can first be hashed and then encrypted.
  • hashing followed by encryption is often referred to as a digital signature.
  • the encryption operation ensures test result authenticity, in addition to the test result integrity provided by the hashing operation.
  • a unique device identification number can be added to the encoded test result 1100 to provide assurance of authenticity. Referring now to Figure 13, there is shown an exemplary embodiment of a decoding process corresponding to Figure 12.
  • the encoded test result 1100 is sent to central computer 300, which verifies the hashed test result by reading the cleartext part of the encoded test result (e.g., the score code and device or software ID) and the ciphertext part of the encoded test result (e.g., a hash of a portion of the cleartext part).
  • the central computer 300 retrieves the appropriate hash function and, in step 1330, performs an identical hashing algorithm on the appropriate portion of cleartext part to recompute the hash. If the encoded test result uses any form of encryption in addition to hashing, the central computer 300 would locate and perform the appropriate cryptographic protocol to decrypt the encrypted portion of the test result in the appropriate manner.
  • the received and recomputed hashes are compared to determine that the test result came from the testing computer 200 and had not been altered subsequent to transmission. If so, at step 1360, the score code is stored in score code database 274.
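The corresponding verification at the central computer can be sketched as follows, using the same assumed cleartext-plus-tag format as the HMAC example above.

```python
# Recompute the one-way function over the cleartext part and compare it,
# in constant time, against the received ciphertext part.
import hashlib
import hmac

def verify_encoded_result(encoded: str, key: bytes) -> bool:
    cleartext, received_tag = encoded.rsplit(":", 1)
    expected = hmac.new(key, cleartext.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(received_tag, expected)
```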
  • the encoded test result could include digital certificates for public key distribution to a central computer 300 that does not know the testing computer's public key needed to verify a test result encrypted with the testing computer's private key.
  • the testing computer's public key is encrypted (and vouched for) by the private key of a trusted certifier (e.g., a well-known manufacturer of the testing device) whose public key is known to the central computer 300.
  • the central computer 300 uses the certifier's public key to decrypt the testing computer's public key, then uses the testing computer's public key to decrypt the encrypted test result.
  • the central computer 300 could simply obtain the testing computer's public key from a publicly accessible database, eliminating the need for digital certificates.
  • A challenge-response protocol ("CRP") can be used to ensure that a received test result is fresh rather than a replay of an earlier one.
  • In such a protocol, the central computer 300 generates and transmits a random number (also referred to as a "nonce") to the testing computer 200.
  • the testing computer 200 then incorporates this random number in the encoded test result 1100 transmitted to the central computer 300. If the received random number matches the random number previously generated, the central computer 300 accepts the encoded test result 1100 as fresh. Conversely, an old encoded test result 1100 would contain a non-matching random number.
  • the testing computer 200 includes a sequence number in the encoded test result 1100. This sequence number is incremented by one every time the testing computer 200 generates an encoded test result 1100.
  • the central computer 300 stores the most recent sequence number in memory, and accepts a transmitted encoded test result 1100 if its sequence number is one greater than the last stored sequence number.
  • the testing computer 200 includes the current time in the encoded test result 1100 transmitted to the central computer 300.
  • the central computer 300 checks the time included in the encoded test result 1100 against the time from the central computer's clock 350. If the times are within a prescribed window, the encoded test result 1100 is accepted as fresh.
  • the testing computer 200 itself generates a random number to be included in the encoded test result 1100.
  • the central computer 300 maintains a database of all random numbers received from all testing computers. If the new random number is not in that database, then the current encoded test result 1100 is accepted as fresh. If a time element is incorporated into the random number, old entries can eventually be purged from that database.
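The freshness mechanisms above (nonce database, sequence numbers, and timestamp windows) can be combined in a small sketch; the window size, state layout, and names are assumptions for illustration.

```python
# Combined freshness checks for a received encoded test result.
import time

seen_nonces: set[str] = set()            # all random numbers ever received
last_sequence: dict[str, int] = {}       # most recent sequence per device
WINDOW_SECONDS = 5 * 60                  # assumed acceptance window

def is_fresh(device_id: str, sequence: int, nonce: str,
             timestamp: float) -> bool:
    if nonce in seen_nonces:                              # replayed result
        return False
    if sequence != last_sequence.get(device_id, 0) + 1:   # stale or skipped
        return False
    if abs(time.time() - timestamp) > WINDOW_SECONDS:     # outside window
        return False
    seen_nonces.add(nonce)
    last_sequence[device_id] = sequence
    return True
```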
  • Biometric identification devices (e.g., a fingerprint reader, a voice recognition system, or a retinal scanner) can also be used to corroborate the identity of the test-taker.
  • An example of a fingerprint reader is the Startek FC100 FINGERPRINT VERIFIER, which connects to a PC via a standard interface card.
  • the fingerprint verifier acquires a test-taker's fingerprint when the test-taker places his finger on an optical scanning lens, then scans, digitizes, and compresses the data (typically a 256 byte file) in memory. During testing, each live scan fingerprint is compared against a previously enrolled/stored template, stored in the testing computer 200. If the prints do not match, access to the educational software 272 can be denied.
  • This procedure may be implemented: 1) before the start of a testing session, 2) during the testing session in response to prompts from the educational software, 3) at some predetermined or random time, or 4) continuously by incorporating the scanning lens into the testing computer 200 such that the test-taker is required to maintain his finger on the lens at all times during the testing session for continuous verification.
  • a voice verification system located at either or both the central computer 300 and the testing computer 200, may utilize a person's "voice-print" to verify test-taker identity.
  • the process of obtaining a voice-print and subsequently using it to verify a person's identity is well-known in the art, and will not be described in detail herein.
  • suitable voice identification/verification technologies are commercially available from companies such as SpeakEZ, Inc. and others.
  • the speaker identification software is used to sample the test-taker's voice, which is stored in the test-taker database 380 at the central computer 300.
  • Each time the test-taker calls the central computer 300 to register a test result, the IVRU 370 prompts the test-taker to speak his or her name into the telephone 120.
  • the speaker identification software then directs the central computer 300 to check the test-taker's current voice-print against the voice- print stored in the test-taker database 380. Unless there is a match, the test result registration procedure is aborted.
  • the voice-print may also be stored in a database in the testing computer 200, to verify the test-taker's identity at that location prior to allowing a test session.
  • A biometric reader, however, is not the only means available for corroborating the test-taker's identity.
  • other forms of independent data can be added to the encoded test result 1100 to corroborate the testing process.
  • the biometric reader could be used to certify the identity of a witness to the testing process (e.g., a proctor) in addition to the identity of the test-taker himself.
  • the witness could also be certified by inputting a unique witness identifier into a keypad or other input device at the testing computer 200.
  • a Global Positioning Satellite (GPS) signal receiver could be incorporated with the testing computer 200 to provide a corroborative datum indicating the location of the testing computer.
  • test-taker himself acts as an information conduit between the testing computer 200 and the central computer 300, in conjunction with telephone 120 and interactive voice response unit 370.
  • this may limit the amount of information that can be conveyed between the testing computer 200 and the central computer 300.
  • One solution, when response time is not crucial, involves transmitting such information via non-electronic media (e.g., Figure 9(b)).
  • Another solution would be to establish a direct electronic connection via a dial-up modem, a cable modem, a set-top box, or any other form of electronic communication.
  • conventional modems may be used for communications at rates from 1,200 to 33,000 bits per second.
  • ISDN modems may be used for communications at rates of 64,000 to 128,000 bits per second.
  • special network switches may be used in connection with even higher-speed communications links, e.g., over T1 or T3 lines. Such a connection would allow the transmission of vastly greater amounts of information, as well as allowing the testing computer 200 to take advantage of the storage and processing capabilities of the central computer 300.
  • the central computer's network interface must also be able to support multiple, simultaneous data connections.
  • the central computer 300 is accessible over the INTERNET or commercial on-line services such as America Online, CompuServe, or Prodigy, allowing multiple test-takers to access the central computer 300 via simultaneous on-line connections.
  • the test questions would not have to be pre-stored at the testing computer 200, but could be stored at, and downloaded directly from, the central computer 300. This would give the testing authority much greater flexibility to keep test questions current and confidential.
  • future test questions could be customized in response to the test-taker's ongoing or historical performance.
  • test benchmark could include any information useful for evaluating a test result transmitted from the testing computer 200 to the central computer 300.
  • test result and the test benchmark could include absolute, relative, or statistical information.
  • Figures 14-19 illustrate another embodiment of the invention in which the test questions are not pre-stored at the testing computer 200.
  • Teachers (or commercial testing services) first generate the test: the teacher enters the test questions into the test generator software operating on a standard personal computer.
  • the teacher enters the personal IDs of each student taking the test into the test generator software to ensure that only authorized students can take the test.
  • the teacher enters an unlock code and one or more cryptographic keys of any desired type.
  • the unlock code is used by the test-takers to activate the test session, while the cryptographic keys will be used by the students to encrypt their test results.
  • the teacher enters both a start and finish time for the test, perhaps declaring that the test will begin at 1:00 PM on Saturday and end at 3:00 PM the same day.
  • the teacher enters a unique test ID number at step 1450, which allows the central computer 300 to track the results of the test.
  • the test generator software writes tests onto individual disks.
  • the test disks can take many forms including floppy disks, tape cartridges, magneto-optical disks, etc. The disks do not have to be identical. Each disk's questions could be arranged in a different order, or each disk could contain different questions. For example, the teacher could enter a large number of questions into the generator software, with randomly selected subsets of these questions to be generated for different disks.
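The randomly selected question subsets could be produced along the lines of the sketch below; the pool size, subset size, and names are invented for illustration.

```python
# Draw a different random subset of the teacher's question pool per disk.
import random

question_pool = [f"Q{i}" for i in range(1, 101)]   # e.g., 100 entered questions

def questions_for_disk(pool, per_disk=20, seed=None):
    """random.sample returns the questions in random order, so each disk
    gets both a different subset and a different ordering."""
    return random.Random(seed).sample(pool, per_disk)

disk_1 = questions_for_disk(question_pool, seed=1)
disk_2 = questions_for_disk(question_pool, seed=2)   # differs from disk_1
```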
  • the teacher completes the test generation process by calling the central computer 300 at step 1470, and using an IVRU or other electronic device to register the test ID number, personal IDs of the test-takers, unlock code, start time/stop time, and cryptographic key.
  • the central computer 300 stores this information in a database for later use.
  • the disks are distributed to the students.
  • test-taker brings the test disk home and puts it into his testing computer 200. Without the appropriate unlock code, however, the test-taker is unable to access the test questions.
  • the test-taker calls the central computer 300 and enters the test ID number. This ID number is preferably printed on the disk itself.
  • the central computer 300 accesses the test database 380 and retrieves the database record for that particular test ID number.
  • the central computer 300 compares the current time to the designated start time of the test. If the current time is prior to the test start time, the test-taker is instructed to call back at the established start time.
  • the central computer 300 supplies the appropriate unlock code at step 1530.
  • the test-taker enters this unlock code into the test software, allowing access to the software 272.
  • if the entered unlock code is incorrect, the test-taker is prevented from continuing.
  • the test-taker begins the testing session.
  • Referring now to Figure 16, there is shown an exemplary embodiment of the process of encoding the test results for transmission to the central computer 300.
  • test-taker's answers are aggregated into a test result, which is encrypted at step 1620 using the key entered into the test disk at the time the disk was generated.
  • the encoded test result 1100 is displayed on the video monitor 250 of the testing computer 200.
  • the 800 number of the central computer 300 is similarly displayed.
  • test-taker calls the 800 number to register his test results with the central computer 300.
  • the IVRU 370 prompts the test-taker who, at step 1740, enters the test ID number.
  • the test-taker enters his personal ID number and, at step 1760, enters the encoded test result 1100. This result is timestamped by the central computer 300 at step 1770 so that it can be checked later for compliance with the established end time of the test.
  • the encoded test result 1100 is sent to the cryptographic processor 320.
  • the central computer 300 looks up the test ID number to find the cryptographic key used to encrypt the test result. This key is transmitted at step 1830 to the cryptographic processor 320 where, at step 1840, the encoded test result 1100 is decrypted and scored.
  • the answer key used for scoring could have been previously transmitted from the teacher to the central computer during the test registration process at step 1470 in Figure 14.
  • the scored test result 1100 is saved in the test result database 390.
  • the central computer 300 generates a list of all scores for the given test ID number. Along with these scores, at step 1920, are the timestamps associated with each score. At step 1930, the scores are separated into two lists: those with timestamps before the stop time, and those with timestamps after the stop time. At step 1940, these scores are sent to the teacher. Scores whose timestamps exceed the pre-established end time for the test may be rejected or penalized in an appropriate manner.
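The timestamp partition just described is straightforward; a sketch with assumed record shapes follows.

```python
# Split registered scores into on-time and late lists by timestamp.
def split_by_deadline(scores, stop_time):
    """scores: (student_id, score, timestamp) tuples; stop_time: epoch secs."""
    on_time = [s for s in scores if s[2] <= stop_time]
    late = [s for s in scores if s[2] > stop_time]
    return on_time, late
```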
  • the testing computer 200 transmits unscored test answers to the central computer 300 for scoring at the central computer 300.
  • the encoded test results could be scored by the testing computer 200 before being transmitted to the central computer 300. This requires that an answer key be available to the testing computer 200 along with the test questions. This answer key may have been provided in encrypted form, along with the test questions, on the disk created by the teacher's test generator software and provided to the student prior to testing.
  • the testing software could be programmed in a distributed computing language such as Sun's Java, that allows both the functional (testing algorithms) and non-functional (test content) aspects of the testing software to be integrated into "executable content" accessible from a remote server over the World Wide Web.
  • Even more sophisticated distributed computing protocols allow the processing of information by different computers operating in a networked environment. Allowing the central computer 300 (or any other networked computer) to perform portions of the computational processing that would otherwise be performed by the testing computer 200 greatly reduces the cost of the testing computer 200, making it more affordable to a wider market.
  • the testing computer 200 could upload the encoded, unscored answers to a networked computer for remote scoring.
  • the central computer 300 may be configured as a distributed architecture wherein the databases and cryptographic processors are housed in physically separate units. In such a configuration, each unit is in communication with its constituent parts, but the processor and/or data storage functions are performed by stand-alone units. This arrangement yields a more dynamic and flexible system, less prone to catastrophic hardware failures affecting the entire system.

Abstract

Methods and apparatuses are disclosed for computer-based evaluation of a test taker's performance with respect to selected comparative norms. The system includes a home-testing computer (200) for transmitting the test taker's test results to a central computer (300) which derives a performance assessment of the test taker. The performance assessment can be standardized or customized, as well as relative or absolute. Further, the transmitted test results are configured to reliably associate the student with his test results, using encoding (1100), user identification, or corroborative techniques to deter fraud. Thus, for example, the system allows a parentally-controlled reward system such that children who reach specified objectives can claim an award that parents are confident was fairly and honestly earned without the parent being required to proctor the testing. Fraud, and the need for proctoring, is also deterred during multiple student testing via an option for simultaneous testing of geographically dispersed test takers.

Description

METHOD AND APPARATUS FOR COMPUTER-BASED
EDUCATIONAL TESTING
BACKGROUND OF THE INVENTION
Field Of the Invention
The present invention relates generally to methods and apparatuses for computer-based education. More particularly, the invention relates to computer-based assessment of an individual's educational performance relative to selected comparative norms.
Background
Educational software that tests a user's knowledge, coupled with the immediate scoring of answers, is well known in the art. For home use, a wide range of software is directed to various age groups, although the primary audience is school-grade children. Thus, many educational programs are designed to be quite entertaining, with built-in tests being played as games having an underlying educational purpose. Accordingly, terms such as "student," "player," or "test-taker" shall be understood to mean any participant in, or user of, educational software that tests the user's knowledge in a formal or informal manner.
Software which blends both fun and learning is often referred to as "edutainment" software. Popular edutainment programs include "Mathblaster," "Where in the World is Carmen Sandiego," and "Word Munchers." These edutainment programs (as well as certain other educational programs generally) present players with a series of increasingly difficult tests or puzzles, wherein players must correctly solve the present round before they are allowed to continue to the next round of play. In the above-mentioned edutainment software,
each player exists in isolation from all other players. This arrangement is reasonably well suited to informal, isolated learning, but lacks the comprehensive data collection/comparison features needed for formal test administration. For example, without an on-line database of scores, there is no way for a user of educational software to compare his test score with other users' scores. Such inter-player comparisons would be useful for comparing accuracy and speed of answers. Comparatives could be calculated and distributed for any given test as a whole, for groups of related questions, or for individual questions. For example, is a score of 2,350 in Mathblaster a good score for a ten year old? For a fourteen year old? How did all fifth graders do on level 4 of the multiplication section? Or, in "Where in the World is Carmen Sandiego," is a student at the sixth grade level for naming U.S. state capitals but only at a third grade level at naming foreign countries? Finally, how do these scores vary by school district, state, or perhaps even for children in the same school or class?

Certain of the home education software applications allow players to be tutored by the software before having to answer a specific question or solve a problem. Still other home education software detects specific areas where players are making errors and automatically adjusts the question type and/or information screens to help the players improve on their weaknesses, after which they are re-tested. This capability permits the software to self-adjust to the skill strengths and weaknesses of individual players when measured against predetermined (e.g., programmed into the software at the time of manufacture) and absolute (e.g., percentage of questions answered correctly) norms. However, a drawback of the existing software is that it cannot take into account relative norms (e.g., comparisons among a group of children currently taking the test), because of the lack of an on-line database of scores, as mentioned previously.

Another drawback of the existing home education software is that standardized entertainment elements often become repetitive and boring after seeing them for the third or fourth time. Absent rewards or external competitions of some kind, users often stop using the software once the novelty has worn off. Yet, currently available software lacks a reliable and accurate award recognition system that will allow parents to track and reward their children's performance. Such rewards could be contingent on achievement of a standardized norm, as discussed previously. For example, a parent might offer a child a new bicycle in return for the child achieving mastery of fourth grade reading skills. However, the parent would have to watch the child each time he finished a test to make sure that he had indeed scored at the fourth grade level or higher. Thus, there exists a need for educational software which can certify a child's performance against a specified norm without requiring the child's parent to witness or supervise the child's playing. Furthermore, if the parent wanted to relate his child's performance to the scores of other children in the same school, grade, or age group, there would be no way to determine whether scores achieved by the child met those comparative criteria. Finally, current educational software does not allow the parent to define customized reward programs. For example, in a math program having many different modules for different areas, a parent might wish to reward his child for achieving average performance on all modules and superior performance on at least two modules. The measurement of such performance relative to the performance of other children at other locations and times would require an elaborate tracking system.
Another category of educational software runs on networked mainframe-and-terminal systems for large-scale, simultaneous testing of groups of students at fixed locations. In contrast to the edutainment programs, these centralized systems require a human being (e.g., a parent, teacher, or other proctor) to monitor individual users of the software to prevent cheating. The software could subsequently retain scores in a local database which allows a teacher to recognize the best students and to ensure that all students have used the software to achieve certain minimum requirement levels. Such systems are well-suited for formal testing, but are ill-suited for home-based educational testing because of their hardwired, inflexible configurations and because of the continuous human monitoring requirement.
Yet another category of software is used for computerized evaluation of standardized tests taken by school children using paper forms. Groups of students simultaneously record their answers to paper-based, multiple-choice questions by blackening the ovals on Scantron forms, which are optically scanned for grading by software running on computers. A standard paper report is then generated and distributed to each student. These reports, such as those for the Connecticut Mastery Test administered in the Connecticut primary and secondary schools, or for the more widely known SATs, measure the student's comparative performance against all others in his school district as well as the state. Although these testing systems provide teachers and parents with a reliable, evaluative comparison as to how the student is performing vs. his peers, they suffer from other shortcomings that make them ill-suited to home-based educational testing. First, the testing methodology (reading printed questions and blackening ovals) is cumbersome and rigid. Second, the methodology does not allow the setting of local performance criteria apart from standardized (but relative) norms programmed into the central mainframe computer. Finally, because the entire test must be taken and submitted to the central computer for grading before results are returned to the test-taker, there is no interactive feedback mechanism in which future test questions are selected in accordance with ongoing or historical performance.
Furthermore, all of the aforementioned software is ill-suited for at-home testing because the lack of a mechanism to reliably associate a test-taker with his answers allows a dishonest test-taker to obtain test answers from another test-taker. Thus, even the sophisticated hard-wired testing systems typically require human proctoring to ensure security, a condition that may not be available in geographically distributed home testing applications.
Finally, in many instances, a teacher or central authority might want to distribute tests in electronic format for students to take on their home computers. Requiring a simultaneous start time and/or completion deadline for all tested students is necessary to deter cheating in the form of test-taker substitution or of a later test-taker obtaining an earlier test-taker's answers to the same test. However, current software does not allow such a configuration.
Summary of the Invention
An object of the invention is to provide a system for computer-implemented education whereby a home user of educational computer software can be evaluated against a wide variety of selected comparative norms. Another object of the invention is to reliably associate a test-taker and his reported test results to a recipient thereof. Another object of the invention is to facilitate automatic monitoring of an unproctored test-taker's test results against objective criteria established by a local supervisor of the test-taker. Another object of the invention is to implement a feedback mechanism whereby a user's future interactions with the educational software can be automatically modified in response to his ongoing or historical performance. Another object of the invention is to facilitate the non-simultaneous testing of test-takers in a manner that deters one test-taker from providing his test results to another test-taker. Another object of the invention is to allow the simultaneous testing of remotely located test-takers in the absence of a system of internetworked and synchronized testing computers.

In connection with the foregoing, in one embodiment of the invention, testing software is incorporated into "edutainment" games for reporting a player's overall score, along with other subcategories of information (e.g., subject areas within the test material, time taken to answer questions, etc.), as an x-digit numeric test result (e.g., 2340-9908-0011) or "score code." The test result may also include information as to whether or not the software had been altered during the testing procedure. To prevent alteration during the transmission process, the score is scrambled to prevent unauthorized access thereto. For even greater security, the system could use a variety of cryptographic protocols to make the test result difficult to forge and to prevent the user from misrepresenting his result to a database located at the central computer.
Whenever a player finishes a game for which the score is good enough to justify "registering," the player could use a touch-tone phone to call an 800 number to reach a central computer and register his score with an interactive voice response unit. Such registration may be mandatory or optional. For example, the software may require registration after a certain number of games by denying access until a confirmation code was entered.
If a player was calling for the first time, a live operator might register the player into a database containing important comparative factors such as age, grade, school, address, parent's name, password, etc. This "setup" phase could also be accomplished by mail.
For subsequent call-ins, the score registration process would be automated. An interactive system would prompt the player step-by-step through the registration process, beginning with the player entering his personal ID number and then his score code. The central computer would decrypt the score code and save the decrypted information in the central database for comparative evaluation against an appropriate test benchmark. Such benchmarking would typically be performed according to demographic or geographic norms determined by the test-taker's registration information. The test benchmark could even include the user's own past test results. After making the appropriate comparisons, the central computer would generate a customized "performance indication" to be given to the player.
The performance indication could also include a confirmation code to be entered into the test-taker's computer as evidence that he had registered his test results. Prior to acceptance, the testing computer could check the confirmation code to make sure it was valid. Based on the performance indication and/or confirmation code, the software could adjust future interactions with the test-taker based on his past performance. This "reflexive" feature would be based on how comparable children answered the same questions, as opposed to being preprogrammed at the time of manufacture or distribution of the software. For example, if a child is performing significantly below his peers in a particular category, the reflexive feature of the software could direct the software to focus future test questions in that category.
In addition, the player's comparative results and statistics could be logged to a special review section of the testing program. Access to these statistics could be restricted by a password held only by the child's parent. Optionally, a detailed report could be generated by the central computer and mailed to the parent, allowing the parent to carefully review his child's comparative scores. This report could also contain specialized exercises and remedial program recommendations based on the child's results.
For the case of a class of students taking computerized tests at home, each test could be composed of different questions selected, in part, based on past performance. The questions could also be randomized, either in order or content, to deter cheating. Once finished, a test-taker would call an 800 number to register his score code, which could include the serial number of the testing software and/or student ID. Thus one student could not give his score code to a second student to falsely suggest that the second student had taken the test. To further deter fraud, the testing computers could be configured for simultaneous testing based on receipt of a starting authorization. It was mentioned previously that the returned performance indication would provide statistical information and/or future test customization to an end user, such as a parent or teacher. The same performance indication could also include a reward attainment message for certifying the test-taker's achievement of a reward threshold, which had been previously registered by a parent, teacher, or other end user at the central computer. The end user could subsequently interrogate the testing computer (or even the central computer itself) to determine if the test-taker had been issued the reward attainment message and to provide a reward in accordance therewith.
In the above embodiment, the testing software is located on the testing computer, and answers to pre-programmed questions are scored locally before the test-taker manually transmits the results to the central computer using a voice-based telephonic data input device. Alternatively, the testing computer could connect to the central computer and transmit the score code via an on-line (e.g., Internet, World Wide Web, America Online, etc.) connection. Indeed, the testing computer could even receive the test questions directly from the central computer rather than pre-storing them locally. This would allow the test administrator great flexibility in modifying and controlling test question distribution and access. The order or content of subsequent test questions could also be interactively customized for the needs of the individual test-taker in accordance with his ongoing or historical performance. Finally, instead of grading the test and transmitting the test score, the testing computer could transmit ungraded test answers, for grading and performance evaluation at the central computer.
Using this invention, remotely-located test-takers can compare their results with other remotely-located test-takers. Comparisons can be done on an ongoing basis with no need for an on-line connection, and scoring can be done locally with no need to send all answers to a centralized scoring authority. Alternatively, an on-line connection could be used for even greater flexibility in test question distribution and control.
Whether on-line or off-line, the test-taker's score is securely associated with the test-taker's identity, which deters cheating as well as allowing the test results to be certified to a party (e.g., a parent) that did not necessarily witness the test-taking. The parent would not need to monitor the results since they cannot be falsely registered. The parent could even establish his own local criteria for evaluating his child's performance, with the monitoring and evaluation to be performed automatically by the same central computer that certifies the test results. The returned performance indication could also include commands to the testing software to customize future functionality based on past performance. In this sense, the educational software is not only for testing, but may be used for tutoring or workbook purposes.
Similarly, teachers can get reports on how their students compare in a wide variety of dimensions vs. other comparable students, and the students can be provided with specific remedial direction. Also, since each test-taker is uniquely associated with his test results, a teacher can administer a test to a class of students at home without fear that the students could give each other a valid code to simulate having taken the test as a whole. Another type of cheating involves students getting specific answers to specific questions from earlier test-takers during staggered testing in the absence of reliable monitoring. In that case, a central phone system (or other transmission system) could make available a "start code" only at a certain time (e.g., 8:00 PM) to students calling in for the code. Students would also have a specified window to call in and register a score code at the end of the test. This simultaneity deters students from calling earlier test-takers to get their answers to the test questions.
Another form of cheating, that of test-taker substitution, is also deterred by the inclusion of corroborative data in the encoded test result transmitted from the testing computer to the central computer. Such corroborative data could include a biometric identifier of the test-taker or of a witness to the testing (e.g., a proctor), other forms of witness identifier entered via a keypad or other input device, or a location datum from a GPS receiver. In any of these cases, the corroborative data deters cheating by providing independent assurance that the test was taken by whom, where, or when it was supposed to have been taken.
The features and advantages of the present invention will be more readily understood and apparent from the following detailed description of the invention, which should be read in conjunction with the accompanying drawings, and from the claims which are appended at the end of the detailed description.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 illustrates the basic components of a system for computer-based educational testing.
Figure 2 illustrates one embodiment of a device for administering a test at a test-taker's home.
Figure 3 illustrates one embodiment of a central computer for receiving test results from a testing device, performing a comparative evaluation with selected norms, and returning a performance indication for the test-taker.
Figure 4 illustrates an exemplary test administration procedure.
Figure 5 illustrates an exemplary test result encoding procedure.
Figure 6 illustrates an exemplary test result transmission procedure.
Figure 7 illustrates an exemplary decoding procedure at the central computer.
Figure 8 illustrates an exemplary performance assessment procedure.
Figures 9(a) and 9(b) illustrate exemplary procedures for providing the performance indication to the user.
Figures 10(a) and 10(b) illustrate an exemplary reward mechanism from the perspectives of (a) the parent and (b) the central computer, respectively.
Figures 11(a) and 11(b) illustrate exemplary content of (a) the score code and (b) the performance indication, respectively.
Figure 12 illustrates an alternative embodiment of the invention wherein the test results are encoded using a cryptographic hashing operation.
Figure 13 illustrates an exemplary process for decoding the encoded test result corresponding to Figure 12.
Figure 14 illustrates an exemplary process for test question generation and distribution in a home testing application.
Figure 15 illustrates an exemplary process for beginning a testing session in a home testing application.
Figure 16 illustrates an exemplary process for test result encoding in a home testing application.
Figure 17 illustrates an exemplary process for transmitting encoded test results to the central computer in a home testing application.
Figure 18 illustrates an exemplary process for decoding and scoring a received encoded test result at the central computer in a home testing application.
Figure 19 illustrates an exemplary process for transmitting the scored test results to a teacher in a home testing application.
DETAILED DESCRIPTION OF THE INVENTION
Methods and apparatuses are disclosed for computer-based evaluation of a student's performance with respect to selected comparative norms. The system includes a home testing computer for transmitting the student's test results to a central computer which derives a performance assessment of the test-taker. The performance assessment can be standardized or customized, as well as relative or absolute. Further, the transmitted test results are configured to reliably associate the student with his test results, using encoding and/or other user identification techniques, to deter fraud. Thus, for example, the system allows a parentally-controlled reward system such that children who reach specified objectives can claim an award that parents are confident was fairly and honestly earned, without the parent being required to supervise the game play. Fraud, and the need for proctoring, is also deterred during multiple student testing via an option for "simultaneous testing" of geographically dispersed test-takers.

In this disclosure, certain ancillary elements used in conjunction with the educational testing device are well understood by those skilled in the art and are not shown in order not to obscure the present invention. For example, the design and construction of clocks, computer memories, and software or hardware cryptographic algorithms are well known to those skilled in the art and will not be described in detail herein.
Referring now to Figure 1, there is shown a computer-based educational testing system comprising a testing computer 200 and a central computer 300. A test-taker at testing computer 200 would typically exchange information with the central computer 300 using a telephone 120 connected to a public switched network 110 provided by a local or regional telephone operating company ("TELCO"). However, those skilled in the art will appreciate that dedicated data lines, cellular telephony, Personal Communication Systems ("PCS"), microwave links, satellite systems, or any other direct or indirect communication link could also be used. The educational testing system facilitates the administration of a test, the sending of the test results 1110 to the central computer 300, the verification of those results, the generation of a performance indication 1150, and the transmission of that information to an end user. The end user could be the test-taker himself, his parent, his teacher, a college admissions office, or any other party having an interest in the performance of the test-taker. In the exemplary embodiment shown in Figure 1, the performance indication 1150 is shown as being returned over the public switched telephone network 110 to an end user at telephone 120 for entry into the testing computer 200.
As shown in Fig. 2, the testing device 200 could be a conventional personal computer 210 having a CPU 220, an input device 225 (e.g., a keyboard or mouse), one or more communications ports 230, a clock 240, a display driver 245, a display (e.g., a video monitor) 250, RAM 260, ROM 265, and a data storage device 270. The storage device could either be fixed media (e.g., a hard disk) or a drive capable of reading removable media (e.g., a floppy disk or CD-ROM). The storage device may be used to store the educational software 272, the score database 274, and message or audit databases 276. The score database 274 contains the performance indication 1150 received from the central computer 300. The audit database 276 contains audit data produced by the educational software 272, or received from the central computer 300, such as a cumulative record of all tests taken by the test-taker.

For security purposes, one or more of the CPU 220, the clock 240, the RAM 260, the ROM 265, or the data storage device 270 can be located within a secure perimeter 280. Secure perimeter 280 may include physical, electronic, or a combination of physical and electronic features to resist tampering. For example, physical features could include encapsulation, electronic features could include a silicon firewall, and combination features could include self-zeroizing, or otherwise volatile, RAM 260 or ROM 265 which electrically modifies its contents upon detection of tampering. Such tampering might include physically stressing the device or attempting to change the clock rate by modifying its power source (not shown in the Figure) to operate outside an allowable voltage or frequency range. Alternatively, secure perimeter 280 could be merely tamper-evident. As will be appreciated by those skilled in the art, a great variety of tamper-resistant/tamper-evident techniques can be deployed and will not be enumerated in detail herein. Therefore, as a matter of convenience, terms such as "tamper resistant," "tamper evident," or "secure" shall be understood to refer to any of the aforementioned or other security measures throughout this discussion.

The testing computer 200 may optionally be connected, via communications port 230, to a communication device (e.g., a modem, a network card, or a transmitter) to allow direct communications with the central computer 300. Finally, for security purposes to be described later, the testing device 200 may also include a biometric reader 280 such as a fingerprint reader or retinal scanner.
As shown in Fig. 3, the central computer 300 includes a central processor 310, cryptographic processor 320, ROM 330, RAM 340, clock 350, and data storage device 360. In some cases, the cryptographic processor 320 might be a specially secured, dedicated processor separate from the central processor 310. Alternatively, the cryptographic processor 320 could be integrated in the central processor 310. Thus, any conventional personal computer, computer workstation, or mainframe computer having sufficient memory and processing capabilities may be used as the central computer 300.
In one embodiment of the invention, the central computer 300 is connected to an interactive voice response unit (IVRU) 370 for receiving test results 1100 (see Figures 1 and 11(a)) from the test-taker via touch-tone signals transmitted over the phone network 110. This process will be described in greater detail below with respect to Figure 6. IVRUs are well known in the art (see, e.g., Jerry Fitzgerald, "Business Data Communications - Basic Concepts, Security & Design," 4th ed., John Wiley & Sons, 1993) and need not be described in detail here.
The central computer 300 receives the encoded test result from the test-taker, decodes it using central processor 310 and/or cryptographic processor 320, and compares the decoded test results against test benchmarks stored in software database 385 or test result database 390. The central computer 300 then generates a performance indication 1150 (see Figure 1) containing its evaluation and returns the performance indication 1150 to the end user. Because the central computer 300 might be simultaneously performing these operations for a multitude of test-takers, it must be capable of high volume transaction processing and of performing a significant number of mathematical calculations in processing data inquiries. Thus, a relatively powerful microprocessor that has a wide data bus would be a suitable central processor. Typical of such processors are the Intel Pentium or the Motorola PowerPC 604, which both employ a 32-bit data bus. The cryptographic processor 320 supports verification of the encoded test result 1100 received by the central computer, as well as encoding of the performance indication 1150 subsequently returned to the end user. Any suitable microprocessor may be used for the cryptographic processor 320. For example, in its 16 MHz configuration, the Motorola MC68HC16's fast 16-bit multiply-and-accumulate instruction requires less than one second to perform a 512-bit RSA private key operation. Other commercially available specialized cryptographic processors include VLSI Technology's 33 MHz 6868 or Semaphore Communications' 40 MHz Roadrunner284. Alternatively, the cryptographic processor 320 may also be configured as part of central processor 310.
Data storage device 360 reads either fixed storage media within the central computer 300, or removable storage media external to the central computer 300. Such media could include high capacity magnetic (e.g., hard disk), optical (e.g., CD-ROM), or magneto-optical media, as well as low capacity media such as flash memory. Stored on these media are test-taker database 380 and test result database 390. Test-taker database 380 maintains data on the test-taker and/or end user, including names, personal ID numbers, phone numbers, private key information, e-mail addresses, physical addresses, software owned, etc. Test result database 390 maintains data on all test results sent to the central computer 300. The test results will be discussed in greater detail below with respect to Figure 11(a), but might include various ID numbers (e.g., test-taker, testing device, and testing software) and an indication of answers to the test questions (e.g., questions answered correctly/incorrectly, or slowest/fastest).
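By way of illustration only, the following sketch shows record types of the kind these databases might hold. Every field name and type here is an assumption extrapolated from the lists above, not the patent's schema.

```java
import java.util.List;

// Illustrative record types for test-taker database 380 and test result
// database 390; all fields are assumptions based on the lists in the text.
public class CentralDatabases {
    record TestTaker(long personalId, String name, String phone, String address,
                     int age, String school, int gradeLevel, byte[] keyInfo) {}

    record TestResult(long personalId, int testingDeviceId, int softwareId,
                      List<Integer> missedQuestions, List<Integer> secondsPerQuestion) {}

    public static void main(String[] args) {
        TestTaker taker = new TestTaker(123456L, "Jane Doe", "555-0100",
                "12 Elm St.", 10, "Lincoln Elementary", 5, new byte[0]);
        TestResult result = new TestResult(123456L, 42, 17,
                List.of(6, 12, 17), List.of(9, 8, 11));
        System.out.println(taker.name() + " missed " + result.missedQuestions().size() + " questions");
    }
}
```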
Referring now to Figure 4, there is shown an exemplary embodiment of a process for administering a test. As a matter of convenience, the elements of the system will be referenced as numbered in Figures 1-3, although such numbering is not shown in the process description of Figure 4. At step 410, the test-taker sits down at the testing computer 200 and, at step 420, requests a testing session from the educational software 272. This software is loaded into the RAM 260 of the testing device 200 to be executed by its CPU 220. At step 440, the testing software instructions appear on the video monitor 250 requesting that the test-taker enter his personal ID number for identification. At step 430, the testing process begins, with test questions being displayed on the video monitor 250. The test-taker supplies answers through the keyboard or mouse 225, and continues until the testing process is complete. Questions can include true/false, multiple choice, fill-in-the-blank, or any other machine-scorable format. At step 450, the software 272 scores the test by comparing the answers provided by the test-taker to the correct answers stored within the educational software 272. At step 460, the test results are incorporated into a numeric score code, which includes not only the number of correct/incorrect answers, but also an indication of which answers were incorrect. In a twenty-question geography test, for example, the score code might indicate that three answers were wrong, and that questions number six, twelve, and seventeen were missed. As shown in Figure 11(a), the encoded test result 1100 might also include data on the amount of time required to answer each question, as well as identifying information as to the testing computer 200, software 272, and test-taker.
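For concreteness, here is a minimal sketch of how a testing computer might pack these fields into a single numeric score code. The field widths, ordering, and ID values are illustrative assumptions rather than a format taken from the patent.

```java
import java.util.Collections;
import java.util.List;

// Sketch of composing a numeric score code. The patent specifies only that
// the code carries correct/incorrect counts, which questions were missed,
// timing data, and identifying IDs; the layout below is invented.
public class ScoreCodeBuilder {
    public static String build(int softwareId, int deviceId, long personalId,
                               int totalQuestions, List<Integer> missedQuestions,
                               List<Integer> secondsPerQuestion) {
        StringBuilder code = new StringBuilder();
        code.append(String.format("%04d", softwareId));   // identifies software 272
        code.append(String.format("%04d", deviceId));     // identifies testing computer 200
        code.append(String.format("%06d", personalId));   // identifies the test-taker
        code.append(String.format("%02d", totalQuestions - missedQuestions.size()));
        code.append(String.format("%02d", missedQuestions.size()));
        for (int q : missedQuestions) {
            code.append(String.format("%02d", q));        // which questions were missed
        }
        int totalSeconds = secondsPerQuestion.stream().mapToInt(Integer::intValue).sum();
        code.append(String.format("%04d", totalSeconds)); // total answering time
        return code.toString();
    }

    public static void main(String[] args) {
        // Twenty-question geography test; questions 6, 12, and 17 missed.
        System.out.println(build(17, 42, 123456L, 20, List.of(6, 12, 17),
                Collections.nCopies(20, 9)));
    }
}
```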
Figure 5 illustrates an exemplary embodiment of a process for encoding the test results at the home testing computer 200. The process begins, at step 510 (same as step 460 of Figure 4), with incorporating the test results into a numeric score code 1100. At a minimum, to prevent fraud by a dishonest test-taker, the test results should be encoded using a secret algorithm (e.g., scrambling or digit replacement techniques) known only to the testing computer 200 and the central computer 300. Stronger forms of encoding, which use cryptographic protocols, could also be used to encode the test results. Thus, at step 520, a suitable cryptographic algorithm is used for encoding the test result. For example, if the cryptographic operation constitutes encryption, RSA could be used for public key (asymmetric) encryption. The keys could be arbitrary, or they could be based on the testing computer ID number or test-taker's personal ID number.
While encryption in the testing computer's private key is particularly appropriate if authenticity is required, encryption in the central computer's public key is appropriate if confidentiality is desired. If both authenticity and confidentiality are desired, a double encryption could use both the testing computer's private key and the central computer's public key. Furthermore, secret key (e.g., DES) encryption could be used if the stronger protections of public key cryptography are not required or if public key cryptography is too computationally intensive. Finally, although asymmetric or symmetric cryptographic techniques are described herein, those skilled in the art will appreciate that many other cryptographic techniques can also be used, as will be described below in the section entitled "Alternative Embodiments of the Invention." These encoding protocols can be implemented in software or hardware, and can be made more secure by including the algorithm and keys in secure perimeter 280. Continuing now with Figure 5, the encrypted test result 1100 is displayed on the monitor 250 of the testing computer 200, at step 540, along with an 800 number to call, at step 550.
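As one possible rendering of the encryption at step 520, the sketch below encrypts a score code under the central computer's RSA public key for confidentiality. The key pair is generated inline purely for demonstration; in practice the central computer's public key would be distributed with the testing software, and an authenticity-oriented variant would instead apply the testing computer's private key.

```java
import java.nio.charset.StandardCharsets;
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.util.Base64;
import javax.crypto.Cipher;

// Sketch of step 520 using RSA: the score code is encrypted under the central
// computer's public key so that only the central computer can read it.
public class ScoreCodeEncryptor {
    public static void main(String[] args) throws Exception {
        // Demonstration only: stands in for the central computer's real key pair.
        KeyPairGenerator gen = KeyPairGenerator.getInstance("RSA");
        gen.initialize(2048);
        KeyPair centralKeys = gen.generateKeyPair();

        String scoreCode = "2340-9908-0011"; // example score code from the text

        Cipher rsa = Cipher.getInstance("RSA/ECB/PKCS1Padding");
        rsa.init(Cipher.ENCRYPT_MODE, centralKeys.getPublic());
        byte[] encoded = rsa.doFinal(scoreCode.getBytes(StandardCharsets.UTF_8));

        // What the test-taker would read off the screen (rendered here as
        // Base64 rather than the digit string an IVRU keypad would require).
        System.out.println(Base64.getEncoder().encodeToString(encoded));
    }
}
```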
Referring now to Figure 6, there is shown an exemplary embodiment of a process for transmitting the encoded test result 1100 to the central computer 300. At step 610, at the conclusion of the testing session, the test-taker may be given an option whether or not to register his score. If he believes that he has performed poorly, he may decide not to register. The test results could be displayed on the video monitor 250 only in encoded form, which would be unintelligible to the test-taker. Thus, a good result would be indistinguishable from a bad result because neither would appear in plaintext form. This would help discourage fraud in the form of continuous retaking of a test until a good result was achieved. If the test-taker decides not to register, at step 620, he begins another testing session or exits from the software. If the test-taker decides to register, at step 630, he calls the 800 number displayed on the monitor 250 and, in step 640, connects with the IVRU 370 of the central computer 300. At step 650, if the test-taker has previously registered himself with the central computer 300, he continues with test result transmission at step 680. Alternatively, if the test-taker has not yet registered himself with the central computer 300, the call is transferred, at step 655, to a live operator to whom the test-taker provides registration information at step 660. As shown in block 665, such information might include the test-taker's name, phone number, address, age, school, grade level, end user to receive the performance indication (e.g., a parent or the test-taker himself), educational software ownership, etc. At step 670, a unique personal ID number is then assigned to the test-taker. The central computer creates a database record for this information in the test-taker database 380, which is indexed by personal ID number. The test-taker is then fully registered with the central computer 300 and is ready to transmit the encoded test result 1100. At step 680, the testing computer 200 displays the encoded test result 1100 to the test-taker, who subsequently manually telephones the central computer 300 and uses his touch-tone keypad to transmit his personal ID number and the encoded test result 1100 to the central computer 300 in response to voice prompts from the IVRU 370.
Referring now to Figure 7, there is shown an exemplary embodiment of a process for decoding the encoded test result 1100 at the central computer 300. At step 710, the encoded test result 1100 is sent to the cryptographic processor 320. At step 720, the central computer 300 looks up the personal ID number in the test-taker database 380 and retrieves the appropriate key to decrypt the encoded test result 1100. If asymmetric cryptography is used, this would be the public key corresponding to the test-taker's private key; if symmetric cryptography is used, this would be the same secret key used for the encryption at the testing computer 200. Upon receipt of the decryption key, at step 730, the cryptographic processor 320 decrypts the encoded test result 1100 at step 740, thereby verifying that the score code was produced by the personal ID provided therein. The decrypted score information is then stored within test result database 390 at step 750.
Referring now to Figure 8, there is shown an exemplary embodiment of a process for evaluating the decoded score code at the central computer 300. At step 810, the central computer 300 takes the test result and initiates a search for comparative scores. Within the numeric code of the test result is the software ID number, identifying the software used for that test session. To find other test results for this software, the central computer 300 sorts the test result database 390 by software ID number, retrieving only those records that pertain to the software package 272. These records could be further narrowed, for example, by restricting them to those records generated by test-takers of the same age or grade level. The test-taker's test result is then compared relative to the performance of an appropriate reference group, or some other test benchmark, to generate a simple statistical comparison such as percentile rank. At step 830, these comparative data are incorporated into a numeric performance indication 1150, which will be given to the test-taker for input into the software package 272 that generated the test result (see Figure 1).

If the test result indicates a particular area of weakness, the central computer 300 can generate a control message that directs the testing computer's software package 272 to focus on that particular weakness during subsequent testing sessions by modifying the testing computer's operation in response to the control message. For example, if the test-taker is consistently missing fractions questions during math tests, the testing computer 200 could be directed to tutor the test-taker in that subject or ask more questions of that type on the next math test. Each software package could have hundreds of methods to focus future tests, each method tied to a control message stored in a software database 385 of the central computer 300 and in message database 276 of testing computer 200. In the above example, focusing on fractions might be expressed as control message "324" of the software database record for the math software. As shown in step 850, such control messages are included along with the evaluative data in the performance indication 1150 before it is provided to the test-taker (or other end user) at the end of the test result transmission call. The test-taker enters the performance indication into the educational software 272 that generated the test result, triggering the software 272 to display the evaluative data generated by the central computer 300.
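The control-message mechanism might be sketched as follows; the numeric codes (other than the "324" fractions example from the text) and the actions they trigger are invented for illustration.

```java
import java.util.Map;

// Sketch of the testing computer's message database 276: numeric control
// messages carried in the performance indication select adjustments to
// future testing sessions.
public class ControlMessageHandler {
    private static final Map<Integer, String> MESSAGE_DATABASE = Map.of(
            324, "FOCUS_FRACTIONS",
            325, "FOCUS_DECIMALS",
            400, "TUTOR_BEFORE_NEXT_TEST");

    public static void apply(int controlMessage) {
        switch (MESSAGE_DATABASE.getOrDefault(controlMessage, "NO_OP")) {
            case "FOCUS_FRACTIONS" -> System.out.println("Next test weighted toward fractions");
            case "FOCUS_DECIMALS" -> System.out.println("Next test weighted toward decimals");
            case "TUTOR_BEFORE_NEXT_TEST" -> System.out.println("Tutoring before next test");
            default -> System.out.println("No adjustment for message " + controlMessage);
        }
    }

    public static void main(String[] args) {
        apply(324); // control message extracted from the performance indication
    }
}
```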
The performance indication 1150 could use cryptographic techniques, similar to those used for the test results, to ensure authenticity and/or integrity.

Referring now to Figure 9(a), there is shown an exemplary embodiment of a process for providing the performance indication 1150 to the test-taker. At step 910, the test-taker enters the performance indication 1150 into the testing computer 200, which verifies the performance indication 1150 to ensure that it is valid. Then, at step 920, the testing computer 200 displays basic statistics as to the relative performance of the test-taker. The amount of information contained within the performance indication is limited by its transmission between the test-taker (or his parent) and the central computer 300 in audio form over the telephone 120 of Figure 1, including such factors as the amount of data the test-taker is willing to enter or the rate at which the test-taker can listen to and enter the data. Those skilled in the art will appreciate that other transmission methods, such as the use of Dual Tone Multi-Frequency ("DTMF") or other modulated tones, can significantly increase the amount of information transferred without requiring the test-taker to manually enter the data. In that case, an acoustic coupler/demodulator associated with the testing computer 200 would convert the audio DTMF tones into a digital bitstream without any human intervention. Of course, such modulated tones could also be used for the initial transmission of the encoded test result 1100 from the testing computer 200 to the central computer 300.

Alternatively or additionally, the central computer 300 could also generate a detailed comparison report to be mailed to an end user (perhaps the test-taker himself), as shown in Figure 9(b). The test-taker would make such a request via, or during transmission of, the encoded score code 1100 sent to the central computer 300 using the IVRU 370. These more detailed test results could indicate how well the test-taker has performed based on his total score and/or individual questions, based on an appropriate geographic or demographic test benchmark.

It was mentioned previously that the performance indication 1150 may be provided to the test-taker for input into the testing computer 200. Alternatively, the performance indication 1150 may be provided to an end user other than the test-taker himself. Either way, the performance indication 1150 represents a certification of the test result, by the central computer, to an end user monitoring the test-taker's performance. Thus, the performance indication 1150 may incorporate a reward attainment message, specifying the test-taker's achievement of a specified performance level, that can be subsequently accessed by the end user at some time after the testing session.
Referring now to Figure 10, there is shown an exemplary embodiment of a process for an end user to administer a reward system based on the performance indication 1150. The reward system consists of two stages: specifying a reward level (in steps 1010-1060) and checking the child's performance (in steps 1070-1090). At step 1010, the test-taker's parent (or teacher) calls the central computer 300 using a telephone, in a manner similar to the test-taker's test result registration process. During initialization, the parent is prompted (at step 1020) by the central computer's IVRU 370 to enter his ID number (at step 1030), the software identifier (at step 1040), and a reward threshold to be monitored by the central computer 300 (at step 1050). The reward threshold could be in addition to, or part of, the test benchmark described previously in the context of test result evaluation. During a subsequent testing session, at steps 1070-1080, the central computer 300 checks the score code saved in the test-taker database 380 against the specified reward threshold. If appropriate, at step 1085, a reward attainment message is included in the performance indication 1150 provided by the IVRU 370 to the test-taker over the phone. The test-taker then enters the performance indication 1150 into the testing computer 200, where it can be accessed at a later time by the test-taker's parent. Alternatively, the parent could himself call the central computer 300 to directly receive the performance indication 1150 including the reward attainment message. The parent could then use the reward attainment message to provide a designated reward to the test-taker.
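A sketch of the threshold check at steps 1070-1085 follows; the record fields, the use of a percentile rank as the threshold, and the message format are all illustrative assumptions.

```java
// Sketch of the reward check: the latest stored score is compared against the
// parent's registered threshold and, if met, a reward attainment message is
// appended to the performance indication.
public class RewardCheck {
    record RewardThreshold(long personalId, int softwareId, int minimumPercentile) {}

    static String buildPerformanceIndication(int percentileRank, RewardThreshold threshold) {
        StringBuilder indication = new StringBuilder("PERCENTILE=" + percentileRank);
        if (percentileRank >= threshold.minimumPercentile()) {
            indication.append(";REWARD_ATTAINED"); // later read back by the parent
        }
        return indication.toString();
    }

    public static void main(String[] args) {
        RewardThreshold threshold = new RewardThreshold(123456L, 17, 80);
        System.out.println(buildPerformanceIndication(85, threshold)); // PERCENTILE=85;REWARD_ATTAINED
        System.out.println(buildPerformanceIndication(60, threshold)); // PERCENTILE=60
    }
}
```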
Alternative Embodiments of the Invention
1) Cryptographic Techniques
It was mentioned previously that encryption was the encoding protocol used for certifying the test results. Provided the encryption key has not been compromised, if the central computer 300 can decrypt the encrypted test result 1100, it is known to be authentic. Alternatively, the cryptographic technique of "one-way functions" may be used to ensure test result integrity. As used herein, a one-way function is one that outputs a unique representation of an input such that a given output is likely only to have come from its corresponding input, and such that the input can not be readily deduced from the output. Thus, the term one-way function includes hashes, message authenticity codes (MACs -- keyed one-way functions), cyclic redundancy checks (CRCs), and other techniques well known to those skilled in the art. See, for example, Bruce Schneier, "Applied Cryptography," Wiley, 1996. As a matter of convenience, the term "hash" will be understood to represent any of the aforementioned or other one-way functions throughout this discussion.

Referring now to Figure 12, there is shown an exemplary embodiment of a process for encoding test results at the testing computer 200 using hash functions. The process begins, at step 1210 (the same as step 510 of Figure 5), with the testing computer 200 incorporating the test results into a numeric score code. As shown in step 1220, CPU 220 then uses a software hashing algorithm incorporated in software 272 to hash the score code. Alternatively, the hashing algorithm could be stored in RAM 260 or ROM 265, or it could be hardwired in a special dedicated cryptoprocessor (e.g., a test-taker's cryptographic token) separate from the CPU 220. The result is an encoded test result 1100 comprising the (cleartext) score code and a (ciphertext) one-way function representative of at least a portion of the score code. This encoded test result 1100 is displayed on video monitor 250, at step 1230, along with an 800 number for the test-taker to call to register the hash, at step 1240. Notice that, in contrast to the encryption embodiment in which the test result could be displayed only in encoded form, the hashed test result must also be made available to the central computer 300 in cleartext form. If it is desired to prevent the test-taker from seeing his actual test result (e.g., to prevent multiple test-taking as described previously), the hashing can be performed in conjunction with encryption. For example, in step 1225, the test result is first encrypted prior to hashing.
Alternatively, the test result can first be hashed and then encrypted. The use of hashing followed by encryption is often referred to as a digital signature. The encryption operation ensures test result authenticity, in addition to the test result integrity provided by the hashing operation. Finally, instead of or in addition to encryption, a unique device identification number (see Figure 11(a)) can be added to the encoded test result 1100 to provide assurance of authenticity.

Referring now to Figure 13, there is shown an exemplary embodiment of a decoding process corresponding to Figure 12. At step 1310, the encoded test result 1100 is sent to central computer 300, which verifies the hashed test result by reading the cleartext part of the encoded test result (e.g., the score code and device or software ID) and the ciphertext part of the encoded test result (e.g., a hash of a portion of the cleartext part). At step 1320, the central computer 300 retrieves the appropriate hash function and, in step 1330, performs an identical hashing algorithm on the appropriate portion of the cleartext part to recompute the hash. If the encoded test result uses any form of encryption in addition to hashing, the central computer 300 would locate and perform the appropriate cryptographic protocol to decrypt the encrypted portion of the test result in the appropriate manner. At steps 1340 and 1350, the received and recomputed hashes are compared to determine that the test result came from the testing computer 200 and had not been altered subsequent to transmission. If so, at step 1360, the score code is stored in score code database 274.
Else, the score code is rejected at step 1370, and the test-taker is asked to re-enter the encoded test result at step 1380.
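To make the hash-and-verify round trip of Figures 12 and 13 concrete, here is a minimal sketch in which SHA-256 stands in for whichever one-way function the software actually embeds; the cleartext-colon-hash layout is an invented convention, not the patent's format.

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.HexFormat;

// Sketch of the one-way-function embodiment: the testing computer appends a
// hash of the score code (step 1220), and the central computer recomputes the
// hash over the received cleartext part and compares (steps 1330-1350).
public class HashedTestResult {

    // Testing computer 200: cleartext score code plus its hash.
    static String encode(String scoreCode) throws Exception {
        byte[] hash = MessageDigest.getInstance("SHA-256")
                .digest(scoreCode.getBytes(StandardCharsets.UTF_8));
        return scoreCode + ":" + HexFormat.of().formatHex(hash);
    }

    // Central computer 300: accept only if the recomputed hash matches.
    static boolean verify(String encodedTestResult) throws Exception {
        String[] parts = encodedTestResult.split(":", 2);
        byte[] received = HexFormat.of().parseHex(parts[1]);
        byte[] recomputed = MessageDigest.getInstance("SHA-256")
                .digest(parts[0].getBytes(StandardCharsets.UTF_8));
        return MessageDigest.isEqual(received, recomputed); // constant-time compare
    }

    public static void main(String[] args) throws Exception {
        String encoded = encode("2340-9908-0011");
        System.out.println(verify(encoded));                       // true
        System.out.println(verify("9999" + encoded.substring(4))); // false: score tampered
    }
}
```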
Certain well-known enhancements to public key cryptography could also be used to provide greater assurance. For example, the encoded test result could include digital certificates for public key distribution to a central computer 300 that does not know the testing computer's public key needed to verify a test result encrypted with the testing computer's private key. In a digital certificate, the testing computer's public key is encrypted (and vouched for) by the private key of a trusted certifier (e.g., a well known manufacturer of the measurement certification device) whose public key is known to the central computer 300. The central computer 300 uses the certifier's public key to decrypt the testing computer's public key, then uses the testing computer's public key to decrypt the encrypted test result. Alternatively, the central computer 300 could simply obtain the testing computer's public key from a publicly accessible database, eliminating the need for digital certificates.
Another commonly used cryptographic technique, the so-called challenge-response protocol (CRP), may be used to ensure to a recipient that an encoded test result is current, i.e., not a copy of a previously generated encoded test result. During test result registration, the central computer 300 generates and transmits a datum (also referred to as a "nonce") to the testing computer 200. The testing computer 200 then incorporates this random number in the encoded test result 1100 transmitted to the central computer 300. If the received random number matches the random number previously generated, the central computer 300 accepts the encoded test result 1100 as fresh. Conversely, an old encoded test result 1100 would contain a non-matching random number. Those skilled in the art will appreciate that the challenge can use any datum whose value is unpredictable by the testing computer 200; random numbers happen to be a particularly convenient choice.
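A bare-bones sketch of this challenge-response exchange, with the central computer's bookkeeping reduced to a single outstanding nonce per testing computer, might look like this:

```java
import java.security.SecureRandom;

// Sketch of the challenge-response protocol: the central computer issues a
// nonce at registration time, the testing computer folds it into the encoded
// test result, and the result is accepted only if the nonce matches.
public class ChallengeResponse {
    private static final SecureRandom RANDOM = new SecureRandom();
    private long outstandingNonce; // the datum most recently issued to this testing computer

    long issueChallenge() {
        outstandingNonce = RANDOM.nextLong();
        return outstandingNonce;
    }

    boolean acceptAsFresh(long nonceInTestResult) {
        return nonceInTestResult == outstandingNonce; // replayed results carry a stale nonce
    }

    public static void main(String[] args) {
        ChallengeResponse central = new ChallengeResponse();
        long challenge = central.issueChallenge();                // sent to testing computer 200
        System.out.println(central.acceptAsFresh(challenge));     // true: current result
        System.out.println(central.acceptAsFresh(challenge - 1)); // false: replay of an old result
    }
}
```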
Although public key and symmetric key cryptography have been described for the encryption of the test result, those skilled in the art will realize that simpler cryptographic protocols may also be used. For example, substitution ciphers or transposition ciphers offer lower levels of security, but require far less computing power and can be more easily integrated into a software package.
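As an example of such a lightweight protocol, the toy digit-substitution cipher below scrambles a score code with a fixed secret table; the table itself is invented, and such a scheme offers far weaker protection than RSA or DES.

```java
// Toy digit-substitution cipher of the kind mentioned above: cheap enough to
// embed directly in the educational software. The secret table is invented.
public class DigitSubstitution {
    private static final char[] TABLE = "3947018265".toCharArray(); // digit d maps to TABLE[d]

    static String scramble(String scoreCode) {
        StringBuilder out = new StringBuilder();
        for (char c : scoreCode.toCharArray()) {
            out.append(Character.isDigit(c) ? TABLE[c - '0'] : c);
        }
        return out.toString();
    }

    public static void main(String[] args) {
        System.out.println(scramble("2340-9908-0011")); // prints 4703-5536-3399
    }
}
```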
In another variation, the testing computer 200 includes a sequence number in the encoded test result 1100. This sequence number is incremented by one every time the testing computer 200 generates an encoded test result 1100. The central computer 300 stores the most recent sequence number in memory, and accepts a transmitted encoded test result 1100 if its sequence number is one greater than the last stored sequence number.
In yet another variation, the testing computer 200 includes the current time in the encoded test result 1100 transmitted to the central computer 300. The central computer 300 then checks the time included in the encoded test result 1100 against the time from the central computer's clock 350. If the times are within a prescribed window, the encoded test result 1100 is accepted as fresh.
In still another procedure, the testing computer 200 itself generates a random number to be included in the encoded test result 1100. The central computer 300 maintains a database of all random numbers received from all testing computers. If the new random number is not in that database, then the current encoded test result 1100 is accepted as fresh. If a time element is incorporated as well, then the central computer 300 only has to store a relatively small quantity of received random numbers. In any of the above variations, reused encoded test results are prevented (or at least detectable) because a reused encoded test result would contain a datum corresponding to a previous request/reply pair, rather than the current datum.
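The random-number-plus-time variant might be sketched as follows; the ten-minute acceptance window is an arbitrary illustrative choice.

```java
import java.time.Duration;
import java.time.Instant;
import java.util.HashSet;
import java.util.Set;

// Sketch of the freshness check at the central computer: reject any result
// whose timestamp falls outside the window or whose random number has been
// seen before. With the time element, old numbers can eventually be pruned.
public class FreshnessCheck {
    private static final Duration WINDOW = Duration.ofMinutes(10);
    private final Set<Long> seenRandomNumbers = new HashSet<>();

    boolean acceptAsFresh(long randomNumber, Instant resultTime, Instant now) {
        if (Duration.between(resultTime, now).abs().compareTo(WINDOW) > 0) {
            return false;                           // stale or clock-skewed result
        }
        return seenRandomNumbers.add(randomNumber); // false if the number was reused
    }

    public static void main(String[] args) {
        FreshnessCheck central = new FreshnessCheck();
        Instant now = Instant.now();
        System.out.println(central.acceptAsFresh(42L, now, now)); // true
        System.out.println(central.acceptAsFresh(42L, now, now)); // false: replayed number
        System.out.println(central.acceptAsFresh(7L, now.minus(Duration.ofHours(2)), now)); // false: too old
    }
}
```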
Although certain exemplary cryptographic operations (hashing, asymmetric encryption, symmetric encryption, substitution ciphers, transposition ciphers, digital certificates, and challenge-response protocols) have been disclosed for use singly or in specified combinations, those skilled in the art will appreciate that many other combinations of these basic operations may be used, depending on the needs of the specific application.
2) Corroborative Data for Test-Taker Verification
The above-described cryptographic protocols are useful for deterring fraud in the form of test result modification. Alternatively, or in addition, corroborative techniques can be used for deterring fraud in the form of test-taker substitution. Biometric identification devices (e.g., a fingerprint reader, a voice recognition system, or a retinal scanner) may be used to provide absolute test-taker identity verification at the testing computer 200. An example of a fingerprint reader is the Startek FC100 FINGERPRINT VERIFIER, which connects to a PC via a standard interface card. The fingerprint verifier acquires a test-taker's fingerprint when the test-taker places his finger on an optical scanning lens, then scans, digitizes, and compresses the data (typically into a 256-byte file) in memory. During testing, each live scan fingerprint is compared against a previously enrolled/stored template, stored in the testing computer 200. If the prints do not match, access to the educational software 272 can be denied. This procedure may be implemented: 1) before the start of a testing session, 2) during the testing session in response to prompts from the educational software, 3) at some predetermined or random time, or 4) continuously, by incorporating the scanning lens into the testing computer 200 such that the test-taker is required to maintain his finger on the lens at all times during the testing session for continuous verification.
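A sketch of such gating appears below. The reader interface is hypothetical (a real verifier such as the FC100 ships with its own driver API and matching algorithm), and exact byte equality stands in for a real matcher's similarity score.

```java
import java.util.Arrays;

// Sketch of fingerprint gating before a testing session: the live scan is
// compared against the enrolled template stored on the testing computer, and
// access to the educational software is denied on a mismatch.
public class BiometricGate {
    interface FingerprintReader {   // hypothetical driver interface
        byte[] acquireTemplate();   // e.g., a ~256-byte compressed print
    }

    private final byte[] enrolledTemplate;

    BiometricGate(byte[] enrolledTemplate) {
        this.enrolledTemplate = enrolledTemplate;
    }

    boolean allowTestingSession(FingerprintReader reader) {
        byte[] liveScan = reader.acquireTemplate();
        // Real matchers score similarity; exact equality stands in here.
        return Arrays.equals(liveScan, enrolledTemplate);
    }

    public static void main(String[] args) {
        BiometricGate gate = new BiometricGate(new byte[] {1, 2, 3, 4});
        System.out.println(gate.allowTestingSession(() -> new byte[] {1, 2, 3, 4})); // true: proceed
        System.out.println(gate.allowTestingSession(() -> new byte[] {9, 9, 9, 9})); // false: deny access
    }
}
```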
As another example of a biometric device, a voice verification system, located at either or both the central computer 300 and the testing computer 200, may utilize a person's "voice-print" to verify test-taker identity. The process of obtaining a voice-print and subsequently using it to verify a person's identity is well-known in the art and will not be described in detail herein. Those of ordinary skill in the art will appreciate that suitable voice identification/verification technologies are commercially available from companies such as SpeakEZ, Inc. and others. During initialization, the speaker identification software is used to sample the test-taker's voice, which is stored in the test-taker database 380 at the central computer 300. Each time the test-taker calls the central computer 300 to register a test result, the IVRU 370 prompts the test-taker to speak his or her name into the telephone 120. The speaker identification software then directs the central computer 300 to check the test-taker's current voice-print against the voice-print stored in the test-taker database 380. Unless there is a match, the test result registration procedure is aborted. The voice-print may also be stored in a database in the testing computer 200, to verify the test-taker's identity at that location prior to allowing a test session.

The above-mentioned biometric data are useful for corroborating the test-taker's identity. In addition, other forms of independent data can be added to the encoded test result 1100 to corroborate the testing process. For example, the biometric reader could be used to certify the identity of a witness to the testing process (e.g., a proctor) in addition to the identity of the test-taker himself. Of course, the witness could also be certified by inputting a unique witness identifier into a keypad or other input device at the testing computer 200. Alternatively, a Global Positioning Satellite (GPS) signal receiver could be incorporated with the testing computer 200 to provide a corroborative datum indicating the location of the testing computer.
3) Direct Communication between Testing and Central Computers
The above embodiments have been described with respect to a system utilizing an indirect connection between the testing computer 200 and the central computer 300, i.e., the test-taker himself acts as an information conduit between the testing computer 200 and the central computer 300, in conjunction with telephone 120 and interactive voice response unit 370. As discussed above with respect to the delivery of the performance indication to the test-taker, this may limit the amount of information that can be conveyed between the testing computer 200 and the central computer 300. One solution, when response time is not crucial, involves transmitting such information via non-electronic media (e.g., Figure 9(b)). Another solution would be to establish a direct electronic connection via a dial-up modem, a cable modem, a set-top box, or any other form of electronic communication. For example, conventional modems may be used for communications at rates from 1,200 to 33,000 bits per second. Alternatively, ISDN modems may be used for communications at rates of 64,000 to 128,000 bits per second.
Finally, special network switches may be used in connection with even higher-speed communications links, e.g., over T1 or T3 lines. Such a connection would allow the transmission of vastly greater amounts of information, as well as allowing the testing computer 200 to take advantage of the storage and processing capabilities of the central computer 300.
The central computer's network interface must also be able to support multiple, simultaneous data connections. In a preferred embodiment, the central computer 300 is accessible over the INTERNET or commercial on-line services such as America Online, CompuServe, or Prodigy, allowing multiple test-takers to access the central computer 300 via simultaneous on-line connections. In another embodiment of the invention, the test questions would not have to be pre-stored at the testing computer 200, but could be stored at, and downloaded directly from, the central computer 300. This would give the testing authority much greater flexibility to keep test questions current and confidential. In addition, where the performance indication 1150 is provided to the central computer 300 on an ongoing basis rather than after completion of the entire test, future test questions could be customized in response to the test-taker's ongoing or historical performance. To the extent that downloaded questions are not known in advance by the testing computer 200, they will be unscorable by the testing computer 200. In such cases, the encoded test result 1100 transmitted from the testing computer 200 to the central computer 300 will include encoded but otherwise unscored test answers. The central computer 300 will then score the questions using an answer key as part of the test benchmark. Thus, as used in various exemplary embodiments of the invention disclosed herein, the term test benchmark could include any information useful for evaluating a test result transmitted from the testing computer 200 to the central computer 300. Furthermore, either or both of the test result and the test benchmark could include absolute, relative, or statistical information.

4) At-Home Testing
Figures 14-19 illustrate another embodiment of the invention in which the test questions are not pre-stored at the testing computer 200. Referring now to Figure 14, there is shown an exemplary embodiment of a process for distributing tests to students for examinations taken at home. Teachers (or commercial testing services) create a set of floppy disks with test questions and distribute them to students to take the tests at a given start time. At step 1410, the teacher enters the test questions into the test generator software operating on a standard personal computer. At step 1420, the teacher enters the personal IDs of each student taking the test into the test generator software to ensure that only authorized students can take the test. At step 1430, the teacher enters an unlock code and one or more cryptographic keys of any desired type. The unlock code is used by the test-takers to activate the test session, while the cryptographic keys will be used by the students to encrypt their test results. At step 1440, the teacher enters both a start and finish time for the test, perhaps declaring that the test will begin at 1:00 PM on Saturday and end at 3:00 PM the same day. Lastly, the teacher enters a unique test ID number at step 1450, which allows the central computer 300 to track the results of the test. At step 1460, the test generator software writes tests onto individual disks. The test disks can take many forms including floppy disks, tape cartridges, magneto-optical disks, etc. The disks do not have to be identical. Each disk's questions could be arranged in a different order, or each disk could contain different questions. For example, the teacher could enter a large number of questions into the generator software with randomly
selected subsets of these questions to be generated for different disks. The teacher completes the test generation process by calling the central computer 300 at step 1470, and using an IVRU or other electronic device to register the test ID number, personal IDs of the test-takers, unlock code, start time/stop time, and cryptographic key. The central computer 300 stores this information in a database for later use. At step 1480, the disks are distributed to the students.
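By way of illustration, the disk generation and registration steps (1410-1480) might be sketched in Python as follows; all identifiers and field names here are hypothetical stand-ins rather than elements of the specification:

    import random, secrets

    def make_disks(question_pool, n_disks, per_disk, seed=None):
        # steps 1410/1460: a different random subset of questions for each disk
        rng = random.Random(seed)
        return [rng.sample(question_pool, per_disk) for _ in range(n_disks)]

    def make_registration(test_id, student_ids, start, stop):
        # steps 1420-1450 and 1470: the data registered with the central computer
        return {
            "test_id": test_id,
            "student_ids": list(student_ids),
            "unlock_code": secrets.token_hex(4),  # activates the test session
            "key": secrets.token_bytes(16),       # will encrypt the test results
            "start": start,
            "stop": stop,
        }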
Referring now to Figure 15, there is shown an exemplary embodiment of a process for taking the distributed test. The test-taker brings the test disk home and puts it into his testing computer 200. Without the appropriate unlock code, however, the test-taker is unable to access the test questions. At step 1510, therefore, the test-taker calls the central computer 300 and enters the test ID number. This ID number is preferably printed on the disk itself. The central computer 300 accesses the test database 380 and retrieves the database record for that particular test ID number. At step 1520, the central computer 300 compares the current time to the designated start time of the test. If the current time is prior to the test start time, the test-taker is instructed to call back at the established start time. If the current time is within the time window of the test, the central computer 300 supplies the appropriate unlock code at step 1530. At step 1540, the test-taker enters this unlock code into the test software, allowing access to the software 272. At step 1550 he enters his personal ID number. If either of these values is not accepted by the test software, the test-taker is prevented from continuing. At step 1560 the test-taker begins the testing session.
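The central computer's time-window check (steps 1510-1530) then reduces to comparing the current time against the registered start and stop times, as in this sketch (the database record layout is assumed, not specified):

    from datetime import datetime

    def request_unlock_code(test_id, test_db, now=None):
        # test_db maps test IDs to the records registered at step 1470,
        # with "start" and "stop" stored as datetime values
        record = test_db[test_id]
        now = now or datetime.now()
        if now < record["start"]:
            return None  # caller is instructed to phone back at the start time
        if now > record["stop"]:
            return None  # the testing window has closed
        return record["unlock_code"]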
Referring now to Figure 16, there is shown an exemplary embodiment for the process of encoding the test results for transmission to the central computer 300. At
step 1610, the test-taker's answers are aggregated into a test result, which is encrypted at step 1620 using the key entered into the test disk at the time the disk was generated. At step 1630, the encoded test result 1100 is displayed on the video monitor 150 of the testing computer 200. At step 1640, the 800 number of the central computer 300 is similarly displayed.
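For illustration, steps 1610-1640 might be sketched as follows. The XOR keystream derived from the disk key is purely illustrative (a deployed system would use a vetted cipher such as DES or triple-DES), and the base-32 string stands in for the code displayed on the monitor:

    import base64, hashlib, json

    def keystream(key, length):
        out, counter = b"", 0
        while len(out) < length:
            out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
            counter += 1
        return out[:length]

    def encode_result(answers, disk_key):
        # steps 1610/1620: aggregate the answers and encrypt them under the disk key
        plaintext = json.dumps(answers, sort_keys=True).encode()
        cipher = bytes(p ^ s for p, s in zip(plaintext, keystream(disk_key, len(plaintext))))
        # step 1630: the base-32 string is what the monitor displays to the test-taker
        return base64.b32encode(cipher).decode()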
Referring now to Figure 17, there is shown an exemplary embodiment of a process for transmitting the encoded test results 1100 to the central computer 300. At step 1720, the test-taker calls the 800 number to register his test results with the central computer 300. At step 1730, the IVRU 370 prompts the test-taker who, at step 1740, enters the test ID number. At step 1750, the test-taker enters his personal ID number and, at step 1760, enters the encoded test result 1100. This result is timestamped by the central computer 300 at step 1770 so that it can be checked later for compliance with the established end time of the test.
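On the central computer's side, the registration of steps 1720-1770 amounts to storing the entered values together with an arrival timestamp; a minimal sketch, with an assumed in-memory database:

    from datetime import datetime

    def register_encoded_result(test_id, personal_id, encoded_result, result_db):
        # steps 1740-1770: store the entered values with an arrival timestamp
        result_db.setdefault(test_id, []).append({
            "personal_id": personal_id,
            "encoded_result": encoded_result,
            "timestamp": datetime.now(),  # later checked against the end time
        })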
Referring now to Figure 18, there is shown an exemplary embodiment of a process for decoding the encoded test result 1100 within central computer 300. At step 1810, the encoded test result 1100 is sent to the cryptographic processor 320. At step 1820, the central computer 300 looks up the test ID number to find the cryptographic key used to encrypt the test result. This key is transmitted at step 1830 to the cryptographic processor 320 where, at step 1840, the encoded test result 1100 is decrypted and scored. The answer key used for scoring could have been previously transmitted from the teacher to the central computer during the test registration process at step 1470 in Figure 14. At step 1850, the scored test result 1100 is saved in the test result database 390.
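Decoding inverts the illustrative encoding sketched above, after which scoring is a direct comparison against the teacher's answer key; again, the keystream construction is an assumption of the sketch, not the specification:

    import base64, hashlib, json

    def keystream(key, length):  # same illustrative construction as above
        out, counter = b"", 0
        while len(out) < length:
            out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
            counter += 1
        return out[:length]

    def decode_and_score(encoded_result, disk_key, answer_key):
        # steps 1820-1840: decrypt with the key looked up by test ID, then score
        cipher = base64.b32decode(encoded_result)
        plain = bytes(c ^ s for c, s in zip(cipher, keystream(disk_key, len(cipher))))
        answers = json.loads(plain)
        return sum(1 for q, a in answers.items() if answer_key.get(q) == a)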
Referring now to Figure 19, there is shown an exemplary embodiment of a process for transmitting the scored test results back to the teacher who generated the test. At step 1910, the central computer 300 generates a list of all scores for the given test ID number. Along with these scores, at step 1920, are the timestamps associated with each score. At step 1930, the scores are separated into two lists -- those with timestamps before the stop time, and those with timestamps after the stop time. At step 1940, these scores are sent to the teacher. Scores whose timestamps exceed the pre-established end time for the test may be rejected or penalized in an appropriate manner.
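The separation at step 1930 amounts to a simple partition on the recorded timestamps, for example:

    def split_by_deadline(scored_results, stop_time):
        on_time = [r for r in scored_results if r["timestamp"] <= stop_time]
        late = [r for r in scored_results if r["timestamp"] > stop_time]
        return on_time, late  # late scores may be rejected or penalized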
In the above embodiment, the testing computer 200 transmits unscored test answers to the central computer 300 for scoring at the central computer 300. In a variation of the above embodiment, the encoded test results could be scored by the testing computer 200 before being transmitted to the central computer 300. This requires that an answer key be available to the testing computer 200 along with the test questions. This answer key may have been provided in encrypted form, along with the test questions, on the disk created by the teacher's test generator software and provided to the student prior to testing.
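Under the same illustrative keystream assumption, local scoring needs only the decrypted answer key, as in this sketch:

    import hashlib, json

    def keystream(key, length):  # same illustrative construction as above
        out, counter = b"", 0
        while len(out) < length:
            out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
            counter += 1
        return out[:length]

    def score_locally(student_answers, encrypted_answer_key, disk_key):
        # the answer key travels on the disk in encrypted form and is
        # decrypted only inside the test software
        stream = keystream(disk_key, len(encrypted_answer_key))
        answer_key = json.loads(bytes(c ^ s for c, s in zip(encrypted_answer_key, stream)))
        return sum(1 for q, a in student_answers.items() if answer_key.get(q) == a)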
5) Distributed Computing

In another embodiment of the invention, the testing software could be programmed in a distributed computing language, such as Sun's Java, that allows both the functional (testing algorithms) and non-functional (test content) aspects of the testing software to be integrated into "executable content" accessible from a remote server over the World Wide Web. Even more sophisticated distributed computing protocols allow the processing of information by different computers operating in a networked environment. Allowing the central computer 300 (or any other networked computer) to perform portions
of the computational processing that would otherwise be performed by the testing computer 200 greatly reduces the cost of the testing computer 200, making it more affordable to a wider market. For example, rather than locally scoring the test questions, the testing computer 200 could upload the encoded, unscored answers to a networked computer for remote scoring.
In a similar manner, while the above embodiments describe a single computer acting as the central computer 300, those skilled in the art will recognize that the functionality of the central computer 300 can be distributed among a plurality of computers. Thus, in an alternative embodiment of the invention, the central computer 300 may be configured as a distributed architecture wherein the databases and cryptographic processors are housed in physically separate units. In such a configuration, each unit is in communication with its constituent parts, but the processor and/or data storage functions are performed by stand-alone units. This arrangement yields a more dynamic and flexible system, less prone to catastrophic hardware failures affecting the entire system.
For purposes of illustration only, and not to limit generality, the present invention has been explained with reference to various examples of time sources, cryptographic operations, output devices, and sensors. However, one skilled in the art will appreciate that the invention is not limited to the particular illustrated embodiments or applications, but includes many others that operate in accordance with the principles disclosed herein.

Claims

What is claimed is:
1. A computer device, comprising: a memory device having control code embodied therein; and a processor disposed in communication with the memory device, the processor configured to receive a test result encoded by an electronic testing device associated with a test-taker, decode the test result, compare the test result with a plurality of stored test results, generate a performance indication based upon the comparison and provide the performance indication to an end-user.
2. The computer device of claim 1, wherein the test result includes a test-taker identifying datum.
3. The computer device of claim 2, wherein the memory device further comprises a test-taker database and a test result database; wherein the test-taker database comprises a record containing a first field representing the test-taker identifying datum and a second field representing a software identifying datum; and wherein the test result database comprises a plurality of records, each record containing a third field representing one of the stored test results and a fourth field representing a corresponding software identifier.
4. The computer device of claim 3, wherein the processor is further configured to: retrieve the record from the test-taker database and locate the software identifying datum;
retrieve from the test result database stored test results having a corresponding software identifier that is the same as the software identifying datum; and compare the decoded test result relative to the stored test results.
5. The computer device of claim 4, wherein the test-taker database further includes a reward threshold provided by the end-user, and wherein the processor is further configured to compare the test result relative to the reward threshold and provide a reward attainment message if the test result exceeds the reward threshold.
6. The computer device of claim 5, wherein the record of the test-taker database further includes a fifth field representing a geographical location of the test- taker and each record of the test result database further includes a geographical identifier, and wherein the processor is further configured to retrieve, from the stored test results having a corresponding software identifier that is the same as the software identifying datum, stored test results having a geographical identifier that is the same as the geographical location of the test-taker before the comparing step.
7. A computer device, comprising: a memory device having control code embodied therein; and a processor disposed in communication with the memory device, the processor configured to provide test questions stored in a database to a test-taker associated with an electronic testing device, receive a list of unscored test answers from the electronic testing device, generate a test result, compare the test result with a plurality of stored test results, generate a
performance indication based upon the comparison and provide the performance indication to an end-user.
8. The computer device of claim 7, wherein the test result includes a test-taker identifying datum.
9. The computer device of claim 8, wherein the memory device further comprises a test-taker database and a test result database; wherein the test-taker database comprises a record containing a first field representing the test-taker identifying datum and a second field representing a software identifying datum; and wherein the test result database comprises a plurality of records, each record containing a third field representing one of the stored test results and a fourth field representing a corresponding software identifier.
10. The computer device of claim 9, wherein the processor is further configured to: retrieve the record from the test-taker database and locate the software identifying datum; retrieve from the test result database stored test results having a corresponding software identifier that is the same as the software identifying datum; and compare the decoded test result relative to the stored test results.
11. The computer device of claim 10, wherein the test-taker database further includes a reward threshold provided by the end-user, and wherein the processor is further configured to compare the test result relative to the reward threshold and provide a reward attainment message if the test result exceeds the reward threshold.
12. A computer device, comprising: a memory device having control code embodied therein; and a processor disposed in communication with the memory device, the processor configured to receive testing information from an end-user, regulate access to a test by a test-taker in response to the testing information, receive at least one test result encoded by at least one electronic testing device associated with at least one test-taker, decode the test result, evaluate the decoded test result, generate a performance indication and provide the performance indication to an end-user.
13. A computer device, comprising: a memory device having control code embodied therein; and a processor disposed in communication with the memory device, the processor configured to receive a test result encoded by an electronic testing device associated with a test-taker, decode the test result, evaluate the decoded test result, generate a performance indication, compare the test result against a specified reward threshold and provide the performance indication to an end-user if the test-taker has attained the reward threshold.
14. A testing method, comprising the steps of: receiving a test result encoded by an electronic testing device associated with a test-taker; decoding the test result; comparing the test result with a plurality of stored test results; generating a performance indication based upon the comparison; and
providing the performance indication to an end-user.
15. A computer device, comprising: a memory device having encoding control code embodied therein; and a processor disposed in communication with the memory device, the processor configured to administer a test to a test-taker, generate a test result and performance data and incorporate the test result and performance data into a numeric score code, encode the numeric score code using an encoding algorithm unknown to the test-taker and transmit the encoded test result to a central computer configured to evaluate the performance of the test-taker and provide a performance indication.
16. The computer device of claim 15, wherein the numeric score code includes a test-taker identifying datum.
17. The computer device of claim 15, wherein the performance data includes a number of questions answered correctly, a number of questions answered incorrectly and a time required to answer each question.
18. The computer device of claim 15, wherein the processor is further configured to receive the performance indication from the central computer, wherein the performance indication includes a control message configured to indicate a reward to be provided to the test-taker.
19. The computer device of claim 18, wherein the processor is further configured to receive from an end-user a threshold for receiving the reward.
20. The computer device of claim 19, wherein the numeric score code includes the threshold for receiving the reward.
21. The computer device of claim 18, wherein the performance indication is encoded at the central computer, and wherein the processor is further configured to decode the received performance indication.
22. A computer device, comprising: a memory device having encoding control code embodied therein; and a processor disposed in communication with the memory device, the processor configured to receive test questions from a central computer, administer the received test questions to a test-taker, transmit answers to the test questions by the test-taker to the central computer configured to generate a test result, evaluate the test result and generate a performance indication, and display the performance indication generated by the central computer.
23. A testing method, comprising the steps of: administering a test to a test-taker; generating a test result and performance data and incorporating the test result and performance data into a numeric score code; encoding the numeric score code using an encoding algorithm unknown to the test-taker; and transmitting the encoded test result to a central computer configured to evaluate the performance of the test-taker and provide a performance indication.
PCT/US1997/008566 1996-05-09 1997-05-08 Method and apparatus for computer-based educational testing WO1997042615A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU31352/97A AU3135297A (en) 1996-05-09 1997-05-08 Method and apparatus for computer-based educational testing

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US08/647,301 1996-05-09
US08/647,301 US5947747A (en) 1996-05-09 1996-05-09 Method and apparatus for computer-based educational testing

Publications (1)

Publication Number Publication Date
WO1997042615A1 (en)

Family

ID=24596414

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1997/008566 WO1997042615A1 (en) 1996-05-09 1997-05-08 Method and apparatus for computer-based educational testing

Country Status (3)

Country Link
US (1) US5947747A (en)
AU (1) AU3135297A (en)
WO (1) WO1997042615A1 (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0910844A1 (en) * 1997-03-11 1999-04-28 Sylvan Learning Systems, Inc. Method and system for administrating of remotely-proctored secure examination
FR2773410A1 (en) * 1997-12-09 1999-07-09 Tokheim Corp TEACHING SYSTEM AND METHOD
FR2775820A1 (en) * 1998-03-03 1999-09-10 Elti Remote analysis of the profile of persons being assessed for their knowledge or of opinion survey data
EP1003142A1 (en) * 1998-11-17 2000-05-24 Alcatel Method for automatic monitoring of the achievment of teaching goals by a computer
WO2001031610A1 (en) * 1999-10-27 2001-05-03 Arc Research & Development Pty Limited A data collection method
CN102804742A (en) * 2009-06-11 2012-11-28 阿尔卡特朗讯公司 Method and application for parental control for using a terminal
EP2606451A2 (en) * 2010-08-16 2013-06-26 Extegrity Inc. Systems and methods for detecting substitution of high-value electronic documents
CN108352044A (en) * 2015-11-24 2018-07-31 索尼公司 Information processing unit, information processing method and program

Families Citing this family (227)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5758257A (en) * 1994-11-29 1998-05-26 Herz; Frederick System and method for scheduling broadcast of and access to video programs and other data using customer profiles
US6393253B1 (en) * 1995-12-02 2002-05-21 Mediafive Corporation Data processing device
US7483670B2 (en) 1996-05-09 2009-01-27 Walker Digital, Llc Method and apparatus for educational testing
US5727950A (en) * 1996-05-22 1998-03-17 Netsage Corporation Agent based instruction system and method
US6427063B1 (en) 1997-05-22 2002-07-30 Finali Corporation Agent based instruction system and method
US8342854B2 (en) * 1996-09-25 2013-01-01 Educate Online Technology, Llc Language-based computer generated instructional material
US6733296B2 (en) 1996-09-25 2004-05-11 Sylvan Learning Systems, Inc. Learning system and method for holding incentive-based learning
US6729885B2 (en) 1996-09-25 2004-05-04 Sylvan Learning Systems, Inc. Learning system and method for engaging in concurrent interactive and non-interactive learning sessions
US6804489B2 (en) 1996-09-25 2004-10-12 Laureate Education, Inc. Learning system and method for teacher load balancing
US20030198930A1 (en) * 1997-09-24 2003-10-23 Sylvan Learning Systems, Inc. System and method for conducting a learning session based on a teacher privilege
US6733295B2 (en) 1996-09-25 2004-05-11 Sylvan Learning Systems, Inc. Learning system for enabling separate teacher-student interaction over selected interactive channels
DK0934581T3 (en) * 1996-09-25 2003-03-24 Sylvan Learning Systems Inc System for automated testing and electronic dissemination of curriculum and student administration
US6175841B1 (en) * 1997-07-17 2001-01-16 Bookette Software Company Computerized systems for producing on-line instructional materials
US6112049A (en) 1997-10-21 2000-08-29 The Riverside Publishing Company Computer network based testing system
DE19819920C2 (en) * 1998-05-05 2000-03-02 Andreas Unsicker Method and device for testing the correctness of answers to quiz questions via media and events, in particular television and radio
US6178308B1 (en) * 1998-10-16 2001-01-23 Xerox Corporation Paper based intermedium for providing interactive educational services
US20020030096A1 (en) * 1999-01-25 2002-03-14 David Isherwood Method and system for directing end user to selected network location of provider based on user-provided codes
US6993580B2 (en) * 1999-01-25 2006-01-31 Airclic Inc. Method and system for sharing end user information on network
US6633223B1 (en) * 1999-02-10 2003-10-14 Symbol Technologies, Inc. Wireless LAN scholastic tracking system
US6513042B1 (en) * 1999-02-11 2003-01-28 Test.Com Internet test-making method
US20020026321A1 (en) 1999-02-26 2002-02-28 Sadeg M. Faris Internet-based system and method for fairly and securely enabling timed-constrained competition using globally time-sychronized client subsystems and information servers having microsecond client-event resolution
US7058817B1 (en) 1999-07-02 2006-06-06 The Chase Manhattan Bank System and method for single sign on process for websites with multiple applications and services
AU6230500A (en) * 1999-07-16 2001-02-05 Marathon Entertainment, Inc. Trusted communications between untrusting parties
US8527337B1 (en) 1999-07-20 2013-09-03 Google Inc. Internet based system and apparatus for paying users to view content and receiving micropayments
US7886008B2 (en) * 1999-07-28 2011-02-08 Rpost International Limited System and method for verifying delivery and integrity of electronic messages
US6758754B1 (en) * 1999-08-13 2004-07-06 Actv, Inc System and method for interactive game-play scheduled based on real-life events
US6965752B2 (en) * 1999-08-27 2005-11-15 Ecollege.Com On-line educational system having an electronic notebook feature
US6688891B1 (en) * 1999-08-27 2004-02-10 Inter-Tares, Llc Method and apparatus for an electronic collaborative education process model
US7949722B1 (en) 1999-09-29 2011-05-24 Actv Inc. Enhanced video programming system and method utilizing user-profile information
US6985889B1 (en) * 1999-09-30 2006-01-10 Topiary Communications, Inc. System and method for sharing of expert knowledge
US7337159B2 (en) * 1999-09-30 2008-02-26 Topiary Communications, Inc. System and method for sharing of expert knowledge
WO2001033477A2 (en) 1999-11-04 2001-05-10 Jpmorgan Chase Bank System and method for automated financial project management
US20040202308A1 (en) * 1999-11-16 2004-10-14 Knowlagent, Inc. Managing the selection of performance interventions in a contact center
US20060233346A1 (en) * 1999-11-16 2006-10-19 Knowlagent, Inc. Method and system for prioritizing performance interventions
US20050175971A1 (en) * 1999-11-16 2005-08-11 Knowlagent, Inc., Alpharetta, Ga Method and system for scheduled delivery of training to call center agents
US20040202309A1 (en) * 1999-11-16 2004-10-14 Knowlagent, Inc. Managing the rate of delivering performance interventions in a contact center
US8571975B1 (en) 1999-11-24 2013-10-29 Jpmorgan Chase Bank, N.A. System and method for sending money via E-mail over the internet
US10275780B1 (en) 1999-11-24 2019-04-30 Jpmorgan Chase Bank, N.A. Method and apparatus for sending a rebate via electronic mail over the internet
US6681098B2 (en) 2000-01-11 2004-01-20 Performance Assessment Network, Inc. Test administration system using the internet
US6505031B1 (en) * 2000-02-25 2003-01-07 Robert Slider System and method for providing a virtual school environment
US6775377B2 (en) 2001-09-10 2004-08-10 Knowlagent, Inc. Method and system for delivery of individualized training to call center agents
US7505921B1 (en) 2000-03-03 2009-03-17 Finali Corporation System and method for optimizing a product configuration
JP4961575B2 (en) 2000-03-31 2012-06-27 オープンティービー、インコーポレイテッド System and method for regional metadata insertion
US6544042B2 (en) 2000-04-14 2003-04-08 Learning Express, Llc Computerized practice test and cross-sell system
US7050753B2 (en) * 2000-04-24 2006-05-23 Knutson Roger C System and method for providing learning material
US7043193B1 (en) * 2000-05-09 2006-05-09 Knowlagent, Inc. Versatile resource computer-based training system
US6685476B1 (en) 2000-05-23 2004-02-03 Robert L. Safran, Sr. Computer-based educational learning
US7085800B2 (en) * 2000-06-01 2006-08-01 Annette M. Abbott Comprehensive system, process and article of manufacture to facilitate institutional, regulatory and individual continuing education requirements via a communications network
US7426530B1 (en) 2000-06-12 2008-09-16 Jpmorgan Chase Bank, N.A. System and method for providing customers with seamless entry to a remote server
US20020019940A1 (en) * 2000-06-16 2002-02-14 Matteson Craig S. Method and apparatus for assigning test and assessment instruments to users
US10185936B2 (en) 2000-06-22 2019-01-22 Jpmorgan Chase Bank, N.A. Method and system for processing internet payments
US7747542B2 (en) 2000-07-28 2010-06-29 Laborcheck, Inc. Method for complying with employment eligibility verification requirements
US6551109B1 (en) 2000-09-13 2003-04-22 Tom R. Rudmik Computerized method of and system for learning
US6778807B1 (en) 2000-09-15 2004-08-17 Documus, Llc Method and apparatus for market research using education courses and related information
US8335855B2 (en) 2001-09-19 2012-12-18 Jpmorgan Chase Bank, N.A. System and method for portal infrastructure tracking
US7099620B2 (en) * 2000-09-22 2006-08-29 Medical Council Of Canada Method and apparatus for administering an internet based examination to remote sites
AU2002237734A1 (en) 2000-10-30 2002-05-27 Timothy Gayle Goux Method and system for improving insurance premiums and risk of loss
GB0028835D0 (en) * 2000-11-27 2001-01-10 Arenbee Media Consultants Pvt An examination implementation method and system
US20020078139A1 (en) * 2000-12-18 2002-06-20 International Business Machines Corporation System and method of administering exam content
JP4236929B2 (en) * 2000-12-18 2009-03-11 バーリントン・イングリッシュ・リミテッド Access control for interactive learning system
US7996321B2 (en) * 2000-12-18 2011-08-09 Burlington English Ltd. Method and apparatus for access control to language learning system
US7203840B2 (en) * 2000-12-18 2007-04-10 Burlingtonspeech Limited Access control for interactive learning system
US20020082066A1 (en) * 2000-12-22 2002-06-27 Eric Berlin Systems and methods wherein a player positions an item in a list during game play
AU2002245226A1 (en) 2001-01-09 2002-07-24 Topcoder, Inc. Systems and methods for coding competitions
US6988895B1 (en) 2001-01-12 2006-01-24 Ncs Pearson, Inc. Electronic test item display as an image with overlay controls
US20020199118A1 (en) * 2001-02-02 2002-12-26 Medinservice.Com, Inc. Internet training course system and methods
US6871043B2 (en) * 2001-02-02 2005-03-22 Ecollege.Com Variable types of sensory interaction for an on-line educational system
WO2002063525A1 (en) * 2001-02-08 2002-08-15 Jong-Hae Kim The method of education and scholastic management for cyber education system utilizing internet
US7899702B2 (en) * 2001-03-23 2011-03-01 Melamed David P System and method for facilitating generation and performance of on-line evaluations
US20020172930A1 (en) * 2001-03-28 2002-11-21 Sun Microsystems, Inc. Fill-in-the-blank applet
US20020142842A1 (en) * 2001-03-29 2002-10-03 Easley Gregory W. Console-based system and method for providing multi-player interactive game functionality for use with interactive games
US7614014B2 (en) * 2001-04-05 2009-11-03 Daniel Keele Burgin System and method for automated end-user support
US20020147848A1 (en) * 2001-04-05 2002-10-10 Burgin Daniel Keele System and method for enabling communication between browser frames
US8096809B2 (en) * 2001-04-05 2012-01-17 Convergys Cmg Utah, Inc. System and method for automated end-user support
US6789047B1 (en) 2001-04-17 2004-09-07 Unext.Com Llc Method and system for evaluating the performance of an instructor of an electronic course
US7181413B2 (en) * 2001-04-18 2007-02-20 Capital Analytics, Inc. Performance-based training assessment
US8849716B1 (en) 2001-04-20 2014-09-30 Jpmorgan Chase Bank, N.A. System and method for preventing identity theft or misuse by restricting access
US7286793B1 (en) * 2001-05-07 2007-10-23 Miele Frank R Method and apparatus for evaluating educational performance
WO2002099598A2 (en) 2001-06-07 2002-12-12 First Usa Bank, N.A. System and method for rapid updating of credit information
US20030017442A1 (en) * 2001-06-15 2003-01-23 Tudor William P. Standards-based adaptive educational measurement and assessment system and method
US6790045B1 (en) * 2001-06-18 2004-09-14 Unext.Com Llc Method and system for analyzing student performance in an electronic course
US7266839B2 (en) 2001-07-12 2007-09-04 J P Morgan Chase Bank System and method for providing discriminated content to network users
US20060263756A1 (en) * 2001-07-18 2006-11-23 Wireless Generation, Inc. Real-time observation assessment with self-correct
EP1535392A4 (en) * 2001-07-18 2009-09-16 Wireless Generation Inc System and method for real-time observation assessment
US20030028425A1 (en) * 2001-07-31 2003-02-06 Zane Adam S. Method for increasing patronage to a sales enterprise through utilizing an award system
US7922494B2 (en) * 2001-08-28 2011-04-12 International Business Machines Corporation Method for improved administering of tests using customized user alerts
US7103576B2 (en) 2001-09-21 2006-09-05 First Usa Bank, Na System for providing cardless payment
US20030064354A1 (en) * 2001-09-28 2003-04-03 Lewis Daniel M. System and method for linking content standards, curriculum, instructions and assessment
US20030073065A1 (en) * 2001-10-12 2003-04-17 Lee Riggs Methods and systems for providing training through an electronic network to remote electronic devices
US20030074559A1 (en) * 2001-10-12 2003-04-17 Lee Riggs Methods and systems for receiving training through electronic data networks using remote hand held devices
JP2005106836A (en) * 2001-10-12 2005-04-21 Ima International:Kk Medical education system
US20030073471A1 (en) * 2001-10-17 2003-04-17 Advantage Partners Llc Method and system for providing an environment for the delivery of interactive gaming services
US8799024B2 (en) * 2001-10-23 2014-08-05 Timothy Gayle Goux System and method for improving the operation of a business entity and monitoring and reporting the results thereof
CA2466071C (en) 2001-11-01 2016-04-12 Bank One, Delaware, N.A. System and method for establishing or modifying an account with user selectable terms
US7174010B2 (en) * 2001-11-05 2007-02-06 Knowlagent, Inc. System and method for increasing completion of training
EP1316907A2 (en) * 2001-11-30 2003-06-04 Koninklijke Philips Electronics N.V. System for teaching the use of the functions of an apparatus, apparatus and method used in such a system
US20030104344A1 (en) * 2001-12-03 2003-06-05 Sable Paula H. Structured observation system for early literacy assessment
US6895213B1 (en) 2001-12-03 2005-05-17 Einstruction Corporation System and method for communicating with students in an education environment
US7987501B2 (en) 2001-12-04 2011-07-26 Jpmorgan Chase Bank, N.A. System and method for single session sign-on
JP3772205B2 (en) * 2002-02-06 2006-05-10 国立大学法人佐賀大学 Teaching material learning system
US7941533B2 (en) 2002-02-19 2011-05-10 Jpmorgan Chase Bank, N.A. System and method for single sign-on session management without central server
US20030177203A1 (en) * 2002-03-13 2003-09-18 Ncr Corporation Developer tools for web-based education
US20060248504A1 (en) * 2002-04-08 2006-11-02 Hughes John M Systems and methods for software development
US7778866B2 (en) * 2002-04-08 2010-08-17 Topcoder, Inc. Systems and methods for software development
WO2003088119A1 (en) * 2002-04-08 2003-10-23 Topcoder, Inc. System and method for soliciting proposals for software development services
US8776042B2 (en) * 2002-04-08 2014-07-08 Topcoder, Inc. Systems and methods for software support
US20030208613A1 (en) * 2002-05-02 2003-11-06 Envivio.Com, Inc. Managing user interaction for live multimedia broadcast
US8892895B1 (en) 2002-05-07 2014-11-18 Data Recognition Corporation Integrated system for electronic tracking and control of documents
US20030216943A1 (en) * 2002-05-15 2003-11-20 Mcphee Ron Interactive system and method for collecting and reporting health and fitness data
US6772081B1 (en) 2002-05-21 2004-08-03 Data Recognition Corporation Priority system and method for processing standardized tests
US20030228563A1 (en) * 2002-06-11 2003-12-11 Sang Henry W. System and method for creating and evaluating learning exercises
US7640167B2 (en) * 2002-08-16 2009-12-29 Nec Infrontia Corporation Self-service sales management system and method, and its program
US8128414B1 (en) 2002-08-20 2012-03-06 Ctb/Mcgraw-Hill System and method for the development of instructional and testing materials
US7766743B2 (en) * 2002-08-29 2010-08-03 Douglas Schoellkopf Jebb Methods and apparatus for evaluating a user's affinity for a property
US7058660B2 (en) 2002-10-02 2006-06-06 Bank One Corporation System and method for network-based project management
US8301493B2 (en) 2002-11-05 2012-10-30 Jpmorgan Chase Bank, N.A. System and method for providing incentives to consumers to share information
WO2004053292A2 (en) 2002-11-13 2004-06-24 Educational Testing Service Systems and methods for testing over a distributed network
DE10254939A1 (en) * 2002-11-25 2004-06-24 Siemens Ag Telemedicine system for providing online disease diagnosis by a certificated or authenticated grader of medical images or signals, whereby images are entered at one point and a previously authorized person accesses them remotely
DE10254938A1 (en) * 2002-11-25 2004-06-17 Siemens Ag Operating telematic system in health service involves trainer suggesting stored data record for training depending on defined aim, storing suggested record in training database with reference results
DE10254914A1 (en) * 2002-11-25 2004-06-17 Siemens Ag Medical system for evaluation of medical image data, especially image data, by a number of distributed diagnosing personnel, with the variance of the findings automatically calculated based on the variation between individuals
US7930716B2 (en) 2002-12-31 2011-04-19 Actv Inc. Techniques for reinsertion of local market advertising in digital video from a bypass source
US8385811B1 (en) 2003-02-11 2013-02-26 Data Recognition Corporation System and method for processing forms using color
US20040202987A1 (en) * 2003-02-14 2004-10-14 Scheuring Sylvia Tidwell System and method for creating, assessing, modifying, and using a learning map
US7308581B1 (en) 2003-03-07 2007-12-11 Traffic101.Com Systems and methods for online identity verification
US20040229199A1 (en) * 2003-04-16 2004-11-18 Measured Progress, Inc. Computer-based standardized test administration, scoring and analysis system
CA2427786A1 (en) * 2003-05-02 2004-11-02 Auckland Uniservices Limited System, method and computer program for student assessment
US20040230825A1 (en) * 2003-05-16 2004-11-18 Shepherd Eric Robert Secure browser
US20050251416A1 (en) * 2004-05-06 2005-11-10 Greene Jeffrey C Methods for improving the clinical outcome of patient care and for reducing overall health care costs
CN100585662C (en) * 2003-06-20 2010-01-27 汤姆森普罗梅特里克公司 System and method for computer based testing using cache and cacheable objects to expand functionality of a test driver application
US20050282133A1 (en) * 2004-06-18 2005-12-22 Christopher Crowhurst System and method for facilitating computer-based testing using traceable test items
CA2530064C (en) * 2003-06-20 2015-11-24 Thomson Prometric System and method for computer based testing using cache and cacheable objects to expand functionality of a test driver application
US7181158B2 (en) * 2003-06-20 2007-02-20 International Business Machines Corporation Method and apparatus for enhancing the integrity of mental competency tests
US20050003330A1 (en) * 2003-07-02 2005-01-06 Mehdi Asgarinejad Interactive virtual classroom
US20050008998A1 (en) * 2003-07-10 2005-01-13 Munger Chad B. System and method for providing certified proctors for examinations
US7257557B2 (en) * 2003-07-22 2007-08-14 Online Testing Services, Inc. Multi-modal testing methodology
US7158628B2 (en) * 2003-08-20 2007-01-02 Knowlagent, Inc. Method and system for selecting a preferred contact center agent based on agent proficiency and performance and contact center state
US7228484B2 (en) * 2003-09-11 2007-06-05 International Business Machines Corporation Method and apparatus for implementing redundancy enhanced differential signal interface
US8190893B2 (en) 2003-10-27 2012-05-29 Jp Morgan Chase Bank Portable security transaction protocol
US20050101314A1 (en) * 2003-11-10 2005-05-12 Uri Levi Method and system for wireless group communications
US7856374B2 (en) * 2004-01-23 2010-12-21 3Point5 Training retail staff members based on storylines
US8190080B2 (en) 2004-02-25 2012-05-29 Atellis, Inc. Method and system for managing skills assessment
US20050239034A1 (en) * 2004-04-07 2005-10-27 Mckeagney Francis Client/server distribution of learning content
US7995735B2 (en) 2004-04-15 2011-08-09 Chad Vos Method and apparatus for managing customer data
US8231389B1 (en) 2004-04-29 2012-07-31 Wireless Generation, Inc. Real-time observation assessment with phoneme segment capturing and scoring
US20060004859A1 (en) * 2004-05-05 2006-01-05 Kolman Robert S Methods and apparatus that use contextual test number factors to assign test numbers
US20080262866A1 (en) * 2004-05-06 2008-10-23 Medencentive, Llc Methods for Improving the Clinical Outcome of Patient Care and for Reducing Overall Health Care Costs
US9171285B2 (en) 2004-05-06 2015-10-27 Medencentive, Llc Methods for improving the clinical outcome of patient care and for reducing overall health care costs
US7980855B1 (en) 2004-05-21 2011-07-19 Ctb/Mcgraw-Hill Student reporting systems and methods
US20060003306A1 (en) * 2004-07-02 2006-01-05 Mcginley Michael P Unified web-based system for the delivery, scoring, and reporting of on-line and paper-based assessments
US20060072739A1 (en) * 2004-10-01 2006-04-06 Knowlagent Inc. Method and system for assessing and deploying personnel for roles in a contact center
US7609145B2 (en) * 2004-10-06 2009-10-27 Martis Ip Holdings, Llc Test authorization system
US20080281635A1 (en) * 2004-10-06 2008-11-13 Martis Dinesh J Method of administering a beneficiary medical procedure
US7683759B2 (en) * 2004-10-06 2010-03-23 Martis Ip Holdings, Llc Patient identification system
US20060093095A1 (en) * 2004-10-08 2006-05-04 Heck Mathew W Method and apparatus for test administration
US20090198604A1 (en) * 2004-12-17 2009-08-06 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Tracking a participant loss in a virtual world
US7774275B2 (en) * 2005-02-28 2010-08-10 Searete Llc Payment options for virtual credit
US7536752B2 (en) * 2005-01-21 2009-05-26 Leviton Manufacturing Company, Inc. Rack mounted component door system and method
US8556723B2 (en) * 2005-02-04 2013-10-15 The Invention Science Fund I. LLC Third party control over virtual world characters
US20070013691A1 (en) * 2005-07-18 2007-01-18 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Supervisory authority in virtual world environment
US7958047B2 (en) * 2005-02-04 2011-06-07 The Invention Science Fund I Virtual credit in simulated environments
US20060199163A1 (en) * 2005-03-04 2006-09-07 Johnson Andrea L Dynamic teaching method
US20060216685A1 (en) * 2005-03-25 2006-09-28 Interactive Speech Solutions, Llc Interactive speech enabled flash card method and system
US20070042335A1 (en) * 2005-05-11 2007-02-22 Ctb Mcgraw-Hill System and method for assessment or survey response collection using a remote, digitally recording user input device
US20060256953A1 (en) * 2005-05-12 2006-11-16 Knowlagent, Inc. Method and system for improving workforce performance in a contact center
US8170466B2 (en) * 2005-05-27 2012-05-01 Ctb/Mcgraw-Hill System and method for automated assessment of constrained constructed responses
US20070009871A1 (en) * 2005-05-28 2007-01-11 Ctb/Mcgraw-Hill System and method for improved cumulative assessment
US20060281512A1 (en) * 2005-06-10 2006-12-14 Rehm Peter H Automatic publication of interactive crossword puzzles
US20070031801A1 (en) * 2005-06-16 2007-02-08 Ctb Mcgraw Hill Patterned response system and method
US8185877B1 (en) 2005-06-22 2012-05-22 Jpmorgan Chase Bank, N.A. System and method for testing applications
US7865423B2 (en) * 2005-08-16 2011-01-04 Bridgetech Capital, Inc. Systems and methods for providing investment opportunities
US20070048723A1 (en) * 2005-08-19 2007-03-01 Caveon, Llc Securely administering computerized tests over a network
US8583926B1 (en) 2005-09-19 2013-11-12 Jpmorgan Chase Bank, N.A. System and method for anti-phishing authentication
US7513775B2 (en) * 2005-10-05 2009-04-07 Exam Innovations, Inc. Presenting answer options to multiple-choice questions during administration of a computerized test
US20070117082A1 (en) * 2005-11-21 2007-05-24 Winneg Douglas M Systems, methods and apparatus for monitoring exams
RU2008134126A (en) * 2006-01-20 2010-02-27 Топкоудер, Инк. (Us) SYSTEM AND METHOD OF PROJECT DEVELOPMENT
US20070220479A1 (en) * 2006-03-14 2007-09-20 Hughes John M Systems and methods for software development
US8266426B2 (en) * 2006-03-24 2012-09-11 Red Hat, Inc. Hardware certification based on certifying development processes
US8608477B2 (en) * 2006-04-06 2013-12-17 Vantage Technologies Knowledge Assessment, L.L.C. Selective writing assessment with tutoring
US20070250378A1 (en) * 2006-04-24 2007-10-25 Hughes John M Systems and methods for conducting production competitions
US20080052146A1 (en) * 2006-05-01 2008-02-28 David Messinger Project management system
US10783458B2 (en) * 2006-05-01 2020-09-22 Topcoder, Inc. Systems and methods for screening submissions in production competitions
US7677967B2 (en) * 2006-07-07 2010-03-16 Jessop Louis G Battle school
US8793490B1 (en) 2006-07-14 2014-07-29 Jpmorgan Chase Bank, N.A. Systems and methods for multifactor authentication
US20080070217A1 (en) * 2006-08-21 2008-03-20 Universidad De Santiago De Chile Software tool for creating an interactive graphic organizer
JP4241804B2 (en) * 2006-11-20 2009-03-18 コニカミノルタビジネステクノロジーズ株式会社 Image processing apparatus and program
US7831195B2 (en) * 2006-12-11 2010-11-09 Sharp Laboratories Of America, Inc. Integrated paper and computer-based testing administration system
US20080168274A1 (en) * 2007-01-05 2008-07-10 Victor Natanzon System And Method For Selectively Enabling Features On A Media Device
US20080167960A1 (en) * 2007-01-08 2008-07-10 Topcoder, Inc. System and Method for Collective Response Aggregation
US20080196000A1 (en) * 2007-02-14 2008-08-14 Fernandez-Lvern Javier System and method for software development
US20080222055A1 (en) * 2007-03-07 2008-09-11 Hughes John M System and Method for Creating Musical Works
US8073792B2 (en) * 2007-03-13 2011-12-06 Topcoder, Inc. System and method for content development
WO2008121730A1 (en) * 2007-03-28 2008-10-09 Prometric Inc. Identity management system for authenticating test examination candidates and /or individuals
US8358964B2 (en) * 2007-04-25 2013-01-22 Scantron Corporation Methods and systems for collecting responses
US8473735B1 (en) 2007-05-17 2013-06-25 Jpmorgan Chase Systems and methods for managing digital certificates
US20080285818A1 (en) * 2007-05-17 2008-11-20 Hardy Warren Fingerprint verification system for computerized course attendance and performance testing
US20090011397A1 (en) * 2007-07-02 2009-01-08 Greg Writer Method, system and device for managing compensated educational interaction
US20090192849A1 (en) * 2007-11-09 2009-07-30 Hughes John M System and method for software development
WO2009089447A1 (en) * 2008-01-11 2009-07-16 Topcoder, Inc. System and method for conducting competitions
US8321682B1 (en) 2008-01-24 2012-11-27 Jpmorgan Chase Bank, N.A. System and method for generating and managing administrator passwords
US8639177B2 (en) * 2008-05-08 2014-01-28 Microsoft Corporation Learning assessment and programmatic remediation
US8949187B1 (en) * 2008-05-30 2015-02-03 Symantec Corporation Systems and methods for creating and managing backups based on health information
US20100088740A1 (en) * 2008-10-08 2010-04-08 Bookette Software Company Methods for performing secure on-line testing without pre-installation of a secure browser
US9608826B2 (en) 2009-06-29 2017-03-28 Jpmorgan Chase Bank, N.A. System and method for partner key management
WO2011035271A1 (en) 2009-09-18 2011-03-24 Innovative Exams, Llc Apparatus and system for and method of registration, admission and testing of a candidate
US20110102142A1 (en) * 2009-11-04 2011-05-05 Widger Ian J Webcast viewer verification methods
WO2011094214A1 (en) * 2010-01-29 2011-08-04 Scantron Corporation Data collection and transfer techniques for scannable forms
TWI409722B (en) * 2010-03-08 2013-09-21 Prime View Int Co Ltd Examining system and method thereof
US20110244439A1 (en) * 2010-03-09 2011-10-06 RANDA Solutions, Inc. Testing System and Method for Mobile Devices
US8909127B2 (en) 2011-09-27 2014-12-09 Educational Testing Service Computer-implemented systems and methods for carrying out non-centralized assessments
US20140030687A1 (en) * 2012-07-27 2014-01-30 Uniloc Luxembourg, S.A. Including usage data to improve computer-based testing of aptitude
US9870713B1 (en) * 2012-09-17 2018-01-16 Amazon Technologies, Inc. Detection of unauthorized information exchange between users
US20160042198A1 (en) 2012-10-19 2016-02-11 Pearson Education, Inc. Deidentified access of content
US8984650B2 (en) 2012-10-19 2015-03-17 Pearson Education, Inc. Privacy server for protecting personally identifiable information
US20150187223A1 (en) * 2013-12-30 2015-07-02 Pearson Education, Inc. Deidentified access of instructional content
US9288056B1 (en) 2015-05-28 2016-03-15 Pearson Education, Inc. Data access and anonymity management
US9436911B2 (en) 2012-10-19 2016-09-06 Pearson Education, Inc. Neural networking system and methods
US20140222995A1 (en) * 2013-02-07 2014-08-07 Anshuman Razden Methods and System for Monitoring Computer Users
US9419957B1 (en) 2013-03-15 2016-08-16 Jpmorgan Chase Bank, N.A. Confidence-based authentication
TWI534768B (en) * 2013-12-17 2016-05-21 Jian-Cheng Liu Wisdom teaching counseling test method
US10148726B1 (en) 2014-01-24 2018-12-04 Jpmorgan Chase Bank, N.A. Initiating operating system commands based on browser cookies
WO2016028864A1 (en) 2014-08-22 2016-02-25 Intelligent Technologies International, Inc. Secure testing device, system and method
US10540907B2 (en) 2014-07-31 2020-01-21 Intelligent Technologies International, Inc. Biometric identification headpiece system for test taking
US10410535B2 (en) 2014-08-22 2019-09-10 Intelligent Technologies International, Inc. Secure testing device
US10535277B2 (en) 2017-01-09 2020-01-14 International Business Machines Corporation Automated test generator and evaluator
US10467551B2 (en) 2017-06-12 2019-11-05 Ford Motor Company Portable privacy management
US11763692B2 (en) 2019-06-05 2023-09-19 International Business Machines Corporation Secure delivery and processing of paper-based exam questions and responses
CN113360891B (en) * 2021-05-25 2023-12-15 深圳市蘑菇财富技术有限公司 Anti-cheating method based on exercise system and related equipment

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5321611A (en) * 1993-02-05 1994-06-14 National Computer Systems, Inc. Multiple test scoring system
US5513994A (en) * 1993-09-30 1996-05-07 Educational Testing Service Centralized system and method for administering computer based tests
US5565316A (en) * 1992-10-09 1996-10-15 Educational Testing Service System and method for computer based testing

Family Cites Families (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4658093A (en) * 1983-07-11 1987-04-14 Hellman Martin E Software distribution system
CA1245361A (en) * 1984-06-27 1988-11-22 Kerry E. Thacher Tournament data system
US5319710A (en) * 1986-08-22 1994-06-07 Tandem Computers Incorporated Method and means for combining and managing personal verification and message authentication encrytions for network transmission
US5155680A (en) * 1986-10-24 1992-10-13 Signal Security Technologies Billing system for computing software
US4967354A (en) * 1987-06-18 1990-10-30 Tescor, Inc. Method of preparing customized written examinations
US5112051A (en) * 1989-06-05 1992-05-12 Westinghouse Electric Corp. Interfacing device for a computer games system
US5036461A (en) * 1990-05-16 1991-07-30 Elliott John C Two-way authentication system between user's smart card and issuer-specific plug-in application modules in multi-issued transaction device
US5259029A (en) * 1990-05-16 1993-11-02 Duncan Jr F Jeff Decoding device for computer software protection
US5050212A (en) * 1990-06-20 1991-09-17 Apple Computer, Inc. Method and apparatus for verifying the integrity of a file stored separately from a computer
JPH04143881A (en) * 1990-10-05 1992-05-18 Toshiba Corp Mutual authenticating system
US5359510A (en) * 1990-11-28 1994-10-25 Sabaliauskas Anthony L Automated universal tournament management system
US5243654A (en) * 1991-03-18 1993-09-07 Pitney Bowes Inc. Metering system with remotely resettable time lockout
US5193114A (en) * 1991-08-08 1993-03-09 Moseley Donald R Consumer oriented smart card system and authentication techniques
JP2821306B2 (en) * 1992-03-06 1998-11-05 三菱電機株式会社 Authentication method and system between IC card and terminal
EP0566811A1 (en) * 1992-04-23 1993-10-27 International Business Machines Corporation Authentication method and system with a smartcard
JPH0697931A (en) * 1992-09-14 1994-04-08 Fujitsu Ltd Personal communication terminal registration control system
US5349642A (en) * 1992-11-03 1994-09-20 Novell, Inc. Method and apparatus for authentication of client server communication
US5351293A (en) * 1993-02-01 1994-09-27 Wave Systems Corp. System method and apparatus for authenticating an encrypted signal
US5416840A (en) * 1993-07-06 1995-05-16 Phoenix Technologies, Ltd. Software catalog encoding method and system
US5400319A (en) * 1993-10-06 1995-03-21 Digital Audio Disc Corporation CD-ROM with machine-readable I.D. code
US5412575A (en) * 1993-10-07 1995-05-02 Hewlett-Packard Company Pay-per-use access to multiple electronic test capabilities
US5434918A (en) * 1993-12-14 1995-07-18 Hughes Aircraft Company Method for providing mutual authentication of a user and a server on a network

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5565316A (en) * 1992-10-09 1996-10-15 Educational Testing Service System and method for computer based testing
US5321611A (en) * 1993-02-05 1994-06-14 National Computer Systems, Inc. Multiple test scoring system
US5513994A (en) * 1993-09-30 1996-05-07 Educational Testing Service Centralized system and method for administering computer based tests

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0910844A1 (en) * 1997-03-11 1999-04-28 Sylvan Learning Systems, Inc. Method and system for administration of a remotely-proctored secure examination
EP0910844A4 (en) * 1997-03-11 2000-07-12 Sylvan Learning Systems, Inc. Method and system for administration of a remotely-proctored secure examination
EP1018717A2 (en) * 1997-03-11 2000-07-12 Sylvan Learning Systems, Inc. Method and system for administration of a remotely-proctored secure examination
EP1018717A3 (en) * 1997-03-11 2000-07-26 Sylvan Learning Systems, Inc. Method and system for administration of a remotely-proctored secure examination
US6195528B1 (en) 1997-12-09 2001-02-27 Tokheim Corporation Teaching method and system
FR2773410A1 (en) * 1997-12-09 1999-07-09 Tokheim Corp Teaching system and method
NL1010773C2 (en) * 1997-12-09 2000-02-23 Tokheim Corp Teaching method and system
FR2775820A1 (en) * 1998-03-03 1999-09-10 Elti Remote analysis of the profile of persons being assessed for their knowledge or of opinion survey data
EP1003142A1 (en) * 1998-11-17 2000-05-24 Alcatel Method for automatic monitoring of the achievment of teaching goals by a computer
WO2001031610A1 (en) * 1999-10-27 2001-05-03 Arc Research & Development Pty Limited A data collection method
CN102804742A (en) * 2009-06-11 2012-11-28 Alcatel-Lucent Method and application for parental control of the use of a terminal
EP2606451A2 (en) * 2010-08-16 2013-06-26 Extegrity Inc. Systems and methods for detecting substitution of high-value electronic documents
EP2606451A4 (en) * 2010-08-16 2014-05-14 Extegrity Inc Systems and methods for detecting substitution of high-value electronic documents
US9953175B2 (en) 2010-08-16 2018-04-24 Extegrity, Inc. Systems and methods for detecting substitution of high-value electronic documents
CN108352044A (en) * 2015-11-24 2018-07-31 索尼公司 Information processing unit, information processing method and program
US11081017B2 (en) 2015-11-24 2021-08-03 Sony Corporation Information processing apparatus, information processing method, and program
EP3382637B1 (en) * 2015-11-24 2023-11-29 Sony Group Corporation Information processing device, information processing method, and program

Also Published As

Publication number Publication date
US5947747A (en) 1999-09-07
AU3135297A (en) 1997-11-26

Similar Documents

Publication Publication Date Title
US5947747A (en) Method and apparatus for computer-based educational testing
US8725060B2 (en) Method and apparatus for educational testing
US20200410886A1 (en) Cloud based test environment
Rowe Cheating in online student assessment: Beyond plagiarism
US9984582B2 (en) Peered proctoring
US6659861B1 (en) Internet-based system for enabling a time-constrained competition among a plurality of participants over the internet
US7257557B2 (en) Multi-modal testing methodology
US6263439B1 (en) Verification system for non-traditional learning operations
US20080131860A1 (en) Security and tamper resistance for high stakes online testing
US20030084345A1 (en) Managed access to information over data networks
US20120244508A1 (en) Method for remotely proctoring tests taken by computer over the internet
US6928278B2 (en) Authentic person identification
US20050136388A1 (en) System and method for providing instructional data integrity in offline e-learning systems
WO2000059226A1 (en) Viewing terminal, viewing authorization system, method for authorizing viewing, remote education method, and recorded medium
CN205680097U Mobile learning system based on fingerprint-enabled mobile phones
JP2003228278A (en) Level judging device, remote education device, and remote education system
Hao et al. Verifiable classroom voting in practice
KR20010091742A (en) Secure Electronic Voting System
Wu Apollo: End-to-end Verifiable Internet Voting with Recovery from Vote Manipulation
US20020026585A1 (en) Audit and verification system
KR102621935B1 Online examination cheating detection system using sound apparatus
WO2002042963A2 (en) A computerised administration system and method for administering the conduct of examinations such as academic examinations
Popoveniuc A framework for secure mixnet-based electronic voting
KR100690927B1 (en) Method and system for providing learning evaluation service
Fokum Networking Education: A Caribbean Perspective

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GE GH HU IL IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK TJ TM TR TT UA UG UZ VN AM AZ BY KG KZ MD RU TJ TM

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH KE LS MW SD SZ UG AT BE CH DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (PCT application filed before 20040101)
121 EP: The EPO has been informed by WIPO that EP was designated in this application
NENP Non-entry into the national phase

Ref country code: JP

Ref document number: 97540277

Format of ref document f/p: F

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

NENP Non-entry into the national phase

Ref country code: CA

122 EP: PCT application non-entry in European phase