US20170061809A1 - Method and system for importing hard copy assessments into an automatic educational system assessment - Google Patents

Method and system for importing hard copy assessments into an automatic educational system assessment Download PDF

Info

Publication number
US20170061809A1
US20170061809A1 (application US14/836,605)
Authority
US
United States
Prior art keywords
processing system
image processing
educational assessment
assessment
questions
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/836,605
Inventor
Robert J. St. Jacques, JR.
Dennis L. Venable
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xerox Corp
Original Assignee
Xerox Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xerox Corp filed Critical Xerox Corp
Assigned to XEROX CORPORATION reassignment XEROX CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: ST. JACQUES, JR., ROBERT J., VENABLE, DENNIS L.
Publication of US20170061809A1 publication Critical patent/US20170061809A1/en

Classifications

    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B7/00Electrically-operated teaching apparatus or devices working with questions and answers
    • G09B7/02Electrically-operated teaching apparatus or devices working with questions and answers of the type wherein the student is expected to construct an answer to the question which is presented or wherein the machine gives an answer to the question presented by a student
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/10Text processing
    • G06F40/103Formatting, i.e. changing of presentation of documents
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/10Text processing
    • G06F40/166Editing, e.g. inserting or deleting
    • G06F40/169Annotation, e.g. comment data or footnotes
    • G06K9/18
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/10Office automation; Time management
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10Character recognition
    • G06V30/22Character recognition characterised by the type of writing
    • G06V30/224Character recognition characterised by the type of writing of printed characters having additional code marks or containing code marks
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B7/00Electrically-operated teaching apparatus or devices working with questions and answers
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B7/00Electrically-operated teaching apparatus or devices working with questions and answers
    • G09B7/06Electrically-operated teaching apparatus or devices working with questions and answers of the multiple-choice answer-type, i.e. where a given question is provided with a series of answers and a choice has to be made from the answers
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/0035User-machine interface; Control console
    • H04N1/00405Output means
    • H04N1/00408Display of information to the user, e.g. menus
    • H04N1/0044Display of information to the user, e.g. menus for image preview or review, e.g. to help the user position a sheet
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/387Composing, repositioning or otherwise geometrically modifying originals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2201/00Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/0077Types of the still picture apparatus
    • H04N2201/0094Multifunctional device, i.e. a device capable of all of reading, reproducing, copying, facsimile transception, file transception

Definitions

  • a document processing method and system is provided to create an educational assessment using an image processing system, such as a multifunction printer (MFP), the created educational assessment including a plurality of questions administered to one or more students for completion.
  • MFP multifunction printer
  • the present disclosure relates to the process of assessing the attributes of a student or group of students at selected times during their learning process and particularly relates to the assessment and evaluation of student attributes or progress in a structured classroom where a teacher is required to educate the students to a level of proficiency in various subject matters and at particular grade levels.
  • the teacher periodically gives the students printed form assessments or tests, as they have previously been referred to, in order to obtain an indication of the student(s) level(s) of proficiency in the subject matter of immediate interest.
  • U.S. Patent Publication No. 2010/0075290, published Mar., entitled AUTOMATIC EDUCATIONAL ASSESSMENT SERVICE, describes a system for automatically evaluating assessments of the type given by a teacher/educator for determining the state of learning or progress of students during the course of instruction; the system is applicable particularly in a classroom setting where the teacher is responsible for educating a relatively large group of students.
  • the system and technique of the present disclosure enables the teacher/educator to select from the digital user interface (DUI) of a Multifunction Device (MFD) any of multiple predetermined stored assessment forms in a Data Warehouse/Repository of such assessment forms for administration to a teacher/educator selected group of one or more students.
  • DUI digital user interface
  • MFD Multifunction Device
  • the teacher requests the system to create an Assessment Batch and to print out personalized versions of the assessment form, where each version is automatically bar coded for the individual student.
  • the student's name is also printed on the form for the purpose of delivering each assessment to the appropriate student. If desired, the student's name may be printed on the reverse side of the form, for example in large print, so that the person administering the test can verify from a distance that each student has the correct form, and so that forms can be handed out individually without disclosing the content of the assessment.
  • the marked assessment forms are then scanned into the system at the MFD.
  • Based on the information bar coded on the scanned forms, the system then identifies the student and Assessment Batch. The system then employs the appropriate image analysis of the markings, and performs an evaluation of each item on each of the assessments based upon a pre-programmed rubric. The system then automatically stores a preliminary evaluation in the Data Warehouse/Repository for each student. The teacher/educator may then view the assessments at a remote terminal and validate/annotate them. The system then automatically updates the validated/annotated assessment records in the Data Warehouse/Repository (DW/R) for later retrieval in various report views, which may be retrieved at the MFD or remotely by the teacher or other authorized educator.
  • DW/R Data Warehouse/Repository
  • a method of creating an educational assessment using an image processing system comprising: a) a user of the image processing system performing one or more of scanning a preexisting educational assessment into the image processing system generating a digital representation of the preexisting educational assessment which does not conform to the one or more predefined formats and does not include the associated metadata, and loading into the image processing system the digital representation of the preexisting educational assessment which does not conform to the one or more predefined formats and does not include the associated metadata; b) the image processing system displaying the preexisting educational assessment on a display operatively associated with the image processing system; c) the user selectably capturing an image of a single question associated with the displayed preexisting educational assessment; d) the image processing system generating a search query including one or more items included within the captured image of the single question;
  • an image processing system for creating an educational assessment, the created educational assessment including a plurality of questions associated with one or more predefined formats including metadata associated with each of the plurality of questions, and the created educational assessment administered to one or more students for completion
  • the image processing system comprising: a preexisting educational assessment processing module configured to perform one or more of receiving a digital representation of a preexisting educational assessment into the image processing system generated using an operatively associated scanner, and loading into the image processing system the digital representation of the preexisting educational assessment, the digital representation of the preexisting educational assessment not conforming to the one or more predefined formats and not including the associated metadata and the preexisting educational assessment processing module configured to display the preexisting educational assessment on a display operatively associated with the image processing system; an image capture module configured to capture an image of a single question associated with the displayed preexisting educational assessment; a search query module configured to generate a search query including one or more items included within a captured image of a single question associated with the displayed preexisting educational assessment, and execute
  • a method of creating an educational assessment using an image processing system comprising: a) a user of the image processing system creating and entering a question using a User Interface (UI) operatively associated with the image processing system; b) the image processing system generating a search query including one or more items included in the user created question; c) the image processing system executing a search of a Data Warehouse/Repository (DW/R) based on the search query to retrieve one or more predefined questions matching one or more search criteria associated with the search query, the one or more predefined questions associated with the one or more predefined formats including metadata associated with each of the one or more predefined questions; d) the image processing system displaying the one or more matching predefined questions and associated metadata on the UI; e) the user selecting one of the displayed matching predefined questions and/or selecting the metadata associated with one
  • UI User Interface
  • DW/R Data Warehouse/Repository
  • FIG. 1 is a pictorial diagram of a method to process preexisting assessments according to an exemplary embodiment of this disclosure
  • FIG. 2 is a diagram of a system to process preexisting assessments according to an exemplary embodiment of this disclosure
  • FIGS. 3A and 3B are a flow chart of a method to generate a printed assessment for manual marking by a student according to an exemplary embodiment of this disclosure;
  • FIG. 4 is a pictorial diagram of a workflow method to create an educational assessment based on a preexisting educational assessment not including metadata using an image processing system according to an exemplary embodiment of this disclosure, the created educational assessment including a plurality of questions and associated metadata retrieved from a question item bank;
  • FIG. 5 is an example of a preexisting educational assessment question according to an exemplary embodiment of this disclosure.
  • FIG. 6 is an example of an original preexisting teacher created educational assessment which does not include any associated metadata, the original preexisting educational assessment subsequently processed according to an exemplary embodiment of this disclosure;
  • FIG. 7 illustrates the display of the example preexisting teacher created educational assessment after it is scanned into the image processing system according to an exemplary embodiment of this disclosure
  • FIG. 8 illustrates the image processing system displayed preexisting educational assessment in FIG. 7 including a teacher selected question according to an exemplary embodiment of this disclosure
  • FIG. 9 is a detailed view of the teacher selected question shown in FIG. 8 ;
  • FIG. 10 illustrates a Question Selected Tool including question query results for selection by a teacher and features to further refine the query according to an exemplary embodiment of this disclosure
  • FIG. 11 is a pictorial diagram of another workflow method to create an educational assessment based on a teacher created question not including metadata using an image processing system according to an exemplary embodiment of this disclosure, the created educational assessment including a plurality of questions and associated metadata retrieved from a question item bank; and
  • FIG. 12 is a detailed view of the teacher created question shown in FIG. 11 .
  • This disclosure provides a method and system that can take digital content that has been converted from paper and, with the aid of the content owner, such as a teacher, search for similar and relevant content from published items. The teacher can then use their content and associate it with the published items, or replace their content with the published item(s).
  • the method and system provide an easy-to-use interface that enables teachers to create their own digital content based on existing and previously used paper-based content.
  • the disclosed method and system allows for the elimination or reduction in the amount of professional services, i.e., specially trained technicians, required for the conversion of teacher content. See U.S. patent application Ser. No. 14/609,820, filed Jan. 30, 2015, by Clar et al., entitled “METHOD AND SYSTEM TO ATTRIBUTE METADATA TO PREEXISTING DOCUMENTS”.
  • an Assessment Batch includes the teacher's name and a student list which includes the names of the students to be included in the batch, the particular assessment form to be administered to the students in the student list and the creation date of the Assessment Batch.
  • the teacher/educator administers the assessments which are marked.
  • the printed sheets may be marked by the teacher/educator or the students according to the nature of the assessment.
  • the teacher/educator or their designated representative scans the marked assessments into the system at the MFD.
  • the system automatically evaluates the assessments employing image analysis according to the established rubrics associated with the assessment form associated with the Assessment Batch and enables the teacher to access the evaluations at station 5 which is illustrated as a remote station such as a teacher's personal computer (PC).
  • the teacher/educator validates/annotates the assessments and upon receipt of the validation, the system generates reports at station 6 which may be accessed and viewed at either the MFD or the teacher's personal computer terminal remote from the MFD.
  • In FIG. 2, the overall architecture of the system employed with the presently disclosed method is illustrated pictorially, with the MFD 12 connected through an application server 14 along line 16 to a network 18, which may be either a local or wide area network and may include connections to the internet.
  • a remote terminal or PC 20 such as a teacher/educator access terminal is connected along line 22 to the network 18 .
  • a system server 24 is also connected to the network 18 and provides database access, serves as a workflow engine, mail handler, and web server, and performs image processing/scoring functions.
  • a Data Warehouse/Repository 26 is also connected to the network and contains such items as assessment forms and associated rubrics, workflow definitions, Assessment Batch records, reports and teacher/student/class data and is operable to receive updates and to provide for access to data stored therein remotely therefrom over network 18 .
  • the system and method of the present disclosure function to assist a teacher/educator by providing automatic evaluation of assessments administered to students based upon established rubrics programmed into the system and employing image analysis.
  • the system and method of the present disclosure have the capability to evaluate assessments which are marked with images other than by marking within a box or bubble with respect to multiple choice answers.
  • the system has the ability to scan the marked assessment and lift the manually made marks made during the administering of the assessment from the preprinted markings on the assessment sheet.
  • the system and method then employ image analysis to identify and evaluate the lifted marks.
  • the method and system are capable of handling numerous types of assessment items employed by teachers/educators, examples of which are illustrated in the present disclosure in FIGS. 8-22 .
  • assessments may be administered to the students and may include summative, formative, diagnostic, interest, preference and benchmark assessments.
  • the teacher/educator selects the education assessment service (EAS) print service from the DUI (Digital User Interface) of the MFD 12 , and the system then requires the teacher to provide authentication or personal identification information at step 32 .
  • the system then proceeds to display on the MFD DUI all the pre-defined assessment forms currently associated with the teacher's identification entered in at step 32 .
  • the teacher then chooses at step 36 an assessment form and initiates the formation of an assessment “Batch” associated with that teacher and the selected assessment form.
  • the “Assessment Batch” comprises the basic evaluation unit or cell that the teacher has requested.
  • the teacher then proceeds at step 38 to input a class to assess such as, for example, a seventh grade class, a seventh grade math class, a fifth grade English writing class, or a fourth grade reading class, etc.
  • the system then proceeds to step 40 and enquires as to whether the teacher/educator wishes to select the entire class; and, if the enquiry in step 40 is answered in the affirmative, the system then proceeds to step 42 and includes all students in the class on the Assessment Batch Student List. However, if the query at step 40 is answered in the negative, the system proceeds to step 44 and the class list is displayed on the MFD DUI and the teacher selects specific students to be included on the Assessment Batch Student List.
  • the system then proceeds to step 46 and the teacher is prompted to select print from the MFD DUI.
  • the system then proceeds to step 48 and automatically creates a new Assessment Batch record in the Data Warehouse/Repository to store the teacher's identification, the particular assessment form, the Student List, the status data, the date created, and other data which may be required by the particular school administrator/system.
  • the system then proceeds to step 50 and automatically formats a personalized assessment layout for each student on the Student List, which layout includes the student name to ensure each student receives the correct assessment and an identification bar code to encode the Assessment Batch and the student.
  • the assessment item order/layout may be varied for each student to discourage students from looking at neighboring students' assessments for hints.
  • the system then proceeds to step 52 and prints the personalized page(s) for each student on the Student List for the Assessment Batch.
  • the system confirms that all page(s) are printed and updates the Data Warehouse/Repository.
  • the teacher/educator takes the personalized printed assessment page(s) and administers the assessment to each designated student.
  • the teacher/assessor or student as the case may be, manually marks on the printed assessment page(s) the appropriate response to the challenge indicated on the particular assessment page.
  • the marked assessment pages are collected by the teacher/educator for subsequent evaluation.
  • Metadata is crucial for the functions of data tracking, reporting, and the customization of learning for the students, as well as for assisting teachers in their daily practice.
  • the type of metadata to track ranges from the global level to the question level. For example, items being tracked on a global level may include:
  • the metadata being tracked on the question level may include:
  • the assessment creator/evaluator system assigns additional data for filing and sorting of the assessment such as:
  • Version number (auto generated as 1.0 for first install, successively increases as same assessment is scanned again with same name)
  • Some disadvantages associated with this stop-gap measure include (1) the use of a manual process requiring a trained professional services team to scan, markup the assessments (identifying “hot spots,” etc.), and to identify question types and correct answers; (2) the resulting assessments are one off aberrations that require a separate flow through an automatic educational assessment system; (3) detailed metadata about the questions, including the content of the question, is lost; and (4) assessments created in this manner cannot be used for newly created workflows based on automatically generated assessments, e.g., online assessments or tablet-based assessments.
  • QTI IMS Question and Test Interoperability
  • Standards like QTI provide a very structured representation of assessment questions, also referred to as “items”, including complete question content, question type, grade level, unit, correct answers, associated artifacts (e.g., images, charts, etc.) as well as a plethora of additional detailed metadata.
  • the school district is also granted the right to use the structured, digital versions of the assessment content that is included with the material.
  • third party vendors will act as a go-between, aggregating the content to which a specific school district has access across multiple publishers, and providing it to the school district, usually as something similar to a large set of flat files in QTI XML format. It is then possible to parse the content and store it in a searchable database referred to as an Item Bank. Once stored, the items may be retrieved in a number of ways, including database queries. For example, something like the following query can be used to find items that contain the words “quick,” “brown,” and “fox”, assuming the item body content is stored in column ‘ItemBody’ of table ‘items’:
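The query pattern described above can be sketched as a small helper; this is an illustrative sketch, assuming a full-text CONTAINS predicate (as in SQL Server full-text search) and the ‘items’/‘ItemBody’ schema mentioned in the text.

```python
def build_item_query(terms, table="items", column="ItemBody"):
    # Build a full-text query matching items that contain all terms.
    # CONTAINS syntax is modeled on SQL Server full-text search; the
    # table and column names follow the example in the text.
    condition = " AND ".join(f'"{t}"' for t in terms)
    return f"SELECT * FROM {table} WHERE CONTAINS({column}, '{condition}')"

query = build_item_query(["quick", "brown", "fox"])
# SELECT * FROM items WHERE CONTAINS(ItemBody, '"quick" AND "brown" AND "fox"')
```

A real Item Bank would execute this against its database driver; the helper only constructs the query text.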
  • the automatic educational assessment method and system disclosed herein leverage access to item banks to retrieve the structured, digital version of assessment items.
  • the teacher scans the existing content into the teacher-facing UI (step S61) and the existing assessment content is displayed 70 by the UI.
  • the teacher draws a bounding box 71 around the body of an individual item, e.g., the text content of the question not including the answers in a multiple choice question (step S62).
  • OCR Optical Character Recognition
  • Step S65 executes some simple techniques to prevent the generation of long, complex queries and/or to improve results, such as:
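Although the specific techniques are not enumerated at this point in the text, query trimming of this kind can be sketched as follows; the stop-word list, deduplication, and term cap are assumptions drawn from the algorithm descriptions later in the disclosure.

```python
# A small illustrative stop-word list; a production system would use a
# fuller list (this set is an assumption, not from the disclosure).
STOP_WORDS = {"a", "an", "and", "or", "the", "of", "to", "in", "is", "which"}

def simplify_terms(text, max_terms=32):
    # Drop stop words, strip punctuation, deduplicate, and cap the
    # number of terms so the generated query stays short.
    seen, terms = set(), []
    for word in text.lower().split():
        w = word.strip(".,?!:;\"'()")
        if w and w not in STOP_WORDS and w not in seen:
            seen.add(w)
            terms.append(w)
    return terms[:max_terms]
```

For example, `simplify_terms("Which of the following is the best answer?")` reduces the question text to the three content words `following`, `best`, and `answer`.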
  • the teacher is presented with a list of the results 78 and chooses the correct or desired item (step S66).
  • Steps S61-S66 are repeated for each remaining question on the assessment.
  • the automatic educational assessment system generates and stores a new assessment with the content from the selected items, and the assessment may now be administered to students along with any other assessment using the workflow associated with the automatic assessment system described with reference to FIGS. 1-3 .
  • FIG. 5 is an example of a preexisting educational assessment question according to an exemplary embodiment of this disclosure.
  • FIG. 6 is an example of an original preexisting teacher created educational assessment which does not include any associated metadata, the original preexisting educational assessment subsequently processed according to an exemplary embodiment of this disclosure.
  • FIG. 7 illustrates the display of the example preexisting teacher created educational assessment after it is scanned into the image processing system according to an exemplary embodiment of this disclosure.
  • FIG. 8 illustrates the image processing system displayed preexisting educational assessment in FIG. 7 including a teacher selected question according to an exemplary embodiment of this disclosure.
  • FIG. 9 is a detailed view of the teacher selected question shown in FIG. 8 .
  • FIG. 10 illustrates a Question Selected Tool including question query results for selection by a teacher and features to further refine the query according to an exemplary embodiment of this disclosure.
  • the assessment creation toolkit includes this capability using an alternative workflow.
  • the alternative workflow provides the teacher with the flexibility to tailor and tune assessments precisely as they see fit for their students, without the drawback of other systems which lose the metadata that publisher created content contains.
  • Item formats such as QTI contain significant and detailed metadata that provides detailed information about individual questions, well above and beyond the item body and possible answers. Information such as grade level, subject, Common Core compliance and much more is contained within this metadata.
  • CCS Common Core Standard
  • educators can construct their own questions and use the content as a query to find similar questions in an item bank. If a sufficiently similar question is returned as a result of the query, its metadata can be copied from the returned question into the newly teacher-created question and modified as needed, thus avoiding the manual metadata entry that would otherwise be needed.
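The metadata-copying step described above can be sketched as follows; the field names (`grade_level`, `subject`, `standard`) are hypothetical stand-ins for QTI-style metadata, since the disclosure does not fix a schema.

```python
def copy_metadata(matched_item, new_question,
                  fields=("grade_level", "subject", "standard")):
    # Copy selected metadata fields from the matched item-bank question
    # into the newly created question, leaving any fields the teacher
    # already set untouched. Field names are illustrative only.
    enriched = dict(new_question)
    for f in fields:
        if f in matched_item and f not in enriched:
            enriched[f] = matched_item[f]
    return enriched
```

The teacher can then edit the copied fields before saving the item to her item bank, as the workflow describes.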
  • Shown in FIG. 11 is a pictorial diagram of a slightly altered version of the main workflow previously described, the altered version creating an educational assessment based on a teacher created question not including metadata using an image processing system according to an exemplary embodiment of this disclosure, the created educational assessment including a plurality of questions and associated metadata retrieved from a question item bank.
  • the teacher creates a new question 90 using the assessment creator.
  • FIG. 12 is a detailed view of the teacher created question shown in FIG. 11 .
  • At step S82, the text of the question 91 is converted into an SQL query at step S83.
  • the educator may pick and choose which terms are significant, e.g., “yards” and “divide” to find questions about division involving measurement in yards; other terms like “Jackie” or “rope” may constrain the question too much and reduce or eliminate good matches.
  • At step S83, the question text is used to construct one or more search queries for the item bank 76 .
  • a matching algorithm expands or narrows the query to fine tune the results before returning the resulting items 92 to the teacher.
  • At step S85, the teacher is presented with a list of the results 93 , chooses an item that most closely matches what she is looking for, and copies the metadata for that question.
  • the teacher may alter the query, e.g., to search for “feet” instead of “yards.”
  • the teacher pastes the metadata from the matching question into her new question, modifies the fields as needed, and saves the item to her item bank.
  • the automatic educational assessment system generates and stores a new assessment with the content from the selected items and the assessment may now be given along with any other assessment using the workflow previously described with reference to FIGS. 1-3 .
  • After executing this workflow, the teacher has a complete assessment populated with custom content and with all of the appropriate metadata intact.
  • a query or queries may be constructed to attempt to retrieve the corresponding question from an item bank.
  • the queries described herein are constructed solely using the content of the question body, as opposed to the entire question including, e.g., the multiple choice options.
  • the content of the entire question may be used in combination with another cropping and parsing technique to create queries that will produce different and sometimes better results by allowing for interrogation of other parts of the item model, e.g., the responses in a multiple choice question and/or the answer in addition to the item body.
  • one exemplary technique breaks the question into significant parts, potentially using a confidence metric produced by the OCR engine to focus only on the words with the highest confidence, and builds queries from those parts. Variations on this technique are used in the example algorithms below. To facilitate this, some simple natural language processing is used to remove stop words, e.g., and, or, the, etc., from the item content. For example, the item text, after stop words are removed, could read something similar to the example shown in FIG. 4 as reference character 72 . Additionally, for the examples described below, it is assumed that the text content of the item is stored as a string in the ‘ItemBody’ column of the ‘items’ table within the Item Bank.
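The confidence-based word filtering described above can be sketched as follows; the (word, confidence) pair format is an assumption about the OCR engine's output, not something the disclosure specifies, and the stop-word list is illustrative.

```python
# Illustrative stop-word list (an assumption, not from the disclosure).
STOP_WORDS = {"a", "an", "and", "or", "the", "of", "to", "which", "is"}

def significant_words(ocr_words, min_confidence=0.9):
    # ocr_words: (word, confidence) pairs from a hypothetical OCR engine.
    # Keep only high-confidence, non-stop words for query construction.
    return [w for w, c in ocr_words
            if c >= min_confidence and w.lower() not in STOP_WORDS]
```

A low-confidence OCR result such as a garbled word is thus excluded from the query rather than polluting it.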
  • the entire content of the question is used as a single query. It is worth noting that some databases limit the number of search terms to some maximum, e.g., 64, so for a very long question this alternative may not be an option.
  • Step 1) calculate the maximum distance parameter for the NEAR condition for the entire body of text; this is a count of the stop words between search terms.
  • the longest stop word phrase is 3 words long, “Which of the”.
  • Step 2) create a query containing all of the terms in the body of text (see example query below).
  • Step 3) execute the query against the item bank and return the result set.
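The three steps above might be sketched as follows. SQLite FTS5-style NEAR syntax is used purely for concreteness; the ‘items’ table and ‘ItemBody’ column come from the example setup, while the stop word list and question text are invented for illustration.

```python
# Sketch of Steps 1-3 under assumed names and FTS5-style NEAR syntax.
STOP_WORDS = {"which", "of", "the", "a", "an", "and", "or", "to", "is"}

def max_stop_word_run(text):
    """Step 1: the NEAR distance is the longest run of consecutive
    stop words appearing between search terms."""
    run = longest = 0
    for word in text.split():
        if word.lower().strip("?.,") in STOP_WORDS:
            run += 1
            longest = max(longest, run)
        else:
            run = 0
    return longest

def build_near_query(text):
    """Step 2: one NEAR query over every non-stop term in the body."""
    distance = max_stop_word_run(text)
    terms = [w.strip("?.,") for w in text.split()
             if w.lower().strip("?.,") not in STOP_WORDS]
    phrases = " ".join(f'"{t}"' for t in terms)
    return f"SELECT * FROM items WHERE ItemBody MATCH 'NEAR({phrases}, {distance})'"

# "Which of the" is the longest stop word run, so the distance is 3.
body = "Which of the following fractions is equal"
print(build_near_query(body))
```

Step 3 would then be a single execution of the returned SQL string against the item bank, e.g., via a database cursor.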
  • the query may be simplified by using only words longer than some arbitrary threshold.
  • Step 1) calculate the maximum distance parameter for the NEAR condition counting not only skipped stop words, but words containing fewer than some number of characters, e.g., 6.
  • the longest phrase containing terms not included in the search is 8 words long, e.g., “has 20 yards of rope. She needs to”.
  • Step 2) create a query containing all of the words that contain at least the required number of characters and the calculated distance parameter (see example query below).
  • Step 3) execute the query against the item bank and return the result set.
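The length-threshold variant might be sketched as follows, again using the example ‘items’/‘ItemBody’ names and FTS5-style NEAR syntax for concreteness. The sample sentence wraps the "has 20 yards of rope. She needs to" fragment quoted above in a hypothetical subject ("Samantha") and final term ("divide").

```python
# Sketch: only words of at least min_len characters become search terms;
# the distance is the longest run of consecutive excluded words (whether
# stop words or merely short words).
def build_length_filtered_query(text, min_len=6):
    run = distance = 0
    terms = []
    for raw in text.split():
        word = raw.strip("?.,")
        if len(word) < min_len:
            run += 1                       # excluded: too short
            distance = max(distance, run)
        else:
            run = 0
            terms.append(word)             # included as a search term
    phrases = " ".join(f'"{t}"' for t in terms)
    return f"SELECT * FROM items WHERE ItemBody MATCH 'NEAR({phrases}, {distance})'"

# The 8 words "has 20 yards of rope. She needs to" are all shorter than
# 6 characters, so the calculated distance parameter is 8.
text = "Samantha has 20 yards of rope. She needs to divide"
print(build_length_filtered_query(text))
```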
  • the query may be broken up into several queries, each of which returns a separate result set.
  • Step 1) calculate the maximum distance parameter for the NEAR condition only for the first line of text. In this example there is at most one word separating each of the terms.
  • Step 2) create a query containing only terms from the first line of text using the calculated maximum distance (see example queries below).
  • Step 3) execute the query against the item bank and save the result set.
  • if a line contains fewer than 2 search terms, e.g., line 4, which contains only the single term “answer”, it may be omitted or concatenated onto another query.
  • Step 4) iterate over the result sets and record the number of times each unique item appears.
  • Step 5) build a new result set by comparing the count for each unique item to a threshold and including only those items that meet or exceed the threshold, e.g., items that appear in 2 out of 3 of the result sets.
  • the final query concatenates the single word ‘answer’ from line 4 of the text onto the end of the query for line 3.
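Steps 4 and 5, counting how often each unique item appears across the per-line result sets and keeping only items that meet a threshold, can be sketched as follows; the item identifiers and result sets are invented for illustration.

```python
from collections import Counter

def vote_on_results(result_sets, threshold=2):
    """Step 4: count appearances of each unique item across result sets.
    Step 5: keep only items meeting or exceeding the threshold."""
    counts = Counter(item for result in result_sets for item in set(result))
    return {item for item, n in counts.items() if n >= threshold}

# One hypothetical result set per per-line query.
line_results = [
    {"item-17", "item-42", "item-90"},   # line 1 query
    {"item-42", "item-90"},              # line 2 query
    {"item-42", "item-55"},              # line 3 query (+ 'answer' from line 4)
]
# item-42 appears in 3 sets and item-90 in 2, meeting the 2-of-3 threshold.
print(vote_on_results(line_results))
```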
  • the exemplary embodiment also relates to an apparatus for performing the operations discussed herein.
  • This apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer program stored in the computer.
  • a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, or any type of media suitable for storing electronic instructions, and each coupled to a computer system bus.
  • a machine-readable medium includes any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer).
  • a machine-readable medium includes read only memory (“ROM”); random access memory (“RAM”); magnetic disk storage media; optical storage media; flash memory devices; and electrical, optical, acoustical or other form of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.), just to mention a few examples.
  • the methods illustrated throughout the specification may be implemented in a computer program product that may be executed on a computer.
  • the computer program product may comprise a non-transitory computer-readable recording medium on which a control program is recorded, such as a disk, hard drive, or the like.
  • Common forms of non-transitory computer-readable media include, for example, floppy disks, flexible disks, hard disks, magnetic tape, or any other magnetic storage medium, CD-ROM, DVD, or any other optical medium, a RAM, a PROM, an EPROM, a FLASH-EPROM, or other memory chip or cartridge, or any other tangible medium from which a computer can read.
  • the method may be implemented in transitory media, such as a transmittable carrier wave in which the control program is embodied as a data signal using transmission media, such as acoustic or light waves, such as those generated during radio wave and infrared data communications, and the like.

Abstract

This disclosure provides a method and system to create an educational assessment using an image processing system. According to an exemplary method, a preexisting printed assessment is scanned to produce an image file, and the image processing system generates and executes a search query based on a user-selected preexisting question. The search queries a Data Warehouse/Repository (DW/R) to retrieve one or more predefined questions and associated metadata. The retrieved question and metadata are used to create the educational assessment.

Description

    CROSS REFERENCE TO RELATED PATENTS AND APPLICATIONS
  • U.S. patent application Ser. No. 14/609,820, filed Jan. 30, 2015, by Clar et al., entitled “METHOD AND SYSTEM TO ATTRIBUTE METADATA TO PREEXISTING DOCUMENTS” is incorporated herein by reference in its entirety.
  • BACKGROUND
  • This disclosure relates to document processing methods and systems. According to an exemplary embodiment of this disclosure, a document processing method and system is provided to create an educational assessment using an image processing system, such as a multifunction printer (MFP), the created educational assessment including a plurality of questions administered to one or more students for completion.
  • Moreover, the present disclosure relates to the process of assessing the attributes of a student or group of students at selected times during their learning process and particularly relates to the assessment and evaluation of student attributes or progress in a structured classroom where a teacher is required to educate the students to a level of proficiency in various subject matters and at particular grade levels. Typically, in a grade level classroom, the teacher periodically gives the students printed form assessments or tests, as they have previously been referred to, in order to obtain an indication of the student(s) level(s) of proficiency in the subject matter of immediate interest. U.S. Patent Publication No. 2010/0075290, published Mar. 25, 2010, by DeYoung et al., and entitled “AUTOMATIC EDUCATIONAL ASSESSMENT SERVICE” describes a system for automatically evaluating assessments of the type given by a teacher/educator for determining the state of learning or progress of students during the course of instructions; and, the system is applicable particularly in a classroom setting where the teacher is responsible for educating a relatively large group of students. The system and technique of the present disclosure enables the teacher/educator to select from the digital user interface (DUI) of a Multifunction Device (MFD) any of multiple predetermined stored assessment forms in a Data Warehouse/Repository of such assessment forms for administration to a teacher/educator selected group of one or more students.
  • The teacher then requests the system to create an Assessment Batch and to print out personalized versions of the assessment form, where each version is automatically bar coded for the individual student. The student's name is also printed on the form for the purpose of delivering each assessment to the appropriate student. If desired, the student's name may be printed on the reverse side of the form such as, for example in large print, such that the person administering the test can verify from a distance that each student has the correct form, and so that forms can be handed out individually without disclosing the content of the assessment.
  • Once the students have completed the assessment, or alternatively where the teacher/educator marks the assessment for students' oral response, the marked assessment forms are then scanned into the system at the MFD.
  • Based on the information bar coded on the scanned forms, the system then identifies the student and Assessment Batch. The system then employs the appropriate image analysis of the markings, and performs an evaluation of each item on each of the assessments based upon a pre-programmed rubric. The system then automatically stores a preliminary evaluation in the Data Warehouse/Repository for each student. The teacher/educator may then view the assessments at a remote terminal and validate/annotate them. The system then automatically updates the validated/annotated assessment records in the Data Warehouse/Repository (DW/R) for later retrieval in various report views, which may be retrieved at the MFD or remotely by the teacher or other authorized educator.
  • This disclosure and the exemplary embodiments provided herein address concerns of users of an Automatic Educational Assessment System as disclosed in U.S. Patent Publication No. 2010/0075290, published Mar. 25, 2010, by DeYoung et al., and entitled “AUTOMATIC EDUCATIONAL ASSESSMENT SERVICE”, which include the desire to use preexisting assessments and curriculum.
  • INCORPORATION BY REFERENCE
  • IMS Global Learning Consortium, IMS Question & Test Interoperability Overview, http://www.imsglobal.org/question/qtiv2p1/imsqti_oviewv2p1.html, 31 Aug. 2012;
  • U.S. Pat. No. 8,831,504, issued Sep. 9, 2014, by German et al., and entitled “SYSTEM AND METHOD FOR GENERATING INDIVIDUALIZED EDUCATIONAL PRACTICE WORKSHEETS”;
  • U.S. Pat. No. 8,768,241, issued Jul. 1, 2014, by Venable, and entitled “SYSTEM AND METHOD FOR REPRESENTING DIGITAL ASSESSMENTS”;
  • U.S. Pat. No. 8,725,059, issued May 13, 2014, by Lofthus et al, and entitled “SYSTEM AND METHOD FOR RECOMMENDING EDUCATIONAL RESOURCES”;
  • U.S. Pat. No. 8,718,534, issued May 6, 2014, by Srinivas Sharath, and entitled “SYSTEM FOR CO-CLUSTERING OF STUDENT ASSESSMENT DATA”;
  • U.S. Pat. No. 8,699,939, issued Apr. 15, 2014, by German et al., and entitled “SYSTEM AND METHOD FOR RECOMMENDING EDUCATIONAL RESOURCES”;
  • U.S. Pat. No. 8,521,077, issued Aug. 27, 2013, by Venable, and entitled “SYSTEM AND METHOD FOR DETECTING UNAUTHORIZED COLLABORATION ON EDUCATIONAL ASSESSMENTS”;
  • U.S. Pat. No. 8,457,544, issued Jun. 4, 2013, by German et al., and entitled “SYSTEM AND METHOD FOR RECOMMENDING EDUCATIONAL RESOURCES”;
  • U.S. Pat. No. 7,965,891, issued Jun. 21, 2011, by Handley et al., and entitled “SYSTEM AND METHOD FOR IDENTIFYING AND LABELING FIELDS OF TEXT ASSOCIATED WITH SCANNED BUSINESS DOCUMENTS”;
  • U.S. Pat. No. 7,756,332, issued Jul. 13, 2010, by Jager, and entitled “METADATA EXTRACTION FROM DESIGNATED DOCUMENT AREAS”;
  • U.S. Pat. No. 7,689,037, issued Mar. 30, 2010, by Handley et al., and entitled “SYSTEM AND METHOD FOR IDENTIFYING AND LABELING FIELDS OF TEXT ASSOCIATED WITH SCANNED BUSINESS DOCUMENTS”;
  • U.S. Pat. No. 7,058,567, issued Jun. 6, 2006, by Ait-Mokhtar et al., and entitled “NATURAL LANGUAGE PARSER”;
  • U.S. Pat. No. 6,178,308, issued Jan. 23, 2001, by Bobrow et al., and entitled “PAPER-BASED INTERMEDIUM FOR PROVIDING INTERACTIVE EDUCATIONAL SERVICES”;
  • U.S. Patent Publication No. 2014/0234822, published Aug. 21, 2014, by Srinivas et al., and entitled “SYSTEM FOR CO-CLUSTERING OF STUDENT ASSESSMENT DATA”;
  • U.S. Patent Publication No. 2014/0093858, published Apr. 3, 2014, by Caruthers, Jr. et al., and entitled “METHOD AND SYSTEM FOR EVALUATING ELECTRONIC DOCUMENT”;
  • U.S. Patent Publication No. 2014/0065594, published Mar. 6, 2014, by Venable, and entitled “CREATING ASSESSMENT MODEL FOR EDUCATIONAL ASSESSMENT SYSTEM”;
  • U.S. Patent Publication No. 2014/0064622, published Mar. 6, 2014, by Newell et al., and entitled “METHOD AND SYSTEM FOR EVALUATING HANDWRITTEN DOCUMENTS”;
  • U.S. Patent Publication No. 2012/0189999, published Jul. 26, 2012, by Uthman et al., and entitled “SYSTEM AND METHOD FOR USING OPTICAL CHARACTER RECOGNITION TO EVALUATE STUDENT WORKSHEETS”;
  • U.S. Patent Publication No. 2011/0195389, published Aug. 11, 2011, by DeYoung et al., and entitled “SYSTEM AND METHOD FOR TRACKING PROGRESSION THROUGH AN EDUCATIONAL CURRICULUM”;
  • U.S. Patent Publication No. 2011/0151423, published Jun. 23, 2011, by Venable, and entitled “SYSTEM AND METHOD FOR REPRESENTING DIGITAL ASSESSMENTS”;
  • U.S. Patent Publication No. 2011/0123967, published May 26, 2011, by Perronnin et al., and entitled “DIALOG SYSTEM FOR COMPREHENSION EVALUATION”;
  • U.S. Patent Publication No. 2010/0157345, published Jun. 24, 2010, by Lofthus et al., and entitled “SYSTEM FOR AUTHORING EDUCATIONAL ASSESSMENTS”;
  • U.S. Patent Publication No. 2010/0075292, published Mar. 25, 2010, by DeYoung et al., and entitled “AUTOMATIC EDUCATION ASSESSMENT SERVICE”;
  • U.S. Patent Publication No. 2010/0075291, published Mar. 25, 2010, by DeYoung et al., and entitled “AUTOMATIC EDUCATIONAL ASSESSMENT SERVICE”;
  • U.S. Patent Publication No. 2010/0075290, published Mar. 25, 2010, by DeYoung et al., and entitled “AUTOMATIC EDUCATIONAL ASSESSMENT SERVICE”;
  • U.S. Patent Publication No. 2009/0035733, published Feb. 5, 2009, by Meitar et al., and entitled “DEVICE, SYSTEM, AND METHOD OF ADAPTIVE TEACHING AND LEARNING”;
  • U.S. Patent Publication No. 2005/0041860, published Feb. 24, 2005, by Jager, and entitled “METADATA EXTRACTION FROM DESIGNATED DOCUMENT AREAS”; and
  • Misra et al., “A SYSTEM FOR AUTOMATED EXTRACTION OF METADATA FROM SCANNED DOCUMENTS USING LAYOUT RECOGNITION AND STRING PATTERN SEARCH MODELS”, Archiving, 2009, 1509STP: 107-112, 17 pages, are incorporated herein by reference in their entirety.
  • BRIEF DESCRIPTION
  • In one embodiment of this disclosure, described is a method of creating an educational assessment using an image processing system, the created educational assessment including a plurality of questions associated with one or more predefined formats including metadata associated with each of the plurality of questions, and the created educational assessment administered to one or more students for completion, the method comprising: a) a user of the image processing system performing one or more of scanning a preexisting educational assessment into the image processing system generating a digital representation of the preexisting educational assessment which does not conform to the one or more predefined formats and does not include the associated metadata, and loading into the image processing system the digital representation of the preexisting educational assessment which does not conform to the one or more predefined formats and does not include the associated metadata; b) the image processing system displaying the preexisting educational assessment on a display operatively associated with the image processing system; c) the user selectably capturing an image of a single question associated with the displayed preexisting educational assessment; d) the image processing system generating a search query including one or more items included within the captured image of the single question; e) the image processing system executing a search of a Data Warehouse/Repository (DW/R) based on the search query to retrieve one or more predefined questions matching one or more search criteria associated with the search query, the one or more predefined questions associated with the one or more predefined formats including metadata associated with each of the one or more predefined questions; f) the image processing system displaying the one or more matching predefined questions on the display; g) the user selecting one or more of the displayed matching predefined questions 
including the metadata associated with each of the one or more predefined questions; and h) the image processing system creating a digital representation of an educational assessment including the user selected one or more displayed matching predefined questions including the metadata associated with each of the one or more predefined questions.
  • In another embodiment of this disclosure, described is an image processing system for creating an educational assessment, the created educational assessment including a plurality of questions associated with one or more predefined formats including metadata associated with each of the plurality of questions, and the created educational assessment administered to one or more students for completion, the image processing system comprising: a preexisting educational assessment processing module configured to perform one or more of receiving a digital representation of a preexisting educational assessment into the image processing system generated using an operatively associated scanner, and loading into the image processing system the digital representation of the preexisting educational assessment, the digital representation of the preexisting educational assessment not conforming to the one or more predefined formats and not including the associated metadata and the preexisting educational assessment processing module configured to display the preexisting educational assessment on a display operatively associated with the image processing system; an image capture module configured to capture an image of a single question associated with the displayed preexisting educational assessment; a search query module configured to generate a search query including one or more items included within a captured image of a single question associated with the displayed preexisting educational assessment, and execute a search of a Data Warehouse/Repository (DW/R) based on the search query to retrieve one or more predefined questions matching one or more search criteria associated with the search query, the one or more predefined questions associated with the one or more predefined formats including metadata associated with each of the one or more predefined questions; and an educational assessment creation module configured to display one or more matching predefined questions, 
receive one or more of the displayed matching predefined questions selected by the user, and create a digital representation of an educational assessment including the user selected one or more displayed matching predefined questions including the metadata associated with each of the one or more predefined questions.
  • In still another embodiment of this disclosure, described is a method of creating an educational assessment using an image processing system, the created educational assessment including a plurality of questions including metadata associated with each of the plurality of questions, and the created educational assessment administered to one or more students for completion, the method comprising: a) a user of the image processing system creating and entering a question using a User Interface (UI) operatively associated with the image processing system; b) the image processing system generating a search query including one or more items included in the user created question; c) the image processing system executing a search of a Data Warehouse/Repository (DW/R) based on the search query to retrieve one or more predefined questions matching one or more search criteria associated with the search query, the one or more predefined questions associated with the one or more predefined formats including metadata associated with each of the one or more predefined questions; d) the image processing system displaying the one or more matching predefined questions and associated metadata on the UI; e) the user selecting one of the displayed matching predefined questions and/or selecting the metadata associated with one of the displayed matching predefined questions; and f) the image processing system creating a digital representation of an educational assessment including the user created question including metadata associated with the user selected matching predefined question and/or user selected metadata.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a pictorial diagram of a method to process preexisting assessments according to an exemplary embodiment of this disclosure;
  • FIG. 2 is a diagram of a system to process preexisting assessments according to an exemplary embodiment of this disclosure;
  • FIGS. 3A and 3B are a flow chart of a method to generate a printed assessment for manual marking by a student according to an exemplary embodiment of this disclosure;
  • FIG. 4 is a pictorial diagram of a workflow method to create an educational assessment based on a preexisting educational assessment not including metadata using an image processing system according to an exemplary embodiment of this disclosure, the created educational assessment including a plurality of questions and associated metadata retrieved from a question item bank;
  • FIG. 5 is an example of a preexisting educational assessment question according to an exemplary embodiment of this disclosure;
  • FIG. 6 is an example of an original preexisting teacher created educational assessment which does not include any associated metadata, the original preexisting educational assessment subsequently processed according to an exemplary embodiment of this disclosure;
  • FIG. 7 illustrates the display of the example preexisting teacher created educational assessment after it is scanned into the image processing system according to an exemplary embodiment of this disclosure;
  • FIG. 8 illustrates the image processing system displayed preexisting educational assessment in FIG. 7 including a teacher selected question according to an exemplary embodiment of this disclosure;
  • FIG. 9 is a detailed view of the teacher selected question shown in FIG. 8;
  • FIG. 10 illustrates a Question Selected Tool including question query results for selection by a teacher and features to further refine the query according to an exemplary embodiment of this disclosure;
  • FIG. 11 is a pictorial diagram of another workflow method to create an educational assessment based on a teacher created question not including metadata using an image processing system according to an exemplary embodiment of this disclosure, the created educational assessment including a plurality of questions and associated metadata retrieved from a question item bank; and
  • FIG. 12 is a detailed view of the teacher created question shown in FIG. 11.
  • DETAILED DESCRIPTION
  • This disclosure provides a method and system that can take digital content that has been converted from paper and, with the aid of the content owner, such as a teacher, search for similar and relevant content from published items. The teacher can then use their content and associate it with the published items, or replace their content with the published item(s). The method and system provide an easy-to-use interface that enables teachers to create their own digital content based on existing and previously used paper based content. In addition, the disclosed method and system allows for the elimination or reduction in the amount of professional services, i.e., specially trained technicians, required for the conversion of teacher content. See U.S. patent application Ser. No. 14/609,820, filed Jan. 30, 2015, by Clar et al., entitled “METHOD AND SYSTEM TO ATTRIBUTE METADATA TO PREEXISTING DOCUMENTS”.
  • Referring to FIG. 1, an overview of the functional operation of an assessment creation/evaluation system is illustrated wherein at station 1 the multifunctional device (MFD) is provided for the teacher/educator to input the information required regarding the assessment form and student or number of students desired to create an Assessment Batch; and, once the Assessment Batch has been created in the system by teacher/educator input at the DUI (digital user interface) of the MFD, the assessments may be also printed at the MFD or any remote printer connected thereto. In the present practice, an Assessment Batch includes the teacher's name and a student list which includes the names of the students to be included in the batch, the particular assessment form to be administered to the students in the student list and the creation date of the Assessment Batch.
  • At station 2 of the system indicated generally at 10 in FIG. 1, the teacher/educator administers the assessments which are marked. Depending on type of the assessment, the printed sheets may be marked by the teacher/educator or the students according to the nature of the assessment.
  • At station 3, the teacher/educator or their designated representative, scans the marked assessments into the system at the MFD. At station 4, the system automatically evaluates the assessments employing image analysis according to the established rubrics associated with the assessment form associated with the Assessment Batch and enables the teacher to access the evaluations at station 5 which is illustrated as a remote station such as a teacher's personal computer (PC). The teacher/educator validates/annotates the assessments and upon receipt of the validation, the system generates reports at station 6 which may be accessed and viewed at either the MFD or the teacher's personal computer terminal remote from the MFD.
  • Referring to FIG. 2, the overall architecture of the system employed with the presently disclosed method is illustrated pictorially with the MFD 12 connected through an application server 14 along line 16 to a network 18 which may be either a local or wide area network and may include connections to the internet. A remote terminal or PC 20 such as a teacher/educator access terminal is connected along line 22 to the network 18. A system server 24 is also connected to the network 18 and provides the functions of database access, serves as a workflow engine, mail handler, web server and functions of image processing/scoring.
  • A Data Warehouse/Repository 26 is also connected to the network and contains such items as assessment forms and associated rubrics, workflow definitions, Assessment Batch records, reports and teacher/student/class data and is operable to receive updates and to provide for access to data stored therein remotely therefrom over network 18.
  • As mentioned hereinabove, the system and method of the present disclosure function to assist a teacher/educator by providing automatic evaluation of assessments administered to students based upon established rubrics programmed into the system and employing image analysis. The system and method of the present disclosure have the capability to evaluate assessments which are marked with images other than by marking within a box or bubble with respect to multiple choice answers. The system has the ability to scan the marked assessment and lift the manually made marks made during the administering of the assessment from the preprinted markings on the assessment sheet. The system and method then employ image analysis to identify and evaluate the lifted marks. The method and system are capable of handling numerous types of assessment items employed by teachers/educators examples of which are illustrated in the present disclosure in FIGS. 8-22.
  • Various types of assessments may be administered to the students and may include summative, formative, diagnostic, interest, preference and benchmark assessments.
  • Referring to FIGS. 3A and 3B, the operation of the method of the present disclosure is presented in block diagram form in which, at step 30, the teacher/educator selects the education assessment service (EAS) print service from the DUI (digital user interface) of the MFD 12, and the system proceeds to require the teacher to provide authentication or personal identification information at step 32. At step 34 the system then proceeds to display on the MFD DUI all the pre-defined assessment forms currently associated with the teacher's identification entered in at step 32.
  • The teacher then chooses at step 36 an assessment form and initiates the formation of an assessment “Batch” associated with that teacher and the selected assessment form. It will be understood, that once initiated, the “Assessment Batch” comprises the basic evaluation unit or cell that the teacher has requested. The teacher then proceeds at step 38 to input a class to assess such as, for example, a seventh grade class, a seventh grade math class, a fifth grade English writing class, or a fourth grade reading class, etc. The system then proceeds to step 40 and enquires as to whether the teacher/educator wishes to select the entire class; and, if the enquiry in step 40 is answered in the affirmative, the system then proceeds to step 42 and includes all students in the class on the Assessment Batch Student List. However, if the query at step 40 is answered in the negative, the system proceeds to step 44 and the class list is displayed on the MFD DUI and the teacher selects specific students to be included on the Assessment Batch Student List.
  • From step 42 or step 44 the system then proceeds to step 46 and the teacher is prompted to select print from the MFD DUI. The system then proceeds to step 48 and automatically creates a new Assessment Batch record in the Data Warehouse/Repository to store the teacher's identification, the particular assessment form, the Student List, the status data, the date created, and other data which may be required by the particular school administrator/system.
  • The system then proceeds to step 50 and automatically formats a personalized assessment layout for each student on the Student List, which layout includes the student name to ensure each student receives the correct assessment and an identification bar code to encode the Assessment Batch and the student. The assessment item order/layout for each student may be varied for each student to discourage students from looking at neighboring students' assessments for hints. The system then proceeds to step 52, prints the personalized page(s) for each student on the Student List for the Assessment Batch. The system then confirms that all page(s) are printed and updates the Data Warehouse/Repository.
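One hypothetical way to realize step 50 is sketched below: each student's layout shuffles the item order deterministically from the batch and student identity, so the ordering both differs between neighbors and can be reproduced when the marked form is scanned back in. Every name and the barcode payload format here are assumptions for illustration, not the system's actual scheme.

```python
import random

def personalize_batch(batch_id, student_names, items):
    """Return one personalized layout per student on the Student List."""
    layouts = []
    for name in student_names:
        order = items[:]
        # Deterministic per-student shuffle: the same (batch, student)
        # pair always yields the same item order, so the layout can be
        # reconstructed after the barcode is decoded at scan time.
        random.Random(f"{batch_id}:{name}").shuffle(order)
        layouts.append({
            "student": name,                   # printed on the form
            "barcode": f"{batch_id}|{name}",   # encodes batch + student
            "items": order,
        })
    return layouts

sheets = personalize_batch("batch-7", ["Ana", "Ben"], ["Q1", "Q2", "Q3", "Q4"])
print(sheets[0]["barcode"])  # batch-7|Ana
```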
  • At step 54, the teacher/educator takes the personalized printed assessment page(s) and administers the assessment to each designated student. The teacher/assessor or student, as the case may be, manually marks on the printed assessment page(s) the appropriate response to the challenge indicated on the particular assessment page. Upon completion of marking of the assessments, the marked assessment pages are collected by the teacher/educator for subsequent evaluation.
  • For an assessment creator/evaluator system, the use of metadata is crucial for the functions of data tracking, reporting, and the customization of learning for the students, as well as for assisting teachers in their daily practice. The type of metadata to track ranges from the global level to the question level. For example, items being tracked on a global level may include:
  • (A) Global Level:
  • 1. Assessment Name
  • 2. Description
  • 3. Level (Grade)
  • 4. Subject
  • 5. Standards
  • 6. Skills
  • The metadata being tracked on the question level may include:
  • (B) Question Level:
  • 1. Question Type (Multiple Choice, Rubric, Constructed Response, Fill in the Box, N of M, Bubble sheet, and future types, allowing for growth of the invention)
  • 2. Question Number
  • 3. Points (worth)
  • 4. Description
  • 5. Standards
  • 6. Skills
  • This data is stored in the system and aligned with the assessment that was scanned in. The assessment creator/evaluator system assigns additional data for filing and sorting of the assessment such as:
  • (C) Filing and Sorting Data Assigned
  • Assessment Name (A1)
  • Version number (auto generated as 1.0 for first install, successively increases as same assessment is scanned again with same name)
  • Created By (from user ID)
  • Grade (A3)
  • CCSS code (A5)
  • Description (A2)
  • Subject (A4).
  • Educators, over the course of their careers, accumulate a wealth of material that they like to incorporate into their lesson plans. This material includes assessments, often provided by publishers. One of the major challenges in getting educators to adopt automatic educational assessment technology is that educators are not willing to simply replace the content that they have used for years with the content included with the new platform. Simply providing a wealth of material isn't enough; the automatic educational assessment system needs to provide a mechanism for importing this existing material into the system.
  • Currently, automatic educational assessment systems do not have the capability to scan an arbitrary assessment and deconstruct it into separate questions, including question types, correct answers, etc. Existing content must be imported into the system and modified into a format that the assessment system is capable of interpreting. A current mechanism for doing this is a manually intensive process that requires scanning existing assessments and manually indicating the location of “hot spots” on the page, i.e., locations into which students will record the answers to the questions on the assessment, as well as the correct answers to each question. Some disadvantages associated with this stop-gap measure include (1) the use of a manual process requiring a trained professional services team to scan and mark up the assessments (identifying “hot spots,” etc.), and to identify question types and correct answers; (2) the resulting assessments are one-off aberrations that require a separate flow through an automatic educational assessment system; (3) detailed metadata about the questions, including the content of the question, is lost; and (4) assessments created in this manner cannot be used for newly created workflows based on automatically generated assessments, e.g., online assessments or tablet-based assessments.
  • Many publishers of educational material provide assessment content in structured, digital formats such as the IMS Question and Test Interoperability (QTI) standard. Standards like QTI provide a very structured representation of assessment questions, also referred to as “items”, including complete question content, question type, grade level, unit, correct answers, associated artifacts (e.g., images, charts, etc.) as well as a plethora of additional detailed metadata.
  • Typically, when a school district purchases materials from the publisher, the school district is also granted the right to use the structured, digital versions of the assessment content that is included with the material. For a fee, third party vendors will act as a go-between aggregating the content to which a specific school district has access across multiple publishers, and providing it to the school district, usually as something similar to a large set of flat files in QTI XML format. It is then possible to parse the content and store it in a searchable database referred to as an Item Bank. Once stored, the items may be retrieved in a number of ways, including database queries. For example, something like the following query can be used to find items that contain the words “quick,” “brown,” and “fox”, assuming the item body content is stored in column ‘ItemBody’ of table ‘items’:
  • SELECT * FROM items WHERE CONTAINS(ItemBody, ‘quick’) AND
    CONTAINS(ItemBody, ‘brown’) AND CONTAINS(ItemBody,
    ‘fox’);
  • Other, more sophisticated queries can test for specific word order or co-location of terms, e.g. return only questions that contain the words “quick,” “brown,” “fox,” “lazy,” and “dog” in that order:
  • SELECT * FROM items WHERE CONTAINS(ItemBody,
    ‘NEAR((quick, brown, fox, lazy, dog), 3)’);
  • Note the number “3” near the end of the query; this is the maximum distance parameter to the NEAR condition of the CONTAINS statement. This is used to specify the maximum number of non-matching terms that may appear between the terms specified in the query. Accordingly, for this example, the following item body is considered a match because none of the search terms are separated by more than 3 words.
      • The quick brown fox jumped over the lazy dog.
  • However, the following item body is not considered a match because the phrase “jumped over the very” separates the terms “fox” and “lazy” by more than 3 words.
      • The quick brown fox jumped over the very lazy dog.
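The NEAR maximum-distance rule illustrated above can be approximated outside the database with a small matcher. This is a hedged sketch of the described behavior, not the full-text engine's actual implementation:

```python
def near_match(text, terms, max_distance):
    """Return True if all `terms` occur in order in `text`, with at most
    `max_distance` non-matching words between consecutive matched terms
    (a simplified reading of the NEAR condition described above)."""
    words = [w.strip(".,;:!?").lower() for w in text.split()]
    idx = 0          # index of the next search term to match
    gap = 0          # non-matching words since the last matched term
    matching = False
    for w in words:
        if idx < len(terms) and w == terms[idx].lower():
            idx += 1
            gap = 0
            matching = True
        elif matching and idx < len(terms):
            gap += 1
            if gap > max_distance:
                return False
    return idx == len(terms)

terms = ["quick", "brown", "fox", "lazy", "dog"]
near_match("The quick brown fox jumped over the lazy dog.", terms, 3)       # True
near_match("The quick brown fox jumped over the very lazy dog.", terms, 3)  # False
```

In the second sentence, the four words “jumped over the very” separate “fox” from “lazy”, exceeding the maximum distance of 3, so the matcher rejects it, mirroring the database behavior described above.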
  • The automatic educational assessment method and system disclosed herein leverages access to item banks to retrieve the structured, digital version of assessment items.
  • Initially, the teacher scans the existing content into the teacher-facing UI S61 and the existing assessment content is displayed 70 by the UI.
  • Next, the teacher draws a bounding box 71 around the body of an individual item, e.g., the text content of the question not including the answers in a multiple choice question S62.
  • Next, the bounding box is used to crop the image and Optical Character Recognition (OCR) is used to extract the text content 72 from the cropped image S63.
  • Next, the text is converted into an SQL (Structured Query Language) query 74 S64 and the query is executed by accessing an item bank 76. Step S65 executes some simple techniques to prevent the generation of long, complex queries and/or to improve results, such as:
  • Dropping stop words, such as “or,” “and,” “the,” etc.
  • Using only the longest words.
  • Splitting a query into some number N queries, e.g., one per line of extracted text. If many questions are returned by each query, results may be narrowed by only considering questions that are returned by M out of N queries S65.
  • Next, the teacher is presented with a list of the results 78 and chooses the correct or desired item S66.
  • If there are no satisfactory results returned because the corresponding item isn't available in the Item Bank, an assessment creation system can be used to recreate the item content as disclosed in U.S. patent application Ser. No. 14/609,820, filed Jan. 30, 2015, by Clar et al.
  • At this point the teacher may choose to alter the item content. While content publishers generally don't encourage this, it is not uncommon for teachers to use modified versions of publisher content.
  • Steps S61-S66 are repeated for each remaining question on the assessment.
  • The automatic educational assessment system generates and stores a new assessment with the content from the selected items and the assessment may now be administered to students along with any other assessment using a workflow as described with reference to FIGS. 1-3 associated with the automatic assessment system.
  • For those items that exist in the Item Bank to which the teacher has access, some potential benefits associated with the disclosed automatic educational assessment system include:
      • A teacher's existing content is quickly and easily imported into the automatic educational assessment system and used to create assessments compatible with the main, paper-based workflow, overcoming a significant barrier to adoption of an automatic assessment system by teachers.
      • Items imported may be relatively easily mixed-and-matched with other items to create custom assessments.
      • Assessments translated from paper to digital format may also be used in additional workflows, e.g., tablet or online based assessments.
      • All relevant metadata for each item is preserved without the need to extract metadata from the content on the page.
  • FIG. 5 is an example of a preexisting educational assessment question according to an exemplary embodiment of this disclosure.
  • FIG. 6 is an example of an original preexisting teacher created educational assessment which does not include any associated metadata, the original preexisting educational assessment subsequently processed according to an exemplary embodiment of this disclosure.
  • FIG. 7 illustrates the display of the example preexisting teacher created educational assessment after it is scanned into the image processing system according to an exemplary embodiment of this disclosure.
  • FIG. 8 illustrates the image processing system displayed preexisting educational assessment in FIG. 7 including a teacher selected question according to an exemplary embodiment of this disclosure.
  • FIG. 9 is a detailed view of the teacher selected question shown in FIG. 8.
  • FIG. 10 illustrates a Question Selected Tool including question query results for selection by a teacher and features to further refine the query according to an exemplary embodiment of this disclosure.
  • In many cases educators like to create their own questions or alter the content of existing questions, and the assessment creation toolkit provided herein includes this capability using an alternative workflow. The alternative workflow provides the teacher with the flexibility to tailor and tune assessments precisely as they see fit for their students, without the drawback of other systems, which lose the metadata that publisher-created content contains. Item formats such as QTI contain significant metadata that provides detailed information about individual questions, well above and beyond the item body and possible answers. Information such as grade level, subject, Common Core compliance, and much more is contained within this metadata. For example, as shown in FIG. 10, each of the assessment questions is associated with a Common Core Standard (CCS) metadata field, here CCS: 3.MD.A.1.
  • Using a slightly altered version of the main workflow previously described with reference to FIG. 4, educators can construct their own questions and use the content as a query to find similar questions in an item bank. If a similar enough question is returned as a result of the query, the returned question metadata can be copied from the existing returned question into the newly teacher created question and modified as needed, thus avoiding the manual metadata entry that would otherwise be needed.
  • With reference to FIG. 11, shown is a pictorial diagram of the slightly altered version of the main workflow previously described, the altered version creating an educational assessment based on a teacher created question not including metadata using an image processing system according to an exemplary embodiment of this disclosure, the created educational assessment including a plurality of questions and associated metadata retrieved from a question item bank. Initially, at step S81 the teacher creates a new question 90 using the assessment creator. FIG. 12 is a detailed view of the teacher created question shown in FIG. 11.
  • Next, at step S82 the text of the question 91 is prepared for conversion into an SQL query. The educator may pick and choose which terms are significant, e.g., “yards” and “divide” to find questions about division involving measurement in yards; other terms like “Jackie” or “rope” may constrain the question too much and reduce or eliminate good matches.
  • Next, at step S83 the question text is used to construct one or more search queries for the item bank 76.
  • Next, at step S84, a matching algorithm expands or narrows the query to fine tune the results before returning the resulting items 92 to the teacher.
  • Next, at step S85 the teacher is presented with a list of the results 93 and chooses an item that most closely matches what she is looking for, and copies the metadata for that question.
  • a. If no results are returned, the teacher may alter the query, e.g., to search for “feet” instead of “yards.”
  • Next, the teacher pastes the metadata from the matching question into her new question, and modifies the fields as needed, and saves the item to her item bank.
  • Next, the teacher repeats steps S81-S85 for each question in the assessment.
  • Finally, the automatic educational assessment system generates and stores a new assessment with the content from the selected items and the assessment may now be given along with any other assessment using the workflow previously described with reference to FIGS. 1-3.
  • After executing this workflow, the teacher has a complete assessment populated with custom content and with all of the appropriate metadata intact.
  • Various aspects of the method and system are now described in further detail.
  • Given the example question in FIG. 5, there are several possible ways that a query or queries may be constructed to attempt to retrieve the corresponding question from an item bank. A few exemplary algorithms are provided below. It is to be understood that the variations may be used alone, or in combination if too many or too few results are returned. Further, the queries may be altered, e.g., using “OR” instead of “AND” clauses, to find similar but not exactly matching questions.
  • It is also important to note that the queries described herein are constructed solely using the content of the question body, as opposed to the entire question (e.g., excluding the multiple choice options). The content of the entire question may be used in combination with another cropping and parsing technique to create queries that will produce different and sometimes better results by allowing for interrogation of other parts of the item model, e.g., the responses in a multiple choice question and/or the answer in addition to the item body.
  • OCR is not a perfect technology, and so it is desirable to avoid relying on a perfect and complete extraction of the content of the entire question. In order to improve results, and increase the possibility of receiving at least one matching item from the item bank, one exemplary technique breaks the question into significant parts, potentially using a confidence metric produced by the OCR engine to focus only on the words with a highest confidence, and build queries from those parts. Variations on this technique are used in the example algorithms below. To facilitate this, some simple natural language processing is used to remove stop words, e.g., and, or, the, etc., from the item content. For example, the item text, after stop words are removed could read something similar to the example shown in FIG. 4 as reference character 72. Additionally, for the examples described below, it is assumed that the text content of the item is stored as a string in the ‘ItemBody’ column of the ‘items’ table within the Item Bank.
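The stop-word removal step can be sketched as follows. The stop-word list here is a small illustrative one (a real system would use a fuller list), and the sample text is an approximation of the FIG. 5 question body reconstructed from the examples in this disclosure:

```python
# Illustrative stop-word list (an assumption; real systems use fuller lists).
STOP_WORDS = {"a", "an", "and", "or", "the", "of", "to", "she", "has",
              "it", "which", "is", "for", "her"}

def significant_terms(item_text):
    """Strip punctuation, lowercase, and drop stop words, preserving
    the original term order for query construction."""
    words = [w.strip(".,;:!?").lower() for w in item_text.split()]
    return [w for w in words if w and w not in STOP_WORDS]

# Approximation of the FIG. 5 question body:
text = ("Jackie has 20 yards of rope. She needs to divide it between "
        "5 people. Which of the following shows the correct units for "
        "her answer?")
significant_terms(text)
# -> ['jackie', '20', 'yards', 'rope', 'needs', 'divide', 'between',
#     '5', 'people', 'following', 'shows', 'correct', 'units', 'answer']
```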
  • In every case it is assumed that at least one item will be returned, and that the result set will contain the correct item. Alternatively, if no acceptable items are returned after trying several different techniques, a manual technique may be used to extract some of the question content directly from the scanned image instead.
  • A Single Simple Query
  • In this example, the entire content of the question is used as a single query. It's worth noting that some databases limit the number of search terms to some maximum, e.g., 64, so for a very long question this alternative may not be an option.
  • Step 1), calculate the maximum distance parameter for the NEAR condition for the entire body of text; this is a count of the stop words between search terms. In this example, the longest stop word phrase is 3 words long, “Which of the”.
  • Step 2), create a query containing all of the terms in the body of text (see example query below).
  • Step 3), execute the query against the item bank and return the result set.
  • The example query below contains every search term in the entire question:
  • SELECT * FROM items WHERE CONTAINS(ItemBody,
    ‘NEAR((Jackie, 20, yards, rope, needs, divide, between, 5,
    people, following, shows, correct, units, answer), 3)’);
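Steps 1)–3) of the single-query algorithm can be sketched as follows. This is a hedged illustration assuming the search terms have already been extracted; the CONTAINS/NEAR syntax follows the SQL Server full-text form used in the examples:

```python
def max_gap(all_words, search_terms):
    """Step 1): the maximum distance parameter is the longest run of
    consecutive non-search words between two matched search terms."""
    terms = {t.lower() for t in search_terms}
    longest = run = 0
    seen_term = False
    for w in all_words:
        if w.lower() in terms:
            if seen_term:
                longest = max(longest, run)
            seen_term = True
            run = 0
        elif seen_term:
            run += 1
    return longest

def single_query(search_terms, distance):
    """Step 2): assemble one CONTAINS/NEAR query over every term."""
    return ("SELECT * FROM items WHERE CONTAINS(ItemBody, "
            f"'NEAR(({', '.join(search_terms)}), {distance})');")

# Approximation of the FIG. 5 question text, punctuation removed:
words = ("Jackie has 20 yards of rope She needs to divide it between "
         "5 people Which of the following shows the correct units for "
         "her answer").split()
terms = ["Jackie", "20", "yards", "rope", "needs", "divide", "between",
         "5", "people", "following", "shows", "correct", "units", "answer"]
max_gap(words, terms)  # 3, from the stop-word phrase "Which of the"
```

Step 3) would then execute `single_query(terms, max_gap(words, terms))` against the item bank.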
  • Querying Using Only Long Words
  • Alternatively, the query may be simplified by using only words longer than some arbitrary threshold.
  • Step 1), calculate the maximum distance parameter for the NEAR condition counting not only skipped stop words, but words containing fewer than some number of characters, e.g., 6. In this example, the longest phrase containing terms not included in the search is 8 words long, e.g., “has 20 yards of rope. She needs to”.
  • Step 2), create a query containing all of the words that contain at least the required number of characters and the calculated distance parameter (see example query below).
  • Step 3), execute the query against the item bank and return the result set.
  • In the example below, only words that are longer than 5 characters are used:
  • SELECT * FROM items WHERE CONTAINS(ItemBody,
    ‘NEAR((Jackie, divide, between, people, following,
    correct, answer), 8)’);
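The long-word variant can be sketched as follows, assuming a minimum length of 6 characters (i.e., "longer than 5 characters") as in the example above:

```python
def long_word_query(all_words, min_len=6):
    """Keep only words of at least min_len characters as search terms;
    the distance parameter becomes the longest run of consecutive
    excluded words between two kept words."""
    kept = [w for w in all_words if len(w) >= min_len]
    longest = run = 0
    seen = False
    for w in all_words:
        if len(w) >= min_len:
            if seen:
                longest = max(longest, run)
            seen = True
            run = 0
        elif seen:
            run += 1
    return ("SELECT * FROM items WHERE CONTAINS(ItemBody, "
            f"'NEAR(({', '.join(kept)}), {longest})');")

# Approximation of the FIG. 5 question text, punctuation removed:
words = ("Jackie has 20 yards of rope She needs to divide it between "
         "5 people Which of the following shows the correct units for "
         "her answer").split()
q = long_word_query(words)
# distance 8, from "has 20 yards of rope She needs to";
# "yards" (5 characters) is excluded from the search terms
```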
  • One Query Per Line
  • Alternatively, the query may be broken up into several queries, each of which returns a separate result set.
  • Step 1), calculate the maximum distance parameter for the NEAR condition only for the first line of text. In this example there is at most one word separating each of the terms.
  • Step 2), create a query containing only terms from the first line of text using the calculated maximum distance (see example queries below).
  • Step 3), execute the query against the item bank and save the result set.
  • Repeat steps 1)-3) for each line of text in the question that contains two or more search terms.
  • If a line contains fewer than 2 search terms, e.g., line 4, which contains only the single term “answer”, it may be omitted or concatenated onto another query.
  • Step 4), iterate over the result sets and record the number of times each unique item appears.
  • Step 5), build a new result set by comparing the count for each unique item to a threshold and including only those items that meet or exceed the threshold, e.g., items that appear in 2 out of 3 of the result sets.
  • The example queries for lines 1 and 2 contain only search terms corresponding to words found on those lines of text:
  • SELECT * FROM items WHERE CONTAINS(ItemBody,
    ‘NEAR((Jackie, 20, yards, rope, needs), 2)’);
    SELECT * FROM items WHERE CONTAINS(ItemBody,
    ‘NEAR((divide, between, 5, people), 2)’);
  • In order to avoid a query that contains only a single word which would therefore return a large set of results, the final query concatenates the single word ‘answer’ from line 4 of the text onto the end of the query for line 3:
  • SELECT * FROM items WHERE CONTAINS(ItemBody,
    ‘NEAR((following, shows, correct, units, answer), 2)’);
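The result-set voting of steps 4)–5) can be sketched as follows. This is a minimal in-memory illustration with hypothetical item ids; a real implementation would first execute each per-line query against the item bank:

```python
from collections import Counter

def vote(result_sets, threshold):
    """Steps 4)-5): count how many per-line result sets each item id
    appears in, and keep only the ids meeting the threshold."""
    counts = Counter(item for rs in result_sets for item in set(rs))
    return {item for item, n in counts.items() if n >= threshold}

# Three per-line result sets of hypothetical item ids; keep items
# returned by at least 2 out of 3 queries:
vote([{101, 102, 103}, {101, 104}, {101, 102}], threshold=2)
# -> {101, 102}
```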
  • Some portions of the detailed description herein are presented in terms of algorithms and symbolic representations of operations on data bits performed by conventional computer components, including a central processing unit (CPU), memory storage devices for the CPU, and connected display devices. These algorithmic descriptions and representations are the means used by those skilled in the data processing arts to most effectively convey the substance of their work to others skilled in the art. An algorithm is generally perceived as a self-consistent sequence of steps leading to a desired result. The steps are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.
  • It should be understood, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to these quantities. Unless specifically stated otherwise, as apparent from the discussion herein, it is appreciated that throughout the description, discussions utilizing terms such as “processing” or “computing” or “calculating” or “determining” or “displaying” or the like, refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
  • The exemplary embodiment also relates to an apparatus for performing the operations discussed herein. This apparatus may be specially constructed for the required purposes, or it may comprise a general-purpose computer selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a computer readable storage medium, such as, but not limited to, any type of disk including floppy disks, optical disks, CD-ROMs, and magnetic-optical disks, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, magnetic or optical cards, or any type of media suitable for storing electronic instructions, and each coupled to a computer system bus.
  • The algorithms and displays presented herein are not inherently related to any particular computer or other apparatus. Various general-purpose systems may be used with programs in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the methods described herein. The structure for a variety of these systems is apparent from the description above. In addition, the exemplary embodiment is not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the exemplary embodiment as described herein.
  • A machine-readable medium includes any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer). For instance, a machine-readable medium includes read only memory (“ROM”); random access memory (“RAM”); magnetic disk storage media; optical storage media; flash memory devices; and electrical, optical, acoustical or other form of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.), just to mention a few examples.
  • The methods illustrated throughout the specification may be implemented in a computer program product that may be executed on a computer. The computer program product may comprise a non-transitory computer-readable recording medium on which a control program is recorded, such as a disk, hard drive, or the like. Common forms of non-transitory computer-readable media include, for example, floppy disks, flexible disks, hard disks, magnetic tape, or any other magnetic storage medium, CD-ROM, DVD, or any other optical medium, a RAM, a PROM, an EPROM, a FLASH-EPROM, or other memory chip or cartridge, or any other tangible medium from which a computer can read and use.
  • Alternatively, the method may be implemented in transitory media, such as a transmittable carrier wave in which the control program is embodied as a data signal using transmission media, such as acoustic or light waves, such as those generated during radio wave and infrared data communications, and the like.
  • It will be appreciated that variants of the above-disclosed and other features and functions, or alternatives thereof, may be combined into many other different systems or applications. Various presently unforeseen or unanticipated alternatives, modifications, variations or improvements therein may be subsequently made by those skilled in the art which are also intended to be encompassed by the following claims.

Claims (20)

What is claimed is:
1. A method of creating an educational assessment using an image processing system, the created educational assessment including a plurality of questions associated with one or more predefined formats including metadata associated with each of the plurality of questions, and the created educational assessment administered to one or more students for completion, the method comprising:
a) a user of the image processing system performing one or more of scanning a preexisting educational assessment into the image processing system generating a digital representation of the preexisting educational assessment which does not conform to the one or more predefined formats and does not include the associated metadata, and loading into the image processing system the digital representation of the preexisting educational assessment which does not conform to the one or more predefined formats and does not include the associated metadata;
b) the image processing system displaying the preexisting educational assessment on a display operatively associated with the image processing system;
c) the user selectably capturing an image of a single question associated with the displayed preexisting educational assessment;
d) the image processing system generating a search query including one or more items included within the captured image of the single question;
e) the image processing system executing a search of a Data Warehouse/Repository (DW/R) based on the search query to retrieve one or more predefined questions matching one or more search criteria associated with the search query, the one or more predefined questions associated with the one or more predefined formats including metadata associated with each of the one or more predefined questions;
f) the image processing system displaying the one or more matching predefined questions on the display;
g) the user selecting one or more of the displayed matching predefined questions including the metadata associated with each of the one or more predefined questions; and
h) the image processing system creating a digital representation of an educational assessment including the user selected one or more displayed matching predefined questions including the metadata associated with each of the one or more predefined questions.
2. The method of creating an educational assessment according to claim 1, wherein steps c)-h) are repeated for a second user selected single question.
3. The method of creating an educational assessment according to claim 1, wherein the user selectably captures the image of the single question using a bounding box.
4. The method of creating an educational assessment according to claim 1, wherein step d) includes the image processing system performing Optical Character Recognition (OCR) to extract text content from the captured image.
5. The method of creating an educational assessment according to claim 4, wherein the extracted text is processed by the image processing system to generate the search query.
6. The method of creating an educational assessment according to claim 5, wherein the search query is a SQL (Structured Query Language) query.
7. The method of creating an educational assessment according to claim 6, wherein step e) narrows the search query if more than a predetermined number of matching predefined questions are retrieved from the DW/R.
8. The method of creating an educational assessment according to claim 1, wherein step d) comprises:
d1) performing OCR to extract text content from the captured image; and
d2) generating a SQL search query, the SQL search query generated using one or more of:
dropping predefined stop words;
using only longest extracted words; and
splitting the query into a plurality of queries.
9. The method of creating an educational assessment according to claim 1, wherein the associated metadata includes one or more of question content, question type, grade level, unit, correct answer, and associated one or more artifacts.
10. The method of creating an educational assessment according to claim 1, wherein step c) includes the user selecting one or more items included in the captured image of the single question to be used to generate the search query.
11. An image processing system comprising memory storing instructions for performing the method of creating an educational assessment according to claim 1.
12. An image processing system for creating an educational assessment, the created educational assessment including a plurality of questions associated with one or more predefined formats including metadata associated with each of the plurality of questions, and the created educational assessment administered to one or more students for completion, the image processing system comprising:
a preexisting educational assessment processing module configured to perform one or more of receiving a digital representation of a preexisting educational assessment into the image processing system generated using an operatively associated scanner, and loading into the image processing system the digital representation of the preexisting educational assessment, the digital representation of the preexisting educational assessment not conforming to the one or more predefined formats and not including the associated metadata and the preexisting educational assessment processing module configured to display the preexisting educational assessment on a display operatively associated with the image processing system;
an image capture module configured to capture an image of a single question associated with the displayed preexisting educational assessment;
a search query module configured to generate a search query including one or more items included within a captured image of a single question associated with the displayed preexisting educational assessment, and execute a search of a Data Warehouse/Repository (DW/R) based on the search query to retrieve one or more predefined questions matching one or more search criteria associated with the search query, the one or more predefined questions associated with the one or more predefined formats including metadata associated with each of the one or more predefined questions; and
an educational assessment creation module configured to display one or more matching predefined questions, receive one or more of the displayed matching predefined questions selected by the user, and create a digital representation of an educational assessment including the user selected one or more displayed matching predefined questions including the metadata associated with each of the one or more predefined questions.
13. The image processing system according to claim 12, wherein the image capture module is configured to selectably capture the image of the single question using a bounding box.
14. The image processing system according to claim 12, wherein the search query module is configured to perform Optical Character Recognition (OCR) to extract content from the captured image and the extracted text is processed to generate a SQL (Structured Query Language) query.
15. The image processing system according to claim 12, wherein the associated metadata includes one or more of question content, question type, grade level, unit, correct answer, and associated one or more artifacts.
16. The image processing system according to claim 12, wherein the search query module is configured to generate the search query using one or more user selected items included in the captured image of the single question.
17. A method of creating an educational assessment using an image processing system, the created educational assessment including a plurality of questions including metadata associated with each of the plurality of questions, and the created educational assessment administered to one or more students for completion, the method comprising:
a) a user of the image processing system creating and entering a question using a User Interface (UI) operatively associated with the image processing system;
b) the image processing system generating a search query including one or more items included in the user created question;
c) the image processing system executing a search of a Data Warehouse/Repository (DW/R) based on the search query to retrieve one or more predefined questions matching one or more search criteria associated with the search query, the one or more predefined questions being associated with one or more predefined formats including metadata associated with each of the one or more predefined questions;
d) the image processing system displaying the one or more matching predefined questions and associated metadata on the UI;
e) the user selecting one of the displayed matching predefined questions and/or selecting the metadata associated with one of the displayed matching predefined questions; and
f) the image processing system creating a digital representation of an educational assessment including the user created question, the user created question including metadata associated with the user selected matching predefined question and/or the user selected metadata.
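Steps a) through f) of claim 17 can be sketched as one pipeline: a user-entered question drives a repository search, and the metadata of the match the user selects is attached to the user's question in the new assessment. All names, the in-memory repository, and the naive keyword matching are hypothetical stand-ins for the DW/R search described in the claim.

```python
# Hypothetical end-to-end sketch of claim 17, steps a)-f). A list of
# dicts stands in for the Data Warehouse/Repository; matching is naive
# keyword overlap rather than a real search engine.

REPOSITORY = [
    {"content": "What is the capital of France?",
     "metadata": {"grade_level": 5, "correct_answer": "Paris"}},
    {"content": "Name the largest planet.",
     "metadata": {"grade_level": 4, "correct_answer": "Jupiter"}},
]

def _keywords(text):
    """Lowercased words longer than three characters, punctuation stripped."""
    return {w.lower().strip("?.") for w in text.split() if len(w) > 3}

def search_repository(question_text):
    """Steps b)-c): build a keyword query and retrieve matching questions."""
    kws = _keywords(question_text)
    return [q for q in REPOSITORY if kws & _keywords(q["content"])]

def create_assessment(user_question, selected_match):
    """Step f): attach the selected match's metadata to the user's question."""
    return {"questions": [{"content": user_question,
                           "metadata": dict(selected_match["metadata"])}]}

user_question = "What is the capital city of France?"      # step a)
matches = search_repository(user_question)                 # steps b)-d)
assessment = create_assessment(user_question, matches[0])  # steps e)-f)
```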
18. The method of creating an educational assessment according to claim 17, wherein step b) includes the image processing system performing Optical Character Recognition (OCR) to extract text content from the user created question and processing the extracted text to generate the search query.
19. The method of creating an educational assessment according to claim 17, wherein the associated metadata includes one or more of question content, question type, grade level, unit, correct answer, and associated one or more artifacts.
20. An image processing system comprising memory storing instructions for performing the method of creating an educational assessment according to claim 17.
US14/836,605 2015-01-30 2015-08-26 Method and system for importing hard copy assessments into an automatic educational system assessment Abandoned US20170061809A1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US14/609,820 US10325511B2 (en) 2015-01-30 2015-01-30 Method and system to attribute metadata to preexisting documents

Publications (1)

Publication Number Publication Date
US20170061809A1 true US20170061809A1 (en) 2017-03-02

Family

ID=56554352

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/609,820 Active 2035-05-24 US10325511B2 (en) 2015-01-30 2015-01-30 Method and system to attribute metadata to preexisting documents
US14/836,605 Abandoned US20170061809A1 (en) 2015-01-30 2015-08-26 Method and system for importing hard copy assessments into an automatic educational system assessment

Country Status (1)

Country Link
US (2) US10325511B2 (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180260389A1 (en) * 2017-03-08 2018-09-13 Fujitsu Limited Electronic document segmentation and relation discovery between elements for natural language processing
CN107341005B (en) * 2017-06-20 2020-09-18 东软集团股份有限公司 Chart generation method and device
USD888737S1 (en) * 2018-05-18 2020-06-30 Adp, Llc Display screen or a portion thereof with an animated graphical user interface
JP2022547750A (en) 2019-09-16 2022-11-15 ドキュガミ インコーポレイテッド Cross-document intelligent authoring and processing assistant
US11443239B2 (en) 2020-03-17 2022-09-13 Microsoft Technology Licensing, Llc Interface for machine teaching modeling
US11443144B2 (en) * 2020-03-17 2022-09-13 Microsoft Technology Licensing, Llc Storage and automated metadata extraction using machine teaching
JP2021184178A (en) * 2020-05-22 2021-12-02 セイコーエプソン株式会社 Information processing system, and information processing method
US11934447B2 (en) * 2022-07-11 2024-03-19 Bank Of America Corporation Agnostic image digitizer

Family Cites Families (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6178308B1 (en) * 1998-10-16 2001-01-23 Xerox Corporation Paper based intermedium for providing interactive educational services
US7058567B2 (en) 2001-10-10 2006-06-06 Xerox Corporation Natural language parser
US7283274B2 (en) 2001-10-26 2007-10-16 Hewlett-Packard Development Company, L.P. Method and system for printing user data to form documents
AU2003249237A1 (en) * 2002-07-15 2004-02-02 Device Independent Software, Inc. Editing image for delivery over a network
US20040121298A1 (en) 2002-11-06 2004-06-24 Ctb/Mcgraw-Hill System and method of capturing and processing hand-written responses in the administration of assessments
US7689037B2 (en) 2004-10-22 2010-03-30 Xerox Corporation System and method for identifying and labeling fields of text associated with scanned business documents
US20100159437A1 (en) 2008-12-19 2010-06-24 Xerox Corporation System and method for recommending educational resources
US8725059B2 (en) 2007-05-16 2014-05-13 Xerox Corporation System and method for recommending educational resources
US8699939B2 (en) 2008-12-19 2014-04-15 Xerox Corporation System and method for recommending educational resources
US8457544B2 (en) 2008-12-19 2013-06-04 Xerox Corporation System and method for recommending educational resources
US20090035733A1 (en) 2007-08-01 2009-02-05 Shmuel Meitar Device, system, and method of adaptive teaching and learning
US20090068629A1 (en) 2007-09-06 2009-03-12 Brandt Christian Redd Dual output gradebook with rubrics
US20100075292A1 (en) 2008-09-25 2010-03-25 Deyoung Dennis C Automatic education assessment service
US20100075290A1 (en) 2008-09-25 2010-03-25 Xerox Corporation Automatic Educational Assessment Service
US20100075291A1 (en) * 2008-09-25 2010-03-25 Deyoung Dennis C Automatic educational assessment service
US20100157345A1 (en) 2008-12-22 2010-06-24 Xerox Corporation System for authoring educational assessments
US20110072340A1 (en) * 2009-09-21 2011-03-24 Miller Darren H Modeling system and method
US8386574B2 (en) * 2009-10-29 2013-02-26 Xerox Corporation Multi-modality classification for one-class classification in social networks
US20110123967A1 (en) 2009-11-24 2011-05-26 Xerox Corporation Dialog system for comprehension evaluation
US8768241B2 (en) 2009-12-17 2014-07-01 Xerox Corporation System and method for representing digital assessments
US20110195389A1 (en) 2010-02-08 2011-08-11 Xerox Corporation System and method for tracking progression through an educational curriculum
BRPI1000577B1 (en) * 2010-02-19 2020-10-13 Alexandre Jonatan Bertoli Martins method and system for extracting and managing information contained in electronic documents
US8521077B2 (en) 2010-07-21 2013-08-27 Xerox Corporation System and method for detecting unauthorized collaboration on educational assessments
US8831504B2 (en) 2010-12-02 2014-09-09 Xerox Corporation System and method for generating individualized educational practice worksheets
US8589317B2 (en) * 2010-12-16 2013-11-19 Microsoft Corporation Human-assisted training of automated classifiers
US20120189999A1 (en) * 2011-01-24 2012-07-26 Xerox Corporation System and method for using optical character recognition to evaluate student worksheets
US8718534B2 (en) 2011-08-22 2014-05-06 Xerox Corporation System for co-clustering of student assessment data
US9824604B2 (en) 2012-09-04 2017-11-21 Conduent Business Services, Llc Creating assessment model for educational assessment system
US9098777B2 (en) 2012-09-06 2015-08-04 Xerox Corporation Method and system for evaluating handwritten documents
US20140093858A1 (en) 2012-10-01 2014-04-03 Xerox Corporation Method and system for evaluating electronic document

Patent Citations (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH03240089A (en) * 1990-02-19 1991-10-25 Nec Corp Method and device for test generation
JPH07104660A (en) * 1993-09-30 1995-04-21 Hitachi Software Eng Co Ltd Education supporting system
US6112049A (en) * 1997-10-21 2000-08-29 The Riverside Publishing Company Computer network based testing system
US6471521B1 (en) * 1998-07-31 2002-10-29 Athenium, L.L.C. System for implementing collaborative training and online learning over a computer network and related techniques
US7014469B1 (en) * 1998-11-20 2006-03-21 Nocera Tina M Method for developing answer-options to issue-questions relating to personal finance and investment
US20040030556A1 (en) * 1999-11-12 2004-02-12 Bennett Ian M. Speech based learning/training system using semantic decoding
US6961482B2 (en) * 2001-03-05 2005-11-01 Ncs Pearson, Inc. System for archiving electronic images of test question responses
US20020176628A1 (en) * 2001-05-22 2002-11-28 Starkweather Gary K. Document imaging and indexing system
US20030180703A1 (en) * 2002-01-28 2003-09-25 Edusoft Student assessment system
US20060216683A1 (en) * 2003-05-14 2006-09-28 Goradia Gautam D Interactive system for building, organising, and sharing one's own databank of questions and answers in a variety of questioning formats, on any subject in one or more languages
US20050041860A1 (en) * 2003-08-20 2005-02-24 Jager Jodocus Franciscus Metadata extraction from designated document areas
US7376634B2 (en) * 2003-12-17 2008-05-20 International Business Machines Corporation Method and apparatus for implementing Q&A function and computer-aided authoring
US9268852B2 (en) * 2004-02-15 2016-02-23 Google Inc. Search engines and systems with handheld document data capture devices
US20060087683A1 (en) * 2004-02-15 2006-04-27 King Martin T Methods, systems and computer program products for data gathering in a digital and hard copy document environment
US7421155B2 (en) * 2004-02-15 2008-09-02 Exbiblio B.V. Archive of text captures from rendered documents
US8505090B2 (en) * 2004-04-01 2013-08-06 Google Inc. Archive of text captures from rendered documents
US7618259B2 (en) * 2004-05-13 2009-11-17 Hewlett-Packard Development Company, L.P. Worksheet wizard—system and method for creating educational worksheets
US20050255438A1 (en) * 2004-05-13 2005-11-17 John Manos Worksheet wizard
US7604161B2 (en) * 2005-06-24 2009-10-20 Fuji Xerox Co., Ltd. Question paper forming apparatus and question paper forming method
US20060289625A1 (en) * 2005-06-24 2006-12-28 Fuji Xerox Co., Ltd. Question paper forming apparatus and question paper forming method
US20070043678A1 (en) * 2005-08-17 2007-02-22 Kurzweil Educational Systems, Inc. Optical character recognition technique for protected viewing of digital files
US9009078B2 (en) * 2005-08-17 2015-04-14 Kurzweil/Intellitools, Inc. Optical character recognition technique for protected viewing of digital files
US8412514B1 (en) * 2005-10-27 2013-04-02 At&T Intellectual Property Ii, L.P. Method and apparatus for compiling and querying a QA database
US20090055801A1 (en) * 2007-03-27 2009-02-26 Fujitsu Limited Computer readable storage medium that stores a test specifications creating program, test specifications creating apparatus and test specifications creating method
US20090047648A1 (en) * 2007-08-14 2009-02-19 Jose Ferreira Methods, Media, and Systems for Computer-Based Learning
US8526055B1 (en) * 2007-10-22 2013-09-03 Data Recognition Corporation Standardized test and survey imaging system
US20100047758A1 (en) * 2008-08-22 2010-02-25 Mccurry Douglas System and method for using interim-assessment data for instructional decision-making
US20100255453A1 (en) * 2009-04-02 2010-10-07 Chincarini Ludwig B Method and computer system of creating, storing, producing, and distributing examinations
US20120258435A1 (en) * 2011-04-05 2012-10-11 Smart Technologies Ulc Method for conducting an assessment and a participant response system employing the same
US20130183649A1 (en) * 2011-06-15 2013-07-18 Ceresis, Llc Method for generating visual mapping of knowledge information from parsing of text inputs for subjects and predicates
US20120329029A1 (en) * 2011-06-23 2012-12-27 Rauta Mihai Catalin Computer implemented teaching method and apparatus
US20130084554A1 (en) * 2011-09-30 2013-04-04 Viral Prakash SHAH Customized question paper generation
US20130309644A1 (en) * 2012-05-15 2013-11-21 Tata Consultancy Services Limited Secured computer based assessment
US20140210734A1 (en) * 2013-01-29 2014-07-31 Smart Technologies Ulc Method for conducting a collaborative event and system employing same
US20140272884A1 (en) * 2013-03-13 2014-09-18 International Business Machines Corporation Reward Based Ranker Array for Question Answer System
US20150088932A1 (en) * 2013-09-24 2015-03-26 Jimmy M. Sauz Device, system, and method for enhanced memorization of a document
US20150099256A1 (en) * 2013-12-17 2015-04-09 Chien Cheng Liu Intelligent teaching and tutoring test method
US20150187219A1 (en) * 2013-12-27 2015-07-02 Cloud Ta Llc Systems and methods for computer-assisted grading of printed tests
US20150199400A1 (en) * 2014-01-15 2015-07-16 Konica Minolta Laboratory U.S.A., Inc. Automatic generation of verification questions to verify whether a user has read a document
US20160379515A1 (en) * 2015-06-29 2016-12-29 Fujitsu Limited System and method for enhancing logical thinking in curation learning

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Photo Math; 2014; https://www.youtube.com/watch?v=jeRCvbN_bLA *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10997362B2 (en) * 2016-09-01 2021-05-04 Wacom Co., Ltd. Method and system for input areas in documents for handwriting devices
US11527168B2 (en) * 2019-06-07 2022-12-13 Enduvo, Inc. Creating an assessment within a multi-disciplined learning tool
US11810476B2 (en) 2019-06-07 2023-11-07 Enduvo, Inc. Updating a virtual reality environment based on portrayal evaluation
US20210375149A1 (en) * 2020-06-02 2021-12-02 Lumas Information Services, LLC System and method for proficiency assessment and remedial practice
CN113111158A (en) * 2021-04-14 2021-07-13 杭州电子科技大学 Intelligent data visualization oriented conversational question-answering implementation method
US20230105904A1 (en) * 2021-10-04 2023-04-06 Canon Kabushiki Kaisha Image processing apparatus generating image of review question, control method therefor, and storage medium storing control program therefor
US11765301B2 (en) * 2021-10-04 2023-09-19 Canon Kabushiki Kaisha Image processing apparatus generating image of review question, control method therefor, and storage medium storing control program therefor

Also Published As

Publication number Publication date
US10325511B2 (en) 2019-06-18
US20160224516A1 (en) 2016-08-04

Similar Documents

Publication Publication Date Title
US20170061809A1 (en) Method and system for importing hard copy assessments into an automatic educational system assessment
US20100075292A1 (en) Automatic education assessment service
US20100075291A1 (en) Automatic educational assessment service
EP2172921A2 (en) Automatic educational assessment service
US20170330469A1 (en) Curriculum assessment
US20120282587A1 (en) System and method for generating and implementing individualized educational practice worksheets
US8794978B2 (en) Educational material processing apparatus, educational material processing method, educational material processing program and computer-readable recording medium
US20030180703A1 (en) Student assessment system
US20120189999A1 (en) System and method for using optical character recognition to evaluate student worksheets
US20100157345A1 (en) System for authoring educational assessments
US8768241B2 (en) System and method for representing digital assessments
US8831504B2 (en) System and method for generating individualized educational practice worksheets
Campbell et al. Rhetorical move structure in high-tech marketing white papers
CN104881480A (en) Database-based annotating method and device
KR20130021684A (en) System for managing answer paper and method thereof
US8521077B2 (en) System and method for detecting unauthorized collaboration on educational assessments
Gugino Using Google Docs to enhance the teacher work sample: Building e-portfolios for learning and practice
US9967425B2 (en) Image forming apparatus
Bloomfield Evolution of a digital paper exam grading system
US20070099168A1 (en) Method of configuring and evaluating a document
CN112396897A (en) Teaching system
JP2005024693A (en) Score processing system, score processing method and test paper used therfor
KR101479444B1 (en) Method for Grading Examination Paper with Answer
O'Brien et al. Benchmarking and accreditation goals support the value of an undergraduate business law core course
KR102126834B1 (en) Automatic scoring system using qr code

Legal Events

Date Code Title Description
AS Assignment

Owner name: XEROX CORPORATION, CONNECTICUT

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ST. JACQUES, JR., ROBERT J.;VENABLE, DENNIS L.;SIGNING DATES FROM 20150727 TO 20150824;REEL/FRAME:036429/0892

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION