A tool may be provided that may allow a first party (e.g., a sending bank) to synchronize its image scan settings with unknown image assessment standards of a second party (e.g., a recipient bank). However, such a tool may also be used within a single party that performs both scanning and image assessment, and is not limited to use between two or more parties.
11. An apparatus, comprising:
a scanner configured to scan a plurality of different documents each using a different scan setting; and
a computer configured to receive an output from the scanner and to generate image data representing a plurality of images, to determine a scan setting based on the feedback information and the scan settings used for the plurality of images, wherein the feedback information is based on the image data,
wherein the different scan settings and the determined scan setting each comprises a plurality of scan setting types, and wherein the computer is further configured to rank the plurality of scan setting types based on the feedback information.
1. A method, comprising:
performing a plurality of scans of a plurality of different documents each using a different scan setting, to generate image data representing a plurality of images;
for each of at least a subset of the images, receiving feedback information;
determining, by a computer, a scan setting based on the feedback information and the scan settings used for the plurality of images; and
performing an additional document scan using the determined scan setting,
wherein the different scan settings and the determined scan setting each comprises a plurality of scan setting types, and wherein determining further comprises ranking the plurality of scan setting types based on the feedback information.
16. An apparatus, comprising:
means for performing a plurality of scans of a plurality of different documents each using a different scan setting, to generate image data representing a plurality of images;
means for receiving feedback information for each of at least a subset of the images;
means for determining a scan setting based on the feedback information and the scan settings used for the plurality of images; and
means for performing an additional document scan using the determined scan setting,
wherein the different scan settings and the determined scan setting each comprises a plurality of scan setting types, and wherein the means for determining is further for ranking the plurality of scan setting types based on the feedback information.
6. A non-transitory computer-readable medium storing computer-executable instructions for performing a method, the method comprising:
performing a plurality of document scans of a plurality of different documents each using a different scan setting, to generate image data representing a plurality of images;
determining a scan setting based on received feedback information and the scan settings used for the plurality of images, wherein the feedback information is based on the image data;
and performing an additional document scan using the determined scan setting,
wherein the different scan settings and the determined scan setting each comprises a plurality of scan setting types, and wherein determining further comprises ranking the plurality of scan setting types based on the feedback information.
2. The method of
3. The method of
4. The method of
5. The method of
7. The non-transitory computer-readable medium of
8. The non-transitory computer-readable medium of
9. The non-transitory computer-readable medium of
10. The non-transitory computer-readable medium of
12. The apparatus of
13. The apparatus of
14. The apparatus of
15. The apparatus of
With the implementation of the Check Clearing for the 21st Century Act (also known as Check 21) in 2004, financial institutions such as banks are now able to exchange check images to settle cash letters, rather than sending the paper checks for settlement. Check 21 has generally allowed financial institutions to realize significant savings in the time required to settle cash letters.
Under Check 21, a bank of first deposit (BOFD) or other sending bank uses Image Exchange to send check images to another recipient bank under the ANSI X9.37 standard. The recipient bank, which may be a private bank or one of the Federal Reserve branches, conducts a quality assessment using specialized image quality software. If an image does not meet the recipient bank's image quality standards, the image is rejected and returned to the sending bank unpaid. These rejected images are sometimes referred to as administrative returns or non-conforming images. A rejected image typically requires the sending bank to physically locate the original paper check, re-encode the magnetic ink character recognition (MICR) line of the paper check, and re-process the paper check. This recovery process takes time for each check, which is undesirable. Even if only a very small percentage of images (e.g., one percent) are non-conforming, the handling of non-conforming images can nevertheless add significant risk and cost on the part of the sending bank where very large numbers of checks are processed every day, as is typical.
Although image quality assessment tools are commercially available, financial institutions have not been able to easily diagnose and resolve the root causes of the chronic image quality issues that are causing trading partners to reject the images. This problem is complicated by the fact that image quality standards typically vary from bank to bank and are often not known. This is because most banks purchase commercially available image assessment software from a third party provider, and most third party providers are not apt to disclose information about their proprietary image analysis algorithms, leaving the sending banks to simply guess how the software of each recipient bank assesses check images.
Moreover, even though existing check scanning equipment provides for adjustment of scan settings, there is tremendous variability in the quality of the source checks involved. The adjustability of scan settings is meant to compensate for this variability. However, scan settings cannot practically be adjusted for each individual check, as millions of checks are often scanned every day.
There is therefore a need to reduce the incidence of non-conforming check images in the check clearing process, or of non-conforming images of any other type of document in other processes. There is also a need to find a static combination of scan settings that, for a given process, will reduce or even minimize the incidence of the non-conforming images. The particular combination of settings may depend upon the scanner itself, the type of document being scanned, the image quality standards being implemented, and the variability of document quality in the document population being processed, among other possible factors. To accomplish this, a tool may be provided that may allow a first party (e.g., a sending bank) to synchronize its image scan settings with unknown image assessment standards of a second party (e.g., a recipient bank). However, such a tool may also be used within a single party that performs both scanning and image assessment, and is not limited to use between two or more parties.
For example, some aspects described herein are directed to methods, apparatuses, and software for performing various functions, such as: performing a plurality of document scans each using a different scan setting, to generate image data representing a plurality of images; receiving feedback information for each of the images; determining a scan setting based on the feedback information and the scan settings used for the plurality of images; and performing an additional document scan using the determined scan setting.
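The overall loop described above may be sketched in pseudocode-like Python. This is a minimal illustrative sketch only; the `scan` and `assess` callables and the dictionary form of a "scan setting" are assumptions for illustration and do not correspond to any real scanner API.

```python
def determine_scan_setting(documents, candidate_settings, scan, assess):
    """Scan documents under each candidate setting, collect pass/fail
    feedback on the resulting images, and return the setting with the
    highest pass rate. `scan` and `assess` are hypothetical callables."""
    best_setting, best_rate = None, -1.0
    for setting in candidate_settings:
        images = [scan(doc, setting) for doc in documents]
        feedback = [assess(img) for img in images]  # True = pass
        rate = sum(feedback) / len(feedback)
        if rate > best_rate:
            best_setting, best_rate = setting, rate
    return best_setting
```

In practice the feedback step may involve a second party's assessment software rather than a local function, but the control flow is the same.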
These and other aspects of the disclosure will be apparent upon consideration of the following detailed description of illustrative aspects.
A more complete understanding of the present disclosure may be acquired by referring to the following description in consideration of the accompanying drawings, in which like reference numbers indicate like features, and wherein:
The various aspects summarized previously may be embodied in various forms. The following description shows by way of illustration various examples in which the aspects may be practiced. It is understood that other examples may be utilized, and that structural and functional modifications may be made, without departing from the scope of the present disclosure.
Except where explicitly stated otherwise, all references herein to two or more elements being “coupled,” “connected,” or “interconnected” to each other are intended to broadly include both (a) the elements being directly connected to each other, or otherwise in direct communication with each other, without any intervening elements, and (b) the elements being indirectly connected to each other, or otherwise in indirect communication with each other, with one or more intervening elements.
Although many of the examples herein are described in the context of check images, the various aspects may be used in any imaging context, such as in the imaging of any type of document.
Referring to
Scanner 101 may be configured to scan paper documents, such as checks or other financial documents, photographs, textual documents, medical records, and/or any other types of documents. For example, scanner 101 may be, or be part of, a commercially available check processing apparatus such as an IBM 3890 high speed document processor that is typically used by many banks at the present time.
Computer 102 may be any type of computing device or combination of multiple computing devices, such as a desktop computer, a laptop computer, a handheld computer, a server, a mainframe, and/or a central processing unit (CPU) or other processor. Computer 102 may be programmable by executing computer-executable instructions (such as in the form of software). These computer-executable instructions may be stored on a computer-readable medium, which may be, or be part of, storage 103. Any or all of the functions performed by computer 102 referred to herein may be performed in accordance with the execution of the appropriate computer-executable instructions stored in storage 103. Additionally or alternatively, storage 103 may store data on a computer-readable medium so as to be accessible to computer 102.
Likewise, computer 105 may be any type of computing device or combination thereof, and storage 106 may include a computer-readable medium for storing computer-executable instructions to be executed by computer 105 and/or data accessible to computer 105. Any or all of the functions performed by computer 105 referred to herein may be performed in accordance with the execution of the appropriate computer-executable instructions stored in storage 106.
A computer-readable medium as used herein is any type of device and/or material, or combination of devices and/or materials, capable of storing information in a form readable by machine. For example, a computer-readable medium may be one or more optical disks (such as compact disks, or CDs; or such as optical drives), one or more magnetic disks (such as floppy disks or magnetic hard drives), one or more magnetic tapes, and/or one or more memory chips.
User interface 104 may be any type of device that allows a user to input information into computer 102 and/or receive information output from computer 102. For example, user interface 104 may include one or more video screens, one or more printers, one or more keyboards, one or more mice, joysticks, or other cursor navigation controls, one or more touch-sensitive or stylus-sensitive input pads (which may be integrated with a video screen), one or more audio microphones, and/or one or more audio speakers.
In operation, paper checks or other documents may be scanned by scanner 101, to produce data representing images of the paper documents. This data is forwarded to computer 102 and/or storage 103. Computer 102 may receive this data from scanner 101 and/or from storage 103, and may package and forward the data (now referred to as image data in this example) to computer 105 for image assessment. Where computers 102 and 105 are physically separate computers, they may be directly or indirectly coupled together. Where they are indirectly coupled, they may be coupled via a network, which may include, for example, the Internet, a local area network (LAN), and/or an intranet. Where computers 102 and 105 communicate with each other (either directly or via a network), each of the computers may include a communication interface, such as a network card.
Computer 105 may store the received image data in storage 106 and may perform image assessment, such as in accordance with computer-executable instructions stored in storage 106. As a result of the image assessment, computer 105 may provide feedback to computer 102 (such as via the network mentioned above), indicating a result of the image assessment. For example, the feedback may be in the form of data and include a simple pass/fail result and/or one or more reasons for the pass/fail result.
An illustrative method that may be performed by the system of
Before beginning such an analysis and testing, it may be desirable to understand how the bank's image capture system judges the images it creates compared to how the trading partner's image system will judge those same images (step 201), by testing a wide variety of different test documents. Ideally, there should be a strong correlation between the trading partner's image assessment and the bank's own image assessment. A high correlation will make the remainder of this process more effective and indicates that the pass/fail image quality analyses of the bank and the trading partner are reasonably synchronized. In fact, it may be decided that in order to proceed with further steps, at least a threshold amount of correspondence between image quality assessments made by the bank and made by the trading partner must exist. For example, that threshold amount may be a 70% correlation (i.e., at least 70% of the image quality judgments must be the same). If the percentage of common judgments is less than the threshold, it may be decided that the measurement system should be fixed before proceeding.
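The correlation check in step 201 reduces to comparing per-image pass/fail judgments from the two systems. A minimal sketch, assuming both assessments are available as boolean lists in the same image order (the function name and threshold constant are illustrative):

```python
def assessment_agreement(own_results, partner_results):
    """Fraction of images given the same pass/fail judgment by the
    bank's own assessment and the trading partner's assessment."""
    if len(own_results) != len(partner_results):
        raise ValueError("result lists must cover the same images")
    matches = sum(a == b for a, b in zip(own_results, partner_results))
    return matches / len(own_results)

# Example threshold from the text: proceed only with >= 70% agreement.
AGREEMENT_THRESHOLD = 0.70
```

If `assessment_agreement(...)` falls below the threshold, the measurement system itself would be fixed before continuing to step 202.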
This test on a variety of different test documents may be conducted with an initial combination of scan setting parameter values. The initial combination may be the combination of values that is currently being used by the bank, or a default set of scan settings provided by scanner 101 or the manufacturer of scanner 101, or even any arbitrary initial group of values, such as a medium value for each parameter. The test documents may be selected so as to be a representative sample of those documents currently captured (or expected to be captured) in production.
Once the measurement system is validated, a subset of the entire population may be analyzed. This subset may be, for instance, a representative sample of the images rejected by trading partners. A sampling of these images may be visually inspected by humans. Based on the visual inspection, the images may be placed in various logical categories identifying certain problems with the images. For example, based on the visual inspection, it may be determined that some of the images look too dark to read, or are clearly too light, or show a significant amount of herringbone or other background pattern.
Based upon the categorization, in step 202 a Pareto chart may be created to show which categories (i.e., which image defects) are the most prevalent and which are to be included in the scope of testing from this point onward.
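The Pareto analysis of step 202 amounts to ranking defect categories by frequency and keeping the most prevalent ones in scope. A sketch under the assumption that each inspected image has been given a single category label (the function name and the 80% coverage default are illustrative, not from the source):

```python
from collections import Counter

def pareto_categories(defect_labels, coverage=0.8):
    """Rank defect categories by frequency (as in a Pareto chart) and
    return the most prevalent categories that together account for at
    least `coverage` of all observed defects."""
    counts = Counter(defect_labels)
    total = sum(counts.values())
    in_scope, running = [], 0
    for category, n in counts.most_common():
        in_scope.append(category)
        running += n
        if running / total >= coverage:
            break
    return in_scope
```

The returned categories would then drive the choice of scan setting parameters in step 203.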
Using the Pareto chart and the defects identified as being “in scope” for the study, a range of scan settings may be determined in step 203. Those scan settings that match the defect types that were identified in step 202 may be identified.
Using a test set of documents to be scanned (for example, 100 different documents), tests may be conducted in step 204 to determine what each of the scan setting parameters from step 203 actually does to the images. In this test (unlike those to follow), only one parameter is changed at a time. This may be used as a learning step to understand the scan setting parameters. The parameters tested here move on to step 205 of the process.
Next in the process, in step 205, N>1 scans are performed on a document in order to test the impact of variations of scan setting parameter values on non-conforming images used or found in the previous steps. Each scan may use a different combination of settings of the chosen scan setting parameters. The various scans may be of the same document or of different documents. However, scanning the same document using different scan settings may produce a more accurate result.
For instance, Table 1 below shows an example of scan settings that may be used to perform the various scans in step 205. In this example, it is assumed that scan setting parameters A and B each have a possible range of 0 through 255, scan setting parameter C has a possible range of 0 through 10, and scan setting parameter D has a possible range of 0.5 to 3.0. These possible ranges of scan settings may differ depending upon the particular system used.
TABLE 1

Scan #    Parameter A    Parameter B    Parameter C    Parameter D
1         25             25             1              0.7
2         25             100            1              0.7
3         25             200            1              0.7
4         100            25             1              0.7
5         100            100            1              0.7
6         100            200            1              0.7
...       ...            ...            ...            ...
N − 2     200            200            4              2.5
N − 1     200            200            8              1.0
N         200            200            8              2.5
The actual settings of each scan setting parameter used in step 205 may be chosen in any way desired. However, it may be beneficial to choose combinations of scan settings that cover a broad range of possible combinations. With a greater variety of scans collected in step 205, a greater amount of data is collected. This may mean that a more accurate final set of scan settings may be determined. For example, assuming that M scan setting parameters are chosen in step 204, it may be desirable to distribute the various scan setting parameter value combinations somewhat evenly throughout an M-dimensional space (where each scan setting parameter is a separate dimension) or a portion thereof. For instance, where only two scan setting parameters are chosen (M=2), then it may be desirable to choose scan setting combinations such as shown in
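One simple way to spread combinations evenly through the M-dimensional space is a full factorial grid over a few sampled levels per parameter. The levels below are illustrative values drawn from Table 1; the actual ranges (0 through 255 for A and B, 0 through 10 for C, 0.5 to 3.0 for D) are scanner-specific assumptions:

```python
import itertools

# Illustrative sampled levels per scan setting parameter (M = 4 here).
levels = {
    "A": [25, 100, 200],
    "B": [25, 100, 200],
    "C": [1, 4, 8],
    "D": [0.7, 1.0, 2.5],
}

# Full factorial grid: one scan per combination of levels, spread evenly
# through the M-dimensional parameter space.
combinations = [dict(zip(levels, values))
                for values in itertools.product(*levels.values())]
```

With three levels per parameter this yields 3^4 = 81 scan setting combinations; fractional factorial designs (as in step 206) trade some of this coverage for far fewer scans.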
The multiple scans in step 205 may be handled manually and/or automatically at scanner 101. Where the scanning is at least partially automated, scanning may be governed by scanner 101 and/or by computer 102. The user may enter desired scan settings into user interface 104, and/or the desired scan settings may be stored as data in storage 103. The images resulting from the scans may also be stored by computer 102 and/or by scanner 101 in storage 103.
In step 206, fractional factorial Design of Experiments (DOE) techniques may be used to eliminate non-significant scan setting parameters. In other words, the goal is to find the critical few scan setting parameters that make the most difference to image quality. Statistical and/or DOE software tools such as Minitab or SAS may be used for this and other steps.
At step 207, the multiple images from the image scans of step 205 are sent to the second party (where there is one), in this example the trading partner, who analyzes/assesses the images from the various scans, and generates feedback for some or all of the images. This analysis and feedback generation may be performed by computer 105. The feedback may be explicit or may be implicit in that no feedback is provided for some images. For example, silence (i.e., no feedback) for a given image may by default mean a pass or fail, as desired. Where computer 105 is separate from computer 102, computer 102 may retrieve the stored image data and forward it to computer 105, either directly or via a network. The network may include, for example, the Internet, a local area network (LAN), and/or an intranet. Alternatively, the image data may be stored from computer 102 onto a portable computer-readable medium (such as a compact disk), which would then be physically provided to computer 105.
As previously mentioned, the feedback may include a pass or fail indication. The feedback may further include one or more reasons associated with the pass/fail indication, especially where the indication is of a failure. For example, the feedback may indicate that a particular image failed, and that a reason it failed was that it was too light (see, e.g.,
Where computers 102 and 105 are the same, then the feedback may simply be internally generated data, such as between two software applications. Where computers 102 and 105 are separate, then the feedback may be sent as data back to computer 102 directly, via the network, or through a portable delivered computer-readable medium as described previously. Alternatively, the feedback may be provided in human-readable format (e.g., a written paper letter or an email) and/or provided verbally such as via telephone.
Computer 102 and/or the user may use the feedback, as well as the knowledge of which images were scanned using which scan settings, to choose a subset of the scan setting parameters. This may be done in any number of ways. For example, a DOE approach may be implemented using computer 102, such as creating a Pareto chart like the one shown in
In choosing a subset of the original set of chosen scan setting parameters, computer 102 and/or the user may rank in order the scan setting parameters and/or various combinations of the scan setting parameters, such as in accordance with their relative effects on the passage/failure of images. In the example of
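The ranking described above can be approximated by a simple main-effects calculation: for each parameter, compare pass rates across its tested levels, and rank parameters by the spread. This is a rough proxy for the DOE analysis a tool like Minitab or SAS would perform, not the actual computation; the data layout (a list of settings/pass pairs) is an assumption for illustration:

```python
def rank_parameters(results):
    """Rank scan setting parameters by the spread of image pass rates
    across their tested levels (a crude main-effects estimate).

    `results` is a list of (settings_dict, passed_bool) pairs, one per
    scanned image, where settings_dict maps parameter name to value.
    """
    effects = {}
    for p in results[0][0]:
        by_level = {}
        for settings, passed in results:
            by_level.setdefault(settings[p], []).append(passed)
        rates = [sum(v) / len(v) for v in by_level.values()]
        effects[p] = max(rates) - min(rates)
    # Largest effect first: these are the "critical few" parameters.
    return sorted(effects, key=effects.get, reverse=True)
```

Parameters at the bottom of this ranking would be candidates for elimination, narrowing the combinations tested in the next round of scans.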
Referring again to
The user and/or computer 102 then chooses final scan setting parameter values based on the feedback received and generated in step 209. In doing so, statistical analysis may again be performed. For example,
Any results of the statistical analysis in steps 204 and 207 may be output to the user by computer 102 via user interface 104. For example, the graphs in
In this way, a set of scan setting parameter values may be determined that may result in a relatively high image pass rate based on an image assessment algorithm that is not necessarily known. These parameter values may be determined iteratively and analytically based on a series of test scans and their respective image assessment results. Once these final parameter values are chosen, they may then be used for future scans of documents, especially of documents that are of the same type as those used for the testing phase.
It is noted that the final parameter values may depend on the type of document being scanned. For example, if checks are used as test scan documents in the method of
Davis, Margaret A., Redline, Mark A.