Disclosed is a method and system for capturing a user action on a user interface and fetching user interface elements in the user interface into a first list and operations of the user interface elements into a second list. A test case for the user action is created in an automation accelerator by selecting a user interface element from the first list and an operation of the user interface element from the second list. An automation accelerator script of the test case is created by the automation accelerator.
1. A computerized method, comprising:
capturing a user action on a user interface in a user action table, wherein the user action includes a set of user action attributes, the set of user action attributes including a user interface element, an operation performed on the user interface element in the user action, a type of the user interface element, a value of the user interface element, and comments of the user action, and wherein the operation performed on the user interface element is selected from a group consisting of a click, a set, and a launch;
fetching the user interface element in the user interface as a first user action attribute into a first list and the operation performed on the user interface element as a second user action attribute into a second list;
automatically creating a test case without user intervention in an automation accelerator for the captured user action from the user action table by selecting the user interface element from the first list and the operation performed on the user interface element in the user action from the second list;
automatically updating the rest of the set of user action attributes upon selecting the user interface element from the first list and the operation performed on the user interface element from the second list; and
generating an automation accelerator script of the test case upon a save operation on the test case.
15. A computing system, comprising:
one or more non-transitory memory devices, the memory devices having stored thereon instructions related to:
a capturing unit to capture a user action on a user interface in a user action table, the user action including a set of user action attributes, wherein the set of user action attributes includes a user interface element, an operation performed on the user interface element in the user action, a type of the user interface element, a value of the user interface element, and comments of the user action, and wherein the operation performed on the user interface element is selected from a group consisting of a click, a set, and a launch;
a fetching unit to fetch the user interface element in the user interface as a first user action attribute into a first list and the operation performed on the user interface element as a second user action attribute into a second list;
a test case creating unit to automatically create a test case without user intervention in an automation accelerator for the user action upon selecting the user interface element from the first list and the operation performed on the user interface element in the user action from the second list, wherein the rest of the set of user action attributes is automatically updated in response to the selection; and
a script creating unit to create an automation accelerator script of the test case.
19. An article of manufacture, comprising:
a non-transitory computer readable storage medium having instructions which when executed by a machine cause the machine to execute a method comprising:
capturing a user action on a user interface in a user action table, wherein the user action includes a set of user action attributes, the set of user action attributes including a user interface element, an operation performed on the user interface element in the user action, a type of the user interface element, a value of the user interface element, and comments of the user action, and wherein the operation performed on the user interface element is selected from a group consisting of a click, a set, and a launch;
fetching the user interface element in the user interface as a first user action attribute into a first list and the operation performed on the user interface element as a second user action attribute into a second list;
automatically creating a test case without user intervention in an automation accelerator for the captured user action from the user action table by selecting the user interface element from the first list and the operation performed on the user interface element in the user action from the second list;
automatically updating the rest of the set of user action attributes upon selecting the user interface element from the first list and the operation performed on the user interface element from the second list; and
generating an automation accelerator script of the test case upon a save operation on the test case.
2. The method in
3. The method in
4. The method in
5. The method in
6. The method in
7. The method in
8. The method in
9. The method in
10. The method in
11. The method in
12. The method in
13. The method in
14. The method in
capturing the user action on the user interface automatically; and
creating the test case for the user action by selecting the user interface element and the operation performed on the user interface element automatically.
16. The system in
17. The system in
18. The system in
The invention generally relates to the field of testing software applications and more specifically to a way of accelerating a test automation process.
The process of testing software applications is typically manual. A user writes a test case in human-readable format in a test case document and follows this test case while testing the software application. This manual process is typically tedious and time-consuming. It is also prone to error, since a person may make a mistake while writing the test case in the test case document, or a user testing the software application may misread the test case document and perform the test incorrectly.
To address these problems, automated testing of software application user interfaces was introduced. In an automated testing process, test automation tools are used to automate the testing. The user creates a test case in the test automation tool, and the tool tests the software application by executing the test case. The test case is typically written in a programming language understood by the test automation tool. Such a tool therefore requires a skilled user, since the user must know that programming language, and writing the test case in the tool's programming language is itself tedious and time-consuming. Also, test automation tools typically identify only standard user interface elements, such as an inputbox, dropdown list, checkbox, or radio button, with standard properties such as width, height, and number of characters. Test automation tools may not identify customized user interface elements, that is, user interface elements having additional properties. For example, if a customized inputbox has additional properties such as a background color or background pattern, the test automation tool may not identify the customized inputbox.
What is described is a method and system for capturing a user action on a user interface and fetching user interface elements in the user interface into a first list and operations of the user interface elements into a second list. A test case for the user action is created in an automation accelerator by selecting a user interface element from the first list and an operation of the user interface element from the second list. An automation accelerator script of the test case is created by the automation accelerator. The automation accelerator script may be executed by the automation accelerator to execute the test case on a software application. The automation accelerator generates a test report containing a result of execution of the test case.
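As an illustrative, non-limiting sketch of the above flow, the following Python snippet models a user action with the five described attributes, a test case as an ordered list of user actions, and test case creation by selecting a UI element from the first list and an operation from the second list. All class, variable, and list contents are assumptions for illustration only.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical data model; the patent does not prescribe these names or types.

@dataclass
class UserAction:
    ui_element: str                 # name of the UI element acted on
    element_type: str               # e.g. "button", "inputbox", "dropdown list"
    operation: str                  # one of "click", "set", "launch"
    value: Optional[str] = None     # optional value, e.g. text typed into an inputbox
    comments: Optional[str] = None  # optional free-form comment on the action

@dataclass
class TestCase:
    name: str
    actions: List[UserAction] = field(default_factory=list)

# First list: UI elements fetched from the UI under test (names assumed here).
# Second list: operations available for those elements.
first_list = ["Browser", "Username", "Password", "Log On"]
second_list = ["click", "set", "launch"]

def add_action(test_case: TestCase, ui_element: str, operation: str,
               element_type: str = "unknown", value: Optional[str] = None,
               comments: Optional[str] = None) -> UserAction:
    """Create a user action by selecting a UI element from the first list and an
    operation from the second list, then append it to the test case."""
    if ui_element not in first_list or operation not in second_list:
        raise ValueError("selections must come from the fetched lists")
    action = UserAction(ui_element, element_type, operation, value, comments)
    test_case.actions.append(action)
    return action
```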
At 325, the automation accelerator checks whether there are any more user actions to be performed. If yes, the automation accelerator repeats steps 315 to 325, that is, it captures the attributes of each user action performed on the UI and adds the user action to the test case. If there are no more user actions to be captured, the automation accelerator script is created at step 330. The automation accelerator script for the test case may be created by using, for example, a save button.
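A minimal sketch of this capture loop (steps 315 to 330), reusing the UserAction and TestCase classes from the previous sketch; the event source that yields captured actions is an assumption.

```python
def capture_test_case(event_source, name: str = "login test") -> TestCase:
    """Keep appending captured user actions to the test case until there are no
    more (steps 315-325); saving the result then creates the script (step 330).
    `event_source` is assumed to yield UserAction objects."""
    test_case = TestCase(name=name)
    for action in event_source:
        test_case.actions.append(action)
    return test_case
```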
UI element 410 specifies the name of the UI element in the UI of the software application on which the user action is performed. Type 415 specifies the type of UI element 410, such as button, radio button, checkbox, dropdown list, inputbox, or hyperlink. Operation 420 is the operation performed on UI element 410 in the user action, such as click, set, or launch. Value 425 contains a value of UI element 410, and comments 430 contains comments on the user action. UI element 410, type 415, and operation 420 are mandatory fields, that is, they must be provided with values in user action table 405, whereas value 425 and comments 430 are not mandatory and may be left empty. User action table 405 may be updated with the attributes of all the user actions performed on the UI.
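The mandatory/optional split of the table columns can be illustrated with a small validation helper; the field names mirror the earlier sketch and are assumptions, not the actual column identifiers of table 405.

```python
MANDATORY_FIELDS = ("ui_element", "element_type", "operation")

def validate_row(row: dict) -> None:
    """UI element, type, and operation are mandatory; value and comments may be empty."""
    missing = [f for f in MANDATORY_FIELDS if not row.get(f)]
    if missing:
        raise ValueError("mandatory field(s) missing: " + ", ".join(missing))

# One illustrative row of a user action table analogous to table 405.
validate_row({"ui_element": "Log On", "element_type": "button",
              "operation": "click", "value": "", "comments": "submit credentials"})
```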
An attribute of the first user action, such as the name of UI element 610, which is "Browser" 620, may be selected from UI element dropdown list 615. UI element dropdown list 615 contains a list of all the UI elements in login UI 500.
The name of UI element 810 on which the first user action is performed is captured as “Browser” 840, type 815 of UI element 810 is updated by automation accelerator 900 as “Browser” 845, operation 820 performed on “Browser” 840 is captured as “Launch” 850 and a value 825 of “Browser” 840 is set as “https://a1stest.wdf.corp/” 855 by automation accelerator 900. Each row in user action table 805 corresponds to the user action performed on login UI 500. A set of such user actions in user action table 805 form a test case.
The below four user actions form test case 950 for testing the login functionality of login UI 500.
All the above user actions may be the user actions necessary for the user to log in to the software application. The user actions may be captured in automation accelerator 900 while the user is performing them on login UI 500, or the user actions may be captured in automation accelerator 900 independently, that is, the user may update the attributes of the user actions in automation accelerator 900 without actually performing the user actions on login UI 500. After capturing the above four user actions in automation accelerator 900, the user may save test case 950 using save button 903. When test case 950 is saved, an automation accelerator script is generated by automation accelerator 900. The automation accelerator script is an extensible markup language (XML) file containing test case 950.
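A hedged sketch of how saving a test case might produce an XML automation accelerator script. The patent states the script is an XML file but does not define its schema, so the tags, attribute names, and the username/password values below are assumptions; only the browser launch URL appears in the text.

```python
import xml.etree.ElementTree as ET

def generate_script(test_case: TestCase) -> str:
    """Serialize a test case into an XML automation accelerator script
    (illustrative schema only)."""
    root = ET.Element("testcase", name=test_case.name)
    for a in test_case.actions:
        ET.SubElement(root, "useraction", element=a.ui_element, type=a.element_type,
                      operation=a.operation, value=a.value or "", comments=a.comments or "")
    return ET.tostring(root, encoding="unicode")

# Hypothetical login test case; only the first action's values are taken from the text.
test_case_950 = TestCase(name="test case 950", actions=[
    UserAction("Browser", "Browser", "launch", "https://a1stest.wdf.corp/"),
    UserAction("Username", "inputbox", "set", "testuser"),   # assumed value
    UserAction("Password", "inputbox", "set", "secret"),     # assumed value
    UserAction("Log On", "button", "click"),
])
print(generate_script(test_case_950))
```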
The automation accelerator script may be opened in automation accelerator 900 and executed by automation accelerator 900. On executing the automation accelerator script, automation accelerator 900 executes test case 950 on login UI 500 automatically. Once the automation accelerator script is executed by automation accelerator 900, typically no further human intervention is required to test login UI 500, as automation accelerator 900 simulates the user actions on login UI 500 automatically.
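A sketch of script execution: the XML is parsed and each user action is simulated through a hypothetical driver object exposing launch/set/click; in practice the accelerator would delegate the simulation to the underlying test automation tool.

```python
import xml.etree.ElementTree as ET

def execute_script(script_xml: str, driver) -> None:
    """Replay a generated script by simulating each user action on the UI.
    `driver` is a hypothetical object; its methods are assumptions."""
    root = ET.fromstring(script_xml)
    for action in root.findall("useraction"):
        op = action.get("operation")
        if op == "launch":
            driver.launch(action.get("value"))
        elif op == "set":
            driver.set(action.get("element"), action.get("value"))
        elif op == "click":
            driver.click(action.get("element"))
        else:
            raise ValueError("unsupported operation: " + str(op))
```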
Automation accelerator 900 may also be executed in a record mode. In the record mode, the automation accelerator window is minimized or the automation accelerator is executed in the background. Automation accelerator 900 records the user actions performed on login UI 500 by automatically capturing the attributes of the user actions in user action table 905. Automation accelerator 900 automatically identifies the name of the UI element, the operation performed on the UI element, and the value of the UI element on which the user action is performed, and updates the attributes of the user actions in user action table 905. After performing the user actions on login UI 500, the user may deactivate the record mode and save test case 950 in automation accelerator 900. In record mode, no human intervention may be required for capturing the user actions to create test case 950 in automation accelerator 900.
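Record mode could be sketched as an event hook that appends a user action for every UI event while recording is active; the Recorder class and its methods are assumptions, reusing the earlier UserAction/TestCase sketch.

```python
class Recorder:
    """Hypothetical record mode: while recording is active, every UI event is
    turned into a user action and appended to the test case without manual entry."""

    def __init__(self, test_case: TestCase):
        self.test_case = test_case
        self.recording = False

    def start(self) -> None:
        self.recording = True

    def stop(self) -> None:
        self.recording = False

    def on_ui_event(self, element: str, element_type: str,
                    operation: str, value: str = "") -> None:
        # Called by a UI event hook (not shown); ignored unless record mode is active.
        if self.recording:
            self.test_case.actions.append(
                UserAction(element, element_type, operation, value or None))
```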
The user may specify to automation accelerator 900, by using pointer button 904, login UI 500 as the UI for which test case 950 is to be created. For example, the user may click on pointer button 904 via a pointing device, switch to login UI 500, and then click on login UI 500. Automation accelerator 900 identifies login UI 500 as the UI for which a test case may be created and builds a list of the UI elements and a list of the operations for the UI elements in login UI 500, which are required to create user actions in automation accelerator 900.
Automation accelerator 900 may execute test case 950 completely or partially, that is, automation accelerator 900 may execute all the user actions in test case 950 or only a selected group of the user actions. The user may specify the group of user actions to be tested in login UI 500 by specifying a start point and an end point in the test case. All the user actions between the start point and the end point may be executed by automation accelerator 900. For example, the user may specify user action 935 as start point of test case 950 and user action 940 as the end point of test case 950. Automation accelerator 900 would only execute user action 935 and user action 940 when the automation accelerator script is executed.
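Partial execution between a start point and an end point amounts to replaying only a slice of the test case, as in the following sketch; the use of indices and the reuse of the hypothetical test_case_950 from the earlier sketch are illustrative.

```python
def select_range(actions: list, start: int, end: int) -> list:
    """Return the user actions from the start point to the end point, inclusive."""
    if not (0 <= start <= end < len(actions)):
        raise IndexError("start and end points must lie within the test case")
    return actions[start:end + 1]

# e.g. execute only the user actions between two chosen points of the test case
partial = select_range(test_case_950.actions, start=2, end=3)
```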
Automation accelerator 1005 communicates with test automation tool 1015 via automation accelerator technology layer 1010 to fetch UI elements into a first list and operations of the UI elements into a second list when user 1000 identifies the UI for which a test case may be created. Automation accelerator 1005 communicates with test automation tool 1015 to simulate user actions on the UI when the automation accelerator script is executed. Automation accelerator 1005 allows user 1000 to create the test case in a declarative way, unlike test automation tool 1015, in which user 1000 typically has to create the test case by writing code in a programming language understood by test automation tool 1015. Automation accelerator 1005 thus typically eliminates the need to know a programming language specific to test automation tool 1015.
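The technology layer between the accelerator and the test automation tool can be pictured as an adapter interface with two responsibilities, fetching the selection lists and simulating actions; the interface below is an assumption, not the actual API of layer 1010.

```python
from abc import ABC, abstractmethod
from typing import List, Tuple

class TechnologyLayer(ABC):
    """Hypothetical adapter between the automation accelerator and a test
    automation tool."""

    @abstractmethod
    def fetch_lists(self, ui_id: str) -> Tuple[List[str], List[str]]:
        """Return (list of UI elements, list of supported operations) for the UI."""

    @abstractmethod
    def simulate(self, ui_id: str, element: str, operation: str, value: str = "") -> None:
        """Ask the underlying test automation tool to perform the operation."""
```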
Also, automation accelerator 1005 may be customized to identify customized UI elements in the UI, unlike test automation tool 1015, which typically identifies only standard UI elements. Standard UI elements are UI elements such as an inputbox, dropdown list, radio button, and checkbox with standard properties such as width, height, and number of characters. If an inputbox is customized to have one or more additional properties, such as a background style, that are not part of the standard properties, test automation tool 1015 typically does not identify the customized inputbox. Automation accelerator 1005 may be programmed with an API that identifies customized UI elements.
The test case may be saved in automation accelerator 1110. Automation accelerator 1110 persists the test case as automation accelerator script 1125, which is persisted in an XML file format. Automation accelerator script 1125 may also be displayed in human-readable format by opening it in automation accelerator 1110. Automation accelerator 1110 also has a feature that allows user 1100 to generate automation tool script 1130 for the test case that is executable by test automation tool 1115. Automation tool script 1130 allows user 1100 to execute the test case on user interface 1105 by running the script in test automation tool 1115. This is particularly useful in a test environment where automation accelerator 1110 is not integrated with test automation tool 1115. Another advantage of this feature is that user 1100 may create the test case using automation accelerator 1110, which is typically faster and easier than creating a test case in test automation tool 1115.
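A sketch of exporting an automation tool script: each user action is translated into a statement of the target tool's scripting language. The target syntax shown is invented purely for illustration, since each test automation tool defines its own language.

```python
def export_tool_script(test_case: TestCase) -> str:
    """Translate the test case into a script for the underlying test automation
    tool (illustrative target syntax)."""
    lines = []
    for a in test_case.actions:
        if a.operation == "launch":
            lines.append('Launch "%s"' % a.value)
        elif a.operation == "set":
            lines.append('Set "%s" = "%s"' % (a.ui_element, a.value))
        elif a.operation == "click":
            lines.append('Click "%s"' % a.ui_element)
    return "\n".join(lines)
```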
Test data container 1120 persists values of UI elements specified in the test case in automation accelerator 1110. Automation accelerator 1110 creates test data container 1120 when automation accelerator script 1125 is executed and initializes test data container 1120 with the values of the UI elements. Automation accelerator 1110 fetches the values of the UI elements from test data container 1120 to simulate the user actions on user interface 1105 and perform the testing. Automation accelerator 1110 generates a test report 1135 that contains a result of execution of the test case. Test report 1135 is generated in a human-readable format.
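A sketch of a test data container and report generation; the class and function names are assumptions. The container is built from the test case's values when the script is executed, and the report lists a result per user action.

```python
class TestDataContainer:
    """Hypothetical test data container: created when the script is executed and
    initialized with the UI element values specified in the test case; values are
    fetched from here when the user actions are simulated."""

    def __init__(self, test_case: TestCase):
        self._values = {a.ui_element: a.value for a in test_case.actions if a.value}

    def value_of(self, ui_element: str) -> str:
        return self._values[ui_element]

def generate_report(results: dict) -> str:
    """Render a human-readable test report from per-action pass/fail results."""
    return "\n".join(f"{name}: {'PASS' if ok else 'FAIL'}" for name, ok in results.items())
```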
Embodiments of the invention may include various steps as set forth above. The steps may be embodied in machine-executable program code which causes a general-purpose or special-purpose processor to perform certain steps. Alternatively, these steps may be performed by specific hardware components that contain hardwired logic for performing the steps, or by any combination of programmed computer components and custom hardware components.
Embodiments of the present invention may also be provided as a machine-readable medium for storing the machine-executable instructions. The machine-readable medium may include, but is not limited to, flash memory, optical disks, CD-ROMs, DVD ROMs, RAMs, EPROMs, EEPROMs, magnetic or optical cards, propagation media or any other type of machine-readable media suitable for storing electronic instructions. For example, the present invention may be downloaded as a computer program which may be transferred from a remote computer (e.g., a server) to a requesting computer (e.g., a client) by way of data signals embodied in a carrier wave or other propagation medium via a communication link (e.g., a modem or network connection).
Throughout the foregoing description, for the purposes of explanation, numerous specific details were set forth in order to provide a thorough understanding of the invention. It will be apparent, however, to one skilled in the art that the invention may be practiced without some of these specific details. Accordingly, the scope and spirit of the invention should be judged in terms of the claims which follow.
Khaladkar, Sunil, Zencke, Peter