
Software Testing Concepts

There are two ways of Testing:
1. Manual Testing
2. Automation Testing

1) Manual Testing
It is a process in which all the phases of the Software Testing Life Cycle, like Test Planning, Test Development, Test Execution, Result Analysis, Bug Tracking and Reporting, are accomplished manually with human effort.

Drawbacks of Manual Testing

1. More people are required.
2. Time consuming.
3. Less accuracy.
4. Tiredness.
5. Simultaneous actions are almost impossible.
6. Repeating a task in the same manner is not easy manually.

2) Automation Testing
It is a process in which all the drawbacks of manual testing are addressed (overcome) properly, providing speed and accuracy to the existing testing process.
Note: Automation testing is not a replacement for manual testing; it is a continuation of manual testing that provides speed and accuracy.

Drawbacks of Automation Testing
1. Too costly.
2. Cannot automate all the areas.
3. Lack of expertise.

Automation Tool
An automated tool is an assistant of test engineers, which works based on instructions and information.

Definitions

Project: It is something developed based on a particular customer's requirements and used by that particular customer only.
Product: It is something developed based on the company's specifications and used by multiple customers.
Quality: Quality is defined as not only the satisfaction of the requirements but also the presence of value (user friendliness).
Defect: A defect is defined as a deviation from the requirements.
Testing: Testing is a process in which defects are identified, isolated (separated), subjected (sent) for rectification, and it is ensured that the product is defect free, in order to produce a quality product in the end and hence customer satisfaction. (Or)

Testing is the process of executing a program with the intent of finding errors. (Or)
Verifying and validating the application with respect to the customer requirements. (Or)
Finding the differences between customer-expected and actual values. (Or)
Testing should also ensure that a quality product is delivered to the customer.

Process of Developing a Project in a Software Company
BIDDING THE PROJECT: Bidding the project is defined as request for proposal, estimation and sign-off.

KICK-OFF MEETING: It is an initial meeting conducted in the software company soon after the project is signed off, in order to discuss the overview of the project and to select a project manager for it. Usually the High Level Manager, Project Manager, Technical Manager, Quality Manager, Test Leads and Project Leads will be involved in this meeting.
PIN (Project Initiation Note): PIN is a mail prepared by the project manager and sent to the CEO of the software company in order to get permission to start the project development.

SDLC (Software Development Life Cycle) It contains 6 phases.


1. Initial phase / Requirement phase
2. Analysis phase
3. Design phase
4. Coding phase
5. Testing phase
6. Delivery and maintenance phase

Initial Phase
Task: Interacting with the customer and gathering the requirements.
Roles: BA (Business Analyst), EM (Engagement Manager).
Process: First of all the business analyst will take an appointment from the customer, collect the templates from the company, meet the customer on the appointed date, gather the requirements with the support of the templates and come back to the company with a requirements document. Then the engagement manager will check for extra requirements; if at all he finds any extra requirements, he is responsible for the excess cost of the project. The engagement manager is also responsible for prototype demonstration in case of confusing requirements.
Template: It is defined as a pre-defined format with pre-defined fields used for preparing a document perfectly.
Prototype: It is a rough and rapidly developed model used for demonstrating to the client, in order to gather clear requirements and to win the confidence of the customer.
Proof: The proof of this phase is the requirements document, which is also called by the following names:
FRS - Functional Requirement Specification
BRS - Business Requirement Specification
CRS - Client/Customer Requirement Specification
URS - User Requirement Specification
BDD - Business Design Document

BD - Business Document
Note: Some companies may maintain the overall information in one document called BRS and the detailed information in another document called FRS. But most companies will maintain both kinds of information in a single document.

Analysis Phase
Tasks: Feasibility study. Tentative planning. Technology selection. Requirement analysis.
Roles: System Analyst (SA), Project Manager (PM), Team Manager (TM).
Process:
(I) Feasibility study: It is a detailed study of the requirements in order to check whether all the requirements are possible or not.
(II) Tentative planning: The resource planning and time planning are temporarily done in this section.
(III) Technology selection: The list of all the technologies that are to be used to accomplish the project successfully will be analyzed and listed out here in this section.
(IV) Requirement analysis: The list of all the requirements, like human resources, hardware and software, required to accomplish this project successfully will be clearly analyzed and listed out here in this section.

Proof

The proof of this phase is SRS (Software Requirement Specification).

Design Phase
Tasks: HLD (High Level Designing), LLD (Low Level Designing).
Roles: CA (Chief Architect), TL (Technical Lead).
Process: HLD is done by the CA (Chief Architect). LLD is done by the TL (Technical Lead).

The chief architect will divide the whole project into modules by drawing some diagrams, and the technical lead will divide each module into sub-modules by drawing some diagrams using UML (Unified Modeling Language). The technical lead will also prepare the pseudo code.
Proof: The proof of this phase is the TDD (Technical Design Document).
Pseudo Code: It is a set of English instructions used for guiding the developer to develop the actual code easily.
Module: A module is defined as a group of related functionalities performing a major task.

Coding Phase
Task: Programming / Coding.
Roles: Developers / Programmers.
Process: Developers will develop the actual source code by using the pseudo code and following coding standards like proper indentation, color coding, proper commenting, etc.
Proof: The proof of this phase is the SCD (Source Code Document).

Testing Phase
Task: Testing.
Roles: Test Engineer.
Process:

First of all the Test Engineer will receive the requirement documents and review them for understanding the requirements.

If at all they get any doubts while understanding the requirements, they will prepare the Review Report (RR) with the list of all the doubts.

Once the clarifications are given and after understanding the requirements clearly they will take the test case template and write the test cases.

Once the build is released they will execute the test cases.

After execution, if at all they find any defects, they will list them out in a defect profile document.

Then they will send defect profile to the developers and wait for the next build.

Once the next build is released they will once again execute the test cases.

If they find any defects they will follow the above procedure again and again till the product is defect free.

Once they feel the product is defect free, they will stop the process.
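The execute, analyse and report loop in the steps above can be sketched as follows. This is a minimal illustration only: the build callable and the test-case fields are invented stand-ins, not part of any real tool.

```python
# Sketch of the testing-phase loop: execute each test case against the
# released build, compare expected vs. actual values, and collect the
# failures in a defect profile. "build" is any callable standing in for
# the application under test.

def execute_test_cases(test_cases, build):
    defect_profile = []
    for case in test_cases:
        actual = build(case["input"])            # execute the test case
        if actual != case["expected"]:           # result analysis: compare values
            defect_profile.append(
                {"case_id": case["id"], "expected": case["expected"], "actual": actual}
            )
    return defect_profile

# A build in which one functionality is still defective.
buggy_build = lambda x: x * 2 if x != 3 else 99
cases = [
    {"id": "TC1", "input": 2, "expected": 4},
    {"id": "TC2", "input": 3, "expected": 6},
]
defects = execute_test_cases(cases, buggy_build)
print(defects)  # TC2 goes into the defect profile and back to the developers
```

Each new build would be run through the same loop until the defect profile comes back empty, matching the "again and again till the product is defect free" cycle above.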

Proof: The proof of this phase is a Quality Product.
Test Case: A test case is an idea of a test engineer, based on the requirements, to test a particular feature.

Delivery and Maintenance Phase
Delivery
Task: Installing the application in the client environment.
Roles: Senior Test Engineers / Deployment Engineers.
Process: The senior test engineers or deployment engineers will go to the client place and install the application into the client environment with the help of the guidelines provided in the deployment document.
Maintenance
After the delivery, if at all any problem occurs, then that becomes a task; based on the problem the corresponding role will be appointed. Based on the problem, the role will define the process and solve the problem.

Where exactly does testing come into the picture? How many sorts of testing are there? There are two sorts of testing:
1. Unconventional testing
2. Conventional testing
Unconventional Testing: It is a sort of testing in which quality assurance people will check each and every outcome document, right from the initial phase of the SDLC.
Conventional Testing: It is a sort of testing in which the test engineer will test the application in the testing phase of the SDLC.

TESTING METHODOLOGY (OR) TESTING TECHNIQUES

There are 3 methods:


1. Black Box Testing
2. White Box Testing
3. Gray Box Testing

1) Black Box Testing: It is a method of testing in which one will perform testing only on the functional part of an application, without having any structural knowledge. Usually test engineers perform it.
2) White Box Testing (Or) Glass Box Testing (Or) Clear Box Testing: It is a method of testing in which one will perform testing on the structural part of an application. Usually developers or white box testers perform it.
3) Gray Box Testing: It is a method of testing in which one will perform testing on both the functional part as well as the structural part of an application.
Note: A test engineer with structural knowledge will perform gray box testing.

LEVELS OF TESTING

There are 5 levels of testing.
1. Unit level testing
2. Module level testing
3. Integration level testing
4. System level testing
5. User acceptance level testing

1) Unit level testing

If one performs testing on a unit, then that level of testing is known as unit level testing. It is white box testing; usually developers perform it.
Unit: It is defined as the smallest part of an application.
2) Module level testing

If one performs testing on a module, that is known as module level testing. It is black box testing; usually test engineers perform it.
3) Integration level testing

Once the modules are developed, the developers will develop some interfaces and integrate the modules with the help of those interfaces; while integrating, they will check whether the interfaces are working fine or not. It is white box testing, and usually developers or white box testers perform it. The developers will integrate the modules in any one of the following approaches:
i) Top-Down Approach (TDA): In this approach the parent modules are developed first and then integrated with the child modules.
ii) Bottom-Up Approach (BUA): In this approach the child modules are developed first and then integrated with the corresponding parent modules.
iii) Hybrid Approach: This approach is a mix of both the top-down and bottom-up approaches.
iv) Big Bang Approach: Once all the modules are ready, integrating them all at a time is known as the big bang approach.

STUB: While integrating the modules in the top-down approach, if at all any mandatory module is missing, then that module is replaced with a temporary program known as a STUB.
DRIVER: While integrating the modules in the bottom-up approach, if at all any mandatory module is missing, then that module is replaced with a temporary program known as a DRIVER.
4) System level testing
Once the application is deployed into the environment, if one performs testing on the system it is known as system level testing. It is black box testing and is usually done by the test engineers. At this level of testing many types of testing are done. Some of those are:

System Integration Testing, Load Testing, Performance Testing, Stress Testing, etc.
5) User acceptance testing
The same system testing done in the presence of the user is known as user acceptance testing. It is black box testing, usually done by the test engineers.

ENVIRONMENT

Environment is a combination of 3 layers:
Presentation Layer.
Business Layer.
Database Layer.
Types of Environment
There are 4 types of environments.
1. Standalone Environment / One-Tier Architecture.
2. Client-Server Environment / Two-Tier Architecture.
3. Web Environment / Three-Tier Architecture.
4. Distributed Environment / N-Tier Architecture.

1. Standalone Environment (Or) One-Tier Architecture

This environment contains all the three layers, that is the Presentation layer, Business layer and Database layer, in a single tier.
2. Client-Server Environment (Or) Two-Tier Architecture

In this environment two tiers will be there: one tier is for the client and the other tier is for the database server. The Presentation layer and Business layer will be present in each and every client, and the database will be present in the database server.
3. Web Environment

In this environment three tiers will be there: the client resides in one tier, the application server resides in the middle tier and the database server resides in the last tier. Every client will have the presentation layer, the application server will have the business layer and the database server will have the database layer.
4. Distributed Environment

It is the same as the web environment, but the business logic is distributed among application servers in order to distribute the load.
Web Server: It is software that provides web services to the client.

Application Server: It is a server that holds the business logic. Ex: Tomcat, WebLogic, WebSphere, etc.
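The layer separation described above can be illustrated with a toy sketch, where a plain function stands in for each tier; all names and data here are invented for illustration only.

```python
# Toy sketch of the three-tier web environment: client (presentation),
# application server (business logic) and database server, each as a
# separate function so the layer boundaries are visible.

def database_layer(user_id):
    """Database tier: returns stored data for a key."""
    table = {"42": "Jane"}
    return table.get(user_id)

def business_layer(user_id):
    """Middle tier: applies business logic around the data access."""
    name = database_layer(user_id)
    return f"Hello, {name}" if name else "Unknown user"

def presentation_layer(user_id):
    """Client tier: formats what the user finally sees."""
    return f"[page] {business_layer(user_id)}"

print(presentation_layer("42"))  # request flows through all three layers
```

In a standalone environment all three functions would live in one program; in the distributed case, copies of the business layer would run on several application servers.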

SOFTWARE DEVELOPMENT MODELS

There are 6 models.


1. Waterfall Model (or) Sequential Model
2. Prototype Model
3. Evolutionary Model
4. Spiral Model
5. Fish Model
6. V-Model

1) Waterfall Model (or) Sequential Model

Advantages: It is a simple model, easy to maintain, and project implementation is very easy.
Drawbacks: Can't incorporate new changes in the middle of the project development.

2) Prototype Model

Advantages: Whenever the customer is unclear about the requirements, this is the best model to gather clear requirements.
Drawbacks:
It is not a complete model.
Time-consuming model.
A prototype has to be built at the company's cost.
The user may stick to the prototype and limit his requirements.
3) Evolutionary Model
Advantages

Whenever the customer's requirements are evolving, this is the best suitable model.
Drawbacks

Deadlines are not clearly defined. Project monitoring and maintenance is difficult.

4) Spiral Model
The spiral model is a software development process combining elements of both design and prototyping in stages, in an effort to combine the advantages of top-down and bottom-up concepts. Also known as the spiral life-cycle model, it is a systems development method (SDM) used in information technology (IT). This model of development combines the features of the prototyping model and the waterfall model. The spiral model is intended for large, expensive and complicated projects.
Advantages

Estimates (i.e. budget, schedule, etc.) become more realistic as work progresses, because important issues are discovered earlier. It is more able to cope with the (nearly inevitable) changes that software development generally entails. Software engineers (who can get restless with protracted design processes) can get their hands in and start working on a project earlier. This is the best-suited model for highly risk-based projects.
Drawbacks: Time-consuming model, costly model, and project monitoring and maintenance is difficult.

5) Fish Model
Verification: Verification is a process of checking conducted on each and every role of an organization, in order to check whether he is doing his tasks in the right manner according to the guidelines or not, right from the start of the process till the end of the process. Usually the documents are verified in this process of checking.
Validation: Validation is a process of checking conducted on the developed product, in order to check whether it is working according to the requirements or not.

Advantages: As both verification and validation are done, the outcome of the Fish Model is a quality product.
Drawbacks: Time-consuming and costly model.

6) V Model

Advantages: As the verification and validation are done along with test management, the outcome of the V-Model is a quality product.
Drawbacks: Time-consuming and costly model.

TYPES OF TESTING

There are 18 types of testing.
1. Build Verification Testing.
2. Regression Testing.
3. Re-Testing.
4. Alpha Testing.
5. Beta Testing.
6. Static Testing.
7. Dynamic Testing.
8. Installation Testing.
9. Compatibility Testing.
10. Monkey Testing.
11. Exploratory Testing.
12. Usability Testing.
13. End-To-End Testing.
14. Port Testing.
15. Reliability Testing.
16. Mutation Testing.
17. Security Testing.
18. Adhoc Testing.
1) Sanity Testing / Build Verification Testing / Build Acceptance Testing: It is a type of testing in which one will conduct overall testing on the released build in order to check whether it is proper for further detailed testing or not. Some companies even call it Sanity Testing and also Smoke Testing. But some companies will say that just before the release of the build, the developers will conduct overall testing in order to check whether the build is proper for detailed testing or not, and that is known as Smoke Testing; once the build is released, the testers will once again conduct overall testing in order to check whether the build is proper for further detailed testing or not, and that is known as Sanity Testing.
2) Regression Testing

It is a type of testing in which one will perform testing on already-tested functionality again and again. This is usually done in two scenarios (situations).
Scenario 1: Whenever defects are raised by the test engineer and rectified by the developer, and the next build is released to the testing department, then the test engineer will test the defect functionality and its related functionalities once again.
Scenario 2: Whenever some new changes are requested by the customer, those new features are incorporated by the developers and the next build is released to the testing department, then the test engineers will once again test the related functionalities of the new features which were already tested. That is also known as regression testing.
Note: Testing the new features for the first time is new testing, not regression testing.
3) Re-Testing: It is a type of testing in which one will perform testing on the same function again and again with multiple sets of data, in order to come to a conclusion whether the functionality is working fine or not.
4) Alpha Testing: It is a type of testing in which one (i.e., our test engineer) will perform user acceptance testing in our company, in the presence of the customer. Advantage: If at all any defects are found, there is a chance of rectifying them immediately.
5) Beta Testing: It is a type of testing in which either third-party testers or end users will perform user acceptance testing at the client place, before actual implementation.
6) Static Testing: It is a type of testing in which one will perform testing on an application or its related factors without performing any actions. Ex: GUI Testing, Document Testing, Code Reviewing, etc.
7) Dynamic Testing: It is a type of testing in which one will perform testing on the application by performing some actions. Ex: Functional Testing.

8) Installation Testing: It is a type of testing in which one will install the application into the environment by following the guidelines given in the deployment document; if the installation is successful, one will come to the conclusion that the guidelines are correct, otherwise that they are not.
9) Compatibility Testing: It is a type of testing in which one may have to install the application into multiple environments, prepared with different combinations of environmental components, in order to check whether the application is suitable for those environments or not. This is usually done for products.
10) Monkey Testing: It is a type of testing in which one will perform some abnormal actions intentionally on the application in order to check its stability.
11) Exploratory Testing: It is a type of testing in which usually a domain expert will perform testing on the application in parallel with exploring its functionality, without having knowledge of the requirements.
12) Usability Testing: It is a type of testing in which one will concentrate on the user-friendliness of the application.
13) End-To-End Testing: It is a type of testing in which one will perform testing on a complete transaction from one end to another end.
14) Port Testing: It is a type of testing in which one will check whether the application is working properly or not after deploying it into the original client's environment.
15) Reliability Testing (or) Soak Testing: It is a type of testing in which one will perform testing on the application continuously for a long period of time in order to check its stability.
16) Mutation Testing: It is a type of testing in which one will perform testing by making some changes. For example, usually the developers will make many changes to the program and check its performance; that is known as mutation testing.
17) Security Testing: It is a type of testing in which one will usually concentrate on the following areas.

i) Authentication Testing. ii) Direct URL Testing. iii) Firewall Leakage Testing.
i) Authentication Testing: It is a type of testing in which a test engineer will enter different combinations of user names and passwords, in order to check whether only the authorized persons are able to access the application or not.
ii) Direct URL Testing: It is a type of testing in which a test engineer will specify the direct URLs of secured pages and check whether they are being accessed or not.
iii) Firewall Leakage Testing: It is a type of testing in which one will enter as one level of user and try to access the unauthorized pages of another level, in order to check whether the firewall is working properly or not.
18) Adhoc Testing: It is a type of testing in which one will perform testing on the application in his own style, after understanding the requirements clearly.

SOFTWARE TESTING LIFE CYCLE
It contains 6 phases.
1. TEST PLANNING.
2. TEST DEVELOPMENT.
3. TEST EXECUTION.
4. RESULT ANALYSIS.
5. BUG TRACKING.
6. REPORTING.
1) TEST PLANNING
Plan: A plan is a strategic document which describes how to perform a task in an effective, efficient and optimized way.
Optimization: Optimization is a process of reducing the input resources, or utilizing them to their maximum, while getting the maximum possible output.
Test Plan:

It is a strategic document which describes how to perform testing on an application in an effective, efficient and optimized way. The test lead prepares the test plan.

3.0 TEST STRATEGY
It is defined as an organization-level term, which is used for testing all the projects in the organization.
TEST PLAN
It is defined as a project-level term, which describes how to test a particular project in an organization.
Note: The test strategy is common for all the projects, but the test plan varies from project to project.
3.1 Levels of Testing
The list of all the levels of testing that are maintained in the company is listed out here in this section.
3.2 Types of Testing
The list of all the types of testing that are followed by the company is listed out here in this section.

3.3 Test Design Techniques
The list of all the techniques that are followed by the company during test case development is listed out here in this section. Ex: BVA (Boundary Value Analysis), ECP (Equivalence Class Partitioning).
3.4 Configuration Management
3.5 Test Metrics
The list of all the tasks that are measured and maintained in terms of metrics is clearly mentioned here in this section.
3.6 Terminologies
The list of all the terms and their corresponding meanings is listed out here in this section.
3.7 Automation Plan
The list of all the areas that are planned for automation in the company is listed out here in this section.

3.8 List of Automated Tools
The list of all the automated tools that are used in the company is listed out here in this section.
4.0 BASE CRITERIA
4.1 Acceptance Criteria: When to stop testing, having tested in a full-fledged manner and thinking that enough testing is done on the application, is clearly described here in this section.
4.2 Suspension Criteria: When to stop testing suddenly and suspend the build is clearly mentioned here in this section.
5.0 TEST DELIVERABLES: The list of all the documents that are to be prepared and delivered in the testing phase is listed out here in this section.
6.0 TEST ENVIRONMENT: The customer-specified environment that is to be used for testing is clearly described here in this section.
7.0 RESOURCE PLANNING: Who has to do what is clearly described here in this section.
8.0 SCHEDULING: The starting and ending dates of each and every task are clearly described here in this section.
9.0 STAFFING AND TRAINING: How much staff is to be recruited and what kind of training is to be provided is clearly planned and mentioned here in this section.
10.0 RISKS AND CONTINGENCIES: The list of all the potential risks and the corresponding solution plans is listed out here in this section.
Risks

Unable to deliver the software within the deadlines.
Employees may leave the organization in the middle of the project development.
The customer may impose deadlines.
Unable to test all the features within the time.
Lack of expertise.

Contingences

Proper planning in advance.
People need to be maintained on the bench.
What is not to be tested has to be planned properly.
Severity- and priority-based execution.
Proper training needs to be provided.

11.0 ASSUMPTIONS: The list of all the assumptions that are to be made by a test engineer is listed out here in this section.
12.0 APPROVAL INFORMATION: Who will approve what is clearly mentioned here in this section.
2. TEST DEVELOPMENT
Types of Test Cases
Test cases are broadly divided into two types:
1. GUI Test Cases.
2. Functional Test Cases.
Functional test cases are further divided into two types:
1. Positive Test Cases.
2. Negative Test Cases.

Guidelines to prepare GUI Test Cases:


Check for the availability of all the objects.
Check for the alignment of the objects, if at all the customer has specified such requirements.
Check for the consistency of all the objects.
Check for spelling and grammar.
Apart from these guidelines, anything we test without performing any action falls under GUI test cases.

Guidelines for developing Positive Test Cases.

A test engineer must have a positive mindset.

A test engineer should consider the positive flow of the application. A test engineer should use valid input from the point of view of the functionality.

Guidelines for developing the Negative Test Cases:


A test engineer must have a negative mindset.
He should consider the negative flow of the application.
He should use at least one invalid input for a set of data.

Test Case Template:
1. Test Objective
2. Test Scenario
3. Test Procedure
4. Test Data
5. Test Cases

1. Test Objective: The purpose of the document is clearly described here in this section.
2. Test Scenarios: The list of all the situations that are to be tested is listed out here in this section.
3. Test Procedure: Test procedure is a functional-level term which describes how to test the functionality. So in this section one will describe the plan for testing the functionality.
4. Test Data: The data that is required for testing is made available here in this section.
5. Test Cases: The list of all the detailed test cases is listed out here in this section.
Note: Some companies maintain all the above five fields individually for each and every scenario, but some companies maintain them commonly for all the scenarios.
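The five-field template above can be sketched as a plain data record. The objective, scenario and data values shown are invented placeholders, not taken from any real project.

```python
# Sketch of the test case template as a dictionary with the five fields
# described above; detailed test cases get appended under "test_cases".

test_case_document = {
    "test_objective": "Verify the e-mail text box validations",
    "test_scenario": "User enters values of different lengths and character sets",
    "test_procedure": "Type the value, move focus away, observe acceptance",
    "test_data": ["abcd", "ab@zx", "abc"],
    "test_cases": [],  # the detailed cases are listed out here
}

test_case_document["test_cases"].append(
    {"id": "TC1", "description": "Enter a 4-character value", "expected": "accepted"}
)
print(len(test_case_document["test_cases"]))  # one detailed case recorded
```

A company that keeps the five fields per scenario would simply maintain one such record per scenario instead of one shared record.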

3. TEST EXECUTION
During the test execution phase the test engineer will do the following:
1. He will perform the action that is described in the description column.
2. He will observe the actual behavior of the application.
3. He will document the observed value under the actual value column.
4. RESULT ANALYSIS
In this phase the test engineer will compare the expected value with the actual value and mention the result as pass if both match, otherwise mention the result as fail.
5. BUG TRACKING
Bug tracking is a process in which the defects are identified, isolated and managed.
Defect Profile Document
Defect ID: The sequence of defect numbers is listed out here in this section.
Steps to Reproduce: The list of all the steps that are followed by a test engineer to identify the defect is listed out here in this section.
Submitter: The name of the test engineer who submits the defect is mentioned here in this section.
Date of Submission: The date on which the defect is submitted is mentioned here in this section.
Version Number: The corresponding version number is mentioned here in this section.
Build Number: The corresponding build number is mentioned here in this section.
Assigned To: The project lead or development lead will mention the name of the corresponding developer to whom the defect is assigned.
Severity:

How serious the defect is, is described in terms of severity. It is classified into 4 types:
1. FATAL (Sev1 / S1 / 1)
2. MAJOR (Sev2 / S2 / 2)
3. MINOR (Sev3 / S3 / 3)
4. SUGGESTION (Sev4 / S4 / 4)

FATAL: If at all the problems are related to navigational blocks or unavailability of functionality, then such problems are treated as fatal defects.
Note: These are also called show-stopper defects.
MAJOR: If at all the problems are related to the working of the features, then such problems are treated as major defects.
MINOR: If at all the problems are related to the look and feel of the application, then such problems are treated as minor defects.
SUGGESTION: If at all the problems are related to the value of the application, then such problems are treated as suggestions.
Priority: The sequence in which the defects have to be rectified is described in terms of priority. It is classified into 4 types:
1. CRITICAL 2. HIGH 3. MEDIUM 4. LOW

Usually the FATAL defects are given CRITICAL priority, MAJOR defects are given HIGH priority, MINOR defects are given MEDIUM priority and SUGGESTION defects are given LOW priority, but depending upon the situation the priority may be changed by the project lead or development lead.
Ex: Low Severity, High Priority Case:

In the case of a customer visit, all the look-and-feel defects, which are usually less severe, are given the highest priority.
High Severity, Low Priority Case: If at all some part of the application is not available because it is still under development, the test engineer will treat that as a FATAL defect, but the development lead will give less priority to those defects.
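The default severity-to-priority mapping described above, together with the lead's ability to change it, can be sketched as a small lookup; the override parameter is an assumed mechanism for the lead's decision, not a standard field.

```python
# Default severity-to-priority mapping from the text, with an optional
# override standing in for the project/development lead's judgment.

DEFAULT_PRIORITY = {
    "FATAL": "CRITICAL",
    "MAJOR": "HIGH",
    "MINOR": "MEDIUM",
    "SUGGESTION": "LOW",
}

def assign_priority(severity, override=None):
    """Return the default priority unless the lead overrides it."""
    return override or DEFAULT_PRIORITY[severity]

# High severity, low priority: a FATAL defect in an area still under
# development is deliberately deprioritized by the lead.
print(assign_priority("FATAL"))                  # default: CRITICAL
print(assign_priority("FATAL", override="LOW"))  # lead's override: LOW
```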

BUG LIFE CYCLE
New / Open: Whenever the defect is found for the first time, the test engineer will set the status as New / Open. But some companies will say to set the status as only New at this stage, and once the developers accept the defect they will set the status as Open.
Reopen and Closed: Once the defects are rectified by the developer and the next build is released to the testing department, the testers will check whether the defects are rectified properly or not. If they feel they are rectified, they will set the status as Closed; otherwise they will set the status as Reopen.
Fixed for Verification / Fixed / Rectified: Whenever the defects raised by the test engineer are accepted and rectified by the developers, then they will set the status as Fixed.
Hold: Whenever the developer is confused whether to accept or reject the defect, he will set the status as Hold.
Tester's Mistake / Tester's Error / Rejected: Whenever the developer is confident it is not at all a defect, he will set the status as Rejected.
As Per Design (this is a rare case): Whenever some new changes are incorporated by the developers, the test engineers will raise them as defects, but the developers will set the status as As Per Design.
Error: It is a problem related to the program.
Defect: If the test engineer identifies a problem with respect to the functionality, then it is called a defect.

Bug: If the developer accepts the defect, it is called a Bug. Fault / Failure: If the customer identifies the problem after delivery, it is called a Fault / Failure.

TEST CASE DESIGN TECHNIQUES
1. Boundary Value Analysis (BVA).
2. Equivalence Class Partitioning (ECP).

1). Boundary Value Analysis (BVA): Whenever the test engineers need to develop test cases for a range kind of input, they will go for Boundary Value Analysis, which says to concentrate on the boundaries of the range. Usually they test with the following values: LB-1, LB, LB+1, MV, UB-1, UB, UB+1 (LB = lower bound, MV = mid value, UB = upper bound).

2). Equivalence Class Partitioning (ECP): Whenever the test engineer needs to develop test cases for a feature which has many validations, one will go for Equivalence Class Partitioning, which says to first divide the inputs into classes (valid and invalid) and then prepare the test cases.

Ex: Develop the test cases for an E-Mail text box whose validations are as follows.
Requirements:
1. It should accept minimum 4 characters and maximum 20 characters.
2. It should accept only small (lowercase) characters.
3. It should accept @ and _ special symbols only.

Boundary Value Analysis:

LB-1   LB    LB+1   MV     UB-1   UB     UB+1
3ch    4ch   5ch    12ch   19ch   20ch   21ch

Equivalence Class Partitioning (ECP):

Valid: 4 char, 5 char, 12 char, 19 char, 20 char; a-z; @ and _.
Invalid: 3 char, 21 char; A-Z; 0-9; all the special symbols apart from @ and _; alphanumeric; blank space; decimal numbers.
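The seven BVA values for any range can be generated mechanically. A small sketch (the function name is mine), applied to the 4-to-20-character rule above:

```python
# Sketch: generate the seven standard BVA values for a range lb..ub.
def bva_values(lb, ub):
    """Return [LB-1, LB, LB+1, MV, UB-1, UB, UB+1]."""
    mv = (lb + ub) // 2  # mid value
    return [lb - 1, lb, lb + 1, mv, ub - 1, ub, ub + 1]

print(bva_values(4, 20))  # [3, 4, 5, 12, 19, 20, 21]
```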

Test Case Document:

Test Case ID   Test Case Type   Description                      Expected Value
1              +ve              Enter the value as per the VIT   It should accept.
2              -ve              Enter the value as per the IIT   It should not accept.

Valid Input Table (VIT):

Sl No   Input
1       abcd
2       ab@zx
3       abcdabcd@ab_
4       abcdabcddcbaaccd_@z
5       abcdabcdabcdabcdz@_x
6       abcdabcdabcdabcd_xyz

Invalid Input Table (IIT):

Sl No   Input
1       abc
2       ABCD
3       ABCD123
4       12345.5
5       abcd abcd abcd abcd
6       abcdabcd-----abc
7       *#)
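The three validation rules can be expressed as one small checker and run against entries from the tables above. A sketch, assuming the rules as stated (4-20 characters; only a-z, @ and _):

```python
import re

# Sketch: validator for the e-mail text box rules above
# (4..20 characters; only lowercase letters, '@' and '_').
def is_valid(value):
    return 4 <= len(value) <= 20 and re.fullmatch(r"[a-z@_]+", value) is not None

valid_inputs   = ["abcd", "ab@zx", "abcdabcd@ab_"]
invalid_inputs = ["abc", "ABCD", "ABCD123", "12345.5", "abcd abcd abcd abcd"]

print(all(is_valid(v) for v in valid_inputs))    # True
print(any(is_valid(v) for v in invalid_inputs))  # False
```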

Process of Developing Login Screen Test Cases.

Use Case: It is a description of the functionality of a certain feature of an application in terms of actors, actions and responses.

Preparation of Use Case: Input information required to prepare the use cases.

Snap Shot:

Functional Requirements:
1. Login screen should contain username, password and connect to fields, and login, clear and cancel buttons.
2. Connect to field is not a mandatory field, but it should allow the user to select a database option whenever he requests it, so that he can connect to the mentioned database while logging in.
3. Upon entering valid username, valid password and clicking on login button, the corresponding page must be displayed.
4. Upon entering some information into any of the fields and clicking on clear button, all the fields must be cleared and the cursor should be placed in the username field.
5. Upon clicking on cancel button, the login screen must be closed.

Special Requirements / Business Rules / Validations:
1. Initially, whenever the login screen is invoked (opened), the login and clear buttons must be disabled.
2. Cancel button must always be enabled.
3. Upon entering username and password, the login button must be enabled.
4. Upon entering some information into any of the fields, the clear button must be enabled.
5. The tabbing order must be: Username, Password, Connect to, Login, Clear and Cancel.
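The button-enabling rules above are pure functions of what has been typed, so they can be sketched directly. The function name and field parameters below are mine, for illustration only:

```python
# Sketch: button enabled/disabled state per the special requirements above.
def button_states(username, password, connect_to=""):
    any_input = bool(username or password or connect_to)
    return {
        "login":  bool(username and password),  # rule 3: needs both fields
        "clear":  any_input,                    # rule 4: any input enables clear
        "cancel": True,                         # rule 2: always enabled
    }

print(button_states("", ""))            # only cancel enabled (rules 1 and 2)
print(button_states("john", "secret"))  # login, clear and cancel enabled
```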

Use Case Template:
Name of the Use case :
Brief description of the Use case :
Actors involved :
Special Requirements :
Pre-Conditions :
Post-Conditions :
Flow of events :

Use Case Document:
Name of the Use case : Login Use Case
Brief description of the Use case : This use case describes the functionalities of all the features of the login screen.
Actors involved : Normal User, Admin User.

Special Requirements: There are two types of special requirements. 1. Implicit Requirements. 2. Explicit Requirements.

Implicit Requirements: The requirements that are analyzed and added by the business analyst, without being specified by the customer, in order to increase the value of the application are known as implicit requirements.

Explicit Requirements: The customer-specified special requirements are known as explicit requirements.

Implicit Requirements:
1. Whenever the login screen is invoked, initially the cursor should be placed in the username field.
2. Upon entering invalid username and valid password and clicking on login button, an error message should be displayed as follows.

"Invalid Username Please Try Again."
3. Upon entering valid username and invalid password and clicking on login button, an error message should be displayed as follows.
"Invalid Password Please Try Again."
4. Upon entering both invalid username and invalid password and clicking on login button, an error message should be displayed as follows.
"Invalid Username and Password Please Try Again."

Explicit Requirements:
1. Login screen should contain username, password and connect to fields, and login, clear and cancel buttons.
2. Connect to field is not a mandatory field, but it should allow the user to select a database option whenever he requests it, so that he can connect to the mentioned database while logging in.
3. Upon entering valid username, valid password and clicking on login button, the corresponding page must be displayed.
4. Upon entering some information into any of the fields and clicking on clear button, all the fields must be cleared and the cursor should be placed in the username field.
5. Upon clicking on cancel button, the login screen must be closed.

Pre-Conditions: Login screen must be available.
Post-Conditions: Either home page or admin page for valid users, and an error message for invalid users.

Flow of Events:

Main Flow:
Action: Actor invokes the application.
Response: Login screen is displayed with the following fields: username, password, connect to, login, clear and cancel.

Action: Actor enters valid username, valid password and clicks on login button.
Response: Authentication; either home page or admin page is displayed depending upon the actor entered.

Action: Actor enters valid username, valid password, selects a database option and clicks on login button.
Response: Authentication; either home page or admin page is displayed with the mentioned database connection, depending upon the actor entered.

Action: Actor enters invalid username, valid password and clicks on login button.
Response: Go to Alternative Flow Table 1.

Action: Actor enters valid username, invalid password and clicks on login button.
Response: Go to Alternative Flow Table 2.

Action: Actor enters invalid username and invalid password and clicks on login button.
Response: Go to Alternative Flow Table 3.

Action: Actor enters some information into any of the fields and clicks on the clear button.
Response: Go to Alternative Flow Table 4.

Action: Actor clicks on the cancel button.
Response: Go to Alternative Flow Table 5.

Alternative Flow Table 1 (Invalid Username):
Action: Actor enters invalid username, valid password and clicks on login button.
Response: Authenticates; an error message is displayed: "Invalid Username Please Try Again."

Alternative Flow Table 2 (Invalid Password):
Action: Actor enters valid username, invalid password and clicks on login button.
Response: Authenticates; an error message is displayed: "Invalid Password Please Try Again."

Alternative Flow Table 3 (Invalid Username and Password):
Action: Actor enters invalid username, invalid password and clicks on login button.
Response: Authenticates; an error message is displayed: "Invalid Username and Password Please Try Again."

Alternative Flow Table 4 (Clear Click):
Action: Actor enters some information into any of the fields and clicks on the clear button.
Response: All the fields are cleared and the cursor is placed in the username field.

Alternative Flow Table 5 (Cancel Click):
Action: Actor clicks on the cancel button.
Response: Login screen is closed.
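The main flow and alternative flows 1-3 can be condensed into one function. A sketch only: the credential store and its contents are hypothetical; just the response messages come from the tables above.

```python
# Sketch of the authentication decision behind the main/alternative flows.
USERS = {"admin": "secret123"}  # hypothetical valid username/password pair

def login(username, password):
    user_ok = username in USERS                 # is the username valid at all?
    pass_ok = password in USERS.values()        # is the password valid at all?
    if user_ok and USERS[username] == password:
        return "home/admin page"                                  # main flow
    if not user_ok and pass_ok:
        return "Invalid Username Please Try Again"                # alt flow 1
    if user_ok and not pass_ok:
        return "Invalid Password Please Try Again"                # alt flow 2
    return "Invalid Username and Password Please Try Again"       # alt flow 3

print(login("admin", "secret123"))  # home/admin page
print(login("guest", "wrongpw"))    # Invalid Username and Password Please Try Again
```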

BUG REPORTING

Defect-Id: The list of defect numbers is mentioned in this section.
Defect Description: What exactly the defect is, clearly described in this section.
Steps for Reproducibility: The list of all the steps that were followed by the test engineer to identify the defect.
Submitter: The name of the test engineer who submitted the defect.
Date of Submission: The date on which the defect was submitted.
Version No: The corresponding version number is mentioned in this section.
Build No: The corresponding build number is mentioned in this section.
Assigned To: The development lead fills in the name of the developer to whom the defect is assigned.
Severity: How serious the defect is, defined in terms of severity. Severity is classified into four types:
1. Fatal (Sev1, S1 or 1)
2. Major (Sev2, S2 or 2)
3. Minor (Sev3, S3 or 3)
4. Suggestion (Sev4, S4 or 4)
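The report fields above map naturally to a record type. A minimal sketch (field names and the sample values are mine, chosen to mirror the sections listed):

```python
from dataclasses import dataclass, field
from datetime import date

# Sketch: a defect report record with the sections described above.
@dataclass
class DefectReport:
    defect_id: str
    description: str
    steps_to_reproduce: list
    submitter: str
    date_of_submission: date
    version_no: str
    build_no: str
    severity: str                 # Fatal / Major / Minor / Suggestion
    assigned_to: str = ""         # filled in later by the development lead

d = DefectReport(
    "D-101", "Login button stays disabled after entering credentials",
    ["Open login screen", "Enter username and password", "Observe login button"],
    "tester1", date(2024, 1, 5), "1.0", "B3", "Major",
)
print(d.severity)  # Major
```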