
International Journal on Recent and Innovation Trends in Computing and Communication   ISSN: 2321-8169
Volume: 4, Issue: 7, pp. 197-200
Performance Analysis of a Web Application Using Standard Tools - A Case Study of the Paperless E-Governance System for Examinations of the University of Kerala

Bijamma Thomas*, Rajan Varughese**, V. Ajayakumar***

* Bijamma Thomas, Faculty Member, Mahatma Gandhi University, Kottayam, Kerala.
** Rajan Varughese, Director, Marthoma College of Management and Technology, Perumbavoor, Ernakulam, Kerala.
*** V. Ajayakumar, Formerly Director, Computer Centre, Kerala University, Thiruvananthapuram.

1.1 Introduction

The University Examination Information System provides the opportunity for its users to have timely information, and hence the reliability of such software systems needs to be assessed and improvements, if needed, are to be implemented so as to reduce the possibility of failures and congestion while the system is accessed. Early Software Reliability Models (SRMs) provided the expected number of faults discovered in time t. Several SRMs with different mean value functions have been proposed in the literature, as surveyed by Gokhale et al. [6]. The first study to link testing effort explicitly to the fault detection process dates from the early 1980s (Brooks and Motley [2]). Subraya [11] considered an integrated approach to the testing of web based systems and highlighted the need for performance testing. He developed an approach for integrating performance testing into web based systems after assessing the different phases of the automation process within the software development lifecycle, and provided guidelines and checklists. This has helped in the conduct and analysis of results when different testing tools are used on web based applications built on different technologies and platforms. Upsdell [12] observed that the average load time of home pages is about five (5) seconds for the world's top 120 retailers (www.upsdell.com). The Zonar research group [10] introduced the eight (8) second rule, which observes that a download time beyond eight seconds will not hold the user, and the user will bail out. Drake [4] addressed performance issues in the context of software development life cycle (SDLC) management as a performance testing life cycle, which is significant in the testing of web based systems. Software quality is based on the systematic and timely use of testing technology (Dunham [5]). Software development methodology for a given problem evolves based on test results, assessing errors in programs and continuously improving the different steps towards the required results using statistical techniques, as discussed by Montanari and Pighin [7]. Rosinberg et al. [9] demonstrated the use of quality metrics in the evaluation of software reliability. Bates [1] identifies testing as verifying whether the software accomplishes all the functions it was required to accomplish, that it does so consistently, and that its execution stays within acceptable performance constraints. The need to have an integrated system test under realistic conditions was discussed by Davis [3].

1.2 E-Governance of Universities: a futuristic view

The independent software applications of universities, such as systems managing the student life cycle, faculty and staff databases, finance, payroll, academic programmes, and departments/schools/research centres/affiliating institutions and colleges, may get integrated into comprehensive ERP solutions, as in industries and businesses, which will provide an efficient Decision Support System for universities. Such solutions can deliver efficient support services to the students, faculty, staff, management and the public. The best practices in administration and management will get introduced into the university administration processes, promoting an environment to change and redesign the existing processes to create more efficient and effective processes that benefit the end users.

The student life cycle management information system (MIS) would make it possible to track the current status of a student just as a dispatched mail item is tracked by an efficient courier service: from the stage of preliminary enquiry, submission of the application, performance in the entrance examination and selection process, rank list and admission, academic calendar and scheduling, studentship in one of the programmes, scholarships, fee remittance status and completion of the programme, through the evaluation process starting with registration for examinations, assessment and evaluation as per schedule, grades in the evaluation, status of award of degree/diploma and issue of transcript, to placement status.

2.1 Relevance of System Performance Assessment

The system is subjected to load and performance assessment to evaluate its accessibility and availability. Assessing the performance of application software helps in ensuring its quality. As the University moves into e-governance, the clients can be highly demanding on the quality of the applications in terms of functionality, ease of navigation and freedom from errors. Availability of the software over the internet makes it accessible anywhere on the globe, and hence such assessment is highly relevant in the web enabled environment to support the needs of the stakeholders.
2.2 Software performance testing

Load testing is the process of putting demand on a system or device and measuring its response, in order to assess the maximum operating capacity of the system; it is performed to determine a system's behavior under both normal and anticipated peak load conditions. It generally refers to the practice of modeling the expected usage of a software program by simulating multiple users accessing the program concurrently, subjecting the software to different numbers of virtual and live users. The load testing results provide insight into the various causes of slow performance.
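To make the mechanics concrete, the following is a minimal sketch of a load test that spawns concurrent virtual users, in the spirit of the description above. It is illustrative only: the target URL is a placeholder, and this sketch is not the tool used in the study.

# Minimal load-test sketch (illustrative only, not the tool used in this study).
# It simulates N concurrent virtual users, each issuing one HTTP GET, and
# records the outcome and response time of every request.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

TARGET_URL = "http://example.org/"  # placeholder, not the University system

def virtual_user(url: str) -> tuple[bool, float]:
    """Issue one request; return (success flag, elapsed seconds)."""
    start = time.perf_counter()
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            resp.read()
        return True, time.perf_counter() - start
    except Exception:
        return False, time.perf_counter() - start

def run_load_test(url: str, num_users: int) -> list[tuple[bool, float]]:
    """Launch num_users concurrent virtual users against url."""
    with ThreadPoolExecutor(max_workers=num_users) as pool:
        return list(pool.map(virtual_user, [url] * num_users))

if __name__ == "__main__":
    for users in (1, 5, 10, 25, 50):  # ramp up the simulated load
        results = run_load_test(TARGET_URL, users)
        ok = sum(1 for success, _ in results if success)
        print(f"{users:3d} users: {ok}/{users} succeeded")

Each virtual user here issues a single request; a fuller driver would run for a fixed duration and ramp the user count, as the test described in Section 3.3 does.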
3.1 Narration about the Software System

The University Software System is built with the aim of providing IT enabled services to students, staff and faculty of affiliated colleges and all other stakeholders regarding: (a) On-line enrolment system, (b) On-line announcement of exam results with photo and signature identification (ATI - Any Time Information System), (c) On-line announcement of college-wise results, (d) On-line official transcript, (e) On-line hall ticket, (f) On-line nominal rolls, (g) On-line exam registration for students, (h) On-line exam registration for college authorities, (i) On-line entry of ESA and CA marks, (j) On-line entry of students' attendance, (k) College-wise performance evaluation, (l) Course-wise performance evaluation, (m) Digital Enquiry System, and (n) Providing information to recruiting agencies.
3.2 Performance evaluation of the University System for examinations: procedure

The system performance is assessed for load and performance to evaluate the accessibility and availability. The performance of the university software is assessed using measurements of (a) Requests, (b) Errors, (c) Average Response Time and (d) Average Throughput, obtained by administering an open source testing tool named PYLOT. This test gives quantitative findings helpful for software process improvement at the activity level.
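For concreteness, the following is a minimal sketch of how a load driver can aggregate per-request records into the four measurements named above. The record format (success flag, response seconds, bytes received) is assumed for illustration; it is not Pylot's internal format.

# Minimal sketch of aggregating per-request records into the four
# measurements reported in this study. The record format is assumed
# for illustration and is not Pylot's internal format.
from dataclasses import dataclass

@dataclass
class RequestRecord:
    success: bool        # did the request complete without error?
    seconds: float       # response time of the request
    bytes_received: int  # payload size, used for throughput

def summarize(records: list[RequestRecord], duration_s: float) -> dict:
    """Compute Requests, Errors, Avg Response Time and Avg Throughput."""
    requests = len(records)
    errors = sum(1 for r in records if not r.success)
    avg_response = sum(r.seconds for r in records) / requests
    # Throughput here is bytes transferred per second of test duration.
    avg_throughput = sum(r.bytes_received for r in records) / duration_s
    return {
        "requests": requests,
        "errors": errors,
        "avg_response_time_s": round(avg_response, 3),
        "avg_throughput_Bps": round(avg_throughput, 3),
    }

# Example with two synthetic records over a 10-second run:
records = [RequestRecord(True, 0.42, 4757), RequestRecord(False, 1.10, 0)]
print(summarize(records, duration_s=10.0))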
3.3 Testing Report

The purpose of the test is to assess the congestion level and customer satisfaction in terms of the availability/accessibility of the system for the students. Graphs (reproduced as Figures 1-3 below) are used to illustrate the assessment of quality in terms of availability/accessibility of the application for a number of users varying from 1 to 250.

Comparison of report - DATA & GRAPHS

Agents   Requests   Errors   Avg. Throughput   Avg. Response Time
1        430        0        0.418             2.387
5        471        0        1.926             2.574
10       442        3        5.568             1.007
15       512        7        8.737             1.082
20       424        3        9.187             1.092
25       515        9        12.437            1.1
50       529        27       27.09             1.145
75       542        47       40.335            1.161
100      557        70       52.251            1.175
150      589        122      75.993            1.251
200      620        169      96.649            1.339
250      658        225      116.999           1.385

4.1 Analysis: Conclusion

As per the analysis, the measurements as the number of users increased from one to 200 are given above. The corresponding measurements for one hundred users and two hundred users are calculated using the linear formula y = mx + b:

Measurement       1 user   5 users   100 users   200 users
Hits per second   0.38     1.8       35.414      70.814
Throughput        4757     23602     470199.2    940419.2
Response Time     0.56     0.67      3.487       6.387
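The paper does not state how the slope m and intercept b were fitted, so the following minimal sketch assumes an ordinary least-squares fit to the measured points; it is illustrative, not the authors' exact procedure.

# Minimal sketch of the y = mx + b extrapolation used in Section 4.1.
# The paper does not state how m and b were fitted, so an ordinary
# least-squares fit over the measured points is assumed here.
def fit_line(xs: list[float], ys: list[float]) -> tuple[float, float]:
    """Return slope m and intercept b of the least-squares line y = mx + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    m = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - m * mean_x
    return m, b

# Measured hits per second at 1 and 5 users (from the table above).
users = [1.0, 5.0]
hits_per_second = [0.38, 1.8]
m, b = fit_line(users, hits_per_second)
for u in (100, 200):  # extrapolate to the loads reported in Section 4.1
    print(f"{u} users: predicted hits/s = {m * u + b:.3f}")

Note that this two-point fit reproduces the reported 100- and 200-user figures only approximately, which suggests the authors fitted the line over more of the measured data.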

4.2 Future Trends

The process of software testing is undergoing drastic changes, such as greater adoption of Service Oriented Architectures (SOA), Software as a Service (SaaS), web services, and wireless and mobile technologies, which leads to more agile approaches to software development and an increasing emphasis on the 4Rs: repeatability, reliability, re-use and robustness. Performance evaluation of software would change to accommodate these trends and become a business-led activity, which may result in test sourcing by using cloud services.

[Figure 1: Request Graph - No. of Requests vs. Users (0-300)]

[Figure 2: Error Graph - No. of errors vs. Users (0-300)]

[Figure 3: Average Throughput - Avg. Throughput vs. Users (0-300)]
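The three figures can be regenerated from the table in Section 3.3; a minimal sketch using matplotlib follows (matplotlib is assumed to be available; the study does not name the tool used to produce the original figures).

# Minimal sketch that regenerates the three graphs from the Section 3.3
# table. matplotlib is assumed to be installed; the study itself does not
# name the plotting tool used for the original figures.
import matplotlib.pyplot as plt

agents     = [1, 5, 10, 15, 20, 25, 50, 75, 100, 150, 200, 250]
requests   = [430, 471, 442, 512, 424, 515, 529, 542, 557, 589, 620, 658]
errors     = [0, 0, 3, 7, 3, 9, 27, 47, 70, 122, 169, 225]
throughput = [0.418, 1.926, 5.568, 8.737, 9.187, 12.437,
              27.09, 40.335, 52.251, 75.993, 96.649, 116.999]

for ys, title, ylabel in [
    (requests, "Request Graph", "No. of Requests"),
    (errors, "Error Graph", "No. of errors"),
    (throughput, "Average Throughput", "Avg. Throughput"),
]:
    plt.figure()
    plt.plot(agents, ys, marker="o")
    plt.title(title)
    plt.xlabel("Users")
    plt.ylabel(ylabel)
    plt.savefig(title.replace(" ", "_").lower() + ".png")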

References:
[1] Bates, D. (Ed.) (1977), Software Engineering Techniques, InfoTech State of the Art Report.
[2] Brooks, D., and Motley, W. (1980), Analysis of Discrete Software Reliability Models, Technical Report, New York.
[3] Davis, C. G. (1979), InfoTech State of the Art Report: Software Testing, Vol. 1, Analysis and Bibliography, p. 25.
[4] Drake, T. (2004), Testing software based systems: The final frontier. Retrieved December 8, 2004, from http://www.softwaretechnews.com/stn3-3/final-frontier.html
[5] Dunham, J. R. (1989), V&V in the next decade, IEEE Software, May 1989.
[6] Gokhale et al. (1999), Annals of Software Engineering, Vol. 8, pp. 85-121.
[7] Montanari, A., and Pighin, M. (1994), Identification of experimental parameters for automatic software evaluation and testing, Software Quality Management, Vol. II, p. 335.
[8] Richard Fairley (1997), Software Engineering Concepts, Tata McGraw Hill Edition.
[9] Rosinberg, L., Hammer, T., and Shaw, J. (1998), NASA GSFC, Greenbelt, MD 20771, USA.
[10] The Zonar Research Group (2003).
[11] Subraya, B. M. (2006), Integrated Approach to Web Performance Testing: A Practitioner's Guide, IRM Press.
[12] Upsdell (2003), www.upsdell.com and http://also.co.uk/does/speed.pdf

