
1. Introduction
1.1 Overview
Before we delve into Virtual PC, let us go through the basic concept it builds on: virtualization. Virtualization is driving much of the progress in present-day computing. It is natural human behaviour to stick to things one has been using for a considerable amount of time, and the same holds for the computing environment people work with. Users, especially beginners, prefer the interface they are used to and often find working in a new environment quite tough. The concept of virtualization has tackled this problem to a certain extent.

Virtualization aims at running the operating system a user wishes on top of an already existing PC. For the common masses, moving to a new system is a tough and error-prone task. These problems can be reduced with a virtual operating system that separates the user's computing environment from the underlying operating system.

Think of a PC that does all the major tasks of a user even when the machine is not owned by him, while giving him the impression that he is working on his own customized desktop. The relevance of this application lies in portable access to a user's files and the ability to work in an environment he sets up himself. This is a web-application project aiming to create a platform on which any user can create his own virtual PC whenever he is connected to the net.

The idea for this project came to us when the file-sharing website www.liondrive.com was deliberately banned by its administrators, causing all uploaded files to be lost. Such applications are especially useful to ordinary people who prefer a working environment they are used to. The main aim of a Virtual PC is portable access to files through a user interface that behaves like an operating system rather than a web page.
1.2 Problem Statement
What is a Virtual PC?

The Virtual PC we plan to create targets the common masses. We know there are many file-sharing and mail websites that aim at the same portability as our Virtual PC, but those sites do not provide the kind of interface that ordinary people prefer, and their conventional user interfaces leave a beginner somewhat inefficient. We therefore aim to create a web page that gives the user the look of a customized operating system of his choice, on which he can perform operations just as he would on his home desktop, without ever getting the impression that it is a web page loaded from a server. In brief, it is virtualization of the operating system in the form of a web page.

We know an operating system is the prime component in managing tasks, including process management, file management and device management. Looking at these features, it is clear that there is a limit to what our Virtual PC can cover, since not all the tasks an OS performs can be included in this application; however, all the essential look-and-feel features, as well as management of certain important tasks, are included and discussed in detail in the next section.

2. Literature Survey
There are many sites that offer file-sharing applications, such as www.4shared.com, www.pageflakes.com and www.linkup.com. There is also GDrive, developed by Google, a file-sharing service used worldwide. A webOS differs from these in that it has a user interface similar to that of the operating system on our PC. We are aiming to create a web page that resembles a personal desktop, with facilities to manage all important work, where the boundary between the virtual OS and the web page is transparent to the user.

There has been a lot of research in this area, and it has proved to be a tough task. Giving a user a virtual view of an operating system is much harder than supporting the individual operations the user performs.

We had an idea of how challenging the source-code development would be, since even a simple operation may require a well-developed algorithm. We have therefore planned to use C# and ASP.NET for the development of this application.

3. Requirement Specification
3.1 Introduction

Requirements analysis, in systems engineering and software engineering, encompasses the tasks that go into determining the needs or conditions to meet for a new or altered product, taking account of the possibly conflicting requirements of the various stakeholders, such as beneficiaries or users.

Requirements analysis is critical to the success of a development project.[2] Requirements must be actionable, measurable, testable, related to identified business needs or opportunities, and defined to a level of detail sufficient for system design.

Conceptually, requirements analysis includes three types of activity:

• Eliciting requirements: the task of communicating with customers and users to determine what their requirements are. This is sometimes also called requirements gathering.
• Analyzing requirements: determining whether the stated requirements are unclear,
incomplete, ambiguous, or contradictory, and then resolving these issues.
• Recording requirements: Requirements might be documented in various forms, such as
natural-language documents, use cases, user stories, or process specifications.

Requirements analysis can be a long and arduous process during which many delicate
psychological skills are involved. New systems change the environment and relationships
between people, so it is important to identify all the stakeholders, take into account all their
needs and ensure they understand the implications of the new systems. Analysts can employ
several techniques to elicit the requirements from the customer. Historically, this has included
such things as holding interviews, or holding focus groups (more aptly named in this context as
requirements workshops) and creating requirements lists. More modern techniques include
prototyping, and use cases. Where necessary, the analyst will employ a combination of these
methods to establish the exact requirements of the stakeholders, so that a system that meets the
business needs is produced.

3.2 Hardware Requirement Specification


The selection of hardware is very important to the existence and proper working of any software. When selecting hardware, size and capacity requirements are also important.

Processor       : Pentium IV

Speed           : above 500 MHz

RAM capacity    : 1 GB

Hard disk drive : 180 GB

Keyboard        : Samsung 108 keys

Mouse           : Logitech optical

CD writer       : LG 52×

Motherboard     : Intel

3.3 Software Requirement Specification


One of the most difficult tasks is selecting software: once the system requirements are found, we have to determine whether a particular software package fits those requirements. This section summarizes the application requirements.

Operating System : Windows 2000 & above

Front End : C#.Net

Back End : SQL Server 2000


4. Feasibility Study
4.1 Introduction

A feasibility study is a test of the proposed system regarding its workability, its impact, its ability to meet user needs and its effective use of resources. Before any user request is accepted, it is mandatory to check whether the new system is feasible. The major purpose of the analysis is to see whether the development is technically and operationally helpful to the organization. Thus, when a new application is proposed, it normally goes through a feasibility study before it is approved for development.


4.3 Technical Feasibility

A study of resource availability may affect the ability to achieve an acceptable system. Technical feasibility is the most important area to ensure at this stage, because objectives, performance and functions are somewhat hazy, and anything seems possible if the right assumptions are made. It is essential that the process of analysis and definition be conducted in parallel with an assessment of technical feasibility, which includes the resource availability of the organization where the project is to be developed and implemented.

4.4 Operational Feasibility

Operational feasibility is a consideration of how the system will work after its installation. The computerization will increase the operational efficiency of the employees.
4.5 Behavioral Feasibility

An estimate should be made of how strong a reaction the user staff is likely to have toward the development of a computerized system. It is common knowledge that computer installations affect the way staff work; it is therefore understandable that the introduction of a candidate system requires special effort to educate, sell and train the staff in new ways of conducting business.

4.6 Economical Feasibility

Economic feasibility is an evaluation of development cost weighed against the ultimate income or benefit derived from the developed system. Economic justification is generally the "bottom line" consideration. For most systems, economic justification includes a broad range of concerns; a cost-benefit analysis delineates the costs of project development and weighs them against the tangible (i.e., directly measurable) and intangible benefits of the system.

5. Proposed System Design
5.1 Data Flow Diagram
The data flow diagram (DFD) is one of the most important tools used by the system analyst. Data flow diagrams are made up of a number of symbols that represent system components. Most data flow modelling methods use four kinds of symbols, representing four kinds of system components: processes, data stores, data flows and external entities. Processes are represented by circles, a data flow by a thin line, each data store (which has a unique name) by an open-ended box, and external entities by squares or rectangles. Unlike detailed flow charts, data flow diagrams do not supply detailed descriptions of the modules; they graphically describe a system's data and how the data interact with the system.

To construct a data flow diagram, we use


• Arrow
• Circles
• Open End Box
• Squares

An arrow identifies data flow in motion; it is a pipeline through which information flows, like the flow line in a flow chart. A circle stands for a process that converts data into information. An open-ended box represents a data store: data at rest, or a temporary repository of data. A square defines a source or destination of system data.

Rules for constructing a data flow diagram:

• Arrows should not cross each other.
• Squares, circles and files must bear names.
• Decomposed data flow squares and circles can have the same names.
• Choose meaningful names for data flows.
• Draw all data flows around the outside of the diagram.

LEVEL 0
[Figure: context diagram. The USER exchanges requests and responses with the VIRTUAL OS.]

LEVEL 1 FOR USER
[Figure: the user registers (writing to the Registration file) and logs in, then sends requests to and receives responses from the FILE SYSTEM, on which he can upload, download, cut, copy, paste, delete and rename files.]

LEVEL 1 FOR ADMIN
[Figure: the admin follows the same registration and login flow and, in addition to the file operations above, performs user management and approval on the FILE SYSTEM.]
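The file operations that appear in the level-1 diagrams (upload, download, cut, copy, paste, delete and rename) can be sketched as methods over a per-user directory. The project itself is built in C# and ASP.NET; the sketch below uses Python purely for brevity, and every class and method name in it is illustrative rather than taken from our source.

```python
import os
import shutil
import tempfile


class UserFileSystem:
    """Illustrative per-user file store backing the DFD file operations."""

    def __init__(self, root):
        self.root = os.path.abspath(root)
        os.makedirs(self.root, exist_ok=True)

    def _path(self, name):
        # Confine every operation to the user's own directory.
        full = os.path.abspath(os.path.join(self.root, name))
        if not full.startswith(self.root + os.sep):
            raise ValueError("path escapes user directory")
        return full

    def upload(self, name, data):
        with open(self._path(name), "wb") as f:
            f.write(data)

    def download(self, name):
        with open(self._path(name), "rb") as f:
            return f.read()

    def copy(self, src, dst):
        shutil.copyfile(self._path(src), self._path(dst))

    def cut(self, src, dst):
        # "cut" followed by "paste" amounts to a move.
        shutil.move(self._path(src), self._path(dst))

    def rename(self, old, new):
        os.rename(self._path(old), self._path(new))

    def delete(self, name):
        os.remove(self._path(name))
```

In the real application each method would be invoked by a server-side request handler, with the root directory chosen from the logged-in user's account.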

5.2 Database Design


The overall objective in the development of database technology has been to treat data as an organizational resource and as an integrated whole. A database management system allows data to be protected and organized separately from other resources; a database is an integrated collection of data, and this is the difference between logical and physical data.

The structure of data has always been an important part of design. At the program-component level, the design of data structures and the associated algorithms required to manipulate them is essential to the creation of high-quality applications. At the application level, the transition of a data model into a database is pivotal to achieving the business objective of the system. A database is a collection of interrelated data stored with minimum redundancy to serve usage more quickly and efficiently.
The main objectives covered in database design are:

• Controlled redundancy
• Data integration
• Data independence
• Accuracy and integrity
• Privacy and security
• Performance
5.3 Table Design

Table Name: Registration

Sl. No  Field Name        Data Type  Allow Null  Description
1       Name              Varchar    yes
2       Address           Varchar    yes
3       UserName          Varchar    no          Unique value
4       Password          Varchar    yes
5       SecurityQuestion  Varchar    yes
6       Answer            Varchar    yes
7       PhoneNumber       Varchar    yes
8       Email             Varchar    yes
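The table design above maps directly onto a CREATE TABLE statement. The project's back end is SQL Server 2000; the self-contained sketch below approximates it with Python's built-in sqlite3 module, and the column lengths are illustrative assumptions, not values from the design.

```python
import sqlite3

# In-memory stand-in for the SQL Server back end. Column names follow the
# Registration table design; VARCHAR lengths are assumed for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE Registration (
        Name             VARCHAR(50),
        Address          VARCHAR(100),
        UserName         VARCHAR(50) NOT NULL UNIQUE,  -- "Allow Null: no, unique value"
        Password         VARCHAR(50),
        SecurityQuestion VARCHAR(100),
        Answer           VARCHAR(50),
        PhoneNumber      VARCHAR(20),
        Email            VARCHAR(50)
    )
""")
conn.execute(
    "INSERT INTO Registration (Name, UserName, Password) VALUES (?, ?, ?)",
    ("Alice", "alice", "secret"),
)
row = conn.execute(
    "SELECT Name FROM Registration WHERE UserName = ?", ("alice",)
).fetchone()
```

The NOT NULL UNIQUE constraint on UserName is what enforces the "no / unique value" entry in the table above: a second registration with the same user name is rejected by the database itself.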
5.4 Use Case Diagram

[Figure: use case diagram. The User actor is linked to Login, Registration, Change Password, Access Shared Folder, Manage Files and Folders, Access their own Location, and Upload Files to Specific Location. The Admin actor is linked to Login, Manage All Users, and Access All Files and Folders.]

6. Implementation
7. Testing
7.1 Black Box Testing
Black box testing takes an external perspective of the test object to derive test cases. These tests can be functional or non-functional, though they are usually functional. The test designer selects valid and invalid inputs and determines the correct output; there is no knowledge of the test object's internal structure.

Also known as functional testing, it is a software testing technique whereby the internal workings of the item being tested are not known by the tester. For example, in a black box test on a software design, the tester only knows the inputs and what the expected outcomes should be, not how the program arrives at those outputs. The tester never examines the programming code and needs no knowledge of the program beyond its specifications.

The advantages of this type of testing include:

• The test is unbiased because the designer and the tester are independent of each other.
• The tester does not need knowledge of any specific programming languages.
• The test is done from the point of view of the user, not the designer.
• Test cases can be designed as soon as the specifications are complete.

The disadvantages of this type of testing include:

• The test can be redundant if the software designer has already run a test case.
• The test cases are difficult to design.
• Testing every possible input stream is unrealistic because it would take an inordinate amount of time; therefore, many program paths will go untested.
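
As an illustration, a black box test can be written from the specification alone. The validator below is a hypothetical stand-in, not code from this project; the test cases are chosen purely from its stated inputs and expected outcomes, never from its internals.

```python
# Hypothetical specification: a username is valid when it is 3-20
# characters long and entirely alphanumeric. The tests below treat the
# function as a black box, checking only inputs against expected outputs.
def is_valid_username(name):
    return 3 <= len(name) <= 20 and name.isalnum()


# Valid and invalid inputs derived from the specification alone.
cases = [
    ("abc", True),         # shortest valid name
    ("a" * 20, True),      # longest valid name
    ("ab", False),         # too short
    ("a" * 21, False),     # too long
    ("user name", False),  # space is not alphanumeric
]
results = [is_valid_username(name) == expected for name, expected in cases]
```

Note that the test designer never needed to see how is_valid_username is written, which is exactly the independence claimed in the advantages above.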

7.2 White Box Testing


Structure testing is also referred to as white box testing. It is a test-case design method that uses the control structure of the procedural design to derive test cases. In this system, unit testing has been handled successfully: test data was given to each and every module in all respects, the desired output was obtained, and each module was tested and found to be working properly.

7.3 Basis Path Testing


Basis path testing is a white box testing technique proposed by Tom McCabe. These tests guarantee that every statement in the program is executed at least once during testing. The basis set is the set of independent execution paths through a procedure. The technique enables the tester to derive a logical-complexity measure of a procedural design and to use this measure as a guide for defining a basis set of execution paths; test cases derived to exercise the basis set are guaranteed to execute every statement during testing. Flow graph notation is a useful tool for understanding control flow and illustrating the approach. Cyclomatic complexity is a software metric that provides a quantitative measure of the logical complexity of a program: the value computed for cyclomatic complexity defines the number of independent paths in the basis set of a program and provides an upper bound on the number of tests that must be conducted to ensure that all statements are exercised at least once.
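
For a connected flow graph with E edges and N nodes, cyclomatic complexity is V(G) = E − N + 2. The snippet below computes it for a small, made-up flow graph (an if/else inside a loop); the graph is illustrative, not drawn from this project.

```python
# Flow graph of a hypothetical routine: node 1 is the entry, node 2 a
# decision, nodes 3 and 4 its branches, node 5 the loop test, node 6 the
# exit. Each pair (a, b) is a directed edge from node a to node b.
edges = [
    (1, 2),          # entry -> decision
    (2, 3), (2, 4),  # true and false branches
    (3, 5), (4, 5),  # branches rejoin at the loop test
    (5, 2),          # loop back to the decision
    (5, 6),          # exit
]
nodes = {n for edge in edges for n in edge}

# V(G) = E - N + 2: the number of independent paths in the basis set,
# and the upper bound on tests needed to cover every statement.
v_g = len(edges) - len(nodes) + 2
```

Here E = 7 and N = 6, so V(G) = 3, which matches counting the two predicate nodes (the decision and the loop test) plus one.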
7.4 Equivalence Partition Testing
Equivalence partitioning is a software testing technique that divides the input data of a software unit into partitions from which test cases can be derived. In principle, test cases are designed to cover each partition at least once. The technique tries to define test cases that uncover classes of errors, thereby reducing the total number of test cases that must be developed.

Equivalence partitioning is thus a method for deriving test cases in which classes of input conditions, called equivalence classes, are identified such that each member of a class causes the same kind of processing and output to occur. A class is a set of input conditions that is likely to be handled the same way by the system: if the system were to handle one case in the class erroneously, it would handle all cases in that class erroneously. An equivalence partition is a group of test cases that test the same thing or produce the same bug (or not); we usually use this test-design technique to reduce test cases (and test time) while assuring the best coverage.
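
A minimal sketch of the idea, using a hypothetical age field (not a field from this project): the input domain splits into four equivalence classes, and one representative per class stands in for every other member.

```python
# Hypothetical unit under test: classify an age, treating anything outside
# 0..120 as invalid. All members of one class follow the same code path.
def classify_age(age):
    if age < 0 or age > 120:
        return "invalid"
    return "minor" if age < 18 else "adult"


# One representative per equivalence class, instead of testing every value.
partitions = {
    "invalid-low":  -5,    # below the valid range
    "minor":         10,   # 0..17
    "adult":         35,   # 18..120
    "invalid-high":  200,  # above the valid range
}
results = {name: classify_age(rep) for name, rep in partitions.items()}
```

Four test cases cover the whole integer input domain: if the unit mishandled any member of a class, it would by construction mishandle the chosen representative too.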

7.5 Stress Testing


Stress testing, which exercises the system beyond its normal operating conditions, was carried out. This testing attempts to find errors in the following categories:
 Incorrect or missing functions
 Interface errors
 Errors in data structures or external database access
 Performance errors
 Initialization and termination errors

8. Screen Layout

9. Conclusion

Looking at it from the user's point of view, the only requirement is data abstraction: he or she just wants to work in an environment of his or her own choosing. A small operation like opening a file might look as simple as it seems, but when developed as part of a virtual PC, every small operation has to be handled with prime attention. The virtual PC we plan to develop is obviously devoid of certain small applications or tasks that are present in a real OS but irrelevant to our aim. In short, we aim to develop a good, user-friendly application that helps a lot with efficient file sharing.

The website was implemented, tested with real data and found to be error free. The system is also protected from unauthorized access. All the necessary validations are carried out in this project, so that any kind of user can make use of this software, and the necessary messages make users conscious of any errors they have made.
