
ABSTRACT

The project Call Center Management automates the operations of a call center by replying to customer queries. By adding more entries to the database store, the application can respond to a larger number of queries from customers. The emphasis is on giving a correct reply to each input query. The project receives queries from the various customers and stores them in a centralized data store. When several queries arrive, they are held in a queue and processed one by one. There are separate blocks: a data recognizer for recognizing the data (the queries), and a data interpreter for interpreting those queries. An input query from the customer is first recognized by the data recognizer by comparing it with the entries in the database store, where the solution for each and every query is stored and maintained. The query is then interpreted to determine what type of query it is and how it should be answered. The input query is compared with the queries in the database store and the solution for it is found. The information service switch switches the application between different types of distributed services. The final result, the reply to the customer's query, is obtained at the end. The project is developed in the .NET environment; the server-side scripting for the website is done in VB.NET, with the data stored in a SQL Server database.

1. INTRODUCTION

1.1 ABOUT THE PROJECT


The project Call Center Management automates the operations of a call center by replying to customer queries. By adding more entries to the database store, the application can respond to a larger number of queries from customers. The emphasis is on giving a correct reply to each input query. A phone call service is provided to send information from the call center executive to the customer. In turn, customers can also send mails to the company seeking clarification of their doubts about the product details and the company details. The objective of the project is to computerize the calls of an organization depending on the type of call. It is mainly designed to avoid the drawbacks involved in the manual process.

MODULES:

Employee Details
This module gets and gives the necessary information from the employee details and then makes a valid entry. It holds information such as employee name, employee number, first name, last name, registration date, and close date.

Complaint Details
This module registers all the customer complaints with the customer name, customer address, complaint date, product id, and status. All the payments receivable through the transactions are entered in this module.

Customer Registration
This module consists of customer registration details such as notification number, customer number, name, status, zone, product, call type, call details, service request, priority, remarks, and sign by. A sketch of how such a record might be modeled follows.
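As a minimal sketch only (the class and field names here are illustrative, not taken from the project's actual source code), a record handled by the complaint details module might be represented in VB.NET as:

Public Class ComplaintDetail
    ' Hypothetical data holder for one complaint record; in the real
    ' project these fields are stored directly in SQL Server tables.
    Public CustomerName As String
    Public CustomerAddress As String
    Public ComplaintDate As Date
    Public ProductId As Integer
    Public Status As String
End Class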

1.2 ORGANIZATION PROFILE

2. SYSTEM SPECIFICATION
2.1. HARDWARE CONFIGURATION
Processor     : 2.8 GHz Intel Pentium 4
Memory        : 240 MB
Network Card  : Intel(R) PRO/100 VE
Keyboard      : 108 keys
IP Address    : 10.1.1.61
Mouse         : HCL Optical
Hard Disk     : 250 GB
Monitor       : 15 inch color monitor

2.2. SOFTWARE CONFIGURATION


Operating Environment : Microsoft Windows XP
Front end             : VB.NET
Back end              : SQL Server 2005

2.3 LANGUAGE DESCRIPTION


The .NET Framework is a new computing platform that simplifies application development in the highly distributed environment of the Internet. The .NET Framework is designed to fulfill the following objectives:

- To provide a consistent object-oriented programming environment whether object code is stored and executed locally, executed locally but Internet-distributed, or executed remotely.
- To provide a code-execution environment that minimizes software deployment and versioning conflicts.
- To provide a code-execution environment that guarantees safe execution of code, including code created by an unknown or semi-trusted third party.
- To provide a code-execution environment that eliminates the performance problems of scripted or interpreted environments.
- To make the developer experience consistent across widely varying types of applications, such as Windows-based applications and Web-based applications.
- To build all communication on industry standards to ensure that code based on the .NET Framework can integrate with any other code.

ADO.NET Overview

ADO.NET is an evolution of the ADO data access model that directly addresses user requirements for developing scalable applications. It was designed specifically for the web with scalability, statelessness, and XML in mind. ADO.NET uses some ADO objects, such as the Connection and Command objects, and also introduces new objects. Key new ADO.NET objects include the DataSet, DataReader, and DataAdapter.

The important distinction between this evolved stage of ADO.NET and previous data architectures is that there exists an object, the DataSet, that is separate and distinct from any data stores. Because of that, the DataSet functions as a standalone entity. You can think of the DataSet as an always disconnected recordset that knows nothing about the source or destination of the data it contains. Inside a DataSet, much like in a database, there are tables, columns, relationships, constraints, views, and so forth.

A DataAdapter is the object that connects to the database to fill the DataSet. Then, it connects back to the database to update the data there, based on operations performed while the DataSet held the data. In the past, data processing has been primarily connection-based. Now, in an effort to make multi-tiered apps more efficient, data processing is turning to a message-based approach that revolves around chunks of information. At the center of this approach is the DataAdapter, which provides a bridge to retrieve and save data between a DataSet and its source data store. It accomplishes this by means of requests to the appropriate SQL commands made against the data store.

The XML-based DataSet object provides a consistent programming model that works with all models of data storage: flat, relational, and hierarchical. It does this by having no 'knowledge' of the source of its data, and by representing the data that it holds as collections and data types. No matter what the source of the data within the DataSet is, it is manipulated through the same set of standard APIs exposed through the DataSet and its subordinate objects.

While the DataSet has no knowledge of the source of its data, the managed provider has detailed and specific information. The role of the managed provider is to connect, fill, and persist the DataSet to and from data stores. The OLE DB and SQL Server .NET Data Providers (System.Data.OleDb and System.Data.SqlClient) that are part of the .NET Framework provide four basic objects: the Command, Connection, DataReader, and DataAdapter. In the remaining sections of this document, we'll walk through each part of the DataSet and the OLE DB/SQL Server .NET Data Providers, explaining what they are and how to program against them.

The following sections will introduce you to some objects that have evolved, and some that are new. These objects are:

- Connections: for connecting to and managing transactions against a database.
- Commands: for issuing SQL commands against a database.
- DataReaders: for reading a forward-only stream of data records from a SQL Server data source.
- DataSets: for storing, remoting, and programming against flat data, XML data, and relational data.
- DataAdapters: for pushing data into a DataSet and reconciling data against a database.

Commands

Commands contain the information that is submitted to a database, and are represented by provider-specific classes such as SqlCommand. A command can be a stored procedure call, an UPDATE statement, or a statement that returns results. You can also use input and output parameters, and return values, as part of your command syntax. An example of issuing an INSERT statement against the Northwind database is sketched after the DataReaders description below.

DataReaders

The DataReader object is somewhat synonymous with a read-only, forward-only cursor over data. The DataReader API supports flat as well as hierarchical data. A DataReader object is returned after executing a command against a database. The format of the returned DataReader object is different from a recordset. For example, you might use the DataReader to show the results of a search list in a web page.
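A minimal sketch of both ideas, assuming a local Northwind database and an illustrative connection string (adjust the server name and credentials for your setup):

Imports System.Data.SqlClient

Module CommandDemo
    Sub Main()
        ' Connection details are illustrative assumptions, not the project's.
        Dim connStr As String = "Server=(local);Database=Northwind;Integrated Security=True"
        Using conn As New SqlConnection(connStr)
            conn.Open()

            ' Issue a parameterized INSERT through a SqlCommand.
            Dim insertCmd As New SqlCommand( _
                "INSERT INTO Shippers (CompanyName, Phone) VALUES (@name, @phone)", conn)
            insertCmd.Parameters.AddWithValue("@name", "Speedy Express 2")
            insertCmd.Parameters.AddWithValue("@phone", "(503) 555-0100")
            insertCmd.ExecuteNonQuery()

            ' Read the rows back through a forward-only DataReader.
            Dim selectCmd As New SqlCommand("SELECT CompanyName, Phone FROM Shippers", conn)
            Dim reader As SqlDataReader = selectCmd.ExecuteReader()
            While reader.Read()
                Console.WriteLine("{0}  {1}", reader("CompanyName"), reader("Phone"))
            End While
            reader.Close()
        End Using
    End Sub
End Module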

DataSets and DataAdapters

DataSets

The DataSet object is similar to the ADO Recordset object, but more powerful, and with one other important distinction: the DataSet is always disconnected. The DataSet object represents a cache of data, with database-like structures such as tables, columns, relationships, and constraints. However, though a DataSet can and does behave much like a database, it is important to remember that DataSet objects do not interact directly with databases or other source data. This allows the developer to work with a programming model that is always consistent, regardless of where the source data resides. Data coming from a database, an XML file, code, or user input can all be placed into DataSet objects. Then, as changes are made to the DataSet, they can be tracked and verified before updating the source data. The GetChanges method of the DataSet object actually creates a second DataSet that contains only the changes to the data. This DataSet is then used by a DataAdapter (or other objects) to update the original data source.

The DataSet has many XML characteristics, including the ability to produce and consume XML data and XML schemas. XML schemas can be used to describe schemas interchanged via web services. In fact, a DataSet with a schema can actually be compiled for type safety and statement completion.

DataAdapters (OLE DB/SQL)

The DataAdapter object works as a bridge between the DataSet and the source data. Using the provider-specific SqlDataAdapter (along with its associated SqlCommand and SqlConnection) can increase overall performance when working with a Microsoft SQL Server database. The DataAdapter object uses commands to update the data source after changes have been made to the DataSet. Calling the Fill method of the DataAdapter executes the SELECT command; calling the Update method executes the INSERT, UPDATE, or DELETE command for each changed row. You can explicitly set these commands in order to control the statements used at runtime to resolve changes, including the use of stored procedures. For ad-hoc scenarios, a CommandBuilder object can generate these at run time based upon a SELECT statement. However, this run-time generation requires an extra round trip to the server in order to gather the required metadata, so explicitly providing the INSERT, UPDATE, and DELETE commands at design time will result in better run-time performance. A sketch of this Fill/Update cycle follows.
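A small sketch of the Fill/Update cycle described above; the connection string and the choice of the Northwind Shippers table are illustrative assumptions:

Imports System.Data
Imports System.Data.SqlClient

Module DataAdapterDemo
    Sub Main()
        ' Illustrative connection details; adjust for your environment.
        Dim connStr As String = "Server=(local);Database=Northwind;Integrated Security=True"
        Dim adapter As New SqlDataAdapter( _
            "SELECT ShipperID, CompanyName, Phone FROM Shippers", connStr)

        ' For ad-hoc scenarios, the CommandBuilder derives the INSERT,
        ' UPDATE, and DELETE commands from the SELECT; explicit commands
        ' set at design time would perform better at run time.
        Dim builder As New SqlCommandBuilder(adapter)

        ' Fill executes the SELECT and loads a disconnected DataSet.
        Dim ds As New DataSet()
        adapter.Fill(ds, "Shippers")

        ' Work against the cached data while disconnected.
        ds.Tables("Shippers").Rows(0)("Phone") = "(503) 555-0199"

        ' Update reconnects and applies each changed row to the database.
        adapter.Update(ds, "Shippers")
    End Sub
End Module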

To summarize:
1. ADO.NET is the next evolution of ADO for the .NET Framework.
2. ADO.NET was created with n-tier, statelessness, and XML in the forefront. Two new objects, the DataSet and DataAdapter, are provided for these scenarios.
3. ADO.NET can be used to get data from a stream, or to store data in a cache for updates. There is a lot more information about ADO.NET in the documentation.
4. Remember, you can execute a command directly against the database in order to do inserts, updates, and deletes. You don't need to first put data into a DataSet in order to insert, update, or delete it.
5. Also, you can use a DataSet to bind to the data, move through the data, and navigate data relationships.

SQL SERVER

A database management system, or DBMS, gives the user access to their data and helps them transform the data into information. Such database management systems include dBase, Paradox, IMS, and SQL Server. These systems allow users to create, update, and extract information from their databases. A database is a structured collection of data. Data refers to the characteristics of people, things, and events. SQL Server stores each data item in its own field. In SQL Server, the fields relating to a particular person, thing, or event are bundled together to form a single complete unit of data, called a record (it can also be referred to as a row or an occurrence). Each record is made up of a number of fields. No two fields in a record can have the same field name.

During a SQL Server database design project, the analysis of your business needs identifies all the fields or attributes of interest. If your business needs change over time, you define any additional fields or change the definition of existing fields.

SQL Server Tables

SQL Server stores records relating to each other in a table. Different tables are created for the various groups of information. Related tables are grouped together to form a database.

Primary Key

Every table in SQL Server has a field or a combination of fields that uniquely identifies each record in the table. This unique identifier is called the primary key, or simply the key. The primary key provides the means to distinguish one record from all others in a table. It allows the user and the database system to identify, locate, and refer to one particular record in the database.

Relational Database

Sometimes all the information of interest to a business operation can be stored in one table. SQL Server makes it very easy to link the data in multiple tables. Matching an employee to the department in which they work is one example. This is what makes SQL Server a relational database management system, or RDBMS: it stores data in two or more tables and enables you to define relationships between the tables.

Foreign Key

When a field in one table matches the primary key of another table, that field is referred to as a foreign key. A foreign key is a field or a group of fields in one table whose values match those of the primary key of another table. A small illustration follows.
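For illustration only (the tables and database name here are hypothetical, not the project's actual schema), a primary key and foreign key can be declared when the tables are created, with the DDL executed from VB.NET:

Imports System.Data.SqlClient

Module SchemaDemo
    Sub Main()
        Dim ddl As String = _
            "CREATE TABLE Department (" & _
            "  DeptId INT PRIMARY KEY," & _
            "  DeptName VARCHAR(50));" & _
            "CREATE TABLE Employee (" & _
            "  EmpId INT PRIMARY KEY," & _
            "  EmpName VARCHAR(50)," & _
            "  DeptId INT FOREIGN KEY REFERENCES Department(DeptId));"

        ' Illustrative connection details; adjust for your environment.
        Using conn As New SqlConnection("Server=(local);Database=CallCenter;Integrated Security=True")
            conn.Open()
            ' With the constraint in place, SQL Server rejects any Employee
            ' row whose DeptId has no matching Department row, which is how
            ' referential integrity is maintained between the two tables.
            Dim cmd As New SqlCommand(ddl, conn)
            cmd.ExecuteNonQuery()
        End Using
    End Sub
End Module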


Referential Integrity

Not only does SQL Server allow you to link multiple tables, it also maintains consistency between them. Ensuring that the data among related tables is correctly matched is referred to as maintaining referential integrity.

Data Abstraction

A major purpose of a database system is to provide users with an abstract view of the data. The system hides certain details of how the data is stored and maintained. Data abstraction is divided into three levels:

Physical level: the lowest level of abstraction, which describes how the data is actually stored.
Conceptual level: at this level, all the attributes, the data actually stored, and the entities and relationships among them are described.
View level: the highest level of abstraction, which describes only part of the database.

Advantages of an RDBMS

- Redundancy can be avoided.
- Inconsistency can be eliminated.
- Data can be shared.
- Standards can be enforced.
- Security restrictions can be applied.
- Integrity can be maintained.
- Conflicting requirements can be balanced.
- Data independence can be achieved.


Disadvantages of a DBMS

A significant disadvantage of the DBMS approach is cost. In addition to the cost of purchasing or developing the software, the hardware has to be upgraded to allow for the extensive programs and the workspace required for their execution and storage. While centralization reduces duplication, the lack of duplication requires that the database be adequately backed up so that, in case of failure, the data can be recovered.

Features of SQL Server (RDBMS)

SQL Server is one of the leading database management systems because it meets the uncompromising requirements of today's most demanding information systems. From complex decision support systems (DSS) to the most rigorous online transaction processing (OLTP) applications, even applications that require simultaneous DSS and OLTP access to the same critical data, SQL Server leads the industry in both performance and capability. SQL Server is a truly portable, distributed, and open DBMS that delivers unmatched performance and continuous operation. It is a high-performance, fault-tolerant DBMS specially designed for online transaction processing and for handling large database applications.

Enterprise-wide Data Sharing

The portability and connectivity of the SQL Server DBMS enable all the systems in the organization to be linked into a single, integrated computing resource.

Portability

SQL Server is fully portable to more than 80 distinct hardware and operating system platforms, including UNIX, MS-DOS, OS/2, Macintosh, and dozens of proprietary platforms. This portability gives complete freedom to choose the database server platform that meets the system requirements.

3. SYSTEM ANALYSIS
3.1 FEASIBILITY STUDY
Fact Finding

Fact-finding is the stage in which data about the system are collected in terms of technical and functional requirements. In this project, the data collection was completed using the data carriers existing in the tables.

Feasibility Analysis

The feasibility study determines whether to proceed with the development of the project or to stop, by studying four primary areas:

- Economic feasibility
- Technical feasibility
- Operational feasibility
- Alternatives

Economic Feasibility

Economic feasibility is assessed by comparing the development cost with the income or benefit derived from the proposed system. Cost-benefit analysis assesses the costs of project development and weighs them against the tangible (measurable) and intangible benefits of the system; it varies depending on the characteristics of the system to be developed, the relative size of the project, and the expected return on investment. The study aims to determine whether the cost of creating the system is so great that the project cannot be justified, and whether there are sufficient benefits in creating the system to make the costs acceptable. The factors for evaluation are:

- Cost of operation of the existing and proposed systems.
- Cost of development of the proposed system.
- Value of the benefits of the proposed system.

Technical Feasibility

This is a study of the function, performance, and constraints that may affect the ability to achieve the proposed system. The system engineer evaluates the technical merits of the system, such as performance, reliability, maintainability, and producibility, and makes assessments of, for example:

- Required technology performance.
- Required materials, methods, and algorithms.
- Development risks.
- Impact of technology issues on cost.

Automated tools such as prototyping and simulation tools assist greatly in technical analysis. The results obtained from technical analysis form the basis for another go/no-go decision on the system. If the technical risks are severe, the development will be terminated.

Operational Feasibility

The purpose of this study is to determine whether the new system would be used if it were developed and implemented. The assessment of operational feasibility is done side by side with technical feasibility. The needs of the various people affected by the proposed system must be taken into account, and the various social costs must be evaluated; these include the costs of education and training, communication, and salary changes, as well as hidden costs such as those caused by hostility, ignorance, and fear.

Alternatives

An evaluation of alternative approaches to the development of the system is done. Once the questions associated with the analysis tasks have been answered, alternative solutions are considered. Allocated system parameters are applied to all the alternatives, and each alternative is evaluated against a set of evaluation parameters. One approach among the alternatives is then selected if it is feasible.

The feasibility study is reviewed by the project management team of Meta Concepts to decide whether to proceed with or stop the development during the planning, specification, and development steps of both hardware and software engineering.

3.2 EXISTING SYSTEM


The organization maintains its existing application in Visual Basic, with data in MS Access tables. This application cannot be used by customers interested in the product who reside at remote locations: it is effective on a LAN and not suited to internet use. Some of the processes are performed manually, and the organization has no suitable application for maintaining the whole set of business operations, so the manual processes take more time.

Drawbacks
- Difficulty in global updating.
- Database tables are not normalized.
- The system is insecure.
- Time consumption is high.

3.3 PROPOSED SYSTEM


The proposed system solves the drawbacks of the existing system and works satisfactorily. It is a good management information system, expected to handle as many customers as possible at any particular time. Customer queries should be periodically reviewed and solutions provided quickly.

Merits of the Proposed System

The system objectives are to automate the important aspects of the system and to keep track of the transaction status of the company. The system is designed giving prime importance to the following features:

- Efficiency
- Interactivity
- Fully menu driven
- User friendly
- Reduced processing time

4. SYSTEM DESIGN
Design is the first activity in building any system. It is the process of applying various techniques and principles for the purpose of defining a system. There are various types of design, as follows:

- Input and Output Design
- Database Design

4.1 INPUT DESIGN


Input design is the method by which valid data are accepted from the user. This part of the design requires very careful attention: if the data going into the system are incorrect, the processing and output will magnify those errors. Inaccurate input data are the most common cause of errors in data processing. The input design is carried out in such a way that the input screens are user friendly. The goal of input design is to make data entry as easy and error-free as possible. Input screens filter invalid data before they become operational data at each entry phase. This is achieved by providing proper checks and validations, as sketched below.
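A sketch of one such check, mirroring the customer-number test case described in the testing chapter; the routine and message names are illustrative, not taken from the project's source:

Module InputValidation
    ' Rejects empty input and any entry containing alphabetic characters.
    Public Function IsValidCustomerNumber(ByVal entry As String) As Boolean
        Return entry.Length > 0 AndAlso IsNumeric(entry)
    End Function

    Sub Main()
        Console.WriteLine(IsValidCustomerNumber("10234"))   ' True
        Console.WriteLine(IsValidCustomerNumber("10a34"))   ' False: prompt "Enter number"
    End Sub
End Module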

4.2 OUTPUT DESIGN


The output design defines the output required and the format in which it is to be produced. Care must be taken to present the right information so that the right decisions are made. A very important and effective way of presenting information is the report. Various reports are generated by the system and used for verification and analysis.


4.3 DATAFLOW DIAGRAM


A data flow diagram is a network that describes the flow of data within the system, together with the processes that change or transform data throughout the system. Data flow diagrams are graphical tools used to describe and analyze the movement of data in the system; using this tool, the transformation of data from input to output through processes can be described logically. Most data flow modeling methods use four kinds of symbols, representing four kinds of system components: processes, data stores, data flows, and external entities. Unlike a detailed flowchart, a data flow diagram does not supply a detailed description of the modules, but graphically describes a system's data and how the data interact with the system. To construct a data flow diagram, we use:

1. Arrows: an arrow identifies a data flow in motion.
2. Circles: a circle stands for a process that converts data into information.
3. Open-ended boxes: an open-ended box represents a data store, data at rest, or a temporary repository of data.
4. Squares: a square defines a source or destination of system data.


5. SYSTEM TESTING AND IMPLEMENTATION


5.1 SYSTEM TESTING
The testing phase is an important part of software development. It performs the critical role of quality assurance and of ensuring the reliability of the software. Testing is the process of finding errors and missing operations, and also a complete verification to determine whether the objectives are met and the user requirements are satisfied. The goal of testing is to uncover requirement, design, or coding errors in the programs; consequently, different levels of testing are employed. The use of testing is not limited to the testing phase: the results of testing are also used later, during maintenance. Regardless of the source of test data, the programmers and analysts will eventually conduct four different types of tests. We tested this software for discrepancies, such as how the software works on the system, and the system was tested for its integrity, effectiveness, scalability, and ease of use.

Unit or Functional Testing

Unit testing focuses on the various modules of the system individually, verifying that each functions properly as a unit. In this testing, the module interface was tested to assure that information flows properly into and out of the module. Each and every module was tested rigorously to check whether it was working to expectations and returning the proper values. An illustrative check of this kind is sketched below.
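As an illustration only, a simple hand-rolled unit test of the customer-number validation routine sketched in the input design chapter might look like this (the routine is repeated here so the sketch is self-contained):

Imports System.Diagnostics

Module UnitTests
    ' Helper under test: the illustrative validation routine from section 4.1.
    Function IsValidCustomerNumber(ByVal entry As String) As Boolean
        Return entry.Length > 0 AndAlso IsNumeric(entry)
    End Function

    Sub Main()
        Debug.Assert(IsValidCustomerNumber("12345"))        ' digits pass
        Debug.Assert(Not IsValidCustomerNumber("abc45"))    ' alphabets fail
        Debug.Assert(Not IsValidCustomerNumber(""))         ' empty input fails
        Console.WriteLine("All customer-number checks passed.")
    End Sub
End Module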


Integration Testing

Integration testing is a systematic technique for constructing the program linkage while conducting tests at the same time to uncover errors associated with the interfaces. Here each functional module is integrated into a single program, and testing was carried out each time a module was linked. After unit testing was over, the units were integrated in a phased manner, so that as each module was integrated it was checked against the design. There were problems during integration testing, such as variables used in both modules; these were rectified during this phase.

White-Box Testing

White-box testing, sometimes also called glass-box testing, is a test-case design method that uses the control structure of the procedural design to derive test cases. Using white-box methods, the software engineer can derive test cases that:

- Guarantee that all independent paths within a module have been exercised at least once.
- Exercise all logical decisions on their true and false sides.
- Execute all loops at their boundaries and within their operational bounds.
- Exercise internal data structures to ensure their validity.


Black-Box Testing

Black-box testing, also called behavioral testing, focuses on the functional requirements of the software. Black-box testing enables the software engineer to derive sets of input conditions that will fully exercise all the functional requirements for a program. It attempts to find errors in the following categories:

- Incorrect or missing functions.
- Interface errors.
- Errors in data structures or external database access.
- Behavior or performance errors.
- Initialization or termination errors.

5.2 SYSTEM IMPLEMENTATION


System implementation is an important stage of the project, where the theoretical design is turned into a practical system. The main stages of the implementation are as follows.

Implementation Planning

Implementation planning is the first task in system implementation. Planning means deciding on the methods and the time scale to be adopted. Once planning is over, the major effort in the computer department is to ensure that the programs in the system are working properly. Implementation was the final stage of the project, in which the project was loaded onto the web server and tested on the clients to check whether the site was working satisfactorily. All the links were then tested on the clients for the successful operation of the project.

Test Case

Test case : Customer number
Criteria  : Entered a register number containing alphabets
Result    : Message displayed as "Enter number"


6. CONCLUSION
This project, Call Centre Management, has been designed and developed as per the specification. The project is simple and clearly understandable, and the code written is very clear; statements that would make the code unclear have been omitted. The system was tested with various samples of data, and the package provided satisfactory results. After implementation, maintenance of the system should be easy, so that forthcoming changes can be made readily; the system has been developed to be flexible enough for such changes. Error-handling modules are also used. All these features make the project a success.


7. SCOPE FOR FURTHER DEVELOPMENT


The software has been developed in such a way that it can accept modifications and further changes. It is very user friendly, and in the future changes can be made easily without affecting the rest of the software. Software restructuring is carried out: restructuring modifies source code in an effort to make it amenable to future changes. In general, restructuring does not modify the overall program architecture; it tends to focus on the design details of individual modules and on the local data structures defined within modules. The project can be further enhanced with full authentication and logon handling, by encrypting the data transmitted between the customer and the web site. In addition, it is possible to configure Windows 2000 Server networking for further firewall security, so that all data that passes between a client and the server is secured properly. The project can also be developed further by enhancing various elements that are not required for the current set of business processes. It should be possible to check authentication whenever required by any personnel of the call center company; this would help prevent non-authenticated users from modifying the data improperly.


8. BIBLIOGRAPHY
I. BOOKS

1. Elias M. Awad, System Analysis and Design, Galgotia Publications Pvt. Ltd., 1991.
2. Roger S. Pressman, Software Engineering, McGraw-Hill International Editions, 1991.
3. Gary Cornell and Jonathan Morrison, Programming VB.NET, Apress.
4. Matt J. Crouch, VB.NET Programming, Pearson Education, 2003 edition.
5. Steven Holzner, Visual Basic .NET Programming.

II. WEBSITES

www.programheaven.com
www.Google.com
www.codeproject.com


9. APPENDIX
A. DATA FLOW DIAGRAM

[Data flow diagram. Labels recovered from the original figure: Customer; Registration; customer, address location, and complaint data stores; Executive; Employee; Information Service Switch; Reply to customer query.]


B. TABLE DESCRIPTION

Customer Registration Table

[The original layout lists seventeen fields (types: 3 Number, 2 Date/Time, 12 Memo; all sizes Long Integer); only the last field name, Atten_by, is recoverable from the extracted text.]

User Table

Field      Type   Size
User name  Text   255
User id    Text   255

Address Location Table

Field      Type       Size
addproof   Memo       Long Integer
appliance  Memo       Long Integer
Cus_type   Memo       Long Integer
Dop        Date/Time  Long Integer
Plant      Number     Long Integer
D_zone     Memo       Long Integer
Sp_code    Memo       Long Integer


Department Table

Field       Type  Size
Department  Memo  Long Integer

Employee Table

Field        Type    Size
E_name       Number  Long Integer
Fname        Memo    Long Integer
Iname        Memo    Long Integer
Add1         Memo    Long Integer
Add2         Memo    Long Integer
City         Memo    Long Integer
Pincode      Number  Long Integer
Phone        Memo    Long Integer
Department   Memo    Long Integer
designation  Memo    Long Integer
salary       Memo    Long Integer


D. SCREEN LAYOUTS
