1. INTRODUCTION
1.1 Organization Profile
Sumka Sons Instrumentation is part of the Sumka Group of Companies, established in 1994, which has been producing quality castings in ferrous (gray, steel, ductile) and non-ferrous (aluminium) metals from Coimbatore, India. The Group is managed by its founders, Mr. Angou Mourougane and Mr. M. Muthukumar, engineering graduates of Pondicherry University. Headquartered in Coimbatore, the company has established clients all over South India and parts of North India. The company is dedicated to providing quality marketing and services to India's leading universities and institutions needing biotechnology-based instrumentation products. Since its inception, the company has earned good cooperation from all quarters: its principals, its customers and its vendors. It is therefore all the more committed to providing quality products at competitive prices, giving its clients a purchasing advantage. The company has attained an annual turnover of 35 million Indian rupees (close to 1 million US dollars), and is constantly striving for better products and a wider market reach. Sumka Sons Instrumentation is a marketing and service company dealing in research- and biotechnology-based instrumentation products, while Sumka Sons Inc. is a global company satisfying customers with quality castings.
2. SYSTEM CONFIGURATION
2.1 Hardware Configuration
Processor       : Pentium IV
Clock Speed     : 900 MHz
RAM Capacity    : 256 MB
Hard Disk       : 40 GB
Monitor Type    : 17-inch Color Monitor
Mouse Type      : Scroll Mouse
Keyboard        : Logitech
Floppy Disk     : 1.44 MB
2.2 Software Configuration
Microsoft Visual C++ .NET
Visual C++ provides deep support for creating XML web services, including ATL Server and a new project type for creating powerful server-based applications. With attribute-based programming, any function can easily be exposed as an XML web service. Traditional unmanaged C++ and new managed C++ code can be mixed freely within the same application, and existing components can be wrapped as .NET components by using the managed extensions. This preserves the investment in existing code while integrating with the .NET Framework.
ADO.NET
ActiveX Data Objects .NET (ADO.NET), formerly known as ADO+, is a new set of classes that exposes the data access services of the .NET Framework. ADO.NET is a natural evolution of ADO, is built around n-tier application development, and has been created with XML at its core. The ADO.NET object model is composed of two central components: the connected layer, which consists of the classes that make up a .NET Data Provider, and the disconnected layer, which is rooted in the DataSet. A .NET Data Provider includes the following components: the Connection object, the Command object, the DataReader and the DataAdapter. The first two should be familiar to existing ADO programmers; they are used to open a connection to a data source and execute a command against it. The DataReader loosely corresponds to a forward-only, read-only recordset: it is a highly optimized, non-buffering, firehose-style interface for getting the results of a query executed against the data source. The DataAdapter fills the DataSet.
The DataSet is a local buffer of tables, in effect a collection of disconnected recordsets.
Integrated Development Environment
Visual Studio .NET provides a single Integrated Development Environment (IDE) that helps developers build solutions faster, with key productivity features accessible from any .NET language. The IDE is a completely customizable cockpit that enables the highest developer performance, providing unified access to the designers, editors and tools of Visual Studio from any .NET language.
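The connected/disconnected distinction can be sketched outside .NET as well. The following is a minimal Python analogy using the standard `sqlite3` module (the table, column names and sample rows are invented for the example): iterating a cursor plays the role of the forward-only DataReader, while `fetchall()` plays the role of the DataAdapter filling a local DataSet-style buffer that outlives the connection.

```python
import sqlite3

# In-memory database standing in for any data source (illustration only;
# the "orders" table and its rows are invented for this example).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, item TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, "valve"), (2, "gauge"), (3, "sensor")])

# Connected, forward-only access (analogous to the DataReader):
# rows are streamed from the open connection and read once, in order.
reader_items = []
for row in conn.execute("SELECT item FROM orders ORDER BY id"):
    reader_items.append(row[0])

# Disconnected access (analogous to the DataSet): fetchall() fills a
# local buffer, after which the connection can be closed and the data
# worked on independently.
dataset = conn.execute("SELECT id, item FROM orders ORDER BY id").fetchall()
conn.close()

print(reader_items)
print(len(dataset))
```

The key design point mirrored here is that the disconnected buffer remains usable after `conn.close()`, whereas the reader-style loop requires a live connection for its whole lifetime.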
SQL Server 2000 Enterprise Edition scales up to servers running Microsoft Windows 2000 Datacenter Server. It supports features such as federated servers, indexed views and large memory support that allow it to scale to the performance levels required by the largest web sites.
Enterprise-Level Database Features
The SQL Server 2000 relational database engine supports the features required by demanding data processing environments. The database engine protects data integrity while minimizing the overhead of managing thousands of users concurrently modifying the database.
Data Warehousing
SQL Server 2000 includes tools for extracting and analyzing summaries of data for online analytical processing (OLAP). SQL Server also includes tools for visually designing databases and for analyzing data using English-based questions.
4. SYSTEM DESIGN
Data Flow Diagram
The data flow diagram (DFD) is one of the most important tools used by system analysts. Data flow diagrams are made up of a number of symbols that represent system components. Most data flow modeling methods use four kinds of symbols, representing four kinds of system components: processes, data stores, data flows and external entities. Circles in a DFD represent processes; a data flow is represented by a thin line; each data store has a unique name; and a square or rectangle represents an external entity. Unlike a detailed flowchart, a data flow diagram does not supply a detailed description of the modules but graphically describes a system's data and how the data interact with the system. To construct a data flow diagram we use:
Arrow - identifies a data flow in motion; it is a pipeline through which information flows, like the flow line in a flowchart.
Circle - stands for a process that converts data into information.
Open-ended box - represents a data store: data at rest, or a temporary repository of data.
Square - defines a source or destination of system data.
Rules for constructing a data flow diagram:
1. Arrows should not cross each other.
2. Squares, circles and files must bear names.
3. Decomposed data flow squares and circles can have the same names.
4. Choose meaningful names for data flows.
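As an illustration of the four symbols (the entity, process and store names below are invented for the example), a fragment of a DFD for an order process could be sketched as:

```
+----------+   order details    .-------------.   order record   ==============
| Customer | -----------------> ( 1.0 Process ) ---------------> | D1 Orders
+----------+                    (    Order    )                  ==============
                                 '-----------'
```

Here the square is the external entity, the labelled arrows are data flows, the circle is the process, and the open-ended box is the data store, each bearing a meaningful name as the rules above require.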
The database design insulates application programs from changes in the physical data organization. This is essential because different applications need the same data in different forms; if the data were kept in all these different forms, it would waste space through increased redundancy. The various tables designed for this purpose are attached in the Annexure.
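The redundancy point can be made concrete with a small sketch (the table contents and field names are invented for the example): a repeated fact wastes space and risks inconsistent updates, whereas a normalized design stores it once and recovers it by joining on a key.

```python
# Unnormalized: the supplier's city is repeated in every order record,
# wasting space and risking inconsistency if one copy is updated.
flat_orders = [
    {"order": 1, "supplier": "Acme", "city": "Coimbatore"},
    {"order": 2, "supplier": "Acme", "city": "Coimbatore"},
    {"order": 3, "supplier": "Beta", "city": "Chennai"},
]

# Normalized: each supplier's details are stored exactly once; orders
# refer to the supplier by key, so the city appears in a single place.
suppliers = {"Acme": {"city": "Coimbatore"}, "Beta": {"city": "Chennai"}}
orders = [
    {"order": 1, "supplier": "Acme"},
    {"order": 2, "supplier": "Acme"},
    {"order": 3, "supplier": "Beta"},
]

def city_of(order):
    """Recover the full record by joining on the supplier key."""
    return suppliers[order["supplier"]]["city"]

print(city_of(orders[0]))
```

The same information is available in both layouts, but in the normalized one a change of city is a single update rather than one per order row.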
Computer output is the most important direct source of information to the user. Efficient, intelligible output design should improve the system's
relationship with the user and help in decision making. Hard copy is preferred, since it is to be used by the management for future reference. In the output design, various output layouts are made. The objectives of output specification are:
To interpret and communicate the results of the computer processing in a way the users can understand.
To communicate the output design specification to the user unambiguously and comprehensively, so that errors of interpretation do not occur.
Outputs from a computer system are required primarily to communicate the results of processing to users. Outputs are also used to provide a permanent copy of these results for later consultation.
5. SYSTEM DEVELOPMENT
Development is the stage in the project where the theoretical design is turned into a working system. It is the most crucial stage in achieving a successful system and in giving the user confidence that the new system will work efficiently and effectively. The system can be developed only after testing is done and it is found to work to specification. Development involves careful planning, investigation of the current system and its constraints, education and training of users, and testing of the system. System development is an important phase in the software engineering process. Various approaches exist for developing a system; the individual application approach is one of the best known. It involves the development of individual computer-based applications that serve a specific application area in an isolated way. The first step the analyst must undertake is to understand the current system by gathering all information about it. The required data are collected by several methods:
Interviews with the management
Questionnaires
Study of the current manuals
Observation of the functioning
Sampling and research
In the system development phase, all the code of the application is generated. This approach has a number of advantages: the system is easier to develop and implement because the applications are small and relatively simple.
TESTING METHODOLOGIES
1. UNIT TESTING
Unit testing focuses verification efforts on the smallest unit of the software design, the module; it is also known as module testing. The modules are tested separately, and this testing is carried out during the programming stage itself. In this testing step, each module is checked to be working satisfactorily with regard to the expected output from the module.
2. INTEGRATION TESTING
Data can be lost across an interface; one module can have an adverse effect on another; and sub-functions, when combined, may not produce the desired major function. Integration testing is a systematic technique for constructing the program structure while at the same time conducting tests to uncover errors associated with the interfaces. The objective is to take unit-tested modules and build a program structure: all the modules are combined and tested as a whole. Here correction is difficult, because the vast expanse of the entire program complicates the isolation of causes. Thus in the integration testing step, all the errors uncovered are corrected before the next testing steps.
3. VALIDATION TESTING
After integration testing, the software is completely assembled as a package, interfacing errors have been uncovered and corrected, and then a test of the software is conducted, i.e., the validation test. A validation test succeeds when the software functions in a manner that can reasonably be expected by
the client. Software validation is achieved through a series of black box tests, which confirm conformance with the requirements.
4. BLACK BOX TESTING
Black box testing is conducted at the software interfaces. These tests are designed to uncover interface errors, and are also used to demonstrate that software functions are operational, input is properly accepted, outputs are produced correctly, and the integrity of external information is maintained. Black box testing attempts to find errors in the following categories:
Incorrect or missing functions
Interface errors
Errors in database structure
Performance and termination errors
5. OUTPUT TESTING
After performing the validation testing, the next step is output testing of the proposed system, since no system can be useful if it does not produce the required output in the specified format. The outputs generated or displayed by the system under consideration are tested by asking the users about the format required by them. Here, the output format is considered in two ways: one on the screen and the other in printed form.
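The unit and black box steps above can be sketched in code. The following is a minimal Python `unittest` example (the `apply_discount` function and its rules are invented for the illustration, not part of the actual project): one test verifies the module's expected output, and another exercises only its interface, probing the incorrect-input and termination-error categories.

```python
import unittest

def apply_discount(price, percent):
    """Hypothetical module under test: returns the price less a
    percentage discount; rejects inputs outside the valid range."""
    if price < 0 or not (0 <= percent <= 100):
        raise ValueError("invalid input")
    return round(price * (1 - percent / 100), 2)

class DiscountTests(unittest.TestCase):
    # Unit testing: check the smallest unit against its expected output.
    def test_expected_output(self):
        self.assertEqual(apply_discount(200.0, 10), 180.0)

    # Black-box testing: drive the interface only, with no knowledge of
    # the internals, looking for incorrect handling of bad input.
    def test_rejects_invalid_input(self):
        with self.assertRaises(ValueError):
            apply_discount(-5, 10)
        with self.assertRaises(ValueError):
            apply_discount(100, 150)

# Run the suite programmatically rather than via unittest.main().
suite = unittest.defaultTestLoader.loadTestsFromTestCase(DiscountTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

The same test cases serve both roles: during the programming stage they act as the module test, and after assembly they can be rerun against the integrated package as part of validation.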
6. USER ACCEPTANCE TESTING
User acceptance is a key factor for the success of any system. The system under consideration was tested for user acceptance by constantly keeping in touch with the prospective system users at the time of development and making changes wherever required. This was done with regard to the following points:
Input screen design
Output screen design
The above tests were carried out using various kinds of test data. Preparation of test data plays a vital role in system testing: after preparing the test data, the system under study is tested using it. While testing the system with the test data, errors are again uncovered and corrected using the above testing steps.
7. CONCLUSION
The developed system is highly interactive and user friendly. The project was tested under different conditions and found to be effective. The project, ERP - INDUSTRIAL MANAGEMENT SYSTEM, was designed and developed to save time, reduce strain, and generate quicker and more accurate results. Reports have been generated so that the system meets user requirements to the maximum possible extent. The system provides accurate updating, data validation and integrity, and has been designed and run to satisfy the needs of the entire organization. The existing system makes the job very difficult; the new system reduces work and also results in quick retrieval of information, which is vital to the progress of an organization. The project has been successfully completed and tested using the sample database information. The system has been developed for the present working conditions; the industrial environment changes rapidly, and new features, styles, etc. are to be expected.
9. BIBLIOGRAPHY
1. Billy Hollis and Rockford Lhotka, Programming in VB.NET, Public Beta Release, 2001.
2. Elias M. Awad, System Analysis and Design, Galgotia, 2000.
3. Paul Nielsen, Microsoft SQL Server 2000 Bible, Wiley Publishing, Inc.
4. Roger S. Pressman, Software Engineering, Tata McGraw-Hill, 1999.
5. Kalani Kirk Hausman and Ed Tittel, Development and Implementation with Visual Studio .NET, QUE Publishing, 2003.
6. Matthew MacDonald and Bill Hamilton, ADO.NET in a Nutshell, O'Reilly, 2003.