
What is C programming?

C is a general-purpose programming language that is extremely popular, simple, and flexible. It is a machine-independent, structured programming language that is used extensively in a wide range of applications.

History of C language

The base or father of programming languages is 'ALGOL.' It was first introduced in 1960 and was used on a large scale in European countries. 'ALGOL' introduced the concept of structured programming to the developer community. In 1967, a new computer programming language called 'BCPL', which stands for Basic Combined Programming Language, was announced. BCPL was designed and developed by Martin Richards, especially for writing system software. This was the era of programming languages. Three years later, in 1970, Ken Thompson introduced a new programming language called 'B' that carried over many features of 'BCPL.' 'B' was developed at AT&T's Bell Laboratories and was used in the early development of the UNIX operating system. Both 'BCPL' and 'B' were system programming languages.

In 1972, the computer scientist Dennis Ritchie created a new programming language called 'C' at Bell Laboratories. It was derived from the 'ALGOL', 'BCPL', and 'B' programming languages. The 'C' programming language incorporates the features of these languages along with many additional concepts that make it unique among other languages.

Where is C used? Key Applications

1. 'C' language is widely used in embedded systems.
2. It is used for developing system applications.
3. It is widely used for developing desktop applications.
4. Most of the applications by Adobe are developed using the 'C' programming language.
5. It is used for developing browsers and their extensions. Google's Chromium is built using the 'C' programming language.
6. It is used to develop databases. MySQL, one of the most popular database systems, is built using 'C'.
7. It is used in developing operating systems. Operating systems such as Apple's OS X, Microsoft's Windows, and Symbian are developed using the 'C' language. It is used for developing desktop as well as mobile phone operating systems.
8. It is used for compiler production.
9. It is widely used in IoT applications.

Basic structure of a C program

 Documentation section: The documentation section consists of a set of comment lines giving the name of the program, the author, and other details that the programmer may want to refer to later.
 Link section: The link section provides instructions to the compiler to link functions from the system library.
 Definition section: The definition section defines all symbolic constants.
 Global declaration section: Some variables are used in more than one function. Such variables are called global variables and are declared in the global declaration section, which lies outside of all the functions. This section also declares all the user-defined functions.
 main() function section: Every C program must have one main() function section. This section contains two parts: the declaration part and the executable part. Declaration part: the declaration part declares all the variables used in the executable part. Executable part: there is at least one statement in the executable part. These two parts must appear between the opening and closing braces. Program execution begins at the opening brace and ends at the closing brace. The closing brace of the main() function is the logical end of the program. All statements in the declaration and executable parts end with a semicolon.
 Subprogram section: The subprogram section contains all the user-defined functions that are called in the main() function. User-defined functions are generally placed immediately after the main() function, although they may appear in any order. Note: All sections except the main() function section may be absent when they are not required. These sections are illustrated in the sketch below.
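
As a minimal sketch of how these sections fit together, the following program prints the larger of two numbers. The file comment, the constant, the global variable, and the function names are illustrative only; they are not part of any required layout.

/* Documentation section: program name, author, and purpose (illustrative) */
/* larger.c - prints the larger of two numbers */

#include <stdio.h>          /* Link section: link the standard I/O library */

#define LIMIT 100           /* Definition section: a symbolic constant */

int count;                  /* Global declaration section: a global variable */
int larger(int a, int b);   /* ...and declarations of user-defined functions */

int main(void)              /* main() function section */
{
    int x = 10, y = 20;                     /* declaration part */
    printf("Larger: %d\n", larger(x, y));   /* executable part */
    return 0;
}

/* Subprogram section: user-defined functions called from main() */
int larger(int a, int b)
{
    return (a > b) ? a : b;
}
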
What is a computer programming language?

A computer programming language is any of various languages for expressing a set of detailed instructions to a digital computer. Such instructions can be executed directly when they are in the computer manufacturer-specific numerical form known as machine language, after a simple substitution process when expressed in a corresponding assembly language, or after translation from some “higher-level” language. Although there are many computer languages, relatively few are widely used.

Machine and assembly languages are “low-level,” requiring a programmer to manage explicitly
all of a computer’s idiosyncratic features of data storage and operation. In contrast, high-level
languages shield a programmer from worrying about such considerations and provide a notation
that is more easily written and read by programmers.

Language Types

1. Machine and assembly languages

A machine language consists of the numeric codes for the operations that a particular computer
can execute directly. The codes are strings of 0s and 1s, or binary digits (“bits”), which are
frequently converted both from and to hexadecimal (base 16) for human viewing and
modification. Machine language instructions typically use some bits to represent operations, such
as addition, and some to represent operands, or perhaps the location of the next instruction.
Machine language is difficult to read and write, since it does not resemble conventional
mathematical notation or human language, and its codes vary from computer to computer.
Assembly language is one level above machine language. It uses short mnemonic codes for
instructions and allows the programmer to introduce names for blocks of memory that hold data.
One might thus write “add pay, total” instead of “0110101100101000” for an instruction that
adds two numbers.
Assembly language is designed to be easily translated into machine language. Although blocks
of data may be referred to by name instead of by their machine addresses, assembly language
does not provide more sophisticated means of organizing complex information. Like machine
language, assembly language requires detailed knowledge of internal computer architecture. It is
useful when such details are important, as in programming a computer to interact
with input/output devices (printers, scanners, storage devices, and so forth).
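
For comparison, the same addition in a high-level language such as C is a single readable statement that the compiler translates into machine instructions; the variable names below simply mirror the assembly example above.

#include <stdio.h>

int main(void)
{
    int pay = 500, total = 1000;   /* illustrative values */

    total = total + pay;           /* the high-level counterpart of "add pay, total" */

    printf("total = %d\n", total);
    return 0;
}
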
2. Algorithmic languages (ALGOL)
Algorithmic languages are designed to express mathematical or symbolic computations. They
can express algebraic operations in notation similar to mathematics and allow the use of
subprograms that package commonly used operations for reuse. They were the first high-level
languages.
3. Fortran
The first important algorithmic language was FORTRAN (formula translation), designed in 1957 by an IBM team led by John Backus. FORTRAN was also designed to translate into efficient machine language. It was immediately successful and continues to evolve. Its control structures included conditional IF statements, repetitive loops (so-called DO loops), and a GOTO statement that allowed nonsequential execution of program code. FORTRAN made it convenient to have subprograms for common mathematical operations, and built libraries of them.
What is an algorithm?

An algorithm is a procedure or step-by-step set of instructions for solving a problem. Algorithms form the foundation of writing a program.

An algorithm in general is a sequence of steps to solve a particular problem. Algorithms are universal: the algorithm you use in the C programming language is the same algorithm you would use in any other language.

Advantages

 Easy to write.
 Human-readable way to express the logic.
 Algorithms for big problems can be written with moderate effort.

Disadvantages

 Difficult to debug.
 Difficult to show branching and looping.
 Jumping (goto) makes it hard to trace some problems.

Definition of Flowchart

A flowchart is a pictorial or graphical representation of a process. Each step in
the process is represented by a different symbol and contains a short
description of the process step. The flowchart symbols are linked together with
arrows showing the direction of process flow. This pictorial representation can give
a step-by-step solution to the given problem.

Advantages

 Easy to draw.
 Easy to understand the logic.
 Easy for a non-technical person to identify mistakes.
 Easy to show branching and looping.
Disadvantages

 Time-consuming.
 Difficult to modify.
 Very difficult to draw a flowchart for big or complex problems.

Simple Flowcharting Symbols


Algorithm and flowchart to find the largest number
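
As a sketch, assuming the task is to find the largest of three numbers entered by the user, the algorithm can be written as follows:

Step 1: Start.
Step 2: Read three numbers a, b, and c.
Step 3: If a is greater than both b and c, the largest is a.
Step 4: Otherwise, if b is greater than c, the largest is b; else the largest is c.
Step 5: Print the largest number.
Step 6: Stop.

A corresponding C program, following the same branching a flowchart would show, might look like this:

#include <stdio.h>

int main(void)
{
    int a, b, c, largest;

    /* Step 2: read the three numbers */
    printf("Enter three numbers: ");
    if (scanf("%d %d %d", &a, &b, &c) != 3)
        return 1;                           /* invalid input */

    /* Steps 3-4: compare, mirroring the flowchart's decision boxes */
    if (a > b && a > c)
        largest = a;
    else if (b > c)
        largest = b;
    else
        largest = c;

    /* Step 5: print the result */
    printf("Largest number: %d\n", largest);

    return 0;
}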
