Definition
An embedded system is any computer that is a component in a
larger system and that relies on its own microprocessor.
An embedded system is a computer system with a dedicated
function within a larger mechanical or electrical system, often
with real-time computing constraints. It is embedded as part of
a complete device, often including hardware and mechanical parts.
An embedded system is a system that has embedded hardware,
which makes it dedicated to an application, a specific part of an
application, a product, or part of a larger system.
Embedded System
May contain an RTOS
An ES is a REACTIVE system:
Accepts input
Performs calculations
Generates output
Features
1. Reliability
2. Maintainability: can be easily repaired or replaced
3. Availability
4. Safety: in the event of system failure no harm should come to
people or property; should be fail-safe
5. Security: resilience of the system against unauthorized use
6. High-speed operation
7. Low power consumption: ES are constrained for power
8. Small size and low weight
9. Adaptability
10. Accuracy
Features (contd)
11. ES perform a very specific task; they cannot be programmed to do
different things.
12. They have limited memory resources, i.e. no secondary
storage devices.
13. They should operate in extreme environmental conditions.
14. Cost effective.
15. Dedicated user interface (no mouse, keyboard or screen).
16. Many ES have real-time constraints.
17. Run-time efficiency: minimum resources for the dedicated function.
18. Multirate operation: different operations with different time
scales of operation are to be handled.
Classification of ES
I. Based on generation
II. Based on complexity and performance
requirements
III. Based on functionality
IV. Based on deterministic behavior
V. Based on triggering
I. Based on generation
Based on the order in which the ES evolved
1. First Generation:
Built around 8-bit processors (8085, Z80) and 4-bit
microcontrollers
Simple hardware circuits
Firmware developed in assembly code
Eg. stepper motor control units, digital telephone keypads
2. Second Generation:
Built around 16-bit microprocessors and 8- or 16-bit microcontrollers
The instruction set was much more complex and powerful
than 1st generation processors
Some had embedded operating systems
Eg. Data Acquisition Systems (DAQ), SCADA
3. Third Generation:
Built around 32-bit processors and 16-bit microcontrollers
Concept of domain-specific processors like DSPs (Digital
Signal Processors) and ASICs (Application Specific Integrated
Circuits) was introduced
More complex and powerful instruction set, with the concept
of instruction pipelining
The basic instruction cycle is broken up into a series of steps called a
pipeline. Rather than processing each instruction sequentially
(finishing one instruction before starting the next), each
instruction is split into a sequence of steps so that different
steps can be executed in parallel and instructions can be
processed concurrently (starting one instruction before
finishing the previous one).
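The pipelining gain described above can be sketched with a toy cycle count. The 5-stage depth, zero stalls and one-instruction-per-cycle issue are idealizing assumptions for illustration, not tied to any particular processor:

```python
def cycles_sequential(n_instr, n_stages):
    """Each instruction runs through all stages before the next one starts."""
    return n_instr * n_stages

def cycles_pipelined(n_instr, n_stages):
    """First instruction fills the pipeline; afterwards one retires per cycle."""
    return 0 if n_instr == 0 else n_stages + (n_instr - 1)

# 100 instructions on an idealized 5-stage pipeline (no stalls or hazards)
print(cycles_sequential(100, 5))  # 500 cycles
print(cycles_pipelined(100, 5))   # 104 cycles
```

With long instruction streams the speed-up approaches the number of stages, which is why pipelining made this generation's instruction sets noticeably faster.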
4. Fourth Generation:
Advent of System on Chip (SoC), reconfigurable processors
and multi-core processors resulted in
High performance
Tight integration
Miniaturization
Eg. mobile internet devices, smart phones etc.
II. Based on complexity and performance
1. Small Scale
Simple in hardware and firmware requirements
2. Medium Scale
Slightly complex in hardware and firmware requirements
Usually built around a 16- or 32-bit microprocessor, microcontroller or
DSP
Usually contain an embedded operating system
External RAM, ROM
Eg. washing machine, micro-wave oven
3. Large Scale
Highly complex hardware and firmware/software
requirements
Employed in mission-critical applications demanding high
performance
Built around 32- or 64-bit reconfigurable SoCs, multicore
processors and programmable logic devices
Performs operations like task scheduling, prioritization and
management
Eg. nuclear reactors, anti-missile systems
Real-time ES
Gives the required output in a specified time, i.e. strictly
follows time deadlines for completion of a task
Performs its function under time constraints
Types: soft and hard real-time systems
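The soft/hard split can be sketched in a few lines. The deadline value and the rule that only a hard real-time task treats any miss as a failure are simplifying assumptions for illustration:

```python
def check_deadlines(completion_ms, deadline_ms, hard):
    """Classify task runs against a deadline; for a hard real-time task,
    a single miss counts as a system failure."""
    missed = sum(1 for t in completion_ms if t > deadline_ms)
    met = len(completion_ms) - missed
    failed = hard and missed > 0   # a soft RT system degrades instead of failing
    return met, missed, failed

# e.g. an airbag controller (hard) vs. video playback (soft)
print(check_deadlines([4.8, 5.1, 4.9], deadline_ms=5.0, hard=True))   # (2, 1, True)
print(check_deadlines([4.8, 5.1, 4.9], deadline_ms=5.0, hard=False))  # (2, 1, False)
```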
V. Based on triggering
Applicable to ES which are reactive in nature
Based on trigger:
Event triggered: system responds to an event
Time triggered: system responds to time
constraints
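The two triggering styles can be contrasted with a toy sketch; the period, horizon and event names below are invented for illustration:

```python
def time_triggered(period_ms, horizon_ms, task):
    """Time triggered: the task runs at fixed instants, change or no change."""
    return [task(t) for t in range(0, horizon_ms, period_ms)]

def event_triggered(events, handler):
    """Event triggered: the handler runs only when an event actually arrives."""
    return [handler(e) for e in events]

print(time_triggered(10, 50, lambda t: f"sample@{t}ms"))
print(event_triggered(["button_press", "over_temperature"], lambda e: f"react:{e}"))
```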
Need/Application
1. Data collection/storage/representation
2. Data communication
3. Data processing
4. Monitoring
5. Control
6. Application-specific user interface
Challenges
Hardware requirements
Deadlines or time constraints
Power consumption
Upgradability or flexibility
Reliability
Complex testing
Limited observability and controllability
Restricted development environments
Challenges (contd)
The three Ps
Performance
Power consumption
Price
Components of ES
1. Core/Controller
2. Memory: ROM, RAM, cache
3. Peripherals: serial port, parallel port, network port,
keyboard and mouse ports, memory unit port, monitor
port
4. Other components
Reset circuit
Brown-out protection circuit
Timers/Clocks/Oscillator circuits
Real Time Clock (RTC)
Watch-dog timer
Interrupt controller
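Among the components above, the watchdog timer lends itself to a short sketch: if the firmware fails to "kick" it within a timeout, it forces a reset. The tick-based timing and the values below are simplifications for illustration; real watchdogs are hardware counters:

```python
class Watchdog:
    """Toy software model of a watchdog timer: the firmware must 'kick' it
    within `timeout` ticks, otherwise it fires a reset."""
    def __init__(self, timeout):
        self.timeout = timeout
        self.last_kick = 0
        self.reset_fired = False

    def kick(self, now):
        self.last_kick = now            # firmware proves it is still alive

    def tick(self, now):
        if now - self.last_kick > self.timeout:
            self.reset_fired = True     # real hardware would reboot the system

wd = Watchdog(timeout=3)
for t in range(10):
    if t < 5:                           # firmware healthy until it hangs at t=5
        wd.kick(t)
    wd.tick(t)
print(wd.reset_fired)  # True: no kick arrived for more than 3 ticks
```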
Requirement
Specification
Architecture
Components
System Integration
Testing-Verification-Validation
Requirement Gathering
From this step onwards the software development team works to carry out the
project. The team holds discussions with various stakeholders from the
problem domain and tries to bring out as much information as possible on
their requirements. The requirements are contemplated and segregated into
user requirements, system requirements and functional requirements.
The requirements are collected using a number of practices.
Feasibility Study
After requirement gathering, the team comes up with a rough plan of the
software process. At this step the team analyzes whether software can be
built to fulfill all requirements of the user, and whether there is any risk
of the software becoming no longer useful. It is found out whether the project
is financially, practically and technologically feasible for the organization
to take up. There are many techniques available which help the
developers conclude the feasibility of a software project.
System Analysis
At this step the developers decide on a roadmap for their plan and try to
choose the software model best suited for the project. System
analysis includes understanding the software product's limitations,
learning about system-related problems or changes to be made to existing
systems beforehand, and identifying and addressing the impact of the project
on the organization and personnel. The project team analyzes the scope
of the project and plans the schedule and resources accordingly.
Software Design
The next step is to bring the whole knowledge of requirements
and analysis to the desk and design the software product. The
inputs from users and the information gathered in the requirement
gathering phase are the inputs of this step. The output of this step
comes in the form of two designs: logical design and physical
design. Engineers produce metadata and data dictionaries,
logical diagrams, data-flow diagrams and, in some cases, pseudo
code.
Coding
This step is also known as the programming phase. The
implementation of the software design starts in terms of writing
program code in a suitable programming language and
developing error-free executable programs efficiently.
Testing
An estimate says that 50% of the whole software development
effort should be spent on testing. The impact of errors ranges from
degrading the software to forcing its removal. Software testing is done while
coding by the developers, and thorough testing is conducted by
testing experts at various levels of code, such as module testing,
program testing, product testing, in-house testing and testing the
product at the user's end. Early discovery of errors and their remedy
is the key to reliable software.
Integration
Software may need to be integrated with the libraries, databases
and other program(s). This stage of SDLC is involved in the
integration of software with outer world entities.
Implementation
This means installing the software on user machines. At times, software needs
post-installation configuration at the user's end. Software is tested for portability
and adaptability, and integration-related issues are solved during
implementation.
Disposition
As time elapses, the software may decline on the performance front. It may go
completely obsolete or may need an intense upgrade. Hence a pressing need to
eliminate a major portion of the system arises. This phase includes archiving
data and required software components, closing down the system, planning the
disposition activity and terminating the system at an appropriate end-of-system time.
1. Waterfall Model
2. Iterative Model
3. Spiral Model
The spiral model is a combination of the iterative model
and one of the sequential SDLC models: you
choose one SDLC model and combine it with a cyclic
process (the iterative model).
This model considers risk, which often goes unnoticed
by most other models. The model starts with
determining the objectives and constraints of the software
at the start of an iteration. The next phase is prototyping
the software; this includes risk analysis. Then one
standard SDLC model is used to build the software. In
the fourth phase, the plan for the next iteration is
prepared.
4. V Model
The major drawback of the waterfall model is that we move to the next stage
only when the previous one is finished; there is no way to
go back if something is found wrong in later stages. The V-Model
provides a means of testing the software at each stage, in reverse
manner.
At every stage, test plans and test cases are created to verify and
validate the product according to the requirements of that stage. For
example, in the requirement gathering stage the test team prepares all
the test cases in correspondence to the requirements. Later, when the
product is developed and ready for testing, the test cases of this stage
verify the software's validity against the requirements of this
stage.
This makes verification and validation go in parallel. This
model is also known as the verification and validation model.
Assembler:
A computer will not understand any program written in a language other
than its machine language.
Programs written in other languages must be translated into
machine language (using software).
A program which translates an assembly language program into a machine
language program is called an assembler.
Self assembler or resident assembler: an assembler which runs on a
computer and produces the machine code for the same computer.
Cross assembler: an assembler that runs on one computer and produces
machine code for another computer.
Two types: one-pass assembler and two-pass assembler.
A one-pass assembler assigns memory addresses to the variables and
translates the source code into machine code in a single pass.
A two-pass assembler reads the source code twice.
In the first pass, it reads all the variables and labels and assigns them memory
addresses. In the second pass, it reads the source code and translates the
code into object code.
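The two-pass scheme can be sketched with a toy assembler for an invented instruction set; the mnemonics and opcode values below are made up for illustration. The point of the two passes is that a forward reference (a jump to a label defined later) resolves cleanly, because pass one has already recorded every label address before pass two translates anything:

```python
def assemble(lines):
    """Toy two-pass assembler for an invented ISA. Pass 1 assigns addresses
    to labels; pass 2 translates mnemonics and resolves label operands."""
    symbols, addr = {}, 0
    for line in lines:                      # pass 1: build the symbol table
        if line.endswith(":"):
            symbols[line[:-1]] = addr       # label names the next instruction
        else:
            addr += 1                       # each instruction takes one word

    opcodes = {"LOAD": 0x1, "JMP": 0x2, "HALT": 0xF}
    code = []
    for line in lines:                      # pass 2: translate to object code
        if line.endswith(":"):
            continue
        op, *rest = line.split()
        arg = symbols.get(rest[0], rest[0]) if rest else 0
        code.append((opcodes[op], int(arg)))
    return code

# 'JMP done' is a forward reference: fine here, since pass 1 already
# recorded the address of 'done' before pass 2 translated the jump.
print(assemble(["LOAD 5", "JMP done", "done:", "HALT"]))
# [(1, 5), (2, 2), (15, 0)]
```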
Compiler:
A program which translates a high-level language program into a
machine language program.
A compiler is more intelligent than an assembler.
It checks all kinds of limits, ranges, errors etc.
As a result, its run time is longer and it occupies a larger part of memory.
Slow speed: a compiler goes through the entire program and then
translates the entire program into machine code.
If a compiler runs on a computer and produces the machine code for
the same computer, it is known as a self compiler or resident
compiler.
On the other hand, if a compiler runs on one computer and produces the
machine code for another computer, it is known as a cross compiler.
Interpreter:
An interpreter is a program which translates the statements of a program
into machine code.
It translates only one statement of the program at a time.
It reads one statement of the program, translates it and executes it;
then it reads the next statement, translates it and executes it.
A compiler, on the other hand, goes through the entire program and then
translates the entire program into machine code.
Compiled code typically runs 5 to 25 times faster than interpreted code.
With a compiler, the machine code is saved permanently for future
use; the machine code produced by an interpreter is not saved.
An interpreter is a small program compared to a compiler, so it occupies
less memory space and is used in smaller systems with limited memory.
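The statement-at-a-time behavior can be sketched with a toy interpreter for an invented three-statement mini-language (the statement names `SET`, `ADD` and `PRINT` are made up for illustration):

```python
def interpret(program):
    """Toy interpreter for an invented mini-language: it reads ONE statement,
    translates it, executes it, then moves on to the next statement."""
    env, output = {}, []
    for stmt in program:
        op, *args = stmt.split()            # 'translate' the single statement
        if op == "SET":                     # ...then execute it immediately
            env[args[0]] = int(args[1])
        elif op == "ADD":
            env[args[0]] += int(args[1])
        elif op == "PRINT":
            output.append(env[args[0]])
    return output

print(interpret(["SET x 2", "ADD x 3", "PRINT x"]))  # [5]
```

Nothing is saved between runs: the whole translate-and-execute loop repeats every time, which is exactly why interpreted code runs slower than compiled code.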
Linker:
High-level languages provide built-in header files or libraries.
These libraries are predefined and contain basic functions which
are essential for executing the program.
These functions are linked to the libraries by a program called a linker.
If the linker does not find the library for a function, it informs the compiler,
and the compiler then generates an error.
The compiler automatically invokes the linker as the last step in
compiling a program.
The linker links not only built-in libraries, but also user-defined functions to
user-defined libraries.
Usually a longer program is divided into smaller subprograms called
modules, and these modules must be combined to execute the program.
This combining of the modules is done by the linker.
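The heart of this combining step, symbol resolution, can be sketched as set operations; the module and library contents below are invented, and a real linker of course does much more (relocation, address layout):

```python
def link(modules, libraries):
    """Toy symbol resolution: references are matched first against symbols
    the modules themselves define, then against library symbols; whatever
    is left over would be reported as an 'undefined reference' error."""
    defined, referenced = set(), set()
    for mod in modules:
        defined.update(mod["defines"])
        referenced.update(mod["refers"])
    unresolved = referenced - defined
    for lib in libraries:
        unresolved -= set(lib)              # the library supplies e.g. printf
    return sorted(unresolved)

main_o = {"defines": ["main"], "refers": ["compute", "printf"]}
compute_o = {"defines": ["compute"], "refers": []}
libc = ["printf", "malloc", "exit"]

print(link([main_o, compute_o], [libc]))   # [] -> everything resolved
print(link([main_o], [libc]))              # ['compute'] -> would be a link error
```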
Loader:
A loader is a program that loads the machine code of a program into
system memory.
In computing, a loader is the part of an operating system that is
responsible for loading programs.
It is one of the essential stages in the process of starting a program.
It places programs into memory and prepares them for execution.
Loading a program involves reading the contents of the executable file into
memory.
Once loading is complete, the operating system starts the program by
passing control to the loaded program code.
All operating systems that support program loading have loaders. In
many operating systems the loader is permanently resident in memory.
Debuggers
A debugger or debugging tool is a computer program that is used
to test and debug other programs (the "target" program).
The code to be examined might alternatively be running on
an instruction set simulator (ISS), a technique that allows great power
in its ability to halt when specific conditions are encountered,
but which will typically be somewhat slower than executing the code
directly on the appropriate (or the same) processor.
Some debuggers offer two modes of operation (full or partial
simulation) to limit this impact.
Suppose you have a function (Function A) that calculates the total marks obtained by a student in
a particular academic year. Suppose this function derives its values from another function
(Function B) which calculates the marks obtained in a particular subject.
You have finished working on Function A and want to test it. But the problem you face here is
that you can't run Function A without input from Function B, and Function B is still under
development. In this case, you create a dummy function to act in place of Function B to test your
function. This dummy function gets called by the function under test. Such a dummy is called a stub.
To understand what a driver is, suppose you have finished Function B and are waiting for Function
A to be developed. In this case you create a dummy to call Function B. This dummy is called
a driver.
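The scenario above can be sketched directly; the subject names and the canned mark value are illustrative:

```python
def function_b_stub(subject):
    """Stub: stands in for the unfinished Function B and returns a canned
    mark (50 is an arbitrary test value)."""
    return 50

def function_a(subjects, marks_for):
    """Function A: total marks for the year, one Function B call per subject."""
    return sum(marks_for(s) for s in subjects)

def driver(function_b):
    """Driver: a throwaway caller used to exercise Function B on its own."""
    return function_b("mathematics")

# Function A can now be tested even though the real Function B is unfinished.
print(function_a(["maths", "physics", "chemistry"], function_b_stub))  # 150
```

Once the real Function B exists, the stub is discarded and Function A is re-tested against it; likewise the driver is thrown away once a real caller for Function B exists.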
Tool chain
A tool chain is a set of programming tools that are
used to perform a complex software development
task or to create a software product.
It is a set of distinct software development tools linked
or chained together through specific stages, such as:
GNU Compiler Collection (GCC)
binutils
glibc
Design issues
Appendix
The GNU Binutils are a collection of binary tools. The main ones are:
ld - the GNU linker.
as - the GNU assembler.
glibc:
Any Unix-like operating system needs a C library: the library which defines the
"system calls" and other basic facilities such as open, malloc (memory allocation),
printf, exit...
The GNU C Library is used as the C library in the GNU system and in GNU/Linux
systems, as well as many other systems that use Linux as the kernel.
Hardware/Software/Firmware
HARDWARE is the physical arrangement of electronic parts that
can only be changed with a screwdriver or soldering iron. It is
purely physical.
SOFTWARE is the arrangement of digital instructions that guide
the operation of computer hardware. Software is loaded from
storage (flash, disk, network, etc) into the computer's operating
memory (RAM) on demand, and is designed to be easy to change.
FIRMWARE is a special class of software that is not intended to
change once shipped. An update requires either a swap of chips or a
special process to reload the flash memory containing the
software. This kind of software powers things like your TV, your
microwave, and your home router, as well as the BIOS (the boot
code) of your PC.