
Test Programming an ATE for Diagnosis

by

Heng Zhao

A thesis submitted to the Graduate Faculty of


Auburn University in
partial fulfillment of the
requirements for the Degree of
Master of Electrical Engineering

Auburn, Alabama

June 1, 2015

Keywords: Diagnostic Tree, stuck-at-fault, Automatic Test Equipment

Copyright 2015 by Heng Zhao

Approved by

Vishwani D. Agrawal, Chair, James J. Danaher Professor of Electrical & Computer Eng.
Victor P. Nelson, Professor of Electrical & Computer Engineering
Adit D. Singh, James B. Davis Professor of Electrical & Computer Engineering
Abstract

To test a circuit, we apply input logic vectors and observe its outputs. A defect is a physical flaw introduced into a device during the manufacturing process. The logic values observed at the primary outputs of a device under test (DUT) upon application of a test pattern t are called the output of that test pattern; the output produced by a fault-free device that works exactly as designed is called the expected output. A fault is said to be detected by a test pattern if the output for that pattern differs from the expected output. A large circuit, however, may need so many test vectors that testing takes a great deal of time and money, and the cost of testing can even exceed the cost of manufacturing the circuit. Reducing the number of test vectors therefore saves considerable expense.

Although all copies of a circuit are manufactured from the same design, the fabricated parts differ in the defects they contain, so testing every part with the same vectors in the same order is not efficient. Circuits in different conditions should be tested with different vectors applied in different sequences. The diagnostic tree algorithm helps choose the vectors and shortens the diagnosis time: if the circuit fails the current vector, the tree selects one vector to apply next; if it passes, the tree selects a different one. After implementing the algorithm, we need to see how it works in practice, so a benchmark circuit is emulated and tested with the diagnostic tree on an ATE. The ATE applies the tests, and the process can be observed in the Flow Edit tool.

Acknowledgments

I would like to thank the many people who have helped me. First and foremost, I am grateful to Prof. Vishwani D. Agrawal for all his support; without his help I could not have finished this thesis. In his office I received many ideas for this work, and his suggestion to use the ATE machine helped me a great deal. I am very grateful to be his student.

I am also thankful to Prof. Adit D. Singh and Prof. Victor P. Nelson for serving on my advisory committee. I have learned a lot from their classes on VLSI testing and computer-aided design.

I would also like to thank my family, who supported my studies at Auburn from China. Finally, I thank my friends Zhao Yang, BaoHu Li, Chaoyi She, JiaLin Ding, Kupeng Zeng, and ZeShi Luo for their help during these years of study.

Table of Contents

Abstract......................................................................................................................................................... ii

Acknowledgments ....................................................................................................................................... iii

List of Figures ................................................................................................................................................ v

List of Abbreviations .................................................................................................................................... vi

1 Introduction ........................................................................................................................................... 7

1.1 Organization of Thesis .............................................................................................................. 7

2 Background ....................................................................................................................................... 8

2.1 Stuck-at fault ............................................................................................................ 8

2.2 Diagnostic test .......................................................................................................................... 8

2.3 Diagnostic tree ......................................................................................................................... 8

2.4 ATE tester ................................................................................................................................. 9

2.5 Advantest T2000 .................................................................................................................... 10

2.6 Tessent FastScan .................................................................................................................... 10

2.7 Benchmark Circuit .................................................................................................................. 10

2.8 OTPL ........................................................................................................................................ 10

2.9 ATPG ........................................................................................................................................ 11

3 Prior Work ......................................................................................................................................... 12

3.1 Traditional Stuck-at Fault Emulation ......................................................................... 12

3.2 Using Multiplexers to Emulate Stuck-at Faults .......................................................... 13

4 Algorithms ....................................................................................................................................... 15

5 Simulation and Result ...................................................................................................................... 17

5.1 Results ..................................................................................................................................... 22

6 Conclusion ....................................................................................................................................... 27

6.1 Future Work .............................................................................................................................. 27

Bibliography ................................................................................................................................................ 29

List of Figures

3.1 Stuck-at-0 model. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12

3.2 Multiplexer for stuck-at fault emulation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13

3.3 Original signal simulation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13

3.4 Stuck-at-1 model simulation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .. . . . . . . . . . . . 14

4.1 Diagnostic tree. . . . . . . . . . . . . . . . . . . . . . . . . . .. . . . . . . . . . . . . . . . . . . . . . . .. . . . . . . . . . . . . . . . . . . . 15

4.2 Diagnostic tree in 74182 benchmark circuit. . . . . .. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .16

5.1 Fault dictionary. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17

5.2 74182 Gate-Level Schematic. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .18

5.3 File generated by FastScan . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19

5.4 Faults detected by vector T0. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20

5.5 Faults detected by vector T1. . . . . . . . . . . . . . . . . .. . . . . . . . . . .. . . . . . . . . . . . . . . . . . . . . . . . . . . . .20

5.6 Fault dictionary for 74182 benchmark circuit. . . . . . .. . . . .. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21

5.7 Fault free circuit in Flow edit. . . . . . . . . . . . . . .. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23

5.8 Traditional way to test circuit. . . . . . . . . . . . . . . . . . . .. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23

5.9 Diagnostic tree to test circuit. . . . . . . . . . . . . . . . . . . .. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .24

5.10 Multiple-stuck-at-fault (most common result). . . .. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25

5.11 Multiple-stuck-at-fault (special result). . . . . . . . . . . . . . . . . .. . . . . . .. . . . . . . . . . . . . . . . . . . . . 26

6 Full-response fault dictionary. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28

List of Abbreviations

ATE Automatic Test Equipment

ATPG Automatic Test Pattern Generation

DFT Design for Testability

OTPL Open architecture Test-plan Programming Language

Chapter 1
Introduction

Some manufactured circuits inevitably contain defects. Even though a high proportion of them are good circuits, all of them must be tested, so a short test time is desirable. There are various ways to reduce the test time; one is to use fewer test vectors while still achieving high fault coverage, and other methods exist as well. In this project, the main objective is to reduce the number of test application clock cycles, which shortens the test time. Application of an algorithm called the diagnostic tree is explored.

1.1 Organization of Thesis

Chapter 2 introduces various concepts that help in understanding the proposed work. Chapter 3 gives a brief study of the previous work. Chapter 4 explains the algorithm used in this project. Chapter 5 describes the simulation and the results obtained with the ATE machine. Chapter 6 gives the conclusion of the project and suggests future work that may be done to optimize the algorithm.

Chapter 2

Background

This chapter explains some concepts used in this thesis.

2.1 Stuck-at fault

A stuck-at fault is a particular fault model used by fault simulators and automatic test pattern generation (ATPG) tools to mimic a manufacturing defect within an integrated circuit. Individual signals and pins are assumed to be stuck at logical ‘1’, ‘0’ or ‘X’. In this project I target single stuck-at faults: it is assumed that only one line or node in the digital circuit is stuck [12], sometimes at logic high and sometimes at logic low. Because this fault model is easy to emulate and detect on an FPGA board, my circuits contain only single stuck-at faults.
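To make the single stuck-at model concrete, the following sketch (illustrative Python only; the toy circuit and signal names are hypothetical, not the 74182) evaluates a two-gate circuit once fault-free and once with an internal line stuck at 1, and checks whether a given input vector detects the fault.

    # Minimal sketch of the single stuck-at fault model on a toy
    # two-gate circuit: y = (a AND b) OR c.  Names are illustrative only.

    def evaluate(a, b, c, stuck=None):
        """Evaluate the circuit; 'stuck' optionally forces the internal
        net n1 (output of the AND gate) to a fixed value 0 or 1."""
        n1 = a & b
        if stuck is not None:              # inject the single stuck-at fault
            n1 = stuck
        return n1 | c

    # A vector detects the fault if the faulty output differs from the
    # expected (fault-free) output.
    vector = (0, 1, 0)                      # a=0, b=1, c=0
    expected = evaluate(*vector)            # fault-free output: 0
    observed = evaluate(*vector, stuck=1)   # with n1 stuck-at-1: 1
    print("detected" if observed != expected else "not detected")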

2.2 Diagnostic test

A diagnostic test is applied after the circuit has failed. With the diagnostic test, we can identify the faulty part.

2.3 Diagnostic tree

For a large circuit, the fault dictionary requires a large amount of storage, and matching the observed test results against it takes a long time. If the test vectors are chosen carefully, much of this time can be saved. The diagnostic tree is an algorithm described in the book Essentials of Electronic Testing [12]. It is similar to a decision tree: for a circuit under test, it decides which vectors should be applied and in what order. This tree is also called a fault tree. It is a top-down, deductive analysis of the state of the system, using Boolean logic to combine the outcomes of a series of vectors. Tests are applied one at a time, and after each application a partial diagnosis is obtained; the diagnostic tree chooses the next vector based on the outcome of the previous test. The tree can be learned and optimized, and this top-down decision-tree construction is an example of a greedy algorithm.
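As a rough illustration of this process, the sketch below (Python, with a made-up three-test fault dictionary, not data from this thesis) prunes a suspect fault set after each pass/fail outcome and stops as soon as the partial diagnosis is down to a single candidate.

    # Sketch: apply tests one at a time and prune the suspect fault set
    # after each outcome.  The dictionary and fault names are made up.

    fault_dict = {                # test -> set of faults that test detects
        "T1": {"a0", "b1"},
        "T2": {"a0", "c0"},
        "T3": {"b1", "c0", "d1"},
    }
    all_faults = {"a0", "b1", "c0", "d1"}

    def diagnose(outcomes):
        """outcomes maps test name -> True if the DUT failed that test."""
        suspects = set(all_faults)
        for test, failed in outcomes.items():
            detected = fault_dict[test]
            if failed:
                suspects &= detected    # fault must be one this test detects
            else:
                suspects -= detected    # passing rules these faults out
            if len(suspects) <= 1:      # diagnosis may terminate early
                break
        return suspects

    print(diagnose({"T1": True, "T2": False}))   # -> {'b1'}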

2.4 ATE tester

Automatic test equipment (ATE) is a system that applies tests to a device under test (DUT), also referred to as a unit under test (UUT), and makes a pass or fail decision. An ATE can range from a simple computer-controlled setup to a complex system containing dozens of sophisticated test instruments (real or simulated electronic test equipment) for testing complex packaged parts or chips. ATE is widely used in electronics manufacturing after the production of electronic components and systems. It can also be used to test electrical equipment and automotive electronic modules, and it is used for radar and other wireless-communication applications in the military.

A semiconductor ATE architecture contains a main controller (usually a computer) that synchronizes one or more signal source and capture instruments. Historically, custom designs of the controller, pin electronics, and signal generators have been used in ATE systems. The device under test (DUT) is physically connected to the ATE through a custom interface adapter or “fixture”. The ATE computer is programmed in a modern language (such as C, C++, Java, Smalltalk, or LabVIEW) with additional control statements.

2.5 Advantest T2000

The T2000 operating system provides an easy-to-use environment. It offers a general software platform for test, supports a variety of industry standards such as STIL and STDF, and includes powerful GUI tools for developing and debugging test programs.

2.6 Tessent FastScan

The Tessent FastScan software simplifies the process of producing compact, high-coverage tests. It can be used for almost all types of designs, which makes it a common ATPG solution. Comprehensive at-speed testing is vital for high-quality defect detection; FastScan supports at-speed transition tests, multiple detections, timing-aware test generation, and critical-path tests.

2.7 Benchmark Circuit

The benchmark circuits are a set of standard logic circuits that help in the development and evaluation of test methodologies. They are available in various formats (bench, Verilog, VHDL) and at various levels of description, from RTL to gate level.

2.8 OTPL

A test plan is written in the official T2000 programming language, the Open architecture Test-plan Programming Language (OTPL). The OTPL test plan is converted to C++ and then compiled into a Microsoft Windows Dynamic Link Library (DLL) file.

2.9 ATPG

ATPG (Automatic Test Pattern Generation) is a method of deriving tests that distinguish the correct circuit behavior from the faulty behavior caused by defects. ATPG requires that real defects be represented by fault models. The number of modeled faults detected and the number of generated patterns determine the effectiveness and efficiency of the resulting test set.

Chapter 3
Prior Work

This chapter discusses previous work related to the problems addressed in this research.

A stuck-at fault is a fault model used by fault simulation and automatic test pattern generation (ATPG) tools to mimic a manufacturing defect in an integrated circuit. Individual signals and pins are assumed to be stuck at logical ‘1’, ‘0’ or ‘X’.

3.1 Traditional Stuck-at Fault Emulation

In the traditional approach, a single stuck-at fault can be emulated with a purely logical construction, which is easy to understand. Figure 3.1 shows a logical way to create a stuck-at-0 signal with a ‘not’ gate (~) and an ‘and’ gate (*): the resulting logic is (~A * A) = 0.

Figure 3.1 Stuck-at-0 model.

Similarly, to create a stuck-at-1 signal, we change the ‘and’ gate to an ‘or’ gate (+), giving the logic (~A + A) = 1.

However, for stuck-at fault diagnostic experiments, single stuck-at faults must be created at different nodes or lines under the control of select signals. If the emulated circuit had to be modified and reloaded onto the FPGA board for every fault, a lot of time would be wasted.

3.2 Using Multiplexers to Emulate Stuck-at Faults

With the Xilinx XC3S200A FPGA board, I can emulate stuck-at faults by adding multiplexers at the faulty nodes. At a target node, a multiplexer is inserted with one input connected to the original gate output and the other input connected to the stuck-at-1 or stuck-at-0 value being emulated. As shown in Figure 3.2, the select signal (Errsig) chooses between the faulty value and the correct data.

Figure 3.2 Multiplexer for stuck-at fault emulation.

In ModelSim, I simulated a circuit containing the multiplexer. When the Errsig signal is set to 1, PBo_0 is identical to the original signal PBo.

Figure 3.3 Original signal simulation.
However, when Errsig is set to 0, PBo_0 follows the signal ss, which is held at 1 all the time. In this way, a stuck-at fault can be emulated on the FPGA board.

Figure 3.4 Stuck-at-1 model simulation.
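The behavior of the inserted multiplexer can be summarized in a few lines. The sketch below (plain Python, not the Verilog used on the board) mirrors the two ModelSim cases described above: Errsig = 1 passes the original PBo through, and Errsig = 0 selects the constant ss line, emulating a stuck-at-1 fault.

    # Behavioral sketch of the fault-injection multiplexer described above.
    # Errsig = 1 selects the original signal PBo; Errsig = 0 selects the
    # constant line ss, which is held at 1 to emulate a stuck-at-1 fault.

    def mux_fault(pbo, errsig, ss=1):
        """Return the observed value PBo_0 for one simulation step."""
        return pbo if errsig else ss

    trace = [0, 1, 1, 0, 1]                                # some PBo waveform
    fault_free = [mux_fault(v, errsig=1) for v in trace]   # follows PBo
    stuck_at_1 = [mux_fault(v, errsig=0) for v in trace]   # constant 1
    print(fault_free)   # [0, 1, 1, 0, 1]
    print(stuck_at_1)   # [1, 1, 1, 1, 1]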

Chapter 4
Algorithms

The book Essentials of Electronic Testing [12] gives a diagnostic tree with four test vectors, T1, T2, T3 and T4, shown in Figure 4.1.

With a diagnostic tree, the diagnosis can terminate before all tests have been applied; for example, the fault a0 is identified after only two vectors, T2 and T4. The depth of a diagnostic tree is bounded by the total number of tests but, as the book notes, it can be smaller: my diagnostic tree is built from eleven tests, yet at most nine of them are applied.

The diagnostic tree can be arranged in several ways. One approach is to reduce its depth. We start with the set of all faults as “suspects.” Tests are ordered such that the passing of each test reduces the suspect set by the greatest amount. This may sometimes increase the depth of the tree on the side of failing tests. The overall depth can be reduced by having each test divide the current fault set into two equal halves.

Figure 4.1 A diagnostic tree.

Following this approach, I divided the faults into two nearly equal halves at each step, which approaches the theoretical lower bound on the length of the diagnosis. Figure 4.2 shows the resulting tree.

Figure 4.2 Diagnostic tree for 74182 benchmark circuit.
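As a rough sketch of this construction (not the program used in this thesis), the Python below builds a binary diagnostic tree from a fault dictionary by always choosing the test that splits the current suspect set most evenly; the small dictionary is made-up data, not the 74182 dictionary of Chapter 5.

    # Sketch of diagnostic-tree construction: at each node pick the test
    # whose detected-fault set splits the remaining suspects closest to
    # two equal halves.  The fault dictionary below is illustrative only.
    import pprint

    fault_dict = {
        "T1": {"f1", "f2", "f3"},
        "T2": {"f1", "f4"},
        "T3": {"f2", "f5", "f6"},
    }
    all_faults = {"f1", "f2", "f3", "f4", "f5", "f6"}

    def build_tree(suspects, tests):
        if len(suspects) <= 1 or not tests:
            return sorted(suspects)          # leaf: final diagnosis
        best = min(tests, key=lambda t: abs(2 * len(suspects & fault_dict[t])
                                            - len(suspects)))
        rest = [t for t in tests if t != best]
        return {best: {"fail": build_tree(suspects & fault_dict[best], rest),
                       "pass": build_tree(suspects - fault_dict[best], rest)}}

    pprint.pprint(build_tree(all_faults, list(fault_dict)))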

Chapter 5
Simulation and Result

A diagnostic test is applied after the circuit has failed. Using the diagnostic tree, we can test the circuit, find the faulty part, and replace it. To create the diagnostic tree, we first need the fault dictionary for the circuit.

Figure 5.1 Fault dictionary.

In my project, I use an example circuit to simulate faults and generate tests. The carry look-ahead (CLA) realization of the carry function is used by each of the 74X-series circuits modeled here. Given carry-in (C_n), generate (G) and propagate (P) signals, the circuit produces three carry-out signals, plus two P and G signals used to cascade with another CLA block. This circuit is small enough to be modeled as a single primitive.

The Advantest T2000 tester uses its own proprietary software for the execution and analysis of test programs. The objective of the project is to emulate faults on a field-programmable gate array (FPGA) that are analogous to the random defects that can occur on a chip during the microfabrication process. A Xilinx XC3S200A board is used in this project. The Advantest tester is capable of applying the test patterns to the FPGA serially and has software already loaded that detects timing errors and faults of the DUT. With the Xilinx ISE Project Navigator, the circuit description written in Verilog is compiled into a bit file; a UCF constraint file is also needed so that ISE maps the circuit ports to pins that can be driven and observed by the ATE machine. The Mercury Programmer then downloads the bit file onto the FPGA board. The Advantest T2000 tester is also capable of programming the FPGA board with a bit configuration file converted into a pattern list coded in the OTPL language.

Figure 5.2 74182 gate-level schematic.

To create the fault dictionary, the first step is to run FastScan with the command “set fault type stuck at faults –all” to generate vectors to test the circuit. After FastScan generated eleven patterns, I reloaded each pattern into FastScan individually to find all the faults that the pattern can detect. In this way, I obtained the data shown next.

Figure 5.3 File generated by FastScan.

In this file, the code DS means that the vector detects the fault, UO marks an unobserved fault, UT an untestable fault, and RE a redundant fault. In the dictionary there are two types of entries, one and zero, meaning detected and not detected. Each pattern file that contains DS entries is examined by my Python program, and I obtain files such as those shown next.

Figure 5.4 Faults detected by vector T0.

Figure 5.5 Faults detected by vector T1.

Each pattern thus has its own set of detectable single stuck-at faults, just like a dictionary entry. I then combined these into one large dictionary covering all eleven test vectors. There are seventy-five faults in the dictionary; some equivalent faults are not listed separately in the figure below.
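A dictionary like the one in Figure 5.6 can be assembled mechanically from the per-pattern fault reports. The sketch below shows one way to do it in Python; the file names and the assumption that a ‘DS’ token marks a detected fault whose name is the last field are illustrative simplifications, so the parsing would need to be adapted to the actual FastScan report layout.

    # Sketch: build a fault dictionary (fault -> one bit per test) from
    # per-pattern fault reports.  File names and line format are assumed.

    def detected_faults(path):
        """Return the set of faults a report marks as detected ('DS')."""
        faults = set()
        with open(path) as report:
            for line in report:
                fields = line.split()
                if "DS" in fields:            # detected-by-simulation code
                    faults.add(fields[-1])    # assume last field names the fault
        return faults

    patterns = ["pattern_T%d.txt" % i for i in range(11)]   # hypothetical names
    per_test = [detected_faults(p) for p in patterns]

    all_faults = sorted(set().union(*per_test))
    dictionary = {f: [1 if f in d else 0 for d in per_test] for f in all_faults}

    for fault, row in dictionary.items():
        print(fault, "".join(str(bit) for bit in row))   # one row per fault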

Figure 5.6 Fault dictionary for 74182 circuit.

5.1 Results

On the ATE machine, the diagnostic tree is implemented as the test flow and can be viewed in Flow Edit. The tree is designed in the .tpl file, which is the test-plan file. The DLL version of the test plan is loaded by the user into the T2000 System Controller, which checks the programmed tester settings and then copies the test plan and all pattern files to each of the Site Controllers used by the test plan.

An OTPL test plan is made up of several files that specify the conditions and execution sequence of a set of tests to be applied to a DUT. These conditions include the levels and timing for the DUT pins and the vectors to be applied as part of the functional pattern. The eleven vectors are placed in eleven pattern files. Whenever the circuit fails a pattern, the flow branches to one successor pattern; if it passes, the flow branches to a different one. In the end, the bin tool reports which faulty part has been identified.
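The pass/fail branching that the test-plan flow implements can be pictured as a small table mapping each pattern to its two successors. The sketch below (Python, with an invented three-pattern flow and generic bin labels, not the actual 74182 test plan or OTPL syntax) walks such a table until a bin is reached.

    # Sketch of the pass/fail branching in the test-plan flow: each pattern
    # names the next step for a pass and for a fail until a bin is reached.
    # The flow table and bin labels are illustrative only.

    flow = {
        # pattern: (next_if_pass, next_if_fail)
        "T0": ("T3", "T6"),
        "T3": ("BIN fault-free", "BIN fault f2"),
        "T6": ("BIN fault f3", "BIN fault f4"),
    }

    def run_flow(start, dut_fails):
        """dut_fails(pattern) -> True if the DUT fails that pattern."""
        node = start
        while not node.startswith("BIN"):
            next_if_pass, next_if_fail = flow[node]
            node = next_if_fail if dut_fails(node) else next_if_pass
        return node

    # Example: a DUT that fails patterns T0 and T6 ends in the f4 bin.
    print(run_flow("T0", dut_fails=lambda p: p in {"T0", "T6"}))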

After the machine had been fully configured, I first tested the fault-free circuit and obtained the expected result, shown in Figure 5.7.

Figure 5.7 Fault-free circuit in Flow Edit.

First, I tried the traditional way of testing the circuit, that is, applying all eleven vectors. The fault emulated here is ‘/PB[3] stuck-at-1’. The result appears in the Flow Edit display of Figure 5.8, which shows the vectors the circuit fails (red) and the vectors it passes (green). The fault is then found by looking up this pass/fail pattern in the dictionary.

Figure 5.8 Traditional way to test circuit.

Next, I ran the same faulty circuit with the test plan that implements the diagnostic tree and obtained the result of Figure 5.9 in the bin tool, which identifies the fault as ‘/PB[3] stuck-at-1’ using only four vectors. In this way, more than half of the time needed to find the faulty part is saved. I also tried the faults ‘PB[0] stuck-at-0’ and ‘PB[0] stuck-at-1’ and obtained the results I expected.

Figure 5.9 Diagnostic tree to test circuit.

Then I tried some multiple stuck-at faults. Many of them were identified as the fault /CNY stuck-at-0. However, for the multiple fault with PB[0] and GB[0] both stuck-at-1, the diagnostic tree reported ‘/CNX stuck-at-0’. I also tried faults that change a gate type; for example, when ‘AND PB0GB01gate (PB0GB01, PB[0], GB[0], GB[1]);’ was changed to ‘OR PB0GB01gate (PB0GB01, PB[0], GB[0], GB[1]);’, the diagnostic tree reported ‘/CNY stuck-at-0’ as the fault.

Figure 5.10 Multiple-stuck-at fault (most common result).

Figure 5.11 Multiple-stuck-at fault (special result).

Chapter 6
Conclusion

Compared to the traditional approach, the diagnostic tree saves a great deal of time in VLSI testing. The diagnostic tree can be built in many ways for the same circuit; building it as a balanced binary tree is the most effective way to optimize the test. With the mux-select scheme, stuck-at faults can be emulated on the FPGA board. The successful ATE experiment with the FPGA board showed how the algorithm can be applied in real diagnosis. The Flow Edit tool of the T2000 displays the diagnostic tree visually, so we can see which vectors fail and which pass.

6.1 Future Work

In my opinion, the reason many non-single stuck-at faults are diagnosed by the diagnostic tree as /CNY stuck-at-0 is that these faults produce failures on all the test vectors: in the present tree, a device that fails every vector ends up at the leaf labeled /CNY stuck-at-0 on the primary output. This suggests a way to optimize the diagnostic tree. When a faulty part contains a multiple stuck-at fault, FastScan can be used to generate additional vectors that detect it; in other words, vectors are added so that every fault of interest has a unique signature (test syndrome) in the fault dictionary. These FastScan-generated detection vectors are the most important data for optimizing the diagnostic tree, which can thus be enhanced to identify many other faults as vectors are added to its construction.

All the diagnostic trees can also be combined into one large tree by connecting the next tree at port twenty-four, the leaf that currently means /CNY stuck-at-0. When a device reaches this leaf, we know that it has failed every vector, so from that point the diagnosis can continue into another diagnostic tree. If that tree again ends at a leaf meaning that all of its vectors failed, we can continue into yet another tree.

Furthermore, a full-response fault dictionary can be used to address the ambiguity caused by multiple stuck-at faults, as shown below in Figure 6.

Figure 6 A full-response fault dictionary.

Consider a circuit with two outputs, o1 and o2, which may contain a multiple stuck-at fault, and suppose f1 through f8 are the faults detected by vectors t1 through t5. A ‘1’ in the row for t1 and f1 under column o1 means that fault f1 is detected by t1 at output o1; a ‘0’ in the row for t1 and f1 under column o2 means that f1 is not detected by t1 at output o2. In this way, a table is built for the different vectors and can then be used to construct a more detailed diagnostic tree. Because the table distinguishes the individual outputs, it reduces the ambiguity introduced by a multiple stuck-at fault, especially in a circuit where a single stuck-at fault may not influence some of the outputs.
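To illustrate the idea, the sketch below (Python, with invented signatures rather than the table of Figure 6) keys the dictionary by (test, output) pairs; two faults that a per-test dictionary cannot separate become distinguishable once the individual outputs are recorded.

    # Sketch of a full-response fault dictionary: a fault's signature is its
    # detection pattern per test AND per output.  The data is illustrative.

    # signature[fault][(test, output)] = 1 if the fault is detected there
    signatures = {
        "f1": {("t1", "o1"): 1, ("t1", "o2"): 0, ("t2", "o1"): 1},
        "f2": {("t1", "o1"): 1, ("t1", "o2"): 1, ("t2", "o1"): 1},
    }
    entries = [("t1", "o1"), ("t1", "o2"), ("t2", "o1")]

    def distinguishable(fa, fb):
        """True if some (test, output) entry separates the two faults."""
        return any(signatures[fa].get(e, 0) != signatures[fb].get(e, 0)
                   for e in entries)

    # Per-test detection alone (outputs combined) is identical for f1 and
    # f2, but the per-output entry (t1, o2) tells them apart.
    print(distinguishable("f1", "f2"))   # -> True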

Bibliography

[1] Nanosim User Guide. Synopsys, San Jose, CA, 2008.


[2] EZWAVE User Guide. Mentor Graphics Corp., Wilsonville, OR, 2011.
[3] L. Zhao, “Net Diagnosis Using Stuck-at and Transition Fault Models,” Master’s thesis, Auburn
University, Auburn, Alabama, USA, 2011.
[4] M. A. Shukoor, “Fault Detection and Diagnostic Test Set Minimization,” Master’s thesis, Auburn
University, Auburn, Alabama, USA, 2009.
[5] K. R. Kantipudi, “Minimizing N-Detect Tests for Combinational Circuits,” Master’s thesis, Auburn
University, Auburn, Alabama, USA, 2007.
[6] C. Alagappan, “Dictionary-Less Defect Diagnosis as Real or Surrogate Single Stuck-At Faults,”
Master’s thesis, Auburn University, Auburn, Alabama, USA, 2013.
[7] M. A. Shukoor and V. D. Agrawal, “Compaction of Diagnostic Test Set for a Full-Response
Dictionary,” Journal of Electronic Testing, April 2012, Volume 28, Issue 2, pp. 177-187.
[8] C. Alagappan and V. D. Agrawal, “Defect Diagnosis of Digital Circuits Using Surrogate Faults,”
Communications in Computer and Information Science, Volume 382, 2013, pp. 376-386.
[9] Leonardo Spectrum User Guide. Mentor Graphics Corp, Wilsonville, OR, 2011.
[10] ATPG and Failure Diagnosis Tools. Mentor Graphics Corp., Wilsonville, OR, 2009.
[11] C. Dunbar and K. Nepal, “Using Platform FPGAs for Fault Emulation and
Test-set Generation to Detect Stuck-at Faults,” Journal of Computers, Volume 6, No. 11, November
2011.
[12] M. L. Bushnell and V. D. Agrawal, Essentials of Electronic Testing for Digital, Memory and Mixed-
Signal VLSI Circuits, Springer, 2000.
[13] F.-W. Wang, J.-Y. Shi and L. Wang, “Method of Diagnostic Tree Design for System-Level Faults
Based on Dependency Matrix and Fault Tree,” Proc. IEEE 18th International Conference on
Industrial Engineering and Engineering Management (IE&EM), 3-5 Sept. 2011, pp. 1113–1117.
[14] D. W. Tong, C. H. Jolly and K. C. Zalondek, “Diagnostic Tree Design with Model-Based Reasoning,”
Proc. International Test Conference, 2004, pp. 355–364.
[15] T. Assaf and J. Bechta Dugan, “Diagnostic Expert Systems from Dynamic Fault Trees,” Proc. Annual
Symposium on Reliability and Maintainability, 26-29 Jan. 2004, pp. 444-450.
[16] P. Gmytrasiewicz, J. A. Hassberger and J. C. Lee, “Fault Tree Based Diagnostics Using Fuzzy Logic,”
IEEE Transactions on Pattern Analysis and Machine Intelligence, Volume 12, No. 11, pp. 1115–
1119, 1990.
[17] P. Girard, “Survey of Low-Power Testing of VLSI Circuits,” IEEE Design & Test of Computers, May-
June 2002, pp. 80–90.

[18] C.-H. Wu and K.-J. Lee, “An Efficient Diagnosis Pattern Generation Procedure to Distinguish
Stuck-at Faults and Bridging Faults,” Proc. 23rd IEEE Asian Test Symposium (ATS), 2014, pp. 306–
311.
[19] A. Matrosova, E. Loukovnikova, S. Ostanin and A. Zinchuck, “Test Generation for Single and
Multiple Stuck-at Faults of a Combinational Circuit Designed by Covering Shared ROBDD with
CLBs,” Proc. Defect and Fault-Tolerance in VLSI Systems, 26-28 Sept. 2007, pp. 206–214.
[20] D. M. Miller and J. C. Muzio, “Spectral Fault Signatures for Single Stuck-At Faults in Combinational
Networks,” Computers, Volume C-33, No. 8, pp. 765–769, 1984.
[21] D. B. Lavo, B. Chess, T. Larrabee and F. J. Ferguson, “Diagnosing Realistic Bridging Faults with
Single Stuck-at Information,” IEEE Transactions on Computer-Aided Design of Integrated Circuits
and Systems, Volume 17, No. 3, pp. 255–268, 1998.
[22] H. Takahashi, N. Yanagida and Y. Takamatsu, “Multiple Stuck-at Fault Diagnosis in
Combinational Circuits Based on Restricted Single Sensitized Paths,” Proc. 2nd IEEE Asian Test
Symposium, 1993, pp. 185–190.
[23] M. A. Heap and W. A. Rogers, “Generating Single-Stuck-Fault Coverage from a Collapsed-Fault
Set,” Computer, Volume 22, Issue 4, pp. 51-57.
