
Cambridge International General Certificate of Secondary Education

0417 Information and Communication Technology November 2016


Principal Examiner Report for Teachers

INFORMATION AND COMMUNICATION


TECHNOLOGY

Paper 0417/02
Practical Test A

Key Messages

Candidates should be advised to take care with any data entry tasks and to check and proofread their work.

Candidates should be familiar with the differences between serif and sans serif fonts, and with the fact that
these are font properties, rather than font styles.

Candidates should ensure that any screen shots within their evidence document are readable and of a
suitable size.

General comments

Most candidates attempted the knowledge questions, but the evaluation question was not completed well.
Many candidates described the checklist rather than evaluating the suitability of the checklist they had
produced, and very few related this to the target audience.

There were a significant number of typographical errors in data entry throughout the paper. Many of these
inaccuracies could have been avoided with more careful checking and proofreading. Candidates are advised
to enter the text exactly as shown on the exam paper and to check and recheck this data entry in the
document, report, mail merge and presentation. Common errors included incorrect capitalisation, incorrect or
missing characters, omission of spaces, truncated headings and additional punctuation. Candidates also
have difficulty distinguishing between the typeface categories serif and sans-serif, with some using these as
a font face name. This was evident in the screenshot evidence of the paragraph style, where some candidates
entered 'Serif' or 'Sans-serif' as the font name in the font dialogue box rather than selecting a font with
the serif or sans-serif properties required.

Every task prompts the candidate to print their name, Centre number and candidate number on every
document submitted for assessment. It is important that candidates do this as, without clear printed evidence
of the author of the work, the Examiner cannot award marks for these pages. Printouts will not be
marked if the names and details are missing, or if these details are written on by hand, as there is no real
evidence that the candidate is the originator of the work. Documents that extend to more than one page but
have identification details on one page only, such as a database report with the name at the top or bottom,
will be treated as a single document and marked, even without identification details on the second page,
providing the layout is consistent and it is obviously part of the same document. It is therefore important that
exam Supervisors return all printouts to candidates, even where identification details are missing from the
second page, as Examiners can identify that it is part of the same document. Some candidates submitted
multiple printouts for some of the tasks and, as instructed, crossed out those printouts that were draft copies.
If multiple printouts are submitted without draft versions being crossed through, only the first occurrence of
each page will be marked.

Candidates are required to produce screenshots to evidence the ICT skills that cannot be assessed through
the printed product alone. Candidates should check each printed screenshot to ensure it is clear and large
enough to be read. Where Examiners are unable to read the materials presented, they cannot award
candidates the marks. Similarly, some candidates did not achieve marks as a result of presenting
screenshots with important elements cropped.


Access to the internet or email during the examination is not permitted, and candidates who use the internet
to find answers to the theory questions are in breach of the regulations for this syllabus. Centres that allow
access to the internet may find their candidates disqualified from the whole of the 0417 qualification.
Candidates should be encouraged to print their Evidence Document towards the end of the exam, regardless
of whether they have finished the exam paper. Centres should not staple the work, hole-punch it or tie it
together with string. A number of candidates lost marks where holes had been punched through text being
marked, such as the headings of database reports.

Comments on specific questions

Task 1 The Evidence Document

This question was completed well by most candidates. Occasionally the screenshots were too small or faint
to be read. A small number did not present the evidence document for marking.

Task 2 Document

Question 1

The page setup was generally completed well. A few candidates set the margin measurements in inches
rather than centimetres without converting the measurements. Some candidates displayed a screenshot of
the actual document without the page setup options visible and a few incorrectly evidenced the printer
orientation settings rather than the document page setup options. The paper orientation evidence was
occasionally cropped out of the screenshot image.

Question 2

A large number of candidates saved the file in the original .rtf format rather than in the format of the word
processing software being used. Screenshot evidence of the save was often inconclusive as it showed the
save process rather than capturing the outcome after the file had been saved. In these instances, if the file
extension was not evident in the footer, there was insufficient evidence that the file had been saved in the
correct format. Some omitted to follow the capitalisation as shown or introduced data entry errors into the file
name.

Question 3

The creation and application of paragraph styles was performed well. The creation of paragraph styles was
assessed through the screenshot evidence of the THC-Subheading style only. The style name
occasionally contained capitalisation or data entry errors. The created style was sometimes based on
another style, from which it inherited additional formatting. A style with additional formatting not
detailed in the House style specification was penalised. A few candidates captured evidence of the
paragraph formatting dialogue boxes open but did not then show evidence of the attribute saved as part of
the style. This was particularly evident in setting the 'spacing after' attribute, where the paragraph dialogue
box was shown open but 12 point spacing after was not saved as part of the style definition. The application
of all styles was marked on the appearance of the text to which they were applied. No marks were achieved
for applying the formatting attributes to the text in the document without the creation of the individual styles
outlined in the House style specification.

Question 4

Most candidates correctly inserted the required headers and footers. Occasionally these did not align with
the page margins. Some candidates left text placeholders or included superfluous text in the header and/or
footer area. Candidates who included their name as part of the header/footer details were not penalised. A
few candidates incorrectly included a path with the automated file name. The House style specification
required a style to be created and then applied to the headers and footers. It was apparent that a large
number of candidates had attempted to format this text without creating the style. The correct formatting was
often only applied to one item; italic was missing, or the text was not displayed in a serif font.


Question 5 to 8

Most candidates entered the title and subtitle, although these occasionally contained capitalisation and/or
spelling errors. The House style specification stated the formatting to be applied to the title and subtitle
styles. These styles were not always applied correctly, with a serif instead of a sans-serif font used, or
additional enhancements applied, such as the bold enhancement from the title style being applied to the
subtitle style. Quite often there was additional spacing below the title and/or subtitle, although the styles
specified 0 space after, or an unrequested line was inserted below these.

Question 9

There were often inconsistencies in the body text style within the document. A few candidates used a
sans-serif font for all body text. More common was a mixture of serif and sans-serif fonts and a variation in
text sizes for one or more paragraphs. Candidates should make sure that the style has been applied
correctly to all the text. Justification was usually well done. Additional enhancements to the body style, such
as italics or all the text displayed in upper case, were penalised.

Question 10

Most candidates changed the page layout to two columns but the section break was occasionally in the
wrong position. A number of candidates displayed the entire document in two columns and a small number
inserted a page break rather than a section break. Occasionally the column spacing was incorrect and had
not been changed from the default measurement.

Question 11

This question tested the application of the THC-Subheading style created in question 3, and the mark was
only awarded if there was evidence the subheading style had been created. If the style of all nine
subheadings matched the THC-Subheading style definition evidenced in the Evidence Document, this style
application mark was awarded.

Question 12 to 15

Most candidates inserted the table in the correct place and within the column width. The majority deleted the
row, although a few removed the data but left an empty row in place. The formatting of row 1 was generally
done well, with the heading centred over the columns. A few candidates applied the grey shading to the text
rather than to the full cell. Occasionally row 1, containing the heading 'Key Personnel', had not been imported,
so merging was not evidenced, although marks were awarded if the correct formatting had been applied to
row 1 of the table presented. Formatting of rows 2 to 8 was not done well, with many not applying the table
style, numbers not right aligned, or text wrapped over two lines. Many displayed all the table borders and
gridlines.

Question 16

The bulleted list formed the theory questions to test Assessment Objective 1. Candidates were required to
identify methods of preventing viruses. Access to the internet during the examination is not permitted, so no
marks could be gained from answers copied from websites. Very few candidates gave clear statements on
how to prevent viruses. It was insufficient to state 'buy', 'download' or 'install' antivirus software, as the
software needs to be used or run to be effective. One mark was awarded for each valid method given.

Question 17 and 18

Most candidates changed the required text to a numbered list. However, few aligned this at the left margin or
displayed the list with no space after each line.

Question 19

The document layout and presentation should be checked to ensure that spacing is consistent, there are no
split tables or lists, no widows or orphans, no blank pages, and the styles have been applied correctly.
Inconsistent spacing was common due to additional hard returns entered within the document, as were areas
of white space and unequal column alignment at the top of the page. It was evident that many had not run
the spellchecker, as the two incorrect spellings of 'committed' and 'attend' in the penultimate paragraph had
not been corrected.


Task 3 Database

Question 20

Almost all candidates used the correct field names and data types although a few incorrectly included an ID
field. Most set Member_No as the primary key. Quite commonly the date field was not displayed in the
correct format. The Fees_Due field was set as Boolean although often this was displayed on the report with
check boxes or True/False instead of Yes/No.

Question 21

The creation of a drop-down menu was performed much better than in previous series. Some candidates
produced evidence of the automatic filtering facility available within a table or query, which achieved no
marks. Some candidates produced a form which included a drop-down menu; this was accepted if it was
evidenced on the correct field. Most showed the evidence through design properties, which also captured the
restrictions set. A validation rule was also accepted as evidence of limiting the data entry. Quite often there
were data entry errors in the keyed entries.

Question 22

The new record was entered accurately with few errors. Although this record appeared in both reports it was
only marked in report 1.

Question 23

The second table was correctly imported with most setting Memb_Code as the primary key. Most candidates
created a one-to-many relationship between the correct fields. A few did not display the Annual_Fee with a
currency symbol and to two decimal places.

Question 24

Report 1 was completed well by a number of candidates. Most created the new calculated field, although the
field heading Late_Payment occasionally contained capitalisation or data entry errors. The calculation was
usually done correctly, although a few multiplied by 1.5 instead of 1.05. The search was based on three
criteria, and the most common fault was searching for greater than 01/01/2015 rather than greater than or
equal to 01/01/2015. Most displayed the correct fields, but these were often not fully visible and in the wrong
order, with the software defaulting to place the sort fields first in the report. The sorting on two fields was
frequently incorrect, with only the Memb_Type field sorted. The report title contained errors, particularly in the
word 'Membership'. The data usually fitted to one page wide with landscape orientation. The sum calculation
was done well, but this was often not positioned under the Annual_Fee column. The label 'Total fees due'
contained capitalisation errors and occasionally included additional punctuation. Several did not provide
screenshot evidence of the formula used to sum the records, or the formula was truncated. The currency
values in Annual_Fee, Late_Payment and 'Total fees due' did not always display the same currency symbol
or two decimal places.
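
For centres that used Microsoft Access, a minimal sketch of the kind of expressions that met these points is
shown below. The notation is Access-specific and the date field name is not reproduced here; only
Late_Payment, Annual_Fee and the 1.05 multiplier come from the task itself.

    Calculated field in the query grid:    Late_Payment: [Annual_Fee]*1.05
    Criterion on the date field:           >=#01/01/2015#   (not >#01/01/2015#)
    Control source of the report total:    =Sum([Annual_Fee])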

Question 25

A number of candidates did not select the correct records for report 2, which often contained all the payment
methods. The three types SW, GY and TE were not always present, and this was likely to be due to errors in
using OR, with criteria listed on separate lines resulting in only the first line finding records which matched all
the criteria. Some candidates did not fit this report on one page. The field order was often wrong, with data
truncated, most commonly in the Add_1 field. The report heading often contained capitalisation or data entry
errors. Some candidates included their identification details at the bottom of the report rather than the top
right.
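
In an Access-style query grid, placing all three codes on a single criteria row avoids this problem, because
every other condition in the query then still applies to each code. A sketch, assuming the codes are held in
the Memb_Type field:

    Memb_Type criteria row:    "SW" Or "GY" Or "TE"

If the codes are placed on three separate rows instead, every other search criterion must be repeated on
each row, which is easy to get wrong.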

2016
Cambridge International General Certificate of Secondary Education
0417 Information and Communication Technology November 2016
Principal Examiner Report for Teachers

Task 4 Mail merge document

Question 26

This is a new skill tested in the revised syllabus and on the whole candidates performed well in this task. The
date was usually displayed in the correct format, although some used a creation date rather than today's
date. Few provided valid screenshot evidence of a date field being used. To achieve this, the date should be
inserted as a field and the field codes displayed to capture this evidence. The mail merge fields were
inserted in the correct positions, but some candidates did not replace the existing text and/or chevrons, and
many incorrectly deleted spacing and punctuation when they inserted the fields. Many candidates did not
insert a space between the first name and last name fields. Most inserted their details, but several put these
in the header rather than the footer area.
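
In Microsoft Word, for example, pressing Alt+F9 toggles the display of field codes, so the evidence should
show something like the sketch below rather than the formatted date itself (the display format shown is
illustrative, not the format required by the question paper):

    { DATE \@ "dd/MM/yyyy" }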

Question 27

The logo was correctly inserted and resized although a few failed to maintain the aspect ratio. The logo was
accepted in any position outside of the table.

Question 28 and 29

This was done well by many candidates, with most providing evidence of an automated selection of induction
course records. Despite this, there were a significant number of candidates who failed to produce any letters
or who merged to all the records instead of the required selection. Some of the resulting merged letters did
not match the original master letter, with inconsistencies in the layout and presentation. A few candidates
attempted to use 'find' or 'find in field' to select recipients at the printing stage, which did not merge the
letters.


Question 30
This question provided the analysis and evaluation to test Assessment Objective 3. Candidates were
required to evaluate the suitability of the checklist for use with its intended audience and to identify
improvements, with reasons. Few candidates performed well in evaluating their checklist and many did not
attempt this question. The checklist was intended for use during the induction of new staff at the Health Club,
and the intended audience was therefore adults who were new employees. Few made any reference at all to
the target audience. Most candidates identified a way the checklist could be improved but did not provide a
reason for this. Many stated that a logo could be included to help brand recognition, but as they had inserted
the logo as part of the task they were not evaluating the checklist they had produced.

Task 5 Presentation

Question 31
Most candidates imported the six slides without difficulty. Occasionally changes had been made to the
content of the slides, or an additional blank slide included.

Question 32
Generally the master slide was created well with the items positioned as instructed. Occasionally there was
some overlap of the master items on the slides. Some candidates failed to apply italic enhancement to the
text or bullets. The master slide was marked from the screenshot in the evidence document. A few
candidates produced screenshot evidence of a slide rather than the master slide.

Question 33
Most candidates did change the slide 1 layout to title and subtitle, centred on the slide. Sometimes the
subtitle retained its bulleted format or the items were not centred.

Question 34 to 36
Most candidates produced a chart, but this was not always a vertical bar chart. Few candidates selected the
correct data, with most including the total in the selection. The titles were inserted but often contained
capitalisation or data entry errors. Most inserted the chart on the correct slide but did not position it to the
left of the bullets.

Question 37
Some candidates printed individual slides rather than handouts with 6 to the page. The single slide print was
not always done or did not fill the page.

Task 6 Printing the Evidence Document

Question 38
Some candidates submitted no printout of the Evidence Document. It is essential that candidates print their
Evidence Document as failure to do so can result in a number of lost marks. Candidates should be
encouraged to print this towards the end of the exam, regardless of whether they have finished the paper.


INFORMATION AND COMMUNICATION


TECHNOLOGY

Paper 0417/03
Practical Test

Key Messages

Candidates should ensure that any screen shots within their evidence document are readable and of a
suitable size.

Candidates should be advised to ensure that they print their name, Centre number and candidate number on
each document, as specified in the question paper.

General comments

The majority of candidates attempted and completed all elements of the paper and there was no evidence
that candidates had run out of time.

A significant number of candidates produced screen shot evidence where the text was so small (and often
pixelated) that it was extremely difficult and in some cases impossible for examiners to award marks.
Candidates MUST ensure that all printouts are readable without the use of magnification devices.

A small number of candidates failed to print their name, Centre number and candidate number on some of
the documents submitted for assessment. Without clear printed evidence of the author of the work,
examiners were unable to award any marks for these pages. It is not acceptable for candidates to annotate
their printouts by hand with their name as there is no real evidence that they are the originators of the work.

A small number of candidates submitted multiple printouts for some of the tasks and failed to cross out those
printouts that were draft copies. Where multiple printouts are submitted, examiners will only mark the first
occurrence of each page.

For all tasks set within the context of this examination, candidates are expected to use the most efficient
methods to solve a particular problem. This was not always the case in candidates' evidence, in both the
spreadsheet and website authoring sections of the paper.

Comments on specific questions

Website Authoring

Question 1

The stronger candidates answered this question well. However, most gave a description of the perceived
inadequacies of the pages rather than commenting on the efficiency of the markup. A number of candidates
suggested improvements that were all identified in the tasks they had to perform to complete the web pages.
There were some excellent answers identifying issues such as duplicated statements in the markup and
noting that greater efficiency could be obtained using external stylesheets.

Question 2

Almost all candidates successfully replaced the text with their candidate details and the text 'Image A' with
the image of a beach.


Question 3

Almost all candidates successfully replaced the text with the company logo. Most resized the image as
specified, but few produced appropriate alternate text, which would be displayed if the image was not
available/downloaded or if a text reader was used with the web page. Many candidates used the single word
'Logo' as the alternate text, which would not explain to a user the contents of the image.
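
A minimal sketch of suitable markup, in which the file name, size and descriptive wording are assumed
rather than taken from the question paper:

    <!-- the alt text should describe the image, not just say 'Logo' -->
    <img src="logo.jpg" alt="The health club logo" width="200">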

Question 4

Most candidates successfully completed this task, although a small number ignored the range of images
specified and selected from one of the other supplied web pages.

Question 5

Many candidates found this task challenging. Some created an anchor around the text 'Click here' but did
not place a bookmark in the body section of the markup above the first table.
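
A sketch of the two parts needed, assuming the bookmark is named 'top' (the actual name required is not
reproduced here):

    <a id="top"></a>                 <!-- bookmark placed in the body, above the first table -->
    ...
    <a href="#top">Click here</a>    <!-- hyperlink around the given text -->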

Question 6

Most candidates were successful in using the mailto: attribute within the anchor and a significant number of
candidates gained full credit for this task, despite the complexity of the markup required. Some candidates
inaccurately entered the email address and/or subject line.
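
As a sketch, with the email address, subject text and link text all assumed:

    <a href="mailto:bookings@example.com?subject=Enquiry">Email us</a>

Strictly, any spaces in the subject line should be percent-encoded (as %20), although most browsers
tolerate them.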

Question 7

The solution to this question was frequently omitted or contained errors. A small number of candidates used
the correct markup but placed this in the body section rather than the head section of the document.
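
The exact markup required is not reproduced here, but items such as a page title or a link to an external
stylesheet belong between the head tags, for example (names illustrative):

    <head>
      <title>Page title here</title>
      <link rel="stylesheet" href="style.css">
    </head>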

Question 8

Although this question was frequently completed as required, some candidates hyperlinked to the wrong web
page.

Evidence 2

Despite being instructed to 'Display the web page in your browser', a large number of scripts contained the
web page displayed in a web authoring package. These candidates did not gain credit for the images, as
there was no evidence that the images would be displayed in a browser.

Question 9

Although this question asked explicitly about testing 'in this website', there were a significant number of very
generalised answers on testing and test plans. Few candidates considered the tasks that they had been
asked to perform in steps 2 to 8 and identified how these would be tested before the site was uploaded to the
internet.

Question 10

This question was only answered well by stronger candidates. Because the stylesheet provided had a
number of errors, many candidates did not consider the most efficient methods of completing this task. Not
all candidates interpreted the 'table cells' section as setting the definition for table data (td), and even fewer
merged repeated definitions in both the table and td sections into single statements. There were some
candidates who produced good answers with a variety of different solutions.
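
As a sketch of the merging described, with the property values invented for illustration:

    /* one grouped rule replaces declarations duplicated in the table and td sections */
    table, td { border: 1px solid #000000; }

    /* 'table cells' then only needs the declarations unique to td */
    td { padding: 4px; }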

Question 11

Most candidates who attempted this question completed it as specified, although a few used their web
authoring package rather than a browser to display the page.


Spreadsheet

Question 12

Almost all candidates loaded the required file but not all candidates added the automated filename and file
path on the right in the header.

Question 13

Many candidates inserted the correct number of blank rows in the correct place.

Question 14

Most candidates inserted the text but there were a significant number who introduced errors into their
entered text, frequently related to spelling, case or spacing.

Question 15

Many candidates successfully completed merging the cells, but a small number merged fewer cells than
instructed in the question paper.
The majority of candidates successfully formatted this text as 36 points high, but some candidates omitted
the white text on the black background. A few candidates incorrectly set this as a serif font. Several
candidates did not display all of this text. There were many acceptable solutions to this and the most
effective of these was wrapping the text within the cell.

Question 16

The majority of candidates successfully entered the correct text into the correct cells.

Question 17

A number of candidates set the first column bold and underlined rather than the text in the two specified
rows. Where candidates did set the correct rows (5 and 26) as instructed, they frequently formatted only the
start of row 5.

Questions 18 and 19

A significant number of candidates did not answer these questions fully. Whilst most demonstrated the
correct use of a VLOOKUP, some attempted to use LOOKUP functions and referred to unsorted data in the
source file. For these questions LOOKUP would not provide a solution, as the question specified that the
original external file must be used. Some candidates attempted to use named ranges, but named ranges
cannot be stored in a csv file. Others did not use the named range even when they had created it correctly in
an earlier question. A significant number of candidates who used VLOOKUP did not select the most efficient
range for the lookup. Most candidates set the correct return column (2) and included the FALSE parameter to
ensure the function performed as expected.
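
A sketch of the expected structure, in which the cell references, range and external file name are assumed
(the exact syntax for referencing an external file varies between spreadsheet packages):

    =VLOOKUP(A6, [codes.csv]codes!$A$2:$B$40, 2, FALSE)

The lookup range is absolute and covers only the two columns needed, the return column is 2, and FALSE
forces an exact match on the unsorted data.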

Question 20

There were a number of solutions involving nested IF statements; the most efficient used three IF statements
and had the reference to cell $F$4 outside the nested IF conditions. A significant number of candidates
found the logic challenging and had logic errors, or syntax errors with statements like IF(50<F6<101.
Some candidates found solutions that worked using AND within the IF statements, but these were
unnecessary and inefficient, despite returning the correct values. A number of candidates inefficiently used
the value 124.2 in their formulae, rather than an absolute cell reference to F4.
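
A sketch of the shape of an efficient solution. The 50 and 101 boundaries follow the error quoted above and
cell F4 is taken to hold the 124.2 value; the rates inside the IF statements are invented for illustration, and
the actual task used three nested IF statements rather than the two shown:

    =$F$4*IF(F6>100, 0.8, IF(F6>50, 0.9, 1))

A condition such as 50<F6<101 is not valid spreadsheet syntax; the same test must be expressed by nesting,
as above, or less efficiently as AND(F6>50, F6<101). Keeping $F$4 outside the IF conditions means the rate
cell is typed only once and the formula replicates correctly.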

Question 21

Whilst many candidates correctly used SUMIF or equivalent functions, there were a number of candidates
who incorrectly elected to count the occurrences rather than add the contents, and used COUNTIF. Most
candidates correctly set the two ranges as absolute references (so that they did not change upon replication)
and the single cell reference as a relative reference, but some candidates did not.
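
A sketch, with all cell ranges assumed:

    =SUMIF($B$6:$B$40, E6, $C$6:$C$40)

The criteria range and sum range are absolute so that they are unchanged when the formula is replicated;
the criterion cell E6 is relative so that it moves with each copy.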


Question 22

Many candidates replicated their formulae correctly.

Question 23

Only the strongest candidates gained credit for this question. It required candidates to look at each cell and
determine the most appropriate formatting for that particular cell. Many candidates applied the correct
currency symbol with 2 decimal places to the correct cells, but a significant number of candidates applied this
to the correct column and omitted cell F4.

Question 24

Although most candidates generated a printout of formulae, a significant number of candidates did not
display the row and column headings. Some candidates who printed their formulae did not extend the
column widths, meaning that the full formulae and/or labels could not be seen and answers could not be
credited.

Question 25

Most candidates completed this question as specified, but again in the evidence for some candidates the
columns were not fully visible.

Question 26

Most of the candidates who completed this question did so as specified and gained credit.

Question 27

Most candidates completed this question as specified, but again in the evidence for some candidates the
columns were not fully visible in the values spreadsheet.


INFORMATION AND COMMUNICATION


TECHNOLOGY

Paper 0417/11
Written Paper

Key messages

In many cases, questions requiring more detailed answers needed to contain more explanation or discussion.

There were many instances of candidates giving brand names rather than the generic name for the product. It
is clearly stated on the front page of the examination paper that 'No marks will be awarded for using brand
names of software packages or hardware'.

General comments

Candidates on the whole seemed to have enough time, with few failing to give a response to all questions.

There were some answers that did not seem to fit the question that had been asked. This was most
evident in question 11(b), where candidates gave answers about the comparing of data and the turning on
and off of actuators, and in question 12, where candidates gave answers relating to customers rather than to
the store.

The areas which seemed to pose most difficulty were the comparison questions, where, in the majority of
cases, candidates did not carry out a comparison.

Some candidates gave extra answers that were not asked for and did not gain credit.

Comments on specific questions

Question 1

This question on internal hardware devices was a development of this type of question asked in previous
series; candidates in the main seemed to answer it well.

(a) This part was quite well answered, though a number of candidates gave hard drive, ROM or even
CD/DVD as their responses.

(b) This part was not as well answered as (a), with a lot of CPU/processor answers. Some referred to
printing, as 'printed circuit board' was in the question.

(c) This part was generally well answered, though candidates who answered ROM for (a) tended to put
RAM here.

(d) This part was well answered, but only because most gave speakers rather than sound card. A
commendable number, however, did give sound card.


Question 2

This question on user interfaces was well answered on the whole.

(a) This was a better-answered question, although a small minority put CLI, desktop or even HTML.
Some candidates answered with Windows or Notepad, which are trade names and therefore
incorrect.

(b) Most candidates gained one mark for mention of an application or a program. They tended to call
an icon a 'thing', 'something' or a 'feature' rather than an image/picture. Those that managed to
answer the first part were often unable to explain its use.
(c) This question was fairly well answered.

Question 3

A high percentage of candidates gained both marks in this question on networks. Those that lost marks
tended to be those that thought the internet was an example of a WLAN or, less frequently, that a WAN is not
a collection of LANs or that a WAN is a 'Wireless Area Network'.

Question 4

This question on word-processing was well answered, although many candidates struggled with the concept
of widows, and, to a lesser extent, gutter margins.

Question 5

This question on viruses tested the depth of knowledge on the subject to its fullest. It was a good
differentiator which enabled all candidates to gain some marks.

(a) Most candidates gained at least one mark here, with many getting three or more. Those that didn't
do well usually couldn't explain what a virus is, although they could give an example of its effects.

(b) This part was fairly well answered, with a range of marks being awarded. Those that struggled
referred to the use, installation or updating of anti-virus software, but little else. Many incorrectly
suggested the use of a firewall. Many correctly suggested not opening unrecognised
emails/software, etc., but in many cases the positive precaution was given, e.g. 'ONLY use trusted
sources, media, etc.', without recognising that these could still be capable of transferring/carrying a virus.

(c) This was a new topic and had not been asked before; therefore it was not as well answered as
parts (a) and (b). Many candidates repeated the question and wrote about removing the virus.
Several ignored the stem and wanted to delete the file as opposed to quarantining the virus.

Question 6

Many candidates achieved a mark on this question on the systems life cycle. Some answers seen included
types of verification, methods of implementation and testing strategies. Many thought this was asking for
ways of gaining user feedback. Comparing with initial requirements was a popular correct response;
identifying limitations and further improvements less so.

Question 7

This question was designed to test candidates knowledge of the difference between proofreading and
verification.

(a) Most candidates referred to checking the data without differentiating between verification and
proofreading. Not many appeared to know what proofreading is, although some knew what
verification is.

(b) This part was generally well answered, but despite the wording of part (a), some insisted on giving
proofreading as their answer. Other candidates gave validation or double entry as their answer.


Question 8

Mail merging is one of the newer topics on the syllabus; it was not as well answered as expected.

(a) This was fairly well answered, with few achieving full marks. A number talked about emails. Many
lacked a detailed knowledge of the process and only gained marks for answers such as typing the
letter or selecting/creating the mailing list. There was considerable confusion regarding the adding
of merge fields. In many cases candidates appeared to understand the process, as used in the
practical paper, but had trouble describing it.

(b) Most candidates did quite well, though many did so only because they had spotted 'proofreading' in
question 7.

(c) Most candidates achieved two marks, and a handful achieved full marks in this question. Some
candidates answered well, with marks generally only being gained by giving examples of personal data.
Many mentioned 'stealing data', which is too vague. The major issue with the answers given was
that candidates did not refer to the data being linked to living individuals. Incorrect assumptions
included that personal data was a username and password.

Question 9

(a) Some reasonable marks were awarded in this question on virtual reality devices, but some candidates
saw the word 'device' and guessed at the devices, as demonstrated by the answers given in
part (b). This question is based on a newer topic, which meant a lot of candidates did
not gain any marks. Some answers were too vague and insufficient. Some candidates mixed
up 3D viewing with VR.

(b) Very little understanding was shown here, with some candidates not understanding virtual reality. A
number of candidates insisted on giving games/gaming as their answer despite it being used in the
question. Many very general answers were seen, including medical, scientific experiments,
modelling construction, training, cinemas, etc.

Question 10

This question on health issues was generally well answered, although a few gave safety issues rather than
health issues. A number gave obesity as a problem, and some gave using a comfortable chair to rectify back
problems.

Question 11

This was a good stepped question which tested the candidates' knowledge of sensors and monitoring over
the three parts.

(a) Virtually all candidates gained the mark, although some answered with 'heat sensor'.

(b) Very few candidates gained more than one mark in this question. Descriptions generally lacked
detail on how the data is transmitted in such a system or on the specific outputs
from the system. Many references were made to comparing actual readings with pre-set values,
i.e. answering as if for control rather than for monitoring. Some rather futuristic answers went on to
suggest that the weather could be changed as a result. Some candidates stated that a buzzer
sounded if the temperature was too high.

(c) Many answers tended to focus on cost and accuracy, without saying in what way. Many seemed to
think that the weather balloon would go higher than an aircraft, without considering why this would
be an advantage anyway. Many candidates did not give a comparison and/or an elaboration, which
meant they often fell short of the mark.


Question 12

The majority of candidates gave advantages to the customer rather than to the store in this question on
online shopping. Those that did give answers relating to advantages to the store still did not make many
points. Most of these types of answers related to having more customers.

Question 13

For what was expected to be a fairly straightforward question on booking systems, very few candidates
gained more than two marks. Many candidates ignored the question's requirements: instead of giving inputs
and processing, they frequently offered descriptions of the outputs of the system. Some candidates focused
on the transfer of funds to pay for the tickets.

Question 14

There was some confusion between parts (a) and (b) in this question on internet safety.

(a) Many candidates seemed unprepared for this question, and answers such as 'avoiding viruses' and
'protecting against hacking' were often seen. A sizeable number of candidates turned the question
round to one they could answer and gave answers about emailing or the use of https.

(b) This part was quite well answered, with roughly half the candidates gaining two or three marks.
Many candidates who did not score these marks often gave incomplete answers. There were still
references to firewalls, encryption and passwords as strategies.

Question 15

(a) Databases proved to be a topic which candidates had either studied well or had not prepared
well enough for. The question was related to the revised practical examination and it was therefore
felt that it should be straightforward. Those that did prepare well gained good marks. Those that
hadn't were left trying to answer the question by guessing; even then, they often used the terms
out of context or guessed the wrong fields. Most identified that Venue ID and Location ID had to be
connected in some way to form the relationship. A high percentage identified the one-to-many
relationship being created. Knowledge of creating relationships is good, but clumsy wording and
non-use of appropriate terminology prevented many candidates earning full marks.

(b) Candidates struggled with a question that asked them to apply their knowledge of relational
databases and flat file databases rather than just write it down. Most candidates seemed unable
to do this, meaning that most answers were too vague, e.g. 'easier to use'.

Question 16

This question on research methods was very well answered and candidates demonstrated a fair knowledge
of the different methods of analysis. What was surprising was the number of candidates who understood the
question and could give research methods but couldn't give advantages or disadvantages. Some gave
answers relating to implementation. Others tried to guess, as they clearly hadn't prepared for this type of
question.

Question 17

This question on web page design was generally well answered, although some candidates struggled with
anything other than a title and the details given in the question. A number of candidates wrote the actual
markup for the page, although this was not a requirement of the question. On the whole, most candidates
filled the page. Some candidates produced information with errors, like 'a holiday to Greenland'. Very few
candidates included videos or sound.


Question 18

This question was intended to be a discriminator and most candidates did not produce anything beyond a
level 1 answer. Despite it being a 'discuss' question with ample space provided for the answer, few
candidates actually used all the available space. Most candidates provided brief descriptions of their
perception of advantages and disadvantages. Few managed to expand the points they made in order to
justify their argument. Most concentrated on the speed of interrogation, the accuracy, or otherwise, of the
information found on the internet, and the amount of information available. Other issues tended to be ignored.


INFORMATION AND COMMUNICATION


TECHNOLOGY

Paper 0417/12
Written Paper

Key messages

Simple and straightforward questions were not as well answered by candidates as in previous sessions of
the paper. Those questions that stretched the candidates or dealt with the newer elements of the syllabus
lacked depth in the answers written.

There were fewer blank answers than in recent examinations, although many candidates simply guessed the
answers rather than leaving them blank.

There were far too many instances of candidates giving brand names rather than the generic name for the
product. It is clearly stated on the front page of the examination paper that 'No marks will be awarded for
using brand names of software packages or hardware'.

General comments

All candidates appeared to have enough time to finish the paper.

Some candidates gave extra answers that were not asked for and did not gain credit.

It was noticeable that quite a few candidates did not respond to questions that have come up in previous
papers and are normally well answered; in this case, they seemed to have misunderstood what the
question was asking.

Comments on specific questions

Question 1

This appeared to be a straightforward question but seemed to catch out a lot of candidates, who were not
able to identify the correct communication device from the description. More candidates refrained
from answering this question than is usual for a first question.

Question 2

(a) Most candidates were able to score one mark by giving the correct type of software. Quite a
few seemed to misunderstand what programs do with the hardware or the computer system. In
general, this was reasonably well answered. Some good answers were seen, but many were along
the lines of 'can't touch it' or 'not a physical thing'.

(b) There were a similar number of responses to this question as there were to question 1, with many
blank responses seen. Many candidates gave examples of the software rather than the type of
software, and then went on to give brand names, e.g. Microsoft Word instead of word processor.


Question 3

This question was generally well answered. Some candidates seemed to think motors and printers were
input devices.

Question 4

This question was generally well answered. The most common mistakes were seen in the first statement, with
some candidates not realising that capital letters mean shouting, or not realising that BCC does
mean blind carbon copy.

Question 5

(a) This was a new question: in previous sessions questions were about video-conferencing, and
web-conferencing therefore seemed to confuse a few candidates. Most candidates were able to
give a description of what web-conferencing is, stating it was a conference over the internet, but
in trying to explain how it could be used in the scenario given in the question, very few were able to
gain additional marks, as they did not match their description to the scenario. Some candidates
managed to link the answer to VOIP but did not link it back to the scenario. There were significant
references to brand names, which are not permitted.

(b) This question was generally well answered. There was often little concept of multiple users and
real time.

(c) There were a lot of vague answers given in this question on the drawbacks of
video-conferencing, but many managed three marks; however, few managed the matched pair, therefore
losing the second mark. Too many discussed internet speed rather than the reliability of the
internet connection. Some candidates answered with benefits.

Question 6

This question on storage media was generally well answered, apart from the file server backup, which
many thought should be done on a fixed hard disk.

Question 7

(a) Most candidates were able to identify the IF function and gave a completely correct answer. Quite
a few added an extra IF function for the false part of the formula, which would have been correct if
they had tested for the 75. There were many candidates who omitted to put the text into speech
marks.
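
A sketch of the structure expected; the cell reference, the direction of the test and the text results are
assumed rather than taken from the question paper:

    =IF(B2>=75, "Merit", "Pass")

The text results must be enclosed in speech marks, and any second IF nested in the false part must itself
test against the 75 boundary.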

(b) This question on test data was fairly well answered, but some candidates did not understand what
test data is, so the question seemed to catch a lot of candidates off guard. It was perceived as
much harder than it actually was, as it was set within a scenario involving spreadsheets.

Question 8

A lot of candidates appeared to be confused about how biometrics are used in the manner described in the
question. Although quite a few candidates scored well on this question, the majority did not achieve more
than three marks. This was partly due to the fact that it was a 'discuss' question and partly due to their lack of
knowledge of the subject.

A good number of candidates gave advantages and a disadvantage. There were a lot of vague descriptions
from many candidates. Many gained marks for uniqueness and for the difficulty of forging biometric data,
with some getting the mark for the hardware being more expensive.

Very few candidates gave a conclusion.

Question 9

Some candidates did not read the question properly and referred to books/music/movies. Many gave basic
descriptions without mentioning the concept of permission or making copies.


Question 10

(a) Most candidates did not understand the concept of expert systems, despite mentioning them
inappropriately in answers to other, unrelated questions. Some rather worryingly thought that either
the knowledge base or the rules base controlled the whole process. Some thought that the
inference engine was a user interface in disguise (doing all the questioning and accepting input).
Some candidates wrote about scanning the frog and taking DNA samples. Some candidates
misread the question and related it back to designing the expert system rather than using it. It was
good to see that a few candidates realised that, as a newly discovered species, it would not
be in the expert system and would therefore need to be added.

(b) This was another question that comes up regularly, but quite a few candidates seemed caught out by it.
This question relates directly to the list in the syllabus.

(c) This question was poorly answered. A lot of candidates did not read the question fully and gave
field data types rather than validation checks, and some just seemed to guess which validation
check suited each field name.

(d) The question on headers and footers in database reports was a new question relating to the
practical side of the course. The question was well answered. Most candidates gave two or three
good examples of what was part of the header and footer but many omitted to explain why they are
used, depriving themselves of the extra mark.

Question 11

(a) to (d) These parts on relational databases were reasonably well answered, although a few candidates
lost marks because they put in extra answers or missed out the underscore in the field names.

(e) Quite a few candidates were not able to properly define the correct syntax for a formula in a
database, so although the fields were right, the expression was incorrect due to the use of the
wrong type of brackets. As with previous parts of this question, the work related to the practical
side of the course.
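
In Access-style notation, field names in a calculated expression are enclosed in square brackets, not round
ones. A sketch, with the field names invented for illustration:

    Total_Cost: [Unit_Price]*[Quantity]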

Question 12

This question on the difference between a function and a formula was not as well answered as expected.
Some candidates confused the two terms. Many appeared to understand the differences but were unable to
articulate them; in fact, most of the marks gained were from giving suitable examples of a function or a
formula. Few candidates gave any successful explanations.

Question 13

(a) This part was generally not well answered. Some appeared to know
what technical documentation is but were unable to say what it is used for. Many responses
mentioned technicians rather than programmers/analysts.

(b) Again, this was another question that appears fairly regularly, and yet fewer candidates than
expected were able to answer it correctly. Many candidates gave vague
answers, with some confusing this with user documentation.

Question 14

(a) Smishing is a new topic in the syllabus, and candidates did not understand that it relates to text
messaging, instead thinking it related to emails and the internet. This was possibly due to
smartphones allowing emails to be read and sent. Many confused phishing with pharming and
wrote about strategies for dealing with emails. Candidates had to specify that the firewall and
anti-virus software were up to date.


(b) This question on cookies was not well answered. Many thought that cookies were used by
companies to guard against the threat of hacking. Those that understood cookies did not make
more than one valid point. Few realised a cookie is a file. Some clearly knew that cookies
track your actions, with a few candidates scoring close to full marks. There were a lot of vague
answers as to what the cookie stored about a user's visit to a website. Only the more able
candidates were able to score marks for the explanation. Some candidates forgot that the question
stated that cookies were only a nuisance, not a security threat, and wrote about them
containing viruses. It was clear that some candidates did not know what a cookie is.

Question 15

Most candidates scored marks based on their answers describing what a blog is. Few candidates were able
to give satisfactory descriptions of a wiki. Some candidates thought that a wiki was Wikipedia and therefore
answered the question accordingly.

Question 16

(a) This question, asking for a definition of an intranet, was generally well answered.

(b) This question was fairly well answered, with many good answers for either intranet or internet. A
surprising number of candidates seemed to have little idea of what the internet actually is, other
than that you can surf the web, although most candidates were able to mention
global/international/worldwide, etc.

Question 17

There were many misunderstandings about cheques, such as a lack of security, having to go to the bank to
use them, short-term use and many more. Few could compare them (similarities), though many attempted to
contrast them, with little success in supporting this with the major differences. Candidates need to read the
question carefully and ensure that they give equal weighting to both sides of the discussion, rather than
writing half a page on a single point. In a question like this, candidates need to give similarities and
differences in order to access the higher levels.


INFORMATION AND COMMUNICATION


TECHNOLOGY

Paper 0417/13
Written Paper

Key messages

In many cases, questions requiring more detailed answers needed to contain more explanation or discussion.

There were many instances of candidates giving brand names rather than the generic name for the product.
It is clearly stated on the front page of the examination paper that 'No marks will be awarded for using brand
names of software packages or hardware'.

General comments

All candidates appeared to have enough time to finish the paper.

Once again, some candidates gave extra answers that were not asked for and did not gain credit.

Comments on specific questions

Question 1

This question on different devices was well answered on the whole. However, the statements about
smartphones surprisingly caused candidates problems, as occasionally did the statement about the tablet
computer having a separate keyboard. Many managed to achieve half marks on this question.

Question 2

As parts (a) and (b) were on the same topic, candidates that did not do well on the first part of this question
on interfaces also did not do well on the second part.

(a) Very few candidates seemed to have come across a CLI, and those that had, had obviously been
introduced to it as the 'command prompt' interface. Others that did not know the answer simply
wrote 'diskpart', as it was on the screen.

(b) Those candidates who did not know about a CLI were unable to give a disadvantage. Those that
did answer CLI for part (a) did tend to answer well on part (b) but not always.

(c) It was felt that, as GUIs are more popular than CLIs, this question would have been answered
better; however, that was not necessarily the case. Most candidates did well, but the second and
third options occasionally caused candidates difficulty. It was surprising that candidates didn't use
the third option in their answer to part (b).

Question 3

This question on RAM and ROM was not as well answered as expected, with several candidates unable to
access any of the marks available. Incorrect answers were evenly spread amongst the options provided.


Question 4

This question on devices was not as well answered as anticipated, with few candidates gaining more than
two marks. A surprising number of candidates did not attempt various parts of this question.

(a) Most did not specify dot matrix printer for this part, with laser printer being a common incorrect
answer.

(b) Even though the question asked for a monitor, many gave a printer as the answer, and there were
many blank responses. When candidates gave the correct type of device, LED was the most
popular answer.

(c) The mark for 3D printer was the one most often awarded, although this may have been prompted
by a question later in the paper, as some answers were crossed out and '3D printer' written in.

(d) This part was not well answered at all; very few gave the answer as a touch screen.

Question 5

Many candidates did not attempt this question on changeover methods. Of those who did, very few managed
to give an advantage with any depth, choosing one-word answers such as 'cheaper', 'faster' or 'easier'.
Others did not give advantages but only explained the method. Where correct answers were given, parallel
running was answered more frequently than the others.

Question 6

Despite candidates often giving 'hacking' as an incorrect answer to various other questions, most were
unable to answer this question on hacking well.

(a) Most were unable to grasp the notion that hackers act illegally or without permission. Candidates
seemed unsure of what hackers do when they access systems, other than stealing information,
which on its own is a very vague answer to this question.

(b) Again, a lack of understanding gave rise to answers relating to antivirus software, encryption and
firewalls. Candidates seldom made more than one valid point, and that was usually either the use
of a strong password or changing a password regularly. Some candidates gave vague answers
about internet security, email security, or even changing passwords constantly.
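
As an illustration of what is meant by a 'strong' password, the sketch below applies some commonly
cited rules (minimum length, mixed case, digits and punctuation). The specific rules and the eight-character
minimum are assumptions chosen for illustration, not taken from the mark scheme.

    # Minimal sketch: commonly cited "strong password" rules (length,
    # mixed case, digits, punctuation). Thresholds are illustrative
    # assumptions, not taken from the mark scheme.
    import string

    def is_strong(password):
        return (len(password) >= 8
                and any(c.islower() for c in password)
                and any(c.isupper() for c in password)
                and any(c.isdigit() for c in password)
                and any(c in string.punctuation for c in password))

    print(is_strong("Winter!2016"))  # True
    print(is_strong("password"))     # False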

Question 7

This question on fraudulent emails was generally well answered, although a small minority misunderstood the
point of the question and tried to explain why the activity on the account had been suspicious rather than why
the email was suspicious. Successful candidates picked up on the spelling errors, the 'no-reply' email address
and the request to log in via a hyperlink.

Question 8

(a) This question on URLs was not well answered. Many general answers were seen and few
candidates gained marks. Where marks were obtained, it was usually for identifying http:// or
www.bbc.com, but this was rare. Some gained 'protocol' by expanding the acronym. Few stated
'domain name' for the second response. The article number was also not well answered, with
some thinking it was an IP address.
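
Teachers may find it helpful to show how the parts of a URL map onto the named components the question
asked for. The sketch below uses a hypothetical URL of the same general form; the path and article number
are invented for illustration.

    # Minimal sketch: naming the parts of a URL. The URL itself is
    # hypothetical; only its general form reflects the question.
    from urllib.parse import urlparse

    parts = urlparse("http://www.bbc.com/news/technology-12345678")
    print(parts.scheme)  # 'http' - the protocol
    print(parts.netloc)  # 'www.bbc.com' - the domain name
    print(parts.path)    # '/news/technology-12345678' - path ending in an article number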

(b) Again, very few candidates understood the concept of using a web browser rather than a search
engine, and few marks were awarded. Common incorrect answers such as 'cheaper' or 'faster'
were seen.

Question 9

This question on spelling and grammar checks was an example of a question which candidates seemed to
understand but were unable to put into words. When marks were awarded, it was usually either for people's
names not being recognised by a spellchecker or for a different version of English being used, i.e. US
English rather than UK/NZ English.

Question 10

(a) Candidates did not appear to grasp the concept of an audio-conference, preferring to answer with
regard to a video-based web-conference. A number of candidates referred to Skype, which is a
brand name. Very few marks were awarded. Some mentioned setting a time but then dealt only
with the hardware; others wrote about the methods of communication rather than the setting up of
the conference.

(b) Many candidates concentrated on using firewalls to prevent hacking or to stop viruses, without
providing the level of detail required by the question. Again, few marks were awarded.

Question 11

(a) Although familiar with the term, most candidates did not appear to understand the concept of the
cloud, preferring instead to think of it simply as virtual storage.

(b) Following on from their lack of understanding of the concept, it was unsurprising that candidates
were unable to articulate the benefits and drawbacks. Marks were rarely awarded, but when they
were it was usually for answers relating to accessing data from anywhere, or to the problems
caused if the internet connection is lost.

Question 12

This question on a computer-controlled burglar alarm was not well answered. Candidates usually described
just an overview of such a system and did not appear to understand the technology used in it. Very few
candidates mentioned the processing taking place.
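
The kind of processing cycle candidates were expected to describe (read a sensor, compare the reading
with a stored value, act on the result, and repeat) can be sketched as follows. The threshold, sensor type
and alarm action below are illustrative assumptions, not taken from the mark scheme.

    # Minimal sketch of a sensor-processing cycle: read, compare with a
    # preset value, act, repeat. Values and actions are illustrative.
    import random
    import time

    PRESET = 50  # hypothetical stored threshold

    def read_sensor():
        return random.randint(0, 100)  # stands in for a real sensor reading

    for _ in range(5):  # a real system loops continuously
        reading = read_sensor()
        if reading > PRESET:  # the microprocessor compares the reading with the preset value
            print("Reading", reading, "- sound alarm / alert the owner")
        time.sleep(0.1)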

Question 13

Very few complete answers were seen to this question on an expert system. Some answers involved the
components of a PC, while others related to the uses of expert systems.

Question 14

(a) Very few candidates seemed to be familiar with the use of robots on a car production line. Many
thought the camera was for monitoring the robot rather than the end product. A substantial number
thought the light sensor was there either to switch on the lights in the factory when it got dark or to
check that the car's headlights were working properly. A number thought that the pressure sensor
was used to check the correct pressure of the tyres or of the inside of the car; some, however, did
manage to give a creditable answer for the pressure sensor.

(b) This part on the advantages of robots was answered reasonably well, although some candidates
just gave 'faster' or 'cheaper' without qualification.

(c) This question on the disadvantages of using robots in manufacturing was less well answered than
part (b). Many candidates did not realise that if there was a power cut, human production would
also stop. Some very vague answers were seen, such as 'it is hard to control robots or program
them'.

(d) Very few candidates demonstrated that they understood the workings of a 3D printer. Marks, when
awarded, were usually for using a program to design the model, with very little else credited other
than the occasional mention of the printing material used in the printer. Many candidates focused
on improving a car and using the model as a prototype.

Question 15

(a) Few correct answers were seen to this question on a LAN; the answers gave little evidence of
technical knowledge.

(b) Many candidates were only able to access the first mark for each of the methods of connecting
computers to networks.

Question 16

(a) This question on spreadsheets was poorly answered. Very few candidates showed any depth of
understanding, and there appeared to be very little understanding of testing strategies.

(b) Some good answers relating to the use of an IF condition were seen. Occasional mention was
made of shading cells, but answers tended to be merely a rewording of the question.
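
The logic of the kind of IF condition credited here, for example a spreadsheet formula such as
=IF(B2>=40, "Pass", "Fail"), can be written out as below. The pass mark and the labels are illustrative
assumptions, not taken from the question.

    # Minimal sketch of the conditional logic behind a spreadsheet formula
    # such as =IF(B2>=40, "Pass", "Fail"). The pass mark of 40 is illustrative.
    def grade(mark):
        return "Pass" if mark >= 40 else "Fail"

    for mark in (35, 40, 72):
        print(mark, grade(mark))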

Question 17

Very little understanding was seen in the answers to this question on number plate recognition, especially
relating to the use of a camera. Some candidates thought chips or barcodes were used on the car, rather
than answering that this was a number plate recognition system. Some candidates managed to mention
OCR, and the automatic production of bills was sometimes referred to. Some candidates simply rewrote the
question when it came to the paying of the bill.
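
The sequence of steps such answers needed to cover (the camera captures the plate, OCR converts the
image to text, the plate is matched against stored records, and a bill is produced automatically) is sketched
below. The account table, plate and toll amount are invented for illustration, and the OCR stage is stubbed
out.

    # Minimal sketch of a number plate recognition sequence. The OCR
    # stage is a stub; plate, accounts and toll are illustrative.
    accounts = {"AB12 CDE": "account-001"}  # hypothetical registered plates

    def recognise_plate(image):
        # stands in for the OCR stage that reads characters from the image
        return "AB12 CDE"

    def process_vehicle(image, toll):
        plate = recognise_plate(image)
        account = accounts.get(plate)
        if account is not None:
            print("Bill", account, "automatically for", toll)
        else:
            print("Plate not registered; flag for manual follow-up")

    process_vehicle(image=None, toll=2.50)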

Question 18

This question on safety issues was not as well answered as expected. Weak answers included heavy
equipment falling and fire, where candidates did not specify CO2 when writing about fire extinguishers.

Question 19

Little understanding was shown in this question on smart appliances. Many candidates mentioned hacking,
but often did not expand on their major points. Credit was given for the interruption of the internet connection
rendering devices less functional, for devices being more expensive, and for controlling devices remotely
(though this was very rare in answers).
