GENERAL OBJECTIVE
SPECIFIC OBJECTIVES
See Section 4.2.2
Structured Free-form
Input Sentence:
John is a doctor and is handsome.
Input
James and Steve are students. They went to school. They saw Steve's friend by the classroom.
Preprocessing:124 - 1 1 : 1 5
// 1st word in Sentence 1 (James and Steve) = 5th word in Sentence 1 (students)
Preprocessing:124 - 2 1 : 1 1
// 1st word in Sentence 2 (James and Steve) = 1st word in Sentence 1 (James and Steve)
Preprocessing:124 - 3 1 : 1 1
// 1st word in Sentence 3 (James and Steve) = 1st word in Sentence 1 (James and Steve)
...
Preprocessing:124 - 3 10 : 3 10
// 10th word in Sentence 3 (classroom) = 10th word in Sentence 3 (classroom)
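The indexed pairs above link a mention at (sentence, word) back to its antecedent at (sentence, word). A minimal sketch of parsing one such log line, assuming the `sent word : sent word` layout after the logger prefix shown (the class and field names are illustrative, not the system's actual API):

```java
import java.util.Arrays;

// Parses coreference log lines of the form "Preprocessing:124 - 2 1 : 1 1",
// where "2 1 : 1 1" links the 1st word of sentence 2 to the 1st word of
// sentence 1. Format and class names are assumptions for illustration.
public class CorefLine {
    final int mentionSent, mentionWord, antecedentSent, antecedentWord;

    CorefLine(int ms, int mw, int as, int aw) {
        mentionSent = ms; mentionWord = mw;
        antecedentSent = as; antecedentWord = aw;
    }

    static CorefLine parse(String logLine) {
        // Drop the logger prefix up to and including " - ", then split on ":".
        String body = logLine.substring(logLine.indexOf(" - ") + 3);
        String[] sides = body.split(":");
        int[] left  = toInts(sides[0]);
        int[] right = toInts(sides[1]);
        return new CorefLine(left[0], left[1], right[0], right[1]);
    }

    private static int[] toInts(String s) {
        return Arrays.stream(s.trim().split("\\s+"))
                     .mapToInt(Integer::parseInt).toArray();
    }
}
```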
DEPENDENCY PARSING
Relations between the words in the sentence
Relations recognized by the system
compound
nmod:agent
nmod:to
Direct object
Adjectival modifier
Adverbial modifier
Nominal modifiers
Negation modifier
Conjunct: and
CONCEPT PARSER
Extracts the main idea in the sentence
Also uses linguistic patterns and templates
(see pages 99 to 101)
CONCEPT
SENTICNET
POLARITY
ELEMENT DETECTION
Rules/restrictions to detect each element
Character
Preferably named
if isA person (for common nouns)
Location
Preferably named
if isA place (for common nouns)
Events
Must have an action
Conflict
The clause with the lowest (most negative) polarity that
reached the threshold (-0.2)
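The conflict rule above can be sketched as a scan over clause polarities: among the clauses whose polarity has reached the threshold, keep the most negative one. A minimal illustration, where the Clause record and method names are assumptions, not the system's actual API:

```java
import java.util.List;

// Sketch of the conflict-detection rule: among clauses whose SenticNet
// polarity has reached the threshold (-0.2), choose the most negative one.
public class ConflictDetector {
    record Clause(String text, double polarity) {}

    static final double THRESHOLD = -0.2;

    static Clause findConflict(List<Clause> clauses) {
        Clause conflict = null;
        for (Clause c : clauses) {
            if (c.polarity() <= THRESHOLD
                    && (conflict == null || c.polarity() < conflict.polarity())) {
                conflict = c;  // keep the lowest-polarity qualifying clause
            }
        }
        return conflict;  // null when no clause reached the threshold
    }
}
```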
Resolution
Clause with a positive emotion found at the end
Must be experienced by at least one doer in the conflict
Resolution
If not a negated clause, it must be related to the
conflict within three hops
(see Figure 5-8 on page 105)
Example
Conflict : injury
Resolution : live
Resolution
If the conflict's main verb signifies a negative
emotion, all concepts from the conflict must be
found in the resolution, with the exception of
the negation verb
Example
Conflict : hate, going to school
Resolution : love, going to school
Resolution
If negated by the word not, all concepts must be
similar to the conflict's without not
Example
Conflict : not pretty
Resolution : pretty
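The resolution rules above reduce to concept-overlap checks against the conflict. A hedged sketch of the two matching cases, with hypothetical method names and underscore-joined concepts:

```java
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Sketch of the resolution-matching rules: the resolution's concepts must
// cover the conflict's concepts, ignoring the negative-emotion verb or the
// word "not". Concept lists and method names are illustrative assumptions.
public class ResolutionMatcher {

    // Rule: "hate, going to school" resolves to "love, going to school" --
    // every conflict concept except the negative-emotion verb must reappear.
    static boolean matchesNegativeEmotion(List<String> conflict,
                                          List<String> resolution,
                                          String negativeVerb) {
        Set<String> rest = new HashSet<>(conflict);
        rest.remove(negativeVerb);
        return resolution.containsAll(rest);
    }

    // Rule: "not pretty" resolves to "pretty" -- drop "not" from the
    // conflict and require the remaining concepts in the resolution.
    static boolean matchesNegated(List<String> conflict, List<String> resolution) {
        Set<String> rest = new HashSet<>(conflict);
        rest.remove("not");
        return resolution.containsAll(rest);
    }
}
```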
EXAMPLE
John went to China. John got an injury in China. So, John went to
the hospital. Doctor Chow gave John a medicine. John applied the
medicine. John healed and lived.
Start
Characters: John, Doctor Chow
Location: China
Conflict: John got an injury
Middle
Series of actions: went to hospital
gave medicine
applied medicine
End
Resolution: John lived
2.
KNOWLEDGE BASE
2.1.
SENTICNET
Only the concepts and polarity values were utilized
Code Listing 5-2 Example Concept in SenticNet
<rdf:Description rdf:about="http://sentic.net/api/en/concept/a_lot_of_flowers">
<rdf:type rdf:resource="http://sentic.net/api/concept"/>
<text>a lot of flowers</text>
<semantics rdf:resource="http://sentic.net/api/en/concept/flower"/>
<semantics rdf:resource="http://sentic.net/api/en/concept/love"/>
<semantics rdf:resource="http://sentic.net/api/en/concept/show_love"/>
<semantics rdf:resource="http://sentic.net/api/en/concept/rose"/>
<semantics rdf:resource="http://sentic.net/api/en/concept/give_flower"/>
<pleasantness>0.027</pleasantness>
<attention>0.093</attention>
<sensitivity>0.025</sensitivity>
<aptitude>0.071</aptitude>
<polarity>0.055</polarity>
</rdf:Description>
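Since only the concept text and polarity values are used, a minimal extraction sketch over one RDF description as laid out in Listing 5-2 may help (a regex shortcut for illustration, not a general RDF parser; class and method names are assumptions):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Extracts the <text> and <polarity> values from one SenticNet RDF
// description, as in Code Listing 5-2.
public class SenticNetEntry {

    private static final Pattern TEXT = Pattern.compile("<text>(.*?)</text>");
    private static final Pattern POLARITY =
            Pattern.compile("<polarity>(-?[0-9.]+)</polarity>");

    static String text(String rdf) {
        Matcher m = TEXT.matcher(rdf);
        return m.find() ? m.group(1) : null;
    }

    static double polarity(String rdf) {
        Matcher m = POLARITY.matcher(rdf);
        return m.find() ? Double.parseDouble(m.group(1)) : 0.0;
    }
}
```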
2.2.
CONCEPTNET / COMMON SENSE KNOWLEDGE BASE
FILTERING
Code Listing 5-1 Summary of steps done for filtering ConceptNet
4. Maximum of 3 words
5. All relations
ADDITIONAL KNOWLEDGE
Person attributes
Instances of persons
Location
Nationality
Size
Texture
Talent
2.4.
ABSTRACT STORY
REPRESENTATION
SENTENCE
John healed and lived.
= John healed + John lived
CLAUSE
CONCEPT(S) EVENT
DESCRIPTION
SENTICNET
POLARITY
CLAUSE
Code Listing 5-4 Properties of Clauses
protected List<String> concepts;
protected Map<String, Noun> doers;
protected boolean isNegated;
DESCRIPTION
Gabriel is a student and is tall.
= Gabriel is a student + Gabriel is tall
REFERENCE ATTRIBUTE
hasA, isA, notIsA, capableOf, hasProperty, notHasA, atLocation, notHasProperty
OTHER RELATIONS
Sentence                   | Assertion                  | Description
Shelie is not ugly.        | Shelie notHasProperty ugly | Concept B is a property not possessed by Concept A
Jill is not in Manila.     | Jill notAtLocation Manila  | Concept A is not in location Concept B
Harry does not have a dog. | Harry notHasA dog          | Concept B is a noun not possessed by Concept A
Paul is not an employee.   | Paul notIsA employee       | Concept A is not an instance of Concept B
Leon has a car.            | Car isOwnedBy Leon         | Concept B is possessed by Concept A; partnered with hasA
EVENTS
Felisa bought her sister a cake
(indirect object: her sister; direct object: a cake)
Felisa capableOf buy
SPECIAL CLAUSE
Also stores the polarity
Conflict
Resolution
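The fields in Code Listing 5-4, together with the slide above, suggest roughly the following shape. This is an assumed reconstruction: Noun is reduced to a plain String here, and the SpecialClause subclass storing the polarity is inferred from the slide, not taken from the thesis code:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch built around the fields in Code Listing 5-4. The doers map is
// simplified to id -> noun text for illustration.
public class Clause {
    protected List<String> concepts = new ArrayList<>();
    protected Map<String, String> doers = new HashMap<>();
    protected boolean isNegated;
}

// Conflict and Resolution are special clauses that also store the polarity.
class SpecialClause extends Clause {
    protected double polarity;
}
```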
NOUNS
Code Listing 5-3 Properties of Nouns
protected String id;
protected boolean isCommon;
protected Map<String, List<String>> attributes;
protected Map<String, Map<String, Noun>> references;
4 CATEGORIES
Character
Object
Location
else...Unknown
NOUNS
Marie is a student, is beautiful and loves to read.

Property   | Marie                        | student
id         | Marie                        | student
isCommon   | false                        | true
attributes | Marie hasProperty beautiful, |
           | Marie capableOf love         |
references | Marie isA student            |
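A sketch of how the Marie example would populate the fields from Code Listing 5-3; the constructor and the two helper methods are assumptions added for illustration:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Field names come from Code Listing 5-3; helpers are illustrative.
public class Noun {
    protected String id;
    protected boolean isCommon;
    protected Map<String, List<String>> attributes = new HashMap<>();
    protected Map<String, Map<String, Noun>> references = new HashMap<>();

    Noun(String id, boolean isCommon) {
        this.id = id;
        this.isCommon = isCommon;
    }

    // Record e.g. "Marie hasProperty beautiful".
    void addAttribute(String relation, String value) {
        attributes.computeIfAbsent(relation, k -> new ArrayList<>()).add(value);
    }

    // Record e.g. "Marie isA student", linking to another Noun.
    void addReference(String relation, Noun other) {
        references.computeIfAbsent(relation, k -> new HashMap<>())
                  .put(other.id, other);
    }
}
```

Here "Marie" would be a proper noun (isCommon = false) holding hasProperty beautiful and capableOf love as attributes, plus an isA reference to the common noun "student", matching the table above.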
STORY ELEMENTS
Description of each element
Start
Characters
a person that can be named or not
Location
a place that can be named or not
Conflict
a special clause with the lowest (most negative)
polarity that reached the threshold
STORY ELEMENTS
Description of each element
Middle
Series of Actions
At least 2 event clauses
End
Resolution
A special clause that is the opposite of, or
related to, the conflict;
ideally, the solution to the conflict
3.
GENERATING STORY
TEXTS AND PROMPTS
TWO ROLES OF ALICE
Collaborator Facilitator
COLLABORATOR
Initial threshold = 3
Check the latest sentence: compare the frequency of every noun
mentioned against the threshold
If the frequency of a noun is less than the threshold, add the noun as a
candidate to prompt
Else, if all the nouns in the current sentence have reached the
threshold, check the frequency of the nouns found in the
previous sentences
If the frequency of a noun is less than the threshold, add the noun
as a candidate to prompt
Else, if all the nouns have reached the threshold, increase the
threshold by 2
Randomly pick a noun from the candidates to prompt
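The steps above can be sketched as follows. The method names, the candidate-gathering helper, and returning null when no prompt is produced this turn are assumptions, not the system's actual interface:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.Random;

// Sketch of the collaborator's noun selection: gather nouns whose mention
// frequency is still below the threshold, first from the latest sentence
// and then from earlier ones; if every noun reached the threshold, raise
// the threshold by 2 and skip prompting.
public class PromptNounPicker {
    private int threshold = 3;  // initial threshold, per the slide
    private final Random random = new Random();

    String pick(List<String> latestSentenceNouns,
                List<String> previousNouns,
                Map<String, Integer> frequency) {
        List<String> candidates = below(latestSentenceNouns, frequency);
        if (candidates.isEmpty()) {
            candidates = below(previousNouns, frequency);
        }
        if (candidates.isEmpty()) {
            threshold += 2;  // everything reached the threshold: relax it
            return null;     // no prompt this turn
        }
        return candidates.get(random.nextInt(candidates.size()));
    }

    private List<String> below(List<String> nouns, Map<String, Integer> freq) {
        List<String> out = new ArrayList<>();
        for (String n : nouns) {
            if (freq.getOrDefault(n, 0) < threshold) out.add(n);
        }
        return out;
    }

    int threshold() { return threshold; }
}
```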
COLLABORATOR
CATEGORIES
Character
Object
Location
CONCEPTNET
used as supplementary knowledge
TEMPLATE-BASED APPROACH
COLLABORATOR
[Table: Relation, Part of Story, Templates, Text Generated; 15 template combinations]
Character/Location
Conflict
Two Events
Resolution
Restrictions
Answer Checker
See appendix for test cases and their results
CHILDREN EVALUATION
User Acceptance Testing
Round 1
17 kids
Grade 3 to 4
Public and private schools
Nine 8-year-olds
Eight 9-year-olds
Round 2
7 kids
Grade 3 to 4
Private schools
Only 5 were able to answer the evaluation form
CHILDREN EVALUATION
User Acceptance Testing
Methodology
The students were briefed about the system (its purpose
and features) and were shown a demo on how to use the
system prior to testing
The students were made aware that an evaluation form was
to be answered after using the system
The students were observed while they were interacting
with Alice
The students' interaction with Alice was logged in a file
The students were also interviewed while they were
answering the evaluation form
CHILDREN EVALUATION
Item 1. Were you able to finish your story using Alice?
(based on self-evaluation)
        Yes  No
Round 1   9   8
Round 2   5   2
CHILDREN EVALUATION
Item 1. Were you able to finish your story using Alice?
Round 1 0 17
Round 2 2 5
Round 1:
Strict rule to detect location
Relies on NER for character/location
Conflict did not reach the -0.2 threshold

Round 2:
Most reached the end
Lenient location rule
Most used "hates" and "does not like" for the conflict
Uses SRL for common nouns
CHILDREN EVALUATION
Item 2. Was Alice able to help you when you were
writing a story?
        Yes  No
Round 1  13   4
Round 2   7   0
Wrong topic
Child: once opon a time there was a rabbit who was having a tour
around the forest and saw a house and he saw many poitions and he
wanted to touch it

Delayed prompt
Child wrote: Once a upon a time there is a baby tiger play with his
friends and he became big now and he are wild and dangerous. and he
killed his family and now he is in the zoo

Child wrote: The church is in Singapore. And after that he went home.
Ralph ask his father, daddy I went to church.
Prompt Generated: I want to hear more about boy's father.
Child wrote: Ralph's daddy is Eric.
CHILDREN EVALUATION
Item 5. Are the ideas & suggestions helpful?
Round 2 5 0 0 2
Friend
A student in appearance
Helped them in writing by giving ideas/suggestions
Teacher
Guided them in writing their story
Helped them in writing by giving ideas/suggestions
Neither
Several mistakes
Did not use Alice
CHILDREN EVALUATION
Item 7. What other features would you like to see in
Alice?
CHILDREN EVALUATION
Item 8. What can you say about the user interface?
Comments
Easy to navigate
Colors, layout and fonts were fine
o Three children said that they wanted it to be in
other colors
The peer's appearance is OK
o One child asked for a male counterpart of
Alice
Round 1 Observations
Children thought that there was only one To Do List
even when they were briefed
Children were confused on how to terminate the
dialog boxes
EXPERT EVALUATION
User Acceptance Testing
Objective
Evaluate the appropriateness and the effectiveness of
Alice (see page 51)
Methodology
Evaluators were briefed about the criteria and the logs
from the children's interaction with Alice
Evaluators were given two similar evaluation criteria,
one for round 1 and one for round 2 testing
Rate the prompts and story segments from 1.0 to 5.0,
with 1.0 being the lowest and 5.0 as the highest
EXPERT EVALUATION
User Acceptance Testing
Evaluators
Ms. Pacis
Thoroughly evaluated the logs for the round
1 testing
Gave the qualitative rating for the logs of the
round 2 testing
Mr. Gojo-Cruz
Only evaluated the round 2 testing logs
FACILITATOR
Expert Evaluation
Ideal characteristics
Found under the 5.0 rating of the evaluation criteria
Too general (Prompt Generator)
Prompt Generated: Write more about the family.
Prompt Generated: Tell something more about the family.
Prompt Generated: I want to hear more about the family.
Prompt Generated: Describe the family.
Prompt Generated: Write more about the family.

Wrong topic (Content Planner)
Child: she saw the children while Nanny was asking for help she
introduced herself and her name was Lucy the big person and the
children told her to open the window

Child error
Child wrote: and he put it out of the house and the rabbit did not
nkow he was out of the house and that morning the wisard went out
of the house to get ingridiens for a potion
Corrected Prompt:
Tell me more! What happened?
Tell me what happened.

Missing pronouns or articles
Prompt Generated: Tell something more about boy's day
Correct Prompt: Tell me something more about the boy's day
Frequency of Issues
Issue                              Round 1   Round 2
Concepts are not age appropriate     40%       17%
Examples:
A mother is an abbess.
Japan is a lacquerware.
A jar is a containerful.
COLLABORATOR
Examples of story segments that make no sense
Round 1: Jenna is a populate. Gabriela is a grammatical category.
Round 2: A water can be blue. Jonah olso's water is Life.
COLLABORATOR
Examples of story segments that are unconnected to the story
Child wrote : A rabbit jump on Alice shoes and she follow the rabbit and
the rabbit went to a hole and Alice fall to a house and drink a potion and
make her small and her dress was bigger than her
References
Cambria, E., Olsher, D., & Rajagopal, D. (2014). SenticNet 3: A common and common-sense knowledge base for cognition-driven
sentiment analysis. In C. E. Brodley & P. Stone (Eds.), Proceedings of the twenty-eighth AAAI conference on artificial
intelligence, July 27-31, 2014, Quebec City, Quebec, Canada (pp. 1515-1521). AAAI Press. Retrieved from
http://www.aaai.org/ocs/index.php/AAAI/AAAI14/paper/view/8479
Cassell, J. (2001). Towards a model of technology and literacy development: Story listening systems (Tech. Rep. No. ML-GNL-01-1).
Massachusetts Institute of Technology.
Cassell, J., Ryokai, K., Druin, A., Klaff, J., Laurel, B., & Pinkard, N. (2000). StorySpaces: Interfaces for children's voices. In CHI
'00 extended abstracts on human factors in computing systems (pp. 243-244). New York, NY, USA: ACM. Retrieved from
http://doi.acm.org/10.1145/633292.633434 doi: 10.1145/633292.633434
Chuu, C., & Kim, H. (2012). Storyfighter: A common sense storytelling game.
Gottlieb, D., & Juster, J. (n.d.). Generating a dynamic gaming environment using OMCS.
Hourcade, J. P., Bederson, B. B., Druin, A., & Taxén, G. (2002). KidPad: A collaborative storytelling tool for children. In
CHI 2002 extended abstracts. ACM.
Jerz, D., & Kennedy, K. (n.d.). Short story tips: 10 ways to improve your creative writing. Retrieved from
http://jerz.setonhill.edu/writing/creative1/shortstory/
Liu, H., & Singh, P. (2002). MakeBelieve: Using commonsense knowledge to generate stories. In R. Dechter & R. S. Sutton (Eds.),
AAAI/IAAI (pp. 957-958). AAAI Press / The MIT Press. Retrieved from
http://dblp.uni-trier.de/db/conf/aaai/aaai2002.html#LiuS02
Liu, H., & Singh, P. (2004). ConceptNet - a practical commonsense reasoning tool-kit. BT Technology Journal, 22(4), 211-226.
Retrieved from http://dx.doi.org/10.1023/B:BTTJ.0000047600.45421.6d doi: 10.1023/B:BTTJ.0000047600.45421.6d
Manning, C. D., Surdeanu, M., Bauer, J., Finkel, J. R., Bethard, S., & McClosky, D. (2014). The Stanford CoreNLP natural language
processing toolkit. In Proceedings of the 52nd annual meeting of the Association for Computational Linguistics, ACL 2014,
June 22-27, 2014, Baltimore, MD, USA, system demonstrations (pp. 55-60). Retrieved from
http://aclweb.org/anthology/P/P14/P14-5010.pdf
McIntyre, N., & Lapata, M. (2009). Learning to tell tales: A data-driven approach to story generation. In Proceedings of the joint
conference of the 47th annual meeting of the ACL and the 4th international joint conference on natural language processing of
the AFNLP: Volume 1 (pp. 217-225). Stroudsburg, PA, USA: Association for Computational Linguistics. Retrieved
from http://dl.acm.org/citation.cfm?id=1687878.1687910
Ong, E. (2014). Picture Books: Challenges and opportunities in automatic story generation. In S. Ona & Z. C. Pablo (Eds.),
Information and Communications Technology in the philippines: Contemporary Perspective (pp. 1-17). Manila, Philippines:
De La Salle University Publishing House.
Ong, E., Bienes, K., Jimenez, N., Miranda, E., & Pascual, G. (2014). A system for collecting commonsense knowledge from
children. DLSU Research Congress 2014, De La Salle University, Manila.
Rambo, R. (2015). English Composition 1. Retrieved April 04, 2016, from http://www2.ivcc.edu/rambo/eng1001/sentences.htm
Robertson, J., & Good, J. (2003). Ghostwriter: A narrative virtual environment for children. In Proceedings of the 2003 conference
on interaction design and children (pp. 85-91). New York, NY, USA: ACM. Retrieved from
http://doi.acm.org/10.1145/953536.953549 doi: 10.1145/953536.953549
Roxas, R. J., Huang, D. L., Peralta, B. E., & Ong, E. (2014). Generating text descriptions in the Alex interactive storytelling system
using a semantic ontology. Philippine Computing Journal, 9(1), 34-43.
Singh, P. (2001). The public acquisition of commonsense knowledge. Retrieved from citeseer.ist.psu.edu/singh02public.html
Xu, Y., Park, H., & Baek, Y. (2011). A new approach toward digital storytelling: An activity focused on writing self-efficacy in a
virtual learning environment. Educational Technology & Society, 14(4), 181-191. Retrieved from
http://dblp.uni-trier.de/db/journals/ets/ets14.html#XuPB11
Williams, R., Barry, B., & Singh, P. (2005). Comickit: Acquiring story scripts using common sense feedback. In Proceedings of the
10th international conference on intelligent user interfaces (pp. 302-304). New York, NY, USA: ACM. Retrieved from
http://doi.acm.org/10.1145/1040830.1040907 doi: 10.1145/1040830.1040907
WordNet. (2015). Retrieved from https://wordnet.princeton.edu/