In this session I will talk about dependency trees, which are an important topic in natural language processing. A parse tree is a graph representation of the syntactic structure of a string according to some context-free grammar. A parse tree is usually constructed based on either the constituency relation of constituency grammars or the dependency relation of dependency grammars. A parse tree captures the syntactic structure of the text in a more machine-readable format, and thus provides a better structure for text analysis than the bag-of-words representation. Also, from the semantics perspective, a parse tree is a much richer semantic structure than the bag-of-words representation. From the perspective of parsing, a sentence can be parsed by relating each word to the other words in the sentence that depend on it. The syntactic parsing of a sentence consists of finding the correct syntactic structure of that sentence in a given formalism or grammar. Dependency Grammar and Phrase Structure Grammar are two such formalisms. Dependency parsing, based on Dependency Grammar, has been more frequently used as part of many NLP applications than constituency parsing, based on Phrase Structure Grammar.

A Phrase Structure Grammar is defined by phrase structure rules. Phrase Structure Grammars are based on the constituency relation, as opposed to the dependency relation associated with dependency grammars. The way it works is that it breaks a sentence into constituents, which are phrases. Constituents are then broken into smaller constituents, such as noun phrases, prepositional phrases, verb phrases, and so on. Phrasal structures are often recursive. This property of language allows for the embedding of phrases within phrases of the same category, which in principle permits infinitely long phrases. In the example on the slide, the noun phrase "a boy in a bubble" has a sub noun phrase, "a bubble", in it. Continuing on phrase structure trees, let's take another example.
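To make the recursion concrete, here is a minimal sketch in plain Python (my own illustration, not part of the lecture slides) that represents one plausible constituency tree for "a boy in a bubble" as nested (label, children) pairs and walks it to find the embedded noun phrase.

```python
# A constituent is a (label, children) pair; a leaf is just a word string.
# One plausible phrase-structure analysis of "a boy in a bubble":
#   NP -> Det N PP,   PP -> P NP,   NP -> Det N
tree = ("NP", [
    ("Det", ["a"]),
    ("N", ["boy"]),
    ("PP", [
        ("P", ["in"]),
        ("NP", [                 # the embedded (recursive) noun phrase
            ("Det", ["a"]),
            ("N", ["bubble"]),
        ]),
    ]),
])

def find_phrases(node, label):
    """Return every subtree whose category matches `label`, in preorder."""
    if isinstance(node, str):        # a leaf word has no category label
        return []
    node_label, children = node
    found = [node] if node_label == label else []
    for child in children:
        found.extend(find_phrases(child, label))
    return found

def yield_of(node):
    """Return the words covered by a constituent, left to right."""
    if isinstance(node, str):
        return [node]
    _, children = node
    words = []
    for child in children:
        words.extend(yield_of(child))
    return words

noun_phrases = find_phrases(tree, "NP")
print([" ".join(yield_of(np)) for np in noun_phrases])
# The outer NP "a boy in a bubble" contains the embedded NP "a bubble".
```

Because constituents nest, `find_phrases` returns both the outer noun phrase and the one embedded inside its prepositional phrase, which is exactly the recursion the lecture describes.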
Take the sentence "The red figures on the screens indicated falling stocks." In this example, from one head word, which is also called the root, at the top, there are two phrases right under the root: one noun phrase and one verb phrase. The noun phrase accounts for the portion "the red figures on the screens", and it consists of one sub noun phrase and one sub prepositional phrase. The verb phrase accounts for "indicated falling stocks", and it consists of one verb and one sub noun phrase.

In contrast to Phrase Structure Grammar, the syntactic structure of Dependency Grammar consists of lexical items linked by binary asymmetric relations called dependencies. Dependency is the notion that linguistic units, such as words, are connected to each other by directed links. Dependency Grammar is interested in the grammatical relations between individual words. On one side, a word plays the role of governor, and on the other side, a word plays the role of dependent. Dependency Grammar does not propose a recursive structure; rather, it proposes a network of relations. These relations can also have labels. The next slide shows an example. As you see in this example, phrasal nodes are missing in the dependency structure when compared to the constituency structure. Consider again the sentence "The red figures on the screen indicated falling stocks." In this example, instead of a recursive phrase structure, there are relations between pairs of words. The words "red" and "figures" have the Nmod relation between them; Nmod stands for nominal modifier. The Nmod relation is used for nominal modifiers of nouns or clausal predicates. Another dependency relation is VARG, V-A-R-G, which represents verb-argument dependencies such as subject and object. Let's wrap up the discussion of constituency trees versus dependency trees. Dependency structures explicitly represent head-dependent, or governor-dependent, relations, which are shown as directed arcs between words.
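Those directed, labeled arcs can be sketched as plain data. The snippet below (my own illustration) encodes a dependency analysis of the example sentence as (governor, relation, dependent) triples, using the lecture's label scheme (Nmod, VARG); the exact arc set is an illustrative assumption, not a gold-standard parse.

```python
# Dependency structure as a flat list of labeled, directed arcs:
# (governor, relation, dependent). No phrasal nodes, just word-to-word links.
# Arcs for "red figures on the screen indicated falling stocks";
# the arc set below is an illustrative guess following the lecture's labels.
arcs = [
    ("figures", "Nmod", "red"),       # "red" nominally modifies "figures"
    ("figures", "Nmod", "on"),        # prepositional modifier of "figures"
    ("on", "Nmod", "screen"),
    ("screen", "Nmod", "the"),
    ("indicated", "VARG", "figures"), # subject argument of the verb
    ("indicated", "VARG", "stocks"),  # object argument of the verb
    ("stocks", "Nmod", "falling"),
]

def dependents(word):
    """All (relation, dependent) pairs governed by `word`."""
    return [(rel, dep) for gov, rel, dep in arcs if gov == word]

def governor(word):
    """The word that governs `word`, or None for the root."""
    for gov, rel, dep in arcs:
        if dep == word:
            return gov
    return None

print(dependents("indicated"))  # the root verb's VARG arguments
print(governor("red"))          # "figures" governs "red"
```

Note how the network is flat: every relation links exactly two words, and the root is simply the one word with no governor.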
Dependency structures also express functional categories, which are represented by arc labels. A dependency tree may additionally indicate structural categories through the parts-of-speech tags of the words. In contrast, phrase structure explicitly represents phrases, which are the non-terminal nodes. In addition, phrase structure represents structural categories, which are the non-terminal labels, and these may indicate some kinds of functional categories. Here, a functional category means a grammatically functional category.
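As a rough summary of this contrast, the sketch below (my own illustration, not from the slides) builds both representations for a simplified toy sentence and prints the label inventory each one exposes: structural categories on non-terminal nodes for phrase structure, versus functional categories on arcs for the dependency structure.

```python
# The same toy sentence, "red figures indicated stocks", in both forms.

# Phrase structure: labels live on non-terminal nodes (structural categories).
constituency = ("S", [
    ("NP", [("Adj", ["red"]), ("N", ["figures"])]),
    ("VP", [("V", ["indicated"]), ("NP", [("N", ["stocks"])])]),
])

# Dependency structure: labels live on arcs (functional categories).
dependency = [
    ("figures", "Nmod", "red"),
    ("indicated", "VARG", "figures"),
    ("indicated", "VARG", "stocks"),
]

def nonterminal_labels(node):
    """Collect the structural category of every non-terminal node."""
    if isinstance(node, str):        # a leaf word carries no category
        return []
    label, children = node
    labels = [label]
    for child in children:
        labels.extend(nonterminal_labels(child))
    return labels

print(sorted(set(nonterminal_labels(constituency))))  # structural categories
print(sorted({rel for _, rel, _ in dependency}))      # functional categories
```

The first inventory (S, NP, VP, and the pre-terminal tags) names pieces of structure; the second (Nmod, VARG) names grammatical functions between words, which is the distinction this comparison is drawing.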