CS606-Midterm
1 / 50
FOLLOW(C) is ____.
2 / 50
_______ algorithm is used in DFA minimization.
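For review: the usual answer here is Hopcroft's (or Moore's) partition-refinement algorithm. A minimal Python sketch of Moore-style partition refinement, run on a small hypothetical three-state DFA in which two states turn out to be equivalent (the DFA and function names are ours, not from the course):

```python
def minimize(states, alphabet, delta, accepting):
    """Moore-style DFA minimization by iterative partition refinement."""
    # Start from the accepting / non-accepting split.
    partition = [set(accepting), set(states) - set(accepting)]
    partition = [block for block in partition if block]
    changed = True
    while changed:
        changed = False
        new_partition = []
        for block in partition:
            # Group states in this block by which block each symbol leads to.
            groups = {}
            for s in block:
                key = tuple(
                    next(i for i, b in enumerate(partition) if delta[s][c] in b)
                    for c in alphabet
                )
                groups.setdefault(key, set()).add(s)
            new_partition.extend(groups.values())
            if len(groups) > 1:
                changed = True
        partition = new_partition
    return partition

# Hypothetical DFA over {a, b} accepting strings that end in 'b';
# states 1 and 2 behave identically and should merge.
states = {0, 1, 2}
alphabet = ['a', 'b']
delta = {0: {'a': 0, 'b': 1}, 1: {'a': 0, 'b': 2}, 2: {'a': 0, 'b': 2}}
accepting = {1, 2}
blocks = minimize(states, alphabet, delta, accepting)
print(blocks)
```

Hopcroft's algorithm refines the same idea with a worklist to reach O(n log n); the fixed point reached is the same partition.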
3 / 50
The responsibility of the _____ is to produce fast and compact code.
4 / 50
Consider the grammar
5 / 50
The _______ of a two-pass compiler consists of instruction selection, register allocation, and instruction scheduling.
6 / 50
Optimal register allocation is an NP-hard problem.
7 / 50
An AST summarizes the grammatical structure with the details of derivations.
8 / 50
When generating a lexical analyzer from a token description, the item sets (states) are constructed by two types of “moves”: character moves and ____ moves.
9 / 50
In the LL(1) parsing algorithm, the _________ contains a sequence of grammar symbols.
10 / 50
Yacc contains built-in support for handling ambiguous grammars resulting in shift-reduce conflicts. By default, these conflicts are resolved by performing the ________.
11 / 50
The lexical analyzer generator _______ is written in Java.
12 / 50
Lexer and scanner are two different phases of a compiler.
13 / 50
A _______ is a top-down parser.
14 / 50
When generating a lexical analyzer from a ________ description, the item sets (states) are constructed by two types of “moves”: character moves and ε-moves.
15 / 50
An alternative to backtracking in a parser is using a lookahead symbol in ______.
16 / 50
Left factoring is enough to make a grammar LL(1).
17 / 50
LR parsers can handle _______ grammars.
18 / 50
We can get an LL(1) grammar by _______ .
19 / 50
Can a DFA simulate an NFA?
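For review, yes: any NFA can be simulated by a DFA whose states are sets of NFA states (the subset construction). A minimal Python sketch that runs the construction on the fly, using a hypothetical NFA for strings over {a, b} ending in "ab" (no ε-moves, for brevity; the NFA itself is our example, not from the quiz):

```python
from functools import reduce

# Hypothetical NFA: state -> symbol -> set of next states.
# Accepts strings over {a, b} that end in "ab".
nfa = {
    0: {'a': {0, 1}, 'b': {0}},
    1: {'b': {2}},
    2: {},
}
start, accepting = {0}, {2}

def step(states, symbol):
    """One DFA step: union of NFA moves from every current state."""
    return reduce(set.union,
                  (nfa[s].get(symbol, set()) for s in states),
                  set())

def accepts(word):
    states = start
    for ch in word:
        states = step(states, ch)
    # Accept if any tracked NFA state is accepting.
    return bool(states & accepting)

print(accepts("aab"))   # ends in "ab" -> True
print(accepts("abb"))   # -> False
```

Precomputing all reachable state-sets up front yields the explicit equivalent DFA; in the worst case it has exponentially more states than the NFA.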
21 / 50
In the compilation process, hierarchical analysis is also called ________.
23 / 50
The transition graph for an NFA that recognizes the language (a|b)*abb will have the following set of states.
24 / 50
FIRST(C) is ______.
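For review: FIRST(X) is the set of terminals that can begin a string derived from X (plus ε if X is nullable). The quiz's grammar is not shown here, so the standard fixed-point computation is sketched on a hypothetical grammar with non-terminals S, A, C (ε written as the empty string):

```python
# Hypothetical grammar:
#   S -> A C
#   A -> a A | ε
#   C -> c | d
grammar = {
    'S': [['A', 'C']],
    'A': [['a', 'A'], []],   # [] encodes the ε production
    'C': [['c'], ['d']],
}

def compute_first(grammar):
    """Fixed-point computation of FIRST sets ('' stands for ε)."""
    first = {nt: set() for nt in grammar}

    def first_of(sym):
        # Terminals are their own FIRST set.
        return first[sym] if sym in grammar else {sym}

    changed = True
    while changed:
        changed = False
        for nt, prods in grammar.items():
            for prod in prods:
                added = set()
                nullable_prefix = True
                for sym in prod:
                    added |= first_of(sym) - {''}
                    if '' not in first_of(sym):
                        nullable_prefix = False
                        break
                if nullable_prefix:        # every symbol (or none) was nullable
                    added.add('')
                if not added <= first[nt]:
                    first[nt] |= added
                    changed = True
    return first

first = compute_first(grammar)
print(sorted(first['C']))  # ['c', 'd']
```

Here FIRST(C) = {c, d}, FIRST(A) = {a, ε}, and, because A is nullable, FIRST(S) = {a, c, d}.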
25 / 50
We use ______ to mark the bottom of the stack and also the right end of the input when considering the stack implementation of shift-reduce parsing.
26 / 50
LR parsing _____ a string to the start symbol by inverting productions.
27 / 50
The ___________ phase supports macro substitution and conditional compilation.
29 / 50
Left factoring of a grammar is done to save the parser from backtracking.
30 / 50
Left factoring is enough to make a grammar LL(1).
31 / 50
In a transition table, the cells contain the ________ state.
32 / 50
For each language, to make an LL(1) grammar we take two steps: first removing left recursion, and second applying left factoring.
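For review, the first of those two steps rewrites immediate left recursion A -> A α | β into A -> β A' and A' -> α A' | ε. A Python sketch on the classic E -> E + T | T example (the helper name and encoding are ours, not from the course):

```python
def remove_left_recursion(name, alternatives):
    """Rewrite immediate left recursion A -> A α | β as
    A -> β A' and A' -> α A' | ε."""
    rec = [alt[1:] for alt in alternatives if alt and alt[0] == name]
    non_rec = [alt for alt in alternatives if not alt or alt[0] != name]
    if not rec:
        return {name: alternatives}   # nothing to do
    new = name + "'"
    return {
        name: [alt + [new] for alt in non_rec],
        new: [alt + [new] for alt in rec] + [['ε']],
    }

# E -> E + T | T  becomes  E -> T E' ,  E' -> + T E' | ε
result = remove_left_recursion('E', [['E', '+', 'T'], ['T']])
print(result)
```

The rewrite preserves the language but changes the derivation shape, which is why it is paired with left factoring before building an LL(1) table.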
33 / 50
LL(1) parsing is called non-predictive parsing.
34 / 50
In a three-pass compiler, the ____ is used for code improvement or optimization.
35 / 50
The following two items, A -> P • Q and B -> P • Q, can co-exist in an ______ item set.
36 / 50
In a predictive parsing table, the rows are ___________.
37 / 50
The parser takes tokens from the scanner and tries to generate ______.
38 / 50
Compilers are sometimes classified as ________.
39 / 50
In the back-end module of a compiler, optimal register allocation uses ______.
41 / 50
Which of the statements is true about regular languages?
42 / 50
The front end of a two-pass compiler takes ________ as input.
43 / 50
One of the core tasks of a compiler is to generate fast and compact executable code.
44 / 50
Recursive ____________ parsing is done for an LL(1) grammar.
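For review: recursive-descent parsing implements one procedure per non-terminal and picks the production using a single lookahead symbol. A minimal sketch for a hypothetical LL(1)-style grammar of digit expressions with '+' and parentheses (grammar and names are our example, not from the quiz):

```python
# Hypothetical grammar:
#   E -> T ('+' T)*
#   T -> digit | '(' E ')'

def parse(src):
    pos = 0

    def peek():
        return src[pos] if pos < len(src) else None

    def eat(expected=None):
        nonlocal pos
        ch = peek()
        if expected is not None and ch != expected:
            raise SyntaxError(f"expected {expected!r} at position {pos}")
        pos += 1
        return ch

    def expr():                 # E -> T ('+' T)*
        value = term()
        while peek() == '+':    # one lookahead symbol decides
            eat('+')
            value += term()
        return value

    def term():                 # T -> digit | '(' E ')'
        if peek() == '(':
            eat('(')
            value = expr()
            eat(')')
            return value
        return int(eat())

    value = expr()
    if pos != len(src):
        raise SyntaxError("trailing input")
    return value

print(parse("(1+2)+3"))  # 6
```

Because each choice is decided by one lookahead token, no backtracking is ever needed, which is exactly the LL(1) property.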
45 / 50
Typical compilation means translating programs written in high-level languages into low-level ____________.
46 / 50
What are the functions of a lexical analyzer?
47 / 50
An NFA is easier to implement than a DFA.
48 / 50
____ is evaluated to yield a value.
49 / 50
Intermediate Representation (IR) stores the value of its operand in ________.
50 / 50
Grammars with LL(1) conflicts can be made LL(1) by applying left-factoring, substitution, and left-recursion removal. Left-factoring takes care of ________ conflicts.
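For review: left factoring resolves FIRST/FIRST conflicts by pulling the common prefix out of alternatives so one lookahead token can choose. A Python sketch of one factoring round on a hypothetical dangling-else-style pair of productions (function names and grammar are ours):

```python
def common_prefix(alts):
    """Longest sequence of symbols shared by all alternatives."""
    prefix = []
    for syms in zip(*alts):
        if len(set(syms)) == 1:
            prefix.append(syms[0])
        else:
            break
    return prefix

def left_factor(name, alternatives):
    """One round of left factoring: A -> p x | p y  becomes
    A -> p A' and A' -> x | y (ε for an empty tail)."""
    prefix = common_prefix(alternatives)
    if not prefix:
        return {name: alternatives}   # no FIRST/FIRST overlap to factor
    new = name + "'"
    tails = [alt[len(prefix):] or ['ε'] for alt in alternatives]
    return {name: [prefix + [new]], new: tails}

# Hypothetical dangling-else-style productions:
#   S -> if E then S | if E then S else S
alts = [['if', 'E', 'then', 'S'],
        ['if', 'E', 'then', 'S', 'else', 'S']]
print(left_factor('S', alts))
```

After factoring, both original alternatives start through the single production S -> if E then S S', so the parser no longer has to choose between two productions that both begin with "if".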