3AC and DAGs

Started by cman, July 09, 2007, 07:10:48 PM


cman

Are there any disadvantages to generating DAGs during parsing rather than afterwards (after 3AC has been generated from syntax trees built in the parsing stage)? Just curious, since in most of the things I've read the use of DAGs for expressions follows the generation of 3AC. Thanks for any input. :U

Mark Jones

It doesn't look like anyone understands what you are talking about. Could you explain?
"To deny our impulses... foolish; to revel in them, chaos." MCJ 2003.08

tenkey

Presumably, the DAG is nearly worthless without the 3AC in the first place. As the DAG can be generated from the 3AC, you would be keeping the parser simple by having it generate only the 3AC. The 3AC can be generated linearly, just like outputting assembly language source code, so you can send it to a file without worrying about memory management.

DAG is "directed acyclic graph". A graph can be drawn as a bunch of circles with lines between them. The circles are nodes, and the lines are known as edges. "Directed" refers to the property that all edges have direction, so each line is drawn as an arrow. "Acyclic" means there are no cycles of arrows: following the arrows, you can never get back to a node you have already visited. A state transition diagram or a flow chart is drawn the same way, as a directed graph, and the flow chart (flow graph) is what is being represented in certain types of optimization.
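
In code, a rough sketch of such a node could look like the C struct below. The field names are only illustrative, not from any particular compiler; the point is that each "circle" becomes a record and each "arrow" becomes a child pointer.

/* Sketch of a DAG node for an expression. "Acyclic" means that following
   the left/right pointers can never lead back to the node you started from. */
struct dag_node {
    int              op;             /* operator, or a marker meaning "leaf" */
    struct dag_node *left, *right;   /* outgoing edges to the operand nodes  */
};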

3AC is three address code. Each instruction has three address fields, which are usually two operands and one result. They are also known as quadruples (or quads, for short). The fourth field is the opcode field.
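
For illustration only, one way a quad could be laid out is the C struct below. The enum, the single-character "addresses" and the field names are my own assumptions, not any particular compiler's format.

#include <stdio.h>

enum opcode { OP_ADD, OP_SUB };

typedef struct {
    enum opcode op;               /* the fourth (opcode) field */
    char arg1, arg2, result;      /* the three address fields  */
} quad;

int main(void)
{
    quad block[] = {
        { OP_ADD, 'b', 'c', 'a' },    /* a = b + c */
        { OP_SUB, 'a', 'd', 'e' },    /* e = a - d */
    };
    for (int i = 0; i < 2; i++)
        printf("%c = %c %c %c\n", block[i].result, block[i].arg1,
               block[i].op == OP_ADD ? '+' : '-', block[i].arg2);
    return 0;
}

Each instruction is one fixed-size record, which is why the 3AC can be streamed out to a file as it is generated.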
A programming language is low level when its programs require attention to the irrelevant.
Alan Perlis, Epigram #8

cman

Thanks for the replies! I realized this was a bad question right after I reviewed local optimization techniques using 3AC. DAGs would only be useful locally, within the basic blocks of an appropriate flow graph, for eliminating local common subexpressions and dead code, for reordering expressions, and for algebraic simplification of expressions. Thanks again! :U
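
As a toy illustration of just the algebraic simplification part (all of the names below are made up, and leaves are constants only to keep it short), identities like x + 0 can be folded before a DAG node is ever created:

#include <stdio.h>

typedef struct Expr {
    char         op;      /* '+' for a sum node, 0 for a leaf          */
    int          value;   /* leaves are constants here, for simplicity */
    struct Expr *l, *r;
} Expr;

static Expr pool[32];
static int  used = 0;

static Expr *leaf(int v)
{
    Expr *e = &pool[used++];
    e->op = 0;
    e->value = v;
    return e;
}

static Expr *raw_add(Expr *l, Expr *r)
{
    Expr *e = &pool[used++];
    e->op = '+';
    e->l = l;
    e->r = r;
    return e;
}

/* algebraic simplification: fold x + 0 and 0 + x before any node is built */
static Expr *make_add(Expr *l, Expr *r)
{
    if (r->op == 0 && r->value == 0) return l;
    if (l->op == 0 && l->value == 0) return r;
    return raw_add(l, r);
}

int main(void)
{
    Expr *x   = leaf(5);
    Expr *sum = make_add(x, leaf(0));   /* x + 0 simplifies to x itself */

    printf("nodes created: %d, '+' node avoided: %s\n",
           used, sum == x ? "yes" : "no");
    return 0;
}

It should report that only the two leaf nodes were created and that the '+' node was avoided.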

cman

I was wondering: do most compilers combine local and global optimization techniques, or is one favored over the other (global??). Thanks  :U

tenkey

I'm not sure you understand the use of DAGs. They are definitely used for global optimization as well, since that is how flow graphs are represented: each node represents a basic block of 3AC code. Code motion, and the analysis needed to ensure the correctness of code motion, are facilitated by the flow graph.
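
As a very rough sketch of what that representation might look like (the fixed sizes and field names below are just assumptions for illustration, not any real compiler's layout):

/* A flow graph node: one basic block of straight-line 3AC plus directed
   edges to the blocks that control can pass to next. */
struct quad { int op, arg1, arg2, result; };

struct basic_block {
    struct quad         code[64];   /* the quads of this block, in order */
    int                 ncode;
    struct basic_block *succ[2];    /* fall-through and/or branch target */
    int                 nsucc;
};

The analyses needed for code motion then walk these successor edges from block to block.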

"Most compilers" is rather tricky to define. Most of us are not privy to the internal workings of more than a few compilers. Production quality compilers are large programs, and they have a history. Some designs are easier to change than others. You have choices. The choices depend on what kinds of optimizations you "must have", how well you understand the source language, how well you understand the target platform, how much computing resources your target developers have, and how much energy you want to invest into making a quality compiler.

You might want to look at Muchnick's book to see what options you have in configuring the optimization pattern of a compiler. The inside of the cover has a flow chart that shows a number of ways that different kinds of optimizations can be combined and sequenced.
A programming language is low level when its programs require attention to the irrelevant.
Alan Perlis, Epigram #8

cman

Thanks for the information, tenkey! The only optimizations I remember are local ones (I know there are many more that I don't remember, some of which are global :bg): building DAGs from 3AC, plus a number of optimizations that can be inferred from the ordering of the 3AC instructions in the graph. I was wondering whether some global and local optimizations are redundant, for example eliminating "dead code" sequences locally and then running an algorithm to eliminate "dead code" sequences globally.

I was just looking at compiler construction books last week on amazon.com and saw Advanced Compiler Design and Implementation. For those interested look here:

http://www.amazon.com/gp/product/1558603204/ref=pd_cp_b_0/102-9680090-0135320?pf_rd_m=ATVPDKIKX0DER&pf_rd_s=center-41&pf_rd_r=17BYWK5SGQBWMY2TVG76&pf_rd_t=201&pf_rd_p=252362401&pf_rd_i=084931240X


It looks like a great book on optimization and code generation. Anyway, thanks for your time and expertise. :U

cman

Anyway, the local optimizations with DAGs I was speaking about are covered in section 8.5.1 ("The DAG Representation of Basic Blocks") of the second edition of the "dragon book" (it doesn't seem like anyone knows what I'm referring to  :red). For a sequence of instructions like:



a = b + c
b = a - d
c = b + c
d = a - d

the block is then used to build the DAG:



                     +  (c)
                    /     \
              -  (b,d)     \
               /     \      \
          +  (a)      \      \
           /   \       \      |
         b0     c0      d0    |
                 \____________|

(b0, c0 and d0 are the values of b, c and d on entry to the block; the long right-hand edge shows the top + node sharing the c0 leaf.)



Sorry for the bad drawing. Anyway, the graph is then sorted and code is regenerated from it; the new code sequence is:



a = b + c
d = a - d
c = d + c

Just so everyone knows what I'm referring to.  :bg
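
In case anyone wants to play with it, below is a minimal, self-contained C sketch of that DAG construction (local value numbering). All of the structure and function names are my own invention, not from the dragon book, and like the book's example it assumes b is not live on exit from the block, so no copy back into b is emitted.

#include <stdio.h>
#include <string.h>

#define MAXNODES 64

typedef struct {
    char op;            /* '+' or '-' for an interior node, 0 for a leaf   */
    int  left, right;   /* child node indices (interior nodes only)        */
    char leafname;      /* leaf: the variable whose initial value this is  */
    char vars[8];       /* variables currently attached to this node       */
    int  nvars;
} Node;

static Node nodes[MAXNODES];
static int  nnodes = 0;

/* which node currently holds each variable's value; -1 = not seen yet */
static int current[26];

/* get (or create) the leaf node for a variable's value on entry to the block */
static int leaf(char v)
{
    if (current[v - 'a'] < 0) {
        nodes[nnodes].leafname = v;
        current[v - 'a'] = nnodes++;
    }
    return current[v - 'a'];
}

/* find an existing node (op, l, r) -- the common-subexpression check --
   or create a new one */
static int interior(char op, int l, int r)
{
    for (int i = 0; i < nnodes; i++)
        if (nodes[i].op == op && nodes[i].left == l && nodes[i].right == r)
            return i;
    nodes[nnodes].op = op;
    nodes[nnodes].left = l;
    nodes[nnodes].right = r;
    return nnodes++;
}

/* process one quad "dst = src1 op src2" */
static void assign(char dst, char op, char src1, char src2)
{
    int n = interior(op, leaf(src1), leaf(src2));
    nodes[n].vars[nodes[n].nvars++] = dst;   /* attach dst as a label of n */
    current[dst - 'a'] = n;
}

/* name used for a node when re-emitting code: its most recently attached
   variable (assumed to be the live one), or the leaf's own variable */
static char name_of(int i)
{
    return nodes[i].op ? nodes[i].vars[nodes[i].nvars - 1] : nodes[i].leafname;
}

int main(void)
{
    memset(current, -1, sizeof current);

    /* the block from the post */
    assign('a', '+', 'b', 'c');
    assign('b', '-', 'a', 'd');
    assign('c', '+', 'b', 'c');
    assign('d', '-', 'a', 'd');

    /* one instruction per interior node, in creation (topological) order;
       copies to other attached variables are skipped, i.e. we assume (as the
       book does for b) that they are not live on exit from the block */
    for (int i = 0; i < nnodes; i++)
        if (nodes[i].op)
            printf("%c = %c %c %c\n", name_of(i),
                   name_of(nodes[i].left), nodes[i].op, name_of(nodes[i].right));
    return 0;
}

It should print the three-instruction sequence above: the fourth quad maps onto the node already built for b = a - d, which is exactly the local common subexpression being eliminated.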