Difference: CS255Spring08Discussions (4 vs. 5)

Revision 5 (2008-01-31) - XiaomingGu

 

CS255/455 Spring 2008 Questions and Answers

 
Q: More remarks and vague ideas than questions: in class we discussed Chris' idea of performing the a=b-c <=> b=c+1 analysis. I had doubts about the idea because I was thinking of "=" more as an assignment operator than as algebraic equality, which is pretty much what it is in this analysis step. I guess, then, that at this level we can perform an even wider variety of symbolic analysis (something resembling what Maple does for common mathematical notation, but in our case solely for algebra) and generate an even wider range of possible optimizations that one cannot possibly expect at first sight. I was wondering, though, to what extent this actually happens, since the way we program breaks down the simplest algebraic expressions into difficult-to-track simplified steps. I was also wondering whether one can run any optimizations as (syntax) tree data-structure operations, i.e. whether, instead of a hash, there exists an advanced data structure that keeps the tree-like structure of the syntax tree, with optimizations realized as (for example) rotations, insertions, deletions, pruning, etc. After all, after the optimizations we should be able to build a new syntax tree from an optimized version of the code; does there exist a possible repetitive transformation that does not need intermediate hash-based analysis (maybe I'm making a hard problem impossible here)?
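
To make the tree-rewriting idea a bit more concrete, here is a minimal sketch (purely illustrative, not course code; the Node representation and the particular rewrite rules are my own assumptions) of applying a few algebraic identities directly on a toy syntax tree rather than through a hash-based value table:

# A minimal sketch: algebraic simplification as recursive rewrites on a toy
# expression tree. The Node class and the chosen rules are illustrative only.
class Node:
    def __init__(self, op, left=None, right=None, value=None):
        self.op = op          # 'const', 'var', '+', '-', '*'
        self.left = left
        self.right = right
        self.value = value    # constant value or variable name

def simplify(n):
    """Recursively rewrite a subtree using a few algebraic identities."""
    if n.op in ('const', 'var'):
        return n
    l, r = simplify(n.left), simplify(n.right)
    # constant folding
    if l.op == 'const' and r.op == 'const':
        ops = {'+': lambda a, b: a + b, '-': lambda a, b: a - b,
               '*': lambda a, b: a * b}
        return Node('const', value=ops[n.op](l.value, r.value))
    # x + 0 -> x, x - 0 -> x
    if n.op in ('+', '-') and r.op == 'const' and r.value == 0:
        return l
    # x - x -> 0
    if n.op == '-' and l.op == 'var' and r.op == 'var' and l.value == r.value:
        return Node('const', value=0)
    # x * 1 -> x
    if n.op == '*' and r.op == 'const' and r.value == 1:
        return l
    return Node(n.op, l, r)

# Example: (b - b) + (2 * 3) simplifies to the constant 6
expr = Node('+', Node('-', Node('var', value='b'), Node('var', value='b')),
                 Node('*', Node('const', value=2), Node('const', value=3)))
print(simplify(expr).value)   # 6

In this form an "optimization pass" really is just tree surgery (pruning a subtree, replacing it with a folded constant), which is the kind of repeated transformation the question is asking about.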
  A:
 The question was: can that "an" be one/some, or should it be every?
We denote the set of dominators of X as DOM(X) and the immediate dominator of X as IDOM(X). The question we were asked to prove in class was whether the definition of IDOM possibly allows for two different immediate dominators. Let's denote by SDOM(X) the strict dominators of X; then SDOM(X) = DOM(X) - {X}. Assume the definition of IDOM allows for two different immediate dominators; if it doesn't allow for two, it certainly cannot allow for more, since having more would also mean having two. Say IDOM(X) = {Y, Z}. Then there are two options: either Y and Z lie on the same execution path into X, or they do not. By the property of "closeness in an execution path", only one of the dominators of X on that path can be the closest to X; this comparison makes sense because Y != X and Z != X, since an immediate dominator is in SDOM(X). So either Y == Z, or the two immediate dominators reside on different paths. But if Y and Z are on different paths, then they cannot both belong to SDOM(X), because by definition each of Y and Z must appear on every path that enters X, whereas we just claimed they reside on different execution paths, so each path misses one of them. Thus, "There can be only one". I hope this is a decent proof.
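
To see the uniqueness concretely, here is a small sketch (the graph, function names, and representation are my own, not course code) that computes DOM(X) with the usual iterative intersection algorithm and then extracts IDOM(X) as the one strict dominator that every other strict dominator dominates:

# Illustrative sketch only: dominator sets via iterative intersection,
# then IDOM as the strict dominator dominated by all other strict dominators.
def dominators(succ, entry):
    nodes = set(succ)
    pred = {n: [m for m in succ if n in succ[m]] for n in nodes}
    dom = {n: set(nodes) for n in nodes}
    dom[entry] = {entry}
    changed = True
    while changed:
        changed = False
        for n in nodes - {entry}:
            if pred[n]:
                new = {n} | set.intersection(*(dom[p] for p in pred[n]))
            else:
                new = {n}
            if new != dom[n]:
                dom[n] = new
                changed = True
    return dom

def idom(dom, n, entry):
    if n == entry:
        return None
    sdom = dom[n] - {n}                      # strict dominators of n
    # the closest strict dominator: the one every other strict dominator dominates
    return next(d for d in sdom if all(e in dom[d] for e in sdom))

# Diamond-shaped CFG: A -> B, A -> C, B -> D, C -> D
cfg = {'A': ['B', 'C'], 'B': ['D'], 'C': ['D'], 'D': []}
dom = dominators(cfg, 'A')
print(dom['D'])             # {'A', 'D'}
print(idom(dom, 'D', 'A'))  # 'A' -- exactly one immediate dominator

On this diamond CFG, D's only strict dominator is A: B and C each miss one of the two paths into D, which is exactly the "different paths" case in the proof above, so neither can be an immediate dominator of D.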
 
 