Changeset e394618
Timestamp: Aug 10, 2016, 5:34:20 PM (6 years ago)
Branches: aaronthesis, armeh, cleanupdtors, ctor, deferred_resn, demangler, enum, forallpointerdecay, jacob/cs343translation, jenkinssandbox, master, memory, newast, newastuniqueexpr, newenv, no_list, persistentindexer, pthreademulation, qualifiedEnum, resolvnew, with_gc
Children: 72e2ea0
Parents: be0a9d8
File: 1 edited

doc/aaron_comp_II/comp_II.tex
be0a9d8 → e394618:

  If cross-argument resolution dependencies cannot be completely eliminated, effective caching strategies to reduce duplicated work between equivalent argument-parameter matches in different combinations may mitigate the asymptotic deficits of the whole-combination matching approach.
  The final area of investigation is heuristics and algorithmic approaches to reduce the number of argument interpretations considered in the common case; if argument-parameter matches cannot be made independent, even small reductions in $i$ should yield significant reductions in the $i^{p+1}$ resolver runtime factor.

+ The discussion below presents a number of largely orthogonal axes for expression resolution algorithm design to be investigated, noting prior work where applicable.
+ Though some of the proposed improvements to the expression resolution algorithm are based on heuristics rather than asymptotically superior algorithms, it should be noted that user programmers often employ idioms and other programming patterns to reduce the mental burden of producing correct code; if these patterns can be identified and exploited by the compiler, then the significant reduction in expression resolution time for common, idiomatic expressions should result in lower total compilation time, even for code including difficult-to-resolve expressions that push the expression resolver to its theoretical worst case.

  \subsection{Argument-Parameter Matching}
…
  \subsubsection{Hybrid}
  This proposal includes the investigation of hybrid top-down/bottom-up argument-parameter matching.
- A reasonable hybrid approach might be to take a top-down approach when the expression to be matched is known to have a fixed type, and a bottom-up approach in untyped contexts.
- This may include switches from one type to another at different levels of the expression tree, for instance:
+ A reasonable hybrid approach might take a top-down approach when the expression to be matched has a fixed type, and a bottom-up approach in untyped contexts.
+ This approach may involve switching from one type to another at different levels of the expression tree.
+ For instance:
  \begin{lstlisting}
  forall(otype T)
…
  int x = f( f( '!' ) );
  \end{lstlisting}
- Here, the outer call to ©f© must have a return type that is (implicitly convertible to) ©int©, so a top-down approach could be used to select \textit{(1)} as the proper interpretation of ©f©. \textit{(1)}'s parameter ©x© here, however, is an unbound type variable, and can thus take a value of any complete type, providing no guidance for the choice of candidate for the inner ©f©. The leaf expression ©'!'©, however, gives us a zero-cost interpretation of the inner ©f© as \textit{(2)}, providing a minimal-cost expression resolution where ©T© is bound to ©void*©.
+ The outer call to ©f© must have a return type that is (implicitly convertible to) ©int©, so a top-down approach is used to select \textit{(1)} as the proper interpretation of ©f©. \textit{(1)}'s parameter ©x©, however, is an unbound type variable, and can thus take a value of any complete type, providing no guidance for the choice of candidate for the inner call to ©f©. The leaf expression ©'!'©, however, determines a zero-cost interpretation of the inner ©f© as \textit{(2)}, providing a minimal-cost expression resolution where ©T© is bound to ©void*©.

- Deciding when to switch between bottom-up and top-down resolution in a hybrid algorithm is a necessarily heuristic process, and though finding good heuristics for it is an open question, one reasonable approach might be to switch from top-down to bottom-up when the number of candidate functions exceeds some threshold.
+ Deciding when to switch between bottom-up and top-down resolution to minimize wasted work in a hybrid algorithm is a necessarily heuristic process, and though finding good heuristics for which subexpressions to switch matching strategies on is an open question, one reasonable approach might be to set a threshold $t$ for the number of candidate functions: use top-down resolution for any subexpression with fewer than $t$ candidate functions, to minimize the number of unmatchable argument interpretations computed, and bottom-up resolution for any subexpression with at least $t$ candidate functions, to reduce duplication of argument-interpretation computation between the different candidate functions.

  \subsubsection{Common Subexpression Caching}
…
  \subsection{Implicit Conversion Application}
- Baker's \cite{Baker82} and Cormack's \cite{Cormack81} algorithms do not account for implicit conversions\footnote{Baker does briefly comment on an approach for handling implicit conversions.}; both assume that there is at most one valid interpretation of a given expression for each distinct type.
+ Baker's and Cormack's algorithms do not account for implicit conversions\footnote{Baker does briefly comment on an approach for handling implicit conversions.}; both assume that there is at most one valid interpretation of a given expression for each distinct type.
  Integrating implicit conversion handling into their algorithms provides some choice of implementation approach.
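The threshold-based hybrid strategy described in the changed text can be sketched concretely. The following toy Python model is my own illustration, not the C∀ resolver: types are plain strings, each overload takes a single parameter, and "*" stands in for an unbound type variable that binds to any argument type at zero cost. The overload set for f mirrors the example's candidates (1) and (2); the names (`interpretations`, `resolve`, `CONV_COST`) and the conversion-cost table are invented for this sketch.

```python
# Toy sketch of hybrid top-down/bottom-up overload resolution with a
# candidate-count threshold t. Not the CFA implementation; all names and
# costs here are hypothetical.
from dataclasses import dataclass

# Implicit conversion costs between toy types; missing pairs do not convert.
CONV_COST = {("char", "char"): 0, ("int", "int"): 0,
             ("void*", "void*"): 0, ("char", "int"): 1}

def conv_cost(from_t, to_t):
    if to_t == "*":              # unbound type variable: binds to anything
        return 0
    return CONV_COST.get((from_t, to_t))

@dataclass
class Lit:                       # leaf expression with a known type
    type: str

@dataclass
class Call:                      # single-argument call, to keep the sketch small
    name: str
    arg: object

@dataclass
class Func:                      # one overload: parameter type -> return type
    param: str
    ret: str

OVERLOADS = {"f": [Func("*", "int"),        # (1) forall(otype T) int f(T)
                   Func("char", "void*")]}  # (2) void* f(char)

def interpretations(expr, expected, t):
    """All (type, cost) interpretations of expr; `expected` may be None."""
    if isinstance(expr, Lit):
        return [(expr.type, 0)]
    cands = OVERLOADS[expr.name]
    if expected is not None and len(cands) < t:
        # Top-down: prune candidates whose return type cannot reach `expected`.
        cands = [f for f in cands if conv_cost(f.ret, expected) is not None]
    # Bottom-up step: interpret the argument under each surviving candidate,
    # pushing the parameter type down so nested calls may also prune top-down.
    results = []
    for f in cands:
        for arg_type, arg_cost in interpretations(expr.arg, f.param, t):
            c = conv_cost(arg_type, f.param)
            if c is not None:
                results.append((f.ret, arg_cost + c))
    return results

def resolve(expr, expected, t=4):
    """Cheapest interpretation convertible to `expected`, or None."""
    best = None
    for ty, c in interpretations(expr, expected, t):
        conv = conv_cost(ty, expected)
        if conv is not None and (best is None or c + conv < best[1]):
            best = (ty, c + conv)
    return best

# int x = f( f( '!' ) );  -- '!' is a char literal
expr = Call("f", Call("f", Lit("char")))
print(resolve(expr, "int"))      # -> ('int', 0): the minimal-cost resolution
```

With t=4, both candidate sets (size 2) fall below the threshold, so matching proceeds top-down and candidate (2) is pruned immediately for the outer call, since void* does not convert to int; with t=1, resolution is purely bottom-up and reaches the same minimal-cost interpretation after considering more combinations.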