Changeset 7bb37fc for doc


Timestamp:
Aug 11, 2016, 4:04:27 PM (8 years ago)
Author:
Aaron Moss <a3moss@…>
Branches:
ADT, aaron-thesis, arm-eh, ast-experimental, cleanup-dtors, ctor, deferred_resn, demangler, enum, forall-pointer-decay, jacob/cs343-translation, jenkins-sandbox, master, memory, new-ast, new-ast-unique-expr, new-env, no_list, persistent-indexer, pthread-emulation, qualifiedEnum, resolv-new, with_gc
Children:
ce2fed5
Parents:
72e2ea0
Message:

Finish first editing pass over Comp II draft

Location:
doc
Files:
2 edited

  • doc/aaron_comp_II/comp_II.tex

    r72e2ea0 r7bb37fc  
    212212While a nominal-inheritance system with associated types could model one of those two relationships by making ©El© an associated type of ©Ptr© in the ©pointer_like© implementation, few such systems could model both relationships simultaneously.
    213213
    214 The flexibility of \CFA's implicit trait satisfaction mechanism provides user programmers with a great deal of power, but also blocks some optimization approaches for expression resolution.
     214The flexibility of \CFA's implicit trait satisfaction mechanism provides programmers with a great deal of power, but also blocks some optimization approaches for expression resolution.
    215215The ability of types to begin to or cease to satisfy traits when declarations go into or out of scope makes caching of trait satisfaction judgements difficult, and the ability of traits to take multiple type parameters could lead to a combinatorial explosion of work in any attempt to pre-compute trait satisfaction relationships.
    216216On the other hand, adding a nominal inheritance mechanism to \CFA's type system, or replacing \CFA's trait satisfaction system with a more object-oriented inheritance model, and then investigating possible expression resolution optimizations for such a system, may be an interesting avenue of further research.
     
    250250\subsubsection{User-generated Implicit Conversions}
    251251One possible additional feature for \CFA included in this research proposal is \emph{user-generated implicit conversions}.
    252 Such a conversion system should be simple for user programmers to utilize, and fit naturally with the existing design of implicit conversions in C; ideally it would also be sufficiently powerful to encode C's usual arithmetic conversions itself, so that \CFA only has one set of rules for conversions.
     252Such a conversion system should be simple for programmers to use, and fit naturally with the existing design of implicit conversions in C; ideally it would also be sufficiently powerful to encode C's usual arithmetic conversions itself, so that \CFA has only one set of rules for conversions.
    253253
    254254Ditchfield~\cite{Ditchfield:conversions} has laid out a framework for using polymorphic-conversion-constructor functions to create a directed acyclic graph (DAG) of conversions.
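For comparison only (this is standard C++, not proposed \CFA syntax, which remains an open design question in this proposal), C++ already offers user-defined implicit conversions through converting constructors and conversion operators; each such declaration is loosely analogous to one edge in the conversion DAG described above. The type and function names below (Fixed, halve) are purely hypothetical.

// Minimal C++ sketch (not CFA syntax): user-defined conversions as DAG edges.
// Each converting constructor or conversion operator is one edge in the
// conversion graph; the type and function names are illustrative only.
#include <iostream>

struct Fixed {                                       // toy fixed-point number
    long long scaled;                                // value * 100
    Fixed( int i ) : scaled( i * 100LL ) {}          // edge: int -> Fixed
    operator double() const { return scaled / 100.0; }  // edge: Fixed -> double
};

double halve( double d ) { return d / 2.0; }

int main() {
    Fixed f = 21;                    // applies int -> Fixed implicitly
    std::cout << halve( f ) << '\n'; // applies Fixed -> double implicitly, prints 10.5
}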
     
    396396
    397397Expression resolution is essentially unavoidably exponential in $d$, the depth of the expression tree, and if arguments cannot be matched to parameters independently of each other, it is also exponential in $p$.
    398 However, both $d$ and $p$ are fixed by the user programmer, and generally bounded by reasonably small constants.
     398However, both $d$ and $p$ are fixed by the programmer, and generally bounded by reasonably small constants.
    399399$k$, on the other hand, is mostly dependent on the representation of types in the system and the efficiency of type assertion checking; if a candidate argument combination can be compared to a function parameter list in time linear in the length of the list (\ie $k = 1$), then the $p^{k \cdot d}$ factor is linear in the input size of the source code for the expression, otherwise the resolution algorithm exhibits super-linear runtime growth on code containing more-deeply nested expressions.
    400400The number of valid interpretations of any subexpression, $i$, is bounded by the number of types in the system, which is possibly infinite, though practical resolution algorithms for \CFA must be able to place some finite bound on $i$, possibly at the expense of type-system completeness.
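To make the $k = 1$ claim above concrete, here is a back-of-the-envelope sketch of the arithmetic (my reading of the claim, not a bound taken from any of the cited algorithms): an expression tree of depth $d$ whose calls each take roughly $p$ arguments has $n \approx \sum_{j=0}^{d} p^{j} = \Theta(p^{d})$ nodes, and therefore
\begin{displaymath}
p^{k \cdot d} \;=\; \left(p^{d}\right)^{k} \;=\; \Theta\!\left(n^{k}\right),
\end{displaymath}
which is linear in the expression size $n$ exactly when $k = 1$, and polynomial of degree $k$ otherwise.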
     
    409409
    410410The discussion below presents a number of largely orthogonal axes for expression resolution algorithm design to be investigated, noting prior work where applicable.
    411 Though some of the proposed improvements to the expression resolution algorithm are based on heuristics rather than asymptoticly superior algorithms, it should be noted that user programmers often employ idioms and other programming patterns to reduce the mental burden of producing correct code, and if these patterns can be identified and exploited by the compiler then the significant reduction in expression resolution time for common, idiomatic expressions should result in lower total compilation time even for code including difficult-to-resolve expressions that push the expression resolver to its theoretical worst case.
     411Though some of the proposed improvements to the expression resolution algorithm are based on heuristics rather than asymptotically superior algorithms, it should be noted that programmers often employ idioms and other programming patterns to reduce the mental burden of producing correct code. If these patterns can be identified and exploited by the compiler, then the significant reduction in expression resolution time for common, idiomatic expressions should result in lower total compilation time, even for code including difficult-to-resolve expressions that push the expression resolver to its theoretical worst case.
    412412
    413413\subsection{Argument-Parameter Matching}
     
    467467
    468468\subsection{Implicit Conversion Application}
    469 Baker's and Cormack's algorithms do not account for implicit conversions\footnote{Baker does briefly comment on an approach for handling implicit conversions.}; both assume that there is at most one valid interpretation of a given expression for each distinct type.
     469Baker's and Cormack's algorithms do not account for implicit conversions\footnote{Baker does briefly comment on an approach for handling implicit conversions, but does not provide an implementable algorithm.}; both assume that there is at most one valid interpretation of a given expression for each distinct type.
    470470Integrating implicit conversion handling into their algorithms provides some choice of implementation approach.
    471471
    472472\subsubsection{On Parameters}
    473 Bilson\cite{Bilson03} did account for implicit conversions in his algorithm, but it is not clear his approach is optimal.
    474 His algorithm integrates checking for valid implicit conversions into the argument-parameter matching step, essentially trading more expensive matching for a smaller number of argument interpretations.
    475 This approach may result in the same subexpression being checked for a type match with the same type multiple times, though again memoization may mitigate this cost, and this approach will not generate implicit conversions that are not useful to match the containing function.
     473Bilson does account for implicit conversions in his algorithm, but it is unclear whether his approach is optimal.
     474His algorithm integrates checking for valid implicit conversions into the argument-parameter-matching step, essentially trading more expensive matching for a smaller number of argument interpretations.
     475This approach may result in the same subexpression being checked for a match against the same type multiple times, though again memoization may mitigate this cost; on the other hand, this approach does not generate implicit conversions that are not useful for matching the containing function.
     476Calculating implicit conversions on parameters pairs naturally with a top-down approach to expression resolution, though it can also be used in a bottom-up approach, as Bilson demonstrates.
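A minimal sketch of the ``on parameters'' strategy follows, in C++ with a toy type representation and cost model (hypothetical; these are not Bilson's data structures or the \CFA resolver's): conversions are computed only while matching argument interpretations against a specific candidate's parameter list.

// Sketch of "on parameters" conversion handling (toy type/cost model):
// implicit conversions are only attempted while matching argument
// interpretations against a particular candidate's parameter list.
#include <iostream>
#include <optional>
#include <vector>

struct Type { int id; };
struct Interp { Type type; int cost; };           // one interpretation of an argument

// Toy conversion-cost rule (purely illustrative): identical types cost 0,
// converting to the next-larger type id costs 1, nothing else converts.
std::optional<int> conversionCost( Type from, Type to ) {
    if ( from.id == to.id ) return 0;
    if ( to.id == from.id + 1 ) return 1;
    return std::nullopt;
}

// Match one candidate's parameter list against chosen argument interpretations,
// generating conversions only where a parameter demands them.
std::optional<int> matchCall( const std::vector<Type> & params,
                              const std::vector<Interp> & args ) {
    if ( params.size() != args.size() ) return std::nullopt;
    int total = 0;
    for ( std::size_t i = 0; i < params.size(); ++i ) {
        auto c = conversionCost( args[i].type, params[i] );
        if ( ! c ) return std::nullopt;           // this candidate cannot match
        total += args[i].cost + *c;
    }
    return total;                                 // total cost of this candidate call
}

int main() {
    std::vector<Type> params = { {1}, {2} };
    std::vector<Interp> args = { { {1}, 0 }, { {1}, 0 } };   // second argument needs a conversion
    if ( auto cost = matchCall( params, args ) ) std::cout << "match, cost " << *cost << '\n'; // prints 1
}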
    476477
    477478\subsubsection{On Arguments}
    478 Another approach would be to generate a set of possible implicit conversions for each set of interpretations of a given argument.
    479 This would have the benefit of detecting ambiguous interpretations of arguments at the level of the argument rather than its containing call, would also never find more than one interpretation of the argument with a given type, and would re-use calculation of implicit conversions between function candidates.
    480 On the other hand, this approach may unncessarily generate argument interpretations that will never match a parameter, wasting work.
    481 Further, in the presence of tuple types this approach may lead to a combinatorial explosion of argument interpretations considered, unless the tuple can be considered as a sequence of elements rather than a unified whole.
     479Another approach is to generate a set of possible implicit conversions for each set of interpretations of a given argument.
     480This approach detects ambiguous argument interpretations at the level of the argument rather than its containing call, never finds more than one interpretation of an argument with a given type, and re-uses the calculation of implicit conversions across function candidates.
     481On the other hand, this approach may unnecessarily generate argument interpretations that never match any parameter, wasting work.
     482Further, in the presence of tuple types this approach may lead to a combinatorial explosion of argument interpretations considered, unless the tuple can be considered as a sequence of elements rather than a unified whole.
     483Calculating implicit conversions on arguments is a viable approach for bottom-up expression resolution, though it may be difficult to apply in a top-down approach due to the presence of a target type for the expression interpretation.
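A corresponding sketch of the ``on arguments'' strategy, again with toy types and costs (hypothetical names; single-step conversion closure only): each argument's interpretation set is expanded with its reachable implicit conversions up front, keeping at most one (cheapest) interpretation per result type, before any candidate function is considered.

// Sketch of "on arguments" conversion handling (toy types and cost model):
// an argument's interpretation set is expanded with implicit conversions
// before parameter matching, keeping the cheapest interpretation per type.
#include <iostream>
#include <map>
#include <vector>

struct Interp { int typeId; int cost; };
struct Conversion { int from, to, cost; };        // one implicit-conversion edge

std::vector<Interp> expandWithConversions(
        const std::vector<Interp> & interps,
        const std::vector<Conversion> & conversions ) {
    std::map<int, int> bestCost;                  // result type -> cheapest cost
    for ( const Interp & i : interps )
        if ( ! bestCost.count( i.typeId ) || i.cost < bestCost[i.typeId] )
            bestCost[i.typeId] = i.cost;
    // single-step expansion for brevity; a full version closes transitively
    for ( const Interp & i : interps )
        for ( const Conversion & c : conversions )
            if ( c.from == i.typeId ) {
                int cost = i.cost + c.cost;
                if ( ! bestCost.count( c.to ) || cost < bestCost[c.to] ) bestCost[c.to] = cost;
            }
    std::vector<Interp> out;
    for ( const auto & [ type, cost ] : bestCost ) out.push_back( { type, cost } );
    return out;                                   // at most one interpretation per type
}

int main() {
    std::vector<Interp> arg = { { 1, 0 } };                   // one interpretation, type 1
    std::vector<Conversion> conv = { { 1, 2, 1 }, { 1, 3, 2 } };
    for ( const Interp & i : expandWithConversions( arg, conv ) )
        std::cout << "type " << i.typeId << " cost " << i.cost << '\n';
}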
    482484
    483485\subsection{Candidate Set Generation}
    484 Cormack\cite{Cormack81}, Baker\cite{Baker82} and Bilson\cite{Bilson03} all generate the complete set of candidate argument interpretations before attempting to match the containing function call expression.
    485 However, given that the top-level expression interpretation that is ultimately chosen will be the minimal-cost valid interpretation, any consideration of non-minimal-cost interpretations is in some sense wasted work.
    486 If we assume that user programmers will generally write function calls with relatively low-cost interpretations, a possible work-saving heuristic is to generate only the lowest-cost argument interpretations first, attempt to find a valid top-level interpretation using them, and only if that fails generate the higher-cost argument interpretations.
     486Cormack, Baker and Bilson all generate the complete set of candidate argument interpretations before attempting to match the containing function call expression.
     487However, given that the top-level expression interpretation that is ultimately chosen is the minimal-cost valid interpretation, any consideration of non-minimal-cost interpretations is in some sense wasted work.
     488Under the assumption that programmers generally write function calls with relatively low-cost interpretations, a possible work-saving heuristic is to generate only the lowest-cost argument interpretations first, attempt to find a valid top-level interpretation using them, and only if that fails generate the next-higher-cost argument interpretations.
    487489
    488490\subsubsection{Eager}
    489491Within the eager approach taken by Cormack, Baker and Bilson, there are still variants to explore.
    490492Cormack and Baker do not account for implicit conversions, and thus do not account for the possibility of multiple valid interpretations with distinct costs; Bilson, on the other hand, sorts the list of interpretations to aid in finding minimal-cost interpretations.
    491 Sorting the lists of argument or function call interpretations by cost at some point during resolution may provide useful opportunities to short-circuit expression evaluation when a minimal-cost interpretation is found, though it is not clear if this short-circuiting behaviour would justify the cost of the sort.
     493Sorting the lists of argument or function call interpretations by cost at some point during resolution may provide useful opportunities to short-circuit expression resolution when a minimal-cost interpretation is found, though it is unclear whether this short-circuiting behaviour justifies the cost of the sort.
    492494
    493495\subsubsection{Lazy}
    494496In the presence of implicit conversions, many argument interpretations may match a given parameter by application of an appropriate implicit conversion.
    495 However, if user programmers actually use relatively few implicit conversions, then the ``on arguments'' approach to implicit conversions will generate a large number of high-cost interpretations which may never be used.
    496 The essence of the lazy approach to candidate set generation is to wrap the matching algorithm into the element generator of a lazy list type, only generating as few elements at a time as possible to ensure that the next-smallest-cost interpretation has been generated.
    497 Assuming that argument interpretations are provided to the parameter matching algorithm in sorted order, a sorted list of function call interpretations can be produced by generating combinations of arguments sorted by total cost\footnote{I have already developed a lazy $n$-way combination generation algorithm to perform this task.}, then generating function call interpretations in the order suggested by this list.
    498 Note that the function call interpretation chosen may have costs of its own, for instance polymorphic type binding, so in some cases a number of argument combinations (any combination whose marginal cost does not exceed the cost of the function call interpretation itself) may need to be considered to determine the next-smallest-cost function call interpretation.
    499 Ideally, this candidate generation approach will lead to very few unused candidates being generated (in the expected case where the user programmer has, in fact, provided a validly-typable program), but this research project will need to determine whether or not the overheads of lazy generation exceed the benefit produced from considering fewer interpretations.
     497However, if programmers actually use relatively few implicit conversions, then the ``on arguments'' approach to implicit conversions generates a large number of high-cost interpretations that may never be used.
     498Even if the ``on parameters'' approach to implicit conversions is used, eager generation of interpretations spends extra time attempting possibly expensive polymorphic or conversion-based matches in cases where an exact monomorphic interpretation exists.
     499
     500The essence of the lazy approach to candidate set generation is to wrap the matching algorithm into the element generator of a lazy list, generating only as many elements at a time as necessary to ensure that the next-smallest-cost interpretation has been generated.
     501Assuming argument interpretations are provided to the parameter matching algorithm in sorted order, a sorted list of function call interpretations can be produced by generating combinations of arguments sorted by total cost\footnote{I have already developed a lazy $n$-way combination generation algorithm to perform this task.}, then generating function call interpretations in the order suggested by this list.
     502The function call interpretation chosen may have costs of its own, for instance polymorphic type binding, so in some cases a number of argument combinations (any combination whose marginal cost does not exceed the cost of the function call interpretation itself) may need to be considered to determine the next-smallest-cost function call interpretation.
     503Ideally, this candidate generation approach leads to very few unused candidates being generated (in the expected case where the programmer has, in fact, provided a validly-typable program), but it is an open question whether the overheads of lazy generation exceed the benefit gained from considering fewer interpretations.
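The following C++ sketch shows one standard way to realize this kind of lazy, cost-ordered combination generation, using best-first search over index tuples with a priority queue; it illustrates the general technique, not the $n$-way combination algorithm mentioned in the footnote, and it assumes each argument's interpretation list is already sorted by cost.

// Lazy enumeration of argument combinations in non-decreasing total cost,
// via best-first search over index tuples (a standard technique; hypothetical
// data structures). Assumes each argument's interpretations are sorted by cost.
#include <cstddef>
#include <iostream>
#include <queue>
#include <set>
#include <vector>

struct Interp { int typeId; int cost; };
using Combo = std::vector<std::size_t>;           // one interpretation index per argument

class LazyCombinations {
    std::vector<std::vector<Interp>> args;        // sorted interpretation lists
    struct Entry { int cost; Combo idx; };
    struct ByCost { bool operator()( const Entry & a, const Entry & b ) const {
        return a.cost > b.cost;                   // min-heap on total cost
    } };
    std::priority_queue<Entry, std::vector<Entry>, ByCost> frontier;
    std::set<Combo> seen;                         // avoid enqueuing duplicates

    int total( const Combo & c ) const {
        int t = 0;
        for ( std::size_t i = 0; i < c.size(); ++i ) t += args[i][c[i]].cost;
        return t;
    }
    void push( Combo c ) {
        if ( seen.insert( c ).second ) frontier.push( { total( c ), std::move( c ) } );
    }
public:
    explicit LazyCombinations( std::vector<std::vector<Interp>> a ) : args( std::move( a ) ) {
        for ( const auto & v : args ) if ( v.empty() ) return;  // no combinations possible
        push( Combo( args.size(), 0 ) );          // all-cheapest combination first
    }
    // Produce the next-cheapest combination on demand; false when exhausted.
    bool next( Combo & out ) {
        if ( frontier.empty() ) return false;
        Entry e = frontier.top(); frontier.pop();
        out = e.idx;
        // successors: bump one argument to its next-more-expensive interpretation
        for ( std::size_t i = 0; i < out.size(); ++i ) {
            if ( out[i] + 1 < args[i].size() ) {
                Combo succ = out; ++succ[i]; push( std::move( succ ) );
            }
        }
        return true;
    }
};

int main() {
    // two arguments: interpretation costs {0,2} and {0,1}
    LazyCombinations gen( { { {0,0}, {0,2} }, { {0,0}, {0,1} } } );
    Combo c;
    while ( gen.next( c ) )
        std::cout << c[0] << ',' << c[1] << '\n'; // printed in non-decreasing total cost
}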
    500504
    501505\subsubsection{Stepwise Lazy}
    502 As a compromise between the trade-offs of the eager and lazy approaches, it would also be interesting to investigate a ``stepwise lazy'' approach, where all the interpretations for some ``step'' are eagerly generated, then the interpretations in the later steps are only generated on demand.
     506As a compromise between the eager and lazy approaches, I also propose to investigate a ``stepwise lazy'' approach, where all the interpretations for some ``step'' are generated eagerly, while the interpretations in later steps are generated only on demand.
    503507Under this approach the \CFA resolver could, for instance, try expression interpretations in the following order:
    504508\begin{enumerate}
     
    508512\item Interpretations containing at least one unsafe implicit conversion.
    509513\end{enumerate}
    510 If a valid expression interpretation is found in one step, it is guaranteed to be lower-cost than any interpretation in a later step (by the structure of \CFA interpretation costs), so no step after the first one where a valid interpretation can be found need be considered.
    511 This may save significant amounts of work, especially given that the first steps avoid potentially expensive handling of implicit conversions and type assertion satisfaction entirely.
     514If a valid expression interpretation is found in one step, it is guaranteed to be lower-cost than any interpretation in a later step (by the structure of \CFA interpretation costs), so no further steps need be considered.
     515This approach may save significant amounts of work, especially given that the first steps avoid potentially expensive handling of implicit conversions and type assertion satisfaction entirely, and cover a large proportion of common monomorphic code.
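A sketch of the driver loop such a stepwise-lazy resolver might use follows, in C++ with hypothetical stage names standing in for the steps enumerated above (the real steps and cost structure are those of the \CFA resolver, not this toy).

// Sketch of a stepwise-lazy driver (hypothetical stage names): each stage is
// attempted only if every cheaper stage failed to produce an interpretation.
#include <functional>
#include <iostream>
#include <optional>
#include <string>

struct Interpretation { int cost; };
enum class Stage { ExactMonomorphic, PolymorphicNoConversion, SafeConversions, UnsafeConversions };

// The per-stage resolver is passed in as a callback so this sketch stays
// self-contained; a real resolver would walk the expression tree here.
using StageResolver = std::function<std::optional<Interpretation>( Stage )>;

std::optional<Interpretation> resolveStepwise( const StageResolver & resolveWithin ) {
    const Stage stages[] = { Stage::ExactMonomorphic, Stage::PolymorphicNoConversion,
                             Stage::SafeConversions, Stage::UnsafeConversions };
    for ( Stage s : stages ) {
        // a success here is guaranteed cheaper than anything a later stage
        // could produce, so later stages are never generated
        if ( auto result = resolveWithin( s ) ) return result;
    }
    return std::nullopt;                          // no valid interpretation
}

int main() {
    // toy stand-in: only the safe-conversion stage finds an interpretation
    auto toy = []( Stage s ) -> std::optional<Interpretation> {
        if ( s == Stage::SafeConversions ) return Interpretation{ 1 };
        return std::nullopt;
    };
    auto r = resolveStepwise( toy );
    std::cout << ( r ? "resolved, cost " + std::to_string( r->cost )
                     : std::string( "no interpretation" ) ) << '\n';
}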
    512516
    513517%\subsection{Parameter-Directed}
     
    542546
    543547\section{Proposal}
    544 Baker\cite{Baker82} discussed various expression resolution algorithms that could handle name overloading, but left experimental comparison of those algorithms to future work; Bilson\cite{Bilson03} described one extension of Baker's algorithm to handle implicit conversions, but did not fully explore the space of algorithmic approaches to handle both overloaded names and implicit conversions.
    545 This project is intended to experimentally test a number of expression resolution algorithms which are powerful enough to handle the \CFA type-system, including both name overloading and implicit conversions.
    546 This comparison will close Baker's open research question, as well as potentially improving on Bilson's \CFA compiler.
    547 
    548 Rather than testing all of these algorithms in-place in the \CFA compiler, a resolver prototype will be developed which acts on a simplified input language encapsulating the essential details of the \CFA type-system\footnote{Note that this simplified input language is not required to be a usable programming language.}.
     548Baker~\cite{Baker82} discussed various expression resolution algorithms that can handle name overloading, but left experimental comparison of those algorithms to future work; Bilson~\cite{Bilson03} described one extension of Baker's algorithm to handle implicit conversions, but did not fully explore the space of algorithmic approaches to handle both overloaded names and implicit conversions.
     549This project is intended to experimentally test a number of expression resolution algorithms that are powerful enough to handle the \CFA type-system, including both name overloading and implicit conversions.
     550This comparison closes Baker's open research question, and may also lead to improvements on Bilson's \CFA compiler.
     551
     552Rather than testing all of these algorithms in place in the \CFA compiler, I am developing a resolver prototype that acts on a simplified input language encapsulating the essential details of the \CFA type-system\footnote{This simplified input language is not a usable programming language.}.
    549553Multiple variants of this resolver prototype will be implemented, each encapsulating a different expression resolution algorithm, sharing as much code as feasible.
    550554These variants will be instrumented to test runtime performance, and run on a variety of input files; the input files may be generated programmatically or from existing code in \CFA or similar languages.
    551 These experimental results will allow the research team to determine the algorithm likely to be most performant in practical use, and replace CFA's existing expression resolver with that code.
    552 The experimental results will also provide some empirical sense of the compile-time cost of various language features by comparing the results of the most performant resolver variant that supports the feature with the most performant resolver variant that doesn't, a useful capability to guide language design.
    553 
    554 This proposed project should provide valuable data on how to implement a performant compiler for modern programming languages such as \CFA with powerful static type-systems, specifically targeting the feature interaction between name overloading and implicit conversions.
     555These experimental results should make it possible to determine the algorithm likely to be most performant in practical use, so that it can replace \CFA's existing expression resolver.
     556
     557The experimental results will also provide some empirical sense of the compile-time cost of various language features, by comparing the most performant resolver variant that supports a feature with the most performant variant that does not, a useful capability to guide language design.
     558As an example, there are currently multiple open proposals for how implicit conversions should interact with polymorphic type binding in \CFA, each with a distinct level of expressive power; if the resolver prototype is modified to support each proposal, the optimal algorithms for the different proposals can be compared, providing an empirical demonstration of the trade-off between expressive power and compiler runtime.
     559
     560This proposed project should provide valuable data on how to implement a performant compiler for modern programming languages such as \CFA with powerful static type-systems, specifically targeting the feature interaction between name overloading and implicit conversions.
     561This work is not limited in applicability to \CFA, but may also be useful for supporting efficient compilation of the upcoming ``Concepts'' standard~\cite{C++Concepts} for \CC template constraints, for instance.
    555562
    556563\appendix
  • doc/bibliography/cfa.bib

    r72e2ea0 r7bb37fc  
    832832    address     = {http://www.iso.org},
    833833    year        = 2012,
     834}
     835
     836@manual{C++Concepts,
     837        keywords = {ISO/IEC TS 19217:2015},
     838        contributer = {a3moss@uwaterloo.ca},
     839        key = {C++ Concepts},
     840        title = {Information technology -- Programming languages -- {C}{\kern-.1em\hbox{\large\texttt{+\kern-.25em+}}} Extensions for concepts},
     841        organization = {International Standard ISO/IEC TS 19217:2015},
     842        publisher = {International Standard Organization},
     843        address = {http://www.iso.org},
     844        year = 2015
    834845}
    835846