Changeset 53f9448


Timestamp:
Apr 2, 2017, 2:54:55 PM (8 years ago)
Author:
Peter A. Buhr <pabuhr@…>
Branches:
ADT, aaron-thesis, arm-eh, ast-experimental, cleanup-dtors, deferred_resn, demangler, enum, forall-pointer-decay, jacob/cs343-translation, jenkins-sandbox, master, new-ast, new-ast-unique-expr, new-env, no_list, persistent-indexer, pthread-emulation, qualifiedEnum, resolv-new, with_gc
Children:
ea73caf
Parents:
0788c03d (diff), ee89ad43 (diff)
Note: this is a merge changeset, the changes displayed below correspond to the merge itself.
Use the (diff) links above to see all the changes relative to each parent.
Message:

Merge branch 'master' of plg2:software/cfa/cfa-cc

Location:
doc
Files:
2 edited

Legend:

  (no marker)  Unmodified
  +            Added
  -            Removed

Each diff line below shows the line number in r0788c03d, the line number in r53f9448, a change marker, and the line content.
  • doc/bibliography/cfa.bib

    r0788c03d r53f9448  
    2719 2719            implementations are the same.
    2720 2720        }
         2721  + }
         2722  +
         2723  + @online{GCCExtensions,
         2724  +     contributer = {a3moss@uwaterloo.ca},
         2725  +     key = {{GNU}},
         2726  +     title = {Extensions to the {C} Language Family},
         2727  +     year = 2014,
         2728  +     url = {https://gcc.gnu.org/onlinedocs/gcc-4.7.2/gcc/C-Extensions.html},
         2729  +     urldate = {2017-04-02}
    2721 2730    }
    2722 2731
  • doc/generic_types/generic_types.tex

    r0788c03d r53f9448  
    117 117
    118 118    \begin{abstract}
    119      - The C programming language is a foundational technology for modern computing with millions of lines of code implementing everything from commercial operating-systems to hobby projects. This installation base and the programmers producing it represent a massive software-engineering investment spanning decades and likely to continue for decades more. Nonetheless, C, first standardized over thirty years ago, lacks many features that make programming in more modern languages safer and more productive. The goal of the \CFA project is to create an extension of C that provides modern safety and productivity features while still ensuring strong backwards compatibility with C and its programmers. Prior projects have attempted similar goals~\cite{C++,Grossman06} but failed to honour C programming-style, e.g., adding object-oriented, functional programming with garbage collection is a non-starter for many C developers. Specifically, \CFA is designed to have an orthogonal feature-set based closely on the C programming paradigm, so that \CFA features can be added \emph{incrementally} to existing C code-bases, and C programmers can learn \CFA extensions on an as-needed basis, preserving investment in existing code and engineers. This paper describes only two \CFA extensions, generic and tuple types, and how they are implemented in accordance with these principles.
        119  + The C programming language is a foundational technology for modern computing with millions of lines of code implementing everything from commercial operating-systems to hobby projects. This installation base and the programmers producing it represent a massive software-engineering investment spanning decades and likely to continue for decades more. Nonetheless, C, first standardized over thirty years ago, lacks many features that make programming in more modern languages safer and more productive. The goal of the \CFA project is to create an extension of C that provides modern safety and productivity features while still ensuring strong backwards compatibility with C and its programmers. Prior projects have attempted similar goals but failed to honour C programming-style; for instance, adding object-oriented or functional programming with garbage collection is a non-starter for many C developers. Specifically, \CFA is designed to have an orthogonal feature-set based closely on the C programming paradigm, so that \CFA features can be added \emph{incrementally} to existing C code-bases, and C programmers can learn \CFA extensions on an as-needed basis, preserving investment in existing code and engineers. This paper describes only two \CFA extensions, generic and tuple types, and how they are implemented in accordance with these principles.
    120 120    \end{abstract}
    121 121
     
    133 133    \item Extensions introduced by \CFA must be translated in the most efficient way possible.
    134 134    \end{enumerate}
    135      - The purpose of these goals is to ensure that existing C code-bases can be converted to \CFA incrementally and with minimal effort, and that programmers who know C can productively generate \CFA code without training beyond the features they wish to employ. In its current implementation, \CFA is compiled by translating it to the GCC-dialect of C, allowing it to leverage the portability and code optimizations provided by GCC, meeting goals (1)-(3).
    136      -
    137      - \CFA has been previously extended with polymorphic functions and name overloading (including operator overloading)~\citep{Bilson03}, and deterministically-executed constructors and destructors~\citep{Schluntz17}. This paper describes how generic and tuple types are designed and implemented in \CFA in accordance with both the backward compatibility goals and existing features described above.
        135  + The purpose of these goals is to ensure that existing C code-bases can be converted to \CFA incrementally and with minimal effort, and that programmers who know C can productively generate \CFA code without training beyond the features they wish to employ. In its current implementation, \CFA is compiled by translating it to the GCC-dialect of C~\citep{GCCExtensions}, allowing it to leverage the portability and code optimizations provided by GCC, meeting goals (1)-(3).
        136  +
        137  + \CFA has been previously extended with polymorphic functions and name overloading (including operator overloading) by \citet{Bilson03}, and deterministically-executed constructors and destructors by \citet{Schluntz17}. This paper describes how generic and tuple types are designed and implemented in \CFA in accordance with both the backward compatibility goals and existing features described above.
    138 138
    139 139
     
    141 141    \label{sec:poly-fns}
    142 142
    143      - \CFA's polymorphism was originally formalized by~\citet{Ditchfield92}, and first implemented by~\citet{Bilson03}. The signature feature of \CFA is parametric-polymorphic functions; such functions are written using a @forall@ clause (which gives the language its name):
        143  + \CFA's polymorphism was originally formalized by \citet{Ditchfield92}, and first implemented by \citet{Bilson03}. The signature feature of \CFA is parametric-polymorphic functions; such functions are written using a @forall@ clause (which gives the language its name):
    144 144    \begin{lstlisting}
    145 145    `forall( otype T )` T identity( T val ) { return val; }
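For illustration only (not from the paper or this changeset), and since the @identity@ listing is truncated in this hunk: a minimal hand-written C sketch of the translation approach the paper describes, with the @otype@ parameter erased to an explicit size and the value passed through untyped storage. All names here are assumptions, not the translator's actual output.
\begin{lstlisting}
#include <stddef.h>
#include <string.h>
#include <stdio.h>

/* Sketch only: one plausible C lowering of `forall( otype T ) T identity( T val )`. */
static void _identity( size_t _sizeof_T, void *_ret, void *_val ) {
	memcpy( _ret, _val, _sizeof_T );                /* return val; */
}

int main( void ) {
	int i_in = 42, i_out;
	double d_in = 3.7, d_out;
	_identity( sizeof(int), &i_out, &i_in );        /* T bound to int */
	_identity( sizeof(double), &d_out, &d_in );     /* T bound to double */
	printf( "%d %g\n", i_out, d_out );              /* prints: 42 3.7 */
	return 0;
}
\end{lstlisting}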
     
    156 156    int val = twice( twice( 3.7 ) );
    157 157    \end{lstlisting}
    158      - which works for any type @T@ with an addition operator defined. The translator accomplishes this polymorphism by creating a wrapper function for calling @+@ with @T@ bound to @double@, then providing this function to the first call of @twice@. It then has the option of using the same @twice@ again and converting the result to @int@ on assignment, or creating another @twice@ with type parameter @T@ bound to @int@ because \CFA uses the return type in its type analysis. The first approach has a late conversion from integer to floating-point on the final assignment, while the second has an eager conversion to integer. \CFA minimize the number of conversions and their potential to lose information, so it selects the first approach.
        158  + which works for any type @T@ with an addition operator defined. The translator accomplishes this polymorphism by creating a wrapper function for calling @+@ with @T@ bound to @double@, then providing this function to the first call of @twice@. It then has the option of using the same @twice@ again and converting the result to @int@ on assignment, or creating another @twice@ with type parameter @T@ bound to @int@ because \CFA uses the return type in its type analysis. The first approach has a late conversion from integer to floating-point on the final assignment, while the second has an eager conversion to integer. \CFA minimizes the number of conversions and their potential to lose information, so it selects the first approach.
    159 159
    160 160    Monomorphic specializations of polymorphic functions can satisfy polymorphic type-assertions.
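For illustration only (not from the paper or this changeset): a rough hand-written C sketch of how @int val = twice( twice( 3.7 ) );@ could be lowered as line 158 describes, with the @?+?@ specialization for @double@ passed explicitly to @twice@ and the conversion to @int@ happening late, on the final assignment. The names and calling convention are assumptions.
\begin{lstlisting}
#include <stddef.h>
#include <stdio.h>

/* Assumed wrapper for ?+? at T = double; illustrative name only. */
static void _add_double( void *ret, void *a, void *b ) {
	*(double *)ret = *(double *)a + *(double *)b;
}

/* One plausible C lowering of the polymorphic twice: T is erased to a size and
   the required addition is passed as an explicit function parameter. */
static void _twice( size_t _sizeof_T, void *_ret, void *_x,
                    void (*_add)( void *, void *, void * ) ) {
	(void)_sizeof_T;                                   /* size unused in this tiny sketch */
	_add( _ret, _x, _x );                              /* return x + x; */
}

int main( void ) {
	double in = 3.7, t1, t2;
	_twice( sizeof(double), &t1, &in, _add_double );   /* inner call, T bound to double */
	_twice( sizeof(double), &t2, &t1, _add_double );   /* outer call reuses the same instantiation */
	int val = (int)t2;                                 /* late conversion on the final assignment */
	printf( "%d\n", val );                             /* prints: 14 */
	return 0;
}
\end{lstlisting}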
     
    175 175    \end{lstlisting}
    176 176    @qsort@ and @bsearch@ can only be called with arguments for which there exists a function named @<@ taking two arguments of the same type and returning an @int@ value.
    177      - Here, the build-in monomorphic specialization of @<@ for type @double@ is passed as an additional implicit parameter to the calls of @qsort@ and @bsearch@.
        177  + Here, the built-in monomorphic specialization of @<@ for type @double@ is passed as an additional implicit parameter to the calls of @qsort@ and @bsearch@.
    178 178
    179 179    Crucial to the design of a new programming language are the libraries to access thousands of external features.
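For illustration only (not from the paper or this changeset): a self-contained C sketch of what passing the built-in specialization of @<@ as an additional implicit parameter (line 177) can amount to at the generated-code level. The generic @_sort@ routine and its names are assumptions standing in for the polymorphic @qsort@, not the library's actual code.
\begin{lstlisting}
#include <stddef.h>
#include <string.h>
#include <stdio.h>

/* Assumed wrapper for the built-in < at T = double. */
static int _less_double( const void *a, const void *b ) {
	return *(const double *)a < *(const double *)b;
}

/* A generic insertion sort standing in for the polymorphic qsort: the element
   type is reduced to a size plus the `<` routine passed as an extra parameter. */
static void _sort( size_t _sizeof_T, void *_arr, size_t _n,
                   int (*_less)( const void *, const void * ) ) {
	char *a = _arr, tmp[64];                           /* sketch assumes _sizeof_T <= 64 */
	for ( size_t i = 1; i < _n; i += 1 ) {
		memcpy( tmp, a + i * _sizeof_T, _sizeof_T );
		size_t j = i;
		while ( j > 0 && _less( tmp, a + (j - 1) * _sizeof_T ) ) {
			memcpy( a + j * _sizeof_T, a + (j - 1) * _sizeof_T, _sizeof_T );
			j -= 1;
		}
		memcpy( a + j * _sizeof_T, tmp, _sizeof_T );
	}
}

int main( void ) {
	double vals[] = { 5.0, 2.0, 9.0, 1.0 };
	_sort( sizeof(double), vals, 4, _less_double );    /* < for double passed explicitly */
	printf( "%g %g %g %g\n", vals[0], vals[1], vals[2], vals[3] );   /* prints: 1 2 5 9 */
	return 0;
}
\end{lstlisting}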
     
    207 207    \CC's type-system cannot disambiguate between the two versions of @bsearch@ because it does not use the return type in overload resolution, nor can \CC separately compile a templated @bsearch@.
    208 208
    209      - Call-site inferencing and nested functions provide a localized form of inhertance. For example, @qsort@ only sorts in asending order using @<@. However, it is trivial to locally change this behaviour:
        209  + Call-site inferencing and nested functions provide a localized form of inheritance. For example, @qsort@ only sorts in ascending order using @<@. However, it is trivial to locally change this behaviour:
    210 210    \begin{lstlisting}
    211 211    {
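The \CFA listing above is truncated in this hunk. As a plain-C analogy only, not the paper's \CFA example: locally changing the comparison at one call site is equivalent, at the generated-code level, to passing a different comparison routine.
\begin{lstlisting}
#include <stdlib.h>
#include <stdio.h>

/* Ascending comparison on doubles. */
static int _cmp_up( const void *a, const void *b ) {
	double x = *(const double *)a, y = *(const double *)b;
	return ( x > y ) - ( x < y );
}
/* "Locally redefined" comparison: reverses the order, giving a descending sort. */
static int _cmp_down( const void *a, const void *b ) {
	return -_cmp_up( a, b );
}

int main( void ) {
	double v[] = { 3.0, 1.0, 2.0 };
	qsort( v, 3, sizeof v[0], _cmp_down );          /* analogous to the nested < override */
	printf( "%g %g %g\n", v[0], v[1], v[2] );       /* prints: 3 2 1 */
	return 0;
}
\end{lstlisting}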
     
    251 251    % \end{lstlisting}
    252 252    \begin{lstlisting}
    253      - trait sumable( otype T ) {
    254      -         const T 0;                                                              $\C{// variable zero}$
        253  + trait summable( otype T ) {
        254  +         void ?{}(T*, zero_t);                                   $\C{// constructor from 0 literal}$
    255 255            T ?+?( T, T );                                                  $\C{// assortment of additions}$
    256 256            T ?+=?( T *, T );
     
    258 258            T ?++( T * );
    259 259    };
    260      - forall( otype T | sumable( T ) )
        260  + forall( otype T | summable( T ) )
    261 261      T sum( T a[$\,$], size_t dimension ) {
    262      -         T total = 0;                                                    $\C{// instantiate T, select 0}$
        262  +         T total = { 0 };                                                        $\C{// instantiate T from 0}$
    263 263            for ( unsigned int i = 0; i < dimension; i += 1 )
    264 264                    total += a[i];                                          $\C{// select appropriate +}$
     
    278 278    };
    279 279    \end{lstlisting}
    280      - Given the information provided for an @otype@, variables of polymorphic type can be treated as if they were a complete struct type -- they can be stack-allocated using the @alloca@ compiler builtin, default or copy-initialized, assigned, and deleted. As an example, the @sum@ function produces generated code something like the following (simplified for clarity and brevity):
        280  + Given the information provided for an @otype@, variables of polymorphic type can be treated as if they were a complete struct type -- they can be stack-allocated using the @alloca@ compiler builtin, default or copy-initialized, assigned, and deleted. As an example, the @sum@ function produces generated code something like the following (simplified for clarity and brevity) \TODO{} fix example, maybe elide, it's likely too long with the more complicated function:
    281 281    \begin{lstlisting}
    282 282    void abs( size_t _sizeof_M, size_t _alignof_M,
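The generated-code listing is truncated here and, per the \TODO on line 280, still shows the earlier @abs@ example. Purely to illustrate the pattern visible in its first line (explicit @_sizeof_@/@_alignof_@ parameters plus assertion routines), a hypothetical hand-written C rendering of @sum@ might look as follows; this is a reconstruction under stated assumptions, not the translator's actual output.
\begin{lstlisting}
#include <stddef.h>
#include <string.h>
#include <stdio.h>
#include <alloca.h>          /* glibc header; the translator uses the compiler's alloca builtin */

/* Assumed wrappers satisfying the summable assertions at T = double. */
static void _zero_double( void *t ) { *(double *)t = 0.0; }        /* void ?{}( T *, zero_t ) */
static void _add_assign_double( void *l, const void *r ) {         /* T ?+=?( T *, T ), result ignored */
	*(double *)l += *(const double *)r;
}

/* Hypothetical generated shape of sum: otype T becomes size/alignment plus the
   assertion routines; the local T is stack-allocated with alloca. */
static void _sum( size_t _sizeof_T, size_t _alignof_T,
                  void (*_ctor_zero)( void * ),
                  void (*_add_assign)( void *, const void * ),
                  void *_ret, void *_a, size_t _dim ) {
	(void)_alignof_T;                                  /* alignment handling elided in this sketch */
	void *_total = alloca( _sizeof_T );                /* T total */
	_ctor_zero( _total );                              /* T total = { 0 }; */
	for ( size_t _i = 0; _i < _dim; _i += 1 )
		_add_assign( _total, (char *)_a + _i * _sizeof_T );   /* total += a[i]; */
	memcpy( _ret, _total, _sizeof_T );                 /* return total; */
}

int main( void ) {
	double a[] = { 1.5, 2.5, 3.0 }, total;
	_sum( sizeof(double), _Alignof(double), _zero_double, _add_assign_double, &total, a, 3 );
	printf( "%g\n", total );                           /* prints: 7 */
	return 0;
}
\end{lstlisting}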
     
    670 670    If assertion arguments must match exactly, then the call to @g@ cannot be resolved, since the expected type of @f@ is flat, while the only @f@ in scope requires a tuple type. Since tuples are fluid, this requirement reduces the usability of tuples in polymorphic code. To ease this pain point, function parameter and return lists are flattened for the purposes of type unification, which allows the previous example to pass expression resolution.
    671 671
    672      - This relaxation is made possible by extending the existing thunk generation scheme, as described by~\citet{Bilson03}. Now, whenever a candidate's parameter structure does not exactly match the formal parameter's structure, a thunk is generated to specialize calls to the actual function:
        672  + This relaxation is made possible by extending the existing thunk generation scheme, as described by \citet{Bilson03}. Now, whenever a candidate's parameter structure does not exactly match the formal parameter's structure, a thunk is generated to specialize calls to the actual function:
    673 673    \begin{lstlisting}
    674 674    int _thunk(int _p0, double _p1, double _p2) {
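The thunk listing is truncated in this hunk. As a hedged illustration of the mechanism line 672 describes, a thunk repackages a flattened argument list into the shape the actual function expects; the following plain-C sketch assumes a hypothetical @f@ taking its last two arguments bundled in a struct, and is not the paper's example.
\begin{lstlisting}
#include <stdio.h>

/* Hypothetical actual function: takes its two doubles bundled as a pair. */
struct _pair2d { double x, y; };
static int f( int i, struct _pair2d p ) {
	return i + (int)( p.x + p.y );
}

/* Thunk with the flattened signature expected at the call site: it regroups
   the arguments and forwards to f. */
static int _thunk( int _p0, double _p1, double _p2 ) {
	return f( _p0, (struct _pair2d){ _p1, _p2 } );
}

int main( void ) {
	printf( "%d\n", _thunk( 1, 2.0, 3.0 ) );    /* prints: 6 */
	return 0;
}
\end{lstlisting}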