Changeset eaeca5f
- Timestamp: Aug 29, 2021, 11:46:13 AM (4 years ago)
- Branches: ADT, ast-experimental, enum, forall-pointer-decay, jacob/cs343-translation, master, pthread-emulation, qualifiedEnum
- Children: 75f8e04
- Parents: 1d402be (diff), cfbab07 (diff)
Note: this is a merge changeset; the changes displayed below correspond to the merge itself. Use the (diff) links above to see all the changes relative to each parent.
- Location: doc/theses/andrew_beach_MMath
- Files: 8 edited
doc/theses/andrew_beach_MMath/conclusion.tex
r1d402be reaeca5f 1 1 \chapter{Conclusion} 2 \label{c:conclusion} 2 3 % Just a little knot to tie the paper together. 3 4 4 In the previous chapters this thesis presents the design and implementation of 5 \CFA's EHM. Both the design and implementation are based off of tools and 5 In the previous chapters this thesis presents the design and implementation 6 of \CFA's exception handling mechanism (EHM). 7 Both the design and implementation are based off of tools and 6 8 techniques developed for other programming languages but they were adapted to 7 better fit \CFA's feature set and add a few features that do not exist in other8 EHMs, like conditional catch, default handlers, implicitly changing resumption 9 in to termination in the resumption default handler, and cancellation through10 coroutines and threads back to program main.9 better fit \CFA's feature set and add a few features that do not exist in 10 other EHMs; 11 including conditional matching, default handlers for unhandled exceptions 12 and cancellation though coroutines and threads back to the program main stack. 11 13 12 14 The resulting features cover all of the major use cases of the most popular … … 15 17 such as virtuals independent of traditional objects. 16 18 17 The implementation has been tested through a set of small but interesting micro-benchmarks 18 and compared to other implementations. 19 The \CFA project's test suite has been expanded to test the EHM. 20 The implementation's performance has also been 21 compared to other implementations with a small set of targeted 22 micro-benchmarks. 19 23 The results, while not cutting edge, are good enough for prototyping, which 20 24 is \CFA's current stage of development. 21 25 22 This initial EHM is a valuable new feature for \CFA in its own right but also serves 23 as a tool and motivation for other developments in the language. 26 This initial EHM will bring valuable new features to \CFA in its own right 27 but also serves as a tool and motivation for other developments in the 28 language. -
doc/theses/andrew_beach_MMath/existing.tex
r1d402be reaeca5f 10 10 11 11 Only those \CFA features pertaining to this thesis are discussed. 12 % Also, only new features of \CFA will be discussed,13 12 A familiarity with 14 13 C or C-like languages is assumed. … … 17 16 \CFA has extensive overloading, allowing multiple definitions of the same name 18 17 to be defined~\cite{Moss18}. 19 \begin{ lstlisting}[language=CFA,{moredelim=**[is][\color{red}]{@}{@}}]20 char @i@; int @i@; double @i@;21 int @f@(); double @f@();22 void @g@( int ); void @g@( double );23 \end{ lstlisting}18 \begin{cfa} 19 char i; int i; double i; 20 int f(); double f(); 21 void g( int ); void g( double ); 22 \end{cfa} 24 23 This feature requires name mangling so the assembly symbols are unique for 25 24 different overloads. For compatibility with names in C, there is also a syntax … … 63 62 int && rri = ri; 64 63 rri = 3; 65 &ri = &j; // rebindable64 &ri = &j; 66 65 ri = 5; 67 66 \end{cfa} … … 79 78 \end{minipage} 80 79 81 References are intended for pointer situations where dereferencing is the common usage, 82 \ie the value is more important than the pointer. 80 References are intended to be used when the indirection of a pointer is 81 required, but the address is not as important as the value and dereferencing 82 is the common usage. 83 83 Mutable references may be assigned to by converting them to a pointer 84 with a @&@ and then assigning a pointer to them, as in @&ri = &j;@ above 84 with a @&@ and then assigning a pointer to them, as in @&ri = &j;@ above. 85 % ??? 85 86 86 87 \section{Operators} 87 88 88 89 \CFA implements operator overloading by providing special names, where 89 operator usages are translated into function calls using these names.90 operator expressions are translated into function calls using these names. 90 91 An operator name is created by taking the operator symbols and joining them with 91 92 @?@s to show where the arguments go. … … 94 95 This syntax make it easy to tell the difference between prefix operations 95 96 (such as @++?@) and post-fix operations (@?++@). 96 For example, plus and equality operators are defined for a point type. 97 98 As an example, here are the addition and equality operators for a point type. 97 99 \begin{cfa} 98 100 point ?+?(point a, point b) { return point{a.x + b.x, a.y + b.y}; } … … 102 104 } 103 105 \end{cfa} 104 Note these special names are not limited to builtin 105 operators, and hence, may be used with arbitrary types. 106 \begin{cfa} 107 double ?+?( int x, point y ); // arbitrary types 108 \end{cfa} 109 % Some ``near misses", that are that do not match an operator form but looks like 110 % it may have been supposed to, will generate warning but otherwise they are 111 % left alone. 112 Because operators are never part of the type definition they may be added 113 at any time, including on built-in types. 106 Note that this syntax works effectively but a textual transformation, 107 the compiler converts all operators into functions and then resolves them 108 normally. This means any combination of types may be used, 109 although nonsensical ones (like @double ?==?(point, int);@) are discouraged. 110 This feature is also used for all builtin operators as well, 111 although those are implicitly provided by the language. 114 112 115 113 %\subsection{Constructors and Destructors} 116 117 \CFA also provides constructors and destructors as operators, which means they 118 are functions with special operator names rather than type names in \Cpp. 
119 While constructors and destructions are normally called implicitly by the compiler, 120 the special operator names, allow explicit calls. 121 122 % Placement new means that this is actually equivalent to C++. 114 In \CFA, constructors and destructors are operators, which means they are 115 functions with special operator names rather than type names in \Cpp. 116 Both constructors and destructors can be implicity called by the compiler, 117 however the operator names allow explicit calls. 118 % Placement new means that this is actually equivant to C++. 123 119 124 120 The special name for a constructor is @?{}@, which comes from the … … 129 125 struct Example { ... }; 130 126 void ?{}(Example & this) { ... } 127 { 128 Example a; 129 Example b = {}; 130 } 131 131 void ?{}(Example & this, char first, int num) { ... } 132 Example a; // implicit constructor calls 133 Example b = {};134 Example c = {'a', 2}; 135 \end{cfa} 136 Both @a@ and @b@ are initialized with the first constructor,137 while @c@ is initialized with the second.138 Constructor calls can be replaced with C initialization using special operator \lstinline{@=}.139 \begin{cfa} 140 Example d @= {42}; 141 \end{cfa} 132 { 133 Example c = {'a', 2}; 134 } 135 \end{cfa} 136 Both @a@ and @b@ will be initalized with the first constructor, 137 @b@ because of the explicit call and @a@ implicitly. 138 @c@ will be initalized with the second constructor. 139 Currently, there is no general way to skip initialation. 140 % I don't use @= anywhere in the thesis. 141 142 142 % I don't like the \^{} symbol but $^\wedge$ isn't better. 143 143 Similarly, destructors use the special name @^?{}@ (the @^@ has no special 144 144 meaning). 145 % These are a normally called implicitly called on a variable when it goes out146 % of scope. They can be called explicitly as well.147 145 \begin{cfa} 148 146 void ^?{}(Example & this) { ... } 149 147 { 150 Example e; // implicit constructor call 151 ^?{}(e); // explicit destructor call 152 ?{}(e); // explicit constructor call 153 } // implicit destructor call 148 Example d; 149 ^?{}(d); 150 151 Example e; 152 } // Implicit call of ^?{}(e); 154 153 \end{cfa} 155 154 … … 225 224 The global definition of @do_once@ is ignored, however if quadruple took a 226 225 @double@ argument, then the global definition would be used instead as it 227 isa better match.228 % Aaron's thesis might be a good reference here. 229 230 To avoid typing long lists of assertions, constraints can be collect into231 convenient package called a @trait@, which can then be used in an assertion226 would then be a better match. 227 \todo{cite Aaron's thesis (maybe)} 228 229 To avoid typing long lists of assertions, constraints can be collected into 230 convenient a package called a @trait@, which can then be used in an assertion 232 231 instead of the individual constraints. 233 232 \begin{cfa} … … 253 252 node(T) * next; 254 253 T * data; 255 } 254 }; 256 255 node(int) inode; 257 256 \end{cfa} … … 293 292 }; 294 293 CountUp countup; 295 for (10) sout | resume(countup).next; // print 10 values296 294 \end{cfa} 297 295 Each coroutine has a @main@ function, which takes a reference to a coroutine 298 296 object and returns @void@. 299 297 %[numbers=left] Why numbers on this one? 
300 \begin{cfa} [numbers=left,numberstyle=\scriptsize\sf]298 \begin{cfa} 301 299 void main(CountUp & this) { 302 for (unsigned int up = 0;; ++up) {303 this.next = up;300 for (unsigned int next = 0 ; true ; ++next) { 301 this.next = next; 304 302 suspend;$\label{suspend}$ 305 303 } … … 307 305 \end{cfa} 308 306 In this function, or functions called by this function (helper functions), the 309 @suspend@ statement is used to return execution to the coroutine's resumer310 without terminating the coroutine's function (s).307 @suspend@ statement is used to return execution to the coroutine's caller 308 without terminating the coroutine's function. 311 309 312 310 A coroutine is resumed by calling the @resume@ function, \eg @resume(countup)@. 313 311 The first resume calls the @main@ function at the top. Thereafter, resume calls 314 312 continue a coroutine in the last suspended function after the @suspend@ 315 statement, in this case @main@ line~\ref{suspend}. The @resume@ function takes 316 a reference to the coroutine structure and returns the same reference. The 317 return value allows easy access to communication variables defined in the 318 coroutine object. For example, the @next@ value for coroutine object @countup@ 319 is both generated and collected in the single expression: 320 @resume(countup).next@. 313 statement. In this case there is only one and, hence, the difference between 314 subsequent calls is the state of variables inside the function and the 315 coroutine object. 316 The return value of @resume@ is a reference to the coroutine, to make it 317 convent to access fields of the coroutine in the same expression. 318 Here is a simple example in a helper function: 319 \begin{cfa} 320 unsigned int get_next(CountUp & this) { 321 return resume(this).next; 322 } 323 \end{cfa} 324 325 When the main function returns the coroutine halts and can no longer be 326 resumed. 321 327 322 328 \subsection{Monitor and Mutex Parameter} … … 330 336 exclusion on a monitor object by qualifying an object reference parameter with 331 337 @mutex@. 332 \begin{ lstlisting}[language=CFA,{moredelim=**[is][\color{red}]{@}{@}}]333 void example(MonitorA & @mutex@ argA, MonitorB & @mutex@argB);334 \end{ lstlisting}338 \begin{cfa} 339 void example(MonitorA & mutex argA, MonitorB & mutex argB); 340 \end{cfa} 335 341 When the function is called, it implicitly acquires the monitor lock for all of 336 342 the mutex parameters without deadlock. This semantics means all functions with … … 362 368 { 363 369 StringWorker stringworker; // fork thread running in "main" 364 } // implicitly join with thread / wait for completion370 } // Implicit call to join(stringworker), waits for completion. 365 371 \end{cfa} 366 372 The thread main is where a new thread starts execution after a fork operation -
doc/theses/andrew_beach_MMath/features.tex
r1d402be reaeca5f 19 19 20 20 \paragraph{Raise} 21 The raise is the starting point for exception handling 21 The raise is the starting point for exception handling, 22 22 by raising an exception, which passes it to 23 23 the EHM. … … 30 30 \paragraph{Handle} 31 31 The primary purpose of an EHM is to run some user code to handle a raised 32 exception. This code is given, with some other information, in a handler. 32 exception. This code is given, along with some other information, 33 in a handler. 33 34 34 35 A handler has three common features: the previously mentioned user code, a 35 region of code it guards ,and an exception label/condition that matches36 the raised exception.36 region of code it guards and an exception label/condition that matches 37 against the raised exception. 37 38 Only raises inside the guarded region and raising exceptions that match the 38 39 label can be handled by a given handler. … … 41 42 42 43 The @try@ statements of \Cpp, Java and Python are common examples. All three 43 show the common features of guarded region, raise, matching and handler. 44 \begin{cfa} 45 try { // guarded region 46 ... 47 throw exception; // raise 48 ... 49 } catch( exception ) { // matching condition, with exception label 50 ... // handler code 51 } 52 \end{cfa} 44 also show another common feature of handlers, they are grouped by the guarded 45 region. 53 46 54 47 \subsection{Propagation} 55 48 After an exception is raised comes what is usually the biggest step for the 56 EHM: finding and setting up the handler for execution. The propagation from raise to 49 EHM: finding and setting up the handler for execution. 50 The propagation from raise to 57 51 handler can be broken up into three different tasks: searching for a handler, 58 52 matching against the handler and installing the handler. … … 60 54 \paragraph{Searching} 61 55 The EHM begins by searching for handlers that might be used to handle 62 the exception. The search is restricted to63 handlers that have the raise site in their guarded56 the exception. 57 The search will find handlers that have the raise site in their guarded 64 58 region. 65 59 The search includes handlers in the current function, as well as any in … … 67 61 68 62 \paragraph{Matching} 69 Each handler found is matchedwith the raised exception. The exception63 Each handler found is with the raised exception. The exception 70 64 label defines a condition that is used with the exception and decides if 71 65 there is a match or not. 66 % 72 67 In languages where the first match is used, this step is intertwined with 73 68 searching; a match check is performed immediately after the search finds … … 84 79 different course of action for this case. 85 80 This situation only occurs with unchecked exceptions as checked exceptions 86 (such as in Java) are guaranteed to find a matching handler.81 (such as in Java) can make the guarantee. 87 82 The unhandled action is usually very general, such as aborting the program. 88 83 … … 98 93 A handler labeled with any given exception can handle exceptions of that 99 94 type or any child type of that exception. The root of the exception hierarchy 100 (here \code{C}{exception}) acts as a catch-all, leaf types catch single types ,95 (here \code{C}{exception}) acts as a catch-all, leaf types catch single types 101 96 and the exceptions in the middle can be used to catch different groups of 102 97 related exceptions. 
103 98 104 99 This system has some notable advantages, such as multiple levels of grouping, 105 the ability for libraries to add new exception types ,and the isolation100 the ability for libraries to add new exception types and the isolation 106 101 between different sub-hierarchies. 107 102 This design is used in \CFA even though it is not a object-orientated … … 123 118 For effective exception handling, additional information is often passed 124 119 from the raise to the handler and back again. 125 So far, only communication of the exception's identity is covered. 126 A common communication method for passing more information is putting fields into the exception instance 120 So far, only communication of the exceptions' identity is covered. 121 A common communication method for adding information to an exception 122 is putting fields into the exception instance 127 123 and giving the handler access to them. 128 Using reference fields pointing to data at the raise location allows data to be 129 passed in both directions. 124 % You can either have pointers/references in the exception, or have p/rs to 125 % the exception when it doesn't have to be copied. 126 Passing references or pointers allows data at the raise location to be 127 updated, passing information in both directions. 130 128 131 129 \section{Virtuals} 132 \label{s: Virtuals}130 \label{s:virtuals} 133 131 Virtual types and casts are not part of \CFA's EHM nor are they required for 134 132 an EHM. 135 133 However, one of the best ways to support an exception hierarchy 136 134 is via a virtual hierarchy and dispatch system. 137 Ideally, the virtual system should have been part of \CFA before the work135 Ideally, the virtual system would have been part of \CFA before the work 138 136 on exception handling began, but unfortunately it was not. 139 137 Hence, only the features and framework needed for the EHM were 140 designed and implemented for this thesis. Other features were considered to ensure that 138 designed and implemented for this thesis. 139 Other features were considered to ensure that 141 140 the structure could accommodate other desirable features in the future 142 141 but are not implemented. 143 142 The rest of this section only discusses the implemented subset of the 144 virtual -system design.143 virtual system design. 145 144 146 145 The virtual system supports multiple ``trees" of types. Each tree is … … 149 148 number of children. 150 149 Any type that belongs to any of these trees is called a virtual type. 151 For example, the following hypothetical syntax creates two virtual-type trees.152 \begin{flushleft}153 \lstDeleteShortInline@154 \begin{tabular}{@{\hspace{20pt}}l@{\hspace{20pt}}l}155 \begin{cfa}156 vtype V0, V1(V0), V2(V0);157 vtype W0, W1(W0), W2(W1);158 \end{cfa}159 &160 \raisebox{-0.6\totalheight}{\input{vtable}}161 \end{tabular}162 \lstMakeShortInline@163 \end{flushleft}164 150 % A type's ancestors are its parent and its parent's ancestors. 165 151 % The root type has no ancestors. 166 152 % A type's descendants are its children and its children's descendants. 167 Every virtual type (tree node) has a pointer to a virtual table with a unique 168 @Id@ and a list of virtual members (see \autoref{s:VirtualSystem} for 169 details). Children inherit their parent's list of virtual members but may add 170 and/or replace members. 
For example, 171 \begin{cfa} 172 vtable W0 | { int ?<?( int, int ); int ?+?( int, int ); } 173 vtable W1 | { int ?+?( int, int ); int w, int ?-?( int, int ); } 174 \end{cfa} 175 creates a virtual table for @W0@ initialized with the matching @<@ and @+@ 176 operations visible at this declaration context. Similarly, @W1@ is initialized 177 with @<@ from inheritance with @W0@, @+@ is replaced, and @-@ is added, where 178 both operations are matched at this declaration context. It is important to 179 note that these are virtual members, not virtual methods of object-orientated 180 programming, and can be of any type. Finally, trait names can be used to 181 specify the list of virtual members. 182 183 \PAB{Need to look at these when done. 184 185 \CFA still supports virtual methods as a special case of virtual members. 186 Function pointers that take a pointer to the virtual type are modified 187 with each level of inheritance so that refers to the new type. 188 This means an object can always be passed to a function in its virtual table 189 as if it were a method. 190 \todo{Clarify (with an example) virtual methods.} 191 }% 153 154 For the purposes of illistration, a proposed -- but unimplemented syntax -- 155 will be used. Each virtual type is repersented by a trait with an annotation 156 that makes it a virtual type. This annotation is empty for a root type, which 157 creates a new tree: 158 \begin{cfa} 159 trait root_type(T) virtual() {} 160 \end{cfa} 161 The annotation may also refer to any existing virtual type to make this new 162 type a child of that type and part of the same tree. The parent may itself 163 be a child or a root type and may have any number of existing children. 164 \begin{cfa} 165 trait child_a(T) virtual(root_type) {} 166 trait grandchild(T) virtual(child_a) {} 167 trait child_b(T) virtual(root_type) {} 168 \end{cfa} 169 \todo{Update the diagram in vtable.fig to show the new type tree.} 170 171 Every virtual type also has a list of virtual members and a unique id, 172 both are stored in a virtual table. 173 Every instance of a virtual type also has a pointer to a virtual table stored 174 in it, although there is no per-type virtual table as in many other languages. 175 176 The list of virtual members is built up down the tree. Every virtual type 177 inherits the list of virtual members from its parent and may add more 178 virtual members to the end of the list which are passed on to its children. 179 Again, using the unimplemented syntax this might look like: 180 \begin{cfa} 181 trait root_type(T) virtual() { 182 const char * to_string(T const & this); 183 unsigned int size; 184 } 185 186 trait child_type(T) virtual(root_type) { 187 char * irrelevant_function(int, char); 188 } 189 \end{cfa} 190 % Consider adding a diagram, but we might be good with the explanation. 191 192 As @child_type@ is a child of @root_type@ it has the virtual members of 193 @root_type@ (@to_string@ and @size@) as well as the one it declared 194 (@irrelivant_function@). 195 196 It is important to note that these are virtual members, and may contain 197 arbitrary fields, functions or otherwise. 198 The names ``size" and ``align" are reserved for the size and alignment of the 199 virtual type, and are always automatically initialized as such. 200 The other special case are uses of the trait's polymorphic argument 201 (@T@ in the example), which are always updated to refer to the current 202 virtual type. 
This allows functions that refer to to polymorphic argument 203 to act as traditional virtual methods (@to_string@ in the example), as the 204 object can always be passed to a virtual method in its virtual table. 192 205 193 206 Up until this point the virtual system is similar to ones found in 194 object-orientated languages but this is where \CFA diverges. Objects encapsulate a 195 single set of methods in each type, universally across the entire program, 196 and indeed all programs that use that type definition. Even if a type inherits and adds methods, it still encapsulate a 197 single set of methods. In this sense, 198 object-oriented types are ``closed" and cannot be altered. 199 200 In \CFA, types do not encapsulate any code. Traits are local for each function and 201 types can satisfy a local trait, stop satisfying it or, satisfy the same 202 trait in a different way at any lexical location in the program where a function is call. 203 In this sense, the set of functions/variables that satisfy a trait for a type is ``open" as the set can change at every call site. 207 object-oriented languages but this is where \CFA diverges. 208 Objects encapsulate a single set of methods in each type, 209 universally across the entire program, 210 and indeed all programs that use that type definition. 211 The only way to change any method is to inherit and define a new type with 212 its own universal implementation. In this sense, 213 these object-oriented types are ``closed" and cannot be altered. 214 % Because really they are class oriented. 215 216 In \CFA, types do not encapsulate any code. 217 Whether or not satisfies any given assertion, and hence any trait, is 218 context sensitive. Types can begin to satisfy a trait, stop satisfying it or 219 satisfy the same trait at any lexical location in the program. 220 In this sense, an type's implementation in the set of functions and variables 221 that allow it to satisfy a trait is ``open" and can change 222 throughout the program. 204 223 This capability means it is impossible to pick a single set of functions 205 224 that represent a type's implementation across a program. … … 208 227 type. A user can define virtual tables that are filled in at their 209 228 declaration and given a name. Anywhere that name is visible, even if it is 210 defined locally inside a function \PAB{What does this mean? (although that means it does not have a211 static lifetime)}, it can be used.229 defined locally inside a function (although in this case the user must ensure 230 it outlives any objects that use it), it can be used. 212 231 Specifically, a virtual type is ``bound" to a virtual table that 213 232 sets the virtual members for that object. The virtual members can be accessed 214 233 through the object. 234 235 This means virtual tables are declared and named in \CFA. 236 They are declared as variables, using the type 237 @vtable(VIRTUAL_TYPE)@ and any valid name. For example: 238 \begin{cfa} 239 vtable(virtual_type_name) table_name; 240 \end{cfa} 241 242 Like any variable they may be forward declared with the @extern@ keyword. 243 Forward declaring virtual tables is relatively common. 244 Many virtual types have an ``obvious" implementation that works in most 245 cases. 246 A pattern that has appeared in the early work using virtuals is to 247 implement a virtual table with the the obvious definition and place a forward 248 declaration of it in the header beside the definition of the virtual type. 
249 250 Even on the full declaration, no initializer should be used. 251 Initialization is automatic. 252 The type id and special virtual members ``size" and ``align" only depend on 253 the virtual type, which is fixed given the type of the virtual table and 254 so the compiler fills in a fixed value. 255 The other virtual members are resolved, using the best match to the member's 256 name and type, in the same context as the virtual table is declared using 257 \CFA's normal resolution rules. 215 258 216 259 While much of the virtual infrastructure is created, it is currently only used … … 228 271 @EXPRESSION@ object, otherwise it returns @0p@ (null pointer). 229 272 230 \section{Exception} 231 % Leaving until later, hopefully it can talk about actual syntax instead 232 % of my many strange macros. Syntax aside I will also have to talk about the 233 % features all exceptions support. 234 235 Exceptions are defined by the trait system; there are a series of traits, and 273 \section{Exceptions} 274 275 The syntax for declaring an exception is the same as declaring a structure 276 except the keyword that is swapped out: 277 \begin{cfa} 278 exception TYPE_NAME { 279 FIELDS 280 }; 281 \end{cfa} 282 283 Fields are filled in the same way as a structure as well. However an extra 284 field is added, this field contains the pointer to the virtual table. 285 It must be explicitly initialised by the user when the exception is 286 constructed. 287 288 Here is an example of declaring an exception type along with a virtual table, 289 assuming the exception has an ``obvious" implementation and a default 290 virtual table makes sense. 291 292 \begin{minipage}[t]{0.4\textwidth} 293 Header: 294 \begin{cfa} 295 exception Example { 296 int data; 297 }; 298 299 extern vtable(Example) 300 example_base_vtable; 301 \end{cfa} 302 \end{minipage} 303 \begin{minipage}[t]{0.6\textwidth} 304 Source: 305 \begin{cfa} 306 vtable(Example) example_base_vtable 307 \end{cfa} 308 \vfil 309 \end{minipage} 310 311 %\subsection{Exception Details} 312 If one is only raising and handling exceptions, that is the only interface 313 that is needed. However it is actually a short hand for a more complex 314 trait based interface. 315 316 The language views exceptions through a series of traits, 236 317 if a type satisfies them, then it can be used as an exception. The following 237 318 is the base trait all exceptions need to match. … … 247 328 completing the virtual system). The imaginary assertions would probably come 248 329 from a trait defined by the virtual system, and state that the exception type 249 is a virtual type, is a descendant of @exception_t@ (the base exception type) ,250 and note itsvirtual table type.330 is a virtual type, is a descendant of @exception_t@ (the base exception type) 331 and allow the user to find the virtual table type. 251 332 252 333 % I did have a note about how it is the programmer's responsibility to make … … 267 348 \end{cfa} 268 349 Both traits ensure a pair of types are an exception type, its virtual table 269 type ,350 type 270 351 and defines one of the two default handlers. The default handlers are used 271 352 as fallbacks and are discussed in detail in \vref{s:ExceptionHandling}. … … 276 357 facing way. So these three macros are provided to wrap these traits to 277 358 simplify referring to the names: 278 @IS_EXCEPTION@, @IS_TERMINATION_EXCEPTION@ ,and @IS_RESUMPTION_EXCEPTION@.359 @IS_EXCEPTION@, @IS_TERMINATION_EXCEPTION@ and @IS_RESUMPTION_EXCEPTION@. 
279 360 280 361 All three take one or two arguments. The first argument is the name of the … … 299 380 These twin operations are the core of \CFA's exception handling mechanism. 300 381 This section covers the general patterns shared by the two operations and 301 then goes on to cover the details ofeach individual operation.382 then goes on to cover the details each individual operation. 302 383 303 384 Both operations follow the same set of steps. 304 385 First, a user raises an exception. 305 Second, the exception propagates up the stack .386 Second, the exception propagates up the stack, searching for a handler. 306 387 Third, if a handler is found, the exception is caught and the handler is run. 307 388 After that control continues at a raise-dependent location. 308 Fourth, if a handler is not found, a default handler is run and, if it returns, then control 389 As an alternate to the third step, 390 if a handler is not found, a default handler is run and, if it returns, 391 then control 309 392 continues after the raise. 310 393 311 %This general description covers what the two kinds have in common. 312 The differences in the two operations include how propagation is performed, where execution continues 313 a fter an exception is caught and handled, and which default handler is run.394 The differences between the two operations include how propagation is 395 performed, where excecution after an exception is handler 396 and which default handler is run. 314 397 315 398 \subsection{Termination} 316 399 \label{s:Termination} 317 Termination handling is the familiar EHM and used in most programming 400 Termination handling is the familiar kind of handling 401 and used in most programming 318 402 languages with exception handling. 319 403 It is a dynamic, non-local goto. If the raised exception is matched and … … 347 431 Then propagation starts with the search. \CFA uses a ``first match" rule so 348 432 matching is performed with the copied exception as the search key. 349 It starts from the raise in the throwing function and proceeds towards thebase of the stack,433 It starts from the raise site and proceeds towards base of the stack, 350 434 from callee to caller. 351 435 At each stack frame, a check is made for termination handlers defined by the … … 361 445 \end{cfa} 362 446 When viewed on its own, a try statement simply executes the statements 363 in the \snake{GUARDED_BLOCK} ,and when those are finished,447 in the \snake{GUARDED_BLOCK} and when those are finished, 364 448 the try statement finishes. 365 449 … … 387 471 termination exception types. 388 472 The global default termination handler performs a cancellation 389 (see \vref{s:Cancellation} for the justification) on the current stack with the copied exception. 390 Since it is so general, a more specific handler is usually 391 defined, possibly with a detailed message, and used for specific exception type, effectively overriding the default handler. 473 (as described in \vref{s:Cancellation}) 474 on the current stack with the copied exception. 475 Since it is so general, a more specific handler can be defined, 476 overriding the default behaviour for the specific exception types. 392 477 393 478 \subsection{Resumption} 394 479 \label{s:Resumption} 395 480 396 Resumption exception handling is the less familar EHM, but is 481 Resumption exception handling is less familar form of exception handling, 482 but is 397 483 just as old~\cite{Goodenough75} and is simpler in many ways. 398 484 It is a dynamic, non-local function call. 
If the raised exception is … … 403 489 function once the error is corrected, and 404 490 ignorable events, such as logging where nothing needs to happen and control 405 should always continue from the raise point. 491 should always continue from the raise site. 492 493 Except for the changes to fit into that pattern, resumption exception 494 handling is symmetric with termination exception handling, by design 495 (see \autoref{s:Termination}). 406 496 407 497 A resumption raise is started with the @throwResume@ statement: … … 410 500 \end{cfa} 411 501 \todo{Decide on a final set of keywords and use them everywhere.} 412 It works much the same way as the termination throw. 413 The expression must return a reference to a resumption exception, 414 where the resumption exception is any type that satisfies the trait 415 @is_resumption_exception@ at the call site. 416 The assertions from this trait are available to 417 the exception system while handling the exception. 418 419 At run-time, no exception copy is made, since 502 It works much the same way as the termination raise, except the 503 type must satisfy the \snake{is_resumption_exception} that uses the 504 default handler: \defaultResumptionHandler. 505 This can be specialized for particular exception types. 506 507 At run-time, no exception copy is made. Since 420 508 resumption does not unwind the stack nor otherwise remove values from the 421 current scope, so there is no need to manage memory to keep the exception in scope.422 423 Then propagation starts with the search. It starts from the raise in the 424 resuming function and proceeds towards the base of the stack,425 f rom callee to caller.426 At each stack frame, a check is made for resumption handlers defined by the 427 @catchResume@ clauses of a @try@ statement.509 current scope, there is no need to manage memory to keep the exception 510 allocated. 511 512 Then propagation starts with the search, 513 following the same search path as termination, 514 from the raise site to the base of stack and top of try statement to bottom. 515 However, the handlers on try statements are defined by @catchResume@ clauses. 428 516 \begin{cfa} 429 517 try { … … 435 523 } 436 524 \end{cfa} 437 % PAB, you say this above. 438 % When a try statement is executed, it simply executes the statements in the 439 % @GUARDED_BLOCK@ and then finishes. 440 % 441 % However, while the guarded statements are being executed, including any 442 % invoked functions, all the handlers in these statements are included in the 443 % search path. 444 % Hence, if a resumption exception is raised, these handlers may be matched 445 % against the exception and may handle it. 446 % 447 % Exception matching checks the handler in each catch clause in the order 448 % they appear, top to bottom. If the representation of the raised exception type 449 % is the same or a descendant of @EXCEPTION_TYPE@$_i$, then @NAME@$_i$ 450 % (if provided) is bound to a pointer to the exception and the statements in 451 % @HANDLER_BLOCK@$_i$ are executed. 452 % If control reaches the end of the handler, execution continues after the 453 % the raise statement that raised the handled exception. 454 % 455 % Like termination, if no resumption handler is found during the search, 456 % then the default handler (\defaultResumptionHandler) visible at the raise 457 % statement is called. It will use the best match at the raise sight according 458 % to \CFA's overloading rules. The default handler is 459 % passed the exception given to the raise. 
When the default handler finishes 460 % execution continues after the raise statement. 461 % 462 % There is a global @defaultResumptionHandler{} is polymorphic over all 463 % resumption exceptions and performs a termination throw on the exception. 464 % The \defaultTerminationHandler{} can be overridden by providing a new 465 % function that is a better match. 466 467 The @GUARDED_BLOCK@ and its associated nested guarded statements work the same 468 for resumption as for termination, as does exception matching at each 469 @catchResume@. Similarly, if no resumption handler is found during the search, 470 then the currently visible default handler (\defaultResumptionHandler) is 471 called and control continues after the raise statement if it returns. Finally, 472 there is also a global @defaultResumptionHandler@, which can be overridden, 473 that is polymorphic over all resumption exceptions but performs a termination 474 throw on the exception rather than a cancellation. 475 476 Throwing the exception in @defaultResumptionHandler@ has the positive effect of 477 walking the stack a second time for a recovery handler. Hence, a programmer has 478 two chances for help with a problem, fixup or recovery, should either kind of 479 handler appear on the stack. However, this dual stack walk leads to following 480 apparent anomaly: 481 \begin{cfa} 482 try { 483 throwResume E; 484 } catch (E) { 485 // this handler runs 486 } 487 \end{cfa} 488 because the @catch@ appears to handle a @throwResume@, but a @throwResume@ only 489 matches with @catchResume@. The anomaly results because the unmatched 490 @catchResuem@, calls @defaultResumptionHandler@, which in turn throws @E@. 491 492 % I wonder if there would be some good central place for this. 493 Note, termination and resumption handlers may be used together 525 Note that termination handlers and resumption handlers may be used together 494 526 in a single try statement, intermixing @catch@ and @catchResume@ freely. 495 527 Each type of handler only interacts with exceptions from the matching 496 528 kind of raise. 529 Like @catch@ clauses, @catchResume@ clauses have no effect if an exception 530 is not raised. 531 532 The matching rules are exactly the same as well. 533 The first major difference here is that after 534 @EXCEPTION_TYPE@$_i$ is matched and @NAME@$_i$ is bound to the exception, 535 @HANDLER_BLOCK@$_i$ is executed right away without first unwinding the stack. 536 After the block has finished running control jumps to the raise site, where 537 the just handled exception came from, and continues executing after it, 538 not after the try statement. 497 539 498 540 \subsubsection{Resumption Marking} … … 502 544 and run, its try block (the guarded statements) and every try statement 503 545 searched before it are still on the stack. There presence can lead to 504 the \emph{recursive resumption problem}. 546 the recursive resumption problem. 547 \todo{Is there a citation for the recursive resumption problem?} 505 548 506 549 The recursive resumption problem is any situation where a resumption handler … … 516 559 When this code is executed, the guarded @throwResume@ starts a 517 560 search and matches the handler in the @catchResume@ clause. This 518 call is placed on the stack above the try-block. Now the second raise in the handler 519 searches the same try block, matches, and puts another instance of the 561 call is placed on the stack above the try-block. 
562 Now the second raise in the handler searches the same try block, 563 matches again and then puts another instance of the 520 564 same handler on the stack leading to infinite recursion. 521 565 522 While this situation is trivial and easy to avoid, much more complex cycles can 523 form with multiple handlers and different exception types. The key point is 524 that the programmer's intuition expects every raise in a handler to start 525 searching \emph{below} the @try@ statement, making it difficult to understand 526 and fix the problem. 527 566 While this situation is trivial and easy to avoid, much more complex cycles 567 can form with multiple handlers and different exception types. 528 568 To prevent all of these cases, each try statement is ``marked" from the 529 time the exception search reaches it to either when a matching handler530 completesor when the search reaches the base569 time the exception search reaches it to either when a handler completes 570 handling that exception or when the search reaches the base 531 571 of the stack. 532 572 While a try statement is marked, its handlers are never matched, effectively … … 540 580 for instance, marking just the handlers that caught the exception, 541 581 would also prevent recursive resumption. 542 However, the rule selected mirrors what happens with termination,543 and hence, matches programmer intuition that a raise searches below a try.544 545 In detail, the marked try statements are the ones that would be removed from582 However, the rules selected mirrors what happens with termination, 583 so this reduces the amount of rules and patterns a programmer has to know. 584 585 The marked try statements are the ones that would be removed from 546 586 the stack for a termination exception, \ie those on the stack 547 587 between the handler and the raise statement. … … 609 649 610 650 \subsection{Comparison with Reraising} 611 Without conditional catch, the only approach to match in more detail is to reraise 612 the exception after it has been caught, if it could not be handled. 651 In languages without conditional catch, that is no ability to match an 652 exception based on something other than its type, it can be mimicked 653 by matching all exceptions of the right type, checking any additional 654 conditions inside the handler and re-raising the exception if it does not 655 match those. 656 657 Here is a minimal example comparing both patterns, using @throw;@ 658 (no argument) to start a re-raise. 613 659 \begin{center} 614 \begin{tabular}{l |l}660 \begin{tabular}{l r} 615 661 \begin{cfa} 616 662 try { 617 618 } catch(excep _t * ex; can_handle(ex)) {619 620 handle(ex);621 622 623 624 } 663 do_work_may_throw(); 664 } catch(exception_t * exc ; 665 can_handle(exc)) { 666 handle(exc); 667 } 668 669 670 625 671 \end{cfa} 626 672 & 627 673 \begin{cfa} 628 674 try { 629 do_work_may_throw(); 630 } catch(excep_t * ex) { 631 if (can_handle(ex)) { 632 handle(ex); 675 do_work_may_throw(); 676 } catch(exception_t * exc) { 677 if (can_handle(exc)) { 678 handle(exc); 679 } else { 680 throw; 681 } 682 } 683 \end{cfa} 684 \end{tabular} 685 \end{center} 686 At first glance catch-and-reraise may appear to just be a quality of life 687 feature, but there are some significant differences between the two 688 stratagies. 689 690 A simple difference that is more important for \CFA than many other languages 691 is that the raise site changes, with a re-raise but does not with a 692 conditional catch. 
693 This is important in \CFA because control returns to the raise site to run 694 the per-site default handler. Because of this only a conditional catch can 695 allow the original raise to continue. 696 697 The more complex issue comes from the difference in how conditional 698 catches and re-raises handle multiple handlers attached to a single try 699 statement. A conditional catch will continue checking later handlers while 700 a re-raise will skip them. 701 If the different handlers could handle some of the same exceptions, 702 translating a try statement that uses one to use the other can quickly 703 become non-trivial: 704 705 \noindent 706 Original, with conditional catch: 707 \begin{cfa} 708 ... 709 } catch (an_exception * e ; check_a(e)) { 710 handle_a(e); 711 } catch (exception_t * e ; check_b(e)) { 712 handle_b(e); 713 } 714 \end{cfa} 715 Translated, with re-raise: 716 \begin{cfa} 717 ... 718 } catch (exception_t * e) { 719 an_exception * an_e = (virtual an_exception *)e; 720 if (an_e && check_a(an_e)) { 721 handle_a(an_e); 722 } else if (check_b(e)) { 723 handle_b(e); 633 724 } else { 634 725 throw; … … 636 727 } 637 728 \end{cfa} 638 \end{tabular} 639 \end{center} 640 Notice catch-and-reraise increases complexity by adding additional data and 641 code to the exception process. Nevertheless, catch-and-reraise can simulate 642 conditional catch straightforwardly, when exceptions are disjoint, \ie no 643 inheritance. 644 645 However, catch-and-reraise simulation becomes unusable for exception inheritance. 646 \begin{flushleft} 647 \begin{cfa}[xleftmargin=6pt] 648 exception E1; 649 exception E2(E1); // inheritance 650 \end{cfa} 651 \begin{tabular}{l|l} 652 \begin{cfa} 653 try { 654 ... foo(); ... // raise E1/E2 655 ... bar(); ... // raise E1/E2 656 } catch( E2 e; e.rtn == foo ) { 657 ... 658 } catch( E1 e; e.rtn == foo ) { 659 ... 660 } catch( E1 e; e.rtn == bar ) { 661 ... 662 } 663 664 \end{cfa} 665 & 666 \begin{cfa} 667 try { 668 ... foo(); ... 669 ... bar(); ... 670 } catch( E2 e ) { 671 if ( e.rtn == foo ) { ... 672 } else throw; // reraise 673 } catch( E1 e ) { 674 if (e.rtn == foo) { ... 675 } else if (e.rtn == bar) { ... 676 else throw; // reraise 677 } 678 \end{cfa} 679 \end{tabular} 680 \end{flushleft} 681 The derived exception @E2@ must be ordered first in the catch list, otherwise 682 the base exception @E1@ catches both exceptions. In the catch-and-reraise code 683 (right), the @E2@ handler catches exceptions from both @foo@ and 684 @bar@. However, the reraise misses the following catch clause. To fix this 685 problem, an enclosing @try@ statement is need to catch @E2@ for @bar@ from the 686 reraise, and its handler must duplicate the inner handler code for @bar@. To 687 generalize, this fix for any amount of inheritance and complexity of try 688 statement requires a technique called \emph{try-block 689 splitting}~\cite{Krischer02}, which is not discussed in this thesis. It is 690 sufficient to state that conditional catch is more expressive than 691 catch-and-reraise in terms of complexity. 692 693 \begin{comment} 694 That is, they have the same behaviour in isolation. 695 Two things can expose differences between these cases. 696 697 One is the existence of multiple handlers on a single try statement. 698 A reraise skips all later handlers for a try statement but a conditional 699 catch does not. 700 % Hence, if an earlier handler contains a reraise later handlers are 701 % implicitly skipped, with a conditional catch they are not. 
702 Still, they are equivalently powerful, 703 both can be used two mimic the behaviour of the other, 704 as reraise can pack arbitrary code in the handler and conditional catches 705 can put arbitrary code in the predicate. 706 % I was struggling with a long explanation about some simple solutions, 707 % like repeating a condition on later handlers, and the general solution of 708 % merging everything together. I don't think it is useful though unless its 709 % for a proof. 710 % https://en.cppreference.com/w/cpp/language/throw 711 712 The question then becomes ``Which is a better default?" 713 We believe that not skipping possibly useful handlers is a better default. 714 If a handler can handle an exception it should and if the handler can not 715 handle the exception then it is probably safer to have that explicitly 716 described in the handler itself instead of implicitly described by its 717 ordering with other handlers. 718 % Or you could just alter the semantics of the throw statement. The handler 719 % index is in the exception so you could use it to know where to start 720 % searching from in the current try statement. 721 % No place for the `goto else;` metaphor. 722 723 The other issue is all of the discussion above assumes that the only 724 way to tell apart two raises is the exception being raised and the remaining 725 search path. 726 This is not true generally, the current state of the stack can matter in 727 a number of cases, even only for a stack trace after an program abort. 728 But \CFA has a much more significant need of the rest of the stack, the 729 default handlers for both termination and resumption. 730 731 % For resumption it turns out it is possible continue a raise after the 732 % exception has been caught, as if it hadn't been caught in the first place. 733 This becomes a problem combined with the stack unwinding used in termination 734 exception handling. 735 The stack is unwound before the handler is installed, and hence before any 736 reraises can run. So if a reraise happens the previous stack is gone, 737 the place on the stack where the default handler was supposed to run is gone, 738 if the default handler was a local function it may have been unwound too. 739 There is no reasonable way to restore that information, so the reraise has 740 to be considered as a new raise. 741 This is the strongest advantage conditional catches have over reraising, 742 they happen before stack unwinding and avoid this problem. 743 744 % The one possible disadvantage of conditional catch is that it runs user 745 % code during the exception search. While this is a new place that user code 746 % can be run destructors and finally clauses are already run during the stack 747 % unwinding. 729 (There is a simpler solution if @handle_a@ never raises exceptions, 730 using nested try statements.) 
731 732 % } catch (an_exception * e ; check_a(e)) { 733 % handle_a(e); 734 % } catch (exception_t * e ; !(virtual an_exception *)e && check_b(e)) { 735 % handle_b(e); 736 % } 748 737 % 749 % https://www.cplusplus.com/reference/exception/current_exception/ 750 % `exception_ptr current_exception() noexcept;` 751 % https://www.python.org/dev/peps/pep-0343/ 752 \end{comment} 738 % } catch (an_exception * e) 739 % if (check_a(e)) { 740 % handle_a(e); 741 % } else throw; 742 % } catch (exception_t * e) 743 % if (check_b(e)) { 744 % handle_b(e); 745 % } else throw; 746 % } 747 In similar simple examples translating from re-raise to conditional catch 748 takes less code but it does not have a general trivial solution either. 749 750 So, given that the two patterns do not trivially translate into each other, 751 it becomes a matter of which on should be encouraged and made the default. 752 From the premise that if a handler that could handle an exception then it 753 should, it follows that checking as many handlers as possible is preferred. 754 So conditional catch and checking later handlers is a good default. 753 755 754 756 \section{Finally Clauses} … … 766 768 The @FINALLY_BLOCK@ is executed when the try statement is removed from the 767 769 stack, including when the @GUARDED_BLOCK@ finishes, any termination handler 768 finishes ,or during an unwind.770 finishes or during an unwind. 769 771 The only time the block is not executed is if the program is exited before 770 772 the stack is unwound. … … 786 788 they have their own strengths, similar to top-level function and lambda 787 789 functions with closures. 788 Destructors take more work for their creation, but if there is clean-up code790 Destructors take more work to create, but if there is clean-up code 789 791 that needs to be run every time a type is used, they are much easier 790 to set-up .792 to set-up for each use. % It's automatic. 791 793 On the other hand finally clauses capture the local context, so is easy to 792 794 use when the clean-up is not dependent on the type of a variable or requires … … 804 806 raise, this exception is not used in matching only to pass information about 805 807 the cause of the cancellation. 806 Final y, since a cancellation only unwinds and forwards, there is no default handler.808 Finally, as no handler is provided, there is no default handler. 807 809 808 810 After @cancel_stack@ is called the exception is copied into the EHM's memory … … 815 817 After the main stack is unwound there is a program-level abort. 816 818 817 The reasons for this semantics in a sequential program is that there is no more code to execute.818 This semantics also applies to concurrent programs, too, even if threads are running.819 That is, if any threads starts a cancellation, it implies all threads terminate. 820 Keeping the same behaviour in sequential and concurrent programs is simple.821 Also, even in concurrent programs there may not currently be any other stacks 822 and even if other stacks do exist, main has no way to know where they are.819 The first reason for this behaviour is for sequential programs where there 820 is only one stack, and hence to stack to pass information to. 821 Second, even in concurrent programs, the main stack has no dependency 822 on another stack and no reliable way to find another living stack. 823 Finally, keeping the same behaviour in both sequential and concurrent 824 programs is simple and easy to understand. 
823 825 824 826 \paragraph{Thread Stack} … … 850 852 851 853 With explicit join and a default handler that triggers a cancellation, it is 852 possible to cascade an error across any number of threads, cleaning up each 854 possible to cascade an error across any number of threads, 855 alternating between the resumption (possibly termination) and cancellation, 856 cleaning up each 853 857 in turn, until the error is handled or the main thread is reached. 854 858 … … 863 867 caller's context and passes it to the internal report. 864 868 865 A coroutine only knows of two other coroutines, its starter and its last resumer. 869 A coroutine only knows of two other coroutines, 870 its starter and its last resumer. 866 871 The starter has a much more distant connection, while the last resumer just 867 872 (in terms of coroutine state) called resume on this coroutine, so the message … … 869 874 870 875 With a default handler that triggers a cancellation, it is possible to 871 cascade an error across any number of coroutines, cleaning up each in turn, 876 cascade an error across any number of coroutines, 877 alternating between the resumption (possibly termination) and cancellation, 878 cleaning up each in turn, 872 879 until the error is handled or a thread stack is reached. 873 874 \PAB{Part of this I do not understand. A cancellation cannot be caught. But you875 talk about handling a cancellation in the last sentence. Which is correct?} -
doc/theses/andrew_beach_MMath/future.tex
r1d402be reaeca5f 2 2 \label{c:future} 3 3 4 The following discussion covers both missing language features that affected my 5 work and research based improvements. 4 The following discussion covers both possible interesting research 5 that could follow from this work as long as simple implementation 6 improvements. 6 7 7 8 \section{Language Improvements} … … 9 10 \CFA is a developing programming language. As such, there are partially or 10 11 unimplemented features (including several broken components) 11 that I had to workaround while building an EHM largely in 12 the \CFA language (some C components). The following are a few of these 13 issues, and once implemented/fixed, how they would affect the exception system. 12 that I had to workaround while building the EHM largely in 13 the \CFA language (some C components). Below are a few of these issues 14 and how implementing/fixing them would affect the EHM. 15 In addition there are some simple improvements that had no interesting 16 research attached to them but would make using the language easier. 14 17 \begin{itemize} 15 \item16 The implementation of termination is not portable because it includes17 hand-crafted assembly statements for each architecture, where the18 ARM processor was just added.19 % The existing compilers cannot translate that for other platforms and those20 % sections must be ported by hand to21 Supporting more hardware architectures in a general way is important.22 18 \item 23 19 Due to a type-system problem, the catch clause cannot bind the exception to a … … 29 25 @return@, \etc. The reason is that current code generation hoists a handler 30 26 into a nested function for convenience (versus assemble-code generation at the 31 @try@ statement). Hence, when the handler runs, its can access local variable 32 in the lexical scope of the @try@ statement, but the closure does not capture 33 local control-flow points so it cannot perform non-local transfers in the 34 hoisted function. 27 try statement). Hence, when the handler runs, it can still access local 28 variables in the lexical scope of the try statement. Still, it does mean 29 that seemingly local control flow is not in fact local and crosses a function 30 boundary. 31 Making the termination handlers code within the surrounding 32 function would remove this limitation. 33 % Try blocks are much more difficult to do practically (requires our own 34 % assembly) and resumption handlers have some theoretical complexity. 35 35 \item 36 36 There is no detection of colliding unwinds. It is possible for clean-up code 37 37 run during an unwind to trigger another unwind that escapes the clean-up code 38 38 itself; such as a termination exception caught further down the stack or a 39 cancellation. There do exist ways to handle this case, but currently there is no40 detection and the first unwind is simplyforgotten, often leaving39 cancellation. There do exist ways to handle this case, but currently there is 40 no detection and the first unwind will simply be forgotten, often leaving 41 41 it in a bad state. 42 42 \item 43 Finally, the exception system has not ha ve a lotprogrammer testing.43 Finally, the exception system has not had a lot of programmer testing. 44 44 More time with encouraged usage will reveal new 45 45 quality of life upgrades that can be made. … … 50 50 project, but was thrust upon it to do exception inheritance; hence, only 51 51 minimal work is done. A draft for a complete virtual system is available but 52 not finalized. 52 not finalized. 
A future \CFA project is to complete that work and then 53 53 update the exception system that uses the current version. 54 54 … … 61 61 types to allow traits to refer to types not listed in their header. This 62 62 feature allows exception traits to not refer to the virtual-table type 63 explicitly. %, removing the need for the current interface macros. 63 explicitly, removing the need for the current interface macros, 64 such as @EHM_IS_EXCEPTION@. 64 65 65 66 \section{Additional Raises} … … 77 78 Non-local/concurrent raise requires more 78 79 coordination between the concurrency system 79 and the exception system. Many of the interesting design decisions cent re80 and the exception system. Many of the interesting design decisions center 80 81 around masking, \ie controlling which exceptions may be thrown at a stack. It 81 82 would likely require more of the virtual system and would also effect how … … 97 98 exception signature. An exception signature must declare all checked 98 99 exceptions that could propagate from the function, either because they were 99 raised inside the function or a call toa sub-function. This improves safety100 raised inside the function or came from a sub-function. This improves safety 100 101 by making sure every checked exception is either handled or consciously 101 102 passed on. … … 133 134 Workarounds are possible but awkward. Ideally an extension to libunwind could 134 135 be made, but that would either require separate maintenance or gaining enough 135 support to have it folded into the code base.136 support to have it folded into the official library itself. 136 137 137 138 Also new techniques to skip previously searched parts of the stack need to be -
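To make the exception-signature checking discussed above concrete, the following is a purely illustrative sketch of what a checked signature might look like; the @throws@ clause and the names used here are hypothetical and are not syntax proposed by this thesis or implemented in \CFA:
\begin{cfa}
// Hypothetical illustration only: the declaration lists every exception the
// function is allowed to let propagate, so a static checker could verify that
// each raise is either handled locally or appears in the caller's own list.
double parse_number( const char * text ) throws( parse_error, overflow_error );
\end{cfa}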
TabularUnified doc/theses/andrew_beach_MMath/implement.tex ¶
r1d402be reaeca5f 14 14 \label{s:VirtualSystem} 15 15 % Virtual table rules. Virtual tables, the pointer to them and the cast. 16 While the \CFA virtual system currently has only one public feature, virtual 17 cast (see the virtual cast feature \vpageref{p:VirtualCast}), 18 substantial structure is required to support it, 16 While the \CFA virtual system currently has only one public features, virtual 17 cast and virtual tables, 18 % ??? refs (see the virtual cast feature \vpageref{p:VirtualCast}), 19 substantial structure is required to support them, 19 20 and provide features for exception handling and the standard library. 20 21 21 22 \subsection{Virtual Type} 22 A virtual type~(see \autoref{s:Virtuals}) has a pointer to a virtual table, 23 called the \emph{virtual-table pointer}, which binds an instance of a virtual 24 type to a virtual table. Internally, the field is called \snake{virtual_table} 25 and is fixed after construction. This pointer is also the table's id and how 26 the system accesses the virtual table and the virtual members there. It is 27 always the first field in the structure so that its location is always known. 28 \todo{Talk about constructors for virtual types (after they are working).} 23 A virtual type~(see \autoref{s:virtuals}) has a pointer to a virtual table, 24 called the \emph{virtual-table pointer}, 25 which binds each instance of a virtual type to a virtual table. 26 Internally, the field is called \snake{virtual_table} 27 and is fixed after construction. 28 This pointer is also the table's id and how the system accesses the 29 virtual table and the virtual members there. 30 It is always the first field in the 31 structure so that its location is always known. 32 33 % We have no special rules for these constructors. 34 Virtual table pointers are passed to the constructors of virtual types 35 as part of field-by-field construction. 29 36 30 37 \subsection{Type Id} 31 Every virtual type needs a unique id, so that type ids can be compared for 32 equality, which checks if the types representation are the same, or used to 33 access the type's type information. Here, uniqueness means within a program 34 composed of multiple translation units (TU), not uniqueness across all 35 programs. 36 37 One approach for program uniqueness is declaring a static declaration for each 38 type id, where the runtime storage address of that variable is guaranteed to be 39 unique during program execution. The type id storage can also be used for other 40 purposes. 38 Every virtual type has a unique id. 39 These are used in type equality, to check if the representation of two values 40 are the same, and to access the type's type information. 41 This uniqueness means across a program composed of multiple translation 42 units (TU), not uniqueness across all programs or even across multiple 43 processes on the same machine. 44 45 Our approach for program uniqueness is using a static declaration for each 46 type id, where the run-time storage address of that variable is guaranteed to 47 be unique during program execution. 48 The type id storage can also be used for other purposes, 49 and is used for type information. 41 50 42 51 The problem is that a type id may appear in multiple TUs that compose a 43 program, see \autoref{ss:VirtualTable}; hence in each TU, it must be declared 44 as external to prevent multiple definitions. However, the type id must actually 45 be declared in one of the TUs so the linker creates the storage. 
Hence, the
46 problem becomes designating one TU to insert an actual type-id declaration. But
47 the \CFA compiler does not know the set of the translation units that compose a
48 program, because TUs can be compile separately, followed by a separate link
49 step.
50
51 The solution is to mimic a \CFA feature in \Cpp{17}, @inline@ variables and
52 function:
53 \begin{quote}
54 There may be more than one definition of an inline function or variable (since
55 \Cpp{17} in the program as long as each definition appears in a different
56 translation unit and (for non-static inline functions and variables (since
57 \Cpp{17})) all definitions are identical. For example, an inline function or an
58 inline variable (since \Cpp{17}) may be defined in a header file that is
59 @#include@'d in multiple source files.~\cite{C++17}
60 \end{quote}
61 The underlying mechanism to provide this capability is attribute
62 \begin{cfa}
63 section(".gnu.linkonce.NAME")
64 \end{cfa}
65 where @NAME@ is the variable/function name duplicated in each TU. The linker than
66 provides the service of generating a single declaration (instance) across all
67 TUs, even if a program is linked incrementally.
68
69 C does not support this feature for @inline@, and hence, neither does \CFA.
70 Again, rather than implement a new @inline@ extension for \CFA, a temporary
71 solution for the exception handling is to add the following in \CFA.
72 \begin{lstlisting}[language=CFA,{moredelim=**[is][\color{red}]{@}{@}}]
73 @__attribute__((cfa_linkonce))@ void f() {}
74 \end{lstlisting}
75 which becomes
76 \begin{lstlisting}[language=CFA,{moredelim=**[is][\color{red}]{@}{@}}]
77 __attribute__((section(".gnu.linkonce._X1fFv___1"))) void @_X1fFv___1@(){}
78 \end{lstlisting}
79 where @NAME@ from above is the \CFA mangled variable/function name. Note,
80 adding this feature is necessary because, when using macros, the mangled name
81 is unavailable. This attribute is useful for purposes other than exception
82 handling, and should eventually be rolled into @inline@ processing in \CFA.
83
84 Finally, a type id's data implements a pointers to the type's type information
85 instance. Dereferencing the pointer gets the type information.
86
87 \subsection{Implementation}
88
52 program (see \autoref{ss:VirtualTable}); so the initial solution would seem
53 to be to make it external in each translation unit. However, the type id must
54 have a declaration in (exactly) one of the TUs to create the storage.
55 No other declaration related to the virtual type has this property, so doing
56 this through standard C declarations would require the user to do it manually.
57
58 Instead the linker is used to handle this problem.
59 % I did not base anything off of C++17; they are solving the same problem.
60 A new feature has been added to \CFA for this purpose, the special attribute
61 \snake{cfa_linkonce}, which uses the special section @.gnu.linkonce@.
62 When used as a prefix (\eg @.gnu.linkonce.example@) the linker does
63 not combine these sections, but instead discards all but one with the same
64 full name.
65
66 So each type id must be given a unique section name with the linkonce
67 prefix. Luckily \CFA already has a way to get unique names, the name mangler.
68 For example, this could be written directly in \CFA: 69 \begin{cfa} 70 __attribute__((cfa_linkonce)) void f() {} 71 \end{cfa} 72 This is translated to: 73 \begin{cfa} 74 __attribute__((section(".gnu.linkonce._X1fFv___1"))) void _X1fFv___1() {} 75 \end{cfa} 76 This is done internally to access the name manglers. 77 This attribute is useful for other purposes, any other place a unique 78 instance required, and should eventually be made part of a public and 79 stable feature in \CFA. 80 81 \subsection{Type Information} 82 83 There is data stored at the type id's declaration, the type information. 89 84 The type information currently is only the parent's type id or, if the 90 85 type has no parent, the null pointer. … … 103 98 \end{cfa} 104 99 105 T he type information is constructed as follows:100 Type information is constructed as follows: 106 101 \begin{enumerate} 107 102 \item … … 124 119 \item 125 120 \CFA's name mangler does its regular name mangling encoding the type of 126 the declaration into the instance name. This process gives a program unique name 121 the declaration into the instance name. 122 This process gives a completely unique name 127 123 including different instances of the same polymorphic type. 128 124 \end{enumerate} 129 125 \todo{The list is making me realize, some of this isn't ordered.} 130 126 127 Writing that code manually, with helper macros for the early name mangling, 128 would look like this: 129 \begin{cfa} 130 struct INFO_TYPE(TYPE) { 131 INFO_TYPE(PARENT) const * parent; 132 }; 133 134 __attribute__((cfa_linkonce)) 135 INFO_TYPE(TYPE) const INFO_NAME(TYPE) = { 136 &INFO_NAME(PARENT), 137 }; 138 \end{cfa} 131 139 132 140 \begin{comment} … … 158 166 and the other is discarded. 159 167 \end{comment} 160 161 168 162 169 \subsection{Virtual Table} … … 191 198 The first and second sections together mean that every virtual table has a 192 199 prefix that has the same layout and types as its parent virtual table. 193 This, combined with the fixed offset to the virtual -table pointer, means that200 This, combined with the fixed offset to the virtual table pointer, means that 194 201 for any virtual type, it is always safe to access its virtual table and, 195 202 from there, it is safe to check the type id to identify the exact type of the … … 209 216 type's alignment, is set using an @alignof@ expression. 210 217 218 Most of these tools are already inside the compiler. Using the is a simple 219 code transformation early on in compilation allows most of that work to be 220 handed off to the existing tools. \autoref{f:VirtualTableTransformation} 221 shows an example transformation, this example shows an exception virtual table. 222 It also shows the transformation on the full declaration, 223 for a forward declaration the @extern@ keyword is preserved and the 224 initializer is not added. 225 226 \begin{figure}[htb] 227 \begin{cfa} 228 vtable(example_type) example_name; 229 \end{cfa} 230 \transformline 231 % Check mangling. 
232 \begin{cfa} 233 const struct example_type_vtable example_name = { 234 .__cfavir_typeid : &__cfatid_example_type, 235 .size : sizeof(example_type), 236 .copy : copy, 237 .^?{} : ^?{}, 238 .msg : msg, 239 }; 240 \end{cfa} 241 \caption{Virtual Table Transformation} 242 \label{f:VirtualTableTransformation} 243 \end{figure} 244 211 245 \subsection{Concurrency Integration} 212 246 Coroutines and threads need instances of @CoroutineCancelled@ and … … 218 252 These transformations are shown through code re-writing in 219 253 \autoref{f:CoroutineTypeTransformation} and 220 \autoref{f:CoroutineMainTransformation} for a coroutine and a thread is similar. 221 In both cases, the original declaration is not modified, only new ones are 222 added. 223 224 \begin{figure} 254 \autoref{f:CoroutineMainTransformation}. 255 Threads use the same pattern, with some names and types changed. 256 In both cases, the original declaration is not modified, 257 only new ones are added. 258 259 \begin{figure}[htb] 225 260 \begin{cfa} 226 261 coroutine Example { … … 242 277 \caption{Coroutine Type Transformation} 243 278 \label{f:CoroutineTypeTransformation} 244 %\end{figure} 245 246 \bigskip 247 248 %\begin{figure} 279 \end{figure} 280 281 \begin{figure}[htb] 249 282 \begin{cfa} 250 283 void main(Example & this) { … … 277 310 \begin{cfa} 278 311 void * __cfa__virtual_cast( 279 struct __cfavir_type_td parent, 280 struct __cfavir_type_id const * child ); 281 \end{cfa} 282 The type id for the target type of the virtual cast is passed in as @parent@ and 312 struct __cfavir_type_id * parent, 313 struct __cfavir_type_id * const * child ); 314 \end{cfa} 315 The type id for the target type of the virtual cast is passed in as 316 @parent@ and 283 317 the cast target is passed in as @child@. 284 318 The generated C code wraps both arguments and the result with type casts. … … 294 328 295 329 \section{Exceptions} 296 \todo{Anything about exception construction.} 330 % The implementation of exception types. 331 332 Creating exceptions can roughly divided into two parts, 333 the exceptions themselves and the virtual system interactions. 334 335 Creating an exception type is just a matter of preppending the field 336 with the virtual table pointer to the list of the fields 337 (see \autoref{f:ExceptionTypeTransformation}). 338 339 \begin{figure}[htb] 340 \begin{cfa} 341 exception new_exception { 342 // EXISTING FIELDS 343 }; 344 \end{cfa} 345 \transformline 346 \begin{cfa} 347 struct new_exception { 348 struct new_exception_vtable const * virtual_table; 349 // EXISTING FIELDS 350 }; 351 \end{cfa} 352 \caption{Exception Type Transformation} 353 \label{f:ExceptionTypeTransformation} 354 \end{figure} 355 356 The integration between exceptions and the virtual system is a bit more 357 complex simply because of the nature of the virtual system prototype. 358 The primary issue is that the virtual system has no way to detect when it 359 should generate any of its internal types and data. This is handled by 360 the exception code, which tells the virtual system when to generate 361 its components. 362 363 All types associated with a virtual type, 364 the types of the virtual table and the type id, 365 are generated when the virtual type (the exception) is first found. 366 The type id (the instance) is generated with the exception if it is 367 a monomorphic type. 368 However if the exception is polymorphic then a different type id has to 369 be generated for every instance. 
In this case generation is delayed 370 until a virtual table is created. 371 % There are actually some problems with this, which is why it is not used 372 % for monomorphic types. 373 When a virtual table is created and initialized two functions are created 374 to fill in the list of virtual members. 375 The first is a copy function which adapts the exception's copy constructor 376 to work with pointers, avoiding some issues with the current copy constructor 377 interface. 378 Second is the msg function, which returns a C-string with the type's name, 379 including any polymorphic parameters. 297 380 298 381 \section{Unwinding} … … 308 391 stack. On function entry and return, unwinding is handled directly by the 309 392 call/return code embedded in the function. 310 \PAB{Meaning: In many cases, the position of the instruction pointer (relative to parameter 311 and local declarations) is enough to know the current size of the stack 312 frame.} 313 393 394 % Discussing normal stack unwinding: 314 395 Usually, the stack-frame size is known statically based on parameter and 315 396 local variable declarations. Even for a dynamic stack-size, the information … … 319 400 bumping the hardware stack-pointer up or down as needed. 320 401 Constructing/destructing values within a stack frame has 321 a similar complexity but larger constants, which takes longer. 322 402 a similar complexity but larger constants. 403 404 % Discussing multiple frame stack unwinding: 323 405 Unwinding across multiple stack frames is more complex because that 324 406 information is no longer contained within the current function. 325 With separate compilation a function does not know its callers nor their frame size. 326 In general, the caller's frame size is embedded only at the functions entry (push 327 stack) and exit (pop stack). 328 Without altering the main code path it is also hard to pass that work off 329 to the caller. 407 With seperate compilation, 408 a function does not know its callers nor their frame layout. 409 Even using the return address, that information is encoded in terms of 410 actions in code, intermixed with the actions required finish the function. 411 Without changing the main code path it is impossible to select one of those 412 two groups of actions at the return site. 330 413 331 414 The traditional unwinding mechanism for C is implemented by saving a snap-shot … … 340 423 many languages define clean-up actions that must be taken when certain 341 424 sections of the stack are removed. Such as when the storage for a variable 342 is removed from the stack (destructor call) or when a try statement with a finally clause is 425 is removed from the stack, possibly requiring a destructor call, 426 or when a try statement with a finally clause is 343 427 (conceptually) popped from the stack. 344 428 None of these cases should be handled by the user --- that would contradict the … … 383 467 In plain C (which \CFA currently compiles down to) this 384 468 flag only handles the cleanup attribute: 469 %\label{code:cleanup} 385 470 \begin{cfa} 386 471 void clean_up( int * var ) { ... } … … 394 479 395 480 To get full unwinding support, all of these features must be handled directly 396 in assembly and assembler directives; parti cularly the cfi directives481 in assembly and assembler directives; partiularly the cfi directives 397 482 \snake{.cfi_lsda} and \snake{.cfi_personality}. 398 483 … … 529 614 needs its own exception context. 
530 615 531 An exception context isretrieved by calling the function616 The current exception context should be retrieved by calling the function 532 617 \snake{this_exception_context}. 533 618 For sequential execution, this function is defined as … … 658 743 function. The LSDA in particular is hard to mimic in generated C code. 659 744 660 The workaround is a function called @__cfaehm_try_terminate@ in the standard661 \CFA library. The contents of a try block and the termination handlers are converted 662 into nested functions. These are then passed to the try terminate function and it 663 calls them, appropriately.745 The workaround is a function called \snake{__cfaehm_try_terminate} in the 746 standard \CFA library. The contents of a try block and the termination 747 handlers are converted into nested functions. These are then passed to the 748 try terminate function and it calls them, appropriately. 664 749 Because this function is known and fixed (and not an arbitrary function that 665 750 happens to contain a try statement), its LSDA can be generated ahead 666 751 of time. 667 752 668 Both the LSDA and the personality function for @__cfaehm_try_terminate@ are set ahead of time using 753 Both the LSDA and the personality function for \snake{__cfaehm_try_terminate} 754 are set ahead of time using 669 755 embedded assembly. This assembly code is handcrafted using C @asm@ statements 670 756 and contains 671 enough information for asingle try statement the function represents.757 enough information for the single try statement the function represents. 672 758 673 759 The three functions passed to try terminate are: … … 681 767 decides if a catch clause matches the termination exception. It is constructed 682 768 from the conditional part of each handler and runs each check, top to bottom, 683 in turn, first checking to see if the exception type matches. 684 The match is performed in two steps, first a virtual cast is used to see 685 if the raised exception is an instance of the declared exception or one of 686 its descendant type, and then is the condition true, if present. 687 It takes a pointer to the exception and returns 0 if the 769 in turn, to see if the exception matches this handler. 770 The match is performed in two steps, first a virtual cast is used to check 771 if the raised exception is an instance of the declared exception type or 772 one of its descendant types, and then the condition is evaluated, if 773 present. 774 The match function takes a pointer to the exception and returns 0 if the 688 775 exception is not handled here. Otherwise the return value is the id of the 689 776 handler that matches the exception. … … 698 785 All three functions are created with GCC nested functions. GCC nested functions 699 786 can be used to create closures, 700 in other words, functions that can refer to their lexical scope in other 701 functions on the stack when called. This approach allows the functions to refer to all the 787 in other words, 788 functions that can refer to variables in their lexical scope even 789 those variables are part of a different function. 790 This approach allows the functions to refer to all the 702 791 variables in scope for the function containing the @try@ statement. These 703 792 nested functions and all other functions besides @__cfaehm_try_terminate@ in … … 786 875 the operation finishes, otherwise the search continues to the next node. 
787 876 If the search reaches the end of the list without finding a try statement 788 that can handle the exception, the default handler is executed and the 789 operation finishes, unless it throws an exception. 877 with a handler clause 878 that can handle the exception, the default handler is executed. 879 If the default handler returns, control continues after the raise statement. 790 880 791 881 Each node has a handler function that does most of the work. … … 797 887 If no match is found the function returns false. 798 888 The match is performed in two steps, first a virtual cast is used to see 799 if the raised exception is an instance of the declared exception or one of 800 its descendant type, and then is the condition true, if present. 801 \PAB{I don't understand this sentence. 802 This ordering gives the type guarantee used in the predicate.} 889 if the raised exception is an instance of the declared exception type or one 890 of its descendant types, if so then it is passed to the custom predicate 891 if one is defined. 892 % You need to make sure the type is correct before running the predicate 893 % because the predicate can depend on that. 803 894 804 895 \autoref{f:ResumptionTransformation} shows the pattern used to transform 805 a \CFA try statement with catch clauses into the appropr iate C functions.896 a \CFA try statement with catch clauses into the approprate C functions. 806 897 \todo{Explain the Resumption Transformation figure.} 807 898 … … 852 943 (see \vpageref{s:ResumptionMarking}), which ignores parts of 853 944 the stack 854 already examined, and is accomplished by updating the front of the list as the 855 search continues. Before the handler is called at a matching node, the head of the list 945 already examined, and is accomplished by updating the front of the list as 946 the search continues. 947 Before the handler is called at a matching node, the head of the list 856 948 is updated to the next node of the current node. After the search is complete, 857 949 successful or not, the head of the list is reset. … … 890 982 \section{Finally} 891 983 % Uses destructors and GCC nested functions. 892 \autoref{f:FinallyTransformation} shows the pattern used to transform a \CFA 893 try statement with finally clause into the appropriate C functions. 894 The finally clause is placed into a GCC nested-function 895 with a unique name, and no arguments or return values. This nested function is 984 985 %\autoref{code:cleanup} 986 A finally clause is handled by converting it into a once-off destructor. 987 The code inside the clause is placed into GCC nested-function 988 with a unique name, and no arguments or return values. 989 This nested function is 896 990 then set as the cleanup function of an empty object that is declared at the 897 beginning of a block placed around the context of the associated @try@898 statement .991 beginning of a block placed around the context of the associated try 992 statement (see \autoref{f:FinallyTransformation}). 899 993 900 994 \begin{figure} … … 919 1013 // TRY BLOCK 920 1014 } 921 922 1015 } 923 1016 \end{cfa} … … 927 1020 \end{figure} 928 1021 929 The rest is handled by GCC. The try block and all handlers are inside this 930 block. At completion, control exits the block and the empty object is cleaned 1022 The rest is handled by GCC. 1023 The TRY BLOCK 1024 contains the try block itself as well as all code generated for handlers. 
1025 Once that code has completed, 1026 control exits the block and the empty object is cleaned 931 1027 up, which runs the function that contains the finally code. 932 1028 … … 939 1035 940 1036 The first step of cancellation is to find the cancelled stack and its type: 941 coroutine, thread ,or main thread.1037 coroutine, thread or main thread. 942 1038 In \CFA, a thread (the construct the user works with) is a user-level thread 943 1039 (point of execution) paired with a coroutine, the thread's main coroutine. 944 1040 The thread library also stores pointers to the main thread and the current 945 coroutine.1041 thread. 946 1042 If the current thread's main and current coroutines are the same then the 947 1043 current stack is a thread stack, otherwise it is a coroutine stack. -
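As a rough illustration of the finally pattern described above, here is a minimal plain-C sketch using the GCC cleanup attribute together with a GCC nested function; the names (@example@, @finally_body@, @finally_hook@) are invented for the example and do not correspond to the compiler's actual generated code:
\begin{cfa}
#include <stdio.h>

void example( void ) {
	int count = 3;                         // local state visible to the nested function
	void finally_body( char * hook ) {     // GCC nested function acting as a closure
		(void)hook;
		printf( "finally ran, count = %d\n", count );
	}
	{
		// Empty object whose cleanup runs the nested function whenever the
		// block is exited, normally or during unwinding.
		__attribute__((cleanup(finally_body))) char finally_hook = 0;
		printf( "try block body\n" );      // stands in for the try block contents
	}
}

int main( void ) {
	example();
	return 0;
}
\end{cfa}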
TabularUnified doc/theses/andrew_beach_MMath/intro.tex ¶
r1d402be reaeca5f 11 11 12 12 % Now take a step back and explain what exceptions are generally. 13 Exception handling provides dynamic inter-function control flow. 13 14 A language's EHM is a combination of language syntax and run-time 14 components that are used to construct, raise, and handle exceptions, 15 including all control flow. 16 Exceptions are an active mechanism for replacing passive error/return codes and return unions (Go and Rust). 17 Exception handling provides dynamic inter-function control flow. 15 components that construct, raise, propagate and handle exceptions, 16 to provide all of that control flow. 18 17 There are two forms of exception handling covered in this thesis: 19 18 termination, which acts as a multi-level return, 20 19 and resumption, which is a dynamic function call. 21 % PAB: Maybe this sentence was suppose to be deleted? 22 Termination handling is much more common, 23 to the extent that it is often seen as the only form of handling. 24 % PAB: I like this sentence better than the next sentence. 25 % This separation is uncommon because termination exception handling is so 26 % much more common that it is often assumed. 27 % WHY: Mention other forms of continuation and \cite{CommonLisp} here? 28 29 Exception handling relies on the concept of nested functions to create handlers that deal with exceptions. 20 % About other works: 21 Often, when this separation is not made, termination exceptions are assumed 22 as they are more common and may be the only form of handling provided in 23 a language. 24 25 All types of exception handling link a raise with a handler. 26 Both operations are usually language primitives, although raises can be 27 treated as a primitive function that takes an exception argument. 28 Handlers are more complex as they are added to and removed from the stack 29 during execution, must specify what they can handle and give the code to 30 handle the exception. 31 32 Exceptions work with different execution models but for the descriptions 33 that follow a simple call stack, with functions added and removed in a 34 first-in-last-out order, is assumed. 35 36 Termination exception handling searches the stack for the handler, then 37 unwinds the stack to where the handler was found before calling it. 38 The handler is run inside the function that defined it and when it finishes 39 it returns control to that function. 30 40 \begin{center} 31 \begin{tabular}[t]{ll} 32 \begin{lstlisting}[aboveskip=0pt,belowskip=0pt,language=CFA,{moredelim=**[is][\color{red}]{@}{@}}] 33 void f( void (*hp)() ) { 34 hp(); 35 } 36 void g( void (*hp)() ) { 37 f( hp ); 38 } 39 void h( int @i@, void (*hp)() ) { 40 void @handler@() { // nested 41 printf( "%d\n", @i@ ); 42 } 43 if ( i == 1 ) hp = handler; 44 if ( i > 0 ) h( i - 1, hp ); 45 else g( hp ); 46 } 47 h( 2, 0 ); 48 \end{lstlisting} 49 & 50 \raisebox{-0.5\totalheight}{\input{handler}} 51 \end{tabular} 41 \input{callreturn} 52 42 \end{center} 53 The nested function @handler@ in the second stack frame is explicitly passed to function @f@. 54 When this handler is called in @f@, it uses the parameter @i@ in the second stack frame, which is accessible by an implicit lexical-link pointer. 55 Setting @hp@ in @h@ at different points in the recursion, results in invoking a different handler. 56 Exception handling extends this idea by eliminating explicit handler passing, and instead, performing a stack search for a handler that matches some criteria (conditional dynamic call), and calls the handler at the top of the stack. 
57 It is the runtime search $O(N)$ that differentiates an EHM call (raise) from normal dynamic call $O(1)$ via a function or virtual-member pointer. 58 59 Termination exception handling searches the stack for a handler, unwinds the stack to the frame containing the matching handler, and calling the handler at the top of the stack. 60 \begin{center} 61 \input{termination} 62 \end{center} 63 Note, since the handler can reference variables in @h@, @h@ must remain on the stack for the handler call. 64 After the handler returns, control continues after the lexical location of the handler in @h@ (static return)~\cite[p.~108]{Tennent77}. 65 Unwinding allows recover to any previous 66 function on the stack, skipping any functions between it and the 67 function containing the matching handler. 68 69 Resumption exception handling searches the stack for a handler, does \emph{not} unwind the stack to the frame containing the matching handler, and calls the handler at the top of the stack. 43 44 Resumption exception handling searches the stack for a handler and then calls 45 it without removing any other stack frames. 46 The handler is run on top of the existing stack, often as a new function or 47 closure capturing the context in which the handler was defined. 48 After the handler has finished running it returns control to the function 49 that preformed the raise, usually starting after the raise. 70 50 \begin{center} 71 51 \input{resumption} 72 52 \end{center} 73 After the handler returns, control continues after the resume in @f@ (dynamic return).74 Not unwinding allows fix up of the problem in @f@ by any previous function on the stack, without disrupting the current set of stack frames.75 53 76 54 Although a powerful feature, exception handling tends to be complex to set up 77 55 and expensive to use 78 56 so it is often limited to unusual or ``exceptional" cases. 79 The classic example is error handling, where exceptions are used to80 remove error handling logic from the main execution path, while paying57 The classic example is error handling, exceptions can be used to 58 remove error handling logic from the main execution path, and pay 81 59 most of the cost only when the error actually occurs. 82 60 … … 88 66 some of the underlying tools used to implement and express exception handling 89 67 in other languages are absent in \CFA. 90 Still the resulting basicsyntax resembles that of other languages:91 \begin{ lstlisting}[language=CFA,{moredelim=**[is][\color{red}]{@}{@}}]92 @try@{68 Still the resulting syntax resembles that of other languages: 69 \begin{cfa} 70 try { 93 71 ... 94 72 T * object = malloc(request_size); 95 73 if (!object) { 96 @throw@OutOfMemory{fixed_allocation, request_size};74 throw OutOfMemory{fixed_allocation, request_size}; 97 75 } 98 76 ... 99 } @catch@(OutOfMemory * error) {77 } catch (OutOfMemory * error) { 100 78 ... 101 79 } 102 \end{ lstlisting}80 \end{cfa} 103 81 % A note that yes, that was a very fast overview. 104 82 The design and implementation of all of \CFA's EHM's features are … … 107 85 108 86 % The current state of the project and what it contributes. 109 The majority of the \CFA EHM is implemented in \CFA, except for a small amount of assembler code. 110 In addition, 111 a suite of tests and performance benchmarks were created as part of this project. 112 The \CFA implementation techniques are generally applicable in other programming 87 All of these features have been implemented in \CFA, 88 covering both changes to the compiler and the run-time. 
89 In addition, a suite of test cases and performance benchmarks were created 90 along side the implementation. 91 The implementation techniques are generally applicable in other programming 113 92 languages and much of the design is as well. 114 Some parts of the EHM use features unique to \CFA, and hence, 115 are harder to replicate in other programming languages. 116 % Talk about other programming languages. 117 Three well known programming languages with EHMs, %/exception handling 118 C++, Java and Python are examined in the performance work. However, these languages focus on termination 119 exceptions, so there is no comparison with resumption. 93 Some parts of the EHM use other features unique to \CFA and would be 94 harder to replicate in other programming languages. 120 95 121 96 The contributions of this work are: 122 97 \begin{enumerate} 123 98 \item Designing \CFA's exception handling mechanism, adapting designs from 124 other programming languages ,and creating new features.125 \item Implementing stack unwinding forthe \CFA EHM, including updating126 the \CFA compiler and run-time environment to generate and execute the EHM code.127 \item Design ing and implementinga prototype virtual system.99 other programming languages and creating new features. 100 \item Implementing stack unwinding and the \CFA EHM, including updating 101 the \CFA compiler and the run-time environment. 102 \item Designed and implemented a prototype virtual system. 128 103 % I think the virtual system and per-call site default handlers are the only 129 104 % "new" features, everything else is a matter of implementation. 130 \item Creating tests and performance benchmarks to compare with EHM's in other languages. 105 \item Creating tests to check the behaviour of the EHM. 106 \item Creating benchmarks to check the performances of the EHM, 107 as compared to other languages. 131 108 \end{enumerate} 132 109 133 %\todo{I can't figure out a good lead-in to the roadmap.} 134 The thesis is organization as follows.135 The next section and parts of \autoref{c:existing} cover existing EHMs.136 New \CFAEHM features are introduced in \autoref{c:features},110 The rest of this thesis is organized as follows. 111 The current state of exceptions is covered in \autoref{s:background}. 112 The existing state of \CFA is also covered in \autoref{c:existing}. 113 New EHM features are introduced in \autoref{c:features}, 137 114 covering their usage and design. 138 115 That is followed by the implementation of these features in 139 116 \autoref{c:implement}. 140 Performance results are presented in \autoref{c:performance}. 141 Summing up and possibilities for extending this project are discussed in \autoref{c:future}. 117 Performance results are examined in \autoref{c:performance}. 118 Possibilities to extend this project are discussed in \autoref{c:future}. 119 Finally, the project is summarized in \autoref{c:conclusion}. 142 120 143 121 \section{Background} 144 122 \label{s:background} 145 123 146 Exception handling is a well examined areain programming languages,147 with papers on the subject dating back the 70s~\cite{Goodenough75}.124 Exception handling has been examined before in programming languages, 125 with papers on the subject dating back 70s.\cite{Goodenough75} 148 126 Early exceptions were often treated as signals, which carried no information 149 except their identity. Ada ~\cite{Ada} still uses this system.127 except their identity. 
Ada still uses this system.\todo{cite Ada} 150 128 151 129 The modern flag-ship for termination exceptions is \Cpp, 152 130 which added them in its first major wave of non-object-orientated features 153 131 in 1990. 154 % https://en.cppreference.com/w/cpp/language/history 155 While many EHMs have special exception types, 156 \Cpp has the ability to use any type as an exception. 157 However, this generality is not particularly useful, and has been pushed aside for classes, with a convention of inheriting from 132 \todo{cite https://en.cppreference.com/w/cpp/language/history} 133 Many EHMs have special exception types, 134 however \Cpp has the ability to use any type as an exception. 135 These were found to be not very useful and have been pushed aside for classes 136 inheriting from 158 137 \code{C++}{std::exception}. 159 While \Cpp has a special catch-all syntax @catch(...)@, there is no way to discriminate its exception type, so nothing can 160 be done with the caught value because nothing is known about it. 161 Instead the base exception-type \code{C++}{std::exception} is defined with common functionality (such as 162 the ability to print a message when the exception is raised but not caught) and all 138 Although there is a special catch-all syntax (@catch(...)@) there are no 139 operations that can be performed on the caught value, not even type inspection. 140 Instead the base exception-type \code{C++}{std::exception} defines common 141 functionality (such as 142 the ability to describe the reason the exception was raised) and all 163 143 exceptions have this functionality. 164 Having a root exception-type seems to be the standard now, as the guaranteed functionality is worth 165 any lost in flexibility from limiting exceptions types to classes. 166 167 Java~\cite{Java} was the next popular language to use exceptions. 168 Its exception system largely reflects that of \Cpp, except it requires 169 exceptions to be a subtype of \code{Java}{java.lang.Throwable} 144 That trade-off, restricting usable types to gain guaranteed functionality, 145 is almost universal now, as without some common functionality it is almost 146 impossible to actually handle any errors. 147 148 Java was the next popular language to use exceptions. \todo{cite Java} 149 Its exception system largely reflects that of \Cpp, except that requires 150 you throw a child type of \code{Java}{java.lang.Throwable} 170 151 and it uses checked exceptions. 171 Checked exceptions are part of a function's interface defining all exceptions it or its called functions raise. 172 Using this information, it is possible to statically verify if a handler exists for all raised exception, \ie no uncaught exceptions. 173 Making exception information explicit, improves clarity and 174 safety, but can slow down programming. 175 For example, programming complexity increases when dealing with high-order methods or an overly specified 176 throws clause. However some of the issues are more 177 programming annoyances, such as writing/updating many exception signatures after adding or remove calls. 178 Java programmers have developed multiple programming ``hacks'' to circumvent checked exceptions negating the robustness it is suppose to provide. 179 For example, the ``catch-and-ignore" pattern, where the handler is empty because the exception does not appear relevant to the programmer versus 180 repairing or recovering from the exception. 152 Checked exceptions are part of a function's interface, 153 the exception signature of the function. 
154 Every exception that could be raised from a function, either directly or
155 because it is not handled from a called function, is given.
156 Using this information, it is possible to statically verify if any given
157 exception is handled and guarantee that no exception will go unhandled.
158 Making exception information explicit improves clarity and safety,
159 but can slow down or restrict programming.
160 For example, programming high-order functions becomes much more complex
161 if the argument functions could raise exceptions.
162 However, as odd as it may seem, the worst problems are rooted in the simple
163 inconvenience of writing and updating exception signatures.
164 This has caused Java programmers to develop multiple programming ``hacks''
165 to circumvent checked exceptions, negating their advantages.
166 One particularly problematic example is the ``catch-and-ignore'' pattern,
167 where an empty handler is used to handle an exception without doing any
168 recovery or repair. In theory that could be good enough to properly handle
169 the exception, but more often it is used to ignore an exception that the
170 programmer does not feel is worth the effort of handling, for instance if
171 they do not believe it will ever be raised.
172 If they are incorrect the exception will be silenced, while in a similar
173 situation with unchecked exceptions the exception would at least activate
174 the language's unhandled exception code (usually program abort with an
175 error message).
181 176
182 177 %\subsection
183 178 Resumption exceptions are less popular,
184 although resumption is as old as termination;
185 hence, few
179 although resumption is as old as termination; hence, few
186 180 programming languages have implemented them.
187 181 % http://bitsavers.informatik.uni-stuttgart.de/pdf/xerox/parc/techReports/
188 182 % CSL-79-3_Mesa_Language_Manual_Version_5.0.pdf
189 Mesa~\cite{Mesa} is one programming languages that did. Experience with Mesa
190 is quoted as being one of the reasons resumptions are not
183 Mesa is one programming language that did.\todo{cite Mesa} Experience with Mesa
184 is quoted as being one of the reasons resumptions were not
191 185 included in the \Cpp standard.
192 186 % https://en.wikipedia.org/wiki/Exception_handling
193 As a result, resumption has ignored in main-stream programming languages.
194 However, ``what goes around comes around'' and resumption is being revisited now (like user-level threading).
195 While rejecting resumption might have been the right decision in the past, there are decades
196 of developments in computer science that have changed the situation.
197 Some of these developments, such as functional programming's resumption
198 equivalent, algebraic effects\cite{Zhang19}, are enjoying significant success.
199 A complete reexamination of resumptions is beyond this thesis, but their re-emergence is
200 enough to try them in \CFA.
187 Since then resumptions have been ignored in main-stream programming languages.
188 However, resumption is being revisited in the context of decades of other
189 developments in programming languages.
190 While rejecting resumption may have been the right decision in the past,
191 the situation has changed since then.
192 Some developments, such as the functional programming equivalent to resumptions,
193 algebraic effects\cite{Zhang19}, are enjoying success.
194 A complete reexamination of resumptions is beyond this thesis,
195 but their reemergence is enough to try them in \CFA.
201 196 % Especially considering how much easier they are to implement than 202 % termination exceptions .203 204 %\subsection 205 Functional languages tend to use other solutions for their primary EHM,206 but exception-like constructs still appear.207 Termination appears in error construct, which marks the result of an208 expression as an error; the reafter, the result of any expression that tries to use it is also an209 error, and so on until an appropriate handler is reached.197 % termination exceptions and how much Peter likes them. 198 199 %\subsection 200 Functional languages tend to use other solutions for their primary error 201 handling mechanism, but exception-like constructs still appear. 202 Termination appears in the error construct, which marks the result of an 203 expression as an error; then the result of any expression that tries to use 204 it also results in an error, and so on until an appropriate handler is reached. 210 205 Resumption appears in algebraic effects, where a function dispatches its 211 206 side-effects to its caller for handling. 212 207 213 208 %\subsection 214 Some programming languages have moved to a restricted kind of EHM 215 called``panic".216 In Rust ~\cite{Rust}, a panic is just a program level abort that may be implemented by217 unwinding the stack like in termination exception handling. 209 More recently exceptions seem to be vanishing from newer programming 210 languages, replaced by ``panic". 211 In Rust, a panic is just a program level abort that may be implemented by 212 unwinding the stack like in termination exception handling.\todo{cite Rust} 218 213 % https://doc.rust-lang.org/std/panic/fn.catch_unwind.html 219 In Go~\cite{Go}, a panicis very similar to a termination, except it only supports214 Go's panic through is very similar to a termination, except it only supports 220 215 a catch-all by calling \code{Go}{recover()}, simplifying the interface at 221 the cost of flexibility. 216 the cost of flexibility.\todo{cite Go} 222 217 223 218 %\subsection 224 219 While exception handling's most common use cases are in error handling, 225 here are other ways to handle errors with comparisons toexceptions.220 here are some other ways to handle errors with comparisons with exceptions. 226 221 \begin{itemize} 227 222 \item\emph{Error Codes}: 228 This pattern has a function return an enumeration (or just a set of fixed values) to indicate 229 if an error occurred and possibly which error it was. 230 231 Error codes mix exceptional and normal values, artificially enlarging the type and/or value range. 232 Some languages address this issue by returning multiple values or a tuple, separating the error code from the function result. 233 However, the main issue with error codes is forgetting to checking them, 223 This pattern has a function return an enumeration (or just a set of fixed 224 values) to indicate if an error has occurred and possibly which error it was. 225 226 Error codes mix exceptional/error and normal values, enlarging the range of 227 possible return values. This can be addressed with multiple return values 228 (or a tuple) or a tagged union. 229 However, the main issue with error codes is forgetting to check them, 234 230 which leads to an error being quietly and implicitly ignored. 235 Some new languages have tools that issue warnings, if the error code is 236 discarded to avoid this problem. 
237 Checking error codes also results in bloating the main execution path, especially if an error is not dealt with locally and has to be cascaded down the call stack to a higher-level function.. 231 Some new languages and tools will try to issue warnings when an error code 232 is discarded to avoid this problem. 233 Checking error codes also bloats the main execution path, 234 especially if the error is not handled immediately hand has to be passed 235 through multiple functions before it is addressed. 238 236 239 237 \item\emph{Special Return with Global Store}: 240 Some functions only return a boolean indicating success or failure 241 and store the exact reason for the error in a fixed global location. 242 For example, many C routines return non-zero or -1, indicating success or failure, 243 and write error details into the C standard variable @errno@. 244 245 This approach avoids the multiple results issue encountered with straight error codes 246 but otherwise has many (if not more) of the disadvantages. 247 For example, everything that uses the global location must agree on all possible errors and global variable are unsafe with concurrency. 238 Similar to the error codes pattern but the function itself only returns 239 that there was an error 240 and store the reason for the error in a fixed global location. 241 For example many routines in the C standard library will only return some 242 error value (such as -1 or a null pointer) and the error code is written into 243 the standard variable @errno@. 244 245 This approach avoids the multiple results issue encountered with straight 246 error codes but otherwise has the same disadvantages and more. 247 Every function that reads or writes to the global store must agree on all 248 possible errors and managing it becomes more complex with concurrency. 248 249 249 250 \item\emph{Return Union}: … … 254 255 so that one type can be used everywhere in error handling code. 255 256 256 This pattern is very popular in functional or any semi-functional language with257 primitive support for tagged unions (or algebraic data types).258 % We need listing Rust/rust to format code snip its from it.257 This pattern is very popular in any functional or semi-functional language 258 with primitive support for tagged unions (or algebraic data types). 259 % We need listing Rust/rust to format code snippets from it. 259 260 % Rust's \code{rust}{Result<T, E>} 260 The main advantage is providing for more information about an261 error , other than one of a fix-set of ids.262 While some languages use checked union access to force error-code checking, 263 it is still possible to bypass the checking. 264 The main disadvantage is again significant error code on the main execution path and cascading through called functions.261 The main advantage is that an arbitrary object can be used to represent an 262 error so it can include a lot more information than a simple error code. 263 The disadvantages include that the it does have to be checked along the main 264 execution and if there aren't primitive tagged unions proper usage can be 265 hard to enforce. 265 266 266 267 \item\emph{Handler Functions}: 267 This pattern implicitly associates functions with errors.268 On error, the function that produced the error implicitlycalls another function to268 This pattern associates errors with functions. 269 On error, the function that produced the error calls another function to 269 270 handle it. 
270 271 The handler function can be provided locally (passed in as an argument, 271 272 either directly as as a field of a structure/object) or globally (a global 272 273 variable). 273 C++ uses this approach as its fallback system if exception handling fails, \eg 274 \snake{std::terminate_handler} and for a time \snake{std::unexpected_handler} 275 276 Handler functions work a lot like resumption exceptions, without the dynamic handler search. 277 Therefore, setting setting up the handler can be more complex/expensive, especially if the handle must be passed through multiple function calls, but cheaper to call $O(1)$, and hence, 278 are more suited to frequent exceptional situations. 279 % The exception being global handlers if they are rarely change as the time 280 % in both cases shrinks towards zero. 274 C++ uses this approach as its fallback system if exception handling fails, 275 such as \snake{std::terminate_handler} and, for a time, 276 \snake{std::unexpected_handler}. 277 278 Handler functions work a lot like resumption exceptions, 279 but without the dynamic search for a handler. 280 Since setting up the handler can be more complex/expensive, 281 especially when the handler has to be passed through multiple layers of 282 function calls, but cheaper (constant time) to call, 283 they are more suited to more frequent (less exceptional) situations. 281 284 \end{itemize} 282 285 283 286 %\subsection 284 287 Because of their cost, exceptions are rarely used for hot paths of execution. 285 Therefore, there is an element of self-fulfilling prophecy for implementation 286 techniques to make exceptions cheap to set-up at the cost 287 of expensive usage. 288 This cost differential is less important in higher-level scripting languages, where use of exceptions for other tasks is more common. 289 An iconic example is Python's @StopIteration@ exception that is thrown by 290 an iterator to indicate that it is exhausted, especially when combined with Python's heavy 291 use of the iterator-based for-loop. 288 Hence, there is an element of self-fulfilling prophecy as implementation 289 techniques have been focused on making them cheap to set-up, 290 happily making them expensive to use in exchange. 291 This difference is less important in higher-level scripting languages, 292 where using exception for other tasks is more common. 293 An iconic example is Python's \code{Python}{StopIteration} exception that 294 is thrown by an iterator to indicate that it is exhausted. 295 When paired with Python's iterator-based for-loop this will be thrown every 296 time the end of the loop is reached. 297 \todo{Cite Python StopIteration and for-each loop.} 292 298 % https://docs.python.org/3/library/exceptions.html#StopIteration -
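As a small, self-contained C sketch of two of the patterns surveyed above, the following contrasts an error code paired with the @errno@ global store against an explicitly passed handler (fixup) function; the function names and values are illustrative only:
\begin{cfa}
#include <errno.h>
#include <stdio.h>

// Error-code pattern: the caller must remember to check the return value;
// the reason for the failure is left in the global errno.
int read_config( const char * path ) {
	(void)path;
	errno = ENOENT;                        // stand-in for a real failure
	return -1;
}

// Handler-function pattern: the caller passes a fixup routine that is called
// at the point of the error, much like a resumption handler but without any
// dynamic search for the handler.
int parse_int( const char * text, int (*on_error)( const char * ) ) {
	return on_error( text );               // stand-in: always "fails" and asks for a fixup
}

int default_value( const char * text ) { (void)text; return 0; }

int main( void ) {
	if ( read_config( "app.cfg" ) == -1 ) perror( "read_config" );
	int n = parse_int( "not a number", default_value );
	printf( "fixed-up value: %d\n", n );
	return 0;
}
\end{cfa}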
TabularUnified doc/theses/andrew_beach_MMath/performance.tex ¶
r1d402be reaeca5f 11 11 Tests were run in \CFA, C++, Java and Python.
12 12 In addition there are two sets of tests for \CFA,
13 one for termination and on e for resumption exceptions.
13 one for termination and one with resumption.
14 14
15 15 C++ is the most comparable language because both it and \CFA use the same
16 16 framework, libunwind.
17 In fact, the comparison is almost entirely a quality of implementation.
17 In fact, the comparison is almost entirely in quality of implementation.
18 18 Specifically, \CFA's EHM has had significantly less time to be optimized and
19 19 does not generate its own assembly. It does have a slight advantage in that
20 there are some features it handles directly instead of through utility functions,
20 \Cpp has to do some extra bookkeeping to support its utility functions,
21 21 but otherwise \Cpp should have a significant advantage.
22 22
23 Java is a popular language with similar termination semantics, but
23 Java is a popular language with similar termination semantics, but
24 24 it is implemented in a very different environment, a virtual machine with
25 25 garbage collection.
26 It also implements the @finally@ clause on @try@ blocks allowing for a direct
26 It also implements the finally clause on try blocks allowing for a direct
27 27 feature-to-feature comparison.
28 As with \Cpp, Java's implementation is mature, optimized
29 and has extra features.
28 As with \Cpp, Java's implementation is mature, has more optimizations
29 and extra features as compared to \CFA.
30 30
31 31 Python is used as an alternative comparison because of the \CFA EHM's
32 current performance goals, which is not to be prohibitively slow while the
32 current performance goals, which is to not be prohibitively slow while the
33 33 features are designed and examined. Python has similar performance goals for
34 34 creating quick scripts and its wide use suggests it has achieved those goals.
35 35
… …
37 37 resumption exceptions. Even the older programming languages with resumption
38 38 seem to be notable only for having resumption.
39 So instead, resumption is compared to its simulation in other programming
40 languages using fixup functions that are explicitly passed for correction or
41 logging purposes.
42 % So instead, resumption is compared to a less similar but much more familiar
43 %feature, termination exceptions.
39 Instead, resumption is compared to its simulation in other programming
40 languages: fixup functions that are explicitly passed into a function.
44 41
45 42 All tests are run inside a main loop that repeatedly performs a test.
46 43 This approach avoids start-up or tear-down time from
47 44 affecting the timing results.
48 Each test is run a N times (configurable from the command line).
45 The number of times the loop is run is configurable from the command line,
46 the number used in the timing runs is given with the results per test.
47 Tests ran their main loop a million times.
49 48 The Java tests runs the main loop 1000 times before
50 49 beginning the actual test to ``warm-up" the JVM.
50 % All other languages are precompiled or interpreted.
51 51
52 52 Timing is done internally, with time measured immediately before and
… …
59 59
60 60 The exceptions used in these tests are always based off of
61 a base exception. This requirement minimizes performance differences based
61 62 the base exception for the language.
62 63 This requirement minimizes performance differences based
62 63 on the object model used to represent the exception.
63 64 … … 66 67 For example, empty inline assembly blocks are used in \CFA and \Cpp to 67 68 prevent excessive optimizations while adding no actual work. 68 Each test was run eleven times. The top three and bottom three results were 69 discarded and the remaining five values are averaged. 70 71 The tests are compiled with gcc-10 for \CFA and g++-10 for \Cpp. Java is 72 compiled with version 11.0.11. Python with version 3.8. The tests were run on: 73 \begin{itemize}[nosep] 74 \item 75 ARM 2280 Kunpeng 920 48-core 2$\times$socket \lstinline{@} 2.6 GHz running Linux v5.11.0-25 76 \item 77 AMD 6380 Abu Dhabi 16-core 4$\times$socket \lstinline{@} 2.5 GHz running Linux v5.11.0-25 78 \end{itemize} 79 Two kinds of hardware architecture allows discriminating any implementation and 80 architectural effects. 81 82 69 83 70 % We don't use catch-alls but if we did: 84 71 % Catch-alls are done by catching the root exception type (not using \Cpp's 85 72 % \code{C++}{catch(...)}). 73 74 When collecting data, each test is run eleven times. The top three and bottom 75 three results are discarded and the remaining five values are averaged. 76 The tests are run with the latest (still pre-release) \CFA compiler, 77 using gcc-10 as a backend. 78 g++-10 is used for \Cpp. 79 Java tests are compiled and run with version 11.0.11. 80 Python used version 3.8. 81 The machines used to run the tests are: 82 \todo{Get patch versions for python, gcc and g++.} 83 \begin{itemize}[nosep] 84 \item ARM 2280 Kunpeng 920 48-core 2$\times$socket 85 \lstinline{@} 2.6 GHz running Linux v5.11.0-25 86 \item AMD 6380 Abu Dhabi 16-core 4$\times$socket 87 \lstinline{@} 2.5 GHz running Linux v5.11.0-25 88 \end{itemize} 89 These machines represent the two major families of hardware architecture. 86 90 87 91 \section{Tests} … … 90 94 They should provide a guide as to where the EHM's costs are found. 91 95 92 \paragraph{Raise and Handle} 93 This group measures the cost of a try statement when exceptions are raised and 94 the stack is unwound (termination) or not unwound (resumption). Each test has 95 has a repeating function like the following 96 \begin{lstlisting}[language=CFA,{moredelim=**[is][\color{red}]{@}{@}}] 96 \paragraph{Stack Traversal} 97 This group measures the cost of traversing the stack 98 (and, in termination, unwinding it). 99 Inside the main loop is a call to a recursive function. 100 This function calls itself F times before raising an exception. 101 F is configurable from the command line, but is usually 100. 102 This builds up many stack frames, and any contents they may have, 103 before the raise. 104 The exception is always handled at the base of the stack. 105 For example, the Empty test for \CFA resumption looks like: 106 \begin{cfa} 97 107 void unwind_empty(unsigned int frames) { 98 108 if (frames) { 99 @unwind_empty(frames - 1);@ // AUGMENTED IN OTHER EXPERIMENTS 100 } else throw (empty_exception){&empty_vt}; 101 } 102 \end{lstlisting} 103 which is called N times, where each call recurses to a depth of R (configurable from the command line), an 104 exception is raised, the stack is a unwound, and the exception caught. 105 \begin{itemize}[nosep] 106 \item Empty: 107 For termination, this test measures the cost of raising (stack walking) an 108 exception through empty stack frames from the bottom of the recursion to an 109 empty handler, and unwinding the stack. (see above code) 110 111 \medskip 112 For resumption, this test measures the same raising cost but does not unwind 113 the stack.
For languages without resumption, a fixup function is to the bottom 114 of the recursion and called to simulate a fixup operation at that point. 115 \begin{cfa} 116 void nounwind_fixup(unsigned int frames, void (*raised_rtn)(int &)) { 117 if (frames) { 118 nounwind_fixup(frames - 1, raised_rtn); 109 unwind_empty(frames - 1); 119 110 } else { 120 int fixup = 17; 121 raised_rtn(fixup); 111 throwResume (empty_exception){&empty_vt}; 122 112 } 123 113 } 124 114 \end{cfa} 125 where the passed fixup function is: 126 \begin{cfa} 127 void raised(int & fixup) { 128 fixup = 42; 129 } 130 \end{cfa} 131 For comparison, a \CFA version passing a function is also included. 115 Other test cases have additional code around the recursive call add 116 something besides simple stack frames to the stack. 117 Note that both termination and resumption will have to traverse over 118 the stack but only termination has to unwind it. 119 \begin{itemize}[nosep] 120 % \item None: 121 % Reuses the empty test code (see below) except that the number of frames 122 % is set to 0 (this is the only test for which the number of frames is not 123 % 100). This isolates the start-up and shut-down time of a throw. 124 \item Empty: 125 The repeating function is empty except for the necessary control code. 126 As other traversal tests add to this, so it is the baseline for the group 127 as the cost comes from traversing over and unwinding a stack frame 128 that has no other interactions with the exception system. 132 129 \item Destructor: 133 This test measures the cost of raising an exception through non-empty frames, 134 where each frame has an object requiring destruction, from the bottom of the 135 recursion to an empty handler. Hence, there are N destructor calls during 136 unwinding. 137 138 \medskip 139 This test is not meaningful for resumption because the stack is only unwound as 140 the recursion returns. 141 \begin{cfa} 142 WithDestructor object; 143 unwind_destructor(frames - 1); 144 \end{cfa} 130 The repeating function creates an object with a destructor before calling 131 itself. 132 Comparing this to the empty test gives the time to traverse over and/or 133 unwind a destructor. 145 134 \item Finally: 146 This test measures the cost of establishing a try block with an empty finally 147 clause on the front side of the recursion and running the empty finally clauses 148 during stack unwinding from the bottom of the recursion to an empty handler. 149 \begin{cfa} 150 try { 151 unwind_finally(frames - 1); 152 } finally {} 153 \end{cfa} 154 155 \medskip 156 This test is not meaningful for resumption because the stack is only unwound as 157 the recursion returns. 135 The repeating function calls itself inside a try block with a finally clause 136 attached. 137 Comparing this to the empty test gives the time to traverse over and/or 138 unwind a finally clause. 158 139 \item Other Handler: 159 For termination, this test is like the finally test but the try block has a 160 catch clause for an exception that is not raised, so catch matching is executed 161 during stack unwinding but the match never successes until the catch at the 162 bottom of the recursion. 163 \begin{cfa} 164 try { 165 unwind_other(frames - 1); 166 } catch (not_raised_exception *) {} 167 \end{cfa} 168 169 \medskip 170 For resumption, this test measures the same raising cost but does not unwind 171 the stack. For languages without resumption, the same fixup function is passed 172 and called. 
140 The repeating function calls itself inside a try block with a handler that 141 will not match the raised exception, but is of the same kind of handler. 142 This means that the EHM will have to check each handler, but will continue 143 over all of the until it reaches the base of the stack. 144 Comparing this to the empty test gives the time to traverse over and/or 145 unwind a handler. 173 146 \end{itemize} 174 147 175 \paragraph{Try/Handle/Finally Statement} 176 This group measures just the cost of executing a try statement so 177 \emph{there is no stack unwinding}. Hence, the program main loops N times 178 around: 179 \begin{cfa} 180 try { 181 } catch (not_raised_exception *) {} 182 \end{cfa} 148 \paragraph{Cross Try Statement} 149 This group of tests measures the cost setting up exception handling if it is 150 not used (because the exceptional case did not occur). 151 Tests repeatedly cross (enter and leave, execute) a try statement but never 152 preform a raise. 183 153 \begin{itemize}[nosep] 184 154 \item Handler: 185 The try statement has a handler ( catch/resume).155 The try statement has a handler (of the appropriate kind). 186 156 \item Finally: 187 157 The try statement has a finally clause. … … 191 161 This group measures the cost of conditional matching. 192 162 Only \CFA implements the language level conditional match, 193 the other languages mimic with an ``unconditional" match (it still194 checks the exception's type) and conditional re-raise if it is not suppose 163 the other languages mimic it with an ``unconditional" match (it still 164 checks the exception's type) and conditional re-raise if it is not supposed 195 165 to handle that exception. 196 \begin{center} 197 \begin{tabular}{ll} 198 \multicolumn{1}{c}{\CFA} & \multicolumn{1}{c}{\Cpp, Java, Python} \\ 166 167 There is the pattern shown in \CFA and \Cpp. Java and Python use the same 168 pattern as \Cpp, but with their own syntax. 169 170 \begin{minipage}{0.45\textwidth} 199 171 \begin{cfa} 200 172 try { 201 throw_exception(); 202 } catch (empty_exception * exc; 203 should_catch) { 173 ... 174 } catch (exception_t * e ; 175 should_catch(e)) { 176 ... 204 177 } 205 178 \end{cfa} 206 & 207 \begin{cfa} 179 \end{minipage} 180 \begin{minipage}{0.55\textwidth} 181 \begin{lstlisting}[language=C++] 208 182 try { 209 throw_exception(); 210 } catch (EmptyException & exc) { 211 if (!should_catch) throw; 183 ... 184 } catch (std::exception & e) { 185 if (!should_catch(e)) throw; 186 ... 212 187 } 213 \end{cfa} 214 \end{tabular} 215 \end{center} 188 \end{lstlisting} 189 \end{minipage} 216 190 \begin{itemize}[nosep] 217 191 \item Match All: … … 221 195 \end{itemize} 222 196 223 \medskip 224 \noindent 225 All omitted test code for other languages is functionally identical to the \CFA 226 tests or simulated, and available online~\cite{CforallExceptionBenchmarks}. 197 \paragraph{Resumption Simulation} 198 A slightly altered version of the Empty Traversal test is used when comparing 199 resumption to fix-up routines. 200 The handler, the actual resumption handler or the fix-up routine, 201 always captures a variable at the base of the loop, 202 and receives a reference to a variable at the raise site, either as a 203 field on the exception or an argument to the fix-up routine. 204 % I don't actually know why that is here but not anywhere else. 
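To make the fix-up simulation concrete, the following sketch restates the structure of the removed @nounwind_fixup@ example above as it applies to the altered Empty Traversal test: the routine is passed explicitly down the recursion and called at the point where the resumption version raises, receiving a reference to a variable at the raise site. The values 17 and 42 are placeholders carried over from that earlier example; the capture of a variable at the base of the benchmark loop is elided here for brevity.
\begin{cfa}
// Sketch of the fix-up routine used by languages without resumption.
// In the full test the routine would also update a variable captured at the
// base of the benchmark loop (via a closure or a global); that part is omitted.
void fixup(int & raise_site_var) {
	raise_site_var = 42;                       // stand-in for the work a resumption handler would do
}

void nounwind_fixup(unsigned int frames, void (*fixup_rtn)(int &)) {
	if (frames) {
		nounwind_fixup(frames - 1, fixup_rtn); // build the stack, explicitly passing the routine down
	} else {
		int raise_site_var = 17;               // variable at the would-be raise site
		fixup_rtn(raise_site_var);             // explicit call in place of throwResume
	}
}
\end{cfa}
In the timed loop, a call such as @nounwind_fixup(100, fixup)@ then plays the role of the recursion and raise in the resumption version.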
227 205 228 206 %\section{Cost in Size} … … 237 215 238 216 \section{Results} 239 One result not directly related to \CFA but important to keep in 240 mind is that, for exceptions, the standard intuition about which languages 241 should go faster often does not hold. For example, there are a few cases where Python out-performs 242 \CFA, \Cpp and Java. The most likely explanation is that, since exceptions are 243 rarely considered to be the common case, the more optimized languages 244 make that case expense. In addition, languages with high-level 245 representations have a much easier time scanning the stack as there is less 246 to decode. 247 248 Tables~\ref{t:PerformanceTermination} and~\ref{t:PerformanceResumption} show 249 the test results for termination and resumption, respectively. In cases where 250 a feature is not supported by a language, the test is skipped for that language 251 (marked N/A). For some Java experiments it was impossible to measure certain 252 effects because the JIT corrupted the test (marked N/C). No workaround was 253 possible~\cite{Dice21}. To get experiments in the range of 1--100 seconds, the 254 number of times an experiment is run (N) is varied (N is marked beside each 255 experiment, e.g., 1M $\Rightarrow$ 1 million test iterations). 256 257 An anomaly exists with gcc nested functions used as thunks for implementing 258 much of the \CFA EHM. If a nested-function closure captures local variables in 259 its lexical scope, performance dropped by a factor of 10. Specifically, in try 260 statements of the form: 261 \begin{cfa} 262 try { 263 unwind_other(frames - 1); 264 } catch (not_raised_exception *) {} 265 \end{cfa} 266 the try block is hoisted into a nested function and the variable @frames@ is 267 the local parameter to the recursive function, which triggers the anomaly. The 268 workaround is to remove the recursion parameter and make it a global variable 269 that is explicitly decremented outside of the try block (marked with a ``*''): 270 \begin{cfa} 271 frames -= 1; 272 try { 273 unwind_other(); 274 } catch (not_raised_exception *) {} 275 \end{cfa} 276 To make comparisons fair, a dummy parameter is added and the dummy value passed 277 in the recursion. Note, nested functions in gcc are rarely used (if not 278 completely unknown) and must follow the C calling convention, unlike \Cpp 279 lambdas, so it is not surprising if there are performance issues efficiently 280 capturing closures. 281 282 % Similarly, if a test does not change between resumption 283 % and termination in \CFA, then only one test is written and the result 284 % was put into the termination column. 285 286 % Raw Data: 287 % run-algol-a.sat 288 % --------------- 289 % Raise Empty & 82687046678 & 291616256 & 3252824847 & 15422937623 & 14736271114 \\ 290 % Raise D'tor & 219933199603 & 297897792 & 223602799362 & N/A & N/A \\ 291 % Raise Finally & 219703078448 & 298391745 & N/A & ... 
& 18923060958 \\ 292 % Raise Other & 296744104920 & 2854342084 & 112981255103 & 15475924808 & 21293137454 \\ 293 % Cross Handler & 9256648 & 13518430 & 769328 & 3486252 & 31790804 \\ 294 % Cross Finally & 769319 & N/A & N/A & 2272831 & 37491962 \\ 295 % Match All & 3654278402 & 47518560 & 3218907794 & 1296748192 & 624071886 \\ 296 % Match None & 4788861754 & 58418952 & 9458936430 & 1318065020 & 625200906 \\ 297 % 298 % run-algol-thr-c 299 % --------------- 300 % Raise Empty & 3757606400 & 36472972 & 3257803337 & 15439375452 & 14717808642 \\ 301 % Raise D'tor & 64546302019 & 102148375 & 223648121635 & N/A & N/A \\ 302 % Raise Finally & 64671359172 & 103285005 & N/A & 15442729458 & 18927008844 \\ 303 % Raise Other & 294143497130 & 2630130385 & 112969055576 & 15448220154 & 21279953424 \\ 304 % Cross Handler & 9646462 & 11955668 & 769328 & 3453707 & 31864074 \\ 305 % Cross Finally & 773412 & N/A & N/A & 2253825 & 37266476 \\ 306 % Match All & 3719462155 & 43294042 & 3223004977 & 1286054154 & 623887874 \\ 307 % Match None & 4971630929 & 55311709 & 9481225467 & 1310251289 & 623752624 \\ 308 % 309 % run-algol-04-a 310 % -------------- 311 % Raise Empty & 0.0 & 0.0 & 3250260945 & 0.0 & 0.0 \\ 312 % Raise D'tor & 0.0 & 0.0 & 29017675113 & N/A & N/A \\ 313 % Raise Finally & 0.0 & 0.0 & N/A & 0.0 & 0.0 \\ 314 % Raise Other & 0.0 & 0.0 & 24411823773 & 0.0 & 0.0 \\ 315 % Cross Handler & 0.0 & 0.0 & 769334 & 0.0 & 0.0 \\ 316 % Cross Finally & 0.0 & N/A & N/A & 0.0 & 0.0 \\ 317 % Match All & 0.0 & 0.0 & 3254283504 & 0.0 & 0.0 \\ 318 % Match None & 0.0 & 0.0 & 9476060146 & 0.0 & 0.0 \\ 319 320 % run-plg7a-a.sat 321 % --------------- 322 % Raise Empty & 57169011329 & 296612564 & 2788557155 & 17511466039 & 23324548496 \\ 323 % Raise D'tor & 150599858014 & 318443709 & 149651693682 & N/A & N/A \\ 324 % Raise Finally & 148223145000 & 373325807 & N/A & ... & 29074552998 \\ 325 % Raise Other & 189463708732 & 3017109322 & 85819281694 & 17584295487 & 32602686679 \\ 326 % Cross Handler & 8001654 & 13584858 & 1555995 & 6626775 & 41927358 \\ 327 % Cross Finally & 1002473 & N/A & N/A & 4554344 & 51114381 \\ 328 % Match All & 3162460860 & 37315018 & 2649464591 & 1523205769 & 742374509 \\ 329 % Match None & 4054773797 & 47052659 & 7759229131 & 1555373654 & 744656403 \\ 330 % 331 % run-plg7a-thr-a 332 % --------------- 333 % Raise Empty & 3604235388 & 29829965 & 2786931833 & 17576506385 & 23352975105 \\ 334 % Raise D'tor & 46552380948 & 178709605 & 149834207219 & N/A & N/A \\ 335 % Raise Finally & 46265157775 & 177906320 & N/A & 17493045092 & 29170962959 \\ 336 % Raise Other & 195659245764 & 2376968982 & 86070431924 & 17552979675 & 32501882918 \\ 337 % Cross Handler & 397031776 & 12503552 & 1451225 & 6658628 & 42304965 \\ 338 % Cross Finally & 1136746 & N/A & N/A & 4468799 & 46155817 \\ 339 % Match All & 3189512499 & 39124453 & 2667795989 & 1525889031 & 733785613 \\ 340 % Match None & 4094675477 & 48749857 & 7850618572 & 1566713577 & 733478963 \\ 341 % 342 % run-plg7a-04-a 343 % -------------- 344 % 0.0 are unfilled. 345 % Raise Empty & 0.0 & 0.0 & 2770781479 & 0.0 & 0.0 \\ 346 % Raise D'tor & 0.0 & 0.0 & 23530084907 & N/A & N/A \\ 347 % Raise Finally & 0.0 & 0.0 & N/A & 0.0 & 0.0 \\ 348 % Raise Other & 0.0 & 0.0 & 23816827982 & 0.0 & 0.0 \\ 349 % Cross Handler & 0.0 & 0.0 & 1422188 & 0.0 & 0.0 \\ 350 % Cross Finally & 0.0 & N/A & N/A & 0.0 & 0.0 \\ 351 % Match All & 0.0 & 0.0 & 2671989778 & 0.0 & 0.0 \\ 352 % Match None & 0.0 & 0.0 & 7829059869 & 0.0 & 0.0 \\ 353 354 \begin{table} 217 % First, introduce the tables. 
218 \autoref{t:PerformanceTermination}, 219 \autoref{t:PerformanceResumption} 220 and~\autoref{t:PerformanceFixupRoutines} 221 show the test results. 222 In cases where a feature is not supported by a language, the test is skipped 223 for that language and the result is marked N/A. 224 There are also cases where the feature is supported but measuring its 225 cost is impossible. This happened with Java, which uses a JIT that optimize 226 away the tests and it cannot be stopped.\cite{Dice21} 227 These tests are marked N/C. 228 To get results in a consistent range (1 second to 1 minute is ideal, 229 going higher is better than going low) N, the number of iterations of the 230 main loop in each test, is varied between tests. It is also given in the 231 results and usually have a value in the millions. 232 233 An anomaly in some results came from \CFA's use of gcc nested functions. 234 These nested functions are used to create closures that can access stack 235 variables in their lexical scope. 236 However, if they do so then they can cause the benchmark's run-time to 237 increase by an order of magnitude. 238 The simplest solution is to make those values global variables instead 239 of function local variables. 240 % Do we know if editing a global inside nested function is a problem? 241 Tests that had to be modified to avoid this problem have been marked 242 with a ``*'' in the results. 243 244 % Now come the tables themselves: 245 % You might need a wider window for this. 246 247 \begin{table}[htb] 355 248 \centering 356 \caption{ Performance Results Termination(sec)}249 \caption{Termination Performance Results (sec)} 357 250 \label{t:PerformanceTermination} 358 251 \begin{tabular}{|r|*{2}{|r r r r|}} 359 252 \hline 360 & \multicolumn{4}{c||}{AMD} & \multicolumn{4}{c|}{ARM}\\253 & \multicolumn{4}{c||}{AMD} & \multicolumn{4}{c|}{ARM} \\ 361 254 \cline{2-9} 362 N\hspace{8pt} 363 364 \hline 365 Throw Empty (1M) & 3.4 & 2.8 & 18.3 & 23.4 & 3.7 & 3.2 & 15.5 & 14.8\\366 Throw D'tor (1M) & 48.4 & 23.6 & N/A & N/A & 64.2 & 29.0 & N/A & N/A\\367 Throw Finally (1M) & 3.4* & N/A & 17.9 & 29.0 & 4.1* & N/A & 15.6 & 19.0\\368 Throw Other (1M) & 3.6* & 23.2 & 18.2 & 32.7 & 4.0* & 24.5 & 15.5 & 21.4\\369 Try/Catch (100M) & 6.0 & 0.9 & N/C & 37.4 & 10.0 & 0.8 & N/C & 32.2\\370 Try/Finally (100M) & 0.9 & N/A & N/C & 44.1 & 0.8 & N/A & N/C & 37.3\\371 Match All (10M) & 32.9 & 20.7 & 13.4 & 4.9 & 36.2 & 24.5 & 12.0 & 3.1\\372 Match None (10M) & 32.7 & 50.3 & 11.0 & 5.1 & 36.3 & 71.9 & 12.3 & 4.2\\255 N\hspace{8pt} & \multicolumn{1}{c}{\CFA} & \multicolumn{1}{c}{\Cpp} & \multicolumn{1}{c}{Java} & \multicolumn{1}{c||}{Python} & 256 \multicolumn{1}{c}{\CFA} & \multicolumn{1}{c}{\Cpp} & \multicolumn{1}{c}{Java} & \multicolumn{1}{c|}{Python} \\ 257 \hline 258 Empty Traversal (1M) & 3.4 & 2.8 & 18.3 & 23.4 & 3.7 & 3.2 & 15.5 & 14.8 \\ 259 D'tor Traversal (1M) & 48.4 & 23.6 & N/A & N/A & 64.2 & 29.0 & N/A & N/A \\ 260 Finally Traversal (1M) & 3.4* & N/A & 17.9 & 29.0 & 4.1* & N/A & 15.6 & 19.0 \\ 261 Other Traversal (1M) & 3.6* & 23.2 & 18.2 & 32.7 & 4.0* & 24.5 & 15.5 & 21.4 \\ 262 Cross Handler (100M) & 6.0 & 0.9 & N/C & 37.4 & 10.0 & 0.8 & N/C & 32.2 \\ 263 Cross Finally (100M) & 0.9 & N/A & N/C & 44.1 & 0.8 & N/A & N/C & 37.3 \\ 264 Match All (10M) & 32.9 & 20.7 & 13.4 & 4.9 & 36.2 & 24.5 & 12.0 & 3.1 \\ 265 Match None (10M) & 32.7 & 50.3 & 11.0 & 5.1 & 36.3 & 71.9 & 12.3 & 4.2 \\ 373 266 \hline 374 267 \end{tabular} 375 268 \end{table} 376 269 377 \begin{table} 270 \begin{table}[htb] 271 \centering 272 
\caption{Resumption Performance Results (sec)} 273 \label{t:PerformanceResumption} 274 \begin{tabular}{|r||r||r|} 275 \hline 276 N\hspace{8pt} 277 & AMD & ARM \\ 278 \hline 279 Empty Traversal (10M) & 0.2 & 0.3 \\ 280 D'tor Traversal (10M) & 1.8 & 1.0 \\ 281 Finally Traversal (10M) & 1.7 & 1.0 \\ 282 Other Traversal (10M) & 22.6 & 25.9 \\ 283 Cross Handler (100M) & 8.4 & 11.9 \\ 284 Match All (100M) & 2.3 & 3.2 \\ 285 Match None (100M) & 2.9 & 3.9 \\ 286 \hline 287 \end{tabular} 288 \end{table} 289 290 \begin{table}[htb] 378 291 \centering 379 292 \small 380 \caption{ Performance Results Resumption (sec)}381 \label{t:Performance Resumption}293 \caption{Resumption/Fixup Routine Comparison (sec)} 294 \label{t:PerformanceFixupRoutines} 382 295 \setlength{\tabcolsep}{5pt} 383 \begin{tabular}{|r|*{2}{|r r r r|}} 384 \hline 385 & \multicolumn{4}{c||}{AMD} & \multicolumn{4}{c|}{ARM} \\ 386 \cline{2-9} 387 N\hspace{8pt} & \multicolumn{1}{c}{\CFA (R/F)} & \multicolumn{1}{c}{\Cpp} & \multicolumn{1}{c}{Java} & \multicolumn{1}{c||}{Python} & 388 \multicolumn{1}{c}{\CFA (R/F)} & \multicolumn{1}{c}{\Cpp} & \multicolumn{1}{c}{Java} & \multicolumn{1}{c|}{Python} \\ 389 \hline 390 Resume Empty (10M) & 3.8/3.5 & 14.7 & 2.3 & 176.1 & 0.3/0.1 & 8.9 & 1.2 & 119.9 \\ 391 Resume Other (10M) & 4.0*/0.1* & 21.9 & 6.2 & 381.0 & 0.3*/0.1* & 13.2 & 5.0 & 290.7 \\ 392 Try/Resume (100M) & 8.8 & N/A & N/A & N/A & 12.3 & N/A & N/A & N/A \\ 393 Match All (10M) & 0.3 & N/A & N/A & N/A & 0.3 & N/A & N/A & N/A \\ 394 Match None (10M) & 0.3 & N/A & N/A & N/A & 0.4 & N/A & N/A & N/A \\ 296 \begin{tabular}{|r|*{2}{|r r r r r|}} 297 \hline 298 & \multicolumn{5}{c||}{AMD} & \multicolumn{5}{c|}{ARM} \\ 299 \cline{2-11} 300 N\hspace{8pt} & \multicolumn{1}{c}{Raise} & \multicolumn{1}{c}{\CFA} & \multicolumn{1}{c}{\Cpp} & \multicolumn{1}{c}{Java} & \multicolumn{1}{c||}{Python} & 301 \multicolumn{1}{c}{Raise} & \multicolumn{1}{c}{\CFA} & \multicolumn{1}{c}{\Cpp} & \multicolumn{1}{c}{Java} & \multicolumn{1}{c|}{Python} \\ 302 \hline 303 Resume Empty (10M) & 3.8 & 3.5 & 14.7 & 2.3 & 176.1 & 0.3 & 0.1 & 8.9 & 1.2 & 119.9 \\ 304 %Resume Other (10M) & 4.0* & 0.1* & 21.9 & 6.2 & 381.0 & 0.3* & 0.1* & 13.2 & 5.0 & 290.7 \\ 395 305 \hline 396 306 \end{tabular} 397 307 \end{table} 398 308 399 As stated, the performance tests are not attempting to compare exception 400 handling across languages. The only performance requirement is to ensure the 401 \CFA EHM implementation runs in a reasonable amount of time, given its 402 constraints. In general, the \CFA implement did very well. Each of the tests is 403 analysed. 309 % Now discuss the results in the tables. 310 One result not directly related to \CFA but important to keep in mind is that, 311 for exceptions the standard intuition about which languages should go 312 faster often does not hold. 313 For example, there are a few cases where Python out-performs 314 \CFA, \Cpp and Java. 315 The most likely explanation is that, since exceptions 316 are rarely considered to be the common case, the more optimized languages 317 make that case expensive to improve other cases. 318 In addition, languages with high-level representations have a much 319 easier time scanning the stack as there is less to decode. 320 321 As stated, 322 the performance tests are not attempting to show \CFA has a new competitive 323 way of implementing exception handling. 324 The only performance requirement is to insure the \CFA EHM has reasonable 325 performance for prototyping. 
326 Although that may be hard to exactly quantify, we believe it has succeeded 327 in that regard. 328 Details on the different test cases follow. 329 404 330 \begin{description} 405 \item[Throw/Resume Empty] 406 For termination, \CFA is close to \Cpp, where other languages have a higher cost. 407 408 For resumption, \CFA is better than the fixup simulations in the other languages, except Java. 409 The \CFA results on the ARM computer for both resumption and function simulation are particularly low; 410 I have no explanation for this anomaly, except the optimizer has managed to remove part of the experiment. 411 Python has a high cost for passing the lambda during the recursion. 412 413 \item[Throw D'tor] 414 For termination, \CFA is twice the cost of \Cpp. 415 The higher cost for \CFA must be related to how destructors are handled. 416 417 \item[Throw Finally] 418 \CFA is better than the other languages with a @finally@ clause, which is the 419 same for termination and resumption. 420 421 \item[Throw/Resume Other] 422 For termination, \CFA is better than the other languages. 423 424 For resumption, \CFA is equal to or better the other languages. 425 Again, the \CFA results on the ARM computer for both resumption and function simulation are particularly low. 426 Python has a high cost for passing the lambda during the recursion. 427 428 \item[Try/Catch/Resume] 429 For termination, installing a try statement is more expressive than \Cpp 430 because the try components are hoisted into local functions. At runtime, these 431 functions are than passed to libunwind functions to set up the try statement. 432 \Cpp zero-cost try-entry accounts for its performance advantage. 433 434 For resumption, there are similar costs to termination to set up the try 435 statement but libunwind is not used. 436 437 \item[Try/Finally] 438 Setting up a try finally is less expensive in \CFA than setting up handlers, 439 and is significantly less than other languages. 440 441 \item[Throw/Resume Match All] 442 For termination, \CFA is close to the other language simulations. 443 444 For resumption, the stack unwinding is much faster because it does not use 445 libunwind. Instead resumption is just traversing a linked list with each node 446 being the next stack frame with the try block. 447 448 \item[Throw/Resume Match None] 449 The same results as for Match All. 331 \item[Empty Traversal] 332 \CFA is slower than \Cpp, but is still faster than the other languages 333 and closer to \Cpp than other languages. 334 This is to be expected as \CFA is closer to \Cpp than the other languages. 335 336 \item[D'tor Traversal] 337 Running destructors causes huge slowdown in every language that supports 338 them. \CFA has a higher proportionate slowdown but it is similar to \Cpp's. 339 Considering the amount of work done in destructors is so low the cost 340 likely comes from the change of context required to do that work. 341 342 \item[Finally Traversal] 343 Speed is similar to Empty Traversal in all languages that support finally 344 clauses. Only Python seems to have a larger than random noise change in 345 its run-time and it is still not large. 346 Despite the similarity between finally clauses and destructors, 347 finally clauses seem to avoid the spike in run-time destructors have. 348 Possibly some optimization removes the cost of changing contexts. 
349 \todo{OK, I think the finally clause may have been optimized out.} 350 351 \item[Other Traversal] 352 For \Cpp, stopping to check if a handler applies seems to be about as 353 expensive as stopping to run a destructor. 354 This results in a significant jump. 355 356 Other languages experience a small increase in run-time. 357 The small increase likely comes from running the checks, 358 but they could avoid the spike by not having the same kind of overhead for 359 switching to the check's context. 360 361 \todo{Could revisit Other Traversal, after Finally Traversal.} 362 363 \item[Cross Handler] 364 Here \CFA falls behind \Cpp by a much more significant margin. 365 This is likely due to the fact that \CFA has to insert two extra function 366 calls, while \Cpp does not have to execute any other instructions. 367 Python is much further behind. 368 369 \item[Cross Finally] 370 \CFA's performance now matches \Cpp's from Cross Handler. 371 If the code from the finally clause is being inlined, 372 which is just an asm comment, then there are no additional instructions 373 to execute when exiting the try statement normally. 374 375 \item[Conditional Match] 376 Both of the conditional matching tests can be considered on their own; 377 however, for evaluating conditional matching itself, the 378 comparison of the two sets of results is useful. 379 Consider the massive jump in run-time for \Cpp going from match all to match 380 none, which none of the other languages have. 381 Some strange interaction is causing run-time to more than double for doing 382 twice as many raises. 383 Java and Python avoid this problem and have similar run-time for both tests, 384 possibly through resource reuse or their program representation. 385 However, \CFA is built like \Cpp and avoids the problem as well; this matches 386 the pattern of the conditional match, which makes the two execution paths 387 much more similar. 388 450 389 \end{description} 451 390 452 \begin{comment} 453 This observation means that while \CFA does not actually keep up with Python in 454 every case, it is usually no worse than roughly half the speed of \Cpp. This 455 performance is good enough for the prototyping purposes of the project. 456 457 The test case where \CFA falls short is Raise Other, the case where the 458 stack is unwound including a bunch of non-matching handlers. 459 This slowdown seems to come from missing optimizations. 460 461 This suggests that the performance issue in Raise Other is just an 462 optimization not being applied. Later versions of gcc may be able to 463 optimize this case further, at least down to the half of \Cpp mark. 464 A \CFA compiler that directly produced assembly could do even better as it 465 would not have to work across some of \CFA's current abstractions, like 466 the try terminate function. 467 468 Resumption exception handling is also incredibly fast. Often an order of 469 magnitude or two better than the best termination speed. 470 There is a simple explanation for this; traversing a linked list is much 471 faster than examining and unwinding the stack. When resumption does not do as 472 well its when more try statements are used per raise. Updating the internal 473 linked list is not very expensive but it does add up. 474 475 The relative speed of the Match All and Match None tests (within each 476 language) can also show the effectiveness conditional matching as compared 477 to catch and rethrow.
478 \begin{itemize}[nosep] 479 \item 480 Java and Python get similar values in both tests. 481 Between the interpreted code, a higher level representation of the call 482 stack and exception reuse it it is possible the cost for a second 483 throw can be folded into the first. 484 % Is this due to optimization? 485 \item 486 Both types of \CFA are slightly slower if there is not a match. 487 For termination this likely comes from unwinding a bit more stack through 488 libunwind instead of executing the code normally. 489 For resumption there is extra work in traversing more of the list and running 490 more checks for a matching exceptions. 491 % Resumption is a bit high for that but this is my best theory. 492 \item 493 Then there is \Cpp, which takes 2--3 times longer to catch and rethrow vs. 494 just the catch. This is very high, but it does have to repeat the same 495 process of unwinding the stack and may have to parse the LSDA of the function 496 with the catch and rethrow twice, once before the catch and once after the 497 rethrow. 498 % I spent a long time thinking of what could push it over twice, this is all 499 % I have to explain it. 500 \end{itemize} 501 The difference in relative performance does show that there are savings to 502 be made by performing the check without catching the exception. 503 \end{comment} 504 505 506 \begin{comment} 507 From: Dave Dice <dave.dice@oracle.com> 508 To: "Peter A. Buhr" <pabuhr@uwaterloo.ca> 509 Subject: Re: [External] : JIT 510 Date: Mon, 16 Aug 2021 01:21:56 +0000 511 512 > On 2021-8-15, at 7:14 PM, Peter A. Buhr <pabuhr@uwaterloo.ca> wrote: 513 > 514 > My student is trying to measure the cost of installing a try block with a 515 > finally clause in Java. 516 > 517 > We tried the random trick (see below). But if the try block is comment out, the 518 > results are the same. So the program measures the calls to the random number 519 > generator and there is no cost for installing the try block. 520 > 521 > Maybe there is no cost for a try block with an empty finally, i.e., the try is 522 > optimized away from the get-go. 523 524 There's quite a bit of optimization magic behind the HotSpot curtains for 525 try-finally. (I sound like the proverbial broken record (:>)). 526 527 In many cases we can determine that the try block can't throw any exceptions, 528 so we can elide all try-finally plumbing. In other cases, we can convert the 529 try-finally to normal if-then control flow, in the case where the exception is 530 thrown into the same method. This makes exceptions _almost cost-free. If we 531 actually need to "physically" rip down stacks, then things get expensive, 532 impacting both the throw cost, and inhibiting other useful optimizations at the 533 catch point. Such "true" throws are not just expensive, they're _very 534 expensive. The extremely aggressive inlining used by the JIT helps, because we 535 can convert cases where a heavy rip-down would normally needed back into simple 536 control flow. 537 538 Other quirks involve the thrown exception object. If it's never accessed then 539 we're apply a nice set of optimizations to avoid its construction. If it's 540 accessed but never escapes the catch frame (common) then we can also cheat. 541 And if we find we're hitting lots of heavy rip-down cases, the JIT will 542 consider recompilation - better inlining -- to see if we can merge the throw 543 and catch into the same physical frame, and shift to simple branches. 544 545 In your example below, System.out.print() can throw, I believe. 
(I could be 546 wrong, but most IO can throw). Native calls that throw will "unwind" normally 547 in C++ code until they hit the boundary where they reenter java emitted code, 548 at which point the JIT-ed code checks for a potential pending exception. So in 549 a sense the throw point is implicitly after the call to the native method, so 550 we can usually make those cases efficient. 551 552 Also, when we're running in the interpreter and warming up, we'll notice that 553 the == 42 case never occurs, and so when we start to JIT the code, we elide the 554 call to System.out.print(), replacing it (and anything else which appears in 555 that if x == 42 block) with a bit of code we call an "uncommon trap". I'm 556 presuming we encounter 42 rarely. So if we ever hit the x == 42 case, control 557 hits the trap, which triggers synchronous recompilation of the method, this 558 time with the call to System.out.print() and, because of that, we now to adapt 559 the new code to handle any traps thrown by print(). This is tricky stuff, as 560 we may need to rebuild stack frames to reflect the newly emitted method. And 561 we have to construct a weird bit of "thunk" code that allows us to fall back 562 directly into the newly emitted "if" block. So there's a large one-time cost 563 when we bump into the uncommon trap and recompile, and subsequent execution 564 might get slightly slower as the exception could actually be generated, whereas 565 before we hit the trap, we knew the exception could never be raised. 566 567 Oh, and things also get expensive if we need to actually fill in the stack 568 trace associated with the exception object. Walking stacks is hellish. 569 570 Quite a bit of effort was put into all this as some of the specjvm benchmarks 571 showed significant benefit. 572 573 It's hard to get sensible measurements as the JIT is working against you at 574 every turn. What's good for the normal user is awful for anybody trying to 575 benchmark. Also, all the magic results in fairly noisy and less reproducible 576 results. 577 578 Regards 579 Dave 580 581 p.s., I think I've mentioned this before, but throwing in C++ is grim as 582 unrelated throws in different threads take common locks, so nothing scales as 583 you might expect. 584 \end{comment} 391 Moving on to resumption, there is one general note: 392 resumption is \textit{fast}. The only test where it fell 393 behind termination is Cross Handler. 394 In every other case, the number of iterations had to be increased by a 395 factor of 10 to get the run-time in an appropriate range, 396 and in some cases resumption still took less time. 397 398 % I tried \paragraph and \subparagraph, maybe if I could adjust spacing 399 % between paragraphs those would work. 400 \begin{description} 401 \item[Empty Traversal] 402 See above for the general speed-up notes. 403 This result is not surprising, as resumption's linked-list approach 404 means that traversing over stack frames without a resumption handler is 405 $O(1)$. 406 407 \item[D'tor Traversal] 408 Resumption does have the same spike in run-time that termination has. 409 The run-time is actually very similar to Finally Traversal. 410 As resumption does not unwind the stack, both destructors and finally 411 clauses are run while walking down the stack normally. 412 So it follows that their performance is similar. 413 414 \item[Finally Traversal] 415 The increase in run-time from Empty Traversal (once adjusted for 416 the number of iterations) is roughly the same as for termination.
417 This suggests that the cost of executing the finally clause itself dominates, and is much the same whether the stack is unwound or simply traversed. 418 419 \item[Other Traversal] 420 Traversing across handlers reduces resumption's advantage, as it actually 421 has to stop and check each one. 422 Resumption still came out ahead (adjusting for iterations) but by much less 423 than in the other cases. 424 425 \item[Cross Handler] 426 This is the only test case where resumption could not keep up with termination, 427 although the difference is not as significant as in many other cases. 428 It is simply a matter of where the costs come from. Even if \CFA termination 429 is not ``zero-cost", passing through an empty function still seems to be 430 cheaper than updating global values. 431 432 \item[Conditional Match] 433 Resumption shows a slight slowdown if the exception is not matched 434 by the first handler, which follows from the fact that the second handler now has 435 to be checked. However, the difference is not large. 436 437 \end{description} 438 439 Finally, there are the results of the resumption/fixup routine comparison. 440 These results are surprisingly varied; it is possible that creating a closure 441 has more to do with performance than passing the argument through layers of 442 calls. 443 Even with 100 stack frames though, resumption is only about as fast as 444 manually passing a fixup routine. 445 So there is a cost for the additional power and flexibility exceptions 446 provide. -
TabularUnified doc/theses/andrew_beach_MMath/uw-ethesis.tex ¶
r1d402be reaeca5f 210 210 \lstMakeShortInline@ 211 211 \lstset{language=CFA,style=cfacommon,basicstyle=\linespread{0.9}\tt} 212 % PAB causes problems with inline @=213 %\lstset{moredelim=**[is][\protect\color{red}]{@}{@}}214 212 % Annotations from Peter: 215 213 \newcommand{\PAB}[1]{{\color{blue}PAB: #1}}