- Timestamp:
- Nov 18, 2024, 10:07:22 PM
- Branches:
- master
- Children:
- 29075d1
- Parents:
- 0dffe91
- Location:
- doc/theses/mike_brooks_MMath
- Files:
- 2 edited
Legend:
- Unmodified (no prefix)
- Added (+)
- Removed (-)
doc/theses/mike_brooks_MMath/string.tex
r0dffe91 → r1b39705

Earlier work on \CFA~\cite[ch.~2]{Schluntz17} implemented object constructors and destructors for all types (basic and user defined).
A constructor is a user-defined function run implicitly \emph{after} an object's declaration-storage is created, and a destructor is a user-defined function run \emph{before} an object's declaration-storage is deleted.
- This feature guarantees pre invariants for users before accessing an object and post invariants for the programming environment after an object terminates.
+ This feature, called RAII~\cite[p.~389]{Stroustrup94}, guarantees pre invariants for users before accessing an object and post invariants for the programming environment after an object terminates.

The purposes of these invariants goes beyond ensuring authentic values inside an object.
- Invariants can also track occurrences of themanaged objects in other data structures.
+ Invariants can also track occurrences of managed objects in other data structures.
For example, reference counting is a typical application of an invariant outside of the data values.
With a reference-counting smart-pointer, the constructor and destructor \emph{of a pointer type} tracks the life cycle of the object it points to.
…

In general, a lifecycle function has access to an object by location, \ie constructors and destructors receive a @this@ parameter providing an object's memory address.
- The lifecycle implementation can then add this object to a collection upon creation and remove it at destruction.
- A module that provides such objects, by using and encapsulating such a collection, can traverse the collection at relevant times, to keep the objects ``good.''.
- Hence, declaring such an object not only ensures ``good'' authentic at initialization, but also an implicit subscription to a service that keeps the value ``good'' across its lifetime.
-
- In many cases, the relationship between memory location and lifecycle is simple.
- But with stack-allocated objects being used as parameters and returns, there is a sender version in one stack frame and a receiver version in another.
- \CC is able to treat those versions as distinct objects and guarantee a copy-constructor call for communicating the value from one to the other.
- This ability has implications on the language's calling convention.
- Consider an ordinary function @void f( Vehicle x )@, which receives an aggregate by value.
- If the type @Vehicle@ has custom lifecycle functions, then a call to a user-provided copy constructor occurs, after the caller evaluates its argument expression, after the callee's stack frame exists, with room for its variable @x@ (which is the location that the copy-constructor must target), but before the user-provided body of @f@ begins executing.
- \CC achieves this ordering by changing the function signature, in the compiled form, to pass-by-reference and having the callee invoke the copy constructor in its preamble.
- On the other hand, if @Vehicle@ is a simple structure then the C calling convention is applied as the code originally appeared, that is, the call-site implementation code performs a bitwise copy from the caller's expression result, into the callee's x.
-
- TODO: learn correction to fix inconsistency: this discussion says the callee invokes the copy constructor, but only the caller knows which copy constructor to use!
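As an illustrative aside (not part of the changeset): a minimal \CC sketch of the ordering described in the removed passage above, using a hypothetical @Vehicle@ type; the print statements only make the constructor ordering visible.
\begin{cfa}
#include <cstdio>

struct Vehicle {
	int id;
	Vehicle( int id ) : id( id ) { printf( "construct %d\n", id ); }
	Vehicle( const Vehicle & other ) : id( other.id ) { printf( "copy-construct %d\n", id ); }
	~Vehicle() { printf( "destruct %d\n", id ); }
};

void f( Vehicle x ) {						// receives an aggregate by value
	printf( "body of f sees %d\n", x.id );	// x was copy-constructed before this line runs
}

int main() {
	Vehicle v{ 42 };						// construct 42
	f( v );									// copy-construct 42, then the body, then destruct the copy
}											// destruct the original
\end{cfa}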
-
- TODO: discuss the return-value piece of this pattern
-
- The \CFA RAII system has limited support for using lifecycle functions to provide a ``stay good'' service. It works in restricted settings, including on dynamically allocated objects. It does not work for communicating arguments and returns by value because the system does not produce a constructor call that tracks the implied move from a sender frame to a receiver frame. This limitation does not prevent a typical reference-counting design from using call-with-value/return-of-value, because the constructor--destructor calls are correctly balanced. But it impedes a ``stay-good'' service from supporting call-with-value/return-of-value, because the lifecycles presented to the constructor/destructor calls do not keep stable locations. A ``stay-good'' service is achievable so long as call-with-value/return-of-value do not occur. The original presentation [to cite Schluntz section] acknowledges this limitation; the present discussion makes its consequences more apparent.
-
- The \CFA team sees this limitation as part of a tactical interim state that should some day be improved. The \CFA compiler is currently a source-to-source translator that targets relatively portable C. Several details of its features are provisionally awkward or under-performant until finer control of its code generation is feasible. In the present state, all calls that appear in \CFA source code as call-with-value/return-of-value are emitted this way to the underlying C calling convention. SO WHAT?
-
- The present string-API contribution has both the ``stay good'' promise and call-with-value/return-of-value being essential. The main string API uses a work-around to achieve the full feature set, at a runtime performance penalty. An alternative API sacrifices call-with-value/return-of-value functionality to recover full runtime performance. These APIs are layered, with the slower, friendlier High Level API (HL) wrapping the faster, more primitive Low Level API (LL). They present the same features, up to lifecycle management, with call-with-value/return-of-value being disabled in LL and implemented with the workaround in HL. The intention is for most future code to target HL. In a more distant future state, where \CFA has an RAII system that can handle the problematic quadrant, the HL layer can be abolished, the LL can be renamed to match today's HL, and LL can have its call-with-value/return-of-value permission reenabled. Then, programs written originally against HL will simply run faster. In the meantime, two use cases of LL exist. Performance-critical sections of applications have LL as an option. Within [Xref perf experiments], though HL-v-LL penalties are measured, typical comparisons of the contributed string library vs similar systems are made using LL. This measurement gives a fair estimate of the goal state for \CFA while it is an evolving work in progress.
-
+ The lifecycle implementation can then add this object to a collection at creation and remove it at destruction.
+ A module providing lifecycle semantics can traverse the collection at relevant times to keep the objects ``good.''
+ Hence, declaring such an object not only ensures ``good'' authentic values, but also provides an implicit subscription to a service that keeps the value ``good'' across its lifetime.
+
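As an illustrative aside (not part of the changeset): a minimal \CC sketch of the ``implicit subscription'' idea above, where the constructor and destructor register and deregister each object in a collection that a hypothetical @keep_good@ service traverses; all names are invented for illustration.
\begin{cfa}
#include <cstdio>
#include <set>

struct Monitored;							// hypothetical managed type
static std::set<Monitored *> registry;		// collection maintained by the lifecycle functions

struct Monitored {
	int value = 0;
	Monitored() { registry.insert( this ); }	// constructor: add this object, by address, to the collection
	~Monitored() { registry.erase( this ); }	// destructor: remove it
};

static void keep_good() {					// the service traverses the collection at relevant times
	for ( Monitored * m : registry ) m->value = 0;	// e.g., repair/refresh every live object
}

int main() {
	Monitored a, b;
	keep_good();							// sees exactly the live objects a and b
	printf( "%zu live objects\n", registry.size() );
}											// destructors remove a and b from the registry
\end{cfa}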
+ In many cases, the relationship between memory location and lifecycle is straightforward.
+ For example, a stack-allocated object used as a parameter or return has a sender version in one stack frame and a receiver version in another, as opposed to assignment, where sender and receiver are in the same stack frame.
+ What is crucial for lifecycle management is knowing whether the receiver is initialized or uninitialized, \ie whether an object is or is not currently associated with management.
+ To provide this knowledge, languages differentiate between initialization and assignment to a left-hand side.
+ \begin{cfa}
+ Obj obj2 = obj1;  // initialization, obj2 is uninitialized
+ obj2 = obj1;  // assignment, obj2 must be initialized for management to work
+ \end{cfa}
+ Initialization occurs at a declaration by value, at a parameter by its argument, and at a return temporary by the function call.
+ Hence, it is necessary to have two kinds of constructors: by value or by object.
+ \begin{cfa}
+ Obj obj1{ 1, 2, 3 };  // by value, management is initialized
+ Obj obj2 = obj1;  // by obj, management is updated
+ \end{cfa}
+ When no object management is required, initialization copies the right-hand value.
+ Hence, the calling convention remains uniform, where the unmanaged case uses @memcpy@ as the initialization constructor and the managed case uses the specified initialization constructor.
+
+ The \CFA RAII system supports lifecycle functions, except for returning a value from a function to a temporary.
+ For example, in \CC:
+ \begin{cfa}
+ struct S {...};
+ S identity( S s ) { return s; }
+ S s;
+ s = identity( s );  // S temp = identity( s ); s = temp;
+ \end{cfa}
+ the generated code for the function call creates a temporary initialized by the function call, and then assigns the temporary to the receiver.
+ This two-step approach means extra storage for the temporary and two copies to get the result into the receiver variable.
+ \CC{17} made return-value optimization (RVO)~\cite{RVO20} mandatory in many cases, to ``avoid copying an object that a function returns as its value, including avoiding creation of a temporary object''.
+ \CFA uses C semantics for function return, giving direct value-assignment, which eliminates unnecessary code but skips an essential step needed by lifetime management.
+ The following discusses the consequences of these semantics with respect to lifetime management of \CFA strings.
+
+ The present string-API contribution provides lifetime management with initialization semantics on function return.
+ The main API uses a workaround to achieve the full lifetime semantics, which has a runtime performance penalty.
+ An alternative API sacrifices return-initialization semantics to recover full runtime performance.
+ These APIs are layered, with the slower, friendlier High-Level API (HL) wrapping the faster, more primitive Low-Level API (LL).
+ Both APIs present the same features, up to lifecycle management, with return initialization being disabled in LL and implemented with the workaround in HL.
+ The intention is for most future code to target HL.
+ When \CFA becomes a full compiler, it can provide return initialization with RVO.
+ Then, programs written with the HL API will simply run faster.
+ In the meantime, performance-critical sections of applications use LL.
+ Subsequent performance experiments~\VRef{s:PerformanceAssessment} against other string libraries use the LL API for \CFA strings.
+ These measurements give a fair estimate of the goal state for \CFA.
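As an illustrative aside (not part of the changeset): a small \CC sketch contrasting the two-step temporary-plus-assignment pattern described above with direct initialization of the receiver; the print statements are only instrumentation.
\begin{cfa}
#include <cstdio>

struct S {
	S() {}
	S( const S & ) { printf( "copy-construct\n" ); }
	S & operator=( const S & ) { printf( "assign\n" ); return *this; }
};

S identity( S s ) { return s; }

int main() {
	S s;
	// Two-step pattern: construct a temporary from the call, then assign it to the receiver.
	S temp = identity( s );			// copy-construct (parameter), copy-construct (return value)
	s = temp;						// assign (temporary into receiver)

	// Direct initialization: the call's result initializes the receiver, so the separate
	// temporary and the extra assignment disappear.
	S r = identity( s );			// copy-construct, copy-construct -- no assign
}
\end{cfa}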


\subsection{Memory management}

- A centrepiece of the string module is its memory manager. The management scheme defines a large shared buffer for strings' text. Allocation in this buffer is always bump-pointer; the buffer is compacted and/or relocated with growth when it fills. A string is a smart pointer into this buffer.
-
- This cycle of frequent cheap allocations, interspersed with infrequent expensive compactions, has obvious similarities to a general-purpose memory manager based on garbage collection (GC). A few differences are noteworthy. First, in a general purpose manager, the objects of allocation contain pointers to other such objects, making the transitive reachability of these objects be a critical property. Here, the allocations are of buffers of text, never pointers, so one allocation never keeps another one alive. Second, in a general purpose manager, where the handle that keeps an allocation alive is the same as the program's general-purpose inter-object reference, an extremely lean representation of this reference is required. Here, a fatter representation is acceptable because [why??].
+ A centrepiece of the string module is its memory manager.
+ The management scheme defines a shared buffer for string text.
+ Allocation in this buffer is via a bump-pointer;
+ the buffer is compacted and/or relocated with growth when it fills.
+ A string is a smart pointer into this buffer.
+
+ This cycle of frequent cheap allocations, interspersed with infrequent expensive compactions, has obvious similarities to a general-purpose memory manager based on garbage collection (GC).
+ A few differences are noteworthy.
+ First, in a general-purpose manager, the objects of allocation contain pointers to other such objects, making the transitive reachability of these objects a critical property.
+ Here, the allocations are text, so one allocation never keeps another alive.
+ Second, in a general-purpose manager, the handle that keeps an allocation alive is just a lean pointer.
+ For strings, a fatter representation is acceptable because there are fewer string-handle pointers than, say, the chained pointers within the nodes of linked containers.

\begin{figure}
…
\end{figure}

- \VRef[Figure]{f:memmgr-basic} shows the representation. A heap header, with its text buffer, defines a sharing context. Often, one global sharing context is appropriate for an entire program; exceptions are discussed in [xref TBD]. Strings are handles into the buffer. They are members of a linked list whose order matches the order of their buffer fragments (exactly, where there is no overlapping, and approximately, where there is). The header maintains a next-allocation pointer (alloc, in the figure) after the last live allocation of the buffer. No external references into the buffer are allowed and the management procedure relocates the text allocations as needed. The string handles contain explicit length fields, null termination characters are not used and all string text is kept in contiguous storage. When strings (the inter-linked handles) are allocated on the program's call stack, a sustained period with no use of the program's dynamic memory allocator can ensue, during which the program nonetheless creates strings, destroys them, and runs length-increasing modifications on existing ones.
-
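As an illustrative aside (not part of the changeset): a rough \CC sketch of the data structures implied by the figure and the surrounding description -- a header owning the text buffer with a bump pointer, and handles carrying a text pointer, an explicit length, and doubly-linked list links; the type and field names are hypothetical, not the library's actual identifiers.
\begin{cfa}
#include <cstddef>

struct StringHandle;					// string handle, defined below

struct StringHeap {						// sharing context: header plus text buffer
	char * buffer;						// contiguous text storage
	size_t size;						// current buffer size
	char * alloc;						// bump pointer: next free byte, after the last live allocation
	StringHandle * head, * tail;		// doubly-linked list of handles, kept in buffer order
};

struct StringHandle {					// a string is a smart pointer into the buffer
	char * text;						// start of this string's characters (no null terminator)
	size_t length;						// explicit length field
	StringHandle * prev, * next;		// list links
};
\end{cfa}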
- Compaction happens when the heap fills. It is the first of two uses of the linked list. The list allows discovering all live string handles, and from them, the ranges of the character buffer that are in use. With these ranges known, their total character count gives the amount of space in use. When that amount is small, compared with the current buffer size, an in-place compaction occurs, which entails copying the in-use ranges, to be adjacent, at the font of the buffer. When the in-use amount is large, a larger buffer is allocated (using the program's general-purpose dynamic allocator), the in-use strings are copied to be adjacent at the front of it, and the original buffer is freed back to the program's general allocator. Either way, navigating the links between the handles provides the pointers into the buffer, first read, to find the source fragment, then written with the location of the resultant fragment. This linkage across the structure is unaffected during a compaction; only the pointers from the handles to the buffer are modified. This modification, along with the grooming/provisioning of the text storage resource that it represents, is an example, in the language of [xref RAII limitations] of the string module providing a ``stay good'' service.
-
- Object lifecycle events are the subscription-management triggers in such a service. There are two fundamental string-creation routines: importing from external text like a C-string, and initialization from an existing \CFA string. When importing, a fresh allocation at the free end of the buffer occurs, into which the text is copied. The resultant handle is therefore inserted into the list at the position after the incumbent last handle, a position given by the heap manager's ``last handle'' pointer. When initializing from text already on the \CFA heap, the resultant handle is a second reference onto the original run of characters. In this case, the resultant handle's linked-list position is beside the original handle. Both string initialization styles preserve the string module's internal invariant that the linked-list order match the buffer order. For string destruction, the list being doubly linked provides for easy removal of the disappearing handle.
-
- While a string handle is live, it accepts modification operations, some of which make it reference a different portion of the underlying buffer, and accordingly, move the handle to a different position in the inter-handle list. While special cases have more optimal handling, the general case requires a fresh buffer run. In this case, the new run is allocated at the bump-pointer end and filled with the required value. Then, handles that originally referenced the old location and need to see the new value are pointed at the new buffer location, unlinked from their original positions in the handles' list, and linked in at the end of the list. An optimal case, when the target is not a substring of something larger, and the source is text from elsewhere in the managed buffer, allows the target to be re-pointed at the source characters, and accordingly, move list position to be beside the source. Cases where in-place editing happens, addressed further in [xref: TBD], leave affected handles in their original list positions. In analogy to the two cases of string initialization, the two cases of realizing assignment by moving either to a fresh buffer run, or to overlap references with the source, maintain the invariant of linked list order matching buffer order.
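As an illustrative aside (not part of the changeset): a rough \CC sketch of the in-place compaction pass described above, reusing the hypothetical types from the previous sketch; sharing and overlap between handles, and the relocation-to-a-larger-buffer case, are ignored here.
\begin{cfa}
#include <cstring>
#include <cstddef>

struct StringHandle { char * text; size_t length; StringHandle * prev, * next; };
struct StringHeap { char * buffer; size_t size; char * alloc; StringHandle * head, * tail; };

// In-place compaction: handles are ordered by buffer address, so each live run can be
// slid toward the front of the buffer without overwriting unprocessed text.
static void compact( StringHeap & h ) {
	char * dest = h.buffer;						// next free position at the front
	for ( StringHandle * s = h.head; s != nullptr; s = s->next ) {
		if ( s->text != dest )					// move the run only if it is not already in place
			std::memmove( dest, s->text, s->length );
		s->text = dest;							// repoint the handle; the list links are untouched
		dest += s->length;
	}
	h.alloc = dest;								// bump pointer now follows the last live run
}
\end{cfa}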
-
-
- To explain: GCing allocator doing bump-pointer with compaction
-
-
+ \VRef[Figure]{f:memmgr-basic} shows the representation.
+ A heap header and its text buffer define a sharing context.
+ Normally, one global sharing context is appropriate for an entire program;
+ exceptions are discussed in [xref TBD].
+ A string is a handle into the buffer and linked into a list.
+ The list is doubly linked for $O(1)$ insertion and removal at any location.
+ Strings are ordered in the list by text-buffer address, exactly where there is no overlapping, and approximately where there is.
+ The header maintains a next-allocation pointer, @alloc@, pointing after the last live allocation in the buffer.
+ No external references point into the buffer and the management procedure relocates the text allocations as needed.
+ A string handle contains an explicit length field, while its text is contiguous and not null terminated.
+ The width of the length field sets an upper limit on the string size, but the field is large (4 or 8 bytes), so the limit is not a practical restriction.
+ String handles can be allocated on the stack or in the heap, while the text buffer, given good management and a sufficient initial size, needs only one dynamic allocation during program execution.
+ During this period, strings can vary in size dynamically.
+
+ When the text buffer fills, \ie the next new string allocation causes @alloc@ to point beyond the end of the buffer, the strings are compacted.
+ The linked handles define all live strings in the buffer, which indirectly defines the allocated and free space in the buffer.
+ Since the string handles are in (roughly) sorted order, the handle list can be traversed copying the first string's text to the start of the buffer and each subsequent string's text after the previous one.
+ After compaction, if the amount of free storage is still less than the new string allocation, a larger text buffer is heap allocated, the current buffer is copied into the new buffer, and the original buffer is freed.
+ Note, the list of string handles is unaffected during a compaction;
+ only the string pointers are modified to new buffer locations.
+
+ Object lifecycle events are the subscription-management triggers in such a service.
+ There are two fundamental string-creation routines: importing external text like a C-string or reading a string, and initialization from an existing \CFA string.
+ When importing, storage comes from the end of the buffer, into which the text is copied.
+ The resultant handle is inserted at the end of the handle list to maintain ordering.
+ When initializing from text already in the text buffer, the new handle is a second reference into the original run of characters.
+ In this case, the new handle's linked-list position is after the original handle.
+ Both string initialization styles preserve the string module's internal invariant that the linked-list order matches the buffer order.
+ For string destruction, handles are removed from the list.
+
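As an illustrative aside (not part of the changeset): a rough \CC sketch of the two creation paths just described, again with hypothetical types and helpers -- importing copies text to the bump pointer and appends the handle at the tail of the list, while initializing from an existing handle shares the text and links the new handle immediately after the original.
\begin{cfa}
#include <cstring>
#include <cstddef>

struct StringHandle { char * text; size_t length; StringHandle * prev, * next; };
struct StringHeap { char * buffer; size_t size; char * alloc; StringHandle * head, * tail; };

static void link_after( StringHeap & h, StringHandle * pos, StringHandle * s ) {	// insert s after pos
	s->prev = pos;  s->next = pos ? pos->next : nullptr;
	if ( s->next ) s->next->prev = s; else h.tail = s;
	if ( pos ) pos->next = s; else h.head = s;
}

// Import external text: copy into the buffer at the bump pointer, handle goes at the list tail.
static void ctor_import( StringHeap & h, StringHandle & s, const char * cstr ) {
	size_t n = std::strlen( cstr );
	std::memcpy( h.alloc, cstr, n );	// (compaction / growth when the text does not fit is omitted)
	s.text = h.alloc;  s.length = n;
	h.alloc += n;
	link_after( h, h.tail, & s );
}

// Initialize from an existing string: share the characters, link beside the original handle.
static void ctor_share( StringHeap & h, StringHandle & s, StringHandle & other ) {
	s.text = other.text;  s.length = other.length;
	link_after( h, & other, & s );
}
\end{cfa}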
+ Certain string operations can result in a subset (substring) of another string.
+ The resulting handle is then placed in the correct sorted position in the list, possibly with a short linear search to locate the position.
+ For string operations resulting in a new string, that string is allocated at the end of the buffer.
+ For shared-edit strings, handles that originally referenced the containing location need to see the new value at the new buffer location.
+ These handles are moved to appropriate locations at the end of the list (addressed further in [xref: TBD]).
+ For nonshared-edit strings, a containing string can be moved and the nonshared strings can remain in the same position.
+ String assignment works similarly to string initialization, maintaining the invariant that linked-list order matches buffer order.

At the level of the memory manager, these modifications can always be explained as assignments; for example, an append is an assignment into the empty substring at the end.

- While favourable conditions allow for in-place editing, the general case requires a fresh buffer run. For example, if the new value does not fit in the old place, or if other handles are still using the old value, then the new value will use a fresh buffer run.
+ While favourable conditions allow for in-place editing, the general case requires a fresh buffer run.
+ For example, if the new value does not fit in the old place, or if other handles are still using the old value, then the new value will use a fresh buffer run.

where there is room for the resulting value in the original buffer location, and where all handles referring to the original buffer location should see the new value,

- always boiled down to assignment and appendment. Assignment has special cases that happen in-place, but in the general case, it is implemented as a sequence of appends onto a fresh allocation at the end of the buffer. (The sequence has multiple steps when the assignment target is a substring: old before, new middle, old after.) Similarly, an append request can be serviced in-place when there is room, or as a pair of appends
+ always boiled down to assignment and appendment.
+ Assignment has special cases that happen in-place, but in the general case, it is implemented as a sequence of appends onto a fresh allocation at the end of the buffer.
+ (The sequence has multiple steps when the assignment target is a substring: old before, new middle, old after.)
+ Similarly, an append request can be serviced in-place when there is room, or as a pair of appends.

…

- \subsection{Performance assessment}
+ \section{Performance assessment}
+ \label{s:PerformanceAssessment}

I assessed the \CFA string library's speed and memory usage. I present these results in even equivalent cases, due to either micro-optimizations foregone, or fundamental costs of the added functionality. They also show the benefits and tradeoffs, as >100\% effects, of switching to \CFA, with the tradeoff points quantified. The final test shows the overall win of the \CFA text-sharing mechanism. It exercises several operations together, showing \CFA enabling clean user code to achieve performance that STL requires less-clean user code to achieve.
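Returning to the assignment-as-appends discussion above, as an illustrative aside (not part of the changeset): a rough \CC sketch of the general case, where assigning @value@ to the substring [@start@, @start+len@) of a larger string is realized as three appends onto a fresh, contiguous run at the end of the buffer; the types and the @append_to_buffer@ helper are hypothetical.
\begin{cfa}
#include <cstring>
#include <cstddef>

struct StringHandle { char * text; size_t length; };	// simplified handle (list links omitted)
struct StringHeap { char * buffer; size_t size; char * alloc; };

// Hypothetical helper: bump-pointer copy of n bytes onto the free end of the buffer.
static char * append_to_buffer( StringHeap & h, const char * src, size_t n ) {
	char * dest = h.alloc;						// (growth/compaction when the buffer is full is omitted)
	std::memcpy( dest, src, n );
	h.alloc += n;
	return dest;
}

// General-case assignment to a substring: old-before, new-middle, old-after become one fresh run.
static void assign_substring( StringHeap & h, StringHandle & whole, size_t start, size_t len,
		const char * value, size_t value_len ) {
	char * run = append_to_buffer( h, whole.text, start );							// old before
	append_to_buffer( h, value, value_len );										// new middle
	append_to_buffer( h, whole.text + start + len, whole.length - start - len );	// old after
	whole.text = run;
	whole.length = whole.length - len + value_len;
}
\end{cfa}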
doc/theses/mike_brooks_MMath/uw-ethesis.bib
r0dffe91 → r1b39705

}

-
@misc{Mendio24,
	contributer	= {pabuhr@plg},
…
	howpublished= {\url{https://www.mend.io/most-secure-programming-languages}},
}
+
+ @misc{RVO20,
+ 	contributer	= {pabuhr@plg},
+ 	title	= {Return value optimization ({RVO})},
+ 	author	= {Special Interest Group on {C++}},
+ 	year	= 2020,
+ 	month	= jun,
+ 	howpublished= {\url{https://sigcpp.github.io/2020/06/08/return-value-optimization}},
+ }