Miscellaneous wording and reference fixes

pull/4/merge
Pat Morin 2011-06-20 13:13:44 -04:00
parent 5153c4f64e
commit 2b85ba9937
6 changed files with 17 additions and 16 deletions

View File

@@ -15,7 +15,7 @@ Structures project.
latex/
java/ods/
-This code is released under a Creative Commons Attribution v3.0 license.
+This code is released under a Creative Commons Attribution license.
The full text of the license is available here.
http://creativecommons.org/licenses/by/2.5/ca/

View File

@@ -58,7 +58,7 @@ easy to verify, by induction, that a binary tree having $#n#\ge 1$
real nodes has $#n#+1$ external nodes.
-\section{BinaryTree: A Basic Binary Tree}
+\section{#BinaryTree#: A Basic Binary Tree}
The simplest way to represent a node #u# in a binary tree is
to store the (at most 3) neighbours of #u# explicitly:
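For reference, the explicit representation mentioned in the context line above amounts to a node that stores three pointers. A minimal Java sketch; the field names are assumed for illustration, not taken from this diff:

    class BTNode {
        BTNode parent; // the node's parent, or null at the root
        BTNode left;   // the left child, or null if absent
        BTNode right;  // the right child, or null if absent
    }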
@@ -158,7 +158,7 @@ level-by level, and left-to-right within each level.}
-\section{BinarySearchTree: An Unbalanced Binary Search Tree}
+\section{#BinarySearchTree#: An Unbalanced Binary Search Tree}
\seclabel{binarysearchtree}
A #BinarySearchTree# is a special kind of binary tree in which each node #u#
@@ -302,7 +302,7 @@ per operation.
#SkiplistSet# structure can implement the #SSet# interface with $O(\log
#n#)$ time per operation. The problem with the #BinarySearchTree#
structure is that it can become \emph{unbalanced}. Instead of looking
-like the tree in \figref{bst-example} it can look like a long path of
+like the tree in \figref{bst} it can look like a long path of
#n# nodes.
There are a number of ways of avoiding unbalanced binary search
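To make the unbalanced case concrete: inserting keys in sorted order into a plain binary search tree yields exactly the long path described above, so operations degrade to linear time. A minimal Java sketch; the class and method names are illustrative and not taken from the book's code:

    // Inserting 0,1,...,n-1 in increasing order produces a right-going
    // path of n nodes, so the tree's height is n-1.
    class PathDemo {
        static class Node {
            int x;
            Node left, right;
            Node(int x) { this.x = x; }
        }

        static Node root;

        static void add(int x) {
            if (root == null) { root = new Node(x); return; }
            Node u = root;
            while (true) {
                if (x < u.x) {
                    if (u.left == null) { u.left = new Node(x); return; }
                    u = u.left;
                } else {
                    if (u.right == null) { u.right = new Node(x); return; }
                    u = u.right;
                }
            }
        }

        public static void main(String[] args) {
            int n = 1000, length = 0;
            for (int i = 0; i < n; i++) add(i);                  // sorted insertions
            for (Node u = root; u != null; u = u.right) length++;
            System.out.println(length); // prints 1000: the n nodes form one right-going path
        }
    }

Every find or add on such a tree walks the whole path, giving Theta(n) time per operation rather than the O(log n) the balanced structures achieve.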

View File

@@ -19,7 +19,7 @@ integers in some specific range. In the code samples, some of these
obtained using random bits generated from atmospheric noise.
-\section{HashTable: Hashing with Chaining}
+\section{#HashTable#: Hashing with Chaining}
\seclabel{hashtable}
A #HashTable# data structure uses \emph{hashing with chaining} to store
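As an aside on the technique named here: hashing with chaining keeps an array of buckets, and each bucket holds a list (a "chain") of the elements that hash to it. A minimal Java sketch under that reading; the class name, the modular hash, and the fixed table size are illustrative assumptions, not the book's implementation:

    import java.util.ArrayList;
    import java.util.List;

    class ChainedHashTable<T> {
        List<T>[] table; // table[i] is the chain of elements that hash to i
        int n;           // number of stored elements

        @SuppressWarnings("unchecked")
        ChainedHashTable(int capacity) {
            table = new List[capacity];
            for (int i = 0; i < capacity; i++) table[i] = new ArrayList<>();
        }

        int hash(T x) {
            // illustrative hash: reduce hashCode() modulo the table length
            return (x.hashCode() & 0x7fffffff) % table.length;
        }

        boolean add(T x) {
            if (find(x) != null) return false; // already present
            table[hash(x)].add(x);
            n++;
            return true;
        }

        T find(T x) {
            for (T y : table[hash(x)])
                if (y.equals(x)) return y;
            return null;
        }

        boolean remove(T x) {
            boolean removed = table[hash(x)].remove(x);
            if (removed) n--;
            return removed;
        }
    }

A full implementation would also grow the table as n increases so that the chains stay short in expectation, which is what gives the expected O(1) time per operation.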

View File

@@ -123,8 +123,8 @@ management system. Anyone can fork from the current version of the
book and develop their own version (for example, in another programming
language). They can then ask that their changes be merged back into
my version. My hope is that, by doing things this way, this book will
-continue to be a useful textbook for long after my interest in the project
-(or my pulse, whichever comes first) has ended.
+continue to be a useful textbook long after my interest in the project
+(or my pulse, whichever comes first) has waned.
\include{intro-tmp}
\include{arrays-tmp}

View File

@@ -232,7 +232,7 @@ In a random binary search tree, the #find(x)# operation takes $O(\log
#n#)$ expected time.
\end{thm}
-\section{Treaps}
+\section{#Treap#: A Randomized Binary Search Tree}
The problem with random binary search trees is, of course, that they are
not dynamic. They don't support the #add(x)# or #remove(x)# operations
@@ -361,7 +361,7 @@ search path is at most $2\ln #n#+O(1)$. Furthermore, each rotation
decreases the depth of #u#. This stops if #u# becomes the root, so
the expected number of rotations can not exceed the expected length of
the search path. Therefore, the expected running-time of the #add(x)#
-operation in a #Treap# is $O(\log #n#)$. (Exercise~\ref{ex:treap-rotates}
+operation in a #Treap# is $O(\log #n#)$. (\excref{treap-rotates}
asks you to show that the expected number of rotations performed during
an insertion is actually only $O(1)$.)
@@ -418,7 +418,7 @@ It is worth comparing the #Treap# data structure to the #SkiplistSet#
data structure. Both implement the #SSet# operations in $O(\log #n#)$
time per operation. In both data structures, #add(x)# and #remove(x)#
involve a search and then a constant number of pointer changes
-(see Exercise~\ref{ex:treap-rotations} below). Thus, for both these
+(see \excref{treap-rotates} below). Thus, for both these
structures, the expected length of the search path is the critical value
in assessing their performance. In a #SkiplistSet#, the expected length
of a search path is
@@ -431,7 +431,7 @@ In a treap, the expected length of a search path is
\]
Thus, the search paths in a #Treap# are considerably shorter and this
translates into noticeably faster operations on #Treap#s than #Skiplist#s.
-Exercise~\ref{ex:skiplist-opt} in \chapref{skiplists} shows how the
+\excref{skiplist-opt} in \chapref{skiplists} shows how the
expected length of the search path in a #Skiplist# can be reduced to
\[
e\ln #n# + O(1) \approx 1.884\log_2 #n# + O(1)
@@ -455,7 +455,7 @@ Give a recursive formula for the number of sequences that generate a
complete binary tree of height $h$ and evaluate this formula for $h=3$.)
\end{exc}
-\begin{exc}
+\begin{exc}\exclabel{treap-rotates}
Use both parts of \lemref{rbs-treap} to prove that the expected number of rotations performed by an #add(x)# operation (and hence also a #remove(x)# operation) is $O(1)$.
\end{exc}
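The rotations discussed in the hunks above are the standard binary-search-tree rotations: each one moves a node up one level while preserving the order of the keys, and the treap uses them to restore the heap order on the randomly chosen priorities. A minimal Java sketch of the two cases; the node layout is an assumption for illustration, not the book's code:

    class TreapNode {
        int x;                  // key, ordered as in a binary search tree
        int p;                  // random priority, ordered as in a heap
        TreapNode left, right;
    }

    class Rotations {
        // Right rotation at w: its left child u becomes the subtree root.
        static TreapNode rotateRight(TreapNode w) {
            TreapNode u = w.left;
            w.left = u.right;
            u.right = w;
            return u; // the caller links u where w used to be
        }

        // Left rotation at w: its right child u becomes the subtree root.
        static TreapNode rotateLeft(TreapNode w) {
            TreapNode u = w.right;
            w.right = u.left;
            u.left = w;
            return u;
        }
    }

After add(x), the new node is rotated upward until its priority no longer violates the heap order, which is why the number of rotations is bounded by the length of the search path, as the surrounding text notes.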

View File

@@ -95,7 +95,7 @@ the search path is navigated; In particular the structures differ in how
they decide if the search path should go down in $L_{r-1}$ or go right
in $L_r$.
-\section{SkiplistSets as SSets}
+\section{#SkiplistSet#: An Efficient #SSet# Implementation}
A #SkiplistSet# uses a skiplist structure to implement the #SSet#
interface. When used this way, the list $L_0$ stores the elements of
@@ -162,7 +162,7 @@ the operations #add(x)#, #remove(x)#, and #find(x)# in $O(\log #n#)$
expected time per operation.
\end{thm}
-\section{Skiplists as Lists}
+\section{#SkiplistList#: An Efficient Random-Access #List# Implementation}
A #SkiplistList# is a realization of the skiplist idea that implements
the #List# interface. In a #SkiplistList#, $L_0$ contains the elements of the
@@ -272,7 +272,8 @@ data structure:
#remove(i)# in $O(\log #n#)$ expected time per operation.
\end{thm}
-\section{Skiplists as Ropes}
+%\section{Skiplists as Ropes}
+%TODO: A section on ropes
\section{Analysis of Skiplists}
\seclabel{skiplist-analysis}
@@ -450,7 +451,7 @@ been developed.
% At some point, a cache-optimized version of skiplists was used in part of the Linux kernel. !!I can't confirm this.
-\begin{exc}
+\begin{exc}\exclabel{skiplist-opt}
Suppose that, instead of promoting an element from $L_{i-1}$
into $L_i$, we promote it with some probability $p$, $0 < p < 1$.
Show that the expected length of the search path in this case is
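For context on how this exercise relates to the $e\ln #n# + O(1)$ bound quoted earlier in the treap comparison: in the standard backward analysis, a skiplist built with promotion probability $p$ has about $\log_{1/p} #n#$ levels, and climbing one level takes an expected $1/p$ steps (one up-move plus a geometric number of backward moves), so the expected length of the search path is roughly
\[
    \frac{1}{p}\log_{1/p} #n# + O\!\left(\frac{1}{p}\right)
    = \frac{\ln #n#}{p\ln(1/p)} + O\!\left(\frac{1}{p}\right).
\]
Treating this as a sketch rather than the exercise's exact statement: the coefficient $1/(p\ln(1/p))$ is minimized at $p=1/e$, where the leading term becomes $e\ln #n# \approx 1.884\log_2 #n#$, matching the figure cited above.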