The Privacy Handbook's copyright is held by the FSF, and it's under the GNU Free Documentation License (FDL). The FDL isn't viewed very favorably these days -- using invariant sections means the document is no longer considered "free" under the Debian Free Software Guidelines -- and today the CC family of licenses is more popular than the FDL.

(I think the FDL's complicated DRM and "Invariant section" clauses make the FDL hard to understand, and they don't apply to the common use case for writers. IMHO relatively few people need to have some parts of a document modifiable and other parts invariant; instead, people care about the commercial/non-commercial usage of the document as a whole.)

So the Privacy Handbook's license needs to be changed. Werner Koch, the GnuPG maintainer, has now made this request to the FSF. I don't think the FSF responds very quickly -- presumably this will have to wait for some committee or the FSF board to decide -- so now we wait.
Sunday I worked on translating the Docbook form of the GNU Privacy Handbook to Emacs org-mode. I didn't try to script it or parse the SGML at all, but just copied the original manual.sgml file to manual.org and began whacking
away at it in Emacs, doing search-and-replaces and manual editing to translate between the two markups.

Docbook provides minutely detailed markup. In a code example, you can mark the system prompt with <prompt> and the user's typing with <userinput>. All these details are discarded by the translation to org-mode, which only offers basic formatting: emphasis, italics, monospace, and a few other styles.
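
As a rough illustration (this snippet is invented for this post, not copied from the manual), Docbook might mark up a shell transcript like this:

    <screen>
    <prompt>$ </prompt><userinput>gpg --gen-key</userinput>
    </screen>

while the org-mode version collapses to a plain example block:

    #+BEGIN_EXAMPLE
    $ gpg --gen-key
    #+END_EXAMPLE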

I'm working on a branch called akuchling-modernize in my new gph repository on github. The next step will be to apply a patch that was sent to me, making various small updates. The patch is probably old enough that it's slightly out of date, so once it's applied I'll start reading through the document
and consider what revisions I want to make. I also need to figure out what to do for a glossary; the Docbook source uses the <firstterm> element to mark the first use of a term, and for now I've left those elements in. I don't know if I should add a glossary section or just link to the new FAQ.
About two weeks ago, I had another spasm of activity and wrote a bunch of new entries for the GPG FAQ, both filling in entries that were blank and adding a few that I thought were a good idea. Happily, the maintainer accepted my pull request.

The next step will be to request that the new FAQ be posted and replace the outdated FAQ.
Another book project: this was written as a book proposal aimed at O'Reilly, for an actual printed book (back when people actually bought printed computer books). I wrote two sections, one defining what an algorithm is and one explaining O() notation; parts of these two sections live on in my 50-examples book.
Outline
---------

Written Sat. Aug 30 2003.

"Algorithms in Python" (alt. title "Learning Algorithms with Python", if
Sedgewick has a trademark) is a book that uses Python as the vehicle to
teach the basic ideas of algorithms -- what an algorithm is, how to
notate them clearly, what O() notation means, etc. -- and to introduce a
number of different algorithms.

Discussions will rarely, if ever, be rigorous.  Requirements:

* Minimal mathematical background should be required (basically
  functions, I think).  Those bits of notation that are necessary will
  be explained as they're needed.
* Elementary Python programming knowledge is needed, so readers
  should have a tutorial such as "Learning Python" alongside this book;
  it will not attempt to teach Python, though it will introduce relevant
  standard library modules.  
* Python 2.3 will be used.

The goal is to make the book understandable by a bright high-school
student.  If the student goes on to formal computer science study, she
will have a good foundation for it; if not, she'll understand some basic
computer ideas.

Exercises may be included (I haven't decided yet); they might be hard to
invent, but they'd also be very useful if the book is adopted in teaching
settings.  If exercises are provided, I'll write up a set of unit tests
for correct implementations; this will make it easy for teachers to
check whether answers are correct and let them concentrate on
programming style.  It might be better to leave exercises out of the
book and put them on a separate web site, where they can be continually
updated.
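
A rough sketch of what one such test file might look like (the exercise
and the names "find_largest" and "solutions" are hypothetical, used only
for illustration):

    import unittest
    from solutions import find_largest   # hypothetical student module

    class FindLargestTest(unittest.TestCase):
        def test_basic(self):
            self.assertEqual(find_largest([3, 1, 4, 1, 5]), 5)
        def test_single_element(self):
            self.assertEqual(find_largest([42]), 42)
        def test_negatives(self):
            self.assertEqual(find_largest([-7, -2, -5]), -2)

    if __name__ == '__main__':
        unittest.main()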

* Preface (the usual front matter)
  * Introductory comments
    * Intended audience; prerequisites; goals of book
  * Outline of book
  * Typographical conventions
  * Acknowledgements

* Algorithmic Analysis (alt. title "What is an Algorithm"?)
  * Basic idea: an algorithm is a series of steps to perform a task
  * Example: finding the largest number in a list (sketched in code after this chapter's outline)
    * Iterative formulation
    * Recursive formulation
  * Measuring time complexity: O, Theta, Omega notation
    * Compare different time complexities: O(n) vs O(lg n)
    * Applying O() to memory/space complexity
  * Time complexity of various Python built-in operations (dicts, lists)
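
  (A rough sketch of the largest-number example in both formulations; the
  function names are mine, not part of the outline, and both versions
  assume a non-empty list.  The iterative version runs in O(n) time for a
  list of n items.)

      def find_largest_iterative(items):
          # Scan the list once, remembering the biggest value seen so far.
          largest = items[0]
          for item in items[1:]:
              if item > largest:
                  largest = item
          return largest

      def find_largest_recursive(items):
          # Base case: a one-element list is its own largest element.
          if len(items) == 1:
              return items[0]
          # Otherwise compare the first element to the largest of the rest.
          rest_largest = find_largest_recursive(items[1:])
          if items[0] > rest_largest:
              return items[0]
          return rest_largest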

* Hashing 
  Hashing is going to be the first serious algorithm used as an
  example, so it's going to be worked out in the most detail.  Later
  chapters will explain algorithms, give reasons why they work, 
  and discuss their O() complexity, but in less detail than in this
  chapter.  (A bare-bones hash table sketch follows this chapter's outline.)
  * Basic concept of hash tables
  * Computing hash codes
  * Handling collisions
  * Resizing 
    * Amortized algorithm costs (I can't see how to fit this into
      chapter 1, since you need a reasonably complicated algorithm to
      use as an example.)
  * Deleting hash table entries
  * Determining time complexity of hashing 
    (this would be the most detailed explanation of determining time complexity; 
     remaining chapters would be more hand-waving)
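
  (A bare-bones sketch of the kind of table this chapter would build up:
  separate chaining for collisions, no resizing or deletion.  The class
  and method names are mine, for illustration only.)

      class HashTable:
          def __init__(self, num_buckets=8):
              # One list ("bucket") per slot; each bucket holds (key, value) pairs.
              self.buckets = [[] for i in range(num_buckets)]

          def _bucket_for(self, key):
              # hash() turns the key into an integer; the modulus picks a bucket.
              return self.buckets[hash(key) % len(self.buckets)]

          def put(self, key, value):
              bucket = self._bucket_for(key)
              for i, (k, v) in enumerate(bucket):
                  if k == key:                  # key already present: replace its value
                      bucket[i] = (key, value)
                      return
              bucket.append((key, value))       # new key (possibly a collision): chain it

          def get(self, key):
              for k, v in self._bucket_for(key):
                  if k == key:
                      return v
              raise KeyError(key)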

* Graphs
  * Graph concepts
  * Different graph representations (Node objects, sets of arcs)
  * Traversal
  * Topological sorts
    * Example: working out file dependencies (a rough sketch follows this chapter's outline)
  * Connected-components
  * Spanning trees
  * Shortest paths
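
  (For the file-dependency example: a rough sketch of a depth-first
  topological sort.  The dependency data is made up, and cycles aren't
  detected.)

      # Maps each target to the files it depends on (illustrative data).
      dependencies = {
          'app.o':  ['app.c', 'util.h'],
          'util.o': ['util.c', 'util.h'],
          'app':    ['app.o', 'util.o'],
      }

      def topological_sort(deps):
          ordered = []        # prerequisites end up before the things that need them
          visited = {}
          def visit(node):
              if node in visited:
                  return
              visited[node] = True
              for prerequisite in deps.get(node, []):
                  visit(prerequisite)
              ordered.append(node)
          for node in deps:
              visit(node)
          return ordered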

* Trees
  * Trees as a special case of graphs
  * Binary trees
    * Representations (Node objects, lists/tuples, tables)
    * Searching
    * Inserting
    * Deleting
    * Unbalanced trees
  * Recursive operations on trees
  * Balanced trees: 
    * red-black
    * do AVL trees, or just mention them briefly?  
      (Briefly, I think; there's already more than enough material in this outline!)
    * B-trees
  
* Numbers (should the numeric material come before trees and graphs?)
  * Representing numbers on computers
    * Machine integers
    * Floating point representation
    * Large integers
  * Random number generation
  
* Numeric Analysis
  * Polynomial evaluation
  * Finding zeros of functions
  * Differentiating functions
  * Integrating functions

* Sorting
  * Basic concepts
    * Comparing
    * Stability
  * Simple algorithms:
    * Bubble sort
    * Shell sort
    * Insertion sort
    * Merge sort
  * Quicksort
    * Implementation (a simple version is sketched after this chapter's outline)
    * Issues
    * Time-complexity
  * Time complexity of sorting
    * Proof of O(n lg n) bound.
    * Breaking assumptions: parallelism, spaghetti sort
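
  (A rough sketch of the simple, not-in-place quicksort this chapter might
  start from; the average case is O(n lg n), but the worst case is O(n**2),
  e.g. on already-sorted input with this choice of pivot.)

      def quicksort(items):
          # Lists of zero or one element are already sorted.
          if len(items) <= 1:
              return items
          pivot = items[0]
          smaller = [x for x in items[1:] if x < pivot]
          larger  = [x for x in items[1:] if x >= pivot]
          # Sort each partition separately, then join them around the pivot.
          return quicksort(smaller) + [pivot] + quicksort(larger)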

* Hard problems: NP-completeness
  * P and NP
  * NP-completeness
    * Explanation
    * Various examples of NP-complete problems
    * Show that all NP-complete problems are equivalent
  * Solving an NP-complete problem
    * Exhaustive search
    * Heuristics

* Final thoughts 
  I like books that close with some sort of summation. 
  This brief final chapter will discuss a few general issues.
  * Think about time complexity
  * Do the simplest thing...
  * Write tests
  * Value clarity over optimization
  * Where to go from here?
    * More books to read; things to do
    * Current state of research (parallel, distributed, various domains)


Optional Topics
--------------------

This section lists various chapters that I outlined but later decided
not to make a core part of the book.  Any of these chapters can be
reintroduced at a later point.

* Number Theory
  * Finite fields
  * Primality testing (kind of an odd duck)
  * RSA
  
* Geometric Algorithms
  * Point representations
  * Convex hull
  * Line intersections
  * Range searching

* Strings 
  This will likely be a difficult chapter, both to write and to read,
  because FSMs will be a pain to explain.  (Indeed, you could envision
  an entirely different document that covered FSMs, languages, Turing
  completeness, parsing, ...  One for the sequel, I think.)
  * Simple searching
  * Boyer-Moore/KMP searches
  * Data compression
  * Regular expressions
    * Simple regex patterns
    * Finite automata
    * Languages
      * What finite automata can't do
    * Brief theory-of-computation overview

* Real Implementations:
  Examines real-world implementation of various algorithms.
  * Random number generation: Mersenne twister
  * Karatsuba algorithm?  (maybe too complicated and boring?)
  * Python's dictionary hashing
  * Python's sort

* Game trees
  * Introduction
  * Game rules (tic-tac-toe; or maybe hnefatafl, a game I did for a school project once)
  * Simple tree search
  * Alpha/beta cutoff
  


Possible Topics
--------------------

* Heaps (can't figure out where to fit them in -- any suggestions?)


Not Covered
------------------

* Linked lists (they aren't the source of many algorithms,
  and their implementation in Python doesn't seem very interesting)
(I'm cleaning off my laptop's hard drive because I think its days are numbered, so I'll probably be posting a bunch of different items that I come across over the next few days or weeks.)

At one point, I had the idea of a writing project to describe some of the basic ideas of computers for a non-technical reader. It would have been kind of philosophical, something like W. Daniel Hillis's "The Pattern on the Stone: The Simple Ideas That Make Computers Work". The book never got beyond the stage of a list of chapters, which was:

  1. Computers only deal with numbers.
  2. Computers are general-purpose.
  3. Programming consists of writing very detailed recipes that tell computers what to do.
  4. We can figure out how long those recipes take to run.
  5. Successive layers of abstraction help us control complexity.
  6. Some problems are very hard and cannot realistically be solved by computers.
  7. Human brains don't seem to work like computers.
