Here is a small tutorial on extending Python in C: ExtendingPython. Feel free to add or edit. -Doug

Most of the changes I was planning on making have been made. Conx is still open to tweaking, bug fixes, and new suggestions. Here's a change list documenting what has been done.

Your Bugs or Suggestions

Change List

  1. Adopted Python idioms where appropriate. ex. for i in range(len(<list of objects>)) becomes for item in <list of objects>

  2. Changed some attribute names. ex. Network.input -> Network.inputs, Network.output -> Network.targets, Network.connection -> Network.connections

  3. Changed some method names. Where possible the old method names are still available for compatibility. ex. Network.setOutputs() -> Network.setTargets(), but Network.setOutputs() is still around.

  4. Fixed and modified many file IO methods in the Network class. ex. Added Network.patternVector() so that inputs or targets read from a file can be converted back into patterns.

  5. Added Network.addPattern() and Network.delPattern(); these are safer than Network.setPattern(), which can silently overwrite a previous pattern.

  6. Made exceptions more detailed. Added the exception extension classes LayerError, NetworkError, and SRNError. These extend AttributeError (most errors are due to attribute problems). Additionally, exceptions now carry a tuple whose first item is a string describing the problem and whose second is the value(s) of the offending variable(s).

  7. Organized methods according to function for easier browsing. Included comments describing each group of methods.

  8. Added doc strings to all methods. Pydoc will generate documentation based on these doc strings.

  9. Eliminated all counter attributes shadowing list attributes where len(<list>) is a viable substitute. In CPython, list objects store their own length, so len() on a list runs in constant time. Using len(<list>) is also safer, since a separate counter can drift out of sync with the list.

  10. Moved some prediction- and sequencing-related methods into SRN that weren't already there.

  11. The SRN class now has an addContext() method. This is similar to Network.add() but takes a Layer instance argument and an optional hidden layer name. Call addContext() when adding a context layer associated with a hidden layer (default 'hidden').

  12. addContext() adds the layer to the network and also to a dictionary of (hidden layer name, context layer reference) items.

  13. Added SRN.copyHiddenToContext() which copies hidden layer activations to context layers automatically for all context layers added using addContext.

  14. Added more testing code with emphasis on exceptions and file IO.

  15. Added a method verifyArchitecture() to Network. verifyArchitecture() is called in train(). propagate() no longer checks to make sure there is a valid network. verifyArchitecture() must be called to check this.

  16. Added testing code for verifyArchitecture() at the end of conx.
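
As a hedged illustration of item 6 above, the exception extension classes might look like the following minimal sketch. The setActivations() helper is hypothetical, invented here only to show the (description, offending values) convention; it is not Conx's actual code.

```python
class LayerError(AttributeError):
    """Layer-related problem."""

class NetworkError(AttributeError):
    """Network-related problem."""

class SRNError(AttributeError):
    """SRN-specific problem."""

def setActivations(layerName, size, values):
    # Hypothetical helper: raise with a tuple whose first item describes
    # the problem and whose second holds the offending value(s).
    if len(values) != size:
        raise LayerError("Wrong number of activations for layer.",
                         (layerName, len(values)))

try:
    setActivations("hidden", 3, [0.1, 0.2])
except LayerError as error:
    # error.args is the (description, offending values) tuple
    message, offending = error.args
```

Because all three classes extend AttributeError, existing code that catches AttributeError keeps working while new code can catch the more specific class.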
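
Items 11-13 can be sketched as follows. This is a minimal illustration, not Conx's actual implementation: the Layer and SRN classes here are stripped-down stand-ins, assuming only the (hidden layer name, context layer reference) dictionary described above.

```python
class Layer:
    # Stripped-down stand-in for Conx's Layer class.
    def __init__(self, name, size):
        self.name = name
        self.size = size
        self.activation = [0.0] * size

class SRN:
    def __init__(self):
        self.layers = {}          # layer name -> layer
        self.contextLayers = {}   # hidden layer name -> context layer

    def add(self, layer):
        self.layers[layer.name] = layer

    def addContext(self, layer, hiddenLayerName='hidden'):
        # Like Network.add(), but also record which hidden layer
        # this context layer belongs to (default 'hidden').
        self.add(layer)
        self.contextLayers[hiddenLayerName] = layer

    def copyHiddenToContext(self):
        # Copy each hidden layer's activations into its context layer,
        # automatically, for every layer added with addContext().
        for hiddenName, context in self.contextLayers.items():
            context.activation = self.layers[hiddenName].activation[:]

net = SRN()
net.add(Layer('hidden', 3))
net.addContext(Layer('context', 3))
net.layers['hidden'].activation = [0.1, 0.2, 0.3]
net.copyHiddenToContext()
```

The dictionary lets one call to copyHiddenToContext() service any number of hidden/context pairs without the caller naming them.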

Previous Changes

  1. Removed SRN specific code from Network class. (Sequencing and prediction)

  2. Added activationSet and targetSet flags to Layer class.

  3. Open question: should the targetSet flag have to be reset before copying or setting another target?

  4. Added verifyInputs() to propagate() for error checking.

  5. Added verifyTargets() to backprop() for error checking.

  6. Added many exceptions.
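
A minimal sketch of how the targetSet flag and the verification step above might fit together. The class internals and the verifyTargets() signature are assumptions made for illustration, not Conx's actual code; LayerError follows the exception convention from the change list.

```python
class LayerError(AttributeError):
    """Per the change list, errors extend AttributeError."""

class Layer:
    def __init__(self, name, size):
        self.name = name
        self.target = [0.0] * size
        self.activationSet = 0   # flag: activations have been set
        self.targetSet = 0       # flag: targets have been set

    def copyTargets(self, values):
        self.target = list(values)
        self.targetSet = 1

def verifyTargets(outputLayers):
    # The kind of check backprop() might run before using targets:
    # every output layer must have had its targets set.
    for layer in outputLayers:
        if not layer.targetSet:
            raise LayerError("Targets not set for layer.", layer.name)
```

verifyInputs() in propagate() would follow the same pattern using the activationSet flag.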

Most Recent Changes

  1. Inputs and targets can be passed to step() as keyword arguments of the form: <layer name> = <input/target list>

  2. When called on an SRN, step() automatically copies hidden to context or clears the context, depending on the argument initContext = {0, 1}.

  3. Fixed plugin NN brains.

  4. Activations outside the range 0-1 produce a warning

  5. Targets outside the range 0-1 throw an exception

  6. setInputs() and setOutputs() now check that their arguments have the form [[num,num,..],[...],...]

  7. step() does not increment the epoch counter

  8. Renamed addSRNLayer() -> addThreeLayers()

  9. Reimplemented the check for multiple targets as a warning

  10. Buried copyHiddenToContext() in backprop()

  11. New "kind" attribute for layers

  12. Added prop_from() method

  13. connect() enforces layer and connection order

  14. Fixed a weight matrix problem in getWeights()

  15. Removed symmetric support
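
Items 1-2 above can be sketched as follows. This is a guess at the calling convention only; the bound dictionary, the hard-coded layer sizes, and the clearContext() helper are illustrative stand-ins for whatever Conx actually does internally.

```python
class SRN:
    def __init__(self):
        self.hidden = [0.9, 0.1]    # illustrative hidden activations
        self.context = [0.5, 0.5]   # illustrative context activations
        self.bound = {}             # layer name -> pattern bound this step

    def clearContext(self):
        self.context = [0.0] * len(self.context)

    def copyHiddenToContext(self):
        self.context = self.hidden[:]

    def step(self, initContext=1, **args):
        # initContext = 1 resets the context; 0 carries the hidden
        # activations over into the context for the next step.
        if initContext:
            self.clearContext()
        else:
            self.copyHiddenToContext()
        # Each keyword argument names a layer; its value is that
        # layer's input or target list for this step.
        for layerName, pattern in args.items():
            self.bound[layerName] = list(pattern)

net = SRN()
net.step(input=[0, 1], output=[1], initContext=1)
```

Keyword arguments make the call site self-documenting: the layer each pattern is bound to is visible in the call itself.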

Numeric Conx

Conx now employs Numeric functions wherever possible, and should be noticeably faster as a result. I would estimate that moving Conx completely into C could make the code faster still, by a factor of two or more, but using Numeric functions from Python preserves the ease of extensibility that Python offers while now running at a reasonable speed.
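
The kind of change described above, sketched here with NumPy (Numeric's modern successor, since Numeric itself is long obsolete): an explicit Python inner loop is replaced by a single call into compiled code. The function names are illustrative, not Conx's.

```python
import numpy as np

def propagate_loop(weights, activations):
    # Pure-Python inner loop, as in the pre-Numeric code: every
    # multiply-add pays Python interpreter overhead.
    result = []
    for row in weights:
        total = 0.0
        for w, a in zip(row, activations):
            total += w * a
        result.append(total)
    return result

def propagate_numeric(weights, activations):
    # One call does the same weighted sums in compiled code.
    return np.dot(np.asarray(weights), np.asarray(activations))

weights = [[0.1, 0.2], [0.3, 0.4]]
acts = [1.0, 2.0]
```

Both functions compute the same weighted sums; the vectorized version simply moves the loop out of the interpreter.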

Road Map to C

After doing more research on Friday, I have found two options for moving conx code to C(++). If preserving the OO structure of the program is important, then Boost.Python is probably the way to go. It is something like SWIG for C++ and should allow straightforward implementation of Python classes in C++. The second option is to create a Network type exclusively in C. This would mean moving all the core functionality into C code (not just single methods, as I was trying before). The benefit is that those methods would become C code operating on C data structures, eliminating the Python overhead and allowing the best optimization. We could then wrap this new type in a Python class providing some of the functionality (error checking, etc.) left out of the C code, and subclass it to create the SRN-specific code. The Layer and Connection classes would no longer exist, being folded into the Network type implementation.

C++ and Boost Pros

C++ and Boost Cons

C Type Extension Pros

C Type Extension Cons