SoftwareModule
Related: Architecture, CodeProgrammingLanguage
For creating ConceptNetworks.
Execution
(At this point, all modules are single-threaded and run sequentially in each engine cycle, processed in the order of the ID of their upper-left neuron. This may change in the future and should not be relied upon.)
File
The layout and content of modules are included in the XML SoftwareNetworkFile when it is saved.
Programming: Any variables within a module declared as “Public” will automatically be saved and restored in the XML network file unless explicitly excluded with an “[XmlIgnore]” directive.
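As a rough sketch of how this looks in code (the class and field names below are hypothetical; ModuleBase and [XmlIgnore] are the names used on this page):

```csharp
using System.Collections.Generic;
using System.Xml.Serialization;

public class ModuleExample : ModuleBase
{
    // Public members are saved to and restored from the XML network file.
    public int scanWidth = 10;

    // Public, but explicitly excluded from the saved file.
    [XmlIgnore]
    public List<float> scratchBuffer = new List<float>();

    // Private members are never serialized.
    private int cycleCount;
}
```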
Intention
Software shortcuts:
- Convenience
- While all neural functions could theoretically be created in synapses, there are many which are much more convenient to implement in code.
- Pre-Build
- Implementing functionality for which neural implementations are yet to be determined.
- Example: We don’t know how binocular depth perception works in the brain, but within a module we can use trigonometry to perform similar functionality (see the sketch after this list).
- SoftwareModuleEfficiency
- And maybe keeping pre-built modules
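As a concrete illustration of the Pre-Build example above, the distance to a point can be recovered from the two eye angles with basic trigonometry. This is only a sketch of the math, not the actual Module2DVision code, and the names are made up for illustration:

```csharp
using System;

public static class BinocularDepth
{
    // Estimate the distance to a point given the spacing between the two eyes and
    // the inward (convergence) angle, in radians, at which each eye sees the point.
    // With both eyes on a common baseline, distance = spacing / (tan(left) + tan(right)).
    public static double EstimateDistance(double eyeSpacing, double leftAngle, double rightAngle)
    {
        double divergence = Math.Tan(leftAngle) + Math.Tan(rightAngle);
        if (Math.Abs(divergence) < 1e-9)
            return double.PositiveInfinity; // lines of sight are (nearly) parallel: point is very far away
        return eyeSpacing / divergence;
    }
}
```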
Methods
(API)
Primary:
- “Initialize”
- Is run only once when the module is first added to a network or if requested by the user.
- (The Initialize method is called when the module is added to a network, whenever the “Initialize” command is selected from the context menu, or when the Neuron Engine is initialized.)
- (Note that the Initialize method is not called when the network is loaded from a network XML file as this would change the state of the network which is otherwise unaltered by Saving and Opening.)
- Within the Initialize method, a module might allocate a slew of synapses.
- “Fire”
- Is run once for each cycle of the Neuron Engine.
- Within the Fire method, a module might send signals to or receive signals from other functionality within the computer. (For example: a robotic module might sample some neuron values and then send the appropriate signals to various robotic servos to perform some action.)
- The Fire method can sample the state of the neurons in an input image and set values for its own neurons directly.
Secondary:
- GetNeuron
- GetNeuronAt
- Can return a neuron object for any neuron in the network
- SetupAfterLoad
- If you need to reinitialize something whenever a file is loaded
- SetupBeforeSave
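A minimal module skeleton tying the primary methods together. ModuleBase, Initialize, Fire, and GetNeuronAt are the names used on this page; the Neuron members used inside (Id, LastCharge, SetValue, AddSynapse) are assumptions for illustration and may not match the actual class:

```csharp
public class ModuleMinimal : ModuleBase
{
    // Runs once when the module is first added to a network (or when
    // re-initialization is requested). A typical use is allocating synapses.
    public override void Initialize()
    {
        for (int x = 0; x < 4; x++)
        {
            Neuron src = GetNeuronAt(x, 0);     // top row of this module
            Neuron dst = GetNeuronAt(x, 1);     // row directly below
            if (src != null && dst != null)
                src.AddSynapse(dst.Id, 1.0f);   // assumed signature: (targetId, weight)
        }
    }

    // Runs once per Neuron Engine cycle: sample some neurons, set others directly.
    public override void Fire()
    {
        Neuron input = GetNeuronAt(0, 0);
        Neuron output = GetNeuronAt(3, 1);
        if (input == null || output == null) return;

        if (input.LastCharge >= 1.0f)   // did the input fire last cycle? (member assumed)
            output.SetValue(1.0f);      // fire the output directly (member assumed)
    }
}
```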
Possibilities
No limit. (See 'Rights'.)
Modules can implement “rules” which govern the creation of connections for vast arrays of neurons.
Any rectangular cluster of neurons can be assigned to be a module.
Synapses of a module can connect to and/or from any neuron in the (UKS) network, including neurons of other modules.
Modules can also access the characteristics of other modules. In this way, for example, a module performing some vision function can set its dimensions as appropriate to the size of the input image and create synapses connecting the input image to its neurons.
For output, modules can convert neural pulses to servo controls for robotics or speech output.
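A hedged sketch of the kind of “rule” a vision-style module might apply in Initialize: locate an input-image module and wire every image neuron to the corresponding neuron in this module. The FindModule helper, the Width/Height properties, and the Neuron members shown are hypothetical stand-ins for illustration, not the actual API:

```csharp
public class ModuleVisionSketch : ModuleBase
{
    public override void Initialize()
    {
        // Locate the module holding the input image (hypothetical helper name).
        ModuleBase image = FindModule("ModuleImageFile");
        if (image == null) return;

        // Connect every image neuron to the neuron at the same position in this
        // module. (Width/Height and AddSynapse are assumed names; a real module
        // could also resize itself here to match the image dimensions.)
        for (int x = 0; x < image.Width; x++)
        {
            for (int y = 0; y < image.Height; y++)
            {
                Neuron src = image.GetNeuronAt(x, y);
                Neuron dst = GetNeuronAt(x, y);
                if (src != null && dst != null)
                    src.AddSynapse(dst.Id, 1.0f);
            }
        }
    }

    public override void Fire() { }
}
```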
Rights
The code within the module has full control of the network.
Modules have direct access to all the underlying resources of the simulator (adding/deleting/ modifying synapses, changing/reading neuron values).
UsageHint
Modules may expose public methods which are accessible to other modules (see 'File'). Generally, modules should communicate by setting neuron values rather than by method calls, because this makes the modules more generally useful. However, there are instances where the use of neurons would be tedious and direct method calls are more convenient.
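A sketch contrasting the two communication styles. The module, its neuron roles, and the SetValue member are assumptions for illustration, not an actual Brain Simulator module:

```csharp
using System.Xml.Serialization;

public class ModuleProducer : ModuleBase
{
    // Preferred style: publish the result as a neuron value any module can read.
    public override void Fire()
    {
        latestResult = ComputeSomething();
        Neuron resultNeuron = GetNeuronAt(0, 0);
        resultNeuron?.SetValue(latestResult);   // assumed Neuron member
    }

    // Alternative style: a public method another module may call directly when
    // pushing the value through neurons would be tedious.
    public float GetLatestResult() => latestResult;

    [XmlIgnore]
    public float latestResult;

    private float ComputeSomething() => 0.5f;   // placeholder computation
}
```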
- Module2DModel
- (Has SoftwareUIDialogCustom)
- Manages the content of the UKS to create persistent memory of Sallie’s two-dimensional surroundings. It automatically updates positions so they are correct relative to Sallie’s current position and orientation. Each object position has an associated confidence level (based on the accuracy of the distance estimate) and this is represented in the dialog by the length of white ends on segments. By temporarily adding segments or changing Sallie’s perceived position, Sallie can “imagine” surroundings with new objects or from a different point of view.
- Module2DSim
- (Has dialog)
- Maintains Sallie’s simulated surroundings. Sallie’s position and orientation are maintained from Move and Turn modules and directly output to Sallie’s various sensory modules. Detects collisions between Sallie and objects and moves objects based on assumptions of center of mass and friction.
- Module2DSmell
- Sallie’s limited sense of smell. This module has two rows of neurons representing input from two aroma sensors; each input is the strength of a field emitted by green objects. Within the simulator, only green objects have an aroma. It receives input directly from the Module2DSim module.
- Module2DTouch
- When one of Sallie’s arms contacts a simulated object, this module fires neurons indicating the position and angle of the touch and whether or not the touch was at the end of an object. This can update information in the 2DModel since the distance value of touch is much more accurate than visual depth perception.
- Module2DVision
- Updates information in the Module2DModel based on the content of the current field of view. Uses binocular information from Module2DSim to estimate distances using trigonometry.
- Module3DSim
- (Has Dialog)
- Allows Sallie to move about in a three-dimensional world. Only Sallie’s visual input is shown in the dialog display.
- ModuleArm
- Allows for control of Sallie’s individual arm positions in Module2DSim. Each instance of the module controls one arm. This forms the basis for Sallie’s ability to explore objects by touch along with Module2DTouch.
- ModuleAudible
- Works with the UKS to manage Phonemes, Words, and Phrases. This is analogous to the Module2DModel in that it manages UKS content related to Sallie’s surroundings.
- ModuleBase
- (Has Dialog)
- This is the base class from which all other modules are derived. Useful only from the programming interface.
- ModuleBehavior
- This is somewhat analogous to the brain’s cerebellum in that it can manage sequences of primitive physical behaviors. For example, to turn or move a specific amount, multiple smaller moves or turns may be required.
- ModuleBoundary
- Works with ModuleImageFile to find visual boundaries.
- ModuleCamera
- Analogous to a retina. Takes input from an attached video camera and sets neuron values to represent the colors seen at specific locations.
- ModuleColorComponent
- Converts a neuron with the Color model into neurons that have firing rates appropriate to the RGB and brightness components of the color.
- ModuleCommand
- (Has Dialog)
- Can read, edit, and execute test scripts. Each step can fire any labeled neurons in any module by name and can test for (and wait for) results.
- ModuleEvent
- Works with the UKS to manage memory of events, actions, and outcomes so that Sallie can learn which behaviors are best in various situations.
- ModuleFireOldest
- This will fire the neuron within the module which fired the longest ago. This could be useful in selecting things to forget—if a neuron hasn’t fired in a long time, it possibly doesn’t contain useful information.
- ModuleGoToDest
- This module demonstrates the use of imagination in determining a route. The module works with the 2D model to imagine the world from a different (remembered) point of view.
- ModuleGraph
- This predecessor to ModuleUKS implements parent/child, next, and other relationships in neurons.
- ModuleGrayScale
- This module works with ModuleImageFile module to generate a grayscale image from the component color values.
- SoftwareModuleHearing
- ModuleHearWords
- This module works with ModuleUKS to manage word and phrase storage.
- ModuleImageFile
- (Has Dialog)
- This module reads an image file in BMP or PNG format and sets color neuron values as appropriate.
- ModuleKBDebug
- (Has Dialog)
- This module records neuron firings in and out of the ModuleUKSN to create a transaction display.
- ModuleLife
- This allocates synapses to make an array of neurons act to simulate Conway’s Game of Life.
- ModuleLineFinder
- This module works with the Boundary module to find linear sections of a boundary.
- ModuleMotor
- Analogous to the brain’s motor cortex. This module consolidates Move and Turn functions.
- ModuleMove
- This module distributes the required motion to Module2DSim, Module2DModel, Module3DVision, and Module2DVision.
- ModuleMoveObject
- This module works with Module2DModel to allow Sallie to create a sequence of actions to achieve a goal.
- ModuleNavigate
- This module works in the 2D environment using ModuleUKS to allow Sallie to solve mazes using landmark memory.
- ModuleNull
- This module does nothing.
- ModuleSpeakPhonemes
- (Has Dialog)
- This module works with ModuleUKS to learn words in terms of underlying Phonemes.
- SoftwareModuleSpeaking
- ModuleSpeakWords
- Uses the Windows speech synthesizer to create speech in conjunction with ModuleUKS.
- ModuleSpeechIn
- Uses the Windows speech recognition system to create neuron firings from speech.
- ModuleSpeechOut
- Uses the Windows speech synthesizer to create speech from neuron firings.
- ModuleStrokeFinder
- (Has Dialog)
- Along with ModuleLineFinder, locates strokes within an image.
- ModuleTurn
- Distributes Sallie’s rotation to modules that need the information.
- ModuleUKS
- (Has Dialog)
- The abstract Universal Knowledge Store.
- ModuleUKSN
- The Universal Knowledge Store with the addition of a neuron interface.
- SoftwareModuleVision