Receptron


The receptron is a neuromorphic data processing model that generalizes the traditional perceptron by incorporating non-linear interactions between inputs. Unlike the classical perceptron, which relies on a linear combination of constant weights, the receptron exploits the complexity of physical substrates, such as the electrical conduction properties of nanostructured materials or optical speckle fields, to perform classification tasks. The receptron bridges unconventional computing and neural network principles, enabling solutions that do not require the iterative training procedures typical of artificial neural networks based on the perceptron model.

Algorithm

The receptron is an algorithm for supervised learning of binary classifiers, i.e., a classification algorithm that makes its predictions based on a predictor function combining a set of weights with the feature vector. The mathematical model is based on a sum of inputs with non-linear interactions:

S = Σᵢ wᵢ(x₁, …, xₙ) xᵢ    (1)

where x₁, …, xₙ ∈ {0, 1} are the Boolean inputs and the wᵢ(x₁, …, xₙ) are non-linear weight functions depending on all the inputs. This non-linearity typically makes the system extremely complex and allows for the solution of problems not solvable through the simpler rule of a linear system, such as the perceptron or the McCulloch–Pitts neuron, which is based on a sum with constant weights:

S = Σᵢ wᵢ xᵢ    (2)

where the wᵢ are constant real values. A consequence of this simplicity is the limitation to linearly separable functions, which necessitates multi-layer architectures and training algorithms such as backpropagation.
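The limitation of the linear rule can be checked directly. The following sketch (illustrative only; the grid of candidate weights is an arbitrary choice) brute-forces a single perceptron over the XOR truth table and finds no combination of constant weights and threshold that reproduces it, since XOR is not linearly separable.

```python
import itertools

def perceptron(x, w, th):
    # Linear sum with constant weights, followed by thresholding
    s = sum(wi * xi for wi, xi in zip(w, x))
    return 1 if s > th else 0

xor = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

# Brute-force search over a coarse grid of weights and thresholds.
grid = [i / 2 for i in range(-4, 5)]  # -2.0, -1.5, ..., 2.0
solutions = [
    (w1, w2, th)
    for w1, w2, th in itertools.product(grid, repeat=3)
    if all(perceptron(x, (w1, w2), th) == y for x, y in xor.items())
]
print(solutions)  # [] — no linear unit realizes XOR
```

The empty result holds for any grid, since no line in the (x₁, x₂) plane separates the two XOR classes.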
As in the perceptron case, the summation in Eq. 1 gives rise to the activation of the receptron output through a thresholding process,

y = 1 if S > th,  y = 0 otherwise    (3)

where th is a constant threshold parameter. Equation 3 can equivalently be written using the Heaviside step function as y = Θ(S − th).
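A minimal Python sketch of Eq. 1 combined with the thresholded output of Eq. 3 is given below. The weight functions here are hypothetical, hand-picked values for illustration: making w₁ depend on x₂ introduces the non-linear interaction between inputs, and with th = 0.5 the unit reproduces XOR.

```python
def heaviside(z):
    # Θ(z): 1 for z > 0, else 0
    return 1 if z > 0 else 0

def receptron(x, weight_fns, th):
    # Eq. 1: each weight is a function of the whole input vector
    s = sum(w(x) * xi for w, xi in zip(weight_fns, x))
    # Eq. 3: thresholded activation, y = Θ(S - th)
    return heaviside(s - th)

# Hypothetical weight functions chosen by hand for illustration:
# w1 depends on x2, so the inputs interact non-linearly.
w1 = lambda x: 1 - 2 * x[1]
w2 = lambda x: 1

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, receptron(x, [w1, w2], th=0.5))  # 0, 1, 1, 0 (XOR)
```

In a physical receptron these weight functions are not programmed but emerge from the substrate's response; the code only mirrors the mathematical form.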
The weight functions wᵢ can be written with a finite number of parameters, simplifying the model representation. One can Taylor-expand wᵢ and use the idempotency of Boolean variables (xᵢ² = xᵢ for xᵢ ∈ {0, 1}) so that S can be written as

S = Σᵢ Wᵢ xᵢ + Σ_{i<j} W_{ij} xᵢ xⱼ + Σ_{i<j<k} W_{ijk} xᵢ xⱼ xₖ + … + W_{12…n} x₁ x₂ ⋯ xₙ    (4)

where the coefficients Wᵢ, W_{ij}, …, W_{12…n} are independent parameters that can be seen as the components of tensors of increasing rank. The sum in Eq. 4 reduces to the perceptron case when all the cross terms W_{ij}, W_{ijk}, … vanish. If one considers the case n = 2, one gets:

S = W₁ x₁ + W₂ x₂ + W₁₂ x₁ x₂    (5)

In the perceptron case, the vanishing of W₁₂ implies linearity. In the receptron case W₁₂ ≠ 0, meaning that the superposition principle is no longer valid, the cross term being responsible for the more complex non-linear interaction between the inputs.
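The role of the cross term in the two-input expansion can be illustrated with a short sketch. The parameter values below (W₁ = W₂ = 1, W₁₂ = −2, th = 0.5) are hand-picked for illustration: with the cross coefficient set to zero the unit degenerates to a perceptron, while the chosen non-zero value makes it compute XOR.

```python
def receptron2(x1, x2, W1, W2, W12, th):
    # Second-order expansion for n = 2; a non-zero W12 breaks superposition
    s = W1 * x1 + W2 * x2 + W12 * x1 * x2
    return 1 if s > th else 0

# Hand-picked parameters (assumed for illustration): the cross term
# W12 = -2 suppresses the output when both inputs are active.
outputs = [receptron2(x1, x2, 1, 1, -2, 0.5)
           for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]]
print(outputs)  # [0, 1, 1, 0] — the XOR truth table
```

Setting W12 = 0 in the same call yields [0, 1, 1, 1] (logical OR), recovering perceptron-like linear behavior.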

Design and implementations

1. Electrical Receptron

Substrate: Nanostructured and nanocomposite films. These films form disordered networks of nanoparticles with resistive switching and non-linear electrical conduction.

2. Optical Receptron

Substrate: Optical speckle fields generated by random interference of light emerging from a disordered medium illuminated by a laser or coherent radiation.

Key features

Physical Substrate Computing: The receptron does not require digital training; instead, it exploits the natural complexity of materials to perform computations.
Non-Linear Separability: Unlike traditional perceptrons, which fail on problems like the XOR function, the receptron can solve such tasks due to its inherent non-linearity.
Training-Free Operation: Classification is achieved through the physical system's response rather than iterative weight adjustments, reducing computational overhead.