Sharp-Grad is a lightweight automatic differentiation (autograd) engine written in C#. It supports basic operations and neural network components, making it suitable for educational purposes and, eventually, research.
Below is an example of how to use Sharp-Grad for automatic differentiation and neural network operations.
```csharp
using System;
using SharpGrad;
using SharpGrad.DifEngine;
using SharpGrad.NN;

class Program
{
    static void Main(string[] args)
    {
        // Define values
        Value<float> a = new Value<float>(1.5f, "a");
        Value<float> b = new Value<float>(2.0f, "b");
        Value<float> c = new Value<float>(6.0f, "c");

        // Perform operations
        Value<float> d = (a + b * c);
        Value<float> e = d / new Value<float>(2.0f, "2");
        Value<float> f = e.Pow(new Value<float>(2.0f, "2"));
        Value<float> g = f.ReLU();

        // Backpropagation
        g.Grad = 1.0f;
        g.Backpropagate();

        // Output gradients
        Console.WriteLine($"Gradient of a: {a.Grad}");
        Console.WriteLine($"Gradient of b: {b.Grad}");
        Console.WriteLine($"Gradient of c: {c.Grad}");

        // More operations
        Value<float> j = new Value<float>(0.5f, "j");
        Value<float> k = j.Tanh();
        Value<float> l = k.Sigmoid();
        Value<float> m = l.LeakyReLU(1.0f);

        // Backpropagation
        m.Grad = 1.0f;
        m.Backpropagate();

        // Output gradient and data
        Console.WriteLine($"Gradient of j: {j.Grad}");
        Console.WriteLine($"Data of m: {m.Data}");
    }
}
```
- **Value Initialization:**
  - Values `a`, `b`, and `c` are initialized with `1.5`, `2.0`, and `6.0`, respectively.
- **Operations:**
  - Arithmetic operations are performed to create a computation graph (a minimal, standalone sketch of how such a graph can be built follows this list).
  - `d` is computed as $a + b \times c$.
  - `e` is computed as $d / 2$.
  - `f` is computed as $e^2$.
  - `g` is computed by applying the ReLU activation function to `f`.
- **Backpropagation:**
  - Gradients are computed by setting the gradient of `g` to `1.0` and calling `Backpropagate` (a hand-worked check of these gradients follows this list).
- **Additional Operations:**
  - `j` is initialized with `0.5`.
  - `k` is computed by applying the Tanh activation function to `j`.
  - `l` is computed by applying the Sigmoid activation function to `k`.
  - `m` is computed by applying the LeakyReLU activation function to `l` (the standard definitions of these activations are recalled after this list).
- **Backpropagation and Outputs:**
  - Gradients for `j` are computed by setting the gradient of `m` to `1.0` and calling `Backpropagate`.
  - The resulting gradients and data are printed to the console.
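As a sanity check on the first backpropagation pass, the gradients that `Backpropagate` should produce can be derived by hand with the chain rule. This assumes standard reverse-mode semantics, i.e. each `Grad` holds $\partial g / \partial x$ after seeding `g.Grad = 1.0`:

$$
d = a + b\,c = 1.5 + 2 \cdot 6 = 13.5, \qquad
e = \frac{d}{2} = 6.75, \qquad
f = e^2 = 45.5625, \qquad
g = \mathrm{ReLU}(f) = f \ (\text{since } f > 0)
$$

$$
\frac{\partial g}{\partial a} = 2e \cdot \tfrac{1}{2} = e = 6.75, \qquad
\frac{\partial g}{\partial b} = 2e \cdot \tfrac{1}{2} \cdot c = e\,c = 40.5, \qquad
\frac{\partial g}{\partial c} = 2e \cdot \tfrac{1}{2} \cdot b = e\,b = 13.5
$$

The printed gradients should therefore be (up to floating-point formatting) `6.75`, `40.5`, and `13.5`.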
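The second part of the example only chains standard activation functions; their usual definitions are recalled below. Reading the `1.0f` argument of `LeakyReLU` as the negative-slope coefficient $\alpha$ is an assumption; with $\alpha = 1$ the LeakyReLU reduces to the identity.

$$
\tanh(x) = \frac{e^{x} - e^{-x}}{e^{x} + e^{-x}}, \qquad
\sigma(x) = \frac{1}{1 + e^{-x}}, \qquad
\mathrm{LeakyReLU}_{\alpha}(x) =
\begin{cases}
x, & x > 0 \\
\alpha x, & x \le 0
\end{cases}
$$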
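To make the computation-graph idea concrete, here is a minimal, self-contained C# sketch of a scalar reverse-mode node with operator overloading and a topological backward pass. The `Scalar` class and the `Demo` program are purely illustrative and are not Sharp-Grad's actual implementation or API.

```csharp
using System;
using System.Collections.Generic;

// Illustrative scalar autograd node (not Sharp-Grad's internals):
// each node stores its value, its gradient, and a closure that
// pushes its gradient back to its parents.
class Scalar
{
    public double Data;
    public double Grad;
    private readonly Scalar[] _parents;
    private Action _backward = () => { };

    public Scalar(double data, params Scalar[] parents)
    {
        Data = data;
        _parents = parents;
    }

    public static Scalar operator +(Scalar a, Scalar b)
    {
        var node = new Scalar(a.Data + b.Data, a, b);
        node._backward = () =>
        {
            a.Grad += node.Grad;          // d(a+b)/da = 1
            b.Grad += node.Grad;          // d(a+b)/db = 1
        };
        return node;
    }

    public static Scalar operator *(Scalar a, Scalar b)
    {
        var node = new Scalar(a.Data * b.Data, a, b);
        node._backward = () =>
        {
            a.Grad += b.Data * node.Grad; // d(a*b)/da = b
            b.Grad += a.Data * node.Grad; // d(a*b)/db = a
        };
        return node;
    }

    // Reverse-mode pass: topologically sort the graph reachable from
    // this node, seed its gradient with 1, then apply each node's
    // local backward rule from the output towards the inputs.
    public void Backward()
    {
        var order = new List<Scalar>();
        var visited = new HashSet<Scalar>();
        void Visit(Scalar node)
        {
            if (!visited.Add(node)) return;
            foreach (var parent in node._parents) Visit(parent);
            order.Add(node);
        }
        Visit(this);

        Grad = 1.0;
        for (int i = order.Count - 1; i >= 0; i--) order[i]._backward();
    }
}

class Demo
{
    static void Main()
    {
        var a = new Scalar(1.5);
        var b = new Scalar(2.0);
        var c = new Scalar(6.0);
        var d = a + b * c;   // same first step as the Sharp-Grad example
        d.Backward();
        Console.WriteLine($"da = {a.Grad}, db = {b.Grad}, dc = {c.Grad}"); // 1, 6, 2
    }
}
```

Running `Demo` prints `da = 1, db = 6, dc = 2`, matching $\partial d/\partial a = 1$, $\partial d/\partial b = c$, and $\partial d/\partial c = b$.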
### Contributing
If you wish to contribute to Sharp-Grad, please follow the guidelines below:
- Fork the repository.
- Create a new branch for your feature or bugfix.
- Commit your changes and push to the branch.
- Create a pull request detailing your changes.
### License
Sharp-Grad is licensed under the MIT License. See the LICENSE file for more details.
### References
- [Sharp-Grad GitHub Repository](https://github.com/Leonardo16AM/Sharp-Grad)