Commit Graph

58 Commits

SHA1 Message Date
3955ddd888 Implement Inception block D 2024-05-27 21:30:42 +02:00
df47a31f36 Rename dim2d to shape2d 2024-05-27 21:14:51 +02:00
94a16b4352 Add padding to max pooling 2024-05-26 19:03:10 +02:00
4a67b708f0 Add padding to avg pooling 2024-05-26 18:54:12 +02:00
cba177e417 Add getOutputDims to 2d layers 2024-05-26 14:28:43 +02:00
10e73638b6 Add non square pooling and batch norm tests 2024-05-20 22:16:00 +02:00
6dca8ccd3c Unify 2d layer naming 2024-05-20 16:23:58 +02:00
74098b24e3 Add support for non square matrices 2024-05-20 15:20:43 +02:00
6f8b5f4081 Rename batchnorm 2024-05-20 13:05:48 +02:00
33d4a43dca Update concat layer 2024-05-19 20:21:13 +02:00
4a1c4a5f91 Add epsilon param to batch norm 2024-05-19 15:13:22 +02:00
5c8d3f7e25 Compute mean and variance 2024-04-29 20:55:11 +02:00
f60d62f6bd Implement batch norm layer 2024-04-28 19:58:00 +02:00
f17debc244 Implement getOutputSize and getInputSize for seq layers 2024-04-22 20:31:58 +02:00
a32c737785 Allocate activation on heap 2024-04-22 18:59:16 +02:00
d08567a563 Fix weight bias parsing and better error logging 2024-04-20 18:36:53 +02:00
ecf7416f8e Rework padding size setting 2024-04-20 16:31:28 +02:00
9fb9d7e8e1 Implement getting layer, weights and biases 2024-04-16 19:09:41 +02:00
18522c2dea Cleanup and refactor 2024-04-11 22:52:41 +02:00
b49dddf34a Improve softmax numerical stability 2024-04-08 23:25:46 +02:00
9482d7bc43 Add model predict test 2024-03-22 22:31:32 +01:00
90fb104dae Implement output layer 2024-03-21 23:07:46 +01:00
af6838e8ae Initial model implementation 2024-03-20 22:31:39 +01:00
6f4cdf3792 Implement avg pool test 2024-03-20 21:57:22 +01:00
dfff0360d9 Implement max pooling test 2024-03-20 21:44:04 +01:00
ef63cbd9f1 Implement avg pooling 2024-03-19 22:33:43 +01:00
a0fc1b00ae Implement max pooling layer 2024-03-19 22:04:58 +01:00
b6c4b7d2ae Refactor layers 2024-03-19 21:35:05 +01:00
8d14b74f66 Implement Add layer 2024-03-18 20:37:13 +01:00
d9c6c663c8 Rename ILayer to WeightedLayer 2024-03-18 20:36:52 +01:00
6cf604423a Combine padding and conv kernel 2024-03-18 19:53:40 +01:00
aac0c3a826 Implement concat layer 2024-03-17 21:38:29 +01:00
42d646750b Abstract activation and implement softmax 2024-03-17 18:37:15 +01:00
0c22fac64e Add toplevel CUDANet namespace 2024-03-17 16:08:53 +01:00
dc86cddeb7 Use tiling shmem for mat vec mul kernel 2024-03-15 23:33:09 +01:00
88f7fff217 Add prefix to guards 2024-03-13 22:23:23 +01:00
7157a27e56 Add documentation comments 2024-03-12 21:50:06 +01:00
708164e4d0 Implement simple input layer 2024-03-12 21:16:46 +01:00
9d91896f13 Change forward function to return output pointer 2024-03-12 20:50:49 +01:00
d2ab78fbc7 Add Kernels namespace 2024-03-11 21:04:23 +01:00
e0178e2d5c Cleanup and refactor 2024-03-11 20:39:44 +01:00
f3112311da Make conv2d work again 2024-03-10 19:13:22 +01:00
d177a67cd6 Add bias to conv2d 2024-03-09 23:03:23 +01:00
4b6fcbc191 Implement simple test for host conv2d 2024-03-08 23:12:04 +01:00
69ccba2dad Start conv test implementation 2024-03-07 22:03:05 +01:00
fc2c1616b4 Initial cpu conv implementation 2024-03-07 21:24:59 +01:00
f4257afd5a Remove cublas dependency 2024-03-05 18:41:35 +01:00
cfc5c46d5e Initialize conv2d layer 2024-03-04 22:16:03 +01:00
f37320594a Add activations enum 2024-03-03 15:24:54 +01:00
48ba09b28d Format source code using clang-format 2024-02-27 18:52:12 +01:00
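Commit b49dddf34a improves softmax numerical stability. The standard trick (an assumption here; the repo's actual kernel may differ) is to subtract the row maximum before exponentiating, so the largest exponent is zero and `exp()` cannot overflow. A minimal host-side C++ sketch:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Numerically stable softmax: subtracting the max makes the largest
// exponent 0, so exp() never overflows even for large logits.
std::vector<float> softmax(const std::vector<float>& logits) {
    float maxVal = *std::max_element(logits.begin(), logits.end());
    std::vector<float> out(logits.size());
    float sum = 0.0f;
    for (size_t i = 0; i < logits.size(); ++i) {
        out[i] = std::exp(logits[i] - maxVal);
        sum += out[i];
    }
    for (float& v : out) v /= sum;  // normalize to a probability distribution
    return out;
}
```

Without the max subtraction, logits around 1000 would produce `inf` from `exp()` and the result would be NaN after normalization.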
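Commits f60d62f6bd, 5c8d3f7e25, and 4a1c4a5f91 build up the batch norm layer: compute the mean and variance over the batch, then normalize with an epsilon for stability. A host-side sketch of that forward pass for one channel (function and parameter names are hypothetical, not the library's API):

```cpp
#include <cmath>
#include <vector>

// Batch normalization over one feature channel: normalize by the batch
// mean and variance, then apply the learned scale (gamma) and shift
// (beta). Epsilon guards against division by zero when variance is tiny.
std::vector<float> batchNorm(const std::vector<float>& x,
                             float gamma, float beta,
                             float epsilon = 1e-5f) {
    const float n = static_cast<float>(x.size());
    float mean = 0.0f;
    for (float v : x) mean += v;
    mean /= n;
    float var = 0.0f;
    for (float v : x) var += (v - mean) * (v - mean);
    var /= n;
    const float invStd = 1.0f / std::sqrt(var + epsilon);
    std::vector<float> out(x.size());
    for (size_t i = 0; i < x.size(); ++i)
        out[i] = gamma * (x[i] - mean) * invStd + beta;
    return out;
}
```

With `gamma = 1` and `beta = 0` the output has approximately zero mean and unit variance, which is what the epsilon parameter added in 4a1c4a5f91 keeps well-defined when the batch variance approaches zero.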