100 Commits

SHA1 Message Date
98ad4ac760 Migrate conv2d layer 2024-09-10 21:48:47 +02:00
757584544c Migrate conv2d layer 2024-09-10 21:19:50 +02:00
74f49d6a00 Migrate output layer 2024-09-10 19:20:00 +02:00
f7b525e494 Migrate input layer 2024-09-10 19:11:21 +02:00
fe7c16ac36 Migrate concat layer 2024-09-09 22:16:22 +02:00
a0665fb05c Migrate max pooling 2024-09-09 21:48:36 +02:00
76e5225001 Migrate avg pooling 2024-09-09 21:36:13 +02:00
75475790ac Migrate Dense layer 2024-09-08 13:36:53 +02:00
0dca8348bd Migrate Activation layer 2024-09-08 12:49:13 +02:00
f8220f0ec1 Restructure cuda backend 2024-09-05 22:23:47 +02:00
bc9bff10cd Load running mean and var from weight file 2024-08-25 19:33:33 +02:00
9704d0d53e Add running mean and running var to batchnorm 2024-08-25 19:05:10 +02:00
a54ffa8b20 Change inception prefixes 2024-06-05 02:04:15 +02:00
8168f02f58 Add adaptive avg pooling 2024-05-30 17:17:31 +02:00
479c1119e7 Fix module layer issues 2024-05-30 13:08:13 +02:00
8ac2da004c Fix adding module layers 2024-05-29 21:02:26 +02:00
098fd65074 Add model summary 2024-05-28 18:42:58 +02:00
3955ddd888 Implement Inception block D 2024-05-27 21:30:42 +02:00
df47a31f36 Rename dim2d to shape2d 2024-05-27 21:14:51 +02:00
94a16b4352 Add padding to max pooling 2024-05-26 19:03:10 +02:00
4a67b708f0 Add padding to avg pooling 2024-05-26 18:54:12 +02:00
cba177e417 Add getOutputDims to 2d layers 2024-05-26 14:28:43 +02:00
10e73638b6 Add non square pooling and batch norm tests 2024-05-20 22:16:00 +02:00
6dca8ccd3c Unify 2d layer naming 2024-05-20 16:23:58 +02:00
74098b24e3 Add support for non square matrices 2024-05-20 15:20:43 +02:00
6f8b5f4081 Rename batchnorm 2024-05-20 13:05:48 +02:00
870b121c2a Switch model and module members to protected 2024-05-19 20:21:54 +02:00
33d4a43dca Update concat layer 2024-05-19 20:21:13 +02:00
e23ffe1ee1 Remove default constructor from Module 2024-05-19 16:16:57 +02:00
8bb2562c4c Add main include file 2024-05-19 16:16:14 +02:00
4a1c4a5f91 Add epsilon param to batch norm 2024-05-19 15:13:22 +02:00
c84f58b97c Implement modules 2024-05-15 18:52:31 +02:00
7c48ed86d2 Implement vector variance function 2024-05-14 21:58:23 +02:00
da8f3167cb Add utils vector mean function 2024-05-14 21:41:18 +02:00
5c8d3f7e25 Compute mean and variance 2024-04-29 20:55:11 +02:00
0ab623fa23 Implement vector mean calculation 2024-04-28 22:04:15 +02:00
f60d62f6bd Implement batch norm layer 2024-04-28 19:58:00 +02:00
3320f610db Add imagenet class map 2024-04-28 19:03:51 +02:00
c2acad151b Implement simple model validation 2024-04-22 20:57:40 +02:00
f17debc244 Implement getOutputSize and getInputSize for seq layers 2024-04-22 20:31:58 +02:00
a32c737785 Allocate activation on heap 2024-04-22 18:59:16 +02:00
942ee6a32b Add layer name to vector 2024-04-21 12:20:02 +02:00
5e663b9029 Fix bias in conv layer 2024-04-20 19:09:00 +02:00
d08567a563 Fix weight and bias parsing and improve error logging 2024-04-20 18:36:53 +02:00
ecf7416f8e Rework padding size setting 2024-04-20 16:31:28 +02:00
9fb9d7e8e1 Implement getting layer, weights and biases 2024-04-16 19:09:41 +02:00
f4ae45f867 Start implementing weights import 2024-04-15 22:17:48 +02:00
18522c2dea Cleanup and refactor 2024-04-11 22:52:41 +02:00
4b9d123e94 Implement device vector utils 2024-04-11 22:22:33 +02:00
710a33bdde Move softmax partial kernels to matmul 2024-04-11 22:01:47 +02:00