Commit Graph

156 Commits

SHA1 Message Date
7157a27e56 Add documentation comments 2024-03-12 21:50:06 +01:00
708164e4d0 Implement simple input layer 2024-03-12 21:16:46 +01:00
9d91896f13 Change forward function to return output pointer 2024-03-12 20:50:49 +01:00
2518138ef8 Add strided conv2d test 2024-03-11 21:51:27 +01:00
a3973f0b21 Add activation to conv2d 2024-03-11 21:05:38 +01:00
d2ab78fbc7 Add Kernels namespace 2024-03-11 21:04:23 +01:00
e0178e2d5c Cleanup and refactor 2024-03-11 20:39:44 +01:00
f3112311da Make conv2d work again 2024-03-10 19:13:22 +01:00
6bbc036f62 Generate conv2d test results with pytorch 2024-03-10 19:11:48 +01:00
6ce45cc834 Remove gh actions lint 2024-03-09 23:21:36 +01:00
d177a67cd6 Add bias to conv2d 2024-03-09 23:03:23 +01:00
4f3c4f1afb Fix conv2d kernel dims 2024-03-09 22:55:37 +01:00
96804777ee Refactor conv2d test 2024-03-09 22:54:46 +01:00
fceef07a9b Remove linear activation kernel 2024-03-09 22:54:23 +01:00
a3d85a10fc Working conv2d forward 2024-03-09 21:08:16 +01:00
e51aabc2f2 Initial cuda conv kernel implementation 2024-03-08 23:35:54 +01:00
4b6fcbc191 Implement simple test for host conv2d 2024-03-08 23:12:04 +01:00
69ccba2dad Start conv test implementation 2024-03-07 22:03:05 +01:00
7e75943a6b Add stride to index calculation 2024-03-07 21:32:39 +01:00
fc2c1616b4 Initial cpu conv implementation 2024-03-07 21:24:59 +01:00
07f231a30b Switch padding kernel to row major 2024-03-05 21:04:11 +01:00
f4257afd5a Remove cublas dependency 2024-03-05 18:41:35 +01:00
98ad84c659 Add matrix math kernels 2024-03-05 17:38:46 +01:00
cfc5c46d5e Initialize conv2d layer 2024-03-04 22:16:03 +01:00
f37320594a Add activations enum 2024-03-03 15:24:54 +01:00
7e4460cc5e Implement padding kernel 2024-03-03 15:24:39 +01:00
019ccc33d9 Start implementing padding kernel 2024-02-29 22:21:48 +01:00
045359cca2 Remove not needed code 2024-02-29 22:21:32 +01:00
0f0e57b819 Update ReLU test 2024-02-28 21:45:52 +01:00
e267f08a2f Format code 2024-02-27 21:49:05 +01:00
19ee20ea66 Add dense sigmoid test 2024-02-27 21:48:08 +01:00
9747abe53e Fix dense layer forward prop 2024-02-27 21:47:46 +01:00
b1eb8b5806 Add activations test 2024-02-27 20:19:17 +01:00
48ba09b28d Format source code using clang-format 2024-02-27 18:52:12 +01:00
fb454de053 Update lint.yaml 2024-02-27 11:17:01 +01:00
5e1e0ed1d1 Initial activations implementation 2024-02-27 00:24:57 +01:00
6e99525ad0 Rename header files to .cuh 2024-02-26 19:53:46 +01:00
d3520d5b13 Add lint action 2024-02-21 22:11:48 +01:00
6436544a1b Remove codeql workflow 2024-02-21 22:06:52 +01:00
908b9fd375 Use gtest cmake vars 2024-02-21 21:54:36 +01:00
035f3b053b Rename files to .cu and fix IDX2C usage 2024-02-21 20:03:04 +01:00
15c0cd30f0 Add googletest step to codeql workflow 2024-02-19 22:38:29 +01:00
dbc206d18c Fix unit weight matrix test 2024-02-19 22:27:21 +01:00
02fc9e4e8b Use IDX2C macro properly 2024-02-19 22:26:54 +01:00
4b6fff9bfd Improve test dense 2024-02-18 13:12:49 +01:00
ee1a8cc6e6 Set up basic tests with gtest 2024-02-17 23:07:26 +01:00
f541e2f7f8 Set up cmake to compile library 2024-02-17 23:07:09 +01:00
ac18768297 Enable debug logging 2024-02-14 16:31:04 +01:00
5d6a241b5c Run on Ubuntu 20.04 2024-02-14 16:15:03 +01:00
9f8f0776af Create codeql.yml 2024-02-14 12:05:15 +01:00
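Several of the commits above revolve around two details: the `IDX2C` indexing macro (02fc9e4e8b "Use IDX2C macro properly", 035f3b053b "fix IDX2C usage") and a host-side conv2d reference used for testing (fc2c1616b4 "Initial cpu conv implementation", 7e75943a6b "Add stride to index calculation", d177a67cd6 "Add bias to conv2d"). A minimal sketch of both follows; `IDX2C` matches the column-major helper from the cuBLAS examples, while `conv2d_host` and all of its parameter names are hypothetical reconstructions (row-major, single channel, valid padding), not code from this repository.

```cpp
#include <cstddef>
#include <vector>

// Column-major indexing helper, as defined in the cuBLAS example code.
#define IDX2C(i, j, ld) (((j) * (ld)) + (i))

// Hypothetical host-side 2D convolution: single channel, valid padding,
// row-major layout (cf. 07f231a30b "Switch padding kernel to row major").
// Output size follows the usual formula: (dim - kdim) / stride + 1.
std::vector<float> conv2d_host(const std::vector<float>& in, int h, int w,
                               const std::vector<float>& k, int kh, int kw,
                               int stride, float bias = 0.0f) {
    int oh = (h - kh) / stride + 1;
    int ow = (w - kw) / stride + 1;
    std::vector<float> out(static_cast<size_t>(oh) * ow, 0.0f);
    for (int y = 0; y < oh; ++y) {
        for (int x = 0; x < ow; ++x) {
            float acc = bias;  // per-output-pixel bias (cf. d177a67cd6)
            for (int ky = 0; ky < kh; ++ky)
                for (int kx = 0; kx < kw; ++kx)
                    // stride enters the input index (cf. 7e75943a6b)
                    acc += in[(y * stride + ky) * w + (x * stride + kx)]
                         * k[ky * kw + kx];
            out[y * ow + x] = acc;
        }
    }
    return out;
}
```

A reference like this is what 6bbc036f62 ("Generate conv2d test results with pytorch") would be validated against: the GPU kernel's output is compared element-wise to a trusted CPU or PyTorch result.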