CO542 Neural Networks Reading Group (E15)

Department of Computer Engineering, Faculty of Engineering, University of Peradeniya

This reading group was organized in parallel with the CO542 Neural Networks and Fuzzy Logic undergraduate course for the E15 batch in early 2021.

The next iteration, for E16 undergraduates in late 2021, is available at https://cepdnaclk.github.io/co542-neural-networks-reading-group-e16/.

Contents

  1. Feedback
  2. Schedule
  3. Papers
  4. Questions for groups
  5. Group members
  6. Instructions
  7. Booklet submission template
  8. Marks

Feedback

Feedback from students and analysis

Go up


Schedule

The discussion group meets every Friday from 4:00 PM (IST) onwards.

Click here to add the schedule to your Google Calendar.

Week   | Date        | Papers    | Zoom link
Week 1 | 2021 Jan 29 | 1         | Recording
Week 2 | 2021 Feb 05 | 2, 5, 12  | Recording
Week 3 | 2021 Feb 12 | 3, 13     | Recording
Week 4 | 2021 Feb 19 | 16, 9, 11 |
Week 5 |             | 8, 15     | Asynchronous
Week 6 |             | 17, 18    | Asynchronous
Week 7 |             | 6, 4      | Asynchronous

Go up


Papers

Users with an eng.pdn.ac.lk email can download the papers from here.

Paper No. | Category | Paper | Discussion | Group | Slides
1 | Math: activations | He, K., Zhang, X., Ren, S. and Sun, J., 2015. Delving deep into rectifiers: Surpassing human-level performance on ImageNet classification. In Proceedings of the IEEE International Conference on Computer Vision (pp. 1026-1034). | Discussion on: week 1 | 10 |
2 | Math: optimizers | Kingma, D.P. and Ba, J., 2015. Adam: A method for stochastic optimization. In 3rd International Conference on Learning Representations (ICLR), San Diego, CA, USA, May 7-9, 2015, Conference Track Proceedings. | Discussion on: week 2 | 14 |
3 | Math: backprop | Rumelhart, D.E., Hinton, G.E. and Williams, R.J., 1986. Learning representations by back-propagating errors. Nature, 323(6088), pp.533-536. | Discussion on: week 3 | 5 |
4 | NN: DNN | Srivastava, R.K., Greff, K. and Schmidhuber, J., 2015. Training very deep networks. Advances in Neural Information Processing Systems, 28, pp.2377-2385. | Discussion on: week 7 | 15 |
5 | NN: CNN | LeCun, Y., Bottou, L., Bengio, Y. and Haffner, P., 1998. Gradient-based learning applied to document recognition. Proceedings of the IEEE, 86(11), pp.2278-2324. | Discussion on: week 2 | 7 |
6 | NN: LSTM | Hochreiter, S. and Schmidhuber, J., 1997. Long short-term memory. Neural Computation, 9(8), pp.1735-1780. | Discussion on: week 7 | 1 |
8 | NN: GAN | Goodfellow, I., Pouget-Abadie, J., Mirza, M., Xu, B., Warde-Farley, D., Ozair, S., Courville, A. and Bengio, Y., 2014. Generative adversarial nets. Advances in Neural Information Processing Systems, 27, pp.2672-2680. | Discussion on: week 5 | 3 |
9 | NN: edge computing | Kim, Y.D., Park, E., Yoo, S., Choi, T., Yang, L. and Shin, D., 2015. Compression of deep convolutional neural networks for fast and low power mobile applications. arXiv preprint arXiv:1511.06530. | Discussion on: week 4 | 9 |
11 | App: image detection | Krizhevsky, A., Sutskever, I. and Hinton, G.E., 2017. ImageNet classification with deep convolutional neural networks. Communications of the ACM, 60(6), pp.84-90. | Discussion on: week 5 | 13 |
13 | App: NLP | Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A.N., Kaiser, Ł. and Polosukhin, I., 2017. Attention is all you need. In Advances in Neural Information Processing Systems (pp. 5998-6008). | Discussion on: week 3 | 4 |
15 | App: VQA | Antol, S., Agrawal, A., Lu, J., Mitchell, M., Batra, D., Lawrence Zitnick, C. and Parikh, D., 2015. VQA: Visual question answering. In Proceedings of the IEEE International Conference on Computer Vision (pp. 2425-2433). | Discussion on: week 6 | 2 |
16 | NN: uncertainty | Gal, Y. and Ghahramani, Z., 2016. Dropout as a Bayesian approximation: Representing model uncertainty in deep learning. In International Conference on Machine Learning (pp. 1050-1059). | Discussion on: week 4 | 6 |
17 | NN: datasets | This group will present a summary of the datasets used in papers 1-16. | Discussion on: week 6 | 11 |
18 | NN: loss functions | This group will present a summary of the loss functions used in papers 1-16. | Discussion on: week 6 | 12 |

Go up


Questions for groups

Go up


Group members

Google sheet. Paper assignment was done at random. If you REALLY want to know how it was done, read this.

Group Number | Member 1 | Member 2 | Member 3 | Member 4 | Paper No.
1  | E/15/073: Ayesh     | E/15/315: Thisum      | E/15/281: Dinelka     | E/15/181: Tharindu | 6
2  | E/15/187: Devin     | E/15/092: Imesh       | E/15/043: Yasiru      | E/15/246: Rajitha  | 15
3  | E/15/016: Anojan    | E/15/351: Thakshajini | E/15/171: Kapilrajh   | E/15/348: Suhail   | 8
4  | E/15/373: Vaheesan  | E/15/366: Thinesh     | E/15/330: Sathursan   | E/15/021: Aslam    | 13
5  | E/15/209: Kithma    | E/15/233: Nipuni      | E/15/325: Chalani     | E/15/220: Hashan   | 3
6  | E/15/142: Ganindu   | E/15/154: Chamin      | E/15/179: Anandi      | E/15/350: Pasan    | 16
7  | E/15/279: Wathsari  | E/15/077: Kshithija   | E/15/211: Ishani      | E/15/076: Sandushi | 5
8  | E/15/287: Rajinth   | E/15/121: Chamika     | E/15/109: Harindu     | E/15/241: Dinuka   | 12
9  | E/15/065: Prasad    | E/15/119: Dinithi     | E/15/202: Dulanjali   | E/15/208: Roshani  | 9
10 | E/15/238: Sewwandie | E/15/362: Hasini      | E/15/345: Vidwa       | E/15/081: Imalsha  | 1
11 | E/15/280: Pubudu    | E/15/316: Suneth      | E/15/123: Wishma      | E/15/173: Dilshani | 17
12 | E/15/271: Sonali    | E/15/243: Sewwandi    | E/15/048: Laksara     | E/15/363: Rashmi   | 18
13 | E/15/129: Pasindu   | E/15/106: Dhanushka   | E/15/310: Sachinthaka | E/15/260: Thilina  | 11
14 | E/15/299: Malitha   | E/15/139: Sajini      | E/15/249: Dasuni      | E/15/292: Madushi  | 2
15 | E/15/369: Dilan     | E/13/087: Lochana     | E/15/258: Isuru       | E/15/178: Chrishni | 4

Go up


Instructions

Go up


Booklet submission template

You have to use this LaTeX template.
If you are new to Overleaf/LaTeX, there are plenty of online tutorials on the subject. Learning to use LaTeX properly will be very important in the long run for a few of you. Since we don't know who that few is, we expect all of you to learn it.
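
If you have never written LaTeX before, the sketch below shows roughly what a minimal booklet-style document could look like. It is only an illustration of the general structure; the packages and section names here are placeholders we made up, so for your actual submission follow the template linked above.

```latex
% Illustrative sketch only -- NOT the official template linked above.
% The packages and section names below are placeholders; your submission
% must follow the actual template.
\documentclass[11pt]{article}

\usepackage{graphicx}  % figures
\usepackage{amsmath}   % equations
\usepackage{hyperref}  % clickable URLs and references

\title{CO542 Reading Group Booklet -- Group N: Paper Title}
\author{E/15/xxx Name One \and E/15/xxx Name Two \and E/15/xxx Name Three}
\date{\today}

\begin{document}
\maketitle

\section{Summary}
A short summary of the problem the paper addresses, the proposed method,
and the key results.

\section{Discussion}
Your answers to the questions assigned to your group.

\section{References}
% \bibliographystyle{ieeetr}
% \bibliography{refs}

\end{document}
```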

If you are an ME student struggling with Overleaf/LaTeX, ask a CO student (who has already submitted their 7th-semester FYP report in LaTeX).

If you are a CO student who is comfortable with LaTeX, try to dig into who invented TeX (Donald Knuth), download his masterpiece TAOCP and skim through a few pages. This might motivate a few of you to look at computing in a whole new light. Since we don't know who that few is, give it a try!

Go up


Marks

Go up


Last updated: 2021 May 20