Outline

In this presentation, we:

- present the DNN and CNN structures;
- briefly review some famous CNN works (structure, improvements, performance);
- discuss the difficulty of training DNNs and show a good solution (ResNet).

Revised Version

[2021-06-25] A revised version can be downloaded from Download_Link.

Slides

[embeddoc url="https://liwen.site/wp-content/uploads/2020/07/Deep-Learning-2-CNN_V12.pptx" download="all" viewer="microsoft"]
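Since the outline covers the CNN structure, a minimal sketch of the core operation may help. The following plain-Python "valid" 2-D convolution (strictly speaking a cross-correlation, as implemented in most CNN libraries; the shapes and values are illustrative, not taken from the slides) shows what a single convolutional filter computes:

```python
def conv2d_valid(image, kernel):
    """'Valid' 2-D convolution: slide the kernel over the image
    with no padding and stride 1, summing elementwise products."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for r in range(ih - kh + 1):
        row = []
        for c in range(iw - kw + 1):
            # dot product of the kernel with the image patch at (r, c)
            s = sum(image[r + i][c + j] * kernel[i][j]
                    for i in range(kh) for j in range(kw))
            row.append(s)
        out.append(row)
    return out
```

A 3x3 image convolved with a 2x2 kernel yields a 2x2 feature map; stacking many such filters, interleaved with nonlinearities and pooling, is what builds up a CNN.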

Content

In the document, we present:

- Each neural network (NN) consists of many neurons, and each neuron has two elements: a linear function and an activation function.
- We show the reason for using activation functions and discuss the choices of activation functions.
- Two numerical examples are given to track the forward and backward propagation of a neuron.
- We also implement an NN for a binary classification task and show the experimental results.

Presentation Document

Version 2021-06-17 can be downloaded from this Link.

[embeddoc url="https://liwen.site/wp-content/uploads/2020/07/Deep-Learning-1-NN_v22.pptx" download="all" viewer="microsoft"]
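The two elements of a neuron described above, a linear function followed by an activation function, and one step of back-propagation through them can be sketched as follows (the weights and inputs below are illustrative, not the slides' numerical examples):

```python
import math

def neuron_forward(x, w, b):
    """Forward pass of one neuron: linear function z = w.x + b,
    then a sigmoid activation a = 1 / (1 + e^(-z))."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    a = 1.0 / (1.0 + math.exp(-z))
    return z, a

def neuron_backward(x, a, grad_a):
    """Backward pass: grad_a is dL/da coming from the next layer."""
    grad_z = grad_a * a * (1.0 - a)        # sigmoid derivative is a(1 - a)
    grad_w = [grad_z * xi for xi in x]     # dL/dw_i = dL/dz * x_i
    grad_b = grad_z                        # dL/db = dL/dz
    return grad_w, grad_b

# Illustrative numbers:
x = [1.0, 2.0]
z, a = neuron_forward(x, [0.5, -0.25], 0.1)
grad_w, grad_b = neuron_backward(x, a, 1.0)
```

The gradients `grad_w` and `grad_b` would then be used by the learning rule (e.g. gradient descent) to update the neuron's parameters.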


Objectives

Deep learning is a machine learning method that has recently attracted much attention. Deep learning architectures are formed by composing several nonlinear transformations, with the goal of yielding more abstract and useful representations/features. (i) Start with a revision of the basic principles of neural networks: neuron structure, examples of back-propagation, and the learning procedure and iterations (preferably with experimental results); then (ii) discuss at least one type of deep learning architecture, with a way or ways to illustrate its working principle. (iii) You can also give a summary of different deep learning architectures and highlight their uses and significance, with a good explanation if possible. (iv) You can also illustrate the whole procedure for its use in object recognition/classification.

Outlines

- Introduction to Deep Neural Networks and Convolutional Neural Networks
- Milestones (some famous networks)
- Deep Convolutional Neural Networks
- Conclusion

Presentation Slides

[2021-06-25] A revised version can be downloaded from Download_Link.

This is an embedded Microsoft Office presentation, powered by Office.

References

[1] He, Kaiming, Xiangyu Zhang, Shaoqing Ren, and Jian Sun. 2016. "Deep Residual Learning for Image Recognition."…
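Reference [1] above introduces residual learning, the key idea behind ResNet. As a rough plain-Python illustration (the layer sizes and weights are hypothetical, not code from the slides), a residual block computes y = F(x) + x; when the weights of F are zero, the block reduces exactly to the identity mapping, which is what makes very deep networks easier to train:

```python
def relu(v):
    return [max(0.0, x) for x in v]

def linear(v, w, b):
    """w is an out_dim x in_dim weight matrix, b a bias vector."""
    return [sum(wr[i] * v[i] for i in range(len(v))) + bj
            for wr, bj in zip(w, b)]

def residual_block(x, w1, b1, w2, b2):
    """y = F(x) + x with F = linear -> ReLU -> linear.
    The identity shortcut means the block only has to learn the
    residual F(x) = H(x) - x rather than the full mapping H(x)."""
    fx = linear(relu(linear(x, w1, b1)), w2, b2)
    return [f + xi for f, xi in zip(fx, x)]

# With all-zero weights, F(x) = 0 and the block is exactly the identity:
x = [1.0, -2.0]
zw, zb = [[0.0, 0.0], [0.0, 0.0]], [0.0, 0.0]
y = residual_block(x, zw, zb, zw, zb)
```

This is why stacking more residual blocks cannot, in principle, make the representation worse: extra blocks can always fall back to the identity.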

Objectives

Deep learning is a machine learning method that has recently attracted much attention. Deep learning architectures are formed by composing several nonlinear transformations, with the goal of yielding more abstract and useful representations/features. (i) Start with a revision of the basic principles of neural networks: neuron structure, examples of back-propagation, and the learning procedure and iterations (preferably with experimental results); then (ii) discuss at least one type of deep learning architecture, with a way or ways to illustrate its working principle. (iii) You can also give a summary of different deep learning architectures and highlight their uses and significance, with a good explanation if possible. (iv) You can also illustrate the whole procedure for its use in object recognition/classification.

Outlines

- Introduction
- Effect of a Neuron
- Examples of Binary Classification
- Conclusion

Presentation Slides

Q&A

Recommended References

[1] S. Ruder, "An overview of gradient descent optimization algorithms," arXiv preprint arXiv:1609.04747, 2016.
[2] L. Bottou, "Large-scale machine learning with stochastic gradient descent," in Proceedings of COMPSTAT'2010, Springer, 2010, pp. 177–186.
[3] N. Andrew, K. Katanforoosh, and Y. B. Mourri, "Deep Learning,"…
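Reference [1] above surveys gradient descent optimization algorithms, the learning procedure mentioned in the objectives. A minimal sketch of the basic update rule, theta ← theta − lr·∇f(theta), on a toy objective (the function and parameters below are illustrative assumptions, not from the slides):

```python
def gradient_descent(grad, theta, lr=0.1, steps=100):
    """Vanilla gradient descent: repeatedly step against the gradient."""
    for _ in range(steps):
        theta = theta - lr * grad(theta)
    return theta

# Toy objective f(theta) = (theta - 3)^2, whose gradient is 2*(theta - 3);
# the minimiser is theta = 3.
theta = gradient_descent(lambda t: 2.0 * (t - 3.0), theta=0.0)
```

Stochastic and mini-batch variants replace the exact gradient with a noisy estimate computed on a subset of the training data, which is what makes large-scale training practical.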

Objectives

Decision trees and random forests have been used in our group for some years, so it is good to have a review of their basic theory, limitations and recent developments. The study includes, but is not limited to, the definition of decision trees, binary trees, multi-decision trees, ensemble methods, bagging, boosting, random forests, and applications to object recognition and super-resolution. Some attention must be given to "randomness" theory and random trees. Please point out the significance of similarity and confidence measures. Give further examples (outside the paper(s)) of similarity and confidence measures, and ways to achieve a high-confidence decision with a number of weak classifiers.

Outlines

- Introduction
- Randomness Theory (Random Forests)
- Confidence Measurement
- Applications
- Conclusion
- Appendix: Ensemble Learning

Presentation Slides

Q&A

References

[1] W.C. Siu, X.F. Yang, L.W. Wang, J.J. Huang and Z.S. Liu, "Introduction to Random Tree and Random Forests for Fast Signal Processing and Object Recognition", Chapter 1 of "Learning Approaches in Signal Processing", Pan Stanford Series on Digital Signal Processing, Vol. 2 (Edited by W.C. Siu, L.P. Chau, L. Wang and T. Tan), November 2018.…
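The objectives above ask how a number of weak classifiers can reach a high-confidence decision. The following sketch simulates this bagging-style aggregation (all numbers — 51 learners, per-learner accuracy 0.6 — are illustrative assumptions, not from the slides): each weak learner is only slightly better than chance, yet the majority vote of many independent learners is correct far more often:

```python
import random

def majority_vote(votes):
    """Aggregate binary votes; ties (even ensembles) break toward class 1."""
    return 1 if 2 * sum(votes) >= len(votes) else 0

random.seed(0)
p, n_weak, trials = 0.6, 51, 2000   # illustrative parameters
correct = 0
for _ in range(trials):
    # each weak learner independently predicts the true label (1)
    # with probability p, otherwise it votes for the wrong class
    votes = [1 if random.random() < p else 0 for _ in range(n_weak)]
    correct += majority_vote(votes) == 1
ensemble_acc = correct / trials
```

This is the Condorcet jury intuition behind random forests: provided the individual trees are better than chance and not too correlated, the vote count itself also serves as a natural confidence measure for the decision.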