Common approaches in Dimension Reduction
Explaining the curse-of-dimensionality problem, Principal Component Analysis (PCA), Fisher Discriminant Analysis (FDA), and Linear Discriminant Analysis (LDA).
Paper Review - AnimeGAN
Studying image-to-image translation. Overview of the 2019 ISICA paper "AnimeGAN: A Novel Lightweight GAN for Photo Animation".
Paper Review - White-box Cartoonization
Studying image-to-image translation. Overview of the 2020 CVPR paper "Learning to Cartoonize Using White-box Cartoon Representations".
Paper Review - CartoonGAN
Studying image-to-image translation. Overview of the 2018 CVPR paper "CartoonGAN: Generative Adversarial Networks for Photo Cartoonization".
Why can the Transformer achieve the same tasks as Bi-LSTM and Seq2Seq models?
A Seq2Seq model takes a sequence as input and produces a sequence as output. The input and output sequences are not always the same length; in other words, the length of the output sequence is determined by the model. Problems such as machine translation and speech recognition therefore require a Seq2Seq model with an encoder-decoder architecture. The Transformer can achieve such tasks because it also has an encoder-decoder architecture: the encoder processes the input sequence into hidden states, which provide information for the decoder to predict the output sequence. The Transformer uses an auto-regressive decoder, which feeds previously predicted outputs back in as input to generate the next prediction. As illustrated in the figure below, the Transformer can determine the output length itself and thus solve Seq2Seq problems like speech recognition.
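The auto-regressive decoding loop above can be sketched in a few lines. This is a toy illustration, not a real Transformer: `encode` and `decode_step` are hypothetical stand-ins, and the point is only that the decoder emits tokens one at a time and decides the output length by producing an end-of-sequence token.

```python
# Toy sketch of encoder-decoder auto-regressive decoding.
# The "encoder" and "decoder" here are hypothetical stand-ins for the
# real learned components of a Transformer.

def encode(input_tokens):
    # Encoder: map the input sequence to hidden states
    # (here, trivially, a copy of the input).
    return list(input_tokens)

def decode_step(hidden, generated):
    # Decoder step: predict the next token from the encoder's hidden
    # states and the tokens generated so far. This toy rule echoes the
    # input and then emits an end-of-sequence token.
    if len(generated) < len(hidden):
        return hidden[len(generated)]
    return "<eos>"

def autoregressive_decode(input_tokens, max_len=20):
    hidden = encode(input_tokens)
    generated = []
    while len(generated) < max_len:
        token = decode_step(hidden, generated)
        if token == "<eos>":   # the model itself decides when to stop,
            break              # so the output length is not fixed in advance
        generated.append(token)
    return generated

print(autoregressive_decode(["hello", "world"]))  # → ['hello', 'world']
```

The key property is in the `while` loop: each step consumes the previously generated tokens, and generation ends only when the model emits `<eos>`, which is how a Seq2Seq model controls its own output length.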
Some Practice Questions on common Deep Learning architectures
Some practice questions on common deep learning architectures. The topics cover autoencoders, CNNs, RNNs, GANs, and Transformers.
ResNet and DenseNet
ResNet introduces shortcut connections that merge features with element-wise addition (Add), while DenseNet further exploits shortcut connections by concatenating feature maps (Concat). Example code attached.
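The Add-versus-Concat distinction can be sketched with plain Python lists standing in for feature maps; the `layer` function below is a hypothetical toy transformation, not a real convolutional block.

```python
# Minimal sketch contrasting ResNet-style and DenseNet-style skip connections.
# Plain lists stand in for feature maps; "layer" is a hypothetical toy layer.

def layer(x):
    # Toy feature transformation F(x).
    return [v + 1.0 for v in x]

def resnet_block(x):
    # ResNet: output = F(x) + x  (element-wise Add; width stays the same).
    fx = layer(x)
    return [a + b for a, b in zip(fx, x)]

def densenet_block(x):
    # DenseNet: output = [F(x), x]  (Concat; width grows with every block).
    return layer(x) + x

x = [1.0, 2.0]
print(resnet_block(x))    # → [3.0, 5.0]   (same width as input)
print(densenet_block(x))  # → [2.0, 3.0, 1.0, 2.0]   (width doubles)
```

Because Add keeps the channel count fixed, ResNet blocks can be stacked cheaply, whereas DenseNet's Concat preserves every earlier feature map at the cost of growing width, which is why DenseNet uses transition layers to compress features.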
Some Practice Questions on DNN
Some practice questions on DNNs. The topics cover gradient descent, optimization, loss functions, the vanishing gradient problem, and regularization techniques.
Revisit Basic Math of Machine Learning
Notes revisiting GMM, SVM, and the basic math of machine learning.
I made my childhood game in Unity!
Paper Warfare is a top-down combat game for PC developed with Unity. Players customize their own self-made paper aircraft and control them on the battlefield to duel against their opponents' creations. Paper Warfare originated from a simple "Flick and Hit" game inspired by the collective primary-school memories of Hong Kong teenagers, who played with paper-folded aircraft back in their childhood. In that real-life game, each player first folds their own aircraft from a piece of paper, places it on the desk, and then flicks it, trying their best to knock out their opponent. The first aircraft knocked off the desk loses the game. Our game is based on this mini-game concept, as we wish to evoke those nostalgic memories.