The Annotated Faster-RCNN with PyTorch
April 20 2020 by Wacoder Tags Machine Learning
Faster R-CNN is arguably one of the most popular two-stage object detection models. The Region Proposal Network (RPN) introduced in Faster R-CNN leads to an elegant and effective design in which region proposals share the same feature extraction backbone with the region-based detector. In this blog, I present an "annotated" version of the paper in the form of a line-by-line implementation.
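To give a flavor of the shared-feature design, here is a minimal PyTorch sketch of an RPN head; the layer sizes (a 512-channel backbone feature map, 9 anchors per location) are illustrative assumptions, not the exact configuration from the post.

```python
import torch
import torch.nn as nn

class RPNHead(nn.Module):
    """Minimal RPN head: slides a small conv net over the shared
    backbone feature map and predicts objectness scores and box
    deltas for every anchor at every spatial location."""

    def __init__(self, in_channels=512, num_anchors=9):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, in_channels, 3, padding=1)
        # 2 scores (object / not object) per anchor
        self.cls_logits = nn.Conv2d(in_channels, num_anchors * 2, 1)
        # 4 box-regression deltas per anchor
        self.bbox_pred = nn.Conv2d(in_channels, num_anchors * 4, 1)

    def forward(self, features):
        x = torch.relu(self.conv(features))
        return self.cls_logits(x), self.bbox_pred(x)

# The detection head reuses the same `features`, so proposals come almost for free.
features = torch.randn(1, 512, 38, 50)   # e.g. a VGG-16 conv5 feature map
objectness, deltas = RPNHead()(features)
```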
The Annotated SSD with PyTorch
March 13 2020 by Wacoder Tags Machine Learning
Since its introduction in 2016, SSD: Single Shot MultiBox Detector has been widely used in object detection applications for its accuracy and speed. SSD needs only a single forward pass to detect multiple objects within an image. In this blog, I present an "annotated" version of the paper in the form of a line-by-line implementation.
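The single-shot idea is easy to see in code: classification and box heads are applied to feature maps at several scales in one forward pass. The channel counts, anchor counts, and class count below are illustrative assumptions.

```python
import torch
import torch.nn as nn

class SSDHeads(nn.Module):
    """Applies per-scale conv heads to a list of feature maps,
    producing class scores and box offsets in a single pass."""

    def __init__(self, channels=(512, 1024), num_anchors=4, num_classes=21):
        super().__init__()
        self.cls_heads = nn.ModuleList(
            nn.Conv2d(c, num_anchors * num_classes, 3, padding=1) for c in channels)
        self.box_heads = nn.ModuleList(
            nn.Conv2d(c, num_anchors * 4, 3, padding=1) for c in channels)

    def forward(self, feature_maps):
        return [(cls(f), box(f))
                for f, cls, box in zip(feature_maps, self.cls_heads, self.box_heads)]

# Two feature maps at different resolutions, as in an SSD-style backbone.
maps = [torch.randn(1, 512, 38, 38), torch.randn(1, 1024, 19, 19)]
outputs = SSDHeads()(maps)
```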
Find Charity Donors
May 18 2017 by Wacoder Tags Machine Learning
This project aims to accurately predict whether an individual makes more than $50,000 a year. This sort of task can arise in a non-profit setting, where organizations survive on donations. Understanding an individual's income can help a non-profit judge how large a donation to request, or whether to reach out at all. The dataset for this project originates from the UCI Machine Learning Repository.
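As a rough sketch of the task (not the project's actual pipeline), a classifier on the census data might look like the following; the file name, column names, and model choice are assumptions for illustration.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Hypothetical local copy of the UCI census data.
data = pd.read_csv("census.csv")
X = pd.get_dummies(data.drop("income", axis=1))  # one-hot encode categoricals
y = (data["income"] == ">50K").astype(int)       # 1 if income exceeds $50,000

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = GradientBoostingClassifier().fit(X_train, y_train)
print(f"test accuracy: {clf.score(X_test, y_test):.3f}")
```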
Boston House Price Prediction
May 3 2017 by Wacoder Tags Machine Learning
In this project, we will evaluate the predictive power of a model that has been trained and tested on data collected from homes in suburbs of Boston. A model that fits this data well could then be used to make predictions about a home's monetary value.
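A minimal sketch of such an evaluation, assuming a local CSV of the Boston data with a MEDV target column (the post's actual model and features may differ):

```python
import pandas as pd
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Hypothetical local copy of the Boston housing data.
data = pd.read_csv("housing.csv")
X = data.drop("MEDV", axis=1)   # features describing each home
y = data["MEDV"]                # median home value, the target

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = DecisionTreeRegressor(max_depth=4).fit(X_train, y_train)
# R^2 measures how much of the price variation the model explains.
print(f"R^2 on held-out homes: {r2_score(y_test, model.predict(X_test)):.3f}")
```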
Note on Gibbs sampling from AM207
March 3 2017 by Wacoder Tags Basics
It is not always easy to tune the proposal distribution. Gibbs sampling is a procedure for sampling from multivariate distributions in which every sample is accepted, so we do not have to specify a proposal distribution at all, taking some of the guesswork out of the MCMC procedure. Gibbs sampling can only be applied when we know the full conditional distribution of each component of the multivariate distribution, conditioned on all the other components.
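A minimal sketch of a Gibbs sampler for a bivariate Gaussian with correlation rho, where both full conditionals are Gaussian in closed form (the target distribution is my choice for illustration):

```python
import numpy as np

def gibbs_bivariate_gaussian(rho=0.8, n_samples=5000, seed=0):
    """Sample from a standard bivariate Gaussian with correlation rho.
    Each full conditional x | y and y | x is itself Gaussian, so every
    draw is accepted -- no proposal distribution is needed."""
    rng = np.random.default_rng(seed)
    x, y = 0.0, 0.0
    samples = np.empty((n_samples, 2))
    cond_std = np.sqrt(1 - rho ** 2)
    for i in range(n_samples):
        x = rng.normal(rho * y, cond_std)  # draw x given the current y
        y = rng.normal(rho * x, cond_std)  # draw y given the new x
        samples[i] = x, y
    return samples

samples = gibbs_bivariate_gaussian()
print(np.corrcoef(samples.T)[0, 1])  # should be close to 0.8
```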
Note on Markov Chain Monte Carlo method from AM207
Feb. 26 2017 by Wacoder Tags Basics
Markov Chain Monte Carlo (MCMC) techniques are applied to solve integration and optimization problems in high-dimensional spaces. Problems that are intractable with analytic approaches often become solvable with some form of MCMC.
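The classic workhorse is the Metropolis-Hastings algorithm; here is a minimal random-walk version targeting an unnormalized log density (the Gaussian target and step size are illustrative assumptions):

```python
import numpy as np

def metropolis_hastings(log_p, x0=0.0, step=1.0, n_samples=10000, seed=0):
    """Random-walk Metropolis-Hastings: propose x' ~ N(x, step^2) and
    accept with probability min(1, p(x') / p(x))."""
    rng = np.random.default_rng(seed)
    x = x0
    samples = np.empty(n_samples)
    for i in range(n_samples):
        proposal = x + step * rng.normal()
        if np.log(rng.uniform()) < log_p(proposal) - log_p(x):
            x = proposal  # accept; otherwise keep the current state
        samples[i] = x
    return samples

# Target: a standard normal, known only up to a normalizing constant.
samples = metropolis_hastings(lambda x: -0.5 * x ** 2)
print(samples.mean(), samples.std())  # roughly 0 and 1
```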
Some funny facts on US election
Oct. 3 2016 by Wacoder Tags Data Mining
The analysis of the United States presidential election covers the third debate between Trump and Clinton. It also presents the relative popularity of each candidate's supporters and detractors on Twitter, based on sentiment analysis of tweets.
Kaggle: Titanic - Machine Learning from Disaster
Aug. 2 2016 by Wacoder Tags Machine Learning
The sinking of the RMS Titanic is one of the most infamous shipwrecks in history. On April 15, 1912, during her maiden voyage, the Titanic sank after colliding with an iceberg, killing 1502 out of 2224 passengers and crew. This challenge asks us to complete the analysis of what sorts of people were likely to survive and, in particular, to apply the tools of machine learning to predict which passengers survived the tragedy.
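A bare-bones starting point, assuming Kaggle's train.csv and a couple of obvious features (not the approach worked out in the post):

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

train = pd.read_csv("train.csv")  # Kaggle's Titanic training file
# Two simple illustrative features: sex and passenger class.
X = pd.DataFrame({
    "is_female": (train["Sex"] == "female").astype(int),
    "pclass": train["Pclass"],
})
y = train["Survived"]
clf = RandomForestClassifier(n_estimators=100, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())  # cross-validated accuracy
```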
A Complete Handbook on Decision Tree
Aug. 1 2016 by Wacoder Tags Machine Learning
Decision Trees are a non-parametric supervised learning method used for classification and regression. The goal is to create a model that predicts the value of a target variable by learning simple decision rules inferred from the data.
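For instance, with scikit-learn a tree can be fit and its learned rules printed in a few lines (the iris data here is just a convenient stand-in):

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = load_iris(return_X_y=True)
tree = DecisionTreeClassifier(max_depth=2).fit(X, y)
# The learned "simple decision rules", printed as nested if/else thresholds.
print(export_text(tree, feature_names=["sepal len", "sepal wid",
                                       "petal len", "petal wid"]))
```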
Statistics Basic Cheatsheet
July 24 2016 by Wacoder Tags Basics
When you want to make sense of the data around you every day, knowing how and when to use statistical techniques and formulas helps.
What is the Cramer-Rao Lower Bound?
July 11 2016 by Wacoder Tags Bayesian Estimation
In estimation theory and statistics, the Cramér-Rao lower bound (CRLB), named in honor of Harald Cramér and Calyampudi Radhakrishna Rao, who were among the first to derive it, expresses a lower bound on the variance of estimators of a deterministic parameter. The CRLB tells us the best we can ever expect to do with an unbiased estimator.
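In its scalar form for an unbiased estimator, the bound is the standard inequality below, with I(θ) the Fisher information:

```latex
\operatorname{Var}(\hat{\theta}) \;\ge\; \frac{1}{I(\theta)},
\qquad
I(\theta) = \mathbb{E}\!\left[\left(\frac{\partial}{\partial\theta}
\log f(X;\theta)\right)^{2}\right]
```

An unbiased estimator whose variance attains this bound is called efficient.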
How does the backpropagation algorithm work in a neural network?
July 06 2016 by Wacoder Tags Machine Learning
The backpropagation algorithm was originally introduced in the 1970s, but its importance wasn't fully appreciated until a famous 1986 paper by David Rumelhart et al. Today, backpropagation is the workhorse of neural network training. This post explains the backpropagation algorithm mathematically.
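At its heart, backpropagation is the chain rule applied layer by layer. In the common notation (weighted inputs z^l, activations a^l = σ(z^l), cost C, output layer L), the standard recursions are:

```latex
\delta^{L} = \nabla_{a} C \odot \sigma'(z^{L}),
\qquad
\delta^{l} = \left((W^{l+1})^{\top} \delta^{l+1}\right) \odot \sigma'(z^{l}),
\qquad
\frac{\partial C}{\partial W^{l}} = \delta^{l}\,(a^{l-1})^{\top},
\qquad
\frac{\partial C}{\partial b^{l}} = \delta^{l}
```

The error δ propagates backward from the output layer, and each layer's gradient is a local product of that error with its input activations.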
Simultaneous Localization and Mapping
Apr. 16 2016 by Wacoder Tags Bayesian Estimation
The simultaneous localization and mapping (SLAM) problem asks if it is possible for a mobile robot to be placed at an unknown location in an unknown environment and for the robot to incrementally build a consistent map of this environment while simultaneously determining its location within this map.
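In Bayesian terms, full SLAM is the problem of estimating the joint posterior over the robot path x_{1:t} and the map m, given measurements z_{1:t} and controls u_{1:t} (the standard formulation):

```latex
p\left(x_{1:t},\, m \mid z_{1:t},\, u_{1:t}\right)
```

Online SLAM instead keeps only the current pose, estimating p(x_t, m | z_{1:t}, u_{1:t}) by marginalizing out the past poses.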
About this blog
Apr. 16 2016 by Wacoder
This blog journals notes on localization algorithms, Bayesian estimation, machine learning, and data mining.