Don't worry too much about “which one is best”; pick one and go all the way through it! I think breadth is better than depth for a lot of things.
http://playground.tensorflow.org/. So simple and interactive. Also, need multiple layers for spiral, but it does it!
The higher layers seem to learn specific examples of people/cats/dogs/etc. Does the network transfer well then?
Setting learning rate. You want to find a good one as you'll be re-training your models as you grab more data.
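One way to find a good starting learning rate is an LR range test: sweep the learning rate upward exponentially during a short training run and note where the loss bottoms out before it starts to blow up. A minimal numpy sketch on a toy least-squares problem (all sizes and ranges here are made up, not from any particular library):

```python
import numpy as np

rng = np.random.default_rng(0)
w_true = np.array([2.0, -3.0])
X = rng.standard_normal((256, 2))
y = X @ w_true                          # noiseless linear targets

w = np.zeros(2)
lrs = np.logspace(-4, 0, 200)           # exponentially increasing learning rates
losses = []
for lr in lrs:
    i = rng.integers(0, 256, size=16)   # random mini-batch
    grad = 2 * X[i].T @ (X[i] @ w - y[i]) / 16
    w -= lr * grad
    losses.append(float(((X @ w - y) ** 2).mean()))

best_lr = lrs[int(np.argmin(losses))]
print("loss bottomed out near lr ~", best_lr)
```

In practice you'd pick a learning rate a bit below where the loss curve is steepest, then reuse the same sweep each time you re-train on new data.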
Stochastic Gradient Descent with restarts is better than ensembles because it lets you stay in “wide” / generalizable minima rather than restarting to find another minimum that might not be wide.
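The schedule behind SGD with restarts is cosine annealing that snaps back to the maximum learning rate at the start of each cycle, with cycle lengths that can grow. A small sketch of just the schedule (the hyperparameter values are placeholders):

```python
import math

def sgdr_lr(step, eta_min=1e-4, eta_max=1e-1, t0=100, t_mult=2):
    """Cosine-annealed learning rate with warm restarts (SGDR-style).

    step: global iteration count; t0: length of the first cycle;
    t_mult: how much each cycle grows after a restart.
    """
    t_i, t_cur = t0, step
    # Walk forward through cycles until step lands inside one
    while t_cur >= t_i:
        t_cur -= t_i
        t_i *= t_mult
    # Cosine decay from eta_max down to eta_min within the cycle
    return eta_min + 0.5 * (eta_max - eta_min) * (1 + math.cos(math.pi * t_cur / t_i))
```

Each restart kicks the optimizer out of its current minimum; if the minimum was wide, it tends to settle back into the same basin, which is the “stay in wide minima” intuition above.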
Also, the initial training, what was it training? It wasn't modifying the precomputed weights, so what was it doing?
Still want to make a continuous output thing. Train with simple thing like sine wave amplitude or frequency and maybe combine them. Then move to something complicated like the sales prediction, data center load prediction, or better the stuff that Levi is doing at BPA.
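As a first cut at the sine-wave-amplitude idea, here's a minimal regression sketch: generate sines with random amplitude and phase, extract a hand-crafted RMS feature (for a pure sine, amplitude = √2 · RMS), and fit a line. All signal parameters are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = np.arange(200)

# Synthetic dataset: sine waves with random amplitude/phase; target = amplitude
amps = rng.uniform(0.5, 5.0, size=500)
phases = rng.uniform(0, 2 * np.pi, size=500)
X = amps[:, None] * np.sin(2 * np.pi * 0.05 * n[None, :] + phases[:, None])
X += 0.05 * rng.standard_normal(X.shape)          # light measurement noise

# One feature per signal: RMS amplitude
rms = np.sqrt((X ** 2).mean(axis=1))
slope, intercept = np.polyfit(rms, amps, 1)       # amplitude ~ slope * rms + b

pred = slope * rms + intercept
print("mean abs error:", np.abs(pred - amps).mean())
```

From here the natural next step is swapping the hand-crafted feature for a small network fed the raw samples, then moving to messier targets like the load-prediction problems.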
Winning kaggle submissions: https://www.kaggle.com/sudalairajkumar/winning-solutions-of-kaggle-competitions
From Turi Create documentation: https://github.com/apple/turicreate
| ML Task | Description |
|---|---|
| Recommender | Personalize choices for users |
| Image Classification | Label images |
| Object Detection | Recognize objects within images |
| Style Transfer | Stylize images |
| Activity Classification | Detect an activity using sensors |
| Image Similarity | Find similar images |
| Classifiers | Predict a label |
| Regression | Predict numeric values |
| Clustering | Group similar datapoints together |
| Text Classifier | Analyze sentiment of messages |
Talk with Andres:
His thesis was combining correlation filters (MACE) and SVMs.
He's in charge of AI strategy for the data center.
It's not going to slow down, even with an economic downturn. More and more 2nd-tier companies are finding uses for it, in addition to the 1st-tier guys.
It's working super well. Which would you prefer for your robot doctor: a robot that gets it right 80% of the time but can tell you why it failed, or a 99%-right robot that you can't really diagnose?
Recommends starting with Kaggle things and moving from there. That's your “rep” in the industry.
Facebook's machine learning data center paper (deep neural networks), published about 6 months ago.
Random forests are used a lot in production, surprisingly.
Coursera specialization course by Andrew Ng
Google has a cool demo on an automated phone calling system.
Designing the DNN model is still an engineering exercise at this point.
Turi, Carlos Guestrin's startup, got bought by Apple. Previously he was a machine-learning-only guy.
Adversarial neural networks, you can come up with the same thing for normal machine learning methods too.
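For example, the fast-gradient trick for crafting adversarial examples works directly on a plain linear classifier, no deep net needed: nudge the input against the sign of the weights. A tiny sketch with made-up weights and input:

```python
import numpy as np

w = np.array([1.0, -2.0, 0.5])   # assumed "trained" linear classifier weights
x = np.array([0.5, -0.2, 1.0])   # input currently classified as positive
score = w @ x                    # positive score -> class +1

# FGSM-style perturbation: for a linear model the worst-case L-inf step
# is exactly eps * sign(w), which maximally reduces the score
eps = 0.5
x_adv = x - eps * np.sign(w)
print(score, "->", w @ x_adv)    # the small perturbation flips the decision
```

The point being: adversarial fragility isn't unique to deep networks; any high-dimensional linear-ish decision boundary has the same failure mode.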
I feel excited. Ready to apply machine learning / DNN to some real problems and see what comes out.
Apparently it's blowing every other method out of the water. Unfortunately, we don't understand why it works yet, so we have to sacrifice the lives of many grad students in order to find local optima.
Fizz buzz with a neural network: https://news.ycombinator.com/item?id=11753627
| Experiment | Conditioning | Feature Extraction | Classification |
|---|---|---|---|
| Classifying Number Patterns | | | |
| RGB Color Recognition (invariant to illumination) | | | |
How does one find features that are robust to Scale/Time/Space/Rotation?
Parameter estimation = Regression = Continuous value as output
Classification = “Thresholded” version of Regression (binary classification as output)
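A concrete instance of the “thresholded regression” view: fit a least-squares line to ±1 labels, then threshold the continuous output at zero. A minimal numpy sketch on synthetic Gaussian classes (all data made up):

```python
import numpy as np

rng = np.random.default_rng(1)
# Two 2-D Gaussian classes labeled -1 / +1
X = np.vstack([rng.normal(-1, 1, (100, 2)), rng.normal(+1, 1, (100, 2))])
y = np.hstack([-np.ones(100), np.ones(100)])

# Regression step: least-squares fit of a linear function to the +/-1 labels
A = np.hstack([X, np.ones((200, 1))])          # append a bias column
w, *_ = np.linalg.lstsq(A, y, rcond=None)

# Classification step: threshold the continuous output at zero
pred = np.sign(A @ w)
print("accuracy:", (pred == y).mean())
```

This is the classic least-squares classifier; logistic regression replaces the hard threshold with a smooth one but follows the same regression-then-threshold pattern.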
The Multiple Signal Classification (MUSIC) algorithm seems really cool. It uses eigenvectors of the autocorrelation matrix to find the frequencies of k emitters. But what if my emitters aren't sinusoidal?
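A minimal numpy sketch of MUSIC on two real sinusoids (the frequencies, window size, and grid here are made up; note each real sinusoid contributes two complex exponentials, hence n_components=4):

```python
import numpy as np

def music_spectrum(x, n_components, M, freqs):
    """MUSIC pseudospectrum from the eigenvectors of the autocorrelation matrix.

    x: real signal; n_components: number of complex exponentials
    (2 per real sinusoid); M: correlation matrix size; freqs: grid in cycles/sample.
    """
    N = len(x)
    # Stack overlapping length-M snapshots and estimate R = E[x x^T]
    X = np.array([x[i:i + M] for i in range(N - M + 1)])
    R = X.T @ X / X.shape[0]
    # The smallest M - n_components eigenvectors span the noise subspace
    eigvals, eigvecs = np.linalg.eigh(R)           # ascending eigenvalues
    En = eigvecs[:, :M - n_components]
    # Peaks occur where the steering vector is ~orthogonal to the noise subspace
    spectrum = np.empty(len(freqs))
    for k, f in enumerate(freqs):
        a = np.exp(-2j * np.pi * f * np.arange(M))
        spectrum[k] = 1.0 / np.linalg.norm(En.conj().T @ a) ** 2
    return spectrum

# Two real sinusoids at 0.12 and 0.27 cycles/sample, light noise
rng = np.random.default_rng(0)
n = np.arange(512)
x = (np.cos(2 * np.pi * 0.12 * n) + np.cos(2 * np.pi * 0.27 * n)
     + 0.01 * rng.standard_normal(512))

freqs = np.linspace(0.01, 0.49, 481)
spec = music_spectrum(x, n_components=4, M=32, freqs=freqs)
```

As for non-sinusoidal emitters: MUSIC's model is exactly “complex exponentials in noise,” so a periodic but non-sinusoidal emitter shows up as a comb of harmonics, each eating into the model order.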
I was quite weak on boosting, clustering (finding n classes among a bunch of fingerprints), and machine learning on decision trees (signal processing and recognition using decision trees?!?! Never heard of it).
The Unreasonable Effectiveness of Data, by some Google guys including Peter Norvig. Combining lots of simple classifiers or n-grams into one big system performs better than elaborate models on less data.
“What vegetables prevent osteoporosis” paper looks immediately helpful for reading project.
The same meaning can be expressed in many different ways, and the same expression can express many different meanings.
```matlab
imaqtool;  % GUI

% Mac-specific settings
vid = videoinput('macvideo', 1, 'YCbCr422_640x480');
src = getselectedsource(vid);
vid.FramesPerTrigger = 5;
preview(vid);
```