Happy New Year, guys!
This is not a deep tech-post 🙂
After a long break (well, lots of Christmas parties at my house, actually), I am back from the holiday. The event felt more like a trip to me than a serious conference. My special thanks go to my generous boss, who sponsored me even though I hadn't published any papers. Besides the talks and workshops, I made a few friends as well. I'm also looking forward to this year's edition in Barcelona.
When I was checking my notes, I found these:
Large-scale Machine Learning
Gibbs Sampling, HMM
Also from the accepted papers:
Deep Learning was the first HOT.
So if you are interested in it, check this post: 7 Steps for becoming DL Expert. (Strongly recommended)
(Please forgive me, as I am still a novice myself. I will only show you some very simple ideas from the conference.)
CV & NLP
When it comes to images, perspectives have evolved over time. It's natural that scientists try to extract knowledge from images. Initially, Google built an image-labeling game, a kind of collective intelligence, to label images. That became a bridge connecting languages and images.
Then scientists took the idea to a higher level: let the computers tell us the story behind an image!
Are You Talking to a Machine? Dataset and Methods for Multilingual Image Question Answering (from Baidu Research)
The following pictures are from the paper mentioned above.
Able to answer questions about the content of an image.
Techniques include LSTM and CNN.
120,000 images and 250,000 question-answer pairs
Evaluated by human judges, who tried to distinguish machine answers from human ones: 64.7% could not tell the difference. (I find this evaluation interesting, but perhaps it is just another way of measuring accuracy?)
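The paper's model fuses a CNN image embedding with an LSTM reading of the question. Here is a toy sketch of that fusion idea in plain Python; the vectors, the hand-set weights, and the single ungated recurrent cell are illustrative stand-ins, not the paper's actual architecture:

```python
import math

def rnn_step(x, h, W, U):
    # one step of a plain (non-gated) recurrent cell: h' = tanh(W.x + U.h)
    return [math.tanh(sum(W[i][j] * x[j] for j in range(len(x)))
                      + sum(U[i][k] * h[k] for k in range(len(h))))
            for i in range(len(h))]

# hypothetical 3-d "CNN" image feature and 2-word question embeddings
image_feat = [0.9, -0.1, 0.4]
question = [[0.2, 0.0, 0.5],   # embedding for "what"
            [0.7, 0.3, -0.2]]  # embedding for "color"

W = [[0.1, 0.2, 0.0], [0.0, 0.1, 0.3], [0.2, 0.0, 0.1]]
U = [[0.1, 0.0, 0.0], [0.0, 0.1, 0.0], [0.0, 0.0, 0.1]]

# encode the question word by word, then fuse with the image feature
h = [0.0, 0.0, 0.0]
for word in question:
    h = rnn_step(word, h, W, U)
fused = [hi + fi for hi, fi in zip(h, image_feat)]  # simplest possible fusion
print(fused)
```

The real model would feed a vector like `fused` into an answer-generating decoder; here it just shows how the two modalities end up in one representation.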
People were talking about ML a few years back; now they are more interested in large-scale ML or DL. I have gotten to know a few PhDs and researchers here in Ireland who specialize in ML, data science, or HPC. When we talk about ML, the topics are usually restricted to algorithms or theory. When we talk about large-scale ML, it is about how you apply ML algorithms on distributed systems, like Spark or Hadoop, which are capable of dealing with large datasets.
I was lucky to meet a few people from Alibaba Group, one of the sponsors of NIPS. I was told that within the group, nearly 5,000 people work on data alone. They also had an advertisement for "Large-scale Machine Learning". Check out the relevant news: Aliyun (Alibaba Group) released China's first AI platform.
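To make the "distributed systems" part concrete: the most common large-scale pattern is data parallelism, where each worker computes a gradient on its own shard and the driver averages them. A minimal sketch, with the workers simulated by plain lists rather than an actual Spark or Hadoop cluster:

```python
# data-parallel training sketch: each "worker" computes a gradient on its
# shard, and the driver averages them -- the pattern that Spark-style
# systems scale out (simulated here with plain lists)

def gradient(w, shard):
    # least-squares gradient for y ~ w*x on one data shard
    return sum(2 * (w * x - y) * x for x, y in shard) / len(shard)

shards = [[(1.0, 2.0), (2.0, 4.0)],   # worker 1's data (y = 2x)
          [(3.0, 6.0), (4.0, 8.0)]]   # worker 2's data (same rule)

w = 0.0
for _ in range(50):                              # synchronous rounds
    grads = [gradient(w, s) for s in shards]     # in parallel, conceptually
    w -= 0.05 * sum(grads) / len(grads)
print(round(w, 3))  # converges toward the true slope 2.0
```

On a real cluster, the list comprehension over `shards` becomes a map over partitions, and the averaging becomes a reduce; the algorithm itself stays the same.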
Large-Scale Distributed Systems for Training Neural Networks, Google Brain
Basically, the Google Brain team released a new version of TensorFlow, an open-source software library for numerical computation using data flow graphs. If you are familiar with DistBelief, you can think of TensorFlow as its upgrade.
It supports CNNs, RNNs and LSTMs, with APIs for Python, C++ and C; people are also working on Java, R, Go, and more.
It can spread the work across devices, making the training of a network a distributed job.
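The "data flow graph" idea is what makes that distribution possible: you describe the computation as a graph first, and the runtime decides where and when each node executes. A toy sketch of the define-then-run idea in plain Python (this is not the TensorFlow API, just the concept):

```python
# toy "define the graph first, run it later" sketch, the execution model
# TensorFlow uses; each node stores an op and its input nodes, and run()
# evaluates the graph on demand

class Node:
    def __init__(self, op, *inputs):
        self.op, self.inputs = op, inputs

    def run(self, feed):
        if self.op in feed:                      # placeholder lookup
            return feed[self.op]
        args = [n.run(feed) for n in self.inputs]
        return self.op(*args)

x = Node("x")                                    # placeholders, fed at run time
w = Node("w")
mul = Node(lambda a, b: a * b, x, w)
out = Node(lambda a, b: a + b, mul, Node("bias"))

# nothing has executed yet; evaluation only happens here
print(out.run({"x": 3.0, "w": 2.0, "bias": 1.0}))  # 3*2 + 1 -> 7.0
```

Because the graph exists before execution, a real system can cut it into subgraphs and place each one on a different CPU, GPU, or machine.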
Applied to Gmail (Smart Reply), RankBrain, image classification models, and more.
The following photo shows the Smart Reply process. A deep RNN is needed to generate intelligent reply messages.
Two examples of Smart Reply, found on the Internet:
I had never thought about this before, until I was looking for PhD positions a few months back. Personally, I think the topic of large-scale ML or DL is a combination of ML and HPC. As NNs become more and more complex, training will be a great challenge. Models will be improved, and so will the frameworks.
QA by Facebook
Question Answering is an application of NLP. From my perspective, it is a higher-level one, if you consider the scale it targets. Question answering is no longer just an analysis of a few words or sentences. The main thing is that time must be taken into consideration. More importantly, the new technique should be able to have a "memory".
In a workshop, a research representative from Facebook gave a talk about it. They created MemN2N, a model in the memory-networks family, which can handle QA, recommendations and dialogs.
Models they have tried:
- Classic NLP cascade e.g. SVM struct.
- N-gram models with SVM-type classifier.
- (LSTM) Recurrent Neural Nets.
Here is an example to show the “memory”:
Mixed with timestamp:
(Capable to understand more complex meanings)
And what’s next:
Find the GitHub codes here.
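The core of that "memory" is soft attention over stored sentences: score each memory against the question, softmax the scores, and read out a weighted sum. A minimal single-hop numeric sketch (the 3-d embeddings are made up for illustration; the trained model learns them):

```python
import math

# single-hop memory attention, the core idea of MemN2N: score each stored
# memory against the question, softmax the scores, and return the
# attention-weighted sum (toy hand-made embeddings, not the trained model)

def softmax(scores):
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

memories = [[1.0, 0.0, 0.0],   # "Sam walks into the kitchen."
            [0.0, 1.0, 0.0],   # "Sam picks up an apple."
            [0.9, 0.1, 0.0]]   # "Sam walks into the bedroom."
question = [1.0, 0.0, 0.0]     # "Where is Sam?"

scores = [sum(q * m for q, m in zip(question, mem)) for mem in memories]
weights = softmax(scores)      # how much attention each memory gets
answer_vec = [sum(w * mem[i] for w, mem in zip(weights, memories))
              for i in range(3)]
print([round(w, 2) for w in weights])
```

A multi-hop model repeats this read step, feeding `answer_vec` back in as the next query, which is how it chains facts together; adding timestamp features to the memories is what lets it prefer the most recent matching sentence.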
As I mentioned regarding the "higher level", a similar task would be to produce a summary of a paragraph.
A UC Berkeley research team gave very impressive spotlights about robots in another workshop. The robots could learn how to do a job after a number of trials.
Two days after the conference, I visited IBM's TJ Watson Research Center in New York, where I found an even more intelligent robotics project. The research combined QA and robots; it felt like talking to a real person. The robot was able to talk with people, make decisions, and then complete a job.
There was a workshop on ML in healthcare. A talk targeting mHealth (mobile health) was impressive.
Sensors on phones or other wearable devices (say, the Apple Watch) are able to collect time-series data like heartbeats. We are curious to learn about our body condition from how many steps we've walked over the whole day, how many calories we've actually burned, and so on. And beyond what we already know, we are seeking advice for the future (any advice for tomorrow?).
When it comes to making decisions, the problem of personalization always appears. How? Take heart rate, for example: the thresholds in any experiment vary with conditions (people of different ages, genders, etc.). When association rules are to be defined, personalization leads to more challenges.
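A tiny example of why one global threshold fails: the same raw heart-rate reading can be normal for one user and elevated for another. The sketch below uses the common "220 minus age" rule of thumb for maximum heart rate purely as an illustration, not as medical guidance:

```python
# personalization sketch: the same raw heart-rate reading means different
# things for different users; 220 - age is the common rule-of-thumb
# maximum, used here purely as an illustration

def is_elevated(bpm, age, fraction=0.85):
    """Flag a reading above `fraction` of the age-estimated maximum."""
    return bpm > fraction * (220 - age)

print(is_elevated(165, 25))   # 0.85 * 195 = 165.75 -> False
print(is_elevated(165, 60))   # 0.85 * 160 = 136.0  -> True
```

The identical 165 bpm reading is flagged for one user and not the other, which is exactly why the association rules have to carry per-user context.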
Have a look at the key figures of deep learning (not the latest ones, sorry) [*]:
The name card:
China town in Montreal:
Posters & Workshop:
Finally, special thanks to my friend, Daxiang Dong, a researcher from Baidu, Beijing, who contributed to the post.