636: Sparse Gaussian Process Regression via L1 Penalization

Feng Yan (Purdue University); Yuan Qi (Purdue University)

The authors introduce a sparse GP method for large datasets by adding an L1 penalty on the influence of individual data points. The penalty removes irrelevant points through a convex optimization, which avoids the local-optima problems of standard sparse GP approximations.
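A minimal sketch of the mechanism (not the paper's actual algorithm): an L1 penalty on per-datapoint weights drives most weights exactly to zero, pruning the "irrelevant" points. Here we solve the lasso-style problem min_w 0.5·||Kw − y||² + λ·||w||₁ by coordinate descent, with K a squared-exponential kernel matrix; all hyperparameters (lengthscale, λ, the toy data) are illustrative choices, not the paper's.

```python
import math
import random

random.seed(0)
n = 30
xs = [random.uniform(-3.0, 3.0) for _ in range(n)]
ys = [math.sin(x) + 0.1 * random.gauss(0.0, 1.0) for x in xs]

# Squared-exponential kernel matrix with a hypothetical unit lengthscale.
K = [[math.exp(-0.5 * (a - b) ** 2) for b in xs] for a in xs]

lam = 1.0          # L1 penalty strength (illustrative)
w = [0.0] * n      # one weight per data point

for _ in range(50):  # coordinate-descent sweeps
    for j in range(n):
        # Partial residual: y - K w, with column j's contribution put back.
        rho = sum(
            K[i][j] * (ys[i] - sum(K[i][k] * w[k] for k in range(n)) + K[i][j] * w[j])
            for i in range(n)
        )
        norm_j = sum(K[i][j] ** 2 for i in range(n))
        # Soft-thresholding: the L1 penalty zeroes out small coordinates.
        w[j] = math.copysign(max(abs(rho) - lam, 0.0), rho) / norm_j

active = sum(1 for v in w if abs(v) > 1e-12)
print(f"kept {active} of {n} data points")
```

Because neighbouring kernel columns are highly correlated, the L1 penalty keeps only a small subset of points active, which is the intuition behind pruning data points in the paper.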

I liked all the papers in the application track:

901: Web-Scale Bayesian Click-Through Rate Prediction for Sponsored Search Advertising in Microsoft's Bing Search Engine

Thore Graepel (Microsoft Research); Joaquin Quiñonero Candela (Microsoft Research); Thomas Borchert (Microsoft Research); Ralf Herbrich (Microsoft Research)

902: Detecting Large-Scale System Problems by Mining Console Logs

Wei Xu (UC Berkeley); Ling Huang (Intel Labs Berkeley); Armando Fox (UC Berkeley); David Patterson (UC Berkeley); Michael I. Jordan (UC Berkeley)

I liked this since it is somewhat related to my project.

903: The Role of Machine Learning in Business Optimization

Chid Apte (IBM T. J. Watson Research Center)

IBM is increasing the efficiency of collecting back taxes in NY state using machine learning (which some people found scary).

374: Local Minima Embedding

Minyoung Kim (CMU); Fernando De la Torre (CMU)

The idea is to embed a high-dimensional objective function into a low-dimensional space that can be visualized, while preserving its local optima. It's a really good idea, but it is not yet powerful enough for the hard problems we would want to use it on (i.e., visualizing local optima in high-dimensional neural network optimization).

495: Hilbert Space Embeddings of Hidden Markov Models

Le Song (CMU); Byron Boots (Carnegie Mellon University); Sajid Siddiqi (Google); Geoffrey Gordon (Carnegie Mellon University); Alex Smola (Yahoo! Research)

551: Distance Dependent Chinese Restaurant Processes

David Blei (Princeton University); Peter Frazier (Cornell)

