Saturday, November 10, 2012

Fast Functions via Randomized Algorithms: Fastfood versus Random Kitchen Sinks

I just noticed the following while reading Learning at Scale by Alex Smola: a randomized scheme that aims to replace the Random Kitchen Sinks approximation for kernel learning at large scales. Random Kitchen Sinks were featured here a while back; here is the site to learn more about Random Kitchen Sinks (or Random Features, as they were called back then).
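
For readers who have not seen Random Kitchen Sinks before, here is a minimal sketch of the underlying idea (random Fourier features in the Rahimi-Recht sense): draw random frequencies from the Fourier transform of a Gaussian kernel and build an explicit cosine feature map whose inner products approximate the kernel. The function name, parameter names, and NumPy usage below are my own illustrative choices, not taken from the Fastfood paper or from Alex Smola's notes.

import numpy as np

def random_kitchen_sinks(X, n_features=512, sigma=1.0, seed=0):
    """Map X (n_samples, d) to randomized Fourier features (n_samples, n_features)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies drawn from the Fourier transform of the Gaussian kernel
    # k(x, y) = exp(-||x - y||^2 / (2 * sigma^2)).
    W = rng.normal(scale=1.0 / sigma, size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Usage: inner products of the features concentrate around the exact kernel values.
X = np.random.default_rng(1).normal(size=(5, 10))
Z = random_kitchen_sinks(X, n_features=4096, sigma=2.0)
approx = Z @ Z.T
exact = np.exp(-((X[:, None, :] - X[None, :, :]) ** 2).sum(-1) / (2 * 2.0 ** 2))
print(np.abs(approx - exact).max())  # small approximation error

The expensive step above is the dense matrix product X @ W; as I understand it, Fastfood's claim is to replace that dense Gaussian matrix with structured transforms so the same kind of feature map can be computed much faster, which is exactly what the upcoming paper should spell out.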



I'll wait for the paper.




2 comments:

Anonymous said...

I just read a couple of papers from Google about "their" Fastfood algorithm. Basically it is a verbatim recitation of information I had on the Google Code website a few years ago. They have however made every effort not to acknowledge the originating source of the information. Fortunately I have deep pools of creativity to draw on while they are embalmed in their own dogma.

Igor said...

Please send me email so we can discuss this offline.

Cheers,

Igor.
