Shekhar is working on a Master Algorithm of Machine Learning called the "WebAssembly Learner".

Outside of machine learning, if you have two different problems to solve, you need to write two different programs.
They might use some of the same infrastructure, like the same programming language or the same database system, but a
program to, say, play chess is of no use if you want to process credit-card applications. In machine learning, the same
algorithm can do both, provided you give it the appropriate data to learn from. Such machine-learning algorithms are also called "learners".
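To make the "one algorithm, many problems" idea concrete, here is a minimal sketch of a 1-nearest-neighbor learner applied to two unrelated tasks. All the data and labels below are made up for illustration; this is not Shekhar's actual code.

```python
# A toy 1-nearest-neighbor learner: the same algorithm handles two
# unrelated tasks once it is given the right data to learn from.

def nearest_neighbor_predict(train, query):
    """Return the label of the training example closest to `query`."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(train, key=lambda ex: dist(ex[0], query))[1]

# Task 1: classify points as "left" or "right" of some boundary.
spatial = [((0.0,), "left"), ((1.0,), "left"), ((9.0,), "right")]
print(nearest_neighbor_predict(spatial, (8.0,)))  # right

# Task 2: flag credit applications by (income, debt) -- made-up numbers.
credit = [((90, 10), "approve"), ((20, 80), "reject"), ((70, 30), "approve")]
print(nearest_neighbor_predict(credit, (85, 15)))  # approve
```

Nothing about the algorithm mentions chess, geometry, or credit; the task lives entirely in the data.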
If so few learners can do so much, the logical question is: Could one learner do everything?
In other words, could a single algorithm learn all that can be learned from data?
This is a very tall order, since it would ultimately include everything in an adult’s brain, everything evolution has
created, and the sum total of all scientific knowledge. But in fact all the major learners—including nearest-neighbor,
decision trees, and Bayesian networks, a generalization of Naïve Bayes—are universal in the following sense: if you
give the learner enough of the appropriate data, it can approximate any function arbitrarily closely—which is
math-speak for learning anything. The catch is that “enough data” could be infinite. Learning from finite data requires
making assumptions, as we’ll see, and different learners make different assumptions, which makes them good for some
things but not others.
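The "approximate any function arbitrarily closely, given enough data" claim can be demonstrated on a small scale. The sketch below uses the same nearest-neighbor idea to approximate sin(x) and shows the worst-case error shrinking as the training set grows; the sample sizes and target function are arbitrary choices for illustration.

```python
# Sketch: a 1-nearest-neighbor learner approximates sin(x) more closely
# as it sees more data -- the "enough data" caveat in action.
import math

def nn_regress(train, x):
    """Predict using the single closest training point."""
    return min(train, key=lambda p: abs(p[0] - x))[1]

def max_error(n):
    """Worst-case 1-NN error on sin over [0, 2*pi] with n+1 samples."""
    train = [(2 * math.pi * i / n, math.sin(2 * math.pi * i / n))
             for i in range(n + 1)]
    queries = [2 * math.pi * j / 1000 for j in range(1001)]
    return max(abs(nn_regress(train, q) - math.sin(q)) for q in queries)

print(max_error(10) > max_error(100))  # True: more data, smaller error
```

Ten times the data cuts the worst-case error by roughly a factor of ten here, but the error never reaches zero on finite data, which is exactly why learners must make assumptions.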
All knowledge—past, present, and future—can be derived from data by a single, universal learning algorithm.
He calls this learner the "WebAssembly Learner".
He is unreasonably excited about Machine Learning, Web Fonts, and Web Development, and will become your best friend if you bring him cheese. On second thought, he may be a mouse.
If you need an image, I recommend this one.
THANKS FOR READING! ❤︎