LM101-086: Ch8: How to Learn the Probability of Infinitely Many Outcomes

22/07/2021

This 86th episode of Learning Machines 101 discusses the problem of assigning probabilities to a possibly infinite set of outcomes in the space-time continuum that characterizes our physical world. Such a set is called an "environmental event". A machine learning algorithm uses information about the frequency of environmental events to learn. If we want to study statistical machine learning, then we must be able to discuss how to represent and compute the probability of an environmental event. It is essential that we have methods for communicating probability concepts to other researchers, methods for calculating probabilities, and methods for calculating expectations with respect to specific environmental events. This episode discusses the challenges of assigning probabilities to events when we allow for events comprised of an infinite number of outcomes. Along the way we introduce essential measure-theoretic tools for representing and computing probabilities, such as sigma-fields and the Radon-Nikodym probability density function. Near the end we also briefly discuss the intriguing Banach-Tarski paradox and how it motivates the development of some of these special mathematical tools. Check out www.learningmachines101.com and www.statisticalmachinelearning.com for more information!
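To make the idea concrete, here is a minimal sketch (not from the episode itself; the function names and the choice of a standard normal density are illustrative assumptions). When an event contains infinitely many outcomes, such as a sensor reading falling anywhere in an interval, we cannot sum probabilities outcome by outcome. Instead, a Radon-Nikodym density lets us compute the event's probability as an integral of the density over the event, approximated below by a midpoint Riemann sum:

```python
import math

def standard_normal_density(x):
    """Radon-Nikodym density of a standard normal (w.r.t. Lebesgue measure)."""
    return math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

def probability(density, a, b, n=100_000):
    """Approximate P(a <= X <= b) as the integral of the density over [a, b],
    using a midpoint Riemann sum with n subintervals."""
    dx = (b - a) / n
    return sum(density(a + (i + 0.5) * dx) for i in range(n)) * dx

# The event "the outcome lands in [0, 1]" has infinitely many outcomes,
# yet its probability is a single well-defined number (about 0.3413).
p = probability(standard_normal_density, 0.0, 1.0)
```

Each individual outcome here has probability zero; only sets of outcomes (the members of the sigma-field) carry probability, which is exactly why the measure-theoretic machinery discussed in the episode is needed.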

Related episodes:
- LM101-085: Ch7: How to Guarantee your Batch Learning Algorithm Converges (30:50)
- LM101-084: Ch6: How to Analyze the Behavior of Smart Dynamical Systems (33:12)
- LM101-083: Ch5: How to Use Calculus to Design Learning Machines (34:21)
- LM101-082: Ch4: How to Analyze and Design Linear Machines (29:04)
- LM101-081: Ch3: How to Define Machine Learning (or at Least Try) (37:19)