I read with interest the column by Brian Hayes (“Programs and Probability,” Computing Science, September–October) on developing a probabilistic programming language for computers. The article mentioned the Monte Carlo method and how its fundamentally random character is antithetical to the digital nature of computers. It is worth further elaborating on this irony. Monte Carlo methods were developed hand in hand with the original digital computers, with both used for weapons systems, specifically the hydrogen bomb. Furthermore, Monte Carlo methods are very widely used today and thus represent a principal contribution of computers to numerical analysis. I suspect that many of the early computer pioneers would find it bizarre that their precise logical gate structures are now used to generate so many pseudorandom numbers.
Above is a published letter, with the following citation:
M. Gerstein (2015). "The Full Monte Carlo," American Scientist, Nov.–Dec., Vol. 103, No. 6, p. 374.
http://www.americanscientist.org/issues/pub/2015/6/the-full-monte-carlo
Published letter in response to:
Programs and Probability
Computer programs must cope with chance and uncertainty, just as people do. One solution is to build probabilistic reasoning into the programming language.
Brian Hayes
http://www.americanscientist.org/issues/pub/2015/5/Programs-and-probability
Specifically, from Issue:
September–October 2015, Volume 103, Number 5, Page 326, DOI: 10.1511/2015.116.326
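The irony the letter describes — a deterministic machine feeding pseudorandom numbers into a stochastic calculation — can be seen in a minimal Monte Carlo sketch. The example below (in Python, chosen for illustration; the letter specifies no language) estimates π by sampling points in the unit square and counting how many land inside the quarter circle. Note that the "random" stream is entirely deterministic given the seed, which is precisely the point.

```python
import random

def estimate_pi(n_samples: int, seed: int = 42) -> float:
    """Monte Carlo estimate of pi: the fraction of uniformly
    sampled points in the unit square that fall inside the
    quarter circle approaches pi/4 as n_samples grows."""
    rng = random.Random(seed)  # deterministic pseudorandom stream
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / n_samples

print(estimate_pi(100_000))
```

Rerunning with the same seed reproduces the estimate exactly — the "precise logical gate structures" never actually roll dice.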