Poems That Solve Puzzles
Published by Oxford University Press
ISBN: 9780198853732, 9780191888168

2020, pp. 143–158
Author: Chris Bleakley

Chapter 8 explores the arrival of the World Wide Web, Amazon, and Google. The web allows users to display “pages” of information retrieved from remote computers by means of the Internet. Inventor Tim Berners-Lee released the first web software for free, setting in motion an explosion in Internet usage. Seeing the opportunity of a lifetime, Jeff Bezos set up Amazon as an online bookstore. Amazon’s success was accelerated by a product recommender algorithm that selectively targets advertising at users. By the mid-1990s, there were so many web sites that users often couldn’t find what they were looking for. Stanford PhD student Larry Page invented an algorithm for ranking search results based on the importance and relevance of web pages. Page and fellow student Sergey Brin established a company to bring their search algorithm to the world. Page and Brin, the founders of Google, are now each worth US$35–40 billion.
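The abstract does not give code, but the core of Page's ranking idea can be sketched as the classic power-iteration form of PageRank. This is a minimal illustration, not Google's production algorithm; the four-page link graph and the damping factor of 0.85 are assumptions for the demo.

```python
# Minimal PageRank sketch (power iteration). The link graph and damping
# factor below are illustrative assumptions, not Google's real parameters.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with uniform rank
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            share = rank[page] / len(outlinks)  # spread rank over outgoing links
            for target in outlinks:
                new_rank[target] += damping * share
        rank = new_rank
    return rank

# A toy web of four pages: rank flows toward heavily linked pages.
toy_web = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
for page, score in sorted(pagerank(toy_web).items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))
```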


Author: Chris Bleakley

The Introduction explains what an algorithm is, provides three examples, and exposes the relationship between algorithms and computers. An algorithm is a sequence of well-defined steps that solves an information problem. The chapter begins with the simple example of an algorithm for sharing sweets. The chapter highlights the elegance of effective algorithms and shows how they are written down. It then moves on to explain a straightforward algorithm for sorting books. An alternative algorithm that sorts books more quickly is presented and contrasted with it. By definition, a computer is a machine that performs algorithms. The chapter explains how algorithms are the underlying methods that computers follow to process data and make decisions.
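The abstract does not name the two sorting methods; a common pairing for this contrast is insertion sort (simple, but slow on large shelves) versus merge sort (faster by divide and conquer). The sketch below is an assumed illustration of that contrast, not necessarily the book's exact examples.

```python
# Two ways to sort a shelf of books by title (illustrative; the book's
# exact sorting algorithms are not named in the abstract).

def insertion_sort(books):
    """Simple method: slide each book left until it is in place. O(n^2)."""
    books = list(books)
    for i in range(1, len(books)):
        current = books[i]
        j = i - 1
        while j >= 0 and books[j] > current:
            books[j + 1] = books[j]   # shift larger titles right
            j -= 1
        books[j + 1] = current
    return books

def merge_sort(books):
    """Faster method: split the shelf, sort the halves, merge. O(n log n)."""
    if len(books) <= 1:
        return list(books)
    mid = len(books) // 2
    left, right = merge_sort(books[:mid]), merge_sort(books[mid:])
    merged = []
    while left and right:             # repeatedly take the smaller front title
        merged.append(left.pop(0) if left[0] <= right[0] else right.pop(0))
    return merged + left + right

shelf = ["Ulysses", "Dracula", "Emma", "Beloved", "Ivanhoe"]
print(insertion_sort(shelf))
print(merge_sort(shelf))
```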


2020, pp. 203–214
Author: Chris Bleakley

Chapter 12 is the story of AlphaGo, the first computer program to defeat a top human player at the board game Go. In March 2016, grandmaster Lee Sedol took on AlphaGo for a US$1 million prize in a best-of-five match. Experts expected that it would be easy money for Sedol. To most observers’ surprise, AlphaGo swept the first three games to win the match. AlphaGo was based on deep artificial neural networks (ANNs). The networks were trained on 30 million example moves, followed by 1.2 million games played against itself. AlphaGo was the creation of a London-based company named DeepMind Technologies. Founded in 2010 and acquired by Google in 2014, DeepMind has made a succession of high-profile breakthroughs in artificial intelligence. Recently, its AlphaZero ANN displayed signs of general-purpose intelligence: it learned to play chess, shogi, and Go to world-champion level in a few days.


2020, pp. 117–142
Author: Chris Bleakley

Chapter 7 exposes the algorithms that are the foundations of the Internet. The Internet relies on “packet-switching” to transfer data between computers. Messages are broken into “packets” of data, and these packets are routed across the network in a series of hops between linked computers. The advantage of packet-switching is that the network is easily extended and is robust to isolated computer failures. Data sent on the Internet is protected from errors by means of an algorithm invented by Richard Hamming. His algorithm adds information to packets, enabling receiving computers to detect and correct transmission errors. Communication on the Internet is secured by means of an algorithm published in 1977. The RSA algorithm relies on the properties of large prime numbers to prevent eavesdroppers from reading encrypted messages.
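As a concrete illustration of the RSA idea, here is a toy key generation and encryption with deliberately tiny primes (p = 61, q = 53, a well-known textbook example). Real RSA uses primes hundreds of digits long, and this sketch omits padding and everything else a secure implementation needs.

```python
# Toy RSA demo with tiny primes (61 and 53). Real deployments use enormous
# primes plus padding schemes; this only shows the underlying arithmetic.

p, q = 61, 53
n = p * q                      # public modulus: 3233
phi = (p - 1) * (q - 1)        # Euler's totient of n: 3120
e = 17                         # public exponent, coprime with phi
d = pow(e, -1, phi)            # private exponent: modular inverse of e (2753)

message = 65                   # a message encoded as a number < n
ciphertext = pow(message, e, n)      # encrypt with the public key (e, n)
recovered = pow(ciphertext, d, n)    # decrypt with the private key (d, n)

print(ciphertext)  # 2790
print(recovered)   # 65 -- the original message
```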


Author: Chris Bleakley

Chapter 1 traces the origins of algorithms from ancient Mesopotamia to Greece in the 3rd century BC. The oldest known algorithms were inscribed on clay tablets by the Babylonians more than 4,000 years ago. The clay tablets document algorithms ranging from geometry to accountancy. One tablet in particular, YBC 7289, indicates knowledge of the Pythagorean Theorem thousands of years before its supposed invention by the ancient Greeks. The Greeks made further advances in algorithms. Euclid’s algorithm determines the greatest common divisor of two numbers. The Sieve of Eratosthenes finds prime numbers. Both algorithms proved to be important stepping stones to modern cryptography, the mathematics of secret messages.
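Both Greek algorithms are short enough to state directly; the following is a minimal modern rendering of each (the example inputs are assumptions for the demo).

```python
# Euclid's algorithm: repeatedly replace the pair (a, b) with (b, a mod b)
# until the remainder is zero; the survivor is the greatest common divisor.
def gcd(a, b):
    while b:
        a, b = b, a % b
    return a

# Sieve of Eratosthenes: cross out the multiples of each prime in turn;
# the numbers left standing are the primes.
def primes_up_to(limit):
    is_prime = [True] * (limit + 1)
    is_prime[0] = is_prime[1] = False
    for n in range(2, int(limit ** 0.5) + 1):
        if is_prime[n]:
            for multiple in range(n * n, limit + 1, n):
                is_prime[multiple] = False
    return [n for n, prime in enumerate(is_prime) if prime]

print(gcd(1071, 462))    # 21
print(primes_up_to(30))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```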


2020, pp. 179–202
Author: Chris Bleakley

Chapter 11 traces the history of artificial neural networks (ANNs) from humble beginnings in the 1940s to their monumental successes in the 21st century. ANNs are algorithms that mimic the behaviour of the nerve cells in the human brain. The concept was originally proposed by Walter Pitts and Warren McCulloch, but it was Frank Rosenblatt who popularised the idea, building an ANN to recognise simple shapes in images. Rosenblatt’s Perceptron was heavily criticised, and attention turned to other, more mathematically rigorous approaches. In the 1970s, three independent research teams invented an effective algorithm for training an ANN to perform pattern recognition tasks. By the 1990s, a handful of results suggested that the idea might work after all. Around 2006, it finally became apparent that computer performance had been the limiting factor: large networks could perform many pattern recognition tasks just as well as humans. So-called deep learning was about to transform computing.
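To make Rosenblatt's idea concrete, here is a minimal perceptron trained with his learning rule. The logical AND function is an assumed stand-in for his shape-recognition demos; any linearly separable task would do.

```python
# Minimal perceptron using Rosenblatt's learning rule. The AND task and the
# learning rate are illustrative assumptions, not the book's example.

def train_perceptron(samples, epochs=10, lr=0.1):
    weights = [0.0, 0.0]
    bias = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            activation = weights[0] * x1 + weights[1] * x2 + bias
            output = 1 if activation > 0 else 0
            error = target - output          # nudge weights toward the target
            weights[0] += lr * error * x1
            weights[1] += lr * error * x2
            bias += lr * error
    return weights, bias

# Logical AND: fire only when both inputs are 1.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
for (x1, x2), _ in data:
    print((x1, x2), 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0)
```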


2020, pp. 55–74
Author: Chris Bleakley

Chapter 4 tells the story of numerical weather forecasting from its inception to today’s supercomputing algorithms. In 1922, Lewis Fry Richardson proposed that, since the atmosphere is subject to the laws of physics, future weather can be predicted by means of algorithmic calculations. His attempt at forecasting a single day’s weather by means of manual calculations took several months. In the late 1940s, John von Neumann resurrected Richardson’s idea and launched a project to conduct the first weather forecast by computer. The world’s first operational electronic computer, ENIAC, completed a 24-hour forecast in just one day. It appeared that accurate forecasting simply required faster computers. In the 1960s, Edward Lorenz discovered that tiny errors in weather measurements can accumulate during numerical forecasting to produce large errors. The so-called Butterfly Effect was alleviated by the Monte Carlo simulation method invented by Stanislaw Ulam for particle physics.
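The Butterfly Effect, and the Monte Carlo (ensemble) remedy, can both be seen in a few lines: integrate Lorenz's own three-variable toy model from two almost identical starting states and watch the forecasts diverge, then average an ensemble of perturbed runs instead of trusting a single trajectory. The step size, run length, and perturbation sizes below are illustrative assumptions.

```python
# Butterfly Effect demo with the Lorenz-63 model: two runs that differ by
# one part in a million end up predicting completely different "weather".
import random

def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # One Euler step of the Lorenz equations (step size is illustrative).
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

def run(x, y, z, steps=5000):
    for _ in range(steps):
        x, y, z = lorenz_step(x, y, z)
    return x

print(run(1.0, 1.0, 1.0))        # one "forecast"
print(run(1.000001, 1.0, 1.0))   # a nearly identical start diverges widely

# Monte Carlo remedy: average an ensemble of slightly perturbed forecasts.
ensemble = [run(1.0 + random.uniform(-1e-3, 1e-3), 1.0, 1.0) for _ in range(20)]
print(sum(ensemble) / len(ensemble))
```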


2020, pp. 39–54
Author: Chris Bleakley

Chapter 3 tells the story of the visionaries who first imagined the computer. In the 19th century, Charles Babbage invented a mechanical computer but failed in his attempts to build it. He and Ada Lovelace wrote a series of programs for the proposed machine. These programs were the first transcriptions of algorithms into sequences of machine-executable instructions. After Babbage’s failure, the idea of building a real computer was abandoned for fifty years. As a young PhD student, Alan Turing forever defined the relationship between algorithms and computers. According to his definition, a computer is a machine that performs algorithms. He devised a theoretical computer that allowed him to investigate the limits of computation, all before a single real computer had been built. Turing went on to work as a cryptographer during World War II. He outlined the future of computing but tragically died at the age of 41.
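Turing's theoretical computer is easy to simulate. Below is a minimal simulator whose transition table, a hypothetical example rather than one from the book, increments a binary number written on the tape.

```python
# Minimal Turing machine simulator. The transition table is an assumed
# example machine that adds 1 to a binary number on the tape.

def run_turing_machine(rules, tape_string, state="right", blank="_"):
    tape = dict(enumerate(tape_string))   # sparse tape: position -> symbol
    head = 0
    while state != "done":
        symbol = tape.get(head, blank)
        state, write, move = rules[(state, symbol)]
        tape[head] = write
        head += move
    return "".join(tape[i] for i in sorted(tape)).strip(blank)

# Rules: (state, symbol) -> (next state, symbol to write, head move).
increment = {
    ("right", "0"): ("right", "0", +1),   # scan right to the end of the number
    ("right", "1"): ("right", "1", +1),
    ("right", "_"): ("carry", "_", -1),   # passed the end; start adding one
    ("carry", "1"): ("carry", "0", -1),   # 1 + 1 = 10: write 0, carry left
    ("carry", "0"): ("done", "1", 0),     # absorb the carry into a 0
    ("carry", "_"): ("done", "1", 0),     # carry past the left edge
}

print(run_turing_machine(increment, "1011"))  # 1100 (11 + 1 = 12)
```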


2020, pp. 159–170
Author: Chris Bleakley

Chapter 9 examines how algorithms extract knowledge from data. In the 1990s, web sites began to store increasing volumes of data. A new industry grew up to exploit this data for commercial purposes. In a Harvard dorm, Mark Zuckerberg created a web site that allowed students to share personal news. The key to Facebook’s success was an algorithm invented by Zuckerberg to prioritise posts based on their popularity and relevance. The algorithm sparked a revolution in viral messaging. Netflix launched a US$1 million prize to find the most accurate algorithm for recommending movies based on users’ ratings. Google Flu Trends attempted to predict outbreaks of the flu across the USA based on Google queries. Despite initial high-profile successes, the project ended in failure, a lesson in the limits of data science.
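The Netflix-style recommendation problem can be sketched with user-based collaborative filtering: score an unseen movie by the ratings of users with similar tastes. The tiny ratings matrix and the cosine-similarity choice below are illustrative assumptions, not the prize-winning method.

```python
# User-based collaborative filtering sketch. The ratings data and similarity
# measure are illustrative; the Netflix Prize winner was far more elaborate.
from math import sqrt

ratings = {   # user -> {movie: stars}
    "ann":  {"Alien": 5, "Brazil": 4, "Casablanca": 1},
    "bob":  {"Alien": 4, "Brazil": 5, "Dune": 4},
    "cara": {"Casablanca": 5, "Dune": 2, "Brazil": 1},
}

def cosine_similarity(a, b):
    shared = set(a) & set(b)                 # movies both users rated
    if not shared:
        return 0.0
    dot = sum(a[m] * b[m] for m in shared)
    return dot / (sqrt(sum(v * v for v in a.values())) *
                  sqrt(sum(v * v for v in b.values())))

def predict(user, movie):
    """Weight other users' ratings of `movie` by their similarity to `user`."""
    num = den = 0.0
    for other, their in ratings.items():
        if other != user and movie in their:
            sim = cosine_similarity(ratings[user], their)
            num += sim * their[movie]
            den += abs(sim)
    return num / den if den else None

print(predict("ann", "Dune"))   # estimate from bob's and cara's tastes
```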


2020, pp. 75–92
Author: Chris Bleakley

Chapter 5 delves into the origins of artificial intelligence (AI). By the end of the 1940s, a few visionaries had realised that computers were more than mere automatic calculators. They believed that computers running the right algorithms could perform tasks previously thought to require human intelligence. Christopher Strachey completed the first artificially intelligent computer program in 1952. The program played the board game Checkers. Arthur Samuel of IBM extended and improved on Strachey’s program by including machine learning, the ability of a program to learn from experience. A team from Carnegie Mellon University developed the first computer program that could prove mathematical theorems. The program eventually reproduced 38 of the 52 proofs in a classic mathematics textbook. Flushed with these successes, serious scientists made wildly optimistic pronouncements about the future of AI. In the event, project after project failed to deliver, and the first “AI winter” set in.

