Bandit Algorithms for Website Optimization: Developing, Deploying, and Debugging

Author:

John Myles White

Publisher:

Shroff/O'Reilly

Price: Rs. 250

Availability: Available

Shipping Time: Usually ships in 5-9 days


Rating and Reviews

0.0 / 5 (no reviews yet)
Publisher: Shroff/O'Reilly
Publication Year: 2013
ISBN-13: 9789350239735
ISBN-10: 9350239736
Binding: Paperback
Number of Pages: 106
Language: English
Dimensions (cm): 24 x 18 x 1
Weight (g): 200
When looking for ways to improve your website, how do you decide which changes to make? And which changes to keep? This concise book shows you how to use multiarmed bandit algorithms to measure the real-world value of any modifications you make to your site. Author John Myles White shows you how this powerful class of algorithms can help you boost website traffic, convert visitors to customers, and increase many other measures of success.

This is the first developer-focused book on bandit algorithms, which were previously described only in research papers. You'll quickly learn the benefits of several simple algorithms, including the epsilon-Greedy, Softmax, and Upper Confidence Bound (UCB) algorithms, by working through code examples written in Python, which you can easily adapt for deployment on your own website.

- Learn the basics of A/B testing and recognize when it's better to use bandit algorithms.
- Develop a unit testing framework for debugging bandit algorithms.
- Get additional code examples written in Julia, Ruby, and JavaScript with supplemental online materials.
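To give a flavour of the kind of algorithm the book covers, here is a minimal epsilon-Greedy sketch in Python. The class name and structure are illustrative only and are not taken from the book's own code: with probability epsilon the algorithm tries a random arm (explore); otherwise it plays the arm with the best estimated reward so far (exploit).

import random

class EpsilonGreedy:
    """Illustrative epsilon-Greedy bandit (not the book's code): explore a
    random arm with probability epsilon, otherwise exploit the best arm."""

    def __init__(self, epsilon, n_arms):
        self.epsilon = epsilon
        self.counts = [0] * n_arms    # number of pulls per arm
        self.values = [0.0] * n_arms  # running mean reward per arm

    def select_arm(self):
        if random.random() < self.epsilon:
            return random.randrange(len(self.values))  # explore
        return self.values.index(max(self.values))     # exploit

    def update(self, arm, reward):
        # incrementally update the mean reward estimate for the chosen arm
        self.counts[arm] += 1
        n = self.counts[arm]
        self.values[arm] += (reward - self.values[arm]) / n

# Example: simulate two arms with (assumed) Bernoulli reward rates 0.1 and 0.2
bandit = EpsilonGreedy(epsilon=0.1, n_arms=2)
for _ in range(10000):
    arm = bandit.select_arm()
    reward = 1.0 if random.random() < (0.1, 0.2)[arm] else 0.0
    bandit.update(arm, reward)
print(bandit.values)  # estimates should approach [0.1, 0.2]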

John Myles White

John Myles White is a PhD candidate in Psychology at Princeton. He studies pattern recognition, decision-making and economic behavior using behavioral methods and fMRI. He is particularly interested in anomalies of value assessment.
No reviews found.