
Regret Analysis of Stochastic and Nonstochastic Multi-armed Bandit Problems

Language: English
Binding: Paperback
Regret Analysis of Stochastic and Nonstochastic Multi-armed Bandit Problems, Sébastien Bubeck
Libristo code: 04834934
Publisher: now publishers Inc, December 2012
92.15
In stock at the supplier; dispatched within 15–20 days

30-day return policy


You might also like

Reckless, Lauren Roberts / Paperback: 10.29
Science of Cooking, Stuart Farrimond / Hardcover: 22.20
Happiness 1, Shuzo Oshimi / Paperback: 11.90
The Course of Love, Alain de Botton / Paperback: 9.28
Dead Boy Detectives Omnibus, Toby Litt / Hardcover: 76.60
Japanese Sake Bible, Takashi Eguchi / Paperback: 14.02
Dog Behaviour, Evolution, and Cognition, Adam Miklosi / Paperback: 67.62
Damn Delicious Meal Prep, Chungah Rhee / Hardcover: 24.11
Albert Camus: A Life, Olivier Todd / Paperback: 20.88
Design Principles for Photography, Jeremy Webb / Paperback: 44.20
Battles of Tolkien, David Day / Paperback: 13.52
Art of SEO, Stephan Spencer / Paperback: 53.49
Texas BBQ, The Editors of Southern Living / Paperback: 16.95

A multi-armed bandit problem - or, simply, a bandit problem - is a sequential allocation problem defined by a set of actions. At each time step, a unit resource is allocated to an action and some observable payoff is obtained. The goal is to maximize the total payoff obtained in a sequence of allocations. The name bandit refers to the colloquial term for a slot machine (a "one-armed bandit" in American slang). In a casino, a sequential allocation problem is obtained when the player faces many slot machines at once (a "multi-armed bandit") and must repeatedly choose where to insert the next coin. Multi-armed bandit problems are the most basic examples of sequential decision problems with an exploration-exploitation trade-off: the balance between staying with the option that gave the highest payoffs in the past and exploring new options that might give higher payoffs in the future. Although the study of bandit problems dates back to the 1930s, exploration-exploitation trade-offs arise in several modern applications, such as ad placement, website optimization, and packet routing. Mathematically, a multi-armed bandit is defined by the payoff process associated with each option. In this book, the focus is on two extreme cases in which the analysis of regret is particularly simple and elegant: independent and identically distributed payoffs and adversarial payoffs. Besides the basic setting of finitely many actions, it also analyzes some of the most important variants and extensions, such as the contextual bandit model. This monograph is an ideal reference for students and researchers with an interest in bandit problems.
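The exploration-exploitation trade-off described above can be illustrated with a short epsilon-greedy policy on i.i.d. Bernoulli arms. This is a minimal sketch for intuition only, not an algorithm from the monograph; the arm means, the exploration rate, and the pseudo-regret bookkeeping are illustrative choices:

```python
import random

def epsilon_greedy(true_means, n_rounds=10000, epsilon=0.1, seed=0):
    """Play epsilon-greedy against Bernoulli arms and return the
    realized pseudo-regret versus always playing the best arm."""
    rng = random.Random(seed)
    k = len(true_means)
    counts = [0] * k          # number of pulls per arm
    estimates = [0.0] * k     # empirical mean payoff per arm
    best_mean = max(true_means)
    regret = 0.0
    for _ in range(n_rounds):
        if rng.random() < epsilon:
            # Explore: pick a uniformly random arm.
            arm = rng.randrange(k)
        else:
            # Exploit: pick the arm with the best empirical mean so far.
            arm = max(range(k), key=lambda a: estimates[a])
        payoff = 1.0 if rng.random() < true_means[arm] else 0.0
        counts[arm] += 1
        # Incremental update of the empirical mean.
        estimates[arm] += (payoff - estimates[arm]) / counts[arm]
        # Pseudo-regret: expected loss from not playing the best arm.
        regret += best_mean - true_means[arm]
    return regret

# Three hypothetical arms; the policy should concentrate on the 0.7 arm,
# so the accumulated regret stays far below the worst case of 0.4 * 10000.
r = epsilon_greedy([0.3, 0.5, 0.7])
```

Because epsilon stays constant, exploration alone contributes regret growing linearly in the number of rounds; the algorithms analyzed in the book (such as UCB in the stochastic setting) achieve logarithmic regret instead.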

Book information

Full title: Regret Analysis of Stochastic and Nonstochastic Multi-armed Bandit Problems
Language: English
Binding: Paperback
Publication date: 2012
Pages: 138
EAN: 9781601986269
ISBN: 1601986262
Libristo code: 04834934
Publisher: now publishers Inc
Weight: 208 g
Dimensions: 234 × 159 × 8 mm
