LANGUAGE English
SOURCE IEEE ICC’18, Kansas City, MO, May 20-24, 2018
Published Date: 2018-05
ABSTRACT
Vehicular cloud computing (VCC) is proposed to
effectively utilize and share the computing and storage resources
on vehicles. However, due to the mobility of vehicles, the network
topology, the wireless channel states and the available computing
resources vary rapidly and are difficult to predict. In this work,
we develop a learning-based task offloading framework using
multi-armed bandit (MAB) theory, which enables a vehicle to
learn the potential task offloading performance of its neighboring
vehicles with surplus computing resources, namely service
vehicles (SeVs), and to minimize the average offloading delay. We
propose an adaptive volatile upper confidence bound (AVUCB)
algorithm and augment it with load-awareness and occurrence-awareness
by redesigning the utility function of the classic
MAB algorithms. The proposed AVUCB algorithm can effectively
adapt to the dynamic vehicular environment, balance the tradeoff
between exploration and exploitation in the learning process, and
converge quickly to the optimal SeV with a theoretical performance
guarantee. Simulations under both a synthetic scenario and a
realistic highway scenario are carried out, showing that the
proposed algorithm achieves close-to-optimal delay performance.
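
To make the bandit framing concrete, the sketch below shows a generic UCB-style selection rule in which each reachable SeV is an arm and the observed offloading delay is the feedback signal. This is only an illustrative sketch of the underlying idea, not the paper's AVUCB algorithm: the class name, the exploration weight beta, and the plain UCB1-style bonus are hypothetical, and the load-awareness, occurrence-awareness, and volatile-arm handling described in the abstract are omitted here.

```python
import math


class UCBOffloader:
    """Illustrative UCB-style SeV selection (not the paper's exact AVUCB).

    Each candidate SeV is treated as a bandit arm. Since the objective is
    to minimize delay, optimism under uncertainty means subtracting an
    exploration bonus from the empirical mean delay and picking the
    smallest resulting lower confidence bound.
    """

    def __init__(self, beta=1.0):
        self.beta = beta          # exploration weight (hypothetical tuning knob)
        self.counts = {}          # SeV id -> number of tasks offloaded to it
        self.mean_delay = {}      # SeV id -> empirical mean observed delay
        self.t = 0                # total number of offloading decisions so far

    def select(self, candidate_sevs):
        """Pick a SeV among the currently reachable candidates."""
        self.t += 1
        # Try any never-used candidate first (new arms may appear as vehicles move).
        for sev in candidate_sevs:
            if self.counts.get(sev, 0) == 0:
                return sev

        # Otherwise choose the SeV with the smallest lower confidence bound on delay.
        def lcb(sev):
            bonus = math.sqrt(self.beta * math.log(self.t) / self.counts[sev])
            return self.mean_delay[sev] - bonus

        return min(candidate_sevs, key=lcb)

    def update(self, sev, delay):
        """Record the delay observed after offloading a task to `sev`."""
        n = self.counts.get(sev, 0) + 1
        self.counts[sev] = n
        prev = self.mean_delay.get(sev, 0.0)
        self.mean_delay[sev] = prev + (delay - prev) / n
```

In the setting described above, the exploration bonus would additionally need to account for the task load and for how recently each SeV appeared in range; the plain logarithmic bonus used here is simply the classic UCB1 form, shown for illustration.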