
Can we reliably forecast individual 3G usage data?


Presentation Transcript


  1. Cosmo Zheng Can we reliably forecast individual 3G usage data? An analysis using mathematical simulation of time series algorithms

  2. Background • Fluctuations in daily demand for bandwidth make ordinary usage pricing inefficient • Solution: Time-dependent pricing to persuade users to defer usage http://scenic.princeton.edu/tube/overview.html

  3. Our Problem • Users must be informed of expected future prices, to assess the costs of deferring usage • We need a reliable way to predict future usage based on past data http://scenic.princeton.edu/tube/technology.html

  4. The Algorithms • Nonlinear regression – generate a fitted function of the form D + A*sin(2πt/24) + B*sin(2πt/12) + C*sin(2πt/6) • Use fitted function to extrapolate
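
A minimal sketch of this regression step, assuming SciPy's curve_fit and a placeholder training series (the slides do not specify the fitting library or the simulated values):

```python
import numpy as np
from scipy.optimize import curve_fit

# Placeholder training data: 4 days of hourly usage with a daily cycle plus
# noise (the actual simulated values are not given in the slides).
rng = np.random.default_rng(0)
t_train = np.arange(1, 97)
usage_train = 50 + 20 * np.sin(2 * np.pi * t_train / 24) + rng.normal(0, 5, 96)

def usage_model(t, D, A, B, C):
    # D + A*sin(2πt/24) + B*sin(2πt/12) + C*sin(2πt/6)
    return (D
            + A * np.sin(2 * np.pi * t / 24)
            + B * np.sin(2 * np.pi * t / 12)
            + C * np.sin(2 * np.pi * t / 6))

params, _ = curve_fit(usage_model, t_train, usage_train)

# Extrapolate the fitted function over the test day (hours 97-120).
t_test = np.arange(97, 121)
forecast = usage_model(t_test, *params)
```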

  5. Algorithms (cont.) • Time series decomposition – isolate trend, seasonal, and residual components • Extend trend and seasonal components into the future
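
One way to carry out this decomposition step is with statsmodels' seasonal_decompose; the 24-hour period and the simple "repeat the seasonal cycle, hold the last trend value" extension rule below are assumptions for illustration:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

# Placeholder training series: 4 days of hourly usage.
rng = np.random.default_rng(0)
t = np.arange(1, 97)
usage_train = pd.Series(50 + 20 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 5, 96))

# Isolate trend, seasonal (24-hour period), and residual components.
parts = seasonal_decompose(usage_train, model="additive", period=24)

# Extend into the 5th day: repeat one seasonal cycle and carry the last
# available trend estimate forward.
seasonal_day = parts.seasonal.iloc[:24].to_numpy()
last_trend = parts.trend.dropna().iloc[-1]
forecast = last_trend + seasonal_day
```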

  6. Algorithms (cont.) • Exponential smoothing – generate {Sₜ} based on a weighted average of previous data • Simplest form is S₁ = X₀, Sₜ = αXₜ₋₁ + (1 − α)Sₜ₋₁ for t > 1, where α is a smoothing factor
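
A direct translation of this recursion into code (with α = 0.3 as an arbitrary example value, and the last smoothed value carried forward as the flat forecast for the test day):

```python
import numpy as np

def simple_exponential_smoothing(x, alpha):
    # s[i] stores S_(i+1) from the slide: S_1 = X_0 and
    # S_t = alpha*X_(t-1) + (1 - alpha)*S_(t-1) for t > 1.
    s = np.empty(len(x))
    s[0] = x[0]
    for i in range(1, len(x)):
        s[i] = alpha * x[i] + (1 - alpha) * s[i - 1]
    return s

# Placeholder training series: 4 days of hourly usage.
rng = np.random.default_rng(0)
t = np.arange(1, 97)
usage_train = 50 + 20 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 5, 96)

smoothed = simple_exponential_smoothing(usage_train, alpha=0.3)
# Simple smoothing has no seasonal term, so the forecast for every hour of
# the test day is just the last smoothed value.
forecast = np.full(24, smoothed[-1])
```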

  7. The Data • Use simulated datasets, representing usage each hour over 5 days • {Xₜ} for 1 ≤ t ≤ 120 • First 4 days are historical data (training set), 5th day is the test set
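
A sketch of what such a simulated dataset and split could look like; the generating process (level, amplitude, noise) is an assumption, since the slides only state the 120-hour length and the 4-day/1-day split:

```python
import numpy as np

rng = np.random.default_rng(42)
t = np.arange(1, 121)                      # X_t for 1 <= t <= 120
daily_cycle = 20 * np.sin(2 * np.pi * t / 24)
usage = 50 + daily_cycle + rng.normal(0, 5, size=120)

train = usage[:96]    # first 4 days: historical data (training set)
test = usage[96:]     # 5th day: held-out test set
```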

  8. Algorithm 1: Regression

  9. Regression (cont.) R² = 0.424
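
The R² values on these results slides score the extrapolated forecast against the actual test-day usage. Assuming the standard 1 − SS_res/SS_tot definition (the slides do not spell it out), it can be computed like this, with placeholder arrays standing in for hours 97-120:

```python
import numpy as np

def r_squared(actual, predicted):
    # R^2 = 1 - SS_res / SS_tot over the 24 test-day hours.
    ss_res = np.sum((actual - predicted) ** 2)
    ss_tot = np.sum((actual - np.mean(actual)) ** 2)
    return 1 - ss_res / ss_tot

# Placeholder test-day data and forecast, for illustration only.
rng = np.random.default_rng(1)
hours = np.arange(97, 121)
actual = 50 + 20 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 5, 24)
predicted = 50 + 20 * np.sin(2 * np.pi * hours / 24)
print(r_squared(actual, predicted))
```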

  10. Algorithm 2: Decomposition

  11. Decomposition (cont.) R² = 0.693

  12. Algorithm 3: Smoothing

  13. Smoothing (cont.) R² = 0.516

  14. Additional Trials • Comparison of the three algorithms by sum of absolute error and R²

  15. Conclusions • Time series decomposition provided the most accurate prediction of future usage, followed by exponential smoothing, then regression • Possible explanation: the usage pattern is strongly cyclic, repeating itself on a daily basis • Suggestion: investigate better means of isolating the seasonal component; more sophisticated algorithms exist (ARIMA, stochastic volatility models).
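
As a pointer for the ARIMA suggestion, a seasonal ARIMA fit on the same kind of series could look like the sketch below; the statsmodels SARIMAX class is one reasonable tool for this, but the non-seasonal and seasonal orders here are illustrative guesses, not tuned values:

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Placeholder 5-day hourly series, split as in the slides.
rng = np.random.default_rng(0)
t = np.arange(1, 121)
usage = 50 + 20 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 5, 120)
train, test = usage[:96], usage[96:]

# Seasonal ARIMA with a 24-hour seasonal period; orders chosen only for
# illustration.
model = SARIMAX(train, order=(1, 0, 1), seasonal_order=(1, 0, 1, 24))
fitted = model.fit(disp=False)
forecast = fitted.forecast(steps=24)
```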
