Abstract
Streaming dataflow applications such as video conferencing systems are often subjected to traffic that occurs in bursts. As systems consisting of a CPU and a GPU become ubiquitous, efficient utilisation of such platforms for handling bursts of data becomes an interesting problem. For GPUs to be efficient, the chunk size of the data to process must be large. The bursty nature of the traffic, due to the underlying network connections, may result in an unacceptable increase in latency if the large chunk-size requirement is enforced strictly. We study these systems from a streaming dataflow perspective in the context of the dataflow programming language Reconfigurable Video Coding-Cal Actor Language (RVC-CAL). To address this issue, a crucial step is to determine a device crossover point, defined as the chunk size at which the decision to switch to the other device can be made. This point is predicted quantitatively using an analytical model of CPUs and GPUs whose parameters are statically determined and later tuned at runtime. In this paper, we validate this model against experimentally measured values for kernels generated for a streaming dataflow application and show that the crossover point determined by the model lies within the range predicted by the measurements.
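The abstract does not spell out the form of the analytical model. As a rough illustration of the crossover-point idea only, the sketch below assumes a hypothetical linear cost model (per-item CPU cost versus a fixed GPU launch/transfer overhead plus per-item GPU cost) and solves for the chunk size at which the GPU becomes the faster device. All names and parameter values are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch (not the paper's actual model): a hypothetical linear cost
# model for per-chunk execution time on CPU vs. GPU, used to locate the
# chunk size at which offloading to the GPU starts to pay off.

def cpu_time(n, t_cpu_per_item):
    """Predicted CPU time for a chunk of n items (assumed linear)."""
    return n * t_cpu_per_item

def gpu_time(n, t_launch, t_copy_per_item, t_gpu_per_item):
    """Predicted GPU time: fixed launch/transfer overhead plus linear work."""
    return t_launch + n * (t_copy_per_item + t_gpu_per_item)

def crossover_chunk_size(t_cpu_per_item, t_launch, t_copy_per_item, t_gpu_per_item):
    """Smallest chunk size for which the GPU is predicted to be faster.

    Solves t_launch + n*(t_copy + t_gpu) = n*t_cpu for n.
    Returns None if the GPU never wins (per-item GPU cost >= CPU cost).
    """
    per_item_gain = t_cpu_per_item - (t_copy_per_item + t_gpu_per_item)
    if per_item_gain <= 0:
        return None
    return t_launch / per_item_gain

# Example with made-up parameters (microseconds); in the paper's approach such
# parameters would be determined statically and then tuned at runtime from
# observed timings.
if __name__ == "__main__":
    n_star = crossover_chunk_size(t_cpu_per_item=2.0,
                                  t_launch=500.0,
                                  t_copy_per_item=0.4,
                                  t_gpu_per_item=0.1)
    print(f"predicted crossover chunk size: {n_star:.0f} items")
    n = round(n_star)
    print(f"CPU time at n*: {cpu_time(n, 2.0):.1f} us, "
          f"GPU time at n*: {gpu_time(n, 500.0, 0.4, 0.1):.1f} us")
```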
Original language | Unknown
---|---
Title of host publication | 2015 IEEE Global Conference on Signal and Information Processing (GlobalSIP 2015)
Editors | Jose Moura, Dapeng Oliver Wu
Publisher | IEEE Global Conference on Signal and Information Processing
Pages | 100–106
ISBN (Print) | 978-1-4799-7590-7
Publication status | Published - 2015
MoE publication type | A4 Article in conference proceedings
Event | IEEE Global Conference on Signal and Information Processing (GlobalSIP) - 2015 IEEE Global Conference on Signal and Information Processing (GlobalSIP), Duration: 14 Dec 2015 → 16 Dec 2015
Conference
Conference | IEEE Global Conference on Signal and Information Processing (GlobalSIP)
---|---
Period | 14/12/15 → 16/12/15