Home

For the past two decades, we have witnessed a deluge of data from diverse sources such as social media, sensors, and mobile devices such as smartphones. This ever-increasing data production is expected to reach the order of 20 zettabytes by the year 2020. Nonetheless, it is estimated that only 1% of all the data generated is actually processed and transformed into useful information. Beyond its sheer volume, big data exhibits other characteristics that distinguish it from traditional data: it is commonly unstructured and requires more real-time analysis. Big data has traditionally been characterized by the 3 Vs (volume, velocity, and variety), although many other Vs can be added to this characterization. These characteristics call for new system architectures for data acquisition, transmission, storage, and large-scale data processing.

The project Communications and Processing of Big Data in Clouds and Fogs (Comunicação e processamento de big data em nuvens e névoas computacionais), BIG Cloud, aims to develop new architectures and mechanisms to empower the communications and processing infrastructure needed for the evolution of big data processing.

This project is sponsored by the São Paulo Research Foundation (FAPESP), under project number 15/24494-8, and involves several universities and researchers.