Geometrical Approach to Big Data

Prof. Václav Snásel

VSB-Technical University of Ostrava

Czech Republic


The Big Data paradigm is one of the main science and technology challenges of today. Big Data comprises data sets that are too large or too complex for efficient processing and analysis with traditional, or even unconventional, algorithms and tools. The challenge is to derive value from signals buried in an avalanche of noise arising from challenging data volume, velocity, and veracity. The computer science challenges are as varied as they are important: searching for influential nodes in huge networks, segmenting graphs into meaningful communities, modelling uncertainties in health trends for individual patients, controlling complex systems, linking databases with different levels of granularity in space and time, unbiased sampling, connecting with sensor infrastructure, and high-performance computing. Answers to these questions are the key to competitiveness and leadership in this field.

Big Data is usually modelled as a point cloud in a high-dimensional space. One way to understand something about the data is to find a geometric object from which the data appear to be sampled; the geometric object is then seen as an interpolation of the data. The main tool for studying the qualitative features of geometric objects is topology. Topology studies those properties of geometric objects that do not depend on the chosen coordinates or distances, but only on the intrinsic geometry of the objects.
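As a small illustration of this idea (a sketch of the general technique, not code from the talk), one can estimate a simple topological invariant of a point cloud, its number of connected components, by linking all pairs of points closer than a chosen scale `eps` (the edge set of a Vietoris-Rips style neighbourhood graph). The point cloud and the threshold below are hypothetical examples:

```python
# Sketch: counting connected components of a point cloud at scale eps.
# The invariant depends only on pairwise distances, not on the coordinates.
from itertools import combinations
from math import dist

def connected_components(points, eps):
    """Count components of the graph linking points closer than eps."""
    parent = list(range(len(points)))  # union-find over point indices

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for i, j in combinations(range(len(points)), 2):
        if dist(points[i], points[j]) <= eps:
            parent[find(i)] = find(j)  # merge the two clusters
    return len({find(i) for i in range(len(points))})

# Two well-separated clusters sampled in the plane:
cloud = [(0.0, 0.0), (0.1, 0.2), (0.2, 0.1), (5.0, 5.0), (5.1, 4.9)]
print(connected_components(cloud, eps=1.0))  # -> 2
```

Varying `eps` and recording how such invariants appear and disappear is, in essence, what persistent homology does at scale.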

Short Bio:

Václav Snásel is a professor at the VSB-Technical University of Ostrava, Czech Republic. His research and development experience spans over 25 years in industry and academia. He works in a multi-disciplinary environment involving artificial intelligence, multidimensional data indexing, concept lattices, information retrieval, the semantic web, knowledge management, data compression, machine intelligence, neural networks, web intelligence, and data mining, applied to various real-world problems.

Big Stream Data Distribution: Technologies and Perspectives

Prof. Tomoki Yoshihisa

Osaka University



Due to the recent prevalence of high-speed Internet access, big stream data such as video data or 3D data has attracted great attention. Such data are used for Internet broadcasting, video conferencing, games, and many other applications, so big stream data distribution is an important technology today. One of the key points of distribution is avoiding interruptions in playback, since viewers prefer a smooth display. Conventional data buffering technologies are not very effective for big stream data distribution: because the data size is large, it takes a long time to buffer even a part of the data. Interruption avoidance in big stream data distribution is therefore a difficult research challenge, and various distribution technologies have been studied. Most of them are based on the ideas of near video-on-demand, P2P delivery, and adaptive bit rate. In this talk, I will introduce some big stream data distribution technologies, including my own research, and give a perspective on the research field.
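The buffering difficulty can be made concrete with a back-of-the-envelope calculation (an illustrative sketch with hypothetical numbers, not a method from the talk). If the download rate b is below the playback bit rate r, then after a start-up delay d the client has downloaded b(d + t) bits by playback time t while consuming rt bits, so stall-free playback of a stream of duration T requires d >= T(r - b)/b:

```python
# Sketch: minimum start-up buffering delay for interruption-free playback.
# Hypothetical rates; real systems adapt the bit rate instead of waiting.

def min_startup_delay(bitrate_mbps, bandwidth_mbps, duration_s):
    """Seconds to buffer before starting playback so it never stalls."""
    if bandwidth_mbps >= bitrate_mbps:
        return 0.0  # the link keeps up; no pre-buffering needed
    # Require bandwidth * (d + t) >= bitrate * t for all t up to duration_s.
    return duration_s * (bitrate_mbps - bandwidth_mbps) / bandwidth_mbps

# A 60-minute, 8 Mbps stream over a 6 Mbps link:
print(min_startup_delay(8, 6, 3600))  # -> 1200.0 seconds of pre-buffering
```

Twenty minutes of pre-buffering is clearly unacceptable, which is why the techniques surveyed in the talk (near video-on-demand, P2P delivery, adaptive bit rate) attack the problem from the distribution side rather than by buffering alone.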

Short Bio:

Tomoki Yoshihisa is an associate professor at Osaka University, Japan. He received his PhD from Osaka University in 2005 and was a visiting researcher at the University of California, Irvine in 2008. His main research interests include stream data distribution, video-on-demand, and wearable computing. He has made significant technical contributions to the stream data distribution field, with over 100 research publications in top-tier international journals (including IEEE Transactions on Broadcasting) and conference/workshop proceedings. He received Best Paper Awards at 3PGCIC 2012 and several other international and domestic conferences.