Arghya Kusum Das
Arghya Kusum Das is an Assistant Professor of Computer Science at the University of Wisconsin - Platteville. He earned his Ph.D. in Computer Science with an emphasis in Scientific Big Data Analysis from Louisiana State University, USA. Dr. Das's research focuses on the scalable and efficient analysis of scientific big data (e.g., genomics and healthcare data). It includes both the development of large-scale software frameworks and the design of new hardware architectures for efficient data analysis. His research also addresses the secure transfer of big data over the Internet using Blockchain and other decentralized technologies. Das is currently working toward developing an innovative online platform and curricula to spread quality education in the fields of big data analysis, AI, HPC, and related technologies.
Large-scale streaming and big data applications requiring large amounts of memory have made OpenCAPI technology and FPGAs an appealing and cost-effective solution. Organizations ranging from large research labs to data-analytics startups are increasingly utilizing the technology to accelerate their applications, creating new jobs and research opportunities. In this presentation, we will discuss the course we designed to teach high-performance analysis of big data by leveraging FPGA technology together with OpenCAPI.
Understanding machine learning algorithms is essential, but industry also demands development, acceleration, and production engineering capabilities. This machine learning course introduces students to data preprocessing, an algorithmic overview of different supervised and unsupervised learning techniques, strategies for developing them, and approaches for accelerating those algorithms on different hardware, such as IBM Power systems. We developed the course in collaboration with experts from different industries (e.g., Facebook and IBM). The course will help the community learn more about the capabilities of the IBM POWER processor while designing an ML production system end-to-end, including project scoping, data needs, modeling strategies, and deployment requirements.