SDSC has created a number of ‘Centers of Excellence’ as part of a larger strategic focus to help researchers across all domains – including those who are relatively new to computational science – better manage the ever-increasing volume of digitally based information. These centers formalize key elements of SDSC’s wide range of expertise, from ‘big data’ management to the analysis and advancement of the Internet.
Formed in 1997, CAIDA is a collaborative undertaking among organizations in the commercial, government, and research sectors aimed at promoting greater cooperation in the engineering and maintenance of a robust, scalable global Internet infrastructure. In 2014, kc claffy, CAIDA’s principal investigator and co-founder, received the Institute of Electrical and Electronics Engineers (IEEE) Internet Award. She was recognized for her “seminal contributions to the field of Internet measurement, including security and network data analysis, and for distinguished leadership in and service to the Internet community by providing open-access data and tools.”
CLDS was established in 2012 as an industry-university partnership to study and address both the technical and the technology-management challenges facing information-intensive organizations in the era of big data. CLDS specializes in developing applicable concepts, frameworks, analytical approaches, case analyses, and systems solutions for big data management, with a related goal of developing a set of benchmarks that provide objective measures of how effectively hardware and software systems handle data-intensive applications. Based at SDSC to leverage the Center’s expertise and large-scale compute and storage resources, current CLDS initiatives include the Big Data Benchmarking Community effort and the How Much Information? research program. As an industry-university collaboration, CLDS encourages participation by industry and welcomes industry sponsorship of projects. Center research is available via a variety of venues, including working papers, research briefings, multi-company forum workshops, and sponsor conferences.
Sherlock, an offering of SDSC’s Health Cyberinfrastructure Division, is a center of excellence focused on information technology and data services for healthcare in government and academia, spanning cloud computing, cyber security, data management, application development, high-performance computing (HPC), big data, and visualization. Formed in 2013, Sherlock offers its services to federal, state, and local governments as well as universities nationwide.
Sherlock offers four major products: Analytics, Case Management, Cloud Services, and a Data Lab, along with an array of consulting expertise. These products comply with HIPAA and FISMA requirements for handling sensitive data.
“Data management, technology, and policy challenges, especially in the health sector, can be overwhelmingly complex and confusing,” said Sandeep Chandra, Sherlock’s director. “Our expertise spans many IT disciplines, including cloud computing, cyber security, data management and mining, application development, high-performance computing, big data, and visualization. We have developed and deployed specific services designed to provide a solid and secure foundation for a wide range of initiatives, including how Sherlock is taking on healthcare fraud.”
Sherlock’s resources are physically located within the SDSC Data Center and, where needed for redundancy, in a secure data center in Northern California. Sherlock Cloud systems interconnect over a 10Gb/s (gigabits per second) network fabric within the SDSC Data Center, and wide-area networking utilizes more than 100Gb/s of high-bandwidth connections to the Internet and to research networks such as Internet2, National LambdaRail (NLR), and the Corporation for Education Network Initiatives in California (CENIC).
The WorDS (‘Workflows for Data Science’) Center, a new center of excellence formed in 2014, leverages more than a decade of experience within SDSC’s Scientific Workflow Automation Technologies Laboratory in developing and validating scientific workflows for researchers involved in computational science, data science, and engineering.
“WorDS is designed to serve those researchers at the intersection of distributed and parallel computing, big data analysis, and reproducible science, while fostering a collaborative working culture,” said Ilkay Altintas, the center’s director. “Our aim is to assist researchers in creating workflows to better manage the tremendous amount of data being generated across a wide range of scientific disciplines, from natural sciences to marketing research, while letting them focus on their specific areas of research instead of having to solve workflow issues or the computational challenges that arise as data analysis progresses from task to task.”
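The idea behind such workflows is that a researcher describes an analysis as a sequence of tasks, each consuming the output of the one before, so the workflow system handles the plumbing between steps. A minimal illustrative sketch in Python is below; the stage names and data are hypothetical and do not represent WorDS Center tools or APIs.

```python
# Illustrative sketch of a task-based data workflow: each stage is a
# self-contained function, and the pipeline chains their outputs.
# Stage names and data are hypothetical, for illustration only.

def extract():
    """Simulate pulling raw records from an instrument or survey."""
    return [3, 1, 4, 1, 5, 9]

def transform(records):
    """Clean the raw records: deduplicate and sort."""
    return sorted(set(records))

def load(clean):
    """Summarize the cleaned data for downstream analysis."""
    return {"count": len(clean), "max": max(clean)}

def run_pipeline():
    # Each stage consumes the previous stage's output, so the
    # researcher reasons about tasks rather than data plumbing.
    return load(transform(extract()))

if __name__ == "__main__":
    print(run_pipeline())  # → {'count': 5, 'max': 9}
```

In a real workflow system the stages would run on distributed resources and their intermediate products would be tracked for reproducibility, but the task-chaining structure is the same.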
Expertise and services offered by the WorDS Center include:
The WorDS Center is funded by a combination of sponsored agreements and recharge services.