Welcome to DBTest

About DBTest

Over the last few years, we have seen an increase in academic and industry work on benchmarking and performance evaluation in novel cloud settings, driven both by a growing diversity of workloads and by new hardware such as FPGAs and GPUs. Moreover, new classes of data-driven applications (e.g., machine learning and big data scenarios) now need to be considered, while at the same time SQL and NoSQL engines alike are evolving and new engine concepts, such as unified engines, are emerging. Consequently, special testing efforts and rigor are crucial to ensure that these novel system architectures and designs retain classical database strengths such as reliability, integrity, and performance.

The goal of DBTest 2024 is to bring researchers and practitioners from academia and industry together to discuss challenges, approaches and open problems around these issues.

Topics Of Interest

  • Reproducibility of database research (new!)
  • Testing and benchmarking of learning-based database systems (new!)
  • Testing of database systems, storage services, and database applications
  • Testing of database systems using novel hardware and software technology (non-volatile memory, hardware transactional memory, …)
  • Testing heterogeneous systems with hardware accelerators (GPUs, FPGAs, ASICs, …)
  • Testing distributed and big data systems
  • Testing machine learning systems
  • Specific challenges of testing and quality assurance for cloud-based systems
  • War stories and lessons learned
  • Performance and scalability testing
  • Testing the reliability and availability of database systems
  • Algorithms and techniques for automatic program verification
  • Maximizing code coverage during testing of database systems and applications
  • Generation of synthetic data for test databases
  • Testing the effectiveness of adaptive policies and components
  • Tools for analyzing database management systems (e.g., profilers, debuggers)
  • Workload characterization with respect to performance metrics and engine components
  • Metrics for test quality, robustness, efficiency, and effectiveness
  • Operational aspects such as continuous integration and delivery pipelines
  • Security and vulnerability testing
  • Experimental reproduction of benchmark results
  • Functional and performance testing of interactive data exploration systems
  • Traceability, reproducibility, and reasoning for ML-based systems

Keynote Speakers

Everett (Ev) Maus
Staff Software Engineer, Google

Ev has a passion for teaching computers to find bugs – ideally without making other developers too unhappy.

He has been on the Spanner Engineering Productivity team at Google since 2018, where he's building tooling to sustainably ensure the correctness and reliability of Spanner. He was an invited industry attendee at the 2023 Dagstuhl Seminar on Ensuring the Reliability and Robustness of Database Management Systems.

Prior to Google, he worked at Microsoft from 2014 to 2018 on scaling static analysis and security tooling. While there, he contributed to the SARIF standard for tooling results and spoke at BlueHat about building security tooling.

When he isn’t trying to teach computers to break software (usually other people’s), he collects fountain pens, mixes a decent manhattan, plays violin, and recently adopted a puppy.

He graduated from the University of Virginia with a BA in Computer Science and Mathematics in December 2013.

Jesús Camacho Rodríguez
Principal Research SDE Manager, Gray Systems Lab (GSL)

Jesús is a Principal Research SDE Manager at the Gray Systems Lab (GSL), the applied research group within Azure Data. His research focuses broadly on optimizing data systems performance and efficiency, with close collaboration with product and engineering teams.

Before joining Microsoft, Jesús held various engineering positions at Cloudera, where he worked on query processing and optimization in Cloudera’s SQL data warehouse engines. He has also been actively involved with open-source projects such as Apache Calcite and Apache Hive.

Jesús earned his PhD from the University of Paris-Sud and Inria, France, and holds a degree in Computer Science and Engineering from the University of Almería, Spain.



Call for Contributions

Research or Experience Papers

Authors are invited to submit original, unpublished research papers, or experience papers describing the implementation of testing-related solutions in applications and products, on any of DBTest's topics of interest listed above. Submissions must not be under consideration for publication in any other forum.

Submissions should be 4 to 6 pages long, excluding references and appendix.

Accepted research submissions will be published in the ACM DL.

Talks

Authors can submit a talk proposal covering either previously published but relevant content or new ideas within the scope of DBTest's topics of interest on which they would like feedback from the community.

The submission should be 1 page long, including references and appendix. Talk proposals must be marked with the suffix [Talk Proposal] in the submission's title.

Accepted talk proposals will be listed on the DBTest homepage.



Guidelines

Authors are required to follow the current ACM Proceedings Format.

The submission will be handled through HotCRP.


Timeline

March 15**, 2024 / 11:59PM US PST

Paper Submission

April 11, 2024 / 11:59PM US PST

Notification Of Outcome

April 25, 2024 / 11:59PM US PST

Camera-Ready Copy
** Extended from the original deadline of March 8, 2024.

Organization

Program Committee

Adam Dickinson (Snowflake)
Amarnadh Sai Eluri (Google)
Anupam Sanghi (IBM Research)
Brian Kroth (Microsoft GSL)
Danica Porobic (Oracle)
Daniel Ritter (SAP)
Jinsheng Ba (NUS)
Renata Borovica-Gajic (University of Melbourne)
Russell Sears (CrystalDB)
Stefania Dumbrava (ENSIIE)
Wensheng Dou (University of Chinese Academy of Sciences)
Xiu Tang (Zhejiang University)
Yannis Chronis (Google Research)
Zuming Jiang (ETH)

Workshop Co-Chairs

Anja Gruenheid

Microsoft, Switzerland

Manuel Rigger

National University of Singapore

Steering Committee

Carsten Binnig

TU Darmstadt, Germany

Alexander Böhm

Google, Germany

Tilmann Rabl

TU Berlin, Germany