Multimodal Synthetic Data for Deep Neural Networks (MSynD) workshop
Members of the Imaging and Computer Vision group are organising the Multimodal Synthetic Data for Deep Neural Networks (MSynD) workshop at the International Joint Conference on Neural Networks (IJCNN). IJCNN is a premier international conference on neural network theory, analysis, and applications, and this year, 2023, it will be held in Gold Coast, Queensland, Australia. There is significant overlap between our group's research focus and IJCNN's theme of developing deep neural network models and applications. At this event, we want to discuss current trends in using multimodal synthetic data for deep neural networks with the world-leading experts attending IJCNN and our workshop.
Who is this workshop for?
The Multimodal Synthetic Data for Deep Neural Networks workshop focuses on the generation and use of synthetic data to train, evaluate, and deploy deep neural networks. We aim to bring together researchers, practitioners, and industry experts in artificial intelligence (AI) and machine learning to discuss the latest advancements, opportunities, and challenges in using synthetic data to train deep neural networks across applications such as computer vision, natural language processing, speech recognition, robotics, healthcare, and more.
Our goal is to provide a platform for researchers and practitioners to exchange ideas, share their research findings, and discuss best practices and ethical considerations in using multimodal synthetic data. The workshop is designed for individuals who are interested in exploring and advancing the use of synthetic data for training deep neural networks across multiple modalities, including audio, video, text, and sensor data. Specifically, this workshop is intended for:
- Researchers and Academics: Researchers and academics in the fields of artificial intelligence, machine learning, computer vision, natural language processing, speech recognition, robotics, healthcare, and related disciplines who are interested in exploring and advancing the use of synthetic data for deep neural networks.
- Industry Professionals: Industry professionals, practitioners, and data scientists working in AI-related industries who are interested in incorporating synthetic data into their deep neural network training pipelines.
- Data Scientists and Engineers: Data scientists, engineers, and practitioners who are involved in data generation, data augmentation, and data pre-processing for deep neural network training, and are interested in learning about state-of-the-art techniques and best practices in using synthetic data.
- Graduate Students and Early Career Researchers: Graduate students, early career researchers, and individuals pursuing research or careers in AI-related fields, who are interested in gaining insights into the latest advancements and challenges in using synthetic data for deep neural networks.
- Ethicists and Legal Experts: Ethicists, legal experts, and professionals involved in the ethical implications and legal aspects of artificial intelligence, machine learning, and data generation, who are interested in understanding the ethical considerations and responsible use of synthetic data in deep neural network training.
When and Where?
When: Friday, 23 June 2023
Where: Gold Coast Convention and Exhibition Centre, Gold Coast, QLD, Australia
Host conference: International Joint Conference on Neural Networks (IJCNN)
Call for Abstracts
We are delighted to invite one-page extended abstracts to the MSynD workshop. Topics of interest for this workshop include, but are not limited to:
- Synthetic data generation techniques (audio, video, text, sensor data) for various applications
- Combining modalities in synthetic data generation
- Novel approaches for using synthetic data
- Evaluation and benchmarking of deep learning models trained on synthetic data
- Best practices, guidelines, and ethics in using synthetic data
- Case studies and real-world applications of synthetic data
- Strategies for domain adaptation and transfer learning with synthetic data
- Data augmentation techniques using multimodal synthetic data for improving the robustness and generalization of deep learning models
- Limitations, challenges, and future directions of using multimodal synthetic data for deep neural networks
- Submission website via CMT: https://cmt3.research.microsoft.com/MSYND2023. If you are new to CMT, please have a look at their how-to guide for authors.
- Submission template: Author guidelines for MSYND 2023 1-page abstract
- The authors should strictly follow the one-page extended abstract template. Any manuscripts exceeding the one-page limit may be rejected.
- The MSynD workshop will use a single-blind review process for one-page abstracts.
- Abstracts will be reviewed for suitability by relevant experts from the organising and review committees.
- Accepted abstracts will be invited for poster presentation at the workshop.
- Poster formatting instructions will be provided with the notification of acceptance.
| Milestone | Details |
|---|---|
| Workshop Registration | Open now to everyone |
| Abstract Submission Deadline | Will be announced soon |
| Acceptance Notification | 29 May 2023, 23:59 AEDT |
| Poster Submission Deadline | 9 June 2023, 23:59 AEDT |
| Workshop Date | 23 June 2023, Time: TBD |