
COSIME: Cooperative multi-view integration and Scalable and Interpretable Model Explainer
Single-omics approaches often provide a limited view of complex biological systems, whereas multi-omics integration offers a more comprehensive understanding by combining diverse data views. However, integrating heterogeneous data types and interpreting the intricate relationships between biological features, both within and across different data views, remains a bottleneck. To address these challenges...

MANGEM: A web app for multimodal analysis of neuronal gene expression, electrophysiology, and morphology
Recently, it has become possible to obtain multiple types of data (modalities) from individual neurons, such as which genes are used (gene expression), how a neuron responds to electrical signals (electrophysiology), and what it looks like (morphology). These datasets can be used to group similar neurons together and learn their functions, but the complexity of the data can make this process difficult...

Joint Variational Autoencoders for Multimodal Imputation and Embedding
Single-cell multimodal datasets measure various characteristics of individual cells, enabling a deep understanding of cellular and molecular mechanisms. However, generating multimodal data remains costly and challenging, and modalities are frequently missing. Recently, machine learning approaches have been developed for data imputation, but they typically require fully matched modalities...
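
For context on the general approach, below is a minimal joint-VAE sketch in PyTorch: each modality gets its own encoder and decoder, the two latent spaces are encouraged to align, and a missing modality is imputed by cross-decoding from the measured one. The class names, layer sizes, and loss weights are illustrative assumptions for exposition, not the implementation described in this work.

```python
# Minimal joint-VAE sketch: two modality-specific VAEs tied through a shared
# latent space, with cross-decoding used to impute a missing modality.
# All hyperparameters and the simple alignment penalty are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ModalityVAE(nn.Module):
    """Encoder/decoder pair for a single modality."""
    def __init__(self, in_dim, latent_dim=32, hidden=128):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.mu = nn.Linear(hidden, latent_dim)
        self.logvar = nn.Linear(hidden, latent_dim)
        self.dec = nn.Sequential(nn.Linear(latent_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, in_dim))

    def encode(self, x):
        h = self.enc(x)
        return self.mu(h), self.logvar(h)

    def decode(self, z):
        return self.dec(z)

class JointVAE(nn.Module):
    """Two modality-specific VAEs whose latents are encouraged to align."""
    def __init__(self, dim_a, dim_b, latent_dim=32):
        super().__init__()
        self.vae_a = ModalityVAE(dim_a, latent_dim)
        self.vae_b = ModalityVAE(dim_b, latent_dim)

    @staticmethod
    def reparameterize(mu, logvar):
        return mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)

    def forward(self, x_a, x_b):
        mu_a, lv_a = self.vae_a.encode(x_a)
        mu_b, lv_b = self.vae_b.encode(x_b)
        z_a = self.reparameterize(mu_a, lv_a)
        z_b = self.reparameterize(mu_b, lv_b)
        # Reconstruct each modality from its own latent and from the other
        # modality's latent (the cross-decoding used for imputation).
        recon = {
            "a_from_a": self.vae_a.decode(z_a), "a_from_b": self.vae_a.decode(z_b),
            "b_from_b": self.vae_b.decode(z_b), "b_from_a": self.vae_b.decode(z_a),
        }
        return recon, (mu_a, lv_a), (mu_b, lv_b)

def loss_fn(x_a, x_b, recon, stats_a, stats_b, beta=1e-3, align=1.0):
    """Reconstruction + KL terms plus a simple latent-alignment penalty."""
    rec = (F.mse_loss(recon["a_from_a"], x_a) + F.mse_loss(recon["a_from_b"], x_a)
           + F.mse_loss(recon["b_from_b"], x_b) + F.mse_loss(recon["b_from_a"], x_b))
    kl = 0.0
    for mu, lv in (stats_a, stats_b):
        kl = kl + (-0.5 * torch.mean(1 + lv - mu.pow(2) - lv.exp()))
    latent_gap = F.mse_loss(stats_a[0], stats_b[0])  # align matched cells in latent space
    return rec + beta * kl + align * latent_gap

# Imputation for a cell measured only in modality A: encode with A's encoder,
# then decode with B's decoder (toy dimensions and random data for illustration).
model = JointVAE(dim_a=2000, dim_b=300)
x_a_only = torch.randn(16, 2000)
mu_a, _ = model.vae_a.encode(x_a_only)
x_b_imputed = model.vae_b.decode(mu_a)
```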