Quantitative Biology

All posts under category "Quantitative Biology"

6 posts total
Sorted by date
Biologically Inspired LGN-CNN Architecture Mimics Lateral Geniculate Nucleus Functionality

In this paper we introduce a biologically inspired Convolutional Neural Network (CNN) architecture, called LGN-CNN, whose first convolutional layer is composed of a single filter that mimics the role of the Lateral Geniculate Nucleus (LGN). The first layer of the network exhibits a rotationally symmetric pattern, justified by the structure of the net itself, which turns out to be an approximation of a Laplacian of Gaussian (LoG). The latter function is in turn a good approximation of the receptive field profiles (RFPs) of cells in the LGN. The analogy with the visual system is thus established, emerging directly from the architecture of the neural network. A proof of rotation invariance of the first layer is given for a fixed LGN-CNN architecture, and the computational results are shown. The contrast-invariance capability of the LGN-CNN is then investigated, and a comparison between the Retinex effects of the first layer of the LGN-CNN and the Retinex effects of a LoG is provided on different images. A statistical study is carried out on the filters of the second convolutional layer with respect to biological data. In conclusion, the model we have introduced approximates well the RFPs of both the LGN and V1, and attains behavior similar to the long-range connections of LGN cells that exhibit Retinex effects.
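As a rough illustration of the idea in this abstract, the sketch below builds a Laplacian-of-Gaussian kernel and installs it as a single-filter first convolutional layer. This is not the paper's code; PyTorch is assumed, and the kernel size, sigma, and zero-mean normalization are illustrative choices.

```python
import numpy as np
import torch
import torch.nn as nn

def log_kernel(size=7, sigma=1.0):
    """Laplacian-of-Gaussian kernel; size and sigma are illustrative choices."""
    ax = np.arange(size) - (size - 1) / 2.0
    xx, yy = np.meshgrid(ax, ax)
    r2 = xx ** 2 + yy ** 2
    k = -1.0 / (np.pi * sigma ** 4) * (1 - r2 / (2 * sigma ** 2)) * np.exp(-r2 / (2 * sigma ** 2))
    return k - k.mean()  # zero-mean, roughly center-surround (an assumption, not the paper's choice)

class LGNFirstLayer(nn.Module):
    """Single-filter first convolutional layer mimicking an LGN-like receptive field."""
    def __init__(self, size=7, sigma=1.0):
        super().__init__()
        self.conv = nn.Conv2d(1, 1, kernel_size=size, padding=size // 2, bias=False)
        with torch.no_grad():
            weight = torch.tensor(log_kernel(size, sigma), dtype=torch.float32)
            self.conv.weight.copy_(weight.view(1, 1, size, size))

    def forward(self, x):
        # x: (batch, 1, H, W) grayscale images
        return self.conv(x)
```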

paper research
Using Engineered Neurons in Digital Logic Circuits: A Molecular Communications Analysis

With the advancement of synthetic biology, several new tools have been conceptualized over the years as alternatives to current medical procedures, most of them targeting chronic diseases. This work investigates how synthetically engineered neurons can operate as digital logic gates that can be used for bio-computing in the brain. We quantify the accuracy of logic gates under high firing rates amid a network of neurons, and how well they can smooth out uncontrolled neuronal firing. To test the efficacy of our method, we perform simulations of computational neuron models connected in structures that represent logic gates. The simulations demonstrate the accuracy of performing the correct logic operation and show how specific properties, such as the firing rate, play an important role in that accuracy. As part of the analysis, the mean squared error is used to quantify the quality of the proposed model and to predict the accurate operation of a gate under different sampling frequencies. As an application, the logic gates were used to trap epileptic seizures in a neuronal network, where the results demonstrated their effectiveness in reducing the firing rate. Our proposed system has the potential to support computing for numerous neurological conditions of the brain.
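The abstract does not specify the underlying neuron model, so the following is only a toy rate-threshold sketch of how a gate's output could be scored with a mean squared error; all rates and thresholds are made up for illustration.

```python
import numpy as np

def rate_coded_and(rate_a, rate_b, threshold=40.0):
    """Toy AND gate: the output 'neuron' fires (1) only if both input
    firing rates exceed a threshold (values in Hz are illustrative)."""
    return int(rate_a > threshold and rate_b > threshold)

def gate_mse(expected, observed):
    """Mean squared error between expected and observed gate outputs."""
    expected = np.asarray(expected, dtype=float)
    observed = np.asarray(observed, dtype=float)
    return float(np.mean((expected - observed) ** 2))

# Evaluate the toy gate over the four input combinations of an AND truth table.
inputs = [(10, 10), (10, 60), (60, 10), (60, 60)]
expected = [0, 0, 0, 1]
observed = [rate_coded_and(a, b) for a, b in inputs]
print("Gate MSE:", gate_mse(expected, observed))  # 0.0 for a perfect gate
```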

paper research
Comparative Analysis of Formula and Structure Prediction from Tandem Mass Spectra

Liquid chromatography mass spectrometry (LC-MS)-based metabolomics and exposomics aim to measure detectable small molecules in biological samples. The results facilitate hypothesis-generating discovery of metabolic changes and disease mechanisms and provide information about environmental exposures and their effects on human health. Metabolomics and exposomics are made possible by the high resolving power of LC and high mass measurement accuracy of MS. However, a majority of the signals from such studies still cannot be identified or annotated using conventional library searching because existing spectral libraries are far from covering the vast chemical space captured by LC-MS/MS. To address this challenge and unleash the full potential of metabolomics and exposomics, a number of computational approaches have been developed to predict compounds based on tandem mass spectra. Published assessments of these approaches have used different datasets and evaluation criteria. To select prediction workflows for practical applications and identify areas for further improvement, we have carried out a systematic evaluation of the state-of-the-art prediction algorithms. Specifically, the accuracy of formula prediction and structure prediction was evaluated for different types of adducts. The resulting findings have established realistic performance baselines, identified critical bottlenecks, and provided guidance to further improve compound predictions based on MS.
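For a concrete picture of the kind of evaluation described, the snippet below sketches one way to compute per-adduct top-1 formula accuracy. The record fields ('adduct', 'true_formula', 'predicted_formula') are hypothetical and not taken from any of the evaluated tools.

```python
from collections import defaultdict

def accuracy_by_adduct(records):
    """Per-adduct top-1 accuracy of formula (or structure) predictions.
    `records` is an iterable of dicts with hypothetical keys
    'adduct', 'true_formula', and 'predicted_formula'."""
    hits, totals = defaultdict(int), defaultdict(int)
    for r in records:
        totals[r["adduct"]] += 1
        if r["predicted_formula"] == r["true_formula"]:
            hits[r["adduct"]] += 1
    return {adduct: hits[adduct] / totals[adduct] for adduct in totals}

# Example with made-up records for two adduct types.
records = [
    {"adduct": "[M+H]+", "true_formula": "C6H12O6", "predicted_formula": "C6H12O6"},
    {"adduct": "[M+Na]+", "true_formula": "C9H11NO2", "predicted_formula": "C9H13NO2"},
]
print(accuracy_by_adduct(records))
```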

paper research
Integrating Modalities: Benchmarking Single-Cell Genomics Methods

Single-cell data analysis has the potential to revolutionize personalized medicine by characterizing disease-associated molecular changes at the single-cell level. Advanced single-cell multimodal assays can now simultaneously measure various molecules (e.g., DNA, RNA, protein) across hundreds of thousands of individual cells, providing a comprehensive molecular readout. A significant analytical challenge is integrating single-cell measurements across different modalities. Various methods have been developed to address this challenge, but there has been no systematic evaluation of these techniques with different preprocessing strategies. This study examines a general pipeline for single-cell data analysis, which includes normalization, data integration, and dimensionality reduction. The performance of different algorithm combinations often depends on dataset size and characteristics. We evaluate six datasets across diverse modalities, tissues, and organisms using three metrics: the Silhouette Coefficient, the Adjusted Rand Index, and the Calinski-Harabasz Index. Our experiments involve combinations of seven normalization methods, four dimensionality reduction methods, and five integration methods. The results show that Seurat and Harmony excel in data integration, with Harmony being more time-efficient, especially for large datasets. UMAP is the most compatible dimensionality reduction method with the integration techniques, and the choice of normalization method varies depending on the integration method used.
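A minimal sketch of the kind of pipeline evaluated here (normalization, Harmony integration on PCA embeddings, UMAP, and the three clustering metrics), assuming scanpy with harmonypy installed; the file path and annotation keys are placeholders, not details from the study.

```python
import scanpy as sc
from sklearn.metrics import (adjusted_rand_score, calinski_harabasz_score,
                             silhouette_score)

# Hypothetical AnnData with a 'batch' column marking samples/modalities to integrate.
adata = sc.read_h5ad("multimodal.h5ad")  # placeholder path

# Normalization (one of several strategies compared in the study).
sc.pp.normalize_total(adata, target_sum=1e4)
sc.pp.log1p(adata)

# Harmony integration on PCA embeddings, then UMAP and Leiden clustering.
sc.pp.pca(adata, n_comps=50)
sc.external.pp.harmony_integrate(adata, key="batch")
sc.pp.neighbors(adata, use_rep="X_pca_harmony")
sc.tl.umap(adata)
sc.tl.leiden(adata)

# The three evaluation metrics named in the abstract.
emb = adata.obsm["X_umap"]
print("Silhouette:", silhouette_score(emb, adata.obs["leiden"]))
print("Calinski-Harabasz:", calinski_harabasz_score(emb, adata.obs["leiden"]))
if "cell_type" in adata.obs:  # ARI needs ground-truth labels if available
    print("ARI:", adjusted_rand_score(adata.obs["cell_type"], adata.obs["leiden"]))
```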

paper research
MethConvTransformer: Early AD Detection Through DNA Methylation

Alzheimer's disease (AD) is a multifactorial neurodegenerative disorder characterized by progressive cognitive decline and widespread epigenetic dysregulation in the brain. DNA methylation, as a stable yet dynamic epigenetic modification, holds promise as a noninvasive biomarker for early AD detection. However, methylation signatures vary substantially across tissues and studies, limiting reproducibility and translational utility. To address these challenges, we develop MethConvTransformer, a transformer-based deep learning framework that integrates DNA methylation profiles from both brain and peripheral tissues to enable biomarker discovery. The model couples a CpG-wise linear projection with convolutional and self-attention layers to capture local and long-range dependencies among CpG sites, while incorporating subject-level covariates and tissue embeddings to disentangle shared and region-specific methylation effects. In experiments across six GEO datasets and an independent ADNI validation cohort, our model consistently outperforms conventional machine-learning baselines, achieving superior discrimination and generalization. Moreover, interpretability analyses using linear projection, SHAP, and Grad-CAM++ reveal biologically meaningful methylation patterns aligned with AD-associated pathways, including immune receptor signaling, glycosylation, lipid metabolism, and endomembrane (ER/Golgi) organization. Together, these results indicate that MethConvTransformer delivers robust, cross-tissue epigenetic biomarkers for AD while providing multi-resolution interpretability, thereby advancing reproducible methylation-based diagnostics and offering testable hypotheses on disease mechanisms.
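The sketch below is a loose reading of the described architecture (CpG-wise linear projection, a convolutional block, self-attention, and a tissue embedding) in PyTorch; all layer sizes are guesses, and subject-level covariates are omitted for brevity.

```python
import torch
import torch.nn as nn

class MethConvTransformerSketch(nn.Module):
    """Rough sketch, not the authors' model: CpG-wise projection, a Conv1d block
    for local context, self-attention for long-range dependencies among CpG
    sites, and a tissue embedding added to every position."""
    def __init__(self, d_model=64, n_heads=4, n_tissues=10, n_classes=2):
        super().__init__()
        self.proj = nn.Linear(1, d_model)                    # CpG-wise linear projection
        self.conv = nn.Conv1d(d_model, d_model, kernel_size=5, padding=2)
        self.tissue_emb = nn.Embedding(n_tissues, d_model)   # tissue embedding
        enc_layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.attn = nn.TransformerEncoder(enc_layer, num_layers=2)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, beta, tissue_id):
        # beta: (batch, n_cpg) methylation beta values; tissue_id: (batch,)
        x = self.proj(beta.unsqueeze(-1))                    # (batch, n_cpg, d_model)
        x = self.conv(x.transpose(1, 2)).transpose(1, 2)     # local dependencies
        x = x + self.tissue_emb(tissue_id).unsqueeze(1)      # inject tissue context
        x = self.attn(x)                                     # long-range dependencies
        return self.head(x.mean(dim=1))                      # pooled AD/control logits
```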

paper research
SymSeqBench: a unified framework for the generation and analysis of rule-based symbolic sequences and datasets

Sequential structure is a key feature of multiple domains of natural cognition and behavior, such as language, movement and decision-making. Likewise, it is a central property of tasks to which we would like to apply artificial intelligence. It is therefore of great importance to develop frameworks that allow us to evaluate sequence learning and processing in a domain-agnostic fashion, whilst simultaneously providing a link to formal theories of computation and computability. To address this need, we introduce two complementary software tools: SymSeq, designed to rigorously generate and analyze structured symbolic sequences, and SeqBench, a comprehensive benchmark suite of rule-based sequence processing tasks for evaluating the performance of artificial learning systems in cognitively relevant domains. In combination, SymSeqBench offers versatility in investigating sequential structure across diverse knowledge domains, including experimental psycholinguistics, cognitive psychology, behavioral analysis, neuromorphic computing and artificial intelligence. Due to its basis in Formal Language Theory (FLT), SymSeqBench provides researchers in multiple domains with a convenient and practical way to apply the concepts of FLT to conceptualize and standardize their experiments, thus advancing our understanding of cognition and behavior through shared computational frameworks and formalisms. The tool is modular, openly available and accessible to the research community.
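For a flavor of what rule-based symbolic sequence generation means in Formal Language Theory terms, here is a toy grammar-driven generator for the context-free language a^n b^n; it is not SymSeqBench's actual API, whose interface the abstract does not describe.

```python
import random

# Toy production rules for the context-free language a^n b^n:
# S -> a S b | (empty)
GRAMMAR = {"S": [["a", "S", "b"], []]}

def generate(symbol="S", max_depth=6, depth=0):
    """Expand a grammar symbol into a list of terminal symbols."""
    if symbol not in GRAMMAR:                  # terminal symbol
        return [symbol]
    rules = GRAMMAR[symbol]
    # Force the terminating (empty) production once max_depth is reached.
    rule = rules[-1] if depth >= max_depth else random.choice(rules)
    out = []
    for s in rule:
        out.extend(generate(s, max_depth, depth + 1))
    return out

print("".join(generate()))  # e.g. "aaabbb" (possibly the empty string)
```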

paper research

