Opifex

Scientific Machine Learning Platform

A unified JAX-native platform for scientific machine learning with neural operators, PINNs, and uncertainty quantification.

Repository Coming Soon

This project is under active development and will be open-sourced soon.

Overview

Opifex (Latin for "worker, skilled maker") is a unified scientific machine learning framework built on JAX/Flax NNX. It provides a standardized protocol for PINNs, Neural Operators, and Hybrid solvers, with built-in uncertainty quantification treating all computation as Bayesian inference.

The platform features a probabilistic-first design philosophy. Unlike black-box ML approaches, Opifex models respect conservation laws, symmetries, and other physical constraints inherent to scientific systems. Advanced uncertainty quantification through ensemble methods, conformal prediction, and generative UQ ensures that predictions come with reliable confidence estimates.
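One concrete way the toolbox delivers "reliable confidence estimates" is split conformal prediction, which wraps any point predictor with distribution-free intervals. A minimal NumPy sketch of the general technique (the toy data and predictor here are placeholders, not the Opifex API):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data: y = sin(x) + noise
x = rng.uniform(-3, 3, size=500)
y = np.sin(x) + 0.1 * rng.normal(size=500)

# Any point predictor works; a trivial one stands in here
predict = np.sin

# Hold out a calibration set disjoint from training
x_cal, y_cal = x[:250], y[:250]

# Nonconformity scores: absolute residuals on the calibration set
scores = np.abs(y_cal - predict(x_cal))

# Conformal quantile for 90% marginal coverage
alpha = 0.1
n = len(scores)
q = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")

# Prediction interval for a new point: [f(x) - q, f(x) + q]
x_new = 1.5
lo, hi = predict(x_new) - q, predict(x_new) + q
```

The interval width `q` is calibrated so that, under exchangeability, the true value falls inside the band with probability at least 1 - alpha, regardless of the underlying model.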

Opifex provides seamless integration with Artifex, enabling researchers to leverage top-tier generative models (Diffusion, Flows) for scientific applications through the unified solver adapter. This bridge between generative modeling and scientific computing opens new approaches to problems in physics-informed generation and probabilistic scientific simulation.

The framework includes Fourier Neural Operators (FNO) and DeepONet for learning solution operators of PDEs, standard and advanced PINN variants for incorporating physical laws, neural DFT for quantum chemistry, and learn-to-optimize capabilities for meta-optimization. All components follow a unified solver protocol with standardized interfaces for consistent experimentation.
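The unified solver protocol is not yet public, so purely as an illustration of what a standardized interface across PINNs, operators, and hybrid solvers could look like, here is a hypothetical Python `Protocol` (all names are assumptions, not the actual Opifex API):

```python
from typing import Any, Protocol, runtime_checkable


@runtime_checkable
class Solver(Protocol):
    """Hypothetical unified interface; the real Opifex protocol may differ."""

    def fit(self, data: Any, **kwargs: Any) -> "Solver": ...
    def predict(self, inputs: Any) -> Any: ...
    def predict_with_uncertainty(self, inputs: Any) -> tuple[Any, Any]: ...


class MeanSolver:
    """Trivial conforming 'solver': predicts the training mean everywhere."""

    def fit(self, data: Any, **kwargs: Any) -> "MeanSolver":
        self.mean = sum(data) / len(data)
        return self

    def predict(self, inputs: Any) -> float:
        return self.mean

    def predict_with_uncertainty(self, inputs: Any) -> tuple[float, float]:
        # Uncertainty channel built into the interface, per the Bayesian framing
        return self.mean, 0.0


solver = MeanSolver().fit([1.0, 2.0, 3.0])
```

Structural typing lets any model class satisfy the protocol without inheriting from a base class, which is what makes "consistent experimentation" across solver families possible.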

Key Features

Neural Operators

Fourier Neural Operators (FNO), DeepONet, and specialized variants for learning solution operators of PDEs at scale.
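The core of an FNO layer is a convolution applied in Fourier space with only the lowest `modes` frequencies retained. A minimal 1-D sketch of that idea in plain JAX (a simplification for illustration, not Opifex's implementation):

```python
import jax
import jax.numpy as jnp


def spectral_conv_1d(x, weights, modes):
    """Mix channels on the lowest Fourier modes of x with complex weights.

    x:       (batch, channels, n) real signal
    weights: (channels, channels, modes) complex weights
    """
    x_ft = jnp.fft.rfft(x, axis=-1)                 # to frequency space
    out_ft = jnp.zeros_like(x_ft)
    # Keep only the lowest `modes` frequencies; zero out the rest
    mixed = jnp.einsum("bci,coi->boi", x_ft[..., :modes], weights)
    out_ft = out_ft.at[..., :modes].set(mixed)
    return jnp.fft.irfft(out_ft, n=x.shape[-1], axis=-1)  # back to physical space


key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (4, 2, 64))
w = jax.random.normal(key, (2, 2, 12)) + 0j
y = spectral_conv_1d(x, w, modes=12)   # shape preserved: (4, 2, 64)
```

Because the weights act on frequencies rather than grid points, the same layer can be evaluated on different discretizations, which is what lets neural operators learn resolution-independent solution maps.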

Physics-Informed Neural Networks

Standard PINNs and advanced variants with multi-physics composition, incorporating physical laws directly into the learning process.
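To make "incorporating physical laws into the learning process" concrete, here is a minimal physics-informed loss for the ODE u'(x) = -u(x) with u(0) = 1 (solution e^{-x}), written in plain JAX rather than the Opifex API:

```python
import jax
import jax.numpy as jnp


def model(params, x):
    """Tiny MLP surrogate u_theta(x)."""
    w1, b1, w2, b2 = params
    h = jnp.tanh(x * w1 + b1)
    return jnp.dot(h, w2) + b2


def pinn_loss(params, xs):
    """Mean squared residual of u' + u = 0 plus the condition u(0) = 1."""
    u = lambda x: model(params, x)
    du = jax.vmap(jax.grad(u))(xs)       # exact u'(x) at collocation points
    residual = du + jax.vmap(u)(xs)      # ODE residual u' + u
    ic = u(0.0) - 1.0                    # initial-condition mismatch
    return jnp.mean(residual**2) + ic**2


key = jax.random.PRNGKey(0)
k1, k2 = jax.random.split(key)
params = (
    jax.random.normal(k1, (16,)) * 0.5,  # w1
    jnp.zeros(16),                       # b1
    jax.random.normal(k2, (16,)) * 0.5,  # w2
    jnp.array(0.0),                      # b2
)
xs = jnp.linspace(0.0, 2.0, 32)
loss = pinn_loss(params, xs)             # scalar; differentiable via jax.grad
```

The physics enters through `jax.grad` inside the loss: no labeled solution data is needed, only collocation points where the governing equation is enforced.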

Neural Density Functional Theory

ML-accelerated DFT for quantum chemistry calculations, enabling faster molecular property predictions at chemical accuracy.

Uncertainty Quantification

Ensemble methods, Conformal Prediction, and Generative UQ treating all computation as Bayesian inference for reliable scientific predictions.
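The ensemble side of this toolbox can be sketched generically: train several independently initialized models and use the spread of their predictions as an epistemic-uncertainty estimate. An illustrative NumPy version (the stand-in models are placeholders, not the Opifex API):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for K independently trained ensemble members:
# linear models with slightly perturbed coefficients
members = [np.poly1d(rng.normal(1.0, 0.05, size=2)) for _ in range(5)]

x = np.linspace(0, 1, 10)
preds = np.stack([m(x) for m in members])   # (K, n) per-member predictions

mean = preds.mean(axis=0)   # ensemble point prediction
std = preds.std(axis=0)     # disagreement as an uncertainty proxy
```

Where the members agree, `std` is small; where they diverge (typically away from the training data), it grows, flagging predictions that should not be trusted.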

Artifex Integration

Seamless bridge to Artifex generative models (Diffusion, Flows) through the unified solver adapter for generative scientific modeling.

JAX-Native Performance

Built entirely on JAX/Flax NNX with full JIT compilation, automatic differentiation, and GPU/TPU acceleration.
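Everything in the stack composes with JAX transformations; for example, an objective can be JIT-compiled and differentiated end to end (generic JAX, not Opifex-specific):

```python
import jax
import jax.numpy as jnp


@jax.jit
def energy(theta, x):
    """Toy quadratic objective, compiled to XLA on first call."""
    return jnp.sum((x - theta) ** 2)


# Autodiff gives exact gradients through the compiled function
grad_energy = jax.jit(jax.grad(energy))

x = jnp.arange(4.0)
g = grad_energy(jnp.array(1.0), x)  # d/dtheta sum((x - theta)^2) = -2*sum(x - theta)
```

The same `jax.jit`/`jax.grad`/`jax.vmap` machinery applies unchanged on GPU and TPU backends, which is what "JAX-native" buys over wrapping an external solver.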

Use Cases

1. Protein folding dynamics simulation at molecular resolution
2. Reaction-diffusion systems for cellular pattern formation
3. Drug-target binding affinity prediction with uncertainty estimates
4. Metabolic flux analysis and optimization
5. Quantum chemical property prediction for drug design
6. Population dynamics in synthetic ecosystems
7. Tissue mechanics and morphogenesis simulation
8. Learn-to-optimize meta-optimization for scientific problems

Installation

# Clone the repository
git clone https://github.com/avitai/opifex.git
cd opifex

# Set up development environment
./setup.sh

# Activate environment
source ./activate.sh

# Run tests to verify installation
uv run pytest tests/ -v

Quick Start

import jax
from flax import nnx
from opifex.neural.operators.fno import FourierNeuralOperator
from opifex.neural.operators.deeponet import DeepONet

# Create FNO for learning PDE solution operators
rngs = nnx.Rngs(jax.random.PRNGKey(0))
fno = FourierNeuralOperator(
    in_channels=1, out_channels=1,
    hidden_channels=32, modes=12, num_layers=4,
    rngs=rngs,
)

x = jax.random.normal(jax.random.PRNGKey(1), (4, 1, 64, 64))
y = fno(x)  # (4, 1, 64, 64) -> (4, 1, 64, 64)

# Create DeepONet for function-to-function mapping
deeponet = DeepONet(
    branch_sizes=[100, 64, 64, 32],
    trunk_sizes=[2, 64, 64, 32],
    activation="gelu", rngs=rngs,
)

Built With

JAX · Flax NNX · Optax · SciPy · NumPy · Diffrax · Equinox · Orbax

Ready to Get Started?

Explore the documentation, try examples, or contribute to the project.