ORNL/Sci_PPFL_FM
Privacy-Preserving Federated Learning for Science: Building Sustainable and Trustworthy Foundation Models

This repository is part of a U.S. Department of Energy (DOE) Advanced Scientific Computing Research (ASCR) project focused on developing privacy-preserving federated learning approaches for scientific foundation models. The project aims to advance secure, trustworthy, and sustainable machine learning methodologies for large-scale scientific applications.

ORNL Principal Investigator: Olivera Kotevska, PhD.

Project Duration: 2024-2027


Awards

PRESTO: Privacy Recommendation and Security Optimization - R&D 100 Award Winner 2025
PRESTO is an innovative framework designed to provide privacy recommendations and security optimization for machine learning systems. This recognition highlights the project's significant contribution to advancing privacy-preserving technologies in scientific computing.
Learn more about PRESTO


Publications

Under review

  1. Automated Membership Inference Attacks (MIA): Discovering MIA Signal Computations using Large Language Model (LLM) Agents
    Link

  2. SelfGrader: Stable Jailbreak Detection for Large Language Models using Token-Level Logits
    Link

Accepted

  1. XMark: Reliable Multi-Bit Watermarking for LLM-Generated Texts
    The 64th Annual Meeting of the Association for Computational Linguistics
    Link | Code

  2. Scalable Federated Learning for Scientific Foundation Models on Leadership-Class Systems
    The 6th Workshop on Machine Learning and Systems (EuroMLSys) co-located with EuroSys '26

  3. Traceable Black-box Watermarks for Federated Learning
    The Fourteenth International Conference on Learning Representations (ICLR) 2026
    Link | Code

  4. Energy-Efficiency Metrics for Privacy-Preserving Federated Learning with SmartNIC Server Acceleration
    The Sixteenth International Workshop on Accelerators and Hybrid Emerging Systems, co-located with the 40th IEEE International Parallel and Distributed Processing Symposium (IPDPS)

  5. Selective Amnesia using Contrastive Subnet Erasure for Class Level Unlearning in Vision Models
    The IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 2026
    Code

  6. DP-TwoLevel: Entropy-weighted multi-layer attention for token-level attribution in autoregressive language models
    SPIE Conference on Assurance and Security for AI-enabled Systems 2026

Published

  1. Engineering Privacy at the Edge: A Practical Guide to Differential Privacy in System Architectures
    The 43rd IEEE International Conference on Computer Design (ICCD 2025)
    Link | Code

  2. Privacy-Preserving Federated Learning for Science: Challenges and Research Directions
    The 13th IEEE International Conference on Big Data (IEEE BigData 2025)
    Link

  3. Balancing Trade-offs: Adaptive Differential Privacy in Interpretable Machine Learning Models
    22nd Annual International Conference on Privacy, Security, and Trust (PST2025)
    Link

  4. Optimal Client Sampling in Federated Learning with Client-level Heterogeneous Differential Privacy
    IEEE Internet of Things Journal
    Link | Code

  5. MIC-DP: A Scalable Correlation-Aware Differential Privacy Framework for High-Dimensional Data
    IEEE Transactions on Privacy Journal
    Link | Code

  6. Privacy Preservation from High-Performance Computing to Autonomous Science [Industrial and Governmental Activities]
    IEEE Computational Intelligence Magazine
    Link

  7. OmniFed: A Modular Framework for Configurable Federated Learning from Edge to HPC
    ExHedtAI: The Workshop on Extreme Heterogeneity and AI Convergence in HPC, held at the 2025 International Conference for High Performance Computing, Networking, Storage and Analysis (SC'25)
    Link | Code
