# CECE Developer Guide

## Project Overview
CECE (Community Emissions Computing Engine) is a C++20 emissions compute component designed for high performance using Kokkos and ESMF.
## Core Architecture & Philosophy
CECE is designed as a modular, performance-portable emissions framework. Key components include:
- StackingEngine: Manages the aggregation of emission layers using hierarchy-based processing, kernel fusion optimization, and temporal/spatial scaling. See the Stacking Engine Documentation for comprehensive technical details.
- Vertical Distribution: Multiple algorithms for mapping 2D emissions to 3D atmospheric grids with strict mass conservation. See the Vertical Distribution Documentation for complete algorithm descriptions and usage examples.
- PhysicsFactory: A self-registration registry for physics schemes. New schemes should inherit from `BasePhysicsScheme` and use the `PhysicsRegistration<T>` pattern.
- Internal State: Persisted via `CeceInternalData` to maintain field handles and metadata across ESMF phases, avoiding redundant lookups.
- TIDE Integration: High-performance data ingestion for external emission inventories via the TIDE (Temporal Interpolation & Data Extraction) library with smart caching and regridding capabilities.
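The self-registration pattern behind `PhysicsFactory` can be sketched as follows. The class and template names (`BasePhysicsScheme`, `PhysicsRegistration<T>`) come from this guide, but the implementation details (registry storage, factory signature, the `SeaSaltScheme` example) are illustrative assumptions, not CECE's actual code:

```cpp
#include <functional>
#include <map>
#include <memory>
#include <string>

// Minimal stand-in for the scheme interface described in the guide.
class BasePhysicsScheme {
 public:
  virtual ~BasePhysicsScheme() = default;
  virtual std::string Name() const = 0;
};

// Hypothetical registry: maps scheme names to factory functions.
class PhysicsFactory {
 public:
  using Creator = std::function<std::unique_ptr<BasePhysicsScheme>()>;

  static PhysicsFactory& Instance() {
    static PhysicsFactory factory;
    return factory;
  }

  void Register(const std::string& name, Creator creator) {
    creators_[name] = std::move(creator);
  }

  std::unique_ptr<BasePhysicsScheme> Create(const std::string& name) const {
    auto it = creators_.find(name);
    return it == creators_.end() ? nullptr : it->second();
  }

 private:
  std::map<std::string, Creator> creators_;
};

// A static instance of this template registers T at program startup,
// so the factory itself never has to be edited when schemes are added.
template <typename T>
struct PhysicsRegistration {
  explicit PhysicsRegistration(const std::string& name) {
    PhysicsFactory::Instance().Register(
        name, [] { return std::make_unique<T>(); });
  }
};

// Hypothetical scheme using the pattern.
class SeaSaltScheme : public BasePhysicsScheme {
 public:
  std::string Name() const override { return "sea_salt"; }
};
static PhysicsRegistration<SeaSaltScheme> sea_salt_reg{"sea_salt"};
```

The point of the pattern is that linking a scheme's translation unit is enough to make it creatable by name; no central list of schemes needs updating.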
## Global Coding Standards

- Language: C++20 and Fortran 2008+.
- Style: Google C++ Style Guide for C++.
- Namespace: `cece::` (defined in `include/cece/cece.hpp`).
- Documentation: Doxygen format (`/** ... */`) required for all public APIs.
- Memory: Use `Kokkos::View` for data. Avoid raw pointers.
- ESMF: Use the ESMC_C API for the C++ bridge. Wrap data in `Kokkos::View` with `Kokkos::MemoryTraits<Kokkos::Unmanaged>`.
- Performance Portability:
    - All compute kernels MUST use Kokkos parallel primitives (`parallel_for`, `parallel_reduce`).
    - Avoid hardware-specific code (e.g., CUDA intrinsics).
    - Use `Kokkos::DefaultExecutionSpace` for dispatch.
    - Strictly avoid `std::cout` or blocking I/O inside kernels.
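As a small illustration of the namespace and Doxygen conventions above (the function itself is hypothetical, not part of the CECE API):

```cpp
namespace cece {

/**
 * @brief Converts a flux density to a total emission rate for one cell.
 *
 * @param flux      Flux density in kg m^-2 s^-1.
 * @param cell_area Grid cell area in m^2.
 * @return Emission rate in kg s^-1.
 */
inline double FluxToRate(double flux, double cell_area) {
  return flux * cell_area;
}

}  // namespace cece
```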
## Physics Scheme Development
When implementing or modifying physics schemes:
1. Configurability: NEVER hardcode physical constants or tuning factors. All parameters must be read from the YAML options block in Initialize.
2. BasePhysicsScheme Helpers: Use the provided scientist-friendly helpers:
    * `ResolveImport(name, state)`: Retrieve input fields.
    * `ResolveExport(name, state)`: Retrieve output fields.
    * `MarkModified(name, state)`: Signal that an export field has been updated.
    * `ResolveDiagnostic(name, nx, ny, nz)`: Register/retrieve diagnostic fields.
3. Optimization: Use Horner's Method for evaluating polynomials (e.g., Schmidt numbers, SST scaling) to minimize floating-point operations.
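Horner's Method rewrites a polynomial `a0 + a1*x + a2*x^2 + ...` as nested multiply-adds, needing only N-1 multiplications for degree N-1. A generic sketch (the helper name and any coefficients are placeholders, not a real Schmidt-number fit):

```cpp
#include <array>
#include <cstddef>

// Evaluates a0 + a1*x + ... + a_{N-1}*x^{N-1} via Horner's Method:
// one multiply and one add per coefficient, friendly to fused
// multiply-add hardware.
template <std::size_t N>
double HornerEval(const std::array<double, N>& coeffs, double x) {
  static_assert(N > 0, "polynomial needs at least one coefficient");
  double result = coeffs[N - 1];
  for (std::size_t i = N - 1; i-- > 0;) {
    result = result * x + coeffs[i];
  }
  return result;
}
```

For a cubic in temperature `T`, this is equivalent to writing `((a3 * T + a2) * T + a1) * T + a0` by hand.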
## Vertical Distribution
CECE supports multiple vertical distribution methods for mapping 2D emissions to 3D grids:

- SINGLE: Place all emissions in a single specific layer.
- RANGE: Distribute evenly over a range of layer indices.
- PRESSURE: Distribute based on a pressure range (Pa).
- HEIGHT: Distribute based on a height range (m).
- PBL: Distribute within the Planetary Boundary Layer.
- Conservation: All methods ensure strict column mass conservation.
For complete algorithm descriptions, performance characteristics, and usage examples, see the Vertical Distribution Documentation.
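To make the conservation requirement concrete, here is a sketch of the RANGE method (even split over a layer-index range). The function name and signature are hypothetical, and the real code would operate on `Kokkos::View` columns rather than `std::vector`:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Distributes a 2D surface emission evenly over layers [k_lo, k_hi]
// of an nz-layer column. The layer values sum back to the input
// emission, which is the strict column-conservation property.
std::vector<double> DistributeRange(double emission, std::size_t nz,
                                    std::size_t k_lo, std::size_t k_hi) {
  assert(k_lo <= k_hi && k_hi < nz);
  std::vector<double> column(nz, 0.0);
  const double per_layer =
      emission / static_cast<double>(k_hi - k_lo + 1);
  for (std::size_t k = k_lo; k <= k_hi; ++k) {
    column[k] = per_layer;
  }
  return column;
}
```

The other methods differ only in how the per-layer weights are chosen (pressure thickness, height overlap, PBL fraction); the sum-to-input invariant is the same.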
## Configuration System
CECE uses a comprehensive YAML configuration system that supports:

- Hierarchical emission layer processing with categories and priorities
- Temporal scaling profiles (diurnal, weekly, seasonal)
- Environmental dependencies and dynamic scaling factors
- Multiple vertical distribution algorithms
- TIDE data stream integration with smart caching
- Physics scheme configuration and parameter tuning
For complete configuration reference with all available options and examples, see the Configuration Documentation.
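A hypothetical layer entry illustrating the features above. The key names here are illustrative guesses, not CECE's actual schema; consult the Configuration Documentation for the real option names:

```yaml
# Illustrative only: key names are assumptions, not the real CECE schema.
emission_layers:
  - name: anthro_surface
    category: anthropogenic
    priority: 10
    temporal_scaling:
      diurnal_profile: traffic_weekday
    vertical_distribution:
      method: RANGE        # one of SINGLE | RANGE | PRESSURE | HEIGHT | PBL
      layer_min: 1
      layer_max: 4
    tide_stream:
      inventory: anthro_inventory_v1
      cache: true
```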
## Development Environment

The required development environment is the `jcsda/docker-gnu-openmpi-dev:1.9` Docker container. This container provides the necessary compilers, MPI, and ESMF dependencies.
## ⚠️ IMPORTANT: No Mocking Policy
Mocking ESMF or NUOPC is strictly forbidden. All development and verification must be performed using real ESMF dependencies within the JCSDA Docker environment to ensure real-world parity and stability.
## Quick Start

1. Run the Setup Script: Execute the provided `setup.sh` script to pull the Docker image and drop into a shell. If you encounter Docker overlayfs issues or need to fix the environment, run the documented fix command.
2. Build: Build the project inside the container.
3. Run Example Driver: Run the example driver to see CECE in action with external data (ESMF fields).
4. Test: Run the test suite.
## Python Dependencies

The physics scheme generator and other scripts require `jinja2`, `pyyaml`, and `pytest`.
## Documentation Resources

- ESMF User Guide: https://earthsystemmodeling.org/docs/release/latest/ESMF_usrdoc
- ESMF Reference Manual: https://earthsystemmodeling.org/docs/release/latest/ESMF_refdoc/
- NUOPC Reference Manual: https://earthsystemmodeling.org/docs/release/latest/NUOPC_refdoc
- TIDE Documentation: Located at `src/io/tide` in the CECE repository
- Fortran Standards: Fortran 2008 Standard (ISO/IEC 1539-1:2010)