Overview
The evaluator module provides the Evaluator class and handler classes for managing task evaluation and output. Handlers specify what to compute and when, while the evaluator coordinates the actual computation.
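The division of labor can be sketched in plain Python. This is a schematic only, not Dedalus's actual implementation: each handler pairs a list of tasks with a cadence, and the evaluator walks its handlers and computes the tasks of whichever handlers are due.

```python
# Schematic of the handler/evaluator split (illustrative, not the real Dedalus code):
# handlers say *what* to compute and *when*; the evaluator does the walking.

class SketchHandler:
    def __init__(self, iter=1):
        self.iter = iter          # iteration cadence
        self.tasks = []           # what to compute
        self.last_output = {}     # most recent results

    def add_task(self, func, name):
        self.tasks.append((name, func))

    def due(self, iteration):
        return iteration % self.iter == 0

class SketchEvaluator:
    def __init__(self):
        self.handlers = []

    def add_handler(self, iter=1):
        handler = SketchHandler(iter=iter)
        self.handlers.append(handler)
        return handler

    def evaluate_scheduled(self, iteration, state):
        # Only handlers whose cadence is due get evaluated this iteration
        for handler in self.handlers:
            if handler.due(iteration):
                handler.last_output = {name: func(state)
                                       for name, func in handler.tasks}

# Usage: one handler firing every 2 iterations
evaluator = SketchEvaluator()
handler = evaluator.add_handler(iter=2)
handler.add_task(lambda s: max(s), name='max_u')
evaluator.evaluate_scheduled(iteration=4, state=[1.0, 3.0, 2.0])
print(handler.last_output)  # {'max_u': 3.0}
```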
Evaluator
Evaluator Class
Coordinates evaluation of expressions through various handlers.
Parameters
- dist (Distributor): Problem distributor
- vars (dict): Variables for parsing task strings
Methods
add_file_handler(filename, …) - Create file output handler
handler = evaluator.add_file_handler('snapshots', sim_dt=0.1)
add_dictionary_handler(…) - Create dictionary handler
handler = evaluator.add_dictionary_handler(iter=10)
add_system_handler(…) - Create system handler
handler = evaluator.add_system_handler(iter=1)
evaluate_scheduled(**kw) - Evaluate all scheduled handlers
evaluator.evaluate_scheduled(
    iteration=solver.iteration,
    sim_time=solver.sim_time,
    wall_time=time.time(),
    timestep=dt
)
evaluate_group(group, **kw) - Evaluate specific group
evaluator.evaluate_group('analysis')
Handlers
Handler Base Class
All handlers share common parameters:
Scheduling Parameters
- sim_dt (float, optional): Simulation time cadence
- wall_dt (float, optional): Wall time cadence (seconds)
- iter (int, optional): Iteration cadence
- custom_schedule (function, optional): Custom scheduling function
- group (str, optional): Handler group name
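These cadences combine permissively: a handler fires whenever any of its configured cadences comes due. The following pure-Python sketch illustrates that check (hedged: the real scheduler tracks integer divisions of each cadence since the last output rather than the simplified modulo shown here):

```python
# Illustrative cadence check: a handler fires when ANY configured cadence is due.
# The last_sim_div / last_wall_div arguments stand in for the scheduler's memory
# of the previous output time (hypothetical names, for illustration only).

def due(iteration, sim_time, wall_time,
        iter=None, sim_dt=None, wall_dt=None,
        last_sim_div=0, last_wall_div=0):
    checks = []
    if iter is not None:
        checks.append(iteration % iter == 0)
    if sim_dt is not None:
        checks.append(int(sim_time / sim_dt) > last_sim_div)
    if wall_dt is not None:
        checks.append(int(wall_time / wall_dt) > last_wall_div)
    return any(checks)

# Fires on the iteration cadence even though sim_dt is not yet due:
print(due(iteration=10, sim_time=0.05, wall_time=3.0, iter=10, sim_dt=0.1))  # True
```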
Methods
add_task(task, layout='g', name=None, scales=None) - Add a task to evaluate
handler.add_task("u", name='velocity')
handler.add_task("dx(u)", layout='g', scales=2)
add_tasks(tasks, …) - Add multiple tasks
handler.add_tasks(["u", "v", "T"])
FileHandler
add_file_handler(filename, max_writes=None, mode='overwrite', parallel='gather', **kw)
Handler that writes tasks to HDF5 files.
Parameters
- filename (str): Base filename/path for output
- max_writes (int, optional): Maximum writes per file (default: unlimited)
- mode (str, optional): 'overwrite' or 'append' (default: 'overwrite')
- parallel (str, optional): 'gather', 'virtual', or 'mpio' (default: 'gather')
- Scheduling parameters (sim_dt, wall_dt, iter, etc.)
Example: Basic File Output
import dedalus.public as d3
import time
# Problem setup
solver = problem.build_solver(d3.RK222)
# Create file handler
snapshots = solver.evaluator.add_file_handler(
    'snapshots',
    sim_dt=0.1,
    max_writes=50
)
# Add tasks
snapshots.add_task("u", name='velocity')
snapshots.add_task("T", name='temperature')
snapshots.add_task("div(u)", name='divergence')
# Timestepping (file handler is called automatically)
dt = 0.001
while solver.proceed:
    solver.step(dt)
Example: Multiple Handlers
import dedalus.public as d3
# Snapshots every 0.1 sim time
snapshots = solver.evaluator.add_file_handler(
    'snapshots',
    sim_dt=0.1,
    max_writes=50
)
snapshots.add_task("u")
snapshots.add_task("T")
# Analysis outputs every 0.01 sim time
analysis = solver.evaluator.add_file_handler(
    'analysis',
    sim_dt=0.01
)
analysis.add_task("sqrt(u@u)", name='speed')
analysis.add_task("dx(u) - dy(v)", name='vorticity')
# Checkpoints every hour of wall time
checkpoint = solver.evaluator.add_file_handler(
    'checkpoint',
    wall_dt=3600,
    max_writes=1,
    mode='overwrite'
)
checkpoint.add_task("u")
checkpoint.add_task("T")
Parallel Output Methods
'gather' (default): Gather to process 0, write single file
- Simple, portable
- Memory intensive for large problems
'virtual': Write per-process files, create virtual HDF5 dataset
- Scalable
- Requires HDF5 1.10+
'mpio': Parallel MPI-IO to single file
- Scalable
- Requires parallel HDF5 build
# Use virtual datasets for large parallel runs
handler = solver.evaluator.add_file_handler(
    'data',
    parallel='virtual',
    sim_dt=0.1
)
DictionaryHandler
add_dictionary_handler(**kw)
Handler that stores outputs in a dictionary for immediate access.
Example
import numpy as np
import dedalus.public as d3
# Create dictionary handler
analysis = solver.evaluator.add_dictionary_handler(iter=10)
analysis.add_task("sqrt(u@u)", name='speed')
analysis.add_task("integ(T)", name='avg_temp')
# Timestepping
while solver.proceed:
    solver.step(dt)
    if solver.iteration % 10 == 0:
        # Access evaluated tasks
        speed = analysis['speed']
        avg_T = analysis['avg_temp']
        print(f"Max speed: {np.max(speed['g'])}")
        print(f"Avg temp: {avg_T['g'].flat[0]}")
SystemHandler
Handler for internal system use (e.g., RHS evaluation in solvers).
Example
# Typically used internally by solvers
F_handler = solver.evaluator.add_system_handler(iter=1, group='F')
for eq in problem.eqs:
    F_handler.add_task(eq['F'])
F_handler.build_system()
Task Specification
String Expressions
Most flexible - parsed using problem namespace:
handler.add_task("u", name='velocity')
handler.add_task("dx(u)", name='du_dx')
handler.add_task("sqrt(u@u + v@v)", name='speed')
handler.add_task("integ(T, 'x')/Lx", name='x_average')
Field Objects
Direct field references:
handler.add_task(u, name='velocity')
Operator Expressions
Pre-built expressions:
import dedalus.public as d3
ω = d3.curl(u)
handler.add_task(ω, name='vorticity')
Layout and Scales
Layout Parameter
Specify evaluation layout:
# Grid space (default)
handler.add_task("u", layout='g')
# Coefficient space
handler.add_task("u", layout='c')
Scales Parameter
Specify dealiasing scales:
# Default (1x)
handler.add_task("u")
# 2x dealiasing
handler.add_task("u", scales=2)
# Per-axis scales
handler.add_task("u", scales=(1, 2, 1))
Custom Scheduling
Define custom output schedules:
import dedalus.public as d3
def custom_schedule(iteration, sim_time, wall_time, timestep):
    # Output on specific iterations
    return iteration in [100, 500, 1000, 5000]
handler = solver.evaluator.add_file_handler(
    'custom',
    custom_schedule=custom_schedule
)
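Any function with that signature works. For instance, a hypothetical schedule (illustrative, not from the Dedalus docs) that outputs on power-of-two iterations, capturing early transients densely and later evolution sparsely:

```python
def log_schedule(iteration, sim_time, wall_time, timestep):
    """Fire on iterations 1, 2, 4, 8, ... (powers of two)."""
    # n & (n - 1) == 0 exactly when n is a power of two
    return iteration >= 1 and iteration & (iteration - 1) == 0

# Dense output early, sparse output late:
print([i for i in range(1, 20) if log_schedule(i, 0.0, 0.0, 0.0)])  # [1, 2, 4, 8, 16]
```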
Reading Output
Reading HDF5 Files
import h5py
import numpy as np
# Open output file
with h5py.File('snapshots/snapshots_s1.h5', 'r') as f:
    # List tasks
    print(list(f['tasks'].keys()))
    # Read data
    u = f['tasks']['velocity']
    t = f['scales']['sim_time']
    # Access specific writes
    u_0 = u[0]      # First write
    u_last = u[-1]  # Last write
    # Access metadata
    grid = f['tasks']['velocity'].dims[1][0][:]
Post-processing Script
import h5py
import numpy as np
import matplotlib.pyplot as plt
# Merge multiple output files if needed
from dedalus.tools import post
post.merge_process_files('snapshots', cleanup=True)
# Read merged data
with h5py.File('snapshots/snapshots_s1.h5', 'r') as f:
    u = f['tasks']['velocity'][:]
    x = f['tasks']['velocity'].dims[1][0][:]
    t = f['scales']['sim_time'][:]
# Make plot
plt.figure()
for i in range(0, len(t), 10):
    plt.plot(x, u[i], label=f't={t[i]:.2f}')
plt.legend()
plt.xlabel('x')
plt.ylabel('u')
plt.savefig('evolution.png')
See Also