The translators, preprocessors, and tooling that real Fortran shops actually use. We work in them daily and have shipped production systems with them.
Tools
1 - Promula
Promula — also known as gmFortran and shipped by Great Migrations — is the Fortran translation, preprocessing, and modernisation toolset we work in daily. We have used it on real engagements to lift legacy Fortran into a maintainable, testable, modern shape, and to extend production codebases with multi-line strings, hash maps, regex, SQL, and other modern conveniences without throwing away any existing logic.
Where we add value
- Driving Promula / gmFortran on production codebases, not toy examples.
- Adopting it in an existing Fortran codebase without disrupting the build.
- Configuring rules and exceptions for the quirks every old codebase has.
- Combining tool output with hand-tuned changes so the result reads like code humans want to maintain.
- Wiring the translation into reproducible builds, CI, and unit-test pipelines so the migration is repeatable instead of a one-shot stunt — see Expertise → 32-to-64-bit migration.
- Coordinating with Great Migrations when an engagement needs both the toolmaker and an experienced delivery team.
- Training your team on Promula idioms so the codebase stays maintainable after we leave.
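The CI wiring above can be sketched as a small driver that runs each stage in order and stops at the first failure, so a translation regression never reaches the test stage unnoticed. This is a minimal illustration, not Promula's actual interface: the `gmfortran` command line and the `legacy.gmp` project file are hypothetical placeholders, as are the `make`/`ctest` stages — substitute your real translator invocation and build commands.

```python
import subprocess

# Hypothetical pipeline: translate, then build, then test.
# The gmfortran flags and legacy.gmp project file are illustrative only.
STEPS = [
    ["gmfortran", "--project", "legacy.gmp", "--emit", "f90"],
    ["make", "-C", "build"],
    ["ctest", "--test-dir", "build"],
]

def run_pipeline(steps, runner=subprocess.run):
    """Run each stage in order; return the first failing command, or None."""
    for cmd in steps:
        result = runner(cmd, check=False)
        if result.returncode != 0:
            return cmd  # fail fast so CI reports the broken stage
    return None
```

Because the runner is injectable, the same driver works unchanged on a laptop and in CI, and can be exercised in tests without invoking the real translator.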
Talk to us
2 - NCL
NCL — the NCAR Command Language — is the scripting language we use to drive image generation from scientific Fortran codes. It is the standard tool for turning numerical output (wave fields, atmospheric grids, sea-state results) into the charts and rendered images downstream consumers rely on. NCL is no longer actively developed by NCAR, which makes expert support for existing NCL pipelines especially valuable.
Where we add value
- Writing and maintaining NCL scripts that turn Fortran numerical output — including WRF and WII — into production-quality visuals.
- Keeping legacy NCL pipelines working now that NCL itself is end-of-life: pinning compatible versions, packaging dependencies, and documenting the build.
- Integrating NCL into the build, run, and reporting pipeline so visuals are produced automatically rather than by hand.
- Delivering reproducible imaging — same input data, same script, same image, every time, including months later.
- Recovering and re-running historic NCL scripts against archived data when somebody needs a regenerated chart from years ago — see Expertise → Legacy data.
- Migrating NCL outputs to modern delivery surfaces (web, REST, dashboards) — see Examples → REST API.
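One way to make the reproducibility claim above checkable is to record a content hash of each image at first render and compare it after every regeneration. This is a generic sketch, not part of NCL itself: the manifest layout (a filename-to-SHA-256 mapping) is an assumption, and in practice the hashes would live in version control alongside the NCL scripts.

```python
import hashlib
from pathlib import Path

def sha256_of(path):
    """Stream a file through SHA-256 so large rendered images stay cheap to hash."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def check_reproducible(image_path, manifest):
    """Compare a regenerated image against the hash recorded at first render.

    `manifest` is assumed to map image filenames to their expected hex digests.
    """
    return sha256_of(image_path) == manifest.get(Path(image_path).name)
```

Run after each pipeline execution, this turns "same input, same image" from a promise into an assertion that fails loudly when an NCL version bump or dependency change alters the output bytes.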