Coming soon in Red Hat Enterprise Linux 9.3 and 8.9—and already in CentOS Stream 9 and 8—we're updating the rust-toolset package to Rust 1.71.1. This comes with many upstream features, like the new OnceLock API and Cargo's new "sparse" protocol, but an additional change we're making to our packaging is to include the profiler runtime with the Rust standard library. This has no impact on the default Rust compilation workflow, but it does enable two new capabilities: source-based code coverage and profile-guided optimization.
In this blog, I'll explain how to use code coverage and profile-guided optimization with Rust code on Red Hat Enterprise Linux.
Setup
If you've already been using rust-toolset on RHEL, then simply upgrading to the new version will include the necessary runtime library. You will also need the llvm package for the tools that process the instrumented profiling data. On a new system, you can get it all by running:
$ sudo yum install rust-toolset llvm
If your project also includes C or C++ code, you may want the full llvm-toolset package to include the Clang compiler as well.
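To confirm everything is in place, you can check the compiler version and look for the profiler runtime in the Rust sysroot. The library name and location below follow the upstream layout, so treat them as an assumption to verify on your own system:
$ rustc --version
$ find "$(rustc --print sysroot)" -name 'libprofiler_builtins*'   # the profiler runtime rlib should be listed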
Source-based code coverage
Code coverage is usually part of the dev/test cycle, checking that execution (usually of tests) reaches as much of the code as possible. This way, the developer can increase confidence that all parts of the code are operating correctly. The newly enabled code coverage option in Rust instruments precise annotations based on the original code, so all of the branches and regions of code are represented precisely, rather than just line-based annotations of some older coverage tools.
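To make this concrete, consider a small hypothetical program with a branch; the crate and function here are illustrative and not part of the toolset:
fn describe(n: u32) -> &'static str {
    // Each arm of this condition is a distinct coverage region, so a
    // report can show exactly which cases your tests actually reached.
    if n % 2 == 0 {
        "even"
    } else {
        "odd"
    }
}

fn main() {
    println!("7 is {}", describe(7));
}
Running only main here would leave the "even" arm unexecuted, and a region-based report makes that gap visible immediately.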
The rustc option -Cinstrument-coverage is best applied in a Cargo build by setting the RUSTFLAGS environment variable during development builds.
$ env RUSTFLAGS="-Cinstrument-coverage" cargo build
You can also enable this while building and running your entire test suite:
$ env RUSTFLAGS="-Cinstrument-coverage" cargo test
When each application runs, it will output a *.profraw file in the current directory.
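If you would rather keep those raw profiles somewhere else, the LLVM profiler runtime also honors the LLVM_PROFILE_FILE environment variable, where patterns like %p (process ID) and %m (a per-binary signature) keep separate runs from clobbering each other; this is LLVM runtime behavior, so check the rustc book for the details that apply to your version:
$ env LLVM_PROFILE_FILE="profiles/test-%p-%m.profraw" \
      RUSTFLAGS="-Cinstrument-coverage" cargo test
Either way, the LLVM tools can then process and report code coverage from that data: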
$ llvm-profdata merge -sparse *.profraw -o coverage.profdata
$ llvm-cov show -instr-profile=coverage.profdata \
      ./target/debug/hello-world
    1|      1|fn main() {
    2|      1|    println!("Hello, world!");
    3|      1|}
Additional options like -show-line-counts-or-regions can be added to show more detailed region information when the code is more complicated than line counts alone can represent.
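For example, with the same profile data as above, the following commands show per-region counts and a per-file summary table; the exact output will depend on your code:
$ llvm-cov show -show-line-counts-or-regions \
      -instr-profile=coverage.profdata ./target/debug/hello-world
$ llvm-cov report -instr-profile=coverage.profdata ./target/debug/hello-world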
For more information, see the Code Coverage section of the rustc book, as well as the similar section from Clang if you're also using C or C++, and the documentation for llvm-profdata and llvm-cov for more options in reporting.
Profile-guided optimization
A lot of compiler optimizations rely on inferring (and sometimes guessing) which code paths are likely to be taken most often so it can generate the fastest outcome for those paths. This especially matters when an unlikely path might place a heavy cost on code generation that the likely path would do better to avoid. Profile-guided optimization (PGO) is a way to avoid guessing and inform the compiler using real workloads running your program.
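For instance, in code like the following hypothetical sketch, profile data tells the compiler that the error arm is rarely taken, so it can lay out and optimize the loop around the common case instead of guessing:
fn sum_valid(lines: &[&str]) -> (u64, u64) {
    let mut sum = 0;
    let mut errors = 0;
    for line in lines {
        match line.parse::<u64>() {
            // In a real workload almost every line parses, so this is the hot path.
            Ok(n) => sum += n,
            // The rare path: with profile data the optimizer can move it out of the way.
            Err(_) => errors += 1,
        }
    }
    (sum, errors)
}

fn main() {
    let (sum, errors) = sum_valid(&["1", "2", "oops", "3"]);
    println!("sum = {sum}, errors = {errors}");
}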
The process for building with PGO works in two phases: instrumentation with -Cprofile-generate, and application with -Cprofile-use. These are options to the rustc compiler, and in a normal Cargo-driven build they are easiest to apply using the RUSTFLAGS environment variable.
To instrument your application, set the option with a directory path to store the raw profiling data. It's also good practice to explicitly set the --target, as that separates the RUSTFLAGS from build scripts and procedural macros during the build process. Since we're also focusing on optimization, it's a good idea to use a --release build for this.
$ PROFDIR=$(mktemp -d)
$ env RUSTFLAGS="-Cprofile-generate=$PROFDIR" \
      cargo build --release --target x86_64-unknown-linux-gnu
The instrumented binary will be placed under ./target/x86_64-unknown-linux-gnu/release/. Run it a few times with representative workloads for your program, and each run will create a raw profile in the PROFDIR we specified at build time.
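For a hypothetical binary named my-app that reads an input file, a training session might look like this (the binary name and arguments are placeholders for your own program and workloads):
$ ./target/x86_64-unknown-linux-gnu/release/my-app typical-workload-1.dat   # placeholder name and arguments
$ ./target/x86_64-unknown-linux-gnu/release/my-app typical-workload-2.dat
$ ls "$PROFDIR"   # the raw .profraw data lands here
Then this data can be merged and used in a new build: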
$ llvm-profdata merge "$PROFDIR" -o "$PROFDIR/merged.profdata" $ env RUSTFLAGS="-Cprofile-use=$PROFDIR/merged.profdata" \ cargo build --release --target x86_64-unknown-linux-gnu
The new binary will no longer be instrumented, but it will be optimized using the data from your training workloads, and this should be more performant than a non-PGO build.
For more information, see the Profile-guided Optimization section of the rustc book, as well as the similar section from Clang if you're also using C or C++.
Summary
With the inclusion of the profiler runtime in rust-toolset in RHEL 9.3 and 8.9, Rust is ready to enhance your workflow with code coverage during development and profile-guided optimization for deployment. We hope you find these features useful!
About the author
Josh Stone is a part of the Platform Tools Team, where he is responsible for the Rust toolchain.