Julia 1.12 has been released with a new trim feature, the ability to redefine structs, and the final switch to partitioned semantics.
Julia is the dynamic language for technical computing optimized for running MATLAB and R-style programs. Development began on Julia at MIT in 2009, and it has steadily increased in popularity, recently becoming one of the top six languages for machine learning projects on GitHub.

The new trim option is experimental in this release. It is designed to create smaller binaries by removing code not proven to be reachable from entry points. Developers can mark entry points using the Base.Experimental.entrypoint function. The Julia developers say that not all code is expected to work with this option and, since it is experimental, you may encounter problems.
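As a rough sketch of how this fits together, an entry point is declared ahead of time so that everything reachable from it is kept when the binary is trimmed. The module and function names below are hypothetical, and the exact build invocation through the experimental tooling may differ:

# Hypothetical app module; only code reachable from the declared entry
# point is guaranteed to survive trimming.
module MyTrimmedApp

function main()
    println("Hello from a trimmed Julia binary")
    return 0
end

# Mark main as an entry point. This assumes the documented form of
# Base.Experimental.entrypoint: a function plus a tuple of argument types.
Base.Experimental.entrypoint(main, ())

end # module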
The ability to redefine types, including structs, has been added, extending the support for redefining constants introduced in an earlier version. Redefinition is now well defined and follows world-age semantics. The developers say there is also work in progress in Revise.jl to automatically redefine functions on replaced bindings, which should significantly reduce how often you have to restart Julia while iterating on a piece of code.
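A minimal REPL sketch of what this allows; in earlier releases the second definition would be rejected as an invalid redefinition of a constant:

julia> struct Point
           x::Float64
       end

julia> struct Point        # redefining with different fields now replaces the binding
           x::Float64
           y::Float64
       end

julia> fieldnames(Point)   # code compiled after the redefinition sees the new type
(:x, :y)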
This version also adds new tracing flags and macros for inspecting what Julia compiles. A new timing option reports how long each compiled method took (in milliseconds) before the corresponding precompile(...) line, making it easier to spot costly compilations, and two new macros allow ad-hoc tracing without restarting Julia.
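As an illustration, the timing output can be requested at startup, while the macros scope the same tracing to a single expression. The flag and macro names below follow the release notes; treat the exact output format as indicative only:

# At startup: print each precompile(...) statement, prefixed with how many
# milliseconds compilation took (--trace-compile-timing is the new part).
#   julia --trace-compile=stderr --trace-compile-timing script.jl

# From the REPL, without restarting: trace what a single expression compiles,
# or where it hits dynamic dispatch.
julia> Base.@trace_compile rand(2, 2) * rand(2, 2)

julia> Base.@trace_dispatch sum(Any[1, 2.0, 3])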
There are two new multi-threading features. Julia now starts with one interactive thread by default (in addition to the default thread), which is where the REPL and other interactive operations run. By separating these from the default thread pool the REPL can perform operations like autocomplete queries in parallel with user code execution, leading to a more responsive interactive experience.
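A small sketch of how this shows up from user code; the workload here is just a placeholder:

Threads.nthreads(:interactive)    # 1 by default in Julia 1.12
Threads.nthreads(:default)        # controlled by -t / JULIA_NUM_THREADS

# Heavy work goes to the default pool...
t = Threads.@spawn :default sum(rand(10^8))

# ...while the interactive pool (where the REPL runs) stays free for
# latency-sensitive tasks such as autocompletion.
Threads.@spawn :interactive println("still responsive")

fetch(t)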
Another improvement is that thread settings now respect CPU affinity settings, such as those set via cpuset, taskset and cgroups.
Another change of note is the ability to build Julia and LLVM using the Binary Optimization and Layout Tool (BOLT). BOLT is a post-link optimizer from LLVM that improves runtime performance by reordering functions and basic blocks, splitting hot and cold code, and folding identical functions. Julia now supports building BOLT-optimized versions of libLLVM, libjulia-internal, and libjulia-codegen.
This should reduce compilation and execution time in common workloads. The team says the all-inference benchmarks improve by about 10%, an LLVM-heavy workload shows a similar ~10% gain, and building corecompiler.ji improves by 13-16% with BOLT. When combined with PGO and LTO, total improvements of up to ~23% have been observed.
Julia 1.12 is available now.

Related Articles
Julia Makes Its Debut in TIOBE Top 20
Julia 1.9 Adds Native Code Caching
Julia 1.8 Improves Apple Silicon Support
Julia Language Creators Awarded Numerical Software Prize