A community education resource
August 18, 2025
9 min read
Optimizing for the edge: Lessons from DOS, Turbo Pascal, and hardware constraints
Inside the As-Easy-As story: How hardware constraints shaped innovative programming.

In the early days of personal computing, developers built software by hand from first principles because they had to. This series shares the story behind As-Easy-As, the spreadsheet built from scratch, and the lessons learned along the way.
As-Easy-As was originally a DOS program. What things do you have to “balance” when you write DOS programs?
Well, it was an evolving process. When we first got started, extended memory was not a thing, and 640K was a luxury for most PC users before the mid 1980s. Another limitation was that programs were compiled into .COM files, which had to fit into a single 64KB memory segment because they lacked relocation information. I remember multiple iterations of optimization to try and keep the code small and fast.
As systems got better and available memory grew, we kept enhancing our memory models to use more RAM, e.g., expanded/extended memory up to 1 MB, above the directly addressable 640K, and later up to 500 16K pages of expanded (EMS) RAM (up to 8 MB!). Eventually, we even implemented a method to simulate up to 2 MB of virtual RAM on disk, for users who needed the additional memory but whose systems did not have it.
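To give a flavor of what talking to expanded memory looked like in code, here is a minimal sketch in Turbo Pascal, assuming the standard LIM EMS driver is installed and answering on interrupt 67h. The program is purely illustrative and is not the As-Easy-As memory manager.

```
program EmsCheck;
{ A minimal sketch, assuming the LIM EMS driver is loaded.            }
{ EMS function 42h reports free and total 16K expanded-memory pages.  }
{ A real program would first verify the EMM driver (EMMXXXX0) is      }
{ present before calling INT 67h. Not As-Easy-As code.                }
uses Dos;

var
  R: Registers;
begin
  R.AH := $42;            { EMS 42h: get unallocated page count }
  Intr($67, R);
  if R.AH = 0 then        { AH = 0 means the call succeeded }
    WriteLn('Free 16K EMS pages: ', R.BX, ' of ', R.DX)
  else
    WriteLn('EMS request failed, status code ', R.AH);
end.
```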
Dave was primarily the person doing the optimizing, and he was very good at it. I think I’ve mentioned before how impressive his understanding of computers and software was! Early on, there were no math co-processors either, so if you were writing code that used a lot of math operations, you had to think about implementing direct bit manipulations, shifting register bits left or right. Fortunately, Turbo Pascal allowed you to include in-line machine code (assembly language), which we used a lot to manipulate hardware directly and get optimizations that were not available in standard Pascal code. We’d use an assembler to optimize specific chunks of code, then take the assembled hex code and include it in Pascal functions and procedures.
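As a rough illustration of both ideas, here is a hypothetical sketch in Turbo Pascal; neither routine is taken from the As-Easy-As source.

```
program LowLevelTricks;
{ Hypothetical illustrations of the two techniques described above. }

{ (1) Replace a costly multiply with bit shifts: X * 10 = X*8 + X*2. }
{     Turbo Pascal's shl operator compiles to fast SHL instructions. }
function MulBy10(X: Integer): Integer;
begin
  MulBy10 := (X shl 3) + (X shl 1);
end;

{ (2) Embed raw machine-code bytes with the Inline() statement.      }
{     $FA and $FB are the 8086 CLI and STI opcodes, used to disable  }
{     and re-enable hardware interrupts around a critical section.   }
procedure DisableInterrupts;
begin
  Inline($FA);   { CLI }
end;

procedure EnableInterrupts;
begin
  Inline($FB);   { STI }
end;

begin
  { briefly mask hardware interrupts around a (trivial) critical section }
  DisableInterrupts;
  EnableInterrupts;
  WriteLn('10 * 10 = ', MulBy10(10));
end.
```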
One of the drawbacks, of course, was that inline assembly code is highly dependent on the target platform (CPU architecture and memory model). For the most part, this was not a problem, though we did encounter inconsistencies moving from the 80286 to the 80386 and the 80387 math co-processor. Early on, we had conditional branching for running the program on systems with a co-processor and those without one. The user had to know what they were running and start the program with an appropriate command-line option, /N486, to bypass co-processor error trapping.
The other area where we used in-line machine code a lot was taking over from the operating system and writing directly to video RAM, bypassing the BIOS. Again, this resulted in better, faster screen writes and updates, but it opened the door to problems with displays or display adapters that did not conform 100% to published specs.
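Here is a minimal sketch of the direct-video-write idea in Turbo Pascal, assuming a color adapter in 80x25 text mode with the screen buffer at segment $B800 (monochrome adapters used $B000). It is only an illustration, not the program’s actual screen driver.

```
program DirectVideo;
{ Writes one character directly into text-mode video RAM, bypassing }
{ both DOS and the BIOS. Assumes a color adapter in 80x25 text mode  }
{ with the screen buffer at segment $B800. Illustrative only.        }
var
  Screen: array[0..3999] of Byte absolute $B800:$0000;
begin
  Screen[0] := Ord('A');   { character cell at row 0, column 0 }
  Screen[1] := $1E;        { attribute byte: yellow on blue }
end.
```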
That meant more command-line switches to start the program in a “special” mode for such systems: if I remember correctly, /ATT if you were using an AT&T system with a monochrome monitor, /ATT2 if you were using a Toshiba laptop, /E for an enhanced graphics adapter, /EM for monochrome EGA monitors, and so on. I can’t remember them all by heart, but we ended up with over 30 command-line switches and hundreds of combinations. However, most PC users at the time were hackers, and they could figure out the right combinations to get the most out of their PCs.
A sample calculation in TCALC – Turbo Pascal demo program
What are some “tricks” or methods you used to write a DOS application like this?
Not sure I can remember all the “specific coding tricks,” but as I have mentioned in many other places, taking over certain operations from the operating system and bypassing the BIOS was a main goal. Keyboard interrupt and video interrupt handling were almost entirely taken over. Coming up with our own, optimized run-length encoding for representing data was also a big help, as was manipulating registers directly to perform routine integer operations. We made abundant use of overlay files to allow the program to run on systems with limited resources, keeping a resident portion of the code always loaded and swapping different overlay segments in and out of memory as needed.
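To give a flavor of the run-length idea, a generic sketch in Turbo Pascal might look like the following; the actual As-Easy-As encoding was its own optimized variant and is not reproduced here.

```
program RleDemo;
{ A generic run-length encoder, for illustration only. Each run is  }
{ stored as a (count, value) byte pair, so the output can be up to  }
{ twice the input size in the worst case.                           }
const
  MaxSrc = 1024;
  MaxDst = 2048;
type
  SrcBuf = array[1..MaxSrc] of Byte;
  DstBuf = array[1..MaxDst] of Byte;

function RleEncode(var Src: SrcBuf; SrcLen: Integer; var Dst: DstBuf): Integer;
var
  I, OutPos, RunLen: Integer;
begin
  I := 1;
  OutPos := 0;
  while I <= SrcLen do
  begin
    RunLen := 1;
    { extend the run while bytes repeat, capping at 255 so the count fits in a byte }
    while (I + RunLen <= SrcLen) and (Src[I + RunLen] = Src[I]) and (RunLen < 255) do
      RunLen := RunLen + 1;
    Dst[OutPos + 1] := RunLen;    { run length }
    Dst[OutPos + 2] := Src[I];    { repeated byte value }
    OutPos := OutPos + 2;
    I := I + RunLen;
  end;
  RleEncode := OutPos;            { number of bytes written to Dst }
end;

var
  Src: SrcBuf;
  Dst: DstBuf;
  K: Integer;
begin
  for K := 1 to 20 do
    Src[K] := K div 10;           { 9 zeros, then 10 ones, then a two }
  WriteLn('Encoded 20 bytes into ', RleEncode(Src, 20, Dst), ' bytes');
end.
```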
What’s an example of something you can do in DOS programming that’s hard to do in Windows or Linux?
The ability to generate the smallest possible files was always a goal for us, and DOS was good at it. Not sure how realistic this example is, but in the early days I would demonstrate to people how simple it was to write a program that displayed “Hello World” using the built-in DOS debugger, and the generated .COM file was only 22 bytes. I think someone had reduced that to even 20 bytes! What’s the smallest Windows .EXE that can do that? I believe that because DOS allowed access at the lowest level, it was possible to write very tight and fast code. I’m sure there are many more things one could do in DOS that cannot be done, or at least not easily, in Windows. Linux is maybe a bit more accessible.
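For context, the assembly for a tiny “Hello World” .COM program looks roughly like this reconstruction; it assembles to about 20 bytes, and adding a CR/LF to the string lands near the 22 bytes mentioned above. This is an illustration, not the exact program Paris typed into DEBUG.

```
; A tiny "Hello World" .COM program (illustrative reconstruction).
; .COM files have no header and load at offset 100h, so the file is
; just these instructions plus the string: about 20 bytes in total.
        org 100h
start:  mov dx, msg         ; DS:DX points at the message
        mov ah, 09h         ; DOS function 09h: print '$'-terminated string
        int 21h
        ret                 ; returning from a .COM program exits to DOS
msg:    db 'Hello World$'
```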
What things have gotten easier in programming?
What I think was the advantage also made it harder: there were no built-in function libraries, or external ones that could be linked from your code, so you had to write pretty much everything you needed (well, with the exception of basic functions). Writing code in Windows makes thousands of built-in functions available to you. And that’s my second pet peeve! Load everything, whether you need or use it or not! So what if your code needs 16 GB of RAM? RAM is cheap, right? No need to optimize, no need to look at how this affects execution speed or how it impacts other apps.
Applications have now assumed the behavior of gases: “they expand to fill the volume they are released in.” I have vivid memories of Dave and me spending hours figuring out how to make the program 128 or 256 bytes smaller. Yes, bytes. Because of the limited resources available on the hardware, anything you could do to save even 128 bytes was a plus!
As-Easy-As was originally written in Turbo Pascal, then Delphi. Why change to Delphi?
We moved to Delphi when we started porting the program to Windows. Delphi is essentially an enhanced, object-oriented version of Turbo Pascal, so the code conversion was not as bad as it sounds. We did, however, have to re-architect the program to take advantage of the OOP model. Delphi’s visual development environment, its support for GUI applications, and its RAD (Rapid Application Development) approach with visual tools for building applications all made development a bit easier.
Startup screen for the shareware version of As-Easy-As 5.7
TRIUS Inc shared the registration code for As-Easy-As; that was a great gesture to say “Thanks” to the community of original users. Were you behind this decision?
Yes, it was my decision to make these programs available to everyone for free, at that time. There was some internal pushback, because a couple of companies had expressed interest in purchasing the source code, but I felt that people in general had been good to us and since we were not interested in investing any more resources in updating these programs (we had shifted focus by then), we should let people have them!
Some have asked me “why didn’t you just remove the registration requirement altogether?” Over the years we had built a sophisticated registration-detection system interwoven with a number of program modules, the installation module, validations, etc. The effort to “untangle” that, so we could release a version that did not require registration, would have been significant and not really needed.
The forums aren’t there anymore, but the Internet Archive has the announcements; here’s the one for DOS: AS-EASY-AS for DOS – Free!
Do you still do programming today?
I don’t do much programming these days, and haven’t done so for many years. I manage a group of very good developers, so I leave the coding part to them. On the rare occasion that I code, it’s usually C# and SQL. Once in a while, I get personally interested in a topic and will spend some time at home with SciLAB.
I have done very little coding on the Mac, and that was many years ago. We are a Microsoft house using Windows. However, on my system I am using VMware to run Linux (Ubuntu and ArchBang), and I just installed FreeDOS (not sure how much time I will have to spend with it).
I don’t code anything worthwhile or of interest to other developers. Those days are gone. Developers now have access to great tools; there’s no need to rely on old relic coders like me. I have to say, though, that some developers nowadays may not have the instinctive knowledge of base principles that we needed to have back in the day.
Sure, you can get ChatGPT or some other online system to instantly convert a number from Base-10 to Base-16, or do binary arithmetic for you, but I think it’s important to understand the process that produces those results, and that it’s not just magic. It’s important to understand how registers work, and that the CPU is just a very fast processor of straightforward operations. It’s not magic! Did you get that this is one of my pet peeves?
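As a tiny example of those base principles: converting Base-10 to Base-16 by hand is just repeated division by 16, reading the remainders in reverse. For 2025: 2025 div 16 = 126 remainder 9; 126 div 16 = 7 remainder 14 (E); 7 div 16 = 0 remainder 7; so 2025 decimal is $7E9. The same process as a throwaway Pascal sketch (illustrative only):

```
program HexDemo;
{ Base-10 to base-16 by repeated division, as described above. }
{ A throwaway illustration, not production code.               }

function ToHex(N: Word): string;
const
  Digits: string[16] = '0123456789ABCDEF';
var
  S: string;
begin
  S := '';
  repeat
    S := Digits[(N mod 16) + 1] + S;   { the remainder becomes the next lower digit }
    N := N div 16;
  until N = 0;
  ToHex := S;
end;

begin
  WriteLn('2025 decimal = $', ToHex(2025));   { prints $7E9 }
end.
```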
Thanks to Paris for this deep dive into the As-Easy-As spreadsheet and how it was developed. Paris shared more than I could fit into one interview; you can find more details in my interview at Technically We Write about writing the manual, and in my interview at Coaching Buttons about growing TRIUS Inc as a company.
Read the entire interview series
- Part 1: The nuclear engineer who became a software founder
- Part 2: Building As-Easy-As: A spreadsheet born out of scientific need
- Part 3: Inside the shareware era: Features, bugs, and programming in DOS
More from We Love Open Source
- Command line magic: Extracting links with awk, grep, and tr
- How I built a Markdown-to-HTML tool on a 5MB FreeDOS system
- Explore the five steps of the FreeDOS boot sequence
- A throwback experiment with Linux and Unix
- How to write your first FreeDOS program
About the Author
Jim Hall is an open source software advocate and developer, best known for usability testing in GNOME and as the founder + project coordinator of FreeDOS. At work, Jim is CEO of Hallmentum, an IT executive consulting company that provides hands-on IT Leadership training, workshops, and coaching.
The opinions expressed on this website are those of each author, not of the author’s employer or All Things Open/We Love Open Source.