Richard Heurtley's Resume

Richard Heurtley
2777 Corliss Road
Richford, Vermont 05476
802-370-6877 (daytime)
richard@heurtley.com

Buzzwords

I am a talented, experienced, and prolific computer programmer. I mostly use the C language and have experience in the following areas:

  • assembly language
  • development tools (interpreters and debuggers)
  • device drivers
  • firmware
  • image processing
  • mapping and GIS
  • medical instrumentation
  • Modbus
  • networking
  • PLC ladder logic
  • printer drivers
  • process control
  • real-time programming
  • sensors
  • signal processing
  • SQL databases
  • stack-oriented threaded programming
  • web programming
  • Windows and Linux

Personal information

Q: vi or Emacs?
A: vi

Q: Slashdot ID?
A: five digits

Q: Linux distribution?
A: Devuan

Employment History

For the last 18 years I've been an independent consulting computer systems engineer with clients in the Vermont area.

The Hard'ack Technology Environmental Monitor is an example of my recent work.

10/10 to 5/16: Computer Programmer
Housing Vermont
Burlington, Vermont

Housing Vermont is a non-profit developer of affordable housing. Housing Vermont's Energy Project Manager approached me about his idea for an ultra-low-cost data monitoring system to find out how well the equipment he's been installing works. We call the result Hard'ack and intend to make it open source.

01/09 to date: Computer Programmer
Northern Tier Center for Health
Richford, Vermont

NOTCH is a Federally Qualified Health Center with clinics in five rural Vermont towns. Back when NOTCH was just the Richford Health Center, I was the volunteer IT guy until the operation got big enough to warrant hiring someone. Later NOTCH approved a modest budget for some custom applications, including a web-based vacation scheduling program and some apps to help manage a prescription drug subsidy program.

11/04 to 01/09: Computer Programmer
Kaycan Ltd.
Richford, Vermont

Kaycan is a manufacturer of aluminum and vinyl siding. I developed a Linux-based controller for a vinyl resin blending system.

09/99 to 01/03: Technical Director
EPS Inc.
Williston, Vermont

EPS is a service bureau for the subsidized housing industry. I set up their original data center and wrote a large shell system around their core commercial program. Later I was hired to advise on updating their data center with modern equipment. I recommended centralizing and virtualizing their servers with a blade array, VMware, and some SAN storage arrays.

07/98 to 03/99: Senior Software Designer
Hyperchip Inc.
Montreal, Quebec

Hyperchip was a start-up developing high-speed network switching technology. I wrote several switch architecture software simulations and an efficient high-speed NT device driver for an FPGA PCI card.

10/94 to 07/98: Computer Programmer
Quinton Instrument Company
Bothell, Washington

Quinton made cardiology equipment. I was originally hired to help rewrite their Q-Cath application using Vermont Views. Later I helped port the DOS application to Windows NT.

01/91 to 06/93: Senior Software Engineer
Dow Jones/Telerate
New Orleans, Louisiana

Telerate distributed real-time bond, foreign exchange, and commodity information before the advent of common Internet access. The New Orleans office wrote and maintained the real-time DOS program that displayed the information. The group created and used an innovative language that was interpreted, stack based, threaded, and object oriented. I ported the interpreter from real to protected mode, added a source code debugger, wrote several printer drivers and productivity tools, and developed a Huffman-based scheme to compress the information feed.

10/88 to 12/90: Senior Software Engineer
Vermont Creative Software
Richford, Vermont

Originally VCS's main product was a data entry forms library called Vermont Views. I wrote extensions to let the library render in DOS graphics mode, use expanded memory, and process mouse events. I also created a DOS testing tool that records and plays back keystrokes and compares the resulting screen images. All VCS programmers spent considerable time providing technical support to customers.

04/86 to 10/88: Senior Software Engineer
Anderson-Jacobson
San Jose, California

Anderson-Jacobson manufactured a line of modems. I designed an add-on Z80 processor board for an existing 2400bps modem and implemented the MNP error control protocol. I then designed the controller section for a 9600bps modem using an 80186 processor and wrote the small real-time kernel and data flow logic necessary to support the modem control functions.

10/84 to 04/86: Software Engineer
ITT Information Systems
San Jose, California

ITT Information Systems manufactured a line of IBM compatible personal computers. I was in a group that developed a voice interface board and associated software.

Education

In 1984 I emerged from Rensselaer Polytechnic Institute with a Bachelor of Science degree in Computer and Systems Engineering.

Examples

Low cost data acquisition

Housing Vermont wanted to be able to monitor the heating and electric systems in some of its properties. They specified three criteria: low cost, cheap, and inexpensive. I wrote a set of programs that run on any Linux or Windows computer. We've used the following Linux systems as site controllers:

The site controller programs are:

  • gather: Gathers data from various devices using the Modbus and BACnet protocols. Data is written to a series of tab-delimited text files suitable for importing into a spreadsheet.
  • scatter: Uploads data files to a remote server. Also concatenates, compresses, and deletes old data files from the controller.
  • beacon: Periodically sends a data packet, containing the site name and the HTTPS port used by the panel program, to a remote server. The server's reply contains a password used by the panel program.
  • panel: A cgi-bin program invoked by a web server (lighttpd) to render the acquired data values in real time.

The gather program can process the acquired data in various ways before writing it to the text file. The most useful processing is integration over time. This allows the gather program to calculate (a minimal sketch of the idea follows this list):

  • Heating/cooling degree days from the outside air temperature
  • Kilowatt hours from current
  • Fuel usage from heater/pump run time
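
The idea in miniature, as illustrative C rather than the actual gather code (the service voltage and sampling interval here are assumed values):

    /* Illustrative only: trapezoidal integration of sampled current into
       kilowatt-hours.  The voltage constant and sample interval are assumed. */
    #include <stdio.h>

    #define VOLTS        240.0   /* assumed service voltage      */
    #define INTERVAL_SEC  60.0   /* assumed sampling interval    */

    /* Integrate a series of current samples (amps) into kWh. */
    static double amps_to_kwh(const double *amps, int n)
    {
        double watt_seconds = 0.0;
        int i;

        for (i = 1; i < n; i++) {
            double avg_amps = (amps[i - 1] + amps[i]) / 2.0;  /* trapezoid */
            watt_seconds += avg_amps * VOLTS * INTERVAL_SEC;
        }
        return watt_seconds / 3600.0 / 1000.0;   /* -> kilowatt-hours */
    }

    int main(void)
    {
        double amps[] = { 4.1, 4.3, 4.2, 0.0, 0.0, 5.0 };

        printf("%.3f kWh\n", amps_to_kwh(amps, 6));
        return 0;
    }

Degree days and run-time-based fuel estimates fall out of the same loop with a different integrand.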

The remote server is a surplus workstation running CentOS. It runs the following programs:

  • nexus: A cgi-bin program invoked by Apache. This is the main user interface to the system and is used primarily to render and chart the acquired data.
  • dispatch: Receives data packets from the beacon program and sends the appropriate replies.
  • sentinel: Inspects the acquired data for alarm conditions and optionally sends e-mail when an alarm occurs.

An example of the nexus chart specification form is here:

http://www.heurtley.com/richard/nexus_chart1.png

The resulting chart is here:

http://www.heurtley.com/richard/msm_heats1.png

A snapshot of the real-time data display rendered by the controller is here. The rows can be arranged over multiple pages:

http://www.heurtley.com/richard/panel.png

A summary of my latest work with the Hard'ack system, "Some Thoughts on Drone Borne Gas Sensors":

http://www.heurtley.com/richard/thoughts/index.html

And "A Drone-Borne Thermal Imager Assembly":

http://www.heurtley.com/richard/drone/index.html

Fun with temperature charts

Rendering annual temperature charts for different locations using various curve fitting techniques:

http://www.heurtley.com/richard/tchart

Rendering charts of 30 year global satellite temperature data:

http://www.heurtley.com/richard/gtchart

GPL washing machine

My washing machine has a 12-volt DC motor that was originally controlled by some clever plastic gears and a DPDT relay. I had to replace the relay a few times and when the gears started to wear out I decided to replace them with a PLC. I wrote a C language program to verify the ladder logic program before loading it to the PLC:

http://www.heurtley.com/richard/washer.c
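
That file is the real program; the fragment below is only a minimal illustration of the approach, using a hypothetical start/stop seal-in rung rather than the washer's actual logic: express each rung as a boolean expression over inputs and latched outputs, step the scan loop, and assert the behavior you expect.

    /* Illustrative only: a classic start/stop seal-in rung simulated in C.
       Rung:  (START OR MOTOR) AND NOT STOP  ->  MOTOR                     */
    #include <assert.h>
    #include <stdio.h>

    struct io {
        int start;   /* momentary start button */
        int stop;    /* momentary stop button  */
        int motor;   /* output coil, latched   */
    };

    /* One PLC scan: evaluate the rung and update the coil. */
    static void scan(struct io *io)
    {
        io->motor = (io->start || io->motor) && !io->stop;
    }

    int main(void)
    {
        struct io io = { 0, 0, 0 };

        io.start = 1; scan(&io); io.start = 0;   /* press and release start */
        assert(io.motor == 1);                   /* coil seals itself in    */

        scan(&io); scan(&io);
        assert(io.motor == 1);                   /* stays on                */

        io.stop = 1; scan(&io); io.stop = 0;     /* press and release stop  */
        assert(io.motor == 0);                   /* coil drops out          */

        printf("rung behaves as expected\n");
        return 0;
    }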

Microwave line-of-sight coverage tool

The State of Vermont floated a bond issue to improve the State's telecommunications infrastructure. One of my friends owns two large centrally located hills in the Town of Richford and gathered together a group of knowledgeable people to explore the possibility of erecting cellular or WiMAX antenna towers. I wrote a program to determine coverage areas for antennas of varying heights at different locations. An example configuration file is:

http://www.heurtley.com/richard/cabin_2k.par

The results are available at:

http://www.richfordvt.net/coverage

The program has the following interesting aspects:

  • Terrain elevation and imagery were obtained from the U.S. Geological Survey's Seamless web site.
  • Vermont town borders were obtained from the Vermont Center for Geographic Information's web site. I wrote functions to read the Shapefile format.
  • Each source of information used a different coordinate system. I located code on the web to convert between latitude/longitude, UTM, and the State Plane Coordinate systems.
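
The core calculation behind any coverage map is a line-of-sight test between the antenna and each candidate receiver location. A bare-bones sketch of such a test, ignoring earth curvature and refraction and using invented elevations, looks like this:

    /* Illustrative only: is a receiver visible from an antenna, given the
       terrain elevations sampled at equal intervals along the path between
       them?  No earth curvature or refraction correction is applied here.  */
    #include <stdio.h>

    static int line_of_sight(const double *elev, int n,
                             double antenna_m, double receiver_m)
    {
        double start = elev[0] + antenna_m;        /* antenna tip        */
        double end   = elev[n - 1] + receiver_m;   /* receiver height    */
        int i;

        for (i = 1; i < n - 1; i++) {
            /* Height of the sight line above the i-th sample point. */
            double frac = (double)i / (n - 1);
            double line = start + frac * (end - start);
            if (elev[i] > line)
                return 0;                          /* terrain blocks it  */
        }
        return 1;
    }

    int main(void)
    {
        /* Elevation profile (meters) along the path, antenna end first. */
        double profile[] = { 500, 450, 470, 430, 400, 380, 360 };

        printf("10 m mast: %s\n",
               line_of_sight(profile, 7, 10.0, 3.0) ? "covered" : "blocked");
        printf("30 m mast: %s\n",
               line_of_sight(profile, 7, 30.0, 3.0) ? "covered" : "blocked");
        return 0;
    }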

Blender controller

A neighbor, one of Kaytec's vice-presidents, hired me to automate a vinyl compound blender for the company, a producer of vinyl siding and other extruded vinyl products. The result is 67K lines of code, only about 12% of which make up the state machines that actually control the blender. The system has the following properties:

  • It compiles and runs in both Linux and Windows. The target platform is Linux.
  • It can use either POSIX threads or the Win32 threading system (a portability shim along those lines is sketched after this list).
  • The PostgreSQL back end is accessed with the libpq library under Linux or ODBC under Windows.
  • The user interface is rendered in plain ASCII text for the console or HTML for remote access via the built-in HTTP server.
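
As a rough illustration of the POSIX/Win32 point above (not the blender code itself), a thin wrapper like this is one common way to compile the same threaded worker against either API:

    /* Illustrative only: a thin shim so the same worker code can run on
       either POSIX threads or Win32 threads.  Error handling is minimal. */
    #include <stdio.h>

    typedef void (*work_fn)(void *);
    struct shim { work_fn fn; void *arg; };

    #ifdef _WIN32

    #include <windows.h>
    typedef HANDLE thread_t;

    static DWORD WINAPI trampoline(LPVOID p)
    {
        struct shim *s = p;
        s->fn(s->arg);
        return 0;
    }

    static int thread_start(thread_t *t, struct shim *s)
    {
        *t = CreateThread(NULL, 0, trampoline, s, 0, NULL);
        return *t ? 0 : -1;
    }

    static void thread_join(thread_t t)
    {
        WaitForSingleObject(t, INFINITE);
        CloseHandle(t);
    }

    #else  /* POSIX */

    #include <pthread.h>
    typedef pthread_t thread_t;

    static void *trampoline(void *p)
    {
        struct shim *s = p;
        s->fn(s->arg);
        return NULL;
    }

    static int thread_start(thread_t *t, struct shim *s)
    {
        return pthread_create(t, NULL, trampoline, s);
    }

    static void thread_join(thread_t t)
    {
        pthread_join(t, NULL);
    }

    #endif

    static void worker(void *arg)
    {
        printf("hello from the %s thread\n", (const char *)arg);
    }

    int main(void)
    {
        thread_t t;
        struct shim s = { worker, "blender" };

        if (thread_start(&t, &s) == 0)
            thread_join(t);
        return 0;
    }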

The blender has been running for several years without program issues. I used the LaTeX system to write the 153-page user's manual.

The user's manual chapter that describes the blender program is:

http://www.heurtley.com/richard/chapt22.pdf

Workflow: A system for subsidized contract administration

EPS, Inc. of Williston, Vermont has contracts with several U.S. States to administer subsidized housing contracts, mostly HUD Section 8 contracts. The system I wrote for them is the biggest I've done single-handedly at nearly 100K lines of code. It consisted of several batch programs and a large cgi-bin web program for the interactive bits. The initial system comprised a Samba file server, an Apache web server, a PostgreSQL database server, and a fax server. All the servers ran Linux.

The data-processing cycle worked roughly like this:

  1. A batch program dialed HUD's e-mail server to download a batch of files.
  2. A batch program parsed the files, logged all the relevant information to the database, routed responses as necessary, and routed files to an appropriate destination.
  3. Most of the data went to one of the analysts who used a third-party program to process it. The analysts then used the cgi-bin program to log their work in the database.
  4. A batch program formatted e-mail from outgoing files.
  5. A batch program dialed HUD's e-mail server to upload responses.

The file formats used by HUD are complex and inconsistent. Much of the processing was just trying to determine the property and/or contract to which a piece of data referred.

The e-mail programs with client-side implementations of the SMTP and IMAP protocols are available at:

http://www.tracsexperts.com/code.html
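
Those programs are the real implementations; purely to illustrate the shape of a client-side SMTP session, here is a bare-bones sketch that connects to a hypothetical relay host and walks through the HELO/MAIL/RCPT/DATA commands, printing each server reply instead of checking it properly:

    /* Illustrative only: the bare bones of an SMTP client session.  The
       relay host, addresses, and message are hypothetical, and real code
       must check the 2xx/3xx reply codes instead of just printing them.  */
    #include <stdio.h>
    #include <string.h>
    #include <sys/types.h>
    #include <sys/socket.h>
    #include <netdb.h>
    #include <unistd.h>

    static void chat(int fd, const char *cmd)
    {
        char reply[512];
        ssize_t n;

        if (cmd)
            write(fd, cmd, strlen(cmd));
        n = read(fd, reply, sizeof(reply) - 1);
        if (n > 0) {
            reply[n] = '\0';
            printf("%s", reply);
        }
    }

    int main(void)
    {
        struct addrinfo hints, *res;
        int fd;

        memset(&hints, 0, sizeof(hints));
        hints.ai_socktype = SOCK_STREAM;
        if (getaddrinfo("mail.example.com", "smtp", &hints, &res) != 0)
            return 1;
        fd = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
        if (fd < 0 || connect(fd, res->ai_addr, res->ai_addrlen) < 0)
            return 1;

        chat(fd, NULL);                                 /* server banner */
        chat(fd, "HELO client.example.com\r\n");
        chat(fd, "MAIL FROM:<reports@example.com>\r\n");
        chat(fd, "RCPT TO:<inbox@example.gov>\r\n");
        chat(fd, "DATA\r\n");
        chat(fd, "Subject: nightly batch\r\n\r\nfile count: 42\r\n.\r\n");
        chat(fd, "QUIT\r\n");

        close(fd);
        freeaddrinfo(res);
        return 0;
    }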

Hypercube network router simulator

I had a brief stint at Hyperchip, a Montreal startup in the backbone Internet router business. There wasn't much for the software guys to do while the hardware guys were still developing, so I turned the pages of my notebook back to a lecture the company's president gave on hypercube network topologies and wrote a simulator. First I designed a corner-node store-and-forward processor that could conceivably be implemented in hardware and then implemented it in software. I used a 10-dimensional hypercube (1024 corners) with 16 ports per corner node. Ten ports talked to each node's ten neighbors, one port was the "hyperspace" link to the opposite hypercube corner, and the remaining ports were for attached devices. The processor could spread broadcast messages efficiently, respond to supervisory messages like "ping", and forward regular traffic to the node closest to the packet's destination.
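
The forwarding decision itself is simple to sketch: neighbor node numbers differ by exactly one bit, so the next hop is any dimension in which the current node still differs from the destination, with the hyperspace link as a shortcut when most of the bits differ. This is an illustration from memory, not the Hyperchip simulator code:

    /* Illustrative only: next-hop selection in a 10-dimensional hypercube.
       Nodes are numbered 0..1023; neighbors differ in exactly one bit and
       the "hyperspace" link connects a node to its bitwise complement.     */
    #include <stdio.h>

    #define DIMS  10
    #define MASK  ((1u << DIMS) - 1u)

    static int popcount(unsigned x)
    {
        int n = 0;
        while (x) { n += x & 1u; x >>= 1u; }
        return n;
    }

    /* Return the node to forward to next, or `here` if we have arrived. */
    static unsigned next_hop(unsigned here, unsigned dest)
    {
        unsigned diff = (here ^ dest) & MASK;
        int d;

        if (diff == 0)
            return here;                      /* already at destination    */
        if (popcount(diff) > DIMS / 2)
            return ~here & MASK;              /* hyperspace link is closer */
        for (d = 0; d < DIMS; d++)
            if (diff & (1u << d))
                return here ^ (1u << d);      /* step along one dimension  */
        return here;                          /* not reached               */
    }

    int main(void)
    {
        unsigned node = 0x001, dest = 0x3F0;  /* takes the hyperspace link */
        int hops = 0;

        while (node != dest) {
            node = next_hop(node, dest);
            hops++;
        }
        printf("delivered in %d hops\n", hops);
        return 0;
    }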

The GUI was a 32x32 grid of nodes, each drawn as 16 tiny squares arranged along a diagonal, one square per port. Yellow lines were drawn between squares as messages passed between ports. The user could inject a message and watch it propagate through the hypercube. Unfortunately, if one injected more than a few broadcast ping packets the network got gridlocked. I never did solve that problem.

The Quinton Q-Cath hemodynamic monitoring system

The Q-Cath system is the largest I've worked on at about 130K lines. I was responsible for about half of it. The project started out as a rewrite of an existing program for extended DOS mode and continued when the program was ported to Windows NT. I wrote my first Windows device driver to support the system's proprietary ISA card.

The most interesting part of the project was rendering real-time waveforms on the PC's screen. The hardware platform was nothing special for the mid-1990s and there was no particular video hardware support. To move waveforms, I calculated just the pixels that changed between columns and effected the changes in a tight assembly language function called by a timer interrupt.
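
The real routine was assembly driven by a timer interrupt; this little C sketch just shows the delta idea of remembering where each column's trace pixel was and touching only the columns that moved:

    /* Illustrative only: update a waveform by touching just the pixels
       that changed, instead of redrawing whole columns.                 */
    #include <stdio.h>

    #define COLS 640
    #define ROWS 480

    static unsigned char frame[ROWS][COLS];   /* stand-in for video memory */
    static int last_row[COLS];                /* where each column's dot is */

    static void draw_waveform(const int *sample)  /* one row per column */
    {
        int col;

        for (col = 0; col < COLS; col++) {
            int new_row = sample[col];
            if (new_row == last_row[col])
                continue;                     /* nothing moved: skip column */
            frame[last_row[col]][col] = 0;    /* erase the old pixel        */
            frame[new_row][col] = 0xFF;       /* draw the new one           */
            last_row[col] = new_row;
        }
    }

    int main(void)
    {
        int samples[COLS];
        int i;

        for (i = 0; i < COLS; i++)
            samples[i] = 240;                 /* flat baseline              */
        draw_waveform(samples);

        samples[100] = 200;                   /* one column moves           */
        draw_waveform(samples);               /* only column 100 is redrawn */

        printf("pixel at (200,100) = %d\n", frame[200][100]);
        return 0;
    }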

The heart of the Q-Cath system was a proprietary single-board computer with a touchscreen interface. This connected the input signal conditioner, PC, non-PC waveform display, and chart recorder. The firmware needed some updates for new features, so I had the opportunity to dust off and exercise my 16-bit programming skills and EPROM burner.

Digital cineloop machine

A Q-Cath system is half of a hospital's catheterization lab equipment. The other half is an x-ray imager so the doctor can see what's going on while he's pushing a wire up your artery. The manager of the Q-Cath project and I teamed up to develop and sell a digital cineloop machine that would record 512x512 8 bits/pixel gray scale images at 60 frames/sec, fast enough for pediatric use. The platform was a 133MHz Pentium processor with 128MB of RAM running Windows NT. An Imaging Technologies IC-PCI image acquisition card DMA'ed images to RAM while a dual-controller PCI SCSI card wrote them out to the raw sectors of four fast 2GB SCSI drives. The 30K line program also imported and exported video in the medical DICOM format. The system was impressive for 1996. We hoped to sell it to Quinton but the company was taken private before we could finish negotiations and the new management dropped the entire Q-Cath product line shortly after.

DICOM API

Digital Imaging and Communications in Medicine (DICOM) is a complex standard for handling, storing, printing, and transmitting medical imaging information. The digital cineloop machine project needed DICOM support to be marketable so I wrote 28K lines of API and demo programs.

DICOM gray scale images are stored in the little-used lossless JPEG format, which is basically Huffman-encoded delta values. Decoding lossless JPEG is slow, annoyingly slow on the hardware of the time. I developed a JPEG decoder that may be unique in that it precalculates a state machine for a given Huffman tree and then decodes the compressed image a byte at a time. Each compressed byte is used as an index into a jump table and up to eight uncompressed pixels are immediately written out, with an average of about three machine instructions per pixel.
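
That decoder isn't published, but the flavor of the trick can be shown with a much simplified relative: precompute a table indexed by the next eight bits of the stream so each lookup yields a symbol and a code length, instead of walking the Huffman tree bit by bit. (The real decoder indexed whole compressed bytes and emitted up to eight pixels per lookup.) The tiny code below is invented for the example:

    /* Illustrative only: table-driven Huffman decoding.  A tiny fixed code
       (A="0", B="10", C="110", D="111") stands in for the real JPEG tables. */
    #include <stdio.h>

    struct entry { char sym; int len; };

    static struct entry table[256];

    static void build_table(void)
    {
        int i;

        for (i = 0; i < 256; i++) {
            if      ((i & 0x80) == 0x00) table[i] = (struct entry){ 'A', 1 };
            else if ((i & 0xC0) == 0x80) table[i] = (struct entry){ 'B', 2 };
            else if ((i & 0xE0) == 0xC0) table[i] = (struct entry){ 'C', 3 };
            else                         table[i] = (struct entry){ 'D', 3 };
        }
    }

    /* Decode nsyms symbols from a big-endian bit stream (assumed padded). */
    static void decode(const unsigned char *in, int nsyms)
    {
        unsigned bitbuf = 0;
        int bits = 0, pos = 0;

        while (nsyms-- > 0) {
            struct entry e;

            while (bits < 8) {                  /* keep 8 bits to peek at */
                bitbuf = (bitbuf << 8) | in[pos++];
                bits += 8;
            }
            e = table[(bitbuf >> (bits - 8)) & 0xFF];
            bits -= e.len;                      /* consume just the code  */
            putchar(e.sym);
        }
        putchar('\n');
    }

    int main(void)
    {
        /* "ABACAD" encoded: 0 10 0 110 0 111 -> 01001100 111xxxxx */
        unsigned char stream[] = { 0x4C, 0xE0 };

        build_table();
        decode(stream, 6);                      /* prints ABACAD */
        return 0;
    }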

Map: A tool to draw maps from surveying data

One of my hobbies is surveying. I wrote an ambitious program to generate contour maps from surveying data. The program:

  1. Reads a file of observations, most observations being some combination of bearing, range, and elevation angle.
  2. Populates and solves a matrix to determine the position of each point with least-squares error distribution.
  3. Constructs a reasonable set of triangles from the points.
  4. Populates and solves a very large sparse least-squares matrix to model the surface above each triangle as a polynomial function of X and Y.
  5. Draws a map with user-specified lines, labels, and contours.

The large sparse matrix includes constraints to ensure the elevation and slope are the same along each shared triangle side. I tried many different equations to do this and ended up adding a primitive symbolic equation processor to the program to eliminate my error-prone hand derivations.

Unfortunately the program does exactly what I specified but not exactly what I want. It tends to extrapolate elevation and create peaks and valleys that are not justifiable. It works great for map-making, however. If one takes care out in the field, the program can accurately locate points by triangulating bearing observations.
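
Stripped of the sparse-matrix machinery, the bearing-triangulation idea reduces to intersecting rays from known stations; with two bearings it is just a 2x2 linear system, and with more it becomes a small least-squares problem. The station coordinates below are invented:

    /* Illustrative only: locate a point from two bearing observations taken
       at known stations.  Bearings are degrees clockwise from north, x is
       east, y is north, and the station positions are invented.            */
    #include <stdio.h>
    #include <math.h>

    #define DEG_TO_RAD (3.14159265358979323846 / 180.0)

    struct obs { double x, y, bearing_deg; };   /* station and its bearing */

    /* Intersect the two bearing rays; returns 0 on success. */
    static int triangulate(struct obs a, struct obs b, double *px, double *py)
    {
        double dx1 = sin(a.bearing_deg * DEG_TO_RAD);
        double dy1 = cos(a.bearing_deg * DEG_TO_RAD);
        double dx2 = sin(b.bearing_deg * DEG_TO_RAD);
        double dy2 = cos(b.bearing_deg * DEG_TO_RAD);
        double rx = b.x - a.x, ry = b.y - a.y;
        /* Solve  a + t1*d1 = b + t2*d2  for t1 by Cramer's rule. */
        double det = dx2 * dy1 - dx1 * dy2;
        double t1;

        if (fabs(det) < 1e-9)
            return -1;                          /* bearings nearly parallel */
        t1 = (dx2 * ry - rx * dy2) / det;
        *px = a.x + t1 * dx1;
        *py = a.y + t1 * dy1;
        return 0;
    }

    int main(void)
    {
        struct obs a = {   0.0, 0.0,  45.0 };   /* sighting NE from origin  */
        struct obs b = { 100.0, 0.0, 315.0 };   /* sighting NW from 100 m E */
        double x, y;

        if (triangulate(a, b, &x, &y) == 0)
            printf("point at (%.1f, %.1f)\n", x, y);   /* (50.0, 50.0) */
        return 0;
    }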

This map uses bearing observations to chart a trail network inside a lot:

Data file: http://www.heurtley.com/richard/lot_trails.s
Map: http://www.heurtley.com/richard/lot_trails.png

This map reconstructs a surveying chart with bearing and range data:

Data file: http://www.heurtley.com/richard/lots.s
Map: http://www.heurtley.com/richard/lots.png

Voyager planetary probe color composite image generator

Around 1990 NASA and JPL released a set of CD-ROMs containing all the imagery from the two Voyager planetary probes. This gave me the opportunity to do something I had wanted to do ever since I was a nerdy little kid and saw the images in Sky & Telescope magazine: make my own color composites.

First I wrote a program to decompress the Huffman-encoded monochrome images and print out thumbnail images in order to see what was on the CD-ROMs. The DOS-extended compositing program had a GUI and used a mouse to manipulate images and slider bars. The process had a number of steps:

  1. Select three similar images taken with three different filters. Uncompress them and write them to the red, green, and blue "original" files. Run the compositing program and read the "original" files.
  2. Remove the reseau marks on each image. (The probes' cameras have an array of little dabs of metal on the photodetector, leaving black spots.) The position of each reseau mark was published. I filled in the black spots with a 3D approximation of the surrounding pixels.
  3. Align the three images. The images were not taken in quick succession and the planet often rotated a bit between shots. I aligned the images by choosing a point with sharp detail and aligning the red and blue images to the green and saving the point position and offsets. After manually aligning several points spread across the image, the program calculated a set of polynomial offset equations and warped the red and blue images to align with the green.
  4. Reconstruct "true" colors. The probes' filters' transmission characteristics were published. The program calculated the color of each aligned point in XYZ color space and then transformed it to RGB (a sketch of that conversion follows this list).
  5. Contrast enhancement. The last stage was a manual procedure to manipulate the image's histogram. Intensity levels in the original could be "spiked" and then shifted higher or lower. The result was intelligent histogram equalization.
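
For the color-reconstruction step, going from CIE XYZ to RGB is a 3x3 matrix multiply. The sketch below uses the standard sRGB (D65) matrix, which is certainly not the matrix the compositing program used for the Voyager cameras, but it shows the shape of the computation:

    /* Illustrative only: CIE XYZ -> linear RGB via a 3x3 matrix.  The
       coefficients below are the standard sRGB (D65) matrix, not whatever
       the Voyager compositing program actually used.                      */
    #include <stdio.h>

    static void xyz_to_rgb(const double xyz[3], double rgb[3])
    {
        static const double m[3][3] = {
            {  3.2406, -1.5372, -0.4986 },
            { -0.9689,  1.8758,  0.0415 },
            {  0.0557, -0.2040,  1.0570 },
        };
        int i, j;

        for (i = 0; i < 3; i++) {
            rgb[i] = 0.0;
            for (j = 0; j < 3; j++)
                rgb[i] += m[i][j] * xyz[j];
            if (rgb[i] < 0.0) rgb[i] = 0.0;     /* clip out-of-gamut values */
            if (rgb[i] > 1.0) rgb[i] = 1.0;
        }
    }

    int main(void)
    {
        double xyz[3] = { 0.3, 0.4, 0.2 };      /* an arbitrary test color */
        double rgb[3];

        xyz_to_rgb(xyz, rgb);
        printf("R=%.3f G=%.3f B=%.3f\n", rgb[0], rgb[1], rgb[2]);
        return 0;
    }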

The program worked and I'm quite proud of it. I ran it on a 20MHz 80386SX notebook with 8MB of RAM and a 40MB hard disk drive. If I were to revive the program I'd save the intermediate images with floating point intensity values instead of always reducing them to eight bits. I think the reductions caused graininess.

One of the composite images is here:

http://www.heurtley.com/richard/JUP8.BMP

Expanded memory executable linker

The original IBM PC design was limited to 640KB of RAM. An early attempt to add more RAM mapped it into one or more 16KB banks high up in the original 1MB address space. This was called expanded memory.

One of Vermont Creative Software's products was the Vermont Views Designer, a WYSIWYG text-mode form designer. At one time the Designer was getting large enough to limit the number and/or size of the forms loaded into memory, and I was tasked with investigating whether expanded memory could solve the problem.

I did it two ways. First I modified the Designer to load forms into expanded memory. That was a straightforward exercise in replacing pointers with handles and then dereferencing them. Then I wrote a linker that loaded the Designer program itself into expanded memory. The linker scanned the object modules and packed "close" functions, those that called each other, into 16KB chunks. Calls to close functions were handled by the regular CALL instruction. Calls to "far" functions, those in a different 16KB chunk, were redirected to a table in low RAM that called the expanded memory API to map in the appropriate chunk and then jumped to it. All data was kept in low RAM.
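
Reduced to ordinary C, the far-call redirection looks roughly like this; map_chunk() is a hypothetical stand-in for the EMS mapping call the real stubs made, and the real table was generated by the linker rather than written by hand:

    /* Illustrative only: calling a function that lives in a bank-switched
       chunk through a small resident stub.  map_chunk() is a hypothetical
       stand-in for the EMS mapping call the real stubs made.              */
    #include <stdio.h>

    static int current_chunk = -1;

    static void map_chunk(int chunk)          /* pretend EMS page mapping */
    {
        if (chunk != current_chunk) {
            printf("  [mapping in chunk %d]\n", chunk);
            current_chunk = chunk;
        }
    }

    /* The low-RAM stub table: each entry knows its target's chunk. */
    struct stub { int chunk; void (*fn)(void); };

    static void call_far(const struct stub *s)
    {
        map_chunk(s->chunk);                  /* make the code visible    */
        s->fn();                              /* then jump to it          */
    }

    static void report(void)  { printf("report()\n"); }
    static void reindex(void) { printf("reindex()\n"); }

    int main(void)
    {
        struct stub stubs[] = { { 0, report }, { 3, reindex } };

        call_far(&stubs[0]);
        call_far(&stubs[1]);
        call_far(&stubs[0]);   /* remaps chunk 0 before calling report() */
        return 0;
    }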

I did this work seconds before the first of the DOS extenders came out and obsoleted it.

Microcom Networking Protocol

The Microcom Networking Protocol (MNP) was one of several error-correcting protocols competing for standardization in the early days of modems. Anderson-Jacobson hired me to add it to a new line of 2400bps modems. First I designed a Z80 co-processor card that stood between the data terminal equipment (DTE) and the modem, and then I implemented the MNP protocol on it in assembly language. The implementation centered on a little cooperative multi-tasking kernel with high- and low-priority process queues.

Interesting bits were "AT" detection and calculating the optimum frame size.

Hayes-compatible modem commands all begin with the characters "AT". Normally the start bit of the "A" is measured to determine the DTE's bit rate and the rest of the "AT" waveform is ignored. The co-processor board matched the complete waveform and determined the DTE's speed, character size (7 or 8 bits), parity, and stop bit settings. In the process of writing this I discovered that there are two combinations of settings that produce identical "AT" waveforms.

An optimization one can add to an error-correcting network protocol is adjusting the frame size for the error frequency. If the transmission line is noisy, small frames are less likely to be corrupted, but small frames also carry a greater percentage of overhead. I derived an equation to determine the optimum frame size for a given error rate. It involves a square root operation, so I included a little look-up table in the ROM. Later I found a much more rigorous version of the equation in my copy of Tanenbaum's Computer Networks.
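
The textbook version of the calculation (not necessarily the exact equation I derived) goes like this: with an h-bit header, a d-bit data field, and per-bit error probability p, throughput is roughly d/(d+h) * (1 - p(d+h)); setting the derivative with respect to d to zero gives an optimum total frame length of sqrt(h/p) bits.

    /* Illustrative only: the textbook estimate of optimum frame size.  */
    #include <stdio.h>
    #include <math.h>

    static double optimum_frame_bits(double header_bits, double bit_error_rate)
    {
        /* Maximizing  d/(d+h) * (1 - p*(d+h))  over the data size d gives
           a total frame length (header + data) of sqrt(h/p) bits.        */
        return sqrt(header_bits / bit_error_rate);
    }

    int main(void)
    {
        double h = 48.0;                      /* assumed header size, bits */
        int e;

        for (e = 3; e <= 6; e++) {
            double p = pow(10.0, -e);
            double f = optimum_frame_bits(h, p);

            printf("p = 1e-%d  ->  frame ~ %5.0f bits (%4.0f data bytes)\n",
                   e, f, (f - h) / 8.0);
        }
        return 0;
    }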

The MNP protocol lost the standardization battle to what became V.42. Information on the MNP protocol is available at:

http://en.wikipedia.org/wiki/Microcom_Networking_Protocol

DSP 300 baud modem

My first job out of school was working on an innovative eight-bit ISA voice card that had a telephone jack in addition to microphone and speaker jacks. The application program included voice record/playback, telephone answering machine, and voice recognition functions. I was responsible for maintaining the original on-board 8086 and TMS32010 processor programming.

As I got more familiar with the on-board code I started enhancing it and wrote, as a proof of concept, a 300-baud modem. The signal processor generated DTMF tones for dialing; detected dial, ring, and busy tones; and generated and decoded the two 300-baud data tones. The on-board 8086 handled "AT" modem command processing. A host Terminate and Stay Resident (TSR) program provided an INT API to an emulation of an 8250 UART. I then hacked a popular terminal emulator program and replaced the IN and OUT instructions to the UART with INT calls to the TSR. It was a very cool system. Later I reused some of the code and wrote a single-line voice mail system.
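
The transmit side of the data tones is easy to sketch in C (the real code ran in fixed point on the TMS32010; the sample rate and amplitude here are assumptions): Bell 103 sends each bit as one of two audio tones, 1270 Hz for mark and 1070 Hz for space on the originate side, so transmitting is just steering a phase-continuous oscillator at 300 baud.

    /* Illustrative only: generate Bell 103 originate-side FSK samples for
       a framed character.  Mark (1) = 1270 Hz, space (0) = 1070 Hz.  The
       output is raw 16-bit mono PCM at 8000 samples/sec on stdout.        */
    #include <stdio.h>
    #include <math.h>

    #define SAMPLE_RATE 8000.0
    #define BAUD         300.0
    #define PI 3.14159265358979323846

    static void send_bits(const char *bits, FILE *out)
    {
        double phase = 0.0;
        int samples_per_bit = (int)(SAMPLE_RATE / BAUD);

        for (; *bits; bits++) {
            double freq = (*bits == '1') ? 1270.0 : 1070.0;
            int i;

            for (i = 0; i < samples_per_bit; i++) {
                short sample = (short)(12000.0 * sin(phase));

                fwrite(&sample, sizeof(sample), 1, out);
                phase += 2.0 * PI * freq / SAMPLE_RATE;  /* continuous phase */
            }
        }
    }

    int main(void)
    {
        /* 'A' (0x41) framed as start bit, 8 data bits LSB first, stop bit. */
        send_bits("0100000101", stdout);
        return 0;
    }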

The voice card was far ahead of its time and wasn't widely sold. ITT Information Systems made a TV commercial featuring the card, but it was more of a "Look how innovative we are" ad than a "Buy this product" ad.