Case Study: Bill of Lading Edit Interface using AngularJS for Logistics Company


Our client is a market leader in oil field logistics and transload services.  They own and manage a nationwide network of transload terminals where they store and move millions of pounds of bulk sand, crude, and other materials to and from trucks, railcars, silos, and other containers.  Their services are a critical element in the energy supply chain in North America.

A silo installation at one of the client’s transload terminals. Product is brought in via rail and loaded into the silos with a bucket elevator. Trucks drive under the silos to be loaded before delivering proppant the last mile to the well site.


The client runs transload facilities across the country, with an especially high concentration in the Southwest. At these facilities, they load, unload, and store bulk materials for their customers, primarily frac sand to be used in drilling oil wells. When material is loaded and shipped out from the terminal via truck, the driver is issued a bill of lading. The client manages all of the operations at their terminals with their custom terminal management application, which we helped them build.

A sand truck, waiting to be loaded.

All load actions are tracked in the application, arrival and departure weights for the trucks are captured from the on-site truck scales, and a signature is captured electronically from the driver. The application uses all this data to automatically generate a correct and accurate bill of lading, which is printed out and handed to the driver. A copy is also stored in the cloud as a PDF for later reference.

The bill of lading is automatically generated as a PDF, which can be printed. The driver’s signature is captured electronically and placed on the appropriate section of the document.

The process is highly efficient, allowing them to quickly process trucks at their sites and maintain uniform processes at every location across the company. In fact, the system worked so well that another company approached our client about licensing it to manage their own terminals. However, the partner company required certain specific features before implementing the system at their own facilities.

The most significant feature request was an easier way to modify bills of lading after the fact, as well as tracking those modifications. The bill of lading contains a large assortment of data about the load: customer, service company, product, tare and gross weights, purchase order number, driver name, signature, etc. One strategy used in the terminal management system to eliminate data entry errors is to provide dedicated screens for different steps in the loading process. These dedicated screens only allow entering the specific data relevant to that step (entering order data, receiving a new truck, loading the truck, collecting the final weight and dispatching the truck), and a series of statuses ensures that only the trucks at a given step can be accessed on that screen.

While this structure is very effective at preventing errors, on the rare occasion that an error did occur, resolving it was cumbersome. A user with administrative access would have to manually revert the statuses on the truck to edit the different fields. As the client’s business continued to grow, even a low, steady error rate inevitably grew into a larger administrative burden, and for their partner, it was a non-starter. They needed a way to edit all the relevant details about a truck on a single page.


Making a single page form with all the relevant fields for a truck is trivial. However, the system provides numerous checks to make sure that the order matches the customer, the job matches the order, the product matches all these things, and so on. Additionally, the quantity for each draft (load event) needed to total up to match the net weight calculated from the gross and tare, and vice versa. Since every piece of data could be modified, every other piece of data needed to be checked in real-time to make sure things still added up. Additionally, there were a series of dependent select fields (drop-down menus), where the user’s choice in one menu would affect the available options in other menus.
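The weight consistency check described above can be sketched as a small validation function; field names like `tareWeight`, `grossWeight`, and `drafts` are illustrative assumptions, not the application's actual schema:

```javascript
// Sketch of the kind of cross-check the edit form runs on every change:
// the sum of the draft quantities must match the net weight implied by the
// gross and tare scale weights. The tolerance and field names are invented.
function validateWeights(truck, toleranceLbs = 20) {
  const net = truck.grossWeight - truck.tareWeight;
  const draftTotal = truck.drafts.reduce((sum, d) => sum + d.quantity, 0);
  const errors = [];
  if (truck.grossWeight < truck.tareWeight) {
    errors.push('Gross weight cannot be less than tare weight.');
  }
  if (Math.abs(net - draftTotal) > toleranceLbs) {
    errors.push(
      `Draft total (${draftTotal} lbs) does not match net weight (${net} lbs).`
    );
  }
  return { valid: errors.length === 0, net, draftTotal, errors };
}
```

Because every field is editable, a check like this has to run after any change, not just on submit.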

The core terminal management system was built as a web application, running in the cloud, backed by a relational database, with the user interface rendered in the browser as HTML. Additional UI enhancements were implemented in JavaScript. However, implementing the level of interactivity required for this page that way would have been an overly complex mess. Instead, we opted to use Angular.

Angular is an open-source JavaScript UI framework built by Google and used by a wide range of organizations around the world. Angular allows you to define a scope, a set of JavaScript variables. You can then reference those scope variables in your page template, in functions you write to call on that page, and in event triggers. The variables in the scope are shared around the page, so any time the variable is updated, the references to it are automatically updated as well.

Weights are automatically recalculated based on user input.

We used these capabilities to automatically update the gross, tare, and net weights based on the total of the drafts, keeping the net weight matched to the sum of the drafts. We built an integration to allow Chosen.JS’s searchable select boxes to interact with Angular scope variables. Any time a value was updated in a drop-down menu (Customer, Order, Job, Product), we could automatically trigger updates elsewhere in the form. When a change in value caused the available options in another select box to be modified, we would pull an updated option list from the API via AJAX.
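The recalculation logic can be sketched as a pair of plain functions; in the real form these would be driven by AngularJS scope watches so every dependent field updates as the user types. The names here are illustrative, not the application's actual API:

```javascript
// When a draft quantity is edited, the net weight follows the draft total,
// and the gross follows the net.
function recalcFromDrafts(scope) {
  scope.net = scope.drafts.reduce((sum, d) => sum + d.quantity, 0);
  scope.gross = scope.tare + scope.net;
  return scope;
}

// When the gross or tare scale weight is edited, the net is recomputed
// from the scale weights instead.
function recalcFromScaleWeights(scope) {
  scope.net = scope.gross - scope.tare;
  return scope;
}
```

Keeping both directions of the calculation explicit is what lets every field stay editable while the totals always agree.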

The entire truck loading process laid out on a single screen for easy editing.

One particularly challenging component was dealing with transient load sources. The system tracks which container is used to load each draft in the truck. That way, if there is a contamination or other quality issue with the material on that truck, the system can easily tell you where it loaded from, in order to locate and mitigate the source of the problem. However, trucks are often loaded directly from railcars, which by their nature are mobile. Every day, empty railcars are removed from the facility, and full ones arrive to replace them.

The system provides a list of railcars, silos, and other containers that are available to load from and have the matching product for the truck. However, if you are editing a BOL from a truck that has already left, the current list of load sources may be different from the list at the time that truck was on site. And if you modify the arrival and departure timestamps for the truck, the list of available load sources will change again.

We had to extend the load source API so that you could query based on timestamps and look up which load sources were on-site during that window of time and contained the specific product at that time (because a container might be emptied of one product, and refilled with another). Any change to the truck’s arrival and departure times would trigger another query to this API, causing the list of available load sources to be updated in real-time.
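The time-window lookup can be sketched as a filter over container records; the data shapes (numeric timestamps, `onSiteFrom`/`onSiteTo`, a `productHistory` list) are invented for illustration and are not the actual API models:

```javascript
// Return only the containers that were on site during the truck's
// [arrival, departure] window AND held the required product during that
// window. A null end timestamp means "still on site" / "still holds it".
function loadSourcesForWindow(containers, arrival, departure, productId) {
  return containers.filter((c) =>
    c.onSiteFrom <= departure &&
    (c.onSiteTo === null || c.onSiteTo >= arrival) &&
    c.productHistory.some(
      (p) =>
        p.productId === productId &&
        p.from <= departure &&
        (p.to === null || p.to >= arrival)
    )
  );
}
```

Re-running a query like this whenever the arrival or departure time is edited is what keeps the load source list accurate for trucks that left days or weeks earlier.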

Each draft (load event) is listed, along with quantity, timestamp, and load source (asset). All this data can be edited, and the system will handle the business logic seamlessly.

All of these changes are managed exclusively on the client-side, without modifying the database. Once the user is happy with their changes, they submit the edits to the system. A series of validation checks is run to make sure the business rules are applied correctly, and the bill of lading is updated in the system. A new PDF is generated and stored in the cloud for future reference.

However, since this modified BOL is different from the document the driver signed and was handed at dispatch, the modified document does not bear the driver’s digital signature. Instead, it contains a note indicating that the BOL has been modified, along with an attached log listing all the changes made to the document.

This BOL has been edited, so the driver’s signature is replaced with a note indicating the change.

The system stores the original BOL document, as well as every revised version, with the attached change log. Each version can be downloaded on demand from the application.
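One simple way to produce a per-revision change log is to diff the BOL record before and after the edit. This is a minimal sketch of that idea; the field names are hypothetical and the real log would also capture who made the change and when:

```javascript
// Compare two flat BOL records and list every field whose value changed.
function changeLog(original, edited) {
  const fields = new Set([...Object.keys(original), ...Object.keys(edited)]);
  const entries = [];
  for (const field of fields) {
    if (original[field] !== edited[field]) {
      entries.push({ field, from: original[field], to: edited[field] });
    }
  }
  return entries;
}
```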

All revisions are kept for reference, allowing users to see the entire history of the BOL through each edit.


This project was a huge time saver for the administrative support team tasked with handling these BOL modifications. Changes that were once a considerable headache could be handled in a matter of seconds. Because there were fewer opportunities for error, much of this work could be delegated back to the terminal operations team. This dramatically reduced turnaround time: a fix could be made by a manager who was physically on-site, rather than submitted to a queue of issues for the support team, who previously had to call the terminal and investigate to determine exactly what the correct data should be.

The revision history showing previous versions, and listing the corrections, provided traceability in case of any questions down the road.

Additionally, the same single-page interface also allowed us to implement a “Manual BOL” process. If a facility loses Internet access, they cannot access the terminal management system, since it is hosted in the cloud. However, trucks still need to be loaded and processed. Delaying loads could cause well site operations to stop, which can be extremely costly for our client’s customers, and might even lead to canceled contracts. Thus, if a facility’s Internet access is down, they must revert to a paper BOL form, a copy of which is kept in a file at the terminal. Once Internet access is restored, the paper BOL can be scanned, and the relevant data is entered on the “Manual BOL” screen. This screen is essentially the same form as the BOL editing interface, except that it creates a new bill of lading rather than modifying an existing one.

Finally, this feature was a non-negotiable requirement for the partner who licensed the system, and implementing it allowed the strategic partnership to move forward. This led to deploying the client’s custom terminal management system to over 40 locations across the US and Canada, opening an entirely new revenue stream for the business. It also strengthened the relationship with the client, leading to additional future projects.

CIG Billing Reports UI

Case Study: Billing Reports for Oil Field Logistics Provider



This client operates transload facilities, where they transfer products between trucks, railcars, and on-site storage containers. They specialize in transloading frac sand for the oil and gas industry, particularly in the Southwest, although they handle a variety of materials at locations all over the country.

One of the client’s transload terminals.


The railcars, trucks, and materials they handle do not belong to the logistics service provider, but rather belong to their clients, who either produce and sell materials or consume them. Either way, they need to move large quantities of bulk materials using multiple modes of transport, which our client facilitates.

One of the logistics provider’s transload facilities. Sand is brought in via railcar and unloaded with the mobile conveyor in the center of the image. Trucks can be loaded directly from the railcar with this device, but product may also be unloaded from railcars and stored in containers like the ones shown on the right.

The client’s business is not based on the products they handle, but on the handling itself. As such, they bill their clients based on the weight transported, with rates varying based on location, customer, mode of transport, material type, container type (silo, hopper, warehouse, etc.), and various other factors. Some of these rates are set on a sliding scale based on volume. Some include minimum volumes that must be achieved within a certain time frame.
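A sliding-scale rate like the ones described above can be modeled as marginal volume tiers. The sketch below is one plausible structure; the tier boundaries and rates are invented for illustration, since real rates are negotiated per contract:

```javascript
// Compute a transload charge from marginal volume tiers.
// tiers: [{ upTo, ratePerTon }], sorted ascending; the last tier should
// have upTo: Infinity so all remaining volume is covered.
function transloadCharge(tons, tiers) {
  let remaining = tons;
  let charged = 0; // tons already priced in lower tiers
  let total = 0;
  for (const tier of tiers) {
    const inTier = Math.min(remaining, tier.upTo - charged);
    if (inTier <= 0) continue;
    total += inTier * tier.ratePerTon;
    charged += inTier;
    remaining -= inTier;
  }
  return total;
}
```

For example, with a $10/ton rate up to 100 tons and $8/ton after, 150 tons bills as 100 × 10 + 50 × 8.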

They also bill demurrage fees for railcars stored at their facilities, which vary based on the type of car. Most of these fees are actually charged by the railroad and passed through to the customer. There is typically a certain number of free days, with a daily charge for each day beyond that threshold.
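The free-day structure makes the core demurrage formula simple; the function below sketches it, with the free-day count and daily rate left as parameters since both vary by car type and contract:

```javascript
// Demurrage: no charge within the free-day allowance, then a flat
// daily rate for every day beyond it.
function demurrage(daysOnSite, freeDays, dailyRate) {
  return Math.max(0, daysOnSite - freeDays) * dailyRate;
}
```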

Many of our client’s customers are multi-billion dollar companies, and their requirements can vary greatly. As such, each contract is negotiated separately and will have different terms.

Because of all this, calculating exactly what each customer owes them at the end of the month is an extremely complex task. Data on every truck to visit each facility was being exported into a spreadsheet, and the accounting department would add all the various calculations, manually entering the more complex variations. But as the business grew, this became too time-consuming, and they simply could not keep up.


We built them a system where they could enter all the complex details of the billing agreement for each contract. Then we built a report generation system where they could grab all the transactions for each customer, have the fees automatically calculated, and quickly review them. Once reviewed, they would click a button and all the transactions would be exported to their accounting software, and an invoice would be generated. Later, as the client’s business grew, we modified the export to send the transactions to an enterprise ERP system instead.


This system allowed them to recoup revenue that would have been missed by the manual process, and to easily generate accurate invoices for all their clients with only a couple of accountants, instead of the army of analysts they would otherwise need at this point.

Case Study: Truck Scales Integration for Transload Company



Operators inspect a railcar at one of their transload facilities.


At each of the client’s loading facilities, they operate one or more truck scales: industrial scales large enough to weigh an entire 18-wheeler, tractor and trailer.

A sand truck sitting on the scale platform, being weighed.

When a truck arrives on site, the facility operations staff weigh the truck in, then send it to be loaded somewhere else on site. After loading, the truck is weighed again. So you have an initial weight (tare) and a final weight (gross). The difference gives you the weight actually loaded on the truck while it was on site (minus any fuel consumed during that period).

The scales have a series of pressure-sensitive load cells, all connected to a metal box called an “indicator,” which is located in the small office where the terminal operator sits at a computer.

The client wanted that scale value to feed directly into their custom terminal management system, so the operator wouldn’t have to manually type the number in and risk an incorrect entry. This application, by the way, is a web-based cloud application, not something running locally on the PC.

We found a wide variety of scale indicators from different vendors in use. Some were network-enabled with RJ-45 Ethernet ports or even wireless interfaces, but many had only 9-pin serial ports. We found that if we connected a PC to the serial or Ethernet port on the scale and opened HyperTerminal, the indicator was constantly streaming a steady flow of raw data. However, the format of that data varied depending on the vendor.

In order to streamline development and testing, we built a little service in Node.JS that could simulate the output from various scales.
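The simulator's job is to emit a continuous stream of frames in a given vendor's wire format, with a little jitter to mimic a live scale. This is a minimal sketch of that idea; the two frame formats shown are invented examples, not actual vendor protocols:

```javascript
// Invented example frame formats, one per simulated vendor.
const FORMATS = {
  vendorA: (lbs) => `\x02 ${String(lbs).padStart(6, ' ')} lb G\r\n`,
  vendorB: (lbs) => `WT,${lbs},LB,GROSS\r\n`,
};

// Yield a stream of frames around a base weight. The jitter mimics wind
// and vibration moving the live reading around, as a real scale would.
function* simulateScale(format, baseWeight, frames) {
  for (let i = 0; i < frames; i++) {
    const jitter = Math.round((Math.random() - 0.5) * 10);
    yield FORMATS[format](baseWeight + jitter);
  }
}
```

In the real service, output like this would be written to a TCP socket or serial port so the reader could be tested against it end to end.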


We ended up building a Node.JS service, which would open a socket to the scale indicator, and read the raw data stream. We built separate profiles for the different formats used by different scale vendors, so the service could interpret them. We also added logic to ignore variations caused by wind, and cut down network chatter. Then we deployed this service on a little headless appliance PC at each terminal.
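The parsing layer can be sketched as one profile per vendor format plus a small change filter that suppresses wind-induced jitter, so the service only reports meaningful weight changes. The regexes are invented examples, not actual vendor protocols:

```javascript
// One parsing profile per vendor: a regex that extracts the weight
// from that vendor's raw frame format.
const PROFILES = {
  vendorA: /\x02\s*(\d+)\s*lb/,
  vendorB: /^WT,(\d+),LB/,
};

function parseFrame(profile, frame) {
  const match = PROFILES[profile].exec(frame);
  return match ? parseInt(match[1], 10) : null;
}

// Stateful filter: report a weight only when it has moved more than the
// threshold since the last reported value; otherwise return null.
function makeChangeFilter(thresholdLbs) {
  let lastReported = null;
  return (weight) => {
    if (lastReported === null || Math.abs(weight - lastReported) > thresholdLbs) {
      lastReported = weight;
      return weight;
    }
    return null;
  };
}
```

Filtering at the edge like this is what cut down the network chatter: small fluctuations never leave the appliance PC.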

Initially, the service simply provided a REST API to allow us to request scale weights on demand, but this required port forwarding at each location, which was not always possible, and when it was, made setup more complicated. Later, we reworked it so the service would push the weight data up to the cloud whenever there was a change.

If the scale management microservice is configured to push weight data, it simply makes a REST API call to PropLogistics whenever the weight is updated. However, the microservice also provides a simple REST API from which the current weight, along with other relevant scale data, can be requested on demand.

Either way, the terminal operator sitting at the PC entering data about the truck could simply click a button and capture the weight from the scale inside the web application, cutting data entry errors dramatically and making the data entry process much smoother.

With the click of a button in the terminal management application, terminal operators can capture real-time weights from any scale connected to the Scaleman system.


The weights collected from these truck scales are used to automatically generate a bill of lading for each truck departing one of the client’s transload facilities. Feeding the scale data directly into the client’s custom web application not only saved employees time and improved truck processing efficiency, it also eliminated a source of errors, allowing the client to have confidence that the information printed on their bills of lading is correct.

Case Study: Signature Pad Integration for a Transload Company



One of the client’s transload facilities.


The client has a proprietary cloud-based operations management application they use to track all material and asset movements at each of their facilities.  A key part of this process requires capturing bill of lading signatures from truck drivers before they leave the facility. Different facilities require different configurations, based on available equipment, facilities, and staffing.  Some facilities use desktop PCs with a USB signature pad, while others use a ruggedized mobile device with a touchscreen.  Still others use custom-built kiosks with larger touchscreens.  No matter the hardware used, signatures need to be captured and stored in a consistent format, and must be easily retrieved later for auditing and verification purposes.

A USB signature pad can be plugged in to a desktop PC and used to collect driver signatures.


Using an open source jQuery signature plugin and the proprietary SDK from the device manufacturer, we built a single, reusable component which allows signatures to be captured via any method (signature pad, touch screen, or mouse drawing) and saved to the app’s datastore as a PNG file.

Trucks are weighed on a truck scale to capture the final gross weight before generating the bill of lading. Once this data is collected, the driver signs the BOL electronically.

If the signature pad is installed, impressions are captured in real time and rendered on the screen.  If the signature pad is not installed, the system falls back gracefully: a message informing the user how to install the signature pad can be displayed, while the other signature methods remain available.
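The fallback decision can be sketched as simple capability detection; the flags here are simplified stand-ins for the real SDK checks and browser feature detection:

```javascript
// Pick the best available signature capture method at runtime.
// Flag names (signaturePadSdkLoaded, padConnected, touchScreen) are
// illustrative assumptions, not the component's actual API.
function chooseSignatureMethod(env) {
  if (env.signaturePadSdkLoaded && env.padConnected) return 'signature-pad';
  if (env.touchScreen) return 'touch';
  return 'mouse'; // drawing with the mouse always works in the browser
}
```

Centralizing the decision in one place is what let later hardware, like the touch screen kiosks, work without additional development.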

Driver signatures are captured in the web app, along with the other truck data. This box supports both physical signature pads like the one pictured above and signing with a mouse or touch screen.


Our client no longer has to be concerned about signatures when planning deployments.  Any possible situation can be handled with minimal overhead, whether they are using desktop computers or mobile handhelds.  Not only that, but some time after this solution was deployed, the transload company began installing kiosks with large, desktop-sized touch screens at certain facilities.  Because of the flexibility of the signature solution, the kiosks were able to support on-screen signature capture without any additional development.  Whether a facility uses touch screen kiosks, handheld devices, or a signature pad attached to a desktop PC, the application handles it seamlessly.  Digital signature capture for Bill of Lading documents is available in any scenario.

Kashoo PHP GitHub page

Project Profile: Kashoo PHP

Our client had an extensive custom internal back office application written on the LAMP stack which needed to be integrated with Kashoo Simple Cloud Accounting. We built a wrapper library in PHP to interface with Kashoo’s REST API, and allow easy integration with their existing tool set.

The client graciously allowed us to release the library to the community as an open source project on GitHub.


Project Profile: FaithVillage

FaithVillage is a faith-based social network and syndicated content delivery application.  We worked on numerous aspects of the application, from integrating a customized Magento store, to overhauling the messaging and notification systems, the URL routing system, and the user authentication system, to building a new calendar system with support for invites and scheduling.  On this project, we worked with PHP, MySQL, JavaScript, Zend Framework (full stack MVC), Doctrine ORM, MooTools, jQuery, Magento, git, and GitHub, among others.

Vendor Invoice Portal

Project Profile: Vendor Invoice Portal

We created this small web app for a client.  They needed a solution to allow their vendors and contractors to submit and manage invoices.  The application provided a web interface for third party vendors to access, as well as a FileMaker interface for back office access, all backed by a MySQL database.

For this project, we used PHP, MySQL, Symfony, JavaScript, jQuery, Twitter Bootstrap, and FileMaker Pro.

Locals Know

Project Profile: Locals Know

Locals Know is a social travel application for the iOS and Android platforms.  We worked on a web-based RESTful API used by the mobile applications to pull data from the server.  Much of the structure for the API was already built by a previous developer.  However, the existing code base had several outstanding issues; most notably, the database schema did not match the intended design of the app.  We restructured the database, wrote migrations, modified the API responses, and fixed several other bugs.  Beyond bug fixes, we also added API end points for new features and helped troubleshoot issues with the API requests from the iOS application.

As well as dealing with the RESTful API, we built a client-facing web interface to allow users to view this content outside of the native mobile apps.  Using the responsive design features in Twitter Bootstrap, we built this web interface to render on desktop web browsers, tablets, and mobile browsers.

In addition, to support this web interface, and the social nature of the app, we built a custom URL shortener system to aid in sharing content via Twitter and Facebook.  The URL shortener created compact URLs, uniform in length, with no discernible pattern, guaranteed to be unique.
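The shortener's code generation can be sketched as fixed-length random base-62 draws with a uniqueness check against already-issued codes. The original was built in PHP backed by MySQL; this JavaScript version illustrates the same approach, with `isTaken` standing in for the database lookup:

```javascript
const ALPHABET =
  '0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz';

// Generate a fixed-length code with no discernible pattern. Retry until we
// draw one that has not been issued before; with 62^7 possibilities for a
// 7-character code, collisions are rare and the loop almost always runs once.
function shortCode(length, isTaken) {
  let code;
  do {
    code = Array.from({ length }, () =>
      ALPHABET[Math.floor(Math.random() * ALPHABET.length)]
    ).join('');
  } while (isTaken(code));
  return code;
}
```

A uniqueness constraint in the database would back up the in-application check, guaranteeing no duplicate is ever stored.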

For this project we made use of PHP, MySQL, Yii MVC Framework, JavaScript, jQuery, Google Maps API, and Twitter Bootstrap.

Project Profile: LiquiTraq

Partnering with our client, we worked on an ongoing basis managing the database and IT systems for Liquis, an asset management and corporate liquidation company.  We built a custom FileMaker application which managed nearly every aspect of the company’s day-to-day operations.  This included tracking incoming trucks, pallets, and assets; automatically importing product photos for online sales; ODBC import/export with Blackthorne, an MSSQL-based eBay listing tool; complex financial report generation; and automated pick and shipping systems.  We also built a web interface allowing clients to log in and view the complete status of items entrusted to the company in real time.  During this project, we used FileMaker Pro, PHP, MySQL, MS SQL, eBay Blackthorne, ODBC, rsync, Groovy, the UPS Shipping XML API, and Java printing APIs, among other technologies.

Project Profile: Interspire Email Marketer

Interspire Email Marketer is a leading application for managing large scale email marketing campaigns.  We worked on two maintenance releases of the product.  In both releases we documented and resolved numerous bugs and user experience issues.  We also sourced, trained, and managed a distributed team of support engineers to handle customer support tickets.  For this project, we worked with PHP, MySQL, Postfix, JavaScript, jQuery, Atlassian Jira, and SVN.