Case Study: Bill of Lading Edit Interface using AngularJS for Logistics Company


Our client is a market leader in oil field logistics and transload services.  They own and manage a nationwide network of transload terminals where they store and move millions of pounds of bulk sand, crude, and other materials to and from trucks, railcars, silos, and other containers.  Their services are a critical element in the energy supply chain in North America.

A silo installation at one of the client’s transload terminals. Product is brought in via rail and loaded into the silos with a bucket elevator. Trucks drive under the silos to be loaded before delivering proppant the last mile to the actual well site.


The client runs transload facilities across the country, with an especially high concentration in the Southwest. At these facilities, they load, unload, and store bulk materials for their customers, primarily frac sand to be used in drilling oil wells. When material is loaded and shipped out from the terminal via truck, the driver is issued a bill of lading. The client manages all of the operations at their terminals with their custom terminal management application, which we helped them build.

A sand truck, waiting to be loaded.

All load actions are tracked in the application, arrival and departure weights for the trucks are captured from the on-site truck scales, and a signature is captured electronically from the driver. The application uses all this data to automatically generate a correct and accurate bill of lading, which is printed out and handed to the driver. A copy is also stored in the cloud as a PDF for later reference.

The bill of lading is automatically generated as a PDF, which can be printed. The driver’s signature is captured electronically and placed on the appropriate section of the document.

The process is highly efficient, allowing them to quickly process trucks at their sites and maintain uniform processes at every location across the company. In fact, the system worked so well that another company approached our client about licensing it to manage their own terminals. However, the partner company required certain specific features before implementing the system at their own facilities.

The most significant feature request was an easier way to modify bills of lading after the fact, as well as tracking those modifications. The bill of lading contains a large assortment of data about the load: customer, service company, product, tare and gross weights, purchase order number, driver name, signature, etc. One strategy used in the terminal management system to eliminate data entry errors is to provide dedicated screens for different steps in the loading process. These dedicated screens only allow entering the specific data relevant to that step (entering order data, receiving a new truck, loading the truck, collecting the final weight and dispatching the truck), and a series of statuses ensures that only the trucks at a given step can be accessed on that screen.

While this structure is very effective at preventing errors, on the rare occasion that an error did occur, resolving it was cumbersome. A user with administrative access had to manually revert the statuses on the truck in order to edit the different fields. As the client’s business continued to grow, even a steady, low error rate inevitably became a larger administrative task, and for their partner it was a non-starter. They needed a way to edit all the relevant details about a truck on a single page.


Making a single-page form with all the relevant fields for a truck is trivial. However, the system provides numerous checks to make sure that the order matches the customer, the job matches the order, the product matches all of these, and so on. Additionally, the quantity for each draft (load event) needed to total up to match the net weight calculated from the gross and tare, and vice versa. Since every piece of data could be modified, every other piece of data needed to be checked in real time to make sure things still added up. Finally, there were a series of dependent select fields (drop-down menus), where the user’s choice in one menu would affect the available options in other menus.

The core terminal management system was built as a web application running in the cloud, backed by a relational database, with the user interface rendered in the browser as HTML. Additional UI enhancements were implemented in JavaScript. However, the level of interactivity required for this page would have been an overly complex mess to implement that way. Instead, we opted to use AngularJS.

AngularJS is an open-source JavaScript UI framework built by Google and used by a wide range of organizations around the world. AngularJS lets you define a scope: a set of JavaScript variables. You can then reference those scope variables in your page template, in functions you write for that page, and in event triggers. The variables in the scope are shared across the page, so any time a variable is updated, every reference to it is automatically updated as well.

Weights are automatically recalculated based on user input.

We used these capabilities to automatically update the gross, tare, and net weights based on the total of the drafts, keeping the net weight matched to the sum of the drafts. We built an integration to allow Chosen.JS’s searchable select boxes to interact with AngularJS scope variables. Any time a value was updated in a drop-down menu (Customer, Order, Job, Product), we could automatically trigger updates elsewhere in the form. When a change in value modified the available options in another select box, we would pull an updated option list from the API via AJAX.
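The recalculation itself is simple arithmetic; the value of AngularJS was keeping it live as the user typed. A minimal sketch of the logic, with function and field names invented for illustration rather than taken from the production code:

```javascript
// Net weight is the sum of all draft quantities (in pounds).
function netFromDrafts(drafts) {
  return drafts.reduce(function (sum, d) { return sum + d.quantity; }, 0);
}

// Gross = tare (empty truck) + net (product loaded).
function grossWeight(tare, net) {
  return tare + net;
}

// In the page controller, a deep $scope.$watch would keep the fields
// in sync as the user edited any draft:
//
//   $scope.$watch('drafts', function (drafts) {
//     $scope.net = netFromDrafts(drafts);
//     $scope.gross = grossWeight($scope.tare, $scope.net);
//   }, true);

var drafts = [{ quantity: 24500 }, { quantity: 23750 }];
var net = netFromDrafts(drafts);     // 48250
var gross = grossWeight(33000, net); // 81250
```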

The entire truck loading process laid out on a single screen for easy editing.

One particularly challenging component was dealing with transient load sources. The system tracks which container is used to load each draft in the truck. That way, if there is a contamination or other quality issue with the material on that truck, the system can easily tell you where it loaded from, in order to locate and mitigate the source of the problem. However, trucks are often loaded directly from railcars, which by their nature are mobile. Every day, empty railcars are removed from the facility, and full ones arrive to replace them. The system provides a list of railcars, silos, and other containers that are available to load from and contain the matching product for the truck. However, if you are editing a BOL for a truck that has already left, the current list of load sources may differ from the list at the time that truck was on site. And if you modify the arrival and departure timestamps for the truck, the list of available load sources changes again.

We had to extend the load source API so that you could query based on timestamps, looking up which load sources were on-site during that window of time and contained the specific product at that time (because a container might be emptied of one product and refilled with another). Any change to the truck’s arrival and departure times would trigger another query to this API, updating the list of available load sources in real time.
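The extended query can be pictured as a filter over load source records that carry on-site and product-content intervals. This is a hypothetical illustration only; the field names and exact matching rules are assumptions, and the real filtering ran server-side against the database:

```javascript
// A load source qualifies if it was on-site for the truck's entire
// visit and held the requested product during that window. Timestamps
// are epoch milliseconds; null means "still on-site" / "still holding".
function loadSourcesForWindow(sources, arrival, departure, product) {
  return sources.filter(function (s) {
    var onSite = s.arrivedAt <= arrival &&
      (s.departedAt === null || s.departedAt >= departure);
    var hadProduct = s.productIntervals.some(function (p) {
      return p.product === product &&
        p.from <= arrival &&
        (p.to === null || p.to >= departure);
    });
    return onSite && hadProduct;
  });
}
```

In the page controller, a watcher on the arrival and departure fields would re-issue this query through the API and replace the options in the load source drop-downs.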

Each draft (load event) is listed, along with quantity, timestamp, and load source (asset). All this data can be edited, and the system will handle the business logic seamlessly.

All of these changes are managed exclusively on the client side, without modifying the database. Once the user is happy with their changes, they submit the edits to the system. A series of validation checks run to ensure the business rules are applied correctly, and the bill of lading is updated in the system. A new PDF is generated and stored in the cloud for future reference.

However, since this modified BOL differs from the document the driver signed and was handed at dispatch, the modified document does not bear the driver’s digital signature. Instead, it contains a note indicating that the BOL has been modified, along with an attached log listing all the changes made to the document.

This BOL has been edited, so the driver’s signature is replaced with a note indicating the change.

The system stores the original BOL document, as well as every revised version, with the attached change log. Each version can be downloaded on demand from the application.

All revisions are kept for reference, allowing users to see the entire history of the BOL through each edit.


This project was a huge time saver for the administrative support team tasked with handling these BOL modifications. Changes that were once a considerable headache could be handled in a matter of seconds. Because there were fewer opportunities for error, much of this work could be delegated back to the terminal operations team, which dramatically reduced turnaround time: the fix could be made by a manager who was physically on-site, rather than submitted to a queue of other issues for the support team, who would previously have to call the terminal and investigate to determine exactly what the correct data should be.

The revision history showing previous versions, and listing the corrections, provided traceability in case of any questions down the road.

Additionally, the same single-page interface allowed us to implement a “Manual BOL” process. If a facility loses Internet access, they cannot access the terminal management system, since it is hosted in the cloud. However, trucks still need to be loaded and processed. Delaying loads could cause well site operations to stop, which can be extremely costly for our client’s customers, and might even lead to canceled contracts. Thus, if a facility’s Internet access is down, they revert to a paper BOL form, a copy of which is kept in a file at the terminal. Once Internet access is restored, the paper BOL can be scanned, and the relevant data is entered on the “Manual BOL” screen. This screen is essentially the same form as the BOL editing interface, except that a new bill of lading is created rather than an existing one being modified.

Finally, this feature was a non-negotiable requirement for the partner who licensed the system, and implementing it allowed the strategic partnership to move forward. This led to deploying the client’s custom terminal management system to over 40 locations across the US and Canada, opening an entirely new revenue stream for the business. It also strengthened the partnership, leading to additional future projects.


Case Study: Billing Reports for Oil Field Logistics Provider



This client operates transload facilities, where they transfer products between trucks, railcars, and on-site storage containers. They specialize in transloading frac sand for the oil and gas industry, particularly in the Southwest, although they handle a variety of materials at locations all over the country.

One of the client’s transload terminals.


The railcars, trucks, and materials they handle do not belong to the logistics service provider, but rather belong to their clients, who either produce and sell materials or consume them. Either way, they need to move large quantities of bulk materials using multiple modes of transport, which our client facilitates.

One of the logistics provider’s transload facilities. Sand is brought in via railcar and unloaded with the mobile conveyor in the center of the image. Trucks can be loaded directly from the railcar with this device, but product may also be unloaded from railcars and stored in containers like the ones shown on the right.

The client’s business is not based on the products they handle, but on the handling itself. As such, they bill their clients based on the weight transported, with rates varying based on location, customer, mode of transport, material type, container type (silo, hopper, warehouse, etc.), and various other factors. Some of these rates are set on a sliding scale based on volume. Some include minimum volumes that must be achieved within a certain time frame.

They also bill demurrage fees for railcars stored at their facilities, which vary based on the type of car. Most of these fees are actually charged by the railroad and passed through to the customer. There is typically a certain number of free days, with a daily charge for each day beyond that threshold.
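Each individual rule is simple; the difficulty lay in their number and variety. Two of the rule shapes described above, sketched as plain functions (the rates, tiers, and field names are invented for illustration, not taken from any actual contract):

```javascript
// Sliding-scale transload rate: pick the per-ton rate for the volume
// bracket the customer's monthly total falls into. upToTons === null
// marks the open-ended top tier.
function slidingScaleRate(tiers, totalTons) {
  for (var i = 0; i < tiers.length; i++) {
    if (tiers[i].upToTons === null || totalTons <= tiers[i].upToTons) {
      return tiers[i].ratePerTon;
    }
  }
}

// Railcar demurrage: a number of free days, then a daily charge for
// each day beyond that threshold.
function demurrageCharge(daysOnSite, freeDays, dailyRate) {
  return Math.max(0, daysOnSite - freeDays) * dailyRate;
}

var exampleTiers = [
  { upToTons: 10000, ratePerTon: 2.50 },
  { upToTons: null,  ratePerTon: 2.00 }
];
slidingScaleRate(exampleTiers, 8500);  // 2.50
slidingScaleRate(exampleTiers, 12000); // 2.00
demurrageCharge(9, 5, 75);             // 4 billable days * $75 = 300
```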

Many of our client’s customers are multi-billion dollar companies, and their requirements can vary greatly. As such, each contract is negotiated separately and will have different terms.

Because of all this, calculating exactly what each customer owes them at the end of the month is an extremely complex task. Data on every truck to visit each facility was being exported into a spreadsheet, and the accounting department would add all the various calculations, manually entering the more complex variations. But as the business grew, this became too time-consuming, and they simply could not keep up.


We built them a system where they could enter all the complex details of the billing agreement for each contract. Then we built a report generation system where they could grab all the transactions for each customer, have the fees automatically calculated, and quickly review them. Once reviewed, they would click a button and all the transactions would be exported to their accounting software, and an invoice would be generated. Later, as the client’s business grew, we modified the export to send the transactions to an enterprise ERP system instead.


This system allowed them to recoup revenue that the manual process would have missed, and to easily generate accurate invoices for all their clients with only a couple of accountants, instead of the army of analysts they would otherwise need at their current volume.

Case Study: Industrial Automation (SCADA) and Web Application Integration for a Transloading and Logistics Provider



One of the client’s transload terminals.


This transload company contracted with one of their clients to install four 200 ft silos to store frac sand and more quickly process inbound and outbound loads at one of their facilities. The silo installation included a pit and bucket elevator system for unloading railcars, truck scales and gate valves for loading trucks, and an automation system to control all of this equipment remotely from an on-site scale house.

The top of a silo, taken from an adjacent silo. This is part of a four-pack. These silos are actually at a different location, but are built to the same specifications and are nearly identical in appearance.

The facility is owned and operated by the transload provider, but the silos would be owned by the client and used exclusively for their product. The silos were constructed because this was one of the busiest terminals in the region, with an extremely high volume of trucks.

The facility is located on one of the major thoroughfares of a small town, and the long line of trucks waiting to be loaded had blocked traffic through the entire town, much to the annoyance of local residents.

The silos were designed to be managed using a piece of industrial automation software called a SCADA (short for “supervisory control and data acquisition”) system. This significant investment in mechanical equipment and software promised to increase throughput for the terminal. However, since the SCADA was provided by their client, the transload company had no control over the software and limited access to its data.

The silos have a bucket conveyor system which carries product from ground level up to the top of the installation and deposits it in the appropriate silo via these conveyor legs.

Our client has a custom terminal management application, which we helped them build, and which manages inventory, bills of lading, transload billing, and all other aspects of on-site terminal operations. It also feeds data to report systems used by management and customers to make strategic decisions.

A view of the silo installation from below. The silos are just under 200 ft tall.

Any loads into and out of the silos needed to be tracked in the terminal management application in order to maintain accurate inventory, and to generate correct bills of lading. Additionally, the terminal management application needed to feed data into the SCADA system about the trucks being loaded through the silos, to ensure the correct product was loaded on each truck.


The SCADA system implemented at this site had limited integration capabilities. However, it was configured to write out a log of trucks unloaded and railcars loaded to a local MySQL database, in two separate tables. The system also had the ability to consume an XML feed of incoming trucks.

We had previously built the client a custom service in Node.js. It ran on an appliance installed at each of their facilities and collected data from truck scales, interfacing with the terminal management application in the cloud through a REST API.

We modified this service, adding a microservice to provide the XML feed the SCADA required, pulling data from the terminal management application. We also added a service to repeatedly poll the SCADA database for new and updated records in the relevant tables. Any new loads were translated into the format required by the terminal management system’s REST API and forwarded to that system in as close to real time as possible.
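The core of the polling loop can be sketched roughly as follows. This is a simplified illustration: the column and field names are hypothetical, and the real service also had to handle updated records, retries, and connection failures:

```javascript
// Track the highest row id already forwarded. On each poll, translate
// any newer SCADA rows into the shape the terminal management REST API
// expects and return them for posting.
function extractNewLoads(rows, lastSeenId) {
  return rows
    .filter(function (r) { return r.id > lastSeenId; })
    .map(function (r) {
      return {
        externalId: r.id,
        truckNumber: r.truck_no,
        quantityLbs: r.weight,
        loadedAt: r.loaded_at
      };
    });
}

// Each polling cycle (e.g. every few seconds):
//   1. SELECT the latest rows from the SCADA's truck-load table.
//   2. var fresh = extractNewLoads(rows, state.lastSeenId);
//   3. POST each item to the terminal management API.
//   4. Advance state.lastSeenId to the highest id forwarded.
```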

Drafts loaded using the silo automation system are automatically imported into PropLogistics, and show up on the loading screen here.


This project had a very tight deadline, but we were able to execute quickly and deploy our solution within just a few days. Our client’s operators on-site were able to run the automation system and load trucks far more quickly, increasing throughput while decreasing traffic.

The automation system received the truck and product data it needed to correctly assign products. The operations team was able to generate bills of lading through the terminal management application, utilizing all the optimizations already present in that system.

Since all load and unload events occurring at the silos were automatically logged in the terminal management application, they were able to track inventory without manual double entry, saving time and avoiding errors.

The transload provider was able to exceed expectations for their client, and the terminal became the most productive site in the client’s entire logistics network.

Case Study: Signature Pad Integration for a Transload Company



One of the client’s transload facilities.


The client has a proprietary cloud-based operations management application they use to track all material and asset movements at each of their facilities.  A key part of this process requires capturing bill of lading signatures from truck drivers before they leave the facility. Different facilities require different configurations, based on available equipment, facilities, and staffing.  Some facilities use desktop PCs with a USB signature pad, while others use a ruggedized mobile device with a touchscreen.  Still others use custom-built kiosks with larger touchscreens.  No matter the hardware used, signatures need to be captured and stored in a consistent format, and must be easily retrieved later for auditing and verification purposes.

A USB signature pad can be plugged in to a desktop PC and used to collect driver signatures.


Using an open-source jQuery signature plugin and the proprietary SDK from the device manufacturer, we built a single, reusable component which allows signatures to be captured via any method (signature pad, touch screen, or mouse drawing) and saved to the app’s datastore as a PNG file.

Trucks are weighed on a truck scale to capture the final gross weight before generating the bill of lading. Once this data is collected, the driver signs the BOL electronically.

If the signature pad is installed, impressions are captured in real time and rendered on the screen. If the signature pad is not installed, the system falls back gracefully: a message can be displayed informing the user how to install the signature pad, while the other signature methods remain available.
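The graceful fallback amounts to picking a capture method from whatever the environment offers. A hypothetical sketch of that decision; the real component probed for the pad via the manufacturer’s SDK, and these flags are purely illustrative:

```javascript
// Choose the best available signature capture method, falling back
// gracefully when the preferred hardware is absent.
function chooseSignatureMethod(env) {
  if (env.padSdkLoaded && env.padConnected) return 'signature-pad';
  if (env.hasTouch) return 'touch';
  return 'mouse'; // drawing into the jQuery signature widget
}

chooseSignatureMethod({ padSdkLoaded: true, padConnected: true, hasTouch: false });
// 'signature-pad'
chooseSignatureMethod({ padSdkLoaded: false, padConnected: false, hasTouch: false });
// 'mouse' (and the UI can show instructions for installing the pad)
```

Whichever method is chosen, the component renders the strokes to the same canvas and exports the same PNG, so the rest of the application never needs to know which hardware was used.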

Driver signatures are captured in the web app, along with the other truck data. This box supports both physical signature pads like the one pictured above, as well as signing with a mouse or touch screen.


Our client no longer has to be concerned about signatures when planning deployments.  Any possible situation can be handled with minimal overhead, whether they are using desktop computers or mobile handhelds.  In fact, some time after this solution was deployed, the transload company began installing kiosks with large, desktop-sized touch screens at certain facilities.  Because of the flexibility of the signature solution, the kiosks were able to support on-screen signature capture without any additional development.  Whether a facility uses touch screen kiosks, handheld devices, or a signature pad attached to a desktop PC, the application handles it seamlessly, and digital signature capture for bill of lading documents is available in every scenario.


Project Profile: FaithVillage

FaithVillage is a faith-based social network and syndicated content delivery application.  We worked on numerous aspects of the application, from integrating a customized Magento store, to overhauling the messaging and notification systems, the URL routing system, and the user authentication system, to building a new calendar system with support for invites and scheduling.  On this project, we worked with PHP, MySQL, JavaScript, Zend Framework (full stack MVC), Doctrine ORM, MooTools, jQuery, Magento, git, and GitHub, among others.


Project Profile: Vendor Invoice Portal

We created this small web app for a client.  They needed a solution to allow their vendors and contractors to submit and manage invoices.  The application provided a web interface for third party vendors to access, as well as a FileMaker interface for back office access, all backed by a MySQL database.

For this project, we used PHP, MySQL, Symfony, JavaScript, jQuery, Twitter Bootstrap, and FileMaker Pro.


Project Profile: Locals Know

Locals Know is a social travel application for the iOS and Android platforms.  We worked on a web-based RESTful API used by the mobile applications to pull data from the server.  Much of the structure for the API had already been built by a previous developer, but the existing code base had several outstanding issues; most significantly, the database schema did not match the intended design of the app.  We restructured the database, wrote migrations, modified the API responses, and fixed several other bugs.  Beyond bug fixes, we also added API endpoints for new features and helped troubleshoot issues with the API requests from the iOS application.

As well as dealing with the RESTful API, we built a client-facing web interface to allow users to view this content outside of the native mobile apps.  Using the responsive design features in Twitter Bootstrap, we built this web interface to render on desktop web browsers, tablets, and mobile browsers.

In addition, to support this web interface, and the social nature of the app, we built a custom URL shortener system to aid in sharing content via Twitter and Facebook.  The URL shortener created compact URLs, uniform in length, with no discernible pattern, guaranteed to be unique.
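One common way to get compact, uniform-length codes with no discernible pattern is to generate fixed-length random base62 strings and retry on collision. The sketch below illustrates the idea; the production scheme is not documented here and may have differed:

```javascript
var ALPHABET =
  '0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz';

// One random base62 code of the given length.
function randomCode(length) {
  var code = '';
  for (var i = 0; i < length; i++) {
    code += ALPHABET[Math.floor(Math.random() * ALPHABET.length)];
  }
  return code;
}

// Retry until the code is unused; in practice the "taken" check is a
// unique index on the short-URL table.
function uniqueCode(taken, length) {
  var code;
  do { code = randomCode(length); } while (taken.has(code));
  taken.add(code);
  return code;
}
```

With six characters there are 62^6 (roughly 57 billion) possible codes, so random generation stays collision-free in practice until the table grows very large.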

For this project we made use of PHP, MySQL, Yii MVC Framework, JavaScript, jQuery, Google Maps API, and Twitter Bootstrap.

Project Profile: LiquiTraq

Partnering with our client, we worked on an ongoing basis managing the database and IT systems for Liquis, an asset management and corporate liquidation company.  We built a custom FileMaker application which managed nearly every aspect of the company’s day-to-day operations.  This included tracking of incoming trucks, pallets, and assets; automatically importing product photos for online sales; ODBC import/export with Blackthorne, an MSSQL-based eBay listing tool; complex financial report generation; and automated pick and shipping systems.  We also built a web interface allowing clients to log in and view the complete status of items entrusted to the company in real time.  During this project, we used FileMaker Pro, PHP, MySQL, MS SQL, eBay Blackthorne, ODBC, rsync, Groovy, the UPS Shipping XML API, and Java printing APIs, among other technologies.

Project Profile: Interspire Email Marketer

Interspire Email Marketer is a leading application for managing large scale email marketing campaigns.  We worked on two maintenance releases of the product.  In both releases we documented and resolved numerous bugs and user experience issues.  We also sourced, trained, and managed a distributed team of support engineers to handle customer support tickets.  For this project, we worked with PHP, MySQL, Postfix, JavaScript, jQuery, Atlassian Jira, and SVN.