How audits can optimize a codebase: the Term Finance “clearing price” algorithm

Posted on May 9th, 2023 by Runtime Verification & Term Labs
Posted in Audits


This blog post has been co-authored by the Term Labs and Runtime Verification teams.

Introduction

We are delighted to announce the completion of a successful two-week audit of the Term Finance Protocol. For this engagement, Runtime Verification was tasked with examining a small, critical part of its codebase, with the goal of establishing code correctness, discovering edge-case behavior, and optimizing gas consumption. 

At a high level, Term Finance is a collateralized fixed-rate lending protocol that matches borrowers and lenders through a unique auction process. Lenders submit offers to lend, at or above their respective offer rates. Bidders submit bids to borrow, at or below their respective bid rates. At the conclusion of the auction, the protocol’s smart contracts determine an auction clearing rate at which borrowers and lenders transact. All borrowers willing to pay at or above the clearing rate receive loans and all lenders willing to lend at or below the clearing rate make loans, in each case, at the clearing rate. All other bids and offers are said to be “left on the table” and their tokens are immediately returned to their wallets. At the conclusion of the auction, borrowers receive their loan proceeds and lenders receive ERC-20 tokens, Term Repo Tokens, that are essentially receipt tokens that can be redeemed for principal plus interest at maturity.

The portion of the code under examination for this audit was the algorithm for calculating the clearing price of an auction, which is, intuitively, the price point at which auction demand and supply are in equilibrium. In this setting, we refer to the demand at a given price, p, as the bid volume, bv(p), that is, the sum of the amounts of all bids priced at or above p, and to the supply at a given price, p, as the offer volume, ov(p), that is, the sum of the amounts of all offers priced at or below p.
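
As a concrete illustration of these two definitions, consider the following Python sketch (hypothetical helper functions, not code from the protocol), where bids and offers are represented as (price, amount) pairs:

```python
def bid_volume(bids, p):
    """bv(p): total amount of bids priced at or above p (demand at p)."""
    return sum(amount for price, amount in bids if price >= p)

def offer_volume(offers, p):
    """ov(p): total amount of offers priced at or below p (supply at p)."""
    return sum(amount for price, amount in offers if price <= p)

# Toy example with three bids and three offers as (price, amount) pairs.
bids = [(4, 100), (5, 200), (7, 100)]
offers = [(3, 150), (5, 100), (6, 300)]
print(bid_volume(bids, 5), offer_volume(offers, 5))  # 300 250
```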

During the first few days of the engagement, the auditors at RV examined the code and communicated closely with Term Labs to confirm the algorithm’s intended specification. The algorithm was implemented as a combination of a binary search and a gradient search, with a customizable cut-off parameter that stops the search if insufficient progress was made in the previous iteration. During this initial pass, several untreated edge cases and minor optimizations were identified, and RV observed that the algorithm recomputed bid and offer volumes on every iteration of its search. This suggested an avenue of optimization: reuse previously calculated volumes to reduce computation, following a dynamic-programming approach.

After exploring this idea further, it became apparent that the algorithm could be simplified into a carefully crafted linear search that avoids recomputing bid and offer volumes and removes the need for the cut-off parameter. Both teams agreed to shift the focus of the engagement towards creating the new algorithm while, at the same time, making sure that the original algorithm was properly analyzed.

Theoretical Foundations

To ensure correctness of the new algorithm, Runtime Verification set out to formalize the clearing-price problem in optimization theory, state the main ideas underpinning the new algorithm as theorems, and prove that they hold. In particular, following the design of the original algorithm, the clearing volume of a given price p was defined as the minimum of its bid and offer volumes,

$$cv(p) = \min\big(bv(p),\, ov(p)\big),$$

a pre-clearing price $p_{pc} \in P_{pc}$ as any price that maximizes the clearing volume,

$$P_{pc} = \arg\max_{p}\, cv(p),$$

and a clearing price $p_{c}$ as any pre-clearing price that additionally minimizes the sum of its bid and offer volumes,

$$p_{c} \in \arg\min_{p \in P_{pc}}\, \big(bv(p) + ov(p)\big).$$

Assuming that the algorithm is provided with a list of bids and a list of offers that are sorted by price in ascending order, RV was able to prove that a pre-clearing price can be found in a single pass over the offers and that, from there, the clearing price can be identified in a single additional pass over the bids.

Based on these findings, the new algorithm was implemented in Solidity, with a close correspondence between the mathematical formalization and the implementation. Importantly, the complexity of the implementation is linear in the number of bids and offers, and the bid and offer volumes are each calculated only once.
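
To make the structure concrete, the following Python sketch captures the same idea under the definitions above. It is illustrative only and not the audited Solidity implementation: it assumes candidate prices are restricted to the submitted bid and offer prices, and it accumulates the bid and offer volumes in single passes over the sorted inputs so that no volume is recomputed.

```python
def clearing_price(bids, offers):
    """Illustrative clearing-price search (not the audited Solidity code).

    bids, offers: lists of (price, amount) sorted ascending by price.
    Demand bv(p) = total amount of bids priced >= p.
    Supply ov(p) = total amount of offers priced <= p.
    Maximizes the clearing volume min(bv, ov); ties are broken by
    minimizing bv + ov. Returns (price, cleared volume) or None.
    """
    # Candidate prices: every distinct submitted price, in ascending order
    # (an assumption of this sketch).
    prices = sorted({p for p, _ in bids} | {p for p, _ in offers})

    # ov at each candidate price: one forward pass over the sorted offers.
    ov, total, i = [], 0, 0
    for p in prices:
        while i < len(offers) and offers[i][0] <= p:
            total += offers[i][1]
            i += 1
        ov.append(total)

    # bv at each candidate price: one backward pass over the sorted bids.
    bv, total, j = [0] * len(prices), 0, len(bids) - 1
    for k in range(len(prices) - 1, -1, -1):
        while j >= 0 and bids[j][0] >= prices[k]:
            total += bids[j][1]
            j -= 1
        bv[k] = total

    # Single scan: maximize the clearing volume, then minimize bv + ov.
    best = None
    for k, p in enumerate(prices):
        cv = min(bv[k], ov[k])
        key = (cv, -(bv[k] + ov[k]))
        if best is None or key > best[0]:
            best = (key, p, cv)
    if best is None or best[2] == 0:
        return None  # nothing can clear
    return best[1], best[2]
```

On the toy bids and offers from the earlier sketch, this returns a clearing price of 5 with a cleared volume of 250.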

Experimental Evaluation

To provide further assurance of correctness and gain insight into gas consumption, the auditors at Runtime Verification performed comprehensive testing of the original and the new algorithm in Foundry. Three variants were considered: (i) the original algorithm without cut-off, (ii) the original algorithm with cut-off, and (iii) the new algorithm. The number of bids and offers, n, was taken to be a power of two between 8 and 2048 inclusive; for each such n, 100 tests were randomly generated (that is, n bids and n offers with randomly generated prices and amounts), and the three variants were run on those tests.
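
The shape of this test generation can be pictured with the short Python sketch below; it is purely illustrative, as the actual tests were written in Foundry, and the bounds on prices and amounts are placeholders.

```python
import random

def random_auction(n, max_price=10_000, max_amount=1_000_000):
    """Generate n bids and n offers with random prices and amounts,
    sorted ascending by price as the clearing-price algorithms require."""
    bids = sorted((random.randint(1, max_price), random.randint(1, max_amount))
                  for _ in range(n))
    offers = sorted((random.randint(1, max_price), random.randint(1, max_amount))
                    for _ in range(n))
    return bids, offers

# Sizes 8, 16, ..., 2048, with 100 random auctions per size.
for exponent in range(3, 12):
    n = 2 ** exponent
    for _ in range(100):
        bids, offers = random_auction(n)
        # ...run and compare the three algorithm variants on (bids, offers)...
```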

In all of the tests, the original algorithm without cut-off and the new algorithm agreed on their solutions; Runtime Verification auditors manually inspected a number of these solutions and have confidence that they were indeed optimal. The original algorithm with cut-off, however, did not return the optimal solution in some of the cases, with the number of such discrepancies increasing as the number of bids and offers increases. Importantly, the gas consumption of the new algorithm was substantially lower than that of the original algorithm across all experiment sizes, as illustrated by the following table:

# of bids/offers                    8     16    32    64    128   256   512   1024  2048
Gas decrease (new vs. cut-off)      55%   50%   46%   44%   45%   49%   55%   62%   70%
Gas decrease (new vs. no cut-off)   73%   69%   65%   61%   61%   63%   67%   72%   76%

These experimental results, coupled with the above-presented theoretical results, have ultimately resulted in the new algorithm being adopted by the protocol.

Further optimizations: sorting

Recall that both the original and the new clearing-price algorithm require that bids and offers be sorted in ascending order by price. The protocol’s original approach to ensuring data integrity was to perform this sorting on-chain using quicksort, which has log-linear average-case complexity. After examining the gas consumption of sorting on-chain and recognizing that this sorting is highly likely to be more expensive than the clearing-price algorithm itself, RV suggested replacing the sorting with a sortedness check, which is linear in complexity, thus moving the sorting responsibility to protocol keepers without compromising data integrity. This gas optimization was ultimately adopted and implemented by the Term Finance protocol.
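
Such a sortedness check is a single linear pass over the submissions. As a minimal sketch (in Python here for readability; the protocol’s actual check is written in Solidity, and the function name is hypothetical):

```python
def require_sorted_by_price(entries):
    """Reject the input unless it is sorted ascending by price.

    entries: list of (price, amount) pairs, as in the earlier sketches.
    A single linear pass, versus log-linear work for sorting on-chain."""
    for prev, curr in zip(entries, entries[1:]):
        if prev[0] > curr[0]:
            raise ValueError("input not sorted by price")
```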

Conclusions

What makes this engagement unique and notable is that it was done, from beginning to end, in close collaboration with the Term Labs team. It is a prime example of a collaborative code audit that extends far beyond simple bug identification and delivers significant, tangible gains in gas consumption and code efficiency. With the domain knowledge of the Term Labs team and Runtime Verification’s expertise in formal methods and coding best practices, we were able to achieve a materially improved algorithm as a result of the engagement. It was a pleasure to work on this project with the Term Labs team, and we’re excited to see the algorithm perform once it’s live.

About Term Labs

Term Labs is a blockchain research and development company founded by a team with deep traditional finance, big tech, and decentralized finance experience. Term Labs is dedicated to the development of robust, transparent, and scalable fixed-rate lending on blockchains.

About Runtime Verification

Runtime Verification is a technology startup based in Champaign-Urbana, Illinois. The company uses formal methods to perform security audits on virtual machines and smart contracts on public blockchains. It also provides software testing and verification services and products to improve the safety, reliability, and correctness of software systems in the blockchain field.