
Node.js

tigerbeetle-node

The TigerBeetle client for Node.js.

Prerequisites

Linux >= 5.6 is the only production environment we support, but for ease of development we also support macOS and Windows.

  • Node.js >= 18

Setup

First, create a directory for your project and cd into the directory.

Then, install the TigerBeetle client:

npm install --save-exact tigerbeetle-node

Now, create main.js and copy this into it:

const { id } = require("tigerbeetle-node");
const { createClient } = require("tigerbeetle-node");

console.log("Import ok!");

Finally, run it:

node main.js

Now that all prerequisites and dependencies are correctly set up, let's dig into using TigerBeetle.

Sample projects

This document is primarily a reference guide to the client. Below are various sample projects demonstrating features of TigerBeetle.

  • Basic: Create two accounts and transfer an amount between them.
  • Two-Phase Transfer: Create two accounts and start a pending transfer between them, then post the transfer.
  • Many Two-Phase Transfers: Create two accounts and start a number of pending transfers between them, posting and voiding alternating transfers.

Sidenote: BigInt

TigerBeetle uses 64-bit integers for many fields while JavaScript's builtin Number maximum value is 2^53-1. The n suffix in JavaScript means the value is a BigInt. This is useful for literal numbers. If you already have a Number variable though, you can call the BigInt constructor to get a BigInt from it. For example, 1n is the same as BigInt(1).
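For instance (a standalone illustration, independent of the TigerBeetle client):

```javascript
// BigInt literals use the `n` suffix; BigInt() converts an existing Number.
const amount = 10n; // A BigInt literal.
const fromNumber = BigInt(10); // A BigInt converted from a Number.
console.log(amount === fromNumber); // true

// Number cannot exactly represent integers above 2^53 - 1; BigInt can.
console.log(Number.MAX_SAFE_INTEGER); // 9007199254740991 (2^53 - 1)
console.log(2n ** 64n - 1n); // 18446744073709551615n (maximum u64)
```

Note that mixing Number and BigInt in arithmetic (e.g. `10n + 1`) throws a TypeError, so convert consistently before doing math on TigerBeetle fields.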

Creating a Client

A client is created with a cluster ID and replica addresses for all replicas in the cluster. The cluster ID and replica addresses are both chosen by the system that starts the TigerBeetle cluster.

Clients are thread-safe and a single instance should be shared between multiple concurrent tasks.

Multiple clients are useful when connecting to more than one TigerBeetle cluster.

In this example the cluster ID is 0 and there is one replica. The address is read from the TB_ADDRESS environment variable and defaults to port 3000.

const client = createClient({
  cluster_id: 0n,
  replica_addresses: [process.env.TB_ADDRESS || "3000"],
});

The following are valid addresses:

  • 3000 (interpreted as 127.0.0.1:3000)
  • 127.0.0.1:3000 (interpreted as 127.0.0.1:3000)
  • 127.0.0.1 (interpreted as 127.0.0.1:3001, 3001 is the default port)

Creating Accounts

See details for account fields in the Accounts reference.

const account = {
  id: id(), // TigerBeetle time-based ID.
  debits_pending: 0n,
  debits_posted: 0n,
  credits_pending: 0n,
  credits_posted: 0n,
  user_data_128: 0n,
  user_data_64: 0n,
  user_data_32: 0,
  reserved: 0,
  ledger: 1,
  code: 718,
  flags: 0,
  timestamp: 0n,
};

const account_errors = await client.createAccounts([account]);
// Error handling omitted.

See details for the recommended ID scheme in time-based identifiers.

Account Flags

The account flags value is a bitfield. See details for these flags in the Accounts reference.

To toggle behavior for an account, combine enum values stored in the AccountFlags object (in TypeScript it is an actual enum) with bitwise-or:

  • AccountFlags.linked
  • AccountFlags.debits_must_not_exceed_credits
  • AccountFlags.credits_must_not_exceed_debits
  • AccountFlags.history

For example, to link two accounts where the first account additionally has the debits_must_not_exceed_credits constraint:

const account0 = {
  id: 100n,
  debits_pending: 0n,
  debits_posted: 0n,
  credits_pending: 0n,
  credits_posted: 0n,
  user_data_128: 0n,
  user_data_64: 0n,
  user_data_32: 0,
  reserved: 0,
  ledger: 1,
  code: 1,
  timestamp: 0n,
  flags: AccountFlags.linked | AccountFlags.debits_must_not_exceed_credits,
};
const account1 = {
  id: 101n,
  debits_pending: 0n,
  debits_posted: 0n,
  credits_pending: 0n,
  credits_posted: 0n,
  user_data_128: 0n,
  user_data_64: 0n,
  user_data_32: 0,
  reserved: 0,
  ledger: 1,
  code: 1,
  timestamp: 0n,
  flags: AccountFlags.history,
};

const account_errors = await client.createAccounts([account0, account1]);
// Error handling omitted.

Response and Errors

The response is an empty array if all accounts were created successfully. If the response is non-empty, each object in the response array contains error information for an account that failed. The error object contains an error code and the index of the account in the request batch.

See all error conditions in the create_accounts reference.

const account0 = {
  id: 102n,
  debits_pending: 0n,
  debits_posted: 0n,
  credits_pending: 0n,
  credits_posted: 0n,
  user_data_128: 0n,
  user_data_64: 0n,
  user_data_32: 0,
  reserved: 0,
  ledger: 1,
  code: 1,
  timestamp: 0n,
  flags: 0,
};
const account1 = {
  id: 103n,
  debits_pending: 0n,
  debits_posted: 0n,
  credits_pending: 0n,
  credits_posted: 0n,
  user_data_128: 0n,
  user_data_64: 0n,
  user_data_32: 0,
  reserved: 0,
  ledger: 1,
  code: 1,
  timestamp: 0n,
  flags: 0,
};
const account2 = {
  id: 104n,
  debits_pending: 0n,
  debits_posted: 0n,
  credits_pending: 0n,
  credits_posted: 0n,
  user_data_128: 0n,
  user_data_64: 0n,
  user_data_32: 0,
  reserved: 0,
  ledger: 1,
  code: 1,
  timestamp: 0n,
  flags: 0,
};

const account_errors = await client.createAccounts([account0, account1, account2]);
for (const error of account_errors) {
  switch (error.result) {
    case CreateAccountError.exists:
      console.error(`Batch account at ${error.index} already exists.`);
      break;
    default:
      console.error(
        `Batch account at ${error.index} failed to create: ${
          CreateAccountError[error.result]
        }.`,
      );
  }
}

To handle errors you can either 1) match error codes returned from client.createAccounts exactly against enum values in the CreateAccountError object, or 2) look up the error code in the CreateAccountError object for a human-readable string.

Account Lookup

Account lookup is batched, like account creation. Pass in all IDs to fetch. The account for each matched ID is returned.

If no account matches an ID, no object is returned for that account. So the order of accounts in the response is not necessarily the same as the order of IDs in the request. You can refer to the ID field in the response to distinguish accounts.

const accounts = await client.lookupAccounts([100n, 101n]);
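Since missing IDs are simply absent from the response, one way (a sketch, not the only approach) to re-associate results with the requested IDs is to index the response by its id field:

```javascript
// `accounts` stands in for the result of `await client.lookupAccounts(requested)`;
// only the `id` field is shown here for brevity.
const requested = [100n, 101n, 999n]; // 999n does not exist in this example.
const accounts = [{ id: 100n }, { id: 101n }];

// Index the response by id so you can tell which requested IDs were not found.
const accountsById = new Map(accounts.map((account) => [account.id, account]));
const missing = requested.filter((id) => !accountsById.has(id));
console.log(missing); // [ 999n ]
```

BigInt primitives compare by value as Map keys, so 100n from the request matches 100n from the response.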

Create Transfers

This creates a journal entry between two accounts.

See details for transfer fields in the Transfers reference.

const transfers = [{
  id: id(), // TigerBeetle time-based ID.
  debit_account_id: 102n,
  credit_account_id: 103n,
  amount: 10n,
  pending_id: 0n,
  user_data_128: 0n,
  user_data_64: 0n,
  user_data_32: 0,
  timeout: 0,
  ledger: 1,
  code: 720,
  flags: 0,
  timestamp: 0n,
}];

const transfer_errors = await client.createTransfers(transfers);
// Error handling omitted.

See details for the recommended ID scheme in time-based identifiers.

Response and Errors

The response is an empty array if all transfers were created successfully. If the response is non-empty, each object in the response array contains error information for a transfer that failed. The error object contains an error code and the index of the transfer in the request batch.

See all error conditions in the create_transfers reference.

const transfers = [{
  id: 1n,
  debit_account_id: 102n,
  credit_account_id: 103n,
  amount: 10n,
  pending_id: 0n,
  user_data_128: 0n,
  user_data_64: 0n,
  user_data_32: 0,
  timeout: 0,
  ledger: 1,
  code: 720,
  flags: 0,
  timestamp: 0n,
}, {
  id: 2n,
  debit_account_id: 102n,
  credit_account_id: 103n,
  amount: 10n,
  pending_id: 0n,
  user_data_128: 0n,
  user_data_64: 0n,
  user_data_32: 0,
  timeout: 0,
  ledger: 1,
  code: 720,
  flags: 0,
  timestamp: 0n,
}, {
  id: 3n,
  debit_account_id: 102n,
  credit_account_id: 103n,
  amount: 10n,
  pending_id: 0n,
  user_data_128: 0n,
  user_data_64: 0n,
  user_data_32: 0,
  timeout: 0,
  ledger: 1,
  code: 720,
  flags: 0,
  timestamp: 0n,
}];

const transfer_errors = await client.createTransfers(transfers);
for (const error of transfer_errors) {
  switch (error.result) {
    case CreateTransferError.exists:
      console.error(`Batch transfer at ${error.index} already exists.`);
      break;
    default:
      console.error(
        `Batch transfer at ${error.index} failed to create: ${
          CreateTransferError[error.result]
        }.`,
      );
  }
}

To handle errors you can either 1) match error codes returned from client.createTransfers exactly against enum values in the CreateTransferError object, or 2) look up the error code in the CreateTransferError object for a human-readable string.

Batching

TigerBeetle performance is maximized when you batch API requests. The client does not do this automatically for you. So, for example, you can insert 1 million transfers one at a time like so:

const batch = []; // Array of transfers to create.
for (let i = 0; i < batch.length; i++) {
  const transfer_errors = await client.createTransfers([batch[i]]);
  // Error handling omitted.
}

But the insert rate will be a fraction of potential. Instead, always batch what you can.

The maximum batch size is set in the TigerBeetle server. The default is 8190.

const batch = []; // Array of transfers to create.
const BATCH_SIZE = 8190;
for (let i = 0; i < batch.length; i += BATCH_SIZE) {
  const transfer_errors = await client.createTransfers(
    batch.slice(i, Math.min(batch.length, i + BATCH_SIZE)),
  );
  // Error handling omitted.
}

Queues and Workers

If you are making requests to TigerBeetle from workers pulling jobs from a queue, you can batch requests to TigerBeetle by having each worker pull and act on multiple jobs from the queue at once, rather than one at a time.
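As a sketch of this idea (the queue itself is hypothetical; only the chunking logic is shown), a worker might drain up to a batch's worth of jobs before issuing a single createTransfers call:

```javascript
// Hypothetical helper: split jobs pulled from a queue into TigerBeetle-sized
// batches. `jobs` is assumed to be an array of transfer objects.
const BATCH_SIZE = 8190; // Default maximum batch size on the server.

function toBatches(jobs, batchSize = BATCH_SIZE) {
  const batches = [];
  for (let i = 0; i < jobs.length; i += batchSize) {
    batches.push(jobs.slice(i, i + batchSize));
  }
  return batches;
}

// Worker loop (client setup and error handling omitted):
// for (const batch of toBatches(jobs)) {
//   const transfer_errors = await client.createTransfers(batch);
// }
```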

Transfer Flags

The transfer flags value is a bitfield. See details for these flags in the Transfers reference.

To toggle behavior for a transfer, combine enum values stored in the TransferFlags object (in TypeScript it is an actual enum) with bitwise-or:

  • TransferFlags.linked
  • TransferFlags.pending
  • TransferFlags.post_pending_transfer
  • TransferFlags.void_pending_transfer

For example, to link transfer0 and transfer1:

const transfer0 = {
  id: 4n,
  debit_account_id: 102n,
  credit_account_id: 103n,
  amount: 10n,
  pending_id: 0n,
  user_data_128: 0n,
  user_data_64: 0n,
  user_data_32: 0,
  timeout: 0,
  ledger: 1,
  code: 720,
  flags: TransferFlags.linked,
  timestamp: 0n,
};
const transfer1 = {
  id: 5n,
  debit_account_id: 102n,
  credit_account_id: 103n,
  amount: 10n,
  pending_id: 0n,
  user_data_128: 0n,
  user_data_64: 0n,
  user_data_32: 0,
  timeout: 0,
  ledger: 1,
  code: 720,
  flags: 0,
  timestamp: 0n,
};

// Create the transfers.
const transfer_errors = await client.createTransfers([transfer0, transfer1]);
// Error handling omitted.

Two-Phase Transfers

Two-phase transfers are supported natively by toggling the appropriate flag. TigerBeetle will then adjust the credits_pending and debits_pending fields of the appropriate accounts. A corresponding post pending transfer then needs to be sent to post or void the transfer.

Post a Pending Transfer

With flags set to post_pending_transfer, TigerBeetle will post the transfer. TigerBeetle will atomically roll back the changes to debits_pending and credits_pending of the appropriate accounts and apply them to the debits_posted and credits_posted balances.

const transfer0 = {
  id: 6n,
  debit_account_id: 102n,
  credit_account_id: 103n,
  amount: 10n,
  pending_id: 0n,
  user_data_128: 0n,
  user_data_64: 0n,
  user_data_32: 0,
  timeout: 0,
  ledger: 1,
  code: 720,
  flags: TransferFlags.pending,
  timestamp: 0n,
};

let transfer_errors = await client.createTransfers([transfer0]);
// Error handling omitted.

const transfer1 = {
  id: 7n,
  debit_account_id: 102n,
  credit_account_id: 103n,
  // Post the entire pending amount.
  amount: amount_max,
  pending_id: 6n,
  user_data_128: 0n,
  user_data_64: 0n,
  user_data_32: 0,
  timeout: 0,
  ledger: 1,
  code: 720,
  flags: TransferFlags.post_pending_transfer,
  timestamp: 0n,
};

transfer_errors = await client.createTransfers([transfer1]);
// Error handling omitted.

Void a Pending Transfer

In contrast, with flags set to void_pending_transfer, TigerBeetle will void the transfer. TigerBeetle will roll back the changes to debits_pending and credits_pending of the appropriate accounts and not apply them to the debits_posted and credits_posted balances.

const transfer0 = {
  id: 8n,
  debit_account_id: 102n,
  credit_account_id: 103n,
  amount: 10n,
  pending_id: 0n,
  user_data_128: 0n,
  user_data_64: 0n,
  user_data_32: 0,
  timeout: 0,
  ledger: 1,
  code: 720,
  flags: TransferFlags.pending,
  timestamp: 0n,
};

let transfer_errors = await client.createTransfers([transfer0]);
// Error handling omitted.

const transfer1 = {
  id: 9n,
  debit_account_id: 102n,
  credit_account_id: 103n,
  amount: 10n,
  pending_id: 8n,
  user_data_128: 0n,
  user_data_64: 0n,
  user_data_32: 0,
  timeout: 0,
  ledger: 1,
  code: 720,
  flags: TransferFlags.void_pending_transfer,
  timestamp: 0n,
};

transfer_errors = await client.createTransfers([transfer1]);
// Error handling omitted.

Transfer Lookup

NOTE: While transfer lookup exists, it is not a flexible query API. We are developing query APIs and there will be new methods for querying transfers in the future.

Transfer lookup is batched, like transfer creation. Pass in all IDs to fetch, and matched transfers are returned.

If no transfer matches an ID, no object is returned for that transfer. So the order of transfers in the response is not necessarily the same as the order of IDs in the request. You can refer to the id field in the response to distinguish transfers.

const transfers = await client.lookupTransfers([1n, 2n]);

Get Account Transfers

NOTE: This is a preview API that is subject to breaking changes once we have a stable querying API.

Fetches the transfers involving a given account, allowing basic filter and pagination capabilities.

The transfers in the response are sorted by timestamp in chronological or reverse-chronological order.

const filter = {
  account_id: 2n,
  user_data_128: 0n, // No filter by UserData.
  user_data_64: 0n,
  user_data_32: 0,
  code: 0, // No filter by Code.
  timestamp_min: 0n, // No filter by Timestamp.
  timestamp_max: 0n, // No filter by Timestamp.
  limit: 10, // Limit to ten transfers at most.
  flags: AccountFilterFlags.debits | // Include transfers from the debit side.
    AccountFilterFlags.credits | // Include transfers from the credit side.
    AccountFilterFlags.reversed, // Sort by timestamp in reverse-chronological order.
};

const account_transfers = await client.getAccountTransfers(filter);
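The filter above fetches a single page. One possible pagination scheme (an assumption for illustration, not part of the client API) is to advance timestamp_min past the newest transfer of the previous page when reading in chronological order:

```javascript
// Hypothetical helper: compute the filter for the next page from the last one.
// Assumes chronological order (AccountFilterFlags.reversed not set) and that
// each returned transfer carries its cluster-assigned `timestamp` as a BigInt.
function nextPageFilter(filter, lastPage) {
  if (lastPage.length === 0) return null; // No more pages.
  const newest = lastPage[lastPage.length - 1];
  return { ...filter, timestamp_min: newest.timestamp + 1n };
}

// Usage sketch (client calls omitted):
// let next = filter;
// while (next !== null) {
//   const page = await client.getAccountTransfers(next);
//   // Process `page`...
//   next = nextPageFilter(next, page);
// }
```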

Get Account Balances

NOTE: This is a preview API that is subject to breaking changes once we have a stable querying API.

Fetches the point-in-time balances of a given account, allowing basic filter and pagination capabilities.

Only accounts created with the flag history set retain historical balances.

The balances in the response are sorted by timestamp in chronological or reverse-chronological order.

const filter = {
  account_id: 2n,
  user_data_128: 0n, // No filter by UserData.
  user_data_64: 0n,
  user_data_32: 0,
  code: 0, // No filter by Code.
  timestamp_min: 0n, // No filter by Timestamp.
  timestamp_max: 0n, // No filter by Timestamp.
  limit: 10, // Limit to ten balances at most.
  flags: AccountFilterFlags.debits | // Include transfers from the debit side.
    AccountFilterFlags.credits | // Include transfers from the credit side.
    AccountFilterFlags.reversed, // Sort by timestamp in reverse-chronological order.
};

const account_balances = await client.getAccountBalances(filter);

Query Accounts

NOTE: This is a preview API that is subject to breaking changes once we have a stable querying API.

Query accounts by the intersection of some fields and by timestamp range.

The accounts in the response are sorted by timestamp in chronological or reverse-chronological order.

const query_filter = {
  user_data_128: 1000n, // Filter by UserData.
  user_data_64: 100n,
  user_data_32: 10,
  code: 1, // Filter by Code.
  ledger: 0, // No filter by Ledger.
  timestamp_min: 0n, // No filter by Timestamp.
  timestamp_max: 0n, // No filter by Timestamp.
  limit: 10, // Limit to ten accounts at most.
  flags: QueryFilterFlags.reversed, // Sort by timestamp in reverse-chronological order.
};

const query_accounts = await client.queryAccounts(query_filter);

Query Transfers

NOTE: This is a preview API that is subject to breaking changes once we have a stable querying API.

Query transfers by the intersection of some fields and by timestamp range.

The transfers in the response are sorted by timestamp in chronological or reverse-chronological order.

const query_filter = {
  user_data_128: 1000n, // Filter by UserData.
  user_data_64: 100n,
  user_data_32: 10,
  code: 1, // Filter by Code.
  ledger: 0, // No filter by Ledger.
  timestamp_min: 0n, // No filter by Timestamp.
  timestamp_max: 0n, // No filter by Timestamp.
  limit: 10, // Limit to ten transfers at most.
  flags: QueryFilterFlags.reversed, // Sort by timestamp in reverse-chronological order.
};

const query_transfers = await client.queryTransfers(query_filter);

Linked Events

When the linked flag is specified for an account when creating accounts, or for a transfer when creating transfers, it links that event with the next event in the batch, creating a chain of events of arbitrary length that all succeed or fail together. The tail of a chain is denoted by the first event without this flag. The last event in a batch may therefore never have the linked flag set, as this would leave a chain open-ended. Multiple chains or individual events may coexist within a batch, succeeding or failing independently.

Events within a chain are executed in order, or are rolled back on error, so that the effect of each event in the chain is visible to the next, and so that the chain is either visible or invisible as a unit to subsequent events after the chain. The event that was the first to break the chain will have a unique error result. Other events in the chain will have their error result set to linked_event_failed.

const batch = []; // Array of transfers to create.
let linkedFlag = 0;
linkedFlag |= TransferFlags.linked;

// An individual transfer (successful):
batch.push({ id: 1n /* , ... */ });

// A chain of 4 transfers (the last transfer in the chain closes the chain with linked=false):
batch.push({ id: 2n, /* ..., */ flags: linkedFlag }); // Commit/rollback.
batch.push({ id: 3n, /* ..., */ flags: linkedFlag }); // Commit/rollback.
batch.push({ id: 2n, /* ..., */ flags: linkedFlag }); // Fail with exists.
batch.push({ id: 4n, /* ..., */ flags: 0 }); // Fail without committing.

// An individual transfer (successful):
// This should not see any effect from the failed chain above.
batch.push({ id: 2n, /* ..., */ flags: 0 });

// A chain of 2 transfers (the first transfer fails the chain):
batch.push({ id: 2n, /* ..., */ flags: linkedFlag });
batch.push({ id: 3n, /* ..., */ flags: 0 });

// A chain of 2 transfers (successful):
batch.push({ id: 3n, /* ..., */ flags: linkedFlag });
batch.push({ id: 4n, /* ..., */ flags: 0 });

const transfer_errors = await client.createTransfers(batch);
// Error handling omitted.

Imported Events

When the imported flag is specified for an account when creating accounts or a transfer when creating transfers, it allows importing historical events with a user-defined timestamp.

The entire batch of events must be set with the flag imported.

It's recommended to submit the whole batch as a linked chain of events, ensuring that if any event fails, none of them are committed, preserving the last timestamp unchanged. This approach gives the application a chance to correct failed imported events, re-submitting the batch again with the same user-defined timestamps.

// External source of time.
let historical_timestamp = 0n;
// Events loaded from an external source.
const historical_accounts = []; // Loaded from an external source.
const historical_transfers = []; // Loaded from an external source.

// First, load and import all accounts with their timestamps from the historical source.
const accounts = [];
for (let i = 0; i < historical_accounts.length; i++) {
  const account = historical_accounts[i];
  // Set a unique and strictly increasing timestamp.
  historical_timestamp += 1n;
  account.timestamp = historical_timestamp;
  // Set the account as `imported`.
  account.flags = AccountFlags.imported;
  // To ensure atomicity, the entire batch (except the last event in the chain)
  // must be `linked`.
  if (i < historical_accounts.length - 1) {
    account.flags |= AccountFlags.linked;
  }

  accounts.push(account);
}

const account_errors = await client.createAccounts(accounts);
// Error handling omitted.

// Then, load and import all transfers with their timestamps from the historical source.
const transfers = [];
for (let i = 0; i < historical_transfers.length; i++) {
  const transfer = historical_transfers[i];
  // Set a unique and strictly increasing timestamp.
  historical_timestamp += 1n;
  transfer.timestamp = historical_timestamp;
  // Set the transfer as `imported`.
  transfer.flags = TransferFlags.imported;
  // To ensure atomicity, the entire batch (except the last event in the chain)
  // must be `linked`.
  if (i < historical_transfers.length - 1) {
    transfer.flags |= TransferFlags.linked;
  }

  transfers.push(transfer);
}

const transfer_errors = await client.createTransfers(transfers);
// Error handling omitted.

// Since it is a linked chain, in case of any error the entire batch is rolled back and can be retried
// with the same historical timestamps without regressing the cluster timestamp.