Use TensorFlow to predict Bitcoin prices

Here's your chance to use TensorFlow with JavaScript. Train a neural network to predict the rise and fall of Bitcoin prices.

TensorFlow is the most popular machine learning platform, and the JavaScript version, TensorFlow.js, makes it easy to use from a Node.js application. In this article, we’ll combine two of the most interesting technologies of the day: cryptocurrency and AI. We’ll train a neural network to make a price prediction based on a set of historical Bitcoin prices, then check it against the actual price of Bitcoin during a given date range.

Start the project

You’ll need Node/NPM installed to start. Then you can create a new directory and run npm init, and accept the defaults. Next, install TensorFlow by entering npm i @tensorflow/tfjs. That's the only dependency you need for this demo.

The application consists of three files:

  • predict.js to handle the command-line interaction.
  • fetchPriceData.js to get the price data.
  • trainAndPredict.js to make predictions.

predict.js is typical Node.js for handling command-line arguments. There are some niceties added for cleaning up the arguments, which we won't bother with here. The basic idea is that the user runs predict.js and passes in three arguments:

  • The token symbol (like Bitcoin, Solana, and so on)
  • The end date
  • The number of days to go back

If you don’t provide anything, the program defaults to Bitcoin, ending today, going back 90 days.
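The defaulting logic can be sketched in isolation. The parseArgs helper below is illustrative, not part of the article's predict.js, which handles the arguments a little differently:

```javascript
// Sketch of command-line argument defaulting.
// Any missing argument falls back to a sensible default.
function parseArgs(argv) {
  const [tokenSymbol, endDateArg, daysArg] = argv;
  return {
    tokenSymbol: tokenSymbol || 'bitcoin',          // default token
    endDate: endDateArg ? new Date(endDateArg) : new Date(), // default: today
    days: daysArg ? Number(daysArg) : 90,           // default: 90 days back
  };
}

console.log(parseArgs([]).tokenSymbol);                       // bitcoin
console.log(parseArgs(['solana', '2023-08-11', '30']).days);  // 30
```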

Set up the prediction

We’re using the CoinGecko API, which is particular about how it returns historical data. If the time range covers more than 90 days, it’ll automatically use a daily interval instead of hourly. Our code will handle either, but by default, we’ll use three months of hourly data, ending today. (CoinGecko wants the timestamps in seconds, not milliseconds, so that explains the odd juggling of 1000s in calculating start and end dates.)
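That juggling of seconds and milliseconds can be sketched on its own. The toCoinGeckoTimestamp helper below is illustrative, not part of the article's code:

```javascript
// JavaScript Date timestamps are in milliseconds; CoinGecko's
// market_chart/range endpoint expects seconds, so divide by 1000.
function toCoinGeckoTimestamp(date) {
  return Math.floor(date.getTime() / 1000);
}

// Go back 90 days by subtracting milliseconds, then convert to seconds.
const end = new Date('2023-08-11T00:00:00Z');
const start = new Date(end.getTime() - 90 * 24 * 60 * 60 * 1000);

console.log(toCoinGeckoTimestamp(end));   // 1691712000
console.log(toCoinGeckoTimestamp(start)); // 1683936000
```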

The predict.js file is shown in Listing 1.

Listing 1. predict.js

const fetchPriceData = require('./fetchPriceData');
const trainAndPredict = require('./trainAndPredict.js');
const [tokenSymbol] = process.argv.slice(2);
// Set default if argument not provided
const defaultTokenSymbol = 'bitcoin';
// CoinGecko likes timestamps in seconds, not ms
const endDate = new Date().getTime() / 1000;
const startDate = new Date((endDate * 1000) - 90 * 24 * 60 * 60 * 1000).getTime() / 1000;
// Use default values if arguments are not present
const finalTokenSymbol = tokenSymbol || defaultTokenSymbol;
async function main(){
  try {
    const { timePrices, predictTime, actualPrice } = await fetchPriceData(finalTokenSymbol, startDate, endDate);
    const results = await trainAndPredict(timePrices, predictTime);
    console.log(`Prediction for ${new Date(predictTime)}: ${results} `);
    console.log(`Actual: ${actualPrice}`);
  } catch (error) {
    console.error('Error:', error);
  }
}

main();

This code won’t run until we create the two modules for it to import. predict.js uses those modules, fetchPriceData and trainAndPredict, to get the price data and run the AI. The fetchPriceData module returns three things:

  • An array of timestamps and prices
  • A timestamp used to predict the price of Bitcoins
  • The actual price at that time

We’ll use the actual price to compare the AI's prediction to what really happened.

Get the price data

To get our price data, we use the CoinGecko API to define the fetchPriceData module, as shown in Listing 2.

Listing 2. fetchPriceData.js

async function fetchPriceData(tokenSymbol, fromTimestamp, toTimestamp) {
  try {
    const url = `https://api.coingecko.com/api/v3/coins/${tokenSymbol}/market_chart/range?vs_currency=usd&from=${fromTimestamp}&to=${toTimestamp}`;
    console.log("Getting price data from: " + url);
    const response = await fetch(url);
    const data = await response.json();
    console.log(`Got ${data.prices.length} data points`);
    if (data && data.prices) {
      const lastElement = data.prices[data.prices.length - 1];
      const pricesButLast = data.prices.slice(0, data.prices.length - 1);
      return { timePrices: pricesButLast, predictTime: lastElement[0], actualPrice: lastElement[1] };
    } else {
      console.log('No price data available.');
      return { timePrices: [], predictTime: null, actualPrice: null };
    }
  } catch (error) {
    console.error('Error fetching data:', error.message);
    return { timePrices: [], predictTime: null, actualPrice: null };
  }
}

module.exports = fetchPriceData;

The CoinGecko API is found at https://api.coingecko.com/api/v3/. We'll use the market_chart/range endpoint, which lets you define a date range and token ID. The vs_currency parameter specifies what fiat currency to use as the scale (in our case, it’s hardcoded to usd). You can find more information about the CoinGecko API by reading its documentation and introduction. It’s a full-featured API and you can use most of it without an API key.

We interpolate the arguments sent over from predict.js into the endpoint URL, like so:

https://api.coingecko.com/api/v3/coins/${tokenSymbol}/market_chart/range?vs_currency=usd&from=${fromTimestamp}&to=${toTimestamp}

If you plug in actual values for the token ID and timestamps, you can open the URL in a browser and get a look at the format. Note that the times are in Unix timestamp format (seconds since the epoch).

We use this URL with the Fetch API, which became a stable part of Node in version 18. Then, we navigate the JSON structure to get the actual price list at data.prices. This holds two-dimensional arrays with [timestamp, price] as the structure.
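Here's a sketch of navigating that structure, using a made-up payload (the values are invented for illustration; real CoinGecko responses follow the same [timestamp, price] shape):

```javascript
// Illustrative shape of a market_chart/range response (values made up).
const data = {
  prices: [
    [1691625600000, 29512.11],  // [timestamp in ms, price in USD]
    [1691629200000, 29488.52],
    [1691632800000, 29380.31],
  ],
};

// Each entry is a [timestamp, price] pair; destructuring pulls them apart.
for (const [timestamp, price] of data.prices) {
  console.log(new Date(timestamp).toISOString(), price);
}

const latest = data.prices[data.prices.length - 1];
console.log(`Latest price: ${latest[1]} at ${new Date(latest[0]).toISOString()}`);
```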

Train and predict

Next, we do a bit of manipulation that might seem strange, but the idea is to take the last time/price couplet and remove it from the end of the data set. We do this so the AI model can train on the rest of it, and then we can give the model the last timestamp, get a prediction, and compare it to the real price.
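The hold-out step can be sketched with toy data (the pairs below are invented for illustration):

```javascript
// Toy [timestamp, price] pairs standing in for the CoinGecko data.
const prices = [
  [1000, 10.0],
  [2000, 11.0],
  [3000, 12.5],
  [4000, 12.1],
];

// Hold out the final couplet for evaluation; train on everything else.
const lastElement = prices[prices.length - 1];
const timePrices = prices.slice(0, prices.length - 1);

const holdOut = { predictTime: lastElement[0], actualPrice: lastElement[1] };
console.log(timePrices.length); // 3
console.log(holdOut);           // { predictTime: 4000, actualPrice: 12.1 }
```

This is the same train/test idea used in more formal machine learning workflows, just reduced to a single held-out point.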

Once we have that, we return an object with three fields holding our data:

{ timePrices: pricesButLast, predictTime: lastElement[0], actualPrice: lastElement[1] }

Back in predict.js, we receive this data and we're ready to call trainAndPredict. It takes the time/price data and a timestamp and returns a value, which is the prediction. Listing 3 has the code for the trainAndPredict module.

Listing 3. trainAndPredict

const tf = require('@tensorflow/tfjs');

async function trainAndPredict(timeAndPriceData, newTimestamp) {
  // Extract timestamps and prices from the data
  const timestamps = timeAndPriceData.map(([timestamp, price]) => timestamp);
  const prices = timeAndPriceData.map(([timestamp, price]) => price);
  // Normalize and scale the data
  const minTimestamp = Math.min(...timestamps);
  const maxTimestamp = Math.max(...timestamps);
  const minPrice = Math.min(...prices);
  const maxPrice = Math.max(...prices);
  const normalizedTimestamps = timestamps.map((ts) => (ts - minTimestamp) / (maxTimestamp - minTimestamp));
  const normalizedPrices = prices.map((price) => (price - minPrice) / (maxPrice - minPrice));
  // Convert data to TensorFlow tensors
  const X = tf.tensor1d(normalizedTimestamps);
  const y = tf.tensor1d(normalizedPrices);
  // Create a simple linear regression model
  const model = tf.sequential();
  model.add(tf.layers.dense({ units: 1, inputShape: [1] }));
  model.add(tf.layers.dense({ units: 1, inputShape: [1] }));
  model.add(tf.layers.dense({ units: 1, inputShape: [1] }));
  // Compile the model
  model.compile({ optimizer: 'sgd', loss: 'meanSquaredError' });
  // Train the model
  await model.fit(X, y, { epochs: 100 });
  // Normalize the new timestamp for prediction
  const normalizedNewTimestamp = (newTimestamp - minTimestamp) / (maxTimestamp - minTimestamp);
  // Predict the normalized price for the new timestamp
  const normalizedPredictedPrice = model.predict(tf.tensor1d([normalizedNewTimestamp]));
  // Scale the predicted price back to the original range
  const predictedPrice = normalizedPredictedPrice.mul(maxPrice - minPrice).add(minPrice);
  return predictedPrice.dataSync()[0];
}

module.exports = trainAndPredict;

Listing 3 is where we start to get into the machine learning code. From here forward, it’s helpful to understand basic machine learning concepts.

Training the model

We begin by extracting the two data elements, time and price, into separate variables. We then normalise them. Preprocessing data with normalisation is common in neural networks. Essentially, we are making sure that both data sets have a similar spread. Specifically, we compress the values of both to the range of 0 to 1. This ensures that the gradient descent algorithm, which finds the smallest error, works more efficiently and helps it avoid being trapped in local minima.

The normalisation operation for both values consists of finding the maximum and minimum values, subtracting the minimum from the actual value, then dividing by the maximum minus the minimum. 
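That min-max formula can be written as a standalone helper (the normalize function below is illustrative, assuming the values are not all identical):

```javascript
// Min-max normalisation: compress values into the [0, 1] range.
// value -> (value - min) / (max - min)
function normalize(values) {
  const min = Math.min(...values);
  const max = Math.max(...values);
  return values.map((v) => (v - min) / (max - min));
}

console.log(normalize([29000, 30000, 31000])); // [0, 0.5, 1]
```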

Next, we use the data to create tensors, the basic building block in TensorFlow. Tensors are like vectors on steroids, with n-dimensions. They support complex operations for interactions between the dimensions. TensorFlow lets you model the data as tensors, and then use that data to train your model against. You can learn more about the practical use of tensors in TensorFlow by reading the TensorFlow guide.

In our case, we only need tame tensors: a couple of one-dimensional vectors for the times and prices. We get these by calling tensor1d().

We then create and add layers to the neural network. We use a simple “sequential” network, with each layer in a column, and add them with model.add(tf.layers.dense({ units: 1, inputShape: [1] })). Each of these is a layer of neurons in the network. The “dense” designation means that each neuron connects to all the other neurons in the next layer. These are very basic layers, and there is a huge amount of configuration that you can use in TensorFlow, like how many layers, how many neurons, the type of layer (dense, convolution, Long Short-Term Memory, etc.), the activation function, and so on. You can also define custom networks and neuron properties. 

Next, we compile the model with the optimiser set to “sgd” (stochastic gradient descent) and the mean square error loss function. Again, this is all open for configuration to help improve the model performance.

Running the predictor

Now, we are ready to actually run the training with await model.fit(X, y, { epochs: 100 }), where epochs is the number of cycles to run the data through the network. (Too many epochs can lead to overfitting on the training data.)

Finally, we can make a prediction using the timestamp (which we also normalise), with this call: model.predict(tf.tensor1d([normalizedNewTimestamp])). To get back to a price, we denormalise the prediction. This is normal JavaScript math, except for the .mul() and .add() methods, which are part of TensorFlow's support for element-wise tensor operations. In our case, we only have a single predicted value, not an array, so they just perform ordinary multiplication and addition.
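In plain JavaScript, the denormalisation step looks like this (the denormalize helper is illustrative):

```javascript
// Invert min-max normalisation: scale a [0, 1] value back to prices.
// normalized -> normalized * (max - min) + min
function denormalize(normalized, minPrice, maxPrice) {
  return normalized * (maxPrice - minPrice) + minPrice;
}

console.log(denormalize(0.5, 29000, 31000)); // 30000
```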

If we run this code, we'll see something like what's shown in Listing 4.

Listing 4. Running the predictor

$ node predict
Prediction for Fri Aug 11 2023 21:00:01 GMT+0000 (Coordinated Universal Time): 30595.5546875 
Actual: 29380.31012784574
$ node predict solana
Prediction for Fri Aug 11 2023 22:00:54 GMT+0000 (Coordinated Universal Time): 24.18000602722168 
Actual: 24.5108586331892

How did we do?

In the first case, we get a Bitcoin prediction and in the second, a Solana prediction. Both use the last 90 days of hourly data.

In both cases, the prediction is in the ballpark, but not terribly accurate. Why is that? The answer is, because there is no real relationship between the timestamps and the prices. That was a very naive and silly approach to predicting Bitcoin prices.

Nevertheless, it gave us a good look at all the fundamental components of using TensorFlow with a cryptocurrency API feed. From here, a whole world of possibilities open up.
