Build a Node app with a HuggingFace conversational model

How to use a template GitHub repo to start a Node app with a prompt for a HuggingFace conversational model.

In this tutorial, we'll run a large language model application using Node, React, and HuggingFace.js, starting from an LLM React Node app template.

This tutorial assumes the reader is familiar with Git, Node, React, and Yarn.

First, clone the LLM React Node app template repository and set it as your current directory.

git clone https://github.com/golivecosmos/llm-react-node-app-template.git

cd llm-react-node-app-template

clone a repo with a template to build an LLM React Node app

Before running the app locally, we need to set up an environment file that connects the app's Node server to an LLM. In this example, we'll create a .env file with an API key for the HuggingFace Hub API. For a reference of the expected format, see the .SAMPLE-env file included in the template.
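
You can print the sample file to check which variable names the app expects:

cat .SAMPLE-env

view the sample environment file included in the template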

touch .env

echo ENABLED_MODEL_STORE=HUGGING_FACE >> .env
echo HUGGINGFACEHUB_API_KEY=<your_API_key> >> .env
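
After running these commands, your .env file should contain two lines: the model store flag and your HuggingFace Hub API key in place of the placeholder.

ENABLED_MODEL_STORE=HUGGING_FACE
HUGGINGFACEHUB_API_KEY=<your_API_key>

contents of the completed .env file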

Now we're ready to run the app. Install its dependencies, then start the backend and frontend servers.

# install packages 
yarn

# start the backend server
yarn start-server

install the project's dependencies and start the node server

After starting the backend server, open http://localhost:3100/ in your browser to confirm the server is running. You'll see a welcome message printed.
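
If you prefer to check from the terminal, a quick request to the same port should return that welcome message:

curl http://localhost:3100/

confirm the backend server is responding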

Now let's start the frontend server. Open a second terminal window in the project's root directory and run:

yarn start

start the frontend server

We're ready to check out the app and send prompts to the connected LLM.

Open http://localhost:5173/ in your browser and you should see a basic input field for sending prompts to a HuggingFace conversational model.
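
Under the hood, the Node server uses the API key from your .env file to forward each prompt to a hosted model on the HuggingFace Hub. The template's actual implementation may differ, but a minimal sketch of that kind of call with the @huggingface/inference client could look like this (the model name is only an example):

import { HfInference } from '@huggingface/inference';

// authenticate with the same key stored in .env
const hf = new HfInference(process.env.HUGGINGFACEHUB_API_KEY);

// send a single prompt to an example conversational model
const response = await hf.conversational({
  model: 'microsoft/DialoGPT-large', // example model, not necessarily the one the template uses
  inputs: { text: 'Hello! What can you do?' },
});

console.log(response.generated_text);

a minimal sketch of a HuggingFace.js conversational call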

And now you are up and running 🚀.


Curious for more? Follow us on X (Twitter) to continue the conversation.